annathepiper: (Little Help?)
[personal profile] annathepiper
The professional webgeeks, that is. As y'all know, I'm QA for the Seattle Times now and its affiliated web sites. That means a whole heck of a lot of web presence. Right now we have no formal site regression procedures in place, and this really, really needs to change. I have a two-pronged goal here: one, set up what is essentially a site BVT, and two, set up a full-fledged, in-depth functionality pass that bangs the hell out of each and every functional aspect of the site. These goals are going to have to come in stages, though. First we have to write the test cases, and then we have to figure out how many of them we can automate.

This is where you all come in. I'm looking for recommendations for web site testing tools, things that will do the grunt work of verifying links, checking for missing graphics, that kind of thing--and ideally, also, let you focus the scope of what you're looking for so that (say) if you're testing all the links on seattletimes.com, it doesn't wander off into testing anything that's not part of that site. (We do have tools that are supposed to do this right now, but they're unreliable and frequently have to be double-checked manually. Not helpful.) For extra bonus points: an easy-to-understand UI, the ability to save results out to readable log files, and the ability to set up custom scripts/macros/actions without having to have huge gobs of coding skill.

Commercially produced software would be fine, as would solid, reliable open source products. Hit me with your recommendations, people!

Date: 2006-09-30 04:59 am (UTC)
From: [identity profile] xpioti.livejournal.com
WebTrends has been wildly popular with my previous clientele. I have no idea what my current clientele uses, assuming they use anything at all. :)

Date: 2006-09-30 05:05 am (UTC)
From: [identity profile] fleetfootmike.livejournal.com
JMeter (open source) doesn't suck.

You can use the GUI to set up a bunch of tests, then run them in batch mode from the command line once you're happy.
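(For reference, the batch-mode run is JMeter's standard non-GUI invocation; the file names here are just placeholders: -n is non-GUI mode, -t names the test plan built in the GUI, and -l names the results log.)

jmeter -n -t site-regression.jmx -l results.jtl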

Date: 2006-09-30 07:28 pm (UTC)
From: [identity profile] kirbyk.livejournal.com
One solution, a very DIY approach, would be to write custom Perl to do it. There's a great module, WWW::Mechanize, that lets you define a path of URLs, handles cookies, and that sort of thing. It's probably the most work to take this approach, but you can't beat it for the range of what you can do. It might also be a good way to fill in the gaps if you find an easy tool that does 90% of what you want. Or if you just want to do more programming.

http://search.cpan.org/dist/WWW-Mechanize/

Article on using it: http://www.perl.com/pub/a/2003/01/22/mechanize.html
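To give a feel for what that looks like, here's a minimal sketch of a scoped link checker built on WWW::Mechanize. The starting URL and the seattletimes.com scope filter are assumptions for illustration only; a real crawler would also want politeness delays, and missing graphics could be checked the same way by walking $mech->images.

#!/usr/bin/perl
# Minimal sketch: crawl from a starting page, report broken links,
# and only follow links that stay inside the chosen scope.
use strict;
use warnings;
use WWW::Mechanize;

my $start = 'http://www.seattletimes.com/';   # hypothetical starting page
my $scope = qr/seattletimes\.com/;            # only follow links on this host

my $mech = WWW::Mechanize->new( autocheck => 0 );
my %seen;
my @queue = ($start);

while ( my $url = shift @queue ) {
    next if $seen{$url}++;
    $mech->get($url);
    unless ( $mech->success ) {
        print "BROKEN: $url (" . $mech->status . ")\n";
        next;
    }
    next unless $mech->is_html;
    # Queue further links, but only those inside the scope.
    for my $link ( $mech->links ) {
        my $abs = $link->url_abs->as_string;
        push @queue, $abs if $abs =~ $scope && !$seen{$abs};
    }
}

Output goes to stdout, so redirecting it to a file gives a readable log of broken URLs and their HTTP status codes.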
