This article is not about testing with screen readers as such (I've written about this elsewhere) but rather about what needs to be considered in order to establish a good screen reader testing plan within larger overall accessibility and general quality assurance plans. It's written in such a way that I hope organisations of any size or budget can adapt and use it. Screen readers are text-to-speech software that works on top of a web browser (and other applications) to read screen content out to users who have severe sight problems, reading problems or learning disabilities.
There are a multitude of options on the market, with users choosing their screen reader depending on its capabilities, its cost, whether it is supported by their employer or school, and so on. Testing with screen readers when developing web content is key, but it can be tricky given the diversity in screen reader capabilities and in support for existing and new technologies such as Flash, PDF, WAI-ARIA, SVG and new developments in HTML (formerly known as HTML5). Throw into the mix that software upgrades can vary considerably, and you can find yourself not knowing whether content that is read incorrectly on a page is down to your content, a bug in the screen reader or a quirk of the browser.
Popular paid-for screen readers, often supported by employers and schools, are JAWS for Windows, Window-Eyes and Hal. Free screen readers include VoiceOver on Mac (as well as iPhone and iPad), Orca on Linux, NonVisual Desktop Access (NVDA) and System Access To Go (SAToGo). NVDA, Orca and SAToGo are also open source. See the comparison of screen readers on Wikipedia for a more comprehensive list of which screen readers are supported across platforms, and of their capabilities. While testing our own content is great, especially when trying out new coding techniques, nothing is as good as feedback from real users.
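As a small, hypothetical illustration of the kind of markup worth covering in a screen reader test plan: WAI-ARIA features such as live regions are announced differently (or sometimes not at all) depending on the screen reader and browser pairing. The element name and message below are my own example, not taken from any particular site.

```html
<!-- A status message exposed as a WAI-ARIA live region.
     "polite" asks screen readers to announce updates to this
     element when the user is next idle; how (and whether) each
     screen reader/browser pairing actually announces it varies,
     which is exactly why it belongs in a test plan. -->
<div id="save-status" role="status" aria-live="polite">
  Your changes have been saved.
</div>
```

When testing, update the element's text with script after the page has loaded and note what each screen reader announces, as live regions are generally only announced when their content changes, not when it is present at load time.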