Accessibility work often breaks down not because people disagree with it, but because expectations are unclear. Business Analysts (BAs) write acceptance criteria every day, yet accessibility is still often summarised in vague statements like “meets WCAG” or “accessible for screen readers.” These statements aim to help but don’t describe behaviour, and they’re difficult or impossible to test.
Accessibility Acceptance Criteria (AACs) solve that. They give Business Analysts a structured and repeatable way to define accessible user experiences in the same place they define everything else: the user story. AACs turn broad accessibility intentions into practical, testable outcomes that guide design, development and testing.
This approach isn’t new. Years ago, the GOV.UK team highlighted how AACs reduce ambiguity, promote consistency across teams and make accessibility a shared responsibility. What’s changed since then is the maturity and clarity of modern AAC frameworks.
Two resources in particular shape how AACs are used today:
• Mel O’Brien’s Accessibility Acceptance Criteria – the latest and most complete set of AACs written specifically from a Business Analyst perspective.
• Canaxess AAC Repository – based on Mel’s work, adding broader explanation and a companion guide detailing how to test AACs in real delivery environments.
Mel’s AACs define the expected behaviour.
The Canaxess testing guide explains how to verify that behaviour.
Combined, they bring the clarity and structure that earlier AAC guidance lacked.
Why Accessibility Acceptance Criteria Matter in Business Analysis
Business Analysts shape behaviour. They describe how a feature works, how users interact with it and what success looks like. AACs extend that capability into accessibility, ensuring that:
• the intended accessible user experience is explicit
• accessibility decisions are made early, not after testing
• design, development and testing work toward the same outcome
• the Web Content Accessibility Guidelines (WCAG) support the work without needing to be repeated verbatim
• accessibility is understood as a team responsibility, not a specialist-only activity
AACs keep accessibility visible, predictable and easy to test.
Mel O’Brien’s Accessibility Acceptance Criteria as the Core Reference
Mel O’Brien’s latest AACs provide the most current and user-focused set available. They communicate real user experience rather than code-level implementation and describe:
• how users should navigate content
• how screen readers should announce meaningful information
• how errors are identified and understood
• how interface states are communicated
• how structural elements support accessible navigation
For BAs, this shifts the focus from technical compliance to behavioural outcomes. It removes the challenge of interpreting WCAG line-by-line and replaces it with clear, practical statements about what users must be able to do.
How the Canaxess Repository Builds on This
The Canaxess repository builds on Mel’s framework by adding broader context and practical testing instructions. The companion Testing with AAC guide shows teams how to verify AACs without needing deep accessibility expertise.
The guidance explains:
• how to test behaviours across different input methods
• what to look for in screen reader announcements
• how to confirm correct focus movement
• how to check error handling and visible/non-visible cues
• how to validate entire user journeys, not just components
This combination of behavioural criteria and practical testing brings consistency across teams — a point earlier GOV.UK guidance emphasised. Accessibility stops being theoretical and becomes an operational standard.
Structuring AACs for Business Analysts
AACs are most effective when they follow a simple, repeatable structure. A consistent pattern helps BAs capture the behaviours that matter most.
The four areas to define are:
Interaction
Explain how users operate the component with different inputs, such as keyboard, screen readers or switch devices.
Example:
“When navigating with a keyboard, focus moves through elements in a logical order, and each element’s purpose is announced clearly.”
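A criterion like this is ultimately verified by hand with a keyboard and a screen reader, but part of it can be pre-screened automatically. As an illustrative sketch (not part of the AAC guidance itself), the following Python snippet uses only the standard library to flag positive `tabindex` values, which override the natural DOM-based focus order that the criterion relies on:

```python
from html.parser import HTMLParser

class FocusOrderChecker(HTMLParser):
    """Flags elements with a positive tabindex, which override
    the natural (DOM-based) keyboard focus order."""

    def __init__(self):
        super().__init__()
        self.positive_tabindex = []  # tags that force a custom focus order

    def handle_starttag(self, tag, attrs):
        value = dict(attrs).get("tabindex")
        if value and value.lstrip("+-").isdigit() and int(value) > 0:
            self.positive_tabindex.append(tag)

def focus_order_is_natural(html: str) -> bool:
    """True when no element uses tabindex > 0, so focus follows DOM order."""
    checker = FocusOrderChecker()
    checker.feed(html)
    return not checker.positive_tabindex
```

A check like this cannot confirm that each element's purpose is announced clearly; that half of the criterion still needs a manual screen reader pass.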
Feedback
Describe what the user should understand when something changes, succeeds or fails.
Example:
“When an error appears, it is visible on screen and also announced when the field receives focus.”
Structure
Define how content is organised so users relying on headings, landmarks or reading order can navigate predictably.
A key requirement:
Headings must follow a sequential order.
If heading levels jump unexpectedly, screen reader users may believe content is missing.
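This particular requirement lends itself to an automated pre-check. As a minimal sketch (an illustration, not part of the AAC guidance), the following Python snippet collects heading levels with the standard library and confirms that no level skips downwards:

```python
from html.parser import HTMLParser

class HeadingCollector(HTMLParser):
    """Records h1-h6 heading levels in document order."""

    HEADINGS = {"h1", "h2", "h3", "h4", "h5", "h6"}

    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        if tag in self.HEADINGS:
            self.levels.append(int(tag[1]))

def headings_are_sequential(html: str) -> bool:
    """True when no heading level jumps down unexpectedly,
    e.g. h2 followed by h4. Moving back up (h3 then h2) is fine."""
    collector = HeadingCollector()
    collector.feed(html)
    previous = 0
    for level in collector.levels:
        if level > previous + 1:
            return False
        previous = level
    return True
```

For example, `<h1>` → `<h2>` → `<h3>` passes, while `<h1>` followed directly by `<h3>` fails, which is exactly the jump that makes screen reader users suspect missing content.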
States
Specify how interactive elements communicate their condition, such as expanded, collapsed, disabled, loading or selected.
Example:
“All states must be conveyed visually and programmatically so assistive technology can recognise them.”
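"Programmatically" here usually means an ARIA state attribute alongside the visual styling. As a hedged sketch (the `aria-expanded`/`aria-controls` attributes are standard ARIA, but this specific check is an assumption, not part of the AAC guidance), the following Python snippet flags disclosure buttons that control another element without declaring an expanded/collapsed state:

```python
from html.parser import HTMLParser

class DisclosureStateChecker(HTMLParser):
    """Flags buttons that control another element (aria-controls)
    but expose no expanded/collapsed state (aria-expanded)."""

    def __init__(self):
        super().__init__()
        self.missing_state = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "button" and "aria-controls" in attrs and "aria-expanded" not in attrs:
            self.missing_state.append(attrs.get("id", "<button without id>"))

def states_are_programmatic(html: str) -> bool:
    """True when every disclosure button declares aria-expanded."""
    checker = DisclosureStateChecker()
    checker.feed(html)
    return not checker.missing_state
```

Disabled, loading and selected states would need equivalent checks (`disabled`, `aria-busy`, `aria-selected`), plus a visual review to confirm the state is also conveyed on screen.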
These categories ensure the AACs are behaviour-focused rather than implementation-specific.
AACs and Testing: A Combined Workflow
Pairing Mel’s AACs with the Canaxess Testing with AAC guidance creates a complete accessibility workflow:
• the Business Analyst defines the intended behaviour
• teams share a common understanding of the expected experience
• testers know exactly how to verify the behaviour
• accessibility becomes consistent across sprints and squads
For example:
AAC:
“Error messages must be associated with their input fields and announced when the field receives focus.”
Testing with AAC:
Use a screen reader to navigate to the field and confirm the error is announced automatically.
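The manual screen reader check can be preceded by an automated structural one. As an illustrative sketch (linking errors via `aria-describedby` is one common way to meet this AAC, not the only one, and this check is not from the AAC guidance), the following Python snippet confirms that every `aria-describedby` reference on a form field resolves to a real element id:

```python
from html.parser import HTMLParser

class ErrorAssociationChecker(HTMLParser):
    """Collects all element ids and every id a form field
    references via aria-describedby."""

    def __init__(self):
        super().__init__()
        self.ids = set()
        self.referenced = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if attrs.get("id"):
            self.ids.add(attrs["id"])
        if tag in ("input", "select", "textarea") and attrs.get("aria-describedby"):
            self.referenced.extend(attrs["aria-describedby"].split())

def errors_are_associated(html: str) -> bool:
    """True when every aria-describedby reference resolves to an element id."""
    checker = ErrorAssociationChecker()
    checker.feed(html)
    return all(ref in checker.ids for ref in checker.referenced)
```

A passing check only proves the association exists; the screen reader step above remains the test of whether the error is actually announced when the field receives focus.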
This combined process reflects the principle GOV.UK emphasised early on: AACs work best when they are specific, actionable and testable.
Takeaway
Accessibility Acceptance Criteria give Business Analysts a reliable way to embed accessibility into every user story. Mel O’Brien’s AACs provide clear, behavioural criteria, while the Canaxess Testing with AAC guide explains how to validate them in real delivery environments.
Together, they create a practical foundation for consistent, inclusive delivery — not as an afterthought, but as the way teams build accessible products every day.
This post continues the Accessibility In Practice series by focusing on tools and methods that make accessibility clear, testable and achievable for any team.
Sources
Mel O’Brien – Accessibility Acceptance Criteria (Latest Version)
Canaxess – Testing with AAC Guide
GOV.UK – Improving Accessibility with Accessibility Acceptance Criteria (2018)
Accessible design is good design. Everything we build should be as inclusive, legible and readable as possible. [...] The people who most need our services are often the people who find them hardest to use. Let’s think about those people from the start.
— GOV.UK Design Principles