- Lessons in accessibility: A day at the DfE Accessibility Lab and conversations with the experts
At the DfE Accessibility Lab, our colleagues Sree (User Researcher) and Claire (UX Designer) explored how assistive technologies are used—and where they can fall short when services aren’t designed with everyone in mind.

One crisp spring morning, as the sun finally pushed through the grey weight of winter, a user researcher, Sree, travelled from Newcastle and an interaction designer, Claire, journeyed from London, converging in Sheffield. Their destination: the Department for Education’s (DfE) Accessibility Lab. Their goal: to understand how digital services function for those who navigate the world differently.

Inside the Accessibility Lab: Where digital barriers become visible

From left to right: Claire, Sree and Jane at DfE’s Accessibility Lab, Sheffield

We expected a technical demonstration—a run-through of tools and accessibility best practices. What we got was something much more human: a window into the lived experience of those who rely on assistive technologies daily. Guided by Jane Dickinson, an accessibility specialist at DfE, we explored tools like Dragon, JAWS, ZoomText and Fusion. Jane not only explained how they work but showed us how easily they can fail when services aren't built with accessibility in mind.

Insights from testing with assistive tools

Dragon: Voice recognition for hands-free navigation

Dragon voice control lets users navigate computers hands-free. But if clickable elements aren’t properly coded as buttons, Dragon can’t find them. Jane demonstrated how Dragon struggled with buttons on a DfE service and the BBC homepage that weren’t coded as such. Dragon couldn’t recognise the “click button” command because the button was invisible to the tool, highlighting a major gap between design and code.

JAWS: Screen reader for non-visual navigation

JAWS relies on well-structured content: heading levels, labelled buttons, and descriptive links.
Jane showed how generic links like “Read more” or “Download” confuse JAWS users due to a lack of individual distinction or missing ARIA labels, making browsing chaotic and frustrating. As Jane put it: “If a page isn’t structured properly, it’s a nightmare to navigate.”

ZoomText: For low vision users

ZoomText is a magnification tool that helps users navigate visually. However, it requires users to hover over or click on links to have them read aloud, unlike JAWS, which reads automatically. At higher magnification, text can become distorted where the page has not been coded to handle zoom, affecting readability.

Fusion: Combining JAWS and ZoomText

Fusion combines magnification of up to 20x with auditory feedback, supporting individuals with partial vision loss. But Jane showed us that even a 3x zoom can cause layout issues, like pixelation and clipped content, especially when sites don’t reflow content properly.

Keyboard-only navigation

Keyboard navigation is essential for users who can’t use a mouse, relying on shortcuts like the Alt key. But inconsistent implementation makes things harder. Jane pointed out unmarked buttons on the BBC homepage that would leave keyboard users guessing: “If something isn’t labelled properly, it just gets skipped over.”

Captions for hearing impairments

Captions aren’t just for deaf users—they help everyone. But live captions often lag, making comprehension harder. Testing BBC video content, we saw captions fall out of sync with speech, making it difficult for a user to keep track.

Experiencing the world through the eyes of others

Sree and Claire testing visual simulation glasses

As part of our lab experience, we tested simulation glasses that alter vision, giving a general insight into conditions like:

- Cataracts: everything looks blurred.
- Tunnel vision: loss of peripheral vision, reducing situational awareness.
- Left-sided hemianopia: half the visual field disappears, common after strokes or brain injuries.

It was a powerful reminder of how much of the digital world can become difficult to use under these conditions, and how inclusive, thoughtful design can prevent the digital barriers that some users face.

N.B. While simulation glasses offer a glimpse, they can’t replicate the full experience of visual impairment. They’re a starting point for empathy, not a substitute for speaking with and learning from real users who experience visual impairments.

The Visual Impairment North-East (Vine) Simulation Package

In conversation with Accessibility Experts

To deepen our understanding of accessibility, we interviewed Jane Dickinson and Jake Lloyd, two accessibility specialists at DfE, to hear their insights.

Jane’s biggest frustration? Accessibility being bolted on at the end. “It’s not enough to test for accessibility. Real users need to shape the design from the beginning.”

She also highlighted how many users hesitate to disclose their accessibility needs for fear of being seen as difficult. Even when reports are written to improve accessibility, they often go ignored. “I can spend a whole day writing a report, and sometimes nothing changes.”

Despite these challenges, Jane celebrated the wins—a blind user who was able to access their payslip independently for the first time: “One of our blind users told me, ‘For the first time, I didn’t have to ask someone to read my payslip. I could do it myself.’ That made all the work worth it.”

Even small changes, like properly marking up PDFs or labelling buttons, have a huge impact and can make a service far more accessible.

Jake emphasised the importance of building for keyboard navigation and screen readers from the very start.
“There are so many accessibility issues that come from not thinking about keyboard accessibility… It affects focus, visibility, and how well voice and assistive tech tools work.”

He highlighted issues like repetitive, unclear links in patterns such as “Check your answers”: “Something like the ‘Check your answers’ pattern has links that just say ‘Change’… If you're just using a screen reader and you're navigating through a bunch of links… you're only going to hear ‘change’. So providing some hidden screen reader text, giving more context to that link, can be really helpful.”

This was another thoughtful reminder that different users read pages differently, and not everyone can see the visual context around written content.

A holistic approach to accessibility

The accessibility specialists broke down their layered approach to testing the accessibility of services:

- Automated testing to catch common issues early.
- Manual testing using only a keyboard or different zoom levels.
- Assistive tech checks like screen readers and voice controls.
- Code reviews to ensure correct HTML and component use.

As Jake put it, accessibility goes beyond the Web Content Accessibility Guidelines (WCAG) standards: “I’ll also record issues that don’t fail WCAG but still create barriers—like having to tab 30 times to reach an ‘apply filter’ button.”

Jake warned against treating accessibility as an afterthought: “Where teams haven't thought about accessibility and inclusive design up front and early on, complex issues tend to come out of that.”

Not boring. Not optional.

A myth Jake wants to debunk is that accessible design equals boring design. “You can still be innovative. Your website can look good and be accessible if you plan it that way from the start,” he said. “Unfortunately, some organisations continue to treat accessibility as an afterthought, which remains a cultural issue.”
Our specialists pointed out that advocacy and awareness are key to changing this mindset: “Having people with actual lived experience that can demonstrate the way that they interact with digital content, can be really powerful… Here's someone who is blind. They use a screen reader to navigate your service, and they can't do it.”

They stressed that one in four people have a disability—can you afford to turn them away with inaccessible services?

Why accessibility matters for everyone

Jane and Jake made it clear: accessibility isn’t just for disabled users. It benefits all of us. Captions help on a noisy train. Good contrast helps in bright light. And if zooming to 400% breaks your layout, it’s not just low-vision users who suffer. “If it’s not thought about up front, then it affects a lot of people.”

Accessibility isn’t a task—it’s a mindset

As user researchers and designers, we focus on how people interact with digital services. But in Sheffield, we were not the experts—we were the students. This wasn’t about checking off accessibility guidelines. It was about understanding what happens when those guidelines aren’t met. A missing label, a broken heading structure, or an unlabelled button—these aren’t small issues. Each one determines who gets to participate and who doesn’t.

Accessibility is also never ‘done’; it is an ongoing activity that requires the whole team’s input to maintain.

As we left Sheffield, catching our trains to opposite ends of the country, we carried more than just knowledge. We carried a quiet but certain resolve to champion accessibility. The best accessibility work doesn’t “help” people. It supports their independence and ensures they don’t have to ask for help in the first place.
Useful resources

- Department for Education accessibility and inclusive design training
- Making your service accessible: an introduction
- Department for Education accessibility and inclusive design manual
- W3C: Making the Web Accessible
- W3Cx: Introduction to Web Accessibility
- Sara Soueidan: The Practical Accessibility Course

About the authors

Sree is a Lead User Researcher specialising in uncovering user needs and delivering data-driven insights. A CDDO-DfE trained Service Assessor, she champions user-centricity and accessibility in government services. When she’s not diving into research, Sree can be found roaming the countryside with her husky, cooking up a storm, or curling up with a good book.

Claire is a Senior User Experience Designer, specialising in interaction design. She advocates for accessibility and strives to bridge the gap between usability and inclusion. Outside of work, Claire enjoys exploring new places and experimenting with new recipes.

Contact information

If you have any questions about our research and design services or you want to find out more about other services we provide at Solirius, please get in touch (opens in a new tab).
- Driving social impact: Solirius' pro-bono digital and design project with The Talent Tap
Supporting charities through pro-bono digital and design services

by Theone Johnson and Sam Smith

At Solirius, we are committed to using our technical expertise to create a positive and lasting impact on society through our pro-bono work with charities.

Supporting social mobility through user-centred design

The Talent Tap is a social mobility charity that supports young people from areas with fewer opportunities, often rural and coastal areas where challenges are greater, by providing access to work placements and other professional opportunities and support that help shape their future careers. They also work with businesses to advocate for industry-wide change, implementing sustainable diversity and inclusion strategies with a focus on improving social mobility.

As part of our Social Value initiative, we partnered with The Talent Tap to provide pro-bono digital and design support, identifying opportunities to improve the content and design of their website, attract potential corporate partners, and amplify their impact.

The Design Team: Claire McShane (Designer), Louise Morales-Brown (User Researcher), Anna Rapp (User Researcher), Sam Smith (User Researcher), Lydia Davidson (Designer), Hattie Brash (Service Designer) and Theone Johnson (Project Manager)

Our Solirius Design Team, an outstanding group of talented design consultants, used their expertise in user-centred design, content strategy and accessibility to make a meaningful impact on the charity while reinforcing Solirius' commitment to its core social values.

The aim

The Talent Tap wanted their website to have a simple and professional feel to attract more interest and funding from potential corporate partners, while still maintaining a friendly undertone that communicated their USP as a youth-led charity.

Our approach

We worked closely with The Talent Tap team to enhance their website’s content and design, ensuring it aligned with their goals and vision.
For this, we took a holistic approach, equipping the team with industry insights to refine their online presence and leveraging user research to identify ways the design could better meet user needs and their strategic goals. This included:

- Conducting a content audit to identify pain points and improvement opportunities
- Carrying out an accessibility audit to ensure inclusivity for all users
- Performing a competitor analysis to identify opportunities for differentiation
- Mapping and improving the existing information architecture to create clearer user journeys
- Facilitating collaborative design workshops to inform the new creative direction for the charity
- Conducting user research and analysis to understand and design for the user needs of their target audience
- Applying industry standards to update the content and ensure clear, impactful messaging
- Creating high-fidelity wireframes that aligned with The Talent Tap’s new brand kit
- Adding clear wireframe annotations to guide the development phase

Working examples

Screenshots of examples of the project work carried out, including a Talent Tap planning workshop, competitor analysis, accessibility audit, content audit, a corporate partner persona and an information architecture map.

The outcome

We are proud to have supported The Talent Tap’s mission by enhancing their digital presence and helping them connect more effectively with their audience. Social mobility is a key focus for us at Solirius, and we’re grateful for the opportunity to help The Talent Tap expand their reach—bringing this important message to more young people and businesses across the UK. Thank you and congratulations to everyone who contributed to this project.

The Talent Tap team is thrilled with the new website designs, and their CEO said: “Working with Solirius as a not-for-profit was a joy, both in terms of their patience and incredible enthusiasm and knowledge.
You have taken what was a clunky and standard not-for-profit website and turned it into a fully functional, user-centric asset. As a charity we simply could not have afforded the service provided.”

Contact information

If you have any questions about our design services or you want to find out more about other services we provide at Solirius, please get in touch (opens in a new tab).
- WCAG 2.2 one year on: Impact on government services
WCAG 2.2 one year on: Impact on government services

by Ayesha Saeed

More than a year after the release of WCAG 2.2, what should you be doing as a government service? Ayesha, one of our Accessibility Leads, answers some key questions you may have about how to implement WCAG 2.2 if you haven't already started.

Overview:

- What is WCAG?
- Overview of the changes
- What are the new guidelines?
- Key questions on WCAG 2.2
- Looking forward
- Useful resources

What is WCAG?

The Web Content Accessibility Guidelines (WCAG) (opens in a new tab) are universal guidelines used by public bodies to ensure accessibility is built into digital services. The guidelines are broken down by levels:

- Level A: Must do; basic requirements (legally required for public sector).
- Level AA: Must do; removes further significant barriers (legally required for public sector).
- Level AAA: Specialised support; most comprehensive.

Meeting the WCAG guidelines is one part of meeting legal accessibility requirements as a government service (for both public and internal users). Check out Piya’s article on government requirements (opens in new tab) from earlier in our accessibility series for details. You can also see understanding accessibility requirements for public sector bodies (opens in new tab) for a comprehensive breakdown.

Overview of the changes

The latest official version, WCAG 2.2, was published on 5th October 2023. This replaces the previous version, 2.1, which was published in 2018. WCAG 2.2 builds on and is compatible with WCAG 2.1, with added requirements. One success criterion, 4.1.1 Parsing, was removed in WCAG 2.2 as it was deemed redundant. WCAG 2.2 also addresses aspects related to privacy and security in web content.

There are 9 further A, AA and AAA guidelines to be aware of, including focus management, dragging movements, target size, consistent help, redundant entry, and accessible authentication.
6 of the new criteria are at A and AA level, which government services are legally required to meet for WCAG 2.2, bringing the total number of A and AA guidelines to 55. You can see the full details of the changes in the W3C’s WCAG 2.2 introduction (opens in new tab).

What are the new guidelines?

Level A and AA:

- 2.4.11 Focus Not Obscured (Minimum) (AA): focus states must not be entirely hidden.
  A graphic of a good example of two popup bubbles overlapping. You can partially see the focus on the popup behind.
- 2.5.7 Dragging Movements (AA): functionality must not rely on dragging. Alternatives such as buttons for left and right should be provided.
  A graphic of a good example of a dragging function, with left and right arrows on either side. A hovering mouse shows how you can use the buttons and the dragging feature.
- 2.5.8 Target Size (Minimum) (AA): there can only be one interactive target in a 24px by 24px area.
  A graphic of a good example of icons where there is only one interactive element in a 24px by 24px area.
- 3.2.6 Consistent Help (A): help mechanisms must appear in the same place on each page.
  A graphic of a good example of two screens next to each other, with the help function located in the same top right-hand corner on both.
- 3.3.7 Redundant Entry (A): users must not be required to re-enter the same information, unless essential, such as for security purposes. Provide an option to automate the input if the same information is required twice.
  A graphic of a good example of the option to reuse details so a user does not have to enter the same information twice. In this example there is a checkbox to say the billing address being input is the same as your address.
- 3.3.8 Accessible Authentication (AA): authentication must not require a cognitive test (exceptions for object recognition or personal content).
  For example, provide compatibility with a password manager so a user doesn't have to input or transfer information for authentication.
  A graphic of a good example of giving users several options for authentication, e.g. through the use of a password manager.

Level AAA:

- 2.4.12 Focus Not Obscured (Enhanced) (AAA): focus states must not be hidden at all.
  A graphic of a good example of two popup bubbles. You can fully see the focus on the popups and they do not overlap.
- 2.4.13 Focus Appearance (AAA): the focus indicator must meet a contrast ratio of at least 3:1 and be at least 2px thick around the item.
  A graphic of a good example of a clear focus around a button, with a contrast ratio of at least 3:1 and 2px thickness. In this example a black outline is used on a light grey background.
- 3.3.9 Accessible Authentication (Enhanced) (AAA): authentication must not require a cognitive test, with no exceptions. For example, provide compatibility with a password manager so a user doesn't have to input or transfer information for authentication.
  A graphic of a good example of an authentication form with no cognitive test or CAPTCHAs to log in.

Key questions on WCAG 2.2

Q1: Does meeting WCAG 2.2 ‘break’ my accessibility progress?

No. A site that meets WCAG 2.2 will also meet 2.1 and 2.0.

Q2: When do I start building and testing for WCAG 2.2?

Testing your service against WCAG 2.2 should be incorporated as soon as possible if you haven't already started. You should aim to conduct regular accessibility testing (manual, automated and against assistive technologies) so you can maintain an accurate understanding of how compliant your service is and prevent any surprises when it comes to a yearly audit. Do not rely solely on an annual audit to accessibility-test your service, as this is only a snapshot in time and does not reflect ongoing maintenance of accessibility.
If it has been at least a year since your service was last audited, or it was audited against WCAG 2.1, you will need to conduct an audit again. You should also continuously conduct usability testing to ensure your service is meeting the needs of real users, and not just WCAG.

Q3: Do I need to update my Accessibility Statement?

You should reassess your service for WCAG and other legislative compliance every year, and update your accessibility statement to reflect this. As it is over a year since WCAG 2.2 was released, all services should now be testing against the WCAG 2.2 guidelines and updating their accessibility statements accordingly.

Q4: When will GDS start monitoring?

The GDS Monitoring Team started testing sites against the new WCAG 2.2 success criteria from 5th October 2024. Find out more at changes to the public sector digital accessibility regulations (opens in new tab).

Q5: When will the GOV.UK Design System be updated?

The GOV.UK Design System team have reviewed WCAG 2.2 (opens in new tab) and updated the design system, including these changes in the latest GOV.UK Frontend v5.0.0 (opens in new tab). They have also provided guidance on how to meet WCAG 2.2, and which components, pages and patterns are affected.

Q6: How is my automated accessibility testing impacted?

You should continue to use automated tools such as Pa11y and axe-core to support testing in build pipelines. For axe-core, you can tag which level you want your tests to run against, so make sure you add the WCAG 2.2 tags (such as ‘wcag22aa’) to cover the new guidelines. Find out more at Axe-core 4.5: First WCAG 2.2 Support and More (opens in new tab). Semi-automated tools such as WAVE and axe can still be used to pick up some accessibility issues. Automated and semi-automated tools do not cover all WCAG 2.2 guidelines, so it is important to continue to test manually, with assistive technology and with real users.
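Because automated and semi-automated tools do not cover every WCAG 2.2 criterion, some teams script their own simple supplementary checks. As a minimal, illustrative sketch (the function names and element data below are hypothetical, not from any tool, and the real 2.5.8 success criterion also has spacing, inline and equivalent-control exceptions that this simplified check ignores), the new Target Size rule can be approximated from rendered element dimensions:

```python
def meets_target_size(width: float, height: float, minimum: float = 24.0) -> bool:
    """Simplified WCAG 2.2 SC 2.5.8 (Target Size, Minimum) check:
    an interactive target should span at least 24x24 CSS pixels."""
    return width >= minimum and height >= minimum


def undersized_targets(elements: list[dict]) -> list[str]:
    """Return the ids of interactive elements smaller than 24x24.
    `elements` is hypothetical data, e.g. bounding boxes collected
    from a rendered page via a browser-automation tool."""
    return [
        el["id"]
        for el in elements
        if not meets_target_size(el["width"], el["height"])
    ]


# Example (hypothetical) element data:
elements = [
    {"id": "search-button", "width": 44, "height": 44},
    {"id": "close-icon", "width": 16, "height": 16},
]
print(undersized_targets(elements))  # ['close-icon']
```

A check like this belongs alongside, not instead of, manual and assistive-technology testing: it can flag obvious failures early in a pipeline, but it cannot judge the exceptions the criterion allows.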
Looking forward

WCAG 3.0 (opens in new tab) is currently a Working Draft and aims to provide guidance to build for users with blindness, low vision and other vision impairments; deafness and hearing loss; limited movement and dexterity; speech disabilities; sensory disorders; cognitive and learning disabilities; and combinations of these. WCAG 3.0 also aims to support a wider range of web content on desktops, laptops, tablets, mobile devices, wearable devices, and other web-of-things devices.

Content that conforms to WCAG 2.2 A and AA is expected to meet most of the minimum conformance level of this new standard but, since WCAG 3.0 includes additional tests and different scoring mechanics, additional work will be needed to reach full conformance.

Ensuring you factor in regular maintenance is paramount to keeping accessibility up to date. And remember, WCAG does not cover every scenario. Test with your users and conduct regular user research.

Useful resources

- WCAG 2.2 and what it means for you (Craig Abbott) (opens in new tab)
- Obligatory WCAG 2.2 Launch Post (Adrian Roselli) (opens in new tab)
- What WCAG 2.2 means for UK public sector websites and apps (GDS - YouTube) (opens in new tab)
- Testing for WCAG 2.2 (Intopia - YouTube) (opens in a new tab)
- WCAG 2.2 Explained: Everything You Need to Know about the Web Content Accessibility Guidelines 2.2 (opens in a new tab)

Contact information

If you have any questions about our accessibility services or you want to find out more about other services we provide at Solirius, please get in touch.
- Let’s talk accessibility: why we need proxy users
Have you ever been in a situation where you’re keen to test the accessibility of a service, but your target users haven’t communicated any accessibility needs? Sree (Sreemoyee), our Principal User Researcher, discusses how you can advocate for diverse user needs and ensure inclusive design on your projects.

In a recent project, our data-fluent user group did not declare any accessibility needs, which led our team to consider skipping accessibility tests. Recognising the importance of catering to future users with accessibility needs and staying ahead of evolving user requirements, I turned to an ‘Accessibility Lab’: a database of proxy users with accessibility needs, curated by our client’s User-Centred Design (UCD) team.

Who are proxy users in the context of accessibility testing?

Proxy users, though not part of the primary user group, share comparable digital skills and accessibility needs that make them useful contributors to inclusive design. For my education-centric project, the Department for Education (DfE) Accessibility Lab was the ideal resource, featuring primarily teachers as proxy users who had signed up to be contacted for accessibility testing. Importantly, these teachers were not users of the service we were testing, ensuring unbiased perspectives without preconceptions.

How I prepared for accessibility testing with proxy users: hot tips

We opted for remote testing to accommodate the preferences and availability of the proxy users. This decision necessitated adjustments to ensure effective testing.

Clearly communicating the necessary information

I communicated with the participants through emails and video calls, reassuring them that no prior knowledge of the service was necessary. Before the remote testing sessions, I provided them with the project background, outlining the goal of evaluating service accessibility.
Throughout, I encouraged open communication, emphasising to participants that we were testing the service, not them, to encourage candid and honest feedback.

Tailoring the usability tests

It was important to familiarise myself with the specific accessibility needs of the proxy users to understand each person’s unique requirements. When testing with a participant with dyslexia who reported finding traditional text-heavy interfaces challenging, I asked them to describe their current environment and any assistive technologies they might use for dyslexia. During the test, I focused on their interaction with fonts, line spacing, and visual cues to assess their content comprehension.

Crafting guided interactions

In remote sessions, I asked participants to use their main device and specified the browsers. Recognising the potential challenges faced by proxy users who are unfamiliar with the service, I provided extra guidance and prompts to enhance clarity in task understanding. For example:

- Original prompt: “Start the data submission journey and go through it as you normally would.”
- Guided prompt: “Start the data submission journey by selecting option x on the homepage, and if you encounter any difficulties, feel free to ask for guidance.”

Observing and enquiring

As the remote setting made it more difficult to pick up on non-verbal cues, I used screen-sharing tools to observe participants’ facial expressions and gestures as they navigated through the webpages. I encouraged them to think out loud and share their preferences and dislikes. With their consent, I recorded the sessions for later review. I observed closely for signs of difficulty and asked open-ended questions, such as:

- “How did you feel navigating through that section?”
- “How would you describe your experience using this feature?”

Engaging with empathy

Mindful of the potential challenges faced by users with cognitive impairments, I approached remote testing with patience and empathy.
I gave extra time for understanding, adjusted the testing environment based on their real-time feedback, and strategically built breaks and buffers into the testing schedule. One participant made what was my favourite request: “Mind if I take a break to cuddle my cat?”

Using relevant tools and technologies

I facilitated the use of tools and assistive technologies according to user need, to make the testing process smoother and more accurate. During one session, noting the need for screen magnification, I gave proxy users the option to adjust the interface’s font size and contrast settings.

Would I recommend accessibility testing with proxy users?

Absolutely. The Project Leads observed these research sessions firsthand and described them as “eye-opening” and “fascinating”. But why?

The pros of accessibility testing with proxy users

The benefits of conducting accessibility testing with proxy users are nuanced and varied:

Tech-debt mitigation

In the absence of actual users with declared accessibility needs, accessibility testing with proxy users encourages the adoption of inclusive design and development practices from the outset—the foundation that a truly user-centred service is built upon.

In testing, visually impaired users highlighted issues with cluttered screens and excessive scrolling. Their feedback revealed that cramming information into a small screen made it hard for users with visual challenges to understand the content. This insight, together with feedback from our target users, prompted us to simplify the homepage, making it cleaner and more straightforward and reducing cognitive load. We validated these changes through further testing to ensure enhanced usability.

Proxy users, with their unique needs, enable us to spot and fix accessibility issues early, helping avoid the accumulation of technical debt and costly retrofits later in a service’s development journey.
Ethical inclusivity

Engaging with diverse users is vital for inclusivity. When real users don’t declare accessibility needs, proxy users guide us in understanding diverse experiences. It’s not a checkbox exercise; it’s our ethical duty to ensure digital services are equitable for everyone. During testing, one proxy user emphasised the importance of truly grasping diverse user needs, stating: “I want options, not assumptions… It’s awfully good of you and your team to reach out to understand my experiences.”

Enhancing user experience through unbiased perspectives

Proxy users, especially those unrelated to the service or product being tested, bring a fresh perspective to the table. They offer insights without the bias of prior knowledge or experience, helping us see our product objectively. Their feedback acts as a powerful tool to uncover potential blind spots and create a more user-friendly experience.

Compliance with accessibility standards

Conducting accessibility testing, alongside accessibility audits, helps us meet the Web Content Accessibility Guidelines (WCAG) 2.2, which are based on 4 design principles: perceivable, operable, understandable, and robust. By structuring the guidelines as principles rather than around specific technologies, WCAG accentuates the need to understand how people interact with digital content—helping to ensure that a service is accessible, identify areas for improvement, and reduce legal risk while promoting ethical design and development practices.

Specific educational insights

In our education-focused project, testing with proxy users who were primarily teachers gave us valuable insights into the unique accessibility needs of education providers. Their feedback enabled us to develop and refine our service to align with the real needs of those in the sector.
The cons of accessibility testing with proxy users

While the benefits of involving proxy users are significant, it’s essential to acknowledge the potential risks:

Representation gap

Proxy users, while sharing comparable accessibility needs, may not fully represent the experiences of the target user group. To address this, it’s essential to complement proxy user insights with targeted feedback from users with disabilities to bridge the representation gap.

Availability

Finding suitable proxy users for recruitment can be a challenge, potentially causing testing delays. In my project, this risk was mitigated by leveraging the client’s Accessibility Lab, a readily available database of proxy users, which prevented potential recruitment challenges and minimised testing delays.

Intermediary role

Proxy users, as intermediaries, may unintentionally filter or misunderstand information because they might not fully grasp the nuances of the target user group’s experiences. To counter this, I structured testing sessions with extra guidance and prompts to minimise the risk of misinterpretation.

In conclusion

Effective leveraging of proxy users in accessibility testing requires a balanced approach. While their insights are invaluable for inclusive design and early issue detection, it’s important to supplement their feedback with testing from actual users with disabilities whenever possible. Combining both approaches ensures a thorough evaluation of accessibility and usability. See you folks on the inclusive side!

Key takeaways

- Inclusive design: Proxy users can play a crucial role in ensuring inclusive design for diverse user groups, especially when there are no declared users with accessibility needs in the user research pool.
- Strategic decision-making: Gaining insights into the accessibility needs of a diverse audience enables data-driven, informed choices.
Communication is key: Clear communication before and during testing sessions, and encouraging open feedback, creates a conducive testing environment. Tailoring testing sessions: Adapting usability tests to address specific accessibility challenges enables a focused assessment of user interactions with the service. Testing with empathy and flexibility: Prioritising users’ needs and conducting tests with patience and empathy are crucial. Maintaining a balanced approach: While proxy user insights are invaluable, supplementing feedback with testing from actual users with disabilities ensures a comprehensive evaluation of accessibility and usability. Useful resources Understanding WCAG 2.2 WCAG 2.2 Map Testing for accessibility Contact information If you have any questions about our accessibility services or you want to find out more about other services we provide at Solirius, please get in touch. This article was originally posted by Sree on medium.com.
- 6 common accessibility mistakes in design—and how to fix them
In this article, Philena discusses the importance of designing accessible experiences that cater to a diverse range of users, as well as for temporary or situational challenges. She touches on why accessibility is not just a technical requirement but a design principle that benefits everyone. Philena highlights six common design mistakes that hinder accessibility and provides practical solutions to create more inclusive, user-friendly designs. Why accessibility in design matters Design isn’t just about making things look good—it’s about making sure everyone can use your product or service. Think about it: you’ve probably struggled with low contrast on your phone in bright sunlight or found it hard to navigate a cluttered website when you’re in a rush. Accessible design makes things easier for everyone. But accessibility isn’t just about following guidelines - it’s also about understanding real user needs. That’s why user research and feedback on design decisions are essential to ensure designs truly meet the needs of diverse users. By listening to feedback and testing with people who have a range of abilities and experiences, designers can identify barriers and create solutions that work for everyone. So, let’s look at some common design mistakes and how you can avoid them to create a better experience for all users. Mistake 1: Low contrast text Let’s start with one of the most obvious issues - low contrast. Sure, it might look stylish to have light grey text on a white background, but can anyone actually read it? Now, imagine someone with a visual impairment trying to make sense of that. But here’s the thing: low contrast isn’t just an issue for those with impaired vision. Think of someone trying to read on their phone outside in the sun, with the screen reflecting glare—contrast matters in that scenario too. How to get it right: Aim for a contrast ratio of at least 4.5:1 for normal text. Use tools like the WebAIM Color Contrast Checker to test your designs. 
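If you want to sanity-check contrast programmatically, the WCAG 2.x contrast ratio is straightforward to compute: linearise each sRGB channel, combine the channels into a relative luminance, then compare foreground and background. The sketch below is a minimal TypeScript version of that calculation (the function names are ours, not from any library); in practice, tools like the WebAIM checker do this for you.

```typescript
// Linearise one sRGB channel (0-255) per the WCAG relative-luminance definition.
function channelToLinear(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

// Relative luminance weights the linearised channels by perceived brightness.
function relativeLuminance(r: number, g: number, b: number): number {
  return (
    0.2126 * channelToLinear(r) +
    0.7152 * channelToLinear(g) +
    0.0722 * channelToLinear(b)
  );
}

// Contrast ratio is (lighter + 0.05) / (darker + 0.05), ranging from 1:1 to 21:1.
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const l1 = relativeLuminance(...fg);
  const l2 = relativeLuminance(...bg);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Black on white is the maximum, 21:1 — comfortably above the 4.5:1 minimum.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // "21.0"
// Light grey (#aaaaaa) on white falls well below 4.5:1 for body text.
console.log(contrastRatio([170, 170, 170], [255, 255, 255]) >= 4.5); // false
```

A check like this can run in a design-token pipeline, flagging colour pairs that dip below 4.5:1 before they ever reach a designer's mockup.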
Think of contrast as a universal design principle—if it’s easier for someone with a visual impairment, it’s easier for everyone. Mistake 2: Relying only on colour to convey information Think about a form where the only indication of an error is a red outline. For someone who’s colourblind, that red outline might not even register. The same problem happens when colour alone is used to convey important information, like in charts or buttons. Accessibility isn’t just about catering to specific disabilities, it’s also about ensuring clarity for everyone. Whether it’s a person with colour blindness or someone trying to interact with your design in less than ideal lighting, relying solely on colour can be a problem. How to get it right: Always supplement colour with icons, text, or patterns. For example, instead of just using a red outline for errors, add a symbol and text that clearly explains the issue and how to fix it. Use a colour-blindness simulator during the design process to ensure your work is still clear without colour. Be aware that blindness simulators will never replace real user feedback. Ensure you test your designs with diverse users. Mistake 3: Complex layouts that confuse users We’ve all been there—landing on a website that’s so cluttered and chaotic that we have no idea where to look. For someone with cognitive disabilities or attention issues, this kind of layout can make navigation nearly impossible. But even without a disability, a complex layout can be frustrating. Picture yourself trying to book a flight on a crowded train, with limited time and attention—simplicity and clarity become lifesavers. How to get it right: Use a clear visual hierarchy with headings and subheadings that guide users. Make important information easy to find with a clean layout, such as grouping related elements together to create an intuitive flow. Use consistent spacing, fonts, and alignment to reduce cognitive load. 
Keep consistency across pages, so users don’t have to relearn how to navigate every time. For example, place the primary action button, like "Continue" or "Submit," in the same location across all pages and use consistent labelling to avoid confusion. Mistake 4: Text that’s too small or difficult to read Tiny text is a big problem. Whether someone has low vision or is trying to read on a small screen in a bumpy car ride, small, illegible text makes for a frustrating experience. Readable text benefits everyone. Imagine you’re trying to skim an article on your phone during your commute—clear, bold text that’s easy to read helps you grasp the key points. How to get it right: Use a minimum font size of 16px for body text. Keep line length between 45 and 75 characters for better readability. Choose fonts that are easy to read, with good spacing between letters and lines. Some fonts that are considered accessible include: Arial, Calibri, Century Gothic, Helvetica, Tahoma, Verdana, Tiresias, and OpenDyslexic. Again, it is important to get real user feedback to see what works for your users. Mistake 5: Missing image descriptions For someone using a screen reader, images without descriptions are a black hole of information. They can’t see what the image is trying to convey, so they miss out on key content. Alternative text, or alt text, can provide that context by describing images for users who can’t see them. But alt text isn’t just for screen reader users. What about someone with a slow internet connection? While they’re waiting for the images to load, they can still understand what’s there if you’ve provided alt text. How to get it right: Always include meaningful alt text for images that convey information. Avoid purely decorative images where possible; where they are used, mark them as decorative with empty alt text (alt="") so screen readers skip them.
Alt text should reflect the image’s purpose and context in relation to the surrounding content. For example, for a simple illustration of mountains and a sun: On a page about travel destinations it could be: “Illustration of a mountain range at sunrise, representing a peaceful travel location.” On a page about design inspiration it could be: “Minimalist mountain and sun illustration showcasing simple design concepts.” Think of alt text as part of the story you’re telling—don’t leave users in the dark. How to write good alt text for screen readers Mistake 6: Incomprehensible data graphs Complex data visualisations can be a headache for users, especially those using assistive technology or those who are colourblind. Labels that are too small or graphs that rely solely on colour can make it difficult to understand what’s being presented. But this isn’t just a challenge for users with disabilities. Anyone trying to read a graph on a small screen or in a distracting environment will appreciate clear, easy-to-understand visuals. One simple way to make graphs more accessible is to incorporate patterns or textures in addition to colour. For example, instead of only using red and green in a pie chart, you can add stripes or dots to differentiate between sections for users who struggle with colour perception. How to get it right: Provide clear, concise summaries of data trends. Label graphs and charts clearly, with text and visual cues like patterns. Use high contrast colours and provide alternative formats, like tables, for users who prefer text-based information. For image-based graphs, provide clear alt text or captions that describe the data and key insights, ensuring the information is accessible to screen reader users. Designing for everyone At the end of the day, accessibility is about making sure everyone has equal access to services and products.
By avoiding these common design mistakes, you’re not just helping people with disabilities—you’re creating a better experience for anyone who might be in a permanent, temporary or environmental situation where good design means accessible design. Take action When designing services or products, ask yourself: is this accessible for everyone? Start making these changes today, and be sure to conduct user accessibility testing along the way - you may be surprised by small changes that improve the overall user experience for everyone. Additional resources To further enhance your accessibility design skills, explore these valuable resources: Accessibility - Material Design WebAIM: Web Accessibility for Designers Stark - Contrast & Accessibility Checker | Figma Accessible fonts and readability: the basics How to write good alt text for screen readers Contact information If you have any questions about our accessibility services or you want to find out more about other services we provide at Solirius, please get in touch .
- Breaking barriers: digital inclusion in government services
In this article, Piya discusses the importance of creating government services that are accessible to everyone. Government accessibility standards exist to ensure that a wide range of people can use government services on both web and mobile applications. Importantly, accessibility is a shared responsibility, and Piya lists resources that offer guidance on integrating accessibility into the development of services. Overview: GOV.UK requirements Meeting WCAG 2.2 Testing with assistive technology User research with disabled people Accessibility statements GOV.UK design system DWP resource GOV.UK requirements The government accessibility requirements state that all services must meet the following criteria to comply with the legal requirements for public sector websites and mobile applications: Meet level AA of the WCAG 2.2 (Web Content Accessibility Guidelines) at a minimum Work on the most commonly used assistive technologies - including screen magnifiers, screen readers and speech recognition tools Include disabled people in user research (including cognitive, motor, situational, visual and auditory impairments) Have an accessibility statement that explains how accessible the service is (published when the service moves to public beta) Meeting these requirements ensures that services comply with the Public Sector Bodies (Websites and Mobile Applications) (No. 2) Accessibility Regulations 2018. In addition, we can ensure that we are creating more inclusive digital services for users with diverse needs. Meeting WCAG 2.2 WCAG 2.2 is based on 4 principles that emphasise the need to think about the different ways that people interact with digital content: perceivable: recognising and using the service with senses that are available to the user. operable: finding and using content, regardless of how a user chooses to access it. understandable: understanding content and how the service works.
robust: content that can be interpreted reliably by a wide variety of user agents. For example, users might use a keyboard instead of a mouse or rely on a screen reader to have content spoken aloud. The WCAG 2.2 principles apply to all aspects of your service (including code, content and interactions), which means all members of your team need to understand and consider them. It is important to conduct regular accessibility testing using a range of automated and manual tools as early as possible to ensure your design, code, and content meet WCAG 2.2 AA requirements (all A and AA criteria). Testing with assistive technology To meet the government service standard, testing should be done across the following assistive technologies and browsers throughout development, ensuring that the most commonly used assistive technologies are tested and work on the service before moving to public beta: JAWS (screen reader) on Chrome or Edge NVDA (screen reader) on Chrome, Firefox or Edge VoiceOver (screen reader) on Safari TalkBack (mobile screen reader) on Chrome Windows magnifier or Apple Zoom (screen magnifiers) Dragon (speech recognition tool) on Chrome Source: Digital Accessibility Centre (DAC) https://digitalaccessibilitycentre.org/usertesting.html Making services compatible with commonly used assistive technologies is a shared responsibility: testing across these combinations should happen throughout all stages of development, when planning new features, when designing and building them, and when testing. For more information on how to test with assistive technology, see testing with assistive technologies. User research with disabled people Inclusive user research is essential for creating user-centred services that meet the needs of all users, including those with disabilities and diverse backgrounds.
By involving a varied group of participants early on, teams can identify and address usability and accessibility barriers, enhancing the design, functionality, and content to benefit everyone. This approach encourages continuous improvement, ensuring government services evolve with users' needs. Ultimately, inclusive user research builds trust by showing a commitment to accessibility, making services more usable and welcoming for a broader audience. Accessibility statements Accessibility statements are required to communicate how accessible a service is. This includes stating the WCAG compliance level, explaining where the service has failed to meet guidelines (and a roadmap of when this will be fixed), contact information and how to report accessibility issues. Government services should follow a standard accessibility statement format to maintain consistency. GOV.UK Design System (GDS) The GOV.UK design system (GDS) has many reusable components that are utilised across government services. Each component shows an example, an option to view the details on how to implement the component, as well as research regarding the component's usability and what kind of issues users have faced. Any known accessibility issues are also highlighted, and based on this research some components are labelled ‘experimental’ because some users may still experience issues navigating them. Teams must proceed with caution when adopting these components, and carry out rigorous manual, assistive technology and user testing to ensure that the implementation is accessible and WCAG guidelines are met.
Source: Government Design System (GDS) details component - https://design-system.service.gov.uk/components/details/ Summary Overall, government services must ensure they are creating services that are regularly tested and work with users who have a range of access needs or assistive technology requirements including: Reviewing, understanding, and meeting GOV.UK and WCAG 2.2 standards Implementing accessible components that can be accessed by assistive technology Ensuring accessibility is the whole team’s responsibility when developing a service Regularly testing with users with disabilities Providing an accessibility statement to inform users where the service does and does not meet accessibility guidelines Accessibility should be considered from the start as retrofitting costs more time and resources, and results in your users not being able to use your service. DWP resource: The Department for Work and Pensions (DWP) accessibility manual is a great resource for guidance on testing, accessibility best practices throughout service development and details on how each member of the team can integrate accessibility. Source: GOV.UK - Accessibility in Government - https://accessibility.blog.gov.uk/2021/05/27/why-weve-created-an-accessibility-manual-and-how-you-can-help-shape-it/ Contact information If you have any questions about our accessibility services or you want to find out more about other services we provide at Solirius, please get in touch .
- Delivering a modern analytical platform for DfE
We supported the Department for Education (DfE) to introduce a new data intelligence platform while focusing on user needs and better ways of working. Linda Souto Maior, service designer on the project, shares how we did this. The DfE Analyst Community plays a critical role in helping the department achieve its goals to enable children and learners to thrive. Over time, analysts had come to operate on increasingly outdated, disparate and costly legacy systems. As the data and analytical sector moves towards cloud-based technologies, DfE wanted to build on existing ways of working and keep pace with emerging trends and new opportunities, in support of the Strategic Data Transformation programme. The vision was to build a new joined-up service to help analysts find data and access cloud analytical tools, known as Analytical Data Access (ADA), named after the famous pioneer and mathematician Ada Lovelace. A key enabler for any major technology change is to design the service in a way that meets the needs of users and also recognises the impact of the change on the organisation and, importantly, on the individuals affected. Challenges across the organisation There were a number of challenges across business and technology teams: DfE’s Data Directorate had an immediate requirement to replace outdated legacy platforms whose support and maintenance costs were high. Users of data across the DfE Analyst Community could not always find what data was being stored, who owned it, how to get access and how to use it. Users were limited by capacity and difficulty in scaling without impacting others, and could experience delays in running data queries. Duplication of the same data assets across different platforms with no single source of truth, and increasing costs, risks and effort to store, manage and control access to data. Reduced ability for experts in datasets to collaborate easily. Time-consuming daily log-ins to multiple systems.
The approach to address the problem Working in blended teams, Solirius provided service design, user research, interaction design, business analysis, delivery and business change expertise to bring the service live. The service was designed to bring together three underlying platforms: a data discovery platform to catalogue all DfE data, a data intelligence platform using Databricks on Azure with a Delta Lake for data storage, and a library of reports and dashboards. Key improvements included: A single point of entry to reduce sign-ins across platforms. Greater processing power for faster calculations and complex analysis. Governance of data through a single request form to gain access to datasets. Information about the service, support guides and access to tailored training designed in collaboration with analysts. A single homepage to access all services. Outcomes and value added Alongside the technology challenge of deployment, we worked closely with users to overcome nervousness about the new platform. This included adapting the service but also improving communications: Set up a super user group of analysts who helped with design input, testing and feedback on the service. Worked with the supplier (Databricks) to integrate R Studio (third party modelling software) based on user feedback. Mapped out all data requirements and user flows to identify common pain points and avoid duplication of data prior to any data migration. Developed a roadmap for data migration and changes to ways of working. Ran frequent show & tells across the organisation and invited our analysts to demonstrate the use of tooling. Engaged with analysts and the supplier to design training and support guides. Co-designed a business change strategy to establish collective ownership of the change. Worked closely across DfE departments and alongside other projects to ensure a joined-up service across all channels.
Provision of flexible resourcing to meet the needs of delivery and budget constraints. The service continues to be rolled out across the organisation and is now being used by 300 analysts with 60 modelling areas migrated for 50+ analyst teams. Meanwhile we continue to work with the Analyst Community and Data Directorate to improve and adapt the tool as new use cases come up. Long term, the service will save time, costs and effort, through better collaboration and faster processing, and also enable better use and governance of data. "As sponsor for the DfE Analytical Data Access (ADA) service I have been impressed with the calibre of the Solirius resources who have supported us in getting this ambitious programme off the ground. They have been key in helping us build multi-disciplinary squads and they have integrated seamlessly with our existing staff. Their expertise has brought shape and rigour to our work and enabled us to deliver a professional service that is growing in demand." Patrick Healey, Deputy Director | Data Operations | Data Contact information If you have any questions about implementing new digital technology in your organisation or want to find out more about what services we provide at Solirius please get in touch .
- Providing data engineering services that support company growth
Growth Intelligence specialise in helping companies grow, using AI and uniquely rich SME data to drive more revenue, reduce acquisition costs and increase conversion rates. To support their business model, Growth Intelligence were looking for assistance to: manage existing infrastructure and scale data pipelines to handle ever-increasing amounts of data using contemporary cloud technologies create and maintain bespoke applications to support the day-to-day activities of their data science and customer success teams maintain code libraries, repos, apps and machine learning feature data. Setting up the specialist team Our specialised resourcing solution gives GI access to experienced, high-quality data engineers at a wide variety of levels. Proponents of agile delivery, our team adopts GI’s hybrid scrum/kanban approach, monitoring sprint progress using kanban boards and running the full range of agile sprint ceremonies (daily stand-ups, retrospectives and sprint planning sessions). We collaborate with GI’s engineering and data science teams, using a stack of Python, Ansible, AWS, Tornado, Flask, Elasticsearch, Docker and Pandas. Working alongside the GI team and key stakeholders, our engineers’ work spans requirements gathering, technical spikes and OKR management. 2 years of success and growing Solirius has worked with Growth Intelligence for over 2 years and we are proud to continue our association, helping to maintain and improve the leading-edge services that GI provides to companies around the world. "The Solirius team are great additions to our engineering team. They are highly professional, effective at working independently (whilst also knowing when to seek clarification on requirements / design / architecture) and proactive in taking on projects / problem solving. They have integrated seamlessly into the team - which is great for us as a start-up as it enables us to have a single cohesive engineering team.
They have a genuine interest in helping us succeed and creating a friendly and enjoyable culture to work in." Prashant Majmudar, CTO at Growth Intelligence
- Solirius pro-bono partnership with The Talent Tap
As part of our social value initiative, we are providing pro-bono digital and design support to charities, and we are excited to announce our partnership with The Talent Tap in this endeavour. Who is The Talent Tap? This remarkable charity supports young people from socio-economically deprived areas, with a specific focus on coastal and rural regions, by providing access to professional opportunities and work placements that help shape their future careers. Why are we doing this? The social value initiative at Solirius reflects our commitment to using our digital and design expertise to create a positive and lasting impact on society. By offering our services pro-bono, we aim to support charities that might not have access to the resources they need to enhance their digital presence and improve their services for the communities they serve. Through our partnership with The Talent Tap, we are proud to support their mission of equipping young people with the tools and opportunities they need to realise their potential and shape their professional futures. We believe that everyone, regardless of background, should have access to opportunities that can define their careers, and we are excited to contribute to The Talent Tap's efforts in driving positive change. What have we done so far? We’ve started working with The Talent Tap to identify opportunities to enhance the content and design of their website. Our approach includes conducting accessibility audits, competitor analysis, content reviews, and user research. This comprehensive strategy will guide future ideation, ensuring the platform meets user needs while highlighting the essential work they do. What is next? Over the coming weeks, we’ll continue working closely with The Talent Tap to shape the future design and user experience of their website, ensuring that any new concepts make a meaningful impact for the people they serve.
We look forward to sharing more updates on this exciting journey as we continue to work together! Contact information If you have any questions about our social value initiative or you want to find out more about what services we provide at Solirius please get in touch .
- HMCTS QA Town Hall Recap: Monday, 14th October 2024
The Testing Centre of Excellence (TCoE) at His Majesty's Courts and Tribunals Service (HMCTS) had the privilege of being part of an incredible day of learning and collaboration at the QA Town Hall, run by Solirius. Our Test Lead Elizabeth Jones explains what we got up to. Firstly, what is the Testing Centre of Excellence (TCoE)? The purpose of the TCoE is to establish a centralised hub for standardised testing and quality processes across the organisation. We promote continuous improvement by fostering cohesive testing practices and ensuring that all members have access to the necessary resources and support. Through our dedicated platform, we offer a space for the testing community to connect, collaborate, and share knowledge with access to a wide range of resources, including document templates, best practices, and training materials, all to enhance testing capabilities. The three main goals of the HMCTS TCoE are: Standardisation: To establish a centralised model where testing and quality processes are standardised across the whole of HMCTS. This centralised model will serve as a unified framework that sets clear guidelines, best practices, and standardised methodologies for testing and quality assurance activities. Efficiency: To streamline workflows, reduce redundancies, and enhance the overall efficiency and effectiveness of our testing efforts. Community: To connect testers from all jurisdictions and create a testing community. By sharing experiences, knowledge and training, we can bridge the gap in understanding and expertise, ultimately enhancing the quality and effectiveness of our testing processes across the entire organisation. What did we do at the QA Town Hall?
It was a full day packed with insightful sessions, attended by 47 participants eager to enhance their knowledge of testing practices and get hands-on experience with our workshop. Keynote speaker Christine Pinto: Playwright & AI in test automation talk Christine, a respected figure in the QA community who shares her automation engineering expertise globally through insightful articles and presentations, delivered an engaging talk. She gave us an in-depth look into the capabilities of Playwright and how AI tools can significantly enhance automation frameworks. The session covered practical ways AI can assist with coding, optimising test coverage, and improving the quality, reliability, scalability and security of our tests. We explored various AI tools that streamline development, reduce repetitive tasks, and boost efficiency, empowering teams to stay ahead of the curve. We also discussed essential guidelines for protecting sensitive data, learning strategies to ensure that confidential information is never shared with AI tools, preventing potential data breaches and maintaining robust data security. TCoE: Accessibility & screen reader testing workshop This hands-on session, run by Ayesha Saeed (Accessibility Testing Lead), Philena Bremner (Accessibility/UX Consultant) and Piya Patel (Junior Accessibility Tester), highlighted the importance of accessibility in software testing. Attendees practised live screen reader testing of multiple websites and learned the essential steps for ensuring applications are inclusive for all users, bringing greater awareness about improving product and service accessibility standards. We then covered how to integrate axe-core into automated Playwright tests, a crucial step in shifting accessibility visibility earlier (shift-left) by automatically detecting WCAG 2.2 violations when the CI/CD build pipelines are run.
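For readers who want to try the pattern described above, a minimal sketch follows. The commented section shows typical @axe-core/playwright wiring (an assumed setup, not verified against a live browser here, and the URL is illustrative); the small helper underneath, whose name is ours, is an example of turning axe's violation objects into readable build-log lines.

```typescript
// Shift-left sketch: run axe-core inside a Playwright test so WCAG violations
// fail the CI build. The browser-dependent part is shown as a comment:
//
//   import { test, expect } from '@playwright/test';
//   import AxeBuilder from '@axe-core/playwright';
//
//   test('no WCAG 2.x A/AA violations', async ({ page }) => {
//     await page.goto('https://example.com/start');   // illustrative URL
//     const results = await new AxeBuilder({ page })
//       .withTags(['wcag2a', 'wcag2aa', 'wcag21aa', 'wcag22aa'])
//       .analyze();
//     expect(results.violations.map(describeViolation)).toEqual([]);
//   });

// Illustrative helper: summarise one axe violation for build logs.
// Field names follow the shape of axe-core's results objects.
interface AxeViolation {
  id: string;
  impact?: string;
  nodes: { target: string[] }[];
}

function describeViolation(v: AxeViolation): string {
  return `${v.id} (${v.impact ?? 'unknown'}): ${v.nodes.length} element(s) affected`;
}

console.log(
  describeViolation({
    id: 'color-contrast',
    impact: 'serious',
    nodes: [{ target: ['p.hint'] }],
  })
);
// → "color-contrast (serious): 1 element(s) affected"
```

Mapping violations into short strings like this makes a failing pipeline run immediately actionable: the log names the rule, its severity, and how many elements are affected.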
Attendees learned how automation can help to scale accessibility testing and build more inclusive and user-friendly digital services. Showcasing the Testing Centre of Excellence (TCoE) Website Abigail Smith (QA Engineer) and I (Elizabeth Jones, Test Lead) showcased the HMCTS TCoE website, which has over 230 users and is designed to be a one-stop resource for QA professionals. We demonstrated how the platform can support testers with access to tools, templates, and expert advice, promoting efficiency and continuous learning. Show and tell by the HMCTS Crime department: Improving testing processes We also had an insightful session from James Widdowson (Crime Test Practice Lead) from the Crime Team at HMCTS, who showcased their journey to enhancing testing efficiency and reducing defects through a transformative approach. They shared valuable, actionable strategies for creating a streamlined, collaborative testing environment, detailing how they developed a reliable automation pack compatible with the existing Serenity framework. Their transition from Jenkins and Gerrit to Azure and GitHub, alongside the integration of API checks to verify message processing, substantially increased stability. As a result, they now run over 600 automated UI tests, achieving a pass rate consistently in excess of 90%, highlighting the positive impact of robust automation practices on quality outcomes. A big thank you A special thanks to the day's organisers and session hosts—Elizabeth Jones, Tom Bowker, Gareth Davies, Nimesh Patel, Josh Gray, Christine Pinto, James Widdowson, Abigail Smith, Ayesha Saeed, Piya Patel, Philena Bremner and Jake Gowler—for making this event such a success. Your hard work truly made this an engaging and impactful experience. Finally, we would like to extend a big thank you to PA Consulting for generously hosting the event! We truly appreciate your support.
The QA Town Hall was an incredible day of learning, collaboration and growth for all involved. Thank you to all the participants for attending and actively contributing to the success of the event. We can’t wait for the next one! Contact information If you have any questions about our Quality Assurance services, or if you want to find out more about what other services we provide at Solirius, please get in touch.
- Unlocking the web: start your journey into digital accessibility
A look at how we can follow inclusive practices to ensure equal access to digital services for everyone. Guided by standards such as the Web Content Accessibility Guidelines (WCAG) and legislation, organisations should prioritise accessibility from the outset. Through rigorous testing, user feedback loops and continuous improvement, we can drive progress in accessibility. Overview What is digital accessibility? Who benefits from digital accessibility? Legal standards and guidelines Shift Left accessibility Testing, auditing and user feedback Progress over perfection Contact information What is digital accessibility? Digital accessibility means there are no barriers preventing individuals from using digital services. This makes accessibility a functionality issue: simply put, if a service is not accessible, it is not functional. Although legal requirements highlight the importance of accessibility, it goes beyond compliance checklists and is centred on creating inclusive digital spaces that everyone can use. Who benefits from digital accessibility? Web accessibility benefits everyone. When digital spaces are built with accessibility in mind, the result is faster, easier and more usable services. Importantly, this makes a service accessible to people with permanent, temporary and situational disabilities. People may have accessibility needs across the following areas: Cognitive Visual Auditory Motor Speech Source: https://www.esri.com/arcgis-blog/products/arcgis-storymaps/constituent-engagement/building-an-accessible-product-our-journey-so-far/ Take time to understand your users and their experiences on your services. Not every user will have the same needs, and some users’ requirements may conflict with others’. Providing options and alternatives will allow you to create more inclusive digital spaces with fewer barriers for your users.
Legal standards and guidelines Equality Act 2010 As far as legal requirements go, the Equality Act 2010 states that there is a ‘duty to make reasonable adjustments’ for those who qualify as ‘disabled persons’. Government requirements Under the Public Sector Bodies (Websites and Mobile Applications) (No. 2) Accessibility Regulations 2018, all public services have further defined accessibility requirements, which are to: meet level AA of the Web Content Accessibility Guidelines (WCAG 2.2) as a minimum; work on the most commonly used assistive technologies, including screen magnifiers, screen readers and speech recognition tools; include disabled people in user research; and have an accessibility statement that explains how accessible the service is, which you need to publish when the service moves into public beta. Public services are required to meet these requirements as a minimum, but even for non-public services it is good practice to follow these guidelines. In doing so, you begin to make your digital service an accessible space for all. WCAG The Web Content Accessibility Guidelines (WCAG) are the internationally recognised standard for web accessibility. WCAG organises its guidelines into four principles: Perceivable, Operable, Understandable and Robust (POUR). Following these guidelines enhances the overall accessibility of your web content. Perceivable: provide alternatives for non-text content, captions, and sufficient colour contrast for text. Operable: ensure keyboard accessibility and sufficient reading time, and avoid content that causes discomfort. Understandable: use clear language and consistent navigation, and offer input assistance. Robust: employ valid code, adhere to web standards, and avoid browser-specific features. Currently, web content should adhere to the WCAG 2.2 (2023) standard. The latest version introduces 9 new success criteria (6 at levels A & AA) and removes one (4.1.1 Parsing).
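To make the Perceivable principle concrete, the colour-contrast requirement (success criterion 1.4.3, level AA) is defined by a formula: compute the relative luminance of each colour from its sRGB channels, then take (lighter + 0.05) / (darker + 0.05); normal-size text needs a ratio of at least 4.5:1. A minimal sketch of the calculation (the function names are ours, but the constants come from the WCAG definition):

```javascript
// WCAG relative luminance: linearise each sRGB channel, then weight them.
function relativeLuminance([r, g, b]) {
  const lin = c => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio between two colours: (lighter + 0.05) / (darker + 0.05).
function contrastRatio(fg, bg) {
  const [hi, lo] = [relativeLuminance(fg), relativeLuminance(bg)]
    .sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// Black on white is the maximum possible contrast, 21:1.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // 21.0
```

Automated checkers such as axe-core apply this same calculation to every text node, which is why contrast failures are among the easiest issues to catch early.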
Meeting the WCAG 2.2 guidelines means you will also meet the previous versions of the guidelines. Shift Left accessibility Source: https://blogs.vmware.com/cloud/2021/05/11/shift-left-platform-teams/ Accessibility should not be the responsibility of a single person or role but of the whole team. This means baking accessibility in from the start, from the initial idea through to sign-off. A ‘Shift Left’ approach encourages earlier accessibility reviews and involves everyone on the team, from product owners through to release. It embeds accessibility into the process so that it is not an afterthought or a bottleneck to releases, and it prevents a build-up of accessibility tech-debt items that tend to languish at the bottom of the backlog. Testing, auditing and user feedback A large part of creating accessible services is regularly testing the service using automated testing tools and manual assessments (including testing with assistive technology). At Solirius we have several accessibility specialists who are continuously working to implement, build and maintain accessible and inclusive services. Testing needs to be carried out in parallel with regular user testing to ensure you understand real user experiences and are not just building services to meet compliance. Progress over perfection Accessibility is a vast area with many specialisms and can initially feel overwhelming. But it’s important to remember that even small accessibility considerations are a start and can go a long way for users. Don’t let the pressure of perfection stop you from getting involved and learning about accessibility. Lean on your peers and work out how you can tackle challenges together; it is a learning curve for many, but we all start somewhere. Summary Prioritising web accessibility ensures that your services are inclusive and usable for all users.
By implementing a shift left approach, utilising the Web Content Accessibility Guidelines (WCAG) and involving users with a variety of needs, you can create a more inclusive digital landscape. Remember, accessibility is an ongoing journey involving everyone, and continual efforts to improve will help create digital services that benefit all. Contact information If you have any questions about accessibility, or you want to find out more about what services we provide at Solirius, please get in touch.
- Discovery to Beta: putting users at the centre of the design for new digital services in education
The Education and Skills Funding Agency (ESFA), sponsored by the Department for Education (DfE), brings together the former responsibilities of the Education Funding Agency (EFA) and the Skills Funding Agency (SFA) to create a single agency accountable for funding education and skills training for children, young people and adults. Aligning with the department’s data strategy Funding for educational institutions is delivered through various funding streams. The processes used to gather data for calculating the value and allocation of funds were time-consuming and complex. Teams had developed their own processes, on varying technology stacks, with limited consistency between them. The production of datasets was one of the steps in the process to be digitally transformed, contributing to the department’s data strategy of reducing complexity and improving consistency. The objective of the Funding Data Service (FDS) project was to align the preparation of data with this strategy, whilst enhancing functionality from legacy technology solutions that were being decommissioned. Introducing agile ways of working - Discovery and Alpha For Discovery we deployed an agile multi-disciplinary team consisting of a user researcher, business analyst, service designer, data analyst and developer. In Alpha the team adopted Scrum methodologies and ceremonies (sprint planning, stand-ups, show and tells and retrospectives). A large amount of user research, business analysis and service design was needed during Alpha. User research consisted of interviews, contextual enquiries and user surveys to help develop user personas and map user journeys. Interviews were conducted with small focus groups, starting with single teams and then moving on to larger multi-team meetings. This helped encourage richer discussion and alignment between different groups. The business analysis covered stakeholder analysis, process mapping and backlog development.
This work demonstrated that the service had a relatively small number of users, but that the majority were experts in their discipline. Our focus, therefore, was to absorb as much of their domain knowledge as possible and to ensure that we had a good understanding of their current pain points. “I always appreciated the FDS team following up on the feedback we provided as users because it felt like you were keen to build something that would work for us.” Choosing the right technology During Alpha the technical team carried out data modelling and technical spikes, and developed prototypes to prove our riskiest technical assumptions. For example, our first major technical challenge was to securely transfer data from a SQL Server instance behind an internal firewall to cloud-hosted MS Azure data storage. A technical spike was conducted to investigate the use of Azure services for this, and desk research was carried out to understand all relevant security frameworks. The technical stack consisted of: Front end: Angular 12, HTML, JavaScript, CSS. Back end: .NET Core v5, microservices, Azure web services, Azure Functions. Data stores: SQL Server, Azure SQL Server, Azure Blob Storage, SQL SSIS packages, Azure Data Factory. At the conclusion of Alpha the team had: validated that a digital service would help resolve the problem; identified the people and process change necessary for the new service; and agreed the tech stack and developed an approach for building a Minimal Viable Service (MVS) for Beta. Developing the Minimal Viable Service The MVS would encompass a digital system for sourcing, managing and publishing provider data, including integrations with other digital services. This would deliver extensive value for the client and enable the decommissioning of legacy functionality. Using Scrum and working in two-week sprints, the team established a regular delivery cadence that supported dependency and risk management at a programme level.
We adopted a behaviour-driven development (BDD) approach across the team (development, quality assurance, analysis and design) to refine our understanding of user needs and pain points. Early-stage wireframes were iterated into high-fidelity, ‘development ready’ designs based on user feedback collected in design working sessions. User stories incorporated gherkin-syntax acceptance criteria to give both the development and quality assurance teams a clear understanding of the expected user experience. The quality assurance team deployed an ‘automation first’ approach to testing, improving consistency, frequency and efficiency in test execution. Putting users first “I did feel like I wanted to put the extra effort in for FDS as it felt you listened to me as a user of the service and actually took on board what we wanted” Due to the seasonal nature of the users’ workload (peaks around term times), the timing of the MVS go-live needed to coincide with the start of a new funding year to prevent operational disruption. Before the release of new functionality, the team conducted usability testing sessions with key users. This was critical to the product achieving user acceptance, and the feedback captured in a ‘near-live’ environment was analysed, refined and ultimately added to the product backlog as development-ready user stories. The team worked closely with users in group and one-to-one settings, and delivered regular ‘show and tell’ sessions with stakeholder groups including senior leadership, other digital services and potential future users. ‘Show and tells’ were used to drive a common understanding of the project’s progress and the service itself, and to capture input from a wider cohort. This helped to manage expectations and dependencies with other teams. Growing the service The goals were to deliver an MVS that would meet user needs, deliver value, prevent operational disruption, and create the foundation for future scaling and enhancement.
The MVS went live after 5 months of intensive work, supporting the delivery of £691 million in annual funding for 16 to 19 year olds. Since the MVS launch, the team has: transitioned to a hybrid live-support and development model, supporting day-to-day operations alongside the delivery of new functionality; released new functionality weekly, ensuring value is provided quickly and incrementally; and onboarded new funding streams, meaning the service now supports the annual delivery of billions of pounds in education and skills funding. The team received excellent feedback throughout for their user-centred approach and were widely recognised as an exemplar of agile software development.