- Unlocking the web: start your journey into digital accessibility
A look at how we can follow inclusive practices to ensure equal access to digital services for everyone. Guided by standards such as the Web Content Accessibility Guidelines (WCAG) and by legislation, organisations should prioritise accessibility from the outset. Through rigorous testing, user feedback loops and continuous improvement, we can drive progress in accessibility.

Overview

- What is digital accessibility?
- Who benefits from digital accessibility?
- Legal standards and guidelines
- Shift Left accessibility
- Testing, auditing and user feedback
- Progress over perfection
- Contact information

What is digital accessibility?

Digital accessibility ensures there are no barriers for individuals using digital services. This makes accessibility a functionality issue: simply put, if a service is not accessible, it is not functional. Although there are legal requirements that highlight the importance of accessibility, it goes beyond compliance checklists and is centred on creating inclusive digital spaces that everyone can use.

Who benefits from digital accessibility?

Web accessibility benefits everyone. When digital spaces are built with accessibility in mind, the result is faster, easier and more usable services. Importantly, this makes the service accessible to people with permanent, temporary and situational disabilities. People may have accessibility needs across the following areas:

- Cognitive
- Visual
- Auditory
- Motor
- Speech

[Image: visual representation of disability types such as cognitive, visual, auditory, motor and speech. Source: https://www.esri.com/arcgis-blog/products/arcgis-storymaps/constituent-engagement/building-an-accessible-product-our-journey-so-far/]

Take time to understand your users and their experiences on your services. Not every user will have the same needs, and some users' requirements may conflict with others. Providing options and alternatives will allow you to create more inclusive digital spaces with fewer barriers for your users.
Legal standards and guidelines

Equality Act 2010

As far as legal requirements go, the Equality Act 2010 states that there is ‘a duty to make reasonable adjustments’ for those who qualify as ‘disabled persons’.

Government requirements

Under the Public Sector Bodies (Websites and Mobile Applications) (No. 2) Accessibility Regulations 2018, all public services have further defined accessibility requirements, which are to:

- meet level AA of the Web Content Accessibility Guidelines (WCAG 2.2) as a minimum
- work on the most commonly used assistive technologies - including screen magnifiers, screen readers and speech recognition tools
- include disabled people in user research
- have an accessibility statement that explains how accessible the service is - you need to publish this when the service moves into public beta

Public services are required to meet these standards as a minimum, but even for non-public services it is good practice to follow them. In doing so, you begin to make your digital service an accessible space for all.

WCAG

The Web Content Accessibility Guidelines (WCAG) serve as the internationally recognised standard for web accessibility. WCAG organises its guidelines into four principles: Perceivable, Operable, Understandable and Robust (POUR). Following these guidelines enhances the overall accessibility of your web content.

- Perceivable: provide alternatives for non-text content, captions, and sufficient colour contrast for text.
- Operable: ensure keyboard accessibility and sufficient reading time, and avoid content that causes discomfort.
- Understandable: use clear language and consistent navigation, and offer input assistance.
- Robust: employ valid code, adhere to web standards, and avoid browser-specific features.

Currently, web content should adhere to the WCAG 2.2 (2023) standard. This version introduces 9 new success criteria (6 at levels A and AA) and removes one (4.1.1 Parsing).
Meeting the WCAG 2.2 guidelines means you will also meet the previous versions of the guidelines.

Shift Left accessibility

[Image: visual representation of shift left activities that move security, testing and operations processes earlier in the development cycle, across the plan, code, build, test, release, deploy, operate and monitor phases. Source: https://blogs.vmware.com/cloud/2021/05/11/shift-left-platform-teams/]

Accessibility should not be the responsibility of a single person or role but of the whole team. This means baking accessibility in from the start, from the initial idea through to sign-off. A ‘Shift Left’ approach encourages earlier accessibility reviews, involving everyone on the team from product owners through to release. It embeds accessibility into the process so that it is not an afterthought or a bottleneck to releases, and it prevents an excess of accessibility tech-debt items that tend to languish at the bottom of the backlog.

Testing, auditing and user feedback

A large part of creating accessible services is regularly testing the service using automated testing tools and manual assessments (including testing with assistive technology). At Solirius we have several accessibility specialists who are continuously working to implement, build and maintain accessible and inclusive services. Testing needs to be carried out in parallel with regular user testing, so that you better understand users' real experiences and are not just building services to meet compliance.

Progress over perfection

Accessibility is a vast area with many specialisms, and it can initially feel overwhelming. But it's important to remember that even small accessibility considerations are a start and can go a long way for users. Don't let the pressure of perfection stop you from getting involved and learning about accessibility.
Lean on your peers and figure out how you can tackle challenges together; it is a learning curve for many, but we all start somewhere.

Summary

Prioritising web accessibility ensures that your services are inclusive and usable for all users. By implementing a shift left approach, using the Web Content Accessibility Guidelines (WCAG) and involving users with a variety of needs, you can create a more inclusive digital landscape. Remember, accessibility is an ongoing journey involving everyone, and continual efforts to improve will help create digital services that benefit all.

Contact information

If you have any questions about accessibility, or you want to find out more about the services we provide at Solirius, please get in touch.
- Meet the Team: Ayesha Saeed
Ayesha shares her journey to becoming an Accessibility Lead at Solirius, as well as insight into her top tips and interests.

Meet Ayesha Saeed, a Senior Accessibility Specialist with over 5 years of experience working in accessibility on a range of products in both the public and private sectors. She has a wide variety of experience, from conducting audits, delivering training and building implementation plans with teams, through to app accessibility and consulting.

How did you get involved in accessibility?

I have a QA background, so I started my accessibility journey by conducting accessibility audits, which prompted me to begin learning about accessibility principles and user-focused design. I really enjoyed learning about accessibility and all the different specialisms within it. I studied Social Anthropology at university, so I enjoyed learning about people and understanding the numerous ways people interact with technology.

I went on to work on a government project where I learnt a lot about the laws surrounding digital accessibility, GDS standards and WCAG compliance. I expanded my experience to mobile apps, gaining invaluable insights into the nuances of mobile accessibility and learning more about the guidelines for the iOS and Android platforms. I began to cultivate a culture of accessibility on the projects I worked on, educating my teams and working to ensure that accessibility considerations were no longer an afterthought.

Currently, I am an Accessibility Lead at Solirius working on another government project, managing several services and ensuring they have the necessary guidance to deliver accessible services. I support testing practices, write accessibility statements and work with teams to build roadmaps to make their services accessible. I also deliver training sessions to empower services to integrate accessibility principles in the early stages of development, and help motivate them to sustain their efforts throughout the process.
What are your interests?

I like to cook a lot and enjoy taking my mum’s classics and turning them into veggie-friendly versions using my homemade seitan. I also like to keep active by swimming regularly and (occasionally) attempting yoga. I’ve also got into crocheting recently and enjoy seeing what I can make.

Top accessibility tip?

Don’t feel like you need to know it all! Digital accessibility is such a rich subject and can be difficult to grasp when you are new to it. Just remember to be patient with your learning, reach out to peers, read about accessibility and try to get involved with the accessibility communities for support. Your small changes can have a huge impact!

Top accessibility resource?

The A11y Slack - it’s a great community of accessibility specialists and advocates who are friendly and open to helping. It is free and open to all, and you can join at web-a11y.slack.com.

Contact information

If you have any questions about accessibility, or you want to find out more about the services we provide at Solirius, please get in touch.
- Breaking barriers: digital inclusion in government services
Breaking barriers: digital inclusion in government services

In this article, Piya discusses the importance of creating government services that are accessible to everyone. Government accessibility standards exist to ensure that a wide range of people can use government services on both web and mobile applications. Importantly, accessibility is a shared responsibility, and Piya lists resources that offer guidance on integrating accessibility into the development of services.

Overview

- GOV.UK requirements
- Meeting WCAG 2.2
- Testing with assistive technology
- User research with disabled people
- Accessibility statements
- GOV.UK design system
- DWP resource

GOV.UK requirements

The government accessibility requirements state that all services must meet the following criteria to ensure that all legal requirements regarding public sector websites and mobile applications are met:

- Meet level AA of WCAG 2.2 (the Web Content Accessibility Guidelines) at a minimum
- Work on the most commonly used assistive technologies - including screen magnifiers, screen readers and speech recognition tools
- Include disabled people in user research (including people with cognitive, motor, situational, visual and auditory impairments)
- Have an accessibility statement that explains how accessible the service is (published when the service moves to public beta)

Meeting these requirements ensures that services meet the legal requirements set out in the Public Sector Bodies (Websites and Mobile Applications) (No. 2) Accessibility Regulations 2018. In addition, we can ensure that we are creating more inclusive digital services for users with diverse needs.

Meeting WCAG 2.2

WCAG 2.2 is based on 4 principles that emphasise the need to think about the different ways people interact with digital content:

- perceivable: recognising and using the service with the senses that are available to the user.
- operable: finding and using content, regardless of how a user chooses to access it.
- understandable: understanding content and how the service works.
- robust: content that can be interpreted reliably by a wide variety of user agents.

For example, users might use a keyboard instead of a mouse, or rely on a screen reader to have content spoken aloud. The WCAG 2.2 principles apply to all aspects of your service (including code, content and interactions), which means all members of your team need to understand and consider them. It is important to conduct regular accessibility testing, using a range of automated and manual tools, as early as possible to ensure your design, code and content meet WCAG 2.2 AA requirements (all A and AA criteria).

Testing with assistive technology

To meet the government service standard, testing should be done across the following assistive technologies and browsers throughout development, ensuring that the most commonly used assistive technologies are tested and work on the service before it moves to public beta:

- JAWS (screen reader) on Chrome or Edge
- NVDA (screen reader) on Chrome, Firefox or Edge
- VoiceOver (screen reader) on Safari
- TalkBack (mobile screen reader) on Chrome
- Windows Magnifier or Apple Zoom (screen magnifiers)
- Dragon (speech recognition tool) on Chrome

[Image: a low-vision user using a screen magnification tool to increase the text size on a webpage so they can see the content clearly. Source: Digital Accessibility Centre (DAC) - https://digitalaccessibilitycentre.org/usertesting.html]

Making sure services are compatible with commonly used assistive technologies is a shared responsibility: testing across these combinations should happen throughout all stages of development - when planning new features, when designing and building them, and when testing. For more information, see testing with assistive technologies.
User research with disabled people

Inclusive user research is essential for creating user-centred services that meet the needs of all users, including those with disabilities and diverse backgrounds. By involving a varied group of participants early on, teams can identify and address usability and accessibility barriers, enhancing the design, functionality and content to benefit everyone. This approach encourages continuous improvement, ensuring government services evolve with users' needs. Ultimately, inclusive user research builds trust by showing a commitment to accessibility, making services more usable and welcoming for a broader audience.

Accessibility statements

Accessibility statements are required to communicate how accessible a service is. This includes stating the WCAG compliance level, explaining where the service has failed to meet the guidelines (with a roadmap of when this will be fixed), providing contact information and explaining how to report accessibility issues. Government services should follow a standard accessibility statement format to maintain consistency.

GOV.UK Design System (GDS)

The GOV.UK Design System (GDS) has many reusable components that are used across government services. Each component shows an example, an option to view details of how to implement the component, and research on the component's usability and the kinds of issues users have faced. Any known accessibility issues are also highlighted, and based on this research some components are labelled ‘experimental’, as some users may still experience issues navigating them. Services must proceed with caution when adopting these components, and carry out rigorous manual, assistive technology and user testing to ensure that the implementation is accessible and WCAG guidelines are met.

[Image: example of where to find accessibility research on the GDS details component, under the heading ‘Research on this component’.]
Source: GOV.UK Design System (GDS) details component - https://design-system.service.gov.uk/components/details/

Summary

Overall, government services must ensure they are regularly tested and work for users who have a range of access needs or assistive technology requirements, including by:

- Reviewing, understanding and meeting GOV.UK and WCAG 2.2 standards
- Implementing accessible components that can be accessed with assistive technology
- Making accessibility the whole team's responsibility when developing a service
- Regularly testing with users with disabilities
- Providing an accessibility statement to inform users where the service does and does not meet accessibility guidelines

Accessibility should be considered from the start: retrofitting costs more time and resources, and in the meantime your users may not be able to use your service.

DWP resource

The Department for Work and Pensions (DWP) accessibility manual is a great resource for guidance on testing, accessibility best practices throughout service development, and details of how each member of the team can integrate accessibility.

[Image: DWP Accessibility Manual home page. Source: GOV.UK - Accessibility in Government - https://accessibility.blog.gov.uk/2021/05/27/why-weve-created-an-accessibility-manual-and-how-you-can-help-shape-it/]

Contact information

If you have any questions about our accessibility services, or you want to find out more about the other services we provide at Solirius, please get in touch.
- 6 common accessibility mistakes in design—and how to fix them
6 common accessibility mistakes in design—and how to fix them

by Philena Bremner

In this article, Philena discusses the importance of designing accessible experiences that cater to a diverse range of users, as well as to temporary or situational challenges. She touches on why accessibility is not just a technical requirement but a design principle that benefits everyone, and highlights six common design mistakes that hinder accessibility, with practical solutions for creating more inclusive, user-friendly designs.

Why accessibility in design matters

Design isn’t just about making things look good—it’s about making sure everyone can use your product or service. Think about it: you’ve probably struggled with low contrast on your phone in bright sunlight, or found it hard to navigate a cluttered website when you’re in a rush. Accessible design makes things easier for everyone.

But accessibility isn’t just about following guidelines - it’s also about understanding real user needs. That’s why user research and feedback on design decisions are essential to ensure designs truly meet the needs of diverse users. By listening to feedback and testing with people who have a range of abilities and experiences, designers can identify barriers and create solutions that work for everyone. So, let’s look at some common design mistakes and how you can avoid them to create a better experience for all users.

Mistake 1: Low contrast text

Let’s start with one of the most obvious issues - low contrast. Sure, it might look stylish to have light grey text on a white background, but can anyone actually read it? Now imagine someone with a visual impairment trying to make sense of it. And low contrast isn’t just an issue for those with impaired vision: think of someone trying to read on their phone outside in the sun, with glare reflecting off the screen—contrast matters in that scenario too.
[Don’t: example of low contrast text with light grey text on a light grey background, making it hard to read.]
[Do: example of high contrast text with dark grey text on a lighter grey background, making it clear and easy to read.]

How to get it right:

- Aim for a contrast ratio of at least 4.5:1 for normal text.
- Use tools like the WebAIM Color Contrast Checker to test your designs.
- Think of contrast as a universal design principle—if it’s easier for someone with a visual impairment, it’s easier for everyone.

Mistake 2: Relying only on colour to convey information

Think about a form where the only indication of an error is a red outline. For someone who’s colourblind, that red outline might not even register. The same problem occurs when colour alone is used to convey important information, such as in charts or buttons. Accessibility isn’t just about catering to specific disabilities; it’s also about ensuring clarity for everyone. Whether it’s a person with colour blindness or someone interacting with your design in less than ideal lighting, relying solely on colour can be a problem.

[Don’t: example of two forms side by side showing an error that relies solely on colour. On the left, the perspective of a user without colour blindness shows a red border around the email field to indicate an error. On the right, the perspective of a user with colour blindness (deuteranopia) shows the same form, where the red border is not distinguishable, making the error unclear.]
[Do: example of two forms side by side showing an improved design where errors are supplemented with icons and text. On the left, the perspective of a user without colour blindness shows an email field with a red border, an error icon and the text 'Enter your email.' On the right, the perspective of a user with colour blindness (deuteranopia) shows the same form, where the error icon and text are clearly visible, making the error understandable without relying on colour alone.]
How to get it right:

- Always supplement colour with icons, text or patterns. For example, instead of just using a red outline for errors, add a symbol and text that clearly explain the issue and how to fix it.
- Use a colour-blindness simulator during the design process to check that your work is still clear without colour. Be aware that simulators will never replace real user feedback: always test your designs with diverse users.

Mistake 3: Complex layouts that confuse users

We’ve all been there—landing on a website so cluttered and chaotic that we have no idea where to look. For someone with cognitive disabilities or attention issues, this kind of layout can make navigation nearly impossible. But even without a disability, a complex layout can be frustrating. Picture yourself trying to book a flight on a crowded train, with limited time and attention—simplicity and clarity become lifesavers.

[Don’t: example of three pages showing a complex and inconsistent layout. The panels have inconsistent button placements, varied spacing and misaligned elements, making navigation and readability difficult.]
[Do: example of three panels showing a simple and consistent layout. The panels have aligned elements, consistent button placements labelled 'Continue', and uniform spacing, making navigation clear and easy to follow.]

How to get it right:

- Use a clear visual hierarchy, with headings and subheadings that guide users.
- Make important information easy to find with a clean layout, for example by grouping related elements together to create an intuitive flow.
- Use consistent spacing, fonts and alignment to reduce cognitive load.
- Keep consistency across pages, so users don’t have to relearn how to navigate every time. For example, place the primary action button, such as "Continue" or "Submit", in the same location on every page and use consistent labelling to avoid confusion.

Mistake 4: Text that’s too small or difficult to read

Tiny text is a big problem.
Whether someone has low vision or is trying to read on a small screen during a bumpy car ride, small, illegible text makes for a frustrating experience. Readable text benefits everyone: imagine you’re trying to skim an article on your phone during your commute—clear, bold text that’s easy to read helps you grasp the key points.

[Don’t: example showing text that is tiny and hard to read, with a decorative font that reduces readability.]
[Do: example showing text with a larger font size and a clear, easy-to-read typeface for better accessibility.]

How to get it right:

- Use a minimum font size of 16px for body text.
- Keep line length between 45 and 75 characters for better readability.
- Choose fonts that are easy to read, with good spacing between letters and lines. Fonts often considered accessible include Arial, Calibri, Century Gothic, Helvetica, Tahoma, Verdana, Tiresias and OpenDyslexic. Again, it is important to get real user feedback to see what works for your users.

Mistake 5: Missing image descriptions

For someone using a screen reader, images without descriptions are a black hole of information. They can’t see what the image is trying to convey, so they miss out on key content. Alternative text (alt text) provides that context by describing images for users who can’t see them. But alt text isn’t just for screen reader users. What about someone on a slow internet connection? While they’re waiting for images to load, they can still understand what’s there if you’ve provided alt text.

[Don’t: example showing unclear alt text for an image with a purpose. The image of mountains and a sun is labelled with the file name '12344545767.jpg', which provides no meaningful context.]
[Do: example showing clear alt text for an image with a purpose. The image of mountains and a sun is described as 'Simple illustration of mountains and the sun', providing meaningful context.]
How to get it right:

- Always include meaningful alt text for images that convey information.
- Avoid purely decorative images, or if they are used make sure they’re marked as such by using empty alt text (alt="").
- Alt text should reflect the image’s purpose and context in relation to the surrounding content. For example, for a ‘simple illustration of mountains and a sun’: on a page about travel destinations it could be "Illustration of a mountain range at sunrise, representing a peaceful travel location", while on a page about design inspiration it could be "Minimalist mountain and sun illustration showcasing simple design concepts."
- Think of alt text as part of the story you’re telling—don’t leave users in the dark. See: How to write good alt text for screen readers.

Mistake 6: Incomprehensible data graphs

Complex data visualisations can be a headache for users, especially those using assistive technology or those who are colourblind. Labels that are too small, or graphs that rely solely on colour, can make it difficult to understand what’s being presented. But this isn’t just a challenge for users with disabilities: anyone trying to read a graph on a small screen or in a distracting environment will appreciate clear, easy-to-understand visuals. One simple way to make graphs more accessible is to incorporate patterns or textures in addition to colour. For example, instead of only using red and green in a pie chart, you can add stripes or dots to differentiate sections for users who struggle with colour perception.

[Don’t: example of two pie charts relying solely on colour to convey information. On the left, the perspective of a user without colour blindness shows sections in orange, purple and pink labelled 'Pass', 'Fail' and 'Not applicable'. On the right, the perspective of a user with colour blindness (achromatopsia) shows the same chart in grayscale, making it impossible to distinguish between sections.]
[Do: example of two pie charts with patterns and labels supplementing colour. On the left, the perspective of a user without colour blindness shows the chart with colours, patterns and text labels indicating '24% not applicable', '45% pass' and '31% fail'. On the right, the perspective of a user with colour blindness (achromatopsia) shows the same chart with patterns and text labels, so the data is still understandable without relying on colour.]

How to get it right:

- Provide clear, concise summaries of data trends.
- Label graphs and charts clearly, with text and visual cues like patterns.
- Use high contrast colours, and provide alternative formats, like tables, for users who prefer text-based information.
- For image-based graphs, provide clear alt text or captions that describe the data and key insights, so the information is accessible to screen reader users.

Designing for everyone

At the end of the day, accessibility is about making sure everyone has equal access to services and products. By avoiding these common design mistakes, you’re not just helping people with disabilities—you’re creating a better experience for anyone in a permanent, temporary or situational circumstance where good design means accessible design.

Take action

When designing services or products, ask yourself: is this accessible for everyone? Start making these changes today, and be sure to conduct user accessibility testing along the way - you may be surprised by the small changes that improve the overall user experience for everyone.
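The 4.5:1 threshold from Mistake 1 is not arbitrary: WCAG 2.x defines contrast ratio as a formula over the relative luminance of the two colours, so it can be checked programmatically. Below is a minimal Python sketch of that calculation (the function names are our own; this illustrates the published formula, and is not a substitute for tools like the WebAIM checker or for testing with real users):

```python
def _channel(c: int) -> float:
    # Linearise one 8-bit sRGB channel, per the WCAG relative-luminance definition.
    c = c / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    # Relative luminance of an sRGB colour: 0.0 for black, 1.0 for white.
    r, g, b = (_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(a: tuple[int, int, int], b: tuple[int, int, int]) -> float:
    # WCAG contrast ratio, ranging from 1:1 (identical) to 21:1 (black on white).
    lighter, darker = sorted((relative_luminance(a), relative_luminance(b)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background gives the maximum ratio of 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))      # 21.0
# Mid-grey #767676 on white just clears the 4.5:1 minimum for normal text.
print(contrast_ratio((118, 118, 118), (255, 255, 255)) >= 4.5)   # True
```

A check like this is handy in design-system CI, flagging token pairs that drift below the AA threshold before they reach users.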
Additional resources

To further develop your accessibility design skills, explore these valuable resources:

- Accessibility - Material Design
- WebAIM: Web Accessibility for Designers
- Stark - Contrast & Accessibility Checker | Figma
- Accessible fonts and readability: the basics
- How to write good alt text for screen readers

Contact information

If you have any questions about our accessibility services, or you want to find out more about the other services we provide at Solirius, please get in touch.
- Let’s talk accessibility: why we need proxy users
Have you ever been in a situation where you’re keen to test the accessibility of a service, but your target users haven’t communicated any accessibility needs? Sree (Sreemoyee), our Principal User Researcher, discusses how you can advocate for diverse user needs and ensure inclusive design on your projects.

In a recent project, our data-fluent user group did not declare any accessibility needs, which led our team to consider skipping accessibility tests. Recognising the importance of catering to future users with accessibility needs and staying ahead of evolving user requirements, I turned to an ‘Accessibility Lab’: a database of proxy users with accessibility needs, curated by our client’s User Centered Design (UCD) team.

Who are proxy users in the context of accessibility testing?

Proxy users, though not part of the primary user group, share comparable digital skills and accessibility needs that make them useful contributors to inclusive design. For my education-centric project, the Department for Education (DfE) Accessibility Lab was the ideal resource, featuring primarily teachers as proxy users who had signed up to be contacted for accessibility testing. Importantly, these teachers were not users of the service we were testing, ensuring unbiased perspectives without preconceptions.

[Image: Venn diagram illustrating the intersection of target users and proxy users, highlighting shared traits in the overlapping area: comparable digital skills and accessibility needs.]

How I prepared for accessibility testing with proxy users: hot tips

We opted for remote testing to accommodate the preferences and availability of the proxy users. This decision necessitated adjustments to ensure effective testing.

Clearly communicating the necessary information

I communicated with the participants through emails and video calls, reassuring them that no prior knowledge of the service was necessary.
Before the remote testing sessions, I provided them with the project background, outlining the goal of evaluating the service's accessibility. Throughout, I encouraged open communication, emphasising to participants that we were testing the service, not them, to invite candid and honest feedback.

Tailoring the usability tests

It was important to familiarise myself with the specific accessibility needs of the proxy users and understand each person's unique requirements. When testing with a participant with dyslexia who reported finding traditional text-heavy interfaces challenging, I asked them to describe their current environment and any assistive technologies they might use for dyslexia. During the test, I focused on their interaction with fonts, line spacing and visual cues to assess their comprehension of the content.

Crafting guided interactions

In remote sessions, I asked participants to use their main device and specified the browsers. Recognising the potential challenges faced by proxy users who are unfamiliar with the service, I provided extra guidance and prompts to enhance clarity in task understanding. For example:

- Original prompt: "Start the data submission journey and go through it as you normally would."
- Guided prompt: "Start the data submission journey by selecting option x on the homepage, and if you encounter any difficulties, feel free to ask for guidance."

Observing and enquiring

As the remote setting made it more difficult to pick up on non-verbal cues, I used screen-sharing tools to observe participants' facial expressions and gestures as they navigated the webpages. I encouraged them to think out loud and share their preferences and dislikes. With their consent, I recorded the sessions for later review.
I observed closely for signs of difficulty and asked open-ended questions, such as: “How did you feel navigating through that section?” “How would you describe your experience using this feature?” Engaging with empathy Mindful of potential challenges faced by users with cognitive impairments, I approached remote testing with patience and empathy. I gave extra time for understanding, adjusted the testing environment based on their real-time feedback, and strategically built in breaks and buffers within the testing schedule. One participant made what was my favourite request: “Mind if I take a break to cuddle my cat?” Using relevant tools and technologies I facilitated the use of tools and assistive technologies as per user need to make the testing process smoother and more accurate. During a session, noting the need for screen magnification, I provided proxy users with the option to adjust the interface’s font size and contrast settings. Would I recommend accessibility testing with proxy users? Absolutely. The Project Leads observed these research sessions firsthand and described them as “eye-opening” and “fascinating”. But why? The pros of accessibility testing The benefits of conducting accessibility testing with proxy users are nuanced and varied: Tech-debt mitigation In the absence of actual users with declared accessibility needs, accessibility testing with proxy users encourages the adoption of inclusive design and development practices from the outset - the foundation that a truly user-centered service is built upon. In testing, visually impaired users highlighted issues with cluttered screens and excessive scrolling. Their feedback revealed that the approach of cramming information into a small screen made it hard for users with visual challenges to understand the content. Frustrated user staring at a laptop, stating: ‘A busy screen is hell.' 
The insight from users with accessibility needs, together with feedback from our target users, prompted us to simplify the homepage, making it cleaner and more straightforward, reducing cognitive load. We validated these changes through further testing to ensure enhanced usability. Proxy users, with their unique needs, enable us to spot and fix accessibility issues early, helping avoid the accumulation of technical debt and costly retrofits later in the service’s development journey. Ethical inclusivity Engaging with diverse users is vital for inclusivity. When real users don’t declare accessibility needs, proxy users guide us in understanding diverse experiences. It’s not a checkbox exercise; it’s our ethical duty to ensure digital services are equitable for everyone. During testing, one proxy user emphasised the importance of truly grasping diverse user needs, stating: “I want options, not assumptions… It’s awfully good of you and your team to reach out to understand my experiences.” A proxy user stating “I want options, not assumptions.” Enhancing user experience through unbiased perspectives Proxy users, especially those unrelated to the service or product being tested, bring a fresh perspective to the table. They offer insights without the bias of prior knowledge or experience, helping us see our product objectively. Their feedback acts as a powerful tool to uncover potential blind spots and create a more user-friendly experience. Compliance with accessibility standards Conducting accessibility testing, alongside accessibility audits, helps us meet the Web Content Accessibility Guidelines (WCAG) 2.2, which is based on 4 design principles: perceivable, operable, understandable, and robust. A four-piece jigsaw puzzle representing the four design principles: perceivable, operable, understandable, robust.
By structuring the guidelines around principles rather than specific technologies, WCAG emphasises the need to understand how people interact with digital content, ensuring that the service is accessible, identifying areas for improvement, and reducing legal risks while promoting ethical design and development practices. Specific educational insights In the case of our education-focussed project, testing with proxy users who were primarily teachers gave us valuable insights into the unique accessibility needs of education providers. Their feedback enabled us to develop and refine our service to align with the real needs of those in the sector. The cons of accessibility testing with proxy users While the benefits of involving proxy users are significant, it’s essential to acknowledge potential risks: Representation gap Proxy users, while sharing comparable accessibility needs, may not fully represent the experiences of the target user group. To address this, it’s essential to complement proxy user insights with targeted feedback from users with disabilities to bridge the representation gap. Availability Finding suitable proxy users for recruitment can be a challenge, potentially causing testing delays. In my project, this risk was mitigated by leveraging the client’s Accessibility Lab, a readily available database of proxy users, which avoided recruitment challenges and minimised testing delays. Intermediary role Proxy users, as intermediaries, may unintentionally filter or misunderstand information because they might not fully grasp the nuances of the target user group’s experiences. To counter this, I structured testing sessions with extra guidance and prompts to minimise the risk of misinterpretation. In conclusion Effective leveraging of proxy users in accessibility testing requires a balanced approach.
While their insights are invaluable for inclusive design and early issue detection, it’s important to supplement their feedback with testing from actual users with disabilities whenever possible. Combining both approaches ensures a thorough evaluation of accessibility and usability. See you folks on the inclusive side! Key takeaways Inclusive design: Proxy users can play a crucial role in ensuring inclusive design for diverse user groups, especially when there are no declared users with accessibility needs in the user research pool Strategic decision-making: Gaining insights into the accessibility needs of a diverse audience enables informed, data-driven choices. Communication is key: Clear communication before and during testing sessions, and encouraging open feedback, creates a conducive testing environment. Tailoring testing sessions: Adapting usability tests to address specific accessibility challenges enables a focused assessment of user interactions with the service. Testing with empathy and flexibility: Prioritising users’ needs and conducting tests with patience and empathy are crucial. Maintaining a balanced approach: While proxy user insights are invaluable, supplementing feedback with testing from actual users with disabilities ensures a comprehensive evaluation of accessibility and usability. Useful resources Understanding WCAG 2.2 WCAG 2.2 Map Testing for accessibility Contact information If you have any questions about our accessibility services or you want to find out more about other services we provide at Solirius, please get in touch. This article was originally posted by Sree on medium.com.
- WCAG 2.2 one year on: Impact on government services
WCAG 2.2 one year on: Impact on government services by Ayesha Saeed More than a year on from the release of WCAG 2.2, what should you be doing as a government service? Ayesha, one of our Accessibility Leads, answers some key questions you may have about how to implement WCAG 2.2 if you haven't already started. Overview: What is WCAG? Overview of the changes What are the new guidelines? Key questions on WCAG 2.2 Looking forward Useful resources What is WCAG? The WCAG (Web Content Accessibility Guidelines) (opens in a new tab) are universal guidelines that are used by public bodies to ensure accessibility is built into digital services. The WCAG guidelines are broken down by levels: Level A: Must do, basic requirements (legally required for public sector). Level AA: Must do, removes further significant barriers (legally required for public sector). Level AAA: Specialised support, most comprehensive. Meeting the WCAG guidelines is one part of meeting your legal accessibility requirements as a government service (for both public and internal users). Check out Piya’s article on government requirements (opens in new tab) from earlier in our accessibility series for details. You can also see understanding accessibility requirements for public sector bodies (opens in new tab) for a comprehensive breakdown. Overview of the changes The latest official version of WCAG 2.2 was published on 5th October 2023. This replaces the previous version, 2.1, which was published in 2018. WCAG 2.2 builds on and is compatible with WCAG 2.1, with added requirements. One success criterion, 4.1.1 Parsing, was removed in WCAG 2.2 as it was deemed redundant. WCAG 2.2 also addresses aspects related to privacy and security in web content. There are 9 further A, AA and AAA guidelines to be aware of, including focus management, dragging movements, target size, consistent help, redundant entry, and accessible authentication.
6 of the new criteria are at A and AA level, which government services are legally required to meet, bringing the total number of A and AA criteria in WCAG 2.2 to 55. You can see the full details of the changes on the W3 website for the WCAG 2.2 introduction (opens in new tab). What are the new guidelines? Level A and AA: 2.4.11: Focus Not Obscured (Minimum) (AA): focus states must not be entirely hidden. A graphic of a good example of two popup bubbles overlapping. You can partially see the focus on the popup behind. 2.5.7: Dragging Movements (AA): functionality must not rely solely on dragging. Alternatives, such as buttons for left and right, should be provided. A graphic of a good example of a dragging function, with left and right arrows on either side. A hovering mouse shows how you can use the buttons and the dragging feature. 2.5.8: Target Size (Minimum) (AA): interactive targets must be at least 24px by 24px, or spaced so that there is only one interactive target in any 24px by 24px area (with some exceptions, such as inline links). A graphic of a good example of icons where there is only one interactive element in a 24px by 24px area. 3.2.6: Consistent Help (A): help mechanisms must appear in the same place on each page. A graphic of a good example of two screens next to each other, with the help function located in the same top right hand corner on both. 3.3.7: Redundant Entry (A): users must not be required to re-enter the same information, unless essential, such as for security purposes. Where the same information is needed twice, provide an option to auto-populate it. A graphic of a good example of the option to use the same details for an address so a user does not have to enter the same information twice. In this example there is a checkbox to say the billing address being input is the same as the address already entered. 3.3.8: Accessible Authentication (AA): authentication must not require a cognitive test (exceptions for object recognition or personal content).
For example, provide compatibility with a password manager so a user doesn't have to input or transfer information for authentication. A graphic of a good example of giving users several options for authentication e.g. through the use of a password manager. Level AAA: 2.4.12: Focus Not Obscured (Enhanced): focus states must not be hidden at all. A graphic of a good example of two popup bubbles. You can fully see the focus on the popups and they do not overlap. 2.4.13: Focus Appearance: the focus indicator must meet a contrast ratio of at least 3:1 and be at least 2px thick, going around the item. A graphic of a good example of a clear focus around a button, with contrast of a minimum of 3:1 and 2px thickness. In this example a black outline is used on a light grey background. 3.3.9: Accessible Authentication (Enhanced): authentication must not require a cognitive test, with no exceptions. For example, provide compatibility with a password manager so a user doesn't have to input or transfer information for authentication. A graphic of a good example of an authentication form with no cognitive test or Captchas to login. Key questions on WCAG 2.2 Q1: Does meeting WCAG 2.2 ‘break’ my accessibility progress? No. WCAG 2.2 is backwards compatible: a site that meets WCAG 2.2 will also meet 2.1 and 2.0. Q2: When do I start building and testing for WCAG 2.2? Testing your service against WCAG 2.2 should be incorporated as soon as possible if you haven't already started. You should aim to conduct regular accessibility testing (manual, automated and against assistive technologies), so you can maintain an accurate understanding of how compliant your service is and prevent any surprises when it comes to a yearly audit. Do not rely solely on an annual audit to accessibility test your service, as this is only a snapshot in time and does not reflect ongoing maintenance of accessibility.
If it has been at least a year since your service was last audited, or it was audited against WCAG 2.1, you will need to conduct an audit again. You should also continuously conduct usability testing to ensure your service is meeting the needs of real users, and not just WCAG. Q3: Do I need to update my Accessibility Statement? You should reassess your service for WCAG and other legislation compliance every year, and update your accessibility statement to reflect this. As it is over a year since WCAG 2.2 was released, all services should now be testing against the WCAG 2.2 guidelines and updating their accessibility statements accordingly. Q4: When will GDS start monitoring? The GDS Monitoring Team started testing sites against the new WCAG 2.2 success criteria from 5th October 2024. Find out more information at changes to the public sector digital accessibility regulations (opens in new tab). Q5: When will the GOV.UK Design system be updated? The GOV.UK Design System team have reviewed WCAG 2.2 (opens in new tab) and updated the design system, including these changes in the latest GOV.UK Frontend v5.0.0 (opens in new tab). They have also provided guidance on how to meet WCAG 2.2, and which components, pages and patterns will be affected. Q6: How is my accessibility automated testing impacted? You should continue to use automated tools such as pa11y and axe-core to support testing in build pipelines. For axe-core, you can tag which level you want your tests to run against, so make sure you include the WCAG 2.2 tags (such as ‘wcag22aa’) to cover the new guidelines. Find out more at Axe-core 4.5: First WCAG 2.2 Support and More (opens in new tab). Semi-automated tools such as WAVE and axe can still also be used to pick up some accessibility issues. Automated/semi-automated tools do not cover all WCAG 2.2 guidelines so it is important to continue to test manually, with assistive technology and with real users.
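As a minimal sketch of the tagging approach described above (assuming axe-core 4.5 or later, where WCAG 2.2 rules carry the ‘wcag22aa’ tag; check the tag names against the documentation for your installed version):

```javascript
// Illustrative axe-core run options (not taken from the article):
// restrict checks to WCAG 2.x Level A/AA rules, including the new
// WCAG 2.2 criteria via the 'wcag22aa' tag.
const axeOptions = {
  runOnly: {
    type: 'tag',
    values: ['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa', 'wcag22aa'],
  },
};

// In a browser or pipeline context these options would typically be
// passed to axe.run(), for example:
//   const results = await axe.run(document, axeOptions);
//   console.log(results.violations);
```

Because axe runs rules by tag, leaving the older ‘wcag2a’/‘wcag21aa’ tags in place keeps existing coverage while the extra tag picks up the new criteria.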
Looking forward WCAG 3.0 (opens in new tab) is currently a Working Draft and aims to provide guidance to build for users with blindness, low vision and other vision impairments; deafness and hearing loss; limited movement and dexterity; speech disabilities; sensory disorders; cognitive and learning disabilities; and combinations of these. WCAG 3.0 also aims to support a wider range of web content on desktops, laptops, tablets, mobile devices, wearable devices, and other Web of Things devices. Content that conforms to WCAG 2.2 A and AA is expected to meet most of the minimum conformance level of this new standard but, since WCAG 3 includes additional tests and different scoring mechanics, additional work will be needed to reach full conformance. Ensuring you factor in regular maintenance is paramount to keeping accessibility up to date. And remember, WCAG does not cover every scenario. Test with your users and conduct regular user research. Useful resources WCAG 2.2 and what it means for you (Craig Abbott) (opens in new tab) Obligatory WCAG 2.2 Launch Post (Adrian Roselli) (opens in new tab) What WCAG 2.2 means for UK public sector websites and apps (GDS - YouTube) (opens in new tab) Testing for WCAG 2.2 (Intopia - YouTube) (opens in a new tab) WCAG 2.2 Explained: Everything You Need to Know about the Web Content Accessibility Guidelines 2.2 (opens in a new tab) Contact information If you have any questions about our accessibility services or you want to find out more about other services we provide at Solirius, please get in touch.
- AI in action 1: Supporting service teams through the Service Standard
AI in action 1: Supporting service teams through the Service Standard by Matt Hobbs As digital public services evolve, so must the tools we use to build them. This series explores how Artificial Intelligence (AI) can responsibly support UK government service teams in meeting the Government Digital Service (GDS) Service Standard. From user research to accessibility testing, performance monitoring to service assessments, we’ll examine where AI can complement human expertise, enhancing delivery without compromising trust, transparency, or inclusion. Overview What is the Service Standard? What is the Service Manual? What is a Service Assessment? Wrapping up About the author Welcome to a series exploring how Artificial Intelligence (AI) can support UK government service teams in meeting the Government Digital Service (GDS) Service Standard . As digital public services continue to evolve, so too must the tools and methods used to build them. AI, when applied thoughtfully and responsibly, has the potential to enhance delivery, improve user outcomes, and support those working in government to focus on what matters most: meeting real user needs. This series will explore how AI can play a role in supporting service teams at every stage of the service lifecycle, from discovery to live, and how it can complement the Service Manual’s practical guidance. Whether through natural language processing, data analysis, accessibility testing, or helping teams with performance monitoring, we’ll consider both current capabilities and future possibilities. This is not a call to automate everything, nor to substitute human judgement, but to embrace new tools in a way that strengthens delivery and accountability across government. Before we continue, let me cover a couple of important points... What is the Service Standard? The UK Government Service Standard is a set of points designed to help teams create and run effective, user-centred digital services. 
Maintained by the Government Digital Service (GDS), it ensures that public services are accessible, efficient, and meet user needs. The standard promotes practices such as understanding users, using agile methodologies, testing services with real users, and making services secure and accessible. It's used throughout the development lifecycle to ensure quality and consistency across UK government digital services. What is the Service Manual? There may be a few readers who've never heard of the Service Manual. So, here's a brief history and overview. The UK Government Service Manual was introduced as part of the Government Digital Service (GDS) initiative, launched in 2011 to improve digital public services. Continuously updated, it reflects evolving best practices and legal requirements, ensuring government services remain effective and accessible for all users. The Digital Service Standard and the Service Manual are the foundations for what you need to complete a Service Assessment. What is a Service Assessment? A UK government Service Assessment is a structured evaluation process designed to ensure that digital services meet government standards before they go live or progress through key stages of development. Approval stages in a Service Assessment: UK government services typically go through 3 key Service Standard assessments: 1. Alpha assessment Conducted at the end of the Alpha phase Focuses on whether the service team has researched user needs, developed and tested prototypes, and has a plan for the Beta phase Core evaluation criteria: user research, design, technology choices, and feasibility 2.
Beta assessment Conducted at the end of the Beta phase Evaluates whether the service has been tested with users, can handle expected demand, and meets accessibility and security standards Some departments may also decide to run a private beta for certain services, testing them with a small group of invited users In some cases, a service may remain in the Beta stage for an extended period Core evaluation criteria: performance, scalability, accessibility, data security, and readiness for live deployment 3. Live assessment Conducted before a service moves from Beta to Live (full public availability) Ensures the service is sustainable, meeting user needs, and is continuously improved Core evaluation criteria: performance monitoring, governance, data management, and ongoing user feedback integration Service Standard criteria Each assessment evaluates against the 14 Service Standard points, which include: Understanding user needs Designing for everyone (inclusivity) Making the service simple and accessible Using open standards and scalable technology Ensuring security and privacy To progress to the next stage, service teams must pass these assessments. If unsuccessful, they are expected to resolve the issues highlighted and reapply for a future assessment. Wrapping up Some people might see using AI in the Service Standard and Service Assessment process as “cheating”: if AI does all the work, what’s left for the service team to do? But really, AI is just a tool to help things run more efficiently and save the UK government time and money. It’s not about replacing human expertise. It’s also important to remember that AI can sometimes get things wrong (what’s called a “hallucination”), so it’s critically important that teams sense-check what AI produces instead of just accepting it at face value.
Now that we’ve outlined the purpose and structure of the Service Standard and the role of service assessments, we’re ready to dive into the practical side, where and how AI can help. In the next post, we’ll begin exploring each of the 14 Service Standard points in turn, starting with what is arguably the most critical: understanding users and their needs. We’ll look at how AI can assist user researchers, support data analysis, and improve how teams gather insights, without losing the nuance or empathy that human researchers bring. So please stay tuned! About the author My name is Matt Hobbs, Principal Engineer (Frontend) and Guild Lead at Solirius Consulting, currently embedded in HMCTS. Before joining Solirius, I spent six years at GDS, leading on frontend development and shaping strategy across accessibility, performance, and digital best practice. I also wrote a series of blog posts documenting the performance improvements made to GOV.UK, covering everything from HTTP/2 and jQuery removal to Real User Monitoring. Well worth a read if you’re interested in practical, real-world frontend engineering in the public sector. Why we focus on frontend performance Speeding up GOV.UK with HTTP/2 How GDS improved GOV.UK’s frontend performance with HTTP/2 (Case Study) Making GOV.UK pages load faster and use less data How Real User Monitoring will improve GOV.UK for everyone What we’ve learned from one year of Real User Monitoring data on GOV.UK The impact of removing jQuery on our web performance A Request For Comments (RFC) for enabling HTTP/3 on GOV.UK Contact information If you have any questions about our AI initiatives, Software Engineering services, or you want to find out more about other services we provide at Solirius, please get in touch (opens in a new tab).
- Solirius Reply attends the Reply Xchange
Ayan Kar and Hamid Ali-Khan presenting at the Reply Xchange Earlier this month, Solirius attended its very first Reply Xchange, a high‑energy event designed to explore the latest in technology, innovation, and digital experience. Hosted by Reply, the day brought together clients, partners, and teams from across the network for a packed programme of expert talks, interactive demos, and collaborative discussion. The goal: to connect people and ideas, share what’s working, and inspire bold thinking for the future. Solirius was proud to contribute by presenting on the role of AI in delivering complex data migrations, a critical enabler for transformation programmes across government. We showcased how AI can enhance the accuracy, speed, and scale of migrations, reduce manual effort, and improve long‑term data quality and governance. Our talk, delivered by Ayan Kar (Data Engineering Lead) and Hamid Ali-Khan (Head of Engineering), focused on some key AI themes: The critical importance of modernising applications How AI can significantly enhance data migration accuracy and efficiency The use of AI tools to improve development productivity The importance of decommissioning legacy systems to achieve the best outcomes in data migrations The Xchange left a strong impression on the Solirius team in attendance, who came away not only energised by what’s possible, but also more connected to the broader Reply community. It’s clear there’s real momentum, and we’re motivated to accelerate our AI capabilities, deepen collaboration across the network, and adapt innovative Reply solutions to better serve the needs of our public sector clients. From intelligent data services to AI-assisted delivery and decision support tools, we see a huge opportunity to unlock value and deliver lasting impact through thoughtful, human-centred innovation.
Members of the Solirius Reply team at the Reply Xchange Contact information If you would like to see the full presentation or speak with our Engineering or AI practitioners on how we can support your transformation efforts, please reach out to Ayan Kar or Hamid Ali-Khan via our contact form here (opens in a new tab) .
- Lessons in accessibility: A day at the DfE Accessibility Lab and conversations with the experts
At the DfE Accessibility Lab, our colleagues Sree (User Researcher) and Claire (UX Designer) explored how assistive technologies are used—and where they can fall short when services aren’t designed with everyone in mind. One crisp spring morning, as the sun finally pushed through the grey weight of winter, a user researcher, Sree, travelled from Newcastle and an interaction designer, Claire, journeyed from London, converging in Sheffield. Their destination: the Department for Education’s (DfE) Accessibility Lab. Their goal: to understand how digital services function for those who navigate the world differently. Inside the Accessibility Lab: Where digital barriers become visible From left to right: Claire, Sree and Jane at DfE’s Accessibility Lab, Sheffield We expected a technical demonstration—a run-through of tools and accessibility best practices. What we got was something much more human: a window into the lived experience of those who rely on assistive technologies daily. Guided by Jane Dickinson, an accessibility specialist at DfE, we explored tools like Dragon, JAWS, ZoomText, and Fusion. Jane not only explained how they work but showed us how easily they can fail when services aren't built with accessibility in mind. Insights from testing with assistive tools Dragon: Voice recognition for hands-free navigation Dragon voice control lets users navigate computers hands-free. But if clickable elements aren’t properly coded as buttons, Dragon can’t find them. Jane demonstrated how Dragon struggled with clickable elements on a DfE service and the BBC homepage because they weren’t coded as buttons. Dragon couldn’t act on the “click button” command as the elements were invisible to the tool, highlighting a major gap between design and code. JAWS: Screen reader for non-visual navigation JAWS relies on well-structured content: heading levels, labelled buttons, and descriptive links.
Jane showed how generic links like “Read more” or “Download” confuse JAWS users due to a lack of individual distinction or missing ARIA labels, making browsing chaotic and frustrating. As Jane put it: “If a page isn’t structured properly, it’s a nightmare to navigate.” ZoomText: For low vision users ZoomText is a magnification tool that helps users navigate visually. However, it requires users to hover or click on links to have them read aloud, unlike JAWS, which reads automatically. At higher magnification, text can become distorted where the page has not been coded to handle zoom, affecting readability. Fusion: Combining JAWS and ZoomText Fusion combines high-level magnification, up to 20x, with auditory feedback for individuals with partial vision loss. But Jane showed us that even a 3x zoom can cause layout issues, like pixelation and clipped content, especially when sites don’t reflow content properly. Keyboard-only navigation Keyboard navigation is essential for users who can’t use a mouse, relying on shortcuts like the Alt key. But inconsistent implementation makes things harder. Jane pointed out unmarked buttons on the BBC homepage that would leave keyboard users guessing: “If something isn’t labelled properly, it just gets skipped over.” Captions for hearing impairments Captions aren’t just for deaf users—they help everyone. But live captions often lag, making comprehension harder. Testing BBC video content, we saw captions fall out of sync with speech, making it difficult for a user to keep track. Experiencing the world through the eyes of others Sree and Claire testing visual simulation glasses As part of our lab experience, we tested simulation glasses that aimed to alter vision, giving a general insight into conditions like: Cataracts: everything looks blurred. Tunnel vision: loss of peripheral vision, reducing situational awareness.
Left-sided hemianopia: half the visual field disappears, common after strokes or brain injuries. It was a powerful reminder of how much of the digital world can become difficult to use under these conditions, and how inclusive, thoughtful design can prevent the digital barriers that some users may face. N.B. While simulation glasses offer a glimpse, they can’t replicate the full experience of visual impairment. They’re a starting point for empathy, not a substitute for listening to real users who experience visual impairments. To truly understand, we need to speak with and learn from real users. The Visual Impairment North-East (Vine) Simulation Package In conversation with Accessibility Experts To deepen our understanding of accessibility, we interviewed Jane Dickinson and Jake Lloyd, two key accessibility specialists at DfE, to hear their insights. Jane’s biggest frustration? Accessibility being bolted on at the end. “It’s not enough to test for accessibility. Real users need to shape the design from the beginning.” She also highlighted how many users hesitate to disclose their accessibility needs for fear of being seen as difficult. Even when reports are written to improve accessibility, they often go ignored. “I can spend a whole day writing a report, and sometimes nothing changes.” Despite these challenges, Jane celebrated the wins—a blind user who was able to access their payslip independently for the first time: “One of our blind users told me, ‘For the first time, I didn’t have to ask someone to read my payslip. I could do it myself.’ That made all the work worth it.” Even small changes, like properly marking up PDFs or labelling buttons, have a huge impact and can make a service more accessible. Jake emphasised the importance of building for keyboard navigation and screen readers from the very start.
“There are so many accessibility issues that come from not thinking about keyboard accessibility… It affects focus, visibility, and how well voice and assistive tech tools work.” He highlighted issues like repetitive, unclear links in patterns such as “Check your answers”: “Something like the ‘Check your answers’ pattern has links that just say ‘Change’… If you're just using a screen reader and you're navigating through a bunch of links… you're only going to hear “change”. So providing some hidden screen reader text, giving more context to that link can be really helpful.” This was another thoughtful reminder that different users read pages differently, and not everyone will be able to see the visual context around written content. A holistic approach to accessibility The accessibility specialists broke down their layered approach to testing the accessibility of services: Automated testing to catch common issues early. Manual testing using only a keyboard or different zoom levels. Assistive tech checks like screen readers and voice controls. Code reviews to ensure correct HTML and component use. As Jake put it, accessibility goes beyond the Web Content Accessibility Guidelines (WCAG) standards: “I’ll also record issues that don’t fail WCAG but still create barriers—like having to tab 30 times to reach an ‘apply filter’ button.” Jake warned against treating accessibility as an afterthought: “Where teams haven't thought about accessibility and inclusive design up front and early on, complex issues tend to come out of that.” Not boring. Not optional. A myth Jake wants to debunk is that accessible design equals boring design. “You can still be innovative. Your website can look good and be accessible if you plan it that way from the start,” he said. “Unfortunately, some organisations continue to treat accessibility as an afterthought, which remains a cultural issue”.
Our specialists pointed out that advocacy and awareness are key to changing this mindset: “Having people with actual lived experience that can demonstrate the way that they interact with digital content can be really powerful… Here's someone who is blind. They use a screen reader to navigate your service, and they can't do it.” They stressed that one in four people have a disability: can you afford to turn them away with inaccessible services?

Why accessibility matters for everyone

Jane and Jake made it clear: accessibility isn’t just for disabled users. It benefits all of us. Captions help on a noisy train. Good contrast helps in bright light. And if zooming to 400% breaks your layout, it’s not just low vision users who suffer. “If it’s not thought about up front, then it affects a lot of people.”

Accessibility isn’t a task—it’s a mindset

As user researchers and designers, we focus on how people interact with digital services. But in Sheffield, we were not the experts—we were the students. This wasn’t about checking off accessibility guidelines. It was about understanding what happens when those guidelines aren’t met. A missing label, a broken heading structure, or an unlabelled button—these aren’t small issues. Each one determines who gets to participate and who doesn’t. Accessibility is also never ‘done’; it is an ongoing activity that requires the whole team's input to maintain.

As we left Sheffield, catching our trains to opposite ends of the country, we carried more than just knowledge. We carried a quiet but certain resolve to champion accessibility. The best accessibility work doesn’t “help” people. It supports their independence and ensures they don’t have to ask for help in the first place.
Useful resources

- Department for Education accessibility and inclusive design training
- Making your service accessible: an introduction
- Department for Education accessibility and inclusive design manual
- W3C: Making the Web Accessible
- W3Cx: Introduction to Web Accessibility
- Sara Soueidan: The Practical Accessibility Course

About the authors

Sree is a Lead User Researcher specialising in uncovering user needs and delivering data-driven insights. A CDDO-DfE trained Service Assessor, she champions user-centricity and accessibility in government services. When she’s not diving into research, Sree can be found roaming the countryside with her husky, cooking up a storm, or curling up with a good book.

Claire is a Senior User Experience Designer, specialising in interaction design. She advocates for accessibility and strives to bridge the gap between usability and inclusion. Outside of work, Claire enjoys exploring new places and experimenting with new recipes.

Contact information

If you have any questions about our research and design services or you want to find out more about other services we provide at Solirius, please get in touch (opens in a new tab).
- Driving social impact: Solirius' pro-bono digital and design project with The Talent Tap
Supporting charities through pro-bono digital and design services by Theone Johnson and Sam Smith

At Solirius, we are committed to using our technical expertise to create a positive and lasting impact on society through our pro-bono work with charities.

Supporting social mobility through user-centred design

The Talent Tap is a social mobility charity that supports young people from areas with fewer opportunities, often rural and coastal areas where challenges are greater, by providing access to work placements and other professional opportunities and support that help shape their future careers. They also work with businesses to advocate for industry-wide change to implement sustainable diversity and inclusion strategies, with a focus on improving social mobility.

As part of our Social Value initiative, we partnered with The Talent Tap to provide pro-bono digital and design support, identifying opportunities to improve the content and design of their website, attract potential corporate partners, and amplify their impact.

The Design Team: Claire McShane (Designer), Louise Morales-Brown (User Researcher), Anna Rapp (User Researcher), Sam Smith (User Researcher), Lydia Davidson (Designer), Hattie Brash (Service Designer) and Theone Johnson (Project Manager)

Our Solirius Design Team, an outstanding group of talented design consultants, used their expertise in user-centred design, content strategy and accessibility to make a meaningful impact on the charity while reinforcing Solirius' commitment to its core social values.

The aim

The Talent Tap wanted their website to have a simple and professional feel to attract more interest and funding from potential corporate partners, while still maintaining a friendly undertone that communicated their USP as a youth-led charity.

Our approach

We worked closely with The Talent Tap team to enhance their website’s content and design, ensuring it aligned with their goals and vision.
For this, we took a holistic approach, equipping them with industry insights to refine their online presence and leveraging user research to identify ways the design could better meet user needs and their own strategic goals. This included:

- Conducting a content audit to identify pain points and improvement opportunities
- Carrying out an accessibility audit to ensure inclusivity for all users
- Performing a competitor analysis to identify opportunities for differentiation
- Mapping and improving the existing information architecture to create clearer user journeys
- Facilitating collaborative design workshops to inform the new creative direction for the charity
- Conducting user research and analysis to understand and design for the user needs of their target audience
- Applying industry standards to update the content and ensure clear, impactful messaging
- Creating high-fidelity wireframes that aligned with The Talent Tap’s new brand kit
- Adding clear wireframe annotations to guide the development phase

Working examples

Screenshots of examples of the project work carried out. Included are a Talent Tap planning workshop, competitor analysis, accessibility audit, content audit, a corporate partner persona and an image of an information architecture map.

The outcome

We are proud to have supported The Talent Tap’s mission by enhancing their digital presence and helping them connect more effectively with their audience. Social mobility is a key focus for us at Solirius, and we’re grateful for the opportunity to help The Talent Tap expand their reach, bringing this important message to more young people and businesses across the UK. Thank you and congratulations to everyone who contributed to this project. The Talent Tap team is thrilled with the new website designs, and their CEO said: “Working with Solirius as a not-for-profit was a joy, both in terms of their patience and their incredible enthusiasm and knowledge.
You have taken what was a clunky and standard not-for-profit website and turned it into a fully functional, user-centric asset. As a charity, we simply could not have afforded the service provided.”

Contact information

If you have any questions about our design services or you want to find out more about other services we provide at Solirius, please get in touch (opens in a new tab).
- Delivering a modern analytical platform for DfE
We supported the Department for Education (DfE) to introduce a new data intelligence platform while focusing on user needs and better ways of working. Linda Souto Maior, service designer on the project, shares how we did this.

The DfE Analyst Community plays a critical role in helping the department achieve its goals to enable children and learners to thrive. Over time, analysts found themselves operating on increasingly outdated, disparate and costly legacy systems. As the data and analytical sector moves towards cloud-based technologies, DfE wanted to build on existing ways of working and keep pace with emerging trends and new opportunities to support the Strategic Data Transformation programme.

The vision was to build a new joined-up service to help analysts find data and access cloud analytical tools, known as Analytical Data Access (ADA), named after the famous pioneer and mathematician Ada Lovelace. A key enabler for any major technology change is to design the service in a way that meets the needs of users and also recognises the impact of the change on the organisation and, importantly, on the individuals affected.

Challenges across the organisation

There were a number of challenges across business and technology teams:

- DfE’s Data Directorate had an immediate requirement to replace outdated platforms whose support and maintenance costs were high.
- Users of data across the DfE Analyst Community could not always find what data was being stored, who owned it, how to get access and how to use it.
- Users were limited by capacity and the difficulty of scaling without impacting others, and could experience delays in running data queries.
- Duplication of the same data assets across different platforms, with no single source of truth, and increasing costs, risks and effort to store, manage and control access to data.
- Reduced ability for experts in datasets to collaborate easily.
- Time-consuming daily log-ins to multiple systems.
The approach to address the problem

By working in blended teams, Solirius provided service design, user research, interaction design, business analysis, delivery and business change expertise to bring the service to life. The service was designed to bring together three underlying platforms: a data discovery platform to catalogue all DfE data, a data intelligence platform using Databricks on Azure with a Delta Lake for data storage, and a library of reports and dashboards.

Key improvements included:

- A single point of entry to reduce sign-ins across platforms.
- Greater processing power for faster calculations and complex analysis.
- Governance of data through a single request form to gain access to datasets.
- Information about the service, support guides and access to tailored training designed in collaboration with analysts.
- A single homepage to access all services.

Outcomes and value added

Alongside the technology challenge of deployment, we worked closely with users to overcome nervousness about the new platform. This included adapting the service but also improving communications:

- Set up a super user group of analysts who helped with design input and tested and provided feedback on the service.
- Worked with the supplier (Databricks) to integrate RStudio (third-party modelling software) based on user feedback.
- Mapped out all data requirements and user flows to identify common pain points and avoid duplication of data prior to any data migration.
- Developed a roadmap for data migration and changes to ways of working.
- Ran frequent show and tells across the organisation and invited our analysts to demonstrate the use of the tooling.
- Engaged with analysts and the supplier to design training and support guides.
- Co-designed a business change strategy to establish collective ownership of the change.
- Worked closely across DfE departments and alongside other projects to ensure a joined-up service across all channels.
- Provided flexible resourcing to meet the needs of delivery and budget constraints.

The service continues to be rolled out across the organisation and is now being used by 300 analysts, with 60 modelling areas migrated for 50+ analyst teams. Meanwhile, we continue to work with the Analyst Community and Data Directorate to improve and adapt the tool as new use cases come up. Long term, the service will save time, costs and effort through better collaboration and faster processing, and will also enable better use and governance of data.

"As sponsor for the DfE Analytical Data Access (ADA) service I have been impressed with the calibre of the Solirius resources who have supported us in getting this ambitious programme off the ground. They have been key in helping us build multi-disciplinary squads and they have integrated seamlessly with our existing staff. Their expertise has brought shape and rigour to our work and enabled us to deliver a professional service that is growing in demand."

Patrick Healey, Deputy Director | Data Operations | Data

Contact information

If you have any questions about implementing new digital technology in your organisation or want to find out more about what services we provide at Solirius, please get in touch.
- Providing data engineering services that support company growth
Growth Intelligence specialise in helping companies grow, using AI and uniquely rich SME data to drive more revenue, reduce acquisition costs and increase conversion rates. To support their business model, Growth Intelligence were looking for assistance to:

- manage existing infrastructure and scale data pipelines to handle ever-increasing amounts of data using contemporary cloud technologies
- create and maintain bespoke applications to support the day-to-day activities of their data science and customer success teams
- maintain code libraries, repos, apps and machine learning feature data.

Setting up the specialist team

Our specialised resourcing solution gives GI access to experienced, high-quality data engineers at a wide variety of levels. Proponents of agile delivery, our team adopts GI’s hybrid scrum/kanban approach, monitoring sprint progress using kanban boards and running the full range of agile sprint ceremonies (daily stand-ups, retrospectives and sprint planning sessions). We collaborate with GI’s engineering and data science teams, using a stack of Python, Ansible, AWS, Tornado, Flask, Elasticsearch, Docker and Pandas. Working alongside the GI team and key stakeholders, our engineers’ work spans requirements gathering, technical spikes and OKR management.

2 years of success and growing

Solirius has worked with Growth Intelligence for over 2 years and we are proud to continue our association, helping to maintain and improve the leading-edge services that GI provides to companies around the world.

"The Solirius team are great additions to our engineering team. They are highly professional, effective at working independently (whilst also knowing when to seek clarification on requirements / design / architecture) and proactive in taking on projects / problem solving. They have integrated seamlessly into the team - which is great for us as a start-up as it enables us to have a single cohesive engineering team.
They have a genuine interest in helping us succeed and creating a friendly and enjoyable culture to work in." Prashant Majmudar, CTO at Growth Intelligence