Training: Assessing Your Website for Accessibility


Welcome to ‘Assessing Your Website for Accessibility’ presented by the University of Chicago Center for Digital Accessibility.

My name is Jack Auses. I am an Accessible Web Technology Specialist in the UChicago Center for Digital Accessibility. I’ve been at the University since 2005 and was previously the Lead Front-End Developer in the ITS Web Services Group. I joined the CDA in January of 2020 and, like the other members of the CDA team, am a Certified Professional in Accessibility Core Competencies from the International Association of Accessibility Professionals.

A little background about the CDA. The UChicago Center for Digital Accessibility provides digital accessibility consulting, assessment, and training for students, faculty, other academic appointees, staff, and postdoctoral researchers at the University of Chicago. The CDA was instituted in January of 2020 and is part of a larger campus IT risk initiative that covers network and computer device security, visual identity and branding, and digital accessibility.

[What is Web Accessibility?]

What is web accessibility? In this section we will discuss what we mean when we talk about web accessibility and why it’s important.

According to the World Wide Web Consortium: “Web accessibility means that websites, tools and technologies are designed and developed so that people with disabilities can use them. More specifically, people must be able to perceive, understand, navigate, interact with and contribute to the web.”

And so, why is Web accessibility important? It’s important for a number of reasons:

It supports the University’s mission. As the UChicago Diversity Initiative states: “Diversity is critical to the process of discovery. At UChicago, different backgrounds, viewpoints, and perspectives are not only sought after and encouraged, they are the building blocks that make rigorous inquiry possible.” Diversity supports the University’s core academic mission of fostering an atmosphere of free and open inquiry.

It’s also a legal obligation. As you know, the expectation that online content be accessible is not new. And as a place of public accommodation, as well as a recipient of federal funding under Section 508 of the Rehabilitation Act, the University is obligated to make its programs, course materials, events, and online content accessible.

I wanted to share this tweet by @LareneLG. She posted this in June of 2020:

“Today, my dad cried over the phone. He wanted one week where he could use his computer without my help. He’s blind. Each inaccessible web page tells him, you aren’t welcome in this world. If you don’t know whether your website or app is accessible: it’s not. Start learning.”

And the final reason why web accessibility is important is: it’s the right thing to do. Maintaining accessible digital content helps people of all backgrounds and abilities have a better experience participating fully in the University community. Adhering to the University’s web accessibility guidelines ensures that our students, faculty, staff, and campus community can access University content online in a way that is equitable regardless of a person’s abilities. It embodies the University’s commitment to the diversity necessary for intellectual exchange and rigorous inquiry.

[WCAG 2.1, Level AA]

In this section we’re going to talk about WCAG 2.1, Level AA, the web accessibility guidelines, and how they relate to UChicago websites. So, what is WCAG 2.1, Level AA? The World Wide Web Consortium’s Web Content Accessibility Guidelines, or WCAG 2.1, Level AA, are best practices for public-facing websites, and they address accessibility of web content on desktops, laptops, tablets, and mobile devices.

WCAG 2.1 was adopted in June of 2018 and it extends the previous standard, WCAG 2.0, which had been in place since 2008. While WCAG 2.0 remains a W3C recommendation, the W3C advises the use of 2.1 to maximize future applicability of accessibility efforts. The W3C also encourages the use of the most current version of WCAG when developing or updating any web accessibility policies at your institution. Following the WCAG 2.1, Level AA guidelines will make web content more accessible to a wider range of people with disabilities and—because accessibility barriers tend to amplify usability problems—will often make your website more usable to people in general. For example, alternative text on images can help people with slow Internet connections understand the purpose of the content before it loads.

The WCAG guidelines and success criteria are organized around the following four principles of web accessibility, which lay the foundation necessary for anyone to access and use web content. The four principles are:

Perceivable. Information and user interface components must be presentable to users in ways they can perceive. This means the user can identify content and interface elements by means of the senses. For many users, this means perceiving a system primarily visually, while others may use sound or touch.

The second principle is: Operable. User interface components and navigation must be operable. Operability means that a user can successfully use controls, buttons, navigation, and other necessary interactive elements. For many users, this means identifying an interface control visually and then clicking on it, tapping, or swiping. For other users, using a computer keyboard or voice commands may be the only means by which they can operate and control the interface.

The third principle is: Understandable. Information and the operation of user interface must be understandable. And this means that your web design is consistent in its presentation and format. It’s predictable in its design and usage patterns. Concise, multimodal, and appropriate to the audience in its voice and tone. Users should be able to comprehend the content and learn and remember how to use the interface. An example here would be a main navigation bar that is consistent and appears at the top of every page on a website.

And the final principle is: Robust. Content must be robust enough that it can be interpreted reliably by a wide variety of user agents, including assistive technologies. It is standards-compliant and designed to function on all appropriate technologies. Users should be able to choose the technology they use to interact with websites. An example of a site that would not be robust would be one that requires Internet Explorer as the browser and doesn’t work on other browsers.

UChicago is adopting the WCAG 2.1, AA standard. All University web properties with a domain and those used for teaching—regardless of the domain—must comply with the standards defined in WCAG 2.1, Level AA. There is a digital accessibility standards policy in place that is administered by the Associate Provost for Equal Opportunity Programs.

[Automated Web Accessibility Assessments]

So…how can you ensure the accessibility of your UChicago website? Regularly assessing a website is the only way to ensure that your content is accessible. And this can be done by using a combination of automated tools and manual checks. In this section we’re going to talk about automated web accessibility assessments and we’ll explore some easy and automated tools that you can use for performing initial accessibility reviews.

There are a number of automated accessibility assessment tools available. Using a free browser extension like the Siteimprove Accessibility Checker or Deque Axe, you can quickly test individual web pages for common accessibility issues, such as: insufficient color contrast, which is a Level AA issue that falls under the Perceivable principle. Visual presentation of text and images of text must have a contrast ratio of at least 4.5:1.
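To give a sense of what the checkers are actually computing, here is a small JavaScript sketch of the WCAG relative-luminance and contrast-ratio formulas. The hex colors are just illustrative examples (they happen to be the two grays that come up in the demo later); this is a sketch of the math, not a replacement for a real testing tool:

```javascript
// Sketch of the WCAG 2.1 contrast-ratio math. Hex colors are examples.

// Convert a "#rrggbb" hex color to relative luminance (0..1).
function luminance(hex) {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    // Linearize each sRGB channel per the WCAG formula.
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio of two colors: (lighter + 0.05) / (darker + 0.05).
function contrastRatio(hexA, hexB) {
  const [lo, hi] = [luminance(hexA), luminance(hexB)].sort((a, b) => a - b);
  return (hi + 0.05) / (lo + 0.05);
}

// White text on #7D7D7D misses the 4.5:1 minimum for normal-size text;
// on the darker #404040 it passes comfortably.
console.log(contrastRatio('#FFFFFF', '#7D7D7D').toFixed(2)); // 4.12 — fails AA
console.log(contrastRatio('#FFFFFF', '#404040').toFixed(2)); // 10.37 — passes AA
```

This is exactly the check that tools like Siteimprove and Axe run against every text/background pair they find on the page.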

Another issue that can be detected automatically is if a web page is missing a title. That is a Level A error and falls under the Operable principle. You know, every web page must have a title that describes the topic or purpose of the web page.

Another issue that can be found via automation would be a language attribute missing on the page’s HTML tag. This is, again, a Level A issue and it falls under the Understandable principle. The primary language of each page must be declared so that a screen reader can accurately announce the page to a user in the correct language.
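As a rough sketch of how a tool might flag that last issue, the check can be as simple as testing whether the root element’s lang value looks like a language tag at all. The regular expression below is a loose approximation of a BCP 47 tag, not a complete validator:

```javascript
// Loose sanity check for an HTML lang attribute value (e.g. "en",
// "en-US"). This approximates a BCP 47 language tag; it is NOT a
// complete validator, just an illustration of the automated check.
function hasPlausibleLang(lang) {
  return typeof lang === 'string' &&
    /^[a-zA-Z]{2,3}(-[a-zA-Z0-9]{1,8})*$/.test(lang);
}

console.log(hasPlausibleLang('en'));    // true
console.log(hasPlausibleLang('en-US')); // true
console.log(hasPlausibleLang(''));      // false: lang declaration missing
```

In a browser, a tool would feed this something like `document.documentElement.lang`; an empty or malformed value triggers the violation.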

And finally, another example would be invalid ARIA roles in your code. This is, again, a Level A issue, and it falls under the Robust principle. The name and role of all interface components must be programmatically determined and be coded correctly. Similar to common HTML errors, invalid ARIA may prevent assistive technology from accurately navigating a web page.

So what can these tools do? With fully automated tests, these tools can quickly identify about 40 percent of potential accessibility issues. You can and should use them through all phases of the web design and development process. However, not all accessibility issues are created equal. Certain issues will completely prevent a user from interacting with the site, some will cause content to be confusing, and others will make the experience of using the site annoying. And so, some of the assessment tools—such as Deque Axe—will score each issue by impact to assist you with prioritizing.

Some examples of the different impacts: a blocker issue would be some portion of the webpage only works with a mouse and is not operable with a keyboard. We would consider that a blocker and a very high impact issue. A minor issue might be that an element has a duplicated ID in the source code or there’s a small HTML error. In certain contexts that might cause a screen reader to announce something in a confusing manner. But generally speaking, it will not greatly impact the user experience for someone using assistive technology.

[Automated Assessment Demo]

In this section we’re going to go through a demo of an automated assessment of a web page. I’m going to switch over to my browser now and we’ll walk through that. We’re going to assess the page, identify a couple errors, fix them, and then recheck to see if we, in fact, have addressed the problems.

So, I have a demo page set up here in my browser in the very familiar UChicago web template. And I’m going to use the Deque Axe browser extension for this demonstration. I’m going to open up my web inspector. The Axe browser extension is accessed through the web dev tools in Chrome. So, as you can see I have an Axe item over here. I’m going to switch to that. And I’m going to click “Analyze”. And so the first thing I’m gonna do is I’m going to click this “Highlight” item here. This will just highlight items on the page that have issues as I go through the report.

So, the tool found 10 violations. But it really looks like there are two main violations. There are a number of elements that don’t have sufficient contrast and the HTML element does not have a language attribute. So, I’m going to click on the color contrast one and this will allow me to kind of tab through all of the elements on the page that do not have sufficient contrast.

I can see the address in the footer, the link to the home page, and these various links in the footer here do not have sufficient contrast. What this tells me is, actually, that light gray color background on the footer is not dark enough. There’s a darker gray that we typically use. And so we’ll need to fix that in our CSS.

Clicking on the second violation, the HTML element must have a “lang” attribute. Obviously this is in code, so it won’t be highlighted on screen. But Axe here does give me a description of the issue, and it even shows me the source code so that I can look at that and then find that code in my templates or wherever and make the change. So it looks like I’ve got to edit the HTML tag, add that language attribute, and I need to fix the background color of the footer.

So, to do that I’m going to switch over to my code editor. And I already have these files up. This site is a content managed site and so it’s heavily templated. And so there’s a discrete template for my open DOCTYPE and my HTML tag. And so I need to add the language attribute here on one file and save it. And then that will fix the issue across the entire site. I don’t have to go in and, you know, search for this code on hundreds or thousands of web pages. I can just do it once and it’ll fix it everywhere. So we’re going to add a language attribute. And the page is in English. So the language code for English is “en”. We’ll save that.

Now, I’m going to switch over to my CSS. This site is built with LESS, which compiles to CSS. And so I have a footer file that is just the CSS rules for the footer. And I can see that this gray is not correct: #7D7D7D. The gray that I want to use is a lot darker and the hex value for that is #404040. The specific color doesn’t really matter, but I know that this is the official gray for the footer. So I’m going to change that. Save that. Give it a second for my CSS to compile and switch back to my browser.

And let’s refresh the page. So it looks pretty much the same. Let’s scroll to the bottom. The footer is now darker, and so I will re-analyze. And there are no violations found. So, we successfully—editing two lines of code in our templates—fixed 10 issues on this page that were found via automation. These tools are really invaluable and they’re easy to use. And as you’re developing sites, it’s just great to continually test your pages and catch issues as they come up during the development phase rather than waiting until the end to run some automated tests.

While the automated tools can do a lot, there are some things they cannot do. Not every web accessibility issue can be checked automatically. Human judgment is required to evaluate about 60 percent of all accessibility issues. Automated tools cannot fully determine accessibility; they can only assist in doing so.

[Manual Web Accessibility Assessments]

And so we’ll go through some manual web accessibility assessments in this section. We’ll talk about techniques for manually assessing your web pages and the tools to use. So, as I said, the automated checks catch about 40 percent of web accessibility issues. Manual assessments are crucial for identifying the roughly 60 percent of issues that cannot be detected automatically.

Some examples of these issues are an aria-hidden="true" attribute used on informative content. This is a Level A issue and it’s under the Perceivable principle. This would be a circumstance where an element, for whatever reason, is visible on screen—is conveying some information—but in the code there’s an aria-hidden="true" attribute on there. And that just completely removes that item from a screen reader. So, human judgment needs to come into play there to determine if that’s appropriate or not.
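As a toy sketch of what a reviewer is looking for here, you could imagine walking the page and collecting every element that is visible but hidden from assistive technology. The plain-object “elements” below are hypothetical stand-ins for real DOM nodes, and note that the code can only flag candidates; a human still decides whether each one is informative or purely decorative:

```javascript
// Toy audit: collect elements that are visible on screen but removed
// from the accessibility tree via aria-hidden="true". The objects are
// made-up stand-ins for DOM nodes; a real audit would walk the DOM.
function findHiddenVisibleContent(elements) {
  return elements.filter((el) => el.visible && el.ariaHidden);
}

const elements = [
  { id: 'office-hours', visible: true, ariaHidden: true },   // flagged
  { id: 'page-heading', visible: true, ariaHidden: false },  // fine
  { id: 'offscreen-menu', visible: false, ariaHidden: true } // fine
];

console.log(findHiddenVisibleContent(elements).map((el) => el.id)); // [ 'office-hours' ]
```

The flagged `office-hours` element is the problem case: sighted users see it, screen reader users never hear it.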

Vague link text like “click here” or “read more” would be an issue that would need to be evaluated manually. Again, this is a Level A issue and it falls under the Operable principle. And the reason that’s a problem is the purpose of each link must be determined from the linked text alone without any additional context from surrounding text. A lot of times screen reader users will have the screen reader read out all the links on the page. It’s a way for them to scan the page. And so if there are a bunch of links on a page that just say “click here” or “read more” that’s not going to be useful for someone with a screen reader. So those kinds of links really need to say something like “click here to read more about the name of the article” let’s say.
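That rule of thumb can be sketched as a toy checker in JavaScript. The vague-phrase list below is just an example I’ve made up for illustration; it is not an official WCAG list, and a real manual review still relies on human judgment:

```javascript
// Toy check: flag link text that cannot stand on its own out of
// context. The phrase list is illustrative, not exhaustive or official.
const VAGUE_PHRASES = new Set([
  'click here', 'read more', 'learn more', 'more', 'here'
]);

function isVagueLinkText(text) {
  // Normalize case and trailing punctuation before comparing.
  const normalized = text.trim().toLowerCase().replace(/[.!?]+$/, '');
  return VAGUE_PHRASES.has(normalized);
}

console.log(isVagueLinkText('Read more'));  // true: useless in a links list
console.log(isVagueLinkText('Read more about the accessibility training schedule')); // false
```

A screen reader user tabbing through a list of links hears only the link text, which is why the first example fails and the second passes.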

Another example would be visually hidden elements that are accessible via the keyboard. It’s again, a Level A issue and it’s under the Understandable principle. If you have a search box or something that is generally hidden unless it’s triggered into view, but you can tab to it with your keyboard while it is hidden, that would be an issue that would need to be assessed manually.

And a fourth type of issue that needs to be assessed manually is the use of improper ARIA roles. So in the previous example, we talked about invalid ARIA roles, which is more of a code error. Here, this is a Level A issue and it falls under the Robust principle. This would be an example where you have an accordion element on your page, for instance, but it is marked up with a “tab” role. So you’re just using the wrong role for the function.

Tools like, again, the Deque Axe Beta browser extension can guide you through conducting manual assessments of your pages. It is also recommended that you test your site using a screen reader such as NVDA with Firefox or Chrome if you’re on a PC, or VoiceOver with Safari if you’re on a Mac. Going into screen reader testing is a bit beyond the scope of this training, but I would encourage you to explore it. NVDA is free; you can download it and try it out. VoiceOver is built right into every Mac and iOS device. So I’d encourage you to play around with these tools. We’ll link to some resources in the Resources slide later on. Testing with a screen reader is really critical, as well, to getting a handle on the experience that someone using these technologies will have when using your website.

[Guided Manual Assessment Demo]

So now we’re going to do another demo. This one will be using the Deque Axe tool to do a guided manual assessment of the page we were looking at previously. Switch back over to the browser. I’m going to refresh the page just to get into a default state. I’ve actually saved a test for this. If you’re using Axe and you go through your automated checks, you can then save the results, which will then allow you to conduct these guided manual tests. For the purposes of this demo I’m just going to test Keyboard and Buttons, and Links.

But Deque has six additional sets of tests that it will allow you to conduct. So I’ll show those really quick here. You can test ARIA Modals, Page Information, Lists, Images, Headings, and Forms.

So we’ve got the page set how we want it. I’m going to start running the Keyboard test. And we will make sure the page is in its default state; we’ll just refresh again just to double-check. Get it where we want it and we’ll click “Start”. And so what the tool is doing here is tabbing through all of the buttons and links on the page. This will take a little bit of time, but it’s checking to see if there are any links that should be accessible via keyboard that are not. And we just want to verify that everything that is a link or a button is, in fact, accessible via keyboard. So it has completed its test. It found 34 items. Click “Next”.

And now it’s going to say: “Are there any elements that you expected to be in the tab order but were not?” And so you can kind of scroll and just take a look. It looks like everything is…Oh. So I see my search magnifying glass icon up in the top part there, which triggers the site search, is not highlighted. So, yes, there was an element skipped in the tab order. So I will select “Yes” and click “Next”.

Any elements not already highlighted that should be in the Tab order? And yes. And so I’m going to use my element selector here. And we’re going to click on this. OK. The search trigger; we’ll select that. So now we’ve selected the search trigger there. We’ll click “Next”. And now we have one issue that’s been reported. Finish. OK, so one issue found for Keyboard.

Now we’re gonna do Buttons and Links. Start this new test. I’m just going to refresh the page to get it to the default state I want to test. Click “Start”. OK, so this is a similar thing. I see some stuff that should be included that is not and, again, it’s that search button. Click “Next”. And, again, we will select the search trigger. Click “Next”.

OK, so now it has scanned all of the links on the page and I need to check that the accessible text for all of these items is sufficient. So we’ve got some “skip to main nav”. That’s fine. “UChicago Digital Accessibility Demo”. That’s fine. Scrolling down, these all look pretty good. Now we’re going to look at the links inside the main body of the page. As you see it will highlight the element. So here that link text is not sufficient. It’s too vague out of context. So I’m going to flag that one as not being sufficient. Look at the rest here. These look fine. And we’ll click “Next”.

Now, it’s going to ask me to just review that the roles of all these elements are correct. And, we’ll say that should be a button. Everything else looks pretty good. These are all mostly links. And we’ll click “Next”. So there’s some issues, again, with the search button. These two issues are related. The button does not have a role and the element’s role is missing. So they’re basically the same issue. And then the purpose of the link—that “Tools” link—is not clear within its programmatically determined meaning. So we’ll click “Finish”. And so now we have three issues under Buttons and Links and one issue found under Keyboard.

Now I’m going to switch back to my code editor and we will go through these. We’ll click on “Keyboard”. Again, the search trigger is not keyboard accessible. And so if I switch over to my code editor and I take a look at the code for that brand bar section, I see the issue here is that this should be a button but it is marked up as a div and it does not have an ARIA role on it that would make it behave like a button. So I’m going to change this to a button. And save it. And that should actually fix three of those issues.

All of the issues that were related to the search trigger should now be resolved, or mostly resolved, by making that change. And we also need to figure out why this thing doesn’t work with a keyboard. I see it’s just hovering. Like, if I hover over that with my mouse, it will open up the search, but I can’t tab to it with the keyboard. So that’s a problem.

Again, we’ll switch over to our code editor and take a look at the JavaScript for the search trigger. And as you can see here, it is being triggered on mouseenter, which is a device-specific event handler. So, you really want to stay away from using, you know, mouseenter, mouseleave, hover, those kinds of event handlers in your JavaScript that are specific to the mouse. You really want to use things more like click, which is device agnostic. So I’m going to change this to click. And save and we’ll allow that to recompile. Switch back over here. Refresh the page.
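The idea behind that change can be sketched outside the demo’s actual codebase. The element and handler names below are hypothetical stand-ins, and the tiny element stub exists only so the sketch runs outside a browser; in real code you would query a real DOM node:

```javascript
// Minimal stand-in for a DOM element so this sketch runs outside a
// browser. In real code: document.querySelector('.search-trigger').
class FakeElement {
  constructor() { this.handlers = {}; }
  addEventListener(type, fn) { (this.handlers[type] ||= []).push(fn); }
  dispatch(type) { (this.handlers[type] || []).forEach((fn) => fn()); }
}

// Bind the search toggle to "click", not "mouseenter". Browsers fire
// "click" on a button for mouse, touch, AND keyboard (Enter/Space),
// so one device-agnostic handler covers every input method.
function wireSearchTrigger(trigger, toggleSearch) {
  trigger.addEventListener('click', toggleSearch);
}

const trigger = new FakeElement();
let opened = 0;
wireSearchTrigger(trigger, () => { opened += 1; });

trigger.dispatch('mouseenter'); // no effect: we no longer listen for it
trigger.dispatch('click');      // opens the search
console.log(opened); // 1
```

The point of the design choice is that a keyboard user who tabs to the button and presses Enter generates a click event but never a mouseenter, so mouse-only wiring locks them out entirely.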

And you can see my cursor changes. So this is clickable. I can click to it. If I tab through with my keyboard I now can tab to that element, which is great. So two lines of code and we’ve fixed the issues with the search trigger.

I’m going to go back and take a look at my Buttons and Links issues. We’ve already dealt with the search trigger—these two issues—and now the “purpose of the link is not clear”. Here we have that “Tools” link. So this I need to edit in the CMS. So I will go over here. Log in to the content management system. We’ll scroll down. And here is that link. I’m going to change this to read “Automated accessibility assessment tools”. Now this link phrase is much more descriptive. We’ll scroll down to the bottom and save that, and refresh. And so now that link has been updated. So now we can go back and clear these results and we can rerun this manual test.

We’ll put the page back where we want it and click “Start”. And now, again, it will tab through all the elements. And this time we should see that search trigger appearing in the tab order. Let it do its thing here; takes a little bit. Almost done. All right. So 35 now were recorded, whereas only 34 were recorded the last time.

We’ll scroll to the top just to double check. Yep, the search trigger now is there. And that was the only thing missing last time. So let’s do a quick check, make sure we didn’t break anything else. All looks good. So, no, nothing was skipped in the tab order. Click “Next” to finish. No issues found for the keyboard. Now we’ll quickly rerun the Buttons and Links test and hopefully this will come back clear as well. OK, we’ve got it there. Click “Start”. Again, 35 buttons and links. We already know that that’s how many there should be; nothing is missing. Click “Next”. And now we’ll just double check the accessible names of everything. And we know that that “Tools” link was the main problem, and so scroll down here. Go through and find it. And yeah, that looks good. The rest of these look good. We’ll click “Next”. So most of these things are links. We fixed the search trigger in code. We changed that to a button element from a div. And now that is correctly being reported as a button. So the rest of these are links, which are correct. And we’ll click “Next” and “Finish”. And now those three issues have been fixed, as well.

Similar to the automated tests, we edited a couple lines of code and fixed three issues. And then the fourth one that we discovered required a really quick content edit in the CMS. And now this page is in good shape. We successfully remediated a number of issues through a combination of manual and automated assessments.

Manual assessments are challenging. While the demo we just went through was quick and the issues we found were pretty simple, full manual assessments are time consuming. And since human judgment is required, results can be subjective. Engaging multiple people with varying skills and backgrounds, including users with disabilities, to conduct manual assessments will yield the best results.

[Final Thoughts]

Now that we’ve seen how to run automated and manual assessments on your web site, let’s just talk briefly about expectations and how to prioritize fixing accessibility issues that you find. First of all, it’s really important to keep in mind that one hundred percent web accessibility is impossible. It’s impossible to achieve for all content, especially as web sites become more complex. The goal is to make a reasonable effort to ensure as much of your content is accessible to as many people as possible. And so how do you prioritize the things to fix to achieve that goal?

First, after you’ve conducted an accessibility assessment, all blockers should be remediated as soon as possible. So in our demo, the fact that the search trigger icon was not accessible to a keyboard user was a blocker that we needed to fix right away.

After you’ve addressed blockers, addressing critical and major issues—especially those detected by automated testing—should be the next area of focus, along with any content-related issues that can be fixed using the site’s content management system. Fixing automatically detected errors is the low-hanging fruit of accessibility remediation. These issues tend to be cut-and-dried; they’re code based. As we saw in the demo, oftentimes changing one or two lines of code will fix three issues that were kind of compounding one another. And so, really trying to have no issues that are detected automatically is important.

And third, plan to address minor and non-critical issues found manually the next time the site is being redesigned or undergoing a major development cycle. It’s really difficult, and expensive, to try to fully remediate an existing website that was not built with accessibility in mind. So, for minor issues wait until the site’s going to be redesigned or you have some time set aside to do some development and then take care of those. For new sites that are just in development, the goal certainly would be to minimize the number of total accessibility issues regardless of their impact because, during that initial development cycle, that’s the best time to really get rid of as many of these issues as possible. Just don’t put them into production and then you won’t have to worry about going back in the future to fix them.


So, if you have any questions about this presentation or just about digital accessibility in general, please feel free to contact the CDA. We are here to provide digital accessibility resources for campus. Please contact us at and we would be happy to talk to you about your accessibility needs.


Just real quick, some references that were used in the presentation are listed out here. These will also be available on the website where the presentation is embedded so you can get them. But I just wanted to show them here. If you want to write down any URLs feel free to pause the recording here and take note of these.

Slide 3
UChicago Center for Digital Accessibility
Slide 5
W3C Web Accessibility Initiative – Introduction to Web Accessibility
Slide 6
UChicago Diversity Initiative
Slide 7
Why should I strive for digital accessibility? – UChicago Center for Digital Accessibility
Slide 8
Tweet by @LareneLG
Slide 11
Web Content Accessibility Guidelines (WCAG) 2.1
Slide 13
Introduction to Understanding WCAG 2.1
Slide 15
UChicago Digital Accessibility Standards
Slide 18
Website Evaluation Tools – UChicago Center for Digital Accessibility
Slide 19
Axe Core Rule Descriptions
Slide 20
Selecting Web Accessibility Evaluation Tools (W3C)
Slide 27
Siteimprove Browser Extensions
Deque Axe Browser Extensions
Slide 28
Slide 32
Handling common accessibility problems – MDN Web Docs

[Additional Resources]

And then some additional resources that are available. We’ve got a bunch of links on the CDA website. The W3C has all kinds of really valuable resources that go over techniques for working with WCAG 2.1: code samples and really in-depth documentation. There are overviews on how to evaluate for web accessibility and, if you’re an evaluator or a tester, how to approach those roles. They also have lists of evaluation tools. And Deque Systems has a really nice blog post on how to prioritize accessibility remediation. And so that’s linked there at the bottom.


Thank you! Thanks for spending the time to learn about web assessments. Again, if you are interested, the CDA’s website is at A lot of resources there. Again, I thank you for your time and good luck on your web accessibility journey.