Volume XIII Number 1, April 2013

Seeking Predictors of Web Accessibility in U.S. Higher Education Institutions

Terrill Thompson
Dan Comden
Scott Ferguson, M.A.
Sheryl Burgstahler, Ph.D.
Elizabeth J. Moore, Ph.D.
University of Washington


In the current study, researchers conducted a comprehensive semi-automated web search for web or IT accessibility policies in all higher education institutions in the United States. At the same time, automated data was collected to measure the accessibility of institutional websites and the frequency of results found when searching for "web accessibility" or "technology accessibility" on the websites of each institution (a measure we refer to as "conversation"). Website accessibility varied considerably across the different measures used in the study. The measures with the highest accessibility were headings and alternate text on images. Other accessibility features (labels on form fields, language identified, and tagged PDF) were much less likely to be present, with ARIA landmark roles the least likely to be present. Overall, Doctorate-granting Universities had the highest web accessibility ratings of all Carnegie classifications; however, their ratings were lowest for PDF accessibility. Fewer than one in ten (8.4%) of the institutions had web or technology accessibility policies. The web pages of these institutions had higher overall accessibility ratings, and were especially higher on alt text with images and labeled input fields. However, institutions with a policy were less likely to have tagged PDFs. These effects were significant but small due to the large amount of variance within groups. Institutions with a policy also had significantly more "conversation," most dramatically when institutions did not have an accessibility link on their site template. Together, amount of conversation and having a policy accounted for more variance than either factor alone, though the total variance accounted for was very small. Post-hoc analysis showed that institutions with formal, standalone policies had significantly higher accessibility ratings than institutions with other types of policies.
An exploratory stepwise multiple regression analysis revealed that having an accessibility policy in place, being a master's or doctoral-granting institution, and being in the state of California account for about 3% of the variance in overall accessibility. Though statistically significant, this model leaves 97% of the variance unaccounted for, indicating that the best predictors of overall accessibility are not available to the model. Further research is warranted in order to identify other factors that may contribute to institutions' success at implementing accessible websites.


The Internet plays an integral role in the delivery of postsecondary academic content as well as student and administrative services. However, many websites are not designed in such a way as to be accessible to some individuals who have disabilities, including those who are blind and use assistive technologies such as screen readers or Braille displays. For many years federal legislation (e.g., Section 504 of the Rehabilitation Act of 1973, The Americans with Disabilities Act (ADA) of 1990 and 2008 Amendments) has mandated that an institution's programs and services be accessible to qualified individuals with disabilities. However, many institutions have not addressed web accessibility as aggressively as they have addressed the physical accessibility of their campuses.

Section 508 of the Rehabilitation Act requires that most information technology (IT) that federal agencies procure, develop, maintain, and use be accessible to people with disabilities, both employees and members of the public. In response to this law, the U.S. Architectural and Transportation Barriers Compliance Board (Access Board) developed accessibility standards (Office of the Federal Register, 2000) with which federal agencies must comply. The web standards are based in part on the Web Content Accessibility Guidelines 1.0 (WCAG), developed by the World Wide Web Consortium (W3C). In 2008 the W3C updated its standards to WCAG 2.0, and in 2011 the Access Board issued a second draft of updated accessibility standards for Section 508 (Access Board, 2011).

Some states have also passed laws or implemented policies that require accessibility of technology developed, procured, or used by state-funded colleges and universities. For example, California State University (CSU) has a system-wide policy, articulated in Executive Order 926 (2004) and implemented by the Accessible Technology Initiative (ATI) outlined in the 2007 Coded Memo AA-2007-04 and revised in Coded Memo AA-2010-13 (CSU, 2010). In CSU's shared governance approach, the Chancellor's Office provides leadership regarding IT accessibility, but responsibilities for implementation are widely distributed. The policy defines implementation roles for an ATI Leadership Council, campus executive sponsors and administrators (e.g., presidents, provosts, CIOs, vice presidents), academic and faculty senates, centers for faculty development, and disability support services (Hanley, Mitrano, Thompson, Goldstein, Martin, & Krishnaswamy, 2011). Similarly, the State of Texas requires that all state websites, including those sponsored by state-funded postsecondary institutions, comply with Texas Administrative Code 206.70 (effective September 1, 2006), which adopts in part the Section 508 standards for web accessibility, and further requires each institution to establish its own IT accessibility policy. Also, the Illinois Information Technology Accessibility Act, passed in 2007, requires Illinois agencies and universities to ensure that their websites, information systems, and information technologies are accessible to people with disabilities. The law is accompanied by standards, implementation guidelines for web-based information and applications, and procurement recommendations (Illinois Department of Human Resources, n.d.).

For more than a dozen years, published studies have reported the accessibility of select web pages at institutions of higher education. They include Rowland & Smith (1999); Flowers, Bray, & Algozzine (1999); Jackson (1999); Rowan, Gregor, Sloan, & Booth (2000); Walden, Rowland, & Bohman (2000); Schmetzke (2001, 2002a, 2002b, 2002c, 2003, 2007); Axtell & Dixon (2002); Byerley & Chambers (2002); Erickson (2002); Johns (2002); Opitz, Savenye, & Rowland (2003); Kelly (2002); McCord, Frederiksen, & Campbell (2002); McMullin (2002); Riley (2002); Horwath (2002); Thompson, Burgstahler, & Comden (2003); Thompson, Burgstahler, & Moore (2007); Thompson, Burgstahler, Moore, Gunderson, & Hoyt (2007); Comeaux & Schmetzke (2007); Harper & DeWaters (2008); Wijayaratne & Singh (2010); Behzad (2011); Espadinha, Pereira, Da Silva, & Lopes (2011); and Michaillidou, Mavrou, & Zaphiris (2012). These studies have employed a variety of methods to evaluate the accessibility of web pages. Most used automated tools; others tested pages manually; still others employed a combination of methods. Automated tools make it reasonable to test a much larger sample of pages than manual methods and are more reliable than humans at collecting very specific data (e.g., whether images have alt attributes, whether web pages include HTML heading elements). However, manual methods are better at assessing the overall functional accessibility of a page for specific users as well as the relative impact of accessibility errors within the scope of the overall user experience. For example, two pages might be rated identically by an automated tool if both are missing alternate text on images; however, one page might be missing alternate text only on minor, insignificant images whereas the other might be missing alternate text on images that represent the items in a navigation menu. From a functional standpoint, the second page is much less accessible than the first, but automated tools are unable to make that distinction.
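The kind of automated check described above can be sketched in a few lines. The Python snippet below is an illustration only (it is not the study's own tooling) and counts images with and without alt attributes using the standard-library HTML parser. As the example in the text suggests, such a check can count missing alt attributes but cannot judge how important each image is.

```python
from html.parser import HTMLParser

class AltTextAuditor(HTMLParser):
    """Tallies <img> elements with and without an alt attribute."""
    def __init__(self):
        super().__init__()
        self.with_alt = 0
        self.without_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            if any(name == "alt" for name, value in attrs):
                self.with_alt += 1
            else:
                self.without_alt += 1

# A decorative spacer and a navigation image look the same to the auditor
page = '<img src="logo.png" alt="University logo"><img src="nav-home.gif">'
auditor = AltTextAuditor()
auditor.feed(page)
print(auditor.with_alt, auditor.without_alt)  # prints "1 1"
```

An auditor of this kind reports only presence or absence; deciding whether the image without alt text is a spacer or a navigation link still requires a human reviewer.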

Most studies that have evaluated web accessibility of higher education institutions have focused on the institutional home pages. Thompson et al. (2003) used a broader scope for their study by conducting manual assessments of eleven pages that are common to most institutions, including the home page, search page, campus directory, academic calendar, library home page, employment home page, and five others. By doing so, the researchers hoped to get a better sense of each institution’s overall commitment to web accessibility.

Despite the differences in methods, studies have consistently shown that the websites of higher education institutions, both in the U.S. and abroad, have had and continue to present significant accessibility barriers to visitors with disabilities. Clearly, continued effort is needed to identify and implement strategies that will address this problem.

Campus Efforts to Ensure Accessible Websites

Postsecondary schools have responded in a range of ways to the problem of web accessibility. Some have made no effort in this regard; some individual campus units have made significant efforts; some schools are working for accessibility systematically and campus-wide. Some campuses provide accessibility guidance online for their campus communities (e.g., University of Washington, n.d.). Some schools provide specialized training on web accessibility to faculty and staff who design web pages. Some have licensed tools that monitor web accessibility throughout the institution and provide feedback to site owners.

Increasingly, campuses are adopting policies that specifically address web accessibility. Some institutional policies broadly encompass all IT, not just websites. Thompson, Draffan, & Patel (2009) conducted a survey in which 149 individual self-selected representatives from 106 higher education institutions responded to questions related to their technology accessibility efforts. Of these, 55% reported having a documented policy regarding web accessibility, and 46% reported having policies and/or procedures that require consideration of accessibility when acquiring IT. Bradbard, Peters, & Caneva (2010) reported that fifty of the fifty-eight original land grant institutions (86%) have web accessibility policies. However, they also found that the majority of these policies had what they consider to be serious deficiencies, such as failing to clearly define whom the policy covers; to articulate definitions and information clarifying what is meant by web accessibility; and to provide implementation timelines, enforcement mechanisms, and consequences for noncompliance.

The purpose of the present study is to assess the current state of web and PDF accessibility at postsecondary institutions in the United States, and to assess the impact of various factors on accessibility.


The current study expands upon existing knowledge by conducting a comprehensive semi-automated web search for web or IT accessibility policies in all higher education institutions in the United States, while also collecting automated data to measure the accessibility of institutional websites and frequency data regarding the number of results found when searching for "web accessibility" or "technology accessibility" on the websites of each institution (a measure we refer to as "conversation").

Research Questions, Hypotheses and Variables

The researchers sought to answer the following research questions:

  1. What is the current state of web and PDF accessibility at higher education institutions in the United States?
  2. How many higher education institutions in the United States have policies related to web and/or IT accessibility?
  3. How do institutions compare on their level of conversation related to web and technology accessibility?
  4. Which independent variables are the best predictors of web and PDF accessibility?

Hypotheses for the study are:

  1. Level of institutional "conversation" about accessibility (as measured by the number of search results for related terms) will be a better predictor of high accessibility measures than the presence of an accessibility policy, but
  2. Institutions that have both an accessibility policy and a high level of conversation will have the most accessible web pages.

In other words, if an institution has a policy, but has not done much to raise awareness of that policy with training, online resources, and other communications, accessible practices are not likely to have permeated the online community at that institution. Conversely, if an institution has a lot of online resources related to IT and/or web accessibility but no policy, it will likely be more accessible than institutions with policies but little additional effort. However, an institution will be even more likely to demonstrate accessibility if that conversation is sanctioned with an institutional policy.

The dependent variable in this study is:

The independent variables for the current study are:


A data set containing 4365 U.S. higher education institutions was downloaded from the Carnegie Foundation for the Advancement of Teaching (2012). Internet domain names were collected for these institutions from Universities Worldwide, The University of Texas at Austin, and Google.com. Excluded from the study were institutions located outside the fifty U.S. states and District of Columbia, and institutions for which domain names were not found. Institutions and educational organizations with multiple campuses all sharing a central website (e.g., ITT Technical Institute, Everest College, Kaplan College, Devry University, Sanford-Brown) were included as a single record.


The web pages tested for accessibility were the top ten HTML pages returned by a Google search within each institution's domain. The researchers also conducted an automated assessment of each institution's top ten PDF files, also collected using a Google search. By examining a variety of web pages and a variety of PDF files at each institution, the researchers hoped to measure the level to which an institution is embracing accessibility and practicing accessible design, development, and authoring techniques campus-wide.
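A domain-restricted query of this kind can be issued programmatically. The sketch below builds a request URL for Google's Custom Search JSON API; it is a hypothetical illustration (the study's harvesting procedure was written in PHP, and the API key and search-engine ID shown here are placeholders).

```python
from urllib.parse import urlencode

API_ENDPOINT = "https://www.googleapis.com/customsearch/v1"

def build_search_url(api_key, engine_id, domain, file_type=None, num=10):
    """Builds a Custom Search API request URL restricted to one
    institution's domain, optionally limited to a file type (e.g., PDF)."""
    params = {
        "key": api_key,        # placeholder credential
        "cx": engine_id,       # placeholder custom search engine ID
        "q": f"site:{domain}", # restrict results to the institution's domain
        "num": num,            # the API returns at most 10 results per request
    }
    if file_type:
        params["fileType"] = file_type
    return API_ENDPOINT + "?" + urlencode(params)

url = build_search_url("MY_KEY", "MY_ENGINE_ID", "example.edu", file_type="pdf")
print(url)
```

Fetching this URL and parsing the JSON response would yield the kind of top-ten page list the study tested.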

Web page accessibility data was collected using a semi-automated procedure developed in PHP, and utilizing Google’s Custom Search API (Google). The procedure performed the following functions:

For institutions to be included in the study, researchers required that a minimum of five web pages be returned by the Google search query. Some small schools did not meet this restriction. Each institution that failed to meet this threshold was re-evaluated approximately one month later in case the low return rate had been due to temporary errors either at Google or on the institution’s website. Ultimately, 3251 institutions were included in the study, and 31,701 web pages and 28,395 PDF files were reviewed for accessibility. The number of institutions by Carnegie classification is shown in Table 1.

Table 1. Institutions sample size by Carnegie classification.
Type of Institution Count
Associate's Colleges 1204
Master's Colleges and Universities 623
Baccalaureate Colleges 594
Special Focus Institutions 518
Doctorate-granting Universities 283
Tribal Colleges 29
Total 3251

Of the web pages evaluated, 566 were being redirected to other pages using the HTML <meta> element to refresh the pages. In these cases, the original URL was replaced with the URL of the new target page and data was re-collected using the new URL.
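Redirects of this kind can be detected by looking for a <meta> refresh tag and extracting the target URL from its content attribute. A minimal sketch of that step (illustrative; not the authors' code):

```python
import re
from html.parser import HTMLParser

class MetaRefreshFinder(HTMLParser):
    """Extracts the redirect target from a <meta http-equiv="refresh"> tag."""
    def __init__(self):
        super().__init__()
        self.target = None

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr = dict(attrs)
        if attr.get("http-equiv", "").lower() == "refresh":
            # content typically looks like "0; url=http://example.edu/new"
            match = re.search(r"url\s*=\s*(\S+)", attr.get("content", ""), re.I)
            if match:
                self.target = match.group(1)

finder = MetaRefreshFinder()
finder.feed('<meta http-equiv="refresh" content="0; url=http://example.edu/new">')
print(finder.target)  # prints "http://example.edu/new"
```

When a target is found, the original URL can be replaced and the data re-collected from the new URL, as described above.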

Also, 1502 (4.7%) of web pages in the sample had no images. Upon further investigation, there appeared to be a variety of reasons for this:

The researchers did not attempt to distinguish between pages that truly had no images and pages that were using these alternative methods to deliver images.

The following composite accessibility scores were calculated from the data for each institution:

After data was collected for all institutions according to the semi-automated process described above, the researchers conducted a manual analysis of the identified policies. The policy analysis team included two individuals who are national leaders in the technology accessibility field, with over 40 years of combined experience, including experience with accessibility planning and policy development. A third team member holds a Master's Degree in Policy Studies. The team worked together over multiple sessions to review, discuss, and categorize policies. Policies were classified into the following four policy types: formal, standalone; formal, incorporated; standards or guidelines; and general statement.

A formal policy was defined as a plan or edict that was issued from the institutional or state leadership level. In order for a policy to be classified as formal, it also had to include language such as "required", "shall", and "will" with respect to accessibility requirements. In rare cases, institutions were found to have a policy despite not using the word "policy" in the document's title or other reference materials. Predominant use of "should" or "recommended" within text relegated the classification to standards/guidelines or a general statement. Formal policies nearly always contained references to standards such as the W3C's Web Content Accessibility Guidelines (WCAG) or Section 508. In some cases the standards were independently designed by the state or institution.

A study that gathers data from the entire population rather than a sample does not have the same need for inferential statistics as one that reviews only a sample. However, inferential statistics were applied to this dataset since the webpages studied were a sample of each institution’s webpages, and the data collected is a snapshot of characteristics that could (and likely will) be different the next time they are measured. Thus Analysis of Variance (ANOVA) and correlational analysis were used to provide a measure of confidence in the dependability of the relationships among the independent and dependent variables.
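As a reminder of the computation behind ANOVA, the one-way F statistic is the ratio of the between-group mean square to the within-group mean square. The following minimal illustration is not the authors' analysis code, and the sample scores are made up:

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic: the ratio of the between-group
    mean square to the within-group mean square."""
    all_values = [x for g in groups for x in g]
    grand_mean = sum(all_values) / len(all_values)
    group_means = [sum(g) / len(g) for g in groups]
    # Between-group sum of squares, df = k - 1
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, group_means))
    df_between = len(groups) - 1
    # Within-group sum of squares, df = N - k
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, group_means) for x in g)
    df_within = len(all_values) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical accessibility scores for two groups of institutions
print(one_way_anova_f([[1, 2, 3], [2, 3, 4]]))  # prints "1.5"
```

A larger F (relative to its degrees of freedom) indicates that differences between groups are large compared with the variability within groups.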

With such a large number of subjects, this study has great statistical power, meaning that analyses are sensitive to small and possibly unimportant effects. Thus, when possible, effect size is reported. Finally, the individual measures of accessibility and conversation often deviated dramatically from a normal distribution. To ensure that effects were not due to, or obscured by, violations of the assumptions of the statistical techniques used, these variables were transformed using cut-points that mapped scales ranging from 0 to 100 percent onto more normally distributed scales ranging from 0 to 4 or 0 to 6. To correct the significant positive skew of the two conversation variables, values more than three standard deviations from the mean were truncated at that value. When conclusions were the same regardless of whether the original or transformed variables were used, results are discussed in terms of the original variables.
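The two transformations can be sketched as follows. The cut-points and sample numbers below are placeholders, since the paper does not report the exact values used:

```python
import statistics

def truncate_outliers(values, n_sd=3):
    """Caps values more than n_sd standard deviations above the mean,
    reducing the positive skew of a conversation measure."""
    mean = statistics.mean(values)
    sd = statistics.pstdev(values)
    cap = mean + n_sd * sd
    return [min(v, cap) for v in values]

def bin_percentage(value, cut_points=(20, 40, 60, 80)):
    """Maps a 0-100 percent score onto a 0-4 ordinal scale by counting
    how many cut-points the score meets or exceeds."""
    return sum(value >= c for c in cut_points)

print(bin_percentage(77.9))  # prints "3" (meets the 20, 40, and 60 cut-points)
```

Binning compresses extreme values into a small number of ordered categories, which moves a heavily skewed percentage distribution closer to the assumptions of ANOVA and regression.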


The results of the current study are organized by research question in the paragraphs that follow.

Research Question 1: What is the current state of web and PDF accessibility at higher education institutions in the United States?

Overall, level of accessibility varies considerably across the measures used in this study. The accessibility features observed most frequently were headings (on average, 77.9% of institutions’ top web pages had headings) and alternate text on images (on average, 60.4% of institutions’ images across all pages had alternate text). Other accessibility features occurred less frequently (39.8% of input elements have labels, 37.3% of pages have language identified, and 33.8% of PDF documents are tagged). The least frequently used accessibility feature was ARIA landmark roles, with only 3.3% of pages including this feature. This result is not surprising, as ARIA landmarks are a relatively new accessibility feature. Table 2 summarizes these findings.

Doctorate-granting Universities have the highest ratings of all Carnegie classifications on each of the web accessibility measures, but they have the lowest accessibility ratings on PDF accessibility (p<.001 for overall accessibility measure, alternate text, headings, language, and PDF).

Table 2. Accessibility Ratings by Carnegie Classification
Carnegie Classification Alt Text Labels Headings ARIA Lang PDF OVERALL (Web)
Associate's Colleges 62.4 38.8 74.7 3.1 33.3 39.7 42.5
Master's Colleges and Universities 65.1 43.7 81.5 2.8 37.2 32.7 46.1
Baccalaureate Colleges 58.8 39.9 78.6 3.2 37.6 31.8 43.6
Special Focus Institutions 48.8 35.8 77.0 3.8 41.6 32.4 41.4
Doctorate-granting Universities 66.1 44.2 83.6 4.2 45.6 20.7 48.8
Tribal Colleges 58.8 25.6 74.7 3.4 45.2 40.7 41.6
Full Sample 60.4 39.8 77.9 3.3 37.3 33.8 43.7

Small differences in accessibility were detected by region. Overall web accessibility ratings ranged from 42.0 and 42.2 (Southeast and Southwest regions, respectively) to 45.4 (Pacific region). Examination of effect sizes shows that these differences are likely too small to be practically important.

Also, institutions within the California State University system or those subject to Texas or Illinois state laws and policies (when analyzed as one group) tend to have higher overall ratings on accessibility (F(3,3247)=8.9; p<.001); however, the very small effect sizes suggest a high degree of variability among institutions both within these states and among the other states. The implication of this variability is that while state accessibility laws and policies seem to have a small effect on actual accessibility, factors not measured in this study are more influential in determining the actual accessibility outcome. The most robust finding was a greater likelihood of an accessibility link on the home page among these states (between 30% and 40%), compared with an average of 7% of the other institutions (F(3,3247)=31.1; p<.001). Also, institutions in the California State University (CSU) system have higher average overall web accessibility than other groups of institutions (60% vs. about 50% in Texas and Illinois, and 44% in the other states), and are especially high on alternate text (80%, compared with about 70% of Texas and Illinois, and an average of 60% in the other states), labels (75% in CSU, 50% to 55% in Texas and Illinois, and 39% in the other states). Though these percentage differences seem meaningful, it is important to remember that the small effect sizes (well below the minimum guideline to be considered important) indicate almost as much variability within these states as between them. Interestingly, all three of these statewide groups of institutions rated lower than average on PDF accessibility (CSU = 25.8%, Texas = 31%, and Illinois = 30.8%, whereas the overall mean is 33.8%). Table 3 provides additional details on the accessibility results for these states.

Table 3. Accessibility Ratings for Public Institutions Subject to Laws or Policies in Select States
State/System Alt Text Labels Headings ARIA Lang PDF OVERALL (Web)
California State University 80.3 75.1 88.1 6.1 49.0 27.0 59.7
State of Texas 72.8 55.4 85.4 3.3 44.7 31.0 52.3
State of Illinois 71.4 51.2 76.8 1.5 45.3 30.8 49.2
Full Sample 60.4 39.8 77.9 3.3 37.3 33.8 43.7

Research Question 2: How many higher education institutions in the United States have policies related to web and/or IT accessibility?

A total of 274 policies were found (8.4% of the total population) that address web and/or technology accessibility. A variety of types of policies were represented; Table 4 shows a breakdown. There is a positive relationship between degrees awarded and the presence of a policy. Doctorate-granting Universities are the most likely Carnegie Classification to have a policy (26.1% of institutions), while Master's Colleges and Universities are the second most likely (14.8%). These numbers are considerably lower than those reported by Bradbard, Peters, and Caneva (2010), who found policies at 86% of the original land grant institutions. In the present study, policies were found at 56% of the original land grant institutions (excluding institutions that are located outside of the 50 states). The discrepancy between these findings and those of Bradbard et al. may be due to differences in method (Bradbard et al. conducted an exhaustive search, even contacting institutions if they couldn't find a policy online) or in how policy was defined.

Of the Doctorate-granting Universities at which policies were found in the present study, the most common type of policy was formal-standalone (47.3% of policies), followed by formal-incorporated (24.3%). At Master’s Colleges and Universities formal-standalone and formal-incorporated policies were observed with approximately equal frequency (about 39%). Other types of institutions have fewer policies overall, and where they exist they tend to be expressed as general statements rather than formal policies. See Table 5 for types of policies per Carnegie Classification.

Table 4. Frequency of Web/Technology Accessibility Policies by Carnegie Classification
Type of Institution Policies Found Percent of Institutions
Associate's Colleges 76 6.3%
Master's Colleges and Universities 92 14.8%
Baccalaureate Colleges 21 3.5%
Special Focus Institutions 11 2.1%
Doctorate-granting Universities 74 26.1%
Tribal Colleges 0 0%
Total 274 8.4%

Table 5. Frequency of Web/Technology Accessibility Policies by Carnegie Classification and Type of Policy
Associates Masters Baccalaureate Specialty Doctorate
Count Percent Count Percent Count Percent Count Percent Count Percent
Formal, Standalone 21 27.6% 36 39.1% 1 4.8% 3 27.3% 35 47.3%
Formal, Incorporated 17 22.4% 36 39.1% 7 33.3% 1 9.1% 18 24.3%
Standards or Guidelines 12 15.8% 6 6.5% 5 23.8% 0 0% 10 13.5%
General Statement 26 34.2% 14 15.2% 8 38.1% 7 63.6% 11 14.9%
Total 76 100% 92 100% 21 100% 11 100% 74 100%

Research Question 3: How do institutions compare on their amount of conversation related to web and technology accessibility?

Results show broad variance in the amount of "conversation" related to web or technology accessibility. 1098 institutions have 0 results when searching for "web accessibility," and 1002 have 0 results when searching for "technology accessibility." In contrast, 102 institutions have more than 1000 results when searching for "web accessibility" (including six with more than 10,000), and 110 institutions exceed 1000 when searching for "technology accessibility" (including three with more than 10,000). The two institutions that top both lists are The University of Texas at Austin and Penn State University. The mean number of pages returned was 156.4 for "web accessibility" and 130.8 for "technology accessibility." The very high counts are concentrated among institutions that have "Accessibility" links on their home pages, which suggests that these links are included in templates that affect large numbers of pages across their entire websites. Doctorate-granting Universities are the most likely type of institution to include these links, with 20.5% of institutions having them. Most of these links simply use the word "Accessibility" by itself (67.4%). Ten institutions have links on their home pages to "Accessibility Statement," and ten others to "Accessibility Policy." Table 6 shows accessibility links by Carnegie classification.

Table 6. Accessibility Links on Home Pages, by Carnegie Classification
Carnegie Classification # of Accessibility Links % of Institutions
Associate's Colleges 86 7.1%
Master's Colleges and Universities 51 8.2%
Baccalaureate Colleges 31 5.2%
Special Focus Institutions 10 1.9%
Doctorate-granting Universities 58 20.5%
Tribal Colleges 0 0%
Total 242 7.4%

There is a strong positive relationship between conversation measures and the presence of a home page link. However, despite this overall trend, several institutions with high conversation measures do not have an "Accessibility" link on their home pages. For example, the list of the top fifty institutions with respect to "web accessibility" search results includes seventeen institutions that do not have "Accessibility" links. In fact, the institution that ranks second, Penn State University, does not have a link. The high level of conversation at Penn State may be in response to the National Federation of the Blind filing a much-publicized Office for Civil Rights complaint against the university for multiple violations related to technology accessibility (National Federation of the Blind, 2010).

Research Question 4: Which independent variables are the best predictors of web and PDF accessibility?

Overall, institutions with an accessibility policy of any type had higher overall web accessibility ratings. Figure 1 shows the prevalence of various accessibility features across the institutions, from nearly 80% overall including HTML headings down to about 3% including ARIA landmarks. The specific features where having a policy may make a difference are the inclusion of alt text with images; labeled input fields; and, especially, having an accessibility link on the home page. Having a policy is negatively associated with having tagged PDFs. Once again, these effect sizes are small, due to the large amount of variance within the two groups.

Figure 1. Accessibility ratings by presence of accessibility policy
A bar graph shows the generally positive relationship between accessibility policy and performance on several web accessibility measures

Figure 2 shows the relationship between an accessibility policy and the level of conversation about web and/or technology accessibility. When the institution’s home page has an accessibility link, the overall level of conversation is much higher than at institutions without an accessibility link on the home page, and higher yet among institutions that also have an accessibility policy. Among institutions without an accessibility link on the home page, institutions with an accessibility policy average nearly 10 times the level of conversation as those without an accessibility policy.

Figure 2. Accessibility conversation by accessibility policy and presence of accessibility link on home page
A bar graph shows a significantly positive relationship between accessibility policy and the amount of conversation, especially significant at institutions where there is no accessibility link on the home page

Figure 3 shows that having some type of accessibility policy in place is related to higher accessibility ratings on alternate text on images, labels on form fields, and overall accessibility. No significant differences were found on HTML headings, ARIA landmarks, and use of the LANG attribute. Although differences in accessibility emerged among policy categories, the effect sizes were very small, and no distinct pattern emerged showing one category of policy to be more effective in increasing the use of accessibility features overall. However, post-hoc analysis showed that compared to other types of policies, formal-standalone policies are related to greater use of accessibility features overall.

Figure 3. Accessibility ratings by type of accessibility policy
A bar graph shows the complex relationships between type of accessibility policy and performance on several web accessibility measures

Figure 4 shows that having a formal-standalone accessibility policy is related to a significantly higher level of accessibility conversation, an effect that is particularly striking in the absence of an accessibility link on the home page.

Figure 4. Accessibility conversation by category of accessibility policy and presence of accessibility link on home page
A bar graph shows the relationship between type of accessibility policy and conversation. Formal, standalone policies have an especially strong effect

To determine the relative impact of a web accessibility policy and of conversation about accessibility on overall accessibility, three multiple regression analyses were conducted. The first revealed that the number of web accessibility conversations accounted for 2% of the variance in overall accessibility (F(1,3249)=67.7; p<.001). The second revealed that the presence of any policy accounted for 1.2% of the variance in overall accessibility (F(1,3249)=37.8; p<.001). The third analysis revealed that, combined, these two predictors accounted for 2.4% of the variance in overall accessibility (F(2,3248)=41.4; p<.001).
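The "variance accounted for" figures above are R&sup2; values from ordinary least-squares regression; for a single predictor, R-squared is simply the squared Pearson correlation between predictor and outcome. A small illustration with made-up numbers (not the study's data):

```python
def r_squared(x, y):
    """Proportion of variance in y accounted for by a simple
    least-squares regression on x (the squared Pearson correlation)."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    sxy = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sxx = sum((a - mean_x) ** 2 for a in x)
    syy = sum((b - mean_y) ** 2 for b in y)
    return sxy ** 2 / (sxx * syy)

# Hypothetical conversation counts and overall accessibility scores
conversation = [0, 5, 10, 200, 1000]
accessibility = [40, 42, 41, 46, 50]
print(round(r_squared(conversation, accessibility), 2))
```

In the study, even the combined model leaves the large majority of variance unexplained, which is why the authors emphasize small effect sizes despite statistically significant F tests.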

In a final analysis, researchers conducted an exploratory stepwise multiple regression analysis to predict the overall accessibility score from all available information. The final model accounted for a modest 3.3% of the variance in the overall accessibility ratings, indicating that the strongest predictors of accessibility were not among the variables available to the model. Taken together, the model produced by this multivariate analysis indicates that greater overall accessibility scores are associated with:

Noteworthy Institutions

Several institutions across all Carnegie Classifications were rated significantly higher than average on web accessibility measures. Following are profiles of a few of these institutions, as well as institutions that for reasons described in their profiles were determined to have noteworthy policies or practices.

Asnuntuck Community College

Asnuntuck had the best overall accessibility rating among Associate’s Colleges (94.8). It had perfect ratings on all measures except alternate text (74%). However, it attained this high level of accessibility with no web or technology accessibility policy, no accessibility link on the home page, and no identified conversation on web or technology accessibility. A manual review of the site confirmed that accessibility had been a design consideration. In addition to the characteristics identified by the automated check (notably good heading structure and good use of ARIA landmarks), the manual check revealed "Jump to Navigation" and "Jump to Main Content" links at the top of the page, a dropdown menu that was keyboard accessible, and generally good color contrast.

Rockhurst University

Rockhurst had the best overall accessibility rating among Master’s Colleges and Universities (89.8): it had perfect ratings on labels and headings, 90% ratings on ARIA landmarks and language, and a 69% rating on alternate text. However, none of its PDF files were tagged. Like Asnuntuck Community College, Rockhurst had no policy, no accessibility link, and very low scores on conversation measures (5 results for "web accessibility" and 6 for "technology accessibility").

University of North Carolina at Chapel Hill (UNC – Chapel Hill)

UNC – Chapel Hill had the best overall accessibility rating among Doctorate-granting Universities (80.8). All of its pages in the research sample had headings, and it had very high scores on language (90%) and alternate text (89%) and above-average scores on labels (65%) and ARIA landmarks (60%). A manual review of the site revealed a few additional accessibility features, as well as a few shortcomings. UNC – Chapel Hill did not have a web or technology accessibility policy, but it did have an "Accessibility" link on the site template that led to an Accessibility page featuring a statement of commitment, tips for users with disabilities, and information and resources for developers. The accessibility link influenced its conversation ratings, which were very high (2,870 for "web accessibility", 1,690 for "technology accessibility"). Thirty percent of the institution’s PDFs were tagged, but a manual inspection revealed that none of these documents included accessibility features such as good heading structure or alternate text for images.

California Polytechnic State University (Cal Poly)

Cal Poly had the best overall accessibility rating (78.2) among the 23 institutions that constitute the California State University (CSU) system, and was one of six CSU institutions with overall ratings above 70. Cal Poly had perfect ratings on headings, labels, and language, and a high rating on alternate text (91%); it had not yet implemented ARIA landmarks on its site. A manual inspection revealed additional accessibility features beyond those captured in the automated assessment, including a simple, easy-to-read design; very good color contrast; and a "Pause Rotating Stories" link at the top of the page that enables users to pause the slideshow on the home page, along with accessible controls for advancing the slideshow manually. Cal Poly had also tagged 50% of its top PDFs, although a manual inspection of these PDFs revealed that none were optimized for accessibility. As part of the CSU system, Cal Poly is subject to the requirements of CSU Executive Order 926, and it has a formal-standalone policy that reaffirms this responsibility and links out to various supporting documents, both internal and system-wide (California Polytechnic State University, n.d.).

The University of Texas at Austin (UT-Austin)

UT-Austin had the highest number of search results for "web accessibility" (36,500), far exceeding the values for all other institutions, and the second highest for "technology accessibility" (14,900), behind Penn State. These high conversation scores were likely affected by the size of the campus website in combination with the presence of a "Web Accessibility Policy" link in the site template. The link led to a Web Accessibility Policy, which reaffirmed that the institution must meet the requirements of Texas Administrative Code 206.70. The policy page also identified an Accessibility Coordinator whose role is to monitor compliance, facilitate training, and grant written requests for exceptions to accessibility policies under appropriate circumstances (University of Texas at Austin, n.d.). UT-Austin’s web accessibility ratings were well above average on all measures except ARIA landmarks, which had not yet been implemented on the site. Unfortunately, only one of the institution’s top ten PDFs was tagged, and a manual inspection of that PDF revealed that it was not optimized for accessibility.

Holy Apostles College and Seminary (Holy Apostles)

Holy Apostles was one of 28 institutions at which 100% of PDFs in the sample were tagged. A manual inspection of these PDFs revealed that they were in fact tagged for accessibility, with very good heading structure. Holy Apostles also did well on other web accessibility measures, with perfect ratings on all measures except alternate text on images (79%) and ARIA landmarks, which it had not yet implemented on any pages. Holy Apostles had no web or technology accessibility policy, no accessibility link, and no identified conversation on either "web accessibility" or "technology accessibility".

University of Illinois at Chicago (UIC)

UIC was one of only four Doctorate-granting Universities at which 90% of PDFs in the sample were tagged. A manual inspection of these PDFs revealed that they were in fact tagged for accessibility, with very good heading structure. Notably, the PDFs had been created by printing to PDF using Acrobat Distiller, a process that does not by itself produce tagged PDFs, which suggests that tags had been added in a separate post-production step. UIC is subject to the Illinois Information Technology Accessibility Act, and it has a formal-standalone policy that reaffirms that requirement and links out to the IITAA Implementation Guidelines (University of Illinois at Chicago, n.d.).

North Carolina State University (NCSU)

NCSU had a formal-standalone policy that in many ways stood out as a model. All terms and requirements were clearly defined, and the policy included an enforcement mechanism that carried a possible penalty of removal of resources. The policy was issued by the Chancellor in 2006 and revised in 2011; the revision expanded the scope from web accessibility to accessibility of all information and communication technologies (ICT), including requirements that accessibility be addressed in procurement, and specifically addressed "emerging technologies". An "Accessibility" link on the website template pointed to a "Legislation and Policies" page that describes and links to the ICT accessibility policy. Related to the policy and link, NCSU was high on conversation measures, with 5,920 results for "web accessibility" and 5,760 for "technology accessibility". However, despite the exemplary policy and effort, NCSU was not among the highest on web accessibility measures; its overall rating was 61.2. NCSU had high ratings on headings (90%) and language (70%), and it used ARIA landmarks on 40% of pages, well above the mean. However, its rating on labels (44.2%) was only slightly above the mean for Doctorate-granting Universities, and its rating on alternate text (56%) was below the mean (66.1%).

Discussion, Limitations, and Future Research

The results of this study confirm the researchers’ hypotheses that accessibility measures can be predicted by both institutional policy on accessibility and level of "conversation" about accessibility. These variables account for a significant amount of the variance between institutions on web accessibility measures; however, the effects are small. Many of the institutions that perform well on accessibility have no accessibility policy and very little or no conversation, that is, very few web pages that mention or discuss web or technology accessibility. Institutions that do exceptionally well with no policy and no significant conversation tend to be smaller institutions, where websites are likely to be highly centralized, with a small number of individuals responsible for their creation and maintenance. If these individuals are knowledgeable about accessibility and motivated to improve it, that knowledge and motivation may be reflected in their work. This suggests that, at a smaller institution with a highly centralized web management approach, a highly effective strategy for improving accessibility is to ensure that the individuals who create and maintain the institution’s web pages are knowledgeable about accessibility issues and practices, perhaps by making accessibility knowledge a required qualification when hiring new staff and by investing in accessibility-related training for these individuals.

For larger institutions with potentially hundreds of thousands of web pages and thousands of individuals building sites and creating content, the challenge is very different. At Doctorate-granting Universities, institutions that do best on web accessibility tend to either have high levels of conversation or web/technology accessibility policies or both. However, neither guarantees an overall high accessibility rating regarding their websites.

Similarly, the results of this study suggest that strong state laws and policies that require web or technology accessibility at covered institutions may have a positive effect overall, but also do not guarantee accessibility.

The results of the present study with regard to PDF accessibility are especially surprising: institutions with accessibility policies, including those subject to strong state laws and policies, had significantly lower measures of PDF accessibility than those without. This finding is likely influenced by the relative complexity of PDF documents at Doctorate-granting Universities, the group most likely to have policies. Smaller institutions may be more likely to produce comparatively simple documents by saving Microsoft Word documents as PDF, which in Word for Windows produces a tagged PDF by default. In contrast, Doctorate-granting Universities may use a greater variety of authoring tools, many of which require additional steps to add tags to the PDF output.

Also, it is important to note that the current study only measured whether PDFs are tagged. Tagging PDFs is an important first step in making them accessible, as without an underlying tag structure it is impossible to add other accessibility features such as headings and alternate text on images. However, tags alone do not make a PDF accessible. Additional good practices are required in order to ensure that tagged PDFs have accessible features. Still, only 33.8% of PDFs in the current study met even the very loose measure of PDF accessibility used, and that value was only 20.7% at Doctorate-granting Universities. Further, manual reviews of a small sample of those PDFs that had tags revealed that they did not have other accessibility features such as headings and alt text. A more extensive examination of PDFs is likely to reveal that a much lower percentage of these documents are truly accessible across the entire population. Since PDFs are used extensively in higher education for communicating both academic and administrative information, much more work is required in order to improve the state of PDF accessibility at institutions nationwide.
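The "tagged" measure discussed above corresponds to a specific flag in the PDF file: a tagged PDF declares /MarkInfo << /Marked true >> in its document catalog. The sketch below is a rough, byte-level illustration of such a check, not the study's actual tooling; the sample fragments are invented for demonstration.

```python
def pdf_is_tagged(data: bytes) -> bool:
    """Rough check for the tagged-PDF flag, the loose measure used here.

    A tagged PDF declares /MarkInfo << /Marked true >> in its catalog.
    This byte scan is a simplification: it misses flags stored in object
    streams or referenced indirectly, so real tooling should parse the
    catalog properly rather than search raw bytes.
    """
    # Normalize whitespace so dictionary layout does not matter.
    compact = b"".join(data.split())
    return b"/MarkInfo<</Markedtrue" in compact

# Minimal invented fragments, not real documents:
tagged_fragment = b"<< /Type /Catalog /MarkInfo << /Marked true >> >>"
untagged_fragment = b"<< /Type /Catalog /Pages 2 0 R >>"
print(pdf_is_tagged(tagged_fragment), pdf_is_tagged(untagged_fragment))
```

As the surrounding discussion notes, passing this check says nothing about whether the tag tree actually contains headings, alternate text, or a sensible reading order; it only confirms that a tag structure exists.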

Another limitation of this study is that the number of web pages evaluated for each institution was very small. Most higher education institutions have thousands to millions of web pages, and a sample of ten pages, no matter which ten are chosen, may be an unreliable indicator of whether the institution as a whole has embraced web accessibility.

Also, as noted in the Introduction, the specific automated measures of accessibility used in this study only tell part of the story. These measures are believed to be good indicators of, and positively correlated with, overall accessibility. Although a manual review was not practical for the present study, a truly accurate web accessibility review requires some level of manual evaluation.

Despite these limitations, the present study provides data that illustrate the complexity of the problem facing higher education institutions. Given that the independent variables measured in this study seem to have only a modest influence on accessibility ratings, additional factors must contribute to institutions’ web and IT accessibility. Additional research is warranted to identify these contributing factors, which may include:


This study has contributed to an understanding of the accessibility of IT at postsecondary institutions by exploring the impact of policies and campus conversations on the accessibility of campus websites and PDFs. The overall level of accessibility of websites tested varied greatly. Doctorate-granting Universities had the highest ratings of all Carnegie classifications on each of the web accessibility measures (except tagged PDFs), and institutions that are subject to state laws and policies (specifically in Texas and Illinois, as well as California State University institutions) also tended to have higher ratings than other institutions. Web or technology accessibility policies were found at 8.4% of the institutions, including 26.1% of Doctorate-granting Universities and 14.6% of Master’s Colleges and Universities. The web pages of institutions with policies had higher overall accessibility ratings; however, having a policy was negatively associated with having tagged PDFs. A large variance was observed in the level of conversation on "web accessibility" or "technology accessibility", ranging from 0 to over 10,000 pages per institution. Overall, the study found that having an accessibility policy and having a high level of accessibility conversation are both positively correlated with HTML accessibility. However, the small effect sizes suggest a high degree of variability among institutions. Neither policy nor conversation guarantees a high level of web accessibility, nor does their absence guarantee a low level of web accessibility. Also, any positive effect of policy and conversation on HTML web accessibility does not extend to PDFs. Further research is warranted in order to identify other factors that may contribute to institutions’ success at implementing accessible websites.



The content of this article is based upon work supported by the National Science Foundation (award #HRD-0833504 and #CNS-1042260). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the federal government.

Thompson, T., Comden, D., Ferguson, S., Burgstahler, S., & Moore, E. (2013). Seeking Predictors of Web Accessibility in U.S. Higher Education Institutions. Information Technology and Disabilities E-Journal, 13(1).