About the 2013 Survey

For the 2013 Census, data collection took place from March 2013 to October 2014. The research team took a targeted approach to eliciting survey data. Using a list of non-profit and public four-year institutions in the United States generated from the Integrated Postsecondary Education Data System (IPEDS), members of the research team visited institution websites to gather the email addresses of writing program, writing center, and writing across the curriculum directors; where institutions did not have explicitly titled directors, we gathered email addresses for the people most likely to hold those responsibilities. We then sent these professionals an invitation to participate in the survey. We repeated this process, with a slightly modified survey, for non-profit and public two-year institutions.

The Census database includes data from 680 of the 1,621 four-year institutions invited to participate (a 42% response rate) and from 220 of the 924 two-year institutions (a 24% response rate). Schools needed to complete at least one section of the Census survey to remain in the database. Sections of the 2013 Census include:

  • Sites of writing
  • First-year writing/English composition
  • Identifying and supporting diversely prepared students
  • Writing across the curriculum (WAC) and writing beyond the first year
  • The undergraduate and graduate writing major and minor
  • Writing centers
  • Administrative structures
  • Demographics of respondents

After each survey closed, the research team cleaned the data and, when questions emerged, contacted respondents for clarification. In the end, we dropped or modified some questions and left incomplete responses blank. Responses to certain questions were challenging to process because of the varying definitions that exist for elements of writing and writing administration. Some of the decisions the research team made about these questions are explained in the Glossary and Notes; these areas of disagreement will be examined further in future presentations and articles. To protect the privacy of respondents and to ensure that the identity of schools could not be deduced, any question with fewer than ten responses was omitted from the database.

To give users of the database the most useful data, we did not include incomplete or non-applicable responses with each question. This means that the "n" for each question varies.