6 Fun Things to Do with the National Census of Writing Data!

The 2013 National Census of Writing database contains a wealth of data on writing education at the college level. As other blog posts here highlight, these data present opportunities to develop meaningful information that can answer a variety of research questions or pressing administrative needs.

But it’s also really interesting to just play around with! In the interests of promoting a sense of exploration with this data set, here are 6 Fun Things to Do with the National Census of Writing Data. (Note: Your definition of Fun may vary.)

1. See how responses vary across school type, size and geographic location

The census researchers collected an extraordinary amount of data from their respondents, mostly through dogged persistence. The research team spent months repeatedly calling and emailing respondents, chasing down referrals, and answering questions about the survey. And as responses came in, they repeated the whole process to clear up any issues with the data. The result was a 42% response rate for a survey that could take hours to complete. (Note that I’m referring to the “Four-Year Institution Survey” here and throughout this post.)

Once this was done, the researchers went back to the data and used publicly available information to break the responses down into a variety of categories, such as Carnegie classification, size, geographic area and more. This allows you to explore how responses vary across a wide range of institutions. For instance, I was unsurprised to learn that Midwestern schools are more likely to have a writing major (45%, versus 34% in other areas), given the historically Midwestern roots of Composition at land-grant colleges and normal schools. However, Southwestern schools (such as my own institution and my undergraduate alma mater) are most likely to have a writing minor.  
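To make the idea of breaking responses down by category concrete, here is a minimal sketch of that kind of tabulation. The field names and the toy data are hypothetical, not drawn from the actual census export; the real database offers this through its own search interface.

```python
from collections import defaultdict

# Hypothetical miniature of a census export; the real fields differ.
responses = [
    {"region": "Midwest", "writing_major": "Yes"},
    {"region": "Midwest", "writing_major": "No"},
    {"region": "Midwest", "writing_major": "Yes"},
    {"region": "Southwest", "writing_major": "No"},
    {"region": "Southwest", "writing_major": "Yes"},
]

# Tally "Yes" answers per region, then compute each region's share.
totals, yeses = defaultdict(int), defaultdict(int)
for r in responses:
    totals[r["region"]] += 1
    yeses[r["region"]] += r["writing_major"] == "Yes"

shares = {region: yeses[region] / totals[region] for region in totals}
print(shares)
```

The same pattern works for any of the categories the researchers added after the fact (Carnegie classification, size, and so on): group by the category, then compute a share within each group.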

2. Browse open-response questions

Obviously, not all questions can be answered with a checkbox. The survey included many open-response questions in order to capture a breadth of information. These data would support a deep dive, but even a quick look can be revealing. For instance, the survey included a large number of multiple-choice questions about changes in the writing program. But it also provided an open-ended “Were there any additional changes?” question – which 188 (out of 680) respondents answered. Writing studies, it seems, is a discipline in flux.

Another thing you can learn from the open-response questions: we love acronyms. It can be a bit disorienting to scroll through hundreds of open-answer responses and see FYC, FYW, FWP, FWE, FYWS, ESL, ELL, EFL, WAC, WID, WAC/WID, WEC, etc. In some cases, these variations probably signal meaningful philosophical or material differences in how a program is delivered. Luckily, the census website includes a glossary that makes the researchers’ own assumptions explicit and helps anyone bogged down by the storm of Fs, Ws, Cs and Ds.

3. Overlap sets of responses to get more specific data

Using the Census’ Advanced Search, users can pull data sets that intersect multiple questions: in other words, “Of the programs that answered Yes to Question 8, how many also answered No to Question 15?” Using this tool, I discovered that there is exactly one respondent who administers First-Year Writing, administers Basic Writing, administers a writing center, holds a PhD… and earns under $30,000 a year. (I don’t know who you are or where you are, but you have my sympathy.)
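The logic behind that kind of overlap query can be sketched in a few lines. This is a hedged illustration with made-up respondent records and field names, not the Census’ actual data or interface; the Advanced Search performs the intersection for you.

```python
# Hypothetical respondent records; the real survey's questions differ.
respondents = [
    {"fyw": True, "basic_writing": True, "writing_center": True,
     "phd": True, "under_30k": True},
    {"fyw": True, "basic_writing": False, "writing_center": True,
     "phd": True, "under_30k": False},
    {"fyw": False, "basic_writing": True, "writing_center": True,
     "phd": False, "under_30k": False},
]

# Intersect the answer sets: each added condition narrows the pool further.
conditions = ["fyw", "basic_writing", "writing_center", "phd", "under_30k"]
matches = [r for r in respondents if all(r[c] for c in conditions)]
print(len(matches))
```

Each condition acts as a filter, and the matches are the respondents who survive every filter, which is why stacking several conditions can narrow hundreds of responses down to a single (in this case, underpaid) respondent.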

4. Get answers to some common WPA-Listserv questions

How are your courses staffed? What size are your courses? How do you handle placement? What is the content of your first-year writing course? These are broad but important questions that provide valuable benchmarks for WPAs working to develop their programs, so variations on them regularly show up on the WPA-L and other Listservs. Increasingly, the answer to these questions is “Check the Census!” It seems inevitable that the census data will become the standard reference.

5. Take the survey yourself – when it’s repeated next year

Given the amount of work that went into developing this data set, I’m frankly surprised anyone would sign up to administer the survey a second time. But the research team will be back next year with a new and updated survey. Previous respondents were identified primarily through internet searches and had no prior relationship with the researchers; with the census’ rising profile, the next round will likely see an even higher response rate and even more valuable data.

So those of you in administration, set aside some time to complete the survey. Everyone else, start prepping your inevitable “Trends in X Aspect of Writing Instruction, 2013-2017” CCCC presentations.

6. Contribute to the blog

Contributing a post here is a fun way to join this ongoing project. Just reach out to Jill or Brandon!