Wednesday, March 1, 2017

MAA Calculus Studies: Use of Local Data

You can now follow me on Twitter @dbressoud.

Our 2012 study, Characteristics of Successful Programs in College Calculus (NSF #0910240), found that the most successful departments had a practice of monitoring and reflecting on data from their courses. When we surveyed all departments with graduate programs in 2015 as part of Progress through Calculus (NSF #1430540), we asked about their access to and use of such data, what we refer to as “local data.”

The first thing we learned is that a few departments report no access at all to data about their courses or about what happens to their students, and for almost half, access is not readily available (see Table 1). When we asked, “Which types of data does your department review on a regular basis to inform decisions about your undergraduate program?”, most departments reported reviewing grade distributions and paying attention to end-of-term student course evaluations (Table 2). Between 40% and 50% of the surveyed departments correlate performance in subsequent courses with the grades students received in previous courses, and a similar fraction look at how well placement procedures are being followed. Given how important it is to track persistence rates (see The Problem of Persistence, Launchings, January 2010), it is disappointing that only 41% of departments track these data. Regular communication with client disciplines is almost non-existent.

Table 1. Responses to the question, “Does your department have access to data to help inform decisions about your undergraduate program?” PhD indicates departments that offer a PhD in Mathematics. MA indicates departments for which the highest degree offered in Mathematics is a Master’s.

Table 2. Responses to the question, “Which types of data does your department review on a regular basis to inform decisions about your undergraduate program?”

We also asked departments to describe the kinds of data they collect and regularly review. Several reported combining placement scores, persistence, and grades in subsequent courses to better understand the success of their program. Some of the other interesting uses of data included universities that
  • Built a model of “at-risk” students in Calculus I using admissions data from the past seven years. Using the model, they report “developing a program to assist these students right at the beginning of Fall quarter, rather than target them after they start to perform poorly.” 
  • Surveyed calculus students to get a better understanding of their backgrounds and attitudes toward studying in groups. 
  • Collected regular information from business and industry employers of their majors. 
  • Measured the correlation of Calculus I grades with transfer status, year in college, gender, whether the student is repeating Calculus I, and GPA. 
  • Used data from the university’s Core Learning Objectives and a uniform final exam to inform decisions about the course (including the ordering of topics, emphasis on material and time devoted to mastery of certain concepts, particularly in Calculus II). 
  • Reviewed the performance on exam problems to decide if a problem type is too hard, a problem type needs to be rephrased, or an idea needs to be revisited on a future exam.
The intelligent use of data to shape and monitor interventions is a central feature of the large-scale initiatives now underway. To mention just one, the AAU STEM Initiative (run by the Association of American Universities, a consortium of 62 of the most prominent research universities in the U.S. and Canada) has established a Framework for sustainable institutional change (Figure 1).

Figure 1. AAU STEM Initiative Framework

The three levels of change are subdivided into topics, each of which links to programs at member universities that illustrate work on this aspect of the framework.

Cultural change encompasses
  1.  Aligning incentives with expectations of teaching excellence. 
  2.  Establishing strong measures of teaching excellence. 
  3.  Leadership commitment.
Scaffolding includes
  1.  Facilities. 
  2.  Technology. 
  3.  Data. 
  4.  Faculty professional development.
Pedagogy comprises
  1.  Access. 
  2.  Articulated learning goals. 
  3.  Assessments. 
  4.  Educational practices.
In addition, AAU is now finalizing a list of “Essential Questions” to ask about the institution, the college, the department, and the course, illustrating the types of data and information that should be collected and pointing to helpful resources. This report, which should be published by the time this column appears, will be accessible through the AAU STEM Initiative homepage.