One finding from our 2012 study, Characteristics of Successful Programs in College Calculus (NSF #0910240), was that the most successful departments made a practice of monitoring and reflecting on data from their courses. When we surveyed all departments with graduate programs in 2015 as part of Progress through Calculus (NSF #1430540), we asked about their access to and use of these data, what we are referring to as “local data.”
The first thing we learned is that a few departments report no access to data about their courses or about what happens to their students, and for almost half, such data are not readily available (see Table 1). When we asked, “Which types of data does your department review on a regular basis to inform decisions about your undergraduate program?”, most departments reported reviewing grade distributions and paying attention to end-of-term student course evaluations (Table 2). Between 40% and 50% of the surveyed departments correlate student performance in subsequent courses with the grades earned in previous courses and examine how well placement procedures are being followed. Given how important it is to track persistence rates (see The Problem of Persistence, Launchings, January 2010), it is disappointing that only 41% of departments track these data. Regular communication with client disciplines is almost nonexistent.
Table 2. Responses to the question, “Which types of data does your department review on a regular basis to inform decisions about your undergraduate program?”
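For a department with access to registrar records, the grade-correlation practice mentioned above takes only a few lines. Here is a minimal sketch in Python, assuming a hypothetical CSV export with one row per student and letter-grade columns for Calculus I and Calculus II; the file and column names are illustrative, not drawn from the survey.

```python
import pandas as pd

# Hypothetical registrar export: one row per student, letter grades.
grades = pd.read_csv("registrar_export.csv")

# Convert letter grades to grade points so they can be correlated.
points = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}
grades["calc1_pts"] = grades["calc1_grade"].map(points)
grades["calc2_pts"] = grades["calc2_grade"].map(points)

# Restrict to students who completed both courses, then correlate.
both = grades.dropna(subset=["calc1_pts", "calc2_pts"])
print(both["calc1_pts"].corr(both["calc2_pts"]))
```

One caveat worth noting: correlations of this kind are attenuated by restriction of range, since only students who succeed in the first course typically appear in the second.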
We also asked departments to describe the kinds of data they collect and regularly review. Several reported combining placement scores, persistence, and grades in subsequent courses to better understand the success of their program. Other interesting uses of data came from universities that
- Built a model of “at-risk” students in Calculus I using admissions data from the past seven years. Using it, they report “developing a program to assist these students right at the beginning of Fall quarter, rather than target them after they start to perform poorly.” (A sketch of this kind of model follows the list.)
- Surveyed calculus students to get a better understanding of their backgrounds and attitudes toward studying in groups.
- Collected regular information from business and industry employers of their majors.
- Measured correlation of grade in Calculus I with transfer status, year in college, gender, whether repeating Calculus I, and GPA.
- Used data from the university’s Core Learning Objectives and a uniform final exam to inform decisions about the course (including the ordering of topics, emphasis on material, and time devoted to mastery of certain concepts, particularly in Calculus II).
- Reviewed performance on exam problems to decide whether a problem type is too hard or needs to be rephrased, or whether an idea needs to be revisited on a future exam (see the second sketch following this list).
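The at-risk modeling in the first item above can be sketched with standard tools. What follows is a minimal illustration using logistic regression on hypothetical admissions fields; the cited university did not publish its model, so every file name, column name, and threshold here is an assumption.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical seven-year admissions export; dfw_calc1 = 1 means the
# student earned a D or F in Calculus I or withdrew.
df = pd.read_csv("admissions_history.csv")
features = ["hs_gpa", "placement_score", "act_math"]  # assumed predictor fields

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["dfw_calc1"], random_state=0
)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

# Score the incoming class before the term starts, so support can begin
# in week one rather than after a poor first exam.
incoming = pd.read_csv("incoming_students.csv")  # hypothetical
risk = model.predict_proba(incoming[features])[:, 1]
print(incoming.loc[risk > 0.5, "student_id"])
```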
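The exam-problem review in the last item can likewise be made routine. Here is a minimal sketch, assuming a hypothetical matrix of item scores (1 = substantially correct, 0 = not): classical item difficulty is the fraction of students answering correctly, and discrimination is the correlation of each item with performance on the rest of the exam.

```python
import pandas as pd

# Hypothetical scores: one row per student, one 0/1 column per exam problem.
items = pd.read_csv("final_exam_items.csv", index_col="student_id")

difficulty = items.mean()  # fraction answering each problem correctly
total = items.sum(axis=1)
# Correlate each item with the total of the *other* items; a low or negative
# value flags a problem that may be mis-phrased rather than merely hard.
discrimination = items.apply(lambda col: col.corr(total - col))

report = pd.DataFrame({"p_correct": difficulty, "discrimination": discrimination})
print(report.sort_values("p_correct"))
```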
Figure 1. AAU STEM Initiative Framework
The three levels of change are subdivided into topics, each of which links to programs at member universities that illustrate work on this aspect of the framework.
Cultural change encompasses

- Aligning incentives with expectations of teaching excellence.
- Establishing strong measures of teaching excellence.
- Leadership commitment.

Scaffolding includes

- Facilities.
- Technology.
- Data.
- Faculty professional development.

Pedagogy involves

- Access.
- Articulated learning goals.
- Assessments.
- Educational practices.