Thursday, June 1, 2017

Re-imagining the Calculus Curriculum, II

You can follow me on Twitter @dbressoud.

Last month, in “Re-imagining the Calculus Curriculum, I,” I introduced Project DIRACC (Developing and Investigating a Rigorous Approach to Conceptual Calculus), developed by Pat Thompson, Mark Ashbrook, and Fabio Milner at Arizona State University. References to the theory underpinning this approach are given at the end of this column. This month’s column expands on some details of this curriculum.

One of the first common student misconceptions that Project DIRACC tackles is that variables are simply stand-ins for unknown quantities. The authors begin the meat of their course in Chapter 3 with an explanation of the distinction between variable, constant, and parameter, pointing out how context-specific the designation as either variable or parameter can be. One of the distinctive features of this project is its thoughtful use of technology, in this case enabling students to explore the effect of varying a variable under a variety of choices of parameter.

This leads to relationships between variables (how volume varies with height), and then functions as a special class of relationships between variables, one in which “any value of one variable determines exactly one value of the other.” The point is that the f in f (x) has meaning. It is the name of the relationship. This enables the authors to tackle the misconception that f (x) is simply a lengthy way of expressing the variable y.

While acknowledging that f(x) can represent a second variable, they emphasize that it is shorthand for “the value of the relationship f when applied to a value of x.” This point is driven home by an example of the usefulness of functional notation. If d(x) relates a moment in time, x measured in years, to the distance between the Earth and the Moon at that time, then d(x) – d(x–5) enables us to express the change in distance over the five years before time x, while d(x+5) – d(x) expresses the change in distance over the succeeding five years.
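To make the notation concrete, here is a minimal sketch in code. The function name d is taken from the text, but the linear growth model is an invented stand-in (using the commonly cited figure of roughly 3.8 cm of lunar recession per year); it is not part of the DIRACC materials.

```python
def d(x):
    """Toy model of the Earth-Moon distance (in km) at time x (in years).

    Assumption for this sketch: a mean distance of 384,400 km growing
    by roughly 3.8 cm (3.8e-5 km) per year.
    """
    return 384_400 + 3.8e-5 * x

x = 2017
change_before = d(x) - d(x - 5)   # change over the five years before time x
change_after  = d(x + 5) - d(x)   # change over the succeeding five years
print(change_before, change_after)
```

Because d names the relationship itself, expressions like d(x) – d(x – 5) can be read directly as statements about the quantity being modeled.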

The authors also make the important distinction between functions defined conceptually—the distance between Earth and Moon at a given time—and those defined computationally, such as V(u) = u(13.76 – 2u)(16.42 – 2u). They then proceed to devote considerable effort to describing the structure of functions as they are built from sums, products, quotients, compositions, and inverses. This includes clarifying the distinction between the independent variable and the argument of a function. Thus for f (x/3 + 5) the independent variable is x, but the function argument is x/3 + 5, an important step toward understanding composition of functions.
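The distinction between independent variable and argument can be illustrated in a few lines of code. The volume formula V is quoted from the text; the choice f = sin is arbitrary, made only so the sketch runs.

```python
import math

# The computationally defined function from the text.
def V(u):
    return u * (13.76 - 2 * u) * (16.42 - 2 * u)

# For f(x/3 + 5): the independent variable is x, but the value
# actually passed to f is the expression x/3 + 5.
f = math.sin            # an arbitrary illustrative choice of f

def g(x):
    argument = x / 3 + 5    # the function argument
    return f(argument)      # f applied to that argument

print(V(1.0))   # one value of the volume function
print(g(3.0))   # f evaluated at 3/3 + 5 = 6
```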

While function structure should be part of precalculus, the importance of including this material has been revealed in exploring student difficulties with differentiation. Given a complicated computational rule that defines a function, students often have difficulty parsing this rule and thus determining the choice and order of the techniques of differentiation they need to use.
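Parsing the structure is what dictates the differentiation steps. As a sketch (again with f = sin as an arbitrary choice), the chain rule gives d/dx f(x/3 + 5) = f′(x/3 + 5) · 1/3, which a finite difference can confirm numerically:

```python
import math

def h(x):
    # f(x/3 + 5) with f = sin, an arbitrary illustrative choice
    return math.sin(x / 3 + 5)

def h_prime(x):
    # Chain rule: differentiate the outer function at the argument
    # x/3 + 5, then multiply by the derivative of the inner
    # expression, which is 1/3.
    return math.cos(x / 3 + 5) * (1 / 3)

# Central-difference approximation as a numerical check
x, dx = 2.0, 1e-6
numeric = (h(x + dx) - h(x - dx)) / (2 * dx)
print(abs(numeric - h_prime(x)))   # should be very small
```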

Rates of change are now introduced in Chapter 4. The authors distinguish between ∆x, the parameter that describes the length of a small subinterval of the domain, and the changes in x and y represented by the differentials dx and dy. These are variables that within the given subinterval are always connected by a linear relationship.
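In symbols (with r denoting the constant rate on the subinterval, a symbol introduced here only for illustration), the differentials satisfy the linear relationship

```latex
dy = r\,dx \qquad \text{as } dx \text{ ranges over } [0, \Delta x].
```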

A nice illustration of how this works is given with a photograph of a truck traveling through an intersection (Figure 1).

Figure 1. A photo of a truck taken with a shutter setting of 1/1000 sec.
Taken at a shutter speed of 1/1000th of a second, the photo appears to freeze the truck. But if you zoom in on the tail light (Figure 2; see Section 4.3 for a video of the zoom), the streaks reveal that the truck was moving.

Figure 2. A closer look at the truck's tail light shows small streaks.
The truck moved slightly while the camera's shutter was open.

One can even estimate the length of the streaks to approximate the velocity of the truck. Over 1/1000th of a second, it is doubtful that the truck’s velocity changed very much. The picture of the truck was taken at a “moment” in time, but that moment stretched over 0.001 seconds. The point is that this period of time is short enough that the truck’s velocity measured as change in distance over change in time is “essentially constant.” If y is position and x is time, then over this interval of length ∆x = 0.001 seconds, we can treat the variable dy as a constant times dx. It is this constant that is used to define the rate of change at a moment,

We say that a function has a rate of change at the moment x0 if, over a suitably small interval of its independent variable containing x0, the function’s value changes at essentially a constant rate with respect to its independent variable.

Significantly, even as the authors are defining the rate of change at a moment, they emphasize that “all motion, and hence all variation, is blurry.”
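A numerical sketch of what “essentially constant” means (with an invented position function y, not the truck data): over an interval of length Δx = 0.001, difference quotients computed over sub-pieces of the interval barely differ.

```python
import math

def y(x):
    # An invented smooth position function (position as a function of
    # time); any function with mild curvature would do for this sketch.
    return 20 * x + math.sin(x)

x0, dx = 3.0, 0.001

# Difference quotients over the two halves of [x0, x0 + dx]
rate_first_half  = (y(x0 + dx / 2) - y(x0)) / (dx / 2)
rate_second_half = (y(x0 + dx) - y(x0 + dx / 2)) / (dx / 2)

print(rate_first_half, rate_second_half)
print(abs(rate_first_half - rate_second_half))  # essentially constant
```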

Note that there is no mention of limits, a means of defining the derivative that is often more confusing than enlightening (see the 2014 Launchings columns from July, August, and September).

After further discussion and exploration of rate of change functions, the authors now move in Chapter 5 to Accumulation Functions, building up total changes from rates of change that are essentially constant on very small intervals. These give rise to what are anachronistically referred to as left-hand Riemann sums. Students use technology to explore the increasing accuracy as ∆x gets smaller. The effect of the choice of starting value is noted, and the definite integral with a variable upper limit now appears. It is important that the first time students see a definite integral it has a variable upper limit.
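A left-hand sum for an accumulation function takes only a few lines of code. This is a sketch: the rate function r(t) = 2t and the starting value a = 0 are illustrative choices, for which the exact accumulation from 0 to x is x², so the accuracy of shrinking dx can be checked directly.

```python
def accumulation(r, a, x, dx=0.001):
    """Approximate the accumulated change of a quantity with rate
    function r, from starting value a up to the variable upper limit x,
    using a left-hand sum over subintervals of length dx."""
    total = 0.0
    t = a
    while t < x:
        step = min(dx, x - t)    # the last piece may be shorter
        total += r(t) * step     # rate treated as constant on [t, t + step]
        t += step
    return total

def r(t):
    return 2 * t                 # illustrative rate of change function

print(accumulation(r, 0, 1))             # exact value is 1
print(accumulation(r, 0, 1, dx=0.0001))  # smaller dx improves the estimate
```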

In Chapter 6, the inverse problem, going from knowledge of an exact expression of the accumulation function to the discovery of the corresponding rate of change function, is now explored, leading to the Fundamental Theorem of Integral Calculus in the form: The derivative with respect to x of the definite integral from a to x of a rate of change function is equal to that rate of change function evaluated at x. Techniques and applications of differentiation follow as the semester concludes.
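In symbols, writing r for the rate of change function, this form of the theorem reads:

```latex
\frac{d}{dx} \int_{a}^{x} r(t)\,dt = r(x).
```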

The great strength and promise of this approach is that the traditional content of the first semester of calculus is only slightly tweaked, especially since it is increasingly common for university Calculus I courses to avoid or significantly downplay limits. But the curriculum has been totally reshaped to address common student difficulties and misconceptions. This route into calculus has the added advantage—though perhaps a disadvantage in the eyes of some students—that those who have been through a procedurally oriented course are unlikely to recognize this as an accelerated repetition of what they have already studied. It will challenge them to rethink what they believe calculus to be.


Thompson, P.W. and Silverman, J. (2008). The concept of accumulation in calculus. In M.P. Carlson & C. Rasmussen (Eds.), Making the connection: Research and teaching in undergraduate mathematics (MAA Notes Vol. 73, pp. 43–52). Washington, DC: Mathematical Association of America.

Thompson, P.W., Byerley, C., and Hatfield, N. (2013). A conceptual approach to calculus made possible by technology. Computers in the Schools, 30, 124–147.

Thompson, P.W. and Dreyfus, T. (2016). A coherent approach to the Fundamental Theorem of Calculus using differentials. In R. Göller, R. Biehler & R. Hochmuth (Eds.), Proceedings of the Conference on Didactics of Mathematics in Higher Education as a Scientific Discipline (pp. 355–359). Hannover, Germany: KHDM.
