Abstract

A full-cycle assessment of our efforts to improve quantitative reasoning in an introductory math course is described. Our initial iteration substituted more open-ended performance tasks for the active learning projects that had previously been used. Using a quasi-experimental design, we compared multiple sections of the same course and found non-significant gains on a pre/post, rubric-scored measure of quantitative reasoning. Subsequent course modifications included more explicit emphasis on critical thinking as a course goal and extended experience with the rubric used to score the performance tasks. Results of the second iteration yielded stronger evidence for gains in quantitative reasoning and suggest that the impact of open-ended performance tasks is increased when supported by efforts that emphasize their importance.
