*Editor’s Note: An expanded version of this article previously appeared at **http://openpyviv.com/2016/07/12/ECCO/**.*

Being one of the few women in the men’s world of mathematics and computer science has led me to look around and spot our flaws where inclusivity is concerned. Let’s not fool ourselves: even though we like to think of ourselves as purely objective and beyond bias, the maths world is not an inclusive paradise. The academic world I live in is made up mostly of white men, mostly from western countries (Europe, the US, Canada, and just a little bit of Asia). If you look even closer, you’ll see that most of us come from well-off, educated families. Except for the fact that I am a woman, I check all the other boxes myself, and I am well aware of it. Considering the multiple causes of this situation, what can be done? What can I do as a single individual in this world, when I’m busy fighting my own fights, earning my right to stay around? Well, I’m not going to answer that just now, but I will share a very good experience I just had. I went to a CIMPA maths summer school in Colombia that was different: ECCO 2016. For the first time, I felt it was indeed inclusive in the best possible way. And it was excellent maths too, so I was really happy.

First, a little bit of context. As opposed to a classical conference, where most presentations are short talks announcing new results, a summer school usually consists of mini-courses on a certain topic. At ECCO 2016, the main audience was made up of students (masters students, PhD students, and undergrads), but some postdocs and even professors participated as well, as we are always keen on learning new things. It was in Colombia, and the topic was combinatorics, which happens to be my field. ECCO runs every two years, and began in 2003 as a small event organized by Federico Ardila, a Colombian mathematician based in the US to whom we (the academic world) owe thanks for many great researchers in combinatorics. I had noticed before that the number of Colombians among researchers in combinatorics was astonishingly high, but before I met Federico I had no idea why. Most of this very active Colombian community is now organizing the conference. Over the years, ECCO has become quite a big event in combinatorics with a very positive, well-earned reputation. This year, for the first time, it was a CIMPA school, and there were over 100 participants. So why was this conference so good?

**Background diversity.** One thing I found surprising is that the students came from very different knowledge backgrounds: some were undergrads, some were masters students, some were PhD students; some had experience with combinatorics, some did not. And of course, there were also postdocs and professors, as I mentioned. Honestly, I didn’t think it was possible to make a conference interesting for so many different people with such a variety of backgrounds. And still, they did it. I think the main reason is that they intended, from the beginning, for the conference to be accessible and interesting for the entire audience instead of just a narrow selection, and I am pretty sure they gave detailed instructions to the teachers. The classes themselves were high-level mathematics, as you would expect from any summer school. So, of course, not everyone understood everything (that never happens): you cannot expect an undergrad to perfectly follow a condensed, high-level course on a subject they have never heard of. But it was done in a way that let everyone get something out of it. I learned about very interesting subjects that gave me new ideas for my research, and undergrads got direct insight into what combinatorics is about, often understanding much more than I would have expected.

**Country diversity.** That was probably one of the nicest aspects of the conference. Being in academia and travelling a lot, I get to meet people from a bunch of countries, but I don’t think I had ever seen that many nationalities! Of course there were Colombians and other South Americans, but also North Americans, Europeans, and more. I counted 23 nationalities, most of them among the students. Academia is a lot about networking, but it is a very difficult network to enter when you come from the wrong country, so events like this can really change the way things are. I also liked that it broke the old colonialist structure of “western teachers” spreading knowledge to poor students from “left out” countries. It was an international crowd listening to teachers from international backgrounds. It was European and North American students coming to Colombia to learn maths alongside the Colombian students.

**Women.** Let’s stick to numbers: I counted about 25% women among the participants, and two of the four classes were taught by women. Believe me, these are quite good numbers. On the first day, all the speakers were women, and I’m not even sure it was intended!

**Code of conduct.** The first time I heard about the notion of a code of conduct was when I started attending programming conferences, especially PyCon. The very idea seemed quite odd to me. To a French person, a list of rules often comes across as prudery, especially when it comes from America. It is also very far from the spirit of maths conferences, where the idea is, basically, that you only care about the maths and the rest is mostly irrelevant. I do believe we should find a way to bring the idea of a code of conduct to the maths world, but I have no idea how; it looks like such an exhausting, lost cause, and I have no time or energy for it. And so, I was very surprised to see that ECCO had a community agreement, which was basically the same thing! I thought it was well written, emphasizing the diversity of the conference and the ways to make it a comfortable place for everyone. I believe it was well enforced, though I cannot know first hand. What I could see is that the organizers made time for us to read it and also, later in the week, to come back to it and discuss it. My academic colleagues were a bit taken aback, as they had never even heard of such a thing. But I will conclude this paragraph with a quote from a female participant:

> I was at first very surprised and looked at it as an oddity. Then I remembered what it was like being a grad student at conferences, and all the weird guys I had to avoid. So I figured, yeah, why not.

**Language.** The conference was in English, as it is the common language of academia nowadays. But a special effort was made towards Spanish speakers, especially Colombian students, so that they wouldn’t feel left out by the language. All announcements were made in both languages, and all the class material was translated into Spanish, either by the speakers or the organizers. I also felt that the mathematical language was kept accessible throughout the conference, with clear, simple definitions that required no prior knowledge.

**Exercise sessions.** It is typical to have exercise sessions in a summer school, but here they were special. The organizers had a very simple and great idea: each day, we would get a random number that determined the group of 3 to 5 people we would be working with. Sometimes the randomness was adjusted a bit to distribute the professors uniformly among the groups. The result was great: I got to work with different people every day, which gave me a good reason to participate and work on the exercises; I got to meet most of the students; and I felt useful, as I could use my knowledge to help the students understand the course material. It was a great time for the students, especially undergrads, to review the class material in a casual atmosphere with people around to answer their questions. The exercises themselves were also really well thought out: they included very basic questions so that everyone could familiarize themselves with the class content, but also advanced problems for those who already knew a bit and could go faster. At the end of each session, some students would go to the board and present the exercises. It was a good occasion for everyone (undergrads, masters students, and beyond) to show what they had understood, and to build confidence that they were able to solve problems even though the course looked hard and they might not have understood a word when they first heard it.

**Questions.** This is a little detail, but one that neatly summarizes the spirit of this conference. After a few days, the organizers noticed that questions were coming mostly from the most experienced participants (postdocs and professors). It is indeed very hard to ask a question after a talk: you have to speak up in front of everybody, you feel like you didn’t understand much, that your question is just going to sound stupid, that you will sound stupid in front of everyone… So at some point, the organizers decided that the first question after each talk should come from an undergrad or a masters student. It meant we professors had to wait a bit until one of them felt confident enough (and pushed by the awkward silence) to speak up. And they were not stupid questions! I am not sure we stuck to this rule to the very end, but it definitely helped break the ice for the younger students and made them feel that their questions were welcome.

**Panel.** As I said, at most maths conferences, everything that is not maths content is thought of as irrelevant. That was not the case here. As proof, the organizers held a panel where participants could ask questions of people at different points in their careers: an undergrad, a grad student, a postdoc, and a professor. I wasn’t there myself (I was visiting the great city of Medellín, as you can read here in French), but I think it is a great idea. Most students have no idea what it’s like to work as an academic; they are entering the unknown. For many Colombian students, it often means applying to foreign programs when they have never left their home country. Getting a little feedback from people who are already out there is quite helpful!

**Reaching out to the outside world.** There was also a real effort to connect the conference with the *outside world*: a high-quality and successful public lecture was given by Federico Ardila during the conference, and some organizers took part in a program for high schoolers. It was a great pleasure for me to see so many people being curious about mathematics, about knowledge. And I really liked the fact that it was connected to the summer school.

**Conclusion.** It worked. It was a great event! I believe everybody left with the feeling that they had learned a lot and had a great experience. Most people were staying in the same hotel, and we would go out together, having dinner, going dancing (salsa!). At 3 AM on the last day, after a great evening of salsa dancing, the students would not leave the bar, which was trying to close down. They said goodbye forever, exchanging vows to stay in touch like teenagers after a summer camp.

Not all conferences are like this. Actually, none of them are. I can understand that every conference has a different purpose; we cannot just apply everything everywhere. But we could take this event as an example: I know I will. For me, this is how maths should be most of the time, not two weeks every other year.

To start, I want to thank all of our readers, subscribers, and contributors — we appreciate your feedback and ideas through your writing, social media comments, and in-person conversations at mathematical meetings and events. Since launching our blog in June of 2014, our articles have received over 189,000 unique page views! We will continue to strive to provide high-quality articles on a broad range of topics related to post-secondary mathematics, and we welcome your feedback and suggestions.

I have a few changes to announce regarding the editorial board for *On Teaching and Learning Mathematics*. Following two years of service as a founding Contributing Editor for our blog, Elise Lockwood is leaving our board to join the editorial board of the new International Journal of Research in Undergraduate Mathematics Education (IJRUME). Many thanks to Elise for her excellent contributions that helped the blog have a great start over the past two years! I know that we can look forward to hearing more from Elise in the future as a contributing author.

For 2016-2017, I am happy to welcome three new Contributing Editors to the board:

- Luis David García Puente, Sam Houston State University
- Jess Ellis, Colorado State University
- Steven Klee, Seattle University

Luis, Jess, and Steve bring with them a wealth of expertise in teaching, research, and mentoring, and I am excited that they will be sharing their expertise with our readers.

For those of you who are regular readers, we will continue to publish articles roughly every two weeks, with a target goal of publishing 24 articles per year. Our next post is scheduled for August 22, 2016, where we will hear from Viviane Pons about her experience at a math summer school that was “inclusive in the best possible way.” Stay tuned!

(Note: Authors are listed alphabetically; all authors contributed equally to the preparation of this blog entry.)

Concept inventories have emerged over the past two decades as one way to measure conceptual understanding in STEM disciplines, with the Calculus Concept Inventory (CCI), developed by Epstein and colleagues (Epstein, 2007, 2013), being one of the primary instruments in the area of differential calculus. The CCI is a criterion-referenced instrument measuring classroom normalized gains: specifically, the change in the class average divided by the maximum possible change in the class average. Its goal was to evaluate the impact of teaching techniques on conceptual learning of differential calculus.
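Written out, the normalized gain described above (this is the standard Hake-style formulation implied by that description, stated here for concreteness) is

\[ g = \frac{\langle S_{\text{post}} \rangle - \langle S_{\text{pre}} \rangle}{S_{\max} - \langle S_{\text{pre}} \rangle}, \]

where \(\langle S_{\text{pre}} \rangle\) and \(\langle S_{\text{post}} \rangle\) are the class average scores before and after instruction, and \(S_{\max}\) is the maximum possible score.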

While the CCI represents a good start toward measuring calculus understanding, recent studies point out some significant issues with the instrument. This is concerning, given that there seems to be an increased use of the instrument in formal and informal studies and assessment. For example, in a recent special issue of PRIMUS (Maxson & Szaniszlo, 2015a, 2015b) related to flipped classrooms in mathematics, three of the five papers dealing with calculus cited and used the CCI. In this blog we provide an overview of concept inventories, discuss the CCI, outline some problems we found, and suggest future needs for high-quality conceptual measures of calculus understanding.

Before proceeding, however, we would like to acknowledge and thank the designers of the CCI for starting the process of developing a measure for students’ understanding of calculus. We regret that with the passing of Jerome Epstein, he is unable to respond directly to our findings or contribute to future work. His efforts, and those of his collaborators, have undoubtedly had tremendous impact on the awareness of the mathematics community of concept inventories and the associated need to teach and learn conceptually, and we believe they contributed positively to the teaching and learning of calculus. We hope that the mathematical community will continue the work that he started.

The first concept inventory to make a significant impact on the undergraduate education community was the Force Concept Inventory (FCI), written by Hestenes, Wells, and Swackhamer (1992). Despite the fact that most physics professors deemed the questions on the inventory “too trivial to be informative” (Hestenes et al., 1992, p. 2), students did poorly on the test and, in both high school and university physics classes, made only modest gains. Of the 1,500 high school students and over 500 university students who took the test, high school students learned only 20%–23% of the previously unknown concepts, and college students at most 32% (Hestenes et al., 1992, p. 6). Through a well-documented process of development and refinement, the test has become an accepted and widely used tool in the physics community, and has led to changes in the way introductory physics is taught (e.g., Hake, 1998; Mazur, 1997). The FCI paved the way for the broad application of analyzing student conceptual understanding of the basic ideas in various STEM disciplines (Hake, 1998, 2007; Hestenes et al., 1992), including physics, chemistry, astronomy, biology, and geoscience.

Recently, the authors of this blog post conducted a thorough analysis of the CCI (Gleason et al., 2015a, 2015b), with the primary objective of assessing the degree to which the CCI conforms to accepted standards for psychometric properties, including content validity, internal structure validity, and internal reliability. One can think of validity as determining whether an instrument measures what it is intended to measure, and of reliability as determining how well the instrument measures whatever it is measuring. Thus, for educational instruments, validity addresses whether a person’s score on an instrument is meaningful with regard to the desired constructs and helps researchers make inferences. The goal of establishing content validity is to determine the degree to which the instrument measures what it intends to measure, while internal structure validity investigates what lies “beneath” the item responses of participants, including how subscales may relate to each other.

Subscales are usually created when there is a desire to understand different components of the knowledge state of an individual or group, and when there is an expectation of different levels of knowledge in the different categories. For example, a high school geometry test might consist of subscales measuring student understanding of triangles, circles, and arcs. Another example is the ACT or SAT, where scores are given for English, Math, and Reading, as well as a composite score. The items within a subscale should be highly correlated; items between different subscales may also be correlated, though likely to a much lesser extent. The goal of a validity study of an instrument’s internal structure is to determine whether the items measure distinct constructs, or just a single underlying construct, in order to justify the use of subscores.

Since one of the goals of concept inventories is to measure conceptual understanding before exposure to the content, students are required to use their prior knowledge to respond to assessment items at initial enrollment in the course. After completing the course, the concept inventory can then be used to measure gains in conceptual understanding. To ensure validity in this process of measuring gains, items must be carefully written to avoid using terminology taught in the course to which students have no prior exposure.

However, several researchers noticed that released CCI items contained terminology and notation introduced only in a calculus course, such as the word “derivative” and the notation \(f’(x)\). This is problematic because the CCI is meant to assess students’ understanding of the concepts of calculus, not its specific vocabulary. While a majority (67%) of calculus students at Ph.D.-granting institutions have had previous exposure to calculus, 41% of all post-secondary calculus students did not take calculus in high school, and high school students taking calculus have had no previous exposure (Bressoud, Mesa, & Rasmussen, 2015). Items that contain unfamiliar terminology and notation would confuse these students and generate responses around random chance on those items. However, Epstein claims the instrument was intended to measure above random chance at the pre-test and to avoid “confusing wording” (Epstein, 2013, p. 7). Though he may have meant “confusing” in the sense of “convoluted,” a student with no background in calculus would be in a poor position to answer a question in which the notation \(f’(x)\) appears. Because of this use of calculus-specific terminology, the validity of pre-test scores is questionable for populations with large numbers of students lacking previous exposure to calculus, such as those in high schools, at community colleges, and at regional universities.

With regard to internal structure validity, issues with the CCI emerged when conducting a factor analysis. (For technical details, see pp. 1291–1297 of these proceedings.) A factor analysis explores relationships among the underlying factors of an assessment instrument, and the causes of those relationships, through the analysis of student responses, in order to determine the number of underlying factors of the instrument and their relationships. Epstein and colleagues suggested that the CCI measures conceptual understanding of calculus through three factors (functions, derivatives, and limits/ratios/the continuum). In other words, they claim that a three-factor model captures all of the correlations among the items on the instrument. However, we showed that the item responses are so closely correlated that the total CCI score is explained by one factor, which appears to be an overall knowledge of calculus content, that “can adequately account for the pattern of correlations among the items” (DeVellis, 2003, p. 109), and that there are no subscales. This finding means the CCI cannot provide meaningful information about conceptual understanding of different components of calculus, such as limits or rates of change; instead, it measures overall calculus knowledge.
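One common first check in this kind of analysis is to look at the eigenvalues of the inter-item correlation matrix: a single dominant eigenvalue, with the rest near or below 1, is the classic signature of a one-factor instrument. The following is a minimal illustrative sketch, not the authors’ actual analysis; the item responses are simulated (not CCI data) from a single latent ability, with made-up sample sizes and loadings:

```python
import numpy as np

rng = np.random.default_rng(0)
n_students, n_items = 1000, 22

# Simulate dichotomous (right/wrong) item responses driven by a single
# latent ability, mimicking an instrument whose items all load on one factor.
ability = rng.normal(size=n_students)
loadings = rng.uniform(0.5, 0.9, size=n_items)  # hypothetical item loadings
noise = rng.normal(size=(n_students, n_items))
responses = (ability[:, None] * loadings + noise > 0).astype(float)

# Eigenvalues of the inter-item correlation matrix, largest first.
# A dominant first eigenvalue suggests a single underlying factor.
eigenvalues = np.linalg.eigvalsh(np.corrcoef(responses, rowvar=False))[::-1]
print(eigenvalues[0] > 2 * eigenvalues[1])  # first factor dominates
```

A full confirmatory factor analysis (as reported in the proceedings cited above) goes well beyond this eigenvalue heuristic, but the heuristic conveys what “explained by one factor” means in practice.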

One method of measuring the reliability of an instrument is to measure the extent to which its individual items fall within the same general construct. In this regard, Epstein (2013) reported, and the authors of this post confirmed, that the CCI has an internal reliability (Cronbach’s alpha) of around 0.7, meaning that the instrument has 51% error variance and a standard error of 10% on each individual student score (Cohen & Swerdlik, 2010; Tavakol & Dennick, 2011). In particular, this does not meet the established standard of an alpha of 0.80 or higher necessary for use in research for any type of educational assessment (Lance, Butts, & Michels, 2006, p. 206).
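Cronbach’s alpha has a simple closed form, so reliability figures like these are easy to reproduce from a raw score matrix. Here is a minimal sketch of the standard formula (not the authors’ code; the example score matrix is made up):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (students x items) matrix of item scores:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of students' totals
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Toy right/wrong data: 4 students, 3 items.
print(cronbach_alpha([[1, 1, 1], [1, 1, 0], [0, 0, 0], [0, 1, 0]]))  # 0.75
```

The 51% error variance cited above follows the interpretation in Tavakol and Dennick (2011), which squares alpha to get the shared variance: \(1 - 0.7^2 = 0.51\).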

**Conclusion**

With the centrality of calculus to undergraduate mathematics programs and a variety of mathematically intensive partner disciplines, such as economics, physics, and engineering, there is a need to look at the course’s learning outcomes. Recent efforts through the MAA’s National Studies of College Calculus have helped the mathematical community better understand the current state of calculus programs around the country. Data and research on student outcomes in calculus, especially with regards to conceptual knowledge, lag somewhat behind. Part of this is attributable to a lack of appropriate, well-validated instruments to measure outcomes. As most faculty are not trained in rigorous assessment development, they often depend on others for instruments to measure student learning in courses and programs.

Because of the aforementioned concerns, though, we conclude that the existing CCI does not conform to accepted standards for educational testing (American Educational Research Association, 2014; DeVellis, 2012). As such, users of the CCI should be very aware of its limitations. In particular, it may underestimate the conceptual understanding at the beginning of a calculus course for students who have never taken a calculus class before but understand the ideas underlying calculus. We recommend careful consideration in using the CCI and urge users to keep in mind the kind of information being sought. In addition, we suggest exercising extreme caution in using it for any type of formal assessment processes.

Given the shortcomings of the CCI, as well as the inherent limitations of any static instrument with a fixed set of questions, we argue that there is a need to create an item bank of rigorously developed and validated questions, with solid psychometric properties, that measure students’ conceptual understanding of differential calculus. Such an item bank would significantly impact teaching and learning during the first two years of undergraduate STEM. Instructors could use it for formative and summative assessment in their calculus courses to improve student learning, and researchers and evaluators could use it to measure growth in student conceptual understanding during a first-semester calculus course, comparing the gains of students in classrooms implementing different instructional techniques.

If permission were granted for the CCI to be used as a launching point, then perhaps some of those questions could be used or modified. Prior work on developing conceptually-focused instruments in mathematics, such as the Precalculus Concept Assessment (Carlson, Oehrtman, & Engelke, 2010) and the Calculus Concept Readiness Instrument (Carlson, Madison, & West, 2010), could serve as models for the item-development process.

**References**

American Educational Research Association, American Psychological Association, National Council on Measurement in Education, & Joint Committee on Standards for Educational and Psychological Testing (U.S.). (2014). *Standards for educational and psychological testing*. Washington, DC: Author.

Bressoud, D., Mesa, V., & Rasmussen, C. (2015). *Insights and recommendations from the MAA national study of college calculus*. MAA Press.

Carlson, M., Madison, B., & West, R. (2010). The Calculus Concept Readiness (CCR) instrument: Assessing student readiness for calculus. arXiv preprint arXiv:1010.2719.

Carlson, M., Oehrtman, M., & Engelke, N. (2010). The precalculus concept assessment: A tool for assessing students’ reasoning abilities and understandings. *Cognition and Instruction, 28*(2), 113-145.

Cohen, R. & Swerdlik, M. (2010). *Psychological testing and assessment*. Burr Ridge, IL: McGraw-Hill.

DeVellis, R. F. (2003). *Scale development: Theory and applications* (2nd ed.). Thousand Oaks, CA: SAGE Publications.

DeVellis, R. F. (2012). *Scale development: Theory and applications* (3rd ed.). Thousand Oaks, CA: SAGE Publications.

Epstein, J. (2007). Development and validation of the Calculus Concept Inventory. In *Proceedings of the Ninth International Conference on Mathematics Education in a Global Community* (pp. 165–170).

Epstein, J. (2013). The calculus concept inventory – Measurement of the effect of teaching methodology in mathematics. *Notices of the American Mathematical Society, 60*(8), 2-10.

Gleason, J., Thomas, M., Bagley, S., Rice, L., White, D., & Clements, N. (2015a). Analyzing the Calculus Concept Inventory: Content validity, internal structure validity, and reliability analysis. *Proceedings of the 37th International Conference of the North American Chapter of the Psychology of Mathematics Education*, 1291–1297.

Gleason, J., White, D., Thomas, M., Bagley, S., & Rice, L. (2015b). The Calculus Concept Inventory: A psychometric analysis and framework for a new instrument. *Proceedings of the 18th Annual Conference on Research in Undergraduate Mathematics Education*, 135–149.

Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. *American Journal of Physics, 66*, 64-74.

Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force concept inventory. *The Physics Teacher*, *30*(3), 141–158. doi:10.1119/1.2343497

Lance, C. E., Butts, M. M., & Michels, L. C. (2006). The sources of four commonly reported cutoff criteria: What did they really say? *Organizational Research Methods, 9*(2), 202–220.

Maxson, K., & Szaniszlo, Z. (Eds.). (2015a). Special issue on the flipped classroom: Reflections on implementation [Special issue]. *PRIMUS, 25*(8).

Maxson, K., & Szaniszlo, Z. (Eds.). (2015b). Special issue on the flipped classroom: Effectiveness as an instructional model [Special issue]. *PRIMUS, 25*(9–10).

Mazur, E. (1997). *Peer instruction: A user’s manual*. Upper Saddle River, NJ: Prentice Hall.

Tavakol, M., & Dennick, R. (2011). Making sense of Cronbach’s alpha. *International Journal of Medical Education*, *2*, 53–55. http://doi.org/10.5116/ijme.4dfb.8dfd

Another year has flown by, and so it is once again a good time to collect and reflect on all the articles we have been able to share with you since our last annual review. I enjoyed the chance to re-read all the articles, and I was also surprised at the interesting variety of themes that emerged when I sorted them out. It was not easy to put each article in a unique box, and I will point out the blurring between categories. I hope you enjoy the chance to revisit these articles, and perhaps find new meaning from the juxtapositions here.

**Active learning.** We devoted two months in the fall to our six-part series on active learning. Taking the article on this subject by Freeman et al. that had recently appeared in the *Proceedings of the National Academy of Sciences* as a jumping-off point, we explored different aspects of active learning. It was exhausting and exhilarating for us to work together as an editorial board to write those articles, starting each new one before all the previous ones were done, and finding new things to say in reaction to ideas that emerged from earlier articles.

- Active Learning in Mathematics, Part I: The Challenge of Defining Active Learning
- Active Learning in Mathematics, Part II: Levels of Cognitive Demand
- Active Learning in Mathematics, Part III: Teaching Techniques and Environments
- Active Learning in Mathematics, Part IV: Personal Reflections
- Active Learning in Mathematics, Part V: The Role of “Telling” in Active Learning
- Active Learning in Mathematics, Part VI: Mathematicians’ Training as Teachers

**Teaching practices.** It should be no surprise that, once again, the bulk of our articles land in this category. Each one discusses something someone has done in their classroom and/or that you can do in yours. But there were some interesting sub-themes that showed up.

**Conceptual, procedural, and modeling:** Whether looking at a framework for integrating the procedural and conceptual, or using modeling, derivative machines, or even our own bodies, all of these articles explore how to include the concrete with the abstract.

- Karen Keene and Nicholas Fortune, A Framework for Integrating Conceptual and Procedural Understanding in the First Two Years of Undergraduate Mathematics
- Tevian Dray, Thick Derivatives
- Brian Winkel, Learning Mathematics in Context with Modeling and Technology
- Hortensia Soto-Johnson, Learning Mathematics through Embodied Activities

**High-impact practices:** Maria Mercedes Franco’s article included many high-impact practices she uses, and then Priscilla Bremser focused on one of these practices, service learning.

- Maria Mercedes Franco, Why High-Impact Educational Practices (Despite Being So Labor-Intensive) Keep Me Coming For More
- Priscilla Bremser, A Skeptic’s Guide to Service Learning in Mathematics

**Class frameworks:** These three articles focused on the class syllabus and two different ways to implement grading.

- Priscilla Bremser, What’s in Your Syllabus?
- Kate Owens, A Beginner’s Guide to Standards Based Grading
- Elise Lockwood, Let Your Students Do Some Grading? Using Peer Assessment to Help Students Understand Key Concepts

**Everything else in the classroom:** These are the remaining articles that addressed things we do, or could do, in the classroom. Drew Lewis’ article on social media included a mention of how this helped him learn about Standards Based Grading, listed above.

- Drew Lewis, Social Media as a Teaching Resource
- Elise Lockwood, Don’t Count Them Out — Helping Students Successfully Solve Combinatorial Tasks
- Johanna Hardin and Nicholas J. Horton, Preparing the Next Generation of Students in the Mathematical Sciences to “Think with Data”
- Elise Lockwood, Attending to Precision: A Need for Characterizing and Promoting Careful Mathematical Work
- Art Duval, (Don’t?) Make ’em Laugh

**The affective domain.** I was struck by the different articles that explored aspects of the affective domain. Benjamin Braun (our Editor-in-Chief) wrote two articles directly about this, but Taylor Martin and Ken Smith’s article about classroom culture is also largely about what we can do as teachers to structure our classes to help students develop in this direction. Of course, Martin and Smith’s article also goes nicely with the Class frameworks articles above.

- Benjamin Braun, The Secret Question (Are We Actually Good at Math?)
- Benjamin Braun, Believing in Mathematics
- Taylor Martin and Ken Smith, Creating a Classroom Culture

**Student voices.** Once again, we featured several articles written by students giving their different perspectives. A.K. Whitney wrote about beginning math courses, Sabrina Schmidt about her undergraduate math major overall, and Steve Balady about the program he started as a graduate student.

- A. K. Whitney, Shredding My (Calculus) Confidence
- Sabrina Schmidt, What I Wish I Had Learned More About in College Mathematics
- Steve Balady, We Started a Directed Reading Program (And So Can You!)

**K-12.** Although our main focus is on undergraduate mathematics teaching and learning, it is neither possible nor wise to put a rigid barrier between K-12 and post-secondary. All of these articles find some connection or another between these two levels, whether through curriculum, outreach, or teacher preparation.

- Erin Baldinger, Shawn Broderick, Eileen Murray, Nick Wasserman, and Diana White, Connections between Abstract Algebra and High School Algebra: A Few Connections Worth Exploring
- Matt Baker, Number Theory and Cryptography: A Distance Learning Course for High School Students
- Kathleen Fowler, Start Small, Think Big: Making a Difference Through K-12 Mathematics Outreach
- Jennifer S. McCray, What is Early Math and Why Should We Care?

**Policy, etc.** These are articles that are broader than a single classroom, and report on, or advocate for, changes that can be made to curriculum and beyond. The latter two articles include actions that ordinary mathematicians and mathematics instructors can take, mostly aimed at the K-12 level.

- Benjamin Braun, Recent Reports and Recommendations Related to Courses in the First Two Years of College Study
- Art Duval, Kristin Umland, James J. Madden, and Dick Stanley, Wanted, Mathematicians for an Important but Difficult Task
- Priscilla Bremser, Imagining Equity

**And one more thing.** Not fitting into any other category was the article collecting the varied personal reflections about this year’s Joint Math Meetings by each of the members of the editorial board.

I cannot accept that mathematics be taught in a vacuum. Yes, mathematics is beautiful, be it pure or applied. But in our students’ age of immediacy, we need to devote more of our efforts to teaching mathematics in context, in touch with the real world. We should incorporate more modeling and applications into our mathematics courses to richly support and motivate our students as they learn mathematics, and we should support colleagues who seek to use this approach.

Over time I have moved to this position. At first I used applications of mathematics in course lectures, e.g., error-correcting codes in algebra, cryptology in number theory, life sciences in calculus, and engineering in differential equations. Then I assigned students to read articles in other disciplines and share these applications in class. Finally, I incorporated projects in which students could see and practice the application of mathematics. Introducing a modeling scenario makes the mathematics immediate: what do I do right now? Students want to address the problem at hand, which is real to them, because it intrigues them and piques their curiosity. Thus the mathematics becomes a necessary tool they are ready to learn. Eventually I used the application to motivate the learning of the mathematics *before* introducing that mathematics. This is a “flipping” of content.

Some students are a bit shy about, even resistant to, this approach. However, in an active and supportive learning environment in which students work in small groups and the teacher works the room by watching, visiting, listening, and assisting the groups, students do amazing things. Sometimes they get off a workable track, but classmates and teachers bring them along. Students make mistakes, but as we know, learning from mistakes is an important part of learning [BrownEtAl2014]. Indeed, we do it all the time ourselves and call it conjecture and research.

**Practicing What is Preached**

For some time, many colleagues have been calling for using modeling in the mathematics curriculum, be it after the introduction and practice of the mathematical topics or before the mathematics is introduced. An example of the latter is to give elementary school students objects – lots of them – and ask them to describe what they have. Quite often, and quite naturally, they will settle on one attribute, e.g., color, weight, size. The vehicle for description is usually an organized list and quite often an associated visual; something we would recognize as a histogram. We need not formally introduce the notion of a histogram; rather just name it after our students invent it for their immediate purpose.

At a more advanced level, one example [Winkel1997] is to ask students to put one eye on a point on a hillside opposite a mountain across the valley and describe what they can see. In the course of their investigations students invent the notions of partial derivative, tangent plane, and normal to a surface. In another activity [Winkel2008], students invent Fourier series by coming up with the rather natural criteria for measuring best fit of a trigonometric series, motivated by images of spectra from chemistry, voice studies, and seismology. In [LibertiniBliss2016] the authors demonstrate that one can cover traditional topics and techniques in differential equations courses and also introduce rich modeling activities to motivate and consolidate learning. We have found that when students see a modeling situation first, it really motivates the learning of the differential equations material, and their grasp of the mathematics is firmer and lasts longer because of the modeling experience. Indeed, Dina Yagodich, of Frederick Community College, says that throughout the semester her students refer to a first-day-of-class activity on death and immigration modeling with simulations using m&m candies [Winkel2014], an indication of the importance and meaning of a modeling-first approach to teaching.
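The death-and-immigration activity can be sketched in a few lines of code. Here is a minimal simulation, assuming the usual setup (each candy “dies” with some fixed probability each generation, say when it lands letter-side up, and a fixed number of immigrant candies is then added); the function name and parameters are illustrative, not taken from [Winkel2014]:

```python
import random

def mm_population(n0, immigrants, generations, p_death=0.5, seed=0):
    """Simulate the m&m death-and-immigration activity.

    Each generation, every candy "dies" with probability p_death
    (e.g., landing letter-side up) and is removed; then a fixed
    number of immigrant candies is added.  Returns the population
    at each generation, starting with the initial count.
    """
    rng = random.Random(seed)  # seeded so runs are reproducible
    pop = n0
    history = [pop]
    for _ in range(generations):
        survivors = sum(1 for _ in range(pop) if rng.random() > p_death)
        pop = survivors + immigrants
        history.append(pop)
    return history

# With p_death = 1/2 the expected population settles near
# immigrants / p_death = 2 * immigrants, the equilibrium of the
# difference equation  x_{n+1} = (1 - p_death) * x_n + immigrants.
print(mm_population(n0=100, immigrants=10, generations=15))
```

Students typically discover the equilibrium empirically from the candies before anyone writes down the difference equation, which is precisely the modeling-first order of events.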

**The Big Picture**

In the *2015 CUPM Curriculum Guide to Majors in the Mathematical Sciences*, described by Martha Siegel in this AMS Blog [Siegel2015], there is rich support for applications and use of technology in many mathematics courses. From the Course Group on Differential Equations of the CUPM Curriculum Guide material [CUPMODE2015], we note, “There are major applications involving differential equations in all areas of science and engineering, and so many of these should be included in the ODE courses to show students the relevance and importance of this topic.” In the section “Technology and the Mathematics Curriculum” of the main report [CUPM2015] there is strong encouragement to include technology wherever possible, as its use enhances understanding and enables more sophisticated modeling and applications, thereby motivating students.

For years COMAP [COMAP2016] has enriched the repertoire of teachers who seek to motivate mathematics through modeling and application with the production of UMAP Modules, journals, texts, videos, and modeling competitions. I have worked with students who took part in the Mathematical Contest in Modeling and the Interdisciplinary Contest in Modeling [MCMICM2016] offered by COMAP. This is a four-day competition, in teams of three, in which students apply the mathematics they know and learn lots more mathematics en route to solving a real-world problem. Hands down, students tell us that applying mathematics in context and on the spot to build a model is the most rewarding experience of their undergraduate mathematics, often saying, “This is the best mathematical experience of my life.”

The Society for Industrial and Applied Mathematics (SIAM) and COMAP have recently released a powerful report, *Guidelines for Assessment and Instruction in Mathematical Modeling Education* [GAIMME2016], in support of modeling throughout the mathematics curricula from K-16. This report is rich in support of why, what, how, and when to both introduce and assess/evaluate modeling efforts in the classroom. The report encourages and supports faculty with little experience in modeling to get into the game, and offers practical suggestions and illustrations that should enable more faculty to incorporate modeling in their teaching.

**Taking a Natural Step and Building Community**

We offer an effort to include modeling in one course, differential equations, in the hope that others will join the effort and also do so in other courses. We have created a freely-available community for teachers and students called SIMIODE — Systemic Initiative for Modeling Investigations and Opportunities with Differential Equations. SIMIODE is about teaching differential equations using modeling and technology upfront and throughout the learning process. You can learn more at our dynamic website www.simiode.org [SIMIODE2013] where we offer a community in which colleagues can communicate, contribute, collaborate, publish, teach, explore, etc.

SIMIODE is a repository of teaching materials and of references to other useful sources of materials and ideas for teaching differential equations using modeling and technology. SIMIODE offers a growing set of Modeling Scenarios, key pedagogical components in which a modeling situation, rich in detail, motivates the study of differential equations. Additionally, there are Technique Narratives, which provide techniques and strategies for solving differential equations with motivating examples, activities, and exercises. These materials are double-blind peer reviewed and published online at SIMIODE. In addition, SIMIODE offers videos from which students can collect their own data for modeling with differential equations, both at SIMIODE [SIMIODE2013] and on SIMIODE’s YouTube channel [SIMIODEYouTube2014].

Examples for learning differential equations with modeling from SIMIODE include such topics as chemical kinetics, sublimation, Torricelli’s Law, feral cat control, dialysis, word propagation, mixing fish, the spread of an oil slick, ant tunnel building, pendulum studies, machine replacement, pursuit, drug administration, spring-mass configurations, shuttlecock fall, stadium design, hang time, malaria control, electric circuits, whales and krill, and many more. In each case a scenario, often with data, is offered and students are supported in building differential equation models to address the situation. Quite often the model comes before the introduction of the formal differential equation; indeed, the model motivates the mathematics.
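As one illustration of how little machinery such a scenario demands, consider the Torricelli’s Law example from the list above: the height of water draining from a tank satisfies \(dh/dt = -k\sqrt{h}\). A minimal sketch (hypothetical parameter values, Euler’s method chosen for simplicity; this is not code from SIMIODE) lets students compare a simulation against data they collect from a draining bottle:

```python
import math

def euler_tank(h0, k, dt, t_end):
    """Euler's method for Torricelli's Law, dh/dt = -k * sqrt(h):
    water draining from a tank through a small hole.
    Returns a list of (time, height) pairs."""
    h, t = h0, 0.0
    trace = [(t, h)]
    while t < t_end and h > 0:
        h = max(h - k * math.sqrt(h) * dt, 0.0)  # clamp at empty
        t += dt
        trace.append((t, h))
    return trace

# Exact solution for comparison: h(t) = (sqrt(h0) - k*t/2)**2
# until the tank empties, so the simulation can be checked.
trace = euler_tank(h0=1.0, k=0.1, dt=0.01, t_end=10.0)
print(trace[-1])  # (time, height) at the end of the run
```

Having the closed-form solution alongside the simulation lets students see both why the model predicts a finite emptying time and how step size affects numerical accuracy.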

SIMIODE is project- and inquiry-based learning at its core: teachers can find (and create and publish their own) activities in which students discover and build differential equation models to address the scenario offered. The use of technology, as appropriate, encourages a rich solution space, addressing technical, graphical, numerical, and symbolic issues for the model under study. Technology permits deeper understanding and richer analyses.

Most importantly, SIMIODE is a community of teachers and students, wherein teachers can collaborate in building modeling opportunities, address issues appropriate to their interests and the interests of their students, and reach out to new colleagues who are interested in teaching differential equations using modeling as the motivation for the subject. Within SIMIODE teachers can build their own course, form groups based on common themes from class rosters to special student teams, and can work on projects with colleagues and students from different campuses.

SIMIODE is sponsoring minicourses at both MathFest in August 2016 and the Joint Mathematics Meetings (JMM2017) in January 2017 as well as conducting a Special AMS Session, “Experiences in teaching differential equations in a modeling first approach,” at JMM 2017. Thus, there is ample opportunity to get first-hand experience in this approach to teaching in addition to collegial support from the on-line community at www.simiode.org.

**Conclusion**

The message is this: students can learn mathematics in context, and we should use mathematical modeling and technology to teach it. Encouragement and support for using modeling in mathematics coursework are growing in the form of resources, professional society recommendations, collegial conversations, and communities such as SIMIODE.

Like radio waves, this message about the joy and power of using modeling in mathematics instruction is everywhere; our receivers need only tune in and pick it up. More importantly, many are engaging because of what they see and hear. We invite you to join us.

**References**

[BrownEtAl2014] Brown, Peter C., Roediger, Henry L., and McDaniel, Mark A. *Make It Stick: The Science of Successful Learning.* Belknap Press, 2014.

[COMAP2016] The Consortium for Mathematics and Its Applications. 2016. www.comap.com. Accessed 17 May 2016.

[CUPM2015] CUPM. 2015. *2015 CUPM Curriculum Guide to Majors in the Mathematical Sciences*. Edited by Paul Zorn. http://www.maa.org/sites/default/files/pdf/CUPM/pdf/CUPMguide_print.pdf. Accessed 2 May 2016.

[CUPMODE2015] Devaney, R. 2015. *Ordinary Differential Equations Course Report*. http://www2.kenyon.edu/Depts/Math/schumacherc/public_html/Professional/CUPM/2015Guide/Course%20Groups/OrdDiffeq.pdf. Accessed 5 May 2016.

[GAIMME2016] COMAP and SIAM. 2016. *Guidelines for Assessment and Instruction in Mathematical Modeling Education (GAIMME)*. http://www.comap.com/Free/GAIMME/index.html. Accessed 1 May 2016.

[LibertiniBliss2016] Libertini, J. and K. Bliss. 2016. Using Applications to Motivate the Learning of Differential Equations. To appear in Association for Women in Mathematics publication.

[MCMICM2016] MCM/ICM. 2016. Mathematical Contest in Modeling and Interdisciplinary Contest in Modeling. COMAP. http://www.comap.com/undergraduate/contests/. Accessed 17 May 2016.

[SIMIODE2013] SIMIODE. 2013. Systemic Initiative for Modeling Investigations and Opportunities with Differential Equations. www.simiode.org. Accessed 1 May 2016.

[SIMIODEYouTube2014] SIMIODE. 2014. SIMIODE YouTube Channel. https://www.youtube.com/channel/UC14lC-tyBGkDPmUnKMV3f3w. Accessed 1 May 2016.

[Winkel1997] Winkel, B. J. 1997. In Plane View: An Exercise in Visualization. *International Journal of Mathematical Education in Science and Technology*. 28(4): 599-607.

[Winkel2008] Winkel, B. J. 2008. Fourier Series: Optimization Opportunity. *International Journal of Mathematical Education in Science and Technology*. 39(2): 276-284.

[Winkel2014] Winkel, B. J. 2014. 1-1-S-MandMDeathAndImmigration. https://www.simiode.org/resources/132 . Accessed 15 May 2016.

In my Mathematics for Teachers course, students take a fresh look at foundational concepts, such as fractions and place value, from an advanced perspective. For some of them, our work together exposes weaknesses in their backgrounds, and unsettling stories emerge regularly, but B.’s story stands out. B. was a senior Japanese Studies major who offered insightful observations during problem-solving sessions. As the semester progressed, it became clear that there was a gap in his mathematical knowledge. He explained that he moved to the U.S. speaking only Spanish, and missed out on the mathematics being taught while he was learning English. He soon moved to a different city, and never learned how to add fractions. A significant chunk of the college curriculum was inaccessible to him because his middle school had no mechanism for accommodating his language transition. B. has many strengths, and he will do well in the world, but he was shortchanged at a critical phase in his mathematics education.

We have all had students who arrive at college unprepared to do college-level mathematics. While many contributing factors are at play, it’s clear that inequities in pre-K-12 education systems play an important role. It’s also clear that it is extremely difficult, if not impossible, to make up in four years for disparities experienced over fifteen years. Although we work in higher education, we must advocate for greater equity in pre-college education. If we don’t, we’re simply perpetuating injustice.

That injustice is reflected in persistent and significant differences in educational attainment among demographic groups in the United States. It’s not just that students from some groups are less prepared for college. Those college students have too many peers who don’t have access to college at all, for reasons that are well beyond their, or their families’, control.

Consider this chapter title from a recent report of the United States Government Accountability Office (GAO):

**The Percentage of High-Poverty Schools with Mostly Black or Hispanic Students Increased over Time, and Such Schools Tend to Have Fewer Resources.**

The report goes on to describe differences in those resources. For example, 79% of schools described as low-poverty and 0–25 percent Black or Hispanic offer Algebra in 7th or 8th grade, compared to 49% for high-poverty, 75–100 percent Black or Hispanic schools. (Within that category, the rate is 37% for charter schools.)

Just last week, the U.S. Department of Education released “A First Look” at its Civil Rights Data Collection for 2013 – 2014. From that list of highlights: “Black, Latino, and American Indian or Alaska Native students are more likely to attend schools with higher concentrations of inexperienced teachers … 11% of black students, 9% of Latino students, and 7% of American Indian or Alaska Native students attend schools where more than 20% of teachers are in their first year of teaching, compared to 5% of white students and 4% of Asian students.” Recent research supports the idea that teachers become more effective with experience, and that (contrary to earlier claims) they continue to improve well into their careers.

I focus on high-poverty schools with mostly Black or Hispanic students because students in those schools are getting the least from their education systems by various measures (performance on tests, graduation rates, and college attendance). This is not to say that these are the only students that should concern us. Indeed, here in Vermont, with a predominantly white population, students eligible for free and reduced-price lunch (a common proxy for poverty) are less successful in school than their peers, by several measures. Michael Marder has put together excellent visualizations of data on connections between child poverty and school outcomes. (If you are tired of hearing how poorly U.S. students perform on international assessments of mathematics learning, note from Marder’s slides that if Massachusetts were its own country, it would rank much higher than the U.S. as a whole.)

Remedial college courses have an unimpressive track record overall. An alternative approach offered by the Carnegie Foundation is promising, but it certainly doesn’t absolve us of the responsibility to reduce the need for remediation in the first place.

A look back at recent attempts to reform public education identifies some measures that don’t work to address achievement gaps. Blaming and shaming teachers, for example, is counterproductive. For one thing, many factors influence a student’s learning. Of course teacher training and experience are important, but high-stakes testing that holds teachers accountable for factors beyond their control makes no sense.

So-called “value-added measures” (VAMs), which try to quantify a teacher’s effect on student learning by way of pre- and post-testing, don’t perform as advertised. Indeed, the American Statistical Association (ASA) issued a statement in 2014, which warns that VAMs should be used with care and expertise, because, for example, “VAMs typically measure correlation, not causation. Effects – positive or negative – attributed to a teacher may actually be caused by other factors that are not captured in the model.” For this and other reasons, the ASA states, “(r)anking teachers by their VAM scores can have unintended consequences that reduce quality.”

What are individual mathematicians to do? Given that public schools in the U.S. are largely under local control, we can start by finding out what’s happening in our own towns and states, beginning with the funding disparities among school districts. From the abstract: “…low-salary districts serve students with higher needs, offer poorer working conditions, and hire teachers with significantly lower qualifications, who typically exhibit higher turnover.” This is infuriating, if not surprising. As educators, we should understand the challenges facing our colleagues who teach children and adolescents.

On a more granular level, we might investigate what measures our local districts take to address achievement gaps. Do new teachers have access to effective induction programs designed to reduce teacher turnover? Are high-quality preschool experiences available to all children? Are there programs, like New York City’s Community Schools, that provide services to needy families in order to support learning?

We mathematicians have something to offer to the local and national conversations, given our well-developed attention to detail and our ability to analyze quantitative arguments. One has to be prepared to face, early and often, the irony of poor data analysis and inaccurate terminology being used in the name of improving education. For one example of how to respond, see this excellent piece by John Ewing. For another, we can thank Cathy Kessel.

In our academic departments, we might start by supporting our colleagues who provide appropriate training to future teachers and professional development to current practitioners. We can value mathematics research and mentor future PhD mathematicians while at the same time recognizing the importance, and complex challenges, of bringing substantial mathematics education to all children.

Our professional organizations can provide inspiration and evidence in the form of position statements. The ASA statement on VAMs is one example of a valuable contribution. The National Council of Supervisors of Mathematics and TODOS: Mathematics for All just released a strong joint statement on social justice, while the Principles to Actions document from the National Council of Teachers of Mathematics (NCTM) includes “Access and Equity” as its second principle.

An encouraging development at the AMS is the appointment of Helen Grundman to the newly established position of Director of Education and Diversity. While the focus of that position is on graduate education, this commitment to promoting diversity in mathematics will certainly draw closer attention to conditions at all levels of the pipeline.

In a recent interview, NCTM President Matt Larson reminded us to recognize the power of mathematical understanding:

I think traditionally, especially in the current era, the importance of mathematics education has always been positioned in terms of national defense and economic need and college and career readiness. And all of those issues are absolutely important, but I think we also need to keep in mind that we also teach mathematics to develop democratic citizenship through critical thinking with mathematics and that that is also an important goal for us.

Without quantitative literacy, citizens are unlikely to comprehend, let alone be able to influence, many of the decisions and actions of those in power in political, social, scientific, and economic institutions.

I want to make sure we remember that mathematics teachers in a very real way contribute to a democratic society.

I like to think that all of us who teach mathematics contribute to a democratic society, but we’ll do a better job of it if we pay attention to equity at all levels. In the 1980s, a consortium of organizations called on us to treat Calculus as “a pump, not a filter.” While we search for effective ways to bring under-prepared college students into mathematics, we can also bear witness to the filters experienced by many younger students, and support the construction of pumps to take their place.

One of the iconic messages of the calculus reforms that took place in the 1990s is the “Rule of Four,” emphasizing the use of multiple representations: algebraic, geometric, numeric, and verbal. But what is a numerical representation of the derivative?

In a recent study [1], we asked faculty in mathematics, physics, and engineering to determine a derivative based on experimental data they had to collect themselves, using the apparatus shown in Figure 1. The physicists and engineers had no trouble doing so—but the mathematicians refused to acknowledge a computed *average* rate of change, however accurate, as a derivative. The physicists and engineers knew full well that their computation was an approximation, but they also knew how to ensure that it was a good one.

*Figure 1: The Partial Derivatives Machine, designed by David Roundy at Oregon State University. In this mechanical analog of a thermodynamic system, the variables are the two string positions (the flags) and the tensions in the strings (the weights). However, it is not obvious which variables are independent, nor even how many independent variables there are. For further details, see [1].*

Context is everything in applications. Ask a physicist how small an infinitesimal distance is, and she will surely ask, “With respect to what?” Furthermore, even when working with quantities \(x\ll L\) for some scale \(L\), she might well add, “but not so small that atomic structure matters!” Physicists know that derivatives do not in fact describe the real world; they are a (very useful) idealization. This awareness of the allowed regime is second nature to scientists and engineers, even if often left unstated.

We recently argued [2] that mathematicians’ “bright line” distinction between average and instantaneous rates of change is therefore misplaced. How does one talk about instantaneous rates of change numerically, anyway? It is not only atomic structure that imposes a lower bound: Roundoff error becomes a problem for “infinitesimal” numerical computations—and experimental error plays the same role when measurements are involved. In both cases, the very notion of numerical derivative *requires* a lower bound on the step size; it is simply not possible to compute actual limits numerically, nor from experimental data.
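A quick numerical experiment makes this lower bound concrete. The forward-difference quotient for \(\sin'(1)=\cos(1)\) improves as the step size \(h\) shrinks, until roundoff in the subtraction takes over and the approximation degrades again; a sketch (illustrative code, not from [2]):

```python
import math

def forward_difference(f, x, h):
    """Average rate of change of f over [x, x+h]: a 'thick derivative'."""
    return (f(x + h) - f(x)) / h

# Approximate d/dx sin(x) at x = 1; the exact value is cos(1).
# Truncation error shrinks with h, but roundoff error grows like
# machine-epsilon / h, so there is an optimal, strictly positive h.
exact = math.cos(1.0)
for h in (1e-1, 1e-4, 1e-8, 1e-12, 1e-15):
    approx = forward_difference(math.sin, 1.0, h)
    print(f"h = {h:.0e}   error = {abs(approx - exact):.2e}")
```

Running this shows the error falling and then rising again as \(h\) decreases: the best achievable accuracy occurs at a finite step size, exactly the point that a limit-based definition cannot capture numerically.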

Should we therefore reject numerical or experimental representations of derivatives? Of course not. Rather, we should move the line; what matters is not whether a rate of change is average or instantaneous, but whether it is “good enough.” Do we need to teach students complicated techniques of data analysis to determine what “good enough” means? Not necessarily, although it wouldn’t hurt to acknowledge that such techniques exist.

Our group has coined the name *thick derivative* for the resulting notion of “good enough approximation to the instantaneous rate of change.”

The lesson here goes well beyond a discussion of how best to teach students what a derivative is. The mathematics community is well aware that negative experiences with calculus are the single biggest factor causing students to switch out of STEM majors [3, 4, 5]. There is clearly a mismatch between what we mathematicians believe such a course should teach and the needs and abilities of our students. Perhaps we are focusing too much on dotting the i’s, and not enough on the underlying concepts.

I once asked my physicist wife whether physicists cared about the difference between the functions \(\frac{x^2-1}{x-1}\) and \(x+1\). Her straight-faced response was, “What difference?” This was not an instance of “sloppy math,” but rather a very deliberate attempt to point out that there are no physical situations where such removable singularities matter. So why do we start our calculus courses with them?

Similarly, mathematicians delight in constructing examples (and, with the advent of 3-d printing, models) of functions with direction-dependent limits, or of critical points that are *not* local extrema. Shouldn’t we be emphasizing the examples that *are* well behaved?

One of my favorite books as a student had the marvelous (and accurate) title, *Counterexamples in Topology* [6]. One thing I learned from this book is that some other mathematician is always going to be smarter than I am. As a successful mathematician, I have learned how to clarify my assumptions. But a calculus student should be learning how calculus works, not the largely unphysical mathematical contexts in which it doesn’t.

Is there a better way? I would argue that calculus is the study of infinitesimal reasoning, not limits. Calculus had been used successfully for 150 years before limits were invented—and the real numbers on which such limits depend were not properly defined until even later. Another 100 years would pass before nonstandard analysis would justify infinitesimal reasoning without limits, but by then it was too late; limits were here to stay.

So what do I suggest? Skip the fine print. Emphasize examples, not counterexamples. Use numerical data, and discuss the implications. Ask students to determine derivatives experimentally. No fancy apparatus is necessary; just measure rise over run! But be sure to include some examples that are not based on graphical data.

Emphasize the need to be fluent with multiple representations, not merely the ability to perform symbolic manipulations.

Much of our own work has emphasized geometric reasoning as the key to conceptual understanding. The dot product is fundamentally a *projection*; the cross product is fundamentally a *directed area*; the divergence is fundamentally about *flux*. In each case, the formulas follow from these conceptual underpinnings, rather than the other way around.

Use and encourage infinitesimal reasoning, the art of working with quantities that are “small enough”. As we have argued in a series of papers [7, 8, 9] and an online multivariable calculus text [10], differentials provide a robust, geometric, conceptual framework for working with such quantities; there are also others, such as power series.

All of these suggestions align well with the recommendations of the Curriculum Foundation Project of the MAA [11], which sought extensive input from partner disciplines: emphasize conceptual understanding, problem-solving skills, communication skills, and a balance between perspectives.

Small group activities supporting many of these ideas are available through the project websites described below, which include indexes of activities suitable for vector calculus and multivariable calculus.

Each activity is documented separately, in hopes of allowing instructors to use as many or as few activities as they wish. Although our own work has focused on second-year calculus, many of the ideas—and some of the activities—could be easily restricted to single variable calculus. The Partial Derivatives Machine in Figure 1 becomes a *Derivatives Machine* if one string is locked down. Similarly, use just one edge of the surfaces in Figure 2.

Finally, tell a story. After all, there are really only two ideas in calculus: ratios of small quantities, and chopping and adding. Let’s not lose sight of the coherence of that underlying message.

*Figure 2: One of the plastic surface models developed by Aaron Wangberg at Winona State University as part of the Surfaces project. Each of the color-coded surfaces is dry-erasable, as are the matching contour maps, one of which is visible underneath the surface. For further details, see the Surfaces project website.*

**Acknowledgements**

Most of the ideas presented here grew out of more than 20 years of collaboration with my wife, Corinne Manogue, as well as many colleagues too numerous to name. David Roundy deserves the credit for introducing “experiment” as a representation of the derivative, leading directly to the concept of *thick derivatives*. Much of this work was done under the auspices of three overlapping projects.

The Vector Calculus Bridge project seeks to bridge the gap between the way mathematicians teach vector calculus and the way physicists use it.

The Paradigms in Physics project has redesigned the entire upper-division physics curriculum at OSU, incorporating modern pedagogy and deep conceptual connections across traditional disciplinary boundaries; its website documents both the 18 new courses that resulted, and the more than 300 group activities that were developed.

The Raising Calculus to the Surface project uses plastic surfaces and accompanying contour maps, all writable, to convey a geometric understanding of multivariable calculus.

The Bridge and Paradigms projects have been supported by the NSF through grants DUE–9653250, DUE–0088901, DUE–0231032, DUE–0618877, DUE–1023120, and DUE–1323800; the Surfaces project is supported by the NSF through grant DUE–1246094.

Figure 1 first appeared in [1]; Figure 2 is taken from the Surfaces project website, and is used by permission.

**Bibliography**

[1] David Roundy, Eric Weber, Tevian Dray, Rabindra R. Bajaracharya, Allison Dorko, Emily M. Smith, and Corinne A. Manogue, *Experts’ understanding of partial derivatives using the Partial Derivative Machine*, Phys. Rev. ST Phys. Educ. Res. **11**, 020126 (2015).

[2] David Roundy, Tevian Dray, Corinne A. Manogue, Joseph F. Wagner, and Eric Weber, *An Extended Theoretical Framework for the Concept of Derivative*, in Proceedings of the **18th Annual Conference on Research in Undergraduate Mathematics Education**, (Pittsburgh, 2015), eds. Tim Fukawa-Connelly, Nicole Engelke Infante, Karen Keene, Michelle Zandieh, MAA, pp. 838–843.

[3] *Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics*, President’s Council of Advisors on Science and Technology, The White House, Washington, DC, 2012.

[4] Chris Rasmussen and Jessica Ellis, *Who is Switching out of Calculus and Why?*, in Proceedings of the 37th Conference of the International Group for the Psychology of Mathematics Education, Vol. 4, eds. Anke M. Lindmeier and Aiso Heinze, PME, Kiel, Germany, 2013, pp. 73–80.

[5] David Bressoud and Chris Rasmussen, *Seven Characteristics of Successful Calculus Programs*, Notices of the AMS **62**, 144–146 (2015).

[6] Lynn Arthur Steen and J. Arthur Seebach, Jr., **Counterexamples in Topology**, 2nd edition, Springer Verlag, New York, 1978.

[7] Tevian Dray and Corinne A. Manogue, *Using Differentials to Bridge the Vector Calculus Gap*, College Math. J. **34**, 283–290 (2003).

[8] Tevian Dray and Corinne A. Manogue, *Putting differentials back into calculus*, College Math. J. **41**, 90–100 (2010).

[9] Tevian Dray, *Using differentials to determine the derivatives of trigonometric and exponential functions*, College Math. J. **44**, 17–23 (2013).

[10] Tevian Dray and Corinne A. Manogue, *The Geometry of Vector Calculus*, (online only).

[11] Susan Ganter and William Barker, eds., *Curriculum Foundations Project: Voices of the Partner Disciplines*, MAA, 2004.

In my experience, many students in K-12 and post-secondary mathematics courses believe that:

- all math problems have known answers,
- failure and misunderstanding are absent from successful mathematics,
- their instructor can always find answers to problems, and
- regardless of what instructors say, students will be judged and/or assessed based on whether or not they can obtain correct answers to problems they are given.

As long as students believe in this mythology, it is hard to motivate them to develop quality mathematical practices. In an effort to undercut these misunderstandings and unproductive beliefs about the nature of mathematics, over the past several years I’ve experimented with assignments and activities that purposefully range across the intellectual, behavioral, and emotional psychological domains. In this article, I provide a toolbox of activities for faculty interested in incorporating these or similar interventions in their courses.

**Psychological Domains**

A useful oversimplification frames the human psyche as a three-stranded model:

The intellectual, or *cognitive*, domain regards knowledge and understanding of concepts. The behavioral, or *enactive*, domain regards the practices and actions with which we apply or develop that knowledge. The emotional, or *affective*, domain regards how we feel about our knowledge and our actions. All three of these domains play key roles in student learning. In post-secondary mathematics courses, our classroom activities and assessments often focus primarily on intellectual knowledge and understanding, with emotional and behavioral aspects of learning addressed either implicitly or not at all. A partial antidote to this is found in the many active learning techniques being used in post-secondary mathematics courses, such as think-pair-share, “clicker” systems, one-minute papers, inquiry-based learning, and service learning, among others. A strength of active learning methods is that they challenge students’ unhelpful beliefs and practices through public dialogue and activities. What active learning techniques might not *explicitly* do is frame these discussions and activities within a broader context involving the nature of intelligence and the process of successful learning.

A goal for my courses is to incorporate direct interventions that provide students with three things:

- language that supports articulate reflection and discussion in the context of emotional and behavioral domains,
- an environment in which such reflection and discussion arise naturally and effectively, and
- a contemporary “external source” motivating this language and environment so that our discussion is not driven by the will of the instructor.

The ways in which these interventions are realized in my classes will change over time, and I am willing to follow current educational trends if they are effective tools for my students. Many of the interventions I have used are based on research in psychology regarding mindsets, a topic that I’ve written about previously on this blog. While the literature on mindset research contains contradictory empirical findings, this is not a problem for me since my main goal is to use the language and motivation that this research provides as a tool for engaging students across psychological domains. Mindset research is only one among many possible sources of motivation for meeting the goals above;* what is critical is to make sure that my mathematics courses include activities that explicitly promote student development across all three of these psychological domains.*

**A Toolbox of Interventions**

What follows are student assignments and activities that I’ve used in classes ranging from 20-student upper-level courses for math majors to 150-student Calculus courses for STEM majors. They have a common purpose of promoting student development in one or both of the emotional or behavioral domains, complementing other work that my students do to develop intellectually in mathematics. An important disclaimer: none of these activities are original with me; rather, these are all adaptations of the work of others, to whom I will always be indebted.

*Introductions*. On the first day of class each semester, I begin with students introducing themselves to each other. In a small class with fewer than 30-50 students, there is time for everyone to take turns sharing with the entire class their name and the reason they are taking the course. In a large-lecture course, I have students do the same thing in groups of 4-6 people sitting next to each other. I teach at the University of Kentucky, and many of our STEM majors are primarily enrolled in large lecture courses during their first year. By beginning every course with a 5-minute activity that recognizes the students and promotes discussion, a collaborative tone is set for the remainder of the course, and some of the isolation that students feel (especially as one among many in a large lecture) can be countered.

*Day 1, small classes: reading and autobiography assignment.* During the first week of class, I assign an article about mindset research by Carol Dweck along with a one-page autobiographical essay. I have used Dweck’s articles “The Secret to Raising Smart Kids” and “Is Math a Gift? Beliefs that put females at risk” for this with success. I grade the essay based on completion only, ignoring the quality of the writing, editing, or ideas. The goal is to get students to reflect and be honest, not necessarily to train them to write well. If students respond to the prompt in a relevant manner, they get full credit.

*Day 1, large classes: video and small group discussions.* In large classes with 150 or more students, especially in courses that are coordinated across sections, the autobiography assignment is harder to implement. Another way to introduce students to the language of mindsets (or other tools) is to have students watch a 10-minute video about mindset research during class on the first day. Following the video, have students spend 2-3 minutes free-writing a response to the video, and then spend 2-3 minutes discussing their response with a neighbor in the class.

*Course policy on supportive language.* I have a course policy on supportive language that I use in all of my classes: *Students are not allowed to make disparaging comments about themselves or their mathematical ability, at any time, for any reason.* I give students a variety of examples of “banned” phrases and suggested replacements that can be found here. The important aspect of this policy is that it must be enforced — if I hear students making negative comments, I say “course policy” and have them create a neutral rephrasing of their negative self-comment. This is tougher to implement in large lectures, but even in this context the policy sets a positive tone for the first month of class. In large lectures with accompanying recitations, it is important that graduate student teaching assistants are aware of this policy and enforce it during their recitation sections. It is also important that students know that the policy applies to faculty and teaching assistants as well. I had a student in a large Calculus II lecture call me out for violating this policy last semester when I was frustrated at making errors during an example, and it was an excellent moment for the class.

*Video regarding effectiveness of science videos.* During class, I have students watch a video about research regarding the effectiveness of science videos. As with the video on the first day of class, students complete a two-minute free writing followed by a two-minute discussion with their neighbors regarding their response to the video. For many students, a common behavioral practice is that if they are stuck on a math problem, they immediately search the internet for videos that explain how to do this type of problem. This is typically an unproductive behavior, and dedicating some class time to confront it directly sets the stage for further discussions regarding the processes students use for completing homework and solving problems.

*Assign an unsolved problem as homework.* As I’ve written before, assigning an unsolved math problem as homework can serve as a gateway to discussions about the nature of high-level mathematical problem solving and the processes, practices, and attitudes that students bring to authentic mathematical challenges. When I assign an unsolved problem, e.g. those given in the article linked to above, I provide students with the following prompt.

This is a famous unsolved problem in mathematics. Work on it for a while — the goal isn’t for you to solve this, but rather to get a feel for the problem. Create an essay by recording your thoughts and attempts as you work. Focus on responding to the following questions: What did you try to do? Why did you try this? What did you discover as a result? Why is this problem challenging? (Seriously, write down everything you’re thinking and every idea you try, even if it doesn’t go anywhere.)

It’s good to grade this problem generously regarding mathematical content, keeping in mind that the goal is for students to be rewarded for demonstrating persistence and good mathematical processes.
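One well-known candidate for such an assignment (my illustration; the source defers to a linked list of problems) is the Collatz conjecture: does repeatedly applying "halve if even, triple and add one if odd" always reach 1? A few lines of code let students gather data before forming conjectures:

```python
def collatz_steps(n):
    """Number of Collatz steps (n/2 if even, 3n + 1 if odd) to reach 1.
    That this process always terminates is a famous open problem."""
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

# Experimental data for conjecturing: which starting values take longest?
record = max(range(1, 10000), key=collatz_steps)
print(record, collatz_steps(record))
```

Recording what they tried and what patterns they noticed fits the essay prompt above directly.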

*Reflective essay about homework.* In most of my upper-level courses, especially those in which I assign an unsolved problem as homework, I have students write a 2-3-page essay explaining what they found most and least challenging in the homework so far, and which homework problems have been their most and least favorite. The prompt can ask them to link directly to mindset or another external topic, or can be left relatively open-ended to see what connections students make on their own. This can be graded either with a writing rubric or based on completion. The majority of my students have discussed at length their experience working on the unsolved problem, both what they did and how they felt about their work.

*Create-your-own homework assignment.* A recent assignment that I’ve used is to have students write their own homework assignment toward the end of the semester. The specific prompt I used was this:

Create your own homework assignment containing three problems. The homework assignment should be typed. There should be a mix of easy and hard problems that represent a broad spectrum of ideas from the entire course. For each of these problems, type a paragraph explaining why you chose that problem, whether you think it is easy, medium, or hard in difficulty, and what area of the course the problem represents. Once you have created the homework assignment, you should include complete solutions to each of the problems. Your solutions to the problems may be either typed or handwritten, but they should be complete and correct.

It was fascinating to see what the students came up with for their homework. What I found particularly noteworthy was the large number of students who included as one problem a critical analysis essay or short reflective essay similar to what I had assigned in the course to complement mathematical content work. I had honestly expected their assignments to contain a range of standard problems focused on mathematical content, and was pleasantly surprised to see the students incorporating into their homework tasks that addressed behavioral and emotional aspects of doing mathematics.

*End-of-course reflective essay.* In my smaller classes, I assign as the final homework assignment the following short essay prompt. The grade is based only on completion, because I want students to write honestly without fear of being penalized for their opinions.

What were six of the most important discoveries or realizations you made in this class? In other words, what are you taking away from this class that you think might stick with you over time and/or influence you in the future? What have you experienced that might have a long-term effect on you intellectually or personally? These can include things you had not realized about mathematics or society, specific homework problems or theorems from the readings, etc. These can be things that made sense to you, or topics where you were confused, points that you agreed/disagreed with in the readings or class discussions, issues that arose while working on your course project, etc. Explain why these six discoveries or realizations are important to you.

I have found that reading through these essays is a fascinating exercise, because of the wide range of messages that the students perceived as being central to the course. Using this assignment consistently over time has helped me improve my ability to create focused courses with clearly defined and communicated learning outcomes.

**Final Thought**

If you experiment with any of these activities in your own courses, I would love to hear about your experiences!

One common instructional approach during the first two years of undergraduate mathematics in courses such as calculus or differential equations is to teach primarily analytic techniques (procedures) to solve problems and find solutions. In differential equations, for example, this is true whether the course is strictly analytical or focuses on both analytic techniques and qualitative methods for analysis of solutions.

While these analytic techniques form a major part of the early undergraduate mathematics curriculum, there is significant discussion and research about the importance of learning the concepts of mathematics. Many researchers in mathematics education encourage teaching mathematics where students learn the concepts before the procedures and are guided through the process of reinventing traditional procedures themselves (e.g., Hiebert, 2013). Additionally, educators who have developed mathematical learning theories often set up a dichotomy between the two kinds of learning (e.g., Skemp, 1976; Haapasalo & Kadijevich, 2000). At the collegiate level, we as professors may agree that these educational ideas hold merit, but also firmly believe that students have a significant amount of content to learn, and we may not always be able to spend the time necessary to allow students to participate fully in the development of conceptual understanding and the reinvention of the mathematics (including procedures).

However, some researchers, including ourselves, provide evidence that “teaching the procedures to solve problems and find solutions” and “providing ways for teaching concepts first so students will truly understand” can be integrated, and that learning procedures does not need to be shallow and merely a memorized list (Star, 2005; Hassenbrank & Hodgson, 2007). Our framework for merging these two ways of teaching is titled the *Framework for Relational Understanding of Procedures*. It was developed as part of Rasmussen and colleagues’ work on the teaching and learning of differential equations (Rasmussen et al., 2006). Skemp coined the original definition; he defines relational understanding as “knowing both what to do and why” and contrasts it with instrumental understanding, “rules without reason” (1976, p. 21).

Below, we describe the six components of the *Framework for Relational Understanding of Procedures*. The idea is that each category can be used to consider and enhance students’ learning as they study a procedure. For each component, we provide a brief explanation, questions about student thinking, and an example exam question taken from our work in differential equations. Each instructor could likely adapt these to other procedures, both in differential equations and in other courses.

**Components of Relational Understanding of Procedures**

**Student can anticipate the outcome of carrying out the procedure without actually having to do so, and can anticipate the relationship of the expected outcome to outcomes from other procedures.**

This component suggests that a student understands what kind of solution would be expected before solving. A student might need to consider the following: Is the solution going to be a number, or a function? When is the solution one or two functions? Are there different forms to show the answer? How do the answers compare to other answers from similar procedures?

Example:

*Suppose that a differential equation can be solved with either separation of variables or with a general technique for solving first order linear differential equations. Let \(y_{sep}(t)\) be the solution for an initial value problem using separation of variables, and let \(y_{lin}(t)\) be the solution for the same initial value problem using the technique for linear differential equations. Which of the following statements correctly states the relationship between \(y_{sep}(t)\) and \(y_{lin}(t)\)?*

*\(y_{sep}(t)\) is not equivalent to \(y_{lin}(t)\)*

*\(y_{sep}(t)\) is equivalent to \(y_{lin}(t)\) for all t*

*\(y_{sep}(t) = y_{lin}(t)\) only for equilibrium solutions*

*\(y_{sep}(t) = y_{lin}(t)\) only at the initial condition*
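The equivalence this question targets can be checked with a computer algebra system. A sketch (the specific equation \(\frac{dy}{dt}+2y=6\) with \(y(0)=0\) is my own choice, not from the assessment):

```python
import sympy as sp

t = sp.symbols('t')

# dy/dt + 2y = 6 with y(0) = 0, solved two ways by hand:
# separation of variables: dy/(6 - 2y) = dt  gives  y = 3 - 3 e^{-2t}
y_sep = 3 - 3*sp.exp(-2*t)
# integrating factor e^{2t}: (e^{2t} y)' = 6 e^{2t}  gives the same function
y_lin = sp.exp(-2*t) * (3*sp.exp(2*t) - 3)

assert sp.simplify(y_sep - y_lin) == 0                    # identical for all t
assert sp.simplify(sp.diff(y_sep, t) + 2*y_sep - 6) == 0  # both solve the ODE
```

Having students run such a check themselves reinforces that different procedures, applied correctly, must agree.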

**Student can identify when it is appropriate to use a specific procedure.**

Students often can carry out the procedure when they know that is what is needed. However, they often are unable to decide before they start which procedure is needed. One reason this is an issue is the structure of typical textbooks (e.g., the homework always matches the section). How many of you have had students say, “I could do all the problems in the homework, but then I didn’t know what to do for the exam”?

Example:

*Circle all that apply. A differential equation can be solved with the technique for first order linear ODEs if:*

*it has the form \(\frac{dy}{dx}=ax+by\) for some constants a and b*

*it has a solution whose graph is linear*

*it has the form \(\frac{dy}{dx}=f(x)y+g(x)\) for some functions f(x) and g(x)*

*it has the form \(\frac{dy}{dx}=mx+b^2\)*

**Student can correctly carry out the entire procedure or a selected step in the procedure**.

This is what we typically think of as doing a problem: performing the procedure. Can the student do the steps necessary to complete a problem correctly? Can the student analyze where they are in the procedure and know what to do next?

Example:

*A student is solving a first order linear differential equation and at some point in her solution process she correctly arrives at the expression \(e^{2t}\left(\frac{dy}{dt}+2y\right)\). This expression is equivalent to which of the following?*

*a) \((e^{2t}y)’\)*

*b) \((e^{3t}y)’\)*

*c) \(e^{3t}y’\)*

*d) \(e^{2t}y’\)*
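The key step here, recognizing \(e^{2t}\left(\frac{dy}{dt}+2y\right)\) as the derivative of a product, can itself be verified symbolically; a sketch using SymPy:

```python
import sympy as sp

t = sp.symbols('t')
y = sp.Function('y')(t)

lhs = sp.exp(2*t) * (sp.diff(y, t) + 2*y)  # the expression the student reached
rhs = sp.diff(sp.exp(2*t) * y, t)          # (e^{2t} y)' by the product rule

assert sp.simplify(lhs - rhs) == 0  # they agree, confirming choice (a)
```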

**Student understands the reasons why a procedure works overall. Additionally, student knows the motivation or rationale for key steps in the procedure.**

This component fundamentally involves the conceptual idea behind the procedure. As instructors, we make efforts to teach these ideas in our classes on a regular basis. However, are we concerned with how the students grow to understand the “why” of the procedure? Do the reasons for the steps play a part in the students’ solving? Can the students go back and make modifications because they understand what is really happening?

Example:

*Which of the following would be a justification for one or more of the steps needed to solve a first order linear differential equation? Circle all that apply.*

*Fundamental Theorem of Calculus*

*Mean Value Theorem*

*L’Hopital’s Rule*

*Product Rule*

**Student can symbolically or graphically verify the correctness or reasonableness of a purported outcome to a procedure without repeating the procedure.**

This component is about thinking through the answer in a way that lets you decide whether it makes sense. Our experience is that if you ask students to check for reasonableness, they often just repeat the procedure; this indicates a need to push for the bigger picture of making sense of a solution beyond just doing. Showing competence in this component might involve checking whether the solution actually satisfies the equation, or using a graphical or numerical technique to see if the two solutions are compatible. Can the student find a way to check for correctness? Can the student decide if answers are reasonable?

Example:

*Joey is solving an autonomous differential equation of the form \(\frac{dx}{dt}=f(x)\), using separation of variables to find the general solution. At one point in his solution process he correctly gets \(e^x=t^2+c\). His final answer is then \(x=\ln(t^2)+c\). We can verify that Joey’s final answer is:*

*a) Correct because \(x=\ln(t^2)+c\) says that graphs of solutions are shifts of each other along the t axis (that is, they are horizontal shifts of each other).*

*b) Correct because \(x=\ln(t^2)+c\) says that graphs of solutions are shifts of each other along the x axis (that is, they are vertical shifts of each other).*

*c) Incorrect because \(x=\ln(t^2)+c\) says that graphs of solutions are shifts of each other along the t axis (that is, they are horizontal shifts of each other).*

*d) Incorrect because \(x=\ln(t^2)+c\) says that graphs of solutions are shifts of each other along the x axis (that is, they are vertical shifts of each other).*

*e) Incorrect because \(e^x\) is always positive.*
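Joey's last step can also be checked symbolically; a sketch (treating \(c\) as a positive constant for simplicity):

```python
import sympy as sp

t, c = sp.symbols('t c', positive=True)

x_correct = sp.log(t**2 + c)   # e^x = t^2 + c  gives  x = ln(t^2 + c)
x_joey = sp.log(t**2) + c      # Joey's final answer

# Substituting back into e^x = t^2 + c exposes the algebra error:
assert sp.simplify(sp.exp(x_correct) - (t**2 + c)) == 0
assert sp.simplify(sp.exp(x_joey) - (t**2 + c)) != 0  # e^{ln(t^2)+c} = t^2 e^c
```

This is exactly the kind of verification-without-repeating-the-procedure the component asks for.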

**Student can make connections within and across representations involved in the problem and solution: symbolic, graphical, and numerical.**

Educational literature suggests that one way to demonstrate deep understanding is to make connections among representations. Traditionally, in upper level mathematics, the representations are often symbolic, but in differential equations, linear algebra, and other freshman and sophomore classes, there are several representations, and students who can be flexible and move among them have better understanding.

Example:

*Jung Hee uses a slope field to determine the long-term behavior (that is, what happens as \(t \to \infty\)) of the solution to the initial value problem \(\frac{dy}{dt}=0.4y(70-y)\). Which of the following methods could be used to corroborate the long-term behavior she found by using the slope field? Circle all that apply.*

*The technique to solve separable differential equations.*

*Euler’s numerical method with a small step size.*

*The technique to solve first order linear differential equations.*

*None of the above.*
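Euler's method with a small step size does corroborate the slope-field prediction that solutions approach the equilibrium \(y=70\); a minimal sketch (the step size and initial condition are my choices):

```python
def euler(f, y0, t_end, h):
    """Advance y' = f(y) from y(0) = y0 to t = t_end using Euler's method."""
    y, t = y0, 0.0
    while t < t_end:
        y += h * f(y)
        t += h
    return y

# dy/dt = 0.4 y (70 - y): equilibria at y = 0 and y = 70
y_final = euler(lambda y: 0.4 * y * (70 - y), y0=2.0, t_end=2.0, h=0.001)
print(y_final)  # close to 70, matching the slope field's long-term behavior
```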

**Conclusion**

The framework described here, and the examples from an assessment developed for relational understanding (Keene, Glass, & Kim, 2011), may offer some ways to think about teaching the procedures that are the foundation of many early undergraduate mathematics classes. It may not be a matter of trying to teach the procedures or the concepts (as a dichotomy), but of developing a relational understanding of the procedures so that students can not only find answers, but also understand the underpinnings and development of the procedures. We believe that if students have this relational understanding, they will not only perform better in their current classes but also retain the skills and understandings over time, serving them well in all their subsequent mathematics courses.

We would like to acknowledge Dr. Chris Rasmussen for his contributions to the work.

**References**

Haapasalo, L., & Kadijevich, D. (2000). Two types of mathematical knowledge and their relation. *Journal für Mathematik-Didaktik*, *21*(2), 139-157.

Hassenbrank, J., & Hodgson, T. (2007). A framework for developing algebraic understanding & procedural skill: An initial assessment. In *Proceedings of the Research in Undergraduate Mathematics Education Conference*.

Hiebert, J. (2013). *Conceptual and procedural knowledge: The case of mathematics*. Routledge.

Keene, K. A., Glass, M. & Kim, J. H. (2011). Identifying and assessing relational understanding in ordinary differential equations. In *Proceedings of the 41st Annual Frontiers in Education Conference*, Rapid City, SD.

Rasmussen, C., Kwon, O., Allen, K., Marrongelle, K. & Burtch, M. (2006). Capitalizing on advances in K-12 mathematics education in undergraduate mathematics: An inquiry-oriented approach to differential equations. *Asia Pacific Education Review*, 7, 85-93.

Skemp, R. R. (1976). Relational understanding and instrumental understanding. *Mathematics Teaching*, *77*, 20-26.

Star, J. R. (2005). Reconceptualizing procedural knowledge. *Journal for Research in Mathematics Education,* 36(5), 404-415.

A good educator must facilitate learning for a classroom full of students with different attitudes, personalities, and backgrounds. But how? This question was the starting point for a new Faculty Teaching Seminar in the math and statistics department at Sam Houston State University. In the conversation that transpired, we looked to identify the most important components of creating a class culture that best enables us to achieve learning outcomes. What are our goals? How do we get the ball rolling each semester? How do we get our students on board? Read on to find out…

**What is a “classroom culture”? What are you after? Why is it important?**

*Taylor*: The environment in my classroom is a necessary component of a successful semester. The rapport that I build with my students, the tone of the class, and the ways that my students interact with each other are just as much a component of learning as the lectures or textbook.

In my experience, setting up a productive class culture can determine the potential for learning for the entire semester. A productive class culture is one where the students feel supported, protected, and valued.

*Ken*: I seek a “learning community” of student-scholars, people who are curious about mathematics and serious about learning. I want calculus students who are *proud* to be taking calculus. I want an upper level mathematics class where the students see themselves as professionals. I want a graduate class where students focus on exploration of mathematics and its mysteries, and where curiosity is the driving reason for study.

I don’t distribute the class syllabus as a hard copy. I collect work every class period and “speed-grade” it to return it the next day. I work with a department secretary to force late registration students to meet with me before adding my classes. I never have office hours before class. When challenged by colleagues about some of these unusual practices, I realized that I desire a certain type of classroom environment. I push, coach, and manipulate my students to achieve that environment.

**What do you do on Day 1 to create a classroom culture?**

*Ken*: My class culture begins with my syllabus, which lays out some “professional” expectations of my students. But I also begin, from Day 1, to set the stage for class expectations. Since much of the material I provide will be online (either via Blackboard or Google Drive), the syllabus is also available there and I do *not* hand out a hard copy. There will not be handouts during the semester; let’s get the students used to this on the first day!

In classes with a prerequisite, I give a quiz the first day. The intended message is, “We are serious about learning and are on the move!” Early in the semester I keep the class at a fairly brisk pace (emphasizing a steady regime of study) and I make sure to model this on day 1. Since many first-year students view office hours before class as an invitation to procrastinate, my office hours are not before class, but afterwards!

I never dismiss class early, not even on the first day.

*Taylor*: The answer to this question depends on what level the class is and what method of teaching I am using in the class, but there are some common themes in all of my classes on Day 1:

- Get the students talking: I always do some form of introductions in my class. Most often, I will have students pair up, introduce themselves to their partners, and then have each student’s partner introduce him/her to the entire class. This takes up a lot of time, but it is worth it! The students quickly learn that they are expected to participate. They must contribute to class, and this exercise makes them more comfortable speaking up. This also helps me to start learning their names and eliminates the need for me to do a roll call, inevitably stumbling awkwardly through hard-to-pronounce names.
- Be a cheerleader: I use some type of unconventional or atypical pedagogy in all of my classes. I always start with the assumption that my students will be new to this teaching method. I must begin to sell my teaching style on Day 1! I achieve this by explaining to students what they can expect from a typical class day and why we do things the way that we do. I also make sure to tell them what my expectations are.
- Include some content: I want to make sure that my students take my class seriously. Hard work begins on Day 1; like Ken, I never dismiss class early.

**Does classroom culture vary by class level?**

*Taylor*: Yes, absolutely. I usually focus on one or two aspects of a successful class culture and home in on developing those aspects. In a Calculus class, for example, I most want the students to learn to justify their thought processes. To achieve this, I will ask them to buddy up every day – literally push their desks next to someone else’s. I tell them, “Turn to your partner and ask, ‘Why is it true that…?’” I’ll then solicit feedback in a way that supports their collaboration by asking a student, “What justification did you and your partner come up with?”

In an Inquiry Based Learning class, I most want students to value productive failure as an integral part of the learning process. I will carefully praise mistakes and encourage participation from students who know they are wrong. In the photo, you see my IBL Algebra students writing proofs on the board; I have them visit each other’s work and circle anything they don’t agree with. Since we have a safe space where it’s ok to be wrong, my students are professional but thorough when it comes to correcting mathematical errors.

*Ken*: Yes, certainly this varies by level. At the lower level my expectations are typically overly optimistic. I don’t abandon them, but I recognize that students have been trained to focus on grades and testing. At the graduate level a classroom culture can be relatively easy to create, particularly if the students are already in a cohort and beginning to form a community.

The emphasis on a classroom environment is even important at the grade school level – see this article by Yackel and Cobb on creating a productive classroom environment in second and third grade!

**What about students who don’t buy in? How do you create/enforce “buy in” of your culture?**

*Ken*: Some students, in first- or second-year classes, don’t buy into the steady stream of new material and the consistent study discipline it requires. I routinely remind everyone of the expectations, and I attempt to motivate these expectations, in the same way that the coach of an athletic team might create team pride. For those students clearly not keeping up, I eventually chat with them briefly about the fact that this class is probably not for them. I encourage these students to either catch up quickly or find a more constructive use of their time. (There is an art to this. I often write an email to a poorly-performing student in which I express concerns about the progress and suggest some constructive alternatives that include starting fresh in the course next semester. I write these emails with a view to Mom reading over the student’s shoulder!)

At every level there is a fair amount of coaching. “Here is where we are going! Here is what we are trying to achieve! Look how far you’ve come!” I’ve coached competitive youth soccer teams and the speeches are similar. “You are working hard to reach this level! Keep it up! Here is our game plan for today…”

*Taylor*: I want all my students to take charge of their own education, so I will let a challenging student make his or her own decisions on how to participate in class, as long as the behavior isn’t disruptive. I may gently remind that student that I would prefer her or him to be fully engaged. In general, though, I think that if your class culture is based on a genuine desire to facilitate learning, students recognize and value the effort.

**What are pitfalls, mistakes, disasters?**

*Taylor*: A few semesters ago, I had a mutinous Calculus class. Somehow, I encouraged so much communication and collaboration among my students outside of class that a vocal minority opposition sprang up from within the class. I later discovered that there were students campaigning for the class to give me bad course evaluations (which happened). My feelings were hurt for a bit, but I learned valuable lessons that semester. I had been uncompromising in my desire for them to ask themselves “Why?” and this group wasn’t academically ready to do that. I now pay more attention to differentiating instruction, for example when a student asks a question in class.

*Ken*: My goal is a community of students all going in the same direction. If just one or two students are not swimming with the rest, the general flow of students will often pull them into the current. But if a significant minority resist the direction of the class, then things can go bad quickly. I must keep up with class morale and make sure that the program is flowing (somewhat).

Long ago, in an abstract algebra class where students were supposed to do small projects without discussing their work with others, I uncovered a collaborative ring that included a majority of the class. The students had ignored my published restrictions on collaboration. Rather than punish over half the class for this “plagiarism”, I backed up and restarted the process, admitting that I had not been sufficiently aware of the stress my problems generated. (The memory of that class is still a bit painful.)

**In summary,** effective learning occurs in a class environment in which curiosity, exploration, and even mistakes are part of the norm. We seek to create that culture even before the first class day!

*What do other teachers do to facilitate this? We would like to know!*
