From the Chair

Larry Cain, Davidson College

I would like to bring you up to date on the activities of the Forum on Education (FEd) that occurred this past summer and will occur this fall. Our revised bylaws continue to be incorporated into our operations. January 1 is now the beginning of officer terms, so at the end of 2018 three members of the current Executive Committee (ExComm) will rotate off the ExComm. I would like to thank John Stewart, outgoing Past Chair, for four years of excellent service on the chair line. He has chaired the nominating committee, the program committee, the Forum, and the fellowship committee in successive years. He has revised our bylaws to make our operations smoother and more structured. Thanks also to Luz Martinez-Miranda and Toni Sauncy for their outstanding work and contributions as members-at-large for the last three years. All three will be missed and I wish them well in their future work for physics education.

This fall we will elect new members of the ExComm. The Nominating Committee chaired by Vice Chair Jerry Feldman has been working over the summer to select the next slate of FEd candidates. We will elect a new vice chair and two new members-at-large as usual, but we will also elect a graduate student member of the ExComm for the first time. This new position was established in our revised bylaws. Please remember to vote when you receive the ballot this fall. Your input is needed to help the Forum remain strong and support physics education. Jerry describes the process below.

The Program Committee chaired by Chair Elect Laurie McNeil has been working over the summer and has finished the program for next year’s March meeting in Boston and April meeting in Denver. The committee has created an excellent array of talks at these meetings. I encourage you to attend these meetings and the FEd education sessions. Laurie describes the programs later in this newsletter.

The awards committees have also been busy this summer. They have finished their work selecting new APS Fellows, the recipient of the Excellence in Physics Education Award, and the recipient of the Jonathan F. Reichert and Barbara Wolff-Reichert Award for Excellence in Advanced Laboratory Instruction. The latter two awards will not be announced until late October. However, the APS Fellows have been announced by APS. Please congratulate the following new APS Fellows who were nominated by FEd: Diola Bagayoko (Southern University and A&M College), Amy L. R. Graves (Swarthmore College) and Heather Lewandowski (University of Colorado, Boulder). I want to encourage you to think about nominating persons (or groups, as appropriate) for the above APS education-related awards and for APS Fellowship through the Forum on Education next year. We all know of deserving colleagues for these awards and for fellowship, but we must nominate them for them to be considered.

FEd also works closely with the APS Committee on Education (COE), with the past chair, chair and chair-elect of FEd being members of the COE. In this way, the Forum maintains an active voice in physics education in the American Physical Society. An award given by the COE is the Award for Improving Undergraduate Education. I encourage nominations for that education award.

A new committee established in the bylaws is the Membership Committee. FEd has slowly lost members over the years as the APS has grown. This committee hopes to find ways to encourage more APS members to join the Forum as it works to keep physics education in the foreground for APS members. APS members can join any number of Forums for free. I encourage newsletter readers to recruit colleagues and other APS members to FEd. We can be successful only by having an engaged and broad membership in the Forum.

The ExComm is currently in the process of identifying a person to be the new Editor-in-Chief of the FEd newsletter. The new Editor-in-Chief will begin work later this fall and will work with current editor Richard Steinberg in creating the Spring newsletter before taking over for the Summer 2019 newsletter. I will contact the FEd membership when a new Editor-in-Chief is selected.

Election Process for the FEd Executive Committee

Jerry Feldman, Vice Chair and Chair of the Nominating Committee – Forum on Education, George Washington University

The Forum on Education has assembled a slate of candidates for election to the FEd Executive Committee. The candidates for Vice Chair are Catherine Crouch (Swarthmore College) and Scott Franklin (Rochester Institute of Technology). The nominee who is elected this year will serve as Chair-Elect, Chair, and Past Chair in subsequent years.

The candidates for the Member-at-Large seat to replace Luz Martinez-Miranda (whose term ends in 2018) are Brad Conrad (AIP) and Adrienne Traxler (Wright State University). The candidates for the APS-AAPT Member-at-Large seat currently held by Toni Sauncy (whose term ends in 2018) are Robert Hobbs (Bellevue College) and Ben Dreyfus (George Mason University). The elected Members-at-Large will take office in January 2019 and will serve a three-year term.

This year, for the first time, we are also electing a Graduate Student member of the FEd Executive Committee. We are grateful to our two candidates for this position, Julian Gifford (Univ. of Colorado) and Nicholas Young (Michigan State Univ.), for their willingness to stand for election. The elected Graduate Student member will take office in January 2019 and will serve a two-year term.

To produce this slate of candidates, a Nominating Committee (chaired by Jerry Feldman, FEd Vice Chair) was appointed in May 2018 and was composed of Andy Gavrin (IUPUI), Laura McCullough (Univ. of Wisconsin Stout), Luz Martinez-Miranda (Univ. of Maryland), Toni Sauncy (Texas Lutheran Univ.), Monica Plisch (APS) and Gordon Ramsey (AAPT). In the first round of deliberations, each member of the committee was provided with a list of all FEd members and asked to propose potential candidates for each position. In the second round, this list was filtered down to at least six top choices for each position and a rank ordering was determined, keeping in mind diversity of demographics, institution type, career stage, and focus of educational interests. In August, the Vice Chair contacted the persons named in order to identify those willing to stand for election, completing the slate by the end of September. The ballots for the FEd Executive Committee election will be available in mid-October and voting will close in mid-November. The results of the election will be announced shortly thereafter.

Forum on Education Sessions at the Upcoming 2019 March and April Meetings

Laurie McNeil, Chair Elect – Forum on Education, University of North Carolina, Chapel Hill

The Forum on Education Program Committee has completed its work selecting the sessions for the March Meeting (March 4-8, 2019, in Boston, MA) and the April Meeting (April 13-16, 2019, in Denver, CO). The Chair Elect of the Forum on Education is the chair of the Program Committee. The Program Committee has developed two great sets of sessions that should be of interest to a broad audience at each meeting.

As Program Chair, I would like to thank the committee for all their hard work putting together these sessions. This year’s Program Committee included Forum on Education Executive Committee members Chuhee Kwon (for the second year in a row!) and Beth Lindsay, Gordon Ramsey representing the American Association of Physics Teachers (AAPT), Don Lincoln representing the Forum for Outreach and Engaging the Public, and Monica Plisch representing APS. Other members included Barbara Whitten, Jennifer Blue, Peggy Cebe, and Tim McKay. The committee worked closely with Larry Engelhardt and Dimitri Dounas-Frazer of AAPT and with Paula Heron from the Topical Group on Physics Education Research to co-organize sessions at the April meeting. Informal invitations have gone out to the speakers who will soon receive a formal invitation from the APS, so a speaker list cannot be announced at this time. However, session titles can be announced.

APS March Meeting, March 4-8, 2019 in Boston, MA
Session 1: Jonathan F. Reichert and Barbara Wolff-Reichert Award for Excellence in Advanced Laboratory Instruction. This session will feature the Reichert Award recipient and other speakers discussing how to incorporate state-of-the-art research into advanced laboratory instruction. The session is co-sponsored by the Division of Condensed Matter Physics.

Session 2: Creating Inclusive Environments in Which to Work and Learn (co-sponsored by the Committee on the Status of Women in Physics). This session will feature speakers discussing the recent report on sexual harassment issued by the National Academies, effective practices for inclusion in undergraduate education and industry, and creating environments where women of color can thrive.

Session 3: Live Long and Prosper as a Physicist, Innovator and Entrepreneur. This session on entrepreneurship education in physics, co-sponsored by the Forum on Industrial and Applied Physics, will feature talks about the APS-PIPELINE project, financing and intellectual property for physicist entrepreneurs, and education for entrepreneurship.

Session 4: Launching a Successful Career as a Physicist will feature speakers in data science, patent law, physics teaching and finance. It is co-sponsored by the Forum on Graduate Student Affairs and is particularly appropriate for the many student (graduate and undergraduate) attendees at the meeting.

Session 5: Life, the Universe, and Everything: Teaching Biology to Physicists and Physics to Biologists. This session addresses interdisciplinary education at the boundary between physics and biology, at the undergraduate and graduate levels. It will feature speakers from Princeton’s Integrated Science Curriculum, Univ. of California – San Diego’s Quantitative Biology program, and the biophysics program in the physics department at Georgetown. Also speaking will be an author of biophysics textbooks and an architect of the Living Physics Portal of biophysics instructional materials.

Session 6: The Role of Physics Departments in Educating Teachers. The critical role that physics departments can play in alleviating the severe shortage of qualified high school physics teachers will be addressed by speakers in this session. It will include an overview of the problem and the myths that physics students might believe about the teaching career path as well as examples of successful teacher preparation programs and what contributes to their success.

In addition to these invited sessions there will also be a Focus Session co-organized with the Division of Computational Physics on Education and Modern Computation. Forum members (and others) are encouraged to contribute abstracts for talks to be presented in this session.

APS April Meeting, April 13-16, 2019 in Denver, CO
Session 1: Forum on Education Excellence in Physics Education Award. This session will open with a presentation by the award recipient and include other talks related to the work for which the award is presented.

Session 2: Critical examination of the relationships among reasoning, intuition, and conceptual understanding in physics. This session, which is co-sponsored by the Topical Group on Physics Education Research, will explore the interplay of “thinking” and “feeling” in conceptual understanding of physics and how students develop intuition about physics.

Session 3: Calling all physics teachers: It is time to integrate computation into your courses! AAPT co-organized this session with the Forum on Education, and it is co-sponsored by the APS Division of Computational Physics. Speakers will explore the need for computation in physics courses as well as examples of ways in which this can be done.

Session 4: Engaging Students in Authentic Experimentation During Laboratory Courses. This is the second session co-organized with AAPT for the April meeting, and it will focus on examples of project-based learning in laboratory courses and what physics education research has to say about it.

Session 5: Teaching Energy in the 21st Century. This session will include presentations on how the teaching of energy in physics can impact learning in chemistry, pedagogical content knowledge for teaching energy, and ontologies for energy.

Session 6: Stereotype Threat: What It Is and What to Do About It. Speakers in this session will explore how stereotype threat (the predicament in which people feel themselves to be at risk of conforming to stereotypes about their social group) affects education in physics.

We hope that members of the Forum who attend the March and April APS meetings will come to hear the many excellent invited speakers in these exciting sessions. We hope to see you there!

FFPER: Puget Sound 2018

Joss Ives, University of British Columbia, Vancouver, Andrew Boudreaux, Western Washington University

The fourth Foundations and Frontiers in Physics Education Research: Puget Sound conference welcomed more than 40 PER practitioners and consumers to the North Cascades Institute’s Environmental Learning Center for four days this past June. This residential conference was modeled after the ongoing FFPER meetings in Bar Harbor, ME. Five plenary talks, three workshops and two poster sessions provided the jumping-off points for discussions that developed during unstructured time, where people participated in activities such as afternoon hikes through the surrounding rainforest and evening star-gazing. Most participants came from British Columbia, California, Oregon, and Washington state, with a few travelers from as far away as Hawaii and the east coast of the U.S. The group included graduate students, high school teachers, post-docs, and faculty from 2-yr and 4-yr colleges and universities, as well as one undergraduate student.

To encourage presenters to share their most current ideas, the conference is “off the record.” However, this edition of the APS FEd newsletter shares highlights from the featured presentations. The list below summarizes all of the plenary talks and workshops that were presented at the conference. The set of short articles that follows describes some of these in more detail.

  • Joel Corbo (University of Colorado, Boulder) discussed the need for cultural change in higher education and how to think about one's work in terms of principles and commitments.
  • Leslie Atkins Elliott (Boise State University) showed that when students draw on the rich contexts of their lives as they develop and critique scientific ideas, transfer to their out-of-class lives is much more prevalent.
  • Elizabeth Gire (Oregon State University) shared her work studying the development of physics sensemaking practices in a new, sophomore-level mechanics course that helps physics majors refine and routinize physics sensemaking strategies.
  • Enrique Suarez (University of Washington, Seattle) explored the idea that learners and scientists leverage a wide variety of communicative resources when sharing their observations and reasoning, such as gestures and multiple languages. Constraining what counts as acceptable communication to academic English can leave out a host of productive communicative resources and create inequitable learning environments, especially for emerging bilingual students.
  • Ben Van Dusen (California State University, Chico) encouraged our field to modernize its data collection and analysis methods to generate less biased and more generalizable findings.
  • James Day (University of British Columbia, Vancouver), Paula Heron (University of Washington, Seattle) and Jayson Nissen (California State University, Chico) facilitated a workshop in which participants explored the use and misuse of statistics in science, with a focus on statistical significance versus educational significance.
  • Jared Stang and Joss Ives (University of British Columbia, Vancouver) facilitated a workshop exploring the rationales, implementation choices, and best-practices related to adding a group-phase to an exam.
  • Regina Barber-DeGraaff and Robin Kodner (Western Washington University) facilitated a workshop, titled "Cultural awareness of self in STEM," in which participants explored issues of identity and privilege.

We look forward to welcoming returning participants and newcomers to the next offering of FFPERPS, planned for June 2020.

Joss Ives is a senior instructor at the University of British Columbia. He has participated in all four of the FFPER-Puget Sound conferences offered to date, but this was the first time he helped with the organizing process.

Andrew Boudreaux is an associate professor at Western Washington University. He has participated in and helped organize all four of the FFPER-Puget Sound conferences offered to date.

Focusing on Principles and Commitments at FFPER: Puget Sound 2018

Joel C. Corbo, University of Colorado Boulder

At the 2018 FFPER conference, I gave a talk entitled “Envisioning a Better Academia: Principles and Commitments.” This was a very different kind of talk from any I had given before (not least because it was my first plenary). I was hoping to embrace what I understood to be the ethos of FFPER by presenting “outside the box” material and leaving the other attendees with a “call to action.” Thus, my talk focused on ways in which different principles and commitments have been woven into my work and presented alternatives to the current culture of academia through several examples.

The direction that my work has taken has been shaped by my background and experiences. I’m a child of an immigrant father and a Puerto Rican mother, a first-generation college student, and an out gay person. On the other hand, I’m male, cis-gendered, and able-bodied. Thus, some components of my identity align with traditional models of success in STEM, while others make my success a statistical outlier. Additionally, my higher education trajectory was a rocky one. As an undergraduate, I was part of a strong living group community and had supportive physics peers, but I also had a lab instructor who said things like “the theme this semester is we give you enough rope to hang yourselves.” As a graduate student, I dealt with poor instructors, weak community, and traumatic advising, which led to imposter syndrome, anxiety, depression, and graduation in 9.5 years. I developed a growing identity as a teacher, which led to alienation from the dominant R1 physics culture. Fortunately, I also connected with a group of like-minded fellow students with whom I created the Compass Project. Compass provided me with a community of people who cared about education and equity, a space where I felt valued and where I could make an impact, an alternative model of organization and leadership, and the opportunity to recognize that the best way to “serve” a group is to work in partnership with them. Quite simply, Compass is the reason I stayed in grad school.

All of these experiences have shaped my fundamental assumptions about higher education:

  • The current culture of higher education, especially at R1 institutions, is toxic. It damages virtually everyone who interacts with it (especially traditionally marginalized folks).
  • While most people have positive intentions, they will nevertheless act to uphold the current system unless they learn to do otherwise.
  • True change cannot be done to or for others. Instead, it must be done in partnership with others.
  • Change is not going to come about by “disseminating” best practices or by decree from above. If you want to see change, you have to roll up your sleeves and do the hard work to make change happen.

These assumptions, in turn, drive my work to change the culture of higher education.

One of my main current endeavors is the Departmental Action Team (DAT) project, which I work on with a large, collaborative team. A DAT is a departmentally-based working group of 6 to 8 faculty, staff, and/or students with two goals: (1) to create change around a broad-scale undergraduate education issue by shifting departmental structures and culture and (2) to help DAT participants become change agents through developing facilitation and leadership skills. The DAT is supported by external facilitators from our project team and works over an extended time. We take a broad view of what “undergraduate education” means—anything from aligning learning goals across the major and assessing disciplinary skills to improving student advising structures and increasing the inclusion of underrepresented folks in the major. In effect, we support departments in becoming better versions of themselves, based on what they want to improve.

The DAT model is built on a foundation of six core principles, which serve both as design principles in the development of the DAT model and as target cultural characteristics of the DATs-as-enacted. Our project team has iterated on our understanding of these principles and is nearly ready to externalize them. As a preview, I’ll briefly discuss one of our principles: “Students are partners.” Full embodiment of this principle involves recognizing and acting on students' unique expertise; their evolving, multifaceted nature as a group; and their ability to successfully share power and participate in decision-making. Moreover, students have to see themselves as partners. This principle is important for creating change because students are best positioned to understand their own cultural backgrounds, experiences, and histories, and therefore can best understand how a change will impact them.1 Unfortunately, institutional power structures typically exclude students, and student-faculty partnerships require considerable hard work to enact ethically and thoughtfully.2,3

One way we enact this principle is by incentivizing student members on all of our DATs. To support their investment of time and acknowledge the value they bring, we provide stipends to all student DAT members. We also actively structure our facilitation to level the playing field between student and non-student DAT members as much as possible by, e.g., revoicing and affirming student ideas and ensuring equitable distribution of work and decision-making among DAT members. One undergraduate DAT member summed up his experience during a focus group:

“I definitely feel much more empowered being part of this to know that even as an undergrad that my voice is represented in the department. That's huge. It makes me feel like I want to get up, I want to get off the couch, I want to do these activities, plan, organize, execute. And really, you know, maybe undergrads have a lot more energy, they haven't beaten it out of us yet, but I think we are kind of an untapped potential resource, that it's at least good to have open communication between all these levels.”

For my FFPER talk, I created a list of “professional commitments” that I strive to embody in the work I do and how I do it:

  • Do work that will make the world a better place.
  • Work with communities of people who share my values.
  • Make sure that my research is driven by practice and my practice is driven by research.
  • Eliminate the distinctions between the actor (researcher, teacher, changer) and those who are “acted upon.”
  • Acknowledge the oppressive ways I behave, accept when others point them out, and do the hard work to unlearn them.

These are aspirations, and while I hope they are visible in the DAT project and my other work (e.g., the Access Network), I always have room to grow (as we all do). Thus, I end with my charge to you: take the time to articulate your professional commitments and the principles that you want to underlie your work, strive to enact them, find people to hold you accountable, and listen to them when they do. When you (inevitably) fall short, don’t be too hard on yourself, learn from the experience, and keep moving forward.

Joel Corbo is a Senior Research Associate at the University of Colorado Boulder. His work focuses on improving undergraduate education through institutional and cultural change.

Endnotes

  1. K. Tobin, “A Systematic Literature Review of Students as Partners in Higher Education.” Teach. Educ. 17, 133 (2006).
  2. L. Mercer-Mapstone, S. L. Dvorakova, K. Matthews, S. Abbot, B. Cheng, P. Felten, K. Knorr, E. Marquis, R. Shammas, and K. Swaim, “A Systematic Literature Review of Students as Partners in Higher Education.” Int. J. Stud. Partn. 1 (2017).
  3. K. E. Matthews, “Five Propositions for Genuine Students as Partners Practice.” Int. J. Stud. Partn. 1 (2017).

The Transfer of Physics to Everyday Life

Leslie Atkins Elliott, Boise State University

As part of an ongoing study of transfer, I administered a survey1 that examines the prevalence of a particular kind of transfer, addressing whether or not students notice, value and use ideas from physics (in this case, optics) in their everyday lives. When told the premise of this survey, students in a traditional introductory physics course for science and engineering majors laughed - in every section across two universities. In one section a student joked with his lab group, imagining a hypothetical moment of transfer and saying, with mad-scientist prosody, “yeah - the air drag caused by me riding my bicycle causes my beard hairs to deflect 13 degrees towards my neck.” And in response to the question asking whether or not students think of concepts from class when they see everyday objects, such as eyeglasses and television screens, only one of the 55 students surveyed “strongly agreed;” nine “agreed.” The other 45 students, then, reported that classroom instruction on optics had not influenced how they view everyday objects that exploit these principles.

In contrast, when students in a science course for elementary education majors, Scientific Inquiry (Atkins & Salter, 2015),2 were given the survey, not one student strongly disagreed: 13 of 25 strongly agreed with that statement, and 9 more “agreed.” One student, Maddy, offered the following example:

“Right now, our group is working on the idea of how glasses and contacts change the shape of your cornea to balance out a person's misshapen cornea. We thought we could explain it by explaining that people with near sighted vision need glasses with thicker glass on the sides and that people with far sighted vision need glasses with thicker glass in the center. However, we … didn’t know what far-sighted glasses looked like. When I was at Walgreens the other day, I saw some reading glasses and decided to investigate. And sure enough, the glasses were thicker in the center and as the intensity of the prescription increased, so did the thickness of the center. I was so proud of our group to turn out correct!”

Students’ quantitative and qualitative responses across the set of survey questions indicate that students in the Scientific Inquiry course have significantly different out-of-class experiences related to the content of the course than those in the traditional physics course, as seen in Figure 1.

Figure 1. Student responses to the Transformative Experiences Survey in Optics. All bars are length 1 and shading represents the fraction of students who answered with that response. A bar entirely above 0, then, indicates all students “agreed” or “strongly agreed” with the question.

This talk offered an explanation for why two courses, each meeting 5 - 6 hours a week and covering topics in geometric optics, would have such different outcomes - not with respect to the traditional metrics of learning (e.g., concept inventories), but with respect to the transfer of those ideas to everyday life. Rather than attending to the skills, knowledge, or traits of individuals, as is common in the literature on transfer and the related construct of transformative experience (TE),1 I analyze how the classroom activity in the inquiry-rich course facilitates this transfer. In particular, I argue, the ways in which students themselves leveraged out-of-class contexts to develop and vet scientific ideas can - at least in part - explain why the class has such high TE. Below I briefly describe these contexts and the ways in which they are leveraged in class.

These contexts, taken from work by Barnett & Ceci on contexts of transfer,3 are described in Table I along with examples taken from the Scientific Inquiry class.

Table I. Types of context and examples:

  • Knowledge Domain: Ideas from another knowledge domain are positioned as relevant in class. Knowledge from bartending is used to justify a three-color model for colors. 
  • Physical Context: Physical objects/spaces that are not typically part of class are positioned as relevant. A student brings in a Jell-O cup to view the path laser light makes when entering a lens.
  • Temporal Context: Prior ideas are held accountable to current and future knowledge. A model of reflection students had established over several days is refuted and discarded for a new model.
  • Functional Context: Ideas and objects used to perform x are now used to perform y. A hot playground slide justifies the idea that light is absorbed, and not just reflected in mirrors. 
  • Social Context: Relationships, identities, and roles that are not usually relevant in class are invoked as relevant to developing scientific ideas. A student notes he does not understand this until he can explain it to his wife, positioning her as an audience for our new ideas.
  • Modality (Multimodal): An idea is expressed using multiple modes. Diffuse reflection is drawn as a "shattered" ray, demonstrated with a flashlight, compared to a cartoon, and represented audibly (a vibrato "aah").

Broadly speaking, the talk and its related research argue for the following: (1) that we expand our assessments of physics education to examine the role that physics education plays in the lives of students; and (2) that, in doing so, we recognize the increased importance of certain features of classroom activity - in particular, students’ agency and the idiosyncratic ways in which students draw on their own backgrounds and resources.

Leslie Atkins Elliott is an Associate Professor of Curriculum, Instruction and Foundational Studies at Boise State University, specializing in Science Education. Her research focuses on fostering participation in the practices of science - particularly writing and design - and how science instruction can reduce barriers between classrooms and everyday life.

Endnotes

  1. K. J. Pugh, L. Linnenbrink‐Garcia, K. L. Koskey, V. C. Stewart, and C. Manzey, “Motivation, learning, and transformative experience: A study of deep engagement in science,” Science Education, 94(1), 1-28, (2010).
  2. L. J. Atkins and I. Y. Salter, “Engaging future teachers in having wonderful ideas,” in Recruiting and Educating Future Physics Teachers: Case Studies and Effective Practices, edited by E. Brewe and C. Sandifer (APS, 2015).
  3. S. M. Barnett and S. J. Ceci, “When and where do we apply what we learn?: A taxonomy for far transfer,” Psychological Bulletin, 128(4), 612, (2002).

Making Sense of Physics Sensemaking

Elizabeth Gire, Oregon State University, Paul J. Emigh, Oregon State University, Kelby T. Hahn, Oregon State University, MacKenzie Lenz, Oregon State University

Physics instruction ought to reflect both the nature of physics and what we know about how people learn. Einstein described science as “the attempt to make the chaotic diversity of our sense-experience correspond to a logically uniform system of thought.”1 This definition highlights how science is about making connections: between what we experience every day and how we describe the world, and among the various elements of our descriptions themselves.

Our models of the world use a variety of representations of knowledge, including equations, graphs, diagrams, conceptual stories, and experiences, encompassing both real-world experiences and experimental observations. Our colleague Dr. Charles de Leone often says that “Physicists are representation junkies.”2 He is right—we try to use every tool at our disposal to make sense of the universe, always searching for new insights. Combining these perspectives, we might view physics sensemaking as seeking coherence between different representations of physics knowledge.

Solution Evaluation
Physics sensemaking may occur at many different stages of a physics problem: when initially identifying a problem, when orienting to a new problem, when stuck on a problem before a solution is reached, or when a solution is reached and you want to build confidence in your answer. Here, we focus on this last aspect of physics sensemaking: solution evaluation. By this, we mean examining an algebraic answer to a physics problem and probing whether it is reasonable. Students often have the luxury of being able to check against a solution manual or having their solution evaluated by a teacher. Research physicists have no solution manual: all we can do is make sure our solutions are consistent with what we already know and that they make predictions about the behavior of the universe that can be verified. Students must develop the skills and habits to evaluate their own solutions. Here we discuss two big categories of answer evaluation: examining “beasts” and answer contextualization/comparison (Figure 1).

Figure 1: Types of Solution Evaluation

What kind of a beast is it?
When we arrive at an answer to a physics problem, often we want to make sure it is the right type of mathematical and physical object. In classes at Oregon State, we ask students “What kind of a beast is it?” meaning “what kind of a mathematical/physical object is it?” We ask this question for two reasons. First, we want to make sure the answer is appropriate for the problem that is being solved. If you are solving for an angular momentum, you better have a vector with dimensions of angular momentum. Second, if we have an equation (which we usually do), we want to make sure that the equation is balanced: if you have a vector with dimensions of angular momentum on one side of an equals sign, you should have the same dimensions on the other side (similarly, you want every term in an equation to be consistent).
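
As a small illustration of this kind of check (a generic textbook example of ours, not one taken from the course), consider solving for the angular momentum of a point particle about the origin, L = r x p. Asking what kind of beast the answer is means confirming that both sides are vectors and that the dimensions balance:

  [\vec{L}] = [\vec{r}\,][\vec{p}\,] = \mathrm{m} \cdot \mathrm{kg\,m/s} = \mathrm{kg\,m^2/s},

which are indeed the dimensions of angular momentum. An answer that came out as a scalar, or with dimensions of \mathrm{kg\,m/s}, would fail this check before any numbers were plugged in.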

Answer Contextualization/Comparison
A second way to evaluate an answer is to try to understand the answer in context or compare the answer to something you already know. Strategies for this include examining parameter space (including checking special or limiting cases and examining the relationships between variables—see Figure 2), telling a conceptual story, checking against observations/intuition, and making sure a numerical answer has a reasonable magnitude.

Figure 2: The spectrum of strategies that includes exploring the parameter space of an algebraic solution illustrated through the example of the acceleration of an Atwood machine with a massive pulley.
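
To make these strategies concrete, here is a brief worked sketch based on the standard textbook result for the system named in Figure 2 (our illustration, not material taken from the course). For an Atwood machine with hanging masses m_1 and m_2 and a pulley of moment of inertia I and radius R (massless, non-slipping string), the acceleration is

  a = \frac{(m_1 - m_2)\,g}{m_1 + m_2 + I/R^2}.

Exploring the parameter space builds confidence in this answer: taking I \to 0 recovers the familiar massless-pulley result; setting m_1 = m_2 gives a = 0, as symmetry demands; letting m_1 \gg m_2 with I \to 0 gives a \to g, free fall; and since I/R^2 carries dimensions of mass, the expression also passes the dimensional check. Increasing I at fixed masses always decreases the acceleration, matching the intuition that a heavier pulley is harder to spin up.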

A Course in Physics Sensemaking
Recently, Oregon State University did a major revision of the physics major, including the addition of a course called Techniques in Theoretical Mechanics. This course is one of two new sophomore-level courses aimed at (1) easing the transition between introductory and upper-division physics courses and (2) bringing more of the “cool” physics earlier in the major to interest new (and perhaps underrepresented) populations of students.

In this mechanics course, physics sensemaking is on equal footing with the physics and math content of the course. Evaluative sensemaking strategies are discussed and practiced in class, on homework, and on exams. The course is generally modeled after a course on mathematical problem solving offered by Alan Schoenfeld at UC Berkeley in the 1980s.3 The design of the physics sensemaking course attends to four aspects of physics sensemaking: knowledge of sensemaking strategies (and physics content), metacognitive skills, productive beliefs about the nature of doing physics, and valuing physics sensemaking. Sensemaking strategies are named and described in class and on course assignments. Instructors use Schoenfeld’s metacognitive prompts (What are you doing? Why are you doing it? How will it help you?) to support routine self-monitoring.4 The course is pitched as professional development for future physicists; physics epistemology and professional sensemaking practices are discussed. Demonstrations of physics sensemaking are rewarded with grades on course assignments and praise in class discussions.

In terms of physics content (one cannot do physics sensemaking in the absence of physics!), the course begins with using Newton’s Laws to find equations of motion in situations where forces depend on velocity. Students learn to view Newton’s 2nd law as a differential equation of motion and to solve separable differential equations to find velocities and positions as functions of time. Students are introduced to hyperbolic trig functions in the context of quadratic drag forces, which become useful later in doing Lorentz transformations in special relativity. Students then learn Lagrangian and Hamiltonian approaches to finding and solving equations of motion for classical systems.5 Students learn to leverage their intuitions of classical systems to make sense of messy algebraic calculations. The course ends with special relativity taught with an emphasis on using spacetime diagrams as a bridge between conceptual, geometric, and algebraic modes of reasoning.6 Here, students learn to use physics sensemaking to develop and refine their intuitions about relativistic physics.
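
As a minimal example of the kind of calculation this entails (a standard linear-drag problem of our choosing, not an excerpt from the course materials), treating Newton’s second law as a separable differential equation for a particle subject only to a drag force F = -bv gives

  m\,\frac{dv}{dt} = -b\,v \quad\Longrightarrow\quad \int \frac{dv}{v} = -\frac{b}{m}\int dt \quad\Longrightarrow\quad v(t) = v_0\, e^{-bt/m}.

The result then invites exactly the evaluative moves described above: v(0) = v_0, v \to 0 as t \to \infty, and b/m has dimensions of inverse time, so the exponent is dimensionless.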

The hope is that doing physics sensemaking becomes a habitual part of solving physics problems for the students. To support this goal, we use a scaffolding and fading approach7 in which students are given explicit support at the beginning of the course that is gradually removed as the course proceeds. Students initially receive instructions for how and when to use specific physics sensemaking strategies. Sensemaking strategies are tagged on homework assignments in appropriate places. At the beginning of the term, in the context of solving a projectile motion problem, the class generates a list of physics sensemaking strategies that is made available as a resource for the remainder of the course. On later homework assignments, sensemaking prompts become less prescribed: “Use at least 3 strategies to make sense of your answer.” Eventually, sensemaking is mentioned as an expectation in the assignment instruction but not specifically prompted in any problem.

Preliminary Results
We are still in the early days of this project but we have some preliminary results to report. At the beginning of this course, students are familiar with many different strategies for making sense of physics problems,8 but often the implementation of these strategies needs support.9 For example, in examining a special case of an algebraic solution, most students can evaluate functions at special values and interpret the result, but some need to learn how to select a good special case to examine (from physical intuition or from a known result).10 Students may also not recognize a strategy that they can perform as being useful for physics sensemaking. For example, one of our students did not recognize examining the functional relationships between physical variables with a graph as a way of making sense of an answer to a physics problem.11 Overall, a variety of data sources - including pre/post-test data, in-class observations, analysis of homework and exams, interviews with individual students, an end-of-course survey, and anecdotal reports from our faculty colleagues - indicates that students generally become more familiar, more proficient, and more productive with strategies for making sense of symbolic answers to physics problems. Our next steps include following up with students to see how their experiences in the course have influenced their physics sensemaking practices in later courses and research experiences, characterizing and developing curriculum for physics sensemaking beyond solution evaluation, and developing assessments of physics sensemaking.

Elizabeth Gire is an assistant professor of physics at Oregon State University. She conducts research on the teaching and learning of physics, particularly physics sensemaking and representational fluency.

Paul J. Emigh is a Postdoctoral Scholar at Oregon State University. His research focuses on how students at all levels of physics understand and make sense of both physics concepts and the underlying mathematics.

Kelby T. Hahn is a doctoral student in STEM Education at Oregon State University. She works in Dr. Gire’s physics education research group studying physics sensemaking, specifically special-case analysis.

MacKenzie Lenz is a doctoral candidate in Physics at Oregon State University. She works in the Oregon State University Physics Education Research group studying student performance of and beliefs surrounding sensemaking at different levels of the undergraduate physics curriculum.

Endnotes

  1. Albert Einstein, “Considerations concerning the fundaments of theoretical physics.” Science, 91(2369), 487–492 (1940).
  2. Charles De Leone, private communication.
  3. Alan H. Schoenfeld, “Chapter 1 - A framework for the analysis of mathematical behavior,” in Mathematical Problem Solving Academic Press, pp. 11–45 (1985).
  4. Alan H. Schoenfeld, “Chapter 4 - Control,” in Mathematical Problem Solving, Academic Press, pp. 11–45 (1985).
  5. John R. Taylor, Classical Mechanics, University Science Books (2005).
  6. Tevian Dray, The Geometry of Special Relativity (Boca Raton, FL: A K Peters/CRC Press, 2012).
  7. Barak Rosenshine and Carla Meister, “The use of scaffolds for teaching higher-level cognitive strategies,” Educational Leadership, 49(7), 26–33 (1992).
  8. Kelby T. Hahn, Paul J. Emigh, MacKenzie Lenz, and Elizabeth Gire, “Student Sense-making on Homework in a Sophomore Mechanics Course,” 2017 PERC Proceedings [Cincinnati, OH, July 26-27], 160-163 (2017).
  9. MacKenzie Lenz, Kelby T. Hahn, Paul J. Emigh, and Elizabeth Gire, “Student perspective of and experience with sense-making: a case study,” 2017 PERC Proceedings [Cincinnati, OH, July 26-27], 240-243 (2017).
  10. Kelby T. Hahn. “Student evaluative sensemaking on homework in intermediate mechanics,” Master’s Thesis, Oregon State University, (2018).
  11. MacKenzie Lenz, Paul J. Emigh, and Elizabeth Gire, “Surprise! Students don’t do special-case analysis when unaware of it,” 2018 PERC Proceedings [Washington, DC, August 1-2], (accepted).

LASSO: A New Tool to Support Instructors and Researchers

Ben Van Dusen, California State University-Chico

We developed the Learning About STEM Student Outcomes (LASSO) online assessment platform to increase instructor use of research-based assessments (RBAs). LASSO does this by making it easy for instructors to collect and analyze high-quality evidence about student learning in their courses. Specifically, LASSO simplifies the process of administering, scoring, and analyzing RBAs and saves class time by automating the process online. Course results are anonymized and aggregated in the LASSO database to provide instructors with normative feedback about their student outcomes.

RBAs, such as the Force Concept Inventory, measure students’ knowledge of concepts or attitudes that are core to a discipline. The LASSO database offers researchers access to large-scale, multi-disciplinary, and longitudinal student- and course-level data. The database can save researchers significant time and allow them to investigate novel research questions that require large datasets.

In this article we will discuss:

  1. How LASSO supports instructors
  2. How LASSO supports researchers
  3. Research on collecting and analyzing data using LASSO.

LASSO Supports Instructors
To measure student changes in STEM courses, the LASSO platform hosts, administers, scores, and analyzes student pretest and posttest scores online. Figure 1 outlines the steps for instructors to use LASSO. The LASSO platform is hosted on the Learning Assistant (LA) Alliance website.1

Figure 1. Steps to assessing a course using the LASSO platform.

Instructors add new courses by answering a short series of questions about their course. Instructors then select assessments from the LASSO repository to administer to their students. As of the Fall ‘18 term, LASSO hosts sixteen research-based conceptual and attitudinal assessments across the STEM disciplines. Once instructors upload a course roster with emails and select a deadline for the pretest, they can launch the pretest. Each student receives an email with participation instructions including a personalized link to their online assessment. Students first choose whether they would like their answers to be anonymized and aggregated into the LASSO research database. They then complete a set of demographics questions and the RBA.

After students have completed their pretests, instructors can download a spreadsheet of their students’ raw and scored responses. They can use the student responses to inform teaching practices, such as identifying concepts the students are more knowledgeable about, identifying students who may need additional support, and creating student small groups.

During the final weeks of the course, instructors follow the same steps for launching and tracking their students’ progress on the posttest as they did on the pretest. Instructors can then download a spreadsheet with their students’ pretest and posttest responses as well as a final report. The spreadsheet supports faculty who wish to research their own course outcomes or upload their results to another data analysis system, such as Data Explorer. The final report is an assessment-specific PDF that provides instructors with an easy-to-understand analysis of their class’s performance.

LASSO Supports Research
The LASSO Platform aggregates and anonymizes the assessment data for researchers with IRB approval to use. Most students who take part in LASSO assessments (83%) agree to share their anonymized data with researchers. Besides providing researchers with information about student performance and demographics, the database also provides course-level information (e.g., goals of the course, how many times the instructor has taught the course before, and the class size). As of the Summer 2018 term, the LASSO research database has data from 32,728 students, in 618 courses, from 51 institutions (Table 1).

Table 1. Data within the LASSO researcher database by discipline as of the 2018 Fall term.

Discipline     Institutions   Instructors   Courses   Students
Physics        41             129           462       19,819
Astronomy      3              3             3         181
Mathematics    7              11            30        2,257
Chemistry      12             20            68        5,764
Biology        12             21            75        5,575

While all instructor features on LASSO are free, there are fees to access the LASSO database. The fees are small enough not to prevent researcher access to the database while providing funds to make the LASSO platform sustainable.

Research on LASSO
We developed LASSO to support educators and researchers in collecting high quality data using instruments and analyses with strong validity arguments. To support this goal, we investigated two research questions of interest to LASSO-using instructors and researchers:

  1. Are online assessments a good replacement for paper assessments?
  2. What are the best methods for handling missing data?

We also investigated a third research question specifically for researchers:

  3. What are the best methods for analyzing large-scale multi-level datasets?

Are online assessments a good replacement for paper assessments? Nissen et al.2 used a randomized between-groups experimental design to investigate whether LASSO-administered RBAs provided data equivalent to traditional in-class assessments for both student performance and participation. Analysis of 1,310 students in 3 college physics courses indicated that LASSO-based and in-class assessments provide equivalent participation rates when instructors used four recommended practices (shown in Figure 2):

  1. In-class reminder
  2. Multiple email reminders
  3. Credit for pretest participation
  4. Credit for posttest participation.

Figure 2. Participation rates on LASSO as instructors increased their use of the recommended practices (e.g., sending email reminders & offering credit) on computer-based tests (CBT) versus paper and pencil tests (PPT). When all 4 recommended practices were used, the participation rates were nearly identical.

Models of student performance indicated that tests administered with LASSO had scores equivalent to those of tests administered in class. This indicates that instructors can compare their data from LASSO to any prior data they may have collected and to the broader literature on student gains.

What are the best methods for handling missing data? Nissen et al.2 found that students with lower grades participated at lower rates than students with higher grades. These results indicated a bias toward high-performing students for RBAs collected in class or with LASSO. PER studies most commonly report using complete-case analysis (a.k.a. matched data), in which data are discarded for any student who does not complete both the pretest and posttest. Nissen, Donatello, and Van Dusen3 used simulated classroom data to measure the potential bias introduced by complete-case analysis and multiple imputation. Multiple imputation uses all of the available data to build statistical models, which allows it to account for patterns in the missing data. Results, shown in Figure 3, indicated that complete-case analysis introduced meaningfully more bias into the results than multiple imputation.
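
For readers who want to see the contrast concretely, the sketch below simulates a class in which lower-performing students skip the posttest more often and then compares a complete-case estimate of the mean posttest score with a multiply imputed one. It is only an illustration of the general idea: the column names and missingness pattern are hypothetical, and scikit-learn’s IterativeImputer stands in for the full multiple-imputation procedure used in the study.

import numpy as np
import pandas as pd
# IterativeImputer is experimental in scikit-learn and must be enabled explicitly
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
n = 500

# Hypothetical class: posttest depends on pretest, course grade tracks the posttest
pre = rng.normal(50, 15, n)
post = 20 + 0.7 * pre + rng.normal(0, 10, n)
grade = 0.05 * post + rng.normal(0, 1, n)

# Students with lower grades skip the posttest more often
p_missing = 1 / (1 + np.exp(2 * (grade - np.median(grade))))
post_observed = np.where(rng.random(n) < p_missing, np.nan, post)

df = pd.DataFrame({"pretest": pre, "grade": grade, "posttest": post_observed})

# Complete-case analysis: drop every student who is missing the posttest
complete_case_mean = df.dropna()["posttest"].mean()

# Multiple imputation: predict the missing posttests from pretest and grade,
# repeat with different random draws, and pool the estimates
imputed_means = []
for seed in range(20):
    imputer = IterativeImputer(sample_posterior=True, random_state=seed)
    completed = imputer.fit_transform(df)
    imputed_means.append(completed[:, df.columns.get_loc("posttest")].mean())

print(f"true mean posttest:       {post.mean():.1f}")
print(f"complete-case estimate:   {complete_case_mean:.1f}")  # biased toward high scorers
print(f"multiple-imputation pool: {np.mean(imputed_means):.1f}")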

Figure 3. Bias introduced into posttest scores for complete case analysis and multiple imputation.

What are the best methods for analyzing large-scale multi-level datasets? PER studies often use single-level regression models (e.g., linear and logistic regression) to analyze student outcomes. However, education datasets often have hierarchical structures, such as students nested within courses, that single-level models fail to account for. Multi-level models account for the structure of hierarchical datasets.

To illustrate the importance of performing a multi-level analysis of nested data, Van Dusen and Nissen4 analyzed a dataset with 112 introductory physics courses from the LASSO database using both multiple linear regression and hierarchical linear modeling. They developed models that examined student learning in classrooms that use traditional instruction, collaborative learning with LAs, and collaborative learning without LAs. The two models produced significantly different findings about the impact of courses that used collaborative learning without LAs, shown in Figure 4. This analysis illustrated that the use of multi-level models to analyze nested datasets can impact the findings and implications of studies in PER. They concluded that the DBER community should use multi-level models to analyze datasets with hierarchical structures.
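
To illustrate the modeling distinction, the sketch below fits the same simulated students-nested-in-courses data with a single-level regression and with a random-intercept multilevel model using statsmodels. The data, variable names, and effect sizes are invented for illustration; this is not the authors’ analysis code.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Simulated nested data: 40 hypothetical courses, each with its own baseline shift,
# half of them using collaborative learning
rows = []
for course in range(40):
    course_effect = rng.normal(0, 5)
    collab = course % 2
    for _ in range(int(rng.integers(20, 80))):
        pre = rng.normal(40, 12)
        post = 15 + 0.6 * pre + 4 * collab + course_effect + rng.normal(0, 8)
        rows.append({"course_id": course, "collab": collab,
                     "pretest": pre, "posttest": post})
df = pd.DataFrame(rows)

# Single-level model: treats all students as independent, ignoring the course structure
ols_fit = smf.ols("posttest ~ pretest + collab", data=df).fit()

# Multilevel model: adds a random intercept for each course
hlm_fit = smf.mixedlm("posttest ~ pretest + collab", data=df,
                      groups=df["course_id"]).fit()

# The point estimates can differ, and the single-level standard error for the
# course-level predictor is typically too small
print("OLS:", ols_fit.params["collab"], ols_fit.bse["collab"])
print("HLM:", hlm_fit.params["collab"], hlm_fit.bse["collab"])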

Figure 4. Predicted gains for average students across course contexts as predicted by: a) multiple linear regression and b) hierarchical linear modeling. Error bars are +/- 1 standard error.

Conclusion
The LASSO platform’s purpose is to support instructors in implementing research-based teaching practices in their courses, by providing them with simple, accurate, and reliable assessments, and to support research on STEM instruction. The LASSO platform makes it easy for instructors to assess their courses, supports instructors in interpreting the results from their assessments, and provides them with documentation summarizing their assessment results. Large-scale, multi-disciplinary data collection allows researchers to further our understanding of student learning in STEM.

Ben Van Dusen is an assistant professor in Science Education at Chico State and the director of the LASSO platform.

Endnotes

  1. Learning Assistant Alliance website.
  2. J. M. Nissen, M. Jariwala, E. W. Close, & B. Van Dusen, “Participation and performance on paper-and computer-based low-stakes assessments,” International Journal of STEM Education, 5(1), 21, (2018).
  3. J. Nissen, R. Donatello, & B. Van Dusen, “Missing data and bias in physics education research: A case for using multiple imputation,” Physical Review Physics Education Research (under review).
  4. B. Van Dusen and J. Nissen, “Modernizing PER's use of regression models: a review of hierarchical linear modeling,” Physical Review Physics Education Research (under review).

The Use of Data in Creating, Implementing, and Assessing Evidence-based Pedagogies

Jayson Nissen, California State University-Chico, James Day, University of British Columbia, Paula Heron, University of Washington

Physics education researchers often use statistics to develop insights into student learning and attitudes in physics courses, and as a guide in developing instructional strategies. Educators rely on knowledge of statistics to understand research articles, identify effective practices to use, and evaluate the effectiveness of their courses. While an education in physics provides robust opportunities to develop mathematical efficacy, it rarely develops the specialized knowledge necessary to conduct or consume the wide range of statistics used in education research.

To help educators and researchers extend their statistical literacy, we ran a workshop on issues surrounding p-values. “A p-value measures whether an observed result can be attributed to chance. But it cannot answer the researcher's real question: what are the odds that it is correct?”1 We focused on p-values because researchers commonly misuse and misinterpret them.2

To address the misuse and misinterpretation of p-values, the American Statistical Association (ASA) released a statement that included six principles for using p-values; our workshop addressed four of these.

We first looked at choices that make p-values useful (or not), addressing principles three and four from the ASA’s statement: (3) scientific conclusions and business or policy decisions should not be based only on whether a p-value passes a specific threshold, and (4) proper inference requires full reporting and transparency.

We asked participants to read “False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant”3 before the workshop. The article presents two studies with statistically significant findings that listening to a children’s song makes people feel older and that listening to a song about old age makes people actually younger. The authors conclude that the first result was unlikely and the second was necessarily false. Using simulations, the authors show that researchers’ choices make it likely that false positives are common in the research literature. The article closes by providing guidelines for authors and reviewers to minimize the likelihood of reporting false-positive results.

After discussing the article, participants explored the effects of p-hacking using the Hack Your Way to Scientific Glory tool. The tool models the statistical relationship between the political affiliation of elected officials and economic outcomes. Depending on which variables were chosen for the analysis, participants could demonstrate that either Republican or Democrat affiliation had a statistically significant correlation with either higher or lower economic outcomes. The article and the p-hacking activity illustrate the broad set of choices that researchers face in using statistics and highlight how some choices can lead to unreliable results.
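
A short simulation captures the core point of both the article and the tool: if a researcher measures several unrelated outcomes and reports whichever comparison happens to cross p < 0.05, the false-positive rate climbs far above the nominal 5%. The sketch below (our own illustration, not a workshop handout) draws both groups from the same population, so any “significant” difference it finds is a false positive.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_simulations, n_per_group, n_outcomes = 5000, 20, 5

false_positives = 0
for _ in range(n_simulations):
    # Two groups drawn from the SAME population, measured on several unrelated outcomes
    group_a = rng.normal(0, 1, (n_outcomes, n_per_group))
    group_b = rng.normal(0, 1, (n_outcomes, n_per_group))
    # "Flexible" analysis: test every outcome and keep the smallest p-value
    p_values = [stats.ttest_ind(a, b).pvalue for a, b in zip(group_a, group_b)]
    if min(p_values) < 0.05:
        false_positives += 1

# With 5 independent outcomes, roughly 1 - 0.95**5, i.e. about 23%, not 5%
print(f"false-positive rate: {false_positives / n_simulations:.2f}")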

Next we looked at the additional information needed to interpret a p-value, addressing principles five and six from the ASA’s statement: (5) a p-value, or statistical significance, does not measure the size of an effect or the importance of a result, and (6) by itself, a p-value does not provide a good measure of evidence regarding a model or hypothesis.

To address the need for additional measures to inform conclusions drawn from statistical tests, participants used four contrasting cases of pretest and posttest data to explore the relationships between effect size, statistical significance, and sample size. The effect size was calculated using Cohen’s d, the difference in the means divided by the pooled standard deviation, with a Hedges’ correction for small sample sizes. The p-value was calculated with a matched-samples, two-tailed t-test. Figure 1 contains four bar graphs with error bars showing one standard error. The columns in Figure 1 have different sample sizes and the rows have different effect sizes. Figure 1.B reports a large effect size, d = 0.9, that was not statistically significant, while Figure 1.C reports a small effect size, d = 0.4, that was statistically significant. Together, 1.B and 1.C illustrate how a p-value alone cannot determine the educational significance of a result.

Figure 1. Bar plots comparing the pretest and posttest scores in four courses. The sample size (N) was 100 in the left column and 6 in the right column. The effect size was large in the top row and small in the bottom row. Comparing B and C shows that a large effect may lack statistical significance, while a small effect may have statistical significance.
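
For readers who want to reproduce numbers like those behind Figure 1, here is a minimal sketch of the calculation described above, using made-up pretest and posttest scores rather than the workshop’s data. It computes Cohen’s d as the difference in means over the pooled standard deviation, applies the common small-sample (Hedges’) correction g = d(1 − 3/(4(n1 + n2) − 9)), and runs a matched-samples, two-tailed t-test; the Python/SciPy tooling is an illustrative choice, not something specified in the workshop.

  import numpy as np
  from scipy import stats

  # Hypothetical scores for six students (not the workshop's data).
  pre = np.array([45, 52, 38, 60, 47, 55], dtype=float)
  post = np.array([58, 61, 49, 72, 50, 66], dtype=float)

  # Cohen's d: difference in means divided by the pooled standard deviation.
  n1, n2 = len(pre), len(post)
  pooled_sd = np.sqrt(((n1 - 1) * pre.var(ddof=1) + (n2 - 1) * post.var(ddof=1))
                      / (n1 + n2 - 2))
  cohens_d = (post.mean() - pre.mean()) / pooled_sd

  # Common small-sample (Hedges') correction.
  hedges_g = cohens_d * (1 - 3 / (4 * (n1 + n2) - 9))

  # Matched-samples, two-tailed t-test (each student appears in both columns).
  t_stat, p_value = stats.ttest_rel(post, pre)

  print(f"d = {cohens_d:.2f}, g = {hedges_g:.2f}, p = {p_value:.3f}")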

While developing the workshop, we collected resources on quantitative methods. These resources are publicly available and are organized into six topics:

  1. Popular media, which covers blogs, books, and podcasts that discuss various issues of statistical analysis.
  2. Broad resources for conducting statistical analyses.
  3. Resources on p-values and effect sizes.
  4. Data visualizations.
  5. Resources for preregistering studies.
  6. Other fundamentals.

We invite readers to review and comment on the materials and to recommend new materials.

This workshop is part of a collaborative effort to support new and emerging quantitative researchers in discipline-based education research in developing their statistical literacy. Following the workshop at Foundations and Frontiers in Physics Education Research – Puget Sound, a similar workshop ran at the 2018 Physics Education Research Conference. The team is working on a proposal for a workshop at the 2019 National Association for Research in Science Teaching meeting and will run a workshop at the 2019 American Association of Physics Teachers Summer Meeting.

Jayson Nissen is a Postdoctoral Researcher in the Department of Science Education at California State University - Chico.

James Day is a Research Associate for the Stewart Blusson Quantum Matter Institute at the University of British Columbia.

Paula Heron is a professor of physics at the University of Washington, where she is a member of the Physics Education Group.

Endnotes

  1. R. Nuzzo, “Scientific method: statistical errors,” Nature News, 506(7487), 150 (2014).
  2. R. L. Wasserstein & N. A. Lazar, “The ASA’s statement on p-values: context, process, and purpose,” The American Statistician, 70(2), 129-133 (2016).
  3. J. P. Simmons, L. D. Nelson, & U. Simonsohn, “False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant,” Psychological Science, 22(11), 1359-1366 (2011).

Four-way High Fives During Exams: Adding a Group Phase to Provide Immediate Feedback and Increase Enjoyment

Jared Stang, University of British Columbia, Vancouver, Joss Ives, University of British Columbia, Vancouver

Active learning strategies, such as peer instruction and collaborative group work, are important components of many contemporary physics classrooms.1 A key part of the efficacy of these teaching techniques comes from increased student access to feedback—“the most powerful single influence” on student achievement.2 For maximum impact, feedback should focus on performance and learning, address small chunks of material, be timely, and match the purpose of the assessment.2

The two-phase collaborative group exam is an active learning strategy that provides students with an opportunity for feedback in situations that typically lack timely feedback. In a two-phase exam, students first complete the exam individually—the solo phase—and then form groups to complete the same or similar questions in the group phase. Students receive fine-grained and responsive feedback directly matched to the assessment, immediately, from their peers, while they still care about it. This innovation can be effective with many types of low- or high-stakes assessments, such as quizzes, midterms, or final exams.

During the group phase, students are animated, enthused, and often smiling, and the room is loud. Students often leave the test with positive body language. In fact, in our experience group exams are the course activity with the highest level of student engagement. This feedback and strong engagement translate into learning. The average for the group phase typically exceeds the average for the solo phase by 15-20%, indicating that on average students discuss or see more correct understanding than they brought to the solo phase. Studies on retention of this learning3,4 have shown a statistically significant increase in retention when students completed a group phase.

Our workshop explored several practical aspects of group exam implementations. Participants first identified characteristics of questions that may facilitate learning in the group phase: those with salient conceptual pieces rather than procedural calculations, those with a high ratio of sense-making to answer-making, and those which may necessitate input from a diverse range of perspectives. We identified algebra-heavy, traditional "plug-and-chug" style problems as less effective for a group phase. A fundamental design principle is to maximize feedback opportunities by maximizing group conversations.

Next, we had the participants consider some aspects of group exam design and then shared our recommended implementation for first-time users: Start with a low-stakes assessment, provide 10 minutes of group phase time for every 20 minutes of solo phase time, and place a much higher grading weight on the solo exam (e.g., a weighting of 85% solo and 15% group or similar). As with all active learning strategies, implementation should include telling and showing the students why you chose the activity and making sure the students know the logistics of the activity. Some participants articulated a desire to present groups with more difficult or synthesizing problems. While this is a viable group assessment strategy, given our primary framing of the two-phase exam as a feedback activity, we default to using the same or very similar problems for the group phase, sometimes edited to be more discussion friendly.
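
As a minimal sketch of the recommended weighting (the scores below are hypothetical, and the optional rule that the combined grade never falls below the solo score is a common practice added here as an assumption, not a recommendation from the workshop), a combined two-phase mark might be computed as follows.

  def two_phase_grade(solo, group, w_solo=0.85, w_group=0.15, floor_at_solo=True):
      """Combine solo- and group-phase scores (each on a 0-100 scale)."""
      combined = w_solo * solo + w_group * group
      # Optional safeguard (an assumption here): the group phase can only help, never hurt.
      return max(combined, solo) if floor_at_solo else combined

  # Example: 70 on the solo phase and 90 on the group phase gives 73.0.
  print(two_phase_grade(70, 90))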

Group formation raises further implementation choices. With respect to group size, workshop participants noted possible drawbacks of larger groups: “too many cooks in the kitchen” and, for some members, reduced participation and perhaps marginalization, consistent with our own observations. Based on group-phase performance results collected over the past few years in our courses, we recommend groups of three or four.

In courses where students work within assigned groups for extended periods of time, it works well to maintain those groups for the group exam. However, many courses will require that ad-hoc groups are formed for the group exam, and the choice between instructor-formed and student-formed groups—an open question in the two-phase exam literature—will need to be made. Instructors choosing to form groups themselves should be careful to avoid isolating female or minority students, advice we extend from Heller and Hollabaugh’s observations5 that group dysfunction is higher in groups with isolated females. We tend to let the students choose their own groups, but recommend that instructors offer to facilitate for those students who find it challenging to form a group. The literature provides some support for student-formed groups, with female students seeing more value in group work6 and groups engaging in more productive scientific behaviours when friends work together.7

We closed the workshop by sharing comments from a recent student survey we ran after a sophomore chemistry midterm. Responses to a prompt asking for advice to help future students get the most out of their group exam experience included themes of reaching consensus (“Discuss each answer in depth, to make sure all group members understand why they reached that decision”; roughly 40% of comments), speaking up and sharing (“Don’t be afraid to share contrasting opinions or bring up new possibilities, that’s what makes group exams beneficial!”; roughly 25% of comments), listening and respecting (“Listen to and respect everyone’s opinions, even if you don’t agree with them”; roughly 15% of comments), and knowing your group beforehand (“Get to know your group members before the exam”; roughly 15% of comments).

Overall, two-phase group exams are a low-barrier, easy-to-implement way to incorporate active learning and feedback into a traditional summative assessment. Furthermore, they are overwhelmingly well-received by students: In surveys, we typically see more than 95% of students recommend continued use of two-phase exams for their midterms, matching or exceeding previously reported results.8 We love these exams, and suspect that you might too.

For more information, please see our workshop resources online or contact us by email: 

  • Jared Stang is a lecturer at the University of British Columbia.
  • Joss Ives is a senior instructor at the University of British Columbia.

References

  1. S. Freeman, S. L. Eddy, M. McDonough, M. K. Smith, N. Okoroafor, H. Jordt, & M. P. Wenderoth, “Active learning increases student performance in science, engineering, and mathematics,” Proceedings of the National Academy of Sciences, 111(23), 8410-8415 (2014).
  2. G. Gibbs & C. Simpson, “Conditions under which assessment supports students’ learning,” Learning and Teaching in Higher Education, (1), 3-31 (2005).
  3. B. H. Gilley & B. Clarkston, “Collaborative testing: Evidence of learning in a controlled in-class study of undergraduate students,” Journal of College Science Teaching, 43(3), 83-91 (2014).
  4. J. Ives, “Measuring the Learning from Two-Stage Collaborative Group Exams,” 2014 PERC Proceedings [Minneapolis, MN, July 30-31, 2014], edited by P. V. Engelhardt, A. D. Churukian, and D. L. Jones, doi:10.1119/perc.2014.pr.027.
  5. P. Heller & M. Hollabaugh, “Teaching problem solving through cooperative grouping. Part 2: Designing problems and structuring groups,” American Journal of Physics, 60(7), 637-644 (1992).
  6. S. L. Eddy, S. E. Brownell, P. Thummaphan, M. C. Lan, & M. P. Wenderoth, “Caution, student experience may vary: Social identities impact a student’s experience in peer discussions,” CBE Life Sciences Education, 14(4), 1–17 (2015). http://doi.org/10.1187/cbe.15-05-0108
  7. M. Azmitia & R. Montgomery, “Friendship, transactive dialogues, and the development of scientific reasoning,” Social Development, 2(3), 202–221 (1993). http://doi.org/10.1111/j.1467-9507.1993.tb00014.x
  8. G. W. Rieger & C. E. Heiner, “Examinations that support collaborative learning: The students' perspective,” Journal of College Science Teaching, 43(4), 41-47 (2014).

Browsing the Journals

Carl Mungan, United States Naval Academy

Alex Small has a practical review of lens aberrations for nonexperts on page 487 of the July 2018 issue of the American Journal of Physics. In the September issue, Seung Ki Baek models the breaking of a horizontal chain, anchored to a wall at one end and pulled at the other, as a function of the rate at which the force is applied; it is similar to the classic demonstration of pulling slowly or jerking the string connected to a hanging “inertia” ball. Finally, an article on page 733 of the October issue presents experiments and theory that help explain the flipping of a water bottle and of a plastic can of tennis balls in the air such that they land upright on a table.

Ker Liang Goh neatly explains how one can tell whether or not an extended body is in mechanical equilibrium when subject to three nonparallel but coplanar forces on page 384 of the September 2018 issue of The Physics Teacher.

Article 055101 in the September 2018 issue of the European Journal of Physics considers a “thermal bandage” toy model, a 2D generalization of a classic problem by Charles Kittel, that is useful as a student exercise in setting up and manipulating the partition function. Article 055203 in the same issue quantitatively analyzes a dc electric motor to find its angular speed and power as functions of time, including back emf and load.

Article 2301 in the June 2018 issue of the Latin-American Journal of Physics Education attempts to define the term “thermodynamics” at the beginning of an introductory course or unit on the subject. I agree a definition is tricky; mine is “The study of how materials change when energy (by heat, pressure, electromagnetic fields, mass transport, etc) is added to or removed from them.”

The September 2018 issue of Resonance has a five-page biography of Arthur Holly Compton, along with a reprint of the 1923 Physical Review paper on his eponymous scattering effect. 

A document camera, 3D glasses, and smartphone screen can be used to construct a polarimeter to analyze a sucrose solution, as explained on page 837 of the May 2018 issue of the Journal of Chemical Education. Page 1668 of the September issue presents a historical account of the name and ideas behind shot noise. 

Article 010144 in Physical Review Physics Education Research discusses undergraduate student beliefs about the curvature of the universe, such as that it must be spherical.

Web Watch

Carl Mungan, United States Naval Academy

Teacher Preparation Section

Alma Robinson, Virginia Tech

As pre-service teacher educators, it is incumbent on us not only to provide our future physics teachers with the training to become good teachers, but also to inform them of the ongoing resources they’ll need to grow into excellent ones. This issue of the Teacher Preparation Section highlights the American Association of Physics Teachers (AAPT) and the Supporting Teachers to Encourage the Pursuit of Undergraduate Physics for Women (STEP UP 4 Women) project.

Kelsey Sheridan describes the ways in which AAPT supports physics teacher preparation programs and physics teachers during all stages of their careers, from a scholarship for future physics teachers to curricular materials aligned with the Next Generation Science Standards. Through peer-reviewed journals, workshops, conferences, and a database of online resources, AAPT gives physics teachers the ability to participate in a rich community of physics educators.

Women make up only about 20% of undergraduate physics majors in the United States. Kathryne Sparks Woodle explains a new initiative from the American Physical Society (and partners, with funding from the National Science Foundation) that aims to bring that percentage up to 50% by working directly with high school physics teachers. This isn’t an intractable problem: if half of the high school physics teachers in the country recruited one new woman to major in physics each year, 50% of the incoming physics majors would be women. “The STEP UP 4 Women project has demonstrated that teaching two research-based lessons and implementing ‘Everyday Actions’ can make the difference.”

By sharing these resources with your future physics teachers, you can help them stay engaged with other physics teachers throughout their careers and make a more positive impact on their students. Please don’t assume that they will hear about these programs once they start their careers. Instead, inform them of these resources now, and they may become the teacher leaders who share these programs with their colleagues!

AAPT: Improving the Quality of Physics Education by Supporting the Recruitment and Development of Teachers

Kelsey Sheridan, American Association of Physics Teachers

The American Association of Physics Teachers (AAPT) is a membership organization dedicated to “enhancing the understanding and appreciation of physics through teaching.”1 AAPT members primarily teach at the high school, two-year college, or university level. Additionally, some of our members are physics education researchers who develop evidence-based resources to improve education at each of these levels. The strength of AAPT lies in the nexus of these communities, which together give teachers a comprehensive understanding of physics education. By informing your pre-service teachers about AAPT, you can introduce them to the many resources available to them both now and when they are in their own classrooms.

In order to fuel this vibrant community and fulfill its mission, AAPT takes a multipronged approach to recruiting, training, and supporting physics teachers throughout their careers. AAPT provides teacher recruitment tools to university physics departments that combat misconceptions about teaching, funds early-career teachers and physics majors who intend to teach in secondary schools, and coordinates teacher mentoring programs. AAPT also disseminates physics education research through its journals, The Physics Teacher and American Journal of Physics, and the PhysPort website. Finally, AAPT develops and publishes curricular resources and promotes opportunities for teacher leadership that allow teachers to advance professionally without leaving the classroom. In this way, AAPT both builds a pipeline to bring physics majors into physics classrooms and sustains their impact by providing curricular resources and financial support and by promoting partnerships among physics teachers across academic levels.

Teacher preparation
With funding from the National Science Foundation, AAPT and many other STEM-education groups, such as APS, the American Chemical Society, the Mathematical Association of America, and the Colorado School of Mines, are developing the “Get the Facts Out” campaign. Once launched, the campaign will provide data, guidelines, and modifiable resources to support faculty as they talk with their students and colleagues about a career in secondary physics teaching.

Once an interest in teaching is established and nurtured in physics majors, they need to learn and practice the pedagogy and behavior-management strategies used by great teachers. PhysTEC, another partnership between AAPT and APS, works with more than 300 institutions dedicated to improving and promoting physics teacher education across the United States. PhysTEC institutions have identified key components of programs that successfully recruit and train physics majors to teach and thrive in K-12 classrooms.2 AAPT helps to build a network where members of these institutions can continue to identify and share best practices for recruiting and supporting strong physics teachers.

On an individual level, AAPT supports future physics teachers with the Barbara Lotze Scholarship, a financial award for undergraduates in physics teacher preparation programs.

Teacher supports
AAPT has a wide range of resources to support teachers in their daily work. Because about eighty percent of physics teachers are the only physics teacher at their school, it is difficult for them to build a professional learning community in which teachers can compare data and then design and implement interventions.3 PhysPort addresses this reality through the creation and upkeep of an online database of research-based teaching methods and materials, assessments, norm-referenced data, and targeted intervention strategies, all specific to physics.

ComPADRE.org is a library of curated and vetted lesson resources. AAPT’s newest lesson resource is the Digi Kits collection. Digi Kits include innovative hands-on lesson plans that are aligned with the Next Generation Science Standards4 and supported by digital simulations, animations, and videos that extend students’ thinking beyond the lab activity. These curricular resources help teachers quickly plan strong lessons that they can execute with confidence.

Through programs like the AIP/AAPT Master Teacher Policy Fellowship, teacher leaders receive support and training to work on advocacy issues that matter to their communities. Creating a network of teacher leaders helps to address the many challenges of sustaining qualified physics teaching and equitable physics learning opportunities for students in a complex educational system.

By providing strategic resources and development opportunities to physics teacher preparation programs, pre-service teachers, and current teachers, AAPT hopes to positively shape the future of physics as a field by supporting physics educators today.

Kelsey Sheridan is the marketing coordinator for the American Association of Physics Teachers. Prior to joining AAPT, Kelsey was a high school science teacher in Baltimore City, Maryland.

Endnotes

  1. AAPT Mission
  2. Improving the Education of Future Physics Teachers
  3. Casey Langer Tesfaye and Susan White, "High School Physics Teacher Preparation." February 2012.
  4. Next Generation Science Standards

STEP UP 4 Women: Reducing Barriers to Young Women's Participation in Physics

Kathryne Sparks Woodle, American Physical Society

For a number of years, the American Physical Society (APS) has been searching for a way to increase the fraction of women who participate in physics at the undergraduate level and above. From the time they start their undergraduate studies all the way up through becoming assistant professors, women make up about 20% of the people pursuing a physics major or career in the United States. High school physics, on the other hand, is nearly 50% women, so we realized that if we wanted to change things for the country, we needed to look earlier and enlist the help of high school physics teachers.

Consequently, we designed, and were recently funded by the National Science Foundation to mount, a national effort to work with high school physics teachers to reduce barriers and inspire young women to major in physics in college, now known as STEP UP 4 Women. Led by Prof. Zahra Hazari, a team of physics education researchers at Florida International University and Texas A&M University-Commerce worked alongside a group of experienced high school physics teachers and representatives from APS and the American Association of Physics Teachers to begin a project to Support Teachers to Encourage the Pursuit of Undergraduate Physics for (STEP UP 4) Women.1

To create inclusive classrooms and encourage young women to pursue a degree in physics, STEP UP 4 Women has designed two classroom lessons and a guide with general strategies to provide high school physics teachers with resources to take achievable and concrete steps on the longer journey to enact cultural change. If half of the high school physics teachers in the U.S. encourage just one more female student to pursue physics as a major each year, a historic shift will be initiated: female students will make up 50% of incoming physics majors.

Preliminary results from the pilot study show that the two lessons, "Careers in Physics" and "Women in Physics," improve students’ future physics intentions (e.g., majoring in physics in college or pursuing physics-related careers) in classes across the U.S. (N=823).2 Both female and non-female students show positive gains from the lessons. A controlled experimental study in more than 30 classrooms is currently underway to provide additional insight into the impact of the materials, including the general strategies guide, called Everyday Actions.

The Everyday Actions guide focuses on explicitly recruiting students, reducing their marginalization, and promoting their recognition throughout the year. Teachers are encouraged to use these actions to increase the inclusivity of their classroom environment. Everyday Actions is broken into recommended strategies, each supported by research and accompanied by anecdotes from students and teachers that reflect the effectiveness of these actions and give examples of how to inspire students, particularly female students, to continue in physics. Here's a snapshot of some of the recommendations:

When you… Talk to Students Individually
Recognize students: Discuss with students why they would be a good fit for physics. Remind students of these messages regularly – students might not internalize the message the first time.

When you… Facilitate Group Work/Labs
Choose group members: Ensure women are taking active roles.

When you… Address the Whole Class
Distribute attention: Distribute attention during class discussions. Make sure all students can participate and that male students don’t dominate the discussion.

When you… Plan and Assess
Plan lessons with context: Incorporate real world physics examples related to helping people (e.g. medical/health, alternative energy, climate science).

When you're… Outside the Classroom
Parents and family: Provide parents with information about job opportunities in physics.

These general strategies are not limited to high school! We encourage you to join the project and share the Everyday Actions guide with your colleagues and students. As a sneak peek at stepup4women.org, we've made available a self-reflection instrument that lists all the strategies, so you can rate your own use of inclusive practices right now. We also have a poster that presents guidelines for conduct during discussions from the "Women in Physics" lesson; it is great for setting the tone in any classroom. There's a limited supply, but if you'd like one, please email stepup4women@aps.org.

STEP UP 4 Women was created to increase the number of women pursuing the study of physics, regardless of their race/ethnicity. However, our preliminary research shows that when broken down by demographics, the lessons are beneficial to students from traditionally underrepresented races/ethnicities in physics as well. We also include statistical data on the underrepresentation of specific races/ethnicities in physics in the appendix of the "Women in Physics" lesson to complement discussion of the topic should it arise. In our pursuit to increase the number of women in physics, we hope to enact cultural change that will make physics more welcoming for all marginalized groups, including those with non-binary genders as well as others in the LGBT+ community.

After just one year, STEP UP 4 Women already has over 400 members. We welcome everyone interested in changing the culture of physics to reduce barriers and inspire more women to participate: not just high school physics teachers, but university faculty and students too. If you work with pre-service teachers, you have a unique opportunity to help! Not only can you help inspire them to become inclusive teachers, they may also encourage their future colleagues to join the STEP UP 4 Women effort. Join the movement at stepup4women.org!

Kathryne Sparks Woodle is the APS Program Manager for STEP UP 4 Women. She enjoys working in the APS Education and Diversity department on initiatives that support cultural change to enable those from groups underrepresented in physics to join the physics community.

References: 

  1. Stepup4women.org
  2. H. Cheng, G. Potvin, R. Khatri, L. Kramer, R. Lock, & Z. Hazari, “Examining physics identity development through two high school interventions,” 2018 PERC Proceedings [Washington, DC, August 1-2, 2018], accepted.

Disclaimer: The articles and opinion pieces found in this issue of the APS Forum on Education Newsletter are not peer refereed and represent solely the views of the authors and not necessarily the views of the APS.