DEVELOPMENT OF AN INTRODUCTORY PHYSICS PROBLEM-SOLVING ASSESSMENT TOOL
Timothy French and Karen Cummings
Rensselaer Polytechnic Institute, Troy, New York 12180
The physics education research group at Rensselaer is currently working to develop an
assessment tool that will measure the problem-solving ability of introductory physics students. In its final form, the tool will consist of approximately 30-40 multiple-choice questions related to a limited number of classical mechanics topics. There are currently four types of questions included in the exam: attitudinal questions, quantitative problems that require students to identify the underlying principles used in solving the problem but not an explicit solution, questions that ask students to compare posed problems in terms of solution method, and quantitative problems requiring solution. Although the assessment is still under development, we have performed preliminary validation studies on questions requiring students to identify underlying principles. Specifically, both an ANOVA and a Fisher LSD test have been performed. These evaluations showed (at the 98% and 95% confidence level, respectively) that wrong answers on assessment questions correlate to below average performance on the problem solving portion of the final course exam.
I. Introduction

Assessment tools play an important role in the use of educational research to improve physics education. Without some form of assessment, it is impossible to determine whether a change in pedagogy, curricular materials, or the use of technology has any impact on student learning or skill development. However, while much work regarding assessment has been done within the domain of conceptual learning, assessment tools which evaluate problem solving ability are not yet available.1,2 This is despite the fact that development of problem solving ability is widely cited as an important goal in an introductory physics course. Hence, with support from the National Science Foundation, we are developing an assessment tool to measure the problem-solving ability of introductory physics students.

The exam under development is a multiple choice instrument, will ultimately contain 30-40 questions, and can be completed by students in less than an hour. While the selection of a multiple choice format makes the development of a valid assessment tool more difficult, it also allows grader-independent scoring of responses. Furthermore, because the assessment will be easy to give and grade, we increase the probability that it will be used.

Questions on the assessment tool address only problems within the regime of classical mechanics. Problem solving ability, like other skills and knowledge domains, is context dependent. Hence, success on this assessment will not necessarily be predictive of students' abilities in solving problems within other domains. Kinematics, application of Newton's second law, conservation of energy, and conservation of linear momentum questions were chosen for inclusion because these topics are widely seen as being fundamental components of a first-semester undergraduate physics course.

II. Signatures of Expertise

We argue that there are several "signatures" of expertise in introductory physics problem solving. Experts, or advanced problem solvers, are not just more likely to get the correct answer to a given problem than are novices. They also possess certain characteristics and have certain attitudes toward aspects of problem solving that both facilitate their success and help to identify them as a group. These characteristics are the foundation of this assessment.

For example, experts are comfortable with trying an approach that is not certain to lead to a correct answer. They can ignore irrelevant information. They understand that problems can be solved in more than one way, and they readily adopt the simplest possible model of the situation (e.g. they will ignore friction when appropriate even if "ignore friction" is not explicitly stated in the problem). Experts reflect on and evaluate the reasonableness of their work at many points along the solution path. They have a rich set of actions they know that they can take if they cannot find a solution to the problem.

Additionally, expert problem solvers have (and know they have) a set procedure by which they solve certain types of problems. For example, they have a set procedure by which they approach the solution of a "conservation of energy" problem. Typically, novice problem solvers either do not have such a procedure or they fail to realize and exploit the fact that they do. We conjecture that this difference is the reason that experts categorize problems (i.e. group them as "similar") based on the conceptual, or "deep", features3 of the problems (e.g. application of conservation of energy), while novices categorize based on "surface" features (e.g. problems containing rolling balls).

III. Question Types

This assessment tool includes four types of questions. Examples of three types are shown in Figs. 1-3.

The first question type (Fig. 1) is designed to probe students' attitudes toward, and common behaviors during, the solution of introductory physics problems.

Figure 1: An example of a "Type 1" question that is used to probe student attitudes towards, and behaviors during, problem solving.

Imagine that you have just started to solve a physics problem. How likely are you to do the following? (Definitely, very likely, possibly, probably not, not at all likely.)
1) Draw, redraw, or visualize a picture or graph
2) Make a list or table
3) Look for similar problems in the textbook and use them as a model
4) Look up an appropriate equation
5) Think about which physics concepts are important for the problem
6) Start with the solution (from the back of the book) and work backward
7) Make some assumptions about the situation
8) Ask yourself questions about what is going on in the problem and problem solution.

The second type of problem deals with the identification of the underlying principles used to solve a given problem (Fig. 2). Students are not required to find a numerical answer to these problems and are told not to solve them. This problem type provides us with information regarding a student's ability to identify nuances within the problem statement that clue experts into likely solution approaches.

Not requiring a complete solution to every question allows us to ask about a wider range of problems without making the test too time consuming. Furthermore, we assume that identifying a reasonable solution approach is a critical first step in the problem solution. Of course, even if a potentially productive approach is chosen, the correct solution of the problem is by no means guaranteed.

Figure 2: An example of a "Type 2" question that asks students to identify the best approach to use in solving the given problem.

Consider the following problem:
A ball is knocked off the edge of a cliff at 3 m/s and strikes the ground below. The cliff is 20 meters high. What is the velocity of the ball when it strikes the ground? Which of the following approaches would you use in the solution of this problem?
A) Newton's Second Law (F=ma)
B) Either kinematics or conservation of momentum
C) Kinematics
D) Kinematics and then conservation of momentum
E) Conservation of momentum
F) None of the above
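For reference (the keyed response is not reproduced here), the example above can be worked with kinematics alone, taking g ≈ 9.8 m/s² and assuming the ball leaves the cliff horizontally:

v_y = \sqrt{2gh} = \sqrt{2(9.8\,\mathrm{m/s^2})(20\,\mathrm{m})} \approx 19.8\,\mathrm{m/s}

v = \sqrt{v_x^2 + v_y^2} = \sqrt{(3\,\mathrm{m/s})^2 + (19.8\,\mathrm{m/s})^2} \approx 20\,\mathrm{m/s}

No collision is involved, so conservation of momentum plays no role in the solution.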
In the third type of question, two physics problems are presented. Some of the problems are standard "textbook" style questions, while others are "context-rich" problems gathered from the work of Heller and Heller6. As with the previous question type, students are not to find a numerical answer to the posed problems. Instead, they are told to determine whether or not the two problems are solved in a "similar" manner. This question type, shown in Fig. 3, focuses on the test-taker's problem categorization schemes. (Recall that experts categorize based on "deep" conceptual features.4)

Figure 3: An example of a "Type 3" question used to probe student categorization schemes.

Consider the following two problems:
PROBLEM A) A 30 g wooden block rests on a frictionless surface. A bullet with mass 5 g, traveling with a speed of 275 m/s, passes through the block. The bullet's speed after passing through the block is 240 m/s. How fast is the block moving after the bullet leaves the block?
PROBLEM B) A 50 g lump of clay slides along a frictionless surface until it hits a 3 kg block. The block and clay, now united and moving together, have a final velocity of 0.5 m/s. What was the speed of the clay before the collision?
Are the problems solved in a similar manner?
A) Yes
B) No
C) I'm not sure.
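For reference (again, the keyed response is not reproduced here), both problems yield to a single application of conservation of linear momentum:

Problem A: m_b v_{b,i} = m_b v_{b,f} + M V \;\Rightarrow\; V = \frac{(0.005\,\mathrm{kg})(275 - 240)\,\mathrm{m/s}}{0.030\,\mathrm{kg}} \approx 5.8\,\mathrm{m/s}

Problem B: m_c v_{c,i} = (m_c + M) v_f \;\Rightarrow\; v_{c,i} = \frac{(3.05\,\mathrm{kg})(0.5\,\mathrm{m/s})}{0.050\,\mathrm{kg}} = 30.5\,\mathrm{m/s}

In that sense the two problems are solved in a similar manner, even though their surface features (a bullet passing through a block versus clay sticking to a block) differ.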
The fourth question type (not shown in a figure) requires the student to actually solve a quantitative problem. These problems probe whether a student can successfully pull all the required skills and abilities together. The distractors for this type of problem are generated through student interviews and by gathering data from free response problems given on exams in the introductory physics courses at Rensselaer.

IV. Scoring the Assessment

Evaluation of student answers on the assessment is done through a comparison to the answers typically given by "expert" introductory physics problem solvers. Note that truly (i.e. non-subjective) correct answers only exist for the fourth type of question discussed above. For all the other question types, student responses are considered "correct" to the extent that they match the answer most commonly given by experts (e.g. professors, upper-level graduate students, practicing scientists). For example, if a student groups problems (in a Type 3 question) based on deep structure rather than surface features, this mirrors the thinking associated with an expert and so is judged to be "correct".

By giving the problem-solving assessment both pre- and post-instruction, we will ultimately be able to measure gains in problem solving ability (i.e. student motion towards the "expert" end of the continuum).
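As a concrete illustration of this grader-independent scoring, the following minimal Python sketch keys Type 1-3 items to the modal expert response. The data layout (dicts mapping question IDs to chosen options) is hypothetical; this is not the scoring code actually used for the assessment.

from collections import Counter

def build_expert_key(expert_responses):
    """expert_responses: list of {question_id: chosen_option} dicts."""
    key = {}
    all_questions = {q for resp in expert_responses for q in resp}
    for q in all_questions:
        votes = Counter(resp[q] for resp in expert_responses if q in resp)
        key[q] = votes.most_common(1)[0][0]  # most common expert answer
    return key

def score_student(student_response, key):
    """Fraction of items on which the student matches the expert key."""
    matched = sum(student_response.get(q) == ans for q, ans in key.items())
    return matched / len(key)

# Example usage with made-up responses:
experts = [{"q1": "C", "q2": "A"}, {"q1": "C", "q2": "A"}, {"q1": "D", "q2": "A"}]
key = build_expert_key(experts)                     # {"q1": "C", "q2": "A"}
print(score_student({"q1": "C", "q2": "B"}, key))   # 0.5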
V. Results of a Preliminary Validation Study

We have now completed a preliminary validation study on a subset of the Type 2 questions (i.e. "identify the approach used to solve" questions). Three questions of this type were included on the multiple choice portion of the Physics I final for the Spring 2001 semester at Rensselaer.7 In order to test whether a student's score on these questions was positively correlated with their score on the free response, quantitative problem solving portion of the exam, we looked at a subset of 225 finals. (The questions on this portion of the exam are textbook, "show all your work to receive credit" style questions.)

The 225 exams were categorized based upon the Type 2 question score as follows: 0 wrong (N0=148), 1 wrong (N1=64), and 2 wrong (N2=13). (No one got all 3 wrong.) The average score out of 65 total points on the free response problem solving section for each group was 48.82, 43.46, and 39.54, respectively. Performing an ANOVA shows this to be a statistically significant correlation at the 98% confidence level.

In addition, a Fisher least significant difference (LSD) test was used to compare the three groups as pairs. This evaluation also showed (at a 95% confidence level) that incorrectly answering any one (or more) of the three Type 2 questions was predictive of a lower score on the free response, problem solving section of the exam. ("Lower" as used here is in comparison to the scores of students who answered all of the Type 2 questions correctly.)
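The form of this analysis can be illustrated with the short Python sketch below. The raw exam scores are not published, so the score arrays are placeholders generated around the reported group means; only the structure of the one-way ANOVA and the Fisher LSD pairwise comparisons is shown.

import numpy as np
from scipy import stats

# One-way ANOVA across the three groups (0, 1, or 2 Type 2 questions wrong),
# followed by Fisher LSD pairwise comparisons at the 95% confidence level.
# Placeholder scores only; the spread (sd = 10) is assumed, not reported.
rng = np.random.default_rng(0)
groups = [
    rng.normal(48.82, 10, 148),  # 0 wrong
    rng.normal(43.46, 10, 64),   # 1 wrong
    rng.normal(39.54, 10, 13),   # 2 wrong
]

f_stat, p_value = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Fisher LSD: pairwise comparisons using the pooled within-group variance (MSE).
n = [len(g) for g in groups]
df_error = sum(n) - len(groups)
mse = sum((len(g) - 1) * np.var(g, ddof=1) for g in groups) / df_error
t_crit = stats.t.ppf(0.975, df_error)
for i in range(len(groups)):
    for j in range(i + 1, len(groups)):
        diff = np.mean(groups[i]) - np.mean(groups[j])
        lsd = t_crit * np.sqrt(mse * (1 / n[i] + 1 / n[j]))
        print(f"groups {i}-{j}: diff = {diff:.2f}, LSD = {lsd:.2f}, "
              f"significant = {abs(diff) > lsd}")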
While we still have a significant amount of work to do before this assessment tool is ready for dissemination, we feel the results of the ANOVA and Fisher LSD tests cited above indicate we are headed in a productive direction.

References
1. D. Hestenes, M. Wells, and G. Swackhamer, "Force Concept Inventory," Phys. Teach. 30 (3), 141-158 (1992).
2. R. Thornton and D. Sokoloff, "Assessing student learning of Newton's Laws: The Force and Motion Conceptual Evaluation and the Evaluation of Active Learning Laboratory and Lecture Curricula," Am. J. Phys. 66 (4), 338-352 (1998).
3. M.T.H. Chi, P.J. Feltovich, and R. Glaser, "Categorization and representation of physics problems by experts and novices," Cog. Sci. 5, 121-152 (1981).
4. P.T. Hardiman, R. Dufresne, and J. Mestre, "The relation between problem categorization and problem solving among experts and novices," Mem. & Cog. 17 (5), 627-638 (1989).
6. K. Heller and P. Heller, Cooperative Group Problem Solving in Physics, University of Minnesota (1997).
7. K. Cummings, D. Kuhl, J. Marx, and R. Thornton, "Evaluating innovation in studio physics," Phys. Educ. Res., Am. J. Phys. Suppl. 67 (7), S38-S44 (1999).