Computerizing Introductory Problem-Solving Physics Labs

Here at the University of Minnesota, we are always interested in improving the quality of education our physics students receive. We have recently implemented a computerized version of our problem-solving labs. Here you can find out about this new project of ours, but please be patient if not everything is here yet. An "in-progress" project means an "in-progress" web page.

Why change labs that already work?

"Even if you're on the right track, you'll get run over if you just sit there." -- Will Rogers.

The world around us is constantly changing. New technology, new ideas, and new breakthroughs in education all lead us to constantly evaluate the work we do in the physics department of the University of Minnesota. Our goal is to give our students the best education we can. This often means trying new things and seeing how best to apply them to our situation.

Why use microcomputer based labs (MBLs)?

Computers are an increasing presence in all of our lives. Physics in particular is heavily dominated by technology, from high school classrooms to university research and industry labs. Students are starting to use computers in both primary and secondary classrooms. Many of the students in our introductory courses have used computers in their high school physics classes and in other high school and college courses. The Third International Mathematics and Science Study found that 42% of high school physics students have been asked to use computers in their class. Since our 125x sequence students are primarily engineers, most of them will go on to use computers both in their engineering courses and in the labs and research groups where they work. Early and frequent exposure to computers will help our students develop the familiarity they will need in their classes and in the workforce. We can support this process by using computers in our labs.

Computers also allow a much greater range of data gathering and analysis than our traditional labs do. With computers, students can take cleaner data, spend less time on analysis, and spend more time on the concepts.

For more discussion of this subject, try the following site: Why use computers to teach Physics Labs?

What sort of research is there on computers in labs?

How have computers been implemented at the University of Minnesota?

Our computerized labs use the same cooperative group problem solving techniques that our traditional labs do. The lab problems are very similar, and the structure of the lab remains the same. The only difference is the manner in which students collect data and analyze it to solve the lab problem.

We chose to use a video capture and analysis system for data collection and analysis. Our labs are equipped with Power Macintosh computers with built-in audio-visual (AV) capabilities. Included with the Macs is a video capture program, Apple Video Player. Groups use this software with small surveillance-type cameras to take movies of carts rolling on tracks, accelerating hanging masses, blocks sliding down ramps, and other motion-based phenomena. The movies are saved to the computer's hard disk in a folder specific to that group's lab section.
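
Apple Video Player writes an ordinary movie file, and the analysis stage then works through that file one frame at a time. As a rough, hypothetical sketch of that idea (not the software we actually use), a saved movie could be stepped through frame by frame in Python with the OpenCV library; the file path below is made up for the example:

    import cv2  # OpenCV, for reading movie files

    # Open a movie a lab group has saved; this path is hypothetical.
    movie = cv2.VideoCapture("lab_section_3/group_2/cart_run1.mov")
    fps = movie.get(cv2.CAP_PROP_FPS)  # frames per second of the recording

    frame_number = 0
    while True:
        ok, frame = movie.read()       # grab the next frame
        if not ok:
            break                      # no more frames: end of the movie
        time_s = frame_number / fps    # time stamp of this frame
        # ... display the frame and record the object's position here ...
        frame_number += 1

    movie.release()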

Once the students have a movie that they are pleased with, they open a software program called VideoTool, designed and built by the University of Minnesota Physics Education group and Ted Hodapp of Hamline University using the LabVIEW environment.

VideoTool is graph-oriented and does not do the analysis for the students. Instead, the software guides the students through several phases. First the students predict what the particular motion they are studying will look like on a graph, picking the proper equations and coefficients to match their prediction. Then the students open their movie from within VideoTool, calibrate the scale, choose axes, and set the time = 0 point. After the calibration is finished, students collect data by following the object on the screen with the mouse. Data is plotted as it is gathered. Once the data collection is over, students compare their prediction to the data and fit the data with a new curve, selecting an equation and coefficients as in the prediction. With the position data fit, students are asked to predict the shape of the velocity graph. The program then differentiates the position data and plots the velocity data. Students fit the velocity data and then predict the acceleration. A similar process for acceleration gives the students six graphs: position, velocity, and acceleration in both the x and the y directions. The graphs are printed out for the students to review and use in their lab reports; the printouts include both predicted and fitted values for all six graphs.
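
VideoTool itself is a LabVIEW program, and the paragraph above is the authoritative description of what it does. To make the steps concrete, here is a minimal sketch in Python of the same calibrate, fit, and differentiate sequence for one-dimensional motion; every number and name in it is made up for illustration:

    import numpy as np

    # Hypothetical tracking data: one mouse click on the object per frame,
    # in screen pixels, for a cart filmed at 10 frames per second.
    t = np.arange(0.0, 2.0, 0.1)             # frame times (s)
    x_pix = 40.0 + 12.5 * t + 30.0 * t**2    # pixel x of each click (made up)

    # Calibration: students mark an object of known size in the movie.
    # Assume a 1.0 m stick spans 250 pixels on the screen.
    m_per_pix = 1.0 / 250.0
    x = x_pix * m_per_pix                    # position in meters

    # Students fit the position data by choosing an equation and its
    # coefficients; a quadratic x(t) = a2 t^2 + a1 t + a0 fits here.
    a2, a1, a0 = np.polyfit(t, x, 2)

    # The program differentiates the position data to produce velocity
    # data, which the students then fit with a straight line.
    v = np.gradient(x, t)
    slope, intercept = np.polyfit(t, v, 1)

    print("position fit: x(t) = %.2f t^2 + %.2f t + %.2f" % (a2, a1, a0))
    print("velocity fit: v(t) = %.2f t + %.2f" % (slope, intercept))
    print("acceleration ~ %.2f m/s^2" % slope)  # slope of the fitted v(t)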

How are the new labs being evaluated?

During Fall Quarter 1997, we trial-tested VideoTool to answer the following questions:

  1. Does using VideoTool (without optimizing the lab problems for this tool) adversely affect students' conceptual understanding of kinematics and forces?
  2. Does substituting VideoTool for spark tapes and Polaroid film enable students to spend more time discussing their understanding of kinematics and forces (rather than equipment or analysis difficulties)?
  3. Does working with a computer in the lab adversely affect the groups' ability to collaborate?

To answer these questions, the lab sections of three instructors were randomly assigned as VideoTool labs or spark tape labs (about 170 students in each type of lab). Three types of data were collected: (1) data from two multiple-choice tests for conceptual understanding (Force Concept Inventory and Test of Understanding Graphs - Kinematics); (2) student questionnaire data for general opinions of the lab, use of VideoTool or spark tapes, and extent of collaboration in lab groups; and (3) independent observation data to determine how much time students spend doing different tasks (e.g. discussing how to use the equipment, taking data, discussing the physics) in the lab.

What are the results so far?

Test of Understanding Graphs - Kinematics (TUG-K)

                             Average Pretest Score   Average Posttest Score   Relative Gain
                             (percent correct)       (percent correct)        [(post - pre)/(100 - pre), in %]
Spark tape labs (N~170)              52%                     72%                      39
VideoTool labs (N~170)               58%                     75%                      42

The error is 1-2%. As the table above suggests, there was little difference between the two labs. Students using VideoTool may have a slight edge over students in the traditional labs, but the difference is too small to be conclusive.
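
For reference, a relative (normalized) gain of this kind can be computed either from the class-average scores or student by student and then averaged, and the two results need not agree exactly. A minimal sketch of the difference, with made-up scores:

    # Relative gain, (post - pre) / (100 - pre), for made-up scores.
    pre  = [50, 60, 45, 55]    # pretest scores (percent correct), hypothetical
    post = [70, 80, 60, 75]    # posttest scores (percent correct), hypothetical

    # Gain computed student by student, then averaged:
    gains = [(b - a) / (100.0 - a) for a, b in zip(pre, post)]
    print("mean of per-student gains: %.0f" % (100 * sum(gains) / len(gains)))

    # Gain computed from the class-average scores; usually close to,
    # but not identical with, the per-student average above.
    pre_avg = sum(pre) / len(pre)
    post_avg = sum(post) / len(post)
    print("gain of class averages:    %.0f" % (100 * (post_avg - pre_avg) / (100.0 - pre_avg)))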

Force Concept Inventory (FCI)

                             Average Pretest Score   Average Posttest Score   Relative Gain
                             (percent correct)       (percent correct)        [(post - pre)/(100 - pre), in %]
Spark tape labs (N~170)              44%                     70%                      47
VideoTool labs (N~170)               47%                     69%                      45

The error is 1-2%. As with the TUG-K, there is very little difference between the two groups' performance on the FCI.

Student opinions from questionnaire

The general topics we asked about on the questionnaire included:

Data has been analyzed for several of these topics; a summary is below. In a rank-ordering question, students in the VideoTool labs, as compared to students in the traditional labs:

These differences were not great in any case except the text, to which almost 40% of computer lab students gave the top ranking, compared to about 30% of traditional lab students.

Other questionnaire data reveals a few differences in the way students interacted with each other and the lab.

  1. Students in the computer labs did not feel the Teaching Assistants were as helpful as students in the traditional labs did. This may reflect the minimal training the TAs received in VideoTool, and it suggests that the TAs need more training in order to help students properly.

  2. Learning VideoTool was considered time well spent by 51% of computer lab students, while only 37% of traditional lab students agreed that learning to use spark tape was time well spent. Students realize that computer literacy will be useful to them later on, while few students will ever see spark tape again.

  3. 46% of computer lab students looked forward to using VideoTool again. Only 16% of traditional lab students looked forward to using spark tape again.

  4. Students in computer labs were more likely to ignore unrealistic values in their analysis and simply move on to the next problem. This also comes back to TA training: TAs need to learn to watch for unrealistic values that students have accepted, especially in lab reports. This is easy to do by checking over the students' printouts with them as they finish a problem.

  5. Students in the computer labs spent less time in their groups discussing equipment difficulties, and more time discussing misunderstandings about physics. This is the students' perception; the observational data will provide an independent check on what students actually talk about.

  6. Students in traditional labs were more likely to divide tasks up among their group, while computer lab students were more likely to have one person do the data analysis. This is not surprising, since the analysis is done on the computer and there is only one keyboard. Enhanced TA training can help make sure that all students contribute to the data analysis.

The data from the independent observations has not yet been analyzed. The observers in the labs were looking at group dynamics: in a nutshell, who is talking to whom and what they are talking about. The data should prove very interesting when analyzed.

On-line Resources and References for MBL

The Microcomputer and Practical Work in Science Laboratories, by John Layman and Joseph Krajcik at the University of Maryland.

Is the Computer Appropriate for Teaching Physics?, by Edward Redish at the University of Maryland.

What Can a Physics Teacher Do with a Computer?, Parts 1 and 2, by Edward Redish at the University of Maryland.

Multimedia General Physics "The Next Generation", at the University of Nebraska-Lincoln.

The Impact of MBL, by The Concord Consortium, Inc.

Computer Representations in Students' Conversations, by Gregory Kelly and Teresa Crawford at UC Santa Barbara.

Special Interest Group for Computers in Education: text from the proceedings of the Spring '92 ACM Conference on Computer Supported Collaborative Learning.

