Common Mistakes on Chemistry 106 Lab Reports

David L. Zellmer
November 6, 1999

Information on what is expected in the written lab reports is provided in the course syllabus and in oral advice given in class. In spite of this, students keep making the same mistakes on lab reports year after year. Here is a collection of common errors to avoid.

The Abstract:

Many students still fail to tell me what they did and what their final results were. Keep this short and to the point, but tell me these two things. Don't ramble on in a mini version of your Critical Evaluation. Don't just state "we studied gas chromatography." Under final results, be sure to tell how well the method was able to measure the unknown provided by a team member. (Give the values and the errors.) Professional analytical papers will always tell how well their measurement method worked.

The Theory Section:

The most common deduction made here is for failure to provide the theory behind all the work that will be presented. For example, the basic theory of gas chromatography may be presented fairly well, but then there will be no discussion of the theory behind the internal standards used for the analysis of the alcohol unknowns, how the unknown will be measured, why changes in things like column temperature can affect retention times, why hydrocarbons should follow any particular trends, and so forth. Look over all the things you had to work up in the Data and Results section. Look over what needs to be understood when doing the Critical Evaluation. Then go back to the Theory section and make sure that the theoretical underpinnings and mathematical equations for all of these things have been provided.
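To make the internal-standard idea concrete, here is a rough sketch of the arithmetic in Python. All peak areas, concentrations, and function names below are invented for illustration; they are not values from any actual experiment.

```python
# Sketch of the internal-standard (IS) method used in GC analysis.
# All numbers here are made-up example values.

def response_factor(area_analyte, area_is, conc_analyte, conc_is):
    """Response factor from a calibration standard containing known
    amounts of both analyte and internal standard."""
    return (area_analyte / area_is) * (conc_is / conc_analyte)

def unknown_conc(area_analyte, area_is, conc_is, rf):
    """Analyte concentration in an unknown spiked with a known
    amount of the same internal standard."""
    return (area_analyte / area_is) * (conc_is / rf)

# Calibration standard: 2.00 mg/mL analyte, 1.00 mg/mL IS
rf = response_factor(area_analyte=1500.0, area_is=800.0,
                     conc_analyte=2.00, conc_is=1.00)

# Unknown spiked with 1.00 mg/mL IS
c_unknown = unknown_conc(area_analyte=1200.0, area_is=750.0,
                         conc_is=1.00, rf=rf)

print(f"RF = {rf:.3f}, unknown = {c_unknown:.2f} mg/mL")
```

Because the analyte signal is always taken as a ratio to the internal-standard signal, variations in injection volume largely cancel out; this is exactly the kind of "why it works" explanation the Theory section should contain.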

Another fairly common problem is over-reliance on quoted material. Read your sources until you have a good understanding of how everything works. Then write it out in your own words.

The Procedure Section:

In spite of repeated exhortations in class and in the syllabus, students still fail to tell me what they did and what they used to do it. I get procedures virtually copied out of the lab manual that I know were extensively modified during the actual experiment. I get poor or no description of the instruments used. I get diagrams of the instruments that come from textbooks, not from looking at what is actually sitting on the bench. I get tables of dilutions and standards that are based on suggested values from manuals or handouts, not the real values used to make the experiment work. The oldest procedure error in the book, failure to identify the source and grade of the chemicals used, is still frequently committed. Even more serious is reporting the grade of chemical specified in a procedure rather than the one actually used in the experiment.

1) Look at your instrument. Identify all of its critical parts. Give make and model for each part, since instruments are often loaded with options. Draw a diagram for how THIS instrument works. If you are confused about any of this, ask your instructor.

2) Write down everything you do and every new thing you learn in your laboratory notebook. Think of this as a message to your future self, should you ever find yourself in front of a similar instrument in a real job or research situation. With a good set of notes on how everything works and how everything was done, you will look like a star to your employer or research director. I've had real people tell me how heeding this advice saved their backsides. It will also make it possible for you to write a concise and accurate Procedure section.

3) Directions in student lab manuals frequently have to be changed to meet the needs of the instrument you are actually using. If an absorbance is too high, or a suggested concentration cannot be detected, or if crummy-looking peaks tell you your column has overloaded, CHANGE THINGS! Don't just plow ahead because it is "in the book." Write down what you did and why you did it. Discussion of such improvements is worth many brownie points on your report, and shows you have the ability to "think on your feet."

The Data and Results Section:

This section has improved greatly over the last few years. Students now use their computer spreadsheets to organize their data and perform good statistical analyses on it. A few common errors are worth mentioning, however.

1) Don't put a linear least squares line through your data unless you have reason to believe the data really is linear. Sometimes the method chosen gives terrible results, and the working curve looks more like random points. If this happens, just make the plot; don't dignify the poor data with a statistical analysis. Simply discuss why the method failed in your Critical Evaluation. Reporting a value of 2.334 plus or minus 110.03 makes it look like you really don't understand what happened here.
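One quick way to check this point is to compute the correlation coefficient before trusting the fit. Here is a short Python sketch with made-up calibration data; a real report would of course use your own measured values:

```python
# Sanity check before reporting a least-squares working curve:
# compute the correlation coefficient r first. Data are invented.
from math import sqrt

conc = [1.0, 2.0, 3.0, 4.0, 5.0]         # standard concentrations
signal = [0.11, 0.20, 0.31, 0.39, 0.52]  # measured responses

n = len(conc)
x_bar = sum(conc) / n
y_bar = sum(signal) / n
sxx = sum((x - x_bar) ** 2 for x in conc)
syy = sum((y - y_bar) ** 2 for y in signal)
sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(conc, signal))

slope = sxy / sxx
intercept = y_bar - slope * x_bar
r = sxy / sqrt(sxx * syy)

print(f"signal = {slope:.4f} * conc + {intercept:.4f}, r = {r:.4f}")
if abs(r) < 0.99:
    print("Poor correlation: plot the points, but don't report the fit.")
```

The 0.99 cutoff here is only a rough rule of thumb; the plot itself, not any single number, should drive the decision.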

2) Report how close you came to the unknown value. On most experiments you are asked to run an unknown prepared by a member of your instrument team. Make sure everyone in your team understands how to use "Class A" techniques to prepare reliable standards and unknowns so all of you can trust the values produced. When writing your report, be sure to get this value from your team member, then report this "true" value at the same time you report your measured value, plus or minus your computed statistical error. In the next section, Critical Evaluation, you will use these values in your discussion of how precise and accurate your analysis was.
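A minimal sketch of this kind of report, assuming four invented replicate measurements and a Student's t confidence interval (the t-value for 3 degrees of freedom at 95% confidence is 3.182):

```python
# Sketch: report a measured value as mean +/- 95% confidence interval
# alongside the team member's "true" value. Numbers are invented.
import statistics

replicates = [10.12, 10.30, 10.21, 10.18]   # e.g. % ethanol, four runs
true_value = 10.25                          # from your team member

mean = statistics.mean(replicates)
s = statistics.stdev(replicates)            # sample standard deviation
n = len(replicates)
t_95 = 3.182                                # Student's t, 95%, n - 1 = 3 d.f.
ci = t_95 * s / n ** 0.5

print(f"measured: {mean:.2f} +/- {ci:.2f}  (true: {true_value})")
```

Here the true value falls inside the confidence interval, which is exactly the comparison the Critical Evaluation section asks you to make explicit.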

The Critical Evaluation Section:

The most common error is still the oldest and most traditional one: failure to use the statistical results in your discussion. Students still seem confused about the difference between precision and accuracy. Percentage (relative) errors are confused with absolute errors. A lot of discussion is still on the level of "the results were good." Period.

It is very easy to get all the points in the Critical Evaluation section. For each method used, state how well (or poorly) it worked, giving your precision values to back up your statement. Use relative errors (percentage errors) when comparing to other methods. Compare your experimental value and error with the "true" value from your team member. Discuss how well or poorly they agree, using statistics to support your discussion. Don't automatically blame all error on the instrument. It is all too common for students to use sloppy analytical technique in making standards and team unknowns. Try to sort out sloppiness errors from true instrumental errors.
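For reference, the distinction between absolute and relative error boils down to the following (the measured and true values here are invented):

```python
# Sketch: absolute vs. relative (percentage) error. Numbers are invented.
measured = 4.87      # your result, e.g. mg/mL
true_value = 5.00    # team member's "true" value

abs_error = measured - true_value            # same units as the measurement
rel_error_pct = 100.0 * abs_error / true_value  # unitless; use for comparisons

print(f"absolute error: {abs_error:+.2f} mg/mL")
print(f"relative error: {rel_error_pct:+.1f} %")
```

Absolute error carries the units of the measurement; relative error is unitless, which is why it is the right quantity when comparing methods that measure on different scales.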

Discuss all changes made to the suggested procedures. Here is where you can really show your professionalism. Show how your superior understanding of instrumental technique has saved the free world from the blunders made by simply "following the book."

Quality of Result:

If things don't seem to be working out, change the conditions right away. Don't assume all is well just because you are following the book. Get advice from your instructor or from fellow students. Don't discover the night before your report is due that your data are garbage. Heeding early warning signs of impending disaster is one of the hallmarks of a good professional. If, in spite of your best efforts, poor data is still being produced, talk to your instructor. Perhaps a simplification of what is expected can yield some useful data.