Guidelines for Judging
- Examine the quality of the Finalist’s work, and how well the Finalist understands his or her project and area of study. The physical display is secondary to the student’s knowledge of the subject. Look for evidence of laboratory, field or theoretical work, not just library research or gadgeteering.
- Judges should keep in mind that a science fair is not only a competition but also an educational and motivating experience for the students. The high point of the Fair experience for most students is their judging interviews.
- Students may have worked on a research project for more than one year. However, for the purpose of judging, ONLY research conducted within the current year is to be evaluated. Although previous work is important, it should not unduly impact the judging of this year’s project.
- As a general rule, judges represent professional authority to Finalists. For this reason, judges should use an encouraging tone when asking questions, offering suggestions or giving constructive criticism. Judges should not criticize, treat lightly, or display boredom toward projects they personally consider unimportant. Always give credit to the Finalist for completing a challenging task and/or for their success in previous competitions.
- Compare projects only with those competing at this Fair and not with projects seen in other competitions or scholastic events.
- It is important in the evaluation of a project to determine how much guidance was provided to the student in the design and implementation of his or her research. When research is conducted in an industrial or institutional setting, the student should have documentation, most often Intel ISEF Form 1C, in which the mentor or supervisor describes the student's role in the project. Judges should review this information in detail when evaluating the research.
- Please be discreet when discussing winners or making critical comments in elevators, restaurants, or elsewhere, as students or adult escorts might overhear. Results are confidential until announced at the awards ceremony.
- Participant numbers are the only reference to be used on scoring sheets; do not record names or project titles.
- Judges are encouraged to view projects before students are present and before interviews begin. Look through the data research notebook, Final Report binder and project board to get an overall impression of the quality and depth of the research in preparation for student interviews.
Evaluation Criteria for Category Judging
I. Creative Ability (30 points)
- Does the project show creative ability and originality in:
  - the questions asked?
  - the approach to solving the problem?
  - the analysis of the data?
  - the interpretation of the data?
  - the use of equipment?
  - the construction or design of new equipment?
II a. Scientific Thought (30 points)
For an engineering project, or for some projects in categories such as computer science and mathematical sciences, the more appropriate questions are those found in IIb, Engineering Goals.
- Is the problem stated clearly and unambiguously?
- Was the problem sufficiently limited to allow a plausible approach? Good scientists can identify important problems that are capable of solution.
- Was there a procedural plan for obtaining a solution?
- Are the variables clearly recognized and defined?
- If controls were necessary, did the student recognize their need and were they correctly used?
- Are there adequate data to support the conclusions?
- Does the Finalist or team recognize the data’s limitations?
- Does the Finalist/team understand the project’s ties to related research?
- Does the Finalist/team have an idea of what further research is warranted?
- Did the Finalist/team cite scientific literature, or only popular literature (e.g., local newspapers, Reader's Digest)?
II b. Engineering Goals (30 points)
- Does the project have a clear objective?
- Is the objective relevant to the potential user’s needs?
- Is the solution workable? acceptable to the potential user? economically feasible?
- Could the solution be utilized successfully in design or construction of an end product?
- Is the solution a significant improvement over previous alternatives?
- Has the solution been tested for performance under the conditions of use?
III. Thoroughness (15 points)
- Was the purpose carried out to completion within the scope of the original intent?
- How completely was the problem covered?
- Are the conclusions based on a single experiment or on replications?
- How complete are the project notes?
- Is the Finalist/team aware of other approaches or theories?
- How much time did the Finalist or team spend on the project?
- Is the Finalist/team familiar with the scientific literature in the studied field?
IV. Skill (15 points)
- Does the Finalist/team have the laboratory, computational, observational and design skills required to obtain the supporting data?
- Where was the project performed (e.g., home, school laboratory, university laboratory)? Did the student or team receive assistance from parents, teachers, scientists or engineers?
- Was the project completed under adult supervision, or did the student/team work largely alone?
- Where did the equipment come from? Was it built independently by the Finalist or team? Was it obtained on loan? Was it part of a laboratory where the Finalist or team worked?
V. Clarity (10 points)
- How clearly does the Finalist discuss his/her project and explain the purpose, procedure, and conclusions? Watch out for memorized speeches that reflect little understanding of principles.
- Does the written material reflect the Finalist’s or team’s understanding of the research?
- Are the important phases of the project presented in an orderly manner?
- How clearly are the data presented?
- How clearly are the results presented?
- How well does the project display explain the project?
- Was the presentation done in a forthright manner, without tricks or gadgets?
- Did the Finalist/team perform all the project work, or did someone help?