TIMSS 1999 SCIENCE ITEMS
Overview of TIMSS

timss 1999 represents the continuation of a long series of studies conducted by the International Association for the Evaluation of Educational Achievement (iea). Since its inception in 1959, the iea has conducted more than 15 studies of cross-national achievement in the curricular areas of mathematics, science, language, civics, and reading. The Third International Mathematics and Science Study (timss), conducted in 1995-1996, was the largest and most complex iea study to date, and included both mathematics and science at third and fourth grades, seventh and eighth grades, and the final year of secondary school.

In 1999, timss again assessed eighth-grade students in both mathematics and science to measure trends in student achievement since 1995. This study was also known as timss-Repeat, or timss-r. The results of timss 1999 were published in two companion volumes, TIMSS 1999 International Mathematics Report (Mullis, Martin, Gonzalez, Gregory, Garden, O'Connor, Chrostowski, and Smith, 2000) and TIMSS 1999 International Science Report (Martin, Mullis, Gonzalez, Gregory, Smith, Chrostowski, Garden, and O'Connor, 2000).

timss 1999 also included a voluntary Benchmarking Study in the United States, including 13 states and 14 districts and consortia. The results were published in Mathematics Benchmarking Report TIMSS 1999 - Eighth Grade: Achievement for U.S. States and Districts in an International Context (Mullis, Martin, Gonzalez, O'Connor, Chrostowski, Gregory, Garden, and Smith, 2001) and Science Benchmarking Report TIMSS 1999 - Eighth Grade: Achievement for U.S. States and Districts in an International Context (Martin, Mullis, Gonzalez, O'Connor, Chrostowski, Gregory, Smith, and Garden, 2001).

Participants in TIMSS 1999

Of the 42 countries that participated in timss1 at the eighth grade in 1995, 26 availed themselves of the opportunity to measure changes in the achievement of their students by also taking part in 1999 (see Exhibit 1). Twelve additional countries participated in 1999, for a total of 38 countries. Of those taking part in 1999, 19 had also participated in 1995 at the fourth grade. Since fourth-grade students in 1995 were in eighth grade in 1999, these countries can compare their eighth-grade performance with their performance at the fourth grade, as well as with the eighth-grade performance of students in other countries.

1. Results for 41 countries are reported in the 1995 international reports; Italy also completed the 1995 testing, but too late to be included.
Exhibit 1: Countries Participating in TIMSS 1999 and TIMSS 1995

Country | TIMSS 1999 | TIMSS 1995 (Grade 8) | TIMSS 1995 (Grade 4)
Australia | ● | ● | ●
Austria | - | ● | ●
Belgium (Flemish) | ● | ● | -
Belgium (French) | - | ● | -
Bulgaria | ● | ● | -
Canada | ● | ● | ●
Chile | ● | - | -
Chinese Taipei | ● | - | -
Colombia | - | ● | -
Cyprus | ● | ● | ●
Czech Republic | ● | ● | ●
Denmark | - | ● | -
England | ● | ● | ●
Finland | ● | - | -
France | - | ● | -
Germany | - | ● | -
Greece | - | ● | ●
Hong Kong, SAR | ● | ● | ●
Hungary | ● | ● | ●
Iceland | - | ● | ●
Indonesia | ● | - | -
Iran, Islamic Republic | ● | ● | ●
Ireland | - | ● | ●
Israel | ● | ● | ●
Italy | ● | ● | ●
Japan | ● | ● | ●
Jordan | ● | - | -
Korea, Republic of | ● | ● | ●
Kuwait | - | ● | ●
Latvia | ● | ● | ●
Lithuania | ● | ● | -
Macedonia, Republic of | ● | - | -
Malaysia | ● | - | -
Moldova | ● | - | -
Morocco | ● | - | -
Netherlands | ● | ● | ●
New Zealand | ● | ● | ●
Norway | - | ● | ●
Philippines | ● | - | -
Portugal | - | ● | ●
Romania | ● | ● | -
Russian Federation | ● | ● | -
Scotland | - | ● | ●
Singapore | ● | ● | ●
Slovak Republic | ● | ● | -
Slovenia | ● | ● | ●
South Africa | ● | ● | -
Spain | - | ● | -
Sweden | - | ● | -
Switzerland | - | ● | -
Thailand | ● | ● | ●
Tunisia | ● | - | -
Turkey | ● | - | -
United States | ● | ● | ●

The TIMSS 1999 Science Test

The timss curriculum framework underlying the timss 1995 science test was developed by groups of science educators with input from the timss National Research Coordinators (nrcs).2 The content aspect of the framework represents the subject matter content of school science. The performance expectations aspect of the framework describes, in a non-hierarchical way, the many kinds of performances or behaviors that might be expected of students in school science. Working within the science curriculum framework, science test specifications were developed for timss 1995 that included items representing a wide range of science topics and eliciting a range of skills from the students.

To provide as much information as possible about the nature and scope of the 1995 timss achievement tests, almost two-thirds of the items on the tests were released to the public. The remaining one-third were kept secure as a basis for accurately measuring trends in student achievement from 1995 to 1999. Releasing most of the 1995 items enabled more meaningful reports, both national and international, to be published and also provided information for secondary research. But it also meant that students in the timss 1999 samples may have been exposed to these items, which necessitated the development of new science items for timss 1999.

2. The complete TIMSS curriculum frameworks can be found in Robitaille, D.F. et al. (1993). TIMSS Monograph No. 1: Curriculum Frameworks for Mathematics and Science. Vancouver, B.C.: Pacific Educational Press.
The major goal of timss 1999 test development was to produce a test that would parallel that of timss 1995 in overall structure and content. The strategy used involved treating the 1995 items as a representative sample from the "pool" of all possible items within the defined test domain and selecting new items from this "pool" with the same subdomains as the released items from timss 1995. In practice, each released item was evaluated to define its subdomain (mathematics or science content, performance expectation, item format, and difficulty level), and a set of potential replacement items from the same subdomain was then created. This method ensured that the final test, comprising the nonreleased and replacement items, covered the same test domain as in timss 1995.

The tests were developed through an international consensus involving input from experts in science and measurement specialists.3 The timss Subject Matter Item Committee, which included distinguished scholars from 10 countries, ensured that the test reflected current thinking and priorities within the field of science. The items underwent an iterative development and review process with one pilot testing effort involving 31 countries. Every effort was made to help ensure that the tests represented the curricula of the participating countries and that the items did not exhibit any bias towards or against particular countries. The final forms of the test were endorsed by the nrcs of all the participating countries. The resulting test for the timss 1999 students (eighth grade in many countries) contained 146 science items representing a range of science topics and skills.

3. Garden, R.A. and Smith, T.A. (2000). "TIMSS Test Development" in M.O. Martin, K.D. Gregory, and S.E. Stemler, eds., TIMSS 1999 Technical Report. Chestnut Hill, MA: Boston College.

Approximately one-fourth of the timss items were in the free-response format, which required students to generate and write their own answers. Designed to represent approximately one-third of students' response time, some free-response questions asked for short answers, while others called for extended responses and required students to show their work. The remaining questions used a multiple-choice format. The distribution of items across content areas (as reported in the international reports) and performance expectations, as well as by item format, is presented in Exhibits 2 and 3, respectively. To ensure broad subject matter coverage without overburdening individual students, timss used a rotated design that included both the mathematics and science items. In accordance with the design, the mathematics and science items were assembled in 26 different clusters, labeled A through Z. The clusters were assigned to eight different booklets in accordance with the rotated design so that representative samples of students responded to each cluster.4 Each student completed one 90-minute test booklet containing both mathematics and science items. (A schematic illustration of this kind of cluster rotation follows Exhibit 3.)

4. The TIMSS test design is documented in Garden, R.A. and Smith, T.A. (2000). "TIMSS Test Development" in M.O. Martin, K.D. Gregory, and S.E. Stemler, eds., TIMSS 1999 Technical Report. Chestnut Hill, MA: Boston College.

Exhibit 2: Distribution of Science Items by Content Reporting Category

Reporting Category | Multiple-Choice | Short-Answer | Extended Response | Number of Items | Score Points
Earth Science | 17 | 4 | 1 | 22 | 23
Life Science | 28 | 7 | 5 | 40 | 42
Physics | 28 | 11 | - | 39 | 39
Chemistry | 15 | 2 | 3 | 20 | 22
Environmental and Resource Issues | 7 | 2 | 4 | 13 | 14
Scientific Inquiry and the Nature of Science | 9 | 2 | 1 | 12 | 13
Total | 104 | 28 | 14 | 146 | 153

Exhibit 3: Distribution of Science Items by Performance Category

Performance Category | Percentage of Items | Total Number of Items | Number of Multiple-Choice Items | Number of Free-Response Items | Number of Score Points
Understanding Simple Information | 39 | 57 | 56 | 1 | 57
Understanding Complex Information | 31 | 45 | 30 | 15 | 47
Theorizing, Analyzing, and Solving Problems | 19 | 28 | 5 | 23 | 32
Using Tools, Routine Procedures, and Science Processes | 7 | 10 | 9 | 1 | 10
Investigating the Natural World | 4 | 6 | 4 | 2 | 7
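To make the rotated design easier to picture, the short Python sketch below shows one way item clusters can be rotated across booklets so that each cluster is answered by a comparable subsample of students. It is an illustrative sketch only: the cluster labels and booklet count follow the description above, but the actual assignment of clusters A through Z to the eight timss 1999 booklets is the one documented in the technical report cited in footnote 4, not the round-robin rule used here.

```python
# Illustrative sketch only: a round-robin rotation of item clusters across
# booklets. The real TIMSS 1999 cluster-to-booklet assignment is documented
# in the TIMSS 1999 Technical Report (see footnote 4).
import random
import string

CLUSTERS = list(string.ascii_uppercase)  # 26 clusters labeled A through Z
NUM_BOOKLETS = 8

def build_booklets(clusters, num_booklets):
    """Distribute clusters over booklets in round-robin order so that every
    cluster appears in some booklet."""
    booklets = {b: [] for b in range(1, num_booklets + 1)}
    for i, cluster in enumerate(clusters):
        booklets[(i % num_booklets) + 1].append(cluster)
    return booklets

def assign_booklets(student_ids, num_booklets, seed=1999):
    """Rotate booklets across the student list (after a random start) so that
    each booklet, and hence each cluster, reaches a comparable share of students."""
    start = random.Random(seed).randrange(num_booklets)
    return {s: ((start + i) % num_booklets) + 1 for i, s in enumerate(student_ids)}

if __name__ == "__main__":
    for booklet, clusters in sorted(build_booklets(CLUSTERS, NUM_BOOKLETS).items()):
        print(f"Booklet {booklet}: clusters {', '.join(clusters)}")
    print(assign_booklets([f"student{i:02d}" for i in range(1, 9)], NUM_BOOKLETS))
```

A real rotated (matrix-sampling) design typically also repeats some clusters across booklets so that the booklets can be linked on a common scale; that refinement is omitted from this sketch.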
Item Release Policy

In accordance with iea policy, timss kept about one-half of the timss 1999 items secure for future use in measuring international trends in mathematics and science achievement. The secure items are in every second cluster, starting with cluster A. All the remaining items, those in every second cluster starting with cluster B, are available for general use. This means that half of the secure items from 1995 are now being released. To facilitate their use, the released timss items for timss 1999 have been replicated in their entirety in this science volume and in the companion mathematics volume. As shown in Exhibit 4, this volume contains 68 science items. To provide a unique identifier for each item, the timss cluster and item number is shown in the box on the right-hand side of each page.

Some of the free-response items have multiple parts, indicated as A, B, or C. In addition, for some items, students were asked to provide an answer with supporting work, or to provide two reasons, examples, consequences, etc. For these items, derived variables based on the combined scores of the separate parts (A and B, or B and C) were also computed. These derived variables are indicated as D.

While the purpose of this volume is to encourage the use of timss and timss items, please note the iea copyright; appropriate references to the iea and timss should be provided in your use of these items.

Item Documentation and Item Results

The timss tests were prepared in English and translated into 33 additional languages. Each item is reproduced for this volume as it was presented to each of the timss countries. In translating the tests or making adaptations for cultural purposes, every effort was made to ensure that the meaning and difficulty of items did not change. This process required an enormous effort by the national centers, with many checks made along the way.5

5. More details about the translation verification procedures can be found in O'Connor, K.M. and Malak, B. (2000). "Translation and Cultural Adaptation of the TIMSS Instruments" in M.O. Martin, K.D. Gregory, and S.E. Stemler, eds., TIMSS 1999 Technical Report. Chestnut Hill, MA: Boston College.

All of the items in this volume are science items. The mathematics items are provided in a companion volume, TIMSS 1999 Mathematics Items: Released Set for Grade 8.

Across the top of each item, there is documentation about the item including the item label, item identification, the classification of the item by content category and performance expectation, as well as information about scoring, trend status, and international performance. If the item is a two-part item, the documentation for Part A is shown on the first page and the documentation for Part B is shown on the following page.

Key. For multiple-choice items, the key for the correct answer is provided. For free-response questions, the scoring rubrics identifying categories of responses and their codes are shown next to the item. In scoring the timss free-response questions, timss utilized two-digit codes with rubrics specific to each item. The first digit designates the correctness level of the response. The first digit is usually a "1" designating a correct response, a "7" indicating an incorrect response, or a "9" for non-response. Sometimes, however, fully correct responses are differentiated from partially correct responses. In these instances, the fully correct responses are designated by a "2" and the partially correct responses by a "1." The second digit, combined with the first digit, represents a diagnostic code used to identify specific types of approaches, strategies, or common errors and misconceptions. (A short illustrative sketch of reading these two-digit codes follows the list of content categories below.)

Content Category. The science items were reported according to six content areas.

• Earth Science
• Life Science
• Physics
• Chemistry
• Environmental and Resource Issues
• Scientific Inquiry and the Nature of Science

Exhibit 4 indicates which items have been classified into each of the six content areas.
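The two-digit coding convention described under "Key" can be illustrated in a few lines of code. The sketch below is a hypothetical reading aid, not part of any TIMSS scoring software; it simply splits a code into the correctness level (first digit) and the diagnostic digit (second digit), following the convention described above.

```python
# Hypothetical helper illustrating the two-digit free-response codes described
# under "Key": the first digit gives the correctness level, the second digit
# is a diagnostic code for the type of response, strategy, or misconception.

CORRECTNESS = {
    "2": "fully correct (used when partial credit is given)",
    "1": "correct (or partially correct when a '2' level exists)",
    "7": "incorrect",
    "9": "non-response",
}

def interpret_code(code: str) -> dict:
    """Split a two-digit response code into its correctness and diagnostic parts."""
    if len(code) != 2 or not code.isdigit():
        raise ValueError(f"expected a two-digit code, got {code!r}")
    return {
        "code": code,
        "correctness": CORRECTNESS.get(code[0], "item-specific correctness level"),
        "diagnostic digit": code[1],
    }

# For example, on an item scored only correct/incorrect, "10" and "11" are both
# correct responses of different diagnostic types, while "72" is an incorrect
# response of diagnostic type 2 and "99" is a non-response.
for c in ("10", "11", "20", "72", "99"):
    print(interpret_code(c))
```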
Performance Expectation. Items were classified into the following performance expectations.

• Understanding Simple Information
• Understanding Complex Information
• Theorizing, Analyzing, and Solving Problems
• Using Tools, Routine Procedures, and Science Processes
• Investigating the Natural World

International Average Percentage of Eighth Grade Students Responding Correctly. The percent of students responding correctly to the item reflects the international average across the countries participating in timss 1999. That is, first the percentage of students responding correctly to the item was calculated for each country. Next, an average was calculated across the 38 countries. For items using a partial credit scoring scheme, the percentages given are for students responding with fully correct answers. (A short computational sketch of this two-step average follows the list of reports below.)

For More Information About TIMSS

For more details about the timss 1999 results and procedures, please see the following reports:

Martin, M.O., Mullis, I.V.S., Gonzalez, E.J., Gregory, K.D., Smith, T.A., Chrostowski, S.J., Garden, R.A., & O'Connor, K.M. (2000). TIMSS 1999 International Science Report: Findings from IEA's Repeat of the Third International Mathematics and Science Study at the Eighth Grade. Chestnut Hill, MA: Boston College.

Mullis, I.V.S., Martin, M.O., Gonzalez, E.J., Gregory, K.D., Garden, R.A., O'Connor, K.M., Chrostowski, S.J., & Smith, T.A. (2000). TIMSS 1999 International Mathematics Report: Findings from IEA's Repeat of the Third International Mathematics and Science Study at the Eighth Grade. Chestnut Hill, MA: Boston College.

Martin, M.O., Gregory, K.D., & Stemler, S.E. (Eds.). (2000). TIMSS 1999 Technical Report. Chestnut Hill, MA: Boston College.
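As a reading aid, the short sketch below works through the two-step averaging described under "International Average Percentage of Eighth Grade Students Responding Correctly": percent correct is computed within each country, and those percentages are then averaged across countries. The country names and response data are invented for illustration, and the sketch ignores the survey weighting and other machinery of the actual TIMSS analyses documented in the TIMSS 1999 Technical Report.

```python
# Simplified, invented-data illustration of the international average described
# above: percent correct is computed within each country, then those country
# percentages are averaged (each country counting equally).

def percent_correct(responses):
    """Percent of students with a fully correct response (1 = correct, 0 = not)."""
    return 100.0 * sum(responses) / len(responses)

def international_average(responses_by_country):
    """Unweighted mean of the per-country percentages."""
    percents = [percent_correct(r) for r in responses_by_country.values()]
    return sum(percents) / len(percents)

# Hypothetical data for three countries (the real average is taken over 38).
responses_by_country = {
    "Country A": [1, 0, 1, 1, 0, 1, 1, 0],  # 62.5 percent correct
    "Country B": [1, 1, 0, 0, 1, 0, 0, 0],  # 37.5 percent correct
    "Country C": [1, 1, 1, 0, 1, 1, 0, 1],  # 75.0 percent correct
}
print(round(international_average(responses_by_country), 1))  # 58.3
```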
Exhibit 4: Item Listing by Science Content Area

Earth Science
B05 Elevation diagram of wind/temperature
D03 Contour map showing river
F05 Oxygen equipment on mountain tops
H03 Why moon shines
H04 Diagram of soil layers
J01 Earth's plates over millions of years
J06 Factor explaining seasons on Earth
J09 Life on other planets
R04 Atmospheric conditions in jets
Z02 Diagram of rain from sea

Life Science
D05 Sensory messages to the brain
D06 Seed development from plant part
F01 Characteristic of mammal
F03 Interpretation of senses
H01 NOT a function of blood
H02 Role of vitamins
J02 Feature shared by all insects
J07 Reason for protein in diet
L02 Large leaves on seedlings
L03 Physical characteristic of prey
L05 Wolves marking territory
Physics
B02 Energy released from car engine
B03 Greatest density from mass/volume table
B06 Color reflecting most light
D04 Sequence of energy changes
N09 Balancing 10 and 5 liter buckets
N10 Flashlights with white/black reflectors
P01 Determination of speed from graph
P02 Amount of light on wall and ceiling

Chemistry
F06 Best reason for painting iron surfaces
H06 Burning wood absorbs/releases energy
J03 Compounds, molecules and atoms
R05 Small pieces of wood burn faster

Environmental and Resource Issues
P05A Two reasons for famine
P05B Two reasons for famine
P05D Two reasons for famine
R06 Result of global warming
Layers of Earth B01
Content Category | Performance Expectation | Key | Score Points | International Average Percentage of 8th Grade Students Responding Correctly | Used in 1995
Life Science | Understanding Complex Information | C | 1 | 87 | Y
Physics | Understanding Complex Information | B | 1 | 42 | Y
Physics | Understanding Complex Information | C | 1 | 72 | Y
Physics | Understanding Complex Information | C | 1 | 59 | Y
Physics | Understanding Complex Information | A | 1 | 65 | Y
Environmental and Resource Issues | Understanding Complex Information | D | 1 | 68 | Y
Earth Science | Understanding Complex Information | A | 1 | 79 | Y
Earth Science | Understanding Complex Information | A | 1 | 48 | Y
Chemistry | Understanding Complex Information | Rubric | 1 | 46 | N
Physics | Understanding Complex Information | C | 1 | 84 | N
Earth Science | Understanding Complex Information | C | 1 | 26 | N
Note: A correct response must clearly indicate the reason why a condition listed in the table
makes it hard for humans to live on Proto. Responses referencing insufficient (too little,
less, not enough, etc.) oxygen with or without explicitly mentioning breathing will be
given credit (Code 10) due to the assumption of common knowledge. If more than one
reason is given, assign the code corresponding to the first correct reason.
Physics | Understanding Complex Information | D | 1 | 62 | N
Life Science | Understanding Complex Information | D | 1 | 72 | N
Life Science | Understanding Complex Information | A | 1 | 37 | N
Note: A correct response must identify B and include an explanation based on the concept of
energy efficiency (ratio of energy output to energy input) that compares the volume of
water pumped for an equivalent volume of gasoline used for the two machines.
Responses based ONLY on comparing the amount of gasoline used OR the amount of
water pumped by the machines without considering the ratio of water/gasoline are scored
as incorrect (Codes 70 and 72). No credit is lost for missing/incorrect units or for minor
computational errors, provided the correct conclusion and explanation are given.
Environmental and Resource Issues | Understanding Complex Information | B | 1 | 48 | N
Life Science | Understanding Complex Information | Rubric | 1 | 55 | N
Note: There are two possible food webs that are accepted as correct. The most likely
corresponds to Code 10. An alternative, but less preferred, food web with the hawk (3)
and snake (4) reversed is also scored as correct (Code 11).
Physics | Understanding Complex Information | C | 1 | 64 | N
Note: A correct response must include a feasible explanation directly relating the predicted
change in robin population to the effect of corn crop failure on prey/predator
relationships indicated in the food web. Responses do not have to use the specific terms
decrease, increase, and same, as long as the explanation is clear with respect to the
effect on the robin population. If more than one effect is given, assign the code
corresponding to the first correct explanation.
Life Science | Understanding Complex Information | D | 1 | 40 | N
Note: A correct response must identify Roddy’s and include an explanation based on the
relative reflectivity of the white and black cardboard. Credit is given both for responses
explicitly mentioning the higher reflectance of the white cardboard and/or the higher
absorptance of the black cardboard as well as responses communicating this concept
using less scientific terminology.
Note: A correct response is based on the same amount of light reaching both the ceiling and the
wall but being more spread out (less bright) on the ceiling. Correct responses must
identify NO and include an explanation that states that the light is the same (Code 10) or
that indicates that the light is just more spread out (less bright) on the ceiling without
explicitly stating same (Code 11). If the explanation merely repeats information that is in
the stem, it is scored as incorrect (Code 71) even if NO is checked. If a response indicates
that there is less light on the ceiling, the explanation must include a correct reason based
on more air absorption/scattering at a greater distance to receive the correct Code 12.
Responses that indicate less light at a greater distance without further explanation should
receive Code 70.
Note: A correct response is based on trees increasing in height as a result of growth at the tips
of stems/branches (apical meristem) and trunk growth only resulting in increased
diameter. Responses should be scored as correct if either of these two factors is included.
Life Science | Understanding Complex Information | D | 1 | 48 | N
Environmental and Resource Issues | Understanding Complex Information | Rubric | 2 | 42 | N
Note: Each of the two reasons must be coded separately. The same code can be used twice.
However, if the reasons described are essentially the same, or an extension of the same
idea, a Code 79 should be given to the second one. If only one reason is given, a Code 99
should be given for the second reason.
Life Science | Understanding Complex Information | Rubric | 1 | 41 | N
Note: To receive code 10 or 11, a response must name a specific digestive substance found in
the stomach (enzyme, hydrochloric acid, or gastric juices) with or without a full
description of its function. A general response related to “acid” will be accepted as
correct (code 12), but an incorrect acid will be scored as incorrect (code 70).
Physics | Understanding Complex Information | C | 1 | 37 | N
Life Science | Understanding Complex Information | Rubric | 2 | 40 | N
Note: Each of the two outcomes is coded separately. The same code may be used twice, since
they are based on general outcome categories. However, if the two outcomes are
essentially the same, the second outcome should be coded as 79. If only one outcome is
given, the second should be coded as 99.
Earth Science | Understanding Complex Information | Rubric | 1 | 33 | N
Note: A correct response must include an explanation identifying an atmospheric condition that
is different at high altitudes and why it must be controlled for. Responses referencing
low oxygen level (too little, less, not enough) with or without explicitly mentioning
breathing will be given credit (code 10) due to the assumption of prior knowledge.
Note: A correct response is based on the concept of increased surface area in the smaller pieces
resulting in faster burning (reaction with oxygen). Credit is given both for higher-level
responses indicating increased availability of oxygen/air (Code 10) or surface area (Code
11) in the chopped wood pieces as well as less sophisticated responses describing only
that more wood is exposed to the flame and can, therefore, burn simultaneously (Code
12).
Note: A correct response must identify more and include a correct explanation based on
electrical energy being converted to heat (Code 10) or a more general description of
energy losses or low efficiency (Code 11). Responses that include explanations based on
heat, energy losses or low efficiency but with an incorrect application to the problem by
checking less are scored as incorrect (Code 72).
Note: A fully correct procedure may or may not include a separate materials list in order to
receive full credit. If a materials list is not included, then time measurements must be
explicitly referenced within the procedure (e.g. ‘time how long it takes’). Partial credit is
given for responses where one of the criteria for Code 20 is not completely satisfied.
Note: A correct response must explicitly reference rusting, corrosion, oxidation, or a comparable
term.
Note: A fully correct response must show clear evidence of the following 4 steps:
(i) Evaporation of water from the sea
(ii) Condensation (as clouds)
(iii) Transportation (from sea to land)
(iv) Precipitation.
The steps do not have to be indicated on a labeled diagram for full credit, but the drawing
and/or accompanying explanatory text must be clear with respect to the direction of water
flow. Steps (ii) and (iii) may be clearly indicated as two steps (e.g. formation of clouds
and transportation by wind) or as a single step showing a series of clouds extending over
land and sea.
Note: A correct response is based on an increase in gas volume (or internal gas pressure) as a
result of increased temperature. Credit is given for both higher-level responses relating
to the increased kinetic energy of helium atoms as a function of temperature (Code 10) as
well as more general responses relating to increased internal gas pressure and/or gas
volume (Code 11). An increase in temperature does not have to be explicitly mentioned
in order to receive credit. Responses referring ONLY to the balloon expanding or to the
effect of temperature on the balloon without further explanation of the gas behavior are
scored as incorrect.