About CIERA

http://www.ciera.org

The Center for the Improvement of Early Reading Achievement (CIERA) is the national center for research on early reading. It represents a consortium of educators at five universities (University of Michigan, University of Virginia, and Michigan State University, with University of Southern California and University of Minnesota), along with teacher educators, teachers, publishers of texts, tests, and technology, professional organizations, and schools and school districts across the United States. CIERA is supported under the Educational Research and Development Centers Program, PR/Award Number R305R70004, as administered by the Office of Educational Research and Improvement, U.S. Department of Education.



Mission

CIERA's mission is to improve the reading achievement of America's children by generating and disseminating theoretical, empirical, and practical solutions to persistent problems in the learning and teaching of beginning reading.



CIERA Research Model

The model that underlies CIERA's efforts acknowledges many influences on children's reading acquisition. The multiple influences on children's early reading acquisition can be represented in three successive layers, each yielding an area of inquiry of the CIERA scope of work. These three areas of inquiry each present a set of persistent problems in the learning and teaching of beginning reading:



CIERA Inquiry 1
Readers and Texts

Characteristics of readers and texts and their relationship to early reading achievement. What are the characteristics of readers and texts that have the greatest influence on early success in reading? How can children's existing knowledge and classroom environments enhance the factors that make for success?



CIERA Inquiry 2
Home and School

Home and school effects on early reading achievement. How do the contexts of homes, communities, classrooms, and schools support high levels of reading achievement among primary-level children? How can these contexts be enhanced to ensure high levels of reading achievement for all children?


CIERA Inquiry 3
Policy and Profession

Policy and professional effects on early reading achievement. How can new teachers be initiated into the profession and experienced teachers be provided with the knowledge and dispositions to teach young children to read well? How do policies at all levels support or detract from providing all children with access to high levels of reading instruction?

 

An Analysis of Early Literacy Assessments Used for Instruction


CIERA Report #2-013

Samuel J. Meisels and Ruth A. Piker
University of Michigan

CIERA Inquiry 2: Home and School
What classroom-based literacy measures are available to teachers and how can we best characterize the instructional assessments teachers use in their classrooms to evaluate their students' literacy performance?

CIERA April 23, 2001

This report focuses on results of a systematic study of instructional assessments of early literacy designed by teachers and other educators for use in K-3 classrooms. The report presents the methodology and coding scheme used for collecting classroom-based measures and evaluating their content. It provides data about how reading and writing skills are assessed by teachers and shows the relationship between the skills included on these assessments and the skills associated with national standards and benchmarks. It also characterizes the instructional assessments teachers use in their classrooms to evaluate their students' literacy performance in terms of categories of skills assessed, types of assessment models utilized, differences in student responses elicited by the assessments, forms of administration, types of mental processing required of students, and other parameters. The discussion concerns questions about the psychometric properties of these assessments, their relationship to national standards, and their place in the instructional process for classroom teachers.


University of Michigan School of Education

CIERA

610 E University Ave., Rm 1600 SEB
Ann Arbor, MI 48109-1259

734.647.6940 voice
734.615.4858 fax
ciera@umich.edu

www.ciera.org

 

©2001 Center for the Improvement of Early Reading Achievement.
This research was supported under the Educational Research and Development Centers Program, PR/Award Number R305R70004, as administered by the Office of Educational Research and Improvement, U.S. Department of Education. However, the comments do not necessarily represent the positions or policies of the National Institute of Student Achievement, Curriculum, and Assessment; the National Institute on Early Childhood Development; or the U.S. Department of Education, and you should not assume endorsement by the Federal Government.


An Analysis of Early Literacy Assessments Used for Instruction


Samuel J. Meisels and Ruth A. Piker
University of Michigan

The current administration in Washington has made development of early reading skills a topic of great importance. President Bush's predecessor, Bill Clinton, did his part to raise early reading assessment to a pinnacle of public attention when, in his 1997 State of the Union address, he said that "Every state should adopt high national standards, and by 1999 every state should test every fourth grader in reading and every eighth grader in math to make sure these standards are met. . . . Good tests will show us who needs help, what changes in teaching to make, and which schools to improve."

Unfortunately or not, the President's words outstripped reality. Congress fought his plan for "voluntary" national tests in reading and math and refused to allow government funds to be used for this purpose. On a more academic level, one can see that his goals for "good tests" can never be achieved by a single assessment: No test can, by itself, serve as many purposes as the President desired. First, in order for a test to "show us who needs help" we would need information about individuals that predicts future performance. This is what Resnick and Resnick (1992) call selection and certification of students. Second, in order to know what changes in teaching to make, we would need to have tools available that would permit us to diagnose particular strengths and weaknesses in individual student performances and then be in a position to monitor the effects of instruction. This type of assessment is called instructional management and monitoring, or instructional assessment. Finally, if we want our tests to tell us "which schools to improve" we are seeking an assessment that provides public accountability and program evaluation. Such tests provide those with responsibility for the funding and supervision of education with information on whether a particular program is succeeding in its academic goals (Resnick & Resnick, 1992).

In short, no single assessment can cover all of the purposes that are required of tests and evaluations. Of all the testing that takes place in schools, the vast majority is created by teachers or is otherwise some form of informal classroom or instructional assessment (Stiggins & Bridgeford, 1985; Stiggins, Griswold, & Wikelund, 1989). Although teachers devote some attention to diagnostic assessments in order to enhance their instructional practices (see Lipson & Wixson, 1991; Murphy, Shannon, Johnston, & Hansen, 1988), and schools, districts, states, and the federal government certainly impose accountability testing in great quantities (see Anthony, Johnson, Mickelson, & Preece, 1991; Calkins, Montgomery, Santman, & Falk, 1998), the vast majority of the available assessment time and energy is consumed by instructional assessment.

We define instructional assessment as formal or informal methods of obtaining information about children's classroom performance in order to guide instructional decision-making and provide instructionally relevant information to teachers. In an instructional assessment the primary focus is on individual learning rather than on group reporting of average scores. More specifically, instructional assessment is not designed to rank or compare students or to be used for high-stakes purposes. Rather, it is a tool for the teacher, and its value is linked directly to its impact on instruction. Instructional assessments are intended to clarify what students are learning and have begun to master by providing information that is relevant to understanding individual students' learning profiles. In this way, like other authentic performance assessments, their purpose is to enhance learning and improve instruction (Calfee, 1992; Calfee & Hiebert, 1991; Meisels, 1997).

Conventional standardized tests of reading achievement have been subjected to extensive analysis (see Haladyna, Nolen, & Haas, 1991; Stallman & Pearson, 1990a, 1990b), but less information is available regarding instructional assessments. Indeed, the National Research Council's Committee on the Prevention of Reading Difficulties (Snow, Burns, & Griffin, 1998) made the following recommendation:

Toward the goal of assisting teachers in day-to-day monitoring of student progress along the array of dimensions on which reading growth depends, the appropriate government agencies and private foundations should sponsor evaluation, synthesis, and, as necessary, further development of informal and curriculum-based assessment tools and strategies. In complement, state and local school districts should undertake concerted efforts to assist teachers and reading specialists in understanding how best to administer, interpret, and instructionally respond to such assessments. (p. 337)

In short, notwithstanding several attempts to describe the significance and role of instructional assessment in the classroom routine (Taylor, 1990; Valencia & Calfee, 1991; Winograd, Paris, & Bridge, 1991), more focus is needed on the area of instructional assessment--particularly in the area of literacy. This technical report is intended to provide a compilation and analysis of early literacy assessments used for instruction.

The purpose of this study is threefold: (a) to gain an understanding of classroom-based literacy measures that are available to teachers; (b) to characterize the instructional assessments teachers use in their classrooms to evaluate their students' literacy performance; and (c) to learn more about how teachers assess reading and writing elements. Throughout this report we will refer to "skills and elements" to denote what the literacy assessments are designed to measure. In some cases (e.g., spelling, punctuation, phonetic analysis), the assessments focus clearly on skills. In other cases (e.g., demonstrating concepts of print; extracting meaning from text; assessing self-reflection, motivation, or attitudes), the term "literacy element" is more appropriate.

Our specific research questions focus on both the measures available for analysis and the skills and elements inherent in the measures. Regarding the measures, we asked the following questions:

Regarding the literacy skills or elements that are implicit in the measures:

This report presents our response to these research questions as well as a set of recommendations based on them. It is accompanied by a database available on the CIERA website (www.ciera.org) that provides detailed information about each of the assessments reviewed for this report.

I. Methods


A. Sample


1. Selection criteria

We used four criteria to select early literacy assessments for this study. First, we included measures that were developed for use in classrooms by teachers, school districts, state departments of education, and/or researchers. As will be described later, these measures were nominated by teachers and other educational professionals. Second, for the most part we focused on measures that were developed and distributed by noncommercial publishers. Third, we included measures whose primary purpose was instruction, rather than accountability. Finally, we examined assessments that targeted children between kindergarten and third grade. Measures that extended beyond third grade were only analyzed to grade 3.

Several measures that were recommended by our sources were not included in our sample. We excluded measures designed primarily for toddlers, preschoolers, or students in fourth grade and beyond; non-literacy related assessments (e.g., science, social studies); assessments used for research purposes; and assessments primarily used for accountability purposes. We included, but did not comprehensively sample, measures that assess motivation, self-perception, and attitudes toward reading.
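Taken together, the inclusion and exclusion rules amount to a simple screening predicate. The following is a minimal sketch of that screen; the attribute names are ours and hypothetical, since the actual screening was done by hand rather than in a database.

```python
# An illustrative predicate for the selection rules described above.
# All field names are hypothetical; they do not come from the CIERA database.
def include_measure(m) -> bool:
    classroom_developed = m["developer"] in {"teacher", "district", "school", "state", "researcher"}
    noncommercial = not m["commercial_publisher"]          # "for the most part"
    instructional = m["primary_purpose"] == "instruction"  # not accountability or research
    targets_k3 = bool(m["grade_range"] & {"K", "1", "2", "3"})  # analyzed only through grade 3
    literacy = m["domain"] == "literacy"
    return classroom_developed and noncommercial and instructional and targets_k3 and literacy

example = {
    "developer": "teacher",
    "commercial_publisher": False,
    "primary_purpose": "instruction",
    "grade_range": {"K", "1"},
    "domain": "literacy",
}
assert include_measure(example)
```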

2. Sources of measures

We gathered the measures used in this survey from five sources: listservs, personal contacts, literature searches and published reviews of the measures, websites, and newsletter postings. We posted a request for information regarding classroom-based literacy practices on eight listservs (see Table 1). These listservs reach a wide range of practitioners, researchers, and policymakers, many of whom provided us with names of informal literacy assessments and with referrals regarding people to contact, books to review, and websites to examine.

Personal contacts took place with practitioners, researchers, state-level policymakers, and representatives of professional reading organizations. These contacts included individuals who responded to our listserv postings as well as leading researchers, state reading coordinators, academics, and others who were recommended to us. These conversations led to our receiving copies of several measures, as well as additional suggestions for other literacy assessments.

Table 1. Listservs Used for Data Collection

Acronym | Title | Subscription Address
AERA-D | American Educational Research Association--Measurement and Research Methodology | Listserv@asu.edu
ARN_L | Assessment Reform Network | Listserv@lists.cua.edu
ASCELA | Assembly of State Coordinators of English/Language Arts | Ascela@servl.ncte.org
CIERA | Center for the Improvement of Early Reading Achievement | Not publicly available
ECENET-L | Early Childhood Education | Listserv@postoffice.cso.uiuc.edu
K12ASSESS-L | Discussion of K-12 Education Assessment | Mailserv@lists.cua.edu
NRC | National Reading Conference | Not publicly available
TAWL | Teaching Whole Language Discussion | Listserv@listserv.arizona.edu

Our literature search identified numerous books, journals, articles, and papers that were reviewed for relevant assessment information. Most sources consisted of guidelines for developing informal assessments, assessing students in higher grades, and current trends in the field of assessment. A few included specific assessments for K-3. The majority of the assessments were found in books, and several were located in such reading journals as The Reading Teacher and Elementary School Journal. Other searches provided standardization and psychometric properties for the assessments we received.

We also accessed the websites of numerous national organizations, state departments of education, schools, and the U. S. Department of Education's Cross-Site Index (see Table 2). These websites were primarily concerned with assessment-related information and described articles, books, and handouts with guidelines for developing informal assessments. The few sites with specific literacy assessments for K-3 described materials that were commercially developed and distributed.

Table 2. Websites Reviewed for Literacy Assessments

Name of Website | Address
ERIC Clearinghouse | http://ericps.crc.uiuc.edu; http://ericae.net/bstore/
The Learning Record | http://www.cwrl.utexas.edu/~syverson
The Work Sampling System | http://www.rebusinc.com
Richard C. Owen Publishers | http://www.rcowen.com
K-12ASSESS | http://ericae.net/scripts/small3.htm
CIERA | http://www.ciera.org/intranet (not publicly available)
US Department of Education's Cross-Site Index | http://search.ed.gov/csi/index.html; http://www.cwrl.utexas.edu; http://scrtec.org/track/tracks/c00133an.html; http://www.indiana.edu/~eric_rec/ieo/bibs/altasses.html; http://scrtec.org/track/tracks/t00133.html
Connections to Regional Educational Laboratories | http://www.nwrel.org/scpd/natspec/catalog/readrecovery.htm; http://www.nwrel.org/nwreport/sept96/edition.html; http://www.nwrel.org/nwreport/sept96/biblio.html; http://www.nwrel.org/eval/ea%5fbibs/folio.html; http://www.mcrel.org/resources/literacy/; http://www.ncrel.org; http://www.ncrel.org/sdrs/areas/as0top10.htm
National Center for Research on Evaluation, Standards, and Student Testing (CRESST) | http://www.cresst96.cse.ucla.edu/
The Harbor School Assessment Model | http://www.wolfenet.com/~harbrsch/assessmodel.html (no longer available)
Beth Conant's site | http://www.users.sgi.net/~cokids; http://www.servtech.com/~germaine/rubric.html; http://www.ehhs.cmich.edu/ins/kidart.perf
The New "Teacher's Guide to the U.S. Department of Education" | http://www.ed.gov/pubs/TeachersGuide/
The Department of Education's Office of Reform Assistance & Dissemination (ORAD) | http://www.ed.gov/offices/OERI/ORAD/

We posted a notice in a large number of local, state, and national newsletters that reach reading teachers and early childhood and elementary educators. Local affiliates of the Michigan Reading Association, state affiliates of the National Association for the Education of Young Children (NAEYC), and affiliates of the International Reading Association agreed to post our notice in their newsletters (see Table 3). Although these requests for literacy assessments reached a large number of practitioners, we received only a handful of assessments from this effort; however, the measures we received included references to other literacy-related measures for K-3. It is clear, then, that this report does not include an exhaustive enumeration of informal literacy assessments; it represents a sampling of the universe of such measures.

Table 3. Organizations That Posted an Information Request in Their Newsletter

Michigan Local Organizations: Metro Detroit Reading Council; Oakland County Reading Council

State Affiliates of the International Reading Association (IRA): Colorado Council of IRA; Connecticut Reading Association; State of Maryland IRA; Massachusetts Reading Association; Michigan Reading Association; Missouri IRA; New England Reading Association; Oregon Reading Association; South Carolina Council of Teachers of English; South Carolina Council of IRA; Texas State Reading Association

State and Local Affiliates of the National Association for the Education of Young Children (NAEYC): Boston; California-San Diego; Chicago Metropolitan; Delaware Valley; Hawaii; Indiana; Michigan; New York; New York City; Ohio; Texas; Texas-Houston; Wisconsin Early Childhood Association

3. Developers and currency of measures

Overall, we collected a large number of measures (N = 89) created by a wide spectrum of developers (states, 10%; districts or schools, 11%; teachers, 16%; researchers, 60%; and other developers, 3%). The copyright dates of the assessments extend from 1936 to 1999, although the majority are from the past 10 years (N = 60). For assessments with more than one version, the most recent edition was analyzed. All measures were examined directly, either through obtaining copies of the measures from the developers or through library or interlibrary loan requests.

 

B. Coding Manual


The coding scheme for analyzing the measures is adapted from Stallman and Pearson (1990b), Pearson, Sensale, Vyas, and Kim (1998), Stiggins (1995), Mariotti and Homan (1997), and our own exploratory analysis. The list of analytic categories is presented in Table 4. The coding scheme is organized around the types of literacy elements evaluated and the ways in which these skills or elements are assessed at different grade levels. The scheme is divided into two broad sections: (a) general overview, and (b) skills or elements tested, with each section further subdivided into more discrete elements. The coding manual, which provides a description of each section, is located in Appendix A. Below we describe the contents of the coding scheme; a schematic sketch of one coding record follows Table 4.

Table 4. Outline of the Coding Scheme

I. General Overview
   A. Title of assessment
   B. Author(s)
   C. Availability
   D. Overall purpose
   E. Language(s)
   F. Grade/age
   G. Form of administration
   H. Frequency
   I. Amount of time required to administer
   J. Assessment model(s)
   K. Format(s) for recording student response
   L. Category of elements
   M. Standardization
   N. Psychometric properties
   O. Comments
   P. Notes

II. Skills or Elements Tested
   A. Skill or element
   B. Grade
   C. Form of administration
   D. Frequency
   E. Amount of time required to administer
   F. Assessment model
   G. Item response format
   H. Number of items
   I. Description of items
   J. Presentation
      1. Mode
      2. Unit of presentation
   K. Response
      1. Type of mental processing
      2. Unit of response
      3. Student response
   L. Scoring
   M. Notes
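Concretely, each assessment yields one record shaped like the outline above. The following is a minimal sketch of such a record as a data structure; the field names are illustrative and abridged, not the schema of the CIERA database.

```python
# A minimal sketch of one coding record, mirroring the two-part scheme in
# Table 4. Field names are illustrative, not the report's actual schema.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SkillEntry:
    """Section II: one skill or element tested by a measure."""
    skill: str                      # e.g., "Spelling"
    grades: List[str]               # e.g., ["K", "1", "2", "3"]
    form_of_administration: str     # "individual", "group", or "both"
    assessment_model: str           # e.g., "observation", "on-demand response"
    item_response_format: str       # e.g., "checklist", "oral-directed"
    number_of_items: Optional[int] = None
    scoring: Optional[str] = None   # "rating scale", "rubric", or "yes/no"

@dataclass
class MeasureRecord:
    """Section I: the general overview of one assessment."""
    title: str
    authors: List[str]
    overall_purpose: str
    languages: List[str]
    grades: List[str]
    standardized: bool = False
    skills: List[SkillEntry] = field(default_factory=list)
```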

 

1. General overview

The general overview contains identifying information about the measure, including names of authors, general availability, overall purpose, and language availability. The purpose of a measure indicates its overall intent. Some measures are very specific about the types of elements they evaluate (e.g., spelling, phonemic awareness), whereas others are more global and encompass a range of elements (e.g., reading, writing). Information concerning the measure's standardization and psychometric properties is located in this section. Finally, any additional information unique to the measure that is not included in the Skills or Elements Tested section is indicated in the comments section. The general overview also provides a summary of the contents of the Skills or Elements Tested section, the grade levels evaluated by the measure, the form of administration, frequency, time required to administer the measure, assessment models, format for recording student responses, and category of elements.

2. Skills or elements tested

This section examines the specific skills or elements the measures are designed to assess. Eighty-eight percent of the measures assess more than one literacy element; the number of elements assessed by a single measure ranges from 1 to 67.

The elements are divided into eleven literacy-related categories, with two additional categories examining students' oral language and other elements. These categories are further subdivided into specific constituents, accounting for 133 skills or elements in all (see Table 5). The categories and constituent elements were derived from our analysis of the assessments. We compared these elements to the standards and benchmarks compiled by the Mid-continent Regional Educational Laboratory (McREL; Kendall & Marzano, 1997). McREL standards and benchmarks provide a format that reflects state and national standards in the various curriculum domains. The McREL content standards for Language Arts comprise eight standards for K-12. We include the eight Language Arts standards with their benchmarks for K-3 as an appendix to the coding manual (see Appendix A), and we indicate with an asterisk those elements that are referenced in the McREL content standards.
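In practice, this cross-referencing is a set-membership check. As a toy illustration (the three names below appear in Table 5, but this set is deliberately truncated; the full McREL-referenced list is in Appendix A):

```python
# A toy sketch of flagging elements against the McREL-referenced list.
# Only three names are shown; the full list appears in Appendix A.
MCREL_REFERENCED = {"Spelling", "Capitalization", "Retelling"}

def tag_elements(elements):
    """Append an asterisk to elements that match a McREL benchmark/standard."""
    return [name + "*" if name in MCREL_REFERENCED else name for name in elements]

print(tag_elements(["Spelling", "Handwriting", "Retelling"]))
# ['Spelling*', 'Handwriting', 'Retelling*']
```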

Table 5. Literacy Categories and Constituent Skills or Elements
(An asterisk indicates an element that matches a McREL benchmark and standard.)

Writing Process
a. Illustrations Are Representative of the Story
b. Message Quality
c. Types of Compositions
d. Uses Illustrations to Express Ideas
e. Uses Lively and Descriptive Language
f. Use of Formal and/or Literary Language
g. Vocabulary Usage
h. Writing Attends to Audience*
i. Writing Behaviors
j. Writing Contains a Purpose*
k. Writing Contains Description and Details
l. Writing Conveys a Sense of Story
m. Writing Has Evidence of Beginning, Middle, and End
n. Writing Is Easy to Understand and Follow
o. Writing Is Logical and Sequential
p. Writing Process*

Conventions
a. Capitalization*
b. Directional Principles in Writing
c. Grammatically Correct Sentences*
d. Handwriting
e. Linguistic Organization*
f. Paragraphs*
g. Punctuation Marks*
h. Spelling*
i. Uses Complex Word Structures
j. Uses Upper- and Lower-Case Letters in Writing*
k. Writes Own Name

Print Awareness
a. Concept of Letter or Word
b. Directionality*
c. Identification of Parts of a Book*
d. Labels Pictures
e. Letter and Word Order*
f. Sense of Story
g. Understands Punctuation Marks
h. Understands That Print Conveys Meaning*
i. Understands Upper- and Lower-Case Letters
j. Word Boundaries*

Aspects of Word Recognition
a. Decoding Words*
b. Identification of Beginning Sounds*
c. Letter Identification
d. Manipulation of Sounds
e. Phonemic Awareness*
f. Production of Rhyming Words
g. Sound-Symbol Correspondence

Reading
a. Book Topic
b. Fluency
c. Identifies Own Name
d. Instructions
e. Pretend Reading
f. Reading Accuracy*
g. Reading Flexibility
h. Reads as if Passage Is Meaningful
i. Texts Student Can Read*
j. Use of Book Language
k. Voice-to-Print Match

Reading Strategies
a. Locating Answers
b. Monitoring Own Reading Strategies*
c. Self-Correction*
d. Using Pictures and Story Line for Predicting Context and Words*
e. Using Print for Predicting Meaning of the Text
f. Way of Reading

Comprehension
a. Comments on Literary Aspects of the Text
b. Connects Universally Shared Experiences With Text*
c. Distinguishes Fantasy From Realistic Texts*
d. Drawing Conclusions
e. Identify Cause-Effect Relationships
f. Inferences*
g. Literal Comprehension
h. Literary Analysis*
i. Prediction Strategies*
j. Provides Supporting Details*
k. Reference to Evidence Presented in Text
l. Retelling*
m. Sequence of Story's Events
n. Summarizes Main Ideas and Points*
o. Wider Meaning

Motivation
a. Book Referral
b. Current Reading Practices
c. Family Support and Prior Experience
d. Reading Preferences
e. Response to Literature
f. Student Reads for Own Purposes
g. Time Spent
h. Other

Self-Perception/Self-Concept
a. Characteristics of a Good Reader
b. Learning and Understanding
c. Others' Opinions
d. Reads Independently
e. Writes Independently

Metacognition
a. Familiarity With Types of Texts
b. Monitoring How Student Reads
c. Personal Progress
d. Planning How to Read
e. Pride
f. Reading Related Behaviors
g. Self Assessment in Non-Language Arts Domains
h. Self Review
i. Sharing with Others
j. Strategy-Execution for How to Read
k. Teacher Feedback
l. Writing Related Behaviors
m. Other

Attitude
a. Attitudes Towards Other Literacy Activities
b. Attitudes Towards Reading
c. Attitudes Towards Reading Behaviors
d. Attitudes Towards Writing
e. Other

Oral Language: Listening and Speaking
a. Asks for Clarification
b. Communicates Effectively
c. Figurative Language
d. Holds Attention of Others
e. Language Production
f. Listens Attentively
g. Oral Directions
h. Others' Perspective
i. Participates in Group Discussion
j. Questions
k. Responses Make Connections to the Situation
l. Self Corrects When Speaking
m. Story Telling/Retelling
n. Various Types of Communication

Other
a. Color Identification
b. Fact vs. Opinion
c. Notetaking
d. Presentations
e. Reference Skills
f. Skimming
g. Similarities and Differences
h. Synonyms and Antonyms
i. Text Comparison
j. Topic Knowledge
k. Use of Text
l. Other
We gathered information about the grade of the student for which the element is intended; different elements may be evaluated in different grades by the same measure. Certain elements are more relevant to earlier grades, such as letter identification and identification of parts of a book, whereas other elements may be more specific to older children in second or third grade, such as writing in paragraphs and using complex sentence structures. The form of administration--whether the assessment uses an individual, one-to-one setting, a group format, or both--is noted next. Several forms may be used for different elements within the same measure. The frequency and amount of time required to administer this part of the measure is also noted for each element. This helps us understand how often teachers evaluate elements, and specifically which elements are evaluated regularly and which are assessed infrequently. The amount of time teachers spend evaluating students' literacy elements in a one-to-one setting or in a group suggests how much time is spent on the assessment process.

The six assessment models in the coding scheme are based in part on the work of Stiggins (1995): (a) clinical interviews, (b) constructed response, (c) observation, (d) on-demand response (also described as closed-response set), (e) student self-assessment, and (f) multiple responses (see Table 6). The first four and the sixth of these models emerged from our readings and a priori categorizations; student self-assessment, however, was derived from the data we reviewed. Teachers, researchers, and districts view students' involvement with the evaluation of their work as a growing and critical aspect of the assessment process. We also found through our analyses that the same element was sometimes evaluated differently with the same tool. In cases in which an element is assessed in multiple ways, we classified the model as comprising multiple responses; a small sketch of this coding rule follows Table 6.

Table 6. Assessment Models

Clinical Interview: The teacher gathers information regarding the student's process of thinking while engaging in literacy activities.
Constructed Response: The student is asked to provide a range of answers or responses within a broad structure.
Observation: The teacher observes the student's literacy practices in a natural or contrived setting.
On-Demand Response: The student is asked to provide the correct answer, often in response to a limited set of responses.
Student Self-Assessment: The student evaluates his/her own work.
Multiple Responses: The assessment evaluates the element in multiple ways.
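The coding rule for multiply assessed elements can be stated compactly. Below is a small illustrative sketch of that rule; the model names follow Table 6, but the helper function and its error handling are our additions, not part of the CIERA coding manual.

```python
# A sketch of the coding rule: an element assessed by more than one model
# within the same tool is coded as "multiple responses".
SINGLE_MODELS = {
    "clinical interview",
    "constructed response",
    "observation",
    "on-demand response",
    "student self-assessment",
}

def code_assessment_model(observed_models):
    """Return the coded assessment model for one skill or element."""
    models = set(observed_models)
    unknown = models - SINGLE_MODELS
    if unknown:
        raise ValueError("Unrecognized model(s): %s" % sorted(unknown))
    if len(models) == 1:
        return next(iter(models))
    return "multiple responses"

# Example: an element evaluated by both observation and an on-demand task.
assert code_assessment_model({"observation", "on-demand response"}) == "multiple responses"
```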

Item response format covers a list of formats that practitioners use for recording student responses (see Table 7). The formats were derived from several sources, including Stallman and Pearson (1990b) and Pearson et al. (1998), as well as from our analysis of the measures we obtained. Stallman and Pearson (1990b) only included checklists and multiple choice. Pearson et al. (1998) expanded Stallman and Pearson's (1990b) analysis to include four more categories. We further expanded the categories to include twelve formats and we renamed the formats to distinguish among the numerous types of formats available to practitioners.

The number of items the measure offers for evaluating a specific skill or element describes the quantity of information teachers are asked to gather in order to assess a particular element. However, the number of items says very little in itself; a place is provided for a description of the items, such as "uses a passage or rubric," "is a question or statement," or "is part of a larger checklist or questionnaire."

Table 7. Item Response Formats

Checklist: The examiner keeps track of the quality and/or occurrence of student responses in relationship to items on a predetermined list.
Dictation: The teacher presents information orally for students to encode.
Informal Reading Inventory: Graded series of passages and sentences of increasing difficulty are used to determine students' strengths, weaknesses, and strategies in word identification and comprehension.
Miscue Analysis: A formal examination of the use of miscues or errors as the basis for determining the strengths and weaknesses of students as they read.
Miscue Analysis/Informal Reading Inventory: A combination of the two formats.
Observation: The teacher observes student behavior formally or informally. This option is used as the default.
Oral-Directed: The student verbalizes his or her response to a question that only allows for one correct answer.
Oral Open-Ended: Questions or tasks are used to explore a student's understanding of elements in reading or literacy that are intended to produce an oral free response, rather than a directed one; the response is recorded by the teacher or the administrator.
Running Records: A neutral observation of students' elements and capabilities as they read; the teacher informally tracks students' reading ability.
Written-Directed: The student writes his or her response to a question that allows for only one correct answer.
Written Open-Ended/Constructed Response: Questions or tasks are used to explore a student's understanding of elements in reading or literacy that are intended to produce a written free response, rather than a directed one; the response is recorded by the teacher or the administrator.
Multiple Responses: The assessment evaluates the element in multiple ways.

The presentation section uses subcategories from Stallman and Pearson (1990b), with revisions from Pearson et al. (1998). The mode of presentation, which contains six options, describes the main mode of presentation used by the examiner, including auditory and visual (see Table 8). The unit of presentation is the type of stimulus to which the student is asked to respond; we added a few options and eliminated others to arrive at a total of 24 options (see Table 8). Examples of units of presentation that emerged from our data include books, connected discourse, letters, phonemes, stories, and words.

Table 8. Presentation

Mode: Auditory; Visual; Auditory and visual, mixed; Production; Other; Multiple responses

Unit: Auditory-general; Book; Connected discourse; Gesture; Grapheme; Incomplete passage; Incomplete word; Letter; Nonsense word; Number; Object; Patterns; Phoneme; Phrase; Picture with directions; Punctuation marks; Sentence/question; Story; Syllable; Symbol; Visual-general; Word; Other; Multiple responses

The response section is also borrowed from Stallman and Pearson (1990b), with revisions by the authors and by Pearson et al. (1998). Specific types of student responses are divided into three subcategories: type of mental processing, unit of response, and student response (see Table 9). The type of mental processing describes how students process the information presented in order to provide the appropriate response. We added three additional options to the original options of identification, production, recognition, and other: recall, reproduction, and multiple responses. Recall is common when assessing comprehension; however, reproduction rarely emerged. The unit of response refers to the stimuli used by the student to indicate the correct answer to the item. Examples of stimuli used by the measures we collected include grapheme, objects, phrase, picture, punctuation marks, and sounds. The student's response categorizes what the student does when responding to the item.

Table 9. Response Types

Types of Mental Processing: Identification; Production; Recall; Recognition; Reproduction; Combination; Other; Multiple responses

Unit of Response: Book; Clause; Connected discourse; Gesture; Grapheme; Letter; Letter that matches response; Nonsense word; Number; Objects; Oral; Passage; Phrase; Picture; Punctuation marks; Sentence; Shape; Sound; Word; Written; Other; Multiple responses

Student Response: Circle; Color; Draw; Fill in the blank; Fill in the circle; Find; Manipulate; Mark; Perform; Point; Respond orally; Sort/organize; Underline; Use mouse/keyboard; Write; Other; Multiple responses

Finally, we indicated how the element is scored (rating scale, rubric, or yes/no). In the notes section we include any additional information relevant to the element.

C. Analytic Methods


We present frequencies to describe the general overview of the measures we collected, including grade levels, forms of administration, types of assessment models, formats for recording student responses, and categories of elements. The frequencies offer a clear description of the measures. The next step of the analysis focuses on the elements evaluated by the measures, including the methodology, formats, grade levels, and student responses to the items. We also perform cross-tabulations of elements by assessment models, student response formats, and response types. In addition, we examine the standardization and psychometric data that are available concerning these measures. Finally, we provide descriptions of two sample measures in order to demonstrate the kind of information available in the database: Guidance in Story Retelling (Morrow, 1986) and Literacy Assessment for Elementary Grades (St. Vrain Valley School District, 1997). The format used to describe these measures was applied to all of the assessments we collected.
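The frequency counts and cross-tabulations described here are straightforward to reproduce from coded data. The following is a minimal sketch, assuming the coded entries are loaded into a pandas DataFrame with one row per measure-element pair; the column names are ours, not fields of the actual CIERA database.

```python
# A minimal sketch of the tabulations described above.
import pandas as pd

entries = pd.DataFrame([
    {"measure": "A", "element": "Spelling", "model": "observation", "format": "checklist"},
    {"measure": "A", "element": "Retelling", "model": "on-demand response", "format": "oral-directed"},
    {"measure": "B", "element": "Spelling", "model": "observation", "format": "checklist"},
])

# Frequencies: how often each element appears across all measures.
element_counts = entries.groupby("element")["measure"].nunique()

# Cross-tabulation: assessment model by response format (or elements by model).
model_by_format = pd.crosstab(entries["model"], entries["format"])

print(element_counts)
print(model_by_format)
```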

II. Results


This section is divided into two parts. First, we present analyses by specific assessments. In the second part we focus on elements and provide analyses that cut across our entire sample of assessments.

A. Analysis by Assessment

Our analysis includes 89 assessments. A brief overview of the 89 measures is presented in Appendix B; a comprehensive review of each measure is available at www.ciera.org. The summary provides the name of the assessment, author, purpose, grade, form of administration (individual or group setting), and the category of elements each measure assesses. The name of the measure is the title of the tool or the title of the group of measures developed by the same author(s). The groups of measures are placed under the umbrella of the author or title of the book. For example, An Observational Survey (Clay, 1998) contains several tools, such as Concepts about Print and Dictation; all of these assessments are found under the title of Clay's book. Many measures state their purpose as part of the measure. Some descriptions are global, such as "evaluates students' literacy development" (MacArthur CCDP Follow-up Study, 1998), whereas others are very specific, for example "to estimate students' reading level, group students effectively, and appropriately choose textbooks, and to plan intervention instruction" (Leslie & Caldwell, 1995). Measures that do not have a stated purpose receive a generic statement of "to evaluate students' reading and writing abilities."

The grades for which the measures are intended range from K to 3; the distribution is presented in Figure 1. Only 10% (N = 9) of the measures are designed for a single grade level. Many apply to students in two or three grades (two grades, N = 16; three grades, N = 24), and almost half of the measures evaluate literacy elements at all four grade levels (N = 40).

[Figure 1. Distribution of the measures by grade level (K-3)]

All measures are available in English, and only 5% (N = 4) are available in Spanish (one assessment is available in Danish; see Table 10). Seventy percent of the measures are designed for individual administration, rather than for use in a group setting. These individual forms of administration also include teacher observations of students. Measures that ask teachers to use observations of students in order to complete a checklist are coded as individual administrations unless the measure states that the teacher can complete the checklist or rubric within a group setting. Only 7% of the measures we collected are intended to be administered solely to a group of children.

Table 10 shows how often the measures indicate exactly when to administer the entire assessment or parts of the measure. Fewer than half of the measures (44%) we analyzed explicitly state the minimum number of times that a teacher should evaluate students' literacy elements. About a quarter of the measures (26%) indicate the length of time required to complete the evaluation.


Table 10. Languages, Type of Administration, Frequency, and Time Required to Administer

Categories | Number | Percent
Language: English | 89 | 100
Language: Spanish | 4 | 5
Language: Danish | 1 | 1
Administration: Group | 6 | 7
Administration: Individual | 62 | 70
Administration: Individual and/or group | 21 | 24
Frequency of administration stated | 39 | 44
Time required to administer stated | 23 | 26

The skills or elements evaluated by the assessments range across 13 categories (see Table 11). Every category is represented in at least 26% of the measures we collected. More than half of the assessments evaluate students' use of conventions, phonics, reading, and comprehension elements. Evaluations of writing process, print awareness, and reading strategies appear somewhat less frequently (42-48%). The remaining six categories each appear in roughly one quarter to one third of the assessments. A summary of the specific elements assessed by each measure is presented in Appendix C.

Table 11. Frequency of Elements Included in the Assessments

Category of Elements | N of Assessments | Percent
Phonics | 54 | 61
Comprehension | 52 | 58
Reading | 51 | 57
Writing Conventions | 48 | 54
Writing Process | 43 | 48
Print Awareness | 42 | 47
Reading Strategies | 37 | 42
Listening and Speaking | 30 | 34
Metacognition | 30 | 34
Other | 30 | 34
Motivation | 27 | 30
Self-Perception | 26 | 29
Attitude | 23 | 26

Next, we examine the number of McREL standards found throughout our measures. Table 12 indicates the number of assessments with one or more standards, up to all eight standards. One or two McREL standards are represented in nearly one third of the measures (N = 28), and 13% (N = 12) of the assessments contain an element relevant to all eight standards. Only 4% (N = 4) of the measures do not contain any McREL standards. The specific standard that is represented most frequently is Standard 5 ("Demonstrates competence in the general skills and strategies of the reading process"); 73 of the assessments included this standard.

We also investigated the various methodologies represented by the assessments. Of the 89 measures, more than half (N = 47) use two very different approaches--observation or on-demand methods--for evaluating students' literacy skills (see Figure 2). Only 29% (N = 26) use constructed responses, and such responses occur predominantly with the writing process and conventions; 16% (N = 14) provide students with the opportunity to participate in the evaluation of their work. Observation, constructed response, and on-demand methods are used most consistently across all grade levels.

Table 12. Number of Assessments With a McREL Standard

N of Standards | N of Assessments | Percent
1 | 16 | 18
2 | 12 | 13
3 | 13 | 15
4 | 15 | 17
5 | 6 | 7
6 | 3 | 3
7 | 8 | 9
8 | 12 | 13
No standards | 4 | 4

All twelve item formats are used across the measures to record student responses (see Table 13). Of the 89 measures, 42% (N = 37) use oral-directed responses as part of their assessment. The next most common format is checklist (36%, N = 32), followed by written open-ended (18%, N = 16). The item formats used by the measures are related to the methodologies; only checklists are used by all methods. An observation methodology in conjunction with checklists is the most frequent combination.

We further explore student responses to the assessments by examining mental processing strategies (see Table 14).
Table 13. Frequencies of Item Formats Used to Record Student Responses

Formats of Student Responses | Number | Percent
Oral-Directed | 37 | 42
Checklist | 32 | 36
Written Open-Ended | 16 | 18
Dictation | 14 | 16
Written-Directed | 14 | 16
Running Records | 13 | 15
Observation | 11 | 12
Multiple Responses | 11 | 12
Informal Reading Inventory (IRI) | 9 | 10
Miscue Analysis/IRI | 9 | 10
Oral Open-Ended | 7 | 8
Miscue Analysis | 6 | 7

The most common type of mental processing used by students for processing the information presented is identification (N = 50, 56%; see Table 14). Production, recall, and "other" are the next most common types of mental processing required of students by the assessments, followed by recognition and multiple responses. The table demonstrates that students use 10 different ways to respond to the items. More than 60% of the measures require students to respond orally; this is followed by written responses (46%). Of the 10 possible ways of responding included in our analysis, 5 were rarely used, occurring in less than 10% of the assessments.

Table 14. Type of Student Responses

Mental Processing | Number | Percent
Identification | 50 | 56
Production | 38 | 43
Other | 38 | 43
Recall | 37 | 42
Recognition | 15 | 17
Multiple Responses | 15 | 17
Combination | 1 | 1

Student Response | Number | Percent
Responding Orally | 57 | 64
Write | 41 | 46
Other | 36 | 40
Multiple Responses | 18 | 20
Point | 11 | 12
Circle | 7 | 8
Find | 4 | 4
Manipulate | 3 | 3
Draw | 1 | 1
Mark | 1 | 1

B. Analysis by Skills or Elements

This section describes our analyses in terms of the constituent skills or elements of the assessments. Each skill or element (N = 133) appears only once for each assessment in our coding scheme, regardless of the multiple ways it may be assessed. The frequency of a single element appearing across all assessments ranged from 1 to 41; a summary of the elements that appear on 10 or more measures is presented in Table 15. The specific skill of decoding words appeared in more than 40 measures; the next most common skill was spelling (N = 38), followed by reading accuracy, summarizing main ideas, and providing supporting details (N for each = 32). In short, this table shows us which elements appear most frequently in the 89 measures we analyzed. (For an analysis of the number of elements included in each assessment, see Appendix C.)

Table 15. Frequency of Skills or Elements Across All Measures (N = 66)

Category | Skill or Element | Number of Assessments
Phonics | Decoding Words | 41
Conventions | Spelling | 38
Comprehension | Provides Supporting Details | 32
Comprehension | Summarizes Main Ideas and Points | 32
Reading | Reading Accuracy | 32
Print Awareness | Word Boundaries | 29
Conventions | Punctuation Marks | 28
Print Awareness | Concept of Letter or Word | 28
Reading Strategies | Using Pictures and Story Line for Predicting Context and Words | 28
Phonics | Identification of Beginning Sounds | 27
Writing Process | Writing Behaviors | 27
Conventions | Capitalization | 26
Reading Strategies | Self-Correction | 26
Comprehension | Retelling | 25
Print Awareness | Directionality | 25
Comprehension | Connects Universally Shared Experiences With Text | 24
Conventions | Grammatically Correct Sentences | 24
Reading Strategies | Using Print for Predicting Meaning of the Text | 24
Phonics | Letter Identification | 23
Self-Perception | Reads Independently | 23
Comprehension | Prediction Strategies | 22
Conventions | Linguistic Organization | 22
Print Awareness | Identification of Parts of a Book | 22
Reading Strategies | Monitoring Own Reading Strategies | 22
Writing Process | Writing Process | 22
Phonics | Sound-Symbol Correspondence | 21
Print Awareness | Understands That Print Conveys Meaning | 21
Reading | Fluency | 21
Comprehension | Sequence of Story's Events | 20
Reading | Voice-to-Print Match | 20
Listening and Speaking | Participates in Group Discussion | 19
Motivation | Response to Literature | 19
Writing Process | Message Quality | 19
Writing Process | Vocabulary Usage | 18
Conventions | Directional Principles in Writing | 18
Writing Process | Types of Compositions | 16
Writing Process | Writing Contains a Purpose | 16
Writing Process | Writing Contains Description and Details | 16
Writing Process | Writing Is Logical and Sequential | 16
Reading | Pretend Reading | 16
Reading | Texts Student Can Read | 16
Phonics | Phonemic Awareness | 16
Phonics | Production of Rhyming Words | 16
Comprehension | Inferences | 16
Writing Process | Writing Has Evidence of Beginning, Middle, and End | 15
Writing Process | Writing Is Easy to Understand and Follow | 15
Print Awareness | Understands Punctuation Marks | 15
Comprehension | Wider Meaning | 15
Listening and Speaking | Story Telling/Retelling | 14
Attitude | Attitudes Towards Reading | 14
Reading Strategies | Way of Reading | 13
Print Awareness | Labels Pictures | 13
Other | Reference Elements | 13
Metacognition | Self Review | 13
Conventions | Uses Upper- and Lower-Case Letters in Writing | 13
Reading | Reading Flexibility | 12
Print Awareness | Understands Upper- and Lower-Case Letters | 12
Motivation | Reading Preferences | 12
Listening and Speaking | Listens Attentively | 11
Conventions | Paragraphs | 11
Comprehension | Reference to Evidence Presented in Text | 11
Writing Process | Uses Illustrations to Express Ideas | 10
Writing Process | Uses Lively and Descriptive Language | 10
Reading | Use of Book Language | 10
Comprehension | Drawing Conclusions | 10
Comprehension | Literal Comprehension | 10

We examined the number of constituent skills or elements that match a particular standard on the McREL standards in the Language Arts content area. We found that 41% (N = 55) of our elements were represented in the McREL standards; Figure 3 shows the number of elements associated with each standard. Overall, we identified a total of 133 constituent elements that were included in the 89 assessments. In addition to the 55 that match the McREL standards, 25 (19%) reflect motivation, self-perception, metacognition, and attitude towards reading. The remaining elements (N = 52; 39%) do not match the McREL standards or the motivation/self-perception group. The three groups of elements are presented in Appendix D.

 

We further analyzed the distribution of grade levels and forms of administration by constituent skills or elements. Ninety-two percent (N = 123) of the elements are assessed in all grades, K-3. The elements that are not evaluated in all grades are part of the motivation, self-perception, attitude, and metacognition categories (N = 10). These elements tend, on average, to be evaluated in second and third grades, when they are more stable. The form of administration (individual or group) for evaluating the skills or elements is presented in Figure 4. Almost all of the skills or elements are assessed individually, with two thirds assessed as either individual or group.

The most common methodology used for evaluating a particular skill or element is observation (N = 123; see Figure 5). Half of the elements were assessed using either constructed response (N = 67) or on-demand response (N = 65). The least frequently used methodology was clinical interview (N = 20), which is most commonly associated with motivation, self-perception, attitude, and metacognition elements.


The item formats used by administrators for recording student responses across skills or elements are presented in Table 16. Elements are recorded most often with checklists (N = 117). The next most frequently used method of tracking student responses is observation (N = 92), followed by multiple responses, written open-ended, oral-directed, written-directed, and informal reading inventory.

In Table 17 we examine the specific types of mental processing students use to identify correct answers and what students do in response to each item, across the constituent skills or elements. For 90% of the elements (N = 120), teachers decide which activity to use in order to assess a particular skill or element. Approximately two thirds of the elements (N = 89) call upon students to respond in multiple forms and to produce the correct response in order to show their mastery of a skill or element. The use of identification is limited to half of the elements (N = 68). Students respond in 10 different ways when indicating the correct answer; Table 17 lists those responses that occur with more than 10% of the skills or elements. The responses occurring with fewer than 10% include draw, find, manipulate, and mark.

Table 16. Number of Skills or Elements Using Each Item Format

Formats of Student Responses | N | Percent
Checklist | 117 | 88
Observation | 92 | 69
Multiple Responses | 78 | 59
Written Open-Ended | 64 | 48
Written-Directed | 38 | 29
Oral-Directed | 61 | 46
Informal Reading Inventory (IRI) | 42 | 32
Oral Open-Ended | 23 | 17
Miscue Analysis | 7 | 5
Miscue Analysis/IRI | 7 | 5
Dictation | 5 | 4
Running Records | 1 | 1

Table 17. Types of Mental Processing and Student Responses

Mental Processing | N | Percent
Other (teacher discretion) | 120 | 90
Multiple responses | 89 | 67
Production | 77 | 58
Identification | 68 | 51
Recall | 52 | 39
Recognition | 39 | 29
Combination | 5 | 4

Student Responses | N | Percent
Other | 120 | 90
Multiple responses | 92 | 69
Oral | 79 | 59
Write | 75 | 56
Circle | 21 | 16
Point | 13 | 10

C. Standardization and Psychometrics

Tables 18 and 19 provide all available information about the standardization and psychometric properties of the assessments that have been reviewed in this report. Very little information is available concerning standardization samples, and in general, relatively little information regarding psychometrics is provided by the authors of the assessments.

Table 18 displays the reliability data available for the 13 assessments that report such information. Both internal and test-retest data are available, and the values reported are moderate to high. Unfortunately, only 14% of the assessments report reliability.

Table 19 provides information regarding the validity of 32 assessments. Content validity is reported for most of the assessments, although in most cases this procedure was not conducted in a formal way. Rather, the author(s) primarily report on how the assessment was developed. Most assessments are validated with an external criterion using a wide variety of outcomes. Indeed, no single outcome was used by more than one assessment. Sample sizes vary from small (18) to large (1,215). Again, few conclusions can be drawn from these findings.

Table 18. Reliability of the Assessments (N = 13)

Assessment | Author | Internal Reliability | Interrater Reliability | Sample/Raters
An Observational Survey | cited in Clay (1998) | | |
-- Letter Identification | | .97 | | N = 100; urban; age 6 (1966)
-- Concepts About Print | | .73-.89 | | N = 56; Texas; kindergarten (1978)
-- Concepts About Print | | .84-.88 (split-half) | | N = 56; Texas; kindergarten (1978)
-- Concepts About Print | | .95 | | N = 40; urban; ages 5-7 (1968)
-- Ready to Read | | .90 (Kuder-Richardson Formula 20) | | N = 100; urban; age 6 (1966)
-- Writing Vocabulary | | .97 (test-retest) | | N = 34; urban; age 5.6 (1973)
Assessing Literacy With the Learning Record | Barr, Craig, Fisette, & Syverson (1999) | | .80 | N = 66; 27 schools
Early Literacy Profile: South Brunswick Public Schools | Bridgeman, Chittenden, & Cline (1995) | | .93 | N = 61; new & experienced teachers
Elementary Literacy Profile | Falk, Ort, & Moirs (1999) | | .95 | N/A
Elementary Reading Attitude Survey | McKenna & Kear (1990) | .74-.89 | | N = 18,138; grades 1-6; 95 schools; number of girls exceeded number of boys by 5; ethnicity close to U.S. population
Metacomprehension Strategy Index | cited in Schmitt (1990) | .87 (Kuder-Richardson Formula 20) | | N/A
Motivation to Read Profile | Gambrell, Palmer, Codling, & Mazzoni (1996) | | .87 | N = 2
-- Reading Survey | | .68-.82 (Cronbach's alpha) | | N = 330; eastern U.S.; grades 3 & 5; 27 classes; 4 schools
Phonological Awareness & Literacy Screening | Phonological Awareness & Literacy Screening (1998) | .78-.95 | r = .99; k = .78-.95 | N/A; PALS has been revised since; new analysis available fall 1999
Qualitative Reading Inventory--II | Leslie & Caldwell (1995) | Yes | .98 | N = 3; reading teacher or specialist
-- Readings | | | .99 | N = 3; reading teacher or specialist
-- Explicit Comprehension | | | .98 | N = 3; reading teacher or specialist
-- Implicit Comprehension | | | .98 | N = 3; reading teacher or specialist
-- Passages | | | .94 | N = 3; reading teacher or specialist
The Name Test | Cunningham (1990) | .98 (Kuder-Richardson Formula 20) | | N = 120; grades 2-5; equal numbers of boys & girls; 35 minority children
Think Alouds: Assessing Comprehension | Wade (1990) | | .92 | N = 2
Work Sampling System | Meisels, Liaw, Dorfman, & Nelson (1995) | .89-.94 at each interval; .69-.89 between intervals | | N = 100; Michigan; kindergarten; 10 classrooms; WSS has been revised since this study
Yopp Singer Test | Yopp (1995) | .95 (Cronbach's alpha) | | N = 100; southern California; kindergarten; 3 schools; predominantly White, 1% Black, 2% Asian, & 15% with Spanish surnames

     

    Table 19: Validity of Assessments (N = 32)

    Measure | Author | Construct Validity | Content Validity (note 7) | Criterion Used | Sample
    Alternative Concepts About Print | Bordeaux (unpublished) | | Yes | |
    An Observational Survey | Cited in Clay (1998) | | | |
      Letter Identification | | | | Word Reading, r = .85 | N = 100; age 6 (1966)
      Concepts About Print | | | | Word Reading, r = .79 | N = 100; age 6 (1966)
      Ready to Read | | | | Word Test, r = .90 | N = 87; age 6
      Writing Vocabulary | | | | Reading, r = .82 | N = 50; urban; age 5.6 (1973)
    Analytical Reading Inventory | Wood & Moe (1995) | | Yes | |
    Basic Reading Vocabulary | Harris & Jacobson (1982) | | Yes | |
    Basic Sight Vocabulary | Leibert (1991) | | Yes | Houghton Mifflin series--differed by 14% | N = 296; grades 2-4; 5 urban schools
    Book Selection | Paris & Van Kraayenoord (1998) | | Yes | |
    Checklist for Ownership of Reading | Au, Scheu, & Kawakami (1990) | | Yes | |
    Early Literacy Portfolio: South Brunswick Public Schools | Bridgeman et al. (1995) | | | Stanford Achievement Test, r = .73; Comprehensive Test of Basic Skills, r = .72 | N = 253, grade 1; N = 612, grade 2
    Elementary Literacy Profile | Falk et al. (1999) | | Yes | 4th-grade NAEP, r = .15-.49; Degrees of Reading Power, r = .38-.61 | N = 1,215; grades 1-3; equal # of boys and girls; ethnicity--57% White, 17% African American, 17% Latino/a, 5% Native American, 4% Asian
    Elementary Reading Attitude Survey (note: the source describes the item development) | | | | |
      Recreational | McKenna & Kear (1990) | Factor analysis | Yes | (1) Students were asked whether a public library was available and whether they owned a library card; students with library cards scored significantly higher (M = 30.0) than students with a library available but no card (M = 28.9). (2) Students who checked out books from the school library scored significantly higher (M = 29.2) than students who did not (M = 27.3). (3) Students who reported watching less than 1 hour of TV per night scored significantly higher (M = 31.5) than students who watched more than 2 hours (M = 28.6). | N = 18,138; grades 1-6; 95 schools; # of girls exceeded # of boys by 5; ethnicity close to U.S. population
      Academic | McKenna & Kear (1990) | | Yes | Teachers rated students as being of low, average, or high reading ability; high-ability students scored significantly higher (M = 27.7) than low-ability students (M = 27.0). | N = 18,138; grades 1-6; 95 schools; # of girls exceeded # of boys by 5; ethnicity close to U.S. population
    Guidance in Story Retelling | Morrow (1986) | | Yes | |
    Index of Reading Awareness | Jacobs & Paris (1987) | | Yes | |
    Informal Reading Inventory | Burns & Roe (1999) | | Yes | |
    Informal Reading-Thinking Inventory | Manzo, Manzo, & McKenna (1995) | | Yes | |
    Learning to Write: A Mode of Curriculum and Evaluation | McCaig (1990) | | Yes | |
    Literacy Development Checklist | Seeds University Elementary School & UCLA (1999) | | Yes | |
    Metacomprehension Strategy Index | Cited in Schmitt (1990) | | Yes | (1) Index of Reading Awareness, r = .48; Error Detection Task, r = .50; Cloze Task, r = .49. (2) Students with training in metacomprehension scored significantly higher than students with no training. |
    Motivation to Read Profile | Gambrell, Palmer, Codling, & Mazzoni (1996) | Factor analysis | Yes | (1) Teachers rated students as low, medium, or high performing; differences were significant and in a positive direction. (2) Comparison of 3rd- and 5th-grade scores yielded results consistent with the literature. |
      Reading Survey | | | Yes | |
      Conversational Interview | | | Yes | |
    Multidimensional Fluency Scale | Zutell & Rasinski (1991) | | Yes | |
    Phonological Awareness and Literacy Screening | Invernizzi et al. (1998) | Factor analysis | Yes | Concurrent: Stanford-9 correctly classified 78% of fall sample |
    Portfolio Assessment and Evaluation in First Grade... | Ehlerding (1993) | | Yes | |
    Pre-Reading Plan | Langer (1981) | | Yes | |
    Qualitative Reading Inventory--II | Leslie & Caldwell (1995) | Yes | Yes | (1) California Achievement Test or Iowa Test of Basic Skills, r = .65-.86. (2) Word Recognition and Word Attack from the WRMT-R, r = .90; Comprehension and Passage Comprehension from the WRMT-R, r = .75. | N = 31-41; grades 1, 2, & 4
    Reading Inventory for the Classroom | Flynt & Cooter, Jr. (1998) | | Yes | |
    Story Construction From a Picture Book | Van Kraayenoord & Paris (1996) | | Yes | |
    Test of Auditory Analysis Skills | Rosner & Simon (1971) | | Yes | Language Arts subtest of the Stanford Achievement Test--ranges .53-.84 | N = 284; grades K-6; ethnicity--White
    The Names Test | Cunningham (1990) | | Yes | 2nd graders (M = 22.6) scored lower than 5th graders (M = 47.3) | N = 120; grades 2-5; 2 schools; equal # of boys and girls
    Think-Along Passage | Paris (1991) | | Yes | |
    Think Alouds: Assessing Comprehension | Wade (1990) | | Yes | |
    Work Samples Interview | Van Kraayenoord & Paris (1997) | | Yes | |
    Work Sampling System | Meisels, Bickel, Nicholson, Xue, & Atkins-Burnett (in press) | | Yes | Concurrent: Woodcock-Johnson Revised (note 8); correlations ranged .50-.75 | N = 345; grades K-3; 17 classrooms; ethnicity--70% African American, 26% White, 2% Asian, 1% Hispanic, 2% Other
    Yopp Singer Test | Yopp (1995) | Factor analysis | Yes | Predictive: over 7 years with multiple tests, ranges .38-.78 |

    D. Sample Assessments

    Appendix E presents the description and complete results of our analysis of two sample measures: Guidance in Story Retelling (Morrow, 1986) and the Literacy Assessment for Elementary Grades (St. Vrain Valley School District, 1997). They are presented in order to indicate what the complete analysis of an individual assessment in our database includes.

    III. Conclusions and Recommendations

    This study reviewed 89 assessments designed for the instructional assessment of early literacy, coded for 133 skills or elements. The measures were selected according to criteria presented in this report, and they represent all such instruments recommended by the teachers, administrators, researchers, and policymakers whom we were able to contact.

    The precursor to this study was conducted more than a decade ago by Stallman and Pearson (1990a, 1990b). Their study differed from ours in that it described and evaluated formal measures used for early literacy assessment, whereas this study focused on informal, instructional measures. Nevertheless, it is interesting to consider the two studies simultaneously, if for no other reason than that doing so provides a context in which to compare instructional assessments with the more conventional "standardized" tests used for accountability.

    Stallman and Pearson examined 20 assessments that contained 208 subtests. They found that 82% of the subtests were administered to groups of children; we found that nearly 70% of the assessments we examined were administered to individuals. Because they were examining commercially available tests, it is not surprising that two thirds of the tests included guidelines for administration. However, only one fourth of the assessments we studied had such guidelines. In terms of types of student responses generated by the assessments, Stallman and Pearson reported that 72% of the tests required students to recognize a response, 23% asked for identification, and 5% asked for production. In contrast, 56% of our measures asked students to identify a correct response, followed by 43% requiring students to produce a response; only 17% called for recognition. Finally, Stallman and Pearson reported that 63% of the tests they reviewed required students to fill in bubbles, ovals, or circles to indicate the correct response, whereas our study found that students were most frequently asked to respond orally or to produce a written response. Stallman and Pearson noted that the assessments they studied decontextualized literacy activities; those we analyzed were much more sensitive to assessing literacy in a curriculum-embedded fashion.

    In short, the commercially developed measures analyzed by Stallman and Pearson consisted predominantly of multiple-choice items that required students to recognize a response that was usually presented out of context. The assessments examined in this study were more complex. They contained a variety of measures, used few multiple-choice item formats, and relied primarily on teacher checklists and observations within the flow of classroom activities. Further, the instructional assessments examined here focus on individual students, thus facilitating instructional planning and the charting of student progress.

    Having completed the analysis of these 89 informal assessments used for instruction, we can enumerate several conclusions. They are listed in terms of the dual focus that we employed in presenting the results: by measures, and by specific skills or elements.

    A. Analysis by Measures

    1. Most of the measures covered all four grade levels (K-3), and nearly all measures were written for administration in English only.
    2. Most of the measures did not indicate how often they should be administered.
    3. The majority of the instruments were intended for individual rather than group administration.
    4. Many literacy elements (N = 133) were incorporated into the assessments; more than half of the assessments included the following skills or elements: phonics, comprehension, reading, and writing conventions.
    5. Only 13% of the measures incorporated all eight of the McREL literacy standards.
    6. Of the various measurement methods that were analyzed, both observational methods and on-demand approaches were used most frequently.
    7. Parallel to this finding, the types of methods used most frequently in the assessments were checklists and oral-directed approaches.
    8. The most frequent student responses elicited by the measures were oral responses and writing.
    9. Identification, production, and recall were all included among the types of mental processing called for by the assessments.
    10. The psychometric data available for analysis of these measures are very limited, and few conclusions can be drawn from them other than the need for more attention to this area.

    B. Analysis by Skills or Elements

    1. The elements that were found to occur most frequently were decoding, spelling, and comprehension.
    2. Only 41% of the elements we coded correspond to a McREL standard.
    3. Most elements are assessed across all grade levels, and most are assessed in an individual format.
    4. The method used most frequently to assess skills or elements and the most frequent type of item format employed were observation and checklist, respectively.
    5. The types of student response and mental processing most commonly used to assess skills or elements were left to teacher discretion rather than dictated by the assessment.

    C. Summary of Analyses

    The simplest way of summarizing this information is to say that instructional assessments used for early literacy are extremely varied. Some are well-developed, nationally distributed, and carefully presented. Others are highly informal, contain virtually no psychometric or standardization data, and are relatively incomplete from the point of view of providing rules for systematic interpretation and use.

    Of interest is the lack of strong correlation between the national standards published by McREL and the assessments we analyzed. We attribute this lack of strong overlap to differences between our rating scheme and the skills and elements included in the McREL standards. We found that motivation, self-perception, attitude towards reading, and metacognitive categories were omitted from McREL, although we included these areas in our coding scheme. We also found that the fifth McREL Standard ("Demonstrates competence in the general skills and strategies of the reading process") was the most frequently used of all the standards in our analyses. In short, the discrepancies between McREL and this study may reflect a difference in perspective on how the reading process should be analyzed rather than an inconsistency between what was assessed and what was included in the standards.

    The sample of instruments used in this study may also have influenced the results of the analysis of the McREL standards, as well as all the other findings reported. The study sample represents both a strength and a weakness. Its strength lies in the way that we accumulated these measures from the field and the inclusiveness with which we sought to locate candidate assessments for the study. The weakness of this approach is that we have no way of knowing what we failed to find. Moreover, the sample is very mixed; some measures are well developed and widely used, and others are very informal and were developed primarily for a particular teaching situation.

    D. Recommendations

    Based on this national study, it is possible to make several recommendations:

    1. Although observations are used prominently throughout the measures, especially in conjunction with checklists and other teacher-developed activities, explicit guidelines and/or goals for the observations are sparse. This suggests that more detailed instructions are needed in order to provide teachers with an understanding of why certain types of information should be observed rather than others.
    2. Information about how to interpret the measures is needed. What does it mean if a child receives a score of 20 on a given assessment? What does it mean if the child shows that she can successfully perform 15 of the tasks on a list? More attention to the use and functional meaning of the assessments is called for.
    3. More assessments are needed that will provide opportunities for students to construct their own understanding of the reading process and that will allow flexibility in what is considered right and wrong. Most of the measures evaluated students' abilities to produce discrete answers; only a few allowed them to construct meaning in conjunction with the teacher.
    4. Developers of assessments need to acknowledge and accept that not all children, especially in the primary grades, speak English as their primary language. Only a handful of measures are available in Spanish, and none are available in other languages that are prevalent in schools today.
    5. Many measures practically neglect writing altogether and seem to assume that writing occurs primarily in the upper elementary grades. The assessments require students to identify and recall orally what they read, but few allow students to express themselves in written form.
    6. Cooperative group work is not accommodated in the assessments. No provision is made for children to collaborate on the construction of a product that would represent what they have learned. The group activities that were part of the assessments usually only suggest that a teacher ask individual students for the correct answer or request that students complete a worksheet individually.
    7. More attention to standardization and psychometric principles is necessary so that we will know the meaning and accuracy of these measures.
    8. Assessments should provide multiple ways for students to demonstrate what they know. Only a few of the measures allowed students to show their understanding of a concept in more than one way.

    This study has demonstrated the diversity and commonalities among assessments of early literacy used for instruction. Many such assessments exist and a wide range of elements are tapped by them. However, if these assessments are to be successful in reaching their dual goals of enhancing teaching and improving learning, it is critical that more of the developers of these measures undertake systematic analyses of the skills and elements they cover, the literacy methods and responses they incorporate, the types of data to which they are sensitive, and the psychometric properties that provide justification for their meaning and use. Only when these matters have been addressed more adequately will these tools truly achieve their potential for improving early reading achievement.

    References

    Alief Independent School District. (1998). Measuring Growth in Literacy Survey. Alief, TX: Author.

    Ann Arbor Public Schools. (1997). Reading and Writing Rubric. Unknown author, citation unavailable. Received from a teacher at Ann Arbor Open School.

    Anthony, R. J., Johnson, T. D., Mickelson, N. I., & Preece, A. (1991). Evaluating literacy: A perspective for change. Portsmouth, NH: Heinemann.

    Au, K. H., Scheu, J. A., & Kawakami, A. J. (1990). Assessment of students' ownership of literacy. The Reading Teacher, 44 (2), 154-156.

    Barr, M. A., Craig, D. A., Fisette, D., & Syverson, M. A. (1999). Assessing literacy with the Learning Record: A handbook for teachers, Grades K-6. Portsmouth, NH: Heinemann.

    Batzle, J. (1992). Portfolio assessment and evaluation: Developing and using portfolios in the K-6 classroom. Cypress, CA: Creative Teaching Press, Inc.

    Biggam, S. C., Herman, N., & Trubisz, S. (1998). Primary and 2-4 Literacy/Communication Profiles: Resource guide. Montpelier: Vermont Department of Education.

    Blount, R. H. (1991). Story Frame (personal communication). In A. S. Mariotti & S. P. Homan, Linking reading assessment to instruction (pp. 165-171). Mahwah, NJ: Lawrence Erlbaum Associates.

    Board of Education of the City of New York & CTB/McGraw-Hill. (1998). Early Childhood Literacy Assessment System. Monterey, CA: CTB/McGraw-Hill.

    Bolton, F., & Snowball, D. (1993). Ideas for Spelling. Portsmouth, NH: Heinemann.

    Bordeaux, M. (n.d.). Alternative Concepts About Print Test. Unpublished senior thesis project. Cambridge, MA: Harvard University.

    Bridgeman, B., Chittenden, E., & Cline, F. (1995). Characteristics of a portfolio scale for rating early literacy. Princeton, NJ: Center for Performance Assessment, Educational Testing Service.

    Burke, C. (1987). Reading interview. In L. K. Rhodes (Ed.), Literacy assessment: A handbook of instruments (pp. 2-14). Portsmouth, NH: Heinemann.

    Burns, P. C., & Roe, B. D. (1999). Informal reading inventory: Preprimer to twelfth grade (5th ed.). Boston: Houghton Mifflin.

     

    Calfee, R. (1992). Authentic assessment of reading and writing in the elementary classroom. In M. J. Dreher & W. H. Slater (Eds.), Elementary school literacy: Critical issues (pp. 211-226). Norwood, MA: Christopher-Gordon.

    Calfee, R. C., & Calfee, K. H. (1981). Interactive Reading Assessment System (IRAS). Palo Alto, CA: Stanford University.

    Calfee, R. C., & Hiebert, E. H. (1991). Teacher assessment of student achievement. In R. E. Stake (Ed.), Advances in program evaluation (Vol. 1, pp. 103-131). Greenwich, CT: JAI Press.

    Calkins, L., Montgomery, K., Santman, D., & Falk, B. (1998). A teacher's guide to standardized reading tests: Knowledge is power. Portsmouth, NH: Heinemann.

    Casale, D. (1999). Rubric for written work. Sudbury, MA: Author.

    Center for Language in Learning. (1999). Learning Record Moderation Report. Connecting classroom and large scale assessment. Los Angeles, CA: Center for Language in Learning.

    Clay, M. M. (1998). An observation survey of early literacy achievement. Portsmouth, NH: Heinemann.

    Cooper, J. D., & Au, K. H. (1997). Literacy: Helping children construct meaning (3rd ed.). Boston: Houghton Mifflin.

    Conrad, L. L. (1993). An inventory of classroom writing use. Adapted from An inventory of classroom reading use. In L. K. Rhodes (Ed.), Literacy assessment: A handbook of instruments (pp. 56-72). Portsmouth, NH: Heinemann.

    Cunningham, P. (1990). The Names Test: A quick assessment of decoding ability. Reading Teacher, 44 (2), 124-129.

    Davidson, A. (1985). Monitoring reading progress. Auckland, New Zealand: Shortland Publications Ltd.

    Denver Coordinators/Consultants Applying Whole Language. (1993). Classroom Reading Miscue Assessment. In L. K. Rhodes (Ed.), Literacy assessment: A handbook of instruments (pp. 38-43). Portsmouth, NH: Heinemann.

    Denver Public Schools Collaboration. (1993). Emergent Reading and Writing Evaluations. In L. K. Rhodes (Ed.), Literacy assessment: A handbook of instruments (pp. 113-144). Portsmouth, NH: Heinemann.

    Dolch, E. W. (1936). A basic sight vocabulary. Elementary School Journal, 36, 456-460.

    Duckett, P. (1998, March). Entrance Assessment. Cairo, Egypt: American College.

    Ehlerding, J. G. (1993). Portfolio assessment and evaluation in a first grade whole language classroom. Unpublished master's thesis, University of Dayton, Dayton, Ohio.

    Falk, B., Ort, S. W., & Moirs, K. (1999). New York State Goals 2000: Early Literacy Profile Project. Technical report. New York: National Center for Restructuring Education, Schools, and Teaching, Teachers College, Columbia University.

    Flynt, E. S., & Cooter Jr., R. B. (1998). Reading inventory for the classroom (3rd ed.). Upper Saddle River, NJ: Prentice Hall.

    Fry, E. B. (1980). The new instant word list. Reading Teacher, 34 (3), 284-289.

    Gambrell, L. B., Palmer, B. M., Codling, R. M., & Mazzoni, S. A. (1996). Assessing motivation to read. Reading Teacher, 49 (7), 518-533.

    Gentry, R., & Gillet, J. W. (1993). Teaching kids to spell. Portsmouth, NH: Heinemann.

    Gillet, J. W., & Temple, C. (1990). Understanding reading problems: Assessment and instruction. New York: HarperCollins.

    Haladyna, T., Nolen, S., & Haas, N. (1991). Raising standardized achievement test scores and the origins of test score pollution. Educational Researcher, 20 (5), 2-7.

    Harris, A. J., & Jacobson, M. D. (1982). Basic reading vocabularies. New York: Macmillan Publishing Company.

    Hill, B. C., & Ruptic, C. A. (1994). Practical aspects of authentic assessment: Putting the pieces together. Norwood, MA: Christopher-Gordon.

    Hoffmann, M., & Hesbol, K. (n.d.). First Grade Screening. Des Plaines, IL: Des Plaines Elementary School District 62.

    Imbens-Bailey, A. L. (1997). Scoring narrative structure. Los Angeles, CA: Author.

    Imbens-Bailey, A. L., Dingle, M., & Moughamian, A. (1999). Assessment of Syntactic Structure. Los Angeles, CA: Center for the Study of Evaluation/Center for Research on Evaluation, Standards, and Student Testing, University of California at Los Angeles.

    Invernizzi, M., Meier, J. D., Juel, C. L., & Swank, L. K. (1997). Phonological Awareness & Literacy Screening, I and II. Charlottesville, VA: The Virginia State Department of Education and University of Virginia, Curry School of Education.

    Jacobs, J. E., & Paris, S. G. (1987). Children's metacognition about reading: Issues in definition, measurement, and instruction. Educational Psychologist, 22 (3&4), 255-278.

    Johns, J. L. (1997). Basic reading inventory: Pre-Primer through grade twelve and early literacy assessments (7th ed.). Dubuque, IA: Kendall/Hunt Publishing Company.

    Johns Hopkins University. (1998). Success for all. Baltimore, MD: New American Schools.

    Kendall, J. S., & Marzano, R. J. (1997). Content knowledge: A compendium of standards and benchmarks for K-12 education. Aurora, CO, and Alexandria, VA: Mid-continent Regional Educational Laboratory and the Association for Supervision and Curriculum Development.

    Kentucky Department of Education. (1996). Primary Performance Tasks. Frankfort, KY: Kentucky Department of Education.

    Klesius, J. P., & Homan, S. P. (1980). Klesius-Homan Phonic Word Analysis Test (unpublished manuscript). In A. S. Mariotti & S. P. Homan, Linking reading assessment to instruction (pp. 182-185). Mahwah, NJ: Lawrence Erlbaum Associates.

    Klesius, J. P., & Searls, E. F. (1985). Modified concepts about print (unpublished manuscript). In A. S. Mariotti & S. P. Homan (1997), Linking reading assessment to instruction (pp. 190-195). Mahwah, NJ: Lawrence Erlbaum Associates.

    Langer, J. A. (1981). From theory to practice: A Prereading Plan. Journal of Reading, 25 (2), 152-156.

    LaPray, M., & Ross, R. (1969). The graded word list: Quick gauge of reading ability. Journal of Reading, 12 (4), 305-307.

    Leibert, R. E. (1991). The Dolch List Revisited: An analysis of pupil responses then and now. Reading Horizons, 31 (3), 217-227.

    Leslie, L., & Caldwell, J. (1995). Qualitative Reading Inventory-II. New York: HarperCollins.

    Lessard, A. (n.d.). Peterborough, NH: The Peterborough Group.

    Linguistic Diagnostic. (1997). Unknown author, citation unavailable. Received from teacher in Unified, NH.

    Lipson, M. Y., & Wixson, K. K. (1991). Assessment and instruction of reading disability: An interactive approach. New York: HarperCollins.

    MacArthur CCDP Follow-Up Study. (1998, February). Literacy Assessment: MacArthur Foundation Pathways Study. Los Angeles, CA: University of California at Los Angeles.

    Manzo, A. V., Manzo, U. C., & McKenna, M. C. (1995). Informal reading-thinking inventory. New York: Harcourt Brace.

    Mariotti, A. S., & Homan, S. P. (1997). Linking reading assessment to instruction (2nd ed.). Mahwah, NJ: Lawrence Erlbaum Associates.

    McCaig, R. A. (1990). Learning to write: A mode for curriculum and evaluation (3rd ed.). Grosse Pointe, MI: The Grosse Pointe Public School System.

    McKenna, M. C., & Kear, D. J. (1990). Measuring attitude toward reading: A new tool for teachers. Reading Teacher, 43 (9), 626-639.

    Meisels, S. J. (1997). Using Work Sampling in authentic performance assessments. Educational Leadership, 54, 60-65.

    Meisels, S. J., Bickel, D. D., Nicholson, J., Xue, Y., & Atkins-Burnett, S. (in press). Trusting teachers' judgments: A validity study of a curriculum-embedded performance assessment in K-3. American Educational Research Journal.

    Meisels, S., Jablon, J., Marsden, D., Dichtelmiller, M., & Dorfman, A. (1994). The Work Sampling System. Ann Arbor, MI: Rebus, Inc.

    Meisels, S. J., Liaw, F., Dorfman, A., & Nelson, R. F. (1995). The Work Sampling System: Reliability and validity of a performance assessment for young children. Early Childhood Research Quarterly, 10, 277-296.

    Michigan Department of Education. (1998). Michigan Literacy Progress Profile. Lansing, MI: Author.

    Miller, W. H. (1995). Alternative assessment techniques for reading and writing. West Nyack, NY: Center for Applied Research in Education.

    Ministry of Education. (1992). Dancing with the pen: The learner as a writer. Wellington, New Zealand: Learning Media.

    Morrow, L. M. (1986). Effects of structural guidance in story retelling on children's dictation of original stories. Journal of Reading Behavior, 18 (2), 135-152.

    Murphy, S., Shannon, P., Johnston, P., & Hansen, J. (1998). Fragile evidence: A critique of reading assessment. Mahwah, NJ: Lawrence Erlbaum Associates.

    NCREST/Cayuga-Onondaga. (1997). Elementary Literacy Profile: A New York state pilot assessment. New York: NCREST.

    North Carolina State Department of Education. (1997, November). North Carolina Grades K-2 Literacy Assessment. Raleigh, NC: Author.

    O'Connor Elementary Magnet School. (n.d.). Writing Checklist. Victoria, TX: Victoria Independent School District.

    Oregon Department of Education, Office of Assessment and Evaluation. (1998, March). Reading Assessment: Grades K-4, Third Grade Benchmark. Portland, OR: Author.

    Paris, S. G. (1991). Assessment and remediation of metacognitive aspects of children's reading comprehension. Topics in Language Disorders, 12 (1), 32-50.

    Paris, S. G., & Van Kraayenoord, C. E. (1998). Book selection. In S. Paris & H. Wellman (Eds.), Global prospects for education: Development, culture, and school (pp. 193-227). Washington, DC: American Psychological Association.

    Pearson, P. D., Sensale, L., Vyas, S., & Kim, Y. (1998, December). Early literacy assessment: A marketplace analysis. Paper presented at the annual meeting of the National Reading Conference, Austin, TX.

    Phonological Awareness and Literacy Screening. (1998). PALS Technical Manual - Executive Summary. http://curry.edschool.virginia.edu/curry/centers/pals/pals-news.html.

    Primary Language Arts Portfolio. Unknown author, citation unavailable. Received from a teacher at West Word Elementary School in Killeen, TX.

    Reading Skills Inventory. Unknown author, citation unavailable. Received from a teacher at Unified, NH.

    Resnick, L. B., & Resnick, D. P. (1992). Assessing the thinking curriculum: New tools for educational reform. In B. R. Gifford & M. C. O'Connor (Eds.), Changing assessment: Alternative views of aptitude, achievement, and instruction (pp. 37-75). Boston: Kluwer.

    Rhodes, L. K. (1993). Literacy Assessment: A handbook of instruments. Portsmouth, NH: Heinemann.

    Rosner, J. (1975). Helping children overcome learning difficulties. Novato, CA: Academic Therapy Publications.

    Rosner, J., & Simon, D. P. (1971). The Auditory Analysis Test: An initial report. Journal of Learning Disabilities, 4 (7), 40-48.

    Routman, R. (1994). Invitations: Changing as teachers and learners, K-12. Portsmouth, NH: Heinemann.

    Rubric for performance assessment. Unknown author, citation unavailable. Received from a teacher at West Word Elementary School in Killeen, TX.

    Schmitt, M. C. (1990). A questionnaire to measure children's awareness of strategic reading processes. Reading Teacher, 43 (7), 454-461.

    School District of Philadelphia, Office of Assessment. (1998). Early reading assessment. Philadelphia: Author.

    Seeds University Elementary School and the University of California at Los Angeles. (1999). Literacy Development Checklist. Los Angeles, CA: Authors.

    Shanklin, N. L. (1993). Authoring Cycle Profile. In L. K. Rhodes (Ed.), Literacy assessment: A handbook of instruments (pp. 73-105). Portsmouth, NH: Heinemann.

    Sharp, Q. Q. (1989). Evaluation: Whole language checklist for evaluating your children, for grades K to 6. New York: Scholastic.

    Shefelbine, J. (1996). Beginning Phonic Skills Test. Publisher unknown.

    Silvaroli, N. J. (1997). Classroom reading inventory (8th ed.). Madison, WI: Brown & Benchmark Publishers.

    Snow, C. E., Burns, M. S., & Griffin, P. (Eds.). (1998). Preventing reading difficulties in young children. Washington, DC: National Academy Press.

    South Brunswick Public Schools. (1998, August). Early Literacy Portfolio. South Brunswick, NJ: Author.

    Southwest Allen County Schools. (1997). Southwest Allen County Schools Curriculum-Based Assessment. Ft. Wayne, IN: Author.

    St. Vrain Valley School District. (1997, Fall). Literacy Assessment for Elementary Grades. Longmont, CO: Author.

    Stahl, S. A., & Murray, B. A. (1993). Test of Phonemic Awareness (unpublished manuscript). In A. S. Mariotti & S. P. Homan (1997), Linking reading assessment to instruction (pp. 205-206). Mahwah, NJ: Lawrence Erlbaum Associates.

    Stallman, A. C., & Pearson, P. D. (1990a). Formal measures of early literacy. In L. M. Morrow & J. K. Smith (Eds.), Assessment for instruction in early literacy (pp. 7-44). Englewood Cliffs, NJ: Prentice Hall.

    Stallman, A. C., & Pearson, P. D. (1990b). Formal measures of early literacy (No. G0087-C1001-90). Urbana: University of Illinois, Center for the Study of Reading; Cambridge, MA: Bolt, Beranek and Newman. (ERIC Document Reproduction Service No. ED 324 647)

    Stiggins, R. J. (1995). Assessment literacy for the 21st century. Phi Delta Kappan, 77 (3), 238-245.

    Stiggins, R. J., & Bridgeford, N. J. (1985). The ecology of classroom assessment. Journal of Educational Measurement, 22, 271-286.

    Stiggins, R. J., Griswold, M. M., & Wikelund, K. R. (1989). Measuring thinking elements through classroom assessment. Journal of Educational Measurement, 26, 233-246.

    Sub-Committee of the K-4 Language Arts Institute Council. (1998). South Colonie Central Schools--K-1 Assessment for Language Arts. Albany, NY: South Colonie Central Schools.

    Taylor, D. (1990). Teaching without testing: Assessing the complexity of children's literacy learning. English Education, 22, 4-74.

    Texas Education Agency. (1997). Texas Primary Reading Inventory. Austin, TX: Texas Education Agency.

    Thomson Elementary School. (n.d.). Informal Reading Readiness Assessment. Davison, MI: Author.

    Valencia, S. W., & Calfee, R. (1991). The development and use of literacy portfolios for students, classes, and teachers. Applied Measurement in Education, 4, 333-345.

    Van Kraayenoord, C., & Paris, S. G. (1996). Story construction from a picture book: An assessment activity for young learners. Early Childhood Research Quarterly, 11, 41-61.

    Van Kraayenoord, C., & Paris, S. G. (1997). Australian students' self-appraisal of their samples and academic progress. Elementary School Journal, 97 (5), 523-537.

    Wade, S. E. (1990). Using think alouds to assess comprehension. The Reading Teacher, 43 (2), 442-451.

    Winograd, P., Paris, S., & Bridge, C. (1991). Improving the assessment of literacy. The Reading Teacher, 45, 108-116.

    Wood, K. D. (1988). Techniques for assessing students' potential for learning. The Reading Teacher, 41 (1), 440-447.

    Wood, M. L., & Moe, A. J. (1995). Analytical reading inventory (5th ed.). Upper Saddle River, NJ: Prentice Hall.

    Yopp, H. K. (1995). A test for assessing phonemic awareness in young children. The Reading Teacher, 49 (1), 20-29.

    Zutell, J., & Rasinski, T. V. (1991). Training teachers to attend to their students' oral reading fluency. Theory into Practice, 30 (3), 211-217.

    Appendix A: Coding Manual for Informal Literacy Assessments

    This Manual describes a classification system for analyzing teacher-, district-, and research-developed literacy assessments for grades K-3. The manual is divided into two sections: General Overview, and Skills or Elements Tested. The General Overview consists of basic information about the assessment (e.g., name, author), a brief summary of the content (e.g., grade, format), psychometric information (e.g., standardization, reliability, validity), and additional information (e.g., description of how to develop a portfolio system). The Skills or Elements Tested section contains information specific to particular skills and elements included in the assessments. For example, an assessment that focuses on mechanics in compositional writing may be further divided into the student's use of punctuation marks, grammatically correct sentences, correct spelling, and so forth. For each particular element, information is presented concerning grade level, frequency and mode of administration, and scoring. For an outline of the Manual's contents, see Table 1.

    The coding classification systems are a synthesis of codes used in other sources including Kendall and Marzano (1997), Pearson, Sensale, Vyas, and Kim (1998), Stallman and Pearson (1990), and Stiggins (1995). Most of the definitions were derived from Harris and Hodges (1995), Pearson et al. (1998), and Stallman and Pearson (1990).

    Table 1: Outline of the Coding System

    I. General Overview

    A. Title of assessment

    B. Author(s)

    C. Availability

    D. Overall purpose

    E. Language(s)

    F. Grade/age

    G. Form of administration

    H. Frequency

    I. Amount of time required to administer

    J. Assessment model(s)

    K. Format(s) for recording student response

    L. Category of elements

    M. Standardization

    N. Psychometric properties

    O. Comments

    P. Notes

    II. Skills or Elements Tested

    A. Skill or element

    B. Grade

    C. Form of administration

    D. Frequency

    E. Amount of time required to administer

    F. Assessment model

    G. Item response format

    H. Number of items

    I. Description of items

    J. Presentation

    1. Mode

    2. Unit of presentation

    K. Response

    1. Type of mental processing

    2. Unit of response

    3. Student response

    L. Scoring

    M. Notes

    Coding Manual

    I. General Overview: This section contains basic information relevant to the entire assessment, as well as an overview of the assessment's content.

    A. Title of assessment: The name used by the author(s) for identifying the assessment and any acronyms or abbreviations.
    B. Author(s): The individual, group of individuals, or organization name(s) on the cover page of the assessment. If no name is provided, the individual from whom the assessment was obtained is listed.
    C. Availability: Information concerning how to obtain a copy of the instrument. The contact person and publication date are noted. If this information is unclear, the name of the individual from whom the assessment was obtained is noted.
    D. Overall purpose: The author(s)' description of the purpose of the assessment. If a description is not provided, then a generic description is supplied, such as "to evaluate students' reading and writing abilities."
    E. Language(s): All languages in which a version of the assessment is provided are listed. For most assessments, English only may be assumed. If languages are not specified but the author indicates availability in a second language, the author should be contacted for further information.
    F. Grade/age: The grades or ages of the students whom the measure is designed to assess.
    G. Form of administration: The type of administration for which the measure is designed.
       • Group
       • Individual
       • Group or Individual
    H. Frequency: How often the measure is administered during the year (e.g., fall and spring).
    I. Amount of time required to administer: The length of time required for administering all sections of the assessment. If an amount of time is not provided, the author should be contacted.
    J. Assessment model(s): The method(s) used for assessment.
    K. Format(s) for recording student response(s): All item formats used to document student responses.
    L. Category of elements: The general elements assessed by the measure; specific elements are described in section II. The total number of specific elements assessed (identified by the letter A in the next section) is noted here.
    M. Standardization: A description of the sample used for establishing norms, such as age, grade, gender, and ethnicity.
    N. Psychometric properties: A description of the measurement properties that relate to the development, administration, and interpretation of the test.
    O. Comments: Any other relevant information not described elsewhere.
    P. Notes: How the assessment was obtained.

    II. Skills or Elements Tested: This section contains information pertaining to the specific literacy skills or elements covered by each assessment. Each element (described in section A) is identified, and then information about how this element is assessed is provided in sections B-M. Because assessments typically assess more than one skill or element, there are usually several Elements Tested sections for each assessment.

    A. Elements, Standards, and Benchmarks: The skills or elements are divided into eleven literacy-related categories with two additional categories examining student oral language and other elements. These categories are further subdivided into specific elements. The specific element is designated by the letter A. Accompanying information about that element is included in paragraphs identified by letters B-M (see Table 1).

    This Manual utilizes the format for representing state and national standards compiled by the Mid-continent Regional Educational Laboratory (McREL; Kendall & Marzano, 1997). This widely accepted format is used to identify the relevant standard(s) and benchmark(s) being assessed. The McREL content standards describe the knowledge and skills that students should attain. The content standards encompass three general types of knowledge: procedural (which is most often used in Language Arts), declarative, and contextual. Benchmarks, which are subcomponents of standards, identify expected levels of understanding or skills at various grade levels.

    The McREL English Language Arts subject area contains eight standards with two levels of benchmarks: grades K-2 and 3-5. (For a complete listing of the benchmarks and levels for each standard, see the Appendix to the Coding Manual.) Whenever possible, each literacy-related category is identified by the appropriate standard(s). However, not all categories correspond to a standard. Relevant benchmarks are noted in parentheses following each specific element. If the author(s) identifies state standards as the basis for the assessment, this is noted in the comments section of the General Overview.
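
    To make the two-level record described above concrete, the following sketch shows one way the General Overview and Skills or Elements sections, together with their McREL benchmark references, could be represented in software. This is our own minimal illustration in Python; the class and field names are assumptions for this sketch, not part of the Coding Manual itself.

        # A hypothetical data structure mirroring the coding scheme:
        # one AssessmentRecord (General Overview, items A-P, abridged here)
        # holding many ElementRecords (Skills or Elements Tested, items A-M).
        from dataclasses import dataclass, field
        from typing import List, Optional

        @dataclass
        class ElementRecord:
            element: str                             # A. skill or element
            category: str                            # literacy category
            benchmarks: List[str] = field(default_factory=list)  # e.g., ["3.8"]
            grades: str = "K-3"                      # B. grade
            administration: str = "Individual"       # C. form of administration
            assessment_model: Optional[str] = None   # F. e.g., "Observation"
            item_format: Optional[str] = None        # G. e.g., "Checklist"
            mental_processing: Optional[str] = None  # K.1. e.g., "Production"
            student_response: Optional[str] = None   # K.3. e.g., "Write"

        @dataclass
        class AssessmentRecord:
            title: str                               # A. title of assessment
            authors: str                             # B. author(s)
            languages: List[str] = field(default_factory=lambda: ["English"])
            elements: List[ElementRecord] = field(default_factory=list)

            def standards_covered(self):
                # A benchmark such as "3.8" falls under Standard 3: the digits
                # before the dot name the standard, the rest the benchmark.
                return {b.split(".")[0] for e in self.elements for b in e.benchmarks}

        # Hypothetical example: coding a spelling element on a made-up checklist.
        rec = AssessmentRecord(title="Example K-1 Writing Checklist", authors="n/a")
        rec.elements.append(ElementRecord(
            element="Spelling", category="Conventions", benchmarks=["3.8", "3.20"],
            assessment_model="Observation", item_format="Checklist",
            mental_processing="Production", student_response="Write"))
        print(rec.standards_covered())  # {'3'}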

     

    1. Literacy Category: Writing Process

    Standards:

    1.0--Demonstrates competence in the general skills and strategies of the writing process.

    2.0--Demonstrates competence in the stylistic and rhetorical aspects of writing.

    Specific Elements:

    a. Illustrations Are Representative of the Story: The student's drawing matches the story with details.

    b. Message Quality: The composition contains the author's idea about a certain topic and a coherent message that holds together.

    c. Types of Compositions: The student composes a variety of products, such as poems, stories, lists, letters. (1.8)

    d. Uses Illustrations to Express Ideas: The student uses drawings and maybe simple words to express his/her ideas that relate to a story.

    e. Uses Lively and Descriptive Language: The student uses strategies such as dialogue, description or suspense in writing. (2.2)

    f. Use of Formal and/or Literary Language: The student uses the vocabulary, themes, and language structure from books in own writing (e.g., "Once upon a time").

    g. Vocabulary Usage: The extent to which different words are used in writing or speaking.

    h. Writing Attends to Audience: The composition shows awareness of an intended audience. (1.13)

    i. Writing Behaviors: The student writes and/or participates in writing behaviors, such as pretend writing activities (e.g., drawings, scribbles, random letters).

    j. Writing Contains a Purpose: The composition conveys an intended purpose. (1.14)

    k. Writing Contains Description and Details: Uses description and supportive details to develop and elaborate ideas. (2.2)

    l. Writing Conveys a Sense of Story: The composition contains a sense of narrative.

    m. Writing Has Evidence of Beginning, Middle, and End: The composition presents a beginning, a middle, and an end.

    n. Writing is Easy to Understand and Follow: Writing is clear, organized, focused, and makes sense. This element refers to simple writing, such as the use of one or two sentences.

    o. Writing is Logical and Sequential: The composition contains a clearly logical and sequential order of events.

    p. Writing Process: Understands the many aspects of the complex act of producing a written communication; specifically, choosing a topic of interest, planning or prewriting, drafting, revising, editing, and publishing. (1.1, 1.2, 1.3, 1.9, 1.10, 1.11)

     

     

    2. Literacy Category: Conventions

    Standards:

    3.0--Uses grammatical and mechanical conventions in written compositions.

    Specific Elements:

    a. Capitalization: Uses capitalization appropriately in writing. (3.9)

    b. Directional Principles in Writing: The student's composition illustrates an ability to perceive spatial and directional orientation (e.g., letters and words are arranged from left to right and top to bottom).

    c. Grammatically Correct Sentences: The degree to which a written or spoken utterance follows the grammatical rules of language, such as understanding subject-verb agreement. Additionally, the use of grammatically complex structures in compositions (e.g., the number of clauses in a sentence) and discriminating between types of sentences is included. (2.4, 3.2, 3.3, 3.12)

    d. Handwriting: Uses accurate letter formation. (3.1)

    e. Linguistic Organization: The ability to organize language forms, such as phonemes and morphemes (e.g., writing a recognizable word or a simple sentence). (3.1, 3.2)

    f. Paragraphs: Student uses paragraph form in writing. (2.3)

    g. Punctuation Marks: Using graphic marks appropriately in written phrases and sentences to clarify meaning or to give speech characteristics to written materials. (3.10, 3.22)

    h. Spelling: The process of representing language by means of a writing system; this includes invented or transitional spelling. (3.8, 3.20)

    i. Uses Complex Word Structures: Understands and uses compound words, contractions, root words, prefixes and suffixes, and sorts words by common patterns (e.g., -ack, -ight) in writing.

    j. Uses Upper- and Lower-Case Letters in Writing: Using different letter forms that may be either a smaller letter (lower-case) or a larger letter (upper-case) (e.g., John played with Bob). (3.9)

    k. Writes Own Name: Student correctly writes his/her own name.

     

     

    3. Literacy Category: Print Awareness

    Standards:

    5.0--Demonstrates competence in the general skills and strategies of the reading process.

    Specific Elements:

    a. Concept of Letter or Word: Understands concepts of a letter or word only.

    b. Directionality: The ability to perceive spatial and directional orientation when reading (e.g., reads from left to right and reads from left page to right page). (5.2)

    c. Identification of Parts of a Book: The student identifies the front and the back of a book, the title, the author, etc. (5.2)

    d. Labels Pictures: Student labels and/or describes pictures and retells what has been written.

    e. Letter and Word Order: The sequential arrangement of letters in a morpheme, of words in a phrase, clause, or sentence, or of phrases in a line sequence. (5.2)

    f. Sense of Story: The student understands that the printed text represents a narrative with characters, main ideas, details, and a beginning, middle, and end.

    g. Understands Punctuation Marks: The student identifies punctuation marks and either tells why they are used or uses them appropriately (e.g., if shown a ?, he or she can verbalize question mark or raises voice at end of sentence).

    h. Understands That Print Conveys Meaning: The student understands that the graphic symbols of a text represent a thought or a story and that print preserves that meaning. (5.1)

    i. Understands Upper- and Lower-Case Letters: The student understands the differences between upper- and lower-case letters.

    j. Word Boundaries: The student identifies the beginning and the end of a word or a sentence and understands the concept of first and last. Knowing where to start reading, differentiating between morphemes by placing a space between them (e.g., playing ball for playingball), and understanding the bottom and top of a picture are also considered word boundaries. (3.1, 5.2)

     

     

     

    4. Literacy Category: Aspects of Word Recognition

    Standards:

    5.0--Demonstrates competence in the general skills and strategies of the reading process.

    Specific Elements:

    a. Decoding Words: Students translate or analyze spoken or graphic symbols of a familiar language to ascertain their intended meaning. Word identification and sight vocabulary, which refer to the process of determining the pronunciation and some degree of meaning of a word in written or printed form, are also considered decoding. The differentiation between the two depends on the student's prior knowledge of the word. (5.5, 5.13, 5.14)

    b. Identification of Beginning Sounds: The application of phonic skills in reproducing the sound(s) represented by a letter or letter group in a word. Knowing the sounds for each letter and matching phonemes with their letters are also considered identification of beginning sounds. (5.5)

    c. Letter Identification: The process of determining one of a set of graphic symbols that forms an alphabet.

    d. Manipulation of Sounds: The student changes the beginning, middle, and ending sounds to produce words or nonwords.

    e. Segmenting and Blending: Awareness of the sounds (phonemes) that make up spoken or written words (e.g., blending and segmenting phonemes and syllables). (5.5)

    f. Production of Rhyming Words: Articulating identical or very similar beginning and final sounds in words or at the ends of lines of a verse (e.g., "book" and "took").

    g. Sound-Symbol Correspondence: The relationship between a phoneme and its graphemic representation(s) in writing and reading (e.g., /s/, spelled s in sit, c in city, and ss in grass).

     

    5. Literacy Category: Oral Reading

    Standards:

    5.0--Demonstrates competence in the general skills and strategies of the reading process.

    Specific Elements:

    a. Book Topic: The student predicts what the book is about from the title.

    b. Fluency: The clear, easy written or spoken expression of ideas at a normal rate of reading (e.g., the student's reading can be choppy vs. fluid).

    c. Identifies Own Name: The student recognizes own name in print.

    d. Instructions: Reads and understands simple and multiple instructions.

    e. Pretend Reading: Refers to participating in reading-related activities and make-believe reading, such as turning pages of a book while inventing words and repeating the contents of a book from memory after listening to it.

    f. Reading Accuracy: The number of different words identified correctly while reading. (6.1, 6.7, 7.1, 7.5)

    g. Reading Flexibility: The adjustment of one's reading speed, purpose, or strategies to the prevailing contextual conditions (e.g., use of inflection while reading).

    h. Reads as if Passage is Meaningful: The student understands what he/she is saying/reading.

    i. Texts Student Can Read: The type of texts the student is able to read. This refers to such diverse skills as: recognizing own name in print, reading words in the environment, reading simple text, reading complex children's literature, reading different genres, and interpreting reference materials, such as dictionaries, tables of contents, diagrams, and maps. (6.1, 6.7, 7.1, 7.5)

    j. Use of Book Language: The student's use of common phrases found in text when telling stories, such as "Once upon a time" and "The End".

    k. Voice-To-Print Match: An understanding of the one-to-one correspondence between the printed words on a page and the words as they are read aloud.

     

    6. Literacy Category: Reading Strategies

    Standard:

    5.0--Demonstrates competence in the general skills and strategies of the reading process.

    Specific Elements:

    a. Locating Answers: The student rereads or goes through a book focusing on detail to locate specific information and to clarify meaning.

    b. Monitoring Own Reading Strategies: When reading, the student monitors his/her own reading and makes modifications that produce grammatically acceptable sentences and that make meaningful substitutions. (5.16)

    c. Self-Correction: The student corrects him or herself when mispronouncing a word. (5.7)

    d. Using Pictures and Story Line for Predicting Context and Words: The ability to predict what will happen next in a story and to determine the meaning of words by using pictorial and contextual cues. (5.4)

    e. Using Print for Predicting Meaning of the Text: The ability to use one's knowledge of the rules and patterns of language to find the meaning of the text.

    f. Way of Reading: How the student reads the text, orally or silently.

     

    7. Literacy Category: Comprehension

    Standards:

    6.0--Demonstrates competence in general skills and strategies for reading a variety of literary texts.

    7.0--Demonstrates competence in general skills and strategies for reading a variety of informational texts.

    Specific Elements:

    a. Comments on Literary Aspects of the Text: The student evaluates and/or judges the characters, authors, genre, figurative language, symbols, and tone of the text orally and in writing.

    b. Connects Universally Shared Experiences With Text: The student relates previous knowledge to the current text. (6.6, 6.15, 7.4)

    c. Distinguishes Fantasy From Realistic Texts: The student understands the difference between fiction and nonfiction. (6.8)

    d. Drawing Conclusions: The student is able to make connections and build from the text to draw conclusions.

    e. Identifies Cause-Effect Relationships: Notices the stated or implied association between an outcome and the conditions that brought it about; often an organizing principle in narrative and expository text.

    f. Inferences: The student uses the text and prior knowledge to make inferences about what will happen next. (6.4, 6.12)

    g. Literal Comprehension: The student reconstructs the intended meaning of a communication and can understand accurately what is written or said.

    h. Literary Analysis: The analysis of the structural characteristics of the text, such as setting, characters, and events. (6.11)

    i. Prediction Strategies: The student uses knowledge about language and the context in which it occurs to anticipate what is about to take place in writing, speech, or reading. (5.12, 6.4, 6.12)

    j. Provides Supporting Details: The student identifies setting, main characters, main events, objects, and problems in stories, and notices nuances and subtleties of text. (6.3)

    k. Reference to Evidence Presented in Text: Student supports ideas with proof from the text.

    l. Retelling: The process by which the reader, having heard or silently read a story, describes what happened in it. (7.3, 7.9)

    m. Sequence of Story's Events: The arranging or ordering of subject matter in a logical progression.

    n. Summarizes Main Ideas and Points: The student understands the gist of a passage or central thought. (6.5, 7.2)

    o. Wider Meaning: The ability to understand the greater meaning of the text.

     

    8. Literacy Category: Motivation

    Specific Elements:

    a. Book Referral: The student recommends books that he/she has read to others.

    b. Current Reading Practices: The book(s) the student is reading currently.

    c. Family Support and Prior Experience: Family influence on literacy behavior and opportunities provided for the student, such as being read to before school entry, having books in the home, and visiting the library.

    d. Reading Preferences: An explanation of which books the student prefers to read or reread.

    e. Response to Literature: The student's oral or written reaction to the materials read, such as what he/she liked and disliked about the text and his/her personal point of view. (1.7, 1.19)

    f. Student Reads for Own Purposes: Student reads to suit personal needs and preferences.

    g. Time Spent: The amount of time the student spends on reading and writing.

    h. Other: Other motivation elements related to literacy that do not fit within these elements.

     

    9. Literacy Category: Self-Perception/Self-Concept

    Specific Elements:

    a. Characteristics of a Good Reader: Student's opinion of what constitutes a good reader.

    b. Learning and Understanding: The student believes he/she understands what he/she read and/or feels he/she has learned something.

    c. Others' Opinions: Student's perceptions of how others feel about the student's reading ability (e.g., peers, teacher).

    d. Reads Independently: The degree of independence and confidence the student demonstrates while reading.

    e. Writes Independently: The degree of confidence and independence the student has as a writer.

     

    10. Literacy Category: Metacognition

    Specific Elements:

    a. Familiarity With Types of Texts: Demonstrates familiarity with a variety of different types of texts related to reading.

    b. Monitoring How Student Reads: Student can summarize and clarify what he/she read; has strategies available to determine unknown words; employs reinspection or look-backs; and uses repair strategies.

    c. Personal Progress: The student's evaluation of how well his/her reading and writing abilities are improving and which areas need improvement.

    d. Planning How to Read: Student analyzes the task required of him/her; the kind of reading materials; what he/she already knows about the subject; what he/she expects to learn.

    e. Pride: Which of these pieces of work is the student proud of?

    f. Reading-Related Behaviors: Activities or behaviors the student takes part in that have some association with reading.

    g. Self-Assessment in Non-Language Arts Domains: What else is the student trying to improve?

    h. Self-Review: How the student feels when reviewing or evaluating his/her literacy work. (1.4, 1.12)

    i. Sharing With Others: Student shares his/her work and ideas with others (teacher, parents, and peers).

    j. Strategy-Execution for How to Read: Student selects a suitable strategy that will allow him/her to realize a learning goal; may elect to skim the passage and develop a set of guiding questions, use story grammar, a pattern guide, imaging, note-taking, or other strategies; reader initiates the reading task with the most appropriate strategy to facilitate the meaning-making process. (6.1)

    k. Teacher Feedback: The teacher informs the student about work that was good and work needing improvement.

    l. Writing-Related Behaviors: How the student goes about writing and other relevant writing behaviors.

    m. Other: Other metacognition elements related to literacy that do not fit within the other elements (e.g., any element that the teacher or other individuals evaluate directly).

     

    11. Literacy Category: Attitude

    Specific Elements:

    a. Attitudes Towards Other Literacy Activities: The student's attitudes about other literacy activities, such as going to the library or using a dictionary.

    b. Attitudes Towards Reading: The student's feelings regarding reading per se (e.g., learning from a book, reading is important, etc.).

    c. Attitudes Towards Reading Behaviors: The student's feelings regarding reading behaviors (e.g., getting a book for a present, reading during summer vacation).

    d. Attitudes Towards Writing: The student's feelings about writing per se (e.g., student does or does not enjoy writing).

    e. Other: Other attitude elements related to literacy that do not fit within the other elements (e.g., any element that the teacher or other individuals evaluate directly).

     

    12. Literacy Category: Oral Language (Listening and Speaking)

    Standards:

    8.0--Demonstrates competence in speaking and listening as tools for learning.

     

    Specific Elements:

    a. Ask for Clarification: The student is able to request clarification when necessary.

    b. Communicates Effectively: The student is able to communicate major ideas effectively by presenting them in an organized manner.

    c. Figurative Language: Uses lively and descriptive language by varying pace, tone, and volume in different situations (e.g., experiments with language patterns).

    d. Holds Attention of Others: Student is able to sustain the attention of others when speaking.

    e. Language Production: The ability to listen and express oneself verbally in a clear, understandable fashion, ranging from simple to complex sentences (e.g., gives clear directions orally). (8.14)

    f. Listens Attentively: The student listens actively for long periods of time.

    g. Oral Directions: The student listens and responds to oral directions appropriately. (8.6)

    h. Others' Perspective: Student demonstrates an ability to understand other perspectives or points of view and responds with appropriate behaviors.

    i. Participates in Group Discussion: The student contributes to small group or class discussions (e.g., to discuss reading or writing). (8.2, 8.10)

    j. Questions: The student elicits and responds effectively to questions.

    k. Responses Make Connections to the Situation: Student draws meaningful connections between ideas.

    l. Self-Corrects When Speaking: The student corrects him/herself when language is inconsistent or inaccurate.

    m. Story Telling/Retelling: Ability to tell or retell a literary or personal story well.

    n. Various Types of Communication: Student participates in a range and variety of talk, such as planning an event, solving a problem, expressing a point of view, and reporting results of an investigation.

     


    13. Other

    Specific Elements:

    a. Color Identification: Naming the correct colors of objects presented by the administrator.

    b. Fact vs. Opinion: The ability to distinguish between fact and opinion.

    c. Note-Taking: Ability to outline or summarize the important ideas of a lecture, book, or other source of information to aid in the organization and retention of ideas. (4.3)

    d. Presentations: The student writes about, organizes, and presents information in an appropriate format.

    e. Reference Elements: The ability to search and locate information from pictures and other sources, such as a dictionary or encyclopedia. (4.4, 4.5)

    f. Skimming: The student is able to obtain information from a text quickly.

    g. Similarities and Differences: Identifies similarities and differences between objects (e.g., picking the bears that do not match from a set of pictures).

    h. Synonyms and Antonyms: The ability to identify and use one of two or more words that have highly similar meanings and/or have opposite meanings.

    i. Text Comparison: Compares and contrasts poems, informational selections, or other literary selections.

    j. Topic Knowledge: The ability to understand the general category or class of ideas from a text.

    k. Use of Text: Uses text for a variety of functions, including literary, informational, and practical.

    l. Other: Any unrelated elements.

     

    B. Grade/age: The grades or ages of the students for whom the element is intended. If no grade or age is provided, grade levels may be inferred from the element and the instrument. However, if questions remain, the author should be contacted.

    C. Form of administration: Refers to the type of administration used to assess a particular element.

    1. Group
    2. Individual
    3. Group or Individual

    D. Frequency: How often the element is assessed during the year (e.g., fall and spring).

    E. Amount of time required to administer: The length of time required for assessing the element. If an amount of time is not provided, the author should be contacted. However, due to the nature of the way the element is assessed, this parameter may not be applicable. For example, if a student is asked to write a response, the amount of time may vary greatly.

    F. Assessment model: The assessment method used for assessing literacy knowledge.

    1. Clinical Interview: The teacher gathers information regarding the student's process of thinking while engaging in literacy activities.
    2. Constructed Response: Student is asked to provide a range of answers or responses within a broad structure.
    3. Observation: Teacher observes the student's literacy practices in a natural or contrived setting.
    4. On-Demand Response (Closed Set Response): Student is asked to provide the correct answer, often in response to a limited set of responses.
    5. Student Self-Assessment: The student evaluates his/her own work.
    6. Multiple Responses: The assessment evaluates the element in multiple ways.

    G. Format for recording student response: The item format used to track student responses.

    1. Checklist: The examiner keeps track of the quality and/or occurrence of student responses to items on a predetermined list.
    2. Dictation: A message spoken by the teacher for students to encode.
    3. Informal Reading Inventory: The use of a graded series of passages and sentences of increasing difficulty to determine students' strengths, weaknesses, and strategies in word identification and comprehension.
    4. Miscue Analysis: A formal examination of the use of miscues or errors as the basis for determining the strengths and weaknesses of students as they read.
    5. Observation: The teacher observes student behavior formally or informally. This option is used as the default.
    6. Oral-Directed: The student verbalizes his or her response to a question that only allows for one correct answer.
    7. Oral Open-Ended/Constructed Response: Questions or tasks used to explore a student's understanding or skills in reading or literacy that are intended to produce an oral free response, rather than a directed one; the response is recorded by the teacher or the administrator.
    8. Running Records: A neutral observation of students' skills and capabilities as they read; the teacher informally tracks the student's reading ability.
    9. Written-Directed: The student writes his or her response to a question that allows for only one correct answer.
    10. Written Open-Ended/Constructed Response: Questions or tasks used to explore a student's understanding or skills in reading or literacy that are intended to produce a written free response, rather than a directed one; the response is recorded by the teacher or the administrator.
    11. Miscue Analysis/Informal Reading Inventory: Combination of the two formats.
    12. Multiple Responses: The assessment evaluates the element in multiple ways.

    H. Number of items: The number of items used to assess the skill or element. If a skill is assessed only through oral reading of a passage, then the number of items should be zero.

    I. Description: Description of the items, such as the number of words in a passage or the number of alternative word lists provided.

    J. Presentation: How the items are presented to the students.

    1. Mode: The primary mode of presentation used by the examiner.

    a. Auditory: Student responds to something that the examiner says or to another auditory stimulus.

    b. Visual: Test items are administered visually, through written text or illustrations.

    c. Auditory and Visual, Mixed: The test is administered both orally and visually, through written text or illustrations.

    d. Production: The administrator writes something to which the student must immediately respond; for example, the student copies his/her name after the examiner writes it.

    e. Other: Teacher develops her/his own tasks, or the teacher may observe the student in a nonstandardized method.

    f. Multiple Responses: The assessment evaluates the element in multiple ways.

    2. Unit of presentation: The type of stimulus to which the student is asked to respond.

    a. Auditory-General: Any form of auditory presentation ranging from single letter or word to connected discourse.

    b. Book: A written or printed composition gathered into successive pages and bound together in a volume; this includes illustrated books.

    c. Connected Discourse: Several connected sentences that convey meaning.

    d. Gesture: Body movement used to communicate; specifically, spontaneous movement of the hands and arms that are closely synchronized with the flow of speech.

    e. Grapheme: A written or printed representation of a phoneme (e.g., b for /b/ and oy for /oi/ in boy).

    f. Incomplete Passage: The student is given a passage with words missing, such as a cloze test. The cloze test requires a student to fill in the blank with a word that makes sense within the surrounding text.

    g. Incomplete Word/Sentence: A morpheme or sentence that is missing a letter, such as swi for swim, or missing a word, such as the boy______ the ball for the boy kicked the ball.

    h. Letter: A graphic alphabetic symbol.

    i. Nonsense Word: A pronounceable combination of graphic characters that do not constitute a real word.

    j. Number: A symbol or word depicting how many or which one in a series (e.g., 2, four, sixth).

    k. Object: Something that can be manipulated (e.g., a block).

    l. Patterns: A set of predictable relations that can be described and arranged in a particular configuration.

    m. Phoneme: A minimal sound unit of speech that, when contrasted with another phoneme, affects the meaning of words in a language (e.g., /b/ in book contrasts with /t/ in took, /k/ in cook, or /h/ in hook).

    n. Phrase: A grammatical construction without a subject and predicate.

    o. Picture With Directions From Administrator: Specific directions that directly relate to the student's interaction with a picture. The directions will not make sense without the picture, and the picture may be specific to each item.

    p. Punctuation Mark: One of the set of graphic marks used in written phrases and sentences to clarify meaning or to give speech characteristics to written materials.

    q. Sentence/Question: A grammatical unit of one or more words.

    r. Story: A narrative tale with a plot, characters, and setting.

    s. Syllable: In phonology, a minimal unit of sequential speech sounds composed of a vowel sound or a vowel-consonant combination, for example, /a/, /ba/, /ab/, /bab/, etc.

    t. Symbol: Any arbitrary, conventional, written or printed mark intended to communicate, such as letters, numerals, ideographs, etc.

    u. Visual-General: Any form of visual presentation ranging from single letter or word to connected discourse.

    v. Word: A morpheme that is regarded as a pronounceable and meaningful unit.

    w. Other: Teacher develops her/his own tasks, or the teacher may observe the student in a nonstandardized method.

    x. Multiple Responses: The assessment evaluates the element in multiple ways.

    K. Specific type of student response

    1. Type of mental processing: How the student processes the information presented in order to provide the appropriate response.

    a. Identification: The student names the letter, word, picture, etc.

    b. Production: The student writes, speaks, or performs the response.

    c. Recall: The student retrieves information that was presented earlier.

    d. Recognition: The student selects the correct responses from a list of alternatives.

    e. Reproduction: The student copies what the teacher has written or performed.

    f. Combination of two or more of the above.

    g. Other: The teacher develops her/his own tasks, or the teacher may observe the student in a nonstandardized method.

    h. Multiple Responses: The assessment evaluates the element in multiple ways.

    2. Unit of response: Stimuli that the student uses to indicate the correct answer to the item.

    a. Book: A written or printed composition gathered into successive pages and bound together in a volume; this includes illustrated books.

    b. Clause: A group of words with a subject and a predicate used to form either a part of or a whole sentence.

    c. Connected Discourse: Several connected sentences that convey meaning.

    d. Gesture: Body movement used to communicate.

    e. Grapheme: A written or printed representation of a phoneme (e.g., b for /b/ and oy for /oi/ in boy).

    f. Letter: A graphic alphabetic symbol.

    g. Letter that corresponds with intended response: Identifying the correct response from among several alternatives, such as multiple-choice questions.

    h. Nonsense Word: A pronounceable combination of graphic characters that do not make a real word.

    i. Number: A symbol or word showing how many or which one in a series (e.g., 2, four, sixth).

    j. Object: Something that can be manipulated (e.g., a block).

    k. Oral: Varied forms of oral response ranging from a letter to connected discourse.

    l. Passage: Any section of a written text.

    m. Phrase: A grammatical construction without a subject and a predicate.

    n. Picture: An illustration produced by a drawing, painting, or photograph.

    o. Punctuation Mark: One of the set of graphic marks used in written phrases and sentences to clarify meaning or to give speech characteristics to written materials.

    p. Sentence: A grammatical unit of one or more words containing a subject and predicate.

    q. Shape: Something that depends on the relative position of all the points on its surface; a physical form.

    r. Sound: A distinctive feature of a speech sound.

    s. Word: A morpheme that is regarded as a pronounceable and meaningful unit.

    t. Written: Varied forms of written response ranging from a letter to connected discourse.

    u. Other: Teacher develops her/his own tasks, or the teacher may observe the student in a nonstandardized method.

    v. Multiple Responses: The assessment evaluates the element in multiple ways.

    3. Student response: Categorization of what the student does in order to respond to the item.

    a. Circle: A curved line that is placed around the correct answer.

    b. Color: The student uses a pigmented instrument for producing the correct response.

    c. Draw: A response is drawn with a writing instrument.

    d. Fill in the Blank: The missing word or words are written or verbalized in the appropriate place.

    e. Fill in the Circle: A writing instrument, usually a pencil, is used to darken a circle that indicates the correct answer.

    f. Find: The student searches for the correct response.

    g. Manipulate: The student alters something in order to produce the correct response.

    h. Mark: An arbitrary, conventional, written, or printed mark intended to indicate the correct answer.

    i. Perform: The student is asked to follow through on a task presented by the examiner.

    j. Point: The student indicates with a finger or a writing implement the correct response.

    k. Responds Orally: The correct answer is verbalized.

    l. Sort/Organize: The student places objects in the correct sequence or categories.

    m. Underline: The student places a horizontal line under the correct response.

    n. Uses the mouse and/or keyboard from a computer: The student uses the mouse or keyboard to point, write, or indicate the correct response while working on a computer program.

    o. Write: The student uses a writing system or orthography to produce the correct response.

    p. Other: Teacher develops her/his own tasks, or the teacher may observe the student in a nonstandardized method.

    q. Multiple Responses: The assessment evaluates the element in multiple ways.

    L. Scoring: Description of how items are scored (yes/no, rating scale, rubric, or computer-scored).

    M. Notes: Any additional relevant information particular to this element.
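
    Taken together, parameters A through M define a flat record for each coded element. The Python sketch below is purely illustrative and is not part of the original coding manual; it shows one way such records might be stored for electronic tabulation. All class names, field names, and example values are assumptions of the sketch, not the authors' instrument.

    # A minimal, illustrative sketch (not part of the original coding manual):
    # one way a coded element could be stored as a structured record.
    # All class names, field names, and example values are assumptions.

    from dataclasses import dataclass
    from enum import Enum
    from typing import List, Optional


    class FormOfAdministration(Enum):  # parameter C
        GROUP = "group"
        INDIVIDUAL = "individual"
        GROUP_OR_INDIVIDUAL = "group or individual"


    class AssessmentModel(Enum):  # parameter F
        CLINICAL_INTERVIEW = "clinical interview"
        CONSTRUCTED_RESPONSE = "constructed response"
        OBSERVATION = "observation"
        ON_DEMAND_RESPONSE = "on-demand (closed set) response"
        STUDENT_SELF_ASSESSMENT = "student self-assessment"
        MULTIPLE_RESPONSES = "multiple responses"


    @dataclass
    class CodedElement:
        """One literacy element coded along parameters A through M."""
        category: str                          # A: literacy category, e.g., "Comprehension"
        element: str                           # A: specific element, e.g., "Retelling"
        grades: List[str]                      # B: grade/age, e.g., ["K", "1"]
        administration: FormOfAdministration   # C: form of administration
        frequency: Optional[str] = None        # D: e.g., "fall and spring"
        time_required: Optional[str] = None    # E: may be unknown or not applicable
        model: AssessmentModel = AssessmentModel.OBSERVATION  # F
        recording_format: str = "observation"  # G: observation is the manual's default
        n_items: int = 0                       # H: zero if assessed only via oral reading
        description: Optional[str] = None      # I: item description
        presentation_mode: Optional[str] = None   # J.1: auditory, visual, etc.
        presentation_unit: Optional[str] = None   # J.2: letter, word, story, etc.
        mental_processing: Optional[str] = None   # K.1: identification, recall, etc.
        response_unit: Optional[str] = None       # K.2: oral, written, gesture, etc.
        student_response: Optional[str] = None    # K.3: point, write, respond orally, etc.
        scoring: Optional[str] = None          # L: yes/no, rating scale, rubric, computer-scored
        notes: Optional[str] = None            # M: additional information


    # Example record (values are illustrative, not taken from any assessment):
    retelling = CodedElement(
        category="Comprehension",
        element="Retelling",
        grades=["K", "1", "2"],
        administration=FormOfAdministration.INDIVIDUAL,
        model=AssessmentModel.OBSERVATION,
        scoring="rubric",
    )

    Note that enumerated parameters such as C (form of administration) and F (assessment model) map onto fixed value sets, while open-ended parameters (D, E, I, L, M) remain free text.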

    References

    Harris, T. L., & Hodges, R. E. (Eds.). (1995). The literacy dictionary: The vocabulary of reading and writing. Newark, DE: International Reading Association.

    Kendall, J. S., & Marzano, R. J. (1997). Content knowledge: A compendium of standards and benchmarks for K-12 education (2nd ed.). Aurora, CO: Mid-continent Regional Educational Laboratory, Inc.

    Pearson, P. D., Sensale, L., Vyas, S., & Kim, Y. (1998, December). Early literacy assessment: A marketplace analysis. Paper presented at the meeting of the National Reading Conference, Austin, TX.

    Stallman, A. C., & Pearson, P. D. (1990). Formal measures of early literacy (No. G0087-C1001-90). Cambridge, MA and Champaign, IL: Bolt, Beranek and Newman, Inc., and the University of Illinois at Urbana-Champaign, Center for the Study of Reading. (ERIC Document Reproduction Services No. ED 324 647)

    Stiggins, R. J. (1995). Assessment literacy for the 21st century. Phi Delta Kappan, 77(3), 238-245.

    Appendix to Coding Manual

    McREL Standards and Benchmarks (1997)

     

    1.0 Demonstrates competence in the general skills and strategies of the writing process

     

    Level I (Grades K-2)

    1.1 Prewriting: Uses prewriting strategies to plan written work (e.g., discusses ideas with peers, draws pictures to generate ideas, writes key thoughts and questions, rehearses ideas, records reactions and observations)

    1.2 Drafting and Revising: Uses strategies to draft and revise written work (e.g., rereads; rearranges words, sentences, and paragraphs to improve or clarify meaning; varies sentence type; adds descriptive words and details; deletes extraneous information; incorporates suggestions from peers and teachers; sharpens the focus)

    1.3 Editing and Publishing: Uses strategies to edit and publish written work (e.g., proofreads using a dictionary and other resources; edits for grammar, punctuation, capitalization, and spelling at a developmentally appropriate level; incorporates illustrations or photos; shares finished product)

    1.4 Evaluates own and others' writing (e.g., asks questions and makes comments about writing, helps classmates apply grammatical and mechanical conventions)

    1.5 Dictates or writes with a logical sequence of events (e.g., includes a beginning, middle, and ending)

    1.6 Dictates or writes detailed descriptions of familiar persons, places, objects, or experiences

    1.7 Writes in response to literature

    1.8 Writes in a variety of formats (e.g., picture books, letters, stories, poems, information pieces)

     

    Level II (Grades 3-5)

    1.9 Prewriting: Uses prewriting strategies to plan written work (e.g., uses graphic organizers, story maps, and webs; groups related ideas; takes notes; brainstorms ideas)

     

    1.10 Drafting and Revising: Uses strategies to draft and revise written work (e.g., elaborates on a central idea; writes with attention to voice, audience, word choice, tone, and imagery; uses paragraphs to develop separate ideas)

    1.11 Editing and Publishing: Uses strategies to edit and publish written work (e.g., edits for grammar, punctuation, capitalization, and spelling at a developmentally appropriate level; considers page format [paragraphs, margins, indentations, titles]; selects presentation format; incorporates photos, illustrations, charts, and graphs)

    1.12 Evaluates own and others' writing (e.g., identifies the best features of a piece of writing, determines how own writing achieves its purposes, asks for feedback, responds to classmates' writing)

    1.13 Writes stories or essays that show awareness of intended audience

    1.14 Writes stories or essays that convey an intended purpose (e.g., to record ideas, to describe, to explain)

    1.15 Writes expository compositions (e.g., identifies and stays on the topic; develops the topic with simple facts, details, examples, and explanations; excludes extraneous and inappropriate information)

    1.16 Writes narrative accounts (e.g., engages the reader by establishing a context and otherwise developing reader interest; establishes a situation, plot, point of view, setting, and conflict; creates an organizational structure that balances and unifies all narrative aspects of the story; uses sensory details and concrete language to develop plot and character; uses a range of strategies such as dialogue and tension or suspense)

    1.17 Writes autobiographical compositions (e.g., provides a context within which the incident occurs, uses simple narrative strategies, provides some insight into why this incident is memorable)

    1.18 Writes expressive compositions (e.g., expresses ideas, reflections, and observations; uses an individual, authentic voice; uses narrative strategies, relevant details, and ideas that enable the reader to imagine the world of the event or experience)

    1.19 Writes in response to literature (e.g., advances judgments; supports judgments with references to the text, other works, other authors, nonprint media, and personal knowledge)

    1.20 Writes personal letters (e.g., includes the date, address, greeting, and closing; addresses envelopes)

    2.0 Demonstrates competence in the stylistic and rhetorical aspects of writing

     


    Level I (Grades K-2)

    2.1 Uses general, frequently used words to convey basic ideas

     

    Level II (Grades 3-5)

    2.2 Uses descriptive language that clarifies and enhances ideas (e.g., describes familiar people, places, or objects)

    2.3 Uses paragraph form in writing (e.g., indents the first word of a paragraph, uses topic sentences, recognizes a paragraph as a group of sentences about one main idea, writes several related paragraphs)

    2.4 Uses a variety of sentence structures

    3.0 Uses grammatical and mechanical conventions in written compositions

     

    Level I (Grades K-2)

    3.1 Forms letters in print and spaces words and sentences

    3.2 Uses complete sentences in written compositions

    3.3 Uses declarative and interrogative sentences in written compositions

    3.4 Uses nouns in written compositions (e.g., nouns for simple objects, family members, community workers, and categories)

    3.5 Uses verbs in written compositions (e.g., verbs for a variety of situations, action words)

    3.6 Uses adjectives in written compositions (e.g., uses descriptive words)

    3.7 Uses adverbs in written compositions (i.e., uses words that answer how, when, where, and why questions)

    3.8 Uses conventions of spelling in written compositions (e.g., spells high-frequency, commonly misspelled words from appropriate grade-level list; uses a dictionary and other resources to spell words; spells own first and last name)

    3.9 Uses conventions of capitalization in written compositions (e.g., first and last names, first word of a sentence)

    3.10 Uses conventions of punctuation in written compositions (e.g., uses periods after declarative sentences, uses question marks after interrogative sentences, uses commas in a series of words)

     

    Level II (Grades 3-5)

    3.11 Writes in cursive

     

    3.12 Uses exclamatory and imperative sentences in written compositions

    3.13 Uses pronouns in written compositions (e.g., substitutes pronouns for nouns)

    3.14 Uses nouns in written compositions (e.g., uses plural and singular naming words; forms regular and irregular plurals of nouns; uses common and proper nouns; uses nouns as subjects)

    3.15 Uses verbs in written compositions (e.g., uses a wide variety of action verbs, past and present verb tenses, simple tenses, forms of regular verbs, verbs that agree with the subject)

    3.16 Uses adjectives in written compositions (e.g., indefinite, numerical, predicate adjectives)

    3.17 Uses adverbs in written compositions (e.g., to make comparisons)

    3.18 Uses coordinating conjunctions in written compositions (e.g., links ideas using connecting words)

    3.19 Uses negatives in written compositions (e.g., avoids double negatives)

    3.20 Uses conventions of spelling in written compositions (e.g., spells high-frequency, commonly misspelled words from appropriate grade-level list; uses a dictionary and other resources to spell words; uses initial consonant substitution to spell related words; uses vowel combinations for correct spelling)

    3.21 Uses conventions of capitalization in written compositions (e.g., titles of people; proper nouns [names of towns, cities, counties, and states; days of the week; months of the year; names of streets; names of countries; holidays]; first word of direct quotations; heading, salutation, and closing of a letter)

    3.22 Uses conventions of punctuation in written compositions (e.g., uses periods after imperative sentences and in initials, abbreviations, and titles before names; uses commas in dates and addresses and after greetings and closings in a letter; uses apostrophes in contractions and possessive nouns; uses quotation marks around titles and with direct quotations; uses a colon between hour and minutes)

    4.0 Gathers and uses information for research purposes

     

    Level I (Grades K-2)

    4.1 Generates questions about topics of personal interest

    4.2 Uses books to gather information for research topics (e.g., uses table of contents, examines pictures and charts)

     

    Level II (Grades 3-5)

    4.3 Uses a variety of strategies to identify topics to investigate (e.g., brainstorms, lists questions, uses idea webs)

    4.4 Uses encyclopedias to gather information for research topics

    4.5 Uses dictionaries to gather information for research topics

    4.6 Uses key words, indexes, cross-references, and letters on volumes to find information for research topics

    4.7 Uses multiple representations of information (e.g., maps, charts, photos) to find information for research topics

    4.8 Uses graphic organizers to gather and record information for research topics (e.g., notes, charts, graphs)

    4.9 Compiles information into written reports or summaries

    5.0 Demonstrates competence in the general skills and strategies of the reading process

     

    Level I (Grades K-2)

    5.1 Understands that print conveys meaning

    5.2 Understands how print is organized and read (e.g., identifies front and back covers, title page, and author; follows words from left to right and from top to bottom; recognizes the significance of spaces between words)

    5.3 Creates mental images from pictures and print

    5.4 Uses picture clues and picture captions to aid comprehension and to make predictions about content

    5.5 Decodes unknown words using basic elements of phonetic analysis (e.g., common letter/sound relationships) and structural analysis (e.g., syllables, basic prefixes, suffixes, root words)

    5.6 Uses a picture dictionary to determine word meaning

    5.7 Uses self-correction strategies (e.g., searches for cues, identifies miscues, rereads)

    5.8 Reads aloud familiar stories, poems, and passages with attention to rhythm, flow, and meter

     

    Level II (Grades 3-5)

    5.9 Previews text (e.g., skims material; uses pictures, textual clues, and text format)

    5.10 Establishes a purpose for reading

    5.11 Represents concrete information (e.g., persons, places, things, events) as explicit mental pictures

    5.12 Makes, confirms, and revises simple predictions about what will be found in a text

    5.13 Decodes words not recognized immediately by using phonetic and structural analysis techniques, the syntactic structure in which the word appears, and the semantic context surrounding the word

    5.14 Decodes unknown words using a variety of context clues (e.g., draws on earlier reading, reads ahead)

    5.15 Determines the meaning of unknown words using a glossary, dictionary, and thesaurus

    5.16 Monitors own reading strategies and makes modifications as needed (e.g., recognizes when he or she is confused by a section of text, questions whether the text makes sense)

    5.17 Adjusts speed of reading to suit purpose and difficulty of the material

    5.18 Identifies the author's purpose (e.g., to persuade, to inform)

    6.0 Demonstrates competence in the general skills and strategies for reading a variety of literary texts

     

    Level I (Grades K-2)

    6.1 Applies reading skills and strategies to a variety of familiar literary passages and texts (e.g., fairy tales, folktales, fiction, nonfiction, legends, fables, myths, poems, picture books, predictable books)

    6.2 Identifies favorite books and stories

    6.3 Identifies setting, main characters, main events, and problems in stories

    6.4 Makes simple inferences regarding the order of events and possible outcomes

    6.5 Identifies the main ideas or theme of a story

    6.6 Relates stories to personal experiences

     

    Level II (Grades 3-5)

    6.7 Applies reading skills and strategies to a variety of literary passages and texts (e.g., fairy tales, folktales, fiction, nonfiction, myths, poems, fables, fantasies, historical fiction, biographies, autobiographies)

    6.8 Knows the defining characteristics of a variety of literary forms and genres (e.g., fairy tales, folktales, fiction, nonfiction, myths, poems, fables, fantasies, historical fiction, biographies, autobiographies)

    6.9 Selects reading material based on personal criteria (e.g., personal interest, knowledge of authors and genres, text difficulty, recommendations of others)

    6.10 Understands the basic concept of plot

    6.11 Identifies similarities and differences among literary works in terms of settings, characters, and events

    6.12 Makes inferences regarding the qualities and motives of characters and the consequences of their actions

    6.13 Understands simple dialogues and how they relate to a story

    6.14 Identifies recurring themes across literary works

    6.15 Makes connections between characters or simple events in a literary work and people or events in his or her own life

    6.16 Shares responses to literature with peers

    7.0 Demonstrates competence in the general skills and strategies for reading a variety of informational texts

     

    Level I (Grades K-2)

    7.1 Applies reading skills and strategies to a variety of informational books

    7.2 Understands the main idea of simple expository information

    7.3 Summarizes information found in texts (e.g., retells in own words)

    7.4 Relates new information to prior knowledge and experience

     

    Level II (Grades 3-5)

    7.5 Applies reading skills and strategies to a variety of informational texts (e.g., textbooks, biographical sketches, letters, diaries, directions, procedures, magazines)

    7.6 Knows the defining characteristics of a variety of informational texts (e.g., textbooks, biographical sketches, letters, diaries, directions, procedures, magazines)

    7.7 Uses text organizers (e.g., headings, topic and summary sentences, graphic features) to determine the main ideas and to locate information in a text

    7.8 Identifies and uses the various parts of a book (index, table of contents, glossary, appendix) to locate information

    7.9 Summarizes and paraphrases information in texts (e.g., identifies main ideas and supporting details)

    7.10 Uses prior knowledge and experience to understand and respond to new information

    7.11 Identifies the author's viewpoint in an informational text

    8.0 Demonstrates competence in speaking and listening as tools for learning

     

    Level I (Grades K-2)

    8.1 Recognizes the characteristic sounds and rhythms of language

    8.2 Makes contributions in class and group discussions (e.g., recounts personal experiences, reports on personal knowledge about a topic, initiates conversations)

    8.3 Asks and responds to questions

    8.4 Follows rules of conversation (e.g., takes turns, raises hand to speak, stays on topic, focuses attention on speaker)

    8.5 Uses different voice level, phrasing, and intonation for different situations

    8.6 Listens and responds to oral directions

    8.7 Listens to and recites familiar stories, poems, and rhymes with patterns

    8.8 Listens and responds to a variety of media (e.g., books, audiotapes, videos)

    8.9 Identifies differences between language used at home and language used in school

     

    Level II (Grades 3-5)

    8.10 Contributes to group discussions

    8.11 Asks questions in class (e.g., when he or she is confused, to seek others' opinions and comments)

    8.12 Responds to questions and comments (e.g., gives reasons in support of opinions)

    8.13 Listens to classmates and adults (e.g., does not interrupt, faces the speaker, asks questions, paraphrases to confirm understanding, gives feedback)

    8.14 Makes some effort to have a clear main point when speaking to others

    8.15 Reads compositions to the class

    8.16 Makes eye contact while giving oral presentations

    8.17 Organizes ideas for oral presentations (e.g., includes content appropriate to the audience, uses notes or other memory aids, summarizes main points)

    8.18 Listens to and identifies persuasive messages (e.g., television commercials, commands and requests, pressure from peers)

    8.19 Identifies nonverbal cues used in conversation

    8.20 Identifies specific ways in which language is used in real-life situations (e.g., buying something from a shopkeeper, requesting something from a parent, arguing with a sibling, talking to a friend)

    Appendix B: Summary of Assessments

    Each entry below lists, in order: Assessment Title; Author; Overall Purpose; Grade; Form of Administration; and Skills and Elements Assessed.

    Alternative Assessment Techniques for Reading and Writing

    Miller (1995)

    Provides alternative methods for evaluating reading and writing abilities

    K, 1, 2, 3

    individual and group

    writing process, conventions, print awareness, phonics, reading, reading strategies, comprehension, listening and speaking, motivation, self-perception, attitude, other

    Alternative Concepts About Print Test

    Bordeaux (unpublished)

    To evaluate the print awareness of low-income students

    K, 1

    individual

    conventions, print awareness, reading

    An Inventory of Classroom Writing Use

    Conrad (teacher; 1993)

    To examine students' progress toward making writing an important part of their classroom lives and to assess whether the student independently initiates use of writing

    1, 2, 3

    individual

    writing process, motivation, metacognition, other

    An Observation Survey of Early Literacy Achievement

    Clay (1998)

    To monitor beginning literacy skills

    K, 1, 2

    individual

    reading, reading strategies, phonics, print awareness, conventions, writing process

    Analytical Reading Inventory

    Wood & Moe (1995)

    To evaluate the processing strategies a reader uses as he or she reads

    K, 1, 2, 3

    individual

    phonics, comprehension, reading, print awareness

    Ann Arbor Public Schools--Reading and Writing Rubric

    Ann Arbor Public Schools (1997)

    To evaluate students' reading, writing, and spelling abilities

    K, 1, 2

    individual

    writing process, conventions, print awareness, phonics, reading, reading strategies, comprehension, listening and speaking, motivation, attitude

    Assessing Literacy With the Learning Record: A Handbook for Teachers, Grades K-6

    Barr, Craig, Fisette, & Syverson (1999)

    To assess learning throughout the year and from one year to the next

    K, 1, 2, 3

    individual

    writing process, conventions, print awareness, reading, reading strategies, comprehension, listening and speaking, motivation, attitude

    Assessment of Syntactic Structure

    Imbens-Bailey, Dingle, & Moughamian (1999)

    To evaluate students' complex syntactic structure knowledge

    K, 1, 2

    individual

    conventions

    Authoring Cycle Profile

    Shanklin (teacher; 1993)

    To assess students' writing process and to guide instruction

    2, 3

    individual and group

    writing process, conventions, metacognition, attitude, other

    Basic Reading Inventory, 7th ed.

    Johns (1997)

    To help teachers gain insights about students' reading behaviors

    K, 1, 2, 3

    individual

    writing process, conventions, print awareness, phonics, reading, reading strategies, comprehension, self-perception, attitude

    Basic Reading Vocabularies

    Harris & Jacobson (1982)

    To evaluate student's knowledge of high-frequency words at their grade level

    K, 1, 2, 3

    individual

    phonics

    Basic Sight Vocabulary

    Dolch (1936)

    To evaluate sight vocabulary

    K, 1, 2, 3

    individual

    phonics

    Beginning Phonic Skills Test (BPST)

    Shefelbine (1996)

    To evaluate students' phonemic awareness

    K, 1, 2, 3

    individual

    phonics

    Book Selection

    Paris & Van Kraayenoord (1998)

    To measure children's awareness and use of strategies for choosing books

    K, 1, 2, 3

    individual

    comprehension, motivation, attitude, other

    Checklist for Ownership of Reading

    Au, Scheu, & Kawakami (1990)

    To evaluate a student's sense of ownership of literacy

    K, 1, 2, 3

    individual

    motivation, self-perception, metacognition, attitude, other

    Classroom Reading Inventory

    Silvaroli (1997)

    To identify a student's reading skills, abilities, or both

    K, 1, 2, 3

    individual

    phonics, comprehension, reading

    Classroom Reading Miscue Assessment

    Denver Coordinators/Consultants Applying Whole Language (1993)

    To help classroom teachers efficiently gather miscue data

    1, 2, 3

    individual

    reading, reading strategies, comprehension

    Comprehension Profiles

    Wood (1988)

    To assess comprehension and students' handling of reading material under conditions that simulate the classroom situation

    1, 2, 3

    individual and/or group

    reading, reading strategies, comprehension, listening and speaking

    Dancing With the Pen: The Learner as a Writer

    Ministry of Education (1992)

    For primary teachers to develop their understanding of how children learn to write and how teachers can facilitate the process

    1, 2, 3

    individual

    writing process, conventions, listening and speaking, attitude

    Early Childhood Literacy Assessment System

    Board of Education of the City of New York & CTB McGraw-Hill (1998)

    A performance assessment designed to measure the literacy development of young children (K-3)

    K, 1, 2, 3

    individual and group

    writing process, conventions, print awareness, phonics, reading, reading strategies, comprehension, other

    Early Literacy Portfolio: South Brunswick Public Schools

    South Brunswick Public Schools (1998)

    To evaluate students' literacy knowledge

    K, 1, 2

    individual

    reading, phonics, comprehension, conventions, writing process, print awareness, reading strategies

    Elementary Literacy Profile: A New York State Pilot Assessment

    NCREST/Cayuga-Onondaga, ETS, the New York State Education Department, & numerous teachers from throughout the state (1997)

    To provide information about students' progress in various aspects of literacy development: reading, writing, speaking, and listening

    1, 2, 3

    individual and/or small group

    print awareness, phonics, reading, reading strategies, comprehension, writing process, conventions, listening and speaking

    Elementary Reading Attitude Survey (ERAS)

    McKenna & Kear (1990)

    To enable teachers to estimate attitude levels efficiently and reliably

    1, 2, 3

    group

    attitude

    Emergent Reading and Writing Evaluations

    Denver Public Schools Collaboration (1993)

    To assess any student who does not yet independently read unfamiliar text, and to gain more specific information about how to focus instruction

    K, 1

    individual and group

    writing process, conventions, print awareness, phonics, reading, reading strategies, comprehension, listening and speaking, attitude, other

    Entrance Assessment

    Duckett (1998)

    To place students in the appropriate group level

    K, 1, 2, 3

    individual

    print awareness, reading, reading strategies, comprehension, conventions

    Evaluation: Whole Language Checklists for Evaluating Your Children

    Sharp (1989)

    Designed to monitor and assess student progress within the framework of a whole language approach

    K, 1, 2, 3

    individual and/or group

    writing process, print awareness, reading, motivation, reading strategies, conventions, comprehension, listening and speaking, self-perception, metacognition, attitude

    Features List

    Gillet & Temple (1990)

    To determine the student's spelling stage

    K, 1, 2, 3

    individual and group

    conventions

    First Grade Screening Instrument

    Hoffman & Hesbol (n.d.)

    To evaluate students' reading and writing abilities

    1

    individual

    phonics, conventions, reading, print awareness

    Fry Instant Word Lists

    Fry (1980)

    To evaluate sight vocabulary

    K, 1, 2, 3

    individual

    phonics

    Guidance in Story Retelling

    Morrow (1986)

    To determine whether frequent practice and guidance in retelling stories can improve children's dictation of original stories, specifically the inclusion of story structural elements and syntactic complexity

    K, 1, 2, 3

    individual

    comprehension

    Ideas for Spelling

    Bolton & Snowball (1993)

    To help teachers implement a balanced spelling program within the context of a total language program

    K, 1, 2, 3

    individual and/or group

    conventions, print awareness, phonics

    Index of Reading Awareness (IRA)

    Jacobs & Paris (1987)

    Used to measure children's understanding of reading comprehension processes

    3

    group

    metacognition

    Informal Reading Inventory: Preprimer to Twelfth Grade (5th ed.)

    Burns & Roe (1999)

    To help teachers discover the levels of reading materials pupils can read both with and without teacher assistance

    1, 2, 3

    individual

    phonics, reading, comprehension

    Informal Reading Readiness Assessment

    Thomson Elementary School (n.d.)

    To evaluate students' reading and writing ability

    K

    individual

    phonics, conventions, print awareness, reading, writing process, other

    Informal Reading-Thinking Inventory (IR-TI)

    Manzo, Manzo, & McKenna (1995)

    To discover more about students' reading and language development

    1, 2, 3

    individual

    phonics, comprehension, reading, motivation

    Interactive Reading Assessment System (IRAS)

    Calfee & Calfee (1981)

    To determine whether precisely directed instruction is necessary to ensure further growth in reading ability

    1, 2, 3

    individual

    phonics, reading, conventions, comprehension, metacognition

    Invitations: Changing as Teachers and Learners K-12

    Routman (1994)

    To evaluate students' reading and writing abilities

    K, 1, 2, 3

    individual

    writing process, conventions, reading, reading strategies, comprehension, listening and speaking, motivation, other, metacognition, attitude

    Klesius-Homan Phonic Word Analysis Test

    Klesius & Homan (1980)

    To evaluate word analysis

    K, 1

    individual

    phonics

    Learning to Write: A Model for Curriculum and Evaluation (3rd ed.)

    McCaig (1990)

    To evaluate writing abilities

    K, 1, 2, 3

    individual and group

    writing process, conventions, print awareness, phonics, other

    Linguistic Diagnostic

    From Title I teacher/Reading Specialist (1997)

    To evaluate students' reading and writing abilities

    K, 1

    individual

    phonics, conventions, writing process

    Linking Reading Assessment to Instruction (2nd ed.)

    Mariotti & Homan (1997)

    To evaluate literacy skills in elementary students

    K, 1, 2, 3

    individual and group

    writing process, conventions, print awareness, reading, reading strategies, comprehension, listening and speaking, motivation, metacognition, self-perception, attitude, other

    Literacy Assessment for Elementary Grades

    St. Vrain Valley School District (1997)

    To provide diagnostic information to teachers, monitor individual students' growth and achievement, select students for categorical programs, compile school profiles for accountability and goal setting, and evaluate programs and report results

    K, 1, 2, 3

    individual

    reading, phonics, comprehension, listening and speaking, conventions

    Literacy Assessment: A Handbook of Instruments

    Rhodes (1993)

    To evaluate students' reading and writing abilities

    1, 2, 3

    individual and group

    writing process, conventions, reading, reading strategies, comprehension, listening and speaking, motivation, other, metacognition

    Literacy Assessment: MacArthur Foundation Pathways Study

    MacArthur CCDP Follow-Up Study (1998)

    To evaluate literacy development

    K, 1, 3

    individual

    conventions, print awareness, phonics, comprehension, listening and speaking, reading, reading strategies, writing process, other

    Literacy Development Checklist

    Seeds University Elementary School & UCLA (1999)

    To summarize what a teacher knows about a student's literacy development, and to identify children for early intervention

    K, 1

    individual

    writing process, conventions, comprehension, phonics, print awareness, reading, listening and speaking, motivation, metacognition, self-perception, attitude, other

    Literacy: Helping Children Construct Meaning

    Cooper & Au (1997)

    To evaluate a student's meaning construction

    2, 3

    individual

    writing process, comprehension, listening and speaking, motivation, self-perception, metacognition, attitude

    Measuring Growth in Literacy Survey

    Alief Independent School District (1997)

    To provide the teacher with a sampling of important aspects of advancing literacy development

    K, 1, 2, 3

    individual and group

    writing process, conventions, print awareness, phonics, reading, reading strategies, comprehension

    Metacomprehension Strategy Index (MSI)

    Schmitt (1990)

    To evaluate students' knowledge of strategic reading process

    2, 3

    group

    reading, comprehension, other

    Michigan Literacy Progress Profile

    Michigan Department of Education (1998)

    To provide teachers and parents with information about what individual children know and do, as well as to support instruction

    K, 1, 2, 3

    individual and group

    writing process, conventions, print awareness, phonics, reading, reading strategies, comprehension, listening and speaking, motivation, other

    Modified Concepts About Print

    Klesius & Searls (1985)

    To evaluate a child's understanding of print

    K, 1, 2

    individual

    print awareness

    Motivation to Read Profile

    Gambrell, Palmer, Codling, & Mazzoni (1996)

    To provide teachers with an efficient and reliable way to quantitatively and qualitatively assess reading motivation by evaluating students' self-concept as readers and the value they place on reading

    2, 3

    individual and group

    motivation, metacognition, self-perception, attitude

    Multidimensional Fluency Scale

    Zutell & Rasinski (1991)

    To analyze the fluency of reading

    1, 2, 3

    individual

    reading, reading strategies

    North Carolina Grades K-2 Literacy Assessment

    North Carolina State Department of Education (1997)

    To evaluate students' competencies in spelling, writing, and reading

    K, 1, 2, 3

    individual

    writing process, conventions, print awareness, phonics, reading, reading strategies, comprehension, listening and speaking, other, metacognition

    Observation of Reading Behavior

    Davidson (1985)

    To identify elements of early literacy growth

    K, 1, 2, 3

    individual

    writing process, conventions, print awareness, phonics, reading

    Phonological Awareness & Literacy Screening (PALS I & PALS II)

    Invernizzi, Meier, Juel, & Swank (1997)

    To provide specific information about what a young child knows regarding essential literacy components

    K, 1

    individual and/or group

    phonics, writing process, conventions, print awareness, listening and speaking

    Portfolio Assessment and Evaluation in a First Grade Whole Language Classroom

    Ehlerding (1993)

    To develop portfolios as a means of evaluating student progress

    1

    individual

    writing process, conventions, print awareness, phonics, reading, reading strategies, comprehension, listening and speaking, motivation, other, self-perception, metacognition, attitude

    Portfolio Assessment and Evaluation: Developing & Using Portfolios in the K-6 Classroom

    Batzle (1992)

    To develop portfolios as a means of evaluating student progress

    K, 1, 2, 3

    individual

    writing process, conventions, print awareness, phonics, reading, reading strategies, listening and speaking, motivation, other, metacognition, self-perception, attitude

    Practical Aspects of Authentic Assessment: Putting the Pieces Together

    Hill & Ruptic (1994)

    To focus on specific and practical aspects of assessment and evaluation in elementary classrooms

    K, 1, 2, 3

    individual

    writing process, conventions, print awareness, reading, reading strategies, comprehension, listening and speaking, motivation, self-perception, metacognition, attitude, other

    Prereading Plan

    Langer (1981)

    To determine the amount of information a reader has about a specific topic, which contributes to the student's comprehension and learning

    2, 3

    group

    comprehension

    Primary and 2-4 Literacy/Communication Profiles

    Biggam, Herman, & Trubisz (1998)

    To aid in classroom-based linking of assessment and instruction. Profiles are frameworks that help gauge students' literacy development, guide instruction based on students' strengths and needs, and support communication with families

    K, 1, 2, 3

    individual

    writing process, conventions, print awareness, phonics, reading, reading strategies, comprehension, listening and speaking, motivation, other, metacognition, attitude

    Primary Language Arts Portfolio

    Unknown

    To evaluate students' reading and writing abilities

    1

    individual

    writing process, conventions, print awareness, phonics, reading, reading strategies, comprehension, motivation, metacognition, attitude, other

    Primary Performance Tasks

    Kentucky Department of Education (1996)

    To provide models of activities in which all primary students engage to prepare for successful completion of the primary program

    1, 2, 3

    individual and group

    writing process, motivation, self-perception, other

    Qualitative Reading Inventory-II (QRI-II)

    Leslie & Caldwell (1995)

    To estimate students' reading level, group students effectively, and appropriately choose textbooks; to plan intervention instruction

    K, 1, 2, 3

    individual

    phonics, reading, comprehension, reading strategies, attitude, other

    Reading Assessment: Grades K-4, Third Grade Benchmark

    Oregon Department of Education, Office of Assessment and Evaluation (1998)

    To assess whether students meet the third-grade benchmark performance of reading for accuracy, fluency, and comprehension

    K, 1, 2, 3

    individual and group

    reading, comprehension, phonics

    Reading Interview

    Burke (teacher; 1987)

    To tap into students' attitudes about themselves as readers and provide information about students' perceptions of reading and reading instruction

    2, 3

    individual

    self-perception, metacognition, other

    Reading Inventory for the Classroom (3rd Ed.)

    Flynt & Cooter, Jr. (1998)

    To assist teachers in the placement of students with appropriate reading and instructional materials

    1, 2, 3

    individual

    phonics, comprehension, reading, print awareness, reading strategies, listening and speaking

    Reading Skills Inventory

    From Title I teacher/Reading Specialist

    To evaluate a student's reading ability

    K, 1

    individual

    writing process, phonics, reading

    Rubric for Performance Assessment

    From a first grade teacher

    To evaluate students' writing and comprehension abilities

    1

    individual

    writing process, conventions, comprehension, print awareness

    Rubric for Written Work

    Casale (1999)

    To encourage students to show their best work and to evaluate writing ability

    K, 1, 2, 3

    individual

    writing process, conventions

    San Diego Quick Assessment List

    LaPray & Ross (1969)

    To determine a child's appropriate instructional reading level

    K, 1, 2, 3

    individual

    phonics

    Early Reading Assessment

    School District of Philadelphia, Office of Assessment (1998)

    To evaluate and demonstrate students' reading and writing abilities

    K, 1, 2, 3

    individual

    writing process, conventions, print awareness, phonics, reading, reading strategies, comprehension, listening and speaking, other, metacognition

    Scoring Narrative Structure

    Imbens-Bailey (1997)

    To evaluate students' narrative, story retelling, and story generation

    K, 1

    individual

    comprehension, listening and speaking

    South Colonie Central Schools--K-1 Assessment for Language Arts

    Sub-Committee of the K-4 Language Arts Institute Council (1998)

    To provide a method for teachers to monitor ongoing progress through systematic observations of young children as they learn to read and write

    K, 1, 2, 3

    individual and/or group

    print awareness, phonics, reading, reading strategies, comprehension, conventions, writing process, metacognition, other

    Southwest Allen County Schools Curriculum-Based Assessment

    Southwest Allen County Schools (1997)

    To evaluate a student's reading and writing ability

    K, 1, 2, 3

    individual

    writing process, conventions, print awareness, reading, reading strategies, phonics, comprehension, listening and speaking, motivation, other

    Story Construction From a Picture Book

    Van Kraayenoord & Paris (1996)

    To determine what meaning children can construct from a narrative

    K, 1

    individual

    print awareness, reading, reading strategies, comprehension, listening and speaking, motivation, other

    Story Frame

    Blount (1991)

    To assess story grammar

    1, 2, 3

    group

    comprehension, motivation, other

    Success for All, Roots and Wings

    Johns Hopkins University (1998)

    To determine a student's current instructional level in reading

    1

    individual and group

    conventions, phonics, reading, comprehension

    Teaching Kids to Spell

    Gentry & Gillet (1993)

    To help teachers understand their students' level of spelling, which will assist them in developing a spelling program for all students

    K, 1, 2

    group

    conventions

    Test of Auditory Analysis Skills (TAAS)

    Rosner (1975)

    To evaluate a student's phonetic knowledge

    K, 1, 2, 3

    individual

    phonics

    Test of Phonemic Awareness

    Stahl & Murray (1993)

    To evaluate phonemic awareness

    K, 1, 2

    individual

    phonics

    Texas Primary Reading Inventory

    Texas Education Agency (1997)

    To provide teachers of kindergarten, grade 1, and grade 2 students with an informal means of observing and recording students' progress

    K, 1, 2

    individual

    print awareness, phonics, reading, comprehension

    The Names Test

    Cunningham (1990)

    To evaluate students' abilities to pronounce proper names

    K, 1, 2, 3

    individual

    phonics

    The Peterborough Group

    Lessard (n.d.)

    To evaluate the student's reading and writing ability

    K, 1, 2, 3

    individual

    writing process, conventions, print awareness, phonics, reading, reading strategies, listening and speaking, motivation, self-perception, other

    Think Alouds: Assessing Comprehension

    Wade (1990)

    To assess reading comprehension strategies

    2, 3

    individual

    phonics, reading strategies, comprehension

    Think-Along Passage (TAP)

    Paris (1991)

    To assess strategic reading and metacognition

    K, 1, 2

    individual and group

    comprehension, metacognition

    Work Sample Interviews

    Van Kraayenoord & Paris (1997)

    To elicit students' metacognitive analyses of their own learning and accomplishments

    3

    individual

    metacognition

    Work Sampling System

    Meisels, Jablon, Marsden, Dichtelmiller, & Dorfman (1994)

    To document and assess children's skills, knowledge, behavior, and accomplishments across a wide variety of curriculum areas

    K, 1, 2, 3

    individual

    writing process, conventions, print awareness, phonics, reading, reading strategies, comprehension, listening and speaking, motivation, attitude, other

    Writing Checklist

    O'Connor Elementary Magnet School

    To evaluate students' writing ability

    K, 1, 2, 3

    individual

    writing process, conventions, print awareness, reading, reading strategies, listening and speaking, other

    Yopp Singer Test

    Yopp (1995)

    To evaluate students' phonemic awareness

    K

    individual

    phonics


    Appendix C

     

    Number of Specific Skills or Elements by Assessment

    Assessment Title

    Number of Specific Skills or Elements

    Alternative Assessment Techniques for Reading and Writing

    75

    Alternative Concepts About Print Test

    10

    An Inventory of Classroom Writing Use

    8

    An Observation Survey of Early Literacy Achievement

    15

    Analytical Reading Inventory

    12

    Ann Arbor Public Schools--Reading and Writing Rubric

    41

    Assessing Literacy With the Learning Record

    39

    Assessment of Syntactic Structure

    1

    Authoring Cycle Profile

    17

    Basic Reading Inventory, 7th Ed.

    37

    Basic Reading Vocabulary

    1

    Basic Sight Vocabularies

    1

    Beginning Phonic Skills Test (BPST)

    2

    Book Selection

    6

    Checklist for Ownership of Reading

    8

    Classroom Reading Inventory

    8

    Classroom Reading Miscue Assessment

    11

    Comprehension Profiles

    10

    Dancing With the Pen: The Learner as a Writer

    11

    Early Childhood Literacy Assessment System

    28

    Early Literacy Portfolio: South Brunswick Public Schools

    23

    Elementary Literacy Profile: A New York State Pilot Assessment

    54

    Elementary Reading Attitude Survey (ERAS)

    3

    Emergent Reading and Writing Evaluations

    28

    Entrance Assessment

    8

    Evaluation: Whole Language Checklists for Evaluating Your Children

    39

    Features List

    1

    First Grade Screening Instrument

    6

    Fry Instant Word Lists

    1

    Guidance in Story Retelling

    4

    Ideas for Spelling

    7

    Index of Reading Awareness (IRA)

    6

    Informal Reading Inventory: Preprimer to Twelfth Grade (5th ed.)

    14

    Informal Reading Readiness Assessment

    10

    Informal Reading-Thinking Inventory (IR-TI)

    10

    Interactive Reading Assessment System (IRAS)

    9

    Invitations: Changing as Teachers and Learners, K-12

    30

    Klesius-Homan Phonic Word-Analysis Test

    2

    Learning to Write: A Model for Curriculum and Evaluation (3rd ed.)

    20

    Linguistic Diagnostic

    6

    Linking Reading Assessment to Instruction (2nd ed.)

    41

    Literacy Assessment for Elementary Grades

    18

    Literacy Assessment: A Handbook of Instruments

    35

    Literacy Assessment: MacArthur Foundation Pathways Study

    17

    Literacy Development Checklist

    26

    Literacy: Helping Children Construct Meaning

    12

    Measuring Growth in Literacy Survey

    24

    Metacomprehension Strategy Index (MSI)

    6

    Michigan Literacy Progress Profile

    49

    Modified Concepts About Print

    5

    Motivation to Read Profile

    15

    Multidimensional Fluency Scale

    3

    North Carolina Grades K-2 Literacy Assessment

    44

    Observation of Reading Behavior

    13

    Phonological Awareness and Literacy Screening (PALS I & PALS II)

    9

    Portfolio Assessment and Evaluation in a First Grade Whole Language Classroom

    57

    Portfolio Assessment and Evaluation: Developing and Using Portfolios in the K-6 Classroom

    55

    Practical Aspects of Authentic Assessment: Putting the Pieces Together

    51

    Prereading Plan

    1

    Primary and 2-4 Literacy/Communication Profiles

    67

    Primary Language Arts Portfolio

    52

    Primary Performance Tasks

    10

    Qualitative Reading Inventory-II (QRI-II)

    11

    Reading Assessment: Grades K-4, Third Grade Benchmark

    10

    Reading Interview

    6

    Reading Inventory for the Classroom (3rd Ed.)

    20

    Reading Skills Inventory

    5

    Rubric for Performance Assessment

    8

    Rubric for Written Work

    10

    San Diego Quick Assessment List

    1

    School District of Philadelphia Balanced Early Literacy Draft Assessment

    51

    Scoring Narrative Structure

    4

    South Colonie Central Schools--K-1 Assessment for Language Arts

    31

    Southwest Allen County Schools Curriculum-Based Assessment

    67

    Story Construction From a Picture Book

    14

    Story Frame

    6

    Success for All, Roots and Wings

    6

    Teaching Kids to Spell

    1

    Test of Auditory Analysis Skills (TAAS)

    1

    Test of Phonemic Awareness

    2

    Texas Primary Reading Inventory

    20

    The Names Test

    1

    The Peterborough Group

    29

    Think Alouds: Assessing Comprehension

    9

    Think-Along Passage (TAP)

    6

    Work Sample Interviews

    7

    Work Sampling System

    47

    Writing Checklist

    33

    Yopp Singer Test

    1

     

    Appendix D

    Skills or Elements Included in the Assessments

     

    Skills or Elements Related to McREL Standards (see Coding Manual for relationship to standards)

    Category

    Skill or Element

    Comprehension

    Comments on Literary Aspects of the Text

    Comprehension

    Connect Universally Shared Experiences With Text

    Comprehension

    Distinguishes Fantasy From Realistic Texts

    Comprehension

    Inferences

    Comprehension

    Prediction Strategies

    Comprehension

    Provides Supporting Details

    Comprehension

    Reference To Evidence Presented In Text

    Comprehension

    Retelling

    Comprehension

    Summarizes Main Ideas and Points

    Conventions

    Capitalization

    Conventions

    Grammatically Correct Sentences

    Conventions

    Handwriting

    Conventions

    Linguistic Organization

    Conventions

    Paragraphs

    Conventions

    Punctuation Marks

    Conventions

    Spelling

    Conventions

    Uses Upper- and Lower-Case Letters in Writing

    Listening and Speaking

    Communicates Effectively

    Listening and Speaking

    Figurative Language

    Listening and Speaking

    Language Production

    Listening and Speaking

    Listens Attentively

    Listening and Speaking

    Oral Directions

    Listening and Speaking

    Participates in Group Discussion

    Listening and Speaking

    Questions

    Listening and Speaking

    Story Telling/Retelling

    Metacognition

    Self-Review

    Motivation

    Reading Preferences

    Motivation

    Response to Literature

    Other

    Note-Taking

    Other

    Reference Skills

    Other

    Skimming

    Other

    Topic Knowledge

    Aspects of Word Recog.

    Decoding Words

    Aspects of Word Recog.

    Identification of Beginning Sounds

    Aspects of Word Recog.

    Phonemic Awareness

    Print Awareness

    Directionality

    Print Awareness

    Identification of Parts of a Book

    Print Awareness

    Letter and Word Order

    Print Awareness

    Understands That Print Conveys Meaning

    Print Awareness

    Word Boundaries

    Reading

    Fluency

    Reading

    Reading Flexibility

    Reading

    Texts Student Can Read

    Reading Strategies

    Monitoring Own Reading Strategies

    Reading Strategies

    Self-Correction

    Reading Strategies

    Using Pictures and Story Line for Predicting Context and Words

    Reading Strategies

    Using Print for Predicting Meaning of the Text

    Writing Process

    Types of Compositions

    Writing Process

    Writing Attends to Audience

    Writing Process

    Writing Contains a Purpose

    Writing Process

    Writing Contains Description and Details

    Writing Process

    Writing Has Evidence of Beginning, Middle, and End

    Writing Process

    Writing Is Logical and Sequential

    Writing Process

    Writing Process

     

    Motivation, Self-Perception, Metacognition, and Attitude Skills That Are Not Representative of the McREL Standards

    Category

    Skill or Element

    Attitude

    Attitudes Toward Other Literacy Activities

    Attitude

    Attitudes Toward Reading

    Attitude

    Attitudes Toward Reading Behavior

    Attitude

    Attitudes Toward Writing

    Attitude

    Other--Attitude

    Metacognition

    Monitoring How Student Reads

    Metacognition

    Other

    Metacognition

    Personal Progress

    Metacognition

    Planning How to Read

    Metacognition

    Pride

    Metacognition

    Reading-Related Behavior

    Metacognition

    Self-Assessment in Non-Language Arts Domain

    Metacognition

    Sharing With Others

    Metacognition

    Strategy-Execution for How to Read

    Metacognition

    Teacher Feedback

    Metacognition

    Writing-Related Behavior

    Motivation

    Book Referral

    Motivation

    Current Reading Practices

    Motivation

    Family Support and Prior Experience

    Motivation

    Other--Motivation

    Motivation

    Student Reads for Own Purposes

    Motivation

    Time Spent

    Self-Perception

    Characteristics of a Good Reader

    Self-Perception

    Learning and Understanding

    Self-Perception

    Others' Opinions

    Self-Perception

    Reads Independently

    Self-Perception

    Writes Independently

     

    Other Skills or Elements That Are Not Related to McREL Standards and That Are Not Part of the Motivation, Self-Perception, Metacognition, and Attitude Skills

    Category

    Skill or Element

    Aspects of Word Recog.

    Letter Identification

    Aspects of Word Recog.

    Manipulation of Sounds

    Aspects of Word Recog.

    Production of Rhyming Words

    Aspects of Word Recog.

    Sound-Symbol Correspondence

    Comprehension

    Drawing Conclusions

    Comprehension

    Identify Cause-Effect Relationships

    Comprehension

    Literal Comprehension

    Comprehension

    Literary Analysis

    Comprehension

    Sequence of Story's Events

    Comprehension

    Wider Meaning

    Conventions

    Directional Principles in Writing

    Conventions

    Uses Complex Word Structures

    Conventions

    Writes Own Name

    Listening and Speaking

    Asks for Clarification

    Listening and Speaking

    Holds Attention of Others

    Listening and Speaking

    Others' Perspective

    Listening and Speaking

    Responses Make Connections to the Situation

    Listening and Speaking

    Self-Corrects When Speaking

    Listening and Speaking

    Various Types of Communication

    Other

    Color Identification

    Other

    Fact vs. Opinion

    Other

    Familiarity With Texts

    Other

    Instructions

    Other

    Other

    Other

    Presentations

    Other

    Similarities and Differences

    Other

    Synonyms and Antonyms

    Other

    Text Comparison

    Other

    Use of Text

    Print Awareness

    Concept of Letter or Word

    Print Awareness

    Labels Pictures

    Print Awareness

    Sense of Story

    Print Awareness

    Understands Punctuation Marks

    Print Awareness

    Understands Upper- and Lower-Case Letters

    Reading

    Book Topic

    Reading

    Identifies Own Name

    Reading

    Pretend Reading

    Reading

    Reading Accuracy

    Reading

    Reads as if Passage is Meaningful

    Reading

    Use of Book Language

    Reading

    Voice-To-Print Match

    Reading Strategies

    Locating Answers

    Reading Strategies

    Way of Reading

    Writing Process

    Illustrations are Representative of the Story

    Writing Process

    Message Quality

    Writing Process

    Use of Formal and/or Literary Language

    Writing Process

    Uses Illustrations to Express Ideas

    Writing Process

    Uses Lively and Descriptive Language

    Writing Process

    Vocabulary Usage

    Writing Process

    Writing Behaviors

    Writing Process

    Writing Conveys a Sense of Story

    Writing Process

    Writing Is Easy to Understand and Follow

     

    Appendix E: Two Sample Assessments

    We provide a narrative description of two measures; at the end, we attach the output of each assessment.


    Guidance in Story Retelling

    The purpose of the Guidance in Story Retelling (GSR) is to determine whether students' dictation of stories improves with frequent practice and guidance. This measure is recommended by reading specialists who use Morrow's guided retelling to evaluate their students' comprehension. The measure is available in English, and although Morrow's research was conducted with kindergarten students, reading teachers use her measure with students across the K-3 grade range. Morrow does not prescribe how often the measure should be administered, leaving this to the discretion of the teacher. The time needed to administer the measure depends on how long it takes students to complete their recall of the story.

    The actual measure is printed on one page with 12 general questions that test students' memory of the different elements of a story. The questions assess students' recall of four specific elements: the sequencing of the story's events and the abilities to summarize main ideas, provide supporting details, and draw conclusions. Morrow uses an on-demand assessment methodology, with the teacher writing down students' oral responses. The stimulus used for evaluating students' comprehension is both auditory and visual: a storybook read aloud by the teacher. Students respond by orally recalling the events of the story. Once the students complete their retelling, their responses are evaluated against the 12 items. Students receive one point for each correct response, half a point for conveying the basic idea of the story, and no points for irrelevant information. The points are summed to give the student a single score, with a maximum of 12.
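
    As an illustration only, the point rule just described reduces to a simple sum. The sketch below is our own and is not part of Morrow's measure; the rating labels are invented shorthand for the three response types.

        # A minimal sketch, assuming each of the 12 retelling items has been
        # rated "correct", "partial" (basic idea only), or "irrelevant" --
        # invented labels, not Morrow's terminology.

        GSR_POINTS = {"correct": 1.0, "partial": 0.5, "irrelevant": 0.0}

        def score_gsr(item_ratings):
            """Sum GSR points over the 12 retelling items (maximum score: 12)."""
            if len(item_ratings) != 12:
                raise ValueError("The GSR uses exactly 12 items")
            return sum(GSR_POINTS[rating] for rating in item_ratings)

        # Example: 9 correct, 2 partial, and 1 irrelevant response -> 10.0
        print(score_gsr(["correct"] * 9 + ["partial"] * 2 + ["irrelevant"]))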

    Literacy Assessment for Elementary Grades

    The purpose of the Literacy Assessment for Elementary Grades (LAEG) is to provide teachers with a comprehensive measure for evaluating their students' literacy skills. This information is then used for instruction, placement, and program evaluation. The assessment is available in both English and Spanish and can be used with students in K-3. The measure is administered in the fall, with an option for teachers to administer it again in the spring. The amount of time required for administration is not indicated or suggested by the authors. The LAEG is divided by grade level. Each level contains focused individual measures for evaluating a series of related skills, such as a phonics test and a word-identification test. Most of these focused tests are consistent across grades; however, some are particular to one or two grade levels, such as letter identification for kindergarten and the phonics test for kindergarten and first grade. The LAEG includes two additional components drawn from other authors and publishers, specifically Marie Clay's Concepts About Print and the Houghton Mifflin Baseline Test.

    The LAEG assesses students' mechanical conventions, phonics, reading, comprehension, and listening and speaking abilities, focusing on 18 specific skills. It uses both observational and on-demand methodologies for assessing students' literacy performance. Teachers are asked to record student responses using checklists, running records, oral-directed forms, oral open-ended forms, dictation, or an informal reading inventory.

    The stimuli used to assess the 18 skills are presented auditorily, visually, or both. The unit in which a stimulus is presented may be a grapheme, letter, word, story, or story with related questions. The number of items ranges from 0 to 52, depending on the focused test, with most skills evaluated using a story or a passage. Students respond to the stimuli by identifying, producing, or recalling responses. The response a student gives is a letter, word, sound, or some other type of verbal response, supplied either orally or in writing. Three forms of scoring are used throughout the measure for calculating students' total score: (a) the response is scored as correct or incorrect, (b) on the basis of the response the student passes or fails that section of the test, or (c) the responses are scored using a rubric.
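
    As with the GSR, the scoring logic can be made concrete in a few lines. The sketch below is a hypothetical illustration of the three scoring forms; the answer key, cut-off value, and rubric levels are invented for the example and are not drawn from the LAEG materials.

        # Hypothetical illustration of the LAEG's three scoring forms;
        # every concrete value here (key, cut-off, rubric levels) is invented.

        def score_correct_incorrect(responses, answer_key):
            """Form (a): one point for each item answered correctly."""
            return sum(1 for given, key in zip(responses, answer_key) if given == key)

        def score_pass_fail(raw_score, cutoff):
            """Form (b): the section is passed when the raw score meets a cut-off."""
            return "pass" if raw_score >= cutoff else "fail"

        def score_rubric(performance_level, levels=(1, 2, 3, 4)):
            """Form (c): the teacher assigns one of the rubric's performance levels."""
            if performance_level not in levels:
                raise ValueError(f"Rubric level must be one of {levels}")
            return performance_level

        # Example: 14 of 20 items correct, a section cut-off of 12, rubric level 3
        raw = score_correct_incorrect(["b"] * 14 + ["a"] * 6, ["b"] * 20)
        print(raw, score_pass_fail(raw, cutoff=12), score_rubric(3))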


    1. Matches a McREL Benchmark and Standard

    2. Split-Half

    3. Test-Retest

    4. Kuder-Richardson Formula 20

    5. Provides standardization information: a nationally representative sample of 18,138 students in grades 1-6; 499 schools within 95 school districts in 38 U.S. states; an ethnic distribution close to that of the U.S. population; and percentile ranks for each grade and scale.

    6. Cronbach's Alpha

    7. Provides a description of the item development.

    8. Woodcock Reading Mastery Test-Revised
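
    For readers unfamiliar with the internal-consistency coefficients named in notes 4 and 6, their standard textbook definitions (general formulas, not specific to any assessment reviewed here) are

        \[
        r_{\mathrm{KR20}} = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} p_i q_i}{\sigma_X^{2}}\right),
        \qquad
        \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^{2}}{\sigma_X^{2}}\right),
        \]

    where $k$ is the number of items, $p_i$ the proportion of examinees answering item $i$ correctly, $q_i = 1 - p_i$, $\sigma_i^{2}$ the variance of item $i$, and $\sigma_X^{2}$ the variance of total test scores.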