Parental Reactions To Authentic Performance Assessment[1]

Samuel J. Meisels (University of Michigan)

Yange Xue (University of Michigan)

Donna DiPrima Bickel (University of Pittsburgh)

Julie Nicholson (University of Michigan)

Sally Atkins-Burnett (University of Michigan)

Abstract

This paper examines parents’ reactions to the use of a curriculum-embedded performance assessment––the Work Sampling System (WSS)––with their kindergarten–third grade children. Of the 350 surveys distributed to this predominantly low-income, African American, inner-city population, 246 were completed and returned, yielding a return rate of 70%. The data were aggregated into subscales, and internal reliability measures, descriptive statistics, and two-step hierarchical regressions on each subscale were calculated. In order to examine the direct and indirect effect of parents’ perceptions of teachers’ willingness to use WSS and other factors on parents’ overall satisfaction, structural equation modeling was used. Results demonstrate that parents in this study hold positive attitudes towards WSS and believe that WSS benefits their children. The majority of parents who returned the survey prefer WSS summary reports to conventional report cards and want their children to continue participating in classrooms that use WSS. Parents’ perceptions of teachers’ willingness to use WSS and staff availability to answer parents’ questions about WSS strongly affect parents’ attitudes towards WSS. The structural equation modeling shows that parents’ attitudes towards WSS are unaffected by whether children are high or low achievers in school. In sum, this study demonstrates that when schools using a systematic, curriculum-embedded performance assessment make an effort to keep parents informed about the assessment, and when consistent informal communications between parents and teachers take place, parental reactions to performance assessment can be very positive.


Parental Reactions To Authentic Performance Assessment

Standards-driven, performance-based assessments are one of the cornerstones of contemporary American educational reform, largely because of the belief that these assessments can bring about improvements in teaching and learning and, therefore, in student achievement (Darling-Hammond, 1994; Frederiksen & Collins, 1989; Khattri, Reeve, & Kane, 1998; Khattri & Sweet, 1996; Resnick & Resnick, 1992; Wiggins, 1989a, 1996). Performance assessments provide alternative ways of assessing achievement (Linn, 2000; Meisels, 1996; Resnick & Resnick, 1992; Wiggins, 1989a). They refer to “active student production of evidence of learning” rather than a collection of responses that reflect a “passive selection among preconstructed answers” (Mitchell, 1995, p. 2).

Performance assessments vary in different contexts. In the early grades they may focus on the process of learning, based on observation and documentation over time of learners’ growth in natural contexts (Darling-Hammond & Falk, 1996; Stiggins, 1987). In the upper grades, assessments may continue to include observations of student growth but may focus increasingly on the products of student learning, using a range of projects, performances, and exhibitions to evaluate student achievement according to well-formed criteria that are consistent with actual performance in that field (Wiggins, 1989b).

Because performance assessment represents a departure from conventional assessment methodology, public support must be marshaled for it to begin to achieve its potential impact. Parents, in particular, are critical to the eventual acceptance of performance assessment (Bridge, 1976; Fullan, 1982, 1991; Dodd, 1996; Shepard & Bliem, 1995). As states and school districts have adopted performance assessment, parental reactions have been mixed (Khattri & Sweet, 1996; Konzal & Dodd, 1999; Shepard & Bliem, 1995). Although there may be differences between parents of younger and older students (with the parents of high school students more concerned about how colleges and employers will view performance assessments), comparative research on reactions of parents of children of different ages is unavailable.

Most research on parents’ perspectives suggests that parental opposition to such innovations as performance assessment may be due to preconceptions, lack of information, or misinformation (Johnson, 1991; Konzal & Dodd, 1999). Yet, without support from parents, effective implementation of performance assessment will not occur (Khattri & Sweet, 1996). For example, Littleton, Colorado, revoked its high school-level assessment reforms due to opposition from the community. Parents were not well informed about the reforms, and, according to some parents, the reforms were enacted too quickly. In contrast, Vermont’s assessment reform in elementary through high school has been supported by parents in part because of its investment in a large-scale informational process prior to beginning statewide implementation (Khattri & Sweet, 1996).

Parents tend to generate their beliefs about education based upon their own past experiences. When talking about schooling and assessment, parents generally reflect on their own exposure to testing in school (Robinson, 1996). Many parents question whether their children will learn if they are enrolled in classes that do not resemble their own schooling experiences, including multiple choice examinations and standardized tests (Meyer & Rowan, 1978). Despite the fact that test scores, letter grades, and percentiles provide only narrow interpretations of student achievement, parents generally express greater comfort with these types of indicators than with more complex, alternative measures. In one elementary school-based study, parents indicated that much of their discomfort with alternative assessment was based on the fact that they were accustomed to standardized testing and letter grades on report cards (Diffily, 1994). Although many admitted that the letter grades did not tell them very much about their child’s abilities or progress, they questioned the value of change.

In another study in which elementary school parents were asked about their views regarding alternative assessment, most parents admitted having difficulty expressing their opinions because of lack of general knowledge about assessment (Robinson, 1996). Although these parents agreed that the teacher is responsible for assigning grades, they expressed a desire for a better understanding of how assessment works.

Ensuring high standards is another concern of parents (Robinson, 1996). Parents tend to place their confidence in standardized test scores that measure their child’s progress compared to other children of the same grade level. Performance assessments, however, typically do not yield comparative data; rather, they provide criterion-referenced information about what children know and what students need to work on in order to perform better (Meisels, 1996).

Previous studies of parents’ opinions regarding performance assessments and standardized tests showed a high percentage of respondents in favor of standardized national tests (Elam, Rose, & Gallup, 1992, 1994; Phelps, 1998). The results of the 1992 Gallup Poll indicated that 71% of public school parents favored requiring public schools to use standardized tests to measure the academic achievement of students. In 1994, 73% of the respondents indicated that standardized national exams were either very important or quite important. In his extensive review of public polls and surveys concerning student testing in the past three decades, Phelps (1998) concluded that parents generally wanted more standardized testing in schools. However, in a study examining elementary school parents’ opinions about standardized tests and performance assessments, most parents approved of both types of measures and actually gave stronger approval ratings to performance assessments (Shepard & Bliem, 1995). When parents had an opportunity to examine performance assessment problems, most preferred them. The parents concluded that performance assessments encourage children to think, and are likely to give teachers a better understanding of how children perform in school. These findings suggest that parents’ favorable ratings of standardized tests do not imply a preference for such measures over alternative evaluation methods. Rather, the increased familiarity of parents with conventional tests appears to account for their preferences. Although parents may not be very familiar with performance assessments, the literature shows that they can become comfortable with new types of assessments when schools and teachers mount a well-designed program of public information.

The present study focuses on parents’ reactions to the implementation of a curriculum-embedded performance assessment with young children––the Work Sampling System (or WSS; Meisels, Jablon, Marsden, Dichtelmiller, & Dorfman, 1994). Using data from a survey of parents’ opinions regarding WSS and a direct assessment of student achievement in grades K–3, this paper examines parental information about and satisfaction with this specific performance assessment. Three research questions are addressed in this study:

1.     How do parents respond to the substitution of performance assessments for traditional report cards with letter grades and percentages?

2.     How do parents react to WSS––the performance assessment in use––overall?

3.     Which specific factors affect parents’ overall reactions to WSS?

Answers to these questions will help improve communication with parents about this approach to assessment, thus increasing the potential for a more successful implementation of performance assessment in the future.

Method

This study is part of a larger investigation of the validity of the Work Sampling System and its influences on teacher practices and children’s achievement (see Meisels, Bickel, Nicholson, Xue, & Atkins-Burnett, 1998, in press). The study took place from fall 1996 to spring 1997 in the Pittsburgh (PA) Public Schools. The larger study included information from teachers, parents, and individually administered standardized assessments of children’s achievement. This paper focuses on data gathered from 363 students in 17 classrooms in six WSS schools. Participating teachers were selected based on the following criteria: a minimum of two years using WSS; inclusion within a group of “high implementers,” based on an independent WSS/PPS Portfolio review conducted in the spring of 1996; and review of these teachers’ 1996–97 WSS materials.

Measures

The Work Sampling System. WSS is a curriculum-embedded, instructional performance assessment designed for preschool (3 years old)–Grade 5. It is intended to document and assess children’s knowledge, skills, behavior, and accomplishments on multiple occasions across a wide variety of classroom domains (Meisels et al., in press; Meisels, Dorfman, & Steele, 1995; Meisels, Liaw, Dorfman, & Nelson, 1995). It relies on extensive sampling of children’s academic, personal, and social progress over the course of the school year and provides rich information about student strengths and weaknesses by helping teachers observe children systematically through use of clearly-stated standards and procedures (Meisels, 1993; Meisels et al., 1994). Teachers translate students’ work into the data of assessment by systematically recording and evaluating it (Meisels, 1997).

WSS consists of three complementary elements: (1) developmental guidelines and checklists, (2) collection of children’s work in portfolios, and (3) summary reports that integrate the information from checklists and portfolios (Meisels et al., 1994). These components are all classroom-focused, instructionally relevant, and curriculum-embedded (Baron & Wolf, 1996). Since one of the principal ways that teachers communicate with parents about their children’s WSS performance is through the summary report, a completed summary report is included in Figure 1.

WSS’s three elements focus on instruction and reflect national, state, and local standards, as well as the teacher’s objectives. Instead of providing a mere snapshot of academic skills at a single point in time, WSS consists of a continuous recording and evaluation process that aims to improve the teacher’s instructional practices and student’s learning. Since 1991, this system has been broadly adopted throughout the United States and abroad (Meisels, 1997). More information is available at www.rebusinc.com.

Parent survey. A brief survey was distributed to families of children in WSS classrooms in the spring of 1997. Parents who completed the survey received a small gift certificate to a local supermarket. The purpose of the survey was to learn about family members’ reactions to WSS and their opinions about its implementation. The questionnaire was divided into four parts: eight items in Part I concerned parents’ opinions about the effectiveness of the summary report as a means to monitor and report on children’s academic accomplishments and progress; four items in Part II concerned parents’ opinions about the effectiveness of the portfolio; and eleven items in Part III asked for parents’ overall opinions about WSS. In these three sections, parents responded on a four-point scale ranging from strongly disagree (1) to strongly agree (4), with the exception of the last item in Part III, which had two choices (yes/no). Part IV included five background questions about race, relationship to the child, parental education, and experience with WSS. (The questionnaire items are shown in the Appendix.)

The Woodcock Johnson Psychoeducational Battery—Revised. Information about children’s achievement was obtained from a nationally-normed, standardized assessment, the Woodcock Johnson Psychoeducational Battery—Revised (WJ-R; Woodcock & Johnson, 1989). The WJ-R is an individually administered achievement test that was normed on a random, stratified sample of 6,359 individuals with an age range of 24 months to 95 years. It was administered to the subjects in this study in the fall and spring of the 1996–1997 school year. All examiners received training on administration of the WJ-R and were blind to the objectives of the study. Eight subtests were administered to first–third grade children: letter word identification, passage comprehension, dictation, writing sample, applied problems, calculation, science, and social studies. Five subtests were administered to kindergarten children: letter word identification, dictation, applied problems, science and social studies. For the purposes of this paper, the standard scores for the letter word identification and dictation subtests in the fall were used as proxies for the children’s literacy achievement, as they were the only scores available for the entire age range.

Analyses

Parents’ reactions to WSS. The family survey data were aggregated into four subscales (see Table 1): (1) parents’ reactions to the WSS summary report, (2) parents’ reactions to the WSS portfolio, (3) parents’ overall reactions to WSS, and (4) parents’ thoughts about WSS in relation to report cards (a subset of parents’ overall reactions). Parent ratings for all items within a subscale were combined, and means and standard deviations were computed for each of the subscales. Three types of analyses were completed. First, Cronbach alphas and inter-subscale correlations were computed to test the internal reliability of each subscale. Second, descriptive statistics were computed for all four subscales and all items concerning parents’ opinions about the WSS summary report and portfolio and parents’ overall reactions to WSS. Third, two-step hierarchical regressions were used to examine which of several variables best predicted parents’ overall satisfaction with WSS and parents’ reactions to the summary report and portfolio. All results are presented as standardized regression coefficients in order to highlight the correlations between the predictors and the outcomes, as well as the relative power of each predictor after controlling for other variables (see Cohen & Cohen, 1983).
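For illustration, the following is a minimal sketch (not the authors’ code) of how such a two-step hierarchical regression with standardized coefficients could be set up in Python; the data file and column names (e.g., overall_reaction, staff_availability) are hypothetical stand-ins for the survey variables described above.

```python
# Minimal sketch (not the authors' code) of a two-step hierarchical regression
# with standardized (beta) coefficients; file and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm

def standardized_ols(df, outcome, predictors):
    """Fit OLS on z-scored variables so the coefficients are standardized betas."""
    data = df[[outcome] + predictors].dropna()
    z = (data - data.mean()) / data.std()
    X = sm.add_constant(z[predictors])
    return sm.OLS(z[outcome], X).fit()

df = pd.read_csv("parent_survey.csv")  # hypothetical data file
step1 = ["ethnicity", "education", "is_mother", "n_children_wss",
         "years_summary_report", "attended_conference"]
step2 = step1 + ["staff_availability", "teacher_likes_wss"]

m1 = standardized_ols(df, "overall_reaction", step1)
m2 = standardized_ols(df, "overall_reaction", step2)
print(m2.params)                                             # standardized coefficients
print(m1.rsquared, m2.rsquared, m2.rsquared - m1.rsquared)   # R-squared and R-squared change
```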

Structural equation modeling (SEM). In order to examine the direct and indirect effects of parents’ perceptions of teachers’ willingness to use WSS and other factors on parents’ overall satisfaction, two models were specified to represent these relationships using a structural equation approach (Bollen & Long, 1993; Hoyle, 1995; Joreskog, 1993; Maruyama, 1998). Structural equation modeling typically consists of a measurement model and a structural model. It allows researchers to use latent variables (i.e., unobserved variables) and incorporate multiple measures as indicators of latent variables. Thus, measurement error can be taken into account in the model, a problem that conventional path analysis cannot address (Schumaker & Lomax, 1996). The analysis in this study includes latent variables that were measured by the items in the parent questionnaire and the subtests of the WJ-R. The AMOS program (Arbuckle, 1997) was used for this analysis. A covariance matrix was analyzed using the maximum likelihood method.

The hypothesized model included the following seven variables:

1)    Parents’ perceptions of the teacher’s willingness to use the WSS (F1_Teacher Willingness): This was measured by the item “My child’s teacher seems to like using this system.” (Var1) in the questionnaire.

2)    Parents’ attendance at WSS parent/teacher conferences (F2_Conference Attendance): This was measured by the item “I have attended at least one parent/teacher conference at my child’s school this year.” (Var2) in the questionnaire.

3)    Staff availability to answer parents’ questions about WSS (F3_Staff Availability): This was measured by the item “The teacher, principal, and other school staff are available and helpful when I have questions about the Work Sampling System.” (Var3) in the questionnaire.

4)    Student achievement (F4_Child Achievement): This latent variable has two indicators: the standard scores of WJ-R letter word identification (Letter word) and dictation (Dictation) in the fall.

5)    Parents’ reactions to the summary report (F5_Summary Report): This latent variable was measured by the subscale of parents’ reactions to the WSS summary report (sursum, 8 items, Cronbach alpha = .91).

6)    Parents’ reactions to the portfolio (F6_Portfolio): This latent variable has four indicators based on four items in the questionnaire concerning parents’ opinions of the portfolio:

The Portfolio helps my child:

Think about improving his/her work (Var4).

Take pride in his/her work (Var5).

Understand the progress he/she is making in school (Var6).

Understand his/her strengths (Var7).

7)    Parents’ overall reactions to WSS (F7_Overall Reactions): This latent variable was measured by the subscale of parents’ overall reactions to WSS (surover; 8 items, Cronbach alpha = .91).

After using listwise deletion to manage the missing data, the sample size for this analysis was reduced to 192. The model was identified by fixing appropriate regression weights. This made possible a unique solution for each parameter in the model.
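The original analysis was run in AMOS. As a rough illustration of how similar measurement and structural relations could be specified with open-source tools, the sketch below assumes the Python semopy package and its lavaan-style syntax; the column names are hypothetical, and the single-indicator constructs (teacher willingness, conference attendance, staff availability, and the two subscale scores) are treated here as observed variables for simplicity rather than as fixed-error latent variables as in the paper.

```python
# Rough re-expression (not the original AMOS model) of the hypothesized relations,
# assuming the semopy package; column names are hypothetical stand-ins.
import pandas as pd
from semopy import Model, calc_stats

model_desc = """
# measurement model
portfolio =~ var4 + var5 + var6 + var7
achievement =~ letter_word + dictation

# structural model (Model A-style paths)
summary_report ~ teacher_willing + conference + staff_avail + achievement
portfolio ~ teacher_willing + conference + staff_avail + achievement
overall ~ teacher_willing + staff_avail + summary_report + portfolio
"""

data = pd.read_csv("parent_sem_cases.csv")  # hypothetical file with the complete cases
model = Model(model_desc)
model.fit(data)                 # maximum likelihood estimation
print(model.inspect())          # parameter estimates
print(calc_stats(model))        # chi-square, GFI, AGFI, CFI, NFI, etc.
```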

For the purpose of evaluating the overall fit of the model, both absolute fit indices and incremental fit indices are reported. The absolute fit indices, GFI (Goodness-of-Fit Index) and AGFI (Adjusted Goodness-of-Fit Index), indicate the relative amount of the observed variances and covariances that are accounted for by the implied model. The incremental fit indices, CFI (Comparative Fit Index) and NFI (Normed Fit Index), measure the proportion of improvement in fit when the target model is compared with a baseline or null model. The generally agreed-upon cutoff for these fit indices is .90 (Bagozzi & Yi, 1988; Bollen & Long, 1993; Hoyle, 1995).
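For reference, the standard definitions of the two incremental indices (not given in the paper itself) compare the target model’s chi-square to that of the null model:

\[
\mathrm{NFI} = \frac{\chi^2_{\text{null}} - \chi^2_{\text{model}}}{\chi^2_{\text{null}}},
\qquad
\mathrm{CFI} = 1 - \frac{\max\!\left(\chi^2_{\text{model}} - df_{\text{model}},\ 0\right)}{\max\!\left(\chi^2_{\text{model}} - df_{\text{model}},\ \chi^2_{\text{null}} - df_{\text{null}},\ 0\right)}
\]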

Missing Data Analysis. Because there can be no missing data in SEM analyses, and not all families returned the survey, the sample size for the SEM analysis was reduced to 192 cases. In order to study the impact of missing data on our conclusions, we compared the missing and non-missing groups on race, gender, SES, grade level, and student performance levels. No differences were found in gender and SES between the missing and non-missing groups. However, the missing group contained significantly higher proportions of African American students (χ² = 17.96, p < .01) and third graders (χ² = 16.73, p < .01). The students in the missing group also had lower mean scores on the WJ-R letter word identification and dictation subtests than the non-missing group (t = -3.84, p < .001). Such differences between the two groups could have an impact on the SEM analysis and consequently condition the generalizability of our results.
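A minimal sketch of the group comparisons reported above (chi-square tests for categorical characteristics and a t-test for achievement scores) might look as follows in Python; the file and column names are hypothetical.

```python
# Sketch of the missing-data comparisons; file and column names are hypothetical.
import pandas as pd
from scipy import stats

students = pd.read_csv("student_data.csv")  # hypothetical file with a survey-return flag

# Chi-square test of race by survey-return status
race_table = pd.crosstab(students["race"], students["returned_survey"])
chi2, p_chi, dof, _ = stats.chi2_contingency(race_table)

# Independent-samples t-test on fall letter word identification scores
returned = students.loc[students["returned_survey"] == 1, "letter_word_fall"].dropna()
missing = students.loc[students["returned_survey"] == 0, "letter_word_fall"].dropna()
t, p_t = stats.ttest_ind(returned, missing)

print(f"chi2 = {chi2:.2f} (p = {p_chi:.3f}); t = {t:.2f} (p = {p_t:.3f})")
```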

Results

Parents’ Reactions to WSS

Of the 350 surveys distributed to families, 246 were completed and returned, yielding a return rate of 70%. The majority of the respondents (79%) were the students’ mothers, and a large proportion of respondents (62%) were African American. Compared to the percentage of African American students in the sample as a whole (70%), this response was representative. Over half of the respondents (59%) had completed at least some college, technical, vocational, or business school beyond high school. The majority (82%) of the students in the survey received free or reduced lunch. About half of the families (48%) had more than one child in classrooms using WSS.

For the purpose of analysis, the parent survey was aggregated into four subscales. The items included in each subscale are shown in Table 1. All parent survey items were answered using a four-point scale: 1 = Strongly Disagree, 2 = Disagree, 3 = Agree, 4 = Strongly Agree. The mean of the items in a particular subscale was computed as the subscale score. Descriptive statistics for the subscales are shown in Table 2. Reliability coefficients (Cronbach alphas) were computed for each subscale and are also reported in Table 2. Reliability coefficients for all four subscales are high, ranging from .87 to .92. Correlations between three of the subscales are moderate to high, ranging from .44 to .67 (see Table 3).
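As an illustration of how the subscale scores and their reliabilities could be computed, the following is a sketch (not the authors’ code); the item column names are hypothetical.

```python
# Sketch of subscale scoring and Cronbach's alpha; item column names are hypothetical.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    items = items.dropna()
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

survey = pd.read_csv("parent_survey.csv")                        # hypothetical data file
summary_items = [f"summary_{i}" for i in range(1, 9)]            # the eight summary-report items
survey["summary_subscale"] = survey[summary_items].mean(axis=1)  # subscale score = mean of items
print(round(cronbach_alpha(survey[summary_items]), 2))
```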

Families were generally very positive about the WSS. The highest ratings were given to the summary report, the element of WSS with which parents are most familiar (see Table 2). Overall, parents’ responses to the summary report and portfolio were very positive, with mean scores of 3 or higher (see Tables 4 and 5).

Most of the respondents indicated that the summary report was helpful to them as parents. A majority agreed or strongly agreed that it helped them understand their children’s strengths, areas in need of assistance, how well their children were meeting the teacher’s expectations for achievement, and how well their children were doing. Parents overwhelmingly agreed or strongly agreed that the WSS portfolio helped their children think about improving their work, take pride in their work, and understand the strengths and progress they were making.

Parents also gave several other aspects of WSS high ratings (see Table 6). For example, 80% of the respondents reported that they have a good understanding of WSS, and 79% of the respondents rated WSS highly in helping them understand more about how their children learn and about their children’s school work. A high proportion (81%) of the families agreed that they know more about how their child learns from this system than they could learn from a conventional report card. Most families perceived their children’s teacher as willing to use WSS (M = 3.17) and as being available to answer parents’ questions about the system (M = 3.12), and 76% of the families agreed that WSS helps their children understand what they are learning.

More than two-thirds of the families (69%) reported that they would like to continue receiving a summary report. When families were asked to rate whether they would prefer WSS to traditional report cards with letter grades, almost two-thirds of the respondents (62%) agreed.

Regression Analysis

Analyses were conducted to examine which of several variables contributed most to parents’ overall reactions to WSS. Eight variables were entered into a hierarchical regression model as possible predictors of families’ overall reactions to WSS. In the first step, six variables were entered: (1) parents’ ethnicity, (2) parents’ education level, (3) parents’ relationship to the child (mother or not), (4) the number of children in the family that were in WSS classrooms during the 1996–1997 school year, (5) the number of years the family had received a summary report, and (6) attendance at a minimum of one parent/teacher conference in which WSS was discussed during the 1996–1997 school year.

The single most significant predictor was attendance at parent/teacher conferences. Parents who had attended at least one parent/teacher conference where WSS was discussed rated the system more favorably overall than families who had never had the opportunity to discuss WSS with their child’s teacher.

Next, two additional variables were entered into the model: (1) parents’ ratings of staff availability to answer their questions about WSS, and (2) parents’ ratings of how much their child’s classroom teacher liked using WSS. These two variables proved to be strong predictors in the regression model, and the effect of conference attendance was partialed out by these two predictors (see Table 7). Families gave higher ratings to WSS when they perceived that their child’s teacher liked using the system and the school staff was helpful and available to answer their questions about WSS.

Similar regression models were conducted to examine the possible predictors of families’ reactions to the WSS summary report and portfolio. In the first step, attending parent/teacher conferences was significant in predicting parents’ opinions of the WSS portfolio, but not their reactions to the summary report. In the second step, parents’ perceptions of teachers’ willingness to use WSS and staff availability to answer questions about WSS best predicted their satisfaction with the summary report and portfolio, whereas attendance at conferences was no longer significant in predicting parents’ ratings of the portfolio (see Table 7). However, parents’ attitudes towards the summary report were somewhat unstable across grades. Those who had received the summary report longest were marginally less positive than those for whom the report was new. The descriptive statistics for the subscale that reports parents’ reactions to the summary report by grade are shown in Table 8. Kindergarten parents showed the most positive attitude towards the summary report, third grade parents were the next most positive, and second grade parents were the least positive. The results of a one-way ANOVA show that only the difference between kindergarten and second grade reached statistical significance (F = 2.68, p < .05).

The regression analyses demonstrate that parent/teacher communication is an important predictor of parental attitude toward WSS. To test whether this type of communication, which tends to occur more frequently in kindergarten, might be confounded with grade, we conducted one-way ANOVAs on parent conference attendance, staff availability, and perceptions of the teacher’s willingness to use WSS. The results show no significant differences across these variables by grade.
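One-way ANOVAs of this kind, comparing a survey measure across grade levels, can be sketched as follows in Python; the column names are hypothetical.

```python
# Sketch of a one-way ANOVA across grade levels; column names are hypothetical.
import pandas as pd
from scipy import stats

survey = pd.read_csv("parent_survey.csv")   # hypothetical data file
groups = [g["summary_subscale"].dropna() for _, g in survey.groupby("grade")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```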

Structural Equation Modeling (SEM)

To examine the direct and indirect effects of parents’ perceptions of teachers’ willingness to use WSS and other factors on parents’ overall satisfaction with WSS, a structural equation model (SEM) was constructed to represent these relationships. Since the results of the regression analysis showed that the demographic variables were not significant predictors of parents’ overall perceptions of WSS when the effects of other predictors were controlled, they were not included in the model. From interviews with the teachers in the study as a whole (Nicholson, 2000), we learned that some teachers in WSS schools believed that parents of low-achieving students may be more favorably disposed towards WSS because, in their view, WSS gives their children opportunities to demonstrate their unique characteristics. In contrast, parents of high achievers may be less positive about this system because they believe it deprives their children of opportunities to achieve high scores and is less likely to motivate their children. Therefore, children’s achievement was included in the model. It was measured by the standard scores of the WJ-R letter word identification and dictation subtests in fall 1996––the only two literacy subtests administered to both kindergartners and first–third graders.

In the initial model (see Figure 2, Model A), two variables were specified to have direct effects on parents’ overall reactions to WSS: parents’ perceptions of teachers’ interest in using WSS and staff availability to answer parents’ questions about WSS. Parents’ attitudes towards the summary report and the portfolio were hypothesized to mediate the effects of teachers’ willingness to use WSS and staff availability on parents’ overall reactions to WSS. Children’s achievement was specified to have an indirect effect on parents’ overall reactions, mediated by parents’ opinions about the summary report and portfolio.

The model was identified by fixing the appropriate regression weights and errors. In order for the model to be specified, we assumed that the three exogenous variables with a single indicator were measured perfectly. The regression weights of these variables were fixed to 1, and the error of each was fixed to zero. The measurement scales of F4_Child Achievement and F6_Portfolio were fixed by setting one of their regression weights to 1. Since the reliability of the summary report subscale is .91 (91% of the variance of SURSUM is explained by F5_Summary Report), the path F5→SURSUM was fixed to .95 (the square root of .91), and err10 was fixed to .09. This means that 9% of the variance of SURSUM is unexplained. The latent variable F7_Overall Reactions, with its single indicator SUROVER, was fixed in the same way.
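These fixed values follow a standard convention for single-indicator latent variables (not spelled out in the paper): with standardized variables, the loading is set to the square root of the scale’s reliability and the error variance to one minus the reliability:

\[
\lambda_{\text{SURSUM}} = \sqrt{.91} \approx .95,
\qquad
\theta_{\text{err10}} = 1 - .91 = .09
\]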

The standardized solutions of model A are shown in Figure 2. The χ² of 134.29 with 34 degrees of freedom (N = 192, p < .001; χ²/df = 3.95) indicates that the model does not fit the data well. Other fit indices shown in Table 9 demonstrate that this initial model is unsatisfactory. All the fit indices (GFI, AGFI, CFI, and NFI) are below the acceptable value of .90. The squared multiple correlation indicates that only 16% of the variance in F6_Portfolio is explained, leaving 84% unexplained by the model. This indicates that a revised form of the model might provide a better fit.

In the revised model B, a two-way causal relationship between the variables F5_Summary Report and F6_Portfolio is added (see Figure 3), based on the modification indices. In order for the model to be as parsimonious as possible, we also eliminated some paths that were not significant. This model is non-recursive because of the reciprocal relationship between these variables. It is identified due to the existence of instrumental variables (Maruyama, 1998). Instruments in the model need to have a direct causal relationship with one of the two variables that have a bi-directional relationship, but not with the other. In this model, F1 and F3 are instruments for identifying paths to F6, because they have modestly significant relationships with F5, but no relationships with F6. The standardized solutions of this model are shown in Figure 3. Selected fit indices are displayed in Table 9.

After adding the two-way relationship between F5 and F6 and deleting non-significant paths, the fit of the model was greatly improved. The χ² with 35 degrees of freedom decreased to 67.07 (N = 192, p < .01, χ²/df = 1.92). Since the chi-square test is sensitive to sample size, and we had a relatively large sample, we used other criteria to evaluate the fit of the model, including χ²/df and other fit indices. As another measure of model fit, a χ²/df of 2 or 3 is generally considered acceptable (Bollen, 1989). In model B, the χ²/df is 1.92, which is acceptable. GFI, CFI, and NFI are all above .90. GFI is .94, which means 94% of the variance and covariance in the observed variables is accounted for by the implied model. CFI is .96. NFI is .93, which means that 93% of the total covariance among observed variables is explained by this model when the null model is used as a baseline. In model B, AGFI increased to .89. The percentage of variance in F6 explained by the model increased to 55%. More than 70% of the variance in F5 and F7 is explained. Model B is clearly superior to model A and provides an acceptable fit.

Based on the critical ratios generated by AMOS, most of the structural paths were significant in model B except the paths from F2 to F6, F4 to F5, and F4 to F6. The reciprocal relationships between parents’ satisfaction with the summary report (F5) and parents’ satisfaction with the portfolio (F6) were significant (standardized, .60 from F5 to F6 and .27 from F6 to F5). The model shows that staff availability to answer parents’ questions about WSS and parents’ perceptions of teachers’ willingness to use WSS positively affected parents’ reactions to the summary report, which in turn positively affected parents’ overall opinions of WSS. The effect of parents’ attitudes towards the portfolio on parents’ overall satisfaction with WSS is primarily mediated by parents’ attitudes towards the summary report. Children’s achievement had no significant relationship either with parents’ reactions to the WSS summary report or with parents’ attitudes towards the WSS portfolio. This means that there were no significant differences in attitudes towards the summary report and the portfolio between the parents of low achievers and high achievers. WSS is welcomed not only by the parents of high-achieving students but by those of low-achieving children as well.

In summary, the result of SEM shows that parents’ perceptions of teachers’ willingness to use WSS and the availability of school staff to answer their questions about WSS had positive effects on parents’ satisfaction with the summary report and parents’ overall ratings of WSS. Parents’ satisfaction with the summary report and portfolio influenced each other. Children’s achievement had no significant relationships with parents’ attitudes toward the WSS summary report and portfolio.

Discussion

This study investigated parents’ reactions to the Work Sampling System through use of a parent questionnaire. Given the relatively high return rate of this survey (70%) and the sample size (N = 246), the results have considerable generalizability to comparable families (low income, primarily African American) in urban school districts. We also believe that these results can have important implications for the implementation of performance assessment in general.

Parents in this study hold positive attitudes towards the WSS summary report and portfolio, and believe that these tools benefit their children. These results suggest that parents appreciate the detailed information they receive from Work Sampling about their children’s performance and progress. Parents’ ratings of other aspects of WSS are also high. Most of them agreed or strongly agreed that WSS helped them know more about how their children learn and about their children’s school work than traditional report cards. They indicated that they understood the concept of WSS and agreed that WSS helped their children improve their learning.

However, there is some variance of opinion among respondents concerning whether they prefer WSS to typical report cards and whether they wish to continue receiving a summary report. Approximately two-thirds of the parents preferred WSS to conventional report cards and wanted to continue receiving the summary report instead of a report card with letter grades; one-third did not. There are several possible explanations for this finding. One possibility is that letter grades provide an opportunity for them to compare their children’s achievement with other children. Parents are very interested in knowing that their children are “keeping up” with children in other schools. Previous studies (Diffily, 1994; Shepard & Bliem, 1995) show that parents who were raised on standardized tests and conventional report cards feel most comfortable with these methodologies. In general, society considers letter grades to be the most acceptable indicators of students’ accomplishments. In addition, since some of the schools in the district did not use WSS, it was difficult for some parents to communicate with friends or relatives at other schools, or to boast about their children’s achievement and participate in community celebrations of students’ achievement. Finally, the school district where this research took place used WSS only through third grade; some parents may have been concerned that their children would have difficulty making the transition to conventional letter grades in the upper elementary years.

The results of hierarchical regressions show that in the first step, where the demographic characteristics of the parents and attendance at parent/teacher conferences were entered into the model, conference attendance was the only significant predictor of parents’ satisfaction with the portfolio and their overall reactions to WSS. However, it was not significant in predicting parents’ ratings of the summary report. In the parent/teacher conferences, students’ portfolios were usually presented to the parents. This provided parents with opportunities to see their children’s work and the progress they made. Therefore, parents who attended the conferences were more likely to rate the portfolio highly and to be favorable to WSS.

The results of hierarchical regressions also show that in the second step, parents’ ratings of the summary report and portfolio and their overall ratings of WSS were best predicted by parents’ perceptions of teachers’ willingness to use WSS and by the school staff’s availability to answer questions about WSS. The more parents perceived that their children’s teachers enjoyed using WSS, and the more available staff were to help them understand WSS, the more positive their ratings. (This will be discussed later in conjunction with the results of the structural equation modeling.) However, conference attendance was no longer significant in predicting parents’ ratings of the portfolio and their overall ratings of WSS in the second step. Further t-tests show that, when compared with those who did not attend parent/teacher conferences, parents who attended the conferences were more likely to perceive that their child’s teacher was willing to adopt the system (t = 2.85, p < .01) and that they could turn to the school staff for help when they had questions about WSS (t = 2.42, p < .05). Thus the effect of conference attendance was explained away by these two variables.

Nevertheless, families’ reactions to the summary report were most positive in kindergarten. This might be because families receiving summary reports for the first time were attracted by the advantages of the new system, as contrasted to traditional report cards, but as their children advanced through the grades, their enthusiasm diminished, although only marginally. As demonstrated in this study, parents of kindergartners gave the summary report the highest rating compared to parents of children in other grades. Although this trend is unstable across the grades (third grade parents were next highest in their ratings), generally, parents in each grade responded to the WSS summary report positively.

The revised structural equation model shows that parents’ perceptions of teachers’ willingness to use WSS and school staff availability to answer their questions about WSS had significant effects on parents’ thoughts about the WSS summary report and parents’ overall ratings of WSS. This suggests that teachers’ willingness to adopt a new form of assessment in the classroom may strongly influence parents’ reactions to the assessment. Moreover, as shown in our teacher interview study (see Nicholson, 2000), parents’ expectations also affect teachers’ classroom assessment practice (Shepard & Bliem, 1995). Unfortunately, not enough teachers participated in this study to allow us to analyze the relationship between teachers’ and parents’ attitudes towards WSS. It would be of practical significance to study such a relationship in future investigations.

The results of the structural equation modeling also suggest that when school staff are available to answer parents’ questions about WSS, parents are more likely to accept this new type of assessment. Studies remind us that parents’ opposition to performance assessment is due largely to lack of information (Johnson, 1991; Khattri & Sweet, 1996) and lack of communication with teachers and schools (Konzal & Dodd, 1999). With the guidance of school staff, parents can see how WSS helps them learn about their own child’s strengths and weaknesses and his/her work and progress, and how WSS summaries can be more informative than typical report cards.

Nevertheless, several barriers may have prevented parents from becoming better educated about WSS. These barriers included, but were not limited to, the following: some schools cancelled conferences because of district budget cuts; parent participation in WSS workshops was low in some schools; illiteracy among some parents made it difficult for them to appreciate the information included on the summary report; and transportation problems prevented some parents from attending conferences. These problems may be less prevalent in middle-class communities where parents have more frequent contact with teachers, higher rates of participation in the schools, more positive personal histories with the schools, higher literacy rates, and fewer transportation challenges.

It was unexpected to find that attending at least one parent/teacher conference did not significantly affect parents’ opinions about the summary report. This may be because the cancellation of some conferences by the district prevented teachers from educating parents about the summary report and addressing their concerns and questions. Such results also suggest that consistent informal communication between parents and teachers might be more effective than formal conference attendance. This is consistent with the finding of positive effects for staff availability to answer parents’ questions about WSS and the positive relationship of conferences to parents’ overall reactions to WSS. The more parents know about WSS, the higher their levels of satisfaction.

We also found that, contrary to the concerns of some teachers in WSS schools, children’s achievement does not have a negative relationship with parents’ acceptance of the new assessment system. Instead, as shown in the structural equation model, there is no significant difference between the reactions of parents of high achievers and low achievers towards WSS. Parents of both high and low achievers reacted similarly to WSS: They all believed that WSS would benefit their children.

The reciprocal relationships between parents’ opinions concerning the summary report and the portfolio in the revised model suggest that parents who perceived the summary report as useful were also likely to think the portfolio would be helpful to their children, and vice versa. The higher parents rated the summary report and portfolio, the more positive their overall reactions to WSS. Moreover, the effect of parents’ ratings of the portfolio on parents’ overall satisfaction was mediated primarily by their opinions of the summary report. Because summary reports are sent home three times per year, parents are most familiar with this element of WSS. Therefore, it is not surprising to find a strong relationship between parents’ opinions of the summary report and their overall reactions to WSS.

In sum, we found that parents’ responses to the WSS summary report and portfolio were very positive; they appreciated the detailed information they received from the WSS summary report and portfolio about their children’s performance and progress. The parents in this study believed that WSS as a whole benefited their children. The majority of them preferred WSS to conventional report cards and wanted to continue receiving a WSS summary report. Parents’ perceptions of teachers’ willingness to use WSS and staff availability to answer parents’ questions about WSS strongly affected parents’ responses to the WSS summary report and portfolio; their effects on parents’ overall satisfaction with WSS were mediated by parents’ reactions towards the two integrated elements of WSS—the summary report and the portfolio. But parents’ reactions towards WSS were not affected by their child’s overall level of achievement. In order to receive support from parents to ensure effective implementation of WSS or of other performance assessments, schools need to make an effort to keep parents informed about those assessments. Consistent informal communication between teachers and parents appears to be a highly effective method of accomplishing this. Because of the problems that the study teachers faced in having frequent informal communication with these families, the study’s results can be seen as a conservative estimate of the potential impact of WSS and as a set of implicit recommendations for implementing performance assessment in general.

Performance assessment has now been used widely in America’s schools for more than a decade. If it is ever to become more generally accepted by parents and policy makers, it is essential that parents’ reactions be taken into account and shaped through positive and informative interactions with teachers and other educators. This study suggests that when such efforts are undertaken, the results may well be worthwhile.


References

Arbuckle, J. L. (1997). AMOS users’ guide: Version 3.6. Chicago: SPSS.

Baron, J. B., & Wolf, D. P. (1996). Performance-based student assessment: Challenges and possibilities. Ninety-fifth yearbook of the National Society for the Study of Education, Part I. Chicago: University of Chicago Press.

Bagozzi, R. P., & Yi, Y. (1988). On the evaluation of structural equation models. Journal of the Academy of Marketing Sciences, 16, 74–94.

Bollen, K. A. (1989). Structural equations with latent variables. New York: Wiley.

Bollen, K. A., & Long, J. S. (1993). Testing structural equation models. Newbury Park, CA: Sage.

Bridge, G. R. (1976). Parent participation in school innovations. Teachers College Record, 77, 366–384.

Cohen, J., & Cohen, P. (1983). Applied multiple regression/correlation analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Erlbaum.

Darling-Hammond, L. (1994). Performance-based assessment and educational equity. Harvard Educational Review, 64(1), 5–30.

Darling-Hammond, L., & Falk, B. (1996). Supporting teaching and learning for all students: Policies for authentic assessment systems. In A. L. Goodwin (Ed.), Assessment for equity and inclusion: Embracing all our children (pp. 77–99). New York: Routledge.

Diffily, D. (1994). What parents think about alternative assessment and narrative reporting: One school’s finding. ERIC ED 381230.

Dodd, A. W. (1996). Involving parents, avoiding gridlock. Educational Leadership, 53, 44–46.

Elam, S. M., Rose, L. C., & Gallup, A. M. (1992). The 24th annual Gallup-Phi Delta Kappan Poll of the public’s attitude toward the public schools. Phi Delta Kappan, 74, 41–53.

Elam, S. M., Rose, L. C., & Gallup, A. M. (1994). The 26th annual Gallup-Phi Delta Kappan Poll of the public’s attitude toward the public schools. Phi Delta Kappan, 76, 41–56.

Frederiksen, J. R., & Collins, A. (1989). A systems approach to educational testing. Educational Researcher, 18 (9), 27–32.

Fullan, M. (1982). The meaning of educational change. New York: Teachers College Press.

Fullan, M. (1991). The new meaning of educational change. New York: Teachers College Press.

Hoyle, R. H. (1995). Structural equation modeling: Concepts, issues, and applications. Thousand Oaks, CA: Sage.

Joreskog, K. G. (1993). Testing structural equation models. In K. A. Bollen & J. S. Long (Eds.), Testing structural equation models (pp. 294–316). Newbury Park, CA: Sage.

Johnson, D. (1991). Parents, students and teachers: A three-way relationship. International Journal of Educational Research, 15, 171–81.

Khattri, N., Reeve, A. L., & Kane, M. B. (1998). Principles and practices of performance assessment. Hillsdale, NJ: Erlbaum.

Khattri, N., & Sweet, D. (1996). Assessment reform: Promises and challenges. In M. Kane & R. Mitchell (Eds.), Implementing performance assessment: Promises, problems, and challenges (pp.1–21). Hillsdale, NJ: Erlbaum.

Konzal, J. L., & Dodd, A. W. (1999). Implementing higher standards and alternative assessment and grading policies in high schools: What do educators need to know about what parents think? Paper presented at the AERA annual conference. Montreal, Canada.

Linn, R. L. (2000). Assessments and accountability. Educational Researcher, 29(2), 4–16.

Maruyama, G. M. (1998). Basics of structural equation modeling. Thousand Oaks, CA: Sage.

Meisels, S. J. (1993). Remaking classroom assessment with the Work Sampling System. Young Children, 48(5), 34–40.

Meisels, S. J. (1996). Performance in context: Assessing children's achievement at the outset of school. In A. J. Sameroff & M. M. Haith (Eds.), The five to seven year shift: The age of reason and responsibility (pp. 410–431). Chicago: University of Chicago Press.

Meisels, S. J. (1997). Using Work Sampling in authentic assessments. Educational Leadership, 54 (4), 60–65.

Meisels, S. J., Bickel, D. P., Nicholson, J., Xue, Y., & Atkins-Burnett, S. (1998). Pittsburgh Work Sampling Achievement Validation Study. Ann Arbor, MI: University of Michigan.

Meisels, S. J., Bickel, D. P., Nicholson, J., Xue, Y., & Atkins-Burnett, S. (in press). Trusting teachers’ judgments: A validity study of a curriculum-embedded performance assessment in Kindergarten–Grade 3. American Educational Research Journal.

Meisels, S. J., Dorfman, A., & Steele, D. (1995). Equity and excellence in group-administered and performance-based assessments. In M. T. Nettles & A. L. Nettles (Eds.), Equity in educational assessment and testing (pp. 195–211). Boston: Kluwer Academic.

Meisels, S. J., Jablon, J. R., Marsden, D. B., Dichtelmiller, M. L., & Dorfman, A. B. (1994). The Work Sampling System. Ann Arbor, MI: Rebus, Inc.

Meisels, S. J., Liaw, F., Dorfman, A., & Nelson, R. (1995). The Work Sampling System: Reliability and validity of a performance assessment for young children. Early Childhood Research Quarterly, 10, 277–296.

Meyer, J., & Rowan, B. (1978). The structure of educational organizations. In J. Meyer (Ed.), Environments and organizations (pp. 78–109). San Francisco: Jossey-Bass.

Mitchell, R. (1995). The promise of performance assessment: How to use backlash constructively. Paper presented at the AERA annual conference. San Francisco, CA.

Nicholson, J. M. (2000). Examining evidence of the consequential aspects of validity in a curriculum-embedded performance assessment. (Doctoral dissertation, University of Michigan, Ann Arbor).

Phelps, R. P. (1998). The demand for standardized student testing. Educational Measurement: Issues and Practice, 17(3), 5–23.

Resnick, L. B., & Resnick, D. P. (1992). Assessing the thinking curriculum: New tools for educational reform. In B. R. Gifford & M. C. O’Connor (Eds.), Changing assessment: Alternative views of aptitude, achievement, and instruction (pp. 37–75). Boston: Kluwer.

Robinson, J. (1996). Parents as allies for alternative assessment. In A. L. Goodwin (Ed.), Assessment for equity and inclusion: Embracing all our children (pp. 297–303). New York: Routledge.

Shepard, L. A., & Bliem, C. L. (1995). Parents’ thinking about standardized tests and performance assessments. Educational Researcher, 24, 25–32.

Schumaker, R. E., & Lomax, R. B. (1996). A beginner’s guide to structural equation modeling. Hillsdale, NJ: Erlbaum.

Stiggins, R. (1987). Design and development of performance assessments. Educational Measurement: Issues and Practice, 6, 33–42.

Wiggins, G. (1989a). Teaching to the (authentic) test. Educational Leadership, 46, 41–47.

Wiggins, G. (1989b). A true test: Toward more authentic and equitable assessment. Phi Delta Kappan, 70, 703–713.

Wiggins, G. (1996). Practicing what we preach in authentic assessments. Educational Leadership, 54(4), 18–25.

Woodcock, R. W., & Johnson, M. B. (1989). Woodcock Johnson Psychoeducational Battery–Revised. Allen, TX: DLM Teaching Resources.


Table 1

Subscales for Parent Questionnaire

Subscale 1. Parents’ reactions to the WSS summary report (8 items)

The Summary Report helps me understand:
My child’s strengths.
Where my child needs help.
How well my child’s achievement compares with expectations at his/her grade level.
How well my child is meeting the teacher’s expectations for learning.
My child’s progress.
How my child’s teacher is helping my child learn.
How well my child is doing overall.
The different areas of learning in my child’s classroom.

Subscale 2. Parents’ reactions to the WSS portfolio (4 items)

The Portfolio helps my child:
Think about improving his/her work.
Take pride in his/her work.
Understand the progress he/she is making in school.
Understand his/her strengths.

Subscale 3. Parents’ overall reactions to WSS (8 items)

Compared to typical report card with letter grades, I like this system better.
The WSS helps me know more about my child’s school work than report cards.
I know more about how my child learns from this system than from report cards.
I think the WSS helps my child to understand what he/she is learning.
My child likes using this system.
I would recommend the WSS to other schools and parents.
I feel that I understand what the WSS is all about.
If given the choice, I want to continue receiving the Summary Report instead of a report card with letter grades.

Subscale 4. Parents’ thoughts about WSS in relation to report cards (5 items)

Compared to typical report card with letter grades, I like this system better.
The WSS helps me know more about my child’s school work than report cards.
I know more about how my child learns from this system than from report cards.
I would recommend the WSS to other schools and parents.
If given the choice, I want to continue receiving the Summary Report instead of a report card with letter grades.


Table 2

Descriptive Statistics and Reliabilities for Parent Survey Subscales

Subscale            M      SD     N     N of Items   Reliability (α)
Summary report      3.17   .61    246   8            .91
Portfolio           3.12   .66    245   4            .87
Report card         2.92   .80    246   5            .92
Overall reaction    2.92   .80    246   8            .91

Table 3

Correlations Between Parent Survey Subscales

 

                                                         Summary report   Portfolio
Parents’ reactions to WSS portfolio                      .67***
Parents’ overall reaction to WSS                         .66***           .53***
Parents’ reactions to WSS in relation to report cards    .59***           .44***

***p < .001.


Table 4

Descriptive Statistics for Items Measuring Parents’ Reactions to the WSS Summary Report

Item                                                                M      SD    N     % agree and strongly agree

The summary report helps me understand:
My child’s strengths                                                3.24   .68   246   92
Where my child needs help                                           3.18   .76   246   87
How well my child’s achievement compares with grade expectations   3.10   .80   246   82
How well my child is meeting the teacher’s expectations for
  learning                                                          3.13   .74   245   85
My child’s progress                                                 3.28   .71   243   90
How the teacher is helping my child learn                           3.06   .88   244   77
How well my child is doing overall                                  3.14   .83   243   86
The different areas of learning in my child’s classroom             3.23   .76   245   88


Table 5

Descriptive Statistics for Items Measuring Parents’ Reactions to the WSS Portfolio

Item                                               M      SD    N     % agree and strongly agree

The portfolio helps my child:
Think about improving his/her work                 3.02   .77   243   79
Take pride in his/her work                         3.31   .75   245   89
Understand the progress s/he is making in school   3.09   .80   245   82
Understand his/her strengths                       3.06   .78   245   80


Table 6

Descriptive Statistics for Items Measuring Parents’ Overall Reactions to WSS

Item                                                      M      SD    N     % agree and strongly agree
Feel I understand what WSS is all about                   3.00   .80   242   80
Know more about school work than from report card         3.05   .88   242   79
Know more about child’s learning than from report card    3.07   .88   245   81
Staff available to answer questions                       3.12   .74   245   86
WSS helps child understand what he/she is learning        2.95   .79   244   76
Teacher likes using WSS                                   3.17   .69   235   89
Child likes using WSS                                     2.82   .76   238   73
Recommend WSS to others                                   2.89   .92   239   70
Prefer WSS to report card                                 2.69   .97   243   62
Want to continue receiving summary report                 2.90   .99   244   69


Table 7

Best Predictors of Parents’ Reactions to the WSS Summary Report and Portfolio and Parents’ Overall Reactions to WSS

                                                   Summary report         Portfolio              Overall reactions
Predictor                                          Model 1    Model 2     Model 1    Model 2     Model 1    Model 2
Parents’ ethnicity (minority)                       .025       -.007       .048       .031        .050       .017
Parents’ relationship to child (mother)             .038        .080       .004       .043       -.044       .002
Parents’ level of education                        -.086       -.019      -.112      -.064       -.040       .033
Number of children in WSS classrooms               -.137*      -.056      -.102      -.059       -.127      -.044
Number of years family received summary report     -.113       -.125*     -.056      -.055       -.069      -.081
Attended at least one parent/teacher conference     .076       -.037       .132*      .052        .125*      .005
Perceived staff availability to answer questions                .308***               .296***                .342***
Perceived that classroom teacher likes WSS                      .416***               .221**                 .431***
R²                                                  .052        .405***    .044       .224***     .049       .449***
R² change                                                       .353***               .180***                .400***

Note. Entries are standardized regression coefficients.
*p < .05, **p < .01, ***p < .001


Table 8

Descriptive Statistics for Parents’ Reactions to the Summary Report by Grade

Grade          M      SD     N
Kindergarten   3.33   .58    69
First Grade    3.14   .58    65
Second Grade   3.03   .67    61
Third Grade    3.17   .57    51

 

Table 9

Fit Indices for Model A and Model B

 

          χ²       df    χ²/df   GFI    AGFI   CFI    NFI
Model A   134.29   34    3.95    .89    .80    .89    .86
Model B   67.07    35    1.92    .94    .89    .96    .93


Appendix

Parent Questionnaire Items

I. Summary Report

The Summary Report helps me understand:

1.     My child’s strengths.

2.     Where my child needs help.

3.     How well my child’s achievement compares with expectations at his/her grade level.

4.     How well my child is meeting the teacher’s expectations for learning.

5.     My child’s progress.

6.     How my child’s teacher is helping my child learn.

7.     How well my child is doing overall.

8.     The different areas of learning in my child’s classroom (math, science, social studies etc.).

 

II. Portfolio

The Portfolio helps my child:

9.  Think about improving his/her work.

10.  Take pride in his/her work.

11.  Understand the progress he/she is making in school.

12.  Understand his/her strengths.

 

III. Overall Reaction

13.  Compared to typical report card with letter grades, I like this system better.

14.  The Work Sampling System helps me know more about my child’s school work than report cards.

15.  I know more about how my child learns from this system than from report cards.

16.  The teacher, principal, and other school staff are available and helpful when I have questions about the Work Sampling System.

17.  I think the Work Sampling System helps my child to understand what he/she is learning.

18.  My child’s teacher seems to like using this system.

19.  My child likes using this system.

20.  I would recommend the Work Sampling System to other schools and parents.

21.  I feel that I understand what the Work Sampling System is all about.

22.  I have attended at least one parent/teacher conference at which Work Sampling was discussed at my child’s school this year.

23.  If given the choice, I want to continue receiving the Summary Report instead of a report card with letter grades.

 

IV. Background

24.  Relation of respondent to child.

25.  Race/ethnicity of Respondent.

26.  Highest level of school completed by the child’s mother, guardian, or other full-time female caregiver.

27.  Number of your children in classrooms using Work Sampling.

28.  Number of years you’ve received a Summary Report.



[1] We acknowledge the invaluable assistance of Sandi Koebler of the University of Pittsburgh and Carolyn Burns of the University of Michigan in collecting and coding these data, Jack Garrow for assisting us with school district data, and Margo Dichtelmiller and Adrienne Gelpi Lomangino for assistance in developing the survey. We are also deeply grateful to the principals, teachers, parents, and children who participated in this study, and to the staff and administrators of the Pittsburgh Public Schools. This study was supported by a grant from the School Restructuring Evaluation Project, University of Pittsburgh, the Heinz Endowments, and the Grable and the Mellon Foundations. The views expressed in this paper are those of the authors and do not necessarily represent the positions of these organizations. Dr. Meisels is associated with Rebus Inc, the publisher, distributor, and source of professional development for the Work Sampling System®. Corresponding author: Samuel J. Meisels, School of Education, University of Michigan, Ann Arbor, MI 48109-1259; smeisels@umich.edu.
