Homeschooling is a voluntary activity, and demographic data indicate that parents who choose homeschooling have higher levels of education, income, and church attendance than the general population (Mayberry, 1988). Children from families with such demographic characteristics would most likely also score higher than average if they attended schools.
One alternative to between-group comparisons is within-group comparisons. This has been the strategy taken by Jon Wartes (1988, 1990a, 1990b), who matched standardized achievement test scores of home educated students with survey results from their parents and was able to draw conclusions with strong policy implications.
For example, Wartes found a weak relationship between students’ test scores and parents’ educational level. He concluded that his data did not support policy decisions that limited access to homeschooling based upon some arbitrary level of parent education (in excess of a high school education). This finding enabled Richman and Richman (1988) to argue in an Education Week commentary that state laws should be changed if they restrict home education only to parents with college degrees. In turn, Richman and Richman’s commentary was cited by the United States Secretary of Education in a 1989 report to President Bush titled “Educating Our Children” (Sheffer, 1991).
The purpose of this study was to replicate many aspects of the Washington research (Wartes, 1988, 1990a, 1990b) in order to determine whether it is appropriate to generalize its conclusions about factors affecting homeschool achievement.
Like the Washington studies, ours matched standardized achievement test scores of home educated students with survey results from their parents. We even borrowed many of the exact questions used in the Washington studies for our survey.
On the other hand, there were at least three major differences between our study and those in Washington: (a) a different state (Pennsylvania instead of Washington), (b) a different testing instrument (the Washington studies used the Stanford Achievement Test series while we used the Comprehensive Test of Basic Skills, 4th Edition, Survey Test [CTBS/4]), and (c) a different testing season (they administered their tests in the spring while we administered ours in the fall). These differences allowed us to determine whether the results of the Washington studies can be generalized to other populations of homeschoolers, other testing instruments, and other seasons of testing.
Method
In order to help homeschooling families comply with the Pennsylvania home education law, 174 achievement tests were administered in group settings, usually in the Sunday school rooms of a church, at locations sponsored by homeschooling support groups in every region of Pennsylvania in the fall of 1989. The cost to parents was $20 for each test administered. All test administration was closely supervised by two Pennsylvania-certified teachers, including one of the authors of this study. Confidentiality of survey answers and students’ achievement test scores was protected: another of this study’s authors matched the information on the numbered questionnaires with the numbered achievement test results without knowing the names of any of the students or parents who had completed the questionnaires. The statistical analysis was then conducted from an anonymous computer disk file.
Of the 174 tests, three were eliminated from consideration because the wrong level of test was administered. Of the remaining 171 students, the parents of 129 returned the surveys. The mean age of the students at the time of testing was 10.4 years, with a range of seven to seventeen. The mean grade level at the time of testing was fourth grade, eighth month, with a range of second to twelfth grade.
Only math and reading testing is required in Pennsylvania. Many of the students took the science and social studies sections, but we used only the “Total Reading” and “Total Mathematics” scores in our data analysis.
Testing is required in Pennsylvania only in the third, fifth, and eighth grades. Of the 171 students whose scores were included in the study, 128 were in the mandatory testing grades. The test results of students in the mandatory testing grades did not appear to differ substantially from those in the voluntary grades, although those who took the test in voluntary grades scored slightly lower in reading and slightly higher in math than those who took it in the required grades. In order to obtain a large enough sample, students in all grades were pooled together and were compared with the pooled results of the Washington studies.
Individual normal curve equivalents were tabulated from the raw test scores based upon tables provided in the Fall Norms Book by CTB/McGraw-Hill. The normal curve equivalent scores for total reading and total math were each found by averaging the scale scores from the subtests (the math subtests were Computation and Concepts and Applications; the reading subtests were Vocabulary and Comprehension) and then looking up the average scale score in the normal curve equivalent table. When the average scale score fell halfway between two integers (e.g., 545.5), the normal curve equivalent was obtained by rounding the average both up and down, looking up both scale scores, and averaging the two normal curve equivalents obtained. All data analysis was performed using these normal curve equivalents. All statistical procedures were performed using Systat: The System for Statistics for the PC.
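To make the conversion procedure concrete, the sketch below expresses it in Python. The lookup-table entries shown are hypothetical placeholders; the actual scale-score-to-NCE values come from the CTB/McGraw-Hill Fall Norms Book.

```python
# Sketch of the scale-score-to-NCE conversion described above.
# NCE_TABLE entries are hypothetical; real values come from the
# CTB/McGraw-Hill Fall Norms Book.
NCE_TABLE = {545: 61.0, 546: 62.0}  # hypothetical norms entries

def total_nce(subtest_scale_scores):
    """Average the subtest scale scores, then look up the NCE.

    If the average falls halfway between two integers (e.g., 545.5),
    look up both neighboring scale scores and average their NCEs.
    """
    avg = sum(subtest_scale_scores) / len(subtest_scale_scores)
    if avg == int(avg):
        return NCE_TABLE[int(avg)]
    low, high = int(avg), int(avg) + 1
    return (NCE_TABLE[low] + NCE_TABLE[high]) / 2

# Example: Vocabulary = 545, Comprehension = 546 -> average 545.5
print(total_nce([545, 546]))  # (61.0 + 62.0) / 2 = 61.5
```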
Findings
Although 75% of the parents whose children were tested returned the questionnaires, there was a significant difference between the math scores of the students whose parents returned the questionnaires and the scores of those whose parents did not, t(169) = 2.6, p = .01.
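For readers who want the computation made explicit, the sketch below shows one common form of this comparison, a pooled two-sample t test, which is consistent with the reported degrees of freedom (df = 129 + 42 - 2 = 169). The score lists are made up and far smaller than the actual samples.

```python
# Sketch of a pooled two-sample t test; the score lists are
# hypothetical and much smaller than the actual samples.
from statistics import mean, variance

def pooled_t(a, b):
    n1, n2 = len(a), len(b)
    # Pooled variance, then the t statistic with n1 + n2 - 2 df.
    sp2 = ((n1 - 1) * variance(a) + (n2 - 1) * variance(b)) / (n1 + n2 - 2)
    t = (mean(a) - mean(b)) / (sp2 * (1 / n1 + 1 / n2)) ** 0.5
    return t, n1 + n2 - 2

returned = [60, 55, 62, 58, 64]       # hypothetical math NCEs
not_returned = [70, 72, 68, 71]
t, df = pooled_t(returned, not_returned)
print(round(t, 2), df)
```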
The mean reading and math scores of the students whose parents returned the questionnaires were the 72nd and 60th normal curve equivalents. The mean reading and math scores of the students whose parents did not return the questionnaires were the 74th and 70th normal curve equivalents. Overall, the mean scores for the 171 students, if converted from normal curve equivalents to percentile rank, were the 86th national percentile in total reading and the 73rd national percentile in total mathematics.
Comparison with Washington Studies
The Washington studies found lower overall achievement than our study did. The mean total reading score of 2,305 home educated students in Washington was the 64th national percentile, and the mean total mathematics score of 2,607 home educated students was the 53rd national percentile (Wartes, 1990a, p. 58).
On other measures, however, we found the same general pattern of survey responses that was found in the Washington studies. Table 1 shows the answers given by Pennsylvania parents and those given by Washington parents.
Tables 2 and 3 show the regression coefficients calculated using a simple linear regression model (y = bx + c) in both our Pennsylvania study and the Wartes (1990b) Washington study. The correlations between the education level of the parents and the achievement of the students were quite small in both studies. Three of the four correlation coefficients (i.e., standardized regression coefficients) comparing the educational level of the parent with the achievement of the student were slightly lower in the Pennsylvania study (.09 compared to .21, .14 compared to .15, and .01 compared to .13). The fourth was slightly higher in the Pennsylvania study (.21 compared to .20).
Similarly, the correlations between family income and achievement were generally small in both studies. The correlation between family income and reading achievement was .13 in Washington and .11 in Pennsylvania. The correlation between family income and math achievement was .13 in Washington and -.11 in Pennsylvania.
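As a point of reference, for a simple linear model y = bx + c the standardized regression coefficient is identical to the Pearson correlation coefficient, which is why the two terms are used interchangeably above. The sketch below, using hypothetical data, shows the computation.

```python
# Minimal sketch, with hypothetical data, of the statistic reported
# in Tables 2 and 3: Pearson's r, which for a one-predictor model
# y = bx + c equals the standardized regression coefficient.
def mean(v):
    return sum(v) / len(v)

def pearson_r(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: parent education (years) vs. student NCE score.
educ = [12, 12, 14, 16, 16, 18]
score = [55, 68, 60, 72, 58, 70]
print(round(pearson_r(educ, score), 2))
```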
When parents were asked, “About how many hours per week does this student spend doing ‘formal schooling’ (structured lessons that were preplanned by either the parent or a provider of educational materials)?” they responded with an average of about 16 hours per week, which is similar to the average of 15 hours per week found in Washington. Wartes found that the number of hours per week of formal instruction increases gradually with the student’s age. The same general rise is apparent in the Pennsylvania data: in Washington, hours per week increase by 0.86 hours for each year of age, while in our sample they increase by 0.77 hours for each year. Similarly, in Washington the level of structure on the seven-point scale increases by 0.04 units for each year of age, while in our sample it increases by 0.10 units for each year of age.
When achievement test scores of children who had a certified teacher as a parent were compared with scores of children whose parents were not certified, there were no significant differences. The mean total reading score of the 11 children with a certified teacher-parent was the 73rd normal curve equivalent and was not significantly different from the 72nd normal curve equivalent for children without a certified teacher-parent, t(127) = -.084, p = .93. The mean total math score of the 11 children with a certified teacher-parent was the 56th normal curve equivalent and was not significantly different from the 61st normal curve equivalent for children without a certified teacher-parent, t(127) = .69, p = .49.
One last comparison was made. When achievement test scores of boys were compared with scores of girls, there were no significant differences. The mean total reading score of the 64 boys was the 73rd normal curve equivalent compared to the 72nd normal curve equivalent for the 65 girls. The mean total math score for the boys was the 62nd normal curve equivalent compared to the 59th normal curve equivalent for the girls.
Discussion
Overall Test Results
By averaging normal curve equivalents, we found that the mean total reading score corresponded to the 86th national percentile and that the mean total math score corresponded to the 73rd national percentile.
An earlier Pennsylvania study (Resetar, 1990), which collected Pennsylvania achievement test reports in 1987 from a similar sample of homeschoolers, found similar scores. Resetar sent a questionnaire to approximately 300 families who received the Pennsylvania Homeschoolers newsletter. Although the Resetar study predates ours by several years, the samples may be very similar since both were related to the Pennsylvania Homeschoolers support network: Resetar’s study utilized the newsletter mailing list, while our study was based upon tests which were advertised in the newsletter and were sponsored by support groups listed in the newsletter.
One difference was the method of collection of achievement test results. We collected our results at the test site while Resetar relied upon parents to report the results. Only about 25% of approximately 300 families responded to Resetar’s questionnaire. Of the families who did respond, many had children who had not yet taken achievement tests.
Another difference was in the method of calculating the mean percentile rank. We averaged the normal curve equivalents and then converted the average to a percentile rank; Resetar used the less accurate method of averaging the percentile ranks.
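The reason averaging percentile ranks is less accurate is that percentile ranks are not an equal-interval scale, while NCEs are (a normalized scale with mean 50 and standard deviation 21.06). The sketch below, using two hypothetical percentile ranks, shows how the two methods diverge.

```python
# Illustrative sketch of why averaging NCEs, then converting back,
# differs from averaging percentile ranks directly. The NCE scale
# definition (mean 50, SD 21.06) is standard; the percentile ranks
# below are made up.
from statistics import NormalDist

ND = NormalDist()

def pct_to_nce(p):
    return 50 + 21.06 * ND.inv_cdf(p / 100)

def nce_to_pct(n):
    return 100 * ND.cdf((n - 50) / 21.06)

pcts = [84, 98]  # hypothetical percentile ranks for two students
mean_pct = sum(pcts) / len(pcts)                       # 91.0
mean_nce = sum(pct_to_nce(p) for p in pcts) / len(pcts)
print(mean_pct, round(nce_to_pct(mean_nce), 1))        # 91.0 vs ~93.6
```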
Nevertheless, the score reports were somewhat similar. Resetar found that the mean percentile rank of the 47 children whose parents sent in reading test scores was the 82nd percentile, and the mean percentile rank of the 46 children whose parents sent in math test scores was the 81st percentile. The similarly high levels of achievement in our study and Resetar’s study suggest that our result may be a reliable indicator of the achievement of homeschoolers within the loosely defined Pennsylvania Homeschoolers support network.
On the other hand, the mean test scores that we found were much higher than the mean test results that were reported for the Washington studies. The Washington students scored at the 64th national percentile in total reading and the 53rd national percentile in total mathematics (Wartes, 1990a, p. 58).
Similarly, the high scores that we found do not fit with the test results reported by the Pennsylvania Department of Education (J. F. Hertzog, personal communication, July 2, 1990) using a criterion-referenced test called the TELLS test. In Pennsylvania, both the CTBS/4 and the TELLS are on the list of tests which can be used to meet the testing requirement of the Pennsylvania law. Unlike the CTBS/4, the TELLS can only be taken at a school; it is administered each March at every public school in Pennsylvania.
While the scores of the homeschool students on the CTBS/4 were considerably higher than the norms for school educated students, there was no such disparity in the TELLS test results. In fact, the average reading scores of the 278 home educated students who took the TELLS test in March 1990 were only slightly higher than the scores of school educated students, and their average math scores were slightly lower. It appears that home educated students who took the TELLS test scored much worse than home educated students who took the CTBS/4 test.
The difference between these scores may be attributable to some of the many differences between the CTBS/4 test and the TELLS test:
1. Teachers in public schools may teach the TELLS objectives more directly than homeschooling parents since public school teachers have better access to these objectives, and are urged to teach to them by their supervisors.
2. Home educated students may test better in the fall than in the spring when compared with school educated students, because more home educated students continue to engage in educational activities during the summer.
3. Home educated students may have felt more comfortable taking the CTBS/4 test with a group of their peers than taking the TELLS test with a group of school students.
Of these explanations, only the second could also explain the difference in scores between the Pennsylvania students who took the CTBS/4 and the Washington students who took the Stanford Achievement Tests.
On the other hand, the possibility cannot be ruled out that the students who took the CTBS/4 and those who took the TELLS test represent two different populations within the Pennsylvania homeschooling community. As Jon Wartes has pointed out to one of the authors in a personal communication, the average of the CTBS/4 results and the TELLS test results would approximate the scores found in Washington.
One difference between the CTBS/4 and the TELLS is cost. Home educated students who took the TELLS test may, on average, have come from lower-income families than those who took the CTBS/4 test, since there was a $20 charge to take the CTBS/4 while the TELLS was free. However, the low correlation between income and achievement found in both the Washington studies and this study encourages the search for other factors. Alternatively, there may be two groups of home educated students in Pennsylvania: those who are in contact with the Pennsylvania Homeschoolers support network, and those who get their information about homeschooling from their school districts. Those in the support network would be more likely to take a test administered by Pennsylvania Homeschoolers, and those who get their information from public school employees would be more likely to take a test administered by the public schools. People in the support network (and we include here all of the support groups who sponsored the tests) may be more resourceful, may get more input about alternative approaches, and may have more people to turn to when they have problems.
If this last interpretation is true, the results reported in this study cannot be considered to be representative of the entire homeschooling population in the state of Pennsylvania, but instead must be considered to represent a subset of the homeschoolers within a loosely defined Pennsylvania Homeschoolers support network.
Poor Match Between Survey Sample and Test Sample
Even though 75% of those tested were represented in the survey data, there is some doubt as to whether the questionnaires adequately represent the population of students who participated in the testing program. This doubt arises because the students whose parents did not return the questionnaire scored higher in mathematics than those whose parents did. Perhaps parents whose children perform less well in mathematics were more likely to remain at the test site after the testing was over in order to discuss their children’s math difficulties with the testers, since both testers were authors and speakers on ways to teach academic subjects at home. The testers report that they did discuss math difficulties with many parents at the test sites after testing. While waiting to talk with the testers, these parents would have been more likely to remember to fill out and/or return the questionnaires.
In view of this disparity in math scores, the survey information cannot be interpreted as being fully representative of the population of students tested.
Comparison with Wartes’ Studies
Although our study cannot be assumed to be representative of the entire homeschooling population of Pennsylvania, it provides a replication and a confirmation of many of the findings reported by Wartes (1988, 1990a, 1990b).
The large differences in achievement test scores between Pennsylvania students who took the CTBS/4 and those who took the TELLS test suggest that neither the TELLS group nor the CTBS/4 group is fully representative of the Pennsylvania homeschooling population. The fact that the Washington scores appear to lie between these two groups suggests that the mean scores obtained by the Washington studies may be more representative of the homeschool population in general.
The close fit of the Pennsylvania survey and correlation results with the Washington results suggests that the Washington survey data can be generalized to home education families in other states and that the Washington linear regression data can be generalized to measurement of reading and mathematics achievement with other standardized achievement tests. Specifically, our results suggest that all of the following conclusions of the Washington studies (Wartes, 1990b, pp. 5-8) are also applicable in other states:
1. The relationship between parent education level and homeschooler outcomes is weak. Policy decisions that would limit access to homeschooling based upon some arbitrary level of parent education (in excess of a high school education) are not supported.
2. As a group, children of certified teachers did not score above children of non-teachers. This evidence provides no support for policy decisions that would require contact with a certified teacher as a condition to homeschool.
3. There were no significant differences between the scores of boys and the scores of girls. Potential concerns that either sex may be at a disadvantage compared to the other as a result of homeschooling are not validated by these data.
4. There is little to no relationship between level of structure and academic outcomes. Policy decisions that might impose a curriculum for the sake of its structure and/or require minimum hours per week of formal instruction are not supported.
5. The relationship between family income level and homeschooler outcomes is weak. There is little basis for concern regarding academic achievement among homeschoolers based upon family income levels.
Further Research
All of the achievement test results discussed in this paper, including those of the Washington studies, have shown that home educated students score lower in mathematics than in reading. Our current project is to try to determine what makes for a good homeschool math program.
References
Mayberry, Maralee. (1988). The 1987-88 Oregon home school survey: An overview of the findings. Home School Researcher, 4(1), 1-9.
Resetar, Mark A. (1990). An exploratory study of the rationales parents have for homeschooling. Home School Researcher, 6(2), 1-7.
Richman, Howard, & Richman, Susan. (1988). Commentary: Legalization of home schools should proceed. Education Week, 8(2), 32.
Sheffer, Susannah. (1991). Support from the Federal Gov’t. Growing Without Schooling, No. 79, 3.
Wartes, Jon. (1988). Summary of two reports from the Washington homeschool research project: 1987. Home School Researcher, 4(2), 1-4.
Wartes, Jon. (1990a). Report from the 1986 through 1989 Washington Homeschool Testing. (Available from Washington Homeschool Research Project, 16109 N.E. 169 Pl., Woodinville WA 98072)
Wartes, Jon. (1990b). The relationship of selected input variables to academic achievement among Washington’s Homeschoolers. (Available from Washington Homeschool Research Project, 16109 N.E. 169 Pl., Woodinville WA 98072)
1 The authors of the two papers in this issue of Home School Researcher prefer to refer to the practice of home education with one word (e.g., homeschoolers, homeschooling). The traditional editorial practice of this journal has been to use two words (e.g., home schoolers, home schooling, home educators).