The Cranky Taxpayer


SGP Scores




Think what you will of the Federal Devils, they recognized some time back that the SOL correlates with socioeconomic status.  For example, here are the Virginia 2015 reading pass rates by division vs. the division % of economically disadvantaged students:
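If VDOE ever coughs up the underlying table, quantifying that relationship is a two-line job. Here is a minimal sketch, assuming a division-level CSV; the file name and column names are my inventions, not VDOE's:

```python
# A minimal sketch (hypothetical file and column names): quantify the
# pass-rate vs. poverty correlation from a one-row-per-division table.
import numpy as np
import pandas as pd

df = pd.read_csv("division_pass_rates.csv")  # one row per division

# Pearson r between % economically disadvantaged and reading pass rate;
# a strongly negative r is what the chart above shows visually.
r = df["pct_disadvantaged"].corr(df["reading_pass_rate"])
print(f"Pearson r = {r:.2f}")

# Slope: pass-rate points lost per point of economic disadvantage.
slope, intercept = np.polyfit(df["pct_disadvantaged"],
                              df["reading_pass_rate"], deg=1)
print(f"Each point of disadvantage costs about {abs(slope):.2f} pass-rate points")
```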

We're not here to discuss whether this correlation suggests that more affluent families live in better school districts, whether their children are better prepared for school, whether their children have higher IQs, or whatever.  The point here is that a teacher with a classroom full of more affluent kids can be a lousy teacher and still show better SOLs than a better teacher with a class of less affluent students.

So, under the federal whip, Virginia collected Student Growth Percentiles (SGPs) from 2011 to 2014.

As we will see below, the SGP turns out to be very useful for measuring teacher performance, inter alia.  Perhaps too useful; VDOE now is abandoning the SGP in favor of a much duller tool.  More on that later.  For now, I'll just point out that their excuse for abandoning the SGP is that the scores cannot be calculated until all the testing is complete, i.e., until the end of the summer.  But they knew that when they adopted the SGP.  So either they are stupid or they are lying.  They are not stupid.

VDOE has a detailed discussion of the SGP here.  In short, each student is compared to the statewide group of students who posted the same score(s) in previous years.  A kid who passed last year with a pretty good score, and whose score this year is average for that group, scores in the 50th percentile.  A student who had an awful score last year is likewise compared to other students with the same prior score; if his performance is average for that group, he also scores in the 50th percentile.
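Here is a toy illustration of that peer-group ranking. It is not VDOE's production method (the real calculation uses quantile regression over full score histories); it just shows the idea, with made-up scores:

```python
# Toy illustration only, with made-up scores: percentile-rank each
# student's current score within the peer group that posted the same
# prior-year score. VDOE's real SGP uses quantile regression instead.
import pandas as pd

scores = pd.DataFrame({
    "student": ["A", "B", "C", "D", "E", "F"],
    "prior":   [450, 450, 450, 380, 380, 380],   # last year's SOL scores
    "current": [470, 455, 440, 430, 400, 385],   # this year's SOL scores
})

# Rank within each prior-score peer group, scaled to 0-100.
scores["growth_pctile"] = (scores.groupby("prior")["current"]
                                 .rank(pct=True) * 100).round()
print(scores)
```

Students B and E post very different raw scores but earn the same growth percentile, because each is ranked only against students who started where he started.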

To this point, VDOE says:

A student growth percentile expresses how much progress a student has made relative to the progress of students whose achievement was similar on previous assessments.

A student growth percentile complements a student’s SOL scaled score and gives his or her teacher, parents and principal a more complete picture of achievement and progress. A high growth percentile is an indicator of effective instruction, regardless of a student’s scaled score.

Thus, the SGP measures a teacher's performance vs. other teachers with similar students.  Slide 31 in the VDOE powerpoint is the Colorado data suggesting essentially no correlation between the SGP and economic disadvantage.  More on that and related issues here and here.

You'd think that an enlightened Education Department would trumpet those data.  Doubtless it would.  But don't mistake our own State Department of Data Suppression for enlightened.  Brian Davison, a parent of two Loudoun schoolchildren, had to sue VDOE and pay $1,100 to get the data, and even then VDOE suppressed the teacher identities.

First, the dataset: VDOE has produced three sets of SGP data.  They say there were errors in the first set, so I'll be looking at the 2d and 3d.

The 2d set of data has statewide SOL and SGP data with anonymous student IDs but with the schools identified.  VDOE suppressed the data for classes of fewer than ten students and for students who transferred during the school year.  The 3d dataset also has anonymous student IDs (different from those in set 2, so there's no way to join the two sets) and anonymous teacher IDs, but with the schools not identified.

The 2d dataset has lots of duplicate records.  Both sets 2 and 3 are missing a flock of Richmond data for 2013, especially in the middle schools.  VDOE says it's because Richmond didn't report the data.
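For anyone working with these files, here is a minimal cleanup sketch, assuming a CSV export; the file name and column names are hypothetical stand-ins for whatever layout you get:

```python
# Minimal cleanup sketch for the 2d dataset (hypothetical file and
# column names): drop the duplicates, then eyeball the Richmond gap.
import pandas as pd

df = pd.read_csv("sgp_set2.csv")

# The 2d set has many duplicate records; drop exact duplicates first.
before = len(df)
df = df.drop_duplicates()
print(f"dropped {before - len(df)} exact-duplicate rows")

# Check the missing Richmond 2013 data before trusting any middle
# school comparison; expect thin counts in grades 6-8.
richmond_2013 = df[(df["division"] == "Richmond City") & (df["year"] == 2013)]
print(richmond_2013.groupby("grade").size())
```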

Even so, there's a lot to learn from these data.

CAVEATS:

  • VDOE has manipulated these data. For example, they have suppressed the data in cases where the number of students (presumably in a class) is fewer than ten. In contrast to earlier data sets, they have suppressed the no-retest scores for students showing a retest score. They also have suppressed retest scores below passing (<400). Those are the changes I know about or have found; there may well be others. In any event, these data deviate from the expected mean and median of 50, probably because of the VDOE manipulations (see the sketch after this list for a quick way to check). For example, the average of the 2014 sixth grade reading SGPs in this dataset is 47.7 and the distribution is not the expected flat line:


  • VDOE gave us anonymized teacher IDs, but not the names of the teachers involved. More on this outrage in a later post.

  • The SGP is calculated from the difference between similarly sized SOL scores, which can lead to large relative errors in the SGP. This suggests considerable caution in interpreting any individual student’s SGP score or the average (or distribution) for a teacher where the number of scores is small.
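As promised above, here is a quick way to check the "flat line" expectation against any of these files (again with hypothetical file and column names). True percentiles are uniform, so the mean and median should sit near 50 and each decile should hold about a tenth of the records:

```python
# Uniformity check (hypothetical file and column names): honest
# percentiles should have mean and median near 50 and roughly equal
# decile counts. Lopsided bins point at the suppressed records.
import pandas as pd

df = pd.read_csv("sgp_set2.csv")
g6 = df[(df["grade"] == 6) & (df["subject"] == "reading")]["sgp"]

print(f"mean   = {g6.mean():.1f}  (expect ~50)")
print(f"median = {g6.median():.1f}  (expect ~50)")

# Decile counts: each bin should hold ~10% of the records.
print(pd.cut(g6, bins=range(0, 101, 10)).value_counts().sort_index())
```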

That said, here is the 2014 distribution of division average reading SGPs.

On this graph, and the one below, Richmond is the yellow bar, Petersburg is red, Norfolk is blue, and Hampton is green.

Excel cannot readably list all the divisions on this graph; the labels here and below omit every 2d division name. Thus, on the graph below, the yellow bar (Richmond) appears, unlabeled, between Pulaski and Henry Counties. That's just an artifact of the program, not evidence that Richmond has disappeared.

The relatively higher SGP averages for Norfolk and, especially, Hampton tell us that, despite mediocre SOL pass rates, their students' reading scores are improving significantly year over year.

The graph above averages the results of reading tests in the five grades, 4-8 (the SOL score in Grade 3 is the starting point so there’s no third grade SGP). It turns out there is a considerable variation from grade to grade. For example, Richmond:

As to math and algebra I, Richmond does much better.

As with the reading scores, the Richmond math scores plummet when the students enter middle school. Yet the state averages remain nearly flat (as they should; if VDOE were not manipulating the data, the state average would be entirely flat at 50 on every test).

Something uniquely ugly happens in the sixth grade in Richmond.

I asked the formidable Carol Wolf:

What’s going on with the 6th grade? Richmond’s [SGP] scores in both reading and math fall into a pit from fifth to 6th grade. I’m looking at data that show that NO Richmond teacher for whom we have SGP data taught 6th grade reading two years in a row; ONE Richmond teacher for whom we have SGP data taught 6th grade math two years in a row. Nobody taught either subject three years in a row.

Any ideas?

She replied:

I have asked several teachers what and why they think it is that Richmond’s 6th and 7th graders go from a fully accredited elementary school to being dumber than a sack of rocks when they hit middle school.

Their collective answer: The elementary schools are cheating.

Could be. The 8th Grade SGPs (which are below but approaching state average values) are based entirely on the change from previous years’ middle school SOL scores, while the 7th Grade SGP scores can reach one year into elementary school and the 6th grade SGP scores are based entirely on the change from students’ SOL histories in elementary school. If Richmond’s elementary SOL scores were artificially high and the middle school SOLs were low normal, the 6th graders and, to a lesser degree the 7th graders, would be starting at an artificially high SOL, so their SGP scores would show abnormally little improvement. That is, the SGP scores would be abnormally low in the sixth and, to a lesser degree, seventh grades.
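One cheap experiment along those lines: simulate the mechanism. The toy below (made-up numbers, and a crude rank-based stand-in for the real quantile-regression SGP) shows that inflating a cohort's prior-year scores by 25 points drags its mean growth percentile from about 50 down into the low teens, which is the shape of the sixth grade pattern:

```python
# Toy simulation of the inflated-prior hypothesis. All numbers are
# made up; this demonstrates the mechanism, not the fact.
import numpy as np

rng = np.random.default_rng(0)

# Statewide reference distribution of year-over-year score gains.
state_gains = rng.normal(5, 15, 50_000)

# A cohort that learned just as much as the state...
cohort_true_gains = rng.normal(5, 15, 1_000)
# ...but whose recorded elementary scores were inflated ~25 points,
# so each recorded gain is 25 points smaller than the true gain.
cohort_apparent_gains = cohort_true_gains - 25

def mean_sgp(gains, reference):
    # Percentile of each gain within the statewide reference.
    ref = np.sort(reference)
    return (np.searchsorted(ref, gains) / len(ref) * 100).mean()

print(f"honest priors:   mean SGP ~ {mean_sgp(cohort_true_gains, state_gains):.0f}")
print(f"inflated priors: mean SGP ~ {mean_sgp(cohort_apparent_gains, state_gains):.0f}")
```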

The 6th Grade teach-once-then-teach-something-else pattern would suggest that the new teachers get the sixth grade classes and that they get out as soon as they get any seniority. That would be consistent with unusually low sixth grade SGP scores, whether the elementary SOLs were inflated or not.

Let’s label Carol’s suggestion as an hypothesis and try to think of an experiment to falsify it.

In any event, with the prophylactic effect of Richmond’s appalling dropout rate (and, probably, Richmond’s remarkable retest rate), the scores rebound for Algebra I.

As I mention above, the third SGP dataset from VDOE contains (anonymous) teacher IDs.  This gives a first peek at how well, and how badly, some of Richmond's teachers are performing.

With all the caveats listed above, let's start with the statewide distributions of teachers' average SGP scores in reading and math.

Brian Davison points out that both distributions are reasonably symmetrical, suggesting that we do not have an unusually large number of teachers doing particularly well or poorly. That said, no parent will want a child to be subjected to the reading teacher in the first percentile, the other teacher in the second, or the three in the eighth.

The math scores are more widely distributed, showing a larger number of excellent and a larger number of awful teachers.

We already have seen that the Richmond average reading SGP plunges from fifth to sixth grades.  The Richmond distributions conform to that pattern. First, grade 5, statewide and then Richmond:

As you see, this distribution is a bit wider than the statewide distribution. That is, Richmond has relatively more excellent fifth grade reading teachers than the statewide average, and also relatively more who are not performing. Five (of sixty-seven) Richmond teachers are more than two standard deviations above the state average; three are more than two standard deviations below.

Those teachers at the low end need some work but, for the most part, Richmond’s fifth graders are in pretty good hands as to reading.

Then we have grade 6 reading results; again statewide and then Richmond:

Only one of Richmond’s twenty-one sixth grade reading teachers produced an average student improvement better than the state average; none was more than two standard deviations above the statewide average. Six (or seven, depending on the rounding) were more than two standard deviations below the state average and four were more than three standard deviations below. The Richmond average is 1.5 standard deviations below the state average.

These data tell us that Richmond’s sixth grade reading teachers are not doing a bad job. They are doing an appalling job.

Upon some reflection, the data also tell us two even more important things:

  • The principals (and the Superintendent) now have a quantitative measure of teacher performance (at least as to reading and math). If they don’t do something (soon!) about rewarding the excellent performers and retraining or firing the poor ones, we’ll know they need to be replaced themselves.

  • VDOE is hiding the identities of these high- and low-performing teachers from the parents who pay both the bureaucrats and the teachers, and whose kids are directly affected by teacher performance. Indeed, VDOE is abandoning the SGP, apparently for telling too much.  It seems that our educrats think it would be intrusive for the parents of Virginia’s schoolchildren to know whether their kids are in the hands of excellent, average, or lousy teachers. I think the term for that kind of inexcusable bureaucratic arrogance is “malfeasance.”

Turning to the math tests, here are the statewide and Richmond fifth-grade distributions.

Again, a lower average and larger numbers of good and bad teachers, but nothing startling.  But see the sixth grade numbers:

The Richmond average is 1.75 standard deviations below the state average. Four of eighteen Richmond math teachers are more than two standard deviations below the state average. Only one is above the state average.

At present, parents take their kids’ teachers willy-nilly. VDOE now has data that, in some cases, can tell those parents whether the teachers are effective. Yet VDOE says the “privacy” of those public employees is more important than informing the public about those employees’ performance. VDOE’s refusal to share those important data that have been bought with taxpayer dollars is an abiding and outrageous insult to Virginia’s taxpayers.

More data on teacher performance are here.

Until someone comes up with a catchy acronym, I will call the gang of secretive bureaucrats at VDOE the State Department of Data Suppression. The name may not be catchy but it surely is accurate.

(And the abbreviation, SDDS, is a palindrome!)

Your tax dollars at “work.”

ANOTHER CAVEAT:

As set out in some detail here, the Richmond middle school data are messed up.  VDOE, which is responsible for those data, does not appear to have remedied the problems, or even to have tried.  This tells us that the middle school data above may exaggerate (or may conceal the really awful magnitude of) the problems in Richmond's middle schools.  And for sure it tells us there is a large malfeasance problem at VDOE, even beyond their public posture.

On another front, the 2014 SGP data tell us, again, that spending more money on schools does not produce more learning.  First reading, then math:

Richmond is the gold squares; the red diamonds, from the left, are Hampton, Newport News, and Norfolk.

 

A Modest Proposal

SOL scores decrease with decreasing economic status of the family. Thus, the Feds have required (select the SGP Primer link) VDOE to compute a measure of learning, not family income. VDOE selected the SGP. VDOE now has three years of those SGP data that can be used to measure teacher effectiveness.

VDOE has a lawyer full of bogus excuses for not releasing the data with the teachers’ identities attached. None of those would prevent use of the data to rate the effectiveness of the colleges that sent us those teachers.

Just think, VDOE now can measure how well each college’s graduates perform as fledgling teachers and how quickly they improve (or not) in the job. In this time of increasing college costs, those data would be important for anyone considering a career in education. And the data should help our school divisions make hiring decisions.
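A sketch of how simple that analysis would be, assuming VDOE (which knows the identities) joined each anonymous teacher ID to the preparing institution and years of experience; those columns are hypothetical and do not exist in the released data:

```python
# Proposal sketch only: the file and the 'college' and
# 'years_experience' columns are hypothetical; VDOE would have to
# build this join from the identities it is hiding.
import pandas as pd

df = pd.read_csv("sgp_with_college.csv")

# How do each program's first-year graduates perform?
rookies = df[df["years_experience"] == 1]
print(rookies.groupby("college")["sgp"]
             .agg(["mean", "count"])
             .sort_values("mean", ascending=False))

# Do a program's graduates improve on the job? Mean SGP by college
# and experience year traces each program's improvement curve.
print(df.pivot_table(index="college", columns="years_experience",
                     values="sgp", aggfunc="mean"))
```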

In addition, VDOE could assess the effectiveness of the teacher training at VCU, which has been spending $90,000 a year of your and my tax money to hire Richmond’s failed Superintendent as an Associate Professor in “Educational Leadership.” (link now broken; we can hope they are rid of her)  Wouldn’t it be interesting to see whether that kind of “leadership” can produce capable teachers, given that it produced an educational disaster in Richmond?


Last updated 11/12/15
Please send questions or comments to John Butcher