
The Cranky Taxpayer



The State is cooking the SOL Numbers for the accreditation process:

False Advertising

Until January, 2004, the Education Department web site described the accreditation requirement as follows:

A school is fully accredited if students achieve pass rates of 70 percent or above in all four content areas.

As we shall see, that statement was not true.  Indeed, the accreditation web page now is considerably more circumspect (in its wordy, bureaucratic way):

School accreditation ratings reflect student achievement on Standards of Learning Assessments and other tests in English, history/social science, mathematics, and science. Ratings are based on the achievement of students on tests taken during the previous academic year and may also reflect a three-year average of achievement. Adjustments also may be made for students with limited English proficiency and for students who have recently transferred into a Virginia public school. Accreditation ratings also may reflect the success of a school in preparing students for retakes of SOL tests.

(Emphasis supplied).

Note added on 2/7/05: Although the Department fixed its web site in 2004, the misrepresentation that the cooked accreditation numbers are "pass rates" lived on.  In 2005 they used it to embarrass the Governor.

Even with all those caveats, you might have been inclined to credit the regulation that says "[t]he awarding of an accreditation rating shall be based on the percentage of students passing SOL tests . . . ."

In fact, the scoring only starts with the percentage of the kids who pass the test.  Then we get the “adjustments.”  As the critics of the test point out, “the state allows use of various accreditation-inflating strategies.”  These include allowances for disabled, transfer, and limited-English students, rolling averages of the scores, and the "bonus points" discussed below.

It makes sense to make allowances for disabled kids, for transfer students, and for students who do not speak English.  Given that the test numbers can fluctuate, perhaps it makes sense to allow a rolling average of the scores.  For a clear discussion of those adjustments, see the PAVURSOL web site.

As to the Bonus Points, however, the State Board has created two huge deceptions that can falsely inflate the scores.  The bonus points and some other "adjustments" are not authorized by the regulations.  In fact, some of the adjustments contradict the regulations.  And for good measure, it looks like they have invented a couple of smaller "adjustments" without even a vote of the Education Board.

Big Bonuses

The Regulations provide for a "remediation recovery" program through the eighth grade for students who have flunked an English SOL and in all twelve grades for the kids who have flunked a math SOL test.  The students who go through that program and fail the retake do not count against the score for the school.  The kids who finish the program and pass inflate the school's score.

Here is the way the state guidelines say it:

The passing rates on assessments administered in schools shall be calculated by dividing the total number students in a school who pass the assessments (numerator) divided by the total number of students who take the assessments except that students who are re-tested and fail SOL tests in English and/or mathematics after participating in a remediation recovery program shall not be counted in the total number of students assessed.

Stated in the Mother Tongue, that means if a kid flunks the test in English or math and suffers through a “remediation recovery program” and then flunks again, he does not count as having taken the test the second time, i.e., the second failure does not lower the school's pass rate.  Only the passes count in the scoring.

Of course, the pass rate for those who pass is 100%, so counting only the passes inflates the score.

That Guideline is not consistent with the Board's Regulation:

Schools shall be evaluated by the percentage of the school's eligible students who achieve a passing score on the SOL tests or other additional tests approved by the Board as outlined in 8 VAC 20-131-110.B. of these regulations in the four core academic areas administered in the school.

At 8 VAC 20-131-280.C.4 the Regulations further tell us that "eligible students" are all those "enrolled in the school at a grade or course for which a SOL test is required" with exceptions for kids with limited English proficiency or disabilities.

The Guidelines, however, go on to provide a score inflator that appears to be contrary even to the guideline above:

Placing a child in a remediation recovery program in English (Reading, Literature, and Research) and/or mathematics does not penalize a school if the student is not successful on the retake of an SOL test. Students who are successful on a retake of an SOL test are counted in the number of students passing a test but not in the number of students taking a test when calculating the passing rate for the school.

That is, the kids who go through remediation recovery and pass the test are counted as passing but not as taking the test.  This is even better than just not counting the remediation recovery kids who fail: This counts the passes in the numerator of the score but not in the denominator.

This can have a dramatic effect on the scoring: If 100 kids take the test and 50% of them pass, the raw pass score is, sure enough, 50%.  But if 50 kids who have flunked earlier undergo “remediation recovery” and retake the test and half of them pass, the score is not 50/100 = 50%.  Neither is it (50+25)/(100+50) = 50%.  Nor is it (50+25)/(100+25) = 60%, a 10% improvement over reality as provided by the first guideline above.  No, it is (50+25)/100 = 75%, a 25% improvement over reality.
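For the arithmetic-minded, here is that scenario in a few lines of Python (a sketch of my reading of the guidelines, not the Department's actual code):

    # Hypothetical scenario: 100 first-time testers, 50 pass;
    # 50 remediation recovery retakers, 25 of whom pass.
    first_takes, first_passes = 100, 50
    retakes, retake_passes = 50, 25

    # Honest arithmetic: every pass and every test taken counts.
    honest = (first_passes + retake_passes) / (first_takes + retakes)

    # First guideline: retake failures vanish from the denominator.
    failures_hidden = (first_passes + retake_passes) / (first_takes + retake_passes)

    # Second guideline: retake passes count in the numerator only.
    passes_only = (first_passes + retake_passes) / first_takes

    print(honest * 100, failures_hidden * 100, passes_only * 100)   # 50.0 60.0 75.0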

Indeed, the effect can be more spectacular than this scenario suggests: Because a remediation recovery pass adds to the number of passes but not to the number of tests taken, the bonus points can produce a score greater than 100%.

In 2001 (see the Minute for 10/22/01, no longer available on line), the Board added another score booster: It voted (again without changing the regulation and this time without even modifying the Guidelines) to give the same boost for kids who retake and pass a SOL test for verified credit (generally high school kids, who need "verified credits" to graduate).  Again, these passes add to the numerator of the score but the number of these students retaking the tests does not appear in the calculation.  This also can run the score over 100%.

That, or something like it, happened in Richmond this year, where Franklin Military had a 112.5% (!) pass rate in English and Community High had a 100.86% pass rate, also in English.

This maneuver is entirely contrary to the Regulation that provides, "Schools shall be evaluated by the percentage of the school's eligible students who achieve a passing score."  These retake kids are not "eligible students" because they have taken the test as required.  Thus they should not affect the score either way but they are being used to inflate the scores.


Good News & Bad News

On September 28, 2006, Chuck Pyle of the Department of Education kindly pointed out that the Board has modified the Remediation Recovery program.  The details are here.

The Bad News is that the numbers there reveal that the new procedure is merely cooking the numbers on a different burner.  The process still can improve the "Adjusted" scores dramatically.  For example, from the 2006 data:

School                  Test      SOL   Accred.   Diff.
Lucille Brown Middle    English    72        80       8
Lucille Brown Middle    Math       58        72      14
Fred Thompson Middle    English    75        79       4
Fred Thompson Middle    Math       58        74      16
Westover Hills Elem.    Math       58        74      16

The other Bad News is that the Richmond Times-Dispatch doesn't understand the difference between the two scores, and reported the cooked numbers as "SOL scores." [Link removed.  The Times-Disgrace busted them all when it moved to the InRich web site.]
 

 

More Good & Bad News

On September 28, 2007, Chuck Pyle of DOE delivered his second annual chastisement, to wit:

With the advent of annual testing in grades 3-8 and the revision of the SOA, remediation recovery has changed, as described below.

If a student fails an English or math test, receives remediation, and then passes the subsequent year's test in the subject, he now counts once as a passing student, and once as a remediation recovery student in both the numerator AND the denominator. So, rather than a "bonus point" in the numerator only, the student is counted as an extra passing student in both the numerator and the denominator. The school benefits, but not to the extent as under the previous remediation recovery process.

So...if 25 of 30 students in a school passed tests in English (83 percent)...and two of the passing students had received remediation after failing the previous year's English test...then the adjusted pass rate in English would be 84 percent. Previously, if the two students had passed the current year's English test AND retakes of the previous year's English test, the adjusted pass rate would have been 90 percent.
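Chuck's arithmetic checks out.  Here it is in Python for the skeptical (his numbers, my labels):

    passes, takes = 25, 30   # 25 of 30 pass: 83 percent
    recovered = 2            # passers who also completed remediation recovery

    # New method: recovered students count twice, in numerator AND denominator.
    new_rate = (passes + recovered) / (takes + recovered)   # 27/32
    # Old method: the bonus passes counted in the numerator only.
    old_rate = (passes + recovered) / takes                 # 27/30

    print(round(new_rate * 100), round(old_rate * 100))     # 84 90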

Chuck also sent along a link to the Board's Guidelines on the subject (see p. 26).  In short, the Guidelines say they massage the SOL score by using a running 3-year average if that raises the score and by adding the kids who pass after remediation recovery (but not those who flunk, it seems), as well as unspecified "allowances" for transfer students, kids who don't speak English, and retakes.  There is no word on whether they still use the same extralegal bonuses as before.  The result, they admit, is "adjusted pass rates."

This process is perfectly opaque: They do not publish the raw and adjusted scores in the same place and they do not show the details of the adjustments.  The data for the 2006-07 testing suggest that the current cooking of the numbers produces results perhaps a third to a half as deceptive as the old system.

(Accreditation scores cover grades 4, 6, 7, and 8 and the end-of-course tests.)

School Name                          Accred. English   Accred. Math   English SOL   Math SOL   English Boost   Math Boost
Albert Hill Middle 84 73 84 69 0 4
Armstrong High School 85 80 81 81 4 -1
Binford Middle 84 61 83 56 1 5
Chandler Middle 68 54 65 48 3 6
Elkhardt Middle 76 53 74 44 2 9
Franklin Military 93 88 91 87 2 1
Fred D. Thompson Middle 74 55 71 50 3 5
George Wythe High 78 71 76 71 2 0
Henderson Middle 84 78 79 74 5 4
Huguenot High 94 94 92 93 2 1
John Marshall High 89 83 84 84 5 -1
Lucille M. Brown Middle 77 65 73 63 4 2
Martin Luther King, Jr. Middle School* 80 75 79 72 1 3
Open High 100 100 100 100 0 0
Richmond Community High 100 99 100 99 0 0
Thomas C. Boushall Middle 61 51 61 47 0 4
Thomas Jefferson High 88 81 91 83 -3 -2
Average boost: English 1.8 points; Math 2.4 points.

  *warned for history

   

Notes:

  • The schools in red were not accredited this year
  • The two "-1" score boosts may be round-off errors. 
  • I don't have an explanation for the "-3" and "-2" for TJ.  I did check the data and the numbers here are those on the Web.

What can we say to Mr. Pyle?

Is the accreditation process less deceptive than before?  Yes.
Is the accreditation process still deceptive? Yes.
Is the accreditation process transparent?  Hardly.
Is this any way for our government to act?  Not on a bet!

The other Bad News is that the Richmond Times-Dispatch STILL doesn't understand the difference between the accreditation numbers and the SOL scores, and today (9/18/07) reported the cooked numbers as "SOL scores."  The failure to understand the difference embarrassed the (former) Governor but he, at least, corrected the error.

 

Extra Bonuses

As if that were not enough, there are two little boosts and one coverup, all of which modify the regulation without being adopted as regulations.  Indeed, two of these did not even enjoy a vote by the Board of Education:

In response to my Freedom of Information Act request, the Education Dept. produced a Word document that is titled "accreditation procedures/notes for 2003-2004."  This document says, in part:

for each SOLtest (sic) for each school for the current cycle (summer  2002, fall 2002, spring 2003)

  • remove students marked as cheating from unadjusted number taking and adjusted number taking . . .

  • accreditation pass rates are capped at 100% passing

  • accreditation percents are rounded up at 0.45 (34.44 = 34, 34.45 =35)

First, they reward the schools that permit cheating by removing the cheaters from the scoring, rather than scoring them as failures.  The Board voted to approve this change (search the 9/28/00 Minute -- no longer available on line -- for "improprieties" to find it) but did not adopt the change as a regulation.

Next, the 100 point cap, which does not inflate the scores but does serve to hide their nature: The only reason to cap the score at "100% passing" is that the score in fact is not "% passing" and numbers above 100 would give the game away.  Thus, to avoid embarrassing truths, the Department reports the 112.5% score at Franklin and the 100.86% at Community as "100" in each case.

Finally, even the round off is cooked, rounding up at 0.45 rather than 0.5.  The folks at the Department say this is to prevent principals who are below but very near to the 0.5 point border for accreditation from tormenting the Department by "finding" a student or two who should be excluded from the scoring, thus qualifying for the round up and accreditation.  They do not explain how this prevents the same gaming by principals who are within a student or two of the 0.45 cutoff.
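If my reading of that procedure is right, the round off amounts to something like this in Python (a sketch; I have not seen the Department's code):

    import math

    def accreditation_round(pct):
        # Round up at .45 rather than .5: 34.44 -> 34, 34.45 -> 35.
        return math.floor(pct + 0.55)

    print(accreditation_round(34.44), accreditation_round(34.45))   # 34 35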

For these last two "adjustments," they say they consulted the Chairman of the Board.  For sure, neither the Regulations nor the Guidelines authorizes these modifications of the Regulation.

Regulation by Fiat

As we have discussed, the Board's Regulations provide for accreditation "based on" the SOL scores.  The Guidelines modify that to provide a big boost for remediation recovery.  The Board voted on the retake boost for students seeking verified credits and the little boost by removing cheaters but did not incorporate them into either the Regulations or the Guidelines.  The Board did not even vote on the 100% cap and deceptive roundup.

So, we have accreditation requirements in a regulation, in the guidelines, in Board minutes (but not in either the regulations or the guidelines), and in two changes the Board did not even vote upon.

The Virginia Register Act defines "guidance document" as follows:

"Guidance document" means any document developed by a state agency or staff that provides information or guidance of general applicability to the staff or public to interpret or implement statutes or the agency's rules or regulations, excluding agency minutes or documents that pertain only to the internal management of agencies. Nothing in this definition shall be construed or interpreted to expand the identification or release of any document otherwise protected by law.

The guidelines are filed with the Registrar of Regulations as guidance documents of the Education Department.  They are not filed as regulations under the Administrative Process Act.

My copy of the Administrative Process Act says "'Regulation' means any statement of general application, having the force of law, affecting the rights or conduct of any person. . . ."  The Administrative Process Act also sets out a complicated process for adopting regulations.  The Board did not follow that process when it adopted its Guidelines.  Similarly, it did not follow the process when it adopted the other changes discussed above.  For sure, the Department did not follow any process when it installed the 100% cutoff and the 0.45 roundup.

Beyond question, all the "adjustments" discussed here are statements of general application, having the force of law, affecting the rights of Virginia schools and their students.  A few, such as the three-year rolling average, are in the Regulations.  Had the rest been adopted under the Administrative Process Act, they too would be regulations.  Instead, we have the Board and the Department modifying the Regulations by processes that do not amend or rescind the Regulations.  Of course, if they can install accreditation requirements by fiat, they don't need the Regulations in the first place. 

"Fiat" means "an authoritative or arbitrary order," as opposed to an Italian automobile or a duly adopted regulation.  The other term that probably applies to these non-regulation adjustments is "unlawful."

Hiding the Ball

You might think the State Education Department would put the "adjustments" on the web where every citizen can see how they are arriving at the accreditation decisions.  You also might think pigs can fly.

I asked the Department for the data on their "adjustments."  They gave me the "procedures" quoted above and a set of spreadsheets showing the adjusted numbers of students passing and failing each test at each school.  That is not enough to figure out where all the adjustments came from. 

So I asked for the data showing where those adjusted numbers came from.  They replied:

There are no existing files which show the magnitude of effect for recovery/retakes or any other accreditation procedure. Temporary files are run in SPSS by individual test for each administration with formulas to derive the number of students taking and passing each test according to accreditation procedures. The resulting files are merged into content areas and saved as Excel files. These Excel files are used to determine ratings. The SPSS files are discarded after accreditation ratings are released.

It's hard to be sure what that means.  It could well mean they are so ashamed of their shenanigans that they destroyed the audit trail.  For sure they had destroyed the files.

To their credit, the folks at the Education Department met with me, discussed their procedures, and recreated the intermediate data for Richmond.  They also fixed the description on their web page (as discussed above), insofar as their Board's regulation would allow it.  That behavior was consistent with their reputation as one of the more capable (and more transparent) state agencies.

Stay tuned for an analysis of those recreated data.  In the meantime, the numbers below are from the "adjusted" scores in the spreadsheets they produced earlier.

Also stay tuned to see whether they start reporting the size of the "adjustments" and whether they figure a way to stop calling the "adjusted" numbers "pass rates."

Boosted Scores

In any event, the 25% enhancement calculated in the scenario above is not far from what happened in Richmond with some of the 2003 scores.

From the data they gave me earlier it is not possible to tease out just how they arrived at the adjusted scores.   The data reveal the effects of some of the adjustments, however.

Comparing the middle school accreditation numbers with the corresponding pass rates on the 8th Grade math test, the accreditation scores average 14.2 points higher than the test scores.

Some of the kids take algebra (the number ranges from half of those taking the 8th grade test at AP Hill to 7% at Thompson) and a few take geometry (20% at Binford, fewer elsewhere).  The pass rates in these advanced classes are excellent.  If we include those scores, the accreditation inflation drops to 8.5 points.

The 8.5-point difference, however, double-counts the smart kids (counting them once when they take 8th grade math and again when they take an advanced math course).  The fair measure of the school's failure is the raw, eighth grade score.  By that measure, Boushall, Mosby, and Thompson are performing at sub-Petersburg levels; yet, with all the adjustments, all three are provisionally accredited in math.

The pass rates for the alternate assessment kids are very high, but the number of kids is small (eleven at Elkhardt, seven at Boushall, fewer elsewhere).  The big boost in the "adjustments" comes from the count of the fifth grade math test:

School                  Take Math 5   Pass Math 5
ALBERT HILL 0 20
BINFORD 0 14
CHANDLER 0 5
ELKHARDT 0 18
FRED D. THOMPSON 8 13
HENDERSON 0 20
LUCILLE M. BROWN 0 5
MOSBY 2 9
ONSLOW MINNIS 0 10
THOMAS C. BOUSHALL 0 20


That's right, folks: They are counting the passes but nowhere near all the takes.  See above for their (ephemeral) authority for this. 

If we look at the underlying data at Elkhardt, here is what we see:

Their Label What it Means Datum
SCHNAME School Name ELKHARDT MIDDLE
LO Low Grade 06
HI High Grade 08
TYPE Type of School Middle
TAKE MATH Adjusted No. Math Takes 150
PASS MATH Adjusted No. Math Pass 113
PCT Unrounded, adjusted score 75.3333
ACCRED.MATH.PCT Rounded, Adjusted score 75
S3T5M 5th Grade Math Takes, Spring 0
S3T8M 8th Grade Math Takes, Spring 114
S3TA1 Algebra I Takes, Spring 25
TAKE.M_A Alt. Assessment Takes 11.00
S3P5M 5th Grade Math Passes, Spring 18
S3P8M 8th Grade Math Passes, Spring 61
S3PA1 Algebra I Passes, Spring 24
PASS.M_A Alt. Assessment Passes 10.00
 
  Takes w/o 5th Grade 150
  Passes w/o 5th Grade 95
  Score w/o 5th Grade 63.3
 
  Score of 8th Grade Test alone 53.5
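Checking the Department's arithmetic against the components in that table (my Python, their numbers):

    # Spring takes and passes at Elkhardt, from the spreadsheet above.
    takes  = {"math5": 0,  "math8": 114, "algebra1": 25, "alt": 11}
    passes = {"math5": 18, "math8": 61,  "algebra1": 24, "alt": 10}

    adjusted = sum(passes.values()) / sum(takes.values()) * 100                      # 113/150
    no_fifth = (sum(passes.values()) - passes["math5"]) / sum(takes.values()) * 100  # 95/150
    eighth_only = passes["math8"] / takes["math8"] * 100                             # 61/114

    print(round(adjusted, 1), round(no_fifth, 1), round(eighth_only, 1))   # 75.3 63.3 53.5

Eighteen kids passing a test that, according to the "takes" column, nobody took.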

The folks at the Department say those Fifth Grade scores are remediation recovery kids.  Sure enough, the recreated intermediate data (the originals, recall, were destroyed) show 18 remediation passes on the 5th Grade math test.

It looks to my old eyes as if the Board's Regulations do not support this shenanigan.  At 8 VAC 20-131-280.C the regs say:

2. In a manner prescribed by the board, the evaluation of the performance of schools shall take into consideration:

a. The percentage of eligible students who achieve a passing score on the prescribed SOL tests or additional tests used for verified units of credit as outlined in 8VAC20-131-110 B;

* * *

e. The number of students who successfully complete a remediation recovery program and subsequently pass SOL tests in English (Reading, Literature, and Research) and/or mathematics during any scheduled administration by the end of the following school year.

* * *

4. Eligible students shall be defined as the total number of students enrolled in the school at a grade or course for which a SOL test is required unless excluded under subsection E of this section and those students with disabilities who participate in the alternate assessment program.

5. Schools shall be evaluated by the percentage of the school's eligible students who achieve a passing score on the SOL tests or other additional tests approved by the board as outlined in 8VAC20-131-110 B in the four core academic areas administered in the school.

The straightforward reading of that is that each school counts its own flunkers, not imported ones.  In contrast, the Guidelines say (at pp. 1-2):

As always, the scores of the student count at the school where the remediation and re-testing takes place. For example, a fifth-grade student fails the 5th grade mathematics test and is promoted to the 6th grade in a middle school. The student, who is remediated during the next school year, and who retakes and passes the 5th grade test, will count as a pass for the middle school. This would also be the case with a student who is promoted to the 9th grade, is retested on the 8th grade English (Reading, Literature, and Research) or mathematics test.

See above for a more elaborate discussion of this process of using Guidelines (or even lesser instruments) to modify the regulations.

This counting of passes but not failures can have a dramatic effect on the scoring.  For example, the eighteen 5th grade scores at Elkhardt, plus ten alternate assessment passes, increased the score (already inflated by the double-count) from 61 to 75, a 13.8 point boost that took the school comfortably into full accreditation for math.  (If you include the alternate assessments in the overall scoring, the remediation recovery scores increase the score from 63 to 75, a 12-point boost.)  Yet the actual pass rate on the 8th grade test at Elkhardt was 54%, just above Petersburg performance.  This is fraudulent accounting.

As to middle school English, the major boost to the scores again came from the fifth grade test where, again, many more passed the test than were counted for taking it:

School Take 5 Eng Pass 5 Eng
ALBERT HILL 0 20
BINFORD 0 12
CHANDLER 0 12
ELKHARDT 0 12
FRED D. THOMPSON 4 18
HENDERSON 0 24
LUCILLE M. BROWN 0 16
MOSBY 2 11
ONSLOW MINNIS 1 5
THOMAS C. BOUSHALL 0 15

Turning to the remarkable 112.5% English score at Franklin: Here are the high school English scores for the end of course tests ("English" + writing) and the raw accreditation scores (unrounded but also not cut down to 100).

The funny business here comes from the eighth grade scores: Franklin and TJ had healthy numbers of students who passed but did not count as taking the test.

With the smaller enrollment at Franklin, the inflation was more dramatic.

On the other hand, Kennedy had a flock of eighth graders who were counted for flunking the 8th grade test.  Go figure.

Again, the 8th graders came in at Franklin via the remediation recovery process:

Their Label What it Means Datum
SCHNAME School Name FRANKLIN MILITARY
LO Low Grade 09
HI High Grade 12
TYPE School Type High
TAKE ENG Adjusted No. Taking English 56
PASS ENG Adjusted No. Passing English 63
PCT ENG Unrounded Score 112.5000
ACCRED.PCT.ENG Rounded and Capped Score 100
S3T8E 8th Grade English Takes, Spring 1
S3TRL EOC English Takes, Spring 23
S3T9W EOC Writing Takes, Spring 32
TAKE.E_A Alternate Assessment Takes  
TAKESU_1 Substitute Test Takes  
S3P8E 8th Grade English Pass, Spring 9
S3PRL EOC English Pass, Spring 22
S3P9W EOC Writing Pass, Spring 32
PASS.E_A Alternate Assessment Pass  
PASSSU_1 Substitute Test Pass  
   
  Score w/o the 8th Grade 98.2

As you see, the remediation recovery boost was slightly over 14 points.  The state wasted 13 of those points by the (unauthorized and misleading) capping at 100.
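The same check for Franklin, again working from the numbers in the table (my Python):

    takes  = {"eng8": 1, "eoc_english": 23, "eoc_writing": 32}
    passes = {"eng8": 9, "eoc_english": 22, "eoc_writing": 32}

    adjusted = sum(passes.values()) / sum(takes.values()) * 100   # 63/56
    reported = min(adjusted, 100)                                 # the (unauthorized) cap
    without_8th = (passes["eoc_english"] + passes["eoc_writing"]) / \
                  (takes["eoc_english"] + takes["eoc_writing"]) * 100   # 54/55

    print(round(adjusted, 1), reported, round(without_8th, 1))   # 112.5 100 98.2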

For the history test in the elementary schools, they throw out the third grade score (per the regulations) if that will improve the school score.  For 20 of 31 elementary schools, ignoring the third grade score had just that effect.  At twelve of those schools, the improvement in the history score was more than five points:

School                  Score w/o Grade 3   Score w/ Grade 3   Improvement
PATRICK HENRY  75.7 57.1 18.5
GEORGE W. CARVER  70.8 54.2 16.5
MAYMONT  84.8 69.0 15.8
CHIMBORAZO  84.3 69.9 14.4
J. E. B. STUART  90.7 77.8 12.9
A. V. NORRELL  86.5 74.8 11.8
MILES JONES  93.7 83.2 10.4
SWANSBORO  89.2 79.5 9.6
WESTOVER HILLS  93.5 84.0 9.5
WHITCOMB COURT  74.4 66.7 7.7
E. S. H. GREENE  98.4 92.2 6.2
SOUTHAMPTON  86.7 80.7 5.9

Note the 18.5 (!) point boost at Patrick Henry.
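In effect, the regulation lets each elementary school report whichever score is higher, with or without the third grade.  Something like this (my sketch of the rule, not the Department's code):

    def elementary_score(with_grade3, without_grade3):
        # Per the regulations, drop the third grade score if that helps.
        return max(with_grade3, without_grade3)

    print(elementary_score(57.1, 75.7))   # Patrick Henry reports 75.7, not 57.1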

In exactly the same fashion, 15 of the 31 elementary schools got their science scores boosted by not counting the third grade scores.  Six of those schools got a boost of more than five points with Broad Rock the winner at 12.3:

School                  Score w/o Grade 3   Score w/ Grade 3   Improvement
BROAD ROCK ELEM. 98.1 85.8 12.2
OAK GROVE/BELLEMEADE ELEM 89.1 77.0 12.0
A. V. NORRELL ELEM. 77.8 67.3 10.5
PATRICK HENRY ELEM. 57.9 48.7 9.2
BLACKWELL ELEM. 77.8 70.1 7.6
MILES JONES ELEM 80.0 74.6 5.4

In a couple of years we'll see whether it was wise to reward these schools for shorting the history and science instruction in the third grade.

As for the differences between the raw and the "adjusted" 3d Grade English scores: Norrell is the big winner, enjoying a 33-point boost from 54.7 (the actual score) to 87.8 (the "adjusted" score).  The average boost is 11.8 points.

The same comparison for the 3d Grade math scores: The big winner here is Reid, with a 40-point boost (that is NOT a typo) from 78.7% of the kids passing to an "adjusted score" of 118.6 (THAT is not a typo, either).  The average boost is 15.9 points.

I could go on but I trust the point is clear.  They are cooking the numbers and misrepresenting the inflated results as having something to do with pass rates.  They are accrediting schools that, in fact, are doing a dismal job.

The net of all these "adjustments" was to take Richmond from about 11 fully accredited schools on the raw data to 23 on the adjusted data and from about 12 schools on warning to an adjusted 9.

I sure wish I could balance my bank book by this kind of process.

One (probably unintended) result of this deception was to embarrass the Governor, who danced the funky chicken over two "perfect scores" that in fact were a 73.7 and a 76.3.

You Can't Fail If You Don't Take the Test

Another factor that inflated the Richmond SOL scores was the very low percentage of students tested.  Doubtless Richmond's already lousy SOL scores would be even worse if the school system tested those missing students.  For reasons best known to the educrats, there is no accreditation penalty for not testing all the kids.

In contrast, the number tested has implications for Richmond's Adequate Yearly Progress under the No Child Left Behind Act.  Indeed, the 91% overall test rate has prevented most of the Richmond schools from making Adequate Yearly Progress under the Act.

 



Last updated 04/01/12
Please send questions or comments to John Butcher