Monday, December 28, 2009

School district excludes feedback from mathematicians

By Laurie H. Rogers


Last September, Spokane Public Schools created an adoption committee that was to choose a new high school mathematics curriculum. For the first time perhaps ever, parents, students and community members (including yours truly) were allowed to participate. The committee met six times in 2009. At our Dec. 9 meeting, we chose two strong finalists from eight possibilities, and we did it despite the district’s complete mishandling of the adoption process. Administrators and “facilitators” wasted our time and tax dollars on useless activities; minimized or excluded feedback from parents, teachers, students, and committee members; and continually showered us with extreme reform propaganda.

On Dec. 9, the district’s interference went to a whole new level. On that day, members of the adoption committee were barred from determining whether the eight possible curricula met the most crucial requirement on our list: “Alignment to State Standards.” Instead, we were told to use a previous assessment from the Washington State Office of Superintendent of Public Instruction (OSPI). We were assured that if OSPI ranked a curriculum as closely aligned with the state standards, the program could be assumed to be “accurate, rigorous, and high quality.”

This assurance necessarily presumed that OSPI's assessment was done thoroughly, correctly and without bias. But that presumption is on shaky ground.

In 2008, under former State Superintendent Terry Bergeson, a team put together by OSPI assessed 13 high school math curricula (3 texts each), mapping each to the new state standards. There wasn't time for careful assessments of "mathematical soundness." The top four were:
1. Holt
2. Discovering Algebra/Geometry
3. Glencoe
4. Prentice Hall

Core-Plus Mathematics Project, Spokane’s current high school math curriculum, placed sixth overall. Core-Plus is a reform program, and its text is arranged in an “integrated” fashion. Of the textbooks with a “traditional arrangement of content” (algebra 1/geometry/algebra 2), Holt placed second in algebra and first in geometry. Meanwhile, Discovering Algebra/Geometry placed first in algebra, but sixth in geometry.
(Although Discovering Algebra/Geometry contains a “traditional arrangement of content,” it isn’t a “traditional” textbook. The texts are heavily constructivist, with constant group work and student "discovery.")

OSPI also asked Drs. James King and George Bright to assess the top four curricula for mathematical soundness. Both men had potential conflicts of interest.
  • Dr. Bright has a Ph.D. in mathematics education (not in mathematics) and has advocated for reform math. At the time, he worked for OSPI. He also was part of the assessment team for reform curriculum Connected Mathematics.
  • Dr. King has a Ph.D. in mathematics and is an author for Key Curriculum Press (Key Curriculum Press is the publisher of Discovering Algebra/Geometry).
Only Dr. Bright reviewed the algebra textbooks. Only Dr. King assessed the geometry textbooks, but he assessed McDougal-Littell instead of Discovering Geometry because the latter had scored too low in OSPI’s initial assessment to be considered for this additional assessment. Dr. Bright found Holt and Discovering Algebra to be the best in algebra; Dr. King found Holt and Prentice Hall to be the best in geometry.

OSPI released a preliminary recommendation to the State Board of Education (SBE). Legislation required the SBE to review the recommendation before OSPI issued a final recommendation to the school districts and to the general public. The top 3:
1. Holt
2. Discovering Algebra/Geometry
3. Core-Plus Mathematics

I’ll bet you’re wondering how Core-Plus snuck in there. I wondered the same thing. In January 2009 I asked OSPI why Core-Plus was recommended over other, better curricula. Greta Bornemann, OSPI’s math director, told me that Randy Dorn, the new superintendent, wanted to have at least one integrated curriculum in the top three.

And so OSPI initially chose to recommend Core-Plus (despite the entire series being widely panned) and Discovering Algebra/Geometry (despite the geometry portion of that series being widely panned). The SBE meanwhile had contracted with Strategic Teaching, Inc. to have the top four curricula assessed by other independent mathematicians. For this assessment, the fourth-ranked curriculum – Prentice Hall – was passed over so that sixth-place Core-Plus Mathematics could be assessed in depth.

Two mathematicians – Dr. Stephen Wilson, Johns Hopkins University, and Dr. Guershon Harel, University of California, San Diego – determined that Core-Plus and Discovering Algebra/Geometry are indeed mathematically unsound. Holt – while not thought to be fabulous – was the only one of the four found to be mathematically sound in all categories assessed. Following this process, OSPI issued its final recommendation to the public. Just one high school curriculum was recommended: Holt Mathematics.

This fall, Dr. Bridget Lewis, Spokane’s executive director of instructional programs, told parents in two community forums that the mathematicians conducting the state reviews did not agree on the results. This is a partial truth. All four of the in-depth reviewers (Drs. Wilson, Harel, Bright and King) chose Holt Mathematics in their final summary. Drs. Wilson and Harel also agreed on the unsound nature of Discovering Algebra/Geometry and Core-Plus Mathematics. Dr. Lewis and Rick Biggerstaff, Spokane’s secondary math coordinator, knew about the additional in-depth assessments, and also about OSPI’s sole recommendation of Holt, yet they still forced the curriculum adoption committee to use OSPI’s original, cursory scoring.

On Dec. 9, I asked Rick Biggerstaff why they did that. I mentioned the in-depth assessments from Drs. Wilson and Harel, plus another done by mathematician Dr. John Lee, University of Washington (who also found Discovering Algebra/Geometry to be inadequate). Rick Biggerstaff brushed off my concerns, saying the assessments from Drs. Wilson and Harel were only about “mathematical soundness,” not “alignment.” Pointing to OSPI’s original scoring, he repeatedly stated, “We’ve decided we’re going to use this.”

But why? Why would Spokane administrators insist on using OSPI’s original scoring when its results conflict with later in-depth assessments? The most notable aspect of OSPI’s original scoring is that the OSPI team ranked Discovering Algebra/Geometry – a highly constructivist (discovery) program – as second overall despite its dismal scoring in geometry. Perhaps Dr. Lewis and Rick Biggerstaff didn’t bother to become informed about the in-depth assessments. Or, perhaps their unstated agenda was to keep a constructivist program in the running despite its known inadequacy. Perhaps both. Are there other possibilities?

Despite all of this, a majority of the members of Spokane’s adoption committee stood tall on Dec. 9 and chose Holt Mathematics and Prentice Hall as the two finalists. We did it based on our familiarity with mathematics, our experience in mathematics instruction and tutoring, and the desires of the community we serve. I’m proud of the committee. Now, if we can successfully navigate Spokane’s brief pilot of Holt and Prentice Hall, the district’s final recommendation to the school board, the school board vote, and the funding of the new math curriculum, we’ll really be getting somewhere.

Please note: The information in this post is copyrighted. The proper citation is: Rogers, L. (December, 2009). "School district excludes feedback from mathematicians." Retrieved (date) from the Betrayed Web site: http://betrayed-whyeducationisfailing.blogspot.com/

Thursday, December 17, 2009

School district excludes feedback from parents, teachers

Statement from Laurie Rogers on the feedback (part 2):
Spokane Public Schools’ pre-selection criteria for its new high school math curriculum are purportedly based on summaries of feedback from parents, students and teachers.
Careful diligence was required, therefore, in the collection and summarizing of this feedback. Unfortunately, the processes were so poorly conducted as to render the District’s summaries of the feedback virtually worthless.
It's a shame. Parents and teachers who offered their thoughts appear to care deeply about the issue. It’s disgraceful that so many of their thoughts were rewritten, minimized, reinterpreted, questioned, doubted and “summarized” right off the page.
(In spite of this, the adoption committee members did appear to take the community feedback into account, overwhelmingly voting for Holt Mathematics or Prentice Hall Mathematics as their top choice. These two curricula will now undergo a more in-depth assessment, including a brief classroom pilot, before a final recommendation is made to the school board.)

*********************************************

Parent Feedback Contaminated; Parent/Teacher Feedback Excluded:

In November, SPS hosted two community “forums” on the adoption of a new high school math curriculum. Those who came to the forums listened to a 50-minute District presentation, then were asked to write down on 3”x5” cards what they wanted from a new high school math program.
District staff made no attempt to differentiate between parents who work for the District and parents who don’t. There was no attempt to collect feedback only from parents, or only from parents of students in the district. The forums were attended by District staff, board members, university professors, District teachers, and math coaches – including people on the curriculum committee. Cards were handed out across the room. I was given cards at both forums.
Thus, the parent feedback was contaminated from the start.

The next week, I went back to the District office and looked through the cards collected at the two forums. I saw an interesting dichotomy.
Most of the cards are clear about a desire for a more traditional approach. They variously ask for “traditional” math, “basic” math, examples, direct instruction, practice, review, standard algorithms, a textbook, mathematical principles, skill proficiency (without calculators), level-appropriate material, tutoring, individual work, a dual track, alignment with state standards, a reference set of formulas, workbooks, and clarity.
On the polar opposite are a few cards asking for connections, explorations, conceptual understanding, application, and real-world context. Looking at these cards, one might think regular parents left their home at dinner time so they could drive to a district high school and use typical educator language to ask the District for more reform math.
Just one of the so-called “parent” cards specifically asks for a “balance” between conceptual and procedural skills, yet this one word became the framework for the District’s summary of the “parent” feedback.

Teacher feedback was solicited in the same casual, unscientific manner. The two most common teacher requests are “examples” and more opportunities to practice skills. Close behind are requests for context, conceptual understanding or application. Also popular are requests for close alignment with the new Washington State math standards.
A dichotomy is present in the teacher cards, too; however, this dichotomy probably is legitimate. Some teachers clearly want a more traditional approach, asking for equations, algorithms, step-by-step instruction or examples for the students, basic skills, proficiency with skills such as algebra, a logical sequence to the material, and no integration of concepts.
The other group wants to stick with reform, asking for investigations and a student-centered, constructivist classroom.
The incompatibility between these philosophies was never discussed in any adoption committee meeting. Quite the contrary. All efforts to discuss it were squelched by the people running our meetings. The way these people consistently handled any disagreement over “reform” vs. “traditional” was to change the subject or substitute the word “balance,” as in “a balance between,” or a “balanced approach” – even if that wasn’t what was said.

On Dec. 3, adoption committee members were asked to go through the parent and teacher cards. We were divided into four groups and asked to “silently” lump the cards into “three to five” categories and then “come to a consensus about a phrase to describe each category.” At my table - a "parent" table - I was surrounded by administrator types, and we didn't have consensus.
“The parents want a textbook,” I said at one point to the Administrator In Charge of the Pen.
“I think it’s implied,” he said, refusing to write the word.
We argued back and forth. “Look,” I finally said, exasperated, showing him the parent cards. “‘Textbook.’ ‘Textbook.’ ‘Textbook.’ Just write it down.”
In the course of this process, requests for a more “traditional” approach were excluded. I asked the Administrator In Charge of the Pen to note the disagreement on the poster paper, that some of the parent cards asked emphatically for “traditional math.” Instead, he added words that ultimately fostered the impression of parent requests for balance.

On this day, we were given the opportunity to walk around the room and add notes to other summaries if we thought something was missing. I heard some administrators question what parents or teachers meant by “basic math,” “traditional math” or “standard algorithm.” I wondered what we all had to say before our desire for Math-That-Is-Not-Reform was taken seriously.
When we returned to our tables, we (as in “Not Laurie”) could permanently add additional comments if we thought they were “needed.” At my table, all additional sticky notes were plucked back off.
“You’re removing what the parents told you,” I said to the offender. She was unmoved. “This is supposed to be through our eyes,” she said.

This is the District’s “summary” of what parents requested: “Parent support; student support; practice – a lot; resources for help; real-life or contextual problems; basic skills; balanced content – align with state standards/college readiness; balanced between skills and concepts (some procedural, some contextual, not overly emphasize technology); parent/home/on line resources (textbook); user-friendly with numerous examples, (cleaner, less cluttered appearance, consistent layout).”
The teacher "summary" is strikingly similar to the parent summary. Missing from both are words like “standard algorithm,” “direct instruction” and “traditional math,” even though some committee members added them after seeing them on the cards.
Two members even acknowledged to the ESD101 facilitators that respondents aren’t in sync on a “balanced” approach. That acknowledgment isn’t reflected in the final summaries.

The missing words also don’t show up in the pre-screen criteria. The word “balance” is there, however. Also there is “socially equitable/just for the broad scope of student experiences,” even though no parent, teacher or student feedback card asked for that. In the next article, I’ll tell you about the adoption committee’s pre-screen criteria, and how they shaped – and didn’t shape – the curricula choices that were made.


Please note: The information in this post is copyrighted. The proper citation is: Rogers, L. (December, 2009). "School district excludes feedback from parents, teachers." Retrieved (date) from the Betrayed Web site: http://betrayed-whyeducationisfailing.blogspot.com/

Sunday, December 13, 2009

School district excludes feedback from committee, students

Statement from Laurie Rogers on the Feedback (Part 1):
Spokane Public Schools is in the midst of replacing "Core-Plus Mathematics," its current high school mathematics curriculum. The adoption committee’s pre-screen criteria for a new curriculum are purportedly based on summaries of feedback from parents, students and teachers.
(Or so we were told by SPS administrators and two “facilitators” hired from Educational Service District 101).
Careful diligence was required, therefore, in the collection and summarizing of this feedback. Unfortunately, the processes were so poorly conducted as to render the District’s summaries of the feedback virtually worthless.
It’s too bad. Parents, students and teachers who offered their thoughts appear to care deeply about the issue. That their desires are so misrepresented by the District and ESD101 facilitators indicates an unprofessionalism and a lack of respect that I find appalling. But not surprising. Feedback from the adoption committee was misrepresented, too. Then it was tossed out.

District Throws Out Committee Feedback:
The curriculum adoption committee met five times from Sept. 29 to Dec. 3. Each time, we were to share ideas, preferences and concerns and then record summaries of our discussions on sticky notes and poster paper. We disagreed on various issues, so the poster papers reflected oppositional viewpoints.

In our Nov. 9 meeting, committee members were given a typed “perspective” of all of our written feedback to that date. That “perspective,” particularly a section called “Desired Outcome,” seems different from what I remember of the conversations. (Another committee member echoed this thought.)
Returning home on Nov. 9, I emailed Bridget Lewis, executive director of instructional programs, asking her to keep original artifacts handy. I received no reply.
On Nov. 12, I went to the District office and asked to see the artifacts. I was given Nov. 9 poster papers only. When asked for the others, Bridget Lewis and another staff member said they didn’t know where the other artifacts were. On Nov. 13, the staff member confirmed that committee feedback from September and October was “typed up and then tossed.” No apology or explanation was given.

Today, the District’s “perspective” on committee feedback doesn’t mention certain comments, words or phrases from some of the committee members. “Traditional math,” “direct instruction” and “standard algorithm,” for example, aren’t there.
One oppositional viewpoint was pretty much eliminated.

Student Feedback Is Ignored, Excluded:
This fall, SPS asked middle and high school students to share their desires for a high school math curriculum. More than 400 feedback cards were collected.

At the Nov. 9 adoption committee meeting, the ESD101 “facilitators” told members to write down categories for what we thought the students would want. Then the students’ feedback cards were spread out over three desks. We were divided into three groups and told to “silently” assign each card to one heading.
(Most of us didn’t have a chance to see more than our third of the cards. The headings were ours, chosen before we ever saw the cards. The cards contained multiple requests, yet each card was placed under a single heading. From the start, therefore, much of the student feedback was destined to be excluded.) The facilitators then asked for initial committee impressions of the student desires, and the resulting list said: “Good examples; Resources for help: glossary, Website, answers, toolkit; Easy to read and understand; Real-life content (how will I use?); Lots of practice; Technology.”
Committee members didn’t have another chance to look at the student cards, so this initial impression stood, as if it were some kind of proper analysis.

But I had promised my daughter that the student voice would be heard. On Nov. 12, I went to the District office and photocopied the student cards, took the copies home, and over a few days, categorized each student comment according to similar language.
My analysis isn’t an exact science, and it can be argued that, because the method of collecting the data was unscientific, any tabulated results are bogus. The question asked of the students was not standardized. The students were not given a survey with standard choices, explanations or definitions. I did not speak to the students nor have a chance to clarify their exact intent. Their comments came from their own lexicon and could have meant anything. It’s why I left the results in their own words.
Still, the student comments are consistent. The two most commonly requested items by far are “more examples” and variations of “I need explanations of how to do it.”
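For anyone curious about the mechanics, here is a minimal sketch, in Python, of the kind of keyword tally I did by hand. The sample comments and category keywords are hypothetical placeholders (my real categories came from the students' own wording on the cards), so treat this as an illustration of the method, not a reproduction of the data.

```python
# A minimal sketch of a keyword tally; the comments and keywords here
# are hypothetical placeholders, not the actual card data.
from collections import Counter

comments = [
    "more examples please",
    "explain how to do it step by step",
    "more examples in the book",
    "a glossary with answers in the back",
]

# Hypothetical categories; a card can request several things, so one
# comment may count toward more than one category.
categories = {
    "examples": ["example"],
    "explanations": ["explain", "how to do"],
    "resources": ["glossary", "answers", "website"],
}

tally = Counter()
for comment in comments:
    text = comment.lower()
    for category, keywords in categories.items():
        if any(keyword in text for keyword in keywords):
            tally[category] += 1

for category, count in tally.most_common():
    print(f"{category}: {count}")
```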

On Dec. 3, I gave my results to each member of the committee. When I asked for “explanations” to be added to the District’s “summary,” an ESD101 facilitator said it’s the same thing as “examples” and that I was “splitting hairs.” She didn’t add the word. I asked for “technology” to be removed, since very few students asked for that (29 did say they like their calculators). She refused to remove the word.
Later, I persisted with the two facilitators: “What’s the point of asking the students their views if you aren't going to write down what they said?” Finally, one of them agreed to add the word “explanations,” and she placed a tiny question mark next to “technology.”
The next morning, on Dec. 4, I received an email from Bridget Lewis, telling committee members how happy she was with our effort ... and by the way,

“One caution...when we requested this feedback from these three groups, we did not indicate to them that these comments would be public. This is the reason for only posting our summary of the perspective. Displaying individual card statements publicly would not be appropriate since we did not make that known at the time of the request for input.”

I pondered this email. In my table and summary on student feedback, I don’t have names, grades, classes, schools or programs. There is no identifying information. The table simply collects “like” comments and counts them. The original cards had been spread out on tables in a curriculum adoption meeting that was open to the public. Committee members had viewed the cards and/or openly discussed them in two public meetings. The cards had been taken to the District’s central office where they were viewed by more than one person and kept openly on at least one administrative desk. I was allowed to photocopy the student and parent feedback cards and also to take those photocopies home. Now, suddenly, this information is no longer public? Meanwhile, the District has published some of the students’ exact language.

Well, I am a rule follower, even if I think the rules were put in place solely to squelch debate, foster a predetermined viewpoint, or keep pertinent, critical information from seeing the light of day.
Following is my summary of the top student requests, in order, from most commonly cited to least. I presume that, where the District published exact student language, they did it “appropriately,” and so I used the same student language, placed in “quotes.” I paraphrased the rest.

Laurie Rogers’ Summary of the Top Student Requests

Students said they want:

  • More “examples”
  • “Explanations,” line by line, of how to do each skill
  • Helpful “resources” within the textbook structure, such as the meanings of words, “answers,” “glossary,” directory, lists of mathematical procedures, explanations of mathematical symbols
  • Clearer and simpler language, “easier to read and understand”
  • Classical math – the math schools used to teach, the math that will get them to college without remediation – with “equations, algorithms, formulas, theorems”
  • Useful “visuals”
  • Uncomplicated word problems; (or) No more word problems
  • Content that’s germane to them, to their life, to college and to their future
  • More time and opportunity to “practice” skills
  • Small, portable machines that will calculate for them
  • The paid adult in the classroom to actually show them how to do things
  • To be allowed to progress when they understand something
  • Help from a “Website”
  • To learn a skill before they’re told to use it
  • A textbook that isn’t so big and heavy
  • A book they can work in at home

In the next article, I’ll tell you what parents and teachers asked for, and what the District says they asked for. The parent and teacher requests, and the District’s summaries of their requests, are not the same.



Please note: The information in this post is copyrighted. The proper citation is:
Rogers, L. (December, 2009). "School district excludes feedback from committee, students." Retrieved (date) from the Betrayed Web site: http://betrayed-whyeducationisfailing.blogspot.com/

Sunday, November 29, 2009

The truth of Spokane's K-12 math program

Statement from Laurie Rogers:

Many people in Washington State aren't aware of just how unprepared our students are in mathematics, at nearly every grade, as they leave elementary school, as they enter high school, and when they graduate - IF they graduate. Statements keep coming at us from education administrators that supposedly point to improved scores, high rankings, increased enrollment in advanced classes, and a strong showing on college placement tests. This consistent misrepresentation of the situation has a dramatic impact on how they see the problem and what they think should be done.
On Nov. 25, in a guest editorial for the Spokesman-Review, for example, Washington State Superintendent Randy Dorn said of Washington: "We are one of just 24 states that have high school exit exams, which places us far ahead of more than half the nation." Also, "We consistently finish near the top on national tests, such as the National Assessment of Educational Progress, the SAT and the ACT."
But these statements are a reworking of the hard truth, as you'll see below.
Local administrators also work hard to give a rosy impression of achievement. In Spokane's Nov. 25 meeting of the high school math curriculum adoption committee, I was told NAEP scores went up "two grades," and I was shown scatter plots that supposedly indicate improved student achievement.

At the Dec. 3 meeting of the curriculum adoption committee, I gave members the following information:

************************************************

Quotations from the August 2009 issue of Spokane Public Schools “School Talk”

“Spokane Public Schools administrators have been working to establish and support a consistent mathematics program across the district…Classrooms have become places where students are highly engaged in the learning process.”
High school teacher: “I’m impressed by the students’ depth of understanding and their ability to communicate mathematical ideas.”
(But Shadle Park High School’s pass rate for the 2009 10th-grade math test was 47.4%.)
Elementary school teacher: “Kids are able to apply concepts seamlessly in different contexts. They are excited about math now.”
(But Ridgeview Elementary School’s pass rates were 62.1%, 56.3%, 58.2% and 43.5%).
Middle school teacher: “The curriculum does a good job of pushing kids to discover their own understanding, and it also allows time to practice skills and algorithms.”
(But Chase Middle School’s pass rates for the 2009 math WASL were 52.8% and 55.6%).
Elementary school teacher again: “We’re not throwing good practices away. We are melding them with the new things we know.”

Spokane students’ actual mathematical achievement paints a different picture from that perpetuated by the school district. Below is the truth of Spokane's (and Washington State's) K-12 mathematics program.

*************************************************************

Real Data for Spokane Public Schools’ Student Achievement:

The WASL:
In Spring 2009, just 42.3% of Spokane’s 10th graders passed the math portion of the WASL. The passing cut score reportedly sits at about 54%. The content has been estimated as comparable to about a 7th-grade level internationally.
Therefore, in Spring 2009, 57.7% of Spokane's 10th-grade students could not pass a math test that reportedly required only about 54% to pass and that is based on 7th- or 8th-grade content.

The NAEP:
In order to test as “proficient” in mathematics on the 2009 Mathematics NAEP, 4th-grade students needed to reach just 249 on a scale of 500. Sadly, 57% of Washington’s 4th-grade students couldn’t do it. The average score for Washington’s 4th-graders was 242 out of 500.
In order to test as “proficient” in mathematics on the 2009 Mathematics NAEP, 8th-grade students needed to reach just 299 on a scale of 500. Sadly, 61% of Washington’s 8th-grade students couldn’t do it. The average score for Washington’s 8th-graders was 289 out of 500.


Spokane's Advanced Placement Classes:

                           1992    2000    2008
Number of students          193     368    1093
Number of exams             271     636    2028
Number of course areas       13      15      27
Number of exams passed      198     517    1099
Percent passing             73%     81%     54%
Number of exams failed       73     121     929
Percent failing             27%     19%     46%
Average exam score (1-5)   3.18    3.45    2.72
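The percentages in this table follow directly from the raw counts; here is a quick sketch, using only the table's own numbers, that recomputes them:

```python
# Recompute the table's pass/fail percentages from the raw exam
# counts; the only inputs are the table's own numbers.
years = {
    1992: (271, 198),   # (exams taken, exams passed)
    2000: (636, 517),
    2008: (2028, 1099),
}
for year, (taken, passed) in years.items():
    failed = taken - passed
    print(f"{year}: {passed/taken:.0%} passed, {failed/taken:.0%} failed")
```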

The 2006 SAT:

In March 2007, former State Superintendent Terry Bergeson stated that for four years, Washington’s average SAT scores were the highest in the nation. But when I looked at SAT scores for 1995-2007, we were neither the highest nor the lowest. Finally, I realized what Dr. Bergeson meant: ranked by how many of its students take the SAT, Washington sits at the bottom of the top half of the states, and her claim counted only that top half.
In 2006, about 55% of Washington students took the SAT, while participation in other states ranged from as little as 3% to as much as 100%. Counting just those states with similar participation rates, Washington ranked first. Yet Washington scored lower on the 2006 SAT than 24 other states – and Dr. Bergeson still claimed it was highest in the nation.
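To see how a state can “lead the nation” while 24 states post higher averages, here is a minimal sketch of the ranking logic. The state names, participation rates and scores below are made up for illustration; only the more-than-half-participation filter comes from the claim itself.

```python
# Hypothetical data illustrating participation-conditioned rankings.
# State names, rates and scores are invented; only the >50% filter
# reflects the actual claim.
states = {
    "State A": {"participation": 0.04, "avg_score": 615},  # few, self-selected takers
    "State B": {"participation": 0.80, "avg_score": 505},
    "Washington": {"participation": 0.55, "avg_score": 527},
    "State C": {"participation": 0.65, "avg_score": 512},
}

# Ranked on raw averages, the low-participation state comes out on top.
overall = sorted(states, key=lambda s: states[s]["avg_score"], reverse=True)

# Limited to states where more than half of students took the test,
# "Washington" suddenly ranks first.
high_participation = sorted(
    (s for s in states if states[s]["participation"] > 0.5),
    key=lambda s: states[s]["avg_score"],
    reverse=True,
)

print("All states:", overall)
print("Participation > 50%:", high_participation)
```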

The 2009 SAT:
On Aug. 25, 2009, Washington State's Office of Superintendent of Public Instruction reported once again that “State SAT Scores Lead the Nation.” “For the seventh consecutive year,” an OSPI press release said, “Washington state SAT averages are the highest in the nation among states in which more than half of the eligible students took the tests … ”
It’s also important to note that 2009 SAT scores on each section ranged from 200 to 800. In mathematics, Washington students scored an average of 531 out of 800. Washington’s black students scored an average of 446 out of 800.

The ACT:
In August 2008, OSPI released ACT scores, saying that “for the fifth straight year,” Washington students scored “far above” the national average. Washington scored 23.1; the national average was 21.1.
However, the highest possible score on the ACT is 36. The benchmark score is 22. The benchmark is the “minimum score needed on an ACT subject-area test to indicate a 50% chance of obtaining a B or higher or about a 75% chance of obtaining a C or higher” in college algebra.
In 2009, Washington’s average math score slipped to 22.9 out of 36.

REMEDIATION:
(Updated with actual figures June 11, 2010): The remedial rate at Spokane Community College in 2008/2009 was 96.3%. The remedial rate at Spokane Falls Community College in 2008/2009 was 83.5%. Combined, the rate was 87.1%. Of the students who took remedial classes that year, most tested into Elementary Algebra or lower. Of the students who took remedial classes in 2008/2009, nearly 47% failed those classes or withdrew early.
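As a consistency check, the combined 87.1% implies how the remedial students were split between the two campuses. The 28% share below is my inference from these three numbers, not a reported figure:

```python
# Solve 96.3*w + 83.5*(1 - w) = 87.1 for w, the implied SCC share of
# the remedial students; this share is inferred, not reported.
scc_rate, sfcc_rate, combined = 96.3, 83.5, 87.1
w = (combined - sfcc_rate) / (scc_rate - sfcc_rate)
print(f"Implied SCC share: {w:.0%}")                    # about 28%
print(f"Check: {scc_rate*w + sfcc_rate*(1 - w):.1f}%")  # 87.1%
```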

FTE (Full-Time Equivalent enrollment):
FTE in Spokane Public Schools has dropped by about 2,500 students since 2002/2003. This is a net figure, not a gross figure; because incoming students partially offset departures, the number of students who actually left is likely higher than 2,500.
A Fall 2008 district survey of families who chose to leave the district showed that about 33% said the quality of the curriculum didn’t meet their expectations. Five of the six schools with the most out-of-district transfers were high schools. Five of the district’s seven middle schools were listed in the top 14. A whopping 79% of students who left went to the Mead School District, to online virtual options, or to the West Valley School District. (Private schools were not included in the survey.)

Academically-related reasons chosen for leaving:
33%: Quality of curriculum does not match your expectations
26%: District class sizes too large
21%: Desired coursework is not offered in the district
19%: Student is not on schedule to graduate
12%: Student is enrolled in a full-time non-district on-line program
12%: Student has not met the 10th grade WASL standards
6%: There is not space for student in a particular district program
DROPOUTS:
According to a recent Spokane Public Schools PowerPoint presentation ironically called “Becoming a World-Class System,” just “66% of students in SPS actually graduate from high school.”

*****************************************************************

And there you have it, folks. The truth of mathematics achievement in Spokane and Washington State. This is where public school administrators' almost complete dedication to reform math and constructivist teaching styles has brought us.
Don't let anyone tell you things are looking up relative to mathematics. In order for that to happen, administrators would have to modify their thinking.
In upcoming articles, I'll show you more evidence of the thinking in Spokane Public Schools relative to mathematics instruction.



Please note: The information in this post is copyrighted.

The proper citation is: Rogers, L. (December, 2009). "The truth of Spokane's K-12 math program." Retrieved (date) from the Betrayed Web site: http://betrayed-whyeducationisfailing.blogspot.com/

Wednesday, November 25, 2009

H.S. math curriculum adoption - their research

Statement from Laurie Rogers:

David Sousa’s “How the Brain Learns Mathematics” was the single resource given to each member of the Spokane Public Schools High School Math Curriculum Adoption Committee at an introductory meeting on Sept. 29, 2009. (I had already purchased my own copy.)
This book has serious issues, from start to finish. The fact that it’s being used as a resource for a high school math curriculum adoption is a sad commentary on the value Spokane administrators place on proper research.
Over several committee meetings, I have challenged the merits of Sousa’s book, and also the merits of an excerpt given to us later from another Sousa book: “How the Brain Learns.” On Nov. 9, I spoke to the entire committee about my concerns. After I finished speaking, Bridget Lewis, executive director of instructional programs, implied that I might have quoted the district superintendent out of context (I had not). She then moved the meeting forward. My concerns about the book were not discussed, nor was any further mention of them made.

*********************************************************

Text of comments made by Laurie Rogers, parent, to the Spokane Public Schools High School Math Curriculum Adoption Committee, on Nov. 9, 2009

(Stated comments might have differed very slightly.)

“You all know I have issues with the book this committee has been given (“How the Brain Learns Mathematics,” by David Sousa). I have been asking questions about it from the very first day.
It's why I sent out emails to professionals across the country, asking them, “If you were in this room, what would you want to see? What would you recommend I bring to this committee?” These professionals made several recommendations. Most are over there (on the table). I couldn’t bring everything, but I brought most of it (a suitcase-full), and most of it is also on the CD that I’m giving to each of you today.
The task of this committee is to choose a curriculum that’s aligned with the new state standards. A question was asked at the last school board meeting: “What is different about the new state math standards? What’s new that drives this curriculum process?” Something fundamental has changed; it’s the focus on procedural fluency – standard algorithms, basic arithmetic skills. Students must be procedurally fluent in the skills that will take them to college without remediation. The standards say it. The research says it. The National Mathematics Advisory Panel report says it clearly. Some of us have said it here. Children need to practice, they need arithmetic skills and they need standard algorithms, and they need to know those skills fluently.

I have four problems with this book and the excerpt we’ve been given from an earlier David Sousa book.


  1. The philosophy of these materials is oppositional to what we’ve been directed to do here.
  2. The argumentation in these materials is very weak. David Sousa does not make his case.
  3. Professionals in the field of cognitive science are not supportive of Sousa’s work.
  4. David Sousa appears to lay the blame for low student achievement at the feet of the teachers.
Let’s take these one by one.

  1. David Sousa’s contentions are that rote learning isn’t helpful and that memorizing the multiplication tables might actually “hinder” learning. He says memorization turns children into little calculators.
    This view – and his subsequent arguments – are not in line with the new state math standards or with the National Mathematics Advisory Panel Final Report. Both of those reports are on the CD I gave you.
  2. I’ve been reading this book from the view of a researcher, reporter, and someone with a master’s degree in communication. I expect this book – as I expect all research – to prove its point.
    Unfortunately, Sousa quotes himself as proof for his own assertions, he contradicts himself, he makes huge leaps in logic, and he draws illogical conclusions from the evidence he does provide. Additionally, his clear bias against anything traditional verges on the irrational. In my professional opinion, the result is a completely unsupported argument.
  3. Professionals who are actually in the field of cognitive science do not support David Sousa’s conclusions. After reading the first two chapters of this book, I wrote to professionals in the fields of cognitive science, mathematics, engineering and technology to ask if they had any thoughts on this material and what they would recommend for research. What they recommended is over there, and on the CD I gave you today.
    Dr. David Geary, member of the National Mathematics Advisory Panel, and the chief author of the panel’s report on Brain Science, advised me that making curriculum decisions based on brain science is “premature.”
    Dr. Dan Willingham, author of “Why Don't Students Like School?,” has written that he hopes educators will view with skepticism any “claims that instructional techniques and strategies are ‘proven’ because they are based on neuroscience.”
    Dr. John Hattie, author of "Visible Learning," reviewed some assertions made by Sousa relative to memorization and practicing of skills, and called them “nonsense.”
    Dr. Sandra Stotsky, also on the National Mathematics Advisory Panel, sent me 7 of her own copies of the NMAP Final Report so you could read them while you’re here.

  4. Sousa’s implication is that if you would just teach like these teachers, even using multiple inadequate models to teach simple arithmetic concepts, the students would learn what they need to learn. Ergo, if the students don’t learn, the problem must be the teachers. I disagree with his implication. I don’t blame teachers for the 42.3% pass rate on the WASL last spring or the 80% remediation rate in math at SFCC. I don't agree with Dr. Nancy Stowell (Spokane superintendent), who said in the Aug. 26 school board meeting that the district “has a real problem with quality teaching.” I think you care, and you do your best, and I respect your dedication. I know it’s a hard-knock life for teachers. I know you teach the curriculum you’re given, and then you’re blamed for the results. And I disagree with David Sousa. It matters what you teach. Content is critical.
We’ve heard it from Strategic Teaching, which assessed our old state standards. We’ve heard it from the National Mathematics Advisory Panel. We’ve heard it from teachers, parents, businesses, legislators. We’ve heard it through reports, data, and international studies. Content is king. Basic skills are critical. Sustained practice is necessary. This is intuitive. It makes sense. And – it’s in our new math standards.

I respectfully submit that this book won’t get us where we need to go so that we can help the students get where they need to go.”

***************************************************************

I asked district staff to hand out, while I was reading this statement, a CD I had made for committee members. The CD contained the long list of research I had compiled, most of the reports on that list, and the following list of quotations from David Sousa’s “How the Brain Learns Mathematics.” I could easily fill a manuscript with quotations from the book. I'm including the list here, along with my overall objections. It's hard to believe that educators would support and have faith in a book such as this.

A Few Quotations from David Sousa’s “How the Brain Learns Mathematics”

Page 4. “(The chapter) discusses why the brain views learning to multiply as an unnatural act, and it suggests some other ways to look at teaching multiplication that may be easier.”
Premise of statement not sufficiently proved.

Page 20. “Because of language differences, Asian children learn to count earlier and higher than their Western peers…How do we know the difference is due to language? Because children in the two countries show no age difference in their ability to count from 1 to 12.”
Not a logical argument. Not sufficiently proved.

Page 26. “They further propose that number sense is the missing component in the learning of early arithmetic facts, and explain the reason that rote drill and practice do not lead to significant improvement in mathematics ability.”
Premise of statement not sufficiently proved.

Page 33. “Why is learning multiplication so difficult, even for adults?”
Premise of statement not sufficiently proved.

Page 44. “If memorizing arithmetic tables is so difficult…”
Premise of statement not sufficiently proved.

Page 46. “Do the multiplication tables help or hinder? They can do both. … The idea here is to use the student’s innate sense of patterning to build a multiplication network without memorizing the tables themselves.”
Premise of statement not sufficiently proved.

Page 51: “You may recall from Chapter 1, however, that working memory’s capacity for digits can vary from one culture to another, depending on that culture’s linguistic and grammatic system for building number words.”
Premise of statement was never sufficiently proved.

Page 54: Sousa provides an example of a pizza cut into pieces in order to dismiss rote learning, but in reality, rote learning does not exclude visuals and explanations. Sousa misstates the "rote learning" process completely in order to come to his preferred conclusion.

Page 54: “Too often, students use rote rehearsal to memorize important mathematical terms and facts in a lesson, but are unable to use the information to solve problems. They will probably do fine on a true-false or fill-in-the-blank test. But they will find difficulty answering higher-order questions that require them to apply their knowledge to new situations, especially those that have more than one standard.”
Premise of statement not sufficiently proved. Huge leap in logic.

Page 55: Sousa says the working memory asks two questions to determine whether a memory is saved or not saved. 1. Does this make sense? 2. Does it have meaning? He says: “Of the two criteria, meaning has the greater impact on the probability that information will be stored.” He goes on to blame lack of meaning for a failure to store mathematical concepts. For support, Sousa references himself.
Premises not sufficiently proved. Memories also are saved through repetition, explaining why so many of us repeat our parents' mistakes. A failure in memory could be due to other factors, including a lack of sufficient practice. Referencing himself is like saying, “It’s true because I said so.” Sousa's work is an insufficient support for his own argument.

Page 56: “We have already noted that evolution did not prepare our brains for multiplication tables, complicated algorithms, fractions, or any other formal mathematical operation.” Sousa goes on to say children who memorize arithmetic tables and facts become “little calculators” who compute without understanding. “Furthermore, the language associated with solving a particular problem may itself interfere with the brain’s understanding of what it is being asked to compute.”
Premises of these statements not sufficiently proved. Obviously, evolution prepared our brains for these concepts, since we are capable of learning them. The memorizing of math facts does not turn children into little calculators. Failure in understanding could be simply a matter of not reading carefully. The last sentence actually supports a traditional approach, since reform language often confuses the children.

Page 58: “While we recognize the need for learners to remember some basic arithmetic facts, memorization should not be the main component of instruction…(Students who depend on memorization) see arithmetic solely as the memorization of mechanical recipes that have no practical application and no obvious meaning. Such a view can be discouraging, lead to failure, and set the stage for a lifelong distaste for mathematics.”
Premises of statements not sufficiently proved. The last sentence overreaches, and is ironic, considering the current situation.

Page 61: “In a learning episode, we tend to remember best that which comes first, and remember second best that which comes last. We remember least that which comes just past the middle of the episode. This common phenomenon is referred to as the primacy-recency effect…”
I agree with a portion of this - the Law of Primacy - however, this law supports a direct-teaching model, i.e., teach it properly the first time.

Page 62: “The old adage that ‘practice makes perfect’ is rarely true.”
Premise of statement not sufficiently proved.

Page 62: “It is very possible to practice the same skill repeatedly with no increase in achievement or accuracy of application.”
Statement is technically true, but it is not a sufficient argument for rejecting the practicing of skills. Sousa has already stated several times that practice helps with retention.

Page 127: In Chapter 3, Sousa talks at length about "practice" and how it helps with retention, however it is the way he insists the children should practice that is counterproductive. With young students, he says on page 127, "Limit the amount of material to practice," "Limit the amount of time to practice," "Determine the frequency of practice," and "Assess the accuracy of practice." These statements are arguably true to some degree, however Sousa goes on to warn: "When the practice period is short, students are more likely to be intent on learning what they are practicing. Keep in mind the 5- to 10-minute time limits of working memory for preadolescents..."

I could go on and on, finding more quotations with issues just like these, but you have the gist. Whichever statement Sousa makes, whichever research he cites, he turns it to favor a reform, discovery-based approach.
I’ll give you one more reference for this book.

Pages 176-177: Sousa provides two tables. One supposedly compares a “traditional classroom” against a “sense-making classroom.” The other compares “traditional tasks” against “rich tasks.” You can see the biased language.
Both tables set up a "straw man" argument, creating a caricature of a "traditional" approach and then knocking it down. The language continues to be completely biased throughout.
The tables make Sousa’s preferences crystal clear. He supports discovery, rejects “recollection and practice,” and supports muddying the learning process with multiple strategies and skills. He does this in contradiction of what he’s already asserted previously, and despite all of the scientific, peer-reviewed evidence available to him from around the country – very little of which he quotes or includes.
If Sousa’s tables were based in truth, Spokane Public Schools' K-12 mathematics achievement would be completely different from what it is.

The curriculum adoption committee also was given an excerpt from an earlier David Sousa book: "How the Brain Learns." In this excerpt, Sousa presents examples of teachers who offer students multiple "models" in order to teach simple math concepts. These "models" are inadequate, something the teachers in the examples acknowledge. Meanwhile, concise and efficient "traditional" models are passed up.
Sousa uses these examples as if they present good teaching, but in my tutoring, I have found it productive to choose one good model and stick with it. Switching models on students often confuses them. Therefore, I see Sousa's examples differently. To me, they show teachers deliberately choosing to use ineffective, inefficient, ultimately inadequate models instead of effective, efficient, and sufficient "traditional" counterparts.
Nowhere in this excerpt are data proving success or improvement.


****************************************************************

On Nov. 9, 2009, Bridget Lewis, Spokane's executive director of instructional programs, brought in other books and excerpts, such as an excerpt from “Best Practices.” The “Best Practices” excerpt criticizes typical American K-12 math instruction, saying it over-emphasizes rote learning. But this type of classroom no longer exists in Spokane and hasn’t for several years.

The materials brought to the committee by people in the district are heavily reform and/or constructivist – with the exception of a brief excerpt from the "National Mathematics Advisory Panel Final Report." This excerpt discusses the recommendations from the NMAP, noting the importance of procedural fluency, practicing of skills, and caution on calculators in the classroom. The NMAP excerpt was reviewed by a small subcommittee of our curriculum adoption committee. As of Nov. 30, 2009, Dr. Lewis’ materials (other than the books) are posted on the school district Web site, but the NMAP excerpt and its subcommittee review are not there.


(Update: At some point between Dec. 1 and Dec. 3, 2009, the NMAP excerpt and its committee review were finally added to the Spokane Public Schools Web site.)


Please note: The information in this post is copyrighted. The proper citation is:
Rogers, L. (November, 2009). "H.S. math curriculum adoption - their research." Retrieved (date) from the Betrayed Web site:
http://betrayed-whyeducationisfailing.blogspot.com/




Monday, November 23, 2009

Research: Traditional vs. Reform

Statement from Laurie Rogers:

In the Fall of 2009, I applied to be and was accepted as a parent member of Spokane's latest high school math curriculum adoption committee. All committee members were given a single book as a resource for decision-making. Concerned about the heavily reform-oriented, constructivist, and poorly argued nature of that single book, I compiled a list of research suggested to me by generous math/science/research/education professionals across the country.

Below is a list of some of those suggestions, including a small portion of the research, data and commentary I have compiled over 3 years of research into K-12 public education. I offered this complete list, and much of the actual research, on a CD to each committee member.

If you have other research or information to offer to this curriculum adoption committee, please write to me at wlroge@comcast.net.

******************************

Data, Research Papers:

“A Brief History of American K-12 Mathematics Education in the 20th Century” (2003)
Dr. David Klein, California State Northridge
“(NCTM math programs of the 1990s) typically failed to develop fundamental arithmetic and algebra skills. Elementary school programs encouraged students to invent their own arithmetic algorithms, while discouraging the use of the superior standard algorithms for addition, subtraction, multiplication, and division. Calculator use was encouraged to excess, and in some cases calculators were even incorporated into kindergarten lesson plans. Student discovery group work was the preferred mode of learning, sometimes exclusively, and the guidelines for discovery projects were at best inefficient and often aimless. Topics from statistics and data analysis were redundant from one grade level to the next, and were overemphasized. Arithmetic and algebra were radically de-emphasized. Mathematical definitions and proofs for the higher grades were generally deficient, missing entirely, or even incorrect.” http://www.csun.edu/~vcmth00m/AHistory.html

Achievement Effects of Four Early Elementary School Math Curricula: Findings from First Graders in 39 Schools (2009)
Roberto Agodini, et al.
“Eight of the fifteen subgroup analyses found statistically significant differences in student math achievement between curricula. The significant curriculum differences ranged from 0.28 to 0.71 standard deviations, and all of the significant differences favored Math Expressions or Saxon over Investigations or SFAW. There were no subgroups for which Investigations or SFAW showed a statistically significant advantage.”
http://ies.ed.gov/ncee/pubs/20094052/pdf/20094052.pdf

A Close Examination of Jo Boaler’s Railside Report (undated)
Wayne Bishop, Cal. State; Paul Clopton, VAMC; R. James Milgram, Stanford
“This study makes extremely strong claims for discovery style instruction in mathematics, and consequently has the potential to affect instruction and curriculum throughout the country.
… Prof. Boaler has refused to divulge the identities of the schools to qualified researchers. Consequently, it would normally be impossible to independently check her work. However, in this case, the names of the schools were determined and a close examination of the actual outcomes in these schools shows that Prof. Boaler’s claims are grossly exaggerated and do not translate into success for her treatment students.”
ftp://math.stanford.edu/pub/papers/milgram/combined-evaluations-version3.pdf

A Review of Four High-School Mathematics Programs (Holt; Discovering Algebra; Core-Plus; Glencoe McGraw-Hill), March 2009
Guershon Harel, University of California, San Diego
“As can be seen in the chart below, none of the programs was found mathematically sound on the first two criteria. The (checkmark) in Holt on these criteria in geometry is better characterized as the least mathematically unsound.”
http://www.math.jhu.edu/~wsw/ED/harelfinal.pdf

A Study of Core-Plus Students Attending Michigan State University (2006)
Richard O. Hill and Thomas H. Parker
“As the implementation progressed, from 1996 to 1999, Core-Plus students placed into, and enrolled in, increasingly lower level courses; this downward trend is statistically robust .... The percentages of students who (eventually) passed a technical calculus course show a statistically significant ... decline averaging 27 percent a year; this trend is accompanied by an obvious and statistically significant increase in percentages of students who placed into low-level and remedial algebra courses. The grades the Core-Plus students earned in their university mathematics courses are also below average, except for a small group of top students.”
http://www.mth.msu.edu/~hill/HillParker5.pdf

“Brain-Based” Learning: More Fiction than Fact (2006)
Daniel T. Willingham, professor of cognitive psychology, University of Virginia
“… I hope educators will approach claims that instructional techniques and strategies are ‘proven’ because they are based on neuroscience with a healthy dose of skepticism. Cognitive and educational studies are the best sources for educators looking to improve their students’ cognitive and educational outcomes.”
http://www.aft.org/pubs-reports/american_educator/issues/fall2006/cogsci.htm

Critical Thinking: Why is it So Hard to Teach? (2007)
Daniel T. Willingham, professor of cognitive psychology, University of Virginia
“As the main article explains, the ability to think critically depends on having adequate content knowledge; you can’t think critically about topics you know little about or solve problems that you don’t know well enough to recognize and execute the type of solutions they call for.”
http://www.aft.org/pubs-reports/american_educator/issues/summer07/Crit_Thinking.pdf

Direct Instruction Mathematics: A Longitudinal Evaluation of Low-Income Elementary School Students (1984)
Russell Gersten, University of Oregon; Doug Carnine, University of Oregon
“…low-income primary-grade students who received the full 3- or 4-year Direct Instruction mathematics program tended to perform significantly better in all mathematic subtests of the Metropolitan Achievement Test than students who received other approaches, whether experimental or traditional. Direct Instruction Follow Through students achieved at a level much higher than is typical for students with similar demographic characteristics…”
The Elementary School Journal, Vol. 84, No. 4 (Mar. 1984), pp. 395-407.

Direct Instruction Mathematics Programs: An Overview and Research Summary (2004)
Angela M. Przychodzin, et al., Eastern Washington University
“In all, 12 studies published since 1990 were found using DI math programs. The majority (11 out of 12) of these found DI math programs to be effective in improving math skills in a variety of settings with a variety of students.”
Journal of Direct Instruction, Vol. 4, No. 1, pp. 53-84, Winter 2004.

District 81’s Main and Supplementary Mathematics Materials (April 2009)
An 11-page district document listing (as of April 2009) the math curricula used in Spokane Public Schools, along with the approved supplementary math materials.


Don’t Forget Curriculum (October 2009)
Grover “Russ” Whitehurst, Governance Studies, Brown Center Letters on Education, Brookings
“Anyone interested in ‘doing what works for the kids’ should pay attention to this table. …Curriculum effects are large compared to most popular policy levers.”
http://www.brookings.edu/~/media/Files/rc/papers/2009/1014_curriculum_whitehurst/1014_curriculum_whitehurst.pdf

Educating the Evolved Mind: Conceptual Foundations for an Evolutionary Educational Psychology (2007) (Draft)
Geary, D. C. (2007). In J. S. Carlson & J. R. Levin (Eds.), Educating the evolved mind (pp. 1-99, Vol. 2, Psychological perspectives on contemporary educational issues).
“(Schools) are thus often used for purposes that have more to do with the best interests of those attempting to influence this socialization than the best educational interests of children. In fact, the history of education in the United States might be viewed as being more strongly driven by ideology and untested assumptions about children’s learning than by concerns about the efficacy of schooling vis-à-vis the long-term social and employment interests of children …. These ideological debates and the attendant opportunity costs to children’s educational outcomes and later employment opportunities will continue well into the twenty-first century, if current attempts to move the field of education to a more solid scientific footing are not successful ….” (Portions provided to committee, with permission.)

High School Mathematics Curriculum Study, March 2009
Prepared by Linda Plattner, Strategic Teaching
“…none of the reviewed programs were completely satisfactory. Holt was the strongest of the four, meaning the mathematics is not compromised in any of the three topics examined. Discovering was the weakest with all three areas considered inadequate. … The good news is that there are other programs that match well to Washington’s standards.”
http://www.strategicteaching.com/washington_state_standards_.html


How Educational Theories Can Use Neuroscientific Data (2007)
Daniel T. Willingham, Department of Psychology, University of Virginia; and John W. Lloyd, Curry School of Education, University of Virginia
“…most (scholarly treatments of neuroscience in education) … argue that neuroscience has been and will continue to be helpful to education … but they argue that data from neuroscience must be funneled through a behavioral level of analysis … or that neuroscience should be part of a broader approach to research in education, not the sole savior…”
http://www.danielwillingham.com/WillinghamLloyd2007.pdf

Independent Study of Washington State - K-8 Curriculum Review, November 2008
Prepared by Linda Plattner, Strategic Teaching
“ST reviewed Bridges in Mathematics, Investigations, Math Connects, and Math Expressions for elementary school. Holt Mathematics, Math Connects, Math Thematics, and Prentice Hall Mathematics were reviewed at the middle school level. These are OSPI’s highest-scoring programs. Other programs, such as the Connected Math Project that is widely used in Washington schools, were not reviewed because they did not meet OSPI’s minimum threshold for content.”
http://www.strategicteaching.com/washington_state_standards_.html


K-12 Calculator Usage and College Grades (2004)
W. Stephen Wilson and Daniel Q. Naiman, Johns Hopkins University
Educational Studies in Mathematics, 56:119-122, 2004.
“We find that students in the big mathematics service courses at the Johns Hopkins University who were encouraged to use calculators in K-12 have somewhat lower grades than those who weren’t.”
http://www.math.jhu.edu/~wsw/ED/pubver.pdf


Outcomes Analysis for Core Plus Students At Andover High School: One Year Later
R. James Milgram, Department of Mathematics, Stanford University
“…Andover High School scores are considerably above the state and national means in keeping with Andover’s position as one of the top high schools in the country. However, as was indicated above, both English and reading got stronger against these measures by about 6 percentile points. By comparison, in the final two years of the data, when the effects of the Core Plus mathematics program kicked in, the mathematics scores dropped against these measures by six percentile points.”
http://www.math.wayne.edu/~greg/milgram.htm


Performance Indicators in Math: Implications for Brief Experimental Analysis of Academic Performance (February 2009)
Amanda M. VanDerHeyden, Education Research and Consulting, Inc, Fairhope, AL;
Matthew K. Burns, University of Minnesota, Minneapolis, MN
Journal of Behavioral Education, 18:71–91, 2009.
“Specifically, children who did not master early skills failed to reach the mastery criterion following intervention for future related skills at much higher rates and earned lower scores on all remaining intervention skills relative to peers who attained the mastery criterion early in the sequence of tasks. … The correlation between fluent computation and faster subsequent learning of related skills is indeed promising, but future studies must examine the degree to which fluent sub-skill computation causes faster learning on more complex related problems. Contemporary research in mathematics seems to indicate a need for explicit teaching of procedural rules to solve both computation and applied problems as well as specific training to apply or generalize that knowledge over time and across varying stimuli (Fuchs and Fuchs 2001; Fuchs et al. 2003; Kameenui and Griffin 1989).”


Practice Makes Perfect--But Only If You Practice Beyond the Point of Perfection (2004)
Daniel T. Willingham, professor of cognitive psychology and neuroscience, University of Virginia
“That students would benefit from practice might be deemed unsurprising. … The unexpected finding from cognitive science is that practice does not make perfect. Practice until you are perfect and you will be perfect only briefly. What’s necessary is sustained practice. By sustained practice I mean regular, ongoing review or use of the target material (e.g., regularly using new calculating skills to solve increasingly more complex math problems, reflecting on recently-learned historical material as one studies a subsequent history unit, taking regular quizzes or tests that draw on material learned earlier in the year). This kind of practice past the point of mastery is necessary to meet any of these three important goals of instruction: acquiring facts and knowledge, learning skills, or becoming an expert.”
http://www.aft.org/pubs-reports/american_educator/spring2004/cogsci.html

Reflections of Evolution and Culture in Children’s Cognition: Implications for Mathematical Development and Instruction (1995)
David C. Geary, University of Missouri at Columbia
American Psychologist, Vol. 50, No. 1, 24-37, 1995
“The basic assumptions that guide constructivist-based instruction appear to be well suited for the acquisition of biologically primary mathematical abilities, such as number and counting. However, constructivist philosophers and researchers fail to distinguish between biologically primary and biologically secondary mathematical abilities and, as a result, treat all of mathematics as if it were a biologically primary domain. That is, given an appropriate social context and materials, children will be motivated and able to construct mathematical knowledge for themselves in all areas. The adoption of these assumptions and the associated instructional techniques appear to reflect wider cultural values and only weakly follow empirical and theoretical work in contemporary developmental and cognitive psychology, much less a consideration of evolutionary issues.”


Reform vs. Traditional Math Curricula: Preliminary report on a survey of the graduating classes of 1997 of Andover High School and Lahser High School, Bloomfield Hills, Michigan, concerning their high school math programs and how well these programs prepared them for college math (1998), and update (1999)
Gregory F. Bachelis, Ph.D., Professor of Mathematics, Wayne State University, Detroit
“The other matter I would like to comment on is the performance of Core-Plus graduates on the placement tests at UMAA, MSU, as well as other colleges. A lot of them complained that they did not do well because of their lack of knowledge of basic algebra, and some said they did not do well even in the courses they were placed into. Now it is all well and good to say that people are just having a bad day when they do poorly on a placement test, but as someone who has taught remedial algebra for more years than I care to remember, let me assure you that there is a big difference between learning basic algebra and then forgetting most or all of it, and never having learned it at all. Core-Plus appears to have created a new category of students who land in remedial math courses - courses that were not designed with such students in mind.”
http://www.math.wayne.edu/~greg/original.htm
http://www.math.wayne.edu/~greg/update.htm


“Report Cards” for middle and high schools in Spokane Public Schools (2009)
http://reportcard.ospi.k12.wa.us/

Washington State High School Math Text Review (March 2009)
W. Stephen Wilson, Johns Hopkins University
“Geometry is important, so the unacceptable nature of geometry in Discovering and Core Plus makes these programs unacceptable. The flaws in these geometry programs are such that they could not easily be compensated for by a teacher, even with the help of supplementation.”
http://www.math.jhu.edu/~wsw/ED/wahighschoolwsw.pdf


Washington State Mathematics Standards Review and Recommendations (2007)
Linda Plattner, Strategic Teaching
“Simply put, Washington is not focused enough on the important fundamental content topics in mathematics. This is shown in the early grades in which Washington standards do not ensure that students learn the critical algorithms of arithmetic and continues throughout the standards until it ends in secondary school with minimal expectations that are missing most of the algebra, geometry, and trigonometry found in other places.”
http://www.strategicteaching.com/review_wa_standards_8-30-07.pdf


What Is Developmentally Appropriate Practice? (Summer 2008)
Daniel T. Willingham, professor of cognitive psychology and neuroscience, University of Virginia
“Unfortunately, Piaget’s theory is not right. He is credited with brilliant insights and many of his observations hold true—for example, kindergartners do have some egocentrism and 9-year-olds do have some trouble with highly abstract concepts. Nonetheless, recent research indicates that development does not proceed in stages after all.”
http://www.aft.org/pubs-reports/american_educator/issues/summer08/willingham.pdf


What Works Clearinghouse – Investigations in Number, Data, and Space (2009)
“No studies of Investigations in Number, Data, and Space® that fall within the scope of the Elementary School Math review protocol meet What Works Clearinghouse (WWC) evidence standards. The lack of studies meeting WWC evidence standards means that, at this time, the WWC is unable to draw any conclusions based on research about the effectiveness or ineffectiveness of Investigations in Number, Data, and Space®.”
http://ies.ed.gov/ncee/wwc/pdf/wwc_investigations_022409.pdf


What Works Clearinghouse – Scott Foresman-Addison Wesley Elementary Mathematics (2006)
“Scott Foresman–Addison Wesley Elementary Mathematics was found to have no discernible effects on students’ math achievement. Improvement index Average: –2 percentile points; Range: –7 to +3 percentile points”
http://www2.ednet10.net/SpecialEducation/documents/WWCScottForesmanWesley.pdf


What Works Clearinghouse – Connected Mathematics Project (2007)
“The CMP curriculum was found to have mixed effects on math achievement. Improvement index Average: –2 percentile points; Range: –12 to +11 percentile points”
http://ies.ed.gov/ncee/wwc/pdf/WWC_CMP_040907.pdf

Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching (2006)
Paul A. Kirschner, Educational Technology Expertise Center, Open University of the Netherlands, Research Centre Learning in Interaction, Utrecht University, The Netherlands; John Sweller, School of Education, University of New South Wales; Richard E. Clark, Rossier School of Education, University of Southern California
“Although unguided or minimally guided instructional approaches are very popular and intuitively appealing, the point is made that these approaches ignore both the structures that constitute human cognitive architecture and evidence from empirical studies over the past half-century that consistently indicate that minimally guided instruction is less effective and less efficient than instructional approaches that place a strong emphasis on guidance of the student learning process.”
Educational Psychologist, 41(2), 75–86, 2006


Books:


Angry Parents: Failing Schools: What’s Wrong with the Public Schools and What You Can Do About It (2000)
Elaine K. McEwan

Betrayed: How the Education Establishment Has Betrayed America and What You Can Do About It (2011)
Laurie H. Rogers


Conspiracy of Ignorance: The Failure of American Public Schools (1999)
Martin L. Gross


Crazy Like a Fox: One Principal’s Triumph in the Inner City (2009)
Ben Chavis, former principal of American Indian Charter School, California


The Mad, Mad World of Textbook Adoption (2004)
The Thomas B. Fordham Institute

Implementation and Child Effects of Teaching Practices in Follow Through Classrooms (Monographs of the Society for Research in Child Development, 1975)
Jane Stallings, Stanford Research Institute


Outcome-Based Education: Understanding the Truth About Education Reform (1994)
Ron Sunseri

The Schools We Need and Why We Don’t Have Them (1999)
E.D. Hirsch, Jr.

Visible Learning: A synthesis of over 800 meta-analyses relating to achievement (2008)
John Hattie

What’s At Stake in the K-12 Standards Wars (2000)
Edited by Sandra Stotsky

Why Don't Students Like School?: A Cognitive Scientist Answers Questions About How the Mind Works and What It Means for the Classroom (2009)
Daniel T. Willingham, associate professor, University of Virginia


Reports:


National Mathematics Advisory Panel Final Report (2008)
“During most of the 20th century, the United States possessed peerless mathematical prowess—not just as measured by the depth and number of the mathematical specialists who practiced here but also by the scale and quality of its engineering, science, and financial leadership, and even by the extent of mathematical education in its broad population…
“This Panel, diverse in experience, expertise, and philosophy, agrees broadly that the (current) delivery system in mathematics education—the system that translates mathematical knowledge into value and ability for the next generation—is broken and must be fixed.”
http://www.ed.gov/about/bdscomm/list/mathpanel/report/final-report.pdf


Chapter 2: Report of the Subcommittee on Standards of Evidence
Valerie F. Reyna, Chair; Camilla Persson Benbow; A. Wade Boykin; Grover J. “Russ” Whitehurst, Ex Officio; Tyrrell Flawn
“The Panel’s systematic reviews have yielded hundreds of studies on important topics, but only a small proportion of those studies have met methodological standards. … Many studies rely on self-report, introspection about what has been learned or about learning processes, and open-ended interviewing techniques, despite well-known limitations of such methods ….”
http://www.ed.gov/about/bdscomm/list/mathpanel/report/standards-of-evidence.pdf

Chapter 3: Report of the Task Group on Conceptual Knowledge and Skills
Francis “Skip” Fennell, Chair; Larry R. Faulkner; Liping Ma; Wilfried Schmid; Sandra Stotsky; Hung-Hsi Wu; Tyrrell Flawn
“Proficiency with whole numbers, fractions, and particular aspects of geometry and measurement are the Critical Foundation of Algebra. Emphasis on these essential concepts and skills must be provided at the elementary- and middle-grade levels. The coherence and sequential nature of mathematics dictate the foundational skills that are necessary for the learning of algebra. By the nature of algebra, the most important foundational skill is proficiency with fractions (including decimals, percent, and negative fractions). The teaching of fractions must be acknowledged as critically important and improved before an increase in student achievement in Algebra can be expected.”
http://www.ed.gov/about/bdscomm/list/mathpanel/report/conceptual-knowledge.pdf


Chapter 4: Report of the Task Group on Learning Processes
David C. Geary, Chair; A. Wade Boykin; Susan Embretson; Valerie Reyna; Robert Siegler; Daniel B. Berch, Ex Officio; Jennifer Graban
“Anxiety is an emotional reaction that is related to low math achievement, failure to enroll in advanced mathematics courses, and poor scores on standardized tests of math achievement. Math anxiety creates a focus of limited working memory on managing anxiety reaction rather than on solving the math problem, but it can be reduced by therapeutic interventions. … The mastery of whole number arithmetic is a critical step in children’s mathematical education. The road to mastery involves learning arithmetic facts, algorithms, and concepts. The quick and efficient solving of simple arithmetic problems is achieved when children retrieve answers from long-term memory or retrieve related information that allows them to quickly reconstruct the answer. Retention of these facts requires repeated practice.”
http://www.ed.gov/about/bdscomm/list/mathpanel/report/learning-processes.pdf


Chapter 6: Report of the Task Group on Instructional Practices
Russell Gersten, Co-Chair; Joan Ferrini-Mundy, Ex Officio, Co-Chair; Camilla Benbow; Douglas H. Clements; Tom Loveless; Vern Williams; Irma Arispe, Ex Officio; Marian Banfield
“The studies presented a mixed and inconclusive picture of the relative impact of these two forms of instruction. High-quality research does not support the contention that instruction should be either entirely “child centered” or “teacher directed.” Research indicates that some forms of particular instructional practices can have a positive impact under specified conditions. All-encompassing recommendations that instruction should be entirely “child centered” or “teacher directed” are not supported by research.”
http://www.ed.gov/about/bdscomm/list/mathpanel/report/instructional-practices.pdf

Chapter 7: Report of the Subcommittee on Instructional Materials
Robert S. Siegler, Chair; Bert Fristedt; Vern Williams; Irma Arispe; Daniel B. Berch; Marian Banfield
“When mathematicians have reviewed already published middle and high school textbooks, however, they have identified a nontrivial number of errors, and a large number of ambiguous and confusing statements and problems. Many of these errors and ambiguities arise on word problems that are intended to elicit use of the mathematical concepts and procedures in real-world contexts. The Subcommittee recommends that publishers obtain reviews from mathematicians prior to publication, so that these errors and ambiguities can be identified and corrected. …Having mathematicians also read textbooks in the formative stages may increase the coherence of the presentation of mathematics between earlier and later grades.”
http://www.ed.gov/about/bdscomm/list/mathpanel/report/instructional-materials.pdf


Chapter 9: Subcommittee on the National Survey of Algebra I Teachers
Tom Loveless, Chair; Francis “Skip” Fennell; Vern Williams; Deborah Loewenberg Ball; Marian Banfield, U.S. Department of Education Staff
“The most frequent type of suggestion among the 578 respondents was a greater focus in primary education placed on mastery of basic mathematical concepts and skills… A substantial number of teachers considered mixed-ability groupings to be a “moderate” (28%) or “serious” (23%) problem… The responses indicate that about 28% of the algebra teachers felt family participation is a serious problem and another 32% believed lack of family participation is a moderate problem…”
http://www.ed.gov/about/bdscomm/list/mathpanel/report/nsat.pdf


The Condition of College Readiness 2009 – ACT
“About 67% of all ACT-tested high school graduates met the English College Readiness Benchmark in 2009. Just under 1 in 4 (23%) met all four College Readiness Benchmarks. In 2009, 53% of graduates met the Reading Benchmark, while 42% met the Mathematics Benchmark. Over 1 in 4 (28%) met the College Readiness Benchmark in Science.”
http://www.act.org/research/policymakers/pdf/TheConditionofCollegeReadiness.pdf


Comparative Indicators of Education in the United States and other G-8 Countries: 2009
“For example, the advanced benchmark (the highest TIMSS benchmark) was reached by 26 percent of Japan’s eighth-graders in mathematics … In the United States, 6 percent of eighth-graders reached the advanced benchmark …
“In Japan, 61 percent of eighth-graders reached the high benchmark in mathematics… In the United States, 31 percent of eighth-graders reached the high benchmark...”
http://nces.ed.gov/pubs2009/2009039.pdf


Mathematics 2009 - National Assessment of Educational Progress at Grades 4 and 8
National Center for Education Statistics (2009). The Nation’s Report Card: Mathematics 2009 (NCES 2010–451). Institute of Education Sciences, U.S. Department of Education, Washington, D.C.
“Gains in students’ average mathematics scores seen in earlier years did not continue from 2007 to 2009 at grade 4 but did continue at grade 8 … While still higher than the scores in the six assessment years from 1990 to 2005, the overall average score for fourth-graders in 2009 was unchanged from the score in 2007. The upward trend seen in earlier assessments for eighth-graders continued with a 2-point increase from 2007 to 2009.”
http://nces.ed.gov/nationsreportcard/pdf/main2009/2010451.pdf


Nation's Report Card: America's High School Graduates: Results From The 2005 NAEP High School Transcript Study
“How can increasing numbers of students be taking more credits and more rigorous curricula without increased performance on the Nation’s Report Card?”
http://nces.ed.gov/nationsreportcard/pdf/studies/2007467.pdf


Project Follow Through: In-depth and Beyond (1996)
Gary Adams, Educational Achievement Systems, Seattle
“Only the Direct Instruction model had positive scores on all three types of outcomes (Basic Skills, Cognitive, and Affective). Overall, the Direct Instruction model was highest on all three types of measures. … The Affective Models had the worst affective ranks (6.7 compared to 2.7 for the Basic Skills models).”
http://www.uoregon.edu/~adiep/ft/adams.htm

Sponsor Findings From Project Follow Through
Wesley C. Becker and Siegfried Engelmann, University of Oregon
“The closest rival to the Direct Instruction Model in overall effects was another behaviorally-based program, the University of Kansas Behavior Analysis Model. Child-centered, cognitively focused, and open classroom approaches tended to perform poorly on all measures of academic progress.”
http://www.uoregon.edu/~adiep/ft/becker.htm


Overview: The Story Behind Project Follow Through
Bonnie Grossen, Editor
“The only model that brought children close to the 50th percentile in all subject areas was the Direct Instruction model. …The most popular models were not only unable to demonstrate many positive effects; most of them produced a large number of negative effects. …
“Yet 10 short years later, the models that achieved the worst results, even negative results, are the ones that are, in fact, becoming legislated policy in many states, under new names. … Every educator in the country should know that in the history of education, no educational model has ever been documented to achieve such positive results with such consistency across so many variable sites as Direct Instruction. It never happened before FT, and it hasn't happened since. … Not enough people know this.”
http://www.uoregon.edu/~adiep/ft/grossen.htm


A Constructive Look at Follow Through Results (1981)
Carl Bereiter, Ontario Institute for Studies in Education, and Midian Kurland, University of Illinois at Urbana-Champaign
“Thus we have, if we wish it, a battle of the philosophies, with the child-centered philosophy coming out the loser on measured achievement, as it has in a number of other experiments … Consistently it is the more direct methods, involving clear specifications of objectives, clear explanations, clear corrections of wrong responses, and a great deal of ‘time on task,’ that are associated with superior achievement test performance. The effects tend to be strongest with disadvantaged children.”
http://www.uoregon.edu/~adiep/ft/bereiter.htm


Follow Through: Why Didn't We?
Cathy L. Watkins, California State University, Stanislaus
“The Joint Dissemination Review Panel and the National Diffusion Network were created to validate and disseminate effective educational programs. In 1977, Follow Through sponsors submitted programs to the JDRP. ‘Effectiveness’ was, however, broadly interpreted. For example, according to the JDRP, the positive impact of a program need not be directly related to academic achievement. In addition, a program could be judged effective if it had a positive impact on individuals other than students. As a result, programs that had failed to improve academic achievement in Follow Through were rated as ‘exemplary and effective.’ And, once a program was validated, it was packaged and disseminated to schools through the National Diffusion Network…. The JDRP apparently felt that to be ‘fair’ it had to represent the multiplicity of methods in education. Not only did this practice make it virtually impossible for school districts to distinguish between effective and ineffective programs, it defeated the very purpose for which the JDRP and NDN were established.”
http://www.uoregon.edu/~adiep/ft/watkins.htm

Honest follow-through needed on this project (1998)
By Marian Kester Coombs, special to The Washington Times
“After the Harvard article appeared, all the test models were recommended equally for dissemination to the school districts, and by 1982, the least-effective models were receiving higher levels of funding than the most effective ones, in an apparent effort to equalize results. …
“The president of the National Council of Teachers of Mathematics, Gail Burrill, was asked about Project Follow Through in a recent interview and responded, "I have never heard of it" … Mr. Adams shakes his head. "The most puzzling thing is how the very models like whole language and discovery learning that the data showed to be ineffective and even harmful are still being pushed. Parents should be asking, 'Where is the proof these programs work?'”
http://www.mathematicallycorrect.com/honestft.htm

Our Failure To Follow Through (1994)
Billy Tashman, New York Newsday
“In fact, the federal oversight panel for Follow Through cut the Direct Instruction program even as it continued other models that were spectacular flops. Eschewing basic skills, the failed programs tried to teach kids how to learn on their own, or tried to raise students' self-esteem (both categories, by the way, in which Direct Instruction students excelled).”
http://www.uoregon.edu/~adiep/ft/tashman.htm


Documents, Letters, Commentary


An Open Letter to United States Secretary of Education Richard Riley (1999)
Dr. David Klein, et al.
“It is not likely that the mainstream views of practicing mathematicians and scientists were shared by those who designed the criteria for selection of "exemplary" and "promising" mathematics curricula. … In an article entitled, "It's Time To Abandon Computational Algorithms," published February 9, 1994, in Education Week on the Web, (Steven Leinwand) wrote: ‘It's time to recognize that, for many students, real mathematical power, on the one hand, and facility with multidigit, pencil-and-paper computational algorithms, on the other, are mutually exclusive. In fact, it's time to acknowledge that continuing to teach these skills to our students is not only unnecessary, but counterproductive and downright dangerous.’
“… Even before the endorsements by the Department of Education were announced, mathematicians and scientists from leading universities had already expressed opposition to several of the programs listed above and had pointed out serious mathematical shortcomings in them.”
http://www.mathematicallycorrect.com/riley.htm


Email from Dr. David C. Geary, Curators’ Professor, Thomas Jefferson Professor, Department of Psychological Sciences, University of Missouri (2009)
“The National Mathematics Advisory Panel explicitly reviewed the brain imaging literature on math processing as related to instructional practices and concluded that any instructional claims based on brain sciences is premature.”


Email series from three members of Where’s the Math? Advocacy Group (2009)
“…the reason for so much failure in Algebra I is because many students simply lack most of the necessary skills in order to succeed. When you tell administrators this they say ‘quit giving me excuses.’ It’s kind of like being asked to teach kids to play water polo when they don’t know how to swim. When your water polo team does lousy the administration asks you why these kids are so bad… You answer because they don’t know how to swim… Even worse, they won’t let you teach them how to swim… In fact they are claiming that you can just teach them to swim while they are playing the game… That is exactly what is happening in math education and the result is that many kids are figuratively ‘drowning’ in this system.”

How to Improve National Math Scores – New York Times (2009)
Bruce Fuller, professor of education and public policy; Lance T. Izumi, Pacific Research Institute; Holly Tsakiris Horrigan, parent; Richard Bisk, math professor; Barry Garelick, U.S. Coalition for World Class Math
“To give students a firm foundation in math, we must start in the elementary grades by providing three things: a substantial improvement in elementary teachers’ knowledge of mathematics; a more focused curriculum that emphasizes core concepts and skills; and more challenging textbooks that teach for mastery and not just exposure.”
http://roomfordebate.blogs.nytimes.com/2009/10/15/how-to-improve-national-math-scores/

Letter from Dr. Shannon Overbay, associate mathematics professor, Gonzaga University (not dated)
“At Gonzaga, I have continued to see students who have come from various reform programs struggle with basic skills. My students often complain that they never learned their times tables and say that they should not have been allowed to use calculators in grade school. They do see the damage that has been done. Many programs, such as Investigations (TERC) do not even cover topics such as long division and routine computations with fractions. By the time these students come to college, they are unable to go into technical majors and have to struggle to pass even elementary math classes designed for non-technical majors. By the time the students hit college, the problems and gaps cannot easily be fixed with one or two “refresher” courses. There are often gaps and holes in their mathematics background that would require years of remediation to fix. For most students, that is not a reasonable option. So, instead, they opt for non-technical majors. We are faced with a 20% decline nationally in the number of engineering majors in recent years. It is devastating.”


Letter from Martha McLaren, retired Seattle Public Schools teacher (2009)
“I began to doubt reform math because of what I had seen in my own classroom, and what I later saw throughout the school district. I worked with confused, demoralized students who were frantically grabbing calculators for the simplest computations. Roughly half of middle schoolers and high schoolers did not understand fractions, decimals, or percents, much less negative numbers or pre-algebra skills. I've spoken with numerous overworked and demoralized teachers who were at their wits end trying to help their students become competent in basic math. These teachers almost never spoke out against reform texts because school district administrators gave them no choice but to support reform math -- teachers' jobs were on the line.”


Short Response to Tunis’ Letter to the Editor on Technology in College (2005)
W. Stephen Wilson, Johns Hopkins University
“…I have not yet encountered a mathematics concept that required technology to either teach it or assess it. The concepts and skills we teach are so basic and fundamental that technology is not needed to either elucidate or enhance them….Consequently, all of Tunis’s questions about how to best insert technology into these introductory courses in college are really a non-issue.”
http://www.math.jhu.edu/~wsw/ED/EDUCTunis.pdf


Teaching too-hard math concepts does students no favors (2009)
Joseph Ganem, physics professor, Loyola University, Maryland
“We are in the midst of a paradox in math education. As more states strive to improve math curricula and raise standardized test scores, more students show up to college unprepared for college-level math. In Maryland, 49 percent of high school graduates take noncredit remedial math courses in college, before they can take math courses for credit. In many cases, incoming college students cannot do basic arithmetic, even after passing all high school math tests. Recently, it was reported that student math achievement actually grew faster in the years before the No Child Left Behind law.”
http://www.baltimoresun.com/news/opinion/oped/bal-op.math02nov02,0,1068320.story


What Do College Students Know? By this professor’s calculations, math skills have plummeted (2008)
W. Stephen Wilson, Johns Hopkins University
“I am inclined to conclude that the 2006 JHU students are not as well prepared as the corresponding group was in 1989, despite there being significantly more competition to get into JHU today than ever before. This phenomenon is probably shared with many other universities. The year 1989 is, in mathematics education, indelibly tied to the publication by the National Council of Teachers of Mathematics of the report "Curriculum and Evaluation Standards for School Mathematics," which downplayed pencil-and-paper computations and strongly suggested that calculators play an important role in K-12 mathematics education. ...
“Since 1994, the College Board has allowed the use of calculators on the mathematics SAT. … I believe it is precisely this gained “advantage” that causes the SATM to fail universities in the admissions process. My findings spread like wildfire through the mathematics community. … The surprise was the general indifference that administrators at JHU had toward the study. This kind of drop in SAT scores would be a crisis, but the news that high-performing students were less prepared for college math than students 17 years earlier didn’t seem to bother anyone, at least not enough to contemplate taking action.”
http://www.math.jhu.edu/~wsw/ED/ednext_20084_88.pdf

What the Data Really Show: Direct Instruction Really Works! (2009)
Dr. Jeff Lindsay
“Direct Instruction is the dirty little secret of the educational establishment. This method, rich in structure and drilling and content, is the opposite of the favored methods of today's high-paid education gurus, and contradicts the popular theories that are taught to new teachers in our universities. Direct Instruction should be no secret at all, for it has been proven in the largest educational study ever (discussed below) and continues to bring remarkable success at low cost when it is implemented.”
http://www.jefflindsay.com/EducData.shtml