
In the Nation's Interest

How Can We Improve STEM Education?

President Obama will be spotlighting the importance of STEM (science, technology, engineering, and math) education in his final State of the Union address.

And indeed, the attention is well placed. Especially at the "mid-level" of jobs that require some postsecondary training but not a full bachelor's degree, STEM jobs are in demand and pay well, and shortages of qualified applicants are expected both now and in the future.

Looking at higher skill levels, those whose highest degree is a bachelor's in technology, engineering, or mathematics can expect to earn $73,000 to $87,000 annually at mid-career – 20-43% higher than the $61,000 average mid-career earnings for those with any type of bachelor's degree. (Engineers do best, followed by applied math and computer science graduates. Perhaps surprisingly, those who stop with a bachelor's in a physical science earn only marginally more than the average college graduate.)
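As a quick arithmetic check of the percentage range above, using only the figures quoted in the text:

```python
# Check the mid-career earnings premium cited above.
baseline = 61_000                      # average mid-career earnings, any bachelor's degree
stem_low, stem_high = 73_000, 87_000   # mid-career range for STEM bachelor's degrees

premium_low = (stem_low / baseline - 1) * 100
premium_high = (stem_high / baseline - 1) * 100
print(f"STEM premium: {premium_low:.0f}% to {premium_high:.0f}%")  # → STEM premium: 20% to 43%
```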

Unfortunately, US high school students rank below average for developed countries in mathematics achievement, and only 27th among all 65 countries that participated in the 2012 PISA assessments. In science, the US ranked roughly average for developed countries and 20th among all 65 PISA countries. OECD's report on the PISA results stated, "Students in the United States have particular weaknesses in performing mathematics tasks with higher cognitive demands, such as taking real-world situations, translating them into mathematical terms, and interpreting mathematical aspects in real-world problems. An alignment study between the Common Core State Standards for Mathematics and PISA suggests that a successful implementation of the Common Core Standards would yield significant performance gains also in PISA."

Despite much interest in improving US students' achievement in math and science, our evidence base regarding what works in science and math education is surprisingly thin. This is not for lack of interest or lack of new approaches: a plethora of efforts aimed at improving STEM achievement and attracting more students into STEM fields already exists.

Yet by "thin" I mean that very few of these new approaches and innovations have been rigorously evaluated to see whether they are more or less effective than our previous ways of teaching math and science.

The “gold standard” for assessing whether something has truly changed desired outcomes is a randomized controlled trial or (in some cases) a quasi-experimental trial.  In randomized controlled trials (RCTs), some people are randomly assigned to receive the new approach and others are randomly assigned to receive some other “treatment” – typically the existing approach.  We all recognize this from clinical trials of new drugs and medical procedures, but the same approach is possible with any sort of intervention that is intended to change some outcome.

By this standard, our knowledge of what works in raising student achievement in math and science is very, very thin.  For example, the What Works Clearinghouse of the US Department of Education lists only 3 math curricula or programs that positively affect student achievement, and another 11 with “potentially positive” effects.  For science the numbers are worse: only 3 science curricula or programs with “potentially positive” effects are listed.

Now granted, the What Works Clearinghouse has very high standards for effectiveness, but these numbers do underscore how little we know about truly effective programs.

This explains why I was so pleased when I ran across a report of a program that does have strong evidence of effectiveness in raising students’ achievement in math and science: the College Readiness Program (CRP) of the National Math and Science Initiative (NMSI). (NMSI is a non-profit launched by business, education, and science leaders to transform math and science education in the US and is led by CED Member Matthew Randazzo.) 

Not only does CRP appear to improve math and science achievement, it does so even for disadvantaged and traditionally underrepresented students – a group with whom achieving success is particularly challenging.

CRP works with schools to increase the number of students taking and earning qualifying scores on AP math, science, and English exams. Teachers receive course-specific training in how to teach the AP courses, along with access to mentors and online curricular resources. Students receive study sessions led by state and national AP experts, lab equipment and supplies, and exam-fee subsidies covering about 50% of the cost of the AP exams. Teachers and students each earn $100 for a score of 3 or higher on the AP exam, with teachers eligible for an additional $1,000 bonus if certain classroom performance measures are met. School administrators receive support from academic and program experts, as well as shared goal setting, performance analysis, and accountability.

CRP has now been implemented in nearly 800 high schools in 30 states – about 1.5% of American high schools. Even more strikingly, CRP has strong evidence from four different quasi-experimental studies indicating that the program improves student achievement in math, science, and English; improves college outcomes; and, most impressively, increases participants' earnings in the immediate post-high-school years.

• In the most recent studies, the net increase in the number of students taking AP exams in math and science was 6.9 percentage points in CRP schools (relative to a statistically matched comparison group), and the share of students earning scores of 3 or higher on those exams rose by a net 3.9 percentage points.
• A 2010 quasi-experimental study found that CRP led to higher college enrollment, college GPA, and college persistence for students from participating schools (relative to the statistically matched comparison group).
• A study that tracked students' annual earnings approximately 4-5 years after high school graduation found that CRP participants' earnings were 2.7% higher than those of the statistically matched comparison group. (Although this may not sound like much, if the difference continues throughout an individual's lifetime, it translates to a lifetime benefit in the range of $16,000 for individuals at lower skill levels.)
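The lifetime figure is consistent with simple back-of-the-envelope arithmetic. The annual-earnings baseline and working-life length below are illustrative assumptions chosen to show how such a figure could arise, not numbers from the studies:

```python
# Back-of-the-envelope check of the lifetime-benefit figure.
# Assumed (NOT from the studies): ~$20,000/year earnings at lower
# skill levels, a 30-year working life, and no discounting.
earnings_gain = 0.027        # 2.7% earnings premium reported by the study
annual_earnings = 20_000     # illustrative assumption
working_years = 30           # illustrative assumption

lifetime_benefit = earnings_gain * annual_earnings * working_years
print(f"${lifetime_benefit:,.0f}")  # → $16,200, in the range of the $16,000 cited
```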

The last result is particularly noteworthy given that very few educational efforts bother to track their impact on the ultimate longer-term outcomes that we care about – let alone show a positive result for those outcomes.

Given the strength of these outcomes, CRP was recently awarded a federal Investing in Innovation grant of nearly $20 million to expand the program to 60,000 additional students in 10 school districts in 8 states. The districts are the Atlanta Public Schools, the Cleveland Metropolitan School District, the Detroit City School District, the Houston Independent School District, the Noble Network of Charter Schools in Chicago, the Oakland Unified School District, the St. Louis Public Schools, and the Bismarck Public Schools, West Fargo Public Schools, and Mandan Public School District in North Dakota.

Matthew Randazzo, CED Member and CEO of NMSI, commented, “NMSI's College Readiness Program, which has been implemented in nearly 800 schools across 30 states, has proven time and again that with the proper resources and support, all students can achieve at high levels. This scale-up grant will enable NMSI to broaden the reach of the College Readiness Program to 40 new schools serving a high proportion of students traditionally underrepresented in STEM fields, and help ensure that they have the knowledge and skills they need to be successful in college and career. An important component of this work will be a rigorous research study, conducted in partnership with UCLA’s National Center for Research on Evaluation, Standards, and Student Testing, to further demonstrate the impact the program is having on student achievement in both urban and rural contexts.”

The federal Investing in Innovation (i3) grant program has some of the highest evidence standards for making grant awards.  Awards are scaled based on the strength of evidence supporting the intervention.  The smallest awards, development grants, provide up to $3 million to support the development or testing of practices that have evidence of “promise” in improving student outcomes.  The second tier of awards, validation grants, provide up to $12 million to expand to the national or regional level those projects supported by moderate evidence of effectiveness and to evaluate them more rigorously.  The highest level of award, scale-up grants, provide up to $20 million to support expansion of those projects supported by strong evidence of effectiveness to the national level.  In the words of former Deputy Secretary of Education Jim Shelton, i3 provides “a little bit of money for a little bit of evidence, and a lot of money for a lot of evidence.”

It was this highest level – a scale-up grant – that CRP received. This is a feel-good story about education policy in multiple ways. First, we have an approach that makes a true and measurable difference in improving STEM outcomes, especially for disadvantaged students. (We don't have many approaches backed by strong evidence that they work.) And second, we have a case where government funding is awarded in a way that allows programs to gradually build an evidence base and demonstrate effectiveness – and that only rewards programs that do so. If we're serious about improving the educational outcomes of American students – both in STEM and elsewhere – we should be making more of these systematic, evidence-based improvements.

Monica Herk is CED's Vice President of Education Research.