Several weeks ago, I avowed that I rarely get mad reading
books. However, it has happened
again. I’ve just finished Amanda
Ripley’s The Smartest Kids in the World
and How They Got That Way. Ripley offers a very readable, though perhaps not
terribly scientific, examination of different educational systems with the goal
of identifying the reasons that some countries’ students score high on the
international PISA test and, conversely, why American students do so
poorly. She conducts this examination by
following three American high school students who spend a year as American
Field Service (AFS) exchange students in South Korea, Finland, and Poland, three of
the world’s top-scoring countries. She
supplements the student experiences with her own visits and extensive
research. The characteristics that
Ripley believes distinguish high performing countries are not particularly
complicated or theoretically hard to achieve, and that’s what makes me mad. We in the U.S. shortchange so many of our young
people by failing to provide them with a good education, the kind Ripley and so
many experts argue they need to succeed in today’s economy.
Many of you are probably at least aware of the PISA test,
but let’s begin by explaining what it is, why we should care about it, and how
students in various countries perform on it.
As an independent school educator, I take a skeptical view of virtually
all standardized tests. For example, we
know that SATs predict no more than freshman-year grades and that students’ scores
say more about racial and socio-economic background than actual aptitude. Most other American tests, particularly the
state tests implemented in response to No Child Left Behind, frequently fail to
measure the knowledge and skills we should be measuring. Generally, they are simply not good tests and
yet they are driving public school curriculum.
While I don’t pretend to be an expert on PISA, I think it is a good test
that in fact asks students to demonstrate the knowledge and skills that we should
care about.
PISA stands for Program for International Student
Assessment. Andreas Schleicher, a German
physicist, developed the examination for the Organisation for Economic
Co-operation and Development, an international organization based in Paris that
aims “to promote policies that will improve the economic and social well-being
of people around the world.”[i] As Schleicher explained in a December 2001 press
conference announcing the results of the first administration of the test, PISA
represents a different kind of assessment.
“We were not looking for answers to equations or to multiple choice
questions. We were looking for the ability
to think creatively.”[ii]
He wanted to devise a test that actually measures what students need for
success in the 21st century, which is not necessarily (indeed, probably not)
just information that students can memorize and
regurgitate.
The OECD describes the 2012 test as taking two hours and
including “a mixture of questions requiring students to construct their own
responses and multiple choice items.” The questions “were organised in groups based
on a passage setting out a real life situation.” You can try sample questions on
the OECD website, if you like. PISA
officially tests fifteen-year-olds, and in 2012, 510,000 students aged 15 years
3 months to 16 years 2 months took the exam in 65 countries and economies (the
latter being Shanghai, Hong Kong, and Macao; Ripley does not use
these economies in her calculations).[iii]
Ripley took the test herself. She gives examples of
questions that require written answers in which you have to defend
your position. Not all the questions even have right or wrong answers; how you
score depends on your argument. This is
a long way from even the best multiple choice questions, and Ripley, who
graduated Phi Beta Kappa from Cornell, came away convinced that PISA does
measure critical thinking.
Since that first 2000 administration, American students have
performed pretty disappointingly, especially in math. For example, in 2009, out of 61 countries, the
U.S. ranked twenty-sixth on the math test, seventeenth on the science test, and
twelfth on the reading test. Our
students scored about average in science, above average in reading, and
below average in math. Which
countries scored higher than us may interest you even more. Korea and Finland had the top reading scores,
and we tied with Iceland and Poland.
Estonia, the Netherlands, Japan, Canada, Australia, and New Zealand all
scored higher. In math, Singapore, then
Korea and Finland had the highest scores; we tied with Ireland and Portugal
while countries with higher scores included Poland, the Czech Republic,
Hungary, France, the U.K. and all the countries already mentioned that
outscored us in reading. The 2012
results differed little (the test is given every three years). We slipped to twenty-eighth in math; came in
thirteenth in reading with our overall score dropping slightly; and fell to
twenty-second in science. In addition,
because more countries had tied scores, more nations actually scored ahead of us
than in 2009. Our actual scores were
about the same as previous years.
However, other countries have pulled ahead of us. Professor Jan Rivkin, co-chair of the Harvard
project on U.S. competitiveness, made this observation to NPR in December 2013:
While our scores in reading are the
same as 2009, scores from Belgium, Estonia, Germany, Ireland, Poland and others
have improved and now surpass ours. Other countries that were behind us, like
Italy and Portugal, are now catching up. We are in a race in the global
economy. The problem is not that we're slowing down. The problem is that the
other runners are getting faster.[iv]
It’s notable that our relative decline occurred while others
improved; twenty-five countries raised their math scores between 2009 and
2012. In addition, Congress passed No
Child Left Behind the same year the first PISA results were announced. Clearly, that educational reform has failed
to improve our educational system, at least by the measure of an international
standardized test, despite the law’s emphasis on testing in reading and math. Education
Secretary Arne Duncan termed the 2012 PISA results, released in late 2013, a "picture of
educational stagnation."[v]
When I first learned about our mediocre scores several years
ago (actually from Andreas Schleicher himself, whom Global Education Director
Melissa Brown and I met in China), I countered by arguing that the U.S. has a
much more diverse population than the countries that score at the top. I asked, too, whom they were testing. It turns out they test a cross section of
students in every country and they can slice and dice the data by such
characteristics as income, public and private schooling, and immigrant status. Controlling for specific factors, such as
income, U.S. students still perform at mediocre levels. For example, using the 2009 results, as we’ve
already noted, American students placed twenty-sixth in math; our wealthiest
students, including those going to private schools, did score better than
Americans in general, placing eighteenth when compared to the most economically
privileged students in other countries.
Rich Slovenian and Hungarian students still tested higher than wealthy Americans,
whose scores were comparable to those of wealthy Portuguese students. Plus, our poorest students did even worse,
ranking twenty-seventh when measured against the poorest youngsters in other
countries.[vi] As this difference might suggest, American
students demonstrated an especially wide gap between the most advantaged and
the least advantaged, more than 90 points in 2009 reading scores. By contrast, South Korea’s rich and poor
students’ scores only differed by 33 points.[vii]
Beyond national pride – the United States doesn’t usually
think of itself as mediocre – should we care about our PISA performance? The OECD now has mountains of data from PISA,
and not just the exams themselves. Many
of the students also fill out questionnaires about their family background,
their schooling, and their interests and aspirations and OECD’s statisticians
can use this data to provide information about students who do well and who do
not. Ripley reports that we have learned
that, in general, math scores carry more weight than reading scores (not a
great fact for the U.S. with its subpar math performance). Students who “mastered high level math” stood
a greater chance of finishing college, irrespective of race and income, and
enjoyed higher earnings as adults.[viii]
The reading scores do matter, however:
a student with poor reading skills was more likely to drop out of high school, and,
in general, PISA scores predicted college success better than high school
grades did. The PISA statistics also show
that spending per pupil and small class sizes don’t correlate with
higher scores. For example, in 2009, the
U.S. spent more per student, on average, than every other country except
Luxembourg and had smaller average class sizes than many of the countries that
scored better than we did. Perhaps most
significantly, “economists had found an almost one-to-one match between PISA
scores and a nation’s long-term economic growth.”[ix]
Ripley gives us some examples of the impact of a poor educational
system. She introduces us to the Bama Companies,
an Oklahoma company that makes McDonald’s pies.
Paula Marshall, the CEO, opened a plant in Poland because, unlike
Oklahoma, Poland offered an ample supply of educated workers. In Oklahoma, the Bama Companies sometimes can’t
find enough people to fill their lowest-skilled jobs because even these require
thinking and communications skills.
Marshall told Ripley she would underwrite technical training, but the
people lacked the basic reading and math skills necessary to take advantage of
the training. She simply couldn’t find
candidates for the more demanding maintenance tech jobs, which require the
ability to interpret technical blueprints, write a summary of a shift, and
problem solve and fix sophisticated systems.
She was confident, though, that she would find such employees in
Poland. And perhaps the dearth of
potential Bama employees shouldn’t surprise us since twenty-five percent of
Oklahoma high school graduates hoping to enlist fail the military’s academic
aptitude test. Admittedly, Oklahoma has weak
schools, but shouldn’t we worry that, assuming the Bama Companies
aren’t unusual, our manufacturing companies have trouble hiring employees with
basic skills and that a quarter of young people with high school diplomas
(these aren’t dropouts) don’t qualify as enlisted men and women in the
military?[x] If for no other reason, these examples should
compel us to care about our PISA performance.
The PISA results show that countries, including
democracies, can and do change their educational systems so their students
perform better. If the United States
wanted to do something about this “picture of educational stagnation,” what
should we do? No simple answer presents
itself. However, Ripley does offer some
possibilities. She does so by looking at
Finland and South Korea, two of the highest scoring countries, and Poland which
has made significant progress. I’ll
explore these systems in more detail next week, but her high-level takeaways
are: quality of teachers matters; rigor and accountability matter; student
drive matters. Parental involvement also
matters, though not in the way we might expect, and diversity matters, particularly for
lower-performing students. As I’ve
already mentioned, up to a point, spending per student and class size don’t
matter, and neither does technology. And
we could learn a great deal about teaching math effectively from others. We’ll look at these issues in the context of
the U.S., then a few states, like Massachusetts, whose students perform
significantly better than American students as a whole, and then think about
the implications of Ripley’s conclusions for independent schools like Holton.