Memo to the Times: Boston is a lot closer than Helsinki
Mike the Mad Biologist notes a Times editorial that purports to answer the question of why students do better overseas, with special attention to the miracle that is Finland, where children start life in cardboard boxes and all that. All well and good, Mike says, except Massachusetts does better than Finland on standardized tests:
No doubt the fixation on Finland is a boon for the Finnish Tourist Bureau (and we like helping!), but we have a local model that works very well. It’s not exotic - and, alas, no reindeer - but it works. This isn't to say it couldn’t be better, but someday our intellectual betters might want to ask why other states aren't adopting the Massachusetts model.
Meanwhile, though, the Globe notes that while Boston continues to rank well in comparison to other urban school districts, its standardized test scores have stagnated in recent years.
Comments
That's nice, but he's
That's nice, but he's comparing apples to oranges. How can anyone compare a state to an entire country? He can compare Boston and its suburbs to Helsinki and its suburbs, but not to the other parts of Finland where kids are pulled from school so they can start their careers as fishermen!
It's pretty close to apples and apples
Finland has a population of around 5.5 million. Mass is around 6.5 million. We also have a fairly sizable fishing industry. We have urban, suburban, and rural areas within the state, just as Finland does within the country.
No comparison
The population of MA is 1.2 million greater than that of Finland.
Not that there are many fishermen here either.
MCAS
Boston has made huge strides since MCAS became a graduation requirement in 2003 - a little accountability for the teachers and the students went a long way. I get that testing isn't perfect - but the mere act of establishing a high hurdle of minimum competency seems to have been a huge part of the equation.
Boston's rank in Mass remains pretty low - but the rising tide has raised all ships.
MCAS and teacher testing
MCAS and teacher testing forced many school systems to stop "passing the trash" from school to school every year. A badly performing teacher isn't as likely to be foisted upon students to cripple them for an entire academic year anymore, now that funding may be on the line.
MCAS isn't even a difficult test.
What are we measuring?
Are we graduating smarter kids, or kids who have adapted to be able to take one specific test better?
Your solution?
As opposed to kids who, ten years ago, got passed through the system and couldn't read or do basic arithmetic?
And it doesn't seem to be any one test - they (mostly) pass the MCAS, and Mass students seem to knock it out of the park on almost any test you throw at them.
We're doing something right, especially outside the bigger cities. If we can pull up the Bostons, Lowells, New Bedfords, etc., we'd be off the charts.
Can't speak for other cities, but in Boston and Cambridge it's not a money problem. I'll leave it to the experts to come up with the solution, but when you are spending $20-$30k per student, money's not the problem - unless too much money is a problem.
That's not an answer
Passing a test means you studied to pass that test. Is there some other measure which indicates that the students graduating now are smarter than the ones graduating ten and twenty years ago?
Yes, more are going on to
Yes, more are going on to college and graduating. The graduation rate is higher, other scores like the SAT are higher, literacy is higher, etc.
No more graduating kids that can't read or write anywhere near their grade level.
Pure Anecdote, but ...
Check the writing from the previous generation that came through the same schools.
I remember when the new schools opened in Medford. There were parents who were graduates of the entire system writing barely readable e-mails. It was pretty astounding to see such a large group of clearly fairly bright, functional people struggle with writing at even a 4th grade level. One woman who organized the PTO asked me directly to help her write better, and I did. Her deficits had zero to do with her abilities - she was straight up screwed over by a system with low expectations. (Almost all of the people who struggled were first-generation immigrants or first-generation US-born children of immigrants who were written off by the system, even though most came in before first grade.)
I suspect that these screwed-over people were the result of the attitude of those who were in charge of the schools for way too long. My former boss and his wife had been directly told (pre-MCAS) by a long-term member of the school committee that they had no business demanding a real education from the public schools - if they wanted educated children they should pay for it and send their kids to private schools.
Lovely, huh. MCAS put a huge stop to that stupidity. Sad that the manner in which it was abruptly implemented screwed over kids who were already screwed over.
The kids aren't smarter - they just know how to read, write, and show basic functional skills in some other areas because the schools are required to teach all of them basic things.
Yes
Passing the MCAS means that you know at least a certain percentage of the material on the MCAS. Standardized tests are rote and imperfect, but they function as a fail-safe: they make sure that kids were taught and learned something before they graduated. This helps kids in the weakest districts, by guaranteeing that certain material will be covered in the classroom to the retention level required for the kid to pass the test.
Everyone focuses on how the material is dragging down the curriculum of strong districts, but no one pays attention to the fact that tests like MCAS ensure that the poorest children get a certain level of basic education.
Also, I don't think we can show that the students graduating each year are "smarter" than earlier classes. It's wrong to conflate intelligence with education, since intelligence has an innate/genetic component, while education doesn't. A less intelligent person with a full brain is more valuable to society than a very intelligent person with an empty one.
I don't think they are any smarter
They are just better educated about the things we want them to know, at a minimum.
If you have an argument about what we are teaching them - fine - but based on what the state (and other organizations) think they should know, we are doing a better job than almost anyone anywhere in the country.
I taught for 2 years in the Japanese public system - tests can be good and they can be bad - the Japanese system is not very good. I think we've struck a pretty good balance between getting kids to know what they need to know but not turning them into test taking machines.
Anything can be improved - but we do a pretty good job without killing the child in the kids like they do elsewhere.
You Can't Just "Learn the Test"
Your statement that "passing a test means you studied to pass that test" is nonsense. When was the last time that you passed a test without learning the material first? Understandably, you studied before taking the test to bone up on the material, and you may have even taken a prep course to familiarize yourself with the format and general subject matter of the test, but you didn't just go to a prep course and pass your final exams in high school. The MCAS tests knowledge of math and reading. Although I am sure that you can learn some tips on how to do better on the MCAS by studying the test, that doesn't change the fact that you have to have learned the basic skills (math and reading) to have passed or to achieve an "advanced" ranking. Setting all of that aside, I would be interested in hearing how you think we should be measuring mastery of the material if not through the MCAS or some other system-wide test. Is it just the MCAS you dislike, or the idea of testing in general?
Money is kind of a problem
I agree that MA is doing something right, but I just want to note that Boston isn't spending $20-30k per student (Cambridge does, however). It's ~$17k (http://profiles.doe.mass.edu/state_report/ppx.aspx). That's towards the high end for MA, but Boston has a high cost of living and lots of special needs and ESOL students (who typically require smaller class sizes and more specialized teachers). As for test scores compared to other cities, Boston does reasonably well (http://schoolfinance101.wordpress.com/2013/12/19/on-short-term-memory-st...); if you look for the NAEP-TUDA test results, you can see how Boston does.
That's what they want you to believe
The $17k is only the operating cost, Mike. When you add in retiree health care, pensions, capital expenses, and external funds, it comes to somewhere in the $22k range - and that includes $0 for real estate, which would probably add 10-15% to that number.
If you look at the numbers, our serious "special needs" population is probably no bigger than any other community's - about 1% of students. We do have a higher percentage of ESOL students and a couple of less severe learning disabilities to deal with - but again, we spend so much more than almost any other community that there are no problems in BPS that more money is going to solve (and the school budget is getting a massive bump this year - I think like 5%, far more than any other cabinet position if I'm not mistaken; the only things going up by more would be fixed costs like pensions/debt payments and possibly health care).
But another test also shows good results
A different test, given internationally.
http://www.wbur.org/2013/12/03/massachusetts-pisa-test-results
Yep, another standardized test, but I'm having a difficult time believing that they were teaching to the MCAS and also to this test, the PISA. I just don't see them having the bandwidth for that.
My point is that the kids are objectively doing pretty well. I can't say that it's because of the MCAS though.
It's more like comparing apples to bicycles
Standardized tests are given to students within the same year that they learned the material. The article measured the math abilities of adults - "adults in the United States scored far below average and better than only two of 12 other developed comparison countries" - which measures how effective people's overall education is (retention, application of learning). You might be able to score well on the 7th grade test in 7th grade, but can you calculate a tip at 32?
It seems to me that the current zeal to tie funding for school systems to standardized testing has driven schools to teach to the test - they may be able to pass this year's test, but what did they learn? I think it punishes the poorer school districts, making them poorer, and punishes their students, especially in the long run.
Cardboard box?
You were lucky. We lived for three months in a paper bag in a septic tank.