Jose Ferreira

Posted: June 14, 2010 01:08 PM

Big Test Prep's Dirty Little Secret

A few weeks ago Stephen Colbert dedicated a segment of his show to a topic near and dear to many hearts (well, mine at least): SAT prep.

Colbert's first piece of advice? Spend a lot of money. He's not joking. Well, he is, but those old-school courses offered by "Big Test Prep"--companies like Kaplan and The Princeton Review (TPR)--ain't cheap. Twenty-four hours of private tutoring from TPR will burn an $8400-shaped hole in your pocket. And no, that's not an extra zero.

Despite Big Test Prep's astronomical prices, their courses aren't very good. Just look at their Net Promoter Scores (NPS)--a customer-satisfaction metric developed at Bain & Company. Old-school test-prep courses score just 15 on a scale that tops out at 100. For reference, Apple has the highest large-company score at 77. The score for Knewton, the online test-prep company I founded, ranges from 75 to 85.

Of course, what ultimately matters is how well the product works, not how enjoyable it is. Maybe Big Test Prep courses are like Cherry Nyquil--revolting yet strangely effective.

Sadly, no, Big Test Prep's average score increases are even worse than their NPS scores. They don't always let facts stand in their way though. Recently, at the "gentle urging" of the Better Business Bureau, Princeton Review agreed to stop advertising an unsubstantiated "average score increase" of 255 points for SAT students. (Their president said that they'd "been planning to shift away from an emphasis on score improvement" anyway. Um, sure you were.)

It's true that determining average score improvements can be a bit tricky. Companies have to avoid selection bias among students who send in scores, since the best-performing students tend to report back at a disproportionate rate. But this can be resolved by having an independent third party--say, PricewaterhouseCoopers--collect the data. Kaplan did exactly this in the '90s. And if anyone can accurately calculate average score increases, it's Kaplan--the biggest player in the industry, four times larger than TPR.

But Kaplan quickly scrap-heaped their Pricewaterhouse-vetted results because the score improvements were underwhelming (at least to students who wanted their scores to go up). Try asking any Kaplan representative now what their average score increases are. Good luck getting a straight answer. In fact, many students who take Kaplan and TPR get lower scores. Old-school test-prep companies refuse to talk about how well their products work. They will discuss ad nauseam their products' features, but never their effectiveness. Now--just a little basic common sense here--if their products were particularly effective, and they knew it, wouldn't they talk about it?

Can you imagine this kind of willful indifference to discussing product performance from any other type of manufacturer?

Admittedly, the public can't take at face value any test-prep company's score-increase claims, given Big Test Prep's history of gamesmanship. What the public can take at face value are Money-Back Guarantees. An MBG commits the course provider to a certain level of improvement. It's not an average; it's a baseline--a company can only offer the guarantee if virtually all of its students clear it. And if nearly all customers beat the baseline, the average student necessarily does significantly better.

Unlike score claims, MBGs are very difficult for companies to game. The only way to do so would be to make the "diagnostic" test at the beginning of the course so toxically difficult that students get artificially lower scores. But this doesn't really work, since students can use a previous official test score as their baseline instead. The majority of SAT students take the SAT more than once; so, increasingly, do GMAT and LSAT test-takers. (Twenty-five percent of Knewton GMAT students peg their Money-Back Guarantee to a previous official score. The number is even higher for the SAT. There is no difference in MBG rates between Knewton students who use a Knewton diagnostic test vs. a previous official test score.)

Knewton guarantees GMAT students a 50-point increase or their money back. We also guarantee at least 5 points on the LSAT and 150 points on the SAT. Around 95% of our students achieve these increases. How can we be sure? Because if students don't, we pay them the full price of the course--$690 for GMAT and LSAT students, and $490 for SAT.

Kaplan and TPR have MBGs, too--but all they guarantee is that they'll refund your money if your score goes down. (The average GMAT student goes up 28 points just by studying on her own.) That's like the Hoover vacuum people guaranteeing that their vacuums won't make your floor any dirtier.

TPR and Kaplan also let you repeat their 10-week programs if you aren't satisfied with your score--though TPR tacks on a $200-$300 "administrative fee" for laughs (theirs, not yours). But if the course doesn't work, why repeat it? That'd be like returning your defective vacuum, only to hear, "You're in luck! Take it home again--for a hefty fee!"

Big Test Prep companies have slyly convinced consumers that the problem isn't them--it's you. It's like Weight Watchers. Less than 2% of Weight Watchers participants achieve their desired weight loss. But heavy people don't blame Weight Watchers: they blame themselves (and maybe cupcakes). Same with Big Test Prep. They've convinced anxious test-takers that the inadequacy lies with themselves--rather than with too-short courses taught by 25-year-old part-time musicians/bartenders/burnouts with decent but not exceptional test scores and virtually no training who will probably be doing something totally different in six months.

So, how do they sell so many overpriced yet ineffectual courses? Simple: since they can't sell results, they sell fear. As Colbert quipped, "Take enough math classes and you may even learn [their] equation for turning children's fear into cash."

See for yourself. Attend a Kaplan "free informational seminar" for, say, the GMAT. You'll get some marketing spiel and then be thrown right into some brutally hard combinatorics questions. (Combinatorics is a difficult but seldom-tested area on the GMAT. You may not see a single combinatorics question on test day.) Kaplan is banking--literally--that you'll freak out because you've never even heard the word combinatorics.

Big Test Prep also won't let students take modular courses--only the math or the verbal (or the logic games if you're an LSAT student). Many students need no help with, say, verbal, but their fear of math compels them to buy twice as much course as they need. Knewton is developing modular courses in math/verbal/logic that we will release within a year. If Big Test Prep were selling results rather than fear, they would do the same.

"Facts don't matter. Just lots and lots of words." This is Colbert's advice for SAT essay-writers. While it may be misguided in that context, it seems to be working just fine for Kaplan and TPR. As for their students? That's a different story.

Follow Jose Ferreira on Twitter: www.twitter.com/Knewton_Jose