Charlie Miller's Punishment By Apple Tests A Complex Relationship
For years, security researchers and technology companies have been partners in the struggle for better cybersecurity.
Researchers, known as "white hat" hackers, find flaws in tech products and report them to the products' makers, who fix them before "black hat" hackers can exploit them for malicious purposes.
But last week, prominent researcher Charlie Miller and Apple had a falling out. After Miller publicly disclosed a flaw in Apple's App Store, Apple punished him by revoking his app developer's license.
Apple's response shines a light on a complex relationship that has long been at the heart of securing tech products. While several major companies have begun reaching out to researchers for security help, Apple showed a less amicable side of this partnership.
Some experts say Apple sent a negative message that could prompt researchers to sell their discoveries to an underground market of criminally minded hackers -- some of whom are willing to pay up to $80,000 for a bug -- instead of reporting them to the company for free.
"Anything that stifles their willingness to come forward is going to hurt the public good," said Jeff Moss, founder of the Black Hat and DefCon hacker conferences in Las Vegas. "It's one less place to get insight on the quality of the product."
Miller was reprimanded by Apple for the way he demonstrated the security flaw: He created a secret application that he believed could download malware onto iPhones and iPads, and got it approved for distribution in Apple's App Store. Apple said this violated the terms of its developer agreement, which prohibit developers from disguising their apps.
Some experts said what Miller did was unethical because he potentially exposed millions of Apple customers to malware. But Miller claims Apple customers were not at risk and argues that, if he did not go to such lengths, Apple would have denied the bug existed.
Apple has since patched the flaw that Miller found. The company did not return requests for comment from The Huffington Post.
The incident added a new chapter to a long-running debate over the appropriate way for researchers to disclose security risks. Some argue that flaws should only be disclosed privately to the product's maker, because revealing them publicly shows attackers how to exploit them. Others say going public is the only way to force a company to improve its security.
"If researchers don't go public, things don't get fixed," said Bruce Schneier, a security expert who has written several books on the subject. "Companies don't see it as a security problem; they see it as a PR problem. And if there's no PR problem, it'll never be a priority."
After reporting bugs, many researchers wait to disclose them publicly until the company has had a chance to issue a fix, known as a "patch." But when researchers receive no response from the company, they often detail their findings at security conferences, a move that has upset the products' makers.
"That's the researcher's trump card," said Chris Wysopal, chief technology officer at Veracode. "If the vendor is not going to fix it, they're making sure that everyone is not at risk for eternity."
But publicly disclosing security flaws can be risky, triggering hostile responses both from embarrassed companies and law enforcement. In 2001, the FBI arrested security researcher Dmitry Sklyarov at his hotel in Las Vegas, the day after he disclosed a bug in Adobe's PDF format at the DefCon hacker conference. He was charged with violating the Digital Millennium Copyright Act.
In 2005, Cisco threatened legal action to prevent researcher Michael Lynn from presenting a security flaw he had found in the company's Internet routers at the Black Hat security conference. Cisco employees also tore the 20 pages outlining Lynn's presentation from the conference program and ordered 2,000 CDs containing the presentation destroyed, according to the Wall Street Journal.
But companies that threaten hackers may face retribution too. Earlier this year, Sony filed for a restraining order against the hacker George Hotz for publishing a method that let Sony customers run unapproved software on the PlayStation 3 console, a technique known as "jailbreaking." In response, the hacker group Anonymous declared war on Sony.
"You have now received the undivided attention of Anonymous," the hacker group wrote to Sony in April, saying the company's legal action against Hotz "has not only alarmed us, it has been deemed wholly unforgivable."
Two weeks later, Sony suffered an embarrassing data breach that compromised the personal data of more than 100 million PlayStation customers and forced the PlayStation Network to shut down for a month.
The Sony incident was an example of why technology companies should make nice with "white hat" hackers, Wysopal said.
"If a company has a good relationship with the research community, then researchers will behave more friendly to that company," Wysopal said. "But if someone threatens a researcher, the community might find flaws and disclose them improperly."
In recent years, many companies have sought closer ties with "white hat" hackers. At conferences, software developers actively court security researchers, wining and dining them and inviting them to lavish parties, according to Kevin Mitnick, who went to prison for computer hacking and now runs his own security consulting firm.
"They're trying to recruit or befriend the hacker community so when they find these holes, they'll disclose them to these companies first," Mitnick said.
This summer, Facebook launched a "Bug Bounty" program, joining Google and the Mozilla Foundation, which already pay researchers to report flaws. Earlier this month, Microsoft invited a select group of researchers to the company's headquarters in Redmond, Wash., to discuss emerging security threats at a conference called "BlueHat."
Experts say tech companies are reaching out to researchers in part because researchers can otherwise make more money selling bugs on the underground "gray market." A serious security flaw in a mobile phone can fetch up to $80,000 there, Moss said.
"I could sell a couple bugs a year and it could equal an entire salary," Moss said. "So it becomes a question of, 'Well, do I pay for college for my kids or give these companies free research?'"
Moss said Apple's censure of Miller might have a chilling effect.
"When things like that happen, it might discourage other researchers from giving away bugs for free," he said.
CORRECTION: A previous version of this article said Microsoft has a bug bounty program.