In the aftermath of the revelation that technology start-up Uber would consider using its private customer data, such as travel logs, to pry into the private lives of reporters, the word "values" was bandied about a lot by the media. Paradoxically, Uber Senior Vice President Emil Michael--who suggested "digging up dirt" on reporters--later claimed his own right to privacy and free expression during a private dinner with friends. On top of this debacle, Uber employees warned a reporter looking into the scandal that higher-ups in the company could review all of her personal data--without her permission.
When ex-venture capitalists like Peter Sims--who took offense when his personal data was allegedly used to promote Uber at its Chicago launch--decide to boycott Uber, it makes me wonder why he and, more importantly, all those who currently invest in start-ups learn the lessons that jeopardize their investments only after the fact. In other words, why does a human rights "litmus test" not rank higher among the boxes that need to be checked when reviewing funding proposals, deciding to invest, and testing prototypes and betas?
Venture capital firms that divide their practices into "seed," "early stage," and "later stage" investments should ask themselves when it becomes "too late" to integrate human rights values into the start-ups they fund. When investment firms and venture capitalists compete on their skill in picking the right teams, spotting hot development talent, and understanding which applications and functionalities the masses will seamlessly adopt, why would they not simply add human rights resiliency to that set of winning criteria--at any stage?
At WITNESS, we are deeply focused on and excited by the ways that technology can catalyze the fight for human rights. Back in 2011, we published our "Cameras Everywhere" report--the product of a series of interviews with technologists, academics, human rights defenders, peer human rights organizations, policy makers, and influencers. Encouraged by the enormous potential of technology to expose and document human rights abuses and galvanize social change, we asked our interviewees what they saw as the (then) current challenges and opportunities at the intersection of human rights, video, and technology. Our interviewees were nearly unanimous: privacy and safety, along with authentication challenges, were key concerns in a landscape where digital communication technologies offered exciting opportunities.
Based on this report, we adopted our current Tech Advocacy program, which aims to selectively advocate to, and work with, technology providers to incorporate human rights values such as safety, security, and dignity into their functionalities, designs, policies, and platforms. This complements our focus on ensuring that human rights documentation can be properly verified. Our guiding assumption is that with millions of people turning to video to document their experience, making the spaces where people share or curate videos and communicate with each other safer and more effective could have a transformative effect on the fight for human rights. This is why we worked with YouTube (although hardly a start-up) to incorporate a face-blurring functionality in its video editor, allowing people who upload videos to protect the privacy of those featured in their footage.
To us, this is a no-brainer. We invest our (and our funders') resources in human lives, and we cannot afford to jeopardize our investment. But I would argue that venture capitalists have just as much to lose. Human rights defenders are often the proverbial canaries in the coal mine. The vulnerabilities that can get a human rights defender detained or killed (a leaky email account in China, a repressive government surveilling its citizens) implicate the same values that ordinary citizens--the users of systems fueled by new technologies--care about.
In a world where government surveillance agencies like the NSA are not shy about asking companies for users' personal information, those companies have a heightened responsibility to collect only what they absolutely need--and to keep it safe. Without ensuring that these values (or as we call them, "human rights") are part of the DNA of the systems, processes, functionalities, and user policies of the technology start-ups creating the apps and platforms that are part of our daily lives, investment dollars are at risk and there is no healthy business case in the long run. In the aforementioned "Cameras Everywhere" report, we also gave an early shout-out to venture capital funders, suggesting closer collaboration between NGO funders, investors, and technology developers, as well as processes that include the assessment of human rights risks. "What would be nice," we wrote, "is a human rights advisory board that will assess all technology-led proposals."
Since 2011, there have been some positive changes. These include WhatsApp's use of encryption for anti-snooping purposes, which integrates the open-source software TextSecure (created by the non-profit Open Whisper Systems)--although this does not yet apply to photos and videos. They also include some notable, if imperfect, efforts, such as Whisper.
Clearly, we are not the only ones advocating for more ethical and human rights-friendly technologies and businesses, as evidenced by the growth in social impact investing and the success of organizations like the Social Venture Network. Given these trends and the billions of dollars in venture capital investment to go around, you would think that VCs are catching up rapidly on more human rights-friendly or ethical practices. But despite laudable initiatives by venture capital firms aimed at social change--which exist separate from their commercial business practices--technology investors seem to be lagging behind in integrating human rights values into their daily business practices.
If I were a VC, I would ensure that, next to my "entrepreneur in residence" and other prestigiously titled folks in my company, I had a human rights expert working closely with the developers and the business team from inception--and that he or she gave a "human rights-friendly" sign-off before the team progressed to the next phase. I would also collaborate closely with developers and NGOs that put human rights first, such as The Guardian Project.
At a bare minimum, this is a way to avoid unpleasant surprises that will jeopardize your investment down the line, alienate users, or stunt a start-up's growth. But if you'd prefer to couch it in terms of universal "values," I would quote WhatsApp co-founder Jan Koum: "Nobody should have the right to eavesdrop, or you become a totalitarian state."