In my keynote Wednesday night at CES, the annual consumer electronics show in Las Vegas, I talked about a number of trends transforming the role that technology plays in our lives -- trends like the proliferation of ever cheaper and lighter HD screens, the combination of smart devices and powerful PCs connected to the cloud and the changing nature of television when combined with innovative software.
But if I had to pick one technology trend that I think will have a major impact this year, it would be the entirely new ways in which computing technology will work more naturally alongside us.
For years, we've relied on familiar GUI (graphical user interface) tools and methods -- the keyboard and mouse; menus and commands; clicking and scrolling; files and folders -- to control and manipulate computers and the applications that run on them.
But I believe we will look back on 2010 as the year we expanded beyond the mouse and keyboard and started incorporating more natural forms of interaction such as touch, speech, gestures, handwriting, and vision -- what computer scientists call the "NUI" or natural user interface. This process is already well underway through the proliferation of new touch screen phones and PCs, and in our growing reliance on voice-controlled in-car technology for communications, navigation, and entertainment.
In some ways, this transformation has been a long time coming. For many years now, Microsoft has been working on NUI-based technologies such as speech, touch, contextual and environmental awareness, immersive 3D experiences, and anticipatory computing -- all with the goal of a computer that can see, listen, learn, talk and act smartly on our behalf. Ever since the development of the first computers, this lofty goal has been one of the most challenging problems in computer science. But an incredible expansion in computing power, along with new breakthroughs in software, has enabled us to solve many of these problems, putting us on the verge of an important leap forward.
Microsoft's recent work in the area of video gaming is starting to bring NUI to life in a tangible way. As we shared at CES, an ambitious effort codenamed "Project Natal," which uses sophisticated sensors and software to track body movements, recognize faces, and respond to spoken directions, is something we plan to bring to market by the holiday season this year. With Project Natal, your whole body becomes the video game controller, so you can enjoy games with friends the same way you play them in the real world -- by talking, shouting, running, swinging, and a million other movements and gestures.
While Project Natal will transform the video gaming and in-home entertainment experience, I believe it only hints at the potential of the technology behind it. In the near future, computers will do more than work at our command: they will work on our behalf, acting as assistants that understand what we want and possessing the intelligence to carry out complex tasks in a way that accurately reflects -- and even predicts -- our preferences and intentions.
Simply put, NUI is about easing discovery so that the computing technology that surrounds you acts as a more natural and dynamic partner -- not a tool -- in helping you work, live and have fun. And I believe these advances will help usher in a new generation of human-computer interaction this decade.