Co-authored with Jonathan Regier
The bankruptcy of Kodak -- long an icon of photography, of capturing a moment in an image (hence the popular slogan: a "Kodak Moment") -- has been taken as an example of how even the most dominant player in a market can succumb to poor business strategy. In short, the world went digital and Kodak didn't adapt effectively. To us, however, Kodak's rise and fall is a classic example of how a technology, developed to fulfill basic human needs, advances along semi-predictable and ultimately unpredictable lines as it generates its own impetus and new demands.
Humans have long wanted a way to make images of things -- to pass along information, to make art, or to commemorate objects, scenes, and events. We are obsessed with representations of ourselves and of our environments. These representations are probably the greater part of our creativity. Paleolithic artists were already much better at representing animal form and movement than most of us are now. The paintings of the Chauvet and Lascaux caves, which date from about 30,000 and 17,000 years ago, respectively, may begin to preoccupy you.
In the 19th century, we became good at making accurate images by "capturing" light easily and inexpensively. Before then, we depended upon artists to sketch or paint portraits. In 1837, Daguerre created an early kind of photograph: a silver-plated copper sheet coated with silver iodide and "developed" with warm mercury. The French government gave Daguerre a nice state pension; in return, it was allowed to publish his methods and to bestow on all French citizens the right to use the daguerreotype process. By the 1860s, photographic technology was already advanced enough that Mathew Brady and his staff could cover the American Civil War. Many of those images have a power of their own: we see ourselves in period dress, killing and being killed with cannons and horses, and burying the dead.
In 1880, George Eastman founded the Eastman Dry Plate Company in Rochester, New York. In 1888, he brought out the first Kodak camera, containing a 20-foot roll of paper -- enough for 100 circular pictures, each 2.5 inches in diameter. In 1889, Eastman replaced the paper rolls with rolls of film. In 1900, the Kodak Brownie box roll-film camera was introduced. One of us (GFS) remembers getting a Kodak Hawkeye Brownie (with a flash) as a birthday gift and has saved it in a position of honor in his cabinet of treasures. Even now, each of its photos, when viewed, stirs a particular childhood memory.
Until the 19th century, it was much easier to record a man's or a woman's face than the voices that stirred souls. Many of the implicit rules of poetry and prose indicate how a voice should rise and fall, how it should be charged with emotion. Capturing the actual sound, though, is rather difficult. In the late 19th century, we got the phonograph. On February 19, 1878, Edison was issued U.S. patent #200,521 for it -- the first patent for a device that could both record sound and play it back. While other inventors had produced devices that could record sounds, Edison's phonograph was the first able to reproduce the recorded sound.
Photographs trap a moment of electromagnetic energy in stable matter, thus preserving the trace indefinitely. A phonograph performs the same service for sound waves. The early technology was quite primitive, making what could be thought of as "graphs" of sound. Here is the story of how German Chancellor Otto von Bismarck's voice was liberated from the wax substrate of a phonograph cylinder. Consider how advanced Edison's technology was in the late 19th century, and also how primitive it now seems. Put the phonograph together with a rapid sequence of photographs and one has the videograph, commonly called a video -- a continuation of the cinematograph, or the movies.
We now offer two examples of the latest advances in recording. They are both cutting-edge, and they are both still primitive:
- Here is an early attempt at listening in on human thought, via electrode helmet.
- Here neuroscientists are reconstructing visual experiences by way of fMRI. (Make sure to watch the video.)
In one way, these brain activity scans are even more primitive than early recording technologies, since the thing they are meant to capture is so much more complicated and multidimensional. They also respond to an even more primitive need: to know exactly what another person is thinking, to know exactly what we're thinking, to capture an experience or memory exactly as it occurs to the person living it.
The analogy we are pursuing is one of resolution, and of the huge improvement in resolution, transfer and cost of information during the last hundred or so years. Now, we can put a person in a machine and observe at low resolution what they're seeing via changes in their brain. It's easy to imagine that coarse emotions could also be perceived: anger, pleasure, disgust. Putting these pieces of technology together, one can envision a crude recording of what one senses and thinks during an event or brief period of time.
But what would be the difference between a mere video recording and an experience captured by a high-resolution scan? The difference would be in the reproduction of "lived" details: how the eye moves through the 3D scene, what the ear picks up and what it doesn't, the way focus shifts from object to object, happening to happening, word to word, conversation to conversation. It would be delicate, sumptuous, human. Thoughts and sensations would come in and out of the whirl. We self-censor what we say all the time, but how well do we self-censor what goes on in our heads? Clearly, such a recording would be one level less filtered -- more stream of consciousness.
If recent history tells us something, it's that our current resolution will improve very rapidly, so long as there's something to drive the market. Here we can imagine two powerful sponsors: governments and, once the technology is far enough along, the global porn industry. The development of the Internet came initially through government sponsorship of basic technologies and of simple network connections -- those between government facilities, national laboratories, and universities. Once its effectiveness and usefulness were clear, the Internet moved into the commercial sector. By one estimation, the porn industry was more than 30% of the commercial driver in the early years and currently generates about 12% of Internet sales. It isn't hard to extrapolate to the large market that would exist for pornographs -- recordings of experiences and dreams that could be played back in glorious high-resolution reality.
But the devil is indeed in the details, and that's the problem in predicting how technologies advance. You have to be able to predict many factors on many fronts: pure science, engineering, markets. From the advent of audio and visual recording, it would have been impossible to predict radio and television. Simply impossible. The difficulty is not just how one technology will advance, but how it will combine with other technologies (some that may not yet exist). For example, we still watch television, but many of us do it on a computer with a Wi-Fi connection running over a backbone of fiber optics and satellites. The prediction problem also applies to the short term. If you look at the rise of companies like Google and Facebook, which traffic largely in sound and image, you'll notice something strange. While these companies serve a basic human need for communication, they do so in ways that were unforeseen only three decades ago. You need almost everything that comes with the Internet, hardware and software, before you see how Google and Facebook realize themselves, how they fit.
Let's finish on a speculative note. Imagine the guy with his neck in a clamp (so he doesn't move) getting a portrait done in the 19th century. Photography is definitely neat for that guy. But it's not earth-shattering. Photography became huge because the precise recording of our environments and ourselves is useful in every endeavor. Here we can imagine a portable device that can record all of our thoughts and sensations. At first it would be bulky and cumbersome, used only for special occasions. Then it would become smaller, more ubiquitous, and less intrusive, as cellphone cameras have. From there it might shrink to the size of a tattoo or less, transmitting information to an external library. One could envision turning it on as a young child and keeping it on all one's life. One could then go to the backup library, at home or in the cloud, and recall any incident in HD -- even a conventional playback, short of full immersion, would be enough. Being able to tap into all of this would be a kind of total recall, a manner of living in another person's mind, or of living in your own mind as it was many years ago.
Professor George Smoot was co-awarded the 2006 Nobel Prize in Physics "for their discovery of the blackbody form and anisotropy of the cosmic microwave background radiation." More information here.
Jonathan Regier is a graduate student in the history and philosophy of science at the University of Paris 7. He is writing his doctoral thesis; his major research interests are the development of science and technology and the many ways in which natural philosophers and scientists have considered the universe to possess order.