Monday, August 30, 2010

Inception and the Power of Lucid Dreams


Zhuangzi, a Daoist philosopher who lived in the 4th century BC, once had a dream in which he was a beautiful yellow butterfly, fluttering gracefully, blissfully unaware that he was dreaming. Without warning he woke up, suddenly conscious that he was a man and that he had been dreaming. But as he pondered the vividness of his dream, he realized that he had no way of being absolutely sure whether he was a man who had been dreaming he was a butterfly, or a butterfly who was now dreaming he was a man.

It is this realization about our perception of reality that writer and director Christopher Nolan exploits in his latest blockbuster Inception, in which the action moves dizzyingly from dreams to reality to dreams within dreams, blurring the lines between the dream world and the real world.

Think about it. No matter how bizarre a dream gets, you never question its reality until after you wake up and realize how strange the whole thing was. The main character in the film, Cobb (Leonardo DiCaprio), comes up with an ingenious solution to the problem of separating dreams from reality: He spins a small metal top—if he is dreaming it just spins and spins forever, but if he is not dreaming the top will eventually fall over.

The inception of Inception was Nolan’s own experience with lucid dreaming (in which the dreamer becomes aware that it is a dream yet doesn’t wake up). The film is a brilliant exploration of the human subconscious as expressed through dreams—a kind of thinking man’s Avatar—with all the strangeness, wonder, and fascination of exploring an incredible new world of alternate perception, minus the fantasy creatures and simplistic dialogue.

The concept of lucid dreaming has been around for thousands of years—historical references to it are found in the writings of Tibetan monks who as early as 700 AD were practicing dream yoga, a kind of sleeping meditation that allowed them to consciously explore the dream world. They were among the first to discover that the dream world exists entirely in one’s own mind and that the reality of dreams can be altered by the dreamer.

My own first experience with lucid dreaming was not born of a desire to explore my subconscious or create new realities. I simply wanted to escape the torments of a recurring nightmare that I had been having since I was a little boy. The dream unfolded in the same way every time—I was trying to escape from some unnamed, unseen, but nonetheless terrifying force and found myself unable to run or shout for help. My feet felt as though they were stuck in molasses, or as though I were running the wrong way on a moving walkway, and no matter how hard I tried to call out to the people just ahead of me, I could never get the slightest sound to emerge from my throat as my inner demons closed in.

By chance I came across a book that mentioned lucid dreaming, and for the first time I read about the practice of becoming aware that you are dreaming while still in the dream state. Usually we don't realize we are dreaming until we wake up, but lucid dreamers can learn to recognize their dreams and, the book told me, change them in interesting ways—even to overcome nightmares. I resolved to remember this the next time I found myself inside that nightmare world, and just a few weeks later it happened: I was trying to run, my legs wouldn't move, the terror began to grip me, and my throat tightened up, when it suddenly occurred to me that this had to be a dream. Once the realization struck, my pursuers disappeared instantly and the fear lifted. My next thought was that I wanted to try to fly—something I had read was possible in dreams—and I spent the next half hour (who knows how long it was in "real" time) flying around my own dream world, exploring strange amalgams of places I had been and seen, watching figments of my own imagination walk around a world that, I was suddenly aware, my own subconscious was creating.

Once I had overcome my recurring nightmare, I began to explore my dream world in subsequent lucid dreams and found a few surprises along the way. It makes sense that Nolan is himself a lucid dreamer, because his depiction of the dream state in Inception is so uncannily accurate. I find that I can alter reality in interesting ways, just as the young architect Ariadne (Ellen Page) does in the film; but, as in the film, there are limits to what I can do.

The "projections" as they are called—the other people who populate my dream world—are always difficult to talk to. I have found it impossible to manipulate their behavior (“It's my subconscious, remember. I can't control it,” Cobb tells Ariadne at one point in the film) and I often find them to be downright stubborn and uncooperative. Most of the time they ignore me, say unintelligible things, or even resist my will as I move through my dreams—a projection once yelled at me angrily in a dream when I told her she was a figment of my imagination, giving me a bit of a shock and causing me to suddenly wake up. None of them have ever actually attacked anyone the way that Mal (Marion Cotillard) does in Inception, but in the wrong subconscious mind, I wouldn't put it past them.

There are other surprising limits to what I can do in a lucid dream. I can, just as Ariadne does, alter certain elements of the dream world. But I also find that some things spring up completely unbidden, and that my attempts to change things often break the dream reality down to the point where everything simply falls apart and goes black, causing me to suddenly wake up. I have also tried experiments like flying through a ceiling, expecting to pop out through the top of the strange room I was in. Unfortunately, as I made contact with the ceiling I found my way blocked, and as I tried to force my way up and out of the room everything went black, leaving me floating in empty space for a few moments before I woke up.

It had been a couple of years since I last had a lucid dream, but seeing Inception twice now in the last two weeks triggered another one just the other night. Finding myself in a dream state, I decided to fly away and accelerated up through a strange cartoonish world of floating Christmas trees and dancing lights. I woke up that morning in a euphoric state—there's something invigorating about the freedom and wonder of lucid dreams, and I wished I could go right back to sleep and continue dreaming.

It’s strange to think that oneironauts (explorers of the dream world) are exploring a world that their own subconscious mind is creating at the very moment they are experiencing it. It’s a profoundly empowering and self-reflective experience. But still, I wouldn't want to get lost in the dream world the way that Mal and Cobb do in Inception. There's something essential missing in the dream world, and it’s always comforting to come back to reality. That is, if I really am awake right now. How can I really know for certain? Maybe I should get myself one of those little tops . . . just to be sure.

Wednesday, July 7, 2010

A Shallow Sentiment: Why Nicholas Carr's Anti-Technology Crusade Misses the Whole Point

There’s an oft-quoted Scottish proverb that there is no great loss without some gain. The flip side of that sentiment, I suppose, would be that there is no great gain without some loss. The latter phrase came to mind as I read Nicholas Carr’s latest book The Shallows, which expands on “Is Google Making Us Stupid?”, his popular article first published in The Atlantic, and argues that the Internet is fundamentally altering the very structure of our minds and making us all dumber. But even if we assume that he’s right and the Internet is decreasing attention spans and creating shallower thinking among many of us in the developed world, is that really such a great loss when compared to its potential gains?

I hear variations of this same argument all the time in researching and teaching educational technology, and I’ve come to see it as a kind of excuse. We all tend to dismiss what we don’t fully understand as either unimportant or dangerous, and Carr, I think, has chosen the latter route.

Disappointingly, the book is devoid of broad generalizations, wild claims, and unwarranted superlatives, which makes it dishearteningly difficult to attack. He works slowly and methodically, in a scholarly way, offering interesting background on the history of writing, providing evidence and examples for his claims, and genuinely acknowledging the other side of most of his arguments. So I must admit that Carr’s point is legitimate as far as it goes. He makes a good case for the fact that the use of the Internet can make it more difficult to engage in focused, undistracted thought—a practice that characterizes many of history’s greatest thinkers.

But how many people will actually read his carefully qualified and well-nuanced argument? On the other hand how many anti-technology zealots will use it as a bludgeon in the fight to deny schools adequate funding for educational technology? How many will use it as an excuse to ignore the work of programs like One Laptop Per Child? How many misinformed people will see it as a reason to miss out on the infinite information and opportunities the Internet provides?

He’s not helping us by making this argument. It is just one more in a long line of anti-innovation arguments reaching back to the time of Socrates, and I think there's more going on here than simple instinctive resistance to change. Taken together, these arguments paint a picture of an effort by elite scholars throughout history to maintain exclusive power structures and deny the world’s poor and undereducated entrance to the marketplace of ideas.

Carr argues that the Internet is a “form of human regress” and he supports this claim by bemoaning the loss of what he terms the “linear mind.” The linear mind, he says, began to emerge after Gutenberg invented the printing press and is characterized by “calm, focused, undistracted” thought processes like those facilitated by the careful, focused reading of a printed book. He argues that the linear mind is being replaced by an Internet-fueled need “to take in and dole out information in short, disjointed, often overlapping bursts—the faster the better.”

The latter may be true—Carr cites some preliminary evidence that attention spans are shrinking in the age of the Internet—but the “linear mind” which he evokes is merely the latest incarnation of a very tired old argument. There have always been a few thinkers who dig deeper and think harder than the rest of us, in both primitive and advanced societies. But the printing press didn’t create the linear mind, nor will computers destroy it.

For the first 200,000 years of human existence no one knew how to read or write. In the scheme of things writing is still a newfangled invention, a recent innovation in the development of humanity. Homo sapiens have been writing for less than 3% of our time on Earth, and the historical evidence indicates that the “linear mind” existed long before writing, let alone the printing press, which has been around for less than 0.3% of our total time on Earth.
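
For the curious, here is the back-of-the-envelope arithmetic behind those percentages, a minimal sketch that assumes the essay's own 200,000-year baseline along with rough round figures for the age of writing (about 5,200 years) and of Gutenberg's press (about 570 years); the year counts are approximations, not precise dates.

```python
# Back-of-the-envelope arithmetic behind the percentages above.
# All year counts are rough approximations, not precise dates.

HUMAN_HISTORY_YEARS = 200_000    # the essay's own baseline for Homo sapiens
WRITING_YEARS = 5_200            # earliest known writing, roughly 3200 BC
PRINTING_PRESS_YEARS = 570       # Gutenberg's press, roughly 1440 AD

def share_of_history(years: float) -> float:
    """Return the percentage of human history that a technology has existed."""
    return 100 * years / HUMAN_HISTORY_YEARS

print(f"Writing:        {share_of_history(WRITING_YEARS):.1f}% of human history")
print(f"Printing press: {share_of_history(PRINTING_PRESS_YEARS):.2f}% of human history")
# Prints roughly 2.6% for writing and 0.3% for the printing press.
```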

It is, after all, the great philosopher Socrates who argues in Plato’s Phaedrus that the latest and greatest technology of his time—writing—“will create forgetfulness in the learners' souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves.” In other words, he believed that the use of writing was creating shallow thinkers—a sentiment that is virtually identical to the argument Carr is making today about the Internet.

Of course, I suppose it never occurred to Socrates that not everyone lives in a center of learning like ancient Athens or can afford to spend his days sitting under a tree conversing with a wise teacher. Despite his reservations about the damage writing could do, the fact remained that in order for his words of wisdom to spread and reach beyond his own time and place, someone (his student Plato, for example) would eventually have to write them down.

Eventually, of course, Plato did write down Socrates’ words, and as a result, some 2,500 years later we are still able to read them. That is not to say that Socrates didn’t have a point about the loss of mental focus that writing caused, though, as anyone who has ever written something down so as not to have to remember it can attest. In Socrates' time a talented and well-trained bard could memorize an entire epic poem after just two or three hearings, a mnemonic skill that quickly vanished once writing took hold.

So yes, Socrates was right about writing (just as Carr is probably right that too much computer use decreases attention spans), but that doesn’t make it right to argue that humans should never have learned to write (or that people shouldn’t use the Internet). The irony is that so many anti-technologists today decry the “death of handwriting,” mourning the very technology that Socrates feared in his own time—it’s an endless cycle. The old school never welcomes new innovations, but once the revolution prevails the fears it raised always end up seeming quaint and foolish to the next generation.

A similar thing happened with the invention of the printing press. Scholars argued that the ease with which printed materials were being created was destroying the quality of literature and scholarship. They lamented that it was filling students’ time with trash reading rather than forcing them to apply undistracted focus in mastering a few key texts (in other words, they believed the “linear mind” was being destroyed, not created, by Gutenberg’s press).

Perhaps the most intriguing example of the anti-printing argument is found in Johannes Trithemius’ De Laude Scriptorum (In Praise of Scribes), in which he argues that monks should continue to practice copying despite the invention of the printing press. His reasons? Copying keeps idle hands busy and encourages diligence, devotion, and deep knowledge of scripture. He also writes convincingly about the beauty, integrity, and individuality of the copied text compared to the stark and mistake-prone results of the printing press.

Surely no one would argue that a printed book is as beautiful, unique, and artistic as a hand-copied manuscript, or that a monk who has hand-copied the Bible dozens of times doesn’t know it better than one who has merely read a printed copy. But there is one great irony in Trithemius’ story: within two years he took his laboriously hand-copied manuscript to a movable-type print shop in Mainz, Germany, and had it printed for more widespread dissemination. I can only imagine the peculiar feeling a devoted scribe must have had reading the printed version of a book articulating the value of copying by hand. It must have been at least a little bit similar to the strange dissonance I often felt as I read Nicholas Carr’s anti-tech book electronically.

I have to admit, Carr is right about one thing. I probably was more distracted reading his book on my computer than I would have been had I been locked away in some room with a hard copy of his work. I often found myself opening up another window to further research some fact or define some word that he had dropped, to see what other writers had said about it, or to expand my understanding of some element of his argument. But is that really more “shallow” than sitting in a room reading his words in stark isolation? I’m still not convinced.

A few notable individual geniuses have always had this focused ability to read and think, but the vast majority of human beings never have. On the other hand, the Internet is allowing more people to participate in the process of learning and creating knowledge than has ever been possible before. Is it better to have a small group of isolated scholars thinking deeply, or an entire world engaged in a global conversation? Before the development of writing, only a handful of scholars, philosophers, and politicians were educated at all. Before the printing press, far less than one percent of the world's people could read and write. Before computers, less than half of the global population could read. But now, according to C.I.A. figures, global illiteracy is at its lowest level in human history at just 18%, and it is dropping rapidly.

The Internet is a fundamentally new medium, and it is already proving to be every bit as important and paradigm-changing as the invention of writing or the printing press. Are we going to lose some things in the process? The smell of freshly printed paper, the feeling of cracking open a brand new book, the sound of a pen scratching across a page, the security of a signed hard copy of a legal document, even the very ability to write things out by hand? Eventually, yes, I think we will lose those things. And although it seems like a tremendous loss at times, it will also be a tremendous gain.

Think about it. Carr’s argument is fundamentally based on the same kind of elitism that innovative communication technology has encountered since at least the time of Socrates. Certain privileged thinkers and academics find value in the status quo and don’t want to see it disappear while the rest of humanity starves for access to the knowledge that they covet.

It was easy for the wealthy American Nicholas Carr to isolate himself in an “unplugged” mountain hideaway to write this book, which argues that the Internet is making us all dumber. Perhaps for the few privileged wealthy individuals in this world who have access to a world-class education (Carr got his B.A. at Dartmouth and his M.A. at Harvard), vast public libraries, and well-stocked bookstores, the newly interactive Web 2.0 will have a negative impact. But how many people who were formerly shut out have now found a window to a whole world of information and opportunity? And why, Nicholas Carr, are they “shallow” for finally accessing it?

For a planet filled with individuals hungering to join the global conversation, the Internet is opening up worlds of opportunity in just a few years that old-fashioned paper never could have in a thousand. The free and globally accessible Internet has the potential to improve education, revitalize democracy, undermine dictatorships, circumvent censorship, facilitate global understanding, empower the oppressed, expand opportunity, and perhaps not least importantly, revolutionize the art of writing itself. So who here is really guilty of shallow thinking?

Sunday, January 10, 2010

Reality in the New Economy: More Wealth, Fewer Jobs

How quickly we forget the roaring nineties and the sense of nearly universal optimism engendered by the end of the Cold War and the longest period of uninterrupted economic growth in our nation’s history. The madness surrounding the “New Economy” and predictions of unending economic prosperity seem to have reached their peak in 1999 with the now-infamous book Dow 36,000 by James K. Glassman and Kevin A. Hassett, which boldly proclaimed, “The stock market is a money machine: Put dollars in at one end, get those dollars back and more at the other end [. . .] The Dow should rise to 36,000 immediately, but to be realistic, we believe the rise will take some time, perhaps three to five years” (22).

As it turned out, of course, the Dow peaked in January of 2000 at just over 11,700 before falling precipitously, bottoming out at 7,286 just three years after Glassman and Hassett’s book was published. More recently it plunged even lower, hitting 6,547 in March 2009. The stock market is nothing if not unpredictable, so this historic miscalculation is excusable, though it remains a fascinating and colossally epic fail.
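
Just to put the miscalculation in perspective, here is a quick arithmetic sketch using the round figures cited above (treating the January 2000 peak as roughly 11,700); the index values are approximations from this post, not official closing prices.

```python
# Quick arithmetic on the figures cited above: how far the Dow's actual path fell
# short of the Dow 36,000 prediction, and how deep the subsequent drawdowns were.
# Index values are the approximate round numbers from the post, not official closes.

PREDICTED_TARGET = 36_000
PEAK_JAN_2000 = 11_700       # approximate January 2000 peak
TROUGH_OCT_2002 = 7_286      # post-dot-com low, October 2002
TROUGH_MAR_2009 = 6_547      # financial-crisis low, March 2009

def pct_change(start: float, end: float) -> float:
    """Percentage change from start to end."""
    return 100 * (end - start) / start

print(f"Actual peak vs. 36,000 target: {pct_change(PREDICTED_TARGET, PEAK_JAN_2000):.0f}%")
print(f"2000 peak to 2002 trough:      {pct_change(PEAK_JAN_2000, TROUGH_OCT_2002):.0f}%")
print(f"2000 peak to 2009 trough:      {pct_change(PEAK_JAN_2000, TROUGH_MAR_2009):.0f}%")
# The 2000 peak reached only about a third of the predicted level, and the index
# later lost roughly 38% and then 44% of that peak value.
```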

Economists are coming to the realization that the predictions for the new economy were based on a fundamental flaw in our understanding of how technology would impact the job market. The consistent increases in productivity brought about by the successful integration of technology into the economy have not produced the uninterrupted boom many economists predicted, and that failure is forcing them to take another look at some of our most basic assumptions.

The economic stimulus is a case in point. Congress, Ben Bernanke, and President Obama poured close to a trillion dollars into the economy in the form of stimulus, following the model for economic growth established by the New Deal and World War II, in which massive government spending pulled our economy out of the Great Depression. Although it was designed to unfold gradually, too slowly to create the immediate results many voters were looking for, the stimulus package is working at some level. The stock market has rebounded to over 10,000, consumer spending is on the rise, productivity is up, the GDP is increasing, and the ailing financial system has been stabilized.

There is one major economic indicator that has not improved, however, and that is going to create serious problems for the Democrats in November. That indicator, of course, is unemployment. Due to the narrow way in which the official rate is calculated, the feds pegged unemployment at around 10% for the last three months of 2009, though the real share of unemployed and underemployed workers is probably closer to 20%. It’s no surprise to anyone who has studied economics that unemployment lags behind other indicators, but that hasn’t stopped Rush Limbaugh and the Republicans in Congress from pouncing on Obama and proclaiming the stimulus a failure.
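
To see how the headline figure and the broader figure can diverge so widely, here is a minimal sketch of the arithmetic. The labor-force numbers are invented for illustration and are not actual BLS data, and the broader measure below only loosely mirrors the spirit of the government's wider U-6 gauge.

```python
# A minimal sketch of why the headline unemployment rate and a broader measure can
# tell very different stories. The numbers below are invented for illustration;
# they are not actual BLS figures.

employed_full_time = 120_000_000
involuntary_part_time = 9_000_000    # want full-time work but can't find it
unemployed_looking = 15_000_000      # actively searching; the only group the headline rate counts
discouraged_workers = 2_500_000      # gave up searching, so not counted as "unemployed"

# Headline-style rate: active job-seekers divided by the official labor force.
labor_force = employed_full_time + involuntary_part_time + unemployed_looking
headline_rate = 100 * unemployed_looking / labor_force

# Broader rate (loosely in the spirit of the BLS U-6 measure): also count discouraged
# workers and the involuntarily part-time, and add the discouraged to the denominator.
broader_pool = unemployed_looking + discouraged_workers + involuntary_part_time
broader_rate = 100 * broader_pool / (labor_force + discouraged_workers)

print(f"Headline rate: {headline_rate:.1f}%")   # about 10% with these made-up numbers
print(f"Broader rate:  {broader_rate:.1f}%")    # roughly 18% with the same numbers
```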

The stimulus is not a failure, but unemployment is not going to improve any time in the near future, and unfortunately the unsophisticated American electorate is about to be taken for a ride by politicians and pundits who will manipulate that fact for their own ideological ends. Liberal pundits will argue that unemployment always lags behind other indicators, which is true, but in past recessions unemployment never lagged as badly as it has (and will) during the recovery from the Great Recession.

The connection between productivity and unemployment is often debated, but over the last sixty years sudden increases in productivity have always meant that workers were busy (and making money), which almost always resulted in companies hiring more help. This pattern first emerged in 1950, when the post-WWII boom pushed quarterly productivity growth to 14.6% and unemployment subsequently dropped by 33%. A similar scenario unfolded in the early 1980s, when productivity growth suddenly hit 9.6% in the middle of a deep recession, giving companies a flush of income that allowed them to hire new employees; by the end of the third quarter of 1983 unemployment had fallen a full percentage point, and it continued to decline rapidly and consistently for the next two years. Ditto for the first quarter of 2000, when productivity growth of 9.4% was followed by an almost unbelievably low unemployment rate of 3.9% just two months later.

Based on this analysis, it’s obvious why economists of the late nineties predicted that the productivity gains of the new economy (productivity is up 65% since 1970) would create jobs and lead to sustained economic growth. So why didn’t the recent boom in productivity (6.9% and 8.1% in the second and third quarters of 2009, a dramatic change from the 1.8% average of the two previous years) cause a drop in unemployment? Unfortunately, it seems, the answer lies in an aspect of the new economy that has been largely overlooked.

Think about it. How many jobs that were viable career options just a decade ago are now disappearing? And I’m not just talking about factory jobs that are being lost as automation and outsourcing increase. Travel agents have been replaced by websites; airline ticket agents have been laid off as airlines move to e-tickets; real estate agents are being bypassed by buyers and sellers who use the web.

Distributors, once key middlemen in the economy, are being replaced by complex computerized distribution systems like the one Wal-Mart employs, which no longer requires white-collar professionals to sell and distribute products to retail outlets, but instead employs underpaid blue-collar workers without benefits to drive forklifts while a computer places the orders and keeps the cheapest available products flowing into stores.

Tech support and customer service have been outsourced along with white-collar engineering jobs. The post office has completely stopped replacing employees who quit or retire. Teachers and professors are being asked to facilitate online classes that can be run at much lower cost with higher student-to-teacher ratios. Illegal downloading has gutted the music industry, and new software has made DIY recording relatively easy, eliminating thousands of high-paying recording, editing, production, marketing, and distribution jobs in a rapidly disappearing industry.

Journalists are being replaced by bloggers. Locally owned businesses are being replaced by huge chain stores that swap upper-middle-class local owners for lower-middle-class managers with no health insurance coverage. The art of professional photography, which at one time was so marketable because of the difficulty of getting a decent shot with an old-fashioned film camera, has been rapidly eroded by digital cameras that show you the picture instantly, allowing you to get the shot you want without paying a professional for it.

All of this seems great on an individual basis. We now can get whatever music we want whenever we want it. We can have Uncle George photograph our wedding instead of paying a professional photographer thousands of dollars. We can buy cheap crap at Wal-Mart instead of high-priced quality goods from local retailers. We can buy or sell a house without having to pay a commission. All great things. But the side of it all that we tend to ignore is that the more middlemen you take out of the economy, the fewer middle-class jobs there will be that don’t require a college education.

The new economy is turning out to be all about the further stratification of wealth. We used to say that the rich get richer and the poor get poorer back when the wealthiest one percent of Americans was earning just 10% of all pretax income. Now that the same one percent receives close to 22% of all income, the real impact of the new economy is becoming strikingly clear. Increased worker productivity and the disappearance of the middleman have been a great thing for stockholders and corporate CEOs, but they're kicking the middle class in the ass and will continue to do so for the foreseeable future. So if you’re wondering why jobs aren’t coming back despite economic growth, look no further than the computer on which you’re reading this for free right now.