Monday, November 16, 2009

The Singularity and singularity

When I lived in Boulder, I joined a group of fellow nerds in a “Future Salon” to discuss, well, the future. Several of the group’s members, not least its leader Wayne Radinsky, made an impression on me by focusing intensely, one might even say obsessively, on the pace of technological change in semiconductors. They were constantly updating us on the relevance (yes, it’s still valid) of Moore’s Law: computing power has been doubling every 18 months or so since the late 1960s. Anyone aware of compound growth will immediately understand what this means. Quite frequently Wayne and others would let us know about Intel’s latest breakthroughs at the nano-level. I had heard of Moore’s Law before, but the Future Salonnières’ obsession with it opened my eyes to its central importance for our economy and world.
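For those who haven’t run the numbers, a quick back-of-the-envelope sketch makes the force of that compounding vivid. (This assumes an idealized, perfectly clean 18-month doubling; real chip generations are lumpier, but the arithmetic is the point.)

```python
# Idealized Moore's Law: computing power doubles every 18 months.
# After y years, power has grown by a factor of 2 ** (y / 1.5).

def moores_law_factor(years, doubling_period_years=1.5):
    """Multiplicative growth in computing power after `years` years."""
    return 2 ** (years / doubling_period_years)

# Forty years of doubling every 18 months:
print(f"{moores_law_factor(40):,.0f}")  # roughly a hundred-million-fold increase
```

Forty years of 18-month doublings multiplies computing power by about 10^8, which is why a steady-seeming technical trend ends up dominating conversations about the economy and the future.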
This led soon to the Singularity, which I hadn’t heard of previously. The Singularity is a term popularized by Ray Kurzweil, a singular genius who has invented or contributed to much of the technology underlying our high tech world. As far as I could tell (and I have only dipped a small toe in one of Kurzweil’s books), the Singularity refers in this context not to the convergence of space and time in a black hole, but to something just as irreversible: the imminent (within the next few decades) merging of computer and human intelligence, under the lead of computers whose abilities will have inexorably outpaced those of us humans. Most of the devotees of the Singularity appear to believe that this will be a wholly or largely good thing, i.e. that the computers will take over or merge with us with generally good purposes in mind. Kurzweil has apparently placed himself on an ultra-low-calorie diet so that he may live another 30 or so years and thereby participate in the Singularity.
I found all of this fascinating, but also rather dubious. I can’t judge the technical likelihood that computers will actually surpass human intelligence. Yes, exponential growth is tremendous; but will some fundamental physical limits be reached at the atomic level (I thought I’d heard that we were nearing some with our current semiconductor etching technologies, but Wayne tells me we haven’t) which will stop Moore’s Law cold? Can raw computing power, no matter how great, really match and then exceed the creativity of wet human brains forged in millions of years of evolution? Maybe. I don’t know. Regardless (and again, I admit to making these judgments without having read the literature), the advocates of the Singularity seem to overlook the possibility, indeed the likelihood, that humans would intervene well before the stage when computers actually surpass us across the board. Why should we view technological advance as inevitable and out of all human control? The advocates seem to me to be most naïve in regard to the alleged desirability of the computer-human merger. Why should we believe the computer(s) will act benevolently, either in terms of our interests or even their own? If there’s one super-computer running the show, won’t it establish an electronic closed society as have all the human dictatorships in history? And if there are many, why wouldn’t they compete with each other, simply replicating all of humanity’s foibles at the speed of electrons? The nerds’ dream of an electronic heaven seems to me to be a fata morgana: it replicates the illusions both of organized religion and of secular movements of the 19th and 20th centuries built around the hopes of “scientific management” and centralized knowledge. They dreamt of an omniscient (and benevolent) overlord and refused to acknowledge that we are on our own in this world, engaged in different forms and levels of unavoidable combat or at least competition, without any overall referee.
There can never be an Archimedean point of knowledge or control.
More interesting to me than the Singularity is singularity, the question of what kinds of individuals will exist and thrive in a world in which Moore’s Law applies. Computers are doubling their power every year or two because many different kinds of people want it to be so. Nerds love the challenge. Corporations and their marketers love the money. Ordinary people love the stimulation (ever more options on smart phones, ever more realistic video games, ever faster internet, ever more cable stations, etc.). As I see it, the first two groups are benefiting most. Ordinary people think they’re benefiting, of course, but I think an enormous craze for stimulation is sweeping over us. I don’t know how deleterious the addictions are or will become. I know that many of my students cannot put away their cell phones in class, even when I threaten them with severe penalties, and many of them admit they have an addiction. Will these habits undermine people’s abilities to achieve what they say they want – success in career, friendships, relationships? How will they affect deep human relations, which studies show to be the best predictor of happiness and are probably a prerequisite for other “goods” as well, such as civic engagement? Will all the gadgets and social media turn out to be more like coffee – addictive, to be sure, but for most people an enrichment of social life – or like cigarettes – addictive and in the long run destructive of health – or like crack cocaine or methamphetamines – immediate wreckers of lives? Perhaps at present the harm seems minimal. Perhaps it will never become as visible as the harm done by cigarettes or hard drugs. But for that reason, it may grow to be all the more insidious. Nobody will see the damage, least of all the addicts themselves. How will this change power relations in our society? Will the masses not be lulled into complacency and indifference? (If I'm sounding strangely like Adorno here, so be it.)
These new narcotics have grown out of fully legitimate industries and desires. They enjoy the full backing of the law. No government, at least no democratic one, will ever ban their use, though perhaps more laws will restrict them in particular settings, as with limitations on texting while driving. And Moore’s Law says that the power of the drugs – the speed and effectiveness of the stimuli – will only continue to grow.
Who’s right – the optimistic visionaries of the Singularity, or the pessimistic mourners of disappearing singularity and individual independence and even "sobriety"? And if the latter, what should, what can we do about it?

Saturday, November 14, 2009

Bismarck and Kafka

Since no one comes to Queens, where I live, I often end up taking the subway into Manhattan and Brooklyn to see friends, a trip that takes from 20 to 40 minutes. I recently started using this time to at least dip into some of the many books on my shelves that I’ve never read, in particular literature and poetry (a recent interest).
A couple of days ago, on a trip to Brooklyn, I started reading a collection of love letters from Bismarck to his beloved wife, Johanna. The very first letter, not to Johanna but to her father, revealed a whole, vanished world. In it, Bismarck asks the father for the hand of his daughter. In rich, complicated sentences (it’s hard to imagine any politician, indeed almost anybody, nowadays formulating such complex thoughts, or writing such a long letter, not to mention asking a potential father-in-law for permission to marry) the future German unifier reveals a deeply personal side of his past. He describes the spiritual emptiness that engulfed him as he lost faith in God – and then the rebirth he experienced, not least thanks to Johanna, as he rediscovered that belief. It’s a remarkable confession – in its revelation of weakness and despair by a strong man, in its self-reflection, in what it shows about the role of Christianity and faith and about the social relations governing courtship in earlier times. To compare all this to the world of internet dating today! It’s almost as if we’re dealing with two different kinds of humans, two different worlds.
Then yesterday I began Kafka’s In the Penal Colony, which I finished today (short enough that it took just four rides). Remarkable how Kafka uses language so sparingly to create an entire atmosphere of alienation, dread, mutual incomprehension (among the characters themselves, and between them and us). At the same time, the story seemed more obviously political – there are clearer “sides,” more easily recognized good and deranged parties – than in the other works of Kafka’s I’m familiar with.

Wednesday, November 11, 2009

A Sketch of Human History

If I were to channel Condorcet or Hegel on the big sweep of human history, this is what I would say.
Human history begins about 200,000 years ago with the Great Leap Forward: human language capabilities become fully formed, enabling social learning to blossom. That is, to a far greater extent than even their closest hominid ancestors and relatives, Homo sapiens are no longer limited to their genetic repertoire combined with individual learning; they can now learn from each other, socially. This launches a whole new, much more rapid, stage of evolution – cultural evolution. Religion, art, significantly more advanced tools, even trade networks all emerge after 200,000 BP.
However, this potential for rapid progress is held back by the *sparseness of human population*. Foragers (hunter-gatherers) need lots of space to survive, and the expansion of humans starting around 100,000 BP out of Africa into Eurasia, then Australia and the Americas allows foragers to maintain this scattered lifestyle. In the absence of dense populations, social learning does not occur as rapidly as it otherwise might.
The agricultural revolution starting around 10,000 BCE and the first emergence of agrarian civilizations around 3,500 BCE appear to “solve” this fundamental limitation of the foraging phase. Humans can now live in much denser settlements. Social learning and progress should take off. However, agrarian civilizations remain stagnant in many ways across the millennia. Why? Because a new impediment to social learning and progress has arisen hand-in-hand with agriculture and agrarian civilization: *hierarchy*. Whereas foragers lived in basically egalitarian (and small) groups, agrarian civilization is characterized by enormous gulfs of wealth and power. The powerful hold back progress for millennia. Political power-holders (“macro-parasites,” in William McNeill’s phrase) leech off their peasants, making property insecure and preferring (their own) political security to the threatening dynamism of economic growth. Meanwhile, intellectual power-holders (religious authorities) guard their monopolies, preventing alternatives from arising and intellectual innovation from occurring. One impediment to social learning and progress – sparseness of population – has been replaced by another – hierarchy.
Finally, in the last few centuries, parts of north-western Europe pioneer a path out of hierarchy. Arbitrary political power is tamed (through parliaments and the rule of law) and the intellectual monopoly of the religious authorities is broken by science, religious toleration, and legal guarantees of intellectual pluralism. In the modern age, 200,000 years after the possibility of social learning emerged, its promise is finally being realized, as humans are free for the first time from the successive handicaps of sparse population and hierarchy.
– Georg Wilhelm Friedrich Meskill

Wednesday, November 4, 2009

The Great Divergence

If the transition to modernity was the main spur prodding the early giants of social science to investigate how society works and changes, a secondary and related puzzle was why this transition happened in Europe first (or even exclusively). To this day, Why Europe? – namely, why industrialization, the break-out from Malthusian traps, the breakthrough to parliamentary, law-based polities, and the emergence of science all began in this relatively small, unprepossessing corner of the great Eurasian landmass – remains THE great, framing question of the social sciences.
Kenneth Pomeranz’s The Great Divergence: China, Europe, and the Making of the Modern World (2000) has been one of the most notable attempts in recent decades to take on this challenge. Pomeranz’s strongly revisionist account rejects most of the dominant strands of current thought, which have sought deep historical origins for Europe’s special path. Instead, he proposes that Europe diverged from other great civilizations, notably China, only in the 19th century. Furthermore, this special path was not inevitable, but rooted in historical contingency.
By the 18th century, Europe, China and other leading agrarian civilizations were all fast approaching ecological limits to growth. Crises induced by deforestation, declining soil fertility and related problems threatened to derail the low capital-intensity proto-industrial expansions underway in some places well short of true industrialization. While exactly this happened in China, Europe escaped the same fate thanks to the fortuitous presence of coal deposits near centers of proto-industry and commerce (in China, on the other hand, coal was in the northwest, hundreds of miles from the commercial center in the Yangzi delta) and thanks to the windfall of the vast territories of its colonial empires, above all in the Americas.
The Great Divergence is a terrifically impressive book. Pomeranz has command of vast amounts of literature relating to European, Chinese, and other economic histories. He’s able to summarize and categorize arguments in very helpful ways. I learned a good deal about European economic history and historiography from this China expert! Furthermore, he argues very methodically and empirically, doing his best even when the data support only speculative conclusions. I came away almost convinced that as late as the 18th century, parts of China were as advanced or poised for a breakthrough as the northwestern regions of Europe were (Pomeranz emphasizes the importance of moving away from the scale of “China” or “Europe” and instead focusing on smaller regions within each). I was particularly struck by his argument that economic historians have overemphasized the importance of labor-saving devices while ignoring a problem of at least equal weight, at least before the dawn of scientific chemistry and other technological innovations in the 19th century and the vast improvements in productivity these permitted, namely, the central limiting role of land and physical resources.
Nonetheless, Pomeranz’s argument left out some crucial matters. First, he discusses European colonies and the advantages these conferred as if they were a windfall, a matter of luck, and not something that itself was in need of explanation. In the early 15th century, nearly a century before Columbus, the Chinese sent out fleets that reached as far as East Africa and whose ships dwarfed the Niña, Pinta, and Santa Maria. Yet those expeditions did not lead to a Chinese global empire. Why not? Second, he doesn’t adequately treat the European lead in science, which was evident and growing by 1600 at the latest. There’s a debate about just how important the scientific revolution was to European industrialization, at least before the second half of the nineteenth century. Many scholars have argued that there was little transfer into economically relevant technology before the chemical and electrical industries developed. Floris Cohen, however, a respected historian of science, has argued that the European understanding of the physics behind the vacuum was crucial to the development of the steam engine. Third, Pomeranz’s argument that Europe benefited from the Americas *in the 19th century* runs afoul of timing. By this point the colonies in North and South America had gained their independence; their former European masters had to trade with them for their raw materials. What was preventing China from doing the same? Finally, and relatedly, once industrialization took off in England, it spread fairly rapidly to other parts of western and central Europe – but then stopped abruptly, as if at a firewall. The Ottoman lands, India, and China didn’t immediately jump on board and try to emulate the Europeans. So, again, we must ask, if China was equal with Europe in so many respects in 1800, what prevented the Chinese from adopting such successful innovations?
The first two critiques of The Great Divergence – about colonialism and science – point in the same direction, toward the critical role of institutions and how societies were organized. Two outstanding works address Europe’s advantages in colonialism and science, respectively, in remarkably similar ways. David Abernethy’s The Dynamics of Global Dominance and Toby Huff’s The Rise of Early Modern Science argue that Europe’s advantage lay in its greater organizational and institutional capacity. Abernethy shows that European states, chartered companies, and Church bodies all contributed, together or singly, depending on circumstances, to a “triple assault” on weaker societies that even the strongest of the other civilizations, built around despotic rulers, extended families, and individual entrepreneurs, could not hope to match. Similarly, Huff points to the semi-autonomy and longevity the corporate structures of European universities granted to intellectual inquiry. In China and the Islamic world, by contrast, scholarship and science depended much more on the support – on the whims – of individual rulers or benefactors. I suspect Europe’s organizational capacity, which was itself rooted, as Huff points out, in the so-called Papal Revolution of the 11th-13th centuries and the concomitant legal revolution, will provide an important element of any successful explanation of Europe’s divergent path.