
November 06 2013

19:35

Recent Books Read

[This column first appeared in the Shift Age Newsletter]

People often ask me what books I read.  I sense that they expect some kind of unique or weird answer from a futurist.  In addition to this question, I am often given books or the names of books to read, usually books that have transformed that person's life.  So I often find myself in book conversations.

So here is a list of recent reads that I have greatly enjoyed.  In a future column I will list some older favorites that stand the test of time, even for a futurist.

Recent Non-Fiction

"The Unwinding: An Inner History of the New America" by George Packer is a brilliant book.  It is beautifully written and uniquely structured.  It is perhaps the best, most human book I have read about the unraveling of American democracy and the profound changes that have occurred in the last 30 years.  A great storyteller with an eye for telling details, Packer writes with compassion about how much of America has been hollowed out by Wall Street and globalism.

"The Next Decade: Where We Have Been and Where We Are Going" by George Friedman is a deeply thoughtful, high-level look at how geopolitics will unfold between now and 2020.  Friedman fills a gap for me, as he is so strategically thoughtful about geopolitics.  As those of you who have read my books or heard me speak know, I am much more focused on the big, technologically influenced future of humanity than on geopolitics.  I would recommend that anyone wanting to let go of the weeds of policy and politics, and step up to some deeper, historically based analysis of how geopolitics may play out in the next ten years, pick this one up.

"Distrust That Particular Flavor" by William Gibson.  I often quote the famous saying from Gibson: "The future is here, it is just not evenly distributed."  Here is an author of fiction who coined the term "cyberspace" and brought it to wide attention in his hugely successful first novel, "Neuromancer," and who is now writing non-fiction.  The fictional future he wrote about in the 1980s and 1990s is now here and is real.  So it makes sense that he now writes non-fiction about the real world he once only imagined in his novels.  This is a collection of essays by this great, future-facing author.  If you haven't read any of Gibson's fiction and you like science fiction, you are in for a treat with almost any of his novels.

Two Books About the Younger Generations

Those who have heard me speak or read my work know that I focus a good bit of attention on the two generations I call the Millennials and the Digital Natives.  In my last book, "Entering the Shift Age," I call them the Shift Age generations.  There are two books about these new generations that I would highly recommend.

First is "Hooked Up: A New Generation's Surprising Take on Sex, Politics and Saving the World" by Jack Myers.  Jack is a longtime friend and collaborator and someone I listen to carefully about the future of media and advertising.  In this well-researched book Jack takes a look at what he calls the "Internet Pioneers," the generation now in high school and college, and at how different and powerfully unique they are.  This is the part of the Millennial generation born between 1991 and 1995.  If you want to understand your children and your future customers, this is a great read.

Second is "Disrupted: From Gen Y to iGen: Communicating with the Next Generation" by Stefan Pollack.  I got to read this book in final draft PDF form, as I wrote the foreword.  Pollack is a friend and a PR maven.  In this book he looks at what he calls the iGen, the children born between 1994 and 2004.  His focus is on how differently they need to be communicated with, and on how differently these young people perceive and use media and technology.

If you want to really dive deep into a better understanding of the generation(s) born since 1991, reading these two books is the best way to go.

Fiction

Last year, while writing "Entering the Shift Age," all of my reading was related to that book, so I didn't allow myself the option of reading fiction.  This year I have read a number of novels with great pleasure.  Here are three science fiction books I have enjoyed this year.

"Stand on Zanzibar" by John Brunner.  I first read this book in 1970, shortly after it came out.  It had such an impact on me that I reread it this year.  What is striking about reading it in 2013 is how prophetic it turned out to be in its view of Earth in this new millennium: genetic mapping, an eviscerated Detroit, a mega-multinational corporation and even a president [admittedly of an African country] named Obomi.  I read the newer edition, with a foreword by Bruce Sterling, which triggered reading the next book.

"Schismatrix Plus" by Bruce Sterling.  This was the first book of Sterling's that I have read.  He is considered one of the great science fiction writers of the last 20 years, and I now understand why.  This space adventure of fantastic human worlds in the future has a continuing theme of genetic and physical rebirth, allowing people to live for hundreds of years, again touching on an emerging trend of today.  I intend to read more Sterling, for sure.

"The Drowned World" by J.G. Ballard.  A reissued classic first published in 1962, this is an atmospheric future-fiction novel about Earth after significant global warming.  The big cities of the world are under water, the equator is too hot for human habitation and nature has undergone a rapid evolutionary regression.  Given today's rising sea levels, this is an extreme view of what might eventually occur.  Again, an author I intend to read more of in the future.

In a future column I will list some books that are classics and which gave me the impetus to become a futurist.

April 19 2013

11:05

Time to get tough on the physiological causes of crime

Bob Holmes, consultant

Early intervention could counteract the effect of an abusive childhood (Image: Oscar B. Castillo/Fractures Collective)

In The Anatomy of Violence, Adrian Raine makes a strong case - often based on his own research - that distinct biological traits shape criminal behaviour

SUPPOSE you had to predict which kids in a roomful of 3-year-olds at your local preschool were likely to grow up to be violent criminals. How would you decide?

Most of us would probably round up the usual sociological suspects, and check whether a child comes from a broken or abusive home, is part of a family living below the poverty line, or has a parent who is a convicted criminal. But there's an easier way, says Adrian Raine: just measure their resting heart rate. His research shows that lower heart rates are a better indicator of criminal behaviour than smoking is of lung cancer.

In The Anatomy of Violence Raine, a criminologist at the University of Pennsylvania in Philadelphia, uses this and other evidence - much of it unearthed by himself over decades of research - to build a convincing case that violent criminals are biologically different from the rest of us.

"The seeds of sin are brain-based," he writes. Genetics, accidents of birth or events in early childhood have left criminals' brains and bodies with measurable flaws predisposing them to committing assault, murder and other antisocial acts.

That is good news, he argues, because what's broken can, in theory, be fixed. Indeed, tests have already proven that the right kind of intervention can reduce violent crime dramatically.

Back in the 1970s, when Raine began his career, sociologists believed that criminality was entirely the product of circumstance: factors that affect the less advantaged members of society, such as poverty, neglect and poor education.

Suggesting, as Raine did even then, that some people might have an innate predisposition to violence struck most as misguided or even racist. But the ground has shifted over the past few decades, and Raine spends the first few chapters of the book setting out why.

Identical twins, for example, are now known to be more likely than fraternal twins to share antisocial behaviours - which suggests this is an inherited trait. More detailed studies show that about half of the variability in antisocial behaviour between individuals has a genetic basis. Even identical twins brought up separately show a shared tendency towards criminal behaviour. Part of this can be explained by the few specific genes now known to influence violent behaviour, most notably the "warrior gene", MAOA.

From genes, Raine turns to brains. Back in the 1990s, he led the first study to image the brains of convicted murderers. Using PET scans, he found that their brains showed reduced activity in the prefrontal cortex, the region just behind the forehead that controls impulses and is responsible for planning. In other words, the murderers were less able than average to restrain themselves in stressful situations.

The same holds true for less extreme violent crime and conditions associated with it. When Raine scanned the brains of people diagnosed with antisocial personality disorder, he found that their prefrontal cortexes had 11 per cent less grey matter than those of individuals who did not have the condition.

Ingeniously, Raine discovered an excellent source of antisocial and psychopathic research subjects: temporary employment agencies, which can attract people unable to form stable, long-term commitments. In one study, nearly a third of these agency workers met the standard criteria for psychopathy.

Not all brain differences have a genetic basis, though. Raine presents plenty of evidence that mothers who smoke or drink can change the brains of their developing fetus. Even testosterone levels in the womb can alter the size of the prefrontal cortex - it is smaller in males, which may be part of the reason most violent crimes are committed by men. Complications at birth or abuse in childhood can also damage the brain, and the prefrontal cortex is especially vulnerable.

Many offenders also have impairments in their autonomic nervous system, the system responsible for the edgy, nervous feeling that can come with emotional arousal. This leads to a fearless, risk-taking personality, perhaps to compensate for chronic under-arousal.

Many convicted criminals, like the Unabomber, have slow heartbeats (Image: NYT/Eyevine)

It also gives them lower heart rates, which explains why heart rate is such a good predictor of criminal tendencies. The Unabomber, Ted Kaczynski (above), for example, had a resting heart rate of just 54 beats per minute, which put him in the bottom 3 per cent of the population.

Biology is not destiny, however, and Raine is careful to note that none of these physical differences guarantees a life of crime. Raine himself is a case in point: his resting heart rate is 48 beats per minute, and his brain scans look more like those of many murderers than those of most law-abiding people. Without doubt, environmental factors also play a role in tipping someone towards crime. Indeed, it often takes both bad biology and bad surroundings to induce criminal behaviour.

Even so, it seems clear that biological factors underlie much criminal conduct. If so, Raine argues, we should treat it as a medical condition. "Treating the physical causes will work more quickly and effectively than repairing the complicated social factors that also contribute to criminal behaviour," he argues.

A few successful tests show that this approach can, indeed, work. On the island of Mauritius, for example, Raine and his colleagues provided extra nutrition, exercise and intellectual enrichment to a group of children for two years, beginning at age 3. When they tested them at age 11, they found the children's brains had matured faster than those of kids from comparable backgrounds who had not been part of this programme - and, a decade later, they showed 35 per cent less criminal behaviour.

Up to this point, Raine's book is a masterpiece. He has the research at his fingertips - not surprising, since he carried out much of it - and makes a compelling case that society needs to grapple with the biological underpinnings of violent crime just as vigorously as the social causes, if not more so. And he presents all this in a lively, engaging style, leavened with plenty of case studies of notorious murderers.

Future police might seek brain scans to prevent crimes occurring (Image: AFP/Getty)

In the final few chapters, however, Raine sets aside his academic cap and tries on a more speculative one, with somewhat less success. Suppose, for instance, that in another 20 years our knowledge has progressed to the point where we can scan someone's brain and predict that they have better-than-even odds of committing a murder in the next few years. What should we do? Raine spins a dystopian fantasy of a future where at-risk individuals are forced into treatment centres and held there until their brains can be modified to reduce the risk.

Pushing the limits even further, he imagines screening and re-educating children, or even requiring prospective parents to prove their knowledge and fitness before being granted a parenting licence. All this is certainly interesting - but Raine sheds his authority when he crosses the line between science and speculation. Back to the lab.

This article appeared in print under the headline "Devil in the DNA"

Book information
The Anatomy of Violence: The biological roots of crime by Adrian Raine
Pantheon
$35


April 09 2013

11:28

Get off my back! How to reduce your stress levels

Michael Bond, consultant

Ignoring stress's social background puts the burden of change on the poor (Image: Steve Liss/Polaris/Eyevine)

Two opposing strategies for dealing with the stress of modern life have been put forward by Dana Becker and Marc Schoen, but which is best?

STRESS is the epidemic of our age, or so it seems: a disfiguring consequence of modern life that we all succumb to from time to time. Yet it is hard to know what it really is, other than a miscellany of physical and psychological symptoms covering everything from anxiety to hypertension. The original medical definition, which, as its derivation from mechanics suggests, is concerned specifically with an organism's response to external pressures, has all but vanished from view.

The effect of the external environment is central to some of the most telling scientific studies on stress, such as those exploring links between wealth inequalities and brain development. But this is not how most of us - or indeed most scientists - talk about stress. The focus has now turned inward, from environmental causes to medical solutions and what individuals should do to cope.

The result of this recalibration, initiated partly by the discovery that stressful experiences affect people's immune systems in different ways, is a vast market for biomedical and psychological interventions. In the scramble for drugs and therapy, the social and developmental context of stress and stress-related disease is conveniently ignored. Children with chronic behavioural issues, for example, are diagnosed with "conduct disorder", a label that pathologises their shortcomings and disregards the deficiencies in care and upbringing that are likely to have contributed to them.

In One Nation Under Stress, Dana Becker argues that the medicalisation of stress and the current infatuation with neurobiology is a disaster for societies, and particularly for women. The problems women face daily in balancing work and family, for example, are so strongly shaped by social attitudes that they have most to lose when social conditions are ignored.

Ignoring the social background to stress, she says, puts the burden of responsibility on vulnerable people to change themselves - to solve their own problems - and it condones the external conditions that lead to their suffering. It allows us to avoid the larger problems. The upshot, writes Becker, is that it becomes "far easier to talk about the 'stressed' African American single mother, say, than to think about the effects of de facto school segregation in our cities, or the effects of discrimination on employment opportunities, or the shortage of affordable childcare".

Becker is a family therapy specialist at Bryn Mawr College in Pennsylvania with a long interest in cultural and historical attitudes to illness. She is a sharp observer of the social and cultural implications of modern attitudes to stress, such as the tendency of researchers and the media to exaggerate the incidence of post-traumatic stress disorder among both civilians and soldiers.

One Nation Under Stress reads like a manifesto against the current order, and few areas of medicine emerge unscathed. Becker sounds angry and occasionally bitter, which can make for a difficult read. She is convincing but also frustrating, for she offers few solutions, short of the need to "make substantive structural changes in our society", such as reducing inequalities.

Here, she makes the radical and clever suggestion that poverty should be viewed (by researchers and funders presumably) as a direct cause of illness and death, since it is well established that poverty leads to a greater risk of hypertension, depression, heart disease and other life-threatening conditions. But she fails to show how that could affect how science is carried out. Does she want money diverted from biomedicine to social sciences? Should we just give up on trying to discern individual differences in the way people deal with environmental and social pressures?

Marc Schoen's Your Survival Instinct is Killing You, on the other hand, offers to "retrain your brain" to better cope with the stresses of modern living. It looks like just the kind of approach Becker hopes to banish.

Schoen is a clinical professor of medicine at the University of California, Los Angeles, and a strong believer in the ability of the mind to influence the body (he's been hypnotising people since he was 16). He argues that a growing intolerance of discomfort is a major factor behind many disorders, from insomnia to obesity. The modern world puts us on edge, which triggers maladaptive behaviours that then become entrenched.

He suggests ways to correct this distorted response to stress. Some make good sense, though are easier said than done: wind down before trying to sleep, stop procrastinating, delay your need for gratification, use a breathing meditation, learn to experience discomfort without reacting to it.

Others, such as "take a technology time out", seem dubious because they are predicated on the notion that modern technologies are bad for us. Where's the evidence that using computers leads to "a lower tolerance for ambiguity" and makes us less inclined to tolerate human imperfections?

Such self-help solutions are alluring for the reasons that grieve Becker most: we have so little control over the things dictating our health that the best we can do is adapt and survive. Her enduring point is that this is not a level playing field, since those whose living conditions make them most susceptible to stress have the least access to the tools that would help.

She hails a much-needed revolution, but until it arrives (or is at least signposted) people are likely to grab at anything that offers help and researchers will no doubt strive to provide it.

This article appeared in print under the headline "Get off my back!"

Book information
One Nation Under Stress: The trouble with stress as an idea by Dana Becker
Oxford University Press
$35

Your Survival Instinct is Killing You: Retrain your brain to conquer fear, make better decisions, and thrive in the 21st century by Marc Schoen
Hudson Street Press
$25.95



April 05 2013

10:53

Real life with robots in 20 and 200 years

George Annas, contributor

(Image: Stefano Pauesi/Contrasto/Eyevine)

C-3PO and the Borg inspired Illah Reza Nourbakhsh to imagine the reality of living with robots. The plausible scenarios of Robot Futures are the result

IN ROBOTICS, there is a fine line between science fiction and science fact. Illah Reza Nourbakhsh turns out to be a master of straddling that line. In this accessible, smart book, he suggests how plausible robot futures could change our lives.

Nourbakhsh, a roboticist at Carnegie Mellon University in Pittsburgh, Pennsylvania, was enchanted by the human-shaped C-3PO and friendly R2-D2 in Star Wars when he was a child. But he also came to understand the world of the interconnected Borg from Star Trek and Blade Runner's dangerous replicants. This is clear from his near-future scenarios, set in 2030, 2040 and 2045. These have robots collecting personal data for marketing, toys that threaten our public spaces, and reservations agents that are indistinguishable from humans.

More speculatively, suggests Nourbakhsh, by 2231 some robots could look like us. But only if we use a real human body as a shell and populate it with trillions of nanobots working together to make an artificial nervous system.

He is less sure-footed when it comes to robotics laws and ethics. For example, it is not clear that treating those nanobot-run "human" bodies as property will lead us to devalue our own human bodies, and later, human rights, as he asserts, although it certainly could. But he is right to assert that using law and ethics to help channel research and protect our privacy and rights is a challenge we ignore at our peril.

As he argues so eloquently, "citizens and local communities" motivated "to contribute to a sustainable quality of life" need a real role in decision-making if robots are to do more than empower corporations, the military and governments, while disempowering us.

George Annas is a bioethicist at Boston University

Book information
Robot Futures by Illah Reza Nourbakhsh
MIT Press
$24.95



April 04 2013

14:12

Time for economics to shed its fanciful past

Stephen Battersby, consultant

(Image: Jean-Luc Bertini/Picturetank)

Make way for the physicists! Economics is finally becoming an enlightened science, say new books by Mark Buchanan and James Owen Weatherall

SOMETHING is stirring amid the swamps of the dismal science. It speaks of brass beads and rice piles, of power laws and positive feedback, of phase changes, chaos and complexity. Resembling an actual science, it might come to have some bearing on the real world and help reduce future financial catastrophes. That is the main message of Mark Buchanan's new book, Forecast.

This new science is battling to escape the dominant theory of neoclassical economics. That "almost psychotic fantasy", as Buchanan puts it, is as absurd as a theory of weather that says storms do not exist. Its adherents see only equilibrium where there is dynamism, and its edicts have made markets and the global economy more unstable.

In the 1930s and 40s, Buchanan recounts, American economist Irving Fisher and others advanced a view of the economy as a dynamic, unstable beast, but over the following decades others worked hard to bury the insight. Milton Friedman and his allies in the Chicago School promoted the idea of equilibrium in markets and economies.

With forbiddingly intricate equations, they proved that a perfect market of rational people, coolly calculating how to maximise their gains, reaches an equilibrium where all goods are priced at the optimum level. In this best of all possible worlds, says Buchanan, any change would reduce the overall wealth. Adam Smith's invisible hand was drawn in rigorous mathematics.

Of course, we do not act like rational automata. If we did, only the literally starving would choose to take out a "payday loan" at 2000 per cent interest. Instead, we guess, panic and follow whims, hormones or herds. Psychologists have long known how much other people affect our perceptions and behaviours.

But in a remarkable feat of logic, Friedman argued that for a hypothesis to be important, it "must be false in its assumptions". Which makes neoclassical theory sound very important indeed.

And the theory fails to account for bubbles or market crashes. No matter, say proponents: external shocks, from outside economics, are behind all those unfortunate events. So the "equilibrium delusion", as Buchanan calls it, has persisted. Now deregulation and derivatives have moved us closer to that perfect market in which anything can be traded with anyone at any time.

As you may have noticed, this did not turn out well. The global economic crisis, writes Buchanan, now looks like "an unusually powerful hurricane that was brewing... for half a dozen years while economists - their textbooks on financial weather dealing only with the theory of calm blue skies - insisted that everything was fine".

His attack on the Chicago School is unmerciful, but the book's main aim is more positive: describing how physicists have become financial storm-chasers. Buchanan is not afraid to use a few graphs, but that is no bad thing, and he tells a lucid, absorbing story. In short, physicists saw that markets follow statistical patterns called power-law distributions, which also crop up in complex systems, from earthquakes to the weather, and in simpler things such as avalanches in a rice pile. These systems tend to organise themselves into critical states, where large and violent events can have small and ordinary origins.
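
To make that contrast concrete, here is a small Python sketch (not from either book; the numbers are purely illustrative) comparing how often extreme price moves occur under the textbook Gaussian assumption and under a power-law tail of the kind physicists report in market data:

```python
import math

# Tail comparison (illustrative only): how often do "extreme" daily moves
# occur if returns are Gaussian, the textbook assumption, versus power-law
# distributed, the pattern found in real market data?

def gaussian_tail(k):
    """P(|X| > k standard deviations) for a normal distribution."""
    return math.erfc(k / math.sqrt(2))

def power_law_tail(k, alpha=3.0, k_min=2.0):
    """P(|X| > k) for a distribution whose tail beyond k_min falls off as
    k**(-alpha); an exponent around 3 is often quoted for stock returns.
    The tail is matched to the Gaussian at k_min so the two models agree
    for ordinary, everyday moves."""
    if k <= k_min:
        return gaussian_tail(k)
    return gaussian_tail(k_min) * (k_min / k) ** alpha

for k in (3, 5, 10):
    g, p = gaussian_tail(k), power_law_tail(k)
    print(f"|move| > {k} sigma:  Gaussian {g:.1e}   power law {p:.1e}   "
          f"ratio {p / g:.1e}")
```

Run it and the Gaussian model says a 10-sigma crash should essentially never happen, while the power-law tail makes such an event rare but entirely expected over a market's lifetime; that is the gap between calm-blue-skies textbooks and real financial weather.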

A new breed of financial models does the same. Based on simple, interacting agents that chop and change strategies based on past performance, the models show that the largest market crashes can be triggered by a single trade. And they can explain oddities, too, such as how markets change phase, like melting ice, when they get too crowded.

The models also show how proliferating derivatives and increased leveraging (borrowing to invest) make a market less stable. Just as we have driven the weather to be more extreme, so we have driven markets to be more violent. In future, Buchanan hopes similar models will help head off trouble.

In The Physics of Finance, James Weatherall takes a more winding road, through tales of scientists drawn into finance. He runs from the 19th-century French mathematician Louis Bachelier's work on the randomness of market prices to today's "quants", the quantitative analysts, often recruited from physics, who devise complex trading algorithms. He covers critical phenomena, too, but ranges more widely than Buchanan, telling of physicists' bids to beat casinos at roulette, and how theories from particle physics could be used to measure inflation.

This is all very engaging, and makes an easy, informative read. But it can also be a little frustrating as we skim the surface of ideas without gaining the same sense of insight that we do from Forecast.

Some of the converts to "econophysics" have called for a new Manhattan Project for economics. It may be a good plan, but the name carries radioactive connotations - and physicists must recognise the danger of radical new tools and theories.

Weatherall argues, reasonably, that quants alone cannot be blamed for the recent crisis. Even so, the input of physicists has clearly helped destabilise markets. Take the Black-Scholes equation: based on diffusion equations in physics, it has been used to price arcane derivatives. And recently, high-frequency algorithmic trading has led to "flash crashes", where a stock dips massively for under a second. If physics inspires more effective economic theories, they may in turn change the nature of markets in profound, possibly dangerous ways.
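
The review does not reproduce the formula, but for reference the Black-Scholes model boils down to a partial differential equation linking an option's price V to the underlying stock price S, its volatility sigma and the risk-free interest rate r; its mathematical form is that of a diffusion (heat) equation, which is why it travelled so easily from physics to finance:

```latex
\frac{\partial V}{\partial t}
  + \frac{1}{2}\sigma^{2} S^{2} \frac{\partial^{2} V}{\partial S^{2}}
  + r S \frac{\partial V}{\partial S}
  - r V = 0
```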

This is reason for caution, but wilful ignorance is not an option. Some hedge funds use the latest physics-inspired dynamic models, and the light is dawning even in government institutions. While the European Central Bank still uses equilibrium models to make growth predictions, its former president, Jean-Claude Trichet, admitted that existing models failed during the crisis, and that central banks could learn from the new models.

Economics may be reaching its age of enlightenment, becoming a rational science by abandoning the idea that we are perfectly rational. The field may even warrant its own Nobel one day.

Book information
Forecast: What physics, meteorology, and the natural sciences can teach us about economics by Mark Buchanan
Bloomsbury Publishing
£19.99

The Physics of Finance: Predicting the unpredictable by James Owen Weatherall
Short Books
£12.99




April 02 2013

16:00

Has technology forced us into a 'present shock'?

Jim Giles, consultant

It is no longer possible for leaders to control how a political story unfolds (Image: Larry Downing/Reuters)

In Present Shock, Douglas Rushkoff says everyday technologies have destroyed our sense of perspective, but his insights need better backup

TOWARDS the end of Present Shock, the media theorist Douglas Rushkoff describes a radio talk show participant called Cheryl. She had phoned in to discuss the white trails that aircraft leave behind as they pass overhead.

Like other followers of the "chemtrail" conspiracy theory, Cheryl argued that these clouds contain chemicals that governments are distributing for some unknown but certainly nefarious purpose. Perhaps, she suggested, the aim was the creation of a planet-wide system for causing earthquakes. "Illuminating," replied the host, apparently in earnest.

Rushkoff knows why people like Cheryl think the way they do. In recent decades, he says, everyday technologies have forced us into a discombobulated state of constant alert. Our phones beep around the clock with news of emails, tweets and text messages. And entertainment networks have largely abandoned long-form narratives in favour of the strobe-like intensity of reality television.

Changes like these, thinks Rushkoff, have robbed us of the ability to pause and to put events into context. We are afflicted with "present shock". Among other things, it causes some of us, like Cheryl, to see connections where there are none.

Arguments of this nature, which link technology with major cultural and social changes, involve joining up a lot of dots. Rushkoff starts the process at an intriguing point: the invention of the remote control. Once channel-switching became effortless, viewers became less tolerant of character development and the other mechanics of narrative that help build a show up to its climax.

To keep newly impatient audiences locked in, networks had to up the frequency of dramatic events in their shows. Hence the creation, among other new genres, of reality television, where scandals hit the screen so regularly that viewers never hit the remote.

This loss of narrative has had knock-on effects. All disciplines rely on familiar grand narratives - the triumph of the underdog or the comeback of a fallen great in sport, say. But these don't resonate in our "post-narrative" world, says Rushkoff. It is one reason why, he argues, attendance at American football and baseball has declined in recent years.

Something related is also at work in the world of news, he says. Now that the audience has no time for narrative, politicians are finding it harder to control the story they attempt to create about themselves. The result is policy-making on the fly.

These are big claims, but Rushkoff doesn't come close to substantiating them. Sometimes he simply gets his facts wrong: baseball attendance, for example, rose over the past two seasons. It is true that crowds thinned for a few years from 2007, but if you had to bet on the cause, would you go for the great recession, which began in 2008, or the death of narrative?

Rushkoff's other examples aren't much more helpful. He uses George W. Bush's so-called "mission accomplished" statement, made years before US forces actually withdrew from Iraq, as an illustration of how it is no longer possible for our leaders to control the way a political story unfolds. Yet Bush was hardly the first ruler to suffer from moments of hubris.

These careless examples irk, but others are almost offensive. At one point, Rushkoff notes the mental burden caused by multitasking - such as simultaneous tweeting, watching of television and talking on the phone - and suggests it may be a cause of teen suicide.

It is a bizarre foray into a complex issue. Mental health specialists have debated numerous causes of suicide in young people, from substance abuse to genetics. Whatever the causes, suicide rates among young people have remained relatively flat in the US during the past two decades.

Such loose sourcing and flawed examples have an ironic result: I was left feeling that Rushkoff has identified some interesting and potentially important phenomena, but has woven them together so carelessly that the theory he has created disintegrates on close examination. Which, if you think about it, isn't a bad definition of a conspiracy theory.

This article appeared in print under the headline "Big bold claims"

Book information
Present Shock: When everything happens now by Douglas Rushkoff
Penguin
$26.95


March 28 2013

17:31

Our enduring love affair with 'flying jewels'

Shaoni Bhattacharya, consultant

(Image: Ray Tang/Rex)

“My parents took me to a butterfly house when I was 6. That Christmas, I asked for a greenhouse.” By the age of 8, Luke Brown had bred his first butterfly, the zebra longwing, Heliconius charithonia. Now as manager of the Sensational Butterflies exhibition at the Natural History Museum in London, it is that child’s delight he hopes to inspire.

And he succeeds. Against the backdrop of the museum’s towers, a heated marquee attracts visitors to walk around an evocation of a butterfly’s life cycle, with up to 1500 tropical and semi-tropical butterflies and moths dancing around them. Especially exciting is a glass-fronted warm room, where nascent butterflies emerge from chrysalises and gingerly unfurl their wings.

The romance of butterflies is also captured in William Leach’s book Butterfly People: An American encounter with the beauty of the world. He reminds us of the duality of these “flying jewels”, initially desired as objects of beauty, and later representing the spirituality of nature versus emergent capitalism.

Leach voyages from the 18th to the early 20th century, telling the strange stories of America’s obsessive “butterfly people”. We meet, for example, the owner of a coal mine turned eminent lepidopterist, and a Shakespearean actor who entertained gold miners panning the Wild West, while also netting his own lepidopteral gold.

Through these characters Leach captures a passion bordering on fanaticism. But their obsessional pursuit of butterflies resonates outside their field, their stories criss-crossing that of the early US scientific establishment, the journals Science and The American Naturalist, and the American Association for the Advancement of Science.

Fascinating, too, is the background against which these early naturalists and their butterfly stories unfold, with the frontier spirit permeating their explorations. Butterfly hunting went hand in hand with the gold rush, and the steady advance of the railroads into uncharted wilds opened up new frontiers into nature.

That was not without a big price tag, however: in just 20 years the US achieved a level of environmental exploitation that had taken centuries in Europe.

Meanwhile in Europe the influence of the Romantic poets, with their appreciation of the natural world, and the spread of empires opened up new windows on nature.

In many ways, Butterfly People is a boys’ adventure story in which gun-toting naturalists imperil their lives just to touch the wing of a rare species. Today’s butterfly people will be enthralled, though outsiders perhaps less so. But the wonder Leach evokes will captivate all who appreciate the natural world.

Sensational Butterflies is at London’s Natural History Museum until 15 September

Book information
Butterfly People: An American encounter with the beauty of the world by William R. Leach
Random House
$32.50




March 26 2013

12:24

Can there be an algorithm for every human desire?

Jacob Aron, reporter

The travelling salesman problem still defies the most powerful computers (Image: Quentin Bertoux/Agence Vu)

Lance Fortnow shows how an esoteric mathematical problem has deep implications for our future in The Golden Ticket: P, NP, and the search for the impossible

IT SEEMS as if computers can do almost anything. Serve up the world's knowledge in an instant? No problem. Simulate a human brain? Working on it.

But are there some tasks that computers will never be able to perform, no matter how powerful they become? This question forms the basis of one of the most important problems in computer science and mathematics, known by the esoteric name: P versus NP.

The definition of this problem is tricky and technical, but in The Golden Ticket, Lance Fortnow cleverly sidesteps the issue with a boiled-down version: P is the collection of problems we can solve quickly; NP is the collection of problems we would like to solve. If P = NP, computers can answer all the questions we pose and our world is changed forever.

It is an oversimplification, but Fortnow, a computer scientist at Georgia Institute of Technology, Atlanta, knows his stuff and aptly illustrates why NP problems are so important. Take the travelling salesman puzzle, which asks if it is possible to find a route between a number of cities that is shorter than a given distance. This conundrum crops up in everything from genome sequencing to circuit design, so a solution would be pretty handy.

Unfortunately, just deciding the best route for visiting the capitals of 48 US states involves checking an infeasible number of possibilities. As Fortnow says, even a computer capable of calculating one route in the time it takes light to cross the width of a hydrogen atom would take 10 trillion trillion times the age of the universe to check them all. That is why researchers want to know whether some advanced algorithm exists that could find this route quickly.
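
For a sense of why brute force breaks down, here is a minimal Python sketch of the decision version of the problem, with made-up city coordinates and an arbitrary distance bound. For six cities the search below finishes instantly; for the 48 state capitals it becomes the astronomically long computation Fortnow describes.

```python
import itertools
import math

# Brute-force check of the travelling salesman decision problem:
# "is there a round trip through every city shorter than a given bound?"
# The cities and the bound are invented purely for illustration.

cities = {
    "A": (0, 0), "B": (3, 1), "C": (1, 4),
    "D": (5, 2), "E": (2, 2), "F": (4, 5),
}

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def tour_length(order):
    # Total length of the tour, returning to the starting city at the end.
    return sum(dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def has_short_tour(bound):
    start = "A"
    rest = [c for c in cities if c != start]
    # (n-1)! orderings to test: 120 here, roughly 2.6e59 for 48 cities.
    return any(tour_length((start,) + perm) < bound
               for perm in itertools.permutations(rest))

print(has_short_tour(16.0))   # answered only by trying every route
```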

The book kicks off with a scenario Fortnow calls "the beautiful world", in which a fictional mathematician unearths just such a method. Applying this search algorithm to look for improved versions of itself eventually leads to the discovery of the Urbana algorithm, which completely transforms the world.

Urbana allows doctors to perfectly diagnose and treat diseases, musicians to create Beethoven's tenth symphony, and moviegoers to watch a perfect (computer-generated) romantic comedy starring Humphrey Bogart and Julia Roberts. The algorithm also puts millions of people out of work, creates irresistible adverts that drive people to buy products, and breaks the public-key cryptography behind internet commerce.

Fortunately or unfortunately, depending on your view, this beautiful world is unlikely to ever exist. That is because most computer scientists and mathematicians believe P does not equal NP, and an Urbana algorithm will never be possible - they just cannot prove that yet.

Fortnow also charts the history of P versus NP. It was mentioned in Alan Turing's time, but the problem was not fully formulated on both sides of the Iron Curtain until the 1970s. No one has yet solved the problem - although not for lack of trying. Proving it either way will win you $1 million from the Clay Mathematics Institute, not to mention mathematical immortality.

Fortnow believes he will not see proof in his lifetime, but he is sure there are problems that even the most advanced algorithms cannot solve. But there is one question I can give you an easy answer to: should you read this book? Computer says yes.

This article appeared in print under the headline "Burden of proof"

Book information
The Golden Ticket: P, NP, and the search for the impossible by Lance Fortnow
Princeton University Press
$26.95/£18.95




March 25 2013

17:22

The promise and perils of a datafied world

Big Data: A revolution that will transform how we live, work and think by Viktor Mayer-Schönberger and Kenneth Cukier alternates between enthusiasm and apocalyptic caution

IN A former life I was a research assistant. After painstaking weeks spent gathering data, I was tasked with putting the numbers into a statistics application that would help us deduce our trends.

While I was analysing the figures, my boss peered over my shoulder and pointed at a record on the screen. "Get rid of that one," I was told. "Also that one, that one, that one and that one." They were outliers and they were going to mess up the findings. "You're never going to trust science again," said my superior with a rueful laugh.

But if you believe Viktor Mayer-Schönberger and Kenneth Cukier, science will be just fine because such practices are about to become as archaic as leeching. Indeed, in Big Data, Cukier, a business writer at The Economist, and Mayer-Schönberger, a professor at the University of Oxford, argue that the big data revolution will save science.

Big data seemed to reach the apex of its hype cycle around 2012, when journalists and experts variously extolled its virtues – or wrung their hands about its implications. And yet, somehow it remained elusive: what exactly was it? This book answers that question.

First, then, a definition. Big data describes the idea that everything can be digitised and "datafied" – thanks to cheaper storage, faster processing and better algorithms. And that really means everything, from your current location or liking for strawberry pop tarts, to your propensity for misspelling and degree of personal compassion. And not just your data: everyone's data.

This changes science, say the authors, by ridding us of biased random samples and the need to massage the resulting data to make it sufficiently representative of a larger population. Could biased sample sets be at the root of many failed attempts to replicate experiments?

Whatever the answer, store everything and the need for proxies disappears. Instead of formulating hypotheses and then looking for confirmation in small, error-prone trials or experiments, scientists now have the storage, processing and algorithmic sifting power to simply trawl through the constellation of all data and spot trends.

Getting rid of the hypothesis would be a staggering change in the scientific method, but both Cukier and Mayer-Schönberger appear convinced it is a step in the right direction, freeing science from the (often unconscious) biases of scientists and increasing the accuracy of its findings.

But there is something even more revolutionary going on here. Reusing data was often nigh-on impossible: data collected for one purpose could rarely be reshuffled to probe it for anything other than the original purpose for which it was generated. This is no longer true – and that promises to have deep, potentially dystopian consequences.

The authors provide a brilliant metaphor for big data in the shape of the new Lytro camera, which uses information from not just one plane of light but which, its makers claim, "captures and processes the entire light field" of a given view. This means that you can focus later, during processing, on any plane you choose. Similarly, the ability to suck up massive amounts of unfiltered data means that any dataset can be used a nearly limitless number of times and for any purpose an algorithm designer can think up.

This opens up extraordinary possibilities. Want to tease out people's route to work from their cellphone data? There's an algorithm for that. Want to look through someone's Twitter feed to predict if they are prone to certain crimes? You can bet an algorithm for that will come along some time soon.

But do you really want every detail of your personal life mined ever more efficiently by every proto-Zuckerberg with an algorithm? I don't, but my phone company may decide to sell my data to start-ups for extra profit. And we would have no easy way to protect ourselves: just reading the terms and conditions of the privacy policies for the hardware, software and apps you already own would take a week and a half, with no breaks for sleeping or eating, and that will only increase.

Worse, is there an algorithm to watch over the algorithms? Where do you turn if, some day in the future, your love of pop tarts gets you barred from surgery? Or what if you're a prisoner and CCTV gait-analysis software decides you are still an unreformed character - and so you are refused parole?

There is an interesting tension in the book, alternating sometimes between enthusiastic business-speak and apocalyptic caution. This comes across as an ongoing "conversation" between the authors. Since Mayer-Schönberger wrote Delete: The virtue of forgetting in the digital age, I suspect that it is his caution tempering Cukier's evangelism.

This ensures that we have a credible picture of the upside of big data. For example, it could help when inspection systems get overwhelmed - as the 55 inspectors tasked with ensuring the safety of 3500 Gulf oil production platforms certainly were in the year before the Deepwater Horizon accident. If every single aspect of the rigs' operational data had been routinely collected, an algorithm could have spotted trouble early enough for action.

Cukier and Mayer-Schönberger have pulled all this together in an elegant and readable primer. The one thing missing is a "next steps" section that doesn't just throw the problem into the laps of policy-makers, who are notorious for sitting on their hands until crises force them to act.

And my former boss may well be doing better research thanks to big data, but what secrets will the records of the lab's students divulge to future data-miners?

Book information
Big Data: A revolution that will transform how we live, work and think by Viktor Mayer-Schönberger and Kenneth Cukier
John Murray
£20



March 18 2013

17:32

The radioactive legacy of the search for plutopia

Rob Edwards, contributor

Hanford scientists feeding radioactive food to sheep (Image: US Department of Energy)

Cold war dreams of producing nuclear bombs fuelled shocking radiation experiments by US and Soviet governments, reveals Kate Brown's Plutopia

MAKING plutonium for nuclear bombs takes balls, but not in the way you might think. In 1965, scientists at the Hanford nuclear weapons complex in Washington state wanted to investigate the impact of radiation on fertility - and they weren't hidebound by ethics.

In a specially fortified room in the basement of Washington State Penitentiary in Walla Walla, volunteer prisoners were asked to lie face down on a trapezoid-shaped bed. They put their legs into stirrups, and let their testicles drop into a plastic box of water where they were zapped by X-rays.

The experiments, which lasted for a decade and involved 131 prisoners, came up with some unsurprising results. Even at the lowest dose - 0.1 gray - sperm was damaged, and at twice that dose the prisoners became sterile. They were paid $5 a month for their trouble, plus $25 per biopsy and $100 for a compulsory vasectomy at the end so they didn't father children with mutations.

The testicle tests are just one of many disturbing details Kate Brown has unearthed from the official archives in her fascinating nuclear history. She also tells how tunnels created by muskrats undermined one of Hanford's storage ponds, causing 60 million litres of radioactive effluent to pour into the Columbia river.

And there is the scary tale of how Hanford scientists conducted one of their riskiest experiments, later dubbed the "green run". For 7 hours, they processed highly radioactive "green" fuel that had not been allowed to decay for as long as usual - and showered 407,000 gigabecquerels of radioactive iodine over nearby cities. The green run is said to have been an attempt to mimic what the US thought the Soviet Union was doing to boost plutonium production at its Mayak nuclear weapons plant at Ozersk, in the Urals.

It is the looking-glass links between Hanford and Mayak, and the communities that host them, that form the central theme of Brown's book. They were two secretive citadels, dedicated to producing as much plutonium as possible to fuel the cold war arsenals of the world's two opposing superpowers. They both conferred wealth and privilege on their elite staff, copying each other to create what Brown styles as a "plutopia".

But the two vast, creaking, nuclear complexes also deliberately discharged huge amounts of radioactivity into the environment, cut corners and caused countless accidents and leaks. Brown estimates that during their existence they each released at least 7.4 billion gigabecquerels, four times the amount released by the accident at the Chernobyl nuclear plant in Ukraine in 1986.

The rivers that drain the two sites, the Columbia and the Techa, have both been called the most radioactive in the world, and many thousands of people who live downstream and downwind say the contamination has made them sick. These are, says Brown, "slow motion disasters" created and covered up by state machines.

Brown argues that the US and the Soviet Union both subverted science to maintain the plutopia. The most shocking example was the US Atomic Energy Commission's takeover of seminal Japanese research into the health impacts of the atomic bombs dropped on Hiroshima and Nagasaki. This was necessary, according to a senior AEC official in 1955, to ensure that "misleading and unsound reports" were "kept to a minimum".

Brown's account is unique, partisan and occasionally personal in that she includes some of her thoughts about interviews she conducted: for example, she recounts how she ended up becoming friends with one interviewee. But because she is open and thorough about her sources, those are strengths to be celebrated, not weaknesses to be deplored. It also means her book is engaging, honest and, in the end, entirely credible.

This article appeared in print under the headline "The hot cold war cover-up"

Book information
Plutopia: Nuclear families, atomic cities, and the great Soviet and American plutonium disasters by Kate Brown
Oxford University Press
$27.95




March 13 2013

11:23

It took more than inspiration to spawn Frankenstein

Tiffany O'Callaghan, Opinion editor

Like all great stories, Frankenstein has been reborn many times (Image: Alastair Muir/Rex Features)

In The Lady and Her Monsters, Roseanne Montillo explores the often macabre historical backdrop to Mary Shelley's famous tale

THE wind howled outside the house near Lake Geneva in Switzerland, where Mary Godwin, her future husband Percy Bysshe Shelley, Lord Byron and others were on holiday in May 1816. Inside, the story of Frankenstein was taking shape as the friends tried their hand at writing scary stories.

But as Roseanne Montillo says in The Lady and Her Monsters, it wasn't simply literary rivalry that spawned the tale. She makes a compelling case for other strong influences shaping this famous fable of science gone awry.


Mary Shelley was the daughter of Mary Wollstonecraft, the writer and women's rights advocate, and William Godwin, the philosopher and novelist. In part, her literary pedigree is what attracted her husband-to-be. And it is no accident, Montillo says, that the Victor Frankenstein character had much in common with Percy, who was known for his love of the ghoulish and for his unruly experiments, with noxious smells often escaping from his room at University College, Oxford.

As well as the people in her life, broader changes in Georgian Britain influenced Mary Shelley. Montillo delves into the growing use of cadavers in medical education, and the trafficking in corpses stolen from fresh graves. Mary Shelley would have known about this, and of the raucous public hangings - some not far from her London home. She would have known, too, that the corpses of criminals were used in dissection, and, occasionally, in attempts to reanimate the dead.

Yet in tracing these influences, Montillo never loses sight of the fact that it was Mary Shelley's imagination that sewed the pieces together - and provided the vital spark that keeps the tale alive nearly two centuries on.

This article appeared in print under the headline "Frankenstein lives!"

Book information
The Lady and Her Monsters: A tale of dissections, real-life Dr Frankensteins and the creation of Mary Shelley's masterpiece by Roseanne Montillo
William Morrow
$26.99




March 11 2013

12:27

The search for ET is a detective story without a body

Nigel Henbest, contributor


(Image: Plainpicture)

Paul Murdin investigates the possibilities of nearby alien life in Are We Being Watched?, but neglects to cover more thrilling aspects of the quest

OVER lunch at Los Alamos National Laboratory in 1950, a group of leading physicists turned their minds to flying saucers, and the possibility of interstellar travel. "Where is everybody?" Enrico Fermi exclaimed. As his colleagues laughed, Fermi performed a legendary back-of-an-envelope calculation showing that aliens - if they exist - should have visited Earth many times over.
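
For readers who want to see why that calculation is so compelling, here is a minimal sketch of the envelope arithmetic in Python. The effective expansion speed is my own conservative assumption, not Fermi's figure; the point is only that even a sluggish colonisation wave needs a tiny fraction of the galaxy's lifetime to reach every star.

```python
# A rough back-of-the-envelope sketch in the spirit of Fermi's lunch-table
# arithmetic (the expansion speed below is an arbitrary assumption, not his
# figure): even a slow wave of interstellar expansion crosses the galaxy
# quickly compared with the galaxy's age.

GALAXY_DIAMETER_LY = 100_000        # the Milky Way is roughly 100,000 light years across
GALAXY_AGE_YEARS = 13_000_000_000   # and roughly 13 billion years old

# Assume slow ships and long stopovers: an effective expansion speed of
# 0.1 per cent of light speed.
effective_speed_as_fraction_of_c = 0.001

# Distance in light years divided by speed as a fraction of c gives years.
crossing_time_years = GALAXY_DIAMETER_LY / effective_speed_as_fraction_of_c

print(f"Time to sweep across the galaxy: {crossing_time_years:,.0f} years")
print(f"Fraction of the galaxy's age:    {crossing_time_years / GALAXY_AGE_YEARS:.1%}")
```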

If they've been here, then perhaps they are watching us - hence the provocative title of Paul Murdin's new book. As one of the team that discovered the first black hole in our galaxy, Murdin is an astrophysicist of impeccable credentials. But in weighing up the evidence for alien life, Murdin has to confront one insuperable obstacle: there is no evidence for even microbial life beyond Earth. So the Herculean task of writing a book on alien life is the ultimate detective story without a body.


Murdin seems to draw his inspiration from the poet Alexander Pope's famous quote that the "proper study of mankind is man". His discussions centre mainly on Earth's indigenous life, with the premise that alien life follows much the same pattern: evolution and survival of disasters to culminate - after much the same lapse of time - in intelligent creatures pretty similar to us.

He also investigates the possibility of microbial life on our neighbouring worlds, including the frozen deserts of Mars and the icy depths of Saturn's moon Enceladus, along with its peer Titan, which is coated with a sorbet of chemicals with the potential for creating life. There is even a chance that alien fish swim in the deep oceans of Europa, the icy moon circling Jupiter.

If that is what you are looking for, then Murdin is an excellent and thoughtful guide. But when it comes to a deeper discussion of the search for life beyond our solar system - and especially intelligent aliens - we are frustratingly short-changed. There is only a cursory discussion of the fascinating zoo of planets now being discovered around other stars. Many of these new worlds are so hot or cold, water-covered or parched, that our best clues to life there come from Earth's most hardy life forms. Yet these amazing extremophiles are dismissed in a single page.

Alien life is a subject that sparks off-the-wall ideas and excitement that are largely lacking here. Take the physical appearance of aliens. Admittedly, this is a pretty nebulous subject, but some evolutionary biologists such as Jack Cohen have taken a hard look at what might be possible. And communications researchers like Laurance Doyle have derived algorithms that can tease out meaning from apparently random patterns of sound, providing hints about how aliens might communicate.
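
Work of this kind leans on information theory, so by way of illustration here is a toy Zipf-slope fit in Python over made-up data - a sketch of the flavour of such measures, not Doyle's actual algorithm. Structured repertoires, human language included, tend to produce log-frequency versus log-rank slopes near -1, while unstructured noise comes out much flatter.

```python
import math
import random
from collections import Counter

def zipf_slope(sequence):
    """Least-squares slope of log(frequency) against log(rank).
    Structured communication systems tend to give slopes near -1;
    unstructured noise tends to come out much flatter."""
    counts = sorted(Counter(sequence).values(), reverse=True)
    xs = [math.log(rank) for rank in range(1, len(counts) + 1)]
    ys = [math.log(c) for c in counts]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Made-up example data: a scrap of English versus uniformly random symbols.
words = "the cat sat on the mat and the dog sat on the cat".split()
noise = [random.choice("abcdefghijklmnopqrstuvwxyz") for _ in range(2000)]

print(f"English-like units: slope {zipf_slope(words):.2f}")   # noticeably negative
print(f"Random units:       slope {zipf_slope(noise):.2f}")   # much closer to zero
```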

The global Search for Extraterrestrial Intelligence project (SETI) is the key to finding intelligent life out there. Murdin gives a full account of SETI up to 1977, but squeezes its subsequent history into a page. There is nothing on the searches that have targeted individual stars, including those with newly discovered planets, nor on the world's most sensitive eavesdropping radio ear on the cosmos, the Allen Telescope Array. Murdin is silent on recent searches for laser signals from alien intelligence, and on attempts to track down tell-tale infrared emissions from advanced civilisations. And we also learn nothing of efforts to detect alien artefacts in the solar system. That's not as daft as it may sound: if Fermi was right, and aliens have visited, then where's the rubbish they left behind?

For the science of the solidly known - with references and footnotes - Murdin has written a thorough survey. But he stops short of providing the thrill that would come from answering the provocative question in the title.

Book information
Are We Being Watched? The search for life in the cosmos by Paul Murdin
Thames & Hudson
£16.95




March 04 2013

12:20

Timing was everything when Darwin's bombshell exploded

Bob Holmes, consultant


(Image: Topical Press Agency/Hulton/Getty)

Peter Bowler's Darwin Deleted shows that the shock of Darwin's theories may have delayed the acceptance of evolution - and science may have paid the price

SUPPOSE Charles Darwin had been swept overboard and drowned during the voyage of the Beagle. He would never have developed his theory of evolution by natural selection, never written On The Origin of Species, never sparked one of the greatest intellectual revolutions ever. What would the world be like without him?

That is the question Peter Bowler sets out to answer in Darwin Deleted. From such a beginning, one might expect a playful romp through the fields of imagination. Instead, Bowler - a historian of science at Queen's University Belfast, UK - delivers a forced march through the academic jungles. He uses the notion of a world without Darwin to explore the context of evolutionary thought in the 19th century, and examine exactly what Darwin's contributions were.


In many ways, says Bowler, Darwin played less of a role than you might suppose. The concept of evolution, or change through time, was already around before Darwin's Origin was published in 1859. Geologists were beginning to realise that Earth was much more than a few thousand years old, and palaeontologists were piecing together a fossil record that testified to vast changes in life forms over aeons.

Darwin's big idea was that evolution proceeded by natural selection: better-adapted individuals are more likely to survive and reproduce, and thus pass on adaptive traits to their offspring, while less well-adapted individuals die, taking their failed traits with them. Others, notably Alfred Russel Wallace, came up with a similar idea at about the same time, but only Darwin's book attracted wide attention.
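
A toy simulation - my own illustration, not an example from Bowler's book - makes that winnowing concrete: give one heritable trait even a modest survival edge and it spreads through the population within a few dozen generations. The population size and survival odds below are arbitrary.

```python
import random

POP_SIZE = 1000
SURVIVAL_ODDS = {"adapted": 0.6, "not adapted": 0.5}   # assumed, illustrative values

# Start with the adaptive trait rare: 10 per cent of the population.
population = ["adapted"] * 100 + ["not adapted"] * 900

for generation in range(31):
    if generation % 10 == 0:
        share = population.count("adapted") / len(population)
        print(f"generation {generation:2d}: {share:.0%} carry the adaptive trait")
    # Differential survival: better-adapted individuals are more likely to live.
    survivors = [ind for ind in population if random.random() < SURVIVAL_ODDS[ind]]
    # Survivors reproduce; offspring inherit the parent's trait.
    population = [random.choice(survivors) for _ in range(POP_SIZE)]
```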

That concept, with its emphasis on struggle, competition and the relentless winnowing out of failures, was a bombshell. How could a benevolent God permit such violence, such wastefulness? Darwin's theory instantly polarised the public, with conservative Christians rejecting it outright and anti-religionists such as Thomas Henry Huxley using it as an argument against the established church. Without Darwin, Bowler says, Huxley and his followers might have seized another sword, perhaps using geological evidence for an ancient Earth as their weapon against religion. As a result, the fundamentalist backlash of today might have targeted geology rather than evolution.

In the latter half of the 19th century, natural selection was not the only evolutionary theory in town, or even the dominant one. Many naturalists thought that evolution proceeded along a predetermined path leading to ever more complex organisms and culminating in humans, much as a developing embryo increases in complexity. Others followed the lead of Jean-Baptiste Lamarck, arguing that animals pass on the traits they acquired in life. In the classic example, giraffes that stretched to reach leaves high up in trees would have offspring with longer necks.

These other theories left room for a purpose, for improvement in the evolutionary enterprise, and were therefore easier to reconcile with religion. Some alternatives also left space for independent lines of descent instead of a single tree of life, so they avoided the uncomfortable notion that humans shared ancestry with apes. Without Darwin, these alternatives would have allowed the general concept of evolution to enter the public consciousness more gently, Bowler suggests. No doubt there would still have been conflict with religion, but it would have been muted.

Modern opponents may argue that Darwinism laid the foundation for societal amorality, culminating in two world wars, in eugenics and the Nazi atrocities. So would a world without Darwin have been a kinder, gentler place? Not likely, says Bowler, who shows that the factors underlying the horrors of the past century or so, such as racism or imperialism, existed long before Darwin. True, the notion of Darwinism provided a useful rhetorical framework, as when Nazis spoke of "racial purification" as a step toward the evolution of better humans. But without Darwin they could easily have turned to another metaphor, says Bowler, such as the need to excise a cancer from society.

All this is fascinating and should have made a lively book. But Bowler is so meticulous about his historical detail, so careful to explore every angle of each point he makes, that he often leaves the reader unsure where he is going.

Even so, the book is worth the effort. Without Darwin, he concludes, we would probably have ended up in much the same place we are today, with an evolutionary theory based largely on natural selection, and with most of the big historical events of the past century unfolding as they did. Where Darwin really mattered was in timing. Here, ironically, the shock of his book, and the polarisation it caused, may have delayed the acceptance of evolution. The great man was ahead of his time, and science may have paid a price for that.

This article appeared in print under the headline "Timing is everything"

Book information
Darwin Deleted: Imagining a world without Darwin by Peter J. Bowler
University of Chicago Press
$30




February 27 2013

16:33

Time is running out for the solar revolution

Fred Pearce, consultant


The future might be bright if solar power takes off around the world (Image: Donkeysoho/Plainpicture)

Project Sunshine by Steve McKevitt and Tony Ryan shows the great promise of solar power, but time is running out for its advocates to make it shine

WE ARE stardust. And ultimately the energy that powers our world comes from sunlight. For two centuries, we have been tapping into "fossilised sunlight", burning solar energy trapped over millions of years in coal, oil and natural gas. Now we have to get back to catching real-time sunbeams.

That is the key premise of this book about a multimillion-pound, largely publicly funded scheme known as Project Sunshine, based at the University of Sheffield, UK. Penned by project leader Tony Ryan and writer Steve McKevitt, it is lucid, optimistic - and plans to save the world.


To stave off climate change, the authors calculate we must find up to 20 terawatts of low-carbon capacity by 2050. The sun delivers almost 10,000 times more, of which some 600 TW might be harvested. It is the only renewable source able to deliver the energy generating capacity we need. One day, solar will be the "source of all the energy we consume", they say.
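
The scale of that claim is easy to check with a couple of divisions. In the sketch below the 20 TW and 600 TW figures are the authors'; the roughly 173,000 TW of sunlight continuously reaching Earth is a commonly cited estimate that I am assuming, not a number from the book.

```python
demand_2050_tw = 20          # low-carbon capacity the authors say we need by 2050
sunlight_tw = 173_000        # solar power continuously reaching Earth (assumed, commonly cited)
harvestable_tw = 600         # the authors' estimate of what might realistically be captured

print(f"Sunlight vs demand:    {sunlight_tw / demand_2050_tw:,.0f}x")   # ~8,650 - "almost 10,000 times more"
print(f"Harvestable vs demand: {harvestable_tw / demand_2050_tw:.0f}x") # 30 times the 2050 target
```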

In this future, nuclear, wind, geothermal and hydropower are simply stopgap technologies while we get solar up and running. Electricity grids will be powered by spray-on organic photovoltaics that turn sunlight into electricity. We can redesign photosynthesis to make the liquid fuels, such as methanol, that will replace oil. Photosynthesis is nature's way of storing solar energy, but it is, they say, "lousily inefficient". So we can forget regular biofuels: the holy grail is super-productive bioengineered photosynthesis - artificial leaves, if you will.

Sunshine is the only thing we won't run out of. We may suffer peak oil, peak soil and peak metals, but the sun will keep shining. This is stirring stuff, and well told. But the authors have trouble placing their solar dream in the wider world. They are stuck in simple-minded environmental homilies and hand-me-down futures full of techno-optimism.

Take their second, subsidiary theme: feeding the world. They have to address this carefully, because all those solar panels and artificial leaves will take up lots of land that would otherwise be used for growing food. No problem: food production is on the cusp of a revolution as powerful as their own solar transformation - genetic modification. They are emphatic about the urgent need for GM foods, and resort to crude Malthusianism in the cause. "Without GM, people will starve in their tens, if not hundreds, of millions," they say. Phooey.

GM technology will be very useful for many things, and we should stop fearing it. But when up to half of the food we grow is wasted, and when tens of millions of hungry African farmers could triple yields with existing crops if only they could afford a few bags of fertiliser, GM crops look like the solution to the wrong problem.

The authors' naivety leads them to mischaracterise a range of problems as supply issues when they are about something else. On energy and food, they pay lip service to the need for much greater efficiency in production, distribution and use - but rarely get further. That's a shame. If "it's going to be much easier to use less energy in the future than it will be to generate more power", as they argue, why not discuss how?

Given the importance they attach to the rapid adoption of new technologies, it is also odd that the authors' analysis of human progress, which occupies much of the first half of the book, spends little time on why even the best ideas often fail to be adopted.

Thus we learn that the Roman Empire "had no culture of innovation at all... across its whole 800-year history". It faltered on the verge of inventing both the printing press and the steam engine, postponing the industrial revolution by almost two millennia. Eventually the British made it happen, turning an "inconsequential European backwater" into "the world's foremost industrial power". Since the authors call for a similar transformation to decarbonise our energy economy, why not first try to understand why the Romans failed while the Brits succeeded so spectacularly?

Time is short for the solar revolution. As well as fast-tracking the technologies, we have to find out how to break the logjam created by the old technologies. If we continue to act like Romans rather than 18th-century Brits, then stardust is all we will be.

This article appeared in print under the headline "A place in the sun?"

Book information
Project Sunshine: How science can use the sun to fuel and feed the world by Steve McKevitt and Tony Ryan
Icon Books
£16.99




February 26 2013

10:49

The man who's crashing the techno-hype party

Jim Giles, consultant


Get your big data groove on at an "I ♥ Facebook" party - or don't (Image: Stefano Dal P/Contrasto/Eyevine)

Evgeny Morozov does a good job of dispelling "big data" hype in To Save Everything, Click Here, but fails to explore the way we shape the tech we use

IF SILICON VALLEY is a party, Evgeny Morozov is the guy who turns up late and spoils the fun. The valley loves ambitious entrepreneurs with world-changing ideas. Morozov is, in his own words, an "Eastern European curmudgeon". He's wary of quick fixes and irritated by hype. He's the guy who saunters over to the technophiles gathered around the punch bowl and tells them, perhaps in too much detail, how misguided they are.


Morozov should be invited all the same, because he brings a caustic yet thoughtful scepticism that is usually missing from debates about technology. Remember the claims that Facebook and Twitter, having helped power the Arab Spring, would lead to a more open and democratic world? If you heard a dissenting voice, it was probably Morozov's: in his 2011 book The Net Delusion, and also in New Scientist, he pointed out that dictators can use social media to spread propaganda and identify activists. The web is a medium for both liberation and oppression.

Now Morozov is crashing another party. This one is in full swing, filled with feverish talk of algorithms and the cloud and big data. Here's Facebook founder Mark Zuckerberg, describing how his site tracks our personal interests so that it can serve up the news we most care about. And that's computer pioneer Gordon Bell holding forth on "life-logging". For about a decade he has worn a digital camera that takes frequent shots of his surroundings, which helps supplement his memory.

Enter the curmudgeon. Morozov says people like Zuckerberg and Bell espouse Silicon Valley philosophies that are pervasive but shallow. Bell's desire to catalogue everything, for example, is a case of "solutionism": the relentless drive to fix and to optimise; to take problems - in this case an imperfect biological memory - and propose solutions. This rush for solutions prevents us from thinking about the causes of the problems, and whether, in fact, they are problems at all.

Morozov's right: a digital catalogue of the books we have read and scenes we have witnessed is not, as Bell claims, the basis for more truthful recollections, unless your primary concern is the colour of the socks you wore one day in 2007. There are many things that are important - the mood in a room, a person's demeanour - but too intangible to be captured by a camera, or any other form of technology. Even if they could be, it is far from certain that our future selves would benefit from being able to "recall" these things.

His other bugbear is "internet-centrism": the belief that the internet has inherent properties that should dictate the form of the solutions we pursue. Take the idea that governments should publish data on issues like crime and court cases. To an internet-centrist, this data should be as open and searchable as possible. Morozov wants to know why. Maps of crime data can drive down house prices, worsening a cycle of decline. And publicising the names of trial witnesses can lead to intimidation. The desire for openness requires real trade-offs, and it is naive to assume that a technology can tell us how to handle them. "To suppose that 'the Internet', like the Bible or the Koran, contains simple answers... is to believe that it operates according to laws as firm as those of gravity," he writes.

It is important that someone is countering techno-hype, yet this book lacks the punch of Morozov's earlier writings. It reads like he really did imagine himself crashing a Silicon Valley party and lecturing the attendees, because the book, essentially, is a series of rebuttals of prominent technologists. There is little in the way of practical alternatives. By my count, Morozov dedicates 317 pages to attacking solutionism and internet-centrism, and 33 pages to thinking about how to move beyond them. By taking aim at the technologists, Morozov averts his gaze from something more important: the way that technologies are actually used.

This is most clear in the chapter on the media, in which he worries about technology that allows news organisations to track what people read. The result, he fears, will be a whittling away of less-popular but important types of news, like reporting on poverty. Meanwhile, sites like Facebook are profiling us and using algorithms to feed us news they think we will enjoy, limiting serendipitous discoveries that open us up to new events and ways of thinking.

These are legitimate fears, but they predate the internet. Newspapers and magazines have long used focus groups to gauge reactions to content, sometimes culling and expanding sections in accordance with this feedback. The feedback on digital media may be more rapid and fine-grained, but that does not mean that editors are now slaves to it. A good editor knows that readers want to be challenged as well as entertained; to read about topics they love, and those they may come to love. This involves balancing audience feedback with an instinct for a story. It is nothing new.

This is true even at sites that embrace the algorithmic approach. At Buzzfeed.com, stories are designed to maximise the chances that they will be shared on social media (one headline as I write: "Here Is A Horse Playing A Recorder With Its Nose"). In 2011, the site hired a notable political reporter and tasked him with generating scoops - about politics, not horses. Last year, in a move headed by a different editor, the site began publishing long-form features. Will these new sections generate as many hits as cat videos? Probably not. But Buzzfeed founder Jonah Peretti, whom Morozov dismisses as the "king of memes", presumably knows that the site will be stronger with this richer content added, regardless of what the page-view figures say.

This is the way that technologies are used in real life. They are shaped and adapted by the people who use them, based on personal needs and histories. Editors can make use of new audience data without bowing to the algorithms. People can log aspects of their lives - perhaps miles run, or calories consumed - without signing up to Bell's eccentric ideas. Many of us like the connectivity that Facebook brings, but that does not mean we swallow Zuckerberg's self-serving philosophy. Technology shapes us, for sure, but we shape it, too. That process is extremely complex, and it cannot be critiqued by focusing solely on the hype.

This article appeared in print under the headline "Crashing the techno-hype party"

Book information
To Save Everything, Click Here by Evgeny Morozov
Allen Lane/PublicAffairs
£20/$28.99




February 22 2013

15:57

Desperate data about desperate children

Shaoni Bhattacharya, consultant


A 13-year-old girl, Aissata Konate, a few days after getting married to 32-year-old Ely Barry in Gbon, Ivory Coast (Image: Carol Guzy/The Washington Post/Getty)

If governments could pass a simple law that would save the lives of millions of infants, they’d do it, right? And if a policy or constitution could transform the lives of women by sparing them the poor health and despair that they pass on to their children as sickness, disability and even death, surely they would get to work?

But no.

A new book, the first of its kind, together with the extensive report underpinning it, shows just how far the world has to go - even (or perhaps, especially) rich countries like the US.

Children’s Chances: How countries can move from surviving to thriving by Jody Heymann with Kristen McNeill was launched (with the accompanying report) in London last week. It aims to provide an armoury of deeply disturbing data with which to hold to account the world’s passive politicians.

It is the culmination of years of work led by Heymann, director of the World Policy Analysis Center and dean of the Fielding School of Public Health at the University of California, Los Angeles.

She and her colleagues have sifted - sometimes quite literally - through boxes piled with paper and reams of information from organisations like UNESCO to provide the first global comparison of laws and public policies in 191 countries, covering poverty, discrimination, education, health, child labour, child marriage and parental care options.

Laws really matter, found Heymann and her colleagues. Laws covering what look to be family or cultural decisions such as early child marriage or education are important because these issues determine whether a child survives or thrives.

When girls marry young, for example, they tend to drop out of school earlier and have poorer health, and, in turn, their children have poorer health.

At the book’s launch at the Royal Society in London Heymann said that sometimes just having a law can help: “What surprised me is that many people had said, 'What if policies are not fully implemented?' In fact many of these policies are so powerful that it is enough [to make a difference].”

The book, report and website aim to make the crucial information Heymann and her colleagues have gathered accessible to ordinary citizens, non-governmental organisations and policy-makers.

It’s readable and, given the Herculean task the authors faced in bringing it all together, makes clear sense of what they found, with online maps providing a wealth of revealing information never before available at a keystroke.

But I can’t help feeling that they are missing a trick. The tone of the book may be assertive, but it is not as forceful as its material - just too polite.

Why beat about the bush when children’s lives, health and future are at risk? Name and shame, I say. This book has the moral high ground, and scientific rigour, to do so. And it should.

At the launch, the US was rightly described as a “laggard”, but it is only by trawling through the maps that it becomes clear just how far behind it is for a rich nation.

What, for example, does the US have in common with Papua New Guinea, Liberia and Tonga?

These are three of only eight countries in the world with no guaranteed paid maternity leave. As for paternity leave, paid parental leave for sick children - forget it.

And there isn’t even protection against early child marriage. The US is right up there with Sudan and Iran, with no legal minimum age for marriage for girls or boys.

This was shocking, but the authors don’t go into surprises like this. Surely we should be told what the lack of such a law does to a developed nation like the US? Does anyone actually get married very young? If so, how young, and how common is it?

But we do know that lack of maternity leave makes a huge difference. Globally, every 10-week increase in paid maternity leave is associated with a 10 per cent drop in newborn deaths, infant deaths and under-5 mortality rates. Staggering.

The reasons are simple. Off work, mothers are more likely to breastfeed and take their babies to be vaccinated.

Even in the US, infant mortality rates are not good for a developed nation. And recent studies show the country’s health generally is not as robust as it could be.

There are plenty of other surprises.

What does Laos do for its children that the UK doesn’t? Astonishingly, it is one of five countries in the world with father-specific paid paternity leave of over four weeks. The others are Iceland, Norway, Slovenia and Sweden.

Luxury? No. Paternity leave matters and when it's specifically allocated to men, dads are more likely to take it. Studies show that fathers who take paternity leave when it is available are much more involved with care of their children, even after a pre-existing commitment to mother and child is controlled for.

And where fathers are involved, new mothers are less likely to get depressed - and maternal depression has strong knock-on effects on children.

Then again, what makes one of the poorest African nations a better place to be a child than its neighbours? Madagascar has policies for children and families that are more progressive than many western nations, and this has paid off because its infant and child mortality rates are among the lowest in sub-Saharan Africa.

Sadly, the most powerful of the emerging economies - India and China - fare worse than many very poor countries in terms of children’s chances and healthcare.

The authors want this to be a call for action. But they need to shout louder. While they do quote the numbers of countries opting out of child-friendly policies or laws in their report, they should name the countries - and give them a report card. And likewise hail the countries (especially the poor ones) doing right by their kids.

The book does try quite hard in some ways, detailing heart-rending case studies of kids so hungry that they fall asleep at school, 9-year-olds rising at 4 am to help their parents set up street vegetable stalls before going to school, or parents who can’t take seriously ill kids to the doctor because they risk losing their jobs if they take time off…

The good news is that governments can move mountains if they find the will: saving the lives of millions of children worldwide is surely easy compared with finding a cure for AIDS or cancer?

Heymann and colleagues should be commended for their meticulous and arduous work. Let’s hope citizens, NGOs and movers and shakers pick up this report and wield it forcefully in the faces of governments.


Book information:
Children’s Chances: How countries can move from surviving to thriving by Jody Heymann with Kristen McNeill
Harvard University Press
$45/£33.95




February 14 2013

10:00

What we really know about human courtship

Kate Douglas, features editor


(Image: Leonard Freed/Magnum Photos)

Enjoy what two insightful books have to say about mating intelligence and relationships - but don't park your scepticism just yet

"DO YOU have any raisins? No? Well then, how about a date?" You don't need Mating Intelligence Unleashed to tell you this is a dire chat-up line. On the other hand, you may be interested to learn that men may use "unattractive opening gambits" to "screen out" women seeking commitment in favour of those who fancy a fling. I'm interested, but sceptical.

It's not the only observation to prompt a "really?" Even so, this is a book built on sound biological foundations. Few would argue with the idea that humans possess evolved biases and preferences in courtship and mating behaviour. And, although the field of mating intelligence is relatively new, one thing that is clear in this eclectic overview is just how much research has already been done. The subject is almost guaranteed to entertain. Glenn Geher and Scott Barry Kaufman ask: how is individual mating behaviour influenced by gender, personality, and environment? Why do we find creativity and humour sexy? What does a woman/man want? And do nice guys really finish last?


Attentive New Scientist readers will find little that is new, and some that is laughable; you will marvel at the effort made to discover that "women do not like 'jerks' per se". Still, there are real revelations: intelligence is the second most desirable trait in a sexual partner, funny men are perceived as taller, and flirting doesn't always signify sexual interest.

As well as elucidating research, Geher and Kaufman believe these revelations can improve your love life. But can it help to be aware of subconscious biases underlying sexual relationships? The authors think so, yet even they ask: "How often has your meta-cognitive awareness caused a drop in your ability to accomplish a task smoothly?"

Perhaps it's just sour grapes on my part, having failed to shine on their Mating Intelligence Scale. If you must know, I lack "commitment scepticism", and I should have answered "true" to the statement "my current beau spends a lot of money on material items for me" because this would show that I was good at obtaining resources from my partner. But I doubt that mating intelligence improves through exegesis.


Call me old school, but I believe we learn about relationships via trial and error, directed by our emotions, yet Geher and Kaufman barely mention love. Luckily, I had Barbara Fredrickson's book to fill the void. Or so I thought. It turns out Love 2.0 is not a treatise for old romantics, but something far more interesting. For Fredrickson, love isn't the all-encompassing state of bonded bliss most people envisage. Instead, she redefines it as fleeting moments of emotion we feel whenever we make a real connection with another human.

This "positivity resonance" is, she argues, mediated through three physiological channels: a synchronisation of brain activity with a loved person; oxytocin; and nerve impulses that travel between the brainstem and heart via the vagus nerve. This is a virtuous cycle - the more love your body experiences, the more it can experience. Such positivity resonance is transformative, bringing health, happiness, social integration, wisdom and resilience.

Yes, this can be excruciatingly New-Agey. And no, the science doesn't quite stack up. Oxytocin, for example, isn't all about positive relationships, and Fredrickson doesn't mention the link between endorphins and well-being. She also ignores the finding that fear synchronises brain activity at least as effectively as love. Still, her research on the vagus nerve is intriguing, and no doubt her ideas will evolve. If you can stomach "letting the heart of your I resonate with the heart of your we" and other such purple prose, this book may change your life. Really. Give Love 2.0 a chance.

Book information:
Mating Intelligence Unleashed: The role of the mind in sex, dating, and love by Glenn Geher and Scott Barry Kaufman
Oxford University Press
$27.95/£17.99

Love 2.0: How our supreme emotion affects everything we feel, think, do, and become by Barbara L. Fredrickson
Hudson Street Press
$25.95/£18.99




September 03 2012

16:34

Retracing the stardust trail

Marcus Chown, contributor

THE iron in your blood, the calcium in your bones, the oxygen that fills your lungs each time you take a breath - all were forged in stars that lived and died before the Earth was born. You are stardust made flesh. You were literally made in heaven.

The story of the nuclear processes that built the elements in your body was essentially told in a seminal paper in 1957 by Fred Hoyle, William Fowler, and Geoffrey and Margaret Burbidge. I recounted the story in my book, The Magic Furnace, which science writer Jacob Berkowitz draws on for The Stardust Revolution. But Berkowitz does not just bring the story up to date: he broadens it, exploring what we have discovered about the 9 billion years of history that preceded Earth's birth.

The key to his story is stardust. It is on the super-cold surfaces of such silicate or ice grains that atoms are stitched together to make a wide range of molecules. These molecules radiate heat, which causes gas clouds to collapse to form suns. Later, those molecules, still stuck to stardust, rain down on planets, providing the building blocks of biology.

In this impressively researched and exciting book, Berkowitz talks to some of the key figures who have pieced together the pre-chemistry of the Earth, including 97-year-old Charles Townes, inventor of the maser - a microwave laser - and discoverer of water in space. But the real excitement is in the discovery of hundreds of planets around nearby stars. We are on the verge, Berkowitz thinks, of finding the missing piece in the picture: precisely how stardust led to life.

Book information
The Stardust Revolution by Jacob Berkowitz
Prometheus
£24.95/$27


Tags: Books
10:11

Babies without sex or pregnancy?

In Like a Virgin, Aarathi Prasad looks at historical notions of virgin birth and explores the possibility of a future without the need for pregnancy

FIRST, seal a man's semen in a glass tube. Then bury the tube in horse manure for 40 days, remove it, magnetise it and watch the semen assemble themselves into a tiny human form. Voilà - conception, without the pesky uterus.

This was the recipe for life suggested by the 16th-century scientist Paracelsus. His was one of the first enquiries into whether children could be born without the need for women. According to the prevailing scientific wisdom of the time, women were simply vessels, replaceable if you could just find the right alternative.

This misunderstanding isn't even the most entertaining story in Like a Virgin, in which biologist and science writer Aarathi Prasad looks from several perspectives at the possibility of human parthenogenesis, or reproduction without sex. The phenomenon is most often associated with the well-known religious tale, but it has been investigated for a variety of reasons throughout history.

When it comes to the Virgin Mary, Prasad is frank about the science of the predicament. The story can only be true, she says, if Mary were a chimera with testicular feminisation, a condition in which the external body manifests as female, but the internal organs are part male. In other words, Mary was also a man, and got herself pregnant. Prasad is so matter of fact - and well-sourced - about the underlying science that when she unveils this bombshell it has no shock value.

The book is peppered with illuminating analogies. For example, Prasad likens chromosomes swapping genetic information to "dancers circling to and fro, touching hands and retreating back to the line at one of Jane Austen's house balls".

These entertaining biology lessons are woven together with intriguing historical anecdotes and stories from nature. From electric ants to Amazonian lizards, both of which can reproduce without the need for males, the natural world is no stranger to virgin birth.

Intertwining science and history, Prasad meticulously builds a case for the possibility of human parthenogenesis. What she anticipates is neither religious miracle nor one-off scientific fluke. Instead, she believes the future will bring a reality of ectogenesis - gestation in an artificial, external womb - much as Paracelsus envisioned it.

She builds a strong case for this possibility - citing animal studies using artificial wombs and efforts to engineer artificial human placentas. Prasad also argues that this is an outcome that could be more desirable than natural childbirth - not because it is a better alternative to women, but because it's a better alternative for them. Human pregnancy remains risky and even life-threatening, and infertility is a growing problem. An alternative could be extremely attractive.

A hugely successful braid of reproductive biology, history of science, and politics, this is science writing that will keep you up past your bedtime.

Book information
Like a Virgin by Aarathi Prasad
Oneworld
£12.99/$19.95



Tags: Books

August 31 2012

09:26

The science behind our weirdest behaviours

Could sneezing, yawning and tickling have a shared function? PLUS The call to bring back parasites and the woolly credulity of Sheldrake's swansong

