
Armageddon it 

Will the world end this year?

The idyllic city of Geneva, Switzerland, is about 5,700 miles from California’s Central Coast, an entire world away. But something in that European town may affect us all in upcoming months: It may cause the end of days.

To explain: Scientists at the Large Hadron Collider—the world’s largest particle accelerator, located at CERN, the research organization where the World Wide Web was also invented—are ready to recreate the Big Bang by sending two beams of protons, traveling at nearly the speed of light, on a head-on collision course. Though the $8 billion experiment seems to lack any practical goals, one possible outcome is proof of the existence of the elusive “God particle,” or Higgs boson, which is something like the “force” in George Lucas’ Star Wars movies that binds everything in the universe.


Famed physicist Sir Martin Rees puts the odds of doomsday from this experiment at 1 in 50 million, about the same chance as winning the lottery—and someone wins the lottery almost every week. Stephen Hawking and others claim that there is no risk. Most of us aren’t particle physicists, but we don’t have to be in order to reasonably think that there must be some risk of catastrophe, however infinitesimal, in any experiment that involves simulating the highly energetic and dynamic conditions of the universe at less than one-millionth of a second after the Big Bang (there’s a reason why it’s capitalized).

Physicists admit that micro black holes may result from the experiment; critics fear these could swallow up the world, but scientists largely believe that any such black holes will dissipate or “pass harmlessly through the earth” if they do appear. Others point out that more powerful particle collisions happen in the universe all the time, many of them bombarding the earth, and because nothing bad has (yet) resulted from those collisions, we shouldn’t be worried about CERN’s little experiment.

But here’s the problem: When you tell non-particle-physicists that you intend to “recreate the Big Bang,” that sounds like an awfully, astronomically powerful experiment: If the Big Bang was explosive enough to create the universe, then it doesn’t seem implausible that even a fraction of that energy could cause a disaster, never mind doomsday, here on earth. So we can’t really blame folks for being afraid. Is it an issue of semantics—are scientists overhyping an experiment that won’t come anywhere close to approximating Big Bang-like conditions? Or are they downplaying the risks of such a dangerous-sounding experiment in a classic case of having your cake and eating it too? Or both?

My point here isn’t to discuss the plausibility of these risks or to rally behind protests and lawsuits that are seeking to stop the CERN experiment, but this event is an opportunity to point out a few things from a technology ethicist’s perspective:

 

You’re not the boss of me

It’s one thing for the good engineers at Apple to tinker with next-generation iPod technology, and quite another to simulate the Big Bang: There is no existential risk (the danger of ending life on a massive scale) with the iPod, unless the plan is to power it with a miniature nuclear reactor. Even if there is only a minuscule risk of catastrophe or doomsday with the CERN experiment, how did it happen that the world’s public was not involved in deciding whether to proceed? Most don’t even realize this experiment is happening. When did we entrust CERN, or scientists in general, with the right to take such risks on our behalf? Is 1 in 50 million really such a small risk that we can ignore it for all practical purposes? What about 1 in 10 million, or 1 in 1 million, or 1 in 1,000—who gets to make these decisions for the world?

Note that during the Manhattan Project in World War II, scientists believed there was a small chance (1 percent or more) that the atomic bomb might set off a chain reaction that would burn up the world’s atmosphere. As with the CERN experiment, no one then really knew what was going to happen, yet they proceeded anyway, believing that the (military) benefits justified the risk. We now know that the risk of such a runaway nuclear reaction was overestimated, but even a 1 percent perceived chance of some action destroying the world seems too great, no matter what the benefit. (Imagine if a pilot said that one out of 100 passengers on your flight would die: Wouldn’t you get off the plane?)

Further, on the off chance that a catastrophe does occur, who is ready to accept the blame, legal liability, and financial consequences? I don’t know of a single insurance company ready—or funded enough—to cover that bet. If the CERN experiment is so harmless, should we require the families of those scientists to be nearby when the experiment occurs, to put their convictions about the risk to the test? (This reminds me of getting X-rays despite the dentist’s reassurances: If they’re so safe, why am I wearing this heavy lead vest while the dentist hides out in the next room?)

 

Oh, what a tangled web we weave

As the CERN experiment highlights, we, as a society, need to be more aware of technological and scientific developments. It used to be that scientists could stay in their labs, invent stuff, and throw it over the wall into the consumer marketplace for us to enjoy. Now, our lives are inextricably intertwined. New devices and software, such as RFID tags, raise privacy concerns; cognitive science pushes on the question of whether we really have free will (and its implications for legal responsibility); pharmacology, such as steroids and Ritalin, helps patients who need it but also enhances the physical and mental abilities of otherwise healthy individuals; robots are quickly entering the marketplace as well as the military, raising ethical questions; nanotechnology gives us new materials but also new risks that may require increased regulation; and so on.

Because technology is increasingly part of everyday life, we should be aware of the risks and the ethical and social questions it may spawn. For instance, while I love my Roomba robotic vacuum, robots are also being used to take care of the elderly—but are we simply handing off our responsibility to machines (which are unlikely to provide the emotional connection or interaction that humans need)? What are the ethics of robots designed to kill—or have sex with—people? In recent weeks, scientists have reported engineering a creature that can suspend all biological activity to survive the harsh environment of space: We have to file reams of environmental impact reports before we can even build a shack on the beach or tinker with the wetlands, so shouldn’t we give some thought to sending new life forms into space? The questions go on in many areas of science and technology, and the public needs to weigh in.

 

A new generation of scientists and engineers

What responsibilities do scientists and engineers have toward society? Some believe that they need not concern themselves with matters of ethics: We’re not ethicists, so let someone else sort those out, they say. But as quickly as science and technology move now, ethics won’t have time to catch up. So as the new academic year starts, I’d like to challenge science and engineering students at Cal Poly, Cuesta College, and beyond—as well as the steady flow of technologists to the area—to develop some awareness of the societal impact their (future) research may have. At the least, this would help anticipate and defuse problems arising from public misperception, such as the global backlash against genetically modified organisms or “Frankenfoods.”

There’s also a lesson here about science communication. If the CERN experiment is really far more modest than recreating the Big Bang, with all that implies, then let’s say so. It also doesn’t help that the search for the so-called “God particle” is part of the experiment, a name that inflates the project’s importance, at least in public perception. In another area, nanotechnology has been called “the next Industrial Revolution,” but it’s foolish to think that it can live up to that billing without being disruptive (e.g., harmful to worker safety), as previous industrial revolutions have been. Scientists and engineers will need to communicate in advance the aims and risks of their work, real and perceived, if they expect the public to understand and accept it. Failing to do so, or ignoring the job in hopes that attention will stay within the informed science community, invites the kind of negative speculation and lawsuits that now plague the CERN experiment.

Local colleges and universities are making proactive efforts in technology literacy for us liberal-arts folks, as well as humanities literacy for the science-minded. For instance, my work at Cal Poly includes starting an Ethics & Emerging Technologies Group, which has already been awarded a Department of Defense grant to study risk and ethics with autonomous military robots. The local Discovery Institute for the Advancement of Science & Technology Education is reaching out to K-16 students and recently hosted an impressive showcase at Cuesta College (which included robotic creations by elementary-school kids!). Allan Hancock College continues to invest in the right areas with its new science building, innovative programs, and increasing focus on its Social Sciences Department and other areas. U.C. Santa Barbara, nationally recognized for its science research, also has an active Center for Nanotechnology and Society, among other programs.

Science and technology are a far cry from the model rockets, chemistry sets, and electronics kits of previous generations; they are serious business now, and they tread on social and ethical ground. We can no longer afford, as a society and for the sake of national competitiveness, for the public to be technologically illiterate or for scientists to be uneducated in matters of society. Science and society are now deeply connected: Think about energy, global warming, bioengineered foods, technology-driven markets, social-networking sites, our daily gadgets, and so on.

We’re on the right trajectory: I see more attention being paid to the ethics of technology, and forward-thinking general-education requirements go a long way toward ensuring that students of all majors are introduced to both science and the humanities. Even if the saying that ignorance is bliss is mostly right, sometimes it’s helpful to know, or at least to have been asked, as with CERN: It still seems better to see Armageddon coming so we can try to avoid it—or at least to reschedule your meetings or exams for the week after.

Patrick Lin is the director of The Nanoethics Group as well as the Ethics & Emerging Technologies Group. His recent work includes federally funded research projects in technology ethics, as well as a pair of nanoethics anthologies. He is a visiting assistant professor in the philosophy department at Cal Poly, San Luis Obispo. Dr. Lin also has academic appointments at Dartmouth College and Western Michigan University. He earned his B.A. from U.C. Berkeley and his M.A. and Ph.D. from UCSB. Send comments to the executive editor at [email protected].
