CERN, the European Organization for Nuclear Research, is like a little city. Straddling the border of France and Switzerland, it employs three thousand people and occupies a site that is measured in square miles. CERN boasts a string of magnets that weigh more than the Eiffel Tower and an underground tunnel over sixteen miles around.
Breaking up atoms, as James Trefil has noted, is easy; you do it each time you switch on a fluorescent light. Breaking up atomic nuclei, however, requires quite a lot of money and a generous supply of electricity. Getting down to the level of quarks—the particles that make up particles—requires still more: trillions of electron volts of energy and the budget of a small Central American nation. CERN's new Large Hadron Collider, scheduled to begin operations in 2005, will achieve collision energies of fourteen trillion electron volts and cost something over $1.5 billion to construct.
All this costly effort has practical side effects, too: the World Wide Web is a CERN offshoot, invented by a CERN scientist, Tim Berners-Lee, in 1989.
But these numbers are as nothing compared with what could have been achieved by, and spent upon, the vast and now unfortunately never-to-be Superconducting Supercollider, which was conceived in the 1980s and begun near Waxahachie, Texas, before experiencing a supercollision of its own with the United States Congress.