Nanoscale optics, quantum computing - the battle for technology supremacy is being fought inside the labs of a national standards agency called NIST. And the new enemy is in the White House.
Bill Phillips has taken the "close enough for government work" dodge and turned it upside down. He is sitting in a room that looks every bit like a civil-service cliché - bare walls and low-bid-contractor furniture inside a dull building in an equally dull town just outside the Washington Beltway - and explaining how he can tell time with a deviation of about one second every 20 million years. The feat helped win him the 1997 Nobel Prize in physics.
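To put that figure in perspective, here's some back-of-the-envelope arithmetic - an illustration of the scale involved, not NIST's own specification:

```python
# Rough illustration: convert "one second of drift every 20 million years"
# into a fractional timing error. Numbers are approximate.
SECONDS_PER_YEAR = 365.25 * 24 * 3600    # about 3.16e7 seconds

drift_seconds = 1.0
interval_years = 20e6
fractional_error = drift_seconds / (interval_years * SECONDS_PER_YEAR)

print(f"fractional error ~ {fractional_error:.1e}")   # roughly 1.6e-15
```

Put another way, the clock's ticking is trustworthy to roughly a part in 10^15.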
Phillips, like the other researchers gathered with us in this room, has bypassed the corporate and academic worlds to work here in the Gaithersburg, Maryland, headquarters of the National Institute of Standards and Technology, aka NIST. He has spent more than two decades as a physicist in this spot, and he's never really been tempted to leave. "The fact is," he says, gesturing to include his colleagues, "most of us are interested in learning how things work rather than in making money."
That kind of commitment runs thick throughout NIST, which is split between two campuses - the main facility in Gaithersburg and a second in Boulder, Colorado. Celebrating its hundredth anniversary this year, the agency was originally named the National Bureau of Standards and charged with maintaining a measurement infrastructure that would determine the exact length of a meter, or how long a second really lasts, or how much electrical potential constitutes a volt. In other words, NIST would create yardsticks - at a time when there were at least eight different measurements for a gallon in the United States.
A century later, NIST defines the meter as the distance light travels in a vacuum in one-299,792,458th of a second. Researchers here study everything from nanocrystals to quantum computing. As the Supreme Court of measurement in an increasingly nano world, the agency is honing its level of precision to the atomic scale, an effort that is leading its staff of 3,200 - with a budget of $635.8 million in 2000 - to explore the very boundaries of the physical world.
Take, for example, the measurement of optical fibers used in telecommunications. To prevent the degradation of signals caused by splicing together fibers of different widths, NIST has created an extraordinarily precise micrometer that can measure fiber diameters to within 50 nanometers - the width of about 100 molecular layers of the glass.
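The comparison is easy to check - a hypothetical sanity check, assuming a molecular layer of silica glass is roughly half a nanometer thick and a standard fiber cladding is about 125 microns across:

```python
# Sanity check on the fiber-measurement claim. The layer thickness and
# cladding diameter below are assumed typical values, not NIST figures.
tolerance_nm = 50.0
layer_thickness_nm = 0.5          # assumed thickness of one molecular layer of glass
cladding_diameter_nm = 125_000.0  # assumed 125-micron fiber cladding

print(f"{tolerance_nm / layer_thickness_nm:.0f} molecular layers")         # ~100
print(f"1 part in {cladding_diameter_nm / tolerance_nm:.0f} of the fiber")  # ~2,500
```

Measuring a strand of glass to one part in a few thousand of its own width is what keeps spliced fibers from garbling the light passing through them.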
You need exquisitely advanced physics to work out measurements like these, and while such details may seem like so much esoteric mumbo-jumbo, they are literally the language of modern science, and, increasingly, of our everyday lives. Solar panel makers, semiconductor manufacturers, optical communications companies, chemical suppliers, TV technology developers - all exploit NIST measurements, standards, and technologies. It's NIST that figured out how to measure the dosage of radioactive "seeds" implanted into cancerous tumors. NIST verifies the electrical outputs of heart defibrillators.
And yet, despite the impact of these types of projects, Phillips and his colleagues know that NIST remains obscure to most Americans. Some members of Congress and their aides say they aren't sure what NIST does, and even The New York Times, in a full-on feature celebrating the agency's 100 years, overlooked NIST's growing contributions to nanotechnology.
So it's no surprise that voices rise in the Gaithersburg conference room when I ask the handful of NIST physicists to convey the nature of their work. "Every mammogram in this country is traceable to NIST!" says one. "We're extending the time frame for Moore's law!" adds another.
Physicist Robert Celotta, a sleek dresser with a solid Republican haircut, stands up and says I have to see it all for myself. We navigate a series of hallways en route to a bifurcated room that's humming with gear. One half of the space is lined with computer monitors. The other is occupied by a gleaming, stainless steel machine, shaped like a series of diving bells, with small round windows that make the whole thing look like a piece of Captain Nemo's sub. Celotta tells me it's an autonomous atom assembler; it moves individual atoms to build nanostructures while making the process visible, in graphic form, on the monitors. Right now, the screens are depicting the construction of a nanoscale box, its walls composed of single atoms. For the moment, there's not much Celotta can do with the box: Making use of it, or any other atomic-scale structure, requires a mastery of the governing rules of a quantum-mechanical universe that physics hasn't yet achieved. But the rewards could be great. At this scale the problems of jamming more data onto a hard drive vanish, and the potential for new forms of medical treatment explodes. Researchers imagine tiny machines, inserted into the bloodstream, that could act like scissors, clipping off plaque and cholesterol.
Like much of the work at NIST, such effort appears to have only the dimmest connection to measuring something. "Our role is three-fold," Celotta says when I ask how his atom assembler fits into NIST's mission. "One is standards." NIST helps determine the shared lexicon of size, weight, speed, temperature, density - and every other metric science can dream up - whereby individuals, companies, and countries communicate. Without standards, manufacturers can't reproduce objects exactly; without standards, trading partners can't agree. "Another," he continues, "is developing advanced measuring techniques to get picked up by industrial companies and made into products." NIST invents new machines and processes that capture those metrics, and often winds up releasing the devices to industry. "The third is to produce data to characterize materials beyond anybody else's abilities." This is what Celotta is doing with his atom assembler - studying how a nanostructure will behave and how it might be manipulated. NIST tests and catalogs the properties of substances not yet fully understood.
Work like this has attracted more than a thousand of the best scientific and technical minds from around the world. (Whereas Xerox PARC, in its heyday, employed about 300 all told, NIST's research staff alone totals about 1,700.) It has also meant that the agency, whether by default or design, has become a repository of technological odds and ends. This makes walking through the Boulder or Gaithersburg campus seem like a visit to a national wildlife refuge for science geeks. While somebody upstairs is figuring out how much heat a burning chair releases, somebody downstairs is finding out how sticky he can make a polymer.
But NIST's work is almost universally praised by scientists and academics, who say it's an essential font of data, technique, and innovation at a time when major companies are cutting their own basic science efforts. "It used to be places like Bell Labs did what we do," says NIST researcher Eric Cornell. "Their day is passing."
Caltech physicist David Goodstein agrees: "Companies like Boeing, AT&T, and Hughes supported big facilities doing fundamental research. Today, most of those labs have shut down or been scaled back." Without NIST, Goodstein believes, the US would not be a tech leader.
Where NIST comes in for criticism is around the edges of its research. And this year, with a new administration in the White House, that faultfinding has turned to action. After years of ideological quarreling in Congress over NIST's precise role, George W. Bush's budget blueprint this March called for a "reassessment" of the agency's cash grant program, initiated in the 1980s to fund advance-guard research that businesses wouldn't support on their own. The blueprint wiped out funds for new grants, effectively killing the program, which accounts for one-quarter of NIST's budget.
NIST insiders are firm that no Congress would dare cripple the agency at its core - the basic mandate of making our clocks synchronize and our inches match up is not likely to be contested. What will be the object of worried debate, as winds shift in Washington, is whether NIST's labs should continue to be a haven for cutting-edge research and attempt to fill the void left by yesterday's R&D giants.
NIST has always been an absolutist kind of place. With its standard-setting platinum-iridium meter bars and kilogram nuggets stored in safes, this institution worships accuracy. And US businesses depend on its piety.
The extreme ultraviolet (EUV) consortium, for instance, a group of chipmakers and laboratories that includes Intel and AMD, is relying on NIST to help the semiconductor industry boost the power of its microchips. The consortium hopes to increase transistor density by using ultraviolet wavelengths as narrow as 13.4 nanometers to print designs on chips. But for EUV technology to work, the stepper optics - mirrors and lenses that reduce a big image into a minuscule one that will fit on a chip - have to be within a few atoms of perfection to avoid distorting the image; the smoothness of the optics' surfaces must be uniform to within 1 nanometer.
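One way to grasp why the tolerance is so brutal - a rough comparison, not the consortium's actual spec sheet - is to express that 1-nanometer smoothness requirement as a fraction of the wavelength being printed:

```python
# Rough comparison of the required mirror smoothness to the EUV wavelength.
wavelength_nm = 13.4   # EUV wavelength cited above
smoothness_nm = 1.0    # required surface uniformity cited above

print(f"tolerance ~ 1/{wavelength_nm / smoothness_nm:.0f} of a wavelength")
# Surface bumps approaching a sizable fraction of the wavelength scramble
# the reflected image - hence "within a few atoms of perfection."
```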
NIST's Synchrotron Ultraviolet Radiation Facility in Gaithersburg is just that kind of perfection machine. Shaped like an oversize doughnut, about 6 feet in diameter, SURF III is a particle accelerator that sends electrons racing around a circle so they'll throw off photons. The resulting light can be used to measure the quality of the steppers. "When we're comparing optics for our tool manufacturing in Europe and Japan," says Chuck Gwyn, an Intel scientist who manages the EUV consortium, "we have to make sure they're cross-correlated for accuracy and measurement."
And NIST works with other such consortia. Currently, the agency is aiding the International Disk Drive Equipment and Materials Association (Idema) in developing ways to characterize the magnetic properties of disk media films, some of which are only a couple of atoms thick. NIST will test the films and their magnetic stability at various thicknesses. Then, in a kind of round-robin of exactitude, Idema member labs will test them over again, and again pass the job to NIST. "NIST's measurements will become the gold standards," says Winthrop Baylies, a founder of Idema and a participant in the Magnetics Test Task Force. Companies will use the standards to make sure their products are consistent, configuring their own testing gear so it's calibrated to NIST's.
Some of NIST's work leads to the outer limits of science and the physical world. What begins as an attempt to build a fancy scale or ruler can end up being the basis for a major discovery. This was the case with the Bose-Einstein condensate. Since its earliest days at the turn of the last century, NIST had been keeping the nation's civilian time, most recently with a quartz-crystal clock calibrated to mean solar time. Then, in 1949, it replaced this technology with its first atomic clock. (Let's face it: Our planet keeps crummy time. Measuring days - and hours, minutes, seconds - by the rotation of Earth on its axis, while glaciers are melting and oceans are changing and the whole ball is wobbling in its orbit, wasn't good enough for a cult like NIST.) But counting the 9,192,631,770 oscillations of a cesium 133 atom that make up each second isn't easy, largely because the atoms create a distorting Doppler effect as they whiz through the clock's stainless steel tube. So, in the late 1980s, NIST's future Nobelist Bill Phillips developed a way of using lasers to apply the brakes to atoms and dampen the Doppler effect. By 1995, NIST scientist Eric Cornell and University of Colorado researcher Carl Wieman had built on Phillips' work to create the first Bose-Einstein condensate, supercooled rubidium whose atoms move so slowly that, at about 30 nanokelvin (or billionths of a degree above absolute zero), it's the coldest thing in the universe.
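The Doppler problem Phillips attacked is easy to estimate with textbook kinetic theory. The sketch below is illustrative only - it assumes laser-cooled atoms at a few microkelvin, not the parameters of any actual NIST clock - but it shows why slowing atoms matters: the first-order Doppler smearing scales with the atoms' speed.

```python
import math

# Back-of-the-envelope Doppler estimate using v_rms = sqrt(3 k T / m).
# The cooled temperature below is an assumed ballpark, not a NIST figure.
K_B = 1.380649e-23          # Boltzmann constant, J/K
C = 2.99792458e8            # speed of light, m/s
M_CS = 133 * 1.66054e-27    # approximate mass of a cesium-133 atom, kg

def rms_speed(temp_kelvin: float) -> float:
    """Root-mean-square thermal speed of a cesium atom at a given temperature."""
    return math.sqrt(3 * K_B * temp_kelvin / M_CS)

for label, temp in [("room temperature (300 K)", 300.0),
                    ("laser-cooled (assumed 2 microkelvin)", 2e-6)]:
    v = rms_speed(temp)
    print(f"{label}: v ~ {v:.3g} m/s, Doppler smearing ~ {v / C:.1e}")
```

Knocking four orders of magnitude off the atoms' speed knocks the same factor off the Doppler distortion, which is what makes counting those 9,192,631,770 oscillations tractable.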
Now, at NIST's Boulder campus, in research labs known as JILA (the Joint Institute for Laboratory Astrophysics, operated in conjunction with the University of Colorado), Cornell is refining the achievement that could make him the agency's second Nobel Prize winner. Whereas Phillips succeeded at holding atoms stationary for about a second, Cornell is attempting to keep them stable indefinitely. (In their normal state, atoms bounce around so furiously that attempting to study them is like herding ducks.) The BEC, as the Bose-Einstein condensate is called, is a mass of atoms that are so stable they tend to act like one big atom - big enough to be almost visible to the naked eye.
Cornell's lab is chock-full of electronic gear - oscillators, cameras, lasers, lenses, and video monitors. He uses the lasers to push against the momentum of the rubidium atoms. Once the atoms have all but stopped moving, they fall into a trap, an invisible magnetic field, where they collect into the condensate - Cornell describes it as "gelatinous." A slight man with boyish features, the 39-year-old physicist says that right now he can't make the substance do all that much. ("We whack it, we wiggle it, we take its temperature.") In the future, though, the process of creating the BEC may lead to single-atom-layer manufacturing, or superconductive devices, or quantum computing. If you can make atoms hold still and work in unison, why not also get them to act like 1s and 0s - or qubits? A quantum computer, according to Phillips, could quickly solve problems that no classical computer could crack, even if it were allowed to run for billions of years.
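Phillips' claim comes down to bookkeeping: describing the state of n qubits takes 2^n complex numbers, so simulating even a modest quantum machine on a classical computer blows up exponentially. A small illustration, assuming 16 bytes per complex amplitude:

```python
# Why simulating qubits classically gets out of hand: an n-qubit state
# is described by 2**n complex amplitudes. We assume 16 bytes apiece
# (two double-precision floats).
BYTES_PER_AMPLITUDE = 16

for n_qubits in (4, 20, 50):
    amplitudes = 2 ** n_qubits
    memory_bytes = amplitudes * BYTES_PER_AMPLITUDE
    print(f"{n_qubits:>2} qubits: {amplitudes:.3e} amplitudes, ~{memory_bytes:.3e} bytes")
```

Fifty qubits already demand more memory than any machine on Earth; a quantum computer carries that state in its atoms for free.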
So far, Cornell has gotten his BEC to act like one big, lethargic atom wave - shaped, he says, "like a Reaganesque pompadour gelled into place."
MIT vice president and dean of research David Litster, a longtime observer of NIST, says quantum computing could be just the beginning of the BEC's uses. What sort of nanomachines could be made, he wonders, with a beam of atoms doing the fabrication? "It's really far out, but we can imagine a coherent beam of matter doing all sorts of fancy things: Just think about molecular-beam lithography for microchips."
Today NIST runs a multimillion-dollar program with three teams of researchers focused on the problem of quantum computing. One is headed by Cornell, one by his clockmaking colleague Bill Phillips, and one by Boulder physicist Dave Wineland. Wineland, a tall, lanky man who looks a bit like Frank Zappa, has already created a 4-qubit quantum computer made of stationary beryllium ions that can perform very simple calculations.
When I ask Wineland about the urgency of his research - about rival labs all over the world attempting to achieve the same goals - he simply smiles and kids around. Like Phillips, he displays the spirit of a scientific purist who's in it for the edification, not the triumph. "It's all driven by spies," he jokes, referring to the funding NIST receives from the National Security Agency and Darpa. And then he adds, "Most of us are in this business because it's like going to school forever. It's not really a job. It's like a hobby."
In Boulder, Ray Radebaugh shares the passion. His work - more than Phillips', more than Wineland's - truly stretches the definition of NIST's measurement mission and lobs the mind into the far reaches of possibility. In the race to create new kinds of bombs after World War II, the US needed a place to make liquid hydrogen, and NIST's Boulder lab wound up with the assignment. Now, cryogenics expert Radebaugh and the people in his lab create cryocoolers - metal devices that turn gas into liquid. "If you're going to Mars, you need enough fuel to return, and fuel is too heavy to take with you from Earth. You have to make it while you're out there," Radebaugh explains, as if he's describing changing the oil in his car. For round-trip travel to the Red Planet, he has created a pulse-tube cryocooler - a stainless steel tube about 2 feet long fitted with a steel- and gold-plated copper cooling element he calls the "cold tip." Small pistons vary the gas pressure in the pulse tube. Just the right changes in pressure force gas back and forth through a restrictor valve between a warm end and the cold tip, and a heat exchanger at the warm end dissipates heat. The gas is expanded at the cold tip until it becomes a liquid and drips into a dewar. The device, the result of a 1982 collaboration with NASA, is designed to suck in Martian gases and output them as rocket fuel.
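The physics doing the work at the cold tip is ordinary expansion cooling. The snippet below applies the textbook ideal-gas relation for a single adiabatic expansion - a cartoon of the principle, not a model of Radebaugh's actual pulse tube, which cycles continuously and relies on the valve and heat exchanger described above:

```python
# Cartoon of expansion cooling via the ideal-gas adiabatic relation:
#   T2 = T1 * (P2 / P1) ** ((gamma - 1) / gamma)
# This illustrates the principle only, not the pulse-tube cycle itself.
GAMMA = 5.0 / 3.0   # heat-capacity ratio for a monatomic gas such as helium

def expansion_temperature(t1_kelvin: float, p1: float, p2: float) -> float:
    """Temperature after adiabatically expanding an ideal gas from p1 to p2."""
    return t1_kelvin * (p2 / p1) ** ((GAMMA - 1) / GAMMA)

# A single tenfold pressure drop starting from room temperature:
print(f"{expansion_temperature(300.0, 10.0, 1.0):.0f} K")   # about 119 K
```

Real cryocoolers repeat the cycle over and over, using each pass to pre-cool the incoming gas, which is how the cold tip gets walked down to the point where gases liquefy.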
Radebaugh has also created acoustic cryocoolers that eliminate the pulse tube's pistons in favor of acoustic oscillations to produce the differential between expanding and contracting gas. These devices are now being used in a demonstration project for liquid natural-gas vehicles, where piped-in natural gas is liquefied onsite at filling stations, eliminating the need to truck in fuel. And Radebaugh's lab is perfecting what are called cryocatheters - narrow coaxial tubes designed to slide into the body through tiny incisions. Cryocooled gas flows through the tube to a surgical tip, which is used like a scalpel to perform delicate operations. Work like this has made NIST the world's leading site for cryogas research, and Radebaugh a star in the field.
Charles I of England discovered the importance of accuracy and impartiality the hard way. In the 1640s, he tried to boost tax revenues by decreasing the volume of a liquid measure called a jack while keeping the tax on the jack the same. That meant his subjects got fewer sips for their tax dollars, and the move led, according to some interpretations, to a protest chant dubbed "Jack and Jill." A hill was mounted, a pail was fetched, but disaster ensued: "Jack fell down." Since two jacks equaled one gill, the poor girl "came tumbling after." This type of arbitrary taxation, along with absolutist religious policies, led to a civil war, which Charles lost. "He broke his crown" in 1649 - which is to say he was beheaded.
Such disputes, if less bloody, were not uncommon in the US before 1901. There was an office of weights and measures, but it failed to apply uniform standards throughout the entire country. What few reliable measuring devices there were had to be calibrated in Europe, where metrology - the science of measurement - was well established. But the advent of electrification in the late 1880s forced the US government to become a more aggressive arbiter of amounts. One company's network couldn't link to another's; the amount of light emitted by lightbulbs was all over the map. Business' need for a rigorous referee to bring some order to the industry - and some relief from litigation - was so pressing that Congress authorized the Bureau of Standards as the nation's first physical science research laboratory, locating the agency within the Treasury Department, then renowned for catching forgers and other cheats. The Bureau of Standards was later moved to the Department of Commerce and Labor, and when this department was divided in 1913, the Bureau was folded into the Department of Commerce.
A good deal of NIST's work throughout its history has been for the US government. In World War II, the agency helped develop proximity fuses, devices that could tell how close bombs were to the ground and then detonate them at just the right elevation. James Faller, now the director of NIST's quantum physics division, helped design the reflector array that Apollo 11 placed on the moon in 1969. That array, along with others left by Apollo 14 and 15, has helped measure the distance between Earth and the moon to the inch. In addition to calibrating NASA's science satellite optics, NIST's SURF III also checks up on the lenses in the nation's spy birds.
But from the beginning, even the work NIST performed for the government wound up priming business. For example, before World War I all optical glass came from Germany; during the war, the US faced a sudden shortage of parts for periscopes and binoculars. So NIST began making optical glass. "We made tons of the stuff," says Robert Scace, retired director of NIST's Office of Microelectronics Programs and something of a NIST historian. "It was enough to supply all the critical needs during the war; then Bausch & Lomb and Kodak picked up on the technology and so did glass companies like Corning." Over the years, a range of inventions has been handed over to the private sector - like the high-speed dental drill, closed captioning, and a digital Braille reader. (See "Proto Type," Wired 8.09, page 79.)
NIST has been a crucial arbiter of standards for the computer industry. In the '60s, the agency promoted ASCII by adopting it for government use. For years NIST has helped coordinate the worldwide development of a system of standards called STEP (Standard for the Exchange of Product Model Data), which aims to facilitate interoperability among industrial suppliers, manufacturers, and subcontractors, so that a company designing a widget will have a standard for communicating that widget's characteristics to collaborating engineers. In 2000, NIST presided over a competition for a new data encryption scheme to replace the quickly outmoded DES; because it will be adopted by the government and won't be patented, the winner will likely become the standard for many commercial applications. NIST also works with Oasis, the XML consortium dedicated to advancing the language of the Web.
And in recent years, NIST has been called upon to help US industry remain competitive with other potential technology superpowers. In 1987, US senator Ernest Hollings (D-South Carolina) discussed the new science of superconductivity with Craig Fields, Darpa's deputy director at the time - and came away alarmed. He was worried that Japan's Ministry of International Trade and Industry was unfairly assisting its own industrial giants, and he feared Japan was about to use American research to commercialize superconductivity and elbow out US companies. "We would win the prizes and the Japanese got the profits!" Hollings recalls, with urgency still in his voice. What was needed, Hollings reasoned after his meeting with Fields, was a civilian Darpa, and NIST seemed the likeliest home.
The passage of the Omnibus Trade and Competitiveness Act of 1988 created two new NIST programs designed to soothe Hollings' fears. It established the Manufacturing Extension Partnership, a system of government consulting aid to small companies, and it launched the Advanced Technology Program (ATP), a Darpa-like system of grants for companies pursuing risky technologies that might not find private funding. As if to emphasize this turning point in its history, the agency's name was changed from the National Bureau of Standards to the National Institute of Standards and Technology.
ATP is self-consciously aimed at maintaining the American lead in technology and commerce - warfare under another guise. Its division leaders (representing technology areas like electronics and photonics, tissue engineering, and so on) consult with industry and science experts in a variety of fields to identify new technologies that, if developed, could give the US a leg up on the competition. Gene expression was one such technology. ATP funded early R&D projects for creating DNA microarrays, sprinkling tens of millions of dollars in grants among companies like Affymetrix, Nanogen, and Motorola. Partly as a result of this funding, the US now virtually owns the worldwide market in so-called biochips.
"Our contract with NIST has been absolutely essential," says Motorola's Herb Goronkin. "We have been able to break cells apart, extract DNA, purify it, dice it up, amplify segments, then analyze those segments for the sequence of DNA, and compare that sequence to known strands. Because of NIST funding, we have been able to do all that on individual chips."
The Hollings-backed legislation of 1988, however, ushered in more than a bigger budget for NIST. It marked a change in focus, too, one that raised questions about what exactly the agency was supposed to be. Suddenly, NIST was a third bigger, with a new bureaucracy that apparently had nothing to do with its measurement research. What's more, the ATP system of giving companies matching grants put NIST in the position of directly subsidizing business. NIST's James Faller, one of the world's leading experts on the force of gravity, worries openly that the gravitational pull of money could damage NIST's reputation - that the combination of filthy lucre and Congress' appetite, he argues, could undermine the lab's renowned immunity from political pressure. For him, NIST should be about science. Period.
Meanwhile, congressional opponents of this expanded NIST role say that the government has no business subsidizing private enterprise in the first place. Almost every year since the first President Bush approved ATP in 1990, there has been a move in Congress to cancel its funding. "If we are against welfare for the poor," says US representative Dana Rohrabacher (R-California), "then we have to be against it for the big corporations." The fact that Motorola, a company with a market cap of $69 billion, was given a $4.4 million ATP grant to develop DNA expression analysis products, or that Harris Corp. was given $13.8 million to develop wireless infrastructure "for digital video and multimedia applications," makes the program an easy target for legislators sharpening their budget-cutting teeth. Yet until now the program has thrived, in part because NIST is not alone. There are 10 other federal agencies that dole out SBIRs, or Small Business Innovation Research grants, and small businesses are the chief targets of ATP. The National Institutes of Health alone had a 2000 budget of $350 million for such grants. Darpa has long supported the US semiconductor industry, spending $252.4 million in 2000 on "advanced electronics technology."
Other critics worry that NIST's role as a kind of advanced science weapon that can fend off foreign competition with marquee research has caused it to become preoccupied with winning prizes rather than creating the standards artifacts that industry can use to test and measure its own products. "NIST is not paying enough attention to getting materials and data to industry," argues Idema's Winthrop Baylies.
And still others assert that NIST simply can't keep up with the rapid pace of corporate innovation. Jeff Livas, a chief technical officer for optical gear maker Ciena, is careful to say he appreciates NIST's value, but he points out that his industry has moved faster than NIST lately, especially in the area of measuring channel spacing in multiplexed optical networks. "Lots of times what you sell as a product is out in front of the standards," Livas says. "For example, 100-GHz channel spacing is the NIST standard. Well, we have been shipping products for a couple of years with 50-GHz channel spacing, and we recently announced a 12.5-GHz."
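For scale, channel spacing quoted in gigahertz translates into a wavelength gap near the 1,550-nanometer band that long-haul optical systems typically use, via delta-lambda = lambda^2 * delta-f / c. A quick conversion - illustrative arithmetic, not Ciena's or NIST's published figures:

```python
# Convert optical channel spacing from frequency (GHz) to an approximate
# wavelength gap (nm) near an assumed 1550 nm operating wavelength.
C = 2.99792458e8          # speed of light, m/s
WAVELENGTH_M = 1550e-9    # assumed operating wavelength

for spacing_ghz in (100, 50, 12.5):
    delta_lambda_nm = (WAVELENGTH_M ** 2) * (spacing_ghz * 1e9) / C * 1e9
    print(f"{spacing_ghz:>5} GHz spacing ~ {delta_lambda_nm:.2f} nm between channels")
```

Halving the spacing doubles the number of channels a carrier can pack into the same slice of fiber, which is why vendors keep racing out ahead of the standard.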
These criticisms have caught the ear of the new president. As long as Bill Clinton and technophile veep Al Gore were in office, those who griped about NIST made little headway. Raymond Kammer, named director of the agency in 1997, was an eloquent spokesperson for the lab's expanded role. He argued that NIST had to step into the void created by the rollback of corporate R&D. The US can quibble all it wants about whether the government ought to be picking up the slack, he argued, but somebody has to do the science.
But Kammer is history: He announced his resignation a few days after the election outcome was certain last year, making room for a Bush administration appointee. (Karen Brown, acting director, remains in office as of this printing.) And Bush's cabinet has responded promptly to detractors. Commerce secretary Donald Evans heard the "corporate welfare" argument about ATP and asked for the freeze on new grants. Now would-be recipients are wondering whether to bother applying, and ATP staffers are keeping their eyes open for new jobs. Although the Commerce Department insists ATP's fate is not a foregone conclusion, Kammer says the freeze amounts to Republican payback for a pet Clinton project and ruefully calls the reassessment period "the fair trial before the hanging." Yale physicist D. Allan Bromley, who was science and technology adviser to the president from 1989 to 1993, agrees that any freeze or elimination of ATP "is a terrible mistake."
"The federal government," he insists, "should support basic research." The Senate, long ATP's guardian angel, might mount a defense this year. Whether or not it succeeds, debate over ATP's funding is sure to expose Washington's views about government's long-term obligation to support science.
Charles Clark is in a meeting, but he has left instructions that he wants to see me - or rather, wants me to see the synchrotron facility. His face lights up when I peek in the door of the meeting room, and he excuses himself. And then Clark, a solid guy who at 48 still looks a little like an Ivy League fullback, begins a purposeful stride down one of NIST's ubiquitous hallways. I can barely keep up without breaking into a jog, and all the while Clark is talking. He told me earlier about what the synchrotron does, but that isn't enough. He wants to show me. And as we approach the big warehouse of a building where the device, SURF III, shoots its electrons around and around, he actually seems to grow more excited.
We stop in an anteroom where Clark shows off photographs of the sun taken by NASA research satellites. One of the synchrotron's many applications is to test the optics in cameras intended for highly specialized uses, such as NASA's program for monitoring solar radiation. SURF III provides a constant, known amount of light radiation to calibrate those optics - by literally counting individual electrons as they race around the synchrotron. The photos - arranged on the wall in chronological order - all depict enormous solar flares, great fire hoses of gas reaching out into space. And, as evidence once again of the NIST religion, the earliest images are good, but each succeeding picture is better - sharper, clearer, more detailed - than the one before.
Clark's pride in the photos produced by someone else's project is typical of the NIST mentality. NASA is the glory agency. It gets the oohs and ahhs and the news conferences describing how much ultraviolet radiation is released by a solar storm and how it might affect Earth's atmosphere. But that's OK with Clark. Like everybody else at NIST, he doesn't seem to care all that much about being famous. He doesn't mind that NIST didn't make the optics, didn't design the satellite or fire off the rockets that carried it into space. It's enough to know that he and his colleagues are getting the measurements right, so that SURF III can measure the state of electrons in a solid material, assess the optical properties of materials, and suss out how radiation interacts with matter.
In fact, for Clark - and Phillips, and Celotta, and many other NIST scientists - measurement is as exhilarating a science as any other. It requires leaps of imagination and marathons of reason. It generates insights and discoveries and inventions. Far from being the scientific equivalent of accounting - a repetitive labor, deploying yardsticks and calipers and stopwatches - in NIST's hands, measurement is a truly creative science.
We enter the big room where the synchrotron is loudly humming to produce its light. Still talking, Clark whips out a white card as if he's Harry Blackstone pulling a dove from his sleeve. Then he opens one of the light ports, allowing a beam to escape the accelerator. He holds the card behind a diffraction grating that intercepts the light beam and - voilà! - the spectrum!
Of course, I saw my first prism in junior high school, but that's not the point. Clark wants me to see the spectrum in a new way. He stands a few feet past the end of the spectrum and explains that the radiation he can measure exists way, way out there, far beyond the test pattern shining on the card. It seems he can hardly believe it himself.
Then, when I ask if he really means to say SURF III can count individual electrons, he spreads his arms, uses the agency's former name, and shouts over the din, "Hey, man! This is the National Bureau of Standards! It is what we say it is, and we do what we say!"