To See Proteins Change in Quadrillionths of a Second, Use AI

Researchers have long wanted to capture how protein structures contort in response to light. But getting a clear image was impossible—until now.
Photograph: Westend61/Getty Images


Have you ever had an otherwise perfect photo ruined by someone who moved too quickly and caused a blur? Scientists have the same issue while recording images of proteins that change their structure in response to light. This process is common in nature, so for years researchers have tried to capture its details. But they have long been thwarted by how incredibly fast it happens.

Now a team of researchers from the University of Wisconsin-Milwaukee and the Center for Free-Electron Laser Science at the Deutsches Elektronen-Synchrotron in Germany has combined machine learning and quantum mechanical calculations to get the most precise record yet of structural changes in a photoactive yellow protein (PYP) that has been excited by light. Their study, published in Nature in November, showed that they were able to make movies of processes that occur in quadrillionths of a second.

When PYP absorbs light, it takes in the light’s energy, then rearranges itself. Because the protein’s function inside the cell is determined by its structure, whenever PYP folds or bends after being illuminated, this triggers big changes in its function. One important example of proteins interacting with light is in plants during photosynthesis, says Abbas Ourmazd, a physicist at UWM and coauthor on the study. More specifically, PYP is similar to proteins in our eyes that help us see at night, when a light-sensitive molecule called retinal changes shape, activating some of our photoreceptor cells, explains Petra Fromme, director of the Biodesign Center for Applied Structural Discovery at Arizona State University, who was not involved with the study. PYP’s shape change also helps some bacteria detect blue light that may be damaging to their DNA so they can move away from it, Fromme notes.

Details of this important light-induced molecular shape-shifting, called isomerization, have eluded scientists for years. “When you look at any textbook, it always says that this isomerization is instant upon light excitation,” says Fromme. But, for scientists, “an instant” is something that can be quantified: the changes in the protein’s structure happen in the remarkably short span of time known as a femtosecond, or a quadrillionth of a second. A second is to a femtosecond what 32 million years is to a second, Fromme says.
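As a quick back-of-the-envelope check of that comparison (rough arithmetic offered here for illustration, not a calculation from the study), the ratio works out:

```python
# Rough check of the "32 million years" analogy (not from the study itself).
FEMTOSECOND = 1e-15                      # seconds
SECONDS_PER_YEAR = 365.25 * 24 * 3600    # roughly 3.16e7 seconds

# A second contains this many femtoseconds:
ratio = 1.0 / FEMTOSECOND                # 1e15

# A stretch of time holding that many *seconds* lasts about this many years:
years = ratio / SECONDS_PER_YEAR
print(f"{years:.2e} years")              # ~3.17e7, i.e. roughly 32 million years
```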

Scientists experimentally probe these incredibly short timescales with similarly short flashes of x-rays. The new study used data obtained in this way by a team led by UWM physicist Marius Schmidt at a special facility at the SLAC National Accelerator Laboratory in California. Here, the researchers first illuminated PYP with light. Then they hit it with an ultrashort x-ray burst. The x-rays that bounced off of the protein—called diffracted x-rays—reflected its most recent structure in the same way that light reflected from objects helps make conventional photographs. The briefness of the pulses allowed scientists to get something like a snapshot of the positions of all of the protein’s atoms as they moved, similar to the way a camera with a very fast shutter can capture the different positions of a cheetah’s legs as it runs.
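In the textbook kinematic picture (an idealization offered here for illustration, not a description of the team's actual analysis pipeline), the recorded diffraction pattern is essentially the squared magnitude of the Fourier transform of the molecule's electron density, which is why the scattered x-rays encode a snapshot of where the atoms are:

```python
import numpy as np

# Idealized toy, not the study's pipeline: in the kinematic approximation, the
# diffracted intensity is proportional to |FT(electron density)|^2.
rng = np.random.default_rng(0)
density = np.zeros((64, 64))
for _ in range(20):                         # sprinkle a few fake "atoms" on a grid
    x, y = rng.integers(16, 48, size=2)
    density[x, y] = 1.0

structure_factors = np.fft.fft2(density)                # complex amplitudes F(q)
diffraction_pattern = np.abs(structure_factors) ** 2    # what the detector records

# The detector keeps only intensities; the phases of F(q) are lost, which is the
# classic "phase problem" that any reconstruction method has to work around.
print(diffraction_pattern.shape)
```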

This illustration depicts an experiment at SLAC that revealed how a protein from photosynthetic bacteria changes shape in response to light. Illustration: SLAC

But even the shortest x-ray flashes have typically not made for a fast enough “shutter” to get a femtosecond-by-femtosecond record of a protein’s shape change. “A major problem in analyzing diffraction signals is that the x-ray source is noisy,” says Shaul Mukamel, a chemist at the University of California, Irvine who was not part of the study. In other words, the x-ray flash always leads to at least some blurriness. Imagine the protein as a contortionist folding itself into a pretzel. Using x-rays, scientists can get a clear image of its relaxed pose immediately after it absorbs the light energy that spurs the contortion, and of its intertwined limbs at the end. But any images of its in-between motions would be fuzzy.

However, Mukamel adds, x-ray experiments like the one analyzed in the new study tend to collect huge datasets. Chemists like himself are always trying to innovate ways to unearth new information from them, he says. In the new study, using artificial intelligence to analyze the data was key.

Ourmazd’s Wisconsin team, led by research scientist Ahmad Hosseinizadeh, used a machine learning algorithm to extract unprecedentedly precise information from the experimental x-ray diffraction data. Ourmazd compares their method to an innovation in taking a three-dimensional scan of a person’s head. “Normally, what happens if you want a 3D image of somebody's head, you sit them down, get them to be still, and take lots of pictures,” he says. But his group’s algorithm does something more like taking a series of photos from different angles and at different times as the person repeats the same motion, like slightly turning their head. Then the AI extracts the complete 3D image from this group of snapshots and learns what the entire movement should look like, creating a sort of animated “movie” of it. “Using artificial intelligence at each time point, we’d reconstruct a three-dimensional picture of the head. We’d have a 3D movie as a function of time,” Ourmazd says.
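As a loose illustration of that idea, the sketch below uses a generic diffusion-map-style spectral embedding to put scrambled snapshots of a continuous motion back in order. It is a toy stand-in under simplifying assumptions, with made-up data, not the group's actual algorithm:

```python
import numpy as np

# Hypothetical toy: noisy snapshots of one continuous motion arrive in scrambled
# order; a simple spectral (diffusion-map-style) embedding recovers the progression.
# This is a generic stand-in for the study's manifold-based machine learning.
rng = np.random.default_rng(1)
t_true = rng.uniform(0, 1, size=300)                    # hidden "time" of each snapshot
snapshots = np.stack([np.cos(np.pi * t_true), np.sin(np.pi * t_true)], axis=1)
snapshots += 0.02 * rng.normal(size=snapshots.shape)    # measurement noise

# Pairwise similarities -> row-normalized transition matrix -> leading nontrivial eigenvector.
d2 = ((snapshots[:, None, :] - snapshots[None, :, :]) ** 2).sum(-1)
eps = 0.1 * d2.mean()                                   # kernel bandwidth (a tunable choice)
W = np.exp(-d2 / eps)
P = W / W.sum(axis=1, keepdims=True)
eigvals, eigvecs = np.linalg.eig(P)
top = np.argsort(eigvals.real)[::-1]
coordinate = eigvecs[:, top[1]].real                    # first nontrivial coordinate

# The recovered coordinate tracks the hidden time up to sign and scale.
print(abs(np.corrcoef(coordinate, t_true)[0, 1]))       # typically close to 1
```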

In the PYP experiment, the machine learning algorithm was given data from multiple nearly identical proteins that had been imaged in sequence. (Researchers couldn’t reuse the same protein, because it gets damaged by the x-rays.) The AI extracted the details of the process without the blurriness of the x-ray flashes, and it uncovered what the blur had been obscuring. Remarkably, these images showed how electrons inside the protein move within frames that are only femtoseconds apart. These movies—which the team later slowed down enough to allow the human eye to track the change—show electrons moving from one part of the protein to another. Their motion inside the molecule indicates how the whole thing is changing its structure. “If my thumb moves, then the electrons inside of it have to move with it,” Ourmazd offers as a comparison. “When I look at the change in the charge distribution [of the thumb], it tells me where my thumb was before and where it has gone.”
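In crystallographers' terms, that “change in the charge distribution” is the kind of thing captured by a difference electron-density map: subtract the density measured before the light pulse from the density measured some femtoseconds after it, and the negative and positive regions show where electrons left and where they arrived. A minimal, made-up illustration of that bookkeeping (hypothetical arrays, not the study's data):

```python
import numpy as np

# Made-up illustration of a difference electron-density map; the arrays are
# synthetic stand-ins, not data from the study.
rho_dark = np.random.default_rng(2).random((32, 32, 32))   # density before the light pulse
rho_light = rho_dark.copy()
rho_light[10, 10, 10] -= 0.5                               # electron density leaves this voxel...
rho_light[12, 10, 10] += 0.5                               # ...and appears over here

diff_map = rho_light - rho_dark
lost = np.unravel_index(diff_map.argmin(), diff_map.shape)
gained = np.unravel_index(diff_map.argmax(), diff_map.shape)
print("density lost at", lost, "gained at", gained)        # (10,10,10) -> (12,10,10)
```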

The protein’s reaction to light has never been observed in such small time increments before. “There's a lot more information in datasets than people generally think,” Ourmazd says.

To better understand the motions of electrons, the Wisconsin team worked with physicists at the Deutsches Elektronen-Synchrotron who performed theoretical simulations of the protein’s reaction to light. The electrons and atoms within the protein have to move according to the laws of quantum mechanics, which act as something like a rulebook. Comparing their results to a simulation based on those rules helped the team understand which of the allowed moves the protein was performing. This brought them closer to understanding why they saw the motions they did.

The union of quantum theory and AI encapsulated in the new work holds promise for future research into light-sensitive molecules, says Fromme. She emphasizes that a machine learning approach can extract lots of detailed information from seemingly limited experimental data, which may mean that future experiments could consist of fewer long days doing the same thing over and over in the lab. Mukamel agrees: “This is a most welcome development that offers a new path for the analysis of ultrafast diffraction measurements.”

Coauthor Robin Santra, a physicist at the Deutsches Elektronen-Synchrotron and the University of Hamburg, believes that the team’s novel approach could change scientists’ thinking about incorporating data analysis into their work. “The combination of modern experimental techniques with ideas from theoretical physics and mathematics is a promising route towards further progress. Sometimes, this may require scientists to leave their comfort zone,” he says.

But some chemists would like to see the new approach examined in even more detail. Massimo Olivucci, a chemist at Bowling Green State University, points out that PYP’s response to light includes something like a singularity in its energy spectrum—a point where the mathematical equations for calculating the protein’s energy “break.” This kind of occurrence is as important to a quantum chemist as a black hole is to an astrophysicist, because it is another instance in which the laws of physics, as we understand them today, fail to tell us exactly what is happening.

According to Olivucci, many fundamental processes in chemistry and molecular physics involve these “rule-breaking” features. So understanding the minute details of what a molecule is doing when laws of physics can’t offer clarity is really important to scientists. Olivucci hopes that future work with the machine learning algorithm from the new study will compare its “movies” to theoretical simulations that contain atomistic detail—rulebooks specifying what every single atom in the protein can and cannot do. This could help chemists determine the fundamental reasons why some of the smallest parts of PYP perform some of its fastest moves.

Ourmazd also notes that his team’s approach could help uncover even more about PYP’s response to light. He would like to use the algorithm to observe what happens slightly before the protein absorbs light, before it “knows” that it is about to start contorting, rather than immediately after the absorption, when it is locked into the motion. Additionally, he notes, instead of using flashes of x-rays, scientists could throw ultrafast electrons at the protein and record how they scatter off it, producing even more fine-grained snapshots that the AI could analyze to achieve an even more detailed animation of the process.

Ourmazd would also like to tackle astrophysics and astronomy next, two fields in which scientists have long been taking images of a changing universe, and from which an AI might extract useful data—although he doesn’t have a specific experiment in mind yet. “The world's our oyster, to some extent,” he says. “The question is: What are the most important questions to ask and realistically expect to answer?”

