WASHINGTON, D.C. — Joysticks, 3-D glasses, and room-sized stereoscopic views of some of the tiniest things in the universe — scientists at the Energy Department’s National Renewable Energy Laboratory (NREL) now have a new way to view and interact with their data.
Researchers from national labs, universities, and utilities get a human-sized embodied view of molecules, enzymes, solar junctions, and polymers in the Insight Center at NREL’s Energy Systems Integration Facility (ESIF).
NREL Senior Scientists Ross Larsen, left, and Travis Kemper get a human-scale view of a molecular model of polymeric organic nitroxide radical (PTMA) film at the Insight Center Collaboration Room in the Energy Systems Integration Facility (ESIF) at NREL. Credit: Dennis Schroeder
The primary display in the Insight Center Collaboration Room is a glass wall 16 feet wide and eight feet high, with a projection floor that extends about five feet outward, allowing the user to be physically immersed in the data. Six projectors illuminate the display, their images blended into one seamless picture: four back-project onto the glass wall, while two overhead front-project onto the floor.
Turning the body, even nodding the head, changes the researcher’s view, affording the best possible angle on a junction, a molecule, or anything else on the screen. Users can manipulate the scene from the comfort of a desk using a keyboard and mouse, but they can also interact directly with models in three dimensions using a joystick (and, eventually, gloves).
It’s much more than fun. Real scientific problems are being solved with these larger-than-life visualizations.
Despite the appearance, the image isn’t actually three-dimensional. But with the aid of the room architecture, projectors, glasses, and tracking technology, it sure seems to be.
“The idea behind the Insight Center is to provide researchers new ways to view and interact with their data. For example, the immersive display is designed to help the brain process very complicated data by letting the user physically explore the imagery with the aid of these visualization and interaction technologies,” said NREL’s Kenny Gruchalla, senior scientist, who designed the Insight Center visualization rooms.
“Our brains are very sophisticated pattern-matching machines with a majority of our neurons dedicated to processing visual information. However, the brain has largely evolved to process visual information from an embodied perspective,” he said.
Discovering Junctions Not Seen on the Desktop
NREL Senior Scientist Kenny Gruchalla examines the velocity field from a wind turbine simulation using a 3-D model in the Insight Center Collaboration Room. Credit: Dennis Schroeder
Ross Larsen, a senior scientist and polymer expert at NREL, is a big fan.
He has used the Collaboration Room for two projects — organic radical batteries and organic photovoltaic (PV) cells — to understand how very large polymer molecules pack and intertwine, and how one chain is oriented relative to another chain in complicated arrangements.
He remembers the first day he serendipitously encountered one of his images on the big screen and made an almost instant discovery.
“I had this image of a coarse-grained model — blue and red pipes to represent a bulk hetero-junction structure used in organic PV,” Larsen recalled. “One day, the day before a VIP tour of the lab, that image was up on the big screen looking like huge eye candy.”
He slipped on some 3-D glasses and walked toward the image. “All of a sudden, I noticed, ‘Hey, wait a minute. There are five tubes converging at a node here — over there, there are three, and there are seven up there.’
“It occurred to me that it might make a big difference how easily a charge can move through a material if it has more options.” An electron might be able to exit a device faster — and provide electricity more efficiently — if it had more of those seven-pipe junctions to work with than, say, three-pipe junctions.
He is still wrapping his head around the possible implications, but: “What was new to me was the idea that within a single morphology, there may be multiple different branching levels. The motions the electrons are undergoing — and the environment they are moving through — isn’t nearly as simple as I thought.”
Larsen said he “spent a lot of time staring at those things on my computer screen, rotating them around, and I never noticed anything like that. It took me just five minutes walking into that structure for the first time to see it. That got me very excited — that I discovered something new, even though I don’t know yet what to make of it.”
He is convinced that the human-sized scale of the model helped with the discovery. “I was encountering something about the size of another human. It lets you zoom in at the right scale — if it’s too small, you might as well be at your computer. If it’s too big, it can just be overwhelming.”
High-Tech Glasses Give 3-D Illusion
In the Collaboration Room, optical trackers and infrared strobes work with 3-D shutter glasses to orient the images to the will of the user. The shutter glasses provide a separate image for each eye: their LCD shutters are synchronized with the projectors by a radio-frequency signal, so when the right-eye image is displayed, the left eye’s shutter closes, and vice versa. The computer displays images at 120 hertz, giving the user 60 hertz per eye.
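The frame-sequential alternation described above can be sketched in a few lines. This is an illustrative toy, not NREL's driver code, and it assumes the common convention that even-numbered frames go to the left eye:

```python
def shutter_schedule(n_frames, display_hz=120):
    """Pair each displayed frame with the eye whose shutter is open,
    as in frame-sequential stereo: even frames go to the left eye
    (right shutter closed), odd frames to the right eye. Each eye
    therefore sees display_hz / 2 frames per second.
    Illustrative sketch only; real shutter glasses synchronize to the
    projectors over a radio-frequency link."""
    per_eye_hz = display_hz / 2
    schedule = []
    for frame in range(n_frames):
        open_eye = "left" if frame % 2 == 0 else "right"
        schedule.append((frame, open_eye))
    return schedule, per_eye_hz

sched, per_eye = shutter_schedule(4)
# Frames alternate left, right, left, right; each eye receives 60 Hz.
```

Because the glasses and projectors stay in lockstep, each eye sees only the image rendered for its viewpoint, which is what produces the stereo depth illusion.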
At the same time, reflective markers on the glasses allow the user’s viewpoint to be tracked, providing motion parallax, an egocentric point of view. Because the system tracks the position and orientation of the user’s head through the glasses, the view of the data moves with the scientist: as the head moves, the computer adjusts the position of the scene.
“It gives the illusion that these data models are in three dimensions and in the presence of the user,” Gruchalla said. “As you move through the space, the tracker communicates your viewpoint to the computer. The computer then optimizes the scenery to that position, allowing you to physically move through your data.”
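Head-coupled rendering of this kind is commonly implemented by recomputing an asymmetric (off-axis) view frustum each frame from the tracked head position. The sketch below uses a hypothetical function name and a simplified geometry (a fixed screen centered at the origin); it illustrates the standard technique rather than the Insight Center's actual rendering code:

```python
def off_axis_frustum(head, screen_w, screen_h, near):
    """Compute asymmetric view-frustum extents at the near plane for a
    fixed physical screen centered at the origin in the z = 0 plane,
    viewed from a tracked head position (hx, hy, hz) with hz > 0.
    This is the usual head-coupled ("fishtank VR") projection; the
    name and setup are illustrative, not NREL's code."""
    hx, hy, hz = head
    # Project the screen edges, measured relative to the head,
    # back onto the near clipping plane.
    scale = near / hz
    left   = (-screen_w / 2 - hx) * scale
    right  = ( screen_w / 2 - hx) * scale
    bottom = (-screen_h / 2 - hy) * scale
    top    = ( screen_h / 2 - hy) * scale
    return left, right, bottom, top

# A head centered on a 2 m x 1 m screen yields a symmetric frustum...
centered = off_axis_frustum((0.0, 0.0, 2.0), 2.0, 1.0, 0.1)
# ...while stepping to the right skews the frustum the other way,
# which is what produces the motion parallax described above.
shifted = off_axis_frustum((0.5, 0.0, 2.0), 2.0, 1.0, 0.1)
```

Feeding these extents into a standard perspective-projection call every frame keeps the virtual scene registered to the physical screen as the viewer moves.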
Visualization work currently underway in the Collaboration Room spans data from the very small, at atomistic scales, to the very large, at the scale of North America.
On the huge end of the scale, researchers studying the complex, turbulent flow fields around wind turbines can treat the space as a massive, turbine-array-scale virtual wind tunnel. Standing amongst the turbines, they seed virtual particles to see the complex flow patterns and better understand how turbine wakes and turbulence can cause early gearbox failure.
In a real-world wind tunnel, researchers use smoke to understand the dynamics of the flow. Computational researchers have an analogue to that smoke: given a numerical vector field, they can integrate a path through the field and see where a particle dropped into the flow would travel. The computational space affords many ways to investigate the flow that are not possible in a physical wind tunnel, to say nothing of operating at sizes that would be wholly impractical in the physical world.
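The virtual-smoke idea, integrating a particle path through a numerical vector field, can be sketched in a few lines. This is an illustrative forward-Euler tracer over a hypothetical analytic field; production visualization tools typically use higher-order integrators (such as RK4) over gridded simulation output:

```python
def trace_streamline(velocity, seed, dt=0.01, steps=1000):
    """Trace a particle path ("virtual smoke") through a steady 2-D
    velocity field by forward-Euler integration. `velocity` is any
    function (x, y) -> (vx, vy). Illustrative sketch, not NREL's
    visualization code."""
    x, y = seed
    path = [(x, y)]
    for _ in range(steps):
        vx, vy = velocity(x, y)
        x, y = x + vx * dt, y + vy * dt
        path.append((x, y))
    return path

# Example: a solid-body rotation field; a particle seeded at radius 1
# traces out an approximate circle around the origin.
def swirl(x, y):
    return -y, x

path = trace_streamline(swirl, seed=(1.0, 0.0), dt=0.01, steps=628)
```

Seeding many such tracers at once yields the bundles of flow lines that researchers can stand among in the Collaboration Room's virtual wind tunnel.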
On the tiny end, a researcher can, for example, use the immersive space to zero in on the junctions of a solar cell, sorting through the electrons and corresponding holes to see that multi-pronged junctions can cause entirely different dynamics than single- or double-pronged junctions.
Experiments Prove Worth of Human-Sized 3-D Visualization
High-tech glasses give the viewer a sense of three dimensions at NREL’s Insight Center Collaboration Room, where scientists can see details at human-sized scale that they would often miss when looking at the same models on a desktop. Displayed here is a molecular model of cellulose microfibrils, which are of interest to biofuels scientists looking for easier ways to unlock the cellulose that is crucial to the economics of the industry. Credit: Dennis Schroeder
Gruchalla has a Ph.D. in computer science. His master’s research at the University of Colorado (CU) investigated scientific workflows in virtual environments, and his Ph.D. research at the National Center for Atmospheric Research focused on visualizations of large-scale turbulence simulations.
While at CU, Gruchalla conducted multiple controlled studies to investigate the added value of immersive visualization. In one experiment, 16 subjects planned the paths of oil wells in immersive and non-immersive environments. Fifteen of them finished the task faster in an immersive visualization environment such as the one at ESIF than they did on a desktop computer. And they also had a higher percentage of correct solutions than they did with the desktop.
In a separate study, three independent groups of biochemists visualized and interacted with individual biological molecules in a virtual immersive environment. In each case, the groups gained new insights after 90 minutes in the human-scale environment that they hadn’t discovered during months or years of examining the same data on a desktop computer.
In one case, a group found a molecular pocket that it hadn’t seen before. The researchers credited not only the human-sized scale but the greater opportunity to collaborate during their interactive viewing of the giant models. Two of the groups produced new scientific papers in part as a result of these new insights.
“Once they’re literally standing inside their molecule, and are able to interact three-dimensionally, they’re able to make judgments and take measurements that wouldn’t be possible otherwise,” Gruchalla said.
Complementing the immersive virtual environment of the Collaboration Room is the Insight Center Visualization Room, which is used for presentations and the exploration of large-scale data. Its high-resolution screen is composed of more than 14 million pixels.
The presentation room uses LED projectors that generate minimal heat. They’re rated for 60,000 hours of use, so they shouldn’t need to be replaced for about 20 or 30 years — or, more likely, until new technology supersedes them.
The screen will be used for, among other things, displaying images and simulation results from studies of electric vehicles, wind farm performance, biomass-degrading enzymes, and PV materials. “Using the high-resolution, large-scale display [on the Visualization side] of the Insight Center, we now have the visual real estate to lay out a significant amount of data simultaneously that will enable the analysis of ensembles of these types of simulations,” Gruchalla said.
The screen’s size allows researchers to, say, look at 500 meteorological variables at one time, instead of the dozen or so they could examine on a desktop. That illuminates trends and correlations that would otherwise be lost on the margins.
Insight Center Boon to Energy Systems Integration Facility
ESIF offers utility executives and other decision makers a place to research new technologies in a virtual environment before they’re loaded onto the actual grid. The United States is moving into an era with a higher penetration of renewable energy and distributed energy resources linked to the electric grid. NREL’s $135-million ESIF was built to help utilities prepare for the day in the not-too-distant future when the electric grid speeds power from all our energy resources to American homes, offices, and stores. ESIF’s Insight Center used less than 1 percent of that budget — about $1 million — but is expected to make a significant difference for researchers trying to understand the newest complexities in an integrated grid.
The two visualization rooms work hand in hand with NREL’s petaflop-scale High Performance Computer, which, not coincidentally, sits just a few feet from the big screens, connected by high-speed fiber. “What we’re doing is using new tools to better delve into terabytes to petabytes of complex data,” Gruchalla said.
The two new screens in the ESIF Insight Center open a different world to researchers used to peering into an electron microscope or a laptop screen to make discoveries.
“The researchers have been very excited to come in here and look at the data in large scale,” Gruchalla said. “To explore and see things in these data is quite a thrill.”