Few engineers could call themselves experts in everything from computational dynamics to computer programming to scientific theory. Even so, scientific and engineering advances come about these days through a combination of experimentation, mathematical application, and, oftentimes, complicated computer simulation.
The rate of discovery obtained from an experiment or a computational model is enhanced and accelerated by use of parallel computing techniques, visualization algorithms, and advanced visualization hardware, according to those who make up the Scientific Applications and Visualization Group at the National Institute of Standards and Technology in Gaithersburg, Md.
The group was formed to advance computer expertise in the scientific realm, but it points out that high-level visualization hardware and software are just as important for engineers working today. Its members argue that engineering principles can be extended and advanced more quickly with these advanced computer technologies than without technological aid, or with what we’ve come to think of as primitive tools—a calculator, for example.
The NIST team believes that high-performance computing speeds discovery within the sciences. It defines advanced computing methods as those technologies that possess capabilities beyond current state-of-the-art desktop computing. Visualization tools, for example, now extend beyond the three-dimensional computer-aided design model viewable on a desktop computer to include virtual reality software and hardware. A cave automatic virtual environment, called a CAVE, features four walls onto which an image is projected in 3-D so that engineers feel they are standing in front of an object.
No desktop computer needed. And parallel computing moves beyond the desktop computer by harnessing the power of many microprocessors working in tandem.
To be effective in this arena, going beyond desktop capabilities requires a critical mass of talent, parallel computing techniques, visualization algorithms, advanced visualization hardware, and a recurring investment, according to a paper published by team members.
Their main point: stay current, because that’s the only way science advances. The hardware and software behind such experimentation evolve rapidly, so teams of top engineers must always be on top of their game to fully exploit available technologies. Engineers must ensure that their organizations use advanced computing techniques, such as visualization, to the fullest extent possible, which is not always easy when technology changes moment by moment and desktop computers remain a vital commodity in their own right. Fully exploiting today’s technologies can also strain the funding sources of even the most generous company or institution.
To give an example of how advanced computing methods leapfrog conventional ones, the NIST group suggests considering parallel computing, which links many processors to speed code execution. How efficient and effective parallel processing is depends largely on how well the problem at hand matches the selected algorithms and hardware architectures.
Many engineers know how parallel computing speeds an application: computers now solve overnight problems that previously took weeks. The method also lets researchers perform far more computation than was previously feasible, so parallel computing can be used to attack larger problems than were practical before.
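As a minimal sketch of the idea (an illustration, not NIST's code), the snippet below uses Python's standard multiprocessing module to split an embarrassingly parallel job—numerically approximating pi by integration—across several worker processes, one slice of the interval per core:

```python
from multiprocessing import Pool

def integrate_slice(bounds, n=200_000):
    """Midpoint-rule integral of f(x) = 4 / (1 + x^2) over one sub-interval.

    Summed over [0, 1], the pieces approximate pi.
    """
    a, b = bounds
    h = (b - a) / n
    return sum(4.0 / (1.0 + (a + (i + 0.5) * h) ** 2) for i in range(n)) * h

def parallel_pi(workers=4):
    """Split [0, 1] into one slice per worker and sum the partial integrals."""
    edges = [i / workers for i in range(workers + 1)]
    slices = list(zip(edges[:-1], edges[1:]))
    with Pool(workers) as pool:           # each slice runs in its own process
        return sum(pool.map(integrate_slice, slices))

if __name__ == "__main__":
    print(parallel_pi())  # close to 3.141592653589793
```

The pattern—decompose, farm out, recombine—is the same one that lets the overnight-versus-weeks speedups above scale with the number of processors, provided the problem decomposes cleanly.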
With visualization software, engineers can actually see data as a picture, such as a view of the way air flows around a vehicle in motion. Unlike a spreadsheet of numbers, the picture can provide an intuitive understanding of the data being studied and can exhibit structure where no structure was previously known, according to group members. The human mind readily understands data when it’s laid out in a way we can see in our mind’s eye.
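The point can be illustrated with a toy example (not the group's software): the textbook potential-flow solution for air moving past a circular cylinder, rendered as a character map so that structure in the numbers—the stagnation points and the speed-up over the body—becomes visible at a glance:

```python
def cylinder_flow_speed(x, y, U=1.0, R=1.0):
    """Local speed of ideal (potential) flow past a circular cylinder.

    From the complex velocity u - i*v = dw/dz = U * (1 - R**2 / z**2):
    zero at the stagnation points (+/-R, 0), 2*U at the top of the body.
    """
    z = complex(x, y)
    if abs(z) < R:
        return 0.0  # inside the body
    return abs(U * (1 - R ** 2 / z ** 2))

def ascii_flow_field(width=60, height=21, U=1.0, R=1.0):
    """Render the speed field as text: denser characters mean faster flow."""
    chars = " .:-=+*#%@"
    rows = []
    for j in range(height):
        y = 2.5 - 5.0 * j / (height - 1)
        row = []
        for i in range(width):
            x = -3.0 + 6.0 * i / (width - 1)
            if x * x + y * y < R * R:
                row.append("O")  # the cylinder itself
            else:
                s = min(cylinder_flow_speed(x, y, U, R), 2.0 * U)
                row.append(chars[int(s / (2.0 * U) * (len(chars) - 1))])
        rows.append("".join(row))
    return "\n".join(rows)

if __name__ == "__main__":
    print(ascii_flow_field())
```

Even in this crude rendering, the picture makes obvious what the equivalent spreadsheet of speeds would hide: where the flow stalls and where it accelerates around the body.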
To help people visualize things more clearly, a group called the Web3D Consortium, based in San Ramon, Calif., has spearheaded the quest for an open, industry-standard language for three-dimensional applications viewable on the World Wide Web. Three-dimensional images are popular on the Web because they bring to life what would otherwise appear as static images on the page.
But Web3D doesn’t want popularity to breed tediousness. The group doesn’t maintain that it’s hard to view 3-D objects via the Internet; rather, that it might just be too easy. More than 40 vendors provide 3-D viewing technologies, most often as plug-ins that users must download and install to see the image, said Sandy Ressler, vice president of the consortium and project manager at NIST.
When Web standards were first getting under way, back in the dark ages of 1994, virtual reality modeling language, or VRML, was to be the industry standard for 3-D Web displays. Early Web standards setters expected that VRML would be the equivalent of hypertext markup language, the HTML that drives the display of two-dimensional Web pages. VRML is an open-source language to which developers can contribute. But many say VRML didn’t specify exactly how complex data should be conveyed. The language was considered too problematic for even standard 3-D Web applications, necessitating the various plug-in applications that vendors soon rushed in to develop.
VRML, since updated to VRML 97, is still the leading candidate for an open, industry-standard 3-D Web language, Ressler’s group maintains. Such a standard would greatly reduce the use of proprietary 3-D viewing technology. The consortium’s goal isn’t to claim that VRML or its XML-based successor, called X3D, will be the only method of providing 3-D content on the Web, Ressler said, but an open industry standard will greatly encourage the use of it.
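For a sense of what the language looks like, the fragment below is a minimal VRML 97 world—a single red sphere—of the kind any of the vendor plug-ins mentioned above could display. The only required element is the header line identifying the file:

```
#VRML V2.0 utf8
Shape {
  appearance Appearance {
    material Material { diffuseColor 1 0 0 }
  }
  geometry Sphere { radius 1 }
}
```

Scenes are built by composing such nodes, which is what made VRML attractive as a common, human-readable interchange format despite the fragmentation among viewers.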
A Step Into the Virtual
Three-dimensional applications, though not necessarily Web-enabled, have become increasingly familiar to engineers over the past few years as immersive environments found a number of important uses in the engineering community. Immersive design environments can allow an engineer to feel, see, and manipulate a product in three dimensions before it’s produced.
Many academic laboratories today serve as proving grounds for these technologies, which are then licensed, refined, and sold to the public by commercial developers. For instance, in 1992 while a doctoral candidate at the University of Illinois at Chicago, Carolina Cruz-Neira wrote the first version of the CAVE software library as part of her thesis project. The CAVE, originally a three-sided virtual-reality room, is now widely used in industry.
Cruz-Neira moved on to become associate director of the Virtual Reality Applications Center at Iowa State University in Ames, where she and team members continue to push the virtual reality envelope. The center, a multidisciplinary operation not tied specifically to any university department, develops virtual reality applications, often with an eye toward future commercial use.
For instance, students have developed a virtual factory line that matches one in an actual plant. By simulating production runs and determining line configuration needs virtually, rather than physically, plants can save hundreds of thousands of dollars in downtime.
In another research project, called the Interactive Structural Analysis project, the center’s researchers look at how to apply virtual reality techniques to interactive stress analysis of an industrial design. For this, researchers have combined non-uniform rational B-spline, or NURBS, free-form deformation with finite element analysis, sensitivity analysis, and collision detection software, and merged them with virtual reality software and hardware available through the center. NURBS is a mathematical representation of curves and surfaces widely used in computer graphics software.
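As a rough sketch of the mathematics involved (an illustration, not the center's code), the snippet below evaluates a point on a NURBS curve via the Cox–de Boor recursion. With the control points and weights shown, the quadratic curve traces an exact quarter circle—the standard test case for rational curves:

```python
def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion for the B-spline basis function N_{i,p}(u)."""
    if p == 0:
        if knots[i] <= u < knots[i + 1]:
            return 1.0
        # convention: close the curve on the right at the final knot
        if u == knots[-1] and knots[i] < knots[i + 1] == knots[-1]:
            return 1.0
        return 0.0
    left = right = 0.0
    d = knots[i + p] - knots[i]
    if d > 0.0:
        left = (u - knots[i]) / d * bspline_basis(i, p - 1, u, knots)
    d = knots[i + p + 1] - knots[i + 1]
    if d > 0.0:
        right = (knots[i + p + 1] - u) / d * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_point(u, degree, knots, ctrl_pts, weights):
    """Point on a NURBS curve: a weighted (rational) B-spline combination."""
    x = y = den = 0.0
    for i, ((px, py), w) in enumerate(zip(ctrl_pts, weights)):
        nw = bspline_basis(i, degree, u, knots) * w
        x, y, den = x + nw * px, y + nw * py, den + nw
    return x / den, y / den

# A rational quadratic whose control polygon and weights give a quarter circle.
knots = [0, 0, 0, 1, 1, 1]
ctrl = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
weights = [1.0, 2 ** 0.5 / 2, 1.0]
```

Free-form deformation of the kind described above amounts to moving control points like those in `ctrl` and re-evaluating the surface—cheap enough to do at interactive rates while the analysis codes update in the background.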
The software and hardware are implemented by use of a surround-screen virtual environment, called the C2. The combination creates an interactive environment in which designers can view and modify a part and see how it fits into an assembly. They work in real time and in 3-D; they feel as if they’re touching the part when they modify it. The entire piece—not just the part designers are working on—is displayed to give engineers a context in which to work.
To let engineers manipulate the models directly, the NURBS model deforms as they push and pull on it. The embedded model—what the engineer sees—deforms correspondingly. In such an environment, engineers feel as if they’re actually moving and manipulating the model, and they can immediately view the results of their changes. In effect, manipulating the model changes its stress sensitivities; the model updates correspondingly, and the new shape reflects the new stresses.
In these types of environments, engineers are doing more than looking at a computer screen or viewing the results of an analysis. An immersive design experience allows them to feel much as mechanics do when they tinker with a car. An automobile mechanic can take out a malfunctioning part and examine it, then replace it in the system and turn on the engine to troubleshoot the problem. The mechanic can look at surrounding parts to see if they might affect the troubled part’s performance.
Engineers should be given this same type of hands-on ability, say the associates at the Iowa center, even if the machine they’re working on doesn’t yet exist.
At Work on Reality
Visualization, which includes virtual reality, begets change. Engineers design large structural systems using what’s called optimization software: basically, any program, including CAD and analysis tools, that helps ensure the product they’re working on will measure up to, and possibly exceed, the standards they’ve set. The software, in other words, helps realize an optimal design. Marry it with visualization, and you have the CAD and analysis programs that allow virtual prototyping. Move beyond CAD to merge optimization with virtual reality, and you have a whole different ballgame.
At this stage in engineering technology development, optimization software most often comes into play in the initial stages of design definition, according to researchers at the Iowa center. The team is at work to find more ways for engineers to feel as if they’re actually building or working on the large structure, although in reality they occupy a virtual immersive environment. Virtual reality and visualization techniques allow for design optimization earlier in the process, they say. The researchers aim to develop technologies that let designers play an active role in design optimization. The Interactive Structural Analysis design and analysis environment was created for hands-on design.
Optimization software has to be fast, must analyze problems with great accuracy, and has to be combined with virtual reality techniques to provide an interactive, real-time design method that lets engineers fully optimize their designs. Virtual reality—the ability to feel as though you’re in the same room with an object that exists only in the computer’s memory—lets the engineer quickly find flaws and investigate different design approaches. The key, according to the researchers, is keeping the designer active in the design process.
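To make the idea concrete, here is a deliberately tiny optimization loop—a toy, not the center's software—that sizes the depth of a rectangular cantilever so the bending stress just meets an allowable limit. The loads, dimensions, and limits are invented for illustration:

```python
def bending_stress(h, force=1_000.0, length=1.0, width=0.05):
    """Max bending stress in a rectangular cantilever: sigma = 6*F*L / (b*h^2)."""
    return 6.0 * force * length / (width * h ** 2)

def optimize_depth(sigma_allow=100e6, lo=1e-3, hi=1.0, tol=1e-9):
    """Bisect for the smallest beam depth h whose stress meets the allowable.

    Stress falls monotonically as h grows, so the constraint boundary
    is a single root and bisection converges to the lightest feasible section.
    """
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bending_stress(mid) > sigma_allow:
            lo = mid  # still overstressed: need a deeper section
        else:
            hi = mid  # feasible: try a shallower (lighter) one
    return hi

if __name__ == "__main__":
    h = optimize_depth()
    print(f"optimal depth: {h * 1000:.1f} mm")  # matches sqrt(6FL / (b*sigma))
```

A real optimizer juggles many variables and constraints at once, but the structure is the same: evaluate the analysis, compare against the limits, adjust the design—and the faster that loop runs, the closer it gets to the interactive, immersive design sessions the researchers describe.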
Who knows where engineering will go in the future? We don’t know what discoveries are to be made or what technologies are yet to be developed. But researchers at Iowa State and NIST’s engineers both say the future of technology won’t happen without advanced computing methods, including visualization, virtual reality, and parallel computing.