
When engineering companies want to catch a glimpse of the technologies that will become part of their everyday reality in the near future, they take a peek at research now being done at the Virtual Reality Applications Center at Iowa State University in Ames, at the virtual prototyping laboratory of the University of Arkansas in Fayetteville, and at other academic institutions pioneering the use of a technology poised to be the next big thing in the world of engineering technology.

Within the decade, virtual reality will be the must-have new technology for engineering, according to many companies.

Virtual reality applications could become commonplace as soon as five years down the road, according to Carolina Cruz-Neira, associate director of Iowa State’s Virtual Reality Applications Center. In fact, the founders wrote into the center’s mission statement that they believe the emerging field of synthetic environments—also called virtual reality—has reached a level of maturity to stimulate revolutionary changes in science and engineering.

Of course, engineers are already familiar with how virtual reality can be used as part of their jobs. But that use will be pushed down farther along a company hierarchy, so product designers may make use of it every day. In less than a generation, designers will have gone from two-dimensional design to immersive design, whereby they can feel, see, and manipulate a product before it is produced.

Campus laboratories are the place to get a glimpse of virtual reality technology right now because the universities are at the research forefront, as professors and students work out the kinks of their assembled virtual reality applications and find ways to make them practical in the nonlaboratory, engineering environment.

Iowa State’s VRAC is a leader in creating and testing virtual reality technologies. The University of Arkansas team combines research on two high-growth technologies as it works on a virtual reality system to be used for the design and manufacture of MEMS devices, a technology whose growth is also predicted to be explosive over the next decade.

Cruz-Neira, who also serves as a faculty member in the ISU electrical and computer engineering departments, has a specific claim to virtual reality fame. In 1992, while a doctoral candidate at the University of Illinois at Chicago, Cruz-Neira wrote the first version of the cave-automatic virtual environment software library as part of her thesis project.

With that, she became designer of the CAVE, a three-sided virtual-reality immersion room now widely used in industry and a VRAC mainstay.

The CAVE is so named because users literally enter a completely virtual environment in which they can walk around and can touch objects as they would in the real environment. A CAVE gives this illusion by projecting stereo images on the walls and floor of a room-size cubicle. Several people wearing lightweight stereo glasses can enter and walk freely inside the CAVE. A head-tracking system continuously adjusts the stereo projection to the viewer’s position.

Inside such an environment, a number of input devices like data gloves, joysticks, and handheld wands let the user navigate through a virtual environment and interact with virtual objects. Directional sound, force-feedback, and voice recognition devices that are sometimes included in these immersive environments help users feel more present in the virtual world.
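The head-tracking adjustment described above, recomputing the stereo projection as the viewer moves, can be sketched in a few lines of Python. The off-axis (generalized perspective) projection math below is standard for projection-wall displays such as the CAVE; the wall dimensions, head position, and interpupillary distance are illustrative numbers, not measurements from any particular installation.

```python
import math

IPD = 0.064  # interpupillary distance in meters (a typical adult value)

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
def unit(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def eye_positions(head, right_dir):
    """Offset the tracked head position by half the IPD along the head's
    right vector to get the left- and right-eye positions."""
    r = unit(right_dir)
    return (tuple(h - 0.5 * IPD * x for h, x in zip(head, r)),
            tuple(h + 0.5 * IPD * x for h, x in zip(head, r)))

def off_axis_frustum(eye, ll, lr, ul, near=0.1):
    """Asymmetric view frustum for one projection wall, given the eye and
    the wall's lower-left, lower-right, and upper-left corners (the
    standard generalized-perspective setup for projection walls)."""
    vr, vu = unit(sub(lr, ll)), unit(sub(ul, ll))
    vn = unit(cross(vr, vu))                 # wall normal
    d = -dot(vn, sub(ll, eye))               # eye-to-wall distance
    scale = near / d                          # project extents onto near plane
    left   = dot(vr, sub(ll, eye)) * scale
    right  = dot(vr, sub(lr, eye)) * scale
    bottom = dot(vu, sub(ll, eye)) * scale
    top    = dot(vu, sub(ul, eye)) * scale
    return left, right, bottom, top

# Head tracked 1.5 m in front of the center of a 3 m x 3 m wall at z = 0
head = (0.0, 1.5, 1.5)
le, re = eye_positions(head, right_dir=(1.0, 0.0, 0.0))
print(off_axis_frustum(le, (-1.5, 0.0, 0.0), (1.5, 0.0, 0.0), (-1.5, 3.0, 0.0)))
```

Every time the tracker reports a new head position, the system recomputes one such frustum per eye per wall, which is why the imagery stays geometrically correct as the viewer walks around.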

If you thought virtual reality technology was the sort of stuff created mainly to allow computer gamers to run about on a digitized plane, killing off extraterrestrials, think again. Although the entertainment industry is expected to account for a big piece of the virtual reality technology market in the future, the engineering and medical industries have been at work for the past decade finding a myriad of uses for the technology.

And that’s where the VRAC comes in. Cruz-Neira called it a multidisciplinary center not tied directly to any particular ISU department. Instead, it’s staffed by faculty members from various departments. The bulk of faculty members come from the engineering disciplines, with many from the mechanical engineering department, she said.

“We have people from a lot of engineering disciplines here, but we also have people from statistics, mathematics, medicine, architecture, and botany,” Cruz-Neira said.

The center’s funding comes from many engineering companies or companies that rely heavily on engineers to seek answers to workplace problems they encounter.


“We work with companies like John Deere and Ford, and we work with some of the engineering divisions of the Air Force and the Army,” Cruz-Neira said. “We have a long list of engineering companies that we support.” They can find answers by using the VRAC’s expertise and equipment to simulate a problem in virtual reality for help in locating a solution, or to find answers to sticky training questions. In the process, the companies can see how virtual reality is of use to them and can test the kinds of technologies they expect to bring in-house when they become available to the public, Cruz-Neira said.

Solving Problems with the C6

The center’s hardware and software include more than 20 computers from Silicon Graphics of Mountain View, Calif., and a variety of virtual-reality equipment, nearly all of it proprietary. But its pièce de résistance is a new-in-June CAVE called the C6. Unlike other CAVEs, which consist of three projection walls, the C6 includes six projection walls—the four surrounding walls as well as the ceiling and the floor. Companies that contract with VRAC have access to the C6 for advanced virtual-reality problem solving.

“In essence, we do technology transfers with industry, in the sense that they’re using us to explore what, five years into the future, they’ll be using,” Cruz-Neira added. “The kinds of things we do here are one step ahead of the everyday use of virtual reality. So the companies come to us and say, ‘We have these issues, and we use these methods to deal with them now, but how could we do this in a more advanced way?’ Or, ‘Help us investigate how to do this by using your virtual reality techniques.’”

For instance, a manufacturing company that Cruz-Neira declined to name came to the VRAC with a production-line problem that executives thought might be solved with the help of virtual reality software and hardware.

“Sometimes, the workers had to reorganize the production line in the middle of a shift because they needed to make a different batch of products with nonstandard features,” Cruz-Neira said. “To figure out how to configure the machinery, they’d sometimes run trial-and-error situations on the shop floor, and when they got the trial up and working, it didn’t do what the operators had hoped it would do.

Within the decade, virtual reality will be the must-have new technology for engineering.

“When the company ran these tests to see how to rearrange the lines, it cost them hundreds of thousands of dollars for each hour the line was stopped,” she added.

The company’s goal was to dramatically cut the time the line had to be stopped while the crew reconfigured machinery for a second production run. Officials also wanted to get accurate information on how to quickly reconfigure the line to be ready to make a new part in the shortest time possible.

Obviously, if the answer to how the line could be quickly reconfigured were to be found via trial and error on a virtual factory floor that exactly mirrored the actual floor, the company would save those hundreds of thousands of dollars in line downtime. VRAC students and faculty members are now developing a virtual factory line that matches the manufacturing plant’s production line.

One of the other ways the VRAC aids industry is in helping companies determine which virtual reality tools they actually need to get a particular job done. Many times, because virtual reality is often spoken of in the reverent tones used to ring in a change in technology, companies have been led to believe they need more hardware and software for a particular application than they actually do. Because the center includes many forms of virtual reality devices, faculty members can test a particular company’s project on a number of different devices before finding a fit, Cruz-Neira said.

“Because we have more than one system here, we don’t need to impose a system on a project,” Cruz-Neira said. “We can look at what a particular task would require, and then look at what would be the appropriate level of system use.

Virtual Humans

“For example, let’s say you’re a medical doctor and you’re trying to teach medical students how to do surgery,” she said. “For that, you’ll need a virtual human, and you’ll probably want him lying on an operating table.” That job is easily accomplished with what’s called a virtual reality workbench. This is simply a table-like display at which users look down, the way doctors would look down upon a real operating table.

“If you’re focusing just on the surgery, the full look-around capabilities you get from a CAVE might not be applicable,” Cruz-Neira said. “You don’t need to look up or look right. You only need a workbench to solve your task.

“But say that, rather than medical students, you’re trying to train nurses in the operating room. Well, then the workbench might not be the right thing,” she added. “They’ll be walking around the room; they might have to walk over to look at a monitor or go to some other part of the room and pick up an instrument and bring it over to the doctor. A nurse has to have a better spatial understanding of the operating room itself. So you’re better off doing nurses’ training in a full CAVE environment.”

MEMS Meets Virtual Prototyping

At the University of Arkansas, Ajay Malshe, associate professor of mechanical engineering and director of the university’s virtual prototyping laboratory, leads a team of researchers who use virtual reality applications to design microelectromechanical systems. MEMS, a technology of tiny mechanical devices such as sensors, valves, gears, mirrors, and actuators, sometimes embedded in semiconductor chips, has also been called a foundational technology of the next decade.

Basically, a MEMS device contains microcircuitry on a tiny silicon chip into which some mechanical device such as a mirror or a sensor has been manufactured. Obviously, the features of the devices are much too small to be seen with the human eye, but they’re also complex little machines capable of determining the inflation of an automobile air bag based not only on deceleration of the car, but also on the size of the person behind the wheel. The microscopic MEMS device can sense both of these factors— deceleration and size—and respond to them accordingly.
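The airbag example amounts to a small piece of decision logic layered on top of the two MEMS sensor readings. As a hedged illustration only, with hypothetical thresholds and deployment tiers that do not come from any real controller, the combination might look like this:

```python
# Illustrative sketch of the decision logic the article describes: an airbag
# controller weighing both crash deceleration and occupant size.
# All threshold values below are hypothetical, chosen only for illustration.

DECEL_THRESHOLD_G = 20.0   # minimum deceleration treated as a crash
SMALL_OCCUPANT_KG = 30.0   # below this, suppress deployment entirely

def airbag_command(decel_g: float, occupant_kg: float) -> str:
    """Combine the two MEMS sensor readings into a deployment decision."""
    if decel_g < DECEL_THRESHOLD_G:
        return "no_deploy"            # no crash detected
    if occupant_kg < SMALL_OCCUPANT_KG:
        return "no_deploy"            # seat empty or occupied by a child seat
    if occupant_kg < 60.0:
        return "deploy_low_power"     # smaller occupant: gentler inflation
    return "deploy_full_power"

print(airbag_command(decel_g=35.0, occupant_kg=80.0))  # deploy_full_power
```

The point is that the microscopic sensors feed an ordinary control decision; the engineering challenge is in the sensing elements themselves, which is where virtual prototyping helps.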

This technology promises to become an even bigger part of everyday life in the near future, and some experts suggest it will revolutionize the everyday. Applications for MEMS technology have expanded beyond its pioneering use in the electronics industry to the chemical, optical, and biological fields, Malshe said. For instance, sensor-driven heating and cooling systems could dramatically improve energy savings, and sensors built into the fabric of an airplane wing could sense and react to airflow by changing the wing surface resistance.

Methods of designing and producing MEMS devices need to be explored with alacrity as the industry debates their uses, Malshe maintains. And this is where the University of Arkansas virtual prototyping laboratory comes into play. Frequently, even the very men and women charged with designing these devices have a hard time conceptualizing them, since their details are impossible to see without a high-powered microscope, Malshe said.

“As MEMS get more complex, you have a hard time imagining the functionality of them,” he said. “And there are so many new MEMS structures coming up so fast, you need a way to interpret what the MEMS will be doing.”

Virtual prototyping, Malshe said, can be seen as an interface between the human eye and the microscopic MEMS structures.

“Our system is more or less a visualization tool that interacts with you as you design,” Malshe said. “You run the model and you animate it all in one environment to see how it would work because you certainly couldn’t see that with the naked eye.”

He described the laboratory’s virtual reality systems, not as the mix of systems in use at the VRAC, but as high-end computers loaded with the engineering CAD and analysis software that many engineers already use. Users do not walk into a CAVE; they sit at a computer and don a proprietary headpiece that lets them feel as if they’re viewing the MEMS prototype in three dimensions. Systems also feature hardware such as pens and styluses that users manipulate so they can literally feel the object as if they were really touching it.

Virtual reality technology serves here as a virtual prototyping tool, he said. Users can model and animate their MEMS designs to see if they work. If a system needs tweaking, designers can see that with the help of the virtual reality system and go back to shape it further during redesign.

“If I wanted to show you how a car works, I could show you how the trunk opens and closes, I could open the hood and show you the engine, and you would see the wheels turning to make the car move forward,” he added. “But we can’t, of course, show you how a MEMS system works because they’re so small. But if you build them in virtual reality, you can use that technology to visualize them.”

Manipulating 3-D Images

Using the virtual reality systems, designers touch a stylus or pen to manipulate a three-dimensional image. They get their hands on tiny sensing devices, the way you would tinker under the hood of a car. And they can still determine, through the system’s visualization and animation capabilities, how the MEMS devices will function in their tiny world.

“In the macroscopic world, I can design and test a car before I build it; in the microscopic world, I can’t,” Malshe said. “But I can feel like I’m designing and testing it through the help of a virtual environment.”

The university laboratory uses computer-aided design systems from Autodesk in San Rafael, Calif., and SolidWorks in Concord, Mass., to model the MEMS system. Software analysis packages from ANSYS in Pittsburgh, and Fluent Technologies in Lebanon, N.H., allow the designers to run tests, such as stress and thermal analyses, on their MEMS designs.
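As a taste of what such stress analyses check, here is a hand-calculated version of one classic case: peak bending stress at the root of an end-loaded rectangular cantilever. The formula is the standard beam-bending result; the silicon beam dimensions are hypothetical MEMS-scale numbers, not values from the Arkansas laboratory.

```python
# Back-of-the-envelope version of the kind of stress check the analysis
# packages automate: peak bending stress at the fixed end of an end-loaded
# rectangular cantilever, sigma = 6*F*L / (w * t^2).
# The dimensions below are hypothetical but typical of MEMS-scale structures.

def cantilever_root_stress(force_n, length_m, width_m, thickness_m):
    """Maximum bending stress (Pa) at the fixed end of a rectangular
    cantilever loaded by a point force at its free end."""
    return 6.0 * force_n * length_m / (width_m * thickness_m ** 2)

# 100 micronewton tip load on a 200 x 20 x 2 micrometer beam
sigma = cantilever_root_stress(100e-6, 200e-6, 20e-6, 2e-6)
print(f"peak stress: {sigma / 1e6:.0f} MPa")  # prints "peak stress: 1500 MPa"
```

A finite-element package computes the same quantity for geometries far too irregular for closed-form formulas, which is why designers couple the CAD model to the analysis software rather than calculate by hand.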

Intersense of Burlington, Mass., provides the motion-tracking sensors that literally track a user’s eyes and hand motions, to ensure the object they feel and manipulate in three dimensions matches the movements of their hands and eyes. It is this key piece of technology that allows users to interact in a three-dimensional environment the same way they move in their real environments, said Patrick Riley, Intersense’s director of corporate communications.


The tracking sensors determine how to move the images so they correspond with the user’s eye movements and with his or her sense of feel, Riley added.
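Keeping the virtual object aligned with the hand reduces, at its core, to applying the tracked pose to the model as a rigid transform on every frame. A minimal sketch follows; for brevity it assumes a single yaw rotation, whereas a real tracker typically reports a full six-degree-of-freedom pose.

```python
import math

def apply_hand_pose(points, translation, yaw_rad):
    """Rigidly transform model points by a tracked hand pose so the virtual
    object follows the user's hand: rotate about the vertical (z) axis,
    then translate. Simplified to one rotation axis for illustration."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    tx, ty, tz = translation
    out = []
    for x, y, z in points:
        rx, ry = c * x - s * y, s * x + c * y   # rotate about z
        out.append((rx + tx, ry + ty, z + tz))   # then translate
    return out

model = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0)]       # a tiny two-point "model"
moved = apply_hand_pose(model, translation=(0.5, 0.0, 0.2), yaw_rad=math.pi / 2)
print(moved)
```

Re-running this transform with each new sensor report is what makes the image appear to stick to the hand, which in turn is what lets a designer judge the simulation's accuracy by feel.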

“When you interact with the model, when you can adjust the vents, and turn the steering wheel of an automobile model, for example, you can judge the accuracy of the simulation, which is a critical step in virtual prototyping,” Riley said. “Reworking a physical model can take weeks, but reworking a virtual prototype can take only minutes, sometimes, if you can figure out what needs to be adjusted. And it helps if you can adjust the model physically.”

In the near future, Malshe and his team plan to build a virtual reality system that will let users feel as though they’re assembling MEMS devices. This helps work out any production kinks and ensures that the device is buildable as designed, he said. Also on the horizon is a portable virtual reality system for MEMS design.

“If we could take this out of the laboratory, the applications for it would just widen,” he said. “In the future, the applications for this system are just going to explode like the applications for MEMS.”