The visualization that CAD software provides, letting engineers see how an end product will look long before it is machined, is already a form of virtual reality. The ability to translate hard numbers and complex mathematical formulas into a human version of reality is upon us; haptic technology, which would let an engineer feel an object that exists only on a computer screen, is quickly approaching.
To many people, the phrase “virtual reality” conjures up images of elaborate helmets and gloves that users wear in order to feel completely immersed in a computer-created environment. But that image falls far short of the mark, according to Sanjay Sarma, an associate professor at the Massachusetts Institute of Technology in Cambridge.
Many engineers, whether they know it or not, encounter virtual reality every day when they work with software that allows them to envision a product or concept with the help of their computers. And, in the near future, mechanical engineers may be able to use one more sense—their sense of touch—to gain insight into a concept that is not yet reality.
Computer users may take for granted the leaps and bounds the past decade has brought in terms of visualization capabilities. In the mechanical engineering field, much of this comes into play because of the commonality of graphical user interfaces that offer a lifelike version of reality on a computer screen and the explosive growth of CAD software. The ability that CAD software grants us to see how an end product will look long before it is machined is a form of virtual reality, according to Sarma.
Virtual reality is basically a hijacking of the senses, Sarma explained. The first virtual reality device to be invented was the gramophone in the 19th century. “With movies, the sense of vision was hijacked. But we’ve been kind of stalled since then,” he added.
The march toward ever more advanced computing means that engineers can expect to visualize the final outcome of a design or project much earlier in the process. Obviously, the ability to see an object three-dimensionally as an engineer designs it is a boon over two-dimensional rendering. And soon, technology may add another sensation to the design process—the ability of an engineer to feel the object as he or she creates it, even though the object exists only on a computer screen. This is called haptic technology.
“Basically, after the first two senses were leashed, people have been wondering whether we can fully make use of the sense of touch in computer systems,” Sarma said.
Sarma’s research is centered in the field of haptic systems, also called force-feedback systems. Haptic systems are meant to create the perception of touch through a computer-aided device and accompanying software. The device that Sarma is working to perfect, for computer numerically controlled milling machines and for other uses, looks rather like a small robotic arm that sits in front of the computer screen. Users touch the tip of the arm, which exerts an external force on the fingertip. This creates the illusion that users are interacting with the solid virtual objects represented on the computer screen. As they move the arm about, it feels as though they’re tracing the surface of the computerized object.
To effectively mimic a sense of touch, the device must update its force at a cycle rate of 1,000 hertz, Sarma said. He and his fellow researchers are seeking to create a device that can attain that cycle rate and, therefore, have application in the solids modeling environment.
If the device can deliver 1,000 force updates per second in virtual space—as when two objects collide within a solid modeling system—users can actually feel where the objects collide, Sarma said. “What if you could touch your model and maneuver it, which the haptic device exerting a force on your finger would let you do?”
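The 1,000-cycle-per-second requirement Sarma cites can be illustrated with a toy servo loop. The sketch below assumes a one-dimensional virtual wall and a simple spring (“penalty”) force model; the constants and function names are illustrative, not any real device’s API.

```python
# Minimal sketch of one haptic servo cycle, assuming a 1-D virtual wall
# at x = 0 and a spring ("penalty") force model. Names and constants are
# illustrative, not a real haptic API.

STIFFNESS = 800.0        # N/m, an assumed virtual-wall stiffness
RATE_HZ = 1000           # force must be refreshed ~1,000 times per second
DT = 1.0 / RATE_HZ       # one servo period: 1 millisecond

def wall_force(tip_x: float) -> float:
    """Force on the fingertip: proportional to penetration depth, zero outside."""
    penetration = -tip_x                 # tip_x < 0 means inside the wall
    return STIFFNESS * penetration if penetration > 0.0 else 0.0

def servo_step(tip_x: float) -> float:
    # One cycle: read the tip position, compute the force, command the motors.
    return wall_force(tip_x)

print(servo_step(0.01))    # tip 1 cm clear of the wall: no force
print(servo_step(-0.002))  # tip 2 mm inside: ~1.6 N pushing back
```

At update rates much below 1 kHz, a stiff virtual surface tends to feel spongy or unstable, which is why attaining that cycle rate is the central constraint Sarma’s group is working against.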
Engineers who were to use a solid modeling system joined to a haptic device could design not only by sight, but also by feel. Engineers could tell by feel that one side of the part doesn’t meet another side as needed, Sarma said.
Engineers would be able to include the sense of 3-D objects touching each other when they model, Sarma said. He described the sense of touch a user would experience with the haptic device as the equivalent of grasping one end of a screwdriver or a pencil and running it over an object.
“You’ll be able to feel the point at which surfaces of these objects intersect,” he added.
Last year, SensAble Technologies of Woburn, Mass., released software that allows sculptors and commercial designers to create models on computer screens and to feel those models as they create them. The company’s software, called FreeForm, enables users to interact with 3-D data not only by viewing it but by touching it, said Andrew Hally, director of marketing for SensAble Technologies. He said that using the technology is analogous to modeling with computerized clay. Indeed, the software and the accompanying hardware are intended for use by commercial designers rather than by engineers.
Mouse of the Future
“In the future, touch is going to be the key driver of product visualization on the computer,” Hally added. “Bringing a sense of touch to the computer really does make it more human-centric.”
Computers are hugely useful; they’re handy engineering tools, Hally said. “One problem is that computers can do all this great work and figure things out, but a lot of times there’s a barrier between the computer crunching data and the users processing the data and working with the computers and benefiting from the work that the computer has done.”
Hally pointed out that 30 years ago, computer screens displayed only rows of numbers or lines of text.
“That’s the way machines work, but it’s not the way humans process data,” he said. “The biggest thing that made computers work in a more human-oriented rather than computer-oriented way was the switch from texts to graphics. That’s when you moved from a keyboard to using Windows with a mouse.”
The haptic arm will be the mouse of the future, Hally predicted.
The haptic arm, called Phantom, which accompanies the FreeForm software, resembles the arm that Sarma and his researchers are working on. But Sarma’s tool is intended for use with computer-aided design programs and with the CNC milling machines that make the parts, whereas the SensAble technology is aimed mainly at commercial designers.
The FreeForm software is coupled with the company’s Phantom haptic device, which was originally designed and built by company founder Thomas Massie when he was an MIT undergraduate. Massie worked with Kenneth Salisbury, principal research scientist at MIT’s artificial intelligence laboratory, to build and design the device.
The FreeForm software is geared to those who create models so mathematically difficult to design that they couldn’t easily be created with a CAD system, Hally said.
“A phone has a mathematical shape with parallel lines, but think of tennis shoes with swooping lines,” he said. “You can imagine why it would be hard for designers to create a line like that if they had to rely on a CAD system driven by mathematical formulas to do it.”
Commercial designers such as those who design tennis shoes often model with clay or foam. The FreeForm software serves as a replacement, allowing them to form by feel, using what Hally referred to as digital clay. But design engineers have much different needs, he added.
“Engineers need things to be very precise and constrained and mathematical, whereas designers need to be creative, unconstrained, and free,” Hally said.
To make the software more useful to mechanical engineers, in the future SensAble will seek to partner with a CAD vendor that provides software for engineers. That way, the sense of touch could be incorporated into the engineering design software.
The technology will serve other engineering purposes in the future as well. At Boeing, Bill McNeely and his group are working with SensAble Technologies to develop a tactile system for virtual prototyping. McNeely, a project manager at Boeing’s Phantom Works—the research and development unit in Seattle—said the system should give engineers a more realistic hands-on experience with assembly than is possible with a vision-based computer system alone.
The system, now in the prototype stage, links Voxmap Pointshell, or VPS, software from Phantom Works and SensAble Technologies’ Phantom force-feedback haptic device.
It will allow users to conduct producibility analyses, especially of complex systems, more thoroughly and quickly than before, according to McNeely.
What the technology should do, said Hally, is help to answer an age-old engineering question: How does a company make sure that the products it designs can be easily manufactured and serviced? Take a car door, for example. After engineers have designed both the interior and the exterior of the door, the designs are passed along to the manufacturing engineer to make sure the door can be properly produced. If the door is to include electric windows, for example, the engineer must make sure the motor that controls the windows will fit within the door as it’s designed. Someone also must ensure that mechanics will be able to get to the motor later should it need to be fixed.
Today, manufacturing engineers usually build a prototype door from the commercial design and then manually place the window and the window motor between the panels to see if everything fits properly.
Boeing and others are trying to develop a system so that this testing—often called assembly path planning—can be done with digital design files that wouldn’t require prototypes to be built, Hally said. Prototype cars can cost millions to build.
“The haptics piece is obvious,” he added. “The user grabs the window motor and tries to maneuver it between the exterior door panel and the interior panel. The VPS software then does what’s called collision detection—figuring out whether or not the motor is bumping into anything as the user moves it. If it is, the Phantom will resist that movement and users know they’re hitting something because they can feel it.”
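The collision-detection step Hally describes can be sketched in miniature. VPS itself is proprietary; the toy version below assumes only its basic idea—a voxel map for the static geometry and a “point shell” of surface points for the moving part—and all names, dimensions, and coordinates are hypothetical.

```python
# Toy illustration of voxel-based collision detection, loosely modeled on
# the Voxmap PointShell idea. Everything here is an assumed sketch, not
# Boeing's actual VPS implementation.

def voxel(p, size=0.01):
    # Quantize a 3-D point (meters) to integer voxel coordinates.
    return tuple(round(c / size) for c in p)

def collides(motor_points, occupied_voxels, offset):
    """True if any shell point, shifted by `offset`, enters an occupied voxel."""
    for p in motor_points:
        moved = tuple(a + b for a, b in zip(p, offset))
        if voxel(moved) in occupied_voxels:
            return True
    return False

# Static geometry: a 10 cm x 10 cm door panel patch in the x = 0 plane.
door = {voxel((0.0, y * 0.01, z * 0.01)) for y in range(10) for z in range(10)}
# Moving part: one shell point of the window motor, 5 cm clear of the panel.
motor = [(0.05, 0.03, 0.03)]

print(collides(motor, door, (0.0, 0.0, 0.0)))    # no offset: clear
print(collides(motor, door, (-0.05, 0.0, 0.0)))  # pushed 5 cm: hits the panel
```

A real system would also compute a contact normal and penetration depth, so the haptic device knows which way, and how hard, to resist the user’s hand.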
The Big Picture Made Small
In fact, the ability to conceptualize a product on a computer screen has been changing the way engineers do their jobs for the past several years, and the innovations introduced by software companies show no sign of slowing.
For instance, Engineering Animation Inc. of Ames, Iowa, sells a suite of software called Open Virtual Factory, which allows all participants in the design and manufacturing process to work together. The software uses 3-D images to represent the layout of a factory, allowing the manufacturer to see how the factory would look and how the line would function, and to work out potential kinks right there on the screen.
The system also incorporates CAD information from the design engineers to help factory managers incorporate product design into their plans for factory layout. Integrated software that uses graphic interfaces to demonstrate ergonomics and other human factors of a potential factory is also included in the system. Among the other technologies included in the package is animation software, which allows users in essence to automate the virtual factory they’ve designed to see how it would run.
The system also includes communication capabilities that those in the software industry refer to as collaboration capabilities. These features take advantage of the Internet, allowing manufacturing engineers involved with a factory project to communicate with industrial engineers who may be located at another site. A designer could use the system to pass CAD information to a manufacturing engineer.
This type of technology, with which users can easily communicate back and forth regardless of distance and pass large files of information, represents a trend that CIMData, an Ann Arbor, Mich., consulting firm, has called a rapidly evolving visualization and communication marketplace.
Many factors are driving the quickly expanding visualization and communication sector, according to CIMData. As companies outsource projects across a far-flung network of suppliers and no longer exist in one centralized location, they need a way to trade information quickly and to get a broad overview on the status of a project that may be fractured among employees at different locations.
CIMData says that companies like Engineering Animation Inc. provide software for the visualization of process data. They allow top management to envision an end result and to track progress toward the result. These suites can consist of coupled technologies—like CAD and product data management systems—that track a project by means of both text-based and image-based software.
Senses Provide a Complete View
Some technologies that also can be called visualization systems translate analytical or mathematical equations into sensory data that users can grasp intuitively. In these cases, the technology enables users to envision a result or experience that they couldn’t if they were merely looking at a spreadsheet of numbers.
One software supplier, Muse Technologies of Albuquerque, N.M., makes technology that gives computer users what the vendor calls sensory-rich environments. In these environments, many forms of data are integrated and presented using sight, sound, and tactile feedback.
“Over time, the ability to collect information has grown and grown, and the ability to store information has, too,” said Joe Krasnov, director of quality assurance at Muse. “But the ability to understand this megadata hasn’t kept up with our ability to create it.
“This technology allows people to use all their senses to understand complex information,” Krasnov said. “We use visualization techniques so the data looks like the types of things it represents.”
In the automotive industry, the technology has helped integrate the results of different data so researchers can better understand test results.
For instance, Goodyear Tire and Rubber Co. in Akron, Ohio, collected a great deal of information on tire performance from tests and other sources. The company wanted to determine the best way to manufacture its tires for use on a race car.
“They were trying to figure out which sort of rubber and other things they should use on their race car tires to get results that would be one-thousandth of a second better than the next guy,” Krasnov said. “They had information about the pressure of the tires and how the car turned corners, and all sorts of things.”
Goodyear engineers hung numerous wall charts depicting that information numerically and then walked back and forth between the charts to compare numbers. That changed after Muse provided an application depicting a race car moving around the track.
“We represented the data exactly as it looks, so a person’s brain says, ‘This isn’t just numbers; this is the real thing,’ ” Krasnov said.
By depicting a model of a car racing around the track, Goodyear was able to apply 20 different sets of data to various elements of the track environment, said Caryl Booker, a Goodyear engineer who worked on the project.
“We can assimilate far more information when it’s delivered in a manner we can relate to readily, such as in pictures, in sound, and in touch,” Booker said.
Although on the surface, the technology might be said to resemble a video racing game, engineers gleaned a great deal of information from the application. Engineers could hear the tires squeal as they turned corners. Based on the pitch of the squeal, the engineers were able to determine how the tires performed on curves, Krasnov said.
They were also able to formulate what-if questions and could receive visual feedback as well.
“Right turns are shown in blue and left turns are in yellow,” Booker said. “Tire load changes are depicted as increasing and decreasing tire size, while steering wheel positions appear as blue dials over the center of the vehicle model.”
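Booker’s description amounts to a mapping from telemetry channels to visual attributes of the car model. The guessed sketch below shows one such encoding; the attribute names, the sign convention (positive steering angle means a right turn), and the scaling factors are illustrative, not Muse’s actual scheme.

```python
# Hypothetical telemetry-to-visual encoding in the spirit of Booker's
# description: steering direction drives color, tire load drives drawn
# tire size. Names, signs, and scale factors are assumptions.

def encode_sample(steering_deg: float, tire_load_n: float) -> dict:
    # Right turns render blue, left turns yellow, straight-ahead no tint.
    turn = "blue" if steering_deg > 0 else "yellow" if steering_deg < 0 else "none"
    # Tire load scales the drawn tire around a nominal radius of 1.0.
    tire_scale = 1.0 + tire_load_n / 10000.0
    return {"turn_color": turn,
            "tire_scale": tire_scale,
            "dial_angle": steering_deg}   # steering wheel dial over the car

print(encode_sample(15.0, 2500.0))   # right turn, heavier load -> bigger tire
print(encode_sample(-5.0, 0.0))      # left turn, nominal tire size
```

The point of such an encoding is exactly the one Krasnov makes: each frame of the animation carries several data channels at once, so the engineer reads them the way a driver reads the road rather than the way an analyst reads a chart.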
The application also integrates what Krasnov called head-tracking capabilities. Users can turn their heads—in this case, their joysticks—to the right to view the right side of the race track and can then turn their heads to the left to get the left-hand view.
The ability to translate hard numbers and complex mathematical formulas into something that looks like a human version of reality is upon us. The ability to touch that computerized version is quickly approaching.
Now if we could only smell it.