This article is a summary of a paper entitled Next Wave of Technology by D. Tesar. The urgent need is to create a balance of all supporting technologies required in electro-mechanical systems, especially those of high economic magnitude. The desired tech base is described in terms of ten major topics, summarized here to indicate its relevance to meeting the needs of mankind, its potential to reinforce our national security, and its capacity to augment our consumer product competitive position. All future machine systems will increasingly be highly nonlinear, reconfigurable to meet changing needs, and architecturally a mixture of serial/parallel control structures. This means that the influence of any one control input (an actuator) faces an ever-changeable physical plant. This complexity can now be addressed by using very low-cost, distributed sensors providing operational data to a criteria-based decision structure with a full evaluation of the system in 5 to 10 milli-sec. A computational revolution for decision making is now feasible because of accelerating computer technology. This revolution will be based on the geometry of the decision process.
A new urgency is being recognized at the national level because of under-investment in the mechanical tech base in the U.S. This weakness limits the strength of other tech base sectors (computers, communications, medical, transportation, military, etc.). The urgent need is to create a balance of all supporting technologies required in electro-mechanical systems (trains, orthotics, aircraft, robot surgery, vehicles, etc.), especially those of high economic magnitude. This argument is presented in a paper entitled Next Wave of Technology by D. Tesar just submitted for publication to urge serious consideration of this under-investment by our federal agencies. The desired tech base is described in terms of ten major topics which will be summarized here to indicate its relevance to meeting the needs of mankind, to its potential to reinforce our national security, and to augment our consumer product competitive position.
1. Overall Vision: The goal is to open up the architecture of electro-mechanical systems and use standardized interfaces to permit plug-and-play of highly-certified components (especially intelligent actuators, which are to electro-mechanical systems what computer chips are to electrical systems), produced in minimum sets for each application domain and provided by a competitive supply chain that continuously improves the performance/cost ratio of these components. The concept of long-duration design/evaluation/production of one-off systems would become a thing of the past, enabling more rapid infusion of technology, repair on demand, the frequent elimination of single-point failures, and the prediction of performance failure without false alarms.
2. Machine System Intelligence: All future machine systems will increasingly be highly nonlinear, reconfigurable to meet changing needs, and architecturally a mixture of serial/parallel control structures. This means that the influence of any one control input (an actuator) faces an ever-changeable physical plant. This complexity can now be addressed by using very low-cost, distributed sensors providing operational data (in a milli-sec or less) to a criteria-based decision structure (set by humans), with a full evaluation of the system in 5 to 10 milli-sec (effectively linearizing the system), because of the superior computational resources available today. Given the resulting decision inputs, the command/response must be managed by ever-improving actuators able to respond adequately within the same 5 to 10 milli-sec time frame.
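The criteria-based evaluation step described above can be sketched as follows. This is a minimal illustration, not the paper's method: the sensor names, the operator-set limits, and the dictionary structure are all hypothetical assumptions chosen for the example.

```python
# Hypothetical operator-set criteria (limits) for a few sensed quantities.
# In the paper's vision, the full system would be evaluated against many
# such criteria within one 5-10 ms decision window.
OPERATOR_CRITERIA = {
    "temperature_C": 85.0,       # flag readings above 85 degrees C
    "torque_Nm": 120.0,          # flag torque above 120 Nm
    "position_error_mm": 0.5,    # flag tracking error above 0.5 mm
}

def evaluate_system(sensor_readings: dict) -> dict:
    """One evaluation pass: return every reading that violates its criterion."""
    violations = {}
    for name, limit in OPERATOR_CRITERIA.items():
        value = sensor_readings.get(name)
        if value is not None and value > limit:
            violations[name] = (value, limit)
    return violations

# One snapshot of distributed sensor data; only the temperature violates.
snapshot = {"temperature_C": 91.2, "torque_Nm": 80.0, "position_error_mm": 0.3}
print(evaluate_system(snapshot))
```

In a real system, the violations found in one window would drive the actuator commands issued in the next window.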
3. Computational Intelligence: A computational revolution for decision making is now feasible because of our accelerating computer technology. This revolution will be based on the geometry of the decision process. If it is serial, as for a centralized company (top-down decisions), the criteria are set by leaders at the top of a decision pyramid. Flow control from the bottom is virtually impossible. By contrast, parallel structures (holding companies, universities, multiple government agencies) can accept and facilitate flow control from the bottom in layers, with nominal control from the top. Thus, decision criteria in the serial case are fewer and change less often; those in the parallel case are more numerous and change more frequently. The power of predictive analytics would set and rank these criteria based on archived operational data. Of course, mixed serial/parallel systems do exist, and their sensed/archived data would be managed in both the serial and the parallel flow, with selected criteria set at each level or intersection of the decision geometry.
4. System Level Sensors: Fortunately, sensors for all components and systems are becoming very low cost (some averaging $1 in quantity). Body sensors will soon enable effective orthotics to assist the disabled. Freight trains will embed sensors to locate hot bearings, cracked wheels, unbalanced loads, etc. Vehicles will embed torque sensors to monitor wheel traction. All this information on component and system performance reaches the decision structure within milliseconds, where actual performance is compared against desired performance (operator-set criteria). Further, the real performance data can then be archived to continuously update the criteria (say, efficiency, response time, precision, temperature, etc.) using predictive analytics. In the past, control techniques were structured to make decisions based on a minimum of sensed data; this approach is no longer germane in today's computational world.
5. Marriage of Man and Machine: To meet human needs, we must integrate a parametric representation of the human with that of the responsive system. Each system will be represented by hundreds of performance maps (and envelopes) at two or more physical layers. Each intelligent actuator may require 40 maps to adequately represent its nonlinear nature. Given 10 actuators, that would mean 400 maps, which must then be built into a reconfigurable decision structure at the system level (because the system may be reconfigured to meet the ever-changing needs of the human). Doing so structures the full decision process and enables highly refined data on the map surface to be retrieved and combined in terms of human-set criteria. Of course, performance maps also apply to the human. Hence, all human and system maps/envelopes become part of the decision process, with far less uncertainty and far shorter response time (clearly, this is useful for operator training as well). Note that autonomy only augments this process, removing from the human the burden of repetitive low-level decisions (as long realized in the case of a fighter pilot).
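Retrieving "highly refined data on the map surface" amounts to interpolating on a stored grid. The sketch below shows bilinear interpolation on one small, hypothetical map (efficiency as a function of speed and torque); the axes, grid values, and the choice of efficiency as the mapped quantity are illustrative assumptions, not data from the paper.

```python
# One hypothetical performance map: efficiency over a speed x torque grid.
speeds  = [0.0, 100.0, 200.0]          # rpm axis
torques = [0.0, 50.0, 100.0]           # Nm axis
efficiency = [                          # efficiency[i][j] at (speeds[i], torques[j])
    [0.00, 0.40, 0.50],
    [0.60, 0.85, 0.80],
    [0.55, 0.80, 0.75],
]

def lookup(speed: float, torque: float) -> float:
    """Bilinear interpolation of the map surface at an arbitrary query point."""
    def bracket(axis, x):
        # Find the grid interval containing x and the fractional position in it.
        for i in range(len(axis) - 1):
            if axis[i] <= x <= axis[i + 1]:
                return i, (x - axis[i]) / (axis[i + 1] - axis[i])
        raise ValueError("query outside map envelope")
    i, u = bracket(speeds, speed)
    j, v = bracket(torques, torque)
    e = efficiency
    return ((1 - u) * (1 - v) * e[i][j] + u * (1 - v) * e[i + 1][j]
            + (1 - u) * v * e[i][j + 1] + u * v * e[i + 1][j + 1])

print(round(lookup(50.0, 25.0), 4))    # midpoint of the first cell, approx. 0.4625
```

With 40 such maps per actuator and 10 actuators, the 400 maps would simply be 400 grids of this kind, each queried algebraically in the decision loop.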
6. Human Operator Visualization: Given truly complex and critical decisions where human life is at stake (surgery, battlefield operations, orthotics, etc.), it becomes essential to provide visual guidance to the human operator so that decisions can be made more rapidly and more accurately. Most visual representations will difference an actual system performance envelope against an embedded criteria-based envelope (prioritized by the human). This difference must highlight desired sweet spots (say, for efficiency) or regions of operation where danger is involved. A useful difference map must contrast good and bad on the same map so that critical decisions can be made quickly (probably moving away from danger in favor of a good performance region). This command would be tracked visually on one or more supportive decision envelopes.
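A difference map of this kind can be sketched as a cell-by-cell comparison of two small grids. Everything here is a hypothetical illustration: the grid values, the danger margin, and the three labels are assumptions chosen to show the good/bad contrast on one map.

```python
# Actual measured performance vs. the operator-prioritized criteria envelope,
# both over the same (here 2x2) grid of operating points.
actual   = [[0.90, 0.70], [0.60, 0.40]]
required = [[0.80, 0.80], [0.50, 0.50]]

def difference_map(actual, required, danger_margin=-0.05):
    """Label each cell: 'sweet' where actual meets or exceeds the criterion,
    'danger' where it falls short by more than the margin, else 'marginal'."""
    labels = []
    for a_row, r_row in zip(actual, required):
        row = []
        for a, r in zip(a_row, r_row):
            d = a - r
            if d >= 0:
                row.append("sweet")
            elif d < danger_margin:
                row.append("danger")
            else:
                row.append("marginal")
        labels.append(row)
    return labels

print(difference_map(actual, required))
```

An operator display would render these labels as colors, so that the move away from a danger region toward a sweet spot is immediately visible.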
7. Command/Response: The idea that all decisions can be predetermined belongs to the past. Today, our low-cost sensors can completely document how all parts of the system are functioning. This data can then inform all parts of the decision structure (autonomy, human, envelope-based criteria, etc.), and then instruct each actuator to respond to its coordinated command, all in 5 to 10 milli-sec or less. For example, a car moving at 70 mph covers about 1 foot in 10 milli-sec, which may not be sufficiently quick to respond to special road surface conditions (i.e., loss of traction). The same may be true of surgery, response to battlefield threats, precision response to force disturbances in manufacturing, etc. The 10 milli-sec decision window enables the linearization of highly complex, coupled, nonlinear systems, enabling strictly algebraic/geometric decisions without the use of cumbersome continuum mathematics, which is easily incapacitated by any form of coupling or nonlinearity in the system. It also means that pseudo-inverses, which are computational approximations, no longer need to be relied on to make timely and accurate decisions.
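The driving example is simple arithmetic, worked through below to make the decision-window scale concrete (1 mile = 5280 ft, 1 hour = 3600 s).

```python
# Distance a car at 70 mph travels during one 10 ms decision window.
mph = 70.0
feet_per_second = mph * 5280.0 / 3600.0   # approx. 102.67 ft/s
distance_ft = feet_per_second * 0.010     # 10 ms window: approx. 1.03 ft
print(round(feet_per_second, 2), round(distance_ft, 2))
```

So each 10 ms window corresponds to roughly one foot of travel at highway speed, which frames how quickly a traction-loss decision must be made.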
8. Open Architecture in EMS: Computer technology became open in the 1980s, when standardized and highly-certified computer chips enabled the construction, almost on demand, of unique and popular computer “boards”. This, in turn, created a demand for higher performance at lower cost for all chips utilized in each board's domain of application. Eventually, the whole design process was inverted in favor of a minimum set of computer chips of ever-higher performance-to-cost for each application domain; i.e., the board designer had to design based on the chips readily available in the supply chain or specify a unique, but more expensive, chip for a special function. The Next Wave of Technology is built on this concept of a minimum set of classes of intelligent actuators (2 to 4 orders of magnitude better than the state of the art) to operate electro-mechanical systems (EMS). The goal is to concentrate on five classes of actuator technology to create an equivalent of Moore's law for actuators. Special cases in each class will meet unique needs (torque density, stiffness, backdrivability, efficiency, shock resistance, etc.), as is now done for computer chips. Each class/case will become available in the supply chain, certified in depth to meet acceptable performance standards. Each actuator will utilize standard interfaces to permit rapid integration in a targeted domain of application. The system designer then becomes an architect, assembling the system on demand to respond to the widest possible set of downstream conditions through reconfiguration commands from the operator or from the embedded decision structure (say, the equivalent of autonomy).
9. Enhanced Availability: Durability is one ingredient of availability, and reliability is one measure of durability. Standards for effectiveness and a fixed schedule for component replacements are another means to manage unexpected failures and long down times. Here, this managed failure avoidance will be expanded to enhance the technical basis for nearly complete availability, almost no false alarms, and reduced cost by eliminating extended outages. Each system will be composed of components with birth-certificate performance maps. Each component will use predictive analytics to update its actual maps and difference the updated maps against the functionally required maps to estimate remaining useful life (RUL), i.e., a modernized form of condition-based maintenance. Based on the predicted RUL, spares can be brought in for replacement before failure in a timely and cost-effective manner. Using this archived data, the degradation history can be quantified to help the component designer improve the design (documented in terms of performance maps), move toward lower cost, and be more responsive to the customer (as part of the supply chain process). Given that open-architecture EMS will continue to become more complex (doing more in multiple configurations), this expanded view of availability will become both the norm and a necessity.
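One minimal sketch of the RUL idea: track a key performance measure (normalized against its birth-certificate value) over operating hours, fit a trend, and extrapolate to the failure criterion. The numbers, the single-measure simplification, and the straight-line fit (a crude stand-in for the paper's predictive analytics) are all assumptions for illustration.

```python
# Degradation history: a normalized performance measure at each map update.
hours   = [0.0, 1000.0, 2000.0, 3000.0]   # operating hours at each inspection
measure = [1.00, 0.97, 0.94, 0.91]        # fraction of birth-certificate value
FAIL_AT = 0.80                            # functionally required minimum

def estimate_rul(hours, measure, fail_at):
    """Least-squares line through the history, extrapolated to the failure
    criterion; returns estimated hours remaining from the latest update."""
    n = len(hours)
    mh, mm = sum(hours) / n, sum(measure) / n
    slope = (sum((h - mh) * (m - mm) for h, m in zip(hours, measure))
             / sum((h - mh) ** 2 for h in hours))
    if slope >= 0:
        return float("inf")               # no degradation trend detected yet
    hours_at_fail = mh + (fail_at - mm) / slope
    return hours_at_fail - hours[-1]

print(round(estimate_rul(hours, measure, FAIL_AT), 1))   # hours of life remaining
```

With the trend above (3% loss per 1000 hours), the component would reach the 0.80 criterion about 3667 hours after the last update, so a spare can be scheduled well before failure.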
10. Actuator Intelligence: Item 5 highlights the absolute importance of intelligent decision making within all future actuators (as it is now for all market-driven computer chips). Given a decision time span of 10 milli-sec at the system level, it must be 1 milli-sec or less at the actuator level, because actuators are highly coupled in most systems (frequently in a force fight). Unfortunately, all actuators are highly nonlinear, making their command/response largely intractable by classical automatic control (usually adequate only for simple linear systems of a few DOF). To make each actuator responsive to command, at least ten sensors (voltage, current, temperature, velocity, acceleration, torque, etc.) must generate data to accurately represent the actuator's real condition (in much less than a milli-sec). This data then locates each performance measure on its respective embedded maps. Each such data point then enables algebraic decisions as to how to respond to commands in the next time frame (say, 1 milli-sec). These algebraic decisions are based on operator-set performance criteria to meet the system's operational demands within the allotted time span. This includes torque, acceleration, stiffness, backdriving, etc., as well as condition-based maintenance and fault avoidance. Actuator operational software will be dedicated to each actuator class and evolve over time depending on the application domain. There may eventually be a concentration on forward (what is commanded) and inverse (what actually occurred) decisions.
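One possible reading of the forward/inverse pairing can be sketched as a single 1 ms decision frame: the forward decision is the commanded torque, and the inverse information is what the sensors say actually occurred last frame. The gain value (standing in for a locally-linear slope read off an embedded map) and the correction rule are hypothetical assumptions, not the paper's algorithm.

```python
# Hypothetical map-derived slope: how realized torque scales with command
# near the current operating point (would come from an embedded map).
EMBEDDED_MAP_GAIN = 0.8

def next_frame_command(commanded_torque, measured_torque, prev_command):
    """One algebraic decision: adjust the previous command by the observed
    shortfall, scaled by the locally-linear gain from the embedded map."""
    shortfall = commanded_torque - measured_torque   # inverse (measured) info
    return prev_command + shortfall / EMBEDDED_MAP_GAIN

# Example: 50 Nm was requested, only 46 Nm was realized in the last frame,
# so the raw command is nudged upward for the next 1 ms frame.
print(next_frame_command(50.0, 46.0, 60.0))
```

The point of the sketch is that each frame's decision is pure algebra on sensed values and map-derived constants, with no differential equations solved in the loop.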
Recommended Actions: The weakness in the mechanicals will require a national reawakening, especially among the U.S. federal funding agencies. The following actions may build a wave of development based on a strong tech base community of interest.
Convene an industrial council of interested R&D vice presidents of our high-valued industries to advise multiple federal agencies on balancing all technologies, with emphasis on rebuilding the mechanical tech base.
Have the DOE revisit the critical role of the mechanicals in the energy sector (oil & gas, efficient vehicles, manufacturing, power plants, etc.).
Have DARPA commit to a revolution in intelligent and highly-certified actuators with emphasis on military systems, as it did for the computer chip in the 1970s.
Have NSF Engineering restructure its program plans to rebalance its portfolio, creating a proportional investment in the mechanicals to meet the tech base requirements of our major economic product producers. Guided by advice from the recommended industrial council, young faculty would then be able to seek balanced funding to support graduate students better oriented to real industrial needs.