Follow-Up Q&As : Open Architectures and Enduring Competitiveness
December 4, 2020 | Mercury Systems
Our CTO, Dr. Bill Conley, recently hosted a webinar with Dr. Ian Dunn titled “Open Architectures and Enduring Competitiveness.” The highly engaged audience had more questions than time allowed us to answer, so we’d like to use this blog to answer the other insightful and important questions raised by the customers, suppliers and partners who attended the event.
In this discussion, Bill described how the source of research and development has shifted from the federal sector into the private sector over the last six decades. This trend has diminished Defense’s role as the global incubator of high technology and thrust it into the much broader role of first adopter of high-tech innovations.
Ian then walked viewers through the important role of open architectures inside Mercury’s high-tech business model as we operate in the Defense marketplace. Numerous aspects of successful open standards were covered, including the interfaces, data, testing and hardware associated with the ecosystem. Open architectures are a great way for the government to control sustainment costs for weapon systems intended to be maintained for years; this benefit comes at the cost of lagging behind the development timeline of a vertically integrated, proprietary solution. In addition, the government needs to maintain an economic role in the open architecture that spans the life cycle of the adopting programs.
Question: On the last chart, you mentioned that digital engineering is being accelerated. Please expand on what this means.
Answer: Digital engineering is a new engineering best practice. With recent developments in tools as well as simulation capabilities, entire systems can be designed, integrated and demonstrated in a digital environment. This digital design can be optimized based on feedback from higher-level designs and simulations. After maturing the design in the virtual environment, it is then prototyped and moved into production. The goal of digital engineering is to reduce development timelines, improve performance, reduce rework and, ultimately, reduce cost.
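As a toy illustration of this loop, the sketch below tunes a single design parameter entirely against a simulated performance model before anything would be prototyped. The surrogate model, parameter names and numbers are invented for illustration only, not drawn from any Mercury toolchain.

```python
# Minimal sketch of the digital-engineering loop: design and optimize in
# simulation first, prototype only after the virtual design has matured.

def simulate(base_thickness_mm: float) -> float:
    """Toy surrogate model: predicted hot-spot temperature rise (deg C)."""
    # Invented physics stand-in: a thicker base spreads heat better
    # (diminishing returns) but adds mass that degrades airflow.
    return 80.0 / (1.0 + base_thickness_mm) + 2.0 * base_thickness_mm

def optimize(lo_mm: float, hi_mm: float, steps: int = 1000) -> float:
    """Grid-search the simulated design space; no hardware is built yet."""
    candidates = [lo_mm + (hi_mm - lo_mm) * i / steps for i in range(steps + 1)]
    return min(candidates, key=simulate)

best_mm = optimize(0.5, 10.0)
print(f"best simulated base thickness: {best_mm:.2f} mm")
```

Only after a loop like this converges in the virtual environment would a physical prototype be built, which is where the savings in rework and timeline come from.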
You may have also heard the terms digital twin and virtual twin. These concepts are increasingly important in the real world: physical systems are prototyped with varying degrees of digital simulation, and digital models increasingly run inside real systems. At Mercury, we’ve adopted a technology strategy very much aligned with the commercial innovations in this space, allowing our customers to deploy these concepts on their tactical hardware.
Question: Earlier you mentioned that open architectures typically lag five years behind the state of the art. When is the right time for the use of an open architecture on a program, and when shouldn’t you specify an open architecture?
Answer: Successfully designing, developing and deploying products aligned to an open architecture typically takes about five years longer than executing a vertically integrated development strategy. For aircraft and ground vehicles, this is often acceptable: the five-year delay in architecture development is similar to the timeline of airframe design itself, and because most platforms will be sustained for years, an open standards architecture allows for periodic upgrades over the life cycle. However, to address critical threats with a new weapon or electronic warfare system, a vertically integrated solution may be required; a delay of multiple years may not meet the user’s urgent operational need. In these situations, an open system architecture may not be the right approach.
Question: In the context of the network effect, how could we expand the marketplace to make open architectures more useful? Are there particular directions of market expansion that would be especially valuable?
Answer: There are some obvious opportunities for defense and civilian radio and networking equipment to share the same architecture. While specific anti-jam features or low probability of detection may be needed for defense applications, large portions of the architecture could be directly shared. Opportunities to share architectures for vetronics and avionics also exist with civilian users. The increasing role of autonomy for commercial applications will create new opportunities for sensors previously used primarily for military end uses.
Lastly, microelectronics deserves some special commentary because of its role at the bottom of the technology stack. Chips of all kinds are powering the revolution in digital engineering, and microelectronic devices now permeate every aspect of our personal and professional lives. As a consequence, microelectronics is not so much about market expansion for government and the defense industry in general; it is a bigger conversation around the role of these devices with respect to national security. The right choices at this technical level would allow all high-tech systems to be designed, developed and deployed more quickly—of benefit to both defense and civilian applications.
Question: Is microelectronics today still empirically considered 10^-6? If so, will there be a future area of “nano-electronics, 10^-9”? If so, will any of these six working areas [highlighted in webinar slides] change? Can we keep increasing speed without going to nano-sized electronics?
Answer: Today’s microelectronics are built using transistor features under 10 nm, near the bottom end of the size scale described in the question, so leading-edge devices are already nanometer-scale. Full designs are roughly five orders of magnitude larger, or millimeters in scale. Broadly, we expect to keep using the term microelectronics to describe transistor-based devices constructed from semiconductor materials. Quantum processing, quantum sensing and quantum communication systems have seen a substantial increase in interest for defense applications. As the technology continues to mature, we expect the six capabilities (addressed in the webinar) and their design trade-off considerations each to remain critical: access to novel materials; the highest performance and processing speeds; awareness of SWaP constraints [as well as cooling, vibration and other environmentals]; integration into existing software through an open interface; and mission trust, i.e., safety and security.
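As a quick back-of-the-envelope check of the scales in this answer, the snippet below relates the SI prefixes from the question to a modern feature size and a full design. The feature and die dimensions are illustrative round numbers, not product specifications.

```python
# Relating the 10^-6 / 10^-9 prefixes from the question to today's parts.
NANO = 1e-9    # SI prefix "nano":  10^-9 m
MICRO = 1e-6   # SI prefix "micro": 10^-6 m

feature_size_m = 10e-9   # ~10 nm transistor feature (illustrative)
die_edge_m = 10e-3       # ~10 mm die edge (illustrative)

# Leading-edge features are already nanometer-scale, well below a micrometer.
feature_in_nm = feature_size_m / NANO
feature_in_um = feature_size_m / MICRO
# A full design is several orders of magnitude larger than one feature.
die_to_feature = die_edge_m / feature_size_m

print(f"feature: {feature_in_nm:.0f} nm ({feature_in_um} um); "
      f"die/feature ratio: {die_to_feature:.1e}")
```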
Question: Does an open architecture introduce configuration challenges?
Answer: Intrinsically, no. However, as standards evolve, they may choose to limit their backwards compatibility, and without backwards compatibility, configuration challenges could emerge years in the future. Most standards continue to evolve. A great example is the standard wall electrical outlet, where the only major change in decades has been the addition of the ground connector. Standards like USB have maintained substantial backwards compatibility while keeping the same physical connection.
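A minimal sketch of how an evolving open interface can preserve backwards compatibility: a version-2 consumer still accepts version-1 messages by supplying defaults for fields that did not exist in version 1. The message format and field names here are invented for illustration, not taken from any Mercury or standards-body interface.

```python
# Hypothetical open-interface message parser that stays backwards
# compatible as the standard evolves from v1 to v2.

def parse_message(raw: dict) -> dict:
    # v1 producers predate the "version" field, so its absence means v1.
    version = raw.get("version", 1)
    msg = {"version": version, "payload": raw["payload"]}
    # "priority" was added in v2; defaulting it keeps v1 producers working
    # unchanged, avoiding the configuration challenges described above.
    msg["priority"] = raw.get("priority", "normal")
    return msg

v1_msg = parse_message({"payload": "status"})
v2_msg = parse_message({"version": 2, "payload": "status", "priority": "high"})
print(v1_msg, v2_msg)
```

When a standard instead drops such defaults in a later revision, older fielded equipment can no longer interoperate, which is exactly the long-horizon configuration risk the answer describes.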