Does aviation believe in AI, or do people in aviation believe in AI? I have thought about this question for a long time. Attitudes differ from person to person, but aviation itself will not change overnight. Although AI-oriented, data-centric certification marks a shift in thinking relative to earlier certification practice, a consensus on AI in aviation will inevitably be built step by step.
Machine learning (ML) is a long-established field that builds input-output models by learning automatically from example data. Over the past decade, a series of research breakthroughs has reignited the field and opened up a wide range of new applications. This progress is largely due to the increase in available data and computing power. Deep learning is a subset of machine learning with a subtle but important difference. In classical machine learning, models learn from data features (object color, edges, texture, velocity, and so on) handcrafted by subject matter experts. In deep learning, the model learns these features on its own. For many specialized tasks, such models outperform traditional methods and even human experts. These tasks include medical diagnosis, satellite image analysis, disease classification, board and video game strategies, and more. Researchers and practitioners see deep learning as a key enabler of robot autonomy, promising unprecedented levels of safety. Deep learning, in particular, is behind the progress of autonomous driving over the past decade and has the potential to unlock the true benefits of autonomous flight. Despite these achievements, the adoption of machine learning in safety-critical systems such as automobiles and aircraft has been slow. This is largely due to lagging regulation and the lack of standardized certification policies, especially in the aerospace sector.
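The distinction drawn above can be sketched in code. This is a minimal, hypothetical illustration, not a real pipeline: in classical ML an expert writes the feature extractor by hand, while in deep learning the extractor's parameters are themselves fit from data (shown here as a plain learned linear map rather than a real network).

```python
# Classical ML: an expert hand-codes the features the model will see.
def handcrafted_features(image):
    """Expert-chosen features: mean brightness and a crude edge count."""
    flat = [px for row in image for px in row]
    mean = sum(flat) / len(flat)
    edges = sum(abs(a - b) > 50 for row in image for a, b in zip(row, row[1:]))
    return [mean, edges]

# Deep learning: the feature extractor itself is learned. The weights below
# would come from training on example data, not from a requirement or a
# physical model -- which is exactly what complicates traceability later on.
def learned_features(image, weights):
    """'Learned' features: a linear map whose weights were fit from data."""
    flat = [px for row in image for px in row]
    return [sum(w * x for w, x in zip(wrow, flat)) for wrow in weights]
```

The point of the sketch: `handcrafted_features` can be traced to a design decision by a named expert; the numbers inside `weights` cannot.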
In aerospace, certification is designed to ensure that hardware, software, and processes adhere to a defined set of design and performance requirements, ultimately reducing the probability of failure. The key to obtaining operational approval is quantitatively demonstrating reliability across the entire operational domain (OD). In aerospace, regulatory agencies such as the FAA and EASA grant certification. They do not prescribe the exact practices to be followed; instead, they issue advisory material acknowledging acceptable methods for developing and certifying aircraft, aerospace systems, or components.
These bodies, supported by professional associations such as SAE, RTCA, or EUROCAE, provide a forum for developing technical standards and recommended practices for aircraft system design. These documents have no legal force in themselves but are generally considered an acceptable means of compliance. The regulator then works with each applicant to approve the use of each standard, defining a formal means of compliance (MOC).
The widely accepted aerospace model follows the guidance of SAE ARP4761 and ARP4754A/ED-79A. This framework follows the classic systems engineering life cycle to determine the level of assurance required for system components. A functional hazard assessment (FHA) breaks the system architecture down level by level, from the aircraft all the way down to the subsystems. The functions at each level are then listed and mapped to the corresponding subsystems. The failure conditions of each function, and their severity, determine the Development Assurance Level (DAL) and the safety requirements to be met. The applicant then follows the agreed means of compliance (MOC) to verify that the failure-condition requirements are satisfied.
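The severity-to-DAL mapping described above can be sketched as a small lookup. This is an illustrative sketch only (function and structure names are hypothetical, not from any real toolchain); the severity categories and letter levels follow the scheme used in ARP4754A and DO-178C.

```python
# Failure-condition severity -> Development Assurance Level, per the
# ARP4754A / DO-178C scheme. Catastrophic conditions demand level A
# (most rigorous); conditions with no safety effect demand level E.
SEVERITY_TO_DAL = {
    "catastrophic": "A",
    "hazardous": "B",
    "major": "C",
    "minor": "D",
    "no_safety_effect": "E",
}

def assign_dal(failure_conditions):
    """Assign a function the DAL of its most severe failure condition."""
    order = "EDCBA"  # E least demanding, A most demanding
    dal = "E"
    for severity in failure_conditions:
        candidate = SEVERITY_TO_DAL[severity]
        if order.index(candidate) > order.index(dal):
            dal = candidate
    return dal
```

For example, a function whose failure conditions are "minor" and "hazardous" is assured at level B, because assurance is driven by the worst-case condition.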
DO-254/ED-80 and DO-178C/ED-12C are guidance documents for the certification of hardware and software components, respectively. Similarly, DO-200B provides guidance for handling aeronautical data and can be extended to the management of machine learning data. Under DO-178C, each requirement must be traceable to the source code that implements it, the test cases that verify its correctness, and the results of those tests. This ensures that the source code satisfies every requirement and contains no superfluous code.
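The traceability obligation just described can be sketched as a simple consistency check over a traceability matrix. This is a hypothetical sketch, not a real DO-178C tool: it flags requirements with no implementing code, requirements with no passing test, and code with no requirement (the "superfluous code" case).

```python
def check_traceability(requirements, code_units, tests):
    """Illustrative DO-178C-style traceability check.

    requirements: {req_id: set of code-unit ids implementing it}
    code_units:   set of code-unit ids present in the source tree
    tests:        {req_id: True if the verifying test passed}
    Returns a list of human-readable problems (empty means consistent).
    """
    problems = []
    covered = set()
    for req, units in requirements.items():
        if not units:
            problems.append(f"{req}: no implementing code")
        if not tests.get(req, False):
            problems.append(f"{req}: no passing test")
        covered |= units
    # Code that traces to no requirement is superfluous under DO-178C.
    for unit in sorted(code_units - covered):
        problems.append(f"{unit}: superfluous code (no requirement)")
    return problems
```

In a real process this matrix is maintained and audited by qualified tools; the sketch only shows the shape of the obligation.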
For machine learning models, mapping source code to requirements is not straightforward, which leaves a gap in the current certification process. First, such a model typically contains millions of parameters, a subset of which are activated depending on the properties of the input. This makes it impossible to track the impact of an individual parameter on the output. Second, these parameters are not derived directly from physics or from requirements, as is the case with traditional software (e.g., Kalman filters, yaw controllers). They are learned automatically from sample data fed to the model during training. To trace a parameter back to a requirement, therefore, the data itself would have to be mapped to requirements, shifting the focus of certification from the code to data traceability. Third, the intermediate representations of the data that machine learning models build are increasingly complex, powerful, and hard to explain. Even when these intermediate layers are inspected, their parameters cannot be mapped to requirements.
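The shift toward data traceability described above can be sketched as follows. This is a hypothetical illustration (all names and the sample threshold are assumptions, not from any standard): each training sample is tagged with the operational-domain requirement(s) it provides evidence for, and the check asks which requirements lack sufficient data coverage.

```python
def undercovered_requirements(dataset, requirements, min_samples=100):
    """Illustrative data-to-requirement coverage check.

    dataset:      list of (sample_id, set of requirement ids it evidences)
    requirements: iterable of operational-domain requirement ids
    min_samples:  assumed minimum evidence per requirement (hypothetical)
    Returns the requirements whose data coverage falls below the threshold.
    """
    counts = {req: 0 for req in requirements}
    for _sample_id, req_ids in dataset:
        for req in req_ids:
            if req in counts:
                counts[req] += 1
    return [req for req, n in counts.items() if n < min_samples]
```

The point is that the unit of traceability becomes the training sample rather than the line of code: arguing that the operational domain is adequately represented in the data replaces arguing that each code line implements a requirement.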
Source: Aviation House