The saga over Boeing’s 737 Max and the Federal Aviation Administration’s handling of it bears a striking resemblance to another government controversy: the criticism the FDA has received over its 510(k) medical device clearances.
Originally posted by Medical Design & Outsourcing on August 15, 2019.
From Netflix’s documentary “The Bleeding Edge” to the International Consortium of Investigative Journalists’ “The Implant Files” series, the FDA has taken heat over the 510(k) pathway, which allows devices with features “substantially equivalent” to previously cleared technology to go through a less stringent review.
Meanwhile, the FAA has faced similar fault-finding when it comes to Boeing’s 737 Max 8. Initially announced in 2011, the Max was delivered with promises of improved fuel and economic efficiency. But less than a year and a half after its first flight, a 737 Max 8 crashed in Indonesia, killing all 189 people on board. Just five months later, a Max crashed again, in Ethiopia, bringing the total to 346 lost lives. The two crashes represent a strong safety signal warranting further attention, raising questions that apply not only to airplanes but also to medical devices.
Here are four regulatory challenges that both the FDA and FAA need to overcome:
1. Use of a derivative/predicate design
While the 737 Max 8 was designed with new benefits in mind, such as enhanced fuel efficiency, it is not a novel design. It is instead a series of revisions to the original 1960s 737 design intended to make a more competitive aircraft, as the Los Angeles Times reported in March.
While the FAA has tightened restrictions on designs, many derivative designs were grandfathered in, making it easier and more cost-effective to modify an existing airframe than to certify a novel design, according to Aviation Law Monitor. As a derivative, the 737 Max 8 also did not require a new type rating. A type rating refers to the certification a pilot must hold to fly certain aircraft (think of it as a special license). As such, no additional pilot training was required, despite the numerous changes to the aircraft, which included a new, larger engine, split winglets, and a taller nose gear, among others. The larger engine had to be mounted slightly farther forward and higher up beneath the wing, and the taller nose gear required lengthening the nose landing gear. Taken in combination, the new design changed the overall handling of the plane.
Use of a derivative design mirrors the use of a predicate device in the 510(k) medical device pathway. Like the FAA, the FDA has recently been looking to tighten restrictions on predicate use. In a 510(k) premarket submission, a predicate device is an existing legally marketed device against which a new, pending device is compared so that a determination of “substantial equivalence” can be made.
Opponents of the 510(k) pathway state concerns over the age of predicates, including the use of predicates that are no longer on the market (possibly replaced by other devices due to lack of effectiveness). They want additional safety data before devices go on the market.
In the same way that revisions to the 737 Max 8 appear to have produced tragic results, there have been horror stories around 510(k)-cleared medical devices. For example, the use of surgical mesh for pelvic organ prolapse, initially cleared under the 510(k) pathway, has resulted in tens of thousands of lawsuits and hundreds of millions of dollars in legal settlements, according to ConsumerSafety.org. Because of the relative success of mesh for other conditions such as hernias, medical device company officials thought mesh might extrapolate usefully to pelvic organ prolapse, though this extrapolation was unfounded and lacked supporting data. Complications from mesh included pain, bleeding, infection, and in some cases punctured organs or abdominal walls. In April, the FDA moved to completely shut down sales of pelvic mesh in the U.S.
FDA officials themselves know the 510(k) pathway is flawed and needs an overhaul, but they find themselves walking a delicate tightrope. On one hand, the 510(k) pathway enables quicker innovation, getting newer, improved products to patients sooner. On the other hand, that speed shouldn’t sacrifice overall safety. Eliminating predicate use within the 510(k) pathway may slow innovation, leaving patients with fewer and aging options.
Acknowledging a flawed process, the FDA has been searching for ways to revamp the 510(k). Recent proposals include a 10-year limit on the age of predicates. Likewise, the FAA has announced plans to change its certification process, though the details are still being worked out.
Adding an age limit to derivative designs may offer the FAA an appropriate starting point.
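In practice, an age cap like the one the FDA has floated would boil down to a date comparison at submission time. The sketch below is purely illustrative: the function name, the day-based age calculation, and the 10-year default are my own assumptions, and no such rule is currently in force at either agency.

```python
from datetime import date

def predicate_within_age_limit(clearance_date, submission_date, max_age_years=10):
    """Illustrative check: would a predicate cleared on clearance_date still be
    usable in a submission filed on submission_date under a hypothetical age cap?
    (No such rule is in force today; 365.25 approximates a year in days.)"""
    age_days = (submission_date - clearance_date).days
    return age_days <= max_age_years * 365.25

# A 9-year-old predicate passes a 10-year cap; a 12-year-old one does not.
print(predicate_within_age_limit(date(2010, 6, 1), date(2019, 6, 1)))  # True
print(predicate_within_age_limit(date(2007, 6, 1), date(2019, 6, 1)))  # False
```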
2. Software issues
The design changes to the initial 737 resulted in new safety issues. The new engine size and placement altered the way the aircraft handled and, under certain circumstances, could cause an upward pitch, creating a stalling risk. Boeing’s solution was to introduce the Maneuvering Characteristics Augmentation System (MCAS), software intended to mitigate that risk.
The MCAS, as described in the Seattle Times in March, is an automated corrective action that adjusts the horizontal stabilizer to bring the plane’s nose down when the plane is at risk of stalling. Boeing’s CEO stated that while the MCAS is viewed as an anti-stall system, it was actually designed to give pilots handling options based on preference. In both crashes, however, the MCAS was erroneously activated. Because the MCAS is automated, it has full authority to make changes, so pilots could not simply override the function as it sent the plane’s nose into a downward pitch. Further, the system’s feedback was based on a single angle-of-attack sensor. Erroneous angle-of-attack sensors are nothing new to the aviation industry: in 2014, an Airbus A320 experienced a similar angle-of-attack sensor failure that sent the plane’s nose downward, Herald Net reported. Fortunately, the pilot was able to pull the nose back up after the plane fell almost 31,000 feet.
Use of multiple sensors for feedback is vital in the event of a faulty sensor, as occurred in both crashes.
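The cross-check this implies can be sketched in a few lines: fuse redundant readings by taking their median, and stand automation down when the sensors disagree. This is a toy illustration, not how the MCAS or any certified avionics system is implemented, and the 5-degree disagreement limit is an arbitrary placeholder.

```python
def fuse_angle_of_attack(readings, disagreement_limit_deg=5.0):
    """Toy redundancy scheme: return (fused_value, trusted).

    fused_value is the median of the sensor readings, which tolerates a single
    wild sensor; trusted is False when the spread between the lowest and highest
    reading exceeds the limit, signalling that automation should defer to the pilot.
    """
    ordered = sorted(readings)
    n = len(ordered)
    if n % 2:
        median = ordered[n // 2]
    else:
        median = (ordered[n // 2 - 1] + ordered[n // 2]) / 2
    trusted = (ordered[-1] - ordered[0]) <= disagreement_limit_deg
    return median, trusted

# Three sensors, one failing high: the median ignores the outlier, and the
# disagreement flag tells the automation to stand down.
print(fuse_angle_of_attack([2.0, 2.5, 40.0]))  # (2.5, False)
```

With only a single sensor, as on the Max 8, no such disagreement check is possible, which is the point the paragraph above makes.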
The error in the MCAS system ultimately led to the grounding of all 737 Max 8 planes.
Software has been a growing problem for the medical device industry, too. It has been the leading cause of medical device recalls for 11 consecutive quarters through Q4 of 2018. Software troubles are indiscriminate and affect even the largest industry names. For example, Medtronic recalled approximately 13,000 pacemakers in the U.S. alone this year over a software error that caused a lack of pacing; the FDA designated the recall as Class I. Medtronic is not alone: in a review of medical device recalls since January 1, 2019, reasons for software-related recalls include sporadic error messages, risk of device “lockup” or freezing, and circuit errors. These errors can have severe or even fatal consequences.
But a new and growing issue for devices is cybersecurity, and much like the MCAS, a software patch introduced to mitigate one issue can sometimes add problems. In 2018, a Common Vulnerability and Exposure was identified in Medtronic CareLink, meaning a vulnerability was found in the software that manages patient information and the device. In response, Medtronic released a patch intended to improve patient security and privacy, but the patch instead created a software vulnerability that could allow hackers to push malware into the system, gain control, and change the programming of the pacemakers. Fortunately, no adverse events were reported in association with the voluntary recall.
Software appears to be a painful blind spot in hardware-focused industries, despite the fact that these industries and their relative safety rely on a harmonious relationship between hardware and software. Given the complexities of software and networks, regulatory oversight relating to software has been problematic for many agencies, especially given the uncertainty in regulating software and software changes. Regulatory agencies may need to consider “how much” of a software change turns an otherwise derivative/predicate application into something entirely novel.
3. User error, training, and human factors
Whether it’s airplanes or medical devices, software and hardware aren’t the only things determining the safety of a product. User error is another important regulatory consideration. While neither crash investigation has concluded, the MCAS-related crashes have been debated as user error. As previously stated, the MCAS is an automated system that changes the plane’s handling based on sensor feedback, and the lack of an override has been cited as a precipitating factor in both crashes. The FAA chief has noted that pilot decisions also played a role in the chain of events that brought down the two planes, stating that pilots could have overridden the MCAS by turning off the motors that were driving the nose downward. Indeed, one day before the Lion Air crash, the same plane experienced similar problems with the MCAS. A third, off-duty pilot was able to manage the system by shutting off the motors, preventing the nose from repeatedly turning downward. This procedure, according to Bloomberg, is part of a checklist pilots are required to memorize. Reliance on memorization, especially in an emergency, is prone to failure. In the second crash, the crew reportedly did not follow the emergency directive correctly: while the pilots did turn off the motors, they did not appropriately control for speed.
Training also appears to have been problematic. Under the derivative design, the 737 Max 8 did not need a new type rating, meaning there was no retraining requirement despite the many changes. Thus, proper handling of the MCAS, especially in scenarios where the plane is erroneously pitching down, was never properly communicated to regulatory authorities or pilots. News reports have noted that documentation of the MCAS was initially missing from manuals. A combination of training and readily available, documented emergency procedures might have produced a different outcome. But even then, with limited time to make a decision, documented emergency procedures may not be enough. Products should be designed to leave minimal room for user error. Giving pilots the ability to simply disable the MCAS or hit an ignore button, as cockpits already commonly allow for other software functions, might have provided an easy, less error-prone solution. Limiting the need for emergency procedures offers a more human-friendly fix than merely outlining them and hoping for ready recall under pressure.
Human factors and user error are also critical elements in medical device design and use. There remains a need to understand not only how people perceive a device but also how they use it and interpret information from it. This means providing adequate training and limiting room for error not only in the manufacturing of the device but also in its use. For implantable devices, training is a crucial component of outcomes, though training may be influenced by factors such as off-label use, which the FDA does not oversee. Current medical device applications require a clear outline of who will use the device and what training is expected.
As with the MCAS, alarm signals and user manuals play an important role in medical device safety, but the FDA has stated a need to reduce user reliance on manuals, as such reliance may contribute to error. Instead, devices should have easy-to-use interfaces and a design that promotes proper use even in the absence of ready recall.
User errors in devices may have serious consequences, such as injury or death. For example, usability errors with infusion pumps accounted for more than a third of medical errors resulting in significant harm between 2005 and 2009, according to the FDA.
4. Regulatory communication, reporting and transparency
Reporting refers to the responsibility of industries to make safety issues known. Reporting by professionals and users remains a staple safety feature of any industry, yet despite many pilots’ complaints about the Boeing 737 Max 8’s handling, the FAA does not appear to have thoroughly investigated them prior to the crashes. A federal database showed five reports of aviation incidents involving the MCAS, all of which occurred during takeoff. Given the short time frame, it is arguable that this was a missed safety signal. Moreover, a 2014 audit of the FAA noted that data from voluntary reports is not conducive to analysis of safety risks. So how many more complaints may have existed but were never reported, especially in light of the many pilots voicing concerns since the crashes?
Further, a collection of related reports shows that angle-of-attack sensors had been flagged as problematic more than 50 times across different planes. There were 19 cases of Boeing-related sensor reports, including an emergency landing of a 737-800 and two 767s, according to Herald Net, which cited FAA data. In addition, flight logs, such as the one from the Lion Air flight the day before its crash, may have offered valuable safety information. Could the FAA have caught MCAS problems earlier if pilots had logged similar issues in the past?
The FDA also promotes open communication through its federal reporting system, intended to track adverse events and identify safety issues. Medical Device Reporting (MDR) comprises voluntary and mandatory reporting requirements, with reports filed through the MedWatch platform. Both medical professionals and patients may file reports, and while mandatory reporting makes reports more likely, the FDA has struggled to get users to report where needed to promote safety. It is possible that, due to relative underreporting, important safety signals are going undetected. A recent release of 20 years of medical device adverse event reports accompanied the end of the Alternative Summary Reporting program, which had exempted some manufacturers from public reporting. With this release comes an era of enhanced transparency that may help foster earlier detection of safety signals. The FDA is also working to make the Manufacturer and User Facility Device Experience (MAUDE) database more user-friendly in order to encourage transparent reporting.
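One standard way such report databases are mined for safety signals is disproportionality analysis. The sketch below computes a proportional reporting ratio (PRR), a measure borrowed from pharmacovigilance; the counts are invented for illustration, and real-world signal detection involves considerably more statistical machinery than this single ratio.

```python
def proportional_reporting_ratio(a, b, c, d):
    """Proportional reporting ratio (PRR) from a 2x2 table of report counts.

    a: reports of the event of interest for the device of interest
    b: reports of all other events for that device
    c: reports of the event of interest for all other devices
    d: reports of all other events for all other devices

    A PRR well above 1 suggests the event is reported disproportionately
    often for this device and may merit investigation.
    """
    return (a / (a + b)) / (c / (c + d))

# Invented counts: 10 of 100 reports for our device mention the event,
# versus 100 of 10,000 reports across all other devices.
print(round(proportional_reporting_ratio(10, 90, 100, 9900), 1))  # 10.0
```

A screen like this only works, of course, if the underlying reports are actually filed, which is the underreporting problem described above.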
Transparency is vital not only when it comes to safety but also to consumer trust. The FAA could look to the FDA for guidance on better collection of data to inform safety signals, analysis, and transparency. In addition to keeping the lines of communication open between industry, regulators and consumers, both agencies will need to continue to work to find new ways to encourage and increase reporting.
Both the FAA and FDA face similar problems and could learn from each other. The aviation certification process and the 510(k) device pathway appear outdated and in need of improvements to keep pace with the accelerating technological advances that make both industries so prominent. The FDA may offer the FAA clues in assessing regulatory needs and making changes in accordance with those assessments. The FAA may also benefit from more structured reporting programs, such as the FDA’s MedWatch, that would allow earlier detection of safety signals.
Both agencies will need to consider how to define derivatives and predicates going forward. An older derivative or predicate may be problematic because many changes over time can yield a significantly different product, yet imposing age limits so that only newer predicates can be used may create lead-time issues. For example, a 10-year-old predicate with minimal changes and minimal adverse event reports may pose lower risk than a new device based on a two-year-old predicate that has no reported events yet but may accumulate many in the future. This may especially be the case as devices, machinery, or software age, since problems may not become reportable until later in the lifespan.
Finally, both agencies should work to minimize human error through more seamless designs and continue to foster transparency and early safety signal detection through effective reporting.
Ashley Holub is an epidemiology PhD candidate with a graduate certificate in regulatory science at the University of Rochester. Follow her on Medium.
The opinions expressed in this blog post are the author’s only and do not necessarily reflect those of Medical Design and Outsourcing, Micro, their employees, or the author’s employer.