Technology is a marvellous thing, but we take for granted the detailed process behind even the smallest of tools, especially in public health. It might strike you as odd, but had you been born 300 years ago, a mere cut on your hand or foot could have left you vulnerable to potentially fatal sepsis, because society was far from equipped to treat it. In the 1800s, an outbreak of smallpox could devastate a community, and average life expectancy hovered around 40.
Fast forward to the modern day, and there’s no shortage of incredible innovations that mean we can vaccinate against a novel virus within 18 months, and even get robots to help us perform surgical operations. Behind the scenes, however, there’s a lot more to these innovations than simply the inspired invention and hard work of implementation. In this article, we’ll explore the history of medical devices and why their regulation is so vital.
What medical devices do
Essentially, what they say on the tin. There are countless tools and objects used across a vast range of medical work, from diagnostics to surgery and post-operative monitoring. From a technical standpoint, the label refers to any appliance with a medical application. The humble stethoscope, for instance, is a medical device, since it helps a doctor assess the behaviour of the heart, lungs and arteries.
However, there has been a great deal of development and innovation in the tools used for medical treatment, particularly since the birth of modern medicine in the post-Industrial era. The mainstream harnessing of electricity has also been unparalleled in its transformation of medical technology, as you might have already gathered.
The evolution of medical devices
The use of knives, scalpels, saws and needles for treating injuries and diseases goes back millennia. In ancient times, neurosurgery was carried out using a method called trepanation, in which drills were used to make holes in patients’ skulls. Our understanding of medicine has thankfully progressed in the thousands of years since, but this is an early example of a medical device.
Between A.D. 1 and the 1600s, many medical procedures were carried out on the battlefield. Surgical and other curative treatment for injury and illness was relatively limited during this period, so the tools available were often put to ill-founded uses.
For instance, during the Black Death (1346–52), history’s deadliest pandemic, which took the lives of up to 200 million people across Europe, Asia and Africa, the most common medical device was the knife. Blood-letting was thought to be the best treatment, since mainstream understanding saw a humoural imbalance as the cause of the bubonic plague (which, of course, it wasn’t).
The birth of modern medicine
Before the 19th century, most medical devices were manufactured by doctors or small companies and sold to the public without any regulation. In 1867, however, Joseph Lister pioneered the use of antiseptics to destroy germs, a practice that would prove essential to modern surgical procedures.
Around the same time, the French chemist and microbiologist Louis Pasteur established the principles of vaccination and pasteurisation, two staples of modern treatment and prevention. These breakthroughs paved the way for pharmaceuticals, anaesthetic equipment and devices like the hypodermic needle and the electrocardiograph.
In the 20th century, open heart surgery became commonplace, along with organ and bone replacements, dialysis machines, ventilators, respirators and pacemakers, plus countless other devices.
Why medical devices are regulated
Since medical devices are used to treat and monitor people with often life-threatening diseases and injuries, their safety and efficacy are critically important. In modern medical technology, this is reflected in such devices being classed as ‘mission-critical’ equipment.
It’s worth remembering that, for a long time, many medical procedures, surgery in particular, were considered dangerous and painful, and this only changed around 100 years ago. With the advent of electronic medical devices throughout the 20th century, greater assistance and utility have come with greater risk, too.
For example, many modern devices rely on mains electricity, which has given rise to the problem of ‘patient leakage’, where appliances like X-ray machines can pose a risk of electric shock to the patient. One response has been the development of isolation barriers, which are “required to ensure that the applied part is isolated from the ground and meets the patient leakage current limits”, as XP Power explains.
A great deal of the improvement in modern healthcare is linked to medical devices, but this also raises the issue of ensuring that all medical technology fulfils its designed purpose, is safe to use and harms neither patient nor operator.
Safety regulation pushes up production and service costs, which in turn drives up the price of healthcare and can hamper further progress. Yet there have been many instances of poorly regulated medical devices leading to injury and even death. For this reason, medical devices are organised into three risk classes: Class I (low-risk), Class II (intermediate-risk) and Class III (high-risk).
Given the universal demand for life-altering medical technology, it’s important to ensure that the basic requirements and design of medical devices meet international standards for marketing worldwide.
If, therefore, a new medical tool is proposed for critical surgical operations, regulations are there to make sure it meets the requirements applied to all medical technology, so that its use and viability are transparent. This in turn leads to fairer pricing and market access, as devices are valued against a universal benchmark.
The best known of these are the ISO (International Organisation for Standardisation) standards, but you can find a comprehensive list of other regulations on the website of the International Medical Device Regulators Forum (IMDRF).