We’ve all seen the movies. Robots replacing humans at everyday tasks like vacuuming the floors, shuttling you around the city, and flying into war zones instead of pilots. Except, it’s no longer just a cool idea or concept, it’s happening for real. So, why isn’t it happening in healthcare? Well, in fact, it is, but with exceptions.
As technology becomes more advanced and accessible, it is getting easier for businesses to streamline standardized tasks and make their workflows more efficient. In healthcare, technology can improve accuracy as well as efficiency in many tasks. So why don’t you see a robotic pharmacist or doctor? Let’s take a look at some of the reasons.
The personal touch
Perhaps one of the biggest things that people take for granted is the personal touch, or bedside manner, if you will. It’s not easy for a medical professional to tell patients the bad news that they don’t have long to live. Machines cannot express emotion, because they do not possess emotion. You may think you don’t need a pharmacist to express emotion, but many patients find solace and comfort in asking pharmacists the medical questions they feel their doctors won’t answer, or they simply value a second opinion. If you’ve read my other posts, you’ll know that pharmacists are drug specialists. We know more about medications than doctors do, because that’s our job. So when it comes to a question about medicine, the pharmacist is the one you ask. Yes, you can google your medical question or ask some software, and yes, the responses are most likely the same things a pharmacist or doctor would tell you. But it’s the human touch that makes all the difference in comforting the patient and reinforcing their decision to continue or discontinue a medication.
Can a machine make the right judgement call?
The principal trouble with machines and technology is that they cannot inherently make a judgement call. Until we develop true artificial intelligence, even machine learning can only make decisions based on statistics. The trouble is that a human life cannot be boiled down to a statistical decision. The best a machine can do is run a statistical assessment of whether or not a treatment will work, and even with machine learning, we have yet to see software that can take all the other variables into account when making a decision. I’m talking about things like eating habits, medication adherence, and the little details of daily life that affect whether a person does or doesn’t take their medicine. While we are finally able to start collecting such data, not everyone will welcome the intrusion into their privacy, and that will pose a challenge to machine automation.
Another issue with this lack of judgement is: how will a machine weigh risk against benefit? Because a machine does not understand pain, have compassion, or express emotion, it cannot judge discomfort for each individual. So how can we expect a machine to decide whether to dispense a medication to a patient or to consult the doctor about a potential alternative therapy? What data will drive those decisions? And who will control the decisions made from that data?
Who is truly in control of your healthcare? And why does that matter with replacing pharmacists and doctors with robots?
The answer is… third-party payers, or your insurance. In truth, doctors and pharmacists have very little control over which medications you get. We are constantly being shown new drugs on the market that promise to be more effective than the older ones they replace. But that doesn’t mean you’ll get those drugs. Why? Because your insurance ultimately decides what it will or will not pay for. With the advancement of technology, we are able to collect more data from patients. While we as scientists and healthcare professionals may use technology to try to improve how we treat our patients, not everyone in the industry will use that data responsibly. Why does this pose such a problem? One of the biggest fears about using multiple sources of data, or “big data,” in healthcare is that insurance companies may use that data against you. They may use your disease conditions to lock you out of specific plans you would otherwise have qualified for, the very plans that would cover treatment for your condition.
System flaws in technology
Everything in science is data driven, and improving technology is a welcome addition for collecting more data. However, technology and data by themselves are not enough to replace a human with a machine. In healthcare, we should be using technology to work with us, not replace us. We also have to be extremely careful with data in healthcare, because sensitive patient information is at risk. Technology always carries security risks, and holding more data increases that risk. There is also the risk of data manipulation. Human beings are able to interpret data, but we do not make decisions based on data alone. As brilliant as a machine or computer can be, even with machine learning it is simply collecting more data in order to make a better statistical decision. If someone were able to tamper with that data and feed in false information, we could create more problems than we are trying to solve.
Looking forward into the horizon
In the end, technology is a double-edged sword. There is limitless potential, yet also the potential for limitless harm. It is up to humanity to step in, to modulate how we apply technology, and to decide what its limits should be. Automation with robotics and machines may work great for many industries, but in healthcare, it can never truly replace people like doctors, pharmacists, and nurses, because there will always be a risk of system malfunctions and failures. Would you go to a hospital or clinic that had only robots and machines instead of doctors and nurses? Or a pharmacy with a robot behind the counter and machines whirring around counting your medicine? Do you really think that machines and software are that trustworthy and free of system flaws? I’ll leave you to think about it until next time.