Artificial Intelligence in Healthcare: Is your Personal Health Data Protected?

Transcription

AI in healthcare poses privacy threats as companies like Google and Amazon gain access to personal health information. A lack of regulation allows them to use this data for their own benefit. Examples include DeepMind obtaining patient records and tech giants striking deals with hospitals. The healthcare industry is profitable, providing a financial incentive for AI companies. The legal framework, such as HIPAA, allows sharing of identifiable information for treatment purposes. Women's healthcare is particularly affected by the lack of privacy. Dr. Annie Lyerly discusses the ethical concerns and potential harms, including discrimination and restrictions on reproductive healthcare. The current system needs increased protection to prevent the leakage of private patient information.

How would you feel if Google, Amazon, or IBM could see the private details of your latest visit to the doctor? What about your nearest Tesco or Sainsbury's having access to your personal health information when you are checking out? What if your employer had your health data stored within their systems' algorithms? Patient data is becoming less secure as artificial intelligence's presence in the healthcare sector has grown. How do we move forward with this information and work to have better oversight over AI?

Hello, and welcome to Isa Investigates, the podcast where we tackle the biggest ethical headlines in artificial intelligence today. I am your host, Isabelle Garrett, and today we are going to talk about the privacy threats arising from AI's implementation in healthcare.

But first, what is artificial intelligence, and how is it being used in healthcare? AI is the general ability of machines to acquire information from their environment to perform specific tasks. Artificial intelligence encompasses machine learning, the ability of computers to learn from data to perform a task without being explicitly programmed.

So how is AI impacting the healthcare field? Well, there is a growing understanding that the development of artificial intelligence technology is far outpacing the implementation of safety measures to protect the data that AI is collecting. For example, there is currently no global regulatory policy for AI in healthcare, which is an issue, as private companies could put the data they collect to nefarious uses. One example: in 2015, DeepMind, Google's AI branch, obtained the personal records of 1.6 million patients at the Royal Free London NHS Foundation Trust through the Streams app, which is used to identify kidney failure. In obtaining data from hospitals to use in creating healthcare tech, like this valuable kidney-failure-detecting app, companies can amass large pools of data that they can use for their own benefit. A more recent example, from 2020: tech giants such as Microsoft, IBM, and Google have all struck deals with hospitals that provide them access to patient health information. Amazon Web Services employees, for instance, were given patient health information to develop software meant to read medical notes, but that patient health information was identifiable, meaning the individual people could be recognized and were not anonymous.

In the U.S., where I am from, national health expenditure grew 4.1% in 2022, reaching $4.5 trillion, or $13,493 per person. Overall, spending on healthcare in the U.S. accounted for 17.3% of gross domestic product, or GDP.
The healthcare industry is clearly a profitable market, based on these statistics, and there is a great financial incentive for AI companies to be in this industry, an industry in which they can gain an edge with the personal data they collect and use for their own systems.

Now you may be asking, how is this legal? Well, in actuality, it is completely legal. HIPAA requires that identifiable information, like name and Social Security number, be protected unless records are needed for treatment, payment, or hospital operations, and all of these tech partnerships fall under those exceptions. This all connects to the idea of privacy and data protection, and the lack thereof in the current environment, where regulation has not caught up to the progression of AI. Women are being impacted by their personal information not staying personal, patients are being impacted because their medical care is no longer private, and big companies are benefiting from this entire situation.

I am fascinated by how this impacts women's health, so today I sat down with Dr. Annie Lyerly, professor of social medicine and a research professor in obstetrics and gynecology at UNC Chapel Hill. Her research focuses on social and moral issues in women's health and reproductive medicine. Dr. Lyerly has founded initiatives such as the Obstetrics and Gynecology Risk Research Group, which examines how risk is assessed and managed in the context of pregnancy, and the Second Wave Initiative, which aims to represent the health interests of pregnant people in biomedical research and drug and device policies. She completed the Greenwall Fellowship in Bioethics and Health Policy at Georgetown and Johns Hopkins Universities, has chaired the American College of Obstetricians and Gynecologists Committee on Ethics, and has co-chaired the Program Committee for the American Society for Bioethics and Humanities. Dr. Lyerly has extensive knowledge of women's healthcare and ethical policy, which is why I brought her onto the podcast for this episode.

I began by welcoming Annie to the podcast and asking her: how does a lack of personal information privacy affect women's healthcare?

Well, I think it's really worrisome for women's health care. When we think about the health information that revolves around health care, it is some of the most private and personal information that anybody has, right? So, it's one thing for somebody to know that you have asthma. It's another thing for somebody to know whether you're pregnant or not pregnant, or what your menstrual cycles are like, or what your sexual health care is like. These are some of the most deeply private and personal kinds of information that there are. So, the lack of privacy around it is really problematic for a number of reasons and worries those of us who do ethics. You know, one of the worries just has to do with a breach of privacy, right? If your health information is circulated, then you are violated in a certain way; there has been a lack of respect for your privacy and for your ability to keep your information to yourself. So, when we looked at some of the responses to the privacy breaches around the menstrual-tracking websites, a lot of women who were using them felt that they had been personally violated in a very deep way. And I think that's one reason for ethical concern, right?
But there are other kinds of deeper, I mean, not necessarily deeper, but more concrete worries that relate to the specific harms of privacy breaches. One is that, you know, pregnancy and fertility have for decades been used against women for a variety of reasons: as a reason not to let them have the kind of education they want, not to let them have the kind of job they want. This is against the law, but we know that there is a tendency to consider the likelihood of or the potential for pregnancy, for instance, in a range of decisions. And sometimes these are very tacit these days, but we know that we still live in a sexist society and that these considerations are still floating around in the heads of people who are in a position to make decisions that affect women's lives and empowerment. I think even more worrisome these days, however, is the fact that, especially in the United States, there are increasing restrictions on women's reproductive health care; specifically, access to abortion in many states is criminalized. So, people who get abortions and people who provide abortions are both at risk. And these kinds of data can allow a third party to have a sense of when and whether somebody has become pregnant and when and whether somebody's pregnancy has ended, and they could potentially be used as evidence.

What Annie is talking about is a contrast between a deontological and a utilitarian perspective. The deontological perspective that Annie is leaning towards holds that every human being deserves to be treated with dignity; this perspective would say that women's data should not be shared. On the other hand, the utilitarian perspective, which demands the greatest good for the greatest number of people, could play devil's advocate: this data is creating healthcare AI that benefits the masses and does a greater good for that large group of people, so the sacrifice of certain women's data is worth it.

Annie and I spoke at length about how we would improve the current system, so if you would like to hear the full recording, please refer to the Isa Investigates podcast website. To summarize, we discussed that in the U.S., hospitals can share patient data if they follow federal privacy laws, which offer clearly limited consumer protections and do not require identifiable information to be removed from the shared data.

So, in conclusion, we need increased protection. Artificial intelligence has the potential to create great efficiency and aid medical professionals in their work, and it can be used for good, as long as the intersection of technology and healthcare does not result in the leaking of private patient information. Thank you very much for listening, and thank you very much to Annie for coming on the podcast today. See you next week.
