Q&A: Microsoft’s AI for Good Lab on AI biases and regulation

April 29, 2024
in Health

The head of Microsoft’s AI for Good Lab, Juan Lavista Ferres, co-authored a book of real-world examples of how artificial intelligence can be used responsibly for the benefit of humankind.

Ferres sat down with MobiHealthNews to discuss his new book, how to mitigate biases in the data fed into AI models, and recommendations for regulators creating rules around AI use in healthcare.

MobiHealthNews: Can you tell our readers about Microsoft’s AI for Good Lab?

Juan Lavista Ferres: The lab is a completely philanthropic initiative in which we partner with organizations around the world; we provide our AI skills, our AI technology and our AI knowledge, and they provide the subject matter experts.

We create teams combining those two efforts, and collectively, we help them solve their problems. This is extremely important because we have seen that AI can help many of these organizations with many of these problems, and unfortunately, there is a big gap in AI skills, especially in nonprofit organizations or even government organizations that are working on these projects. Usually, they don’t have the capacity or structure to hire or retain the talent that is needed, and that’s why we decided to make this investment on our side, a philanthropic investment, to help the world with those problems.

We have a lab here in Redmond. We have a lab in New York. We have a lab in Nairobi. We also have people in Uruguay and postdocs in Colombia, and we work in many areas, health being one of them, a very important area for us. We work a lot in medical imaging, such as CT scans and X-rays, and in areas where we have a lot of unstructured data, such as text. We can use AI to help these doctors learn more or better understand the problems.

MHN: What are you doing to ensure AI is not causing more harm than good, especially when it comes to inherent biases within data?

Ferres: That is something that is in our DNA. It is fundamental for Microsoft. Even before AI became a trend in the last two years, Microsoft was investing heavily in areas like responsible AI. Every project we have goes through a very thorough responsible AI review. That is also why it is so fundamental for us that we will never work on a project if we don’t have a subject matter expert on the other side. And not just any subject matter experts; we try to pick the best. For example, we are working on pancreatic cancer with Johns Hopkins University. These are the best doctors in the world working on cancer.

The reason this is so critical, particularly in relation to what you mentioned, is that these experts are the ones with the best understanding of the data collection and any potential biases. But even with that, we go through our responsible AI review. We make sure that the data is representative. We just published a book about this.

MHN: Yes. Tell me about the book.

Ferres: I talk a lot in the first two chapters specifically about potential biases and the risk of these biases, and unfortunately there are a lot of bad examples for society, particularly in areas like skin cancer detection. A lot of skin cancer models have been trained on white people’s skin, because that is usually the population with more access to doctors and the population usually targeted for skin cancer screening, and that’s why people with other skin tones are underrepresented in the data.
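
(As a rough illustration of the kind of representation check Ferres describes, and not the lab’s actual tooling: the hypothetical sketch below audits how each skin-tone group is represented in a dataset and how a model performs on each group separately, since an overall accuracy number can hide poor performance on underrepresented groups.)

```python
# Hypothetical sketch of a dataset-representation audit (illustrative data
# and column names only, not a real model or dataset).
import pandas as pd

df = pd.DataFrame({
    "skin_tone": ["I", "I", "I", "II", "II", "V", "VI"],  # Fitzpatrick-style groups
    "label":     [1,   0,   1,   1,    0,    1,   0],      # ground truth
    "pred":      [1,   0,   1,   1,    1,    0,   0],      # model output
})

# 1) Representation: is any group barely present in the data?
print(df["skin_tone"].value_counts(normalize=True))

# 2) Per-group accuracy: the overall number can look fine while the
#    underrepresented groups are served poorly.
per_group_accuracy = (
    df.assign(correct=df["label"].eq(df["pred"]))
      .groupby("skin_tone")["correct"]
      .mean()
)
print(per_group_accuracy)
```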

So, we do a very thorough review. Microsoft has been leading the way, if you ask me, on responsible AI. We have our chief responsible AI officer at Microsoft, Natasha Crampton.  

Also, we are a research organization, so we will publish the results. We go through peer review to make sure that we’re not missing anything, and in the end, our partners are the ones who ultimately have to understand the technology.

Our job is to make sure that they understand all these risks and potential biases.

MHN: You mentioned the first couple of chapters discuss the issue of potential biases in data. What does the rest of the book address?

Ferres: So, the book is about 30 chapters. Each chapter is a case study, and there are case studies in sustainability and case studies in health. These are real case studies that we have worked on with partners. But in the first three chapters, I do a thorough review of some of the potential risks and try to explain them in a way that is easy for people to understand. I would say a lot of people have heard about biases and data collection problems, but sometimes it’s difficult for people to realize how easy it is for this to happen.

We also need to understand that, even from a bias perspective, the fact that you can predict something doesn’t necessarily mean it is causal. A lot of people understand and repeat that correlation doesn’t imply causation, but they don’t always grasp that predictive power doesn’t imply causation either, and that even explainable AI doesn’t imply causation. That’s really important for us. Those are some of the examples that I cover in the book.
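
(The distinction is easy to see in a toy simulation. The sketch below is purely illustrative and not from the book; it constructs a hidden confounder that drives both a marker and an outcome, so the marker predicts the outcome almost perfectly even though intervening on the marker would not change the outcome at all.)

```python
# Purely illustrative sketch: high predictive power without causation,
# driven by a hidden confounder. Variable names are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

severity = rng.normal(size=n)                  # hidden confounder
marker = severity + 0.1 * rng.normal(size=n)   # observed feature, driven by severity
outcome = severity + 0.1 * rng.normal(size=n)  # outcome, also driven by severity

# The marker is an excellent predictor of the outcome...
print("corr(marker, outcome):", round(float(np.corrcoef(marker, outcome)[0, 1]), 3))

# ...but the outcome is generated from severity alone, so setting the marker
# to any value (an intervention) leaves the outcome distribution untouched:
# prediction here tells us nothing about what would happen if we changed
# the marker.
```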

MHN: What recommendations do you have for government regulators regarding the creation of rules for AI implementation in healthcare?

Ferres: I am not the right person to talk to about regulation itself, but I can tell you that, in general, it comes down to having a very good understanding of two things.

First, what is AI, and what is not? What is the power of AI? What is not the power of AI? I think having a very good understanding of the technology will always help you make better decisions. We do think that technology, any technology, can be used for good and can be used for bad, and in many ways, it is our societal responsibility to make sure that we use the technology in the best way, maximizing the probability that it will be used for good and minimizing the risk factors.  

So, from that perspective, I think there’s a lot of work to be done on making sure people understand the technology. That’s rule number one.

Listen, we as a society need to have a better understanding of the technology. What we see, and what I see personally, is that it has huge potential. We need to make sure we maximize that potential, but also make sure that we are using it right. And that requires governments, organizations, the private sector and nonprofits to start by understanding the technology, understanding the risks and working together to minimize those potential risks.
