By: Manuel García-Herranz

At UNICEF Innovation, we’re supporting our partners to help build models that give us a better understanding of empathy. We hear in the news about how Artificial Intelligence (AI) can be a tool for evil: how robots could take over the world, or how automation will take away jobs. Often that fear comes from imagining a purely mechanical kind of AI. But if AI can understand how human empathy works, we have the potential to build intelligences that help us understand ourselves.

UNICEF deals with a world that is increasingly fractured and troubled, and one of our core missions is to help people understand this complex world and how they can help fix it. We are asking our friends and supporters to spend some of their time training a piece of AI being built at MIT’s Scalable Cooperation Lab, so that it can better understand elements of human emotion.

Deep Empathy gets us closer to the realities of those that suffer the most, by helping us imagine what neighbourhoods around the world would look like if hit by a disaster. ©MITmediaLab

The end result we’re looking for? Publicly available, published models, and a deeper understanding of what AI can teach us about ourselves: how an image can help us empathize and connect with the reality of others, or how AI can help us distinguish fake news from real news when a disaster strikes.

Finally, these models can help connect a separated world by using the power of AI to operate at the scale of the problems we face. With 11 major emergencies active around the world right now, it’s difficult for people to keep track of every issue or situation, to connect them in a logical way, and to empathize with the realities of those suffering. We believe technology can support us in finding the commonalities behind those disasters, and the commonalities that connect us all.

If you visit this project’s website, Deepempathy.mit.edu, and click on the images where suggested, you will build up a catalogue of what drives your empathy towards a series of situations. This helps train a machine, much as computers learn to recognize street signs every time you solve a captcha on a new website.

Can empathy be facilitated through images that combine the familiarity of the cities we live in with traces of the realities of those that live far away? Can AI help make that translation at scale? ©MITmediaLab

Find out more at Deepempathy.mit.edu, and spend a few minutes of your time helping humanity understand its own capacity for empathy through the lens of AI.
