Saturday, April 27, 2024

Will AI Change the Way We Perceive History?

By: Eastern Eye Staff

Google’s Gemini is in hot water because of a perceived bias in favor of people of color. While there’s nothing wrong with artificial intelligence showing good role models in a range of colors, pundits slammed the service for generating historically inaccurate pictures. Examples cited include a female pope and dark-skinned Vikings.

It seems that Gemini is too woke for its own good, which is something of a refreshing change, because other models, like Stable Diffusion, seem to err the other way. However, it does raise an interesting question: will AI change the way we perceive history?

In this post, the SupportYourApp team looks at this in greater depth.

Can Perceptions Change the Past?

We all know that the victors write history in their own image. Even the ancient Egyptians engaged in some rewriting of the past. Hatshepsut, one of the few women to rule as pharaoh, was almost erased from history by her successors. The same is true of the heretic pharaoh, Akhenaten.

Today, we read about Akhenaten and might consider him enlightened. Perhaps his own court felt the same way. However, after he died, there was a big shift back to the old ways.

His successors tried to stamp out all traces of him, going so far as to dismantle his palace and temples at Thebes, and his name was chiseled off monuments. People born fifty years after his death probably didn’t even know he had existed.

A more recent example is what happened in South Africa after the apartheid government fell. Before that, the government had branded Nelson Mandela a terrorist. After the ANC came to power, he became the country’s first black president.

The history books had to change as a result. So, new generations learned history from a whole new perspective, which changed many opinions on what happened.

So, while perceptions don’t change what actually happened, they can shape what we believe. One person might have grown up believing that the ANC in South Africa was a terrorist organization. Their children might have learned that the organization was fighting for basic human rights.

Both the parent and child are talking about the same history, but perceive it differently. Therefore, we have to be careful in managing perceptions of history so as not to distort the facts.

How Can AI Be Helpful Here?

Artificial intelligence doesn’t have emotions, so it can be useful in reviewing historical events. In theory, it should be able to do so without any cultural or human bias. It could, therefore, be useful in bringing all the facts together.

Its data-processing capabilities could also prove helpful in gathering several accounts of the same events. Say, for example, you have a database of struggle fighters’ letters from the apartheid era. AI could scan those, along with historical accounts, newspaper articles, and census records from the period, to produce a reasonably good sentiment analysis of the time.
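As a toy illustration of that idea, the sketch below scores snippets of text against small positive and negative word lists. The word lists and sample documents are invented for this example; a real system would rely on far more sophisticated language models, not a hand-built lexicon.

```python
# Toy lexicon-based sentiment scorer -- a deliberately simplified stand-in
# for the kind of analysis an AI system might run over archival documents.
# The word lists below are illustrative assumptions, not a real lexicon.
POSITIVE = {"hope", "freedom", "victory", "peace", "justice"}
NEGATIVE = {"fear", "arrest", "violence", "ban", "prison"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]: positive minus negative word counts,
    divided by the total number of sentiment-bearing words found."""
    words = [w.strip(".,!?;:") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

# Hypothetical snippets standing in for letters and newspaper articles.
documents = [
    "There is hope that freedom will come despite the fear.",
    "Another arrest, more violence, and a new ban on meetings.",
]
scores = [sentiment_score(d) for d in documents]
```

Averaging such scores across thousands of letters, articles, and records from a period is one crude way to chart the mood of an era, though, as the post notes, human oversight is still needed to catch propaganda and context a word list cannot see.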

It could scan documents in several languages to give a more unbiased view of what happened. It could, for example, give us perspectives from both sides.

However, you’d need human oversight here because AI wouldn’t be able to recognize propaganda for what it is.

How Can AI Hurt History?

Let’s take the example of the female pope that the image generator produced. It’s historically inaccurate because the Catholic Church does not ordain women as priests, so a woman cannot rise through its ranks to become pope.

Some might say that Gemini’s portrayal of a female pope was a woke statement. Today, we know the image is inaccurate because we live in a time when women cannot hold the position. We can cite it, rightly or wrongly, as one example of the patriarchy.

However, what happens if things change and the Catholic church starts to allow women to be ordained? In a hundred years’ time, it might be possible to have a female pope. One might even argue that the furor over the picture might lead to this type of change.

The downside is that at that stage, how many people will remember that women were previously barred? If women are still fighting for equal treatment, images reversing the injustices of the past might harm their cause.

Say, for example, that you saw pictures of the apartheid-era police beating white people as well as people of color. You might believe that they were a brutal regime in general rather than that they segregated the community.

Why We Need Balance

We need to find a way to achieve a balance between AI being politically correct and being historically accurate. Would it be nice to see James Bond as a person of color? Most definitely. Considering that the historical portrayals of people of color are not always fair, we need more positive role models.

AI can produce wonderful images of astronauts, scientists, and so on to inspire young people of all races. However, where a prompt calls for a specific time period, we need to ensure that AI checks for historical accuracy as well.

If we don’t acknowledge the mistakes of the past, we run the risk of repeating them. If we allow AI to rewrite history or change how we see it, we might forget the injustice and risk it occurring again.
