In an era where technology is evolving faster than ever, it's no surprise that artificial intelligence (AI) has quietly worked its way into our lives through the software and devices we use every day. In a recent article, ExpressVPN explored how an AI technology known as deep learning has been used to create deepfakes. According to the cybersecurity and virtual private network company, deepfakes can alter our perception of reality and even change how we remember things.
Deepfakes and memory manipulation
In its article, ExpressVPN explored a phenomenon known as the Mandela Effect, in which large groups of people remember certain events—namely the life and death of former South African president Nelson Mandela—differently from how they actually happened. The term was coined after a paranormal researcher recalled reading news coverage of Mandela dying in prison, even though he was released in 1990 and died in 2013.
From misquoting movie lines to remembering historical events differently, such false memories could be implanted on a massive scale with the help of AI. After all, AI systems rely heavily on the information supplied by their creators and programmers, and false information can easily make its way into what an AI system learns.
Suddenly, a generation may think that Darth Vader said, "Luke, I am your father," even though the actual line is, "No, I am your father." With AI able to distort our view of the past, the present, and the future, reality becomes slippery, and the truth becomes a playground for AI's mischief.
Deepfakes creating a reality rabbit hole
With virtual reality (VR) and augmented reality (AR) technologies powered by AI, we can dive headfirst into alternate realms and explore fantastical dimensions. AI-driven simulations can transport us to places we've never been, introduce us to people who never existed, and immerse us in realities that defy the laws of physics.
Suddenly, our perceptions become malleable, and distinguishing between what's real and what's AI-induced becomes a frustrating game of trying to separate reality from fiction.
Most recently, photos of the Pope in a puffy white Moncler-inspired jacket circulated online. The photos were so convincing that many people believed them to be real and even applauded the Pope for his relatively daring fashion choices. On Instagram, an AI artist went a step further by creating photos of former celebrity couples and the children they might have had if they hadn't broken up. Whatever people make of these images, they show just how convincing AI-generated photos have become.
Deepfakes as a source of false information
In 2018, a video of former U.S. president Barack Obama made the rounds on the internet. In it, Obama appeared to make disparaging remarks about his successor. The video, however, was fake: it was created with actor and director Jordan Peele providing a voice-over, which was then synced to footage of Obama speaking. The video was media company BuzzFeed's way of warning internet users everywhere about the dangers of deepfakes and false information. It was also a stark reminder not to believe everything we see online.
As AI continues to weave its way into our lives, altering our memories and reshaping our perception of reality, we find ourselves in a brave new world of uncertainty and wonder. The line between fact and fiction blurs, leaving us to ponder what it means to be human in an age when our memories can be manipulated. AI content creators and programmers, then, must be mindful of the responsibility they bear for what they build and of the potential harm their creations could cause.