AfterMath

The Rumblings

From subtle murmurs to sudden booming shifts, the marketing and advertising landscape changes on the reg. Check out our perspectives on what’s shaking things up.

Evaluating Our Preconceived Notions About AI: Part 1

Your Introduction to Artificial Intelligence  

Where did you first encounter Artificial Intelligence (AI)? Was it through fiction, like the idea of AI robots surpassing human intelligence and taking over? Or maybe you heard about AI through technology like virtual assistants. You may even have one in your home. 

New developments in AI happen every day, making the AI conversation harder to escape. In fact, by the time you're done reading this, there may already be newer news. 

So, why are we hearing about it more now?

Well, today AI is an integral part of our lives. From voice-controlled smart speakers to personalized recommendation systems on e-commerce platforms, its applications span various industries, including healthcare, finance, transportation and entertainment. It powers virtual assistants like Siri and Alexa, enables autonomous vehicles and improves medical diagnostics — and these are only a few examples. 

The field faced challenges during what became known as the "AI Winter" from the late 1980s to the early 2000s, due to overhyped expectations not aligning with actual technological capabilities. However, breakthroughs in machine learning algorithms, fueled by increasing computational power and access to vast amounts of data, sparked a resurgence of interest in AI during the last two decades. 

The Metaverse—a digital realm where users can interact with each other and their surroundings in virtual or augmented reality—has gained increased attention in recent years. As this virtual world continues to evolve, the role of AI will become increasingly crucial. 

As the Metaverse expands, AI will undoubtedly play an integral role in shaping its development. By combining the power of AI with virtual reality and augmented reality technologies, we can unlock new possibilities for immersive experiences in digital worlds. However, much like during the "AI Winter," Metaverse advancements have not caught up to the hype Big Tech created around Metaverse-related projects. Many Big Tech companies, including Meta, have slowed their Metaverse work to focus on AI instead. 

As AI technology advances at an unprecedented pace, we can expect AI to play an increasingly prominent role in shaping the Metaverse's future, but that's only scratching the surface. 

Brief History of AI  

Artificial Intelligence has a rich and fascinating history that stretches back several decades. The origin of AI can be traced back to the Dartmouth Conference in 1956, where the term "artificial intelligence" was coined. This conference marked the birth of AI as an academic discipline. 

However, the idea of machines exhibiting intelligent behavior can be traced back to ancient times. Philosophers and scientists have been contemplating the concept of artificial beings with human-like intelligence for centuries. 

Before 1949, computers lacked a key prerequisite for intelligence: They could execute commands but not store them. In other words, computers could be told what to do but couldn't remember what they did. The arrival of stored-program computers removed that barrier and laid the groundwork for machines that could learn. 

Timeline Breakdown:

1950s and 1960s
Thanks to the invention of computers, scientists and researchers began exploring ways to create machines that could simulate human intelligence and perform tasks such as problem solving, pattern recognition, and decision making. 

1960s and 1970s
AI made significant progress as researchers developed rule-based systems that could mimic human expertise in specific domains, leading to applications like medical diagnosis and automated decision making. 

1980s and 1990s
AI research shifted toward more data-driven approaches: machine learning algorithms emerged that enabled computers to learn from data and improve their performance without explicit programming. 

1990s and 2000s
The turn of the century came with the emergence of big data. With the exponential growth of digital information, AI algorithms gained access to vast datasets that further fueled their learning capabilities. 

BONUS: Popular Media That Features AI Technology   

Have you seen any of these popular movies depicting AI technology? We promise there will be no spoilers. 

The Terminator (1984), Terminator 2: Judgment Day (1991) and Terminator 3: Rise of the Machines (2003)  
These movies star Arnold Schwarzenegger as the "Terminator," a highly advanced cyborg assassin sent back from the year 2029 to quite literally terminate a mother whose unborn child is humanity's future savior. Don't worry; we are still far from this kind of robot technology. But who knows what may happen by 2029? 

Resident Evil (2002)  
A commando team is tasked with breaking into "The Hive," a vast underground genetics laboratory where a deadly virus has turned the personnel into zombies. The team has just three hours to shut down the lab's supercomputer, which looks and sounds like a little girl, before the virus gets out and threatens to overrun the Earth. Super AI and zombies; yikes.  

I, Robot (2004)  
Loosely based on Isaac Asimov's short-story collection of the same name, this robot-takeover movie stars Will Smith as a detective in the year 2035 who comes face to face with the robot behind the uprising. Again, we are still far from the year 2035, or, at least, from sentient robots that enslave the human race. 

 
This was part 1 of 2 on “Evaluating Our Preconceived Notions About AI.”

Check out part 2 as we evaluate more misconceptions surrounding AI while separating fictional AI from real-world applications.

Have Questions? Contact our Executive Director of Connections, Mike Pocci, at mpocci@teamaftermath.com.

Research & Strategy