
They not like us: Drake bets on Australian AI sector as ‘Emotional AI’ arrives

As Kendrick Lamar prepares to headline the Super Bowl half-time show, Drake – his rival in a recent high-profile hip-hop feud – is seeking technical (and possibly emotional) support. According to All About AI, a recent trip down under saw the mega-selling Toronto artist teasing new partnerships and collaborations with Australia’s AI sector.

A report says that during a recent visit, “Drake hinted at collaborations with Australian start-ups specializing in artificial intelligence (AI) and machine learning, signaling a major leap toward blending technology with music.”

“His foray into the tech world is not just about enhancing his creative output but also about exploring how AI can revolutionize the music industry at large.”

The piece celebrates Australia’s startup culture, though it is light on the specifics of how Drake might leverage generative AI to transform music, and/or write something better than “Hotline Bling.”

However, it touches on a deep well of concern in the creative industries: the notion that AI has a place in creativity – and, as a result, in human emotional intelligence.

Welcome to emotech: the AIs some have come to fear the most

Having reached a point at which many people are comfortable letting AI think for them, it is no surprise that the next frontier is human emotion. A recent article in T-Magazine explores the capacity of so-called “Emotion AI” to “recognise, interpret, and simulate human emotions.”

“Also known as affective computing, Emotion AI is rapidly transforming the way machines interact with humans by enabling them to interpret, simulate, and respond to emotional cues,” the article says.

Emotion recognition and other emotion-based AI tools like it gather “emotional signals from facial expressions, voice tone, speech patterns, and physiological indicators like heart rate and skin conductance” – a “multi-dimensional approach” that enables AI systems to parse emotional states in real-time.

It is powered by tools like conversational AI algorithms trained to recognize speech patterns and emotional cues, and emotion recognition that uses neural networks to analyze and predict human facial expressions.
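The “multi-dimensional approach” described above can be pictured as a simple signal-fusion step. The sketch below is purely illustrative and not drawn from any real Emotion AI product: the weights, thresholds, and heart-rate normalization are invented for demonstration, and real systems would use trained neural networks rather than hand-tuned rules.

```python
# Toy sketch of multi-channel "Emotion AI" signal fusion.
# All weights and thresholds here are hypothetical, chosen only
# to illustrate combining facial, vocal, and physiological cues.

def fuse_emotion_signals(face_score: float, voice_score: float,
                         heart_rate: float) -> str:
    """Combine per-channel arousal scores (0-1) into a coarse label."""
    # Map heart rate onto a 0-1 arousal proxy (assumed 60-120 bpm range)
    hr_score = min(max((heart_rate - 60) / 60, 0.0), 1.0)
    # Weighted average across the three channels (weights are invented)
    arousal = 0.5 * face_score + 0.3 * voice_score + 0.2 * hr_score
    if arousal > 0.66:
        return "high arousal (e.g. excitement or stress)"
    if arousal > 0.33:
        return "moderate arousal"
    return "low arousal (e.g. calm)"


print(fuse_emotion_signals(0.9, 0.8, 110))  # high arousal
print(fuse_emotion_signals(0.1, 0.1, 65))   # low arousal
```

Even this toy version shows why regulators worry: the output label depends entirely on design choices (weights, thresholds, which channels to trust) that are invisible to the person being analyzed.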

Although emotion recognition and other would-be emotech have thus far been more of a fringe scientific pursuit (a pseudoscience, according to some regulators), the piece says the global market for Emotion AI is projected to exceed $90 billion by 2030.

“As this technology continues to evolve,” it says, “it becomes increasingly vital to address the growing concerns surrounding its ethical implications, particularly its use in sensitive sectors such as the legal system, national security, and military operations. From improving healthcare outcomes to transforming education, the possibilities are limitless, but the ethical and legal risks cannot be ignored.”

Creative professions not often the focus of AI regulation frameworks

The authors recommend strong cybersecurity frameworks and international regulations and guidelines for ethical use. “The European Union’s AI Act offers a potential model for regulating AI technologies, setting a precedent for the responsible development and deployment of Emotion AI.”

Opposing the use of Emotion AI for, say, warfare may be an unambiguous way to argue for regulation. But what’s at stake in emotional AI is fundamentally intangible. The relationship between creativity, emotion and legislation is difficult to navigate, let alone regulate – even as the economic and cultural implications for millions of musicians, artists, writers, filmmakers and other creative professionals become clear.

Drizzy, however, is apparently not among those creatives for whom AI presents an existential threat. Granted, his use case may end up leaning more toward real-time biometric feedback from audiences – an approach explored for at least a decade for uses such as interactive film screenings and customized concert experiences.

Or perhaps he’ll use it to generate another response to Kendrick Lamar – who, it should be noted, fired his killing shot not with AI, but with Mustard.

Article Topics

affective biometrics  |  biometrics  |  data privacy  |  emotion recognition  |  ethics  |  expression recognition  |  voice analysis
