

Read This Before You Upload a 10-Year-Old Photo of Yourself



At the Marketing AI Institute, we read dozens of articles on artificial intelligence every week to uncover the most valuable ones for our subscribers, and we curate them for you here. Enjoy!


Is the #10YearChallenge just a harmless meme?

If you signed on to any social media platform in the last week, you’ve probably noticed the #10YearChallenge that’s currently trending. Users all over the world are sharing portraits of themselves today, in 2019, beside ones from 2009. Seems harmless, right? Kate O’Neill, author and founder of KO Insights, begs to differ.

Screenshot of Kate O'Neill's tweet regarding the #10YearChallenge

Her tweet sparked an opinion piece in WIRED, in which she explains the reasoning behind her thesis. The main rebuttal she received was that Facebook already has access to users’ profile photos anyway.


However, if someone wanted to train a facial recognition algorithm on age-related characteristics or age progression, mining Facebook for a usable dataset of photos would be difficult: users don’t always upload photos in chronological order, they often repurpose old photos, and some uploads are scans of pictures taken long before they were posted.


The #10YearChallenge, by contrast, created a clean, labeled dataset of “carefully curated photos of people from roughly 10 years ago and now.”
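To make the dataset point concrete, here is a minimal, purely illustrative sketch (in Python) of why these posts are so convenient as training data: each one hands a model two photos of the same person with a known, roughly fixed 10-year gap, which is exactly the clean label that photos scraped from a profile rarely carry. The `AgePair` structure, the `build_dataset` helper, and the assumed `posts` input format are hypothetical, not anything described in O’Neill’s piece.

```python
from dataclasses import dataclass
from typing import Iterable, List


@dataclass
class AgePair:
    """One labeled training example: the same person roughly 10 years apart."""
    person_id: str        # hypothetical identifier for the poster
    photo_then_path: str  # e.g. the 2009 portrait
    photo_now_path: str   # e.g. the 2019 portrait
    years_elapsed: int    # the known, fixed gap that makes the label clean


def build_dataset(posts: Iterable[dict]) -> List[AgePair]:
    """Turn #10YearChallenge-style posts into supervised before/after pairs.

    `posts` is assumed to be an iterable of dicts with 'user', 'then', and
    'now' keys -- a stand-in for whatever a scraper would actually return.
    """
    return [
        AgePair(
            person_id=post["user"],
            photo_then_path=post["then"],
            photo_now_path=post["now"],
            years_elapsed=10,
        )
        for post in posts
    ]
```

Scraped profile photos, by comparison, would need messy guesswork about when each picture was actually taken before they could be paired up like this.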


O’Neill goes on to offer several likely use cases for facial recognition and how each scenario could play out. Read them here.

What will 2019 hold for AI? These experts have an idea.

To kick off the new year, VentureBeat spoke with AI visionaries about their predictions for artificial intelligence in 2019.


Andrew Ng is the cofounder of Google Brain and founder of Landing AI, among other impressive achievements. His experience transforming tech giants into AI companies has given him a unique perspective on AI’s progression. His main prediction is that the coming year will bring significant progress for AI in companies outside the software industry.


“I think the next massive wave of value creation will be when you can get a manufacturing company or agriculture devices company or a healthcare company to develop dozens of AI solutions to help their businesses.”


VentureBeat also spoke with Hilary Mason, Cloudera’s general manager of machine learning. She hopes 2019 brings business leaders a better understanding of how AI can be incorporated across their companies.


“I think in the same way we expect all of those people to be minimally competent using something like spreadsheets to do simple modeling, we will soon expect them to be minimally competent in recognizing where AI opportunities in their own products are.”

For more insights from Ng, Mason, and others, check out the full article here.
