How To Tell What Is And Isn’t Artificial Intelligence

By: Ashley Sams


March 28th, 2018


At the Marketing AI Institute, we read dozens of articles on artificial intelligence every week to uncover the most valuable ones for our subscribers (become a subscriber today), and we curate them for you here. We call it 3 Links in 3 Minutes. Enjoy!

1. Becoming a Professional AI Tool Investigator

We’ve touched on the topic of cutting through the hype surrounding artificial intelligence before, but with more companies offering “AI-powered solutions,” it can be hard to tell which businesses are actually using AI and which aren’t. AdWeek offers some tips for marketers on the hunt for tools actually fueled by AI.

Jason Carmel, global chief data officer for Possible, said it best: “Any time marketing technology makes a decision that a human normally would, it’s being driven by some form of machine learning.” The important distinction here is that the machine is making the decision in real time, not a human.

While investigating an AI-driven technology, marketers should admit to being ignorant on the subject and ask as many questions as possible. Austin Miller, director of product marketing at Oracle Marketing Cloud, recommends, “When a vendor refers to ‘algorithms’ making ‘smart decisions’ for your business, ask them what those algorithms are, which specific decisions are made, and what the readout is for a human to validate them. Ask them for examples of businesses currently using that technology and, even better, ask to talk with them.”

Lastly, the worst thing a marketer can do is be afraid of sounding stupid and stay silent. Understanding the product and the technology that powers it is essential to making informed decisions for your organization.

 
2. How To Use The Google Lens In Your Pocket

One of the most talked-about apps right now is Google Lens, and for good reason. By tapping into Google’s AI, the app can recognize items through your phone’s camera and take action on them. The Lens update has been rolling out to Android and iOS users over the last month.

To highlight what exactly having this type of machine learning in your pocket means, Gizmodo dove into all of its capabilities.

Using the Google Photos app, you can apply Lens to any photo you’ve already taken and it can identify most common objects. Gizmodo’s writers had it successfully identify a cappuccino, a MacBook Pro, a daffodil, and a pint of beer. It was also able to identify more specific images such as “The Bean” sculpture in Chicago and LS Lowry’s painting Going to the Match.

Additionally, Lens can pick up text such as email addresses, phone numbers, and map addresses, and launch another relevant app with one more tap. A great example of how this would be useful is snapping a photo of an event poster. Lens can tell you where the event is in Google Maps and use the text on the flyer to add an event to your calendar.

Users are still waiting on functionality to complete some of the tasks Google demoed at last year’s I/O event, such as being able to wipe away a chain link fence to see a clear photo of what’s behind it or snapping a photo of a Wi-Fi router’s code and connecting to that network. However, while Lens still has a long way to go, the recent updates show strong progress. And, as with all AI, the more it is used (and trained), the faster it will improve.

3. Reducing Food Waste With An AI App

AgShift, a food inspection technology startup in California, has started using deep learning to fight food waste, according to AgFunder News.

Without deep learning, inspecting produce quality is time-consuming and subjective. Workers receive a pallet of fresh produce, unwrap it, remove a few packages, and inspect the produce with the naked eye or a ruler, evaluating size, color, and amount of bruising. This leads to inconsistencies in the quality of produce arriving at stores and markets, ultimately leading to food waste.

Thanks to a recent round of funding led by Exfinity Ventures, AgShift now has $2 million in seed money to apply deep learning to this process. With each new crop, the AgShift team takes six weeks to refine the algorithms that tell the app what to look for. From there, workers simply photograph the produce, and the app measures the color, size, and extent of bruising, then issues a USDA grade.

Founder Miku Jha says the app has cut inspection time nearly in half for each team. It also has not replaced any workers, just increased their capacity. Since the industry has never done inspections this way, Jha hopes that photographing the produce will also reduce or eliminate claims, which are major contributors to food waste.

Get free access to the Ultimate Beginner's Guide to AI in Marketing: https://www.marketingaiinstitute.com/beginners-guide-access

About Ashley Sams

Ashley Sams is a consultant at PR 20/20. She joined the agency in 2017 with a background in marketing, specifically for higher education and social media. Ashley is a 2015 graduate of The University of Mount Union where she earned a degree in marketing.
