Over the past 60 years, artificial intelligence – or AI – has transformed with each generation. It’s no longer a toy of the future or an idea in a sci-fi film. It fits right into our data-driven world.  

At AT&T Labs, we’ve been working in AI for more than 20 years.

Past

We’ve developed commercial offerings in speech technologies, virtual assistants, machine vision and contact center automation – to name a few areas. If you’re a movie buff, you’re familiar with this work without even knowing it: these technologies have been advancing image and video processing since 1995.

We’ve also applied AI to our social media and chat analytics. It helps us better understand customer emotions, sentiment, satisfaction level and intent.
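As a rough illustration of how this kind of text analytics works (not our actual pipeline – the library, sample messages and labels below are assumptions for the sketch), a simple sentiment classifier can be trained on labeled chat messages and then score new ones:

```python
# A minimal sketch of chat sentiment classification, for illustration only.
# The library choice (scikit-learn) and the sample data are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled chat messages: 1 = satisfied, 0 = frustrated
messages = [
    "Thanks, that fixed my connection right away",
    "Great support, very quick response",
    "My bill is wrong again and nobody is helping",
    "I have been on hold for an hour, this is terrible",
]
labels = [1, 1, 0, 0]

# Convert text to term weights, then fit a simple linear classifier
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(messages, labels)

# Score a new message: estimated probability the customer is satisfied
print(model.predict_proba(["still waiting on a callback about my outage"])[0, 1])
```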

Thanks to big data analytics, we’re now using these technologies to help drive smart hyper-automation. That means we’re starting to look at machine intelligence in ways we never have before.

Present

What many people don’t realize is they already interact with AI daily. If you use voice recognition on your smartphone or within a search engine to answer a question, you’re using AI.

At AT&T Labs, we’re creating a network that continuously heals and learns on its own through software agents built on AI technologies.

We’re adding AI to the foundation of our software-centric network. By embedding intelligence within the network, we’ll have the potential to create AI applications overnight. These applications range in complexity, but today building one typically takes expert teams several months.

Today, we are continuing to research and develop intelligent systems that can adapt, learn and mature over time. This is a different approach from the industry standard of building task-oriented intelligent systems, which address only a specific problem and can’t learn to “become smarter.”

A simple example of this is call routing: a machine is designed to perform a few routes, but it only knows those routes. It won’t change or improve by itself; someone has to redesign and redevelop it.

AI systems, by contrast, can ingest broader sets of data and learn to do more than a single task. They can learn, predict and improve from data with little or no human intervention.
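To make the contrast concrete, here’s a minimal sketch – our own illustration with hypothetical topic and queue names, not a real routing system. The first router only knows the table it was given; the second updates itself from the outcome of every interaction:

```python
from collections import defaultdict

# Task-oriented routing: the system only knows the routes it was given and
# never improves unless someone edits this table by hand.
STATIC_ROUTES = {"billing": "billing_desk", "outage": "network_ops"}

def static_route(topic: str) -> str:
    return STATIC_ROUTES.get(topic, "general_queue")

# Learning-based routing: keep simple resolution statistics per (topic, queue)
# and shift traffic toward whichever queue resolves that topic most often.
class LearningRouter:
    def __init__(self, queues):
        self.queues = queues
        self.successes = defaultdict(lambda: defaultdict(int))
        self.attempts = defaultdict(lambda: defaultdict(int))

    def route(self, topic: str) -> str:
        # Choose the queue with the best observed resolution rate for this topic.
        def score(queue):
            tried = self.attempts[topic][queue]
            return self.successes[topic][queue] / tried if tried else 0.5
        return max(self.queues, key=score)

    def record_outcome(self, topic: str, queue: str, resolved: bool):
        # Every customer interaction updates the statistics, so routing
        # improves over time without anyone rewriting the rules.
        self.attempts[topic][queue] += 1
        if resolved:
            self.successes[topic][queue] += 1
```

The second router gets a little better with every interaction – exactly the property the static table lacks.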

Just think: How can our contact centers get smarter with every customer interaction?

They can’t today, but AI promises to make that vision possible.

Future

In the future, AI will improve the customer experience by shortening the time it takes to resolve an issue. These intelligent systems will offer a personalized, connected experience for every customer.

They will analyze and predict how customers feel, provide automated responses to address their issues, and proactively recommend the offers and services most appealing to each customer.

AI is evolving at AT&T through 3 generations of application types:

  • Speech recognition: Generation 1 began with the innovation of voice apps and speech recognition technology. This includes voice recognition, voice biometrics, natural language processing and video processing.
  • Network transformation: The bulk of Generation 2 is the transformation of our network from hardware to software. Ultimately, our network will be more self-healing and self-learning through the power of AI. This means we’ll be able to better predict network issues and solve them before they happen so our customers are not affected. 
  • IoT and Big Data: We’re already starting to work in Generation 3. We’re working to house IoT, security and big data services within these AI platforms. Imagine your home appliances communicating with each other and making shopping requests to prepare a meal. Or your car driving itself to the shop for repairs and back home without you lifting a finger. The possibilities are endless.

Saving lives

Think about the future of AI this way: it has the potential to help treat an illness before a patient even shows symptoms.

With sensors capturing data during the early stages of a disease, doctors will be better able to address problems for their patients before they become serious. The vision is that patients will be able to carry lightweight sensors that capture their vital signs.

Today, smart watches can monitor your heart rate, blood pressure and the number of steps you’ve taken in a day.

Tomorrow, smart devices may be able to collect more advanced measurements, such as frequent blood analyses, glucose readings and more. And the technology would collect these signals routinely, much like the checks you get at a doctor’s visit.

AI will help identify key anomalies and early signs of trouble, drastically reducing hospital visits and wait times. Most importantly, it will surface early signs of health concerns faster than waiting for an annual doctor’s visit. It could even recommend the best course of treatment based on billions of data points.
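As a simple sketch of how such anomaly detection could work (the readings, window size and threshold below are made-up assumptions, not a medical algorithm), a monitor can flag any reading that deviates sharply from the patient’s own recent baseline:

```python
# Illustrative sketch of anomaly detection on a vital-sign stream.
# The sample readings and the threshold are assumptions for the example.
import statistics

def flag_anomalies(readings, window=20, z_threshold=3.0):
    """Flag readings that deviate sharply from the recent baseline."""
    flags = []
    for i, value in enumerate(readings):
        history = readings[max(0, i - window):i]
        if len(history) < 5:          # not enough history to judge yet
            flags.append(False)
            continue
        mean = statistics.fmean(history)
        stdev = statistics.stdev(history) or 1e-9
        flags.append(abs(value - mean) / stdev > z_threshold)
    return flags

# Hypothetical resting heart-rate samples with one sudden spike
heart_rate = [62, 64, 63, 61, 65, 63, 62, 64, 66, 63, 62, 118, 64, 63]
print([hr for hr, bad in zip(heart_rate, flag_anomalies(heart_rate)) if bad])
```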

All of this depends on patients controlling and sharing their own data and information in a safe, secure way. AI can help by understanding to whom and where the data should be released, and whether the recipient is a trusted party.

This is what the new world might be like – and it’s very exciting.

We believe the work we’re doing at AT&T Labs with others in the industry, with startups and universities, and with the open source community, will help us bring this new world to life faster than ever imagined.        

Mazin Gilbert - Assistant Vice President, Intelligent Systems and Platform Research, AT&T Labs 
