The second improvement is the use of enhanced text analysis to better understand ticket data. Using natural language processing (NLP), we can analyze those tickets to match precise geographic locations to their written descriptions. Even though the textual description in a dig request ticket is only a couple of sentences, AI algorithms can often detect the patterns that dispatchers and callers use to describe a dig box that may span an entire block, cover both sides of a street, or indicate the position of an area within a customer’s parcel. This line of analysis started with keyword spotting, but with the proliferation of large language models (LLMs) such as those used in generative pre-trained transformers (GPTs), additional intent can be “read between the lines” to correlate digs within an area or other patterns that develop within a region.
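To make the keyword-spotting starting point concrete, a minimal sketch might look like the following. The phrase patterns and labels here are illustrative assumptions, not the actual ticket vocabulary or production rules:

```python
import re

# Hypothetical keyword-spotting rules for common dig-box phrasings.
# The real ticket vocabulary and rule set are not reproduced here.
DIG_BOX_PATTERNS = {
    "whole_block": re.compile(r"\b(entire|whole|full)\s+block\b", re.IGNORECASE),
    "both_sides": re.compile(
        r"\bboth\s+sides?\s+of\s+(the\s+)?(street|road|rd|st)\b", re.IGNORECASE
    ),
    "parcel_position": re.compile(
        r"\b(front|rear|back|left|right|[ns][ew]?)\s+"
        r"(corner|side|yard|of\s+(the\s+)?property)\b",
        re.IGNORECASE,
    ),
}

def spot_dig_box_hints(description: str) -> list[str]:
    """Return the labels of dig-box patterns matched in a ticket's free text."""
    return [
        label
        for label, pattern in DIG_BOX_PATTERNS.items()
        if pattern.search(description)
    ]
```

For example, `spot_dig_box_hints("Mark both sides of the street for the entire block")` flags both the block-wide and two-sided patterns. An LLM-based reader goes beyond such brittle rules by inferring the same intents from phrasings that no regex anticipated.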
A third data-based opportunity that my team investigated is the analysis of street view images. While street view imagery is sourced from capture data that is typically months or years old (how often do you see a van with four cameras and a LIDAR sensor drive by your home?), these images can provide validation for the buried facility information used in geospatial analyses. We noticed that technicians inspecting tickets for dispatch or suppression would often head to popular map websites and perform this visual analysis themselves. While this effort falls into the class of an unsupervised computer vision (CV) machine learning (ML) task, the abundance of data and the potential opportunity were too great to pass up. Automated analysis of street view images doesn’t work for every ticket the AFO team gets, but it’s a great backup to help confirm a location.
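Because capture data ages, one reasonable design is to down-weight a street-view visual match before combining it with other location signals. The sketch below assumes a simple exponential decay; the half-life value and the weighting scheme itself are illustrative, not the system’s actual method:

```python
from datetime import date

# Hypothetical decay constant: assume a street view capture's evidentiary
# value halves roughly every two years.
HALF_LIFE_DAYS = 730

def street_view_weight(capture_date: date, ticket_date: date) -> float:
    """Exponentially decay a visual-match score by the image's age at ticket time."""
    age_days = max((ticket_date - capture_date).days, 0)
    return 0.5 ** (age_days / HALF_LIFE_DAYS)
```

Under these assumptions, a capture taken two years before the ticket contributes half the confidence of a fresh one, which matches the intuition that older imagery is a backup signal rather than ground truth.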
Armed with this carefully curated data, my team developed several baseline component ML models to analyze the wide range of data types. For reproducibility and simulation, the AI solution records and archives every aspect of each 811 CBYD cable request, providing a detailed audit trail for ongoing regulatory reporting and compliance. This automated workflow has created two unique capabilities previously unavailable:
- Highly granular cost structure control made possible by “tuning” the algorithm to state-, city-, or even neighborhood-based localities. This tuning provides a critical advantage over rule-based systems, because statistically learned patterns can be evaluated for their financial and performance impacts.
- Adjusting the algorithm by type of request (damage repair vs. damage avoidance), time of year, ticket requester (or excavator), and size of the job. This allows the Field Operations team to evaluate springtime consumer home-improvement digs differently than large infrastructure overhauls happening during inclement winter temperatures.
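A minimal sketch of what such tuning could look like is a dispatch threshold resolved from the most specific locality and request-type override available, falling back from neighborhood to city to state. The keys, values, and threshold semantics here are hypothetical, not the production configuration:

```python
# Illustrative tuning table: dispatch thresholds keyed by
# (locality, request_type), with coarser geographic fallbacks.
# All keys and values below are made up for illustration.
DISPATCH_THRESHOLDS = {
    ("TX/Dallas/Uptown", "damage_avoidance"): 0.80,
    ("TX/Dallas", "damage_avoidance"): 0.70,
    ("TX", "damage_avoidance"): 0.65,
    ("TX", "damage_repair"): 0.40,
}
DEFAULT_THRESHOLD = 0.60

def resolve_threshold(locality: str, request_type: str) -> float:
    """Walk from the most specific locality to the broadest, taking the first override."""
    parts = locality.split("/")
    for depth in range(len(parts), 0, -1):
        key = ("/".join(parts[:depth]), request_type)
        if key in DISPATCH_THRESHOLDS:
            return DISPATCH_THRESHOLDS[key]
    return DEFAULT_THRESHOLD

def should_dispatch(score: float, locality: str, request_type: str) -> bool:
    """Dispatch a locator when the model's score clears the tuned local threshold."""
    return score >= resolve_threshold(locality, request_type)
```

For instance, a ticket scored 0.75 in `"TX/Dallas"` clears the city-level damage-avoidance threshold of 0.70, while the same score in `"TX/Dallas/Uptown"` does not clear the neighborhood override of 0.80. A rule-based system would need a hand-written rule for every such case; here the thresholds can be learned from local financial and performance outcomes.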
Overall, the automated systems we created to improve the detection and approval of when someone can dig, and to avoid cable cuts, save the company $13 million to $16 million a year. AT&T spends more than $250 million every year to support the Call Before You Dig infrastructure across the 21-state footprint that contains buried facilities. I was honored to be awarded the AT&T Science & Technology Medal for outstanding technical leadership in Machine Learning supporting Field Operations for my role on the CBYD project. The AT&T Science and Technology Medals are presented for achievements demonstrating remarkable technical depth or breadth and providing a unique and significant contribution to AT&T.
But it’s truly a team effort. Collaborating with all the people who helped make the CBYD project happen is a testament to our spirit of innovation.