Contributing Editor Annie Turner asks Steve Jarrett, Global Head of Data and AI at Orange, about the operator’s ambitious network automation strategy, its timing and its progress. He charts the path from the test-and-learn approach being applied today, through the recent transformational tie-up with Google Cloud, to a future of “massive network automation”.
Orange unveiled its Engage2025 five-year strategic plan in December 2019. The plan has four main strands, one of which is reaching a new level in its digital transformation by positioning AI and data at the heart of its innovation model. More specifically, it sets three tightly linked goals: smarter networks, greater operating efficiency and reinventing the customer experience.
Telcos have been trying to exploit big data for years, mostly with little or mixed success. Jarrett says that artificial intelligence has greatly accelerated progress in the last two years, but adds, “We’re in this environment where there’s lots of new tools, most of which are not very mature, and the environment is extremely dynamic. That’s what led us to the test-and-learn approach [with AI], because it’s just a very dynamic situation.”
He stresses that Orange is “very focused on impact and use cases to help the business” right across the organisation. He is also keen to emphasise that in no sense is Orange waiting for the 5G non-standalone core to progress.
Jarrett says, “The vast majority of our investment is in physical infrastructure and will continue to be so. Think about how much it costs to lay fibre and deploy base stations, even if they’re virtualised. There is still the power element, the antenna and the compute, however it’s structured at the base station, and those towers, not to mention the spectrum.”
He says that AI and automation can be applied across this physical infrastructure, using the test-and-learn approach, from network planning to predictive maintenance. Early experiments are already saving the company millions of euros as well as improving customer service: identifying the most profitable base station sites, preventing truck rolls for fibre broadband problems, and saving energy by predicting idle nodes in the RAN.
On the starting blocks
Partnership with Google Cloud
Little wonder then that in July, Orange and Google Cloud announced a strategic partnership to accelerate the transformation of Orange’s IT infrastructure and the development of future cloud services, in particular edge computing.
Jarrett says, “A big part of my job is to make sure that we make large partnership decisions that allow us to improve the advantage of all [that] external investment,” pointing out that the hyperscale cloud companies have been dealing with data problems similar to those of the operators for years, but at much greater scale.
Still, while the cloud hyperscalers were created to be data driven, the operators were not, hence the travails of digital transformation. Jarrett is nothing daunted, comparing the situation to the early years of the internet when the pace of change was extraordinary, and companies had fundamentally to rethink how their business would operate.
He says, “We need to think about data as being a common wealth, which is the ability to share data between teams and break down the silos. It enables everyone to take business benefit from using the data for different purposes. Historically, the team that generated a particular data set felt like they owned it but, for example, network data is…extremely useful to many different teams. That’s the biggest, the hardest, problem, plus the willingness to change, and that also relates to training and skills.”
As part of Engage2025, Orange has committed to investing more than €1.5 billion in a skills-building programme, open to all employees, that will train 20,000 staff in network virtualisation, AI, data, cloud computing, code and cybersecurity.
In the meantime Jarrett explains, “To have extremely heterogeneous network data requires really good data governance, which is the methodology to structure the data, to understand where the data comes from and what actions have been taken on the data. Then additionally [you need] really good tools to allow you to ingest the data and look for anomalies.”
He says very high-scale data systems should not simply use a pipeline to extract the data and prepare it for the next step, as pipelines can go wrong for many reasons. Consequently, systems must look not only for network anomalies but for anomalies in the data itself, to avoid acting on bad or skewed data created by a software glitch or some other source of inaccuracy.
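The idea Jarrett describes, checking the data itself before any automation acts on it, can be illustrated with a minimal sketch. This is a hypothetical example, not Orange’s actual pipeline: the KPI (cell utilisation percentages), the thresholds and the baseline figures are all assumptions.

```python
from statistics import mean

def validate_batch(batch, baseline_mean, baseline_stdev, expected_count):
    """Sanity-check a batch of KPI readings before it feeds any automation.

    Returns a list of issues; an empty list means the batch looks safe to
    act on. A non-empty list signals the data itself is anomalous: missing
    records, impossible values, or a statistical shift that more likely
    reflects a collection glitch than a real network event.
    """
    issues = []

    # Volume check: a half-empty batch usually means a broken feed upstream.
    if len(batch) < 0.5 * expected_count:
        issues.append(f"volume: got {len(batch)} records, expected ~{expected_count}")

    # Range check: utilisation is a percentage, so values outside 0-100 are bad data.
    out_of_range = [x for x in batch if not 0.0 <= x <= 100.0]
    if out_of_range:
        issues.append(f"range: {len(out_of_range)} values outside 0-100")

    # Drift check: a batch mean far from the historical baseline (here, more
    # than three standard deviations) suggests a skewed or corrupted feed.
    clean = [x for x in batch if 0.0 <= x <= 100.0]
    if clean and abs(mean(clean) - baseline_mean) > 3 * baseline_stdev:
        issues.append(f"drift: batch mean {mean(clean):.1f} vs baseline {baseline_mean:.1f}")

    return issues


# Hypothetical data: cell utilisation percentages from one collection cycle.
good = [41.0, 43.5, 39.8, 44.2, 40.7, 42.1, 38.9, 45.0]
bad = [41.0, 250.0, 39.8]  # an impossible value, and far too few records

print(validate_batch(good, baseline_mean=42.0, baseline_stdev=2.0, expected_count=8))
print(validate_batch(bad, baseline_mean=42.0, baseline_stdev=2.0, expected_count=8))
```

In a real deployment the batch would be quarantined and an alert raised when issues are found, rather than letting a downstream automation act on the suspect data.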
He states, “I think we have a really good understanding of those problems and we’ve done very nice work, including… this deal with Google, which is a really fundamental shift for the company. And I think they also have a lot to bring to us regarding these kinds of problems.”
This is because, Jarrett says, there is a rapidly growing awareness and understanding of the value of data and potential problems with it. “As a result, there’s so many new startups, as well as established players, that are really invested in addressing these kinds of problems, and there is enormous innovation. I probably spend an hour or two, every day, just reading and trying to keep up with the industry.”
He says the likes of Amazon, Google and Microsoft Azure not only invest in cloud infrastructure, but provide a platform for these companies to sell their specialist value-added services – for example, fixing the labelling of complex, poorly labelled data. Jarrett says, “They have a really strong business model,” and continues, “There’s enormous venture capital investment and acquisitions, and so that’s a very, very healthy market and it’s really helping us dramatically in our ability to be efficient.”
By Annie Turner, Contributing Editor, FutureNet World. Annie is also Editor of Mobile Europe.