Future 6G systems are already coming into focus through research and early development, and one of the most exciting aspects is the role that artificial intelligence will play in the next generation of cellular systems.
First, a note on artificial intelligence versus machine learning. As Andreas Roessler, technology manager for Rohde & Schwarz, describes them, AI is a branch of computer science focused on building intelligent machines that mimic human cognitive functions such as decision-making and problem solving. Machine learning is a subset of AI in which algorithms improve their performance gradually and automatically through data and experience, without additional programming.
We already live in a time of narrow AI use, where virtual assistants and customer service bots can respond to queries and requests based on natural language processing. AI and ML are also already present to a limited extent in the 5G standard, Roessler notes: a Network Data Analytics Function (NWDAF), defined as of Release 15, is meant to collect data from various nodes in the network and use it for automated network management and optimization of individual, virtualized network functions. However, he points out in a recent R&S webinar, only the interfaces for this function are defined; the specific AI/ML models are up to the vendor community to develop. In Release 15, moreover, NWDAF functionality is limited to providing information on the load level of a network slice, and it applies only to 5G Standalone mode.
But, as with many aspects of 5G, the NWDAF provides a building block for far more extensive and interesting use cases in subsequent releases, and offers a glimpse of what the next generation of wireless technology may one day become. With Release 16, Roessler explains, the scope of the NWDAF expands to broader analytics support: not just load information, but device mobility, user access to specific applications, subscriber user experience, sustainability information such as battery statistics, and even Radio Access Network congestion information that can be relayed to operations and maintenance teams. Over time, the NWDAF becomes a valuable piece of the puzzle for predicting behavior and building realistic models for fine-tuning virtual network functions and optimizing mobility management, session management and QoS, all based on the use of AI/ML.
“The network data analytics function is a way to control the network better and improve performance through automation,” Roessler says.
Meanwhile, initial discussions for Release 18 specifications include the use of AI and ML to improve the performance of the 5G New Radio air interface, such as beam management, as well as overhead reduction for channel state information (CSI) feedback and enhanced position accuracy in various scenarios, he explains.
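The idea behind CSI feedback overhead reduction can be illustrated with a toy example. The sketch below is not the ML-based approach under discussion for Release 18; it uses a classical low-rank (truncated SVD) compression of a hypothetical channel matrix, with made-up dimensions, purely to show the principle: because physical channels are often approximately low rank, a structured or learned compression lets a device feed back far fewer values than the full channel matrix.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 32x4 MIMO channel matrix (32 base-station antennas, 4 device
# antennas), built as a sum of a few propagation paths so that it is
# low rank -- a common simplification for physical channels.
# Real values are used here for simplicity.
paths = 2
H = sum(np.outer(rng.standard_normal(32), rng.standard_normal(4))
        for _ in range(paths))

# Full CSI feedback would be 32 * 4 = 128 values.
# Compressed feedback: keep only the top-k singular components.
k = paths
U, s, Vt = np.linalg.svd(H, full_matrices=False)

# Feedback payload: k left vectors (32*k) + k right vectors (4*k) + k gains
payload = k * (32 + 4 + 1)

# Reconstruct the channel at the base station from the compressed report
H_hat = (U[:, :k] * s[:k]) @ Vt[:k]
err = np.linalg.norm(H - H_hat) / np.linalg.norm(H)
print(f"feedback values: {32 * 4} -> {payload}, relative error {err:.2e}")
```

An ML-based scheme (e.g., an autoencoder) would learn such a compression from channel data instead of computing an SVD per report, but the overhead-versus-accuracy trade-off it exploits is the same.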
When it comes to the ongoing work around 6G, AI and ML are key foundational technologies to many aspects of future wireless systems.
“It is not a standalone research area – instead, it ties into all the other areas,” Roessler explains. Ultimately, AI/ML is likely to underpin some of the features that make 6G revolutionary. One current research area is the use of ML models for self-interference cancellation to enable full-duplex operation, which has remained elusive because of the immense complexity (and cost) involved in making it work; AI/ML might finally put it within reach. In addition, when it comes to the 6G physical layer, Roessler says that both academics and key industry players are already examining how ML-based models might be used in baseband signal processing so that wireless receivers can detect channel conditions and recover signal information more accurately and efficiently.
While early ML research has focused primarily on receiver aspects, he adds, “the next massive step is to use machine learning to jointly optimize the entire chain for transmission, reception and baseband signal processing.
“The ultimate goal of machine learning is to adapt the transmission to the environment—which means the underlying hardware, signal processing techniques and applications,” Roessler says. That implies that AI/ML will ultimately help design parts of the 6G physical layer itself. While it’s impossible to predict exactly how that might play out, Rohde & Schwarz sees the most likely path forward as a three-phase approach: first, ML incorporated into the transceiver/RF front end and antenna system; second, ML integrated within baseband signal processing from a receiver perspective; and third, end-to-end optimization of the whole chain.
Still, there are significant challenges and questions that must be answered in the coming years. Is AI/ML even possible for some parts of wireless systems with extreme hardware constraints? When signals contain only a few bits of information, does an AI system have enough to go on? And one of the major challenges for even basic research, Roessler points out, is that there simply may not be enough existing data sets, or access to those data sets, in order to move forward with applying AI and training ML algorithms.
Intriguingly, fundamental research for an AI-native air interface already dates back to papers from 2020, Roessler says. “To me, the most interesting part is that the later papers referenced the earlier papers and demonstrated additional improvements, compared to the earlier findings—a clear indication that the methodology has a great potential,” Roessler adds. Learn more about the potential uses of AI/ML in 6G systems, and how Rohde & Schwarz is supporting early R&D, here.
The post Smarter than the average ‘G’: The role of AI/ML in 6G systems appeared first on RCR Wireless News.