AI Chip Start-Ups – Can Domain-Specific Chips Impact Nvidia’s Dominance?
https://www.counterpointresearch.com/insights/ai-chip-start-ups-can-domain-specific-chips-impact-nvidias-dominance/
Mon, 29 Jul 2024 19:19:43 +0000 | Gareth Owen, Counterpoint Research

Modern AI requires a massive amount of processing power, especially when training large language models (LLMs). Today, the billions of trillions of calculations required are performed primarily by GPUs, whose parallel processing is well-suited to the task. No company has benefited more from this AI boom than Nvidia. Big tech companies and enterprises alike are fighting for access to its chips, which are well-suited for training and running advanced AI models.

For more than 10 years, Nvidia has built an almost unassailable lead in developing silicon that can perform complex AI tasks such as image, facial and speech recognition, as well as generating text for AI chatbots such as ChatGPT. It has achieved this dominance by recognizing the AI trend early on, adapting its chip hardware to suit those AI tasks and investing in its CUDA software ecosystem.

Nvidia keeps raising the bar. To maintain its leading position, the company now offers customers access to its specialized GPU-based computers, computing services and other tools on an “as-a-Service” basis. To all intents and purposes, this has turned Nvidia from a chip vendor into a one-stop shop for AI development. However, two crucial factors motivate Nvidia’s rivals, both established chip vendors and start-ups: high pricing and customers’ fear of vendor lock-in. Clearly, no company wants to be beholden to a dominant vendor, so more competition seems inevitable. The intense demand for AI services and the desire to diversify away from reliance on a single company are driving the ambitions of rival big chip vendors as well as numerous start-ups.

In 2017, Google’s Tensor Processing Unit (TPU), a chip designed specifically for deep learning, demonstrated that new players could build domain-specific chips with better performance, lower power consumption and lower cost than Nvidia’s general-purpose GPUs. Now, the emergence of generative AI, with its unique and heavy computational requirements, presents new opportunities for domain-specific ASIC vendors.

Many AI chip start-ups believe that their silicon innovations outperform Nvidia GPUs while consuming significantly less power, since their chips have been designed specifically for training and running deep neural networks. However, achieving commercial success has proven much more challenging, particularly with the explosion in foundation models that followed the launch of OpenAI’s ChatGPT. As a result, many start-ups have recently had to re-engineer their designs to handle the massive number of parameters needed for LLMs. Others have changed their business models to become service providers rather than chip vendors.

Counterpoint Research’s recent report, “AI Chip Start-Ups – Can Domain-Specific Chips Impact Nvidia’s Dominance?”, provides an overview of the AI chip start-up market and highlights the opportunities and challenges facing new players entering the burgeoning AI chip market.

Table of Contents

Introduction

Nvidia – A One-Stop Shop for AI

The Compute Challenge

Established AI Chip Start-Ups

  • Cerebras
  • Groq
  • SambaNova
  • Tenstorrent

Emerging AI Chip Start-Ups

  • Rivos, Enflame Technology, Rain AI, etc.

Key Challenges for Start-Ups

  • Technical complexity
  • Funding difficulties
  • The Nvidia effect
  • Open Standards
  • Big Tech

Analyst Viewpoint

Gareth Owen
FDD Sub-3GHz Massive MIMO Radios Will Play A Critical Role Enabling Operators To Maximise Opportunities Offered By 5G Advanced
https://www.counterpointresearch.com/insights/fdd-sub-3ghz-massive-mimo-radios-will-play-a-critical-role-enabling-operators-to-maximise-opportunities-offered-by-5g-advanced/
Wed, 19 Jun 2024 23:17:36 +0000

Over the next few years, 5G Advanced will provide networks with enhanced capabilities, including improved uplink, lower latency and better coverage, resulting in higher network performance and increased reliability. This will enable operators to enhance existing services while generating additional sources of revenue via new services. However, to do this operators will need to maximise their spectrum assets across all bands, particularly their legacy Sub-3GHz spectrum.

Benefits of Sub-3GHz Bands

While higher TDD spectrum bands are being used to provide the 5-10Gbps downlink speeds required for premium customer experiences, FDD Sub-3GHz spectrum is also needed to extend high-speed 5G mobile broadband coverage across urban, suburban and rural regions and to provide reliable coverage for IoT services. In particular, Sub-1GHz spectrum is vital for indoor coverage. In addition, FDD offers lower latency than TDD, as separate channels are used for the uplink and downlink.

Most operators have around 80-120MHz of Sub-3GHz spectrum, as this was the primary spectrum used by previous cellular generations, and most of it is becoming available for 5G as operators switch off legacy 2G and 3G networks. Refarming existing FDD spectrum bands will allow operators to quickly build up their 5G footprint. However, as there is less spectrum available in the Sub-3GHz bands than in higher bands, operators will need to introduce advanced radio technologies with much improved spectrum efficiency, throughput and latency. In addition, this must be done using compact, cost-efficient radio solutions in order to keep capex and opex to a minimum.

FDD Triple-band MIMO and Massive MIMO

Several vendors offer 4T4R and 8T8R ultra-wideband modules for low-band (700MHz, 800MHz and 900MHz) and mid-band (1.8GHz, 2.1GHz and 2.6GHz) frequencies. As a result, only two multi-antenna, single-RAN RF modules are required to cover both spectrum ranges – instead of six as with conventional radios. In addition, some vendors offer 32T32R massive MIMO modules which can be deployed in a compact radio/antenna enclosure for the mid-band frequencies.

FDD triple-band massive MIMO radios significantly boost capacity and coverage compared to conventional 4T4R radios. This enables operators to improve the spectrum efficiency of their existing Sub-3GHz resources, while at the same time, satisfying higher traffic demands and improving user experience. From an investment viewpoint, FDD massive MIMO radios also reduce operators’ overall capex and opex costs.

Key Radio Technologies

Key technologies at the heart of state-of-the-art FDD RF modules include FDD Beamforming, GigaBand multi-band fusion technology plus innovative energy-saving features enabled primarily by innovations in power amplifier technologies.

  • FDD Beamforming

To date, beamforming has primarily been used in TDD spectrum as operators initially deployed 5G in TDD bands. However, beamforming can also be used in FDD spectrum and has been supported since 3GPP’s Release 15 specification. As with TDD spectrum bands, the main benefit is increased spectrum efficiency, as directional beams focus the radio signals where they are needed rather than distributing them across the entire cell. This increases radio capacity and coverage while reducing interference. Increasing the number of transmit antennas – and hence the number of beams – means narrower and more focused beams, resulting in even higher capacity and better spectral efficiency.
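The scaling logic above can be illustrated with two textbook antenna-array approximations (idealized figures for a uniform linear array, not measurements of any vendor's radio): array gain grows as 10·log10(N), while the half-power beamwidth shrinks roughly as 102/N degrees for half-wavelength element spacing.

```python
import math

def beamforming_gain_db(n_antennas: int) -> float:
    """Ideal array gain of an N-element antenna array, relative to one antenna."""
    return 10 * math.log10(n_antennas)

def half_power_beamwidth_deg(n_antennas: int) -> float:
    """Approximate broadside half-power beamwidth of a uniform linear array
    with half-wavelength element spacing (classic ~102/N-degree rule of thumb)."""
    return 102.0 / n_antennas

# Doubling the antenna count adds ~3dB of gain and halves the beamwidth.
for n in (4, 8, 32):
    print(f"{n} antennas: gain ~{beamforming_gain_db(n):.1f} dB, "
          f"beamwidth ~{half_power_beamwidth_deg(n):.1f} deg")
```

The trend, not the absolute numbers, is the point: more transmit antennas produce narrower, higher-gain beams, which is the mechanism behind the spectral efficiency gains described above.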

  • Multi-Band Power Amplifier Technology

Multi-band technology allows the functionality of several radios to be combined into a single radio unit with a single transceiver accommodating two or three bands. A tri-band radio therefore only requires one power amplifier and one filter rather than three power amplifiers/filters as required in a conventional radio. This significantly increases the level of integration thereby reducing tower footprint. As traffic rarely peaks in all bands simultaneously, a multi-band power amplifier allows power to be dynamically shared between different bands. This means that it is possible to provide full power output for each band but without designing the power amplifier for simultaneous peak power in all bands.
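As a sketch of the power-sharing idea, the snippet below allocates a shared amplifier budget in proportion to per-band traffic load. The 320W budget and the proportional policy are hypothetical illustrations, not Huawei's or any vendor's actual algorithm:

```python
TOTAL_POWER_W = 320  # hypothetical shared PA budget for a tri-band radio

def allocate_power(traffic_load: dict) -> dict:
    """Split the shared power-amplifier budget across bands in proportion
    to their current traffic load (fractions of total traffic)."""
    total_load = sum(traffic_load.values())
    return {band: TOTAL_POWER_W * load / total_load
            for band, load in traffic_load.items()}

# Evening peak on 800MHz with light load elsewhere: the busy band can draw
# most of the shared budget, which fixed per-band amplifiers could not offer.
print(allocate_power({"800MHz": 0.7, "1800MHz": 0.2, "2100MHz": 0.1}))
```

Because traffic rarely peaks in every band at once, sizing one shared amplifier for the combined typical load, rather than three amplifiers each sized for its band's individual peak, is what allows the smaller, more efficient design described above.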

  • GigaBand Technology

Compared to wide-band TDD spectrum, 700-900MHz low-band FDD spectrum is limited, fragmented and consists of narrow bandwidth channels. The same is true of 1.8-2.1/2.6GHz spectrum. GigaBand technology is a multi-band fusion technology that converts these disparate spectrum assets into a single 100MHz-wide FDD carrier. Using carrier aggregation and Multi-Band Serving Cell (MBSC) technology, six Sub-3GHz spectrum bands can be combined into a single carrier which maximises spectral efficiency.

  • Energy Saving AI Software

Service requirements are higher with 5G, which means that peak-to-average traffic ratios are also higher. As a result, the requirements for high energy efficiency at peak hours and low power consumption in idle periods are more pressing. AI software is playing a major part in reducing energy consumption in 5G networks by enabling shutdowns at network, cell and radio levels, coupled with fast wake-up capabilities, with AI/ML being used in real time to optimise network parameters according to traffic demand.

In the radios, state-of-the-art AI-based software solutions can achieve “deep sleep dormancy” with very low power consumption in an idle state coupled with fast on-demand wake-up. For an operator using multiple frequency bands at a single cell, smart algorithms can progressively shut down different frequency bands depending on traffic loads until only one band is operating, with all other frequencies being in a dormant state.

Huawei FDD MIMO and Massive MIMO Radios

In 2022, Huawei launched the industry’s first FDD triple-band MIMO modules supporting GHz-level bandwidths contained in a single radio box. The modules are available in 4T4R, 8T8R and 32T32R configurations.

  • GigaBand RF Modules – Huawei’s FDD ultra-wideband RF modules have an instantaneous bandwidth (IBW) in excess of 800MHz per module. This is a major advantage for operators with fragmented spectrum as one ultra-wideband radio can replace two or even three narrower band radios. Alternatively, ultra-wideband radios can be used for RAN sharing, which means that two operators can share one radio, again reducing capex and opex costs.
  • Advanced Power Amplifiers – key to this ultra-wideband capability is Huawei’s advanced power amplifier technology, which leverages breakthroughs in several areas, including AI-based digital pre-distortion (DPD) beamforming algorithms, advanced power amplifier architectures, RF filter materials and improved passive cooling via bionic heat sinks. Huawei claims that its power amplifiers are 10% more efficient than industry rivals’ and that its innovative filter materials generate 1dB less filter loss than rival designs.
  • GigaGreen Platform – at the heart of the GigaGreen platform is Huawei’s “0-bit, 0-Watt” and “More-bit, Less-Watt” solutions, which leverage breakthroughs in materials technology, energy saving policy orchestration and smart algorithms. With millisecond level carrier and channel shutdown, Huawei’s “0-bit, 0-Watt” solution can achieve 99% “super deep sleep dormancy” enabling RF modules to consume almost zero power under low load, while its “More-bit, Less-Watt” solution continuously minimizes energy consumption under medium and high loads – without compromising user experience. Achieving these low power consumption levels requires independent shutdown of individual power amplifiers and full power sharing across all carriers, frequency bands and Radio Access Technologies (RATs).

FDD Sub-3GHz Deployment Example

FDD Sub-3GHz triple-band radios are designed from the outset to simplify deployments at cell sites while reducing power consumption. For operators, this translates into capex and opex savings. For example, many operators use four or five FDD frequency bands at a single tower site. Traditionally, this would require four or five radios, i.e. one RF module per frequency band. However, with ultra-wideband 4T4R, 8T8R and 32T32R massive MIMO modules, only two radios are required. In addition, the operator has the option to add one or two additional frequency bands.

By combining an active massive MIMO radio/antenna with a six-band passive antenna into a single package, the number of “boxes” per sector can be reduced from seven to two (Exhibit 1). This enables new frequency bands to be added without increasing the number of base stations or power consumption while opex is also reduced.

Exhibit 1: Leveraging Ultra-Wideband Radios To Enable Site Simplification (Source: Huawei)

Analyst Viewpoint

FDD technology at Sub-3GHz frequency bands is set to play a critical role as 5G Advanced is rolled out over the next few years. To fully leverage the opportunities offered by 5G Advanced, Counterpoint Research believes that it is imperative that operators maximise their existing FDD Sub-3GHz spectrum assets in order to ensure seamless coverage across urban, suburban and rural regions. Not only will this provide an enhanced user experience for customers, it will also enable operators to offer a range of new high-data, low-latency services with guaranteed service levels across their entire network footprint.

As less spectrum is available at Sub-3GHz, however, operators will need to boost the spectrum efficiency of their RAN equipment at these FDD frequencies. In practice, this will involve investing in the latest advanced massive MIMO radios, which offer significant spectral efficiency gains compared to conventional radios. For example, Huawei claims that its Sub-3GHz 32T32R massive MIMO radio can offer operators up to 10X more capacity, 10X more downlink data throughput, a 10dB increase in coverage and a 30% reduction in power consumption compared to a 4T4R radio.
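Huawei's claimed gains mix linear multipliers (10X) with decibel figures (10dB), and a quick conversion helps compare them on one scale: the 10dB coverage gain is itself a 10X improvement in link budget. The helper functions below are generic, illustrative utilities:

```python
import math

def db_to_linear(gain_db: float) -> float:
    """Convert a power gain in decibels to a linear ratio."""
    return 10 ** (gain_db / 10)

def linear_to_db(ratio: float) -> float:
    """Convert a linear power ratio to decibels."""
    return 10 * math.log10(ratio)

print(db_to_linear(10))   # 10.0 -> a 10dB coverage gain is a 10X link budget gain
print(linear_to_db(10))   # 10.0 -> conversely, a 10X capacity gain is 10dB
```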

Counterpoint Research believes that this transition to advanced massive MIMO radios must be done without substantially increasing capex and opex costs for operators, particularly power consumption. RAN operations are typically a trade-off between radio performance and power consumption. Despite their higher throughputs and superior spectrum efficiencies, massive MIMO radios can also help minimise capex and opex costs. As seen in Exhibit 1, replacing multiple single-band radios with a single multi-band radio leads to considerable site simplification and thus a lowering of tower leasing costs. In many cases, using radios with higher spectral efficiencies can also result in a reduction in the number of cell sites required and may even lower operators’ investments in new spectrum bands. In addition, by leveraging state-of-the-art radio technologies, including the latest power amplifiers coupled with the latest AI-driven power saving techniques, smart algorithms, etc., massive MIMO radios can also reduce overall power consumption on a cell site basis, thus reducing opex for operators, while helping to minimise their carbon footprint.

This blog was sponsored by Huawei

Gareth Owen
TDD Multi-Carrier Aggregation Builds Foundation For New 5G Advanced Experiences
https://www.counterpointresearch.com/insights/tdd-multi-carrier-aggregation-builds-foundation-for-new-5g-advanced-experiences/
Wed, 19 Jun 2024 23:11:39 +0000

Since its launch in 2019, 5G has experienced fast adoption, surpassing all previous cellular generations. Yet despite this rapid adoption, operators are only beginning to monetise their 5G investments. Over the next few years, 5G Advanced will provide operators with much improved network capabilities that will enable them to offer a raft of innovative new applications as well as enhance existing consumer and business services.

The Benefits of 5G Advanced

While 5G essentially provided higher data rates compared to 4G, 5G Advanced will help operators enhance existing 5G networks, making them more efficient, with superior uplink and downlink performance, higher throughput and capacity, coupled with lower latency and improved network reliability. 5G Advanced will also enable up to 100 billion device connections.

To a large extent, these enhancements will be enabled by the introduction of new 5G base station products, such as advanced massive MIMO radios leveraging state-of-the-art multi-antenna and ultra-wideband technologies. 5G Advanced will also introduce new use cases such as passive IoT and Integrated Communications and Sensing (ICAS), two technologies with considerable potential to revolutionize mobile communications. In addition, it will bring deeper integration of AI/ML across radios, devices and networks, delivering significant gains in network performance as well as improving mobility and coverage.

5G Advanced Will Need More Spectrum

In many cases, operators will need access to new spectrum in order to maximise the full potential of 5G Advanced. The last World Radio Conference (WRC-23) allocated more mid-band spectrum in the 3.5GHz band while the 6GHz band was designated as a mobile band in most regions. The emergence of 6GHz band spectrum will be important as it provides operators with an option to acquire more mid-band spectrum to support higher data transfer rates and increased network capacity, thus helping them to realize new service opportunities. In Europe, much of the 6GHz range is currently used for Wi-Fi, with only a modest amount of the band set aside for cellular use. However, the agreement at WRC-23 will allow more cellular access to the upper portion of this band in Region 1 (Europe, Africa, Middle East). This means that local operators have an opportunity to work with regulators to secure a larger amount of 6GHz spectrum.

Another important band is the millimetre wave band. Continuously rising data consumption plus 5G Advanced’s demand for more spectrum is reviving operator interest in millimetre wave bands. However, although more than 140 operators worldwide have acquired millimetre wave spectrum, less than 5% of them have actually deployed networks, due mainly to the limited coverage of millimetre wave cells. A good example of a millimetre wave deployment is that of China Unicom Beijing, which has deployed a 5G Advanced network in Beijing’s financial district. The network leverages the latest Extra Large Antenna Array (ELAA) technology with high- and low-frequency coordination and serves as a benchmark demonstration of the commercial capabilities of millimetre wave spectrum.

Leveraging TDD Multi-Carrier Aggregation

5G Advanced heralds the start of the Multi-Gigabit era. TDD multi-carrier aggregation combines spectrum from different TDD frequency bands and is a proven method for augmenting data throughputs in wireless networks. It allows operators to increase network capacity, providing higher downlink data rates and increased coverage, thereby helping them maximise their spectrum assets. TDD multi-carrier aggregation can be used across 2.6GHz, 3.5GHz, C-band and 4.9GHz, as well as millimetre wave bands such as 26GHz and 28GHz. The choice of 3GPP-specified 5G Component Carrier (CC) configurations, for example, 2CC, 3CC, 4CC, etc., continues to expand and enables multi-Gigabit 5G data rates across a wide variety of sub-6GHz and millimetre wave spectrum allocations globally.

In practice, operators have two primary options at the present time: use 3CC carrier aggregation to aggregate 200MHz of sub-6GHz spectrum or use 3CC in the U6G (6,425-7,125MHz) and millimetre wave bands to aggregate larger bandwidths exceeding 400MHz. With 3CC carrier aggregation in sub-6GHz, it is possible to achieve 5Gbps downlink speeds, i.e. 5 times the capability of current 5G. For example, China Mobile has achieved 5Gbps downlink throughput using 2.6GHz (100MHz & 60MHz) and 4.9GHz (100MHz) spectrum bands. However, millimetre wave spectrum is essential to reach 10Gbps. For example, China Unicom recently achieved a throughput of 10Gbps using 3CC in C-band (100MHz) and millimetre wave (800MHz) bands.
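The arithmetic behind these demonstrations is easy to check. The sketch below backs out the aggregate spectral efficiency implied by the figures quoted above; note these are cell-level numbers that include MIMO spatial-multiplexing gains, so they exceed per-layer efficiencies:

```python
def implied_spectral_efficiency(throughput_gbps: float, carriers_mhz) -> float:
    """Aggregate spectral efficiency (bit/s/Hz) implied by a carrier
    aggregation demo: total throughput divided by total bandwidth."""
    total_hz = sum(carriers_mhz) * 1e6
    return throughput_gbps * 1e9 / total_hz

# China Mobile 3CC sub-6GHz demo: 2.6GHz (100 + 60 MHz) + 4.9GHz (100 MHz)
print(implied_spectral_efficiency(5, [100, 60, 100]))   # ~19.2 bit/s/Hz
# China Unicom demo: C-band (100 MHz) + millimetre wave (800 MHz)
print(implied_spectral_efficiency(10, [100, 800]))      # ~11.1 bit/s/Hz
```

The comparison also shows why millimetre wave matters for 10Gbps: the extra 800MHz of bandwidth does the heavy lifting, even at a lower aggregate spectral efficiency.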

Services enabled by 5G Advanced

The improved network performance due to 5G Advanced will enable operators to enhance existing services by introducing more immersive mobile broadband offerings and improve connectivity in homes, enterprises and vehicles.

Enhanced mobile broadband – 5G Advanced will enable operators to introduce smarter, more interactive experiences which benefit from low-latency connectivity. One example is New Calling, which introduces capabilities such as voice/video calls with real-time language translation, speech-to-text translation, screen sharing, interactive visual menus and enterprise ID cards.

UAE operator Du plans to use its 5G Advanced network to accelerate the introduction of 3D Internet-type services as well as offering services such as video calls, live video streaming, games and e-shopping, all of which require high-bandwidth and stable low latency connectivity in both uplink and downlink.

Connecting Homes – FWA-based 5G Advanced provides significantly more bandwidth and reduced latency, making it possible to offer a range of FWA services across different markets. For example, 5G RedCap CPEs can offer downlink rates of up to 150 Mbps with 50% less power consumption compared to standard 5G CPEs. With RedCap CPEs expected to be priced at around $40-$60, this could enable operators to target new markets, such as homes without broadband or served by lower-speed networks.

Operators could introduce a Home Plus-type service by leveraging the high downlink rates, low latency and guaranteed service quality of 5G Advanced-based FWA. This would allow them to earn incremental revenue from existing subscribers by offering new services such as 8K video streaming, large-scale cloud gaming, home security monitoring and VR sports for multiple concurrent users. SMEs could be another new market: 5G Advanced could enable operators to offer “super-uplink” type FWA services with low latencies and guaranteed uplink rates of between 50Mbps and 1Gbps.

Connecting Things – 5G Advanced introduces a host of features to improve device connectivity. For example, improvements in RedCap will boost the ability of 5G-Advanced to support lower-performance and more affordable devices, which should open new business opportunities for operators, particularly in IoT. Another promising new use case is passive IoT. Next-generation passive IoT tags will have a range of over 200 metres, far exceeding that of RFID tags, and with a significantly lower cost per tag.

Connecting Enterprises and Vehicles – by leveraging its enhanced uplink characteristics, 5G Advanced will enable fully wireless-connected factories supporting flexible production, a key to digitalization in manufacturing. It will also bring new capabilities to vehicles, for example route planning beyond a driver’s line of sight, making transportation safer.

Operator Deployments

China Mobile has announced that it will launch 5G Advanced in more than 300 cities in China by the end of 2024 and has already launched services in around 100 cities. Full deployment across China is expected by the end of 2026. The operator expects that around 20 million devices and terminals will be shipped by the end of 2024. Key consumer services include New Calling, VIP Gbps, Cloud Phone and glasses-free 3D.

Other operators have also announced commercial 5G Advanced plans, including Beijing Unicom, Zain KSA and Finnish operator DNA. Beijing Unicom has announced that it will deploy 5,000 5G Advanced sites and will use TDD multi-carrier aggregation to provide high-bandwidth services. Zain is developing a 5G Advanced city in Riyadh and will showcase a number of 5G Advanced services, including enhanced FWA services. The operator plans to expand its 5G Advanced coverage to another eight cities by 2026. Meanwhile in Europe, Finnish operator DNA plans to launch a 5G FWA Gbps service.

Analyst Viewpoint

The improved network capabilities enabled by 5G Advanced will enrich the mobile experience for consumers and business users alike, while providing new monetization opportunities and service options for operators. This will allow them to expand existing services and earn incremental revenues through the introduction of new speed- or uplink-based tariffs and guaranteed service quality levels. As a result, Counterpoint Research believes that operators should start upgrading their networks to 5G Advanced as soon as possible to ensure they can offer the enhanced user experiences that will be expected by customers, as well as being ready for the accompanying surge in bandwidth demands.

Over the next few years, TDD multi-carrier aggregation will play a key role in the 5G Advanced network upgrade ensuring that operators can offer 5-10Gbps fibre-like downlink speeds, as well as providing improved low-latency and high-data uplink experiences – an acute problem with existing 5G. However, some technical challenges remain in relation to antennas and data processing requirements. For example, 3D or even higher-dimensional computing may be necessary to provide the required user experience. In addition, deploying large-scale commercial 5G Advanced networks will require more cell sites while multi-carrier aggregation will be more challenging if an operator’s spectrum is highly fragmented.

Clearly, the use of millimetre wave spectrum will be crucial to provide the ultimate user experience and maximise the potential of 5G Advanced for operators. However, several problems, such as overcoming high-frequency signal loss and achieving TDD uplink/downlink symmetry, need to be solved first. As a result, Counterpoint Research believes that 3CC carrier aggregation in the sub-6GHz bands will be the preferred choice by many operators as the first wave of 5G Advanced commercial networks are rolled out. However, leading infrastructure vendors are working to overcome the technical challenges associated with TDD multi-carrier aggregation and are confident that they will be able to offer competitive products in due course.

This blog was sponsored by Huawei. A companion article covers FDD Sub-3GHz massive MIMO radios.

Gareth Owen
Custom Silicon and New Interconnect Opportunities To Drive Growth at Marvell
https://www.counterpointresearch.com/insights/custom-silicon-and-new-interconnect-opportunities-to-drive-growth-at-marvell/
Wed, 19 Jun 2024 23:06:06 +0000

Marvell reported Q1 FY25 financial results on May 30th with total revenues of $1.16 billion, down 12% Year-on-Year (YoY) and 9% sequentially. As with the previous quarter, the standout business segment was the data centre business, with revenues up 87% YoY and 7% sequentially. However, Marvell posted a $216 million net loss, an increase of 28% YoY.
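As a quick sanity check on those percentages, the prior periods' revenues can be backed out from the reported figures. Because the inputs are rounded, the outputs are approximate:

```python
def implied_prior_revenue(current_usd_bn: float, decline_pct: float) -> float:
    """Back out the earlier period's revenue from a reported % decline."""
    return current_usd_bn / (1 - decline_pct / 100)

q1_fy25 = 1.16  # $B, as reported
print(round(implied_prior_revenue(q1_fy25, 12), 2))  # ~1.32 -> Q1 FY24 was ~$1.32B
print(round(implied_prior_revenue(q1_fy25, 9), 2))   # ~1.27 -> Q4 FY24 was ~$1.27B
```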

Data Centre Segment

Data centre revenue reached $814.4 million in Q1 FY25, driven by strong demand for its AI electro-optical products (PAM4 DSPs, TIAs and drivers) as well as for its ZR Data Centre Interconnect (DCI) products. The double-digit revenue growth was driven by cloud AI as well as standard cloud infrastructure, which offset a higher-than-seasonal decline in on-premises enterprise data centre revenues.

In addition, Marvell booked revenues from initial shipments of its custom AI compute programs and announced three new data centre interconnect opportunities: PCIe Gen 6 retimers, AEC PAM4 DSPs and extended range coherent DSP-based DCI modules for use inside and outside data centres.

Optical Interconnects:

  • 100G/lane 800G PAM4 DSPs – these are the primary interconnect products for state-of-the-art AI deployments and are shipping in volume today. Marvell announced that it has started qualifying next-generation 200G/lane 1.6T PAM4 DSP solutions, which will enable next-generation AI accelerators. Volume adoption is expected to start later this year and accelerate during CY2025.
  • PCIe Gen 6 retimers – Marvell recently announced its new PCIe Gen 6 retimer product range. PCIe Gen 6 is the first PCI standard to use PAM4 DSPs and these products are designed to help data centre compute fabrics continue to scale inside AI servers. The company is currently sampling its 8-lane and 16-lane PAM4-based PCIe Gen 6 retimer products.
  • Active Electrical Copper (AEC) PAM4 DSPs – AI demands higher speeds, which is driving the need for active interconnects inside racks. Marvell has started shipping its AEC PAM4 DSPs and has secured design wins with multiple Tier-1 cloud customers.
  • DCI ZR Modules – Marvell is shipping its 400G ZR products in high volumes and is seeing strong interest in its next-generation 800G ZR/ZR+ pluggable module DCI products. During the quarter, it also demonstrated the industry’s first 3D photonics engine. Marvell’s DCI customer base is expanding with design wins at multiple new data centre customers. However, revenue contribution from the 800G ZR/ZR+ DCI products is not expected to ramp up until next year.

AI clusters today comprise thousands of GPUs within a single building. As future LLMs increase in size, clusters are expected to comprise hundreds of thousands of GPUs housed in multiple buildings on campuses. These buildings will need to be connected so as to appear as a single data centre. Marvell recently announced a new coherent DSP for campus use, extending reach from less than two kilometres to 20 kilometres.

Marvell is further expanding its DCI market opportunities by introducing another coherent DSP design based on a new technology called Probabilistic Constellation Shaping (PCS), extending the reach of pluggable DCI modules from 120 kilometres to 1,000 kilometres.

Switches:

In data centre switching, Marvell expects to start production and shipments of its next-generation 51.2T Teralynx 10 switch this summer for lead customer Nvidia.

Custom compute:

Marvell’s custom compute AI programs started to ship in Q1 FY25 with a very substantial ramp-up expected in H2 FY25 followed by a full year of high-volume shipments in FY26. At its recent AI Analyst Day, Marvell revealed that it has custom compute projects with three of the four biggest US hyperscaler operators:

  • Amazon – AI Trainium training accelerator (ramping up now) and AI Inferentia inference accelerator (expected CY2025 ramp)
  • Google – ARM Axion custom CPU (ramping up now)
  • Microsoft – AI Maia accelerator with expected ramp-up in CY2026

Marvell has also developed custom silicon for chip start-up Groq’s first PetaOPs AI accelerator, a 700mm2 custom ASIC which is in volume production.

Marvell expects its custom silicon business to generate around $200 million by the end of 2024, and the company claims to have strong visibility over its 5nm programs for the next two years. In addition, it claims that its 3nm design pipeline and wins have been very strong and that it is already engaged in 2nm design work.

Exhibit 1 shows a breakdown of Marvell Q1 FY25 revenues by market segment.

Exhibit 1: Data Centre vs Enterprise, Carrier and Automotive Revenues

5G Infrastructure and Enterprise Networking

As predicted by the company last quarter, the weakness in the 5G infrastructure and enterprise network markets continued with revenues down at both business units:

  • 5G Networks – revenues were $71.8 million in Q1 FY25, down 75% YoY and 58% sequentially. In the previous quarter, Marvell was expecting Q1 FY25 to be the low point with growth resuming in Q2. However, the company now expects Q2 to be flat sequentially with recovery beginning in H2 FY25 driven by the adoption of the next-generation Marvell DPUs at a Tier-1 customer, probably Nokia. Nevertheless, the company expects its 5G market share to increase as customers transition to 5nm Octeon 10 DPUs and baseband processors in H2 FY25. It claims to have won an additional socket with Nokia.
  • Enterprise Networks – revenue was down 58% YoY and 42% sequentially to $153 million. Although Marvell reported that customers’ excess inventory absorption is progressing, it still expects Q2 revenue to be flat sequentially with recovery starting in H2 FY25.

Automotive/Industrial

The automotive/industrial market is another market experiencing inventory correction. Reported revenues were $78 million, down 13% YoY and 6% sequentially. However, Marvell expects growth to resume in H2 FY25 driven by an increase in Marvell Ethernet content in 2025 model year vehicles.

Analyst Viewpoint

Counterpoint Research believes that Marvell is increasingly well positioned to benefit from the burgeoning growth in AI data centre infrastructure over the next few years. However, many of its new revenue streams will only start towards the end of FY25, although they will set up a solid foundation for FY26.

With two hyperscaler customer wins already ramping up, most of the growth in data centre revenue will come from custom silicon over the next two to three years. Marvell had previously pencilled in around $1.5 billion for AI revenue for FY25 (split two-thirds electro-optics, one-third custom compute) – up from $500 million in FY24. However, it now expects to exceed that number and has a target of $2.5 billion in FY26 as the custom silicon programs reach the first full year of volume shipments.
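The guidance quoted above implies some simple arithmetic worth making explicit, using only the figures stated in this article (a back-of-the-envelope check, not company-reported segment detail):

```python
# Marvell's original FY25 AI revenue guidance and its stated split
fy25_ai_revenue = 1.5e9
optics = fy25_ai_revenue * 2 / 3   # electro-optics share, roughly $1.0B
custom = fy25_ai_revenue * 1 / 3   # custom compute share, roughly $0.5B

# Growth implied by the FY24 base and the FY26 target
fy24_ai_revenue = 0.5e9
fy26_target = 2.5e9
growth_fy24_to_fy25 = fy25_ai_revenue / fy24_ai_revenue  # a tripling
growth_fy24_to_fy26 = fy26_target / fy24_ai_revenue      # a fivefold rise
```

In other words, even the since-raised original guidance implied AI revenue tripling year-on-year, with the FY26 target a fivefold increase over FY24.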

Custom silicon is a business with high barriers to entry and could be very lucrative for Marvell in the medium and long term. To succeed, vendors need R&D scale, technical excellence and cutting-edge IP, as well as the ability to operate at the latest leading-edge process node – attributes which Marvell possesses. In addition, by working closely with the hyperscalers, Marvell will gain a significant advantage as it will acquire unique insights into their next-generation architectural requirements, not only for custom silicon but also for connectivity, switching and other products. However, Marvell competes against Broadcom – the 800-pound gorilla of the custom silicon market – and others, including Nvidia, a recent entrant to the custom silicon market. Over time, there is also a risk that the hyperscalers will develop their own expertise and follow the examples of Apple and Huawei by bringing their custom silicon design work in-house.

The other new growth market is data centre interconnects, with three new opportunities – AEC interconnects, PCIe Gen 6 retimers and pluggable DCI coherent DSPs – each estimated to be a $1 billion opportunity over the long term. However, in the non-data centre businesses, things are not looking quite as rosy, although there are high hopes that both the 5G infrastructure and enterprise networking businesses have reached the bottom of their respective cycles. 5G capital spending still looks weak and growth might be delayed beyond Q4 FY25. In addition, Marvell is heavily exposed to one vendor, Nokia. Last year, AT&T, one of Nokia’s largest base station customers, switched to rival Ericsson. It will probably take quite some time for these two business units to return to their previous $1 billion per year run rates.


The post Custom Silicon and New Interconnect Opportunities To Drive Growth at Marvell appeared first on Counterpoint.

Gareth Owen
Analysing The Role of AI/ML in 5G Advanced https://www.counterpointresearch.com/insights/analysing-the-role-of-ai-ml-in-5g-advanced/ Wed, 27 Mar 2024 18:29:17 +0000 https://www.counterpointresearch.com/?post_type=insights&p=1038155

The recent surge in interest in generative AI highlights the critical role that AI will play in future wireless systems. With the transition to 5G, wireless systems have become increasingly complex and more challenging to manage. In particular, the heterogeneous nature of 5G networks comprising multiple access networks, frequency bands and cells, all with overlapping coverage areas, presents operators with network planning and deployment challenges. This is forcing the wireless industry to think beyond traditional rules-based design methods and turn to AI and ML.

5G Advanced is set to expand the role of wireless AI across 5G networks, introducing new, innovative AI applications that will enhance the design and operation of networks and devices over the next three to five years – particularly those demanding high data rates, low latency or massive connectivity, such as Extended Reality (XR), Reduced Capability (RedCap), Non-Terrestrial Networks (NTN) and Unmanned Aerial Vehicles (UAV), as well as applications requiring precise positioning and synchronization.

This Technology Report provides an overview of the role of AI/ML in the 3GPP’s upcoming 5G Advanced standard and outlines the key technologies and use cases where it will be used.

Key Takeaway No. 1: AI will be transformative

Although the application of AI/ML is still in its infancy, its integration into 5G-Advanced networks signifies a transformative shift in the telecommunications market. This development promises not only improved network performance but also opens the door to a wide range of innovative use cases. As a result, the commercial launch of 5G-Advanced in 2025 should accelerate the monetization of 5G for operators.

Key Takeaway No. 2: A Bridge to 6G

The adoption of AI/ML in 5G Advanced provides a platform to experiment with new techniques and should be regarded as a trial for the full introduction of AI/ML in future 6G networks. For example, 6G will be the first opportunity for AI/ML-based optimization to be used in the fundamental design of an air interface from the very beginning. AI/ML will not only enable improved 5G/6G performance, it should also allow 5G Advanced to evolve faster.

Analysing The Role of AI/ML in 5G Advanced

The full version of this insight, including a complete set of Key Takeaways, is published in the following report, which is available to clients of Counterpoint Research’s 5G Network Infrastructure Service (5GNI).

Table of Contents

  • Snapshot
  • Key Takeaways
  • Introduction
  • 5G Advanced Specifications and Timelines
  • Current Use of AI/ML in Networks
  • Overview of AI/ML in 5G Advanced
  • AI/ML in the 5G Air Interface
    - AI Models and End-To-End Optimization
    - Channel State Feedback
    - AI-based Millimetre Wave Beam Management
    - Precise Positioning
    - Single and Multiple Models
  • Devices And Network Sustainability
  • A Bridge to 6G
  • An AI-Native 6G Air Interface
  • Facing The Challenges
  • Analyst Viewpoint


Gareth Owen
Accelerating AI Revenues Plus Recovery in Non-AI Core Markets Signals Return to Profit for Marvell in Fiscal 2025 https://www.counterpointresearch.com/insights/marvell-q4-2024/ Mon, 18 Mar 2024 12:43:07 +0000 https://www.counterpointresearch.com/?post_type=insights&p=1037426

Marvell reported a slight increase in revenue in Q4 2024, up 1% sequentially to $1.43 billion. However, the highlight of the quarter was the high growth recorded in its data centre business, which benefited from increased AI spending. Compared to Q3, data centre revenues increased 38% and were up 54% on a YoY basis. Overall fiscal 2024 revenue totalled $5.5 billion, with strong growth in H2 driven by AI demand. However, Marvell posted a $392.7 million net loss, far worse than the $15.4 million loss it posted in the previous year.

For Q1 2025, Marvell is forecasting total revenues of $1.15 billion with weak demand expected to continue in the carrier, enterprise and consumer segments. However, the company expects that revenues in these segments will stabilise after Q1 2025, with a recovery expected in H2 2025.

Data Centre Segment

Data centre revenue reached $765 million in Q4 with cloud services being a significant contributor while revenue from AI-driven optics exceeded $200 million (Exhibit 1). In fact, data centre revenue – particularly AI-driven optics – accelerated throughout the year, increasing from around one-third of total revenue in Q1 to more than half at the end of Q4. Clearly, the acquisition of Inphi is proving to be a major strategic asset.

Demand for its cloud-optimized silicon solutions is up, driven by the increase in AI and accelerated computing investment. The chip vendor has successfully executed several 5nm designs in the last two years and expects initial shipments for its two 5nm AI compute programs to start in Q1 2025. Marvell believes that it is on track for a very substantial ramp-up in H2 2025.

Marvell reported that it is also heavily engaged with cloud customers on new 3nm opportunities and remains confident of its 3nm funnel and design win rates. In addition, AI is increasing the cadence of new chip releases, and this plays well to Marvell’s strength as a key partner for its cloud customers with a proven ASIC platform.

The company also announced an extension of its long-standing collaboration with TSMC to develop the industry’s first technology platform to produce 2nm chips optimized for accelerated infrastructure. This new platform will enable Marvell to deliver substantial advancements in performance, power and area, which will be critical for next-generation accelerated workloads.

Carrier and Enterprise Networking

Carrier and enterprise markets have been experiencing a period of weak industry demand for several months. As a result, revenues at both segments were down sequentially in Q4. Marvell expects further sequential declines in Q1 of approximately 50% for carrier networking and 40% for enterprise. Beyond Q1, Marvell expects these markets to stabilise and forecasts a recovery in fiscal H2 2025.

Data Centre versus Carrier Network Revenues (2023-24)
This exhibit compares Data Centre versus Carrier Network revenues during 2023-24.

At their peak during the pandemic, the carrier and enterprise networking markets contributed a total of $2.5 billion to Marvell’s revenue. Looking forward, Marvell expects both of these markets to contribute over $1 billion each in annual revenue once demand normalizes. Both businesses have very long product life cycles – typically seven years in production. In particular, Marvell stresses that it has not lost business or market share, with the lower revenues being attributable to demand softness and inventory corrections. In fact, the company maintains that its upcoming product upgrades will drive up revenues as both these cyclical markets recover over the next few years.

Analyst Viewpoint

Marvell is a critical enabler of accelerated infrastructure for AI with a full suite of solutions across data centre interconnect, switching and compute plus in-house expertise to integrate all these technologies together. Essentially, the company is a one-stop shop for data centres. As a result, Counterpoint Research believes that the company is well positioned to capitalise on the massive AI-based technology build-out as it continues to gain momentum during 2024.

Growth in generative AI applications is driving cloud providers to build new data centres. As a result, Marvell is experiencing an increase in design wins in AI, custom silicon and networking optics. This is providing good opportunities, particularly in custom silicon. However, this is not a zero-sum game, with both merchant and custom silicon expected to benefit. Also, the ongoing transformation of data centre architectures is resulting in increased investment in inferencing, which drives more bandwidth between data centres and hence more demand for Marvell’s data centre interconnect products.

In Q1 of fiscal 2025, Marvell expects continued sequential growth in its data centre revenues, with initial shipments of its cloud-optimized silicon programmes for AI complementing its electro-optics products. Carrier, enterprise and consumer markets are expected to bottom out in Q1 – representing a cyclical trough – which limits any downside. With a healthy gross margin of 42.1%, a focus on high-growth areas and a clear view of demand for fiscal 2025 and 2026, Counterpoint Research believes that Marvell is set for recovery and a profitable fiscal 2025.


Gareth Owen
LeapFrog Semiconductor develops RISC-V based AI-enhanced DSP for Wireless Infrastructure https://www.counterpointresearch.com/insights/leapfrog-semiconductor/ Wed, 21 Feb 2024 19:50:13 +0000 https://www.counterpointresearch.com/?post_type=insights&p=1035910

Virtually all commercial open RAN deployments to date have used COTS server hardware based on Intel’s x86 compute, with or without FPGA hardware acceleration. While x86-based platforms are adequate for initial prototyping and low-bandwidth deployments without acceleration, they are expensive, power-hungry and highly inefficient for high-traffic, low-latency use cases requiring FPGA acceleration. Hence, they are not the best choice for deployment at scale.

Open RAN’s Massive MIMO Challenge

Solving the massive MIMO performance deficit is one of the key issues inhibiting an industry wide transition to open RAN. This challenge must be resolved before mainstream adoption of massive MIMO radios can occur. However, this will require a new breed of merchant silicon solutions designed specifically to efficiently process real-time, latency-sensitive Layer-1 workloads such as beamforming, channel coding, etc.

In early 2023, a number of vendors demonstrated alternatives to Intel’s x86 platform at MWC in Barcelona based on ASICs, GPUs as well as RISC-V architectures. Late last year, an interesting new contender – LeapFrog Semiconductor – appeared on the market. 

LeapFrog’s RISC-V Based Modular, Customizable And First Truly Software Defined Layer-1 Solution

LeapFrog Semiconductor is an early-stage fabless semiconductor company focused solely on developing next-generation Layer-1 silicon and software solutions for the mobile infrastructure and enterprise markets. Founded in 2020, it is funded and staffed by seasoned semiconductor veterans.

The San Diego-based start-up has developed a unique AI-enhanced, DSP-based silicon platform based on the RISC-V architecture as well as a Network-on-Chip silicon design. The result is a multi-core, distributed 5G RAN silicon platform which is modular, customizable and flexible, thus creating the first truly software-defined, AI-enhanced RAN solution.

LeapFrog’s DSP Chip

Known as the LeapFrog Processing Unit (LPU), LeapFrog’s DSP core uses a specialized Instruction Set Architecture (ISA) developed in-house that natively supports fine-grain parallelism. This means that Layer-1 computation is broken down into a large number of small tasks, resulting in a high level of parallelism. Together with its programmable NOC architecture which minimises communication and synchronization overheads, LeapFrog’s Layer-1 chip results in several unique benefits:

  • Power and area efficient design – LeapFrog claims that its SoC is significantly smaller than rival designs and boasts single-digit (<10W) power consumption.
  • Software-based Layer-1 solutions – LeapFrog’s RU and DU Layer-1 solutions are 100% software-based and are thus fully programmable, with no requirements for hardware-based accelerators.
  • AI-enhanced L1 chip solution – the LeapFrog chip includes in-line processing of AI and L1 algorithms, which includes AI-based channel estimation and other L1 algorithms. This results in a low-latency chip solution and hence improved RAN system performance.
  • Tile and chiplet-based silicon design – resulting in a scalable, customizable and modular design which can be optimized for different deployment scenarios. For example, chiplets can be combined to make different functions such as L1, I/O, CPU, etc.

In contrast, many rival open RAN chip designs currently under development are based on coarse-grained parallelism, thus necessitating the use of hardware accelerators or hard IP blocks. These designs are not as scalable as LeapFrog’s solution and offer very little flexibility with respect to changes in the computation logic. As a result, a new chip tape-out would be needed if any architectural or logic changes are required.
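The fine-grained versus coarse-grained distinction can be caricatured in a few lines. This is a generic, illustrative sketch of splitting a Layer-1-style workload into many small independent tasks scheduled across many cores; the task function and sizes are hypothetical and do not model LeapFrog's actual ISA or NOC:

```python
from concurrent.futures import ThreadPoolExecutor

def process_subcarrier_block(block):
    """Stand-in for one small Layer-1 task (e.g. per-block channel estimation)."""
    return sum(block) / len(block)

# Fine-grained: break one big workload into many small, independent tasks
# that a scheduler can spread across many simple cores.
workload = [[i, i + 1, i + 2] for i in range(0, 96, 3)]
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(process_subcarrier_block, workload))

# A coarse-grained design would instead hand the whole workload to one
# fixed-function hardware accelerator: fast for that function, but any
# change to the computation logic requires new silicon.
```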

LeapFrog Network-on-Chip (LNOC)

LeapFrog has also developed a highly power efficient, programmable LeapFrog Network-on-Chip (LNOC) chip design which connects multiple LPUs to create a multi-core, distributed 5G RAN silicon platform. Leveraging innovations in chiplet and Die2Die (D2D) technologies, this results in a highly scalable, modular and flexible chip design complying with all 5G O-RAN specifications (Exhibit 1).

©Leapfrog Semiconductor

Exhibit 1: LeapFrog Semiconductor’s RISC-V 5G Layer 1 Silicon Architecture

LeapFrog believes that its LNOC design is currently the only chiplet-based 5G open RAN chip platform with a fully software-based RU and DU L1 solution that can be easily customized to suit different 5G deployment scenarios. In addition, the company claims that its AI-enhanced L1 solution delivers 50% to 100% better system performance and 10x lower cost and power compared to existing open RAN RU and DU platforms. Another benefit is that software development and testing can be performed on an FPGA platform and then transferred to LeapFrog’s silicon platform, allowing a faster time-to-market than alternative designs from other vendors.

Target Markets

LeapFrog is targeting multiple markets with its unique LPU design. Chiplet-based productization allows the same platform to scale all the way from small cell and fixed wireless access (FWA) deployments to the macro cell RU and DU markets, with a major focus on massive MIMO networks. Potential customers include small and large 5G infrastructure vendors, greenfield CSPs as well as hyperscalers. The company is also pursuing an IP licensing model for its general-purpose DSP, targeting consumer/industrial IoT modems, wireless CPEs/gateways, automotive connectivity/sensor fusion as well as mobile handset modems. The IP is ready on FPGA now and was demonstrated at the India Mobile Congress and the RISC-V Summit in 2023. The chip design was tested in H2 2023 and delivery of samples to customers is expected to start in Q2 2024.


Gareth Owen
3GPP’s Release 19 Continues 5G Advanced Standardization, Sets The Stage For 6G https://www.counterpointresearch.com/insights/3gpps-release-19-continues-5g-advanced-standardization-sets-the-stage-for-6g/ Tue, 06 Feb 2024 18:23:39 +0000 https://www.counterpointresearch.com/?post_type=insights&p=1035515

After months of discussions and deliberations, the scope of 5G Advanced Release 19 was approved at the 3GPP’s Plenary Meeting in Edinburgh in December. Led by Wanshi Chen, Chair of 3GPP TSG RAN, Release 19 builds on Release 18 and focuses on enhancing 5G performance while expanding the capability of 5G across devices and deployments. In addition, it will establish the technical foundations for 6G and will include preliminary work on new 6G capabilities.

Release 19 will be followed by Release 20, the first 3GPP release for 6G studies. During the next few years, 5G Advanced will continue to evolve within 3GPP while the standardization of 6G officially starts to ramp up in parallel. Release 18 is expected to be finalized in mid-2024 with Release 19 following in late 2025.

Performance Enhancements

5G Advanced continues to push the spectral efficiency limits and coverage in both sub-7GHz and millimetre wave spectrum. In addition to continued enhancements to massive MIMO radios and mobility, Release 19 provides advancements for new use cases such as XR and Non-Terrestrial Networks.

  • Massive MIMO Radio – Release 18 introduced improvements to massive MIMO uplink and downlink throughput. Release 19 will boost capacity further by improving multi-user MIMO, which enables more UEs to share the same time and frequency resources.
    Release 19 will also enable the cost-efficient realization of distributed transmitters and receivers, thus improving signal quality. This is an important step towards enabling fully distributed MIMO (D-MIMO) systems. Other enhancements include 5G beam management with UE-initiated measurement reporting, thus resulting in faster beam selection.
  • Mobility – 5G Advanced introduces a new handover procedure known as low-layer (i.e. L2) triggered mobility (LTM). In Release 18, LTM is supported between cells served by the same gNB. In Release 19, the LTM framework will be extended to support handover between cells served by different gNBs.
  • XR and the Metaverse – Release 19 builds on the low latency and power saving features of Release 18 by enabling higher XR capacity by adding improved uplink and downlink scheduling using packet delay information.
  • Non-Terrestrial Networks – 5G Advanced combines terrestrial and satellite communications under one standard for the first time. Release 19 will build on the enhancements introduced in Release 18 with a focus on increasing satellite downlink coverage, introducing UEs with higher output power and providing RedCap device support. It will also investigate whether additional support is required for regenerative payloads.

With a long history as an innovator in satellite communications, San Diego-based Qualcomm is leading the charge in non-terrestrial networking. In addition to its contributions to 5G NR NTN and 5G IoT NTN standards, the vendor recently launched two modems: the 212S modem, a satellite-only IoT modem and the 9205S modem. The latter connects to both terrestrial cellular and satellite networks and includes a Global Navigation Satellite Systems (GNSS) chip to provide location data.

Role of AI/ML in 5G Advanced

AI/ML will become a key feature of 5G networks with numerous applications ranging from network planning and network operations optimization to full network automation. Another important application is the use of AI/ML to improve the performance and functionality of the 5G air interface.

3GPP studied the use of AI/ML in the air interface in Release 18 and defined three use cases: channel state feedback (CSF) information, beam management and positioning. Based on the conclusions of Release 18 studies, Release 19 will specify a general AI/ML framework, i.e. actual specifications to support the above three use cases as well as specific support for each individual use case. Release 19 will also explore new areas in the AI/ML air interface such as mobility improvement and AI/ML-related model training, model management and global 5G data collection.

AI/ML is another major focus for Qualcomm. The company has dedicated significant technical resources to developing full-scale demonstrations of the three Release 18-defined use cases. For example, it recently demonstrated CSF-based cross-node machine learning involving E2E optimization between devices and the network. This reduces device communication overheads, resulting in improved capacity and throughput. Qualcomm has also demonstrated the use of AI/ML to improve beam prediction on its 28GHz massive MIMO test network and is heavily involved in positioning technologies. For example, it has showcased its outdoor precise positioning technology, which uses multi-cell round-trip time (RTT) and angle-of-arrival (AoA) based techniques, as well as its RF fingerprinting technology operating in an indoor industrial private network.

Over the next few months, 3GPP will continue exploring the applicability of AI/ML based solutions for other use cases such as load balancing between cells, mobility optimization and network energy savings. For example, there will be support for conditional Layer 2 mobility in Release 19 and a new study item targeting new use cases designed to improve coverage and capacity optimization, such as AI-assisted dynamic cell shaping.

Enhancing Device and Network Sustainability

5G Advanced focuses on sustainability and introduces energy-saving features for devices and networks as well as exploring end-to-end energy saving opportunities that benefit devices. There are also improved features for RedCap and the study of ambient IoT as a new device type.

  • Power-optimized devices – Releases 18 and 19 build on existing energy-saving features with, for example, a new low-power wake-up signal (LP-WUS). A low-complexity, power-optimized receiver monitors wake-up signals from the network and only wakes up the main radio when data is available for the device. This avoids the significant power consumption required to keep the main radio monitoring control signals from the network.
  • Ambient IoT – enables new use cases through very-low-power devices that harvest energy from the ambient environment, for example, from RF waves. Release 19 will investigate new architectures for ambient IoT devices and will include the development of a harmonized specification. Numerous use cases will be studied, including smart agriculture, industrial wireless sensor networks, smart logistics and warehousing.
  • Network energy savings – 5G Advanced reduces network energy consumption by dynamically adjusting the network’s operation based on feedback from the device, i.e. shutting down parts of the network when idle and transmitting less power depending on the overall traffic load or using more efficient antennas.
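The dynamic-adjustment idea in the last bullet can be sketched as a simple load-driven policy. The thresholds and state names below are purely illustrative assumptions for exposition, not anything specified by 3GPP:

```python
def radio_power_state(load: float) -> str:
    """Map instantaneous cell load (0.0-1.0) to a power-saving state.

    Illustrative thresholds only: a real network would derive these from
    measured traffic, QoS targets and device feedback.
    """
    if load < 0.05:
        return "deep_sleep"      # shut down idle carriers/antenna branches
    if load < 0.30:
        return "reduced_power"   # transmit less power, fewer antenna ports
    return "full_power"

assert radio_power_state(0.01) == "deep_sleep"
assert radio_power_state(0.20) == "reduced_power"
assert radio_power_state(0.80) == "full_power"
```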

Setting The Stage For 6G

Although Release 19 will be the last release focused on 5G, it will also include some longer-term technologies that will become the foundation of 6G, thus setting the direction for Release 20. For example, Integrated Sensing and Communications (ISAC), which combines wireless communications with RF sensing, will enable a raft of new position-based use cases. Release 19 will study channel characteristics suitable for sensing various objects, including vehicles, UAVs and humans. Full duplex, another 6G technology, allows transmitters and receivers to operate simultaneously on the same frequency, potentially doubling network capacity. Release 19 will study sub-band full duplex, a variant of full duplex which will improve capacity and latency, particularly for the uplink. Release 19 will also include channel model studies for the upper mid-band spectrum (7-16GHz), which will be supported by “Giga-MIMO” in the 6G timeframe, in order to enable wide-area coverage in this higher band.

Whereas AI/ML is a key pillar of 5G Advanced, it will be a core foundational technology of 6G and will underpin the key features that will make 6G revolutionary. For example, 6G will start to move away from the traditional, model-driven approach to designing communication systems and transition towards a more data-driven design. Indeed, it is likely that the 6G air interface will be designed to be AI-native from the outset, signalling a paradigm change in the way communication systems are designed. An AI-native air interface could offer many benefits. For example, it could refine existing communication protocols by continuously learning and improving them, thereby enabling the air interface to be customized dynamically to suit local radio environments.

Analyst Viewpoint

Despite huge investments in 5G, network operators are still reliant on revenues from traditional voice and broadband data services and are struggling to increase ARPU. Clearly, operators will need to leverage the capabilities of 5G Advanced in order to realize the full potential of 5G.

Although Release 19 includes a focus on new 6G-oriented technologies, Counterpoint Research believes that 5G is currently only at the midpoint of its development. Over the next few years, 5G Advanced will offer a plethora of new features to improve device and network capabilities and lower OPEX. It will also enable innovative new use cases, allowing operators to generate new revenue streams. Together, this should enable operators to drive up ARPU from existing customers, reduce OPEX and acquire new B2B customers across several verticals.

However, 5G Advanced requires operators to deploy 5G SA cores across their networks. While around 92% of all 5G devices support 5G SA, only 21% of operators have started to invest in 5G SA, and of these, only 47 have commercially deployed 5G SA cores in their networks to date[1]. In the short term, operators urgently need to prioritize 5G SA core deployment in order to benefit fully from their 5G investments.

[1] Source: GSA, 5G Standalone, October 2023

This blog is sponsored by Qualcomm.

The post 3GPP’s Release 19 Continues 5G Advanced Standardization, Sets The Stage For 6G appeared first on Counterpoint.

Gareth Owen
Rapid Transition To L4 Automation Key To Successful 5G Network Monetization https://www.counterpointresearch.com/insights/rapid-transition-to-l4-automation-key-to-successful-5g-network-monetization/ Wed, 17 Jan 2024 06:33:44 +0000

The post Rapid Transition To L4 Automation Key To Successful 5G Network Monetization appeared first on Counterpoint.

Despite huge investments in 5G, network operators are still highly reliant on revenues from traditional voice and broadband data services and are struggling to increase ARPU. With 5G Advanced around the corner, they will need to continue investing heavily in new network infrastructure for many years to come, despite rising debt levels. As a result, the business models of many of these operators risk becoming unsustainable unless major changes are made.

Overcoming Operator Challenges

To survive financially and benefit from their 5G investments, operators need to develop new revenue streams while reducing OPEX. To achieve this, they need to radically transform the way they operate their networks. Instead of a fixed architecture, a fully flexible and agile service platform is required, capable of delivering a wide variety of services on demand. This means that networks need to be cloud-based, software-defined, highly programmable and ultimately completely automated. By making their networks agile, operators will be able to deliver a huge variety of services with user experiences and connectivity dynamically tailored to individual use cases, or even individual users. Over the next few years, Counterpoint Research believes that AI/ML-driven automation will play a critical role in facilitating this business model transformation, allowing operators to develop new revenue streams while significantly reducing OPEX.

Transition to L4 Automation

Autonomous networks are networks that can run with minimal (and ultimately zero, i.e. zero-touch) human intervention while leveraging technologies such as AI, machine learning and edge computing. Operators have already started the journey to automation, which they plan to implement in stages. For example, most Tier-1 operators have reached either Level 2 or Level 3 and many plan to reach Level 4 automation by the end of 2025/26.

Compared to L3 automation, L4 offers many new features and capabilities. With L3 networks, O&M updates are implemented into the network manually and run to gauge the network’s response. This typically involves multiple iterations. In contrast, L4 automation offers the capability of using a digital twin, i.e. a virtual or digital copy of the physical network. This is essentially a simulation environment which enables network changes to be run hundreds of times in isolation, enabling the optimum parameters to be identified before they are implemented into the physical network.
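The closed-loop search a digital twin enables can be sketched in a few lines. This is a deliberately simplified toy, not Huawei's RDTS: the twin is stood in for by a simple scoring function, and the tunable parameters (antenna downtilt, transmit power) are hypothetical examples of the kind of O&M settings an operator might optimize.

```python
import random

def twin_score(tilt_deg, tx_power_dbm):
    """Toy stand-in for one digital-twin simulation run: scores a
    candidate configuration. A real twin would replay live network
    state; here the (made-up) optimum is tilt=6 deg, power=40 dBm."""
    return -((tilt_deg - 6.0) ** 2) - 0.1 * (tx_power_dbm - 40.0) ** 2

def search_twin(n_trials=300, seed=0):
    """Run hundreds of isolated candidate evaluations against the
    twin and return the best parameter pair found, ready to be
    pushed to the physical network in a single change."""
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(n_trials):
        cand = (rng.uniform(0.0, 12.0), rng.uniform(30.0, 46.0))
        score = twin_score(*cand)
        if score > best_score:
            best, best_score = cand, score
    return best, best_score
```

The contrast with L3 is that all 300 evaluations happen in simulation; only the single winning configuration ever touches the live network.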

L4 automation also enables improved data collection processes allowing operators to have greater visibility into the network. For example, L4 offers the ability to collect more data from a base station compared to L3. L4 can also collect data more frequently. As a result, L4 automation can offer predictive and preventative capabilities, where potential faults are identified and rectified, thus ensuring that base stations are always online. Operators typically do not want to implement automation for everything, with most focusing on two processes: network deployment and fault monitoring and maintenance.
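As a concrete (and deliberately simplified) illustration of the predictive idea, the sketch below fits a linear trend to daily board-temperature telemetry and flags a proactive site visit if the board is projected to cross an overheating limit within a prediction window. The readings, the 85°C limit and the seven-day horizon are all illustrative values, not vendor figures.

```python
def days_to_threshold(temps, limit_c=85.0):
    """Fit a least-squares line to daily temperature readings and
    estimate how many days remain until the limit is crossed.
    Returns None if the trend is flat or cooling."""
    n = len(temps)
    mean_x = (n - 1) / 2.0
    mean_y = sum(temps) / n
    denom = sum((x - mean_x) ** 2 for x in range(n))
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in enumerate(temps)) / denom
    if slope <= 0:
        return None
    intercept = mean_y - slope * mean_x
    # Days beyond the last sample at which the fitted line hits the limit.
    return (limit_c - intercept) / slope - (n - 1)

def should_dispatch(temps, horizon_days=7.0):
    """Flag proactive maintenance if overheating is predicted
    within the horizon (default seven days)."""
    eta = days_to_threshold(temps)
    return eta is not None and eta <= horizon_days
```

A board warming by 1°C per day from 75°C is flagged (projected to hit the limit in six days), while a stable board is left alone; the same pattern generalizes to optical-module and power-supply telemetry.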

Huawei’s RAN Digital Twin

Huawei has developed a RAN Digital Twin System (RDTS) which is used in conjunction with its IntelligentRAN architecture to leverage the new capabilities of L4 automation. Central to its operation are the following four new innovative features:

  • Improved Data Collection – it typically takes around 15 to 30 minutes to collect historical data on a conventional mobile network. By implementing L4, Huawei is able to do this in around 10-200ms, i.e. effectively in real-time.
  • Predictive O&M Capabilities – maintenance costs can be significantly reduced by using RDTS. For example, RDTS enables operators to predict equipment failures caused by overheating boards and to detect faults in optical modules and back-up power supplies up to seven days in advance. Faults can thus be rectified before they disrupt network operations. In contrast, a conventional network may require 4+ hours of post-processing after a fault is rectified in the field.
  • Transition from KPIs to SLAs (Service Level Agreements) – using the RDTS enables operators to offer SLAs to their customers, resulting in new business opportunities and higher revenues.
  • Single To Multiple Target Optimization – conventional networks can only handle single target optimization, for example, energy efficiency. However, by using the RDTS, Huawei’s IntelligentRAN is able to perform multiple target optimization, for example, simultaneously optimizing energy efficiency and user experience.
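The multi-target case can be made concrete with a small sketch: instead of maximizing a single metric, the controller keeps only Pareto-efficient configurations, i.e. those where neither target can be improved without degrading the other. The candidate configurations and metric values below are invented for illustration and bear no relation to IntelligentRAN's internals.

```python
def pareto_front(candidates):
    """Keep configurations that are not dominated: no other candidate
    is at least as good on both targets (lower energy, higher
    experience) and strictly better on one."""
    def dominates(a, b):
        return (a["energy_kw"] <= b["energy_kw"]
                and a["experience"] >= b["experience"]
                and (a["energy_kw"] < b["energy_kw"]
                     or a["experience"] > b["experience"]))
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates)]

# Hypothetical cell configurations scored on the two targets.
configs = [
    {"name": "A", "energy_kw": 10.0, "experience": 0.90},
    {"name": "B", "energy_kw": 8.0,  "experience": 0.70},
    {"name": "C", "energy_kw": 12.0, "experience": 0.80},  # dominated by A
    {"name": "D", "energy_kw": 9.0,  "experience": 0.90},  # dominates A
]
```

Here the front keeps B (lowest energy) and D (same experience as A at lower energy); a single-target optimizer would have discarded one of the two.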

IntelligentRAN L4 i-series solutions

In early 2023, Huawei launched its 3-layer, hierarchical IntelligentRAN architecture, which has been deployed to date by more than 30 operators worldwide. IntelligentRAN enables the key capabilities of L4 autonomous networks to be realized. These include intent-driven networking, intelligent sensing, multi-target decision optimization and proactive/predictive O&M. At its recent Global Mobile Broadband Forum in Dubai, Huawei announced three additional L4 i-series solutions:

  • iLiveStreaming – by means of dynamic allocation of time, frequency and space resources coupled with intelligent SLA trend prediction, Huawei is able to offer deterministic experience assurance delivering reliability higher than 95% for uplink livestreaming.
  • iKeyEvent – using spatiotemporal traffic prediction technology, iKeyEvent enables network risks to be identified, predicted and monitored at big events such as major sports meetings. Emergency plans are then generated automatically, and the control loop is closed within seconds.
  • iPowerStar – uses intelligent algorithms to manage end-to-end energy consumption across multiple network channels, including the time, space, frequency and power domains. Multi-target optimization helps operators minimize energy consumption without compromising on network performance or user experience. Huawei claims that iPowerStar reduces carbon emissions by 30%.

Operator Examples

In recent months, Huawei has demonstrated the benefits of using the RDTS system operating within its IntelligentRAN architecture with several of its operator partners. For example:

  • In the Middle East – Huawei demonstrated how the RDTS reduces the number of iterations required on a live network – typically 20+ over 20 days with conventional O&M – to just one with an RDTS-enabled network. Huawei claims that this enables operators to deploy new features and services ten times faster than with conventional O&M.
  • In China – by using Huawei’s RDTS in conjunction with its IntelligentRAN system, a Chinese operator was able to reduce the number of O&M site visits from 29,000 to 780 visits per year. According to Huawei, this reduces maintenance inspection and passive analysis costs by up to 90%.
  • In Europe – guaranteeing the performance of a live streaming service is very challenging. By implementing Huawei’s Live Streaming Solution, a European operator was able to increase its SLA assurance from 50% to 90%, enabling it to generate $200 per hour per package – a five-fold gain over the original $40 live streaming package. This certainty of guaranteed service experience will open up new business opportunities for operators.

Viewpoint

The business models of many network operators risk becoming unsustainable unless they fully embrace automation. L4 autonomous networks will allow operators to deliver an experience that is far better than with previous generations of mobile networks. With L4, intent-driven networking replaces policy-based network management, deterministic service assurance replaces best-efforts approaches, while proactive O&M (leveraging predictive/preventive capabilities) is used instead of responsive O&M. Together, these new capabilities will enable operators to significantly reduce OPEX as well as generate new revenues.

However, there are still challenges ahead. Standards – or specifically, a lack of collaboration among standards bodies and open-source groups – are perhaps the biggest challenge. In particular, the industry needs to define data standards and formats. Another challenge is transforming company culture and skills, for example, with respect to network operations personnel. Linked with culture and skills is the lack of a common understanding of key technologies: for example, is there a precise, industry-agreed definition of intent-driven management? The development of open APIs will also be very important. Collaboration with industry and ecosystem partners, including device manufacturers, equipment suppliers and developers, will be essential to bring the economies of scale needed to benefit all players.

This blog is sponsored by Huawei.

Gareth Owen
Selected Highlights from ITU’s WRC-23 Meeting In Dubai https://www.counterpointresearch.com/insights/selected-highlights-from-itus-wrc-23-meeting-in-dubai/ Thu, 28 Dec 2023 18:54:44 +0000

The post Selected Highlights from ITU’s WRC-23 Meeting In Dubai appeared first on Counterpoint.

Held every four years, the ITU’s World Radio Conference (WRC) came to an end last week. During the 4-week long conference, 43 new resolutions were approved, 56 existing ones were revised while 33 resolutions were suppressed. Key highlights included:

  • 4G/5G Spectrum – WRC-23 identified spectrum for 4G and 5G which will be crucial for expanding broadband connectivity and developing mobile services. The new spectrum includes the 3,300-3,400MHz, 3,600-3,800MHz, 4,800-4,990MHz and 6,425-7,125MHz frequency bands in various countries and regions. In particular, the decision to set aside the 6,425-7,125MHz band for licensed mobile operations and to harmonize this band is very important for the mobile community. The 6GHz band is the only remaining mid-band spectrum currently available to respond to data traffic growth in the 5G Advanced era and is critical for manufacturers of the 6GHz equipment ecosystem. However, a compromise was adopted in ITU Regions 1 and 3, which means that the 6,425-7,125MHz band can also be used by Wi-Fi; individual administrations will have the freedom to decide what happens in this frequency range.
  • HIBS spectrum – WRC-23 also identified the 2GHz and 2.6GHz bands for using high-altitude platform stations as IMT base stations (HIBS) and established regulations for their operations. This technology offers a new platform to provide mobile broadband with minimal infrastructure using the same frequencies and devices as IMT mobile networks. HIBS can contribute to bridging the digital divide in remote and rural areas and maintain connectivity during disasters.
  • Low-bands – WRC-23 also defined mobile use of more low-band spectrum in the 470-694MHz band in the EMEA region (Europe, the Middle East and Africa). In the UK, Ofcom has already released spectrum down to the 700MHz (694-790MHz) band for use by mobile networks by shifting Digital Terrestrial TV (DTV) services into the 600MHz band and lower (starting around 470MHz). DTV will be around for a number of years yet, but once it ends (i.e. once viewers have shifted to broadband-based TV), it looks increasingly likely that these bands will be used for mobile.
  • 6G Spectrum – prior to the start of WRC-23, the ITU adopted a resolution intended to guide the development of a 6G standard. During the conference, regulators agreed to study the 7-8.5GHz band for 6G in time for the next ITU conference in 2027. That spectrum band aligns with proposals from major incumbents for early 6G operations at spectrum bands between 7GHz and 20GHz.

The full version of this insight report, including a complete set of analyst takeaways, is published in the following report, available to clients of Counterpoint Research’s 5G Network Infrastructure Service (5GNI).

Report: Highlights from ITU’s WRC-23 Meeting In Dubai

Table of Contents

  • Key Highlights
  • Mobile Agreements
  • 5G Spectrum
  • 6G Spectrum
  • Terrestrial Broadcast Agreements
  • Satellite Agreements
  • Other Agreements
  • Analyst Viewpoint

Gareth Owen
3GPP 5G NTN Standards Set To Dramatically Boost Mobile Satellite Addressable Market https://www.counterpointresearch.com/insights/5g-ntn/ Wed, 04 Oct 2023 12:56:12 +0000

The post 3GPP 5G NTN Standards Set To Dramatically Boost Mobile Satellite Addressable Market appeared first on Counterpoint.

Satellite communications is back in the limelight following the launch of Apple’s direct Satellite-to-Phone service earlier this year. Partnering with satellite operator Globalstar, the service provides SOS messaging for iPhone 14/15 users. Recently, the service was expanded to include roadside assistance via satellite as well. A host of similar services and partnerships have been announced between satellite operators and chip vendors/cellular operators during the past few months, including Inmarsat with Mediatek, Iridium with Qualcomm and most recently SpaceX with KDDI.

In addition to the incumbent operators, there are a number of new players such as AST SpaceMobile and Lynk Global. AST SpaceMobile has partnered with Rakuten Mobile and currently has one operational satellite in orbit. It has been granted preliminary experimental licenses in Japan and in the US. Meanwhile, Lynk launched a limited commercial “store-and-forward” service using three satellites in April. Both companies plan to launch full constellations over the next few years.

The Mobile Satellite Services (MSS) market has historically been a niche market, primarily because MSS is based on proprietary technologies. However, 3GPP is working with the satellite industry on a globally standardized solution, called 5G Non-Terrestrial Networks (NTN). 5G NTN will enable seamless roaming between terrestrial and satellite networks using largely standard cellular devices, i.e., eliminating the need for proprietary terminals and fragmented satellite constellations. This could dramatically increase the addressable market for mobile satellite services.

5G Non-Terrestrial Networks (NTN)

With the emergence of new Satellite-to-Phone services, there is now a widespread industry push to deploy NTN-based satellite networks as this would benefit the satellite industry and the wider mobile industry. However, 3GPP has been working on NTN for some time. For example, there has been an ongoing study on 5G NTN since 3GPP Release 15, while in 2022, 3GPP introduced two parallel workstreams in its Release 17 specifications addressing 5G satellite-based mobile broadband and low-complexity IoT use cases:

  • NR-NTN (New Radio NTN) – adapts the 5G NR framework for satellite communications, providing direct mobile broadband services as well as voice using standard apps. This will enable 5G phones operating on dedicated 5G NTN frequencies and existing sub-7GHz terrestrial frequencies to link directly with Release-17 compatible satellites. Release 17 also includes enhancements for satellite backhaul and the inclusion of 80MHz MSS uplink spectrum in L-band (1-2GHz) plus a similar amount of downlink spectrum in S-band (2-4GHz).
  • IoT-NTN – provides satellite support for low-complexity eMTC and NB-IoT devices, which expands the coverage for key use cases such as worldwide asset tracking (for example, air freight, shipping containers and other assets outside cellular coverage). IoT-NTN is designed for low data rate applications such as the transmission of sensor data and text messages.

Release 17 established the NR-NTN and IoT-NTN standards while the upcoming 5G Advanced Release 18 will introduce new capabilities, coverage/mobility enhancements and support for expanded spectrum bands. For example, there are plans to extend the NR-NTN frequency range beyond 10GHz by adding Fixed Satellite Services (FSS) spectrum in the 17.7-20.2GHz band for downlink and 27.5-30.0GHz for uplink.

Satellite IoT

Traditional mobile satellite operators such as Inmarsat, Iridium and Globalstar have been offering M2M/IoT type services for many years targeting various industry verticals, ranging from agriculture, construction and oil and gas to maritime, transportation and utilities. Some of the traditional FSS players, such as AsiaSat, Eutelsat and Intelsat, also offer M2M/IoT services over Ku or Ka bands.

Another player with a long history in satellite communications is San Diego-based chip vendor Qualcomm. The company was a founding partner and key technology provider in Globalstar and also developed the satellite-based asset tracking service OmniTRACS. Qualcomm is still heavily involved in the satcom business and earlier this year announced Snapdragon Satellite, its Satellite-to-Phone service. More recently, it announced the availability of two Release 17-compatible GEO/GSO IoT-NTN satellite modems, launched in collaboration with US-based Skylo, an NTN connectivity service provider, that enable cellular devices to connect to existing, proprietary satellite networks:

  • Qualcomm 212S Modem – a satellite-only IoT modem designed to enable stationary sensing and monitoring IoT devices to communicate with NTN-based satellites. The chipset is an ultra-low power device and can be powered from solar panels or supercapacitors.
  • Qualcomm 9205S Modem – enables IoT devices to connect to both terrestrial cellular and satellite networks and has integrated GNSS to provide location data. Typical applications include industrial applications requiring always-on, hybrid terrestrial and satellite connectivity for tracking assets such as agricultural machinery, shipping containers, livestock, etc.

Both devices are designed for low-power, cost-optimized applications and support the Qualcomm Aware cloud platform, which provides real-time asset tracking and device management for IoT in off-grid, remote areas.

Most of the major chip vendors, such as MediaTek, Qualcomm and Sony Semiconductors, have already developed Release 17 compatible chipsets. This means that satellite-compliant 5G IoT devices could be available commercially by the end of 2023 and should become commonplace in 2024.

NTN Satellite Operators

Only a few NTN-based satellites have been launched to date. A noteworthy example is Spanish LEO operator Sateliot, the first company to deploy satellites complying with 3GPP’s Release 17 IoT-NTN standard. Sateliot currently has two satellites in orbit and recently carried out a successful roaming test between its satellite network and Telefonica’s 5G terrestrial network using an IoT device with a standard SIM card. Sateliot plans to start commercial activities in 2024. Ultimately, the company hopes to launch a total of 250 nanosatellites, which will enable it to offer global 5G IoT-NTN services.

No satellite operator presently supports 3GPP’s Release 17 NR-NTN standard for voice and data. Although AST SpaceMobile and Lynk Global have demonstrated two-way satellite-to-5G terrestrial communications, neither currently uses the NR-NTN standard, though both plan to test it.

Satellite Déjà Vu?

Over two decades ago, the mobile satellite industry invested billions to launch a number of ground-breaking LEO-based voice and narrowband data constellations. Only a handful survived and even fewer have prospered. Will history repeat itself?

Although there are some parallels, Counterpoint Research believes that there are also some important differences this time. During the past 20 years, satellites have become much smaller, more capable and less expensive. Some of these satellites are based on CubeSat technology, which uses commercial, off-the-shelf (COTS) components, thus drastically reducing costs while accelerating time to market. This is particularly relevant to nanosatellites, many of which are being developed to target the IoT-NTN market. Another important difference is that launch costs have decreased significantly due to the entry of new private launch companies, notably SpaceX.

Perhaps the most important differentiator between current and next-generation satellites, however, is that the latter will be based on 3GPP’s NTN standards. Historically, proprietary satellite systems have resulted in a limited range of low volume and hence expensive end user devices – a significant barrier to growth. As with 5G (and 4G before it), a common set of cellular-based standards will enable the mobile satellite industry – plus the vertical markets it serves – to benefit from the vast economies of scale of the cellular device ecosystem. This should result in higher volume chipset production, more affordable devices and services and hence a much larger market of end users. For instance, Sateliot estimates that the cost of satellite IoT connectivity will drop from hundreds of dollars per device per month to less than $10 per device per month.

Furthermore, the adoption of 5G NTN and its integration with terrestrial 5G will result in a truly seamless global telecoms network, with increased space segment capacity, resulting in more users benefiting from higher data rate services. This will lead to more applications and use cases thus creating more value-add for vertical market users. Clearly, this could lead to a significant expansion of the mobile satellite services market globally.

Gareth Owen
NG-LLS Fronthaul Interface – A Pivotal Moment For The 5G RAN Ecosystem? https://www.counterpointresearch.com/insights/ng-lls/ Mon, 25 Sep 2023 03:34:30 +0000

The post NG-LLS Fronthaul Interface – A Pivotal Moment For The 5G RAN Ecosystem? appeared first on Counterpoint.


One of the major challenges to the adoption of open RAN in 5G networks in dense, urban environments is its sub-optimal support for massive MIMO radios. While there are several reasons behind this performance deficit, a key reason is that the O-RAN Alliance 7.2x open fronthaul specification was not originally designed to accommodate massive MIMO radio systems. Recently, the O-RAN Alliance announced a new fronthaul interface specification designed specifically for use with massive MIMO radio systems in dense, high-traffic environments.

This Technology Report provides an objective analysis of the O-RAN Alliance’s Next Generation Lower-Layer Split (LLS) and discusses the implications of the new interface on the adoption of open RAN massive MIMO radios.

Key Takeaway 1: Impact of Incumbents

With the availability of the new NG-LLS fronthaul split, it “appears” that the open RAN community has united around a single specification which will enable open RAN to be adopted in high-traffic urban regions. This should be welcome news, as it means that operators will be able to use open RAN technology across all parts of their networks, from rural deployments to dense, high-traffic urban environments. However, the NG-LLS standard has brought major incumbents such as Ericsson and Nokia into the open RAN limelight. While this brings scale and credibility to open RAN in the high-end 5G market, it also raises questions about open RAN’s goal of diversifying the radio supply chain and lowering barriers to smaller vendors.

Key Takeaway 2: Massive MIMO Use Cases Suitable for Split 7.2b

Although Split 7.2b has limitations when deployed in dense, high-traffic urban networks, Counterpoint Research believes that it will continue to be a good choice for other mMIMO use cases: for example, those with moderate traffic loads, larger cell sizes and low end-user mobility, such as Fixed Wireless Access applications. Radios based on Split 7.2b will also benefit from reduced complexity and lower costs compared to NG-LLS-based radios. In future, the application of advanced AI/ML algorithms in the DU may narrow the performance differential between Split 7.2b and NG-LLS for some use cases.

Gareth Owen
5G Advanced and Wireless AI Set To Transform Cellular Networks, Unlocking True Potential https://www.counterpointresearch.com/insights/5g-advanced-wireless-ai/ Mon, 31 Jul 2023 06:10:39 +0000

The post 5G Advanced and Wireless AI Set To Transform Cellular Networks, Unlocking True Potential appeared first on Counterpoint.

The recent surge in interest in generative AI highlights the critical role that AI will play in future wireless systems. With the transition to 5G, wireless systems have become increasingly complex and more challenging to manage, forcing the wireless industry to think beyond traditional rules-based design methods.

5G Advanced will expand the role of wireless AI across 5G networks introducing new, innovative AI applications that will enhance the design and operation of networks and devices over the next three to five years. Indeed, wireless AI is set to become a key pillar of 5G Advanced and will play a critical role in the end-to-end (E2E) design and optimization of wireless systems. In the case of 6G, wireless AI will become native and all-pervasive, operating autonomously between devices and networks and across all protocols and network layers.

E2E Systems Optimization

AI has already been used in smartphones and other devices for several years and is now increasingly being used in the network. However, AI is currently implemented independently, i.e. either on the device or in the network. As a result, E2E systems performance optimization across devices and network has not been fully realized yet. One of the reasons for this is that on-device AI training has not been possible until recently.

On-device AI will play a key role in improving the E2E optimization of 5G networks, bringing important benefits for operators and users, as well as overcoming key challenges. Firstly, on-device AI enables processing to be distributed over millions of devices thus harnessing the aggregated computational power of all these devices. Secondly, it enables AI model learning to be customized to a particular user’s personalized data. Finally, this personalized data stays local on the device and is not shared with the cloud. This improves reliability and alleviates data sovereignty concerns. On-device AI will not be limited to just smartphones but will be implemented across all kinds of devices from consumer devices to sensors and a plethora of industrial equipment.

New AI-native processors are being developed to implement on-device AI and other AI-based applications. A good example is Qualcomm’s new Snapdragon X75 5G modem-RF chip, which has a dedicated hardware tensor accelerator. Using Qualcomm’s own AI implementation, this Gen 2 AI processor boosts the X75’s AI performance more than 2.5 times compared to the previous Gen 1 design.

While on-device AI will play a key role in improving the E2E performance of 5G networks, overall systems optimization is limited when AI is implemented independently. To enable true E2E performance optimization, AI training and inference need to be done on a systems-wide basis, i.e. collaboratively across both the network and the devices. Making this a reality in wireless system design requires not only AI know-how but also deep wireless domain knowledge. This so-called cross-node AI is a key focus of 5G Advanced, with a number of use cases being defined in 3GPP’s Release 18 specification and further use cases expected to be added in later releases.

Wireless AI: 5G Advanced Release 18 Use Cases

3GPP’s Release 18 is the starting point for more extensive use of wireless AI expected in 6G. Three use cases have been prioritized for study in this release:

  • Use of cross-node Machine Learning (ML) to dynamically adapt the Channel State Information (CSI) feedback mechanism between a base station and a device, thus enabling coordinated performance optimization between networks and devices.
  • Use of ML to enable intelligent beam management at both the base station and device, thus improving usable network capacity and device battery life.
  • Use of ML to enhance positioning accuracy of devices in both indoor and outdoor environments, including both direct and ML-assisted positioning.

Channel State Feedback:

CSI is used to determine the propagation characteristics of the communication link between a base station and a user device and describes how this propagation is affected by the local radio environment. Accurate CSI data is essential to provide reliable communications. With traditional model-based CSI, the user device compresses the downlink CSI data and feeds the compressed data back to the base station. Despite this compression, the signalling overhead can still be significant, particularly in the case of massive MIMO radios, reducing the device’s uplink capacity and adversely affecting its battery life.

An alternative approach is to use AI to track the various parameters of the communications link. In contrast to model-based CSI, a data-driven air interface can dynamically learn from its environment to improve performance and efficiency. AI-based channel estimation thus overcomes many of the limitations of model-based CSI feedback techniques, resulting in higher accuracy and hence improved link performance. This is particularly effective at the edges of a cell.
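To illustrate the idea (this is not Qualcomm’s actual implementation), the sketch below compresses toy CSI vectors with a learned linear encoder on the device side and reconstructs them at the base station; all dimensions, data and the PCA-style "learning" step are illustrative stand-ins for a trained neural encoder/decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy CSI dataset: 128-dimensional feedback vectors with low-rank spatial
# structure standing in for antenna correlation (illustrative only).
latent = rng.standard_normal((1000, 8))
csi = latent @ rng.standard_normal((8, 128)) + 0.05 * rng.standard_normal((1000, 128))

# "Learned" encoder/decoder: project onto the top-k principal components,
# mimicking how an AI encoder compresses CSI before uplink feedback.
k = 16                                   # 8x smaller feedback payload
mean = csi.mean(axis=0)
_, _, vt = np.linalg.svd(csi - mean, full_matrices=False)
encoder, decoder = vt[:k].T, vt[:k]      # device side, base-station side

compressed = (csi - mean) @ encoder      # what the device actually feeds back
recovered = compressed @ decoder + mean  # base-station reconstruction

ratio = csi.shape[1] / k
err = np.linalg.norm(csi - recovered) / np.linalg.norm(csi)
print(f"feedback payload: {ratio:.0f}x smaller, relative error: {err:.3f}")
```

Because realistic CSI is highly correlated across antennas, even this crude learned compression preserves the channel information while cutting the uplink signalling overhead substantially.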

Implementing ML-based CSI feedback, however, can be challenging in a system with multiple vendors. To overcome this, Qualcomm has developed a sequential training technique which avoids the need to share models across vendors. With this approach, the device-side model is first trained using the device’s own data. The same data is then used to train the network-side model, eliminating the need to share proprietary neural network models across vendors. Qualcomm has successfully demonstrated sequential training on massive MIMO radios at its 3.5GHz test network in San Diego (Exhibit 1).

Exhibit 1: Realizing system capacity gain even in challenging non-LOS communication (© Qualcomm Inc.)
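A minimal sketch of the sequential idea, assuming a toy dataset and simple linear models in place of the proprietary neural networks: the device vendor trains its encoder alone, and only (CSI, code) training pairs — never the model itself — are handed over for the network vendor to fit its own decoder against.

```python
import numpy as np

rng = np.random.default_rng(1)

# Shared toy CSI dataset with low-rank spatial structure (illustrative).
csi = rng.standard_normal((500, 8)) @ rng.standard_normal((8, 64))

# Step 1: the device vendor trains its encoder alone (here: a PCA projection).
mean = csi.mean(axis=0)
_, _, vt = np.linalg.svd(csi - mean, full_matrices=False)
device_encoder = vt[:8].T            # stays proprietary to the device vendor

# Step 2: only (CSI, code) training PAIRS are shared -- not the encoder.
codes = (csi - mean) @ device_encoder

# Step 3: the network vendor fits its own decoder to those pairs
# (least squares), never seeing the device-side model.
net_decoder, *_ = np.linalg.lstsq(codes, csi - mean, rcond=None)

recovered = codes @ net_decoder + mean
err = np.linalg.norm(csi - recovered) / np.linalg.norm(csi)
print(f"relative reconstruction error: {err:.3f}")
```

The decoder trained this way matches the encoder's compression without either vendor exposing its internal model, which is the essence of the sequential approach described above.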

AI-based Millimetre Wave Beam Management:

The second use case involves the use of ML to improve beam prediction on millimetre wave radios. Rather than continuously measuring all beams, ML is used to intelligently select the most appropriate beams to measure, as and when needed. An ML algorithm then predicts future beams by interpolating between the selected beams, i.e. without the need to measure all beams all the time. This is done at both the device and the base station. As with CSI feedback, this improves network throughput and reduces power consumption.

Qualcomm recently demonstrated the use of ML-based algorithms on its 28GHz massive MIMO test network and showed that the performance of the AI-based system was equivalent to a base case network set-up where all beams are measured.
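The beam-prediction principle can be sketched as follows; the beam-quality profile, the measurement pattern and the simple interpolation below are illustrative stand-ins for the learned predictor, not Qualcomm's algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy beam-quality profile: RSRP (dBm) across 64 candidate millimetre-wave
# beams, smooth in the angular domain (illustrative shape, not field data).
angles = np.linspace(0, np.pi, 64)
rsrp = -80 + 15 * np.exp(-((angles - 1.1) ** 2) / 0.1) + rng.normal(0, 0.3, 64)

# Measure only every 4th beam instead of sweeping all 64 ...
measured_idx = np.arange(0, 64, 4)

# ... and predict the unmeasured beams by interpolating between the sparse
# measurements, standing in for the ML predictor described in the text.
predicted = np.interp(angles, angles[measured_idx], rsrp[measured_idx])

best_true = int(np.argmax(rsrp))
best_pred = int(np.argmax(predicted))
print(f"measured {len(measured_idx)}/64 beams; "
      f"true best beam={best_true}, predicted best beam={best_pred}")
```

Even with only a quarter of the beams measured, the predicted best beam lands close to the true one, which is why the approach saves measurement overhead and device power with little throughput penalty.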

Precise Positioning:

The third use case involves the use of ML to enable precise positioning. Qualcomm has demonstrated the use of multi-cell round-trip time (RTT) and angle-of-arrival (AoA)-based positioning in an outdoor network in San Diego. The vendor also demonstrated how ML-based positioning with RF fingerprinting can be used to overcome challenging non-line-of-sight channel conditions in indoor industrial private networks.
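The round-trip-time part of this use case reduces to classic multilateration: each cell's RTT yields a range estimate, and the device position is solved by least squares. A toy sketch with made-up coordinates:

```python
import numpy as np

# Known base-station coordinates in metres (illustrative values only).
bs = np.array([[0.0, 0.0], [400.0, 0.0], [0.0, 400.0], [400.0, 400.0]])
true_pos = np.array([120.0, 250.0])

# Multi-cell RTT gives one range estimate per cell (distance = c * RTT / 2);
# here we use the exact geometric ranges for clarity.
ranges = np.linalg.norm(bs - true_pos, axis=1)

# Linearize the range equations against the first cell and solve by
# least squares -- the standard multilateration step behind RTT positioning.
A = 2 * (bs[1:] - bs[0])
b = (ranges[0] ** 2 - ranges[1:] ** 2
     + np.sum(bs[1:] ** 2, axis=1) - np.sum(bs[0] ** 2))
est, *_ = np.linalg.lstsq(A, b, rcond=None)
print(f"estimated position: {est.round(2)}")
```

In real deployments the ranges are noisy and multipath-corrupted, which is where the ML assistance (and RF fingerprinting in non-line-of-sight conditions) improves on this purely geometric solve.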

An AI-Native 6G Air Interface

6G will need to deliver a significant leap in performance and spectrum efficiency compared to 5G if it is to deliver even faster data rates and more capacity while enabling new 6G use cases. To do this, the 6G air interface will need to accommodate higher-order Giga MIMO radios capable of operating in the upper mid-band spectrum (7-16GHz), support wider bandwidths in new sub-THz 6G bands (100GHz+) as well as on existing 5G bands. In addition, 6G will need to accommodate a far broader range of devices and services plus support continuous innovation in air interface design.

To meet these requirements, the 6G air interface must be designed to be AI native from the outset, i.e. 6G will largely move away from the traditional, model-driven approach of designing communications networks and transition toward a data-driven design, in which ML is integrated across all protocols and layers with distributed learning and inference implemented across devices and networks.

This will be a truly disruptive change to the way communication systems have been designed in the past but will offer many benefits. For example, through self-learning, an AI-native air interface design will be able to support continuous performance improvements, where both sides of the air interface — the network and device — can dynamically adapt to their surroundings and optimize operations based on local conditions.

5G Advanced wireless AI/ML will be the foundation for much more AI innovation in 6G and will result in many new network capabilities. For instance, the ability of the 6G AI-native air interface to refine existing communication protocols and learn new ones, coupled with the ability to offer E2E network optimization, will result in wireless networks that can be dynamically customized to suit specific deployment scenarios, radio environments and use cases. This will be a boon for operators, enabling them to automatically adapt their networks to target a range of applications, including various niche and vertical-specific markets.


The post 5G Advanced and Wireless AI Set To Transform Cellular Networks, Unlocking True Potential appeared first on Counterpoint.

]]>
Gareth Owen
NVIDIA and Softbank Developing AI-Based 5G MEC Telco Network https://www.counterpointresearch.com/insights/nvidia/ Fri, 30 Jun 2023 06:34:02 +0000 http://cpr.presscat.kr/insights/nvidia/ NVIDIA and Softbank recently announced that they had developed a dual-purpose AI-driven 5G MEC and vRAN distributed platform based on NVIDIA’s new GH200 Grace Hopper superchip.  The two partners intend to deploy a network of regional data centres across Japan later this year to capitalize on the demand for accelerated computing and generative AI services. The […]

The post NVIDIA and Softbank Developing AI-Based 5G MEC Telco Network appeared first on Counterpoint.

]]>
NVIDIA and Softbank recently announced that they had developed a dual-purpose AI-driven 5G MEC and vRAN distributed platform based on NVIDIA’s new GH200 Grace Hopper superchip. The two partners intend to deploy a network of regional data centres across Japan later this year to capitalize on the demand for accelerated computing and generative AI services. The shared multi-tenant platform will also offer a range of 5G vRAN applications, and Softbank is creating 5G applications for autonomous driving, AI factories, augmented and virtual reality, computer vision and digital twins.

Key Takeaway No. 1: Platform Limitations

Softbank is offering a dual-purpose platform where the main application is AI compute via a high-performance edge analytics platform to capitalise on the expected surge in demand for AI processing capacity. Although the company is also developing a range of 5G applications, the AI and 5G workloads will be offered simultaneously. Counterpoint Research believes that the platform is unlikely to be feasible for vRAN workloads alone and understands that the two partners are not targeting this market.

Key Takeaway No. 2: Leveraging GPU Usage in the RAN

NVIDIA is planning to leverage its GPU processing capacity in the RAN in a number of ways, for example, to improve spectral efficiency. One way of doing this is to apply AI to optimise channel estimation feedback between a user device and a base station – a compute-intensive problem with mMIMO radios. Using AI to compress receiver feedback data would reduce signalling overhead, resulting in a useful increase in uplink channel capacity. This could be particularly effective at the edges of a cell. Using its GPUs in this way, NVIDIA claims that it can boost gain for cell edge users by 14-17 dB. Other applications include using AI to optimise beamforming management in millimetre wave mMIMO radios as well as to accelerate Layer 2 scheduling.

The full version of this insight report, including a complete set of Key Takeaways, is published in the following report, available to clients of Counterpoint Research’s 5G Network Infrastructure Service (5GNI).

New Report: NVIDIA and Softbank Join Forces To Deploy AI-Based 5G MEC Telco Network Across Japan


Table of Contents:

Snapshot
Key Highlights
– 5G MEC Telco Network
– Grace Hopper Superchip
– Leveraging Software Resources
– Performance Details
– Use Case and Deployment Options
– Key Partners
– Competitors
Analyst Viewpoint
– Platform Limitations
– Benefits of RAN-in-the-Cloud
– Eliminating RAN hardware dependency
– The Intel vs ARM battle
– Leveraging GPUs in the RAN
– A Crowded Market


Related Reports and Blogs

The post NVIDIA and Softbank Developing AI-Based 5G MEC Telco Network appeared first on Counterpoint.

]]>
Gareth Owen
Open RAN Networks – Layer 1 Acceleration Strategy Will Be Key To Operator Success https://www.counterpointresearch.com/insights/open-ran-networks-layer-1-acceleration-strategy-will-be-key-to-operator-success/ Wed, 07 Jun 2023 04:40:50 +0000 http://cpr.presscat.kr/insights/open-ran-networks-layer-1-acceleration-strategy-will-be-key-to-operator-success/ In-line, PCIe-based accelerator cards will be essential to process latency-sensitive, Layer 1 workloads in COTS-based massive MIMO open RAN networks. Accelerator card energy efficiency will be a key differentiator among vendors as could the degree of Layer 1 stack openness. Virtually all open RAN deployments to date are based on Intel’s FlexRAN reference software architecture […]

The post Open RAN Networks – Layer 1 Acceleration Strategy Will Be Key To Operator Success appeared first on Counterpoint.

]]>
  • In-line, PCIe-based accelerator cards will be essential to process latency-sensitive, Layer 1 workloads in COTS-based massive MIMO open RAN networks.
  • Accelerator card energy efficiency will be a key differentiator among vendors as could the degree of Layer 1 stack openness.
  • Virtually all open RAN deployments to date are based on Intel’s FlexRAN reference software architecture running on x86-based COTS servers. While this configuration is adequate for low to medium traffic scenarios, it is not sufficient for high-traffic use cases in dense urban areas involving the use of massive MIMO radios.

    Solving the massive MIMO performance deficit is a major challenge delaying the transition to open RAN. This challenge must be resolved before mainstream adoption of open RAN-based massive MIMO radios can occur. However, this will require a new breed of merchant silicon solutions designed to efficiently process latency-sensitive Layer-1 baseband workloads. Last year, a number of vendors announced alternatives to Intel’s FlexRAN platform based on ASICs, GPUs as well as RISC-V architectures and several of these vendors showcased their latest products recently at MWC-23 in Barcelona (Exhibit 1).

    Look Aside Versus In-line Acceleration

    Open RAN COTS platforms typically use PCIe-based accelerator cards to process the compute-intensive Layer 1 workloads. There are essentially two types of architecture designs: look-aside and in-line:

    • Look-aside accelerators offload a small subset of the 5G Layer 1 functions, for example, forward error correction, from the host CPU to an external FPGA or eASIC-based accelerator. However, this offloading adds latency and degrades system performance as the compute is done offline.

    An alternative to using PCIe cards is to integrate the look-aside accelerator and CPU in a SoC. This eliminates the need for a separate PCIe card. Look-aside acceleration is used by AMD and Intel, including in the latter’s vRAN Boost integrated SoC design.

    • With the in-line accelerator architecture, all the Layer 1 data passes directly through the accelerator and is processed in real-time – a critical requirement for Layer 1 workloads. This processing is done by other types of processors, for example, ARM or RISC-V based DSPs, which results in a more energy-efficient implementation and reduces the need for additional CPUs with a high number of cores. For operators, this results in significant CAPEX and OPEX savings, particularly in the case of massive MIMO base stations, the most demanding of all 5G network deployments.

    In-line accelerators also offer important scalability benefits as operators can add extra accelerator cards (up to six cards in a standard telco grade server) as more L1 capacity is required. In contrast, the look-aside architecture involves adding expensive, power-hungry COTS CPUs (+FPGA/eASIC cards) to meet capacity increases. In-line acceleration is used by ARM-based chip vendors such as Qualcomm Technologies, Inc and Marvell, as well as some RISC-V start-ups.
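The scalability argument can be made concrete with a little arithmetic. The sketch below assumes a per-card Layer 1 throughput (the 24Gbps figure is borrowed from the accelerator card discussed later in this piece) and the six-card telco-server limit mentioned above; all numbers are illustrative, not vendor specifications.

```python
# Illustrative in-line scaling sketch (hypothetical numbers): capacity grows
# by adding accelerator cards to the same server, whereas look-aside growth
# also drags in additional host CPU cores.
CARD_CAPACITY_GBPS = 24          # assumed L1 throughput per in-line card
MAX_CARDS_PER_SERVER = 6         # telco-grade server slot limit from the text

def inline_cards_needed(target_gbps: float) -> int:
    """Cards required to serve a target Layer 1 throughput on one server."""
    cards = -(-target_gbps // CARD_CAPACITY_GBPS)   # ceiling division
    if cards > MAX_CARDS_PER_SERVER:
        raise ValueError("target exceeds a single server's card slots")
    return int(cards)

for target in (20, 60, 120):
    print(f"{target:>3} Gbps -> {inline_cards_needed(target)} card(s)")
```

The point of the sketch is the shape of the cost curve: each capacity step adds one card's worth of cost and power, rather than another high-core-count COTS CPU plus FPGA/eASIC card.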

    Layer-1 Software Stack

    Chip vendors typically offer Layer-1 reference software stacks, which OEMs or third-party software vendors then customize and harden. This is an intensive two- or three-year process that demands considerable technical expertise and resources, particularly for telco-grade massive MIMO networks. With 5G, there is added complexity as the Layer 1 stack needs to be very adaptive and programmable to cater for the multitude of workloads and use cases. As a result, very few chip vendors offer carrier-grade Layer 1 software. In fact, the choice of vendors with the capability to develop macro cell massive MIMO Layer 1 stacks is essentially limited to the Tier-1 incumbents plus a handful of open RAN challenger vendors.

    Tier-1 incumbents are unlikely to offer the same level of openness and flexibility as the challenger vendors, and hence accelerator cards from the latter may be a more attractive option for operators looking for a higher level of network customization. For example, Qualcomm Technologies will offer a set of APIs that allow vendors to port alternative software into its Layer 1 stack – such as beamforming or channel estimation algorithms. This lowers the barriers to entry for small software vendors and enables new players to enter the market – i.e. without requiring them to develop a full commercial-grade Layer 1 stack themselves. This could result in a rapid expansion of the Layer 1 software ecosystem.

    Exhibit 1: Open RAN Layer 1 Accelerator Card Options by Vendor (Macro Cells)

    Energy Efficiency – A Critical Metric

    Reducing OPEX costs has become a major priority for operators due to soaring energy costs plus the need to minimize their carbon footprints. As a result, energy efficiency, i.e. Gbps/Watt, will be a critical metric for operators when evaluating Layer 1 accelerator cards. However, only a few vendors have revealed power consumption data. In a recent demo, Qualcomm Technologies showed the total power consumption of its Qualcomm® X100 5G RAN Accelerator Card [1] to be just 16-18W when driving a 16-layer 64TRx massive MIMO radio configuration serving four user devices (using four layers/device). According to the vendor, the card is designed to support a throughput of 24Gbps at less than 40W peak power consumption.
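Taking the figures quoted above at face value, the Gbps/Watt metric is straightforward to compute; the sketch below sets the card's stated design throughput against its stated peak power ceiling.

```python
# Energy-efficiency arithmetic using the figures quoted in the text.
throughput_gbps = 24   # stated design throughput of the accelerator card
peak_power_w = 40      # stated peak power ceiling
efficiency = throughput_gbps / peak_power_w
print(f"worst-case efficiency: {efficiency:.2f} Gbps/W")
```

At the quoted 16-18W operating point for the 16-layer demo, the effective efficiency would be correspondingly higher; the peak-power figure gives the conservative floor.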

    Viewpoint

    The open RAN story is gaining momentum and Counterpoint Research expects this growth to accelerate during 2023 and beyond driven by the availability of new merchant silicon solutions. To succeed in the marketplace, however, these new silicon platforms will need to demonstrate the potential to compete against the latest incumbent RAN solutions – across all key technical metrics – as well as offering the same level of advanced 5G features. Clearly, energy efficiency will be a major product differentiator, which puts current x86-based look-aside designs at a disadvantage compared to the latest in-line accelerators and suggests that most operators will favour the in-line architecture approach. Ultimately, the winning vendors will be those that are best able to satisfy the key technical requirements of individual operators while at the same time offering them the flexibility to customize their networks to suit their own requirements.

     

    [1] Qualcomm X100 5G RAN Accelerator Card is a product of Qualcomm Technologies, Inc. and/or its subsidiaries. Qualcomm is a trademark or registered trademark of Qualcomm Incorporated

    Related Reports

    Qualcomm On Track To Launch Open RAN 5G Macro Base Station Portfolio

    Cloud RAN – Waiting for a Viable Business Plan?

    The Emerging Cloud RAN Ecosystem – Players and Solutions

    Cloud RAN – Technology Review and Market Challenges

    Open RAN Radios – Chinese Vendors Set To Dominate An Emerging Market?

     

    The post Open RAN Networks – Layer 1 Acceleration Strategy Will Be Key To Operator Success appeared first on Counterpoint.

    ]]>
    Gareth Owen
    Rakuten's Q1 2023 Results – Key Highlights And Analysis https://www.counterpointresearch.com/insights/rakuten-4/ Thu, 25 May 2023 06:38:16 +0000 http://cpr.presscat.kr/insights/rakuten-4/ Rakuten released its Q1 2023 results earlier this month. Mobile segment revenue increased 25.7% YoY to $710 million but was down 11.5% sequentially compared to Q4 2022. Operating loss improved 22.4% to $760 million YoY with a smaller 4.7% improvement sequentially compared to Q4 2022. Improving Customer Experience In the short term, improving network quality […]

    The post Rakuten's Q1 2023 Results – Key Highlights And Analysis appeared first on Counterpoint.

    ]]>
    Rakuten released its Q1 2023 results earlier this month. Mobile segment revenue increased 25.7% YoY to $710 million but was down 11.5% sequentially compared to Q4 2022. Operating loss improved 22.4% to $760 million YoY with a smaller 4.7% improvement sequentially compared to Q4 2022.

    Improving Customer Experience

    In the short term, improving network quality is Rakuten’s main priority. Following a recent customer survey, Rakuten has launched several initiatives to improve customer experience; most of the issues identified will be resolved by a new roaming agreement with KDDI – which includes access to the latter’s sub-1GHz spectrum for the first time. This should improve coverage, particularly indoors, such as in homes and high-traffic shopping malls, as well as underground in subways, tunnels, etc. However, Rakuten urgently needs to build its own low-band networks – i.e. in the so-called “platinum” band – starting probably in early 2024.

    Selling the Family Silver

    Rakuten is still in a precarious situation financially and is in the process of selling more of the family silver to fund its mobile network roll-out. This includes IPOs and the sale of stakes in some non-core assets, for example, the Seiyu supermarket chain. The company has also pushed back its target date to become profitable from the end of 2023 to an unspecified time in 2024.

    Challenges of Rolling Out Greenfield Networks

    Rakuten’s experience over the past three years illustrates the challenges of rolling out a greenfield network in a mature market and the time, effort and investment required to achieve coverage and reliability on a par with better-heeled rivals. Indeed, many of these challenges have nothing to do with the choice of network architecture, i.e. open RAN. As a result, there are many lessons here for Dish and 1&1 Drillisch.

    The silver lining for Rakuten, perhaps, is that it is primarily a tech company and its Rakuten Symphony division is starting to deliver meaningful revenues, which are projected to accelerate during 2023. Dish and 1&1 Drillisch do not have that luxury!

    Exhibit 1: Summary of Rakuten’s Customer Churn Survey

    The full version of this insight report, including all highlights and viewpoints is published in the following report “Rakuten Unveils Plan To Boost Subscriber Growth” available to clients of Counterpoint Research’s 5G Network Infrastructure Service (5GNI).

    Related Reports

    Dish Gets Ready To Launch Boost Infinite Post Paid Service

    Dish Clears First FCC Hurdle, Launches in 120 US Cities

    Rakuten Mobile – Time To Show Disruptive Networks Can Deliver Disruptive Profits?

     

    The post Rakuten's Q1 2023 Results – Key Highlights And Analysis appeared first on Counterpoint.

    ]]>
    Gareth Owen
    Cloud RAN Platforms – Why Are Vendors Adopting Different Layer-1 Acceleration Strategies? https://www.counterpointresearch.com/insights/cloud-ran-platforms/ Thu, 18 May 2023 04:10:49 +0000 http://cpr.presscat.kr/insights/cloud-ran-platforms/ CSPs are showing an increasing interest in leveraging the benefits of RAN virtualization and cloud-native technologies and vendors are responding to this demand. As a result, future RAN networks are expected to evolve gradually towards Cloud RAN based solutions, which will be deployed alongside traditional, proprietary 5G networks. In contrast to traditional RAN networks, the […]

    The post Cloud RAN Platforms – Why Are Vendors Adopting Different Layer-1 Acceleration Strategies? appeared first on Counterpoint.

    ]]>
    CSPs are showing an increasing interest in leveraging the benefits of RAN virtualization and cloud-native technologies and vendors are responding to this demand. As a result, future RAN networks are expected to evolve gradually towards Cloud RAN based solutions, which will be deployed alongside traditional, proprietary 5G networks.

    In contrast to traditional RAN networks, the baseband unit of a cloud RAN base station is split into two units: a Distributed Unit (DU) and a Centralized Unit (CU). Today, the vast majority of commercially deployed DU basebands run on x86 processors. However, alternatives to Intel’s x86 platform, based on ASIC, GPU and RISC-V architectures, are expected to become widely available during the next three years.

    Cloud RAN platforms typically use PCIe-based accelerator cards to process the compute-intensive Layer 1 workloads. There are essentially two types of accelerator architecture: look-aside and in-line:

    • Look-aside accelerators offload a small subset of the 5G Layer 1 functions, for example, forward error correction, from the host CPU to an external FPGA-based accelerator.
    • With an in-line accelerator card, all the Layer 1 data passes directly through the accelerator and is processed in real-time – a critical requirement for Layer 1 workloads. This processing is done by other processor types, for example, ARM or RISC-V based DSPs.

    However, there is a marked difference in the approach of vendors towards Layer 1 acceleration, with some vendors supporting the look-aside option, some supporting the in-line option, while others plan to offer both options.

    Counterpoint Research’s latest report, “Cloud RAN Platforms – Why Are Vendors Adopting Different Layer 1 Acceleration Strategies?”, provides details of the cloud RAN platform configurations offered by various incumbent and challenger vendors and discusses the reasoning and underlying strategy behind their technology choices and partnerships.

    Table of Contents

    Snapshot
    Introduction
    Key Cloud RAN Platforms
    -Ericsson
    -Nokia
    -Samsung
    -NEC
    -Fujitsu
    -Rakuten
    -Mavenir
    -JMA Wireless
    Viewpoint

    This report is available to clients of Counterpoint Research’s 5G Network Infrastructure (5GNI) Service.

    Related 5GNI Reports and Blogs

    New L1 Accelerator Cards Set To Boost Open RAN Market – Or Create More Lock-In?

    Qualcomm On Track To Launch Open RAN 5G Macro Base Station Portfolio

    Cloud RAN – Waiting For A Viable Business Case?

    The Emerging Cloud RAN Ecosystem – Players and Solutions

    Open RAN Radio Market: Product Availability Study

    The post Cloud RAN Platforms – Why Are Vendors Adopting Different Layer-1 Acceleration Strategies? appeared first on Counterpoint.

    ]]>
    Gareth Owen
    New Layer-1 Accelerator Cards Set To Boost Open RAN Market – Or Create More Lock-In? https://www.counterpointresearch.com/insights/layer-1-accelerator-cards/ Tue, 09 May 2023 05:12:08 +0000 http://cpr.presscat.kr/insights/layer-1-accelerator-cards/ The transition of the Radio Access Network (RAN) from a standalone, integrated network into a disaggregated, virtualized solution is well underway. However, all open RAN deployments to date rely on Intel’s x86-based COTS servers, with most deployments also using Intel’s proprietary FlexRAN software architecture. Recently, various silicon vendors have announced that they are developing alternatives […]

    The post New Layer-1 Accelerator Cards Set To Boost Open RAN Market – Or Create More Lock-In? appeared first on Counterpoint.

    ]]>
    The transition of the Radio Access Network (RAN) from a standalone, integrated network into a disaggregated, virtualized solution is well underway. However, all open RAN deployments to date rely on Intel’s x86-based COTS servers, with most deployments also using Intel’s proprietary FlexRAN software architecture. Recently, various silicon vendors have announced that they are developing alternatives to Intel’s x86 platform based on ASICs, GPUs as well as RISC-V architectures. Several of these vendors are currently testing their new PCIe-based Layer-1 accelerator cards with CSPs, and commercial versions of these products are expected to become widely available during the next three years.

    This report provides an overview of the emerging open RAN PCIe-based Layer-1 accelerator card market based on new merchant silicon and highlights the opportunities and technical challenges facing the open RAN chip community as they strive to develop alternative chip solutions capable of efficiently processing real-time, latency-sensitive Layer-1 workloads.

    Key Takeaway No. 1: Too much diversity?

    The launch of new L1 accelerator cards from various vendors, large and small, should be welcomed by CSPs calling for diversity and will go some way to quell criticism that the open RAN market is too Intel-based. However, CSPs may now be faced with another dilemma – too much choice! They must now face the difficult challenge of testing and comparing multiple accelerator cards, inevitably involving complicated technical and commercial trade-offs.

    Key Takeaway No. 2: Look-Aside or In-Line Accelerators?

    At present, the choice of accelerator architecture is binary: either look-aside or in-line. Both types have their advantages and drawbacks. Depending on use cases and applications, Counterpoint Research believes that operators may need to use both types of accelerators. However, only one vendor currently offers a software/silicon platform with the capability to do this.

    Key Takeaway No. 3: Interoperability and Vendor Lock-In

    Developing commercial-grade Layer 1 software suitable for massive MIMO networks is an expensive process requiring very specific skills and a lot of experience – but with no guarantee of commercial success. Although open RAN is designed to promote interoperability and vendor diversity, all L1 stacks are currently tied to the underlying silicon architectures and hence are not portable between hardware platforms. This introduces a new form of vendor lock-in for CSPs. Clearly, there is an urgent need for a universal software abstraction layer between the L1 stack and the various hardware platforms to enable stack portability.

    The complete versions of these Key Takeaways, together with the full set, are published in the following report, available to clients of Counterpoint Research’s 5G Network Infrastructure (5GNI) Service.

    Report: New L1 Accelerator Cards Set To Boost Open RAN Market – Or Create More Lock-In?

    Table of Contents

    • Snapshot
    • Key Takeaways
    • Introduction
    • PCIe-based Hardware Acceleration
      • Look Aside vs In-Line Acceleration
      • Technical Trade-Offs
    • Processor Architectures
      • Types of Processors
      • Comparison of Hardware Options
      • Intel’s Xeon with vRAN Boost
    • Layer-1 Stacks
      • Reference or Commercial Grade Stacks?
      • Open or Closed Stacks?
      • Layers 2 and 3

    • Interoperability and Standardization
      • FAPI Interface
      • Proprietary L1 Software Stacks
      • Accelerator Abstraction Layer (AAL)
      • Saankya Labs RANwiser
    • Key Players (in alphabetical order)
      • AMD Xilinx
      • Dell
      • EdgeQ
      • Intel
      • Leapfrog Semiconductor
      • Marvell
      • Nvidia
      • Picocom
      • Qualcomm
    • Viewpoint

    Related Reports and Blogs

    MWC22 Las Vegas: Samsung Makes Breakthrough In US Cable Market

    Qualcomm On Track To Launch Open RAN 5G Macro Base Station Portfolio

    Cloud RAN – Waiting For A Viable Business Case?

    The Emerging Cloud RAN Ecosystem – Players and Solutions

    Open RAN Radio Market: Product Availability Study

    The post New Layer-1 Accelerator Cards Set To Boost Open RAN Market – Or Create More Lock-In? appeared first on Counterpoint.

    ]]>
    Gareth Owen
    5G Advanced – Stakeholder Collaboration Essential To Maximise ROI For Operators https://www.counterpointresearch.com/insights/5-5g/ Mon, 27 Mar 2023 04:32:41 +0000 http://cpr.presscat.kr/insights/5-5g/ With more than 230 5G networks deployed worldwide serving 1+ billion end user devices, 5G has become the fastest-growing cellular standard of all time. However, there is an urgent need to prepare for the future to enable operators and enterprises to leverage its full capabilities. 5G Advanced (5.5G) is the next evolutionary step in 5G […]

    The post 5G Advanced – Stakeholder Collaboration Essential To Maximise ROI For Operators appeared first on Counterpoint.

    ]]>
    With more than 230 5G networks deployed worldwide serving 1+ billion end user devices, 5G has become the fastest-growing cellular standard of all time. However, there is an urgent need to prepare for the future to enable operators and enterprises to leverage its full capabilities. 5G Advanced (5.5G) is the next evolutionary step in 5G technology which will introduce new levels of capabilities, enabling operators to generate a return on their 5G investments.

    The “10 Gbps Everywhere” Experience

    Compared to conventional 5G, 5.5G represents a 10-fold improvement in performance across the board. This means that 5.5G networks will be able to provide ubiquitous 10 Gbps downlink and 1 Gbps uplink speeds while supporting 100 billion IoT connections – compared to just 10 billion with 5G. In addition, 5.5G is expected to deliver latency and positioning accuracy that are a fraction of the current 5G standard as well as significant reductions in overall network power consumption.

    5.5G will provide enhanced connectivity and better user experiences. By leveraging the 10 Gbps downlink throughput and low millisecond-level latency, 5.5G will bridge the gap between the physical and virtual worlds. Although 5G already provides some immersive services, 5.5G will enable interactive immersive services, such as 24K-resolution VR gaming, glasses-free 3D video and 3D online malls.

    Benefits for Enterprises

    In addition to enhanced connectivity, 5.5G will offer a broad range of new capabilities for enterprises. Counterpoint Research expects a surge in new private network applications as networks are able to leverage the technical innovations enabled by 5.5G. For instance, enterprises will benefit greatly from the 1 Gbps uplink capability, enabling, for example, high-precision AI-based industrial vision inspection, while enhanced positioning with sub-10cm accuracy – both indoors and outdoors – will enable a plethora of new Industry 4.0 applications.

In addition, 5.5G will support three rapidly developing IoT technologies: NB-IoT, RedCap and passive IoT tags, an innovative, low-cost location-sensing technology. A related innovation is HCS-based[1] millimetre wave technology, an integrated sensing and communications approach that enables centimetre-precise positioning of objects, including pedestrians, personal items, livestock, autonomous vehicles and drones. On the network side, enhanced AI/ML capabilities across the RAN, core and network management domains, plus new power-saving features, will deliver significant energy savings for operators.

    Standards and Spectrum

Technical standards are the bedrock of the telecommunications industry, and it is imperative that common standards are adopted worldwide. The standardization of 5.5G via 3GPP Release 18 is ongoing. However, the industry must work together to ensure that Release 18 is frozen by the first quarter of 2024 as planned, so that 5.5G can be introduced from 2025 onwards.

    Release 18 will be followed by Releases 19 and 20 after which the 3GPP will focus on 6G. Clearly, industry players need to collaborate closely over the next few years in order to define and maximise the technical innovations and capabilities of 5.5G and to ensure new services and use case scenarios are properly supported. This will help to maximise the potential of 5.5G for operators and extend its lifecycle.

Additional spectrum will be required for 5.5G to deliver its full potential. Re-farming of legacy 2G and 3G bands will free up some lower-band spectrum, but this is not sufficient: more spectrum in the 6 GHz and millimetre wave bands is necessary. With the WRC-23 radio conference taking place in November, it is essential that all stakeholders, including governments and regulators as well as operators and vendors, agree on the best spectrum strategy. Clearly, the 6 GHz band should be a key 5.5G target band for the industry. In fact, the 3GPP has already specified the 6,425-7,125 MHz range, and Counterpoint Research expects the upper part of this range to be identified as an IMT band at WRC-23. Millimetre wave is another key band for 5.5G, and more than 800 MHz of additional millimetre wave spectrum will likely be needed to enable operators to deliver the 10 Gbps experience.
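For context, the amount of spectrum at stake in that 6 GHz range is easy to quantify (illustrative arithmetic only, using the figures quoted in the text):

```python
# Width of the 6,425-7,125 MHz range under discussion for IMT at WRC-23.
band_start_mhz, band_end_mhz = 6_425, 7_125
upper_6ghz_width_mhz = band_end_mhz - band_start_mhz
assert upper_6ghz_width_mhz == 700  # 700 MHz of potential new mid-band spectrum

# The article separately puts the additional millimetre wave need at >800 MHz,
# i.e. more than the entire upper 6 GHz range provides.
mmwave_extra_mhz = 800
assert mmwave_extra_mhz > upper_6ghz_width_mhz
```

The comparison underlines why both bands matter: even the full upper 6 GHz range is narrower than the extra millimetre wave spectrum the 10 Gbps target is expected to require.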

    Networks and Devices

    Networks and devices will need to be upgraded to enable 5G Advanced and this will involve further innovation with respect to 5.5G chipset technologies and devices.

5.5G will introduce a plethora of new devices with capabilities beyond smartphones. Some will be full-capability devices while others will have reduced capabilities. RedCap devices, for example, only need to support a shortened set of specific capabilities, such as the video surveillance devices used for industrial quality control, process monitoring, sensing or tracking. However, all players, including chipset and device OEMs, must start work immediately on defining the digital requirements for individual vertical use cases and applications in order to ensure that an ecosystem of suppliers develops.

A significant recent development is the release of millimetre wave-capable 5.5G chipsets. For example, Qualcomm recently demonstrated a 5.5G Snapdragon chip that achieves 10 Gbps using 10CC carrier aggregation on millimetre wave spectrum and 5CC carrier aggregation on sub-6 GHz frequencies. Similarly, MediaTek’s chipset offers downlink and uplink speeds of 7.67 Gbps and 3.76 Gbps respectively.
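A back-of-envelope view of what the 10CC millimetre wave demonstration implies. The per-carrier bandwidth is our assumption (100 MHz is a common millimetre wave component-carrier size), not a figure stated in the text:

```python
# Back-of-envelope carrier aggregation arithmetic for the 10 Gbps demo.
component_carriers = 10       # 10CC on millimetre wave, per the text
cc_bandwidth_mhz = 100        # assumed per-carrier bandwidth (not stated)
peak_rate_gbps = 10           # demonstrated aggregate downlink speed

aggregated_bw_mhz = component_carriers * cc_bandwidth_mhz
per_cc_rate_gbps = peak_rate_gbps / component_carriers

assert aggregated_bw_mhz == 1_000   # ~1,000 MHz of aggregated spectrum
assert per_cc_rate_gbps == 1.0      # ~1 Gbps per component carrier
```

Under these assumptions, a 10 Gbps peak implies roughly 1,000 MHz of aggregated millimetre wave spectrum, consistent with the earlier point that operators will likely need more than 800 MHz of additional millimetre wave spectrum.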

    Upgrading Fibre to 5G Advanced

Achieving the “10 Gbps Everywhere” experience will involve upgrading standards for fixed fibre broadband as well as for the 5G RAN and core. In fact, the evolution of the fifth-generation fixed network (F5G) to all-optical F5.5G has already progressed from proposals to specification design.

Performance improvements in fibre networks will be achieved through agreement on the use of key technologies such as 50G Passive Optical Network (PON) technology and Fibre to the Room (FTTR). 50G PON is being standardized as the next-generation PON by the ITU-T. Together with technologies such as “uplink/downlink symmetry” and “multi-band in one,” this will pave the way for a smooth evolution to F5.5G. Last September, ETSI released its F5G Advanced White Paper, and the standards body has been leading the formulation of F5.5G’s first release, Release 3, which is due to be frozen in the first half of 2024.

The development of 5.5G and F5.5G will require a converged fixed/wireless IP network. Work on the definition of a new converged network, tentatively called Net5.5G, has already begun. Both the IETF and the IEEE are working on the first phase of Net5.5G standardization, but consensus is still needed on fixed/wireless bearer technologies such as the 800GE backbone and 400GE MAN, as well as on key aspects of other technologies such as Wi-Fi 7 and Segment Routing over IPv6 (SRv6), before the new standard is released in 2024. With these new capabilities, Net5.5G will enable operators to maximise the potential of 5.5G and provide new opportunities for growth.

    Viewpoint

    The increasing popularity of immersive experiences and the emergence of the metaverse coupled with the demands of enterprise digital transformation mean that 5G networks will soon be unable to support the expected exponential growth in traffic. With 6G around 8-12 years away, 5.5G is the next obvious evolution of 5G and next-generation consumer and B2B opportunities will only be possible if operators and enterprises upgrade to 5.5G.

However, a successful and timely upgrade to 5.5G will require all industry stakeholders – from technical standards bodies, operators and network and device manufacturers to policy developers and regulators – to work closely together and collaborate on key 5.5G enablers, including standards, spectrum, networks and device specifications. Major MNOs will be required to pilot new 5.5G technologies and build business cases. In addition, Counterpoint Research believes that an industry consensus on the digital requirements of new use cases needs to be developed, particularly with respect to enterprise vertical use cases, alongside a focus on developing a diverse ecosystem of players encompassing all verticals. Finally, closer collaboration between the mobile and fixed telecoms communities will be essential to ensure the synchronization of standards between wireless and fixed networks.


[1] Harmonized Communication and Sensing


Gareth Owen

Ericsson's Cradlepoint Launches NetCloud Private Networks Solution
https://www.counterpointresearch.com/insights/netcloud/
Mon, 16 Jan 2023 10:51:51 +0000

Last week, Cradlepoint announced the launch of its subscription-based NetCloud Private Networks solution, an SME-type enterprise offering targeted at office buildings, retail outlets, stadiums, hospitality venues, smart cities, schools and similar sites. It will complement Ericsson’s Private 5G product, which is designed primarily for industrial applications such as manufacturing, energy and utilities, where low latency, high reliability and business-critical capabilities are essential.

    Key Features of NetCloud

    Key features of NetCloud include:

A “Private Networks-in-a-Box” solution – which includes access points, core/gateway, network planning software, routers and private SIMs, plus single-pane-of-glass cloud management and orchestration via NetCloud.

    Cradlepoint Distribution Channels – NetCloud will leverage Cradlepoint’s existing sales channels via network enterprise resellers, managed service providers, etc. Later in the year, the company plans to target distribution via mobile operators.

Flexible Customer Offering – based on either a CAPEX or an OPEX business model to suit the customer. Enterprises can buy cellular access points, core capacity, etc. on a capacity and service-duration basis. For example, Cradlepoint offers 500 Mbps, 2 Gbps and 5 Gbps options for its core on a 3- or 5-year term, with SIM cards sold in packs of 10.
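The option matrix described above can be sketched as follows. This is a hypothetical illustration; the variable names and combinations are ours, not Cradlepoint's published SKUs:

```python
from itertools import product

# Core capacity tiers and contract terms described in the text.
core_capacity_mbps = [500, 2_000, 5_000]   # 500 Mbps, 2 Gbps, 5 Gbps
term_years = [3, 5]

# Every capacity tier can be paired with either term: 3 x 2 = 6 offerings.
offerings = list(product(core_capacity_mbps, term_years))
assert len(offerings) == 6

def sim_packs_needed(devices: int, pack_size: int = 10) -> int:
    """SIMs are sold in packs of 10, so round the order up to whole packs."""
    return -(-devices // pack_size)  # ceiling division

assert sim_packs_needed(25) == 3   # 25 devices -> 3 packs of 10 SIMs
```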

    At present, NetCloud is only available for use in the US’s 4G LTE CBRS market, but later this year, Cradlepoint will offer 5G radios for use in markets beyond the US, for example, Europe. The company also expects to introduce eSIM capabilities at the same time.

    The key elements of Cradlepoint NetCloud solution are shown in Exhibit 1 below:

    Exhibit 1:  Overview of Cradlepoint’s NetCloud Private Network Solution

    Viewpoint

    In the past year or so, there have been many “Private-Networks-in-a-Box” announcements from numerous vendors such as AWS, Cisco, HPE, etc. as well as operators such as Dish, etc. who believe that private network solutions can be marketed as an “out-of-a box” product like Wi-Fi. With so many offerings, this could turn out to be a very competitive market with thin margins, where success is largely determined by the vendors’ “go-to-market” strategy and distribution channels.

The launch of NetCloud is a clear indication that Ericsson (with Cradlepoint) plans to play across the whole breadth of the private networks market. Although Cradlepoint is surprisingly late in launching its own “box” solution, Counterpoint Research believes the company is well-positioned to benefit from the opportunities in this market thanks to its strong position in the enterprise market (a 32,000-strong customer base) and its global distribution network of 6,000 resellers and partners, backed by Ericsson’s connectivity expertise.

     

    Related Reports

    Private Networks – Market Sizing and Forecasts: 2021-2030

    Private Networks – High Expectations Amid An Expanding Ecosystem

    Network Slicing vs Private Networks: Benefits and Drawbacks

    Private Cellular Networks – Devices Key To Growth In Unlicensed Spectrum

    5G Mobile Edge Computing – An Emerging Technology Slowly Transitioning To Commercial Reality
