Huawei introduced LampSite, a local C-RAN small cell system for indoor applications, in early 2013.  Since then Ericsson, ZTE, Airvana, and Kathrein have announced scalable indoor C-RAN small cell systems.

Huawei and SpiderCloud announced multi-mode standalone small cells in 2013. Since then Ericsson, Alcatel-Lucent, and Cisco have announced indoor multi-mode products targeting small to medium sized buildings.

Ericsson and Nokia started emphasizing macro parity when moving from macro to small cell in 2013.  Now other vendors are working on beefing up the performance/features of their small cells.

With macro base station gross margins hovering around 20% for initial H/W deployments, and a blended 35% to 40% when additional RF carriers and S/W are included, the question is whether vendors will be able to differentiate their products to maintain these margins as BTS deployments shift from macros to small cells outdoors, then to larger indoor buildings, and eventually to medium-sized buildings.

Vendors seem to focus on the following features/benefits to stand out:

  • Coordination
  • Scale
  • Capacity
  • Macro Parity
  • WiFi Footprint
  • Time-to-Install
  • Upgrade path
  • Services
  • Aesthetics
  • Breadth of portfolio


Coordination

The objectives of radio coordination and coordination between cell layers are to improve performance and reduce the number of BTSs required to cover a certain area.

Ericsson has the largest macro installed base (Dell’Oro Group estimates) and is also a big proponent of introducing coordinated small cells.  The message seems to be resonating with operators, as Ericsson is recording a lot of small cell wins.  We have received feedback from larger Tier 1 operators suggesting they want to deploy coordinated small cells, but they also want more vendor options; in some cases (for example, if the macro signal is weak indoors) the level of coordination is less important, and it could make sense to mix macro and small cell vendors.

If regulators allocate specific bands, such as 3.55 to 3.7 GHz, to small cells, the concept of coordinating macro and small cells will be less important.  Longer term, though, it is safe to assume that carriers will need to optimize all available spectrum assets.



Scale

Building sizes vary significantly.  Ericsson, Huawei, Nokia, and SpiderCloud have focused some of their marketing messages on the scalability of their products, which plays an important role for larger buildings. Ericsson and Huawei target buildings greater than 50 K sq. ft. with their distributed or local C-RAN small cell radio systems.  Ericsson’s Radio Dot can support up to 96 APs, while Huawei’s LampSite system can connect up to 32 picos.  SpiderCloud’s indoor system targets buildings ranging from 100 K to 1 M sq. ft.  ZTE and Airvana have also announced local C-RAN scalable small cell solutions targeting larger buildings. Alcatel-Lucent announced that its new enterprise small cell can be grouped to offer seamless handover for large scale deployments.  And Nokia’s picos and micros can be combined into a FlexiZone for large scale deployments.


Macro Parity

Ericsson, Nokia, and Huawei have highlighted macro parity with their small cell portfolios, which provides comparable performance when moving between macro and small cells, as well as cost and time advantages from using the same platform on macros and small cells when rolling out new features and enhancements.  Alcatel-Lucent, through its partnership with Qualcomm, plans to include LTE-CA features in its new indoor small cell.


WiFi Footprint

Surprising or not, small cell vendors with superior WiFi technology have not leveraged it to a greater extent in their overall marketing messages.  What we have seen, however, is that vendors with larger WiFi footprints are trying to leverage the existing installed base by complementing it with 3G/4G.



Time-to-Install

Vendors are starting to advertise how quickly their small cells can be installed.  Ericsson claims its latest RBS6402 product can be up and running in less than 10 minutes, while SpiderCloud says it can complete an installation for a larger enterprise in a couple of days.


Upgrade path

Carriers want to leverage new features and options as they become available, with minimal disruption.  Ericsson announced that its RBS6402 will come configured (and tested) with 10 bands, ensuring some future-proofing in the event of spectrum re-farming down the road.



Services

At the end of the day, a product that can deliver 300 or 450 Mbit/s using LTE-A (CA) is only so valuable if it can’t be installed in the right location.  And given the changes the shift from macro to small cell introduces for site identification, permitting, design, analysis, installation, maintenance, and optimization, it makes sense that vendors are placing more emphasis on improving their services and partnership portfolios.



Aesthetics

Some operators have openly expressed their dissatisfaction with the appearance of the small cell products available on the market.  Vendors are listening, and this will be an increasingly important aspect of the products as the distance to the end user shrinks.


Breadth of portfolio

Some vendors, such as Alcatel-Lucent, cover the residential, non-residential indoor, and non-residential outdoor small cell markets with products ranging from a few mW of output power up to 5 W.  Ericsson and Huawei have stayed away from the home small cell market and have instead introduced a wide range of products to address all the possible deployment scenarios for carrier small cell deployments in enterprises and urban settings.


In other words, it is logical to assume that feature/spec differentiation will become increasingly challenging as RF output power shrinks and features/benefits converge.  But given the wide range of small cell deployment scenarios, there will be plenty of ways for vendors to differentiate their overall solutions.  Whether this will translate into a better or worse margin profile than the macro base station remains to be seen…


With telecom service providers now in the fifth year of rolling out 4G networks, the industry is already preparing for what comes next. Even if it is still very early for consumers to start thinking about 5G, development of any new technology takes a significant amount of time, and researchers are currently in the very early stages of aligning the key stakeholders and setting the vision for future mobile broadband networks.

What is 5G?

Although it is not clear at this point what 5G will be, as it has not yet been defined, initial views of some of the big thinkers in the industry suggest the goal of 5G will be to connect everything efficiently and seamlessly with no performance limitations. In other words, just as 5 billion people take mobile voice for granted today, 5G should ensure that 50 billion devices have access to high performance mobile broadband throughput, capacity, and latency, regardless of whether the connection is accessed in the home, outdoors, or in the office. 5G should guarantee this performance regardless of how many people are on the network and how much data a person or device is consuming.

What will be the focus?

The standard tools available for improving capacity are improving spectral efficiency, improving spatial efficiency, and using more spectrum.  While improving spatial efficiency through the use of small cell radios is often seen as the most efficient method for scaling capacity, early research suggests the amount of spectrum, and how it is used, will also play a crucial role in 5G networks.  In addition to allocating more spectrum bands for future mobile broadband cellular communication networks, regulators are investigating the possibility of sharing spectrum, enabling more efficient use of existing resources. Increased use of small cell radios will shrink cell sizes and enable the use of higher operating frequencies with larger bandwidths. Some researchers are currently testing the propagation characteristics of millimeter wave bands.
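To make the interplay between these three levers concrete, here is a back-of-the-envelope sketch; the spectral efficiencies, bandwidths, and cell densities below are illustrative assumptions, not measured figures:

```python
# Back-of-the-envelope area-capacity model (illustrative assumptions only).
# Area capacity ~ spectral efficiency x bandwidth x cell density.

def area_capacity_mbps_per_km2(spectral_eff_bps_hz, bandwidth_mhz, cells_per_km2):
    """Aggregate downlink capacity per square kilometer, in Mbit/s."""
    return spectral_eff_bps_hz * bandwidth_mhz * cells_per_km2

# Macro-only baseline: 1 cell/km2, 20 MHz, ~1.5 bit/s/Hz average (assumed).
baseline = area_capacity_mbps_per_km2(1.5, 20, 1)

# Densified: 10 small cells/km2, 40 MHz (extra spectrum), 2.0 bit/s/Hz (assumed).
densified = area_capacity_mbps_per_km2(2.0, 40, 10)

print(baseline)              # 30.0 Mbit/s per km2
print(densified)             # 800.0 Mbit/s per km2
print(densified / baseline)  # ~26.7x gain, mostly from densification
```

The point of the sketch is that densification multiplies with, rather than replaces, the gains from spectral efficiency and added spectrum.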

Initial feedback from key stakeholders also suggests there will be an increased focus on improving overall utilization and energy consumption. The base station, which consumes the majority of the energy in the network, is expected to shrink in physical size which will drive continued innovation across the RF component ecosystem. Huawei stated in a 5G position paper that the energy-per-bit usage should be reduced by a factor of 1000 compared to today’s networks.
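Huawei's 1000x energy-per-bit target is easier to grasp with a quick calculation; the site power and throughput figures below are assumed purely for illustration, not measured network values:

```python
# Hypothetical illustration of a "1000x lower energy per bit" target.
# All input numbers here are assumptions, not measured network figures.

site_power_w = 1000.0         # assumed average power draw of a macro site
site_throughput_mbps = 100.0  # assumed busy-hour throughput of that site

# Energy per bit today, in joules per bit: power / bit rate.
energy_per_bit_today = site_power_w / (site_throughput_mbps * 1e6)

# A 1000x reduction target implies:
energy_per_bit_target = energy_per_bit_today / 1000.0

print(energy_per_bit_today)   # 1e-05 J/bit (10 microjoules per bit)
print(energy_per_bit_target)  # ~1e-08 J/bit (10 nanojoules per bit)
```

Put differently, under these assumed figures the same site budget would have to carry roughly a thousand times more traffic, which is why the target drives innovation in both baseband efficiency and RF components.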

Improving end-to-end latencies five- to ten-fold over today’s networks would stimulate more innovation and services across a wide range of applications that have not yet been envisioned.

The 5G air interface is most certainly not settled yet. Some innovators believe 5G will mean a completely redesigned air interface, while others view 5G more as the technology that will optimize and combine the performance of existing technologies.  It could, for example, be the ability to combine a 10 MHz LTE TDD carrier and a 40 MHz 802.11ac carrier for the DL, while using a 10 MHz LTE FDD or 5 MHz HSPA+ carrier for the UL.
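A rough sense of what such an aggregated link could deliver can be sketched as follows; the per-technology peak spectral efficiencies are assumed round numbers for illustration, not specification values:

```python
# Rough peak-rate sketch for a multi-technology aggregation scenario.
# Peak spectral efficiencies below are assumed round numbers, not spec values.

def peak_rate_mbps(carriers):
    """Sum of bandwidth (MHz) * assumed peak efficiency (bit/s/Hz) per carrier."""
    return sum(bw_mhz * eff for bw_mhz, eff in carriers)

# Downlink: 10 MHz LTE TDD + 40 MHz 802.11ac.
downlink = peak_rate_mbps([(10, 5.0),    # LTE TDD, assumed ~5 bit/s/Hz peak
                           (40, 6.0)])   # 802.11ac, assumed ~6 bit/s/Hz peak

# Uplink: 10 MHz LTE FDD.
uplink = peak_rate_mbps([(10, 2.5)])     # assumed ~2.5 bit/s/Hz peak

print(downlink)  # 290.0 Mbit/s aggregate downlink
print(uplink)    # 25.0 Mbit/s uplink
```

Even with conservative assumptions, the interesting property is the asymmetry: a wide unlicensed carrier on the downlink paired with a narrower licensed carrier on the uplink.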

What are the major challenges?

As with any technology transition, there will be a wide range of challenges to overcome.  Given that devices could be combining different technologies and bands, battery consumption is generally seen as a concern, and new innovation will be required to reduce it.  The use of higher frequencies will also introduce new challenges typically not encountered in the sub-2 GHz range.  Huawei also suggested in a 5G paper that new breakthroughs are required in baseband and RF architectures to meet the computational requirements of new solutions such as massive MIMO (recall the array Samsung engineers prototyped with 64 antenna elements).  With the expected proliferation of small cell radios in hyper-dense networks, new innovation will be required to integrate access and backhaul radio products and ensure they can be installed easily using SON technologies.  And given that some expect 5G to be able to combine any technology with any spectrum band, many regulatory hurdles will need to be addressed.

How long before we will see 5G smartphones?

Even if some service providers in Korea and Japan are already including 5G in their roadmaps, it is a bit premature to start forecasting commercial availability of 5G networks and devices, particularly since it is not clear yet what 5G means.

Historically there has been a new mobile technology roughly every 10 years, starting with 2G in the early 90s and 3G in the early 2000s, followed by LTE in 2009/2010.  If history is any indicator of future performance, 5G devices could be a reality for innovators and early adopters in the 2020 time frame. This, of course, is just a loose target at this point, dependent on too many variables to list here, including how well technologies are adopted across the globe and whether 4G will continue to be adopted at a faster pace than 2G/3G even as we enter the late majority phase.

I recently attended the Next Generation Optical Networking conference held in Nice, France.  With over 700 optical networking professionals in attendance, I participated on two panels and had the opportunity to chair a portion of the day.

One panel I moderated was called “Packet & OTN Switching Integration – What kind of switching is needed?”

The group of professionals that joined the panel discussion included:

  • Geoff Bennett, Director, Solutions & Technology, Infinera
  • Alan Corfield, Consultant, Transport Engineering, Virgin Media
  • Bartek Raszczyk, Senior Network Engineer, London Internet Exchange (LINX)
  • Kristian Andersson, IP Transport Consulting Engineer, Alcatel-Lucent
  • Zhao Shuai, Technical Director, ZTE

Each of the panelists shared his unique viewpoint, and while we thought this topic would stir a large debate, we were pleasantly surprised when the panelists were in general agreement.

To summarize, we concluded that OTN switching and packet switching will both be around for a long time.  For obvious reasons, the future is with packet switching, but do not discount OTN switching because it serves a couple of important functions even in a packet world.

One of the reasons for OTN switching is that the client side and line side interfaces will always differ in speed.  This means an OTN switch will be needed to maintain higher network utilization through active bandwidth management.  In general, the panel concluded that the majority of OTN switches would be integrated within DWDM systems for this reason.

The second reason for OTN switches is to provide a high quality circuit that has predictable features, much like private line services today.  It could be said that OTN is a direct replacement for the type of services carriers often delivered with SONET and SDH equipment.  That is to say, OTN will allow carriers to continue delivering services that have high service level agreement (SLA) requirements.

When we discussed the type of equipment and whether an integrated solution was truly needed, the panelists were in general consensus that an integrated packet and OTN switch would be optimal.  As one panelist described, in his network, he wants to have a switch that can be an OTN switch today and a packet switch tomorrow.  In between those days, he wants to be able to slide the scale from OTN to packet.

The conference was a great opportunity to discuss the current challenges for the industry and discover and highlight the solutions for these challenges going forward.  I look forward to going back next year!

Dell’Oro Group recently held a lunchtime presentation and discussion on Network Functions Virtualization (NFV) and related technologies in San Jose.

At the luncheon, Shin Umeda, our vice president who spearheaded our Advanced Research Report on Network Functions Virtualization, led the discussion.  Shin was joined by Chris DePuy, our vice president covering the two key markets we track that are beginning to see the effects of NFV, namely Carrier IP Telephony (IMS) and Wireless Packet Core (Evolved Packet Core).

One of the highlights of Shin’s presentation was his insight into the potential implications of Network Functions Virtualization for Service Provider business challenges (see chart).

[Chart: Potential implications of NFV for Service Provider business challenges]

Traditional Service Provider business models and processes are under pressure from operational, financial, and competitive perspectives.  According to Shin, NFV has the potential to address these growing business pressures in a new way.

The presentation addressed one of the biggest sources of pressure for Service Providers – exponential traffic growth.  The number of global connections is currently in the billions and is moving to the tens of billions.  Service Providers are constantly looking for ways to optimize networks to better match traffic volumes and patterns.

As network connections and traffic grow, the complexity of the underlying network infrastructure and the corresponding service delivery processes and mechanisms also increase.  This leaves Service Providers looking for new ways to manage financial resources – namely CAPEX and OPEX.

Shin described one of the key premises of NFV as a potential solution – the use of low-cost and standard IT, common off the shelf (COTS) hardware instead of expensive, specialized hardware, thereby reducing equipment costs.

OPEX improvements can also be achieved in many areas.  NFV offers the potential to reduce the costs to operate the physical network through lower power consumption, increased automation, and more efficient element management.

Service providers are also looking to NFV as a way to develop and deliver new services.  As a service delivery platform, NFV has the potential to accelerate the introduction of new services and reduce development risks, thereby improving revenue growth and competitiveness.

However, Shin sees a long list of tasks and risks that must be addressed by Service Providers as part of the NFV transition.  It is a long term process that will take many years.

Thank you to everyone who attended this presentation and discussion.  For further information on this topic, see Shin’s Advanced Research Report on Network Functions Virtualization.

Here at Dell’Oro Group we update our 5-year forecasts after the first and third quarter each year.  With 1Q14 now behind us, we are starting to update the 5-year forecasts.  For the RAN market forecasts, some of the historical data that we will study reveals the relationships between equipment revenues and Capex spending, as well as data growth.

If we analyze data over the 10-year period from 2000 to 2010, we find that equipment revenues as a percentage of Capex tend to shrink, as the cost of equipment generally declines faster than the cost of services and other non-equipment-related infrastructure.

If we analyze the data since 2010, we estimate that worldwide Capex was roughly 15% greater in 2013 compared to 2010 levels while total service provider equipment-driven revenues for vendors did not, on average, grow at the same pace (see chart).  If this is any indicator of future performance, one challenge for the RAN market will be the anticipated slower growth in overall Capex spending after the Chinese LTE roll-outs have taken place.
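As a toy illustration of why equipment revenue shrinks as a share of Capex under these conditions (the ~15% Capex growth comes from the estimate above; the equipment revenue level and its growth rate are assumed):

```python
# Toy illustration of equipment revenue as a share of Capex.
# Only the ~15% Capex growth is from the estimate above; the rest is assumed.

capex_2010 = 100.0                       # indexed base year
capex_2013 = capex_2010 * 1.15           # ~15% higher in 2013 vs 2010

equip_rev_2010 = 40.0                    # assumed 40% of Capex in 2010
equip_rev_2013 = equip_rev_2010 * 1.05   # assumed slower, 5% cumulative growth

share_2010 = equip_rev_2010 / capex_2010
share_2013 = equip_rev_2013 / capex_2013

print(round(share_2010, 3))  # 0.4
print(round(share_2013, 3))  # 0.365 -> equipment share of Capex shrinks
```

Whenever Capex grows faster than equipment-driven revenue, the equipment share of Capex falls, which is the historical pattern the forecast update has to reckon with.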

For our base station estimates, we typically analyze both coverage and capacity-driven base station deployments for the various regions.  And when it comes to data growth trends, we will pay close attention to the additional RAN capacity that is required for each region to accommodate incremental data traffic, as well as the proportion of the incremental data traffic that will be addressed with macro and small cell radios.

Network Functions Virtualization, or NFV*, continues to be among the hottest areas within telecommunications networking.  Fundamentally, NFV changes the way networks are designed and operated, and in turn, changes the products that vendors sell to network operators.  Sizing the market of new technologies such as NFV can be tricky, and here, we offer a quick look into some key considerations for this expanding market.

One approach to measuring the NFV market is to assess its effect on existing technology markets.  A number of questions arise when taking this route, and here are five that we think are at the top of the list for consideration.

  • Which telecom products and technologies are affected by NFV and to what extent?  This is the starting point from which we can leverage Dell’Oro Group’s broad range of telecom equipment market coverage.
  • When implemented, is NFV a like-for-like replacement of an existing technology?  In other words, does the market simply shift from one class of products (network appliances) to another (software and servers)?  This is the cannibalization or substitution scenario that is common with new technologies.
  • Can the new technology be enabled with existing products, or are all new products and technologies required?  Some vendors might provide an upgrade path that leverages the existing installed base to enable new capabilities.
  • Does the new technology create pricing and/or unit volume disruption?  As the NFV movement is being driven by network operators, overall price reductions are a very likely outcome.
  • Beyond the like-for-like replacement, can NFV technologies result in an incrementally larger market than existing technologies?  One of the goals of employing NFV is to drive revenue growth and justify investment returns beyond capital and operating cost savings.

For our recently published NFV Advanced Research Report, we considered the questions above, as well as dozens of others.  NFV is an early stage market that is changing and developing rapidly.  Likewise, our approach to sizing the market will evolve and allow us to incorporate new information on a regular basis.  Stay tuned…

* NFV is a new construct for networking in which virtualization technologies are used to perform network functions that have traditionally been run on proprietary hardware appliances.  The network functions are implemented in software running on industry standard servers, and can be deployed or moved in various locations without installing new equipment.

It is that time of the year again. So in addition to sore feet, what can we expect from Barcelona this year? At a high level, I anticipate the underlying challenges or opportunities for the industry will remain fairly similar to last year (and most likely the foreseeable future):

  • Consumers expect to be able to use any app at anytime, anywhere. How do we ensure that the network can handle the great variation in minimum performance requirements among all the use cases for a mobile broadband subscriber base that is growing exponentially?
  • The percentage of income spent on phone services has not changed a great deal since the Smartphone boom began. What can service providers do to ensure revenue growth keeps up with Capex and Opex?

From a hetnet/radio perspective, there should not be too many surprises when it comes to the different sets of principles used to improve network throughput.  At the end of the day, it boils down to maximizing spectral and spatial efficiency.  But even if the set of principles is limited and similar for everyone, what is interesting is that the strategies and methods used to maximize the performance of the LTE network can vary significantly from vendor to vendor, and I expect to see continued divergence among the various approaches for maximizing spectral efficiency.

The difference in strategies becomes even more pronounced when we look at the various approaches to small cells.  A multitude of new solutions was announced in 2013 by the top RAN vendors, including Ericsson, Huawei, and NSN, emphasizing feature/performance/hardware/firmware parity with the macro radios.  It will be interesting to see how competitors will respond, what approach new entrants will take, and what enhancements the existing solutions will bring.  German antenna and electronics manufacturer Kathrein has already announced a new centralized in-building solution that will support multiple operators, and other vendors are also expected to announce new platforms.

Taking into consideration both the expected growth rates of the equipment market and the opportunities in services, I anticipate plenty of activity around services at the show.  NSN has already pre-announced a fault prediction solution, while Ericsson will be launching a services program focusing on small-cells.  It will be interesting to see how it will differ from Alcatel-Lucent’s and NSN’s small-cell services and if others will have new announcements as well.

Even if the combination of the cloud, NFV, and eventually SDN is expected to play the greatest role in simplifying the network, reducing cost, and helping service providers accelerate and optimize new services, the radio can play a role as well in helping service providers address the revenue-versus-investment challenge.  The concept of pushing more intelligence toward the edge showed up last year, and I anticipate seeing more progress this year.  NSN has already announced enhancements to its Liquid Applications enabling real-time services and content acceleration.  And while I am personally not as optimistic that virtualized baseband using Commercial Off-The-Shelf (COTS) parts will happen at larger scale anytime soon, I do anticipate the topic will come up at the show.

In other words, with a pair of jogging shoes, this should be a very exciting show!