Where is Africa in the great data race and will cellular connectivity be the way the continent bridges the digital divide?
Where South African users have begun to embrace mobile Internet connectivity, with telecommunications companies starting to look beyond HSPA and HSPA+ towards fourth generation solutions such as LTE and WiMAX, the rest of Africa is at the point where 3G connectivity is only now becoming an option. And depending on who you speak to, the story varies between extremely positive and downright disheartening. While on paper a number of telcos have technically rolled 3G out in their regions of operation, that by no means implies 3G is available across their entire coverage area – or even in areas with a healthy saturation of users. As it stands, 3G is still reserved for the most built-up urban areas, set up as an overlay on a blanket GPRS or EDGE network. But that’s not because the telcos don’t see any potential. It’s because, among other things, Africa is a market with an extremely low average revenue per user (ARPU), and in conditions such as these telcos need to gauge the likely uptake and revenue-generating capability of a new service before they launch it. It really isn’t a case of ‘build it and they will come’.
Expensive licenses, limited returns
Douglas Lubbe, Group Executive for Vodacom Group’s International Business, says the first hurdle is acquiring a license and justifying the associated capital expense. Acquiring a 3G license can be a costly affair – so costly, in fact, that often there’s little hope of recovering the expense by adding 3G revenues to the mix over the short-to-medium term. That could be why the Vodacom Group has 3G coverage in Mozambique, Lesotho and Tanzania, but not in the Democratic Republic of Congo, where the government is asking $55m for a license to deliver 3G services. Bertus Ehmke, senior manager for technology strategy at the MTN Group, agrees. He says telcos paid ridiculous sums for 3G licenses ten years ago, and when the technology under-delivered somewhat in terms of revenue, prices came sliding down. Now, though, they’re on their way back up to ridiculous levels. “Take another emerging market like India, for example,” he says. “The base price for the country’s 3G spectrum auction is already at the $2bn mark.” Market sentiment seems to be that regulators want too much for a decent-sized chunk of 3G spectrum and aren’t willing to consider the limited revenue-generating opportunities for telcos, which in turn hinge on the disposable income in a country, the level of cellular penetration and, lastly, the telco’s ability to backhaul Internet traffic into and out of the country. One positive Ehmke points out is that the growth and adoption of HSPA on 900MHz is going well. “We’re especially excited by the ability it grants telcos to serve a three times larger coverage area at the same price and with the same capabilities,” he says, adding that the service is already being trialled by MTN’s operations in Uganda, Zambia, Ivory Coast, Cameroon and Botswana. And once the license has been acquired, it’s also not realistic to expect the 3G coverage area to be vast.
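Ehmke’s “three times larger coverage area” claim for 900MHz can be sanity-checked with a simple log-distance propagation model. The sketch below is illustrative only – the function name and the assumed urban path-loss exponent of 3.5 are ours, not MTN’s figures – but it shows why a lower frequency stretches a cell so dramatically.

```python
def coverage_area_gain(f_low_mhz, f_high_mhz, path_loss_exponent=3.5):
    """Relative coverage area of one cell at f_low vs f_high.

    Log-distance model: path loss ~ 20*log10(f) + 10*n*log10(d).
    For an equal link budget the cell radius therefore scales as
    (f_high/f_low)**(2/n), and the covered area as the square of that.
    n = 3.5 is a typical urban path-loss exponent (an assumption).
    """
    radius_ratio = (f_high_mhz / f_low_mhz) ** (2.0 / path_loss_exponent)
    return radius_ratio ** 2

# UMTS900 versus the conventional 2100MHz 3G band
print(round(coverage_area_gain(900, 2100), 2))  # about 2.6x the area
```

With a free-space exponent of 2 the gain would be over 5x; realistic urban propagation lands the figure in the region of the threefold advantage Ehmke quotes.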
Conservative rollout planning
“When we embark on a 3G rollout in any market, we start with the major urban areas – i.e. where it makes most financial and business sense – and then extend into other areas based on whether there’s a business and commercial rationale to do so,” Lubbe says. He says that for many customers, it’s more about having some form of reliable connectivity than having enhanced speed, and Vodacom is not convinced that extending 3G capabilities to areas that already have EDGE or GPRS coverage will produce a massive increase in revenues. “In Mozambique, for example, some of the biggest revenue-generating data connections we have are in the rural areas, where a handful of game lodges require Internet connectivity and GPRS or EDGE suits them fine,” he says. That said, Lubbe says Vodacom is focused on constantly expanding its 3G network in these markets. “But we have to make sure there’s a market for 3G services before we deploy into an area,” he says. Furthermore, Lubbe says there’s no hard and fast rule for when a region or a coverage area needs 3G. The company conducts extensive research into a piece of geography and its population’s usage patterns, and then makes a decision from there. Lubbe says the Vodacom Group is also not stopping at 3G. “We do envisage rolling next-generation technologies such as HSPA+ and LTE out into Africa, but it’s not likely to be in the near future,” he says.
Is there backhaul?
Another big factor Vodacom weighs up when gauging the readiness of a market for enhanced data services such as 3G is whether there’s adequate backhaul infrastructure in place to feed users what they need – whether that’s access to a website in the U.S. or Europe, or a piece of content or functionality available on a local server. While the international bandwidth issue is in the process of being solved, right now there’s no redundancy to speak of, as was demonstrated by the massive outages companies across the continent faced while the Seacom cable was being repaired a month ago. Lubbe says the local loop in many African countries is also a massive issue, since there’s not enough working fibre cabling in the ground to connect major business centres together and ultimately provide a speedy, reliable link to the much-needed undersea cables carrying traffic to and from the more developed world. He adds that Vodacom is making investments in fibre, principally at local-loop level. But it’s important to note that these investments are not at the same level as Vodacom’s investments in South Africa, either in terms of the volume of capital being spent or their long-haul nature.
Customer equipment costs
Once the economics of the network are out of the way, Lubbe says there’s also the availability and price of terminal equipment to consider. “3G is still relatively niche, and the client-side equipment that enables it tends to be more expensive than 2G equipment,” he says. “The prevalence of notebooks equipped with internal 3G modems is also not as significant in Africa as it is in other parts of the world. We are seeing encouraging growth here, however.” Given time, these markets will develop. “It’s contingent on factors like the reduction of terminal and handset pricing, and since many of the markets Vodacom is interested in have upwards of 98.5% of users on prepaid plans, there’s little that can be done to bring terminal pricing down with subsidies,” he says. “That will change, though. A couple of years ago we were dreaming about handsets that cost less than $20 a unit, and that’s now a reality. The next step is for 3G handsets to get to that level, and we’re hoping it can happen in the next three to five years.” MTN’s Ehmke says that 3G terminal equipment is already in the region of $50 for a USB modem-type device, and that it’s already much less of a barrier than it used to be. “And with the industry’s help, the price of 3G equipment is getting driven down even further,” he adds. While Lubbe says that businesses currently drive the core demand for mobile data in these regions, the consumer market is catching up, and this is where the momentum will come from. But the point is, enhanced third generation and new fourth generation technologies are still some way off.
Looking beyond 3G and HSPA
Ehmke says that internationally LTE has matured rapidly over the past 18 months, partly because mobile WiMAX, another fourth generation technology, has been relatively successful in emerging markets, and partly because the GSMA has been encouraging mobile operators to hold off on WiMAX and instead start investigating LTE. “The problem in Africa,” he says, “is that although LTE is hugely diverse and adaptable, it does have a ‘favourite spectrum’, namely the 2.5GHz to 2.6GHz band. This means it’s subject to higher levels of attenuation and a lower range than 3G/HSPA and other technologies.” It also means the coverage area of an LTE cell is smaller, making it increasingly difficult for telcos to build a financial model that works. While Ehmke says LTE can work in the 700-900MHz band, because there’s less traction for the technology at those frequencies, the equipment capable of operating within that spectrum will carry a price premium. “Chances are this will be a very different discussion in two years’ time,” he says.
Similar issues to 3G
Another reason LTE is not the perfect solution for Africa right now is that telcos all recognise that voice revenues, while under pressure, remain a massive portion of income in African markets. “Voice is an afterthought when it comes to LTE,” Ehmke says, “and having solid voice capabilities available in Africa is key. It’s far more difficult to build a business model on a purely data service.” The terminal equipment is also too expensive. Ehmke says customers need to contend with average price tags of $150 for a normal USB-type data modem for LTE, and closer to the $300 mark for a router-type modem for a small office. “All of these factors make LTE too early for Africa,” he says, “and besides, there’s a huge amount of headroom that can still be extracted from 3G and HSPA.”
The headroom is there
Ehmke says this is also a more logical route for telcos to follow, since the majority of the 3G radio equipment available and being installed today is already capable of providing HSPA services. “All it takes is the telco licensing itself for that technology and bolting a handful of enhancements onto its infrastructure, like taking advantage of MIMO by adding more antenna elements,” he says. HSPA is also very capable of offering the kind of speeds and services that allow telcos to deliver ‘actual value’ to their markets. “The next billion customers connected to the Internet will not necessarily be using the next billion PCs to be connected to the Internet,” he says. “Many of them will be engaging with and experiencing the Internet over a mobile device – whether the experience is as trivial as a WAP browser embedded in a low-end cellular phone or as rich as a 3G-enabled iPad-type device. “And this leads me to believe that once the network has reached the 7.2Mbps mark – as is comfortably offered by HSDPA today – it reaches an inflection point in terms of what one can appreciate on a device that’s not a PC or a notebook. “I mean, we’re talking about devices with a maximum of ten inches’ worth of screen – and that’s too small a screen to appreciate or even need high-definition visuals or sound,” he adds. “Considering this, even from a multimedia perspective, there’s little need for much faster connectivity than what our current networks have the headroom to provide,” he says. This substantially changes the way we need to think about things on the African continent. It makes it less about ‘keeping up with the Joneses’ than about making sure today’s technology is capable of scaling with future demands. And from the looks of things, the technology being installed by telcos is capable of just that.
Fourth generation networks are some way off, but in the interim, two technologies are competing for their day in the sun. Which is due to win?
While the argument rages on between fourth generation mobile network pundits favouring long-term evolution (LTE) or WiMAX as the technology that will drive the future of their wireless networks, too few commentators, vendors and market experts consider that there are scenarios in which the two technologies can comfortably coexist. A fourth generation network, according to the International Telecommunication Union, is capable of delivering data rates in excess of 100Mbps in mobile contexts and in excess of 1Gbps in static or nomadic (fixed-mobile) contexts. Presently LTE seems to be the technology favoured by the world’s cellular operators, since it provides a good stepping stone between both CDMA- and GSM-based mobile networks and LTE-Advanced – the only standard that currently conforms to the ITU’s definition of a fourth generation network. WiMAX, on the other hand, is receiving encouraging traction with telecoms operators – mobile and fixed – that have very little existing infrastructure but an overwhelming demand for high-data-rate Internet connectivity. While it is still very conceivable that the two technologies will ultimately end up duking it out for dominance, it’s unlikely that one will fall at the hands of the other. That’s because the technologies are fundamentally different, each with benefits and drawbacks that make it suitable for overcoming certain challenges and, in other cases, a non-viable route to follow. In order to explain this thoroughly, however, we’ll need to take a closer look at each technology.
What’s the fuss with WiMAX?
WiMAX is more formally known as IEEE standard 802.16 and originates from the LAN/MAN subcommittee. That, straight from the word ‘go’, should speak volumes about WiMAX’s roots and its intended use. Born out of the almost ridiculous growth of the 802.11 wireless standard – today responsible for the amazing number of WiFi hotspots we find in the workplace and in public areas such as parks, coffee shops and university campuses, not to mention modern homes – the basic idea behind WiMAX was to take the same tenets on which WLAN technology was built and extend them into the WAN environment. While more developed parts of the world could rely on fibre-optic backhaul networks, WiMAX was envisaged as a technology that would provide a wireless alternative for telcos with limited budgets that couldn’t hope to contend with the timeframes associated with installing fibre. The broader WiMAX standard has derivatives that are perfect for fixed last-mile connectivity, derivatives that are ideal for providing backhaul for high-speed telco networks, and even derivatives for more mobile applications – hence the reason it is touted as a possible competitor to LTE. And as mentioned previously, WiMAX is currently the top choice for telcos and Internet service providers looking to provide high-speed Internet services where relatively little or no infrastructure exists. The technology has undoubtedly been more successful than LTE, with numerous successful deployments undertaken in both the developing and developed world (by contrast to LTE, which is only now beginning to gain any traction). But WiMAX adoption hasn’t been without challenges.
What’s in a name?
The largest stumbling block WiMAX faces is not technological. While the name WiMAX makes the technology easier for the average user to understand – implying that it’s nothing more than WiFi taken to the max – WiFi’s numerous stability, security and performance-related issues kept it confined to the consumer market until recently. While WiMAX has good provisions for all of these stumbling blocks, its close name association with WiFi has meant it hasn’t been taken quite as seriously as many of the vendors might have liked. Outside of these concerns, it’s also worth noting that WiMAX makes use of regulated and leased spectrum, as opposed to WiFi, which operates in open spectrum. Apart from meaning that the spectrum assigned to WiMAX isn’t likely to interfere with, or suffer interference from, other wireless devices, it also means these networks are professionally installed by experts and as such are more likely to deliver on their performance promises.
Long-term evolution
As opposed to the IEEE standards process followed with WiMAX, LTE was developed by the 3rd Generation Partnership Project (3GPP) – a body consisting of the numerous contributors and stakeholders responsible for the evolution and maintenance of the GSM standards. As such, LTE is part of the evolution from the circuit-switched GSM networks of old, through the early General Packet Radio Service (GPRS)-based (2.5G) networks and, more recently, the Enhanced Data Rates for GSM Evolution (EDGE), 3G, High Speed Packet Access (HSPA) and HSPA+ networks in operation across the world today. While LTE is a much younger technology than WiMAX, what makes it so attractive is that it exceeds the performance of current HSPA+ networks – which have a theoretical downlink data rate of 56Mbps – and that it is entirely IP-based, from one end to the other. While WiMAX is a strong offering, LTE is the logical choice for existing mobile telcos because it builds logically on everything they have in place from an infrastructure point of view. In fact, the existing infrastructure vendors used by the world’s mobile networks all have rollout plans that can predictably transition their customers from their current 3G infrastructure through HSPA+ to LTE. And quite possibly beyond.
And the winner is?
Looking at where the roots of each technology lie, predicting which will win is by no means cut and dried. In fact, even though the market is expecting either LTE or WiMAX to prevail, it’s unlikely one will unseat the other entirely. It’s plausible that the two will continue to coexist – LTE as the natural, higher-speed evolution in the GSM realm, and WiMAX as the solution for purpose-built networks designed to connect users’ homes, and possibly even their notebook computers, to the Internet. It’s also worth noting that the two technologies are complementary enough to coexist in the same ecosystem, with WiMAX assisting with the ever-growing backhaul requirements of mobile network operators and LTE being used at the client end to provide fast data services to mobile phones and computing devices. Of course, it’s also not outlandish to predict that a combination of LTE and WiMAX will be used on the client end – each playing to its strengths as a data delivery mechanism, for mobile phone service and for connecting personal computers to the Internet, respectively. It’s interesting to note, however, that the vendors in this industry have, instead of choosing a middle ground that would see both technologies supported indefinitely, drawn battle lines and begun pursuing one technology or the other. Whether this will see an ultimate leader emerge in the coming years isn’t clear. It will almost certainly mean that things remain interesting in this space for the foreseeable future.
Mobile is Moving Africa
Growth opportunity on the “plateau continent”
These are exciting times for the mobile industry. Estimates at the time of this writing indicate that the industry is currently adding one billion mobile connections every 18 months worldwide, 40% of which are 3G – meaning that total connections are on track to reach six billion in the first half of 2012. Emerging markets such as Africa – often referred to as the plateau continent – represent a tremendous growth opportunity as they transition from 2G to 3G (EV-DO and UMTS900/HSPA+), which offers voice and data services at lower cost and with better performance. In the first quarter of 2010, Informa Telecoms & Media reported that by 2012 emerging markets are expected to represent 50% of 3G handset shipments worldwide. Growth of the African mobile market has been driven by the continent’s historical lack of fixed-line infrastructure for voice or data, essentially making mobile service the default means of voice communication and Internet access. The completion of a second deep-sea communications cable should also help drive down the cost of data and accelerate mobile data consumption.
The bottom line: improving the quality of life
While the revenue opportunities offer significant potential for mobile industry players in Africa, what’s really remarkable about mobile technologies in Africa goes far beyond the economics – it’s mobile’s ability to transform lives. Beyond voice calls, mobile phones can help Africans do any number of extraordinary things: improve access to news and information, expose schoolchildren to better learning opportunities, remind the elderly to take their medication… even track market prices for fishermen and farmers, ensuring that they get the best prices for their goods. The fact is, for many people in Africa, their first and only access to telephone services and computing capabilities is going to be through a mobile phone. If that’s not inspiring, consider the point from a macro perspective. Mobile networks, particularly 3G communications networks, are critical infrastructure in developing countries across Africa, and these networks are now a major factor in driving substantial economic growth and socioeconomic progress. A 2009 World Bank Information and Communications for Development report showed that wireless connectivity matters: a 10% increase in mobile phone penetration results in an increase of 0.81% in per capita GDP, and a 10% increase in Internet/broadband penetration results in an increase of 1.38% in GDP.
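Applied naively, those World Bank elasticities translate into concrete numbers. The sketch below is a linear, back-of-the-envelope illustration – the function name and the $1,500-per-capita example economy are hypothetical, and this is not the Bank’s methodology:

```python
def projected_gdp_per_capita(gdp_per_capita, mobile_pp_gain, broadband_pp_gain):
    """Apply the 2009 World Bank elasticities cited above:
    +10 percentage points of mobile penetration    -> +0.81% GDP per capita
    +10 percentage points of broadband penetration -> +1.38% GDP
    Linear extrapolation, for illustration only.
    """
    uplift = (mobile_pp_gain / 10) * 0.0081 + (broadband_pp_gain / 10) * 0.0138
    return gdp_per_capita * (1 + uplift)

# A hypothetical $1,500-per-capita economy gaining 10 points of each
print(round(projected_gdp_per_capita(1500, 10, 10), 2))  # 1532.85
```

Small in percentage terms, but compounded across a continent of a billion people, the aggregate effect is what makes these networks “critical infrastructure”.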
Making mobile more accessible and affordable for all
One might conclude that mobile technologies offer Africa a lot of potential, but how can a rural African citizen, one who might live on $2 (USD) a day, afford a mobile phone? Qualcomm and its many partners are working continuously to develop solutions that lower the barrier to entry for the everyday consumer. For instance, Qualcomm’s single-chip “system-on-a-chip” solutions and turnkey reference designs are not only making mobile phones more power-efficient and reliable, but also more cost-effective to build and less time-consuming to bring to market. The cost benefits realized by device manufacturers are ultimately passed on to consumers through the availability of more device choices at more affordable prices. Ultimately, efforts like these are bringing down the average selling price of mobile phones, allowing more and more Africans – particularly those in underserved communities – to discover and benefit from the mobile phone.
The future depends on us
What’s on the horizon for mobile is limited only by the imaginations of key industry players – device manufacturers, network operators, app developers and content publishers – as well as new market entrants who have the vision to apply wireless in a nearly infinite number of non-traditional ways. Regardless, Qualcomm will be there to help bring to fruition the big ideas and solutions that make a difference in Africa and accelerate mobility around the world.

Jing Wang, Senior Vice President and Chairman, Qualcomm Asia Pacific
If technology exhibitions were mythical cartoon characters, the Consumer Electronics Show (CES) held in Las Vegas every January would undoubtedly have to be Godzilla. Bigger than any other technology trade show in the world, more brutal on attendees’ feet than a marathon and more taxing on their minds than a master’s degree in advanced computational mathematics, CES has for some time now been ‘the’ place to unveil new technology. There are so many new things to see and so many different vendors to engage with that most news agencies take entire teams of journalists to the event – and begin reporting on the goings-on two days before the show opens its doors to the public. However, this year’s CES wasn’t as impressive as in previous years. That’s partly because the world is still recovering from the economic meltdown, and partly because the industry seems to be stuck in that uncomfortable space between new technologies becoming available and the mass adoption of those technologies. Think 3D television, tablet/slate PCs and cloud computing if you need examples. This year the show played host to only 2,500 exhibitors and commanded the attention of somewhere close to 120,000 attendees. But even in its smaller form, the sheer scale of the trade show means it’s the perfect event for gauging market sentiment towards specific products and technologies, and a great opportunity to identify the trends that will shape the electronics space in the years to come.
More of the same
While there were some new takes on technology, the majority of the products announced at CES could have been predicted six months ago. For example, tablet or slate PCs continued to be a big focus area, and well over five of the industry’s big names made announcements in and around the tablet or slate computing space. Another slightly predictable ‘hot topic’ was the evolution of 3D, and the rather shrewd realization by manufacturers that by enabling users to create their own 3D content, they can get their 3D televisions flying off the shelves. As was expected, the show was also filled with a number of new handsets that US networks are still getting away with calling 4G, when in fact they’re equipped with nothing more than HSPA+ or LTE. There were of course some exceptions. One rather unexpected move came from US network operator Cricket, which aims to provide users with an ‘all-you-can-eat’ music service along with an unlimited voice, SMS and data plan. Another – and one that has stronger relevance on African shores – was the announcement of Motorola’s Atrix handset, which becomes a desktop computer, media centre or notebook computer as and when the user’s needs dictate. But enough glossing over the details … let’s get knee-deep in what was announced.
Tablets take centre stage
When Apple announced the iPad a little more than a year ago and the market finally got to experience how trouble-free this new mode of computing was – browsing the social web and consuming media with ridiculous ease – it was clear that everything was about to change. And even though it’s taken the market some time to catch up, now that RIM is aiming to ring-fence its customer base and Google has released Honeycomb, the tablet version of its Android operating system, things are becoming interesting. While it’s par for the course to expect the vast majority of vendors to simply take the ‘me too’ approach, much as Samsung did with its release of the Galaxy Tab, there will be some bold attempts at redefining the market. And there are really only three that stand out from the array of tablet-centric announcements at CES.
Motorola Xooms into view
The first was Motorola’s announcement of its 10-inch, Honeycomb-powered Xoom tablet. As yet, we’re unsure what processor it runs (Motorola has said no more than ‘it’s a dual-core’), exactly how much memory it has on board and what it will cost. What we do know is that it’s the closest thing we’ve seen to an iPad – both in terms of the overall polish of the hardware and the fluidity of the graphical user interface – and in a field of unsuccessful imitators, that’s a good thing. Unfortunately, we’ll have to wait some time for Motorola to firm up some of those details.
A decent Windows 7 tablet
Next in line when it comes to interesting tablet announcements, ASUS – the company that pretty much invented the netbook market with the release of the Eee PC all those years ago – let fly with the only remotely compelling Windows-based tablet we’ve seen to date. Called the Eee Slate EP121, this little puppy has a 12.1-inch capacitive touch screen and runs an Intel Core i5 processor with 4GB of memory and a 64GB solid-state drive. Reality check: that’s a more powerful specification than the vast majority of notebooks out there today. When the EP121 was demonstrated on stage, the presenter retouched a 60MB image using the stylus while simultaneously playing back a 1080p video in the background. Finally, there’s a tablet capable of running Windows 7 in a compelling way. Again, details that weren’t dished out readily at the event include the unit’s battery life and its expected price point. Despite this, it looks promising.
Best of both worlds
Rounding up the tablet announcements, Lenovo finally showed off its U1 Hybrid: as the name suggests, a mix between a tablet or slate and a full-blown notebook that doesn’t compromise on either device’s core functionality. The idea is simple. Tablets are great for certain things, but sometimes notebooks are just far better for getting the job done. With the U1 Hybrid, users won’t have to make that tough choice. On one side, the U1 consists of a Core 2 Duo notebook, complete with a keyboard, trackpad, hard disk and other system essentials, running Windows 7. But instead of a normal screen, the U1 has a LePad – Lenovo’s touch-screen tablet – which unclips from the notebook chassis and transforms into an Android tablet when the user wants to change work modes. To make the whole scenario more awesome, Lenovo has ensured that when the U1 is in ‘notebook mode’ the tablet’s internal memory is mounted like a USB flash drive in the Windows 7 file system, and that whatever content was loaded into the tablet’s browser when the machine was docked is automatically synchronised to the Windows 7 browser. As would be expected, the same applies when undocking the tablet from its chassis. While Lenovo has an interesting approach for taking the Hybrid and LePad to market – selling the tablet separately and the U1 as a kit, but not the U1 chassis as an upgrade – what’s also interesting is that this product in its current form won’t make it outside the Chinese market. That said, a couple of tweaks to this design could well see it released elsewhere in the world before the end of the year. Whatever happens, Lenovo has committed to making tablet or slate-related announcements that are relevant to the rest of the world before the end of the year.
3D content creation
Putting tablets on a shelf for the meantime, the second major trend at CES was 3D technology, and more specifically the strategies the leaders in the market will employ to continue driving this new technology segment. As most analysts and some large consumer electronics brands will admit, 3D technology hasn’t been nearly as much of a success as the big noisemakers in the industry would have liked. While it’s still early days for 3D, like anything in the consumer electronics space there’s always time pressure to contend with. And although there is a wealth of display devices available today (and some that don’t require glasses coming during 2011), there’s not nearly enough content to create any real interest for the average person in the street. Add to this the fact that we’re living at a time when social media interactions and users’ ability to create and contribute their own content are of massive importance, and it follows logically that the 3D-capable still and video cameras announced at this year’s CES are designed to get users excited about 3D content creation … and, in doing so, sell more 3D televisions.
A horse for every course
The majority of the announcements made around 3D-capable cameras came from the likes of Panasonic and Sony, who together seem to have a solution for every user. Panasonic’s announcements comprised a number of new camcorders with 1MOS sensors (designed primarily for capturing 1920 x 1080 clips), a gaggle of others with a 3MOS sensor (designed for more professional 1080/60p shooting) – both ranges capable of recording 3D video with an additional lens – and a new ‘professional’ 3D camcorder with a US$21,000 recommended price tag. On the upside, it does come with a special lens, dual memory cards and more. Looking next at the company that could well have the largest vested interest in 3D, it’s not surprising that the number of camera-centric announcements from Sony dwarfed the rest of the industry’s. Starting with 3D video, the company announced a new Handycam featuring what Sony calls ‘Double Full HD 3D’. In simpler terms, these Handycams feature an integrated dual-lens system, which includes two Sony G Lenses, two ‘Exmor R’ CMOS sensors and two ‘BIONZ’ image processors. The result is the ability to record 2D high-definition and 3D high-definition footage seamlessly and simultaneously. Next up, jumping on the 3D stills bandwagon, Sony’s five-unit lineup of Cyber-shot cameras has 16.2-megapixel sensors and, quite remarkably, is able to take 3D stills using only one lens and imager. Rounding out its announcements, Sony added a 3D unit to its popular Bloggie range of shoot-and-share cameras. The new 3D camera, as expected, makes use of two lenses, two image sensors and a stereo microphone to record 3D footage. Whether the focus on 3D cameras will save the 3D display space remains to be seen. One hopes that the current worldwide focus on user-generated content will be enough to give this new market segment impetus.
No technology trade show would be complete without a bunch of smartphone-centric announcements, and CES played host to a number of new handset launches. While for the most part it was more of what we’ve come to expect, there were some exceptions. Carrying on the 3D trend, LG showcased an early concept of a 4.3-inch smartphone that’s capable of playing back glasses-free 3D video (using the parallax barrier method). This is a long way off, but it was interesting to see vendors thinking in this direction. However, hot on the heels of its announcement of the Xoom, it was Motorola that again stole the show with the release of two new handsets – the Atrix and the Droid Bionic. While the Droid Bionic is nothing more than a crazy-fast LTE-equipped cellphone, the Atrix is a completely new concept that we believe will take the market by storm.
Press and analysts alike have been saying for years that carrying around multiple devices, each with a separate instance of our data on it, is a pretty counterintuitive exercise, not to mention one that’s heavy on the pocket and the back. What we’ve all really needed is a single device with a screen large enough to provide access to one’s most vital information while on the road, but which back at the office can be attached to an external display, keyboard and mouse so that real work can commence. It would also be cool if this device were media-centric, so that it could double as a media hub some of the time, playing back high-definition stills and video on a large screen when needs dictate. And it seems like Motorola is the only company that listened. The Atrix does exactly what the dream outlined above calls for – and more. It’s a smartphone when you need it to be, a net-top when you need it to be (using a separately sold dock) and a media hub when you need it to be (using the same dock). Beyond that, Motorola has developed a notebook-chassis-style dock – much the same form factor as a MacBook Air – into which the Atrix can be slotted, giving users a netbook while they’re out on the road. Again, while there’s relatively little tangible info available on the Atrix (it’s due for release in March in the US), we know that it runs Android, uses a dual-core NVidia Tegra chip and that the notebook-style dock has a six-hour battery, which charges the smartphone’s internal battery while it’s being used. The Atrix is by a long shot the most interesting announcement to see the light of day at CES and one that could see Motorola taking the kudos for finally unseating the iPhone’s dominance in the market: not because it’s better at doing what the iPhone does so well, but because it solves a whole bunch of problems the iPhone doesn’t.
The Atrix will undoubtedly prove as significant as the first tablet devices, the mainstream debut of 3DTV and, almost certainly, the first smartphones. And in a year’s time, who knows where this trend will drive things?
So, there you have the announcements that are likely – from a trends perspective, at least – to shape 2011’s tech landscape. While we wait with bated breath to see Apple’s response to many of the announcements made by its rivals at CES (the fruit company doesn’t unveil or exhibit at CES), it’s clear that the consumer electronics industry is alive and well, and is where the majority of innovation is happening today. Will the focus ever return to the business market? It’s unlikely. Does it matter? Not really. Most new consumer technologies make their way into the business sector sooner or later. It’s managing that transition that remains tricky and, more importantly, is where the business sector should be focusing its attention.
This Issue of Africa Telecoms is focusing on Backhaul as a subsector of the Telecoms Market in Africa. What do you think the most challenging area is for Mobile Operators in Africa when it comes to Backhaul?
Mobile operators in Africa are facing the same issues as network operators in many parts of the world: building their networks to meet the quality of experience their customers expect while earning the profits their shareholders demand. From the vendor perspective, it comes down to whether you make your customers money or save them money. Backhaul is frequently seen as a spiralling expense; we believe we have a solution that brings that situation under control at a cost that is very attractive compared to the alternatives.
CBNL is also stated to have 30 deployments around the world, supporting 2G, 3G, HSPA, WiMAX and LTE networks. How many of these deployments are in Africa? Can you provide further insight into your operations in Africa?
Certainly, our presence in Africa has grown considerably over the last three years. We currently have five network deployments with MTN in countries including Rwanda, Nigeria and South Africa. All in all, we have eight networks operating in Africa, including deployments with Vodacom, Gateway and Inwi. We also recently renewed our framework agreement with the MTN Group, which was announced in October last year. Each of MTN’s operating companies now has access to a range of pre-approved VectaStar microwave backhaul equipment which has been validated for a range of applications, including backhaul of 2G, WiMAX and 3G network traffic, and the provision of broadband Internet access services to businesses. The FIFA World Cup was a significant catalyst for growth of South Africa’s ICT and mobile telecoms sector. Operators have made substantial infrastructure investments and service upgrades that will benefit subscribers for years to come. In particular, HSPA+ has given consumers significant improvements in data speeds.
CBNL states that it is a market leader in Point to Multi Point backhaul solutions. Can you explain the differences between the CBNL products and traditional solutions?
There are a few fundamental differences between our products and traditional solutions such as point-to-point (PtP), but let me briefly explain how VectaStar works. Our technology uses a transmission architecture similar to broadcast standards, where a central hub communicates with a number of remote terminals within a sector. Spectrum and capacity are shared across the sector, giving operators the flexibility to manage network resources and provision those resources to a particular cell site if and when required. This is inherently different to PtP, which uses two radios per connection on a one-to-one basis. Spectrum is tied to each of these connections and cannot be reassigned to another cell site that may be experiencing higher demand. Essentially, VectaStar is different because it simplifies microwave backhaul networks, reducing the number of radios needed to create each sector, which in turn makes them cheaper and faster to build and reduces backhaul costs by up to 60%.
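The resource-pooling difference between point-to-multipoint and point-to-point backhaul can be illustrated with a toy allocation model. This is a sketch only: the capacities, site names and proportional-sharing policy are invented for illustration, not CBNL figures or the actual VectaStar algorithm.

```python
# Toy model: PtMP pools one sector's capacity across all cell sites on
# demand, while PtP pins a fixed capacity to each link regardless of load.

def ptmp_allocate(sector_capacity, demands):
    """Share a sector's pooled capacity across cell sites; if demand
    exceeds the pool, scale every site back proportionally."""
    total = sum(demands.values())
    if total <= sector_capacity:
        return dict(demands)  # every site gets what it asks for
    scale = sector_capacity / total
    return {site: d * scale for site, d in demands.items()}

def ptp_allocate(link_capacity, demands):
    """Each site has its own fixed link; spare capacity on a quiet
    link cannot be reassigned to help a busy one."""
    return {site: min(d, link_capacity) for site, d in demands.items()}

# Three hypothetical sites, one busy: 60 Mb/s of demand at site A.
demands = {"A": 60, "B": 10, "C": 10}

print(ptmp_allocate(100, demands))  # pooled 100 Mb/s sector: all demand met
print(ptp_allocate(34, demands))    # 3 x 34 Mb/s fixed links: site A capped at 34
```

The point the model makes is the one in the answer above: with PtP, the spare capacity on the links to sites B and C is stranded, while the PtMP hub can hand it to the busy site.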
At the end of 2010, CBNL had a major round of financing that was raised, and Graham Peel, CEO, stated that the financing would be used to drive product development as well as sales and support. Is Africa one of the regions that CBNL will be focusing on in 2011?
Absolutely, our growth plans include increasing the scale of our operations across Africa, particularly in the Sub-Saharan region and Nigeria. With mobile internet access in Africa growing exponentially and with many operators in the region deploying or looking to deploy 3G services over the next few years, CBNL has a strong opportunity to help operators manage and provision their enterprise access and backhaul networks for this increase in lumpy data traffic.
The ever-growing data market in Africa is a great opportunity for CBNL as backhaul becomes more and more important for network operators. Where does CBNL see this market moving to, and how is CBNL helping operators face the issues of mobile data backhaul?
You’re right, there is a great opportunity. We’ve seen a huge amount of optimism from mobile operators about the uptake of mobile devices, including tablets, across Africa. Bringing ‘mobile to the masses’ in Africa has a significant upside for all concerned. For operators and device manufacturers, as well as all the other companies involved in the installation and operation of the networks and services, there is obviously an opportunity for new revenue growth, as well as the economic benefit it will provide to a much wider community. Our technology is ideal for helping African operators meet the demand this growth creates, as it allows them to make efficient use of scarce spectrum resources and, at the same time, ensure that provision is made for peak data throughput.
How does CBNL ensure the quality of service of data and/or voice running through its Microwave point to multipoint system?
VectaStar offers four priority classes, numbered 0 (the highest) to 3 (the lowest), and each service is allocated to one class. Bandwidth is offered first to the highest-priority class, and the remaining bandwidth is then offered to the next class down. Prioritisation is used in conjunction with adaptive modulation. All radio links suffer from fading, and typically the longer the link and the higher the frequency, the more frequent the fading will be. Radio links are therefore planned with this inevitable fading in mind, so that they remain reliable in a fading environment. When properly designed, radio links can provide 99.999% availability. With seven levels of adaptive modulation built into VectaStar, the system ensures the best possible performance in all weather conditions.
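The strict-priority behaviour described above can be sketched in a few lines. The class numbering (0 highest, 3 lowest) follows the answer; the service names, demands and capacity figures are invented for illustration and are not VectaStar internals.

```python
def strict_priority_allocate(capacity, services):
    """Allocate link capacity by strict priority: class 0 (highest) is
    served first, then class 1, down to class 3 (lowest).

    `services` is a list of (name, priority_class, demand) tuples.
    Returns {name: granted_bandwidth}.
    """
    granted = {}
    remaining = capacity
    for name, prio, demand in sorted(services, key=lambda s: s[1]):
        give = min(demand, remaining)
        granted[name] = give
        remaining -= give
    return granted

# When fading forces the link down to a lower modulation level, capacity
# shrinks and the lowest classes are squeezed first.
services = [
    ("voice",       0, 20),
    ("signalling",  1, 5),
    ("video",       2, 40),
    ("best-effort", 3, 50),
]
print(strict_priority_allocate(100, services))
# clear weather, 100 units: best-effort gets the leftover 35
print(strict_priority_allocate(40, services))
# faded link, 40 units: voice and signalling intact, video squeezed,
# best-effort gets nothing
```

This is why the interplay with adaptive modulation matters: the scheduler itself doesn't change in bad weather, but the capacity it divides does, and strict priority decides who feels the loss.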
How well does the CBNL Microwave systems integrate with existing legacy terrestrial networks?
VectaStar is actually a new topology for microwave backhaul, so the most efficient way to integrate it with existing terrestrial microwave systems is to use VectaStar as an overlay in dense urban areas. This overlay can also be applied to support the deployment of 3G networks throughout the region, leaving traditional point-to-point microwave to deal with the rural situations where it is still efficient.
Does the CBNL Vectastar platform increase or decrease OPEX and CAPEX and what payback period can the operators expect?
It decreases both CAPEX and OPEX. It does this by reducing the amount of equipment that needs to be deployed – thereby reducing installation labour, which is a major contributor to CAPEX. Once installed, VectaStar has a smaller footprint, reducing site and antenna space rental costs, and it uses far less radio spectrum, which reduces annual costs as well. It is difficult to give an exact ROI figure that applies to every situation, but we frequently find that the reduction in CAPEX and OPEX, as well as the time to build the network, is 50% less than with the alternatives, so you can see that the improvement in ROI is significant.
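Payback periods will vary from network to network, but a toy calculation shows how halving build cost and running cost shortens them. All figures below are invented for illustration; they are not CBNL pricing or operator data.

```python
def payback_months(capex, monthly_revenue, monthly_opex):
    """Months until cumulative net cash flow recovers the upfront CAPEX."""
    net = monthly_revenue - monthly_opex
    if net <= 0:
        return float("inf")  # the build never pays back
    return capex / net

# Hypothetical backhaul build serving one cluster of sites, with revenue
# held constant and only the cost side changing.
traditional = payback_months(capex=1_000_000, monthly_revenue=80_000,
                             monthly_opex=30_000)
halved      = payback_months(capex=500_000, monthly_revenue=80_000,
                             monthly_opex=15_000)

print(round(traditional, 1))  # 20.0 months
print(round(halved, 1))       # 7.7 months
```

The arithmetic is trivial, but it makes the claim concrete: cutting CAPEX and OPEX in half more than halves the payback period, because both the numerator and the denominator move in the operator's favour.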
With Infrastructure sharing becoming more and more important internationally, does the Vectastar system allow for the potential of Mobile Operators sharing backhaul infrastructure?
Good question. It really depends on how the sharing is structured by operators. If they are using a common RAN, the process of sharing is relatively simple for VectaStar to support. However, if the sharing is only site-based, then the different equipment at each site could present issues for any backhaul vendor. Our answer would be to work with operators to come up with the right solution – but it’s fair to say that we haven’t found a network that we can’t backhaul, yet.
Can the current price curve of mobile backhaul keep up with the cost of flash memory and broadband data to ensure that the end cost to consumers remains on a downward curve?
Absolutely, in all cases microwave capacity is growing faster than the aggregated traffic coming from cell sites. In fact, CBNL has been developing a new radio controller that will eventually achieve 1Gb/s throughput per sector – with the first iteration doubling our current throughput to 300Mb/s. The biggest challenge for the industry is reducing the cost and time spent building the backhaul network in the first place. That is an area where our product, VectaStar, excels: it takes a fraction of the time to provision a connection compared to the alternatives – enabling our customers to build out their networks quickly and capture the market from their competitors.
Does the Vectastar system allow for the use of Carrier Grade Ethernet as well as Microwave transmission?
Yes it does. All our VectaStar platforms include Ethernet interfaces and offer throughput of 150Mb/s. With the recent introduction of the VectaStar Radio Controller, based on a Gigabit Ethernet backplane supporting up to 10Gb/s sustained operation, we will increase throughput to 1Gb/s per sector.