
Advertorial – E-Publishing – NDS helps publishers embrace the e-reader

Amazon's announcement that it had sold more electronic books than the print variety during the pre-Christmas shopping season was firm evidence, if any was needed, that the era of the e-book has finally arrived.

For those who still remained unconvinced, the Consumer Electronics Show (CES) in January brought a slew of new e-readers and tablets, with HP, Dell, Lenovo and many lesser-known companies all unveiling new products.

So far, most of the e-reader excitement has been generated by the hardware vendors, particularly by the Kindle device from Amazon. But NDS is cementing its position as a technological innovator, as the first software vendor to launch an end-to-end solution for the e-publishing industry. Building on its experience as the world's leading provider of security and middleware systems for pay-TV, NDS has created a rich, multi-level solution for the e-reader which addresses the needs of all players in the e-publishing ecosystem, from device manufacturers and publishers right through to the end user.
The role of publishers is key to the success of any e-publishing solution, as ultimately, e-readers need content to display on their screens. E-publishing in turn offers publishers – of books, newspapers and magazines – the opportunity to regain at least some of the ground they lost during a decade of ineffective dabbling on the Internet.

The digital revolution has not been kind to print publishers, who have seen sales plummet in recent years, along with an even more precipitous decline in advertising across all print media. Publishers have good reason to be wary of the Internet and the devices that display the electronic versions of their products – more often than not without regard for copyright.

Alongside the threat, however, the digitization of print media also holds out the promise of desperately-needed new revenue streams – a facet made possible by the NDS solution. Crucially, the NDS system supports advanced advertising techniques, including targeted and contextual advertising, telescopic and click-through ads, and advanced purchasing models such as the ability to rent, gift or lend content.

The inadequacy of Internet advertising as a replacement for lost subscription revenue was the key lesson the publishers took from their Internet experience. The NDS e-publishing solution will enable them to return to a two-pronged revenue strategy, while utilizing the full advertising potential of the electronic platform.

Publishers are determined not to repeat the mistakes they made in giving away their content for free on the Internet. Thus, their move into e-publishing creates a requirement for robust and effective e-publishing technology solutions, above all for Digital Rights Management (DRM), the technology that enables publishers to realize revenue from the content published on an e-reader or similar device. To fully address those needs, the NDS solution includes protection of the service, to allow controlled access to content based on user rights; protection of the device, to prevent modification; and protection and authentication of the content itself.

True to its end-to-end approach, NDS is also able to provide the e-reader user interface, which can be based on the look-and-feel of the print product, along with a variety of advanced capabilities. These include proven measurement and analytical technologies, which enable service providers to better understand their users' reading habits and thus to tailor their services to their customers. And to complete the package, NDS also provides consulting services for publishers who would like to better understand the technical issues in the value chain. Indeed, before defining an e-publishing strategy, many issues need to be addressed, such as content aggregation and reformatting, and the security and control of content on the end device in a retail market. NDS expertise in this domain is well recognized, and NDS' consultancy services are well positioned to support publishers in making the right decisions.

For now, many publishers seem to be content working with the established e-reader hardware vendors. But such cooperation is, in all likelihood, a holding tactic. Many publishers are said to be dissatisfied with the deals they are getting from the hardware vendors (typically 30% for the hardware vendor, according to published reports), and they dislike the vertical nature of the business; they would prefer that their content be available across multiple e-readers and other mobile devices.

Eventually, the publishing industry is likely to follow the lead of the Hearst Group, which launched its own e-reader at CES. "We are going to create an entity by publishers, for publishers," group president Kenneth Bronfin told the Wall Street Journal, adding that the company would also establish a portal with its content available for paid download to a variety of electronic devices.

Other companies that are reported to be planning to head in the same direction include Time Inc., Conde Nast and News Corp.

Publishers have a long road ahead of them before the print revenue they have lost to the Internet can be recovered on the e-reader. But they seem to be determined to see it through, and they will need the support of strong and innovative technology partners if they are to succeed. NDS is just such a partner.

Advertorial – Mobile data growth

Strategies for profiting from mobile data growth

The mobile data industry has evolved rapidly over the past two years, with the impact of growing 3G penetration, lower cost smartphones and USB laptop dongles, together with the popularity of mobile applications and flat-rate data plans. This has resulted in huge growth in data traversing operators' networks. The market has now reached a chaotic and critical point, with network congestion being felt by operators and consumers alike.

In response, operators are introducing a toolkit of network congestion management strategies that will reduce costs and improve economies of scale by balancing traffic requirements across networks and implementing real time usage controls. 

Policy control, data traffic offload, evolution to 4G, and network optimisation will incrementally reduce data delivery costs by more than 60 per cent over the next three years. A holistic approach that takes into consideration traffic growth, subscriber behaviour, and application trends is vital to long term success.

Policy control – how, when and under which circumstances subscribers can access networks, applications and services – will contribute cost savings of over 10 per cent, equating to over $15 billion in savings by 2013 in the US market alone.

Policy control provides real-time network, application, and subscriber policies that allow operators to manage mobile data growth and deliver personalised services on a far more refined level than was possible in the past. It helps operators prioritise traffic based on an individual user's subscription. So effective is policy control in reducing traffic peaks that data throughput in the busiest times can be reduced by 15 to 20 per cent, according to Chetan Sharma Consulting.

Shifting data traffic off a congested mobile network and onto another access technology fundamentally changes the economics of delivering that data. Offload is being implemented by operators globally to manage the total data throughput with, typically, two flavours: offload to Wi-Fi and offload to femtocells. In some regions, WiMAX deployments are also crucial to an offload strategy.

Operators deploying a data traffic offload strategy, using service control to ensure transparent and secure subscriber access, can expect network cost savings of about 25 per cent per annum by 2013.

Infrastructure evolution to 3.5G (HSPA) and 4G (LTE) lowers the cost-per-bit for data throughput on the network, thereby reducing overall costs. Network cost is lowered dramatically with each incremental technology deployment, with the evolution to HSPA and then LTE saving just under 20 per cent in costs, according to Chetan Sharma Consulting.

Cost reduction is only one side of the equation. Operators are now creating new service models that move away from unsustainable flat-rate plans towards tiered and usage-based pricing underpinned by subscriber, service, and policy control as shown in the table.

Flexible, dynamic, and personalised pricing models that reflect subscribers' preferences and context, bandwidth and application usage, and network conditions are the wave of the future. 

Comparative cost reduction strategies, when placed alongside the new service models now being introduced, aid the development of sustainable business models for the mobile industry.

But pricing models will ultimately determine future success and growth in the sector. Unsustainable all-you-can-eat data plans will evolve to include flexible pricing models based on time-of-day, individual usage patterns, casual usage, application preferences, and location.

It is ultimately the responsibility of mobile operators to introduce these models with quality of service guarantees that are based on users modifying their behaviour. With that will come order from the mobile data chaos.

 

Innovative service models

Service models are evolving in a data-centric mobile world as a result of massive growth in data throughput. Flat-rate data plans are unsustainable for the heaviest users, and innovation is inevitable:

• Speed-rated: These plans offer operators the ability to increase revenue from the heaviest users by placing these subscribers on the most expensive tariffs, implemented through effective policy control on the consumer side.
• Time-based: Telecom Italia Mobile has successfully deployed time-based mobile data plans. The model implements tiered pricing based on the number of minutes a user spends on the data network.
• Bandwidth usage and application specific: Next-generation policy control solutions enable operators to implement controls and pricing based on bandwidth usage or specific traffic types. Operators can flexibly charge for heavy bandwidth services such as video or peer-to-peer in real time. SmarTone-Vodafone, for example, is delivering tiered services in Hong Kong based on bandwidth usage and time, as well as applications on demand, using Bridgewater's policy control and subscriber data management capabilities.
• Time of day: Operators in mature markets have seen a clear time-of-day usage pattern emerge for mobile data. Similar to other utilities, they can charge more at peak times according to network capacity or, conversely, offer consumers incentives to download during quiet network times. Underpinned by policy control, dynamic and transparent pricing enables operators to effectively manage peak loads.
• Location-based service models: Traffic patterns over the past two years demonstrate that the most congested cell sites are in urban centres. Charging models based on congestion are commonplace elsewhere – London's congestion charge zone, for example. Could operators implement a similar model on their mobile networks if guaranteed quality of service is the outcome?
• Quality of service models: Guaranteed QoS comes at a cost to operators, especially in mobile networks, where bandwidth is necessarily a shared resource. But the emergence of 'bandwidth boost' models – whereby a user is offered a short-term increase in bandwidth for a set fee, for example – provides the opportunity to implement service level agreements.
• Ad-funded solutions: Mobile advertising is beginning to emerge as a revenue source for operators. With subscriber data privacy concerns now being addressed, mobile advertising could create new revenue streams for the operator, personalised offers for the consumer, and more brand awareness for the advertiser.
• Mobile commerce driven: Japan offers insight into a commerce-driven mobile data market, with an open ecosystem driving adoption and consumer spending on services. Leading mobile Internet players including Yahoo! Japan have developed a viable market for content, services and mobile advertising in partnership with mobile operators.
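Several of the models above reduce to simple policy rules. As a rough sketch – the peak window and per-megabyte rates below are invented for illustration, not any operator's actual tariff – a time-of-day tier might look like:

```python
from datetime import time

# Hypothetical tariff: illustrative rates only, not a real operator's plan.
PEAK_START, PEAK_END = time(8, 0), time(20, 0)
PEAK_RATE_PER_MB = 0.10      # premium rate during congested daytime hours
OFF_PEAK_RATE_PER_MB = 0.02  # incentive rate for quiet network times

def data_charge(megabytes: float, at: time) -> float:
    """Charge for a data session under a simple time-of-day tariff."""
    rate = PEAK_RATE_PER_MB if PEAK_START <= at < PEAK_END else OFF_PEAK_RATE_PER_MB
    return round(megabytes * rate, 2)
```

A real policy control system would of course evaluate such rules per subscriber and in real time against live network conditions; the point of the sketch is only that the pricing logic itself can be very simple once policy control is in place.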

Anu Shah – Interview

An open, flexible service environment will help operators create and provide better customer experiences, Anu Shah, Head of IMImobile Europe, tells Keith Dyer

Keith Dyer:
Anu, many readers might not be aware of IMImobile’s capabilities, as you have not been visibly active in the European market for all that long. Yet you have been helping operators across the world build and expand their service offerings for years. How would you position the company within the European market?

Anu Shah:
We are Service Creation partners for mobile operators, to help them create, launch and then grow the penetration, usage and scale of their service portfolio. We have a modular series of services, from music and messaging to advertising and social media, all built within our open-API approach. This means we can integrate our own, commercial services to interoperate with each other, and with third party service environments. Added to that is the important fact that we can offer these either as an in-house trusted partner, or on a managed service basis.

Our view is that as operators face up to the challenges of creativity, innovation, and maintaining high quality customer experiences, they need to be able to create combinations of services across their portfolio. At the moment, very often they can’t do that. Or they can only do it with a great deal of integration expenditure and resource. This then means that the resulting service cannot afford to fail, as it has had too much invested in it. Our vision is that operators need to be able to move flexibly and quickly to create new combinations of services. They may choose to launch a music service, integrate this with social media and then add advertising and be able to test these propositions with consumers quickly.

Keith Dyer:
It’s certainly a view that operators have heard before. Of course, we have seen investment in service delivery platforms, in opening up OSS architectures in a number of different ways, and the move to cloud-based applications. All of these have been intended to enable operators to act more creatively and be more agile. Why do you think IMImobile’s approach meets the demands you have outlined?

Anu Shah:
In a sense we have been lucky because we originally set out with that flexibility in our approach – and we have been able to evolve with that. It’s really something that is in our technical and cultural DNA.

In terms of technology our products have all been built with an open architecture, but really we are trying to sell a solution or a service, rather than focus on technology. Operators come aboard with us because they see how we make a great effort to integrate with their existing services and billing systems and help them build new services. We also make it simple for them to achieve a cross-service view of their subscriber and network data.

Operators are faced with a world where they are accused of being dumb pipes. Actually I think that’s an unfair accusation and very far from the truth. But it is difficult for them to collect and analyse their data in one place. So that’s where our subscriber modules, combined with our open API approach, helps, because it can collect data centrally from all services.

This approach, opening up and integrating core elements to a central subscriber module, means that operators can move faster and innovate in time with their customer demands, rather than always be reacting. They can move from being network-centric entities to customer-centric businesses.

Keith Dyer:
And it’s an approach you have demonstrated real-world success with, rather than just as a vision?

Anu Shah:
We have. Although Europe is a new opportunity for us, we are now proceeding to grow organically and inorganically in this market, and have a strong pedigree globally.

We have put together a proposition that's proven to be scalable and innovative in India, Latin America and across Asia. Last year, for example, we signed with the MTN Group for 21 territories, providing their entire CMS infrastructure, and we went from scratch to a live service in South Africa in just a few months, driven by the immovable deadline of the Football World Cup!

That’s the reason why Sequoia Capital, a recent major investor in our business, sees us as one of a few companies with the proven global deployments and core technology backbone that prove we are able to work as a service creation partner for Tier One operators.

Keith Dyer:
And in line with this theme of enabling operator innovation, you have recently launched another service module to your portfolio.

Anu Shah:
We have recently launched our DaVinci Social service. DaVinci Social helps operators to build an enhanced phone book for their users, bringing together contacts, events, social interactions and updates all within a rich phone book environment.

The aim is to enable operators to offer a competitive social service without the associated high operational costs of designing a service from the ground up. Of course, it will also bring integration with our other services such as music and advertising, and offer extensions into other third party elements to ensure the service can keep evolving at pace.

Keith Dyer:
And you see the phone book as being a critical point of contact, and differentiation, for operators?

Anu Shah:
I think it really is the spring board not just to the communications experience, such as messaging and voice, but to other services that we and others bring, such as music, entertainment, and advertising.

Operators have found it challenging to make their services hold together in an integrated fashion from an end user point of view. With V360, Vodafone is one that has made the right move in saying, “We want to use this connected address book as the hub for all our other activities.” It means you can always be embedded in the daily activity of a consumer. That lends itself to developing an advertising proposition around certain events, say. It means operators can start offering a service beyond just mobile services to become a full digital service provider.

I am positive about this opportunity for operators because they have three or four key assets they need to fully exploit – the phone book, consumer data, billing relationships and their brand. So far they have struggled to do this from a consumer perspective, but providing a good service around the phone book could enable them to utilise all of their assets. There are, of course, question marks over whether Vodafone has executed its strategy correctly, but the objective is clear. With solutions like ours that are designed to create a more connected user experience, collecting and analysing consumer data becomes an integral part of designing, testing and launching new services quickly. Historically, this data, while available, has been difficult to discover and even more difficult to use for service creation.

Really, the connected address book should just be part of a service provider’s core offer. It’s still early days but these are important first steps into the market.

Keith Dyer:
So far from providing point solutions – “here’s a music platform, here’s an ad server” – you’re talking about helping operators utilise their core assets more effectively based on actual subscriber usage data from the services you and others provide.

Anu Shah:
Yes. Our objective is to provide specific solutions that are flexible enough to be part of any existing or evolving service environment. Operators have to be thinking about innovation and how they can build that into their business processes. Our solution is stronger than any point solution.

As I said, the advantage is we can integrate easily, providing a high level of technology re-use. Even within just one service, say music, an operator may have one provider for his real tones service, one for ring back tones, and a full track download service as well. Yet it's all music – and there's a high chance that a user would respond well to a combined offer, or be prompted to use one service whilst engaging with another. We offer all of these services, or provide just one of them as part of a solution to weave together all of the components and data so that an operator can add real value to the consumer experience.

Our whole aim is to increase revenues and consumer uptake of existing and new services. Even a slight tweak to the CRM around a service may lift that service penetration from 2 to 2.2%. But that’s a 10% rise in itself. That’s what I meant when I talked about us regarding ourselves as an in-sourced partner focused on revenue generation, rather than just an outsourced services provider. It’s also an advantage we have gained from working in some of the world’s most competitive markets such as India. Sometimes working in that environment drives operators to achieve higher levels of innovation and marketing skills.
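The arithmetic behind that claim is worth making explicit: a lift from 2.0% to 2.2% is only 0.2 percentage points in absolute terms, but a 10% relative rise – the figure a revenue-share partner cares about.

```python
# Relative vs. absolute change, using the figures quoted above.
old_penetration = 2.0   # service penetration before the CRM tweak (%)
new_penetration = 2.2   # service penetration after the tweak (%)

absolute_change = new_penetration - old_penetration                    # 0.2 points
relative_rise = (new_penetration - old_penetration) / old_penetration * 100  # 10%
```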

Keith Dyer:
So you think that operators do have the tools they need to fight back against the loyalty consumers may feel to other providers?

Anu Shah:
Yes. Sure, they need to be innovative, focus quickly, and provide a great customer experience. And let’s be honest, how many operator services can you hold up and say that it’s a fantastic customer experience? But if the customer experience is great then customers will follow that. And operators have the added advantage of being trusted far more with customers’ data than many other providers. So that is where partners like us come into play.

For example, one area where we can help with this focus on innovation is that we are working with operators, developers and other third parties to create a fully managed environment in which new services can be developed and tested at speed and rolled out across networks.

Keith Dyer:
As we can see from the focus on application stores and social mobile platforms, such as V360, it seems operators are aware that they don’t have long to put all this together.

Anu Shah:
Certainly they don’t have that much time. They have to make their choice today. It could be that in five years’ time there may only be one or two operators per territory that own their own network: everyone else is on a shared network and focusing on delivering consumer services. In which case they need to be quick about leveraging the non-network centric assets that they do have.

A provider like us – one that has experience growing up in aggressive markets, that has a business model based on sharing the upside and growth as a partner, and that has the technical flexibility to power a range of services – is ideally placed to help operators meet that challenge.

Vince Lesch – Interview

Managing network and service complexity in LTE

Mobile operators are responding to rising data volumes by planning the deployment of LTE networks. But this brings with it the need for service interworking between 2G/3G and LTE networks. Vince Lesch, CTO, Tekelec, tells Keith Dyer how operators can meet these challenges effectively

KEITH DYER:
Vince, as we talk at the start of 2010, we have a small number of live LTE deployments, and we can be sure this year will see a number of further LTE network rollouts. Operators are being driven to consider solutions to the increase in data volumes they are experiencing, and to the number of services now being used across their networks. But that brings with it an associated increase in service and network complexity. Can you describe that?

VINCE LESCH:
From the perspective of complexity, what we are seeing is the introduction of more new technology into the networks. From increases in bandwidth on the access network, with the move to LTE, to the introduction of technology platforms to provide new data services, our customers have a number of exciting plans.

The challenge that brings is that it grows the need for interoperability between different network modes, and between the signaling layers controlling the network. Operators now have complex hybrid networks composed of elements from many vendors across their 2G, 3G, and now LTE networks. They are also introducing a range of new services, and service capabilities, to enable them to take advantage of the new business models that are emerging – for example around application stores. But they have to make all these services work seamlessly for the end user, no matter what domain the user is in, the device he has, or what service he is accessing.

That’s where Tekelec adds a lot of value. Our systems and solutions offer a way to do this more cost effectively, allowing our customers to solve their interoperability problems. And we have experience of this. For instance, in the past we have had customers that have integrated their GSM and CDMA networks and had deep interoperability requirements at a subscriber management level, to ensure they were offering continuity of service. And we see similar types of things happening as customers evolve to LTE. They will have the same type of demands for accessing real-time service and network databases, to determine customer call set-up requirements across multiple domains.

KEITH DYER:
How swiftly are you seeing your customers move to LTE? And what do you think the scale of those deployments will be?

VINCE LESCH:
It really depends on the country and the carrier. I think we are seeing some carriers in the USA move very quickly to LTE, and in fact some European operators as well. They are moving quickly and looking at that right now. Others are at the stage of having forward-looking discussions – planning their migration and evolution. It is affected by a number of factors, such as the depth and coverage of the current 3G and HSPA networks, as well as financial priorities of course.

KEITH DYER:
With that in mind, how do you characterise the current state of operators' investment priorities? Are we in a brighter economic climate than last year?

VINCE LESCH:
I think that we’re pleased with where we are from a company perspective. We are certainly excited by the opportunities out there with reference to LTE. We are also optimistic that, at least from a global economic perspective, we have reached a levelling off period in the wider economy.

At Mobile World Congress a year ago we met customers who have since gone public about the priorities they have around capacity planning. The network planners told us that when they have discussions on capacity planning they are used to going to the marketing department. Marketing gives them a forecast and then the planners routinely divide that by four. And that’s all been OK until last year – when marketing was right! So really that is a significant growth in mobile data, and that is driving us forward in terms of servicing these needs.

KEITH DYER:
You say that you can help operators deal with this transition more cost-effectively. What are some of the things that you can do, and in which areas?

VINCE LESCH:
Yes, there are a lot of clever things that we can do with signaling and service control that can help. Our EAGLE XG next-generation SIP signaling platform allows us to provide a view of that interoperability for the carriers, across their SIP and SS7 signaling domains. It can host multiple application platforms and technologies to help control the signaling across the hybrid networks that operators have to operate.

This is important because one of the key areas for operators to address will be how they manage services that cross the border between 2G/3G and LTE networks.

You have got to remember that as well as interworking demands within one operator’s network, this is a global mobile world and operators need to be able to provide seamless services to users from different networks – so that someone sending a message from a 3G device, controlled within the SS7 environment, can send a message and have it read by a user within an LTE/IMS environment.

KEITH DYER:
You mentioned messaging there, and indeed I think how SMS and voice services will interoperate between LTE and 2G/3G networks is a current hot topic.

VINCE LESCH:
Yes, I think the early thinking that LTE would be all about IP data-only services, and so the demand for interworking with legacy services would be reduced, has for some time now given way to the realisation that operators will have to deal with service interworking issues. As I said, they will have users who have not migrated to the latest devices, but they will also have the requirement to deal with incoming call flows from users of other networks that have not been upgraded to LTE.

So operators will have customers sending a message from one mode to another, and we need to provide the protocol translation and signaling support in scenarios that are fairly complex.

Messaging services are very interesting because if you look at revenues by various services then clearly SMS is very important, even though we are seeing some erosion in the margins there.

So there is a dual demand to protect SMS revenues as operators move to LTE, but also to make their SMS network support more efficient.

An efficient messaging system requires a mechanism to deliver SMS in the LTE/IMS domain, and in the pre-IMS SIP domain, as well as in the SS7 domain. Our IP Short Message Gateway (IP-SM-GW) supports SMS and MMS in all-IP networks, and that allows operators to interwork their LTE networks with 2G/3G networks, by using a single system. And because the IP-SM-GW uses the existing SMSCs and the MMSCs for forwarding and storing, no new application servers are required in the IP network.

KEITH DYER:
And as well as driving that cost-effectiveness across the domains, you are also able to protect revenues?

VINCE LESCH:
Yes, we are also looking at utilising our expertise to enable operators to layer some additional types of service on top of their SMS service layer. The GSMA initiative around the Rich Communications Suite combines aspects of text, MMS, and video with the address book and presence. There are several trials across the world, with carriers really looking at how they can add value for their customers, and not just have their users access these services in an "over the top" fashion.
Layering services in that manner requires the advanced signaling and protocol translation systems that we have.

We are also able to provide advertising insertion in mobile messaging, allowing carriers to insert adverts into texts, using LBS or user-profile information if they want to. It’s really all a part of our focus on the evolution of mobile messaging.

KEITH DYER:
As well as focussing on a service-specific area, you mentioned that you can help operators deal with the increased data volumes they face more effectively…

VINCE LESCH:
This increase in data volumes means that we have identified a requirement to get involved in the performance management piece. Really it's about managing data analysis in an intelligent manner – so that as data volumes increase, mobile operators do not have to scale their monitoring and management tools in a linear fashion.

LTE is designed to deliver a lower cost per bit on the access network, but if demands on all the other supporting items increase in-line with traffic growth then the benefits will be cancelled. We have the capability to look at the control plane data, and go down to the protocol level to probe into what’s going on. We can look at the payload and see what type of service is being used.

This means that operators can focus their intensive analysis on the customers and services that are most profitable for them – collecting the data they need based on their specific requirements. For example, they may use the capability to assure SLAs for important enterprise customers, or for a high value service such as TV.

By monitoring only the data they need to, operators can scale their monitoring systems gracefully. Operators could control 80% of the revenue flow by concentrating at a deep level on only a small percentage of the data. They still collect all the user data, of course, but it’s about managing that intelligently.
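The selective-monitoring idea described here can be sketched in a few lines: route only sessions from high-value plans to deep, protocol-level analysis, while keeping lightweight counters for everything else. This is an illustrative sketch; the plan names, thresholds and data shapes are invented, not from any vendor's product.

```python
# Hypothetical sketch: partition sessions so only high-value traffic gets
# deep (protocol-level) inspection. Plan names are illustrative assumptions.
HIGH_VALUE_PLANS = frozenset({"enterprise", "mobile_tv"})

def partition_sessions(sessions, high_value=HIGH_VALUE_PLANS):
    """Split sessions into (deep-inspection, lightweight-counters) lists."""
    deep, light = [], []
    for s in sessions:
        (deep if s["plan"] in high_value else light).append(s)
    return deep, light

sessions = [
    {"id": 1, "plan": "enterprise", "bytes": 5_000_000},
    {"id": 2, "plan": "consumer",   "bytes": 200_000},
    {"id": 3, "plan": "mobile_tv",  "bytes": 9_000_000},
]
deep, light = partition_sessions(sessions)
print(len(deep), len(light))  # 2 1
```

The monitoring system still sees every session, but only the small `deep` subset is pushed through the expensive analysis path, which is what lets the tooling scale sub-linearly with traffic.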

This is a great example of the new requirements that LTE will generate, and also of how we can help operators address them.

EMT Estonia completes first live 3G network tests of BlueSky Positioning’s A-GPS technology


BlueSky Positioning, an A-GPS solutions provider for the SIM card industry, and EMT, a mobile operator in Estonia, announced today that EMT has completed its initial field testing of BlueSky Positioning's A-GPS (Assisted GPS) technology. The tests were conducted using both GPRS and USSD (Unstructured Supplementary Service Data) for retrieving assistance data in a live 3G network.

EMT Estonia is the first network operator to test a USSD connection for assistance data, allowing BlueSky Positioning's A-GPS solution to be used with ordinary mobile phones, even models that do not support GPRS. USSD is said to bring a number of benefits, such as the ability to be used concurrently with a voice call. Retrieving assistance data during a call is particularly important when making an emergency call, allowing the caller to be located accurately and quickly.
 
The tests were carried out with handsets ranging from low-end 2G to high-end 3.5G. Future testing will cover a wider range of handsets and scenarios, including various weather conditions, urban and rural landscapes, and both entry-level handsets and smartphones. In parallel with this testing, new services and business models will be finalised in readiness for a mass-market launch by EMT.

BlueSky Positioning's A-GPS technology is completely embedded into a phone's SIM card. A key benefit of A-GPS SIM technology is the provision of accurate positioning information when an emergency call is made from the phone, as required by E112 in the EU and E911 legislation in the US. In addition, the range of location based applications that can be offered with this combined A-GPS SIM technology is vast, including generic Location Based Services (LBS) such as navigation, social networking, child tracking or more complex applications such as workforce and fleet management.

"Our preliminary test results with A-GPS in-SIM technology have demonstrated compelling accuracy coupled with a fast TTFF (time-to-first-fix), comparable to any commercial A-GPS handsets in the market. A-GPS SIM technology opens up a whole range of possibilities for both EMT and its subscribers," said Argo Kivilo, R&D Manager at EMT. "Not only this, but in light of the newly updated EU E112 directive, operators must be able to provide location data on nearly all emergency calls on their network within the next 18 months. BlueSky Positioning's technology will be essential in ensuring this happens."

Velipekka Kuoppala, Vice President of Sales and Marketing at BlueSky Positioning, added, "Our work with EMT takes us a big step closer to the mass market deployment of A-GPS technology within SIM cards. EMT has been very supportive in the development of this technology and we are excited to already see initial results from this advanced testing. We strongly believe that A-GPS technology used with SIM cards is the answer operators, service providers and consumers are looking for in terms of the easy integration of GPS capabilities that enable useful, revenue generating and legally mandated LBS applications whilst also delivering compliance with E112 and E911 legislation."

LTE backhaul strategies – Spectral efficiency in the backhaul network


Operators need to take into account the total cost of ownership of their next generation backhaul strategy, and that includes spectrum costs. But with microwave spectrum costs set to rise, it will take careful planning to ensure operators have the best solution, says Alan Solheim

The promise of new services and new revenue streams is driving the adoption of next generation radio access technologies such as HSPA+ and LTE. These technologies promise a mobile internet experience that is virtually the same as broadband access at work or at home, creating an entirely new profile for the traffic coming from the base station and, in turn, the backhaul network.

HSPA+ networks require 50 to 100 Mbps per base station and LTE networks require 100 to 200 Mbps per base station – an order of magnitude more than traditional 2G or 3G networks. In addition, this traffic is predominantly IP-based. Traditional TDM radio systems cannot handle this backhaul traffic, and leased E1 circuits are not cost effective. The viable backhaul solutions going forward are owned or leased services based on fibre or packet microwave radio.
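A quick back-of-envelope calculation makes the E1 point concrete: dividing the per-site figures quoted above by the standard E1 rate of 2.048 Mbps shows how many leased circuits a single LTE site would need. The site bandwidths are the ones given in the article; everything else is simple arithmetic.

```python
# How many 2.048 Mbps E1 circuits one LTE base station would need,
# using the 100-200 Mbps per-site range quoted in the article.
import math

E1_MBPS = 2.048  # standard E1 line rate

def e1_circuits_needed(site_mbps):
    """Whole E1 circuits required to carry the given site bandwidth."""
    return math.ceil(site_mbps / E1_MBPS)

for site_mbps in (100, 200):
    print(f"{site_mbps} Mbps -> {e1_circuits_needed(site_mbps)} E1 circuits")
# 100 Mbps -> 49 E1 circuits
# 200 Mbps -> 98 E1 circuits
```

At roughly 50 to 100 leased circuits per site, the economics clearly favour fibre or packet microwave.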

While fibre provides almost unlimited bandwidth, the majority of the base stations do not have fibre connections. In addition, depending on the amount of radio access spectrum that is available to the mobile operator, additional base stations will have to be deployed in order to deliver the expected bandwidth per user. It has been estimated that operators with 50 MHz of radio access spectrum will have to double their base station density in order to deploy advanced 3G services. These new base stations must be placed where they can provide the most effective re-use of the radio access spectrum, and only by chance will they have fibre connections at the most desirable locations.

The net result is that in order to use fibre as the backhaul medium, lateral runs from the nearest fibre point of presence must be installed in the vast majority of cases. The cost of these lateral fibre runs is proportional to distance, and the cost per metre is proportional to population density – highest in the city centres where demand for 3G+ and 4G services is greatest.

The business case for packet microwave, on the other hand, is almost distance insensitive. The cost to purchase and install a packet microwave link is relatively constant up to a distance of several kilometres, longer than the typical base station spacing. The combination of these two facts means that a significant portion, indeed the majority, of the backhaul network will be implemented using packet microwave. This, coupled with the already heavy use of microwave for the GSM network and other private networks, will result in congestion, especially in the city centres, for the RF channels required to support the new packet microwave deployments.

In response to these concerns, many of the world's telecommunications regulators have implemented new measures to more carefully manage the available microwave spectrum. Several European countries, including France and Russia, have essentially eliminated larger channel bandwidths (56 MHz and above) in order to encourage greater efficiency in smaller channels. The Office of Communications (Ofcom), the independent regulator in the UK, is addressing spectrum congestion with a pricing strategy that favours higher frequencies and smaller channel sizes.

With most regulators adopting similar pricing strategies, it is clear that larger channels and lower frequencies are cost-prohibitive for most operators. While other cost elements remain relatively fixed, spectrum cost is proportional to the size of the channel: if an operator has to double the channel size of a backhaul link, the cost of spectrum – already one of the dominant cost contributors – will double with it. Looking at future capacity requirements, where hundreds of megabits or more will be needed, careful spectrum utilisation planning will be essential to ensure the ongoing economic viability of these backhaul networks.

Despite its significant impact on the total cost of ownership (TCO), spectral efficiency is often a secondary consideration when evaluating microwave backhaul solutions. The following chart (Figure 2) shows the sensitivity of the backhaul business case to various cost elements. These costs are varied according to ranges found in existing backhaul deployments, illustrating the potential impact, positive or negative, that each can have on the operator's total cost of ownership.

As shown, spectrum cost has a much greater impact on the operator business case than items such as equipment and installation costs. The wide range of potential impact is due to a combination of pricing variation and the degree of efficiency in existing deployments. Equipment cost, while important, is often overemphasized in the buying decision, as it represents a small fraction of the TCO. This highlights the importance of selecting a microwave solution that minimizes key operating expenses such as spectrum licensing. While other cost elements will remain relatively fixed, spectrum cost is set to rise dramatically, becoming the dominant ongoing expense for operators deploying broadband mobile networks, because many existing microwave backhaul solutions will not scale sufficiently within existing spectrum allocations, forcing operators into additional spectrum investment.

Fortunately, next generation packet microwave systems address many of these elements with capabilities such as all-outdoor deployment, reduced antenna sizes and, most importantly, a suite of technologies that deliver a dramatic improvement in spectral efficiency relative to previous microwave systems. As shown in the figure below, the introduction of higher-order modulation, adaptive modulation, Cross Polarization Interference Cancellation (XPIC) and now baseband bandwidth optimization techniques has increased spectral efficiency by almost a factor of 10 over the past decade. These techniques allow mobile operators to deliver bandwidth suitable for LTE base stations within their existing 7 or 14 MHz backhaul channel allocations. This avoids the time and money of re-engineering the RF portion of the backhaul network, eliminates the concern over backhaul spectrum availability for base station upgrades, and reduces the total cost of ownership by up to 40% – cost savings much higher than the cost of the new equipment.
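A rough capacity model shows how these techniques multiply up inside a fixed channel. The factors below (bits per symbol for the modulation, a nominal FEC rate, a 2x gain for XPIC) are assumed, illustrative values for a sketch, not measured figures for any product; roll-off and framing overhead are ignored for simplicity.

```python
# Illustrative link-capacity model for a fixed microwave channel.
# Assumptions: symbol rate ~ channel bandwidth; fec_rate and the 2x XPIC
# factor are rough, invented values for demonstration only.

def link_capacity_mbps(channel_mhz, bits_per_symbol, fec_rate=0.9, xpic=False):
    """Approximate throughput: bandwidth x modulation order x coding rate,
    doubled if both polarisations are reused via XPIC."""
    capacity = channel_mhz * bits_per_symbol * fec_rate
    return capacity * 2 if xpic else capacity

legacy = link_capacity_mbps(14, 4)             # 16QAM, single polarisation
modern = link_capacity_mbps(14, 8, xpic=True)  # 256QAM with XPIC
print(round(legacy), round(modern))  # 50 202
```

Even this simplified model shows a 14 MHz channel moving from roughly 50 Mbps to around 200 Mbps, which is why an LTE base station can be served within an existing 7 or 14 MHz allocation once these techniques are applied.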

The demand for mobile broadband services is seemingly insatiable, driving the rapid adoption of 3G and 4G networks. Packet microwave is a preferred backhaul technology for these networks due to its speed of deployment, simplicity and cost. The backhaul technology decision will be about much more than box cost; it will have to include the total cost of ownership, including spectrum lease costs. While the availability of spectrum for the backhaul of next generation mobile networks is a concern, advances in the spectral efficiency of packet radios are providing the answer.

About the author of this article: Alan Solheim is VP, Product Management, DragonWave

Network intelligence – These pipes ain’t so dumb


Selecting the best Network Intelligence strategy will help operators meet the needs of rapidly changing business models, writes Keith Cobler

Network operators today face many challenges that are impacting their businesses, including increased market competition, technology shifts and a more demanding customer base. Many operators have been blind-sided by these and other industry challenges and are searching for viable solutions. Business models of the past no longer work in this new market environment. Operators today recognize the importance of running their businesses based on actual data that can be analysed and evaluated in real time. To do this, network operators are relying on network intelligence solutions that correlate high-level business objectives with what is actually occurring at the network level.

Different from Business Intelligence solutions, Network Intelligence solutions start by capturing the raw data and events that take place at the network level and transform this data into actionable information. They rely on the collection of real-time, high-quality events that traverse multiple network technologies as the basis from which meaningful information can be obtained. To some degree, the old adage of 'garbage in – garbage out' applies, since the quality of the output information depends very much on the input data and how that data is transformed into information.
In general, Network Intelligence solutions consist of three integrated components: (1) collection agents that collect data from network interfaces and elements, (2) a network correlation layer that ties together related network events and (3) analysis capabilities that empower multiple departments within a network operator with key information, allowing them to better manage their individual departments and the business as a whole.

There are many solutions on the market, but few cover all three components of collection, correlation and analysis. Operators who implement a Network Intelligence strategy may therefore decide to piece together a top-to-bottom solution using different vendors, or look to a single provider to supply all the necessary components.

From Data to Information
A Network Intelligence solution needs to collect events at the network level and then correlate or tie them back to specific business objectives or desired outcomes. Within the network, events and transactions are captured and stored as a call data record, or CDR, which is generally used as the primary data source for calculating key performance indicators (KPIs) by network, service or customer assurance or other OSS/BSS application.

Today's Network Intelligence systems rely on signalling and media data as the primary data sources from which intelligence can be derived. Some Network Intelligence systems may also use data supplied from network elements or from other OSS/BSS systems. Different network technologies and topologies also play a factor in the types of data that can be collected and used. As networks continue to evolve, the trend has been towards combining elements and functionality into as few elements as possible in an effort to reduce costs. From a Network Intelligence perspective, this has created some new challenges in terms of getting access to the data; e.g., in LTE networks a new element, the eNodeB (Evolved Node B), combines the Node-B base station and the RNC functionality in a single element, physically eliminating the Iub interface.

Data collection devices for Network Intelligence solutions consist of probes (passive or active), element feeds and software agents. The primary difference between passive and active probes is that passive probes are non-intrusive, meaning they do not interfere or insert themselves into the data path, but rather capture the data using a mirrored port. Active probes, on the other hand, inject a test signal into the network and then measure the response of the network to that input. Software-based collection agents are generally used when physical probe deployment is impractical due to size or cost constraints.
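The passive/active distinction can be illustrated with a toy active probe: it injects a test payload and measures the network's response time, which is exactly what a passive probe never does. Everything here is a stand-in; `fake_network` simulates whatever interface a real probe would exercise.

```python
# Toy active probe: inject a timestamped request, measure round-trip time,
# and verify the echo. fake_network is a simulated stand-in, not a real API.
import time

def active_probe(network_call, payload=b"probe"):
    """Send a test payload and report success plus round-trip time in ms."""
    start = time.monotonic()
    response = network_call(payload)
    rtt_ms = (time.monotonic() - start) * 1000.0
    return {"ok": response == payload, "rtt_ms": rtt_ms}

def fake_network(data):
    time.sleep(0.01)  # simulate 10 ms of network delay
    return data       # echo the payload back

result = active_probe(fake_network)
print(result["ok"], result["rtt_ms"] > 0)
```

A passive probe would instead sit on a mirrored port and record traffic it did not generate, trading the ability to measure on demand for zero impact on the data path.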

The second key layer of a network intelligence system is the correlation/mediation layer. The purpose of this layer is to correlate all data sources end-to-end across the network and then to write this data to a CDR for post-processing. Doing this is not as simple as it sounds, since most networks today are not based on a single, homogeneous technology but have evolved into a patchwork of legacy and next-generation technologies. Correlating a call or session end-to-end across multiple network technologies requires a sophisticated protocol correlation engine that can piece together the protocols across every leg of the connection, or, in the case of IP, derive this correlation from the IP packets themselves.
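The core of the correlation step can be sketched very simply: group per-leg events by a shared key and emit one record per session. In practice the hard part is deriving that key across heterogeneous protocols; here a hypothetical `session_id` field stands in for it, and the field names are invented for illustration.

```python
# Minimal correlation sketch: tie per-leg events together by a shared
# session key and emit one CDR-like record per session. Field names are
# hypothetical, not from any real protocol.
from collections import defaultdict

def correlate(events):
    """events: dicts with 'session_id', 'leg', 'bytes'. Returns one record
    per session summarising all legs seen for it."""
    sessions = defaultdict(list)
    for ev in events:
        sessions[ev["session_id"]].append(ev)
    return [
        {
            "session_id": sid,
            "legs": sorted(ev["leg"] for ev in legs),
            "total_bytes": sum(ev["bytes"] for ev in legs),
        }
        for sid, legs in sessions.items()
    ]

events = [
    {"session_id": "a1", "leg": "RAN",  "bytes": 1000},
    {"session_id": "a1", "leg": "core", "bytes": 900},
    {"session_id": "b2", "leg": "RAN",  "bytes": 500},
]
print(len(correlate(events)))  # 2
```

The real engine's difficulty lies upstream of this grouping: producing a consistent key from SS7, SIP, GTP and raw IP legs that never carry one natively.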

After the data has been collected and correlated, the last layer of a network intelligence system is the processing of the data into meaningful, accurate information that can be used by different individuals and organizations within the carrier.

At the heart of information analysis is the KPI, or key performance indicator. In order to correlate a desired business outcome accurately with events that occur within the network, considerable attention must be paid to identifying and defining KPIs that are meaningful, accurate and that indeed drive desired business results. This is never an easy task given the underlying complexities at the network level and the interdependence between events and the variables that describe them. For Network Intelligence solutions, it is the attention paid to modelling a desired outcome accurately and efficiently, which requires an in-depth understanding of what and when to measure, that defines the value of the system.

Different from counter values that describe a given state, KPIs are formulae that give greater insight and are based on multiple inputs such as cumulative counter values, constant values, timer values and even other KPIs that have already been computed. It would be a mistake to compare one KPI to another by name alone without knowing the exact definition of the KPI, the criteria used to select its inputs, and any other information that may impact the KPI's accuracy or the manner in which it can be used.
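A small example shows why the definition matters as much as the name. Below is a hypothetical call-setup success rate built from counter inputs; whether barred attempts are excluded from the denominator is exactly the kind of detail that makes two KPIs with the same name incomparable. Counter names are illustrative.

```python
# Sketch of a KPI as a formula over counters, with its definition explicit.
# Counter names and the exclusion rule are illustrative assumptions.

def call_setup_success_rate(attempts, successes, exclude_barred=0):
    """CSSR (%) = successes / (attempts - excluded attempts) * 100.

    Returns None when the denominator is not positive.
    """
    denominator = attempts - exclude_barred
    if denominator <= 0:
        return None
    return 100.0 * successes / denominator

print(call_setup_success_rate(1000, 950))                     # 95.0
print(call_setup_success_rate(1000, 950, exclude_barred=50))  # 100.0
```

The same raw counters yield 95% under one definition and 100% under another, so a KPI is only meaningful alongside its exact formula and input criteria.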

Linking your desired business outcome to events at the network level is the basis of network intelligence. To do this, a model of the system needs to be developed that links the desired business outputs to the dependent variables at the network level. In most cases, model development is a science in itself with many considerations that are beyond the scope of this article, but in general the process consists of three basic steps: (1) model the system, (2) compare modelled vs. actual and (3) optimize the model. Some network intelligence systems available today provide off-the-shelf KPI packages, while others provide the capabilities for users to create and modify their own. In most network intelligence implementations, it is usually a combination of the two approaches that provides the most cost-efficiency and flexibility.

Another key consideration at the analysis level is the ability to provide information in real time to those individuals or departments who need it. In addition, the format and display of this information should be tailored to the groups or individuals that use it. For example, whereas network operation teams may receive their information in the form of real-time dashboards and alarms in the NOC, product planning teams may require a dashboard or report that looks at historical or geographical trends. The point is that although a Network Intelligence system leverages a common data stream, different departments within the network operator need to see the information in the format that is most beneficial and meaningful to them.

What to consider
In general, there are two classes of network intelligence solutions available today. The first is the vertically integrated solution, which contains all three functions of collection, correlation and analysis and is based on an open architecture. These vertically integrated solutions are usually provided by a single vendor, but at the same time are designed using open architectures with the hooks required to support third-party hardware and software. The second general class of solutions is referred to as partial or point solutions, because they tend to focus on only a single layer of a Network Intelligence solution rather than the entire top-to-bottom integration found in vertically integrated solutions. Although it is a bit of an apples-to-oranges comparison, vendors of point solutions will often assure their customers that their hardware or software components can be integrated with other third-party hardware and software.

The chart above summarises the two general classes of Network Intelligence solutions available today, listing some of the key attributes that define each of the three components. Depending on your specific circumstances, this list may be expanded or modified considerably; it is intended only as a rough guide for comparison.

Ultimately, the important points to keep in mind when selecting any type of Network Intelligence solution are:
1 Does the solution accurately and effectively correlate my desired business outcomes to what is occurring at the network level?
2 Does the solution provide an actionable path for different groups or individuals within the company to identify an issue and take action that has a positive business impact?
3 Does the solution have a justifiable Return-on-Investment (ROI)?

Summary
Network operators today are faced with tremendous challenges. In order to reduce risk and better ensure that their business objectives and strategies can be achieved, network operators are moving towards Network Intelligence solutions as a means to align those objectives with what is actually happening at the network level. In short, network intelligence solutions consist of three components: (1) collection agents, which collect data from the network, (2) a correlation layer or engine, which ties together relevant data from across the network and saves it in an accessible data file and (3) analysis packages, which turn the data into meaningful information that can be used by multiple departments within the network operator. The two general classes of solutions available on the market today are vertically integrated solutions and point, or partial, solutions. Determining which type of solution best meets your needs is not simple, and comes down to decisions related to budget, resources and what types of systems and technologies you currently have in place. But the bottom line is that whatever Network Intelligence system you choose, it must have the ability to drive positive business results based on accurate, real-time information.

Volantis launches Volantis Framework 5.3


Volantis Systems announced today the launch of Volantis Framework 5.3, said to be a 'significant expansion' to its Framework platform. The latest release includes a fully featured library of advanced user interface controls, which allow for Web 2.0 user interfaces to be built on the latest mobile handsets and browsers.

Volantis Framework 5.3 allows operators and developers to build and deploy next-generation portals and browser-based applications with customizable, out-of-the-box user interface components. The new release of the company's core mobile content delivery platform includes a set of programmable controls that can be extended to support the most demanding and compelling user interfaces, while still meeting device portability requirements. Volantis Framework 5.3 supports the latest features included in leading smartphone devices, while developers looking to offer an improved and compelling experience will also benefit from the ability to quickly roll out new, targeted services across a number of high-end devices. The platform will be available immediately within commercial releases. 

"The mobile industry understands that different customers demand different experiences, and for operators the challenge lies in how they can quickly facilitate and provide the latest experiences across multiple handsets that improve customer loyalty and drive new revenues," said Mark Watson, Chief Executive Officer of Volantis Systems. "Our customers are at the forefront of delivering innovative, feature-rich mobile content and services, but the challenge of leveraging the latest handsets to their full potential to ensure subscribers have access to the latest and most compelling user interfaces across many devices can be considerable. With a strong heritage in providing integrated software platforms that deliver varied, innovative and personalized content, Volantis Framework 5.3 will enable operators and developers to quickly implement highly customizable, Web 2.0 content for subscribers that boasts a smoother, richer user experience."

 

Mathias Prüssing – Interview


 

Interview – Meeting the future roaming challenge

With data roaming set to increase through 2010, but with traditional roaming revenues under threat, operators need roaming solutions that match their business needs. Comfone CEO Mathias Prüssing tells Keith Dyer how an integrated roaming service can do just that

Keith Dyer:
Mathias, readers may know Comfone as a roaming service provider but they may not appreciate the scope of your activities.

Mathias Prüssing:
Comfone has achieved its aim of becoming a full service provider in the roaming market, offering a complete portfolio of roaming management services. It ranges from roaming enablement through our hub solutions, including our WeRoam wireless IP and Key2roam hubs, to our Advanced Signalling network and our in-house Clearing capabilities. We also offer 2G and 3G data roaming management through our GRX service with connections to over 500 networks worldwide.

Our goal is to harness the synergy of all these elements by providing a one-stop-shop approach to customers.  We are in a very strong position, as a full service provider with extensive hubbing expertise, to meet the evolving roaming requirements of our customers.

Keith Dyer:
One aspect facing operators in 2010 will be the increase in data roaming. Do you think that traditional roaming relationships will be able to handle the more complex nature of data roaming?

Mathias Prüssing:
We know from mobile network operators that data roaming traffic is presenting them with a complex set of hurdles which they require help with. Therefore the move to data roaming motivates us, as a service provider, to be even more innovative in finding further simple and robust solutions for the upcoming data roaming challenges.

We already provide roaming solutions for GPRS and 3G data traffic with our GRX service which is handling increasing traffic. However, in response to operator feedback, we have also designed a roadmap to focus our energy on the demands this growth in IP traffic will bring. We have enormously simplified data roaming by adding data to our hub model. Now our customers will not only be able to centralise their voice and messaging businesses on our hub, but also their IP data traffic. Since we began integrating our data business onto our Key2roam hub, different operators have confirmed their data roaming needs and have entered into trials with us.

We pioneered the hub concept and because we are also in control of a centralised signalling platform, we are able to provide full service guarantees – indeed we already meet the full requirements of 133 live operators on our Key2roam hub, supporting over 2,800 live relations and approximately 100 live GPRS relations.

Keith Dyer:
What advantages will adding this data capability to the hub have?

Mathias Prüssing:
There are several. The main advantage is that operators will be able to centralise their data businesses on our hub. In their traditional voice businesses, an operator may have bilateral relations with 400 other players, and those relations can be managed to provide ongoing revenue streams. The set-up and management of data roaming, on the other hand, is generally more complex. Data roaming traffic, which includes email, internet messaging, web browsing and entertainment, requires extensive bandwidth.

Our hub data roaming solution allows operators to reduce the overall complexity, provides them with powerful networks and enables them to focus on the end-customer instead of inter-operator relationships.

Added to that, data roaming relations often do not exist yet. Most carriers see their current relations and think there is no reason to change to a hub. However, operators are attempting to find new value propositions for their customers, whether it is through guaranteed SLAs or providing high quality services to different types of customers. We can provide the technology expertise to assist them with this, all in one integrated offer.

At the moment, business customers are most attractive for operators: people who travel a lot and want a high quality data roaming solution, but who also need to be able to manage the expense of it. As well as this market, I believe that the consumer roaming business will also grow, as more and more people want to take their home data environment, such as messaging, social networks and favourite services, with them when they travel. Anytime, anywhere data roaming access is becoming the standard of service expected by consumers today.

Keith Dyer:
How does this change in usage translate into the services you can offer?

Mathias Prüssing:
Well, this change will drive the need for the basic roaming service enablers; the connections that hub access brings, as well as control of signalling and clearing. What is of more importance however and where we can differentiate ourselves is that operators will need to optimise their traffic and meet marketing requirements for stable, reliable connections and SLAs with fast implementation times. Added to this, operators want access to value-added services such as business optimisation tools so that the roaming manager is able to steer roaming traffic, to reduce costs and increase roaming revenues.

This is why it is so important to be an integrated service provider rather than just providing access and connectivity. We applied our extensive signalling knowledge in order to provide reporting tools for operators to access their data and better control what’s going on in their networks. For instance, with our reporting tools they can react to seasonal peaks, perhaps by providing special offers to end-users. We also offer technical tools that run rating simulations to illustrate how a rating change might impact ongoing revenues for an operator. Additionally, we offer roaming marketing tools which make it easier for our customers to flexibly steer their roaming business and ensure revenues by providing increased transparency of the impact of their marketing activities.
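The rating simulations mentioned here can be sketched as a simple what-if replay: run historical usage through a proposed tariff and compare the result with revenue under the current one. The tariffs and usage records below are invented purely for illustration; a real tool would of course model tiers, bundles and inter-operator agreements.

```python
# Hedged sketch of a rating simulation: replay historical roaming usage
# against current vs. proposed per-MB rates. All figures are invented.

def simulate_revenue(usage_mb, rate_per_mb):
    """Total revenue from a list of per-subscriber usage volumes (MB)."""
    return sum(mb * rate_per_mb for mb in usage_mb)

usage_mb = [120, 45, 300, 80]  # historical per-subscriber usage, in MB

current = simulate_revenue(usage_mb, 0.50)   # current rate: 0.50/MB
proposed = simulate_revenue(usage_mb, 0.35)  # proposed rate: 0.35/MB
print(current, proposed, proposed - current)
# 272.5 190.75 -81.75
```

Even this trivial model shows the point of such tools: the operator sees the direct revenue impact of a rate change before it goes live, and can then judge whether expected volume growth at the lower price would compensate.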

Keith Dyer:
Added to these changes in data roaming, operators are looking at increasing “data offload” strategies, such as WiFi. How will this affect your business?

Mathias Prüssing:
We have our Wireless IP hub solution, WeRoam, so WiFi is also part of our data story. We have over 63,000 aggregated hotspots, making it possible for customers to have consistent access to that environment. We also facilitate WiMax-to-WiFi roaming. Although operators traditionally tend to think in terms of GSM/3G/HSPA technology, they will consider WiFi and WiMax as they face future demands for bandwidth.

We have seen our WeRoam traffic increase 30-40% over the last year, so we think this is a real opportunity for the future. In my opinion there is 2G/3G/HSPA on the one hand and WiFi on the other, and I think we will see them become complementary rather than competitive. That makes our integrated offering all the more attractive.

Keith Dyer:
And it seems no discussion of roaming can be complete without reference to the legislative environment within Europe…

Mathias Prüssing:
In a way, yes. It’s a fact that all mobile network operators are experiencing price erosion and restrictions resulting from EU legislation, so that’s something that we just have to face. That’s why analysts see 10-15% growth in roaming volume but no equivalent growth in revenues. This limited revenue growth is driving the view that roaming is all about deriving the benefit of economies of scale, to compensate or over-compensate for price erosion. As a leading hub provider, it is important for Comfone to take the current market situation into account in its strategy and ensure our customers achieve economies of scale in their roaming businesses.

For this reason, Comfone’s value added services play a key role – they allow operators to offer bundles, flat fees, and other solutions which will enable them to derive increasing roaming value from their customers. It is also important for operators to take control of all elements of their roaming business, such as revenue assurance and financial management. That’s why we have taken our clearing capability in house again. We can do clearing ourselves rather than buying it in from a competitor.

Keith Dyer:
So you are confident that current operator needs around roaming, from pressure on voice revenues, to increasing data traffic, to complementary WiFi or WiMax strategies, all point to your vision of an integrated, full service provider?

Mathias Prüssing:
It is important that Comfone is recognised as an integrated solution provider and not seen only as a signalling or hubbing provider. Customers tell us that they are surprised by our capabilities and are very impressed with what we can do in hubbing, signalling and clearing. They appreciate that we can cover every element of roaming and also bring a lot of additional capabilities and value to their products on top of that.

Operators have asked us to help them take control of their revenues, to help them manage the growth in data roaming. As a full service provider offering all roaming elements in our portfolio, we have the added advantage of being able to act flexibly and independently. All of our operations are in Bern and nothing is outsourced. We have full control over the quality of our services and solutions and we can go the extra mile for the customer that others cannot, providing our customers with the integrated solutions they need.

Keith Dyer:
Mathias, thank you.

 

Huawei to supply GSM-R system to DB Systel


DB Systel, a subsidiary of Deutsche Bahn and provider of ICT services in Germany, and Huawei have jointly announced today that DB Systel has acquired a modern IP-core-based GSM-Railway (GSM-R) system from Huawei. GSM-R is an international wireless communications standard for railway communications and applications, and the system will be used in DB Systel's test laboratory to examine future-oriented network architectures for GSM-R.

DB Systel plans to conduct product and interoperability testing for its own purposes to examine the interaction between Huawei's system architectures and the existing architectures in the GSM-R network. With this acquisition, DB Systel's test laboratory gains access to GSM-R equipment from Huawei for the first time.

In April 2009, Huawei successfully launched its GSM-R communication system for the Shijiazhuang-Taiyuan passenger railway in China, which supports speeds of up to 250km/h along its 189.93km route. The Shijiazhuang-Taiyuan railway includes the longest tunnel in Asia – the 28km Taihang double-tube tunnel – and Huawei's GSM-R solution is said to have solved the coverage problems previously experienced inside 'super long' tunnels. By the end of 2009, more than 4,000km of railway lines had been contracted to Huawei. In January 2010, Huawei was contracted by UGL to deliver a GSM-R system to RailCorp NSW, Australia. Huawei's digital train radio system will cover 1,455 kilometres of track, stables and rail sidings and 70 kilometres of railway tunnels across the Sydney metropolitan rail network.

"Optimising new telecommunications systems from different vendors for railway purposes is a core business within our test lab and necessary to provide a market overview," said Wolfgang Klein, Head of Systems Engineering and Network Operations at DB Systel.

"We are confident that Huawei's innovative GSM-R solution will meet DB Systel's requirements and will help them in their mission to examine the interactions between Huawei's state-of-the-art system architectures and the architectures already implemented by DB Systel," said Tan Zhu, director of Wireless Marketing, Huawei Europe.
