Monday, April 30, 2012

AT&T Stockholders Won't Allow Net Neutrality to Stop Mobile Data Revenue Growth


If you own an MNO, you probably do not like Net Neutrality .. even if it means that, as a customer, you will pay more. This is what 94% of AT&T's stockholders are saying.

AT&T announced that "At its annual meeting of stockholders .. The following proposals from stockholders were also voted on: .. A proposal asking that AT&T commit to operating its wireless network without the ability to privilege, degrade or prioritize any traffic was defeated by a vote of 94.1 percent against to 5.9 percent in favor". See "AT&T Announces Preliminary Results of 2012 Annual Meeting" - here.

Before the meeting, AT&T's board recommended that stockholders "vote AGAINST this proposal for the following reasons:"

"This proposal would harm AT&T’s ability to manage its network and maintain the network quality and security that our customers expect .. The proponents appear to have no understanding of the negative impact on the Company’s operations of requiring purely “neutral” routing of Internet traffic. .. This proposal would impose rules on the operation of AT&T’s wireless broadband network that are stricter than those adopted by the Federal Communications Commission (FCC). This would put us at a significant disadvantage because we would be required to operate our network under constraints that would not apply to our competitors. Furthermore, the proposal would not allow AT&T to use reasonable network management practices in operating its wireless broadband network .. AT&T is committed to maintaining an open Internet and providing competitive choices for consumers. We want our broadband networks to enable new applications and new services, while ensuring our customers have the tools they need to protect their privacy and security" (more - here)

The chart below shows AT&T's wireless data revenues, from its 2012 Q1 report (here).



[Guest post]: Packet Signing

By Marc Tremblay*, contributed independently

Traffic classification inevitably resonates with Deep Packet Inspection (DPI). Typically, DPI is used for two purposes: identification of the traffic flows crossing a network, as well as targeted extraction of protocol, meta and very specific payload data from network sessions. DPI typically works on a duplicate copy of the network traffic obtained from in-band or out-of-band network elements.
  
The process is extremely resource-intensive for the DPI and switching equipment. It has been very challenging for DPI vendors to keep up with the growth in network traffic volumes while simultaneously improving their packet inspection mechanisms and keeping those mechanisms up to date with the latest application signatures and patterns.
  
The explosion of consumer operating systems and applications, combined with the ever-increasing share of encrypted traffic, constantly challenges the viability of even the best DPI offerings. This is confirmed by the increased use of machine-learning-based approaches by DPI vendors in an attempt to keep up with the pace. A DPI solution that does not correctly categorize the vast majority of traffic is doomed.
   
The multiplication of DPI deployments in support of multiple applications is another growing source of pain for network operators. In addition to the costs and network complexity involved, the most mature organizations are beginning to see issues around the misalignment of DPI capabilities from various vendors, as well as differences in volume computation and in application classification taxonomies.
  
As organizations progress toward higher levels of maturity with regard to DPI-based applications and information, they need to be able to rely on uniform vocabularies and consistent measurement across applications. For instance, a wireless operator that relies on DPI-based analytics to design tiered service plans will expect to have access to harmonized application taxonomies, data volume measurements and parameter extraction capabilities on both its PCC and analytics infrastructure.
  
Application-level authentication has been around at least since the very early '60s. Access to an application is almost always controlled, and once a user is cleared to go, every aspect of the use of the application is contextualized to the identity of that specific user: preferences, roles, transaction logging, configuration, etc.
  
Traffic transported by the Network does not yet benefit from the same controls and contexts, at least not in a coherent and harmonized fashion. Layers 2 to 6 together form an unequally policed and chaotic area where anyone and their code are anonymously admitted in a grand free-for-all party. At best, post-admission techniques such as port-based identification and DPI are used in an attempt to identify and police the traffic.
  
Why is admission to the network not better controlled? Why is generated traffic not identified at the source, as is the case for application usage? Why is the identity of traffic, application and end-user not preserved during the session lifetime? What would such an architecture look like?
  
The need to attribute an identity to the traffic is an area that appears to be underserved by academics, commercial ventures and normalization groups. The concept is to attach a unique “license plate” at the Network layers, resulting in a unique Packet Signing that would enable the identification of the application that generated a given traffic flow. That traffic “license plate” would be eventually associated with an in-band or out-of-band “driver license”, enabling the effective authentication of the end-user or organization associated with a given traffic.
  
The Packet Signature could be transported alongside individual packets or in a specific, independent protocol. No elaboration on the implementation of such a protocol is provided here, but the latter appears more realistic.
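Purely as an illustration of the out-of-band variant, here is a minimal Python sketch. The record format, the application identifier and the shared HMAC key are all assumptions made for brevity; a real deployment would presumably build on the asymmetric Code Signing credentials suggested later in this article:

```python
# An illustrative out-of-band sketch only (hypothetical record format, not a
# proposed standard): the signing client binds a packet digest to the
# application's registered identity. A shared HMAC key is used here purely
# for brevity.

import hashlib
import hmac
import json

def sign_packet(packet_bytes: bytes, app_id: str, app_key: bytes) -> str:
    """Produce the 'license plate' record for one packet."""
    digest = hashlib.sha256(packet_bytes).hexdigest()
    tag = hmac.new(app_key, (digest + app_id).encode(), hashlib.sha256).hexdigest()
    return json.dumps({"app_id": app_id, "packet_sha256": digest, "sig": tag})

record = sign_packet(b"raw IP packet bytes", "com.example.voip", b"demo-key")
print(record)
```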
  
This thinking inevitably brings us to the concept of a unified registry for signed applications: a DNS-like repository hierarchy where the characteristics of individual software packages and releases could be registered and made available. At minimum, the registry could contain classification information for applications; a standardized taxonomy would be required for that. As DPI applications multiply within the same or different organizations, this would enable a common vocabulary across DPI sources and DPI-based applications. Other characteristics could follow: author, vendor, coordinates for parameter extraction, security compliance, etc.
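To make the registry idea concrete, a single record might look like the following. This is an entirely hypothetical example; the field names are illustrative, not part of any standard:

```python
# An entirely hypothetical example of a single Applications Registry record,
# following the characteristics listed above; field names are illustrative.

registry_entry = {
    "app_id": "com.example.voip",            # unique, DNS-like application name
    "vendor": "Example Corp",
    "releases": ["2.1", "2.2"],
    "taxonomy": "communications/voip",        # from the standardized taxonomy
    "extraction": {"codec": "rtp-header"},    # coordinates for parameter extraction
    "security_compliance": ["code-signed"],
}
```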

If we try to envision an architecture for Packet Signing, the following elements come to mind:

1. Packet Signing Client – Application-aware gatekeeper software, ideally deployed prior to network admission and closer to the ever-changing applications, responsible for attaching unique Packet Signatures to the generated traffic. Network Access Control (NAC) and Trusted Network Connect (TNC) appear to be natural fits for this functionality.
  
2. Packet Signing Protocol – A protocol to transport or associate the Signatures with the packets. It is anticipated that the application-specific signatures generated by Code Signing infrastructures would be an ideal basis for signing the packets (see the sketch after this list).
  
3. Signed Packets Inspection – In this architectural vision, DPI is replaced or complemented by Signed Packets Inspection (SPI). Seen from the outside, for client applications, SPI has roughly the same functionality as DPI. The difference essentially resides in the reliance on Packet Signing for application classification and the use of a standardized taxonomy. As the standard evolves, SPI will implement standardized and secure methods to extract specific information from the application's data flow when needed.
  
4. Applications Registry – A standardized taxonomy is the most basic element required in order to perform effective and uniform classification of applications. As the architecture evolves, other formats will need to be standardized, such as specifications for extracting specific parameters from specific applications' traffic streams. Hence the need for a distributed registry hierarchy to hold that information. The registration process implemented for Code Signing appears to be an ideal entry point into the Applications Registry.
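Continuing the hypothetical sketch from above, the SPI side (items 3 and 4) might verify a packet's signature and classify it with a registry lookup instead of payload inspection. Again, this is an illustration under the same assumptions, not a specification:

```python
# Continuing the hypothetical sketch: the SPI side verifies the signature
# produced by the signing client and classifies the flow with a registry
# lookup instead of deep inspection.

import hashlib
import hmac
import json

def inspect_signed_packet(packet_bytes, record_json, app_keys, registry):
    """Return a taxonomy label for a signed packet, or flag it as unsigned."""
    record = json.loads(record_json)
    app_id = record["app_id"]

    # 1. The packet must match the digest that was signed
    if hashlib.sha256(packet_bytes).hexdigest() != record["packet_sha256"]:
        return "unsigned/tampered"

    # 2. The signature must verify against the application's key
    expected = hmac.new(app_keys[app_id],
                        (record["packet_sha256"] + app_id).encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, record["sig"]):
        return "unsigned/tampered"

    # 3. Classification becomes a registry lookup, not payload inspection
    return registry[app_id]["taxonomy"]

# Demo wiring, reusing the record format from the signing sketch above
keys = {"com.example.voip": b"demo-key"}
registry = {"com.example.voip": {"taxonomy": "communications/voip"}}
pkt = b"raw IP packet bytes"
digest = hashlib.sha256(pkt).hexdigest()
sig = hmac.new(keys["com.example.voip"], (digest + "com.example.voip").encode(),
               hashlib.sha256).hexdigest()
rec = json.dumps({"app_id": "com.example.voip", "packet_sha256": digest, "sig": sig})
print(inspect_signed_packet(pkt, rec, keys, registry))  # -> communications/voip
```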

Packet Signing Architecture



Many aspects are left unaddressed here. Among the most important is Packet Authentication: the ability to associate packets with a specific end-user or organization, as well as the need for a fully Trusted Environment as a secure framework to ensure the integrity of the whole Packet Signing system. The Trusted Computing Architecture (TCA) appears to be a credible basis for the latter.

In addition to the motivations expressed earlier, there are other advantages to Packet Signing; notable ones are the complete elimination of rogue applications from networks and new authentication paradigms.

The following table attempts to summarize the technical pros and cons of DPI versus Packet Signing:


Dependence on standardization and community alignment
  • DPI: Very low.
  • SPI: Very high - requires alignment between Code Signing, networking equipment manufacturers, SPI manufacturers, taxonomy bodies, etc.

Keeping up with application fingerprints
  • DPI: Scalability is an issue, as the process requires reverse-engineering millions of applications - a never-ending task that cannot be economically viable for too many vendors.
  • SPI: None required.

Detection and classification precision
  • DPI: Varies between DPI releases and between vendors; highly dependent on reverse-engineering of applications and on interpretation of standards. Getting lower every day as encryption gets more widely used and mobile applications boom.
  • SPI: Close to 100%.

Measurement precision
  • DPI: Varies between DPI releases and between vendors; highly dependent on reverse-engineering of applications and on interpretation of standards.
  • SPI: Close to 100%.

Classification and measurement consistency across releases and vendors
  • DPI: Not perfect between releases from the same vendor; none between vendors.
  • SPI: Close to 100%.

CPU and memory resource requirements
  • DPI: Server-side: very high. Client-side: not applicable.
  • SPI: Server-side: low. Client-side: expected to be low; however, this is an aspect that requires further research.

Traction will be very unequal from one application to another. The following table is a very subjective attempt to measure the level of desirability of Packet Signing for each class of application - an embryonic effort to foresee where Packet Signing will emerge first:

Typical DPI-based Application | Traction for Packet Signing | Sample Players and Organizations
Content-Based Forwarding | High | OpenFlow, Cisco, IEEE
Copyright Enforcement | High | HADOPI, RIAA, MPAA
Lawful Interception | High | Siemens, Verint Systems, VeriSign, CALEA, RIPA
Tiered Services & Charging | High | Ericsson, Huawei, Amdocs, PCC
Analytics | Medium | Amethon, Guavus, IBM, NetCracker, Neuralitic, SDM
Security | Medium | Arbor Networks, Radware, Sonicwall
Service Assurance & Application Performance Management | Medium | BMC, CA, Compuware
Targeted Advertising | Medium | Kindsight, Openet
Network Management | Low | Tektronix, EXFO, Polystar

It’s going to take a long time before we get there. Packet Signing will most likely follow the same rocky path as Network Access Control (NAC), Trusted Computing Architecture (TCA) and Code Signing. Initial experiments are happening now, on campuses, alongside Content-Based Forwarding, IPv6 and OpenFlow.
   
Questions and objections related to Net Neutrality and vulnerability exposures, such as man-in-the-middle attacks, will need to be addressed as we go forward. And, without a doubt, Packet Signing is a controversial proposition.
  
If Packet Signing is ever deployed at large scale, the first real-life deployments are anticipated to happen in defence organisations, with the enterprise market to follow and Communications Service Providers last. Finally, there is no need to worry about conventional DPI, as it is expected to cohabit synergistically with Signed Packets Inspection for a long, long time.

  
_____________

*As CTO and Vice-President of Product Development at Neuralitic Systems, Mr. Tremblay headed the development of the company’s big data, DPI-based mobile analytics platform, as well as its intellectual property strategy. Prior to that, he held executive positions at Cilys, the pioneering wireless data optimization start-up acquired by Openwave Systems, and at Sipro Lab Telecom/VoiceAge Corporation, as well as engineering management and product management positions at Openwave Systems.

Mr. Tremblay has contributed to multiple pending patents related to DPI, PCRF analytics, classification of web content, classification of encrypted traffic and converged analytics. Marc is based in Montréal.

Sunday, April 29, 2012

Astellia Adds VoIP Monitoring from VOIPFUTURE

 
Astellia (see "Astellia's IP Probe" - here) and VOIPFUTURE (see "VOIPFUTURE Correlates VoIP Call Quality with Subscriber ID" - here) announced their partnership ".. to complete Astellia’s network monitoring solution for 2G, 3G and LTE networks with VOIPFUTURE’s offering. Astellia, forerunner in monitoring solutions for optimization of mobile network QoS and QoE, and VOIPFUTURE, the leading provider of voice quality monitoring in IP networks, combine their best of breed technology for a superior QoS and QoE solution".

VOIPFUTURE’s CEO Jan Bastian (pictured) said: “In joining forces, VOIPFUTURE and Astellia anticipate the strong demand of mobile operators for VoIP quality monitoring .. VOIPFUTURE and Astellia will give mobile operators the crucial data they need to control their network, secure service quality and optimize operations”.



See "Astellia and VOIPFUTURE join forces to provide VoIP monitoring for 3G and 4G networks" - here.

TechNavio: 33% CAGR for DPI Market (2011-15)

   
TechNavio (the research platform of Infiniti Research) forecasts "the Global Deep Packet Inspection market to grow at a CAGR of 32.98 percent over the period 2011-2015 .. the report also discusses that lack of awareness of benefits of DPI remains a leading challenge to vendors".

The CAGR forecast is very similar to Infonetics' numbers - 33.6% for 2011-16 ($470M to $2B, here). Key vendors covered in-depth are Sandvine, Cisco and Allot (according to Infonetics, Huawei led the DPI market in 2011).
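As a quick sanity check (my own arithmetic, not the analysts'): a CAGR is (end/start) ** (1/years) - 1, and Infonetics' $470M-to-$2B path over five years does indeed work out to roughly 33.6%:

```python
# Sanity-checking the cited growth figures; this is my own arithmetic,
# not taken from the TechNavio or Infonetics reports.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate over the given number of years."""
    return (end / start) ** (1 / years) - 1

# Infonetics: $470M in 2011 growing to $2.0B in 2016 (5 growth years)
print(f"{cagr(470e6, 2.0e9, 5):.1%}")  # -> 33.6%
```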

"The influence of DPI technology is increasing in the network environment of service providers. They are integrating DPI technology in their network equipment. Service providers are using standalone DPI products for the core of the network and integrated DPI solutions for their access points to get maximum benefit from both types of DPI technology. Thus, network operators are using both types of DPI solutions for better network management”. 
   
See "Global Deep Packet Inspection Security Market 2011-2015"- here and "TechNavio Announces the Publication of its Report – Global Deep Packet Inspection Security Market 2011–2015" - here.

Saturday, April 28, 2012

Sandvine Presents: Mobile is also Audio and Messaging


See "Sandvine Report: Mobile Networks Teeming With Streaming" - here.




US Government - RFI for Threat Detection, DPI and Analytics


The US Department of State is "..  looking to refresh their Network-based Advanced Threat Detection (ATD) / Deep Packet Inspection & Analytics (DPIA) technologies that support their Defense in Depth architecture. The technology will be adjacent to current Intrusion Detection Systems (IDS) and other border protection technologies".
   
See "Request for Information (RFI) – Advanced Threat Protection / Deep Packet Inspection & Analytics (DPIA)" - here.

Friday, April 27, 2012

Monday's Guest Post: Can Application-level Authentication Replace DPI?


A new guest post will be published on Monday. In his article, "Packet Signing", my 17th guest, Marc Tremblay, will look into application-level authentication as an alternative to DPI.

"The need to attribute an identity to the traffic is an area that appears to be underserved by academics, commercial ventures and normalization groups. The concept is to attach a unique “license plate” at the Network layers, resulting in a unique Packet Signing that would enable the identification of the application that generated a given traffic flow. That traffic “license plate” would be eventually associated with an in-band or out-of-band “driver license”, enabling the effective authentication of the end-user or organization associated with a given traffic.", says Marc.

Stay tuned.

Guest posts are welcome! Please send me a proposed subject, an abstract and the author's details.

TM Wins: US Tier1 Uses Radware to Manage DNS Traffic ($2M Deal)

 
Radware announced a ".. $2 million sale of its Alteon®10000 application delivery controller (ADC) to a leading Tier 1 telecommunications carrier in the United States. The telecom provider will deploy Radware's carrier-grade ADC in its network hubs across the U.S. as part of a major upgrade to its domain name system (DNS) application .. With Radware's Alteon 10000, the carrier has an advanced ADC platform delivering up to 80 Gbps of on-demand capacity for unparalleled application scalability, availability, reliability and performance".
  
See "Radware's Alteon 10000 Delivers the Capacity and Performance Needed to Help a U.S. Tier 1 Carrier Expand Its DNS Application" - here.

Diametriq (IntelliNet Technologies) Now Focused on Diameter Signaling

 
IntelliNet Technologies, one of the first companies to offer Diameter Signaling Routing, is now dedicated to the space.

The company, now named Diametriq, announced that it ".. was launched today to focus on “Smart Signaling” solutions and meeting the ever increasing signaling traffic demands of 4G/LTE networks. The new company is derived from key assets including executive leadership, engineering, and technologies from IntelliNet Technologies, a wireless networks convergence company".

Diameter Signaling Routing has made impressive progress in its short existence. Recent milestones include the acquisition of Traffix by F5 (here, and "F5 CEO Happy with Traffix" - here), significant deals for Tekelec (here, here, here) and new players (Ulticom, Alepo). My Diameter Router product page (here) was updated accordingly.

"Anjan Ghosal (pictured), CEO of IntelliNet Technologies and a veteran of the telecom industry, successfully spun out IntelliNet’s Mobile Data Offload business last year to Ruckus Wireless of San Jose, California .. For this new venture Ghosal has teamed up with industry veterans Kumar Ramalingam heading up Product Development, Chris Knight, Product Management, and Dan Wonak, Marketing and Operations".



See "Diametriq™ Delivers a Unique Approach for Managing LTE Control Traffic" - here.

Thursday, April 26, 2012

2012 Olympics: Will ISPs Use Traffic Shaping to Cope with Demand?

   
The summer Olympic games in London are going to be a milestone in broadband traffic management - in all networks, fixed and mobile, and for all users: UK subscribers, tourists and viewers around the world.
 
I have already had several posts about the preparations for [data consumption] records: "Olympic Sized Bottlenecks" - here; "UK 2012 Olympic Delegation: All Available Frequencies" - here; "A Year Before the Olympics - London is Running Out of Mobile Capacity" - here. But it seems that there is still a problem to be solved.
 
Paul France reports to Cable UK that "A broadband trade association has said its members may introduce traffic management policies to cope with demand during the Olympics .. Businesses operating in London have been urged to allow staff to work from home during the international sporting event to reduce levels of congestion on the city's roads, but this move is set to have a knock-on effect on broadband networks .. The number of people planning to watch the Games online is also likely to place ISPs under strain .. As a result, trade body the Internet Service Providers' Association (ISPA) stressed that some of its members may utilise traffic management to cope with potentially unprecedented demand for bandwidth".

"Service providers are not expected to cap data use, but may use technology to manage the network at peak times to prevent access from stalling completely .. Despite the spokesman's warnings, O2, Orange and Virgin Media insisted they do not plan to introduce any additional measures to control usage, such as throttling or data caps .. BT explained that it is expecting to see higher-than-average levels of usage throughout London 2012, but stated that it has been working to increase capacity and is consequently satisfied that it will not need to implement additional controls"

See "ISPA warns members may use traffic management during Olympics" - here.

CommProve: The Business Model Behind Cell Awareness

  
CommProve published some numbers, "that have been pulled together based on CommProve real-life deployments", arguing that "..real-time cell-awareness is the key to mobile network operators simultaneously increasing ARPU whilst preventing OTT revenue erosion. Combined with policy management, real-time cell awareness provides mobile operators with the opportunity to customize subscriber quality of experience with – for example – speed boosts and priority network use on a per subscriber or application basis".
See "CommProve Adds Cell, Location and Mobility Intelligence to Policy Management" - here.

"Real-time cell awareness enables real-time customization of the subscriber experience. For the first time effective policy control can be implemented and managed minute-by-minute at the subscriber level .. By including real-time cell awareness operators can: Make CapEx savings and Avoid revenue leakage from subscribers not being able to access resources"


Example Use Case | Value
Monthly revenue from 1 BTS with congestion | €55,500
Monthly revenue from 1 BTS without congestion | €57,200
Number of BTSs (full coverage) | 120,000
Estimated number of congested BTSs | 8,400
Estimated number of congested hours (in a month) | 75
Revenue loss saving per month | €1,487,500
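The bottom line appears to follow from the figures above it, assuming a 720-hour month (30 days x 24 hours). A quick check of the assumed calculation:

```python
# One plausible reading of how the table's bottom line is derived (my own
# back-of-the-envelope reconstruction, not CommProve's published method),
# treating a month as 30 * 24 = 720 hours.

rev_with_congestion    = 55_500   # EUR, monthly revenue from 1 BTS with congestion
rev_without_congestion = 57_200   # EUR, monthly revenue from 1 BTS without congestion
congested_btss         = 8_400    # estimated number of congested BTSs
congested_hours        = 75       # estimated congested hours in a month
hours_per_month        = 30 * 24  # 720

loss_per_bts   = rev_without_congestion - rev_with_congestion   # EUR 1,700
monthly_saving = loss_per_bts * congested_btss * congested_hours / hours_per_month
print(f"EUR {monthly_saving:,.0f}")  # -> EUR 1,487,500, matching the table
```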

See "Real-time cell awareness key to sweating mobile network assets" - here.

Wednesday, April 25, 2012

Sprint Migrates from a "Basket of Tools" to Consolidated Video Optimization Solutions

 
The recent trends in traffic management are the integration of multiple traffic management and optimization tools into a single solution, and a move away from simple throttling policies toward more sophisticated congestion management, OTT control and quality-of-experience management - considering both network cost reduction and revenue generation.
 
Vendors that used to offer single functions (DPI, optimization) are enhancing their solutions with additional functions - either by self-development (F5 and Bytemobile adding DPI), acquisitions (rumors of Allot buying Ortiva - here), OEM or partnerships.

All this is done to stay ahead of similar capabilities offered by existing network elements - such as Cisco's GGSN in-line services, or the DPI functionality in Ericsson, Alcatel-Lucent and Nokia Siemens Networks GGSNs.

A recent story by Sarah Reedy (pictured) to Light Reading shows how well this approach is accepted by Sprint.

Bob Azzi, SVP of Networks, said "When we deliver video, we're wasting bandwidth .. the core is aware of the type of device and the content the user is demanding and only sends the bandwidth necessary for both, reducing what's required at the RF layer".

VP of Development and Engineering Iyad Tarazi (pictured) said "We used to see in the past, four to five years ago, one supplier doing caching, one doing encoding and decoding, others looking at pinching and throttling .. We're now seeing consolidation and optimization tools that are customer-friendly .. The tools still have to do all the things -- caching, encoding, decoding, etc. -- but now the tools focus on managing the network, rather than looking at the quality of the video that the content supplier is capable of delivering. That doesn't mean throttling .. it could mean sending video to a consumer that is optimized for the network speed in a specific area, instead of sending the consumer the best possible version of that video available from the content provider". 
See "Sprint's Optimized Mobile Video Strategy" - here.

Flash Networks: "Content recommendations" will be Launched in June



David Murphy of MobileMarketing interviewed Merav Bahat (pictured), head of marketing at Flash Networks, about the optimization vendor's future product, "content recommendations" (or behavioral advertising), which is based on DPI analysis of the subscriber's traffic going through Flash Networks' gateway.

"So we have these user sessions and traffic going through the gateway, so we know the context. We know, for example, that Jane is a sports fan, so we can recommend stuff to her that she might be interested in .. [we may push] messaging or interstitials, but we have something else - a toolbar we overlay on a web page. We can put this on any web page, and this toolbar contains the offers".

The history of behavioral advertising shows that operators should be careful with the way they present it to their subscribers, to avoid privacy concerns, and should offer it through an opt-in process. For example, it is not the best idea to show that the information could be associated with the subscriber's identity (Jane, above). See the recent example from Marriott - here.

Nevertheless, some operators have already launched similar services(?) - Smart (here), Romtelecom (here) and Orange France (here).
  
"The toolbar solution with the content recommendations is not live yet, but we have had one live deployment for a year in Russia [MTS?? - hereof another service, which has been very successful, and there are more in the pipeline. This is a Quota Management service where mobile users can check their account status on the phone .. It tells you the status of your account, warns you when you are about to use up your allocated data plan, so you can upgrade to a more relevant plan that better meets your needs. It puts the user in control .. [content recommendations] will launch this June. We are in the process of negotiating the revenue share with a number of operators. The feedback we have had so far has been very encouraging".

See "Recommended Reading" - here.

Ericsson: "The IP application router SSR 8020 is now in commercial operation"

   
I continue to follow up on the availability of the Ericsson SSR 8000 (which also has DPI - here).
 
In its 2012-Q1 report, Ericsson says - "The introduction of LTE also drives operators’ interest for investments in core and IMS. The IP application router SSR 8020 is now in commercial operation"

See previous updates - "Ericsson - Volume deliveries of SSR 8000 are expected in 2012" (March '12 - here) and "Ericsson Updates: SSR (w/DPI) Scheduled for Q4" (Aug '11 - here).
 
See "Ericsson FIRST QUARTER report" - here.

Tuesday, April 24, 2012

PCRF Announcements: Tekelec Adds OTT Features, Shared Data Plans and Location Based Services

 
Tekelec announced ".. enhanced Over-the-Top (OTT) applications, advanced shared data plans and location-based LTE services with the newest version of Tekelec’s Policy Server (PCRF)"
   
"The new Policy Server release, available in June 2012, gives service providers the ability to introduce: Shared data plans [see "Infonetics: Shared Data Plans - an Opportunity with OSS Needs"- here], LTE service options in available service areas, Enhanced over-the-top applications and table-driven policies to accelerate time-to-market".
 
As for OTT - "Add value to OTT applications by offering unique quality-of-service rules, including location-based service enhancements, and launching consumer-friendly plans based on popular applications. The latest Diameter interfaces such as Sd for Deep Packet Inspection [here, chart below] and Content Optimization solutions deliver faster time-to-market for new services and improved interoperability".
   

Source: Dr. Sungho Choi

See "New release of Tekelec policy server key to profitable, innovative mobile data business model" - here.


[US] Bill Shock Prevention: VZW, AT&T and T-Mobile will Alert on Exceeding Data Charges

   
The FCC announced that "Consumers are getting help avoiding bill shock thanks to an agreement by the major U.S. wireless service providers to a change in the voluntary Consumer Code for Wireless Service sponsored by the industry trade group CTIA – The Wireless Association. The participating carriers, representing more than 97 percent of the nation’s wireless customers [i.e. Verizon and AT&T included], have agreed to start sending, by October 2012, a series of free alerts to subscribers who have wireless plans that impose additional charges for exceeding limits on voice, data and text usage, and to those who will incur additional charges when using their wireless devices while travelling abroad".
 
The history of this matter goes as follows:
  • [Oct, 2010] FCC: We have a "Bill Shock" Problem - here
  • [Jan, 2011] CTIA: FCC Bill Shock Alerts "would cost tens, if not hundreds, of millions of dollars to implement" - here.
  • [Oct, 2011] Bill Shock Prevention is Coming to the US Mobile Service - here 
"The following table tracks the progress in providing the alerts by the carriers that have subscribed to the code, according to the latest information supplied by CTIA .. All alerts must be provided without charge and automatically. Subscribers will not need to take any action to receive the alerts. The carriers must provide their subscribers with at least two of the four types of alerts by Oct. 17, 2012, and all of the alerts by April 17, 2013".

Carrier | Voice | Data | SMS/Text | Int'l Roaming
AT&T | Yes | N/A | |
Cellcom | | | |
Cellular One of NE Arizona | | | |
Clearwire | | | |
Illinois Valley Cellular | | | |
SouthernLINC Wireless | | | |
Sprint | Yes | | |
T-Mobile USA | Yes | Yes | N/A | Yes
US Cellular | | | |
Verizon Wireless | Yes | Yes | |

See "Bill Shock: Wireless Usage Alerts for Consumers" - here and "New FCC Website to Help Consumers Beat ‘Bill Shock’" - here.

[Infonetics]: Huawei and Sandvine Led the DPI Market in 2011

 
Shira Levine, directing analyst for next-gen OSS and policy at Infonetics Research, has updated its Service Provider Deep Packet Inspection Products report, stating that "product revenue grew 29% to over $470 million worldwide in 2011" and forecasting "the service provider DPI market to grow to $2.0 billion in 2016, with the bulk of the growth coming from the mobile space".

Infonetics' previous report forecasted a market size of $1.6B for 2015 (here). A year ago, the forecast was $2.1B for 2015 (here).

Note that the revenues of the 3 public DPI vendors (Sandvine, Allot and Procera - see the market share chart below) grew by 27% during 2011 (here).


My DPI Market size table (here) is updated accordingly.
 
In addition, the report finds that "Huawei narrowly pulled ahead of Sandvine to take the lead in global service provider DPI revenue share in 2011".
 
Infonetics sees risks for the standalone DPI products market - "DPI is increasingly being incorporated into larger solutions, such as video optimization and mobile offload, creating opportunities for suppliers that offer DPI technology on an OEM basis" and points to the changes in the way DPI is used: "Operators are evaluating alternatives to throttling or blocking high-bandwidth video content, including using DPI for media caching, to prioritize select video content to support guaranteed QoS and as part of a content delivery network strategy".
   
"Though fixed-line operators continue to invest in deep packet inspection solutions for traffic management and to manage the impact of over-the-top (OTT) content on their networks, wireless operators are looking to DPI for more granular traffic management, including prioritization and strategic offload, and are starting to deploy DPI hand-in-hand with their LTE network upgrades"

See "Deep packet inspection (DPI) market a $2 billion opportunity by 2016" - here.

Monday, April 23, 2012

[Rumors]: Allot to Buy Ortiva Wireless


Orr Hirschauge reports to The Marker that Allot Communications is going to acquire Ortiva Wireless, a vendor of video optimization solutions for mobile operators (here, Hebrew - I was interviewed for the story).

Ortiva's solution provides four main functions: Application-Aware Bandwidth Allocation, Client Buffer Management, Content-Aware Bandwidth Reduction and Network-Aware Adaptation. Like Allot, Ortiva uses ATCA as its hardware platform, with blades from Radisys (former Continuous Computing products). Ortiva is listed by Allot as a partner (here). Ortiva also has relationships with Procera (here) and Sandvine (here).

Allot has been looking for video optimization technology for a long time now, and was previously in negotiations with other vendors in this space, including Mobixell and Flash Networks (here). Such an offering may compete with the integrated DPI/optimization solutions from Bytemobile (here and here) and those planned by F5 (here), as well as with other vendors. Allot already has a video caching solution integrated in its gateway (an OEM from PeerApp).

P&G Saves $15M/year by Blocking "Non Business" Internet Traffic


Using traffic management (usually with DPI products) to optimize enterprise networks and internet access is not new, but recent numbers published by David Holthaus on Cincinnati.com show an amazing volume of recreational traffic consumed by employees - and the potential savings.

"When Procter & Gamble’s IT sleuths investigated why the company’s computers were running so slow, they found something surprising: More than 50,000 YouTube videos were being downloaded from company computers every day. Along with watching videos, P&Gers were listening to 4,000 hours of music a day on Pandora, the personal playlist Web site .. The demand for desktop videos, music and other non-business entertainment delivered to PCs through P&G’s Internet network was so great that it exceeded the company’s capacity".
Nevertheless, it is not easy to sort applications into business and non-business, and as such those 50,000 daily YouTube downloads are spared (for now):
"The digital emergency led P&G to block Pandora and the movie and video site Netflix .. P&G called Pandora and Netflix of “limited business need,” according to the company memo. P&G has not blocked YouTube or Facebook, which claims more than 500 million users around the world. The company uses both of those sites for marketing its brands and for internal and external communications".

"If the company keeps using bandwidth at the current rate, it will cost $15 million a year just to add more for non-business use, the memo estimated".

See "P&G puts clamps on web surfing" - here.

[Guest post]: Monetizing a $3 Trillion Industry – Mobile Data Means Business

By Lyn Cantor*, President, Tektronix Communications 


Social media properties such as Twitter, Facebook and even Google have dominated the headlines of the world's media. Yet even Facebook's expected valuation of around $100 billion in its upcoming IPO pales into insignificance when compared to the estimated value of the global mobile telecoms industry: $3 trillion. This underlines the sheer scale of the wireless market. An image has been cultivated in the global media that mobile operators are being overrun by OTT players, riding roughshod over their networks for free. Amid diminishing ARPU figures and an increasingly challenging global market, the situation appears grim for mobile operators. However, with $3 trillion worth of assets, I believe the industry is at the top of its game and has the right mix of elements in place to spark exponential growth, dwarfing other established industries.

Mobile data – a boost for operators

Mobile operators spanning the globe have, in fact, been buoyed by the increased mobile data usage that social media and OTT players have generated for them. Indeed, the next challenge for operators will be how to monetise this traffic, or at the very least, introduce new business models in which they can generate a significant portion of revenue from their delivery. Mobile operators will be assisted in this process by their access to the richest vein of data that any business in the world could wish for. Each day, millions of voice calls, messages and data sessions, among other information, pass over the world’s mobile networks. This gold mine of data holds the key to carriers’ prolonged and extensive growth. Operators’ network information enables them to build up a wealth of customer information deriving context such as location, demographics, values, lifestyle and even the opinion of their end-users. Operators can now use this ‘customer lifecycle’ data as the foundation for applications and services for a new breed of ARPU data subscriber - who is willing to pay for their personalised experience.

Information is power

By leveraging information from multiple technology domains across their networks, operators can formulate sustainable business models focused around the delivery of mobile data. For example, operators can see, in real time, quality of service and service usage at the end-user level – right down to the performance of the handset. This level of granular insight can form the basis for informed business decisions: this high-value information can be fed into marketing and other customer-facing functions in an operator, and can underpin the development of a range of targeted and highly personalised services and applications. These flexible services can be tuned to meet the exacting needs of differing demographics in the subscriber base. This level of customer intimacy allows operators to truly deliver on their promises to subscribers – and that ultimately translates into higher revenues and profit margins.

The challenges of mining data

Having access to the right data at the right time is not an easy task – given the patchwork of technologies that makes up modern mobile networks. Network technologies have become more complex with the move to all-IP networks, yet as networks develop they are required to deliver mobile broadband and data services in a reliable, high-quality fashion. Network environments are now made up of countless elements, interfaces and protocols. It is key for operators to be able to harness all of this disparate data and turn it into meaningful and actionable strategic information. This requires network measurement tools that can access, collect and analyse all of the information that is held in the network.

Tools like this can help mobile operators to understand the impact of the OTT providers on their business - as well as on their networks. Critically, such tools enable operators to monetise the data they hold and charge OTT players that want to access it. The volume, and types, of data transported by mobile networks is growing exponentially. By leveraging network measurement tools, operators can not only optimise and monetise traffic, but also ensure their networks are as cost-efficient as possible by using information mined from the network to make informed resourcing decisions.

Bolstering service uptake

Operators continually seek ways to up-sell and cross-sell services – but in order to do this successfully, they need to fully understand users’ quality and usage patterns. In the past, operators have had limited ability to do this. However, the detailed information they can derive from network measurement tools empowers them to directly align their customer resources with their customer needs.

The advent of LTE will provide operators with even greater insight into their customers’ behaviour. Intelligence from LTE networks on the types of services and the experience of users is very compelling. Operators can now seize the opportunity and convert data from their network into bottom-line profitability.

The attention of the media and the markets continues to focus on the likes of Apple, because of the stylish form factors of its products, and Facebook, as it is poised for an IPO. But while social media brands grab the headlines, operators are happy to work in the background, developing innovative business models - using network data that social media properties simply don’t have access to - to open a series of lucrative new revenue streams. It seems the $3 trillion mobile industry is on the up – despite the contrary headlines in the world’s media.

______________

*Lyn Cantor was appointed President of Tektronix Communications last April. He re-joined the company after serving as SVP and General Manager of Visual Network Systems (VNS). Prior to his appointment at VNS, Lyn was VP of Worldwide Sales, Service and Marketing at Tektronix Communications. Over the course of his 27-year career, of which 14 years have been with Tektronix Communications, Lyn has held various vice-president positions in Americas sales, global channels, product management and marketing, in addition to being a general manager.




Sunday, April 22, 2012

Resource: Diameter Dictionary

 
Developing Solutions has ".. released their complete Diameter Dictionary .. The dsTest dictionary capability allows users to define a set of message templates against which all messages are to be validated. A dictionary may be defined as a base level which can be applied to all applications with dictionary validation enabled, or at the application level which would only be applied to a specific application. Application level dictionaries are translucent, in that if a message or AVP from a message is not found, the base dictionary is also searched".
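The "translucent" fallback described above is easy to picture in code. Below is my own toy illustration of the lookup order - not dsTest code; the Gx application ID and the templates are just examples:

```python
# My own toy illustration of a "translucent" dictionary lookup (not dsTest
# code): the application-level dictionary is consulted first, and anything
# not found there falls through to the base dictionary.

class TranslucentDictionary:
    def __init__(self, base_templates, app_templates=None):
        self.base = base_templates      # base level, applies to all applications
        self.app = app_templates or {}  # per-application overrides

    def lookup(self, app_id, message_name):
        """Find a message template, preferring the application dictionary."""
        template = self.app.get(app_id, {}).get(message_name)
        if template is None:
            template = self.base.get(message_name)  # fall through to the base
        return template

# Hypothetical example: a Gx dictionary (Diameter application ID 16777238)
# that only overrides RAR; CCR validation falls through to the base.
d = TranslucentDictionary(
    base_templates={"CCR": {"mandatory_avps": ["Session-Id", "Origin-Host"]}},
    app_templates={16777238: {"RAR": {"mandatory_avps": ["Session-Id"]}}},
)
print(d.lookup(16777238, "CCR"))  # -> the base CCR template
```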

See "Developing Solutions® Announces Complete Diameter Dictionary" - here. The dictionary is accessible here.



[Google]: "There’s a clear correlation between speed and the success of your online business”

   
Olga Kharif reports to Bloomberg Businessweek on the efforts of Arvind Jain (pictured), Google's engineering director, to improve website response times.

"Jain’s mission: get websites to load over mobile- phone networks twice as quickly as they do now. Today’s times are typically 9.2 seconds in the U.S .. "There’s a clear correlation between speed and the success of your online business,” Jain said"

"What makes a mobile Web connection slow? In some cases, it’s the carriers’ network -- say, if users can’t get 3G or 4G service on their phones. Often, though, it’s because the Web page wasn’t designed to load quickly on a wireless device .. To fix the problem, Google is tweaking its mobile browser and working with other companies on changing the way basic Internet technologies work".

"Google also is pushing for revisions to Internet protocols, the decades-old rules that govern the way the Web functions. The changes would better handle the quirks of modern mobile networks, such as their propensity to occasionally lose data en route. A revision called TCP PRR, for example, will deploy a new algorithm that accounts for data losses and network congestion" - See Google [Still] Claims there is a Faster TCP" - here.

"Faster mobile Web speeds also translate into additional mobile-ad revenue. A 30 percent improvement in mobile Internet’s speed could lead to a 15 percent rise in ad sales, said Trevor Healy (pictured), chief executive officer of mobile-ad provider Amobee Inc. U.S. mobile-advertising spending will reach $2.61 billion this year, up from $1.45 billion in 2011, according to EMarketer Inc."
   
See "Google Seeks Billions by Boosting Mobile Internet Speeds" - here.

Saturday, April 21, 2012

Network Capacity Issues - Wi-Fi vs. Small Cells


An interesting presentation by Zahid Ghadialy (pictured), Mobile Telecoms Consultant, on network capacity issues, Carrier Wi-Fi vs. small cells, and more.