May 12, 2008
Volume XXII, Issue 1
QTRAX Signs with UMG for P2P Music
Excerpted from The Guardian Report by Owen Gibson
Ad-funded file-sharing service QTRAX has signed a deal with Universal Music Group (UMG), the world's largest record label and home to U2, Kanye West, and Mariah Carey.
UMG SVP Peter Lofrumento confirmed the new deal in an announcement with QTRAX spokesperson Shamin Abas.
QTRAX announced its imminent launch at MidemNet/Midem in January, with a goal of making 25 million tracks available for free download. It plans to piggy-back on existing unlicensed peer-to-peer (P2P) download services, but clear all the rights with labels and publishers, and generate revenue for rights holders through advertising.
The service has since signed deals with Beggars Banquet Records, the UK's largest independent label, home to Dizzee Rascal and the White Stripes, and the publishing arms of EMI Music and Sony/ATV.
Currently, tracks from TVT Records are available, with the Beggars and Finetunes catalogs set to become available in the next few weeks. QTRAX should be able to offer UMG music within the next month or so.
QTRAX has promised that tracks downloaded through its browser can be kept by users forever as long as they regularly log in to renew the rights management software. A Mac-compatible version of QTRAX is scheduled to debut May 18th.
As well as selling advertising, QTRAX will offer merchandise, concert tickets, and other music-related items.
Major record label executives are keen to experiment with new revenue models that could help plug the gap in earnings caused by the rise of online copyright infringement and the corresponding slump in CD sales. However, the licensing minefield of negotiating with a range of rights holders has caused frustration for new entrants.
MK Capital Sponsors Kontiki Spinout
Kontiki, the leading provider of managed peer-assisted delivery for high-quality video and digital content, announced this week that it has completed its anticipated divestiture from VeriSign. As a result, the assets previously comprising VeriSign's Broadband Content Services business unit will be re-launched as Kontiki Inc., with funding from MK Capital.
The Kontiki Delivery Management System is a patented peer-assisted content delivery technology that enables the delivery of high-quality video and other rich media securely and efficiently on a very large scale. Built on an award-winning peer-assisted content delivery and management platform, Kontiki is used by Fortune 500 companies for its massive scalability and network efficiency.
Originally founded in 2000 in Silicon Valley, Kontiki was acquired by VeriSign in 2006. Kontiki retains all of its global customers and several members of the original Kontiki management team, including Eric Armstrong who has been named President of Kontiki; Todd Johnson, formerly Chief Executive Officer of Kontiki, is now Chairman of the Board.
"Throughout our time as part of VeriSign, we remained committed to the development and support of the Kontiki award-winning peer-assisted delivery technology," said Eric Armstrong, President of Kontiki. "The Kontiki technology has been proven in the marketplace for eight years, and we are proud to claim some of the most prominent enterprises and consumer media companies in the world as our long-standing customers."
"We are very excited to be financing the expansion of Kontiki," said Mark Koulogeorge, Principal, MK Capital. "The company has a unique and tested P2P technology, which addresses the need to efficiently distribute video over the Internet and private networks. The new generation of Internet users will demand video content in order to be engaged. As the use of video as a communication tool continues to expand, Kontiki's technology will be increasingly strategic to corporations worldwide."
Kontiki will be headquartered in Mountain View, CA with regional sales offices in Denver, CO, Dulles, VA, and London.
P4P Lowers ISP Costs, Accelerates Delivery Speed
Excerpted from Beet.TV Report by Andy Pressler
In the span of just six months, P2P companies have gone from being the bane of Internet service providers (ISPs) to emerging with a practical solution for managing network traffic and maximizing profits for ISPs and content creators.
A great deal of collaboration is underway, including the efforts of Comcast and BitTorrent. NBC Universal is releasing its video download service with a P2P component from Pando Networks.
This week in Los Angeles at the P2P MEDIA SUMMIT LA, held in conjunction with Digital Hollywood Spring, leaders in the P2P and media industry gathered to discuss implementations of P2P.
As an indication of the progress now underway, Bob Pisano, President & COO of the Motion Picture Association of America (MPAA), was the conference luncheon speaker.
A number of test results from the P4P Working Group (P4PWG) were released by the Distributed Computing Industry Association (DCIA).
Pando is a member of the 60-company working group, and its CEO Robert Levitan gave a keynote address on Monday. Beet.TV caught up with Levitan on Friday in Manhattan's Madison Park for a chat about the work of the DCIA and the findings of the group.
In tests conducted by Pando over the past several months, P4P has shown that P2P network operating costs can be reduced and the speed of delivery to consumers increased. Some of these findings were published last month.
We have been following Pando for a while. The company has shifted its business from providing a desktop P2P consumer application to becoming a P2P solution provider for content creators and network operators.
Pando recently raised $8 million in an additional funding round from Intel Capital and others. This is Robert Levitan's fourth start-up. He was the co-founder of iVillage.
Report from CEO Marty Lafferty
We are very grateful to all who participated this week in the P2P MEDIA SUMMIT LA, the DCIA's flagship event.
The keynotes and panel discussions were especially informative and stimulating, making the program, along with the conference's many networking opportunities, very worthwhile for attendees.
Above all, there was an invigorating sense that the pace of P2P commercial advancement is now accelerating among ISPs, software companies, and content providers, with more major breakthroughs on the horizon.
For those who were not able to attend, we will be featuring key presentations online at the archival website.
Presentations from BitTorrent's Eric Klinker, Comcast's Rich Woundy, HIRO Media's Ronny Golan, KlikVU's Lowell Feuer, Microsoft's See-Mong Tan, Motorola's John Waclawsky, the Motion Picture Association of America's (MPAA) Fritz Attaway, P4P Working Group (P4PWG)'s Doug Pasko, Laird Popkin, and Haiyong Xie, TVU Networks' Dan Lofgren, and Unlimited Media's Memo Rhein have already been posted. We will alert DCINFO readers as more are added.
Thanks, too, to Digital Hollywood Spring's Next Generation P2P panel speakers, MediaNet's Mark Mooradian, Verimatrix's Tom Munro, MediaDefender's Jonathan Lee, Pando Networks' David Buckland, GridNetworks' Tony Naughtin, and LimeWire's George Searle.
Mark outlined MediaNet's role as a white-label provider of music services to B2C brands including subscription P2P offering iMesh. P2P already commands the largest audience of Internet users, and B2B entrants in the P2P space, such as BitTorrent, will be increasingly important as file sizes increase to the 2 gigabyte range. Business models for P2P music need to evolve to capture the revenue potential from the voluminous redistribution of MP3 streams.
ISPs don't want to be in the business of policing traffic or alienating their subscribers with warning notices or service interruptions, placing the onus of copyright enforcement on others in the P2P channel. Content companies need to reduce license fees to increase take-rates in the digital marketplace generally, and to make it possible for P2P distributors to retain more attractive margins.
Tom described Verimatrix as an anti-piracy solutions provider, echoing Mark's observation that the scalability and efficiency of P2P are making it more important as hybrid-P2P content delivery networks (CDNs) emerge. Nevertheless, we are only at about Chapter 2 of a novel the length of 'War & Peace' in terms of P2P industry maturity.
Some ISPs are exploring usage caps and taking actions based on excessive bandwidth consumption. ISPs need to continue to invest in 'bigger pipes' to accommodate increasing demand for rich media content. Meanwhile, content providers should seriously address licensing reform. It is still too complicated and expensive to sign P2P distribution deals.
Jonathan discussed ARTISTdirect's evolution and launch of its new PiCAST service. He recounted the challenges of MediaDefender's attempting to drive redirects to digital rights managed (DRM'd) paid download files contrasted with the relative success of pushing free-to-the-user branded content to P2P users.
MediaDefender uses hash code identification to verify licensed content and is currently putting services together for ISPs. P2P is still misunderstood in many quarters of the content industry. Acceptance of P2P as a viable distribution channel needs to expand, and more high-quality content needs to be licensed for the P2P marketplace to flourish.
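Hash-based identification of this kind generally works by comparing a file's cryptographic digest against a registry of known licensed works. The article does not specify MediaDefender's actual hash scheme, so the following is only an illustrative sketch using SHA-256 and a hypothetical registry:

```python
# Illustrative sketch of hash-based content identification (hypothetical;
# not MediaDefender's actual implementation or hash scheme).
import hashlib

# Hypothetical registry of digests for files cleared by rights holders.
LICENSED_HASHES = {
    hashlib.sha256(b"licensed track payload").hexdigest(),
}

def is_licensed(file_bytes):
    """Check a file's SHA-256 digest against the registry of licensed content."""
    return hashlib.sha256(file_bytes).hexdigest() in LICENSED_HASHES

print(is_licensed(b"licensed track payload"))  # True
print(is_licensed(b"unknown payload"))         # False
```

Because the comparison is on the digest rather than the file name, renamed or relabeled copies of the same bytes are still recognized, while any modification to the content produces a different hash.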
David characterized Pando Networks as a carrier-grade P2P network. One of its goals is to make online video business models work, to improve the experience for users, and to integrate seamlessly with existing ISP infrastructures.
Content is indeed king, and NBC Direct's decision to distribute using Pando's P2P solution is a key milestone for the industry. Harnessing the power of large distributed networks is the most attractive alternative for online video.
Increasing consumer demand for online videos is putting pressure on broadband ISP bandwidth capacity, making it imperative for P2P companies to coordinate more closely with carriers to ensure quality of service (QoS), and continue to make progress in protecting the integrity of copyrighted works. Content rights holders can contribute to commercial development of P2P by making higher quality ad-supported content available to the channel.
Tony provided his vision for GridNetworks as a managed, controlled system for bringing the high-definition (HD) video experience online. GridNetworks offers a high-quality white-label service now, and later this year will introduce a consumer premises equipment (CPE) embedded product.
The P2P industry continues to suffer from the legacy baggage of its initial association with copyright infringement, dating back to the original Napster. But with media companies, perception is reality, and the critical determinant of adoption will be the amount of control P2P can provide content providers to harness the power of distributed networks. Dealing with all stakeholders in the P2P distribution channel is challenging, but is the only way to move forward.
Networks must be managed to keep from getting out of control. Content must be protected. Applications must have certainty. Consumers must have transparency. The regulatory response must not be over-complicated or it will have a negative effect on progress. A greater good can be served through a best practices model.
The content community needs to help dispel the myth (which they helped create) that P2P is bad. The money in the P2P channel comes from consumers - whether indirectly as viewers of advertising or directly as payers for content.
George discussed LimeWire's role as the developer and distributor of the highly popular P2P file-sharing client of the same name, now with a monthly cume reach of 60 million users, who are making in excess of 5 billion search queries monthly.
LimeWire has just introduced its LimeWire Store, which sells music; LimeSpot, its social networking service; and is now preparing for the launch of FanMedia, its search-advertising entry. This last initiative has enormous potential to monetize the tremendous number of requests LimeWire users enter monthly, and to share this revenue equitably with participating content rights holders, whose properties are the subject matter of such searches.
The challenges created by the overwhelming consumer adoption of P2P include making it very difficult now to reconcile everyone's interests. Suing P2P users, litigating P2P companies, and attempting forced conversions are clearly not the way to go.
The market will dictate the ultimate winning strategies, and it behooves all participants in the P2P space to pay attention to what the users are saying and doing. Content interests need to experiment with new business models. Consumers are front-and-center, and their interests need to be considered first.
Audience questions led to discussions of the importance of copyright, along with a sense that it is not yet working properly in the digital space; observations that media will increasingly be delivered via TCP/IP; that bringing together the key stakeholders needed to commercially advance P2P is critical at this juncture; and that content providers need to retain the right to participate or not in P2P marketing and distribution programs. Share wisely, and take care.
Q&A with Kontiki's Eric Armstrong
Excerpted from NewTeeVee Report by Liz Gannes
Kontiki, an early commercial P2P platform provider, announced this week that VeriSign has sold the company back to its investors at MK Capital. The newly independent company's President, Eric Armstrong, agreed to talk about Kontiki's strategy, its experience at VeriSign, which remains a minority investor in the company going forward, and the role of software in video delivery.
NewTeeVee: So should I be congratulating you? Are you happy with this turn of events?
Eric Armstrong: I can't explain how happy we are. When Mike Homer started this company with a few other folks from Netscape, they wanted to do for video what Netscape had done for text and images: create a simple, low-cost way for people to publish media over the Internet and corporate intranets that would democratize media. And I think we've done that. Last month Kontiki systems delivered over 3 million videos.
In terms of new customers, Project Kangaroo, 'the Hulu of the UK,' is going to use us - so they're buying our system for the second time (the first being for BBC's iPlayer).
In the US, we've got tens of major organizations, and we deliver videos worldwide. Now we're going to get back to our roots, making Kontiki P2P software that's very secure and highly manageable.
NewTeeVee: How is that different from what you were doing at VeriSign?
Armstrong: VeriSign bought us because it had a vision that was very aspirational - to build a three-screen business: broadband, mobile and television - and we were the broadband part. When it changed CEOs last year, it changed its vision for the company, and it was very upfront about that. The focus of VeriSign was to use Kontiki for a managed service, like a CDN, while we're really interested in the software.
Some P2P companies are like iTunes in that they sell movies and TV - like Joost, or even BitTorrent. Others, like Pando, are P2P CDNs. We are like none of those companies; we build P2P software that we sell to our customers and then they launch their service.
NewTeeVee: So where are the untapped markets for that business?
Armstrong: Any large media company that wants to monetize its content over the Internet at high scale. Any CDN that doesn't want to launch P2P. A telco or MSO. Or any enterprise company - most of our customers are global businesses that use us to communicate.
NewTeeVee: What are the assets, including people, of Kontiki today?
Armstrong: Our patents, all our technology, all our products - over 90% of the people that joined VeriSign as Kontiki have stayed. In fact, we've gotten bigger since then.
NewTeeVee: Did you sign up any hybrid P2P-CDN customers? What's going to happen to them?
Armstrong: We did sign up some CDN customers. Our customers that were just using the P2P component, we maintained; the ones that were not P2P customers, we found another vendor for them - we went out to our competitors and they took them.
NewTeeVee: What will MK Capital's role be going forward?
Armstrong: They're an investor, they have two members on the Board of Directors. They are heavily interested in the business for the long term.
NewTeeVee: I had seen on the BBC blog that the ratio of web use vs. the Kontiki player is 8:1 - so what will Kontiki's role be for the iPlayer project going forward?
Armstrong: They expect the ratio to skew more toward downloads over time, especially as the quality increases. You can't really stream HD content because the pipes aren't big enough. Plus one thing they haven't implemented is the equivalent of an Internet DVR, where you book the shows you want to download in advance.
Also, some people complained on the iPlayer blog that the Kontiki service does not support Mac, but it does. We've had it available for over a year, but the BBC hasn't yet found a Mac DRM solution, so it hasn't been released to iPlayer users yet.
NewTeeVee: So do you still have a strong relationship with the BBC?
Armstrong: Very strong.
Asian ISPs Prepare for Video Explosion
Excerpted from ZDNet Report by Sol Solomon
Most network infrastructures today may be coping well with the increases in Internet traffic, but any existing overcapacity would very quickly be consumed by the advent of video. The good news is, Internet service providers (ISPs) in the Asia-Pacific region are already preparing for an online video explosion.
Sharat Sinha, Asia-Pacific Director of Service Provider Operations at Cisco Systems, said the pervasiveness of video is already changing the way businesses collaborate, innovate, market their products and services, and interact with customers.
Sinha explained in an interview, "As companies worldwide respond to the exploding need to deliver voice, data, and video in real-time to their end-users, regardless of their location and device, they will need to be constantly vigilant of their core infrastructure, increasing capacity when necessary, and also using new technologies to optimize the traffic flow."
Krishna Baidya, senior industry analyst at Frost & Sullivan, said Internet traffic worldwide grew over 70% in the past year, boosting global bandwidth consumption by more than 45%.
Internet traffic is estimated to contribute up to 75% of used submarine cable capacity, Baidya said.
"Recent reports suggest key activities accounting for this growth were multimedia - mostly video - which grew 40%, community participation that rose 33%, and online games with 14% growth," he said.
"The use of video calls among consumers has also experienced a huge rise," he noted. "Video continues to capture over 60% of P2P traffic. Soon, there will be a shift to high-definition (HD) video, which is almost 10 times more bandwidth intensive."
According to Cisco research, global Internet traffic will nearly double every two years through 2011. Total Web traffic is also projected to quadruple in the four-year period from 2007 to 2011, growing from over 6 exabytes of data per month in 2007 to 29 exabytes per month in 2011. An exabyte is 1 billion gigabytes.
Consumer Internet traffic driven by HD video and broadband adoption will sustain a compound annual growth rate (CAGR) of 46% through 2011, Cisco projected.
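As a rough sanity check on these projections (illustrative arithmetic only, using the figures cited above), compounding a 46% CAGR over the four years lands in the same ballpark as the monthly-traffic numbers Cisco reports:

```python
# Illustrative arithmetic: compound a 46% CAGR over 2007-2011
# from the roughly 6 exabytes/month cited for 2007.

base_eb_per_month = 6.0   # 2007 monthly traffic, exabytes (from the article)
cagr = 0.46               # compound annual growth rate (Cisco projection)
years = 4                 # 2007 -> 2011

projected = base_eb_per_month * (1 + cagr) ** years
print(round(projected, 1))  # roughly 27 EB/month
```

The result of about 27 exabytes per month is broadly consistent with the 29 exabytes per month Cisco cites for 2011; the gap is plausibly down to rounding in the published base figure.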
Last month, an AT&T executive warned that without additional investment, the current Internet network architecture will reach the limits of its capacity by 2010.
However, Kenneth Liew, IDC's Asia-Pacific Senior Market Analyst of Communications, said it was unlikely this would happen in the next few years as ISPs and network owners are constantly upgrading their infrastructure and systems to maintain their level of customer service.
Frost's Baidya agreed, noting that, "Today's Internet has reached its current state of infrastructure mostly due to contributions from private investors. Innovators in the industry took the early steps to develop the infrastructure, and continue to maintain and upgrade, as and when needed."
Liew said, "ISPs and submarine communication cable owners are the first to know their networks are reaching capacity and make their own judgment on whether to upgrade and expand capacity or not. Being the owners of their respective networks, it is their responsibility to ensure there is enough bandwidth and capacity to serve their users."
In 2007, 17 operators signed an agreement to build the first high-bandwidth optical fiber submarine cable system linking Southeast Asia directly to the United States. The Asia-America Gateway (AAG) is expected to cost about $500 million and to be completed by year-end 2008.
P2P & P4P: The Tangled Web We Weave
Excerpted from TechNewsWorld Report by Paul Hartsock
It was Monday morning, and Haiyong Xie was running late. His flight to Los Angeles had been delayed, and then he had to face LA's beastly morning traffic.
Xie, of Yale University, was on his way to the P2P MEDIA SUMMIT LA at the Renaissance Hollywood Hotel to take part in a panel discussion about the P4P Working Group (P4PWG). The group's sponsor, the Distributed Computing Industry Association (DCIA), was also the host of the event.
The panel's other participants, P4PWG Co-Chairs Doug Pasko of Verizon and Laird Popkin of Pando Networks, held their own until Xie arrived. When he did, his late entrance, and the congested traffic that had caused it, conveniently underscored what the P4PWG is about: easing traffic through better management. P4P focuses on the Internet, though, not freeways.
P2P networking decentralizes the exchange of data. Instead of an application retrieving data (for example, a large movie file) from a single, all-knowing server located deep in the bowels of some online entertainment distribution enterprise, a P2P app will connect to dozens, hundreds, or thousands of computers across the Internet to simultaneously download bits of the file it doesn't have and upload bits of the file it's already acquired to other users.
Proponents of the technology say it prevents a single server's pipeline from getting clogged with too much data at times of high demand. Even rare data files that no longer exist on any centralized server might live on, constantly replicating on P2P networks.
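The chunk-exchange model described above can be sketched in a few lines. This is a deliberately simplified illustration (real clients such as BitTorrent add piece selection strategies, hashing, and network transport), showing only the core idea that peers with complementary pieces can complete a file with no central server:

```python
# Minimal, illustrative sketch of P2P chunk exchange (not a real client):
# each peer tracks which chunks of a file it holds, downloads missing
# chunks from other peers, and serves the chunks it already has.

class Peer:
    def __init__(self, name, total_chunks, have=None):
        self.name = name
        self.total_chunks = total_chunks
        self.chunks = dict(have or {})  # chunk index -> chunk bytes

    def missing(self):
        return [i for i in range(self.total_chunks) if i not in self.chunks]

    def request_from(self, other):
        """Download any chunks the other peer has that we lack."""
        for i in self.missing():
            if i in other.chunks:
                self.chunks[i] = other.chunks[i]

    def complete(self):
        return len(self.chunks) == self.total_chunks

# Two peers holding complementary halves of a 4-chunk file
# complete each other without any central server.
a = Peer("A", 4, {0: b"c0", 1: b"c1"})
b = Peer("B", 4, {2: b"c2", 3: b"c3"})
a.request_from(b)
b.request_from(a)
print(a.complete(), b.complete())  # True True
```

Scaling this up, every downloader is simultaneously an uploader, which is why demand spikes add capacity to a P2P swarm rather than overloading a single origin server.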
Even though video-streaming sites like YouTube are taking up an increasingly large portion of Internet traffic, P2P traffic remains a substantial portion of all web activity - 37%, according to an Ellacoya Networks study from last June.
Since Napster's heyday in 1999, P2P file sharing has continued to be the pirate's tool of choice for obtaining and distributing copyrighted material. However, legitimate organizations are warming to the concept of P2P as well. Above-the-table operations like Joost, Skype, and Microsoft's Threedegrees use the technology. And the DCIA, which promotes commercial development of P2P, counts Cisco, VeriSign, and AT&T among its Member companies.
Later on Monday, at another panel discussion at the same DCIA conference, the argument was made that large entertainment companies that distribute video online will need to embrace P2P technology more firmly in the face of growing demand for high-bandwidth content like high-definition (HD) movies and TV shows. The decrease in prices for ever-wider bandwidth is not enough to address the problem, contended Michael King, CEO of Abacast.
"Broadband prices are dropping, but the drop is slowing down," he commented. "Demand for high-quality content is speeding up. So the argument that nothing's needed is wrong."
Naive P2P has shortcomings of its own. In choosing which fellow sharers to hook up with, a P2P application will sometimes make connections across networks, across ISPs and across the world. The result is a tangled web of connections that do the job of sharing data but not always in the most efficient way, at least not from a typical ISP's point of view. These sorts of unregulated hook-ups are part of what led to Comcast's need to throttle back P2P traffic.
This tangle is what P4P intends to address. The system introduces an element of logic to a P2P network that directs user traffic to first share with neighboring computers on the same network or in the same region before attempting to hook up with computers that are farther away and necessitate connecting with other networks. It's distributed computing's answer to the locavore diet. Its intention is to raise efficiency and lower costs for the ISP.
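The locality preference described above can be illustrated with a simple peer-ranking function. This is only a sketch of the general idea, not the actual P4P protocol (which exchanges network-topology hints between ISPs and P2P trackers); the ISP and region labels here are hypothetical:

```python
# Illustrative sketch of locality-aware peer selection (not the actual
# P4P protocol): order candidate peers so same-network peers are tried
# first, then same-region peers, then everyone else.

def rank_peers(local, candidates):
    """Sort peers: same ISP first, then same region, then external."""
    def locality(peer):
        if peer["isp"] == local["isp"]:
            return 0  # internal traffic: cheapest for the ISP
        if peer["region"] == local["region"]:
            return 1  # regional traffic: stays relatively close
        return 2      # external traffic: crosses other networks
    return sorted(candidates, key=locality)

me = {"isp": "ExampleNet", "region": "us-east"}   # hypothetical labels
peers = [
    {"id": "p1", "isp": "OtherNet",   "region": "eu-west"},
    {"id": "p2", "isp": "ExampleNet", "region": "us-east"},
    {"id": "p3", "isp": "OtherNet",   "region": "us-east"},
]
ordered = rank_peers(me, peers)
# ordered ids: p2 (internal), p3 (same region), p1 (external)
```

Note that ranking only reorders the candidates; as the panelists emphasized, distant peers remain available when closer ones cannot provide the needed data.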
The panel said that tests run with Verizon yielded promising results. Whereas 98% of the traffic on a "naive P2P" network was external, implementing P4P cut external traffic to 50%. Under P4P, 29% was local, and 21% of traffic was internal, they said.
With all the sunshine the DCIA is pouring on P4P, one must wonder what the downside is or what controversy might be involved. The panel members appeared to anticipate such questions - they even worked a few "myths" about P4P into their presentation.
The question of privacy was raised: How much will the management applications know about who a sharer is and the exact nature of the data transmitted and received? According to Pasko, it will be quite minimal. "A little bit of information goes a long way."
A section of Pando's site that explains the P4PWG states that the group's goals include promoting ways to improve P2P "while protecting the intellectual property (IP) of participating entities."
P4P is also a way for ISPs to avoid a head-on collision with net neutrality legislation that would prevent them from managing traffic at all.
Comcast VP Kathryn Zachem said in a filing with the Federal Communications Commission (FCC) last month, following a P4P test conducted by Pando, that it "provides further proof that policymakers have been right to rely on marketplace forces, rather than government regulation, to govern the evolution of Internet services."
The government regulations to which she's alluding - if, hypothetically, they're enacted - may well forbid the very kind of network management P4P represents. Apparently, Comcast would rather have a little network management control than further risk provoking a law that would guarantee it none at all.
How much control does P4P entail? At the DCIA panel, an audience member asked a question regarding peer preference. If User A finds he can connect with User B at a fairly fast rate, but she resides on a different, far-away network, will P4P force User A to connect instead with User C, who is on a closer but far slower connection?
Popkin's answer was that using one would not prevent you from using the other; you'd just connect to the closer one first. "We're not going to force you on a communication path that's not going to provide the service," added Pasko.
Relieving the OTT Burden & Improving Online Video
Oversi this week unveiled its multi-service platform (MSP) for managing and monetizing over-the-top (OTT) video. Until now, service providers have been caught between the constraints of their networks and the demands of their customers.
OverCache MSP relieves the ever-increasing burden of online video, P2P, and other media applications on service provider networks while providing the best quality of experience (QoE) for users.
Eitan Efron, Oversi's Co-Founder and VP of Marketing and Business Development, said, "OTT video is growing exponentially, consuming a significant amount of service providers' network resources and jeopardizing their existing business models. Serving large quantities of OTT video, including high-definition (HD), to numerous concurrent users creates a huge problem for service providers, unless you intelligently push the content to the ends. It's a matter of network necessity and quality of service (QoS)."
To meet this challenge, Oversi has launched its OverCache MSP for caching and content delivery which benefits service providers in three key ways: 1) relieving the network load without compromising the customer experience; 2) accelerating user performance, enabling the service provider to offer tiered services and increase average revenue per user (ARPU); and 3) opening up new monetization opportunities with content providers by guaranteeing customers' QoE and facilitating local advertising insertions.
OTT video is derived from external sources, passing through the operator's network continuously at a high cost to the service provider. By deploying OverCache MSP as close as possible to the user, service providers benefit from bandwidth and capital expense savings.
The MSP platform detects OTT video intelligently, caches it and delivers it on demand to the users. As the content is delivered locally, customers enjoy a far better viewing experience with faster download times and no risk of service interruptions.
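The detect-cache-deliver workflow described above is, at its core, edge caching: repeat requests for popular content are served locally, and the operator's network is crossed only on a cache miss. The following is a hypothetical sketch of that idea, not Oversi's implementation:

```python
# Illustrative sketch of edge caching for OTT video (hypothetical;
# not Oversi's OverCache implementation): serve repeat requests from
# a local store, fetching from the origin only on a cache miss.

class EdgeCache:
    def __init__(self, fetch_from_origin):
        self.fetch_from_origin = fetch_from_origin  # crosses the operator's network
        self.store = {}
        self.origin_fetches = 0

    def get(self, url):
        if url not in self.store:
            # Cache miss: traverse the network to the external source once.
            self.store[url] = self.fetch_from_origin(url)
            self.origin_fetches += 1
        # Cache hit: delivered locally, close to the user.
        return self.store[url]

cache = EdgeCache(lambda url: f"<video bytes for {url}>")
for _ in range(1000):  # 1,000 viewers requesting the same popular clip
    cache.get("http://example.com/popular-clip")
print(cache.origin_fetches)  # 1
```

Here a thousand viewers generate a single upstream fetch, which is the mechanism behind the bandwidth savings and faster download times the article describes.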
David Tolub, President & CEO of Oversi, said, "Users' satisfaction with their service provider is directly linked to the quality of their online video experience. Our customers have reported a dramatic drop in support calls to their call centers since the MSP has been in operation. Average user Internet video download times have been reduced from 35 seconds to eight seconds, and the system has generated bandwidth savings across the network and over the interconnect links."
OverCache MSP enables service providers to offer tiered services relating to the Internet viewing experience. Tolub continued, "Our customers are asking to relate the performance of our system to their broadband packages, enabling them to increase ARPU significantly. By actually 'feeling' the difference in experience, customers can be motivated to purchase and upgrade to higher broadband packages."
In addition, OverCache MSP can provide assured QoE for specific content and support local advertising insertion, allowing service providers to gain new revenue sources directly from content providers interested in delivering broadcast-quality video for Internet content. This can only occur with OTT caching and delivery platforms located close to the users.
Oversi will be presenting its OverCache MSP solution at the CableNET Pavilion, The Cable Show, in New Orleans, LA May 18th-20th in Booth No: 3735 and at CommunicAsia, Singapore June 17th-20th at Stand 3K2-07.
Oversi is an active member of the P4P Working Group (P4PWG), sponsored by the Distributed Computing Industry Association (DCIA).
DCIA Takes Reins of P2P Best Practices
Excerpted from Cable Digital News Report by Jeff Baumgartner
The Distributed Computing Industry Association (DCIA) has kicked off a P2P Best Practices initiative that will supersede the P2P Bill of Rights & Responsibilities (BRR) effort that Comcast and Pando Networks introduced in mid-April.
The DCIA-led project aims to pick up where the BRR project left off and "broaden the scope of this endeavor" (See DCIA Takes P2P Reins and Comcast, Pando Crafting 'P2P Bill of Rights').
"It was an excellent idea suggested by Pando and Comcast. We met with them and discussed the concept and its potential," DCIA CEO Marty Lafferty told Cable Digital News.
"The Comcast-Pando concept is the basis of the work that is now proceeding." Comcast and Pando unveiled the BRR project after Comcast pledged that it will migrate to a "protocol-agnostic" platform by year's end.
Meanwhile, the "fair use" scheme introduced by Camiant on Monday offers a glimpse into how cable operators might manage bandwidth consumption down the road. (See Camiant Intros 'Fair Use' Bandwidth System.)
Lafferty acknowledges that the new project is in the early stages, with the DCIA now actively seeking participation from ISPs, P2P companies, motion picture studios, and other content providers.
Comcast is already on board, and AT&T and Verizon Communications are backing the new P2P Best Practices project, Lafferty says. He adds that other MSOs are interested as well.
From the P2P world, Pando and Kontiki have also agreed to join the DCIA-led venture.
According to Lafferty, the Motion Picture Association of America (MPAA) and the Recording Industry Association of America (RIAA) also have expressed interest and are conducting discussions about prospective member companies participating.
Lafferty says the DCIA will remain in heavy "recruitment mode" between now and a formative working-group meeting slated for May 20th in New York, in conjunction with the Streaming Media East show.
The goal, he says, is to have a working group formed by June and to finish a document outlining P2P best practices "well before the end of the year."
The DCIA also expects to seek involvement from consumer advocacy groups, including Free Press and Public Knowledge.
The DCIA noted that the newly launched P2P Best Practices project differs from its P4P Working Group (P4PWG), in that the P4PWG is centered not on business practices, but on technologies for optimizing P2P traffic across ISP networks.
The original concept of bringing together a coalition of ISPs and P2P companies hasn't changed, added Comcast. "DCIA is a great forum to lead the initiative," says Comcast spokesman Charlie Douglas. "We fully support the DCIA's effort and look forward to the next steps."
PeerApp Finalist in MITX Technology Awards
PeerApp, the innovator in Intelligent Media Caching, this week announced that it has been named a finalist in the 5th Annual MITX Technology Awards. The MITX What's Next Forum & Technology Awards, sponsored annually by the Massachusetts Innovation & Technology Exchange (MITX), recognize innovative digital technologies developed in the New England area, as well as the individuals and organizations responsible for these advancements.
Based on patented P2P caching and acceleration technology, PeerApp's UltraBand products help Internet service providers (ISPs) alleviate network-congestion problems caused by the flood of video files coming over P2P networks and from streaming video websites such as YouTube and MySpace. ISPs can satisfy subscriber demand for video, ensure a consistently great customer experience, and deliver the content at wire speeds -- without incurring massive network upgrades.
"This recognition further validates the need for creative new approaches that make the Internet infrastructure and business models ready for video," said Alan Arolovitch, PeerApp's Chief Technology Officer (CTO).
"All the video being created and distributed over new platforms is wasted if it can't be delivered to consumers with great quality of experience (QoE). Our technology benefits the entire Video Internet value chain: ISPs, content owners, content delivery networks, and consumers."
PeerApp will be recognized with the other finalists in the Video category at an awards ceremony attended by the region's top technology and business professionals. Winners will be announced at the ceremony at the Royal Sonesta Hotel in Boston, MA on June 3rd. This year's keynote speaker and special honoree is Dr. Amar Bose, Chairman & Founder of Bose Corporation.
NCTA and Neutral Nets
Excerpted from Cable 360 Report by Jonathan Tombes
Battle lines over technology and policy sharpened as partisans joined in a "net neutrality" debate before a House subcommittee this week. The real action, however, appears to be a deepening of new, cross-industry alliances and technical solutions.
National Cable & Telecommunications Association (NCTA) President & CEO Kyle McSlarrow's testimony to the Subcommittee on Telecommunications and the Internet underscored the cable industry's support for certain initiatives, such as the Broadband Census of America Act, but weighed in against the Internet Freedom Preservation Act, the focus of the hearing.
"We strongly believe that a 'net neutrality' mandate or government intervention in the operation of networks is unnecessary and would undermine the goals of broadband deployment and adoption," McSlarrow said. (For a similar view, see this month's Reality Check column).
The cable industry's commitment to those deployment and adoption goals and the government's heretofore successful hands-off policy when it comes to broadband network management were the first and third points of McSlarrow's testimony.
The centerpiece of his statement was a defense of "reasonable network optimization techniques," which include collaborative efforts with the P4P Working Group (P4PWG) that have been the subject of recent reports, such as Reflections on Comcast-BitTorrent.
McSlarrow also mentioned the Distributed Computing Industry Association's (DCIA) launch of a P2P Best Practices initiative through the formation of a new dedicated working group, which includes Comcast, Time Warner Cable, Cox, Charter, Suddenlink, Bend Broadband, CableLabs, AT&T, and Verizon, as well as P2P service providers and content owners. McSlarrow said that the group would form by June and complete its work by the end of the year.
Technology companies that cater to the network service providers are likewise engaged in helping them to balance competing regulatory and network management challenges.
In a discussion of a new fair-use management application, Camiant VP Business Development Randy Fuller argued on behalf of a full set of tools. "P4P will be very useful to the cable companies, but by itself is not going to completely solve network management," he said.
Stellar Start-Up: P2P under the Radar
Excerpted from Jerusalem Post Report by David Shamah
Humans have a need to communicate; no man is an island. Put a person in a situation where he or she is isolated or cut off from others, and finding a way to reach and talk to people quickly becomes a priority.
For Professor Roy Friedman of the Technion, one of those "lonely moments" took place as he was visiting a friend in France. As a computer expert, an associate professor of computing at the Technion, author of many scientific papers, and initiator of several important projects, it never occurred to Friedman that he would end up having to exchange family photos between his and a friend's laptop using, of all things, a "sneakernet."
The laptops should have connected over a LAN (wired or wireless), but the two computers simply couldn't get past the membership requirements of the local network's router.
"Instead, we used a USB stick to move the photos," he says. "And it was very annoying, too, because there were too many to fit on the stick for one 'shipment,' so we had to swap the files and the stick several times."
But, where there is a communication will, there is a way - and the incident helped inspire Friedman to build WiPeer, the first general release application that allows computers to communicate with each other directly, without using an access point like a router. That type of network, called an ad-hoc P2P network, lets users form a small local network directly among the computers involved, initiated by one or more of the computers participating. It's sort of like an IM system without a central server moving the messages between users over the Internet.
While any computer with a network card, wired or wireless, is technically capable of participating in ad hoc P2P networks, few users even know the possibility exists.
So, Friedman and a team of graduate students he works with wrote WiPeer, which automatically configures, sets up, and discovers others on a network who have set up WiPeer ad hoc networks on their computers.
Ideally, you would use WiPeer over your WiFi connection, in an area where there is no WiFi network (router) handling the IP traffic, or where you cannot join an existing network. WiPeer creates the network among the participating computers (currently it works only with Windows XP, but Vista and Linux versions are in development), bypassing the router - or allowing communications where there is no router-based network.
"If you're using a router-based network connected to the Internet, especially in WiFi situations, your overall local network speed is generally limited to the speed that the router is connecting to the Internet," Friedman says - and that speed is dependent on many factors external to your local network. If you avoid the Internet, using WiPeer to build a home network, though, you're in control of the speed - and you're all but guaranteed of getting the full 54 mbit throughput promised by 801.11g wireless networks (the application also utilizes WEP encryption to ensure security on the ad hoc network).
WiPeer, in other words, gives you options: blazing fast connections on a LAN, and the ability to switch over to your "regular" WiFi connection when you need the Internet (i.e. you don't have to change your network configuration settings in order to switch back and forth). You can connect to computers on the WiPeer network just like you can on a LAN, transferring files or even playing network games (WiPeer supplies several games, in fact).
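Serverless peer discovery of the kind WiPeer automates can be sketched in a few lines. The snippet below is a hypothetical illustration, not WiPeer's actual protocol: one machine broadcasts a UDP "I am here" datagram on the local network segment, and any listener on the same segment learns the sender's name and address - no router services and no central server involved.

```python
import json
import socket

DISCOVERY_PORT = 48000  # hypothetical port chosen for this sketch


def announce(name, addr="<broadcast>", port=DISCOVERY_PORT):
    """Send a one-shot 'I am here' datagram to the local segment."""
    msg = json.dumps({"peer": name}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(msg, (addr, port))


def listen_once(port=DISCOVERY_PORT, timeout=2.0):
    """Wait for a single announcement; return (peer_name, sender_ip)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        s.bind(("", port))
        s.settimeout(timeout)
        data, (ip, _) = s.recvfrom(1024)
        return json.loads(data.decode())["peer"], ip
```

A real application would repeat the announcement periodically and keep a roster of peers heard from recently, which is essentially what automatic discovery amounts to.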
The application is free (for now), and Friedman and his team welcome user contributions - as was the case with the "Peersonalizer" Facebook application, written by college students using WiPeer. Peersonalizer checks a user's Facebook connections and seeks them out when WiPeer is running, to see if any are in range; if they are, the program prompts the users, letting them know when WiPeer-using buddies are in the neighborhood.
Eventually, says Friedman, the project will be ported over to handheld devices - like iPhones and other cell-phones or PDAs, which would greatly enhance the ability of these devices to connect with computers for file sharing or communication.
MetaASO: A Bootstrapped P2P Start-Up from India
Excerpted from ReadWriteWeb Report by Bernard Lunn
Innovation is going global, particularly innovation from India, and P2P is the next great disruptive technology - the only one that could derail the Google steamroller. So it is no wonder that MetaASO is featured on Pluggd.in, a site that tracks Indian start-ups.
MetaASO is a self-funded, bootstrapped start-up that claims north of $1 million in revenue. In fact, being self-funded means it's very likely profitable.
MetaASO is the name of the company; Mermaid is the name of its product suite. MetaASO started in October 2002, and its first release took place 1.5 years ago as a limited-circle beta. The full public beta, release 2, happened a few weeks ago. There are 5 founders, and the engineering team is 20 people.
Besides giving away software for free, MetaASO also develops custom P2P software for organizations. And that typically costs around $400,000 per software application. MetaASO has three enterprise customers, and also makes money from ads on its software. MetaASO's emphasis is on product development.
This is smart self-funding. The company learns a lot from each enterprise job as well as getting cash. This is the classic "3 custom jobs to a product, iterating, and generalizing on each project" that the enterprise software business has been built on for decades.
Mermaid software can be used on LANs (e.g., at the office, on campus, etc.) without any Internet connection. To use it on the Internet, users require a "globally routable IPv6 address" for their computers.
An Internet connection speed of at least 256Kbps is recommended for all audio/video-based applications, but nowadays it's best to get a 512Kbps or 1Mbps connection if you are uploading video.
An "enterprise first" strategy makes sense for MetaASO. The big message is "no servers needed." That's right, no supernodes, no nothing. Reak P2P. You can become your very own virtual TV station. Sounds like YouTube - except you don't need a server farm costing gazillions.
This is similar to the start-up Faroo from Germany that is also causing excitement.
How Much P2P File-Sharing Traffic Travels the Net
Excerpted from Wired News Report by Ryan Singel
How much of the traffic on the Internet is P2P file trading?
Everyone seems to agree it represents a lot of the traffic, but the truth is no one knows (with the possible exception of the ISPs and backbone providers in the middle, and they aren't telling or sharing raw data).
One of the most recent reports on P2P traffic came from a traffic optimization firm called Ellacoya in June 2007. Their report said that HTTP-based web traffic had overtaken P2P traffic on the net, thanks to streaming media sites like YouTube.
Ellacoya pegged HTTP traffic at 46% of the net's volume, with P2P traffic close-by at 37%. The company says the data was based on about 1 million North American broadband subscribers.
Independent Internet researchers, including KC Claffy of the Cooperative Association for Internet Data Analysis, ran their own tests in 2003 and 2004 - following conflicting reports that file sharing was decreasing and increasing.
Using data from an Internet backbone link in San Jose, CA, the researchers found that P2P traffic was steady, if not increasing. For instance, BitTorrent grew some 100% in popularity from 2003 to 2004, but the researchers found that it was getting harder to track P2P bits, since P2P traffic was increasingly using encryption and random ports, making it harder to quickly identify the application that a packet was coming from.
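The identification problem the researchers describe is easy to see with a toy port-based classifier - the method early traffic studies relied on. This is an illustrative sketch (the port table is a simplified assumption, not a complete or authoritative mapping): the moment a P2P client switches to a random port, the lookup falls through to "unknown."

```python
# Simplified well-known-port table (an assumption for illustration only).
PORT_HINTS = {
    80: "http",
    443: "https",
    4662: "edonkey",
    6346: "gnutella",
    6881: "bittorrent",
}


def classify(src_port, dst_port):
    """Guess the application from the ports alone, as early studies did."""
    return PORT_HINTS.get(dst_port) or PORT_HINTS.get(src_port) or "unknown"


# A BitTorrent client on its default port is spotted...
print(classify(51202, 6881))   # bittorrent
# ...but the same client on a random port is invisible to this method,
# and encryption defeats payload inspection as a fallback.
print(classify(51202, 23947))  # unknown
```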
The last time Sprint published an analysis of 30 large Internet links (January 2005), it found that file sharing accounted for less than 6% of the packets in the tube, with regular web traffic clocking in at more than 50% of the flow.
Speaking at a Supernova conference last July, Claffy expressed confusion at how the government can have a public policy debate about network management when no one except the network operators knows anything about traffic on the Net.
"NBC filed something with the FCC using the CacheLogic Study - done a year after the Pew Internet Study saying that file sharing was dropping and our study showing file sharing was increasing," she said. "And the CacheLogic Study just came out with a number - no trends, just that file sharing was 30-50% of traffic."
Ipoque, a P2P traffic management firm, released its own study of Internet traffic in 2007, focusing on Germany, Australia, Eastern Europe, and Southern Europe.
According to its report, P2P traffic accounted for between 49% and 83% of Internet traffic in these regions. Using deep packet inspection (DPI) techniques, the company says it could identify the types of files being traded, as well as hashes that pinpointed individual files.
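Pinpointing individual files by hash rests on a simple property: a cryptographic digest of a file's bytes is the same wherever the file travels and whatever it is renamed to. A minimal sketch of the idea (using SHA-1, a common choice in P2P protocols of the period; Ipoque's exact method isn't public, so this is an assumption for illustration):

```python
import hashlib


def content_id(data: bytes) -> str:
    """Return a hex digest that identifies the content, not the filename."""
    return hashlib.sha1(data).hexdigest()


song_a = b"...audio bytes of one track..."
song_b = b"...audio bytes of another..."

# The same bytes always produce the same ID, under any filename:
assert content_id(song_a) == content_id(song_a)
# Different content yields a different ID:
assert content_id(song_a) != content_id(song_b)
```

A monitor that sees the same digest on thousands of flows can count downloads of one specific file, which is how a single track can top a regional popularity chart.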
For instance in the Middle East, the most popular BitTorrent Audio download was Beyonce's "Listen," according to Ipoque.
The study is unlikely to please Internet scientists, since the data set is not public nor is there much discussion of how the numbers were arrived at.
All of the data out there is suspect, but the information is vital. In Washington, DC, Congress is once again considering legislating rules for ISPs, while the five-member Federal Communications Commission (FCC) is publicly wringing its hands over whether to adopt stricter net neutrality guidelines.
We would love to know if good measurements of P2P traffic are out there or if, indeed, the debate over net neutrality is taking place without the slightest bit of verifiable data.
MPAA: Ending Infringing Downloads is Up to P2Ps
Excerpted from Home Media Magazine Report by Chris Tribbey
Fritz Attaway, EVP of the Motion Picture Association of America (MPAA), heard from P2P network operators such as BitTorrent, LimeWire, and Vuze May 5th in Los Angeles.
"This industry has to decide where its future lies. If it lies in the distribution of licensed content, then you need to discourage access to unauthorized content," he said.
The third annual P2P MEDIA SUMMIT LA brought together P2P operators, Internet service providers (ISPs), and content companies to wrestle over how Hollywood can monetize a medium that many people long have been using for free.
"This is not a new thing with us, adopting technology," said Derek Broes, SVP of Digital Entertainment for Paramount Pictures. "It's the association with copyright infringement that we despise. Our interest in P2P is that everyone learn to 'play nice.'"
Paramount has certainly tried - and succeeded - when it comes to digital distribution. Last year's 'Jackass 2.5' was available first as an online, free streaming video, and then as a for-pay digital download, VOD, and DVD. That was a Hollywood first.
"After it was available for free, it did extremely well in for-pay formats," Broes said. "That's a great example of what you can do when you preview a property to consumers and then give them the opportunity to enjoy it in more traditional formats."
LimeWire, one of the more popular P2P networks, is making headway in the ad-supported download realm.
"And we think this is a global solution," said LimeWire CEO George Searle.
Still, others argued that the studios are as much responsible as the P2P networks when it comes to convincing consumers to choose licensed content over unauthorized choices.
"If I search for 'Iron Man,' what type of experience is needed to motivate me to click on the licensed results?" asked Joey Patuleia, Co-Founder of Brand Asset Digital, which markets and brands digitally distributed content.
"The studios need to understand the lifestyles of these consumers and provide a more value-added experience."
Ted Cohen, Managing Partner of TAG Strategic and former SVP of Digital Development for EMI Music, said, "It's a tough period right now. But everyone's got to make deals, long-term, if we're going to get through this awkward transition from physical distribution to digital."
P2P: Understanding the File-Sharing Audience
Excerpted from ClickZ News Report by Michael Miraflor
Agency personnel need to become familiar with the P2P space as a marketing channel to take advantage of user behavior in these environments to distribute branded and promotional content. Marketers must understand the type of core consumer being reached on P2P platforms.
The exchange of digital files over P2P platforms is no longer fringe behavior. The Electronic Frontier Foundation (EFF) recently cited a study estimating that more than one in three desktops worldwide have the popular P2P application LimeWire installed. The act of downloading digital content over P2P networks has reached mass adoption.
What type of consumer is likely to engage in P2P behavior? It is an environment dominated by an 18-34 male demographic. There are many personality types, from geeks who frequently debate what the superior P2P platform is (most agree it is BitTorrent) to creative-class types searching for indie movies not yet available via their Netflix accounts.
Here are some additional observations: as diverse as the group is, clearly they are all active consumers who are used to an on-demand lifestyle; they are culturally in touch and use the Internet as their primary source of information - P2P forums are not immune to conversation about the '08 election; there is plenty of discussion on the latest gadgets, home theater setups, and car accessories - people frequently exchange links to online deals and discounts for everything, ranging from automobiles to televisions to bulk packages of chewing gum.
Quite a few P2P users are Mac addicts; gaming is a part of daily life - seemingly everyone has an Xbox LIVE account; there is constant discussion about ARG type marketing campaigns; P2P users often express brand, network, and TV show loyalty; very few express a counter-culture attitude to taking down "the man" by knowingly downloading and sharing unauthorized content.
They want to have the ability to consume content when and where they please - many express that they have grown frustrated with the DRM attached to files downloaded over mainstream services such as iTunes; many people express frustration with the RIAA, music labels, studios, and networks for not having their act together and for punishing consumers for being fans of their content.
Quality is paramount - in terms of video, most would only download a high-quality copy of a screener or an actual DVD rip - theater camera footage is only acceptable when no screener copy or DVD rip is available; many express the need to acquire a new piece of content not only for their private use but also to share with friends.
RightsFlow Launches Outsourced Music Licensing
RightsFlow - a music, media, and entertainment-focused professional services firm - this week announced the launch of its Outsourced Music Licensing Solutions.
RightsFlow offers outsourced music publishing licensing and royalty systems to record labels, distributors, online music retailers, and any other company engaged in distribution and sale of recorded music.
These services provide a cost-effective end-to-end solution for publishing licensing, including copyright owner research, obtaining and administering publishing licenses, royalty calculation, and accounting to publishers.
RightsFlow's services relieve the burden of publishing licensing and royalty administration from record labels, distributors, retailers, and others, and allow such companies to save time and effort, reduce overhead costs, ensure that content is being distributed lawfully, and get more product to market.
RightsFlow's licensing expertise covers the breadth of the US and European markets and licensing practices. These services are a particular boon for labels and distributors based outside of the US that would like to distribute content for sale in the US but are unable to license from US publishers.
"We find there is a tremendous need when distributing music for outsourced licensing and we are in position to meet the market demand", said RightsFlow President & CEO Patrick Sullivan.
Coming Events of Interest
Streaming Media East – May 20th-21st in New York, NY. SME is the place to learn what is taking place with all forms of online video business models and technology. Content owners, viral video creators, online marketers, enterprise corporations, broadcast professionals, ad agencies, educators, and others attend. The DCIA will participate in the P2P session.
Advertising 2.0 New York - June 4th-5th in New York, NY. A new kind of event being developed as a partnership of Advertising Age and Digital Hollywood. The DCIA is fully supporting this important inaugural effort and encourages DCINFO readers to plan now to attend.
P2P MEDIA SUMMIT SV - August 4th in San Jose, CA. The first-ever P2P MEDIA SUMMIT in Silicon Valley. Featuring keynotes from industry-leading P2P and social network operators; tracks on policy, technology and marketing; panel discussions covering content distribution and solutions development; valuable workshops; networking opportunities; and more.
Building Blocks 2008 - August 5th-7th in San Jose, CA. The premier event for transforming entertainment, consumer electronics, social media & web application technologies & the global communications network: TV, cable, telco, consumer electronics, mobile, broadband, search, games and the digital home.
International Broadcasting Convention - September 11th-16th in Amsterdam, Holland. IBC is committed to providing the world's best event for everyone involved in the creation, management, and delivery of content for the entertainment industry. Uniquely, the key executives and committees who control the convention are drawn from the industry, bringing with them experience and expertise in all aspects.