Distributed Computing Industry
Weekly Newsletter

In This Issue

P2P Safety

P2PTV Guide

P2P Networking

Industry News

Data Bank

Techno Features


August 4, 2008
Volume XXIII, Issue 1

Kontiki Awarded Patent for Distributed Content Delivery

Kontiki, the leading provider of managed peer-assisted delivery for high-quality video and digital content, this week announced that the United States Patent and Trademark Office (USPTO) has granted Kontiki its fourth patent in the area of distributed content delivery across a computer network.

This patent (USPTO #7,398,301, "Method and apparatus for facilitating distributed delivery of content across a computer network") is a fundamental component of the company's content delivery system, enabling distribution of high-quality video and digital content to enterprise customers and entertainment consumers.

As an early inventor and continued leader in the commercial peer-to-peer (P2P) content delivery space, Kontiki remains focused on leveraging large distributed computer networks to more efficiently deliver high-quality video and digital content.

Unlike traditional delivery systems, which pull information from centralized servers, Kontiki utilizes the large number of distributed machines in a network to speed delivery.

By accessing and receiving content from multiple client sources, Kontiki can deliver content faster, more efficiently, and at lower cost. The newly awarded patent further validates Kontiki's early pioneering release of its peer-assisted delivery technology.

"Kontiki's deep understanding of network computing environments and peer-assisted technology has helped us achieve great success over the years, establishing us as a key player in the content delivery space," said Eric Armstrong, President, Kontiki.

"The acknowledgment from the US Patent and Trademark Office is recognition of our early leadership in the market and positions us favorably for technology development and innovation in the future."

At the core of the company's development is the Kontiki Delivery Management System (DMS), a patented peer-assisted content delivery technology that enables the distribution of high-quality video and other rich media securely and efficiently at very large scale.

Built on an award-winning peer-assisted content delivery and management platform, Kontiki is used by Fortune 500 enterprises and consumer media giants for its massive scalability and network efficiency.

Kontiki President Eric Armstrong will deliver the opening keynote address at the P2P MEDIA SUMMIT Silicon Valley on Monday, August 4th.

Octoshape Brings Live P2P Streaming to Millions

Octoshape, a leading provider of high-quality streaming solutions, and CDNetworks, a top-three global, full-service content delivery network (CDN), have joined forces to deliver affordable live streaming for millions of concurrent viewers.

The hybrid peer-to-peer (P2P) solution, resulting from the integration of CDNetworks' enterprise network and Octoshape's peer platform, is the first solution of its kind that is commercially available and tested for Internet broadcasting of large scale events.

Available immediately, the Media Live Streaming service enables massively scalable live audio and video streaming that ensures high-quality end-user experiences while significantly cutting costs for broadcasters and content owners.

The combined Octoshape and CDNetworks service enables unprecedented real-time online delivery of large-scale live events, such as concerts, sporting events, newscasts, and other events attracting hundreds of thousands to millions of viewers.

Until this partnership, reliably distributing content, especially live content, to a worldwide audience over the Internet would have been technically and financially untenable.

Media Live Streaming makes large-scale distribution of content such as live streams affordable, in line with traditional television broadcast, benefiting media service providers including TV broadcasters, cable television programmers, and Internet video producers and publishers.

The new service uses Octoshape's powerful state-of-the-art hybrid-P2P transport technology, which secures optimal transmission routes by utilizing a special communication protocol to ensure smooth transmission in diverse user network environments.

Media Live Streaming also uses a high-quality transport mechanism that eliminates buffering and enables stable transmission of high-quality content even over great network distances.

"The combined force of Octoshape and CDNetworks makes a powerful mix of leading-edge technology and global infrastructure," said Stephen Alstrup, CEO of Octoshape. "We are excited to be working with CDNetworks on a global scale to transform the way live content is delivered."

Stephen Alstrup will deliver a keynote address at the P2P MEDIA SUMMIT Silicon Valley on Monday August 4th.

Report from CEO Marty Lafferty

If you choose just one conference to attend this summer to stay current on the latest developments affecting Internet service providers (ISPs), content delivery networks (CDNs), and peer-to-peer companies (P2Ps), make it this one: P2P MEDIA SUMMIT Silicon Valley.

This inaugural Bay Area DCIA event features industry-leading speakers from all over the world discussing the latest regulatory actions, technological breakthroughs, groundbreaking deals, new business models, and ongoing case studies.

The day-long conference takes place this Monday August 4th at the San Jose Marriott. There will be a continental breakfast, conference luncheon, VIP networking reception, and more!

Special registration rates can still save you over $100 when you also sign up for Building Blocks, jointly produced by the Consumer Electronics Association (CEA) and Digital Hollywood. Building Blocks is at the same venue on August 5th through 7th.

Topics will include P2P and network management, P2P and content delivery systems, P2P and marketing, and P2P and content licensing.

For more information and specially discounted passes, please call or e-mail Sari at 410-476-7964 or sari@dcia.info.

At this first-ever P2P MEDIA SUMMIT in the Bay Area, attendees will have opportunities to personally meet Abacast's Michael King, Andolis' Stephane Roulland, Bingham's Joshua Wattles, Brand Asset Digital's Joey Patuleia, CloudShield's Peter Jungk, Creative Commons' Mike Linksvayer, Digicorp's Jay Rifkin, Double V3's Martine Groulx, EM Syndication's Laura Tunberg, Getback Media's Chris Dominguez, GridNetworks' Jeffrey Payne, Hiro-Media's Daniel Leon, Ignite Technologies' Fabian Gordon, Jambo Media's Rob Manoff, Kontiki's Eric Armstrong, Leaf Networks' Jeff Capone, LimeWire's Brian Dick, mBit's Chunyan See, MediaDefender's Chris Gillis, Microsoft's Ravi Rao, Motorola's John Waclawsky, Octoshape's Stephen Alstrup, Ono's Fabian Bustamante, Oversi's Shmuel Bachinsky, Packet Exchange's Chuck Stormon, Pando Networks' Robert Levitan and Laird Popkin, TVU Networks' Dan Lofgren, Ultramercial's Dana Jones, Verizon Communications' Doug Pasko, YuMe's Rosanne Vathana, and more.

Our first morning discussion will focus on network resources - reducing bandwidth usage and improving P2P throughput. What are the mission, objectives, history, and status of the P4P Working Group (P4PWG) and Ono? What tests have been conducted to date and what have the results shown? How do technologies such as live P2P streaming and P2P caching bear on this issue? How does the P2P industry move from testing to standards setting and best practices? How can interested parties get involved?

Hybrid CDNs - the evolving distribution chain - will also be covered during a morning session. What is the current landscape for web-based content distribution and what role do P2P-based and hybrid-P2P technologies play? How does wide-area peering relate to this? What trends are emerging in P2P implementation by other participants in the distribution chain and in consumer usage? What impact do advances in security, data compression, caching, content acceleration, swarming, streaming, and other P2P-related technologies have?

In the afternoon, featured subject matter will include consumer propositions - what's working and what's not. Has any alternative business model - paid-download, subscription, or advertising-supported - yet proven to be more promising than the adware model that first predominated in P2P? How do contextual marketing and promotional offerings relate to the P2P channel? Have any more innovative approaches been attempted and what has been the learning? How can marketers and content owners collaborate to effectively exploit the unique opportunities afforded by P2P distribution?

And finally, we will feature a special session on private versus public approaches - P2P for content rights holders. What are the various content licensing and market exploitation strategies that have been tried to date with respect to P2P distribution? How and why is collective licensing gaining traction overseas and could this be applied to the US market? What should the roles and responsibilities be for P2P companies, ISPs, CDNs, and other parties in an optimal but practical P2P content licensing regime?

P2P, which already represents the lion's share of all Internet traffic, now also offers unprecedented opportunities for commercial development and highly attractive entry points at many levels. We hope to see you Monday. Share wisely, and take care.

Javien Powers BETA Records Online Sales

Javien Digital Payment Solutions this week announced that BETA Records, an online community that actively attracts, produces, and distributes content from independent artists, has implemented the Javien Total Commerce Solution to manage its e-commerce, from subscriptions to providing alternative payment methods for both artists and fans.

Javien is a proud sponsor of the P2P MEDIA SUMMIT Silicon Valley taking place this Monday August 4th. 

The Javien Total Commerce solution is an e-commerce platform that includes built-in support for online and off-deck mobile sales of digital music. The platform also leverages Javien's patented micro-payment technology to aggregate and batch transactions according to various rules, and adapt those rules depending upon an individual buyer's behavior or the collective behavior of a category of individuals over time. 
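Javien's actual batching rules are proprietary, so the sketch below is purely illustrative: it shows one plausible way a micro-payment system might aggregate many small purchases per buyer and settle them as a single charge once a threshold is reached, which is what makes sub-dollar transactions economical despite per-transaction processing fees. All names and the threshold value are assumptions, not Javien's API.

```python
# Illustrative sketch only: Javien's real aggregation rules are proprietary
# and, per the announcement, adapt to buyer behavior over time. Here we use
# a fixed threshold for simplicity.

class MicroPaymentBatcher:
    def __init__(self, settle_threshold_cents=500):
        self.settle_threshold = settle_threshold_cents
        self.pending = {}    # buyer_id -> list of pending charges (in cents)
        self.settled = []    # list of (buyer_id, total_cents) settled batches

    def charge(self, buyer_id, amount_cents):
        """Queue a micro-transaction; settle the batch once it crosses the threshold."""
        self.pending.setdefault(buyer_id, []).append(amount_cents)
        if sum(self.pending[buyer_id]) >= self.settle_threshold:
            self.settle(buyer_id)

    def settle(self, buyer_id):
        """Collapse all pending charges for a buyer into one processor charge."""
        total = sum(self.pending.pop(buyer_id, []))
        if total:
            self.settled.append((buyer_id, total))

batcher = MicroPaymentBatcher(settle_threshold_cents=500)
for _ in range(6):            # six 99-cent track purchases
    batcher.charge("fan42", 99)
# The sixth purchase pushes the running total to 594 cents, triggering
# a single settlement instead of six separate card transactions.
```

An adaptive version, as described in the announcement, would adjust `settle_threshold_cents` per buyer based on purchase history.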

The micro-pay technology is critical to BETA's business model, enabling high volumes of small transactions across the web site. It also offers BETA's users the opportunity to conduct customer-to-customer transfers. Utilizing Javien's solution, BETA has developed a unique payment model in which both fans and artists can earn revenue from the sale of their music.

The site allows fans to refer an artist to BETA Records, and in return, earn a percentage of the revenue from that artist's track. A fan's earnings can be credited to their "BETA Bank," the website's wallet system. Javien's flexible platform also allows BETA users to store multiple cards on one account and even select a primary card from which to withdraw transactions. Independent artists benefit from Javien's flexible e-commerce solution as well. 

The platform gives BETA the ability to pay artists online, crediting 85% of each track sale directly to their account, while the remaining 15% is allotted for administrative costs. Perhaps most unique to the site is the artists' ability to "tip" each other via customer-to-customer transfers. These transfers can flow directly into and out of an artist's personal BETA Bank.

"BETA Records, like many of our customers, needed a solution that they could adapt to the unique and innovative ways they are engaging their communities," said Leslie Poole, CEO of Javien Digital Payment Solutions.
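The payout model described above can be summarized in a few lines of ledger logic. The sketch below is hypothetical (the class and method names are not BETA's or Javien's), but the numbers follow the article: 85% of each sale goes to the artist's BETA Bank, 15% is retained, and any account holder can tip another.

```python
# Hypothetical ledger sketch of the described BETA payout model:
# 85% of each track sale to the artist, 15% retained for administration,
# plus customer-to-customer "tip" transfers between BETA Bank accounts.

class BetaBank:
    def __init__(self):
        self.balances = {}     # account name -> balance in cents
        self.admin_cents = 0   # accumulated 15% administrative share

    def record_sale(self, artist, price_cents):
        artist_share = price_cents * 85 // 100
        self.balances[artist] = self.balances.get(artist, 0) + artist_share
        self.admin_cents += price_cents - artist_share

    def tip(self, sender, recipient, cents):
        """Customer-to-customer transfer out of one BETA Bank into another."""
        if self.balances.get(sender, 0) < cents:
            raise ValueError("insufficient funds")
        self.balances[sender] -= cents
        self.balances[recipient] = self.balances.get(recipient, 0) + cents

bank = BetaBank()
bank.record_sale("artistA", 100)   # a $1.00 sale credits 85 cents
bank.tip("artistA", "artistB", 25) # artistA tips artistB a quarter
```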

"Our breadth of experience and ongoing leadership in e-commerce allows us to offer our customers the technology and resources needed to implement some of the most creative business ideas that we are seeing, especially in the independent music space," Poole added.

"We are excited to be a part of the future of the music industry."

MiniNova Introduces Update to Support Artists

Excerpted from Slyck Report by Thomas Mennecke

MiniNova, like most BitTorrent trackers and developers, has an affinity for artists who embrace the file-sharing concept, and has worked to develop its indexing site into a viable distribution venue.

Artists that relied on recording companies in the past have been empowered by the Internet and file sharing, where sites like MiniNova help aspiring musicians, movie producers, and other creative talents launch their careers.

Business savvy artists have a lot to gain by using distribution sites such as MiniNova, which is heavily trafficked and has a large audience looking for an alternative to mainstream media. 

This, coupled with the free nature of BitTorrent, provides ample opportunity for an aspiring artist to succeed. MiniNova's distribution platform has always been very accommodating toward the aspiring artist; however, today's update makes it even more so.

With HD technology becoming more ubiquitous, media recorded in this format is becoming increasingly large. Prior to today, MiniNova accommodated aspiring artists with an HTTP upload feature, which functioned as a seed distribution platform for their works. This worked well for smaller files; however, HTTP has its limits, especially once media approaches the 1 gigabyte range.

To remedy this, MiniNova today introduced FTP uploading, which bypasses the problems of HTTP uploading and uses a protocol suited to transferring very large files. FTP imposes practically no size limit and can easily handle 1 GB, 10 GB, or 100 GB files.
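MiniNova has not published its FTP endpoint, so the host, credentials, and path in the sketch below are placeholders. It simply illustrates why FTP suits this job: the STOR command streams the file in small blocks, so even a multi-gigabyte upload never has to fit in memory or survive a single long HTTP POST.

```python
# Hypothetical sketch: host, user, and password below are placeholders,
# not MiniNova's actual upload endpoint.

import ftplib
import os

def upload_via_ftp(host, user, password, local_path, remote_name=None):
    """Stream a (possibly very large) file to an FTP server."""
    remote_name = remote_name or os.path.basename(local_path)
    with ftplib.FTP(host) as ftp:
        ftp.login(user=user, passwd=password)
        with open(local_path, "rb") as f:
            # storbinary streams in 8 KB blocks by default, so a
            # 100 GB file is sent without ever being held in memory.
            ftp.storbinary(f"STOR {remote_name}", f)

# Example call (would require real credentials):
# upload_via_ftp("ftp.example.org", "artist", "secret", "my_film_hd.mkv")
```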

For the file-sharing commoner, MiniNova also unveiled several enhancements. The recently introduced "My Bookmarks" are now sortable, and the user can also list the entire contents of a specific tracker.

Cucku Lets Users Swap Data With Back-Up Buddies

Excerpted from Washington Post Report by Jason Kincaid

In the last few years we've seen a number of back-up solutions emerge that have tried to make the back-up process as painless as possible. Local back-up solutions like Apple's Time Machine back-up to an external disk and tend to be quick and easy, but they also leave data exposed to theft and natural disaster.

Cloud-based storage services like Mozy and Carbonite remove the risk of physical damage to a disk, but it can take days or weeks to recover data after a disk fails.

Cucku, a new startup that launched last week, is trying to merge the benefits of local and remote back-up. Using a technology it calls "social back-up," users are able to automatically save their files to a friend's computer over the Skype P2P network. The software is Windows only for now, with a Mac version planned for the future.

Here's how it works: each user pairs off with a friend, who becomes their "back-up buddy". Every time you modify a file on your drive, it gets uploaded and saved to the allotted back-up portion of your friend's drive. Likewise, whenever your back-up buddy modifies one of their files, it gets saved to your hard drive. Everything is encrypted, so you shouldn't have to worry about prying eyes.
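Cucku's actual protocol, Skype transport, and encryption scheme are not public, but the change-detection half of the workflow described above ("every time you modify a file, it gets uploaded") can be sketched with a simple snapshot-and-diff approach: hash every file, then compare a later snapshot to find what needs re-sending to the back-up buddy.

```python
# Illustrative sketch of the change-detection side of a "social back-up"
# scheme. Cucku's real implementation (including its encryption and its
# use of the Skype P2P network for transport) is not public; this only
# shows how modified files could be identified for re-sending.

import hashlib
import os
import tempfile

def snapshot(root):
    """Map each file's relative path to the SHA-256 of its contents."""
    state = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            state[os.path.relpath(path, root)] = digest
    return state

def changed_files(old, new):
    """Files that are new or whose contents differ since the last snapshot."""
    return sorted(p for p, d in new.items() if old.get(p) != d)

# Quick demonstration on a temporary directory:
with tempfile.TemporaryDirectory() as root:
    with open(os.path.join(root, "a.txt"), "w") as f:
        f.write("v1")
    before = snapshot(root)
    with open(os.path.join(root, "a.txt"), "w") as f:
        f.write("v2")                      # modified file
    with open(os.path.join(root, "b.txt"), "w") as f:
        f.write("new")                     # brand-new file
    to_resend = changed_files(before, snapshot(root))
    # to_resend now lists exactly the files the buddy lacks: a.txt, b.txt
```

In a real system each changed file would then be encrypted before leaving the machine, consistent with the article's note that "everything is encrypted."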

CEO Rob Ellison says that Cucku solves some of the problems associated with cloud-based remote back-up solutions like Mozy and Carbonite by keeping all of your files nearby. Instead of having to wait a few days for your files to arrive on a DVD, you can just call up your friend and get access immediately.

There are a number of other sites that use P2P back-up, including Wuala and AllMyData.

The Pirate Bay Adds Tagging Support

Excerpted from TorrentFreak Report

The Pirate Bay has rolled out a new feature that allows users to add tags to the torrents they upload. The tags will make it easier to structure and discover new content, and it gives users the opportunity to form tag based groups.

Although tags are fairly common for blogs and other online publications, The Pirate Bay is one of the first BitTorrent sites to implement this feature.

The purpose is to make it easier for users to discover content they are interested in, and structure and organize torrents more easily.

The tag cloud still has to be filled, but when it is, Pirate Bay users will have the option to browse through tag-based archives. In the days to come, support for tag-based RSS feeds will be added, along with the option to browse tags per category.

Among other things, the tags will allow users to form micro-communities within the site, as Pirate Bay co-founder Peter Sunde explained, "Let's say I run a small movie club. I just tag the uploads with my club name and they will be on the same page. The new feature adds genres as well, and I'm making a tag browser per category."
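The micro-community use case Sunde describes amounts to a reverse index from tags to uploads: every torrent carries free-form tags, and a tag page (or a club's page) is a single lookup. The sketch below is a minimal illustration with hypothetical names, not The Pirate Bay's code.

```python
# Minimal, hypothetical sketch of the tag index the feature implies:
# each upload carries free-form tags, and a reverse index groups torrents
# so that a tag page (e.g., a movie club's page) is one dictionary lookup.

from collections import defaultdict

tag_index = defaultdict(set)   # tag (lowercased) -> set of torrent names

def add_torrent(name, tags):
    """Register an upload under each of its tags."""
    for tag in tags:
        tag_index[tag.lower()].add(name)

def browse(tag):
    """Return every torrent filed under a tag, as on a tag-archive page."""
    return sorted(tag_index[tag.lower()])

# Sunde's movie-club example: tagging uploads with the club name
# collects them all on one page.
add_torrent("club_night.avi", ["MovieClub", "comedy"])
add_torrent("club_trip.avi", ["MovieClub"])
```

A tag-based RSS feed, as promised for the coming days, would simply serialize the result of `browse()` for a given tag.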

Making Money in the Free Economy

Excerpted from Seeking Alpha Report by Chris Anderson

Here's a question I get all the time: how big is the free economy? That's harder to answer than you might think, for both definition and measurement reasons. But here's a first pass at doing it anyway. There are at least three classes of free and, please note, I'm just using US figures below, unless marked otherwise.

The first is the use of "free" as a marketing gimmick: "buy one, get one free", "free with purchase", "free phone if you commit to the two-year service plan", etc. All basically cross-subsidies or loss leaders - sooner or later you'll pay. I suspect that there isn't an industry that doesn't use this one way or another. There's no new economic model there and it's totally impossible to quantify, but arguably it touches every bit of the entire consumer economy itself, which is to say trillions of dollars a year. And thus it's a meaningless number. So I'll move on....

The second form of free is the "three-party market", which is to say the world of advertising-supported free media. That's most radio and broadcast television, most web media, and the proliferation of free print publications, from newspapers to "controlled circulation" magazines. For the top 100 US media firms alone, in 2006 radio and TV (not including cable) advertising revenues were $45 billion.

Online, almost all media companies make their offerings free and ad-supported, as do many non-media companies, so I'll include the entire online ad market in the "paying for content to be free to consumers" category. That's another $21-$25 billion. Free paper newspapers and magazines are probably a billion more, and there are no doubt some other smaller categories I'm omitting and a lot of independents not included in the numbers above. Let's call the total of offline and online ad-driven content and services $80-$100 billion.

Finally, there is what I call "real free." Products and services that don't cost most consumers anything at all, either in cash or ad clutter. (Most of this is online, where the marginal costs are near zero). This includes companies that use the "freemium" model (where a minority of paying customers support a majority of non-paying customers, as in Flickr and Flickr Pro or the growing world of online games), all those companies that are in the pre-revenue part of their evolution, and the entire "gift economy", from Wikipedia to the blogosphere.

This last category is impossible to properly quantify, especially since much of it has no dollar figure attached at all, but I'll break out some interesting subcategories that do have some numbers attached.

Open source software (service and support around free software): the "Linux ecosystem" (everything from RedHat to IBM's open source consulting business) is around $30 billion today; other companies built around open source, such as MySQL ($50 million annual revenues) and Sugar CRM ($15 million), probably add up to less than $1 billion.

Free-to-play videogames: these are mostly online massively multiplayer games, which are free to play but make money by charging the most dedicated gamers for digital assets (upgrades, clothing, new levels, etc). They started in South Korea and China (where they're now a $1 billion business) and have now come to the US, with games like Runescape and NeoPets.

The "casual games market" (think everything from online card games to flash games) is now at nearly $3 billion.

Free music: how much of Apple's $4 billion in annual iPod sales should be credited to the libraries of "free" MP3s that created demand for gigabyte storage devices? How much of MySpace's $65 billion estimated value is due to the free music bands put there? How much of the $2 billion concert business is driven by P2P file sharing?

So what's the bottom line? By a strict definition of free (just the third category), it's pretty easy to get to $50 billion total revenues. Include the next most interesting free market, online ad-driven content and services, and you're around $75 billion. Expand that to the traditional ad-supported media, and you can get to $150 billion. Go worldwide, and you can easily double all those figures.
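The bottom-line arithmetic above is easy to lose in prose, so here is a back-of-the-envelope reconstruction. The ranges in the article are collapsed to single representative values (an assumption on my part), which is why the totals come out exactly rather than approximately.

```python
# Back-of-the-envelope reconstruction of the article's US figures,
# all in billions of dollars. Ranges are collapsed to representative
# values: online ads taken at $25B of the $21-25B range, traditional
# ad-supported media at $75B so the running totals match the article.

strictly_free   = 50    # "real free": freemium, open source, gift economy
online_ads      = 25    # online ad-driven content and services
traditional_ads = 75    # radio, TV, and print ad-supported media

us_strict   = strictly_free                  # strict definition: ~$50B
us_plus_web = us_strict + online_ads         # add online ads:     ~$75B
us_all_ads  = us_plus_web + traditional_ads  # add traditional:    ~$150B
worldwide   = us_all_ads * 2                 # "go worldwide... easily double"
```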

Whichever definition you like, there's a lot of money to be made around free.

HP, Intel, Yahoo Study Cloud Computing

Excerpted from Reuters Report by Eric Auchard

Three tech giants - Hewlett-Packard, Intel and Yahoo - said on Tuesday they are teaming up on a research project to help turn web services into reliable, everyday utilities.

The companies are joining forces with academic researchers in Asia, Europe, and the United States to create an experimental network that lets researchers test "cloud-computing" projects - web-wide services that can reach billions of users at once.

Their goal is to promote open collaboration among industry, academic, and government researchers by removing financial and logistical barriers to working on hugely computer-intensive, Internet-wide projects.

Founding members of the consortium said they aim to create a level playing field for individual researchers and organizations of all sizes to conduct research on software, network management, and the hardware needed to deliver web-wide services as billions of computer and phone users come online.

"No one institution or company is going to figure this out," said Prabhakar Raghavan, the head of Yahoo Research who is also a consulting professor of computer science at nearby Stanford University.

Cloud computing has become the industry's biggest buzzword. It is a catch-all term to describe how Internet-connected hardware and software once delivered as discrete products can be managed as web-based, utility-like services.

"Potentially the entire planet will come to rely on this, like electricity," Raghavan said, referring to the push to make everything from daily communications to shopping to entertainment into always-available, on-demand web services.

"We are all trying to move from the horse driving the wagon to a million ants driving the wagon," Raghavan said of the need to let computers manage millions of small jobs, adding that the available capacity on the web would vary widely. "The challenge can be a billion ants one day and a million ants the next."

Big industry players from Google to Microsoft to IBM have all jumped on cloud computing as a way to create web services on an unprecedented scale - in effect, forming barriers to entry for smaller companies.

By contrast, HP, the world's top computer maker, Intel, the biggest maker of semiconductors, and Yahoo, a web pioneer with some of the biggest audiences for online services, are creating an open network run on data centers from many companies.

"It is an overstatement to say we have a firm grip on all the technical challenges involved," said Intel Research Vice President Andrew Chien, adding, "It's not that easy for small innovators to do things" that run reliably across the web.

Chien said Intel's involvement will help it learn how to build chips to power ever-larger web tasks but use less energy. The chipmaker also sees a general benefit to the industry by encouraging the widest possible participation by researchers.

HP, Intel and Yahoo have partnered with the state-run Infocomm Development Authority of Singapore, the University of Illinois at Urbana-Champaign - which 15 years ago gave birth to the web browser - and Germany's Karlsruhe Institute of Technology. The Illinois partnership also involves the US National Science Foundation.

The test network will consist of data centers run by each of the six initial partners, and be based largely on HP hardware and Intel microprocessors. Each location will dedicate 1,000 to 4,000 processor chips, backers said.


Forget Darknets, Make Way for the Dark Cloud

Excerpted from SYS-CON Report by Reuven Cohen

For nearly as long as the Internet has been around there have been private sub-networks called "darknets." These private, covert, and often secret networks were typically formed as decentralized groups of people engaged in the sharing of information, computing resources and communications. 

Recently there has been a resurgence in interest of the darknet ranging from P2P file sharing to inter-government information sharing, bandwidth alliances, or even offensive military botnets. All of these activities are pointing to a growing interest in the form of covert computing I call "dark cloud computing" whereby a private computing alliance is formed. In this alliance members are able to pool together computing resources to address the ever-expanding need for capacity. 

According to my favorite source of quick disinformation, the term darknet was originally coined in the 1970s to designate networks which were isolated from ARPANET (which evolved into the Internet) for security purposes.

Some darknets were able to receive data from ARPANET but had addresses which did not appear in the network lists and would not answer pings or other inquiries. More recently the term has been associated with the use of dark fiber networks, private file-sharing networks, and distributed criminal botnets. 

The botnet is quickly becoming the tool of choice for governments around the globe. Recently, Col. Charles W. Williamson III, staff judge advocate, Air Force Intelligence, Surveillance and Reconnaissance Agency, wrote in the Armed Forces Journal of the need for botnets within the US DoD.

In his report, he said, "The world has abandoned a fortress mentality in the real world, and we need to move beyond it in cyberspace. America needs a network that can project power by building an af.mil robot network (botnet) that can direct such massive amounts of traffic to target computers that they can no longer communicate and become no more useful to our adversaries than hunks of metal and plastic. America needs the ability to carpet bomb in cyberspace to create the deterrent we lack." 

I highly doubt the US is alone in this thinking. The world is more than ever driven by information, and botnet usage is not limited to governments but extends to enterprises as well. In our modern information-driven economy, the distinction between corporation and governmental organization has been increasingly blurred.

Corporate entities are quickly realizing they need the same network protections. By covertly pooling resources in the form of a dark cloud or cloud alliance, members are able to counter or block network threats in a private, anonymous, and quarantined fashion.

This type of distributed network environment may act as an early warning and threat avoidance system. An anonymous cloud computing alliance would enable a network of decentralized nodes capable of neutralizing potential threats through a series of counter-measures. 

My question is: are we on the brink of seeing the rise of private corporate darknets, a.k.a. dark clouds? And if so, what are the legal ramifications, and do they outweigh the need to protect ourselves from criminals who can and will use these tactics against us?

FCC Divided on Network Management Opinion

On Friday, a sharply divided Federal Communications Commission (FCC) voted in favor of a Memorandum Opinion and Order concluding that Comcast's limited management of its broadband network to avoid congestion that harms consumers was not acceptable to the Commission.

Sena Fitzmaurice, Senior Director, Corporate Communications and Government Affairs at Comcast, said, "We are gratified that the Commission did not find any conduct by Comcast that justified a fine and that the deadline established in the order is the same self-imposed deadline that we announced four months ago."

"On the other hand, we are disappointed in the Commission's divided conclusion because we believe that our network management choices were reasonable, wholly consistent with industry practices and that we did not block access to websites or online applications, including P2P services."

In March, Comcast announced it would migrate all of its systems to a protocol-agnostic network management technique by year-end. Comcast is already trialing these techniques in five markets. Information on these trials and on Comcast's network management practices can be found here.

Comcast has announced joint efforts with BitTorrent and Pando Networks to address issues related to network management and is participating in the P2P Best Practices initiative organized by the Distributed Computing Industry Association (DCIA).

Jim Cicconi, Senior Executive Vice President of External and Legislative Affairs at AT&T, said, "Regardless of how one views the merits of the complaint against Comcast, the FCC today has shown that its national Internet policies work, and that they are more than sufficient for handling any net neutrality concerns that may arise. We have argued repeatedly that there is no need for federal legislation in this area, and today's FCC action proves that point.

"Once a complaint was filed alleging that Comcast had violated the FCC's national Internet policies, it was appropriate for the FCC to adjudicate the complaint. We are pleased the FCC decided to handle the matter on its own unique facts, setting a wise precedent for dealing with such complaints on a case-by-case basis. We are also pleased that, by deciding not to levy a fine, the FCC effectively recognized there was no evidence of anti-competitive intent in Comcast's practices."

Reactions to FCC Decision Come Fast and Furious

Excerpted from Ars Technica Report by Matthew Lasar

Hope, indignation, and outrage greeted the Federal Communications Commission's decision.

Jay Monahan, General Counsel of Vuze, said that when the hi-res video content company filed its net neutrality petition, he didn't expect the explosion of passionate support that followed.

"When I saw the thousands of submissions to the Commission by consumers and the standing-room-only FCC field hearings that we attended and in some cases testified at, that part surprised me," Monahan confided in an interview. "That there were that many people paying this much attention to this."

The FCC's action may also be a global precedent. Ars asked Columbia Law Professor Tim Wu whether any other country has taken similar steps. It's a tricky call, he responded, because, unlike the United States, some countries have retained their common carrier powers over the Internet.

"However, in terms of enforcement, this is a first in the world as far as I know," Wu said.

The cable industry has resolutely stood by Comcast's side. On Tuesday, an SVP of Time Warner Cable met with the FCC, warning that "government intrusion into broadband providers' traffic management practices would have a chilling effect on investment and innovation."

Four days earlier, the National Cable and Telecommunications Association (NCTA) sent the agency a chart of the network management practices of the nation's top colleges and universities. "If there is to be regulation, therefore, it must apply equally to all providers," NCTA's filing advised.

Undisguised outrage has come from the hardcore right, which views with horror the spectacle of Republican FCC Chair Martin delivering what it sees as the broadband equivalent of the Fairness Doctrine.

The Wall Street Journal's editorial writers - who must surely sign a pact never to read the newspaper's excellent articles about telecommunications - lambasted Martin on Wednesday as a self-appointed "Master of the Media Universe," a chump for Moveon.org, and worse.

"Mr. Martin is also greasing the skids for a potential Barack Obama Administration to take an Internet industrial policy who knows where," the Journal warned. Ditto, declared House Republican Minority leader John Boehner, who the next day sent an angry letter to Martin, denouncing his efforts to "hijack the evolution of the Internet to everyone's detriment."

One senses in these frantic protests legitimate fears that Martin's move represents yet another sign that these are the End Days of the Reagan Era. It is very unlikely that the FCC's 42-year-old chief parties with the Free Press crowd. But with Friday's ruling, he has clearly sided not just with the FCC's "two Democrats," as the Journal bitterly calls them, but with a younger, technology-loving generation that sees government as an ally rather than The Problem.

In Ars' interview with Jay Monahan, the attorney bristled at the Wall Street Journal's insistence that "net neutrality is a slippery slope toward interventions of all kinds." It is the opposite, he insisted. "What Martin has proposed, and what the Commission is about to do, is exactly designed to protect innovation, and to protect competition," Monahan argued. "If net neutrality means anything, it means not that each of us is made equal in the marketplace, but that at least we have an equal set of rules that are transparent to all of us in order to compete."

Nobody, least of all Vuze, thinks this fight is over. Monahan says he fully expects Comcast to "appeal the Commission's order" - which means a lawsuit against the FCC, a Congressional counterattack, or both. Still, he sees today as a day to celebrate.

"We do view this as a first step," Monahan concluded. "A first step towards helping to build an open and free Internet. And we're grateful to the Commission for having the courage to adopt this order so that we can move forward and go back to our Palo Alto office and continue to compete in this marketplace."

Video Traffic Seen Overtaking P2P

Excerpted from Electronista Report

Customers with multiple options for video are triggering a drop in the relative amount of P2P traffic, according to a published study of AT&T's network.

The DSL Internet provider found that while P2P traffic did ultimately increase over the past year, streaming video services such as Hulu and YouTube have largely flattened P2P's growth, in contrast to its rapid rise in recent years.

"On the Tier 1 AT&T backbone P2P actually dropped 20% during a period last year," said analyst Dave Burstein, adding that AT&T Labs Vice President Charles Kalmanek directly attributes the drop to a "shift in customer mix" rather than a decline in popularity. Overall traffic is characterized as growing at "more than 50%" per year but now sees YouTube and other web media streams accounting for a full third of all of AT&T's traffic where P2P is now approximately one fifth.

The shift is said to be a blow to Internet providers hoping to use Comcast's former model of throttling P2P data. As customers are now accessing video content through the web in multiple ways, such traffic shaping measures are no longer effective or justifiable. 

"Many of the policy people believe that P2P is a ravenous monster that is devouring the Internet," Burstein adds. "The data shows that simply isn't true." 

AT&T is known to be aware of this and is planning chiefly to avoid problems by increasing the capacity of its network. The provider will soon quadruple its backbone from 10 gigabits per second to 40 gigabits in a way that shouldn't raise its typical expenses and plans to make a similar move to 100 gigabits per second within the next few years. 

Sandvine, the creator of the utility Comcast has used to limit P2P traffic, is also unveiling a new technique that would give Internet providers much finer control over how they throttle traffic. The new implementation would let a service carrier target individual users rather than an entire network, limit the effect to certain times of day, and apply only partial speed reductions.
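To make the described approach concrete, the per-user policy could be sketched roughly as follows. This is purely an illustrative sketch: Sandvine's actual product and interfaces are not described in this report, and the names used here (ThrottlePolicy, allowed_rate, heavy_user) are hypothetical.

```python
from dataclasses import dataclass
from datetime import time

@dataclass
class ThrottlePolicy:
    """Hypothetical per-subscriber policy: a partial rate cap applied
    only to heavy users, and only during a peak-hours window."""
    peak_start: time       # start of the congestion window
    peak_end: time         # end of the congestion window
    peak_fraction: float   # fraction of full speed allowed at peak, e.g. 0.5

    def allowed_rate(self, full_rate_kbps: int, now: time,
                     heavy_user: bool) -> int:
        """Return the rate cap for one subscriber at one moment."""
        in_peak = self.peak_start <= now < self.peak_end
        if heavy_user and in_peak:
            # Partial reduction for one user, not a network-wide block
            return int(full_rate_kbps * self.peak_fraction)
        return full_rate_kbps

# Example: cap heavy users to half speed between 6pm and 11pm
policy = ThrottlePolicy(time(18, 0), time(23, 0), 0.5)
print(policy.allowed_rate(6000, time(20, 30), heavy_user=True))   # peak, heavy
print(policy.allowed_rate(6000, time(9, 0), heavy_user=True))     # off-peak
```

The point of the sketch is the contrast with Comcast's former approach: the decision is made per subscriber and per time window, and the result is a reduced rate rather than a blocked connection.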

AT&T Explains P2P Wireless Policy

Robert Quinn, Senior Vice President, Federal Regulatory at AT&T, provided information to the Federal Communications Commission (FCC) this week regarding AT&T's policies and practices with respect to the use of P2P applications by AT&T's mobile wireless broadband customers.

AT&T does not use network management tools to block the use of P2P applications by its mobile wireless broadband customers. However, AT&T's terms of service for mobile wireless broadband customers prohibit all uses that may cause extreme network capacity issues, and explicitly identify P2P file-sharing applications as such a use.

Under these terms of service, which are similar to those of other wireless providers, use of a P2P file sharing application would constitute a material breach of contract for which the user's service could be terminated.

Since the vast majority of its customers abide by their contractual commitments, AT&T has not yet found it necessary to terminate anyone's service for such a use.

Mobile wireless broadband services rely on shared network resources at every point in the network including shared spectrum in the "last mile." With any shared network, some limitations on the uses individual subscribers make of their service are inherently necessary to ensure that all customers collectively receive an acceptable level of service.

AT&T's policy on the use of P2P file-sharing applications illustrates this principle. Today's P2P file-sharing applications are inappropriate for AT&T's mobile wireless broadband network, which is optimized to efficiently support high data-rates for multiple users that send and receive intermittent or "bursty" traffic generated by activities such as browsing the Internet and sending e-mail.

Because P2P file-sharing applications typically engage in continuous (rather than bursty) transmissions at high data-rates, a small number of users of P2P file-sharing applications served by a particular cell-site could severely degrade the service quality enjoyed by all customers served by that site.

Moreover, unlike wired broadband networks where the maximum number of potential simultaneous users in a given neighborhood is known in advance, the maximum number of potential mobile wireless broadband users that may simultaneously seek to access a given cell-site at any particular time - and thus the collective service experience for all users at that site, for both data and voice services - is far less predictable due to the inherently nomadic nature of mobile wireless users.

Accordingly, AT&T's terms of service prohibit the use of P2P applications to safeguard service quality for the benefit of all customers.

MBit CEO Keynotes at SUMMIT on Mobile P2P

Chunyan See, CEO of mBit, a leading provider of mobile P2P service offerings, will deliver a keynote address at the P2P MEDIA SUMMIT Silicon Valley on Monday, August 4th.

mBit is the first company in the world to have created a true P2P file-sharing service in collaboration with major IP Multimedia Subsystem (IMS) vendors, with whom it has been working since December 2004.

The service is also interoperable with cell-phones that do not support SIP/IMS, distributing the data via HTTP proxy, so users no longer need to worry about firewalls in their offices or homes.

It is designed to overcome the 300 KB MMS size limitation and interoperability problems among wireless network operators in different countries.

Today, a typical 3.2 megapixel camera phone takes photos of 500-600 KB each and records video at nearly 20 MB per minute. For users who can't wait to get home and switch on their PCs to synchronize and e-mail such files, mBit has the solution.

With devices supporting WiFi, 3G (384 Kbps), and 3.5G networks (up to 14.4 Mbps), sharing such user-generated content, as well as downloading podcasts, TV shows, and movies to watch on increasingly wider and sharper screens, has become a reality.

Japanese phones now have Sharp Aquos and Sony Bravia LCD screens that can twist to landscape mode to watch live TV broadcasts and movies.

Most devices can support 2-4 GB of external storage. With 8 GB of flash memory in recently launched phones and the Apple iPod now supporting 160 GB, just imagine how many full-length movies, optimized for your screen size, you could watch on your phone.

With the mBit.tv client, you can download these large video files overnight and wake up to have them ready for your commute to work.

With its award-winning technology, the mBit protocol can resume a download even after it is disrupted - for example, when a user switches the phone off for an important meeting or enters a subway or elevator.

For operators, with such a channel to encourage super-distribution, mBit can leverage "word-of-mouth" marketing to achieve the viral effect and increase ARPU.

The software is currently available for Symbian S60 3rd edition and Java MIDP 2.0-enabled devices and requires an Internet connection.

DRM, Anti-Circumvention Legislation, Unauthorized P2P

Excerpted from IT World Canada Report by Russell McOrmond

There is an all-too-common belief that putting a legal layer around digital rights management (DRM) to encourage its usage will decrease copyright infringement. The USA, the primary source of that thinking from its National Information Infrastructure Task Force work in the early 1990s, was of course the first country to implement it in its Digital Millennium Copyright Act (DMCA).

The DMCA was signed into law on October 28, 1998, and nearly a decade later every report I see suggests that unauthorized P2P sharing of copyrighted files is on the rise in the United States. While this will come as a shock to people who thought the DMCA would (or think the Canadian Bill C-61 will) decrease copyright infringement, the increase seems very logical to me.

Citizens of both Canada and the USA want to access the content they have acquired over the years on all the devices they currently own. As 8-tracks went out of style, they moved to cassette tapes, and as time goes on their media ends up as digital files on their computers and other digital media devices. They quite reasonably believe that once they have paid for content, they shouldn't have to pay for it over and over again with every new format or device.

In the past, the technology to do this conversion was readily available and reasonably easy to use, so people were doing this conversion in the privacy of their own homes or at a friend's house. When you introduce technical measures, you introduce a level of complexity to this conversion which requires additional technical knowledge. When you add a legal layer on top, you push the tools necessary to do this conversion into an underground where the general public is not able to go to a store and purchase the tools required to do the conversion.

It will always be technologically possible for someone to circumvent these technical measures. Internationally renowned security technologist and author Bruce Schneier has said a few times that, "Trying to make digital files uncopyable is like trying to make water not wet." Most people are not computer security experts, so this doesn't apply to most citizens.

Alongside this perfectly legitimate conversion is a publicly available pool of already converted files on P2P networks. All it takes is one technically sophisticated person to circumvent any technical measure and share the results with others. P2P tools are easy to use, even easier to use than many of the tools used to time-and-format-shift content.

People who previously did the time/format shifting themselves will be driven to P2P networks by DRM. When they head to these networks to do something that replaces a previously allowable - and still perfectly legitimate - activity, they are told that time/format shifting of DRM-infected media is already unauthorized. If acquiring time/format-shifted files of content they own is already unauthorized, they may feel that acquiring other content they never had before isn't any worse.

While I don't personally use P2P to infringe copyright, I have to agree that it is perfectly logical that other people will be doing so. It also seems perfectly logical to me that the USA's DMCA and (if passed) Canada's Bill C-61 will only make the problem worse.

For greater certainty: I am not apologizing for the increasing number of people who infringe copyright, just helping to explain that it is a logical and understandable behavior. Policy makers should be trying to understand this behavior in order to reduce it, rather than increasing the behavior by making more perfectly legitimate activities illegal.

If I were the government wanting to reduce copyright infringement, I would take a very different approach.

1) I would clearly carve time-and-device shifting out of copyright, as well as other perfectly legitimate private activities. This would be done in part by Canada adopting a US-style living Fair Use regime to allow what is considered legitimate private activities to more dynamically change to match the times.

Neither government nor copyright holders have any business in the bedrooms (or other private rooms) of our nation!

2) I would disallow the abuse of technical measures to attempt to make this shifting harder, including disallowing the locking of content to only be interoperable with specific brands of devices.

3) I would ensure that no law (copyright or otherwise) disallowed the owner of a device from removing any foreign locks on their devices. I would use the law to discourage this harmful activity by device manufacturers and software authors, and clearly enable hardware owners to make their own software choices.

(These are all re-wordings of proposals that are part of the CLUE Policy Summary)

If activities which the majority of citizens consider perfectly reasonable were allowed, and activities which the majority of us thought were inappropriate were illegal, then there would be far more respect for the law.

The direction we are currently heading will only increase disrespect for the law and thus increase copyright infringement. The stronger that inappropriate laws are enforced, the more we might see an increasing disrespect for other laws which could have a devastating effect on society.

In the United States, time-and-device shifting was considered perfectly allowable under its living Fair Use regime. This was a key part of its Sony Betamax Case in 1984, which said that Sony was not liable for contributing to copyright infringement because the VCR had substantial non-infringing uses. The uses that were considered non-infringing included time-and-device shifting.

In Canada, people have been told that Canadian Copyright law is somehow "weaker" than US law. Canadians legitimately believe that if Americans are allowed to time-and-device shift without permission, then we must be able to as well. While Canadian copyright law is tilted more in favor of copyright holders than US law in a number of ways, copyright holders have not wanted to sue Canadians for this type of infringement as it would be a public relations nightmare.

It is the same reason that music labels have not been suing for unauthorized P2P file sharing in Canada, even though this is clearly not permitted under current Canadian law, open to massive statutory damages (maximum $20K per infringing file shared), and the music industry only lost its case in 2004/2005 due to lack of evidence.

The Conservative media spin around Bill C-61 claims that time-and-device shifting will be legal in Canada. But unlike in the USA, where this type of activity has been taken out of requiring permission altogether, under Bill C-61 it remains part of the permission culture.

All the Conservatives did was change the default for a few special cases from being denied unless granted to being granted unless denied. In the vast majority of situations, these activities will continue to be denied in Canada under technical measures, contracts, or both.

In other words, those wishing to time-and-device shift in Canada will be less able to do so under Bill C-61 than under the status quo, given that the Conservatives have turned a situation where lawsuits were unlikely into one where lawsuits will be considered reasonable.

Where the Conservatives are trying to spin C-61 as balancing the rights of users with those of copyright holders, I feel that even the provisions they have claimed will help copyright users will only cause harm.

MPAA Planning Website to Find Movies Online

Excerpted from Afterdawn Report

It looks like the Motion Picture Association of America (MPAA) has come up with a novel approach to getting consumers to stop downloading unlicensed movies on the Internet.

Instead of concentrating on lawsuits like the Recording Industry Association of America (RIAA), the MPAA is apparently developing a website to help consumers find authorized ways to watch movies. According to an anonymous source, the MPAA is building a new website where consumers can search for a movie and be presented with options to buy, rent, or view it.

For example, a search for a movie that's still in theaters might result in links to online movie ticket sites, while an older release would give you options for buying or renting the DVD, and perhaps others for downloading or streaming it. Variety was reportedly told that the project was initiated after research showed that consumers often have a hard time distinguishing between authorized and unauthorized sources for movies online.

Coming Events of Interest

P2P MEDIA SUMMIT SV - August 4th in San Jose, CA. The first-ever P2P MEDIA SUMMIT in Silicon Valley. Featuring keynotes from industry-leading P2P and social network operators; tracks on policy, technology and marketing; panel discussions covering content distribution and solutions development; valuable workshops; networking opportunities; and more.

Building Blocks 2008 - August 5th-7th in San Jose, CA. The premier event for transforming entertainment, consumer electronics, social media & web application technologies & the global communications network: TV, cable, telco, consumer electronics, mobile, broadband, search, games and the digital home. The DCIA will conduct a P2P session.

International Broadcasting Convention - September 11th-16th in Amsterdam, Holland. IBC is committed to providing the world's best event for everyone involved in the creation, management, and delivery of content for the entertainment industry. Uniquely, the key executives and committees who control the convention are drawn from the industry, bringing with them experience and expertise in all aspects.

Streaming Media West - September 23rd-25th in San Jose, CA. The only show that covers both the business of online video and the technology of P2PTV, streaming, downloading, webcasting, Internet TV, IPTV, and mobile video. Covering both corporate and consumer business, technology, and content issues in the enterprise, advertising, media and entertainment, broadcast, and education markets. The DCIA will conduct a P2P session.

P2P MEDIA SUMMIT LV - January 7th in Las Vegas, NV. This is the DCIA's must-attend event for everyone interested in monetizing content using P2P and related technologies. Keynotes, panels, and workshops on the latest breakthroughs. This DCIA flagship event is a Conference within CES - the Consumer Electronics Show.

Copyright 2008 Distributed Computing Industry Association
This page last updated December 14, 2008