September 6, 2010
Volume XXXII, Issue 2
Twitter Surpasses 145 Million Registered Users
Excerpted from Digital Media Wire Report by Mark Hefflinger
Twitter now counts over 145 million registered users, Co-Founder Evan Williams revealed in a post on the company's blog. Williams also noted that the number of mobile users of Twitter has risen 62% since mid-April.
Sixteen percent (16%) of all new Twitter users start on mobile, and 46% of active users are using mobile devices to tweet regularly.
Over the past month, 78% of users who logged into Twitter did so directly on Twitter.com, while 14% used m.twitter.com.
Eight percent tweeted via SMS message, while 8% used Twitter for iPhone, 7% used Twitter for BlackBerry, 4% used TwitPic, and 3% used TweetDeck or Echofon (the categories overlap, since many users access Twitter through more than one client).
Williams added that the number of third-party applications registered with Twitter is now nearing 300,000.
Cloud Computing Stocks Lead the Market
Excerpted from Cabot Wealth Advisory Report by Maura Lockwood
Cloud computing stocks - offering computer technology that provides scalable IT infrastructure and, often, virtualized resources - are leading the market, Investor's Business Daily (IBD) reported. Three of them - VMware, F5 Networks, and Aruba Networks - hold IBD's best-possible 99 Composite Rating.
VMware offers a cloud infrastructure software platform, and holds an 80% market share in the server virtualization market. Analysts expect earnings to grow 39% this year and 22% in 2011, IBD reported.
F5 Networks provides application delivery controllers that are vital to cloud computing. Analysts expect earnings to rise 45% this fiscal year (ending September 30th) and 24% in fiscal 2011, reported IBD.
Aruba Networks provides wireless network management with a cloud-based software-as-a-service offering called AirWave On-Demand, which eliminates the need for an on-site server, according to IBD.
These three distributed computing industry leaders are all recent Cabot Top Ten Weekly picks.
Game Companies Should Play Fair with P2P
Excerpted from TorrentFreak Report
Increasingly, game companies are using peer-to-peer (P2P) powered solutions to deliver games and updates to their customers. While the use of P2P technology could be beneficial for publishers, consumers, and the image of file-sharing in general, the use of P2P by game companies still lacks transparency, privacy, and control. A newly published best practices outline aims to change this.
In the past, we've praised game companies who adopted P2P-based solutions for the distribution of their content. Through the use of P2P, the game companies can save resources and consumers often see improved download times. However, there is also a dark side to this apparent synergy.
Although the use of P2P technology has many benefits, it is not always implemented with the interests of consumers in mind. In fact, quite often gamers are simply abused as cheap bandwidth sources by million dollar corporations, often without their knowledge.
Akamai, one of the largest content delivery companies around, has a P2P-based product called the NetSession Interface, which is rather abusive towards customers. The software is installed as a Windows service and is always running in the background. Even worse, most users won't even know it's running, because it doesn't show up in the Task Manager. None of this is mentioned in its EULA.
The NetSession Interface is used by game publishers including Kuma Games, Aeria Games, and NetDevil. Customers who play the games have no user controls or visible indicators, while the software uses 'their' upload bandwidth to deliver content to other users for an indefinite period after the download is completed.
It is clear that something has to be done to ensure that consumers are not exploited as bandwidth slaves. P2P technology is great, and many consumers would gladly donate some of their bandwidth, but there has to be a clear set of rules to guarantee that consumers have a choice.
To address this issue, game delivery technology company Solid State Networks has just released a best practices document, which emphasizes giving users transparency and control over their resources. According to the company, it all boils down to the following directives:
1. Transparency - Make visible and readily accessible information about the presence and operational activity of the P2P technology.
2. Control - Provide the user with the ability to manage, operate, and remove the P2P technology in an intuitive and conspicuous manner.
3. Privacy - Ensure the absolute privacy and security of personal information and user-originated files.
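To make the directives above concrete, here is a minimal sketch of what honoring "transparency" and "control" might look like in a game updater's settings layer. All names (P2PSettings, Downloader) and defaults are illustrative assumptions, not any real vendor's API.

```python
# Hypothetical sketch of a game updater honoring the three directives.
# P2P is off by default (opt-in), its activity is reported plainly
# (transparency), and seeding after completion requires explicit consent.

from dataclasses import dataclass

@dataclass
class P2PSettings:
    enabled: bool = False               # Control: off until the user opts in
    upload_limit_kbps: int = 100        # Control: user-set upload cap
    seed_after_download: bool = False   # Control: no indefinite seeding by default

class Downloader:
    def __init__(self, settings: P2PSettings):
        self.settings = settings

    def status_line(self) -> str:
        # Transparency: report whether P2P is active and at what rate
        if not self.settings.enabled:
            return "P2P delivery: disabled (direct download only)"
        return (f"P2P delivery: enabled, upload capped at "
                f"{self.settings.upload_limit_kbps} kbps")

    def should_seed(self) -> bool:
        # Control: seeding after completion requires explicit consent
        return self.settings.enabled and self.settings.seed_after_download

print(Downloader(P2PSettings()).status_line())
print(Downloader(P2PSettings(enabled=True, upload_limit_kbps=50)).status_line())
```

The point of the sketch is the defaults: a "fair" client starts opted out, says what it is doing, and never seeds indefinitely without being asked to.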
We think this is a great initiative and sincerely hope that the gaming industry will adopt this, or a similar set of rules, in the interests of the consumer. A quick search on Google shows that most of it is much needed, as there are many complaints from gamers about the lack of transparency and control that most of the current P2P delivery systems offer.
Solid State Networks already offers its own P2P-based delivery solution for games publishers that adheres to all three directives. The other P2P-based solutions already out there could easily be adapted to become "fair" as well.
Click here for an additional PSA, summarizing how and why game publishers should handle P2P-powered game distribution.
Report from CEO Marty Lafferty
As DCINFO readers are aware, later this month the US Federal Communications Commission (FCC) could decide, simply by voting along party lines under its current Democratic majority, to regulate broadband services much more strictly, with restrictions that would range from fundamental aspects of operational controls to key marketing considerations.
The latest word is that the Commission may delay such action until at least the end of the year, taking time instead to review the more than 50,000 submissions on this subject that it has received from interested parties. That would be a welcome course of short-term action by the FCC regarding this extremely important issue.
Chairman Julius Genachowski, if his "third-way" approach is ultimately enacted, will no doubt stress his insistence on forbearing from imposition of the most onerous regulations that would be possible under such a reversion to pre-Internet common-carrier rules, in keeping with the approach that has been employed by the FCC since 2005.
That explanation, however, would provide little comfort regarding what future Commissions might do under such a regime. Such a reversion would be a terrible mistake.
Arguably most troubling this week, Commissioner Michael Copps invested time to attack the Washington Post as well as disparage the Republican Party, saying that the media outlet "wrongly stated that a court decided the FCC has no authority over Internet service providers (ISPs)."
"What the DC Circuit Court of Appeals said was that the section (Title I) of the communications statute cited by agency lawyers did not support the FCC ruling against Comcast's blocking of BitTorrent."
"This was a predictable outcome of FCC actions during the Bush administration that consciously moved broadband Internet access from Title II, which would have supported the Commission's authority, to a murky place that invited court challenge."
"This was a major flip-flop from the historic - and successful - approach of forbidding discrimination on our communications networks. Now is the time to put broadband back under Title II, where it belongs - and under which many smaller companies continue to offer Internet access to the public."
"Nor is this debate about regulating the Internet. It's about whether consumers or a few huge ISPs will control consumers' online experiences."
"The Verizon-Google plan that The Post endorsed creates a two-tiered Internet at the expense of the open Internet we now have, almost completely excludes wireless and transforms the FCC from what is supposed to be a consumer protection agency into an agent of big business."
"I thought we'd had enough of that. To expect big telecom and cable duopolies to protect consumers while a toothless agency stands quietly by is to expect what never was nor will be."
Our question for the Chairman, the Commissioner, and other FCC members, respectfully is this: Since the Internet is neither broken nor in immediate danger of any serious problem in this regard that would negatively impact consumers in a significant way, what's the hurry?
Why not take a more measured approach, working with Congress to revise the Communications Act to explicitly delineate the FCC's authority with respect to broadband, which is drastically different from voice telephony, and to update other provisions for the digital broadband era that are now badly in need of modernization?
On as important a matter as this, and absent an urgent related crisis, wouldn't a careful, deliberative process be the preferred way to go? What would be the downside of slowing down to ensure we get this right?
Meanwhile this week, in the private sector, another group of companies negotiated a proposed net-neutrality compromise. Verizon once again participated, this time joined by AT&T, Microsoft, Skype, and others, including the National Cable & Telecommunications Association (NCTA), under the auspices of the Information Technology Industry Council (ITI).
This newest prospective agreement reflects greater restrictions than the proposal announced by Verizon and Google last month with respect to some of the more controversial issues, such as wireless Internet access, FCC authority, and a newly outlined area known as "managed services."
For example, with respect to wireless access - which, due to differences in technology, participants agree cannot be treated identically to wireline Internet access, at least in the immediate future - there could be prohibitions against blocking websites, although probably not against restricting particularly bandwidth-intensive applications.
While objective industry observers should welcome this proposal, David Linthicum sounded an alarm over it in his Lowdown on Net Neutrality and Cloud Computing, saying that, "The concept of allowing specific networks, especially wireless networks, to restrict or prioritize some traffic is a huge threat to the success of cloud computing."
His concern is that provider networks could give priority to larger cloud computing vendors that write big checks for the privilege, while smaller cloud computing start-ups, which can't afford such fees, could see access to their offerings slowed noticeably, or perhaps blocked from the network entirely. Of course, we share this concern, but our question is whether this would really make business sense.
Businesses using public cloud computing will demand multiple choices, and competition will prevent costs from rising irrationally. Moreover, the market will continue to show that the most innovative aspects of cloud computing typically come from very lean start-ups.
He concludes that this approach would be "bad for the web, bad for users, and very bad for cloud computing. Let's work together to make sure that the Internet stays open and neutral." We disagree: the natural effects of the marketplace will prevail and prevent this outcome.
Over at Techdirt, Mike Masnick's criticism of the new proposal was perhaps more on point: "I'm still at a loss as to how this actually matters. The companies can agree to whatever they want, and none of it makes a difference if Congress acts (or the courts say that the FCC is allowed to act). I guess the idea is to think that an 'industry agreement' will stave off legislation, which perhaps might work for some time, but still reeks of collusion without consumer input or review."
In our view, AT&T's submission to the FCC this week contained the most significant revelation that should influence the net-neutrality debate in a meaningful way. AT&T argued that paid prioritization already exists online. "AT&T alone has hundreds of third-party customers for such services," it noted, adding that these customers include "healthcare providers, community service organizations, restaurant chains, car dealers, electric utilities, banks, municipalities, security/alarm companies, hotels, labor unions, charities, and video-relay service providers."
As the Commission signaled last October, its new rules would ban any business practice adopted by a broadband provider that can reasonably be characterized as "discriminatory," except on a case-by-case basis, in terms of impact on content, applications, or services. Little guidance has been offered to explain what this might mean in practice.
An important area that seems to be overlooked in the Commission's approach, ironically, is consumers. Nothing in its proposal addresses possible discrimination against broadband subscribers, whose best interests and security should be of paramount concern to the FCC. Amidst all of this arguing, we tend to come down on the side of the more than twenty economists, including heavyweights Alfred Kahn and Vernon Smith, who opined that, "There is no economic basis for a finding of market failure in the markets at issue. There is strong economic evidence that the regulations would inhibit, or prohibit, efficiency enhancing conduct, thereby reducing competition, slowing innovation, deterring investment and ultimately reducing consumer welfare."
Bottom line: let's not succumb to some new form of taxation - or wealth redistribution - via regulation. Let's use the Labor Day weekend to step back from the arguments and take a breath and think about the stakes involved here from a broader perspective. Above all, let's avoid a rush to judgment on this extremely important issue.
If anything, encouraging further private sector dialog, factual disclosures about the practical aspects of various approaches, and most of all, further proposal development by industry participants in all quarters, are what show the most promise. Share wisely, and take care.
Kerry on Net Neutrality Debate: Take a Deep Breath
Excerpted from Multichannel News Report by John Eggerton
Network neutrality fan Senator John Kerry (D-MA) renewed his call for an open Internet, but also called on parties on both sides of the issue to cool the rhetoric.
Responding to the Federal Communications Commission's (FCC) announcement that it is seeking more comment on its network neutrality rulemaking proposal, which will put off any decision until at least the end of the year, Kerry said that he understood the strong "beliefs and interests" being expressed.
However, he said that "rather than retreat to our predictable corners, this should be a time when everyone takes a deep breath and continues to engage in a constructive process."
In calling for comment on applying open access to wireless broadband and permitting specialized services to allow for paid priority outside the public Internet, FCC Chairman Julius Genachowski also pointed to a meeting of the minds among stakeholders on many of the issues in the rulemaking. Kerry echoed that, and even borrowed the FCC's characterization of the growing agreement as a "narrowing of disagreement."
The Senator said that all "responsible parties" now agree that broadband should fall within the scope of the FCC and that networks should not pick winners and losers. That, he said, is progress.
Kerry also said he was actively working on a legislative solution to ensuring an open Internet was preserved and promoted, and added that in the meantime, the FCC has the "authority, ability, and responsibility" to preserve that openness "with or without a new law."
Kerry's call for calm and constructive dialog is notable given his passion for the issue. Last April, he blogged a call to arms on the issue, asking Daily Kos readers to contact their Senators after the "travesty" of the BitTorrent decision and in the face of networks who wanted to "throttle traffic" as they wish.
First Alpha of uTorrent Server for Linux Released
Excerpted from Softpedia Report
BitTorrent has proudly announced the immediate availability for download of the first alpha version of uTorrent Server for Linux and UNIX-like operating systems!
The uTorrent Server application provides a state-of-the-art implementation of the popular BitTorrent protocol and a full-featured web-based user interface (WebUI).
"This morning, we are announcing a preview release of the first of two new products for Linux users. uTorrent Server, an alpha version available immediately for download, is intended for users seeking a fast, powerful and lightweight BitTorrent client without the need of the full features and complexity of the native GUI."
"The server is a demonizable 32-bit binary of the uTorrent core, built for x86 compatible Linux. It can be managed programmatically via an HTTP API or interactively by using the included customized version of the popular uTorrent Web user interface," said the announcement.
Highlights of uTorrent Server include: Distributed Hash Table (DHT) support, UPnP port mapping, NAT-PMP port mapping, Upload rate limiting, Download rate limiting, Queuing support, Configurable limit on number of simultaneously uploading peers, Incremental file allocation, Block level piece picking, Separate threads for download and file-check, Single port and single thread for multiple torrent downloads, BitTorrent extension protocol, Multi-tracker extension support, Fair trade extension, Compact tracker extension, and Fast resume support.
Features also include: Queuing of torrent file-check if fast resume not possible, HTTP seed support, Resumption of partial downloads from other BitTorrent clients, File-sizes greater than 2GB, Selective download of multi-file torrents, IPv6 support, High performance network stack, and uTP - Advanced UDP-based transport with dynamic congestion control.
At the moment, this alpha version of uTorrent Server for Linux is available only as an archive, intended to be used for testing purposes only. Packages for various Linux distributions (i.e., Ubuntu, Fedora) will be available in the next releases.
The company added, "Today's version is only the first step, and we will continue to support the Linux user community with new versions in the near future. If you prefer to stick to more conventional user experience, rest assured we are working hard to build a full-featured client, coming soon."
"uTorrent Linux will offer the same clean and full featured UI that millions of users of of ÂµTorrent on Windows have enjoyed. We are hoping to get this out to you for testing in a few months. Stay tuned!"
Kazaa Ups Its Playlist Power
Excerpted from Brand-M Biz Report
Resurgent P2P outfit Kazaa has added over 500,000 playlists and other interactive features to its recently launched Playlist Project offering.
The outfit enables users to share lists within the Kazaa community, and also through social networking sites such as Twitter and Facebook.
"The best thing about the playlist feature is that it connects people through music," says Emanuel Krassenstein, the company's CTO.
"Kazaa aims at providing the best music discovery environment that encompasses various music genres from pop to country, Latin to hip-hop, where music fans can interact with artists, albums, and songs."
QTRAX Announces September Marketing Campaign
QTRAX, a division of Brilliant Technologies Corporation, announced today that it will begin its marketing campaign in September to inform the public of the QTRAX beta, which has been operating in six countries.
Allan Klepfisz, Co-President & CEO said, "We have been doing beta testing for several months now with a significant catalogue and full functionality except for portability. While the site has been open to the general public, we have deliberately not publicized it during this phase. We will start doing so in a measured way, during September. We have learned that even modest publicity has a profound effect on unique monthly visits, duration of visits, bounce rate etc."
Lance Ford, Co-President & CMO said, "Our marketing program will be multi-tiered but initially will be focused on PR and SEO in the pertinent territories. It is time to introduce QTRAX and initially we will do so in a targeted way for each territory rather than via a global campaign."
QTRAX is currently engaged in beta testing in the Asia-Pacific region. It will soon be rolling out to other territories and introducing portability.
QTRAX is the world's first free and legal global download music service and showcases an innovative 360 degree revenue model that easily directs revenue back to artists and rights holders.
QTRAX has successfully signed licensing deals with major labels, music publishers and leading indies. QTRAX will soon provide fans in multiple regions with access to a colorful and diverse catalog of high-quality, high-fidelity digital music files. Based in New York City, QTRAX is a subsidiary of Brilliant Technologies Corporation, a publicly traded technology holding company.
Major Record Label Founder Says Internet Not Music's Enemy
Excerpted from Zeropaid Report by Drew Wilson
Shortly after Stevie Nicks said that the Internet has destroyed rock, the founder of Elektra Records - an RIAA member label, according to RIAA Radar - says that music has a bright future and that the Internet is not the enemy.
There's a fascinating article over on CNet showing that Jac Holzman, who founded Elektra Records and helped bring you artists such as The Doors, has had some positive things to say about the digital age.
He helped push the record industry to adopt the CD and was quoted as saying, "I think the music industry has a bright future" when discussing the Internet. Here are two excerpts from the article:
In music, Holzman saw the rise of the LP, 8-track tape, DAT, compact disc, MP3, and BitTorrent. After all that, new technologies don't spook him. On the contrary, he says many of these technologies helped make a lot of artists and industry people rich. When it comes to the Internet and digital distribution, Holzman is confident music labels can capitalize on them, too. He says they really don't have a choice.
"I was having lunch with a very dear friend of mine in the record business sometime around 2000," Holzman said during an interview this week with CNet. "We met right around the time when Napster came together, and I said, 'There are opportunities and there are potholes. How are you preparing for a digital future?' He said to me, 'Jac, I just want it to go away.' Well, you can't continue that conversation."
Holzman suggested that the big labels goofed when they sued Napster out of existence. At that point, the rise of the CD had left the industry without an effective way to sell individual songs. Before the CD, the 45-rpm vinyl disc was the perfect singles vehicle. The costs of manufacturing CDs, however, made that format more suited to selling full albums, according to Holzman.
"With Napster, it would have been easy to proliferate singles," Holzman said. "You would have had no manufacturing costs. You would still have the value of the single as a calling card for albums and you could have sold songs for something like 79 cents, made it affordable. You would have had ability to count because all of the transactions went through a central server at Napster, unlike P2P where you bypassed servers. Now, would P2P still have happened? Yes it would. But we would have established a principle of being paid for digital music."
Holzman also sees positive things when it comes to the re-use of copyrighted works through fair use. Additionally, he thinks that suing music fans is a mistake and that ISPs should share some profits from the music that has been flowing through their networks.
All this comes after Stevie Nicks, in spite of evidence suggesting otherwise, blamed the Internet for destroying rock, and after John Mellencamp called the Internet the most dangerous invention since the A-bomb.
So where does this leave the mainstream American music industry? I think it shows that not everyone in the RIAA ranks holds the opinion that the Internet is destroying music.
Even within the RIAA, there are opposing views about the digital age and how it affects music. It's a lot like the misconceptions about those who support digital rights. Those who support a loosening of copyright laws don't necessarily all say that copyright should be abolished and that everything should be free.
In fact, many who support a more liberal copyright law even say that they are more than happy to pay for copyrighted material.
In the ranks of the RIAA, not everyone is of the extreme point of view that the Internet should just be dismantled. Do such people exist? Yes. Are they all of the same opinion? No.
That doesn't mean the debate hasn't shown signs of polarity. I think it's the extreme points of view that have this effect, though. If one person says that the Internet should be abolished, a large number of people with various points of view will rally against a call like that.
Overall, it's very refreshing to see something like this surface.
Music Industry Could be Transformed by Cloud
Excerpted from TMCNews Report
Tech heavyweights promise to send music collections to the cloud. Could Apple and Google finally deliver on the promise to make your music collection redundant? The next revolution in digital entertainment promises instant access to a vast music library, straight from the cloud.
From the sagging shelf of vinyl, through teetering stacks of CDs to a hard drive stuffed with MP3s, personal music collections have adapted to, even thrived on, the technological changes of the last few decades. But actually owning music could soon become a thing of the past - because of the cloud.
Cloud music services, which enable you to stream music from the Internet to your computer or phone, have been around for a few years. But so far, services such as Pandora, Last.fm, and Spotify have had little discernible effect on music lovers' appetite for owning songs.
But the number of sites is growing, along with the range of services they offer, and with Apple and Google gearing up to offer their own services, could the era of the digital download be coming to an end?
Typically, such services offer unlimited access to a music store of millions of songs, either interspersed with advertisements or for a subscription. The latest versions also promise to give people access to their own music collection from anywhere with an Internet connection.
The extra bandwidth and data transmission speeds of next-generation (4G) mobile networks mean it could soon be possible to listen to music streaming straight from the Internet anywhere there's reception. When that happens, music ownership might start to look unappealing.
Who would choose to own music when every song is available to stream wirelessly, on demand?
Last December, Apple acquired Lala, a music-streaming service launched in 2006 that could scan a user's hard drive for music files and replicate them in the cloud. This technology is expected to form the basis of a cloud-based update to iTunes, and although Apple's intentions were not clear as New Scientist went to press, the company is building a huge data center in North Carolina that is set to be one of the largest in the world - handy if you were planning to host a lot of music files.
Google appears to be taking a different approach. In May, it acquired Simplify Media, a California-based start-up whose technology enables people to access media files, such as photos and music, across multiple devices without having to synchronize them. So instead of having to wait while files are transferred from your hard drive to your mobile music player, the technology will send the files into the cloud and let you access them from there. The app will run on Google's Android smart-phone operating system.
Being able to connect people with their desired song is only one aspect of the process, however. Perhaps the greater challenges are persuading the music companies that own the rights to the songs to get on board, and devising a workable royalty system.
Difficulties with this have scuppered many earlier attempts at delivering cloud music. For example, Swedish music service Spotify, while popular in Europe, has been unable to agree with the major record labels on how much their artists would receive each time a user accesses their song. As a result, it has been unable to expand into the US.
An early cloud service from Yahoo left many users disgruntled when changes to its digital rights management (DRM) technology meant they could no longer access songs they had paid for.
Seamless transmission of the music also needs perfecting. One of the primary means of accessing music over the Internet is via streaming, which allows users to start playing the music before the entire file has been transmitted. Typically, streaming files do not get written permanently to the hard drive, so each time a user wants to listen to a song they have to start a new stream. It can also be a slow process when a lot of users are online at the same time, and such bandwidth issues are only magnified when it comes to mobile phones.
Spotify has built its service on blending streaming with peer-to-peer (P2P) networking. P2P technology pulls together chunks of a file from multiple sources to create the whole. Songs can get started quickly from the company's servers, while the rest of the data is gathered via P2P, says Gunnar Kreitz, an engineer with Spotify. This decreases the amount of server space and bandwidth used, making it feasible to deliver a cloud-based music streaming service. "By combining P2P and server streaming, we can get the best of both worlds," Kreitz says.
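The hybrid delivery Kreitz describes can be sketched in a few lines: the first chunks of a track come from a central server so playback starts quickly, while the remainder is assembled from peers. The chunk counts and the fetch functions below are simplified placeholders, not Spotify's actual protocol.

```python
# Toy model of server-seeded, peer-completed streaming: prefer the
# server only for the opening chunks, then offload to the swarm.

def assemble_track(num_chunks: int, server_chunks: int,
                   fetch_server, fetch_peer):
    """Return all chunks in order, switching to peers after the start."""
    chunks = []
    for i in range(num_chunks):
        if i < server_chunks:
            chunks.append(fetch_server(i))   # low-latency start of playback
        else:
            chunks.append(fetch_peer(i))     # bulk of the data from peers
    return chunks

# Stand-ins for real network fetches:
fetch_server = lambda i: f"server-chunk-{i}"
fetch_peer = lambda i: f"peer-chunk-{i}"

track = assemble_track(num_chunks=5, server_chunks=2,
                       fetch_server=fetch_server, fetch_peer=fetch_peer)
print(track)
# ['server-chunk-0', 'server-chunk-1', 'peer-chunk-2', 'peer-chunk-3', 'peer-chunk-4']
```

The design trade-off is exactly the one Kreitz names: the server guarantees a fast start, while the peer swarm carries most of the bandwidth cost.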
California-based mSpot keeps bandwidth down in a different way. Its application scans a user's own music library and enables them to access it via the cloud. By letting users dictate which songs are hosted in the cloud and which are stored on their mobile device, it even promises to allow users to access their music when they have no mobile reception, says mSpot's chief technology officer, Ed Ho. The software "pre-fetches" a user's most listened-to songs when the web connection is good, ensuring they can be played if the connection drops.
Its charges also vary with how much bandwidth a user consumes, giving the company more control over how much is taken up. One potential drawback is that mSpot uses data compression technology to squeeze down the size of the files being transmitted, thereby losing some of the sound quality.
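The "pre-fetch" idea Ed Ho describes above reduces to caching a user's most-played songs locally while the connection is good, so they survive a dropped connection. The sketch below is illustrative only; the play counts, cache size, and function names are assumptions, not mSpot's implementation.

```python
# Toy pre-fetch cache: keep the top-played songs on the device so they
# remain playable with no reception.

from collections import Counter

def prefetch(play_counts: Counter, cache_size: int) -> set:
    """Pick the most-played songs to store locally."""
    return {song for song, _ in play_counts.most_common(cache_size)}

def play(song: str, online: bool, cached: set) -> str:
    # Online: stream as usual. Offline: fall back to the local cache.
    if online:
        return f"streaming {song}"
    return f"playing {song} from cache" if song in cached else "unavailable offline"

plays = Counter({"song_a": 40, "song_b": 25, "song_c": 3, "song_d": 1})
cached = prefetch(plays, cache_size=2)
print(sorted(cached))                              # ['song_a', 'song_b']
print(play("song_a", online=False, cached=cached))  # playing song_a from cache
```

The same structure also explains the bandwidth saving: songs served from the cache never touch the mobile network at all.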
Corynne McSherry, a staff attorney with the Electronic Frontier Foundation, a digital-rights pressure group in San Francisco, sounds a note of caution over cloud music services. She believes licensing problems like those with Yahoo will be difficult to overcome. "Downloading the MP3, having the actual file, gives you more control over how you use the music," she says.
She thinks that while some people will continue to want to own recordings, the culture of music ownership could eventually die out. "You might have a next generation of people who say: 'I don't want to bother with having my own collection of music.' I don't know when that date will come."
Online Video Usage Surges Thanks to Streaming Sites
Excerpted from Online Media Daily Report by Gavin O'Malley
Over the past year, the amount of time American audiences spent watching video from the major live video publishers has grown 648%, to more than 1.4 billion minutes, according to comScore.
By comparison, the amount of time American audiences spent watching YouTube and Hulu increased 68% and 75%, respectively, over the same time period.
"Live online video sites have not only been successful in building audience, but also in keeping that audience tuned-in," says comScore Web video specialist Andres Palmiter. As Palmiter notes, the average live streamed video view is 7% longer than the average online video view. "If you narrow the audience to a specific demographic, though, live video really begins to prove its advertising value to media planners." Live video sites are 72% more likely to deliver the elusive demographic, males age 18-34, than the average online video site.
Palmiter says the success of live streaming video is due in no small part to sites' willingness to build out their technology infrastructure to provide a better user experience. "For instance, Justin.tv recently announced mobile applications for Android and iOS, the former allowing users to live stream from their mobile device," he said. "The growth of broadband (both through regular and cellular networks) has made features unthinkable two years ago a reality today."
Along with Justin.tv, other top live video publishers include USTREAM, Livestream, LiveVideo, and Stickam. In July, USTREAM reached over 3.2 million unique viewers, beating out Justin.tv's 2.6 million viewers, and Livestream's 2.4 million. As Palmiter notes, however, Livestream served more than 160 million videos, compared to roughly 130 million from Justin.tv and 20 million from USTREAM. "Those 20 million videos on USTREAM, however, were viewed eight minutes longer on average than videos on Justin.tv and 17 minutes more than those on Livestream." In terms of total minutes, viewers logged almost 900 million minutes watching Justin.tv in July -- outpacing the other two sites.
YouTube is also rumored to be seriously considering a live streaming video service. Google recently integrated its Moderator service -- which lets users vote on user-submitted ideas -- into YouTube. In the past, YouTube used the service for special events, and granted only select users access. Similarly, YouTube has sparingly employed live-streaming for presidential speeches, healthcare debates, cricket matches and a U2 concert.
Regarding online video in general, "A lot has changed in the last 10+ years," says Palmiter. "YouTube, once maligned for its streaming quality, can now pump out videos in 4K (for the uninitiated, that's 4x the pixels of broadcast/cable HD), most online TV programming can be found in HD, and even the cheapest camcorders have the capability to upload a HD video."
ShareThis Puts a Value on Shared Content
Excerpted from Online Media Report by Laurie Sullivan
ShareThis plans to release two analytics tools that allow advertisers and marketers to determine the value of content being shared across websites. Through the two tools, Social Reach and Audience Index, brands have an opportunity to understand the value of social traffic.
Social Reach measures the true value of shared media across the web by looking at inbound social traffic and outbound sharing, valuing the responder of a share as much as the sharer. The analysis aims to provide more data than the Facebook, Twitter, and Tweetmeme buttons that measure only outbound sharing.
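The scoring idea described above can be sketched in a few lines: credit a piece of content for both the outbound share and each inbound response it generates, weighting responders as much as sharers. The event format and equal weighting are assumptions for illustration, not ShareThis's actual model.

```python
# Toy illustration of a "Social Reach"-style score: each share and each
# response (inbound click on a shared link) adds equally to an article's
# score. Event shape and weights are hypothetical, not ShareThis's model.
from collections import defaultdict

def social_reach(events):
    """events: iterable of (article_id, kind), kind is 'share' or 'response'."""
    scores = defaultdict(int)
    for article_id, kind in events:
        if kind in ("share", "response"):
            scores[article_id] += 1   # responder valued as much as the sharer
    return dict(scores)
```

Counting responses as well as shares is what distinguishes this from a simple share-button tally, which sees only the outbound half of the interaction.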
Audience Index measures and segments a publisher's audience by influence, so it identifies who has shared, responded and viewed content from their site. It indexes the information by category and matches it against other sites across the web.
Some early data shows that social traffic engages consumers more than search traffic, according to ShareThis CEO Tim Schigel. "It measures the social reach and allows publishers to measure it by article," he says. "They also can index their reach from the articles on their site against the rest of the network."
Publishers can segment the data by media channel, topic, and trending topic around what's being shared. The tools - a move in response to plummeting CPMs on publisher sites - become available following Labor Day. A few publishers have been using them for the past month.
Advertisers have begun to judge publishers' sites based on social-traffic engagement. The act of sharing a piece of content has become a key indicator of a campaign's success, but until now measurement tools did not exist.
ShareThis has been building the tools into the company's analytics dashboard since May. Both measure large amounts of data in real time across 850,000 publisher websites. Today, ShareThis pulls in information from about 10 billion web pages per month. Within the next few months, that number will rocket to between 15 billion and 20 billion after several large deals are signed.
Engage author Brian Solis told a packed room at an industry event Thursday night how social media is more than a gimmick; it's rather a tool to change the experience of customers and people who influence their decisions. "If you're creating content, make sure it's not just relevant, but worth sharing," he says.
Putting a price on a piece of content being shared lets marketers manage campaign budgets to acquire traffic. People engaged in sharing on a specific topic show intent as high as -- or higher than -- search traffic, Schigel says. "We're finding advertisers request this information," he says. "It was latent in our data, but we didn't expose it until they started asking for it."
Brand marketers strive to reach the top of the funnel to create a wider audience. This type of feedback can also provide new insights that inform display advertisers.
Why Can't Tech Companies Break into TV?
Excerpted from The Atlantic Wire Report by John Hudson
America's "digital living room" has attracted the country's biggest tech companies, such as Apple, Google, Microsoft, and Amazon. But if the lukewarm reception to Apple TV reveals anything, it's that Silicon Valley is still a long way from taking over your TV set. For proof, look no further than Steve Jobs, who raised serious doubts about expanding into television just two months ago in an interview with All Things Digital:
The problem with innovation in the TV industry is the go-to-market strategy. Ask TiVo, ask Roku, ask us. Ask Google in a few months. It's not a problem of technology, it's not a problem of vision, it's a problem of go to market strategy. TV is very tower of Babel-ish, it's Balkanized.
Jobs' commitment to Internet TV has always been somewhat of a mystery. But the new update of Apple TV suggests renewed enthusiasm. However, if Apple TV doesn't have what it takes to be a mainstream success (which is what most observers are saying), do any of the tech giants stand a chance? Here's what's standing in their way:
All the TV networks must get on board, writes Eliot Van Buskirk at Wired: "The biggest promise of devices such as Apple TV, from the consumer's point of view, is that they might - at long last - allow them to 'cut the cord,' replacing their cable or satellite connections with an Internet-connected set-top box, the same way many have replaced their landlines with cell-phones. But with only two networks - ABC and Fox - included in Apple's new television rental program, the only way a television viewer with normal viewing habits would be able to cut the cord using the new Apple TV would be to wait a day and download unsupported new shows from BitTorrent (more on that below), while relying on Netflix for older shows."
Here's why the tech giants fail, writes Larry Dignan at ZDNet. He takes each tech company on individually. "Apple TV was a flop and probably will be again. Apple TV probably needs to be a real TV not a little box with an HDMI port. Google has the same problem as Apple: content companies are wary of the search giant, but will deal with the company just so Jobs doesn't control their fate. Amazon can rent you movies and sell you content in your living room, but the e-commerce company is largely a wild-card."
The one company with a close shot at conquering the "digital living room" is Netflix, argues Dignan:
The real deal is that Netflix CEO Reed Hastings and his company have navigated the digital entertainment landscape better than anyone. Netflix's ability to navigate the turbulent entertainment business is absolutely brilliant.
You pick a screen or consumer electronics device and you're likely to find Netflix. And it's all you can eat for a subscription. The real genius with Netflix: The company isn't a huge threat to cable or any of the incumbents. If anything Netflix is more HBO killer than Comcast killer.
Clearly the networks have dug in their heels, writes Andrew Wallenstein at the Hollywood Reporter: "Although Apple is touting the inclusion of Disney and News Corp., the holdouts are hoping their absence from iTunes rentals will weaken it. Already they're taking pride in having successfully prevented some of Apple's previous efforts, including a 99-cent download (as opposed to rental streaming) and a bundled monthly offering."
Eliot Van Buskirk adds, "The fact that one of only two launch partners Apple could secure is ABC - owned by Disney, of which Jobs is the largest shareholder - is not exactly a hopeful sign that the networks will be climbing aboard anytime soon."
America's Most Underestimated Company
Excerpted from Financial Post Report by James Ledbetter
People who think and write about technology companies for a living are prone to be wrong now and again. Try to find, for example, veteran analysts or journalists who haven't at some point made a claim about Apple that they didn't later regret. The technology sector is too dynamic, and the growth of certain technologies too explosive and unpredictable, for anyone to be right all the time. That's part of the fun.
But there is one company that has been more consistently underestimated than any other, whose innovations, growth, and, indeed, survival have been dismissed and denied for nearly all its corporate life. That's Netflix. Despite a long-term record of success, the company has repeatedly seen its stock, its technology, and its very business model publicly derided.
How nasty and wrong have the critics been? In 2005, Michael Pachter, an analyst for Wedbush Morgan Securities, called Netflix "a worthless piece of crap with really nice people running it." Today, that worthless piece of crap has a market capitalization of $6.4 billion. In early 2007, when Netflix first announced its plans to allow subscribers to stream videos instantly-rather than wait for DVDs to arrive in the mail-esteemed tech journalist Om Malik predicted that this move would "soon be relegated to the dustbin of failed ideas." Netflix has more than doubled its subscriber base since then, and today nearly two-thirds of them use Netflix's streaming video service.
Some Netflix skeptics have been honest enough to admit their errors. In October 2006, Jim Cramer memorably donned sackcloth and ate a piece of a hat with the stock symbol NFLX on it. His sin: He told his viewers to sell Netflix at $19 a share. Today, it's trading at more than $130.
While its critics were flailing away, the company has continued to grow steadily and spread its influence well beyond the red envelope. One of Netflix's direct competitors, Blockbuster-which for years was supposed to put Netflix out of business-is teetering on the edge of bankruptcy. Netflix's iPad app was widely deemed one of the best available when the device launched in April. And when Apple announced today that its new Apple TV service would stream movies and TV shows, Netflix content was front and center.
What is it about Netflix that causes critics to misread it so badly? Call it the innovator's paradox: Netflix forged an identity by building a simple business-DVD delivery by mail-that had never been done before. The very fact that this DVD-by-mail idea connected so deeply with consumers led many observers to think that was all that Netflix could or would ever do. Instead, the DVD delivery service-while still vital to Netflix's revenue-looks more like the Trojan horse of a much wider strategy designed to change how Americans watch filmed entertainment.
The company's critics have also tended to focus on technological platforms, rather than what consumers actually want. Netflix, like Amazon, has built its relationship with customers extremely carefully and successfully-some 15 million people now send Netflix money every month. (How many nonutility companies can boast that?) As long as it continues to keep its customers happy, it should be able to transfer them to whatever platform-DVDs by mail, streaming over the Xbox or Wii or set-top boxes, the iPad, the iPhone-those customers want.
This isn't to say that Netflix is bulletproof or that it's possible to justify the more than doubling of its stock price in the last year. There are all sorts of threats to its business. These threats come from competing technologies like BitTorrent and from stakeholders in Hollywood. But perhaps the biggest threat is from Amazon. For years, it's been rumored that Amazon would take over Netflix, even though Netflix's real-world distribution centers would create huge tax headaches for Amazon. Even if Amazon did once have designs on Netflix, it appears to have decided that if it can't join 'em, it'll beat 'em with the apparently imminent launch of a subscription TV service.
It would be silly to deny that competitors may eat away at Netflix's lunch, and it's not as if Netflix has a fallback. But as Daniel Roth elegantly put it in a Wired story a year ago: "There are a million different ways for Netflix to fail. But that has always been the case. Netflix should have failed already, taken down by Blockbuster or Wal-Mart, kneecapped by Hollywood, made irrelevant by BitTorrent or iTunes." If you want to predict the demise of Netflix, go ahead-but somewhere there's a hat for you, ready to be eaten.
FTC Closes LimeWire Privacy Probe without Taking Action
Excerpted from Online Examiner Report by Wendy Davis
In February, the Federal Trade Commission (FTC) warned a host of businesses, local governments and schools that sensitive data about their employees and customers had ended up on file-sharing networks. At the same time, the FTC said it was investigating individual companies to determine whether they exposed private data online, a potential violation of various federal laws like the Fair Credit Reporting Act.
Now the FTC has publicized the results of one investigation, that of peer-to-peer network, LimeWire. And the results are good news for the company: The FTC closed its investigation without taking action.
The FTC said that one factor that led it to close the probe was that recent versions of LimeWire software incorporate mechanisms aimed at preventing the accidental sharing of personal documents. The agency added that it expects LimeWire "to continue to advise consumers to upgrade legacy versions of its software" and also "to participate in software industry efforts to inform consumers about how best to avoid the inadvertent sharing of sensitive documents."
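The kind of safeguard the FTC credits here can be sketched as a simple allow-list filter: before a file is offered for sharing, check its type, blocking likely documents outright and sharing only recognized media formats. The extension lists and function name below are illustrative assumptions, not LimeWire's actual implementation.

```python
# Hypothetical sketch of an inadvertent-sharing safeguard: deny likely
# personal documents, allow only known media types, and default-deny
# anything unrecognized. Extension lists are illustrative only.
from pathlib import Path

SHAREABLE_EXTENSIONS = {".mp3", ".ogg", ".flac", ".avi", ".mpg"}
SENSITIVE_EXTENSIONS = {".doc", ".docx", ".pdf", ".xls", ".csv", ".tax"}

def may_share(path: str) -> bool:
    ext = Path(path).suffix.lower()
    if ext in SENSITIVE_EXTENSIONS:
        return False                      # never share likely documents
    return ext in SHAREABLE_EXTENSIONS    # default-deny anything unknown
```

The default-deny stance matters: the privacy failures the FTC warned about came from users unknowingly sharing whole folders, so an unrecognized file type is safer excluded than included.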
The FTC's decision here makes sense. While people who use file-sharing networks can compromise others' privacy, so can those who use e-mail, blogs, social networking sites, and the like. In fact, any time an individual posts information online, the potential exists that someone's private data will be exposed.
The answer isn't to target the technology, but the companies and individuals who use particular platforms to broadcast private data. The FTC currently is considering complaints against Google, which arguably exposed some users' private contacts when it launched Buzz, and Facebook, which riled privacy advocates by launching "instant personalization" - a program that tells Yelp, Microsoft Docs and Pandora the names of visitors who are signed in to Facebook at the time.
Those complaints certainly appear more valid than allegations that file-sharing software companies violate people's privacy.
Coming Events of Interest
NY Games Conference - September 21st in New York, NY. The most influential decision-makers in the digital media industry gather to network, do deals, and share ideas about the future of games and connected entertainment. Now in its 3rd year, this show features lively debate on timely cutting-edge business topics.
M2M Evolution Conference - October 4th-6th in Los Angeles, CA. Machine-to-machine (M2M) embraces the any-to-any strategy of the Internet today. "M2M: Transformers on the Net" showcases the solutions, and examines the data strategies and technological requirements that enterprises and carriers need to capitalize on a market segment that is estimated to grow to $300 Billion in the year ahead.
Digital Content Monetization 2010 - October 4th-7th in New York, NY. DCM 2010 is a rights-holder focused event exploring how media and entertainment owners can develop sustainable digital content monetization strategies.
Digital Music Forum West - October 6th-7th in Los Angeles, CA. Over 300 of the most influential decision-makers in the music industry gather in Los Angeles each year for this incredible 2-day deal-makers forum to network, do deals, and share ideas about the business.
Digital Hollywood Fall - October 18th-21st in Santa Monica, CA. Digital Hollywood Fall (DHF) is the premier entertainment and technology conference in the country covering the convergence of entertainment, the web, television, and technology.
P2P Streaming Workshop - October 29th in Firenze, Italy. ACM Multimedia presents this workshop on advanced video streaming techniques for P2P networks and social networking. The focus will be on novel contributions on all aspects of P2P-based video coding, streaming, and content distribution, which is informed by social networks.
Streaming Media West - November 2nd-3rd in Los Angeles, CA. The number-one place to come see, learn, and discuss what is taking place with all forms of online video business models and technology. Content owners, viral video creators, online marketers, enterprise corporations, broadcast professionals, ad agencies, educators, and others all come to Streaming Media West.
Fifth International Conference on P2P, Parallel, Grid, Cloud, and Internet Computing - November 4th-6th in Fukuoka, Japan. The aim of this conference is to present innovative research results, methods and development techniques from both theoretical and practical perspectives related to P2P, grid, cloud and Internet computing. A number of workshops will take place.