Posted:


In 2009 we helped a group of researchers and industry partners launch Measurement Lab (M-Lab), an open platform for broadband measurement tools. Over the past two years, M-Lab has grown significantly – more than 300 terabytes of data from over half a billion tests are now publicly available.

M-Lab tools help individuals understand the performance of their own broadband connections, but making sense of that much data in the aggregate is more complicated. That’s why we’re happy to announce that, working with M-Lab, we have developed a set of maps to help investigate this huge dataset using Google’s Public Data Explorer.

The visualizations show median upload and download speeds, as measured by M-Lab tools, across the United States, Europe, and Australia, and you can drill down to city-level aggregates. You can also see the extent to which speeds are limited by problems with users’ network connections or with their computers (or other devices).

The maps are built entirely on open data collected by the Network Diagnostic Tool (NDT), an open source tool developed by Internet2 and widely deployed. The platform, the tool, and the data are all open – which means the Internet community can vet the measurement methodology, perform independent analysis of the same data, and build their own visualizations. In fact, the M-Lab data provide much more information than what’s presented in these visualizations, and we hope that our effort will help drive future research in this area.
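
For readers who want to explore the raw data themselves, here is a minimal sketch, in Python, of the kind of aggregation behind city-level medians. It is not the actual pipeline behind these maps, and the record fields are hypothetical placeholders.

```python
# A minimal sketch of computing city-level median download speeds from
# NDT-style test records. The field names below are hypothetical.
from statistics import median
from collections import defaultdict

tests = [
    {"city": "Boston", "download_kbps": 8200, "upload_kbps": 1100},
    {"city": "Boston", "download_kbps": 5400, "upload_kbps": 900},
    {"city": "Denver", "download_kbps": 3100, "upload_kbps": 600},
]

by_city = defaultdict(list)
for t in tests:
    by_city[t["city"]].append(t["download_kbps"])

for city, speeds in sorted(by_city.items()):
    print(f"{city}: median download {median(speeds)} kbps over {len(speeds)} tests")
```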

On Wednesday, the Open Technology Initiative will be hosting a panel discussion on M-Lab in Washington, D.C. In a keynote, Vint Cerf will explain how M-Lab is helping analyze broadband performance and promote good science. For those who can’t attend in person, the event will be live-streamed via the web, starting at 10:30 a.m. EST.

Update (3/23/11): Check out video of the event, below.

Posted:


(Cross-posted from the Official Google Blog.)

In the same way your phone is associated with a unique number, your computer is assigned a unique Internet Protocol (IP) address when you connect to the Internet. The current protocol, IPv4, allows for approximately 4 billion unique addresses—and that number is about to run out.

This morning the Internet Corporation for Assigned Names and Numbers (ICANN) announced (PDF) that it has distributed the last batch of its remaining IPv4 addresses to the world’s five Regional Internet Registries, the organizations that manage IP addresses in different regions. These Registries will now assign the final IPv4 addresses within their regions until they run out completely, which could happen as soon as early 2012.

As the last blocks of IPv4 addresses are assigned, adoption of a new protocol—IPv6—is essential to the continued growth of the open Internet. IPv6 will expand Internet address space to 128 bits, making room for approximately 340 trillion trillion trillion addresses (enough to last us for the foreseeable future).
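
As a quick back-of-envelope check on those figures (our own sketch, not part of the announcement itself):

```python
# IPv4 uses 32-bit addresses; IPv6 uses 128-bit addresses.
ipv4_addresses = 2 ** 32    # ~4.3 billion
ipv6_addresses = 2 ** 128   # ~3.4 x 10^38, i.e. "340 trillion trillion trillion"

print(f"IPv4: {ipv4_addresses:,}")
print(f"IPv6: {float(ipv6_addresses):.2e}")
```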

Google, along with others, has been working for years to implement the larger IPv6 format. We’re also participating in the planned World IPv6 Day, scheduled for June 8, 2011. On this day, all of the participating organizations will enable access to as many services as possible via IPv6.

Today’s ICANN announcement marks a major milestone in the history of the Internet. IPv6, the next chapter, is now under way.

Posted:


Internet users deserve to be well-informed about the performance of their broadband connections, and good data is the foundation of sound policy. So I'm excited to see that the FCC has launched a "beta" consumer broadband test on broadband.gov today. The site provides access to two third-party measurement tools, and is "the FCC's first attempt at providing consumers real-time information about the quality of their broadband connection."

One of the tests is provided through Measurement Lab (M-Lab), the open server platform that a group of researchers and other organizations created with our help last year. The FCC allows users to run the Network Diagnostic Tool (NDT) -- an open source tool developed by Internet2 -- and see their estimated download and upload speeds. They can also see the estimated latency and jitter of the connection between their computer and an M-Lab server.
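
For readers unfamiliar with the term, jitter is the variation in latency between successive measurements. The snippet below is only an illustrative definition, not necessarily the exact formula NDT uses.

```python
# Summarize jitter as the average change between consecutive round-trip times.
def jitter_ms(rtt_samples_ms):
    diffs = [abs(a - b) for a, b in zip(rtt_samples_ms, rtt_samples_ms[1:])]
    return sum(diffs) / len(diffs) if diffs else 0.0

print(jitter_ms([42.1, 44.8, 41.9, 47.3, 43.0]))  # made-up RTT samples in milliseconds
```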

Since M-Lab launched, a number of partners have joined to add new tools, improve the platform, and make the data more accessible. One of M-Lab's core goals is to help advance network research, and we're thrilled to have the FCC contribute to this effort as well. All M-Lab test results are made open and publicly available so that researchers can build on and learn from the data without restriction. By pointing users to this tool, the FCC is contributing to this open pool of broadband data. (Note that as part of these tests, the FCC asks users to submit their addresses; to be clear, M-Lab is not collecting any of this information.)

The FCC has also said that the forthcoming National Broadband Plan will recommend various measures to improve broadband transparency. As we stated in previous comments, we think it's important for the Commission to consider the complementary ways it can use multiple measurement and data-collection methodologies, and we look forward to seeing what else the Plan recommends.

For now, you can head over to broadband.gov to try out this first step.

Posted:
Posted by Richard Whitt, Washington Telecom and Media Counsel

Today the FCC asked Congress for a one-month extension to deliver its National Broadband Plan, explaining that it needs more time to review the public record and to brief key officials. The deadline originally had been set for February 17th.

For years Google has been a vocal proponent of a national broadband strategy, and we're eager to see this plan delivered to Congress as soon as possible. That said, given the immense challenges faced by FCC staff in putting together such a comprehensive and far-reaching document, asking for an additional four weeks is not at all unreasonable. A broadband plan for our country may be too many years overdue, but with so much at stake, it's important to get this done right.

We continue to believe that the FCC should set both shorter-term "ubiquity" goals of bringing high-speed Internet access to every single American, and longer-term "stretch" goals of driving the adoption of truly high capacity broadband pipes. We hope the Commission shares this dual vision, and we look forward to seeing their final product.

Posted:


One of Google's top policy priorities is spurring the availability and uptake of affordable, open broadband Internet service. The Internet may have been invented in the United States, but unfortunately in too many places we continue to lag behind Asia and Europe when it comes to broadband speed, penetration, and adoption.

We've been working closely with FCC staff over the past several months as they prepare to deliver a National Broadband Plan to Congress in February, and to date they've shown a strong commitment to providing the best possible blueprint for action.

As we explained in our initial comments, we think it's essential that in addition to instituting some constructive near-term solutions, the plan also should include some explicit, ambitious – and ultimately achievable – longer term goals for bringing ultra-high broadband speeds to all Americans. Those goals should be supported by our country's best thinking about various potential pathways to achieving them.

Today, in a letter to FCC Chairman Julius Genachowski, House Communications Subcommittee Chairman Rick Boucher called on the Commission to commit to specific "stretch" goals as part of its overall plan – and we agree. Without including in the plan some future-focused benchmarks for speed and service, our nation risks losing the opportunity to make robust, nationwide broadband access a reality for American consumers.

Affordable, high-speed Internet access can drive economic growth, job creation, and education. We should not be satisfied with shorter-term fixes alone that likely will still leave us lagging behind the rest of the industrialized world.

Posted:


How are the performance and quality of broadband networks changing over time? How does the service experienced by users on certain networks compare against others?

Today, Measurement Lab (M-Lab) took another step to help answer these types of questions. Two M-Lab researchers have publicly released the results from over 150,000 broadband connection speed and quality tests run by users all over the world. Anyone can use the datasets without restriction, under a "no rights reserved" Creative Commons Zero waiver.

As we've discussed here before, M-Lab is an open server platform for researchers to deploy broadband measurement tools. This project is a collaborative effort led by researchers, with Google and other partners around the world providing additional support.

Thousands of users are now running tests every day on M-Lab, and while only results from two tools – NDT and NPAD – are available right now, all data collected by M-Lab researchers will be released in the near future. Amazon Web Services is providing M-Lab with free data hosting through its Public Data Sets program, and M-Lab would welcome the participation of others who want to host the data and make it easier to access.

The raw data are not yet in a form that's easily intelligible to average users, but since re-use of the data is entirely unrestricted, anyone is free to analyze the information, mash it up with maps, or create other user-friendly reports. In addition, M-Lab requires that tools' source code be open, so that anyone can review, understand, and build upon the testing methodologies. We think this kind of openness is critical to developing robust, reliable broadband measurement.

Posted:


In our response today to the FCC's inquiry about Google Voice, we announced that our engineers have developed a tailored solution for restricting calls to specific numbers engaged in what some have called high-cost "traffic pumping" schemes, like adult chat and "free" conference call lines.

We went to work on this fix because earlier this year, we noticed that an extremely high number of calls were being made to an extremely small number of destinations. In fact, the top 10 telephone prefixes -- the area code plus the first three digits of a seven-digit number, e.g., 555-555-XXXX -- generated more than 160 times the expected traffic volumes, and accounted for a whopping 26 percent of our monthly connection costs.
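
To illustrate how that kind of concentration shows up, here is a toy aggregation by prefix. The call records and per-call costs below are invented; this is not Google's actual analysis.

```python
from collections import Counter

# (dialed number, connection cost in dollars) -- invented sample data
calls = [
    ("555-555-0134", 0.05),
    ("555-555-0188", 0.05),
    ("555-321-0102", 0.04),
]

cost_by_prefix = Counter()
for number, cost in calls:
    prefix = number[:7]  # area code plus the first three digits, e.g. "555-555"
    cost_by_prefix[prefix] += cost

total_cost = sum(cost for _, cost in calls)
for prefix, cost in cost_by_prefix.most_common(10):
    print(f"{prefix}: {cost / total_cost:.0%} of connection costs")
```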

To prevent these schemes from exploiting the free nature of Google Voice -- making it harder for us to offer this new service to users -- we began restricting calls to certain telephone number prefixes. But over the past few weeks, we've been looking at ways to do this on a more granular level. We told the FCC today that Google Voice now restricts calls to fewer than 100 specific phone numbers, all of which we have good reason to believe are engaged in traffic pumping schemes.

While we've developed a fix to address this problem, the bottom line is that we still believe the Commission needs to repair our nation's broken carrier compensation system. The current system simply does not serve consumers well, and schemes like these underscore the pressing need for reform.

Posted:


This afternoon AT&T filed a letter with the Federal Communications Commission, alleging that Google Voice is preventing its users from making outbound calls to certain phone numbers with inflated access charges, and asking the Commission to intervene.

Here's the quick background: Local telephone carriers charge long-distance companies for originating and terminating calls to and from their networks. Certain local carriers in rural areas charge AT&T and other long-distance companies especially high rates to connect calls to their networks. Sometimes these local carriers partner and share revenue with adult chat services, conference calling centers, party lines, and others that are able to attract lots of incoming phone calls to their networks.

Under the common carrier laws, AT&T and other traditional phone companies are required to connect these calls. In the past they've argued that these rural carriers are abusing the system to "establish grossly excessive access charges under false pretenses," and to "offer kickbacks to operators of pornographic chat lines and other calling services." (This is a complicated issue, but these articles from USA Today and the Associated Press explain it well.)

We agree with AT&T that the current carrier compensation system is badly flawed, and that the single best answer is for the FCC to take the necessary steps to fix it.

So how does any of this relate to Google Voice?

Google Voice's goal is to provide consumers with free or low-cost access to as many advanced communications features as possible. In order to do this, Google Voice does restrict certain outbound calls from our Web platform to these high-priced destinations. But despite AT&T's efforts to blur the distinctions between Google Voice and traditional phone service, there are many significant differences:
  • Unlike traditional carriers, Google Voice is a free, Web-based software application, and so not subject to common carrier laws.
  • Google Voice is not intended to be a replacement for traditional phone service -- in fact, you need an existing landline or wireless phone line in order to use it. Importantly, users are still able to make outbound calls on any other phone device.
  • Google Voice is currently invitation-only, serving a limited number of users.
AT&T is trying to make this about Google's support for an open Internet, but the comparison just doesn't fly. The FCC's open Internet principles apply only to the behavior of broadband carriers -- not the creators of Web-based software applications. Even though the FCC does not have jurisdiction over how software applications function, AT&T apparently wants to use the regulatory process to undermine Web-based competition and innovation.

* Note: This blog post was updated at 4:55 PM ET to clarify the FCC's open Internet principles.

Posted:


(Editor's note: We're pleased to welcome Sascha Meinrath and Robb Topolski of the Open Technology Initiative (OTI) as guest bloggers. As a part of The New America Foundation, OTI works to support policy and regulatory measures that further open technologies and communications networks.)

Eight months ago, we joined a group of researchers in launching Measurement Lab (M-Lab), an open platform for researchers to deploy Internet measurement tools.

We created M-Lab in order to help measure the actual performance of broadband Internet connections. Is your connection as fast as advertised? Where are the bottlenecks that impact VoIP or video performance? Answers to these sorts of questions will help users to make informed decisions in the market, and help governments around the globe to craft sound broadband policy.

So, how's it doing?

To date, more than 150,000 Internet users from around the world have used M-Lab to test the performance of their broadband connection and share information with researchers.

Now M-Lab is hitting the Mediterranean. We're thrilled to announce that the EETT -- Greece's telecommunications regulator -- and the Greek Research and Technology Network (GRnet) have contributed servers and connectivity for a new M-Lab node in Athens, Greece, and will collaborate with M-Lab to help improve the usability of the platform's tools.

EETT has already been working to provide useful information about broadband networks to consumers through its central Web portal. EETT plans to incorporate data collected through M-Lab into this portal, so that users will be able to compare the performance of broadband providers and of their own Internet connections across several dimensions.

In addition to EETT and GRnet, Voxel also has joined as an M-Lab partner, providing server nodes and connectivity in New York City and Amsterdam. Since launch, we've added many new servers, for a total of 38 across the U.S. and Europe.

We've also added two new tools, PathLoad2 and ShaperProbe. PathLoad2 allows users to test their available bandwidth (the maximum bit rate you can send to a network link before it gets congested), and ShaperProbe detects whether the ISP reduces the speed of a download or an upload after it begins.
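
As a rough illustration of the kind of pattern ShaperProbe looks for (a simplified heuristic of our own, not the tool's actual algorithm), a shaped connection typically shows a fast initial burst followed by a lower sustained rate:

```python
def looks_shaped(throughput_kbps, burst_seconds=5, drop_ratio=0.7):
    """Flag a run whose steady-state rate falls well below the initial burst."""
    burst, steady = throughput_kbps[:burst_seconds], throughput_kbps[burst_seconds:]
    if not burst or not steady:
        return False
    return sum(steady) / len(steady) < drop_ratio * (sum(burst) / len(burst))

# Made-up per-second samples: ~9.7 Mbps burst, then ~4 Mbps sustained.
samples = [9800, 9700, 9900, 9600, 9750, 4100, 4000, 3900, 4050, 3950]
print(looks_shaped(samples))  # True
```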

We're happy with M-Lab's successful start, but it's only the beginning. The platform and its tools are still very much in beta, and we're continuing to improve them.

In the coming months we're aiming to make the collected data publicly available and accessible, improve the user experience and stability of our tools, and expand the availability of the site globally. Stay tuned, and in the meantime we hope you'll run an M-Lab test on your own broadband connection.

Posted:


Earlier this month Google and the New America Foundation announced a special Google Moderator page where users can submit and vote on ideas for how to make high-speed Internet access more available and affordable in the United States. So far we've been overwhelmed with the creativity and enthusiasm of the responses.

In just under two weeks, more than 2,100 people from around the world have submitted more than 530 ideas and cast more than 45,300 votes. Here's a quick taste of what people are saying:

"Whatever method or process is decided for deploying broadband, it needs to support down and upstream capacities that prepare for the future, not simply serve the needs of right now."
Justin M, Turlock, CA

"We seem to have gotten a road and an electric wire to almost every building in the US, no matter how remote. Any plan that has a lesser goal than eventually attaching every building to a high-speed computer network is too small, IMHO."
Rollie, Indianapolis, IN

"This country can have free WiFi nationwide by constructing WiFi towers that are also windmills. Excess energy can be sold, providing free WiFi and helping this country with its enormous energy needs, while also providing free WiFi to the masses."
Roadrunner, Earth

But what do you think?

If you haven't already weighed in, today is your last day to submit your ideas to Google Moderator. Voting will close tonight at midnight (Eastern Time).

Over the next several days we'll be studying each and every submission, and next week we'll submit the results from Google Moderator to the official FCC record on your behalf. Stay tuned to this blog for the latest information on this and other projects designed to help bring more broadband to more Americans.

Posted:


(Cross-posted from the Official Google Blog)

Have an idea for how to expand high-speed Internet access across the United States? Here's your chance to have your voice heard.

Under the terms of the recent economic stimulus package, the Federal Communications Commission must deliver to Congress a National Broadband Plan by February 2010. Several weeks ago, we laid out Google's vision for how to make broadband Internet available and affordable for every American — and hundreds of others have already submitted comments of their own.

The FCC has called for "maximum civic engagement" in developing a broadband strategy, and we're hoping to help them achieve just that.

We've teamed up with the New America Foundation to launch a Google Moderator page where you can submit and vote on ideas for what you think the Commission should include in its National Broadband Plan. Two weeks from now we'll take the most popular and most innovative ideas and submit them to the official record at the FCC on your behalf.

Google and the New America Foundation agree that public participation in this process is critical. Expanding access to broadband has the potential to transform communities across the country, spark economic growth, and restore American competitiveness. Now that the Commission has officially opened this proceeding, and with a new Chairman at the helm, we think it's time to give people the opportunity to learn about the issue and to weigh in with their thoughts. And as the process continues to unfold at the FCC, we'll keep you informed of additional ways to share your views and voice your ideas to the agency.

So do you have any good ideas? Submit them today on Google Moderator — and you just might help change the face of broadband in the United States.

Posted:


Open, ubiquitous broadband connectivity holds the promise to catapult America to the next level of competitiveness, productivity, education, health, and security -- but how do we get there from here?

The American Recovery and Reinvestment Act (ARRA) of 2009 requires the Federal Communications Commission (FCC) to deliver to Congress a National Broadband Plan by February 2010. This represents a golden opportunity for policymakers and all Americans to take a hard look at the current state of broadband deployment and uptake, and begin laying the groundwork for a communications infrastructure truly capable of meeting the demands of the 21st century. Today Google submitted to the FCC our initial thoughts for how we might do just that.

As part of a comprehensive broadband policy framework, we believe that our government should adopt a bold yet achievable goal for making high-speed Internet capabilities available to each and every American. Our comments call for all American households to have access, by 2012, to broadband with at least 5 Mbps upload and download speeds. We believe that a 5 Mbps benchmark is an ambitious yet attainable first step, and that even more challenging benchmarks with much higher capacity levels may well be necessary over the course of the next decade. If this benchmark is accomplished -- so that today's unserved or underserved consumers become tomorrow's broadband customers -- we will have truly become an always-on nation.

In addition to laying out a suggested public policy framework, our comments also describe four concrete proposals that we believe would help advance this vision:
  • Install broadband fiber as part of every federally-funded infrastructure project. By some estimates nearly 90 percent of the cost of deploying fiber is associated with construction costs like tearing up and repairing roads. The National Broadband Plan should require the installation of broadband fiber as part of all new federally-funded infrastructure projects. Laying fiber -- or even simply installing the conduit for later fiber deployment, as Rep. Anna Eshoo has suggested -- during the construction or repair of roads and other public works projects will dramatically reduce deployment costs. And it's just good common sense.
  • Deploy broadband fiber to every library, school, community health care center, and public housing facility in the United States. Low-income Americans are increasingly left out of the digital revolution. The National Broadband Plan should call for the deployment of high-speed fiber connections to every library, school, community health care center, and public housing facility in the country. This would create community hub centers nationwide, providing access to underserved populations and potentially acting as a springboard for more widespread broadband adoption in these communities.
  • Create incentives for providers to install multiple lines of fiber as new networks are rolled out. The Commission should offer incentives to providers wishing to build new network infrastructure to lay cable containing multiple fibers. These unused fibers could in turn be leased or sold to other network operators, increasing competition along with deployment.
  • Encourage greater wireless broadband and reduce barriers to deployment. Last November, the FCC paved the way for "white spaces" spectrum to be used to deliver better and faster wireless broadband connections to American consumers. The Commission should encourage use of unlicensed devices in "white spaces" spectrum by eliminating unnecessary requirements and easing interference standards in rural areas where no actual harmful interference would occur.
Our comments also note that using broadband as an optimal Internet platform will require considerable focus and substantial resources, both private and public. In short, there is no "silver bullet" solution. Instead, some projects will depend on market forces and companies investing private capital to construct new infrastructure (like Verizon's FiOS platform), while others will require direct government involvement through subsidies or regulatory mandates. Still others will require a mix of public and private involvement.

In developing a National Broadband Plan, the FCC has the opportunity to embark on a fresh course to ensure our nation's digital infrastructure fully meets our 21st century opportunities and challenges.

Posted:


By many measures, much of the United States continues to lag behind other developed countries in terms of broadband penetration and speed -- but it's not for a lack of good ideas. Take Don Means' "Fiber to the Library" (FTTL) proposal, which would equip every one of our nation's 16,548 public libraries with a 100+ Mbps Internet connection. This morning I was fortunate enough to hear Don discuss details of these plans at a forum sponsored by ITIF.

For centuries, libraries have provided a tremendous public service, allowing Americans to access useful information in their free and open facilities. What better way to continue and expand that mission in the 21st century than to provide every library in the United States with a high-speed fiber connection to the Web?

Deploying FTTL is a bold yet achievable concept that promises a number of tangible benefits. It would deliver high-performance Internet applications to communities across the country quickly and equitably, serving pre-schoolers and senior citizens alike -- not to mention millions of folks who don't have or can't afford Internet access at home. Libraries with fiber connections could also be transformed into virtual technology hubs, offering multiple ways for people to interact with new forms of IT services. Fiber-equipped libraries can even become their own communications nodes, from which any number of providers could further expand high-performance broadband infrastructure into surrounding neighborhoods now lacking such access.

There are several possible ways to help make FTTL a reality. For example, the American Recovery and Reinvestment Act (ARRA) sets aside $7.2 billion to improve the nation's broadband infrastructure, with no less than $200 million explicitly allocated to expand capacity for computer centers at public libraries and other community-based institutions. Don and his partners estimate it would cost on average only about $20,000 to wire each of our public libraries with fiber connectivity. As a start, policymakers should resolve to distribute ARRA funds to local anchor institutions like libraries that will use emerging broadband technologies in ways that most directly benefit our nation's communities.
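
Putting those figures together, here is a rough, back-of-envelope calculation of the proposal's scale, using only the numbers cited in this post:

```python
libraries = 16_548
cost_per_library = 20_000             # Don Means' average estimate, in dollars
arra_broadband_funds = 7_200_000_000  # total ARRA broadband allocation

total = libraries * cost_per_library
print(f"Estimated FTTL cost: ${total:,}")                                     # ~$331 million
print(f"Share of ARRA broadband funds: {total / arra_broadband_funds:.1%}")   # ~4.6%
```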

Posted:


When you're walking around town chatting on your cellphone, or sitting in a cafe surfing the Web over Wi-Fi, do you ever wonder how wireless signals travel through the airwaves around you? Most of us probably don't give it much thought -- and yet use of these airwaves is precisely what makes many of our modern communications systems possible.

Radio spectrum is a natural resource, something that here in the U.S. is owned by all of us as American citizens. But which entities are operating in our nation's public airwaves, and where? Are these resources actually being used efficiently and effectively, or is a sizable portion of useful spectrum simply lying fallow?

We cannot conclusively answer these critical questions today, because our government has not taken and published a full inventory of spectrum ownership and use in the United States. Senators John Kerry (D-MA) and Olympia Snowe (R-ME) have introduced a bill in Congress that seeks to do just that. The Radio Spectrum Inventory Act calls on the Federal Communications Commission (FCC) and the National Telecommunications and Information Administration (NTIA) to take a full inventory of our nation's spectrum resources between 300 MHz and 3.5 GHz.

The Kerry/Snowe effort to take full stock of our nation's airwaves is a positive development. Often lost in the debate over how best to put our spectrum to use is the fact that these airwaves belong to the American public, not to any corporation or other entity. But without a clear idea of exactly whether and how these airwaves are being used, it is difficult to have an informed conversation about the best way to allocate and use spectrum efficiently for the needs of the American people.

In the past decade, Wi-Fi and other innovative uses of our public airwaves have revolutionized wireless communications and triggered great economic and technological growth. Last year's white spaces decision paved the way for better and faster broadband Internet connections. More efficient use of spectrum holds potential for even greater gains. Developing and publishing a detailed inventory of our nation's airwaves would be the first step towards achieving this critically important goal.

Posted:


Should Internet access providers be allowed to block or degrade lawful applications or content? The Canadian Radio-television and Telecommunications Commission (CRTC) is asking that question right now as it studies Internet traffic management practices.

Google and a coalition of technology companies and public interest groups have weighed in on this proceeding, urging the CRTC to prohibit application- or content-based "throttling." Google's submission was one of many formal submissions made in this proceeding.

Now it's your turn to weigh in.

Yesterday the CRTC launched an online public consultation, open until 30 April, to solicit more public input in this proceeding.

Log in to the online consultation and have your voice heard on the future of the open Internet in Canada.

Posted:


Just as a unique number is associated with your telephone line, your computer is assigned an Internet Protocol (IP) address when you connect to the Internet. Unfortunately, under the current Internet protocol, IPv4, the Internet is projected to run out of IP addresses in 2011. While technologies such as Network Address Translation (NAT) can provide temporary workarounds, they undermine the Internet's open architecture and "innovation without permission" ethos, allowing network intermediaries to exert undue control over new applications.

Effective adoption of the next-generation protocol -- IPv6 -- will provide a real, sustainable solution. By vastly expanding the number of available IP addresses -- enough to give every person on the planet three billion addresses -- IPv6 will clear the way for the next generation of VoIP, video conferencing, mobile applications, "smart" appliances (Internet-enabled heating systems, cars, refrigerators, and other devices), and other novel applications.

In a report prepared for the National Institute of Standards & Technology in 2005, RTI International estimated that the annual benefits of IPv6 deployment would exceed $10 billion.

Unfortunately, IPv6 presents a classic chicken-and-egg problem. The benefits of any one network operator, device vendor, application and content provider, or Internet user adopting IPv6 are limited if there is not a critical mass of other adopters. As a result, adoption lags.

The best way to kickstart IPv6 support is to adopt it, and governments are uniquely positioned here. Governments can take advantage of their roles as network operator, content provider, and consumer of Internet services to spur rapid, effective adoption of IPv6. Governments are owners of large IP-based networks, and they can transition both their externally- and internally-facing services to IPv6. They can also choose to purchase Internet services only from entities that commit to deploying native IPv6. Finally, governments can consider subsidizing or otherwise financially supporting IPv6, such as by conditioning funding for broadband deployment on the use of IPv6 and by funding research on innovative IPv6-based applications.

The private sector also has a critical role to play, of course. Here at Google we're hosting a conference this week to support IPv6 implementation. We began offering Web Search over IPv6 on ipv6.google.com in March 2008, and we recently announced our Google over IPv6 initiative, which provides users seamless access to most Google services over IPv6 simply by going to websites like www.google.com. At this week's conference, participants will share IPv6 implementation experience, advice, and associated research, and hopefully take one more step towards sustaining a healthy, open Internet.
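
If you're curious whether a given hostname is reachable over IPv6, a quick check is to look for its IPv6 (AAAA) records. This small standard-library sketch is our own example, not part of the Google over IPv6 program; it queries ipv6.google.com, mentioned above.

```python
import socket

def ipv6_addresses(host):
    """Return the IPv6 addresses advertised for a hostname, if any."""
    try:
        infos = socket.getaddrinfo(host, 80, socket.AF_INET6, socket.SOCK_STREAM)
    except socket.gaierror:
        return []
    return sorted({info[4][0] for info in infos})

print(ipv6_addresses("ipv6.google.com"))
```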

Posted:


For one of the two broadband deployment programs created by last month's stimulus package, the legislation states that "Priority for awarding such funds shall be given to project applications for broadband systems that will deliver end users a choice of more than one service provider." Telecom wonks call this "open access" -- while one entity builds and owns the physical network infrastructure, other competing companies are allowed to use the infrastructure to offer Internet access and other services to consumers.

In Europe and elsewhere in the world, regulations that require incumbent telecom companies to operate on an open access basis are quite common. By enabling more competition, open access can enhance consumer choice, lower prices, and ultimately drive infrastructure improvements. Open access can also catalyze innovation, because competing providers can develop new broadband data services. For instance, Stockholm's Stokab network is used to provide not only Internet access, but also telemedicine, e-learning, and a multiplicity of other services (link via Tim Poulus).

Regardless of the public policy rationale, are there reasons why infrastructure providers should embrace the open network model? Some certainly think so. British Telecom, for instance, restructured itself in early 2006 to operate its infrastructure on an open access basis, and Swisscom is building a super high-speed fiber-to-the-home network that will allow multiple competitors to serve each household. The CEO of Dutch telecom company KPN recently stated, "In hindsight, KPN made a mistake back in 1996. We were not too enthusiastic to be forced to allow competitors on our old wireline network. That turned out not to be very wise. If you allow all your competitors on your network, all services will run on your network, and that results in the lowest cost possible per service. Which in turn attracts more customers for those services, so your network grows much faster. An open network is not charity from us, in the long run it simply works best for everybody."

If you want an in-depth discussion of how open access can make good business sense, check out this insightful presentation from Yankee Group analyst Benoit Felten (the first part is embedded below). Felten runs a tremendous telecom blog called Fiberevolution and his thoughts on open access are summarized here.

Posted:


We're trying out a new feature on this blog -- video interviews with folks from Google's public policy team.

With a deal apparently struck today on the economic stimulus bill before Congress, I spoke this afternoon with Rick Whitt, who handles telecom policy issues for Google, about the bill's provisions to expand broadband deployment. Check it out:

Posted:


Since November's big vote at the FCC, some have begun asking when we'll start seeing consumer mobile devices take advantage of TV white spaces spectrum.

As the Commission made clear in its ruling, a working white spaces database must be deployed in order for consumer devices to become available in the market. Before sending or receiving data, devices will be required to query this database to determine which channels are available in their vicinity. Combined with spectrum sensing technologies, use of a geo-location database will offer licensed signals complete protection from harmful interference.
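
Conceptually, the database check works like the sketch below: a device reports its location and receives the list of channels it may use there. The endpoint, parameters, and response format here are hypothetical, since no such database or API exists yet.

```python
import json
import urllib.request

def available_channels(lat, lon, db_url="https://example.org/whitespaces/lookup"):
    """Ask a (hypothetical) geo-location database which TV channels are free here."""
    with urllib.request.urlopen(f"{db_url}?lat={lat}&lon={lon}") as resp:
        return json.load(resp)["available_channels"]

# A compliant device would transmit only on channels returned by this lookup,
# and would also apply spectrum sensing before use.
```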

With this mandate in mind, this morning we joined Comsearch, Dell, HP, Microsoft, Motorola, and Neustar to launch the White Spaces Database Group.

In the coming weeks and months, members of the group will be offering to the Commission their perspectives, and some specific recommendations, about the technical requirements we would like to see adopted for the database. Many of these specifications ultimately will be heavily technical; put simply, we'll advocate for data formats and protocols that are open and non-proprietary, with database administration that is also open and non-exclusive.

We don't plan to become a database administrator ourselves, but do want to work with the FCC to make sure that a white spaces database gets up and running. We hope that this will unfold in a matter of months, not years.

Stay tuned to this blog for further updates on the group's work.

Posted:


One of the first posts I wrote for this blog last summer tried to define what we at Google mean when we talk about the concept of net neutrality.

Broadband providers -- the on-ramps to the Internet -- should not be allowed to prioritize traffic based on the source, ownership or destination of the content. As I noted in that post, broadband providers should have the flexibility to employ network upgrades, such as edge caching. However, they shouldn't be able to leverage their unilateral control over consumers' broadband connections to hamper user choice, competition, and innovation. Our commitment to that principle of net neutrality remains as strong as ever.

Some critics have questioned whether improving Web performance through edge caching -- temporary storage of frequently accessed data on servers that are located close to end users -- violates the concept of network neutrality. As I said last summer, this myth -- which unfortunately underlies a confused story in Monday's Wall Street Journal -- is based on a misunderstanding of the way in which the open Internet works.

Edge caching is a common practice used by ISPs and application and content providers in order to improve the end user experience. Companies like Akamai, Limelight, and Amazon's Cloudfront provide local caching services, and broadband providers typically utilize caching as part of what are known as content distribution networks (CDNs). Google and many other Internet companies also deploy servers of their own around the world.

By bringing YouTube videos and other content physically closer to end users, site operators can improve page load times for videos and Web pages. In addition, these solutions help broadband providers by minimizing the need to send traffic outside of their networks and reducing congestion on the Internet's backbones. In fact, caching represents one type of innovative network practice encouraged by the open Internet.
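
To make the mechanics concrete, here is a toy model of an edge cache (our own illustration, not Google's actual caching software): repeat requests are served locally instead of crossing the backbone again.

```python
cache = {}  # content URL -> bytes held at the edge node

def fetch(url, origin_fetch):
    """Serve from the edge cache when possible; otherwise fetch from origin and store."""
    if url in cache:
        return cache[url], "served from edge cache (no backbone traffic)"
    data = origin_fetch(url)
    cache[url] = data
    return data, "fetched from origin (crosses the backbone)"

origin = lambda u: b"...video bytes..."
print(fetch("https://example.com/video", origin)[1])  # origin on the first request
print(fetch("https://example.com/video", origin)[1])  # edge cache on the repeat
```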

Google has offered to "colocate" caching servers within broadband providers' own facilities; this reduces the provider's bandwidth costs since the same video wouldn't have to be transmitted multiple times. We've always said that broadband providers can engage in activities like colocation and caching, so long as they do so on a non-discriminatory basis.

All of Google's colocation agreements with ISPs -- which we've done through projects called OpenEdge and Google Global Cache -- are non-exclusive, meaning any other entity could employ similar arrangements. Also, none of them require (or encourage) that Google traffic be treated with higher priority than other traffic. In contrast, if broadband providers were to leverage their unilateral control over consumers' connections and offer colocation or caching services in an anti-competitive fashion, that would threaten the open Internet and the innovation it enables.

Despite the hyperbolic tone and confused claims in Monday's Journal story, I want to be perfectly clear about one thing: Google remains strongly committed to the principle of net neutrality, and we will continue to work with policymakers in the years ahead to keep the Internet free and open.

P.S.: The Journal story also quoted me as characterizing President-elect Obama's net neutrality policies as "much less specific than they were before." For what it's worth, I don't recall making such a comment, and it seems especially odd given that President-elect Obama's supportive stance on network neutrality hasn't changed at all.

Update: Larry Lessig, Save the Internet, Public Knowledge, David Isenberg, Wired and others all found fault with today's piece too.