Posted:


(Cross-posted to the Official Google Blog)

Here at Google, we see the upcoming 700 megahertz spectrum auction at the Federal Communications Commission as one of the best opportunities consumers will have to enjoy more choices in the world of wireless devices. That's why we announced today that we are applying to participate in the auction.

We already know that regardless of which bidders ultimately win the auction, consumers will be the real winners. This is because the eventual winner of a key portion of this spectrum will be required to give its customers the right to download any application they want on their mobile device, and the right to use any device they want on the network (assuming the C Block reserve price of $4.6 billion is met in the auction). That's meaningful progress in our ongoing efforts to help transform the relatively closed wireless world into something more like the open realm of the Internet.

Regardless of how the auction unfolds, we think it's important to put our money where our principles are. Consumers deserve more choices and more competition than they have in the wireless world today. And at a time when so many Americans don't have access to the Internet, this auction provides an unprecedented opportunity to bring the riches of the Net to more people.

While we've written a lot on our blogs and spoken publicly about our plans for the auction, unfortunately you're not going to hear from us about this topic for a while, and we want to explain why.

Monday, December 3, is the deadline for prospective bidders to apply with the FCC to participate in the auction. Though the auction itself won't start until January 24, 2008, Monday also marks the starting point for the FCC's anti-collusion rules, which prevent participants in the auction from discussing their bidding strategy with each other.

These rules are designed to keep the auction process fair, by keeping bidders from cooperating in anticompetitive ways so as to drive the auction prices in artificial directions. While these rules primarily affect private communications among prospective bidders, the FCC historically has included all forms of public communications in its interpretation of these rules.

All of this means that, as much as we would like to offer a step-by-step account of what's happening in the auction, the FCC's rules prevent us from doing so until the auction ends early next year. So here's a quick primer on how things will unfold:
  • December 3: By Monday, would-be applicants must file their applications to participate in the auction (FCC Form 175), which remain confidential until the FCC makes them available.

  • Mid-December: Once all the applications have been fully reviewed, the FCC will release a public list of eligible bidders in the auction. Each bidder must then make a monetary deposit no later than December 28, depending on which licenses they plan to bid on. The more spectrum blocks an applicant is deemed eligible to bid on, the greater the amount they must deposit.

  • January 24, 2008: The auction begins, with each bidder using an electronic bidding process. Since this auction is anonymous (a rule that we think makes the auction more competitive and therefore better for consumers), the FCC will not publicly identify which parties have made which bid until after the auction is over.

  • Bidding rounds: The auction bidding occurs in stages established by the FCC, with the likely number of rounds per day increasing as bidding activity decreases. The FCC announces results at the end of each round, including the highest bid at that point, the minimum acceptable bid for the following round, and the amounts of all bids placed during the round. The FCC does not disclose bidders' names, and bidders are not allowed to disclose publicly whether they are still in the running or not.

  • Auction end: The auction will end when there are no new bids and all the spectrum blocks have been sold (many experts believe this auction could last until March 2008). If the reserve price of any spectrum block is not met, the FCC will conduct a re-auction of that block. Following the end of the auction, the FCC announces which bidders have secured licenses to which pieces of spectrum and requires winning bidders to submit the balance of the payments for the licenses.

If you're interested in keeping track of the publicly available details of the auction, check out this page on the FCC's website or Google News. In the meantime, my lips will be sealed (something, frankly, that I'm not used to).

Posted:


Last July, the Democratic presidential candidates took part in the first CNN/YouTube presidential debate. Tonight it's the Republicans' turn to answer questions from individuals around the country.

Just like last time, all of the questions to the candidates will come from YouTube users, who've submitted video questions over the past few months. And with a little more than a month before the first votes are cast in the Iowa caucuses, the debate promises fireworks as each of the candidates tries to set himself apart in one of the most competitive and least settled primary races in the Republican Party's history.

Nearly 5,000 questions have been submitted. Here's a sampling:



Tune in to CNN tonight at 8pm ET/5pm PT to see which questions are posed to the candidates.

Posted:


Barack Obama added another "first" to his already notable list yesterday: he became the first U.S. presidential candidate -- and, I'm guessing, the first high-level elected official in any country -- to have a ready answer to a standard Google engineering interview question. Asked by Eric Schmidt about "the most efficient way to sort a million 32-bit integers," Sen. Obama replied that "the bubble sort would be the wrong way to go." Though some might view this as shameless pandering to the bucket-sorting community, others will see a bold pragmatism.
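(For the curious, here's a minimal sketch of the point behind the joke, not the "official" interview answer: bubble sort makes on the order of n² comparisons, while a radix sort handles a million 32-bit integers in a handful of linear passes. The code below is purely illustrative.)

```python
import random
import time

def bubble_sort(a):
    """O(n^2) comparison sort -- "the wrong way to go" for a million integers."""
    a = list(a)
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def radix_sort_u32(a, bits=8):
    """LSD radix sort for 32-bit unsigned integers: four linear passes of 8 bits each."""
    mask = (1 << bits) - 1
    for shift in range(0, 32, bits):
        buckets = [[] for _ in range(1 << bits)]
        for x in a:
            buckets[(x >> shift) & mask].append(x)
        a = [x for bucket in buckets for x in bucket]
    return a

if __name__ == "__main__":
    data = [random.getrandbits(32) for _ in range(1_000_000)]
    start = time.time()
    result = radix_sort_u32(data)
    print(f"radix sort: {time.time() - start:.1f}s, correct: {result == sorted(data)}")
    # bubble_sort(data) would take hours on a million elements -- hence the Senator's answer.
```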

Following Ron Paul, Hillary Clinton, John McCain, Bill Richardson, John Edwards, and Mike Gravel, Obama became the seventh presidential candidate to visit Google's main campus in Mountain View. Obama got a warm reception from an overflow crowd at Charlie's Cafe, with hundreds of employees watching via live webcast from forty remote locations. Looking out over the sea of t-shirts, Sen. Obama paid tribute to Silicon Valley style: "It's good to see Google is maintaining its strict dress code."

After a screening of his Monday Night Football clip and an introduction by Google's Senior VP David Drummond, Obama unveiled his new policy agenda on technology and innovation. He reaffirmed his support for network neutrality, saying:

The Internet is perhaps the most open network in history. We have to keep it that way.

Obama laid out a detailed package of technology policies designed to strengthen online privacy, increase government openness and transparency, put high-speed broadband within reach of all Americans, improve the delivery of government services, drive America's competitiveness, reform our abuse-prone patent system, and free up wireless spectrum for new connectivity and public safety.

As part of his plan, Sen. Obama said he would use the Internet to give citizens better visibility into, and greater participation in, the workings of their government:

I’ll put government data online in universally accessible formats. I’ll let citizens track federal grants, contracts, earmarks, and lobbyist contacts. I’ll let you participate in government forums, ask questions in real time, offer suggestions that will be reviewed before decisions are made, and let you comment on legislation before it is signed. And to ensure that every government agency is meeting 21st century standards, I’ll appoint the nation’s first Chief Technology Officer.

After Obama finished his speech, Eric Schmidt joined him on stage for a "fireside chat" (except without the crackling fire). After a particularly open-ended first question ("What is it that you're going to do that's exceptional?"), Obama looked out and asked, "Is this the kind of interview that you guys went through?" (The answer is "yes," except we went through eight of them, and they focused more on how to sort 32-bit integers and less on how to counter the threat of global terrorism).

During the discussion, Obama made the case for his ability to bring Americans together, take on special interests, and bring new credibility to foreign relations. In about thirty minutes he covered a lot of ground: Iraq, Guantanamo, international relations and diplomacy, globalization, education, health care, college loans, Social Security, and race. Googler Ethan Beard asked Obama about fears that he lacks experience; he started his response by noting that Google founders "Larry and Sergey didn't have a lot of experience starting a Fortune 100 company."

The final question of the day was about political reform -- how to fix a broken system of politics and government? Sen. Obama observed that the more people know, the more lawmakers and officials can be held accountable. He talked about his "Google for Government" bill, now law, to create a searchable database for every dollar of federal spending. He said, "If you give people good information, they will make good decisions."

Here's the complete video of Senator Obama's fireside chat:


Senator Obama also sat for an interview with YouTube's Steve Grove, with the questions posed by the YouTube community:

Posted:


We're gratified that Google’s recent call for global privacy standards has sparked a healthy debate. Nearly everyone agrees that factors such as globalisation, the growing recognition of privacy rights, and technological developments have heightened the urgency of global privacy protection.

However, our support for the emergence of the APEC Privacy Framework has generated some criticism, which I'd like to address. The APEC Privacy Framework was inspired by the OECD Guidelines on the Protection of Privacy and is concerned with ensuring consistent and practical privacy protection across a wide range of economic and political perspectives.

At the core of the APEC framework is an entirely new privacy protection principle that does not exist in the regulatory frameworks of the 80s and the 90s: the “preventing harm” principle. The starting point is that personal information protection should be designed to prevent the misuse of that information. Since the greatest risk of that misuse is harm to individuals, we need a set of rules that seek to prevent that harm.

Using the reasoning of the APEC framework, global privacy standards should take account of the risks derived from the wrongful collection and misuse of people’s personal information and be aimed at preventing the harm resulting from those risks. Under the “preventing harm” principle, any remedial measures should be proportionate to the likelihood and severity of the harm. Some critics have said that the APEC framework is ambiguous and that the “preventing harm” principle does not look at privacy protection from the point of view of the individual. However, the focus of the “preventing harm” principle is precisely the individual and what is perceived as harmful by that individual.

Others see the APEC framework as the weakest international framework in this area and support the original OECD Privacy Guidelines because they are based on a simple approach to privacy protection. But is this approach a valid one to address the challenges of the Internet age? In today’s world, virtually every organisation – public or private, large or small, offline or online – relies on the collection and use of personal information for core operational purposes.

At the same time, regulators around the world are acknowledging the fact that they have limited resources to deal with all aspects of personal information protection. And three-quarters of the countries in the world still don't have meaningful privacy regimes in place. We believe that the APEC framework is the most promising foundation to advance privacy protections in those countries. What is wrong then with looking at this very practical challenge in a practical manner and trying to prioritise what really matters to people in an objective, yet flexible, way?

Fortunately, some regulators are also looking at the “preventing harm” principle as a valid way forward. The UK Information Commissioner recently published its data protection strategy which emphasises the need to make judgments about the seriousness of the risks of individual and societal harm, and about the likelihood of those risks materialising. The strategy document goes on to say that the UK regulator’s actions will give priority to tackling situations where there is a real likelihood of serious harm.

Using this approach, the key issue for policymakers and regulators is to figure out what is (or can be) harmful and what isn’t. Sure, identity theft and spam are bad. But is targeted advertising harmful or beneficial for consumers? What about the use of cookies to remember consumers’ preferences or computer settings? Do they make life easier or are they a harmful consequence of our online activities?


The truth is that the newest generation of Internet users are in the best position to know what is good and what is bad -- what amounts to 21st century online interaction and what is a potentially harmful intrusion into their private lives. Their perception of what is justified and what is not should be a determining factor in the protection of their personal information so that the “preventing harm” principle is not seen as a weakness, but as an objective yardstick of how to protect people’s privacy.

Posted:


Let's say you're looking for some publicly available government information online. Maybe you're searching for property records or background on your local school district. Chances are, you'll start your quest not by typing in the URL of a government agency website, but by visiting Google or another search engine. Unfortunately, that may not produce the results you're looking for. In fact, much of the content that government agencies make available on the web (about half, by our estimates) doesn't appear in search results because of the way many government websites are structured.

Google has been working to make publicly available government information more accessible to the public. We're doing so by helping government agencies implement the Sitemap Protocol, a technical standard that makes it easier for search engines to crawl and index pages on a website. Tomorrow, a Senate committee will take another important step toward addressing this problem.
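By way of illustration, a Sitemap is simply an XML file, in the format defined at sitemaps.org, listing the pages an agency wants crawled. The sketch below shows how such a file might be generated with a short script; the agency URLs and file name are made up, not drawn from any real deployment.

```python
from datetime import date
from xml.sax.saxutils import escape

# Hypothetical list of public pages an agency wants search engines to index.
PAGES = [
    ("https://records.example.gov/property/search", date(2007, 11, 1)),
    ("https://records.example.gov/schools/districts", date(2007, 10, 15)),
]

def build_sitemap(pages):
    """Emit a Sitemap in the sitemaps.org XML format (urlset / url / loc / lastmod)."""
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ]
    for url, lastmod in pages:
        lines.append("  <url>")
        lines.append(f"    <loc>{escape(url)}</loc>")
        lines.append(f"    <lastmod>{lastmod.isoformat()}</lastmod>")
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

if __name__ == "__main__":
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write(build_sitemap(PAGES))
    # The file is then referenced from the site's robots.txt
    # ("Sitemap: https://records.example.gov/sitemap.xml") or submitted to search engines directly.
```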

The Senate Homeland Security and Government Affairs Committee will consider S. 2321, which extends and updates the E-Government Act of 2002. Part of the bill directs the Office of Management and Budget to create guidance and best practices for federal agencies to make their websites more accessible to search engine crawlers, and thus to citizens who rely on search engines to access information provided by their government. It also requires federal agencies to ensure their compliance with that guidance and directs OMB to report annually to Congress on agencies’ progress.

Implementing Sitemaps is an easy way for government agencies to make their online information and services more visible and accessible to the citizens they serve. We've already worked with states like Arizona, California, and Virginia, and with federal agencies in the Departments of Agriculture, Energy, and Health and Human Services. We've also supported the sitemapping of large databases by the Library of Congress and the National Archives and Records Administration.

We welcome this Senate legislation and encourage governments at all levels to participate in this effort to become more transparent and accessible to citizens.

Posted:

Fascinated by the twists and turns of the upcoming FCC spectrum auction? Can't get enough of the Digital Millennium Copyright Act? Passionate about online freedom of expression issues? If you're an undergraduate, graduate, or law student interested in the world of tech policy, or know someone who is, keep reading.

We’re excited to announce the launch of the Google Policy Fellowship program, our effort to replicate the success of our Summer of Code program in the public policy sphere and to support students and organizations doing work important to the future of Internet users everywhere.

Those selected as fellows will receive a stipend to spend ten weeks contributing to the public debate on technology policy issues -- ranging from broadband policy to copyright reform to open government. Participating organizations for our beta summer of 2008 include the American Library Association, Cato Institute, Center for Democracy and Technology, Competitive Enterprise Institute, Electronic Frontier Foundation, Internet Education Foundation, Media Access Project, New America Foundation, and Public Knowledge.

Check out more details and the application, which is due by January 1, 2008. And please help us spread the word!

Posted:


Since this blog was officially born back in June, we've seen a great response from both Google users and policymakers around the world. Now we're giving you another way to keep track of the latest posts that appear here.

Over in the right-hand column, under "Get Blog Posts by E-mail," you can sign up to, well, get new blog posts sent to you via e-mail (neat the way that works, huh?). Once you sign up, you'll receive each new blog post in your inbox minutes after it's posted to the blog.

We hope you enjoy this new feature.

Posted:


While we know that the Internet allows people and organizations to operate much more efficiently, the reality is that personal computers, servers and data centers use too much energy. Right now, the average desktop computer is only 50% energy efficient and most servers waste 30% of the energy they use. Typical industry data centers also waste huge amounts of energy on cooling and backup power.

As we at Google looked at how to cut the amount of energy we consume, it became clear that the problem is largely not technological (it's currently possible to make more efficient computers). The problem is mostly the lack of a market for high-efficiency equipment. Manufacturers would make more efficient equipment if they could be sure that enough people would pay the slightly higher cost (somewhere around $20-$30 extra per personal computer).

So, in a twist on the famous "Field of Dreams" line, Google and Intel led an effort to build a market for high-efficiency computing equipment, so that manufacturers would come. We created the Climate Savers Computing Initiative (CSCI) earlier this year, and more than 100 major corporations, environmental groups, universities and other large IT purchasers have joined the initiative and agreed to buy Energy Star 4.0 rated equipment. Participants have also agreed to employ better power management methods to reduce energy usage of existing computing systems. By 2010, we hope this effort will lead to a 50% reduction in power consumption for member organizations.

After working for years in state government, I know that governors around the country are aggressively looking for ways to reduce energy consumption and explore new solutions on climate change. Earlier this week, the co-chairs of the National Governors Association's energy task force, Governors Tim Pawlenty of Minnesota and Kathleen Sebelius of Kansas, agreed not only to have their states sign on to the initiative, but also to recruit other states to join. Minnesota and Kansas state governments buy over 8,000 computers a year. Imagine the impact this program can have if we get all 50 states to join.
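To make "imagine the impact" a bit more concrete, here's a rough back-of-the-envelope sketch. The per-machine wattage and usage figures are illustrative assumptions on my part, not CSCI or state data; only the 8,000-computers figure comes from the paragraph above.

```python
# Back-of-the-envelope estimate -- the per-machine figures are illustrative assumptions.
PCS_PER_STATE_PER_YEAR = 4_000   # rough average (Minnesota and Kansas together buy over 8,000)
STATES = 50                      # if every state signed on
WATTS_SAVED_PER_PC = 30          # assumed savings from a more efficient desktop power supply
HOURS_ON_PER_YEAR = 2_000        # assumed annual on-time for an office desktop

kwh_saved = PCS_PER_STATE_PER_YEAR * STATES * WATTS_SAVED_PER_PC * HOURS_ON_PER_YEAR / 1_000
print(f"~{kwh_saved:,.0f} kWh saved per year of state purchases")  # roughly 12 million kWh
```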

When I came to Google six months ago to work on state policy issues, I had no idea that I would be involved in a project that would make such a big dent in energy usage. For me it is just one more example of why I like my job so much.

Posted:


How does the EU privacy regime fit with the idea of developing global privacy standards? How should the EU’s laws for data protection evolve to continue safeguarding privacy in the digital age? And what is the right balance between privacy and security in today’s society?

These were some of the central themes of a discussion that the Centre for European Policy Studies and Google hosted recently in Brussels, which was attended by some 100 EU policymakers and advocates. Our new Google Privacy channel on YouTube has a video of the entire event; here's a recap of some of the highlights.

Global privacy standards
EU data protection supervisor Peter Hustinx expressed support for Google's call for global privacy standards, and said there was a surprising overlap between different legal frameworks such as the EU rules and the 1980 OECD principles. However, with three out of every four countries not having any privacy rules in place, he considered the APEC framework as a "pragmatic approach to allow some late-starters to step in.” Peter Fleischer, Google's Global Privacy Counsel, said that global privacy standards are intended to raise standards where they don’t exist, not to lower standards where they do exist, like in Europe. Operating a global IT architecture, Google will identify and abide by the highest common denominator of privacy protection, even though in practice it’s not always easy to know what that standard is. Fleischer said that for Google’s business to thrive, consumers need to trust the internet, not just Google.


Adapting the EU data protection regime
Fleischer acknowledged that since 1995, the EU principles of data protection have been adopted by many countries. However, while the administrative application of the principles might have been appropriate before the age of the internet, this no longer works today. In addition, the EU data protection directive has become akin to an export control regime. The list of countries that have been found to be “adequate” under EU law is rather short. Fleischer said that the mechanism of establishing adequacy should be based on universally valid principles, and not on their administrative application that will vary from country to country.

Hustinx agreed that the adequacy test is too cumbersome. "We can do better, and should build in more global privacy into the EU framework as well," he said. A large number of formally "non-adequate" countries can be considered adequate for practical purposes. Nor is that the only change needed over the next five years or so, he said: the administration of the principles should be simpler and more flexible; actors beyond data protection authorities and affected persons should have the right to take legal action; and companies should embrace third-party quality controls of their privacy policies and architecture. Rather than paying expensive auditors, Fleischer favoured technology solutions such as Google Web History to increase transparency for users about how a company handles privacy.

Privacy and security
Alexander Alvaro, who sits on the Civil Liberties Committee of the European Parliament, focused on the implications of new technologies for individual freedom. There is an urgent need to extend data protection to the EU’s security policies in order to restrain governments’ excessive requests for data, he said. Alvaro also spoke against filtering web searches or generally blocking content on web sites for security purposes.

Alvaro expressed concern about storage of web search queries but acknowledged Google’s initiative to limit storage to 18 months, even though he’d personally prefer deletion of the data after that, rather than anonymisation. With respect to possible obligations to notify users of security breaches in the upcoming review of the EU’s telecom laws, Alvaro wants to limit notifications to risky breaches that potentially damage users. Encrypting data should also be regarded as a way of securing personal data.

It's clear that there is a need in public policy circles to better understand the rapidly innovating online world and to reflect on how data protection legislation can be adapted to it.