At Google I/O, we announced PHP as the latest supported runtime for Google App Engine in Limited Preview. PHP is one of the world's most popular programming languages, used by developers to power everything from simple web forms to complex enterprise applications.

Now PHP developers can take advantage of the scale, reliability and security features of App Engine. In addition, PHP runs well with other parts of Google Cloud Platform. Let's look at how this works.

Connecting to Google Cloud SQL from App Engine for PHP

Many PHP developers start with MySQL when choosing a database to store critical information, and a wide variety of products and frameworks such as WordPress make extensive use of MySQL’s rich feature set. Google Cloud SQL provides a reliable, managed database service that is MySQL 5.5 compatible and works well with App Engine.

To set up a Cloud SQL database, sign in to the Google Cloud Console, create a new project, choose Cloud SQL, and create a new instance.


After you create the instance, it's automatically associated with your App Engine app.
You'll notice that Cloud SQL instances don't have an IP address. Instead, they are accessed via a compound identifier made up of the project name and instance name, such as hello-php-gae:my-cloudsql-instance.

From within PHP, you can access Cloud SQL directly using the standard PHP MySQL libraries: mysql, mysqli, or PDO_MySQL. Just specify your Cloud SQL database with its identifier, for example:
<?php

$db = new PDO(
  'mysql:unix_socket=/cloudsql/hello-php-gae:my-cloudsql-instance;dbname=demo_db;charset=utf8',
  'demo_user',
  'demo_password'
);

foreach($db->query('SELECT * FROM users') as $row) {
  echo $row['username'].' '.$row['first_name']; //etc...
}
Methods such as query() work just as you’d expect with any MySQL database. This example uses the popular PDO library, although other libraries such as mysql and mysqli work just as well.
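The socket path embedded in the DSN above is simply the instance's compound identifier prefixed with /cloudsql/. A tiny Python helper (purely illustrative; the function name is made up) makes the format explicit:

```python
def cloudsql_socket_path(project, instance):
    """Build the /cloudsql/ unix-socket path that App Engine exposes for a
    Cloud SQL instance identified as project:instance."""
    return '/cloudsql/%s:%s' % (project, instance)

path = cloudsql_socket_path('hello-php-gae', 'my-cloudsql-instance')
# path == '/cloudsql/hello-php-gae:my-cloudsql-instance'
```

This is the same string that appears in the `unix_socket=` portion of the PDO DSN in the example above.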

Storing files with PHP and Google Cloud Storage

Reading and writing files is a common task in many PHP projects, whether you are reading stored application state, or generating formatted output (e.g., writing PDF files). The challenge is to find a storage system that is as scalable and secure as Google App Engine itself. Fortunately, we have exactly this in Google Cloud Storage (GCS).

The first step in setting up Google Cloud Storage is to create a bucket.
With the PHP runtime, we’ve implemented native support for GCS. In particular, we’ve made it possible for PHP’s native filesystem functions to read and write to a GCS bucket.

This code writes all prime numbers less than 2000 into a file on GCS:

<?php

$handle = fopen('gs://hello-php-gae-files/prime_numbers.txt','w');

fwrite($handle, "2");
for($i = 3; $i <= 2000; $i = $i + 2) {
  $j = 2;
  while($i % $j != 0) {
    if($j > sqrt($i)) {
      fwrite($handle, ", ".$i);
      break;
    }
    $j++;
  }
}

fclose($handle);
The same fopen() and fwrite() commands are used just as if you were writing to a local file; the only difference is that we've specified a Google Cloud Storage URL instead of a local file path.

And this code reads the same file back into memory and pulls out the 100th prime number, using file_get_contents():

<?php

$primes = explode(",",
  file_get_contents('gs://hello-php-gae-files/prime_numbers.txt')
);

if(isset($primes[99]))
  echo "The 100th prime number is ".trim($primes[99]);
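For readers who want to experiment with the same trial-division logic locally, here is a standalone Python sketch of the algorithm used in the PHP example (it builds a list in memory rather than streaming to a GCS file):

```python
import math

def primes_below(limit):
    """Collect all primes below `limit` by trial division, mirroring the
    PHP example: start from 2, then test only odd candidates."""
    primes = [2]
    for i in range(3, limit, 2):       # even numbers > 2 are never prime
        j = 2
        while i % j != 0:
            if j > math.sqrt(i):       # no divisor up to sqrt(i): i is prime
                primes.append(i)
                break
            j += 1
    return primes

primes = primes_below(2000)
# primes[99] is the 100th prime, 541
```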

And more features supported in PHP

Many of our most popular App Engine APIs are now supported in PHP, including our zero-configuration Memcache, Task Queues for asynchronous processing, Users API, Mail API and more. The standard features you’d expect from App Engine, including SSL support, Page Speed Service, versioning and traffic splitting are all available as well.

Open today in Limited Preview

Today we’re making App Engine for PHP available in Limited Preview. Read more about the runtime in our online documentation, download an early developer SDK, and sign up to deploy applications at https://cloud.google.com/appengine/php.

- Posted by Andrew Jessup, Product Manager

At Google I/O, we announced Google Cloud Datastore, a fully managed solution for storing non-relational data. Based on the popular Google App Engine High Replication Datastore (HRD), Cloud Datastore provides a schemaless, non-relational datastore with the same accessibility as Google Cloud Storage and Google Cloud SQL.

Cloud Datastore builds on the strong growth and performance of HRD, which stores over 1PB of data, handles 4.5 trillion transactions per month, and delivers 99.95% uptime. It also comes with the following features:
  • Built-in query support: near SQL functionality that allows you to search, sort and filter across multiple indexes that are automatically maintained 
  • ACID transactions: data consistency (both Strong and Eventual) that spans multiple replicas and requests 
  • Automatic scaling: built on top of Google’s BigTable infrastructure, the Cloud Datastore will automatically scale with your data 
  • High availability: by utilizing Google’s underlying Megastore service, the Cloud Datastore ensures that data is replicated across multiple datacenters and is highly available 
  • Local development environment: the Cloud Datastore SDK provides a full-featured local environment that allows you to develop, iterate and manage your Cloud Datastore instances efficiently 
  • Free to get started: 50k read & write operations, 200 indexes, and 1GB of stored data for free per month  

Getting started with Cloud Datastore 

To get started, head over to the Google Cloud Console and create a new project. After supplying a few pieces of information, you will have a Cloud Project with the Cloud Datastore enabled by default. For this post we'll use the project ID cloud-datastore-demo.


With the project created and the Cloud Datastore enabled, we’ll need to download the Cloud Datastore client library. Once extracted, it’s time to start writing some code. For the sake of this post, we’ll focus on accessing the Cloud Datastore from a Python application running on a Compute Engine VM (which is also now in Preview). We’ll assume that you’ve already created a new VM instance.
import googledatastore as datastore

def main():
  WriteEntity()
  ReadEntity()
Next, include the WriteEntity() and ReadEntity() functions:
def WriteEntity():
  req = datastore.BlindWriteRequest()
  entity = req.mutation.upsert.add()
  path = entity.key.path_element.add()
  path.kind = 'Greeting'
  path.name = 'foo'
  message = entity.property.add()
  message.name = 'message'
  value = message.value.add()
  value.string_value = 'to the cloud and beyond!'
  try:
    datastore.blind_write(req)
  except datastore.RPCError as e:
    # remember to do something useful with the exception
    pass

def ReadEntity(): 
  req = datastore.LookupRequest()
  key = req.key.add()
  path = key.path_element.add()
  path.kind = 'Greeting'
  path.name = 'foo'
  try:
    resp = datastore.lookup(req)
    return resp
  except datastore.RPCError as e:
    # remember to do something useful with the exception
    pass
Save all of this code in a file called demo.py. Finally, we can update main() to print out the property values within the fetched entity:
def main():
  WriteEntity()
  resp = ReadEntity()

  entity = resp.found[0].entity
  for p in entity.property:
    print 'Entity property name: %s' % p.name
    v = p.value[0]
    print 'Entity property value: %s' % v.string_value
Before we can run this code we need to tell the client library which Cloud Datastore instance we would like to use. This is done by exporting the following environment variable:
~$ export DATASTORE_DATASET=cloud-datastore-demo
Finally we’re able to run the application by simply issuing the following:
~$ python demo.py
Besides the output in the console window, we're also able to monitor our interactions within the Cloud Console. By navigating back to the Cloud Console, selecting our cloud-datastore-demo project, and then selecting Cloud Datastore, we're taken to the instance's dashboard page, which includes the number of entities, properties, and property types, as well as index management, ad-hoc query support, and a breakdown of stored data.

And that’s really just the beginning. To fully harness the features and functionality that the Cloud Datastore offers, be sure to check out the larger Getting Started guide and the Cloud Datastore documentation.

Cloud Datastore is the latest addition to the Cloud Platform storage family, joining Cloud Storage for storing blob data, Cloud SQL for storing relational data, and Persistent Disk for storing block data. All are fully managed, so you can focus on creating amazing solutions and leave the rest to us.

And while this is a Preview Release, the team is off to a great start. As we move the service towards General Availability we’re looking forward to improving JSON support, more deeply integrating with the Cloud Console, streamlining our billing and driving every bit of performance that we can out of the API and underlying service.

Happy coding!

- Posted by Chris Ramsdale, Product Manager

Cross-posted with the Google Developers Blog

After last year's Google I/O conference, the Google Cloud Platform Developer Relations team started to think about how attendees experienced the event. We wanted to help attendees gain more insight about the conference space and the environment itself. Which developer Sandboxes were the busiest? Which were the loudest locations, and which were the best places to take a quick nap? We think about data problems all the time, and this looked like an interesting big data challenge that we could try to solve. So this year, we decided to try to answer our questions with a project that's a bit different, kind of futuristic, and maybe a little crazy.

Since we love open source hardware hacking as much as we love to share open source code, we decided to team up with the O'Reilly Data Sensing Lab to deploy hundreds of Arduino-based environmental sensors at Google I/O 2013. Using software built with the Google Cloud Platform, we'll be collecting and visualizing ambient data about the conference, such as temperature, humidity, and air quality, in real time! Altogether, the sensor network will provide over 4,000 continuous data streams over a ZigBee mesh network managed by Device Cloud by Etherios.

photo of sensors

In addition, our motes will be able to detect fluctuations in noise level, and some will be attached to footstep counters, to understand collective movement around the conference floor. Of course, since a key goal of Google I/O is to promote innovation in the open, the project's Cloud Platform code, the Arduino hardware designs, and even the data collected, will be open source and available online after the conference.

Google Cloud Platform, which provides the software backend for this project, has a variety of features for building applications that collect and process data from a large number of client devices, without having to spend time managing hardware or infrastructure. Google App Engine Datastore, along with Google Cloud Endpoints, provides a scalable front end API for collecting data from devices. Google Compute Engine is used to process and analyze data with software tools you may already be familiar with, such as R and Hadoop. Google BigQuery provides fast aggregate analysis of terabyte datasets. Finally, App Engine's web application framework is able to surface interactive visualizations to users.
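As a toy illustration of the kind of aggregate analysis described above, here is a plain-Python sketch that computes a mean noise level per location (the location names and readings are made up; the real pipeline collected data through App Engine and ran its aggregations in BigQuery):

```python
from collections import defaultdict

# Hypothetical sensor readings: each record is one mote's noise measurement.
readings = [
    {'location': 'sandbox-a', 'noise_db': 62.0},
    {'location': 'sandbox-a', 'noise_db': 68.0},
    {'location': 'sandbox-b', 'noise_db': 55.0},
]

# Accumulate (sum, count) per location, then reduce to a mean.
totals = defaultdict(lambda: [0.0, 0])
for r in readings:
    totals[r['location']][0] += r['noise_db']
    totals[r['location']][1] += 1

mean_noise = {loc: s / n for loc, (s, n) in totals.items()}
```

The same GROUP-BY-and-average shape is what a BigQuery query over the full 4,000-stream dataset would express, just at a much larger scale.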

Networked sensor technology is in the early stages of revolutionizing business logistics, city planning, and consumer products. We are looking forward to sharing the Data Sensing Lab with Google I/O attendees, because we want to show how using open hardware together with the Google Cloud Platform can make this technology accessible to anyone.

With the help of the Google Maps DevRel team, we'll be displaying visualizations of interesting trends on several screens around the conference. Members of the Data Sensing Lab will be on hand in the Google I/O Cloud Sandbox to show off prototypes and talk to attendees about open hardware development. Lead software developer Amy Unruh and Kim Cameron from the Cloud Platform Developer Relations team will talk about how we built the software involved in this project in a talk called "Behind the Data Sensing Lab". In case you aren't able to attend Google I/O 2013, this session will be available online after the conference. Learn more about the Google Cloud Platform on our site, and to dive in to building applications, check out our developer documentation.

- Posted by Michael Manoochehri, Developer Programs Engineer