Posted by Champika Fernando, Product Manager, Kids Coding
At Google I/O, we announced our ongoing investment in Blockly with the release of a native Android version. We also highlighted significant performance improvements to web Blockly, which enable better rendering on mobile devices. Now iOS developers have access to an open-source developer preview of Blockly for iOS, which supports building better experiences on mobile, including multi-touch and enhanced animations as new experimental features.
Today’s release supports our ongoing efforts to enable developers to create consistent, high-quality, beginner programming experiences: block-based programming interfaces can make coding more accessible by removing syntax errors and supporting “tinkerability.” We believe that coding is more than just a set of technical skills; coding is a valuable tool for everyone, empowering users around the globe to imagine, invent, and explore.
With Blockly for iOS, developers can add Blockly views and fragments directly into their iOS app. This will offer tighter integration and improved performance compared to using a WebView. In this developer preview, blocks are currently optimized for tablets, but ready to customize for any app.
In addition, if you already use Blockly, check out the new Blockly Developer Tools: a major update to the tools for creating custom blocks and configuring Blockly for your app. The new tools allow you to edit and maintain a library of custom blocks, quickly configure toolboxes, and export and import files to local storage.
Click here to learn more, and get started on Blockly for iOS today. And to share feedback and get news, we welcome you to join the Blockly mailing list. We look forward to seeing your future builds!
Originally posted on Geo Developers blog
Posted by Akshay Kannan, Product Manager
Today we're launching Nearby on Android, a new surface for users to discover and interact with the things around them. This extends the Nearby APIs we launched last year, which make it easy to discover and communicate with other nearby devices and beacons. Earlier this year, we also started experimenting with Physical Web beacons in Chrome for Android. With Nearby, we’re taking this a step further.
Imagine pulling up a barcode scanner when you’re at the store, or discovering an audio tour while you’re exploring a museum. These are the sorts of experiences that Nearby can enable. To make this possible, we're allowing developers to associate their mobile app or website with a beacon.
A number of developers have already been building compelling proximity-based experiences using beacons and Nearby.
Getting started is simple. First, get some Eddystone beacons; you can order these from any of our Eddystone-certified manufacturers. Android devices and other BLE-equipped smart devices can also be configured to broadcast in the Eddystone format.
Second, configure your beacon to point to your desired experience. This can be a mobile web page using the Physical Web, or you can link directly to an experience in your app. For users who don’t have your app, you can either provide a mobile web fallback or request a direct app install.
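To give a sense of what a beacon actually broadcasts, the Eddystone-URL frame format packs a URL into a compact BLE advertisement using scheme-prefix and substring-expansion bytes. Below is a minimal sketch of an encoder for that frame, based on the public Eddystone-URL specification. It is illustrative only: the function name is ours, and real beacons are normally configured with vendor tools rather than hand-built frames.

```python
# URL scheme prefixes defined by the Eddystone-URL spec (byte 0x00-0x03).
_SCHEMES = ["http://www.", "https://www.", "http://", "https://"]

# Single-byte expansions for common URL substrings (codes 0x00-0x0d).
# Longer strings come first so ".com/" wins over ".com".
_EXPANSIONS = [".com/", ".org/", ".edu/", ".net/", ".info/", ".biz/", ".gov/",
               ".com", ".org", ".edu", ".net", ".info", ".biz", ".gov"]

def encode_eddystone_url(url, tx_power=-20):
    """Encode a URL as an Eddystone-URL frame (frame type 0x10)."""
    for i, scheme in enumerate(_SCHEMES):
        if url.startswith(scheme):
            prefix, body = i, url[len(scheme):]
            break
    else:
        raise ValueError("URL must start with a supported scheme")

    # Frame type, calibrated TX power (signed byte), scheme prefix.
    frame = bytearray([0x10, tx_power & 0xFF, prefix])

    # Greedily replace known substrings with their expansion codes.
    while body:
        for code, expansion in enumerate(_EXPANSIONS):
            if body.startswith(expansion):
                frame.append(code)
                body = body[len(expansion):]
                break
        else:
            frame.append(ord(body[0]))
            body = body[1:]

    if len(frame) > 20:  # spec limit: 3 header bytes + up to 17 URL bytes
        raise ValueError("URL too long for a single Eddystone-URL frame")
    return bytes(frame)
```

The tight size budget is why shortened URLs (for example via a URL shortener) are common for Physical Web beacons.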
Nearby has started rolling out to users as part of the upcoming Google Play Services release and will work on Android devices running 4.4 (KitKat) and above. Check out our developer documentation to get started. To learn more about Nearby Notifications in Android, also check out our I/O 2016 session, starting at 17:10.
Posted by Andrey Doronichev, Group Product Manager, Google VR
In Daydream Labs, the Google VR team explores virtual reality’s possibilities and shares what we learn with the world. While it’s still early days, the VR community has already come a long way in understanding what works well in VR across hardware, software, video, and much more. But, part of what makes developing for VR so exciting is that there’s still so much more to discover.
Apps are a big focus for Daydream Labs. In the past year, we’ve built more than 60 app experiments that test different use cases and interaction designs. To learn fast, we build two new app prototypes each week. Not all of our experiments are successful, but we learn something new with each one.
For example, in one week we built a virtual drum kit that used HTC Vive controllers as drumsticks. The following week, when we were debating how to make typing in VR more natural and playful, we thought — “what if we made a keyboard out of tiny drums?”
We were initially skeptical that drumsticks could be more efficient than direct hand interaction, but the result surprised us. Not only was typing with drumsticks faster than with a laser pointer, it was really fun! We even built a game that lets you track your words per minute (mine was 50 wpm!).
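For reference, typing speed is conventionally measured by treating five characters as one word. A minimal sketch of that calculation (the function name is ours, not from the experiment):

```python
def words_per_minute(chars_typed, seconds):
    """Typing speed using the standard convention of 5 characters per word."""
    if seconds <= 0:
        raise ValueError("duration must be positive")
    words = chars_typed / 5
    return words / (seconds / 60)
```

Under this convention, typing 250 characters in one minute comes out to 50 wpm.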
Daydream Labs is just getting started. This post is the first in an ongoing series sharing what we’ve learned through our experiments, so stay tuned for more! You can also see more of what we’ve learned about VR interactions, immersion, and social design by watching our Google I/O talks on the live stream.
Posted by Mike Pegg, Google Developers Team
What are the best ways to optimize battery and memory usage of your apps? How do you create a great app experience that is accessible to everyone, including users with disabilities? How do you build an offline-ready, service-working, app-manifesting, production-ready Progressive Web App using Firebase Hosting? And what are some of the best desserts that start with N? Tune in to Google I/O to get the answers to all of these questions (well, most of them...), along with a whole lot more. You can start planning your schedule now, as the first wave of 100 technical talks just went live at google.com/io!
Last year, you told us you wanted more: more technical content, more time, more space, more everything! We heard your feedback loud and clear and have added a full third day onto Google I/O to accommodate more comprehensive talks in larger spaces than in previous years. These talks will be spread over 14 suggested tracks, including Android, the Mobile Web, Play and more, to help you easily navigate your I/O experience. Of course, we’re also bringing back Codelabs, our self-paced workshops with Googlers nearby to give you a hand.
Attending Remotely?
There are already over 200 I/O Extended events happening around the world. Join one of these events to participate in I/O from your local neighborhood alongside local developers who share the same passion for Google technology. You can also follow the festival from home; we’ll have four different live stream channels to choose from, broadcasting many of the sessions in real time from Shoreline. All of the sessions will be available to watch on YouTube after I/O concludes, in case you miss one.
See you soon!
This is just the first wave of talks. We’ll be adding more talks and events as we get closer to I/O, including a number of talks directly after the keynote (shhhh!! We’ve got some new things to share). We look forward to seeing you in a few weeks -- whether it be in person at Shoreline, at an I/O Extended event, or on I/O Live!