<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[Flutter - Medium]]></title>
        <description><![CDATA[Flutter is Google&#39;s UI framework for crafting high-quality native interfaces on iOS, Android, web, and desktop. Flutter works with existing code, is used by developers and organizations around the world, and is free and open source. Learn more at https://flutter.dev - Medium]]></description>
        <link>https://medium.com/flutter?source=rss----4da7dfd21a33---4</link>
        <image>
            <url>https://cdn-images-1.medium.com/proxy/1*TGH72Nnw24QL3iV9IOm4VA.png</url>
            <title>Flutter - Medium</title>
            <link>https://medium.com/flutter?source=rss----4da7dfd21a33---4</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Thu, 29 Feb 2024 11:05:25 GMT</lastBuildDate>
        <atom:link href="https://medium.com/feed/flutter" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[How Flutter facilitates collaboration between designers and developers at L+R]]></title>
            <link>https://medium.com/flutter/how-flutter-facilitates-collaboration-between-designers-and-developers-at-l-r-05ec82c9f45e?source=rss----4da7dfd21a33---4</link>
            <guid isPermaLink="false">https://medium.com/p/05ec82c9f45e</guid>
            <category><![CDATA[flutter-app-development]]></category>
            <category><![CDATA[flutter]]></category>
            <category><![CDATA[global-citizen]]></category>
            <dc:creator><![CDATA[Ivan Leider]]></dc:creator>
            <pubDate>Mon, 19 Feb 2024 19:01:58 GMT</pubDate>
            <atom:updated>2024-02-19T19:01:57.892Z</atom:updated>
            <content:encoded><![CDATA[<p>You might be familiar with <a href="https://www.globalcitizen.org/en/">Global Citizen</a> (GC), an organization dedicated to ending world poverty and helping the planet. When GC wanted to rewrite a mobile app to help with this effort, they reached out to our studio, <a href="https://levinriegner.com/home">L+R</a>, to collaborate on designing, building, and launching the app simultaneously on Android and iOS.</p><p>Flutter’s flexibility, pre-built widget catalog, and robust animation capabilities allowed the L+R team to implement a design-led development process. When using Flutter, our developers have a blank canvas to bring custom designs to life. For L+R, this means our design teams can unleash their creativity to create user-centric apps that look and feel great.</p><p>Here are a few screenshots from the Global Citizen app:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/738/0*wuezwMGAnC91NBTJ" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/738/0*dv6mhmjDb4HP59LA" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/738/0*KD5yX1QHIghjtf78" /></figure><p><strong>The design process</strong></p><p>When starting a new client project, the designers first create a design system in Figma. This consists of brand guidelines — like color palettes and typography — and small reusable components.</p><p>When starting on the Global Citizen app rewrite, the design team expanded upon the existing guidelines to provide a fresh look with higher-contrast elements. Not only does this help to direct the user’s attention but it also improves accessibility.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*DwbyxOv8yHSp1aFb" /></figure><p>UI elements are created as <em>components</em> in Figma. Each component may have different <em>variants</em> depending on component type or state. 
For example, the following diagram shows several button variants:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/572/0*JXYimUKJxblL6LYA" /></figure><p>When developing new components, the design team referenced the Material and iOS design kits in Figma for inspiration. This helps with translating the design to code as these Figma components closely correspond to widgets available in the Material and Cupertino libraries.</p><p>Throughout the design process, the team composed the small, reusable components to form larger feature-specific components that were then combined into entire pages. This empowered Global Citizen’s product team to expand the app with new functionality while remaining consistent with the original vision.</p><p><strong>Project template</strong></p><p>At L+R, our team has been working with Flutter since its first public release. To make building Flutter apps fast and easy for our team, we’ve created an <a href="https://github.com/levin-riegner/flutter-template">open-source repository</a> that serves as a project template and provides the cornerstone for all Flutter apps that we build.</p><p>This barebones project contains a set of widgets that can be customized and used in our clients’ applications. For example, the buttons represented in the Figma design file (shown above) can be built using the _BaseButton class from the project template. This class accepts different colors, text styles, and padding to best match the design system set for that project.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/638/0*ApRsUADr-G3HYMzY" /></figure><p>To make it easy to style these components based on the brand guidelines, we use the ThemeData class. 
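</p><p>As a rough sketch of this pattern (the widget name and fields here are hypothetical, not L+R’s actual template code), a base button that pulls its colors, text style, and padding from ThemeData might look like this:</p>

```dart
import 'package:flutter/material.dart';

/// Hypothetical sketch of a reusable base button: all styling is
/// resolved from the ambient ThemeData, so the same widget adapts
/// to each client's brand guidelines.
class BrandButton extends StatelessWidget {
  const BrandButton({
    super.key,
    required this.label,
    required this.onPressed,
    this.padding = const EdgeInsets.symmetric(horizontal: 24, vertical: 12),
  });

  final String label;
  final VoidCallback? onPressed;
  final EdgeInsetsGeometry padding;

  @override
  Widget build(BuildContext context) {
    final theme = Theme.of(context);
    return TextButton(
      style: TextButton.styleFrom(
        backgroundColor: theme.colorScheme.primary,
        foregroundColor: theme.colorScheme.onPrimary,
        padding: padding,
        textStyle: theme.textTheme.labelLarge,
      ),
      onPressed: onPressed,
      child: Text(label),
    );
  }
}
```

<p>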
Developers take the color schemes and text styles from Figma and map them to the corresponding theme properties.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/888/0*DOvyVLAckVSQqoax" /></figure><p>Our team leveraged the existing properties in ThemeData as much as possible, and then used the ThemeExtension functionality to complete the remaining configurations.</p><p><strong>Flutter’s widget library</strong></p><p>As mentioned above, the project template contains a set of reusable widgets. When developing these, we try to use pre-existing widgets from the Material and Cupertino libraries as much as possible. In the Global Citizen app, about half of the components descend from Material or Cupertino widgets.</p><p>For example, the TextField and TextFormField widgets from Material contained all the necessary customizations and functionality needed to match the designs. The TextButton from Material was also used to implement the _BaseButton widget. In this case, we took advantage of TextButton’s existing highlight functionality, but we wrapped it into a new widget with extended functionality to better suit our needs.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*BrVLb8vUnOxCenlR" /></figure><p>Having these out-of-the-box UI components that can easily be styled makes it easier to build and maintain our widgets. However, sometimes our designers create custom components that can’t be recreated using Flutter’s Material or Cupertino widgets. In these cases, the development team creates our own custom widgets from low-level Flutter widgets.</p><p>One example from the Global Citizen app was progress indicators. For this app, our designers created a custom progress indicator as shown in the following GIF. The existing Material and Cupertino progress indicators didn’t work for this design. 
However, it was easy for our developers to create an entirely new widget using only containers, rows, and columns with implicit animations.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/350/0*FSZ6UZM2gdxag6tr" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*ecO_ptzWhoeP60fp" /></figure><p>Community libraries were also incredibly useful for more complex and flexible components. For example, we used the community-built <a href="https://pub.dev/packages/another_flushbar">flushbar</a> to display alert notifications inside the Global Citizen app.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/745/0*vMiJl709u8ABpUE3" /></figure><p><strong>Hot reload</strong></p><p>When translating the Figma design to Flutter code, it’s important for designers and developers to collaborate. Designers often have feedback that results in small tweaks to styles, layouts, or animations. Hot reload made this collaboration much more efficient.</p><p>With hot reload, developers can tweak the code while screen-sharing with the design team (whether during the UX phase, UI concept phase, front-end implementation stage, or even the design-QA stage). While screen sharing, the designer can instantly see the results of changes. This makes debugging and iteration much faster.</p><p><strong>Smooth animations</strong></p><p>One thing that helps an app feel polished is adding thoughtful animations. Flutter’s animation framework is flexible and powerful; however, it can be difficult for designers to understand. One thing that helped our team was to reference beautiful animations in open-source apps. Specifically, we looked at the codebase for the <a href="http://wonderous.app">Wonderous app</a>. 
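</p><p>The flutter_animate package that powers Wonderous offers a fluent, chainable API; a minimal hedged sketch (the durations and curves are illustrative, not values from either app):</p>

```dart
import 'package:flutter/material.dart';
import 'package:flutter_animate/flutter_animate.dart';

/// Illustrative only: fade a page in while sliding it up slightly.
Widget buildAnimatedPage(Widget child) {
  return child
      .animate()
      .fadeIn(duration: 300.ms, curve: Curves.easeOut)
      .slideY(begin: 0.1, end: 0, duration: 300.ms);
}
```

<p>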
We’ve since leveraged <a href="https://pub.dev/packages/flutter_animate">flutter_animate</a>, the open-source library that powers Wonderous, to add simple yet enriching animations to key functionalities.</p><p>The video in the following link highlights part of the “Take Action” flow, where the user navigates through a series of informative screens to learn more about the issue.</p><p>Check out the <a href="https://drive.google.com/file/d/1OmILf7hZZHbnWbaC6BN-EopYDmQtEOmp/view">action_learn_animations</a> video.</p><p>Using flutter_animate, we were able to do the following:</p><ul><li>Animate the step indicator as the user moves forward</li><li>Add a slight fade transition between pages</li><li>Transform the <strong>Play Video</strong> button into a success checkmark after viewing the video</li></ul><p>Overall, Flutter allows a flexible approach to design and development. Flutter’s pre-built widgets make it easier to create reusable UI components that can be styled to match a customer’s branding. The Flutter community plays an important role in how the technology matures, offering high-quality libraries and expanding the available resources. Key Flutter features like hot reload support better collaboration between designers and developers. All of these put together result in an app that is both beautiful and functional!</p><p>For more information about the app, check out <a href="https://www.globalcitizen.org/en/content/new-global-citizen-app-impact-activism-every-day/">The New Global Citizen App: Daily Activism &amp; Measurable Impact in Your Pocket</a>. You can also <a href="https://www.globalcitizen.org/en/app/download/">download the Global Citizen app</a> to your mobile device.</p><p><a href="https://levinriegner.com/home">L+R</a> is an international strategy, design, and mobile technology studio. 
We offer services including UX design, mobile development, and strategy consulting.</p><hr><p><a href="https://medium.com/flutter/how-flutter-facilitates-collaboration-between-designers-and-developers-at-l-r-05ec82c9f45e">How Flutter facilitates collaboration between designers and developers at L+R</a> was originally published in <a href="https://medium.com/flutter">Flutter</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[What’s new in Flutter 3.19]]></title>
            <link>https://medium.com/flutter/whats-new-in-flutter-3-19-58b1aae242d2?source=rss----4da7dfd21a33---4</link>
            <guid isPermaLink="false">https://medium.com/p/58b1aae242d2</guid>
            <category><![CDATA[flutter-app-development]]></category>
            <category><![CDATA[flutter]]></category>
            <category><![CDATA[release-notes]]></category>
            <category><![CDATA[announcements]]></category>
            <dc:creator><![CDATA[Kevin Chisholm]]></dc:creator>
            <pubDate>Thu, 15 Feb 2024 19:31:48 GMT</pubDate>
            <atom:updated>2024-02-15T22:35:02.380Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*ZX3bHJdGGZwwOQHt" /></figure><h4>Revolutionizing App Development with the Gemini API, Impeller Updates, and Windows Arm64 Support</h4><p>Today we present you with a new Flutter release, Flutter 3.19. This release brings a new Dart SDK for Gemini, a widget enabling developers to add fine-grained control to widget animations, a rendering boost with updates to Impeller, tooling to help implement deep links, Windows Arm64 support and so much more!</p><p>The Flutter community continues to impress, merging 1429 pull requests by 168 community members, with 43 community members committing their first Flutter pull requests!</p><p>Keep reading to learn about all the new additions and improvements the Flutter community has contributed to this latest release!</p><h3>AI integration</h3><h4>Gemini Google AI Dart SDK beta release</h4><p>The Google AI Dart SDK has been released to beta. This enables you to build generative AI features into your Dart or Flutter app, powered by Gemini, Google’s latest family of AI models. There is now a <a href="https://pub.dev/packages/google_generative_ai">google_generative_ai</a> package on pub.dev. Learn more about how to build with the Google AI Dart SDK in <a href="https://medium.com/flutter/harness-gemini-in-your-dart-and-flutter-apps-00573e560381">this blog post</a> or jump straight into the <a href="https://ai.google.dev/tutorials/dart_quickstart">Dart quickstart</a>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*13y0iIXD9nN5wCcX" /></figure><h3>Framework</h3><h4>Scrolling improvements</h4><p>Flutter used to scroll twice as fast if you dragged two fingers. You can now configure the default ScrollBehavior with MultiTouchDragStrategy.latestPointer to get number-of-fingers-agnostic scrolling behavior. 
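</p><p>As a hedged sketch (assuming the new getMultitouchDragStrategy override; see the migration guide for the exact API), an app-wide ScrollBehavior might look like:</p>

```dart
import 'package:flutter/material.dart';

/// Sketch: always track only the latest pointer, so a second
/// finger no longer doubles the scroll speed.
class LatestPointerScrollBehavior extends MaterialScrollBehavior {
  const LatestPointerScrollBehavior();

  @override
  MultitouchDragStrategy getMultitouchDragStrategy(BuildContext context) =>
      MultitouchDragStrategy.latestPointer;
}

// Usage: MaterialApp(scrollBehavior: const LatestPointerScrollBehavior(), ...)
```

<p>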
For more information on this change, see the <a href="https://docs.flutter.dev/release/breaking-changes/multi-touch-scrolling">migration guide</a>.</p><p>We also completed bug fixes for <a href="https://github.com/flutter/flutter/pull/136871">SingleChildScrollView</a> and <a href="https://github.com/flutter/flutter/pull/136828">ReorderableList</a>, resolving a number of reported crashes and unexpected behavior.</p><p>In two-dimensional scrolling, we resolved an issue so that now, if you drag or tap while a scroll is underway in either direction, the scroll activity stops as expected.</p><p>The TableView widget in the two_dimensional_scrollables package has also been updated since the last release, providing more polish, adding support for merged cells, and adopting more of the new features of the 2D foundation after the last stable release of 3.16.</p><h4>AnimationStyle</h4><p>Thanks to a <a href="https://github.com/flutter/flutter/pull/138721">contribution</a> by Flutter community member <a href="https://github.com/TahaTesser">@TahaTesser</a>, Flutter has a new AnimationStyle class that lets developers override the default animation behavior in widgets such as MaterialApp, ExpansionTile, and PopupMenuButton, including their animation curves and durations.</p><h4>SegmentedButton.styleFrom</h4><p>Flutter community member <a href="https://github.com/AcarFurkan">@AcarFurkan</a> added a styleFrom static utility method, just like the ones provided by the other button types. This method enables quickly creating a SegmentedButton’s ButtonStyle that can be shared with other segmented buttons or used to configure the app’s SegmentedButtonTheme.</p><h4>Adaptive Switch</h4><p>This adaptive component looks and feels native on macOS and iOS and has the Material Design look and feel elsewhere. 
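</p><p>Using it is a one-line swap from the regular Material switch; a minimal sketch:</p>

```dart
import 'package:flutter/material.dart';

/// Renders a Cupertino-style switch on iOS and macOS and a
/// Material switch elsewhere, with a single API.
class SettingToggle extends StatelessWidget {
  const SettingToggle({super.key, required this.value, required this.onChanged});

  final bool value;
  final ValueChanged<bool> onChanged;

  @override
  Widget build(BuildContext context) {
    return Switch.adaptive(value: value, onChanged: onChanged);
  }
}
```

<p>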
It does not depend on the Cupertino library so its API is exactly the same on all platforms.</p><p>See the <a href="https://github.com/flutter/flutter/pull/130425">adaptive switch pull request</a> and the live example on the Switch.adaptive constructor <a href="https://api.flutter.dev/flutter/material/Switch/Switch.adaptive.html">API page</a>.</p><h4>SemanticsProperties accessibility identifier</h4><p>A new accessibility identifier in SemanticsProperties provides an identifier for the semantic node in the native accessibility hierarchy. On Android, it appears in the accessibility hierarchy as resource-id. On iOS, this sets UIAccessibilityElement.accessibilityIdentifier. We want to thank community member <a href="https://github.com/bartekpacia">@bartekpacia</a> for this change, which spanned the <a href="https://github.com/flutter/engine/pull/47961">engine</a> and <a href="https://github.com/flutter/flutter/pull/138331">framework</a>.</p><h4>Increased access to text widget state</h4><p>We added support for a MaterialStatesController in TextField and TextFormField so that you can listen to MaterialState changes.</p><h4>UndoHistory stack</h4><p>We <a href="https://github.com/flutter/flutter/pull/138674">fixed</a> a <a href="https://github.com/flutter/flutter/issues/130881">problem</a> where the undo/redo history could disappear on Japanese keyboards. You can now modify an entry before it’s pushed to the UndoHistory stack.</p><h3>Engine</h3><h4>Impeller progress</h4><p><strong>Android OpenGL preview</strong></p><p>In the 3.16 stable release, we invited users to try out Impeller on Vulkan-enabled Android devices, covering 77% of Android devices in the field. Over the past few months, we have brought Impeller’s OpenGL backend up to feature parity with the Vulkan backend, for example, by adding <a href="https://github.com/flutter/engine/pull/47030">support for MSAA</a>. 
This means that Flutter apps on nearly all Android devices are expected to render correctly, with the exception of a small number of remaining features that are coming soon, such as custom shaders and full support for external textures.</p><p>We request that Flutter developers upgrade to the latest stable version, and file issues about any shortcomings observed when <a href="https://docs.flutter.dev/perf/impeller#android">Impeller is enabled</a>. Feedback at this stage is invaluable to ensuring that Impeller is successful on Android and that we can confidently make it the default renderer in a release later this year. The Android hardware ecosystem is much more diverse than the iOS ecosystem. For that reason, the most helpful feedback about Impeller should include detailed information about the specific device and Android version where issues occurred.</p><p>Further, as a reminder, Impeller’s Vulkan backend enables additional debugging capabilities in <a href="https://docs.flutter.dev/testing/build-modes#debug">debug</a> builds beyond what is used with Skia, and these capabilities have additional runtime overhead. Therefore, it’s important to give feedback about Impeller’s performance from a <a href="https://docs.flutter.dev/testing/build-modes#profile">profile</a> or <a href="https://docs.flutter.dev/testing/build-modes#release">release</a> build. The bug report should include timelines from DevTools and a comparison with the Skia backend on the same device. Finally, as always, we are very grateful for feedback that includes a small reproducible test case that demonstrates the issue.</p><p><strong>Roadmap</strong></p><p>After rendering fidelity, our main focus in Impeller’s Android preview period is performance. We continue to make incremental gains, however a couple of larger improvements are also in progress. 
We expect that work to take advantage of <a href="https://github.com/flutter/flutter/issues/128911">Vulkan subpasses</a> will greatly improve the performance of advanced blend modes. We also expect that a change in rendering strategy away from always tessellating every path on the CPU towards a <a href="https://github.com/flutter/flutter/issues/137714">Stencil-then-cover</a> approach will greatly reduce Impeller’s CPU utilization on both Android and iOS. Finally, we expect that a new implementation of <a href="https://github.com/flutter/flutter/issues/131580">Gaussian blurring</a> will match the throughput of the Skia implementation, and improve idiomatic use of blurring on iOS.</p><h4>API improvements</h4><p><strong>Glyph Information</strong></p><p>This release includes two new methods on dart:ui’s Paragraph object: getClosestGlyphInfoForOffset and getGlyphInfoAt, each of which returns an object of the new type GlyphInfo. Check out the documentation on the new <a href="https://main-api.flutter.dev/flutter/dart-ui/GlyphInfo-class.html">GlyphInfo</a> type.</p><p><strong>GPU tracing</strong></p><p>Under Impeller on Metal (iOS, macOS, Simulator) and on Vulkan-enabled Android devices, the Flutter engine now reports GPU times for each frame in the timeline in debug and profile builds. 
GPU frame timing can be inspected in DevTools under the “GPUTracer” heading.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*m3qW6u7Q4mNfTKxN" /></figure><p>Please note that since non-Vulkan Android devices might misreport their support for querying GPU timing, Impeller’s GPU tracing can only be enabled with a flag set in the AndroidManifest.xml file on these devices.</p><pre>&lt;meta-data<br>  android:name=&quot;io.flutter.embedding.android.EnableOpenGLGPUTracing&quot;<br>  android:value=&quot;true&quot; /&gt;</pre><h4>Performance optimizations</h4><p><strong>Specialization constants</strong></p><p>The team added <a href="https://github.com/flutter/flutter/issues/119357">support for specialization constants</a> to Impeller. Taking advantage of this feature in Impeller’s shaders reduced uncompressed binary size of the Flutter engine by <a href="https://flutter-flutter-perf.skia.org/e/?begin=1698877815&amp;end=1702074996&amp;queries=test%3Dhello_world_ios__compile&amp;requestType=0&amp;selected=commit%3D37892%26name%3D%252Carch%253Darm%252Cbranch%253Dmaster%252Cconfig%253Ddefault%252Cdevice_type%253DiPhone_11%252Cdevice_version%253Dnone%252Chost_type%253Dmac%252Csub_result%253Dflutter_framework_uncompressed_bytes%252Ctest%253Dhello_world_ios__compile%252C">nearly 350KB</a>.</p><p><strong>Backdrop filter speedups</strong></p><p>There is much more work to do, however this release includes a couple of nice performance improvements for backdrop filters and blurs on Impeller. In particular, open source contributor <a href="https://github.com/knopp">@knopp</a> <a href="https://github.com/flutter/flutter/issues/131567#issuecomment-1678210475">noticed</a> that Impeller was mistakenly requesting the capability to read from the onscreen texture. 
<a href="https://github.com/flutter/engine/pull/47808">Removing this capability</a> improved scenes that include multiple backdrop filters by anywhere from 20–70% in our benchmarks, depending on complexity.</p><p>Further, Impeller <a href="https://github.com/flutter/engine/pull/47397">no longer unconditionally stores the stencil buffer</a> on every backdrop filter. Instead, any clip-affecting operations are recorded and replayed into a new stencil buffer when restoring the save layer for the backdrop filter.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1008/0*6Of__x8KILe6U5Si" /></figure><p>With this change, our benchmark of animated advanced blend modes on a Pixel 7 Pro running Impeller with the Vulkan backend improved average GPU frame times from 55ms to 16ms, and improved 90%-ile raster thread CPU times from around 110ms down to 22ms.</p><h3>Android</h3><h4>Deeplinking web validator</h4><p>We have learned from developers that deep linking (taking users from a web URL to a specific page in a mobile app) has always been difficult and error prone to implement. So we first created a validation tool to help developers understand what links are incorrectly configured, and provide implementation guidance. We are very happy to share that an early version of the Flutter deeplink validator is now available!</p><p>In this early version, the Flutter deep link validator supports web check on Android, which means validating the setup of your assetlinks.json file. You can open DevTools, click into the <strong>Deep Links</strong> tab, and import a Flutter project that contains deeplinks. The deeplinking validator tells you if your web file is configured correctly. You can refer to the deep link validation tool <a href="https://docs.google.com/document/d/1fnWe8EpZleMtSmP0rFm2iulqS3-gA86z8u9IsnXjJak/edit?tab=t.0">testing instructions</a> for more information.</p><p>We hope this tool is the first step toward simplifying your deep linking implementation journey. 
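</p><p>For reference, the web side of an Android deep link setup is an assetlinks.json file served from your domain’s /.well-known/ path; a hedged sketch (the package name and certificate fingerprint are placeholders):</p>

```json
[
  {
    "relation": ["delegate_permission/common.handle_all_urls"],
    "target": {
      "namespace": "android_app",
      "package_name": "com.example.app",
      "sha256_cert_fingerprints": [
        "14:6D:E9:83:C5:73:AB:52:86:A4:E2:1B:54:DC:24:30:7F:AD:2B:79:AF:C7:7F:26:AE:B3:C6:A3:0E:2F:1A:82"
      ]
    }
  }
]
```

<p>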
We will continue working to provide future support for web check on iOS, and app check on both iOS and Android!</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*1YoGEcLgpaFythel" /></figure><h4>Support for Share.invoke</h4><p>The default <strong>Share</strong> button on text fields and views was previously missing from Android, but we’ve added it in this release as part of our ongoing effort to ensure all the default context menu buttons are available on each platform. You can follow that ongoing work in <a href="https://github.com/flutter/flutter/issues/107578">issue #107578</a>.</p><h4>Native assets feature</h4><p>If you want to call functions written in other languages from your Flutter code, you can now perform FFI calls through Native assets on Android as part of our <a href="https://github.com/flutter/flutter/issues/129757">ongoing work</a> towards supporting Native assets.</p><h4>Texture Layer Hybrid Composition (TLHC) mode</h4><p>Flutter 3.19 includes work that now makes Google Maps and the text input magnifier work in TLHC mode, which means better performance for your apps. If you’re using Google Maps, we encourage you to test out the changes and let us know your feedback!</p><p>This work doesn’t include commits under the Framework or Engine, but you can see the work in <a href="https://github.com/flutter/packages/pull/5408">PR 5408</a>, along with the steps to test TLHC mode.</p><h4>Custom system-wide text selection toolbar buttons</h4><p>Android apps can add custom text selection menu items that appear in all text selection menus (the menu that appears when you long-press on text). Flutter’s TextField selection menu now incorporates those items.</p><h3>iOS</h3><h4>Flutter iOS native fonts</h4><p>Flutter text now looks a little more compact and a little more native on iOS. 
According to the Apple design guidelines, smaller fonts on iOS should be more spread out in order to be easier to read on mobile, while larger fonts should be more compact to not take up as much space. Before, we were incorrectly using the smaller, more spaced out font in all cases. Now, by default Flutter will use the compact font for larger text.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/798/0*q9BjqFkxrFRfx9eP" /></figure><h3>DevTools</h3><h4>DevTools updates</h4><p>Some highlights for DevTools with this release are:</p><ul><li>Added a new feature and screen in DevTools to validate deeplinks setup on Android.</li><li>Added an option in the <strong>Enhance Tracing</strong> menu for tracking platform channel activity. This is useful for apps with plugins.</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/752/0*zDY2wXaCTMankTTb" /></figure><ul><li>The Performance and CPU profiler screens are now made available when there is no connected app. Performance data or CPU profiles that were previously saved from DevTools can be reloaded for viewing from these screens.</li><li>The Flutter Sidebar in VS Code now has the ability to enable new platforms if not enabled for the current project, and the DevTools menu in the sidebar now has an option to open DevTools in an external browser window.</li></ul><p>To learn more, check out the release notes for DevTools, <a href="https://docs.flutter.dev/tools/devtools/release-notes/release-notes-2.29.0">2.29.0</a>, <a href="https://docs.flutter.dev/tools/devtools/release-notes/release-notes-2.30.0">2.30.0</a>, and <a href="https://docs.flutter.dev/tools/devtools/release-notes/release-notes-2.31.0">2.31.0</a>.</p><h3>Desktop</h3><h4>Windows Arm64 support</h4><p>Flutter on Windows is now embracing initial support for the Arm64 architecture, thanks to the commendable efforts of community member <a href="https://github.com/pbo-linaro">@pbo-linaro</a>. 
This initial support paves the way for more efficient and performant Flutter applications that run natively on Windows Arm64 devices. Although still in development, with progress trackable on GitHub issue <a href="https://github.com/flutter/flutter/issues/62597">#62597</a>, this move signifies a promising enhancement for Flutter developers aiming to optimize their apps for a broader range of Windows devices.</p><h3>Ecosystem</h3><h4>Required reason privacy manifest</h4><p>Flutter now includes a privacy manifest on iOS to meet <a href="https://developer.apple.com/support/third-party-SDK-requirements/">upcoming Apple requirements</a>.</p><h4>Progress of the Flutter and Dart package ecosystem</h4><p>In case you missed it, check out the blogpost from January on the <a href="https://medium.com/flutter/progress-of-the-flutter-package-ecosystem-17cded9a0703">progress of the Flutter and Dart package ecosystem</a>.</p><h3>Deprecations and breaking changes</h3><h4>Dropping Windows 7 and 8 support</h4><p>As Flutter evolves, we’re excited to focus on the latest technologies by ending support for Windows 7 and 8 with Dart 3.3 and Flutter 3.19 releases. This shift, in line with Microsoft’s strategy, allows us to enhance Flutter on modern operating systems. We appreciate the adjustments required from our developers and are committed to assisting you through this transition. This move paves the way for a more secure, efficient, and feature-rich development environment on supported versions of Windows. Thank you for your understanding and adaptability as we continue to innovate together in the Flutter ecosystem.</p><h4>Impeller dithering flag</h4><p>As noted in the release notes for the 3.16 stable release, the global flag Paint.enableDithering has been <a href="https://github.com/flutter/engine/pull/46745">removed</a>. 
See the <a href="https://docs.flutter.dev/release/breaking-changes/paint-enableDithering">breaking change announcement</a> on the website for full details.</p><h4>Deprecate iOS 11</h4><p>Due to a <a href="https://github.com/flutter/flutter/issues/136060">runtime crash</a> when certain networking APIs were called, Flutter no longer supports iOS 11. This means that apps built against Flutter 3.16.6 and later won’t run on those devices.</p><h4>Deprecate auto render mode</h4><p><a href="https://docs.flutter.dev/release/breaking-changes">Breaking changes</a> in this release include deprecated APIs that expired after the release of v3.16. To see all affected APIs, along with additional context and migration guidance, see the <a href="https://docs.flutter.dev/release/breaking-changes/3-16-deprecations">deprecation guide for this release</a>. Many of these deprecations are supported by <a href="https://docs.flutter.dev/development/tools/flutter-fix">Flutter fix</a>, including quick fixes in the IDE. Bulk fixes can be evaluated and applied with the dart fix command line tool.</p><p>As always, many thanks to the community for <a href="https://github.com/flutter/tests/blob/master/README.md">contributing tests</a> — these help us identify these breaking changes. To learn more, check out <a href="https://github.com/flutter/flutter/wiki/Tree-hygiene#handling-breaking-changes">Flutter’s breaking change policy</a>.</p><p>This is the first release to adopt the flutter_driver package into the deprecation policy in addition to already supported packages, flutter and flutter_test.</p><h3>Conclusion</h3><p>As we highlighted the remarkable number of contributors at the start of this announcement, we did so with purpose. The evolution of Flutter into the powerful and efficient toolkit it has become is a direct testament to the dedication and hard work of our incredible community. 
A heartfelt thank you to each and every one of you.</p><p>To dive into the specifics of what has been achieved with this release, we invite you to view the <a href="https://docs.flutter.dev/release/release-notes/release-notes-3.19.0">release notes and change log</a> for a comprehensive list of additions in Flutter 3.19.</p><p>Flutter 3.19, alongside <a href="https://medium.com/dartlang/new-in-dart-3-3-extension-types-javascript-interop-and-more-325bf2bf6c13">Dart 3.3</a>, is now available on the stable channel. Embarking on this latest journey with Flutter is as straightforward as running flutter upgrade.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=58b1aae242d2" width="1" height="1" alt=""><hr><p><a href="https://medium.com/flutter/whats-new-in-flutter-3-19-58b1aae242d2">What’s new in Flutter 3.19</a> was originally published in <a href="https://medium.com/flutter">Flutter</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Starting 2024 strong with Flutter and Dart]]></title>
            <link>https://medium.com/flutter/starting-2024-strong-with-flutter-and-dart-cae9845264fe?source=rss----4da7dfd21a33---4</link>
            <guid isPermaLink="false">https://medium.com/p/cae9845264fe</guid>
            <category><![CDATA[flutter]]></category>
            <category><![CDATA[flutter-app-development]]></category>
            <category><![CDATA[announcements]]></category>
            <category><![CDATA[releases]]></category>
            <dc:creator><![CDATA[Brandon Badger]]></dc:creator>
            <pubDate>Thu, 15 Feb 2024 19:30:03 GMT</pubDate>
            <atom:updated>2024-02-15T19:30:02.985Z</atom:updated>
            <content:encoded><![CDATA[<p>An introduction, two new SDK releases, and bringing Flutter and Dart into the Gemini era</p><p><em>We’re excited to announce the first of this year’s quarterly SDK releases for Flutter and Dart — Flutter 3.19 and Dart 3.3, along with some exciting announcements involving AI.</em></p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*gO0yYMh4FUkpue3w" /></figure><p><strong>Flutter and Dart’s value and future</strong></p><p>I recently joined the Flutter and Dart team, and am excited to work with our developer community at a time when AI is moving quickly to enhance developer productivity and unlock new types of user experiences. I see endless potential in Flutter and Dart’s role in helping to shape this future. I’m equally inspired by the original vision for Flutter to improve the experience of building beautiful, performant, multi-platform apps for any device.</p><p>Judging by the millions of talented and creative developers who support Flutter with contributions to the framework, or by building amazing experiences — it’s clear others see this vision and are motivated to help. And the core mission remains the same: to deliver a strong language and framework pairing, enabling creative developers to build beautiful, rich, and performant apps for any device. Let’s do this together!</p><p><strong>Bringing Flutter and Dart into the Gemini era</strong></p><p>Today we launched the <a href="https://medium.com/flutter/harness-gemini-in-your-dart-and-flutter-apps-00573e560381">Google AI Dart SDK</a>, a new pub.dev package, <a href="https://pub.dev/packages/google_generative_ai">google_generative_ai,</a> and <a href="https://ai.google.dev/tutorials/dart_quickstart">supporting resources</a>; together these enable you to build your own generative AI-based features like smart chat bots, visual search engines, and image descriptions into Dart and Flutter apps using the Gemini API.
Flutter and Dart’s cross-platform capabilities and this new SDK make it easier for you to build interactive experiences across platforms.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*5hGoIuMtVnvrZ2id" /></figure><p>And this is only the beginning of the innovations that we’re bringing to Flutter and Dart development with AI. For instance, soon Flutter and Dart developers can copy Dart code directly from <a href="https://aistudio.google.com/?utm_source=flutter&amp;utm_medium=referral&amp;utm_campaign=blog_umbrella_announcement&amp;utm_content=">Google’s AI Studio</a> after honing prompts for your use case.</p><p>Learn more about the Google AI Dart SDK in the <a href="https://medium.com/flutter/harness-gemini-in-your-dart-and-flutter-apps-00573e560381">deep dive blog post</a>.</p><p>Many developers have already begun to bring Flutter and AI tools together in exciting ways:</p><ul><li>The team at <a href="https://leancode.co/">LeanCode</a> have used the Gemini model to build <a href="https://leancode.co/arb_translate">arb_translate</a>, a package that allows developers to perform translation tasks automatically</li><li>We Spot Turtles! have combined Flutter and AI in their mission to save sea turtles from extinction. They were recently featured in Google Play’s <a href="https://play.google.com/console/about/weareplay/">#WeArePlay campaign</a>. Take a look below.</li><li>AutoGPT, an experimental, open-source project that builds on top of large language models (LLMs), has a <a href="https://github.com/Significant-Gravitas/auto_gpt_flutter_client">Flutter client</a> that runs across iOS, Android, web, macOS, and Windows.</li></ul><p><a href="https://youtu.be/CfzhLOiczDQ?si=Qgc4Yb4Q9xKI6byF">#WeArePlay | Caitlin and Nicolas | We Spot Turtles! 
| Australia</a></p><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FCfzhLOiczDQ&amp;display_name=YouTube&amp;url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DCfzhLOiczDQ&amp;image=http%3A%2F%2Fi.ytimg.com%2Fvi%2FCfzhLOiczDQ%2Fhqdefault.jpg&amp;key=d04bfffea46d4aeda930ec88cc64b87c&amp;type=text%2Fhtml&amp;schema=youtube" width="854" height="480" frameborder="0" scrolling="no"><a href="https://medium.com/media/c589a07e2318b9c8d79af8b3e2dc5a53/href">https://medium.com/media/c589a07e2318b9c8d79af8b3e2dc5a53/href</a></iframe><p>As you explore the Gemini model’s capabilities, be sure to share the new and innovative experiences you’re building with us using the #BuildWithGemini hashtag.</p><p><strong>Two new SDK releases</strong></p><p>Aside from the excitement that AI brings, we remain focused on continuing to build a strong UI framework, capable of delivering any experience you want to build on any screen you want to build for. You’ll see progress towards that vision in today’s SDK releases, Flutter 3.19 and Dart 3.3.</p><p>These releases are focused on refinements and performance improvements that build upon the trajectory that Flutter and Dart set out <a href="https://medium.com/flutter/whats-next-for-flutter-b94ce089f49c">last year</a>.
In this Flutter release, you’ll find:</p><ul><li>Updates to our work to generate breakthrough graphics performance through our continued work on Impeller</li><li>Additional steps towards providing seamless integration between platforms with Flutter iOS native fonts and an early version of a deep linking web validator</li><li>Continued focus on the developer experience with updates to DevTools and a <a href="https://medium.com/flutter/progress-of-the-flutter-package-ecosystem-17cded9a0703?source=collection_home---4------1-----------------------">progress report on the Flutter package ecosystem</a></li><li>And finally, we’re excited to share progress on our mission to help define the future of the web with Wasm</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*2UG76__vsbp6NHtN" /></figure><p>Dart 3.3 in turn, introduces extension types, a new model for interacting with JavaScript on the web, and updates to our work to support access to more and better web libraries. You can learn more about each release in the blog posts for <a href="https://medium.com/flutter/whats-new-in-flutter-3-19-58b1aae242d2">Flutter 3.19</a> and <a href="https://medium.com/dartlang/dart-3-3-325bf2bf6c13">Dart 3.3</a>, respectively.</p><p><strong>2024 Strategy and Roadmap</strong></p><p>Each of these features is a small step in a larger journey we’re taking this year, and that you can see in our <a href="https://github.com/flutter/flutter/wiki/Roadmap">2024 roadmap</a>. As always, these roadmaps are born from a desire to be open about our plans as we know many of you consider Flutter and Dart to be essential components in your careers and businesses. 
That being said, progress can be difficult to predict, even with a plan in place.</p><p>And while we’ll do our best to continue to remain transparent as changes inevitably force us to shift focus and make tradeoffs, we want to highlight that there are more contributors to Flutter and Dart outside of Google than those of us employed here, meaning that the things mentioned in our roadmap are but a small portion of the thousands of changes that will come to Dart and Flutter this year.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*oAoUwrRrrYWIbu_u" /><figcaption>2024 roadmap</figcaption></figure><p><strong>Bringing it all together</strong></p><p>I’ve worked on many fun and innovative projects during my 17+ years at Google and YouTube, but this is the most enthusiastic I’ve felt about leaning into a new opportunity. I started my career as a software engineer, and my first job at Google was as the PM for the Maps API and Geo developer tools, so it’s great to get back to my developer roots.</p><p>Part of what drives my enthusiasm is, well, your enthusiasm. 
Just in 2024 so far, I’ve seen such amazing things come out of this community, including:</p><ul><li>Over 2,700 of you have joined our<a href="http://flutter.dev/global-gamers"> Global Gamers Challenge</a>, which we’re hosting in partnership with international advocacy firm Global Citizen to challenge you to use your skills to build Flutter games that inspire the world to live more sustainably.</li><li><a href="https://youtu.be/37qvcjmE51w">Superlist</a>, who announced version 1.0 yesterday, is using Flutter to redefine task management, note taking, and everything in between.</li></ul><p>In closing, my kids are starting to study Computer Science, and I’m motivated to help create the software development experience that will help the next generation change the world for the better.</p><p>This year promises to be an important one for creating that future, and I can’t wait to see the ways in which Flutter and Dart support it. As always, we’re incredibly grateful for your continued support and we can’t wait to see what we’ll build together. Until next time!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=cae9845264fe" width="1" height="1" alt=""><hr><p><a href="https://medium.com/flutter/starting-2024-strong-with-flutter-and-dart-cae9845264fe">Starting 2024 strong with Flutter and Dart</a> was originally published in <a href="https://medium.com/flutter">Flutter</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Sharing Flutter’s 2024 roadmap]]></title>
            <link>https://medium.com/flutter/sharing-flutters-2024-roadmap-22debd2bbd22?source=rss----4da7dfd21a33---4</link>
            <guid isPermaLink="false">https://medium.com/p/22debd2bbd22</guid>
            <category><![CDATA[flutter]]></category>
            <category><![CDATA[roadmaps]]></category>
            <category><![CDATA[announcements]]></category>
            <category><![CDATA[flutter-app-development]]></category>
            <dc:creator><![CDATA[Michael Thomsen]]></dc:creator>
            <pubDate>Thu, 15 Feb 2024 19:28:44 GMT</pubDate>
            <atom:updated>2024-02-15T19:28:44.068Z</atom:updated>
            <content:encoded><![CDATA[<p>As an open source project with a thriving community, we strive to be transparent about our plans, with everything from issues to design specifications being shared in the open. We’ve heard a great deal of interest in Flutter’s feature roadmap. These kinds of roadmaps can be challenging in terms of predictability, as those plans tend to shift and adapt throughout the year, but we still feel it’s important to share our overall plans, with the stated caveat: plans may change.</p><p>We’ve been publishing our roadmaps <a href="https://github.com/flutter/flutter/wiki/%5BArchive%5D-Old-Roadmaps">since 2020</a>, and <strong>today we’re sharing our</strong><a href="https://github.com/flutter/flutter/wiki/Roadmap"><strong> 2024 roadmap</strong></a>. This is a natural continuation of previous years’ work, still working towards our long-term goal of creating <em>the most popular, fastest growing, and highest-productivity multi-platform UI framework for building rich app experiences</em>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*8v__Z0fIVOBm4uxFnMLAVg.png" /></figure><p>Note that what we’re listing here is primarily content gathered from those of us who work on Flutter as employees of Google. By now non-Google contributors outnumber those employed by Google, so this is not an exhaustive list of all the new and exciting things that we hope will come to Flutter this year!</p><p>We’re <strong>immensely thankful</strong> for the community and your continued support. 
We can’t wait to see what you’ll build!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=22debd2bbd22" width="1" height="1" alt=""><hr><p><a href="https://medium.com/flutter/sharing-flutters-2024-roadmap-22debd2bbd22">Sharing Flutter’s 2024 roadmap</a> was originally published in <a href="https://medium.com/flutter">Flutter</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Harness the Gemini API in your Dart and Flutter Apps]]></title>
            <link>https://medium.com/flutter/harness-the-gemini-api-in-your-dart-and-flutter-apps-00573e560381?source=rss----4da7dfd21a33---4</link>
            <guid isPermaLink="false">https://medium.com/p/00573e560381</guid>
            <category><![CDATA[flutter]]></category>
            <category><![CDATA[google]]></category>
            <category><![CDATA[gemini]]></category>
            <category><![CDATA[ai]]></category>
            <category><![CDATA[dart]]></category>
            <dc:creator><![CDATA[Ander Dobo]]></dc:creator>
            <pubDate>Thu, 15 Feb 2024 14:51:44 GMT</pubDate>
            <atom:updated>2024-02-15T16:07:34.216Z</atom:updated>
            <content:encoded><![CDATA[<p><strong>Introducing the Google AI Dart SDK</strong></p><p>We’re thrilled to announce the launch of the Google AI Dart SDK for the Gemini API. The new pub.dev package, <a href="https://pub.dev/packages/google_generative_ai">google_generative_ai,</a> and <a href="https://ai.google.dev/tutorials/dart_quickstart">supporting resources</a> enable you to build your own generative AI-based features into Dart and Flutter apps through an idiomatic Dart integration with the Gemini API. It opens the door to a vast range of possibilities for building intelligent, performant applications for Android, iOS, web, macOS, Windows, and Linux from a single code base.</p><p>With the Google AI Dart SDK, you can:</p><ul><li><strong>Easily integrate generative AI features: </strong>Add advanced text generation, summarization, chat, and more to your Dart or Flutter apps with minimal setup.</li><li><strong>Tap into Google’s most capable and general model yet:</strong> The Gemini model draws on Google’s extensive research and development in machine learning, giving you access to generative AI capabilities that will continue to improve.</li><li><strong>Accelerate your AI-powered app development: </strong>Focus on your app logic and user experience, while the SDK handles the intricacies of interacting with AI models.</li><li><strong>Build cross-platform AI-powered apps:</strong> Easily create generative AI features across desktop, web, and mobile applications using Flutter.</li><li><strong>Use the Gemini API in 180+ countries and territories: </strong>Check the <a href="https://ai.google.dev/available_regions#available_regions">available regions</a> for the most current list of countries and regions where the Gemini API and Google AI Studio (described further below) are available.</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*-KkJmzvv3jNhh88TWxIBJg.png" /></figure><p><strong>What can you build?</strong></p><p>We
believe generative AI holds immense potential to help you achieve your app and business goals. And since the Gemini model is multimodal (it’s capable of processing information from multiple modalities, including images and text), it empowers you to be extremely creative. However, the first question we often get from app developers — and even from within our own team — is “What can I actually do with the Gemini API?” Here are a few examples of features you might create for your Dart or Flutter app:</p><ul><li><strong>Text summarization:</strong> Generate concise summaries of long articles, research papers, or website content from textual input.</li><li><strong>Smart chatbots:</strong> Build more engaging and human-like conversational interfaces, enhancing user experience in your applications.</li><li><strong>Visual search engine: </strong>Users can upload an image, and the app uses the Gemini API to return descriptions of what’s in the image, the style, and perhaps even how to make what’s in the image.</li><li><strong>Image descriptions for accessibility:</strong> Generate detailed text descriptions of uploaded images to aid users who are visually impaired.</li><li><strong>Diagram &amp; chart interpretation: </strong>Users can upload images of diagrams, charts, or graphs, and the Gemini API delivers a text-based analysis and explanation of the data.</li></ul><p>This list could go on because the possibilities are nearly endless!</p><figure><img alt="A screenshot of the Flutter sample app that uses the Google AI Dart SDK" src="https://cdn-images-1.medium.com/max/1024/0*7Zvr0YiN7O22wTOy" /><figcaption>A screenshot of the Flutter sample app that uses the Google AI Dart SDK</figcaption></figure><p><strong>Getting Started</strong></p><p>Check out the <a href="https://ai.google.dev/tutorials/dart_quickstart">Dart quickstart</a> for a detailed step-by-step guide on how to get set up. 
At a high level, here’s what you’ll do:</p><ol><li>Get a Gemini API key from Google AI Studio. Keep this key secure. We strongly recommend that you do not include the key directly in your code, or check files that contain the key into version control systems. While developing, we recommend using flutter run -d [DEVICE NAME] --dart-define=API_KEY=[YOUR API KEY] to run the app in an emulator/simulator, using your API key as an environment variable.</li><li>Add the Google AI Dart SDK to your Dart or Flutter app by running dart pub add google_generative_ai or flutter pub add google_generative_ai, respectively. This adds google_generative_ai as a dependency to your `pubspec.yaml` file.</li><li>Initialize the generative model in your code:</li></ol><pre>import &#39;dart:io&#39;;<br><br>import &#39;package:google_generative_ai/google_generative_ai.dart&#39;;<br><br>// Access your API key as an environment variable (see first step above)<br>final apiKey = Platform.environment[&#39;API_KEY&#39;];<br>if (apiKey == null) {<br>  print(&#39;No \$API_KEY environment variable&#39;);<br>  exit(1);<br>}<br><br>final model = GenerativeModel(model: &#39;MODEL_NAME&#39;, apiKey: apiKey);</pre><p>4. You can now start to explore using the Gemini API to implement different use cases.
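</p><p>For text-only prompts, the gemini-pro model together with the generateContent method is enough. Here is a minimal sketch (the prompt string is only illustrative):</p><pre>import &#39;dart:io&#39;;<br><br>import &#39;package:google_generative_ai/google_generative_ai.dart&#39;;<br><br>void main() async {<br>  // Access your API key as an environment variable (see first step above)<br>  final apiKey = Platform.environment[&#39;API_KEY&#39;];<br>  if (apiKey == null) {<br>    print(&#39;No \$API_KEY environment variable&#39;);<br>    exit(1);<br>  }<br>  // For text-only input, use the gemini-pro model<br>  final model = GenerativeModel(model: &#39;gemini-pro&#39;, apiKey: apiKey);<br>  final response = await model.generateContent(<br>    [Content.text(&#39;Write a short poem about Dart.&#39;)],<br>  );<br>  print(response.text);<br>}</pre><p>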
For example, when the prompt input includes both text and images, use the gemini-pro-vision model and the generateContent method to generate text output:</p><pre>import &#39;dart:io&#39;;<br><br>import &#39;package:google_generative_ai/google_generative_ai.dart&#39;;<br><br>void main() async {<br>  // Access your API key as an environment variable (see first step above)<br>  final apiKey = Platform.environment[&#39;API_KEY&#39;];<br>  if (apiKey == null) {<br>    print(&#39;No \$API_KEY environment variable&#39;);<br>    exit(1);<br>  }<br>  // For text-and-image input (multimodal), use the gemini-pro-vision model<br>  final model = GenerativeModel(model: &#39;gemini-pro-vision&#39;, apiKey: apiKey);<br>  final (firstImage, secondImage) = await (<br>    File(&#39;image0.jpg&#39;).readAsBytes(),<br>    File(&#39;image1.jpg&#39;).readAsBytes()<br>  ).wait;<br>  final prompt = TextPart(&quot;What&#39;s different between these pictures?&quot;);<br>  final imageParts = [<br>    DataPart(&#39;image/jpeg&#39;, firstImage),<br>    DataPart(&#39;image/jpeg&#39;, secondImage),<br>  ];<br>  final response = await model.generateContent([<br>    Content.multi([prompt, ...imageParts])<br>  ]);<br>  print(response.text);<br>}</pre><p>Explore the <a href="https://ai.google.dev/docs">Gemini API documentation</a> and check out the <a href="https://github.com/google/generative-ai-dart/tree/main/samples">Dart and Flutter sample apps</a> in the GitHub repo for detailed guides and examples on how to use the SDK for various use cases, or in <a href="https://dartpad.dev/?id=341bc46b2ed1d2055d357ab987ed5fc2">this sample app</a> in DartPad, which is a free, open-source online editor for Dart and Flutter snippets, now built with Flutter. 
Please report any issues or tell us about feature requests in the <a href="https://github.com/google/generative-ai-dart/issues/new/choose">generative-ai-dart GitHub repo</a>.</p><p><strong>Google AI Studio</strong></p><p>Alongside the SDK, <a href="https://aistudio.google.com/?utm_source=flutter&amp;utm_medium=referral&amp;utm_campaign=blog_gaidartsdk_announcment&amp;utm_content=">Google AI Studio</a> is a browser-based IDE for prototyping with generative models. It enables you to quickly iterate to develop prompts for your use case, and then get an API key to use in your app development. You can sign into Google AI Studio with your Google account and take advantage of the free quota, which allows 60 requests per minute. To help us improve product quality, when you use the free quota, your Google AI Studio input and output might be accessible to trained reviewers. This data is de-identified from your Google account and API key.</p><p>We will add Dart to Google AI Studio soon, so keep a lookout for the announcement! This will enable you to simply click on “Get code”, select a new Dart tab (which will be alongside the existing supported languages), and then “Copy” the Dart code to transfer your work to your IDE of choice.</p><figure><img alt="A screenshot of Google AI Studio" src="https://cdn-images-1.medium.com/max/1024/0*xhBJ20OLOQDtQ2xi" /><figcaption>Google AI Studio</figcaption></figure><p><strong>Share what you build!</strong></p><p>We look forward to seeing what you’ll build with Gemini, like the team at LeanCode who have used the Gemini API to build <a href="https://leancode.co/arb_translate">arb_translate</a>. 
It’s a package that helps developers to perform language translation automatically, streamlining localization in Flutter apps.</p><p>Use the hashtag #BuildWithGemini on Twitter/X to let us know what you’re building!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=00573e560381" width="1" height="1" alt=""><hr><p><a href="https://medium.com/flutter/harness-the-gemini-api-in-your-dart-and-flutter-apps-00573e560381">Harness the Gemini API in your Dart and Flutter Apps</a> was originally published in <a href="https://medium.com/flutter">Flutter</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Extreme UI Adaptability in Flutter — How Google Earth supports every use case on earth]]></title>
            <link>https://medium.com/flutter/extreme-ui-adaptability-in-flutter-how-google-earth-supports-every-use-case-on-earth-6db4661e7a17?source=rss----4da7dfd21a33---4</link>
            <guid isPermaLink="false">https://medium.com/p/6db4661e7a17</guid>
            <category><![CDATA[google-earth]]></category>
            <category><![CDATA[flutter]]></category>
            <category><![CDATA[adaptive-ui]]></category>
            <category><![CDATA[dart]]></category>
            <dc:creator><![CDATA[Craig Labenz]]></dc:creator>
            <pubDate>Wed, 07 Feb 2024 17:09:33 GMT</pubDate>
            <atom:updated>2024-02-13T17:22:35.523Z</atom:updated>
            <content:encoded><![CDATA[<h3>Extreme UI Adaptability in Flutter — How Google Earth supports every use case on earth</h3><p>When Google Earth set out to rewrite their mobile and web clients in Flutter, they knew they wanted to allow each and every one of their users to explore the planet however they liked, on whatever devices they owned. This had long been true to an extent; after all, Google Earth had existing web, desktop, Android, and iOS clients. But this rewrite, which would cover all of those targets other than desktop, would need to support the superset of existing use cases; plus a few new adaptability ideas the Earth team was excited to explore.</p><p>The search for Google Earth’s tech stack of the future was heavily informed by all-too-familiar sources of friction that had slowed development on their existing clients. Namely, Google Earth had long been forced to choose between developer velocity on new features and maintaining feature parity across their three unique codebases (web, Android, and iOS). Luckily, the center of the UI — the entirety of the pale blue dot in the middle of the screen — is powered by a C++ engine that already delivered a unified experience for <em>some</em> of Google Earth’s features. However, the rest of the UI chrome and menus were implemented separately across each codebase. This meant that any cross-platform choice would not only need to overhaul the UI development process, but also integrate with a large legacy engine in Android, iOS, and the web.</p><p>The deciding factors to use Flutter were twofold. First, integrating with the existing Google Earth engine proved to be a straightforward task using method channels. Second, Google Earth not only wanted to streamline their codebases, but also to reimagine their UI entirely. Any major UI overhaul is already something of a rewrite, and Google Earth opted to write one new Flutter app instead of performing surgery on three existing apps. 
This complicated the task, but the team committed to a clean break and obsessive focus on adaptability. In the end, the Google Earth team came to power their UI across three platforms with Flutter.</p><h3><strong>Defining adaptability</strong></h3><p>And so, the Google Earth team set out on an adventure to push the limits of UI adaptability. Prior art abounds on creating UIs tailored to different user journeys — dating back to the dawn of smartphones and the entire internet’s collective realization that most websites needed a rethink for small screens. Browser APIs and CSS patterns emerged to build websites with an awareness of their screen’s resolution; and those ideas have been prominent ever since. Even in Flutter’s earliest days, developers knew phone screens would vary and made their apps’ UIs depend on the screen’s resolution. And if that resolution changed — either because the user rotated their phone or resized their browser window, the app’s UI would <em>respond</em>. In Flutter, as in the web for years before it, <strong><em>responsive UIs</em> </strong>improved user experiences.</p><p>What then, you might wonder, is the difference between a <em>responsive</em> UI and an <em>adaptive</em> UI? 
Put simply, a <em>responsive</em> UI adjusts to changes in the amount and aspect ratio of available pixels; while an <em>adaptive</em> UI adjusts to <em>everything else.</em> Responsive UIs can grow and shrink individual UI elements based on screen real estate details, but adaptive UIs answer more fundamental questions like where to render the app’s navigation, whether list views should route to separate detail views or show them side-by-side with the list itself, and how the user’s connected peripherals should influence things like tap targets and hover states (more on this concept later).</p><blockquote>For more on this, watch <a href="https://youtu.be/HD5gYnspYzk?si=8AvuBRGXNRNET9dR">episode 15 of Decoding Flutter</a> on Adaptive vs Responsive UIs and see these guides from <a href="https://docs.flutter.dev/ui/layout/responsive/building-adaptive-apps">Flutter</a> and <a href="https://developer.android.com/develop/ui/views/layout/responsive-adaptive-design-with-views">Android’s</a> documentation.</blockquote><p>As anyone who’s written responsive CSS for a website will tell you, even simple UIs promise tricky edge cases. And to be clear, this is no fault of CSS; the problem space’s many states are so fine-grained as to almost feel analog. What then, should a UI developer expect when considering several additional variables, such as the device form factor and connected peripherals? Naturally, they should expect a fair increase in complexity.</p><p>This all came to a head when the behavior of an early prototype caught the Google Earth team off-guard. While playing with that early build, a Google Earth engineer shrunk their desktop web browser down to an extremely narrow width. Suddenly, typical desktop affordances like side navigation bars and tighter touch targets were replaced by mobile affordances, like a bottom navigation bar and larger, finger-friendly buttons. Their surprise was brief — after all, <em>that was exactly what they’d told their app to do</em>. 
The Google Earth team was now faced with a profound question — <em>Is this what a user would want?</em></p><p>Such was the terra incognita the Google Earth team was about to chart.</p><h3>Why adaptability?</h3><p>To some, the following content raises a meta-question: <em>Why bother with any of this in the first place? Is the ROI sufficient when surely a responsive UI will satisfy most users?</em></p><p>These are good questions, but they should not contribute toward hesitation with Flutter. Using a cross-platform UI framework like Flutter does not <em>introduce</em> adaptive UI concerns; it <em>unlocks adaptive UI solutions</em>. Beyond that, here are two considerations that suggest adaptive UIs really are that important:</p><ul><li>Screen resolutions don’t imply what they once did. Desktop browsers can have low DPI settings that a naive breakpoint check will mistake for mobile environments; high-DPI phones in landscape orientation can be mistaken for old tablet (or even desktop!) breakpoints; and foldable devices can alternate between showing your app full-screen and splitting screen real estate between multiple apps, leading to jarring differences if this careens a user back and forth across certain breakpoints.</li><li>Apps with distinct creation vs consumption modes (think any text composition app with Read and Edit experiences) can suffer heavily on mobile — and especially on tablets. Shipping a mobile-first, and thus likely consumption-first experience to smartphones and tablets greatly limits your power users with a tablet, Bluetooth keyboard, and mouse.</li></ul><h3>Delivering on adaptability</h3><p>The Google Earth team walked a long road of experimentation, user research, and iteration to arrive at the app they ultimately shipped.
But in the end, their problem space boiled down to three high-level questions:</p><ol><li>How should the app determine its initial UI strategy?</li><li>How and when should the app change its UI strategy?</li><li>How would the Google Earth team cleanly implement this logic?</li></ol><h3>Determining an initial UI strategy</h3><p>One of the Earth team’s early assumptions was that “there is no difference between a Chromebook with a touchscreen and a tablet with a connected Bluetooth keyboard”, and that their UI should not distinguish between the two. Although this idea seemed reasonable at first, it did not survive testing, and over time the Earth team increasingly realized the gaps in this approach. A user launching the app with a high-resolution tablet in landscape mode could find themselves within a desktop UI range of pixel resolution (following older, responsive UI rules). If that same user then rotated their tablet into portrait mode and in doing so shifted into a pixel resolution range assigned to tablets, Google Earth would be faced with a hard choice. The dynamic option would be to dramatically restructure everything by shifting from the desktop UI to the mobile UI; whereas the static option would be to do nothing except squish and compress the desktop UI until it fit within its new constraints. Neither of these options was satisfying, and it all meant that there <em>was</em> a difference between a Chromebook with a touchscreen and a tablet with a keyboard.</p><p>In the end, the Earth team settled on a simple rule: serve the mobile experience to smartphones and tablets, and the desktop experience to desktops.
If this seems anti-climactic, well, it sort of is; but only because it punts some of the juicy parts to the next question — <em>When should the UI’s initial strategy </em><strong><em>change?</em></strong></p><h3>Updating the UI strategy within a user session</h3><p>The Earth team’s first strategy for UI changes was little more than established responsive UI rules: show your mobile UI on any resolution below a minimum threshold, your tablet UI (if you have one) on the next few hundred possible widths, and lastly, your desktop UI on anything else. And, critically, when a UI crosses one of those thresholds for any reason, you re-render the app accordingly. Of course, this ruleset’s awkwardness launched Google Earth onto its odyssey of extreme adaptability; so it should be no surprise that the team abandoned this approach.</p><p>A second possibility came from Stadia, a fellow Google team with a successful Flutter mobile app. (Obviously, Stadia did not survive as a product; but that was not for lack of functionality in its mobile app!) Stadia’s approach was to make adaptive UI decisions based on which inputs were last touched. Drag your computer’s cursor or press a key, and Stadia would snap into its desktop UI mode. Conversely, if you tilted a joystick on a connected console controller, Stadia would snap into its console UI mode. However, while that made sense for Stadia, it proved less appropriate for Google Earth. A simple case ruled out this last-inputs-touched strategy: a tablet user pinching to zoom their map, then returning to a Bluetooth keyboard to finish typing content. No user would want two dramatic UI transitions during that simple interaction, so the user’s most recent inputs could not wholesale change Google Earth’s UI from mobile to desktop or back.</p><p>In the end, the Google Earth team settled on a second very simple rule: remain consistent within a session and never leave the initial UI flavor without the user’s explicit permission. 
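Encoded as state, that session-consistency rule might look like this sketch (hypothetical names; a ChangeNotifier is assumed, in the spirit of the provider-based snippets later in this post):

```dart
// Hypothetical sketch: the UI flavor is fixed at startup and only
// changes on an explicit request from the settings panel -- never
// from rotations, resizes, or input-device heuristics.
import 'package:flutter/foundation.dart';

enum Mode { mobile, desktop }

class UiModeController extends ChangeNotifier {
  UiModeController(this._mode);

  Mode _mode;
  Mode get mode => _mode;

  /// Called only from the settings panel.
  void userRequestedMode(Mode requested) {
    if (requested == _mode) return;
    _mode = requested;
    notifyListeners();
  }
}
```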
As explored earlier, Google Earth would show its mobile-first UI on smartphones and tablets and its desktop-first UI on desktops; and it would never outsmart itself and change that unless the user requested a change in the settings panel.</p><h3>Mixed-UI states</h3><p>UI consistency within sessions served Google Earth well, but it is not the whole story. UI affordances in desktop experiences like cursor hover effects lack any equivalent on mobile and must be reimagined. A user treating their touchscreen laptop like a tablet could be blocked entirely by an app’s failure to replace critical hover effects with alternatives suitable for mobile. This realization suggested a two-tier problem and solution. Google Earth’s UI would not only need to smoothly switch back and forth between its mobile and desktop experiences when a user requested, but individual controls would need to have both a touch-friendly form <em>and</em> a mouse-friendly form, regardless of the overarching strategy.</p><p>Finally, Google Earth knew what they were building. All of their research and iteration left only implementation questions, which amounted to:</p><ol><li>How to manage transitions between two fundamentally different UIs, and</li><li>How to build individual controls to support atypical peripherals</li></ol><h3>Managing multiple UIs</h3><p>At its simplest, getting any Flutter app to seamlessly switch between two different experiences is a matter of putting the following line somewhere in a widget’s build method:</p><pre>child: mode == Mode.desktop ? DesktopUI() : MobileUI()</pre><p>However, this strategy (which is what Google Earth uses) implies some extra work elsewhere to fully realize it. The issue — initially obscure — surfaces when <em>any</em> application state is stored within a StatefulWidget, as toggling that `mode` variable completely replaces the widget tree, destroying all State objects and any information they hold. 
There are two layers to this problem.</p><p>To imagine the first layer, consider a screen that has multiple panels on desktop, but reorganizes each of those panels into a tab bar experience on mobile. A mobile user will have an active tab, but that concept has no equivalent on desktop. Storing the active tab index within a StatefulWidget (an idiomatic decision in Flutter!) would always reset a mobile user’s position to the default tab after toggling back and forth through the desktop UI. The solution to this involves moving any primitive application state — strings, integers, and the like — out of StatefulWidgets and into your state management classes. This way, no shenanigans in your widget tree can reset critical values.</p><p>The problem’s second layer comes from application state less easily pulled out of the widget tree, like TextEditingControllers or ScrollControllers. The situation looks like this: you have a ListTile with a TextField, but any time the user touches their mouse or touchscreen, you rebuild that ListTile to accommodate the user’s latest peripherals. Without intervention, this would cause Flutter to destroy the entire part of the Widget and Element trees containing the old TextField, taking with them any controllers holding the user’s work. You might be tempted to treat these as primitives (TextEditingControllers as strings and ScrollControllers as doubles) and repeat the above solution; but controllers are too rich to easily serialize in this way (cursor position and text selection, anyone?).</p><p>To solve this problem, Google Earth uses GlobalKeys to have the framework “reparent” highly-scoped widgets after a fresh layout. The following AdaptableTextInput widget is tightly scoped to its TextField and TextEditingController. 
Supplying the same GlobalKey to that AdaptableTextInput widget across UI-changing rebuilds will keep the TextEditingController alive, saving your users’ work.</p><pre>class AdaptableTextInput extends StatefulWidget {<br><br>  // Supply a consistent GlobalKey here!<br>  const AdaptableTextInput({super.key, required this.mode});<br><br>  final Mode mode;<br><br>  @override<br>  State&lt;AdaptableTextInput&gt; createState() =&gt; _AdaptableTextInputState();<br>}<br><br>class _AdaptableTextInputState extends State&lt;AdaptableTextInput&gt; {<br><br>  final _controller = TextEditingController();<br>  final String helpText = &#39;I clarify this text input!&#39;;<br><br>  @override<br>  void dispose() {<br>    _controller.dispose();<br>    super.dispose();<br>  }<br><br>  @override<br>  Widget build(BuildContext context) {<br>    if (widget.mode == Mode.desktop) {<br>      return Tooltip(<br>        message: helpText,<br>        child: TextField(controller: _controller),<br>      );<br>    } else {<br>      return Column(<br>        children: &lt;Widget&gt;[<br>          TextField(controller: _controller),<br>          Text(helpText),<br>        ],<br>      );<br>    }<br>  }<br>}</pre><h3>Navigation</h3><p>Navigation stacks and the app’s Back button also require special attention. Continuing with the above example of a desktop UI that shows multiple panels at once, now imagine a complementary mobile UI that presents those panels in a stack-like UI with forward and backward navigation. 
Allowing desktops to use the mobile UI, and phones to use the desktop UI, was one of the big adaptability ideas Google Earth wanted to pursue.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*nHhbCY1mlIwlvQI60jaJHA.png" /><figcaption>A grid of UIs, showing a desktop UI on both desktop and mobile devices, and a mobile UI on both desktop and mobile devices</figcaption></figure><p>If a desktop-UI user is on the red panel when they switch to the mobile UI, the Back button won’t automatically be wired up, because the navigation stack will be reset. This means your desktop UI needs to account for extra information technically only needed by the mobile UI, because at any moment the mobile UI could be asked to take over.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*RXf-GaJ4uz5_ZKU8QgIVkw.png" /><figcaption>A desktop device rendering the same UI in two different modes — one typical of desktop, and one typical of mobile</figcaption></figure><p>Luckily, GoRouter’s declarative routing API can help. Create two separate routing declarations and switch to the appropriate route when your user toggles UI modes. 
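Swapping between the two declarations might look like this sketch (a hypothetical root widget; `desktopRouter` and `mobileRouter` are the two GoRouter declarations shown in this section, and `mode` is assumed to come from app state):

```dart
import 'package:flutter/material.dart';
import 'package:go_router/go_router.dart';

enum Mode { mobile, desktop }

// Hypothetical root widget: toggling `mode` swaps the entire
// routing declaration, not just a single page.
class AdaptiveApp extends StatelessWidget {
  const AdaptiveApp({
    super.key,
    required this.mode,
    required this.desktopRouter,
    required this.mobileRouter,
  });

  final Mode mode;
  final GoRouter desktopRouter;
  final GoRouter mobileRouter;

  @override
  Widget build(BuildContext context) {
    return MaterialApp.router(
      routerConfig: mode == Mode.desktop ? desktopRouter : mobileRouter,
    );
  }
}
```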
In this situation, if the desktop UI has tracked the user’s last activity to the red panel when a request to activate the mobile UI comes in, calling `mobileRouter.go('/home/blue/red')` will create a navigation stack with a synthetic history, allowing the user to press the Back button to escape the red screen.</p><pre>final desktopRouter = GoRouter(<br>  routes: [<br>    GoRoute(<br>      path: &#39;/home&#39;,<br>      builder: (context, state) =&gt; FourPanels(),<br>    ),<br>  ],<br>);<br><br>final mobileRouter = GoRouter(<br>  routes: [<br>    // One route for each panel, configured to wire up<br>    // the Back button if a user arrives on one of the nested panels<br>    GoRoute(<br>      path: &#39;/home/blue&#39;,<br>      builder: (context, state) =&gt; BluePanel(),<br>      routes: [<br>        GoRoute(<br>          path: &#39;red&#39;,<br>          builder: (context, state) =&gt; RedPanel(),<br>          routes: [<br>            GoRoute(<br>              path: &#39;green&#39;,<br>              builder: (context, state) =&gt; GreenPanel(),<br>              routes: [<br>                GoRoute(<br>                  path: &#39;yellow&#39;,<br>                  builder: (context, state) =&gt; YellowPanel(),<br>                ),<br>              ],<br>            ),<br>          ],<br>        ),<br>      ],<br>    ),<br>  ],<br>);</pre><p>Highly adaptive UIs like Google Earth’s require an implementation that treats all possible scenarios as always in play, even though only one given UI is ever being rendered. 
This means that the app must always be able to <em>completely</em> reconstruct its state from resources you completely control — whether that is because you have GlobalKeys to retain State objects holding important information, or because you’ve stored all relevant details in your state management classes.</p><h3>Adapting to user inputs</h3><p>All of this left only one more tricky adaptability problem: ensuring controls across the UI were amenable to the user’s last-used peripherals and not just the reigning UI strategy. After all, if a tablet user started clicking a Bluetooth mouse, Google Earth wasn’t going to wholesale switch to their desktop UI, but they <em>did</em> want to slightly tweak elements to leverage a keyboard and mouse’s strengths.</p><p>Merely using Flutter meant Google Earth was off to a good start here. Imagine the alternative: an app that is split across three codebases (JavaScript for desktop via web, and Swift and Kotlin for mobile), when the Swift and Kotlin teams realize that it would be awfully nice if, <em>in some scenarios</em>, they could borrow elements from the JavaScript app’s UI. Maybe what they need can be re-implemented simply enough; or maybe not. Either way, in a Flutter app, the existing tool you want to borrow is always in the same codebase.</p><p>But code sharing is not code organization, and the question of how to implement this coherently remained. Here, the Google Earth team turned to an old Flutter staple: the builder pattern.</p><pre>/// High level categories of user inputs.<br>enum InputType { gamepad, keyboardAndMouse, touch }<br><br>/// Builds a widget tree that depends on the user&#39;s current [InputType].<br>class InputTypeBuilder extends StatelessWidget {<br>  /// Called when the [InputType] data updates.<br>  final Widget Function(BuildContext, InputTypeModel, Widget?)
builder;<br><br>  /// Constructs a wrapping widget that will invoke the [builder] whenever<br>  /// [InputType] changes.<br>  ///<br>  /// See [InputTypeModel] for details on how to change [InputType].<br>  const InputTypeBuilder({<br>    Key? key,<br>    required this.builder,<br>  }) : super(key: key);<br><br>  @override<br>  Widget build(BuildContext context) {<br>    return Consumer&lt;InputTypeModel&gt;(<br>      builder: (context, inputTypeModel, child) {<br>        return builder(<br>          context,<br>          inputTypeModel,<br>          child,<br>        );<br>      },<br>    );<br>  }<br>}</pre><p>A widget like InputTypeBuilder listens to a top-level mechanism, the InputTypeModel, which itself subscribes to the Flutter Engine for updates on the last-used input. InputTypeModel.inputType is a value of the InputType enum. With that, child widgets can make localized decisions about how to render themselves in light of how the user is currently interacting with the app. For example, if you had been using a mouse, but then tapped your finger on the touch screen, affordances that were once only revealed by the cursor’s hover effect would now appear all over the app. And similarly, if you switched back to using the mouse, this InputTypeBuilder would allow the app to reverse the change.</p><pre>@override<br>Widget build(BuildContext context) {<br>  return InputTypeBuilder(<br>    builder: (context, inputTypeModel, child) {<br>      // isHovered is assumed to be tracked elsewhere in this State.<br>      final bool isHoveredOrTouchInput = isHovered || inputTypeModel.inputType == InputType.touch;<br>      return Row(<br>        children: &lt;Widget&gt;[<br>          isHoveredOrTouchInput ?
DragIndicator() : Container(),<br>          RestOfRow(),<br>        ],<br>      );<br>    },<br>  );<br>}</pre><p>The following gif shows Google Earth’s desktop UI (running in Chrome), nimbly adjusting to the user alternating between touchscreen and mouse actions.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/810/1*7hP0qPviv5Hrh7U82WWhQg.gif" /><figcaption>Google Earth’s UI swapping between typical desktop and mobile affordances as the end-user interacts with different peripherals</figcaption></figure><h3>Conclusion</h3><p>The biggest unexpected wins rebuilding Google Earth with Flutter came to users of the tweener environments — tablets and the web. Caught awkwardly between phones and laptops, tablets can physically support both types of experiences but rarely enjoy the software flexibility to match. Similarly, web experiences can be loaded on any device, and on desktop, browsers can be arbitrarily resized. Depending on the app, all of this can imply radically different UIs. For most development teams with separate codebases for each build target, fully supporting users caught in these limbo states is a non-starter. 
(Imagine convincing your boss to spend the time rebuilding your entire desktop UI on mobile, just in case a tablet user wants it!)</p><p>But, as the Google Earth team found, while building a fully adaptive UI in one codebase did imply extra complexity, it was dwarfed by the user experience improvements gained by meeting each and every user exactly where they were.</p><p>You can try Google Earth’s new Flutter implementation today by downloading the app on Android or iOS, or visiting <a href="https://earth.google.com">https://earth.google.com</a>.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=6db4661e7a17" width="1" height="1" alt=""><hr><p><a href="https://medium.com/flutter/extreme-ui-adaptability-in-flutter-how-google-earth-supports-every-use-case-on-earth-6db4661e7a17">Extreme UI Adaptability in Flutter — How Google Earth supports every use case on earth</a> was originally published in <a href="https://medium.com/flutter">Flutter</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Progress of the Flutter Package Ecosystem]]></title>
            <link>https://medium.com/flutter/progress-of-the-flutter-package-ecosystem-17cded9a0703?source=rss----4da7dfd21a33---4</link>
            <guid isPermaLink="false">https://medium.com/p/17cded9a0703</guid>
            <category><![CDATA[flutter]]></category>
            <category><![CDATA[pubdev]]></category>
            <category><![CDATA[dartlang]]></category>
            <dc:creator><![CDATA[Ander Dobo]]></dc:creator>
            <pubDate>Mon, 22 Jan 2024 20:05:00 GMT</pubDate>
            <atom:updated>2024-01-22T20:04:59.909Z</atom:updated>
            <content:encoded><![CDATA[<h3>Progress of the Flutter and Dart Package Ecosystem</h3><p>The Flutter and Dart package ecosystem remains a key enabler for developers to build beautiful, performant apps for any screen from a single codebase. The ecosystem grew 26% in 2023 from 38,000 packages in January to 48,000 at the end of December.</p><figure><img alt="Line chart depicting the growth in number of packages on pub.dev in 2023" src="https://cdn-images-1.medium.com/max/1024/0*tkq_040X0xa_Noms" /></figure><p>Pub.dev now has more than 700,000 monthly active users as of January 2024. The Flutter team remains keen and committed to supporting this growth into the future and to enabling developers to build with and contribute to Flutter and Dart. In this update, we’ll take a look at the newest Flutter Favorites and the results of the Package Ecosystem Virtual Summit, and share some notable updates and things to know.</p><h3>New Flutter Favorites</h3><figure><img alt="Flutter Favorite logo" src="https://cdn-images-1.medium.com/max/150/0*gI42oCiw6spGZfOy" /></figure><p>The <a href="https://docs.flutter.dev/packages-and-plugins/favorites">Flutter Favorites</a> program recognizes and helps developers discover the highest quality packages to consider using in their apps. We’re pleased to announce seven new Flutter Favorite packages that have demonstrated exceptional quality, popularity, and community engagement, making them invaluable tools for Flutter developers. Let’s dive into each:</p><ol><li><a href="https://pub.dev/packages/flame"><strong>flame</strong></a><strong>:</strong> A high-performance 2D game engine for Flutter. Its intuitive API and rich feature set make it an ideal choice for creating visually stunning and engaging games. 
Check out <a href="https://codelabs.developers.google.com/codelabs/flutter-flame-game">this codelab</a> to try your hand at building a game with flame.</li><li><a href="https://pub.dev/packages/flutter_animate"><strong>flutter_animate</strong></a><strong>: </strong>Bring your UI to life with this powerful animation library that simplifies complex animations and makes them accessible to all Flutter developers. Its declarative syntax and extensive documentation make it a breeze to create smooth and expressive animations.</li><li><a href="https://pub.dev/packages/riverpod"><strong>riverpod</strong></a><strong>:</strong> An elegant package that offers a powerful and intuitive approach to managing application state. Its streamlined API, performance, scalability, and testability make it a compelling choice for modern Flutter apps.</li><li><a href="https://pub.dev/packages/video_player"><strong>video_player</strong></a><strong>:</strong> Essential for anyone looking to integrate video playback in their Flutter applications. It provides a widget to display video content. It supports a wide range of formats and sources, including network, asset, and file-based videos. This makes it a versatile tool for building multimedia-rich Flutter apps.</li><li><a href="https://pub.dev/packages/macos_ui"><strong>macos_ui</strong></a><strong>:</strong> For developers targeting macOS, this package enables creation of applications with a design that feels right at home on that platform. It provides an extensive collection of widgets and components that are styled according to the macOS design language, ensuring that your Flutter app not only runs well on macOS, but also looks and feels native.</li><li><a href="https://pub.dev/packages/fpdart"><strong>fpdart</strong></a><strong>:</strong> This package enables functional programming in Dart. 
It’s great for implementing business logic, for instance, where functional programming paradigms like immutability, pure functions, and higher-order functions, as well as fpdart’s use of Dart’s type system, help in building more maintainable and predictable code.</li><li><a href="https://pub.dev/packages/flutter_rust_bridge"><strong>flutter_rust_bridge</strong></a><strong>:</strong> For developers seeking to leverage the best of Rust and Flutter in their application, flutter_rust_bridge provides a seamless bridge between the two worlds. It enables native Rust code to interact with Flutter seamlessly, unlocking the potential of Rust’s performance and memory safety in Flutter apps.</li></ol><h3>Sunsetting the Happy Paths program</h3><p>We decided to sunset the Happy Paths program to enable a more dedicated focus on Flutter Favorites. The vision of Happy Paths recommendations was to help you make informed decisions when finding and using packages to add functionality to your app. We are fortunate to have community initiatives such as <a href="https://fluttergems.dev/">Flutter Gems</a> that are comprehensive resources for navigating well-categorized package options. As we focus on the Flutter Favorites program, we will continue to evolve it with input and feedback from the Flutter and Dart community.</p><h3>Package Ecosystem Virtual Summit</h3><figure><img alt="The Flutter and Dart Ecosystem Virtual Summit 2023 landing page" src="https://cdn-images-1.medium.com/max/1024/0*NzfQsOUhVaeUcGB8" /></figure><p>At the end of August 2023, we held a first-time <a href="https://rsvp.withgoogle.com/events/flutter-package-ecosystem-summit-2023">virtual summit</a> for the Flutter and Dart package ecosystem, attended by more than 50 non-Googler and Googler contributors to <a href="https://pub.dev/">pub.dev</a>. 
We started with a relatively small invitee list to fit the unconference-style format, and to learn from this first-time event before figuring out what it might look like in the future. The goal was to bring contributors together in unconference-style discussions to plan, educate, learn, and share amongst the community. We had three discussion sessions, each on topics that were voted on by registered attendees in the weeks leading up to the summit. The three discussion topics were 1) Building high quality packages — best practices and challenges, 2) Maintaining packages long term — sustainable models, and 3) Flutter and Dart DevTools Extensions. Respondents to the post-event survey gave us insightful feedback that we’ll incorporate in future event planning. Thank you! Overall, we consider this first summit a success. Going forward, we’re keen to partner with the community on similar standalone events, or sessions focused on the Flutter and Dart ecosystem, set within more general events.</p><h3>Updates to the Pigeon package</h3><p>The <a href="https://pub.dev/packages/pigeon">Pigeon package</a> is a code generation tool that streamlines setting up the communication between your Flutter app and platform-specific code. This makes Pigeon useful both 1) when writing custom integrations directly between a Flutter app and platform-native APIs, such as in an <a href="https://docs.flutter.dev/add-to-app">add-to-app</a> scenario, and 2) when writing a <a href="https://docs.flutter.dev/packages-and-plugins/developing-packages#types">Flutter plugin</a> to provide a Dart API surface for platform-native APIs. It’s maintained by the Flutter team, which has made the following notable improvements to the package over this year:</p><ul><li>Added support for Swift, Kotlin, and C++ (C++ unlocked Windows support).</li><li>Null safety is now enforced.</li><li>Expanded support for primitive data types. 
For example, enums were added as a supported type.</li><li>Added nullable parameters.</li><li>Added error handling on host and Flutter APIs.</li><li>Improved the ergonomics of the tools to make them easier and more intuitive to use. For example, we added support for default parameters and named parameters.</li></ul><p>There are a lot more developments between v5.0.0 in January and v15.0.2 in December than we can list here, so check out all the changes in the <a href="https://pub.dev/packages/pigeon/changelog">change log</a>!</p><h3>Packages in DartPad</h3><p><a href="https://dartpad.dev/">DartPad</a> supports a fixed set of packages that you can view by clicking the info icon in the bottom, right-hand corner of the screen. The Flutter and Dart team at Google reviews and prioritizes package requests on an ongoing basis. If you’d like a package to be added to DartPad, add your thumbs up to an <a href="https://github.com/dart-lang/dart-pad/issues?q=is%3Aissue+is%3Aopen+label%3Asuggested-package+sort%3Areactions-%2B1-desc">existing package suggestion</a>, if there is one, or <a href="https://github.com/dart-lang/dart-pad/issues/new?assignees=&amp;labels=&amp;projects=&amp;template=everything-else.md&amp;title=">open a new issue</a> with your suggestion.</p><figure><img alt="Screenshot of packages on Dartpad.dev" src="https://cdn-images-1.medium.com/max/1024/0*U2blC-2k01FIuCsZ" /></figure><h3>Proposal for canonical topics on pub.dev</h3><p>In 2023 we launched the ability for package authors to tag their package with 1–5 free text topics in the pubspec file. The goal was to improve discovery of packages by potential users by adding a form of package categorization. We’ve seen a healthy uptake of the feature with many packages tagged. We’re exploring a proposal to improve the feature by merging topics that are effectively the same (For example, widget and widgets). 
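For reference, tagging a package looks like this in its pubspec.yaml (a sketch with illustrative values, not a real package; topics accepts one to five entries):

```yaml
# Illustrative pubspec.yaml excerpt showing the free-text topics
# field that powers topic-based discovery on pub.dev.
name: my_adaptive_widgets
description: Example package metadata, not a real package.
version: 1.0.0
topics:
  - widgets
  - adaptive-ui
```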
We invite the community to share feedback or contribute PRs to this <a href="https://github.com/dart-lang/pub-dev/issues/7263">canonicalize topics issue</a>.</p><figure><img alt="An example of Topics on a package on pub.dev" src="https://cdn-images-1.medium.com/max/1024/0*BhBIcosj4JSij9xk" /></figure><p>That’s it for now! To engage with the amazing community of package authors, check out the <a href="https://discord.com/channels/608014603317936148/1014208569706561567">#package-authors</a> Discord channel (you first need to join the <a href="https://github.com/flutter/flutter/wiki/Chat">Flutter Discord server</a>).</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=17cded9a0703" width="1" height="1" alt=""><hr><p><a href="https://medium.com/flutter/progress-of-the-flutter-package-ecosystem-17cded9a0703">Progress of the Flutter Package Ecosystem</a> was originally published in <a href="https://medium.com/flutter">Flutter</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Announcing the #GlobalGamers Challenge]]></title>
            <link>https://medium.com/flutter/announcing-the-globalgamers-challenge-2c5315c87898?source=rss----4da7dfd21a33---4</link>
            <guid isPermaLink="false">https://medium.com/p/2c5315c87898</guid>
            <category><![CDATA[games]]></category>
            <category><![CDATA[flutter-app-development]]></category>
            <category><![CDATA[flutter]]></category>
            <category><![CDATA[game-development]]></category>
            <dc:creator><![CDATA[Kelvin Boateng]]></dc:creator>
            <pubDate>Tue, 09 Jan 2024 19:46:22 GMT</pubDate>
            <atom:updated>2024-01-09T19:46:21.963Z</atom:updated>
            <cc:license>http://creativecommons.org/licenses/by/4.0/</cc:license>
            <content:encoded><![CDATA[<h3>Build epic Flutter games to assist in the battle to defend the planet</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*e6DUZrDIvURyxTW2" /></figure><p>We know Flutter devs love a good <a href="https://flutter.dev/events/puzzle-hack">challenge</a>, so just in time for the new year, we’re excited to announce the next Flutter challenge!</p><p><a href="http://flutter.dev/global-gamers">The Global Gamers Challenge</a> is an 8-week contest to design, build, and publish sustainable games, sponsored by Flutter and <a href="https://www.globalcitizen.org/en/">Global Citizen</a>. The contest’s winners will receive a trip to NYC in September 2024 to meet some of the Flutter team for a day of workshops and mentorship, and then celebrate their achievements with up to 60,000 other Global Citizens at Global Citizen Festival 2024.</p><h3>What are sustainable games?</h3><p>Sustainable games use the power of play to inspire positive environmental action. Imagine Candy Crush, but instead of crushing candies, you’re crushing plastic pollution! Or, picture a game like <a href="https://superdash.flutter.dev/">Super Dash</a>, but instead of collecting acorns, you’re in Dash’s home figuring out how to cool the house by optimizing a path for airflow using windows instead of the AC. Here are some more ideas inspired by Global Citizen campaigns running right now:</p><ol><li><a href="https://www.reuters.com/markets/commodities/world-cant-afford-us-style-home-energy-consumption-habits-2023-05-19/"><strong>Encourage a reduction in home energy use</strong></a><strong><br></strong>Data shows that American homes average three times more electricity use than typical homes across the rest of the globe. 
Can you build a game that helps to reduce reliance on inefficient energy sources?</li><li><a href="https://www.ukri.org/what-we-do/browse-our-areas-of-investment-and-support/understanding-plastic-pollution-impact-on-marine-ecosystems-in-southeast-asia/"><strong>Encourage a reduction in use of single-use plastics</strong></a><strong><br></strong>Southeast Asia has some of the highest levels of plastic pollution in the world. Consider building a game that encourages someone to make a swap, like opting for a reusable water bottle over a single-use one.</li><li><a href="https://www.timeout.com/travel/best-public-transport-in-the-world"><strong>Encourage use of public transportation for short distances and overland options for longer distances</strong></a><strong><br></strong>Europe is home to many of the world’s greatest public transportation systems. Effective games can encourage people to use local public transit for shorter distances, and overland transport (like trains rather than planes) for longer distances.</li></ol><p>We’re confident that games can encourage players to take small, real-life actions that add up to a large impact for the environment. 
In fact, <a href="https://www.globalcitizen.org/en/categories/defend-the-planet/">if you’d like to take direct action on these items, check out the campaigns Global Citizen is running now</a>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/600/0*BI7eVnRDmTOmFkaQ" /></figure><h3>Why join the Global Gamers Challenge?</h3><h3>Defend the planet</h3><p>This contest was inspired by the <a href="https://www.playing4theplanet.org/">Playing for the Planet Alliance</a>, a United Nations-facilitated alliance of 50 game studios and companies, <a href="https://blog.google/around-the-globe/google-europe/sustainable-gaming-with-the-playing-for-the-planet-alliance/">including Google</a>, with a mission to reduce the industry’s environmental impact and leverage the power of gaming to bring awareness to and coordinate action for important environmental issues like climate change, biodiversity loss, and pollution.</p><p>Games produced by members of the Playing for the Planet Alliance have a combined reach of over 1 billion people. Through the Alliance’s flagship initiative, the <a href="https://www.playing4theplanet.org/green-game-jam-2023">Green Game Jam</a>, participating games have created real-world impact by raising funds to plant 2.75 million trees and raising roughly $1,500,000 USD to protect wildlife and support environmental causes.</p><p>We’ve partnered with the Playing for the Planet Alliance to source <a href="https://igda-website.s3.us-east-2.amazonaws.com/wp-content/uploads/2022/04/06100719/EnvironmentalGameDesignPlaybook_Alpha_Release_Adj.pdf">best practices for environmental games</a> and added them to the resources kit provided as part of this challenge. 
This is your chance to build something that helps protect our planet!</p><h3>Learn something new</h3><p>Whether you’re a Flutter developer who’s new to games, a game developer who is new to Flutter, or someone new to both game development and Flutter, you’re bound to learn a few tricks while creating a positive impact on the world.</p><h3>Details</h3><h3>Timeline</h3><p>All projects must be submitted by March 5, 2024, 2:59pm PT (GMT -8). A Top 20 will be announced in late March 2024, and final winners will be announced in May 2024.</p><h3>Submission guidelines</h3><p>Registration and entry submission instructions can be found at <a href="http://globalgamers.devpost.com/">DevPost</a>.</p><h3>Awards</h3><p>Winners will be judged based on criteria like:</p><ul><li>Originality and creativity</li><li>Sustainable action and story</li><li>Use of animation</li><li>Effective multi-platform deployment</li></ul><p>We won’t just award great coding ability, though. We also have prizes for great ideas, demo videos, educational content for your game, and more!</p><h3>Resources</h3><p>We’ve compiled a bunch of <a href="http://flutter.dev/global-gamers/#resources">resources</a> to help you build a game, including a <a href="http://flutter.dev/global-gamers/#guide">guide</a> to navigating this challenge, kind of like a game map. It shows you the resources you need, helps you become a Global Citizen through a new <a href="https://glblctzn.co/hQ5oyPFbcGb">learning journey</a> in the Global Citizen application, and contains instructions on how to register and submit your game.</p><h3>Teams</h3><p>We recommend completing this challenge as part of a team. 
When you <a href="http://globalgamers.devpost.com/">register for the challenge</a> on Devpost, you’ll be able to share your skillset, your team status, and your ideas.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*fJPnnrD19GKt6MiK" /></figure><p>So, whether you have a great idea but need teammates with technical skills to bring it to life, or if you have the technical skills, but want a great idea, make sure to fill out your profile accordingly and then peruse the <a href="https://globalgamers.devpost.com/participants">participants tab</a> and look for folks whose profile details match what you’re looking for!</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*L51Psa66pNh99Rsd" /></figure><p>We’re particularly excited for the ideas that will come from technologists and activists working together towards a common goal. If you need extra support in finding a teammate, read over this <a href="https://help.devpost.com/hc/en-us/articles/360022031411-Participants-page-forming-a-team">help article</a>.</p><p>Finally, note that teams can be any size, but only 3 people from a team will be able to travel to New York City should the project be selected as a finalist.</p><h3>Get started and stay connected</h3><p>Check out the official contest site at <a href="https://flutter.dev/global-gamers">flutter.dev/global-gamers</a> for everything you need to know. You can also visit <a href="https://globalgamers.devpost.com/">DevPost</a> to register and submit your game.</p><p>Submissions close on March 5th, so don’t wait! 
We can’t wait to see what you build!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=2c5315c87898" width="1" height="1" alt=""><hr><p><a href="https://medium.com/flutter/announcing-the-globalgamers-challenge-2c5315c87898">Announcing the #GlobalGamers Challenge</a> was originally published in <a href="https://medium.com/flutter">Flutter</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[How we built it: Ask Dash — A generative AI Flutter application]]></title>
            <link>https://medium.com/flutter/how-we-built-it-ask-dash-a-generative-ai-flutter-application-79a836ced058?source=rss----4da7dfd21a33---4</link>
            <guid isPermaLink="false">https://medium.com/p/79a836ced058</guid>
            <category><![CDATA[flutter]]></category>
            <category><![CDATA[animation]]></category>
            <category><![CDATA[ai]]></category>
            <category><![CDATA[flutter-app-development]]></category>
            <category><![CDATA[cloud]]></category>
            <dc:creator><![CDATA[Very Good Ventures Team]]></dc:creator>
            <pubDate>Wed, 13 Dec 2023 18:36:03 GMT</pubDate>
            <atom:updated>2023-12-13T18:36:03.049Z</atom:updated>
            <content:encoded><![CDATA[<h3>How we built it: Ask Dash — A generative AI Flutter application</h3><p>As part of the <a href="https://cloudonair.withgoogle.com/events/summit-applied-ml-summit-23?talk=t1_s5_vertexaisearchandconversation">Google Cloud Applied AI Summit</a>, the Flutter and Vertex AI teams collaborated with <a href="https://verygood.ventures/">Very Good Ventures</a> to create an AI-powered Flutter demo app, <a href="https://github.com/VeryGoodOpenSource/dash_ai_search">Ask Dash</a>, using <a href="https://cloud.google.com/vertex-ai-search-and-conversation">Vertex AI Search and Conversation</a> by Google Cloud. Vertex AI Search and Conversation empowers you to build and deploy search and conversational applications quickly with little to no experience in AI. Flutter was a great way to build a beautiful, customized search experience to show how both products can be used to build powerful applications in only a few short weeks!</p><p>Vertex AI Search and Conversation allows you to create applications that interact with your data with personalized responses demonstrating the power of generative AI. Most importantly, it gives you full control over what data your application accesses and indexes so you can control what information is surfaced to which user. All application data and user interactions are stored in your own cloud instance and are never used to train Google’s underlying machine learning models.</p><p>Since we used Flutter to build the demo app, we decided to use Flutter documentation as training data. We worked with the Google Cloud team to train the model specifically on the Flutter and Dart developer documentation to provide generative AI responses to questions like: What is Flutter? What platforms does it support? And what is hot reload? 
While much of this data is readily available in public AI models, this demo showcases how you can train a model on just your own data to create powerful AI experiences.</p><p>This article takes you through how our partner, Very Good Ventures, built a Flutter web application and how we connected the app in the Cloud console.</p><h4>How we built the Flutter web app</h4><p>The idea of creating a search application trained on the Flutter docs was straightforward. In fact, the <a href="https://docs.flutter.dev/">official Flutter documentation</a> already provides a simple search experience that delivers relevant page results for questions on Flutter. However, when conceptualizing what to build, we wanted to demonstrate how Flutter can be used to create visually appealing interactive experiences that are fun and engaging.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/777/0*92qfA1AlxG0HMJCE" /></figure><p><strong>Creating interactive animations in Flutter</strong></p><p>Flutter empowered the team to implement a wide array of animations seamlessly. Its rich set of animations, coupled with the flexibility of widgets, allowed us to create transitions, engaging motion effects, and fluid user interactions. From creating animated loading states while generating the results, to a Dash sprite that waves when the answers appear, Flutter provided flexibility to turn what could be a basic text response into something fun for users to interact with.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/933/0*0gsPB_1VlYop9DPL" /></figure><p><strong>Visualizing natural language search results</strong></p><p>Unlike a traditional search experience, Vertex AI Search provides a natural language response to the question asked. The answer is generated with AI sourced from various pages within the Flutter documentation and presented as a summarized response alongside cards that display the relevant pages used by the AI. 
Each card provides the title of the page and a description so that the user can flip through the cards to get more context on the AI response.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1000/0*FapWwJ2wFpi8l9ZZ" /></figure><p>Additionally, within the natural language response, Vertex AI Search provides a link to the source of each sentence in the response — giving the user a more granular understanding of where each part of the response originated. In the demo, clicking on the number next to each sentence flips the cards to the relevant source page.</p><p><strong>More engaging in less time</strong></p><p>Going into the project, we had a tight deadline to launch the demo at the <a href="https://cloudonair.withgoogle.com/events/summit-applied-ml-summit-23?talk=t1_s5_vertexaisearchandconversation">Google Cloud Applied AI Summit</a>. Flutter’s efficient development and ease of use significantly expedited the development process for the team of two working on the application. It provided the necessary tools and framework to build complex animations efficiently, enabling our team to build and launch this demo in just under the two weeks leading up to the event.</p><h4>Connecting the Flutter app to the Vertex API</h4><p>Integrating our front-end web application with the Vertex AI Search API was achieved with a simple request using the <a href="https://pub.dev/packages/http">`http` package</a>. Without requiring any previous experience in building AI, Vertex AI Search provided answers to user-generated questions as JSON responses that were parsed and displayed within animated widgets. This allowed the team to focus on Flutter development to create an engaging experience out of the generated data.</p><h4>Setting up generative AI search in your application</h4><p>Setting up Vertex AI and hosting the API for our Flutter app was also straightforward. 
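</p><p>A request like the one described above might be sketched in Dart with the <a href="https://pub.dev/packages/http">`http` package</a> as follows; the Cloud Function URL and the &#39;summary&#39; response field are illustrative placeholders, not the demo’s exact code:</p><pre>import &#39;dart:convert&#39;;<br><br>import &#39;package:http/http.dart&#39; as http;<br><br>/// Sends a question to the Cloud Function wrapping Vertex AI Search.<br>/// The URL and the &#39;summary&#39; field are placeholders for illustration.<br>Future&lt;String&gt; askDash(String question) async {<br>  final response = await http.post(<br>    Uri.parse(&#39;https://&lt;your-cloud-function-url&gt;&#39;),<br>    headers: {&#39;Content-Type&#39;: &#39;application/json&#39;},<br>    body: jsonEncode({&#39;search_term&#39;: question}),<br>  );<br>  // Parse the JSON payload and surface the natural language answer.<br>  final json = jsonDecode(response.body) as Map&lt;String, dynamic&gt;;<br>  return json[&#39;summary&#39;] as String;<br>}</pre><p>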
In our case, we used <a href="https://docs.flutter.dev/">https://docs.flutter.dev/</a> as our data source and set it up directly in the Google Cloud console. As a Google Cloud customer, getting started with Vertex AI requires just three steps:</p><ol><li><strong>Create a data store</strong></li></ol><p>This is your website’s digital library, holding all the information you need to generate the AI model based on just the root URL. Google Cloud crawls your website for relevant data and creates a data source for you to query. To set this up in the Google Cloud Console, select <strong>Search and Conversation</strong>. Choose <strong>Data Stores</strong> and then <strong>New Data Store</strong>. Opt for <strong>Website URL</strong> as the source and provide your website’s URL.</p><p>2. <strong>Access your data</strong></p><p>Next, create an app in Cloud Console to navigate the data indexed by the model and link it to the data store you created earlier. Under <strong>Search and Conversation</strong>, choose <strong>Apps</strong> and then <strong>New App</strong>. Select <strong>Search</strong> as the type and give your app a name that reflects its purpose, like Ask Dash.</p><p>3. <strong>Craft a Cloud Function</strong></p><p>Finally, create a Cloud Function. This is the API wrapper that exposes your Vertex AI data to other applications. In the Console, go to <strong>Cloud Functions</strong> and select <strong>Create Function</strong>.</p><p>That’s it!</p><p>From there, use the API in your front-end application as you would any API to send requests and receive formatted responses that your application can display. To test it out, head to the <strong>Function</strong> page and select <strong>Testing</strong>. 
Enter a JSON object with a “search_term” key for your question (such as “hot reload”), and see a detailed response containing a natural language summary, relevant citations, and concise summaries of the referenced pages.</p><p>Learn more about how to get started with <a href="https://cloud.google.com/generative-ai-app-builder/docs/try-enterprise-search">Vertex AI Search</a> in Google Cloud’s documentation.</p><h4>Generative AI applications built in Flutter</h4><p>To see Ask Dash in action and learn more about how we built it, check out the video session from the <a href="https://cloudonair.withgoogle.com/events/summit-applied-ml-summit-23?talk=t1_s5_vertexaisearchandconversation">Google Cloud Applied AI Summit</a>, where Alan Blount, a Product Manager for Google Cloud, breaks down the build process to show the potential of Vertex AI Search in a Flutter application. Check out the <a href="https://github.com/VeryGoodOpenSource/dash_ai_search">open source Flutter code</a> for the demo and get started with your own AI search experience in Google Cloud Console.</p><p>Ask Dash is just the start for how Flutter can power interactive Generative AI experiences in applications. We can’t wait to see what you build!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=79a836ced058" width="1" height="1" alt=""><hr><p><a href="https://medium.com/flutter/how-we-built-it-ask-dash-a-generative-ai-flutter-application-79a836ced058">How we built it: Ask Dash — A generative AI Flutter application</a> was originally published in <a href="https://medium.com/flutter">Flutter</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Dart & Flutter DevTools Extensions]]></title>
            <link>https://medium.com/flutter/dart-flutter-devtools-extensions-c8bc1aaf8e5f?source=rss----4da7dfd21a33---4</link>
            <guid isPermaLink="false">https://medium.com/p/c8bc1aaf8e5f</guid>
            <category><![CDATA[extension-development]]></category>
            <category><![CDATA[dart]]></category>
            <category><![CDATA[flutter]]></category>
            <category><![CDATA[developer-tools]]></category>
            <category><![CDATA[devtools]]></category>
            <dc:creator><![CDATA[Kenzie Davisson]]></dc:creator>
            <pubDate>Wed, 15 Nov 2023 21:32:22 GMT</pubDate>
            <atom:updated>2023-11-15T21:32:22.092Z</atom:updated>
            <content:encoded><![CDATA[<h4>A guide for building custom tooling in Dart &amp; Flutter DevTools</h4><p>Have you ever wanted to build developer tooling for Dart and Flutter but didn’t know where to start? Or maybe you didn’t want to go through all the work of establishing a connection to a running Dart or Flutter application to access debugging data? Then, even if you did create a development tool, how would you deploy it or give users easy access to it? Well, we have some good news for you: now you can create developer tooling without all these hurdles.</p><p>With the new Dart &amp; Flutter DevTools extensions framework, you can easily build developer tooling that is tightly integrated with the existing DevTools tooling suite. Extensions are built using Flutter web and leverage existing frameworks and utilities from DevTools to simplify the developer tool authoring experience.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*JsdgKjxlVmm5EAFfXvQ2yA.png" /><figcaption>Example DevTools extension for package:foo</figcaption></figure><h3>How do DevTools extensions work?</h3><p>Extensions are shipped as part of a pub package. For example, imagine we have some package:foo, and this package provides a DevTools extension. When a user depends on package:foo in their app, they automatically get access to the DevTools extension provided by this package. 
When this user is debugging their app with DevTools, they will see a new tab “Foo” that contains the developer tools provided by package:foo.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*FFl2jeYpyn8DfjRI5xQ8mQ.jpeg" /></figure><p>You can add a DevTools extension to an existing pub package, for example <a href="https://pub.dev/packages/provider">package:provider</a>, <a href="https://pub.dev/packages/patrol">package:patrol</a>, or <a href="https://pub.dev/packages/drift">package:drift</a>, which have all already published DevTools extensions, or you can create a new package that provides a DevTools extension only. In both these scenarios, the user must list a dependency on the package providing the DevTools extension in order to see the developer tool in DevTools.</p><h3>Writing a DevTools extension: a step-by-step guide</h3><h4>Before you get started</h4><p>What you will need:</p><ul><li>Flutter SDK &gt;= 3.17.0-0.0.pre &amp; Dart SDK &gt;= 3.2.</li><li>A <a href="https://pub.dev/">Pub</a> package (existing or new) to add a DevTools extension to.</li></ul><blockquote>It is recommended to develop your extension from the Flutter master channel in order to use the latest <a href="https://pub.dev/packages/devtools_extensions">devtools_extensions</a> and <a href="https://pub.dev/packages/devtools_app_shared">devtools_app_shared</a> packages.</blockquote><h3>Step 1: Set up your package hierarchy</h3><p>To add an extension to your Dart package, add a top-level extension directory to your package.</p><pre>foo/<br>  extension/<br>  lib/<br>  ...</pre><p>Under the extension directory, create the following structure (exactly as shown):</p><pre>extension/<br>  devtools/<br>    build/<br>    config.yaml</pre><p>The config.yaml file should contain metadata that DevTools needs to load the extension:</p><pre>name: foo<br>version: 0.0.1<br>issue_tracker: &lt;link_to_your_issue_tracker.com&gt;<br>material_icon_code_point: &#39;0xe0b1&#39;</pre><p>Copy the 
config.yaml file content above and paste it into the config.yaml file you just created in your package. For each key, fill in the appropriate value for your package.</p><ul><li>name: the package name that this DevTools extension belongs to. The value of this field is used in the extension page title bar. <strong>[required]</strong></li><li>version: the version of your DevTools extension. This version number should evolve over time as you ship new features for your extension. The value of this field is used in the extension page title bar. <strong>[required]</strong></li><li>issue_tracker: the url for your issue tracker. When a user clicks the “Report an issue” link in the DevTools UI, they are directed to this url. <strong>[required]</strong></li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/583/1*H0Q1Zv2L6_NUFUedI6mkQw.png" /><figcaption>DevTools extension screen title bar</figcaption></figure><ul><li>material_icon_code_point: corresponds to the codepoint value of an icon from <a href="https://github.com/flutter/flutter/blob/master/packages/flutter/lib/src/material/icons.dart">material/icons.dart</a>. This icon is used for the extension’s tab in the top-level DevTools tab bar. <strong>[required]</strong></li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/83/1*bkNzahdp7eXUo7Xwh1U07Q.png" /><figcaption>DevTools extension tab icon</figcaption></figure><p>For the most up-to-date documentation on the config.yaml spec, see <a href="https://github.com/flutter/devtools/blob/master/packages/devtools_extensions/extension_config_spec.md">extension_config_spec.md</a>.</p><h4>Where to put your extension source code</h4><p>Only the pre-compiled output of your Flutter extension web app needs to be shipped with your pub package for DevTools to load it in an embedded iFrame. To keep the size of your pub package small, we recommend that you develop your DevTools extension outside of your pub package. 
We recommend the following package structure:</p><pre>foo/  <br>  packages/<br>    foo/  # your pub package<br>      extension/<br>        devtools/<br>          build/<br>            ...  # pre-compiled output of foo_devtools_extension<br>          config.yaml<br>      ...<br>    foo_devtools_extension/  # source code for your extension</pre><p>Now it’s time to develop your extension.</p><h3>Step 2: Create a DevTools extension</h3><p>From the directory where you want your extension source code to live (foo/packages in the example above), run the following command, replacing foo_devtools_extension with &lt;your_package_name&gt;_devtools_extension:</p><pre>flutter create --template app --platforms web foo_devtools_extension</pre><p>In foo_devtools_extension/pubspec.yaml, add a dependency on devtools_extensions:</p><pre>devtools_extensions: ^0.0.10</pre><p>In foo_devtools_extension/lib/main.dart, place a DevToolsExtension widget at the root of your app:</p><pre>import &#39;package:devtools_extensions/devtools_extensions.dart&#39;;<br>import &#39;package:flutter/material.dart&#39;;<br><br>void main() {<br>  runApp(const FooDevToolsExtension());<br>}<br><br>class FooDevToolsExtension extends StatelessWidget {<br>  const FooDevToolsExtension({super.key});<br><br>  @override<br>  Widget build(BuildContext context) {<br>    return const DevToolsExtension(<br>      child: Placeholder(), // Build your extension here<br>    );<br>  }<br>}</pre><p>The DevToolsExtension widget automatically performs all extension initialization required to interact with DevTools. From anywhere in your extension web app, you can access the global variables extensionManager and serviceManager to send messages back and forth with DevTools or interact with the connected app.</p><h4>Utilize helper packages</h4><p>Use <a href="https://pub.dev/packages/devtools_app_shared">package:devtools_app_shared</a> for access to service managers, common widgets, DevTools theming, utilities, and more. 
See <a href="https://github.com/flutter/devtools/tree/master/packages/devtools_app_shared/example">devtools_app_shared/example</a> for sample usages.</p><h3>Step 3: Debug a DevTools extension</h3><p>When developing and maintaining your DevTools extension, you’ll want to run, debug, and test your extension Flutter web app. You have a couple of different options for this, outlined below.</p><h4>Option A: Use the Simulated DevTools Environment (recommended for development)</h4><p>For debugging purposes, you will likely want to use the “simulated DevTools environment”. This is a simulated environment that allows you to build your extension without having to develop it as an embedded iFrame in DevTools. Running your extension this way will wrap your extension with an environment that simulates the DevTools-to-DevTools extension connection. It also gives you access to hot restart and a faster development cycle.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*-3c-p57ai0Quc5rJOL9qEQ.png" /><figcaption>Debugging an extension with the Simulated DevTools Environment</figcaption></figure><ol><li><em>Your DevTools extension.</em></li><li><em>The VM service URI for a test app that your DevTools extension will interact with. This app should depend on your extension’s parent package (package:foo in this example).</em></li><li><em>Buttons to perform actions that a user may trigger from DevTools.</em></li><li><em>Logs showing the messages that will be sent between your extension and DevTools.</em></li></ol><p>The simulated environment is enabled by the environment parameter use_simulated_environment. 
To run your extension web app with this flag enabled, add a configuration to your launch.json file in VS Code:</p><pre>{<br>    ...<br>    &quot;configurations&quot;: [<br>        ...<br>        {<br>            &quot;name&quot;: &quot;foo_devtools_extension + simulated environment&quot;,<br>            &quot;cwd&quot;: &quot;packages/foo_devtools_extension&quot;,<br>            &quot;request&quot;: &quot;launch&quot;,<br>            &quot;type&quot;: &quot;dart&quot;,<br>            &quot;args&quot;: [<br>                &quot;--dart-define=use_simulated_environment=true&quot;<br>            ],<br>        },<br>    ]<br>}</pre><p>or launch your app from the command line with the added flag:</p><pre>flutter run -d chrome --dart-define=use_simulated_environment=true</pre><h4>Option B: Use a real DevTools environment</h4><p>Once you develop your extension to a point where you are ready to test your changes in a real DevTools environment, you need to perform a series of setup steps:</p><p>1. Build your Flutter web app and copy the built assets from the your_extension_web_app/build/web directory to the parent package’s extension directory (your_pub_package/extension/devtools/build). To do this, use the build_and_copy command from package:devtools_extensions:</p><pre>cd your_extension_web_app;<br>flutter pub get;<br>dart run devtools_extensions build_and_copy --source=. --dest=path/to/your_pub_package/extension/devtools</pre><blockquote>Note: if you are using the recommended package structure from above, the value for <em>--dest</em> should be <em>../your_pub_package/extension/devtools</em>.</blockquote><p>2. Prepare and run a test application that depends on your pub package. In the test application’s pubspec.yaml file, you’ll need to change the dependency on your package to be a <a href="https://dart.dev/tools/pub/dependencies#path-packages">path dependency</a> that points to your local package source code. 
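</p><p>For example, the test application’s pubspec.yaml might point at the local package like this (the package name and path are illustrative and depend on where your checkout lives):</p><pre>dependencies:<br>  foo:<br>    path: ../packages/foo</pre><p>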
Once you have done this, run flutter pub get on the test app, and run the application.</p><p>3. Start DevTools. Open the DevTools instance that was just started by running your test application. You can open DevTools from either the url printed to the command line or from the IDE where you ran your test app. Optionally, you can also run dart devtools from the command line.</p><blockquote>Note: If you need local or unreleased changes from DevTools, you’ll need to build and run DevTools from source (<a href="https://github.com/flutter/devtools/blob/master/CONTRIBUTING.md#frontend--devtools-server">server + front end</a>). See the DevTools <a href="https://github.com/flutter/devtools/blob/master/CONTRIBUTING.md">CONTRIBUTING</a> guide.</blockquote><p>4. Connect DevTools to your test app if it is not connected already, and you should see a tab in the DevTools app bar for your extension. The enabled or disabled state of your extension is managed by DevTools. The extension-enabled states are exposed from an “Extensions” menu in DevTools, available from the action buttons in the upper right corner of the screen.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/576/0*s9-Y9X5dTyuk6Xj4" /><figcaption>DevTools Extensions menu button</figcaption></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/789/1*DSi_p-2FO60qo5JUKXk-3Q.png" /><figcaption>DevTools Extensions menu</figcaption></figure><h3>Step 4: Publish your package with a DevTools extension</h3><p>In order for a package to provide a DevTools extension to its users, it must be published with the expected content in the your_pub_package/extension/devtools/ directory (see the setup instructions above).</p><ol><li>Ensure that the your_pub_package/extension/devtools/config.yaml file exists and is configured per the specifications above.</li><li>Use the build_and_copy command provided by package:devtools_extensions to build your extension and copy the output to the extension/devtools 
directory:</li></ol><pre>cd your_extension_web_app;<br>flutter pub get;<br>dart run devtools_extensions build_and_copy --source=. --dest=path/to/your_pub_package/extension/devtools</pre><p>Then publish your package on <a href="https://pub.dev/">pub.dev</a>: flutter pub publish. For additional guidance around publishing your package, see the package:devtools_extensions <a href="https://pub.dev/packages/devtools_extensions#publish-your-package-with-a-devtools-extension">publishing guide</a>.</p><h3>Conclusion</h3><p>That’s it! Now, when a user depends on the latest version of your package, they will automatically get access to the tools you provide in your DevTools extension. This feature is hot off the press, so we are eager to hear your feedback.</p><p>For issues and feature requests, please file an issue on the DevTools <a href="https://github.com/flutter/devtools/issues">issue tracker</a>.</p><p>For general support and access to the community of DevTools extension authors, check out the <a href="https://discord.com/channels/608014603317936148/1159561514072690739">#devtools-extension-authors</a> Discord channel (you will first need to join the <a href="https://github.com/flutter/flutter/wiki/Chat">Flutter Discord server</a>).</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=c8bc1aaf8e5f" width="1" height="1" alt=""><hr><p><a href="https://medium.com/flutter/dart-flutter-devtools-extensions-c8bc1aaf8e5f">Dart &amp; Flutter DevTools Extensions</a> was originally published in <a href="https://medium.com/flutter">Flutter</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
    </channel>
</rss>