<?xml version="1.0" encoding="utf-8"?>
<rss xmlns:a10="http://www.w3.org/2005/Atom"
  version="2.0">
  <channel xmlns:slash="http://purl.org/rss/1.0/modules/slash/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:wfw="http://wellformedweb.org/CommentAPI/" xmlns:dc="http://purl.org/dc/elements/1.1/">
    <title>Microsoft Azure Blog</title>
    <link>https://azure.microsoft.com/blog/</link>
    <description />
    <language>en-US</language>
    <lastBuildDate>Mon, 12 Dec 2016 17:30:26 Z</lastBuildDate>
    <item>
      <guid
        isPermaLink="false">first-azure-as</guid>
      <category>Business Intelligence</category>
      <title>Creating your first data model in Azure Analysis Services</title>
      <description>Azure Analysis Services is a new preview service in Microsoft Azure where you can host semantic data models. Users in your organization can then connect to your data models using tools like Excel, Power BI and many others to create reports and perform ad-hoc data analysis.</description>
      <pubDate>Thu, 08 Dec 2016 09:00:03 Z</pubDate>
      <content:encoded>&lt;p&gt;Azure Analysis Services is a new preview service in Microsoft Azure where you can host semantic data models. Users in your organization can then connect to your data models using tools like Excel, Power BI and many others to create reports and perform ad-hoc data analysis.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;To understand the value of Azure Analysis Services, imagine a scenario where you have data stored in a large database. You want to make that data available to your business users or customers so they can do their own analysis and build their own reports. To do this, one option would be to give those users access to that database. Of course, this option has several drawbacks. The design of that database, including the names of tables and columns may not be easy for a user to understand. They would need to know which tables to query, how those tables should be joined, and other business logic that needs to be applied to get the correct results. They would also need to know a query language like SQL to even get started. Most often this will lead to multiple users reporting the same metrics but with different results.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;With Azure Analysis Services, you can encapsulate all the information needed into a semantic model, which those users can query through a simple drag-and-drop experience. And you can ensure that all users see a single version of the truth. The metadata in the semantic model includes relationships between tables, friendly table and column names, descriptions, display folders, calculations, and row-level security.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;Once your data is properly modeled for your users to consume, Azure Analysis Services offers additional features to enhance their querying experience. The biggest is the option to load the data into an in-memory columnar cache, which can accelerate queries to sub-second performance. This not only improves the query experience, but also reduces the query load on your underlying database, because queries are served from the cache.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;Ready to give it a try? Follow the steps in the rest of this blog post and you&amp;rsquo;ll see how easy it is.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;Before getting started, you&amp;rsquo;ll need:&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;Azure Subscription - &lt;a href="https://azure.microsoft.com/free/?b=16.46"&gt;Sign up for a free trial&lt;/a&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;SQL Server Data Tools - &lt;a href="https://msdn.microsoft.com/library/mt204009.aspx"&gt;Download the latest version for free&lt;/a&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;Power BI Desktop - &lt;a href="https://go.microsoft.com/fwlink/?LinkId=521662&amp;amp;clcid=0x409"&gt;Download the latest version for free&lt;/a&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;h3&gt;Create an Analysis Services server in Azure&lt;/h3&gt;&#xD;
&#xD;
&lt;p&gt;1. Go to &lt;a href="http://portal.azure.com"&gt;http://portal.azure.com&lt;/a&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;2. In the Menu blade, click &lt;b&gt;New&lt;/b&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/516caf4f-6158-42cd-9634-7b46c97806b7.jpg"&gt;&lt;img alt="clip_image002" border="0" height="302" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/191be014-a349-4b99-ab5f-26d12e8e59b9.jpg" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="clip_image002" width="335"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;3. Expand &lt;b&gt;Intelligence + Analytics&lt;/b&gt;, and then click &lt;b&gt;Analysis Services&lt;/b&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/c3cb6790-1e10-488a-8aa2-5c76ddee02f4.jpg"&gt;&lt;img alt="clip_image004" border="0" height="306" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/48c44146-9691-4e35-aecd-7156363479dd.jpg" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="clip_image004" width="475"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;4. In the Analysis Services blade, enter the following and then click &lt;b&gt;Create&lt;/b&gt;:&lt;/p&gt;&#xD;
&#xD;
&lt;ul&gt;&#xD;
	&lt;li&gt;&lt;b&gt;Server name&lt;/b&gt;: Type a unique name.&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;b&gt;Subscription&lt;/b&gt;: Select your subscription.&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;b&gt;Resource group&lt;/b&gt;: Select &lt;b&gt;Create new&lt;/b&gt;, and then type a name for your new resource group.&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;b&gt;Location&lt;/b&gt;: This is the Azure datacenter location that hosts the server. Choose a location nearest you.&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;b&gt;Pricing tier&lt;/b&gt;: For our simple model, select &lt;b&gt;D1&lt;/b&gt;. This is the smallest tier and great for getting started. The larger tiers are differentiated by how much cache and how many query processing units they offer. Cache indicates how much data can be held in memory after compression. Query processing units, or QPUs, indicate how many queries can be supported concurrently; higher QPUs generally mean better performance and a higher concurrency of users.&lt;/li&gt;&#xD;
&lt;/ul&gt;&#xD;
&#xD;
&lt;p&gt;Now that you&amp;rsquo;ve created a server, you can build your first model. In the next steps, you&amp;rsquo;ll use SQL Server Data Tools (SSDT) to create a data model and deploy it to your new server in Azure.&lt;/p&gt;&#xD;
&#xD;
&lt;h3&gt;Create a sample data source&lt;/h3&gt;&#xD;
&#xD;
&lt;p&gt;Before you can create a data model with SSDT, you&amp;rsquo;ll need a data source to connect to. Azure Analysis Services supports connecting to many different types of data sources both on-premises and in the cloud. For this post, we&amp;rsquo;ll use the Adventure Works sample database.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;1. In the Azure portal, in the Menu blade, click &lt;b&gt;New&lt;/b&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/509b2e48-b31f-47b7-a878-4787b035e88b.jpg"&gt;&lt;img alt="clip_image005" border="0" height="388" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/27263fab-dc71-4860-b85e-853ecff83797.jpg" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="clip_image005" width="431"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;2. Expand &lt;b&gt;Databases&lt;/b&gt;, and then click &lt;b&gt;SQL Database&lt;/b&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/46f33b61-1526-413e-b298-b51ab3618f1a.jpg"&gt;&lt;img alt="clip_image007" border="0" height="186" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/b5adc0ad-ab0a-4ad3-a414-117e3fb9941c.jpg" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="clip_image007" width="489"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;3. In the SQL Database blade, enter the following and then click &lt;b&gt;Create&lt;/b&gt;:&lt;/p&gt;&#xD;
&#xD;
&lt;ul&gt;&#xD;
	&lt;li&gt;&lt;b&gt;Database name&lt;/b&gt;: Type a unique name.&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;b&gt;Subscription&lt;/b&gt;: Select your subscription.&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;b&gt;Resource group&lt;/b&gt;: Select the same resource group you created for your Analysis Services server.&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;b&gt;Select source&lt;/b&gt;: Select &lt;b&gt;Sample (Adventure Works LT)&lt;/b&gt;.&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;b&gt;Server&lt;/b&gt;: Choose a location nearest you.&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;b&gt;Pricing tier&lt;/b&gt;: For your sample database, select &lt;b&gt;B&lt;/b&gt;.&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;b&gt;Collation&lt;/b&gt;: Leave the default, &lt;b&gt;SQL_Latin1_General_CP1_CI_AS&lt;/b&gt;.&lt;/li&gt;&#xD;
&lt;/ul&gt;&#xD;
&#xD;
&lt;p&gt;Now that you&amp;rsquo;ve created a sample data source, you&amp;rsquo;ll have some data to connect to when you build your data model. In the next steps, you&amp;rsquo;ll use SQL Server Data Tools (SSDT) to connect to your new data source, create a data model, and deploy it to your new server in Azure.&lt;/p&gt;&#xD;
&#xD;
&lt;h3&gt;Create a data model&lt;/h3&gt;&#xD;
&#xD;
&lt;p&gt;To create Analysis Services data models, you&amp;rsquo;ll use Visual Studio and an extension called SQL Server Data Tools (SSDT).&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;1. In SSDT, create a new &lt;b&gt;Analysis Services Tabular Project&lt;/b&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/4e80a60c-6851-4c37-adb9-ca0949ff3342.png"&gt;&lt;img alt="image" border="0" height="660" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/35766de9-c6ec-4149-810b-2f41193fb6d8.png" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="image" width="953"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;If asked to select a workspace type, select &lt;b&gt;Integrated&lt;/b&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;2. Click the &lt;b&gt;Import From Data Source&lt;/b&gt; icon on the toolbar at the top of the screen.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/6c78a7d0-376b-4fe1-90f3-7fb79caf6299.jpg"&gt;&lt;img alt="clip_image011" border="0" height="203" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/02daa50f-8e80-462d-9711-0776cddc72d2.jpg" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="clip_image011" width="625"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;3. Select &lt;b&gt;Microsoft SQL Azure&lt;/b&gt; as your data source type and click &lt;b&gt;Next&lt;/b&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;4. Fill in the connection information for the sample SQL Azure database created earlier and click &lt;b&gt;Next&lt;/b&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/4957a95a-6413-4f53-ba10-b86851edd308.jpg"&gt;&lt;img alt="clip_image013" border="0" height="522" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/3fcba5d2-76e1-420b-a879-b17723e9ebb9.jpg" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="clip_image013" width="648"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;ul&gt;&#xD;
	&lt;li&gt;&lt;b&gt;Server Name&lt;/b&gt;: Name of the SQL Azure server to connect to.&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;b&gt;User Name&lt;/b&gt;: Name of the user account used to log in to the server.&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;b&gt;Password&lt;/b&gt;: Password for the account.&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;b&gt;Database Name&lt;/b&gt;: Name of the SQL database to connect to.&lt;/li&gt;&#xD;
&lt;/ul&gt;&#xD;
&#xD;
&lt;p&gt;Note: If you are using SQL Azure, ensure that you have allowed your &lt;a href="https://docs.microsoft.com/en-us/azure/sql-database/sql-database-configure-firewall-settings"&gt;IP address access through the firewall&lt;/a&gt;. Also, ensure that &amp;ldquo;Allow access to Azure Services&amp;rdquo; is set to &amp;ldquo;on&amp;rdquo; for the firewall.&lt;/p&gt;&#xD;
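&lt;p&gt;Before importing, it can help to verify the connection details outside of SSDT. The following sketch is an illustration, not part of the original walkthrough: it assembles an ODBC-style connection string from the same four properties, and the server, database, user, and password values are hypothetical placeholders.&lt;/p&gt;&#xD;

```python
# Assemble an ODBC-style connection string for an Azure SQL database.
# All values passed in below are hypothetical placeholders.
def build_connection_string(server, database, user, password):
    parts = [
        ("Server", "tcp:{0}.database.windows.net,1433".format(server)),
        ("Database", database),
        ("Uid", user),
        ("Pwd", password),
        ("Encrypt", "yes"),  # Azure SQL requires encrypted connections
    ]
    return ";".join("{0}={1}".format(key, value) for key, value in parts)

conn_str = build_connection_string("myserver", "AdventureWorksLT", "sqladmin", "P4ssw0rd")
print(conn_str)
```

&lt;p&gt;A string in this shape can be pasted into most SQL client tools to confirm that the firewall settings described above are working before you return to SSDT.&lt;/p&gt;&#xD;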
&#xD;
&lt;p&gt;5. Select &lt;b&gt;Service Account&lt;/b&gt; for the impersonation mode and click &lt;b&gt;Next&lt;/b&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;6. Select the tables you wish to import into the cache and click &lt;b&gt;Finish&lt;/b&gt;:&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/c6653ea6-bbcb-4d02-a834-a6c36bb5f132.jpg"&gt;&lt;img alt="clip_image015" border="0" height="564" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/3aae9d0c-1e2e-4a86-8627-400f3a33fc4f.jpg" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="clip_image015" width="693"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;ul&gt;&#xD;
	&lt;li&gt;At this step, you can optionally provide a friendly name for each table. For large tables, which may not fit into the cache, you can also specify a filter expression to reduce the number of rows. When complete, click &lt;b&gt;Next&lt;/b&gt;.&lt;/li&gt;&#xD;
	&lt;li&gt;Data will now be read from the database and pulled into a local cache within Visual Studio.&lt;/li&gt;&#xD;
	&lt;li&gt;Once loading is complete, you will have your first model created and will be able to see each table and the data within them. You can also switch to a diagram view by clicking the little diagram option at the bottom right of the screen:&lt;/li&gt;&#xD;
&lt;/ul&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/733bc918-b22d-4385-88a4-d15e8f4d1334.png"&gt;&lt;img alt="image" border="0" height="331" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/bbd38a46-fb78-4901-b21e-26a40a376319.png" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="image" width="557"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;The diagram view makes it really easy to see all of the tables and the relationships between them.&lt;/p&gt;&#xD;
&#xD;
&lt;h3&gt;Improving the model&lt;/h3&gt;&#xD;
&#xD;
&lt;p&gt;Now that your basic model is built, you could start querying it right away, or you could enhance it further by using more of the available modeling features. Some of these features include:&lt;/p&gt;&#xD;
&#xD;
&lt;ul&gt;&#xD;
	&lt;li&gt;Create or edit relationships. You can add, remove or change relationships between tables by going to the diagram view and dragging a line between two columns in different tables. Once tables are joined together, they can automatically be queried together when a user selects columns from both tables.&lt;/li&gt;&#xD;
	&lt;li&gt;Edit properties for a table or column. You can update multiple properties for tables and columns by clicking on them and updating the values in the properties pane.&lt;/li&gt;&#xD;
&lt;/ul&gt;&#xD;
&#xD;
&lt;blockquote&gt;&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/17a2ce80-7daa-46dc-ac42-b4445b107543.png"&gt;&lt;img alt="image" border="0" height="366" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/61e0e78a-a2c5-41f2-bfab-8103809065ff.png" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="image" width="786"&gt;&lt;/a&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/2b51bfa3-f4a4-4d71-a7c5-49d3c2ec201c.png"&gt;&lt;img alt="image" border="0" height="506" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/d30a77bf-5e77-4c84-858f-ed037690fb2c.png" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="image" width="583"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&lt;/blockquote&gt;&#xD;
&#xD;
&lt;ul&gt;&#xD;
	&lt;li&gt;Add more business logic to the model by creating calculations and measures.&lt;/li&gt;&#xD;
&lt;/ul&gt;&#xD;
&#xD;
&lt;blockquote&gt;&#xD;
&lt;p&gt;&lt;ins datetime="2016-12-01T09:15"&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/e902ded8-86f9-4321-9b19-eef1d9b786aa.jpg"&gt;&lt;img alt="clip_image023" border="0" height="100" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/0325ca8e-b000-4b7d-ac29-8f6429e6daa7.jpg" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="clip_image023" width="1432"&gt;&lt;/a&gt;&lt;/ins&gt;&lt;/p&gt;&#xD;
&lt;/blockquote&gt;&#xD;
&#xD;
&lt;h3&gt;Deploy&lt;/h3&gt;&#xD;
&#xD;
&lt;p&gt;Once your model is complete, you can deploy it to the Azure AS server you created in the first step. This can be done with the following steps:&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;1. Copy your Azure Analysis Services server name from the Azure portal. This can be found at the top of the overview section of your server.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;ins datetime="2016-12-01T09:25"&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/ec6bffbe-647d-4b14-888d-861d734c1ad0.jpg"&gt;&lt;img alt="clip_image025" border="0" height="446" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/1261dc5e-6c3f-4b9e-97a2-2b0746a5f32f.jpg" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="clip_image025" width="949"&gt;&lt;/a&gt;&lt;/ins&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;2. In Solution Explorer in Visual Studio, right-click the project and click &lt;b&gt;Properties&lt;/b&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/1313599f-f602-46e9-8f4a-9944b84de2b9.png"&gt;&lt;img alt="image" border="0" height="756" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/88429cb8-c6a4-4e5f-82fb-cfe6a8c69bb5.png" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="image" width="769"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;3. Change the deployment server to the name of your Azure AS server and click &lt;b&gt;OK.&lt;/b&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;ins datetime="2016-12-01T09:29"&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/816bb510-17bc-47ba-83d2-5ad80e233bc7.jpg"&gt;&lt;img alt="clip_image029" border="0" height="305" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/e5ad57b9-3a5d-461f-9e7f-6af716dfd9bc.jpg" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="clip_image029" width="691"&gt;&lt;/a&gt;&lt;/ins&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;4. Right click the project name again, but this time click &lt;b&gt;Deploy&lt;/b&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/533bdd95-fd28-4c81-b2a4-7d2c50fd5052.png"&gt;&lt;img alt="image" border="0" height="366" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/6bb727ba-0888-4cb0-9323-23e27284b858.png" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="image" width="746"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;h3&gt;Connect&lt;/h3&gt;&#xD;
&#xD;
&lt;p&gt;Now that your model has been created, you can connect to it through tools like Power BI Desktop or Excel.&lt;/p&gt;&#xD;
&#xD;
&lt;h4&gt;Power BI Desktop&lt;/h4&gt;&#xD;
&#xD;
&lt;p&gt;If you don&amp;rsquo;t already have the &lt;a href="https://powerbi.microsoft.com/en-us/desktop"&gt;Power BI Desktop&lt;/a&gt;, you can download it for free.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;1. Open Power BI Desktop.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;2. Click &lt;b&gt;Get Data&lt;/b&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;ins datetime="2016-12-02T10:35"&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/605cdb9a-7cf5-404e-a4ef-624e553cd931.jpg"&gt;&lt;img alt="clip_image033" border="0" height="253" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/1e326f8b-4b95-4510-8cae-0ac5f2b6b730.jpg" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="clip_image033" width="352"&gt;&lt;/a&gt;&lt;/ins&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;3. Select &lt;b&gt;Databases/SQL Server Analysis Services&lt;/b&gt; and then click &lt;b&gt;Connect&lt;/b&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;ins datetime="2016-12-02T10:36"&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/e1157d95-8460-4835-8538-09ac635e5cb0.jpg"&gt;&lt;img alt="clip_image035" border="0" height="221" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/097f6be9-747c-468c-bfe1-60d4c5cd1451.jpg" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="clip_image035" width="658"&gt;&lt;/a&gt;&lt;/ins&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;4. Enter your Azure AS server name and click &lt;b&gt;OK&lt;/b&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;ins datetime="2016-12-02T10:38"&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/c661ec43-da8a-486c-a249-3fc0fbf99878.jpg"&gt;&lt;img alt="clip_image037" border="0" height="353" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/82267b5f-327c-40ed-85e0-6bece385c165.jpg" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="clip_image037" width="700"&gt;&lt;/a&gt;&lt;/ins&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;5. On the &lt;b&gt;Navigator&lt;/b&gt; screen, select your model and click &lt;b&gt;OK&lt;/b&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;ins datetime="2016-12-02T10:39"&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/f3f5300d-3108-4e5b-9134-0eb06ac14d2f.jpg"&gt;&lt;img alt="clip_image039" border="0" height="264" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/8bab5e2c-e33e-4ef4-b91a-379289c804a7.jpg" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="clip_image039" width="827"&gt;&lt;/a&gt;&lt;/ins&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;You will now see your model displayed in the field list on the side. You can drag and drop the different fields onto your page to build out interactive visuals.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/23452888-c1d5-409a-b916-c15d53233a62.png"&gt;&lt;img alt="image" border="0" height="1038" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/98c062cd-4c9b-4674-9bb5-032a478d790d.png" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="image" width="1615"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;h4&gt;Excel&lt;/h4&gt;&#xD;
&#xD;
&lt;p&gt;Learn more about &lt;a href="https://docs.microsoft.com/en-us/azure/analysis-services/analysis-services-connect#connect-in-excel"&gt;connecting through Excel&lt;/a&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;strong&gt;Learn more about &lt;a href="https://azure.microsoft.com/services/analysis-services/"&gt;Azure Analysis Services&lt;/a&gt;.&lt;/strong&gt;&lt;/p&gt;&#xD;
</content:encoded>
      <comments>https://azure.microsoft.com/blog/first-azure-as/#comments</comments>
      <link>https://azure.microsoft.com/blog/first-azure-as/</link>
      <dc:creator>Josh Caplan</dc:creator>
    </item>
    <item>
      <guid
        isPermaLink="false">japanese-language-support-in-azure-media-indexer-2-preview</guid>
      <category>Announcements</category>
      <category>Media Services &amp; CDN</category>
      <title>Azure Media Indexer 2: Japanese support, punctuation improvements, no more time limit</title>
      <description>Azure Media Indexer 2 Preview now supports the Japanese language and media files greater than 10 minutes in duration.</description>
      <pubDate>Thu, 08 Dec 2016 08:00:02 Z</pubDate>
<content:encoded>&lt;p&gt;On the heels of Microsoft&amp;#39;s &lt;a href="https://blogs.microsoft.com/next/2016/10/18/historic-achievement-microsoft-researchers-reach-human-parity-conversational-speech-recognition/"&gt;groundbreaking new developments in speech recognition&lt;/a&gt;, we are continuing along our path: improving the quality of the transcripts generated by Azure Media Indexer and expanding our locale support to eventually accomplish our goal of being able to recognize all human speech on the Azure cloud.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;Today we are ready to release the following improvements to Azure Media Indexer 2 Preview:&lt;/p&gt;&#xD;
&#xD;
&lt;ul&gt;&#xD;
	&lt;li&gt;Japanese language models for public (preview) consumption in Azure Media Indexer 2&lt;/li&gt;&#xD;
	&lt;li&gt;Removal of the 10 minute processing limit&amp;nbsp;&lt;/li&gt;&#xD;
	&lt;li&gt;Additional quality improvements with respect to punctuation and grammar&lt;/li&gt;&#xD;
&lt;/ul&gt;&#xD;
&#xD;
&lt;p&gt;The Japanese language model works in an identical manner to all other language models; simply provide the proper language code in the configuration file.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;The following configuration will allow you to process a file with Japanese speech content (with defaults for all other options):&lt;/p&gt;&#xD;
&#xD;
&lt;pre class="prettyprint"&gt;&#xD;
{&#xD;
    &amp;quot;Version&amp;quot;: &amp;quot;1.0&amp;quot;,&#xD;
    &amp;quot;Features&amp;quot;: [{&#xD;
        &amp;quot;Options&amp;quot;: {&#xD;
            &amp;quot;Language&amp;quot;: &amp;quot;JaJp&amp;quot;&#xD;
        }&#xD;
    }]&#xD;
}&#xD;
&lt;/pre&gt;&#xD;
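&lt;p&gt;If you generate this configuration programmatically, serializing it with a JSON library avoids quoting mistakes. The sketch below is an illustration rather than part of the Indexer API: it reproduces the structure above with Python&amp;rsquo;s json module, and only the &amp;quot;JaJp&amp;quot; language code comes from this post.&lt;/p&gt;&#xD;

```python
import json

# Reproduce the task configuration shown above; json.dumps emits
# consistently double-quoted, valid JSON.
config = {
    "Version": "1.0",
    "Features": [{
        "Options": {
            "Language": "JaJp"
        }
    }]
}
config_text = json.dumps(config, indent=4)
print(config_text)
```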
&#xD;
&#xD;
&lt;p&gt;Still not sure what Azure Media Indexer 2 is?&amp;nbsp; Read the introductory blog post to learn how to &lt;a href="https://azure.microsoft.com/en-us/blog/new-languages-with-azure-media-indexer-2-preview/"&gt;extract the speech content&lt;/a&gt; from your media files.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;To learn more about &lt;a href="http://azure.microsoft.com/en-us/blog/introducing-azure-media-analytics"&gt;Azure Media Analytics&lt;/a&gt;, check out the introductory blog post.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;Have feedback?&amp;nbsp; Share it on our &lt;a href="https://feedback.azure.com/forums/169396-azure-media-services/category/146181-media-analytics"&gt;feedback forum&lt;/a&gt;.&lt;/p&gt;&#xD;
</content:encoded>
      <comments>https://azure.microsoft.com/blog/japanese-language-support-in-azure-media-indexer-2-preview/#comments</comments>
      <link>https://azure.microsoft.com/blog/japanese-language-support-in-azure-media-indexer-2-preview/</link>
      <dc:creator>Adarsh Solanki</dc:creator>
    </item>
    <item>
      <guid
        isPermaLink="false">query-store-on-is-the-new-default-for-azure-sql-database</guid>
      <category>Announcements</category>
      <category>IT Pro/DevOps</category>
      <category>Database</category>
      <category>Supportability</category>
      <title>Query Store ON is the new default for Azure SQL Database</title>
      <description>We are happy to announce that Query Store is turned ON in all Azure SQL databases (including Elastic Pools) which will bring benefits both to the end users and the entire Azure SQL Database platform.</description>
      <pubDate>Wed, 07 Dec 2016 12:00:10 Z</pubDate>
      <content:encoded>&lt;p&gt;We are happy to announce that Query Store is turned ON in all Azure SQL databases (including Elastic Pools) which will bring benefits both to the end users and the entire Azure SQL Database platform.&lt;/p&gt;&#xD;
&#xD;
&lt;h2&gt;Why is this important?&lt;/h2&gt;&#xD;
&#xD;
&lt;p&gt;Query Store acts as a &amp;ldquo;flight data recorder&amp;rdquo; for the database, continuously collecting critical information about the queries. It dramatically reduces resolution time in case of performance incidents, as pre-collected, relevant data is available when you need it, without delays.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;You can use Query Store in scenarios when tracking performance and ensuring database performance predictability is critical. The following are some examples where Query Store is going to significantly improve your productivity:&lt;/p&gt;&#xD;
&#xD;
&lt;ul&gt;&#xD;
	&lt;li&gt;Identifying and fixing application performance regressions (more details in &lt;a href="https://azure.microsoft.com/en-in/blog/query-store-a-flight-data-recorder-for-your-database/" target="_blank"&gt;this blog article&lt;/a&gt;)&lt;/li&gt;&#xD;
	&lt;li&gt;Tuning the most expensive queries considering &lt;a href="https://msdn.microsoft.com/en-us/library/dn818158.aspx" target="_blank"&gt;different consumption metrics&lt;/a&gt; (elapsed time, CPU time, used memory, read and write operations, log I/O, etc.)&lt;/li&gt;&#xD;
	&lt;li&gt;Keeping performance stability with compatibility level 130 in Azure SQL Database (more details in &lt;a href="https://blogs.msdn.microsoft.com/sqlserverstorageengine/2016/05/06/improved-query-performance-with-compatibility-level-130-in-azure-sql-database/" target="_blank"&gt;this blog article&lt;/a&gt;)&lt;/li&gt;&#xD;
	&lt;li&gt;Assessing impact of any application or configuration change (&lt;a href="https://msdn.microsoft.com/en-us/library/mt614796.aspx#Anchor_2" target="_blank"&gt;A/B testing&lt;/a&gt;)&lt;/li&gt;&#xD;
	&lt;li&gt;Identifying and improving ad-hoc workloads (more details &lt;a href="https://msdn.microsoft.com/en-us/library/mt614796.aspx#Anchor_4" target="_blank"&gt;here&lt;/a&gt;)&lt;/li&gt;&#xD;
&lt;/ul&gt;&#xD;
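The scenarios above can also be explored directly in T-SQL through the Query Store catalog views. The following is a minimal sketch, not part of the original post: it surfaces the top CPU consumers, and the TOP 5 cutoff and column aliases are illustrative choices.

```sql
-- Sketch: top 5 queries by total CPU time, aggregated across all
-- runtime-stats intervals currently retained in the Query Store.
SELECT TOP 5
    q.query_id,
    SUM(rs.count_executions)                   AS executions,
    SUM(rs.avg_cpu_time * rs.count_executions) AS total_cpu_time
FROM sys.query_store_query AS q
JOIN sys.query_store_plan AS p
    ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats AS rs
    ON rs.plan_id = p.plan_id
GROUP BY q.query_id
ORDER BY total_cpu_time DESC;
```

Swapping avg_cpu_time for other columns of sys.query_store_runtime_stats (such as avg_duration or avg_logical_io_reads) gives the other consumption metrics mentioned above.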
&#xD;
&lt;p&gt;Query Store also provides the foundation for performance monitoring and tuning features, such as the &lt;a href="https://docs.microsoft.com/en-us/azure/sql-database/sql-database-advisor" target="_blank"&gt;SQL Database Advisor&lt;/a&gt;. Query Store powers &lt;a href="https://docs.microsoft.com/en-us/azure/sql-database/sql-database-performance" target="_blank"&gt;SQL Database Performance Insights&lt;/a&gt;, which allows you to monitor and troubleshoot database performance directly from the Azure portal. With Query Store turned ON, we ensure that relevant information about your most critical queries is available the first time you open the queries chart on SQL Database Performance Insights:&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/f810f89c-2b0b-418d-b5cc-e6f94fd1cde3.png"&gt;&lt;img alt="Query Perfrormance Insights" border="0" height="610" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/f19f52b9-15b1-4ba4-983a-c43ef3d13fd3.png" style="border: 0px currentColor; border-image: none; padding-top: 0px; padding-right: 0px; padding-left: 0px; display: inline; background-image: none;" title="Query Perfrormance Insights" width="596"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;We strongly recommend keeping Query Store ON. Thanks to an optimal default configuration and automatic retention policy, Query Store operates continuously using an insignificant part of the database space with a negligible performance overhead, typically in the range of 1-2%.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;The default configuration is automatically applied by Azure SQL Database. If you want to switch to a customized Query Store configuration, use &lt;a href="https://msdn.microsoft.com/en-us/library/bb522682.aspx" target="_blank"&gt;ALTER DATABASE with Query Store options&lt;/a&gt;. Also check out &lt;a href="https://msdn.microsoft.com/library/mt604821.aspx" target="_blank"&gt;Best Practices with the Query Store&lt;/a&gt; to learn how to choose optimal parameter values.&lt;/p&gt;&#xD;
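As a concrete illustration of such a customized configuration, the statement below is a sketch using the documented ALTER DATABASE Query Store options; the parameter values are examples only, not recommendations.

```sql
-- Sketch of a customized Query Store configuration.
-- Values shown are illustrative; see Best Practices with the
-- Query Store for guidance on choosing them.
ALTER DATABASE CURRENT
SET QUERY_STORE = ON
    (OPERATION_MODE = READ_WRITE,
     MAX_STORAGE_SIZE_MB = 1024,
     CLEANUP_POLICY = (STALE_QUERY_THRESHOLD_DAYS = 30),
     DATA_FLUSH_INTERVAL_SECONDS = 900,
     INTERVAL_LENGTH_MINUTES = 60);
```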
&#xD;
&lt;h2&gt;Next steps&lt;/h2&gt;&#xD;
&#xD;
&lt;p&gt;For more detailed information, check out the online documentation:&lt;/p&gt;&#xD;
&#xD;
&lt;ul&gt;&#xD;
	&lt;li&gt;&lt;a href="https://azure.microsoft.com/en-in/blog/query-store-a-flight-data-recorder-for-your-database/"&gt;Query Store: A flight data recorder for your database&lt;/a&gt;&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;a href="https://msdn.microsoft.com/en-us/library/mt614796.aspx"&gt;Query Store Usage Scenarios&lt;/a&gt;&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;a href="https://msdn.microsoft.com/library/dn817826.aspx"&gt;Monitoring Performance by Using the Query Store&lt;/a&gt;&lt;/li&gt;&#xD;
&lt;/ul&gt;&#xD;
</content:encoded>
      <comments>https://azure.microsoft.com/blog/query-store-on-is-the-new-default-for-azure-sql-database/#comments</comments>
      <link>https://azure.microsoft.com/blog/query-store-on-is-the-new-default-for-azure-sql-database/</link>
      <dc:creator>Borko Novakovic</dc:creator>
    </item>
    <item>
      <guid
        isPermaLink="false">project-bletchley-new-blockchain-partners-come-to-azure-marketplace</guid>
      <category>Announcements</category>
      <category>Developer</category>
      <category>Virtual Machines</category>
      <category>Database</category>
      <category>Blockchain</category>
      <title>Project Bletchley – New Blockchain partners come to Azure Marketplace</title>
      <description>In just the last couple of weeks, we have added two new partners and solutions to the growing blockchain ecosystem in the Azure Marketplace.</description>
      <pubDate>Wed, 07 Dec 2016 11:00:09 Z</pubDate>
      <content:encoded>&lt;p&gt;As promised, we continue to release early and release often in the blockchain space together with our partners.&amp;nbsp; I am pleased to be back to announce more great additions to our blockchain offering on Azure.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;In just the last couple of weeks, we have added two new partners and solutions to the growing blockchain ecosystem in the &lt;a href="https://azure.microsoft.com/en-us/marketplace/?term=blockchain"&gt;Azure Marketplace&lt;/a&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azure.microsoft.com/en-us/marketplace/partners/ethcore/parity/"&gt;Parity&lt;/a&gt; &amp;ndash; &lt;a href="https://ethcore.io/index.html"&gt;Ethcore&lt;/a&gt; recently published their high performance, low footprint, reliable Ethereum blockchain client, Parity, on Azure.&amp;nbsp; This offering simplifies setting up a new Parity node in the cloud, with little configuration from the user.&amp;nbsp; In a matter of minutes, a user can have a single node, private Ethereum network up and running for development and test.&amp;nbsp;&amp;nbsp;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/9ec3d6c4-ae3a-4541-8e00-b9875947a4ac.png"&gt;&lt;img alt="Parity" border="0" height="399" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/655b8f4b-2e3e-47e1-83bb-2358dee5d4d0.png" style="border: 0px currentColor; border-image: none; padding-top: 0px; padding-right: 0px; padding-left: 0px; margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" title="Parity" width="640"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azure.microsoft.com/en-us/marketplace/partners/blockstack/blockstack-core-v14/"&gt;Blockstack Core v14&lt;/a&gt; &amp;ndash; &lt;a href="https://blockstack.org/"&gt;Blockstack&lt;/a&gt; is building a new decentralized web of server-less applications where users can control their own data.&amp;nbsp; Applications run locally and utilize user-specific data stores as their backend to maintain decentralization and control.&amp;nbsp; Users can seamlessly deploy Blockstack Core nodes on Microsoft Azure.&amp;nbsp; Blockstack Core nodes provide the core functionality of the Blockstack stack, processing data from a standard blockchain layer to construct a global view of security and ownership mappings.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/1829c285-a158-4ef4-a00b-fa8c74b9c7cd.png"&gt;&lt;img alt="Blockstack" border="0" height="480" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/45699eb8-eda9-4ae7-baf8-b9fdcc5c2eab.png" style="border: 0px currentColor; border-image: none; padding-top: 0px; padding-right: 0px; padding-left: 0px; margin-right: auto; margin-left: auto; float: none; display: block; background-image: none;" title="Blockstack" width="617"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;In addition to our growing blockchain partner ecosystem, over the last few weeks we have focused on functionality to improve deployment resiliency for our own &lt;a href="https://aka.ms/microsoft-azure-blockchain-solution"&gt;consortium network blockchain solution&lt;/a&gt;, releasing several updates that should improve overall deployment success rates.&amp;nbsp; More new features are already in the works, so stay tuned for updates!&lt;/p&gt;&#xD;
</content:encoded>
      <comments>https://azure.microsoft.com/blog/project-bletchley-new-blockchain-partners-come-to-azure-marketplace/#comments</comments>
      <link>https://azure.microsoft.com/blog/project-bletchley-new-blockchain-partners-come-to-azure-marketplace/</link>
      <dc:creator>Christine Avanessians</dc:creator>
    </item>
    <item>
      <guid
        isPermaLink="false">new-price-performance-choices-for-azure-sql-database-elastic-pools</guid>
      <category>Announcements</category>
      <category>Updates</category>
      <title>New price-performance choices for Azure SQL Database elastic pools</title>
      <description>Azure SQL Database elastic pools provide a simple, cost-effective solution for managing the performance of multiple databases with unpredictable usage patterns. New price-performance choices for…</description>
      <pubDate>Wed, 07 Dec 2016 10:00:10 Z</pubDate>
      <content:encoded>&lt;p&gt;Azure SQL Database elastic pools provide a simple, cost-effective solution for managing the performance of multiple databases with unpredictable usage patterns. New price-performance choices for elastic pools provide even more cost effectiveness and greater scale than before.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;strong&gt;More cost effectiveness&lt;/strong&gt;&lt;br&gt;&#xD;
Now available are smaller elastic pool sizes and pools with higher database limits. These new choices lower the starting price for pools, lower the effective cost per database, and reduce price jumps between pool sizes.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;strong&gt;Greater scale&lt;/strong&gt;&lt;br&gt;&#xD;
Also, now available are larger sizes for Basic, Standard, and Premium pools, and higher eDTU limits per database for Premium pools. These new choices provide more storage and eDTU headroom for greater scale and the most demanding workloads.&lt;/p&gt;&#xD;
&#xD;
&lt;h2&gt;Highlights&lt;/h2&gt;&#xD;
&#xD;
&lt;table border="0" cellpadding="2" cellspacing="0" width="892"&gt;&#xD;
	&lt;tbody&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="200"&gt;&lt;br&gt;&#xD;
			More pool eDTU sizes&lt;/td&gt;&#xD;
			&lt;td valign="top" width="690"&gt;&#xD;
			&lt;ul&gt;&#xD;
				&lt;li&gt;New sizes range from 50 eDTUs for Basic and Standard pools up to 4000 eDTUs for Premium pools with additional sizing choices in between.&lt;/li&gt;&#xD;
			&lt;/ul&gt;&#xD;
			&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="200"&gt;More storage for Standard pools&lt;/td&gt;&#xD;
			&lt;td valign="top" width="690"&gt;&#xD;
			&lt;ul&gt;&#xD;
				&lt;li&gt;Up to 2.9 TB for 3000 eDTU Standard pools.&lt;/li&gt;&#xD;
			&lt;/ul&gt;&#xD;
			&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="200"&gt;Higher database limits per pool&lt;/td&gt;&#xD;
			&lt;td valign="top" width="690"&gt;&#xD;
			&lt;ul&gt;&#xD;
				&lt;li&gt;Up to 500 databases for Basic and Standard pools of at least 200 eDTUs.&lt;/li&gt;&#xD;
				&lt;li&gt;Up to 100 databases for Premium pools of at least 250 eDTUs.&lt;/li&gt;&#xD;
			&lt;/ul&gt;&#xD;
			&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="200"&gt;Higher eDTU limits per database for Premium pools&lt;/td&gt;&#xD;
			&lt;td valign="top" width="690"&gt;&#xD;
			&lt;ul&gt;&#xD;
				&lt;li&gt;Max eDTUs per database increase to 1750 eDTUs (P11 level) and 4000 eDTUs (P15 level) for the largest Premium pools.&lt;/li&gt;&#xD;
			&lt;/ul&gt;&#xD;
			&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
	&lt;/tbody&gt;&#xD;
&lt;/table&gt;&#xD;
&#xD;
&lt;h2&gt;Learn more&lt;/h2&gt;&#xD;
&#xD;
&lt;p&gt;To learn more about SQL Database elastic pools and these new choices, please visit the &lt;a href="https://azure.microsoft.com/en-us/documentation/articles/sql-database-elastic-pool"&gt;SQL Database elastic pool&lt;/a&gt; webpage.&amp;nbsp; And for pricing information, please visit the &lt;a href="https://azure.microsoft.com/en-us/pricing/details/sql-database/"&gt;SQL Database pricing&lt;/a&gt; webpage.&lt;/p&gt;&#xD;
</content:encoded>
      <comments>https://azure.microsoft.com/blog/new-price-performance-choices-for-azure-sql-database-elastic-pools/#comments</comments>
      <link>https://azure.microsoft.com/blog/new-price-performance-choices-for-azure-sql-database-elastic-pools/</link>
      <dc:creator>Morgan Oslake</dc:creator>
    </item>
    <item>
      <guid
        isPermaLink="false">sneak-peek-a-new-azure-cloud-console</guid>
      <category>Supportability</category>
      <category>Updates</category>
      <title>Sneak peek: A new Azure Cloud Console</title>
      <description>We built an integrated workflow enabling users to build their applications on Azure using graphical and command line tools, even on devices where command line tools aren't installed. </description>
      <pubDate>Wed, 07 Dec 2016 10:00:09 Z</pubDate>
      <content:encoded>&lt;p&gt;For a while now, I have been passionate about containers and how they are revolutionizing and truly delivering the promise of cloud native computing. However, as excited as I am about revolutionizing container compute with Azure, I&amp;rsquo;m equally passionate about user interfaces.&amp;nbsp; After all, cloud computing is useless if it can&amp;rsquo;t be accessed from a useful interface.&amp;nbsp; So, today, I&amp;rsquo;m excited to show you how we&amp;rsquo;re bringing these passions together in the new cloud console for the Azure portal.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;Traditional cloud user interfaces have been divided into either a web-based graphical interface or a command line terminal interface. Each of these interfaces provides its own utility, and different users prefer different interfaces for different tasks. However, most Azure users use both interfaces to manage their applications on Azure.&amp;nbsp; Much like developing code before integrated development environments like Visual Studio or Visual Studio Code, switching between these interfaces requires switching between applications, a context switch that slows users and makes it harder to accomplish their goals.&amp;nbsp; In some cases (for example, tablets and other mobile devices), a terminal interface may not even be available and a user may have to switch devices.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;To address these needs, we built an integrated workflow enabling users to build their applications on Azure using graphical and command line tools, even on devices where command line tools aren&amp;#39;t installed. Today, we&amp;#39;re giving you a sneak peek of this new cloud shell experience that we are adding into the Azure portal. As you can see from the video below, the shell is integrated into the portal so users can quickly drop into a command line experience while simultaneously viewing their cloud resources in the graphical web interface.&lt;/p&gt;&#xD;
&#xD;
&lt;h3&gt;Using Azure Cloud Console to deploy a VM&lt;/h3&gt;&#xD;
&#xD;
&lt;p&gt;&lt;iframe allowfullscreen="" frameborder="0" height="315" src="https://channel9.msdn.com/Blogs/Azure-Linux-Team/Using-Azure-Cloud-Console-to-deploy-a-VM/player" width="560"&gt;&lt;/iframe&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;&#xD;
&#xD;
&lt;h3&gt;Using Azure Cloud Console with GIT&lt;/h3&gt;&#xD;
&#xD;
&lt;p&gt;&lt;iframe allowfullscreen="" frameborder="0" height="315" src="https://channel9.msdn.com/Blogs/Azure-Linux-Team/Using-Azure-Cloud-Console-with-GIT/player" width="560"&gt;&lt;/iframe&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;The key features of this experience are:&lt;/p&gt;&#xD;
&#xD;
&lt;ul&gt;&#xD;
	&lt;li&gt;Automatic authentication to the command line tools from your existing web login&lt;/li&gt;&#xD;
	&lt;li&gt;All Azure command line tools, as well as relevant command line utilities pre-installed&lt;/li&gt;&#xD;
	&lt;li&gt;Personalized, persistent workspace that preserves your code, configuration and activity across cloud shell sessions.&lt;/li&gt;&#xD;
&lt;/ul&gt;&#xD;
&#xD;
&lt;p&gt;With a single click, you are dropped into a terminal with command line tools pre-configured with your existing Azure credentials.&amp;nbsp; This terminal is a fully featured environment, offering not only the Azure command line tools but also the standard editors and utilities you would expect.&amp;nbsp; Further, the cloud shell preserves context for you. When you save files to disk, they are persisted in Azure&amp;rsquo;s cloud so you can resume where you left off in your next cloud shell session, even if you are on a different device or network.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;So how does the cloud console relate to containers? Well, the shell itself is packaged as a container to provide a clean, consistent interface every time you launch a new session.&amp;nbsp; Of course, this is based on the container we&amp;#39;ve already built for the Azure CLI 2.0 command line tool, which you can use today on your own machine with:&lt;/p&gt;&#xD;
&#xD;
&lt;pre class="prettyprint"&gt;&#xD;
$ docker run -v ${HOME}:/root -it azuresdk/azure-cli-python:latest&lt;/pre&gt;&#xD;
&#xD;
&lt;p&gt;Going forward, we&amp;#39;re looking for a few hardy souls who are willing to test and provide feedback on this new console experience as we bring it to general availability over the next few months. If you are interested, please &lt;a href="https://forms.office.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR7e1BxBe77BKjoaBtSzldH1UMFYwTTFMVlhHU0cwN0k4VVFCNEMwU0pHQi4u"&gt;sign up&lt;/a&gt; and we&amp;#39;ll be in touch!&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;I&amp;#39;m super excited about how containers are revolutionizing compute on Azure, and especially excited about how we ourselves can use container technology to offer new, integrated interfaces for developing your applications on the Azure cloud.&lt;/p&gt;&#xD;
</content:encoded>
      <comments>https://azure.microsoft.com/blog/sneak-peek-a-new-azure-cloud-console/#comments</comments>
      <link>https://azure.microsoft.com/blog/sneak-peek-a-new-azure-cloud-console/</link>
      <dc:creator>Brendan Burns</dc:creator>
    </item>
    <item>
      <guid
        isPermaLink="false">new-features-in-azure-application-insights-metrics-explorer</guid>
      <category>Announcements</category>
      <category>IT Pro/DevOps</category>
      <category>Developer</category>
      <title>New features in Azure Application Insights Metrics Explorer</title>
      <description>Over the past weeks, Application Insights Metrics Explorer introduced several new features that allow more options for visualizing metrics, and simplify chart configuration. This includes new…</description>
      <pubDate>Wed, 07 Dec 2016 09:00:09 Z</pubDate>
      <content:encoded>&lt;p&gt;Over the past weeks, &lt;a href="https://docs.microsoft.com/en-us/azure/application-insights/app-insights-metrics-explorer"&gt;Application Insights Metrics Explorer&lt;/a&gt; introduced several new features that allow more options for visualizing metrics, and simplify chart configuration. This includes new percentage aggregation charts, pinning standard and custom grids to dashboards, ability to control over y-axis boundaries of the charts, and sorting basic and advanced configuration options in chart settings.&lt;/p&gt;&#xD;
&#xD;
&lt;h2&gt;Percentage aggregation charts&lt;/h2&gt;&#xD;
&#xD;
&lt;p&gt;In many cases the actual value of a segmented metric is not as important as how it compares to the other values in the group. For example, you might want to visualize the percentage of failed vs. successful requests, where they would sum up to 100%, instead of looking at the raw counts. Or you might want to see the percentage of HTTP requests handled by each server instance. To make this possible, we introduced a new percentage aggregation type, allowing users to switch between &amp;ldquo;Average&amp;rdquo; and &amp;ldquo;Average (%)&amp;rdquo;, &amp;ldquo;Sum&amp;rdquo; and &amp;ldquo;Sum (%)&amp;rdquo;, etc. Below is a sample chart based on the percentage aggregation:&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/5280f623-5a5d-48a7-bdf7-a5e57553fd68.png"&gt;&lt;img alt="image" border="0" height="330" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/d3fcd0be-ec70-458e-b7d0-822c94909236.png" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="image" width="667"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;To configure percentage aggregation, you need to turn on grouping, and select percentage aggregation from the Aggregation dropdown selector of chart settings:&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/f1d01d6d-37f5-4173-8297-4638349b2605.png"&gt;&lt;img alt="Percentage Aggregation and Grouping" border="0" height="628" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/3fef6982-c418-40b7-bafd-7be8d91274dd.png" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="Percentage Aggregation and Grouping" width="393"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;h2&gt;Hiding less commonly used advanced settings, unless the user really wants to see them&lt;/h2&gt;&#xD;
&#xD;
&lt;p&gt;Common feedback we heard from new Application Insights users is that there are too many options in the chart settings pane. Some of these options are quite advanced but critical for experienced users. Mixing basic and advanced options frustrated new users, who could not identify the essential settings and thus could not configure the charts the way they wanted. What did we do? A new checkbox on top of the chart configuration now allows filtering between basic and advanced settings. New users see a simpler view; as they become more proficient with Metrics Explorer, they can reveal the advanced settings by checking the box on top of the chart details dialog:&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/1ba37740-6767-4626-b583-4f0017b2f061.png"&gt;&lt;img alt="image" border="0" height="517" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/01ce05c1-cf5b-4825-b2c1-aa271ade5628.png" style="border-top: 0px; border-right: 0px; background-image: none; border-bottom: 0px; padding-top: 0px; padding-left: 0px; border-left: 0px; margin: 0px; display: inline; padding-right: 0px" title="image" width="331"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;h2&gt;Pinning Metrics Explorer grids to Azure dashboards&lt;/h2&gt;&#xD;
&#xD;
&lt;p&gt;In the past, you could pin metrics explorer charts to &lt;a href="https://docs.microsoft.com/en-us/azure/application-insights/app-insights-dashboards"&gt;dashboards&lt;/a&gt;. Now in addition to pinning charts, you can also pin grids. This small but very useful icon lets you customize your dashboards with those metrics that are better represented in a grid format:&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/77822b6a-e89e-4a33-916e-ff20ba8341be.png"&gt;&lt;img alt="image" border="0" height="353" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/4b377e25-2a96-4b70-9a14-4266dee248b5.png" style="border-top: 0px; border-right: 0px; background-image: none; border-bottom: 0px; padding-top: 0px; padding-left: 0px; border-left: 0px; margin: 0px; display: inline; padding-right: 0px" title="image" width="728"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;h2&gt;Ability to freeze lower and upper boundaries of the y-axis on the charts&lt;/h2&gt;&#xD;
&#xD;
&lt;p&gt;Freezing the y-axis of a chart becomes important when looking at small fluctuations of larger values. For example, when the rate of successful requests drops from 99.99% to 99.5%, it may represent a significant reduction in the quality of service, but from a charting perspective, noticing such a small numeric fluctuation would be difficult or even impossible. With the new option, you can freeze the lower boundary of the chart at 99%, which makes this small drop more apparent. Another example is a fluctuation in available memory, where the value will technically never reach 0, so fixing the range to a higher value may make drops in available memory easier to spot. The two charts below show the same metric, with and without a fixed y-axis boundary:&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/e93107a2-2c90-4005-bfc3-77d6350aa057.png"&gt;&lt;img alt="image" border="0" height="327" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/e177f8ca-0ec7-4860-9331-0d25cf016992.png" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="image" width="666"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;To freeze the y-axis boundaries, you need to check advanced settings and specify the desired range under the Y-axis range section of the chart details dialog:&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/d485dbc2-8fc9-48e5-9775-6945c289efa4.png"&gt;&lt;img alt="Y-axis range settings" border="0" height="801" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/19f60dff-79f4-49ff-9348-9b962fe32ee1.png" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="Y-axis range settings" width="398"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
</content:encoded>
      <comments>https://azure.microsoft.com/blog/new-features-in-azure-application-insights-metrics-explorer/#comments</comments>
      <link>https://azure.microsoft.com/blog/new-features-in-azure-application-insights-metrics-explorer/</link>
      <dc:creator>Vitaly Gorbenko</dc:creator>
    </item>
    <item>
      <guid
        isPermaLink="false">clustered-columnstore-index-in-azure-sql-database</guid>
      <category>Announcements</category>
      <category>Database</category>
      <title>Clustered Columnstore Index in Azure SQL Database</title>
      <description>Columnstore index is the preferred technology to run analytics queries in Azure SQL Database. We recently announced general availability of In-Memory technologies for all Premium databases.…</description>
      <pubDate>Tue, 06 Dec 2016 10:00:07 Z</pubDate>
      <content:encoded>&lt;p&gt;Columnstore index is the preferred technology to run analytics queries in Azure SQL Database. We recently &lt;a href="https://azure.microsoft.com/en-us/blog/azure-sql-database-in-memory-performance/"&gt;announced general availability&lt;/a&gt; of In-Memory technologies for all Premium databases. Similar to &lt;a href="https://azure.microsoft.com/en-us/blog/in-memory-oltp-in-azure-sql-database/"&gt;In-Memory OLTP&lt;/a&gt;, the columnstore index technology is available in Premium databases.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;The columnstore technology is available in two flavors: clustered columnstore index (CCI) for data mart analytics workloads and nonclustered columnstore index (NCCI) to run analytics queries on operational (i.e. OLTP) workloads. Please refer to &lt;a href="https://blogs.msdn.microsoft.com/sqlserverstorageengine/2016/07/18/columnstore-index-differences-between-clusterednonclustered-columnstore-index/"&gt;NCCI vs CCI&lt;/a&gt; for the differences between these two flavors of columnstore indexes. The columnstore index can speed up the performance of analytics queries up to 100x while significantly reducing the storage footprint. The data compression achieved depends on the schema and the data, but we see around 10x data compression on average when compared to rowstore with no compression. This blog focuses on analytics workloads using CCI; we will cover NCCI in a future blog.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;Clustered columnstore index is available in Azure SQL Database across all premium editions. However, it is not yet available on the Standard and Basic pricing tiers. Using this technology, you can lower storage costs and get similar or better query performance on lower premium tiers.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;The tables below show a typical analytics query with a multi-table join running on P1 and P15, both with and without a clustered columnstore index, as well as the storage savings achieved.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;strong&gt;Query Performance:&lt;/strong&gt; The key point to note below is that with a clustered columnstore index, the example query runs 5x faster on P1 than the same query running on P15 with rowstore, with no tuning.&amp;nbsp; This can significantly lower the cost you need to pay to meet your workload requirements.&lt;/p&gt;&#xD;
&#xD;
&lt;table border="1" cellpadding="2" cellspacing="0" width="999"&gt;&#xD;
	&lt;tbody&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="210"&gt;&lt;b&gt;Pricing Tier&lt;/b&gt;&lt;/td&gt;&#xD;
			&lt;td valign="top" width="265"&gt;&lt;b&gt;With Rowstore &lt;/b&gt;&lt;/td&gt;&#xD;
			&lt;td valign="top" width="260"&gt;&lt;b&gt;With Columnstore&lt;/b&gt;&lt;/td&gt;&#xD;
			&lt;td valign="top" width="262"&gt;&lt;b&gt;Performance Gains&lt;/b&gt;&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="209"&gt;P1&lt;/td&gt;&#xD;
			&lt;td valign="top" width="264"&gt;30.6 secs&lt;/td&gt;&#xD;
			&lt;td valign="top" width="261"&gt;4.2 secs&lt;/td&gt;&#xD;
			&lt;td valign="top" width="263"&gt;14x&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="208"&gt;P15&lt;/td&gt;&#xD;
			&lt;td valign="top" width="264"&gt;19.5 secs&lt;/td&gt;&#xD;
			&lt;td valign="top" width="262"&gt;0.319 secs&lt;/td&gt;&#xD;
			&lt;td valign="top" width="264"&gt;60x&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
	&lt;/tbody&gt;&#xD;
&lt;/table&gt;&#xD;
&#xD;
&lt;p&gt;&lt;strong&gt;Storage Size:&lt;/strong&gt; The storage savings with columnstore compared to PAGE-compressed and uncompressed tables are shown below. While the cost of storage is already included with Azure SQL Database, a smaller storage footprint can enable you to choose a lower tier. Note, this is generated test data, so the compression is lower than what one would typically see for customer workloads.&lt;/p&gt;&#xD;
&#xD;
&lt;table border="1" cellpadding="2" cellspacing="0" width="1685"&gt;&#xD;
	&lt;tbody&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="377"&gt;&lt;strong&gt;Number of Rows&lt;/strong&gt;&lt;/td&gt;&#xD;
			&lt;td valign="top" width="432"&gt;&lt;strong&gt;Size Rowstore (MB)&lt;/strong&gt;&lt;/td&gt;&#xD;
			&lt;td valign="top" width="438"&gt;&lt;strong&gt;Size columnstore (MB)&lt;/strong&gt;&lt;/td&gt;&#xD;
			&lt;td valign="top" width="436"&gt;&lt;strong&gt;Savings&lt;/strong&gt;&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="377"&gt;3626191&lt;/td&gt;&#xD;
			&lt;td valign="top" width="432"&gt;212 (PAGE compression)&lt;/td&gt;&#xD;
			&lt;td valign="top" width="438"&gt;120&lt;/td&gt;&#xD;
			&lt;td valign="top" width="436"&gt;1.8x&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="377"&gt;3626191&lt;/td&gt;&#xD;
			&lt;td valign="top" width="432"&gt;756 (NONE compression)&lt;/td&gt;&#xD;
			&lt;td valign="top" width="438"&gt;120MB&lt;/td&gt;&#xD;
			&lt;td valign="top" width="436"&gt;6.2x&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
	&lt;/tbody&gt;&#xD;
&lt;/table&gt;&#xD;
&#xD;
&lt;p&gt;The best part of columnstore index technology is that it does not require any changes to your application. All you need to do is create a columnstore index on your table(s), or replace an existing index with one.&lt;/p&gt;&#xD;
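For example, converting a table to clustered columnstore storage is a single statement. The sketch below uses the documented CREATE CLUSTERED COLUMNSTORE INDEX syntax; dbo.FactSales and the index name are hypothetical.

```sql
-- Sketch: convert a hypothetical fact table to clustered columnstore.
-- DROP_EXISTING = ON replaces the current clustered (rowstore) index
-- in place, so the table data is rebuilt in columnar format.
CREATE CLUSTERED COLUMNSTORE INDEX cci_FactSales
    ON dbo.FactSales
    WITH (DROP_EXISTING = ON);
```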
&#xD;
&lt;h2&gt;How does Columnstore Index work?&lt;/h2&gt;&#xD;
&#xD;
&lt;p&gt;As described earlier, the columnstore is just an index that stores a table&amp;#39;s data as columns, as shown below. Queries can continue to access the table with no changes required.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/15e271b1-9944-4894-be37-0d4b1bbbfe90.png"&gt;&lt;img alt="chart 1" border="0" height="268" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/73d0f49d-2cf4-424e-9570-050824cc2dda.png" style="border-left-width: 0px; border-right-width: 0px; background-image: none; border-bottom-width: 0px; padding-top: 0px; padding-left: 0px; display: inline; padding-right: 0px; border-top-width: 0px" title="chart 1" width="793"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;Columnstore index delivers significant data compression and query performance gains due to the following three key factors:&lt;/p&gt;&#xD;
&#xD;
&lt;ul&gt;&#xD;
	&lt;li&gt;&lt;strong&gt;Reduced IO and Storage:&lt;/strong&gt; Since data is stored as individual columns, it compresses very well because all values are drawn from the same domain (i.e. data type) and, in many cases, the values repeat or are similar. The compression achieved depends on the data distribution, but the typical compression we have seen is around 10x. This is significant because it enables you to reduce both the storage and the IO footprint of your database considerably.&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;strong&gt;Only referenced columns need to be fetched:&lt;/strong&gt; Most analytics queries fetch/process only a small set of columns. If you consider a typical &lt;a href="https://en.wikipedia.org/wiki/Star_schema"&gt;Star Schema&lt;/a&gt;, the FACT table is the one with the most rows and a large number of columns. With columnstore storage, SQL Server needs to fetch only the referenced columns, unlike rowstore where the full row must be fetched regardless of how many columns the query references. For example, consider a FACT table with 100 columns where an analytic query references only 5 of them. By fetching only the referenced columns, you can potentially reduce IO by 95%, under the simplifying assumption that all columns take the same storage. Note that this is on top of the roughly 10x data compression already provided by columnstore.&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;strong&gt;Efficient Data Processing:&lt;/strong&gt; SQL Server has an industry leading query engine for columnstore data to deliver up to 100x speed up in query performance. For details, please refer to &lt;a href="https://blogs.msdn.microsoft.com/sqlserverstorageengine/2016/03/14/columnstore-index-how-does-sql-server-delivers-industry-leading-performance-for-analytic-queries/"&gt;Speeding up Analytics Queries&lt;/a&gt;.&lt;/li&gt;&#xD;
&lt;/ul&gt;&#xD;
&#xD;
&lt;h2&gt;How do I create clustered columnstore index?&lt;/h2&gt;&#xD;
&#xD;
&lt;p&gt;Creating a clustered columnstore index is like creating any other index. For example, I can create a regular rowstore table as follows:&lt;/p&gt;&#xD;
&#xD;
&lt;table border="1" cellpadding="2" cellspacing="0" width="996"&gt;&#xD;
	&lt;tbody&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="128"&gt;CREATE TABLE ACCOUNT (&lt;/td&gt;&#xD;
			&lt;td valign="top" width="866"&gt;&amp;nbsp;&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="128"&gt;ACCOUNTKEY&lt;/td&gt;&#xD;
			&lt;td valign="top" width="866"&gt;INT NOT NULL,&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="128"&gt;ACCOUNTDESCRIPTION&lt;/td&gt;&#xD;
			&lt;td valign="top" width="866"&gt;NVARCHAR (50),&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="128"&gt;ACCOUNTTYPE&lt;/td&gt;&#xD;
			&lt;td valign="top" width="866"&gt;NVARCHAR (50),&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="128"&gt;ACCOUNTCODEALTERNATEKEY&lt;/td&gt;&#xD;
			&lt;td valign="top" width="866"&gt;INT)&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
	&lt;/tbody&gt;&#xD;
&lt;/table&gt;&#xD;
&#xD;
&lt;pre&gt;&#xD;
&#xD;
&amp;nbsp;&lt;/pre&gt;&#xD;
&#xD;
&lt;pre&gt;&#xD;
CREATE TABLE ACCOUNT (&#xD;
    ACCOUNTKEY              INT NOT NULL,&#xD;
    ACCOUNTDESCRIPTION      NVARCHAR (50),&#xD;
    ACCOUNTTYPE             NVARCHAR (50),&#xD;
    ACCOUNTCODEALTERNATEKEY INT)&lt;/pre&gt;&#xD;
&#xD;
&lt;p&gt;Any rows inserted into the table above are stored in rowstore format. Now, if you want to convert this table to store data in &amp;#39;columnstore&amp;#39; format, all you need to do is execute the following SQL statement:&lt;/p&gt;&#xD;
&#xD;
&lt;pre&gt;&#xD;
CREATE CLUSTERED COLUMNSTORE INDEX ACCOUNT_CI ON ACCOUNT&lt;/pre&gt;&#xD;
&#xD;
&lt;p&gt;If the rowstore table had a clustered BTREE index, then you can execute the following SQL statement instead:&lt;/p&gt;&#xD;
&#xD;
&lt;pre&gt;&#xD;
CREATE CLUSTERED COLUMNSTORE INDEX ACCOUNT_CI ON ACCOUNT WITH (DROP_EXISTING = ON)&lt;/pre&gt;&#xD;
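&#xD;
&lt;p&gt;If you want to confirm the storage savings on your own table, the sketch below is one way to do it (an addition to the original walkthrough; it uses the standard sp_spaceused procedure and a columnstore DMV available in Azure SQL Database):&lt;/p&gt;&#xD;
&#xD;
&lt;pre&gt;&#xD;
-- Total space used by the ACCOUNT table (data plus indexes)&#xD;
EXEC sp_spaceused 'ACCOUNT'&#xD;
&#xD;
-- State and size of each columnstore row group&#xD;
SELECT state_desc, total_rows, size_in_bytes&#xD;
FROM sys.dm_db_column_store_row_group_physical_stats&#xD;
WHERE object_id = OBJECT_ID('ACCOUNT')&lt;/pre&gt;&#xD;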
&#xD;
&lt;h2&gt;When and where should you use clustered columnstore Index?&lt;/h2&gt;&#xD;
&#xD;
&lt;p&gt;Clustered Columnstore index primarily targets analytics workloads. The table below shows the common scenarios that have been successfully deployed with this technology.&lt;/p&gt;&#xD;
&#xD;
&lt;table border="1" cellpadding="2" cellspacing="0" width="996"&gt;&#xD;
	&lt;tbody&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="321"&gt;&lt;strong&gt;Columnstore Option&lt;/strong&gt;&lt;/td&gt;&#xD;
			&lt;td valign="top" width="356"&gt;&lt;strong&gt;Workload&lt;/strong&gt;&lt;/td&gt;&#xD;
			&lt;td valign="top" width="317"&gt;&lt;strong&gt;Compression&lt;/strong&gt;&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="319"&gt;CCI (clustered columnstore index)&lt;/td&gt;&#xD;
			&lt;td valign="top" width="361"&gt;&#xD;
			&lt;ul&gt;&#xD;
				&lt;li&gt;&lt;strong&gt;Traditional DW workload with Star or Snowflake schema:&lt;/strong&gt; Commonly you enable CCI on the FACT table but keep DIMENSION tables in rowstore with PAGE compression.&lt;br&gt;&#xD;
				Additional consideration: also consider CCI for large dimension tables with &amp;gt; 1 million rows&lt;/li&gt;&#xD;
				&lt;li&gt;&lt;strong&gt;Insert-mostly workload:&lt;/strong&gt; Many workloads, such as IoT (Internet of Things), insert large volumes of data with minimal updates/deletes. These workloads can benefit from huge data compression as well as the speedup of analytic queries.&lt;/li&gt;&#xD;
			&lt;/ul&gt;&#xD;
			&lt;/td&gt;&#xD;
			&lt;td valign="top" width="315"&gt;&#xD;
			&lt;p&gt;10x on average&lt;/p&gt;&#xD;
			&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="317"&gt;CCI/NCI (with one or more nonclustered indexes)&lt;/td&gt;&#xD;
			&lt;td valign="top" width="366"&gt;&#xD;
			&lt;ul&gt;&#xD;
				&lt;li&gt;Similar to the scenarios listed for CCI, but where the workload additionally requires (a) PK/FK enforcement, (b) a significant number of queries with equality predicates or short range scans, where NCIs speed up query performance by avoiding full table scans, or (c) updates/deletes of rows, which can be located efficiently using NCIs.&lt;/li&gt;&#xD;
			&lt;/ul&gt;&#xD;
			&lt;/td&gt;&#xD;
			&lt;td valign="top" width="319"&gt;&#xD;
			&lt;p&gt;10x on average + additional storage for NCIs&lt;/p&gt;&#xD;
			&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
	&lt;/tbody&gt;&#xD;
&lt;/table&gt;&#xD;
&#xD;
&lt;h2&gt;Resources to get started&lt;/h2&gt;&#xD;
&#xD;
&lt;p&gt;For more details, please refer to the following:&lt;/p&gt;&#xD;
&#xD;
&lt;ul&gt;&#xD;
	&lt;li&gt;&lt;a href="https://docs.microsoft.com/en-us/azure/sql-database/sql-database-in-memory"&gt;Sample workload for columnstore index&lt;/a&gt;&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;a href="https://blogs.msdn.microsoft.com/sqlserverstorageengine/2016/10/04/columnstore-index-in-memory-analytics-i-e-columnstore-index-videos-from-ignite-2016/"&gt;Examples of production deployment of columnstore index&lt;/a&gt;&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;a href="https://blogs.msdn.microsoft.com/sqlserverstorageengine/tag/columnstore-index/"&gt;SQL Server Team&amp;#39;s blogs on columnstore index&lt;/a&gt;&lt;/li&gt;&#xD;
	&lt;li&gt;&lt;a href="https://msdn.microsoft.com/library/gg492088.aspx"&gt;MSDN documentation on columnstore index&lt;/a&gt;&lt;/li&gt;&#xD;
&lt;/ul&gt;&#xD;
</content:encoded>
      <comments>https://azure.microsoft.com/blog/clustered-columnstore-index-in-azure-sql-database/#comments</comments>
      <link>https://azure.microsoft.com/blog/clustered-columnstore-index-in-azure-sql-database/</link>
      <dc:creator>Sunil Agarwal</dc:creator>
    </item>
    <item>
      <guid
        isPermaLink="false">new-microsoft-azure-training-and-discounted-certifications</guid>
      <category>Announcements</category>
      <title>New Microsoft Azure training and discounted certifications</title>
      <description>Announcing today, we have three training offers that combine free access to our library of flexible online courses, discounts on our industry-standard Microsoft Certified Professional exams, and a discount on Linux certification offered through the Linux Foundation.</description>
      <pubDate>Tue, 06 Dec 2016 09:00:15 Z</pubDate>
      <content:encoded>&lt;p&gt;&lt;em&gt;This post is authored by Julia White, Corporate Vice President, Azure + Security Marketing.&lt;/em&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;As I recently discussed in my &lt;a href="http://azure.microsoft.com/en-us/blog/top-cloud-myths-of-2016/" target="_blank"&gt;Top Cloud Myths of 2016 blog post&lt;/a&gt;, the use of cloud technology has become mainstream. Over 90% of the Fortune 500 companies now use at least one of Microsoft&amp;rsquo;s enterprise-grade services, and more than 60% are using three or more Microsoft cloud technologies. Azure compute usage has more than doubled year-over-year, and across a variety of industries, companies like &lt;a href="https://customers.microsoft.com/en-us/story/rollsroycestory" target="_blank"&gt;Rolls Royce&lt;/a&gt;, &lt;a href="https://customers.microsoft.com/en-us/story/uber" target="_blank"&gt;Uber&lt;/a&gt;, and &lt;a href="https://customers.microsoft.com/en-us/story/celebrating-the-100th-birthday-of-the-coca-cola-bottle-with-cortana-intelligence" target="_blank"&gt;Coca-Cola&lt;/a&gt; are using Azure to transform their businesses.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;With this rapid adoption of Azure, we have consistently heard from developers and IT operators about the need for Azure training and certification to help ensure they have the latest and greatest information. With the rapid innovation cycle of Azure, it&amp;rsquo;s critical to stay up to date on the latest technology and methodologies.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;To make sure you have up-to-date technical skills and best practices using Azure, we are introducing new training resources and discounted access to certification. Announcing today, we have three training offers that combine &lt;strong&gt;free&lt;/strong&gt; access to our library of flexible online courses, discounts on our industry-standard Microsoft Certified Professional exams, and a discount on Linux certification offered through the Linux Foundation. In case you missed the recent news, &lt;a href="https://news.microsoft.com/2016/11/16/microsoft-contributes-to-open-ecosystem-by-joining-linux-foundation-and-welcoming-google-to-the-net-community/#sm.0000bx2du9wegemut1s1esho8on73" target="_blank"&gt;Microsoft joined the Linux Foundation&lt;/a&gt;, further demonstrating our commitment to open source for Azure, and as a company.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;We are offering a broad range of learning resources &amp;ndash; from a Massive Open Online Course, or MOOC, to full certification offerings. The MOOCs offer online videos, demos, labs, graded assessments, office hours, and more. When you complete a MOOC, you receive a digital certificate of completion as well as access to reduced-cost Microsoft Certified Professional exams for formal certification.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;Check out all the &lt;a href="https://partner.microsoft.com/azureskills" target="_blank"&gt;learning options for Azure&lt;/a&gt;, and read my colleague &lt;a href="https://blogs.partner.microsoft.com/mpn/new-cloud-trainings-for-next-generation-tech-professionals/" target="_blank"&gt;Gavriella Schuster&amp;rsquo;s blog post&lt;/a&gt; to dive into more details about our new resources and discounted certifications!&lt;/p&gt;&#xD;
</content:encoded>
      <comments>https://azure.microsoft.com/blog/new-microsoft-azure-training-and-discounted-certifications/#comments</comments>
      <link>https://azure.microsoft.com/blog/new-microsoft-azure-training-and-discounted-certifications/</link>
      <dc:creator>Microsoft Azure</dc:creator>
    </item>
    <item>
      <guid
        isPermaLink="false">telemetry-platform-features-in-azure-media-services</guid>
      <category>Media Services &amp; CDN</category>
      <title>Telemetry Platform Features in Azure Media Services</title>
      <description>We are excited to announce new Azure Media Services telemetry platform features, generally available through our new Telemetry API. </description>
      <pubDate>Tue, 06 Dec 2016 09:00:07 Z</pubDate>
      <content:encoded>&lt;p&gt;We are excited to announce new Azure Media Services (AMS) telemetry platform features, generally available through our new Telemetry API. Media Services telemetry allows you to monitor and measure the health of your services through a suite of telemetry data. Telemetry data is written to your Azure Storage account and can be processed and visualized using a wide array of data visualization tools.&lt;/p&gt;&#xD;
&#xD;
&lt;h2&gt;Consuming Telemetry Data&lt;/h2&gt;&#xD;
&#xD;
&lt;p&gt;This release includes telemetry metrics for Channel, Streaming Endpoint, and Archive entities. Telemetry data is written to an Azure Storage table in the storage account specified when configuring telemetry for your media services account. Telemetry data is stored in aggregate in a table, &amp;ldquo;TelemetryMetricsYYYYMMDD,&amp;rdquo; for each day&amp;rsquo;s data (where &amp;ldquo;YYYYMMDD&amp;rdquo; denotes the date timestamp).&lt;/p&gt;&#xD;
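&#xD;
&lt;p&gt;For example, to locate the table that holds a given day&amp;#39;s telemetry you can compute the name directly. The helper below is a minimal Python sketch (the function name is ours, not part of the service):&lt;/p&gt;&#xD;

```python
from datetime import date

def telemetry_table_name(day):
    """Daily telemetry table: 'TelemetryMetrics' plus a YYYYMMDD date stamp."""
    return "TelemetryMetrics" + day.strftime("%Y%m%d")

# Table holding telemetry for 9 September 2016
print(telemetry_table_name(date(2016, 9, 9)))  # TelemetryMetrics20160909
```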
&#xD;
&lt;p&gt;Each table entry contains a set of common fields and a record with a set of entity-specific fields. The entry identifying fields include the following:&lt;/p&gt;&#xD;
&#xD;
&lt;table border="0" cellpadding="2" cellspacing="0" width="2198"&gt;&#xD;
	&lt;tbody&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="260"&gt;&lt;strong&gt;Property&lt;/strong&gt;&lt;/td&gt;&#xD;
			&lt;td valign="top" width="895"&gt;&lt;strong&gt;Value&lt;/strong&gt;&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1041"&gt;&lt;strong&gt;Example&lt;/strong&gt;&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="260"&gt;PartitionKey&lt;/td&gt;&#xD;
			&lt;td valign="top" width="895"&gt;{Account ID}_{Entity ID}&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1041"&gt;e49bef329c29495f9b9570989682069d_64435281c50a4dd8ab7011cb0f4cdf66&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="260"&gt;RowKey&lt;/td&gt;&#xD;
			&lt;td valign="top" width="895"&gt;{Seconds to Midnight}_{Random Value}&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1041"&gt;01688_00199&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="260"&gt;Timestamp&lt;/td&gt;&#xD;
			&lt;td valign="top" width="895"&gt;The time at which the row entry was created&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1041"&gt;2016-09-09T22:43:42.241Z&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="260"&gt;Type&lt;/td&gt;&#xD;
			&lt;td valign="top" width="895"&gt;The type of the entity providing telemetry&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1041"&gt;Channel&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="260"&gt;Name&lt;/td&gt;&#xD;
			&lt;td valign="top" width="895"&gt;The name of the telemetry event&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1041"&gt;ChannelHeartbeat&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="260"&gt;ObservedTime&lt;/td&gt;&#xD;
			&lt;td valign="top" width="895"&gt;The time at which the event occurred&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1041"&gt;2016-09-09T22:42:36:924Z&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="260"&gt;ServiceID&lt;/td&gt;&#xD;
			&lt;td valign="top" width="895"&gt;Service ID&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1041"&gt;f70bd731-691d-41c6-8f2d-671d0bdc9c7e&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="260"&gt;Entity-specific Properties&lt;/td&gt;&#xD;
			&lt;td valign="top" width="895"&gt;{Record as defined by the event}&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1041"&gt;{Record}&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
	&lt;/tbody&gt;&#xD;
&lt;/table&gt;&#xD;
&#xD;
&lt;p&gt;The account ID is included in the partition key to simplify workflows where multiple media services accounts are writing data to the same storage account. The row key starts with the number of seconds to midnight to allow &lt;em&gt;top n&lt;/em&gt; style data queries within a partition (see the &lt;a href="https://docs.microsoft.com/en-us/azure/storage/storage-table-design-guide#log-tail-pattern" target="_blank"&gt;log tail table design pattern&lt;/a&gt; for more information). The observed timestamp is an approximate measure provided by the entity reporting telemetry.&lt;/p&gt;&#xD;
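&#xD;
&lt;p&gt;The key construction just described can be sketched in Python (the helper names are ours; the zero-padding widths are inferred from the example values in the table above):&lt;/p&gt;&#xD;

```python
from datetime import datetime

def partition_key(account_id, entity_id):
    """PartitionKey: the account ID and entity ID joined by an underscore."""
    return account_id + "_" + entity_id

def row_key_prefix(observed):
    """RowKey prefix: seconds remaining until midnight, zero-padded to
    five digits, so the most recent events sort first (log tail pattern)."""
    midnight = observed.replace(hour=0, minute=0, second=0, microsecond=0)
    seconds_since_midnight = int((observed - midnight).total_seconds())
    return "%05d" % (86400 - seconds_since_midnight)

# An event observed at 23:31:52 lands 1688 seconds before midnight,
# matching the 01688 prefix of the example row key in the table
print(row_key_prefix(datetime(2016, 9, 9, 23, 31, 52)))  # 01688
```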
&#xD;
&lt;h3&gt;Entity-Specific Telemetry&lt;/h3&gt;&#xD;
&#xD;
&lt;p&gt;The data in each telemetry row represents an aggregation of telemetry events raised over an aggregation time window, listed below. Each entity pushes telemetry with the following frequencies:&lt;/p&gt;&#xD;
&#xD;
&lt;ul&gt;&#xD;
	&lt;li&gt;Channels: Every 60 seconds&lt;/li&gt;&#xD;
	&lt;li&gt;Streaming Endpoints: Every 30 seconds&lt;/li&gt;&#xD;
	&lt;li&gt;Archive: Every 60 seconds&lt;/li&gt;&#xD;
&lt;/ul&gt;&#xD;
&#xD;
&lt;p&gt;Below is the schema description for channels, streaming endpoints, and archive entities.&lt;/p&gt;&#xD;
&#xD;
&lt;h4&gt;Channels&lt;/h4&gt;&#xD;
&#xD;
&lt;table border="0" cellpadding="2" cellspacing="0" width="2191"&gt;&#xD;
	&lt;tbody&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;&lt;strong&gt;Property&lt;/strong&gt;&lt;/td&gt;&#xD;
			&lt;td valign="top" width="755"&gt;&lt;strong&gt;Value&lt;/strong&gt;&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1201"&gt;&lt;strong&gt;Example&lt;/strong&gt;&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;TrackType&lt;/td&gt;&#xD;
			&lt;td valign="top" width="755"&gt;Type of track&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1201"&gt;video&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;TrackName&lt;/td&gt;&#xD;
			&lt;td valign="top" width="755"&gt;Name of the track&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1201"&gt;video&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;Bitrate&lt;/td&gt;&#xD;
			&lt;td valign="top" width="755"&gt;Expected bitrate of the track&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1201"&gt;785,000&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;IncomingBitrate&lt;/td&gt;&#xD;
			&lt;td valign="top" width="755"&gt;Incoming bitrate of the track&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1201"&gt;784,548&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;OverlapCount&lt;/td&gt;&#xD;
			&lt;td valign="top" width="755"&gt;Number of overlapping fragments received in ingest&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1201"&gt;0&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;DiscontinuityCount&lt;/td&gt;&#xD;
			&lt;td valign="top" width="755"&gt;Number of discontinuities detected in ingest&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1201"&gt;0&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;LastTimestamp&lt;/td&gt;&#xD;
			&lt;td valign="top" width="755"&gt;Last ingested data timestamp&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1201"&gt;1800488800&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;NonincreasingCount&lt;/td&gt;&#xD;
			&lt;td valign="top" width="755"&gt;Count of fragments discarded due to non-increasing timestamp&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1201"&gt;0&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;UnalignedKeyFrames&lt;/td&gt;&#xD;
			&lt;td valign="top" width="755"&gt;Boolean on whether we received fragments where key frames are not aligned&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1201"&gt;False&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;UnalignedPresentationTIme&lt;/td&gt;&#xD;
			&lt;td valign="top" width="755"&gt;Boolean on whether we received fragments where presentation time is not aligned&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1201"&gt;False&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;UnexpectedBitrate&lt;/td&gt;&#xD;
			&lt;td valign="top" width="755"&gt;Boolean on whether the IncomingBitrate and Bitrate differ by more than 50% or if IncomingBitrate for an audio or video track is less than 40 kbps if Bitrate is 0&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1201"&gt;False&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;Healthy&lt;/td&gt;&#xD;
			&lt;td valign="top" width="755"&gt;Boolean on whether OverlapCount, DiscontinuityCount, NonincreasingCount, UnalignedKeyFrames, UnalignedPresentationTime, and UnexpectedBitrate are all zero or false&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1201"&gt;True&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;CustomAttributes&lt;/td&gt;&#xD;
			&lt;td valign="top" width="755"&gt;Placeholder for custom attributes&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1201"&gt;&amp;nbsp;&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
	&lt;/tbody&gt;&#xD;
&lt;/table&gt;&#xD;
&#xD;
&lt;h4&gt;Streaming Endpoints&lt;/h4&gt;&#xD;
&#xD;
&lt;table border="0" cellpadding="2" cellspacing="0" width="2196"&gt;&#xD;
	&lt;tbody&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;&lt;strong&gt;Property&lt;/strong&gt;&lt;/td&gt;&#xD;
			&lt;td valign="top" width="758"&gt;&lt;strong&gt;Value&lt;/strong&gt;&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1203"&gt;&lt;strong&gt;Example&lt;/strong&gt;&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;HostName&lt;/td&gt;&#xD;
			&lt;td valign="top" width="758"&gt;Hostname of the endpoint&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1203"&gt;builddemoserver.origin.mediaservices.windows.net&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;StatusCode&lt;/td&gt;&#xD;
			&lt;td valign="top" width="758"&gt;HTTP status code&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1203"&gt;200&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;ResultCode&lt;/td&gt;&#xD;
			&lt;td valign="top" width="758"&gt;HTTP result code detail&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1203"&gt;S_OK&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;RequestCount&lt;/td&gt;&#xD;
			&lt;td valign="top" width="758"&gt;Total requests received within the last aggregation window&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1203"&gt;3&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;BytesSent&lt;/td&gt;&#xD;
			&lt;td valign="top" width="758"&gt;Total bytes sent within the last aggregation window&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1203"&gt;2,987,358&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;ServerLatency&lt;/td&gt;&#xD;
			&lt;td valign="top" width="758"&gt;Average server latency including storage in milliseconds&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1203"&gt;130&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="233"&gt;E2ELatency&lt;/td&gt;&#xD;
			&lt;td valign="top" width="758"&gt;Average end-to-end latency in milliseconds&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1203"&gt;250&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
	&lt;/tbody&gt;&#xD;
&lt;/table&gt;&#xD;
&#xD;
&lt;h4&gt;Archive&lt;/h4&gt;&#xD;
&#xD;
&lt;table border="0" cellpadding="2" cellspacing="0" width="2206"&gt;&#xD;
	&lt;tbody&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="230"&gt;&lt;strong&gt;Property&lt;/strong&gt;&lt;/td&gt;&#xD;
			&lt;td valign="top" width="763"&gt;&lt;strong&gt;Value&lt;/strong&gt;&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1211"&gt;&lt;strong&gt;Example&lt;/strong&gt;&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="230"&gt;ManifestName&lt;/td&gt;&#xD;
			&lt;td valign="top" width="763"&gt;Name of the manifest&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1211"&gt;asset-eb149703-ed0a-483c-91c4-e4066e72cce3/a0a5cfbf-71ec-4bd2-8c01-a92a2b38c9ba.ism&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="230"&gt;TrackName&lt;/td&gt;&#xD;
			&lt;td valign="top" width="763"&gt;Name of the track&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1211"&gt;audio_1&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="230"&gt;TrackType&lt;/td&gt;&#xD;
			&lt;td valign="top" width="763"&gt;Type of track&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1211"&gt;audio&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="230"&gt;Bitrate&lt;/td&gt;&#xD;
			&lt;td valign="top" width="763"&gt;Track bitrate&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1211"&gt;785,000&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="230"&gt;Healthy&lt;/td&gt;&#xD;
			&lt;td valign="top" width="763"&gt;Boolean on whether there were no discarded fragments or archive acquisition errors in storage&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1211"&gt;True&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
		&lt;tr&gt;&#xD;
			&lt;td valign="top" width="230"&gt;CustomAttributes&lt;/td&gt;&#xD;
			&lt;td valign="top" width="763"&gt;Placeholder for custom attributes&lt;/td&gt;&#xD;
			&lt;td valign="top" width="1211"&gt;&amp;nbsp;&lt;/td&gt;&#xD;
		&lt;/tr&gt;&#xD;
	&lt;/tbody&gt;&#xD;
&lt;/table&gt;&#xD;
&#xD;
&lt;p&gt;The schema above is designed to give good performance within the limits of Azure Table Storage:&lt;/p&gt;&#xD;
&#xD;
&lt;ul&gt;&#xD;
	&lt;li&gt;Data is partitioned by account ID and service ID to allow telemetry from each service to be queried independently&lt;/li&gt;&#xD;
	&lt;li&gt;Partitions contain the date to give a reasonable upper bound on the partition size&lt;/li&gt;&#xD;
	&lt;li&gt;Row keys are in reverse time order to allow the most recent telemetry items to be queried for a given service&lt;/li&gt;&#xD;
&lt;/ul&gt;&#xD;
&#xD;
&lt;p&gt;This should allow many of the common queries to be efficient:&lt;/p&gt;&#xD;
&#xD;
&lt;ul&gt;&#xD;
	&lt;li&gt;Parallel, independent downloading of data for separate services&lt;/li&gt;&#xD;
	&lt;li&gt;Retrieving all data for a given service in a date range&lt;/li&gt;&#xD;
	&lt;li&gt;Retrieving the most recent data for a service&lt;/li&gt;&#xD;
&lt;/ul&gt;&#xD;
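&#xD;
&lt;p&gt;For instance, the &amp;ldquo;most recent data for a service&amp;rdquo; query reduces to a single-partition filter; because row keys are in reverse time order, the first rows returned are the newest. The filter string below uses the standard Azure Table storage OData syntax (the helper itself is a hypothetical sketch):&lt;/p&gt;&#xD;

```python
def latest_telemetry_filter(account_id, entity_id):
    """OData filter selecting one entity's partition; the first rows
    returned under the reverse-time RowKey order are the most recent."""
    return "PartitionKey eq '%s_%s'" % (account_id, entity_id)

print(latest_telemetry_filter("acct", "chan1"))  # PartitionKey eq 'acct_chan1'
```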
&#xD;
&lt;h2&gt;Visualizing Telemetry Data&lt;/h2&gt;&#xD;
&#xD;
&lt;p&gt;Your Azure Storage account can export data to data visualization tools such as PowerBI, Application Insights, and AMS Live Monitoring Dashboard, among many others. Below is an example of how this data can be imported into and visualized directly with &lt;a href="http://powerbi.com/" target="_blank"&gt;PowerBI&lt;/a&gt;.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;First, select the telemetry tables for the days you are interested in visualizing. To import your data, select Microsoft Azure Table Storage from the Get Data menu, enter your storage account credentials, and select the tables to import. Next, plot the health of your channel and archive entities as a line graph by selecting &lt;strong&gt;Content.ObservedTime&lt;/strong&gt; as the time axis and &lt;strong&gt;Average of Content.Healthy&lt;/strong&gt; (cast to a decimal representation) as the value. To plot channel health and archive health separately, add a filter on &lt;strong&gt;Content.Name&lt;/strong&gt;. This visualization illustrates entity health, where a value of 1 represents perfectly healthy and a value of 0 represents perfectly unhealthy at the given time. Below are examples of channel and archive health plotted over time.&lt;/p&gt;&#xD;
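&#xD;
&lt;p&gt;The aggregation behind that line graph is simply the mean of the Healthy flags cast to 0/1, which you can reproduce outside PowerBI as well (a minimal sketch):&lt;/p&gt;&#xD;

```python
def health_ratio(healthy_flags):
    """Mean of Healthy booleans cast to 0/1: 1.0 is fully healthy,
    0.0 fully unhealthy over the sampled window."""
    return sum(1 if flag else 0 for flag in healthy_flags) / len(healthy_flags)

print(health_ratio([True, True, True, False]))  # 0.75
```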
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/928c75bf-09dd-485c-be07-e6e8758af7cf.png"&gt;&lt;img alt="Health Over Time" border="0" height="1558" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/5239a9c0-bed0-4bec-8b30-de9cb0909ce8.png" style="border-width: 0px; padding-top: 0px; padding-right: 0px; padding-left: 0px; display: inline; background-image: none;" title="Health Over Time" width="2675"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;To visualize the overall health of your services, collapse the time dimension by plotting the data as a pie chart. Selecting &lt;strong&gt;Content.Healthy&lt;/strong&gt; as the legend and &lt;strong&gt;Count of Content.Healthy&lt;/strong&gt; as the value produces the visualization below.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;&lt;a href="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/08028e8f-13a8-41e2-bf29-5a8ba35ce5f7.png"&gt;&lt;img alt="Overall Health Percentage" border="0" height="1560" src="https://azurecomcdn.azureedge.net/mediahandler/acomblog/media/Default/blog/2369a90a-66a7-4eda-9ba0-120fa5449365.png" style="border-width: 0px; padding-top: 0px; padding-right: 0px; padding-left: 0px; display: inline; background-image: none;" title="Overall Health Percentage" width="2682"&gt;&lt;/a&gt;&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;We&amp;rsquo;re excited to share these telemetry features and hope they provide useful information about the health of your Azure media services.&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;Thanks,&lt;/p&gt;&#xD;
&#xD;
&lt;p&gt;Azure Media Services Streaming Team&lt;/p&gt;&#xD;
</content:encoded>
      <comments>https://azure.microsoft.com/blog/telemetry-platform-features-in-azure-media-services/#comments</comments>
      <link>https://azure.microsoft.com/blog/telemetry-platform-features-in-azure-media-services/</link>
      <dc:creator>Dwyane George</dc:creator>
    </item>
  </channel>
</rss>