We are excited to announce the release of Windows Azure Storage Analytics. This feature gives developers and operations teams the ability to track, analyze, and debug their usage of Windows Azure Storage (Blobs, Tables and Queues). You can use this data to improve the design of your applications and their access patterns to Windows Azure Storage. Analytics data consists of:
- Logging: a trace of executed requests for Blobs, Tables and Queues
- Metrics: a summary of key capacity and request statistics for Blobs, Tables and Queues
Logging provides a trace of all executed requests for your storage accounts as block blobs in a special container called $logs. Each log entry in the blob corresponds to a request made to the service and contains information such as the request ID, request URL, HTTP status of the request, requester account name, owner account name, server-side latency, end-to-end (E2E) latency, and the source IP address of the request.
This data now empowers you to analyze your requests much more closely. It allows you to run the following types of analysis:
- How many anonymous requests is my application seeing from a given range of IP addresses?
- Which containers are being accessed the most?
- How many times is a particular SAS URL being accessed and how?
- Who issued the request to delete a container?
- For a slow request, where is the time being spent?
- I got a network error; did the request reach the server?
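Each log entry is a delimited text record, so questions like these can be answered by downloading the $logs blobs and parsing them. Here is a minimal Python sketch; the field names and the sample line are illustrative, and the semicolon-delimited v1.0 field order is an assumption that should be checked against the Storage Analytics log format documentation:

```python
import csv
import io

# Field names for the leading columns of a v1.0 log entry; this order
# is an assumption and should be verified against the official docs.
FIELDS = [
    "version", "request_start_time", "operation_type", "request_status",
    "http_status_code", "e2e_latency_ms", "server_latency_ms",
    "authentication_type", "requester_account", "owner_account",
    "service_type", "request_url",
]

def parse_log_entry(line):
    """Parse one $logs entry; csv handles quoted fields (such as the
    request URL) that may themselves contain semicolons."""
    parts = next(csv.reader(io.StringIO(line), delimiter=";", quotechar='"'))
    return dict(zip(FIELDS, parts))

# A hypothetical anonymous GetBlob entry, truncated to the fields above.
sample = ('1.0;2011-08-09T18:52:40Z;GetBlob;AnonymousSuccess;200;18;10;'
          'anonymous;;myaccount;blob;'
          '"https://myaccount.blob.core.windows.net/thumbnails/lake.jpg"')
entry = parse_log_entry(sample)
```

Grouping parsed entries by requester account, request URL, or source IP then answers the questions above with ordinary dictionary counting.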
Metrics provide a summary of key statistics for Blobs, Tables and Queues for a storage account. The statistics can be categorized as:
- Request information: Provides hourly aggregates of the number of requests, average server-side latency, average E2E latency, average bandwidth, total successful requests, total failures, and more. These request aggregates are provided at the service level and per API for the APIs requested in that hour. This is available for the Blob, Table and Queue services.
- Capacity information: Provides daily statistics for the space consumed by the service, the number of containers, and the number of objects stored in the service. Note that this is currently provided only for the Windows Azure Blob service.
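To illustrate the kind of hourly rollup the request metrics provide, this small Python sketch aggregates hypothetical per-request records (the field names are invented for illustration) into the same sorts of statistics:

```python
from statistics import mean

# Hypothetical per-request records; in practice these would come from
# parsed $logs entries for one hour.
requests = [
    {"api": "GetBlob", "success": True,  "server_ms": 12, "e2e_ms": 40},
    {"api": "GetBlob", "success": True,  "server_ms": 8,  "e2e_ms": 25},
    {"api": "PutBlob", "success": False, "server_ms": 30, "e2e_ms": 90},
]

def hourly_summary(recs):
    """Aggregate per-request records into the kinds of statistics an
    hourly metrics row reports: counts, failures, average latencies."""
    return {
        "total_requests": len(recs),
        "total_failures": sum(1 for r in recs if not r["success"]),
        "avg_server_latency_ms": mean(r["server_ms"] for r in recs),
        "avg_e2e_latency_ms": mean(r["e2e_ms"] for r in recs),
    }

summary = hourly_summary(requests)
```

The real metrics tables also break these aggregates out per API (GetBlob, PutBlob, and so on), which is a straightforward group-by over the same records.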
All analytics log and metrics data is stored in your storage account and is accessible via the normal Blob and Table REST APIs. The logs and metrics can be accessed from a service running in Windows Azure or directly over the Internet from any application that can send and receive HTTP/HTTPS requests. You can opt in to store log data, metrics data, or both by invoking a REST API to turn the feature on or off at the per-service level. Once the feature is turned on, Windows Azure Storage stores analytics data in the storage account: log data is stored as Windows Azure Blobs in a special blob container, and metrics data is stored in special tables in Windows Azure Tables. To ease the management of this data, we have provided the ability to set a retention policy that will automatically clean up your analytics blob and table data.
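As a sketch of the opt-in step, the REST call is a Set Service Properties request whose XML body enables logging and metrics and sets the retention policy. The Python below only builds and validates that body; the element names follow the 2011-era analytics schema and the endpoint shown in the comment is illustrative, so verify both against the current REST documentation before use:

```python
import xml.etree.ElementTree as ET

def service_properties_xml(retention_days=7):
    """Build a Set Service Properties request body that turns on
    logging (reads, writes, deletes) and hourly metrics, each with a
    retention policy.  Element names are assumed from the analytics
    schema; check the REST docs for the authoritative format."""
    retention = ("<RetentionPolicy><Enabled>true</Enabled>"
                 f"<Days>{retention_days}</Days></RetentionPolicy>")
    return (
        '<?xml version="1.0" encoding="utf-8"?>'
        "<StorageServiceProperties>"
        "<Logging><Version>1.0</Version>"
        "<Delete>true</Delete><Read>true</Read><Write>true</Write>"
        f"{retention}</Logging>"
        "<Metrics><Version>1.0</Version><Enabled>true</Enabled>"
        f"<IncludeAPIs>true</IncludeAPIs>{retention}</Metrics>"
        "</StorageServiceProperties>"
    )

body = service_properties_xml(retention_days=7)
# This body would be sent as an authenticated PUT to the service
# properties endpoint, for example:
#   https://<account>.blob.core.windows.net/?restype=service&comp=properties
root = ET.fromstring(body.encode("utf-8"))
```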
Please see the following link for more information:
Windows Azure Storage Team
Announcing Windows Azure Storage Analytics – Windows Azure – Site Home – MSDN Blogs
A very good article showing how to access SQL Azure from Ruby.
Connecting to SQL Azure from Ruby Applications
This article discusses the methods of connecting to SQL Azure from the Ruby language. While this article discusses several gems that can be used to connect to SQL Azure, it is by no means a comprehensive listing of all gems that provide this functionality.
NOTE: The procedures listed in this article may not work on all operating systems due to the availability of ODBC drivers, differences in the compilation process, etc. Currently this article contains information based on the Windows 7 operating system and the Windows Azure web or worker role hosting environment.
Table of Contents
Connecting to SQL Azure from Ruby Applications – TechNet Articles – Home – TechNet Wiki
The Windows Azure Toolkit for Social Games allows you to quickly get started building new social games in Windows Azure. The social gaming market continues to grow and become more profitable, and eMarketer predicts it will increase to $1.32 billion in revenues by 2012 (up from $856 million in 2010). To help you quickly tap into this market, the toolkit includes accelerators, libraries, developer tools, and samples that you can use in your own .NET or HTML5 game. The toolkit also enables unique capabilities for social gaming prerequisites, such as storing user profiles, maintaining leaderboards, in-app purchasing, and so forth.
The Windows Azure platform provides game developers with on-demand compute, storage, content delivery and networking capabilities so that they can focus on development as opposed to operational concerns.
Windows Azure is a cloud-computing platform that lets you run applications and store data in the cloud. Instead of having to worry about building out the underlying infrastructure and managing the operating system, you can simply build your application and deploy it to Windows Azure. Windows Azure provides developers with on-demand compute, storage, networking, and content delivery capabilities. For more information about Windows Azure, visit the Windows Azure website. For developer focused training material, download the Windows Azure Platform Training Kit or view the online Windows Azure Platform Training Course.
The Windows Azure Toolkit for Social Games also comes with a new proof-of-concept game called Tankster from industry innovator Grant Skinner and his team at gskinner.com.
Tankster is built with HTML5 and comes complete with reusable server side code and documentation. It also supports a variety of social interactions including messaging, wall posts, and comments while player achievements and game stats are presented on a live leaderboard so gamers can interact with each other—what’s a social game without being able to talk trash?
Windows Azure Toolkit for Social Games
Found this good article on Windows Azure Storage architecture. A good read if you are planning to use storage in the cloud.
In this posting we provide an overview of the Windows Azure Storage architecture to give some understanding of how it works. Windows Azure Storage is a distributed storage software stack built completely by Microsoft for the cloud.
Before diving into the details of this post, please read the prior posting on Windows Azure Storage Abstractions and their Scalability Targets to get an understanding of the storage abstractions (Blobs, Tables and Queues) provided and the concept of partitions.
3 Layer Architecture
The storage access architecture has the following 3 fundamental layers:
- Front-End (FE) layer – This layer takes the incoming requests, authenticates and authorizes the requests, and then routes them to a partition server in the Partition Layer. The front-ends know what partition server to forward each request to, since each front-end server caches a Partition Map. The Partition Map keeps track of the partitions for the service being accessed (Blobs, Tables or Queues) and what partition server is controlling (serving) access to each partition in the system.
- Partition Layer – This layer manages the partitioning of all of the data objects in the system. As described in the prior posting, all objects have a partition key. An object belongs to a single partition, and each partition is served by only one partition server. This is the layer that manages what partition is served on what partition server. In addition, it provides automatic load balancing of partitions across the servers to meet the traffic needs of Blobs, Tables and Queues. A single partition server can serve many partitions.
- Distributed and replicated File System (DFS) Layer – This is the layer that actually stores the bits on disk and is in charge of distributing and replicating the data across many servers to keep it durable. A key concept to understand here is that the data is stored by the DFS layer, but all DFS servers are (and all data stored in the DFS layer is) accessible from any of the partition servers.
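The front-end's use of the cached Partition Map can be illustrated with a toy routing table: sorted partition-key range boundaries mapped to the server currently serving each range. All names and ranges below are invented for illustration:

```python
import bisect

# A toy partition map: each entry maps the low end of a partition key
# range to the partition server currently serving that range.
RANGE_STARTS = ["", "h", "p"]            # sorted low keys of each range
SERVERS      = ["ps01", "ps02", "ps03"]  # server owning each range

def route(partition_key):
    """Front-end routing: find the key range containing the partition
    key and return the partition server that serves it."""
    i = bisect.bisect_right(RANGE_STARTS, partition_key) - 1
    return SERVERS[i]

server = route("container1/blob.jpg")  # falls in ["", "h") -> ps01
```

When the Partition Master load-balances a partition to a different server, only the map entries change; the data itself stays in the DFS layer, which any partition server can reach.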
These layers and a high-level overview are shown in the figure below:
Here we can see that the Front-End layer takes incoming requests, and a given front-end server can talk to all of the partition servers it needs to in order to process the incoming requests. The partition layer consists of all of the partition servers, with a master system to perform the automatic load balancing (described below) and assignment of partitions. As shown in the figure, each partition server is assigned a set of object partitions (Blobs, Entities, Queues). The Partition Master constantly monitors the overall load on each partition server as well as the individual partitions, and uses this for load balancing. The lowest layer of the storage architecture is the Distributed File System layer, which stores and replicates the data, and all partition servers can access any of the DFS servers.
Windows Azure Storage Architecture Overview – Windows Azure Storage Team Blog – Site Home – MSDN Blogs
Are you a SQL Express user? We would like to understand your usage of SQL Express databases and your interest in the cloud through a very short and simple survey. Survey link: http://www.surveygizmo.com/s/580520/sql-express-customer-market-research
If you are not a SQL Express customer, please share the above link on your blog, Twitter, or Facebook and help us reach as many folks as you can.
Thanks a lot for your help.
Support for DAC will make import and export very easy for our customers. In every conversation I have with customers, the topic of import/export, backup, and disaster recovery comes up. This is a great next step for our feature set in this space.
Database Import and Export for SQL Azure
SQL Azure database users have a simpler way to archive SQL Azure and SQL Server databases, or to migrate on-premises SQL Server databases to SQL Azure. Import and export services through the Data-tier Application (DAC) framework make archival and migration much easier.
The import and export features provide the ability to retrieve and restore an entire database, including schema and data, in a single operation. If you want to archive or move your database between SQL Server versions (including SQL Azure), you can export a target database to a local export file that contains both the schema and the data. Once a database has been exported, you can bring it back with the new import feature. Refer to the FAQ at the end of this article for more information on supported SQL Server versions.
This release of the import and export feature is a Community Technology Preview (CTP) for upcoming, fully supported solutions for archival and migration scenarios. The DAC framework is a collection of database schema and data management services, which are strategic to database management in SQL Server and SQL Azure.
Microsoft SQL Server “Denali” Data-tier Application Framework v2.0 Feature Pack CTP
The DAC Framework simplifies the development, deployment, and management of data-tier applications (databases). Version 2.0 of the DAC framework expands the set of supported objects to provide full support for SQL Azure schema objects and data types across all DAC services: extract, deploy, and upgrade. In addition to expanding object support, DAC v2.0 adds two new DAC services, import and export, which let you deploy and extract both schema and data from a single file identified by the ".bacpac" extension.
For an introduction to and more information on the DAC Framework, see this whitepaper: http://msdn.microsoft.com/en-us/library/ff381683(SQL.100).aspx.
Database Import and Export for SQL Azure
Sanborn Building Model Products are an essential ingredient to virtual city implementations. Combining a strong visual appearance with a standards-compliant relational objects database, Sanborn Building Model offers building and geospatial feature accuracy for the most demanding of users. Data contains the information necessary to construct a digital city with hyper-local central business district building data.
Sanborn Building Model can provide valuable geospatial and visualization data for users and organizations involved with: State and major urban Fusion Centers, Local, State and Federal Operations Centers, Architectural / Engineering / Construction, Real-Estate and Land Development, Navigation Applications, Urban and Transportation Planning, Security Planning and Training, Public and private utility providers, and Gaming and internet companies. Sanborn Building Model provides a critical component of urban geospatial information analysis and modeling that can be used in production environments for GIS, CAD, LULC, planning, and other applications.
The LexisNexis Legal Communities provide access to the largest collection of Legal Blogs written by leading legal professionals, more than 1,600 Legal Podcasts featuring legal luminaries, information about Top Cases and Emerging Issues. The communities enable you to keep current with the latest Legal News, Issues and Trends.
Practice areas covered include: Bankruptcy, Copyright & Trademark Law, Emerging Issues, Environment and Climate Change Law, Estate Practice and Elder Law, Insurance, International & Foreign Law, Patent, Real Estate, Rule of Law, Tax Law, Torts, UCC, Commercial Contracts and Business and Workers Compensation. Professional areas of interest include: Law Firm Professionals, Corporate Legal Professionals, Government Information Professionals, Legal Business, Librarians and Information Professionals, Litigation Professionals, Law Students, and Paralegals.