HTTP Clients and Azure Blob Storage

Monday, March 31, 2014

Although Azure Blob storage has official client libraries for C#, Node, and many other environments, behind them all is a simple HTTP API. For example, with .NET you can use HttpClient to PUT a new blob into storage.

private readonly HttpClient _client = new HttpClient();
private readonly string _url = "https://pathtoblob";

public async Task WriteBlobStringAsync(string data)
{
    var content = new StringContent(data);
    // Azure requires this header when PUTting a new blob
    content.Headers.Add("x-ms-blob-type", "BlockBlob");

    var response = await _client.PutAsync(_url, content);
    response.EnsureSuccessStatusCode();
}
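
Reading the blob back is just as direct. Here is a minimal sketch of a GET, assuming the same _client and _url fields as above and a blob that holds text:

public async Task<string> ReadBlobStringAsync()
{
    // a plain GET against the blob URL returns the blob's content
    var response = await _client.GetAsync(_url);
    response.EnsureSuccessStatusCode();
    return await response.Content.ReadAsStringAsync();
}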

The same PUT can be issued from Node using the request package.

var request = require('request');

var options = {
    url: 'https://bitmask.blob.core.windows.net/test/readme.txt?…',
    body: 'Hello from Node',
    method: 'PUT',
    headers: {
        // the same required header as in the C# example
        'x-ms-blob-type': 'BlockBlob'
    }
};

var processResponse = function(error, response) {
    if (error) {
        console.log('error: ' + error);
    } else {
        console.log('response: ' + response.statusMessage);
        console.log('etag: ' + response.headers.etag);
    }
};

request(options, processResponse);

You could also manage an upload using Node's https module, but https is a low-level API, and you'd have to manage many small details that request handles for you, like setting the Content-Length header.

Uploads and Shared Access Signatures

Both of these code examples expect a URL with a Shared Access Signature (SAS) in the query string, so there is no need to know the storage account access keys or to construct authorization headers. Shared access signatures allow you to grant access to various storage features in a granular fashion. There is a good overview of SAS on the Azure web site.

As an example, the following C# code will create a SAS for a “readme.txt” file in the “test” storage container. The SAS is good for ~4 hours and grants someone read and write privileges on the readme.txt blob. Note that the readme.txt blob does not have to exist before the code creates the SAS.

// requires the Microsoft.WindowsAzure.Storage package
var storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("ConnectionNameInConfig"));
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("test");

var blob = container.GetBlockBlobReference("readme.txt");
var sasConstraints = new SharedAccessBlobPolicy
{
    // backdate the start time to tolerate clock skew between machines
    SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-15),
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(4),
    Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write
};
string sasBlobToken = blob.GetSharedAccessSignature(sasConstraints);
return blob.Uri + sasBlobToken;
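
Note that GetSharedAccessSignature returns a query string that already starts with a '?', so concatenating the token onto the blob URI produces a complete, usable URL.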

The SAS combined with the blob URL will look like:

https://bitmask.blob.core.windows.net/test/readme.txt?sv=2013-08-15&sr=b&sig=---&st=2014-03-15T15%3A27%3A14Z&se=2014-03-15T19%3A42%3A14Z&sp=rw
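
Reading the query string: sv is the storage service version the signature targets, sr=b says the resource is a blob, st and se are the start and expiry times, sp=rw encodes the read and write permissions, and sig is the HMAC signature computed from those values and the account key.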

The SAS can be handed to clients, who can then access storage directly instead of streaming data through your application as an intermediary. The example here is what Azure calls an ad hoc SAS, because all of the details, like the permissions and expiry time, are in the URL itself.
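
To make the pattern concrete, here is a rough sketch of a service endpoint that hands out SAS URLs, assuming ASP.NET Web API and the same storage code as above (the controller name and route are made up for illustration):

using System;
using System.Web.Http;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class SasController : ApiController
{
    // GET api/sas?blobName=readme.txt
    public string Get(string blobName)
    {
        var storageAccount = CloudStorageAccount.Parse(
            CloudConfigurationManager.GetSetting("ConnectionNameInConfig"));
        var container = storageAccount.CreateCloudBlobClient()
                                      .GetContainerReference("test");
        var blob = container.GetBlockBlobReference(blobName);

        var sasConstraints = new SharedAccessBlobPolicy
        {
            SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-15),
            SharedAccessExpiryTime = DateTime.UtcNow.AddHours(4),
            Permissions = SharedAccessBlobPermissions.Read |
                          SharedAccessBlobPermissions.Write
        };

        // the client PUTs directly against the returned URL, so the
        // upload never streams through the web application
        return blob.Uri + blob.GetSharedAccessSignature(sasConstraints);
    }
}

A browser or native client can then PUT against the returned URL exactly like the HttpClient and request examples at the top of this post.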


Comments
trailmax, Monday, March 31, 2014
Manual uploads work like that as long as you upload small amounts. As soon as you try to save more than 64 MB, complexity sky-rockets: you need to chunk the content into 4 MB blocks, track their uploads separately, and then glue the parts together on blob storage. I had a go at that a while ago, gave up, and moved back to the official client for C#: http://tech.trailmax.info/2014/01/uploading-large-files-to-azure-blob-storage-through-rest-api/
scott, Monday, March 31, 2014
Good point, thank you trailmax.