Monday, November 5, 2012

Remote Desktop to Ubuntu in Azure

After seeing a question about this on the "Connect" site, I decided to write a quick blog post on how to set up Remote Desktop (RDP) access to a Linux installation in Azure (Ubuntu, in this case).

After I wrote it, my employer decided to publish it on their site, so go here to view it:

Remote Desktop to Ubuntu in Windows Azure

Wednesday, June 27, 2012

Update Azure DB Firewall Rules with PowerShell

If you do Azure development and work from home like I do, you may find yourself repeatedly updating your SQL Azure firewall settings just so you can work with your database.

This is especially true if your crappy ISP repeatedly disconnects you throughout the day and assigns you new IP addresses. I don't want to disparage any company in particular, but I really hope there's some bandwidth competition on the horizon.

So, I wrote a PowerShell script to fix this problem. Now, I just run the Update-AzureDBFirewallRules function defined below.


Notice that the script checks the current rule and replaces it only if needed. If no change is needed, it will say "Current Rule is Correct".

In order to use this script, you will need to download the WAPPSCmdlets module from CodePlex and load it with "Import-Module".

You will also need to fill in the four variables at the top of the script. Note that the certificate you use must be a management certificate that has been uploaded to the Azure Portal.

#TODO: set these up as parameters (with defaults)
$ruleName = 'Rule Name' #Whatever you want to see in the Azure UI for this Rule
$server = '' # the server name (without the ".database.windows.net" on the end)
$subId = '' # The Subscription ID
$cert = (Get-Item cert:\CurrentUser\My\[YOUR KEY GOES HERE]) # the key is the "Thumbprint" - use "dir cert:\CurrentUser\My" to find yours

function RemoveRule()
{
    Write-Verbose 'Removing Firewall Rule'
    Remove-SqlAzureFirewallRule -ServerName $server -RuleName $ruleName -SubscriptionId $subId -Certificate $cert | Format-Table
}

function AddRule($myIP)
{
    Write-Output "Adding rule for IP: $myIP"
    New-SqlAzureFirewallRule -StartIpAddress $myIP -EndIpAddress $myIP -ServerName $server -RuleName $ruleName -SubscriptionId $subId -Certificate $cert | Format-Table
}

## Function to retrieve the external IP address.
## The address is returned as plain text by the
## automation endpoint of "www.whatismyip.com".

function Get-ExternalIP {
    $source = "http://automation.whatismyip.com/n09230945.asp"
    
    $client = new-object System.Net.WebClient
    
    # Add the user-agent header that whatismyip.com suggests (see http://www.whatismyip.com/faq/automation.asp)
    $client.Headers.Add('user-agent', 'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:12.0) Gecko/20100101 Firefox/12.0')
    
    # Trim any stray whitespace so the IP comparison below works
    $client.DownloadString($source).Trim()
}

function Update-AzureDBFirewallRules
{
    # Get the current external IP and the existing firewall rules
    $currentIp = Get-ExternalIP
    $rules = (Get-SqlAzureFirewallRules $server -SubscriptionId $subId -Certificate $cert)
    
    $curRule =  $rules | ? {$_.RuleName -eq $ruleName}
    
    $shouldAdd = $false

    if ($curRule -ne $null) {
        $ruleIp = $curRule.StartIpAddress
        if ($ruleIp -ne $currentIp) {

            Write-Warning "Current IP [$ruleIp] is incorrect, removing Rule"
            RemoveRule
            $shouldAdd = $true
        }
        else {
            Write-Output 'Current Rule is Correct'
        }
    }
    else {
        # No rule with this name exists yet, so create one
        Write-Output "No rule named [$ruleName] found"
        $shouldAdd = $true
    }

    if ($shouldAdd -eq $true)
    {
        AddRule $currentIp
    }
}

Hope this helps some of you out there...

Thursday, May 17, 2012

Developing for, but remaining decoupled from Azure

(This started out as a Yammer discussion post, but got obviously way too long for that, so I decided to blog it instead.)

A coworker had mentioned that it can be difficult to develop for Azure while maintaining independence from it by decoupling from direct dependencies. Having done this from the beginning of my almost two-year-old Azure project, I decided to describe how I did it.

Why decouple?


First of all, why do we need to do this?

Well, I have developed for a very wide range of platforms in my past, including dozens of flavors of *nix, tons of versions of Windows (3.0, 3.1, 3.11, 95, 98, 98 SE, NT 3.X, 2000, ME, XP, Vista, Win 7), QNX, PSOS, OS/2, Mac OS, etc.

Given that, I've always had issues with tying my code too tightly to a platform. It always seems to come back and bite you. Hard.

Developing/Debugging


If you have ever tried to develop and debug with the Azure Dev Fabric, you probably didn't even ask this question in the first place.

Working with the Dev Fabric is painful, especially on the Web development side. Instead of being able to edit the HTML and JavaScript files while you debug, each edit requires a rebuild and a redeploy.

If you're not familiar with this process, what is actually happening when you debug a "Cloud" app is a full deployment.  Visual Studio compiles the code and then, upon success, packages a mini Azure deployment and deploys it to the Dev Fabric.  This involves spinning up temporary Web sites on your IIS installation and spinning up what are effectively mini virtual machines.  So, in order to get any code changes over there, you must redeploy.

Bottom Line: no easy tweaks to your HTML, CSS or JavaScript while debugging.

Platform Independence


Sometimes, you may not have 100% buy-in from a client or a manager for running in Azure, so, in order to hedge your bets, you need to have the ability to run on Azure without depending on it, in case someone makes a decision to go the other way.  Maybe the client hires a new CIO and he wants to run on Amazon because Jeff Bezos is an old college buddy.  Crazier things have happened.

In my case, my project was, I believe, the first Azure project our company deployed into Production.   Thus, we were hedging our bets a bit by allowing it to run on a regular on-premise deployment.  This had the added benefit of allowing us to do pilot/demo deployments on-site for our client so they could view the site during early development.

What to decouple?


First off, there are three main areas that need to be decoupled in order to maintain your freedom: Configuration Settings, Images/Static Files, and Data (SQL, etc.).

CONFIGURATION SETTINGS


Regular .NET apps use AppSettings from either web.config or app.config for simple key/value settings.  For this scenario, a simple interface will do:

public interface IConfigurationSettingsProvider
{
    string GetSettingValue(string key);
}
Non-Azure Implementation
using System.Configuration;

public class ConfigurationManagerSettingsProvider
    : IConfigurationSettingsProvider
{
    public string GetSettingValue(string key)
    {
        return ConfigurationManager.AppSettings.Get(key);
    }
}
Azure Implementation
public class AzureRoleConfigurationSettingsProvider
    : IConfigurationSettingsProvider
{
    public string GetSettingValue(string key)
    {
        return RoleEnvironment.GetConfigurationSettingValue(key);
    }
}
As you can see, this isn't really all that complicated.
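
Wiring up the correct implementation is then just a composition detail. Here's a minimal sketch of one way to do it (the factory class is illustrative, not how we actually composed the app), keying off RoleEnvironment.IsAvailable at startup:

using Microsoft.WindowsAzure.ServiceRuntime;

public static class ConfigurationSettingsProviderFactory
{
    // RoleEnvironment.IsAvailable is true when running under Azure
    // (including the Dev Fabric), so we can pick a provider at startup.
    public static IConfigurationSettingsProvider Create()
    {
        if (RoleEnvironment.IsAvailable)
        {
            return new AzureRoleConfigurationSettingsProvider();
        }
        return new ConfigurationManagerSettingsProvider();
    }
}

If you use an IoC container, you would register the appropriate implementation there instead of using a factory.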

IMAGES


In my case, the images we're serving up are generally looked up by an "Item Key" and can refer either to a "normal" image size or a "thumbnail" image.  We also needed the ability to store the images and retrieve them.

The implementation of my IImageStorage interface also actually takes care of scaling the images to the appropriate sizes during upload.

The particulars of your situation may vary, but the pattern could be similar.
NOTE: I removed a few method overloads here (for instance, each "Save" method has an overload that takes a Stream, reads it in, and proxies to the byte[] version)

Interface
public enum ImageSize
{
    Thumbnail,
    Normal,
    Original
}
 
public interface IImageStorage 
{ 
    void SaveImage(int imageId, string pathInfo, byte[] imageBytes); 
 
    // removed overloads 
    // ... 
 
    Uri GetImageUri(int imageId, ImageSize size); 
 
    IEnumerable<int> GetAllImageIds();
 
    void DeleteImage(int imageId); 
} 


Non-Azure Implementation

In this case, I chose to make a "FileSystemImageStorage" class that uses a directory structure of

Root->
      Thumbs
      Normal
      Original

Each file name is the "ID" with ".jpg" on the end.

The URLs returned are either "file:///" or "data:" (base64-encoded) URLs, depending on the size.
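
To make that concrete, here is a condensed sketch of the URI lookup side (the helper class, the root parameter, and the folder names are illustrative, matching the layout above rather than our exact code):

using System;
using System.IO;

public static class ImagePathHelper
{
    // Maps an image ID and size to a file:/// URI under a root
    // folder laid out as Thumbs/Normal/Original.
    public static Uri GetImageUri(string root, int imageId, ImageSize size)
    {
        var folder = size == ImageSize.Thumbnail ? "Thumbs"
                   : size == ImageSize.Normal ? "Normal"
                   : "Original";

        // new Uri() turns a rooted local path into a file:/// URI
        return new Uri(Path.Combine(root, folder, imageId + ".jpg"));
    }
}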


Azure Implementation

For Azure, the pattern is very similar.  I created a "BlobStorageImageStorage" class that uses three Blob Storage containers called "normal", "thumbs", and "original", each of which holds the blobs for images of that size.

In this case, I chose to make the blob name the ID zero-padded to 10 digits in order to allow for long-term expansion.  So, ID 1234 would be "0000001234", and my URL for the thumbnail for 1234 is:

http://[accountname].blob.core.windows.net/thumbs/0000001234
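
The blob version of the lookup then boils down to string formatting. A sketch, assuming the container names above and the standard blob endpoint:

using System;

public static class BlobImagePathHelper
{
    // Builds the public blob URI for an image, assuming containers
    // named "thumbs", "normal" and "original" under one storage account.
    public static Uri GetImageUri(string accountName, int imageId, ImageSize size)
    {
        var container = size == ImageSize.Thumbnail ? "thumbs"
                      : size == ImageSize.Normal ? "normal"
                      : "original";

        // "D10" zero-pads the ID to 10 digits, e.g. 1234 -> "0000001234"
        return new Uri(string.Format(
            "http://{0}.blob.core.windows.net/{1}/{2:D10}",
            accountName, container, imageId));
    }
}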

DATA


For a "typical" application, data usually means SQL.  In the case of SQL, the only difference as far as your application is concerned is a connection string, which is easy enough to abstract for yourself.

If, on the other hand, you are using Azure Table Storage, you will need to have some sort of Repository interface that will persist TableServiceEntities for you.

In our application, we decided to allow the Entities to subclass TableServiceEntity, regardless of whether or not we're storing them in Azure.

Given that, we have this class (keep in mind that I've removed error checking, comments, and a few other minor details):

public class TableServiceEntityRepository<TEntity>
    where TEntity : TableServiceEntity
{
    readonly string _tableName;
    readonly TableServiceContext _context;

    public TableServiceEntityRepository(
        ITableServiceContextFactory ctxFactory)
    {
        _tableName = typeof(TEntity).Name;
        _context = ctxFactory.Create(_tableName);
    }

    public IQueryable<TEntity> Get(
        Specification<TEntity> whereClause = null,
        Func<IQueryable<TEntity>, IOrderedQueryable<TEntity>> orderBy = null)
    {
        IQueryable<TEntity> query =
            _context.CreateQuery<TEntity>(_tableName);

        if (whereClause != null)
        {
            query = query.Where(whereClause.IsSatisfiedBy());
        }

        if (orderBy != null)
        {
            return orderBy(query);
        }

        return query;
    }

    public virtual void Insert(TEntity entity)
    {
        _context.AddObject(_tableName, entity);
    }

    public virtual void Update(TEntity entity)
    {
        var desc = _context.GetEntityDescriptor(entity);
        if (desc == null || desc.State == EntityStates.Detached)
        {
            _context.AttachTo(_tableName, entity, "*");
        }
        _context.UpdateObject(entity);
    }

    public virtual void Delete(TEntity entity)
    {
        var desc = _context.GetEntityDescriptor(entity);
        if (desc == null || desc.State == EntityStates.Detached)
        {
            _context.AttachTo(_tableName, entity, "*");
        }
        _context.DeleteObject(entity);
    }

    public virtual void SaveChanges()
    {
        _context.SaveChanges();
    }
}

You may notice that this code uses a Specification for the "whereClause".  That is defined this way:

public abstract class Specification<TEntity>
{ 
    private Func<TEntity, bool> _compiledFunc = null; 
    private Func<TEntity, bool> CompiledFunc 
    { 
        get 
        { 
            if (_compiledFunc == null) 
                _compiledFunc = this.IsSatisfiedBy().Compile(); 
            return _compiledFunc; 
        } 
    } 
    public abstract Expression<Func<TEntity, bool>> IsSatisfiedBy();
 
    public bool IsSatisfiedBy(TEntity entity) 
    { 
        return this.CompiledFunc(entity); 
    } 
} 
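
To show how a concrete Specification looks in use, here is a hypothetical example (the entity and its IsActive property are made up for illustration):

using System;
using System.Linq.Expressions;

// Hypothetical entity, just to show the pattern
public class ItemEntity : TableServiceEntity
{
    public bool IsActive { get; set; }
}

public class ActiveItemSpecification : Specification<ItemEntity>
{
    public override Expression<Func<ItemEntity, bool>> IsSatisfiedBy()
    {
        return item => item.IsActive;
    }
}

You would then call repository.Get(new ActiveItemSpecification()) to retrieve only the active items.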

The non-Azure version of this just uses the TableServiceEntity's PartitionKey and RowKey fields for the folder name and file name, and uses binary serialization to persist the entities.  It's pretty simple, but wouldn't work for very large numbers of entities, of course.  You could, however, choose to do the same thing with SQL Server, using a SQL blob column for the serialized data, for a quick speed improvement.

Though we have other implementation abstractions in our software for things like Queues, this should give you a good kick start on how things are done.
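
For the curious, the queue abstraction follows the same shape as the others. A hypothetical sketch (not our actual interface):

public interface IQueueStorage
{
    void Enqueue(string queueName, byte[] message);
    byte[] Dequeue(string queueName);
}

The Azure implementation would sit on top of Azure Queue Storage, while a non-Azure one could use MSMQ or even a database table.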

Summary


Abstracting away specific hardware/platform dependencies is usually a Good Thing, in my opinion.  Stuff Happens.  Platforms break, die, become uncool, etc.  We shouldn't have to refactor significant portions of our applications in order to become more platform agnostic.

Feel free to comment or contact me for more details on how things like this can be done, or any questions on my implementation of the Specification pattern or anything at all :)

Friday, April 20, 2012

Solution to 0x80007002 error installing Zune 4.8


I just bought a Nokia Lumia 900, only to find that Microsoft took a page from Apple's book and required the installation of iTunes... er, Zune software in order to upgrade the phone. Ugh!

So, I downloaded the installer and, of course, got the 0x80007002 error installing the software. It told me:
The media for installation package couldn't be found. It might be incomplete or corrupt
Uh… What? It's a self-extracting EXE!

Here's what I tried:

  • Downloaded the "full version" (the first one just downloaded the installer to temp files) –  FAIL
  • Ran a Microsoft "FixIt" app that was supposed to .. you know.. Fix It? –  FAIL
  • Turned off the Firewall –  FAIL
  • Yelled at the Screen –  FAIL
  • "Unblocked" the EXE file and re-ran the installer –  FAIL
  • Decided to get insanely stupid and unzipped the file myself and ran the contained installer – Success!? (*facepalm*)

Whiskey. Tango. Foxtrot? 

Apparently their "self-extracting" EXE is not very good at the job. Can't blame them, since self-extracting zip files are such a recent innova--- Wait! That technology has existed for decades!!

Come on guys! Maybe you need a refresher course!
("It's all ball bearings nowadays!")

So.. if you run into this, just unzip the thing your damned self and call it a day!

Sunday, March 18, 2012

JsTrace 1.3 Released

Get it while it's hot!

I just added a few features to the JsTrace code:
  • New passthrough methods on the Trace object for the following: assert, clear, count, dir, dirxml, exception, group, groupCollapsed, groupEnd, profile, profileEnd, table, time, timeEnd, trace
    • Note that the Trace script does nothing but pass these through if they exist
  • Ability to get/set the default trace level with the Trace.defaultTraceLevel method
  • Ability to reset some or all of the currently configured trace levels to the defaults by calling the Trace.resetToDefault method.
  • New Jasmine tests to make sure it works (mostly)
On the MVC side, I added a feature to the JsTrace.MVC package:
  • The ability to force logging to the server to be synchronous in order to preserve the order of messages coming back to the server.
    • It was either that or adding message numbers, and this was quicker.
The NuGet packages have been updated as well.

Or browse the code here: http://jstrace.codeplex.com/

Happy debugging!

Saturday, March 17, 2012

Disable Minification with MVC 4 Bundles

I decided to dive in and play around with a bunch of new technologies today by writing a basic blog app (yeah, boring, I know) using the following tools:
    (TL;DR? Click Here)

Thursday, March 15, 2012

How To: Download Azure Publish Settings File

So, as of Azure SDK 1.6 (I think), Microsoft added the ability to import your Azure publishing settings into Visual Studio to make deployments easier.
Choosing that option launches your browser and has you log in to your Live account. Then it downloads the file automatically and imports it, saving you time.
I installed Azure Diagnostics Manager Version 2, and it has a neat feature that allows you to import a Publish Settings File.  So, off I went to try to download one.  I found NO documentation on how to get it.
After some spelunking through my browser history, I found this magical link:
https://windows.azure.com/download/publishprofile.aspx
This will allow you to download the file and do whatever you want with it, including import it into ADM.
WARNING: This file contains sensitive information that allows publishing to your Azure account.  I recommend you delete it after you import it into ADM.