Monday, November 19, 2012

Requesting Certificates from Different Windows OS Versions

Long ago I created an application that could create a certificate request, submit it to a web service that would submit it to an internal Enterprise CA, send the signed certificate back, and then accept the certificate on the client. Once done, the application would (if needed) install the SCOM (System Center Operations Manager) client and configure it for certificate authentication.

We needed this application to work on everything from Windows 2000 and up, and that turned out to be one hell of a lot of “messy” code, since Microsoft changed the API for handling certificates around Server 2008 (and now again in Windows 8/Server 2012).

So I had to do an update to the application and instantly started getting a ton of errors. After fixing those, the application suddenly didn’t work on old clients (2003 servers). After spending several hours on it I finally flipped and decided there had to be something smarter you could do. I came across this blog post and thought to myself that it seemed simple and elegant, and that if Bouncy Castle could help create a PFX file, it would be simple to import the certificate on all operating system versions. I cleaned up the code, removed a few bits and pieces, and added links to where I got the different pieces of code from.

Imports Org.BouncyCastle.Crypto.Generators
Imports Org.BouncyCastle.Crypto

Imports System.Security ' needed for Principal.* and Cryptography.* in the helper functions further down
Imports System.Security.AccessControl ' needed for CryptoKeyAccessRule in the helper functions further down
Imports System.Security.Cryptography.X509Certificates

Public Class CertificateRequest

#Region "Internals"

Private _CSR As String
Public ReadOnly Property CSR() As String
Get
Return _CSR
End Get
End Property

Private PrivateKeyPem As String
Private ackp As AsymmetricCipherKeyPair

#End Region

#Region "Constants"
Const CRYPT_EXPORTABLE = 1
Const AT_KEYEXCHANGE = 1
Const CERT_SYSTEM_STORE_LOCAL_MACHINE = &H20000

Const OID_MSTEMPLATE_v1 As String = "1.3.6.1.4.1.311.20.2"
Const OID_MSTEMPLATE_v2 As String = "1.3.6.1.4.1.311.20.7"

Private Const CC_DEFAULTCONFIG As Integer = 0
Private Const CC_UIPICKCONFIG As Integer = &H1
Private Const CR_IN_BASE64 As Integer = &H1
Private Const CR_IN_FORMATANY As Integer = 0
Private Const CR_IN_PKCS10 As Integer = &H100
Private Const CR_DISP_ISSUED As Integer = &H3
Private Const CR_DISP_UNDER_SUBMISSION As Integer = &H5
Private Const CR_OUT_BASE64 As Integer = &H1
Private Const CR_OUT_CHAIN As Integer = &H100

#End Region

''' <summary>
''' Create a new Certificate Request, with a 2048 bits key pair
''' </summary>
''' <param name="FQDN">Fully qualified domain name</param>
''' <returns>An instance of CertificateRequest containing private key and CSR</returns>
''' <remarks></remarks>
Shared Function CreateRequest(FQDN As String) As CertificateRequest
Return CreateRequest(FQDN, 2048)
End Function

''' <summary>
''' Create a new Certificate Request
''' </summary>
''' <param name="FQDN">Fully qualified domain name</param>
''' <param name="Strength">Strength of key in bits</param>
''' <returns>An instance of CertificateRequest containing private key and CSR</returns>
''' <remarks></remarks>
Shared Function CreateRequest(FQDN As String, Strength As Integer) As CertificateRequest
' http://stackoverflow.com/questions/949727/bouncycastle-rsaprivatekey-to-net-rsaprivatekey
'Key generation
Dim rkpg As New RsaKeyPairGenerator()
rkpg.Init(New KeyGenerationParameters(New Org.BouncyCastle.Security.SecureRandom(), Strength))
Dim ackp As AsymmetricCipherKeyPair = rkpg.GenerateKeyPair()

'Requested Certificate Name
Dim name As New Org.BouncyCastle.Asn1.X509.X509Name("CN=" & FQDN)
Dim csr As New Org.BouncyCastle.Pkcs.Pkcs10CertificationRequest("SHA1WITHRSA", name, ackp.[Public], Nothing, ackp.[Private])

'Convert BouncyCastle CSR to .PEM file.
Dim CSRPem As New System.Text.StringBuilder()
Dim CSRPemWriter As New Org.BouncyCastle.OpenSsl.PemWriter(New System.IO.StringWriter(CSRPem))
CSRPemWriter.WriteObject(csr)
CSRPemWriter.Writer.Flush()

'Convert BouncyCastle Private Key to .PEM file.
Dim PrivateKeyPem As New System.Text.StringBuilder()
Dim PrivateKeyPemWriter As New Org.BouncyCastle.OpenSsl.PemWriter(New System.IO.StringWriter(PrivateKeyPem))
PrivateKeyPemWriter.WriteObject(ackp.[Private])
PrivateKeyPemWriter.Writer.Flush()

Return New CertificateRequest With {.ackp = ackp, ._CSR = CSRPem.ToString(), .PrivateKeyPem = PrivateKeyPem.ToString()}
End Function

''' <summary>
''' Submit request to CA and get signed certificate back
''' </summary>
''' <param name="CA">CA to use, leaving blank will open default UI to select an CA</param>
''' <param name="Template">Certificate Template to use, can be blank if standalone CA, or template specefied in CSR. If specefied, will override what is specefied in CSR</param>
''' <param name="base64">CSR as Base54</param>
''' <returns>Signed Certificate as Base64</returns>
''' <remarks></remarks>
Public Function submitRequest(ByVal CA As String, Template As String, ByVal base64 As String) As String
' Create all the objects that will be required
Dim objCertConfig As CERTCLIENTLib.CCertConfig = New CERTCLIENTLib.CCertConfig()
Dim objCertRequest As CERTCLIENTLib.CCertRequest = New CERTCLIENTLib.CCertRequest()
Dim strCAConfig As String
Dim iDisposition As Integer
Dim strDisposition As String
Dim strCert As String

Try
If CA = "" Then
' Get CA config from UI
'strCAConfig = objCertConfig.GetConfig(CC_DEFAULTCONFIG);
strCAConfig = objCertConfig.GetConfig(CC_UIPICKCONFIG)
Else
strCAConfig = CA ' "AD01.int.wingu.dk\int-AD01-CA"
End If

Dim strAttributes As String = Nothing
If Not String.IsNullOrEmpty(Template) Then strAttributes = "CertificateTemplate: " & Template

' Submit the request
iDisposition = objCertRequest.Submit(CR_IN_BASE64 Or CR_IN_FORMATANY, base64, strAttributes, strCAConfig)

' Check the submission status
If CR_DISP_ISSUED <> iDisposition Then
' Not enrolled
strDisposition = objCertRequest.GetDispositionMessage()

If CR_DISP_UNDER_SUBMISSION = iDisposition Then
' Pending
Console.WriteLine("The submission is pending: " & strDisposition)
Return ""
Else
' Failed
Throw New Exception("The submission failed: " & strDisposition & vbCrLf & "Last status: " & objCertRequest.GetLastStatus().ToString())
End If
End If

' Get the certificate
strCert = objCertRequest.GetCertificate(CR_OUT_BASE64 Or CR_OUT_CHAIN)

Return strCert
Catch ex As Exception
' rethrow without resetting the stack trace
Throw
End Try
End Function

''' <summary>
''' Pair Private Key with signed certificate and save to local machine store
''' </summary>
''' <param name="CertResponse"></param>
''' <param name="allowExport"></param>
''' <remarks></remarks>
Sub AcceptRequest(ByVal CertResponse As String, allowExport As Boolean)
Dim s As String = CertResponse
' Response from a Microsoft CA will not contain the PEM header/footer, so add them to make BouncyCastle happy
If Not s.Contains("-----BEGIN CERTIFICATE-----") Then
s = "-----BEGIN CERTIFICATE-----" & vbCrLf & s & vbCrLf & "-----END CERTIFICATE-----"
End If
' Load signed Certificate as a BouncyCastle Certificate
Dim pr As New Org.BouncyCastle.OpenSsl.PemReader(New System.IO.StringReader(s))
Dim prObj = pr.ReadObject()
Dim cert As Org.BouncyCastle.X509.X509Certificate = DirectCast(prObj, Org.BouncyCastle.X509.X509Certificate)

' Convert certificate to a "microsoft" Certificate
Dim _netcert As System.Security.Cryptography.X509Certificates.X509Certificate = Org.BouncyCastle.Security.DotNetUtilities.ToX509Certificate(cert)
Dim netcert As New System.Security.Cryptography.X509Certificates.X509Certificate2(_netcert)
' Add Private Key to the Certificate
Dim rcsp As New System.Security.Cryptography.RSACryptoServiceProvider()

'And the privateKeyParameters
Dim parms As New System.Security.Cryptography.RSAParameters()
'Translate ackp.PrivateKey to parms;
Dim BCKeyParms As Parameters.RsaPrivateCrtKeyParameters = DirectCast(ackp.[Private], Parameters.RsaPrivateCrtKeyParameters)
parms.Modulus = BCKeyParms.Modulus.ToByteArrayUnsigned()
parms.P = BCKeyParms.P.ToByteArrayUnsigned()
parms.Q = BCKeyParms.Q.ToByteArrayUnsigned()
parms.DP = BCKeyParms.DP.ToByteArrayUnsigned()
parms.DQ = BCKeyParms.DQ.ToByteArrayUnsigned()
parms.InverseQ = BCKeyParms.QInv.ToByteArrayUnsigned()
parms.D = BCKeyParms.Exponent.ToByteArrayUnsigned()
parms.Exponent = BCKeyParms.PublicExponent.ToByteArrayUnsigned()

'import the RSAParameters into the RSACryptoServiceProvider
rcsp.ImportParameters(parms)
netcert.PrivateKey = rcsp

' I'm sure there is a smarter way, but this works for now
' If you add the certificate now, it will be lacking the MachineKeySet/PersistKeySet and Exportable
' Parameter, so we export the certificate and load it again, with Cryptography.CspParameters set correctly
' http://stackoverflow.com/questions/9810887/export-x509certificate2-to-byte-array-with-the-private-key

Dim certBytes As Byte() = netcert.Export(X509ContentType.Pkcs12, "Password1!")
'System.IO.File.WriteAllBytes("c:\certificate.pfx", certBytes)
Dim certToImport As System.Security.Cryptography.X509Certificates.X509Certificate2
If allowExport Then
certToImport = New System.Security.Cryptography.X509Certificates.X509Certificate2(certBytes, "Password1!", X509KeyStorageFlags.MachineKeySet Or X509KeyStorageFlags.Exportable Or X509KeyStorageFlags.PersistKeySet)
Else
certToImport = New System.Security.Cryptography.X509Certificates.X509Certificate2(certBytes, "Password1!", X509KeyStorageFlags.MachineKeySet Or X509KeyStorageFlags.PersistKeySet)
End If
If Not certToImport.HasPrivateKey Then Throw New Exception("Certificate failed to load with Private key")

'Test that we can encrypt and decrypt with the certificate
'Dim PlainString As String = "Encrypt this string"
'Dim EncryptedString As String = CertUtilities.GetEncryptedText(certToImport, PlainString)
'Dim result As String = CertUtilities.GetDecryptedText(certToImport, EncryptedString)
'If PlainString <> result Then Throw New Exception("Certificate failed the encryption/decryption test")

' Finally, add the certificate to the LocalMachine store
Dim store As New Security.Cryptography.X509Certificates.X509Store(Security.Cryptography.X509Certificates.StoreName.My, Security.Cryptography.X509Certificates.StoreLocation.LocalMachine)
store.Open(OpenFlags.ReadWrite)
store.Add(certToImport)
store.Close()

End Sub

End Class
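
To show how the pieces fit together, here is a hedged end-to-end sketch driving the class from PowerShell. The assembly name CertTools.dll and the template name WebServer are placeholders (anything not shown in the class above is an assumption); the CA config string is the example from the code comments.

Add-Type -Path 'C:\tools\CertTools.dll'   # placeholder assembly compiled from the class above
$req = [CertificateRequest]::CreateRequest('server01.int.wingu.dk')   # 2048-bit key pair + CSR
$signed = $req.submitRequest('AD01.int.wingu.dk\int-AD01-CA', 'WebServer', $req.CSR)
if ($signed -ne '') { $req.AcceptRequest($signed, $false) }           # pair the key and import into LocalMachine\My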

And a few tools for testing

    ' http://blogs.msdn.com/b/cagatay/archive/2009/02/08/removing-acls-from-csp-key-containers.aspx
Sub RemoveEveryoneFromPrivateKey(Cert As System.Security.Cryptography.X509Certificates.X509Certificate2)
Dim rsa As Cryptography.RSACryptoServiceProvider = Cert.PrivateKey

Dim id As New Principal.SecurityIdentifier(Principal.WellKnownSidType.WorldSid, Nothing) ' Indicates a SID that matches everyone.
Dim cspParams As New Cryptography.CspParameters(rsa.CspKeyContainerInfo.ProviderType, rsa.CspKeyContainerInfo.ProviderName, rsa.CspKeyContainerInfo.KeyContainerName)
cspParams.Flags = Cryptography.CspProviderFlags.UseMachineKeyStore
Dim container As New Cryptography.CspKeyContainerInfo(cspParams)

'get the original acls first
cspParams.CryptoKeySecurity = container.CryptoKeySecurity

'Search for the account given to us and remove it from accessrules
'For Each rule As CryptoKeyAccessRule In cspParams.CryptoKeySecurity.GetAccessRules(True, False, GetType(Principal.NTAccount))
For Each rule As CryptoKeyAccessRule In cspParams.CryptoKeySecurity.GetAccessRules(True, False, GetType(Principal.SecurityIdentifier))
If rule.IdentityReference.Equals(id) Then
cspParams.CryptoKeySecurity.RemoveAccessRule(rule)
End If
Next
'persist accessrules on key container.
Dim cryptoServiceProvider As New Cryptography.RSACryptoServiceProvider(cspParams)
End Sub

'http://social.msdn.microsoft.com/Forums/hu/vbgeneral/thread/7e0f513f-cd09-492a-8748-a4aea024cff0
'http://stackoverflow.com/questions/425688/how-to-set-read-permission-on-the-private-key-file-of-x-509-certificate-from-ne
Sub AddEveryoneToPrivateKey(Cert As System.Security.Cryptography.X509Certificates.X509Certificate2)
Dim id As New Principal.SecurityIdentifier(Principal.WellKnownSidType.WorldSid, Nothing) ' Indicates a SID that matches everyone.
Dim rsa As Cryptography.RSACryptoServiceProvider = Cert.PrivateKey

If Cert.HasPrivateKey Then
Dim cspParams = New Cryptography.CspParameters(rsa.CspKeyContainerInfo.ProviderType, _
rsa.CspKeyContainerInfo.ProviderName, _
rsa.CspKeyContainerInfo.KeyContainerName) _
With {.Flags = Cryptography.CspProviderFlags.UseExistingKey Or Cryptography.CspProviderFlags.UseMachineKeyStore, _
.CryptoKeySecurity = rsa.CspKeyContainerInfo.CryptoKeySecurity}
cspParams.CryptoKeySecurity.AddAccessRule(
New CryptoKeyAccessRule(id, CryptoKeyRights.FullControl, AccessControlType.Allow))

' Once we create a new RSACryptoServiceProvider, we override the existing one.
Dim rsa2 As Cryptography.RSACryptoServiceProvider = New Cryptography.RSACryptoServiceProvider(cspParams)
End If
End Sub

Function GetEncryptedText(cert As System.Security.Cryptography.X509Certificates.X509Certificate2, PlainStringToEncrypt As String) As String
Dim cipherbytes As Byte() = Text.ASCIIEncoding.ASCII.GetBytes(PlainStringToEncrypt)
Dim rsa As Security.Cryptography.RSACryptoServiceProvider = cert.PublicKey.Key
Dim cipher As Byte() = rsa.Encrypt(cipherbytes, False)
Return Convert.ToBase64String(cipher)
End Function

Function GetDecryptedText(cert As System.Security.Cryptography.X509Certificates.X509Certificate2, EncryptedStringToDecrypt As String) As String
Dim cipherbytes As Byte() = Convert.FromBase64String(EncryptedStringToDecrypt)
If cert.HasPrivateKey Then
Dim rsa As Security.Cryptography.RSACryptoServiceProvider = cert.PrivateKey
Dim plainbytes As Byte() = rsa.Decrypt(cipherbytes, False)
Dim enc As System.Text.ASCIIEncoding = New System.Text.ASCIIEncoding()
Return enc.GetString(plainbytes)
Else
Throw New Exception("Certificate used for has no private key.")
End If
End Function

Tuesday, November 13, 2012

Azure PowerShell – Now with Provider Support

I got a little feedback on my PowerShell module WA, and have updated it to reflect some of those things. First of all, subscription handling was weird, so I created proper commands for that (New-Subscription, Set-Subscription, Remove-Subscription, Get-Subscription, Select-Subscription). I also did some optimizing on the Table/Row cmdlets, so they should handle tables with many rows better. But the most important thing is that I implemented a PowerShell Provider (still working on a few tweaks here and there, but I’ll keep updating the binaries when needed).

First of all: I started creating the Provider in hopes of making copying files between your local file system and Azure smoother and more intuitive. After I had finished most of the work and was ready to start on that part, I found out that Microsoft does not support cross-provider copying. That sucks, but I had already done the work, and it still works across containers/storage accounts, and with my new parameters it also works with the local file system (with a twist).

When you load the PowerShell module, you will now have access to an azure: drive. The root contains a list of your subscriptions (which you can create/update/remove with New-Item/Set-Item/Remove-Item), each represented as a folder. If you go into one of those you get a list of the storage accounts associated with that subscription. Again, you can create/remove storage accounts in that subscription with New-Item/Remove-Item. You can also copy all content from one storage account to another with Copy-Item (Move-Item to work soon; snapshots are not part of the copy, but can be handled individually).
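
A quick, hedged sketch of what navigating the drive looks like (using the subscription and storage account names that appear later in this post):

cd azure:
dir                  # lists your subscriptions
cd .\Pay-As-You-Go
dir                  # lists the storage accounts in that subscription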


All levels work with <tab> auto-completion, to make navigating around fast and smooth. If you jump down to a storage account and do a dir or Get-ChildItem you will get a list of blob containers and tables (queues will be added later). You can create new containers and tables with New-Item (the default is containers).
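
For example, while positioned at a storage account (a hedged sketch; as noted, containers are the default item type for New-Item):

cd azure:\Pay-As-You-Go\wingu2
dir                  # lists blob containers and tables
New-Item demo        # creates a new blob container named 'demo'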


But enough of that. Remember my long script to copy a storage account in part one? This is how you can do that now:

Copy-Item azure:\Pay-As-You-Go\wingu2 azure:\skadefro\copytest

Works with containers and single blobs too. So what about local files?
New-Item is for uploading files; Get-Item / Get-ChildItem (or just dir) is for downloading files. Examples:

# Upload a single file
New-Item -filename C:\temp\1.jpeg azure:\skadefro\copytest\ad01\1.jpeg
# Since we know the file name, you can also just specify the container
New-Item -filename C:\temp\1.jpeg azure:\skadefro\copytest\ad01
# Hell, if you already are in the folder just use .
cd azure:\skadefro\copytest\ad01
New-Item -filename C:\temp\1.jpeg .

# want to upload a whole folder ?
New-Item -filename C:\temp azure:\skadefro\copytest\ad01
# or
cd azure:\skadefro\copytest\ad01
New-Item -filename C:\temp .
# What about all subfolders in that folder ?
# sure, this will also create the "folder" in azure
cd azure:\skadefro\copytest\ad01
New-Item -Recursive -filename C:\temp .

# want to upload a single file to a folder ?
# Since PowerShell keeps messing with the folder paths, we need to use
# a forward slash for folders.
New-Item -filename C:\temp\1.jpeg azure:\skadefro\copytest\ad01\test/1.jpeg
# or

# same goes the other way
# download a file
Get-Item -filename c:\temp\1.jpeg azure:\skadefro\copytest\ad01\1.jpeg
# or, since we already know the file name,
Get-Item -filename c:\temp\ azure:\skadefro\copytest\ad01\1.jpeg

# Download all the files from the container
# this will also create the folder structure locally, if names contain /
Get-Item -filename c:\temp azure:\skadefro\copytest\ad01
# or the short version
cd azure:\skadefro\copytest\ad01
Get-Item -filename c:\temp .

# I know, Get-ChildItem is supposed to do that, but that works too
Get-ChildItem -filename c:\temp azure:\skadefro\copytest\ad01
# or
cd azure:\skadefro\copytest\ad01
Get-ChildItem -filename c:\temp .

# or the cool and lazy way,
dir azure:\skadefro\copytest\ad01 -filename c:\temp

So download a copy, test it out, and feel free to give me some feedback.

Saturday, November 10, 2012

PowerShell Provider fails with Remove-Item

This one was driving me nuts. I looked at tons of examples, tried every kind of combination of code, tried Googling all kinds of variations, and just couldn’t get this to work.

I’m working on implementing a PowerShell Provider and got to the part where I want to implement more than just Get-Item and Get-ChildItem. Clear-Item, Copy-Item, Move-Item, Rename-Item and Set-Item all work without any fuss, but Remove-Item would constantly throw a PSNotSupportedException, and in PowerShell:

Remove-Item : Provider execution stopped because the provider does not support this operation.

You see this error if you don’t override the function, but I was doing that, and that was what kept me going around in circles. It never hit me that it could be another function I had to override to make it work *DOH* … If I also overrode RemoveItemDynamicParameters, that would get hit, and PowerShell would respect my Parameter() decorations just fine, so this made no sense to me. So, like in my former blog post, I finally had enough, overrode all the functions and started removing them one by one .. and that’s when I hit the jackpot. All I had to do was override HasChildItems.

Protected Overrides Function HasChildItems(path As String) As Boolean
' add code here to check whether the item is a container or not
Return True
End Function

Friday, November 9, 2012

PowerShell provider relative path tab-completion issue

I’m playing around with implementing a PowerShell Provider. After a while I started having big issues when using <tab> auto-completion. Google wasn’t much help, but I did find someone else who had the same problem. It just didn’t sound right that it’s a “bug”, considering I have seen other providers that work correctly. Like notfed on Stack Overflow I tried returning different information, without any luck, so I finally decided to override all the functions in NavigationCmdletProvider and do a Trace.WriteLine to see what was going on. And in MakePath I got lucky. To be honest, since I haven’t been able to find any source code examples that actually work, I don’t know if this is the correct solution, but it works for me, so I’m happy.

''' <summary>
''' Joins two strings with a provider specific path separator.
''' </summary>
''' <param name="parent">The parent segment of a path to be joined with the child.</param>
''' <param name="child">The child segment of a path to be joined with the parent.</param>
''' <returns>A string that contains the parent and child segments of the path joined by a path separator.</returns>
''' <remarks></remarks>
Protected Overrides Function MakePath(parent As String, child As String) As String
Trace.WriteLine("::MakePath(parent:=" & parent & ",child:=" & child & ")")
Dim res As String = MyBase.MakePath(parent, child)
Trace.WriteLine("::MakePath(parent:=" & parent & ",child:=" & child & ") " & res)
If parent = "." Then
'res = ".\" & child.Split("\").Last
If String.IsNullOrEmpty(Me.SessionState.Path.CurrentLocation.ProviderPath) Then
res = parent & PATH_SEPARATOR & child
Else
res = parent & PATH_SEPARATOR & child.Substring(Me.SessionState.Path.CurrentLocation.ProviderPath.Length + 1)
'res = parent & PATH_SEPARATOR & child.Replace(Me.SessionState.Path.CurrentLocation.ProviderPath & PATH_SEPARATOR, String.Empty)
End If
Trace.WriteLine("::**** TRANSFORM: " & res)
End If
Return res
End Function

Thursday, November 8, 2012

Azure PowerShell – The missing links

I attended Campus Days again this year, and during one of the sessions one of the presenters told everyone you need to buy 3rd party products to take snapshots or copy storage blobs from one container/account to another. That doesn’t seem fair. I needed this functionality and have created a PowerShell module that can do all that, and a lot more, so I decided to share it with the world.

I think it’s important to understand why I wrote this before I get flamed with comments and emails. When I began playing around with Azure, doing a proof-of-concept setup together with a guy from Microsoft Denmark, I started creating a PowerShell module to manage it, but quickly came across a module that someone else had written that already did most of what I needed, so I abandoned the project again. Several months later, when I was asked to start doing some more work with Azure, especially Persistent VMs, I found that Microsoft had adopted the PowerShell module that I had once used, and in that process removed several key features that I needed (but that were once there). I also noticed the module was painfully slow to work with when handling Persistent VMs, so I started writing my own. Halfway through that process I got bored with it, and started mixing Microsoft’s PowerShell with my own, just focusing on the things they miss. So you will find a lot of useful stuff, and some redundant stuff. In a real scenario you would probably end up using both (for instance, I’ve only written half of the code to create new VMs, and that sure is an important command, right? ;-) )

Download the module and extract it somewhere, like c:\wa. Open PowerShell and load the module with:

# If you just downloaded it, unblock files
Unblock-File C:\wa\Release\*
import-module C:\WA\Release\wa.psd1

First of all, I hate the way Microsoft handles working with different subscriptions (granted, some of mine ended up working the same way in the end, but still). So you load the module and connect to one of your subscriptions with:

# Find your Management Certificate, or create one and upload it to azure
dir cert:\CurrentUser\my

# Tell WA module about the subscription and cert and give it a name
Select-Subscription -Name Pay-As-You-Go -SubscriptionId 1f6b36b5-d2ab-49e6-8674-2155a3ea4a55 -CertificateThumbprint AAF4E3017B3DE26E93601E1573FDAF44A3AA15E8

The module loads certificates from the LocalMachine and CurrentUser stores, but most will have it in the user’s My store. The name can be whatever makes sense to you; I normally use the name Azure uses within the management portal. This gets saved under the HKCU\Software\wa registry key, and from now on, every time you start a new PowerShell prompt you can just use:

Select-Subscription -Name Pay-As-You-Go
# or list all known subscriptions with
Get-Subscription
# This will also query Azure for statistics data, so you can see if you're about to hit your Core, HostedService or Storage Account limits.

If you only want to manage a storage account, you can ignore the above and connect directly to it, using:

Select-StorageAccount -AccountName skadefro -AccountKey blahBlagBlah1234==

But if you have loaded a subscription, you can let the module handle all that for you and just use the storage account’s service name.

Select-StorageAccount -ServiceName Skadefro

OK, so what can you do that Microsoft’s version is missing? Well, managing Storage/Blobs/Queues was the first showstopper for me, so now you can create/modify/delete tables and select, insert, modify, delete rows in tables. You can add/update/remove blob containers, and update ACLs on containers and blobs. Add/upload/download/snapshot/signature/lease/unlock/remove/copy blobs, both as page and block blobs. (Uploading is insanely fast with page blobs, and if you really want to tweak your VMs, making sure the page size fits the size you format your disks with makes a huge difference.)


The reason I’m sharing all this is that several people asked about snapshots and copying content from one storage account to another (hell, even my favorite Azure application, Azure Explorer, doesn’t support copying blobs from one container to another), so I’ll give a few examples of that.


First, snapshots. Very simple:

$storageaccountname = 'wingu2'
$containername = 'public'
$filename = 'test.txt'
# select a StorageAccount if you have more than one.
# if you only have 1, that will always be selected
Select-StorageAccount -ServiceName $storageaccountname
# Let's upload a text file
# we should place that in a container
$container = Get-Container $containername
if(!$container) { $container = New-Container $containername }
# Let's make it public, so we can show cross subscription/account
# copying later
if( (Get-ContainerACL $containername) -ne 'container'){ Set-ContainerACL $containername container }

# Create a test file, and upload it
if ( (Test-Path $filename) -eq $true) { Remove-Item $filename -confirm:$false }
$file = New-Item -ItemType file $filename
add-content $file '1. Hello world'
New-Blob $containername $filename
$blob = Get-Blob $containername $filename

# Now, let's take a snapshot
$snapshot = $blob | New-Snapshot

# Let's update the file
add-content $file '2. Hello world'
New-Blob $containername $filename

# This will return 1 result
Get-Blob $containername $filename

# This will return 2 results
Get-Blob $containername $filename -Snapshots

# Remove the snapshot: supply the snapshot time or pipe the object
#$snapshot | Remove-Blob

# Removing the blob will fail, since it has a snapshot
Get-Blob $containername $filename | Remove-Blob

# But we can force deletion of all snapshots with
Get-Blob $containername $filename | Remove-Blob -deletesnapshots

Next up is copying blobs across containers or even storage accounts. If you want to move everything from one account to another, here’s a crude but efficient way:

Select-Subscription Pay-As-You-Go
Select-StorageAccount -ServiceName wingu2
$containers = @{}
foreach($container in Get-Container){
$containers += @{$container.Name=Get-Blob $container.Name}
# Could even update it to public and close it again after with
# Set-ContainerACL $container.Name container
}

Select-Subscription skadefro
Select-StorageAccount -ServiceName copytest
foreach($container in $containers.GetEnumerator()){
$c = get-container $container.Name
if(!$c){ $c = New-Container $container.Name }

foreach($blob in $container.Value){
Write-Host ("Copying '" + $blob.Name + "' to container '" + $c.Name + "'")
Copy-Blob -Uri $blob.Uri -Destinationuri ($c.Uri.ToString() + '/' + $blob.Name)
}
}


When I showed this to a friend, he quickly asked: what if I don’t want to make it public? Well, we can use shared access signatures for that, so here’s another way, without requiring the containers to be public.

Select-Subscription Pay-As-You-Go
Select-StorageAccount -ServiceName wingu2
$containers = @{}
foreach($container in Get-Container){
$blobs = @{}
foreach($blob in (Get-Blob $container.Name)){
# By default, this cmdlet creates a 55 minute readonly adhoc signature
$url = $blob | Get-BlobSignature
$blobs += @{$blob.Name=$url}
}
$containers += @{$container.Name=$blobs}
}

Select-Subscription skadefro
Select-StorageAccount -ServiceName copytest
foreach($container in $containers.GetEnumerator()){
$c = get-container $container.Name
if(!$c){ $c = New-Container $container.Name }

foreach($blob in $container.Value.GetEnumerator()){
Write-Host ("Copying '" + $blob.Name + "' to container '" + $c.Name + "'")
Copy-Blob -Uri $blob.Value -Destinationuri ($c.Uri.ToString() + '/' + $blob.Name)
}
}

Another feature that is missing from Microsoft’s Azure module is the ability to work with tables and queues. So now you can use your favorite PowerShell prompt and work with them there (if that floats your boat).

Select-StorageAccount -ServiceName wingu2
$tablename = 'testtable'
$table = Get-Table $tablename
if(!$table){
New-Table $tablename
} else {
Remove-Table $tablename
New-Table $tablename
}

# add using hashtable
New-Row $tablename -Columns @{ ID=1; Name='John Doe' }

# add using PSObject
$row = New-Object PSObject -Property @{ ID=2; Name='Jane Doe'; upn='jd@identity.local' }
New-Row $tablename $row

# add overriding RowKey
New-Row $tablename -Columns @{ ID=3; Name='Little Doe'; City='Aarhus' }

Get-Row $tablename

# We can search by Row Key and/or Partition key. Or just use PowerShell
$row = Get-Row $tablename | ?{$_.ID -eq 3}
# Add Age to the object
$row | Add-Member -Name 'Age' -Value 33 -MemberType Noteproperty
# Update row
$row | Set-Row $tablename

# Create new, based on another
$row.ID = 4
$row.Name = 'Bill Doe'
$row.upn = 'bill@identity.local'
$row | New-Row $tablename

Get-Row $tablename

Lastly, a short list of all the commands. The source code will be uploaded once I’ve cleaned it up a bit and added some help texts and such. Drop me an email if you’re interested before I’m done (it’s not highest on my priority list at the moment). I should probably also add some logic to handle async blob copies etc., you know.

Copy-WAStorageBlob
Get-WADeployment
Get-WADisk
Get-WAHostedService
Get-WALocation
Get-WAOperationStatus
Get-WAOS
Get-WAOSFamily
Get-WAOSImage
Get-WAServiceCertificate
Get-WAStorageAccount
Get-WAStorageAccountKeys
Get-WAStorageBlob
Get-WAStorageBlobLease
Get-WAStorageBlobSignature
Get-WAStorageContainer
Get-WAStorageContainerACL
Get-WAStorageContainerSignature
Get-WAStorageRow
Get-WAStorageTable
Get-WASubscription
Get-WASubscriptionCertificate
Get-WAVM
Get-WAAffinityGroup
New-WADeployment
New-WADisk
New-WAHostedService
New-WAOSImage
New-WAServiceCertificate
New-WAStorageAccount
New-WAStorageAccountKeys
New-WAStorageBlob
New-WAStorageBlobLease
New-WAStorageBlobSnapshot
New-WAStorageContainer
New-WAStorageRow
New-WAStorageTable
New-WASubscriptionCertificate
New-WAAffinityGroup
Remove-WADeployment
Remove-WADisk
Remove-WAHostedService
Remove-WAOSImage
Remove-WAServiceCertificate
Remove-WAStorageAccount
Remove-WAStorageBlob
Remove-WAStorageBlobLease
Remove-WAStorageContainer
Remove-WAStorageRow
Remove-WAStorageTable
Remove-WASubscriptionCertificate
Remove-WAAffinityGroup
Select-WAStorageAccount
Select-WASubscription
Set-WADeployment
Set-WADeploymentStatus
Set-WADisk
Set-WAHostedService
Set-WAOSImage
Set-WAStorageAccount
Set-WAStorageBlob
Set-WAStorageContainerACL
Set-WAStorageRow

# aliases registered through the module manifest file in wa.ps1
New-Alias -name Select-Subscription -value Select-WASubscription
New-Alias -name Get-Subscription -value Get-WASubscription
New-Alias -name Get-OperationStatus -value Get-WAOperationStatus
#New-Alias -name Get-Location -value Get-WALocation

New-Alias -name Get-Deployment -value Get-WADeployment
New-Alias -name New-Deployment -value New-WADeployment
New-Alias -name Set-Deployment -value Set-WADeployment
New-Alias -name Set-DeploymentStatus -value Set-WADeploymentStatus
New-Alias -name Remove-Deployment -value Remove-WADeployment

New-Alias -name Get-AffinityGroup -value Get-WAAffinityGroup
New-Alias -name New-AffinityGroup -value New-WAAffinityGroup
New-Alias -name Set-AffinityGroup -value Set-WAAffinityGroup
New-Alias -name Remove-AffinityGroup -value Remove-WAAffinityGroup

New-Alias -name Get-Disk -value Get-WADisk
New-Alias -name New-Disk -value New-WADisk
New-Alias -name Remove-Disk -value Remove-WADisk
New-Alias -name Set-Disk -value Set-WADisk

New-Alias -name Get-OSImage -value Get-WAOSImage
New-Alias -name New-OSImage -value New-WAOSImage
New-Alias -name Set-OSImage -value Set-WAOSImage
New-Alias -name Remove-OSImage -value Remove-WAOSImage

New-Alias -name Get-VM -value Get-WAVM


New-Alias -name Get-ServiceCertificate -value Get-WAServiceCertificate
New-Alias -name New-ServiceCertificate -value New-WAServiceCertificate
New-Alias -name Set-ServiceCertificate -value Set-WAServiceCertificate
New-Alias -name Remove-ServiceCertificate -value Remove-WAServiceCertificate

New-Alias -name Get-Certificate -value Get-WASubscriptionCertificate
New-Alias -name New-Certificate -value New-WASubscriptionCertificate
New-Alias -name Remove-Certificate -value Remove-WASubscriptionCertificate

New-Alias -name Get-HostedService -value Get-WAHostedService
New-Alias -name New-HostedService -value New-WAHostedService
New-Alias -name Remove-HostedService -value Remove-WAHostedService
New-Alias -name Set-HostedService -value Set-WAHostedService

New-Alias -name Get-StorageAccount -value Get-WAStorageAccount
New-Alias -name Get-StorageAccountKeys -value Get-WAStorageAccountKeys
New-Alias -name New-StorageAccount -value New-WAStorageAccount
New-Alias -name New-StorageAccountKeys -value New-WAStorageAccountKeys
New-Alias -name Remove-StorageAccount -value Remove-WAStorageAccount
New-Alias -name Select-StorageAccount -value Select-WAStorageAccount
New-Alias -name Set-StorageAccount -value Set-WAStorageAccount

New-Alias -name Get-Blob -value Get-WAStorageBlob
New-Alias -name Get-BlobLease -value Get-WAStorageBlobLease
New-Alias -name Get-BlobSignature -value Get-WAStorageBlobSignature
New-Alias -name New-Blob -value New-WAStorageBlob
New-Alias -name New-BlobLease -value New-WAStorageBlobLease
New-Alias -name New-Snapshot -value New-WAStorageBlobSnapshot
New-Alias -name Copy-Blob -value Copy-WAStorageBlob
New-Alias -name Remove-Blob -value Remove-WAStorageBlob
New-Alias -name Remove-BlobLease -value Remove-WAStorageBlobLease
New-Alias -name Set-Blob -value Set-WAStorageBlob

New-Alias -name Get-Container -value Get-WAStorageContainer
New-Alias -name Get-ContainerACL -value Get-WAStorageContainerACL
New-Alias -name Get-ContainerSignature -value Get-WAStorageContainerSignature
New-Alias -name New-Container -value New-WAStorageContainer
New-Alias -name Remove-Container -value Remove-WAStorageContainer
New-Alias -name Set-Container -value Set-WAStorageContainer
New-Alias -name Set-ContainerACL -value Set-WAStorageContainerACL

New-Alias -name Get-Row -value Get-WAStorageRow
New-Alias -name New-Row -value New-WAStorageRow
New-Alias -name Set-Row -value Set-WAStorageRow


New-Alias -name Remove-Row -value Remove-WAStorageRow

New-Alias -name Get-Table -value Get-WAStorageTable
New-Alias -name New-Table -value New-WAStorageTable
New-Alias -name Remove-Table -value Remove-WAStorageTable

Monday, October 29, 2012

Setting up Navision 2013 Service

So I have shown how to set up/install NAV 6 before, but I am playing around with 2013 and had a few issues. First of all, the Net.Tcp port sharing trick didn’t seem to want to work for me (I have seen blog posts from others showing it does work, but it is not all that important for me right now), so the first thing we need to do is find a free port (4, actually):

# assumes $InstanceName is defined: the name of the NAV service instance to create
$properties = [System.Net.NetworkInformation.IPGlobalProperties]::GetIPGlobalProperties()
$TcpConnections = $properties.GetActiveTcpConnections()

$clientport = 0; $mgtport = 0; $odataport = 0; $soapport = 0; $freeport = 0;

for ($freeport=6000;$freeport -le 7000; $freeport++){
if( ($TcpConnections | Where-Object {$_.LocalEndPoint.Port -eq $freeport}).count -eq 0){
if($clientport -eq 0){ $clientport = $freeport }
elseif($mgtport -eq 0){ $mgtport = $freeport }
elseif($odataport -eq 0){ $odataport = $freeport }
elseif($soapport -eq 0){ $soapport = $freeport }
else { break }
}

}
write-host "Create NAV Service '$InstanceName' using clientport: $clientport mgtport: $mgtport odataport: $odataport soapport: $soapport"

Right, the next thing is installing the service. I really don’t like running stuff as Local System, so let’s use a Windows user:

Add-PSSnapin "Microsoft.Dynamics.Nav.Management"
# assumes $NAVAdminUPN and $NAVAdminPassword hold the service account's credentials
$Password = ConvertTo-SecureString $NAVAdminPassword -AsPlainText -Force
$cred = new-object -typename System.Management.Automation.PSCredential -argumentlist $NAVAdminUPN, $Password

New-NAVServerInstance -ServerInstance $InstanceName -ClientServicesPort $clientport -ManagementServicesPort $mgtport -ODataServicesPort $odataport -SOAPServicesPort $soapport -ServiceAccountCredential $cred -ServiceAccount User
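
To start the new instance from the same prompt, here is a hedged sketch, assuming NAV 2013’s usual service naming scheme of MicrosoftDynamicsNavServer$<instance> (adjust if your install differs):

Start-Service -Name ('MicrosoftDynamicsNavServer$' + $InstanceName)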

Once the service starts, you will get a warning in the event log:

The service account has insufficient privileges to register service principal names in Active Directory.


And that happens even if you registered the SPNs manually. So to shut it up, just give the account permission to register them itself:

# the service account was given as a UPN above, so look it up that way
$ADNAVAdminUser = Get-ADUser -Filter { UserPrincipalName -eq $NAVAdminUPN }
$self = New-Object System.Security.Principal.SecurityIdentifier([Security.Principal.WellKnownSidType]"SelfSid", $ADNAVAdminUser.Sid.tostring())
$rule = New-Object System.DirectoryServices.ActiveDirectoryAccessRule($self, [System.DirectoryServices.ActiveDirectoryRights]::WriteProperty, [System.Security.AccessControl.AccessControlType]"Allow")

$dn = $ADNAVAdminUser.DistinguishedName
$U = [ADSI]"LDAP://$dn"
$Result = $U.ObjectSecurity.AddAccessRule($rule)
$U.CommitChanges()
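
For reference, the manual SPN registration the warning refers to looks something like this (hedged: host name, port and account are placeholders; NAV 2013 uses SPNs of the form DynamicsNAV/<host>:<port>):

setspn -S DynamicsNAV/nav01.contoso.local:7046 CONTOSO\navserviceaccount
setspn -S DynamicsNAV/nav01:7046 CONTOSO\navserviceaccount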

Thursday, September 13, 2012

Azure PersistentVM losing network connectivity

I have several times experienced that my VMs suddenly lose all connectivity in Azure. In the forums several people have complained about this. Your machine will show as running in the portal, but you cannot connect to it remotely. Some say that waiting a few hours makes it come back, but not for me. If you have several machines connected to a virtual network, you can ping and connect to the machine, but from the machine you cannot browse/resolve DNS or connect to anything externally, even using IP addresses.

Exporting the VM, deleting it, and importing it again fixes the problem.

$servicename = 'superservice'
$vmname = 'superserver01'

Export-AzureVM -ServiceName $servicename -Name $vmname -Path "c:\VMConfig\$vmname.xml"
Remove-AzureVM -ServiceName $servicename -Name $vmname
Remove-AzureService -ServiceName $servicename
$vm = Import-AzureVM -Path "c:\VMConfig\$vmname.xml"
New-AzureVM -VM $vm -ServiceName $servicename -Location 'North Europe'


However, if you want to keep the virtual IP, you don’t want to delete the hosted service. In that case, skip the Remove-AzureService step and re-create the VM in the existing service:
New-AzureVM -VM $vm -ServiceName $servicename


This will fail if the machine was part of a virtual network (the machine will have a subnet defined in the configuration, but the Azure REST API won’t know which network that subnet belongs to, even if you only have one). So if you get:
New-AzureVM : HTTP Status Code: BadRequest - HTTP Error Message: The virtual network name cannot be null or empty.


do it like this:


$servicename = 'superservice'
$vmname = 'superserver01'
$networkName = 'supercoolnetwork'

Export-AzureVM -ServiceName $servicename -Name $vmname -Path "c:\VMConfig\$vmname.xml"
Remove-AzureVM -ServiceName $servicename -Name $vmname
$vm = Import-AzureVM -Path "c:\VMConfig\$vmname.xml"
New-AzureVM -VM $vm -ServiceName $servicename -VNetName $networkName

Sunday, August 19, 2012

Windows Server 2012 (Server 8) Remote Desktop Certificate

Oh, this one was a pain to get through.
When you install Windows Server 2012 and configure Remote Desktop, everything goes through a nice and simple guide, and everything works perfectly except for one very important part: clients keep getting popups saying the certificate is not trusted or the computer name does not match, once the connection broker redirects the user to the RD Session Host server.

Googling this will give you nothing. I saw one post from a Microsoft guy saying this is not a problem on Windows 8, and that there will be a “patch” for Windows 7 later this year, but who the hell wants to wait for a patch?

In Windows 2008/2003 you could just open Remote Desktop Session Host Configuration (tsconfig.msc) and set it there, or you could use PowerShell:

Set-Item RDS:\RDSConfiguration\Connections\RDP-Tcp\SecuritySettings\SSLCertificateSHA1Hash -Value $ThumbPrint

But the RDS: PS drive does not exist in the RemoteDesktopServices module shipped with Server 2012. After a ton of Googling I finally found a solution:


$pass = ConvertTo-SecureString "PfxPassword" -AsPlainText -Force
Import-PfxCertificate -FilePath '\\rdgw01\c$\wild.domain.com.pfx' -Password $pass -CertStoreLocation cert:\localMachine\my
$path = (Get-WmiObject -class "Win32_TSGeneralSetting" -Namespace root\cimv2\terminalservices -Filter "TerminalName='RDP-tcp'").__path
Set-WmiInstance -Path $path -argument @{SSLCertificateSHA1Hash="thumbprint"}
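
To confirm the thumbprint took, you can read the property straight back from the same WMI class:

(Get-WmiObject -class "Win32_TSGeneralSetting" -Namespace root\cimv2\terminalservices -Filter "TerminalName='RDP-tcp'").SSLCertificateSHA1Hash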

Monday, June 18, 2012

Remote Desktop and Claimsbased authentication

I have had a wet dream for a long time about implementing WIF/claims-based authentication in the Windows Credential Provider/Remote Desktop. I don’t really care whether it would be through the RDP website and/or directly on the Windows server.

If we want to implement claims-based authentication in combination with Remote Desktop, we need to consider what scenario this makes sense for, and how the logon process should work. Doing passive authentication in a browser, the user gets redirected to an identity provider and validated, has a SAML token issued and posted back to the website. Validation at the identity provider could also involve redirecting the user to yet another identity provider (Microsoft Azure ACS, another ADFS 2.0 server, Google ID, Facebook and such), and all this without the original website knowing anything about it; all it cares about is getting a SAML token from its trusted identity provider.

Doing active federation, let’s say from a Windows application, we (could) ask the user how he wants to be validated and then, through Kerberos, certificates or username/password, send the user’s credentials to the identity provider on behalf of the user and get back a SAML token. If the user wants to log on through Windows Live ID in this scenario, we first need to ask the user for his username/password (this is where all alarm bells should go off for most security-aware users) and send this to login.live.com, and by some kind of magic we get back a SAML token. We then ask our local identity provider to authorize us using the SAML token from Windows Live ID, get back a SAML token, and can continue the login process.

First of all, asking a user to type in a username and password for Facebook/Windows Live ID/whatever on a “proprietary login page” is not good practice. Next, I doubt Microsoft/Facebook/whoever allows validating usernames and passwords through any kind of API, so the above will not work in any case.

So why is this even interesting, if it’s wrong? Imagine the most common setup: Company A and Company B want to collaborate and allow each other’s users access to certain resources. Instead of creating an Active Directory trust, Company A and B both install ADFS 2.0 and create a mutual trust between these. Now they can both add users to each other’s SharePoint 2010 and other claims-aware web applications. But what if they want to grant access to Microsoft CRM 2011, create Exchange mailboxes, allow access to Axapta 2009/2012, or use any other kind of application that relies on a Windows user in Active Directory? Then they need to map the User Principal Name to locally created Active Directory users. How you do that is beyond the scope of this post, but it’s doable and used in many places.

Now imagine you want to allow users from Company A to access Company B’s Citrix XenApp or Remote Desktop servers. So User1 from Company A also has a user account in Company B, but he doesn’t know the password for that account. To give the whole single sign-on experience, all he needs to know is the username and password of his own AD account in Company A.

You might think that implementing claims-based authentication on the Remote Desktop Web App is easy, and to be honest, it is. It just doesn’t work. The reason is that when you log in to RDWeb, the webpage loads an instance of your locally installed Remote Desktop client through JavaScript/ActiveX and feeds it the username and password you logged in with. So after implementing claims-based authentication and enabling it to use the c2wts service (Claims to Windows Token Service), you WILL get the correct list of applications, but whenever you launch any of them, you will be prompted to log in with a real Windows username and password, and the whole concept falls apart. See this post.

If you rely on Citrix XenApp you could implement the solution explained here, but that will not work with Remote Desktop. The solution explained there will not work for users accessing the solution with PNAgent installed locally either (PDAs, tablets, thin clients or whatever), so an alternative solution will need to rely on replacing the Credential Provider (called GINA pre-Windows Vista).

I came across this blog post and I really, really wish I was better at coding C++ so I could create this myself. I can do hello world and maybe write to a text file, but this is just too much for my skill set. Google is not much help here, but you will probably come across www.pgina.org at some point.

This is not completely what I wanted, but at least it’s a few steps in the right direction. pGina is a cool, yet in some ways limited, implementation of a GINA/Credential Provider that sends events back to a service written in .NET 4.0; you can then develop plugins that get loaded into this service and react to/modify things during the login process. It comes bundled with tons of cute add-ons that can validate users through LDAP/RADIUS/SQL or even IMAP/POP3. When I say it is limited, I say that from the perspective that it would have been nice if they had implemented support for asking for more than just username and password; a third field for a passcode, for RSA token or SMS two-factor validation, would have been a very nice touch. But for now let’s just focus on claims-based authentication.

So just to test my theory, I installed pGina on a server and wrote a simple add-on to pGina that would take whatever the user types in, validate it against an identity provider, find the user based on the UPN claim in the local Active Directory, and then log the user on.


During login, no matter what, it all has to end up with Winlogon getting a valid username and password. But in a world of claims-based authentication we don’t have the password. God knows how Citrix manages to get a user logged on purely based on a Kerberos ticket, and I would love to see that implemented in pGina too, but an even better solution would be if someone with more brains than me could write an Authentication Package as explained by Steve Syfuhs in his blog post. But for now, we focus on the scenario where a user from Company A logs on to a server inside Company B and the user has an account in the local AD with the same UPN.
What I do here is validate whatever the user types in:
1) I ask for a SAML token from the remote identity provider.
2) If I get a token back, I extract the User Principal Name and search the local Active Directory for such a user.
3) I then validate that the password the user typed is also the password of the local Active Directory account; if that logon fails, I reset the local account’s password to the one the user typed (sketched below).
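
Here is a minimal PowerShell sketch of steps 2 and 3. The plugin itself is written against pGina’s .NET API, so this is only an illustration; $upn and $password stand in for the UPN claim from the SAML token and the password the user typed:

Add-Type -AssemblyName System.DirectoryServices.AccountManagement
$ctx = New-Object System.DirectoryServices.AccountManagement.PrincipalContext('Domain')
# step 2: find the local AD account matching the UPN claim
$user = [System.DirectoryServices.AccountManagement.UserPrincipal]::FindByIdentity($ctx, 'UserPrincipalName', $upn)
if ($user -ne $null) {
    # step 3: check the typed password against the local account; reset it on mismatch
    if (-not $ctx.ValidateCredentials($user.SamAccountName, $password)) {
        $user.SetPassword($password)
        $user.Save()
    }
}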

Since the context the login process runs as does not have permission to look things up in the local AD and/or reset passwords, we also need to tell the pGina add-on what username and password to use when talking to AD.


Finally, since RDP will try to negotiate security (the popup for credentials before you are connected) and we want pGina to handle this, you need to open Remote Desktop Session Host Configuration and set the security layer to RDP Security Layer.
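
If you would rather script that than click through the console, the same Win32_TSGeneralSetting WMI class used in the Server 2012 certificate post above exposes the security layer as well (a hedged sketch; 0 = RDP Security Layer, 1 = Negotiate, 2 = SSL/TLS):

$ts = Get-WmiObject -class "Win32_TSGeneralSetting" -Namespace root\cimv2\terminalservices -Filter "TerminalName='RDP-tcp'"
$ts.SetSecurityLayer(0)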


And voila, we just logged on to Remote Desktop using Claims based authentication.