January 2013 Blog Posts

Autofac 3.0 Released

The final version of Autofac 3.0.0 is released and you can get it on NuGet or download directly from Google Code.

If you're upgrading from 2.6, the big changes are:

• NuGet packages for everything - you can get core Autofac, the integrations, and the extras, all on NuGet.
• Symbol/source packages are deployed to SymbolSource so you can debug into the source.
• New integration with MVC4, WebAPI, and SignalR.
• Autofac core is now a Portable Class Library that can work on the full .NET stack, Windows Store apps, Silverlight 5, and Windows Phone 8 apps.
• AutofacContrib projects are now Autofac.Extras (namespace and assembly name change).

There are also a ton of issues resolved and we're working on enhancing the wiki docs. The Release Notes page on the Autofac wiki has the detailed list of changes.

Huge thanks to Alex Meyer-Gleaves for all the work he does (and managing the release process!) and to Nicholas Blumhardt for getting Autofac going.

On Ragequitting

Jeff Atwood just posted an article about Aaron Swartz, his unfortunate story, and the notion of ragequitting.

I agree with Jeff on Swartz and the thoughts about that case. Rather than restate all that, check out the article. My thoughts go out to Swartz's friends and family. He'll be missed.

What I disagree with is this:

Ragequitting is childish, a sign of immaturity.

We've often used "vote with your feet" as an idiom describing how people can effectively support (or show lack of support) for a thing, particularly an internet (or programming, or technology) thing. It occurs to me that ragequitting, while abrupt, is effectively foot-voting-in-action. I've done it myself, I admit, and I'm sure you have, too. Point is, just because it's fast or unannounced doesn't mean it's any less valid, and, in my opinion, certainly doesn't mean it's childish. It's within everyone's rights to choose their situation and decide what's best for them regardless of the emotion that may be associated with said decision.

Using Portable Class Libraries - Update .NET Framework

We switched Autofac 3.0 to be a Portable Class Library so we can target multiple platforms. In consuming the updated Autofac, I've noticed some folks receive errors like this at runtime:

Test 'MyNamespace.MyFixture.MyTest' failed: System.IO.FileLoadException : Could not load file or assembly 'System.Core, Version=2.0.5.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e, Retargetable=Yes' or one of its dependencies. The given assembly name or codebase was invalid. (Exception from HRESULT: 0x80131047)
at Autofac.Builder.RegistrationData..ctor(Service defaultService)
at Autofac.Builder.RegistrationBuilder`3..ctor(Service defaultService, TActivatorData activatorData, TRegistrationStyle style)
at Autofac.RegistrationExtensions.RegisterInstance[T](ContainerBuilder builder, T instance)
MyProject\MyFixture.cs(49,0): at MyNamespace.MyFixture.MyTest()

You can solve this by getting the latest .NET Framework patches from Windows Update. Knowledge Base article KB2468871 and the associated patch specifically address this; however, in practice I've noticed that specific patch doesn't always fix it for everyone. I actually don't know if it's a different patch that fixes the 100% case or if it's different patches for different people. What I do know is that once the machine is patched up with the latest, the problem goes away.

If you're on a domain where there's an internal Windows Server Update Services instance running, you may think you have the latest but you might not. If it looks like you're all patched up but still seeing the error, make sure you've checked online for updates.

See here, it looks like I've got the latest…

…but really, I'm missing several updates.

A note on the version 2.0.5.0 you see in the error message: that's the version of the framework Autofac references as an artifact of being a Portable Class Library. 2.0.5.0 is Silverlight, which is one of the platforms we target. You won't see it installed on your machine, so don't go trying to manually hack around and install it. Don't freak out; that version is expected and it's OK. The .NET Framework patches should allow the runtime to dynamically redirect that reference to the appropriate version (e.g., 4.0.0.0), and the error will go away.

posted @ Monday, January 21, 2013 10:16 AM | Feedback (1) | Filed Under [ .NET ]

Putting Controllers in Plugin Assemblies for ASP.NET MVC

With Autofac (particularly the multitenant extensions) I see a lot of questions come through that boil down to this:

I have an ASP.NET MVC project. I have some controllers in a plugin assembly that isn't referenced by the main project. At application startup, I do some scanning and use Autofac to dynamically register the controllers. For some reason I get an error when I try to visit one of these controllers. How can I have a controller in a plugin assembly?

Shannon Deminick has a nice blog article that explains this in a different context - similar question, but the same answer.

The problem is that the default controller factory in ASP.NET MVC 3 and 4 (the latest version as of this writing) is really tied to the BuildManager. The BuildManager is the class that maintains the internal list of all the referenced application assemblies and handles ASP.NET compilation.

The DefaultControllerFactory in ASP.NET MVC, in the CreateController method, uses a ControllerTypeCache internal type to locate the controller type being instantiated. The ControllerTypeCache uses another internal, TypeCacheUtil, to load the set of controllers from the list of referenced assemblies. TypeCacheUtil uses the BuildManager.GetReferencedAssemblies() method to initialize that list. BuildManager.GetReferencedAssemblies() includes:

• Assemblies that are referenced by the main web application.
• Assemblies you list in the <assemblies> part of web.config.
• Assemblies built from App_Code.

Note that none of those automatically include non-referenced, already-built plugin assemblies.

You need to add your plugin assembly to the list of referenced assemblies in the BuildManager.
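If you want to verify that this is what's happening, you can dump the list of assemblies the BuildManager actually knows about at startup. This is a diagnostic sketch only - the class and namespace names are placeholders, and it's meant to be dropped into your Global.asax.cs, not taken as framework guidance:

```csharp
// Diagnostic sketch: logs every assembly the BuildManager can see.
// Your plugin assembly won't appear in this list until you register
// it using one of the approaches described in this post.
using System.Diagnostics;
using System.Reflection;
using System.Web;
using System.Web.Compilation;

namespace MyNamespace
{
    public class MvcApplication : HttpApplication
    {
        protected void Application_Start()
        {
            foreach (Assembly assembly in BuildManager.GetReferencedAssemblies())
            {
                Debug.WriteLine(assembly.FullName);
            }
        }
    }
}
```
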

You can do that in one of three ways.

First, you can register the assembly in web.config. This makes for a less "drop-in-an-assembly" plugin model, but it also means no code has to execute at startup. If you put your plugins in a folder other than bin, you will also have to register that path. Here's a snippet showing a web.config with this sort of registration:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <!--
        If you put your plugin in a folder that isn't bin, add it to the probing path
      -->
      <probing privatePath="bin;bin\plugins" />
    </assemblyBinding>
  </runtime>
  <system.web>
    <compilation>
      <assemblies>
        <!-- The assembly name here is an example; use your plugin's fully-qualified name. -->
        <add assembly="MyPluginAssembly, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null" />
      </assemblies>
    </compilation>
  </system.web>
</configuration>

Another alternative is to register your assemblies in code at PreApplicationStart. This is the method outlined in Deminick's article in more detail. The idea is that you use the PreApplicationStartMethodAttribute to bootstrap some assembly scanning and registration with the BuildManager (since that all has to happen before app startup).

Basically, you mark your main project assembly with the attribute and point to a class that has a static method that will do the auto-registration, like this:

[assembly: PreApplicationStartMethod(typeof(Initializer), "Initialize")]

Then you write your initializer class to do the assembly loading and registration with the BuildManager. Something like this:

using System.IO;
using System.Reflection;
using System.Web.Compilation;
using System.Web.Hosting;

namespace MyNamespace
{
  public static class Initializer
  {
    public static void Initialize()
    {
      var pluginFolder = new DirectoryInfo(HostingEnvironment.MapPath("~/plugins"));
      var pluginAssemblyFiles = pluginFolder.GetFiles("*.dll", SearchOption.AllDirectories);
      foreach (var pluginAssemblyFile in pluginAssemblyFiles)
      {
        // Load each plugin assembly and add it to the list of
        // referenced assemblies so the controller factory can find it.
        var assembly = Assembly.LoadFrom(pluginAssemblyFile.FullName);
        BuildManager.AddReferencedAssembly(assembly);
      }
    }
  }
}

A third way would be to create your own ControllerFactory implementation. In your custom controller factory you could search your plugin assemblies for the controller types or use some other convention to determine which controller type to resolve. There's a lot of work involved in getting that right - supporting areas, properly handling the type resolution, and so on. If you go that route, as some people have, you'll have to go searching for samples; I don't have any to provide here.

I'd recommend one of the first two options. They're the easiest and require the least "messing around with the framework" to get things working for you.

Manually Running the Java Update

I swear every time I change the Java settings to stop auto-updating it still pops up that stupid "A Java Update is Available" toast in the corner and I want to punch it repeatedly. Killing the scheduled task from startup works until you actually do install the next update, at which point you forget it and it puts itself back.

I run as a non-admin user. The Java auto-update thing hates that. It tells me there's an update, then I say, "OK, do it then." It asks me for admin credentials, I enter them, and I instantly get a failure message. Again, I want to punch it repeatedly.

The only way I can get this thing to go away is to manually run the update (or download the entire package and manually install the update). For my own reference, here's how I do it:

1. Run "C:\Program Files (x86)\Common Files\Java\Java Update\jucheck.exe" with elevated privileges.
2. Follow the prompts to get the update and make sure to uncheck all the freeware crap they want to install alongside it.

Controlling NuGet Packaging Version with TeamCity Variables

We use TeamCity as our build server and one of the cool things TeamCity has built in is the ability to serve as a NuGet server. You build your product, run a nuget pack task on it, and TeamCity will automatically add it to the feed.

One of the somewhat odd things I’ve found with TeamCity’s NuGet server is that it seems to require that you let TeamCity run the actual nuget pack on packages it should host. That is, even if you wanted to do that action in your build script, you can’t – simply putting the package into your set of build artifacts doesn’t get it into the feed. You actually have to use the “NuGet Pack” build step in your build. When you do that, the build step ignores any version information you put inside your .nuspec files because the “NuGet Pack” build step requires you to specify the version right there.

That’s fine as long as the build number for the build (or some similar existing variable) is also the version you want on your NuGet package. But when you want to have tighter control over it, like calculating the version as part of a build task, it becomes less clear how to get things working. This should help you.

First, you have to establish a NuGet package configuration variable. You need this so you can use it in the NuGet Pack build steps. In your TeamCity build configuration, go to the “Build Parameters” tab and define a “System Property” with your calculated NuGet package semantic version. I called mine “CalculatedSemanticVersion” so it ends up showing in the TeamCity UI as “system.CalculatedSemanticVersion” like this:

Set it to some simple, default value. It won’t stay that value so it doesn’t matter; it’s more for when you come back later and look at your configuration – this way it’ll make a little more sense.

Next, set up your NuGet Pack build steps. Use this new “system.CalculatedSemanticVersion” property as the NuGet package version you’re building.

Finally, insert a build script step before all of your NuGet Pack steps. In that build script step, calculate the version you really want for your packages and use a TeamCity message to update the variable value. You do that by using a specially formatted message written to the console, like this:

##teamcity[setParameter name='system.CalculatedSemanticVersion' value='1.0.0-beta1']
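Anything that writes to the build log can emit that service message - it doesn't have to be MSBuild. As a trivial illustration (the version value here is just a placeholder), a shell-based build step could do:

```shell
#!/bin/sh
# Calculate your semantic version however you like; this value is a placeholder.
SEMVER="1.0.0-beta1"

# TeamCity scans build output for service messages and updates the
# named parameter when it sees this line.
echo "##teamcity[setParameter name='system.CalculatedSemanticVersion' value='${SEMVER}']"
```
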

In MSBuild, you might have something that looks like this:

<?xml version="1.0" encoding="utf-8"?>
<Project
  DefaultTargets="SetVersion"
  xmlns="http://schemas.microsoft.com/developer/msbuild/2003"
  ToolsVersion="4.0">
  <Target Name="SetVersion">
    <!--
      Calculate your semantic version however you like. The
      CalculateMySemanticVersion task is a placeholder - it could
      be anything that produces the $(SemanticVersion) property.
    -->
    <CalculateMySemanticVersion>
      <Output TaskParameter="Version" PropertyName="SemanticVersion" />
    </CalculateMySemanticVersion>
    <!-- The message task here is the important part. -->
    <Message Text="##teamcity[setParameter name='system.CalculatedSemanticVersion' value='$(SemanticVersion)']" />
  </Target>
</Project>

Now when your build configuration runs, the script will calculate your NuGet package version and update the value of the property before the NuGet Pack tasks run. The NuGet Pack tasks will build your packages using the correct calculated version that you controlled through script.

Automating NuGet Dependency Version Updates with MSBuild

Although I wasn't a big fan of NuGet when it started getting big, I have to admit it's grown on me. I think part of that has to do with the large amount of improvement we've seen since back then. Regardless, I'm in a position with Autofac and other projects where I'm not only consuming NuGet packages, I'm also producing them.

One of the biggest pains I have when maintaining the .nuspec files for my packages is that you can update a dependency for your project (via NuGet) but the corresponding version value isn't updated in the .nuspec. (This is, of course, assuming you're doing manual maintenance and not re-generating everything each time. In a single-package solution, I can see regenerating would be fine, but when you've got multiple packages like in Autofac, you don't want to regenerate.)

What I want is for the .nuspec file <dependency> entries to match the installed package versions that I'm actually building against. So… I automated that with MSBuild. Here's how:

First, put placeholders into your .nuspec file(s) using a special format, like this:

<dependencies>
  <dependency id="Autofac" version="$version_Autofac$" />
  <dependency id="Castle.Core" version="$version_Castle.Core$" />
</dependencies>

Each dependency gets a $version_NuGetPackageName$ format placeholder. The "NuGetPackageName" part matches the name of the dependency (and, coincidentally, the first part of the folder name under "packages" where the dependency gets installed in your solution).
Next, in your build script, include a custom task that looks like this. It will look in the "packages" folder and parse the various folder names into these placeholders so you can do some search-and-replace action.

<UsingTask
  TaskName="GetNuGetDependencyVersions"
  TaskFactory="CodeTaskFactory"
  AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
  <ParameterGroup>
    <PackageInstallDirectory Required="true" />
    <Dependencies ParameterType="Microsoft.Build.Framework.ITaskItem[]" Output="true" />
  </ParameterGroup>
  <Task>
    <Using Namespace="System" />
    <Using Namespace="System.Collections.Generic" />
    <Using Namespace="System.IO" />
    <Using Namespace="System.Text.RegularExpressions" />
    <Using Namespace="Microsoft.Build.Framework" />
    <Using Namespace="Microsoft.Build.Utilities" />
    <Code Type="Fragment" Language="cs">
      <![CDATA[
      // Match package folders like Castle.Core.3.0.0.4001
      // Groups[1] = "Castle.Core"
      // Groups[2] = "3.0.0.4001"
      var re = new Regex(@"^(.+?)\.(([0-9]+)[A-Za-z0-9\.\-]*)$");
      try
      {
        // Create item metadata based on the list of packages found
        // in the PackageInstallDirectory. Item identities will be
        // the name of the package ("Castle.Core") and they'll have
        // a "Version" metadata item with the package version.
        var returnItems = new List<ITaskItem>();
        foreach(var directory in Directory.EnumerateDirectories(PackageInstallDirectory))
        {
          var directoryName = Path.GetFileName(directory);
          var match = re.Match(directoryName);
          if(!match.Success)
          {
            continue;
          }
          var name = match.Groups[1].Value;
          var version = match.Groups[2].Value;
          var metadata = new Dictionary<string, string>();
          metadata["Version"] = version;
          var item = new TaskItem(name, metadata);
          returnItems.Add(item);
        }
        Dependencies = returnItems.ToArray();
        return true;
      }
      catch(Exception ex)
      {
        Log.LogErrorFromException(ex);
        return false;
      }
      ]]>
    </Code>
  </Task>
</UsingTask>

If you're industrious, you could package that build task into an assembly so it's not inline in your script, but... I didn't. Plus this lets you see the source.

Now you can use that build task along with the MSBuild Community Tasks "FileUpdate" task to do some smart search and replace. Here are a couple of MSBuild snippets showing how:

<!-- At the top/project level... -->
<!-- You need MSBuild Community Tasks for the FileUpdate task. -->
<Import Project="tasks\MSBuild.Community.Tasks.targets" />
<PropertyGroup>
  <!-- This is where NuGet installed the packages for your solution. -->
  <PackageInstallDirectory>$(MSBuildProjectDirectory)\packages</PackageInstallDirectory>
</PropertyGroup>

<!-- Inside a build target... -->
<ItemGroup>
<!--
This should include all of the .nuspec files you want to update. These should
probably be COPIES in a staging folder rather than the originals so you don't
modify the actual source code.
-->
<NuspecFiles Include="path\to\*.nuspec" />
</ItemGroup>
<!--
  Parse out the installed versions from the list of installed packages.
-->
<GetNuGetDependencyVersions PackageInstallDirectory="$(PackageInstallDirectory)">
  <Output TaskParameter="Dependencies" ItemName="LocatedDependencies" />
</GetNuGetDependencyVersions>
<!-- Use the MSBuild Community Tasks "FileUpdate" to do the search/replace. -->
<FileUpdate
  Files="@(NuspecFiles)"
  Regex="\$version_%(LocatedDependencies.Identity)\$"
ReplacementText="%(LocatedDependencies.Version)" />

Generally what you'll want to do from a process perspective, then, is:

1. Build and test your project as usual.
2. Create a temporary folder to stage your NuGet packages. Copy the .nuspec file in along with the built assemblies, etc. in the appropriate folder structure.
3. Run the file update process outlined above to update the staged .nuspec files.
4. Run nuget pack on the staged packages to build the final output.

This will ensure the final built NuGet packages all have dependencies set to be the same version you're building and testing against.
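The steps above can be sketched as an MSBuild target. Everything here is a hypothetical placeholder - the target names, the StagingDirectory property, and the item groups are illustrative, not part of any real script:

```xml
<!-- Hypothetical sketch of the stage-update-pack flow; adjust names and paths to your build. -->
<Target Name="Package" DependsOnTargets="Build">
  <!-- Stage COPIES of the .nuspec files so the originals aren't modified. -->
  <Copy SourceFiles="@(NuspecFiles)" DestinationFolder="$(StagingDirectory)" />
  <!-- Update the staged .nuspec files using the GetNuGetDependencyVersions
       and FileUpdate steps shown earlier (wrapped in a target here). -->
  <CallTarget Targets="UpdateNuspecVersions" />
  <!-- Pack each staged .nuspec; MSBuild batching runs NuGet once per file. -->
  <ItemGroup>
    <StagedNuspecFiles Include="$(StagingDirectory)\**\*.nuspec" />
  </ItemGroup>
  <Exec Command="nuget.exe pack &quot;%(StagedNuspecFiles.Identity)&quot; -OutputDirectory &quot;$(StagingDirectory)&quot;" />
</Target>
```
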

WHS v1 End of Life – What’s Next?

Windows Home Server v1 reaches the end of mainstream support tomorrow, and some folks have asked me what I'm going to do.

Options for switching include upgrading to WHS 2011, switching to Windows Server 2012 Essentials, or moving off the Windows platform entirely to something else.

If you've been following my Media Center solution, you'll know I have both an HP MediaSmart Windows Home Server v1 and a Synology DS1010+.

I use the WHS for:

• PC image-based backups
• General file sharing
• Image sharing
• Music sharing (both via file system and via UPnP using Asset).
• Windows 8 File History

I use the Synology DS1010+ for:

• Storing DVD movie images
• Serving the MySQL instance for my XBMC machines

Both machines have all drive bays full. The Synology doesn't have enough space to hold all the stuff I have on the Home Server and the Home Server can't hold all the stuff on the Synology. We're talking about terabytes on both machines. Keeping that in mind, if I were to want to change the OS on the WHS it'd require me to…

• Move everything off the WHS to… somewhere.
• Reformat and upgrade the OS on the HP MediaSmart box, which is older and not super-powerful. It's also headless (no video card and no DVD drive) so… that's pretty limiting. If there's any troubleshooting to do during the installation, that's going to be painful.
• Hope against hope that the new OS won't tank the HP box into non-performance and that all the drivers are properly found.
• If I go with Windows Server 2012 Essentials, I get to set up a domain for my home computers and go around joining everything so they can be backed up. If I go with WHS 2011, I will get the same backup functionality I'm used to. If I go with something else… I get to figure out my backup solution.
• Move everything back to the WHS that was previously there and set all that junk up again.

If, instead, I moved everything to the Synology I'd need to upgrade all the drives in the RAID array. It's RAID 5 so I can't do one at a time. And I can't switch to a different RAID strategy (like the Hybrid RAID they provide) without moving everything off the NAS and back on.

UGH. There was a time in my life where I had a bunch of time at home and loved to tinker with things. Now… it takes me two nights to watch a two-hour movie. I just want things to work.

So what am I going to do?

Not a damn thing.

I don't expose my WHS to the outside world so I'm not worried much about the security aspect of things. I will probably run it until it dies. In the meantime I'll slowly be moving things over to the Synology. I will probably end up investing in the five-drive expansion bay for it so I can add more drives in a new array. Then I can stock those drives up and slowly but surely both expand storage and switch to the Hybrid RAID approach. I'll also have to figure out my UPnP answer (I've admittedly not tried the Music Station that Synology offers, but I hope it does transcoding of, like, Apple Lossless and whatnot). And I'll have to figure out the backup strategy; probably something like Acronis TrueImage.

In the meantime… the plan is "no action."

Blogging from Word 2013

I've been wanting Windows Live Writer for Surface RT for a while but I noticed Hanselman mentioned you could blog from Word (which I never knew) so I thought I'd try it out.

If you're reading this… I was successful. Yay!

2012 in Review

2012 has come and gone, and it's time to look back at what happened. Because if I don't, well… my memory isn't quite what it used to be, you know?

It was a good year both personally and professionally, though I noticed I blogged a lot less. That happens, I guess. I find I post more of my little personal updates on Twitter or Facebook, which reduces the noise here but definitely splits up the content. Maybe that's a good thing. You can subscribe to the stuff you like, ignore the stuff you don't. (I use Twitter for more professional stuff whereas Facebook will show you more pictures of my kid and my cats.)

Professionally, I got promoted to be a Tech Lead at work, which is sort of like a team leader but without the "people management" part of things and more focus on product architecture and technology solutions. That's totally my wheelhouse, so good things there.

I also became a co-owner of the Autofac project, which has been a lot of fun to work on. I started out over there as a contributor for the multitenant support and started playing a larger role with the restructuring for the upcoming 3.0 release. It's great to work with smart folks like those on that project and it's nice to be learning so much while (hopefully) providing some value to the masses.

Blog-wise, other than the usual "hey, I found this interesting" sorts of tips and articles…

So there's all that. Maybe not high in quantity, but I'd like to think the quality is there.

Personally, my year (and most of my free time) has revolved around my daughter, Phoenix, who is now two years old. This year she went from walking and a small amount of vocabulary to running around rampantly and being a total chatterbox. She loves Batman, with the "Little People" playset as well as a Batman raincoat. (She's on the ThinkGeek customer action shot page for that raincoat, too.) We took her to Disneyland and had a great time, though she didn't take well to the costumed characters. I look forward to taking her again when she's older and can understand a little more about what's going on.

Every day she surprises me by saying or doing something new and I have to wonder where she gets all her material. Her latest thing is to "sneeze" ("ah… ah… AH-CHOO" like in cartoons) and then ask for a tissue ("Daddy, tissue me? I tissue. Please?"). I have no idea where she got that. This morning I yawned so she pointed to the kitchen and said, "Daddy, coffee?" Yes, baby, Daddy does need some coffee. You are the smartest toddler alive.

In going through some of our stuff, weeding out things we don't use, I came across these baby sign language videos. We tried that since we'd heard a lot of success with it and wanted Phoenix to be able to communicate and not have those "I can't speak so I'll throw a tantrum" issues. We never could get Phoenix into it, though. She lost interest in the videos (we tried several kinds from different places) and just didn't pick up on the signs. Instead, she pretty much skipped all that and just spoke or used less formal gestures to indicate what she wanted. We haven't ever really had any issues figuring out what she's saying and she's never thrown any communication-related tantrums, so I suppose it all worked out in the end.

One thing I've sort of surprised myself with is the amount of television we let her watch. It's not a lot, not like she's just "glued to the tube," but I thought I'd be one of those parents who would be, like, "NO TV EVER!" What I find, though, is that she really learns a lot from the stuff she watches. She knows a ton of animals from Go, Diego, Go. She is starting to get good problem solving skills from Mickey Mouse Clubhouse ("Which tool will solve this problem?"). She's learned a lot about music and such from Little Einsteins. We don't really watch anything with "no value" - arbitrary cartoons or whatever - but the educational stuff you see on PBS and Disney Junior has been really pretty good. She pretends a lot, she likes building with blocks and playing with those wooden Brio trains… and she knows how to navigate Netflix and the Disney Junior apps on the iPad to find the different shows she likes, so that's pretty crazy to watch.

Toward the end of the year I've started getting into tea. I've never really been much of a tea-drinker in the past, but something clicked with me and I'm enjoying tea a lot.

In the upcoming year, I am thinking I'd like to move off the Subtext blog platform. I am a contributor over there, but the momentum behind the project has been lost and I don't think it's going to come back. I thought I'd be more into contributing and building on the blog engine than I ended up being. I met some great folks there and I'm glad I got involved, but I realize that, as far as a blog platform is concerned… honestly, at this point I just want it to work and have the software maintained by someone else. I want to own my content and I want to be able to tweak things if needed, but for the most part I don't want a super-young platform and I don't want to worry about whether there's going to be an update coming. I honestly thought I'd want to tweak a bunch of stuff on my blog, write plugins, and do a bunch of things, but… well, not so much. As such, I will probably see what it will take to move to WordPress. It's been around a long time, it's a sort of de-facto standard, and it has an actual plugin model (something I'd wanted from Subtext for years). It also has no shortage of themes to choose from (something else I'd wanted from Subtext). It won't be a simple process - I'll need to figure out how to export all the Subtext content in WordPress Extended RSS format, redirect permalinks, etc. - but I think it'll be worth it.

Beyond that, much as I would like to blog more and better things… I will have to see. I anticipate I'll still use a lot of social media for the tiny updates, but hopefully I'll have more interesting problems (and solutions!) to share with you all as the year progresses.