process comments edit

I just had an interesting [to me] interaction on Twitter that got me thinking:

Workaround... fix... tomato tomahto... same same.

Ignoring the original issue - that iTunes cover flow doesn’t handle similarly named albums properly - the “workaround… fix… same same” thing got me.

To a person who doesn’t develop software, I bet a workaround and a fix are the same thing. To people who develop software, they’re very different, and the distinction is important.

What’s the difference?

A workaround means a problem has been identified, there’s no official solution for it, but if you do some sort of temporary change on your end you can get things to function within reason. It may not be 100% correct behavior, but it’ll get you past the problem - in a way, you need to change your expectations to accept a workaround as a solution. In this case, the workaround would be for me to modify the metadata on all of my music to “fool” iTunes into behaving correctly. The important bit here is that the change is applied to how you use the product, not the product proper. The problem in the product still exists and the use of a workaround is expected to be temporary.

A fix means the problem has been officially solved so, once applied, the expected behavior will be the actual behavior. In this case, if the issue was fixed then I wouldn’t have to change the metadata on any of my songs - the iTunes cover flow would work properly. The important bit here is that the change is applied to the product proper. The problem in the product no longer exists because it’s actually been fixed.

This doesn’t sound like it’s a big deal, but from a language precision standpoint (particularly for a software developer), it’s huge. If someone files a defect on one of my products and I provide a workaround, I’m still expected to fix it.

(Note that this is no reflection on Alex, who’s a smart guy and friend of mine. It just got me thinking, is all.)

dotnet, aspnet, gists, csharp comments edit

I haven’t done much work with ASP.NET Dynamic Data but in a recent project I started working with it and instantly ran into a dilemma. First, let me explain the project setup.

I have a database project that outlines the schema of the database, the stored procedures, all of that. I have a database client project that has some services and a LINQ to SQL data context that I can distribute to clients who want to go that way. I then have a Dynamic Data project for managing the data and a separate web application that will consume the data, both of which need the LINQ to SQL data context.

Switch on your suspension of disbelief for a second with respect to the design. I could do a way better design going more SOA, or using a repository pattern, or whatever, but it’s a spike project and part of the goal is for me to learn something about Dynamic Data, LINQ to SQL, and so on.

Now, Dynamic Data uses the LINQ to SQL data context - from the client assembly - to do its work and generate its screens. Here’s the problem:

In order to control the rendering of the Dynamic Data screens, I have to have a metadata “buddy class” to describe it. In order to have a “metadata buddy” class, I have to add an attribute to the generated LINQ to SQL model class that points to the metadata type.

See the problem? The Dynamic Data app is the only thing that cares about the metadata “buddy class,” so that’s where the class will live… but if I have to mark up the original LINQ to SQL class in a separate assembly to get that to happen, I’m hosed.

Here’s what a standard scenario looks like:

[MetadataType(typeof(ResourceMetadata))]
public partial class Resource
{
  // Resource is a class in the LINQ to SQL
  // generated data context. A partial class
  // declaration allows us to put the metadata
  // attribute on it.
}

public class ResourceMetadata
{
  // The metadata class can define hints for
  // the Dynamic Data UI as to how to render
  // view/edit controls for the similarly named
  // property on the LINQ to SQL model class.
  // This declaration says 'render this as a
  // ResourceValue type.'

  [UIHint("ResourceValue")]
  public object Value;
}

As you can see, we have to mark up the LINQ to SQL class with that MetadataTypeAttribute. I don’t want to do that… but how to keep the metadata separate from the model?

The key is in the Global.asax.cs of your Dynamic Data project. The line where you register the data context with the application:

MetaModel model = new MetaModel();
model.RegisterContext(typeof(DataLibrary.ResourceDataContext), new ContextConfiguration()
{
  ScaffoldAllTables = true
});

See that “new ContextConfiguration” bit? One of the parameters you can pass is “MetadataProviderFactory.” That parameter is a delegate that creates an instance of something deriving from “System.ComponentModel.TypeDescriptionProvider.” The default behavior is similar to this:

MetaModel model = new MetaModel();
model.RegisterContext(typeof(DataLibrary.ResourceDataContext), new ContextConfiguration()
{
  ScaffoldAllTables = true,
  MetadataProviderFactory =
    (type) => {
      return new AssociatedMetadataTypeTypeDescriptionProvider();
    }
});

The default MetadataProviderFactory is System.ComponentModel.DataAnnotations.AssociatedMetadataTypeTypeDescriptionProvider. That provider uses an internal type (of course it’s internal) that gets the metadata type for a model class through reflection.

In order to get your metadata class from somewhere other than reflection, you need to make your own TypeDescriptionProvider.

Fortunately, that’s not actually too hard.

First, let’s decide what we want to do: We want to have a static mapping, similar to the MVC route table, that lets us manually map a LINQ to SQL type to any metadata type we want. If there’s no manual mapping, we want to fall back to default behavior - get it through reflection.

Now we know what we want the outcome to be, let’s get cracking. Throw together a place where you can hold the metadata mappings:

using System;
using System.Collections.Generic;

namespace DynamicDataProject
{
  public static class DisconnectedMetadata
  {
    public static Dictionary<Type, Type> Map { get; private set; }

    static DisconnectedMetadata()
    {
      Map = new Dictionary<Type, Type>();
    }
  }
}

I suppose if you wanted to get really fancy with it you could have add/remove/clear methods with a bunch of thread locking around them, but this is a simple way to go. Most likely you’re only going to be registering mappings at app startup, so all of that would just be overkill.
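For the curious, that fancier version might look something like the sketch below. This is purely illustrative - the class name SynchronizedMetadataMap is hypothetical and nothing later in this post uses it - but it shows what the locking would buy you if you ever did need to register mappings after startup:

```csharp
using System;
using System.Collections.Generic;

namespace DynamicDataProject
{
  // Hypothetical thread-safe variant of the metadata map. Only worth
  // the trouble if mappings might be added or read from multiple
  // threads after app startup; otherwise the simple dictionary wins.
  public static class SynchronizedMetadataMap
  {
    private static readonly object _sync = new object();
    private static readonly Dictionary<Type, Type> _map = new Dictionary<Type, Type>();

    public static void Add(Type modelType, Type metadataType)
    {
      lock (_sync)
      {
        // Last registration wins, mirroring Dictionary indexer semantics.
        _map[modelType] = metadataType;
      }
    }

    public static bool TryGet(Type modelType, out Type metadataType)
    {
      lock (_sync)
      {
        return _map.TryGetValue(modelType, out metadataType);
      }
    }
  }
}
```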

Next we have to create a System.ComponentModel.CustomTypeDescriptor. What a CustomTypeDescriptor does is get all of the information about the various metadata - attributes and properties - on your buddy class. The thing is, Microsoft already did all of that for us; they just inconveniently marked the type they use - System.ComponentModel.DataAnnotations.AssociatedMetadataTypeTypeDescriptor - as internal. With a little fancy, maybe slightly unsupported, reflection work we can pretty easily make use of the code that’s already there. Instead of doing a giant full implementation of a new CustomTypeDescriptor, we can write a wrapper around the existing one.

using System;
using System.ComponentModel;
using System.ComponentModel.DataAnnotations;
using System.Reflection;

namespace DynamicDataProject
{
  public class DisconnectedMetadataTypeDescriptor : CustomTypeDescriptor
  {
    private static Type AssociatedMetadataTypeTypeDescriptor =
      typeof(AssociatedMetadataTypeTypeDescriptionProvider)
        .Assembly
        .GetType("System.ComponentModel.DataAnnotations.AssociatedMetadataTypeTypeDescriptor", true);

    public Type Type { get; private set; }
    public Type AssociatedMetadataType { get; private set; }
    private object _associatedMetadataTypeTypeDescriptor;

    public DisconnectedMetadataTypeDescriptor(ICustomTypeDescriptor parent, Type type)
      : this(parent, type, GetAssociatedMetadataType(type))
    {
    }

    public DisconnectedMetadataTypeDescriptor(ICustomTypeDescriptor parent, Type type, Type associatedMetadataType)
      : base(parent)
    {
      this._associatedMetadataTypeTypeDescriptor = Activator.CreateInstance(AssociatedMetadataTypeTypeDescriptor, parent, type, associatedMetadataType);
      this.Type = type;
      this.AssociatedMetadataType = associatedMetadataType;
    }

    public override AttributeCollection GetAttributes()
    {
      return AssociatedMetadataTypeTypeDescriptor.InvokeMember(
        "GetAttributes",
        BindingFlags.Instance | BindingFlags.Public | BindingFlags.InvokeMethod,
        null,
        this._associatedMetadataTypeTypeDescriptor,
        new object[] { }) as AttributeCollection;
    }

    public override PropertyDescriptorCollection GetProperties()
    {
      return AssociatedMetadataTypeTypeDescriptor.InvokeMember(
        "GetProperties",
        BindingFlags.Instance | BindingFlags.Public | BindingFlags.InvokeMethod,
        null,
        this._associatedMetadataTypeTypeDescriptor,
        new object[] { }) as PropertyDescriptorCollection;
    }

    public static Type GetAssociatedMetadataType(Type type)
    {
      if (type == null)
      {
        throw new ArgumentNullException("type");
      }

      // Try the map first...
      if (DisconnectedMetadata.Map.ContainsKey(type))
      {
        return DisconnectedMetadata.Map[type];
      }

      // ...and fall back to the standard mechanism.
      MetadataTypeAttribute[] customAttributes = (MetadataTypeAttribute[])type.GetCustomAttributes(typeof(MetadataTypeAttribute), true);
      if (customAttributes != null && customAttributes.Length > 0)
      {
        return customAttributes[0].MetadataClassType;
      }
      return null;
    }
  }
}


We’re doing a few interesting things here to be aware of:

  • On static initialization, we get a handle on the original AssociatedMetadataTypeTypeDescriptor - the internal type that does all the attribute reflection action. If we don’t get a reference to that type for some reason, we’ll throw an exception so we immediately know.
  • We have a GetAssociatedMetadataType method that you can pass any type to - ostensibly a LINQ to SQL model type - and you should come back with the correct metadata buddy class type. First we check the type mapping class that we created before and if it’s not there, we fall back to the default behavior - getting the MetadataTypeAttribute off the LINQ to SQL class.
  • The two-parameter constructor, which is used by Dynamic Data, is where we call our GetAssociatedMetadataType method. That’s our point of interception.
  • The three-parameter constructor, which lets a developer manually specify the associated metadata type, creates an instance of the original AssociatedMetadataTypeTypeDescriptor and passes the information into it first. We do that because that type has a bunch of validation it runs through to make sure everything is OK with the metadata type. Rather than re-implementing all of that validation, we’ll use what’s there. We’ll hang onto that created object so we can use it later.
  • The GetAttributes and GetProperties overrides call the corresponding overrides in that AssociatedMetadataTypeTypeDescriptor object we created. We do that because there’s a lot of crazy stuff that goes into recursing down the metadata class tree to generate all of the metadata information and we don’t want to replicate all of that. Again, use what’s there.

That’s the big work, seriously. A wrapper around AssociatedMetadataTypeTypeDescriptor pretty much does it. Last thing we have to do is create a System.ComponentModel.TypeDescriptionProvider that will generate our descriptors. That’s easy:

using System;
using System.ComponentModel;

namespace DynamicDataProject
{
  public class DisconnectedMetadataTypeDescriptionProvider : TypeDescriptionProvider
  {
    public DisconnectedMetadataTypeDescriptor Descriptor { get; private set; }
    public DisconnectedMetadataTypeDescriptionProvider(Type type)
      : base(TypeDescriptor.GetProvider(type))
    {
      this.Descriptor =
        new DisconnectedMetadataTypeDescriptor(
          base.GetTypeDescriptor(type, null),
          type);
    }

    public override ICustomTypeDescriptor GetTypeDescriptor(Type objectType, object instance)
    {
      return this.Descriptor;
    }
  }
}

As you can see, this basically just provides an override for the “GetTypeDescriptor” method and hands back our custom descriptor.

That’s the entirety of the infrastructure:

  • The map.
  • A CustomTypeDescriptor that looks in the map and then falls back to reflection.
  • A TypeDescriptionProvider that uses our CustomTypeDescriptor.

To use this mechanism, in your Dynamic Data project you need to register the mappings and register the TypeDescriptionProvider. Remember the “model.RegisterContext” call in the Global.asax.cs file in the RegisterRoutes method? Add your mappings there and when you call RegisterContext, add a MetadataProviderFactory:

public static void RegisterRoutes(RouteCollection routes)
{
  // Register types with the map
  DisconnectedMetadata.Map.Add(typeof(DataLibrary.Resource), typeof(DynamicDataProject.ResourceMetadata));

  // When you register the LINQ to SQL data context,
  // also register a MetadataProviderFactory pointing
  // to the custom provider.
  MetaModel model = new MetaModel();
  model.RegisterContext(typeof(DataLibrary.ResourceDataContext), new ContextConfiguration()
  {
    ScaffoldAllTables = true,
    MetadataProviderFactory =
    (type) =>
    {
      return new DisconnectedMetadataTypeDescriptionProvider(type);
    }
  });

  // ...and the rest of the method as usual.
}

That’s it. Now you don’t have to mark up your LINQ to SQL objects with partial classes - you can put your metadata “buddy classes” anywhere you want.

There are some optimizations you could make for performance purposes that I didn’t do here for clarity. For example, rather than call “InvokeMember” on every call to GetAttributes and GetProperties in the CustomTypeDescriptor, you could cache references during static construction to the MemberInfo corresponding to the two methods and invoke the cached references. This should get the idea across, though.
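A rough sketch of that caching idea follows. The class name CachedDescriptorMethods is hypothetical, and the lookup assumes the same internal type used throughout this post; note that GetProperties needs the Type.EmptyTypes overload of GetMethod to disambiguate it from the GetProperties(Attribute[]) overload:

```csharp
using System;
using System.ComponentModel;
using System.ComponentModel.DataAnnotations;
using System.Reflection;

namespace DynamicDataProject
{
  // Hypothetical sketch: resolve the MethodInfo references once during
  // static initialization instead of paying for a member lookup on
  // every InvokeMember call.
  public static class CachedDescriptorMethods
  {
    public static readonly Type DescriptorType =
      typeof(AssociatedMetadataTypeTypeDescriptionProvider)
        .Assembly
        .GetType("System.ComponentModel.DataAnnotations.AssociatedMetadataTypeTypeDescriptor", true);

    // Type.EmptyTypes picks the parameterless overloads explicitly,
    // avoiding an AmbiguousMatchException on GetProperties.
    public static readonly MethodInfo GetAttributesMethod =
      DescriptorType.GetMethod("GetAttributes",
        BindingFlags.Instance | BindingFlags.Public, null, Type.EmptyTypes, null);

    public static readonly MethodInfo GetPropertiesMethod =
      DescriptorType.GetMethod("GetProperties",
        BindingFlags.Instance | BindingFlags.Public, null, Type.EmptyTypes, null);

    public static AttributeCollection GetAttributes(object descriptor)
    {
      // Invoking a cached MethodInfo skips the per-call name resolution
      // that InvokeMember performs.
      return (AttributeCollection)GetAttributesMethod.Invoke(descriptor, null);
    }

    public static PropertyDescriptorCollection GetProperties(object descriptor)
    {
      return (PropertyDescriptorCollection)GetPropertiesMethod.Invoke(descriptor, null);
    }
  }
}
```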

And, of course, the usual disclaimers apply: YMMV, I’m not responsible if this code burns your house down or crashes your app or whatever, etc., etc. Works on My Machine!

dotnet, vs comments edit

In a really large system the build can take a long time. A really long time. Long enough to make continuous integration sort of meaningless. You may not be able to do a whole lot about it, but something to look at is your project’s code organization. The compiler and linker have some startup overhead you may be able to get rid of by reducing the number of solutions/projects you have.

For example, I threw together a set of three test codebases. Each has 100 (empty) classes, but they’re organized in different ways. I then built them a few times and compared the average times.

| Project Format | Time to Rebuild (Clean/Build) | Working Copy Size Post-Build |
| --- | --- | --- |
| 100 separate solutions, 1 project per solution, 1 class per project | 42s | 4.29MB |
| 1 solution with 100 projects, 1 class per project | 42s | 3.52MB |
| 1 solution with 1 project, 100 classes in the project | 1s | 256KB |

I noticed two interesting things here:

  1. From a time perspective, you don’t get much if you have 100 solutions or 100 projects - the real gain (and it’s significant) is if you put everything into the same project/assembly.
  2. The working copy size post-build (the amount of disk space taken by the source and build output) is orders of magnitude smaller if you put everything into the same project/assembly.

This isn’t to say everyone should start shipping mammoth assemblies. Just be careful how you organize things. Choose your assembly boundaries carefully. You may gain yourself some time in the build - and some space on your disk.

General Ramblings comments edit

A while ago a friend of mine asked me for the names of some of the authors I read. This got me thinking and I figured I’d make a list of a few of my favorites. So, in no particular order…

"Neuromancer" by William Gibson

William Gibson: I’ve read Neuromancer so many times the book has nearly fallen apart. I like how he describes things enough to get a vivid image in your head but not with such numbing detail you get bogged down. Plus, how can you deny the guy who coined the term “cyberspace?” I’ve read all of his books and I have yet to find one I didn’t like.

Richard K. Morgan: Morgan has created a future world where your personality lives in a “cortical stack” at the base of your skull and a character named Takeshi Kovacs is an ex special-forces soldier turned investigator for hire. Action packed and a really fun read, Altered Carbon is the first book in that series.

Tom Clancy: Clancy’s sort of hit-or-miss. I’ve read some of his books that just take freaking forever to get where they’re going, but others are really exciting. Most of the books revolve around a modern-day wartime environment. I particularly liked Rainbow Six.

Jeff Noon: Noon writes in a very distinctive style that seems to resonate for some folks but not as much for others. If it works for you, it really works, and the books are amazing. His primary series revolves around a world where we exchange objects between our world and a parallel world and what comes back is a powerful hallucinogenic drug… that you take by sticking a feather in your mouth. If you think it sounds weird, you’re right, it is… but it’s a very compelling universe, too. The first book in the series, Vurt, brings with it the added challenge that it’s written in an invented dialect so it may take a bit to get into, but give it a shot. When I was done with it, I immediately flipped back to the first page and read it again.

"Vurt" by Jeff Noon

Steve Aylett: I’ll admit I’ve only read one of Aylett’s books - Slaughtermatic - but it was so good I have to include him here. In this particular book, the world is a place where crime is a form of recreation and time travel is possible. It’s admittedly a little convoluted, but a really fantastic read.

Philip Pullman: Specifically, Pullman’s “His Dark Materials” trilogy which starts with The Golden Compass (also made into a movie). In this world, your soul is physically embodied as a “daemon” - an animal that travels with you. When the movie came out a lot of stink got raised about the social commentary on organized religion that these books present, but I really feel like that was a lot of crap. Is there some commentary? Sure. Is it as important or prevalent as the folks out there would like you to think? I don’t think so. This is another set of books I’ve read several times and enjoy every time.

"The Looking Glass Wars" by Frank Beddor

Frank Beddor: I am an _Alice in Wonderland_ freak. I love the story, the characters, and I love imaginative derivatives of it. Beddor has created my favorite reimagining with his “Looking Glass Wars” trilogy - the idea being that Wonderland is a real place and Alyss Heart is a real person who ends up crossing into our world and getting trapped. Not difficult reads but some of my favorites. There’s even a soundtrack that goes along with them.

Neal Stephenson: Again, sort of hit-or-miss for me, but I can’t recommend more strongly that everyone read Snow Crash. Where else would you find a place where pizza delivery is controlled by the mob and it arrives at your house via a guy you refer to as “The Deliverator?” Cyberpunk action at its finest.

Neil Gaiman: Is there a “favorite authors” list Gaiman _isn’t_ on? Every story is different and imaginative in its own right, but my absolute, all-time favorite, and one that I have yet to find anyone disappointed with, is his collaboration with Terry Pratchett: Good Omens. Of course, if you were offended by the “commentary” in Philip Pullman’s “His Dark Materials” series, Good Omens is probably not for you… but otherwise it’s a must-read.

Douglas Adams: The Hitchhiker’s Guide series are some favorites for me (and a ton of other people) and I’ve loved them since I was a kid. One of the best birthday gifts I’ve gotten was a leather-bound copy of the first four books in the series. I’ve read them, listened to the audio books (on tape!), seen the movies/TV shows… I can’t get enough. Truly funny stuff.

"Jennifer Government" by Max Barry

Max Barry: You also might see him listed as “Maxx Barry” but he seems to have changed that in recent times. I discovered Barry through his book Jennifer Government: Welcome to a place where enterprise has taken over enough that your last name is the name of the company where you work and a shoe manufacturer tries to earn “street cred” for his shoes by killing people who buy them. His subsequent efforts are no less interesting or imaginative. Barry’s another one where I’ve read all of his books and love them all.

If you’re looking for something new to read, maybe check some of these out. If you do, drop me a line and let me know what you think.