personal

As of yesterday, June 27, 2016, I’ve worked for 15 years at Fiserv.

My 15 year certificate

Given I got my first “official” job when I was 14 and I turn 40 this year, that’s over half of my professional working life that I’ve been here.

I started in the marketing department back when the company was Corillian. I got hired to help work on the corporate web site, www.corillian.com (which now redirects to a Fiserv page on internet banking).

I think it was a year or two into that when some restructuring came along and the web site transferred to the IT department. I transferred with it and became the only IT developer doing internal tools and working on automating things.

I remember working on rolling out the original SharePoint 2003 along with Windows SharePoint Services in an overall Office 2003 effort. We had some pretty fancy “web parts” in VBScript to do custom document indexing and reporting. I vaguely recall updating several of those parts to be .NET 1.1 assemblies.

It was in 2004 when a need arose for a developer to work on some proof-of-concept and demo web sites that our sales folks could take around on calls. I happened to be free, so I worked with our product folks on those things. As sometimes happens, those POC and demo sites became the template for what we wanted the next version of the product to be like. And since I’d already worked on them… why not come over to product development and do the work “for real this time?”

I worked on the very first iteration of the Corillian Consumer Banking product. That was in .NET 1.1 though 2.0 was right around the corner. I remember having to back-port features like ASP.NET master pages into 1.1. (I still like our implementation better.) This was back when Hanselman was still at Corillian and we worked together on several features, particularly where the UI had to interact with/consume services.

In early 2007 CheckFree acquired Corillian. After the dust on that settled, I was still working on Consumer Banking - basically, same job, new company. There were definitely some process hiccups as we went from a fairly agile Scrum-ish methodology that Corillian had into CheckFree’s version of Rational Unified Process, but we made do.

In late 2007, Fiserv acquired CheckFree.

Yeah, that was some crazy times.

Fiserv, for the most part, adopted CheckFree’s development processes, at least as far as our group could tell. RUP gave way after a while to something more iterative but still not super agile. It was only pretty recently (last five-ish years?) that we’ve finally made our way back to Scrum.

The majority of my time has been in web service and UI development. I did get my Microsoft Certified DBA and Microsoft Certified .NET Solutions Developer certifications so I’m not uncomfortable working at all layers, but I do like to spend my time a little higher than the data tier when possible.

Most recently, I’ve been working on REST APIs using ASP.NET Core. Always something new to learn, always interesting.

Also interesting: with the various acquisitions, reorganizations, and re-prioritizations we’ve seen over the years, even though I’ve (effectively) worked for the same company, I’ve gotten a lot of great experience with different people, processes, tools, and development environments. In some cases it’s been like working different jobs… despite it being the same job.

Plus, I’m afforded (a small amount of) time to help out the open source community with Autofac and other projects.

That’s actually why I’ve stayed so long. I can only speak for myself, but even with me sort of doing “the same thing” for so long… it’s not the same thing. I’m always learning something new, there’s always something changing, there’s always a new problem to solve. I work with some great people who are constantly trying to improve our products and processes.

And a bit of seniority never hurt anyone.

humor, personal

In the last couple of weeks I’ve had the opportunity to get together with folks for lunch or dinner and I’m finding it’s hard to agree on a “nice place to eat.”

Here’s the thing.

I’m not a really picky eater. At least, I don’t think so. I like simple food that tastes good. The thing is, I live in the Portland, OR metro area, so when someone talks about “a nice place to eat” it usually has something to do with an independent restaurateur who has “a fresh take on old ideas.” This generally amounts to “I don’t want to eat there” for me.

Don’t get me wrong, I’ve tried several of these places. I have yet to enjoy them. It’s not like I didn’t give it a fair shake.

With that in mind, I decided to post a list of “restaurant red flags” - things that warn me against eating at a place. No single item here instantly disqualifies a place, but a combination of them will probably result in a “no.”

If your restaurant has/does/says any of these things, I’m out:

  • The description of your restaurant contains some version of the word “gastronomy” that is not immediately prefixed by “molecular.”
  • All the pictures of the meat dishes appear to be barely cooked to rare.
  • I have to look up two or more of the words on any given menu item.
  • Your menu is in English but you don’t use the common English words for things so you can sound fancier (e.g., you use “ali-oli” instead of “aioli”).
  • You serve a dish based on a creature I would otherwise consider “vermin” rather than “game.”
  • You’ve been in any top ten restaurant list where the food is described as “new and exciting.”
  • The intent of the food is to have lots of small dishes purchased and get passed around. (I hate tapas. Joey doesn’t share food.)
  • You think it’s a great idea to have a lot of community tables and no individual tables.
  • There’s a lot of fermented stuff on the menu that isn’t alcohol.
  • More than one item on the menu can be described as a “delicacy.”
  • A significant number of the meat dishes are made with the less-common cuts of meat (cheek, tongue, tail, etc.).

I may add to that list in the future, but basically, yeah. Red flags.

aspnet, security

I’ve been working with ASP.NET Core in a web farm environment. Things worked great when deployed to an Azure Web App but in a different farm setting (Pivotal Cloud Foundry) I started getting an error I hadn’t seen before: System.AggregateException: Unhandled remote failure. ---> System.Exception: Unable to unprotect the message.State.

This happened in the context of the OpenID Connect middleware, specifically when a value encrypted by one instance of the ASP.NET Core application was being decrypted by a different instance of the application.

The problem is that the values used in DataProtection weren’t synchronized across all instances of the application. This is a lot like the ASP.NET classic issue where you have to ensure all nodes in the farm have the machine key synchronized so ViewState and other things can be shared across application instances.

Instead of machine key, ASP.NET Core uses Microsoft.AspNetCore.DataProtection for handling the encryption keys used to protect state values that get posted between the app and the client. There is plenty of documentation on how this works but not much in the way of a concise explanation of what it takes to get things working in a farm. Hopefully this will help.

How DataProtection Gets Added

Normally you don’t manually add the data protection bits to the application pipeline. It’s done for you when you call services.AddMvc() during the ConfigureServices() part of application startup. That services.AddMvc() line actually fans out into adding a lot of default services, some of which are the defaults for data protection.

What to Synchronize

Where classic ASP.NET has just the machine key, in ASP.NET Core you have three things that must line up for a farm scenario:

  • The application discriminator - a value identifying the application, based by default on the installed location of the app.
  • The master encryption key used to protect the session keys at rest (not used on non-Windows environments).
  • The encrypted set of session keys that actually protect the values passed between the app and the client; these get persisted so every instance can read them.

Why This Doesn’t “Just Work” in All Farms

  • The application discriminator, being based on the installed location of the app, is great if all machines in the farm are identical. If, instead, you’re using some containerization techniques, a virtual filesystem, or otherwise don’t have the app installed in the same location everywhere, you need to manually set this.
  • The master encryption key, while not used on non-Windows environments, does otherwise need to be synchronized. If you choose to use a certificate, the current EncryptedXml mechanism used internally allows you to pass in a certificate for use in encryption but in decryption it requires the certificate to be in the machine certificate store. That requirement is less than stellar since it means you can’t store the certificate in something like Azure Key Vault.
  • The encrypted set of session keys is easy to persist in a file share… if the farm is allowed to store things in a common share and all the network ports are open to allow that. If you want to store in a different repository like a database or Redis, there’s nothing out of the box that helps you.

Why This Works in Azure Web Apps

There is some documentation outlining how this works in Azure. In a nutshell:

  • All applications are installed to the same location, so the application discriminator lines up.
  • Keys aren’t encrypted at rest, so there is no master encryption key.
  • The session keys are put in a special folder location that is “magically” synchronized across all instances of the Azure Web App.

Setting Your Own Options

To set your own options, call services.AddDataProtection() after you call services.AddMvc() in your ConfigureServices() method in Startup. It will look something like this:

public virtual void ConfigureServices(IServiceCollection services)
{
  services.AddMvc();
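  // Now override the data protection defaults with settings shared by every node in the farm.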
  services
    .AddDataProtection(opt => opt.ApplicationDiscriminator = "your-app-id")
    .ProtectKeysWithYourCustomKey()
    .PersistKeysToYourCustomLocation();
}
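
For comparison, if a shared network folder and the machine certificate store are workable in your environment, the built-in ProtectKeysWithCertificate and PersistKeysToFileSystem extensions can stand in for those placeholder calls. Here’s a minimal sketch along those lines - the certificate path, password, and share path are just placeholders, and remember the caveat above that decryption still wants the certificate in the machine certificate store. (You’ll also need using directives for System.IO and System.Security.Cryptography.X509Certificates.)

public virtual void ConfigureServices(IServiceCollection services)
{
  services.AddMvc();
  services
    .AddDataProtection(opt => opt.ApplicationDiscriminator = "your-app-id")
    // Master key protection via a certificate (see the machine store caveat above).
    .ProtectKeysWithCertificate(new X509Certificate2(@"C:\secrets\farm-cert.pfx", "cert-password"))
    // Session keys persisted to a share every node in the farm can reach.
    .PersistKeysToFileSystem(new DirectoryInfo(@"\\server\share\dataprotection-keys"));
}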

Example Extensions

To help get you on your way, I’ve published a couple of extensions on GitHub. They include:

  • XML encryption/decryption using a certificate that isn’t required to be in a machine certificate store, bypassing the decryption requirement described above. This allows you to store the master certificate in a repository like Azure Key Vault.
  • Encrypted XML storage in Redis. This allows you to share the session keys in a Redis database rather than a file share.

I wanted to be able to not only tidy my JSON objects, but also sort by property. I wanted to do this so I could unify my project.json and config.json files while working in .NET Core. Figuring out where people were adding keys, finding redundant things added to files, and so on… having a predictable order makes it all that much easier.

Up front, I’ll tell you this is a total hack. I got it to work as a user package (code in your User folder) but haven’t taken it as far as putting it into a repo or adding it to Package Control. That’s probably the next step. I just wanted to get this out there.

I’ll also say this is instructions for a Windows environment. The places you’ll have to adjust for Linux should be obvious, but I don’t have guidance or instructions to help you. Sorry.

First, install the External Command package. This is a great general-purpose package for setting up external commands and pushing Sublime Text buffers through. Select some text and have that text passed to an external shell command on stdin. (No selection? It runs the whole file.)

Next, create a folder called SortJson in your User package folder. This is where we’ll put the contents of the package.

If you don’t have Node installed… why not? Really, though, if you don’t, go get it and install it. We need it because we use the Node json-stable-stringify package to do the work.

Drop to a command prompt in the SortJson folder and install the json-stable-stringify module.

npm install json-stable-stringify

You should get a node_modules folder under that SortJson folder and inside you’ll have json-stable-stringify (and maybe dependencies, but that’s fine).

Now we need a little script to take the contents of stdin and pass it through json-stable-stringify.

Create a script called sort-json.js in the SortJson folder. In that script, put this:

var stringify = require('json-stable-stringify');
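// Options passed to json-stable-stringify; 'space' controls the output indentation.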
var opts = {
    "space": 2
};

var stdin = process.stdin,
    stdout = process.stdout,
    inputChunks = [];

stdin.resume();
stdin.setEncoding('utf8');

stdin.on('data', function (chunk) {
    inputChunks.push(chunk);
});

stdin.on('end', function () {
    var inputJSON = inputChunks.join(""),
        parsedData = JSON.parse(inputJSON),
        outputJSON = stringify(parsedData, opts);
    stdout.write(outputJSON);
    stdout.write('\n');
});

Unfortunately, the External Command package doesn’t let you set a working directory, so you can’t just fire up Node and run sort-json.js directly. We have to create a little batch file that helps our script find the json-stable-stringify module at runtime.

Create a batch script called sort-json.cmd in the SortJson folder. In that script, put this:

@SETLOCAL
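@REM Point NODE_PATH at this folder's node_modules so Node can find json-stable-stringify.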
@SET NODE_PATH=%~dp0node_modules
@node "%~dp0sort-json.js" %*

That temporarily sets the NODE_PATH environment variable to the SortJson\node_modules folder before running the sort-json.js script, so Node can resolve the json-stable-stringify module.
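
If you want to sanity-check the plumbing before wiring it into Sublime, you can run something like this from a command prompt in the SortJson folder:

echo { "b": 2, "a": 1 } | sort-json.cmd

You should get the same object back with the keys sorted.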

The last thing you need is a tie to the Sublime Text command palette so you can run the command to sort JSON.

Create a file called sort-json.sublime-commands in the SortJson folder. In that file, put this:

[
    {
        "caption": "JSON: Sort Object",
        "command": "filter_through_command",
        "args": { "cmdline": "\"%APPDATA%\\Sublime Text 3\\Packages\\User\\SortJson\\sort-json.cmd\"" }
    }
]

You’ll have to restart Sublime, but when you do you’ll see a command in the palette “JSON: Sort Object”. Load up a file with a JSON object and run that command. You should get a sorted JSON object.

I try to pair this with the JsFormat package (for JSBeautify integration) as well as SublimeLinter-json (for linting/error checking), both of which are in Package Control. If you want to tweak the formatting that comes out of the sort, the opts variable at the top of sort-json.js holds the options passed to json-stable-stringify.

personal, movies, humor

I was watching Cutthroat Island this weekend with my daughter, who loves pirate movies, when I started thinking about these giant treasure chests full of gold you see in such films.

The stereotypical treasure chest

While I get that it’s a movie, it was fun to think about how practical carting around that treasure chest of doubloons might be.

Assumptions:

  • The chest is packed with gold doubloons. Doubloons are actually 22k gold rather than pure 24k, but to keep the math simple we’ll treat the contents as pure gold.
  • Gold weighs 19.3g per cc.
  • A doubloon weighs 6.867g.
  • The coins pack into the chest at about 35% efficiency.

Now, let’s say the treasure chest is like 90cm x 60cm x 60cm on the inside. A little large-ish, but not unheard of in a pirate movie.

  • The chest has 324,000cc interior capacity.
  • Multiplied by the 35% packing efficiency, you’d have 113,400cc of gold.
  • 113,400cc x 19.3g per cc = 2,188,620g = 2188.62kg (4825.1lb).

There is no way pirates are carrying around 5000lb gold chests.

Let’s figure a couple guys - one on each end of the chest - need to cart the chest through the jungle or something. They’re strong, but they’re not Hafþór Júlíus Björnsson. You’re looking at something like 115kg (253.5lb) or so before it gets unwieldy.

Working backwards, 115kg is 5958.55cc of gold. With the packing ratio, that’s a chest with 17024.43cc total capacity. To make the math easy, let’s say it’s a cube-shaped chest. That’d yield a chest with internal dimensions of roughly 25.73cm (10.13 inches) on a side.

That’s a pretty tiny treasure chest.

At least, tiny in comparison to what you usually see on a pirate movie.

Now, maybe I’m being generous with my packing efficiency. Maybe it’s far less than 35%, or it could be that the chest isn’t packed to the top with gold, or both.

If you had that 90cm x 60cm x 60cm chest and limited yourself to the 115kg weight, that’d put the packing efficiency of doubloons at closer to 2% (5958.55cc of gold in a 324,000cc chest); or, keeping the 35% efficiency, it’d mean the chest is only about 5% full - just a few centimeters of coins at the bottom.

Just for fun, we can also calculate the value of such treasure. The price of gold today (as I write this) is $39,175.66 USD per kg.

  • 115kg of gold = $4,505,200.90 USD
  • 2188.62kg of gold = $85,740,632.99 USD

If the chest was full of doubloons (which, again, are actually 22k gold, not 24k), we know that doubloons weigh 6.867g so you’d have…

  • 115kg of doubloons = 16,746 doubloons
  • 2188.62kg of doubloons = 318,715 doubloons

Doubloons seem to be based on weight rather than physical size (or, at least, I didn’t see any average size listed anywhere in my two minutes of searching), so I’m not sure how big a chest with that number of doubloons might need to be. I can’t imagine it’s too far off from my original calculation.

Anyway, it was kind of fun to think about. It makes for a better movie to have the giant chest of treasure, so it’s all good.