javascript

I was working on my annual PTO schedule and thought it would be nice to collaborate on it with my wife, but also make it easy to visually indicate which days I was taking off.

Google Sheets is great for that sort of thing, so I started out with the calendar template. Then I wanted to highlight (with a background color) which days I was taking off.

That works well, but then I also wanted to count how many days I was planning to take so I could make sure I didn’t use too many vacation days.

How do you count cells in Google Sheets by background color?

One way is to use the “Power Tools” add-on in Sheets. You have to use the “Pro” version if you go that route, so consider that. I think the pro version is still free right now.

I did try that and had some trouble getting it to work. Maybe I was just doing it wrong. The count was always zero.

Instead, I wrote a script to do this. It was based on this StackOverflow question, but I wanted my function to be parameterized, whereas the code in the question wasn’t.

First, go to “Tools > Script Editor…” in your sheet and paste in this script:

/**
 * Counts the number of items with a given background.
 * @param {String} color The hex background color to count.
 * @param {String} inputRange The range of cells to check for the background color.
 * @return {Number} The number of cells with a matching background.
 */
function countBackground(color, inputRange) {
  var inputRangeCells = SpreadsheetApp.getActiveSheet().getRange(inputRange);
  var rowColors = inputRangeCells.getBackgrounds();
  var count = 0;

  for(var r = 0; r < rowColors.length; r++) {
    var cellColors = rowColors[r];
    for(var c = 0; c < cellColors.length; c++) {
      if(cellColors[c] == color) {
        count++;
      }
    }
  }

  return count;
}

Once that’s in, you can save and exit the script editor.

Back in your sheet, use the new function by entering it like a formula, like this:

=countBackground("#00ff00", "B12:X17")

It takes two parameters:

  • The first parameter is the background highlight color. It’s a hexadecimal color, since that’s how Sheets stores it. The example I showed above is the bright green background color.
  • The second parameter is the cell range you want the function to look at. This is in the current sheet. In the example, I’m looking at the range from B12 through X17.
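The counting logic itself is plain JavaScript, so you can sanity-check it outside of Sheets. Here’s a standalone sketch (no Sheets APIs, and the sample array is made up) of the same nested loop working on a 2D array shaped like what getBackgrounds() returns, one array of hex colors per row:

```javascript
// Standalone version of the counting loop. getBackgrounds() returns a
// 2D array of hex color strings, one inner array per row in the range.
function countColor(color, backgrounds) {
  var count = 0;
  for (var r = 0; r < backgrounds.length; r++) {
    for (var c = 0; c < backgrounds[r].length; c++) {
      if (backgrounds[r][c] == color) {
        count++;
      }
    }
  }
  return count;
}

// Two rows of three cells, two of them bright green.
var sample = [
  ["#ffffff", "#00ff00", "#ffffff"],
  ["#00ff00", "#ffffff", "#ffffff"]
];
// countColor("#00ff00", sample) gives 2.
```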

Gotcha: Sheets caches custom function results. Google Sheets caches the output of custom functions, so once the function runs and calculates the number of cells with the specified background, it won’t automatically run again. Change the background of one of the cells and the function doesn’t re-run, so the count doesn’t update. This is Google Sheets trying to optimize performance. What it means for you is that if you change cell backgrounds, you need to change the function temporarily to force it to update.

For example, say you have a cell that has this:

=countBackground("#00ff00", "B12:X17")

You update some background colors and want your count to update. Change the function to, say, look at a different range temporarily:

=countBackground("#00ff00", "B12:X18")

Then change it back:

=countBackground("#00ff00", "B12:X17")

By changing it, you force Google Sheets to re-run it. I haven’t found any button or control to force custom functions to re-run, so this is how I’ve been tricking it.

net, aspnet

It’s good to develop and deploy your .NET web apps using SSL/TLS to ensure you’re doing things correctly and securely.

If you’re using full IIS for development and testing, it’s easy enough to create a self-signed certificate right from the console. But you have to be an administrator to use IIS in development, and it’s not cool to dev as an admin, so that’s not usually an option. At least, it’s not for me.

IIS Express comes with a self-signed SSL certificate you can use for development and Visual Studio even prompts you to trust that certificate when you first fire up a project using it. (Which is much nicer than the hoops you used to have to jump through to trust it.)

That still doesn’t fix things if you’re using a different host, though, like Kestrel for .NET Core projects; or if you’re trying to share the development SSL certificate across your team rather than using the per-machine self-signed cert.

There are instructions on MSDN for creating a temporary self-signed certificate.

The instructions work well enough, but something my team and I ran into: After a period of time, the certificate you created will no longer be trusted. We weren’t able to reproduce it on demand, just… periodically (between one day and two weeks) the certificate you place in the “Trusted Root Certification Authorities” store as part of the instructions just disappears.

It literally disappears. Your self-signed CA cert will get removed from the list of trusted third party CAs without a trace.

You can try capturing changes to the CA list with Procmon or set up security auditing on the registry keys that track the CA list and you won’t get anything. I tried for months. I worked through it with Microsoft Premier Support and they couldn’t find anything, either.

It’s easy enough to put it back, but it will eventually get removed again.

What is Going On?

The reason for this is the Automatic Third-Party CA Updates process that runs as part of Windows. This process periodically goes to Windows Update to get an updated list of trusted third-party certificate authorities, and any certificates it finds that aren’t in that list get deleted.

Obviously your self-signed dev cert won’t be in the list, so, poof. Gone. It was unclear to me as well as the MS support folks why we couldn’t catch this process modifying the certificate store via audits or anything else.

There are basically two options to fix this (assuming you don’t want to ignore the issue and just put the self-signed CA cert back every time it gets removed):

Option 1: Stop Using Self-Signed Certificates

Instead of using a self-signed development cert, try something from an actual, trusted third-party CA. You can get a free certificate from Let’s Encrypt, for example. Note that Let’s Encrypt certificates currently only last 90 days, but that’s 90 uninterrupted days where your certificate won’t magically lose trust.

Alternatively, if you have an internal CA that’s already trusted, use that. Explaining how to set up an internal CA is a bit beyond the scope of this post and it’s not a single-step five-minute process, but if you don’t want a third-party certificate, this is also an option.

Option 2: Turn Off Automatic Third-Party CA Updates

If you’re on an Active Directory domain you can do this through group policy; on a single machine it’s in the local machine policy under Computer Configuration / Administrative Templates / System / Internet Communication Management / Internet Communication Settings. Set “Turn off Automatic Root Certificate Update” to “Enabled.”

I’m not recommending you do this. You accept some risk if you stop automatically updating your third-party CA list. However, if you’re really stuck and looking for a fix, this is how you turn it off.

Turning off automatic third-party CA updates

personal

Well, I made it to 40.

That being a somewhat significant life milestone, I figured I’d stop to reflect a bit.

To celebrate the occasion, I rented out a local theater (a smaller 18-person place) and we had a private screening of Star Trek Beyond along with a great dinner while watching. It was a great time with family and friends.

I think 40 is not a bad place to be. I feel like I’m now old enough to “know better” but not so old I can’t still take risks. As it’s been approaching I haven’t looked at it in fear or with any real sense of “mortality” as it were, just… “Hey, here comes a sort of marker in the road of life. I wonder what it’ll mean?”

I feel like I’ve lost at least a little bit of that rough edge I had when I was younger, and that’s good. Looking back at the blog history here, the tone of my posts has changed to be slightly less aggressive, though I can’t honestly say a bit of that isn’t still inside me. I still don’t suffer fools gladly and I still get irritated with people who don’t respect me or my time. I’m not a patient guy and I have about ten minutes’ worth of attention span for poorly run meetings.

I’m working on it.

I’ve been in professional software development for about 20 years now. The majority of that work has been web-related, which is sort of weird for me to think that field has been around for that long. I remember doing a project in college writing a web CGI library in Standard ML and that was pretty new stuff.

As of this year, 15 of my career years have been spent at Fiserv. Given my first job was when I was 14, that’s actually most of my working life. I was originally hired at Corillian, which was subsequently acquired by CheckFree, which, in turn, was acquired by Fiserv. With all the changes of management, process, tools, and so on, it feels like having worked at different companies over the years even though the overall job hasn’t changed. That’s actually one of the reasons I haven’t really felt the need to go elsewhere - I’ve had the opportunity to see product development from a long-term standpoint, experience different group dynamics, try different development processes… and all without the instability of being an independent contractor. It’s been good.

I originally went to college wanting to be a computer animator / 3D modeler. Due to a series of events involving getting some bad counseling and misleading information, which I am still very bitter about, I ended up in computer science. Turns out art houses believe you can teach an artist computer science but computer scientists will never be good at art. Even if you have a portfolio and a demo reel. So that was the end of that.

That’s actually why I started in web stuff - I was in love with the UI aspect of things. Over time I’ve found the art in solving computer science problems and have lost my interest in pushing pixels (now with CSS).

I still have a passion for art, I still do crafts and things at home. I really like sewing, which is weird because when I was a kid I would feel dizzy in fabric stores so I hated it. (My mom used to sew when I was a kid.) I actually called them “dizzy places.” I’m curious if maybe I had a chemical sensitivity to the sizing (or whatever) that ships on the fabric. Maybe I was just super bored. In any case, I really like it now. I like drawing and coloring. I like building stuff. I’m probably not the best artist, but I have a good time with it. I do wish I had more time for it.

I waited until later in life to start a family. I’ve only been married for coming up on 10 years now, though I’ve been with my wife for 16 years. I only have one kid and she’s five so she came along slightly later, too. That’s probably a good thing since she’s quite the handful and I’m not sure I’d have had the patience required to handle her in earlier times. I still kinda don’t. I definitely don’t have the energy I would have had 15 years ago or whatever.

I only have one grandparent left. My wife has none. My daughter really won’t know great-grandparents. I’m not sure how I feel about that. I was young when I met my great-grandparents and I honestly don’t remember much about them. I’m guessing it’ll be the same for her.

I love my parents and have a good relationship with them. They’re around for my daughter and love her to pieces. That makes me happy.

I have two sisters, both of whom I love, but only one of whom still talks to the family. The one that still talks to us has a great life and family of her own that means we don’t cross paths often. I’m glad she’s happy and has her own thing going, but I realize that our lives are so different now that if she weren’t actually related to me we probably wouldn’t really keep in touch. A lot of the commonality we shared as kids has disappeared over time.

Friends have come and gone over the years. I don’t have a lot of friends, but I’m glad to say the ones I have are great. I’m still friends with a few people I knew from school, but my school years weren’t the best for me so I don’t really keep in touch with many of them. Some folks I swear I’d be best friends with for life have drifted away. Some folks I never would have guessed have turned into the best friends I could have. I guess that’s how it goes as people change.

I haven’t made my first billion, or my first million, but I’m comfortable and don’t feel unsuccessful. I wish we had a bigger house, but there is also a lot of space we don’t use so maybe it’s just that I want a different layout. I feel comfortable and don’t live paycheck to paycheck so I can’t say I’m not fortunate. (Don’t get me wrong, though, I’m not saying I’m not interested in money. I don’t work for free.)

Anyway, here’s to at least another 40 years. The first 40 has been great, I’m curious what the next batch has in store.

aspnet, autofac

As we all saw, ASP.NET Core and .NET Core went RTM this past Monday. Congratulations to those teams - it’s been a long time coming and it’s some pretty amazing stuff.

Every time an RC (or, now, RTM) comes out, questions start flooding in on Autofac, sometimes literally within minutes of the go-live, asking when Autofac will be coming out with an update. While we have an issue you can track if you want to watch the progress, I figured I’d give a status update on where we are and where we’re going with respect to RTM. I’ll also explain why we are where we are.

Current Status

We have an RC build of core Autofac out on NuGet that is compatible with .NET Core RTM. That includes a version of Autofac.Extensions.DependencyInjection, the Autofac implementation against Microsoft.Extensions.DependencyInjection. We’ll be calling this version 4.0.0. We are working hard to get a “stable” version released, but we’ve hit a few snags at the last minute, which I’ll go into shortly.

About half of the non-portable projects have been updated to be compatible with Autofac 4.0.0. For the most part this was just an update to the NuGet packages, but with Autofac 4.0.0 we also changed to stop using the old code access security model (remember [AllowPartiallyTrustedCallers]?) and some of these projects needed to be updated accordingly.

We are working hard to get the other half of the integration projects updated. Portable projects are being converted to use the new project.json structure and target netstandard framework monikers. Non-portable projects are sticking with .csproj but are being verified for compatibility with Autofac 4.0.0, getting updated as needed.

Why It’s Taking So Long

Oh, where do I begin.

Let me preface this by saying it’s going to sound like a rant. And in some ways it is. I do love what the .NET Core and ASP.NET Core teams have out there now, but it’s been a bumpy ride to get here and many of the bumps are what caused the delay.

First, let’s set the scene: There are really only two of us actively working on Autofac and the various officially supported integration libraries - it’s me and Alex Meyer-Gleaves. There are 23 integration projects we support alongside core Autofac. There’s a repository of examples as well as documentation. And, of course, there are questions that come in on StackOverflow, issues that come in that need responses, and requests on the discussion forum. We support this on the side since we both have our own full-time jobs and families.

I’m not complaining, truly. I raise all that because it’s not immediately evident. When you think about what makes Autofac tick (or AutoMapper, or Xunit, or any of your other favorite OSS projects that aren’t specifically backed/owned by a company like Microsoft or a consultant making money from support), it’s a very small number of people with quite a lot of work to get done in pretty much no time. Core Autofac is important, but it’s the tip of a very large iceberg.

We are sooooo lucky to have community help where we get it. We have some amazing folks who chime in on Autofac questions on StackOverflow. We’ve gotten some pretty awesome pull requests to add some new features lately. Where we get help, it’s super. But, admittedly, IoC containers and how they work internally are tricky beasts. There aren’t a lot of simple up-for-grabs sort of fixes that we have in the core product. It definitely reduces the number of things that we can get help with from folks who want to drop in and get something done quickly. (The integration projects are much easier to help with than core Autofac.)

Now, keep that in the back of your mind. We’ll tie into that shortly.

You know how the tooling for .NET Core changed like 1,000 times? You know how there was pretty much no documentation for most of that? And there were all sorts of weird things like the only examples available being from the .NET teams and they were using internal tools that folks didn’t have great access to. Every new beta or RC release was a nightmare. Mention that and you get comments like, “That’s life in the big city,” which is surely one way to look at it but is definitely dismissive of the pain involved.

Every release, we’d need to reverse-engineer the way the .NET teams had changed their builds, figure out how the tools were working, figure out how to address the breaking changes, and so on. Sometimes (rarely, but it happened) someone would have their project ported over first and we could look at how they did it. We definitely weren’t the only folks to feel that, I know.

NuGet lagging behind was painful because just updating core Autofac didn’t necessarily mean we could update the integration libraries. Especially with the target framework moniker shake-up, you’d find that without the tooling in place to support the whole chain, you could upgrade one library but not be able to take the upgrade in a downstream dependency because the tooling would consider it incompatible.

Anyway, with just the two of us (plus community help where possible) and the tooling/library change challenges, there was a lot of wheel-spinning. There were weeks where all we did was try to figure out the right magic combination of things in project.json to get things compiling. Did it work? I dunno, we can’t test because we don’t have a unit test framework compatible with this version of .NET Core. Can’t take it in a downstream integration library to test things, either, due to tooling challenges.

Lots of time spent just keeping up.

Finally, we’ve been bitten by the “conforming container” introduced for ASP.NET Core. Microsoft.Extensions.DependencyInjection is an abstraction around DI that was introduced to support ASP.NET Core. It’s a “conforming container” because it means anything that backs the IServiceProvider interface they use needs to support certain features and react in the same way. In some cases that’s fine. For the most part, simple stuff like GetService<T>() is pretty easy to implement regardless of the backing container.

The stuff you can’t do in a conforming container is use the container-specific features. For example, Autofac lets you pass parameters during a Resolve<T>() call. You can’t do that without actually referencing the Autofac lifetime scope - the IServiceProvider interface serves as a “lowest common denominator” for containers.
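To illustrate that “lowest common denominator” idea, here’s a small sketch in plain JavaScript with hypothetical names (the real interfaces are .NET’s IServiceProvider and Autofac’s lifetime scope, not these). Once you’re behind the conforming interface, the container-specific features are simply unreachable:

```javascript
// A hypothetical full-featured container: resolve() accepts parameters,
// the way Autofac's Resolve<T>() overloads can.
function makeContainer(factories) {
  return {
    resolve: function (name, params) {
      return factories[name](params || {});
    }
  };
}

// The conforming-container view of the same container: a bare
// getService() with no way to pass parameters through.
function asServiceProvider(container) {
  return {
    getService: function (name) {
      return container.resolve(name);
    }
  };
}

var container = makeContainer({
  greeter: function (p) { return "Hello, " + (p.name || "world"); }
});
var provider = asServiceProvider(container);

container.resolve("greeter", { name: "Ada" }); // parameters work here
provider.getService("greeter");                // but not through the abstraction
```

The abstraction still resolves services just fine; it only loses the features that aren’t part of the common interface.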

All along the way, we’ve been testing the junk out of Autofac to make sure it works correctly with Microsoft.Extensions.DependencyInjection. It’s been just fine so far. However, at the last minute (20 days ago now) we got word that not only did we need to implement the service provider interface as specified, but we also need to return IEnumerable<T> collections in the order that the components were registered.

We don’t currently do that. Given IEnumerable<T> has no specification around ordering, and all previous framework features requiring ordering (controller action filters, etc.) used an Order property or something like that, it’s never been an issue. Interfaces using IEnumerable<T> generally don’t assume order (or, at least, shouldn’t). This is a new requirement for the conforming container and it’s amazingly non-trivial to implement.

It’s hard to implement because Autofac tracks registrations in a more complex way than just adding them to a list. If you add a standard registration, it does get added to a list. But if you add .PreserveExistingDefaults() because you want to register something and keep the existing default service in place if one’s already registered - that goes in at the end of the list instead of at the head. We also support very dynamic “registration sources” - a way to add registrations to the container on the fly without making explicit registrations. That’s how we handle things like Lazy<T> automatically working.
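To make the mismatch concrete, here’s a tiny sketch in plain JavaScript, with hypothetical names rather than Autofac’s actual internals, showing how a default-preserving registration makes list order diverge from registration order:

```javascript
// Hypothetical registration bookkeeping. Here the head of the list is
// the default service for a given interface.
function makeRegistry() {
  var registrations = [];
  return {
    // A standard registration becomes the new default: head of the list.
    register: function (service, impl) {
      registrations.unshift({ service: service, impl: impl });
    },
    // A PreserveExistingDefaults-style registration goes to the end,
    // so any existing default stays the default.
    registerPreservingDefaults: function (service, impl) {
      registrations.push({ service: service, impl: impl });
    },
    // Resolving all implementations walks the list, which is no longer
    // the order things were registered in.
    resolveAll: function (service) {
      return registrations
        .filter(function (r) { return r.service === service; })
        .map(function (r) { return r.impl; });
    }
  };
}

var reg = makeRegistry();
reg.register("ILogger", "consoleLogger");
reg.register("ILogger", "sqlLogger");
reg.registerPreservingDefaults("ILogger", "fileLogger");
// Registration order: console, sql, file.
// List order: sql, console, file.
```

Layer dynamic registration sources on top of that and “the order the components were registered” stops being a well-defined position in any single list.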

(That’s sort of a nutshell version. It gets more complex as you think about child/nested lifetime scopes.)

Point being, this isn’t as simple as just returning the list of stuff that got registered. We have to update Autofac to start keeping track of registration order yet still allow the existing functionality to behave correctly. And what do you do with dynamic registration sources? Where do those show up in the list?

The answers are not so straightforward.

We are currently working hard on solving that ordering problem. Actually, right now, Alex is working hard on that while I try and get the rest of the 23 integration projects converted, update the documentation, answer StackOverflow/issue/forum questions, and so on. Thank goodness for that guy because I couldn’t do this by myself.

If you would like to follow along on the path to a stable release, check out these issues:

While it may not be obvious, adding lots of duplicate issues asking for status or “me too” comments on issues in the repo doesn’t help. In some cases it’s frustrating (it’s a “no pressure, but hurry up” vote) and may slow things down as we spend what little time we have responding to the dupes and the questions rather than actually getting things done. I love the enthusiasm and interest, but please help us out by not adding duplicates. GitHub recently added “reactions” to issues (that little smiley face at the top-right of an issue comment) - jump in with a thumbs-up or smile or something; or subscribe to an issue if you’re interested in following along (there’s a subscribe button along the right up near the top of the issue, under the tags).

Thanks (So Far)

Finally, I have some thanks to hand out. Like I said, we couldn’t get this done without support from the community. I know I’m probably leaving someone out, and if so, I’m sorry - please know I didn’t intentionally do it.

  • The ASP.NET Core team - These guys took the time to talk directly to Alex and me about how things were progressing and answered several questions.
  • Oren Novotny - When the .NET target framework moniker problem was getting us down, he helped clear things up.
  • Cyril Durand and Viktor Nemes - These guys are rockstars on StackOverflow when it comes to Autofac questions.
  • Caio Proiete, Taylor Southwick, Kieren Johnstone, Geert Van Laethem, Cosmin Lazar, Shea Strickland, Roger Kratz - Pull requests of any size are awesome. These folks submitted to the core Autofac project within the last year. This is where I’m sure I missed someone because it was a manually pulled list and didn’t include the integration libraries. If you helped us out, know I’m thanking you right now.

personal

As of yesterday, June 27, 2016, I’ve worked for 15 years at Fiserv.

My 15 year certificate

Given I got my first “official” job when I was 14 and I turn 40 this year, that’s over half of my professional working life that I’ve been here.

I started in the marketing department back when the company was Corillian. I got hired to help work on the corporate web site (which now redirects to a Fiserv page on internet banking).

I think it was a year or two into that when some restructuring came along and the web site transferred to the IT department. I transferred with it and became the only IT developer doing internal tools and working on automating things.

I remember working on rolling out the original SharePoint 2003 along with Windows SharePoint Services in an overall Office 2003 effort. We had some pretty fancy “web parts” in VBScript to do custom document indexing and reporting. I vaguely recall updating several of those parts to be .NET 1.1 assemblies.

It was in 2004 when a need arose for a developer to work on some proof-of-concept and demo web sites that our sales folks could take around on calls. I happened to be free, so I worked with our product folks on those things. As sometimes happens, those POC and demo sites became the template for what we wanted the next version of the product to be like. And since I’d already worked on them… why not come over to product development and do the work “for real this time?”

I worked on the very first iteration of the Corillian Consumer Banking product. That was in .NET 1.1 though 2.0 was right around the corner. I remember having to back-port features like ASP.NET master pages into 1.1. (I still like our implementation better.) This was back when Hanselman was still at Corillian and we worked together on several features, particularly where the UI had to interact with/consume services.

In early 2007 CheckFree acquired Corillian. After the dust on that settled, I was still working on Consumer Banking - basically, same job, new company. There were definitely some process hiccups as we went from a fairly agile Scrum-ish methodology that Corillian had into CheckFree’s version of Rational Unified Process, but we made do.

In late 2007, Fiserv acquired CheckFree.

Yeah, that was some crazy times.

Fiserv, for the most part, adopted CheckFree’s development processes, at least as far as our group could tell. RUP gave way after a while to something more iterative but still not super agile. It was only pretty recently (last five-ish years?) that we’ve finally made our way back to Scrum.

The majority of my time has been in web service and UI development. I did get my Microsoft Certified DBA and Microsoft Certified .NET Solutions Developer certifications so I’m not uncomfortable working at all layers, but I do like to spend my time a little higher than the data tier when possible.

In my most recent times, I’ve been working on REST API stuff using ASP.NET Core. Always something new to learn, always interesting.

Also interesting is that with the various acquisitions, reorganizations, and re-prioritizations we’ve seen over the years, while I have worked (effectively) for the same company, it’s given me a lot of great experience with different people, processes, tools, and development environments. In some cases, it’s been like working different jobs… despite it being the same job. Definitely some great experience.

Plus, I’m afforded (a small amount of) time to help out the open source community with Autofac and other projects.

That’s actually why I’ve stayed so long. I can only speak for myself, but even with me sort of doing “the same thing” for so long… it’s not the same thing. I’m always learning something new, there’s always something changing, there’s always a new problem to solve. I work with some great people who are constantly trying to improve our products and processes.

And a bit of seniority never hurt anyone.