dotnet, aspnet

While working on a REST API project, I was tasked with creating a DELETE operation that would take the resource ID in the URL path, like:

DELETE /api/someresource/reallylongresourceidhere HTTP/1.1

The resource ID we had was a really, really long base-64 encoded value – about 750 characters. No, don’t bug me about why that was the case; just… stick with me. I had to get it to work in both IIS and OWIN hosting.

STOP. STOP RIGHT HERE. I’m going to tell you some ways to tweak URL request validation. This is a security thing. Security is Good. IN THE END, I DIDN’T DO THESE. I AM NOT RECOMMENDING YOU DO THEM. But, you know, if you run into one of the issues I ran into… here are some ways you can work around it at your own risk.

Problem 1: The Overall URL Length

By default, ASP.NET has a max URL length set at 260 characters. Luckily, you can change that in web.config:

<configuration>
  <system.web>
    <httpRuntime maxUrlLength="2048" />
  </system.web>
</configuration>

Setting that maxUrlLength value got me past the first hurdle.

Problem 2: URL Decoding

Base 64 includes the “/” character – the path slash. Even if you encode it on the URL like this…

/api/someresource/abc%2Fdef%2Fghi

…when .NET reads it, it gets entirely decoded:

/api/someresource/abc/def/ghi

…which then, of course, got me a 404 Not Found because my route wasn’t set up like that.

This is also something you can control through web.config:

<configuration>
  <uri>
    <schemeSettings>
      <add name="http" genericUriParserOptions="DontUnescapePathDotsAndSlashes" />
      <add name="https" genericUriParserOptions="DontUnescapePathDotsAndSlashes" />
    </schemeSettings>
  </uri>
</configuration>

Now that the URL is allowed through and it’s not being proactively decoded (so I can get routing working), the last hurdle is…

Problem 3: Max Path Segment Length

The resource ID, if you recall, is about 750 characters long. I can now have a URL come through that’s 2048 characters long, but there’s still validation on the length of each individual path segment.

The tweak for this is in the registry. Find the registry key HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\HTTP\Parameters and add a DWORD value UrlSegmentMaxLength with the value of the max segment length. The default is 260; I had to update mine to 1024.
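
If you’d rather script that than click around in regedit, something like this from an elevated PowerShell prompt should do it – a minimal sketch, where 1024 is just the value I needed:

# UrlSegmentMaxLength is a DWORD; -Force overwrites the value if it already exists.
$key = 'HKLM:\System\CurrentControlSet\Services\HTTP\Parameters'
New-ItemProperty -Path $key -Name 'UrlSegmentMaxLength' -PropertyType DWord -Value 1024 -Force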

After you change that value, you have to reboot to get it to take effect.

This is the part that truly frustrated me. Even running in the standalone OWIN host, this value is still used. I thought OWIN self-hosting was getting us away from IIS, but the low-level HTTP.SYS driver is still being used in there somewhere. I guess I just didn’t realize that, and maybe I should have. I mean, .NET is all just wrappers on unmanaged crap anyway, right? :)

What I Ended Up Doing

Having to do all that to get this working set me on edge. I don’t mind increasing, say, the max URL length, but I had to tweak a lot, and that left me with a bad taste in my mouth. Deployment pain, potential security pain… not worth it.

Since we had control over how the resource IDs were generated in the first place, I changed the algorithm so we could fit them all under 260 characters – the default max path segment length. I left the overall URL length configuration in web.config at a higher number, but shrank it to 1024 instead of sticking at 2048. I ditched the registry change – it was no longer needed.
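
Just for illustration – and definitely not the algorithm we actually used – here’s one hypothetical way to squash a long opaque value down under the limit (and dodge the “/” problem at the same time), sketched in PowerShell:

# Hypothetical sketch only: hash the long value, then use a URL-safe base-64 alphabet.
$longId = 'the-original-really-long-value-goes-here'
$sha = [System.Security.Cryptography.SHA256]::Create()
$hash = $sha.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($longId))
[Convert]::ToBase64String($hash).TrimEnd('=').Replace('+', '-').Replace('/', '_')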

dotnet, vs, autofac

In the Autofac project we maintain all of the various packages and integrations in one solution. To make sure each package builds against the right version of Autofac, all of the references between them go through NuGet.

A challenge we face is that when we’re testing a new release of Autofac, we want to update specific integration projects to the latest version so we can test them, eventually upgrading everything as needed. Running through the NuGet GUI to do something like that is time-consuming.

Instead, I use a little script in the Package Manager Console to filter the list of projects down to the ones I want to update and then run the update command against that filtered set. It looks like this:

Get-Project -All | Where-Object { $_.Name -ne "Autofac" -and $_.Name -ne "Autofac.Tests" } | ForEach-Object { Update-Package -Id "Autofac" -ProjectName $_.Name -Version "3.5.0-CI-114" -IncludePrerelease -Source "Autofac MyGet" }

In that little script…

  • Get-Project -All gets the entire list of projects in the current loaded solution.
  • The Where-Object is where you filter out the stuff you don’t want upgraded. I don’t want to run the Autofac upgrade on Autofac itself, but I could add other projects to the exclusion list, too.
  • The ForEach-Object runs the package update for each selected project.
    • The -Version parameter is the build from our MyGet feed that I want to try out.
    • The -Source parameter is the NuGet source name I’ve added for our MyGet feed.

You might see a couple of errors go by if you don’t filter out a project that doesn’t reference the thing you’re updating (e.g., if you try to update Autofac in a project that doesn’t have an Autofac reference), but that’s OK.
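
If those errors bug you, an (untested) variation on the same idea is to only pick the projects that already have the package installed:

Get-Project -All |
  Where-Object { Get-Package -ProjectName $_.Name | Where-Object { $_.Id -eq "Autofac" } } |
  ForEach-Object { Update-Package -Id "Autofac" -ProjectName $_.Name -Version "3.5.0-CI-114" -IncludePrerelease -Source "Autofac MyGet" }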

James Chambers has a great roundup of some additional helpful NuGet PowerShell script samples. Definitely something to keep handy.

dotnet, gists, build

I’ve run across a situation similar to what many folks describe online: I have a solution with a pretty modular application, and when I build it, I don’t get all the indirect dependencies copied to the output folder.

I found a blog article with an MSBuild target in it that supposedly fixes some of this indirect copying nonsense, but as it turns out, it doesn’t actually go far enough.

My app looks something like this (from a reference perspective):

  • Project: App Host
    • Project: App Startup/Coordination
      • Project: Core Utilities
      • Project: Server Utilities
        • NuGet references and extra junk

The application host is where I need everything copied so it all works, but the NuGet references and extra junk way down the stack aren’t making it there, so there are runtime explosions.

I also decided to solve this with MSBuild, but using an inline code task. This task will…

  1. Look at the list of project references in the current project.
  2. Go find the project files corresponding to those project references.
  3. Calculate the path to the project reference output assembly and include that in the list of indirect references.
  4. Calculate the paths to any third-party references that include a <HintPath> (indicating the item isn’t GAC’d) and include those in the list of indirect references.
  5. Look for any additional project references – if they’re found, go to step 2 and continue recursing until there aren’t any project references we haven’t seen.

While it’s sort of the “nuclear option,” it means that my composable application will have all the stuff ready and in place at the Host level for any plugin runtime assemblies to be dropped in and be confident they’ll find all the platform support they expect.

Before I paste in the code, the standard disclaimers apply: Works on my box; no warranty expressed or implied; no support offered; YMMV; and so on. If you grab this and need to tweak it to fit your situation, go for it. I’m not really looking to make this The Ultimate Copy Paste Solution for Dependency Copy That Works In Every Situation.

And with that, here’s a .csproj file snippet showing how to use the task as well as the task proper:

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="12.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- All the stuff normally found in the project, then in the AfterBuild event... -->
  <Target Name="AfterBuild">
    <!-- Here's the call to the custom task to get the list of dependencies -->
    <ScanIndirectDependencies StartFolder="$(MSBuildProjectDirectory)"
                              StartProjectReferences="@(ProjectReference)"
                              Configuration="$(Configuration)">
      <Output TaskParameter="IndirectDependencies" ItemName="IndirectDependenciesToCopy" />
    </ScanIndirectDependencies>

    <!-- Only copy the file in if we won't stomp something already there -->
    <Copy SourceFiles="%(IndirectDependenciesToCopy.FullPath)"
          DestinationFolder="$(OutputPath)"
          Condition="!Exists('$(OutputPath)\%(IndirectDependenciesToCopy.Filename)%(IndirectDependenciesToCopy.Extension)')" />
  </Target>


  <!-- THE CUSTOM TASK! -->
  <UsingTask TaskName="ScanIndirectDependencies" TaskFactory="CodeTaskFactory" AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v12.0.dll">
    <ParameterGroup>
      <StartFolder Required="true" />
      <StartProjectReferences ParameterType="Microsoft.Build.Framework.ITaskItem[]" Required="true" />
      <Configuration Required="true" />
      <IndirectDependencies ParameterType="Microsoft.Build.Framework.ITaskItem[]" Output="true" />
    </ParameterGroup>
    <Task>
      <Reference Include="System.Xml"/>
      <Using Namespace="Microsoft.Build.Framework" />
      <Using Namespace="Microsoft.Build.Utilities" />
      <Using Namespace="System" />
      <Using Namespace="System.Collections.Generic" />
      <Using Namespace="System.IO" />
      <Using Namespace="System.Linq" />
      <Using Namespace="System.Xml" />
      <Code Type="Fragment" Language="cs">
      <![CDATA[
var projectReferences = new List<string>();
var toScan = new List<string>(StartProjectReferences.Select(p => Path.GetFullPath(Path.Combine(StartFolder, p.ItemSpec))));
var indirectDependencies = new List<string>();

bool rescan;
do{
  rescan = false;
  foreach(var projectReference in toScan.ToArray())
  {
    if(projectReferences.Contains(projectReference))
    {
      toScan.Remove(projectReference);
      continue;
    }

    Log.LogMessage(MessageImportance.Low, "Scanning project reference for other project references: {0}", projectReference);

    var doc = new XmlDocument();
    doc.Load(projectReference);
    var nsmgr = new XmlNamespaceManager(doc.NameTable);
    nsmgr.AddNamespace("msb", "http://schemas.microsoft.com/developer/msbuild/2003");
    var projectDirectory = Path.GetDirectoryName(projectReference);

    // Find all project references we haven't already seen
    var newReferences = doc
          .SelectNodes("/msb:Project/msb:ItemGroup/msb:ProjectReference/@Include", nsmgr)
          .Cast<XmlAttribute>()
          .Select(a => Path.GetFullPath(Path.Combine(projectDirectory, a.Value)));

    if(newReferences.Count() > 0)
    {
      Log.LogMessage(MessageImportance.Low, "Found new referenced projects: {0}", String.Join(", ", newReferences));
    }

    toScan.Remove(projectReference);
    projectReferences.Add(projectReference);

    // Add any new references to the list to scan and mark the flag
    // so we run through the scanning loop again.
    toScan.AddRange(newReferences);
    rescan = true;

    // Include the assembly that the project reference generates.
    var outputLocation = Path.Combine(Path.Combine(projectDirectory, "bin"), Configuration);
    var localAsm = Path.GetFullPath(Path.Combine(outputLocation, doc.SelectSingleNode("/msb:Project/msb:PropertyGroup/msb:AssemblyName", nsmgr).InnerText + ".dll"));
    if(!indirectDependencies.Contains(localAsm) && File.Exists(localAsm))
    {
      Log.LogMessage(MessageImportance.Low, "Added project assembly: {0}", localAsm);
      indirectDependencies.Add(localAsm);
    }

    // Include third-party assemblies referenced by file location.
    var externalReferences = doc
          .SelectNodes("/msb:Project/msb:ItemGroup/msb:Reference/msb:HintPath", nsmgr)
          .Cast<XmlElement>()
          .Select(a => Path.GetFullPath(Path.Combine(projectDirectory, a.InnerText.Trim())))
          .Where(e => !indirectDependencies.Contains(e));

    Log.LogMessage(MessageImportance.Low, "Found new indirect references: {0}", String.Join(", ", externalReferences));
    indirectDependencies.AddRange(externalReferences);
  }
} while(rescan);

// Expand to include pdb and xml.
var xml = indirectDependencies.Select(f => Path.Combine(Path.GetDirectoryName(f), Path.GetFileNameWithoutExtension(f) + ".xml")).Where(f => File.Exists(f)).ToArray();
var pdb = indirectDependencies.Select(f => Path.Combine(Path.GetDirectoryName(f), Path.GetFileNameWithoutExtension(f) + ".pdb")).Where(f => File.Exists(f)).ToArray();
indirectDependencies.AddRange(xml);
indirectDependencies.AddRange(pdb);
Log.LogMessage("Located indirect references:\n{0}", String.Join(Environment.NewLine, indirectDependencies));

// Finally, assign the output parameter.
IndirectDependencies = indirectDependencies.Select(i => new TaskItem(i)).ToArray();
      ]]>
      </Code>
    </Task>
  </UsingTask>
</Project>

Boom! Yeah, that’s a lot of code. And I could probably tighten it up, but I’m only using it once, in one place, and it runs one time during the build. Ain’t broke, don’t fix it, right?

Hope that helps someone out there.

I’m messing around with Boxstarter and Chocolatey and one of the things I wanted to do was install the various “Command Prompt Here” context menu extensions I use all the time. These extensions are .inf files and, unfortunately, there isn’t really any documentation on how to create a Chocolatey package that installs an .inf.

So here’s how you do it:

First, package the .inf file in the tools folder of your package, alongside the chocolateyInstall.ps1 script. .inf files are pretty small anyway, and you want the file to be around for uninstall, so it’s best to just include it.
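
The package layout ends up looking something like this (with the same placeholder names used in the scripts below, including the uninstall script covered in a minute):

YourPackageNameHere
  YourPackageNameHere.nuspec
  tools
    chocolateyInstall.ps1
    chocolateyUninstall.ps1
    YourInfFileNameHere.inf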

Next, set your chocolateyInstall.ps1 to run InfDefaultInstall.exe. That’s an easier way to install .inf files than the rundll32.exe way and it’ll work with Vista and later. So… no XP support. Aw, shucks.

Here’s a sample chocolateyInstall.ps1:

$packageName = 'YourPackageNameHere'
$validExitCodes = @(0)

try {
  $scriptPath = split-path -parent $MyInvocation.MyCommand.Definition
  $target = Join-Path $scriptPath "YourInfFileNameHere.inf"
  $infdefaultinstall = Join-Path (Join-Path $Env:SystemRoot "System32") "InfDefaultInstall.exe"
  Start-ChocolateyProcessAsAdmin "$target" "$infdefaultinstall" -validExitCodes $validExitCodes
  Write-ChocolateySuccess "$packageName"
} catch {
  Write-ChocolateyFailure "$packageName" "$($_.Exception.Message)"
  throw
}

To support uninstall, add a chocolateyUninstall.ps1 script. This will have to use rundll32.exe to uninstall, but it’s not too bad.

$packageName = 'YourPackageNameHere'
$validExitCodes = @(0)

try {
  $scriptPath = split-path -parent $MyInvocation.MyCommand.Definition
  $target = Join-Path $scriptPath "YourInfFileNameHere.inf"
  Start-ChocolateyProcessAsAdmin "SETUPAPI.DLL,InstallHinfSection DefaultUninstall 132 $target" "rundll32.exe" -validExitCodes $validExitCodes
  Write-ChocolateySuccess "$packageName"
} catch {
  Write-ChocolateyFailure "$packageName" "$($_.Exception.Message)"
  throw
}

That’s it! Run the packaging and you’re set to go. This will support both installation and uninstallation of the .inf file.

Note: At one point I was having some trouble getting this to run on a Windows Server 2012 VM using the one-click Boxstarter execution mechanism. I found this while testing an install script that installs something like 40 things. After rolling back the VM to a base snapshot (before running the script) I’m no longer able to see the failure I saw before, so I’m guessing it was something else in the script causing the problem. This INF install mechanism appears to work just fine.

net

I was testing out some changes to versioning in Autofac. We have a MyGet feed, but the internal dependencies of the various NuGet packages point to CI versions when they’re built, so it’s sort of hard to stage a test of what things will look like when they’re released. You have to rename each .nupkg file to remove the “-CI-XYZ” build number, open each package, change the internal .nuspec file to remove the “-CI-XYZ” build number info there too, and then re-zip everything. In testing, I had to do this a few times, so I scripted it.

I put everything in a folder structure like this:

  • ~/TestFeed
    • backup – contains all of the original .nupkg files (renamed without the “-CI-XYZ”)
    • msbuildcommunitytasks – contains the MSBuild Community Tasks set

Then I wrote up a quick MSBuild script for doing all the extract/update/rezip stuff. I could have used any other scripting language, but, eh, the batching and file scanning in MSBuild made a few things easy.

msbuild fixrefs.proj /t:Undo puts the original packages (from the backup folder) into the test feed folder.

msbuild fixrefs.proj does the unzip/fix/re-zip.

One of the challenges I ran into was that the Zip task in MSBuild Community Tasks seemed to always want to add an extra level of folders into the .nupkg – I couldn’t get the original contents to live right at the root of the package. Rather than fight it, I used 7-Zip to do the re-zipping. I probably could have gotten away from MSBuild Community Tasks entirely if I’d had some form of sed on my machine, since the FileUpdate task was the only other thing I needed. But… Windows. And, you know, path of least resistance. I think this was a five-minute thing. It took longer to write this blog entry than it did to script this.

Here’s “fixrefs.proj”:

<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="All" xmlns="http://schemas.microsoft.com/developer/msbuild/2003" ToolsVersion="4.0">
  <PropertyGroup>
    <MSBuildCommunityTasksPath>.</MSBuildCommunityTasksPath>
    <SevenZip>C:\Program Files\7-Zip\7z.exe</SevenZip>
  </PropertyGroup>
  <Import Project="$(MSBuildProjectDirectory)\msbuildcommunitytasks\MSBuild.Community.Tasks.Targets"/>
  <ItemGroup>
    <Package Include="*.nupkg"/>
  </ItemGroup>
  <Target Name="All">
    <MakeDir Directories="%(Package.Filename)" />
    <Unzip ZipFileName="%(Package.FullPath)" TargetDirectory="%(Package.Filename)" />
    <ItemGroup>
      <NuSpec Include="**/*.nuspec" />
    </ItemGroup>
    <FileUpdate Files="@(NuSpec)" Regex="(.)\-CI\-\d+" ReplacementText="$1" WarnOnNoUpdate="true" />
    <Delete Files="@(Package)" />
    <CallTarget Targets="ZipNewPackage" />
    <RemoveDir Directories="%(Package.Filename)" />
  </Target>
  <Target Name="Undo">
    <Delete Files="@(Package)" />
    <ItemGroup>
      <Original Include="backup/*.nupkg" />
    </ItemGroup>
    <Copy SourceFiles="@(Original)" DestinationFolder="$(MSBuildProjectDirectory)" />
  </Target>
  <Target Name="ZipNewPackage" Inputs="@(Package)" Outputs="%(Identity).Dummy">
    <Exec
      Command="&quot;$(SevenZip)&quot; a -tzip &quot;$(MSBuildProjectDirectory)\%(Package.Filename)%(Package.Extension)&quot;"
      WorkingDirectory="$(MSBuildProjectDirectory)\%(Package.Filename)" />
  </Target>
</Project>