gaming, xbox

They finally released a tool on xbox.com that allows you to transfer your downloadable content licenses from one console to another. This is huge. I used to have to wait months (yes, plural) to get this to happen after repair, and people who bought new consoles (upgrade to Elite?) were out-and-out hosed.

That said, there are some important points covered on the Gamerscore Blog, and there’s an FAQ about the tool. The two big points for me:

  • When you get your console repaired by Microsoft, they’ll do this for you so you won’t need to do it yourself. That’s important because…
  • You can only do this one time every 365 days.

Apparently this will work with all content but movies because movies aren’t “download-to-own.” I don’t rent movies over Xbox Live, so I’m not too concerned here.

The last time I got my console repaired, which was about a month ago, they did not automatically transfer the licenses even though they said they would. I called Xbox Live Support and it took about a week for them to do the transfer (as opposed to the months it previously took). I’m gathering that if I get another repair and need the licenses transferred, this won’t be the avenue I’m supposed to take - I’ll still need to call them. This is more for folks getting new consoles or getting repairs from non-Microsoft facilities. That said, now that we know there’s a tool, maybe they can do the transfer right there while I’m on the phone rather than waiting a week.

GeekSpeak

I finally got around to putting a new picture as the home screen image on my new BlackBerry Curve. The problem: every time I synchronized with the BlackBerry Desktop Manager software, my home screen image would get reset to the default for the theme. I’d set it again, sync up, and watch my wallpaper disappear. The image file was still there; it just wasn’t my wallpaper anymore.

The answer: store any picture you want as your home screen image in the device memory, not on a media card. If you store the image on the media card, it’ll get reset when you sync. If you store it in device memory, it stays persistent the way you’d expect.

gists, dotnet, build

When you’re integrating FxCop into your build using MSBuild, you have three basic options.

  1. Build a static project file that contains all of the FxCop settings for your solution and run the command line client against that.
  2. Skip the project file entirely and run the command-line client alone, specifying all of the options on the command line.
  3. Build a partial project file that contains things that don’t change regularly (like the list of rules you want to run) and tell the FxCop command line client to use that as well as some dynamic properties you specify on the command line.

Usually you end up with option 3 - a partial project file plus dynamic properties on the command line. The problem is that command lines have a maximum length, and once you get into Very Large Projects there may be dozens of folders FxCop needs to search for assembly dependencies, plus several places you need to point it to so it can find the assemblies you want to analyze. Those are the things you normally specify on the command line… and in that Very Large Project scenario, you hit the command-line length limit pretty quickly.
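To make the problem concrete, here’s roughly what an option-3 invocation looks like when you shell out to FxCopCmd from MSBuild. This is just an illustrative sketch - $(FxCopPath), $(BuildOutputDirectory), and the specific vendor folder names are placeholder properties I made up for the example, not anything FxCop defines. Note how every dependency folder becomes another /directory switch:

<!--
     Illustrative only: option 3 with the dynamic bits on the command line.
     Each dependency folder adds another /directory switch, which is what
     eventually blows the command length limit.
  -->
<Exec
  Command="&quot;$(FxCopPath)\FxCopCmd.exe&quot; /project:&quot;Build\FxCop Project.FxCop&quot; /file:&quot;$(BuildOutputDirectory)&quot; /directory:&quot;$(DependencyFolder1)\Vendor1&quot; /directory:&quot;$(DependencyFolder1)\Vendor2&quot; /directory:&quot;$(DependencyFolder2)\Vendor3&quot; /out:&quot;$(TempDirectory)\FxCopReport.xml&quot;" />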

(You might think wrapping the FxCop process in a custom MSBuild task would fix it, but that only helps if the task launches the process directly rather than building up that same giant command line and shelling out to it.)

The solution? Dynamically generate an FxCop project file with the list of your dependency folders in it. Then you don’t have to worry about any of that showing up on the command line.

  • Create an empty FxCop project. Set up the rules you want to run, etc., but don’t put any target assemblies in there. Save that project and exit FxCop.
  • Open the FxCop project in your favorite text editor and find the /FxCopProject/Targets node. It will be empty.
  • Open that Targets node and add a block so it looks like this:
<Targets>
  <AssemblyReferenceDirectories>
  <!-- @DIRECTORIES@ -->
  </AssemblyReferenceDirectories>
</Targets>
  • Save the FxCop project. We’re going to use that comment to do a replacement in the file and poke in the list of dependency folders.
  • Open your MSBuild script and import the MSBuild Community Tasks - you’ll need them for the FileUpdate task. (There’s an import snippet just after this list.)
  • Just before you run FxCop, add a section to your build script that creates an item list of all of your potential dependencies. For my build, all of my third-party dependencies are in one directory tree. After that, create a copy of your FxCop project and use the FileUpdate task to poke in the list of directories. (I’ll show the code below.)
  • Run FxCop using the new temporary copy of the FxCop project file that has your dynamic list of assembly reference locations. Done - no more worrying about command line length limits, and as long as you keep your dependencies in one folder tree, you can add or remove them without ever touching the FxCop project file.
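The import itself is one line. The path below is the default install location for the MSBuild Community Tasks targets file; if you keep a copy checked into your source tree instead, point the Import there.

<!--
     Pull in the MSBuild Community Tasks (needed for FileUpdate). This is
     the default install path; adjust it if the .targets file lives
     somewhere else in your environment.
  -->
<Import Project="$(MSBuildExtensionsPath)\MSBuildCommunityTasks\MSBuild.Community.Tasks.Targets" />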

Here’s that snippet of MSBuild code to dynamically enumerate the files and poke them into the FxCop project:

<!--
     Create a temporary copy of the FxCop project.
  -->
<Copy
  SourceFiles="Build\FxCop Project.FxCop"
  DestinationFolder="$(TempDirectory)"/>

<!--
     Create an item with all of the dependencies in it. Creating multiple
     items with the same name will actually append everything into one big
     item. Here we have two folder trees with dependencies and we're
     ignoring the Subversion files.
  -->
<CreateItem
  Include="$(DependencyFolder1)\**"
  Exclude="$(DependencyFolder1)\**\_svn\**">
  <Output TaskParameter="Include" ItemName="Dependencies"/>
</CreateItem>
<CreateItem
  Include="$(DependencyFolder2)\**"
  Exclude="$(DependencyFolder2)\**\_svn\**">
  <Output TaskParameter="Include" ItemName="Dependencies"/>
</CreateItem>

<!--
     Update the temporary copy of the FxCop project with the dynamic list
     of dependencies. Using MSBuild batching, we won't get duplicates. The
     replacement expression shows we're only interested in the directories
     for the dependencies, not the dependencies themselves.
  -->
<FileUpdate
  Files="$(TempDirectory)\FxCop Project.FxCop"
  Regex="&lt;!-- @DIRECTORIES@ --&gt;"
  ReplacementText="&lt;!-- @DIRECTORIES@ --&gt;&lt;Directory&gt;%(Dependencies.RootDir)%(Dependencies.Directory)&lt;/Directory&gt;"/>

<!--
     Now we can run FxCop against the temporary copy of the FxCop project
     and not worry about specifying dependency folders.
  -->
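
Here’s a sketch of that final step as an Exec task. Again, $(FxCopPath), $(BuildOutputDirectory), and the report file name are placeholders for whatever your build uses; the point is that the only things left on the command line are the project, the target assemblies, and the output report - no dependency folders.

<!--
     Run FxCop against the temporary project. FxCopCmd's /file switch
     accepts a directory, so pointing it at the build output folder picks
     up the assemblies to analyze.
  -->
<Exec
  Command="&quot;$(FxCopPath)\FxCopCmd.exe&quot; /project:&quot;$(TempDirectory)\FxCop Project.FxCop&quot; /file:&quot;$(BuildOutputDirectory)&quot; /out:&quot;$(TempDirectory)\FxCopReport.xml&quot;" />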

You can apply this pattern to any list of files or folders you need to dynamically poke into files like FxCop or NDepend project files. Even if you’re not worried about the command line length limit, by dynamically generating these things you make your build more robust and require less manipulation as your project changes. Good luck!

gaming, playstation

Playstation 3 80GB bundle with Metal Gear Solid 4

Jenn and I enjoy our movies, and I’m not (too) ashamed to say I jumped into the HD DVD camp, so when HD DVD died, we started saving up for a PS3. When the Metal Gear Solid 4 80GB bundle came out last week, we picked up a PS3 as a Blu-ray player.

There are a lot of things I didn’t know about the PS3 that I thought I’d share. You hear about the gaming and Blu-ray playing, but not some of the other things.

You’ll need a better video cable. Out of the box, you only get a composite video cable. It makes sense from a lowest common denominator standpoint, but you’ll want to see an HD picture, so get a better cable. You can either get a proprietary PS3 component video cable set or you can get a standard HDMI cable. (Get the cable ahead of time - Best Buy really gouges you on HDMI. Like, “10x price difference” style. I’m not afraid to admit I fell into this trap because I needed to get the thing hooked up and it’s too late for me… but not too late for you.)

Sound only comes from one output at a time. You get to choose - does it come through the optical out or through the HDMI cable? You don’t get both. (With Xbox, it comes through both simultaneously.)

You can’t use your universal remote with it. Everything on the PS3 is through Bluetooth including remote control support. Unless you don’t mind controlling the movie with a game controller, you’ll be buying a PS3 Bluetooth remote.

It may be quieter than the Xbox, but it runs just as hot. I thought I’d heard this thing ran cool and quiet. I was wrong - it’s just quiet… until it heats up, at which point it sounds like a jet engine. I put it in the entertainment center with plenty of space around it, but it turns out we have to leave the door open when it’s on because it heats up enough to kick the fan into serious overdrive.

The PSP integration is awesome. If you have a PSP, you can attach it to your PS3 and do what they call “Remote Play” - basically remote control the PS3. The audio and video come over the wireless network and render on the PSP. Some games let you remote play, too, so you can start a game on the PSP and pick up where you left off on PS3. I haven’t done that yet, but it sounds cool.

Games on the Playstation Network are priced in dollars. PSN is the Sony equivalent of Xbox Live. Except on Xbox Live you buy games with “points” and have to do conversion rates in your head. It’s in standard currency on PSN.

Audio/video setup is tricky. You have a lot of control over the settings used for outputting A/V signals - down to which stream types and bit rates come out of the optical audio feed. The auto-setup actually detected some things wrong, so I wasn’t getting any Dolby Digital output. I can see getting this right being a real challenge for a non-technical person.

It has a web browser. What’s cool about this is you can do a quick check of movie times or Google something from the couch. It’s not an awesome browser, but it suffices.

You can install a second OS on it. There’s a menu option to partition your hard drive and install a second OS. It reminds me of the short-lived PS2 Linux kit they had out. I’m not sure what OS choices are available right now, but it’s a forward-thinking feature.

GeekSpeak

The laptop I’m running at work has full disk encryption and a real-time virus scanner on it, and it feels dog slow all the time. It’s not a CPU or memory issue, either - the disk is constantly churning and I’m I/O bound.

I knew the encryption had an impact, but I never realized how much until I found these benchmarks. Looks like disk I/O takes about twice as long for me, not counting the real-time virus scanner overhead.

I’m all for security, but man, what I wouldn’t give to have just a separate data partition encrypted and leave the system partition unencrypted.