The session on AJAX patterns was very cool. In one demo application (a photo album), six specific patterns were addressed, along with a little on how to implement each one.

Pattern - Script to Enable Interactivity

Sort of a no-brainer, but using script to enable interactive elements is the basis of a rich application. In this particular pattern, it was more about making it easy to script what you’re looking to do. ASP.NET AJAX offers a lot of shortcuts to help you do that scripting.

This pattern also addressed the notion of separating behavior from markup. ASP.NET AJAX introduces “extender controls” that allow you to use server controls to modify the behavior of other controls in the page. An example was shown where some existing markup got modified by adding an extender - a server control that registers script to modify HTML on the client side. It’s a great way to get the separation.
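To make the client-side half of that concrete, here’s my own sketch (not the demo’s code) of what an extender ultimately emits: a $create call from the Microsoft AJAX Library that attaches a client behavior to an element. Demo.HighlightBehavior is a hypothetical behavior name.

    // Script a (hypothetical) extender might register on the client.
    // $create instantiates a client behavior and binds it to a DOM
    // element once the script framework has initialized.
    Sys.Application.add_init(function () {
        $create(
            Demo.HighlightBehavior,   // hypothetical client behavior class
            { color: "yellow" },      // properties to set on the behavior
            null,                     // events to wire up
            null,                     // references to other components
            $get("photoCaption")      // the DOM element being extended
        );
    });

The markup in the page stays clean; all the behavior lives in the registered script.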

Pattern - Logical Navigation

AJAX applications have typically lost the ability to use the back/forward buttons and the ability to bookmark a page. ASP.NET Futures contains a “History” control that allows you to enable your AJAX elements to support state, sort of like ViewState, but on the URL. Modifying the page contents modifies the browser URL and, thus, enables logical navigation and bookmarking. As long as your scripts store enough history state to be able to recreate a logical view, this looks like a great way to overcome some shortcomings in AJAX.
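The underlying trick - my sketch of the general technique, not the Futures History control itself - is to mirror your client state into the URL fragment and watch for changes:

    // Store a logical "view" in the URL fragment so back/forward and
    // bookmarks work. Poll for changes, since browsers of this era
    // don't fire a hashchange event.
    function saveHistoryPoint(state) {
        window.location.hash = encodeURIComponent(state);
    }

    var lastHash = window.location.hash;
    setInterval(function () {
        if (window.location.hash !== lastHash) {
            lastHash = window.location.hash;
            restoreView(decodeURIComponent(lastHash.substring(1)));
        }
    }, 100);

    function restoreView(state) {
        // Hypothetical: re-run whatever AJAX call renders this view.
    }

The History control wraps this kind of plumbing up in a server control so you don’t have to hand-roll it.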

Pattern - Update Indicators

Notifying a user of what changed when an AJAX request finishes is helpful so they can see the results of an action. The UpdateAnimation control in ASP.NET AJAX is one way to do that - it performs AJAX updates in an animated fashion, so the movement itself cues the user to what changed. There is a prototype UpdateIndicator control that scrolls the page to the location of the change and does a highlight animation on it; that control isn’t in ASP.NET AJAX now but will hopefully be in the future.
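The standalone version of this idea is the classic “yellow fade” - here’s a quick hand-rolled sketch of my own, no ASP.NET AJAX controls involved:

    // Flash an element yellow and fade back to normal so the user's
    // eye is drawn to the content that just changed.
    function highlight(element) {
        var step = 0, steps = 20;
        var timer = setInterval(function () {
            // Walk the blue channel from 0x99 (yellow) up to 0xFF (white).
            var blue = Math.round(0x99 + (0xFF - 0x99) * (step / steps));
            element.style.backgroundColor = "rgb(255, 255, " + blue + ")";
            if (++step > steps) {
                clearInterval(timer);
                element.style.backgroundColor = "";
            }
        }, 50);
    }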

Pattern - Smart Data Access

Possibly a poorly-named pattern, but the idea is that you should use HTML properly such that external services like search engine crawlers or programmatic site map generators can correctly access/index the content you post. Use tags in the correct semantic sense (e.g., if it’s not a header, don’t put it in <h1 /> tags). Also, keep in mind the way you display pages in non-scripted environments, such as in a search engine crawler or when the user has script disabled. Your content should look good either way.
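A trivial example of the difference (mine, not from the session):

    <!-- Good: a real heading and a real list. A crawler - or a user
         with script off - still gets meaningful content. -->
    <h1>Vacation Photos</h1>
    <ul>
        <li><a href="photos/beach.html">Beach at sunset</a></li>
        <li><a href="photos/hike.html">Hiking the ridge</a></li>
    </ul>

    <!-- Bad: structure faked with styling and script. With script
         disabled, these photos are unreachable. -->
    <div class="big-bold-title">Vacation Photos</div>
    <div onclick="showPhoto(1)">Beach at sunset</div>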

Pattern - Mashups (Using External Services)

There’s a lot of data out there, and a lot of services providing added value. Make use of them where you can. The example shown was a call to Flickr to get images and data.

What was interesting about the discussion of this pattern was less the “what” than the “how.” Browsers block cross-domain requests from script (the same-origin policy), so you have one of two options for getting third-party data into your application.

You can use a server-side proxy where you create a proxy on your site that requests the third-party data. Your application then talks to your proxy to get the data. This is a good general-purpose solution and allows you to take advantage of things like caching calls on your site and gives you the ability to manipulate the data before passing it to the client (possibly optimizing it). The downside is that it does use up your server’s bandwidth.
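From the client’s perspective, the proxy makes the remote data look local. A rough sketch, where /flickrProxy.ashx is a hypothetical proxy endpoint on your own site:

    // Same-origin request to our own proxy; the proxy fetches the
    // third-party data server-side and relays (and maybe caches) it.
    function getPhotos(tags, callback) {
        var xhr = new XMLHttpRequest();
        xhr.open("GET", "/flickrProxy.ashx?tags=" + encodeURIComponent(tags), true);
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                callback(eval("(" + xhr.responseText + ")")); // parse JSON, 2007-style
            }
        };
        xhr.send(null);
    }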

The other option is JSONP: you add a script reference to your page that requests data in JSON format from a third-party service, and when the data comes back it gets passed to a callback you specify. ASP.NET AJAX supports this by allowing you to specify your own executor in an AJAX call, so the result of the call gets passed to your callback.
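Stripped of any framework, the raw JSONP mechanics look like this (hypothetical service URL):

    // Inject a script tag pointing at the third-party service. The
    // service wraps its JSON response in a call to the callback we
    // named, which is how the data "returns" across domains.
    function handlePhotos(data) {
        // data is the third-party JSON payload.
    }

    var script = document.createElement("script");
    script.src = "http://example.com/photos?format=json&callback=handlePhotos";
    document.getElementsByTagName("head")[0].appendChild(script);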

More Resources

ASP.NET AJAX
AJAX Patterns
Yahoo! Design Pattern Library

The conference technically starts tomorrow, but I’m in town a day early to get settled so I can be there, bright-eyed and bushy-tailed. Or at least bright-eyed.

There was a mashup session that ran from 4:00p to 8:00p alongside registration, but by the time I got here, got registered, got to the room, and got something to eat… well, I also got a little tired and didn’t really feel like throwing down with the mad technical skillz. Instead, I thought it would be prudent to take it easy - it’s been a long day, and I do want to be ready to pay attention and learn in some of the great sessions planned.

My schedule looks like this:

Monday, April 30
9:30a - General Session
1:30p - Building Rich Web Experiences using Silverlight and JavaScript for Developers
3:00p - Using Visual Studio “Orcas” to Design and Develop Rich AJAX Enabled Web Sites
4:30p - AJAX Patterns with ASP.NET

Tuesday, May 1
8:30a - Front-Ending the Web with Microsoft Office
10:15a - Designing with AJAX: Yahoo! Pattern Library
11:45a - Developing ASP.NET AJAX Controls with Silverlight
2:15p - Go Deep with AJAX
4:00p - General Session

Wednesday, May 2
8:30a - How to Make AJAX Applications Scream on the Client
10:00a - Windows Presentation Foundation for Developers (Part 1 of 2)
11:30a - Windows Presentation Foundation for Developers (Part 2 of 2)

Interestingly, this isn’t the schedule as I originally planned it on Friday. Even up to the last minute the times, places, and topics are changing. I don’t know if this is the set of classes I’ll actually be in or not, but we’ll see.

Getting into the spirit of things, I’ve joined Facebook and Twitter since those seem to be ways folks are supposed to coordinate things. I’m not super taken with either one, but then, I’m not a big “social networker.” I’ll withhold judgment for now.

gaming, xbox

So about eight months ago I had to send my Xbox 360 in for repair and they sent me back a refurbished console. Due to the crazy, crappy DRM scheme they have on the content you get from Xbox Live Marketplace (which includes Xbox Live Arcade games), that meant I had to jump through a bunch of hoops to get the games on my system to work correctly again.

Well, I just got my Xbox 360 back from my recent bout with the Red Ring of Death and guess what - they sent me another refurb.

Which, of course, means I get to go through the hoops a second time. That’s right - I get to create a second dummy Xbox Live Silver membership (because I can’t use the dummy account I created last time around), have them refund me points to that account, and then use that account to re-purchase everything. Again.

Net result is that I spent like an hour last night taking inventory of all of the Xbox Live Arcade games we’ve purchased, figuring out which account we originally bought them with, and determining the price for each game as listed in the Xbox Live Marketplace.

I then called Xbox Live Support, and after I explained the situation, the representative mentioned that I should just be able to go in with the account I purchased the games with, hit Xbox Live Arcade, and select the “re-download” option (without deleting the game from the hard drive first) and it should authorize the new console.

That doesn’t work.

The call got escalated to the supervisor, who spent time going through my account and my wife’s account and calculating up all of the things we’ve purchased. Problem there is that their history only goes back one year so they don’t actually have a visible record of what you purchased beyond that… so they argue with you when you tell them, say, that you bought one of the Xbox Live Gold packages at a retail outlet over a year ago (because you’ve renewed since then) and it came with a copy of Bankshot Billiards 2, and yes, you’d like to have that re-authorized on the console as well.

After all of that, they still came up with a different number of points that they owe me than I did. You know why? Because they use the number of points you originally spent on the game as a guide, not today’s prices. And prices have gone up, so now the game you paid 400 points for six months ago costs 800 points if you want to buy it today but they only want to give you the 400 points you originally paid. Obviously, that causes a little contention on the phone, but the best the supervisor can do is put a note in there that mentions your concern because…

…there’s a guy named Eric whose job it is, apparently, to call all of the people that this happens to and hash out the whole “Points After Repair” thing (yes, they have an actual name for it, which sort of tells you something). I get to argue with Eric about the difference in what they think they owe me and what they actually owe me, and that discussion will happen in “approximately five business days.”

And there it sits. A couple of hours of work and phone later and I’m hanging on for Eric to call me and give me points so I can re-purchase and re-download the games I already own so my console works like it should again. Awesome.

subtext, blog, xml

I’ve been looking for a while to migrate off this infernal pMachine blog engine I’m on. The major problem is how to migrate my data to the new platform. Enter BlogML.

BlogML is an XML format for the contents of a blog. You can read about it and download it on the CodePlex BlogML site. They’re currently at version 2.0, which implies there was a 1.0 somewhere along the line that I missed.

Anyway, the general idea is that you can export blog contents in BlogML from one blog engine and import into another blog engine, effectively migrating your content. Thus began my journey down the BlogML road.
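To give a flavor of the format, here’s a minimal document sketched from memory - roughly right in shape, but check the XSD that ships with the download for the real element and attribute set:

    <?xml version="1.0" encoding="utf-8"?>
    <blog root-url="http://www.example.com/blog/"
          date-created="2007-04-28T12:00:00"
          xmlns="http://www.blogml.com/2006/09/BlogML">
      <title type="text">My Blog</title>
      <authors>
        <author id="1" email="me@example.com" approved="true">
          <title type="text">Me</title>
        </author>
      </authors>
      <posts>
        <post id="42" post-url="/blog/hello.html" type="normal" approved="true">
          <title type="text">Hello</title>
          <content type="text">First post!</content>
        </post>
      </posts>
    </blog>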

If you download BlogML from the site it comes with an XSD schema for BlogML, a sample BlogML export file, a .NET API, and a schema validator.

I didn’t use the .NET API because pMachine is in PHP and all of the routines for extracting data are already in PHP, so I wrote my pMachine BlogML exporter in - wait for it - PHP. As such, I can’t really lend any commentary to the quality of the API’s functionality. That said, a quick perusal of the source shows that there are almost no comments and the rest looks a lot like generated XmlSerializable style code.

The schema validator is a pretty basic .NET application that can validate any XML against any schema - you select the schema and the XML files manually and it just runs validation. This actually makes it troublesome to use; you’d think the schema would be embedded by default. If you have some other schema validation tool, feel free to ignore the one that comes with BlogML.

The real meat of BlogML is the schema. That’s where the value of BlogML is - in defining the standard format for the blog contents.

The overall format of things seems to have been thought out pretty well. The schema accounts for posts, comments and trackbacks on each post, categories, attachments, and authors. I was pretty easily able to map the blog contents of pMachine into the requisite structure for BlogML.

There are three downsides with the schema:

First, the schema could really stand to be cleaned up. This may not be obvious if you’re editing the thing in a straight text editor, but when you throw it into something like XMLSpy, you can see the issues. Things could be made simpler by better use of common base types that get extended. There are odd things like an empty, hanging element sequence in one of the types. Generally speaking, a good tidy-up might make it a lot easier to use, because…

Second, the documentation is super duper light. I think there are like 10 lines of documentation in the schema, tops, and there’s nothing outside the schema that explains it, either. Without going back and forth between the schema and the sample document, I’d have no idea what exactly was supposed to be where, what the format of things needed to be, etc.

Third, and admittedly this may be more pMachine-specific, there’s no notion of distinguishing between a “trackback” and a “pingback.” There’s only a “trackback” entity in the schema, so if your blog supports the notion of a “pingback,” you will lose the differentiation when you export.

Anyway, I planned on importing my blog into Subtext, so I set up a test site on my development machine, ran the export on my pMachine blog (through a utility I wrote; I’m going to do some fine-tuning and release it for all you stranded pMachine users) and did the import. This is where I started noticing the real shortcomings in BlogML proper. These fall into two categories:

Shortcoming 1: Links. If you’ve had a blog for any length of time, you’ve got posts that link to other posts. That works great if your link format doesn’t change. If I’m moving from pMachine to Subtext, though, I don’t want to have to keep my old PHP blog around (hence “moving”), and, if possible, I’d like to have any intra-site links get updated. There doesn’t seem to be any notion in BlogML of pre-defining a “new link mapping” (like being able to say “for this post here, its new link will be here”) so import engines can convert content on the fly. There’s also no notion of a response from an import engine to say “here’s the old post ID, here’s the new one” so you can write your own redirection setup (which you will have to do, regardless of whether you update the links inside the posts).

I think there needs to be a little more with respect to link and post ID handling. BlogML might be great for defining the contents of a blog from an exported standpoint, but it doesn’t really help from an imported standpoint. Maybe offering a second schema for old-ID-to-new-ID mapping (or even old-ID-to-new-post-URL) that blog import engines could return when they finish importing… something to address the mapping issue. As it stands, I’m going to be doing some manual calculation and post-import work.
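Even something as simple as this - entirely hypothetical; no such document exists in BlogML today - returned by the import engine would cover it:

    <!-- Hypothetical post-import mapping; not part of the BlogML spec. -->
    <post-mappings>
      <post old-id="42"
            old-url="/pmachine/index.php?id=42"
            new-id="101"
            new-url="/blog/archive/2007/04/28/hello.aspx" />
    </post-mappings>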

Shortcoming 2: Non-Text Content. If you’ve got images or downloads or other non-text content on your blog posts, it’s most likely stored in some proprietary folder hierarchy for the blog engine you’re on… and if you’re moving, you won’t have that hierarchy anymore, will you? That means you’ve got to move not only the text content but also the rest of the content into the new blog engine.

There is a notion of attachments in BlogML, but it’s not clear that solves the issue. You can apparently even embed “attachments” for each entry as a base64 encoded entity right in the BlogML. It’s unclear, however, how this attachment relates back to the entry and, further, unclear how the BlogML import will handle it. This could probably be remedied with some documentation, but like I said, there really isn’t any.

This sort of leaves you with one of two options: You can leave the non-text content where it is and leave the proprietary folder structure in place… or you can move the non-text content and process all of the links in all of your posts to point to the new location. One way is less work but also less clean; the other is cleaner but a lot of work. Lose-lose.

Anyway, the end result of my working with BlogML: I like the idea and I’ll be using it as a part of a fairly complex multi-step process to migrate off pMachine. That said, I think it’s got a long way to go for widespread use.

personal, gaming, xbox

Saturday was a hell of a day.

We got up at around 7:00a and got out of the house basically ASAP so we could get to Jenn’s parents’ house in time to get in their motorhome and head down to Eugene for Jenn’s grandma’s 79th birthday party. That’s like an hour and a half drive, which isn’t as bad when you’re riding in the Adamson Bus, but it’s still a long trip.

We got there and Jenn’s grandma was very pleased to see us. It was a surprise party and the entire family was there.

Now, when I say “the entire family,” I mean like 40 or 50 people. The rockin’ part was that it was raining and we had planned to have the party outside, but instead we had it indoors in a space that was, oh, maybe 500 square feet. You can imagine the chaos - not enough chairs (or enough room for chairs) ended up meaning people sitting on the floor, sitting on laps, standing in the hallway… and it got hot.

So that lasted for about four hours. And the thing is, I like Jenn’s family and all, but I don’t really know anyone, and every time we get together it’s this whirlwind of faces that only look mostly familiar, which just leaves me confused and claustrophobic. I don’t really have anything to talk about with them because none of them are tech people and I really don’t follow sports or family gossip. So it’s nice to see them, but I won’t lie, it’s not super duper fun.

After that, we hopped back on the bus and headed home. We got back somewhere around 6:30p; on the way home we had planned on stopping at my parents’ house because it was my dad’s birthday, too. We got there sometime around quarter to seven, but they weren’t home, so I planned on calling him later in the evening. I actually feel bad I didn’t get in touch with him earlier, but he’s going to have to throw me a bone on this one because I was otherwise occupied.

Sunday was errand day so we ran around and did the shopping and so on. Groceries ran far more than I anticipated because we ended up picking up a lot of high-ticket items (cleaning supplies and so on) that we had been putting off. Not great on the pocketbook, but had to be done.

We also picked up one of those automatic cat boxes. We like to occasionally go away for the weekend and the new cat generates a looooot of poop, so we want to make sure the box is always clean and doesn’t need to be dealt with for a couple of days at least. We went with the Littermaid Elite after looking at a lot of these things because not only did it seem to be the most popular model, but it also didn’t lock us into proprietary refills or materials beyond the little litter receptacles (which run about $0.30 each and last around five days - so maybe $2/month, which is better than the ongoing costs of some of the other boxes). The only thing we were afraid of was whether the cats would use it.

As I was putting it together, I got to about step seven of ten and had to put the clean litter in the box. I poured it in, turned around, put the litter box down, picked up the instruction manual and turned around to do step eight… but the cat was already in the box taking a fresh crap - even before I got the box put together - so I’m no longer afraid they won’t use it. I couldn’t even finish putting it together before it was used.

Jury’s still out on whether I like it or not. It works great on the clumped-up pee balls, but if the cat poop is… well, soft… it sort of attaches to the rake that cleans the box. I cleaned it off the rake manually the first time, but I left it when I saw it again this morning. I’m going to see if the situation somehow rectifies itself.

I also checked on my sick Xbox, which is on its way home from Xbox Hospital. They are sending me another refurbished machine - it has a different serial number than the one I sent in - so I’m going to end up going through the Xbox DRM problems again. Support actually has a name for this process now - “Points After Repair.” I called them and said I noticed that the serial number was different and that I was disappointed I’d have to go through this again, and they were all, “Well, just set up [yet another] Xbox Silver account before you call, then when you call in give us your repair number and ask about ‘Points After Repair.’ We’ll hook you up.” Ridiculous. Because that didn’t cause all manner of pain in my ass last time.