Impenetrable Thoughts
Shameless Plug

What else is a personal website for other than shamelessly promoting yourself and your personal projects? Other than that whole self-expression thing.

So I present, for the interest of those search engine indexers out there and my readership who have ample free time: Liber Omnia!

It mostly works, it has the important 80% of features, I've been using it for a couple of months and it doesn't suck.

Too Many Shortcuts

Back when computers were physically large, virtually small and GUIs were primitive, a major selling point of one piece of software over another was how many mouse clicks were required to get things done. The most lauded software had keyboard shortcuts for most of its functionality, so it took zero mouse clicks.

Keyboard shortcuts were thus burned into the minds of developers as good things. Unfortunately, there can be too much of a good thing. These days my major problem is not that there are too few keyboard shortcuts, but that there are too many. In basically every major piece of software I use on a daily basis, including my terminal editor, pretty much every key on the keyboard is a shortcut to something. Every program wins at the mouse click game these days. But they all lose at the typo game.

Several times a week I'll typo a shortcut. Perhaps this is because I forgot which terminal emulator or web browser I'm in. Perhaps I just wasn't paying attention to where my fingers were on the keyboard. Perhaps something else brushed the keyboard. Whatever the reason, I hit the wrong keys. Hopefully nothing too terrible happened, like closing that window or program out from under me. This is all very annoying.

So here it is. I like keyboard shortcuts. I really do; there are a few I use hundreds of times every day. Then there are the few I use once or twice a week. Then there are the rest, which I only use when I typo them or to undo a typo'd shortcut. While I like the ability to have the shortcuts I do use, I wish very few shortcuts were enabled by default. I don't use most of them and they only get in my way.

Like an Animal

This week I've travelled far to the sunny land of meetings. My favoured dinner on the first night of these trips, after I've travelled better than half the day and then put in a good half day at the office on top, is pizza. It's quick, it's easy, it's heavy to help me sleep and I don't have to leave my hotel room at all.

Pizza is also good because it is an unavoidable fact that I'll have some pizza left over. This is just the thing to keep around should I be really lazy one night or should the hotel breakfast suck. And this means stuffing a pizza box into those tiny refrigerators.

A small minority of pizza places provide their delivery pizza in an awesome box which breaks up and folds down to become half the size. Just perfect for storing my half pizza in the world's second tiniest fridge. But alas, few restaurants do this. Instead I usually just cut the box up to fit with my trusty pocket knife.

But I flew to the land of meetings, remember? Sometimes I'll risk this sharp extension of myself (much in the way my watch and shoes are) to the trials of checked luggage. I haven't lost it yet, but I fully expect there will be a day when I arrive without my luggage and the luggage is never to be seen again. But this time I did not risk it.

You know those flimsy nail files which come with toiletry bags? The ones they include because some Victorian author claimed it wasn't a true toiletry bag unless it contained the quartet of nail file, nail clipper, tweezers and precision safety scissors. The ones so thin, with the merest impression of a grating surface, as to barely be a file at all. That is what I resorted to cutting my pizza box with.

Some might say I have fallen to the level of a savage, but a savage would never be far from their blade. Savages know the value of a good knife. No, I have fallen even lower, here in the land of meetings, I have fallen to the level of the animals; I have resorted to mostly tearing things where a simple cut would be better and easier.

Never again shall I sink so low.

Text Editor Platforms

Lately there have been a few discussions about text editors sloshing around the Internet. Among this discussion was the concept of a cross-platform editor. In the discussion this was taken to mean Linux vs Windows vs MacOSX, but I actually think that when discussing text editors the platforms to note are not the traditional ones. I believe that the platforms of note are:

  • X11

  • Windows

  • OSX

  • Web

  • tty

  • iOS

  • Android

I don't divide the editors by the standard OSes, but instead by the windowing system. Looking at my use, I think this is a more useful definition of platform in this case. In the past I've used an editor on all of these platforms. Especially important is the distinction between the tty and the OS it's running on. I spend most of my editing hours editing on a remote machine, but I still want the same editor when I'm editing locally with a fully fledged graphical windowing system.

To me, the most important consideration for an editor isn't whether it runs on Linux and FreeBSD and OSX, but whether it is accessible on the window system I have to use at that moment.

The Wiki Knowledge Future

I think the future of the sum of human knowledge looks more like TVTropes than Wikipedia, and a lot less like the array of publishers and unavailable papers of today. However, I expect the underlying technology to be much different.

Wikipedia is certainly a successful experiment in crowd-sourced knowledge. It has millions of articles across dozens of languages. The articles are of varying quality, but the ones of general interest are often of good, if not great, quality. However, along with several other frequently discussed problems Wikipedia has, I don't see it as the direct ancestor of the future store of knowledge, for three reasons: one technical, one fundamental and one social.

The technical limitation is perhaps the easiest hurdle to overcome. Wikipedia, as it stands, is based upon the complex MediaWiki software. This software is difficult to install, difficult to administer, and sharing wiki pages or portions of pages between MediaWiki instances is difficult and time consuming. The sum of this makes it extremely difficult to run a fully fledged instance of Wikipedia which keeps up to date and serves a medium-sized population, say that of a single university. As has been shown time and time again throughout history, a single point of failure, in this case the WikiMedia foundation, will always, given enough time, fail. Then there are the various scaling costs. Surely you have seen the pictures of a downtrodden Jimmy Wales begging for donations to keep the expensive Wikipedia servers running. Wikipedia is, no doubt, expensive to run and as it becomes more popular it'll only get worse. Decentralization would make the total running costs greater, but spread them out over a much larger number of groups, many of which could run their own copy using otherwise idle resources. All this could be fixed, but as it stands Wikipedia is too centralized to scale to the level required to be a store of all human knowledge.

The fundamental issue with Wikipedia is also a relatively simple one to solve, but likely cannot be solved without starting over. From the outset Wikipedia chose to have separate language versions. There are some good reasons to do this, but also several good reasons this separation should be avoided. The most obvious reason to avoid such a separation is duplication of effort. If there is an English article on some topic and an article on the same topic in Japanese, then at least two people spent time writing more or less the same content, doing the same research. Not only is this wasteful, it is extremely unlikely that either of the two articles is a superset, knowledge-wise, of the other. Instead the English version will contain some information the Japanese version doesn't, and vice versa. Better would be combining the articles, using manual and automated translation to present the text in the language of choice for the reader. In this way there is only one article containing all the information. Manual translation help would often still be useful, but the canonical source would be available for those needing precision.

Finally, Wikipedia is not the future of knowledge storage for the simple social reason that its community has chosen to be merely an encyclopedia. At the outset, before it was known that the experiment of a crowd-sourced encyclopedia would work, that was a reasonably narrow goal to achieve. Since that time the WikiMedia organization has branched out and created wikis for many other uses, such as a dictionary, a quote database, a species listing and even a text library. These are all in line with a store of knowledge, but being separate they fail to be cohesive. Lack of cohesiveness isn't the worst problem, however. Intersite hyperlinks have limitations, but they are worlds better than the book references they replaced. The biggest issue is that Wikipedia and its cousins intentionally exclude original thought and original research. This limits all discourse to appeals to authority. If you can't find or manufacture a sufficiently strong authority to appeal to, then that knowledge is ignored and the article itself likely deleted. Within the scope of an encyclopedia this is likely acceptable, though it severely limits the scope and depth of what it can contain. There are many topics for which there is no formal research and the canonical knowledge of the respective community is insufficiently authoritative. Obviously this is unacceptable for a store of human knowledge, as opposed to a large index thereof.

Contrast this with TVTropes. While TVTropes has the same technical limitations as Wikipedia with respect to centralization, it has solutions for the other two problems. The fundamental language division of Wikipedia is solved in a traditional, if less elegant, manner by TVTropes: instead of supporting multiple languages, the entire site is in English. That's not great if you don't know English, but at least you never have to read multiple articles on the same topic in different languages to get all you can out of it. More interesting is that TVTropes is not an encyclopedia at all. Instead it's mostly original research. There's a lot of sausage making which goes into this research, but the end result is undeniably amazing. This is positive proof that, at least in technical areas, a wiki-based community can accomplish novel research and categorization.

The greatest weakness I see in the TVTropes model is the lack of direct storage of the original source material. I'm sure they would store the original texts and shows and games if they could, linking to specific examples wherever appropriate. This exclusion aside, I would posit that TVTropes is almost entirely self-contained. Any definition or example required to make distinctions clear is available, and often linked, within the site itself. This is quite different from the Wikipedia model, where you need to leave the site to consult a dictionary and the examples included on the site itself are often quite sparse.

Inclusiveness and mechanisms for handling original research are things TVTropes does much better than Wikipedia. These are things which are critical for any storehouse of human knowledge. It does no good to limit such a storehouse to references to old books which almost nobody can actually access for deep answers to subtle questions.

Bitrot Free Backups

Bitrot is a great issue affecting archives which most people learn about only after they have irretrievably lost pictures or papers to its effects. This is unfortunate because there is a simple, efficient way to avoid losing data to bitrot, but few seem to use it.

Bitrot plagues all media. Printed pictures fade, stone tablets erode and hard drives demagnetize. Bitrot is simply the partial degradation of bits of the media until those sections become unrecoverable. With low density paper or stone media this was much less serious: you might lose one letter in a word, but that doesn't make the entire text unreadable. Digital media is more dependent upon most bits being correct. Digital formats are usually compressed such that changing a single bit changes the meaning of all the bits which follow it. Thus one single error can completely corrupt an image. Most of the image is still there, but without expert knowledge of the format the image is unrecoverable.

Bitrot has two primary sources. The most common is accumulated errors in the media the data is stored upon. Burnable CD-Rs, for instance, degrade over time from exposure to light, heat, moisture and bacteria. All digital media has built-in error detection, but it is of a simple sort because it must be fast and generally applicable. Under normal use it is sufficient, yet it can become overwhelmed by accumulated errors in a long term storage situation. All digital media, whether hard drives or DVD-Rs or tapes, slowly degrades even when not in use. Given sufficient time, the built-in error correction facilities are unable to correct the accumulated errors. At that point an erroneous bit appears. It might be that the error correction can still detect the error though it can no longer correct it, but that doesn't need to be the case. Undetected errors are not uncommon.

As in the analog world, some media last longer than others. Stone outlasts paper, for example. With this in mind, there are digital media which outlast CD-Rs, but they tend to be expensive. On the costly end you can pay to have your data pressed into DVDs or masked into custom ROM. Such techniques will last hundreds of years with basic physical protections. They'll also cost more than your car for any significant amount of data. However, even with this stable media you will see bitrot, just at a lower rate. You can't escape bitrot, just delay it.

The second source of bitrot is copying errors. In order to copy a file from one computer to another, the bits might be copied dozens of times. Precisely like copying DNA, every copy presents a risk of errors. There are error correction mechanisms built in, but they are kept simple to be fast and can't catch everything. Avoiding this type of bitrot is as simple as verifying the new copy immediately after it is made. Simple, but it can double the time the copy takes.
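The verify-after-copy step can be sketched in a few lines of Python; the function names and use of SHA-256 here are my own illustration, not a prescription.

```python
import hashlib
import shutil


def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 so large files don't fill memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def copy_and_verify(src, dst):
    """Copy src to dst, then re-read both sides to catch copying errors."""
    shutil.copyfile(src, dst)
    if sha256_of(src) != sha256_of(dst):
        raise IOError("copy of %s is corrupt, retry the copy" % src)
```

The re-read is what doubles the copy time, but it is the only way to know the bits that landed on the destination media match the source.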

Since you can't escape bitrot, you must repair the damage after it occurs. There are three common solutions to this problem. The first is favoured by archival libraries. You store multiple redundant copies of every file you wish to archive on different media. These days this tends to be a large array of hard drives and an automated tape library. Every file has a copy stored on many hard drives and several tapes. On a regular and frequent basis you check all the versions of every file against the others, looking for changes. This can be done somewhat efficiently using a good checksum algorithm. This method is effective, but expensive. Not only are many media required, but you have to have the manpower and automation to continuously check for and fix bitrot. This is easy with only a handful of gigabytes of data, but quickly becomes difficult and ludicrously expensive as the amount of data grows to terabytes in size. The key with this method is to store many copies and check them all frequently enough that bitrot can't cause too much trouble. Both of these things are expensive.

The most common Internet recommendation for protecting against bitrot is a watered down version of the archival library solution. Instead of constantly verifying the integrity of the archived data across many copies, many people recommend a two pronged approach. First, all data is stored on multiple media, but where a library would have dozens of copies most people can only afford a handful. Instead of constantly checking and repairing errors, most recommend rotating the media out as it ages. For example, having three hard drives containing three copies bought over three years. Every year a new drive is bought and a new copy created. When a new copy is created, several of the older archives are checked to get a good copy of each file. The simplest schemes use voting to determine which copy is good; better schemes check a strong hash of the file contents taken when the file was new and known good. This is relatively robust, but the few copies and infrequent verification carry a significant risk of all copies bitrotting some small amount in different ways. In such a way the data can still become unrecoverable. This is especially true when the data has exceeded the capacity of a single medium, whether a hard drive, burnt disc or flash drive.

The second prong of the common Internet recommendation is to come to terms with the fact that the above method won't protect everything and to fall back to well understood paper archival methods. Basically, sort out the most important files and print them. At this point bitrot is much less of an issue and protecting paper for decades and centuries is well understood, but you can't save everything. Videos are impossible and many raw data files have no convenient printed format. There is also the issue of space. Hundreds of gigabytes of digital pictures can be held in one hand. Those same pictures printed would fill a large house to bursting.

The previous two methods do their best to detect bitrot and find a good copy to fix it. In essence they gamble that bitrot won't happen to all copies between verifications. The third method assumes that bitrot is unavoidable, but that it happens relatively slowly and independently on every medium. Instead of trying to avoid bitrot, it prepares to repair it after the fact. The specific implementation of this solution I use has two components. First, each copy of the data has a PAR2 recovery set created for it. At its heart this uses an algorithm similar to the error correction mechanisms on digital media; the difference is that the PAR2 recovery set covers all of the files together instead of a single 4KB chunk at a time. Suppose bitrot happens randomly across all the data, the recovery data is configured such that ten percent corrupted bits can be recovered, and a single unrecoverable block can make a file useless. Then it is far more likely that eleven percent of a single block becomes corrupt, and thus unrecoverable, than that eleven percent of the entire archive is corrupted. Using PAR2 over all the data together provides better protection than the media error correction, at the expense of more computing power.
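PAR2 itself uses Reed-Solomon codes, but the core idea of recovery data that spans a whole set of blocks can be illustrated with simple XOR parity, which can rebuild any single lost block. This is a toy model of the principle, not how PAR2 is actually implemented.

```python
def make_parity(blocks):
    """XOR equal-sized blocks together into one recovery block.  Real
    PAR2 uses Reed-Solomon codes, which generalize this to recovering
    many blocks, but the principle is the same."""
    parity = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            parity[i] ^= b
    return bytes(parity)


def recover_block(surviving_blocks, parity):
    """Rebuild the single missing block from the survivors plus parity."""
    return make_parity(list(surviving_blocks) + [parity])
```

Because the parity spans every block, it does not matter which block is lost; that is exactly the property that lets recovery data for the whole archive repair corruption wherever it lands.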

The second component of my strategy is having a small number of copies of the archive along with its PAR2 recovery data. This is needed for two reasons. The most important is to protect against total media failure. All the recovery data in the world won't help if the media is completely destroyed. Flash drives get lost, hard drives fail to spin up, tapes get turned into party streamers. Multiple copies are the only real defence against these events. I currently store five copies of my backups and it is unlikely that all five would be destroyed within the same short time frame.

This method makes the reasonable assumption that bitrot happens independently on different media; that is, which bits rot on one medium (such as a particular hard drive) has no influence on which bits will rot on another (another hard drive or a cloud storage service, for example). Trusting in this assumption, and given a relatively slow bitrot rate, even if every copy bitrots more than the ten percent which can be corrected using the PAR2 data, it is unlikely that the same files have bitrotted to the same extent across all the copies. Instead it is likely that, using multiple copies, it will be possible to aggregate a set of files and recovery data in which less than ten percent of the data is corrupt, and thus recover all the archived data.

I prefer this method because it is affordable, low maintenance and trustworthy. Though I currently store five copies of my backup, this system is workable with fewer. I can't really recommend fewer than three full copies, but I have had success with two. If you format your archive correctly then these two copies don't even have to be identical. If you do regular backups of your archival data, say every six months, and your backup is in some append-only format, then it's possible to combine the latest backup with the previous backup to get around corrupted data. You can use raw directories of files as an append-only archive format, but PAR2 has a limit on the number of files a single recovery set can cover, about thirty thousand. I personally use a log archive format where new versions of old files are appended later in the file. In this way the first N bytes of the latest archive file match the first N bytes of the previous archive version, and so on back through all the versions I store. You perform this aggregation by replacing overly corrupted files, or sections of files, with the same chunks from the other backup. Missing or out of date files will be treated as corrupt data, so you don't get exactly the same recovery power as from an identical copy, but with effort this can be enough to get you to the point where the recovery data is sufficient. First try replacing missing data in a copy (a copy!) of the latest backup with data from the older backup, but if that fails you should also try filling missing data in the older backup as well. I've had success using this method to recover a backup off CD-Rs and DVD-Rs where some discs were unreadable. Thus this system is relatively reliable even if you simply burn a single copy onto write-once media, such as DVD-Rs, on a regular basis and keep the last handful of copies. Of course the more copies, the lower the chance of data becoming unrecoverable.
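The aggregation step amounts to choosing, chunk by chunk, whichever copy still matches a hash recorded when the archive was written. The sketch below is my own illustration of that idea; the function names, the fixed 4KB chunk size and the zero-filled gaps are all assumptions, and in practice the leftover gaps are what the PAR2 recovery data would then repair.

```python
import hashlib


def merge_copies(copy_a, copy_b, chunk_hashes, chunk_size=4096):
    """Rebuild an archive from two bitrotted copies.  chunk_hashes holds
    the SHA-256 of each chunk from when the archive was known good; for
    every chunk, take whichever copy still matches."""
    merged = bytearray()
    for i, good_hash in enumerate(chunk_hashes):
        start = i * chunk_size
        for source in (copy_a, copy_b):
            chunk = source[start:start + chunk_size]
            if hashlib.sha256(chunk).hexdigest() == good_hash:
                merged.extend(chunk)
                break
        else:
            # Neither copy is clean here; leave a gap for the recovery
            # data (e.g. PAR2) to repair.
            merged.extend(b"\0" * chunk_size)
    return bytes(merged)
```

With an append-only archive format the older copy simply runs out early, and those trailing chunks fall through to the recovery data, which is why the two copies need not be identical.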

The level of maintenance overhead is quite low. There is no continuous verification of the data. There isn't even any verification of old copies when creating a new archive. Instead you just keep as many copies as is reasonable and might have to do some work to combine multiple copies when recovery is necessary. The ability to combine multiple copies of possibly different ages to recover all the data in most cases makes the system trustworthy. There are better systems for other use cases, but this is the best I've seen for the private use case: ample amounts of personal data reliably stored for the long term with a minimal amount of work or hardware cost.

Private Project Licensing

Licensing intellectual property used to be something the average person just didn't do. They had no reason to. Copying was too expensive, too low fidelity and generally not economically feasible. It was impossible to get a book you wrote out to a large number of people, so there was little reason to worry about publishing rights. The only organizations which could publish already knew what to do to protect themselves and played more or less above board. It's not like they would even know about your hobby paintings unless you notified them.

With the spread of computers like wildfire and the proliferation of software development skills and tools that changed. Now anybody can sit at home, produce useful software and release that software to the world at large. Similarly someone can write a book and put it online to be read. Licensing now matters for the common person.

This is not an article intended to help you pick a license. There are already a good number of sources which do a much more complete job than I would be willing to do. Instead this article is about the general groups of people I understand to exist when it comes to licensing. I'll try to explain some reasoning behind each group. This post won't help you pick a specific license, but it might give you something to ponder before you decide which group you fall into.

The simplest group to describe are those who don't put any license on their creations at all. Not putting any license onto software or writing is equivalent to saying the code is for private use only. You are free to do what you will in the privacy of your own home, but don't share your changes and don't even consider doing anything where you might make money. The author doesn't tend to care that much right now, but hasn't made any guarantees about not caring in the future. Often you'll see this on writing or small code projects which the author never believes will be all that valuable to anybody except for short term entertainment value.

A second group might be succinctly described as "Don't Sue Me". This group puts the minimum restrictions on their software, but does construct the license such that users of the code can't sue the original author with the expectation of winning. This is really middle ground between the first license-less group and the next group. Often authors will put this type of license on code which they think might be useful to somebody, but never as a significant part of a significant commercial product. They might otherwise have chosen not to put any license on at all, but they've read other licenses, noticed that they tend to explicitly mention the lack of a warranty, and consider that a prudent protection to have.

This third group comprises a large portion of the open source world. Though it's hard to objectively compare the impact of code under various licenses, I would not be surprised to hear that well over half of all the open source code in the world has licenses which fall into this group. I would call this the "Free-est license" group. The details vary between licenses of this class, but they tend to boil down to: don't sue me and don't claim you wrote this. And that's it. Just about any conceivable use of code and other creations licensed in this way is considered acceptable by the author.

There are varied reasons to choose licenses which fall into this group. Corporate funded development tends to prefer these licenses because they mean the code can be used however the corporation wishes, without restriction. If they want to add their own magic sauce and release the result as a product, they can. However, it also permits them to submit their changes back upstream to put the burden of maintenance upon the community. This often results in a better public code base and lower maintenance costs for the corporation.

Individuals might choose licenses in this group because all they care about is seeing their code used. They don't mind that a company can slurp their code in, put it into a multi-billion dollar a year program and never see any return. Often if the author started in the "Don't Sue Me" group they'll arrive in this group, at least for large projects, as they become aware of the limitations in usage the less comprehensive licenses cause.

Another very good reason to choose licenses of this sort is for reference implementations. Having the clear and unrestricted Free-est licenses on reference implementations improves compatibility and speed of uptake. If any program which would find the file format or algorithm useful can take your tested code and use it as desired, then many will, ensuring compatibility. If the reference implementation starts out as a widespread de facto standard, then future incompatibility is minimized since any competing implementations must interoperate with the reference.

The Free-est licenses have some very strong points to defend them as licensing choices. However, for some they are too free, since in many cases the code will enter a proprietary black hole, never to be seen again no matter the improvements.

The other huge portion of licensed software and other creative works falls under what I'll call the Share Fairly licenses. The Share Fairly licenses are more restrictive than the Free-est licenses; however, these restrictions are used to force anybody who modifies (or in some cases uses) the software to provide their changes to the public, such that the public can use or build upon them in turn. Because of this, such code must be carefully segregated from proprietary code. Interface with the code incorrectly and you risk causing the license of your proprietary code to become the Share Fairly license.

This group of licenses was originally created to provide freedoms which the Free-est licenses do not guarantee for the users of software. They do this by restricting the packagers and distributors of software. For some this is still the most important reason they choose these licenses for their works.

However, not all authors choose these licenses for such high minded concepts as freedom. Other authors might choose these licenses to restrict the one-way flow of work that can occur with the Free-est licenses. Large corporations will tend to prefer this group of licenses when it comes to collaborating with other large corporations which are competitors in one manner or another. With a project licensed as such, each corporation is required to publish its changes for all to see. In such a way they ensure that their investments in the code cannot be unilaterally taken by a competitor who doesn't submit changes back. Such a one-way flow effectively gives the unsharing participant free work. Situations like these would otherwise prevent collaboration from occurring for fear of losing competitiveness. In a way similar to corporations, an individual author might choose the Share Fairly licenses to force others to give back to the community if they've received value from the software.

Other authors choose Share Fairly licenses not as the final say in the licensing of their software, but instead as a default position in a licensing negotiation. Such authors are displeased at the thought of a company taking their work and making a profit from it without compensating them in some manner, but they still want others to benefit from their work. In this case a restrictive Share Fairly license might be chosen as a default minimum compensation: any company which finds the code profitable must provide its changes back to the public. Should a company not wish to compensate the author by making the code better for everybody, these sorts of authors are more than willing to enter into a formal negotiation for different compensation. Such compensation could be a lump sum of money, royalties, employment or just about anything else you could imagine.

These are just the largest of the licensing groups. There are other smaller groups, such as the non-military use group, but they tend to almost fit into one of the groups above barring some special restrictions. These are also only some of the most common reasons for choosing one license type over another; by no means are they exhaustive. It is likely impossible to render such a list. One common reason to choose one license group over another which I didn't mention is simply local inertia: if you are within groups which predominately choose one license group over another, then you are more likely, when you give it little thought, to choose whatever license is most common there.

Such an article as this wouldn't be complete without a note about which camp I fall into and why. I'd likely receive questions in any case, so I may as well answer them upfront. Without other modifying factors I will choose an appropriate Share Fairly license for my personal projects. I do this because I make my living as a software developer and do not desire to see either of the extreme possible results of the other license group choices. I also view the public license as a default negotiation position and am always open to discussing proprietary licenses for compensation. I consider my work useful enough to be worth providing to the public, but it takes real work to produce this software from which I'd like to see some return. Whether this return is in users, patches or cash I care little, but I don't want to see some company consuming my work for their sole profit.

Common Misunderstandings About Version Control

I have an interest in the mechanics and tools used to run software projects. I've used several of the most common version control systems and dealt with projects large and small. I've thought about the theory and practicalities of version control and like to think I have a solid grasp on the topic. This is why it pains me to read, more frequently than I wish were true, some developer making a statement about version control which is untrue and detrimental to the use and discussion of version control. I hope to clear up the most common incorrect beliefs in this post. I'll be discussing primarily in the context of Subversion, Perforce and Git. I unfortunately don't have extensive experience with Mercurial, but Git shares several of the same core concepts and together they are currently the best of breed distributed VCSes. Subversion and Perforce are the best of breed centralized VCSes and there are some important distinctions between the two worth noting at certain points. All three together should provide sufficient coverage of the necessary concepts.

To put a face on the prototypical developer who makes ignorant statements about version control, imagine a developer in his early twenties. He hasn't really worked at a large corporation, but he has done plenty of coding for small personal and consulting projects. He's only ever really used Git and has read nothing but that it's the best. Let's go over the worst of the misunderstandings about version control this developer is likely to have.

No Binaries Checked In

One common belief about version control is that you shouldn't check in binary files. To some, version control is only for source files. When pushed, many of these developers will agree that small binary files, such as images for a website, should also be checked into the VCS, but by no means should large binary files be checked in.

Such a view is incorrect and commonly ignored in several industries. Firstly it is incorrect in the belief that large binaries, such as the Photoshop sources of those website images, should not be version controlled. There are only good arguments for version controlling them. Large binaries can change just like any source file, and other parts of the project are just as capable of being dependent on a large binary file as on any source file. It is true that a separate tool could be used to separately version the binary files, but why do that if your VCS is capable of doing it for you right alongside the rest of your project? Why should a video maker use separate tools to track the versions of the original videos and the final renders?

There is then the argument that binary intermediate products, that is binary files which can be produced from other files in the repository, should not be checked in. In some cases, such as source files to object files, this makes sense and is often the context in which such a belief is learnt. There is little to be gained by committing the compiled output of a source file when a compiler can recreate it with no trouble. If creating the intermediate products is cheap then there is no need to commit them. But that processing is not always cheap. It is common in video game development to have not only the Photoshop originals of various assets, but also the flattened and compressed versions committed into the VCS. The reason is that it would be expensive to pay for Photoshop licenses for everybody who needed to build the game for whatever reason. It might also be time consuming. If it takes five minutes per asset to convert from a Photoshop format to the format needed by the game then a game with hundreds of such assets may gain immensely from keeping the intermediate products. Similarly other intermediate products can take hours of processing to produce.
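The arithmetic behind this trade-off is worth making concrete. A minimal sketch, assuming a hypothetical project of three hundred assets at the five minutes each mentioned above (the asset count is an invented round number):

```python
# Back-of-the-envelope cost of rebuilding intermediate assets instead of
# committing them. The asset count is an assumed round number; the five
# minutes per asset comes from the example in the text.

assets = 300                # hypothetical number of game assets
minutes_per_asset = 5       # Photoshop original -> game format conversion

rebuild_hours = assets * minutes_per_asset / 60
print(f"Rebuilding all assets from originals: {rebuild_hours:.0f} hours")
```

A full rebuild from originals costs a day of machine time; committing the converted assets turns that into a simple checkout.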

Even more than intermediate products, it is common in embedded products to not only store the source code of the project, but the binaries of all the tools necessary to build it as well inside the VCS. When working with an evolving tool chain it can be a great aid to be able to go back arbitrary versions and know that you have the matching tool chain. Such binaries can run into the hundreds of megabytes and often make sense to put into the project VCS.

Linux is a Large Project

The previous discussion about binary files brings us to an extremely common misunderstanding about version control. This is especially true in the open source and web development worlds simply due to the lack of exposure. It might be difficult to believe for people who have never worked at a corporation, but in just about every way Linux is not a large project with respect to version control.

Given a little thought this should be pretty obvious. There are many projects in the world, such as Linux distributions and embedded software, which are significant projects on their own which include the Linux kernel as a subset of their source code. Quite obviously these projects must be larger than the Linux kernel themselves. In fact, I would argue that the Linux kernel, with modern technology, is merely a medium sized project.

Consider some recent statistics about the Linux kernel. Version 3.2 has about fifteen million lines of code across about thirty seven thousand files, about thirteen hundred developers take part in each version and they submit about seven patches per hour for inclusion. A checkout is about 450 megabytes. While the number of developers would rank this as a large project, none of the other metrics do. Fifteen million lines isn't nothing, but it certainly isn't anywhere near the size of projects which include the source of the kernel, glibc, gcc and a few other things you see in a Linux distribution or MacOSX. Seven patches submitted per hour isn't an impressive number either and can easily be matched by a couple hundred developers working on a single project at any corporation. A checkout of less than half a gigabyte is nothing compared to projects where binaries are stored, which can reach many hundreds of gigabytes for a AAA video game.
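The patch-rate claim is easy to sanity check. A sketch with assumed round numbers for the corporate team (team size and per-developer rate are illustrative assumptions, not data):

```python
# Sanity check of the patch-rate comparison: a couple hundred corporate
# developers versus the kernel's roughly seven patches per hour.
# Team size and per-developer rate are assumptions for illustration.

kernel_patches_per_hour = 7
developers = 200
patches_per_dev_per_day = 1     # one modest commit per developer per day

team_patches_per_hour = developers * patches_per_dev_per_day / 24
print(f"Team rate: {team_patches_per_hour:.1f} patches per wall-clock hour")
```

Even at one commit per developer per day, two hundred developers edge past the kernel's wall-clock rate.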

Big projects are just so much bigger than Linux when it comes to strain on the version control system. Multigigabyte checkouts are normal. Lines of code in the fifty million range or more, including libraries and third party components, are not abnormal. It can be surprising the first time one thinks about it, but consider the case of KDE. It's a well known open source project which is larger than Linux in many respects and it itself probably only barely crosses the large project threshold.

Centralized Means No Branching

Given the way that some people talk you might be led to believe that before Git no VCS ever supported branching. This is obviously not true and the idea of VCS branching support has been around for at least forty years. RCS had primitive support for it. And yet the belief that centralized VCSes such as Subversion and Perforce don't support branching persists. Often such a view is expressed in such a way that the 'normal' way to use SVN or Perforce is to develop all in the trunk or mainline. While this is a common method of development suitable to some projects, it is by no means the only solution. Much of this misunderstanding comes about, I believe, because the DVCSes have chosen to solve an easier form of branching than the centralized VCSes. I'll discuss the distinction in the next section. But first I'll show how branching is a critical component of every modern VCS, centralized or not.

For that purpose I'll define a modern VCS as one in which there is a checkout which is separate from the files as committed into the VCS history, until such time as the developer consciously commits changes from the checkout into the repository. Many VCSes satisfy this definition, Git, Subversion and Perforce included. Now consider the situation where a developer has a checkout of some branch and makes some changes. Before the developer commits, a colleague commits some other changes into the same repository; assume for the moment that, if using Git, it is actually the same clone on the same machine in the same branch. Now before the developer can commit their changes they need to first bring in the changes from the repository and then merge their changes on top of them.

In such a situation the checkout is a branch with a maximum commit depth of one. That is, the checkout can be considered a branch which only ever has a maximum of one set of changes in it. When bringing in changes from the branch a merge occurs. This is the simplest way in which branching is a capability of every modern VCS, including the centralized VCSes. Of course Subversion and Perforce have documented and battle tested branching and merging on a larger scale.

Sometimes when a developer exclaims that centralized VCSes don't have branching they really mean what is termed local branching in DVCSes. That is, within a developer's own clone they can branch as much as they want and make commits into those branches as they see fit. It is not required of centralized VCSes, but it is common for a repository either to allow developers to create branches as they see fit or to have areas within the branch namespace explicitly for developers to create private branches. The end result is nearly identical in both cases. The only distinction is that DVCS local branches can remain hidden from everybody but the creating developer while private branches in a CVCS can only hide in plain sight.

One Way to Branch

As I mentioned earlier most current DVCSes have decided to solve an easier branching problem than most CVCSes. If you've only ever used a DVCS then you might be underinformed and believe that there is only one way to branch. The branch-the-world philosophy of DVCSes is certainly simple to grasp and makes merging and keeping track of branches easy, but it is not the only way. Subversion and Perforce are more flexible in this regard and support branching directories.

At first it isn't clear why you might want to branch beneath the root of the repository. Consider an embedded project based on Linux. Such a project will have a copy of the kernel, a copy of the C library and some other bits of code. Now if a new system call were added to the kernel to be used by the application code then the C library might be updated to support that system call as well. It would be useful for such a change to go in all at once to ensure that you never have to deal with mismatched versions when compiling. However, since these are separate components most of the time, it is also useful to build each as an RPM. Since these components also take a long time to build, if you are working on the application why should you have to rebuild the kernel RPM? One solution to this problem is to branch each package separately depending on which changes you require. Only a system which allows you to branch directories, and not just the repository root, gives you this ability.

Consider a further case where you have some company-wide documentation in the same repository as the project. It's often not useful to branch that documentation when you branch the project, just as build server configuration doesn't branch with the project, but it's still worthwhile to have it in the same repository for other reasons. Such setups are impossible in branch-the-world models.

DVCSes Are Special

This is perhaps the most annoying misconception which comes from inexperienced VCS users who have only really used DVCSes. They often believe that DVCSes are special in ways which aren't true and ignore the few ways in which they do differ. The only way in which DVCSes are fundamentally special is the distributed aspect. Current DVCSes keep a copy of the entire history in the local clone. This is good in some situations, e.g. on an airplane, and bad in others, e.g. when the history is 500GB in size. It is dependent upon the situation whether a DVCS is better than a CVCS or not. Beyond this there is nothing special about DVCSes. Everything you can do with a DVCS you can do with a CVCS with more or less trouble. Similarly everything you can do with a CVCS you can usually accomplish with a DVCS with more or less trouble.

Version control is all about keeping track of changes. The theory is simple; you just have to look past the implementation details of the VCS you are using and think about the high level operation you are trying to accomplish. Do so and you'll be significantly less likely to misunderstand version control.

Hacky Simulation

Xug. I completely forgot about my final Evolutionary Societies assignment that's due tomorrow. Ok, let's see how much trouble I'm in.

Simulate the effects of an energy peak on a Type 1 civilization. Note: This will take a day on the campus supercomputer so schedule your slot ahead of time.

Great. I didn't schedule a slot and the last one is surely taken. That's bad. But look, it says nothing about the simulation fidelity. I'm smart, surely I can take some shortcuts and get the simulation done in time. Better to at least try, right?

So let's see, Type 1 civilization. So limited space travel and probably no planet splitting. I can work with that. At least I'm not simulating multiple solar systems. Ok, solar system size: smaller is better but it must be big enough to not take too long. Medium sized star it is then. Need a planet of course, let's put them on the third one for fun. In the Carbon Zone of course, I need this done quickly and there is no time to wait for Silicon life. No point in simulating more than the single solar system so we'll limit the simulation to the volume of the solar winds. I like space travel so let's make sure there's a reason for these beings to go into space. I guess that means they need to be surrounded by rocky worlds, but you really do need interesting large worlds to go to space, so I'll put a few of those out there. Just to keep them guessing I'll put an asteroid belt between them. Gotta remember the energy peak though. Let's make fusion impossible for them and put the organic peak a bit after 4.5 billion years. Add some randomness so it doesn't become obvious to their philosophers. And voila, one standard carbon life solar system. Now which corners to cut.

First the big stuff. There is no way I have time to simulate to the real quantization. I guess I'll do a hundredth of real. 10^-35 should be lots. It's not near the 10^-3500 of reality, but that just means that their computers will stay large, slow and power hungry. And who knows, that puts quantum effects close enough in size that maybe there will be some interesting biology out of it. Plug in the standard quantum model and I save a whole bunch on the time quantization too. Let's see how long that will take to run, say, five billion years.

A year. Xug. Man, the supercomputer must be fast. Time for more cuts. Let's see where all the time is being consumed. Hmm, that's a lot of memory used to simulate the universe within non-zero gravitational effect. That's easy to fix though, just limit gravity to the speed of light; they shouldn't reach the point of gravity control soon enough anyways. That means I can just simulate the universe delayed. That'll save a ton of memory. Not a real huge time saver though. Ok, Type 1, think really limited. No planet splitting so I can get real coarse with the internal simulation. Screw the quantum model, let's go with trivial fluid dynamics for the cores of planets. Simulating the first 10KM of crust should be lots. I don't even have to be that accurate since they won't get that far, so let's use graduated accuracy starting at the space quantization 10KM down and going to the cubic kilometre at the centre. You know, let's go crazy and only be that accurate for the planet with life. The rest can just live with fluid simulation starting at 10^-5 and going down to hundreds of cubic kilometres for the larger planets. If anything leaves the planet I guess I'll have to up the precision there, but that's no big loss.

So that's the bodies, but most of the solar system isn't body, it's space. If I follow the same kinda deal with space, why not reduce its precision dynamically too. It's mostly empty so I can do a 10^-20 simulation for most of that at a reduced time granularity. Sure it'll have physics artifacts, but it'll all balance out in the end nearly immediately. Keep them guessing. Now how long?

One month. Closer, but I still need some big gains. Let's take a look at the physics engine options to save there. Unobservable approximation. That's an option I was hoping to avoid, but there isn't really anything I can do about it. My little species will just have to live with being unable to unify physics in the large and physics in the small. That's unavoidable when you use two different physics engines depending on the energies and precisions involved. It'll be close, but they might not even progress to the point where they'll notice. I really, really hope that gets me there. I don't know what else I can cut.

A bit less than a day. What time is it now? Just after lunch, perfect. As long as my little species don't push too far past the boundaries of their planet it'll be done with a couple of hours to spare to throw together some BS observations and charts.

Sometimes being a third year student means doing a rush job.

Many Fueled

Lithium coin cell. Twelve volt lead acid starter. Gasoline. Lithium-ion rechargeable. AA alkaline. Wood. White gas. Propane. Butane. Diesel. AAA alkaline. Food. It takes a lot of different types of fuels to run a modern autumn camp. Too many types to be honest. It's a far cry from where camps started.

Wood and food used to be the fuels of the day. Perhaps with hay thrown in if you were well off enough to have some beasts of burden with you. Wood was found locally, though you often had to bring in at least some of the food and hay. This wasn't what you'd call convenient though: no truly portable light to travel by, and a fire to burn down to coals before you could cook.

At some point lamps and candles were added to the mix. Neither lightweight nor bright, but easily carried light is a valuable thing. The sweet spot really came about with pressurized petroleum stoves and lanterns. No more stumbling around at night, and you could be cooking in a couple of minutes on the stove versus half an hour or more on wood coals. Wood, food and white gas. Three fuels for all needs with one locally gathered. A camp setup like that would look almost modern.

But then things started to get complicated. Make no mistake, what was lost in fuel simplicity was more than gained in convenience. Flashlights sure beat a lantern for walking around and packing on a hike. Propane makes cleaner and easier to use stoves and lanterns. With white gas you can be cooking in a minute or two; with propane it's literally seconds. Matches are nice and all, but butane lighters are really nice when they'll do the job. And I'm sure few will want to trade their ATV or truck for a horse.

Convenience sure can be a pain sometimes though.

Practical Bear Safety

Bear safety in the woods is an important matter. If you are lucky you only have to deal with black bears. If you are unlucky you not only have to deal with black bears but also grizzly bears. In either case you should be safe. Here are some practical, if not widely advertised, tips for keeping your camp bear safe. These tips are not for a light hiking camp, but more for a semi-permanent camp with several people where you'll be staying for more than a week.

  1. Have a fire. Fire is perhaps the most critical component of keeping a camp bear safe. With a fire you should burn all the garbage you can. This is most obvious with the food wrappings and food scraps, but food cans can also effectively be burnt. The heavy food cans can be put in the fire to burn off the food residue and, after they have cooled, kept to bring back to a recycling centre back in the city.

    Just having a fire burning, especially through the night, will also serve as a deterrent, though not an absolute one, and keep wildlife away from your camp.

  2. Burn off your BBQ. Similar to the idea of fire you should ensure that any cooking surfaces are well cleaned before being put away. With pots and pans this means cleaning them shortly after finishing your meal with water and soap. For BBQs and other grills you should run them at a high power for a few minutes after you are finished cooking, scrape them and then run for a few minutes more. If it no longer smells like food it won't attract bears after the cooking smells have dissipated.

  3. Mark your territory. Animals are intensely sensitive to smell. Use this to your advantage. Make sure to urinate around the perimeter of your camp on a regular basis. It helps if you are eating foods or consuming drinks which add, let us say, body to the urine. This isn't an extremely strong deterrent, but you'll want to do it anyways because walking in the dark is hard and it's good to have an excuse.

  4. Drink beer and pop. If you can't prevent a bear from coming to visit your camp you can at least get some warning that they've arrived. Drink copiously and use the empties as an early warning system. This is especially useful near the areas where food is stored or cooked. If hunting you should have a pile under any hanging harvest. Ensure that when you leave you collect all the empties to return to help pay for the next case of beer.

    If you have somebody known to snore, ensure that they get an extra measure in their cup. It'll most likely make them snore all the louder, and who wants to come near a chainsaw running at night?

  5. Have a Designated Teetotaller. Though you should drink so you have the empties to use as a warning perimeter, there should be at least one, but preferably several, people who aren't drunk and can handle a bear should one appear. This doesn't mean that they can't drink at all, but they mustn't drink past the point at which they could still drive.

  6. Sleep with a Gun and a Big Flashlight. All of the above are really only deterrents to bears. Nothing will really stop a bear which wants to come visit your camp. In that case there is only really one thing to do. Have a gun safely at the ready. You'll need the flashlight since it'll be dark. It might take two people, one to carry the gun and one to carry the flashlight, if you don't have a headlamp.

    Try and have the designated teetotaller be the one using the gun.

    Should things get this serious don't bother with a warning shot; the bear will either not understand it or simply come back later. Go straight for the chest shot. All the better to put two in there to be sure. Never go for the head since you'll either miss or, in the case of a grizzly, just bounce the bullet uselessly off its forehead. In the morning you must remove the corpse to some distant dumping spot.

These aren't your standard bear safety tips, but they'll get the job done if you are in a situation where the normal safety tips are not practical. Just keep in mind that in the woods the bear is top dog and you are beneath it on the food chain.

Golden Age of Burgers

Imagine the year is 1958. You are a 17 year old boy living in a medium size American city. It's a warm Saturday afternoon and your father has lent you his car for a couple of hours. Life is pretty good. So good, in fact, that you decide to use your gasoline powered freedom machine to visit the local burgershack with your friends.

Hamburgers likely existed in a similar form before this idyllic Saturday, but this may just be the perfect time in history to go out for a hamburger. The War is over, post-war prosperity has arrived in America and many of the previous hardships have passed. Even more importantly, burger optimization, which will eventually drive the local burgershack out of business and replace the burgers with limp, dry imitations, has yet to come.

You and your friends cruise the relatively empty streets with sidewalks full of people. You pull into the burgershack and order a burger and milkshake. You pay with the money from your part time job. This is truly the golden age of burgers.

The mass of fiction and fevered dreams above was brought to you by Fatburger. Burgers how I imagine they were before fast food meant sixty seconds or less.

How to Use a Thinking Machine

Computers are wonderfully useful machines full of possibilities to make one's life easier. Unfortunately most people don't know how to make the best use of their thinking machines. They only use the simplest features of the software they buy and perform many manual steps which the machine could do for them. In short they are insufficiently educated on the key to the most effective use of a computer.

The biggest key to using a computer to the fullest is that you, dear user, should use them in such a way as to forget as much as possible as soon as possible. You must let the computer do the remembering and thinking wherever practical. How to do this will be clearer with some examples.

First consider the case of email. Many people read their email, letting that email become marked read, and then have to remember which messages still require a response or some action to be performed. This is a situation where remembering can be pushed off to the computer. Messages which don't need an action should be separated from messages which need to be read or acted upon. Move them to another email folder.

Another email case occurs with finding old email. Many email programs have become quite good at searching for old messages these days. There is no need for anything but the coarsest of email filing hierarchies. Just throw the email into a large bin after finishing acting upon it. Perhaps this is one folder per project, perhaps this is a single folder named 'Old'. It doesn't have to be complicated.

The previous two examples are more about using the existing features of the software in an intelligent way to have the computer do the remembering for you. Beyond remembering and communicating, in which computers cannot yet offer significant savings other than speed of delivery, computers can think for you. This requires simple programming and scripting. Those of you using Windows will find this more difficult than on other systems, but it can still be done. Writing dirty scripts in bash or python or make are great ways to teach the computer how to do some thinking for you. If you are clear in how you write these scripts you can then forget how to do those tasks entirely.
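To illustrate the sort of dirty script meant here, a minimal sketch: a Python function that files everything in a folder into subfolders named after each file's extension, so the filing scheme becomes something you can forget. The folder layout it produces is an invented convention for the example, not a recommendation:

```python
# A "dirty script" that does remembering for you: sort a folder's files
# into subfolders by extension. The layout is a hypothetical convention.

import shutil
from pathlib import Path

def sort_by_extension(folder: Path) -> None:
    """File every regular file into a subfolder named after its extension."""
    for item in list(folder.iterdir()):   # snapshot; we create dirs below
        if not item.is_file():
            continue
        ext = item.suffix.lstrip(".").lower() or "misc"  # no extension -> misc
        dest = folder / ext
        dest.mkdir(exist_ok=True)
        shutil.move(str(item), str(dest / item.name))
```

Point it at a real folder, e.g. `sort_by_extension(Path.home() / "Downloads")`, and when your needs change slightly, change the script to match.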

Writing applications has a well deserved reputation for being aggravating because users demand flexibility and that all the sharp edges have been sanded off. Writing a basic tool which performs exactly your own task and nothing more is several orders of magnitude easier due to the specificity and ease of modification. When what you need done changes slightly you can change the script to match with ease.

Computers are also known as thinking machines for a very good reason. When used most effectively you can offload tedious but significant parts of your brain to the machine, thus leaving you to have a less stressful and more productive life. And productivity means more cold drinks with friends.

Threat Models, Security Researchers and You

Your communications are insecure. Criminals and governments the world over can read your emails, what you read on the web and watch you as you bank online. This is not your fault, it is the fault of security researchers. Security researchers reading too many cold war spy thrillers as children has left you without effective communications security. Your communications are insecure because security researchers are using the wrong threat model.

Threat models are the axioms, the base assumptions, of the security world. They are used to design the security system by defining what is being protected and what the attacker is capable of. Consider two threat models for a warehouse owned by a criminal organization. In the first model the crime syndicate has the warehouse in a country with a strong rule of law and is only worried about other criminals breaking into the warehouse and stealing the illicit goods within. In this case it is probably sufficient to make it known that they own that building and then have standard locks to deter the pettiest of criminals and cameras to identify any successful thieves. Those thieves can then be dealt with extra-judicially after the fact as a warning to the next would-be thief.

The second model to consider is that same warehouse of illicit goods, but this time the police are militarized and the rule of law weak. In this case the warehouse must be defended against military raids. Locks and cameras are useless; blockades, reinforced doors and armed guards are more the order of the day. This is quite different from the first case, where all this extra security would be detrimental to the security goals. These two examples show that security assumptions are critical to making effective security choices. Choosing the wrong security model can result in less security than no protection at all. The crime syndicate would immediately lose their goods to the police if they had armed guards.

It is no different in the realm of communication security. Choosing the wrong model can expose you to insecurity or make the cost of the security outweigh the benefits. Consider two of the most prolific threat models used on the Internet, Cold War Spy and Faith in Government.

The Cold War Spy threat model is a favourite of the security community primarily for historical reasons. Over the past century only governments did communications security, and did so primarily under the guise of the military. Consequently the threat model discussed was usually some variant of: anybody can be a spy, assume every line is tapped by the enemy, leaking even one message will result in somebody dying, and all users will be well trained because it may be them who pays the ultimate price for mistakes. All very reasonable if you are a military commander trying to plan an invasion and keep your spies alive. Not so reasonable if you are trying to trade emails with your friend across the country. This model is typified by PGP.

The popular alternative is the Faith in Government threat model. In this model governments can be trusted. They can be trusted to correctly vet every key and certificate or to delegate to equally trustworthy entities. This sounds pretty reasonable, right? Your government follows the laws and doesn't need to make things insecure since they have legal methods of tapping whatever they want. Reasonable, except that it isn't just your government which is being trusted and it isn't just all the employees in your government and its delegates; it's every government. Corruption and privacy laws may be strong in your country, but can you say the same about China, Syria, Lebanon or Somalia? Can you trust that the certificate issuer of Russia doesn't owe somebody a favour they can't refuse or want a new luxury car? Of course you can't. This model is typified by SSL, the only real encryption used on the web for online banking and commerce.

These models leave the average person vulnerable because they ignore the situations average people find themselves in. Cold War Spy communications security is too hard and cumbersome. Nobody will die if one random email message of mine is broken. My lines are not usually tapped, especially since I move between network connections several times a day. Most people who come in contact with my messages couldn't care less and will merely do their job and otherwise ignore my messages. Cold War Spy security just isn't applicable to the average situation. The Faith in Government model is equally flawed. While I may trust my government because they can subpoena whatever they want anyways, I certainly don't trust their third rate corporate delegate run out of a derelict warehouse. This is to say nothing of trusting the governments, government officials and private persons in the various high corruption areas of the globe.

So most communications are vulnerable to criminals, foreign governments and corrupt local government employees. It doesn't have to be this way, but security researchers don't have your threat model in mind and are unwilling to accept the compromises necessary to protect it.

Investment

The modern global economy depends upon investment to operate. Money is invested, products are produced and sold, profit is returned. Many people know this. However I would argue that the majority of people who don't work in business, accounting or financial services don't intuitively know this. Instead they only understand the abstract version of it, financial investment. This is greatly to their detriment.

Financial investment is the most advertised type of investment. The ads equate putting your money into their funds with pulling out a small fortune in twenty years time. This is more or less harmless as is, but it is the kernel of a disease of misunderstanding. This is most evident wherever a housing bubble is inflating. Some people go ahead and buy a house, perhaps fix it up, and then quickly sell it for a profit. Once this becomes a trend people see the price of their houses going up. Suddenly houses are an investment. Put your money in, irrespective of whether you live there or not, and in a few years cash out with an extra 20, 30 or 50%.

Unfortunately this misses the form of investment which is most useful to the average person with a finite budget and never as much money as they could use. I will term this kind of investment productive investment. Productive investment is paying money for depreciating assets which have a positive return on investment. Take education as an example. Suppose that a degree or diploma will cost you $10,000 once tuition and books are taken into account. The longer you have held this degree the less it is worth: the certificate may help you find a better job shortly after you graduate, but ten years later it's mostly your experience in the field which helps you find another job. Thus a diploma is an asset which decreases in value as time goes on. If the new job is better and pays more then it's likely a good investment.

One example of a borderline case, often argued to never be an investment, is a new car. While it is absolutely true that a car will never be worth more than you paid for it, that doesn't mean it cannot be an investment. Consider the case where you already have a quite old vehicle, say a thirty year old beater. Conditions could be such that the difference in price between a used car a few years old and a new car is five or ten thousand dollars. If you intend to keep the car for many years, a decade at least, and the new car is very fuel efficient, then it may be worth the extra cost. It then comes down to math: the extra vehicle cost versus the additional reliability (important if you drive for a living or otherwise have a lengthy commute with no affordable alternative such as transit), reduced repairs over the next handful of years, and reduced fuel costs. It is especially important to take into account the higher future cost of fuel: you may only be saving $0.50 per 100 km now, but in ten years fuel could triple and you would then be saving $1.50 per 100 km.
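As a rough sketch of the fuel half of that math. The annual distance is an assumption; the per-100 km savings are the figures from above, and all arithmetic is done in whole cents:

```shell
# Hypothetical: 20,000 km driven per year, saving $0.50 per 100 km
# today and $1.50 per 100 km if fuel triples.
km_per_year=20000
for cents_per_100km in 50 150; do
        yearly_cents=$(( km_per_year / 100 * cents_per_100km ))
        printf 'at %d cents/100 km: $%d saved per year\n' \
                "$cents_per_100km" $(( yearly_cents / 100 ))
done
```

So at today's prices the fuel saving is around $100 a year, and $300 a year if fuel triples; that alone won't cover a large price difference, which is why the reliability and repair terms matter.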

Of course these numbers ignore the less tangible benefits such as reduced stress, greater happiness or a vehicle which fits your needs better. And of course the best transportation investment is not having to own a car at all since almost no personal vehicle ever fully pays for itself, though the cost versus a more used vehicle may not be as great as it first seems.

Hopefully I've made clear how many things which cannot ever be sold for more than their purchase price can be viewed as investments. This is how business looks at it. You have to spend money to make money, but you also have to spend money to save money. Paying for efficiency and longevity can be worthwhile investments. Longevity especially so since a $20,000 purchase depreciating to nothing in 25 years is less expensive than a $10,000 purchase which is worth nothing in eight.
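The longevity claim is easy to check with straight-line depreciation:

```shell
# Annual cost if each purchase depreciates linearly to nothing.
echo "\$20,000 over 25 years: \$$(( 20000 / 25 )) per year"   # $800
echo "\$10,000 over 8 years:  \$$(( 10000 / 8 )) per year"    # $1250
```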

Not The Stat You Are Looking For

People seem to have a fetish for Life Expectancy. Not a measurement of how long they'll live, which would actually be useful, but the formal statistic. It comes up wherever progress is discussed: now versus the middle ages, the rate of progress, whether invention has hit diminishing returns, comparisons between countries, why it sucks to be in the developing world, and many others. Champions of improvement point to the massive gains since the second world war, moving from 64 to 80 today in the developed world. People arguing that life in the middle ages was brutishly short point to its thirty year life expectancy.

All of these are strong arguments based upon a misleading statistic. Formally, Life Expectancy is the mean age of death of every person in a particular time and place. Though at first blush this sounds like what you want, it really isn't. It is heavily biased towards measuring the deaths of infants and young children. If life expectancy is low then it is almost certain that infant mortality is high. However, the infant mortality rate doesn't matter that much except in dragging life expectancy down. If an infant is born and doesn't survive the week that's a shame, but it isn't a useful measure of the age at which adults tended to die, of natural causes or not. To get at that you'd want a statistic more like the median age of death of any person over fourteen, which could be calculated like life expectancy but excluding anybody under fourteen. Unfortunately such statistics are not commonly available.
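To see the skew, take a hypothetical set of ages at death. A handful of infant deaths drags the mean far below the age the adults actually reached:

```shell
ages="0 0 1 0 2 45 52 60 63 70 71 75"   # hypothetical village

# Mean age at death: the "life expectancy" style statistic
echo "$ages" | tr ' ' '\n' | \
        awk '{ s += $1; n++ } END { printf "mean: %.1f\n", s / n }'   # mean: 36.6

# Median age at death among those who reached fourteen
echo "$ages" | tr ' ' '\n' | awk '$1 >= 14' | sort -n | \
        awk '{ a[NR] = $1 } END { print "adult median: " a[int((NR + 1) / 2)] }'   # adult median: 63
```

A "life expectancy" of 36 when the typical adult died at 63.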

Life expectancy does have its uses, but they are much narrower than its use as the end-all measurement of how long the average person lived. There were still people who commonly lived into their sixties and seventies in the middle ages, even though the life expectancy was nearer to thirty years than eighty. Don't base an argument on life expectancy; it just doesn't mean what everybody thinks it does.

Annotated Tour: bash

On occasion the topic of shell configuration has come up in discussion in one regard or another. Often I have some useful tidbit in my bashrc which others don't know about. I have thus decided to begin writing up an annotated tour through the configurations for the various tools I use on a daily basis. This is the first in that series where I cover what's in my bash configuration.

Let's start from the top.

# First unpack $TERM because it's the only effective way to move arbitrary environment
# variables through ssh to arbitrary hosts. The format of the modified string is:
#   realterm:flag1,flag2,flag3
# Each flag will be set to its own name as its value. Thus, "xterm:USE_FANCY_KEYBOARD"
# will result in:
# $TERM=xterm
# $USE_FANCY_KEYBOARD=USE_FANCY_KEYBOARD
EXTRAFLAGS=${TERM##*:}
export TERM=${TERM%%:*}

if [ "$EXTRAFLAGS" != "$TERM" ]; then
        IFS=',' read -ra ENVFLAGS <<< "$EXTRAFLAGS"
        for flag in "${ENVFLAGS[@]}"; do
                export "${flag}=${flag}"
        done
fi

From what I've been able to determine there is only one portable way to move environment variables from one machine to another via ssh. While you can configure ssh to copy particular environment variables when logging in, it requires configuration changes on both the client and the server. Since that isn't portable I move my environment variables, only flags at the moment, by stuffing them into TERM, which ssh passes along by default. This code unpacks the encoded value, exports each flag and resets TERM to the real terminal type.
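A minimal demonstration of the round trip, with hypothetical flag values:

```shell
# Packed, as done by the ssh alias near the bottom of this file
TERM="xterm:USE_FANCY_KEYBOARD,DARK_TERM"

EXTRAFLAGS=${TERM##*:}   # everything after the last ':'
TERM=${TERM%%:*}         # everything before the first ':'

echo "$TERM"         # xterm
echo "$EXTRAFLAGS"   # USE_FANCY_KEYBOARD,DARK_TERM
```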

# terminal configuration options:
case $STY in
   *pts*|*tty*)
      session_name=`sed 's/.*\.//' <<< $STY`
      ;;
   *)
      session_name=`sed 's/[^.]*\.//' <<< $STY`
      ;;
esac

This code extracts the screen session name. If the session wasn't given an explicit name it extracts the hostname component of $STY instead.
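For example, with hypothetical $STY values (screen sets "pid.tty.host" for unnamed sessions and "pid.name" for named ones):

```shell
# Unnamed session: keep only the component after the last dot
STY="12345.pts-0.daredevil"
echo "$STY" | sed 's/.*\.//'      # daredevil

# Named session: strip only the leading pid component
STY="67890.work"
echo "$STY" | sed 's/[^.]*\.//'   # work
```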

case $TERM in
        xterm*)
                TITLEBAR='\[\e]0;\u@\h: \w\007\]'
                ;;
        screen*)
                if [ ! -z "$session_name" ]; then
                    TITLEBAR='\[\e]0;[${session_name}|${WINDOW}] \u@\h: \w\007\]'
                fi
                ;;
esac

With the screen session name I can then put that and the current window into the terminal emulator titlebar like "[daredevil|6]". I use this to keep track of which session and window number I am in as I move between several sessions often and create and destroy windows regularly.

# In progress work to detect whether the terminal is light-on-dark or
# dark-on-light. Very useful for things with colour. Would also be useful on
# odysseus.
if false; then
    if [ -z "$DARK_TERM" -a -z "$LIGHT_TERM" ]; then
        dark="1" # Default assumption of a dark on light terminal

        # Terminal.app
        dark="0"
        colour=`osascript -e 'tell application '\"Terminal\"' to tell the front window to get its normal text color' | sed 's/,//g'`
        for rgb in $colour; do
            if [ $rgb -gt 32000 ]; then
                dark="1"
            fi
        done

        # Xterm
        # echo -e "\e]11;?\007" will return something like
        # \e]11;rgb:rrrr/gggg/bbb BEL

        # Screen inside xterm
        # echo -e "\eP\e]11;?\007\e\\" will return as above. How can one detect
        # screen in xterm and is it even necessary?

    fi
    echo $dark
fi

This is some work in progress code to detect the background colour in order to give other applications I use a hint as to which colour scheme to use. It isn't used because there is no general way to determine the background colour locally, let alone through ssh. This is a hole in the traditional terminal information model and isn't helped by the fact that most emulators claim to be some variant of xterm, even when they obviously aren't.

# Preparation for system specific configuration. These are the interim aliases
# necessary so that the OS specific command names can override them if
# necessary. The primary example is that the ls with colour is a different
# command on NetBSD than on Darwin and Linux.
alias __ls='ls'

# Common configuration options:
export PATH="$HOME/bin:$PATH"
export VISUAL='vim'
export EDITOR=$VISUAL
shopt -s histappend # append to history instead of overwriting it
export HISTFILESIZE=100000
export HISTSIZE=100000
export HISTCONTROL=ignoredups
export LESS="-R"
export PAGER="less"
LS_OPTIONS="-h"
GREP_OPTIONS_DEFAULT="--exclude=tags --exclude=ID"
_GREP_OPTIONS=$GREP_OPTIONS_DEFAULT
NCPU=1 # Default to one CPU

# Not all greps support --exclude-dir
GREP_IGNORE_DIRS="--exclude-dir=.svn --exclude-dir=CVS --exclude-dir=.git"

These are all my default settings which are more or less portable across all the OSes and systems I use. Of interest here is 'histappend', which causes bash to append its history to the history file when closing the shell. This means I don't lose my command history when closing a shell, though sometimes the history I am looking for is further back than I expect. It works fine for opening a new shell and wanting the history of the most recently closed one though. I also set the ignoredups option which doesn't save duplicate command lines, such as if I do a manual watch with the same command for some reason.

Another setting of interest is -R for less. This setting makes less pass through colour escape codes. This is most useful when using grep colouring as configured below.

Finally I setup options to ignore VCS directories by default when grepping around.

function pwd_len_limited {
        local pwdmaxlen=20
        local pwd=${PWD/$HOME/\~}
        local newPWD

        if [ ${#pwd} -gt $pwdmaxlen ]; then
                local pwdoffset=$(( ${#pwd} - $pwdmaxlen ))
                newPWD="#${pwd:$pwdoffset:$pwdmaxlen}"
        else
                newPWD=${pwd}
        fi

        echo "$newPWD"
}

This function takes the current working directory and returns only its last 20 characters. $HOME is automatically turned into ~ and any path which is too long is prefixed with #. I find this as useful as having my full path in my prompt, but without the disadvantages, such as a path longer than my terminal is wide, which the complete path entails. In practice the displayed path is almost always truncated so it doesn't introduce much variability into my prompt size.
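The truncation behaviour, shown with a hypothetical helper mirroring the function above but parameterized so it can be run on arbitrary strings:

```shell
# Hypothetical stand-in for pwd_len_limited taking the path as $1.
function shorten {
        local p="$1" max=20
        if [ ${#p} -gt $max ]; then
                echo "#${p:$(( ${#p} - max )):$max}"
        else
                echo "$p"
        fi
}

shorten "~/projects"                       # ~/projects
shorten "~/projects/configurations/bash"   # #/configurations/bash
```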

function is_vim_running_locally {
        if ps -T -o comm | grep '^vim' &> /dev/null; then
                # vim is running in this local terminal
                echo -n "&"
        elif [ -f /p4conf ]; then
                echo -n "%"
        else
                echo -n "$"
        fi
}

This function sets the last character of my prompt. Normally it is the standard $. If this is a shell started from vim it will be &. At work we access the crosscompiler toolchain inside a chroot so the prompt is % inside that chroot. The indication that I am inside vim is especially useful to prevent me from editing a file, starting a shell to run some command, forgetting I was in vim and starting vim again. Before this change I sometimes found myself three or four vim instances deep and wondering where my editor modifying a file went.

# Functions which do non-trivial configuration which isn't always performed
function connect_to_ssh_agent {
        local SSHPATH="$HOME/.ssh/$HOSTNAME"
        # If we have a remote agent we are already done
        if [ -n "$SSH_AUTH_SOCK" ]; then
            return
        fi

        # If we have a record of starting an agent before, try connecting to it
        mkdir -p $SSHPATH
        if [ -s "$SSHPATH/sa.sh" ]; then
                . "$SSHPATH/sa.sh" >/dev/null 2>&1
                if ! kill -0 "$SSH_AGENT_PID" >/dev/null 2>&1; then
                        # agent is dead
                        rm -f "$SSHPATH/sa.sh"
                fi
        fi
        
        # If all else fails start an agent
        if [ ! -f "$SSHPATH/sa.sh" ]; then
                touch "$SSHPATH/sa.sh"
                chmod 600 "$SSHPATH/sa.sh"
                ssh-agent > "$SSHPATH/sa.sh"
               . "$SSHPATH/sa.sh" >/dev/null 2>&1
        fi
}

This function tries its best to always use an existing and recent ssh-agent when starting a new shell. This is most useful if I reattach to a screen session. Since the previous agent socket would then be invalid any new shells started in the old session wouldn't have a working ssh-agent otherwise.

# Mark the machines which have my fancy keyboard connected most of the time
function use_fancy_keyboard {
        export USE_FANCY_KEYBOARD="USE_FANCY_KEYBOARD"
}

This is just a function for tidiness below.

# Function which updates the settings of some environment variables. Useful when
# using screen and connecting from different machines.
function refresh_env {
        eval `cat ~/.ssh/$HOSTNAME/update_config`
}

# Create a file which can be processed to update various shell environment
# variables which may become out of date, such as DISPLAY when a shell is run in
# screen.
function create_update_config {
        local CONFIG="$HOME/.ssh/$HOSTNAME/update_config"

        # We only want to overwrite this configuration file if we are at the
        # root of a set of shells on this machine. Ie. if this session is in a
        # terminal window or as the result of an ssh login. We do not want to
        # overwrite the configuration if a new window in screen is opened or a
        # shell opened from vim. However, if this isn't the root we'll want to
        # ensure that we read the config file to have the latest settings.
        if [ -n "$CREATED_UPDATE" ]; then
            refresh_env
            return
        fi

        export CREATED_UPDATE=yes

        mkdir -p "$HOME/.ssh/$HOSTNAME"
        echo "export SSH_AUTH_SOCK=$SSH_AUTH_SOCK;" >  $CONFIG
        echo "export DISPLAY=$DISPLAY;"             >> $CONFIG

        if [ -z "$USE_FANCY_KEYBOARD" ]; then
            echo "unset USE_FANCY_KEYBOARD;" >> $CONFIG
        else
            echo "export USE_FANCY_KEYBOARD=$USE_FANCY_KEYBOARD;" >> $CONFIG
        fi

        if [ -z "$LESSKEY" ]; then
            echo "unset LESSKEY;" >> $CONFIG
        else
            echo "export LESSKEY=$LESSKEY;" >> $CONFIG
        fi
}

These two functions operate so that any new shell I start has up to date environment variables for things which may change with every remote login, even if the new shell is being started in the context of a previous login, as happens when reattaching to a screen session or starting a shell from an editor. Currently this only ensures that my ssh-agent, DISPLAY and keyboard layout are kept up to date. The refresh_env function allows me to update any running shell without starting a new one.

# OS specific settings
OS=`uname`
case $OS in

This section applies OS specific settings. Usually these are options for different userspaces or different ways of determining if particular hardware is available.

        Darwin)
                export LC_CTYPE="en_US"
                export PATH=/opt/local/bin:/opt/local/sbin:$PATH # MacPorts
                export HOSTNAME=`scutil --get LocalHostName`
                LS_OPTIONS="$LS_OPTIONS -b -G"
                NCPU=`/usr/sbin/sysctl -n hw.ncpu`

                # Detect if my fancy keyboard is connected or not
                if system_profiler SPUSBDataType | grep Kinesis > /dev/null; then
                        use_fancy_keyboard
                fi

                # Version specific changes
                OSXVER=`/usr/bin/defaults read /System/Library/CoreServices/SystemVersion ProductVersion`
                case $OSXVER in
                        10.8.*)
                                unset PROMPT_COMMAND
                                ;;
                esac
                ;;

Settings for MacOSX systems. The only bit of note here is that I check to see if my ergonomic keyboard is attached. Several pieces of software I use have different key mappings depending on which kind of keyboard I am typing on. I don't tend to use MacOSX as an ssh destination so I don't take care to connect to an ssh-agent or to check for a local login before enabling my fancy keyboard.

        Linux)
                export HOSTNAME=`hostname`
                _GREP_OPTIONS="${GREP_IGNORE_DIRS} ${_GREP_OPTIONS}"
                LS_OPTIONS="$LS_OPTIONS -T 0 -b --color=auto"
                connect_to_ssh_agent
                NCPU=`grep ^processor /proc/cpuinfo | wc -l`
                if [ $NCPU -eq 0 ]; then NCPU=1; fi
                ;;
        NetBSD)
                export HOSTNAME=`hostname`
                alias __ls='colorls'
                # Currently the only ls option is -h, which isn't supported
                # with colorls
                LS_OPTIONS="-G"
                connect_to_ssh_agent

                # sysctl requires extra permissions on some systems
                #NCPU=`/sbin/sysctl -n hw.ncpu`
                ;;
        FreeBSD)
                NCPU=`/sbin/sysctl -n hw.ncpu`
                ;;
        Solaris)
                #NCPU=`psrinfo | something
                ;;
        *) # Try something reasonable
                export HOSTNAME=`hostname`

These OSes have nothing special about them aside from some different options supported by a couple of userland tools.

esac

# Linux Distro specific settings
if [ $OS == "Linux" ]; then
        DISTRO=""
        # LSB check (Ubuntu)
        if [ -f /etc/lsb-release ]; then
                DISTRO=`cat /etc/lsb-release | sed -e 's/=/ /'|awk '{print $2}'|head -n 1`
        elif [ -f /etc/debian_version -o -f /etc/debian_release ]; then
                DISTRO="Debian"
        elif [ -f /etc/slackware-version ]; then
                DISTRO="Slackware"
        elif [ -f /etc/gentoo-release ]; then
                DISTRO="Gentoo"
        elif [ -f /etc/redhat-release -o -f /etc/redhat_version ]; then
                DISTRO="Redhat"
        fi

Find the name of a few Linux distributions I use from time to time.

        # Now we switch on the different distros because some of them are quite different
        case $DISTRO in
                Ubuntu)
                        export LC_CTYPE="en_CA.utf8"
                        ;;
                Redhat)
                        # I don't know which Redhat versions ship a new enough grep
                        _GREP_OPTIONS=$GREP_OPTIONS_DEFAULT
                        ;;
                *)
                        export LC_CTYPE="en_US"
                        ;;
        esac
fi

The comment says it all: this code adjusts a few things for slight differences between distros.

# Machine specific settings
case $HOSTNAME in
        travis)
                export DITRACK_ROOT="/home/travis/issues/issues"
                export QUEX_PATH="${HOME}/bin/quex-0.53.2"
                ;;
        daredevil)
                # Daredevil has an older version of gnugrep which doesn't support exclude-dir
                _GREP_OPTIONS=${GREP_OPTIONS_DEFAULT}
                export NNTPSERVER=localhost
                ;;
        multivac)
                export QUEX_PATH="${HOME}/bin/quex-0.59.5"
                export SVN_SSH="${HOME}/projects/configurations/subversion/svnssh.sh"
                ;;
        david)
                export USE_TINY_KEYBOARD=USE_TINY_KEYBOARD
                export SVN_SSH="${HOME}/projects/configurations/subversion/svnssh.sh"
                ;;
        tbrown-macbook) # Machine at Mobidia
                export CVSROOT=":pserver:tbrown@MobidiaCVS:2401/MOBIDIACVS"
                ulimit -c unlimited
                ;;
        tbrown3-macbook|tbrown3-vm32) # Machine at Tellabs
                export TELLABS=1
                ;;
        usscrh5bld*) # Build machines at Tellabs
                export PATH="${PATH}:/home/wiccadm/bin/cc:/usr/atria/bin:/net/sunwicc01/export/home/sunwicc01/wicc/tools/bin"
                alias ct='cleartool'
                export TELLABS=1
                # The build system breaks badly if done in parallel
                export MAKEFLAGS=" "
                ;;
        tbrown3-2|wiz|VTd-GAP) # vmbox at Tellabs
                export TELLABS=1
                ;;
        sdf|otaku|benten|faeros|iceland|norge|sverige|ukato) # Machines at SDF
                # These settings come out of the default .profile. At least
                # these are the settings I didn't overwrite.
                export MAIL=/mail/${LOGNAME:?}
                stty erase '^h' echoe
                ;;
        TRAVISB-ARISTA|*.aristanetworks.com)
                # Machines at Arista
                export ARISTA=1
                export P4MERGE=$HOME/configurations/perforce/merge.sh
                export SCREENDIR=$HOME/.screen_sockets
                mkdir -p $HOME/.screen_sockets
                chmod 700 $HOME/.screen_sockets
                ;;
esac

These machine specific configurations aren't that interesting for most people. In fact most of these entries are defunct but I keep them around as examples of how to perform certain types of environment specific configurations without having to reach back into my VCS history.

What might be of interest is the screen socket configuration in the last machine section. Remember how I said at work I spend time inside chroots in screen sessions? Well, in order to start a screen session within the chroot you must be in the chroot. This makes storing the screen sockets in /tmp problematic. Since this chroot does allow me access to my home directory I put the screen sockets under there instead so I can access them outside the chroots.

# Set things up using the above configurations.
alias grep='grep --colour=always'
export GREP_OPTIONS=$_GREP_OPTIONS
alias ls='__ls $LS_OPTIONS'
alias df='df -h'
alias du='du -h'
alias free='free -m'
if [ -z "$MAKEFLAGS" ]; then
   export MAKEFLAGS="-j $(( ( ${NCPU} * 5 ) / 4 )) -l $(( ${NCPU} * 3 ))"
fi

This section takes all my default settings, which are modified above for specific OSes and machines, and exports them. Some things, like df using human readable units by default, are equally well supported everywhere and so aren't (yet) factored out.

One thing I do for make is to have the default number of parallel jobs be 5/4 times the number of cores. On one or two core machines this is equal to the number of cores, but on machines with more cores it's a bit higher to ensure that the machine is fully loaded. Sometimes IO or other delays will result in a few jobs not consuming a full core worth of CPU time, so I oversubscribe the CPUs a bit to compensate. I also keep make from starting new jobs when the load average exceeds three times the number of cores. This is high enough that I'll push out any nice'd processes, but not so high that I make the box unresponsive. These settings really help when compiling large programs on 32 core or greater boxes which have background builds and tests running.
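The resulting flags for a few core counts, using the same integer arithmetic as above:

```shell
for NCPU in 1 2 4 8 32; do
        echo "NCPU=$NCPU -> -j $(( (NCPU * 5) / 4 )) -l $(( NCPU * 3 ))"
done
# NCPU=1 -> -j 1 -l 3
# NCPU=2 -> -j 2 -l 6
# NCPU=4 -> -j 5 -l 12
# NCPU=8 -> -j 10 -l 24
# NCPU=32 -> -j 40 -l 96
```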

if [ -z "$USE_FANCY_KEYBOARD" ]; then
    export LESSKEY="/dev/nonexistant_file"
fi

If I don't have my fancy keyboard then I want to use the standard QWERTY less key mappings. My .lesskey contains the mappings for my non-standard keyboard.

# Don't overwrite the interactive login info if this isn't interactive. Say if
# we are logged in somewhere and then scp a file over
if [ -n "$PS1" ]; then
    create_update_config
fi

You may not have noticed, but all my bash configuration is in this single file. I find this much more convenient than having to factor it out further into things which are to be used during an interactive shell versus not. Instead I simply skip the few bits which are not appropriate in a non-interactive shell and symlink all the other bash rc files, such as .profile, to this file. This section skips one of those interactive only chunks.

if [ -z "$DARK_TERM" -a -z "$LIGHT_TERM" ]; then
    # Setup a default of light on dark terminals since that's what I use most of
    # the time. Eventually I may get some autodetection working. At least on
    # some platforms. Background/foreground colour detection is a bit of a
    # forgotten corner of Unix terminal interaction. Especially since most
    # terminals claim to be xterm.
    export DARK_TERM="DARK_TERM"
fi

This is the export portion of the nonfunctional terminal colour scheme detection code.

# Second half to the $TERM flag ssh passthrough
alias ssh="TERM=\"\${TERM}:\${USE_FANCY_KEYBOARD},\${USE_TINY_KEYBOARD},\${DARK_TERM},\${LIGHT_TERM}\" ssh"

In order to export my shell flags over ssh I set up this alias to pack the flags into TERM before ssh'ing. This can be slightly annoying if you ssh to an account without my configuration because it won't know about the compound terminal type. I usually have a script which sshes to the correct machine without the compound TERM though, so this is rarely an issue. When it is, I fix it with an "export TERM=xterm" on the remote end.

# Only set PS1 if there is one already set so that we don't set one in a non-interactive shell

PROMPTHOSTNAME=`echo ${HOSTNAME} | sed 's/\..*$//' | tr '[:upper:]' '[:lower:]'`
PROMPT='${PROMPTHOSTNAME}:$(pwd_len_limited) \[\033[1;37m\]$(is_vim_running_locally)\[\033[0m\]'

if [ -n "$PS1" ]; then
        PS1="${TITLEBAR}${PROMPT} "
fi

The last thing my bashrc does, if running interactively, is to combine my prompt pieces and set the prompt. Some of the machines I work on have uppercase hostnames (ick!) so I convert all hostnames to lowercase here. My prompt isn't as ornate as some, but I find it functional. The only piece which I haven't described here is the control characters which make the final character of my prompt bold for an easy visual marker of where my prompt ends and my command begins.

Too Much About Distributed Bug Tracking

Distributed bug tracking is a topic which had a burst of interest in 2008 and again in 2010. Unfortunately not much has come of it since then, and there is a lot of misunderstanding. Distributed bug tracking is also a relatively new concept, so there are many facets which haven't been fully thought through or which are being re-invented repeatedly. This is an attempt to collect and explain all the major issues and approaches to distributed bug tracking seen in software to date. The intention is to serve both as a starting point for those looking to use a distributed bug tracker and as a summary of the major issues for those considering writing, or just wanting to understand, distributed bug trackers. There is also a comparison of existing software and some possible use cases for the reader interested in using a distributed bug tracker as part of a project.

Before diving in it's important to note that there are actually two definitions of distributed bug tracking competing for the term. The older, which I'll be discussing in detail below, is tracking and distributing bug information in a distributed manner, much like you can track and distribute source code using a distributed version control system such as Git or Mercurial. The second definition is distributing bugs between many more traditional centralized bug trackers such as Bugzilla or Jira. I won't cover the latter definition here, but perhaps in a later post, as the rise of DVCS-like distributed bug tracking will drastically increase the need for inter-tracker bug synchronization.

Software


Over the past five or six years several distributed bug trackers have been written which explore various aspects of the domain. Most of these have issues ranging from minor to major. Here I've listed all the distributed bug trackers I was able to find in the course of my research into the topic. In a later section I'll go over a matrix of their capabilities and designs.

As you can see there is no lack of early projects exploring distributed bug tracking. Later I'll compare them to each other but first I will discuss the various dimensions and design decisions which go into a distributed bug tracker and are expressed in the above software.

Design Considerations


There are several aspects of distributed bug tracking which have parallels with traditional centralized bug tracking, such as which fields a bug should have, and several which are distinct, such as how bugs are stored relative to branches. This section will discuss only those shared issues which have direct relevance to distributed bug tracking. Issues such as bug priority policy will not be discussed as those don't differ between centralized and distributed bug tracking. Issues unique to distributed bug tracking will also be discussed.

On-Branch, Off-Branch or Out-of-tree


The first issue which comes up when people first ponder distributed bug tracking is where, with respect to the code, the bug database should be stored. There are three common options. The first and most popular is to store the bugs next to the source code in a separate directory in the source VCS. This is attractive because the developers already have that source available and it's easy for the tracker developer because if there is any VCS support required it is limited to basic content tracking commands such as add and commit. Using a VCS also lets the distributed bug tracker developer leverage the existing VCS synchronizing and merging capabilities. Further it allows bug information to follow the code across branches.

This latter ability is one of the great possibilities that distributed bug tracking brings to the table. Large complex projects with several development and maintenance branches often have difficult or complex ways of tracking whether a particular branch has the fix for a particular bug. The best systems track which commits fix a particular bug and then leverage the VCS to determine whether a particular branch has that change. Other systems use multiple bugs or other manually maintained fields to store such information for release and maintenance branches; development branches are usually too much work to cover with these manual systems. In the worst case the source of information is the original developer being asked to examine the branch to see if a particular fix exists there.
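The VCS query at the heart of that best-case approach is cheap in any VCS with real merge tracking. A sketch using git and a throwaway repository (the bug number and file contents are hypothetical):

```shell
set -e
dir=$(mktemp -d) && cd "$dir"
git init -q
git config user.email you@example.com
git config user.name you

echo 'int main(void) { return 1; }' > main.c
git add main.c && git commit -qm 'initial'
trunk=$(git symbolic-ref --short HEAD)   # master or main
git branch maint-1.0                     # maintenance branch forks here

echo 'int main(void) { return 0; }' > main.c
git commit -qam 'Fix bug #1234: wrong exit status'
fix=$(git rev-parse HEAD)

git branch --contains "$fix"   # only the trunk, so far
git checkout -q maint-1.0
git merge -q "$trunk"          # pull the fix into the maintenance branch
git branch --contains "$fix"   # now lists maint-1.0 as well
```

Given a commit-to-bug mapping, `git branch --contains` answers "which branches have this fix?" directly, which is exactly the lookup the manual systems reimplement by hand.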

Obviously all the lesser traditional approaches have their issues. However, even the best traditional method depends heavily on the VCS being able to effectively determine whether a change exists on a particular branch across a wide array of obstacles, including complex merges, rebases, double commits and changes passed around as patches which may be manually reapplied. This is a difficult proposition and the coverage of supported cases will inevitably have holes.

On-branch storage has the further advantage that the bug database can follow the code through source tarballs and packages as they are distributed and incorporated into distributions. It is also possible, with greater or lesser merge troubles, to have the bug data follow fixes along in the patches.

The on-branch strategy is not without disadvantages, most of which are trade-offs for the advantages gained. The aforementioned bug data in patches is one such disadvantage. Since the bugs are stored beside the code any diffs or patches will, by default, contain change information related to the bugs as well. This is not always desirable and results in extra work to clean up patches or ignore bug changes. Similarly having the bug status track the code through various branches is a useful feature, but brings about the challenge of producing a summary view across the various release and trunk branches. It is also not immediately obvious where bugs against a particular release version should be filed or how to determine which branches have a fix if any have it at all.

Another alternative is to store the bugs inside the VCS in a separate branch. This approach results in a system which is more similar to the traditional centralized bug tracking paradigm. Designed this way there is only one bug database, of which any particular copy of the repository will have a more or less up to date version.

Off-branch bug storage solves some of the issues related to on-branch storage, namely issues related to where a bug should be entered, keeping bug data out of patches or diffs and, as will be discussed later, how to get descriptions of bugs onto the branches where the bugs are. Similarly off-branch storage has as its disadvantages many of the advantages of on-branch storage. In particular off-branch storage does nothing to help track the state of a bug on any particular code branch.

Off-branch storage also suffers a few disadvantages of its own. By storing the bugs away from the code in a separate branch, extra care must be taken to ensure that the bug branch is propagated. For example, systems such as git don't automatically push and pull branches other than the current one. This can lead to a project being pushed, to GitHub say, without the bug database being included. As we'll see later this is one aspect which may have contributed to the low recommendation scores of some of the existing distributed bug tracking software, since it may be the reason several appear to not dogfood themselves.

Off-branch storage will also have difficulty transferring between different version control systems. Though it may feel like everybody uses git all the time, it just isn't true. Unfortunately how branches work differs across VCSes both semantically and with respect to the interface. This can cause limitations with entities, such as Linux distributions, integrating the upstream bug repository.

The least favoured storage method is neither on-branch nor off-branch, but out-of-tree. With out-of-tree storage the bug database is stored in some other fashion, either inside the VCS or using some other external database. One example of this is Fossil, which stores the bugs as part of its distributed database, but not really in a separate branch at all. Other examples are systems which take advantage of the git-note capabilities. These systems have the advantage of being clean since they don't have the clutter of bug directories or bug branches. Unfortunately that is really the only advantage they have. Storage of this form tends to be tightly integrated with a single VCS and usually even more care must be taken to ensure that the bug databases are propagated and merged correctly than in the off-branch case.

One advantage shared between off-branch and out-of-tree is that they hold the possibility of using custom merge algorithms. If bugs are stored on-branch then they must be merged alongside the source code and thus, for the most part, must use the standard source control merging algorithms. This will constrain the file formats of the bug database to forms which are feasible for basic textual merges to be successful and relatively easy for humans to merge manually when conflicts arise. Off-branch and out-of-tree, in contrast, hold the promise of using custom merging algorithms. This is theoretically possible with off-branch storage, depending on VCS support, and the norm with out-of-tree storage.

File Formats AKA Ease of Merging


Traditional centralized bug tracking has great freedom in how its data is structured and represented on disk. It is perfectly acceptable to require specialized tools to read the data, and the data is optimized to be processed by trusted and properly configured server software; any exceptions will be handled by prepared system administrators who take the utmost care. Distributed bug tracking has none of these freedoms.

Distributed bug tracking must operate in a world where the tracker doesn't have full control over what happens to its data or who has permission to change it. As we'll come back to later, distributed bug trackers cannot rely on authorization to ensure that only permissible states are entered; the best they can do is verify changes before they are integrated into the local bug database. As such, one important aspect of the file formats chosen by distributed bug trackers is that they must be difficult to corrupt.

Only a minority of existing distributed bug trackers have the ability to rely on specialized merging algorithms. Mostly these are out-of-tree based or built upon specialized databases. The rest must at least perform acceptably without the benefit of custom merging code. This is especially true of on-branch trackers, where the bug changes will pass through the standard code merging algorithms, and mostly true of off-branch trackers, where the bug branch will likely have at least a few hops where the specialized merge tool is not installed.

The two important aspects of distributed bug database file formats are how well they merge automatically using the standard textual merge tools and, since conflicts are sometimes unavoidable, how easily they can be resolved by humans. Conflicts are unavoidable in all cases because some data about a bug, such as whether it is resolved or not, is semantic and singular. A bug is either declared fixed or not. Consider the case of bug A and a tracking policy which has three possible bug states: New, Diagnosed and Fixed. Suppose Alice, in her branch or repository clone, fixes bug A and marks it as fixed in her copy. Suppose concurrently Bob, in his branch or repository clone, looks at the bug and figures out what's wrong so he marks it as Diagnosed. If Bob later pulls in Alice's changes he will receive a textual conflict related to the bug state. If a custom merge algorithm could be used this wouldn't be an issue since Fixed obviously overrides Diagnosed.
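As a sketch of how trivial such a custom merge could be, here is a hypothetical three-way merge of the singular state field; the state names and their ordering are the ones invented for the example above, not from any real tracker:

```python
# Hypothetical three-way merge of a singular bug-state field.
# States later in the list take precedence, so Alice's "Fixed"
# wins over Bob's concurrent "Diagnosed".
STATE_ORDER = ["New", "Diagnosed", "Fixed"]

def merge_state(base, ours, theirs):
    """If only one side changed the state, take it; if both did,
    take the state furthest along the lifecycle."""
    if ours == base:
        return theirs
    if theirs == base:
        return ours
    return max(ours, theirs, key=STATE_ORDER.index)

print(merge_state("New", "Diagnosed", "Fixed"))  # Fixed
```

The point is that the policy fits in one line; the hard part, as the next paragraph shows, is that no such policy exists for every field.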

Though the above case could be solved by a custom merge algorithm, there are cases where it is not clear that any algorithm can always make the correct merge. Consider the case of the customer severity of a bug. Alice may mark a bug as Minor because it only affects two or three customers. Bob might, however, mark it as Critical because one of those few customers is the biggest customer the company has. No mere computer could ever have all the relevant information to always make the correct choice.

With these two aspects in mind there are several different file formats which have seen use in the software I've found. These can be arranged along two dimensions: the first is the format of each file and the second is what is contained within these files. Out-of-tree storage designs won't be covered here since they tend to demand custom merging utilities anyways and be based upon more complex databases.

The most common file format appears to be a simple markup. Simple markups rate highly for ease of human resolution since there isn't a finicky file format to worry about. They tend to be rather inflexible and difficult to code for, however. Most of the formats in this class are too simple to have a name, or else look much like the INI format.
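For illustration, a file in this class might be a few key: value lines followed by a free-form description; the format and field names here are invented, but a parser is only a few lines:

```python
# Parse an invented INI-like bug file: "key: value" lines, a blank
# line, then a free-form description. Real formats vary per tracker.
SAMPLE = """\
title: Crash when saving
state: New
severity: Minor

Saving a file with no name crashes the editor.
"""

def parse_bug(text):
    head, _, body = text.partition("\n\n")
    fields = dict(line.split(": ", 1) for line in head.splitlines())
    fields["description"] = body.strip()
    return fields

print(parse_bug(SAMPLE)["state"])  # New
```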

The second most popular format seems to be a hierarchical markup akin to YAML. This differs from a simple markup in that the format is more complicated, but also more flexible. While these formats don't rate badly in terms of human conflict resolution there is a risk of a missing significant character causing issues.

The least popular appears to be full serialization formats such as JSON or XML. Unless pretty printed these are nearly impossible to manually resolve. With pretty printing these serialization formats tend to be merely error prone and tedious. One technique I have seen is to use JSON with each data element separated from any other via five or six newlines. The intent here is to reduce the possibility of a merge conflict by removing the other data in the JSON file from the context of the merge.
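A sketch of that padding trick, with the field names invented: each field becomes its own single-line JSON object with several blank lines around it, so the context lines of a textual merge around any one change never include another field.

```python
import json

# Sketch of the newline-padding trick: each field of the bug record is
# written as its own single-line JSON object with six newlines between
# fields, so a textual merge's context lines never span two fields.
PAD = "\n" * 6

def dump_padded(bug):
    return PAD.join(json.dumps({k: bug[k]}) for k in sorted(bug))

def load_padded(text):
    record = {}
    for line in text.splitlines():
        if line.strip():
            record.update(json.loads(line))
    return record

bug = {"id": "a8d82", "state": "New", "severity": "Minor"}
assert load_padded(dump_padded(bug)) == bug
```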

The file format chosen is perhaps the greatest determiner of how often automatic merging will be successful and how much pain the human will suffer when automatic merging fails. From this perspective alone the simple markup seems the best possible choice. Since such formats tend to be one statement per line and have minimal grammar requirements, automatic merges tend to corrupt them the least, and they are the easiest for a human to merge manually, especially when the lines in the file are kept in a fixed order and so produce nice diffs.

There are also three major ways to arrange the storage of bugs among a number of files. The simplest from a file layout point of view is to store the entire bug database in a single file. This has the advantages of efficiency, speed and ease of coding. As a disadvantage, every change to any bug will modify this file, ensuring that it will have to be merged constantly. This option is not used in many of the existing distributed bug trackers.

The most popular file layout appears to be one file per bug. This has the advantage of reducing conflicts since it is less likely that two developers will modify the same bug than two bugs in the same database. If the tracker restricts itself to singular semantic data only, such as bug state, then this can work well since any concurrent changes to the data would have to be manually merged in any case. If the tracker supports things like bug comments then this format is still open to frequent file merges as different people comment on the same bug at different times. Unfortunately bug comments in a single file will cause frequent merge conflicts until the number of existing comments becomes sufficiently large. At that point it is possible to place new comments into the file randomly to give the automatic merge the best possibility of success. Most bugs do not accumulate more than a dozen comments however.

The final common layout is to use (almost) immutable objects. In this scenario each issue has a number of files. All or most files will be immutable. One way to accomplish this is to put each comment into a separate, immutable file and give each bug one small mutable file which contains the singular semantic data. Since concurrent comments are common and, in principle, easily merged automatically the comments would be trouble free. Since singular data is impossible to automatically merge in all cases the file being mutable gives the human the full power of their VCS to help them determine the correct semantic resolution. An alternative is to use fully immutable objects and a log-like structure where newer objects override older objects. Such a system is capable of always merging automatically, but when the merges are incorrect, as in the Fixed/Diagnosed example above, the human is left with minimal tools to determine the correct resolution or even receive any indication that a conflict occurred which requires their attention.
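A minimal sketch of the mostly immutable layout, with file names and layout invented for illustration: comments become write-once files named by content hash while the singular data lives in one small mutable file.

```python
import hashlib, os, tempfile

# Sketch of a mostly immutable layout: one write-once file per comment,
# named by content hash, plus one small mutable "state" file holding
# the singular semantic data. Names and layout are invented.
def add_comment(bug_dir, text):
    name = "comment-" + hashlib.sha1(text.encode()).hexdigest()[:16]
    with open(os.path.join(bug_dir, name), "x") as f:  # "x": never overwrite
        f.write(text)
    return name

def set_state(bug_dir, state):
    with open(os.path.join(bug_dir, "state"), "w") as f:  # mutable, may conflict
        f.write(state + "\n")

bug_dir = tempfile.mkdtemp()
add_comment(bug_dir, "Crash when saving.")
add_comment(bug_dir, "Stack trace attached.")
set_state(bug_dir, "Diagnosed")
print(sorted(os.listdir(bug_dir)))  # two comment-* files plus "state"
```

Concurrent comments from two clones then add distinct files, which any VCS merges cleanly; only the little "state" file can ever conflict, which is exactly where human attention is wanted.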

In allowing the maximum number of successful automatic merges and immediately bringing semantic conflicts to the user's attention, the mostly immutable object method appears superior. Successful automatic merges have much less friction than the alternative, which is important to support the adoption of distributed bug tracking.

In summary it would seem, at this time, that a series of mostly immutable objects in a simple markup format would be the best available choice for the backend bug storage format.

Process Automation


Centralized bug trackers tend to support process automation. Process automation is the ability of the bug tracker to ensure that a bug goes from New to Assigned to Resolved and, after being assigned back to the reporter, to Closed. Many projects use this to implement complex bug life cycles and bug handling processes. Distributed bug trackers don't have the luxury of supporting this feature in a reliable manner. There are two central reasons for this.

The first is that while centralized bug trackers operate on centralized and controlled servers, distributed bug trackers run on the developers' own machines. The developer won't be able to short circuit the twelve step bug process on the server, but if they are aggravated enough they'll disable the process enforcement code on their own copies of the repository. With no way to trust that every step has been performed in an allowable order, the only way to confirm the process has been followed is to verify after the fact.

Unfortunately this verification comes with its own problems, even if the developers follow the process locally. Merging state between concurrent modifications can, depending on the complexity of the bug process, result in invalid or at least ambiguous states. Merging the output state of two identical but independently run state machines is not guaranteed to result in a valid state of the state machine. It is possible to verify that a valid state has been reached as the result of a merge, but that will involve manual resolution, often of a frustratingly tedious nature. Merging of bugs makes it difficult to maintain a verified bug state since the transitions cannot necessarily be observed.
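After-the-fact verification can be sketched as a reachability check: given the allowed transitions, is the merged state actually reachable from the common ancestor state? The transition table below is illustrative, not taken from any particular tracker.

```python
# Verify a merged bug state after the fact: is it reachable from the
# common ancestor state under the allowed transitions? The transition
# table is illustrative only.
TRANSITIONS = {
    "New": {"Assigned"},
    "Assigned": {"Resolved", "New"},
    "Resolved": {"Closed", "Assigned"},
    "Closed": set(),
}

def reachable(start, goal):
    seen, frontier = set(), [start]
    while frontier:
        state = frontier.pop()
        if state == goal:
            return True
        if state not in seen:
            seen.add(state)
            frontier.extend(TRANSITIONS.get(state, ()))
    return False

print(reachable("New", "Closed"))   # True: a valid merge result
print(reachable("Closed", "New"))   # False: the merge needs human attention
```

Note this only tells you the merged state is plausible, not that the intermediate transitions were ever actually taken, which is exactly the weakness described above.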

In the end it seems that bug tracker automation will either be done mostly with wrapper tools or VCS hooks. As with DVCS hooks versus CVCS hooks I believe we'll find that distributed bug tracking results in the adoption of less stringent processes and additional trust put into the users of the bug database because only after the fact can hooks be executed at a canonical repository.

Comments, Attachments and Fields


The oldest form of distributed bug tracking is a TODO file committed beside the code. This is usually a simple list of tasks or bugs to be fixed, perhaps with a single brief comment explaining the issue in detail. This is the simplest form of bug tracking, just a list of titles, maybe with a description. At the other end there are massively complex centralized systems with bug processes, multiple comments, attachments and more fields, both free form and constrained, than you can shake a stick at.

Distributed bug tracking covers this entire range. Simple TODO lists are not very interesting because they are simple and quite limited; massively complex systems are unlikely to succeed as distributed bug trackers for the reasons described in the previous section. Most interesting is the middle ground along the lines of a basic Bugzilla installation. Such a bug tracker supports a handful of useful fields: severity, component, state, owner, etc. It also supports comments and attachments on bugs. Systems of this moderate complexity are commonly found in open source applications and smaller corporations.

The handling of the metadata fields is not terribly complex. These are the singular semantic data concerning a bug which computers will find difficult to correctly merge in all situations. Having a large number of these is not an engineering challenge, but beyond some number it will strain the patience of the developer and be ignored. A large number of metadata fields may also not be as useful in distributed bug tracking as centralized bug tracking. Since relational database formats are troublesome when it comes to distributed bug storage many of them use less structured file formats, thus running arbitrarily complex queries on the bug database is cumbersome, often requiring parsing hundreds or thousands of files into memory before checking each record in a loop. This is more difficult than simply using existing text processing tools to run regex queries on the database. If a tool like grep is used then there is no point in having a field for every possible situation since all the comments will be searched anyways. This being the case I believe that only the most useful of fields will be formalized with any other data being put into a structured form appropriate for the project and placed into comments.
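The grep-style query amounts to nothing more than scanning every bug file for a regular expression; a sketch, with the directory layout and field names invented:

```python
import os, re, tempfile

# The "just grep it" query style: scan every bug file under a directory
# for a regular expression instead of maintaining a query engine.
def grep_bugs(root, pattern):
    rx = re.compile(pattern)
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, errors="replace") as f:
                if rx.search(f.read()):
                    hits.append(path)
    return hits

root = tempfile.mkdtemp()  # stand-in for a bugs/ directory
with open(os.path.join(root, "bug1"), "w") as f:
    f.write("severity: Critical\nCrash when saving.\n")
with open(os.path.join(root, "bug2"), "w") as f:
    f.write("severity: Minor\nTypo in help text.\n")
print(grep_bugs(root, r"severity: Critical"))  # only bug1 matches
```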

The issue of attachments is also not complicated other than the fact that most existing distributed bug trackers ignore this feature entirely. This is likely just an oversight due to the relative immaturity of the field. Attachments play an important role in the operation of a bug tracker by being able to store data that is too large to fit conveniently into a comment. Examples of this include logs or configuration files.

Comments in a bug tracker are a critical collaboration feature. Comments allow one developer to communicate through time to either themselves, users, watchers or other developers. They provide an organized place to maintain discussion and investigation notes concerning a bug.

One particular issue related to comments and distributed bug tracking is comment order. Many bug trackers use a flat comment model where comments are made in a linear order. In a centralized model this works well since there is a definite order to the comments and there are only small windows where a comment can be posted while another is being prepared. In fact, many bug trackers detect this situation and prevent submitting the latter comment until the user has read the former comment. This is a form of real time merging. Because a consistent linear order is maintained in the views of different users the comments can constructively reference each other. However, in a distributed world concurrent comments will be the norm rather than the exception and not until much after the fact will it be possible to determine a canonical comment ordering.

It is not an insurmountable challenge to make flat comments work in a distributed world, but it is also not clear that it is the best way. One alternative is to work along the lines of email, where you respond to particular comments in a tree. This can then be displayed in a nested fashion, making it clear which comments reply to which parents. Perhaps this might ease the difficulties of creating a consistent canonical ordering. One additional requirement of a nested presentation might be the necessity of showing the user which comments are new when they revisit a bug thread. None of the trackers I investigated appear to support this at the moment.

User Interfaces


The most popular bug tracker user interfaces are web interfaces. A web interface is convenient for centralized trackers because it is graphical in nature and has an easy communication path from the centrally controlled web server to the centrally controlled database server, often the same machine. The web interface also provides realtime feedback. The less common, but nonetheless effective, interface types often seen are CLI interfaces, email interfaces and GUI interfaces. Often these are used in concert with a web interface.

Of these there is no intrinsic argument against any but email interfaces. It is too burdensome to expect a developer to always have local email configured and to integrate every project or branch of a project into such a system. Most of the distributed bug trackers offer a CLI interface. This is a popular option because most interactions with a bug tracker during development are changing the state or commenting on a particular bug. For these purposes a CLI is more than adequate. CLI interfaces also have the great advantage of fitting well with the other CLI development tools such as VCSes, editors, build systems and test runners. CLI interfaces are also easy to script which allows developers to automate or integrate the tracker with other tools, such as their editor.

In general CLI interfaces are very convenient for the developer who is working on a bug. They are less convenient if the developer has to wade through a list of bugs to find a particular one or otherwise navigate a large amount of data. CLI interfaces are also entirely inappropriate for users of a project. It is unreasonable to expect a user to check out the source repository and use a CLI to see what bugs a project has or the current state of their particular bug.

Many of the disadvantages of the CLI interface could be ameliorated with a curses interface to allow interactive navigation and modification of issues. However this would still limit the interface to textual information. A related approach which offers additional flexibility is to support a local web browser interface. If this interface has reasonable support for terminal browsers then the effect can be almost as good as a dedicated curses interface, with the advantage of supporting GUI browsers with all the niceties that entails.

Distributed bug tracking brings one additional wrinkle to a web browser interface. If the bug tracker is running locally and stores its database near the source code, then having multiple concurrent users against one instance brings numerous difficulties. Among these are handling commit attribution and avoiding conflicts. VCSes provide tools to do this when one user at a time uses a checkout, but tend to provide no help on a finer level. The result is that any bug tracker intending to support this will likely end up reimplementing much of the isolation support of formal databases. Since this isn't required in the common case of a single developer working on their own checkout, this seems to be wasted effort.

Thus many distributed bug trackers will have two web interfaces if they have any. One will be for local use and one will be for public use. I am not aware of any existing distributed bug tracker which provides a read-write public web interface and stores the bugs either on-branch or off-branch, but there are several which have a readonly public interface. In a later section I will discuss possible ways in which this can be made to work when interacting with the public. If one is to write a read-write public web interface there are several design issues which need to be thought through first.

The first of these is how to get bug changes from the webserver to the source repository. A traditional centralized bug tracker stores its bugs in a mutable database. This allows data to be deleted at will. Additionally the integrity of a separate bug database is usually not considered as critical as the project's source repository. Thus if a malicious user comes along and fills the centralized tracker up with hundreds of megabytes of bugs, the effects are relatively minor and a system administrator can easily delete the greater portion of the mess. If a distributed bug tracker stores its database in the VCS then it may not be possible to permanently delete junk data. It could be made to not appear in recent versions, but would still exist in the immutable history. A rapid increase in size could also cause severe problems as a source checkout which was less than a megabyte suddenly turns into one several gigabytes in size, even if the checkout size later decreases back to the original size after a cleanup.

The second is related to the interface for resolving conflicts between the public interface and the canonical bug repository. Since distributed bug tracking is distributed many bug changes can happen concurrently only to be merged later. This is handled using VCS capabilities in the developer case, but it is likely that using a VCS backend to a public web interface would be cumbersome or be used differently because of the possibility of many concurrent public users. If VCS help isn't possible in the same way as the developer use case then a separate tool might need to be provided to pick and choose which public changes pass moderation.

The rest of the major issues in considering a read-write public interface are those of any other public web site with user generated content and won't be covered here.

One possible solution to this is to have some sort of staging system where the new data from the public interface is manually vetted before inclusion in the permanent copy of the bug database. This moderation would need either to be performed frequently or to have the unmoderated modifications appear on the public tracker immediately to ensure the public users receive timely feedback.

Though there is no specific reason a dedicated GUI could not be written, none of the major software described does so. This is likely partially due to the effort required compared to a web interface or CLI. Modern web technologies coupled with a single user web server would seem to provide nearly all the advantages of a GUI with significantly better portability and reduced development effort.

Bug Identification


As with the change from centralized VCSes to DVCSes global identification is a tricky subject. It is undeniable that the traditional linear numbering of bugs is an obvious method, where possible, and easier to remember when the numbers are small. Unfortunately such a system cannot be globally unique in a distributed world.

As with DVCSes there appears to be no alternative to random or pseudo-random identifiers, such as cryptographic hashes. This has proven to not be overly burdensome in practice as long as the tracker attempts to disambiguate hashes from a subset of the full string. For example the tracker should be able to determine that the bug identified by a8d82 is actually a8d82ff764188578 as long as the shorter prefix isn't shared by more than one full hash.
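The disambiguation itself is trivial to sketch; the identifiers below include the one from the example plus a second, invented ID:

```python
# Expand a short hash prefix to the single full bug ID it matches,
# refusing ambiguous or unknown prefixes.
def expand_prefix(prefix, bug_ids):
    matches = [b for b in bug_ids if b.startswith(prefix)]
    if len(matches) == 1:
        return matches[0]
    kind = "ambiguous" if matches else "unknown"
    raise LookupError(kind + " prefix: " + prefix)

ids = ["a8d82ff764188578", "a8d91c02e51b33f0"]
print(expand_prefix("a8d82", ids))  # a8d82ff764188578
```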

There are several methods that can meet this need, but the most common are encoded UUIDs and cryptographic hashes of the contents. Obviously these should be presented in a human readable format such as base-64 or hexadecimal. Base-64 has the advantage of being a denser representation, but it suffers from using most of the keyboard characters and both upper and lower case letters. The latter can cause trouble with typing correctly or with passing through systems which may stomp on the case. Hexadecimal, on the other hand, is slightly less dense, but doesn't suffer from the case problem. Also, since it uses a limited set of characters it is easier to include in identifiers in other systems, such as version codes.

It seems that there is room for an encoding of the pseudo-random hashes which is both denser than hexadecimal and yet avoids the major issues of base-64. Perhaps something like base-36 (0-9a-z) would fit the bill, though some of those characters may be difficult to type on some keyboards in some languages.
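Such a base-36 rendering is easy to sketch. As a concrete measure of the density gain, a 64-bit value needs at most 13 base-36 digits versus 16 hexadecimal ones:

```python
import hashlib

# Render a pseudo-random identifier in base-36 (0-9a-z): denser than
# hexadecimal, yet single case and keyboard friendly.
DIGITS = "0123456789abcdefghijklmnopqrstuvwxyz"

def to_base36(n):
    if n == 0:
        return "0"
    out = []
    while n:
        n, r = divmod(n, 36)
        out.append(DIGITS[r])
    return "".join(reversed(out))

# A 64-bit value takes at most 13 base-36 digits versus 16 hex digits.
h = int.from_bytes(hashlib.sha1(b"example bug").digest()[:8], "big")
print(to_base36(h))
```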

The inability to support linear numerical identifiers would seem to be a severe disadvantage. For projects with a small number of bugs, less than one or two thousand say, this is definitely the case. Beyond that number however the situation is less clear cut. When the number of required digits in an identifier is greater than four or five, or the new bug rate is more than a handful per day, then the numbers themselves become more difficult to remember and lose meaning. On many large projects bug IDs are copied and pasted anyways since they are difficult to remember and easy to mistype. A similar situation appears to have won out in the DVCS world: large projects will have millions of commits and in such a situation a linear numbering scheme is no easier to tell apart than the pseudo-random hashes which replaced them.

Software Comparison


Above I've listed all the distributed bug tracking software, both defunct and active, I could find. In this section I will compare them briefly. First I will make any notes about the software and then I will have a summary table of the major aspects. Most of the aspects which are specific to distributed bug tracking have been discussed and explained above. After all the software has been described individually I will compare the most usable software in a table.

Artemis


Artemis is a basic tracker built as a Mercurial extension. It has pretty complete filtering options including the ability to store custom filters.

Last commit/release: Feb 2012
Language/Runtime: Python / Mercurial plugin
Bug storage: On-branch
Dog food: Yes
CLI: Yes
Local Web UI: No
Public Web UI: No
GUI: No
File format: Maildir per issue
VCSes: Mercurial
Custom Fields:
Comments: Nested
Attachments: Yes
BugID: Hash
Multiuser: No
Bug Dependencies: No

b


b is another Mercurial extension with a simpler model than Artemis. Note that the last release is quite old, but the development tree has activity as of late last year. b is based off the t extension but adapted to provide for more bug tracker-like use cases. b doesn't provide a public website itself, but the hgsite extension will take a b bug database and produce a simple static website from it.

Last commit/release: Oct 2012
Language/Runtime: Python / Mercurial plugin
Bug storage: On-branch
Dog food: Yes
CLI: Yes
Local Web UI: No
Public Web UI: Readonly via hgsite
GUI: No
File format: Sectioned text fields
VCSes: Mercurial
Custom Fields: No
Comments: Yes
Attachments: No
BugID: Hash
Multiuser: Yes
Bug Dependencies: No

Bugs Everywhere


Bugs Everywhere is likely the most mature of the distributed bug trackers. It has a reasonably active user base and seems to have most of the features to be expected of a distributed bug tracker. The project has had multiple contributors and is currently on its third maintainer since 2005. Bugs Everywhere additionally has an email interface, which is rare among distributed bug trackers.

Last commit/release: March 2013
Language/Runtime: Python
Bug storage: On-branch
Dog food: Yes
CLI: Yes
Local Web UI: Yes
Public Web UI: Readonly
GUI: No
File format: JSON, one file per comment
VCSes: Arch, Bazaar, Darcs, Git, Mercurial, Monotone, Others possible
Custom Fields: No?
Comments: Yes
Attachments: Yes
BugID: UUID
Multiuser: Yes
Bug Dependencies: Yes

cil


cil is another small CLI only distributed bug tracker. It provides some basic integration with Git, but can also be used with other VCSes as long as you are willing to add and commit changes to the bug repository manually.

cil uses a unique bug repository format where every issue and comment has a file inside a single directory. Each issue and comment has a link to its children or parent. Thus adding a comment may cause a merge conflict in the issue file if another comment was added concurrently, but it will be restricted to the references.

Last commit/release: Oct 2011
Language/Runtime: Perl
Bug storage: On-branch
Dog food: Yes
CLI: Yes
Local Web UI: No
Public Web UI: No
GUI: No
File format: Simple key-value-freeform markup
VCSes: Git-supported but not required
Custom Fields: No
Comments: Yes
Attachments: Yes
BugID: Hash
Multiuser: Yes
Bug Dependencies: Yes

DisTract


DisTract is one of the older distributed bug trackers, but it seems to have fallen off the Internet. You can find the last copy of the site at Archive.org. DisTract is interesting in that it doesn't provide a CLI; instead all bug interactions are performed from within a page in Firefox (and only Firefox), which uses Javascript to access the filesystem directly. Unfortunately, since I have been unable to find any copies of DisTract, not a lot is known about it.

From the archived website it was clear that the author intended to have a bug specific merge algorithm, though it seems unlikely that ever came to pass.

A bug tracker which didn't make my list because it requires realtime access to a central repository but takes a similar implementation view is Artifacts for Web. The bug tracker runs locally in the browser but all the bug storage happens on a central SVN server directly.

Last commit/release: mid-2007
Language/Runtime: Haskell / Javascript / Firefox
Bug storage: ?
Dog food: Yes
CLI: No
Local Web UI: Yes
Public Web UI: ?
GUI: No
File format: JSON?
VCSes: Monotone
Custom Fields: ?
Comments: ?
Attachments: ?
BugID: ?
Multiuser: ?
Bug Dependencies: ?

DITrack


DITrack is the first off-branch distributed bug tracker in this list. DITrack is interesting in that it only supports SVN, a centralized VCS. As such several of its design features are rare. The first is a linear bug ID scheme: bugs are numbered sequentially. Each issue is a directory of multiple files, each numbered in sequence and apparently immutable; each issue is thus the sum of the log-type entries from its files. While sequential numbering has obvious problems in a decentralized system, the log structure presents an interesting solution to the merging problem. Since the bug is the combined last state of the various fields from the log, there need never be any manual merging; regular file merging will always automatically result in a last-wins bug metadata merging strategy.
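
The last-wins fold described above can be sketched as follows; the field names and entry layout are illustrative, not DITrack's actual format.

```python
# Sketch of log-structured issue storage in the DITrack style: an issue
# is a set of immutable, sequentially numbered entries, and its current
# state is the fold of those entries with later entries overriding
# earlier field values (last-wins).

def issue_state(entries):
    """Fold numbered log entries into the issue's current field values."""
    state = {}
    for _, fields in sorted(entries, key=lambda e: e[0]):
        state.update(fields)  # later entries win
    return state

log = [
    (1, {"summary": "Crash on start", "status": "open"}),
    (2, {"owner": "alice"}),
    (3, {"status": "fixed"}),  # e.g. merged in from another checkout
]
print(issue_state(log))
# {'summary': 'Crash on start', 'status': 'fixed', 'owner': 'alice'}
```

Since the entry files themselves never change, an ordinary file merge only ever adds entries, and the fold resolves any overlap automatically.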

Last commit/release: Aug 2008
Language/Runtime: Python
Bug storage: Off-branch
Dog food: Yes
CLI: Yes
Local Web UI: No
Public Web UI: Read-only
GUI: No
File format: RFC-822
VCSes: SVN
Custom Fields: No
Comments: Yes
Attachments: Yes
BugID: Linear
Multiuser: Yes
Bug Dependencies: No

dits


dits appears to be the aborted beginnings of a distributed bug tracker. Its functionality isn't very complete and it doesn't appear usable.

Last commit/release: Apr 2010
Language/Runtime: Python
Bug storage: On-branch
Dog food: Yes
CLI: No
Local Web UI: Yes
Public Web UI: No
GUI: No
File format: JSON
VCSes: HG, Git?
Custom Fields: No
Comments: No
Attachments: No
BugID: Hash
Multiuser: No
Bug Dependencies: No

Ditz


Ditz is a distributed bug tracker which was at one time fairly popular, as distributed bug trackers go, in the Ruby community. Now it seems to be abandoned, though several people have created personal forks on gitorious. Ditz has no native support for any particular VCS, but it does have a plugin system which has been used to integrate with Git. Of interest, especially to Emacs users, is that Ditz has an accompanying Emacs major mode. Ditz has a particular focus on grouping issues into releases.

There appears to be a local web UI, "Sheila", but I am unsure of its usability state.

Last commit/release: Sept 2011
Language/Runtime: Ruby
Bug storage: On-branch
Dog food: Yes
CLI: Yes
Local Web UI: Yes
Public Web UI: Read-only
GUI: ditz-commander
File format: YAML
VCSes: Agnostic
Custom Fields: No
Comments: Yes
Attachments: No
BugID: Hash
Multiuser: With plugin
Bug Dependencies: No

Fossil


Fossil is not just a distributed bug tracker, but an entire development forge in a box. It includes a DVCS, wiki, bug tracker and web server. One might call it a distributed forge. Since Fossil stores the tickets in its distributed database there is a custom merging algorithm, apparently mostly newest-wins, which avoids any manual merging of bug files.

Last commit/release: Apr 2013
Language/Runtime: C
Bug storage: Out-of-tree
Dog food: Yes
CLI: Yes
Local Web UI: Yes
Public Web UI: Read-write
GUI: No
File format: Database
VCSes: Fossil
Custom Fields: Yes
Comments: Yes
Attachments: Yes
BugID: UUID
Multiuser: Yes, but no ownership
Bug Dependencies: No

UPDATE 2013-06-03: As C2H5OH mentioned in the comments it is possible to add custom fields.

git-case


git-case is a bare bones proof of concept distributed bug tracker built in the style of the git porcelain. The website claims that some operations are sluggish, but no further details are given.

Last commit/release: Oct 2010
Language/Runtime: Bash
Bug storage: Off-branch
Dog food: No
CLI: Yes
Local Web UI: No
Public Web UI: No
GUI: No
File format: Plain text
VCSes: Git
Custom Fields: Yes
Comments: Yes
Attachments: Yes
BugID: Hash
Multiuser: Yes
Bug Dependencies: No

git-issues


git-issues is a mostly defunct tracker built on top of Git in a similar manner to git-case.

Last commit/release: June 2012
Language/Runtime: Python
Bug storage: Off-branch
Dog food: No
CLI: Yes
Local Web UI: No
Public Web UI: No
GUI: No
File format: XML
VCSes: Git
Custom Fields: No
Comments: Yes
Attachments: Yes
BugID: Hash
Multiuser: Yes
Bug Dependencies: No

gitissius


gitissius started off as a fork of git-issues, but then diverged significantly.

Last commit/release: Dec 2011
Language/Runtime: Python
Bug storage: Off-branch
Dog food: Yes
CLI: Yes
Local Web UI: No
Public Web UI: No
GUI: No
File format: JSON
VCSes: Git
Custom Fields: No
Comments: Yes
Attachments: No
BugID: Hash
Multiuser: Yes
Bug Dependencies: No

gitli


gitli is really more of a single-user TODO list than a fully fledged distributed bug tracker. All the issues are contained within a single file. With that setup, linear BugIDs and no comments, it isn't really suitable for any but the simplest of project needs.

Last commit/release: March 2011
Language/Runtime: Python
Bug storage: On-branch
Dog food: Yes
CLI: Yes
Local Web UI: No
Public Web UI: No
GUI: No
File format: Custom text
VCSes: Git
Custom Fields: No
Comments: No
Attachments: No
BugID: Linear
Multiuser: No
Bug Dependencies: No

gitstick


gitstick is apparently based upon Ticgit and seems to be a young project yet. Unfortunately I wasn't able to determine much about how it operates from inspection. It may be more appropriate to call it a local web UI for Ticgit than a standalone distributed bug tracker.

Last commit/release: Jan 2013
Language/Runtime: Scala
Bug storage: Off-branch
Dog food: No
CLI: No
Local Web UI: Yes
Public Web UI: No
GUI: No
File format: ?
VCSes: Git
Custom Fields: ?
Comments: ?
Attachments: ?
BugID: ?
Multiuser: Yes
Bug Dependencies: No

klog


klog appears to be greatly in flux at this time, so it is difficult to say much which is likely to remain accurate in a year. A great many features appear to be planned, but only the most basic are implemented. According to its own bug database, a complete rework of the way the bug database is stored is planned.

Last commit/release: Mar 2013
Language/Runtime: Javascript
Bug storage: On-branch
Dog food: Yes
CLI: Yes
Local Web UI: Prototype?
Public Web UI: No?
GUI: Mac OSX
File format: Key-value-text
VCSes: Agnostic
Custom Fields: No
Comments: No
Attachments: No
BugID: Hash
Multiuser: No
Bug Dependencies: No

Mercurial Bugtracker Extension


Mercurial Bugtracker Extension uses an unusual layout for bugs. There is one directory for open bugs and another for closed bugs. Such a layout may cause issues when there are concurrent modifications such as one person modifying an open bug and another closing it.

Last commit/release: May 2012
Language/Runtime: Python / Mercurial plugin
Bug storage: On-branch
Dog food: Yes
CLI: Yes
Local Web UI: No
Public Web UI: No
GUI: No
File format: INI
VCSes: Mercurial
Custom Fields: No
Comments: No
Attachments: No
BugID: Hash
Multiuser: Yes
Bug Dependencies: No

milli


milli seems to have disappeared during the lengthy research period so no further information is available.

Last commit/release: ?
Language/Runtime: ?
Bug storage: ?
Dog food: ?
CLI: ?
Local Web UI: ?
Public Web UI: ?
GUI: ?
File format: ?
VCSes: Agnostic
Custom Fields: ?
Comments: ?
Attachments: ?
BugID: ?
Multiuser: ?
Bug Dependencies: ?

Nitpick


Disclosure: Nitpick is written by the author.

Nitpick is a relatively young distributed bug tracker with most of the significant features discussed in this article. One notable feature of Nitpick not present in other distributed bug trackers is the ability to combine multiple Nitpick databases, via the foreign project feature, into a single view. This allows viewing bugs both across several projects and across several branches in a single instance of Nitpick.

Last commit/release: Apr 2013
Language/Runtime: Python
Bug storage: On-branch
Dog food: Yes
CLI: Yes
Local Web UI: Yes
Public Web UI: Read-only
GUI: No
File format: Simple markup
VCSes: git, hg, svn
Custom Fields: No
Comments: Nested
Attachments: Yes
BugID: Hash
Multiuser: Yes
Bug Dependencies: Yes

pitz


pitz started off as a reimplementation of Ditz.

Last commit/release: Aug 2012
Language/Runtime: Python
Bug storage: On-branch
Dog food: Yes
CLI: Yes
Local Web UI: No
Public Web UI: No
GUI: No
File format: YAML
VCSes: Agnostic
Custom Fields: No
Comments: Yes
Attachments: Yes
BugID: UUID
Multiuser: Yes?
Bug Dependencies: No

scm-bug


scm-bug is not a standalone distributed bug tracker. Instead it ties source code to an existing bug tracker. It might be possible to use this with a locally installed tracker in a distributed fashion.

Last commit/release: Feb 2011
Language/Runtime: Perl
Bug storage: Out-of-tree
Dog food: ?
CLI: ?
Local Web UI: ?
Public Web UI: ?
GUI: ?
File format: ?
VCSes: svn, git, cvs, hg
Custom Fields: ?
Comments: ?
Attachments: ?
BugID: ?
Multiuser: ?
Bug Dependencies: ?

Simple Defects


Simple Defects is more than just a distributed bug tracker; it is also capable of synchronizing bidirectionally with several centralized bug trackers. SD uses a distributed database instead of storing the bug repository alongside the source code in a VCS. As such the VCS support it does have is mostly limited to adding commands to the VCS command itself. Since SD is capable of synchronizing bugs in multiple ways it might be possible to use it as an intermediate step between a central project bug tracker and a locally installed centralized bug tracker for developer use.

Last commit/release: Sept 2012
Language/Runtime: Perl
Bug storage: Out-of-tree
Dog food: Yes
CLI: Yes
Local Web UI: Yes
Public Web UI: No
GUI: No
File format: Database
VCSes: git, darcs and other
Custom Fields: ?
Comments: Yes
Attachments: Yes
BugID: Linear
Multiuser: Yes
Bug Dependencies: ?

Stick


Stick is another one of those distributed bug trackers which seems to have fallen off the Internet. I'm unable to retrieve the source to get much concrete information, but the website makes it seem as if Stick was mostly in the idea-conception phase with little actual working functionality.

Last commit/release: ?
Language/Runtime: ?
Bug storage: ?
Dog food: ?
CLI: ?
Local Web UI: ?
Public Web UI: ?
GUI: ?
File format: ?
VCSes: Git
Custom Fields: ?
Comments: ?
Attachments: ?
BugID: Hash
Multiuser: No
Bug Dependencies: ?

ticgit-ng


ticgit-ng does dogfood itself, but that isn't evident from the main repository. I had to search through some forks on Github to find the bug branch. Ticgit-ng uses an interesting approach to managing the data by having a single 'file' per field. Thus there is a file for the state and one for each comment. Not evident in the feature summary is that Ticgit-ng supports tagging issues, though it isn't clear if it supports multiple tags or only one.

Last commit/release: Oct 2012
Language/Runtime: Ruby
Bug storage: Off-branch
Dog food: Yes
CLI: Yes
Local Web UI: Yes
Public Web UI: No
GUI: No
File format: Plain text
VCSes: git
Custom Fields: No
Comments: Yes
Attachments: Yes
BugID: Hash
Multiuser: Yes
Bug Dependencies: No

Veracity


Veracity is another distributed forge: not only a distributed bug tracker, but also a wiki and source control. Again the bugs are stored in a distributed database which has some special logic and interfaces to help merging along.

Last commit/release: Mar 2013
Language/Runtime: C
Bug storage: Out-of-tree
Dog food: Yes
CLI: No
Local Web UI: Yes
Public Web UI: No?
GUI: No
File format: Database
VCSes: Veracity
Custom Fields: No?
Comments: Yes
Attachments: Yes
BugID: Linear
Multiuser: Yes
Bug Dependencies: No

Summary Table


For reasons of space, only what I consider to be fully fledged distributed bug trackers worth considering for a project of more than one developer appear in this summary table. All the same information is available for every tracker I evaluated in their respective sections. The primary determinants of suitability are multi-user support (the ability to assign bugs to users and to determine who made any particular comment or bug report), a sufficiently recent commit or release, and what appeared to be at least one mature interface for developer use. The range of project complexity these trackers suit varies, but since a small two-man project should use a bug tracker just as a large project should, I list trackers suited to multiple complexities.

To save space I have skipped the fields for the GUI (since none of the selected trackers have a GUI), support for custom fields (of the selected trackers only Fossil appears to have such support) and multiuser support (since that was one of the requirements, all have some). It is important to note that all but Fossil have full multi-user support out of the box. Fossil lacks the ability to assign a bug to a particular person for resolution, but that ability can be added as a set of custom fields.

Comparison part 1

Software                       | Last Commit/Release | Language | Bug Storage | Dogfood | CLI | Local Web UI | Public Web UI
b                              | Oct 2012            | Python   | On-branch   | Yes     | Yes | No           | Read-only
Bugs Everywhere                | Mar 2013            | Python   | On-branch   | Yes     | Yes | Yes          | Read-only
Fossil                         | Apr 2013            | C        | Out-of-tree | Yes     | Yes | Yes          | Read-write
git-issues                     | June 2012           | Python   | Off-branch  | No      | Yes | No           | No
Mercurial Bugtracker Extension | May 2012            | Python   | On-branch   | Yes     | Yes | No           | No
Nitpick                        | Apr 2013            | Python   | On-branch   | Yes     | Yes | Yes          | Read-only
Simple Defects                 | Sept 2012           | Perl     | Out-of-tree | Yes     | Yes | Yes          | No
ticgit-ng                      | Oct 2012            | Ruby     | Off-branch  | Yes     | Yes | Yes          | No
Veracity                       | Mar 2013            | C        | Out-of-tree | Yes     | No  | Yes          | No?

Comparison part 2

Software                       | File format    | VCSes             | Comments | Attachments | BugID  | Bug Dependencies
b                              | Sectioned text | hg                | Yes      | No          | Hash   | No
Bugs Everywhere                | JSON           | Many              | Yes      | Yes         | UUID   | Yes
Fossil                         | Database       | Fossil            | Yes      | Yes         | UUID   | No
Mercurial Bugtracker Extension | INI            | hg                | No       | No          | Hash   | No
git-issues                     | XML            | git               | Yes      | Yes         | Hash   | No
Nitpick                        | Simple markup  | svn, git, hg      | Nested   | Yes         | Hash   | Yes
Simple Defects                 | Database       | git, darcs, other | Yes      | Yes         | Linear | ?
ticgit-ng                      | Plain text     | git               | Yes      | Yes         | Hash   | No
Veracity                       | Database       | Veracity          | Yes      | Yes         | Linear | No

Bug Handling Strategies


By analogy with DVCSes, distributed bug tracking provides some new capabilities, makes some older techniques easier and makes some traditional centralized bug tracking methods all but impossible. In this section I'll try to cover the most common of these cases and some ways to work within the limits, tighter and looser, which distributed bug tracking software as it exists today provides.

Distributed Use Cases


Most of the talk around distributed bug tracking is about replacing a centralized bug tracker completely. This is so for the obvious reason that most developers don't want more than one bug tracker per project. There are, however, some interesting alternative uses which are not in direct conflict with a centralized tracker. One such use is the aggregation of multiple trackers into a single one. Consider the case of a developer who works on several different projects. If these projects don't share one bug tracker then the developer must regularly check the separate trackers. Some of the distributed bug trackers described above support bidirectional communication with other bug trackers, centralized or not. Such a developer could configure a local distributed bug tracker to give an overview of several trackers.

An alternative is a hybrid centralized-decentralized setup, similar to how DVCS is used in practice in many cases. If the project or organization has a single centralized tracker, a developer could set up a distributed tracker as a mirror, full or partial, of the centralized tracker for personal consumption and modification when they are disconnected or operating over a poor network link. Whenever convenient they would then trigger a bidirectional synchronization. Thus they gain all the advantages of distributed bug tracking without many of the disadvantages. This model is similar to individual developers using git as an interface to a Perforce or Subversion repository.

Yet another use case, again depending on aggregation, is to combine various bug trackers for a single project. For example, the bug tracker of an open source package and the bugs against that package in the trackers of all the major Linux distributions could be combined for a more complete view of the issues users are having with the software.

The various ways a distributed bug tracker could be used are not fully explored, so these are just a few examples of how they could be integrated into workflows.

Non-Developer Members


One common concern with large projects moving to distributed bug tracking is how to integrate QA and project managers. The predominant view among existing large projects is that QA and project managers, for the most part, should neither need nor have access to the VCS. Bringing this stance to distributed bug tracking would mean QA has no way to interact with the bug tracker other than in a readonly fashion. The solution to this predicament is to give those QA and project managers read-write access to the VCS.

There are a few reasons such a move is resisted; many are obsolete or misinformed notions based upon limitations of old VCSes or poor bug trackers. The first, however, is an entirely valid argument: requiring all the QA and project managers to become experts in the VCS of choice is overly onerous. At more than one place I have seen the local VCS expert set up special wrappers which perform only the limited set of functionality a particular artist or QA person needed to get their job done and hide all the other complexity. In a similar vein, any good distributed bug tracker will provide a sufficiently simple interface to the VCS for bug operations that minimal training should be necessary.

A second common claim, especially among open source projects, is that the VCS is for source code only and everything else should be kept separate. While it is possible to have a parallel VCS repository, or some other arrangement as will be discussed below, modern VCSes are not simply source control systems but generalized version control systems. Though some VCSes handle them less well than others, many large projects have good success storing large assets or even build-chain tools in the VCS alongside the project. As such there is no reason not to also store the bug database, and all the input from the QA people, as well. The VCS can be viewed as the project state, not just the project output.

A final possible complaint is that the QA people, not being VCS experts, may make disastrous mistakes relating to merging, reviving stale commits or just inadvertently editing other parts of the project. While this is true when no protections are put in place, most VCSes provide the ability either to restrict different users to different portions of the checkout tree or to have a knowledgeable person double-check their changes before accepting them into the main development repository.

Integrating QA, project managers and other non-developers such that they can make full use of the distributed bug tracker is not difficult; it merely requires that sufficient training and protections be put in place. These less technical people will likely not be pleased with a purely command line interface to the tracker, however. Partially this is because their use cases tend not to deal with one bug at a time, but instead involve traversing, reading, commenting on and modifying several in quick succession. Partially the aversion is due to less familiarity with CLI tools than the average developer has. For this reason any distributed bug tracker used should also provide a good read-write local web or graphical interface for these less technical users.

Care must also be taken to help them know which branch of development holds the appropriate bugs. For support-type staff this is as easy as having them choose the version to file the bug against first, then choosing the correct bug repository version based upon that. For QA users it is more a matter of ensuring that the builds they test come along with the bug database. This is most easily accomplished with a fully automated build system which can produce QA-testable builds on demand. With such a system QA is given a source tree which is trivially built into a product to test. Then QA need merely use that branch to handle any bugs for that build.

Public Users


As previously discussed, a major outstanding issue, especially for open source projects, is how to provide the public with a useful interface into the project bug tracker. Few read-write web interfaces suitable for public consumption have been created for distributed bug trackers, though no insurmountable obstacle appears to block the way in most cases. Currently I can only recommend one approach to solving this issue: have a readonly user web interface which is updated frequently, and handle any bug modification or creation on the part of users through a support mailing list.

This will not be as convenient for developers or users as a public bug tracker, but it is likely to provide better results for both parties. The users, instead of creating a new bug which is likely never to be answered, if it is ever read, by a developer, will interact directly with a developer or other support person for the project. This allows the developer not only to determine whether this is an existing bug, a step users often never perform correctly, but also to ensure that all the necessary information has been acquired before the user leaves, never to be heard from again. Many bugs in open source bug trackers are full of incomplete information while the reporting user is nowhere to be found. The user is also better off: there may be a solution to the particular issue they are facing which they will be told about immediately, instead of waiting for a WorksForMe resolution of their bug, if that ever comes.

As previously mentioned the developer will be able to extract all the necessary information from the user more easily because the discussion will happen immediately instead of days or weeks later when somebody gets around to viewing the newest bugs on the tracker. Developers also benefit by having fewer duplicate bugs with slightly different information cluttering up the tracker because they'll deduplicate as they go along. An additional advantage is a greater likelihood of a user actually reporting the issue. If the recommended way for a user to report a problem is to a mailing list they are reasonably likely to do so. They are less likely to create yet another account for yet another bug tracker which they will never use again such that they can file one bug which will almost certainly not receive a response.

One critical aspect to remember is that responses to the users on the list must be timely and efficient. It is this requirement of good communication which brings about the benefits for both the developers and users. In fact, this method is how many commercial companies operate: customers interact directly with support staff, who navigate and fill in the necessary information in the bug tracker. The second critical aspect is that the public readonly web view is updated frequently. An up-to-date place for users to track the state of their bug, to look up resolutions to similar issues, or to be pointed at when they are having a known issue is invaluable and saves developers time. Users prefer to get the answers they want without having to bother the developers, and they like to see progress.

Multibranch Overview


One particular issue with distributed bug tracking is that there is not necessarily a single complete view of the bugs at any time. Instead different branches may have different bugs in different and conflicting states. For example a development branch may have fixed a bug, but since that branch hasn't yet merged to the trunk, the trunk doesn't have that bug marked as fixed. A further example is a release branch having a bug created against it on the complaint of a user, but that bug not yet having made its way via merging to the trunk, so it exists nowhere else in the VCS. These are all examples of the power of distributed bug tracking when it is used to have bug states follow the code flow within the project.

However, sometimes it is useful to have a complete, or more complete, view of the bugs. As an example, a project manager may want to know which bugs have been fixed for the coming release, even if not all of those fixes have made it to the release branch yet. Perhaps the code must move from a development branch through a QA branch before arriving in the release branch; it is still important to see that a bug otherwise marked as open in the release branch is actually closed in some branch. Another situation is that of a developer working in a development branch who would like, when viewing the bug database, to see the bug information not just as it appears in his branch but also in the trunk, in case some new bug or comment relevant to his current branch appears.

This cross-branch bug database merging is an important feature for ensuring a wider view of the state of the bug database when such a view is useful. At the time of this writing only one distributed bug tracker which I am aware of, Nitpick, supports such a facility directly. Indirectly it is possible to script a CLI interface to merge the bug query results across many VCS checkouts. Of course any distributed bug tracker which uses off-branch or out-of-tree storage has neither this disadvantage nor the advantage of differing branch versions of the bug database.
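
As a sketch of the indirect, scripted approach: suppose the tracker's CLI has been run in each checkout and its output parsed into per-branch dictionaries (the branch names, bug IDs and states below are invented). Merging them yields a cross-branch overview.

```python
# Sketch of cross-branch aggregation. Running the tracker's CLI in each
# checkout and parsing its output is not shown (tracker and output format
# are hypothetical); here the per-branch results are supplied directly.

def cross_branch_view(per_branch):
    """Map bug ID -> {branch: state} across all branches."""
    view = {}
    for branch, bugs in per_branch.items():
        for bug_id, state in bugs.items():
            view.setdefault(bug_id, {})[branch] = state
    return view

per_branch = {
    "trunk":   {"4f2a": "open",  "9c1d": "open"},
    "release": {"4f2a": "open"},
    "dev":     {"4f2a": "fixed", "9c1d": "open"},
}
view = cross_branch_view(per_branch)
print(view["4f2a"])
# {'trunk': 'open', 'release': 'open', 'dev': 'fixed'}
```

A bug shown open in the release branch but fixed in dev is exactly the cross-branch discrepancy a project manager would want surfaced.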

Using On-Branch as Off-Branch


Distributed bug tracking holds many possible advantages and uses which cannot be filled by traditional centralized bug trackers. But it may be that not all of this power is desired for a particular project. In many cases it is possible to configure the distributed bug tracker, with some scripting effort, to work in a less powerful mode.

For example, while off-branch bug repositories will likely have an easier interface for storing bugs that way, it is possible to use an on-branch bug tracker as an off-branch tracker. Simply create a separate branch or checkout for the bug repository and write some scripts which direct the bug tracker to use that branch or checkout instead of putting the bugs beside the source code. This relatively simple step produces an off-branch bug tracker with the bugs stored in the VCS.
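
Such a redirecting script can be sketched as a thin wrapper; the tracker name, VCS commands and checkout path below are placeholders, not a real configuration.

```python
import subprocess

# Placeholder path: a dedicated checkout holding only the bug repository.
BUG_CHECKOUT = "/srv/project-bugs"

def bug_command(*args):
    """Pin a tracker or VCS invocation to the bug checkout: (argv, cwd)."""
    return list(args), BUG_CHECKOUT

def run(argv, cwd):
    # Execute the command inside the bug checkout, not the source tree.
    subprocess.run(argv, cwd=cwd, check=True)

# For example, filing a bug and then committing the bug checkout
# (placeholder commands, not executed here):
# run(*bug_command("nitpick", "new"))
# run(*bug_command("git", "commit", "-am", "New bug"))
print(bug_command("nitpick", "list"))
# (['nitpick', 'list'], '/srv/project-bugs')
```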

Similarly a setup even closer to a centralized bug tracker, but with the bugs stored entirely in the VCS, could be created by presenting the local web interface to each developer and committing the changes directly to the VCS. There is even the possibility of using these simpler setups for some users of the repository while allowing the full distributed capabilities for others, perhaps the remote workers.

In much the same way an on-branch or off-branch bug tracker can be turned into an out-of-tree tracker simply by having a separate VCS repository which contains only the bug repository.

Surviving A Manual Bug Process


In the beginning bug trackers started as simple TODO lists, perhaps with some notes. From there the massive spectrum of bug tracking tools and processes evolved, and at the extreme end there are very complicated bug processes and tools. While these sorts of processes can be translated to distributed bug tracking, they are likely to be cumbersome and disappointing. Distributed bug tracking is better suited to simpler processes and fewer fields. Because the whole bug database is available locally, a developer can easily run complex queries as local scripts. Any datum which isn't extremely common is likely better off as a formatted comment on a bug than as a custom field with complex automation behind it.
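
As an illustration of such a local query script, here is a minimal sketch; the "Affects-Customer:" comment convention is invented for the example, not a feature of any tracker above.

```python
# Sketch of a local query over formatted comments: an uncommon datum is
# recorded as a "Key: value" line in a comment, and a small script pulls
# it out on demand instead of the tracker needing a custom field.

def query_comments(bugs, key):
    """Find bugs whose comments carry a 'Key: value' formatted line."""
    hits = {}
    for bug_id, comments in bugs.items():
        for comment in comments:
            for line in comment.splitlines():
                if line.startswith(key + ":"):
                    hits[bug_id] = line.split(":", 1)[1].strip()
    return hits

bugs = {
    "4f2a": ["Crash confirmed.\nAffects-Customer: Acme"],
    "9c1d": ["Cannot reproduce."],
}
print(query_comments(bugs, "Affects-Customer"))
# {'4f2a': 'Acme'}
```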

Along these lines, a simpler bug process in general is recommended: a small number of bug states, priorities and the like. If the bug process is simple enough then no automation will be necessary, because the developer will have only one obvious choice and will be able to arrive directly at the state they desire. This is contrary to a common setup where the developer must mark a bug as assigned, then resolved, then closed. For many simple bugs this is overkill, and the developer will spend more time navigating the process than fixing the bug. In such cases the developer wants to be able to skip straight to closing the bug.

With distributed bug tracking it pays to have a clear and simple process. Only a handful of states are needed and most information is better suited to being in a comment than a custom field.

Where to Enter Bugs


One issue which comes up when discussing the abstract theory of distributed bug tracking is that it would be ideal if a bug could be associated, across all the various branches and clones, with the original commit which introduced the error. This is nice in theory but also impossible in theory: there may be no single commit which introduced the error. It is, however, possible to associate a bug with any commits which do introduce an issue, and that is really just a mapping from bug ID to commit ID. A bug tracker could be constructed this way, and it would be able to cut across branches using the mapping.
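
A minimal sketch of such a bug-ID-to-commit-ID mapping, with invented commit and bug IDs: a branch "has" the bug exactly when its history contains one of the commits recorded as introducing it.

```python
# Sketch of cutting across branches via a bug -> introducing-commits map.
# Per-branch commit histories would come from the VCS; here they are
# supplied directly as sets of (invented) commit IDs.

def branch_has_bug(bug_commits, branch_history):
    """A branch contains the bug if it contains any introducing commit."""
    return any(commit in branch_history for commit in bug_commits)

introduced_by = {"4f2a": {"c0ffee1"}}  # bug ID -> introducing commits
history = {
    "trunk":   {"c0ffee1", "beef002"},
    "release": {"beef002"},            # bad commit never merged here
}
for branch, commits in history.items():
    print(branch, branch_has_bug(introduced_by["4f2a"], commits))
# trunk True
# release False
```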

Lacking such a system, the next best thing is to ensure that bugs are entered where the fix would be placed. As an example, consider a project with a recent release and a trunk where development continues. During the release stabilization process a branch would have been created for the release, and any remaining major bugs fixed there; all those changes would then be merged back into trunk at a later time. Any bugs found by QA against that stabilizing release should be raised in the release branch. Then, when the fixes propagate, so too will the bug information.

In a similar way, bugs in maintenance releases should be raised in the branch for that release, to eventually be merged into the trunk. The fix itself may or may not still be applicable, but since some changes will be merged, the bug database changes should be merged up as well.

Now all this depends on the particular branching and versioning strategy the project uses. If the project doesn't have maintenance releases or doesn't move changes around like that then a different location to report or modify bugs will be appropriate.

Other Distributed Tracking Options


As previously stated distributed bug tracking really started as simple TODO files. As such there are ways of tracking bugs which don't require fully fledged bug tracking software, distributed or otherwise. Most of these are severely limited in several ways, but a project may not hit these limits.

Beyond the simplest TODO lists are tools such as Emacs org-mode. This can work well, but may fall apart when multiple developers are involved and makes providing a public read-only view into the bug database cumbersome.

Another alternative, which doesn't suffer from this last limitation, is to use wiki software to track bugs. There exist VCS-based wikis, such as ikiwiki. These will tend to be usable with a standard text editor but still provide an easy rendering option for users on the web. Using a wiki like this will tend to make it difficult for users to find issues that may apply to them except by reading all the existing issues.

Is This Worth Doing At All?


It may seem odd to have a section which deals with the question of the value of distributed bug tracking so late in the article, but without understanding distributed bug tracking as it is currently known it is quite difficult to make a reasoned judgement on the matter. There are opinions in both directions. Proponents of distributed bug tracking focus on the isolation capabilities, offline support and bug branching, while opponents focus on the collaborative aspects of bug tracking which distributed bug tracking slows down. Both groups have points and the strength of any particular point truly depends on the project in question.

To start, consider a project where all work is done on feature or bugfix branches, there is a thorough review process and all real discussion happens on a mailing list, with relevant messages copied into the bug tracker manually for reference. In this case distributed bug tracking would seem to have few downsides. All the discussion happens in a broadcast medium, the email list, so every developer can easily get a sense of the current state and latest debugging information. Since the bugs are fixed on branches, and a thorough review process may cause a large span of time to pass between the bug being fixed and that fix being merged into the trunk, the ability of bug states to follow the fixes is very useful, especially if there is some tool support to aggregate bug states across multiple branches.

A different project might instead do the vast majority of its development on the mainline with all discussion occurring via the bug tracker. Here distributed bug tracking seems to have little detriment. Surely the full capabilities are not being used, but perhaps offline support is sufficiently useful. As soon as the developer synchronizes their local copy they will have all the new discussion. This does require that the developer frequently and regularly synchronize, which may be a change in workflow. This is less of a burden with DVCSes, but can be an issue for projects or developers which prefer checking in single, complete units of work as a single commit. Something like committing a few days of work at a single time instead of as several commits over those days. In these situations frequently merging in changes from the trunk may be onerous.

There is then a third case of a project with the branching structure of the first case, but the communication system of the second. That is, all work is done on branches and the vast majority of the communication occurs exclusively via the bug tracker. This situation can cause some difficulty when using a distributed bug tracker. The time to push a new bug comment up and then have another developer pull it down can be quite significant. There are simple solutions, however, the simplest of which is to have the canonical VCS repository run a hook which emails out new bug comments and state changes when those changes are pushed to it. Having a centralized bug tracker email out such information is very common already and shouldn't be an issue.

It remains to be seen which side of the argument will win out, but it currently appears that distributed bug tracking fills a real need, especially when coupled with a DVCS, and that its few downsides have relatively simple technical fixes. For the time being at least, distributed bug tracking appears to be a useful tool worth using.

Future Thoughts


Distributed bug tracking is a young concept, even considering the age of other concepts in the various fields of computing. As such there are many areas which have not been thought through and it is unlikely that any of the current generation of distributed bug trackers have all the features and functionality which will one day be considered essential. Here are a few ideas or issues which still need resolution with respect to distributed bug tracking.

Tracking Changes


One of the first advantages which comes up when considering distributed bug tracking is the ability for the closing of a bug to follow the change which fixes the bug as it propagates through branches and releases. This works fine with on-branch storage, but there are arguments against on-branch storage related to bug visibility and the length of time it takes for a comment on a bug to propagate to the branch a developer is watching. Off-branch bug storage, however, gives up the ability for bug state to follow code fixes. One possible solution is, with VCS support, to store the change IDs which fix the bug and then, when showing the bug state, query the VCS to see if those changes exist on the branch in question.

VCS Storage Limitations


It is appealing to store the bugs directly in the VCS of the project, either beside the code or in their own branch. For a moderate number of bugs and comments this is not an issue. However, the file layouts and formats which are the easiest for the VCSes to merge are not especially efficient and may cause issues when scaling. There does not yet seem to be enough experience to determine how this scaling should be dealt with or even how much of an issue it will become. Should old issues be archived into a more efficient format? Should old issues be deleted from the HEAD of the VCS, relying on the VCS history to retrieve them? Is there some other option which is superior to those mentioned?

References


I haven't explicitly called out references in the text above, but here are some websites which may be of interest, from which I procured some of my information where it is not original thought or extracted from the software compared above.

  1. http://tychoish.com/rhizome/supporting-distributed-bug-tracking/

  2. http://bytbox.net/blog/2012/10/thoughts-on-distributed-bug-tracking.html

  3. http://nullprogram.com/blog/2009/02/14/

  4. http://www.ericsink.com/entries/dbts_fossil.html

  5. http://erlangish.blogspot.ca/2007/06/distributed-bug-tracking-again.html

  6. http://heapkeeper-heap.github.io/hh/thread_298.html#post-summary-hh-1076

  7. http://dist-bugs.branchable.com

  8. http://evan-tech.livejournal.com/248736.html

  9. http://blog.tplus1.com/index.php/2008/08/01/toward-a-horrifying-new-workflow-system/

  10. http://esr.ibiblio.org/?p=3940

  11. http://blog.ssokolow.com/archives/2011/08/25/topic-glimpse-distributed-issue-tracking/

  12. http://www.raizlabs.com/blog/2007/06/20/linux-distributed-bug-tracker/

  13. http://urchin.earth.li/~twic/Distributed_Bugtrackers.html

Infinitely Evil

There is a set of topics for which, in general, no rational discussion can take place. For these topics the majority of the population can only accept the most extreme solutions and will consistently either misunderstand any point to the contrary or consider you a monster unworthy of association. Both of these situations usually end up with lots of flaming, but very little useful discussion.

I believe it's an unfortunate gap in education which causes this. It takes a significant amount of effort and training to be able to consider positions which trigger a strong gut reaction. Instead of calm contemplation and questioning many people will respond with vitriol and hate.

I term these topics infinitely evil because a rational response to anything depends on how negative it is weighed against how likely it is. Something like an expected value calculation. Under this framework the only kind of thing which can result in an inconsiderate stonewall of disagreement is something infinitely negative. That which is infinitely negative is infinitely evil. There are actually two related sets of infinitely evil topics: universal within an ideology and universal within a society.

Universal within an ideology includes things like the equality of women, abortion, taxes, welfare, drug use and organized crime, to name a few from the common North American ideologies. An interesting thing about these ideological evils is that they work both ways. If one ideology holds some solution to an issue as the only acceptable one, then there is often another ideology where that solution is infinitely evil. There are rare cases where one ideology is dead set on one solution, but most other ideologies are pretty neutral on the whole topic.

Any infinite evil which isn't ideologically restricted is by definition universal within the society. These are still not truly universal, but may appear so if you never leave the same social region, such as North America or the Western World. Included here are things such as sex crimes, hate crimes, crimes against 'children', murder, Nazis, etc.

Topics such as these are one part of the reason the Internet is such an echo chamber. If every discussion leads to a hot flame war in one forum then you'll likely move on until you find a spot with less disagreeable participants. Unfortunately this tends to be a group of people who agree on these infinitely evil topics. They may discuss them, but only to nod agreement at each other and discuss the finer points of a solution which, to some, is morally repugnant.

The only thing which can be done in these situations is to not discuss the infinitely evil topic directly. Instead some more nuanced situation should be discussed. One will just have to hope that the other participants will see the parallels to the more extreme situation on their own.

One final disclaimer: the above is obviously a generalization. One that I've found to be true more often than not in my experience, but a generalization nonetheless.

High Resolution 3D at 48 FPS

Yesterday I watched The Hobbit in high framerate 3D. I did this mostly to see what 48FPS movies looked like. The movie itself was not bad.

The first thing that happened is that my view that one shouldn't watch live action 3D films was reaffirmed. The lack of whole scene focus is just too hard on the brain. The Hobbit wasn't as bad as others I've seen, but I still left the theatre with a mild headache. I've also learnt to ignore the background entirely, which I think detracts from the film since I feel that I spent most of the film looking at dwarf noses.

The Hobbit was also filmed in some high resolution process and then downscaled. This certainly increased the fidelity of the images and made them seem much more real. Unfortunately everything seemed somewhat too bright and the additional resolution highlighted lighting discontinuities. As such, scenes shot within one set with natural lighting, such as daytime outdoor shots or indoor shots with sufficient ambient lighting, look quite good. Composite shots or shots where highlight lighting was necessary work out less well because the lighting differences are obvious and unnatural.

I think it is this high resolution process with highlight lighting which is the primary cause of complaints that The Hobbit looks too much like a soap opera. A secondary cause would be the lighting brightness required for the various light sapping processes (3D, high resolution, high framerate) interacting poorly with the inverse square law of light intensity.

The real reason I went to see this movie as I did was to watch the high framerate, just in case it ends up being a flop and fading away like 3D has done in the past. I think 48 FPS filming is a mistake.

The biggest issue is that many actions end up looking either jerky or as if they are happening on fast forward, even if the actual movement happens at a normal pace. I noticed this more during slow motions like moving a book than fast motions like fighting. This rushed feeling is quite jarring and through the entire course of the movie I saw it repeatedly. Not even two hours of practice made the effect go away. This is a severe problem that breaks the immersion in the movie world. I have a couple of half-baked theories why the higher FPS may cause this problem, but the solutions always seem to boil down to needing an even higher FPS, thus escaping an uncanny valley, or modifying the frame display in some way to aid persistence of vision. It's also possible that simply being more deliberate in motion fixes the issue. Perhaps there was some frame cutting for pacing which caused this issue.

The second issue I saw with the high framerate occurred during action scenes. Specifically, everything moved too fast and was difficult to follow. What's unfortunate is that this is an obvious result of more realistic projection. In the real world action happens in the blink of an eye. It takes a wealth of experience in a particular sport to be able to follow the action when you are unable to see most of it. Additionally, in the real world when your mental picture of some portion of the world is out of date you can quickly check on it, not so in movies where you can only see what the director shows you. In movies the limitations of 24 FPS and the motion blur which comes with it helps the audience understand what's happening, even if they've taken their focus off the sword of the hero to examine his facial expression. The blurry history shows what's important and what's happened over the last quarter of a second.

Both of these issues were most pronounced during camera movement. When the camera was placed at a human level and moving at a human pace (that is, slow as a snail) then it worked out more or less fine. However, as soon as the camera was moved in a way which is dramatic, but humanly impossible, the jerking became severe.

I would not recommend watching The Hobbit in 3D High FPS, I think it'd be a better movie in 2D at 24FPS. This is not to say that I think it impossible, just that there are some restrictions on what the film can do. As long as a movie was shot only from a steady cam at human level with a walking camera man, avoided fast action scenes and had all the actors move slightly deliberately I think it could work out fine. It makes you wonder what the point of a drama shot in such an expensive process would be though.

Digging Up Old Stuff

In channel today the discussion turned to the Twelve Days of Van Epp. This reminded me that I had the Thirteen Days of Crackdown in the archive of my SFU account. In looking at the contents of that archive I found a couple of things worth uploading.

I present, for your enjoyment, Thirteen Days of Crackdown and Meet SFU CS.

Tyranny of the Forums

The web has a pretty rich and varied set of ways to deliver information to a person. There are standard web pages, images, videos, interactive images, audio, tables and wikis to name a few. There is also the web forum.

Web forums are pretty much the default tool of many communities to disperse information. Often this occurs with an admin pinning a post at the top of one of the topic areas; the appropriate senior member then posts a start to the thread with some information, and all the information they forgot is filled in as responses to posts from normal members.

They are the default for this sort of thing and I hate it. If you are a casual reader of the community or you arrive well after the fact then you have to read through an often lengthy series of posts in order to extract the relevant information. Information that would otherwise take mere seconds to read off a plain, up to date webpage.

And this is the best case, when the community deems the information important enough to pin a thread. If that isn't the case, perhaps because you want some old information which is no longer pinned, or maybe some information provided to the community by a non-senior member, then not only do you have to read through some thread spread over thirteen pages to find the information you want with clarifications, but you also have to sort through tens or hundreds or thousands of threads to find the one which has your information. God help you if you want to find any amendments in other threads.

Perhaps the worst example of this I have ever seen is at the xdadevelopers forum. There is an amazing amount of knowledge strewn about that forum. Unfortunately it's strewn about and nearly impossible to find. When you try searching for something you usually only get a series of twisty ten page threads full of comments linking to other thirty page threads. It truly is hell on the Web.

Amazing amounts of pain and suffering could be avoided simply by summarizing these important knowledge threads in a wiki. Help end the tyranny of the forums, summarize the information in important threads in a community wiki.

Infinite Impersonal Internet

It's quite possible you've heard about the "Infinite News Stand". In short this is the issue faced by news providers on the Internet where there is effectively infinite free content. The issue here is obviously how to get paid. You can paywall everything, but then nobody shows up to read your stuff. You could provide free samples, but if everybody did that there would again be effectively infinite free content available, so why would people pay for more of your stuff?

This is just one aspect of what I, in this post, am going to call the Infinite Impersonal Internet Issue. I'll shorten that to I4 to save my wrists. I4 arises because of three facts about the world.

The first is that millions upon millions of people use the Internet for anything and everything you can imagine. In fact, it is impossible for any single person to type even one message to every user of the Internet online at any one time. There are just too many people, there are infinite monkeys.

The second is that, for any particular discussion topic, many people hold the same viewpoint. This can best be seen in popular comment sites such as Reddit. If you don't look carefully at the names you'll see that the threads progress in a logical manner as if a cohesive discussion is taking place between a small number of participants. Normally nothing of the sort is true. Instead you have entire cohesive threads where a single participant will often post only once. This is but one obvious and interesting example of the substitutability of people on the Internet. It is difficult, in a practical sense, to differentiate posters except by their stance within the discussion taking place. While it is usually a safe bet that posts from the same account are the same person, it is generally unknowable if different accounts are different people. Accounts tend to be cheap on the Internet, easily produced and easily discarded. It's hard to tell the monkeys apart.

The third fact is simply that most interactions on the Internet are transitory and impersonal. While it is entirely possible to make good friends on the Internet and even to create deep communities that is not the default interaction. The default interaction is to read an article by some named but otherwise anonymous author, further read comments with random nicknames and then perhaps provide a comment yourself. Much like walking down a busy urban street you encounter a setting, take in the response of people you are unlikely to ever see again and perhaps react in your own way. Only rarely will anybody in that crowd distinguish themselves from the faceless mass.

As noted above I4 is not the only form of interaction on the Internet, but it is definitely the most common. It takes real work to construct a community and many interactions between two people before they will see the other's face. It happens everyday, but not every time.

Now what are the consequences of I4? There are many, the most obvious of which are the general ailments of the Internet: trolls, spammers, hate mongers of various sorts and echo chambers. Trolls don't care if you don't like them and ignore them forevermore; they can always troll somebody else or start a new account. Spammers don't even care if you personally respond as long as some minuscule fraction of people do. Hate mongers don't care about you; they already have their own echo chamber which tells them they are right and you are wrong.

This leaves echo chambers. Echo chambers are not unique to the Internet, but are made much larger and more numerous by it. Given an infinite amount of content produced by an infinite number of monkeys which are hard to differentiate, how does one choose which places to return to? Obviously one goes where one liked the content the best. This just so happens to be where other like minded people tend to end up. You now have an echo. The chamber simply results from the fact that Internet people aren't particularly differentiable. Spend enough time listening to the echo, hearing it from nearly all sides, and one starts to believe that everybody on the Internet believes that. The logic is simple and fuzzy: Internet people A through Z think that way and they cover the spectrum of Internet people, therefore all Internet people believe so. Echo chambers are unavoidable, but they prevent alienation. They provide a necessary shared culture.

Not all the consequences of I4 are negative. The dual of trolls and spammers is the newbie: the person who doesn't quite know where they fit or quite how to act. I4 allows this person an infinite number of tries under an infinite number of guises to find their place and make their contribution. The dual of hate mongers are constructive communities which combine their intelligence and effort to build great monuments. The dual of echo chambers is exposure to new ideas. These great strengths of the Internet cannot be ignored just as they cannot be separated from their duals.

The next time you see a troll ignore that identity and move on; they may return under a different nickname or move onto some other target and that's alright. If you come upon a den of hate mongers enlighten them; they'll likely view you as a troll or spammer, but that's fine; Internet nicknames are free. The next time you are disliked or hated on the Internet don't sweat it; there are a lot of people on the Internet and they are all kinda blurry at this distance.

CSS

Now I don't claim to be a web developer. I don't do it for money, but I do touch web development every so often. Every time I do I am amazed at the poor design of CSS. I don't really understand how CSS was standardized being so bad.

The idea of CSS is a good one: separate the content from the presentation. It makes code in general cleaner and easier to modify in a consistent manner. This makes sense. But the implementation is halfway useless and falls well short of this goal. It isn't possible to set up the simplest semantic div hierarchy and call it a day. CSS is non-orthogonal and makes many basic presentations needlessly cumbersome to create. It is well known to be impossible to create any moderately complex layout with the ideal semantic hierarchy.

Instead there are hacks used everywhere, with divs inside divs inside spans and Javascript thrown on top, to get layouts which are functional and visually pleasing. Even then, some things are nearly impossible without manual pixel layout.

You would think by CSS version three they would have figured out what they've done wrong and fixed it. Oh how wrong you are. Instead of fixing the major core issues of non-orthogonality, insane limitations and special cases they merely add more half-baked features. At the rate that real issues get fixed I figure that CSS version 10 will be reasonably robust and flexible.

I have no idea what kind of drugs the CSS designers are on, but it must be good.

Time for New Key

This is a short announcement that I have produced a new GPG key. You can find the new key here and my transition statement here. My new key ID is CBA7B85A.

Professional C

The Internet is full of simple language tutorials and bookstores are full of books proclaiming to teach you language X in Y time period. This is not one of them. Instead this is intended to provide a shortcut to all the learning which happens as one works on a large, well engineered and well written codebase. I expect that you already know how to program in at least one programming language and further that you understand the theory of indirection and pointers. I'll cover the basics, but I'll do it at breakneck speeds. You will want to look elsewhere for formal definitions and corner cases.

Basic Syntax and main()


Let's look at a simple example to start with.

#include <stdio.h>
#include <stdlib.h>

#define TYPE_OF_WORLD "beautiful"
#define DEFAULT_NUM_ITERATIONS (10)

int iterations_remaining(int max_iterations);

/*
 * Utility to say hello
 *
 * Usage: hello [iterations]
 */
int main(int argn, char **args)
{
        int num_iterations = DEFAULT_NUM_ITERATIONS;

        if (argn == 2) {
                num_iterations = atoi(args[1]);
        }

        for(;;) {
                if (iterations_remaining(num_iterations) > 0)
                        printf("Hello " TYPE_OF_WORLD " World!\n");
                else
                        break;
        }

        return 0;
}

int iterations_remaining(int max_iterations)
{
        static int iterations;

        iterations++;

        return max_iterations - iterations;
}

This simple example shows us much of the basic syntax. It shows us how to include library header files:

#include <stdio.h>
#include <stdlib.h>

How to define constant strings and integers with symbolic names:

#define TYPE_OF_WORLD "beautiful"
#define DEFAULT_NUM_ITERATIONS (10)

How to declare a function without defining the body. This lets us define the function elsewhere later:

int iterations_remaining(int max_iterations);

How to write a block comment:

/*
 * Utility to say hello
 *
 * Usage: hello [iterations]
 */

The proper function signature for main():

int main(int argn, char **args)

How to define a variable and optionally set it with an initial value:

int num_iterations = DEFAULT_NUM_ITERATIONS;

The syntax for an if block, how to check for equality, how to set a variable to a value, how to call a function and how to access array elements. Additionally it shows one aspect of the equivalence of pointers and arrays:

if (argn == 2) {
        num_iterations = atoi(args[1]);
}

The syntax for an infinite for loop, how to do simple if-else, how to print text using the standard library, the fact that string constants automatically combine in C and how to break out of a loop early:

for(;;) {
        if (iterations_remaining(num_iterations) > 0)
                printf("Hello " TYPE_OF_WORLD " World!\n");
        else
                break;
}

How to return a value from a function:

return 0;

How to define a function, even one which has been previously declared:

int iterations_remaining(int max_iterations)
{

How to declare a static variable restricted to a function scope along with the fact that static variables are automatically set to zero on program load:

static int iterations;

The post-increment operator and how to perform basic arithmetic:

iterations++;

return max_iterations - iterations;

Overall that's quite a busy example. The syntax so far has all been more or less plain except a few points: including header files, the function signature of main() and the static keyword. I'll leave an explanation of static until later as it has several uses. C uses header files to contain type definitions, constant definitions, function declarations and sometimes variable definitions. There are two similar forms:

#include <library.h>
#include "project.h"

The first is used for header files from libraries, the latter for header files from the current project. The distinction is a bit murky since some projects are so large as to have library like elements within themselves.

The type signature of main() is more or less fixed in stone. Since C is used in so many different domains it isn't actually a fixed rule, but if you are writing an application it is a safe signature to assume. The signature has three parts. The first is the return type, int or integer. This is a signed type and even though an int is usually 32 bits long on most modern computers you can only safely return values from 0-255. The exact range is OS specific.

The second element is "int argn". This is the number of arguments in the following string array. This value will never be less than one. The third element, the array of strings, is the argument list. Element zero (args[0]) is the string containing the name the program was executed under. The following elements, up to element argn - 1, are the program arguments as strings.

Though the return type of main() is an int, you can't depend on more than 7 bits being reliably and portably returned. Thus the safest range is 0-127. You can use the values 128-255, but sometimes it will be interpreted as a signed char and sometimes unsigned and so can be confusing.
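To make the argument handling concrete, here is a small sketch that pulls an iteration count out of main()'s arguments. The helper name is my own invention, not a standard function; note that the first real argument lives at args[1], since args[0] holds the program name.

```c
#include <stdlib.h>

#define DEFAULT_NUM_ITERATIONS (10)

/* Hypothetical helper: return the iteration count given on the
 * command line, or the default when no argument was supplied.
 * args[0] is the program name; args[1] is the first argument. */
int parse_iterations(int argn, char **args)
{
        if (argn == 2)
                return atoi(args[1]);

        return DEFAULT_NUM_ITERATIONS;
}
```

Called as parse_iterations(argn, args) from main(), this returns 7 when the program is run as "hello 7" and the default of 10 when run with no arguments.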

Control Structures


C has all the basic control structures you need: do-while loops, while loops, for loops, switch statements, if, if-else and goto. Their syntax is pretty simple, but it is important to follow some style conventions to make it readable:

int func(void)
{
        int i;
        int j;

        for (;;) {
                /* Infinite loop */
        }

        for (i = 0, j = 6; i < j; i += 2, j += 1) {
                if (i == 8)
                        break;

                if (j == 10) {
                        continue;
                } else if (j + i == 15) {
                        printf("foo\n");
                }

                j += 1;
        }

        i = 10;
        do {
                i = i / 2;
        } while (i > 0);

        while (i < 15) {
                i++;
        }

        return i;
}

All this is fairly straightforward. goto will be covered later when we get to handling errors. A while loop can be used to implement an infinite loop, but I prefer using an infinite for loop instead because it is more visually distinctive. Don't be afraid to use goto for jumping out of, into or above loops. Properly placed goto/labels can make code much more obvious and simpler.
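As a sketch of a goto which simplifies rather than obscures, consider escaping a pair of nested loops in one jump. The function and names here are made up for illustration:

```c
/* Search a 3x3 grid for a target value. A single goto leaves
 * both loops at once, which is cleaner than juggling a flag
 * checked in both loop conditions. */
int find_value(int grid[3][3], int target, int *row, int *col)
{
        int i;
        int j;

        for (i = 0; i < 3; i++) {
                for (j = 0; j < 3; j++) {
                        if (grid[i][j] == target)
                                goto found;
                }
        }

        return 0; /* Not found */

found:
        *row = i;
        *col = j;
        return 1;
}
```

The alternative, a found flag tested in both loop conditions, is longer and easier to get wrong.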

When it comes to generic loop counters do use i, j or k unless you have a good reason not to. These loop iterators are well known and short. If you are doing the equivalent of a for loop using a macro foreach-type construct the loop iterator should be named for what you are iterating over, e.g.:

list_foreach(fruit, fruits) {
        /* Do something */
}

It is good practice to write and use macro'd foreach constructs with non-array data structures since they make things clearer.
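A foreach macro of this sort is only a few lines. Here is a minimal sketch over a hand-rolled singly linked list; the node layout and names are assumptions for the example, not from any particular library.

```c
#include <stddef.h>

/* A bare-bones singly linked list node, invented for this example. */
struct node {
        int value;
        struct node *next;
};

/* Walk the list from head to NULL, hiding the pointer chasing. */
#define list_foreach(pos, head) \
        for ((pos) = (head); (pos) != NULL; (pos) = (pos)->next)

/* Example use: sum every value in the list. */
int list_sum(struct node *head)
{
        struct node *item;
        int total = 0;

        list_foreach(item, head)
                total += item->value;

        return total;
}
```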

Datatypes and Abstraction


The key in C when it comes to data structures is that the size of the data structure must be correct. If the size is correct then you'll be able to interpret those bytes in many ways. Some of them will even be correct. Let us start with basic data types:

#include <stdint.h>

int tmp;
char character;
char fixed_length_string[10];
char *pointer_to_string;
int8_t a;
uint8_t unsigned_a;
int16_t b;
uint16_t unsigned_b;
int32_t c;
uint32_t unsigned_c;
int64_t d;
uint64_t unsigned_d;
float f;
double g;

Use the int type for a general integer work type when you don't need large values. There are various other modifiers which you may see in older code, such as unsigned short int. Don't use them. Instead use the fixed sized integers from stdint.h or the equivalent. stdint.h also has a number of more specialized types for things like the most efficient signed type of at least a given number of bits. These can be advantageous on embedded CPUs where different integer widths can have widely differing performance and code size.
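One concrete benefit of the fixed sized types is predictable arithmetic. A uint8_t is exactly 8 bits on every platform that provides it, so unsigned overflow wraps at 256 everywhere; this small sketch shows that:

```c
#include <stdint.h>

/* Unsigned fixed-width arithmetic wraps modulo 2^width, so a
 * uint8_t sum is always reduced modulo 256, on every platform. */
uint8_t wrapping_add(uint8_t a, uint8_t b)
{
        return a + b;
}
```

Here wrapping_add(250, 10) yields 4 rather than 260, and will do so on any conforming implementation.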

A programming language with just basic types isn't that useful, so C provides structs and unions. A struct is simply an interpretation of an object (an object in C terminology tends to mean just a chunk of memory). As previously mentioned you can interpret chunks of memory in different ways; this is critical to the method of implementing polymorphic object-oriented programming in C. A union is an explicit extension of this interpretation concept: simply a compiler aid to interpreting the same object in different ways. This is most useful when you have dependent data which differ in type and you want to reduce the amount of memory (or bandwidth) an object takes. It can also be useful to reduce the amount of stack space a function requires, which matters in some environments such as kernel work.

struct config_msg {
        uint8_t type;
        union {
                uint64_t transaction_number;
                char username[32];
        } u;
};

int func(int value)
{
        union {
                struct success_msg success;
                struct failure_msg failure;
        };
        int len = 0;

        if (value == SUCCESS) {
                success.header.type = SUCCESS_MSG;
                success.value = value;
                len = sizeof(success);
        } else if (value == FAILURE) {
                failure.header.type = FAILURE_MSG;
                failure.value = value;
                failure.source = SOURCE_ANALOG;
                len = sizeof(failure);
        }

        return send(&success, len);
}

There are a few things to note about the above example which are quite important. The first is that, unless otherwise configured, the compiler is free to add padding inside a structure to ensure that the elements are aligned. In this case that means that there will be padding (usually 3 or 7 bytes depending on architecture) after the type element. You can disable this on a per structure definition basis, but how to do so is compiler specific.

Secondly, note that the members of a union do not have to be the same size. Any memory which is past the end of a member will simply not be accessed.
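
Both points can be checked with sizeof. A sketch (the type names are illustrative; __attribute__((packed)) is a GCC/Clang extension, shown as one compiler-specific way to disable padding):

```c
#include <stdint.h>

/* With padding: most compilers insert bytes after "type" so that
 * "value" is aligned, typically giving sizeof() == 16 on 64-bit
 * targets. */
struct padded_msg {
        uint8_t type;
        uint64_t value;
};

/* GCC/Clang-specific packing removes the padding, giving exactly
 * 9 bytes at the cost of unaligned accesses. */
struct packed_msg {
        uint8_t type;
        uint64_t value;
} __attribute__((packed));

/* A union is as large as its largest member; the uint64_t member
 * simply never touches the rest of the 32 bytes. */
union msg_body {
        uint64_t transaction_number;
        char username[32];
};
```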

Finally, a note about anonymous unions and structs or unions without type names. An anonymous union is a union which doesn't have a type name or a variable name, as is seen with the union in the function above. Use anonymous unions sparingly as they can confuse what memory belongs to which variables. Perhaps the only acceptable location is in function temporary variable definitions.

As shown above structs and unions don't have to have type names. This can be useful when creating a struct or union inside a typed struct to help organize the data. It can also be useful to construct a custom typed struct for limited use within a function. However, you should always have a variable name in that case, as in the case with the union u above.

Something you may see is typedef'ing structs, like so:

typedef struct index_type_t_ {
        uint8_t type;
        uint32_t index;
} index_type_t;

I don't recommend this in general usage because it makes the code harder to read while only saving a small number of typed characters. With a programmer's editor those last few characters likely aren't typed to begin with. The C language comes with a separate struct namespace and it would be foolish not to use it. Do note that there are use cases for typedef, and even for typedef'ing structs, when it comes to library interfaces and basic types. For example, it is entirely reasonable to typedef a uint16_t to be called index_t since that makes it easy in the future to change the size of index_t without having to check all the existing code line by line. Every use of index_t will still have to be checked for correct intermediate uses, however. Some code may store the value in an int, for example, which would make changing to a uint64_t incorrect on most platforms. It is good practice to use the correct type names whenever possible to avoid this problem; if you are manipulating an index_t you should use only index_t typed variables, even if it is currently the same as uint16_t.
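
A minimal sketch of that index_t pattern (the names come from the discussion above):

```c
#include <stdint.h>

/* Changing this one line later (say, to uint32_t) resizes every
 * index in the program, provided all code sticks to index_t. */
typedef uint16_t index_t;

index_t next_index(index_t current)
{
        /* The cast makes the wrap at the width of index_t explicit */
        return (index_t)(current + 1);
}
```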

static


Before we move onto object oriented programming in C and interface design we need to understand the static keyword. The static keyword has two major uses: defining a variable with a known and unchanging memory location and restricting the symbol visibility of a function to a single file.

The first use, static variables, is easy to understand abstractly, but as you'd need to understand linking (which I won't cover here) we won't go any deeper than that. There are two defining properties of static variables:

  1. Every static variable is initialized to zero at program initialization with no further work from the programmer.

  2. Every time a static variable is used from a function it is the same variable, holding whatever value it contained the last time it was used.
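
A small sketch of both properties (the function is hypothetical):

```c
/* "counter" lives at a fixed location, starts at zero with no
 * explicit initializer, and keeps its value between calls. */
int next_ticket(void)
{
        static int counter;

        return counter++;
}
```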

Static functions, as previously mentioned, restrict the visibility of the function to the current compilation unit (more or less the current file). This should be used whenever feasible to restrict functions to their minimum necessary visibility. Minimal visibility is advantageous because it makes it easier to refactor functions, delete unused functions and use short, descriptive names. It also reduces redundancy because forward declarations for static functions are unnecessary in most cases. Consider the name differences between these two functions:

int libname_component_toggle_fob(struct fob *fob)
{
        ...
}

int toggle_fob(struct fob *fob)
{
        ...
}

void foo(struct list *fobs)
{
        struct fob *fob;

        list_foreach(fob, fobs) {
                libname_component_toggle_fob(fob);
        }

        list_foreach(fob, fobs) {
                toggle_fob(fob);
        }
}

As you can see the shorter name is better as it makes the code easier to read. We can also be sure that toggle_fob() will never be used by name outside this file, which makes it easy to check every instance when we decide to rename it or modify what it does.

Object Oriented C


Many people believe that C is only a procedural language, and further that you need special language support to implement object oriented programming, especially polymorphism. This is not correct. In fact, object oriented programming in C is not only possible, it is not noticeably more complex than normal C programming and is more flexible than some object oriented languages.

The key to using the object oriented style in C is that you have to pass the object you are operating on explicitly to the methods. Usually this is done as the first argument. Past that there are two ways to implement object oriented programming in C. If you don't want polymorphism it's as simple as you would believe:

#include <stdint.h>
#include <stdlib.h>
#include <string.h>

struct fob {
        char name[10];
        uint16_t num;
};

void fob_init(struct fob *fob)
{
        memset(fob, 0, sizeof(*fob));
}

void fob_setname(struct fob *fob, char *newname)
{
        strncpy(fob->name, newname, sizeof(fob->name) - 1);
        fob->name[sizeof(fob->name) - 1] = '\0';
}

int main(int argc, char **argv)
{
        struct fob *fob;

        fob = malloc(sizeof(*fob));
        if (!fob)
                return 1;

        fob_init(fob);
        fob_setname(fob, "myname");

        free(fob);

        return 0;
}

Note the use of "sizeof(*fob)" and "sizeof(fob->name)". You can use sizeof on expressions like these and it is recommended to do so instead of the traditional tactic of defining the size separately, or writing "sizeof(struct fob)". Using the former forms when possible prevents errors when the type of fob is changed, since the correct size is determined by the compiler. It also reduces the number of definitions which have to be exported and named.

As you can see, object oriented programming without polymorphism in C isn't difficult. This form works equally well with objects allocated on the heap or the stack.

If you want to support polymorphism then a bit more work needs to be done, but that's to be expected given the greater capabilities. This is a simple example of one way to implement polymorphism. With more work and some helper macros it is possible, though not always advisable, to achieve usages which require less typing.

struct fob_ops;

struct fob {
        const struct fob_ops *ops;

        char name[10];
        int32_t value;
};

struct fob_ops {
        void (*setname)(struct fob *fob, char *newname);
        int (*getval)(struct fob *fob);
};

static void fob_setname(struct fob *fob, char *newname) { ... }
static int fob_getval(struct fob *fob) { ... }

static const struct fob_ops fob_ops = {
        .setname = fob_setname,
        .getval = fob_getval,
};

void fob_init(struct fob *fob)
{
        memset(fob, 0, sizeof(*fob));

        fob->ops = &fob_ops;
}

struct gizmo_ops;

struct gizmo {
        struct fob fob;

        int8_t type;
};

struct gizmo_ops {
        void (*setname)(struct gizmo *gizmo, char *newname);
        int (*getval)(struct gizmo *gizmo);
        int (*settype)(struct gizmo *gizmo, int8_t type);
};

static int gizmo_getval(struct gizmo *gizmo) { ... }
static int gizmo_settype(struct gizmo *gizmo, int8_t type) { ... }

static const struct gizmo_ops gizmo_ops = {
        .setname = fob_setname, /* Will produce a type warning */
        .getval = gizmo_getval,
        .settype = gizmo_settype,
};

void gizmo_init(struct gizmo *gizmo)
{
        memset(gizmo, 0, sizeof(*gizmo));

        /* Take the superclass view of the subclass ops table */
        gizmo->fob.ops = (struct fob_ops *)&gizmo_ops;
}

int main(int argc, char **argv)
{
        struct gizmo *gizmo;
        struct fob *fob;

        gizmo = malloc(sizeof(*gizmo));
        if (!gizmo)
                return 1;

        gizmo_init(gizmo);

        ((const struct gizmo_ops *)gizmo->fob.ops)->settype(gizmo, 6);
        ((const struct gizmo_ops *)gizmo->fob.ops)->setname(gizmo, "myname");

        fob = (struct fob *) gizmo; /* Upcast to the superclass */

        fob->ops->getval(fob);

        free(gizmo);

        return 0;
}

Note that in any list of items the last item should, where possible, also have the trailing separator. That way if an item is added after the current last item the code diff will be cleaner.

This example shows that while there is a bit more manual setup for polymorphism in C, the usage isn't too onerous in practice. The reason all of this works is that we take different views of the same memory object. The critical thing to remember is that the order of elements must be maintained. That is, in the ops structures the superclass operations must come first and in the same order. Similarly the superclass structure itself must be the first member of each subclass.

You'll note the awkward usage "gizmo->fob.ops->getval(...)". This can be avoided by having two definitions of each class's struct: the internal one, presented pretty much as above, and an externally visible one which contains the ops element first followed by a filler array that brings the external struct up to the same size as the internal one. This is often not necessary, as the majority of uses in a well designed system treat each (OO) object as if it were the superclass.

Error Handling and Function Documentation


It may seem odd at first that error handling and documentation are in the same section, however part of being a professional is doing the annoying things which help future developers. One key component of this is having good API design. Even if you can't have good API design you can still have good error checking and documentation.

Error handling and function documentation are, excepting one specific question, simple and formulaic. This is good, as being formulaic helps prevent coding mistakes.

/*
 * Whatchmacallit the whosit.
 *
 * Whosit the whatchmacallit as long as the stars are
 * aligned correctly. This function immediately performs the
 * action. It is assumed that the caller holds the gizmo
 * lock. The argument howhard is how hard to try
 * whatchmacalliting. A valid range is -1 through 13, the
 * meanings of these values described in the definition of
 * struct whosit.
 *
 * Returns:
 *    EINVAL    - The whosit is not initialized
 *    EDOM      - howhard is out of range
 *    EEXIST    - Action does not exist
 *    ETIMEDOUT - Authentication server timed out
 *    ENOENT    - Failed to succeed, try harder
 *    ENOMEM    - Failed to allocate necessary memory
 */
int whosit_whatchmacallit(struct whosit *whosit, int howhard)
{
        int result;
        struct action *action = NULL, *a;

        if (howhard < -1 || howhard > 13)
                return EDOM;
        
        if (!whosit || !whosit->initted)
                return EINVAL;

        /* Grab a reference to the global list so it will
         * not disappear from under us
         */
        get_list(&global_action_list);

        foreach_list_entry(a, global_action_list) {
                spinlock(&a->lock);
                if (action_matches(a, whosit->action_num)) {
                        action = a;
                        break;
                }
                spinunlock(&a->lock);
        }
        if (!action) {
                result = EEXIST;
                goto out_nounlock;
        }

        get_action(action);

        whosit->lock(whosit);

        result = check_authentication(whosit);
        if (result == ETIMEDOUT)
                goto out;

        whosit->tries = howhard;

        whosit->data = malloc(DATASIZE);
        if (!whosit->data) {
                result = ENOMEM;
                goto out;
        }

        result = whosit->try(whosit);
        if (result != 0)
                goto err_out;

out:
        whosit->unlock(whosit);
        put_action(action);
        spinunlock(&action->lock);

out_nounlock:
        put_list(&global_action_list);

        return result;

err_out:
        free(whosit->data);
        whosit->data = NULL;
        goto out;
}

When it comes to documenting a function there are a few mandatory parts. First is a short description of the function. Some manuals recommend that this be a single line. I personally don't mind up to three lines, but it should be a short summary. After this comes a more complete description of the function, including any documentation about the arguments and where to find more information if this isn't the canonical source. All side effects and other calling requirements (such as which locks must be held) are also mentioned. Finally there is an exhaustive list of error codes this function can return along with the cause of each one. This includes error codes which may be passed up from functions this function itself calls. While it is possible for a developer to walk the code when the source is available, they shouldn't have to.

An important consideration is where to put this large block of documentation. There are two different spots and which is appropriate depends on what kind of code you are writing. If you are writing code where you expect the user to have the full source code, such as a function in an application, then put this documentation block with the definition of the function as above. The reason for this is that it is usually easier to jump to the definition of a function than its declaration. This is especially true if you have multiple versions of the function (for different platforms perhaps). If, however, this is a library function where you expect that the developer will not have access to the full source code, then this block should go with the function declaration in the header file.

As this example shows it is not only acceptable, but preferable, to error out early and to use goto. On erroring out early: there is a school of thought that every function should have only one return point. This causes terrible nesting as error cases pile up. It is better to use early returns, as intended, to check preconditions.

Once the main body of the function has started though you often can't just return at random. There will be cleanup required. There are three ways to handle this. The first is to copy and paste the code into every area where exiting is required. This is obviously not recommended in the general case because code changes and copy and paste mistakes are very common. The second is to nest the code such that the appropriate cleanup code is always run, which is difficult to read and modify. The recommended approach is to use goto as in the example above.

As an aside, goto is not evil, nor is it to be avoided at all costs. The original paper (Go To Statement Considered Harmful) which decried the use of goto is usually taken out of historical context. When it was written the predominant languages of the time didn't provide support for structured programming; they didn't have language constructs for for loops or switch statements. Conditionals and goto were all most programmers had, which made code more confusing than necessary. Without goto, algorithms are often forced to use trigger variables to exit loops or express any especially complex flow. You must be responsible in your use of goto, but it is still a tool with a valid place in the professional toolbox.
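
For example, escaping a nested loop is one place where goto is clearly simpler than the trigger-variable alternative (this helper is purely illustrative):

```c
/* Without goto this search would need a "found" trigger variable
 * checked by both loops; with goto the flow is direct. */
int find_in_grid(const int *grid, int rows, int cols, int needle,
                 int *row_out, int *col_out)
{
        int r, c;

        for (r = 0; r < rows; r++) {
                for (c = 0; c < cols; c++) {
                        if (grid[r * cols + c] == needle)
                                goto found;
                }
        }

        return 0; /* Not found */

found:
        *row_out = r;
        *col_out = c;
        return 1;
}
```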

As this example shows, the use of gotos allows a clear implementation of complex cleanup operations which must be partially performed depending on where the function hit an error. This is done without repetition of code and without cluttering the normal control flow. The key to this ability is that the cleanup operations are listed in reverse order to when their dirtying operations were performed in the function. It is then a simple matter of jumping into the cleanup at the appropriate place on error. It is also possible, as seen above, to have a special error exit path after the final return to handle cases which differ from the normal cleanup actions.

Whenever possible you should use standard existing error codes. This makes it easier to get some initial sense of the cause of the error. ETIMEDOUT may not have a text definition which matches the use here, but it does extend the general idea of the error.

Callbacks


Callbacks are a common sight when dealing with complex data structures or library code. Contrary to popular opinion callbacks, especially immediately executed ones, are rather simple in C. By immediately executed I mean the callback is passed to a library function, perhaps to iterate over a complex data structure, and once the flow of execution has returned to the calling function the callback function and context are free to be released.

struct walker_context {
        int num_processed;
        int action_type;
};

static int walker(struct element *element, void *data)
{
        struct walker_context *context = data;

        if (!element) {
                /* Perform final iteration work */
                printf("Processed all %d entries\n", context->num_processed);
                return 0;
        }

        context->num_processed++;

        switch (context->action_type) {
        case 1:
                action1(element);
                break;
        case 2:
                action2(element);
                break;
        }

        return 0;
}

int act(struct list *list, int action)
{
        struct walker_context context;
        int result;

        memset(&context, 0, sizeof(context));
        context.action_type = action;

        result = foreach_list_element(list, walker, &context);

        return result;
}

We see that, for an immediately executed callback, things are pretty simple. Define the context structure your particular callback will use. Then define your callback, making it static if possible. Inside this function you take the void * data passed to you and immediately convert it into the context structure the function expects. Note that no cast is necessary, nor should one be used, since void * converts implicitly in C. Many libraries pass in a NULL element when the iteration is complete, allowing the callback to be self-contained in its action and to know when processing has finished. Usually there is one return value meaning "continue processing" and another meaning "error, stop processing".

Since the callback is executed immediately and not stored for processing later, as would be the case if this callback were to be called because of some event, we can allocate the context structure directly on the stack. Initialize the context and then pass everything to the library function, foreach_list_element in this case. That's all there is to it really.

Style


Formatting C code is a subtle task, but there are a few guidelines:

  1. Use whitespace. Whitespace makes code easier to parse by reducing its density.

  2. Align when it makes sense.

  3. Use vertical space. Much like prose is separated into paragraphs, code should be separated into sections by vertical space. Doing so makes it easier to know which chunks must be digested at one time. One blank line is usually sufficient.

  4. Don't indent too deep. Deep nesting makes logic more difficult than necessary to follow. Either reformat the code to nest less deeply (perhaps with some careful use of gotos) or factor some of the inner code out into a separate function.

  5. Follow the coding standard. You may not personally agree with the coding standards of the project you work on, but as a professional it is nonetheless your responsibility to follow them. Starting with a good coding standard (such as the Linux kernel coding style) makes life easier, but being consistent is the most important factor.

Beyond formatting there are also elements of style concerning how the code is written. In general the code should be written in the simplest manner which doesn't negatively affect the performance or conciseness of the code. Macros should only be used when they provide a repeatable and obvious benefit. Sometimes repetitive code is acceptable.

Next Steps


This article is not an exhaustive description of all the tricks and situations a modern professional C programmer may come across. Instead it is merely a quick summary. For a large set of examples you should look at the source of well written C code bases. Examples of this include Linux, QEMU, and many others. As with most disciplines the best way to learn is to watch those more experienced.

Bathtub Plains

Bathtub Plains: The flat part at the bottom of the failure curve. Many electronics never make it to the Bathtub Plains because they die early. Of course, many people don't keep their electronics that long anyway.

Outlook Killer

I don't know anybody who claims to like Outlook and yet you see it used in corporations everywhere. There are several good reasons for that. I'm going to tackle the calendaring abilities. Don't make the mistake of assuming that you can replace Outlook without doing calendaring.

These are the critical requirements to match Outlook's calendaring abilities. If you fail to meet or exceed any one of these you will fail to replace Outlook.

  1. The ability to look at coworkers' schedules when planning a meeting.

  2. Updating or rescheduling a meeting is trivial.

  3. Events work exactly like email. You can email all the attendees of the event without any extra work. Just open the event and reply to all the attendees.

  4. The calendaring must be in the same application as email. If a separate application is required then it will not be used as often as it should and the integration won't be as good as it needs to be. Corporations use meeting invitations primarily as email threads which just happen to have reminders and update the available schedule for item one above.

  5. As a corollary to three and four above, event invitations must work with lists of people. These should be already-existing lists, such as those already used for email. Nobody will create lists just for calendaring.

  6. Moderately complex recurrence setups must be supported. If I want a meeting to happen every Monday and Tuesday forever then that should be easy. It must also be easy to move or cancel any single occurrence.

  7. Timezones must be supported mostly correctly. If I send an invitation to somebody then they must see the invitation in their current timezone. Furthermore, if they proceed to change timezones then the reminder must occur at the correct time.

  8. The calendaring must be useful enough to be used standalone for day-to-day calendaring. If it is missing any critical calendaring features then it won't be used, and the schedule lookups from item one above won't be useful.

  9. It isn't a web app. I know that all the cool kids think web apps are the future and no developer would be caught dead in this day and age without a 24/7 network connection. But that doesn't matter. People who use Outlook for real work and real calendaring sometimes do so disconnected (airplanes) or with bad connections (hotels or that office wifi which never really works in that far meeting room). They need to be able to work without a network connection and they need to be able to access their archives. These archives will be several gigabytes in size. Additionally, if the reminders aren't reliable then nobody will use the software. This itself necessitates a local component.

  10. You must be able to book rooms as well as people. Sure you, as a developer may not have ever actually booked a room, but the people who spend the money have.

  11. It must be possible to easily schedule meetings with people in a different organization than you. Back when inter-organization calendaring was a new thing this was less important. But now it matters and that means interoperating with the leader, Outlook.

  12. Invitations must support positive confirmation of receipt and of attendance intent. No excuses.

Outlook certainly isn't the best of email clients. In fact it is likely the cause of many of the worst abuses and most ineffective email practices in the world. But even if you hate it, Outlook is well adapted to the corporate environment and calendaring is one key to that success.

Fun in the Slow Lane

Recently I was reading some posts from a developer who lives in a very rural area in a house on solar power. You can read about it here and here and here. The short of it is that he spends most of his time accessing the Internet via dialup and he has a few coping mechanisms.

It's been quite a while since I've used dialup so I was interested to try it out again, but I don't actually want to have to set up a dialup account. Instead I looked up how to throttle bandwidth on my laptop. On my Mac you can turn your connection into a modem for most purposes (http, https, irc, ssh and dns) with the following script:

sudo ipfw pipe 1 config bw 3KBytes/s delay 150ms
sudo ipfw add 1 pipe 1 src-port 80
sudo ipfw add 2 pipe 1 dst-port 80
sudo ipfw add 3 pipe 1 src-port 22
sudo ipfw add 4 pipe 1 dst-port 22
sudo ipfw add 5 pipe 1 src-port 6667
sudo ipfw add 6 pipe 1 dst-port 6667
sudo ipfw add 7 pipe 1 src-port 443
sudo ipfw add 8 pipe 1 dst-port 443
sudo ipfw add 9 pipe 1 src-port 53
sudo ipfw add 10 pipe 1 dst-port 53

You then turn your network connection back to normal when you've had your fun with:

sudo ipfw delete 1
sudo ipfw delete 2
sudo ipfw delete 3
sudo ipfw delete 4
sudo ipfw delete 5
sudo ipfw delete 6
sudo ipfw delete 7
sudo ipfw delete 8
sudo ipfw delete 9
sudo ipfw delete 10

Have fun surfing in the slow lane and remember, you can turn off automatically loading images.

The Networked Future is Local

Modern telecom networks are an amazing thing. I can trade packets between two semi-remote areas across the world with ease. I can be on the move in a car on the highway and be working at the same time on a server thousands of kilometres away. It's a great thing that works often enough. Network performance has been increasing all around for years, which is good because the amount of data we want to push through the network has also been constantly increasing. Unfortunately we are quickly reaching limits of physics with respect to the performance of these networks and that will require a change in how we use the network.

The first and most obvious limit is one of energy. We all love our mobile devices, but the batteries never last long enough. Battery technology isn't making significant headway and the radio is already a significant power drain on modern smartphones. In the quest for longer battery life we will have to transfer data more efficiently and transfer less.

The second most obvious limit is the speed of light. We just can't beat the speed of light, but we are sure putting our best foot forward to match it. Currently a good latency time for the Internet is within a factor of two of the latency of light over the same distance. We can help this along slightly with better technology and more significantly with more direct cables, but we will end up at the limit sooner than later. Latency has a significant effect on the type of activities which can be performed without annoying the user. If you've ever had a transcontinental phone call you know what I mean, you are always interrupting each other while you wait for your voice to span the globe. Even today many websites use CDNs to move parts of their data geographically closer to the end user to reduce the pain of latency.

The third limitation is bandwidth. Wireless bandwidth, unlike wired bandwidth, is limited by the laws of physics to comparatively low rates. Worse, that bandwidth must be shared by all the users near you. Thus while you can get hundreds of megabits of bandwidth to your desk if you pay enough, you'll be hard pressed to get much more than tens of megabits on any cell network.

All these limitations make it advantageous to store as much of the data you are working on locally as possible. It isn't possible to store all of the data, but if you have 95% of it then you can save a lot of time waiting for the rest to be transferred over the thin straw of cellular Internet. The rise of DVCSes is just one indication that local copies of most of the required data are coming back into style. In fact some systems, like email, have worked this way from the beginning.

I would be on the lookout for things which leverage the network, but insulate themselves from slow networks by storing as much data as necessary locally. DVCSes are one example, but distributed bug trackers are also gaining mind share. It may not be long before you don't have to fear that wiki server going down or being on the other side of the world, because it's easy to have your own local copy to hold you over until it is restored.

Basics of Effective Corporate Email

Email is perhaps the most used communication method in the world after human speech. Email is, in many cases, the only communication mechanism worth talking about at large corporations. It is unfortunate that there is no guidance given on how to use email effectively in a corporate environment. Instead people are left to figure it out and many just continue as if their corporate email account is just a busier personal account. This doesn't scale.

Before you can effectively use corporate email you need to understand a few features of email. The first is the concept of the inbox. You may have seen a picture of the desk of a desk worker from the early 20th century. On that desk you would see a blotter (that big piece of paper in the middle of the desk, often a calendar), an assortment of pencils, pens, paper clips, letter openers and at least two baskets. One would be labelled "Outbox" the other "Inbox". The theory was that throughout the day the mail boy would deliver mail as it arrived into the Inbox and take completed mail from the Outbox to be delivered. Sometimes there would be a third basket for items which couldn't be handled immediately.

In modern email the Outbox still exists, but is usually empty as email is sent immediately. The only times it tends to get used are when handling email offline, in which case outgoing email is queued to be sent the next time a network connection is available, or when the email server is down. The Inbox, of course, still exists. I'll come back to how to use the Inbox effectively.

Email also has three different addressee fields: To, CC and BCC. To should contain the recipients you want to respond to or act on the email. CC should contain the recipients you want to read the email, but from whom you don't expect a response or comment. CC is mostly for ensuring that people are kept up to date. BCC is used to send a copy of the email to recipients without the other recipients knowing that they received it.

One major difference between postal mail and email is that replying to a large list of people is incredibly easy. No longer do you have to write one copy of the response and then have a secretary make copies; the computer does it all at the touch of a button. In the corporate environment you should never use the "Reply" button. Instead you should always use "Reply-all" and then trim the To and CC lists as necessary. This requires two important rules be followed. The first is to NEVER respond to an email unless a response is required or you have something useful to add. If you receive a mass email erroneously, ignore it. Responding will simply cause more mass mails to be sent out. If the erroneous emails persist, respond to the sender directly, not the entire list of recipients. The second is to trim the recipient list as the discussion evolves. It often happens that a person no longer has any useful input, for example if they were added to the discussion to help with one portion of an issue which has since been resolved. Should this happen the individual may be removed by moving their name to the BCC recipient list with a note in the message saying that they are being removed from the discussion. This saves the removed individual time, gives them the opportunity to re-add themselves and keeps everything polite. Future responses started using the "Reply-all" button will work as expected.

The rules to follow to make effective use of the Inbox are rather simple. The key is to keep the Inbox as empty as possible, which is easily done as long as you follow a few simple guidelines. The first is that the Inbox is not a TODO list. It can serve that purpose only for items which will be done in the next day or two; if the action is further out than that you are better off using a proper TODO list, even if that list is on paper.

The second guideline is to respond to email in a timely manner. This doesn't mean that you have to continually check your email and respond the instant a new message arrives; in fact that is a recipe for being unproductive. Instead, when you reach a point where handling email is the right thing to do, you should respond to as much email as you can. After you have handled a message (which could mean just reading it, sending a quick response or taking a more involved action such as looking some information up) you must move it out of your Inbox into another email folder.

There are differing schools of thought on how to handle email folders. Some people have quite intricate folder arrangements. I prefer a simpler approach of a single email folder (which is not the Inbox) for each project. Given the search capabilities of the modern computer I find any more granular manual sorting to be a waste of my time. Of course if you have automated emails which are more specialized in nature, feel free to set up filters and other email folders to store those items. It's useful to keep such noise out of the primary Inbox.

There are two states for a message in an Inbox: read and unread. When handling emails you should always deal with all the unread messages first, oldest to newest, and then all the read messages, newest to oldest. This ensures that messages get the most timely response, which makes your coworkers more effective and more likely to help you in the future. Here are the steps which should be followed when handling your email:

  1. For every unread email, oldest to newest

    1. If a response is not required, move it to the archive folder

    2. If a brief response is required, send that response immediately

    3. If a lengthy response or action is required, mark the message as read and move to the next message.

  2. For every read email, newest to oldest, including messages marked read above

    1. Is the email still relevant? Perhaps a new message has made it no longer require a response from you. If so, move it to the archive folder.

    2. Do you know what you need to know to write the lengthy response, or have time to perform the action? If so, do so and then move the message to the archive folder.

    3. Has this email been in your Inbox for more than a week? If so, put the action on your TODO list and move the message to the archive folder. If the item doesn't belong on your TODO list, perhaps because you don't actually intend to perform the action, simply move the email to the archive folder.

If you follow this algorithm to handle your email you should never have a constantly growing Inbox. If you follow it and still find yourself falling behind, you are likely just overworked and need to find some way to reduce the amount of email you receive. Delegation works reasonably well.
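For the programmatically inclined, the two passes can be sketched in shell over a maildir-style layout. Everything here is an illustrative assumption: new/ stands in for unread messages, cur/ for read ones, archive/ for the project folder, and a body containing ACTION stands in for the human judgement of whether real work is required.

```shell
#!/bin/sh
# Illustrative triage sketch; the directory names and the ACTION
# marker are assumptions, and the real "does this need a reply?"
# decision is a human one.
cd "$(mktemp -d)" || exit 1
mkdir -p new cur archive

# Two sample messages: one informational, one needing real work.
printf 'FYI only\n' > new/1.msg
printf 'ACTION: review the patch\n' > new/2.msg

# Pass 1: unread messages, oldest to newest.
for msg in $(ls -tr new); do
    if grep -q 'ACTION' "new/$msg"; then
        mv "new/$msg" cur/        # lengthy work needed: mark read
    else
        mv "new/$msg" archive/    # no response required
    fi
done

# Pass 2: read messages, newest to oldest.
for msg in $(ls -t cur); do
    # ...write the lengthy reply or perform the task, then:
    mv "cur/$msg" archive/
done
```

The point of the sketch is only the shape of the two passes and that every handled message leaves the Inbox; the decisions inside the loops can't be automated.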

One True Way: Tabs

One of the endless holy wars of computing is tabs versus spaces in code. Basically every corporate coding policy dictates using spaces. Many open source communities (such as Python) dictate spaces as well. One notable exception is Linux.

All the people who recommend using spaces are wrong. The one true way to do indentation is tabs for indentation and then spaces for alignment. Specifically, tabs should never appear after any character on a line which is not a tab. This has one advantage indenting with spaces cannot match: the programmer can view the code with whichever indentation suits them best. If you have a low resolution monitor and fresh eyes, go right ahead and use two character indentation. After you've been coding for a day and a night on your high resolution 27" monitor you can move up to eight characters. More standard coders can use four characters. That one weird guy who likes three characters will be happy too.

With tab indentation and space alignment a developer can see any of these options whenever they wish without changing the source code. All it requires is a programmer's editor and a minor configuration to your terminal which I describe below.

Eight character indentation:

#include <stdio.h>
#include <netinet/in.h>

struct foo {
        struct sockaddr_in ip;
        int                id;
        char               server_name[1024];
};

Four character indentation:

#include <stdio.h>
#include <netinet/in.h>

struct foo {
    struct sockaddr_in ip;
    int                id;
    char               server_name[1024];
};

Three characters:

#include <stdio.h>
#include <netinet/in.h>

struct foo {
   struct sockaddr_in ip;
   int                id;
   char               server_name[1024];
};

Two characters:

#include <stdio.h>
#include <netinet/in.h>

struct foo {
  struct sockaddr_in ip;
  int                id;
  char               server_name[1024];
};

All the alignment stays consistent since it is done with spaces. If tabs were used in the alignment then it wouldn't look nearly as good. Here we have the same code aligned with a combination of tabs and then spaces. At eight character tabs it looks fine:

#include <stdio.h>
#include <netinet/in.h>

struct foo {
        struct sockaddr_in ip;
        int               id;
        char              server_name[1024];
};

However, when we render with two character tabs it no longer lines up:

#include <stdio.h>
#include <netinet/in.h>

struct foo {
  struct sockaddr_in ip;
  int       id;
  char      server_name[1024];
};

So you should always indent with tabs and align with spaces. Most good editors have options which make this easy. Vim, for example, lets you keep hard tabs with the 'noexpandtab' option and, with the automatic indentation options set correctly, it will handle the alignment with spaces for you.

Editors are one thing, but getting diffs and the like working in terminals is another. Until just the other day I didn't know how to fix that problem easily. However, I have discovered the tabs(1) utility. It's available at least on OSX and Linux, so I expect that it is available everywhere. With this utility you can set the tab width of a terminal with ease. Four character indentation is only a 'tabs -4' away.
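A quick way to see the effect without reconfiguring your terminal is expand(1), which renders hard tabs at a chosen width. In this sketch, modelled on the struct examples above, the leading hard tab is indentation and the run of spaces before id is alignment; only the indentation width changes between the two renderings:

```shell
# One line from the struct: a leading hard tab for indentation,
# then spaces for alignment.
line="$(printf '\tint                id;')"
printf '%s\n' "$line" | expand -t 8   # rendered with eight column tabs
printf '%s\n' "$line" | expand -t 2   # rendered with two column tabs
```

Running both pipelines shows the alignment spaces surviving untouched while the indentation shrinks from eight columns to two.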

Distributed Bug Tracking

Distributed bug tracking as an idea has been floating around the Internet for six or seven years now. And there have been several attempts:

Unfortunately each of these suffers from some combination of the following problems:

  • Being unmaintained

  • Not providing a graphical interface

  • Being VCS specific

  • Being a VCS (Fossil, Veracity)

  • Having issue formats which don't merge well

  • Not easily tracking the state of a bug inside a specific branch

  • Seeming to not dogfood themselves

I have been interested in a distributed bug tracker to use with my personal projects for several years, but the field never seemed to improve. The leading options always seemed to have one of the issues listed above. I tried a couple and found them lacking to the point where I would quickly stop using them and revert to TODO lists.

Finally I truly needed a distributed bug tracker. So I broke down and wrote one. You can find the manual for Nitpick here. Nitpick avoids all the issues above while being simple and lightweight enough to start using quickly.

Roundabout Here

How can one go on a road trip and not discuss the roads? I, for one, won't be the first to start. So let's have a brief summary of the roads in New Zealand!

The roads in New Zealand are actually rather good overall. They are well constructed with clear signage which means business. If you see a sharp corner sign then you know that a sharp corner is coming up for which you must slow. The vast majority of the roads are two lane highways with grassed shoulders. There is quite little asphalt; most of the highways are chip and seal. This makes for quite a bit of road noise, but traction seemed good in most cases. At the very least water tends not to pool on the road, though you'll still pick it up with the ground suction as you pass over it. Nicely, road glare is kept to a minimum.

Then there are the roundabouts. They are used everywhere possible, to good effect. It's a bit of a shame that Canada doesn't have as many roundabouts, as they are fuel efficient and keep traffic moving. They are fuel efficient in that, except during heavy traffic, you tend not to have to stop the car and then accelerate again. You also spend very little time idling waiting for a light.

Now the key to a good roundabout is size. The traffic circles I've seen in Canada tend to be too small. Roundabouts really must be large enough that the inner circle can go around at 30km/h without too much trouble.

Traffic in New Zealand tended to be light and polite. It was a lot like driving in the Maritimes. People will pull over to the side of the road to let you pass if you drive like a standard Lower Mainlander. Now I didn't drive myself, Don did all the driving, but on those long highway stretches we had plenty of time to discuss it. The New Zealand speed limit is 100km/h. I heard that there was some talk of increasing it, but I don't think that would be in New Zealand's best interest. That speed is not too slow to make good progress through the country. Overall the roads are good and make for nice drives, except when the high desert roads are fogged in and it's raining cats and dogs.

Things I Wish I Brought

As with any trip you discover things you wish you had brought and things you wish you had left during the course of the trip. These are the things I wish I had brought.

The first is a small travel power bar. I had the necessary travel adapter and had confirmed that my devices would work with the simple adapter. However I only had one. These days the standard traveller carries at least a laptop, phone and camera. It may be easy enough at home to charge all three of those devices at once, but if you only have a single adapter it becomes more difficult. I do wish that I had brought a small three socket travel power bar to split the adapter. It would also have helped with the fear I felt when hanging my laptop charger off a wall socket, held up by nothing other than the travel adapter. I will definitely bring one along the next time I leave North America.

The second thing I wish I had brought was some string or light rope to fashion into a strap for my water bottle. I really detest travelling without water, so the first thing I did upon arriving in New Zealand was buy a bottle of water to refill throughout the trip. That certainly worked well, but I had no convenient way to carry it. I was fine in situations where I could have my backpack, but that isn't always possible. Five feet of light cordage would have provided me a solution to this problem. Next time I'll make such a sling before I leave for the airport.

More British Than You

Many countries across the world are more or less British. This isn't surprising as most of them are former colonies of Britain. There is a definite gradient however. At the low end you have the USA. It almost seems that they made a consistent and conscious effort to avoid being British.

Take the colour of post boxes. In Canada, New Zealand and apparently Britain they are red. I think they tend to be blue in the USA; at least the postal colour there is blue, not red. Then you have the accents. Britain, Australia, New Zealand and South Africa all have British-esque accents. They aren't the same, but they resemble each other. The USA accent differs significantly.

Having visited neither Australia nor South Africa, I must ask you to take this next comment as baseless supposition, but I believe that among the countries under discussion (Australia, Canada, New Zealand, South Africa, USA) the next least British are the Australians. At least in the past couple of decades it has seemed that Australia has positioned itself to be more similar to the USA than to the prototypical British colony.

This is partially borne out by the relationship between Australia and New Zealand. It is quite similar in many respects to the relationship between Canada and the US. For one thing Australia has a larger population and seems more willing to toot its own horn than New Zealand. Then there is the fact that many Kiwis head over to Australia to try to make their fortune in much the same way Canadians sometimes do. Of course, being so close there is ample tourism between the countries, with Australia having the sunnier beaches. I have heard that Kiwis get similarly insulted if you believe them to be Australian.

So I believe the comparison is apt. But how does New Zealand compare to Canada on the colony scale? I would have to say that New Zealand is, without a doubt, more British. They drive smaller cars on the left side of the road, they have the British-esque accent, they eat more meat pies. They also seem to not believe in insulation or double glazed windows. I suppose it makes perfect sense. New Zealand is still a remote country and Canada has strong French and American influences.

Brown Custom Code and Bits Co.

An interesting aspect of New Zealand culture is that they name things after people which wouldn't be named that way in North America. Take Dick Smith's for example. This is a chain electronics store. I saw several examples of chains or otherwise medium sized enterprises named after people. You don't see this much any more in North America.

I doubt there is any deep meaning behind this observation, but it could be because of the smaller scale of the country. With only 4.3 million people spread rather thinly over the two major islands it's likely possible for a family owned business to make a niche for itself and remain. I don't know however, it could be something else.

World of Tomorrow, Music of Yesteryear

Whenever I travel I like to listen to the local radio to get a feel for the local culture. New Zealand has been no different. The first thing to note is that New Zealand seems to only have about four radio stations, which are retransmitted all over the country. For example, there was one radio station we were listening to a couple hours south of Auckland. When we got out of range of that transmitter we assumed we'd never hear from the colour commentary again, which was a shame since they were pretty good. However, no sooner did we go to find a new station than we found the same people on a different frequency. In fact, we were able to listen to this station on and off as far as we travelled, North and South Islands.

Of course I spent some time listening to Radio New Zealand. There are two stations: a traditional variety station, similar to CBC Radio 1, and a music station, which seems to play concert music, though I must confess to not having listened to it much. The content was pretty much what you'd expect: some news, some intellectual discussion and commentary on current events. There was also one by-request music show which had a quite eclectic mix.

As to the content of the rest of the stations, I'm not sure if it's the people I am driving with or just New Zealanders, but it seems that most of it is older rock. There was one country station, but that station was vetoed almost immediately. The rest of the stations we've listened to have tended to be rock from the 90's and early in the previous decade. This isn't even us being picky. Since we are touring we are covering a lot of ground and moving out of transmitter range frequently enough that we tend to stick for a little while on the first station we find.

It isn't just on the radio where I heard this older rock. At the conference we were mooching off of, there was a banquet with a band. The band was quite good, but tended to cover older rock tunes. Maybe this is just because of the type of people who were attending the conference.

Listening to local radio is an interesting view into the local culture. New Zealand may be the world of tomorrow, but its music seems to have come from yesteryear.

Christchurch

Today we visited Christchurch. The condition of Christchurch is better than I had feared. It seems that most of the city either escaped severe damage or has been repaired. However, a large portion of the city centre is off limits to the public.

This Red Zone is a very large demolition and construction area. Unfortunately it is so expansive, and the military guard at every entrance so alert, that you can't get in to see much of the centre. This includes the Christchurch Cathedral. However, we walked the perimeter and found one street view of the cathedral two blocks away. It wasn't a good view, but it was a view.

The feel of the city is a bit odd. Many of the buildings look brand new or recently refurbished. However, interspersed are damaged buildings and buildings which are a bit rundown. It is all lively, however. It is unfortunate that we won't have much time to stay, but the obvious construction boom makes staying difficult since most of the rooms in the city are taken and many roads are blocked off.

I think Vancouver would be lucky to look as good as Christchurch two years after the big one.

Clean SVN URL Update

It turns out that you don't need to modify your subversion configuration file to use the SSH wrapper script. The default setting first checks the SVN_SSH environment variable, so you can put the path to the script there instead. This is very useful if you share your configuration files among many machines, since the shell has conditionals.
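As a sketch, the shared shell start-up file might guard the setting like this; the wrapper path matches the one suggested in the original post, but is of course whatever you chose:

```shell
# Shared ~/.profile fragment: only point svn at the tunnel wrapper
# on machines that actually have it installed.
if [ -x "$HOME/.subversion/svnssh.sh" ]; then
    SVN_SSH="$HOME/.subversion/svnssh.sh"
    export SVN_SSH
fi
```

On machines without the wrapper, SVN_SSH stays unset and subversion falls back to its default ssh tunnel command.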

Meat and Potatoes

Local cuisine is one of the joys of travel for the open minded. As long as you don't mind asking what it is after you have tried it you will do fine. Now this is New Zealand, not a place known for its exotic tastes. What it does have are meat pies.

I am a fan of meat pies. What is not to like about a nice flaky crust surrounding slow cooked meats and vegetables doused in a nice thick, flavourful gravy? So the abundance of meat pies pleased me. In fact, you can get meat pies and other savoury pies just about everywhere except high end restaurants. Cafes, street bakeries, gas stations. They all have pies and they are all quite good.

The only fault of the pies here is the lack of vegetables. The pies could be much improved with some more carrot and peas. This problem is not restricted to the pies however. It was generally agreed among my travelling party that the food here, while good, was light on vegetables and fruit. You got ample meat and potatoes and pastry, but little of the good for you stuff. I also noticed that vegetables seemed quite expensive at grocery stores.

Other than that one fault I had, on my journey, excellent: beef, pork, chicken, fish, venison, lamb, mutton and shellfish (This bad grammar left in because it makes Courteney hurt inside). I didn't see until late in the trip that there are some free range rabbits, and so didn't get a chance to try that. Vegetables may be in short supply, but the Kiwis know their meat and potatoes.

Travelling Rich

I once read something to the effect that the truly wealthy of the world don't interact with the real world much at all. The theory goes that as they travel the world they stay in hotels which are all high end and pretty much the same, they eat high end food in French styles, they shop in high end shops which are all pretty much the same. The only big differences are the local language, local architecture and the local currency, when they bother to concern themselves with that.

Now I'm not going to claim that I have travelled rich in this way, but I believe I've had a bare taste of it. Nothing but fine meals and controlled tours for several days along with this conference. From what I've seen it's pretty true. It's an experience you can get in any large city in the world and it's quite difficult to tell where you are if you don't look for it.

I suppose it appeals to some, but I think it's something people would do not because they can or because they enjoy it, but because it is expected. It's not the sort of travel for me. I prefer to travel in minimal comfort. Rent a car, stay in affordable hotels, eat at common restaurants. Actually experience the country.

Kiwi Efficiency

Have you ever seen the road work signs going up on a major route and dreaded the coming weeks of traffic snarls and rough roads? It would seem that isn't the way it happens in New Zealand. I woke on Saturday morning to see a major street in Auckland half blocked off with trucks parked off into the distance. When I returned into the city that evening nearly the entire street had been stripped for resurfacing. We woke Sunday to work already beginning as soon as legal. Upon returning in the evening on Sunday we saw the entire road section resurfaced. Fresh lines were painted after we returned from dinner.

That's the way road work should be done. Quickly and cleanly.

I'm not sure if this is indicative of any general trend, but I have noticed a few other things related to efficiency in my time here. The first is that neither of the two hotels I've seen has built-in air conditioning or central heating. I have seen or heard ads for heat pumps, house insulation, efficient light bulbs, water conservation and other minor conservation measures. As I mentioned previously, vehicles tend to be smaller, but I did see a television commercial promoting fuel efficient vehicle choices and choosing just as much vehicle as is required.

I get the sense that environmental concerns are at the forefront of the public consciousness. Perhaps this is due to the environmental devastation which was wrought on the unique ecology of New Zealand through the centuries. There are strong movements against, and punishments for, further destruction of the few remaining pristine wilderness areas on isolated islands.

Then there are the furniture and the houses. All the furniture I have seen has been minimalistic and well made. This is in contrast to the trend of the past few years in North America towards ever larger, oversized furniture. I will admit to not having seen much non-commercial furniture, but all the hotel and restaurant furniture I have seen is well made. Perhaps this comes about due to a relative scarcity of local wood suitable for constructing furniture.

I haven't seen any house interiors, but in general the houses appear nowhere near as large as in North America. The parts of New Zealand I have seen so far are quite similar to the Maritimes in housing sizes and condition.

Meal sizes also seem to be more moderate. To be fair I have been eating at higher end restaurants during my first few days here, but even the street bakeries and cafes have very reasonably sized portions. Restaurant food seems a bit expensive, which may play into it. Or they may just have better portion control. There are certainly few obviously overweight people lumbering around the streets.

I get the sense that New Zealanders are a quite efficient people who accomplish precisely sufficient outcomes with little unnecessary waste. This seems to fit well with their resource and production situation. It is quite obvious that the economy is having a rough spell. I'm not sure that this is restricted only to the recent past and believe that the parallel to the Maritimes is equally valid here.

Pickup Trucks and Tow Hitches

When it comes to personal transportation in New Zealand there are a few notable differences from North America. The first is that diesel and gasoline prices are quite out of alignment with their energy ratios. Since diesel contains about a third more energy than gasoline, you would expect that diesel would cost about a third more than gasoline. In New Zealand, yesterday, the ratio was reversed: diesel was approximately $1.56 per litre while gas was $2.20. No wonder small turbo diesel motors are so popular.

The second thing to notice is that, like much of the rest of the world, they don't have full size pickup trucks. Given the cost of fuel and size of their roads (fairly narrow with nearly no shoulder), this isn't surprising. Trucks in general are also relatively rare, restricted mostly to those who seem to actually need one. It seems that most people who have a vehicle for sports purposes have SUVs. New Zealand full size seems to be North American mid-size.

Notice that I said that only those who truly need a truck on a regular basis have one. What about the people who need to carry stuff only every so often? Well, many more cars have tow hitches here. Even the sporty car we ended up renting (4L V8 baby) has a tow rating of 1600 kg. So it would seem that people here buy the vehicle they need and aren't afraid to use a small trailer when necessary. This makes perfect sense since a small trailer can carry as much as one of these trucks (though I haven't yet checked what the pickup trucks have as a bed load rating) for much less initial cost (you only need to buy a trailer once and it'll last many years) and much less continuing fuel cost. And yet you still have just about as much capability. Certainly enough for the person who buys a truck and never leaves the highways with it.

I think if you look at the vehicle choices of New Zealand you'll see the future for North America, especially Canada. For much less overall cost they maintain sufficient capability. Of course there are still large trailers and heavy trucks (which seem to include one tons over here) which can be bought when they are truly necessary, but they aren't necessary nearly as often as the lifted one ton pickups with six litre diesel engines rolling down Lower Mainland streets would suggest.

Some Peculiarities of Air Travel

Here are some things I noticed during my 26 hour journey from the doorstep of my apartment in New Westminster to the New Zealand side of Customs.

Vancouver airport now has free wifi which is decent. It even has, in some lounges, power points which can be used to recharge. I'm a bit surprised at this because I thought wifi and power were among the major perks of the status and business class lounges. Note, however, that since power isn't everywhere you may end up sitting at a different gate than the one you will be leaving from. Also note that the initial boarding announcements are not made across the entire terminal, but only at the gate you should be boarding at. The final boarding announcements are made over the entire terminal though. Which is good, because otherwise we would have missed our flight leaving Vancouver.

The second thing I note is that when you travel by air your passport is likely to actually get stamped. I wasn't expecting this since crossing the land border with the US never results in a stamp.

LAX also claims to have free wifi and some power points. The power points are much sparser than at Vancouver airport though. Also, when I was spending an extended four hour layover there, the wifi near my gate didn't work. Maybe there wasn't actually any free wifi.

It seems that either planes have gotten quieter in general, or that quieter planes are used for international travel. My experience with domestic flights in Canada is that the noise always started to bother me a couple hours after the flight began. That didn't happen this time. I suppose it's also possible that I'm starting to lose my hearing, but I hope not.

Airline food isn't actually that bad. I wouldn't call it five star, but the meals I had on Air Pacific were quite edible and well spaced. On the topic of food, I thought it would be a good idea to buy a couple of bottles of water after security in Vancouver airport. I expected to use this a bit on the first leg, mostly on the second and suffer the third flight. This doesn't actually work. It seems that no matter that you just got off an international plane, you will have to go through security again to get onto another international plane. You may even have to travel between different buildings of the airport, walking across land from the country you are in. If you want to bring water on your flight you'll have to buy it again for every flight separately.

Suit bags are pretty well designed pieces of luggage, but they don't work that well if the hotel room you book doesn't have a full size hanger bar so that the suit bag can be hung up while open. It is also very important that you close the hanger clip inside the suit bag, or everything will fall down into a heap when you go to open the bag up after hanging it up. Go me.

SVN over SSH With a Clean Repository URL

The biggest problem with the svn+ssh protocol is that the repository URLs leak too much information about where the repository is stored. svn+ssh://servername/home/me/repos/foo just doesn't look clean.

It happens that this is easy to fix. First write a small script on the workstation:

#!/bin/bash

# svn invokes this tunnel script with the server name as the first
# argument; run svnserve rooted at the repository directory so the
# URL doesn't need the full path.
exec ssh "$1" 'svnserve -t -r/home/me/repos'

Put this file somewhere appropriate, say ~/.subversion/svnssh.sh, and make it executable. Now you merely need to modify the subversion configuration file, usually ~/.subversion/config, to set the tunnel program for ssh to be this script.

After doing this you are able to use svn+ssh://server/foo as the repository URL. You may want to include some additional logic to support multiple servers, but that is a simple extension.
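One sketch of that multi-server extension: have the wrapper choose the repository root from the server name svn hands it. The hostnames and root paths below are purely illustrative assumptions.

```shell
#!/bin/sh
# SVN_SSH tunnel wrapper with per-server repository roots.
# svn invokes this as: wrapper <hostname> svnserve -t
# (we override the svnserve arguments with our own -r root).
root_for() {
    # Map a server name to its repository root (illustrative values).
    case "$1" in
        work) echo /srv/svn ;;
        *)    echo /home/me/repos ;;
    esac
}

if [ -n "$1" ]; then
    exec ssh "$1" "svnserve -t -r$(root_for "$1")"
fi
```

With this in place svn+ssh://work/foo and svn+ssh://homebox/foo both resolve to short URLs, each rooted at the right directory for its server.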

Professionalism

There is a class of jobs described as The Professions. This post is not about them. This post is instead about what it means to be a professional and how it can apply to any job. Professionalism, at its core, is about doing things the right way, even when doing so is contrary to human nature, personally detrimental and not obviously necessary.

Let us discuss each in turn in the context of software development. Within every professional software developer is a craftsman. Some part of the developer enjoys doing good work and feels pride in it. This is where the most obvious conflict between human nature and professionalism begins. If one takes pride in their work they feel some measure of ownership. Not ownership in the copyright sense, but ownership in the sense of defender. The original developer is the expert on that piece of code and, since he feels pride in having written it, will strive to maintain its quality and form. Ownership in this sense is at odds with professionalism. Professionals must work in teams, and feeling that one owns some code is at odds with effective teamwork.

Another aspect of the professional is that they put doing a good job over employment advancement or job security. This is done in several ways. The most notable is documenting the system appropriately, even though it is never externally obvious whether all the necessary documentation and automation exists. It is quite easy for a developer to automate some complicated and necessary process with a bit of scripting, but then neglect to polish that tool sufficiently for use by the rest of the team. In fact, perhaps the most important indicator of the willingness to make oneself easily replaceable is the professional's understanding of the team. To a professional the team is not just the developer himself. The team isn't even just the existing people currently working on the project. Instead the team is an abstract set which contains not only the developer as they are, but the developer thirty years from now after they have forgotten everything to do with this project. Even more expansively, the team includes members not yet hired and the professional's replacement. A professional makes themselves easy to replace, often to the point of maintaining documents to orient their replacement and at times training their replacement as one of their final acts before moving on.

Then there are the things about a job which just don't seem that important. In the software world these are the design documents being kept up to date, having appropriate code comments or even providing useful explanations of what otherwise incomprehensible errors mean in particular circumstances. Perhaps most important is documenting and automating testing. These tasks are tedious and thankless. These tasks are also the key to good maintainability and a sign of a well polished development setup. A professional makes themselves replaceable and does the job to completion, not just the fun and interesting bits.

Professionalism is what differentiates software engineers from mere programmers. As with most things professionalism is a spectrum. I'm slowly moving my way towards being an exemplary professional, but I'm not there yet. I'll get there, one step at a time.

Thoughts of the Day

In a post scarcity world only attention hours are in short supply.

Usenet isn't dying, it just turned into Reddit.

Straight Razor Shaving: A Few Tips

In the past shaving was a luxury. I've heard that men would shave twice a week, on Thursday and Sunday. For centuries this was done with the equivalent of the straight razor. These days there are many different tools and products used to shave and shaving is no longer a weekly luxury taken in at a barber shop with beer and chatter. Now shaving is a daily burden required to greater or lesser degrees by polite society.

In any case there is still not a shave to be found which is better than that of a straight razor. As with many of the old traditions which used to be passed down from father to son much of the knowledge of how to use a straight razor has been lost to the majority. So I have put here the extent of my limited experience with straight razor shaving.

Rule one of straight razor shaving: Never move the blade parallel to its edge. Always move it perpendicular to the edge, like you are trying to sweep something up with it.

Rule two of straight razor shaving: Always move the blade perpendicular to the edge, like sweeping. To do otherwise will leave you horribly scarred.

The key to a good, close shave is the sharpness of the blade and the smoothness of the cutting motion. A sharp blade cuts the hair without pulling it, resulting in the hair being cut level with the skin and not below the skin. The smoothness of the cutting motion is important to keep the skin from bunching up.

There are four keys to a nice smooth shaving motion:

  1. Practice. The following tips will help, but nothing beats practice. Without a steady hand you aren't going to have a smooth shave no matter what you do.

  2. Pull the skin. As you shave a region just tug on the skin a little bit to tighten it slightly. This will prevent bunching of the skin by ensuring there is less loose skin to bunch.

  3. Before you shave you must soak the hairs in hot water. This will soften the hairs, which makes them easier to cut through as well as opening the pores, which pushes the hairs slightly outwards from the skin. Shaving straight out of a hot shower is recommended, but if that isn't possible a hot towel will suffice. Use the hottest water in the towel that you can stand and keep the towel on your face for no less than five minutes, ten is better. If the towel starts to get cool you will need to use another towel.

    I also recommend that you shave in a hot, humid room, such as the bathroom after a hot shower, in order to keep the pores open as much as possible and to prevent the hairs from drying out. How much a problem these two are depends on your particular characteristics.

  4. Shaving soap. Sure you see ads all over the place for shaving creams which promise you the world. Don't believe them, they lie. The major reason to use shaving cream and the like is to lubricate the skin. If your shaving cream isn't doing that you are using the wrong stuff.

    I recommend proper shaving soap. This is soap which is high in glycerin. Properly lathered this soap will further soften the hairs, lubricate and clean out the pores.

    To use shaving soap (or other high glycerin soap) you must first lather it. To do so put some soap into the lathering cup. I prefer to buy shaving soap which comes already in a suitable container, but you can use a regular cup if you wish and then just cut some soap into it. You don't need much soap for each shave as the secret is in lathering it. Now you will need a badger hair lather brush, accept no substitutes. You take the brush and dip it shallowly into hot water. Then you lather the soap using this brush in a circular motion in the lather cup. You only want to use a little water (I dip my brush about a quarter of an inch) as too much water will prevent lathering; you can always add more water if you aren't seeing the results you desire. Lathering should only take a minute or two and you should end up with a rich, thick lather which sticks to the brush and the side of the cup.

    Now that you have the lather you apply it to the skin using the brush in long strokes. For the most part these strokes should be up and down like you are painting the skin. You should cover all the skin you intend to shave and then a little bit. If you are a slow shaver or the temperature and humidity of the room does not allow you may wish to lather only part of the area at any one time. The lather must still be thick when you go to shave the area. If you need to reapply the soap you can give the soap a few quick lathering motions first to ensure a good lather.

    Once you have finished shaving it is important to clean the lather brush properly or its performance will degrade. The proper way to clean a badger hair brush is to run it under slightly warm water while gently squeezing it with your hands. The running water soaks in and the squeezing forces the soapy water out. Continue this until no more soap comes out. Then you should get the majority of the water out of the brush with a couple gentle squeezes and then a small number of wrist flicks. It is ideal to then hang the brush in a well ventilated area to dry, but I've not had trouble with standing the brush up on its handle.

    A good shaving soap is very important to a good shave. In fact, I have had excellent results with cheap disposable razors as long as I have used proper shaving soap with a good lather. A good shaving soap will also prevent or greatly reduce razor burn by reducing the friction of shaving.

The magic of a straight razor is the sharpness of the blade. This is why the single blade of a straight razor can provide a superior shave to the multibladed razors of the modern age. Unfortunately I have no good advice on sharpening a straight razor; I ruined my blade on my first attempt. I can only suggest finding some old man who has great experience hand sharpening blades to extreme keenness or attempting a proper knife sharpening system. As far as I can tell the razors should be sharpened to a 17 degree angle.

Though I can offer no great advice on sharpening a straight razor I can tell you how to keep one sharp. To keep a straight razor sharp there are really three things to keep in mind:

  1. Follow the recommendations for a smooth shaving motion above. The best way to keep a blade sharp is to not dull it in the first place. Most of those recommendations, but especially the hot water soak, are important for softening the hairs and making them easier to cut through. The softer the hairs the less wear on the blade edge.

  2. Rest your razor. As you shave the edge of the razor gets slightly bent through the resistance of the hairs. Resting the razor is simply not using it for a time. Gentlemen of times past used to have one straight razor for each day of the week to ensure that the razors stayed sharp as long as possible. This is because the bent edge will return to almost the original position if not bent too severely and given time to rest.

    If using a personal razor you should have no problem resting the razor at least a day between uses. Resting it longer will reduce the wear and so if possible it may be advisable to either shave on alternating days or to rotate between multiple razors. Additionally if you are shaving large areas of skin, such as legs, with a straight razor you might consider using several razors in one shaving session to reduce the wear.

  3. As resting the razor undoes some of the wear so does stropping. Stropping should be done on conditioned leather only. There are two common types of strops, solid ones which are made on wood blocks and flexible ones. Which you use really depends on the layout of the shaving space. I prefer the flexible ones, though they require something to hang one end off.

    Stropping should be done only at the beginning of shaving, never after shaving. You can strop during shaving if the razor is no longer sharp enough, but that will cause dulling and I recommend against it if possible. To strop you simply ensure that the strop is straight or nearly straight and lightly drag the razor along the strop, blunt end first. You should strop alternating sides of the razor. The easiest way to do this is to lightly place the razor on the strop nearest yourself with the edge pointed to you. Then drag the razor away from you towards the other end of the strop. Then when you have nearly reached the end you should flip the razor over onto its other side. Flip over the blunt edge of the razor. If you try to flip it any other way you will mess up from time to time and either cut your strop, damage the razor, dull the razor prematurely or at the very least have to spend more time stropping. You should strop each side gently about twenty times.

    If your strop is narrower than your razor you will have to strop the entire blade either by moving the blade during the stropping motion or by stropping the side nearest the tang on one trip along the strop and back and the side furthest from the tang on the next trip.

    The point of stropping is to realign the edge of the razor which was bent out of shape when you shave. If the edge only has to be moved back a little bit it is likely to bend easily. However if the edge has to be moved back too much some of it will break off, causing dulling. It is for this reason that stropping should only be done after resting the razor in order to keep the razor sharp.

There are a few more things to keep in mind about maintaining a straight razor. Firstly you must always remember that a nick in the razor is a bloody nick in your skin. As such it is critical that you take good care of the razor and fix all rust and edge damage as soon as possible. Secondly, no matter how skillful and diligent you are at maintaining the sharpness of your razor, it will need to be sharpened eventually. If you are taking care of the razor and shaving a face every day or two the razor should only need to be sharpened once or twice a year.

So that is all the theory about using a straight razor. As long as you remember Rule One of Straight Razor Shaving you should be able to learn without too much trouble or long lasting injury. I will now list out the steps I use when shaving with a straight razor. Use this as a suggestion, but shaving is a personal thing and you'll have to experiment some to figure out what works best for you.

  1. I always try to shave directly after a shower. I always try to end the shower with nice and hot water to ensure a steamy bathroom. I find I just don't have the patience for hot towels at home. If I ever get a barber shave though, hot towels all the way. It's just a shame that most barber shops don't serve beer these days like they used to.

    When I step out of the shower I dry everything except my face. I want my face as wet as possible. I keep the door closed and try to keep the bathroom as humid as possible.

  2. After ensuring a soaked face I strop the razor. This shouldn't take more than a minute or two.

  3. Then I lather the soap as described above. I put on a thick coating over my entire face. I find it helps keep the hairs soft even if I have to relather later.

  4. I shave, first the left side of my face with my right hand and then the right side of my face with my left hand. Being left handed has endowed me with some useful abilities. I shave in sections where each section is mostly flat. So my cheek down to the jaw line is one section. Under the jaw and on the neck is another. Special care is taken under the nose and on the chin. Always shave with the grain the first time over. If you have areas where the hair grows in circles or changes direction you will have to go over that area once for each direction. Always lather the area before shaving it.

    After every stroke of the razor I rinse the razor off with hot water. I find that this helps keep my skin warm and my pores open.

    Each stroke goes from the top of my face to the bottom of my face. The razor is held between thirty and forty degrees to the skin where zero degrees would be laying the razor flat on my skin and ninety degree would be the blade sticking straight into my skin.

    Never move the blade parallel to its edge. That is, never move the straight razor in a sawing motion. Doing so will cut straight into you and likely leave you with conversation starting scars. Always move the blade perpendicular to the edge, like you are brushing something with it. If you must turn a curve do it gradually.

  5. Now that I have finished the first time over I will consider additional passes. Shaving against the grain is a way to achieve an even closer shave, but I find that I end up with ingrown hairs if I do. It appears that many men are this way so shaving against the grain is optional.

    I will, however, often find some spot which I hadn't shaved sufficiently well for my tastes. These areas I will lather with soap again and then shave again. I try to avoid shaving an area in this way more than two or three times to avoid razor burn.

  6. After I am satisfied with my shave I take a towel soaked with the coldest water I can find and press the towel against my face. I do this to cool my face, close my pores and stem any bleeding from small cuts. I'll often need to soak the towel twice. Once my face is cool I use the towel to pat and wipe off any remaining soap.

    Some people suggest using aftershave at this point, but I've never seen the point. Using shaving soap ensures that any small cuts are clean along with the skin itself. The soap can also be lightly scented if you desire that. Finishing with an ice cold towel also does wonders for stopping any bleeding.

  7. Then it is time to clean up. Since I lather in a lather cup I simply put the lather cup away. I rinse the lather brush and stand it up to dry. I ensure that the razor has been rinsed off and that there is no water left sitting on the blade. Patting with a towel or a couple of very careful flicks will remove the water. I rinse and wring the towel and hang it up to dry.

  8. I then present my freshly shaved face to my wife for inspection. I usually pass with flying colours.

I believe that is all one can really be told about shaving with a straight razor. It is a rewarding skill, but does require practice. For somebody considering starting out shaving in this fashion I would suggest buying a new, factory sharpened razor. Though it seems counter intuitive, as long as you remember the first rule of shaving with a straight razor, a sharp razor is less dangerous and painful than a dull one.

Sketch of P2P DNS

Several months ago the US government started confiscating domain names. This started some portions of the Internet honking like a gaggle of geese. One concept which came up from this was a P2P DNS system which would be resistant to such government intervention. Creating such a system presents three difficulties:

  1. Ensuring a domain is only 'registered' once

  2. Allowing the owner to modify the domain at will

  3. Distributing the domain information

The first is easily done by having a single master public key which signs the key of the domain with a date. The domain key with the earliest signature date is the correct one. This implies that ownership of the domain key is ownership of the domain. The central authority can only hand out ownership of a domain once. Furthermore lost keys will result in an unmodifiable domain.

The second requires that the owner of the key sign the updates with the domain key and a date. Properly signed version information with the latest date is the correct one to use.
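The first two rules amount to a simple selection over records whose signatures have already been verified. A minimal sketch, assuming hypothetical record shapes of my own invention (the real system would carry actual keys and signatures):

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical record shapes; assume the signatures have already been
 * verified, so only the signed dates matter at this point. */
typedef struct {
    const char *domain;
    long        signed_date; /* date inside the master key's signature */
    const char *owner_key;   /* domain key the master signed */
} registration;

typedef struct {
    const char *domain;
    long        signed_date; /* date inside the owner key's signature */
    const char *records;     /* e.g. the authoritative DNS servers */
} update;

/* Rule 1: the EARLIEST master-signed registration wins, so the central
 * authority cannot effectively hand the same domain out twice. */
const registration *authoritative_registration(const registration *regs,
                                               size_t n, const char *domain) {
    const registration *best = NULL;
    for (size_t i = 0; i < n; i++) {
        if (strcmp(regs[i].domain, domain) != 0) continue;
        if (!best || regs[i].signed_date < best->signed_date) best = &regs[i];
    }
    return best;
}

/* Rule 2: the LATEST owner-signed update is the one to use. */
const update *current_update(const update *ups, size_t n, const char *domain) {
    const update *best = NULL;
    for (size_t i = 0; i < n; i++) {
        if (strcmp(ups[i].domain, domain) != 0) continue;
        if (!best || ups[i].signed_date > best->signed_date) best = &ups[i];
    }
    return best;
}
```

Note the asymmetry: registrations resolve to the earliest date, updates to the latest, which is exactly what makes a duplicate grant harmless and an owner update effective.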

The third requirement is more complicated, but there are several distributed hash tables out there which would seem to fit the bill. In the worst case some simple eventually consistent P2P system could be created with relative ease.

The reason such a system would work is that it does not attempt to completely replace DNS. Instead it merely replaces the root and TLD DNS servers. Each domain would be represented in this system by the same information which is already returned by a normal DNS query used in the hierarchical hostname search. That is, it would contain at least the authoritative DNS servers for the domain.

Though such a system could be extended to contain all the hostnames on the Internet it would grow quite large. It would seem that there is a low limit to the amount of data a person will deem reasonable to allocate to P2P infrastructure. Not extending it as such leaves the domain DNS servers as the first vulnerable point in the chain of viewing a webpage, but no more vulnerable than the content servers themselves.

Reliability versus Dependability

Reliability is the capability of a tool to perform a specific purpose without failure. Dependability is subtly different from reliability in that it is the capability of a tool to perform the task of the moment sufficiently well. Perhaps a short listing of expectations will help:

Reliability:

  • Perform the designed task successfully every time

  • May exclude manual intervention

Dependability:

  • Perform the prescribed task as often as possible

  • Gracefully degrade under adverse conditions

  • Be abusable, work on the fringe of the stated purpose

  • Be amenable to jury rigging, either repairs or modifications

Take, for example, a ball point pen. Ball point pens are designed to write on dry, clean paper. A reliable pen will do so until the ink runs out. A dependable pen, however, will also be able to write, if with difficulty, on dirty, crumpled paper. Wet paper or things which aren't paper at all, such as wood, are also markable by a dependable pen. More so even than just marking in adverse conditions a dependable pen can be used as a small lever or to push small buttons.

These latter uses don't use the pen for its marking ability, but instead misuse the physical properties of the pen. Dependable pens have these free variables of construction, strength of the body for example, modified to be more useful in some ways.

In general institutions desire reliability, because they will have one tool for each task, while individuals desire dependability, since they have a much wider variety of needs and circumstances.

Mesh Networks

The recent Egyptian network blackout has caused a surge in interest in decentralized mesh networks. For quite a while I've personally thought that a widespread mesh network would be great. Unfortunately while I know that such a network could be created using available technology, I don't believe it ever will be. There are several reasons, but they essentially come down to two overriding ones: geek density is too low and everybody expects real time communication. That is the short version.

The more detailed reasoning is based upon technical restrictions. The conceptual framework of a mesh network that I'll be using is composed of four assumptions:

  1. Every node connects to a number of media. Any node may be connected to as few as one medium, e.g. using their one wifi card, or as many as ten media, e.g. a couple shared wifi channels and a handful of point to point links using directed wifi or Free Space Optics. Nodes connect primarily to other nodes geographically nearby.

  2. Every link between nodes has approximately the same bandwidth as every other link. For the purposes of theory we can assume that every link is equally capable. For the purposes of a practical implementation we can assume that the achieved throughput ranges between 1Mbps and 100Mbps. This is a reasonable assumption as all the commonly available 802.11 Wifi protocols have achievable throughputs in this range. I deem it unlikely that a significant portion of the network will have Gigabit Ethernet links to each other.

  3. Point to point links may use some private medium, such as a wire or focused laser beam. Broadcast links need to use radio spectrum. As such point to point links can be assumed to always have the nominal bandwidth available, while shared media must divide the available bandwidth among all the nodes using that medium.

  4. The average distance covered by any link is 1KM. This is optimistic for regular 802.11, but perfectly possible with fixed point to point links such as directed wifi.

Let us first discuss the infeasibility of such a mesh network to support primarily real time communication, as the current Internet does. Assume, for the moment, that it takes one millisecond for a node to forward a packet. Given the assumed average link length of one kilometre the network would have a minimum round trip time of 45 milliseconds from one side of a moderately sized town to the other and back again. That isn't too bad and is slightly faster than most of the current Internet. However, most of the websites you visit aren't hosted in the same city. Most aren't even in the same province. Let us assume that you are in Calgary and wish to access a site in Vancouver. The driving distance is about 1000KM, which we'll use since nobody can afford to lay a hundred kilometre link. That means that a round trip would take two entire seconds. That isn't quite what most people consider real time.

If the problem was just latency we could all get used to it. Unfortunately we also have to deal with bandwidth. As mentioned in this paper the average number of nodes a random request must traverse is proportional to the square root of the total number of nodes in the network. This implies that the average available bandwidth for each node is the bandwidth of the link divided by the square root of the number of nodes. Take the 100Mbps assumption and scale the network to ten thousand nodes and the effective bandwidth for each node averages out to 1Mbps. Such a network would cover a small geographic area and 100Mbps is quite optimistic. More realistic may be an average link speed of 10Mbps which gives an average of 0.1Mbps, approximately 10KB/s, on a ten thousand node network. While there are certainly uses for such a network I'm not sure that people in urban areas would be willing to go back to dial up speeds.

However, if most of the network accesses can be made to traverse only a small number of nodes then the bandwidth and latency issues are dramatically reduced. This can be achieved simply by using a caching architecture where each node has a cache of the content which has passed through it. A request would then be serviced by the first node on the path which had the content cached. Unfortunately this means that real time access won't happen, but most of the web isn't real time anyways. There is still the latency problem from distant nodes in other cities or provinces with transfer of updated content taking hours. You won't be able to IM your friend in New York, but you could still email them.

So real time access isn't going to happen, that doesn't mean such a network wouldn't be useful. Local content would still be fast enough. So let us consider the requirements of setting such a mesh up. The biggest requirement for a mesh is to have the maximum possible aggregate bandwidth on each node. Since the primary factor in aggregate bandwidth of a node is the number of network mediums it uses obviously every node needs as many mediums as possible.

As explained in the assumptions there are two broad classes of media: shared and private. Shared media have the benefit of being easy. All it takes is a device with a wifi card to join a shared medium. Unfortunately shared media, while allowing many links between nodes, share some fixed bandwidth among all those links and the links of other nodes using the same medium. A fast 802.11n shared medium would probably only afford each link an average bandwidth of around 1Mbps unless the network was quiet. Private media, on the other hand, always provide good bandwidth to each link. Unfortunately it takes one private medium for each link and additional effort for each link. A private Wifi link requires directional antennas be aimed and configuration between both ends of the link. Private media may also provide good range, such as the Free Space Optics mentioned above which are reliable out to 1.4KM.

Who has the energy and knowledge to set up numerous point to point links, or even one? Not normal users who just want to plug something into a wall and have it work. Just geeks. I don't know that there is sufficient density of true geeks to put such a network together. You'd need at least one on every side of every apartment building and a couple on every block of houses. That's a lot of geeks talking to each other.

A successful mesh network could be built and would probably look like Freenet. Anybody want to confirm that the network assumptions are similar?

Writing a Hackish Profiler

When a program is too slow a competent developer will first look to faster hardware. When faster hardware isn't possible this developer will then reach for their profiler. Profilers help a developer determine why a program is slow so that it can be sped up. But what if you are on a platform with poor native support *coughANDROIDcough* which doesn't support the tracing profiler you need? Write your own of course!

Now writing an efficient, low overhead and functional profiler is an involved task which can take a significant amount of time. Since we are all busy instead we'll do an 80% job by writing an efficient and functional profiler, but skip the low overhead. This should allow us to get the numbers we require, but will cause the program to run much slower.

First we list our tools:

  • gcc -finstrument-functions causes gcc to call the __cyg_profile_func_enter and __cyg_profile_func_exit functions as part of the regular function call preamble and postamble.

  • A small script program, trace2text, which converts from the efficient binary format to a text format which is easily manipulated.

  • nm -al to help convert function addresses into symbol names and source line numbers.

  • c++filt to unmangle the symbol names.

  • A small script, join_symbols.py, to do a relational join between each trace line and the matching nm line on the function address.

  • A bit of shell script and awk to transform binary trace files into something human readable and usable with code folding to explore the trace.

  • A final small script, avg_time.sh, to take a human readable trace and produce the average amount of time a call of a particular function requires.

With these few tools we can simulate a fair amount of the power of gprof, though with significantly more overhead.

For various reasons I will not be providing significant source to the above bits. One primary reason is that, being hackish, the profiler was tuned specifically to grab the statistics I required for my particular problem. Specifically I had a multi-threaded program which processed a number of packets coming in off a network. The program was neither IO nor CPU constrained, but yet was unable to keep up with the possible network data rate because processing each packet took too long. My job was to figure out why and fix it. Because of this situation I needed to determine which functions were spending time blocked on some resource in the course of processing each packet. This is in contrast to simply being able to look at which functions are called the most or use the most CPU time over an entire run.

With this in mind I will lay out one way to implement the hackish profiler and hopefully I'll point out all the pitfalls you may run across. The starting point for any profiler is in accessing the data they require. Some profilers hook into the kernel to sample what the system is doing many times a second, others hook into the function call preamble of the program in question via compile time modification. I needed the latter on a system with no support for it. Conveniently the build chain uses gcc and gcc provides -finstrument-functions. This compile flag causes gcc to add calls to __cyg_profile_func_enter upon entering a function and __cyg_profile_func_exit upon exiting a function. Each of these functions are provided with two arguments, the address of the function just entered and the address of where the function was called from. These addresses are only approximate due to automatic Program Counter movement, but are a constant increment greater than the true address. In my particular case on the ARM architecture the function addresses were one greater in the argument than as produced by nm.

Gcc does not provide these functions, so you will have to write them yourself. In my case I was interested in having a trace for each thread separately and with a minimal overhead since we are already turning every function call into at least three function calls (one for the original call, one for func_enter and one for func_exit). To this end I created a simple structure which stored the function address, call address, time and whether the function was entering or exiting (For this last I borrowed a bit from the time). Each thread had a contiguous array of these entries and when the array was full it was flushed directly to a file for each thread. The size of this array must be chosen with care, too small and the cost of the write system call will result in inaccurate numbers being recorded, too large and each write call will stall for a significant time causing spikes in the recorded timing. I ended up using a guess of 10000 entries and it seemed to be accurate enough for my needs so I didn't experiment.

It is critically important that you ensure that you do not cause infinite recursion from these two functions. There are two ways to ensure this. The first is to realize that -finstrument-functions is a compile time option and therefore any function not compiled with this option is safe to call. This includes any system library functions. The second method is to tell gcc to exclude the function from being profiled. This is done with the no_instrument_function attribute. Do note that this attribute can only be supplied in the prototype of the function and not the function definition itself. As an example this is how you should define each of the profiling functions:

extern "C" void __attribute__((__no_instrument_function__)) __cyg_profile_func_enter(void *this_fn, void *call_site);

extern "C" void __cyg_profile_func_enter(void *this_fn, void *call_site){...}

You must give any function of your own that these hooks call a similar attribute to avoid infinite recursion.
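Extending the example above, a fuller skeleton of the pair might look like the following. The counting bodies and the note_call helper are invented purely to illustrate the attribute-on-the-prototype pattern; a real implementation would append a TraceRecord instead:

```cpp
#include <cstddef>

// The attribute goes on the prototypes; gcc honours it here rather than
// on the definitions themselves.
extern "C" void __attribute__((__no_instrument_function__))
__cyg_profile_func_enter(void *this_fn, void *call_site);
extern "C" void __attribute__((__no_instrument_function__))
__cyg_profile_func_exit(void *this_fn, void *call_site);

// Any helper the hooks call must be excluded the same way, or the hook
// would instrument its own helper and recurse forever.
static void __attribute__((__no_instrument_function__)) note_call(void);

std::size_t g_enter_count = 0;
std::size_t g_exit_count = 0;

static void note_call(void) { ++g_enter_count; }

extern "C" void __cyg_profile_func_enter(void *, void *) { note_call(); }
extern "C" void __cyg_profile_func_exit(void *, void *) { ++g_exit_count; }
```

Note the hooks compile and run fine without -finstrument-functions; the flag only controls whether gcc inserts calls to them.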

With these functions written and working you should end up, after a run of your program, with a number of binary trace files, one per thread. You will need to convert these to a text format to easily make use of existing tools such as nm and c++filt. I did this with a small C program which took each record and printed a line with the equivalent data, fields separated by spaces. I also found it convenient to maintain a stack and compute the elapsed time of each function at this point, since I was already sequentially processing the entire trace. The most important thing to keep in mind is that the addresses must be output in the exact format that nm uses. In my case this was %08lx, but on 64-bit platforms it will likely be %016lx. Matching the format makes the relational join easy. You should also adjust the function addresses as necessary to match the addresses of nm; again, in my case I had to subtract one from the address supplied to this_fn.
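The address formatting is fiddly enough to be worth a sketch. The subtract-one adjustment below matches what I saw on ARM; the offset may differ or be zero on your platform, so verify it against nm before trusting the join:

```cpp
#include <cstdio>
#include <string>

// Format an instrumented function address exactly as nm prints it on a
// 32-bit platform: lower-case hex, zero-padded to 8 digits ("%08lx").
// On a 64-bit platform "%016lx" would be needed instead. The argument
// gcc passed to this_fn was one greater than nm's address on my ARM
// target, hence the subtraction (an assumption to check per platform).
std::string format_address(unsigned long instrumented_addr) {
    char buf[32];
    std::snprintf(buf, sizeof buf, "%08lx", instrumented_addr - 1);
    return std::string(buf);
}
```

With both sides printed through the same format string, the later join is a plain textual match.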

Now you need to convert those function addresses into symbols so you know what you are looking at. What you want to do is combine every line in the text trace with the matching line of the nm -al output on the profiled executable (make sure you run it on an executable with debug symbols). I originally did this using the join command, but join requires that all inputs be sorted on the join field and this proved too time consuming. Instead I wrote a short Python script which read the nm output into a dictionary keyed on the address and then printed the joined line for every line on standard input. This drastically cut down the time needed to process the traces into something usable, since even a short run will produce hundreds of megabytes of text traces.
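The script itself was Python, but the idea is a plain hash join and is language independent; a sketch of the same technique in C++, with the line layout assumed to lead with the address, looks like this:

```cpp
#include <istream>
#include <sstream>
#include <string>
#include <unordered_map>

// Hash join: load the (smaller) nm output into a map keyed on address,
// then stream the huge trace through once, appending the matching nm
// line. Unlike join(1), neither input needs to be sorted.
std::string join_trace(std::istream &nm_output, std::istream &trace) {
    std::unordered_map<std::string, std::string> symbols;
    std::string line;
    while (std::getline(nm_output, line)) {
        // nm lines lead with the address followed by a space.
        std::string addr = line.substr(0, line.find(' '));
        symbols[addr] = line;
    }
    std::ostringstream out;
    while (std::getline(trace, line)) {
        // Trace lines lead with an identically formatted address.
        std::string addr = line.substr(0, line.find(' '));
        auto it = symbols.find(addr);
        if (it != symbols.end())
            out << line << ' ' << it->second << '\n';
    }
    return out.str();
}
```

The single linear pass over the trace is what makes this so much faster than sorting hundreds of megabytes for join.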

Once you have the joined output you merely need to convert it into its final usable form. The first step is to use awk to move the various fields into the order you desire. The second is to pipe the entire trace through c++filt to demangle the function names. This will turn the mangled symbol names into full function or method names with the types of the arguments. These will contain spaces, so the demangled symbol name should appear after any fields you wish to easily process. In my case this was the total time the current invocation of the function took.

With this trace you are almost there and should be able to extract just about any statistic you desire. You can also manually explore the traces. The easiest way I found to do this is to open the text traces in my trusty programmer's editor (vim in my case). The traces will be large, so you may wish to use dd to extract a subset to work with. This was necessary in my case because the trace files were more than 4GB in size, which vim will not open successfully on a 32-bit platform. Once you have some subset of the trace loaded in your editor I recommend using the code folding features to make exploring the trace simpler. In my case I used vim's foldmethod=marker with foldmarker=Entering,Exiting to fold function calls. On a medium sized project it was enlightening to see just how deep the call stack went in some cases.

Now you can compute the statistics you need to find your problem. The simplest approach I found useful was just skimming the stream of elapsed times for a function in question to get a sense of how long it took. I also produced a small script which scanned the trace and calculated the average time a particular function took, how many times that function was called and the sum of all the time that function took. Exploring will give you a sense of which functions seem to be taking all the time, but be sure to examine the statistics from a significant run to smooth out spikes caused by other sources, such as the overhead of the profiling.
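That per-function summary script reduces to a few lines. The sketch below assumes a simplified "name elapsed" field layout rather than the full trace format, which varied by setup:

```cpp
#include <istream>
#include <map>
#include <sstream>
#include <string>

struct FuncStats {
    unsigned long calls;
    double total_time;
    FuncStats() : calls(0), total_time(0) {}
    double average() const { return calls ? total_time / calls : 0; }
};

// Scan a text trace of whitespace-separated "name elapsed" pairs and
// accumulate, per function, the call count and total elapsed time; the
// average falls out of those two.
std::map<std::string, FuncStats> summarize(std::istream &trace) {
    std::map<std::string, FuncStats> stats;
    std::string name;
    double elapsed;
    while (trace >> name >> elapsed) {
        FuncStats &s = stats[name];
        ++s.calls;
        s.total_time += elapsed;
    }
    return stats;
}
```

Sorting the resulting map by total_time is usually the quickest way to spot where the run actually went.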

Hopefully this will help somebody profile when no profiler exists. At the very least, writing a hackish profiler as an exercise is educational, as proper profilers perform many of the same actions, just more efficiently and with lower overhead.

Why I Prefer Duck Typing

Duck typing is a magical form of typing where an object is considered to be the correct type as long as it implements the methods or messages which a piece of code calls upon it. It is also the type system I prefer. The three most relevant examples of languages with duck typing are Smalltalk, Python and Objective-C. Smalltalk because, while it may not be the language which introduced duck typing, it certainly brought it into the mainstream. Python because it is one of the most popular languages with duck typing. Finally I mention Objective-C for two important reasons. The first is that Objective-C is a compiled language with duck typing, while the majority of languages with duck typing are interpreted or run on a virtual machine. The second is that Objective-C is the only language I know which has typed duck typing. I believe this latter feature to be a great concept introduced by Objective-C.

Typed duck typing provides the flexibility of duck typing with the basic error checking capabilities of static typing. Specifically, every object has a type and this type is statically checked against the expected types passed into functions and methods. Additionally the methods provided by a type are also checked, to make sure you don't ask a List to do the tango. All this, combined with a simple way to tell the compiler that you know what you are doing, strikes a great balance between power and protection.

Below I've listed most of the reasons I prefer duck typing and especially prefer typed duck typing:

  1. You often write less code because you can avoid writing adaptor classes in many instances.

  2. Anytime you would otherwise have to use code generation (templates) simply to handle operating on different types, duck typing allows the same compiled machine code to handle an infinite variety of types; the only requirement is that the objects respond to the correct methods. Thus you avoid code bloat and multiple code paths for data structures. No longer do you need different template instantiations for bignum-object and string-object dictionaries. In fact it is possible to avoid having two dictionary objects altogether and instead have one instance which supports the two different key types, even if the two types share no common superclass or formal interface.

  3. You don't even have to implement a method directly. Duck typing allows a method of last resort which receives an object representation of the call just made, to be handled algorithmically. This allows many things, two examples of which are fully transparent RPC calls on distributed objects and powerful ORMs with no code generation or manual code writing. Both of these are examples of the power of proxy objects.

  4. With typed duck typing we do not give up the aid the compiler provides in checking that we make no trivial mistakes, such as using the wrong variable or passing in arguments of the incorrect type. That is, we do not give up the aid, but can trivially tell the compiler to trust the programmer.

  5. Formal mathematics has been proven to be inconsistent, so why should we believe that the typing within our programs will be consistent? It is obvious that fully statically typed languages prevent a set of correct programs from being written. I find being prevented from writing those programs aggravating. There is usually a different, acceptable way of accomplishing the same goal, but it is most often significantly more work on the part of the programmer.

  6. Programmers will move heaven and earth, while swearing, to do what they want, even if what they want is wrong. It's better to make accomplishing the goal easy so that they may learn of and correct their mistake as quickly as possible.

It is for these reasons that I not only find duck typed languages more pleasant to work in, but also more productive. Interested parties will likely find it enlightening to sample what NeXT was doing in the early and mid nineties and compare that with the capabilities of other companies of the time. NeXT made extensive use of Objective-C before in many ways becoming Apple.

Design For Your User, Not Yourself

Frequently when designing something the designer comes across shortcuts: little corners which can be cut to save significant effort on their part, with seemingly reasonable restrictions passed onto the user. More often than not the designer takes these shortcuts. As a user I beg you to resist the temptation and put the effort in. You may not think that the user will notice, but they will. Any inconsistency or obviously unnecessary work on the user's part will be noticed, time and time again. Do I have to hit refresh because changes in one part of the program are not automatically noticed in another? Must I trawl through all the settings before your application is usable? Any action which can be made simpler with no loss of generality must be made so.

It doesn't matter who the user is. Programming language designers should not bow to compiler writers, but instead only to the programmers. Application designers should not bow to the programmers but instead only to the users.

The recent drive to minimalism in software and hardware is not done because it is easier. It is done because it is difficult but valuable. With a minimal set of features it becomes possible to carefully consider the implementations of features and eliminate or reduce pain points. People don't like having to deal with issues not directly related to the task they wish to perform. Don't make them.

Manufacturing Defect

I dislike shopping online for two major reasons. The first is the risk that my credit card number will get stolen, which is annoying to deal with even if it ends up not costing me any money. The second is that I strongly prefer to examine what I buy before I pay for it. I examine it for manufacturing defects.

Before the factory everything was handmade and each piece was more or less unique. In this situation it obviously makes sense to investigate everything before you buy it to check for flaws. Later came factories and uniform products. Though this is still before my time I am led to assume that there was a time when you didn't have to check these products carefully because they were all mostly the same and each one had some expert attention to detail. Expert examination should weed out most of the minor problems which must result. Minor manufacturing defects, when fixed early, will cause no further trouble. However minor problems will eventually become major problems if ignored. One further distinction is that things used to be overbuilt and so could handle some defects without issue.

In the modern manufacturing world little of this still holds true. Products are not overbuilt, but instead use the minimum necessary materials of the minimum necessary strength in order to reduce cost. Further the pace of manufacturing has gone up. Where before every piece would receive some fine finishing work at the hands of a human, now humans barely touch the products at all.

This results in manufacturing defects. Not necessarily the kind that will cause the item to fail spectacularly, but the nagging variety which makes an action slightly less smooth, or gives a boot a spot that rubs, or just generally reduces the lifespan of the item.

I have heard that in the Soviet Union no consumer appliances arrived in working order. Instead the family handyman would have to spend hours finishing the appliance until it worked. From then on the appliance would supposedly run forever. I sometimes wonder if we are slowly approaching this state of affairs, one small item at a time.

Cheaper at all costs is nickel-and-diming us.

Stack Machines

Recently I read Stack Computers: The New Wave and have spent a short time pondering what is contained therein. A quick summary of the technology of stack machines: you have a CPU which doesn't have any programmer-visible general purpose registers. Instead the programmer generally has access to two stacks, a data stack and a return stack. All arithmetic operations use data from the data stack, and subroutine return addresses go onto the return stack. It is sometimes also possible to push arbitrary data onto the return stack to avoid accessing memory when something complicated needs to be done to the data stack, one example being duplicating an entry arbitrarily deep in the data stack.
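The two-stack organisation is easiest to see in a toy interpreter. The opcodes and encoding below are invented for illustration; real stack machines described in the book differ in many details:

```cpp
#include <cstddef>
#include <vector>

// A toy two-stack machine: arithmetic operates on the data stack, while
// CALL and RET use a separate return stack, mirroring the organisation
// described above. No general purpose registers are visible at all.
enum Op { PUSH, ADD, CALL, RET, HALT };

int run(const std::vector<int> &code) {
    std::vector<int> data, ret;  // data stack and return stack
    std::size_t pc = 0;
    while (true) {
        switch (code[pc]) {
        case PUSH:  // push an immediate onto the data stack
            data.push_back(code[pc + 1]);
            pc += 2;
            break;
        case ADD: {  // both operands come implicitly from the stack top
            int b = data.back(); data.pop_back();
            data.back() += b;
            ++pc;
            break;
        }
        case CALL:  // return address goes to the return stack, not memory
            ret.push_back(static_cast<int>(pc + 2));
            pc = static_cast<std::size_t>(code[pc + 1]);
            break;
        case RET:
            pc = static_cast<std::size_t>(ret.back());
            ret.pop_back();
            break;
        case HALT:
            return data.back();
        }
    }
}
```

Note that ADD names no operands at all; that implicit addressing is what lets real stack machines get away with tiny opcodes.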

Stack machine architectures have a couple of interesting properties owing to the fact that the operands are always known in advance. The first is that you can direct the ALU to perform every possible operation before you have decoded the instruction and then simply pick the correct output. Also, since the top entries of the stacks are known in advance, you can keep them in blazingly fast register memory and avoid going out to main memory. Since there is almost no state to save, servicing interrupts costs about as much as a subroutine call, and subroutine calls cost on the order of one memory access. Finally, since most or all operations work on the top of the data stack, the number of distinct instructions is reduced, allowing smaller opcodes.

This latter is important because the linked book implies that stack machines are predominately limited by how fast instructions may be read from main memory. An additional interesting point made by the book is that stack machines may have lower latency because the constant data dependencies mean they cannot effectively be pipelined deeply.

Obviously stack machines haven't taken over the world. I believe that the major reason for this is C. Though it is possible to compile C for a stack machine, C makes assumptions about how the stacks are laid out and how memory is accessed which require more work. Specifically C, or at least C code, assumes a single stack which contains both locals and return addresses. Further, it is assumed that locals may be accessed not only in any order, but that they exist in memory. As with register machines it is possible, in some cases, to have the optimizer figure out that a local is never accessed through memory and need not exist there, but it is complicated.

Stack machines exist, more or less, solely to run FORTH, a stack based programming language. Unfortunately my limited understanding of FORTH and stack computing in general leads me to believe that getting good performance and code reuse, an aspect FORTH is famous for, requires a consistency of design throughout the system. Such a consistency almost requires starting from scratch and working in only tiny teams. Considering the FORTH style I believe that it would be trivial to produce an effective static object system and further that such a system would be, due to the extremely cheap subroutine call costs, well suited to the runtime optimized functions as used in Synthesis.

Stack machine friendly OO languages are perhaps an intriguing concept. Though it is easy to implement a dynamic OO on stack machines I am left wondering how well a dynamic OO with optional method parameters could perform.

Review: Other M

Last week Courteney bought Metroid: Other M and started playing through it. I've been watching her play, sometimes playing from the backseat. Since I've watched the majority of the game I'm going to review it in point form. Without further ado, here are the important points:

  • The graphics are quite good, sidling right up to the uncanny valley without beginning to decline. I would declare it to have the perfect amount of detail for standard definition televisions.

  • Samus is much more action oriented. The first person view is no longer the primary action view. Instead you spend most of the game viewing in third person. This allows various high-energy moves to be scripted, such as lunging finishing moves and action-roll dodges.

  • There's a plot! Whether this is good or bad depends on if you like knowing what is going on in the game or whether you just like exploring and killing anything that moves.

  • The Zero Suit Samus model is too busty and thin for the character. The model looks like an underfed lingerie model and does not have the athletic body her profession would imply.

  • *Spoiler* Nagubal, gur oynpx punenpgre qbrf abg npghnyyl qvr. Guvf vf tbbq orpnhfr ur'f gur zbfg yvxrnoyr fhccbeg punenpgre.

Overall I would rate this game as "Would watch played again". Given that the controls have been simplified I may even consider playing the game myself sometime.

Review: The Singularity is Near

I have just finished reading The Singularity is Near by Ray Kurzweil. The basic premise of the book is that information processing capability has been increasing exponentially since the beginning of life on Earth. Furthermore, this exponentially increasing rate of processing is going to continue increasing, if not forever, for at least another two hundred years. By that point human society, certainly not flesh and blood humans, will be masters of all matter reachable at half the speed of light.

Specifically, Ray Kurzweil posits that this will occur because of the Law of Accelerating Returns. The premise behind this law is that as progress marches on an ever greater proportion of every industry will be equivalent to information processing and thus able to take advantage of the exponential increases in capability.

Ever increasing mastery of genetics, nanotechnology, brain mapping and simulation, and Artificial Intelligence are noted as the keys not only to continuing the exponential path of information processing, but also to solving all the problems of humankind, with disease, poverty and death specifically mentioned. The vast majority of the book is spent showing examples of how various important industries have historically taken advantage of the increases in information processing capabilities to progress further, and then showing more examples of how technologies which were extremely primitive at the time of publishing, 2005, but only ten or twenty years from production, will ensure the continued exponential increase in information processing capabilities. A small portion of the book is devoted to arguing, rather weakly in my opinion, that it is a moral imperative to maximally develop these major technologies despite the serious innate risks.

The two primary arguments for the continued exponential increase in information processing capacity are a history of exponential increases and strong AI. The first argument is that because the information processing capability has been increasing exponentially for all of history, recorded or otherwise, it is reasonable to expect it to continue until some limit is hit suddenly. The strong AI argument predicts that before we hit the limit of human understanding we will have AI strong enough to design even smarter AI and computing machines, leading to continued exponential increases in processing capability per gram up to the theoretical limits of physics.

The historical data provided is convincing of the increasing power of hardware in terms of raw MIPS; however I believe there is a flaw in Ray Kurzweil's argument in that it is not raw MIPS which determines capability, but effective MIPS: the usable MIPS available after various efficiency losses are taken into account. These losses are significant and include computer architecture losses such as cache misses, communication losses such as synchronization, complexity management losses such as abstraction, and innate problem limitations such as necessary serialization. Though the raw MIPS of hardware has been increasing exponentially, the effect of these limitations has also been increasing exponentially. While I do not have any hard data on hand, I would still agree that there is a net increase in the capability of hardware, but if this increase is exponential then at the very least the growth is closer to linear than to doubling every eighteen months as predicted by Moore's Law.

Though a decrease in the exponent does not entirely invalidate the argument, it does drastically change the time scales involved according to the Law of Accelerating Returns. The five years of hindsight since the book was published provide some evidence that the effective processing capability is not increasing as fast as Ray Kurzweil believed: specifically, the predictions of the available processing power for 2010 are already off by a factor of two or three.

The second argument, that we will have strong AI before we reach some limit, is sound in the theoretical world. The limit which Ray Kurzweil alludes to is that of human creativity: that is, we will create strong AI before humans become unable to hold enough of the design of the necessary systems in our heads to make forward progress. Whether or not the human mind is sufficiently capable to create strong AI before reaching our assisted limits is unclear. However, there are other limits, not discussed in any depth, which threaten much more than humans just not being smart enough. In chemistry there is the concept of an activation energy. For many types of reactions, if you plot the energy of the reaction, positive for energy put in, negative for energy put out, you will see a bump just before the reaction starts to output energy. Unless sufficient energy is put in to crest the hill, the desired reaction will not occur. There is a similar requirement in the development of any technology. Certainly a nuclear power station can generate trillions of Watt-hours of energy throughout its lifetime, but if you do not have the energy to build the plant in the first place you can never tap that energy, even if you already have all the necessary knowledge.

So it is with information processing capability. Though we can create systems of ever greater processing capability, they will require ever greater energy to run. The book argues from the theoretical minimums that this will not be a problem, but to achieve those minimum energy levels we require a minimum level of processing capability which we do not currently meet. Thus ever more energy will be required until we have reached the activation energy to enable low energy computing. It is much in doubt whether human industry will choose to support this level of energy use in competition with the other demands on our finite energy generation capability. There is further the considerable possibility that the cheap energy provided by oil will run out before the processing capacity necessary for efficient computing is reached. Oil currently accounts for 37% of the world's energy production. The loss of this proportion of the energy supply will greatly exacerbate the competition information processing research faces, and it seems likely that maintaining current industries will take precedence over new research.

With the threat of insufficient energy supplies in the near future ultra-efficient computing may not come to fruition at all. Pushing back the timeline for sufficient processing capability due to a reduced effective rate of increase makes it more likely that the energy will run out before strong AI becomes a possibility.

Now that I have expressed my concerns about why the Singularity may not come about at all, it is important to explain why I believe it should be avoided. My argument is essentially that on the path to the Singularity lies the inevitable extinction of the human race. I will demonstrate this by referring to the destructive power of the major technologies Ray Kurzweil believes are necessary to power humanity to the Singularity. In the book Ray Kurzweil covers each of these threats and concludes that they are insufficient reasons to stop progress, through two lines of reasoning.

The first line of reasoning is that these technologies hold the ability to reduce human suffering and it is thus morally required that they be developed. This argument misses the point that there already exists the technology and capability to drastically reduce aggregate human suffering in the world: if the funds used to power technological progress were instead directed at making existing technologies cheaper and more reliable, and at distributing these tools to those in need, the majority of the human suffering in the world could be ended forever. It merely requires sacrifice. Though it is not obvious, I also believe that this argument, when used in the context of a specific technology, may lead to ignoring the unintended side effects of the new technology and thus cause further suffering. One major example of this is manufacturing automation. While the advantages of automated manufacturing, cheaper goods, are deemed quite valuable, the reduction in the number of unskilled jobs and the resulting unemployment (it is not always possible or economical to retrain for the limited number of more skilled positions, which are soon to be automated themselves) are often ignored.

The second line of reasoning, that as long as we are sufficiently security conscious these technologies contain the necessary defensive tools, is based on two invalid themes. The first is that we are currently dealing quite satisfactorily with the artificially created threats of computer viruses and their familiars. As counter evidence I present the Internet. Even with the best security software a person can buy, a careless user will quickly have their computer infected with several viruses, trojans and spam bots. Currently the only real payoff for the creators of this malicious software is money or personal information. A radical madman cannot effectively gain control of a significant number of critical military systems to be able to launch missiles. However a single cracker can easily amass a botnet of millions of nodes on the public Internet for the purposes of DDOSing or spamming. If we are unable to protect ourselves against quite limited malicious software, should we allow malicious 'software' to take the much more potent form of a custom virus or nanobot swarm? With a greater effect on the real world the ideological payoffs increase greatly. Why limit yourself to getting the message out when you can devote a couple of years of your life to wiping out the heathens yourself?

The second invalid theme Ray Kurzweil invokes is quite surprising for a book which is all about the exponentially increasing power of technology. It is simply this: he assumes that the new threats of custom viruses and malicious nanobot swarms will always be of the same magnitude as existing diseases and threats. That is, that things which can wipe a nation off the map instantly will be restricted to large governments, as nuclear weapons are due to the cost of their creation, and that small threats will act pretty much as diseases do now, taking sufficient time to spread that they can be detected and fought. Neither assumption holds when it comes to powerful custom viruses and malicious nanobot swarms.

The most critical flaw is in assuming that these tools do not provide the power to instantly destroy an entire population. The human immune system has evolved over millions of years to handle threats of the sort that exist in nature. The threats which exist in nature, on the other hand, have evolved to spread using the tools at their disposal. This means that diseases which kill too quickly are limited in their spread by the size of the village. Diseases which don't spread or kill quickly enough give humans sufficient time to either defend themselves directly or evolve at least a partial defence. Now imagine a custom virus which works like HIV, that is, lies in wait for years before attacking and destroying the immune system, but instead of stopping at the immune system proceeds to destroy any tissue it has infected. Now imagine that it is transmitted like the common cold. Such a virus could kill the vast majority of the human race in a decade.

Consider further the nanobot case. Since the human immune system has never seen a nanobot, it is likely ineffectual in defending against a nanobot infection. Let us assume the best case of Ray Kurzweil's future: a nanobot immune system covering the Earth to prevent a grey goo scenario, ten times more effective against novel nanobots than the human immune system is against novel viruses. Even under such an assumption it is safe to assume that such an immune system will at least sometimes fail, because a determined madman can isolate a sample of the immune system and test thousands of nanobot and virus variants against it to find a set which either strain the system's limits or sneak by entirely. If it only takes one nanobot swarm to convert a nation into goo, or one virus to destroy a population, then even a single failure is unacceptable.

Nor is it a valid assumption that these destructive technologies will be restricted to large governments. The entire point of the singularity argument is that as progress moves on, more and more of the processes of creation will be information processing based, and the tools for that will become ever cheaper and more widespread. You may not be able to procure the necessary technology from your corner store, but you could certainly steal it from a University laboratory.

I believe it is clear that, given the range of mental reactions and states of all the people in the world, there will unavoidably be numerous disasters resulting from maliciously designed viruses, bacteria (fast plastic eating bacteria, anybody?) and nanobot swarms which will kill millions on a regular basis. Further, I believe that the ideological bias Kurzweil displays in his book, that Neo Luddite beliefs are indefensible, is nowhere near as clear cut as his flippant responses to valid concerns may have you believe.

Overall Ray Kurzweil in The Singularity is Near does a good job of playing the starry-eyed futurist, but fails to convince not only that the Singularity is likely to happen, but even that it is desirable to cause it to happen.

Art Appreciation

Art is captured emotion

That isn't quite right. It isn't as if looking at art floods me with strange emotions of a bygone era. In my experience I do not get new emotions from art. Instead art brings out the emotions from memories I already have. When I see a happy campfire scene I am not suddenly filled with happiness from some random source; I am filled with the happiness I've felt during all the campfires I've sat around with friends. More accurately, art triggers emotions which are already in you because you have already experienced them.

Art is captured sparks of emotion

This seems better and more accurate. But if art is merely the spark of emotion, then obviously there must be something already within the viewer to set aflame.

Some might argue that everybody may equally experience art since the emotion is always there, ready to be tapped. For is not every person able to experience the same range of emotions? While I must agree that every person is capable of experiencing the same range of emotions, and further that these emotions exist as part of normal development, I disagree that everybody is equally capable of experiencing art. It is not the emotion, but the connection between the emotion and the art stimulus which is critical to experiencing the full emotional power of art.

If we assume that these connections are important, then the world of art becomes clear. It becomes easy to understand why the great works of art fail to impress the general public. The general public no longer has the cultural connections to the art which existed when they were new. People no longer walk on the beach with suits and parasols. And yet the art buffs and critics have put forward the effort to understand the cultural connections so that they may experience the art.

It also becomes easy to see why older people enjoy and collect more art. The older you get the more life experience you earn, the more emotional connections you create. It is foolish to expect children to fully understand and experience the emotions implicit in a young boy and young girl holding hands and walking down a dirt road. And yet to seniors such a scene holds many powerful memories and emotions of first love and youth.

Finally, it becomes obvious why people create art. Art holds the power to bring to the forefront the emotions of a whole series of memories all at once. This creates a powerful and complex emotional mix. As we forget the negative memories the positive emotions stand out stronger.

The important message to understand with art is that even if the art doesn't change, it gets better with age and exposure. It is also important to not expect the young or inexperienced to truly appreciate art. To appreciate art requires life experience.

In Praise of Concise Books

Imagine yourself outside on a sunny day in the shade of a tree. You go to open the book you've brought along to read on this brilliant summer day. Perhaps you are reading a novel, perhaps you are studying a mathematical text; whatever you are reading, you are reading for pleasure. Make sure you have this image firmly in your mind before you move on to the next paragraph.

Take your mental image and look at the book you are reading. Is that book some heavy fifteen hundred page monstrosity or is it a nice light pocket book? Is it some middle ground? Unless you are a glutton for punishment I would expect that you are not choosing to read the immense book under the tree and that you are doing this for more than just weight reasons.

In the not too distant past books were, in general, shorter. This doesn't mean that they contained less, just that they were concise. This is in part because, before the advent of mass production, printing large books was expensive, and before the advent of computers, writing large volumes of text was difficult. Who wants to handwrite the equivalent of fifteen hundred typed pages several times over while producing the manuscript? These limitations on book size haven't been an issue for several decades, but as with everything with a cultural component there was a lag before large books were accepted.

Unfortunately large books are not only accepted where they are necessary; the size of a book has become synonymous with its quality. This is an unfortunate aspect of the bigger-is-better phenomenon. There is the additional aspect of laziness on the part of readers these days. It is generally expected that comprehending a passage should take only minimal mental effort and study. In the past this was not the case, and you are probably aware of the image of learned men poring over small volumes for weeks at a time.

Through my education I've come to realize two important things about reading books. The first is that any idea can be explained in any number of words or symbols, from incredibly dense mathematical notations to long wordy chapters. Orthogonal to the number of words used to describe the concept is the mental effort and time required to understand the concept. Given identical amounts of context a concept requires an identical mental effort to understand irrespective of the density of the explanation. This is not to say that the number of words does not matter. Too few and much time is wasted deciphering. Too many and much time is wasted condensing.

The second thing I have learnt about books is that being concise and useful requires focus and skill on the part of the writer and patience on the part of the reader. A writer must resist the temptation to repeat themselves and the reader must understand this and have the discipline to start at the beginning.

It is truly the concise books which add the most value to our lives. The long fantasy epic may provide many hours of frantic reading, but the pocket novel provides a pleasant, relaxing read. Even more so the short story provides thoughtful entertainment in time to wait for the bus. The difference is even greater when it comes to scholastic texts. The immense tomes of science and mathematics are often more confusing and less suitable for in depth study then the slim, focused texts.

The more I look at the world the more I believe that less is the answer, not more.

Writers Block

Have you ever been in the situation where you are either required to, or simply desire to, write something, but just can't find anything to write about? If so, you've had one form of Writer's Block. It is really quite annoying, especially if you are trying to keep a consistent blogging pace (though judging from the number of comments, nobody can actually bring themselves to read what I write).

Well, this is where I find myself. I have no trouble finding words to write with, but the topic has been eluding me for a little over a week now. It's also not that I don't have anything to write about. I have a couple of topics which I will write about sooner than later, they just aren't done percolating yet.

So what to do then? Give up on writing to the world semi-regularly? Write about the inane things that happen in my life, such as the store I recently saw in a mall which sells nothing but toilets? Or head into obscure topics which fall flat even with my geekiest friends? I'm really not sure, and that's why you get a small, boring post on how I am unable to write anything interesting.

Buy Once Environmentalism

There are many forms of environmentalism which differ mostly on what they wish to save. You can save the forests, but you'll have to give up the oceans and rivers to fertilizer. You can save the fish, but will have to sacrifice immense areas of land to become garbage dumps. There really is no single solution.

I would like to promote the idea of buy once environmentalism. The theory is simple: buy an item as few times as possible. Do not buy a new computer every year; do not buy something you are already planning on throwing away. Instead buy durable and repairable goods.

The industrial production system consumes large amounts of energy and materials, and produces large amounts of waste and pollution, for every object which is manufactured. The difference in environmental cost between a single well made item and a cheaply made alternative is not large. However, the well made car, fridge, computer or can opener will outlast several of the cheap equivalents. This is obviously a net win for the environment.

There is an old saying which is relevant: A rich man buys a pair of boots for $100 and has dry, warm feet for a decade. A poor man buys a pair of boots for $50 and has wet, cold feet for six months.

Buy once environmentalism is not only environmentally sound, it is also cheaper. So buy once.

Identity

Identity used to be a simple concept. You were identified by who you were and where you lived, but since nearly everybody you knew lived with you that was inconsequential. This was the case when fire was the new thing.

Later on, with the invention of agriculture and specialization, identity became a little more complicated. In addition to who you were, where you lived and what you did became important. In fact the profession became so important that it became part of a person's name. No longer was John sufficient; instead it was John Smith of the village of Foo.

Though this is more complicated than the simplest form of identity, it isn't nearly as complicated as identity was about to become. As villages became towns and towns became cities, the law increased in complexity and the concept of identity followed suit. At some point identity split into two components: legal identity and natural identity. Natural identity remained, for a time, one's name, profession and home town. Legal identity started out the same way, but quickly added proof of identification, such as a signature or wax seal.

From here identity only got more complicated. Natural identity expanded to include all the goods and services and distinctions which expanding wealth allowed. These components had always been there but, as with location in the beginning, few people travelled far enough for the distinctions to become obvious. Natural identity has expanded until it reached its current state, where who you are to real people is a composite of what you look like, which music you listen to, what you drive, where you live, what you wear, your profession, your personality, your interests and those sorts of things.

Unfortunately the growth of legal identity is not so simple or clear cut. As time and technology and the complexity of the legal system increased the simple legal identity of a name and a signature became insufficient. Fraud became too prevalent and coordination between groups became necessary to the industrial legal system. It used to be that you would only deal with businesses within your local community. As long as you did this then your legal identity could be simple because it was closely tied to your natural identity. However, the industrialization of the legal system required a strict separation of these two forms of identity.

Consequently legal identity became the enormous, contradictory monstrosity it is today. A complex and fragile system of numbers and accounts spread across hundreds of organizations now defines a legal entity. This system groans under its own weight and complexity. It both stifles freedom by being inflexible and grants freedom through easy theft. The fragility of this system is the reason identity fraud is so simple and easy. Only a small part of these numbers is needed at any one time to create new numbers tied to this amorphous legal identity but controlled by criminals.

There is hope, however. We have today the tools necessary to reconstruct the legal identity system in a way which is simpler, more robust, more flexible and more secure. Only three obstacles stand in the way of public key cryptography reforming legal identity: inertia, the legal system and the police state.

The latter is reason enough to prevent such a reform. Conveniently the former two are obstacles of sufficient strength as to likely be insurmountable. Sometimes imperfection is the correct solution.

TL;DR

Web 2.0 whippersnappers have a shrinking attention span. Complete thoughts get "tl;dr". Incomplete thoughts get broadcast as worthwhile. Long sentences, greater than 140 characters, are ignored. Soon we'll all talk like young children.

Makes me angry as I crave thoughtful discussion.

Discussion Forums

In the beginning there was the fire. People sat around the fire to cook and chat. Times were good. Some time later alcohol was discovered and alcoholic drinks devised. Some time later came the public house. From this point on all forums of discussion have gone downhill.

Don't get me wrong. I'm not saying that more modern discussion forums don't have their advantages. It is undeniable that email allows discussion with more people than the pub and that instant messaging allows discussion when people are otherwise supposed to be working. However all the more modern forums lack at least one thing that the pub provides.

It may be useful to split this discussion into two parts: realtime and non-realtime. This distinction is important because it divides the people who may discuss into those who have little better to do or are discussing alongside some other task and those who are giving their full attention to the discussion. Of course this is a generalization, but a useful one. Also note that we will only be covering methods of holding a discussion, not conversations. Discussions are between more than two people.

The realtime discussion methods include party telephone lines, IRC, party-line IM and radio. These can basically be divided into text only and voice only. Text only realtime discussions suffer from the fact that people type much slower than they speak. It has also become the norm to deliver text conversations as whole lines or sentences instead of letting recipients read as the sender types. This is done to make the conversation less painful to read, so you don't have to watch every typo being made. Old talk systems used to work the character-at-a-time way, though. Voice communications don't have this packetization problem, but it is difficult to split the discussion off into subdiscussions because there is often only a single channel. Realtime voice communication has the additional benefit over realtime text of inflection. I'm sure everybody has experienced the offhand remark which was intended to be sarcastic and was taken as a personal attack.

Non-realtime systems have a bit more variety. One thing you don't see much of is non-realtime non-text communication. I believe that this is mostly because it is a pain to do and doesn't provide sufficient gain. It may also not occur simply because people haven't thought of it yet (Maybe it is time for the Web 2.0 Video forum?). This leaves mostly text based methods. Of these the most common are: usenet, email, web forums, BBS's, blogs, social networking sites and article comments. Now some of these have mostly fallen out of fashion in the past decade, such as usenet and BBS's, but they are all still in active use. Mostly these systems are divided into two major categories: messages come to you and you go to the message. Many of the former are the older systems, usenet and email for example. These are differentiated in that each user uses some software to connect to a server which contains all the messages waiting for them to read. These messages themselves originate on many systems from many people. The latter type of systems, where you go to the messages, are mostly the newer style systems. These include web forums, blogs, social networking sites and comments.

The major advantage of the systems where the messages come to you is that it takes less time to collect the messages and there is more flexibility in viewing them. If I can't stand reading a flatly threaded discussion with dozens of participants then I can use a threading client. However there are also those who can't stand threading. Using our own software gives us both the ability to view messages as we desire. It is difficult to explain the rise of the other form of mechanisms, where you need to go to the messages, except in the context of the increasing view that the entirety of the Internet is nothing more than the Web.

Now how do these modern methods compare? The current voice methods are expensive and time consuming, though broadcast-only forms are starting to gain prominence. Specifically, amateur podcasts and video blogs can be quite successful when they keep a tight focus. Blogs and comments tend to go together, though it may be argued that the post is more the foil for the discussion than part of it. The major problem with comments is that they are too dispersed and activity in them dies out quickly. Social networking sites are a bit better in this regard, but they are not conducive to in-depth discussions, mostly due to cumbersome interfaces and, in some cases, length limits.

In many ways I find it difficult to top the capability of the old timers of the Internet: usenet, email and IRC. They have really covered the bases as far as I can see. IRC handles most realtime discussion needs. IM tries to be as effective, but IM discussions tend to end up muddled and difficult to coordinate. Though it has fallen from the public eye, I truly believe that no forum for discussion with random strangers has topped usenet. There are places to talk about any topic on usenet and the capabilities of modern readers far exceed those of any web forum. There is also the fact that the discussions can be global in nature. Sometimes it's nice to discuss only with your friends, but you'll often hit limits as you discover that your friends either all agree on a topic or just don't care about it.

Finally we have email. People complain about email all the time. They don't like the SPAM, it isn't fast enough, it isn't pretty enough, etc. Yet for all this there has been no true competitor which has gained traction. There is no other system able to handle the volume while still providing quite good reliability (sure, a message isn't guaranteed to arrive, but 99% of the time it does, and email handles server outages pretty well). No other system does this while also allowing large and varying lists of recipients and true offline capability with attachments. In fact, perhaps the greatest complaints about email come about because of terrible email readers (webmail and Outlook are not good tools by any means) and the lack of authentication.

Authentication of email is an interesting problem. On the one hand a large part of the robustness and flexibility of email comes from its store-and-forward nature, and yet a large part of its utility is the ability to send from nearly any server, with email claiming to be from nearly any domain. That is, it is not only possible but common for a business to handle moderate volumes of email without hosting and maintaining its own mailserver. Furthermore, the mailserver it does use is often inaccessible (for sending outgoing email) from the workstations. Instead mail goes out through the office ISP's mailserver. A system which didn't work this way could certainly be made to work, but at an increased cost and complexity.

One consequence of this is that there is no association between an email address and the sending computer. This results in a sizeable chunk of the SPAM. It is important to note that SPAM happens even when there is little doubt that the actual sender of the message is the authenticated owner of the account. Now there are solutions to this problem, but not many people use them. The best solution I am aware of is PGP/GPG. These allow messages to be signed, giving some guarantee that they were sent by whoever claims to have sent them.

I believe that the only reason these have not really taken off is twofold. First, in most situations the need is not so acute that it is worth any effort to rectify. This is becoming less the case as more and more is done online, but it is a reason nonetheless. The second reason, in my opinion, is that the proponents of these systems have been too zealous in achieving perfection right off the bat. Any tutorial you read will give you complicated rules of thumb with scary warnings, in an attempt to have you construct a perfect and watertight web of trust fit for conducting the most confidential and important of business communications. This is really the wrong tactic to take. Instead they should simply promote the most basic use as a toehold. Instead of admonishing users to use a strong passphrase and protect their private keys like priceless jewels, they should recommend that, unless you have a need for further security, you use no passphrase and make sure that every computer account you send email from has a copy of the key. These programs could help this along, while still maintaining security, by tagging such passphrase-less private keys in some special way. That way those who desire security will know not to put any trust in those unprotected keys.

In fact, if the big webmail providers automatically created a key and automatically signed every message it would increase the security of email in general by more than all the promotion of encryption software to date. Would this provide perfect security? Of course not, but some security is much better than no security.
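As a concrete illustration of this low-friction approach, here is a sketch using the GnuPG command line. The key name and message are made up for the example, and `default default never` simply accepts GnuPG's default algorithm and key size with no expiry; everything runs against a throwaway keyring so nothing touches your real keys.

```shell
# Use a temporary keyring so this sketch doesn't touch your real keys.
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"

# Generate a passphrase-less key: the low-friction "toehold" described above.
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key 'Alice <alice@example.com>' default default never

# Clearsign a message...
echo 'meet at noon' > msg.txt
gpg --batch --yes --local-user alice@example.com --clearsign msg.txt

# ...and verify the resulting signature (msg.txt.asc).
gpg --verify msg.txt.asc
```

A webmail provider doing the equivalent of this automatically, per account, is all the basic scheme above asks for; the passphrase and web-of-trust ceremony can come later for those who need it.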

Now back to discussions. There exist realtime discussions on IRC, though those have issues with long, in-depth discussions because people are usually not focused solely on the discussion as they would be in the pub. There exist email mailing lists, but they don't have the level of privacy that a pub provides. There also exist a small number of optionally encrypted mailing lists. These are perhaps the best fit for good discussions when you are unable to take them to the pub. Sure, secure email doesn't have beer and doesn't have emotion, but it does have the participants' focus.

Thus, in my search for valuable conversation I have created a mailing list. It is a private mailing list, but notify me if you want to be added. The list also supports encryption, which means that anything may be discussed there, safe from prying eyes. Hopefully it becomes home to valuable discussion on any number of interesting topics.

Exceptions, Massive Concurrency and Pseudo-Serial Languages

Imagine, if you will, that you are in a world where the performance of a single processing core is no longer increasing. Instead every device has an ever increasing number of processing cores, each of limited capability. As I'm sure most of you realize we have nearly reached this point. Those of you who have been paying attention will also realize that the current common systems for allowing concurrent computing have limitations. Functional programming just never caught on, threads are difficult to get correct, pipelining can only soak up so much processing power and the other systems are more specialized with the consequent limitations. Research is being done into programming models which are not functional and not serial, but there is no convincing evidence yet that they will remove serial languages from the seat of domination.

So what can be done if we wish to stay within the confines of serial programming languages? Obviously we cannot stay strictly within the serial paradigm. What we need is something which operates concurrently, but looks as if it is executing serially. Along these lines I believe that the most promising avenue is promises. Promises are basically delayed subroutine execution. When such a subroutine is first called it returns a token which is used to access the result of the computation. If the result is ready when it is accessed then processing continues as if it were always available. However, if the result is not ready then execution waits until the result is complete. During all this time the computation has been queued for execution and, ideally, completed.
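The token-then-block behaviour described above is essentially what futures provide in several mainstream languages. A minimal sketch in Python (the language choice and the `slow_square` example are mine, purely for illustration):

```python
from concurrent.futures import ThreadPoolExecutor
import time

def slow_square(x):
    time.sleep(0.1)          # stand-in for an expensive computation
    return x * x

with ThreadPoolExecutor() as pool:
    token = pool.submit(slow_square, 7)   # returns immediately with a token
    # ... the primary thread is free to do other work here ...
    value = token.result()   # blocks only if the computation isn't done yet
    print(value)             # -> 49
```

If the computation finishes before `result()` is called, the primary thread never waits at all, which is exactly the "as if it were always available" case.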

Promises have an interesting interaction with a popular language feature: exceptions. Exceptions, as many of you are aware, are non-linear program execution constructs. Specifically, a series of statements is checked for exceptions being raised. If an exception is raised then execution leaves the normal flow and enters an exception handler. Now this works, in modern languages, because of the assumption that code executes serially. One necessarily assumes that once all the checked code has been executed, no more exceptions may arise as a direct result of that code. It is possible that the results of that code were incorrect in some way and later code will react badly, but that is an indirect result.

Promises break this state of affairs, because the execution of all the statements is not guaranteed to occur before the primary flow of control has left the checked section of code. There are five differing methods of dealing with this.

  1. Pretend that the exception occurred when the primary thread was within the checked block. In this case you call the exception handler for the checked code section. The primary problem with this method is handling execution which has occurred after the checked section. The ideal case would be to enclose all computation within a transaction and roll back that computation. I haven't yet thought too much about those systems, but I have the feeling that these systems are necessarily equivalent to functional programming along with all the associated problems.

  2. Don't pretend that the exception occurred anywhere in particular. Instead raise the exception within whichever checked section the primary thread happens to be in when the promise raises it. This has the obvious problem of being entirely useless, because an exception may then be raised at any time. No assumptions can be made about the flow of execution. Only the top level exception handler of a thread would be generally useful.

  3. Pretend that the exception occurred at the point when the result of the promise was accessed. This loses much of the advantage of exceptions. Instead of having logically removed yet geographically associated handling of exceptional situations the programmer is required to handle exceptions when they are accessed. This really breaks the geographic association of error handling code with the highest useful level of processing which caused it. This variant of promise exception handling does have the advantage of making it obvious to the programmer what processing has been done and needs to be undone. The biggest problem with this is spreading the error handling code over a large volume of code, likely requiring multiple handlers be written for each user of the result.

    I suppose it may be possible to attach an exception handler to each promise, but then this is conceptually identical to the first option.

  4. Not have exceptions. This is perhaps the most radical option, mostly because exceptions have been seen as the greatest advancement of serial languages in the nineties. Now in some ways the end result of this choice is similar to the previous option. The error is communicated to the primary thread at the point the promise is accessed. The primary difference is that the error does not automatically bubble up the stack if there is no handler. Instead all errors must be handled at the first access. This enforces more disciplined error checking (see how well that worked in C?), but does make handling errors which must be passed up more cumbersome.

  5. The final way to handle exceptions is to not pass them to the primary thread at all. Instead there can be a per-promise option on whether to retry the promise, cancel the primary thread or consult some error handling thread as to how to resolve the failure. This error handling thread could provide a valid result to be used, cause the promise to retry, terminate the thread or perhaps do something more complicated such as rolling back a stack limited transaction. This last option assumes that there exists some section of processing which is easily functional in nature and allows easy rollback.
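Of these options, the third corresponds to how futures already behave in several existing languages. A small Python sketch (again merely illustrative, not part of the original argument) shows the exception being stored inside the promise and re-raised only at the point of access:

```python
from concurrent.futures import ThreadPoolExecutor

def failing():
    # stand-in for a promised computation that fails
    raise ValueError("computation failed")

with ThreadPoolExecutor() as pool:
    token = pool.submit(failing)
    # The primary thread continues past this point with no sign of trouble;
    # the exception sits stored inside the promise until the result is used.
    caught = None
    try:
        token.result()        # the exception is re-raised here, at access time
    except ValueError as err:
        caught = err

print("handled at access point:", caught)
```

Note how this demonstrates both the advantage (the handler sits next to the code that uses the result) and the drawback (every user of the result needs its own handler) discussed in option three.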

The question is what should be done in a language where any call which is expected to take more than a microsecond is automatically promoted to a promise. Such a language may be able to take advantage of implicit, short term concurrency in code. Concurrency which is otherwise too expensive to exploit by more traditional means. However the exception problem must be resolved. Almost certainly the first two options are unacceptable. The first option results in the exception handlers being called in entirely unknown situations. Any amount of processing may have occurred before the result of the particular promise is accessed. The second option has a similar problem and the additional problem that exceptions become unusable to check and handle recoverable errors. This second option makes it impossible to use exceptions as flow control, which is convenient in certain situations.

Which of the three remaining promise safe exception models is the most useful, with the fewest gotchas? I'm not sure, but at the moment I am leaning in the direction of not having exceptions for recoverable errors, but instead having exceptions handled in a thread top-level exception handler for serious errors only. Errors such as IO failures, memory exhaustion or the like. I believe that such a compromise will allow exceptions for what they are best at, handling truly exceptional situations, while providing the least number of surprises in dealing with promises.

Thoughts?

Augmented Intelligence

This past weekend I read Accelerando by Charles Stross. The central theme of the book seems to be a story of human life in the time surrounding technical singularity. It also covers other topics which include: autonomous corporations, intelligence augmentation, capitalism in a post-scarcity world and immortality through consciousness upload. I find several of these interesting and may discuss them at length later. In the interest of expanding my reader base to those who don't believe in reading more than two paragraphs I'm going to comment only on the first couple of pages.

I highly recommend reading the first couple of pages and considering if you are enough of an information junkie to enjoy living like that. For those who aren't enough of an information junkie to have read that, basically the protagonist has reality augmentation glasses so he is constantly deluged with information. I couldn't live like that. That life seems so spastic.

Later on, after technology has progressed, the humanoids are branching and merging their consciousnesses to perform all the tasks they need. This includes, in one case, living several simulated months when meeting a new person. Oh, how I have often wished I could do that.

The Economy According to Travis

Everybody should be given the opportunity to create their own theory of economics. There are the common views of economics, the predominant views of capitalistic economics and several less popular theories which get media time every time a crisis shows the failings of the current leading choice. This is one reason everybody should create their own theory of economics: the current ones aren't perfect. The other reason is that the popular theories are popular not because they are more correct, but because they enable some to make money off gaming belief in the system itself. Here is my theory.

As I see it there are two sorts of activities one can partake within an economy: producing wealth and conserving wealth. Producing wealth always involves changing physical material from one shape or form to another, but it is important to note that not all manipulation of physical material creates wealth. Conserving wealth, on the other hand, does not necessarily involve physical manipulation. Conserving wealth is simply that, doing something more efficiently so that it requires less wealth.

It is critical never to confuse wealth and money. Money is a medium of transfer of wealth, it is worth something only so far as it can be exchanged for goods or services. Money is also not the only medium of exchange, you can buy thousands of dollars worth of skilled labour with beer, pizza and friendship. Money is also not an effective store of wealth; events within the economy make a fixed sum of money have a varying value over time. The current economic system favours decreasing the value of money as time progresses.

Now there are two sources of wealth: raw resource extraction and human labour. Obviously most activities involve some amount of labour, but in many cases it is quite minimal, such as selling software online. Conserving wealth is simply making some action require less wealth. This is most often saving labour, as in the case of many machines or improved engineering to reduce the amount of material required. The reason reducing labour is more common is because reducing the labour necessary to mine iron makes iron cheaper, which makes steel cheaper, which makes mining equipment cheaper, which makes iron require less labour. Cycles of this form and more complicated occur throughout the economy.

To clarify the distinction between producing wealth and conserving wealth, some examples are warranted. What is traditionally considered resource extraction, logging or mining for example, produces wealth. On the other hand, nearly all 'white collar' careers produce little to no wealth, but instead conserve wealth. That is, lawyers don't create anything of value, but they reduce the costs of making agreements between parties with disjoint needs. Programmers tend to create little value, but save significantly on the costs of communication and processing and, to a lesser degree, increase efficiency overall. In between these two extremes lies a range of production versus conservation. Restaurants are a good example. A restaurant takes some raw materials and produces a meal. The restaurant itself did not create or procure the raw food, so it is not primarily producing wealth. Yet the restaurant isn't entirely replaceable by making a meal at home, because the meal is often better or cheaper, or the venue more relaxing, or some combination. Thus a restaurant is a mix of producing wealth, by improving the value of the raw food, and conserving wealth, by saving the time of its patrons, wasting less food or providing a level of experience which would be difficult to reproduce at home.

As may be evident in the previous example, entertainment is a wealth producing activity. It is a straight conversion of labour into wealth. Note, however, that mass-produced entertainment is primarily conserving. One is always able to spend their labour creating entertainment for themselves, but producing it and then providing it to others saves people from this task.

So now that we know what the two forms of activity are, we should discuss how their value is determined. Wealth production is valued based upon supply and demand. This supply and demand is similar to, but distinct from, the common notion, mostly in that it produces not a curve in two dimensions but a surface in three: supply, demand and value. While the physical constraints are loose, that is, while there are plenty of easy to access trees, supply will meet demand at a nearly flat value. However, as supply becomes limited the value increases. While demand remains high or continues to grow, the value will increase. In the two extreme cases, high demand with high supply or low demand with low supply, the value will remain high. This latter distinction from the more traditional definition occurs because of the labour costs of switching professions, handling small orders and the like.

Conservation of wealth also has a value. However, that value is limited to the value of the real resources, labour or material, that can be saved. If a skilled builder can build a cabin in half the time using half the wood, then that is the upper limit on the value of his specialized labour. This draws attention to the distinction between specialized labour and general labour, the labour of the non-specialist, or even of the specialist in a field other than their speciality. As an example, anybody can paint walls, but that does not make them a painter, and it is likely that the professional will do a better job quicker and cheaper in terms of wasted materials.

That is my theory of economics as it stands. It is not a complicated theory, but then again neither are the leading economic schools. What is complicated are the consequences.

So what are the consequences of this philosophy of economics? The first is that labour is expensive. So expensive, in fact, that the majority of the wealth of the world is spent avoiding labour at all costs. One need look no further than automobiles, trains, ships, electric generators, the telephone, computers or any communication mechanism. The bulk of technology has been created to avoid having to pay a person to walk or carry or think. The second is that the only constant profession is farming. Other resources will become limited and be abandoned for new materials, and for every labour saving job there is some person trying to save the labour of that job. The world will continuously strive to do more with fewer people while at the same time increasing the number of people. Wages will continually fall in real terms because, although labour is expensive, anybody will sell themselves cheap when starvation is the alternative.

Also, programmers will continue to try to program everybody out of a job. I believe that one day they will succeed. When this comes to pass, let us hope we have all joined the leisure class.

Survivability and Population Density

While by no means the most numerous and certainly not the most physically imposing, the human race is the single most powerful force on Earth. We've done it through the use of, and dependence upon, technology. One need only read my previous post, featured right below this one, to see that there are challenges ahead caused by this very technology. The greatest threats are the ones which hold the possibility of a short disruption of the economy.

Now how does this relate to population density? That's easy. The denser the population, the smaller the area of ecological disruption and the fewer resources which must be put toward transporting goods around the economy. However, the greater the population density, the greater the dependence upon the smooth operation of the economy to ensure that the people living in these areas get the goods they need to survive. Conversely, the lower the population density, the greater the proportion of resources which must go toward transportation, in exchange for a lesser dependence upon a smoothly running economy.

To see this in action, consider water. In a high density city, water is supplied by a small army of technicians who ensure that it is cleaned, filtered, pumped and delivered. In most rural areas each household has its own well to supply its needs. On a per capita basis the city dwellers are less ecologically damaging, but if the economy has a hiccup then the water may very well stop flowing.

Next consider electricity. The vast majority of people have electricity delivered to them by the massive machine that is the economy. However, when the power goes out in the city, large areas quickly become inhospitable. All those sealed office towers gain tropical climates without constant air conditioning. The upper floors of skyscrapers become accessible only to the most athletic. Many places become too dark to move around in. Similar things happen in lightly populated areas, but there is also significantly less dependence in those areas. If there are no lights you can move to a window. If it is too hot you can let air in. Nowhere will it be necessary to climb twenty flights of stairs.

Perhaps most importantly is the availability of natural resources to make up for economic shortfalls. In a city there is only a small amount of burnable material on a per capita basis. In the woods there is plenty.

This is all important because there is no perfect solution. Living in higher density areas reduces the environmental impacts of economic activity, but increases the sensitivity to disruptions caused by ecology and other factors. As such, cities help prevent economic disasters, but are much more sensitive to them. Living on a farm makes it easy to support yourself, but, unless you are farming, you will spend significant amounts of time and fuel commuting.

In the middle we have the suburb. In any civilization-ending disaster, the suburbs are where you want to end up after the majority of the human population has died off. They are not as good as farmland for supporting yourself, but will have ample precut firewood (furniture), shelter and other leftover bits of technology (bikes, lights, pots, etc.). Unfortunately suburban living is nearly as expensive, in terms of resources per capita, as farmland, but is just as dependent on the economy as the cities.

This is the core of the problem of choosing the correct population density for survivability. Choose high density for prevention at the expense of greater damage. Choose low density for lesser damage at the expense of lesser prevention.

The End of the World

Every age has its various challenges, naysayers and preachers of the coming doom. The present is no different. For those who are not on top of the current list of issues, I summarize them below. As in all cases, being forewarned could lead you to be forearmed.

I present these threats to civilization in order of exacerbation. That is, later items, should they come to pass, are likely to exacerbate the difficulties caused by earlier ones.

  1. Solar Superstorm. The Sun has solar storms which can increase the strength of solar flares and the solar wind. These, in turn, touch the Earth's magnetosphere. Apparently once every couple of hundred years the Sun has a very strong storm which causes extreme activity around the Earth. The effects include the Aurora Borealis being visible in New York City, long distance powerlines being charged with large induced currents and general dysfunction of the electrical grid and radio spectrum. Watch out for blown transformers and widespread, long duration power outages.

  2. Emptying of the Seas. Man has fished the seas of the world for his dinner since the beginning of civilization. In recent decades, however, the demands placed on the oceans have been steadily increasing. This has led to overfishing and the collapse of various fish stocks, such as the cod. There is little sign of this slowing as the fishing industry moves on to less desirable species to fill its holds. Watch for seafood becoming an unaffordable treat.

  3. Agricultural breakdown. With the rise in industrial agriculture and the decline of the family farm the tendency of farming has been towards maximizing production per acre at all costs. This has meant pesticides, chemical fertilizer, engineered crops, immense irrigation projects and minimization of fallow land. Industry has optimized agriculture and as with all optimized systems any unexpected change can bring the system down.

    Add to the current demand the increasing demand in China and India for Western style diets full of variety and animal products, and you have powerful pressure to increase production. This will lead to further optimization and industrial farming on marginal lands. Watch for shortages and increased prices.

  4. Empty Aquifers. Perhaps the single greatest invention in the history of man has been irrigation. Irrigation freed farmers from dependence on rain. It has allowed marginal land to feed the world. Unfortunately much of this water, especially in the plains of the world, comes from underground aquifers. Though these aquifers do replenish over time, they do so at a lesser rate than we are currently emptying them for irrigation. It is expected that in the near future some of the aquifers in the breadbaskets of the world will run dry. Watch for failing crops in the heartland of the USA and reduced production from southern California as two of the first difficulties.

  5. Aging global population. No matter how hard people try, they can't stop getting older. Couple this with a declining birthrate and you end up with a population whose average age increases with every passing year. Civilization has not seen a population where more than half of the living people are retired. This brings with it several challenges, the most difficult of which is economic. Will the economy function when half the population does little productive work? Where will the money come from for pensions, health care and education?

    For the young this is a great opportunity because labour and skills will be a seller's market. For the old it is less clear. Those who have saved nearly nothing will find themselves working until they die. Those who thought they had saved enough will find every service suddenly more expensive than they predicted. Those who invested in real estate may find prices dropping as retirees sell en masse to pay for their retirement. The infirm will perhaps be hardest hit, as there won't be enough people of working age to hire a sufficient number of nurses and other aides.

    Watch for stiff competition for workers causing wage increases and inflation and labour shortages all around.

  6. Peak Oil. Peak oil has been touted since the seventies as coming any year now. It has not come yet, but we are thirty years closer. Modern civilization is brutally dependent on oil. It is used in the manufacture of fertilizer, plastic and machinery of all sorts. Some food is even made from it. Oil runs our transportation systems and some of our electrical grids.

    Peak oil is not running out of oil. Peak oil occurs when the production of oil, how much is pumped out of the ground, cannot be increased further to keep up with demand. The industrialization and westernization of China and India are rapidly increasing demand. When peak oil arrives, the essential structure of the economy, both globally and locally, will change. Watch out for the end of air travel, increased prices across the board and the rebirth of rail.

  7. Climate change. Though there has been much talk in the media about climate change, there has been little discussion of the cause of the real difficulties. All the difficulties caused by climate change, whether it is hospitable lands becoming inhospitable deserts, rising sea levels or more powerful winter storms, are only problems because they are change. Civilization has optimized its functioning on certain assumptions about climate. It has also proven that it can withstand fierce storms which occur on a frequent basis; just look at places hit with hurricanes every year.

    The difficulty comes mostly from holding out through the adjustment period. Some lands will require evacuation. Others will become more productive. Watch for mass moves and general upset of all the components of civilization. Especially watch out for the newly impoverished who have been forced to abandon their lands, possessions and ways of life.

  8. Peak Energy. Similar to peak oil, this is when the production of energy, mostly electrical energy, cannot keep up with demand. Most of the good energy sources are near capacity. There are few good valleys left to dam, nuclear has political problems and renewable sources require decades of heavy research. Watch for drastically rising electricity prices, the return of battery powered appliances and rolling brownouts.

  9. Good for nothing young people. Since the beginning of time older people have been claiming the next generations are of a lesser sort. This time is no different. As in ages before, the young people of today are: disrespectful, blasphemous (both religiously and ideologically), lazy, lacking in vision and generally good for nothing. Of course most of that has come about because they have been pampered compared to their parents. However, there exists the seed of a will to live and succeed inside every person, no matter how useless they appear. Watch for changing definitions of success, intergenerational struggle and, ultimately, adaptation to the new world.

  10. Revenge of the exponential. Since the beginning of the Industrial Revolution the global economy has been growing exponentially. The exponential function is perhaps the most misunderstood function, yet it has the greatest impact on the life of the common person. Compound interest, inflation, population growth and resource consumption are all examples of things which have been growing exponentially for decades or centuries. When an exponential system meets bounded physical limits there will be trouble. Watch out for painful economic restructuring, likely resulting in all existing virtual asset investments being wiped out.
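The arithmetic behind that last item is easy to check for yourself. Here is a minimal sketch; the 3% growth rate and the ten-times resource base are illustrative numbers of my own choosing, not figures from any real economy:

```python
# At r% annual growth a quantity doubles roughly every 70/r years
# (the "rule of 70"). This helper counts how many years of compound
# growth it takes to overrun a fixed physical limit.
def years_to_reach(start: float, limit: float, rate: float) -> int:
    """Years of compound growth at `rate` before `start` reaches `limit`."""
    years = 0
    while start < limit:
        start *= 1 + rate
        years += 1
    return years

# A modest 3% growth rate needs under eighty years to multiply
# consumption tenfold -- well within a "next hundred years" window.
print(years_to_reach(1.0, 10.0, 0.03))  # → 78
```

The point of the sketch is only that steady percentage growth produces surprisingly short timelines against any fixed bound, no matter how generous the bound looks at the start.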

And those are the major issues, as far as I am aware, which will come to the forefront in the next twenty years and should be well into full swing within the next hundred. The outlook may appear grim, but the obstacles to persevering are not insurmountable. All it takes is a strong will, sacrifice and ingenuity.

Why You Aren't a Cop

As you may be aware, the RCMP has been on a big recruitment push for the past couple of years. According to this source, part of the reason is that there was a big recruitment push in the seventies; those recruits are now getting ready to retire. That article also alludes to reduced interest in the profession. It is not only the RCMP which has this recruitment problem; municipal forces have it to a lesser degree, but the RCMP has it worst, mostly because they offer lower wages and remote posts.

To understand the core reason nobody wants to be a police officer, you need to first understand that, to most of the public, a cop is a cop. To most there is no relevant distinction between the RCMP, the Vancouver Police Department, the LAPD or bylaw officers. This is important because I firmly believe that policing suffers from a serious negative image. Do not forget, however, that though images may not be entirely accurate, they tend to be more representational than fictional.

In the past, police were firmly viewed as upstanding members of the community. They were friendly when doing their rounds and helped people. More often than not they were a voice of reasonable authority who could be trusted to act in the interest of the community. This is significantly less true today.

Today the direct interaction people have with police tends to be restricted to getting a ticket for something they consider perfectly reasonable. This is the single greatest problem of the modern police force. No longer do people consider the police friendly and reasonable, but instead they are viewed as speed bumps in the course of living a reasonable life. Worse than this limited interaction is the relative increase in surveillance caused by the increase in population density and unmarked cars.

It used to be you could drive ten minutes from home and be in a rural area where there was little fear of being caught. If you wanted to drink in a field you did so. If you wanted to do a bit of drag racing you did that too. Mostly nobody got hurt. If you did that now you would often find the cops showing up, either because they are cruising around or because somebody called them. If your fun is often ruined by police you won't think much of them.

Then there is the recent increase in unmarked cars. So called ghost cars have a legitimate purpose in formal, undercover investigations. However, I have noticed many such cars pulling people over. Such actions can be construed as the first steps toward a police state and are definitely not friendly. If police hide their presence from the community at large, they should expect to be treated like any other group which hides its work from the public, namely criminals.

Those are the reasons people have negative direct experiences with police. If the direct interactions are negative, the indirect interactions are downright terrible. As noted above, to the public all police and all police forces are the same. This means that any time a person reads a story or views a video of police brutality, corruption, use of unnecessary force, unreasonable TASER use, speed traps or police provocateurs, they see all police as untrustworthy.

Police have a serious image problem as being against freedom and the public. It is no wonder that recruitment is down. A much more severe problem, which I have not seen addressed, is the self-reinforcing nature of the problem. If fewer good people wish to become cops there will be fewer good cops. With fewer good cops the impression will tend to be more negative. Not all these problems are the fault of the various police forces; if the politicians demand that there be no tolerance, then there will be no tolerance. Many of these problems have solutions within reach of these forces; they just need to start serving the entire community again.

The Three Stages of Success

When a profit seeking venture unleashes an innovative product, the world is full of possibility. The first time a significantly more reliable car rolls off the production line, the first time a new search algorithm is used by the public or the first time an AI makes a stock trade, the world changes. Suddenly the lives of some are better. This is the first stage: taking over the world.

Then comes success. This is when everybody wants the new car, everybody uses the new search engine, or the AI starts making money. The product is on top of the world. The venture is profiting handsomely from the growth. This is the second stage of success: profit.

If the product didn't exist in the real world, this would be the end of the story. The growth, by definition an exponential process, would continue indefinitely. Most people believe they live in this ideal world and base their actions on that assumption. It is for this reason that nearly every venture fumbles the third stage. The real world has limits: there are only so many people to buy cars, only so many searches, only so much money.

The third stage is dealing with the fact that the world has been taken over. Everybody owns a car which is equally reliable and efficient, most searches are done using the new algorithm, the AI now controls a significant portion of the world's money. The growth doesn't necessarily end at this point, but the rules of the game have changed. Once everybody owns a car, far fewer need to be made, and the cars' very reliability means fewer older vehicles will be replaced. When most searches are done using the new algorithm, creators of web pages include less metadata explicitly for the computer. Why add numerous links when you are just a search away? When the AI is a major force in the economy, the rules of that economy change: old heuristics become invalid and new ones appear.

Those who fail to plan for the third stage are doomed to decay. This decay may be slow because of the massive reserves gained in the second stage, but decay of the product is unavoidable. Those who plan will simply silently replace the product with another new one.

Planning too far into the future is a waste of energy, but in the life of a product (1. World Domination 2. Profit 3. Saturation) it is important to take the next stage into consideration.

If I Had a Majority

If I were the leader and had a majority in the House of Commons, there would be some changes. The first and most significant would be my goal to be a one term wonder. I believe that the largest source of dysfunction in any political system is the career politician. If you are worried about re-election then you are unable to focus on solving problems. If you don't solve problems effectively then you need to worry about being re-elected. It is a vicious cycle. The only way out is to not consider re-election an option. Aiming not to be re-elected has the added advantage that enemies can be made without fear. The ever present enemies of any solution are slightly depressing.

I start by not wanting to be re-elected, needing only to conserve my political capital for the length of one term. This shouldn't be that difficult; the Conservatives seemingly haven't had any political capital for three elections. What do I do with my improbable power and freedom? Make things better.

I start by moving the age of consent back to where it was ten years ago. Nothing good has come from coddling children and nothing but bad has come from the extension of childhood.

Along the same lines I'd reduce the Federal minimum drinking age to fourteen. So many teenagers already drink from this age that the limit isn't effective. Even worse, they are forced to do it in secret. Things done secretly are never done as safely nor with as much moderation. Reducing the drinking age has the added benefit of pushing the frequent binging to a time when teenagers have less money and are, for the most part, unable to drive. Nothing makes it easier to avoid driving drunk than having lots of practice.

Following this line of thought I would make drinking in public legal. Plenty of other countries allow drinking in public and their societies haven't fallen apart. Again, being forced to drink secretly in public worsens the situation. When you can't bring beer you are forced into the water bottle filled with vodka. Public intoxication will still be illegal.

Next on the agenda comes automotive fuel efficiency. I am not entirely certain of the rules at the moment, but they certainly do not please me. I would set the minimum combined city/highway fuel economy to 30 MPG for cars and 20 MPG for light trucks. These minimums would be increased by 3% each year until liquid petroleum fuels are no longer used.
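A 3% yearly increase compounds faster than it sounds. This small sketch shows how the floor grows; the function name and the year horizons are mine, purely for illustration:

```python
def mpg_minimum(base_mpg: float, years: int, annual_increase: float = 0.03) -> float:
    """Fuel economy floor after compounding the yearly increase."""
    return base_mpg * (1 + annual_increase) ** years

# Starting from the proposed 30 MPG floor for cars:
print(round(mpg_minimum(30, 10), 1))  # → 40.3
print(round(mpg_minimum(30, 24), 1))  # → 61.0
```

In other words, the standard would roughly double within a generation without any further legislation, which is the whole appeal of tying the increase to a percentage rather than a fixed number.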

Having set personal transportation toward improvement, I would turn my eye toward public transit. The Federal government is unable to do much directly; however, I would pressure the appropriate governments not to fund transit via taxes raised off automobile use. It is counterproductive to have transit funded by gasoline, parking, insurance and other automobile taxes. How can an effective transit system be developed in any region if every improvement and ridership increase reduces funding for that same system?

While these are happening, I would move forward with policies to decentralise the Federal Government offices. At the moment a large majority of the Federal Government is housed in the area of Ottawa. This made sense when most of the information required to run the government was on paper and the postal system was slower. In the present, however, most communication is done using digital documents. Having all the offices in one geographic area is an inefficiency. How many hours are wasted each day because of traffic in Ottawa? How many processing delays exist because all the work is done in a single timezone? How much economic harm is done by funnelling so much money into southern Ontario at the expense of the outlying regions? Much better is distributing the work.

This work I would distribute to small towns all around the country. Putting these new distributed jobs into large cities would, in time, recreate the centralisation problem we already have. Small towns have the additional benefit of likely requiring lower wages due to lower costs of living. Prime candidates for government offices are towns in the North. Many people in the North are unemployed seasonally, and consistent jobs will shrink the unemployment rolls. The North is all wired, and it doesn't matter how bad the winter storm is; you can still make it across a town of one or two thousand to an office just a handful of kilometres away.

In moving the government offices I've reduced poverty in small, remote towns and shortened the unemployment rolls. I've possibly even saved money in the long run. How else can I improve the countryside? Infrastructure. Specifically, building or upgrading rail/highway/fibre links between communities. Of these three the least important is the highway. I propose to help pay for this infrastructure by hiring the unemployed in the areas affected. Additionally I propose an opt-in programme for prison inmates where they agree to work in a labour camp to work off their term a third faster. Inmates may be paid, but well below minimum wage. Having inmates work should ease the burden on the prison system. Nothing reforms a person like five or ten years of hard labour.

Speaking of criminals, I would immediately shut down the long gun registry. It is immensely expensive and provides no benefit. Criminals just don't use hunting guns.

Relatedly, I would rewrite the way corporate fines are computed. Instead of whatever system is in place now, I would institute a statistically rational punishment scale. It works like this: first take the maximum amount of money breaking the law may have saved the corporation and quadruple it. To this add twice the cost of all cleanup and restitution. This is the amount of any fine levied. Should this fine not be paid, all senior management goes to jail and the assets of the company are seized for government auction. Things should settle down a bit after the first multi-billion dollar fines are handed out.
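The fine schedule above reduces to a one-line formula. A minimal sketch, where the function name and the dollar figures are hypothetical examples of my own, just to show the scale involved:

```python
def corporate_fine(max_savings: float, cleanup: float, restitution: float) -> float:
    """Quadruple the most the violation could have saved the corporation,
    plus double the full cost of cleanup and restitution."""
    return 4 * max_savings + 2 * (cleanup + restitution)

# A violation that could have saved at most $10M, with $3M of cleanup
# and $2M of restitution owed, draws a $50M fine:
print(corporate_fine(10e6, 3e6, 2e6))  # → 50000000.0
```

The quadrupling is what makes the scale "statistically rational": even a company that expects to be caught only one time in four still loses money, in expectation, by breaking the law.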

Copyright. Copyright, from the government's point of view, used to be easy. Only those with lots of money and corporate backing cared about it or produced valuable content. With the rise of computers this is no longer true. I would push an updated bill biased toward the consumer. The overly large media conglomerates already have more than sufficient power in the form of Loonies.

I have in mind many other things which I may do, but I am not yet entirely convinced about all of them. The final thing I would do as Prime Minister, before being run out of office by those who hate the Canadian people, is to do my best to diversify Canadian trade away from the USA. I just don't feel that the USA is reliable enough.

Lessons at Christmas

The best thing about the holiday season is the abundance of baked goods and other treats. The second best thing is the long periods of time off and the distractions from regular life, mostly family from out of town or feasts. This year the calendar was arranged well. Even though I have no vacation time, I ended up with a four day weekend. During these four days I discovered, or perhaps rediscovered, some unsurprising facts.

The first and perhaps most useful of these is that computers are the cause of all the ills in my life. During the break I barely touched a computer at all and I was happy. It is unfortunate that while computers seem to be the source of all the frustration in my life, they are also the source of all its necessities. I am not yet sure what I will do about this rediscovery. Perhaps I'll examine and fix the aggravating factors. Perhaps I'll stop touching computers as a hobby. Perhaps I'll try to make my living some other way so that I can still enjoy computers.

The second thing I discovered is that happiness is spending a lazy Sunday with a fresh book on a comfortable couch. I have been quite busy the past couple of years and haven't had the chance to read for pleasure much. In the past few weeks I have found time to read four books among my other chores. I believe I will attempt to keep this up, though perhaps at a more sustainable level. Reading until two in the morning before work is not necessarily a positive thing.

Rediscovering reading has also clarified the direction my media consumption has been heading since I left for my trip. I have, for nearly a year now, been avoiding visual media of all kinds: TV, movies, pictures and everything on the Internet which isn't predominantly text. I have instead been reading, listening to podcasts and talk radio. I never listened to much music and am listening to even less now. I find that these more restrained and thoughtful media reduce my stress level.

This leads me to a blog entry I had considered writing but never got around to, which I will summarize here. After listening to two CBC podcasts, one an Ideas programme featuring a speech on newspapers and the other a Rewind series concerning the history of Public Relations, it occurred to me what the future of the news industry may be. The primary issue with news today is that it mostly prints press releases, for various reasons. This, coupled with the ever increasing number of minutes of news reporting, has led to news becoming a constant stream of informationless data. The only future I see for news is for the industry to drastically increase its signal to noise ratio, starting with a stiff cut in the amount of data output. Newspapers will not become extinct, but will instead return to weekly printings of news, not just reworded press releases, stock prices and sport scores.

The final discovery comes about because I watched Avatar in 3D. I am impressed by how far technology has come in crossing the uncanny valley since Final Fantasy. There are only a few spots where the unrealism is jarring. 3D movies, however, don't add enough and are, I feel, only a gimmick. One thing these filmmakers need to learn is that in a 3D movie you can't direct the attention of the audience through the use of focus. The entire volume must be in focus at all times. Doing otherwise gives viewers who want to look at the scenery eye strain. I know it'll be a while before I watch another 3D movie.

Information Organization

Information has come up a couple of times among my friends in the past short while (see here, here and here). The solutions and problems discussed seem to revolve around ignoring the things you aren't interested in to make more time for those which you are. Apike makes the suggestion of ignoring aggregators in preference for primary sources. Curtis just wants to read everything without setting his brain on fire.

My view on this whole debate is different. I avoid information overload in three ways: filtering aggregation, categorization and prioritization and finally quick filtering. Through the application of these three techniques I learn about everything important without spending my life reading.

First we have filtering aggregation. This means getting most of my transient information, that is news and gossip, from other people: people who filter the Internet for me. In this class I read Slashdot, certain sub-Reddits and a few other sources. I fully realize that I won't see every little piece of news on the latest gadget, but I am OK with that. When picking aggregators it is important to keep in mind your general interests and acceptable volume. I find that older sites tend to do better at filtering the useful from the inane and transient. This is likely because, being older, they have attracted an older crowd which has learnt the lesson that you can't know everything. Volume is critical: no site which ever has more than twenty-five items a day should be considered, especially when viewing the constant stream.

It is important to distinguish two separate ways in which you can use an aggregator. The first, which is the most common now in the days of RSS, is to take every article posted as the list of articles to read. The second, which has largely fallen out of favour, is to take a snapshot of the article list once a day. This latter method works especially well with aggregators which keep popular links up longer. I find that Reddit is best read this way, and I only read the top page of each sub-Reddit I follow once a day.

The second technique is categorization and prioritization. Most of the articles and content on the Internet loses value incredibly quickly. An article which is inspiring and groundbreaking one day is valueless by the next week. It thus becomes easier to pick out only the most valuable information the older the articles are. This lets us make use of the powerful tool of prioritization. By prioritizing the articles with respect to relevance we ensure that we only read what we have time for. Any articles which don't fit within my fifteen minute morning reading session are left until I have more time. Later, when I get to reading them, time has passed and articles of lesser importance have accumulated. This provides three ways of making our lives easier. The first is that the articles are older and so of lesser value. This means that more of them will fall below the minimum value threshold and not be read. The second is that there are more of them. When a topic has a sudden burst of articles it likely means that something of interest has occurred in relation to that topic. That topic becomes worth reading. The other topics which have few articles concerning them are less likely to be interesting. The third advantage is that if you wait then most of the interesting comments will have accumulated on the interesting articles. Do not underestimate the power of interested and knowledgeable commenters to provide value to an article. In fact, I often read Slashdot articles solely for the (filtered) comments.

Categorization and prioritization is all about delaying. With the addition of an extra twelve hours it quickly becomes obvious which topics had something interesting happen and which didn't.
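To make the delaying idea concrete, here is a rough sketch of how one might automate it. Everything here is invented for illustration: the function name, the twelve-hour delay, the burst threshold and the feed data are assumptions, not part of any real feed reader.

```python
from datetime import datetime, timedelta

# Toy sketch of the delay-and-prioritize idea: let articles age, then
# surface only topics that show a burst of activity. All names and
# data here are made up for the example.

def topics_worth_reading(articles, now, min_age_hours=12, burst_threshold=5):
    """articles is a list of (topic, published) tuples.

    Only articles older than min_age_hours count, and a topic is
    surfaced only once enough of them have accumulated, since a
    sudden burst suggests something actually happened.
    """
    cutoff = now - timedelta(hours=min_age_hours)
    counts = {}
    for topic, published in articles:
        if published <= cutoff:
            counts[topic] = counts.get(topic, 0) + 1
    return sorted(topic for topic, n in counts.items() if n >= burst_threshold)

now = datetime(2010, 1, 2, 9, 0)
feed = ([("keyboards", datetime(2010, 1, 1, 8, 0))] * 6
        + [("gadgets", datetime(2010, 1, 1, 8, 0))] * 2
        + [("keyboards", datetime(2010, 1, 2, 8, 0))] * 3)  # too fresh to count yet
print(topics_worth_reading(feed, now))  # ['keyboards']
```

The quiet topic and the too-fresh articles are simply never shown, which is the whole point of waiting.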

The final tool is quick filtering. This is the only technical aspect of my strategy. In dealing with large volumes of information, most of which is of little value, it is important that filtering be a quick, efficient process. The first requirement is that whatever software you are using be quick. Any delay in processing your commands is unacceptable. If you use a web based RSS reader I would recommend you try some other reader. Ideally the delay between you skipping one article and the next appearing would be 20 milliseconds or less. If you can feel a delay it is too slow.

It is also important to minimize the information you, the slow human, use to filter upon. Ideally you will filter upon only the text in the subject. There should be no date (does it really matter if it happened today or last week?), no author (generally more interesting authors should be in a higher priority group) and certainly no body text. Making reading decisions solely on the quality of the subject may seem harsh, but the basic fact is that an author who cannot write a concise, interesting subject is unlikely to have written anything truly original or interesting. Filtering by subject line also reduces the constant delays of refocussing your eyes and of having your computer process your commands. If you can ignore a full screen of articles with a single keyboard command then it matters less that it takes your software one hundred milliseconds to switch to the next screen.
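A subject-only filter like this is trivial to mechanize. The sketch below is purely illustrative; the ignore patterns and subject lines are made up, and decisions are taken from the subject text alone, never the date, author or body.

```python
import re

# Toy subject-only filter: a subject line survives only if no ignore
# pattern matches it. Patterns and subjects are invented examples.

IGNORE = [re.compile(p, re.IGNORECASE)
          for p in (r"\brumou?r\b", r"\bunboxing\b", r"top \d+")]

def keep(subject):
    """Keep a subject line only if no ignore pattern matches it."""
    return not any(p.search(subject) for p in IGNORE)

subjects = [
    "Rumour: new phone to launch next month",
    "A clear explanation of garbage collection",
    "Top 10 gadgets of the year",
]
print([s for s in subjects if keep(s)])
# ['A clear explanation of garbage collection']
```

Because the rules only ever see one short string, a whole screen of articles can be dismissed in a single pass.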

That is the strategy: don't look at what you can avoid seeing, don't read now what can be left until later, and quickly ignore anything which got past the previous filters. With that you'll be reading significantly less than you ever imagined possible while not missing out on anything truly groundbreaking. Remember, people on the Internet are like a flock of birds: anything truly interesting will set them off into a repetitive flurry that lasts two or three days. It doesn't matter whether it is the original article or the hundredth response to it which catches your attention, you can always follow the links back to the source.

There is one more factor to take into consideration. For the health of the Internet it is not possible for everyone to read only heavily filtered lists of articles. If everybody wanted to read only the best one percent there would be nobody to read through the remaining chaff to find the gems. You should pick your favourite topic and read it unfiltered, keeping in mind to pass along articles which others would find interesting. You need only read to the level which your interest and time allow. If you can read all the source articles, go right ahead, but remember that people are needed at every filtering level to ensure that only the most relevant and important articles make it to the top.

In fact, I'd probably pay somebody to provide me a list of the twenty most important articles from yesterday.

Lessons

What did we learn from my last entry? Firstly that my initial computed layout wasn't very good.

The other thing I learnt is that it is impossible to keep a coherent thought when the majority of one's mental power is spent just struggling to put those thoughts down.

I won't be blogging much until I pick and get good at my new keyboard layout.

Feature Creep as a Human Phenomenon

Warning! The following is an incomplete thought and will be full of errors.

Feature creep is a term which describes the continual addition of features to a piece of software. This occurs even though the software already meets the needs of the vast majority of its users. Feature creep is detrimental because each feature makes the program as a whole slower and requires more resources. The end result of feature creep is the eventual collapse of the software or the creation of new software which serves the same purpose. In general software grows without bound until it is too cumbersome to update and too cumbersome to use, at which point it is dropped for some other software.

In our increasingly computerized world it is bad enough that all the software we create eventually becomes too complicated to operate. Unfortunately feature creep is not restricted to software; it merely advances fastest there. No, feature creep stems from Western society and can be seen in all its aspects, from engineering, to law, to education, to the very norms which create the society itself. This creep is not known as feature creep, but instead by the more positive sounding name of progress.

Examples of feature creep are easily found in life. Take the phone, which now surfs the web, plays games, organizes your time, takes photos, plays movies, does email and a host of other things. The scope of a phone has broadened. Now some of these features are useful to some people some of the time. Yet even the features a person doesn't use cost them. Those who do not need a colour screen still pay its battery cost. The law contains so many loopholes and special cases to satisfy small groups that an expert is needed to navigate it. It now takes a considerable effort to find a basic car, that is, one without AC, power seats, a high performance engine and other extras. These are a few examples of areas where it is difficult to find things without 'features' which are of no use to many people.

It seems that nobody can accept the way things are as good enough. Perhaps not enough people realize that at some point you can only improve one aspect at the expense of another?

Learning a New Keyboard Layout

Though I haven't mentioned it, I have been using a Kinesis Contoured keyboard for a couple of weeks. This is all part of my attempt to avoid RSI.

I am not content to just get a fancy keyboard. I am also changing the layout. I am currently trying out one layout which I computed. So far it is frustrating beyond belief, but that is mostly because 10 WPM is significantly slower than I otherwise type. I'm not sure if I will stick with this layout, try a different one, or fall back on Dvorak. Time will tell.

New Feature: Comments

Some people out there have requested the ability to comment on my blog. Well here it is. I have officially added the ability to create comments! It is still early so there may be quirks. If you come across any please do notify me.

Happy commenting.

Epic North American Trip Summary

Well I've been home for two weeks now and have finally gotten the last of the trip stuff squared away. And so without further ado here is the sum of the remaining wisdom and observations I have gained from the trip.

The first thing I am going to cover is that Canada is really big. In fact, Canada is so large that our epic trip turned out to be too epic. We had originally planned to travel into the territories. Alas this turned out to be impossible. By the time we had finished travelling through the provinces it was late August. In the North the winter comes early, with snowstorms starting in September. I figure that our epic trip, which we originally thought would take approximately five months, would actually take more like eight.

Canada is more varied than the United States in some respects. While the United States is more varied in climate it is less varied in culture. In Canada the climate is approximately the same no matter where we went. Sure there are some differences between the forests, prairies and tundra, but not that much. At no time did I feel that the landscape was alien.

The culture has an east-west gradient though. Well, not precisely east-west. With the exception of Ontario the culture becomes more stereotypically Canadian as you move east. Small, close knit communities become more common and you get a sense that good society is the goal more so than individual well-being.

Then there is Ontario. In many respects Ontario is the most American of all the provinces. This is more about feel than anything easily put into words. It is a start to say that the road system resembles the States more than that of any other province, that commuting is a way of life, and that there are a number of other factors which I just cannot articulate.

The most beautiful place in Canada is PEI, hands down. It really embodies the best of Canada: the beauty of a green spring, bright sunny summers, abundant locally grown produce and the snow that I love so much.

Unlike the US, the wealth of the average person seems more or less consistent across the country. Those in the west tend to have larger houses, but those in the east tend to have larger lots. There are far fewer run down houses and dying towns.

Now there are a few interesting facts about North America that I've learnt on this trip. The first is that no matter where you go you will find somebody has driven there from British Columbia, Ontario, Quebec, California and Florida. It doesn't matter how far from home they are, they will be there and have driven there.

Contrary to popular belief the Québécois are not all jerks. While I'm sure that some of the stories of the abusively rude are true, they cannot be as common as we are led to believe. However, I can confirm that there are people there who do not speak enough English to depend upon. Well, it is either that or the majority have realized that the best way to punish Anglophones is not to give them the finger, but instead to refuse to speak any English. Yes, even if this means suffering through their half remembered high school French from ten years ago. Being rude about it just sets people against you.

To some the automobile is the ultimate symbol of freedom. If you subscribe to this view I can assure you that it simply is not true. You are only as free as the amount of gas in your tank. Fuel is the leash to the freedom of your car. After this trip a 4X4 will never have quite the same sense of freedom.

The final great piece of wisdom is related to travelling. We were travelling for nearly six months. We had only a short break in the middle at home. This is too long. The trip started to become a grind after the end of the second month. I strongly recommend that no one travels for more than two months at a stretch. This goes doubly so if you are moving a lot.

In the end this trip cost a fair amount. From the receipts we collected I've totalled up the travelling costs. This is not the complete cost of the trip, as the cost of equipment and automotive repairs before we left is not counted. There are also likely data entry mistakes, lost receipts and illegible receipts. These values should be more or less accurate though. First, the USA.

Fuel (including ice): $2340
Attractions (including parking): $913
Food (both restaurants and from the grocery store): $1105
Accommodations: $2130
Miscellaneous (including vehicle repairs and cash withdrawals): $1360

The total for the USA is $7850 in US funds. There is some overlap in that we spent the cash we took out from time to time on items for which we received a receipt. Next up is Canada.

Fuel: $3559
Attractions: $834
Food: $1135
Accommodations: $1984
Miscellaneous: $1266

The sum total is $8779 in Canadian funds. Taking the exchange rate into account, the total travelling cost is somewhere in the area of $20,000.

The trip was well worth it. Though I will never do a trip such as this again I will remember it for the rest of my life. And after all this Courteney is still agreeing to marry me. Who would have thought.

In closing I would like to thank all those who helped us along the way and all the family we saw as we moved across. The trip would have been much more difficult without you. For those who have followed us on our journey I hope you have enjoyed it. Pictures are up in the gallery for the Canadian portion of the trip for those who wish to look.

Now it is back to regularly scheduled life.

KM 47773: Langley, British Columbia

Finally we are home. Stay tuned for a final summary post and a link to the pictures from the Canada portion of our trip.

KM 47219: Blue River, British Columbia

Today I didn't spend all day driving! The first thing we did after getting out of the campsite was to visit the legislative building of Alberta. It is much as you would expect and is similar to the one for BC except that it is a pinky-beige colour.

Then we visited the West Edmonton Mall. Courteney had never been there and wanted to go there to claim that she had been there. We played mini-golf there and I beat Courteney by ten strokes.

Then we drove westward. We should be home tomorrow in the afternoon and that is making both of us happy.

KM 46611: Edmonton, Alberta

Driving a brick against the wind all day is not only annoying, but also not fuel efficient. Getting closer to home though. Edmonton falls prey to the prairie city problem. That is massive sprawl. Worse is the fact that it is sprawling in a circular pattern. This means that it is crisscrossed with highways, all much like the last. It's annoying.

In more touristy news we saw two things of interest today. The first was a Naval Reserve Base in Saskatoon. That's right, a naval base in the prairies. We also saw the largest pysanka (Ukrainian Easter Egg) in the world and I tell you, dear friends, that it is larger than the largest ball of twine in Minnesota.

KM 45858: Wynyard, Saskatchewan

Well, not only did we make it out of Ontario, but we also drove across all of Manitoba. There was also another item which we checked off the list. We ate at a Red Lobster. It isn't a terribly exciting item, but now I can stop wondering about those commercials. We ate at one of the ones in Winnipeg, eating seafood at the midpoint of Canada and nearly as far from an ocean as you can get and still be in Canada.

KM 45039: Kenora, Ontario

Nothing went wrong today. This is awesome. We continue our way west and are almost out of Ontario.

KM 44302: Geraldton, Ontario

Well, my day sucked. First I had to wait to get my tire patched. Then my truck decided that it wanted to stall a lot. This happened most when I backed up, which is the one time I cannot go quickly because I cannot see out my back window very well. Then I got a leaking brake line.

In general it is impossible to get a mechanic to do any work when you want it to be done today. They are always busy. Perhaps I should have been a mechanic.

Anyways, I get lucky and manage to find somebody to fix it and it gets done early enough for me to still make some distance. My truck is starting to show its age and it worries me that I still have five thousand kilometres left to go.

KM 43626: Larder Lake, Ontario

Well, I drove a whole bunch and things were going well until I got a flat tire. It was late in the day so I put my spare on and continued on to the campsite. I'll deal with it in the morning.

KM 42803: Roberval, Quebec

The wheels on the truck go
Round 'n' round
Round 'n' round
Round 'n' round
The wheels on the truck go
Round 'n' round
All through the day

The gravel on the road goes
Ting, ting, tonk
Ting, ting, tonk
Ting, ting, tonk
The gravel on the road goes
Ting, ting, tonk
All through the day

The Courteney in the truck goes
Zzzzz, zzzzz, zzzzz
Zzzzz, zzzzz, zzzzz
Zzzzz, zzzzz, zzzzz
The Courteney in the truck goes
Zzzzz, zzzzz, zzzzz
All through the day

The gas into the truck goes
Glug, glug, glug
Glug, glug, glug
Glug, glug, glug
The gas into the truck goes
Glug, glug, glug
All through the day

The radio in the truck goes
Warble, warble, warble
Warble, warble, warble
Warble, warble, warble
The radio in the truck goes
Warble, warble, warble
All through the day

The wheels on the truck go
Round 'n' round
Round 'n' round
Round 'n' round
The wheels on the truck go
Round 'n' round
3,973,823 times!

I also saw a chopper with long horns on the front. Awesome.

KM 41820: Between Gagnon and Fremont, Quebec

Well, the ferry arrived late. That's unavoidable. Nonetheless we made good time and drove the nearly six hundred kilometres of gravel highway to Labrador City. This highway was often good (capable of 90 KM/h), sometimes poor (70+ KM/h) and only rarely bad (less than 70 KM/h).

We arrived in Labrador City looking for a room for the night, as I had promised Courteney that we would stay in a motel or hotel every night in Labrador. There was not a room to be had. Every room in the town and in the towns within an hour's drive was booked solid. Apparently contractors, likely for the mines, have taken every room.

So since there are no rooms available and the bugs are quite numerous we are in an abandoned gravel pit spending the night in the truck. After this night is over the only type of lodging we would not have tried would be a hostel.

Tomorrow promises to be a high kilometre day as we try to hurry our way out of Quebec.

KM 41150: Cartwright, Newfoundland (Day 4)

I didn't make an entry yesterday because nothing happened. We didn't even leave the room. Having no money in a small town sucks. One can only talk to the guy at the gas station for so long.

Today we got up at a reasonable time and packed up the truck. This evening we are finally on the ferry to Happy Valley-Goose Bay. Of course this is yet another bad ferry experience on this end of the country. First we needed to check in at five in the evening. Then they eventually started loading us. We didn't get a berth because there were none left. This means that yet again we are looking to sleep on whatever chairs can be found.

Well, even though we started loading shortly after five thirty, the boat didn't leave the dock until after nine. I have no idea what could have held up the boat so long, but this means that we will not make it into Goose Bay on time and this will set back our travelling.

KM 41139: Cartwright, Newfoundland (Day 2)

Well, this is the second of four days in Cartwright. We are waiting for a ferry and we have reservations on the Monday night ferry. We slept in late, did laundry and otherwise killed time.

KM 41139: Cartwright, Newfoundland

Throughout this trip we have travelled and acted according to our whims with minimal planning. We don't make reservations and really don't plan more than a couple of days ahead. Today this easy living has caught up with us.

On this trip there is one more ferry to take before we are solidly placed on highways all the way home. Well the ferry we need to take only runs twice a week, Saturday and Monday. Today is Thursday. We arrived in town in the morning and discovered this. We tried to get a reservation for Saturday, but the ferry is full. We now have a reservation for Monday and need to kill time in this small town until then.

So we have time to kill. We spent about three hours today chatting with Pete, the attendant at the gas station, a nice friendly guy. This was rather unexpected and pleasant, but it is only a start. These will be the hardest few days we've had. Not only is this town of 700 short on cheap activities, it is also hundreds of kilometres from anywhere unless you own a boat. It is going to be a long weekend.

KM 40967: Port Hope Simpson, Newfoundland

This morning we awoke at the brilliant time of five AM and I got to watch the sun rise. After getting cleaned up, fed and packed up we drove to the ferry to Labrador. A few hours later we were in Quebec, which is where the ferry lands, and mere minutes away from Labrador.

We decided to stop for lunch and were swarmed by small biting flies; it sucked. It does however remind me of how few bugs there were in Newfoundland. We came across almost none and at no time did we need to retreat or use bug repellent. It was quite nice.

So we've made it to Labrador, where the trees are short, the flies are numerous and the highways are gravel. At least they are well maintained gravel where you can often do 90 KM/h without trouble. Other than that there isn't much to say yet except that the towns tend to be small.

KM 40568: Raleigh, Newfoundland

Today we drove. And saw the only confirmed Norse settlement in North America.

That is, we have finally reached L'Anse aux Meadows, the Viking settlement from 1000 AD. The archaeological site itself isn't terribly interesting because, like most digs, it is carefully covered over for its protection. The remains are just shaped mounds in the ground which trace the wall plan.

Though that may not be very interesting, there is plenty around it which is. The first is the visitor centre, which is really a small museum. It explains the find, shows a few artifacts from the site and has a few more representing items which would have been here at the time but were too valuable to leave behind. There you can learn such interesting tidbits as that the Vikings didn't have a compass and navigated mostly by gut and the sun.

On the site, a short distance from the dig itself, there is a reconstruction of a few of the buildings using the local materials which would have been available. It is staffed by re-enactors who are quite knowledgeable and stocked with accurate replicas of all the period items and tools. It is quite nice to see and worth the trip.

The trip itself takes you through a national park, several long stretches of wilderness and many small fishing villages. It is all nice to see and if I had significantly more money and time I think it would be educational to spend a couple of weeks in such a town.

With this tomorrow we need to get up early so we can attempt to travel to Labrador.

KM 40089: Deer Lake, Newfoundland

Today I was disappointed: we will not be going to France on this trip. We awoke early to travel to the ferry which goes to St. Pierre, an island of France. When we arrived we discovered that the schedule requires an overnight stay. Unfortunately, as this is a pedestrian ferry, that means we would need to stay in a hotel and eat three meals in restaurants. This is not in the budget.

In fact, as of yesterday we are on our reserve funds, with our primary travelling supply exhausted. The reserve funds are more than sufficient to get us home; it merely means that we cannot partake in any expensive activities. The only stop which this will likely affect is Churchill. Churchill is only accessible by train, which is expensive, and requires a hotel, which is also expensive. Going up to Churchill without going on a tour is more or less pointless, but tours are expensive too. We will look at the costs, but at this point it seems unlikely to happen.

Well, after leaving the ferry terminal we took the long way around the southern peninsula we were on to reach the Trans-Canada Highway. Thus began our day of driving. We covered a significant distance today and will cover much the same amount again tomorrow. Truly, most of our days from here until we reach home will consist mostly of driving. This is alright and I was expecting it, but it will be more or less boring. At least we are heading home.

KM 39304: Frenchcove, Newfoundland

As I mentioned in my last entry, we were staying at a bed and breakfast. Well, this morning the breakfast part came about. It wasn't nearly as bad as I thought it would be; there was one other older couple having breakfast with us. Breakfast was good, consisting of tea and coffee, French toast, biscuits, banana bread, fruit salad and various condiments. We had a nice conversation and ate breakfast. As we were heading out we started to chat with the hosts, but we had to cut that short to make our boat tour.

The boat tour was touted to show us puffins, Courteney's second favourite bird, and whales. Unfortunately there were no whales, but there were over a hundred thousand puffins and tens of thousands of other birds. It is quite a sight to see a small rocky island full of birds. The birds were nearly shoulder to shoulder along its entire length and height. Courteney enjoyed it and that is all that truly mattered.

After that we headed to a peninsula on the southern end of the island. We went there in order to see about visiting St. Pierre, a small French island off the coast of Newfoundland. It is quite a long drive, with a stretch in the middle of over two hundred kilometres without a gas station. I am thankful that I bought those gas cans to save us from having to use our propane.

We made it as far as Grand Bank, which is fifteen kilometres from the town where the ferry leaves. There we went to see the Provincial Seamen's Museum, which had just a single special exhibit on. This exhibit was a set of blown up lantern slides (a precursor to the slide projector) detailing the trip of the first explorer to reach the North Pole. The story they tell isn't terribly interesting, but the slides themselves are coloured. At the time only black and white photographs could be taken. These are basically black and white photographs which have been transferred to glass and then had colours painted on in certain parts. The effect is quite nice and I found it almost better than a colour photograph would have been.

We then had to leave Grand Bank to travel back to a provincial park to camp. The area we are in is rather empty of people. It is rather nice. It is also much as I expect Labrador to look. The trees are stunted and closely packed where they exist; where they don't, the land is full of low lying brush and grasses. There are small ponds and streams all over the place and in general it is quite rocky. This is perhaps the first time since we left the desert that I have seen a truly new type of terrain.

KM 38921: Witless Bay, Newfoundland

St. John's is like many older cities in that the roads are rarely straight and there are many odd intersections. We spent the majority of the day circling around St. John's trying to find things and on Signal Hill.

The first thing we tried to find was the legislative building for the province. We could find no clear listing of such a place and, after trying a couple of spots which seemed likely, we gave up and left it to be found later.

Instead we went up to Signal Hill, just outside of St. John's and overlooking the harbour and city, to see what was there. The first stop on this hill was the Parks Canada visitor centre. In here we saw a bunch of stuff about the history of the hill. The primary use of this hill was to watch over the harbour and bay and alert the city to incoming ships. It was a military post, as a watch for enemy ships was kept at all times.

On our way out of this centre we just caught a traditional military tattoo, that is a musical military exercise. This was cool to see. They had people dressed up and marching like the Newfoundland Regiment of Foot from the late eighteenth century. They also played out a mock battle with the muskets firing just powder. They even had two old mortars and one eight pound cannon. The cannon, when it was set off, produced the biggest smoke ring I had ever seen. It rolled away from the cannon for at least thirty seconds.

Then we went to the top of Signal Hill to the Cabot Tower. The Cabot Tower was finished in 1900 to replace a previous wooden tower that had burnt down and to retire the temporary tower that followed it. It was a nice place and had been used for many jobs in its time. Mostly a signal tower to keep watch on the water, it was also used as a fire watch, radiotelegraph office, soldier ready room and, most recently, gift shop and small museum. When we went in it was a fairly nice day, overcast but clear. Less than half an hour later when we left the tower the fog was thick. An hour or two later it was clear again. That is weather on the ocean for you.

Once we left there we glided down the hill a short way to the Johnson Geo Centre. This is your average geological activity museum except that it is built underground. Three of the four outer walls of the exhibit area are made of the bedrock. We went there specifically for an oil gallery which explains the processes of finding, drilling and processing oil. We saw that and a few other things which mostly focused on Newfoundland and Labrador, as these sorts of things tend to do.

Finally we headed off to find the legislative building again. We had found out that it was called the Confederation Building when we were at the Parks Canada Visitor Centre on Signal Hill, so it was easy to find. It turns out that it is a pair of huge buildings done as a sixties-style skyscraper. That is to say that they are large, square and plain. They are truly the least notable of the legislative buildings of the provinces I've seen (all of them excepting Alberta) in that they do not appear to be the seat of an old and prestigious institution. I consider the style quite unsuitable for such an old settlement as Newfoundland.

With that we said goodbye to St. John's and headed towards our next destination, Witless Bay, for a boat tour to see more Puffins. The tour itself leaves out of a town called Bay Bulls about five minutes away, but we were unable to find accommodations there. Instead we are in Witless Bay proper and will need to drive back in the morning. Camping around here is scarce and we couldn't find a place that fit our needs, and after our rough night last night minimal camping will not do. In the end this means that we have ended up in a bed and breakfast. As far as I know neither of us has ever been in a B&B before. I am left unsure of the etiquette; it's a lot like staying with a relative you haven't seen for many years. You are never quite sure of the rules and norms.

It is certainly an experience.

KM 38841: St. John's, Newfoundland

Another day of driving and we not only passed through the town of Dildo, but also made it to St. John's. The town of Dildo we visited merely because of its name. St. John's we are in because it's St. John's.

We arrived in town in the late afternoon because we slept in to recoup from our poor sleep the night before on the ferry. This left us little time to do anything. So we found a place to stay. At first I thought that we would be required to find a motel or the like for the night, but in the end we found a campsite close by. Courteney is having a rough time with the camping and I unfortunately raised and then crushed her hopes of sleeping between a roof and a mattress.

We ended up going out to dinner because it was raining slightly and Courteney was in a bad mood. We ate at some quiet pub on Water Street. After dinner we went back to the campsite instead of exploring the supposedly lively bar scene because Courteney was in a bad mood and this had put me in a bad mood. It had rained quite heavily while we were gone and Courteney feared the tent would be soaked, but when we arrived back the inside was quite dry.

KM 38465: Gander, Newfoundland

It wasn't the least comfortable sleep I've ever had, but the night on the ferry ranks high on the list. But no matter, we made it. And then we proceeded to drive our butts off. The ferry landed at nine in the morning, local time. As we didn't have anything to put away or anything of that sort and we had had breakfast on the ferry we just drove off the ferry and basically didn't stop. That is how we covered two thirds of the distance to St. John's from the ferry terminal in just one day of lazy driving.

The first thing that you notice when you see Newfoundland is that the coastline is quite rocky, but just out of the water it is green. The entire island is either rock or green. Driving through the country reminds me strongly of BC. It is hilly, green, covered in trees and you see exposed rock face from time to time. Now the hills are not as high and the trees not as tall, but it is still a strong resemblance.

Newfoundland is also the only place on the trip so far where I have noticed a strong, identifiable accent. Everywhere else the accents were not so strong as to make you step back and think carefully about what you just heard to make sense of it. In the end it isn't terrible, as there are just two critical things I've noticed. The first is that you need to pay attention at all times. Unlike what I consider normal, Newfies seem to head straight into the content without the usual introductions. This means that if you aren't paying explicit attention you will miss something important.

The second is speed. I've heard that the best way to make yourself appear smarter is to simply talk faster. If this is the case then Newfies must seem like the smartest people on the planet. They just motor. Maybe my brain is addled by too much West Coast leisure living, but they go. I don't think I could match the speed without some serious practise. But like all accents you get over it after being exposed a few times.

Shortly we will both reach St. John's, that magical city which has the honour of being the most easterly city in Canada that we will visit. This means that after we leave St. John's we will be heading west and home. Soon we will be coming closer instead of going farther.

KM 37817: Inter-Provincial Waters, Canada

As I haven't explained why the ferries to Newfoundland were so backed up, I will explain briefly. Firstly, one of the ferries had some sort of explosion or fire in its boiler room. This put it out of commission and threw off the schedule. The other ferries pushed through to make do, but were unable to catch up for some reason, even with every ferry full to the rafters. When we checked up on our sailing they were seven hours behind. What this meant is that our 6:30 PM sailing didn't load until midnight and didn't leave the docks until 1:30 AM.

Now the ferry we were on felt more like a small cruise ship than a ferry. Firstly, most of the space in the ferry was taken up with private cabins, all of which were booked. Secondly, this ferry had a restaurant, not a cafeteria. It also had a health club and a casino. This left almost no room for normal seats. This being an overnight sailing (when they leave after dark they slow down and take seven hours to arrive), everybody was cramming into the undersized lounge in order to find a spot to sleep. Courteney and I found a spot, but it certainly wasn't comfortable; the seating is a continuous couch that encircles the deck like a serpent, with no truly straight section longer than three feet.

So we had this sailing which we had expected to be at 6:30 in the evening. This meant that we had many hours to spend around town before we could line up at the ferry terminal. First we slept in two hours. Then we packed camp up leisurely. After that we went to Canadian Tire to complete my collection of five five-gallon jerry cans. When my truck is full of fuel I have a range of something like fifteen hundred kilometres. Half of that is in propane, which is impossible to find in some places, so I would like to use it only as a last resort as I may not be able to replenish it. After doing that and filling my truck to the utmost with fuel we went to a public library for five hours. Finally, when we could read no more, we wandered a mall for an hour before just sitting in the mall parking lot with the windows down and the radio on.

On the road killing time can be the most difficult thing, especially since there are no truly comfortable spots to sit and wait. The truck is too hot, some places require visiting a parking meter every few hours, malls are boring, there are no good movies out and the world has a depressing lack of publicly accessible couches. We managed, though just barely.

KM 37744: New Harris, Nova Scotia (Day 2)

Our second day of killing time went rather well considering the amount of time we had to consider what we would do. In the morning we had a fine breakfast (but only because we could leave the dishes to be done later) and then headed off on a boat tour. We went out to a pair of islands known as the Bird Islands. They are just two small rocky islands which are raised out of the ocean. There used to be a lighthouse, keeper and keeper's cattle on these two islands, but now there is only an automated lighthouse and birds. Thousands of birds.

We went specifically because Courteney wanted to see Puffins, which we did in spades. We also saw a great number of bald eagles, several breeds of seagulls, razorbills, a blue heron, a whack of grey seals, a few dolphins and some other birds whose names I cannot recall. It was quite an enjoyable three hours. And quite affordable as well, something like thirty dollars a ticket. The captain was knowledgeable and our three hour tour didn't turn into many years of comedic antics.

After returning we went back to our campsite for lunch before heading off to the east side of Sydney to see the Marconi National Historic Site. This is the site of the first attempt at a trans-Atlantic radio service. It is quick, but descriptive, and has a model of the site as it was during the attempts. The radio signals used longwave and required not only enormous amounts of power (there was a 75 kW generator on site for this purpose), but also enormous antennas. The model showed an inverted square pyramid nearly two hundred feet on a side and over two hundred feet tall, made of hanging copper wires. It would have been quite the sight to see in its day.

They also had an amateur radio operator there, but we didn't see what he was up to.

We then returned to our campsite to make dinner. Along the way I picked up two jerry cans and will be picking up at least two more in order to ensure that we have enough fuel to make it across the northern sections of the provinces. Life can be difficult with a gas tank that is half the size it should be.

KM 37583: New Harris, Nova Scotia

This morning we awoke in the basement of Sam Grant Senior's parents' place. They had kindly offered us a floor for the evening and showers in the morning. Sam Junior and Erika were heading home this morning and needed to be out early. We awoke in plenty of time to see them off and after a few minutes of packing and showering we were off as well. The plan was to arrive at the ferry terminal and wait for a ferry to Newfoundland. That was the plan.

What actually happened is that we arrived, waited in line, reached the front of the line and were then told that the ferry was entirely booked up for two and a half days. This was quite a surprise and needless to say we are not in Newfoundland this evening. Instead we bought a ticket for that later sailing and then had to figure out how to spend another two and a half days.

The first thing we did was continue on to finish the Cabot Trail. I had been told around the fire that we had missed the best portion by not completing the loop. On the way to doing this we stopped in at the Alexander Graham Bell National Historic Site. This was interesting to see as he did much more than just the telephone. Perhaps the most ahead of its time was his work on hydrofoils and hydrofoil boats.

We then continued and finished the trail. I believe I was misled, because the south-eastern quarter of the trail is not anything special at all. The best portion of the trail is really the section bookended by the gates of the national park there. We unfortunately were in too much of a rush to stop on any of the side roads, but there are plenty that promise to be excellent.

This took us into the afternoon so we planned to go on a bird watching tour tomorrow and found ourselves a campsite for the next couple nights while we wait for our number to come up at the ferry.

KM 37258: Point Edward, Nova Scotia

On Cape Breton in Nova Scotia there is a relatively famous route called the Cabot Trail. This trail goes up and around the north west side of the island. We spent the majority of the day driving to and along this route. It is quite the spot with many spectacular views. We drove about two thirds of the trail. We went up the west coast from the rest of Nova Scotia and got off to go to Sydney.

The reason we went to Sydney is that a number of the Grants are out on vacation in the area. These are friends of the family so we wanted to stop in and see them before we head to Newfoundland. The real reason for the rush is that two of them are heading back tomorrow morning. We did make it and spent a couple of hours sitting around a fire in their aunt's/sister's backyard with some of their other family. It was good and welcome to see some familiar faces from home.

On our way north on the Cabot Trail we also hit the start of a parade route. We arrived about two minutes after the parade started and were three cars from where all the floats were turning onto the route. Thus we got to see most of the parade from a short distance, even though we weren't on the route. After the final float had passed by, traffic continued with us in it, travelling at the pace of the parade. It was fun at the beginning, but the parade didn't travel smoothly and instead stopped and started. It took us about an hour to reach the end of the route and by the end of it I was tired of stop-and-go traffic in the hot sun.

KM 36698: Hildon, Nova Scotia

When you put your mind to it you can get a lot of sightseeing done in a single day. We started today with the Maritime Museum of the Atlantic. This is a museum which covers maritime history from sailing ships, steamships and more modern diesel ones down to shipwrecks. It isn't too big and isn't repetitive, so it was enjoyable. They have a nice collection of small sea boats, most of them small sailing vessels of the sort used by the common man in the past. There is also an entire steamship sitting at the wharf to be explored. We spent nearly four hours there and enjoyed it quite a bit.

After that we walked around a bit in downtown Halifax in order to find what amounts to the legislative building in this province. They call it Province House here and like the other legislative buildings for the maritime provinces it is rather small. I suppose this is to be expected.

Finally we went to a vegan friendly cafe elsewhere in Halifax for a late lunch. This was a pie stop. We both had a bowl of rice/noodles with ample vegetables and differing sauces. This was besides the nachos. If you ever want a truly good plate of nachos you need look no further than the nearest vegetarian restaurant. I'm not sure why, but they make the best nachos. Anyways, after stuffing ourselves on rather good vegan food we proceeded to the vegan pie.

Now some might not know the amount of animal-based food that goes into the average pie: eggs, milk, lard, cream and any number of other things. Of course vegans refuse to eat anything of the sort, so I was unsure of what a vegan pie would taste like. On offer today was the cocobanana pie. It had banana and coconut in it. It was rather good. The crust was what I was most interested in. It wasn't flaky at all, instead it was crunchy, and it wasn't as dry as I had expected. All in all it was alright.

We are making progress towards exiting Nova Scotia. All we have left on our list to do here is to see some of Cape Breton. I'm not sure why, but people keep on asking if we are going to go there or not. We'll find out soon enough.

KM 36568: Sackville, Nova Scotia

At this point Courteney and I are getting tired of travelling and sightseeing. The greatest evidence is that as we go along the time we spend at each site gets shorter and shorter. Also, some things which made the list we decide are not worth our time. An example of the latter happened today. We originally had a blacksmith shop on our list. However, when we arrived at the town where it was supposed to be we had some small amount of trouble finding it. Instead of putting a lot of effort into looking we both easily agreed that it wasn't worth it and that we could move on, which we did. Long term travelling is quite hard and is taking its toll on both of us.

In fact, in some ways I am beginning to consider myself a professional tourist. I have perfected the art of being quick at service counters, reading attraction maps and generally getting around without spending too much money or missing anything. I'm even getting angry at the lesser amateur tourists. This happened when we visited Peggy's Cove today. Peggy's Cove is home to a picturesque lighthouse and a small fishing village of about two hundred people. The main street is narrow, winding and lined with boulders the size of large cars. It is also an immensely popular tourist attraction. When we arrived there were easily five hundred tourists wandering around the town, blocking traffic and generally being a nuisance. I feel sorry for those who live in Peggy's Cove and could never live in a similar situation.

We saw the lighthouse and got some postcards and lobster flavoured potato chips. The latter aren't that bad, as long as you don't mind a bit of a fishy taste.

Then we moved on to Halifax and the first stop was the Alexander Keith's brewery tour. This theatrical tour is much as one would expect: introduction by a costumed lady, movie outlining the history, pretences of meeting Mr. Keith, beer tasting and entertainment while drinking said beer. Somewhere in there we even learnt how beer is made. It was fun and the hour went quickly.

KM 36249: Shelburne, Nova Scotia

I have lost the patience to waste a day away. Travelling and constantly having something to do or something to see or somewhere to go has done this to me. This morning we went first thing to find a mechanic to do my brakes to avoid the disappointment of yesterday. We ended up finding a slot at the Canadian Tire in town, but not until after three in the afternoon. This was at quarter after nine in the morning. So we had most of a day to kill.

We had arrived in this town from our drive yesterday, after not getting our brakes fixed, because we needed to be here anyways to see the world's smallest drawbridge. Now most tourist attractions, no matter how small and inconsequential, tend to be clearly marked. The rest which are not clearly marked generally have at least one sign off the nearest major highway pointing the way. The drawbridge had no signs. All we knew was that it is in a small town called Sandford just outside of Yarmouth, the larger town which Arcadia is just outside of. Not knowing where it was, but Sandford being a small town, we thought we would just drive through Sandford and see it. Well, we drove through it, and the next town over, and one more town over. The highways around that spot are not terribly well connected. So we made another loop of the town and tried two side roads as we went through. On the third side road, after travelling down it quite a bit, we ended up at a wharf. As part of this wharf and associated breakwater there is a pedestrian bridge about twenty feet long that parts in the middle. It was a drawbridge. No signs, no plaque, nothing but the world's smallest drawbridge.

So we slowed down as we drove by to see it. This all told took about half an hour. Having no further plans for the day and plenty of time to spend we headed back into Yarmouth to find the local tourist information centre, as they tend to be good sources of information on where to kill a few hours that isn't a mall or movie theatre. This time, not so much. Not unless we wanted to go through some of the three or four small museums. What we did find was an old lighthouse that was open to the public with a small museum in it. This was a bit of a drive on narrow, winding roads through an old fishing village on a peninsula.

It was much as you would expect. There was a small cafe where we had lunch because it was raining fairly hard. We both had some fair tea, a bowl of fish chowder and some bread pudding. The pudding was delicious. Afterwards we went for a short walk around. Courteney's hair was blowing in the sea gale. It was also foggy all day so we could only see about a hundred feet into the ocean before it became greyed out.

After we finished there it was still only two. So we headed back into town to wait it out. We ended up going into a dollar store in the hopes of finding some cheap citronella candles and candle holders. We did find a few things. After that we went back to the truck where I listened to a couple of podcasts and Courteney worked more on her latest needlework project, a stuffed dragon.

Finally the appointed hour came and we could drop the truck off. We did so and then proceeded to wander the Canadian Tire for a while, commenting to each other on various things. We paid special attention to the citronella lamps as we were going to try them. After wandering for an hour we returned to find the truck finished. So we got the keys, bought the lamp and lamp oil and moved to the truck to make some distance before it was time to make camp.

We did make about a hundred kilometres. When we stopped, the first thing we did was set up the lamp and candles in an attempt to keep Courteney from getting bitten. I've often heard people claim that they don't work at all and I'd like to disagree. They do work, especially the lamp with a large flame, but not over anywhere near the distance one might hope. I found them effective up to three or four feet from the flame. That isn't quite backyard-protecting distance, but it is better than nothing. An important thing I noticed is that as long as I stayed within the protection I would be fine, but if I left and came back mosquitoes would follow me in. So the most effective use of them seems to require that there be a lamp or candle every few feet and that all areas of travel be covered.

Hopefully they help enough to make Courteney stop being miserable.

KM 36058: Arcadia, Nova Scotia

Today was a day of disappointments. First we drove over two hours through the scorching heat in my air-conditionless truck to see the world's heaviest lion in Aylesford. Unfortunately the lion died sometime in February when we were still making up our list. We instead walked around the zoo and saw a few other animals. It was at least nice to get outside in the sun, but the sun was so strong that without our hats I fear we both would have fallen to heat stroke.

After the zoo we drove, again through the hot sun, in search of somebody to put new brake pads on my truck. They are due for a change, but I am unable to do them myself because I don't have a few necessary things and don't really have room to haul them home. Alas, we found nobody who could do a while-you-wait job. So we need to travel one more day with squeaky brakes.

Then, as we travelled towards our next destination, not only did we pass through more bright sunshine, but on our way there we entered heavy fog. So not only did we spend most of the day sweating ourselves into puddles, we didn't even get to enjoy a nice warm evening because it got cold! Talk about disappointing. Especially since when the sun is blocked or has gone away, out come the mosquitoes.

Hopefully tomorrow goes better.

KM 35544: Pictou, Nova Scotia

Well, we have finished visiting PEI. All that we really had left was to see the legislative buildings. We did that, but only after Courteney bought a purse and a silly hat. We also saw the latter two thirds of a free musical show about the history and important elements of PEI. Only then did we take a look at the legislative building. It is rather small, but then so is the province. It also served as the place where the Fathers of Confederation set about creating the Dominion to protect the British colonies from American influence.

Other than that we took a ferry over to Nova Scotia. We started quite late and so didn't get out of our campsite until after eleven. We spent a couple of hours in downtown Charlottetown and so didn't have time to do anything after the ferry docked. Thus there is little for me to report.

KM 35456: Cornwall, Prince Edward Island

The great advantage of PEI being small is that you can get a lot done in a single day because you never spend that much time travelling between destinations. Take today as an example. First we went to Cavendish to look at some Anne of Green Gables stuff. Specifically we went to the farm which served as the model for Green Gables and has been restored to represent what it would have looked like at the time the book was set. It was a thing to see.

Then we went to Charlottetown to buy tickets to the showing this evening of Anne of Green Gables, The Musical. After that we headed to the edge of town to visit the Cows Factory. For those who don't know Cows is an ice cream shop which makes excellent ice cream; it has even been rated as the world's best ice cream by some magazines. Then we went to a small town a short distance out of Charlottetown to find a campsite and have dinner before going to see the show. That is a lot of stuff to do in a single day.

At Cavendish, in the Green Gables National Historic Site, we went around and saw some scenes which are famous from the book. The barn has been restored, the house set up much as it is in the book, the Haunted Woods are there, as is Lovers' Lane. Now I've never read the books, but this is what I am told. In the gift shop we found a couple things of note. The first is chocolate covered potato chips. They are delicious, though likely not healthy in the slightest. The second is Raspberry Cordial, a drink which is apparently a favourite of Anne's. We bought four bottles and have drunk one; it is not half bad.

The next stop of note was the Cows Factory. This is really a factory with a storefront. We took the factory tour and saw how they make their shirts (to some they are almost as well known for their pun-filled shirts as their ice cream), their ice cream and their cheese. I didn't even know that they made cheese, but I suppose that is because they don't sell it at their Whistler store. It was a nice tour and everybody got a free sample of ice cream at the end. Of course that isn't enough and we both got ourselves a cone before we left. I got a Don Cherry in a waffle cone coated with chocolate and sprinkles while Courteney had something with pineapple and mango in it.

Then we went and got a campsite for the night. Just as we were rolling in it started to rain a small amount and in the nearby bay there were a few lightning strikes. It promised to be a fun night, but passed by quickly and caused us no trouble.

Finally we went back into Charlottetown to watch the show. This musical has been running for the past forty-five years. That is a pretty long running show. It was rather funny at times and quite good. I also now know what all the short actors who don't make it in movies end up doing: playing children in musicals. The effect was quite good. We both quite enjoyed it. I don't think it'll ever stop playing, so if you find yourself in the area you might choose to watch it.

KM 35275: Mill River, Prince Edward Island

Today we drove to PEI. It used to be that you couldn't do this and instead needed to take a ferry. Well, several years ago a long bridge called the Confederation Bridge was built from New Brunswick to PEI. That is what we drove on. I had heard that it had eight foot walls on either side which prevented any sort of view. I am glad to say that it isn't true. It does have solid concrete walls, but they are only three or four feet high and out of a truck you can see over them easily. I saw the island of PEI from end to end on our way in.

Upon arriving you enter a town called Gateway Village. It is named for an obvious reason and is really only a tourist place.

Our first major stop, excluding a gas station and a grocery store, was the town of O'Leary on the western side of the island. We went to this small town to visit the Prince Edward Island Potato Museum. When we arrived we found a Potato Blossom Festival in progress and had missed the parade by a handful of minutes. This meant that we got to see a number of the parade floats and the like dispersing. There were a number of old tractors, a poor person in a potato costume roasting under the strong sun and a bunch of other things.

After a while we eventually got past the traffic caused by the ending of the parade and reached the museum at about one in the afternoon. There are some museums you are surprised exist and find a bit odd. This was one of them when I added it to the list of places to see oh so many months ago. On going through it, though, it is entirely reasonable. Not only is the potato the single largest crop on this island, it is the fourth largest crop in the world and is highly nutritious. The potato is native to South America, but Eastern Europeans eat the most of them. Generally the potato only became a popular crop when famine hit. It turns out that potatoes can grow just about anywhere.

I also found it surprising that so many ailments afflict the lowly potato. From the Colorado Potato Bug which has been spread by people across the world to the Late Blight that devastated the Irish. After going through the museum it seems surprising that any potatoes make it to our table in the end.

Unfortunately the potato museum was the only stop in PEI that doesn't involve Anne of Green Gables in some way. PEI truly only has two main exports: potatoes and Anne.

KM 34938: Miramichi, New Brunswick (Day 3)

There are many aquariums in North America and we have visited three of them. Today we visited the third. This was the Aquarium and Marine Centre of New Brunswick in Shippagan. Most aquariums only really have tropical and otherwise exotic species. I believe this is because they are nicer to look at and harder to come by. This aquarium is different. It is filled with species from the ocean surrounding, and the rivers contained in, New Brunswick. It is nice, for a change, to see fish that you can actually find in Canada.

Of note I saw my first whole Atlantic Cod, a live lobster that must have weighed twenty-five pounds, an albino lobster and a blue lobster. This is among other local species like Lake Sturgeon and Haddock. It was unfortunately raining lightly with a strong wind so we didn't spend much time watching the seals, even though Courteney likes them.

After returning back and watching TV channels go off and on the air (likely because they are sent out by satellite from Toronto and Toronto was having severe thunderstorms) we had a nice dinner. Courteney's Grandfather had us sampling one of his strong and young blueberry wines. It'll be good in a few more months, but was a bit rough yet. Finally, after all that, one of his close friends came over to see us for an hour. I can't quite remember her name, but I think it was something similar to Gracie.

This was our last day in New Brunswick and tomorrow we are destined for P.E.I., the land of potatoes, sandstone and Anne of Green Gables.

KM 34716: Miramichi, New Brunswick (Day 2)

Since we have entered Ontario we haven't really had much sun. It seems that this part of the country hasn't had much in the way of a summer. This morning was alright so we went on a short tour of the town. It is a nice small town.

We also went to a small island called Midling Island. This island was home to a quarantine centre for Irish immigrants who reached this side of the Atlantic sick. We had a nice lunch there and then made our way home.

Once there we let lunch settle and then I helped cut the front lawn. This is the first time that my offers to do work of some sort have been accepted. Oh well.

Right now we are just relaxing, watching the news and awaiting the rain that is supposed to arrive at around dinner time.

KM 34716: Miramichi, New Brunswick

After a loud night of rain and wind amplified by the tarp covering our tent, we awoke to a relatively sunny day this morning. We made a breakfast as best we could with our limited supplies. We have been staying at peoples' houses for so long that we have run out of certain foods, such as pancake batter, and others have gone bad, such as our milk. This means we don't have a whole lot to work with. We made do though and had some bachelor's egg and toast.

There were two tourist stops on the agenda today. The first was Magnetic Hill. This is a hill where you roll uphill. It is quick, but actually quite a neat effect. When I return I will post the video I took.

The second stop was the tidal bore in Moncton. A tidal bore is a tidal wave which moves inland up a river and raises the water level. The wave we saw was only perhaps six inches high, but it did move upriver with good speed and did raise both the rate of flow and the water level. As the tides change so does the height of the wave. Construction on the river has decreased the height of the wave since the mid sixties, but there are long-term plans to fix this and bring back the multi-foot tidal bore.

Both of these attractions didn't take long to see so we made our way to Miramichi, where Courteney's Grandfather resides. We made it here quite early and he was not yet back from fishing so we went to the local library to read some of the magazines. Libraries are an often forgotten way of passing time while travelling. Only rarely is a library card required to read inside the library and they tend to have a number of periodicals that are up to date. We spent two and a half hours there and it was quite good.

After that we went back to Courteney's Grandfather's place and he was there. We proceeded to chat for a while, eventually had dinner and then chatted again. We really have no plans but to spend a bit of time here so seeing what we do tomorrow will be interesting.

KM 34510: Parlee Beach, New Brunswick

We are back on the road again for another day or two on our way to Courteney's Grandfather's. Which is just fine because the weather was poor this morning. We woke to a light rain and a good amount of fog. After packing the truck up and saying our goodbyes we headed east. Our first destination of the day was Hopewell Cape, where the Hopewell Rocks are to be found.

These rocks are more or less pillars which have been eroded out of the cliff by the tides. The tidal range at that point is forty-seven feet or thereabouts. The difference in height is so great that even with the tide going out for only half an hour it went down perhaps four feet. At the Hopewell Rocks are some which are called the Flower Pot Rocks. These are true pillars which widen at the top, with trees and grass growing on top.

It is even possible, when the tide is low, to walk beneath these pillars in the mud. Unfortunately the timing of the tides didn't work to our advantage at all. We arrived shortly after high tide at about one in the afternoon. We spent perhaps an hour and a half walking the paths and gazing at the Bay of Fundy, but when we left it was still nearly two hours until the tides were low enough to walk.

Instead of waiting we continued on to Moncton. Though we have two things to see in and around this city we saw neither of them. Instead we got pelted by heavy rain and got an oil change. Nothing spectacular, but not everything about travelling can be exciting.

When we arrived at the campsite for the night it was raining cats and dogs while we sought a dry spot to pitch the tent. Luckily the rain broke for a couple of hours shortly thereafter so we were able to set the tent up and eat dinner while staying relatively dry. Of course afterwards it started to rain quite heavily again before ceasing for a time. We brought two tarps with us to deal with heavy rain. One is so large that we are able to set the tent out upon one half of it and then fold the other half of the tarp over the tent completely. We are kept quite dry when we do this no matter how much it rains, as long as we roll the bottom bit up to prevent a puddle from forming underneath us.

KM 34190: Saint George, New Brunswick (Day 4)

Oh how the time flies. Today was spent visiting a friend of Courteney's mother, Ruthie, on the other side of Saint John. We chatted there from about eleven until one thirty before heading back here. It was nice to meet her and Courteney seemed to enjoy talking to her. After we returned Courteney proceeded to bake another pie. This time it was strawberry-rhubarb.

While she was doing that I was fiddling to fix Maynard's Internet connection and burning a couple of DVDs of pictures to mail home as a backup. Network access has been much rarer at campsites here in Canada than in the USA. This isn't a problem at all except that it means that I have been unable to copy my pictures back home for safekeeping. So I will make use of the postal system instead.

After that was done, and with some help from Courteney while I was burning stuff, we got Maynard's Internet to not only work, but work wirelessly. This has been causing him undue trouble for the past two weeks.

In the evening there was another family dinner, this time a stir fry and swish kabobs. I truly think I've misspelled that. This dinner was good and everybody had a good time.

This ends our stay in St. George and the Saint John region. Tomorrow we are up early to make our way northeast to Moncton.

KM 34000: Saint George, New Brunswick (Day 3)

I made my entry too soon last night. After I made my entry we ended up going to the house of one of Gloria's children. He is of course an adult. They were having dinner so we chatted for a couple of hours. During this it was noted that there were fireworks that night. So we went back to Maynard's house. Shortly thereafter the fireworks were to start so we were whisked off there.

It seemed like the whole town showed up and the fireworks were quite good. Part of it was that we were quite close and could feel every explosion. But even the number, variety and combinations of fireworks was quite good. They lasted perhaps twenty minutes or half an hour before we made to leave. Of course as with anything like this there was a short traffic jam.

Today we did three things. First we went back into Saint John to see the Carleton Martello Tower. This is a tower that was first built for the War of 1812, but was then used to fend off some Fenians, to hold misbehaving soldiers during WWI before they shipped out and then as a fire control centre during WWII for the defence of the harbour. It is a nice tower.

After that we went to a salt water beach called New River. It was nice and sunny and we spent a couple of hours there. The water is from the Bay of Fundy which is connected to the Atlantic Ocean. The water was very cold, but it is the North Atlantic. The day was good and we flew a kite for a short while on the sea breeze.

I could have stayed the rest of the day, but we needed to return because Courteney was to make a pie for dinner that night. She did and made a nice blueberry pie. Dinner was back at Gloria's son's. Dinner was burgers and salad and corn on the cob. We also sampled a number of homemade fruit wines which were quite good. All in attendance raved at Courteney's pie. A number of them complained because they had a Weight Watchers weigh-in tomorrow.

And that was our day. It was quite nice. Before I go tonight I am going to express my thoughts on whether we are going to get to travel much of the North or not. The first thing to know is that the North is cold and consequently snow and ice come early in the year. This really means that if we haven't entered the North by the beginning of the third week of August we likely shouldn't, because we are likely to hit cold weather. We just aren't equipped for cold weather. Courteney has clothes only good down to about zero and I only have things good enough for about ten below with me. With the current time of year, how long it took us to get here, how much farther east we have to go and taking into account just how immense the North is, I currently do not think that we'll be able to make it during this trip. I'm not happy about this, but I do not have the money to outfit ourselves. I still hold hope, but I am getting ready to accept it.

KM 33858: Saint George, New Brunswick (Day 2)

Today was another day seeing Saint John. The first thing we saw was the Reversing Falls at high tide. The Reversing Falls is a section of the Saint John River which runs backwards when the tide is high or coming in because the tides are so high. After we saw that we went to go through the New Brunswick Museum. This is something Courteney wanted to see. It had a variety of things in it including art by Canadian artists, a hall of whales, a portion on the history of industry in New Brunswick and a few other things. The history of industry was particularly interesting because I haven't seen the changes of industry through time laid out before.

We wandered around the museum for not quite four hours and saw most of it. After exploring the museum and getting Courteney away from a stuffed puffin we made our way back to the Reversing Falls. It was by now low tide and not only was much more beach and rock above water, but the falls were travelling in the opposite direction. This caused a number of large whirlpools which we watched.

After spending a few minutes looking at the falls, which are more just reversing rapids at this point, we headed back to St. George for dinner.

Now normally I don't mention what we have for lunch unless we ate at some particular restaurant. Today I am making an exception because we had something special. For lunch we had lobster sandwiches. It is one of those odd lunches. Though I am told it used to be only the poorest who ate lobster, I am sure it is now only those who are well off. It was good and surely something different.

KM 33707: Saint George, New Brunswick

For those who don't know, St. Stephen is where Ganong, the chocolate company, makes their home. In this town there is not only a Chocolate Festival (in August, so we don't get to see it) but also a chocolate museum. It is for this museum that we came to this town. We did visit the chocolate museum. It isn't an enormous museum, but does have good information on the history of chocolate, how chocolates were and are made, as well as a bunch of stuff about the Ganong company. There are also free samples placed throughout. It was a pleasant way to spend the morning.

After the museum we visited the original chocolatier shop. It is still selling chocolate and we picked up a few things. We had half the box of chocolates after dinner.

With that we left St. Stephen and headed east. It was still early so we headed to Saint John. This is the city where Courteney was born and lived for several years. Consequently we have a number of things to see. On our first afternoon there we saw the City Market and King's Square. King's Square is a nice little park with a few monuments and the like. The City Market is a covered open air market similar to the one on Granville Island. The ceiling is made to look like the hull of a ship upside down. We walked through that and thought we saw lobster for three dollars a pound. Alas, upon further inspection it was crab which was three dollars a pound and not lobster.

After seeing those couple of sights we headed to St. George to stay with a couple of Courteney's relatives. These two, who go by the names Gloria and Maynard, live in St. George which is about half an hour out of Saint John. We got there at about three thirty. We chatted for a while and then went out to get some lobster from a local fish market. It ended up being not quite eight dollars a pound. Then it was back to their house for a lobster feast.

I've never had a full lobster before; in the West it is just too expensive. So I needed to be shown how to eat one. Conveniently I had two old Maritimers to show me. I quite enjoyed it, but it is by no means a tidy meal.

KM 33405: Saint Stephen, New Brunswick

I must reiterate, New Brunswick is small. Instead of only being able to make a single stop in a day we made nearly two. First we drove a short distance to Fredericton. In Fredericton we had three things to see. The first was the Beaverbrook Art Gallery. This is a medium sized gallery founded in the 1950s. In it we saw a number of things. The first two were some temporary exhibits of native artists. In one of them there was the oldest birch bark canoe in the world, built in 1825. It had spent 180 years in Ireland and was only recently put on temporary exhibit in Canada and most recently has been repatriated.

Elsewhere in the gallery there are a number of pieces from painting masters and a number of items from the medieval and Renaissance periods. This includes one tapestry which is perhaps twenty feet square and is one of three surviving of the original set of twelve. They used to decorate the dining room at the French Palace, but this one was found in an old chest. It depicts a hunting scene from the time of the Holy Roman Empire perhaps five hundred years ago. There are also a number of pieces of furniture that were interesting to see and were well built.

After going through that museum we walked to the truck for lunch; we had parked it in a public parking lot near the city hall. After a fine lunch of moldy cheese and a peanut butter and jam sandwich without the bread we proceeded to walk to the Garrison district. The Garrison district is the historic district. We walked around there a bit, but it wasn't as interesting as I had thought it would be. It may have been our prior experience with historic settings and the fact that, it being a Friday, not everything was running.

Instead of spending much time there we took a walk along the river. We picked up two ice cream cones and walked for a ways before returning. It was nice, though the day was rather grey. At least it didn't rain. Unfortunately, because there wasn't nearly enough sun we couldn't make use of the sundial on the side of one of the old buildings.

Leaving Fredericton we went to St. Stephen. This is just a small town, but has a few stops which I'll describe tomorrow after I see them. Instead I'll describe the things we did in St. Stephen, since we arrived too late to do anything but settle in.

Firstly we took a brief tour of the town looking for propane. Auto-propane is sometimes hard to find and even though New Brunswick is small I do need to fill up eventually. We did find it, but at $1.11 per litre. We are running on gasoline and I haven't decided if I am going to fill up at such a costly price. We had also planned on camping tonight. Alas, there appears to be only one campsite near this town and it is closed. I am not sure what could cause a campsite to close; the only real costs are the land and labour.

With our lodging plans for the evening being changed we needed to find a motel. Conveniently this town is small enough to have the single visitor centre easily found. We stopped there and had a nice chat with one of the ladies there about the available motels and other attractions in the area. We eventually decided on a motel somewhat outside of town. When we arrived the price was fine as motels go, but no motel will ever beat camping. This motel was first built in the fifties and is the nicest non-chain motel we have stayed in yet.

Now motels mean we cannot cook our own food. Instead dinner was to be found at a nearby diner. Courteney chose a lobster roll, basically a sandwich. I went for a seventeen dollar seafood platter. I like seafood, and when in the Maritimes seafood is everywhere. Well, that platter almost killed me; it was all that I could do to finish it. I won't make that mistake again. Seafood is cheap here. Tomorrow I'll see about finding my way to a fishing dock to pick up some lobster for dinner. Apparently they can be had there for about five dollars a pound. That should be fun. But since my dinner was so large I could only watch while Courteney had a nice chocolate cream pie.

KM 33000: Kingsclear, New Brunswick

Today we left the first of our language ordeals and entered New Brunswick. We made it through our short trip of French surrounded by people who can manage broken English when necessary. On our way back we will spend a much greater amount of time in Quebec and northern, small town Quebec at that.

Also, New Brunswick is quite small. We drove from the north-western corner to nearly the centre in a day and still managed to see the world's longest covered bridge in Hartland and the world's largest axe. The covered bridge crosses the Saint John River and heads into a small town. It is well over a thousand feet long and takes a noticeable time to drive across. It takes perhaps five or ten minutes to walk across.

As a single lane bridge it isn't necessarily the most practical, but since it has been there for more than a hundred years I don't see this town getting another bridge or getting rid of this one. In the visitor centre nearby we bought postcards and Courteney found a large plush lobster and found it so irresistible that she bought it. I'm not sure what she is going to do with it, but she has it.

After the bridge we drove a ways and arrived at a town with an axe so large that it is hard to describe. It was a double-headed axe, nearly to scale, with an axe head perhaps fifteen feet wide and two feet thick.

We will be in this province for about a week, even though it is so small, because Courteney has family to see and a number of sites she wishes to show me. It may be a nice change not to need to drive so much every day.

KM 32912: Riviere-du-Loup, Quebec

Quebec City is old. This would seem obvious from the fact that it recently celebrated its four hundredth year. However other cities through which we have travelled which are well over two hundred years old don't feel nearly as old. I believe that it is the French pattern of three story buildings made of stone which makes it feel as old as it does. Firstly the buildings are large enough and well constructed enough that tearing them down to put up a new building isn't worth it. Being well constructed also reduces the likelihood of them falling down. Then there is the stone. You just don't see stone construction often in North America. Lots of brick, it is true, but brick shows its age much more quickly.

In any case we visited Quebec City today. Specifically we stuck to Old Quebec, which is the nicer spot anyways. It is full of, as mentioned above, three story stone buildings. It is nice. I had been there once before during a Coop term when I was flown out. I didn't see much then though. We had a few things to see while we were in the city and as we arrived near noon lunch was the first on our agenda.

In Quebec City there is a restaurant called Restaurant aux Anciens Canadiens. This is a pie stop. So we stopped in and both had the daily special of a bison meat pie. It was quite good. Now the last time I had been in Quebec City I had a beer which was the darkest stout I have ever seen. It is called Boréale Noire and it is black. For those who know, it is even darker than the Black Plague. Being back in town I treated myself to a bottle over lunch. After the meal it was on to the reason for eating there, the Maple Syrup Pie. It was quite tasty, and not nearly as sweet as I had feared.

After lunch we went on a walk around the Old City. Our first stop was the wall. Quebec City is not only the oldest but, if I am remembering correctly, the only remaining walled city in North America. It is a stone wall perhaps fifteen or twenty feet tall.

After this we went to look at the Citadel. We saw a bit of it, but not all of it because it is still an active military base and home to the French-Canadian regiment. We could have taken a guided tour, but Courteney didn't feel up to it. That is certainly a place to be posted. The Citadel is on one end of the Plains of Abraham, where the historic battle which caused New France to become a colony of the British occurred.

Finally we walked just outside the walls of the city to reach the parliament buildings of Quebec. They take a bit of liberty in using national terms, but I suppose that is what many consider themselves. It is of an entirely different style than the legislative buildings which I have seen to this point and is really just an enormous stone rectangle with a large stairway to the front doors. It is nice, but rather uniformly grey.

That pretty much covers the sites we wanted to see in Quebec. We will likely travel back through Quebec on our way to the northern portions of the provinces, but have no listed stops. We are currently near the New Brunswick border and will begin exploring that province tomorrow.

KM 32506: Ste. Madeleine, Quebec

I have often wished that I could speak French fluently. Alas, due to a lack of practice I cannot, although I took French for the majority of my elementary and secondary schooling. I believe that I got rather reasonable at conjugation and forming sentences. However, I was always slow and my vocabulary is as small as you would expect. Even though I know a few people who speak French I feel too bad to make them endure conversations with me speaking at a two year old level. But being here in Quebec and having to get by is bringing back small snippets of highschool French.

So we are both making our way with what little French we remember from school, the cereal box French we know by heart and what our little phrase book helps us out with. When we arrived here at the campsite we met a nice man across the fence who decided to talk to us. We know but a little French and he knows but a little English, but we made do and managed to have a slow and stilted conversation.

Earlier today we were in Montreal. We didn't have any particular stop so we walked around the downtown for a couple of hours. We had lunch and took a look at how expensive Just For Laughs tickets were. The lunch was good; the tickets were about eighty or ninety dollars a person per show. This meant that we couldn't see a show, but perhaps next year. The downtown was quite nice and I think that if I knew French I could live there with relative ease. This means that there are two places I could go which are big for my industry: Vancouver and Montreal. The third, Toronto, didn't appeal to me at all.

KM 32264: Ottawa, Ontario (Day 6)

The Canadian Museum of Civilization is deceptively large. Enormous I would call it. We spent the entire day there and only finished seeing perhaps two thirds of what is there on display. We had fun and I got to enter another new province, Quebec.

This museum has a number of neat things and displays including the Canadian Postal Museum and a linear history of booming industry and life in the various Canadian Provinces and Territories. The history itself moved east to west and yet was mostly linear in time. It is a bit funny to see the development of Canada happen moving to the west in a progression over time. Especially since most of the eastern provinces had plateaued before a more westerly province really took off.

There is also an exhibit of Egyptian artifacts which Courteney quite enjoyed. These are the things on permanent display that we saw. We didn't have time for the permanent displays on the Natives of Canada or the historic people of Canada or the Children's Museum. There were also a few temporary exhibits which we saw. The first was an exhibit on mythical monsters. These included dragons, unicorns, sasquatch and the like. It was pretty nice and covered a number of creatures I had never known of from countries and continents such as Mexico and Australia.

The other temporary exhibit we saw was the Royal Stamp Collection. It was interesting to see and showed mostly the very old stamps from when postage stamps were the new thing. It is a bit odd to think that postage stamps haven't existed forever, that postage hasn't always been cheap and that it used to be the receiver who paid.

We finished looking around at about five in the evening and were both exhausted. Stacy came to my Grandparents' after work and said goodbye, which was nice. It is impossible for us to expect others to rearrange their lives when we arrive, because we can give no notice, but she put the effort in anyways. They all did, really, in that most of them showed up for dinner on Saturday.

This ends our stay in Ottawa. We have seen all the family in this part of the country and have seen all the sites we desire to see. Tomorrow morning we are heading back into Quebec to the city of Montreal. It should be an interesting thing to see, I just hope that the French doesn't trip me up too much. It should be alright in the parts we are going this time, but on the way back we are planning on heading through the more northern parts of Quebec where English is a foreign language.

KM 32240: Ottawa, Ontario (Day 5)

Cheese curds and sunshine were the big things that happened today. We woke up and moved ourselves to the Dugays' again. Once there my Uncle Jim was kind enough to drive us to St. Albert's. St. Albert's is a place with an awesome cheese factory which packages and sells fresh cheese curds. These are also known as squeaky cheese because they squeak when they are fresh. On the way back we picked up a sandwich each and returned to their house to eat and have a couple of beers in the sun. It was quite nice.

We had come prepared to take a swim in their pool, but it was too cold because they've had rather cool weather the past few weeks. They even fed us dinner, which was also nice as we could talk more than we did the previous night when we bounced around a bit trying to see everybody.

KM 32179: Ottawa, Ontario (Day 4)

It is a nice change to be woken by the sound of thunder and being able to roll over and go back to sleep. In a tent we don't have that luxury. Today was a wet one. The weather rotated between overcast, raining lightly and pouring cats and dogs. It did this all day, even after we had left the cottage. This unfortunately means that we were mostly stuck inside the cottage until we left in the early afternoon watching satellite TV. Such is life however.

After returning to the city there was a family dinner at the Dugays' (the family of one of my mother's sisters). It was good as it meant that I got to meet several family members who I would have had to seek out otherwise. As you might imagine, with people's work schedules that would be difficult. It was nice to see everybody. I've now met all the new children of the family and I believe that I am the first in my immediate family to do so.

I even got Grandpa to drive out to his daughter's place. This is a trip which people apparently find it difficult to get him to make. Now this also meant that I couldn't stay quite as long as I would have liked, but that is alright. We are going back there tomorrow in order to get some cheese curds, which are only good when they are fresh and squeaky.

KM 32179: Griffith, Ontario

The Canada Science and Technology Museum is similar to Science World in Vancouver. Both of us enjoy playing with science toys so we spent the morning and half the afternoon visiting it and having fun. I had been there once before, perhaps ten or fifteen years previous, when I came out to visit my grandfather. We had plenty of fun watching the electricity demo, though unfortunately Courteney didn't get up to try the Van de Graaff generator. One day I'll get a picture of her using one.

Of course most of the exhibits have been changed from when I was there. This time around they had an exhibit about Canadian inventions (bug repellent!) and one on a photographer by the name of Karsh. Both were rather interesting. I especially liked the radio and telegraph exhibit.

After leaving the museum we went back to my grandparents' place and my Aunt Katie and her husband Ken were there. It was good to see them and we talked for about an hour before they had to leave.

After this Stacy showed up to take us to the Cooper Cottage where Amanda and Cory are staying. There we had a few beers and chatted by the fire. I wish we could have had more fires during our trip, but it just hasn't been practical. So that is what we did.

KM 32165: Ottawa, Ontario (Day 2)

As I was saying before, my grandparents live just minutes from downtown Ottawa, so we spent today in centre town, as they call it. First we went to the Parliament buildings. When we arrived we caught the tail end of the changing of the guard. We saw them march down the lawn and then off on the street. There were lots of people crowding around, as people tend to do. Well, a police officer came running to clear the troop's path of onlookers. After this had been done the troop started to march out towards the road and the line of onlookers. Well, the lead man was marching toward a particular group of onlookers. And marched towards them. And marched further towards them with no indication that they were going to stop. When the leader was about fifteen feet away from the onlookers they started to get nervous. At ten feet some started to back up. At five most were starting to freak out as they would assuredly be run over. At three feet the leader turned, stamped his foot and continued on his way. I am certain that man enjoyed doing that.

After the huge crowd of people dissipated we went to get ourselves free tour tickets. After getting them we had an hour before the tour started so we went ahead and took the self guided tour around the buildings. We saw all the various monuments and the like that are placed around the buildings. The most beautiful building there is the Parliamentary Library. It is also the only original building which remained after the fire in 1916. You'll need to wait for my pictures or find pictures yourself, but it is quite the library.

Eventually we finished the walk and had a bit more time to spend before our tour started so we sat in a shady spot and listened to the musical bells in the tower. It is actually an instrument operated by directly connected pedals. It is quite the thing and is as capable as a piano, except that some bells weigh nearly three thousand pounds.

So our tour started and went much as you would expect. We saw the House of Commons, the Senate chamber, a couple of other small rooms and the inside of the Parliamentary Library. The library is all done up inside with intricately carved wood where the rest of the building is stone.

Upon finishing the tour we hit the boutique, as they call the gift shop. We bought a couple of postcards and two maple syrup lollipops. These have got to be the sweetest lollipops I have ever tasted. Even Courteney with her sweet tooth can only have a little before putting it away. They are likely to last at least a week, if not two.

Leaving the Parliamentary buildings we went to wander a bit downtown. We ended up in a pedestrian mall and checked out a few shops. On one end of this mall was the War Memorial and the grave of the Unknown Soldier, which we saw. On the opposite end was the Currency Museum. This was on our list so we took a peek.

Inside they had the history of money with artifacts related to the times laid out. It was interesting to see, especially the progression in the quality of coins from rough lumps to roughly stamped coins to the well shaped coins of today. Also interesting was the early history of money in Canada. It started as beaver pelts, with the Hudson's Bay Company having all their prices in beaver pelt equivalents. At one time Quebec (as a colony of France) used playing cards because there wasn't enough coin to go around. It is somewhat amazing that money made of playing cards could have worked, but it did.

This museum also has a large selection of bills and coins from all modern times and places. This means that I saw a thousand dollar Canadian bill. There was even for a time a fifty thousand dollar Canadian bill that was used only by banks. It is a bit of a shame that electronic funds transfers like debit and credit cards have done away with the thousand dollar bill, but I guess life must move on.

Thus ended our first full day in Ottawa. We still have many more things to see and we'll be here a while more. My grandfather picked us up and took us back. We had a nice dinner of salad and spaghetti and then whiled the rest of the evening away chatting and reading. It sure is nice to not have to drive far to get places and not have to set up camp when we are done for the day. We will be sure to enjoy this while we can.

KM 32165: Ottawa, Ontario

Well, we really did nothing today except drive from Port Perry to Ottawa taking the scenic route. It is some relatively nice farm country, but not much else can be said of it.

We have arrived at my Grandparents' house and will be staying here for a few days. They have lived here for decades and so the house is conveniently located minutes from downtown and most of the things we have to see in this city. It will be good.

The most noticeable change from the last time I was here is that the enormous maple tree in the backyard has been replaced by a small maple tree. Apparently the old one had begun to split and was threatening to fall down on the house in a stiff wind.

KM 31793: Port Perry, Ontario (Day 5)

Another day and another two attractions. The first was the Canadian War Heritage Museum in Brantford. This is a smaller museum, but it had a nice collection and laid out the progressions of the wars quite nicely. Again it was staffed by a veteran, which I still find a little intimidating. It was good though. They apparently also have a couple of old military vehicles that are still in running condition, but they hadn't been moved for the season.

Those heavy vehicles all have relatively weak engines. The largest vehicle there had a mere eighty horsepower and the most powerful engine was less than a hundred and fifty horsepower. You'd be hard pressed to find a new pickup truck with less horsepower than any two of the vehicles there combined. It sure makes you wonder why newer vehicles need so much power.

After leaving the museum we headed to the Reptile Zoo which we missed yesterday. It was fairly nice and had a reasonable collection of snakes, lizards and a few other things of interest. The most interesting was the albino alligator which was on loan from another zoo. There was also a pair of Nile Crocodiles, which are huge. The largest one was at least twelve feet long. Not something I'd like to meet in a river. Finally there was a group of half a dozen small alligators, one of which was quite active and would follow us from one end of their tank to the other.

Unfortunately we were unable to get in contact with a number of the friends and family who live in the area. We have finished seeing the sites on our list for the area and tomorrow we move on to the Ottawa region. We would like to stay and see the people we missed, but though we have no fixed schedule we cannot afford to stagnate.

KM 31405: Port Perry, Ontario (Day 4)

The resting place of the Stanley Cup is the Hockey Hall of Fame in Toronto. Today we went there. It is much as you'd expect, with artifacts from all the ages of hockey and its notable moments. I found the moments in history more interesting than most of the artifacts. There are also some video games, some hockey games (such as playing goalie against a video opponent) and two vaults of trophies. In one they display replicas of current minor hockey trophies and actual retired trophies. In the other they have not only the original Stanley Cup, but also most of the modern professional cups, either as replicas or the real thing. The replicas are all quite good. When we were there the real Stanley Cup was in Pittsburgh. Had it been in house we could have touched it and had our picture taken with it.

The Hall of Fame also has a gallery on international hockey, which I found interesting. Finally the Hall of Fame has an extensive gift shop of branded merchandise for most of the teams. If you ever want to get a pencil or alarm clock or scoreboard lamp for the fan in your life check this place out.

After we had gone through the Hall of Fame we headed to Niagara Falls. The Canadian side is absolutely better. Not only are the falls better, but the views are better. Of course we went on the Maid of the Mist boat and not only did we get wet, but we also enjoyed it. It is well worth the money.

We didn't spend long at Niagara, but enjoyed it nonetheless. After Niagara we tried to visit a Reptile Zoo, but we didn't have the address and our GPS navigation box failed us. Tomorrow.

KM 30945: Port Perry, Ontario (Day 3)

Toronto is a very large city. Even if I hadn't known this before arriving there is no mistaking it from the streets. There is just something in the way big city streets are laid out, paved and how parking works that is different from any other type of city.

Well today we went to Toronto. The original plan was to go up the CN Tower, then to the Hockey Hall of Fame before going to Medieval Times for dinner. Unfortunately the line at the CN Tower to go up to the top observation point was well over an hour long and by the time we finished there it was nearly two in the afternoon. Medieval Times opened its doors at two thirty for getting tickets and seats and examining some of the things they have there. So we walked over to check out the hours of the Hall of Fame, but could not go in because we needed to find our way to our dinner-theatre.

There are a couple of things other than just the observation decks at the CN Tower. First there is the short movie describing its construction, design and a few anecdotes from people involved in the project. I didn't know it, but the practical purpose of the tower is to serve as a broadcast and telecommunications tower. Build an antenna tall enough and you can transmit over the skyscrapers. The movie is worth watching.

Then there is a simulation roller coaster themed as a futuristic tree factory and mill. Unless simulator rides are your passion I wouldn't recommend spending time on this. However, the ticket which gets you everything also allows you to skip a bunch of the line to the elevator up and so is likely worth the few extra dollars.

So we'd gone up to the main observation deck, looked around and then waited in line forever to go up to the tallest man-made observation deck in the world. After coming down it was time to go to dinner. Medieval Times is a dinner-theatre that is fairytale medieval themed. That is, the knights are chivalrous, the king doesn't have gout and the evil prince plays fair most of the time. Even dinner is themed. Firstly, there is no cutlery; you eat with your hands. I quite enjoyed tearing my half chicken to bits in order to consume its delicious flesh, but I imagine some in the audience didn't fare so well.

The show itself is a mix of horsemanship demonstrations, knight skill demonstrations, plot and tournament fighting. The horsemanship was pretty cool; they crab walked a horse and had a horse walking on only its hind legs. There was also precision formation riding. The knight skill demonstration was mostly done on horseback and included catching large rings with a lance at speed, spearing a target at speed, passing flags back and forth at speed and catching a small, four inch steel ring at speed. I quite liked the latter as it is truly difficult and only one knight managed it.

I would rate Medieval Times as a place to go and a thing to see at least once.

KM 30763: Port Perry, Ontario (Day 2)

Today was a rest day in order to help Courteney get over her cold. We just mostly hung around my Aunt Debbie's place. We did go and see a movie, Transformers, which wasn't bad and then spent a couple of hours reading and enjoying the sun which came out while we were in the dark theatre.

So Courteney is well on the mend, but I am getting a cough now. Sometimes we just can't win.

KM 30702: Port Perry, Ontario

After an easy day of driving we arrived in Port Perry. Well, we arrived in Oshawa because we were looking for a movie theatre. We didn't find one, but we did find propane for fifty cents a litre, which is good.

Anyways, we eventually arrived at about two thirty to a warm introduction at my Aunt Debbie's house. She put on a nice dinner. Tyler even showed up with his family in tow. It was nice. We spent most of the evening chatting, though Courteney did help my Aunt with some knitting.

Tomorrow is going to be a rest day to ensure that Courteney is all better before we go to Niagara Falls. She has been getting better, but isn't quite a hundred percent.

KM 30459: Parry Sound, Ontario

Yesterday was Canada Day. Most municipalities in Canada, especially the larger ones, have fireworks the evening of Canada Day. We were even lucky enough to arrive at a sizable town. Unfortunately because Courteney is sick we had to go to bed early and did not get to watch the fireworks.

Everybody in Ontario should own a canoe. Ontario is so full of lakes, streams, creeks and rivers that everybody should spend time touring their waters. Now of course this is much easier than it used to be since the invention of DEET. According to Wikipedia DEET was invented by the US military, so let it never be said that enormous military budgets haven't helped make the world a better place. Driving through this country I wish for two things. Firstly that I had a canoe on me. Secondly that Courteney didn't hate bugs or water near as much as she does. I'm going to need to find some outdoorsy friends.

Well today was another day full of driving. There was a short break in the middle of it when we saw the big nickel. For those who are unaware there is a giant nickel (made of nickel) in Sudbury, Ontario. Sudbury is known for its nickel mining so I suppose it makes sense. Anyways, this is a scale nickel approximately fifteen feet in diameter. It is something to see, especially if you are feeling touristy. Other than that we have really just driven. I am happy that tomorrow is the last complete day of driving for a while as I am getting tired of sitting and seeing nothing but highway for hours on end.

In other news Courteney was a bit better today than yesterday so she is likely getting better. Perhaps she'll learn to listen to me yet when I tell her how to improve her situation, whether that'd be by going to bed early or by looking where she is going to sit before sitting even if she left just a minute ago.

KM 29951: Sault Ste. Marie, Ontario

Today was all about driving. Or at least, it was all about driving once I got a miserable Courteney to have a shower, eat her breakfast and dry her hair. Courteney is slow on the best of mornings. On mornings when she is sick she is downright glacial. She is still quite under the weather, though she is not so bad once she gets moving.

As I was saying, today was all about driving, as can be seen from the number of kilometres we covered. There isn't anything we really want to see in this area until we hit Sudbury and we are pushing to reach family not only so we can mooch a warm bed (to heal Courteney, of course), but also to see them. This is important because, unlike the USA portion of the trip where I had the second half of the funds meant for Canada to fall back on if things got expensive, this half of the trip ends when I run out of money. Courteney being sick doesn't help matters at all because she is even more miserable than normal when the thought of camping in the rain comes up. And of course cheap motels cost two to four times as much as the average night of camping. It will also be nice to see the family I haven't seen for a number of years and won't see for at least a couple more after I get married, because I'll be too broke putting together a home.

Hopefully it stops raining soon, Courteney gets better soon and propane gets cheaper soon. I am quite surprised to see that propane isn't near as cheap out east as it is in the Lower Mainland. Back home it is always at least thirty cents per litre cheaper. Out here it is more like ten cents and I have even seen it more expensive at one station. There is something wrong with being charged nearly a dollar a litre for something which was fifty cents back home when I left.

Also, Lake Superior is very big.

KM 29345: Nipigon, Ontario

It turns out that Ontario is larger than I first believed. Or rather, what I call western Ontario is bigger than I first believed. In some ways it is perhaps a country in itself. It certainly has a much different feel in this part than the other parts of Ontario I have been in on past trips east. Perhaps this explains why people out here tend to not take into consideration the needs of the actual western provinces, simply because western Ontario is already so different that they consider it impossible that anything could be as different again.

Let me start at the beginning however. The reason I discovered that western Ontario is so much larger than I first believed is that Courteney is sick. It isn't anything serious, but Courteney doesn't take well to being sick and travelling is hard enough on her already. So I thought we might push our travel a bit in order to arrive at my Aunt Debbie's house sooner such that she would have a warm place to get better. I thought it couldn't be more than eight or nine hundred kilometres away from where we stayed last night, Dryden. Well, was I wrong. It was actually sixteen hundred kilometres and our GPS claimed it would take twenty-two hours to drive it. So we'll see when we actually arrive. Until then I'm not sure how I'm going to keep her warm so she gets better.

Speaking of my GPS. The GPS contains within it a directory of services and businesses covering what I believe to be the whole world. How useful and accurate it is really depends on where you are. The maps themselves are pretty complete; as I mentioned previously it has at least some of the US Forestry Service roads in it. Something I may not have mentioned is that it contains mapping information for Europe and when searching for cities we have occasionally been forced to ignore results from Russia and Europe. Now in the USA the business directory seemed pretty good. If it didn't find something we were looking for in a particular small town, then it didn't exist. In Canada, however, it has been less good. It is fine for larger cities, but it falls down on the smaller ones. One example which was annoying is motels in Golden. We knew there were plenty of motels in Golden because we had driven by them, but when we went to search for a list of them to make a decision the GPS turned up only three.

It is always annoying and frustrating when you cannot trust your tools.

During the drive today we saw two moose, several deer (only one on the road) and one wolf. It is surprising how much wildlife you don't see in the Lower Mainland. I find myself wishing wildlife on the highway was a legitimate worry where I lived.

KM 28882: Dryden, Ontario

We started the day by visiting the Royal Canadian Mint. They have a tour where they show you the facilities and the various steps needed to mint coins. The tour is from an elevated and enclosed platform. I learnt a few things. The most interesting is perhaps that the Canadian Mint makes coins under contract for a number of countries, even a couple of coins for the USA. Also, the original poppy quarter was the world's first coloured coin. Finally, most new Canadian coins are stainless steel with a coating of nickel and iron. Things I didn't know.

The mint also has a coin boutique, should you be a coin collector and want something new and special. They have a few other things. The most fun is the option to hold a solid gold brick. That's right, half a million dollars and twenty-eight pounds of gold can be in your hands. Of course you can't leave with it; the chain and armed guard make sure of that.

The mint also has an interesting souvenir. They've taken one of their old stamping machines and hooked it up to stamp a stainless steel blank with the mint logo and the current year.

And so we left the prairies and headed into Ontario. In preparing for the next bit of the trip we took a look at the map. The first odd thing you'll notice is that the people from Ontario call the western part of the province North Ontario. This is odd because we are driving at about the same level as Kelowna. The second and perhaps more odd thing is that not only is Ontario large, it is also empty. Find a road map of Ontario and look at the highways. There is nothing in the northern two thirds of the province. Yet if you look in similar places on the west coast you'll find highways well into the north, you can even drive all year round beyond the tree line. I truly wonder why there is nothing in northern Ontario.

KM 28496: Winnipeg, Manitoba

This just in: You can make butter with regular 1% milk if you shake it enough. I went to the truck this morning to fetch the things necessary for a cereal breakfast only to find that our milk had gone chunky. Not the chunky that happens when it has gone bad, but instead the chunky where all the fat in it had grouped together, leaving white blobs floating in a pale white liquid. We opted to not have cereal and instead had cereal bars. This is how we started our first full day in Winnipeg.

Today was a historical tour of two of the National Historic sites in the area. The first was Lower Fort Garry. This fort is a short distance north of the city and is an HBC fort first put into service in the late 1840s. They have perhaps eight buildings all decked out with period furniture and living articles. Most of the buildings are also manned by actor-guides in period dress. Each actor has a story to tell about the part they are playing in the fort, but they are also quite knowledgeable with respect to facts that a person in their position during the operation of the fort would know. It was quite a nice experience and I highly recommend it to anybody in Winnipeg with half a day to kill.

After Fort Garry we paid a short visit to the Riel House. This is the historic house of the Riel family, most famously Louis Riel. It is set up for the year 1886, the year after his execution. There we found a guide in period clothing who was also quite knowledgeable. I'm not sure if this is normal for National Parks, but I like it. It was especially nice at the house because it turned eight hundred square feet of tourable house into an entertaining and educational hour long tour.

The most interesting portion of the Riel house is the minor historic point that it records. This is the French system of agricultural division. In the English system farms are made square in shape. The French system has long narrow farms instead. This is more sensible than it first appears because each farm was ensured access to a river for water and transportation. Each farm also had a variety of land available for use, from fertile land next to the river for vegetables to heavily forested land up to three kilometres from the river banks. Each farm tended to be 250 metres wide and three or more kilometres long. I think it is an extremely well thought out system that is good for self-sufficient or nearly self-sufficient family farms.

The final thing we had before retiring to our room for the evening was a slushie apiece. Winnipeg is the slushie capital of the world after all and it is only fitting. Things went about as expected with them: we both got a fair amount of brainfreeze, I became fidgety (perhaps the 1.2 litre cup was a bad choice) and Courteney simply felt ill. It all ended alright though, so tomorrow is another day of adventuring.

KM 28387: Winnipeg, Manitoba

Today was one of those days in which we are forced to consider the more mundane tasks of life. Tasks such as laundry. So we spent a couple of hours sitting in a laundromat. While heading back to the truck for a snack while waiting for our laundry to finish I discovered that my grill had caught something slightly unusual. In addition to the normal variety of bugs and the odd leaf I discovered that I had caught a bird. It took over twenty-eight thousand kilometres of watching small birds take flight off the highway, but I finally caught one. I am quite surprised it took this long.

After dealing with the necessary chores of laundry and scraping a bird out of my grill we arrived in Winnipeg. Before finding a room for the next couple of days we ate dinner. One of the things that I put on my list of things to do was to eat sushi in the prairies. Early in the second half of the trip we decided to wait until Winnipeg to do this because it is really in the middle of the country. So that is what we did for dinner. It was not bad, but as you might expect rather expensive. This did mean that they put the effort into the small things. Things such as having good green tea and raised floors with a hole around a table for your feet. This gave us the appearance of kneeling at the low Japanese tables. That was neat.

Now we are tucked away in a cheap motel for the next two nights. Two nights is really the minimum amount of time it is going to take to see all the stuff we need to see here. I had considered camping, but it was raining all day. Earlier in the day, just as we had finished our laundry, it started to rain hard. We had to fill up on propane before leaving. As we waited in line and for the man to pump the propane the skies opened and started throwing lightning and thunder around. I got soaked waiting outside.

For this reason we aren't camping now and will stay in a motel. It also means that we are guaranteed a shower. Unlike in the USA, it appears that most campsites here have pay showers. Now we never have much change because we use it for small purchases, parking and the like. Just something interesting.

KM 28001: Moosomin, Saskatchewan

This afternoon we left Regina and continued east. In the morning we saw the RCMP Depot and Heritage Centre. The Heritage Centre is basically a museum which explains the history of the force and has a number of artifacts from various eras.

It is an interesting thing to see, especially in the equipment that the early force was equipped with in their patrols of the north. They truly did not have much in the way of equipment. Perhaps the most useful things they had were their rifles and their red serge. In some ways I wish that the RCMP still wore the red as a regular uniform, but can see the argument for wearing a more modern uniform.

After perusing the museum there was the Sergeant's Parade, which we saw, and a short guided tour of the base afterwards. It is an interesting tour to take, though it doesn't travel far and wide into the base since it is an active training base.

We finished with the RCMP at around 2:30 in the afternoon. We then left town heading east. We ended up at a campground in a small town just west of the Manitoba border. When we arrived there was already a good swarm of mosquitoes milling about and the number only increased as the day wore on. We quickly put up the tent, cooked dinner and ate. In the end we finished up at around 6:30 and hid in the tent.

Now I have often heard that mosquitoes are bigger in the east and the north. I never quite believed them, but now I have seen it for myself. The mosquitoes we saw were about twice the size of the ones I have seen in the south western area of British Columbia. This is bad, but for the first time in my experience bug spray has been successful at keeping the mosquitoes at bay. Usually all it seems that bug spray does is annoy me. It certainly worked this time and was only 25% DEET. The bugs are taking their toll on Courteney though. She is seriously mentioning her desire to end the trip. She is going to go on as far as she can, but this evening she has claimed that Churchill, Manitoba is likely the end of the road for her. We shall see.

KM 27740: Regina, Saskatchewan

Today we arrived in the big city of Regina. To get here we had to drive about a hundred and fifty kilometres of road, most of it good. There were few bends and those that did exist seemed put in place more to keep drivers awake than to avoid anything.

The first thing we did upon arriving was visit the Royal Saskatchewan Museum. This is a relatively small museum that focuses on the province of Saskatchewan. It covers the geology of the area, the native culture of the province and the environment through time. It starts with rocks, goes through natives, then through dinosaurs and the ice ages before arriving in more modern times with animals and environments you'd see today. It was nice to see.

By the end of it Courteney's back was hurting and my watch read 3:30 PM so we decided to get a room for the night and perhaps let Courteney lie down before dinner. Upon arriving I checked out the weather channel as I always do when it is available. It was now that I discovered that it was in fact 2:30. We had half the day left. Courteney had recently mentioned how she wanted to see the movie Up while it was still in theatres. So we went to a theatre.

Upon arriving it turned out that the first showing was not until 6:30. We had a bunch of time to kill. To do so we went to the legislative buildings. Since we will be travelling through all the capital cities I thought it a good idea to at least see all the buildings to round out the trip. We did this and then wandered around a garden nearby and eventually rested on a bench beside what I believe was a river for a while.

After tiring of this we travelled to a nearby pizza place for an early dinner. This we had and we finished up at just after five, still with lots of time left. We had passed a blood donation clinic on the way to the pizza place so I thought we'd stop by and I'd give some of my blood to kill time. Well we found it again, but it had closed at one PM today. I'd give blood more often if they weren't always closed whenever I have the time.

Well, after this we returned to the movie theatre and proceeded to waste forty-five minutes wandering the adjoining mall. When it came time to go and get our tickets, however, we were in a Walmart and had some difficulty returning to the mall. Yes, we got lost in Walmart. We eventually found our way out, watched the movie, enjoyed ourselves and then found ourselves back at our motel for the night. Tomorrow we see the RCMP museum and perhaps a tour of the grounds before we head out of town for Winnipeg.

KM 27497: Willow Bunch, Saskatchewan

The town of Moose Jaw is somewhat bigger than I expected. I expected a small town, but not one of the tiny farming towns we've seen around. Instead it is perhaps what counts for a medium sized town out here; they even have parking meters on the streets.

We went to see the tunnels of Moose Jaw. These are what started out as steam service tunnels and were later used both by Chinese immigrants and bootleggers. Both of these tours are historic in nature, but they are not the stuffy tour you would expect. Instead they are, more or less, tour plays. Of course nobody tells you this before the tour starts. The bootlegging tour is entirely a play, where the Chinese tour is only half a play and half a tour full of explanation.

We did the bootlegging tour first and I ended up being chosen (you didn't expect them to let people volunteer did you?) as Charlie, the regular there at the speakeasy. Well we go through the club, Al Capone's office and bedroom. Then we meet the guard who brews, and sells, Al Capone's 195 proof private reserve.

It was interesting and well done. As a note it turns out that Charlie is a long time drinking buddy of the guard. It was fun.

The Chinese tour was more formal in places, but that is likely because acting out what the Chinese immigrants did would be difficult and would involve burning our hands and days of backbreaking labour.

After doing this we left the city in search of a place to stay. We had some time so we decided to head a bit south. In the end we ended up heading down a badly maintained highway until we were merely a hundred kilometres from the border. On the way we passed through a lot of farmland and I drove around a lot of broken pavement.

Willow Bunch is a small farming support community. It is big enough to have a motel, at least one gas station (we've passed towns without those) and a pub. It also has a museum, but I'm not sure why. As we were very near town we saw a sign leading to a historic park with petroglyphs. It was only eighteen kilometres down a gravel road and it even looked like it'd been graded in the last couple of years. On the way there we passed through an even smaller town which goes by the name St. Victor. St. Victor is a one-street town and though it is paved in town, it is gravel on either side. I'm really not sure how that town survives, but if anybody is interested there is a house for sale.

Anyways, we go up a hill to see these carved rocks. They are horizontal sandstone rocks and are exposed to the elements. Thus the actual carvings are faint. Somebody was nice enough to place a modern replica with deep carvings for us to examine in detail with ease. We saw and looked and we took in the view of the plains as this was on an actual hill which was perhaps two hundred feet above the rest of the plains. It was nice.

So back we head down eighteen klicks of gravel to the town we had decided to actually stay at. We head to the regional park and tour it trying to find a suitable spot. I was quite surprised at how nice the park was considering the area and what sort of money must be available. Well we set up and ate dinner. Just before cooking dinner we notice that the time on my watch and the time on my cellphone do not match up. They differ by an hour. For the time being we choose to go by my watch.

KM 27141: Swift Current, Saskatchewan

Well, the world's tallest tepee isn't quite what I expected. Firstly, I expected the world's tallest tepee in Medicine Hat to be enclosed; it wasn't. It was instead an exposed skeleton of structural steel. This was the first disappointment. Secondly, I fully expected the tallest tepee to be a gift shop, visitor centre or liquor store. All we got instead was half a dozen paintings and some plaques explaining the paintings. All the paintings represented important events or elements of native culture in Alberta.

Other than that we didn't really do that much. The drive from Rosedale to Saskatchewan is really quite relaxing. The road is good and the small hills keep things interesting enough. Unfortunately sections of the highway become rather rough in Saskatchewan. Also the parts of Saskatchewan we have seen are mostly flat, but not near as flat as the jokes would have you believe.

Judging by the distance we've covered since we left and how much more we have left to cover, I am beginning to wonder how long this leg will truly take. It may still take the three months that I first guessed because we have a number of stops and family to see, however it may involve significantly less driving than I first anticipated because the stops have been rather tightly grouped so far. In some ways this is good: travelling less distance means it costs less and we can do more. In other ways it is less good: the trip doesn't have an even mix of some driving and some seeing every day, but instead has stretches with lots of driving and stretches with lots of seeing. The lack of an even mix will probably be more tiring and I worry about that.

KM 26668: Rosedale, Alberta

Today we took our leave of Calgary. We woke up, showered and started to pack up while Grandma made us a nice breakfast. After eating and chatting for a short while we finished packing and loaded up the truck. We made it out at about ten thirty in the morning.

Our first stop was the Royal Tyrrell Museum. Because we got out a bit late and had to wait a short while to fill the truck we didn't arrive until nearly one o'clock. So we moved around the museum and took a look at the dinosaurs and related things. It was nice and enjoyable. Courteney had never been and I had last been there about eight years ago.

So we finished going through the museum at around four thirty. Courteney's back has been bothering her for the last couple of days and was bothering her as we left so we went directly to a campsite to stay for the night. We've had large meals for the last couple of days so we ate small with some soup and fruit.

KM 26486: Calgary, Alberta

Today was a lazy day. Because it was Father's Day the Barnerts put on a huge breakfast with omelets, potatoes, waffles and fruit. We gladly took part. We ate and then lazed around talking until about noon when Courteney and I began our trip back to my grandparents'. Once there I had a shower and read while Courteney baked a couple of strawberry-rhubarb pies.

Later, family members started arriving for the dinner Grandma was putting on for us. The turkey was good. Eventually dessert came, we ate the delicious pies still warm from the oven and everybody was well fed. There was much conversation and, as it was a Sunday, they started to roll home around nine.

This was fine because both Courteney and I were quite tired. We went to bed shortly after the last group left.

KM 26442: Okotoks, Alberta

After a great breakfast of bacon and eggs we headed out to the museum we failed to find yesterday. It is actually called the Military Museums. It is a pretty nice museum with sections from most of the branches of the Canadian military. It is too large to fully see in a single day. Instead we took our time through one of the museums, carefully reading every plaque and examining every artifact. The rest of the museums we went through quickly, skimming most of the artifacts and stopping to carefully examine certain things. There is a lot I missed.

After I walked Courteney's feet off we went for a late lunch and a couple of beers with Shane, Wes and Wes' girlfriend. I had a good steak sandwich. After that we went back to Shane's place to kill some time. We played video games for an hour or so and then played with a small, primitive motorized bike that was there. This bike has perhaps a two horsepower motor and no real suspension to speak of. We had fun motoring around the block for a while.

Now it turns out that Shane bought a six and a half horsepower motor on a whim a while back. Well, having fun on this little bike has put the thought into Shane's mind to make use of this motor. The most popular ideas of what he should motorize are a barrel, a refrigerator box and a carpet. The barrel is perhaps the most practical, but the magic carpet is the coolest. We'll see what he comes up with.

After leaving Shane's place we made a quick stop at the liquor store and then went on to the Barnerts' plot for dinner and a bonfire. A nice large fire was built and fun was had. We went to bed late and slept deeply.

KM 26361: Calgary, Alberta

Today we made it to Calgary. A trip that can be made in nine and a half hours took us six days. We arrived in the late morning. Our first stop was up the Calgary Tower. It has a nice view of the city and out to the horizon in most directions. It was fairly nice, though the glass floor isn't as nerve-wracking as I remember.

After that we tried to find what we knew as the war museum. We were unable to find it and so instead went to the World of Science here. We thought it would be much like Science World. In some ways it was, but it was mostly more for children than Science World. We stayed an hour and a bit and saw what there was to see, but I wouldn't recommend it.

After that we headed to my grandparents' house. We were greeted in the typical style with hugs and a nice dinner. We stayed up too late talking.

KM 26202: Banff, Alberta

We spent most of today taking in our beautiful national parks. First we went to Lake Louise. It is a very nice lake. We walked an easy path around the shore to the far end from the town. We tried to meet Mark and Karen for lunch, but they weren't answering their phone when we called. That's fine though, we'll see them in Calgary.

After Lake Louise we moved a short distance to Banff. There we took in a garden they have here built around the administration buildings, the original national park site of the hot springs and a couple of other things along the road. It is truly a nice park and I feel that I have underused the park system.

Mostly we walked around a bunch and took in the scenery. One other thing we took in was my hatred of RVs. I truly hate RVs. They are clumsy monsters of vehicles often driven by people who have never driven anything bigger than a sedan. Of course they can't stick to the straight main roads. No, they are inclined to take the narrow, twisting mountain roads of the parks and boy do they take them slow. I don't understand why you want to take what is basically a house on wheels to a campsite. Is it truly camping if you have all the luxuries of home?

This is not to say that I don't see the allure of having a camper, I truly do. Set up and tear down a tent everyday for three months and you get to see how nice it would be to sleep in your vehicle. In fact I am perfectly alright with camper vans. They are not unwieldy, slow and overly full of comforts. In fact if I ever do this sort of trip again, but on a different continent I would seriously consider renting a camper van. But there is no way I'd consider an RV.

KM 26003: Golden, British Columbia

So I was wrong in remembering which side of Golden Rogers Pass is on. It is actually on the west side, while Field is on the east side. So we backtracked halfway to Revelstoke in our quest to visit the visitor centre there. It was interesting to see.

As you can see above we covered quite a distance today. What I find odd is that though I drove more than on most days of driving in the USA I didn't find it nearly as tiring. I'm not quite sure what to attribute that to exactly, but I have a few guesses. The first is that the drive itself is much quieter. Not only is there much less wind, but there seems to be a lot less road noise. The road surface is all asphalt, instead of primarily concrete, and thus much smoother going, with significantly less vibration. I did also replace my front tires, which were wearing out, and that has probably helped with the vibration significantly.

Well, after seeing the centre at Rogers Pass we headed to the small town of Field. I have often driven past this town on the way to Calgary. It is a small town just off the side of the Trans-Canada Highway. I always thought it was a town of workers: construction workers, rail workers or miners. I thought it the kind of place with one seedy motel and one rough bar. How wrong I was. It is a small place, but there is nothing cheap about it. It also seems that half the buildings in town have a guestroom for rent, and not all that cheaply either. So, as it was raining and Courteney would hate me otherwise, we headed back to Golden to get an affordable motel room.

This brings us to the end of the day where we grabbed dinner at a 50's themed diner. Tomorrow we head into Alberta.

KM 25459: Kimberley, British Columbia

Of all the places I've seen, the area around the town of Creston, nestled as it is in a valley in the West Kootenays, is perhaps the most livable. The valley floor itself is fair-sized and fertile. The mountains provide variety and sport. According to one person who lives here the weather is also quite nice, with high twenties in the summer and only rarely getting colder than negative ten in the winter. There is always snow, but not usually more than four or five feet. It truly sounds like a nice place to live.

The reason that we have taken the extra-long way to Golden (not that we've arrived yet) is to see the glass house. This is a house made mostly of glass bottles and mortar. It was built by a retired mortician in the early fifties. The grounds have several retaining walls and what look like guard towers from a fairytale castle, except that the walls are all made of empty sixteen-ounce bottles of embalming fluid. There are over half a million such bottles. It is something worth seeing. I forgot to ask if the builder threw stones.

Really the glass house was the only thing we saw today other than driving through the countryside. It is nice countryside and pretty full of wildlife. We are making our way north now, back to Golden, so that we can take Rogers Pass through to Banff. One interesting thing that I forgot to mention yesterday is that apparently road travel across the Rockies was not always as easy as it is now. I'm not talking about hundreds of years ago either. Apparently, up until the mid-sixties travel by vehicle was treacherous at the best of times on the Trans-Canada, and everybody took either the more southern pass or travelled through the USA.

We have also seen perhaps a dozen bikers travelling along the stretches of the Trans-Canada we have driven. I suppose it is a common challenge, though I didn't think of it that way at the time my friend Trevor did it. Well, the sun is going down and the bugs are coming out. This means it is time to retreat into our shelter for the night.

I have gained a greater appreciation of shelter and the nuances thereof. There are a few things one could need shelter from: wind, rain, cold, heat, bugs, sunlight and lightning. The least important I have, surprisingly, found to be sun and cold. A tent only protects against a few of these things, namely bugs, wind and rain. Though tents can protect against the sun, they get so hot as to be unlivable. The only thing I wish tents provided protection against is lightning. There is nothing quite as draining as setting up a tent for the night with the expectation of rain and the threat of lightning. Retreating into the cramped cab of my truck just isn't comfortable. All the outdoor advice I have seen regarding thunderstorms says to take cover in a building with plumbing or in a car. A tent is neither of these things, and yet people lived for thousands of years before the invention of either plumbing or cars. I wonder how they made it and why similar reasoning cannot be applied to tents.

I am considering setting up a lightning rod at a distance of fifteen feet or so from my tent, but I don't know if that is both far enough away not to get shocked as it grounds a strike and near enough that the tent is contained within its radius of attraction.

KM 25095: Nakusp, British Columbia

Today was a busy day. We visited the Last Spike, the Enchanted Forest and the Revelstoke Dam, and still made over 250 kilometres of driving.

The Last Spike, for those who are unaware, is where the east and west portions of the Canadian Pacific Railway met during construction. It is truly a stop everybody must make at least once in their life as they travel the Trans-Canada Highway. We have both been there before, but decided to make a quick stop in order to take a picture of it on our trip. It wasn't even out of our way.

After that short stop we arrived at our first planned destination, the Enchanted Forest. The Enchanted Forest is more or less a short nature trail with many small (and large) statues of characters and scenes from fairy tales. We saw many familiar things. I had been there once before when I was young, but Courteney had never been and wanted to go. It turns out that this forest was started by a sculptor who created a place to display her work. At first it wasn't open to the public, but eventually it was opened as traffic increased on the highway. It is a fine way to spend a couple of hours. They even have several buildings in interesting forms that you can enter and peer out of. They are child-sized and so make a tight fit for the two of us, but we had fun in a couple of them.

The third stop was the Revelstoke Dam. We took the tour and saw some things. It is a self-guided tour and so is more a set of exhibits, but that's fine. The educational displays here are better than the exhibits at the Hoover Dam. They are currently installing a fifth turbine to add to the original four (the dam itself is designed for up to six) in order to increase generation capacity and help reach the new government policy of energy self-sufficiency by 2016. Compared to building a dam it is a small project, but installing pipes big enough for two lanes of traffic is a project no matter what.

Finally we drove a bit. We are currently detouring to the very south of the province in order to reach Creston. We should be there tomorrow. We have pretty much fallen back into our travel groove and things are going fine. I'm sure people in Calgary are wondering how we could be taking so long to travel what is otherwise a ten hour drive.

KM 24839: Salmon Arm, British Columbia

Today is the first day that I realized that we are truly different from other tourists in that we get up early. This morning we woke up at six, showered and ate quickly. We were heading out by seven thirty. Our first destination was Hell's Gate. We arrived shortly after eight. We thought this a reasonable time. We were wrong. The Hell's Gate Airtram doesn't start running until ten. So we proceeded to sit in the truck and kill two hours.

Finally the tram started running and we bought our tickets. The whole attraction is simply crossing the Fraser Canyon. It does involve a drop of nearly two hundred feet, and there is a little touristy area on the other side with a gift shop, candy shop, some signs and a small museum. You also get to look at Hell's Gate, which is a section of the Fraser River, pretty close up. It is certainly neat to see, but won't fill an entire day out.

Specifically, Hell's Gate is a section of the river where one side is a sheer cliff, the other side is a steep grade and the river in between travels at up to twenty-five miles an hour when the water is high, and more slowly over rocky rapids when it is low. Before modern boats, no canoe and only a single steamship ever plied these waters. The ship was a paddle wheeler, and it took 150 Chinese labourers pulling it on ropes to aid it. I was quite surprised to read that.

Finally there is a suspension bridge which crosses the river. It is the second bridge because the first was washed away a few decades back when the river reached a record height of 207 feet (when we were there it was 170 feet deep). I didn't think rivers could even get that deep.

Well, we left Hell's Gate and went back to Hope in order to take the newly de-tolled Highway 5 to Kamloops. We did this because we thought it would be shorter. After driving it I am no longer sure, but we made it. We are now heading as far east as Revelstoke, then we will head south nearly to the border to see a few sights. Travelling through the mountains really makes the roads odd, and any loops are large and out of our way.

This evening is also the first evening that we are camping again. Last night we stayed in a motel because Courteney feared heavy rain and it did feel rather muggy. I agreed mostly to let her ease her way back into the mood of travelling. No rain came, but I did get a chance to watch the Weather Channel and see that not only was nearly all of Southern Canada having the possibility of thunderstorms yesterday, but also that it is supposed to rain on us for the next four days. Conveniently it didn't rain hard at all today when we were driving, though we did get a light sprinkling a couple of times.

Now that we are camping again we see the bugs, and I am reminded of a bug that is very common up here in Canada, but which I saw not at all in the USA. This is the humble fly. We saw very few or none of the black fly and its relatives the house fly, deer fly and horse fly. Well, there are flies here and I expect we won't get away from them at all. I really hate flies more than mosquitoes because some of them take chunks out of you, and that hurts.

Well, that is all for now. Tune in soon for more entries just as exciting as this one.

KM 24423: Hope, British Columbia

Finally, we have left on the second leg of our journey. It has been a long road fraught with truck repairs, frustration, waiting and delays. We even had delays leaving today. First, loading the truck took far longer than I think it should have. This is not to say that we could have done it much faster.

Then there was the propane connector bit, which I had bought to make setting up the stove easier and had since lost. So we had to pop into Langley City to pick up another. Then I had to buy more car insurance and there was a line. Then, thinking we had everything, we headed east. Upon arriving in Abbotsford I recalled that I hadn't taken the bear spray from where I had set it before leaving for the first leg.

While we could easily stand not having bear spray in the USA, because there are significantly fewer bears there and we were planning on sticking primarily to well-populated areas, this is not true in Canada. First of all, nearly all of Canada has bears, where only a portion of the USA does. Secondly, we plan to spend more time taking in the wild. This means we will not only camp in less populated and built-up areas, but also go hiking in the woods. We will of course take precautions to avoid a problem, but if a confrontation should occur I want more than a pocket knife as my last resort.

So we had to turn around and pick up the spray. Then we headed back towards our first destination, the Hope slide. We arrived a little after four and went to see the slide. It is a large slide and not only filled in a small lake, but also raised the floor of the valley by up to seventy metres. It is amazing what the side of a mountain sliding can do.

This first day of travelling did not go as well as I had planned, but tomorrow should be better. Soon enough we'll get back into the travelling groove.

The Trials of Leaving and Other Things

Since I've been home a few things have happened which are of note. Perhaps the most important has been my attempts to leave on the second half around Canada. I thought this would be a simple matter of getting my truck a tuneup and an oil change, extending my vehicle insurance, buying some food and leaving. In the abstract that is exactly what happened. As always the devil is in the details.

The first detail is that I made an appointment to get a tuneup on my truck and the other minor work done. I dropped my truck off first thing in the morning under the expectation that it would be done late that day or early the next. This was a Wednesday. Come Thursday afternoon I call to see how my truck is coming along. I am informed that it is next in line to be seen. If they weren't going to get to it until late Thursday, why the hell did I make an appointment in the first place?

But there isn't anything to be done about it. They claim it will be done by the end of the day Friday. This isn't that helpful, because I am graduating on Friday and have places to go in Burnaby and Vancouver afterwards. The mechanic is closed on weekends, so I have no chance of getting my truck until Monday. At this point I had hoped to be leaving the Tuesday after graduation.

The truck isn't ready Friday and it isn't ready first thing Monday morning. Instead it is ready at the end of the day Monday. So leaving Tuesday is shot, because I still need to extend the insurance, put a new pair of tires on the front wheels, buy food, pack up and still have enough time in the day to get somewhere before dark.

So I move back to wanting to leave Wednesday. Tuesday morning I go to extend my insurance and my hopes are crushed: my truck failed AirCare. This was unexpected. Sure, my truck has never once in the entire time I've owned it passed AirCare the first time. However, the solution has always been to give it a tuneup, and this time I did that before running it through AirCare. This depressing fact meant getting it back into the mechanic: the very same one which took so long in the first place. I use this particular one because my truck runs on propane and this mechanic has a propane person. They haven't been that bad for time before this summer.

Anyways, I can't get my truck in before Thursday morning. I again hope that it'll be done by the end of the day, but to no avail. In fact, as I write this it is after noon on the following Friday. The importance of noon is that, though I went food shopping and sorted the stuff we need to pack and all that while waiting for my truck to be fixed, I still need to extend my insurance, fill up on fuel, buy ice and drive somewhere. Getting the truck after noon doesn't leave enough time to do all that and make it a suitable distance away.

So now I find myself having to kill time for another evening. Having full days of spare time sucks. This sucks especially when there is nothing worth watching on TV and because I am leaving for an extended period 'any day now' I can't really start putting serious time into any of my private projects. I'll just have to set them aside. So I mill around aimlessly.

And that caps off the complaining for the moment. Now I get to summarize the other events that have happened since I got home. The first is that both Courteney and I graduated and were given our degrees. It's been all but finished since October, but there is something truly final about a ceremony which makes hundreds of people uncomfortable for a few hours. Courteney ended up with a Bachelor of Arts and myself a Bachelor of Science.

Courteney also spent a couple of days looking at places and we took a tour with the catering manager at one of the places. There is now a date and venue for the wedding (June 19, 2010), but all other planning is being put off until September at the earliest after we get back.

The only other thing of significant note is that Nathan, one of my brother's friends, left for the middle of nowhere. I haven't mentioned this here yet, but over the past couple of months he has been preparing to be flown in on a float plane to some small lake 'near' Burns Lake. He is going to stay there for fourteen months, living all alone except for his dog. He will build a log cabin, chop his wood and generally be separated from civilization for the most part. It is going to be some adventure.

There were of course the required send-off parties, and he is up there now. He was supposed to bring something similar to our bleeper and use it to notify people where he has decided to set up his cabin. We over here have yet to receive any response, but he left a lot of things to the last minute and may have forgotten to set it up so that the people expecting his messages would get them.

He won't be entirely alone the whole time. Sometime during the summer a friend of his is going out there for a time, and there are embryonic plans to put together a snowmobile expedition during the winter to visit him. Of course nothing can be done until those of us in the outside world know where to find him.

I suppose this will do for an entry. I'm sure many people didn't make it this far down. For those of you who did, I commend your stamina. Don't worry: barring permanent delay of the second leg of the trip, my next entry will be written on the tiny rubber keyboard and consequently shorter in length.

USA in Summary

Before I begin I am announcing that the wait for pictures is over. When viewing the blog just click on that handy Gallery link to see the trimmed results of the USA.

Well, we've been home for a little over a week. In that week I've rested and had time to look back on the first leg of this long trip. Several things come to mind worth noting, but none so unexpected as the realization that travelling is hard work. It truly is.

Then there is the appreciation of the immensity of several modern networks. The road system, electricity, telecommunications, postal service and fuel: these are the largest networks in the USA. By far the largest is the road network and the smallest the fuel network. It is easy to imagine the difficulties caused by this disparity. Equally astounding is how cheap the postal system is. For less than a dollar I got a postcard from the middle of an inhospitable desert in the southern United States to my home in Canada, a trip of several thousand kilometres.

No matter what anybody may say there is an undeniable difference between the USA and Canada and the people who make up those countries. There is no single large distinction that one can point to, instead a wide gulf is created through many small, important differences.

And though there is a difference between my home and the places I travelled and though vast geographic distances create difference in the people even within a small part of the United States, for the most part people live their lives in much the same way no matter where you go.

Though we may joke about the litigious nature of the USA, it is certainly real. Lawyers advertise frequently on billboards, milk contains a warning that it may contain milk and nearly every service or product you come across is foam-padded.

The most important structure in the entirety of the USA is most certainly the Hoover Dam. Without that dam southern California would be unlivable. The fertile farms there, which produce approximately sixty percent of the country's fruits and vegetables, would quickly turn to dust. The critical nature of this structure is sorely underestimated.

When it comes to roads there are a few important differences to note. The first is that the west coast does not believe in guard rails and the east coast does not believe in street lines. The streets in urban areas are, as a rule, terrible. The speed limits are much closer to the maximum safe speed for the given road than in Canada. Speeding not only costs in fines, but is actually dangerous in many areas.

The United States is a place to visit and contains lots to see. And yet, after touring it, I have firmed my belief that I would not like to live there. This travelling has not made me appreciate where I went as much as where I came from.

KM 24183: Langley, British Columbia

Home. After a long time we are home. Not much interesting except that the dude at the border was cool. Stay tuned for pictures.

KM 23851: Leavenworth, Washington

This town we are staying outside of is themed like an old Bavarian village. It is a bit interesting to see old style Starbucks signs. Other than that nothing much interesting happened today. We are just driving home and will be there in the afternoon tomorrow.

KM 23349: St. Regis, Montana

Well. Today I have nothing to report. We woke up, ate, packed up and left. We drove all day before arriving here, just east of Idaho. I expect that we will be back in the Lower Mainland in two days. Other than that we did nothing and saw mostly nothing.

Oddly enough, I was previously tired of travelling and was doubting my resolve to go through with the Canadian portion of the trip. That feeling has come to an end, and I am, if not chomping at the bit, fully willing to do it. I never thought driving could be at all restful. Perhaps it is the more familiar scenery.

KM 22738: Yellowstone National Park, Wyoming

This is perhaps the most famous of all the American parks. Firstly it is large as American parks go, looking at the map I estimate it covers about nine hundred square miles. I have also learnt on this trip (but cannot remember where) that this park contains the only remaining complete ecosystem in the USA. And that is only after the reintroduction of wolves a dozen years ago.

Finally, and perhaps most famous, is Old Faithful, the geyser that blows her lid with striking regularity. In fact it erupts approximately every hour and a half. We, of course, saw this spectacle upon arriving. It is something to see, but I wasn't terribly impressed.

We didn't actually do much else today, because the park is large and the national parks generally have a low speed limit, forty-five miles an hour here. So it takes a long while to get anywhere. What driving through this park has made me think of is back-country camping. I think I'd like to try it sometime.

I also don't believe I've mentioned it before, but the price of gas is noticeably rising down south of the forty-ninth. I'm taking this as my cue to leave. And since this park is the last item on our agenda, we will do so with good speed.

KM 22190: Buffalo, Wyoming

Wyoming, while not as flat as the Dakotas, is certainly emptier. This is both good and bad. It is good in that the only roads that get anywhere are primary highways and so have fuel at convenient intervals. It is bad in that there is not much to look at. This is not to say we saw nothing.

What we did see was something we have sorely missed: mountains. This country lacks decent mountains over most of its area. Not since leaving Utah have we really seen a proper snow-capped mountain. We were both excited to see these mountains in the distance. We didn't reach them, as we viewed them a few hundred miles off. That is another nice thing about this particular state and the current weather: you can see forever.

We also saw Mount Rushmore. While it is surely a feat of artistry and something to see, it is something to see once. It is also smaller than I thought it would be. I guess that most photographs are shot to make it look imposing and majestic. I just think it is an interesting sculpture. The depiction of Washington is pretty good and does make an impression. Jefferson is alright. Lincoln appears unfinished (which it mostly is), with the natural rock beginning too close to his chin and ending too far up his head. Roosevelt just got a bum deal. His bust is in a nook that makes it difficult to see him, and the additional distance makes him seem smaller. Perhaps the best part of the carving is the profile view of Washington, a facet I didn't even know existed.

The sculptor of Mount Rushmore talked about the monument as something that would leave a mark for thousands of years into the future. This seems fairly reasonable, because it is carved into natural granite. However, a maintenance crew goes out annually to patch cracks in the monument. It is all fine to talk about people viewing the sculpture ten thousand years from now, but is that going to be true if it needs yearly maintenance?

It sure is a long haul driving straight and steady for hours on end in a land where the only distinction is between land and sky. But we keep trucking towards our rest.

KM 21734: Belle Fourche, South Dakota

A note to those travelling through the rural areas of the USA: the county roads tend to have only small local gas stations, and those are all closed on Sundays. I nearly ran out of gas twice. In the future, though it may take longer, I think I'll stick to the larger highways.

Well, South Dakota isn't as flat, but the roads are just as straight as North Dakota's. If you want to pick up some bad driving habits, drive in the prairies. No need to stop at every intersection: you can see for miles. Two hands on the wheel? Nah, it's all a dead straight line anyways. Speed limits? They exist, but there will never be anybody to check on you.

It takes great effort to not become a bad driver here.

As we've been driving about we have been checking out the welcome signs of the small towns around here. Many of them say something to the effect of "Great place to visit and live". I'm not sure, but it often sounds slightly desperate, almost like they are trying their best to attract people. Perhaps they are trying to retain people. I can see it being easy for a young person to leave. These small towns seem to already have enough businesses and to have few jobs, especially non-agricultural jobs.

What is most interesting about this is that I have seen several ads for going to school; from state colleges to certificate programs. These are things I just can't quite explain.

KM 21128: Minot, North Dakota

North Dakota is flat. I mean board flat. It is also not unlike the desert, except that it is green and has more cows. It is nearly as empty though. Other than that we basically just drove.

Well, that is except for visiting the centre of North America. That is, Rugby, North Dakota is the geographic centre of North America. This point is equidistant from the most northern point of Canada and the most southern point of Mexico, and from the westernmost point to the easternmost point. It is just one of those little odd roadside tourist attractions.

Conveniently, the flooding of the Dakotas has stopped and withdrawn some. There is still ample evidence of recent flooding: many fields are still under water and the medians in the highway are currently small ponds. Continued flooding was my greatest worry about this area. When we were early in the trip the Dakotas started to flood. I thought this was alright, because it would be over in a week or so. Three weeks later, as we made our way to Florida, it was still flooding and I was getting slightly worried. It is bad enough that I don't know exactly where I am going (often all I know for sure is the compass direction), but far worse to not know which roads are flooded and impassable and which are merely wet. Driving back and forth is tiring and stressful as we reach the end of our tank of gas.

KM 20611: Glyndon, Minnesota

Yesterday we did the most touristy thing imaginable. We went to see the world's largest ball of twine. It was pretty big. We didn't spend much time there because it is just a ball of twine inside a covered structure. It is sure something to see though.

In fact, that is basically all we did except drive. It's some nice country, but there isn't too much here other than farms. Another reason we didn't do much is that it was windy all day, raining most of the day and was quite chilly (five degrees with an expected low of negative two). Instead of attempting to find a drive-in movie theatre as we had originally decided we hid ourselves under warm clothes and blankets in the tent.

In fact, right now it is still chilly enough that I can only type a line or two before my fingers are too cold. With that I'm putting on my gloves.

KM 20001: Pipestone, Minnesota

I ate way too much yesterday. On our list of places to see and things to do we have these pie places. Now usually they are well spaced and we only get to one every week or two. Not so yesterday. Yesterday we had two pie places to eat at: one for lunch and one for dinner. Now of course if you go to a restaurant and haven't eaten lunch yet you will order some before dessert. This is fine. However, having done two in one day and having a rather large dish at the second place proved to be too much and I was digesting through most of the night.

We first visited Johnny's Cafe in Omaha, Nebraska. The pie we were to have there was the pecan pie. It was quite delicious and exceptionally creamy. This was our early lunch. We then moved on at a leisurely pace north through Nebraska and Iowa into Minnesota. (Nebraska is much more hilly than I had thought.) In Minnesota we examined the large variety of pie available at Lange's in Pipestone. I ended up eating a Sour Cream and Raisin pie, which was much acclaimed, after an enormous plate of veal parmesan. The pie was good, but I was too full to finish it.

Something I've observed when visiting small businesses is that some of them frame their first dollar. I'm assuming it is the first revenue they made. I have no comment on it, it is just interesting and something I haven't seen before. Another observation is that I am beginning to believe that the USA is a country of highways. It seems as if everybody lives within a few miles of a highway and commutes on those highways to work. I wonder if Canada will be similar.

KM 19531: Atlantic, Iowa

Thunderstorms are something that are often enjoyable to see. Not quite so when camping. This is especially true when every campsite seems to have taken a forest and cut down all the trees. This makes it too dangerous to sleep through the storm. If the trees were larger and more plentiful right next to where our tent is planted it wouldn't be a problem. Instead when the flashes get close we need to move out of the tent and into the truck cab. It is just a bench seat and so isn't too comfortable.

This is what we did last night for an hour or so. It is such a pain. When I woke up this morning I thought at first that it was going to be a nice day. The sun was up and though it wasn't bright it was still early. It was also not raining. Well, soon after getting up it started to rain gently. This was fine. Then after we had finished our showers and breakfast it started to rain harder. This is not great because we had to take the tent down, but it was workable.

Well, when we had just begun to pack the truck up we heard the first distant reports of thunder. We picked up the pace, but even with our practice we can only move so fast. As we packed up the storm moved closer and closer. We had everything packed away except a tarp and the tent and were merely five minutes from being done when the storm arrived. The space between the flash and the sound was less than a second.

So with the tent and tarp unpegged and floating there we retreated to the truck. Conveniently the storm passed quickly without wind and fifteen minutes later we had the tarp and tent loosely thrown into the back of the truck and we were off. Taking down a tent in a thunderstorm is exciting, but I wish campsites had more tall trees.

So we drove through heavy rain for a while. At times it got so heavy that the road looked like a long puddle and the windshield spent more time blocked than clear. This eventually gave way to on and off rain. We listened to the weather report that claimed heavy rain and thundershowers and so chose to sleep in a motel this evening. Upon arriving we also learnt that we had spent the entire day travelling through a tornado watch area. We are beyond it now, but according to the Weather Channel tornadoes are happening a couple of hundred miles south of us and are nearly happening in areas we had driven through.

I guess I don't get to see a tornado on this trip, but I do get an uninterrupted night's sleep tonight.

KM 19046: Geneseo, Illinois

Today we visited the flea market. The three most common items were: socks, women's bags and bras in variety packages of five or six. I had never been to a flea market before, so this was an experience. Though it claims to peak at thirty thousand people, this day it didn't have anywhere near that. It is still early in the season though.

We just walked around and looked through a few booths. We ended up only buying some fruit and Courteney got an Indian outfit, perhaps for belly dancing. The booths contained a surprising amount of day to day stuff. I didn't expect to see so much in the way of papers, pens, soap and the little knick knacks that always seem so expensive to buy and so necessary.

We then made our way towards Omaha. We didn't make it and are still a day or two away, but we are right on the Iowa border. Unfortunately they are calling for rain and possible thunder showers tonight, so we could only have a simple meal and had to hide away in the tent.

KM 18575: Elkhart, Indiana

You have got to have great respect for those Amish. Not only do they stick to their beliefs and use only horse buggies and bicycles to get around, they do it while being passed several hundred times a day by people in cars. That takes dedication.

They do eat well though. We had lunch at an Amish-style restaurant. They had a meal option which they called family dining. This is likely close to what their meals are actually like. Firstly it is all you can eat, a good start to creating a meal that fills you up. Then it has a selection of good hearty meats such as fried chicken, meatloaf, smoked ham and roast beef. This is rounded out with mashed potatoes, beans, chicken stuffing and thick noodles. Topping it all off is gravy. We had the fried chicken and meatloaf and it was delicious.

We had decided to stay until tomorrow so we could see one of the largest flea markets in the USA, which happens on Tuesdays and Wednesdays. To pass the time we walked around the local tourist town and checked out the shops. One thing I noticed is that many things around here are unusually cheap. We saw a gallon of pure maple syrup for $45US, which we didn't get because we would never finish it. We perhaps bought a couple of things we shouldn't have (I bought an oil lamp), but it was a nice sunny day.

Cold though. Even when the sun was up it topped out at about fifteen degrees and once that sun went down it got cold quick. I'm not sure what happened to Spring, but it doesn't seem to have yet arrived in force around here.

It is interesting to see professional buildings such as doctor offices with an occupied hitching post and sections of parking lots reserved for horses and carts. It is also worth seeing a small combine being pulled by six draft horses. Unfortunately we took no pictures, out of respect for their religious beliefs against them. We will see what we can see at the flea market. I expect we'll see more examples of Amish production there.

KM 18530: Elkhart, Indiana

Interesting fact: Indiana has the greatest population of Amish. So we are back in Amish country. This time I actually believe it because I've seen a few horse drawn carriages and 'driveways' setup for horses. Yesterday was laundry, grocery shopping and driving. The drive was nice through some small farms. We even passed into Michigan for a short while.

Trying to find groceries was a chore. The particular town we were in was a medium sized town, probably thirty to fifty thousand people. Yet the section of town we were in had us pass by two closed grocery stores. Most interesting was a shopping centre we stopped at. This was obviously a rather new centre, likely less than five years old and well maintained. The only reason we could tell it wasn't brand new was the worn, cracked and patched parking lot pavement. This shopping centre seemed empty and dying. It seemed that every second unit was closed, the parking lot was basically empty and the shops that were there were closed. Even more telling, the central building which looked designed for a large store was for lease.

On this trip we've seen many spaces up for lease and an especially large number of gas stations shut down. It wasn't even that long ago as they mostly appear modern and those which left a price up show something in the three dollar range. I wonder how many towns we've passed through that are dying and we didn't even realize it.

Back to Amish country. Since we are in Amish country and they are known for their handmade goods we are going to do a bit of shopping. Well, more looking with a willingness to buy rather than buying, but I don't know a word for window shopping where we bring money.

KM 18220: Sandusky, Ohio

There is another day of driving past us. We drove a fair bit yesterday for a good reason. As you may be aware the NHL playoffs are happening right now. We are also near both Detroit and Chicago, which both have teams in the playoffs. We were notified of a few game dates in this area by Jesse so we thought we might try to take in a game. Because of this we drove a fair ways yesterday in order to be close enough to make a game in Detroit or Chicago. Unfortunately there are no tickets to be had at this late date for Detroit (the game is tonight). We are also having no luck for seeing the Canucks in Chicago, but still hold hope that a few more tickets in our price range will be released. But it was a good theory.

This morning we got off to a late start because Courteney was dangerously low and that required some time to rectify. Not that an early start matters at all today because we don't have anywhere to be. So instead of touring Motor City while waiting for the game to start I am writing this from a Laundromat, waiting for our huge bag of laundry to finish.

KM 17743: Lewiston, New York

Today we visited the American side of Niagara Falls. Niagara Falls is surely a large waterfall. I have heard and can confirm that the American side of the falls is not as good as the Canadian side. Though the American portion of the falls itself is fine, the view is not. The USA has no land from which to view the falls head on. They even built an overlook structure to get halfway there. The Canadian side, however, has a nice park on the other side of the river from which one can see the entire falls.

When we arrive on the Canadian side we will know for sure which side is better. Both sides have an attraction that gets you wet. The Canadian side has the Maid of the Mist boat ride into the falls. The American side has an elevator down to the base, where they have a walkway set up to go right up near the falls. We did the Cave of the Winds, but didn't get that wet because the top platform right near the falls was closed. Every fall they take the platforms down to avoid damage from the massive ice accumulation which occurs there.

Well, we are well on the home stretch and I am feeling that to be a good thing.

KM 17410: Sandy Creek, New York

Today is a rest day. I've been tired lately, travelling is more work than I had expected. I also am coming to believe that two months travelling at any one stretch is the maximum length. Any longer and it threatens to become a routine.

Again on our drive we didn't see too much of note. What we did do was drive through a mountain range and look at what the Americans do for resort towns. Basically the entire mountain range was small towns strung along the highway with more motels and restaurants than would otherwise seem sensible. I am beginning to become awed at the extent to which people live in areas that, at first glance, seem to offer no sufficient means of making a living.

During our rest day we basically just sat and read or watched the fire, or read by the fire. It is surprising how long a small pile of wood can last if you keep the fire to the minimum necessary to warm two people.

KM 16916: St. Johnsbury, Vermont

Yesterday was another pretty generic day of driving. We did have to stay in a motel because all the campsites are closed for the season. The northeast is filled again with small towns separated by small segments of highway.

We are on the home stretch now and I am finding myself becoming bored. Perhaps this is because I am tired and there is little to see from the highway other than trees. Hopefully this gets better when we approach another destination. The next destination is Niagara Falls, from the US side.

KM 16558: Bangor, Maine

So we didn't quite turn the corner. At least we got to the corner. (Completely unrelatedly, my normal email addresses are working again.) So today we begin the home stretch. The original plan was to arrive back home, stick around for three or four days doing repairs and cleaning and then head off again. With the way the timing is working out it seems that we will instead spend a couple of weeks at home doing this, that and the other thing while we wait for convocation.

I have remembered the observation that has bugged the hell out of me. I'm not sure if it is only an American thing, but I haven't really seen it in Canada. This is that whenever an American parks illegally they put their hazards on like it makes it better. It doesn't. I've even seen once or twice (on television) people arguing a parking ticket on the grounds that they have their hazards on. Silly. I had also not seen double parking before and it annoys me something fierce.

But the drive yesterday was mostly uneventful. We have seen some old towns and small villages, but not much that is exceptional. The houses have started to all have second floors, but they are also small and old.

KM 16201: Salisbury, Massachusetts

These north eastern states are all small. Yesterday we drove through Rhode Island. Looking at the map, Rhode Island is a city-state such as the Ancient Greeks had. We also stopped at a place in Boston on the beach for some clam chowder. It was good and hot. Also nice and thick. Courteney is going to have to look for a thick clam chowder recipe because the one she has is a bit too soupy.

We also drove past MIT. I had wanted to stop and look around the campus. Unfortunately we arrived on a Sunday, which is bad, and in the late afternoon, which is worse. We couldn't find any parking and there was some sort of sporting event happening so we just gawked as we drove by.

We are nearly in New Hampshire and I expect that we'll turn our final corner in the United States today.

I'm not sure what it is, but it appears that the Americans like their strip clubs. We often see three or four along our short path through town. They mostly represent themselves as gentlemen's clubs as well. I very much doubt they spend much time drinking scotch on an overstuffed leather chair while smoking a cigar and discussing world politics. In fact, it seems that either a region has more tiny casinos than you can shake a stick at or more strip clubs.

Something slightly different with respect to the roads in this region of the continent is that they like round-abouts. More than that though, they use round-abouts with traffic lights on the circle. I find this exceptionally odd.

Something I find funny are the signs which say no parking on this street during a snow emergency. Now around here I expect a snow emergency to require several feet of snow in only a couple of hours. Firstly how is one supposed to move their car when they are buried up to the hood in snow? Secondly, as with all old towns there is no parking, so where do the cars go?

KM 15897: Clinton, Connecticut

Today we toured through some of New York. Perhaps it was more accurately Manhattan. In any case we saw Times Square and briefly thought we were in Japan. We saw Broadway and were tempted to see a show, until we saw the cost. Unfortunately we missed the May Day parade by ten or fifteen minutes, but we did see the back end of the final marching band.

We also went up the Empire State Building. It is an ornate building. The view is good, but it was grey and drizzling when we went up. Buildings just don't have the same work put into decoration these days. They are all far too utilitarian, especially considering how long they continue in use.

We also shopped at the Nintendo World store. It used to be a Pokemon store and perhaps a third of the store still is. Courteney had fun looking around. They had a small display of historic Nintendo stuff including ROB, but alas no Powerglove.

Finally we took a walk in Central Park. It's a fairly nice park. I am amazed that it was zoned off early enough to be such a large chunk in the middle of the city. The part of the park we walked in did have frequent large stone hills, though.

I said I'd have other things to say about New York and here they are. Firstly there are often more cabs on a street than private vehicles. It is an experience to be a lone blue truck in a sea of yellow. Secondly, here is how a New York cabbie (and even a city bus!) changes lanes: First change lanes. Well, no. There are actually two steps, first put two tires into the lane you want, then change lanes. It seemed that the only traffic directions that are followed are one way streets and red lights. Courteney commented that it looked like a dance. I responded that this is good except that I'm driving the clumsy fat man.

New York also has excellent delis. I didn't expect this, but New York is the first city we've visited that I could see myself living in. I just wouldn't own a vehicle.

And with that I have an observation: American drug commercials are much more boring than their Canadian counterparts. There is very little entertainment and a long list of side effects. I think that television ads would be much better all around if all industries were required to do the same thing as drug companies.

Well, we escaped from New York and are fast approaching the north-east corner of our trip. Until tomorrow.

KM 15738: New York, New York

Well, we have arrived in New York. In driving here I have learned a couple of things. The first is that New Jersey is basically a suburb of New York City. I once heard parking in New York described in two ways: if there is a free spot, it is illegal to park there; and if you find a spot, look at it every half hour, and if you see the entire city looking for a new parking spot, do the same. Now I haven't seen the latter, but the former is entirely true. This city doesn't have any parking.

So we arrived in the afternoon and managed to find parking and a hotel room. Inconveniently they are not the same company. After putting our stuff away we grabbed a late lunch and then walked to see the Statue of Liberty. On the way we passed Wall Street, though not the portion in the financial district. We got to the park and looked upon the island from a distance. The ferries had stopped running for the evening so we couldn't get there.

On the way back the skies opened up. Not the nice kind of opening either. It just started pouring. Now we didn't have our umbrellas because we left them in the parking garage. We also didn't have our coats because it was nearly twenty degrees out and too warm. Needless to say we got thoroughly soaked during our twenty minute walk back to the room.

I have a few more things to say about New York, but they'll have to wait until we are done playing tourist tomorrow.

KM 15538: Lenhartsville, Pennsylvania

Today we visited the NSA's National Cryptologic Museum. It is a small museum placed inside an old motel. It is right near the main NSA buildings, which are interesting to see from a distance through the barbed-wire-topped fence. The museum is primarily historical, with many examples of the machines used throughout American history. I found the machines and developmental history from WW2 to the present interesting to see; that's right, they have some examples of current technology on display (well, current visible technology).

The highlight of the museum, however, must have been the two working Enigma machines, open to the public. I of course wrote a short message. Unfortunately not all the lights were working and each ? represents a letter I couldn't read. My message is "VMSM BESN GILE GFT? LSHG RXRX VE?T LNSP"; unfortunately I don't know which rotors were used or the plugboard configuration, so I couldn't decode it easily. It was nifty to use, though I wish it had a teletype output like some other encryption machines there.
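That rotor-setting problem is the whole trick, of course: without the starting positions the ciphertext is useless even to someone holding an identical machine. Here is a toy sketch of the idea in Python. The wirings are invented (not the historical Enigma wirings), and a real Enigma also had a plugboard and ring settings that this skips entirely; it only shows the rotor-and-reflector core and why the machine is its own inverse.

```python
import random
import string

ABC = string.ascii_uppercase

def make_rotor(seed):
    # An invented wiring: a fixed, seeded permutation of A-Z standing in
    # for a real rotor's internal wires. Not a historical rotor.
    wiring = list(ABC)
    random.Random(seed).shuffle(wiring)
    return "".join(wiring)

# The reflector pairs letters (A<->B, C<->D, ...): an involution with no
# fixed points, which is what makes the whole machine self-reciprocal.
REFLECTOR = "BADCFEHGJILKNMPORQTSVUXWZY"

class ToyEnigma:
    def __init__(self, rotors, positions):
        self.rotors = rotors        # wiring strings, leftmost rotor first
        self.pos = list(positions)  # current rotor offsets, 0-25

    def _step(self):
        # Odometer stepping: the rightmost rotor moves on every keypress
        # and carries leftward when it wraps around.
        for i in reversed(range(len(self.pos))):
            self.pos[i] = (self.pos[i] + 1) % 26
            if self.pos[i] != 0:
                break

    def press(self, ch):
        self._step()
        c = ABC.index(ch)
        # Signal travels forward through the rotors, right to left.
        for wiring, p in zip(reversed(self.rotors), reversed(self.pos)):
            c = (ABC.index(wiring[(c + p) % 26]) - p) % 26
        c = ABC.index(REFLECTOR[c])
        # Then back through the rotors, left to right, via inverse wiring.
        for wiring, p in zip(self.rotors, self.pos):
            c = (wiring.index(ABC[(c + p) % 26]) - p) % 26
        return ABC[c]

    def run(self, text):
        return "".join(self.press(ch) for ch in text if ch in ABC)

rotors = [make_rotor(s) for s in (1, 2, 3)]
ct = ToyEnigma(rotors, (0, 0, 0)).run("ATTACKATDAWN")
# An identical machine at the same starting positions decrypts it:
assert ToyEnigma(rotors, (0, 0, 0)).run(ct) == "ATTACKATDAWN"
```

Because the reflector has no fixed points, each keypress is an involution, so typing the ciphertext back in on identically configured rotors recovers the plaintext; start even one position off and you just get different gibberish.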

It takes three to four hours to fully examine everything in the museum and so isn't an onerous stop like some other museums.

After the museum I got an oil change, I needed one after over fifteen thousand kilometres of driving. Now I just need to figure out why my gas gauge is only reporting the propane level (and not the gas level which is what I normally want) and things will be fine.

In driving to this particular campsite we passed through the Pennsylvania-Dutch region. I have been informed by Courteney that the Amish frequent this area. We did see one horse and buggy sign, but no actual horses or buggies.

KM xxxxx: Ellicott, Maryland

Yesterday we did the tourist thing around Washington, D.C. Our first stop was the Washington Monument. It is basically just a tall stone tower. There was a tour available, but the timing didn't work out.

Then we went and looked around the Library of Congress for a couple of hours, including the tour. That is a beautiful library. Everything is quite ornate. The main reading room is a beautiful finished oak and stucco room. Perhaps the coolest thing that I saw there was the grooves worn into the marble steps. That is how you know that it is a well used library. The Library of Congress is also a public library and if I were a citizen and had the time I could get a library card there and access to the thirty million books.

After the Library of Congress we went to the museum of natural history. That was nice and they have a good collection. We didn't see all of the museum, but instead focused on the hall of mammals, the dinosaurs and the gem exhibits.

An interesting fact about Washington is that all the large government buildings are made of sandstone. This is a stone that erodes quickly in wind and rain. I think it a bit of an odd choice for the construction of a city that is intended to last forever.

Now as I said yesterday parking is at a premium in Washington. Yesterday we didn't drive to these various destinations, we walked. The city isn't as bad to walk, though it is a bit too spread out for easy walking. We likely walked ten or fifteen kilometres including walking around the various spectacles. It wasn't that bad, but my old running shoes are not made for walking.

We did see the White House briefly, but didn't have time to walk past it and instead drove past it on our way out. As you might imagine the lawn and surrounding land is not designed with the tourist in mind, but instead for segregating the President.

Well, we are done with Washington and I am happy. There is one more thing that I saw lots of which I failed to mention yesterday: men in suits. There are men in suits everywhere.

KM 15194: Sterling, Virginia

Yesterday we drove a bunch and went into Washington D.C. We tried to make a tour of the Library of Congress, but traffic and the lack of parking made us miss it. We then thought that we would at least see the Washington Monument. Unfortunately there is nearly no parking there either. In the end all we accomplished was driving around wacky streets through rush hour traffic for an hour.

For those who may not be aware Washington, D.C. is just a city. During my impromptu driving tour all I really saw were the following: large imposing mortar and block buildings, road blocks, association branches, black SUVs with tinted windows, restaurants and police standing around impeding traffic. I'm not even going into the layout of streets and intersections, but I will say that gridlock is nearly unavoidable.

If the city is any indication then the government it contains is badly designed, badly run, inefficient and specifically designed to make it difficult for outsiders. I can't wait to truly see Ottawa and compare.

Today is more touring, but this time on foot because driving is a frustrating waste of time.

If you've been following the weather in the northern USA you may have noticed that the weather just isn't that good. Specifically North Dakota appears to have been flooding for the past month or two. I am beginning to get concerned that we'll need to find our way through that mess. It might be fun, but would make camping difficult.

KM 14692: Fort Oaks, North Carolina

Today was the second day getting up at six in the morning. We woke before the sun and Courteney didn't fall asleep all day. There is hope yet.

We just covered some ground today. We have no plans for either of the Carolinas and so are moving to our next destination, Washington, with good speed. Really we didn't see much at all today except forest and highway.

My observation for the day is that Americans really like air conditioning. Nearly every building has it and people will leave their car running while parked for extended periods just to keep their vehicle cool. What I find most odd is when it isn't terribly hot, perhaps thirty degrees Celsius, and you can see the water nearly streaming out of the condenser of people's vehicles. I am amazed that they consider cold air so valuable and fuel so cheap that they would do that. I wonder what is going to happen when energy becomes expensive.

Americans also have an extreme aversion to taxes. You have likely heard news of various Tea Parties. Now taxes are neither good nor bad. I believe that part of the dislike of taxes comes from the badly run programs which they fund. I further think that most of these programs are badly run either because they are underfunded or because they have a partnership with a private party. There is a strong, nearly religious, belief that private enterprise is necessarily more efficient than government. It seems to be taken to such extremes that even when a government office is efficiently run it is screamed that a private business could do better.

The final observation is that the United States is very religious. Not necessarily Christian, but religious nonetheless. There are many tarot and palm reading shops. Every paper has horoscopes. I am beginning to believe that something in their culture requires one to hold religion in their heart. Something to dwell on.

KM 14240: Hollywood, South Carolina

Well, yesterday we did stuff! The first thing we did the previous night was to decide that we would try and solve Courteney's bug problem by shifting our schedule to avoid them. This means we are getting up earlier and stopping earlier such that we can be fed and in the tent before dusk, when the bugs are the worst. So yesterday morning Courteney saw 6:00 AM for the first time in a long time. I hope she gets used to it, because she was falling asleep all day.

After dragging Courteney out of bed we went to see Savannah, Georgia, specifically the historic area. It is a nice place with some interesting architecture and construction. I especially found the old brick paths and cobblestone roads interesting to see and walk on. This got me thinking that eventually I would like to see the old cobblestone roads in Europe, the ones with grooves worn in by ancient wagons.

After walking around for about two hours we headed out to South Carolina. We actually entered South Carolina twice. The first time we missed taking a picture of the sign because we didn't expect it to come so soon. So we had to turn around and cross the state line again.

Our first stop in South Carolina was a park on the coast where Painted Buntings are reported to hang out. We went there and looked around for a bit. Unfortunately it is a swampy forest and Courteney didn't want to go too deep. So unfortunately Courteney didn't get to see a painted bunting this time around.

After that we drove for another hour or so to our camping spot for the night. Here we saw an odd looking bird that we dubbed the turgoosen. It looks like a goose, but has a red flesh mask around the eyes and an odd colouring. We looked it up and it is some sort of duck, but turgoosen is a much better name.

KM 13832: Brunswick, Georgia

Yesterday was a short driving day because we miscalculated how long it would take us, but we had booked a motel room. Hopefully this won't happen again since we are back to camping. We are going to try avoiding the bugs that make Courteney break out by shifting our day to be earlier. That means that this morning I woke at six and Courteney at 6:30. She'll get used to this time of day existing eventually.

The drive yesterday was uneventful. After arriving at the motel around 2:30 PM we checked in, did some minor sewing repairs and headed to the pool. We enjoyed ourselves until the threat of sunburn became too great. In all it was a quiet restful day.

I think I will soon have another round of observations so stay tuned.

KM 13536: Orlando, Florida

Today we visited Universal Studios. It's a nice park and they have put a ton of effort into theming the buildings. Most of the day is hurry up and wait: walking quickly from ride to ride just to sit in line for anywhere from ten minutes to over half an hour. I am sure glad we didn't arrive on a weekend.

There are a couple of interesting rides. The most fun is the Men in Black one. It is just a rail shooter, except with real rails and we do the shooting. The most advanced must be the Simpsons ride. It is just a large advanced simulator. It begins as a roller coaster and for the first ten or fifteen seconds I truly believed and felt that I was on a coaster. It was wild.

The most surprising thing I saw was a projection of a person onto something I can't explain. It appeared very life-like except for the fact that the person was flat. I wonder what sort of screen it was, I didn't see anything that could act as a screen.

In some ways it was a good thing that we spent a lot of time standing in line. All the lines are covered over so you don't bake in the sun. It was supposed to be ninety degrees Fahrenheit today and I would believe that it got that high. I sure am glad I had my hat.

We had lunch at a fifties-themed diner, which was neat, and my hat looked right at home.

KM 13520: Orlando, Florida

I forgot to mention that at the Space Centre I had the opportunity to see a 3D Omnimax movie for the first time. I am able to report that it worked surprisingly well. Live action segments worked less well than computer generated segments. This is because in the latter everything is in focus and you can look at whatever you wish. With live action some areas are not in focus and were uncomfortable to focus on.

It'd make an interesting interface for a video game and I can see that movies may eventually turn to 3D.

Well, today we saw Seaworld. Seaworld is a nice place, but nearly nothing like an aquarium. Though there are some exhibits and places to see animals they are not the focus. Instead it is the shows. They are all well put together. The big show is the Shamu killer whale show, but make no mistake about it, the best show is the dolphin show.

It really is a pretty nice place. There are enough shows to fill most of the day and enough to see to kill the time between shows.

After Seaworld we went out to our birthday dinner. We went to a seafood place (watching them all day made me hungry). It was good and plentiful. We even split a piece of key lime pie instead of birthday cake. This is certainly one of the more low-key birthdays I've had.

KM 13474: Orlando, Florida

I saw the most unexpected thing yesterday: BC plates. We were leaving the Kennedy Space Centre, walking through the parking lot, and we passed by an SUV wearing them. It entertained us both for a good fifteen minutes. Especially since we saw them leaving and arranged to be on the road right in front of them so they knew we were there as well. I didn't think anybody else would be crazy enough to drive down.

Anyways, we saw the Kennedy Space Centre. We only had half a day there so we didn't see the entire park. I would say that we saw approximately two thirds of it. The part that we did see seemed to be heavily biased towards videos. I suppose that is alright, just different from any other educational exhibit I've ever seen.

In traveller news, Courteney is allergic to mosquito bites and bug bites in general. She has had a case of the hives for a couple days now. Before that certain bites would swell to enormous size and stick around for a couple of days. It is quite sad and she has gotten itchy. We've gotten some antihistamines, but she still is covered in bumps and itches when the medication wears off.

We have no idea how we are going to deal with this.

KM 13224: Fort Pierce, Florida

It is hot here in Florida. It isn't as bad as I imagined it would be, but it isn't nothing. Yesterday we drove for several hours and didn't leave the urban area until late in the afternoon. The entire coast from the tip of Key West up to about fifty kilometres south of here is all urban. We travelled for somewhere around four hundred kilometres without leaving the city. Madness.

We didn't do much other than drive yesterday, all day. However, since we almost never left the cities we didn't cover much ground. The lower speed limit, traffic lights and traffic itself held us back. It sucked.

Today and the next few days are looking to be better though. This is mostly because we are going to drive little and take in a few of the attractions of the area. The current plan (because we have already bought tickets) is to see the Kennedy Space Centre, Sea World and then Universal Studios. Each will be on a day by itself. Disney has lost out on this round of sightseeing. It is targeted mostly to younger children, is more rides and shops than anything and is incredibly expensive. A single adult ticket for a single day to the Disney park in this area (I can never remember if it is -land or World) is $120US. For what a single day there would cost us we could spend a night in a higher-end hotel or eat at a restaurant where formal dress is required or travel for a week.

It is outrageous.

KM 12917: Key Largo, Florida

Except for melting while driving, since my truck doesn't have air conditioning, we had a rather nice day. First we got up and hung out on the beach for an hour or so. I could almost have stayed all day, but it was getting very hot and the water there is too shallow to take a dip. So we moved on and headed south. After a fairly short drive we arrived at the most southern state park, where we walked to the beach and took a look around. We didn't actually see the marker for the most southern point because we didn't find out it existed until we were already on our way north. It isn't a big deal.

Then we started the third leg of our journey and are heading north. We haven't seen anything new yet because we didn't make it out of the keys, but we should today.

I am beginning to appreciate one more thing about the desert and that is the lack of bugs. We didn't notice it at the time when they were missing, but I sure notice the mosquitoes and no-see-ems when they are here. I hate being part of the food chain again.

Camping is easy when it's cold, hot, muggy, windy, sunny and overcast. Camping is easy when dry, raining or snowing. On rocky ground or soft ground, camping is easy. Biting bugs make camping hard. I fear we have just entered the hard part of camping.

KM 12637: Long Key, Florida

Florida is hot. Yesterday it was 32 degrees in the shade during the afternoon. At least the humidity wasn't bad. Yesterday we did a couple of things. The first was wake up, eat and shower, though not necessarily in that order. Then we drove east.

Yes, yesterday we arrived on the east coast of North America. We aren't quite as far south as we need to be before we head north, but almost. We took the toll road called Alligator Alley across what I believe is referred to as the panhandle. It's a good driving road and the $2.50 toll isn't that bad considering that it saves over thirty miles of driving. However, you see nearly nothing. The side of the corridor is about a hundred feet off the road and fenced off. It was really the wrong road to take on this sort of trip. We would have been better off taking the slower US highway. Oh well, live and learn.

But we did arrive on the east coast near Miami and headed south into the Keys. From the hundred miles or so of the coast that I've seen it looks like one large urban/suburban area, all the way down the coast. There is no break between cities and little or no natural forest left. It's a bit disturbing.

The keys themselves are interesting. They are just one-street highway towns with an east and a west beach. Seriously, all the ads locate the stores by the mile marker on the US1. It was neat when I was able to see both beaches through the trees at the same time. This would be a nice place to vacation for a longer stay if it weren't for two problems. The first is that there are stingrays in the area and you must shuffle when moving in the water or risk being stung and speared. This is made worse by the fact that the ocean around here is very shallow. Courteney and I had walked perhaps two hundred feet out and the water was just above our knees. This was with the tide in. Being so shallow really makes it cruel with the sun pounding down. It was 3 degrees in the sun when we arrived in the late afternoon.

There are advantages to the area though. Though there isn't much in the way of stuff to do if you don't like strippers or beaches, there is the clearest view of the sky I've yet seen. There are many more stars than you see in most places. It is truly more on par with what I had thought a clear desert night would be like. Alas, all the places in the desert suffer from rather severe light pollution. This might be the best view of the stars we are going to get on the trip because Key West seems heavily populated and we could see some night glow from that direction and the Arctic will have twenty-four hours of daylight when we arrive.

Relatedly, though the stars are familiar they are in different positions. I guess this is to be expected when one is six thousand kilometres from home.

KM 12308: Naples, Florida

Yesterday we left the hospitality of Stacy and Jimmy to continue our journey. We continued south. Courteney didn't much enjoy the driving yesterday, mostly because she was behind the wheel. I was in the passenger seat handling all the regular navigator duties as well as emptying my head of cat-induced gunk.

Up to now Courteney has held onto the dream of bringing her cats when we move in together. Not a chance. The three days I spent being slightly allergic to the cat Mystic saw me being completely stuffed up and not sleeping right at all. It was so bad that now I have a cold. Likely because my immune system was otherwise occupied fending off cat smell.

I likely have more to say, but it is hard to collect thoughts when I have an aching overpressurized ear and have to empty my nose every five minutes. There is one thing I have been wanting to broadcast and may have already. If so please excuse me. It is perhaps a sign of the times when as many as every fourth billboard is an advertisement for advertising.

KM 11939: Palm Harbor, Florida

Yesterday was another day at Stacy's. We spent it travelling around a town called Treasure Island. Treasure Island is really an island with three miles of beaches and a bunch of shops. It'd be a fairly nice place to go and spend a weekend partying, especially since you can drink in public. We had a pretty good dinner at a restaurant-bar right on the beach.

After dinner we went to take a walk in the Gulf of Mexico. The Gulf is much warmer than my previous short dip in salt water. It is really much like a warm lake. So Courteney and I walked in not quite up to our knees because we hadn't brought swimwear. The sand is very soft and very white. I can see how it gets full on sunny weekends.

We were also given a tour of the local police department by Jimmy, Stacy's husband and an officer in the Treasure Island Police Department. It is a pretty nice place, but a bit of a drive from where they live.

While driving I also heard something on the radio: a car dealership offering -0.9% financing. That is a sure sign of the times. At least to car companies money is worth more now than in two or three years, which breaks an important assumption of the economy. In some cases it is a better investment to hide money under a bed than to pursue other investment options.

KM 11939: Palm Harbor, Florida

Well, yesterday was the second day staying at Stacy's place. Yesterday we went to the Aquarium. I haven't been to an aquarium in years, perhaps since elementary school. The variety of life in the sea is amazing and many of them look oh so funny.

We got off to a bit of a late start, but managed to see most of the wildlife, though not most of the shows. That's fine though. Seeing everything at a park or aquarium or what have you is truly a tiring experience.

After going through the aquarium in Tampa and having a nice dinner in a shopping courtyard which allows open liquor we went back to the house. It was a pretty quiet night though Stacy made us watch Twilight. I didn't get much out of it, but it's a movie for teenage girls. Apparently there is a second one coming out, I don't think I'll be first in line to watch it.

And this is the second full day of living with a cat. For those who aren't aware I am slightly allergic to cats. Courteney of course loves cats and would very much like to have one when we eventually move in together. I'm not sure how that's going to work out. I've been stuffed up since I got here. I wasn't always allergic, so maybe I can work out of it.

KM 11939: Palm Harbor, Florida

Yesterday we arrived in Palm Harbor at Courteney's sister's place. Stacy met us with open arms and a hug. The drive was pretty uneventful and we spent the afternoon just chatting. There wasn't much planned so we had an easy dinner in and an easy evening. We basically just chatted more with a trip to a local bookstore chain for some change of scenery.

After getting back we watched a bit of TV, specifically a show called The United States of Tara. It's an interesting show, but I'm not quite sure how to describe it yet.

KM 11733: Old Town, Florida

Yesterday we continued our way down the western side of Florida. I can't say much interesting happened, because it didn't. We ended up getting an early start because the threat of rain had caused us to put everything that could go away in its place before we retreated to the tent. There was little to clean up and only cereal for breakfast.

We made good time and arrived in Old Town in the early afternoon. This even though we lost an hour due to time zone changes. We lazed the afternoon away eating discount Easter chocolate, reading and playing video games. The weather was nice, though I found the humidity uncomfortable during the drive.

And that is pretty much it. We are about a hundred miles shy of Courteney's sister's place and will make it with time to spare today.

If all my entries were like that all my readers would leave and never return. In the interest of keeping those eight people happy it's time for a new section in this blog. I like to call it "Things Which Disturb Travis". This first incarnation will deal with advertising. One would think that advertising is pretty much the same between Canada and the USA. We have many of the same major stations and relatively similar cultures and laws.

This is true in general, but there are specific cases which give pause. The first are the hospital ads. I had never expected advertisements prompting you to choose one hospital over the others. Yet it is so. Even more surprising is that they prompt you to choose their hospital over another when in an ambulance! I'm not sure it is entirely ethical to suggest that people who are heading to an emergency room choose a particular hospital over the closest. If you truly have time to waste travelling then perhaps you don't quite need an ambulance?

The particular advertisement I am referring to is a radio ad (we see very little television) where one actor has had a bit of chest pain and his concerned wife 'foolishly' suggests that they go to the nearest ER. The husband then says something about her not loving him. She retorts by mentioning that he had said he thought it was just heartburn.

I find the message that you should go to the hospital when you just have heartburn even more reprehensible than the one that you should waste an ambulance's precious time to travel to a farther hospital.

I think that's enough soapboxing about that for now. I'll just mention the other two sorts of ads that I've seen that have weirded me out. The first is an ad for a toll bridge. Who advertises a bridge? If you need advertising then perhaps you shouldn't have built a toll bridge there. The others are ads for cosmetic surgery. I'm not sure why we don't see this in Canada and perhaps I just haven't paid attention, but why is this necessary on a billboard?

Some things here are a bit off. Though when I listen to their talk radio there are two things that are worth mentioning. The first is that there is no such thing as liberal or even centrist talk radio; it is all right wing or extreme right wing. The second, keeping the first in mind, is that there appears to be no distinction in the American political sphere between socialism and communism. None at all. I find this interesting at a time when the US government is slowly moving towards nationalizing the largest corporations in the insurance, banking and automotive sectors. Very interesting indeed.

KM 11442: Mariana, Florida

Yesterday we entered Florida. Our introduction to the self-acclaimed Sunshine State was dark grey skies. Further, when we stopped to camp for the night it was raining. Not a light sprinkle and not a torrential downpour, but somewhere in between. This gave me a good opportunity to try my tarp-canopy-off-my-truck idea. Basically I have a bar that clamps to the window opening of my camper on the box of my truck. To this I tie a tarp, which is held up on the other end (or in the middle, depending on how windy it is and how much covered area I need) with two extending poles.

It worked fairly well, though the ground was soft and the pegs holding the rope down came out a couple of times.

After making and eating a fine dinner of soup and grilled cheese sandwiches on my tailgate we proceeded to set up the tent under the tarp. The rain had gotten lighter as we were doing this until it eventually stopped. We were undeterred and kept with our plan of strongly tarping the tent. In the end we had our tent inside a large pup tent. All prepared for the worst rain possible we retreated to the tent.

It is important to realize that upon arriving and asking for a tent site the keeper questioned our sanity. In the last three days the area had received over twenty inches of rain. Many roads and most fields were flooded. We being young and sure of ourselves held our course.

The rain never did return and though the ground was still exceptionally wet in the morning the sun peeked out to warm our backs. I hold that we brought about an end to their biblical rain.

KM 11096: Alexander City, Alabama

For those who are not keeping up with their calendar holidays yesterday was Easter Sunday. In Canada this usually means that most places of business close early and you can expect to gain five pounds before Monday. Not so in the United States. In the United States it appears that nothing except gas stations and movie theatres is open.

We are moving towards Courteney's sister's place. We needed to call her sister in order to coordinate our arrival. I don't know how many of you have ever made a long distance call on a payphone using coins. I have and it is an expensive and painful experience. First you need to, generally, put in four or five quarters to connect the call and receive two or three minutes of talk time. Then you need to keep feeding the machine quarters at a rate of about two a minute to keep talking. With this in mind we set out to find a payphone that took credit cards. They are fairly common in Canada. Not so in the USA. We went to all the regular hangouts for fancy payphones: corporate gas stations, malls (which were both closed) and movie theatres. Nada. Instead we discovered a payphone with a dollar for ten minutes of USA talk time.

So we put in our money and got... an answering machine. Oh well. We move on to our camping destination for the evening.

We arrive, I call my parents from a different payphone (I call collect though and don't have to suffer entering a mountain of quarters I don't have) and we set about making Easter dinner. Since it was Easter dinner we went all out. We had half a small ham with an orange marmalade glaze, mashed potatoes and a green salad. We even had dessert. It wasn't Courteney's usual dessert fare of a pie. Instead we just had a third of a large honeydew melon.

As we were finishing dinner it was starting to rain slightly so we retreated to the tent. Now I refuse to put up a tarp unless it is really raining because often it rains, but not enough to actually need a tarp. I'm a lazy person and hate doing work I don't need to. Thus the tarp didn't get put up before we retreated to the tent.

Well, at about one in the morning it started to pour with thunder and lightning. This meant I had to get out and put the tarp on. I convinced Courteney to help me, which was nice. The tarp went up quickly and though the tent is damp we did not get wet. Of course now I need to fold the tarp.

I suppose that it is time for an update on the housing situation here in the United States. You may remember that several weeks ago I commented on how small the houses all were. Well, I am nearly done half my tour of this country and I don't believe it has really changed. Most of the houses I have seen are a small single story. In fact I have seen a good number of mobile and modular homes, which are small by their nature. I did see relatively large single-story houses in Louisiana, but that was in a rather affluent looking farming neighbourhood. I had expected more two-story houses there, but didn't see them.

Something to consider is the price of a double-wide modular home here. I have seen ads showing them as starting at $45,000. I can see how that may be attractive to some in lieu of a larger house on a larger plot.

KM 10765: Huntsville, Alabama

We had a late start today. First we went to the all-you-can-eat breakfast this KOA we were staying at put on. Then we leisurely tore the camp down. Then we went on a short, un-scenic tour trying to find a visitor centre to find out what we could spend a couple of hours doing. We did this because we didn't want to cover too much ground today.

After failing to do that we visited the only attraction we knew was in this town: the Grand Ole Opry House. So we did that and walked around the mall that is nearby for an hour or two.

Looking isn't really all that fun if you don't have any money to spend. So we went and found a minigolf course. After a gripping eighteen holes I stood as the victor and fun was had by both of us. This particular course had one hole which started on the second floor of a small building and put the ball through several pipes and four or five green areas. It was pretty neat.

After that we headed to a state park in Alabama. Now I used to think that pay showers were the norm in state parks when we started the trip. This is because that's all we saw in Oregon and California. However, the last few parks that we have been at, if they've had showers they've been free. It may have been the case that it is only the desert states which charge for showers. But then there is the one park in Nevada we stayed at which had free showers. In any case, having a free shower everyday is nice.

KM 10527: Nashville, Tennessee

Today we left Mississippi and entered Tennessee. As can be seen from the mileage we didn't do much other than drive. We did however stop in Memphis for lunch at another pie restaurant. Their famous pie was banana cream. It was pretty good.

We both had toasted sandwiches for lunch and regretted it. You see last night Courteney made a quiche for dinner using my makeshift oven. It turned out well, but very hot. We both burnt our mouths eating it. This of course makes crunchy toasted bread an uncomfortable thing to eat.

After Memphis we drove to Nashville. The land in between is much the same as Mississippi, that is mostly flat farmland. Unfortunately we arrived in Nashville too late to get tickets to the Grand Ole Opry show; I doubt we could have afforded them anyways. Instead we just ate dinner and then went to bed around eleven.

Our next goal is to make it to Courteney's sister's place. We are trying to time it so as not to arrive early in the week. We fully realize that people have jobs. I have no idea how we are going to manage this timing when we are in Canada and have many friends and family concentrated in a small area.

KM 10004: Oakland, Mississippi

Mississippi is a long name. It is also mostly flat. We didn't do much yesterday except drive and we didn't see much except farmland. That's alright though, we did break ten thousand kilometres and are well on our way to Memphis and Nashville.

We did have some excitement though. At about one in the morning there was a thunderstorm here. Now so far we've been lucky and not had any bad weather. Sure some days were chilly, but there has been nothing more than a light sprinkling of rain and some wind. I didn't mind the thunderstorm at all. It rained a bit, but not terrifically hard. We didn't even feel it necessary to put up a tarp. We did get up and out of the tent to go and watch the storm.

I haven't seen a good thunderstorm since I moved out of Calgary. The milder climate of the lower mainland just doesn't promote thunderstorms. So I was quite happy to sit in the truck watching the storm through the windshield for a couple of hours before heading back to bed.

I have decided that the interstates are more tiring than driving the smaller highways. There is almost nothing to see, the roads tend to be rougher and, in my case, the higher speed limit doesn't matter. So all around the interstates seem only good for truckers and commuters.

KM 9348: Slidell, Louisiana

Louisiana is small. So small in fact that we drove most of the way across it, taking a scenic route and still ended up in New Orleans in time for dinner.

This is the first time I've ever seen real swamp. It was a bit of an experience. If you look at it from a distance or don't pay too much attention it looks like the trees are growing out of flat dirt. This is not the case. Upon closer inspection you see that what you thought was solid dirt is actually still water thick with floating scum. How deep the water is varies from inches to feet but either way you are going to get wet.

The highways through this swampy country are also built in a fashion I consider odd. Reasonable enough are the built up sections where the highway is on a mound. Somewhat odder is where the entire highway is sitting on raised concrete platforms. These platforms go on for miles in some places. This is not to speak of the interstate going through New Orleans.

This interstate doesn't go through the city so much as above the city. It covers an immense area of the city with a platform raised perhaps fifty feet in the air. It is such an immense amount of effort to build a road which requires constant monitoring and repair to avoid becoming entirely impassable because of a chunk of missing concrete.

Anyways. We had dinner in NOLA, as I've seen the city referred to, on Bourbon Street in the French Quarter. We had some good cajun cooking while sitting on a patio overlooking Bourbon Street. We also walked around a bit. It's an adult playground in that area, perhaps more so than even Las Vegas. I prefer Bourbon Street to the strip though. It just seems like a more livable playground. Courteney phrased it as less artificial. Unfortunately the area is like Las Vegas in the regard that it takes considerable money to get the full experience. It takes money to sit in the various bars and clubs all day drinking and listening to live music.

I think this is one spot I will return to at a later time to experience more fully.

Now to finish my metaphor from last night. In football everybody wants to be the quarterback, the man who is in charge when it is time to succeed or fail. If you can't be this person then you hope to be the star of the defensive team. Even then you don't get fame equal to the quarterback. Finally, if you can't be either of those you just keep your head down and work hard to give the quarterback all the credit.

Now in hockey it is a bit different. Though there are leaders and there are stars there is no single position which is the centre of fame. The captain of the team could be a centre or a defenceman. Each player has a role to play, but that role varies depending on the context of the game. Each player is also given the flexibility to make a positive difference at any point in the game.

The point of this is that in hockey there is a team and each player can make a positive play for the team. Each person can accomplish something and be acknowledged for it. In football, however, the blocker doesn't get congratulated when they keep their target at bay. That is their job and they can't improve upon it. Instead all they can do is screw up and let their block go. They are scapegoats.

My impression of the culture of the United States is that there are winners and sheep with no middle ground. I don't find this same attitude in Canada where most people are seen as having a positive contribution they could be making.

This brings up another brief thought on the security precautions that are now prevalent in the US. They seem to be more about getting people used to being treated like cattle than actually accomplishing anything to increase safety. I foresee a negative future if this trend continues.

KM 8720: Small Town (Anahuac), Texas

As I write this I can't remember the name of the town I am in. Look for the name to be placed in parentheses when I get back into the truck. Right now I am going to call this Small Town Texas. And it really is. On the way to this campsite we passed a restaurant called Dolly's Quick Food. It is all farm country around here.

This campsite itself isn't necessarily a campsite. The book we have lists it as one, but there is no cost, no marked sites and no signs of any sort. All there seems to be is a poorly maintained bathroom and a community sized picnic table. I'm sure it'll be alright and we did come equipped for all variety of campsites.

Unfortunately we won't have any chance to make use of the tables and chairs and lights we packed. This area of the country is a bit more swampy. This means mosquitoes. We decided it better to avoid being eaten alive instead of camping in comfort. This means we put up the tent in record time and ate peanut butter and marmalade sandwiches for dinner inside the tent, protected from the bugs.

We did end up spending the entire day at the space center. It is pretty touristy, but does have some cool artifacts from the early years of space exploration on display. You can even touch a rock from the moon. It would have been fairly expensive except we were directed to the CVS Pharmacy across the street for some half off coupons. The center also has twilight tickets which allow entry in the last two hours of the day and the entire next day for a reasonable price.

The only really annoying part of the day was the security surrounding the tram ride. They have some cars which take you on a short tour of the NASA site itself. There are two major annoyances though. The first is that there are two tours which overlap. This means you would spend a bunch of time, perhaps as much as half a day, waiting in line and seeing the same things twice. Then there is the security. They waste immense amounts of time searching everybody's bags and putting everybody through a metal detector. This is silly enough, doubly so because they let you roam freely at the end of the tour in the rocket park.

The rocket park is where they have three rockets on display, including the Saturn V. It is at the edge of the campus and as far as I can tell unsupervised. Silly. Also, the security guards weren't paying much attention to what was happening and I believe it would have been easy to pass contraband across the road barrier.

Oh well, just more pointless, useless, time wasting pretend security. The rest of the center was worthwhile, interesting and entertaining. I'd recommend it to anybody interested at all in space exploration or history, but it is a full day if you want to take your time and experience the majority of what they have to offer.

Finally some philosophy. I am beginning to believe that the favoured games of football in the USA and hockey in Canada may be a good metaphor for some differences in the cultures. Now I may have heard this before, I don't claim this to necessarily be original thought.

I would go into the metaphor tonight if I weren't cooped up in a tent, hunched over trying to type in the dark. Instead I'll expand on it tomorrow night. Since tomorrow promises to be a driving day I should have plenty of time to ponder the best wording. We will also be leaving Texas.

Until then keep reading.

KM 8563: Houston, Texas

Again this is an entry for yesterday. This is because we were busy yesterday. First we drove perhaps a hundred miles east to arrive in the Houston area. Then we did laundry. After that we ended up popping by the Space Center here for an hour.

We didn't realize it was on the list and so we will be returning today. All this stuff meant that we didn't arrive at a campsite until late in the evening so there was no time to write an entry then.

We have officially left the desert though. I, at least, am glad. The desert is beautiful, but also harsh, dry and alien. We have securely entered forested areas again and this campsite we are staying at even had some slightly muddy ground.

I am also hoping that the change in scenery to something I am more used to will make things better. For the last couple of days I have begun to feel slightly homesick and sick of travelling in general. Never let anybody tell you that extended travel isn't hard on the mind and body.

KM 8199: Ottine, Texas

Yesterday we drove a bit and saw the Alamo. I had thought that the Alamo would be one of those old forts several miles from town. I was wrong. The Alamo is actually in the middle of San Antonio, right near downtown.

The majority of the missionary-turned-fort had been built over, but the church and what was once the barracks still remain. It is a bit odd to consider it as a fort. The walls are only eight or ten feet high and perhaps three feet thick of mortared stone. But a fort it was.

I didn't get that much out of it. Perhaps I would get more if it were part of my culture. It was free and I did learn that Spain and Mexico used to own the greater portion of the southwestern United States. This explains the heavy Spanish influence I've seen all over the place.

After leaving the Alamo we headed east a bit more to arrive at a state park. The park was mostly empty. During the night it was us, the park hosts and two pushy geese.

KM 7890: Junction, Texas

I repeat my statement that Texas is full of nothing. We had to drive about eighty kilometres to find a tire place. But we got there, bought a new tire and the small one is now snug back under the bed. Automotive excellence.

After we got that chore finished we headed to a state park in this area which is apparently an excellent place to view Painted Buntings. These must be Courteney's favourite bird because she has wanted to see one for years. So we arrived at the park and walked around with binoculars, water bottles and hats in tow.

Unfortunately we returned to the truck with empty water bottles, sweat soaked hats (it was about 35 degrees Celsius here today) and no bunting sightings. It turns out that they are not due to arrive for another week or two. Our timing was just a bit off. It was a nice walk in any case. We did learn that there is a migration group which mates in Florida; perhaps we will catch them there.

That about sums up today. I am currently having some email trouble at my regular addresses. If you want to get a hold of me I can be reached by a tertiary account, the username is travisb, the domain is sdf.lonestar.org.

As a note I am well pleased with my antenna eared laptop's wireless reception. Until next time.

KM 7566: Ozona, Texas

I once saw an old cartoon which depicted the construction of the great civil engineering projects, the Hoover Dam and the Interstates specifically. It showed a mass of men and machines moving across the landscape, levelling everything in their path. At the time I thought this was exaggeration. Today I learned differently. Today I saw half-mile sections of large hills dug out, all to avoid a slight bend in the road.

Obviously at some point, if not still today, Americans considered nature something to be overcome, not something to admire and work with. I'm not sure that I will ever quite understand it.

That is not all we saw today. We also saw some sheep. More interestingly we also saw a small oil field, perhaps five or six pumps, surrounded by a wind farm. I thought it a nice juxtaposition. We had seen a couple of wind turbines dismantled on trucks heading in the opposite direction. One blade was an extra long load filling a truck. They are likely quite the sight up close.

We made good distance across the nothingness that is western Texas. At times there was nothing around except the road and the horizon. We would have made more distance except that we had a mishap. For some reason the truck started to vibrate a lot. About thirty kilometres past Ozona we pulled off the Interstate to take a look.

It turns out that my front driver side tire grew a bulge. I'm not sure why, but it may be due to overinflation. When I began the trip I checked the pressure in all the tires, but it was quite a bit colder up there. When I stopped and saw the bulge I checked all the pressures again. They were all up around 40 psi; I had originally filled them to about 38 and had checked back in Oregon that they were below 40 when warm.

Anyways, off the highway I threw the spare on and limped back into town. Quite literally limped. Even though I have a full size spare it isn't the same size as my regular tires. I once thrashed my spare and picked up another out of a rack of miscellaneous tires, so it has a diameter two or three inches smaller than my other tires. So my truck has a limp.

Well, tomorrow I get to hit the tire shop before driving a few dozen miles to this evening's planned destination, a state park where we hope to find a male Painted Bunting.

KM 7148: Van Horn, Texas

It happened so slowly I didn't even notice it. I have started to think again. I am well on the road to recovery from burnout.

Over the last couple of days I've had subtle, but undeniable urges to create, to code. I haven't had the urge to code for perhaps a year and instead did it out of rote necessity. This change makes me happy.

KM 7148: Van Horn, Texas

Today was mostly just driving. We drove through Truth or Consequences, an interesting town name if there ever was one. The town itself isn't that interesting.

We also drove into Texas. They say that everything is bigger in Texas and I'm willing to believe them. Of course I've only really seen one city and a hundred miles of nothing.

I did however see a couple of interesting things. The first is that the highways here have two speed limits for each vehicle type: one for day and one for night. This is in addition to the different speed limits for passenger vehicles and trucks. I think this is not a bad idea, especially when sections of the highway have a daylight speed limit of 80MPH. I certainly don't trust most people to travel no faster than they can see.

The other is that some sections of the highway through El Paso have both a speed limit and a speed minimum. I've never seen that before.

Finally I have some photographic proof of a difference in emission standards. We saw an old RV which would leave a large cloud of brown fog behind it whenever it had to go up even the slightest hill. It was actually pretty disgusting.

I suppose I had one other thing. About sixty miles east of the Mexican border we passed through a border inspection station. They were obviously looking for illegal Mexican immigrants. I have to question how often they actually find any at that checkpoint. I mean, if you've made it across the US border into Texas and want to make it further inland do you stick to major highways all the way, do you try the rural highways or do you simply hike around the checkpoint through the dozens of miles of fields and mountains? Surely the locations of the permanent checkpoints are widely known among those trying to import themselves.

Well, there is no long entry this time around because there hasn't been much to see. I am sure hoping Texas gets more interesting soon. At least it is warmer and providing a cause to wear my new stylish hat.

Until tomorrow.

KM 6746: Elephant Butte, New Mexico

This entry was made in the morning before leaving because it was exceptionally windy and threatening worse.

Yesterday we travelled the greater section of New Mexico that remains. We did that while keeping ourselves busy on side trips. The first trip was to a small town which sits on the continental divide. We stopped in briefly for a slice of pie, which is only fitting for a town which goes by the name Pie Town. The restaurant was a small country cafe with perhaps six tables. The pie was good and Courteney found herself a book of pie recipes with which to experiment.

On the way back to civilization we stopped in at the National Radio Astronomy Observatory. In order to reduce noise from human radio and electronic sources it is situated in the middle of nowhere. It was fairly neat to see the large dishes, example images and an explanation of how it all works.

I'd like to describe more, but find myself incapable. So I will move on to where we camped. We camped in a state park called Elephant Butte. It is just outside of a city called Elephant Butte and just off a nice little lake called, imaginatively, Elephant Butte Lake. I didn't expect to see a lake in this part of the country, but here it is.

We stayed in what is described as a developed camp spot. This means that there is a permanent structure on the campsite. It was big enough to put the tent entirely inside with room to spare, and it was looking to storm in a big way, so we did just that. In hindsight this may have been a mistake. The storm never came, but the wind sure did. The structure had a concrete floor, so we couldn't peg the tent down. As an alternative we weighted it down and tied it to the structure. This kept the tent mostly stationary, but it would still shake, and the loops where the tent is meant to be pegged shifted position regularly with a screech. We made it through the night though.

In the morning we were greeted by many birds of perhaps half a dozen varieties. This was a nice change.

KM 6306: Bernardo, New Mexico

Today we did several things. The first wasn't especially fun, but is something that I do because I like eating: we went grocery shopping. After getting this chore out of the way we moved on to something a bit more fun: hat shopping.

Now I've only known Courteney to enjoy two types of shopping: shopping for books and shopping for video games. However, she had a grand time when we went to shop for hats. Even though it was a small shop it had probably two dozen different styles of women's hats, beyond the regular assortment of men's hats. Going in, as I've mentioned, I knew what kind of hat I wanted and the only decision I truly had was the colour. Courteney on the other hand had made no decision and so tried on every hat in the shop, often to comedic effect. Pictures will come eventually.

I decided on a brown wide brimmed fedora and Courteney decided on a blue and green wide brimmed straw hat, even though the Outback hat looked just as good on her. With this chore done it was time to pay Santa Fe a visit.

We went to Santa Fe with one goal in mind, to eat pie. Courteney has a book full of pie recipes from restaurants and bakeries all over North America. One of these pies is a giant apple pecan pie. This is the pie we went to eat. It was indeed enormous. Courteney and I shared a single piece and it was a good idea. The slice was as large as a normal slice of pie, but the pie itself towered at least four inches tall, perhaps even five inches. The restaurant itself is situated in a small historic square in Santa Fe, near the cathedral. It is a rather nice spot to have lunch and take a walk. Anyways, the pie is topped with a delicious Mexican caramel and pecans. The filling is made up of spiced apple slices, cut real thin. It was excellent pie and Courteney will have to work up the bravery to make it in the future.

As we began to digest our meal and dessert we headed south towards our next destination. Along the way we passed a Girl Scout building. After some turning around due to a misunderstanding on my part, we arrived at this building looking for the fabled Girl Scout cookies. They are rumoured to be good, perhaps even better than the mint cookies of the Girl Guides. Unfortunately it turns out that they stopped selling cookies last weekend. Both of us had forgotten that they have these stores and so we never sought one out. This is one item which will have to remain on the list for the foreseeable future.

Beyond that we just drove. Today we covered a number of miles because there isn't much out here. New Mexico is good, as far as I see, only for raising cattle. And cattle don't make interesting landscape.

New Mexico does have something going for it though: it is the coldest spot on the trip. When we first entered the state it seemed that the whole place sat at seven thousand feet above sea level, so it is fairly high. The highs while we have been here have been in the mid fifties (cool) and the lows have been a little below freezing.

KM 5968: Albuquerque, New Mexico

We woke today to fresh snow. I suppose this should be no surprise, but it was a nice one. We spent most of the day driving and were in four different states today.

This is only possible without a monumental drive because we drove through the Four Corners, the only spot in the United States where four states meet. It was a bit interesting.

Other than that we only saw more desert. Now for those of you who haven't travelled through more deserts than I can count, you mustn't make the mistake of thinking all deserts are the same. There are in fact many nuances. Is it sandy or rocky? Are there mostly shrubs, trees or cacti? What colour is the rock? I've seen various shades of grey, yellow, red, pink and even green. Is it a deathly dry desert like Death Valley (a well deserved name) or a relatively wet desert like the Interior of British Columbia? Finally, is the desert as flat as Saskatchewan, Alberta or BC?

Each combination produces a unique type of desert. I feel that by this point I must have driven through most of them. Both of us are getting rather tired of deserts. Of course the most interesting desert I have seen is the forested desert near the Grand Canyon. It is a desert, but with a full-fledged forest. Sure, I didn't get the pine needle floor that I'm used to, but it was a forest that was also a desert. An exceptionally dry desert that reached lows below freezing both days we were there.

It has finally become sunny enough, for long enough on this trip that it has come time to finally get a hat. After much consideration I have decided on a wide brimmed fedora. This isn't quite as good for keeping the sun away and the head cool as a straw cowboy hat, but it is more practical in city settings and for keeping in the small cab of my truck.

We are staying in the world famous Albuquerque Holiday Inn. Alas, the towels are not oh so fluffy. Though the room is slightly nicer than the rock bottom economy motels we have previously stayed in, it is certainly not worth the extra cost. My faith in Weird Al has been diminished.

An interesting fact people may not know about me is that I have never driven myself through a drive-through. Not a bank, not fast food. Today I did a first. I ate at a drive-in. Down here in the USA there is a fast-food chain called Sonic. They are a drive-in. Our experience went almost perfectly, except that the booths are very narrow, so once in the booth we couldn't open the doors wide enough to get out. The speaker system was also too quiet, which caused problems ordering. Finally, because I couldn't understand the voice I didn't quite get that cash was accepted at the booths. This was made more confusing by the card reader and keypad at the booth. So after hitting cancel on the keypad when prompted to enter a card to pay the bill, I assumed that the order had been cancelled. I then backed out of the tiny booth to park and go into the restaurant. Just after we parked the server came out with our food. Cash was exchanged for food and everything worked out.

I still haven't recalled my previous observation, but I do have another one. This one is brought to you by the convenience of having a television. If you watch television here you notice a couple of things quite different from Canada. The first is that there is more news, though, interestingly, it is all repeated. Secondly, everything is sensationalized: the news, the advertisements, everything. I believe this explains some of the culture of the USA.

Also, in listening to the local radio stations I have come to the conclusion that there are many country songs which we never hear on the radio up in Canada. I have no philosophical extrapolation of this observation, I just thought it interesting.

KM 5488: Hovenweep National Park, Utah

Again there was no entry for yesterday because Courteney was freezing and needed warming up. You didn't miss much because we spent the entire day at the Grand Canyon, walking most of it.

The Grand Canyon is a major tourist attraction and has been for years. It certainly shows. It is the most well established park I have ever seen. Most of the paths up on the rim and in the village are paved and level. Most of the shops are gift and souvenir shops. There are also many ranger run activities and talks everyday. There are even shuttles moving around the park. It is all rather commercial.

It truly seems that the whole park is set up to entertain the less physically capable tourists. It would seem that the only thing really worth the entry fee is hiking down the canyon. Unfortunately Courteney vetoed that idea, so we only got to walk around the rim.

I think it would be nice to do an overnight hike into the canyon. This requires a permit that you can get up to four months in advance and some lightweight equipment I don't have. But still.

This travelling is beginning to make me think that I'd like to take up adventuring, or at least backcountry hunting.

I have a couple more things to mention about our mostly offroad trip in getting to the Grand Canyon. I was actually very impressed with the GPS guide I have here. Not only did it have the long dirt path marked, it also had all the National Forest roads (which are all gravel). That is not something that I expected a regular unit to have on it.

Today we left the Grand Canyon. In searching for a suitable camping park within the range we felt like driving we ended up entering Utah. The drive up here through Monument Valley was nice, even though it was mostly covered in a dust cloud. Entering the south eastern corner of Utah is quite a nice view of small canyons and exceptionally red rock.

An interesting fact about Utah, which I am extrapolating from a single sample, is that the lowest grade of gasoline is 86 octane. The midgrade is what I consider to be the normal 87 octane.

As I was saying, Courteney was freezing last night and so I ended up warming her up. Now, due mostly to her laziness, I ended up sleeping in my sleeping bag for the first time. The night was cool, with a predicted low of negative two. That sleeping bag is sure comfortable. Sleeping on one's side is a bit tough, but other than that it is a dream. I can't understand how Courteney is cold in it.

I had another observation about Americans, but I have forgotten. I'll likely think of it again.

KM 5034: Grand Canyon National Park, Arizona

Some days the most exciting part of my day is looking at all the exotic bugs as I scrape them out of my radiator. Today was not one of those days.

Today we left my grandparents' apartment and headed towards the Grand Canyon. Now instead of sticking to the I-40 like all the other boring tourists, we thought we would take a small state highway that coincides with a section of the historic Route 66. After driving north on this for a while we went further astray on another highway into a Native Reservation. This was fine and a pretty nice drive.

After this we had to start heading east. Our road atlas shows one road heading east towards the Grand Canyon from where we were, marked as an unpaved road. This was fine. It was a large scale map, so it was likely just a gravel road, right?

Wrong. This road, using the term loosely, was actually two ruts worn across the rugged fields of a ranch. This was fine; I only expected it to be like that for a few kilometres before we got back onto a graded road.

Well, that didn't happen for nearly sixty kilometres. And though I had fun leaving the flattened trail for a time and making good use of my truck, it did grow old after a while and Courteney was not much of a fan. I expect if you read her entry for today you will find her making the most of her vocabulary to complain.

We finally made it to the Grand Canyon, and did so in time to see the sun set. It's quite a sight, but the scale of it hasn't sunk in yet. We'll do some exploring tomorrow. For tonight though I expect more complaints. The forecast is calling for a low of negative nine degrees Celsius. That is the coldest camp yet by a long shot. At least we have plenty of sleeping bags, clothes, blankets and a heater if things get really bad.

Well, it's getting too cold to type so that is all for this installment.

KM 4652: Lake Havasu, Arizona (Day 2)

Today was our first rest day. That is why the odometer on my truck hasn't changed. This is not to say that we didn't see things. In fact my grandparents took us around to see a few of the sights in the area.

The first was the London Bridge. Yes, we did in fact see the famous London Bridge of the song. I was much surprised to learn that the bridge now resides in a desert in the southwestern United States. It turns out that the person who founded this town thought the area would make a good vacation spot. In order to do that he needed to get people to come and see the town. To do that he decided to buy the London Bridge, dismantle it, ship it across the ocean and then reassemble it. Finally he dug a trench underneath the bridge to create an island in the lake.

It must have worked because the town, founded in 1963, now has a population of about fifty thousand people.

We also drove about an hour to see a town called Oatman. This town was once a gold mining town, once a stop on Route 66 (in fact to get there we drove on a piece of the now defunct Route 66) and is now a tourist town. The major reason it is a tourist town is that there are mostly-wild burros (donkeys) walking the streets and a bar where the walls are covered in dollar bills. The donkeys were originally the workhorses of the mine, but were released into the wild when the mine closed. The bills are each marked with a person's name and sometimes a date or origin. The entire bar is covered in bills and finding a place to put ours was difficult.

That was just about all we did, I did say it was a rest day. We of course also talked with my grandparents.

I also have a reader comment. Please keep them coming. Every one I receive pushes me a little bit further towards using some real blogging software with actual comments and a real RSS feed. Eugene responds to my entry here:

Eugene wrote:

There's a shop called Sanrio around various malls in the Lower Mainland that are also filled with pink and cat-faced things. You just don't venture out into the areas filled with Asian people.

In terms of the American Dream, it seems to have warped from "house, car and a comfortable life for your family" to "buy big, buy now, worry later." It has drowned them in debt.

Well, if I can't understand one store, I am astounded that there is a chain. In fact I believe that the store was called Sanrio as well.

You seem very correct with respect to the American dream. Unfortunately for them when you always take the most that you can get you eventually need to pay the piper. If that isn't happening now it will eventually.

KM 4652: Lake Havasu, Arizona

Today we saw the Hoover Dam. It is considered an engineering marvel, and perhaps it is. When we first arrived we were told that there were no tours, which was disappointing, but we pressed on to see the small museum. There we learnt several interesting things, such as that hardhats were a new thing on that project. I was amazed to learn how much of the work was done with steam powered machinery and men on wooden boards suspended by a single rope. I especially liked the full scale model of a generator. It is interesting to see how many layers are necessary in a generator.

After we had seen approximately half of the exhibits we received the news that they had opened the tours. This made us happy. After finishing the exhibits we upgraded our tickets to go on the most expensive tour of the dam. This tour took us into the gallery of the generator room and then through some of the tunnels inside the dam. This was somewhat interesting. The dam itself was designed with tours in mind and large sections of the floor are nicely finished.

I'm not sure if the tour was worth the $30 per person, but it was nice to see and get out of the heat.

What was annoying was the security. It was more of the same reactionary security. I can see why some people would like to protect it though. This dam is one of many which control the Colorado River. This river used to be unpredictable and vicious. It would regularly flood massive regions of the Southwest and then merely trickle the rest of the year. Without this dam and the rest of the system the Southwestern United States could not exist. It would be flooded every few years and little food could be grown there.

Should the Hoover Dam become inoperable Las Vegas would likely go dark and much of the USA starve.

We are now visiting my grandparents, who vacation here in Arizona. We looked over our proposed route and have decided to make tomorrow our first rest day. This means we will only do light tourist things and not travel far. Tomorrow we will see the London Bridge.

KM 4423: Boulder City, Nevada

Here we find ourselves near the Nevada-Arizona border. We spent a few hours walking the Strip in Las Vegas today. Though this is a young man's town we saw numerous old people. I suppose it is also a city where it helps to have disposable income, especially if you want to take advantage of all the drinking, gambling and sex.

For a city in the desert it is certainly green. Beyond the numerous palm trees and shrubs, every casino seems to have a fountain, some of which are immense. The amount of money, time and other resources that have been spent to build this adult playground is staggering.

In any case this city isn't something that can be truly experienced on a trip like this. First there are the shows, for which we can afford neither the time nor the money. There are five Cirque du Soleil shows and many other magicians, comedians, dancers and others. Let us also not forget the free street shows put on by the casinos and hotels on the Strip.

In the few hours we spent there we only saw a little bit. One day I'll have to spend a week here to truly take it all in. Though I did gamble in Vegas, it was only a dollar because it just doesn't fit in with the trip.

Tomorrow we will visit the Hoover Dam. That promises to be a worthwhile tour.

Now it is time for another episode of Things Travis Notices About Americans. In this instalment we will discuss RVs. Though I have discussed this before I have some new insight. The first is that it appears there are public lands in the USA. I hadn't thought this to be the case. Secondly it appears that tenting is disallowed on all public lands, but RVing is acceptable. I'm not sure why this is, but it is.

Perhaps it's because Americans like travelling with all the amenities of home. I've even seen RVs with portable satellite dishes. And here I am quite enjoying a low technology trip.

KM 4273: Valley of Fire, Nevada

We are in a state park a distance outside of Las Vegas. We arrived in town too late to make a tour and needed to do some grocery shopping. I am glad we did. This park is full of brilliant red rocks. All these rocks and mountains are eroded into interesting shapes. When the pictures are up you will see what I mean.

This is yet another day in the desert. That damn dust gets into everything. It finds its way through zippers and floods in if you open the door for even a second.

But the views are worth it. It is hard to imagine just what emptiness looks like. This is it. You can see for miles and there is nothing there. No trees, no buildings, no water and often no vehicles. At least the mountains off in the distance have interesting colourings.

I would strongly recommend that those into photography spend a week in this desert. Just play with framing nothingness and hard shadows. Bring plenty of water and a polarizing lens.

KM 3938: Furnace Creek, California

We arrived here yesterday, but I was too wiped to make an entry.

We spent the day travelling through desert country. First the parts of the Mojave desert off the Interstate, then Death Valley. Both are pretty nice drives, especially to experience nothing. There is so close to nothing out here that I can't conceive of ever living here.

All that is here is wind, dust and sun bleached rock. And yet there are people who live here. It brings new meaning to scratching out a living.

Though there isn't much here other than wind and dust we did see a few things. The first is that we drove through a small sandstorm. That was fun. Visibility was just like light fog and you could hear the sand bouncing off the door.

We also visited the lowest point in North America. It is 278 feet below sea level. It's a dry lake area, which means it is rather flat and the old lake bed is covered in salt. I am also glad that we are making the trip in spring instead of the height of summer. Currently it only reaches about 25 degrees, but the record during summer is 57 degrees with a more normal maximum of 49 degrees. That is way too hot for me.

In a land of nothing the entries are short because there is only beauty in the rocks. Pictures will come eventually.

KM 3620: Calico Ghost Town, California

Yesterday was a long and exciting day. First we spent the majority of the day walking around the San Diego Zoo. We walked for nearly eight hours straight, but we managed to catch the two major shows. The sea lion show was pretty good. It consisted of a brief appearance by a hawk followed by a well trained sea lion performing a few tricks and finally an arctic wolf.

We took many more pictures than perhaps we should have. I'm sure I'll have endless fun sorting through them when we return.

The San Diego Zoo is world famous and touted as one of the best zoos in the world. However, I'm not convinced it's better than the Vancouver Zoo. Though the San Diego Zoo has a large collection of birds, Vancouver has more large mammals. I also found that the signs in San Diego are not nearly as informative as the ones in Vancouver. In San Diego each sign gave only the species' range and threat level. The signs at Vancouver, however, present that same information as a map, with the addition of peculiar facts and the average age and weight of each species.

Now the San Diego Zoo does have certain species that the Vancouver Zoo does not, such as giant pandas and of course many tropical birds. I believe for the most part this is because the cooler climate of Vancouver prevents housing these species outdoors as is done in San Diego. Also certain species like pandas and some Australian species either cannot be exported except by special governmental decree or are difficult to keep in captivity.

Though the location of the Vancouver Zoo precludes certain species from being affordable, it does give one advantage: space. Most of the cages at the San Diego Zoo are rather small and give severely limited vantage points. This is especially true for the large mammals like lions, tigers and bears. When I was last at the Vancouver Zoo the cages seemed bigger and gave many vantage points, as many of them could be viewed from at least two sides of the rectangular cage. The San Diego Zoo also has the annoying tendency to put the cages above the pedestrian, so you cannot see over or around much of the foliage and landscaping.

It is expected that animals at a zoo will be animals and not active at all times. So it is disappointing when the cages are designed in such a way that when the animals are resting they are difficult or impossible to see.

Then there are the shows. For a zoo that seems to present itself as an amusement park, perhaps even more than an educational park, the shows at the San Diego Zoo were slightly disappointing. They consisted mostly of bringing three animals out briefly, walking them around a small stage, having them perform a few simple tricks and then taking them off stage. There is little explanation of the animals. Also, for a zoo with such a large bird population it is surprising that there wasn't any falconry. The Vancouver Zoo has a falconry show that I like.

The San Diego Zoo may make a better amusement park with its gondola and bus tours, but I feel that Vancouver makes a better educational experience and a better animal viewing experience.

So that was the first part of the day. It lasted from about 9:30 AM, when we left our cheap motel two kilometres from the Mexican border, until about six, when we pulled our tired feet into the truck. Having done that it was time to find somewhere to sleep. We had spent the last few days in motels because we were in large cities, and decided to camp for a change of pace and to save cash. Staying in a motel tends to cost just slightly less than three times as much as camping.

So we head north out of San Diego, having finished our run down the west coast. We may yet go to Mexico, but not to Tijuana, with the current gang war going on. After about two hours on the highway (really a highway, since there are so many around here) we get to one place. Unfortunately they don't take night registrations and we cannot stay. So we head to a state park about an hour and a half away. At first we believed it to be only thirty klicks, but we were wrong. Instead it was about sixty miles of driving. Even worse, it was near the summit of a mountain and we spent about forty-five minutes driving up. Surprisingly there were houses most of the way up this mountain. After passing the summit of the road we were on, at five thousand feet, we dipped a bit down to the park. Arriving in the dark of night, hungry, we made a quick dinner, set up the tent and slept. This explains the lack of an entry yesterday.

Even though we arrived in the dark in the middle of nowhere, the park was partially full. This is a distinct difference from the other weekend camps we had in more northern parts.

I was certainly surprised at how far into the middle of nowhere people lived. There is no way any commute to anywhere could be less than forty-five minutes, and every commute is steeply uphill. Yet people do it.

When we were leaving we discovered that the mountain is actually a pretty busy place, especially with motorcyclists and bicyclists.

Well, today was a driving day. We are currently heading towards Death Valley and Las Vegas and are on schedule to stay overnight in Death Valley tomorrow. We are currently in the Mojave Desert. The land here is not quite flat, but certainly empty. Just off the sides of the interstate are 4X4 parks. We are camping in a regional park just outside a ghost mining town. Unexpectedly we found it full of jacked-up trucks. I suppose it makes sense given the generally smooth nature of the landscape. Even a stock 4X4 could make its way without getting stuck in most places. There are of course the more challenging rock and cliff sections for the skilled.

That is where we find ourselves. Setting up the tent was an interesting feat. At the time we had strong winds, perhaps as strong as seventy miles per hour. Now is the time I'm glad I brought those strong tent pegs and all that rope. The sky is clear in this land the locals call highway land, and just as soon as it is dark enough we will check off another item from The List: truly seeing the stars.

Now before I forget I must mention a couple of things that slipped my mind in previous posts. The first is that in Little Tokyo in Los Angeles we came across a shop I can only describe as a Hello Kitty store. Nearly everything in the store was pink and I saw almost nothing that didn't have a cat face on it. I never considered that such a store could exist.

Secondly, there are a much larger number of old cars down here. I was sceptical when I talked to a clerk and heard that my truck wasn't that old. But then I came further south and saw it to be true. It's rather amazing to see how a small change in the climate affects how things age.

Finally, since the sun has set and the stars have come out to play: it seems that there is not as much wealth in the USA as the travel brochures would have one believe. Everywhere I see cheap land I see small houses; the cheaper the land, the smaller the houses. While in the cities with expensive land I've seen endless wastelands of highway, expensive cars and large houses.

I'm beginning to think that the American dream is just that, a dream.

KM 3101: San Diego, California

After the very busy couple of days we have had a low-key day. There are a couple of days without entries because we met one of Courteney's friends in Fremont. I have been told her name as either Honey or Sakura; I'm not precisely sure which is just a handle and which is her name.

We arrived back at our motel late in the evening and had to go straight to bed, because ahead of us was what will likely be the longest single driving stretch of the trip. We woke up early in the morning and left for Los Angeles. Courteney's friend Annette is there, but she was leaving the area for a few days for a university orientation. Inconveniently, her leaving would have happened at precisely the time we would have arrived if we had kept up our leisurely pace.

So instead of missing her we woke up early and proceeded to drive nearly four hundred miles in a single day. We made it, arriving on Saint Patrick's Day. Annette had just had her birthday and was holding off on the party until Courteney got there. We arrived at our motel room at about six in the evening and shortly thereafter went to a bar to have a party. It was good and everybody there seemed to have fun. We of course closed the bar down.

After arriving back at our motel we needed to sleep and rest for the day of sightseeing that was to happen. We did that and ended up going several places. I'll have difficulty naming all the places we went, but I do know a few. We went to the Santa Monica Pier, Little Tokyo, Hollywood and the highways.

LA isn't so much a single city as a large number of small cities which have grown to touch each other. Now all these areas are connected by highways. There are many of them and they are all at least eight-lane highways, though at times some are twelve lanes. This is because of the unsurprising fact that everybody drives everywhere. In fact we probably drove more than a hundred miles on that sightseeing day, and it didn't appear to be too far out of the ordinary.

It was fun and we did see a lot. Especially the airport. As I mentioned, Annette was travelling away. Well, she left on a redeye yesterday. We dropped her off for her 11:30 PM flight. However, it is a large airport and we walked partway to the terminal and took a shuttle the rest of the way. To return to the car we thought it would be smart to take the shuttle that roams for that purpose. This would have been good except that upon getting off the shuttle we didn't know where the car was in relation to us. Not having a car, we needed to walk to find it.

That was a bit of an adventure. It turns out that we had not parked in the correct lot. Nothing bad happened, but we did end up with an hour-long walk around the perimeter of one of the parking lots. Those lots are large. Oddly enough there were nice wide sidewalks around every block, even though I can see no reason anybody would consider it a good idea to walk around the outskirts of LAX.

After this rather exhausting day playing tourist we returned to playing traveller. We took a leisurely drive from the outskirts of the LA area down to San Diego via Ontario. I know it's a bit corny, but I like driving through areas with names matching places in Canada. This was an easy drive and much shorter than the previous driving day down to LA.

We ended up getting a motel less than two kilometres from the Mexican-US border. It's a rather interesting place. I even heard a bilingual radio station. Not too far from this motel is a beach. Since we are turning east in the next day or two, it was important to make our way to the ocean for a swim.

Earlier in the day it was looking to be a nice day for a swim. The sun was shining and we were half roasting in my truck. I've since discovered that though I may not have air conditioning, turning the fans on helps quite a bit.

Alas, when we arrived at the beach there was a massive dark cloud hanging overhead. It was dark and windy and cold. While Courteney didn't brave the water, I did manage to make my way out until the water was up to my knees, before I could no longer feel my feet. The beach was shallow. It might not have been the nice sunny swim I had hoped for, but I have been in the Pacific on this trip. I will of course get another chance at a proper swim in Tofino, but that's a ways away.

Tomorrow we do the zoo. It promises to require a lot of walking.

KM 2927: Los Angeles, California

There is no real entry today because Courteney's friend Annette and her boyfriend showed us around the town all day and we arrived back at our motel too late to type much.

KM 2192: Fremont, California

This is an interim entry. Currently we are sitting in a laundromat watching our laundry spin in circles.

Today was a rather full day of checking things off the list. We covered the University of California Berkeley campus, Infinite Loop and, partially, the Computer History Museum in Mountain View. Berkeley is rather older than I expected. The campus is fairly large and has a surprisingly full road system. The campus is very similar to the UBC campus, with a mix of ornate brick buildings, more modern concrete monstrosities and a few new steel and glass buildings.

Infinite Loop is just a road, but the geekier among my readers will get the joke. We did get pictures of all six addresses on Infinite Loop. All of them are of course part of the Apple campus.

Now, we were only able to partially see the Computer History Museum because it was closed for a private event. We did sneak in and take a peek at some of the historical pieces, but they closed that section down and caught us. That was unfortunate. Well, perhaps I'll make it down this way again in five or ten years. As a note for anybody who is considering making their way down: if you arrive on the first or third Saturday of the month there is a live PDP-1 demonstration.

Also it is closed on Mondays and Tuesdays.

KM 2036: San Francisco, California

Finally we have arrived at the first listed stop on our trip. It may seem odd that we didn't have any places we wanted to stop on the way down until San Francisco, but that's the way it is. Our stops on this trip tend to be more landmark related and less people related. That is, we'll stop and look at the Grand Canyon and the giant redwoods and the Golden Gate Bridge, but we won't make specific time to take in a Broadway show.

So today we made San Francisco and saw the Golden Gate Bridge. We feared for a while that it would be fogged in and we would be unable to see it at all. This is a very real fear because Courteney's parents were here a few years ago for three or four days and not once could they see the bridge. Not from afar, and not even while walking on it. It was also raining lightly and foggy when we were coming in off Highway One.

Now that is a rather nice highway as long as the weather is fair and