EAT vulva at the SFO RCC

Thursday, September 27, 2007 

Posted at 01:05:15 GMT-0700

Category: Funny, Latrinalia, photo, Planes, Travel


Wednesday, September 26, 2007 

Demo this please

Posted at 10:55:14 GMT-0700

Category: photo, Places, Travel

San Diego demo

Tuesday, September 25, 2007 

From robodock to demo in a day.

Posted at 14:10:14 GMT-0700

Category: photo, Places, Travel


Sunday, September 23, 2007 

Engine room of the Stubnitz

A very cool floating studio and night club. It was docked at Robodock and the crew very kindly gave us a tour of the engine room and below decks.
If you are ever fortunate enough to have an opportunity to get on board for an event, do so.
Posted at 17:30:14 GMT-0700

Category: photo, Places, SRL, Travel

The SRL Show is Starting

Saturday, September 22, 2007 

The view from my gun.


The gun ran well – some good flare shots and 2-3 good diesel shots before the receiver froze solid and we couldn’t get any velocity out of it. I think the show went well overall.

Posted at 14:30:14 GMT-0700

Category: photo, SRL

Jet Is Running

Saturday, September 22, 2007 

Greg and I got the jet and the fuel spray running. Warms the space up nicely.


This is a Williams APU with a highly modified and quite deadly ignition system Greg and I built and installed after we found the stock one sparkless.  The flame is much brighter than it looks – the dark windows at the top of the image are looking out on daylight.

Posted at 01:10:18 GMT-0700

Category: photo, SRL

Starting the jet.

Thursday, September 20, 2007 

Tonight I finally got the jet running. Greg figured out a circuit repair using found objects, which we improved over a few iterations until we had a suitably reliable spark, if a bit deadly to the problematically absent-minded. Nothing like a joule at 10 kV to wake you up. One last time….
The outboard sparkplug is the spark gap that sets the discharge point for the capacitor in the high voltage supply. Do Not Touch.

Posted at 18:25:18 GMT-0700

Category: photo, SRL, Technology

Robodock Starts

Wednesday, September 19, 2007 

Robodock started tonight. La Machine was playing: interesting music, but very typical of this sort of event, the kind of arty, house-music-inspired, alternate-instrument thing.


One of our neighbors and new friends is an Australian guy with a flying ring. Apparently he makes a living going to events like this and flying around. He knew more about the history of SRL, who is in it and when, and what machines they had worked on than any of us did. And he knew it all from Australia and watching the movies over and over.


We spent some time just sitting around talking about the jet engine I’m working on bringing up, while I built the 5th of these little ignitor boards to light various pulse jets and flame throwers.


Turns out our landlord from Cambridge is at the show. He’s helping a guy I met in Berlin more than 10 years ago build a giant explosive pipe organ. They’re doing an original composition for 7 tubas and tuned explosive pipe organ. It’s a bit like the tuned mice sketch on Monty Python, except instead of mice and mallets there are gas-filled pipes that produce variable-amplitude, tuned, musically timed explosions.

So far I’ve spent time here on various details of the hovercraft, the shockwave cannon, the jet engine, and the pulse jets. I still have to test the pulse jets, get the jet engine back together with its newly rebuilt ignition system and new flame thrower, and then build a recoil-absorbing shoulder mount so I can shoulder-fire the 3″ air cannon without breaking bones.
The whole thing is the SF expats show: we constitute between 15 and 25% of the entire artist roster, and that’s with artists coming from all over Europe and as far away as Australia.

Posted at 18:20:31 GMT-0700

Category: photo, SRL


Wednesday, September 19, 2007 

I’ve been in Amsterdam the last few days working to get the SRL show up and running. Things are coming together nicely, but there are some difficulties to working far from home: the bizarre aerospace fasteners on the jet engine require a socket we don’t have, and they’re English, for example. So far I’ve got the shockwave cannon working and the pulse jet ignitors built, but the Williams jet isn’t running yet as it seems the HF ignitor is broken.
More fun to come.

Posted at 00:00:14 GMT-0700

Category: SRL

Search Engine Enhancement

Wednesday, September 12, 2007 

Getting timely search engine coverage of a site means people can find things soon after you change or post them.

Most search engines find linked pages every few days or so by following external links or manual URL submissions, but they won’t find unlinked pages or broken links, and the ranking and efficiency of the crawl are likely suboptimal compared to a site that provides a sitemap for easy indexing.

There are three basic steps to having a page optimally indexed:

  • Generating a Sitemap
  • Creating an appropriate robots.txt file
  • Informing search engines of the site’s existence

It seems like the world has settled on sitemaps for making search engines’ lives easier. There is no indication that a sitemap actually improves rank or search rate, but it seems likely that it does, or that it will soon. The format was created by Google and is supported by Google, Yahoo, Ask, and IBM, at least. The reference is at
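For reference, the format itself is just an XML list of URLs with a little optional metadata per URL; a minimal sitemap looks something like this (the URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="">
  <url>
    <lastmod>2007-09-12</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only <loc> is required; <lastmod>, <changefreq>, and <priority> are hints that engines may or may not use.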

Google has created a Python script to generate a sitemap through a number of methods: walking the HTML path, walking the directory structure, parsing Apache-standard access logs, parsing external files, or direct entry. It seems to me that walking the server-side directory structure is the easiest, most accurate method. The script itself is on SourceForge. The directions are good, but if you’re only using the directory structure, the config.xml file can be edited down to something like:

<?xml version="1.0" encoding="UTF-8"?>

 <site base_url="" store_into="" verbose="1">
   <directory path="" url="" />
 </site>

Note that this will index every file on the site, which can be a lot. If you use your site for media files or file transfer, you might not want to index every part of the site, in which case you can use filters to block the indexing of parts of the site or certain file types. If you only want to index web pages you might insert the following:

 <filter  action="pass"  type="wildcard"  pattern="*.htm"           />
 <filter  action="pass"  type="wildcard"  pattern="*.html"          />
 <filter  action="pass"  type="wildcard"  pattern="*.php"           />
 <filter  action="drop"  type="wildcard"  pattern="*"               />

Running the script with

python --config=config.xml

will generate the sitemap.xml.gz file and put it in the right place. If the uncompressed file size is over 10MB, you’ll need to pare down the files listed. This can happen if the filters are more inclusive than what I’ve given, particularly if you have large photo or media directories or something like that and index all the media and thumbnail files.

The sitemap will tend to get out of date. If you want to update it regularly, there are a few options. One is to use a WordPress sitemap generator (if WordPress is what you’re using and indexing), which does the right thing and generates a sitemap from data available to WordPress but not to the file system (a good thing). Another is to add a cron job to regenerate the sitemap regularly; for example

3  3  *  *  *  root python /path_to/ --config=/path_to/config.xml

will regenerate the sitemap daily at 03:03.


The robots.txt file can be used to exclude certain search engines (for example MSN, if you don’t like Microsoft for some reason and are willing to sacrifice traffic to make a point), and it also points search engines to your sitemap file. There’s kind of a cool tool here that generates a robots.txt file for you, but a simple one might look like:

User-agent: MSNBot                             # Agent I don't like for some reason
Disallow: /                                    # Path it isn't allowed to traverse

User-agent: *                                  # For everything else
Disallow:                                      # Nothing is disallowed...
Disallow: /cgi-bin/                            # ...except this directory, which nobody can index

Sitemap:                                       # Full URL of the sitemap goes here

Telling the world

Search engines are supposed to do the work; that’s their job. They should find your robots.txt file eventually, read the sitemap, and parse your site without any further assistance. But to expedite the process and possibly enhance search results, there are submission tools at Yahoo, Ask, and particularly Google that generally allow you to add meta information.
Ask allows you to submit your sitemap via URL (and that seems to be all they do).

Yahoo has some site submission tools and supports site authentication, which means putting a random string in a file they can find to prove you have write-access to the server. Their tools are at

with submissions at

you can submit sites and feeds. I usually use the file authentication, which means creating a file named with one random string (y_key_random_string.html) containing another random string as its only contents. They authenticate within 24 hours.
It isn’t clear whether submitting a feed also adds the site, but it looks like it does. If you don’t have a feed you may not need to authenticate the site for submission.
Google has a lot of webmaster tools at

The verification process is similar, but you don’t have to put any data inside the verification file, so

touch googlerandomstring.html

is all you need to get the verification file up. You submit the URL to the sitemap directly.
Google also offers blog tools at

where you can manually add your blog’s feed to Google’s blog search tool.

Posted at 13:25:13 GMT-0700

Category: FreeBSD, Technology