Friday, October 26, 2007

Geotagging: The Trackback!

My first trackback! I've never actually done this before, but it seemed like a good time to try it out since I've just put up a post about geotagging.

Stacy and Brian's Hikes and Bikes: Life of a Geolocated Blog Post

Geotagging works!

See my first geotagged album from Pt Reyes.

I got a Garmin Vista HCx; after reading quite a few reviews, this seemed like a good unit for general use. I followed directions I had found somewhere on the web. I don't have a reference to where I got started, but here's a very detailed set of directions. I synced the camera and GPS clocks, turned on logging, and went out for a hike. On returning I pulled the GPS data off the unit and produced a .gpx file with Garmin's software, then added GPS coordinates to my Canon .CR2 raw files using gpicsync.

The key here turned out to be setting the timezone: the GPS data is in UTC, while the camera timestamps are in local time. I used a timezone of -7, which corresponds to PDT. I then processed the .CR2 files and exported them as .jpg files.

Picasa didn't recognize the GPS coordinates, however; it turns out Picasa is quite picky. The solution is again in the directions cited above: see Markus's comment on fixing the GPS exif data using exiftool. Once that's done, Picasa recognizes the coordinates.

The final step is to upload the geotagged photos to Picasaweb. Before uploading, go into the Picasaweb settings and select "Use exif location information". Then upload the photos, and you have a geotagged online photo gallery.
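The gpicsync and exiftool steps condense to two commands. The sketch below just echoes the commands rather than running them; the flag names are my best recollection, not verified, so check each tool's --help before using them for real.

```shell
# Dry-run sketch of the tagging pipeline.  The gpicsync and exiftool
# flags here are assumptions from memory -- verify against each
# tool's --help before running for real.
TZ_OFFSET=-7          # camera clock is PDT = UTC-7; the GPS log is UTC

run() { echo "$@"; }  # echo the commands instead of executing them

# 1. Match photo timestamps against GPX track points, shifting camera
#    time by the UTC offset so the two clocks line up:
run gpicsync --directory ./photos --gpx track.gpx --timezone "$TZ_OFFSET"

# 2. Picasa ignores the coordinates unless the GPS exif data is also
#    cleaned up; exiftool can patch GPSVersionID in place:
run exiftool -n -GPSVersionID='2 2 0 0' ./photos/*.jpg
```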

Tuesday, October 16, 2007

Photography equipment wishlist

A short wishlist of photography equipment...

Canon EF 70-300mm f/4-5.6 IS USM Telephoto Lens
Canon EF 100mm f/2.8 USM Macro Lens
Remote shutter release for Canon 30D
VisibleDust Arctic Butterfly 724 Sensor Brush
Canon 580EX II Speedlite or Canon Speedlite 430EX Flash
Lightroom and photoshop
GPS unit for geotagging

Update 10/26/07:
I have a GPS unit.
I would also like Canon EF 24-105mm f/4 L IS USM Lens.

Saturday, October 6, 2007

To update or not to update

I was recently considering upgrading my desktop with more RAM. The cost of adding 2GB of RAM is about $120. The cost of a new (or slightly used) desktop with a faster processor, a better graphics card, and 2GB of RAM is about $600. So is it worth spending the money on RAM alone? The catch is that a new desktop would likely come with Vista, which would wholly defeat the purpose of getting a newer machine: improving performance. Getting a Mac would in the long run be much more of a bargain, and until then it's worthwhile to put something into RAM. I settled on getting 1GB for about $60. I think I made the right choice.

Photography failures

I've been trying to learn to take a good photograph. I've joined the Martin Bailey Photography Podcast forums, and have posted some images. I've gotten valuable feedback, some positive, all of it constructive. There is a downside to getting good feedback: all those photos that I thought were good now don't look so hot. We were out today at a lake. I tried photographing birds. I'm not particularly happy with the outcome. The sad part is that I haven't even looked at the photos yet! I think I'm just feeling pretty negative about myself right now.

So much I want to do, so little time to do it all. Will I really get any good with my camera? I do want to.

New FReT nearly done

I've finished testing FReT on Allegro CL, OpenMCL, and SBCL. I've been sloppy about releasing versions of this software in the past, and am determined not to let that happen again. Tomorrow I'll write up release notes, package it up, and put it up for download. I doubt this software has any users; even I haven't really used it on any of my projects, a state of affairs I shall address now that the last essential usability-related detail is in place: execution through ASDF.

Sunday, September 30, 2007

FReT 0.4 is approaching

ASDF integration for FReT now works on at least one lisp implementation. Now to test with one other implementation, and do a 0.4 release. This one has been a long time coming. After that... use it for some real work and see if I consider it usable.

Otherwise this is going to have been a lot of wasted effort.

Tuesday, September 25, 2007

Troubles extending ASDF

ASDF is a huge improvement over system definition facilities that had preceded it. I always found mk:defsystem to be a bit of a monstrosity. But one of the great benefits I had expected from ASDF, extensibility, leaves a lot to be desired.

I have, after a long break, returned to integrating FReT into ASDF. I want to define an ASDF operation, along with other instrumentation, so that one can execute a FReT test suite directly through ASDF. For example, after loading FReT, one should be able to run:

(asdf:operate 'fret:fret-op 'fret-test)

This should execute the FReT test suites.

ASDF does not make this easy. It does not perform operations on any components other than files, so you can't really instrument an arbitrary component with an :after method. If you're loading a system, there is no way to know when loading has finished. So if I have a test suite defined on a component, and I want to test that component, I can't tell when to run the suite.

I found a way around that problem: a virtual FReT source file, added after the component is defined, that depends on all the other components in the system under test. I wanted to add the virtual file in an :after method on initialize-instance of the fret::component class, which inherits from asdf:component. Here I was thwarted again: ASDF does not do anything interesting in make-instance, only in reinitialize-instance, and even there the components aren't set directly but through a setf.

The manner in which I've had to implement this new operation seems quite convoluted. With a bit of forethought I imagine this could have been greatly simplified. For now I'm happy to have everything working.
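The shape of the virtual-file trick can be sketched roughly as follows. Everything except the ASDF names is a hypothetical stand-in (virtual-test-file, run-test-script); the real FReT implementation is, as I said, more convoluted.

```lisp
;; Sketch only: class and function names other than ASDF's are
;; hypothetical stand-ins for FReT's real machinery.
(defclass fret-op (asdf:load-op) ())

;; A virtual source file that is made to depend on every other
;; component in the system, so ASDF performs it last.
(defclass virtual-test-file (asdf:cl-source-file) ())

(defmethod asdf:perform ((op fret-op) (c virtual-test-file))
  ;; By the time ASDF reaches this component, the rest of the system
  ;; has been loaded, so it is finally safe to run the tests.
  (run-test-script (asdf:component-system c)))
```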

Saturday, September 15, 2007

Working on FReT again

I've finally started work on FReT again. I want to actually start using it; it seems like such a waste to have put so much effort into getting this far and then not be able to use it. I have yet to implement an essential feature of the software: being able to run a test suite/script through an ASDF operation. The operation must run after a load has been performed on an instance of a module. For example, we would define a test definition for FReT itself as follows:

(defsystem fret-test
  :class fret:system
  :components ((:module test-support
                :components ((:file "framework-tests-support"))))
  :test-script fret::fret-test-script
  :depends-on (fret))
Then, calling (asdf:operate 'fret:fret-op 'fret-test) should execute the test suite. fret:fret-op would be a subclass of asdf:load-op, and would simply define an :after method on whatever generic function signals the completion of loading a testable module, in this case fret-test itself. I could have done without subclassing load-op, but then I would not be able to do a straight load.

The model here is that a system with only a set of simple tests could fold its test suite into the primary system definition. If the tests require any support code (as FReT's do), then a separate system has to be introduced. Another reason the tester system for FReT must be separate is that it cannot be declared without FReT itself already loaded; FReT is the only piece of software that would run into such a bootstrapping problem.

At present, though, it appears ASDF does not signal the completion of loading a module through any generic function, so I cannot know when to run a module's test suite, because I don't know when it has finished loading. I've posted a message on the mailing list asking for advice; as soon as I have some, I'll be able to finish up my implementation.


Friday, September 14, 2007

Facebook, here I am!

Some time ago I got around to creating a profile on Facebook, when a friend invited me to do something or other on there. No, I really don't remember the circumstances. Now I'm a bit addicted to Facebook, despite certain annoyances in how it pushes the applications you're using onto your friends. (Yes, I realize that works for some people, but not everyone works the same way.)

I'm fascinated with the simplicity of Facebook's architecture, and the power it produces. Each application is basically a function you can apply to data, which is the list of your friends. Each use of an application generates a message to all your data. There's a bit of window dressing to go along with this basic idea, such as being able to search for friends and applications, managing email, etc. But there really doesn't seem to be that much more to it. The only other social networking site I've tried is LinkedIn, which by contrast seems quite static. But then it does so much more with that static structure. Would it be possible to implement LinkedIn on Facebook?

Performance tuning in lisp

The more I use Lisp, the more I appreciate it. Recently I had to tune the performance of a knowledge-based application running on Allegro CL. We made extensive use of a reasoner, which, being a reasoner, had to be general purpose; our application used only a subset of it. The approach we took to performance tuning could probably be applied in any language capable of introspection, but this post is just about Lisp.

Broadly speaking, I see two approaches to tuning performance: use a better algorithm, and cache intermediate results. Given that the reasoner was essentially a library, we didn't have sufficient understanding to perform serious algorithmic tuning. However, there was a lot we could do with caching.

Consider that in a full frame-based reasoner, a general purpose application could be creating arbitrary knowledge. At no point can we make a "closed world assumption" and optimize certain behaviors of the reasoner. An application using the reasoner, however, does know when such assumptions are safe. For example, we may enter a phase of computation where we're not creating any new classes or any new slots. In our case, we never create any slots after compile time, so we can safely start caching the results of all computations performed on slots. Computing the inverse of a slot, for instance, can potentially be a complex operation, but since we know the set of slots is fixed, we can start caching these slot inverses.

Here's where Lisp comes in. The function that computes slot inverses is a library function we don't control. Given that we can manipulate the symbol table, however, we can create a wrapper around the function and setf the wrapper as the symbol's function value. This won't catch every use of the slot-inversion function, but it catches most of them, and the result is a noticeable performance improvement. So not only does Lisp let us apply targeted optimizations in software we have written ourselves; it also lets us optimize software we've inherited as libraries, based on the usage pattern of those libraries in the final application.
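As a minimal sketch of the trick, here is a caching wrapper installed over a function's symbol binding. slot-inverse is a hypothetical stand-in for the reasoner's real slot-inversion function, not its actual name.

```lisp
;; Wrap a library function we don't control with a cache, by replacing
;; its function binding.  `slot-inverse' is a hypothetical stand-in
;; for the reasoner's real slot-inversion function.
(defvar *slot-inverse-cache* (make-hash-table :test 'eq))

(let ((original (symbol-function 'slot-inverse)))
  (setf (symbol-function 'slot-inverse)
        (lambda (slot)
          (multiple-value-bind (inverse presentp)
              (gethash slot *slot-inverse-cache*)
            (if presentp
                inverse
                (setf (gethash slot *slot-inverse-cache*)
                      (funcall original slot)))))))
```

Callers that captured #'slot-inverse before the swap, or calls the compiler chose to inline, bypass the wrapper entirely; that is why not every use of the function gets caught.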

Wednesday, September 12, 2007

New toy!

I recently got myself a new lens: the Canon 10-22mm EF-S. I always wanted a wide lens, but couldn't find anything suitable given the reduced sensor size. I looked around and picked out three candidates: the Sigma 12-24mm, the Tamron 11-18mm, and the Canon. The Canon is the most expensive of the three, but the quality of the images I'm getting from it is exceptional. (Especially after getting a friend's help in cleaning the camera sensor! Very important!) The salesperson who helped me at Calumet steered me away from the Sigma, due to sample variance. The Sigma does have the distinction of being a full frame lens with that wide a view, but the glass bulges out to the extent that putting any sort of filter on it is impossible. The Tamron turned out to be a good lens, but much clunkier. And noisier.

I've now taken a few photos with the lens. The problem right now is with the photographer, not the lens. We recently took a trip to Mt Tam, and went up to the east peak. I tried taking some wide landscape shots, but getting close enough to the edge for an unobstructed view was difficult given my fear of heights, so many of the images turned out rather unsatisfactory. I also tried on Mission Peak, and I think I had more luck there. Still a lot to learn. Exciting!!

Monday, September 3, 2007

New toys, virtual and real

I've got myself a couple of new toys: I've started playing with facebook, and I got myself a Canon EF-S 10-22mm lens, which in 35mm equivalent is 16-35mm.

I'm impressed with facebook. There's of course the social aspect of facebook that's been beaten to death. The site however is well put together, especially the manner in which applications work within the relatively clean UI. (There's a lot in the UI, but that's OK.)

As for my non-virtual toy... I spent a lot of time agonizing over which lens to get: the Sigma 12-24mm, the Tamron 11-18mm, or the Canon. With a Canon 30D, given the small sensor size, getting a wide angle is difficult. I wanted to get down to about 20mm, but regular lenses don't cut it. And I didn't want a lens specifically designed for the small sensor size, since that would make graduating to a full frame camera difficult. (Not that there's any reason to believe that's going to happen any time soon, but just in case...) The Sigma fits a full frame camera, but the chromatic aberration seems quite bad. Plus I've heard Sigma has a real problem with sample variance. The Tamron was quite nice, but the feel of the lens was clunky compared to the Canon. The Canon was smoother and faster, both mechanically and optically. It was also more expensive.

I got the lens on Saturday. We went to Mission Peak that day, and took a few photographs. I'm in the process of sorting through them, and I'll eventually put them on my gallery on Picasaweb. The short of it is that the lens is great, but the photographer isn't. I have a lot to learn if I'm going to use this lens well. I think I got a couple of decent shots out of it.


Simulated worlds, and agents therein

I started working on a simulated world based on a discrete event simulator ages ago, and have finally picked the software up again. It's taken a bit of work to get going: the code is strangely complex, and I'm now getting re-acquainted with my intentions in putting it together that way. I'm struggling with the implementation of the simplest agent, one that perceives the artificial world and is able to act in it. Just working out the notions of simulated world, perception, and action is turning out to be quite challenging. I'm looking forward to getting this simple agent model working; I haven't designed any AI software from scratch in a long time, so this is rather exciting.

Thursday, July 19, 2007

Setup nearly there

I had neglected to wrap up the migration to the new dreamhost server. To serve my domain from dreamhost, I had to transfer my DNS hosting to dreamhost as well. A minor disappointment: I had pre-paid all the way up to 2009. After transferring DNS, I forgot to update my blog URLs. Now that's fixed, and all seems to be working.

Backups are working. I split my backups into five sections. The largest of them, music, doesn't change, so I'm considering switching it to manual backups. I added lots of photos from our trip to Carmel, and they have been correctly backed up. I'm happy.

Wednesday, July 18, 2007

Trip to Carmel

We took a trip to Carmel, CA last weekend. See the photos: Carmel and Point Lobos. Carmel is a little "village" that deliberately hides away most modern public conveniences. Such as street lights. The houses are expensive, and amazingly well kept. And rather small. Surprisingly tasteful, much more so than I was expecting. We stayed at the Sea View Inn. I enjoyed myself. It was a wonderful break from reality. The nights were dark, and quiet. A passing car became an obvious disturbance, rather than the routine that it is in a city. Point Lobos is a state park just south of Carmel. Quite amazing. My camera got quite a workout over the weekend, I think I got some of the best photographs I've ever taken.

Wednesday, July 11, 2007

Lessons from multiple backup failures

No matter how fast your internet connection, backing up 20 odd GB of data is going to take days. And is quite likely to fail before completion. Now to see if backing up in smaller chunks works.

Tuesday, July 10, 2007

McLaren Park

The big park in the neighborhood is now McLaren Park. The park's beautifully situated, occupying a hilltop with views of the bay on one side, and Glen Park on the other. It is also one of the wildest parks in San Francisco, it would appear. It tends to be rather empty. The vegetation and paths are quite rough. The park on the whole feels quite unloved. A neglected park in a neglected corner of the City, with Excelsior on one side, and Visitacion Valley, Bay View etc. on the other. The micro-demographics of San Francisco are as numerous as its micro-climates.

Monday, July 9, 2007

The server is dead... Long live the server!

I've maintained a server of my own for a long time now. It's been sitting in my place, sucking up power, serving up content. And it has done a fine job. However, the data on the server, and on my home machines, was fragile and underused.

I spent the weekend switching my domain over to Google and moving all my mail to Google Mail. Despite some negative reviews, I have rented space on Dreamhost, not just for the storage but also for their SVN repository. All my blogs are now on their own blogger domains. There's safety and relief in giving up control. I'm also trying to set up a backup regimen. I found some software and started a backup, but the sheer quantity of data that needs to be backed up is daunting. Sending many gigs of data to a remote host is a slow process, even on a relatively fast network connection; I don't know how long even a single full backup will take.

So far I'm as satisfied as can be. Though it's a bit unsettling just how little email I get, when all the spam has been cut out. I miss my spam.

Friday, June 15, 2007

No news is, well, no news

It has been a long time since I've done anything public. No software updates. No blog postings. But it isn't because I've been idle. Far from it. I've been furiously working away on an object-oriented application server built on top of hunchentoot. It isn't yet publicly available; I hesitate to put it out before I have the server's storage scheme worked out.

Meanwhile, I have neglected my other public projects. I fixed a few critical bugs in FReT, but haven't put out a new version. Even finding a review that basically panned FReT for crashing right away didn't turn out to be sufficient motivation.

I have yet to write a test suite for the other software I've been developing. The test suite will obviously be in FReT, and writing the test suite will drive further development. I just have to get working on putting together said test suite.

Testing a new server setup

A lot has happened since I last blogged. We have moved. So has our ISP. We're using a dynamic IP right now, and managing the domain is a bit more complicated. This post is a test to see if I'm still able to correctly connect to my server for publishing my blogs.

Thursday, January 25, 2007

SBCL doesn't thread on the mac

I just realized (once again) that SBCL doesn't do threading on the mac. Which means hunchentoot doesn't work. Which means I can't use SBCL. Time to try OpenMCL...

Wednesday, January 24, 2007

Starting up with SBCL

I've recently started working with SBCL on an intel Mac OS X machine. SBCL has come a long way since the last time I had attempted using it, two years ago. It has a different set of quirks from Allegro CL, which I've been using for many years now. My baseline for considering SBCL usable was whether I could get FReT to work. Happily I have succeeded. Version 0.3.1 doesn't work, version 0.3.2 (or the current svn head) will.

The most difficult part of porting FReT was a set of odd bugs that came up in CLOS. FReT does a lot of CLOS mangling, and while I find CLOS to be one of the most powerful and capable OO systems I've used, it is also one of the most mysterious. I was defining and redefining classes, and using their prototypes for selecting generic function methods to execute. In addition, I had added default initialization forms in class definitions to signal errors if certain slots were not supplied, like so:
(defclass test-object ()
  ((start-time :initform (get-universal-time) :reader test-start-time)
   (state :initform :init :accessor test-state)
   (runner :initarg :runner
           :initform (error-required 'runner 'test-object)
           :reader test-runner)))

SBCL, it turns out, applies slot initforms to prototype instances when a class is redefined, but not when the class is initially defined. This is apparently allowed by the ANSI standard; I'm not good enough at the minutiae of standards to try to explain why. The solution turned out to be including an error handler in an :around method on update-instance-for-redefined-class.
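That :around method might look roughly like the sketch below. It assumes the spurious required-slot errors all surface during the update itself, and it swallows any error, which is cruder than what careful production code should do.

```lisp
;; Sketch: muffle the spurious required-slot errors signaled when
;; slot initforms are re-run on a class's prototype during
;; redefinition.  Swallowing all errors here is a blunt instrument.
(defmethod update-instance-for-redefined-class :around
    ((instance test-object) added-slots discarded-slots property-list
     &rest initargs)
  (declare (ignore initargs))
  (handler-case (call-next-method)
    (error () instance)))
```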

I'm presently satisfied that SBCL is usable, though not necessarily immediately comprehensible. Suitable for my hacking needs.

Experiences with IkeWiki

I have gotten my hands on IkeWiki, installed it on my mac, and have been testing it. IkeWiki is a pretty good piece of work, though I find it a bit un-wiki-like. The author, Sebastian Schaffert, is quite responsive on the mailing list.

IkeWiki is quite wiki-like when it comes to editing articles. You can add an article, put in links to pages that don't yet exist, and fill in content as you go through the web site. Authoring can be done either in wiki syntax or in HTML, if the wiki syntax doesn't satisfy your needs. For any interesting formatting you have to resort to HTML, as the wiki syntax is much less elaborate than, say, MediaWiki's. This is not surprising, and certainly not a show stopper.

Semantic annotation, though, is not very wiki-like. You cannot invent relations (called properties in OWL), namespaces, or any other type of content on the fly. Relations cannot be embedded in the wiki syntax; you have to use a separate annotation mode, and all annotations are made with respect to the current document. This may not seem like a big deal, but it really takes away the feeling of easy content creation a wiki typically gives you. You do end up with a more consistent knowledge base, but it doesn't get at the questions I want answered.

There are other things I want from a semantic wiki that IkeWiki doesn't handle. You cannot describe multiple resources in a single article; the resource and the document about the resource are conflated. And there isn't enough documentation on how to work with IkeWiki. Ultimately, this is not the semantic wiki I want. But do I really want to write my own? Well, IkeWiki is GPL, which means that if I want to do anything non-open with it, I will have to produce my own semantic wiki.