Mar 11, 2011

After installing Sphinx 1.10 on OS X Snow Leopard, I attempted to run the initial indexer and got the following error:

dyld: Library not loaded: libmysqlclient.16.dylib
Referenced from: /usr/local/bin/indexer
Reason: image not found
Trace/BPT trap

The fix was found after a little googling; just run this command:

sudo install_name_tool -change libmysqlclient.16.dylib /usr/local/mysql/lib/libmysqlclient.16.dylib /usr/local/bin/indexer

Adjust the paths to match your MySQL and Sphinx install locations, but after this runs you should be all set.

I also just discovered that you’ll probably have to run the same command against the search binary to get searching working as well.
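Assuming the search binary lives alongside the indexer in /usr/local/bin (a guess; check your own install), the analogous command would look something like this, with otool to verify the rewrite took:

```shell
# Point the search binary at the real libmysqlclient as well
sudo install_name_tool -change libmysqlclient.16.dylib \
  /usr/local/mysql/lib/libmysqlclient.16.dylib /usr/local/bin/search

# List linked libraries to confirm the path was rewritten
otool -L /usr/local/bin/search
```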

Jan 14, 2011

So after almost a whole day of banging my head on my desk, I’ve come to realize the error I was getting:

org.apache.solr.handler.dataimport.DataImportHandlerException: com.mysql.jdbc.CommunicationsException: Communications link failure due to underlying exception:


was related to the old version of the MySQL JDBC connector I was using, 5.0.8. It turns out that version doesn’t recognize the netTimeoutForStreamingResults parameter, causing my executions to die a horrible death after 600 seconds, the default value. Upgrading to the latest connector, 5.1.14, solved the problem, and I’m happily importing my million records of data into Solr. Yay.
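For context, the parameter rides along on the JDBC URL in the DataImportHandler’s data-config.xml. A sketch (host, database, and timeout value here are placeholders, not my actual config); batchSize="-1" is what puts Connector/J into streaming mode in the first place:

```xml
<dataSource type="JdbcDataSource"
            driver="com.mysql.jdbc.Driver"
            url="jdbc:mysql://localhost/mydb?netTimeoutForStreamingResults=3600"
            user="solr"
            password="secret"
            batchSize="-1"/>
```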

Nov 09, 2010

Saw someone on Twitter who needed this, so I whipped something up pretty quickly. It’s not pretty, but it gets the job done.

You run it with:
ruby adif_to_sql.rb [ADIF_FILENAME] [MYSQL_TABLENAME]

You just specify the ADIF file and the name of the MySQL table. The script creates an SQL file that can be imported directly into MySQL.
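The idea boils down to parsing ADIF’s `<field:length>value` tags and emitting INSERT statements. A minimal sketch of the approach (not the original script; escaping and type handling are simplified, and it assumes a straightforward log with `<eoh>`/`<eor>` markers):

```ruby
# Parse ADIF text into an array of { field_name => value } hashes.
# Values that themselves contain "<x:n>" sequences would confuse this
# simplistic scanner; good enough for a typical log.
def adif_records(text)
  body = text.sub(/\A.*?<eoh>/im, '')          # strip the optional header
  body.split(/<eor>/i).map { |chunk|
    fields = {}
    chunk.scan(/<(\w+):(\d+)(?::\w+)?>/i) do |name, len|
      offset = Regexp.last_match.end(0)        # value follows the tag
      fields[name.downcase] = chunk[offset, len.to_i]
    end
    fields
  }.reject(&:empty?)
end

# Emit one INSERT per record; column names are backticked since ADIF
# fields like CALL collide with MySQL reserved words.
def to_sql(records, table)
  records.map { |r|
    cols = r.keys.map { |k| "`#{k}`" }.join(', ')
    vals = r.values.map { |v| "'#{v.gsub("'", "''")}'" }.join(', ')
    "INSERT INTO #{table} (#{cols}) VALUES (#{vals});"
  }.join("\n")
end
```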

Get the script here:


Apr 07, 2010

I’ve gone ahead and removed the unused model that was discontinued last year and added the GEFS and SREF models. I’ve also discovered why some of the valid dates won’t stay the same when you hold down ‘shift’ while changing model runs. I’ll have to create a new property on the model that holds how far apart its runs are from one another, rather than simply using the next hour in the run.

As always, you can check out the latest version at

Feb 18, 2010

So I needed a way to replicate MySQL’s ‘INSERT IGNORE’ statement in DataMapper. Since the code documentation is sparse and I didn’t feel like force-feeding a raw SQL statement into DM, I wrote some Ruby code to replicate it.

If anyone has a better, or more refined solution, please let me know!

      begin
        record.save  # the model/variable name here is illustrative
      rescue DataObjects::IntegrityError => e
        @logger.error e
        # 1062 is MySQL's duplicate-key error code; swallow dupes, re-raise the rest
        raise e unless e.code == 1062
      end
Sep 28, 2009

I recently had the pleasure of being interviewed by Chris Matthieu, N7ICE, for HamBrief #47. We talk about my involvement with Wayne County’s Skywarn program, emergency communications, and other assorted rag chewing.

Thanks again to Chris for the opportunity to talk about one of my hobbies!

You can find more information at:
Central Region NOAA Ham Radio Page

Sep 25, 2009

For the last couple of years at the SRD event at DTX (the NWS office in White Lake) we’ve used a logger I built using Ruby on Rails. Last year I did a complete rewrite only days before the event, but this year I’m building on the existing codebase. One of the features I’ve added since last year is an Ajax dupe checker. It’s pretty simple, but it goes a long way toward catching duplicates before we hit the Submit button.

Right now I’m looking for ways to improve the functionality and feature set, and in November I’ll begin trying to spruce up the interface. I’d like more of it to use Ajax, but it needs to degrade gracefully, since the ham community tends to use older hardware.

Eventually this will build out into an event logger that can be extended to work for any sort of event, with a focus on amateur-radio-oriented activities. From Skywarn nets to disaster events, I want to make a logger that can easily manage and display what happened.

I’m seeking thoughts and ideas on the SRD Logger right now, but I’ll also be thinking about how to abstract its parts out. If you have an interest in using it this year, or want to help beta test the existing software, just drop me a line.

Jul 11, 2009

I was having a problem with the MySQL adapter for Ruby dumping a stack and memory trace when the script finished running. The script itself ran fine, and from the error below you can see it’s related to the garbage collection, or lack thereof, that the MySQL adapter is doing.

Here’s the error I got:

*** glibc detected *** ruby: double free or corruption (!prev): 0x09bb64b8 ***
======= Backtrace: =========

The application I was working on no longer needed to connect to the database, so I put the problem off until I came across it again today. After some research I looked for alternatives to the old adapter, but there was nothing.

Then I came across a post by Stefan Kaes, “Make ruby-mysql create less garbage”. He had found a problem with the number of objects the adapter created, which was leading to a performance hit.

I thought perhaps the problem I was having would be fixed by applying his patch. It certainly couldn’t hurt anything, right? Who doesn’t want a faster database adapter?

After downloading the patch and applying it on my server, I rebuilt the mysql 2.7 adapter and POW!
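For reference, the rebuild went something like this (a sketch; the directory, patch file name, and patch level are assumptions, not the exact ones I used):

```shell
cd mysql-ruby-2.7
# Apply Stefan's garbage-reduction patch to the C extension source
patch -p0 < ruby-mysql-less-garbage.patch

# Rebuild and reinstall the extension
ruby extconf.rb
make
sudo make install
```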

Problem solved. Many thanks to Stefan.

Apr 06, 2009

I’ve gone ahead and updated the code to support templated image URLs. This makes supporting a wider range of image sources much easier. It also means the viewer will support more than just NCEP images, so the name and location will be changing soon. I may leave the NCEP version up just for those images on NCEP.
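The templating itself can be as simple as named placeholders in a URL string. A hypothetical sketch (the template, placeholder names, and zero-padding scheme are assumptions, not the viewer’s actual format):

```ruby
# A source is just a URL template with named placeholders; each image
# source supplies its own template.
TEMPLATE = "http://example.com/models/%{model}/%{run}z/%{param}_f%{hour}.gif"

# Fill in the placeholders for one image; forecast hour is zero-padded
# to three digits, as most model image filenames expect.
def image_url(template, model:, run:, param:, hour:)
  format(template, model: model, run: run, param: param,
                   hour: format('%03d', hour))
end
```

Adding a new source then means adding a template string rather than writing new URL-building code.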

Because of the possible sources for images, I’ll be creating a new interface so you can add/remove any source you want from the viewer. I doubt those in the Plains will really want the DTX WRF runs.

And that’s the other addition to this version: the DTX WRF Hemispheric resolution runs are now available. Currently it only supports the latest run, so there’s no way to compare the previous run with the current one, but new versions are being worked on by DTX, so stay tuned on that front. I may also incorporate some sort of local archiving so I can support this feature even if DTX does not.

As always, let me know if there are any bugs or additions you’d like to see.

Feb 08, 2009

I’ve gone ahead and added support for image links so you can share the images more easily. One idea is to allow users to cache an image to the server so they can retrieve it later, even after 24 hours, when the next day’s run replaces it.

I’ve also made the menu smaller so that more of the image fits on the screen. I may change this back, but for now it works.

I’m also looking into making the script aware of the model runs’ start and stop times. I’m not sure if I want to do this for each image, assume an interval of time for each image to finish, or simply wait until the whole run is complete (which would still be an assumption). The benefit is allowing the user to switch to a model run from a day behind and still keep the valid date the same. Right now this only works with runs from the same day; if you do it with a run from the previous day, you end up 24 hours behind.

Of course, if I start using the server side for this, then all of it is moot, since I’ll be able to check the modified date on the image itself. This would also be a good time to start thinking about caching the images to relieve the strain on the NCEP servers.