Dec 6 2012

## Rpath emulation: absolute DLL references on Windows

When creating an executable or shared library on Linux, it’s possible to include an ELF RPATH header which tells the dynamic linker where to search for any shared libraries that you reference. This is a pretty handy feature because it can be used to nail down exactly which shared library you will link against, without leaving anything up to chance at runtime.

Unfortunately, Windows does not have an equivalent feature. However, it does have an undocumented feature which may be enough to replace your use of rpath if you are porting software from Linux.

Executables and DLLs on Windows always reference any DLLs that they import by name only. So, the import table for an executable will refer to kernel32.dll rather than C:\Windows\kernel32.dll. Windows’ dynamic loader will look for a file with the appropriate name in the DLL search path as usual. (For full details on DLL import tables and more, you can check out my previous in-depth post.)

However, Windows’ dynamic loader will, as a completely undocumented (and presumably unsupported) feature, also accept absolute paths in the import table. This is game-changing because it means that you can hard-code exactly which DLL you want to refer to, just like you would be able to with rpath on Linux.
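
In essence, exploiting this is a bounded in-place string overwrite: replace the imported name with an absolute path, padded out with NULs. Here is a rough sketch in Python (the byte strings are made-up stand-ins for a real PE image, and a real tool would walk the import table properly rather than searching for the name):

```python
def patch_import_name(image: bytes, old_name: bytes, new_name: bytes) -> bytes:
    """Overwrite a NUL-terminated DLL name in a binary image in place.

    The new name (plus its NUL terminator) must fit in the space occupied
    by the old name and any NUL padding that follows it, so the overall
    file layout is unchanged.
    """
    start = image.index(old_name + b"\x00")
    # Measure the old name plus the run of NUL padding after it
    end = start + len(old_name)
    while end < len(image) and image[end] == 0:
        end += 1
    room = end - start
    if len(new_name) + 1 > room:
        raise ValueError("not enough padding to fit the absolute path")
    patched = new_name + b"\x00" * (room - len(new_name))
    return image[:start] + patched + image[end:]

# Toy stand-in for an EXE image: replace a relative DLL reference
# with an absolute one, reusing the trailing NUL padding
image = b"MZ...\x00library.dll\x00\x00\x00\x00\x00\x00\x00\x00...rest"
patched = patch_import_name(image, b"library.dll", b"C:\\library.dll")
```

The key constraint, which the demonstration below runs into, is that the absolute path can only be edited in if the binary happens to contain enough padding after the original name.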

# Demonstration

To demonstrate this technique, we’re going to need code for a DLL and a referring EXE:

$ cat library.c
#include <stdio.h>

__declspec(dllexport) int librarycall(void)
{
    printf("Made library call!\n");
    return 0;
}

$ cat rpath.c
__declspec(dllimport) int librarycall(void);

int main(int argc, char **argv)
{
    return librarycall();
}

If we were building a DLL and EXE normally, we would do this:

gcc -c library.c
gcc -shared -o library.dll library.o
gcc -o rpath rpath.c -L./ -llibrary

This all works fine:

$ ./rpath
Made library call!

However, as you would expect, if you move library.dll elsewhere, the EXE will fail to start:

$ mv library.dll C:/library.dll
$ ./rpath
/home/Max/rpath/rpath.exe: error while loading shared libraries: library.dll: cannot open shared object file: No such file or directory

Now let’s work some magic! If we open up rpath.exe in a hex editor, we can find the import table entry naming library.dll. Let’s just tweak that a bit to change the relative reference to library.dll into an absolute path: luckily there is enough padding to make it fit. The EXE will now work perfectly!

$ ./rpath
Made library call!

# In practice

Knowing that this feature exists is one thing. Actually making use of it in a reliable way is another. The problem is that, to my knowledge, no linker is capable of creating a DLL or EXE which includes an absolute path in its import table. Sometimes we will be lucky enough that the linker creates an EXE or DLL with enough padding in it for us to manually edit in an absolute path, but with the method above there is no guarantee that this will be possible.
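
If you want to check in advance whether a particular binary leaves you enough slack for a manual edit, you can measure the run of NUL padding after the imported name. A rough sketch (it scans for the name as a byte string rather than parsing the PE import table properly, and the image here is a made-up stand-in):

```python
def room_for_name(image: bytes, dll_name: bytes) -> int:
    """Return how many bytes (including the NUL terminator) are available
    at the spot where dll_name is stored in the binary."""
    start = image.index(dll_name + b"\x00")
    end = start + len(dll_name)
    # Count the trailing NUL padding that an edited name could reuse
    while end < len(image) and image[end] == 0:
        end += 1
    return end - start

image = b"...\x00library.dll\x00\x00\x00\x00\x00X..."
print(room_for_name(image, b"library.dll"))  # 16: the 11-byte name, its NUL terminator, and 4 spare bytes
```

Any replacement path longer than that (minus one byte for the terminator) simply will not fit, which is why the import library trick below is needed for robustness.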

In order to exploit this technique robustly, we’re going to use a little trick with import libraries. Instead of using GCC’s ability to link directly to a DLL, we will generate an import library for the DLL, which we will call library.lib:

$ dlltool --output-lib library.lib --dllname veryverylongdllname.dll library.o

When you use dlltool you either need to write a .def file for the DLL you are creating an import library for, or you need to supply all the object files that were used to create the DLL. I’ve taken the second route here and just told dlltool that our DLL was built from library.o.

Now that we have an import library, we can do our hex-editing trick again, but this time on the library, replacing the dummy veryverylongdllname.dll with the absolute path we want (note that the new absolute path must be null-terminated).

The beauty of editing the import library rather than the output of the linker is that, using the --dllname option, we can ensure that the import library contains as much space as we need to fit the entire absolute path of the DLL, no matter how long it may be. This is the key to making robust use of absolute paths in DLL loading, even if linkers don’t support them!

Now that we have the import library, we can link rpath.exe again, but this time using the import library rather than library.dll:

$ gcc -o rpath rpath.c library.lib
$ ./rpath
Made library call!

Yes, it really is using the DLL on the C: drive:

$ mv C:/library.dll C:/foo.dll

# Conclusions

Absolute DLL paths in the import table are undocumented and presumably unsupported, but they do work, and the import library trick makes them practical to use robustly even though no linker will emit them for you. If you are porting rpath-dependent software from Linux to Windows, I hope this article helps you get it done.

Jan 28 2011

## Solving GHC iconv problems on OS X 10.6

A problem that has plagued my GHC installation for a while is that whenever I tried to install any non-trivial package I would get a horrible link error like this:

Undefined symbols:
"_iconv_close", referenced from:
_hs_iconv_close in libHSbase-4.3.1.0.a(iconv.o)
(maybe you meant: _hs_iconv_close)
"_iconv_open", referenced from:
_hs_iconv_open in libHSbase-4.3.1.0.a(iconv.o)
(maybe you meant: _hs_iconv_open)
"_iconv", referenced from:
_hs_iconv in libHSbase-4.3.1.0.a(iconv.o)
(maybe you meant: _hs_iconv_open, _hs_iconv_close , _hs_iconv )
"_locale_charset", referenced from:
_localeEncoding in libHSbase-4.3.1.0.a(PrelIOUtils.o)
collect2: ld returned 1 exit status

The reason for this is a combination of several factors:

• The base library that comes with the GHC binary distribution wants to link against the standard Mac iconv
• I have installed MacPorts libiconv, which renames the function that is named iconv_open in the standard iconv to libiconv_open
• The Haskell library being installed by cabal depends transitively on some library that was built with something like extra-lib-dirs: /opt/local/lib, which causes -L/opt/local/lib to be passed to the linker
• The linker's -L/opt/local/lib option occurs before -L/usr/lib, so the linker prefers to link against the MacPorts libiconv rather than the system one
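
The failure is just first-match-wins path searching. As a toy model of that behaviour (the two directories play the roles of /opt/local/lib and /usr/lib; the file name is illustrative):

```python
import os
import tempfile

def find_library(name, search_dirs):
    """Return the path of the first matching library, mimicking how the
    linker tries -L directories in order: first match wins."""
    for d in search_dirs:
        candidate = os.path.join(d, name)
        if os.path.exists(candidate):
            return candidate
    return None

# Both a "MacPorts" and a "system" directory contain a libiconv;
# only the search order decides which one gets linked.
macports = tempfile.mkdtemp()
system = tempfile.mkdtemp()
for d in (macports, system):
    open(os.path.join(d, "libiconv.dylib"), "w").close()

# -L/opt/local/lib first: the MacPorts copy wins
macports_pick = find_library("libiconv.dylib", [macports, system])
# --extra-lib-dir=/usr/lib puts the system directory first and flips the choice
system_pick = find_library("libiconv.dylib", [system, macports])
```

This is why reordering the search path (the second fix below) is enough to change which library gets picked up.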

In my case, it was the Haskell readline wrapper that was causing /opt/local/lib to be pulled in. I had to link the Haskell readline against MacPorts readline because the standard Mac libreadline is actually libeditline, which is almost-but-not-quite compatible and misses some crucial features.

There are several ways to fix the problem:

• Perhaps you don't really need the MacPorts libiconv. In this case, you can stop it from being used by just doing port deactivate libiconv. This is the route I took.
• Perhaps it's OK to link this particular library against the system libraries in preference to the MacPorts one. In this case, you can configure the package with cabal configure --extra-lib-dir=/usr/lib, so /usr/lib is searched before the MacPorts directory. This may fail if the package that needed -L/opt/local/lib requires a MacPorts version of some library that is also present in /usr/lib, though.
• You could build GHC yourself and link it against the MacPorts library versions. This is not for the faint-hearted, but if the version of GHC you need is in MacPorts I imagine that you can just do port install ghc.

I'm glad I've finally got this sorted out. If you are still having trouble, you might find some helpful information in the threads that finally helped me to overcome the issue and prompted this writeup.

Apr 3 2010

## Ditaa support for gitit

I hacked together a quick plugin for the most excellent gitit wiki today. It's written in Haskell, so it's an absolute pleasure to write code for it.

What I added support for is a neat little tool called ditaa (DIagrams Through Ascii Art). Basically, in the markdown source of your Gitit wiki you can now write something like the following:

~~~ {.ditaa}
+--------+   +-------+    +-------+
|        | --+ ditaa +--> |       |
|  Text  |   +-------+    |diagram|
|Document|   |!magic!|    |       |
|     {d}|   |       |    |       |
+---+----+   +-------+    +-------+
    :                         ^
    |       Lots of work      |
    +-------------------------+
~~~

The plugin will then call out to the ditaa command line tool (written in Java, boo!) to render that to a beautiful image:

To get this set up for yourself, try the following from the root of your Gitit wiki:
    git clone git://github.com/batterseapower/gitit-plugins.git batterseapower-plugins
    wget http://downloads.sourceforge.net/project/ditaa/ditaa/0.9/ditaa0_9.zip?use_mirror=kent -O ditaa0_9.zip
    unzip ditaa0_9.zip

Now edit your Gitit configuration file so the plugins list includes my plugin:
    plugins: batterseapower-plugins/Ditaa.hs

That's it - restart Gitit and you should be ready to go!

Dec 7 2008

I've often thought it would be a nice idea to mash together Eelco Dolstra's ideas about purely-functional software deployment with functional reactive programming to produce a purely functional build system. In this world, compilers are pure functions from source (.c files) to object (.o files) and these dependencies are encoded explicitly by argument-passing in the program rather than happening implicitly via the file system.
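
As a concrete sketch of that idea (in Python, with toy stand-ins for the compiler and linker - nothing here is a real build tool), dependencies become ordinary argument passing, and the store is keyed by a hash of each step's inputs, so identical inputs never rebuild:

```python
import hashlib

store = {}  # content-addressed store: hash of a step's inputs -> build product

def build(f, *inputs):
    """Run build step f purely: the same function applied to the same
    inputs always yields the cached product, never a rebuild."""
    key = hashlib.sha256(repr((f.__name__, inputs)).encode()).hexdigest()
    if key not in store:
        store[key] = f(*inputs)
    return store[key]

def compile_c(source):   # stand-in for a real C compiler
    return "object code for: " + source

def link(*objects):      # stand-in for a real linker
    return "executable of " + str(len(objects)) + " objects"

# Dependencies are explicit argument passing, not file-system side effects
o1 = build(compile_c, "int main(void) { return 0; }")
o2 = build(compile_c, "int helper(void) { return 1; }")
exe = build(link, o1, o2)
```

Re-running build with an unchanged source is a pure cache hit, which is exactly the rebuild-avoidance that Make approximates with timestamps.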

Having grappled with GNU Make somewhat for the last week, I thought I'd take another look at making this work nicely and made a little progress. However, that is not the topic of this post! In the course of research, Google turned up an interesting find: OMake, a build system written in OCaml. It has a number of niceish features over GNU Make, but the thing that really caught my eye was built-in support for LaTeX building - just the problem I'd had over the last week. Thus inspired, I had a go at integrating this support with one of my Literate Haskell documents, and this post will preserve the result for posterity.

OMake syntax is somewhat Make like, so our OMakefile opens with something you may find familiar:

########################################################################
# Phony targets are scoped, so you probably want to declare them first.
#

.PHONY: all clean
.DEFAULT: all


The notion of scoping is somewhat different from that of Make, and it includes the nice feature that you can introduce local scopes into your OMakefile with a "section" declaration - I don't use any of that good stuff here though.

Next I need to tell OMake how to build .lhs files, since it only understands .tex. I do this by declaring a function that generates rules when it is called (pretty cool!):

########################################################################
# Lhs2Tex builder.
#

LHS2TEX = lhs2TeX
LHSSTYLE = tt

Lhs2Tex(tex, lhs) =
    tex_name = $(addsuffix .tex, $(tex))
    lhs_name = $(addsuffix .lhs, $(lhs))

    # The build rule
    $(tex_name): $(lhs_name)
        $(LHS2TEX) --$(LHSSTYLE) -o $@ < $+

    # Return the tex file name
    value $(tex_name)

At this point I could also use the .SCANNER target to tell OMake how to determine the dependencies of a .lhs file, but I didn't need that for my situation, so I didn't bother to do it. Instead, let's plunge onwards into the declaration of the rules particular to my program:

########################################################################
# LaTeX configuration.
#

DOCUMENT = closure-elimination
GENERATED_TEX = closure-elimination

foreach(tex, $(GENERATED_TEX)):
    Lhs2Tex($(tex), $(tex))

LaTeXDocument($(DOCUMENT), closure-elimination)

This turns my file closure-elimination.lhs into closure-elimination.tex and then finally into a PDF using the built-in LaTeXDocument function. I've used a foreach loop to call my Lhs2Tex function just to show you that this can be done - the OMake system includes a whole rich programming language of its own, which even includes first-class functions!

Finally I need to define those phony targets I declared way up at the start of the post:

########################################################################
# Phony targets
#

all: $(DOCUMENT).pdf

clean:
    rm -f \
        $(addsuffix .tex, $(GENERATED_TEX)) \
        $(addsuffixes .pdf .dvi .aux .log .out, $(DOCUMENT)) \
        *.omc

My "clean" target is somewhat unsatisfactory. I rather think that the use of LaTeXDocument should mean that the LaTeX junk (.pdf, .dvi, .aux, .log, .out) should be automatically cleaned somehow, but I can't work out how to achieve that. (My functional build system would consider this stuff part of the "local side effects" of a build command - encoded through some sort of monad structure - and hence fair game for any "clean".)
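
A sketch of what I mean, with hypothetical names: if each build step reported its by-products alongside its main output, a generic clean could delete everything without per-project rules:

```python
def latex_step(doc):
    """Stand-in for running LaTeX: returns the main output plus the
    local side-effect files the run leaves behind."""
    output = doc + ".pdf"
    side_effects = [doc + ext for ext in (".aux", ".log", ".out")]
    return output, side_effects

def clean(steps, docs):
    """Collect everything any step produced; all of it is fair game."""
    junk = []
    for step in steps:
        for doc in docs:
            output, side_effects = step(doc)
            junk.append(output)
            junk.extend(side_effects)
    return junk

files_to_remove = clean([latex_step], ["closure-elimination"])
```

With side effects tracked in the step's return value rather than discovered on disk, the clean target falls out for free.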

Overall, OMake looks like a nice system - indeed, anything that handles the tedious business of running LaTeX over my document the right number of times is doing better than GNU Make in my book. I'll certainly consider using it in future, larger scale projects - but I'm a bit concerned the code has been abandoned, as the last release was in August of 2007...

Other alternatives for LaTeX projects include (if you want to stick with GNU Make) the Ultimate LaTeX Makefile. Another option might be a cute-looking thing I just discovered called rubber, which is a specialised LaTeX build tool.

Nov 11 2008

## Upgrading An Unactivated Windows Install To Parallels 4.0

This is a pretty obscure problem, but I'm going to put a post up about it on the off chance I can help someone else out. My regular reader (hi dad!) will probably find this of no interest and should give it a miss.

The situation I found myself in was upgrading a Boot Camp install of Windows Vista for the new release of Parallels Desktop 4.0 - no big deal, you may think. Unfortunately, I had forgotten that that particular install of Windows Vista wasn't activated, which caused the automatic upgrade process to bork, dropping me back to manual mode.

To complete the upgrade I needed to run the Parallels Tools setup executable. However, since I hadn't activated, I could only log in as far as getting the "please activate Windows now" screen. As it happened, I knew that I could get rid of this screen by feeding it the details of a Windows Vista license I own, but in order to do that I needed an Internet connection (I don't think my PAYG phone had enough credit on it for an extended Microsoft call centre experience). However, to get an Internet connection I had to install the Parallels Ethernet Connection drivers, and hence the Tools. Catch 22!

The workaround is convoluted, to say the least. First, we need a command prompt in the restricted Vista activation session. You do this by clicking any of the links in the activation window: they should cause a browser to open. From here, you can ask the browser to "Open a file" and direct it to C:\Windows\System32\cmd.exe - this should initiate "download" of the executable. Click the option to run the file and voila!

Now that you have a command prompt, the fun really begins. You might think you could just type D:\setup.exe and the Tools would begin installing, but life just isn't that simple - in Their infinite wisdom, Microsoft have imposed quotas on the resource consumption of the session they set up for the purposes of activation. This is probably the Right Thing to do from their POV, but it's just a pain in the arse for us.

The workaround is to get the internet connection working, so you can do the activation and hence lift the resource limits. To do this, create a floppy disk image containing the Windows 2000 drivers for a Realtek 8029AS adapter (you should be able to get those from here, until Realtek break their incoming links again). Personally I did this by using another virtual machine to download the files and extract them onto a new floppy disk image (you can create a blank image on the Floppy Drive tab of the VM settings). I would make the fruits of this labour available to you as a simple download if it were not for (unfounded?) fear of Realtek's highly trained attack lawyers.

Once you have the requisite image in your sweaty virtual paws you can proceed to mount it into the Vista VM. To finish up, type compmgmt.msc into that command prompt and update the drivers for the detected network adapter by searching for new ones on the A:\ drive.

You should now be free to run the online activation and break the Catch 22, allowing installation of the Tools - at this point feel free to help yourself to a cup of coffee and a ginger-snap biscuit to celebrate a difficult job done well (I know I did...).

I'm really quite surprised that I had to jump through this many hoops - the Realtek drivers allegedly come with Vista, for one thing. But - c'est la vie! It's also quite pleasing that the humble, long outmoded, floppy drive still has a place in solving modern IT problems.

Aug 31 2008

The Haskell community has built up a great resource: the Hackage Haskell package database, where we recently hit the 500-package mark!

One of those 500 packages was mine, I added another to their number just an hour ago, and I've got two more in the oven. Given, then, that I'm starting to maintain a few packages, I went to the trouble of automating the Hackage release process, and in this post I'm going to briefly walk through setting up this automated environment.

1. Install cabal-upload from Hackage. I'm afraid that at the time of writing this is not perfectly simple because it won't build with GHC 6.8 or above: this can be fixed with a new .cabal file, however, which I've made available here. (Edit: I've just noticed that this functionality seems to have been added to Cabal itself! You may just be able to use cabal upload. However, I'm not sure what the right config file location is for the next step).
2. Store your Hackage username and password where cabal-upload can find them, as directed by the cabal-upload documentation.
3. Copy the following shell script into a file called release in the root of your project (the same directory as the Setup.lhs file):
#!/bin/bash
#

echo "Have you updated the version number? Type 'yes' if you have!"
read version_response

if [ "$version_response" != "yes" ]; then
    echo "Go and update the version number"
    exit 1
fi

sdist_output=`runghc Setup.lhs sdist`
if [ "$?" != "0" ]; then
    echo "Cabal sdist failed, aborting"
    exit 1
fi

# Want to find a line like:
# Source tarball created: dist/ansi-terminal-0.1.tar.gz

# Test this with:
# runghc Setup.lhs sdist | grep ...
filename=`echo $sdist_output | sed 's/.*Source tarball created: \(.*\)/\1/'`
echo "Filename: $filename"

if [ "$filename" = "$sdist_output" ]; then
    echo "Could not find filename, aborting"
    exit 1
fi

# Test this with:
# echo dist/ansi-terminal-0.1.tar.gz | sed ...
version=`echo $filename | sed 's/^[^0-9]*\(.*\)\.tar\.gz$/\1/'`
echo "Version: $version"

if [ "$version" = "$filename" ]; then
    echo "Could not find version, aborting"
    exit 1
fi

echo "This is your last chance to abort! I'm going to upload in 10 seconds"
sleep 10

git tag "v$version"

if [ "$?" != "0" ]; then
    echo "Git tag failed, aborting"
    exit 1
fi

# You need to have stored your Hackage username and password as directed by cabal-upload
# I use -v5 because otherwise the error messages can be cryptic
cabal-upload -v5 $filename

if [ "$?" != "0" ]; then
    echo "Hackage upload failed, aborting"
    exit 1
fi

# Success!
exit 0

4. When you're ready to release something, simply run the shell script! Not only will this package up your project and upload it to Hackage, it will also add a version tag to your Git repository (obviously you should change this bit if you are using another VCS!).

If you would like to follow my continuing adventures in Haskell open source, please check out my GitHub profile! Patches gratefully accepted.

May 1 2008

## Fixing File Associations Eaten By Parallels Desktop

I use OS X, and so for those rare cases where I must deign to run a program designed for Windows I make use of my copy of it hosted in Parallels Desktop. Now, Parallels has been generally good to me, but I recently came across the problem documented in this forum thread where its ability to launch Mac applications from Windows caused some basic file associations in Windows-land to stop working. For example, Excel files were listed as "xls_auto_file" by Explorer and double clicking on them only gave me the "Open With" dialog: not very helpful.

The thread does suggest a means of solving the problem, by creating a batch file that re-associates a white-list of some of the possibly affected extensions, like this:

REM Restores MS Office File Type Associations
assoc .doc=Word.Document.8
assoc .dochtml=wordhtmlfile
assoc .docmhtml=wordmhtmlfile
assoc .docxml=wordxmlfile
assoc .dot=Word.Template.8
assoc .pot=PowerPoint.Template.8
assoc .pps=PowerPoint.SlideShow.8
assoc .ppt=PowerPoint.Show.8
assoc .rtf=Word.RTF.8
assoc .wbk=Word.Backup.8
assoc .xlc=Excel.Chart.8
assoc .xlm=Excel.Macrosheet
assoc .xls=Excel.Sheet.8
assoc .xlt=Excel.Template
assoc .xlw=Excel.Workspace

However, this felt a bit ad-hoc to me, and in particular some of the extensions that were affected for me were not on the list.
Thus, like any good programmer I went off and whipped up a utility designed to undo the damage inflicted on my poor defenseless HKEY_CLASSES_ROOT by Parallels. Using it is pretty simple: just click "Scan" and then "Fix Selected" repeatedly until the scan finds nothing more of interest.

Now, before I give you a download link to the utility, I need to give you the standard disclaimer to which you must agree in order to use it:

This utility is provided on an 'as is' basis, without warranties of any kind, and no warranty, express or implied, is given that the operation of the utility is correct or safe to use. I do not accept any liability for any error or omission. Use of the utility is at your own risk.

Sorry for the legalese, but this utility is modifying your registry and hence (though I feel it is highly unlikely) could get something quite, quite wrong. You would also be wise to have a backup of the HKEY_CLASSES_ROOT registry branch to restore in case you aren't happy with the changes made by the utility.

That said, without further ado here is the utility and its source code (C# 3.0), both of which (being of my sole authorship) I release into the public domain. I hope someone else finds it useful!

Mar 3 2008

## Google Images Plugin For Corripio

A simple post this time. Corripio is a program for OS X that lets you easily find album artwork for your music collection. However, it's a bit rough around the edges and its built-in web based mechanisms for retrieving this artwork are rather limited and frankly a bit rubbish.

I've managed to solve all these issues by writing a plugin to fix its unaccountable lack of support for Google Images. However, a word to the wise: this is probably against the terms of service, so make sure you only use this plugin for personal use!
#!/usr/bin/ruby

require 'net/http'
require 'open-uri'
require 'uri'
require 'cgi'

# Construct a query for good-sized Google images by taking the supplied keywords and adding "album"
query = (ARGV + ["album"]).join(" ")
sizes = ["medium", "large", "xlarge"].join("|")
uri = "http://images.google.co.uk/images?hl=en&q=#{CGI.escape(query)}&imgsz=#{CGI.escape(sizes)}"

source = open(URI.parse(uri)).read.gsub("\r\n", "")

# Look for javascript statements of the form dyn.Img("foo", "bar", "lol", "http://www.example.com/yay.jpg")
matcher = /dyn\.Img\(".*?",".*?",".*?","(.*?)"/
matches = source.scan(matcher)

# Push back the best 10 results
matches.first(10).each { |match| puts match.first }
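
For comparison, the extraction step can be sketched in Python; the dyn.Img(...) shape is the same one the Ruby plugin scrapes, and the sample text here is made up:

```python
import re

# Made-up fragment of the javascript the results page embeds
sample = ('junk dyn.Img("a","b","c","http://www.example.com/yay.jpg") more '
          'dyn.Img("d","e","f","http://www.example.com/two.jpg")')

# The fourth string argument of each dyn.Img(...) call is the image URL
matcher = re.compile(r'dyn\.Img\(".*?",".*?",".*?","(.*?)"')
urls = matcher.findall(sample)[:10]  # best 10 results, like the plugin
```

Being a scrape of an undocumented page format, either version will of course break whenever Google changes the markup.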

To install the plugin, copy the above code into a new file in ~/Library/Application Support/Corripio/Plugins and make sure it is marked executable. Now, you can drag that file into the Plugins section of the Corripio preference pane and then (optionally) disable the default image search plugins. When you're done the preferences pane should look something like this: