Wednesday, November 13, 2013

Apple Breaks Apache Configurations for Gitit (Again)

I'm not quite sure why I put myself through this, but I upgraded my Mac Pro to Mavericks. This broke my local Gitit wiki. The symptom was that Apache was unable to start, although nothing was written to the error logs. To determine what was wrong I used sudo apachectl -t. The installer did preserve my httpd.conf, but wiped out the mod_proxy_html.so library that I had installed in /usr/libexec/apache2. See the old entry that I wrote back when I fixed this for Mountain Lion here.

I installed Xcode 5 and I thought I was set, but there is more breakage. You might need to run xcode-select --install to get headers into /usr/include. The makefile /usr/share/httpd/build/config_vars.mk is still broken in Mavericks, so commands like sudo apxs -ci -I /usr/include/libxml2 mod_xml2enc.c won't work.

To make a long story short, I got the latest (development) version of the mod_proxy_html source, and these commands worked for me:

sudo /usr/share/apr-1/build-1/libtool --silent --mode=compile --tag=CC /usr/bin/cc -DDARWIN -DSIGPROCMASK_SETS_THREAD_MASK -I/usr/local/include -I/usr/include/apache2 -I/usr/include/apr-1 -I/usr/include/libxml2 -I. -c -o mod_xml2enc.lo mod_xml2enc.c && sudo touch mod_xml2enc.slo

and

sudo /usr/share/apr-1/build-1/libtool --silent --mode=compile --tag=CC /usr/bin/cc -DDARWIN -DSIGPROCMASK_SETS_THREAD_MASK -I/usr/local/include -I/usr/include/apache2 -I/usr/include/apr-1 -I/usr/include/libxml2 -I. -c -o mod_proxy_html.lo mod_proxy_html.c && sudo touch mod_proxy_html.slo

Previously, this gave me .so files in the generated .libs directory, but now I just get .o files, and I'm not sure that's what I want.

Sunday, August 11, 2013

More Crappy Print-on-Demand Books -- for Shame, Addison-Wesley "Professional"

So, a while back I wrote about some print-on-demand editions that didn't live up to my expectations, particularly in the area of print quality -- these Tor print-on-demand editions.

Now, I've come across one that is even worse. A few days ago I ordered a book from Amazon called Imperfect C++ by Matthew Wilson -- it's useful, thought-provoking material. Like the famous UNIX-Haters Handbook, it's written for people with a love-hate relationship with the language -- that is, those who have to use it, and who desperately want to get the best possible outcomes from using it, writing code that is as solid and portable as possible, and working around the language's many weaknesses. (People who haven't used other languages may not even be aware that something better is possible, and may dismiss complaints about the language as mere sour grapes; I'm not really talking to those people.)

The universe sometimes insists on irony. My first copy of Imperfect C++ arrived very poorly glued; the pages began falling out as soon as I opened the cover and began to read. And I am not hard on books -- I take excellent care of them.

So I got online and arranged to return this copy to Amazon. They cross-shipped me a replacement. The replacement is even worse:

Not only are the pages falling out, because they were not properly glued, but the back of the book had a big crease:

So I guess I'll have to return both.

I'll look into finding an older used copy that wasn't print-on-demand. But then of course the author won't get any money.

Amazon, and Addison-Wesley, this is shameful. This book costs $50, even with an Amazon discount. I will be sending a note to the author. I'm not sure there is much he can do, but readers should not tolerate garbage like this. Amazon, and Addison-Wesley, fix this! As Amazon approaches total market dominance, I'm reminded of the old Saturday Night Live parody of Bell Telephone: "We don't care. We don't have to. We're the Book Company."

Thursday, August 01, 2013

Arduino, Day 1

A friend of mine sent me a RedBoard and asked me to collaborate with him on a development idea. So I'm playing with an Arduino-compatible device for the first time. I've been aware of them, but just never got one, in part because after writing embedded code all day, what I've wanted to do with my time off is not necessarily write more embedded code.

I downloaded the Arduino IDE and checked that out a bit. There are some things about the way it's presented that drive me a little batty. The language is C++, but Arduino calls it the "Arduino Programming Language" -- it even has its own language reference page. Down at the bottom the fine print says "The Arduino language is based on C/C++."

That repels me. First, it seems to give the Arduino team credit for creating something that they really haven't. They deserve plenty of credit -- not least for building a very useful library -- but not for inventing a programming language.

Second, it fails to give credit (and blame) for the language to the large number of people who actually designed and implemented C, C++, and the GCC cross-compiler running behind the scenes, with its reduced standard libraries and all.

And third, it obfuscates what programmers are learning -- especially the distinction between a language and a library. That might keep things simpler for beginners, but this is supposed to be a teaching tool, isn't it? I don't think it's a good idea to blur the difference between the core language (for example, bitwise and arithmetic operators), macros (like min), and functions in the standard Arduino library. For one thing, errors in using each of these will produce profoundly different kinds of diagnostic messages or other failure modes. It also obscures something important: which C++ is this? C++ has many variations now. Can I use enum classes or other C++11 features? I don't know, and because of the facade that Arduino is a distinct language, it is harder to find out.

They even have the gall to list true and false as constants. If there's one thing C and C++ programmers know, and beginners need to learn quickly, it's that logical truth in C and C++ is messy. I would hate to have to explain to a beginner why testing a masked bit that is not equal to one against true does not give the expected result.

Anyway, all that aside, this is C++ where the IDE does a few hidden things for you when you compile your code. It inserts a standard header, Arduino.h. It links in a standard main(). I guess that's all helpful. And finally, it generates prototypes for your functions. That implies a parsing stage, via a separate tool that is not a C++ compiler.

On my Mac Pro running Mountain Lion, the board was not recognized as a serial device at all, so I had to give up using my Mac, at least until I can resolve that. I switched over to Ubuntu 12.04 on a ThinkPad laptop. The IDE works flawlessly. I tried to follow some directions to see where the code was actually built by engaging a verbose mode for compilation and uploading, but I couldn't get that working. So I ditched the IDE.

This was fairly easy, with the caveat that there are a bunch of outdated tools out there. I went down some dead ends and rabbit holes, but the procedure is really not hard. I used sudo apt-get install to install arduino-core and arduino-mk.

There is now a common Arduino.mk makefile in my /usr/share/arduino directory and I can make project folders with makefiles that refer to it. To make this work I had to add a new export to my .bashrc file, export ARDUINO_DIR=/usr/share/arduino (your mileage may vary depending on how your Linux version works, but that's where I define additional environment variables).

The Makefile in my project directory has the following in it:

BOARD_TAG    = uno
ARDUINO_PORT = /dev/serial/by-id/usb-*
include /usr/share/arduino/Arduino.mk

And nothing else! Everything else is inherited from the common Arduino.mk. I can throw .cpp and .h files in there, and make builds them and make upload uploads them.

If you have trouble with the upload, you might take a look at your devices. A little experimentation (listing the contents of /dev before and after unplugging the board) reveals that the RedBoard is showing up on my system as a device under /dev/serial -- in my case, /dev/serial/by-id/usb-FTDI_FT232R_USB_UART_A601EGHT-if00-port0 and /dev/serial/by-path/pci-0000:00:1d.0-usb-0:2:1.0-port0 (your values will no doubt vary). That's why my Makefile reads ARDUINO_PORT = /dev/serial/by-id/usb-* -- so it will catch anything that shows up in there with the usb- prefix. If your device is showing up elsewhere, or you have more than one device, you might need to tweak this to properly identify your board.
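If you want to see what a glob like that will match before committing it to your Makefile, you can experiment safely in a scratch directory (the device names below are just the ones from my system):

```shell
# Make a scratch directory that mimics /dev/serial/by-id
tmp=$(mktemp -d)
touch "$tmp/usb-FTDI_FT232R_USB_UART_A601EGHT-if00-port0"
touch "$tmp/other-device-port0"

# The usb-* pattern matches only the FTDI entry, not the other device
matches=$(ls "$tmp"/usb-* | wc -l)
echo "$matches"    # prints 1

rm -r "$tmp"
```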

When you look at the basic blink demo program in the Arduino IDE, you see this, the contents of an .ino file (I have removed some comments):

int led = 13;

void setup() {                
    // initialize the digital pin as an output.
    pinMode(led, OUTPUT);     
}

// the loop routine runs over and over again forever:
void loop() {
    digitalWrite(led, HIGH);   // turn the LED on (HIGH is the voltage level)
    delay(1000);               // wait for a second
    digitalWrite(led, LOW);    // turn the LED off by making the voltage LOW
    delay(1000);               // wait for a second
}

The Makefile knows how to build an .ino file and inserts the necessary header, implementation of main, and generates any necessary prototypes. But if you want to build this code with make as a .cpp file, it needs to look like this:

#include <Arduino.h>

int led = 13;

void setup() {
    // initialize the digital pin as an output.
    pinMode(led, OUTPUT);
}

// the loop routine runs over and over again forever:
void loop() {
    digitalWrite(led, HIGH);   // turn the LED on (HIGH is the voltage level)
    delay(1000);               // wait for a second
    digitalWrite(led, LOW);    // turn the LED off by making the voltage LOW
    delay(1000);               // wait for a second
}

int main(void)
{
    init();

#if defined(USBCON)
    USBDevice.attach();
#endif

    setup();

    for (;;) {
        loop();
        if (serialEventRun) serialEventRun();
    }

    return 0;

}

And there it is -- C++, make, and no IDE. Relaxen and watchen Das blinkenlights!

Tuesday, July 30, 2013

Lexx is Wretched

I have a fondness for science fiction series that are imaginative but not, as a whole, successful. Farscape, I'm talking about you. Even, occasionally, those that start out promising, but which turn into complete failures -- failure can occasionally be interesting. At least, it serves as an object lesson for how a story line can go so very far wrong. Andromeda, I've got your number. I can deal with very dated CGI -- Babylon Five is still generally good and often great. So I happened to come across discounted boxed sets of Lexx, the whole series, at my local Target store. They were dirt cheap. "How bad could it be?" I thought. Well, now I know. At least, I know part of the story.

First off, Lexx is not something I can show my kids -- pretty much at all. Season 1 has a surprising amount of very fake gore in it -- brains and guts flying everywhere. That didn't really bother them -- I think they got that the brains were made of gelatin -- but it was getting to me. Watching characters carved up by rotating blades, repeatedly; watching characters getting their brains removed -- that got old. Body horror, body transformation -- pretty standard stuff for B grade science fiction, or anything that partakes of the tropes of such, but not actually kid-friendly. So we didn't continue showing the kids.

Still, I thought it might make more sense to watch them in order, so I watched the second two-hour movie (1:38 without commercials). The second one has full frontal nudity, which startled me a bit. I'm not really opposed to looking at a nubile young woman, per se. There is some imaginative world-building and character creation here, but ultimately it's just incredibly boring. It's like the producers shot the material, not having any idea how long the finished product would be; they shot enough scenes to actually power an hour show (forty-plus minutes without commercials), but also shot a bunch of extended padding sequences, "just in case." And so after a repeated intro that lasts just under four minutes, we get a two-hour show with endless cuts to spinning blades slowly approaching female groins, huge needles slowly approaching male groins, countdown timers counting down, getting stopped, getting started, getting stopped... endless fight scenes, endless scenes of the robot head blathering his love poetry, a ridiculous new character eating fistfuls of brains... et cetera, et cetera, et cetera.

Every time something happens, I'd get my hopes up, thinking that maybe the writing has actually improved, but then it's time to slow down the show again, because we've still got an extra hour and twenty minutes to pad. And it's all distressingly sexist and grotesquely homophobic. Again, I'd be lying if I said that I didn't like to look at Eva Habermann in a miniskirt, but given that the actress is actually young enough to be my daughter, and especially given that she has so little interesting to do, and there's just not much character in her character -- it's -- well, "gratuitous" doesn't even begin to cover it. She's young, but Brian Downey was old enough to know better. And let's just say I'm a little disgusted with the choices the show's producers made. The guest stars in Season 1 are like a who-used-to-be-who of B actors -- Tim Curry, Rutger Hauer, Malcolm McDowell. There's material here for a great cult show -- but these episodes are mostly just tedious. They're actually not good enough to be cult classics.

The season consists of four two-hour movies. After watching the first movie, I didn't quite realize all four season one movies were on one disc, so when I tried to watch some more, I put in the first disc of season two by mistake. I watched the first few episodes of season two -- these are shorter. I didn't notice any actual continuity issues. In other words, nothing significant changes from the pilot movie to the start of season two. There are some imaginative satirical elements. Season 2, episode 3 introduces a planet called "Potatohoe" which is a pretty funny satire of the American "right stuff" tropes. But it's too little, and it amounts to too little, amidst the tedious general adolescent sex romp. Then we lose Eva Habermann, who was 90% of the reason I even watched the show this far. I'm honestly not sure if I can watch any more.

It doesn't help that several of the discs skip a lot. It might have something to do with the scratches that were on the discs when I took them out of the packaging, which come from the fact that the discs are all stuck together on a single spindle in the plastic box. And the discs themselves are all unmarked, identifiable only by an ID number, not any kind of label indicating which part of which season they hold -- so good luck pulling out the one you want.

I'm told the later seasons have some very imaginative story lines. People say good things about the third season. It seems like the universe has a lot of potential. Is it worth continuing, or am I going to be in old Battlestar Galactica's second season territory?

UPDATE: I have continued skimming the show. The scripts seem to get somewhat more interesting around season 2, episode 5, called "Lafftrak." It finally seems to take its darkness seriously enough to do something interesting with it, and not just devolve to pornographic settings. The pacing is still weak, but the shows start to feel as if they have a little bit of forward momentum. Of course, then in the next episode, we're back to Star Whores and torture pr0n...

Wednesday, July 24, 2013

The Situation (Day 135)

So, it's day 135. This is either the last covered week (week 20) of unemployment benefits, or I have three more; I'm not quite sure. Without a new source of income, we will run out of money to cover mortgage payments either at the end of September or the end of October. We have burned through the money I withdrew from my 401K in March when I was laid off. I've been selling some possessions, guitars and music gear, but this is demoralizing, and not sustainable. We don't have much more that is worth selling.

I was fortunate to have a 401K to cash out, and to get the food and unemployment benefits I've gotten -- so far I have been able to pay every bill on time and my credit rating is, so far, completely unscathed. But winter is coming. And another son is coming -- Benjamin Merry Potts, most likely around the middle of October.

Emotionally, the situation is very confusing. On the one hand, I have several very promising job prospects, and I'm getting second phone interviews. But these are primarily for jobs where I'd have to relocate, and a small number of possible jobs that might allow me to work from home. This includes positions in Manhattan and Maine. We're coming to grips with the fact that we will most likely have to leave Saginaw. It's a well-worn path out of Saginaw. We were hoping to stick with the road less traveled, but we can't fight economic reality single-handed. And we don't really have any interest in relocating within Michigan, again. If we're going to have to move, let's move somewhere where we won't have to move again -- someplace where, if I lose one job, there's a good chance I can quickly find another.

So, we are willing to relocate, for the right job in the right place. The right place would be the New England area -- Grace is fed up here, and I am too. Maine, Vermont, New Hampshire, Massachusetts, Connecticut, New York, or eastern Pennsylvania are all appealing. But it would not be a quick and easy process. It would probably involve a long separation from my family. I don't relish that idea, especially if my wife has a new baby. That might be what it takes, though. I'll do it for the right job and the right salary and the right place. In any case, we can't move with either a very pregnant woman or a newborn. It would not be a quick and easy process to sell, or even rent out, a house. A benefit to a permanent job in Manhattan is that it would pay a wage that is scaled for the cost of living there. It might be perfectly doable for me to find as cheap a living arrangement there as I can, work there, and send money home. A Manhattan salary would go a long way towards maintaining a household in Michigan, and helping us figure out how to relocate, and I'd probably be able to fly home fairly frequently.

I would consider a short-term remote contract job where I wasn't an employee, and didn't get benefits, and earned just an hourly wage. Let's say it was a four-hour drive away. I'd consider living away from home during the work week, staying in an extended-stay motel, and driving home on weekends. But it would have to pay well enough to be able to do that commute, pay for that hotel, and be able to send money home -- enough to pay the mortgage and bills. A per diem would help, but the contract work like this I've seen won't cover a per diem. We'd need to maintain two cars instead of one. Grace would need to hire some people for housekeeping and child care help. I wouldn't be there to spend the time I normally spend doing basic household chores and helping to take care of the kids.

Would I consider a contract job like that farther away -- for example, an hourly job in California? That's tougher. I think I could tolerate seeing my wife and kids only on weekends, if I knew that situation would not continue indefinitely. But if I had to fly out, that probably wouldn't be possible. California has very little in the way of public transportation. Would I have to lease a car out there, so I could drive to a job? Take cabs? It might make more sense to buy a used car, once out there. In any case, it would cost. Paying for the flights, the hotel, and the car, with no per diem, it's hard to imagine that I'd be able to fly home even once a month. Would I do a job like that if I could only manage to see my family, say, quarterly? Let's just say that would be a hardship. I would consider an arrangement like this if it paid enough. But the recruiters who are talking to me about these jobs are not offering competitive market rates. It doesn't seem like the numbers could work out -- I can't take a job that won't actually pay all our expenses.

The prospect of employment locally or within an hour commute continues to look very poor. I've applied for a number of much lower-paying IT or programming jobs in the region, and been consistently rejected. These jobs wouldn't pay enough to afford a long commute or maintain any financial security at all. In fact, I think we'd still be eligible for food stamps (SNAP) and my wife and kids would probably still be eligible for Medicaid. Their only saving grace is that they would pay the mortgage. Some of them might provide health insurance, at least for me. But I've seen nothing but a string of form rejections for these positions.

Grace and I don't get much quiet time -- we haven't had an actual date night, or an evening without the kids, since March. The closest we come is getting a sitter to watch the kids for a couple of hours while we run some errands. That's what we did last Sunday. I made a recording and turned it into a podcast. You can listen if you are interested.

Building a Podcast Feed File, for Beginners

I had a question about how to set up a podcast. I wrote this answer and thought while I was at it, I might as well polish up the answer just a bit and post it, in case it would be helpful to anyone else.

I'm starting a podcast and I need help creating an RSS feed. You're the only person I could think of that might know how to create such a thing. Is there any way you could help me?

OK, I am not an expert on podcasts in general because I've only ever created mine. I set mine up by hand. I'll tell you how I do that, and then you can try it that way if you want. You might prefer to use a web site that does the technical parts for you.

A podcast just consists of audio files that can be downloaded, and the feed file. I write my feed files by hand. I just have a hosting site at DreamHost that gives me FTP access, and I upload audio files to a directory that is under the root of one of my hosted web site directories. For example: http://thepottshouse.org/pottscasts/gpp/

The feed file I use, I write with a text editor. I use BBEdit, which is a fantastic text editor for the Macintosh that I've used for over 20 years, but any text editor will do. For the General Purpose Podcast, this is the feed file: http://thepottshouse.org/pottscasts/gpp/index.xml

The feed file contains information about the podcast feed as a whole, and then a series of entries, one for each episode (in my case, each audio file, although they don't strictly have to be audio files; you can use video files). When I add an audio file, I just add a new entry that describes the new audio file.

This is a slight simplification. I actually use a separate "staging" file for testing before I add entries to the main podcast feed. The staging file contains the last few episodes, and I have a separate subscription in iTunes to the "staging" podcast for testing purposes. When I upload a new episode MP3 file, I test it by adding an entry to the staging index file here: http://thepottshouse.org/pottscasts/gpp/index_staging.xml

So I add an entry to test, and then tell iTunes to update the staging podcast. If it works OK and finds a new episode, downloads it, and it comes out to the right length, and the tags look OK, then I add the same entry to the main index file.

I have a blog for the podcast too. That's a separate thing on Blogger, here: http://generalpurposepodcast.blogspot.com That just provides a jumping-off point to get to the episodes, and something I can post on Facebook or Twitter. For each episode I just make a new blog post and write a description and then include a link to the particular MP3 file. The blog in the sidebar also has links to the feeds and to the iTunes store page for the podcast. I'll get to the iTunes store in a minute.

Oh, writing the entry in the feed file is kind of a pain. You have to specify a date, and it has to be formatted correctly, and it has to have the right GMT offset, which changes with daylight saving time. You have to specify the exact number of bytes in the file and the length in hours, minutes, and seconds. If you get these wrong, the file will not be downloaded correctly -- it will be cut off. The URL needs to be URL-escaped; for example, spaces become %20, etc.
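For what it's worth, the fussiest of those values can be computed rather than eyeballed. Here's a sketch assuming GNU coreutils (the file contents, date, and file names are made up for illustration; BSD/macOS date spells the -d flag differently):

```shell
# Exact size in bytes, for the enclosure's length attribute
printf 'fake mp3 data' > episode.mp3     # stand-in for a real MP3
bytes=$(wc -c < episode.mp3)
echo "$bytes"                            # prints 13

# RFC-822 pubDate with an explicit GMT offset (GNU date's -d flag)
pubdate=$(date -u -d '2013-07-21 14:30:00' +'%a, %d %b %Y %H:%M:%S %z')
echo "$pubdate"                          # prints Sun, 21 Jul 2013 14:30:00 +0000

# Simple-minded URL escaping: spaces become %20
escaped=$(printf '%s' 'My Episode 1.mp3' | sed 's/ /%20/g')
echo "$escaped"                          # prints My%20Episode%201.mp3

rm episode.mp3
```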

If I upload the file to my hosting site first, so that I can see the file in my web browser, and copy the link, it comes out URL-escaped for me, so that part is easy. I paste that link to the file into the feed file entry for the episode. The entry gets a link to the file, and then there is also a UID (a unique ID for the episode). Personally, I use the same thing for both the UID and the link, but they can be different. The UID is how iTunes (or some other podcast reader) decides, when it reads your feed file, whether it has downloaded that file already, or whether it needs to download it again. So it's important to come up with a scheme for UIDs and then never change them, or anyone who subscribes to your podcast will probably either see errors or get duplicated files. In other words, even if I moved the podcast files to a different server, and the link needed to be changed, I would not change the UIDs of any of the existing entries.
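Putting those pieces together, a single entry in the feed file looks roughly like this (all the URLs, dates, and sizes here are invented; my real index.xml is the working example):

```xml
<item>
  <title>Episode 42: An Example</title>
  <link>http://example.com/pottscasts/gpp/Episode%2042.mp3</link>
  <guid>http://example.com/pottscasts/gpp/Episode%2042.mp3</guid>
  <pubDate>Sun, 21 Jul 2013 14:30:00 +0000</pubDate>
  <enclosure url="http://example.com/pottscasts/gpp/Episode%2042.mp3"
             length="12345678" type="audio/mpeg" />
  <itunes:duration>1:02:03</itunes:duration>
</item>
```

Note that the guid here happens to equal the link, which is my personal convention; the length attribute is the exact byte count, and itunes:duration is the running time.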

Once you have your feed file, you can check it with the feed validator -- and you definitely should do this before giving it out in public or submitting it to the iTunes store. See http://feedvalidator.org I try to remember to check mine every so often just to make sure I don't have an invalid date or something like that. If the feed is not working, this thing might tell you why.

OK, the next thing is iTunes integration. The thing to keep in mind here is that Apple does not host any of your files or your feed. You apply to be in the podcast directory, and then someone approves it, and the system generates a page for you on Apple's site. Once a day or so it reads your feed file and updates that page. The point here is that if someone is having problems with your page on iTunes, it is probably not Apple's fault, it is probably a problem with your feed or your hosted audio files.

If you don't want to do this all manually there are sites that will set up your feed for you automatically, like libsyn.com and podbean.com. I am not sure which one is best and I have not used them.

This is Apple's guide that includes information on how to tag your files in the feed -- you could start out with mine as an example, but this is the de facto standard for writing a podcast feed that will work with iTunes and the iTunes store: http://www.apple.com/itunes/podcasts/specs.html

OK, now you know just about everything I know about it. Oh, there is one more thing to talk about. This part is kind of critical.

So you create an audio file -- I make a WAV file and then encode it into an MP3 file either in Logic or in iTunes. My recent spoken word files are encoded at 128 Kbps; if I'm including music I would use a higher bit rate. Some people compress them much smaller, but I am a stickler about audio quality and 128 Kbps is about as much compression as I can tolerate.

You then have to tag it. This actually changes data fields in your MP3 file. The tagging should be consistent. You can see how my files look in iTunes. If the tagging is not consistent then the files will not sort properly -- they won't group into albums or sort by artist and that is a huge pain. When files get scattered all over your iTunes library, it looks very unprofessional and I tend to delete those podcasts. But note that the tags you add are not quite as relevant as they would be if you were releasing an album of MP3 files, and here's why -- podcasts have additional tags that are added by your "podcatcher" -- iTunes, or some other program that downloads the podcast files.

So you tag your MP3 file, and take note of the length (the exact length in bytes and the length in hours, minutes, and seconds), so that you can make a correct entry in your feed file. The MP3 file is the file you upload, but note that this file is not actually a podcast file yet. It doesn't show up in "Podcasts" under iTunes. It becomes a podcast file when iTunes or some other podcatcher downloads it. iTunes reads the metadata from the feed file (metadata is data about a file that is not in the file itself) and it uses parts of that metadata, like the podcast name, to add hidden tags to the MP3 file. Yes, it changes the file -- the MP3 file on your hard drive that is downloaded will not be exactly the same file you put on the server. This is confusing. But it explains why, if you download the MP3 file directly and put it in your iTunes library, rather than letting iTunes download it as a podcast episode, it won't "sort" -- that is, it won't show up as an iTunes podcast under the podcast name.

At least, that has been true in the past. I think recent versions of iTunes have finally added an "advanced" panel that will let you tell iTunes that a file is a podcast file, but sorting it into the proper podcast this way might still be tricky. So the key thing is that you might want to keep two sets of files: your properly tagged source files, because those are the ones you would upload to your site if, for example, your site lost all its files, or if you were going to relocate your site to a new web server; and also the files after they have been downloaded and tagged as podcasts by iTunes. I keep them separately. If someone is missing an episode, I can send them the podcast-tagged file, and they can add it to their iTunes library and it will sort correctly with the other podcast files.

OK, now you pretty much know everything I know about podcast feeds. I prefer doing it by hand because I'm a control freak -- I like to know exactly what is happening. I like to tag my files exactly the way I want. But if you're not into that -- if you don't know how to upload and download files of various kinds and tag MP3 files, for example -- you probably want to use something like Libsyn. Or maybe you know what to do but just want to save time. I just know that I've sometimes been called on to help people using these services fix their feeds after they are broken, or they need to relocate files, and it isn't pretty, so I'll stick to my hand-rolled feed.

Monday, July 08, 2013

Building Repast HPC on Mountain Lion

For a possible small consulting project, I've built Repast HPC on Mountain Lion and I'm making notes available here, since the build was not simple.

First, I needed the hdf5 library. I used hdf5-1.8.11 from the .tar.gz. This has to be built using ./configure --prefix=/usr/local/ (or somewhere else if you are doing something different to manage user-built programs). I was then able to run sudo make, sudo make check, sudo make install, and sudo make check-install and that all seemed to work fine (although the tests take quite a while, even on my 8-core Mac Pro).

Next, I needed to install netcdf. I went down a versioning rabbit hole for a number of hours with 4.3.0... I was _not_ able to get it to work! Use 4.2.1.1. ./configure --prefix=/usr/local, make, make check, sudo make install.

Next, the netcdf-cxx, the C++ version. I used netcdf-cxx-4.2 -- NOT netcdf-cxx4-4.2 -- with ./configure --prefix=/usr/local/

Similarly, boost 1.54 had all kinds of problems. I had to use boost 1.48. ./bootstrap.sh --prefix=/usr/local and sudo ./b2 ... the build process is extremely time consuming, and I had to manually install both the boost headers and the compiled libraries.

Next, openmpi 1.6.0 -- NOT 1.6.5. ./configure --prefix=/usr/local/ seemed to go OK, although it seems to run recursively on sub-projects, so it takes a long time, and creates hundreds of makefiles. Wow. Then sudo make install... so much stuff. My 8 cores are not really that much help, and don't seem to be busy enough. Maybe an SSD would help keep them stuffed. Where's my 2013 Mac Pro "space heater" edition, with a terabyte SSD? (Maybe when I get some income again...)

Finally, ./configure --prefix=/usr/local/ in repasthpc-1.0.1, and make succeeded -- after about 4 hours of messing around with broken builds. I had a lot of trouble with build issues for individual components, and then problems with Repast HPC itself despite everything else building successfully, before I finally found this e-mail message chain, which had some details about the API changes between different versions and laid out a workable set of libraries:

http://repast.10935.n7.nabble.com/Installing-RepastHPC-on-Mac-Can-I-Install-Prerequisite-Libraries-with-MacPort-td8293.html

They suggest that these versions work:

drwxr-xr-x@ 27 markehlen  staff    918 Aug 21 19:14 boost_1_48_0 
drwxr-xr-x@ 54 markehlen  staff   1836 Aug 21 19:19 netcdf-4.2.1.1 
drwxr-xr-x@ 26 markehlen  staff    884 Aug 21 19:20 netcdf-cxx-4.2 
drwxr-xr-x@ 30 markehlen  staff   1020 Aug 21 19:04 openmpi-1.6 
drwxr-xr-x@ 31 markehlen  staff   1054 Aug 21 19:28 repasthpc-1.0.1
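To summarize the whole recipe in one place, here is a consolidation of the commands scattered through this post -- a sketch, not a tested script. Each line assumes the matching tarball has been unpacked in the current directory:

```shell
# hdf5 1.8.11
(cd hdf5-1.8.11 && ./configure --prefix=/usr/local/ && sudo make && sudo make check && sudo make install)

# netcdf 4.2.1.1 (4.3.0 did NOT work for me)
(cd netcdf-4.2.1.1 && ./configure --prefix=/usr/local && make && make check && sudo make install)

# netcdf-cxx 4.2, the C++ bindings (netcdf-cxx-4.2, NOT netcdf-cxx4-4.2)
(cd netcdf-cxx-4.2 && ./configure --prefix=/usr/local/ && make && sudo make install)

# boost 1.48 (1.54 failed; headers and libraries then have to be copied by hand)
(cd boost_1_48_0 && ./bootstrap.sh --prefix=/usr/local && sudo ./b2)

# openmpi 1.6 (NOT 1.6.5)
(cd openmpi-1.6 && ./configure --prefix=/usr/local/ && sudo make install)

# finally, Repast HPC itself (my notes only confirm that make succeeded)
(cd repasthpc-1.0.1 && ./configure --prefix=/usr/local/ && make)
```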

And that combination did seem to work for me. I was able to run the samples (after changing some directory permissions) with:

mpirun -np 4 ./zombie_model config.props model.props
mpirun -np 6 ./rumor_model config.props model.props

---

Notes on building Boost 1.54: doing a full build yielded some failures, with those megabyte-long C++ template error messages, so I had to build individual libraries. The build process doesn't seem to honor the prefix and won't install libraries anywhere but a stage directory in the source tree. I had to manually copy files from stage/lib into /usr/local/lib and manually copy the boost headers. There is an issue with building mpi, too:

./bootstrap.sh --prefix=/usr/local/ --with-libraries=mpi --show-libraries

sudo ./b2

only works properly if I first put a user-config.jam file in my home directory containing "using mpi ;". Then I have to manually copy the boost mpi library.
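Concretely, the Boost.MPI workaround looks something like this. It's a sketch: the copy destinations are just what worked for me, not anything official.

```shell
# b2 reads ~/user-config.jam; this line tells Boost.Build to enable MPI.
echo "using mpi ;" > ~/user-config.jam

cd boost_1_48_0
./bootstrap.sh --prefix=/usr/local/ --with-libraries=mpi
sudo ./b2

# b2 only stages the results in the source tree; install by hand.
sudo cp stage/lib/libboost_mpi* /usr/local/lib/
sudo cp -R boost /usr/local/include/
```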

Notes on building netcdf-cxx4-4.2: I had to use sudo make and sudo make install, since it seems to write build products into /usr/local/ even before doing make install (!)

Sunday, July 07, 2013

Are You Experienced?

A recruiter recently asked me to answer some questions for a client, so I did. I thought it might be worthwhile to save the questions and my answers and make them public so that I can refer people to them.

How many years of C++ experience do you have & have you worked with C++ during the last 2 years?

I've been using both C and C++ since before the C89/C90 standard and well before the C++98 standard. I taught myself C programming in college -- I did not learn it in a class. I initially used C++ when there were various sets of object-oriented extensions to C like THINK C and Microsoft's "structured exception handling" for Windows NT 3.5.

It's hard to provide an exact "number of years." At some positions I worked more in plain old C, or Java, or NewtonScript or other languages, but even in those jobs there were often times where I was working with small C and/or C++ projects on the side.

I own copies of the ISO standards for C and C++ (C90, C99, and C++03) and used to study them for my own edification, so that I could write more portable code. I used to subscribe to the C++ Report magazine. I used to write C and C++ interview test questions for screening people at the University of Michigan. I own dozens of books on C++ and have studied them extensively. I was definitely a C++ expert, although I was much more of an expert on C++03 than C++11. I am not so interested in the "cutting edge" of C++ these days (see below for notes about STL and C++11/C++0x). For example, here's a blog post I wrote about the C++ feature "pointers to member functions," in 2006:

http://praisecurseandrecurse.blogspot.com/2006/08/generic-functions-and-pointers-to.html

I have used the following compilers and frameworks for paid work (off the top of my head, these are the major tools I've used, and I am probably forgetting some):

THINK C / Think Class Library

MPW C/C++

Borland C++ with the Object Windows Library and TurboVision for MS-DOS

Microsoft Visual C++ starting with 1.0 / MFC

CodeWarrior / PowerPlant class library and Qt

XCode (aka ProjectBuilder) / CoreAudio

GCC ("g++") / Lectronix AFrame library

TI Code Composer Studio

In addition, I have some experience with static checkers (Lint, Understand for C/C++, QAC, etc. -- more are mentioned on my resume), and I would say they are a must for large commercial code bases. I have also worked with profilers, various run-time debuggers, and tools such as valgrind -- these are incredibly useful for finding bugs, especially bugs involving uninitialized memory.

So, how do you put that in an exact number? I'd say I've used C++ daily for perhaps 12 years, but even when I was not using C++ as my primary development language for a given job or set of projects, I used it at least a little bit every year for the last 24 years. So somewhere in between those numbers.

In the Last Two Years

Yes, the most recent project was a server for Lectronix, something called the PTT Server, that sits on top of the AFrame framework and receives requests via JSON-RPC, and manages the state of all the discrete IO in the system. It is a multi-threaded application using message queues and hierarchical state machines. The server is not very big, maybe 7,500 lines of code, and the top layer of it is actually generated by some internal Python scripts. During this period, I was also maintaining and adding new features to several other servers and drivers as needed.

If the client wants to know whether I am familiar with C++11/C++0x, the answer is "not very much." I have not studied the C++11 changes in any depth yet, so I am only slightly familiar with features like enum classes and lambdas. At Lectronix, we chose not to try to adopt new features for an existing multi-million-line code base, and we stuck with slightly older, well-tested versions of our compilers. I have definitely used STL, but we do not use it heavily in embedded projects, because of a conservative attitude towards memory use and hidden costs. We also tend to avoid features like multiple inheritance in embedded programming, although I have used them in the past. We tend to deliberately use a conservative subset of C++.

While I consider myself an expert on C++, it is not the be-all, end-all of programming languages. Learning other languages has made me a much better programmer, able to see problems from different perspectives. On several occasions I have prototyped in other languages, such as Dylan or Haskell, to refine a _design_, and then ported the design to C++ to produce the shipping product.

I believe the industry is gradually moving towards functional programming and languages such as Scala (which runs on the JVM) or Haskell (which can now generate code for ARM on "bare metal"), and towards embeddable scripting languages on top of C/C++ for configurability (for example, Lua on top of a back end written in C++ is the structure of most high-performance commercial video games). Even sticking with plain C++, there is no denying that Clang/LLVM is a very promising development -- Clang has the best error-checking and static analysis I've seen so far for C++, and for Objective-C this static analysis has enabled a feature called ARC -- automatic reference counting, which is basically garbage collection without a separate background task that can create "pauses."

I have a strong interest in figuring out how to use tools such as these to make a business more competitive -- specifically, reducing line counts and bug counts and improving time to market. If the client is not interested in any of that, I'm probably not actually going to be the best fit for them, since they will not be making maximum use of my skills. I see myself as a full-stack software developer who should be able to choose the best tools for the job, not strictly as a C++ programmer.

What recent steps have you taken to improve your skills as a software developer?

Recently, while still at Lectronix, I was asked to re-implement some logic for handling the "PTT" (push-to-talk) signals for police radios, microphones, and hand controllers. My supervisor wanted me to use a library for handling HSM (hierarchical state machine) designs. I had never worked with hierarchical state machines, just flat state machines, so this was a bit of a challenge. My first drafts of the state machines were not very good, but I arranged to meet with some other developers on my team who had more experience with HSM designs, and improved them. After a couple of revisions I got the state machines up and running; they were simpler than the original design, they passed all my testing and all the bench testing, and we shipped them in a prototype for a police motorcycle product. So I now feel that I understand the basics of using hierarchical state machines as a design tool.

While unemployed, and although the job search itself takes up a great deal of time, I have been working on teaching myself Objective-C programming, something I've wanted to learn for a long time, and the basics of Apple's iOS framework for developing iPhone and iPad applications. My goal is to get a simple game up and running and available in the app store as a small demonstration project to show potential employers. Even if the first version is not sophisticated it should prove that I can build on those skills. I am doing this work "in public" -- sharing the code on github and writing about the design experiments and trade-offs on my blog. Here is one of my blog posts about the Objective-C implementation of the game:

http://praisecurseandrecurse.blogspot.com/2013/06/objective-c-day-5.html

The latest code is available on GitHub here: https://github.com/paulrpotts/arctic-slide-ios

I am also attempting to master some of the slightly more advanced features of the Haskell programming language, and functional programming in general, on the grounds that I believe that properly using functional languages such as F#, Scala, and Haskell can provide a competitive advantage, and give me the chance to bring that advantage to an employer.

Describe any experience you have in developing desktop applications.

Just to expand on some items from my resume and some that aren't:

In college I worked with a mathematics faculty member to develop an instructional multimedia tool using HyperCard and XCMD plug-ins that I wrote in THINK C for teaching calculus. I developed various other small applications for "classic" MacOS too, including a tool to edit version resources and a startup extension.

At the University of Michigan's Office of Instructional Technology, I built several instructional multimedia applications for students -- based around HyperCard stacks with custom XCMD and XFCN plug-ins written in C, Toolbook programs that integrated content from videodiscs, and a Visual BASIC application that used digital video to teach manufacturing process re-engineering techniques to business school students.

As a side project, I used Borland C++ and the TurboVision for MS-DOS framework, and also Visual BASIC for MS-DOS, to develop a survey application "front end" (to collect data at remote sites) and "back end" (to read the data from discs and aggregate it and display statistics) for the National Science Teachers Association (NSTA).

At Fry Multimedia, I built a prototype, with one other developer, in C++ using the MFC framework, of a Windows application to search a large CD-ROM database of compressed business data called "Calling All Business." This featured a "word wheel" feature that would match entries in the database and display matches while the user typed in search strings.

At the University of Michigan Medical Center I wrote, among other things, a Newton application that administered surveys to people in the ER waiting rooms. This was not desktop so much as "palmtop" but the same emphasis on user-centered design was there. I also either wrote entirely, or collaborated with another developer on, several internal applications, such as a Macintosh application written in C++ to upload data from the Newton devices, an application written using Metrowerks CodeWarrior (in C++ using the PowerPlant framework) to use AppleEvents to control Quark XPress in order to print batches of customized newsletters while providing text-to-speech feedback.

At Aardvark Computer Systems I completely rewrote the GUI application for controlling the company's flagship sound card product, the Aardvark Direct Pro Q10. This featured a mixer interface with knobs and sliders and animated meters to display audio input and output levels on all channels, persistent storage of mixer settings, and was built using C++ and the Qt framework. I also ran the beta-test program for this software.

At Lectronix, my work did not focus on desktop applications but I was able to occasionally contribute code to the Infotainment system GUIs, written in C++ using the Qt framework.

Describe any experience you have in developing server-side applications.

The bulk of my work at InterConnect was revisions to a server application written in Java that ran on Sun machines, and parsed "page collections" (bundles of scanned page images) along with metadata, a combination of XML including Library of Congress subject heading data and MARC records, to populate Oracle databases. These were large collections (terabytes, and at the time that was an unusually large amount of data to put into a web application). The data was housed on the client's EMC storage RAID arrays (at the time, very high-end systems). A full run of the program to populate a "page collection" would take several days. I worked with the client's Oracle team to debug issues with their database, particularly stored procedures written in PL/SQL, and with their production team to try to determine the best strategies for data issues (I wrote code to "clean" this data). The client was ProQuest (formerly University Microfilms International and Bell and Howell Information and Learning), and I worked specifically on the back-end for the Gerritsen Women's History collection and Genealogy and Family History collection. When InterConnect handed over development to ProQuest's internal team I wrote documentation on the import process and gave a presentation to explain it to their team.

Much of my work at Lectronix was also server-side applications, in the sense that all the code on products like the Rockwell iForce system was divided into drivers, servers, clients, and GUI code. Servers interact with clients and other servers using a network sockets interface wrapped in the Lectronix proprietary framework. So, for example, the Audio Zone Manager (AZM) server receives all requests as remote procedure calls and handles multiple clients. For some complex tasks like "priority audio" text-to-speech prompts it sets up a session where a client requests a session, the AZM lowers the level of "foreground" audio such as FM radio, the requesting client is granted a token, and then must make "keepalive" messages using the token, in order to keep the priority audio active. Multiple clients can request priority audio using different priority levels and the AZM must be able to handle requests that are "immediate" (only valid now) or requests which can be deferred, queue up these requests, and manage termination of expired priority audio sessions if a client process fails to send "keepalive" messages.

The more recent PTT server has a similar, multi-threaded design, where multiple instances of hierarchical state machines are fed messages via a serializing message queue, and there were various APIs to drivers that the code called, some that returned immediately, and some which blocked and returned only when a new state was available from the DSP (for example).

These are two examples; depending on what is meant, some other things I've worked on might qualify. For example, applications that support AppleEvents, wait for serial data, or run as "daemons" handling audio transfer between a driver and user-space applications on MacOS X, or run as interrupt-driven tasks to mix audio data.

Describe any experience you have in developing web applications.

I am not familiar with Microsoft web frameworks like ASP.NET so if this employer is looking for someone who would "hit the ground running" to develop web applications using that framework, I'm not that guy. I would be willing to learn, though, and I think I have a track record that indicates that I could.

I am not a database guru -- I have worked with databases and solved problems with databases, and I can understand basic database queries and the basics of database design but I am not an expert on (for example) query optimization for SQL Server; again, that is something I'd have to spend time learning.

I have not recently developed web applications using stacks such as LAMP (Linux, Apache, MySQL, PHP) or Ruby on Rails. However, I did work on several early web applications using Perl CGI scripts and plain old HTML -- for example, at Fry Multimedia I developed the web site for the Association of American Publishers. I was a beta-tester for Java before the 1.0 release and wrote early experimental "applets."

At the University of Michigan I used Apple's WebObjects (Java, with the WebObjects framework) to port my design for the Apple Newton survey engine to the web.

Later, while at InterConnect, I did some work (fixing bugs) on InterConnect's Perl and Java framework for "front-end" web sites -- the engine that generated HTML from database queries and templates, and made fixes to the web applications themselves in Perl, although this work was not my primary focus (see above).

I can solve basic Apache configuration issues and I've done small projects to set up, for example, a personal Wiki written in Haskell on an Ubuntu server. For example, I had to do some problem-solving to get this functioning under Mountain Lion (MacOS X 10.8). I wrote recently about this here:

http://geeklikemetoo.blogspot.com/2012/07/apple-breaks-my-gitit-wiki-under.html

What development project have you enjoyed the most? Why?

I have enjoyed a lot of them, but I'll give you one specific example. Back at the Office of Instructional Technology at the University of Michigan I worked with a faculty member in the School of Nursing to develop a program to teach nursing students about the side effects of antipsychotic medications. For this project we hired an actor to act out various horrible side effects, from drowsiness to shuffling to a seizure, and videotaped him. I enjoyed this project a lot because I got to collaborate with several people, including Dr. Reg Williams. I got to have creative input at all levels, from conception to final development -- for example, while developing the ToolBook application, I added animations to show how neurotransmitters move across synapses. I learned a number of new skills, including how to light a video shoot and how to use non-linear video-editing equipment (an Avid Composer), and I got to see the final system in use and receive positive feedback.

So I would say that what I liked most about that project was (1) being involved in the design and implementation at all stages, (2) the chance to work with some very collegial and welcoming people, (3) being able to learn several new skills, and (4) being able to "close the loop" and see how the final product was received by the customers (in this case, faculty and students). I have since worked in many development situations where different parts of that process were lacking or non-existent, and that often makes projects less enjoyable.

Here's a "runner-up." In 2000 I did some consulting work for Aardvark Computer Systems, assisting them with getting their flagship audio card working on Macintosh systems. Where the multimedia application I worked on several years earlier was high-level, this was very low-level: the issues involved included data representation ("big-endian" versus "little-endian"), and the low-level behavior of the PCI bus (allowed "shot size" and retry logic). Debugging this involved working closely with an electrical engineer who set up wires on the board and connected them to logic analyzer probes, and configuring GPIO pins so that we could toggle them to signal where in the DSP code we were executing. This was tedious and fragile -- even while wearing a wrist strap, moving around the office near the desk could produce enough static electricity to cause the computer to reboot. The solution eventually required a very careful re-implementation of the PCI data transfer code using a combination of C and inline Motorola 56301 DSP assembly language. I had to pull out a lot of very low-level tricks here, paying close attention to the run-time of individual assembly instructions from the data sheet, borrowing bits from registers to count partially completed buffers, studying chip errata and documentation errors in the data sheet, and dealing with a compiler that did not properly support ISO standard C and would let you know this by, for example, crashing when it saw a variable declared const. This also was very enjoyable for the chance to work very close to the hardware, learning new things, solving a difficult problem, working in close collaboration with a talented engineer, and getting the chance to actually ship the solution we came up with. In retrospect we forget the tedium and bleary eyes and remember the success.

The Situation (Day 118)

So. Day 118 of unemployment. Almost four months. It's getting hard to stay positive and keep anxiety at bay. Here's what's going on.

It might sound hopelessly naive, but I didn't think it would be this hard to find another job. I know I've been quite fortunate in some ways with respect to my career -- being very into, and good at, computer programming through the '90s and 2000s was a good place to be. I've been out of work, briefly, a few times before, when the small businesses or startups I worked for shrunk or imploded, but I've never had much difficulty finding my next job, and the job changes have generally been "upgrades" to higher pay, or at least bigger projects and more responsibility.

The job market is certainly bad right now, and especially bad locally. I am trying to be both realistic and optimistic at the same time -- realistically, it seems to be absolutely useless, for the most part, to apply for publicly-posted jobs. I've applied for dozens -- it would have been more, if there were more posted, but while there are a lot of job listings, it doesn't make any sense for me to apply for jobs that will not pay enough to cover our mortgage; if I got one, we'd still have to move. And we are still trying to figure out how to avoid that, so that we don't lose everything we've put into our house.

Working with recruiters has been an overwhelmingly negative experience as well, although there have been a few bright spots that have led to good leads and interviews. I'm really fed up with applying for a job listing for Saginaw or Flint only to find out that I'm actually contacting a recruiter about a position in Alabama or Mississippi or Florida. I've talked to recruiters at length who, it turned out, didn't even know the company they were recruiting for, because they were actually working for another recruiter. Is there even an actual job posting at the end of that chain of recruiters, or am I putting effort into some kind of scam? I don't know. I've also put a considerable amount of time into interviewing for contract positions, making the case that I am a strong candidate, only to be completely low-balled on hourly rate, to the point where it would make no economic sense whatsoever for me to take that job (for example, a six-month C++ programming contract out of state, in Manhattan, for a major bank, where I'm expected to be pleased to accept $50 an hour and no per diem or travel expenses).

My wife suggests that in the market right now, it will basically be impossible to find a job without having a job, except through personal contacts. That's discouraging, but she is probably right. And one difficulty is that I just don't have a lot of personal contacts in the area, since we've only been here three years. I have a few, and they've been trying to help me, but in general the leads (even with referrals from people who already work in the companies) have not yielded much that is promising -- usually a series of web forms where I upload a resume, then describe my work experience again in detail, write and upload a cover letter, fill out an elaborate series of questions -- this can and often does take two or three hours -- and then hear nothing whatsoever about the job again. For most of these, there is no person to contact -- no name, no phone number, no e-mail address. I'm faceless to the company, and they are faceless to me. That's just not a good prospect.

Still, I have a generalized feeling that the right thing will come along, at least for the short term. Essentially, I have to keep believing that. I keep feeling optimistic about particular jobs. But hearing nothing back over and over again for months is starting to wear me down.

The money situation is getting to be difficult. We still have a small positive bank balance, and I've been able to continue to pay for everything we need. Fortunately our consumer debt is very low -- far lower than a lot of American families like ours. But our savings is gone, so from here on out it's either income or selling off things. We are eligible for cash assistance from the state as well, to cover things like diapers -- we will look into that this week.

Unemployment continues to cover our mortgage and taxes, and food benefits are doing quite a good job at covering our food needs. But tomorrow, I will certify for my 17th week of unemployment. I have either 3 or 6 weeks left to collect out of Michigan's maximum of 20. I'm not sure, because the state refused to pay me for 3 weeks, and I'm not sure if those weeks are just lost or if I am still eligible to collect them. I'm hoping so. We've got a situation with some other bills -- things like energy, water, life insurance, and things not covered by our food benefits. Fortunately energy bills were low during the spring and early summer, but we've had to turn on the air conditioning. We use window ventilation fans strategically. We have the AC on, set low, and the furnace fan running continually, and a separate portable AC unit here in my office where it gets too hot to run computers without it. Our next bill is going to be high. I'm guessing the next bill will be $600. I may be able to get that reduced by starting up a budget plan with Consumers Energy.

The dehumidifier in the basement has stopped working, which is a potential mold problem with our things stored down there. We've got to go through some of the remaining boxes. The older of our two furnaces seems to be out of commission again. I had plans to put money into our insulation and HVAC this summer, as well as exterior repainting and a whole lot of minor repair items, but that doesn't really work without income. Fortunately the roof is holding up, as is the new set of gutters we put on last fall.

Oh -- the lead situation. We had the lead inspection -- a very thorough, all-day inspection of our home, inside and outside, and grounds. The inspector did not find anything really hazardous -- there is some old lead paint in baseboards and woodwork in a couple of rooms, but it is under layers of later paint, and it is not peeling. Our dining table, which was an old ping-pong table, had lead paint on it. It's now gone, as are all the kitchen towels we used to use to wipe it. That was about two months ago, and we have still not gotten the written report which is supposed to include the results of the soil samples. Another follow-up phone call is needed. In another month or so we will be getting the children's blood levels tested again, so we should then have a read on whether there is still any kind of daily exposure going on.

Even though we have been making payments via COBRA to continue our dental coverage, several dental bills have been refused, and so we have to straighten that out. Basically, as I'm sure you are aware, health insurance companies are slimy pig-fuckers, and I don't mean that in an affectionate way. We've got some big residual bills -- our four-year-old's dental work cost a lot, even after insurance. We are trying to get the refused bills re-submitted and get whatever issue there is with our dental coverage straightened out. I'm very concerned that something is going to hit my credit rating. So far, we have not failed to meet any of our obligations, but one of these medical providers could decide to sell a debt to a collection agency at any time and that will be a black mark against us. I'm going to pull my reports and try to make sure that hasn't already happened "silently" when an insurance payment was improperly refused.

I've raised a little cash by selling off some of my home studio gear. The Apogee Ensemble audio interface I've used for the last five years to record songs and podcasts is gone. I've sold a number of my guitars, including my Steinberger fretless bass and my Adamas 12-string acoustic guitar. There isn't much left to sell that is worth the effort -- for example, I could only get $75 for a made-in-Japan Jagmaster that I paid $400 for. No one wants to buy a 20" Sony tube TV from 1994 or an electric guitar that needs rewiring work before it can be played. I could start selling our library -- I've had to gut my book collection in the past. I'm really resisting this, though, in part because the return compared to the time and effort put in to do it would be very low -- there are no decent local used book stores that might send a buyer out, so I'd be carting boxes of books down to Ann Arbor -- and in part because I just don't think I can bear giving up a collection of books I've gradually built and shaped and cultivated over the years, in some cases books I've carried around with me since I was a child. Grace and I will do a pass through our possessions trying to find some things that might be easy to turn into cash, but in general we have always lived a very bohemian lifestyle -- furniture from the Goodwill, silverware collected from rummage sales, bookshelves from Ikea; my desk is a door on plastic sawhorses. I'm not going to sell my computers; that would be eating our "seed corn."

I've been reading the recent novels by Alex Hughes about the adventures of an ex-junkie telepath. In them, he has to meet with his Narcotics Anonymous buddy, who asks him each time to list three things he's grateful for. I'm grateful for many things. I'm grateful that there is a safety net, even if unemployment may not last until I get my first paycheck from my next job. I'm grateful for the SNAP program and the WIC program that are pretty much supplying us with all the food we need to feed the family. I'm grateful for our small but supportive group of local friends, and our out-of-town friends. I'm grateful for the anonymous donation of a Meijer gift card sent by a friend of the family. I'm grateful for the handful of decent recruiters who are actually trying to hook me up with a real job for both our benefits. And I'm grateful to my family and especially to my wife who has been very patient with me as I work through this process and all the frustration and anxiety that comes with it. And I'm grateful to you, my online friends, who have been supportive as well.

Tuesday, June 18, 2013

The Situation (On Investing in a Revitalized Career)

When I found myself unemployed, one of my first thoughts was that it would be a good opportunity to invest some R&D in my career. I had a plan to put in some serious time learning some new skills. I ordered some books on Scala, Objective-C, iOS programming, digital filters, and a few other topics I wanted to study. I considered taking an iOS "boot camp" with Big Nerd Ranch -- it looked like a good class, but it just plain cost too much. I planned to work through a couple of books. I got in a couple of days of work and made some progress, but have come to realize that this was just a bit unrealistic.

In part, it's unrealistic because of the time required to manage benefits, as well as the job-search reporting requirements in which I have to log specific jobs applied for each week (only recently added, apparently). There's no option to say "I'm teaching myself some new skills so I can apply for better jobs." It hasn't helped that we've had a couple of other difficulties piled on too -- we're still waiting on the lead testing, now scheduled for this coming week. There was a heap of work to help my teenager finish some college application essays. There was some other family drama. In fact I had arranged to go stay with some friends in Ann Arbor for a week specifically to get away from the distractions here, and work towards a demo-able iOS app. When things blew up, I had to cancel that idea (although I did wind up doing it later).

I came across something else that I'd really like to do (although I missed this one). There's an organization that teaches two- or four-day intensive courses in Haskell programming. The last one was in the San Francisco Bay area. There is no guarantee at all that if I took the class, and met the folks there, doing the classic networking thing, it would necessarily help me get a better job. I'd really, really like to take the class anyway. I'm not asking for donations to go to a training class like that right now, as such -- I'm not sure it is quite the right time. I'm mostly writing this down by way of just putting my intention out there in some kind of concrete form.

I've been diddling around with Haskell for a number of years now. I've written about Haskell a few times on this blog. I've used it "in anger" -- to solve a real work-related problem -- a few times, for creating small utility programs, usually to chew through some data files, to prototype an algorithm that I later wrote in C++, or to generate some audio data for testing. It is, hands-down, my favorite programming language, a language that expands my mind every time I use it, and has taught me some entirely new ways to think about writing code, applicable to any language. I won't claim that Haskell is, per se, the great savior of programming. GHC can be awkward, and produces truly obscure error messages. It can be hard to debug and optimize. However, it seems to have some staying power, and perhaps more importantly, it is a huge influence on recent programming language designs.

Haskell didn't appear in a vacuum -- it certainly has absorbed strong influences from the Lisp family of languages, and from ML, and maybe other languages like Clean, and others even more obscure. I love learning new programming languages, and I've learned new ideas from just about every language I've learned, but Haskell seems unique in the sheer density of its ability to blow your mind. Despite the fact that it is perhaps not practical for every application, I've become convinced that many of the paradigms and concepts behind Haskell really are the future of programming -- specifically, a source of competitive advantage, perhaps even something close to the ever-receding goal of a "silver bullet" for programming.

I'm really encouraged by the emergence of CUFP (Commercial Users of Functional Programming) and work that some companies like Galois and Well-Typed are doing. I believe it is already practical to write complex embedded systems with real-time and space constraints in Haskell, or at least partially in Haskell. It looks like a few pioneers are already doing it. The expressiveness of the language, and the resistance to many kinds of common errors that the language design essentially gives you "for free," could be a big competitive advantage in embedded software designs.

I'm not sure if, at this stage, there are sufficient opportunities to join companies that are also interested in R&D along these lines, especially given that I don't have a Ph.D. and am not likely to acquire one in the near future. Certainly few people nearby seem to be doing this kind of work, and I'm not certain whether there might be an opportunity to join an existing consultancy as a remote employee. I might have to strike out on my own. Grace and I have also been talking about setting me up as an LLC, as opposed to just doing hourly work via W-2s. In fact, honestly, despite the fact that I don't think of myself as much of an entrepreneur, doing so may be the best long-term solution to the thorny question of how to get any significant "upgrade" to my career, in terms of both money and the challenge of doing new and meaningful work. But that's a big leap to make.

I realized that I have been thinking and occasionally talking to friends about the idea of forming a company to do R&D and consulting on using advanced languages for embedded programming for ten years now, or maybe even slightly longer. I didn't even know Well-Typed offered classes like this, until I stumbled across the description online, but I have confidence that there will be more classes in the future. It seems like there is a window of opportunity. I didn't manage to get into iOS development at the start, like I did with Newton development, and I regret that (although, on the plus side, stuff works now!). It's hard juggling a career and a family. But I don't think it's too late to become a Haskell guru, for some value of "guru," and feel that maybe programming isn't entirely devoid of innovation after all. And maybe even enjoy programming again!

Monday, June 17, 2013

The Situation (Post-Father's Day)

I had a great weekend.

These posts have been largely kind of gloomy -- maybe understandably, given my ongoing unemployment. But I had a great weekend.

On Friday afternoon I had a phone interview that went, I thought, pretty well. Grace had taken the kids away with her to Ann Arbor where she had an obstetric appointment, and then stayed overnight with them with extended family, even taking them to a kind of barbecue/fishing party that sounded like a blast. On Saturday morning she picked up a CSA share that belonged to a friend, who was out of town and donated it to us. She got back Saturday afternoon. Our fridge is packed with fantastic produce. More on that in a bit.

I spent most of that time working on a Dylan program, an implementation of the little Macintosh Polar puzzle game from 20-plus years ago. When I took breaks from the screen I worked on a Gene Wolfe novel that has eluded me for a long time -- the second part of the Short Sun trilogy, In Green's Jungles. Wolfe is one of my very favorite writers and I still think that the Book of the New Sun series is pretty much the masterpiece of late-twentieth-century fantasy and science fiction. I think The Shadow of the Torturer is the only book I've literally worn to the point of disintegration just by reading it over and over.

But he's a puzzling writer, and in the later series he gets more puzzling. Reading In Green's Jungles is like looking through a kaleidoscope held by someone else. As soon as you start to figure out what you're looking at, and say "Ah! Yes, I think I see what is going on," he twists the kaleidoscope and says "how about now?" And it's all a jumble of pretty fragments again. And so these are books that are unsatisfying on a first reading, and even a second reading. I've gotten further this time; maybe I'll even finish the second book. Maybe by the third reading I will be able to plow through the third and final book and feel like I have a sense of what is really going on. They differ from The Book of the New Sun in that the earlier series can be read as a straightforward adventure story, and it is satisfying in that way -- to a certain extent. Until you realize that Severian's story doesn't entirely hold up, and that he is an unreliable narrator, and then you fall naturally into the mystery, and start to form your own theories. I have a monograph I'm working on, about The Book of the New Sun, but I don't feel it is quite ready for publication, even on my blog. I feel almost ready to write about the second series, the Long Sun books. The Short Sun books are still largely a blur of glittering fragments to me.

I'm digressing again... back to my weekend. The time with my wife and family out of town. That was a great chance to dive back in, just a little bit, into one of my favorite programming languages, and one that was hugely formative to my thinking about programming. In 1994 or thereabouts I was an alpha-tester for Apple's Dylan development environment, a tool that was ultimately relegated to the status of a technology demo rather than a viable language. At the same time I was developing real solutions in NewtonScript, the language that Apple actually deployed in the Newton product line. Trying to understand Dylan led me to Scheme and eventually to Common Lisp and Haskell. Dylan still exists in the form of community-supported implementations -- see also the Dylan Foundry.

Dylan is a fascinating language but as I study the original documents in 2013 -- Apple's book The Dylan Reference Manual and the original Dylan book describing the language with Lisp-like syntax -- I see an over-designed language, in the sense that the core language, designed to allow both dynamism and efficient compilation, seems to have too many features to really enable the sort of optimizations that the designers imagined. Maybe I'm just mistaking implementation failures for language design failures. Is there a thinner core language to be extracted from the big-language spec, if some features could be sacrificed? And would that be worth doing? Because I also see an extremely expressive language, a language I far prefer to Java, the other language emerging at the time, with some wonderful features, not the least of which is generic functions, which still seems like the natural way to construct object-oriented programs which are open to tinkering and extension.

Anyway, I got my program mostly working, and I'm talking to some of the remaining volunteer team about some remaining issues, so that's been fun. But I'm not writing today to talk about programming. I'm writing to talk about how grateful I am for my life and what my family and I are doing here in Saginaw.

Staying home in Saginaw for the phone interview Friday, I missed the travel and the company and the barbecue. But on Father's Day there was a friend in nearby Bay City who was moving his family -- a large family like mine. I thought helping their family would be a great way to spend Father's Day so I took my daughter and drove out there. It was a great afternoon -- there was food set up, a big U-Haul truck, and just enough guys volunteering. Veronica hung out with a gang of kids. The weather and the company were terrific. I helped load cabinets, dressers, a treadmill, helped take apart a picnic table -- all kinds of stuff. It was a reminder that doing work that requires me to exist only as a brain and a set of fingers is sometimes not gratifying, and that enjoying life is really often predicated on using the body, not just the brain. My back feels better than it has in months -- I worked it just hard enough to stretch everything out thoroughly and counteract some of the endless hours spent sitting at the computer looking at job postings. Today my back and arms and shoulders and wrists feel sore, but in a good way -- no stabbing pain or pinched-nerve sensations, just a pleasant ache of well-used muscles.

I wonder if that makes sense -- the idea that I would go spend most of my Father's Day helping someone else move, and honestly, I can't really say that it was entirely by way of trying to be virtuous or helpful. I feel like I got a lot out of it. It was fun. I'm really glad I went.

On the way back home I stopped at a bookstore, and indulged my habit. One of the books I picked up is a bit of fun trash (I say that admiringly). Alastair Reynolds is one of my favorite contemporary science fiction writers. He writes gloriously gothic and gritty space opera. He's now written a Doctor Who novel, a spin-off story set in the Jon Pertwee (Third Doctor) era. I have not finished it but it is terrific so far. Somehow Reynolds, in print, manages to conjure up the low-budget location shoots, cliched supporting characters, awkward dialogue, excess foreshadowing, and cliff-hanger pacing of the old serials in a way that is both dead-on and affectionate.

But I was talking about greens... a kaleidoscopic jungle of greens, in our refrigerator, or something... oh, yeah. Sunday night is tossed salad and scrambled eggs night -- yes, inspired by the closing song from the old Frasier TV show. Do we really eat meals on a regular schedule? Well, more or less; Monday is always chili night, and I cook it. Tuesday is baked potato night, of some kind -- white potatoes or sweet potatoes, often topped with leftover chili -- you get the idea. Theme, but variations according to whatever is in the refrigerator. So Grace softened up some chopped garlic scapes and chives in butter, and threw in eight eggs, and some gorgonzola cheese, and fresh dill, and something else I probably don't remember -- and it was delicious. We had a big salad of mixed greens, fresh from local Michigan farms, at room temperature, tossed with a little leftover pasta salad rescued from her trip, and it was delicious.

And Grace couldn't eat any of it. Somehow between the pharmacy and her doctor's office and Medicaid they did not approve her enzyme prescription refill, and somehow sat on it for ten days, so that she didn't know it had not been approved until it was too late to do anything about it for this weekend. She's now out of pills, and so can't eat food without experiencing waves of nausea. So she sipped weak tea and watched us eat. We will be trying to resolve that today, and spend a few hundred dollars we can't really afford to spend, if we have to, so my wife can eat food. It seems like it should be a simple thing, but it isn't. And unfortunately this is not the first time she's had to go without her enzyme pills. We remain hopeful that someday she will be able to go off them entirely, but having to forcibly go off them doesn't help that.

But it was still a shockingly delicious dinner. Sometimes life just hits you across the face, in a good way.

For dessert she made a strawberry-rhubarb fool, with fruit picked up from a farm-stand north of Ann Arbor. The strawberries were so ripe that you would not have wanted to pick one up and eat it -- they were just starting to dissolve into pungent red liquid. That's just when they actually taste the best, of course. She just cooked down the strawberries and rhubarb, with a little honey, and I threw in a tablespoon or so of dried thyme. The result was indescribably delicious. We served it to the kids with a little half-and-half drizzled on top, which curdled from the acid -- so it was kind of an ugly dessert, but delicious. I think Grace got to eat some of that, without the half-and-half.

It seems like a simple thing -- working, socializing, eating. We're running short of money, I'm still applying for jobs every week, I'm waiting to hear back on dozens of them, I'm waiting to hear follow-up from a number of interviews. It all seems complicated and challenging and stressful. But I had a great weekend. I hope you did, too.

Tuesday, June 11, 2013

The Situation (Day 92)

So this is one of those days where everything is just "hovering." For the last few weeks I've had three or four recruiter phone calls and e-mails a day, but today I've had none. It's spooky, like the rest of the world was destroyed and I haven't gotten the news yet. Meanwhile, I've done some follow-up e-mails and messages, and gotten nothing back. Several different applications are in the post-interview stage, "hovering." I need to apply for some more local jobs, but I'm not seeing very many that are even remotely within the realm of possibility.

Just for fun I did a calculation on what it would take to do a daily commute from Saginaw to Bloomfield Hills. That's about 80 miles one way, taking approximately an hour and 20 minutes. I know people have commutes like this, and longer, but let's do the math as an exercise.

Our current main car is a late-model SUV that gets an average of 15.4 mpg. It probably will do a little better for an all-highway commute, but considering the possibility of heavy traffic and road construction, let's call it 16 mpg. For a 160-mile round-trip commute, that's a convenient round number: ten gallons of gas a day. Gas today is about $4.20 a gallon. It will probably be lower off-season, but that's what it is today. That gives us $42.00 a day in gas, or $210 a week. Not accounting for vacation time, that's $10,920 a year just in gas. That doesn't cover wear and tear at all. The IRS standard mileage allowance including wear and tear and repairs for 2012 is 56.5 cents a mile; that works out in this case to $90.40 a day or (again, not taking vacation time into account) $23,504 a year -- in other words, that's what they consider the actual cost of owning and maintaining a vehicle and using it for that much travel.
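For anyone who wants to check my arithmetic, here's the whole back-of-the-envelope calculation as a quick script. The 160-mile round trip, 16 mpg, $4.20/gallon, and 56.5-cents-per-mile figures come straight from the numbers above; the 5-day week and 52-week year are my simplifying assumptions (no vacation time, as noted):

```python
# Back-of-the-envelope daily-commute cost, using the figures from the text.
ROUND_TRIP_MILES = 160   # Saginaw to Bloomfield Hills and back
MPG = 16                 # pessimistic all-highway mileage for our SUV
GAS_PRICE = 4.20         # dollars per gallon, today's price
IRS_RATE = 0.565         # 2012 IRS standard mileage rate, dollars per mile
DAYS_PER_WEEK = 5
WEEKS_PER_YEAR = 52      # no vacation time accounted for

gas_per_day = ROUND_TRIP_MILES / MPG * GAS_PRICE   # ten gallons -> $42.00
gas_per_year = gas_per_day * DAYS_PER_WEEK * WEEKS_PER_YEAR
irs_per_day = ROUND_TRIP_MILES * IRS_RATE          # all-in cost -> $90.40
irs_per_year = irs_per_day * DAYS_PER_WEEK * WEEKS_PER_YEAR

print(f"Gas only: ${gas_per_day:.2f}/day, ${gas_per_year:,.2f}/year")
print(f"IRS all-in: ${irs_per_day:.2f}/day, ${irs_per_year:,.2f}/year")
```

Running it confirms the figures in the text: $42.00 and $10,920 for gas alone, $90.40 and $23,504 at the IRS all-in rate.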

Something like a Honda Fit would obviously be a better choice, at somewhere in the ballpark of 30 mpg, but note that this would add a car payment, when we don't have one now, and so the overall cost would not be dramatically lower.

Note that this takes into account no "externalities" at all. Here's one externality: if I was going to be gone with the car all day, every work day, my wife would need a second car in order to run any kind of local errand at all with the family. So we'd then be a two-car family instead of a one-car family. So it wouldn't be a matter of swapping out one car for a better-mileage car -- where selling the first could help pay for the second. Of course the at-home car wouldn't incur nearly as much in the way of gas expense and wear-and-tear, but it isn't trivial just to maintain a car, even one you don't drive very much. It also doesn't account at all for the emissions, and what that is doing to the climate, or the fact that I'd be driving for almost 3 hours a day, turning a 9-hour day (with lunch) into a 12-hour day, and what that would do to me and my relationship with the family, and whether we'd be able to afford to hire someone to help replace some of my labor in and around our home (ranging from cooking and cleaning and mowing the lawn to child care).

So, alternatives. It would probably be cheaper to stay someplace much closer to a work situation in the metro Detroit area during the work week, and we're exploring that option. Relocation would be neither quick nor easy. So what's the cost of an extended-stay hotel close to the area? The cheapest one I could find online in a brief search was about $55 a night. Assuming I stayed Monday through Thursday nights and left from work on Friday, that's $220 a week (and note that these are still a commute from the workplaces, just a much shorter one, and I'd still have one $42.00 round-trip commute). So it isn't significantly cheaper. I don't think I could make the food options as low-priced as they are at home. Exercising that option, I'd be doing a lot less driving, and that would be great, but I wouldn't see my family at all for four nights a week. I'm chewing over whether I could find that tolerable, and for how long. I don't really want to be an absentee father; these years aren't really fungible, to be "made up for" later.
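Putting the two options side by side, gas-only (the hotel rate and the $42 round trip are the figures from the text; ignoring wear and tear, food, and everything else is my simplification):

```python
# Weekly out-of-pocket comparison, gas only, ignoring wear and tear.
GAS_ROUND_TRIP = 42.00   # one Saginaw round trip, from the calculation above
HOTEL_NIGHT = 55.00      # cheapest extended-stay rate I found

daily_commute_week = 5 * GAS_ROUND_TRIP               # five round trips
hotel_week = 4 * HOTEL_NIGHT + 1 * GAS_ROUND_TRIP     # Mon-Thu nights + one round trip

print(f"Commute daily: ${daily_commute_week:.2f}/week")
print(f"Extended stay: ${hotel_week:.2f}/week")
```

That's $210 a week versus $262 a week: as the text says, the hotel isn't significantly cheaper on gas-only terms, although the comparison would tilt the other way if you priced the five round trips at the full IRS wear-and-tear rate.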

A local apartment might be cheaper. I haven't looked into that. But it is a good reminder that if I'm going to consider an arrangement like this, I have to be sure to ask for enough money to actually make it viable. Ideally "viable" would translate to "at least what I was earning before, with cost of living adjustment, and enough extra to cover the cost of the distance." Of course this isn't an ideal world. How about "after taking the cost of the distance into account, doesn't actually represent a decline in income?" And we may have to accept "we can break even doing this" as opposed to "I'm working, but going further into debt with every mile of scenic I-75 I traverse!" And this is why I continue to press for a telecommuting, or at least part-week telecommuting, option. And why we might ultimately have to give up everything we've been working for here.

Thursday, June 06, 2013

The Situation (Day 88)

I received word back (via paper mail) from the State of Michigan saying that my claim for 3 weeks of unemployment compensation, for the weeks ending April 27, May 4, and May 11 (see my earlier posts), was denied. The form I got back said it was found that "you did not report (certify) as directed and had no good cause for failing to report (certify)."

The cause I reported was that I missed certifying online by one business day because I was distracted by recruiters and interviews. In other words, because I was concentrating so much on searching for a suitable job. What would have been good cause, I wonder?

So, er, let this be a lesson to all you slackers!

It says I have the right to appeal in writing. Would there be any point to that, I wonder?

A Counterfeit Motorola Razr V3 Cell Phone

I have an old Motorola Razr V3. It's from (roughly) 2005 or 2006. I use it without a contract, with a T-Mobile SIM card, buying minutes when I need to. I like this phone design, and I don't really want a smart phone or even a dumb phone with a touch screen, but mine is falling apart. I bought two allegedly new-old stock Motorola Razr V3 phones from an eBay seller. Unfortunately, they are counterfeits.

I have opened a case with eBay to return them, but I thought it might be useful to share pictures. Honestly, I wouldn't have minded much if (1) they worked well (they don't -- the speaker for speakerphone mode doesn't work, they don't vibrate, and the audio is poor), and (2) they were really cheap (they weren't that cheap -- I paid $59.99 each).

Take a look at the pictures. The gray phone is the original. The gold one is the counterfeit. It's very obvious when you just pick them up, open them, and try to work the buttons or open the battery compartment. The old phone opens smoothly and still feels solid. The new one grinds slightly and feels loose and flimsy.

Original: fit and finish is very clean. "M" logo button top center matches phone.

Fake: front cover edge misaligned, "M" logo is blue and looks strange, buttons are loose.

Original: you can read all the serial numbers (even though the picture is blurry, sorry).

Fake: numbers are cut off; missing some numbers.

Original: logo is laser etched right into the aluminum surface.

Fake: logo is painted.

Original: darker, glossy.

Fake: type is different, lighter gray, matte.

Original: inside battery compartment cover. Note recycling warning, 3 clips to stabilize cover. Release mechanism still works after many years.

Fake: mechanism is extremely stiff and barely works, nothing molded on the inside.

Original battery hologram.

Fake battery hologram.

Original: still has a little rubber plug in that access hole after years of handling.

Fake: rubber plug stuck way out, fell out immediately with the gentlest handling, now it's around here somewhere...

Back covers. Note the raised logo and carrier on the original (right). Ignore the missing dark glass over the display on the old phone, I broke that many years ago...

The cover of the manual.

The printing inside the manual.

Under the right lighting you can see that the battery compartment cover on the counterfeit phone is completely mismatched to the rest of the case. Wow! Crap-tastic!