This may be an idiosyncrasy, but I doubt it, because I see examples of it in open-source code all the time: I have this strong suspicion that software developers think too holistically. There’s even a rather trenchant and famous cartoon illustrating how a developer loses context with a simple interruption and has to rebuild the scaffolding of his thoughts before he can fix the issue.

I had this problem recently. I’ve been noodling with a small Lisp interpreter that I’m building as an exercise, and one of my steps was to write a Lisp-like “cons” list library, recode each node from a record-in-a-vector to lists-in-lists, and then build upward from there. My target language was Coffeescript. There are three major components to the interpreter: an eval() function, the environment, and the syntax.

Special forms are injected into the environment at initialization time, and obviously both the “define function” and “if” statements need to be able to call eval() to evaluate their arguments, since this is an interpreter that works its way to the bottom of the AST and then evaluates upward to a final result.
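
To make that concrete, here’s a minimal Coffeescript sketch of the shape of the thing. It is not the interpreter’s actual code: the names (lispEval, env), and the use of plain arrays for expressions, are assumptions for illustration only. The point is just that the special form lives in the environment and recurses back into the evaluator for its arguments.

# Hypothetical sketch: an "if" special form injected into the environment,
# calling back into the evaluator for the pieces it decides to evaluate.
lispEval = (expr, env) ->
  if Array.isArray expr             # a list: (operator arg1 arg2 ...)
    [op, args...] = expr
    env[op](env, args)              # look the operator up in the environment
  else
    expr                            # atoms evaluate to themselves in this toy

env =
  "if": (env, [condExpr, thenExpr, elseExpr]) ->
    if lispEval(condExpr, env) then lispEval(thenExpr, env) else lispEval(elseExpr, env)

console.log lispEval(["if", true, 1, 2], env)   # prints 1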

As I was making the changes, I got terribly lost making the special forms respond to cons lists rather than vectors or dictionaries. The structure of a node from the reader became (type value blame), where the value is itself a cons list of nodes when the type is “list”. The distinction between the two representations became muddled in my head, and I kept worrying about what it would mean to switch to the other phase of processing.
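
For the record, the node shape I was juggling looks roughly like this. This is a sketch only, not the interpreter’s code: cons, car, and cdr are the usual pair helpers, and blame stands in for whatever source-position data the reader attaches.

# Sketch only: the (type value blame) node shape as nested cons cells.
cons = (data, next) -> {data: data, next: next}
car  = (cell) -> cell.data
cdr  = (cell) -> cell.next

blame      = {line: 1, column: 0}                         # placeholder for reader metadata
numberNode = cons("number", cons(42, cons(blame, null)))  # (number 42 blame)
listNode   = cons("list",                                 # (list (<one child node>) blame)
  cons(cons(numberNode, null), cons(blame, null)))

nodeType  = (node) -> car node
nodeValue = (node) -> car cdr node
console.log nodeType(listNode)      # "list"
console.log nodeValue(numberNode)   # 42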

And then I realized: I should stop worrying. Everything I needed really was right before my eyes. Knowing the code in front of me, I could easily reason about everything I was doing right then, right there. This data structure would get handed off eventually, but I shouldn’t care. There shouldn’t be context above and beyond what’s on the screen.

I think this is why I’m really starting to like F# and Haskell and that family of languages so much (and Lisp, really). I shouldn’t have to think that hard; I shouldn’t be spending every day at the utter limit of my human understanding just to get my job done. I know, that’s the holy grail of development, yet I’m convinced that it’s not only a reachable goal, it’s one we perversely avoid. We’re proud to plumb code through multiple layers when we ought to be ashamed of it.

I think we ought to consider the baseball bat as a unit of code quality: if the code is so hard to understand that the next guy will want to come after you with a baseball bat, maybe you should consider rewriting it.

My first impression was, “What a fucking bunch of greybeards.”

I went to Beerly Functional, a get-together of down-in-the-trenches programmers who were either using or interested in using Functional Programming, and my first impression upon walking into the room was exactly that: What a bunch of fucking greybeards, myself included.

And yet, as I paid attention to what was being said, the reason we were all there became more and more apparent. We were tired of dealing with lousy development cycles. The biggest promise of Functional Programming is that it narrows the gap, both temporally and spatially, between where you create a bug and where it manifests. Functional programming emerges from the need of lifelong programmers to stop fucking around with code-run-debug cycles; they want to produce excellent code the first time, and they’re willing to adopt strong constraints against shoddy thinking and poor code in order to get there. They want to make software that gives a damn. They want to make software about which they’re comfortable saying, “This cannot fail.” They value quality and correctness, whereas most business people… don’t know how to assess either, and instead see the rapid, widespread availability of developers in shoddy languages like PHP and Javascript as a sign of those languages’ legitimacy.

So we raised a glass together, and we took on our missions: to teach each other what we knew, to get better at our craft, and to sell it to businesspeople who need to know there’s a better, faster way to get quality code in front of consumers. We were greybeards: lifelong programmers who, whether we’d made our millions or not, wanted to keep making great code, not graduate into management and the executive suite by 30. We wanted to be the best. And we knew the tools were within our grasp.

18Mar

Working around the feeling of cheating…

Posted by Elf Sternberg as Uncategorized

This is just going to be random.  Don’t expect it to make sense.

The obvious next steps in my Scheme are: strings, and a Common Lisp 22.1.1-compliant hand-written parser, as a prologue to rewriting it in Scheme, with the record objects used in Coglan’s version replaced by cons lists of the various fields being collected.

Macros.  And that’s where my brain comes to a screeching halt.  Because “macros” seem to imply a certain purity of lispyness.  How do I make a Lisp macro engine that successfully handles, and yet successfully ignores, all of the extra stuff after the content?

Maybe I don’t.  Maybe I just write it so that it accepts all that, and more, because ultimately I want to add one more deterministic field to the list to carry typing information.

Grief, I have no idea at this point.  I just know what I want it to be able to do next.

16Mar

They lied to me in university…

Posted by Elf Sternberg as Uncategorized

As revealed last time, my university training was focused on a business degree with a programming minor, the point of which was to prepare me for being a dark programmer in some basement of a bank or other financial institution. There was an actual track for this at my university.  As such, my instructors told me not to worry too much about data structures; I didn’t need to know about trees, linked lists, and deques.

They lied.

I thought, since I’d done the exercise, that I was done with A Language In 20 Minutes, but this morning I had an inspiration and so, on the commute into work, I hacked out a new version that’s available on Github. It’s more “functional,” as I said, for some definition of functional.

But what I did add was a Lisp-like list class that behaves more or less like a Lisp cons list: a linked list of nodes, each with two cells, the second of which points to the next node. Like so:

[(data:C), (next)] -> [(data:B), (next)] -> [(data:A), null]

These lists start out with just the ‘A’ entry; when you cons an entry onto one, as in cons(D, list), it adds a ‘D’ entry at the “head” of the list. The amazing thing about this structure is that you only have access to the head, but through it you have access to the whole list.
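
Here’s a sketch of the idea, with bare functions standing in for the little class that’s actually in the repo:

# Sketch only (not the class from the repo): a cons cell is a pair,
# and a list is just a chain of pairs ending in null.
cons = (data, next) -> {data: data, next: next}
car  = (cell) -> cell.data
cdr  = (cell) -> cell.next

list = cons('A', null)              # the list starts out with just the 'A' entry
list = cons('B', list)
list = cons('C', list)
list = cons('D', list)              # cons(D, list): 'D' becomes the new head

console.log (car list)              # 'D' -- the head is all you ever hold...
console.log (car cdr cdr cdr list)  # ...yet 'A' is still reachable through it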

Amazing.

Really.

When you’re describing the scope of an operation for an interpreter, you need a traversable, ordered collection of scopes running from your deepest scope all the way to the root, so that you can check along the way that every variable is defined, and add new variables to your current scope as needed. When you create a new scope in your interpreter, you do it in a given function call that will interpret expressions within that scope. When you leave that function, you need to expunge the memory allocated for that scope. So let’s say that the D object mentioned above is a dictionary for resolving variables in the current scope; by consing it onto your list, you can now traverse the scope chain, checking all the dictionaries down to the root automatically.
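
Sketched out, with hypothetical names rather than the code from the repo, the lookup looks something like this:

# Sketch only: each scope is a plain dictionary, and the scope chain is a
# cons list of dictionaries with the newest scope at the head.
cons = (data, next) -> {data: data, next: next}

lookup = (name, scopes) ->
  node = scopes
  while node?
    return node.data[name] if name of node.data
    node = node.next
  throw new Error("undefined variable: #{name}")

root  = cons({x: 1}, null)       # the root scope
d     = {y: 2}                   # the current scope's dictionary (the "D" above)
inner = cons(d, root)            # entering a function conses a new scope onto the chain

console.log lookup('y', inner)   # 2, found in the innermost dictionary
console.log lookup('x', inner)   # 1, found by walking down to the root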

And when you leave the scope, the D object and its list reference get reaped. You get memory management for free.

Admittedly, that’s only true because Javascript has its own memory management going on, but since lots of practice projects involve writing an interpreter on an interpreter, that’s often going to be the case. But it’s a good starting point.  And understanding the principle means that you get to apply it later if and when you decide to write this thing in C++.

The dirty secret inside Coglan’s example (see the first commit to my github) is that his Scope object is a cons list; it’s just a very informally defined one, using the parent field to point to the next cell, all the way down to the “TopScope.”

And, if you look at my javascript, I don’t call things the way most people do. You might write car(list_object) to retrieve the data from the current list object, but all throughout my code I use, in Coffeescript, (car list_object), just because Coffeescript lets me. I’ve created a Frankensteinian example of Greenspun’s Tenth Rule of Programming.

15Mar

Paying my dues, Studying the masters…

Posted by Elf Sternberg as programming

I sometimes give the impression that I’m an excellent programmer, and to the extent that I’m empowered to code within my narrow little specialty, web-based interfaces for industrial and enterprise users, I’m comfortable saying I’m one of the best.  It’s pretty much all I’ve done for the past 15 years: Isilon, Spiral Genetics, and now Splunk all used my jQuery/Backbone/Python/NodeJS/HTML5 expertise to create and maintain lively and vibrant cybernetic applications (in the classic sense of “cybernetic,” i.e. control systems) for their specific deployments: A cloud-based storage solution, a cloud-based gene sequencing engine, a database and query engine for semi-structured information.

But outside of that specialty, I’m somewhat lost.  I know a lot of things, like SQL, video processing, natural language processing, and so on, that I’ve only ever played with or that I’ve only ever had to use once, professionally; these things live in the back of my brain and just kinda lie there, useless.

In the Serendipity stories I told earlier this year, I highlighted specific hiring instances where skills I had acquired for unprofessional reasons (and let’s face it, “I watch a lot of anime” is pretty unprofessional) had serious professional uses.

Today, I posted to my github a Language In 20, a quick programming language based on James Coglan’s lecture “A Language in 20 Minutes”, in which he snowflakes a programming language from nothing to a Turing-complete integer calculator capable of recursion.  I still haven’t had the necessary insight to understand the scoping issue, despite getting it right… but then I was just following Coglan’s lead.  My implementation is different; since PegJS doesn’t auto-create interfaces to parser classes, I had to write my own switch statement over node types, and I tried to avoid using classes as much as possible.  I didn’t succeed; I may go back and revise it to be more “functional,” using closures rather than classes.

It’s frustrating seeing how far away I am from what I really want to know and do; it’s equally frustrating knowing that precisely zero of my corporate masters have ever had any interest in my learning this stuff.  It doesn’t make me a better programmer; in fact, it makes me aware of just how bad most programming, mine and others’, really is.

17Feb

The Best Firing I’ve Ever Experienced

Posted by Elf Sternberg as Uncategorized

Paul Petrone writes What it’s like to watch the worst firing ever, in which he describes how a boss, in the process of announcing forthcoming layoffs, does something pretty horrible. You ought to read it.

I can’t do one better, but I can describe the most dramatic firing I’ve ever seen. It was at CompuServe.

In 1998, after I’d been there about five years, CompuServe was purchased by AOL, then immediately sold to WorldCom in a complex stock transaction that allowed AOL to keep the users but WorldCom to keep the wires and infrastructure. The group I was with was part of the wires-and-infrastructure side, but we were a redundancy. WorldCom wanted CompuServe’s Omaha offices but couldn’t have cared less about the Seattle space, which to them was an internal competitor with their own internet services.

The trouble was that they actually couldn’t, under Washington law, just lay us all off. A mass firing in Washington involves some rather arcane proceedings, much like a bankruptcy, and any layoff involving more than 20% of a large workforce requires the company to pay out substantial severance. The idea is that you’re about to dump a large group of highly skilled people into the workforce; it may take a while for all of them to find work, and they may have to move to do it.

So what did WorldCom do? It made us miserable. At least it tried. It took away all our work and enforced work hours. Basically, it took twenty bright developers and all the support staff of a dot-com, locked us in an office, and said, “You have nothing to do for nine hours.” They did this for three months, hoping that we would recognize how desperate our straits were and quit. If enough of us quit, we’d drop below the minimum worker threshold and they could fire the rest without triggering the penalties.

Except we were software developers. And our in-house agreements with our in-house management allowed us to contribute to open-source projects. We made significant contributions to Perl, Apache, Python, and Linux in that time. (We also, I confess, played an awful lot of Quake: Painkeep.) Still, it wasn’t meaningful work the way a real job would have been.

So there came a day when we were all called to the big conference center on the top floor. I’d never been there before, and it looked like the sort of place one spent too much money on: wood paneling, deep carpet, a huge polished wooden table, expensive chairs. There were many more than 20 people present: all the management, support staff, sales, and phone folks were there. There was a fellow we’d never seen before in an expensive dark grey suit. He explained, bluntly but not unkindly, that he was the guy with the hatchet. He handed out envelopes and told us not to open them. They had dots on them: green, blue, and white. There were a lot more green dots than anything else. “Now,” he said, “I’m afraid the green dots are being let go today.”

A roar went up from the crowd. The hatchet man stepped back, his hands in front of his face, looking quite terrified. Then he realized we were cheering. “I have to say,” he said, “this is the first time I’ve done this where people cheered!”

“You have no idea how bored we’ve been!” someone shouted. “They took away all our work!”

“I’m a little sick of playing Quake!” That got a chuckle.

Our division’s senior manager, bless his heart, had managed a real hat trick of severance. Since our group was the unwanted stepchild, “in transition” among multiple dot-com wannabes, he had managed to score concessions from every single company involved in the transaction: five months’ severance from one; ongoing health insurance for that same period from another; twenty hours each of professional headhunter services, including really slick resume preparation, from a third; educational vouchers at the county community college and five sessions of third-party career coaching from yet another.

I’ve left some jobs for better jobs; I’ve been laid off four times now, three times for economic reasons rather than performance (“The Dot Com Bust” and “The Great Recession” were both brutal to web developers, even ones with obscure industrial skillsets like mine), and once because the company was “consolidating” its offices to Palo Alto and I didn’t feel like moving. The Bust and the Recession layoffs were the worst; I felt cold, clammy, and dazed upon leaving those discussions. But the CompuServe shutdown had to be the best firing I’ve ever experienced.

Open office plans have been a subject of significant controversy recently, with everyone and his brother pointing out that the quiet craft of software development clashes horribly with the loud, communal, extroverted environment of the open office. So the question then becomes: who benefits from this arrangement?

Psychopaths.

Venkatesh Rao has this thesis, which he calls the Gervais Principle: every modern office hierarchy has three kinds of people in it, the losers, the clueless, and the psychopaths. Extroverted and manipulative, psychopaths form companies and temporary alliances as they claw their way up the corporate and financial hierarchies; their lives are wrapped up in playing this game. Losers don’t play the corporate game at all; they knowingly enter into arrangements with psychopaths to do the work and go home to their “real lives.” The clueless are mostly middle management: they believe in the company and play by the rules, trusting that their loyalty will eventually be rewarded. Eventually, the company fills up with successive layers of middle management until it becomes a hollow shell, at which point the psychopaths cash out; the losers, who knew this was coming, take their transferable skills elsewhere; and the clueless are left wondering what the hell just happened. The psychopaths go on to form a new company; lather, rinse, repeat.

Software development is so hot right now that skilled developers can just up and leave. They can find work elsewhere. If they’re bored and unrewarded, they will find work elsewhere. Psychopaths need to keep a closer eye on their software developers than on workers in other productive roles: manufacturing, refining, delivery, and so on.

The open bay permits that. It gives the psychopaths in the room a horizon they can scan for trouble. It gives them an intelligence edge they wouldn’t have if developers were all in their own little rooms.

Open office bays are a mechanism for controlling restless developers.

03Feb

Serendipity: Another out-of-band win.

Posted by Elf Sternberg as Uncategorized

While at the genetic engineering gig, I was regularly scouted for positions elsewhere. I didn’t want to leave: the pay was terrible, but the freedom and professional satisfaction were immense. But really, I had teenagers and Omaha had her epilepsy, so we needed insurance more than I needed professional satisfaction. If Obamacare had been a thing back then, maybe I’d have stayed all the way. They’re still doing well there.

The one job that really threw money at me was a textbook company, of all things. They were looking for front-end developers who would help them transition from traditional publishing to online, and they wanted their website to be a “core value proposition”: a place where school districts could describe their needs and requirements and get customized textbooks to meet them.

During the interview, the other guy asked me if I was working on anything interesting in my spare time. I mentioned that I’d just written “a little thing,” a toolkit that glued together a markdown processor, a couple of python scripts, and a JSON file to automatically create ebooks. He asked me to explain how it worked, I explained the simplicity of the NCX and OPF formats, and how I hoped to put a visual front-end on this tool someday. “You actually know the EPUB standard?” he said, his eyes wide and genuinely hungry.

“Well, most of it.”

They hired me. A year later they laid me off. It turns out they were hoping to add automatic EPUB generation to their existing LaTeX-based production line. But I never got to see any of that; I was stuck doing the website for a year, my EPUB skills reserved for “when we get to it…” I think part of the idea was that EPUB 3.0, which had some Javascript and interactivity built in, would arrive soon, and the interactive chemistry and biology lessons I was building for the website would be book-ready by then. But it never happened.

The serendipity here is obvious: I had no professional reason to know the EPUB 2.0 standard. But I did, for unprofessional reasons. They wanted me for it, as yet another in-house expert. They never used me in that capacity. They had too many in-house experts spending their days doing the more routine parts of bootstrapping; that may be part of the reason they failed to really get anywhere.

02Feb

Serendipity 2009: Keep learning!

Posted by Elf Sternberg as Uncategorized

I eventually tired of working at the video streaming service. It wasn’t paying very well at all, and the challenge of bootstrapping a video streaming website soon gave way to the tedium of maintaining a media-centered CMS with a customized catalog. Weekly specials, micro-sites for film festivals, director’s specials, interviews, events and new releases were all bread-and-butter boring.

So when I went looking, I stumbled upon a bioinformatics startup. These guys had one of the cheapest human genome alignment toolkits in the world; it could be run on AWS and would do a human being in less than three days, a miracle at a time when normal sequencing software frequently took weeks. Part of the miracle was figuring out how to map/reduce the problem; I never understood the algorithm, but apparently it was a big freaking deal.

The engine was written in C++ with a Python front-end that spoke REST. You could give it the S3 address of a pool of genetic data and tell it what animal you thought the pool was from, and a short while later it would tell you a lot about the beastie (human, yeast, E. coli, that kind of thing; yeast was popular to test against because yeasts have fairly small genomes; humans took a weekend, yeast was done in minutes). And that was all they had.

When I signed on, my duty was twofold: first, write a front-end that spoke to the sequencing engine and let customers configure and launch processing jobs, manage their data sets, and control unproductive jobs; second, write a Django app that took people’s money and gave them access to the sequencing engine. The guy who wrote the REST layer did me two favors: first, he wrote a plug-in that, before a job would launch, would query the Django app to ask, “Does this guy have enough money?”; second, he made his own endpoint take a list of processing tickets, so data could be sequenced, aligned, filtered, and analyzed all in one command.

I had just mastered the art of credit card processing at the video streaming service as we built out the site’s functionality. I had also recently heard about this “Backbone” thing, which let you handle REST APIs with relative ease; I’d written a little tutorial on it, mostly as an educational exercise.

I got the Django end up quickly, then wrote simple front-ends to list out the data, jobs, and processing “pipelines” (that’s what they’re called in bioinformatics), then refined them iteratively to show more and more, and to handle more and more. Along the way, I recalled my university training as an accountant (yes) and looked up how to handle encumbrances: the user’s account could be “encumbered” (temporarily debited), and if a failure occurred that was our fault, the credits could be unencumbered and returned efficiently. The system would report only “credits available” to the engine, to prevent cheating. It was remarkably robust.

The serendipity here was (a) being a deep hard SF geek, I knew enough biology not to be utterly, completely lost, (b) I’d just come off a Django assignment that mirrored what they needed, and (c) I’d just run into this Backbone thing and learned it, and it was exactly what they needed.

I also did the site in Coffee, HAML, and Less, which made actually coding the site much faster and easier than doing it in the traditional tools. The lead engineer later said my choices made it “difficult to find people who knew them, but incredibly easy to train people to use,” so it was half a win.

I was at Isilon for eight years. In some sense, that was three years too long; I stayed because the handcuffs were golden and the work had long since become routine. Isilon had grown explosively; I was now the third-longest-serving veteran, and was still writing web apps for new extensions to the Isilon product line. I wasn’t writing C anymore; the job of writing Python libraries had been farmed out to others. I was comfortable. I was lazy.

Then 2008 happened. The market fell apart. The tech world collapsed. The world seized up in economic shock. Isilon was bought by EMC and a lot of people got laid off. They gave me a bucket of money and… that was it. Like millions of other people, I was out of work.

I’d been in a hothouse environment for eight years. Isilon had been fairly open-source unfriendly; they took a lot of stuff in, but discouraged employees from contributing to open source projects, mostly due to paranoia about copyrights and poaching. I’d been working with Webware and Prototype. The world had moved on to Django and jQuery. I needed to follow them.

I hacked and played and discussed on this blog how I’d dived head-first into Django and taught myself a few nifty things. I also discussed a little side project about using Python to write transpilers. This combination caught someone’s attention: he asked me if I could write a Django app that would take a single <textarea> and, whenever the user pressed [Enter], send its contents to a transpiler and display the results in a <pre> pane next to the <textarea>. The transpiler was written in Python, so it would be easy to integrate with Django. He needed it to work on a multi-user basis, because he was going to South Korea soon to teach assembly-language programming, and he needed this as an instructional tool.

It took me five days. He paid me well for it, and that was that. I blogged about my success.

Sometime in mid-December, I got wind from a friend about a little company that was trying to be “The Netflix of Independent Filmmakers.” They needed help badly. They were desperate. Their original hosting provider had gone bankrupt and, secretly, over a weekend, deleted their entire website, which was written in .NET; they had no backups; they had no website; and the guy they’d hired had finished 2/3rds of a replacement site in Django before getting fired for reasons nobody there would discuss. Could I please finish the website and get it running by Sundance, which was five weeks away?

I said I’d try. I mostly succeeded. It was actually more than 2/3rds done; the database schema wasn’t insane, it was just poorly managed. I taught myself migrations pretty quickly and had the central component of the site up and running in two weeks.

While I was doing this, I learned that there was another crisis. They didn’t rent movies; they sold them. They had a commercial DVD printer that could burn a title, print a label, and slap the whole thing together as a bespoke edition of any ISO image in almost zero time. They had all their DVDs in ISO format on Amazon Web Services. But their streaming service had been on the .NET site; they’d lost all their Flash-ready video. The four people running the office were downloading the ISOs to their Macs, one at a time, finding the movie, encoding it by hand, uploading it to the streaming site, then logging into the admin page and flipping the “ready to stream” button.

If they were lucky, they’d have been done with the whole collection sometime in August, eight months later.  Sundance was now three weeks away.

This is where serendipity comes in:

I’d been laid off, home, and bored a lot of the time. I watched a ton of anime. Some of it I burned to DVDs, and along the way I taught myself transcoding using mplayer, mencoder, transcode, and a few other tools for converting “soft” subtitles to “hard.” I asked one of the guys if I could take a crack at the encoding issue. “If you can make it go faster, go right ahead.”

I did. I bought the O’Reilly Amazon Web Services manual, and learned what tools would let me list and download files from S3 to a Linux box. I used mplayer’s “index” option to dump the contents of the ISO, and wrote a little Perl (yes!) script to find “the longest track,” which is usually the movie. I figured out how to encode it to Flash. And I used the tool to upload it back into the Flash section of their storage. I did it with one movie. Then ten. Then I broke up the entire list of unencoded films into twenty lists of about a hundred films each. I wrote a little script to spin up a new Amazon EC2 server and encode all the films on the list; I did it with one list, and it worked. So that weekend I spun up nineteen more in parallel, and doled out my lists.

That Sunday I was able to call the boss. “It’s done,” I said.

“You mean they’ll all be ready by Sundance?”

“No,” I told her. “They’re all done. Now. All two thousand of them.”

She was quiet for a few moments. She’d heard that spinning up lots of parallel EC2 instances was expensive. “How much did that cost me?”

“Eighty-six bucks. And some change.”

“Can I keep you?”

She did, too.

The serendipity was that I’d saved her business twice. Once by knowing what I was expected to know: I had my shingle out as a Django middle-tier developer, confident end-to-end. And again because, completely by luck, I had the specific toolset her business needed: I knew how to transcode video.
