Brilliance and Thoroughness

I've made a lot of mistakes programming. Naturally, this can start to give you the sense that you may not be as sharp as you had hoped. Typically, I've brushed it off and told myself something about my hard work or persistence, maybe to spare my ego or at least to feel I had value in the job marketplace. Regardless, those reassurances never made me feel more confident, because almost all of my mistakes had nothing to do with being sharp, insightful or clever.

The classics of programming as a craft - The Pragmatic Programmer, Clean Code, Code Complete - generally boil down to one statement: “don’t be lazy.” When I read that statement, and all the many ways each author repeats it, and then compare it to my track record, I fall well short of the mark. Not because I'm comparing myself to some excellent and well-known professionals; it's because I trail in one defining quality of professional work - thoroughness.

To give a simple example of thoroughness - are you the sort of person who moves the furniture when vacuuming or sweeping? Do you wipe down the counter after cooking? Feel your plates after hand-washing to make sure they're not still greasy?

Now, I wouldn’t call skipping these things “lazy,” but thoroughness is the requisite skill for avoiding that label.

For a long time, both in my kitchen and in my code, I believed I could get by on skipping tiny details. No need to check that site in IE; I assume it's fine. Don't worry about thinking up a few more test cases; most people will follow the same usage pattern. No one really reads commit messages, so no need to reread them to be sure they're clear. I'll rename that variable later.

None of these are errors or sins, but they are much like a kitchen counter that's never properly wiped down - crumbs congregate under the toaster, there's always a little moisture around the edge of the sink, and when you turn on a stove burner, the room starts to smell like burning carbon. 

It’s not laziness; it’s the presumptive hope that everything will work out so you can save a few minutes. Unfortunately, it’s exactly what John Wooden meant when he asked, “If you don’t have time to do it right, when will you have time to do it again?”

The sad thing is, I always assumed that brilliant programmers could just write stuff once and it would be exceptional and work every time - like the story of Da Vinci drawing a perfect circle to demonstrate his capabilities. Had he erased a couple of points and said, “Hold on,” we probably wouldn’t be retelling a most likely made-up story. I believed, subconsciously, that some time needed to pass before I would get to that point.

I’m reminded of a cookbook by the famous molecular gastronomy chef Ferran Adria about his restaurant, El Bulli. In this expensive cinder block of a book, somewhere in the middle, is a series of photos of the staff at El Bulli cleaning the kitchen, Adria included. They are wiping down every surface in that kitchen, including the legs of the tables, staring with the same intensity they give their dishes as they plate them.

These chefs are brilliant, but to be so, they also have to be thorough about even mundane tasks. Cleaning isn’t beneath them; it’s an essential element of being some of the best cooks in the world. Furthermore, the reason they are at that level most likely derives from this basic attention to detail - something the customers will never see. Adria's inclusion in these photos demonstrates that this process never goes away. Regardless of how well you cook, how inspired or efficient you are, the counter will still need to be wiped down.

Back in code, I no longer hope to be the super hacker, smarmy, always-gets-it-right-the-first-time programmer. Instead, I hope to start by cleaning well and see where that takes me.


Why I Taught Java

When I taught my first programming class this past year, I decided to teach the class in Java. My co-workers at the time gave me a bit of flak for this: I should be teaching Go or Python or something with a little better reputation.

As I remember, Java was looked at as the C++ killer when it first came out. It got overhyped, it's the bane of everyone's desktop with its seemingly daily updates, and it isn't the prettiest language compared to more minimal programming styles.

So why bother teaching it? Particularly if it is semantically dense or harder to read?

Platform Accessibility

Java is free, its IDEs are free, and they work on virtually every system. Sure, web programming has this quality too, but Java introduces students to application-style development that's different from the web and further emphasizes object-oriented programming.

Furthermore, it provides an introduction to library packages, and it gives students the chance to type something and see a result that isn't just the command line.

Compilation and Typing

These are two concepts we could say aren't really that big a deal, but they are conceptually necessary for students to grasp. Programs are typically compiled, and just by introducing this requirement you can illuminate the critical concepts of a compiler, build pipelines, and the different varieties of code that take you all the way down to the ones and zeroes.

Likewise, typing, while not a huge issue for non-space-critical systems like a laptop, informs students how memory allocation works, why people care, and how programs are designed with memory in mind.
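As a sketch of the kind of classroom example this suggests (every name here is invented for illustration, not from any particular curriculum), a few typed declarations already surface size and conversion rules:

```java
public class Types {
    // truncate shows that converting double -> int requires an explicit
    // cast in Java, and that the cast discards the fraction (no rounding).
    static int truncate(double value) {
        return (int) value;
    }

    public static void main(String[] args) {
        byte small = 100;             // 8 bits; 200 here would not compile
        int population = 331_000_000; // 32-bit signed integer
        double ratio = 2.9;           // 64-bit floating point
        // int bad = ratio;           // compile error: possible lossy conversion
        System.out.println(small + " " + population + " " + truncate(ratio));
        // prints: 100 331000000 2
    }
}
```

Even this tiny program lets a class talk about why a `byte` can't hold 200, and why the compiler refuses a silent double-to-int conversion.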

Yes, it's more semantics, but a class is supposed to teach concepts as well as practical programming - and those concepts make the practical lessons more effective.

It’s Hard

I believe there's a certain level of work necessary for a concept to stick with students. The more details there are, the more attention is demanded, and the really important practices of programming become ingrained in the process of memorizing minor details like the labels for the different types.

Think about it - when you're learning typing, you are repeatedly thinking about variables and what they really mean, which lays the groundwork for more complex data types like arrays and, finally, for designing your own types with classes.
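That progression can be sketched in a few lines of Java (the names are just classroom examples, nothing canonical):

```java
public class Progression {
    // The last step of the progression: a type the student defines.
    static class Point {
        int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    public static void main(String[] args) {
        int score = 42;                 // step 1: one typed variable
        int[] scores = {42, 17, 99};    // step 2: many values of one type
        Point origin = new Point(0, 0); // step 3: an instance of your own type
        System.out.println(score + " " + scores.length + " " + origin.x);
        // prints: 42 3 0
    }
}
```

Each step reuses the previous one's vocabulary, which is exactly why the repetition pays off.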

It’s Useful

Students who learn Java are ready to look at a vendor-specific language like C# with a much greater baseline than they would have otherwise. Likewise, other languages like Ruby, JavaScript and even something like Arduino programming will all look pretty basic - a relief, even - and they'll pick them up with a lot more speed than if they had to work in the opposite direction.

I’m sure this sounds like a dad saying that shoveling snow builds character. But I’ll take it - when you're teaching novice students, programming too requires character building.

Intuitive vs Powerful

When I describe a certain piece of software, I'll often add to the end of my description that the tool in question is either intuitive or powerful.

Simple comparisons - Photoshop vs Preview, Trello vs Jira, Excel vs Google Spreadsheets - demonstrate these dual principles. Something intuitive is easy to pick up, presents everything you need right on the screen, and does a few things well and quickly. In contrast, something powerful has loads of features, complex and detailed minutiae for even a simple task like saving, and, of course, gives you complete control over whatever your subject is.

Generally speaking, most of us don't buy a $600+ copy of Adobe After Effects (well, now you'd just use Creative Cloud...) if MovieMaker will suffice. We don't need those extra widgets, and we'd only use it every so often - so why pay that much?

After reading Chase Buckley's "The Future is Near: 13 Design Predictions for 2017", specifically prediction #9, Age-Responsive Design, I'd like to suggest software whose specific user instance lands somewhere on a spectrum between the intuitive and the powerful - software that evolves its complexity based on your use, skill and demands.

In this case, your software wouldn't necessarily be like every other user's experience. Take a simple example: if you only ever need the Sum formula in Excel, perhaps you see only a couple of options in that formula. Or maybe your Kanban board in Jira looks and interacts a lot more like Trello, and doesn't need to know the business value of a task unless necessary.

Naturally, this would mean our software would be a lot bigger. But take Salesforce, already a massive system: it seems likely that a large majority of users would prefer a single interface that begins much simpler and evolves with their business needs. In fact, we know this is the case because of products like SalesforceIQ CRM.

Unlike Salesforce's divided software products (at least on the front end), I'd suggest a UI that is singular but varied. 

Take a more basic example: this is a Drupal site, and I've done Drupal development for years. I would love to be able to put a Drupal site into at least three modes - Tumblr-like, Wordpress-like, and finally full-powered Drupal. The majority of Drupal super admins are looking for a strong product but mainly use it for blogging. Yes, I could create user roles and customize the experience - specialized content entry forms, menu navigation, permissions - but I'd love a switch, with this evolving UI as a conceptual core of Drupal's interface.

Overall, I see three things that would push this concept into a mainstream software platform (most likely SaaS): first, a broad base of user types; second, an existing demand for complexity from a core base; third, strong stratification of user types. From there, all you need is to build the UI that works for each of your types - in the same way Buckley's article shows we can clearly design for different age groups.

Angelfire Retrospective

Angelfire was the first place I learned to code - and to design websites. I never really got into the latter, but back then I didn't even think there was a distinction. It was all part of programming; you did everything yourself, right? Not that there was much out there to really get excited about.

To those unaware, Angelfire in the mid-nineties, alongside GeoCities, was one of the most popular website hosts. More than just a hosting company, though, Angelfire and its peers created platforms similar to WordPress, Blogger and others of that ilk.

This was the era of frames, <i> tags and most especially tables. No CSS, no JavaScript, something called CGI that did form processing (totally over my head at the time) - but the one thing it gave you was a real live website address on the Web.

I would navigate to my page, a made-up skateboarding company where I made designs using MS Paint, every day. I would try and load it on school computers, and I would use the first edition of HTML For Dummies to try and implement every possible tag, including <blink>. 

Those were halcyon days of development for a young teen. HTML isn't processor- or platform-intensive, so it gave me - and others, I'm sure - the ability to code without paying for Microsoft's development suite or even Borland's C++ compiler (which I eventually owned). You could just build something on your desktop, try it out and share it. Sure, it was painfully slow, but it was a start.

The details of Angelfire's purchase by Lycos and other minor points about its business (did you know it started as a medical transcription service?) are pretty nineties and not very exciting, but the late nights from this pain-in-the-ass web host will always make the name Angelfire perk my ears up.

What Most People Miss About Drupal

I've been working with Drupal for about six years or so. Like many, I sort of fell into Drupal in the manner by which a lot of us end up aligned with one CMS or another. Meaning, it was a historical alignment rather than an academic one. 

Regardless, Drupal is still a good representation of what a CMS can and should be. It definitely has its shortcomings (loading times, come on!), but there's a lot to draw from its architecture and patterns. That said, there's a lot that both developers and non-tech-savvy end users mess up as they attempt to get more complex with their CMS. These aren't technical issues so much as conceptual ones.


The System Doesn't Know What You're Thinking

There's an adage in programming: the computer does what you tell it to do, not what you want it to do. Drupal in particular has loads of settings, permissions, weightings, and so forth that need to be tuned for your particular site's content. While this can be frustrating, it just takes a little time to get everything in order. A user not being able to see the navbar is annoying, but the most common place I see this issue is with images.

Drupal's fancy image cropping, scaling and processing are awesome - one of my favorite parts of spinning up a Drupal site. But what most folks forget is that while Drupal can make everything fit a certain dimension for your design, it doesn't know where your picture looks good. Maybe someday Drupal will be able to find the best crop of a photo for a thumbnail; today is not that day.

I've seen this in subtle ways elsewhere. For example, folks will develop a complex user hierarchy but then not want to manage it, wishing the system would just know where their new users should go. Likewise for node type management, particularly when there are loads of custom fields per node.

You Must Have a System as Well 

Drupal is a CMS - a system that tries to publish your content in an organized way. But your end of it, as a site content producer, must be systematic too for the site to work well for you. Drupal is not a tack board for plopping content however you like on the front page. A loose system for how you display, publish and arrange content can feel expressive for some producers, but it's also very brittle, and the site is liable to break and stumble over itself.

This most commonly occurs in design and in the block system. You'll have a hundred or so blocks with extremely specific node listings for display. None of that is fun to maintain, and at some point you'll want to toss it all out.

Layout and Content Are Not the Same Thing

This is my personal rule, and I've violated it plenty of times: at no point should you have complicated layout fields in your nodes that designate whether something should have a particular background color, text color, width, whatever. All of that should be built into the theme. Content producers are not designers, and they shouldn't be forced to make design decisions as they produce content.

Use different node types if the layout is really that complex. More importantly, Drupal works best with a simple design rather than an overly complex one that ends up sucking the energy out of the site editors as they try to create nodes that just look decent.

Tagging and Sorting Are Not the Same Thing

Drupal's taxonomy system is awesome - it's all you need and just so. As such, it should not be jury-rigged into a blend of ordering and tagging. This usually comes up when someone defines a list of terms like "featured" or "top" to make things stick to the top of lists. First off, Drupal already has a sticky feature. Second, without a doubt, site users almost always mix up their ranking terms, so the content lists don't come out looking as they should.

I always recommend using nodequeue or node references if you want to manage specific content lists, and letting your related content float naturally by some ordering like publication date. This typically gives you all the control you need without an extra layer of hassle.


More often than not, developers give themselves trouble by not holding their Drupal implementations to an organization in which Drupal works well. Sure, you can make Drupal do such and such, but oftentimes it's a hack rather than an innovation. While these make-it-work solutions can satisfy for a short period, they tend to give Drupal site managers and content producers more work.

Best advice: keep it simple. Spend your coding time making good themes, organizing proper Views, and using custom module development for more complex queries or data entry and processing. If you find yourself spending a lot of time in custom fields, taxonomies or blocks, you may want to reconsider your approach.

When web dev goes political

The release of HealthCare.gov and its subsequent failure is not something that most web programmers would be very surprised about. 

This has nothing to do with the reported 500 million lines of code, which is bull anyhow. It has to do with the organization of any project of this scale, and with what most people would anticipate about how the government runs software projects.

There are probably some groups within the government, particularly within the military, that run awesome operations and would put most private-enterprise teams (including my own) to absolute shame. With HC.gov, however, we knew we were dealing with a new team, with new objectives, in an untested user market. That team was going to be green in its field regardless.

Furthermore, we knew this green team was going to have to scale up immediately. Consider a large application like Facebook. Its user base did not show up on day one. Its feature set, for all its photo and tagging capabilities, was nowhere near what it is now at the beginning. Facebook was something a motivated developer could design and test with a small group, making incremental improvements as the software was used.

Not so for HC.gov - everything had to work on day one. What happens if you need to update the system? You can't - it just has to work.

This is not how most web applications are developed. This is how a lot of desktop applications are developed - basically, choosing which bugs to ship with. And that makes the expectation that "a website should just work" a real problem, because there is no version 2.0. This is now, and now it should fucking work.

What those of us in this industry could not have known about is another common problem of green teams - a lack of testing.

As so many books on testing will tell you, you are not done with something unless it passes a test. I would say "the home page loads" or "a new user can sign up for insurance" would be pretty big tests to pass. A beginner tester might note that sign-up is not something that would ever be one test, and that's exactly the point - an integration like that would most likely have thousands of smaller tests behind it. You wouldn't even get the chance to run the larger test if those others didn't pass. So yeah, obviously someone skimped somewhere.
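To make that layering concrete, here's a minimal sketch of what "small tests behind the big test" looks like. Everything here - `SignupChecks`, `isValidZip`, `isAdult` - is an invented stand-in for illustration, not anything from HealthCare.gov:

```java
public class SignupChecks {
    // Tiny validation checks; each one would back a single small unit test.
    static boolean isValidZip(String zip) {
        return zip != null && zip.matches("\\d{5}");
    }

    static boolean isAdult(int age) {
        return age >= 18;
    }

    public static void main(String[] args) {
        // Each assertion is one of the many small tests that must pass
        // before the big "a new user can sign up" test even makes sense.
        assert isValidZip("80302");
        assert !isValidZip("ABCDE");
        assert isAdult(26);
        System.out.println("all small checks pass");
    }
}
```

The big integration test sits on top of hundreds of checks like these; skip them, and the integration failure tells you almost nothing about where the problem is.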

And take note, none of this has to do with benchmarking tests or the like. If processing applications were merely hung up on performance, we could run them by hand at 3 a.m. and succeed. Fixing that would be like the AOL days - the government would spin up more servers and db backups, something pretty trivial at this point in time. To my knowledge, that isn't the issue.

Most often, when you're trying to trim costs, TDD and adequate QA are what get cut. So it would be no surprise that this is where the trimming happened - except they weren't cutting fat, they were cutting meat.

I'm not a behind-the-scenes coder who could actually verify these issues, but it doesn't take a big leap to guess at, and take heed of, the troubles a disaster of this magnitude demonstrates. The President looks incompetent, and his opponents look spot-on in claiming the government is incapable of providing healthcare.

The truth is that the last point hasn't even been demonstrated - the government and the Obama administration simply didn't produce the software to provide healthcare. And really, it's gotten to the point where it makes no difference.





When I was a kid, I watched the video above. I loved the hell out of the Magic Secrets Revealed series. I loved that it had that guy from The X-Files in it, I probably loved the assistants, and I definitely wanted to know who the Masked Magician was. When I finally saw who he was, I was disappointed. Truthfully, the only magician I knew was David Copperfield, and obviously it wasn't going to be him.

After a side comment about David Copperfield tonight, I ended up watching a bunch of magician videos, which led me here.

I remember the Masked Magician's message from when I was a kid - namely, that by revealing tricks, he was pushing magicians to even greater heights. What I missed was that by discussing the elements of magic, Val (that's the guy) had actually encouraged kids to get into magic.

It's a practical message. Obviously, not everyone is going to watch this show, and there's more to magic than just knowing how it's done. Shit, for the most part I know how a plane flies. As Val mentions, you need showmanship like Copperfield's to do this well.

I've watched Val's reveal of the above illusion, and I can still only guess that one of the men on the other side has a fake arm to produce David's face - or just assume it's all a stupid video trick. If it's not, I still don't believe David's magic, but I'm interested.

There are two careers I've given my heart to: cooking and programming.

I was very fortunate to work for an exceptionally talented chef, who taught everyone on his line that food was about skill. It was not magic. It was not about "authenticity" or "real X cuisine". I worked for a guy who made fucking fantastic northern Italian food from Colorado. 

In particular, I remember working the front of the line while he worked the wheel, and some folks came back to ask for the secret recipe to our boar sauce. He flatly told them there was dick special about it: cream, demi-glace, rosemary, time.

And that's exactly the point - you can't cook that, because you aren't good enough to handle BASIC ingredients. The secret isn't ingredients; it's time. Which you haven't got, or haven't put in.

Now that I work as a programmer and lead teams, I dispense with any notion of the genius or rock-star programmer. The deeper you get into code - from high-level languages to assembly to virtual machines to specific circuits - the more you realize that everything is pretty direct. It's just a lot of building up from simple elements.

Yes, there are geniuses and brilliant chefs, but if you're reading this, you're unlikely to be one of those people who can skip steps intuitively. The funny thing is, even those folks know the steps are there - but that's another discussion I won't diverge into.

So I've revealed the secret - I've shown you the secret ingredient - but you're not any closer to wowing people with that card trick, because you've got to practice it to make it look good, let alone make it a profession.

I think Val did a good thing. Hardly anyone remembers him today, but I'd guess he made a few kids interested in magic who are damn good at it now, because he revealed the trick, made them focus on perfecting that one tiny thing, forced them to develop the hundreds of other microskills that support it, instilled the confidence to make it look real, and created magic in their lives.


Why I bought R.O.B.

I own lots of vintage computer and gaming stuff. It's something I'm into. I like simple systems that are easier to wrap my head around; there's a charming simplicity to it all.

ROB - the Robotic Operating Buddy - is a piece of shit I have no interest in ever getting my head around.

ROB is an additional controller that hooked up to the original NES. He even came bundled with the system if you paid a little more. He basically only plays two games - Gyromite and Stack-Up - which suck as well. The professional below bitches much better than I can about the awfulness and absolutely unplayable nature of these games, specifically because of ROB's failure to work correctly or time his motions right with the game.

So why buy it? Because I've always wanted it. Seriously, back in the day, I really imagined that ROB had some sort of personality and would be this super-smart robot who would advance my pretty pathetic gameplay. If you've watched the video above, obviously that wouldn't have been the case. Nostalgia sets in deep - I even have the Official Nintendo Player's Guide, which lists a category of games known as the "Robot Series." Guess what: it's Gyromite and Stack-Up.

However, I did want ROB for another reason. I like to see the faith people had in technology and the exploration it took in the past. ROB may have been rushed to market to add novelty to a system that would be forever praised for its basic elements: games and square controllers. Call it desperation, or Nintendo thinking kids will buy anything (guess they were right in my case), but holding ROB now, I know that someone, years ago, thought this was where games and human interaction with computers might be headed.

Perhaps that's a little idealistic on my part, but I've been one of those people my whole life. Less so now, but I'd look at games like Burncycle and imagine the possibilities of such a medium.

I didn't by any means think it was the be-all and end-all of development, but it allowed me to really think about where I thought technology was going. It was good practice. I still think of ads for games and movies I knew nothing about, where I imagined their plots; most of the time I enjoyed the imagined plot more than the actual one (my idea for Jurassic Park excepted).

When I look at my lifeless ROB (I couldn't really afford a working one), I see the start of that process for me. I know there's nothing special about that quality, or even about collecting for nostalgia's sake, but those reflective eyes remind me not of a carefree time, but of a place where I was forced to imagine the possibilities rather than be disappointed by their implementation.

Goodbye to the Ballmer Peak

Today I say goodbye to the Ballmer Peak. 

I have never handled alcohol well. I've rightly earned a reputation as an overdrinker. Typically, this is a joke among everyone who hasn't had to deal with my bullshit, most especially my wife. 

But in software, drinking IS a joke. I've never particularly understood this. I find that software requires a fair amount of my wits to do well. Though I will say coding is aided by a bit of numbness - for me, that takes the form of low light at night listening to RainyMood.

I've had three major careers in my life: cook, writer and coder. All of these fields have a reputation for heavy drinking. In reality, the only place I saw it, and where I learned the majority of my bad habits, was cooking. You drink there because there is fuck-all to do after a late-night shift: you're hot, you're in pain, you want to blow off steam, you're poor, you got your ass reamed, and it really doesn't matter if you're hungover, because your job actually feels better a little numb.

As a writer, I pretty much completely sabotaged my career by drinking. I worked mainly in copy, but the opportunities various editors gave me to step up were typically ruined by drinking and by rushing out stories that were badly worded and poorly thought out. I was poor there too, though less so. But hey, writers drink, right?

In my late teens, I learned HTML/JS (no CSS back then), C++/C, and Perl, all on a POS laptop from my girlfriend's dad. Being a writer sounded sexier, so I left coding to the cubicle drones. In my late twenties, I was back in the saddle. I found that here, too, drinking was just as encouraged and culturally supported as in my past two careers. This time, though, I had money.

If you can't tell where I'm going here, allow me to clarify - I think all of this is FUCKING STUPID. To cut past the anecdotes and get to the didactics: there is absolutely ZERO benefit to making drinking a part of coder culture. Coding is about building, intelligence, innovation, and cleverness, and alcohol benefits none of those things. It's antithetical to the culture entirely.

Software is the culture of late nights of coffee and Coke - sometimes Mountain Dew. Sacrifice, for the glory of outsmarting the other guys, of building cathedrals. It has higher aspirations than the accoutrements of success that go with glorifying alcohol. And that, I think, is the only reason drinking is so common in popular depictions of coding: people who code are supposed to be rich now, and rich people drink to excess, like they don't give a shit. They can afford to be out of it. In reality, coders can't and shouldn't.

Go ahead and drink. Seriously - I'm not being anti-alcohol. But don't ascribe it any power over code. For every crazy Friday with cheers all around, there are hundreds of cups of coffee and bloodshot eyes behind it.
