I don't understand why you don't understand my Views

Drupal developers typically follow this pattern when they start to work with layout systems: nodes to blocks to maybe Panels to Views. Now these are obviously intertwined, and of course a lot of us jump into custom templates and Views-like modules that aggregate and filter content.

To a Drupal dev looking at these elements, the progression seems pretty obvious and natural to content management. First you create content chunks, then you arrange more specialized content pieces, and finally you start creating lists and contextual filters of content.

When you speak with the clients who are actually going to use your system, however, this progression to and use of Views isn't as natural or intuitive.

For a lot of folks not in web development, pages of content are just that - individual documents that stand alone. They aren’t nodes, they definitely aren’t fields, and from there the concept of filtering nodes to access fields is a conversation that isn’t going to translate well between dev and non-dev. But particularly when using systems like Taxonomy or Nodequeue with Views, where curation is a focus, it’s critical that users understand the basic idea.

So what is the best way to get these ideas across?

Tech terms are obviously a non-starter. Referencing SQL or the index.php query system won't help. Neither will the name "Views". It's just too general, and it's worse still if a person has enough knowledge to understand what a "view" is within MVC, since it's then easily confused with templates.

Thus, keep terminology to “Lists” and “Filtered Lists” or whatever wording works best. 

The simplest way to explain Views from there is that these lists are built from pages and their parts: we can set filters to "go get" the pages we want and pull out just their "Title", for example.

Here again, saying "fields" and "content types" is not very helpful. Likewise, when discussing filters, if you're using Taxonomy, "label" or something similar may work best, since that's the more common term other SaaS platforms use for the same function.

From there, concepts like limit/count and pagers are a little more familiar to most people who use the web.

The key point, therefore, is to step out of the Drupal lingo and start thinking in terms of what's more common. If users reach the point where the lingo becomes important, then it's fine to switch over, but until then, communicating the key operational concepts is more vital than keeping things Drupalesque.


Cheers to small books

Recently I picked up Thomas Schwarzl’s 2D Game Collision Detection. It’s a couple years old, and I already have books that are much more thorough on game development such as Mathematics and Physics for Programmers and Game Physics Engine Development, but I wanted a book that was as stripped down and direct as possible on its topic, which I needed a little boning up on.

And indeed the book did exactly what it said it would do and no more, something a lot of programming books fail to deliver on. Simple question: how do I solve the tunneling problem in collision detection? For Schwarzl it's a couple of pages; for a lot of other books, the sheer thoroughness of the answer becomes less of a technical issue and more of a problem for the reader.
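To make the problem concrete, here's a minimal sketch of the usual remedy, a swept test, written as my own illustration rather than Schwarzl's exact treatment. If you only check for overlap at each frame's end position, a fast-moving object can jump clean over a thin obstacle between frames; sweeping along the motion and solving for the time of impact catches the crossing instead.

    #include <cstdio>
    #include <optional>

    struct Vec2 { float x, y; };

    // A point moves from p0 to p1 during one frame. A thin platform spans
    // x in [xMin, xMax] at y == 0. A discrete "overlap at the end position"
    // check can miss the platform entirely when the step is large (tunneling);
    // sweeping along the motion finds the crossing time instead.
    std::optional<float> sweptPlatformHit(Vec2 p0, Vec2 p1, float xMin, float xMax) {
        if ((p0.y > 0.0f) == (p1.y > 0.0f)) return std::nullopt;   // never crossed y == 0
        float t = p0.y / (p0.y - p1.y);              // fraction of the frame at the crossing
        float xAtHit = p0.x + (p1.x - p0.x) * t;
        if (xAtHit < xMin || xAtHit > xMax) return std::nullopt;   // crossed beside the platform
        return t;                                    // 0..1: when during the frame we hit
    }

    int main() {
        Vec2 before{2.0f, 5.0f}, after{2.0f, -5.0f};  // one big per-frame step
        if (auto t = sweptPlatformHit(before, after, 0.0f, 4.0f))
            std::printf("hit the platform at t = %.2f of the frame\n", *t);
        return 0;
    }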

While I wouldn't knock books for attempting to provide a lot of knowledge for my page-buying buck, I would say there is a space missing in programming literature: the small book, and in particular, the small specialized book.

I have a couple of algorithm books on my shelves (well, in stacks on my floor), but some of these, while serving as rich collegiate textbooks, aren't anything that can really engage the beginner or improve the intermediate developer.

By all means, I won't be returning Sedgewick's Algorithms; however, I would like to see books along the lines of 4 Sorting Algorithms that barely top 90 pages. Books of this variety wouldn't be intended as the end of all study, but they could provide a low-cost, low-intimidation entry into a new subject without excessive detail.

Sometimes, you just want to be told in a few sentences what a bubble sort is without, for the moment, worrying about its O-notation compared to other algorithms. Similar series elsewhere - A Very Short Introduction, How to Read…, and every book on meditation - demonstrate the practicality and demand for reading of this type.
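In that spirit, the few-sentences version of a bubble sort might read: sweep the array, swap any adjacent pair that's out of order, and stop once a full sweep makes no swaps. As a plain illustration of my own, in C++:

    #include <utility>
    #include <vector>

    // Bubble sort: each pass bubbles the largest remaining value to the end;
    // quit early once a pass makes no swaps.
    void bubbleSort(std::vector<int>& a) {
        for (std::size_t end = a.size(); end > 1; --end) {
            bool swapped = false;
            for (std::size_t i = 0; i + 1 < end; ++i) {
                if (a[i] > a[i + 1]) {
                    std::swap(a[i], a[i + 1]);
                    swapped = true;
                }
            }
            if (!swapped) break;  // already sorted
        }
    }

That's the whole idea; the O-notation can wait for another day.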

Kyle Simpson's You Don't Know JS series is a perfect example, meeting all these criteria: short, detailed on a specialized topic, and without too much extraneous information. Want to know closures thoroughly, but JUST closures? Well, Simpson's written that book.

As literacy in programming expands, I expect these types of books will appear more consistently, and we'll see a move away from textbook and language-survey tomes toward books on very specific topics that readers can quickly consume and then apply.

Course I guess I could start writing them myself...

Dumb Game

This weekend, after completing the ARDX Experimentation Kit for Arduino, I decided to think up the simplest possible game I could build using a variety of parts provided in the kit.

What I arrived at was Is It Even?, a game that challenges players to indicate whether or not a displayed binary number is even. Now, for anyone who knows how to read binary, this is a pretty damn simple game. But the intent was to combine a few elements that would fit on the board that came with the kit.

For someone who knew nothing about these devices, it was amazingly simple to put the project together, and the examples in the experimentation kit, more than raw Arduino C code samples, are very empowering to play around with. I highly recommend this kit for any beginner.
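For anyone curious what the game boils down to in code, a sketch along these lines would do it. To be clear, the pin numbers and wiring below are placeholders of my own, not the actual project: four LEDs show a random 4-bit number, two buttons answer "even" or "odd", and a status LED lights up if you were right.

    // "Is It Even?" sketch (illustrative only; pins and wiring are assumptions).
    const int ledPins[4] = {2, 3, 4, 5};   // bits 0..3 of the displayed number
    const int evenButton = 8;
    const int oddButton  = 9;
    const int statusLed  = 13;
    int number = 0;

    void showNumber(int n) {
      for (int bit = 0; bit < 4; bit++) {
        digitalWrite(ledPins[bit], (n >> bit) & 1 ? HIGH : LOW);
      }
    }

    void newRound() {
      number = random(0, 16);              // 0..15
      showNumber(number);
    }

    void setup() {
      for (int bit = 0; bit < 4; bit++) pinMode(ledPins[bit], OUTPUT);
      pinMode(evenButton, INPUT_PULLUP);   // pressed reads LOW
      pinMode(oddButton, INPUT_PULLUP);
      pinMode(statusLed, OUTPUT);
      randomSeed(analogRead(A0));          // unconnected pin as a noise source
      newRound();
    }

    void loop() {
      bool saidEven = digitalRead(evenButton) == LOW;
      bool saidOdd  = digitalRead(oddButton) == LOW;
      if (!saidEven && !saidOdd) return;   // no answer yet

      bool correct = (number % 2 == 0) ? saidEven : saidOdd;
      digitalWrite(statusLed, correct ? HIGH : LOW);
      delay(1000);                         // show the result briefly
      digitalWrite(statusLed, LOW);
      newRound();
    }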

Now if only I could think of something that isn't so useless. 

Who started it?

In the final chapter of Steve Jobs, author Walter Isaacson assembles a bulleted list of Jobs’ greatest career contributions – the iPhone, Pixar, the whole App ecosystem – and of course the Macintosh, which Isaacson describes as: “[it] begat the home computer revolution and popularized the graphical user interface.”

Brian Bagnall, author of Commodore: A Company on the Edge, would beg to differ:

"[The] rosy picture of Apple starting the microcomputer industry crumbles under inspection," he writes in the introduction to his book. "Commodore put computers in the hands of ordinary consumers."

Bagnall counters with his own list of what he believes were Commodore's firsts: first to sell a million computers, first major company to show a personal computer, and first to release a multimedia computer.

In this way, Bagnall's book begins as a direct challenge to Isaacson's, but aside from this opening salvo and fighting over turf, the books are excellent complements to each other.

While Isaacson's book is not strictly about Apple the company and Bagnall's book is about Commodore as a whole, both have a lot in common: each author had tremendous access to the people involved in the history of these companies and saturated his book with quotes and firsthand sources, and both are very concerned with, and detailed about, late-70s and 80s computer history.

Commodore, which closed its doors in 1994, was the progenitor of the Commodore 64 and the manufacturer of the Amiga. The company has become emblematic of the shifting sands of the computer industry in the transition from the 80s to the 90s. Commodore had money, had technology, had vertical integration (much emphasized by Jobs in his biography), and yet it couldn't survive. While Bagnall has yet to publish a long-awaited follow-up to Commodore, The Future Was Here by Jimmy Maher, a book profiling the Amiga, explains that Commodore simply didn't know what to do with its computers or how to market them as multimedia became a requirement for consumers. Jobs, however, certainly knew how to market his machines.

Bagnall's book makes a thorough and persuasive presentation of the contributions of Commodore, and notably of its technological heart, Chuck Peddle. You can't read Bagnall's book and then look at Isaacson's bullet-point list of Jobs' accomplishments without thinking "Well…"

Unfortunately for Bagnall's subject, history is written by the victors. Or rather, by the fans of the victors, it seems.

This collision between the two authors is central to a lot of what is currently being written about personal computing history. There's a plethora of books available now that attempt to state who invented the computer, or who sparked the revolution (since it needs to be called that for whatever reason), or what influenced the revolution. It ultimately boils down to one question: who was first, such that they deserve the credit?

Commodore, to its credit, is getting recognition: the documentary From Bedrooms to Billions, the aforementioned The Future Was Here, and the tremendous amount of retro computing interest have brought the company's technology back into popular interest. For many of us, Commodore sits as a time capsule of that era; we can perceive its products unaltered by any present-day form of the company, which is not the case with Apple.

For myself and my near-peers, computing history is valuable not just as the history of an industry on its own merits, but also because it is our personal history as well. We experienced that history, and many of the details filled in by books like Isaacson's and Bagnall's are enriching simply because we can say to ourselves, "Oh yeah, I do remember that." It helps us explain and understand the evolution of the digital world, one we directly inhabit and one that is still quite new.

However, I don't care who started it. Computer history is always such a tremendous confluence of factors that assigning historical responsibility is a pointless task. Oh, absolutely, individual people had tremendous impact or influence, but I can't say Alan Turing started the computer revolution any more than I can say Morris Tanenbaum did. In fact, it is the combination of factors that makes the field of study so fascinating. The iPhone wouldn't be half as interesting if we didn't have social outlets giving us a reason to be constantly engaged with and notified by our phones.

I commend Isaacson and Bagnall on their enormous efforts to document the history of personal computing and on taking on such large subjects and persons of importance. And while I agree with Bagnall that Commodore has indeed been downplayed, the whole problem isn't that Commodore isn't getting its just deserts from history. The motivation to claim ownership of the digital age is going to be there regardless of history, especially when money, power and success are intertwined.

Game Programming Patterns

Most game programming books are one of two things: very specific – particular game engines, physics, rendering, AI – or very general, for the beginner.

Robert Nystrom’s Game Programming Patterns is right in the middle.

As the title indicates, the book doesn't focus on a particular engine, individual component or language, though examples are given in stripped-down C++. Instead, Nystrom's book takes its cue from the Gang of Four's Design Patterns: present a problem, then a pattern that attempts to resolve its complexity, with higher aspirations of reusability and performance for both the computer and the programmer.

GPP is an excellent companion to its inspiration, even for the general design-pattern enthusiast. Games, at this point in time, are something most programmers have some level of knowledge of, if not expert knowledge, even if only as players. Framing design patterns within this context lets Nystrom provide relatable examples and move away from the Gang of Four's need to stay as neutral as possible when outlining their ideas. In other words, it clicks better because the content is very close to our experience.

The book is organized around groups of patterns and most of the chapters are focused on problems that are unique to game development.

The first section revisits several of the Gang of Four's canonized patterns – command, flyweight, observer, prototype, singleton, state – even taking a number of them to task as misused, bad or at the very least poorly described by the original authors. Nystrom certainly could have chosen other patterns (composite, builder and visitor spring to mind); in fact, it'd be awesome if he extended the series, but he had to find a limit someplace.
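To give a flavor of how those revisits play out, here's a bare-bones sketch of the command pattern in a game-input setting. This is my own illustration rather than Nystrom's code: instead of hard-wiring what each button does, the input handler hands back a command object, which makes rebinding keys, or queueing and replaying inputs, nearly free.

    #include <memory>

    struct Actor { void jump() {} void fire() {} };

    // Each button maps to a swappable command object instead of a
    // hard-coded branch of behavior.
    struct Command {
        virtual ~Command() = default;
        virtual void execute(Actor& actor) = 0;
    };

    struct JumpCommand : Command { void execute(Actor& a) override { a.jump(); } };
    struct FireCommand : Command { void execute(Actor& a) override { a.fire(); } };

    class InputHandler {
    public:
        InputHandler()
            : buttonA_(std::make_unique<JumpCommand>()),
              buttonB_(std::make_unique<FireCommand>()) {}

        // Rebinding a key is just swapping in a different command object.
        void bindA(std::unique_ptr<Command> cmd) { buttonA_ = std::move(cmd); }

        Command* handleInput(bool aPressed, bool bPressed) {
            if (aPressed) return buttonA_.get();
            if (bPressed) return buttonB_.get();
            return nullptr;  // no input this frame
        }

    private:
        std::unique_ptr<Command> buttonA_, buttonB_;
    };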

The other sections dive more deeply into game programming specifics, including sequencing, behavioral structures, decoupling, and optimization, but the descriptions still keep the discussion at the pattern level. These chapters cover problems that specifically arise in game development, as opposed to, say, web development, and Nystrom is even more firm in tone: use the pattern when you need it, not just because.

The patterns discussed near the end of the book specifically belabour this point, as the optimizations they aim for really do make code more complex and difficult to debug, but can greatly improve performance for games that need it.
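As one example of that trade-off, consider an object pool, one of the optimization patterns the book covers. The sketch below is my own minimal version, not lifted from the book: a fixed block of particles gets reused instead of allocated and freed every frame, which avoids allocation churn at the price of bookkeeping a naive new/delete version wouldn't need.

    #include <array>

    struct Particle {
        float x = 0, y = 0, lifetime = 0;
        bool inUse = false;
    };

    class ParticlePool {
    public:
        Particle* create(float x, float y, float lifetime) {
            for (Particle& p : particles_) {
                if (!p.inUse) {                 // recycle a free slot
                    p.x = x; p.y = y; p.lifetime = lifetime; p.inUse = true;
                    return &p;
                }
            }
            return nullptr;                     // pool exhausted; caller decides
        }

        void update(float dt) {
            for (Particle& p : particles_) {
                if (!p.inUse) continue;
                p.lifetime -= dt;
                if (p.lifetime <= 0) p.inUse = false;  // slot returns to the pool
            }
        }

    private:
        std::array<Particle, 256> particles_;   // fixed, up-front storage
    };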

Overall, Nystrom's book was like a great pair programming buddy. He presents most patterns initially as "hey, let's just get it working", then refactors through the introduction of a pattern, and then asks "do we really need this?", all the while offering asides and geek notes in the sidebars.

While this information can come from other places targeted at game developers, Nystrom's a good writer. I mentioned above that this book fits in the middle; in truth it probably leans more towards the beginner. That said, the value of the book for the more experienced is gaining another perspective on patterns and code practices from an author who delights in them.

Game Programming Patterns is indeed useful for its titular topic, but as a whole, it made programming, patterns used or not, just fun.

Z-index growth

Above is a plot of the growth of z-index values over time in a CSS project. Earlier this week I was told that my QA team was discovering websites in our network with z-indices upwards of 90,000,000. Hence I have developed a hard observational rule about z-index:

The higher an existing z-index, the greater the increase in z-index required to override it.

I have no explanation for why this rule exists; perhaps it's ego, or the frustration of just wanting to be on top. But take some time to observe z-index in your own and others' projects and you'll find it to be true. Around 100, people will jump between 90 and 110; once you get to 500, you're in the realm of increases of 50; and past 1000, you'll see serious gains. Finally, someone will just say fuck it and start making jumps of 500 or more.

I've had the fortune of seeing good style guides, where the developers define actual layers for their application to ensure that items are properly scoped. The joys of a well-designed application. Unfortunately, when dealing with CMS structures, embedded iframes and similar black-box situations, "fuck it" becomes a definitive answer.

Angelfire Retrospective

Angelfire was the first place I learned to code, and to learn how to design websites. The latter I never really got into, but back then I didn't even think there would be a distinction. It was all part of programming; you did everything yourself, right? Not that there was much out there to really get excited about.

For those unaware, Angelfire in the mid-nineties, alongside GeoCities, was one of the most popular websites that hosted websites. More than just a hosting company, though, Angelfire and its peers created platforms similar to what WordPress, Blogger and others of that ilk offer today.

This was the era of frames, <i> tags and most especially tables. No CSS, no JavaScript, something called CGI that did form processing (totally over my head at the time) - but the one thing it gave you was a real live website address on the Web.

I would navigate to my page, a made-up skateboarding company whose designs I made using MS Paint, every day. I would try and load it on school computers, and I would use the first edition of HTML For Dummies to try and implement every possible tag, including <blink>.

These were halcyon days of development for a young teen: HTML isn't processor or platform intensive, so it provided me, and others I'm sure, with the ability to code without paying for Microsoft's development suite or even Borland's C++ compiler (which I eventually owned). Instead, you could just build something on your desktop, try it out and share it. Sure, it was painfully slow, but it was a start.

The details of Angelfire's purchase by Lycos and other minor points about its business (did you know it started as a medical transcription service?) are pretty nineties and not very exciting, but the late nights spent on this pain-in-the-ass web host will always make the name Angelfire perk my ears up.

Atari: Game Over

Currently on Netflix, the documentary Atari: Game Over does one thing very well: it endears Howard Scott Warshaw to you.

Warshaw is the designer and developer of the Atari 2600 game E.T., based on the film of the same name, which in 1983 supposedly killed the video game industry, leading to millions of its cartridges being shamefully buried in the desert.

This documentary is the story of filmmaker Zak Penn's attempt to find those cartridges and to answer the question: what happened to Atari?

Like a lot of recent video game documentaries, the film begins with some variant of "Now, video games are everywhere." This is probably an unnecessary line, considering the audience for this film has to be specialized enough to care about buried Atari cartridges, but let's move past this.

The film uses Warshaw as the central character in the history of late-70s and early-80s video game development, its explosion and its subsequent downfall. Warshaw, who is now a therapist, had, before creating E.T., developed classic games like Yars' Revenge and the Atari adaptation of Raiders of the Lost Ark. He is naturally a part of this arc of history and admits that he spent decades trying to find the same high he felt as a young man riding the wave of the video game craze.

While Game Over does the perfunctory history of Atari and the interviews with Nolan Bushnell, what it does so much better than its peer documentaries is tie it all to Warshaw, a person the audience can actually connect to. In contrast, Video Games: The Movie has a group of young people talking about how "AWESOME" Atari was, which may be fun for them, but when you're sitting in the movie asking yourself "why should I care?", Game Over answers with Warshaw's experience and life.

The climax of the film, not surprisingly, is the big dig, where Atari cartridges are discovered in a landfill. People come out in droves to see the dig, and a cheer goes up as the old games are found. Ernest Cline, author of Ready Player One, is there too. He appears in several scenes throughout, in a DeLorean he got from George RR Martin, for no reason I can see other than to throw some nerd credibility in there, because seriously, he serves no plot or historically illuminative purpose. But, really, his presence is a lot like the cheer as the games are found.

I am a game and vintage computer collector myself. I don't have a tremendous amount of money, so I often content myself with old programming books from the 70s. I own three Ataris, including a Sears Atari. I own multiple copies of E.T., and my rarest game is Swordquest: Waterworld. I get excited about old gaming stuff, but not enough to break the bank.

Nothing about the Atari dig excites me. As the different people at the end of the film explain, hating E.T. has become fashionable. There are far worse games on the Atari 2600; even the Atari 5200 in its entirety is terrible. The dig and the excitement around it in the film are Internet Exciting, not actually exciting on a gaming level. That strikes me as a little hollow, like throwing a bunch of retro gaming t-shirts on a character in a film and calling them a "gamer." To any screenwriters out there looking to exploit this market, I recommend you use Ernest Cline.

I don't doubt anyone's authenticity, but self-congratulation and masturbation in gaming docs are horrendous, and this is the only part of the film that starts to dip into that realm. Fortunately, Warshaw carries it past this.

Warshaw, having been demonized for his creation of E.T., is understandably touched by everyone's involvement, hard work and enthusiasm. This is actually moving. It's validation for a person who had to give up a career he was good at and loved. As the film notes, there are no lifetime achievement awards for Warshaw.

I didn't obtain any new information from this film, but it did make me like Warshaw more, and that's for a guy who is already pretty likeable. If you haven't seen his series Once Upon Atari, it's one of the more in-depth "documentaries" on gaming and programming history I've seen.

What is Wrong with Fighting Tournaments

The above video is funny and a practical annoyance for anyone playing tournament fighting games. Personally, I always figured there were other fights going on in the background, and to be fair, in the original Mortal Kombat you do see a bunch of random bodies lying around in the spike pit that look fairly fresh.

But there is one more element to most fighting games that I despise besides the fact that they are not technically tournaments: no one in the game treats it like a tournament. Watch the below series of cut scenes. Well, you probably don't need to watch all of it. In the Mortal Kombat reboot there are only a handful of times when people are actually fighting in matches. Aside from that, people just constantly try to randomly kill one another.

Well shit, if you can do that, just fucking have everyone do a fatal melee. In the MK universe in particular, I know Shang Tsung's a bad guy, but if all he ever does is cheat and have people murdered, then what the hell is the point of having a tournament? Just have people come to your evil island and throw them in the fucking spike pit or whatever.

And real quick - who the hell are those monks in the background?

My point is: the lack of any semblance of a tournament really makes these tournament games feel like they are awkwardly trying to fit a larger story into what is essentially two people pummeling each other. AND that's exactly what they are doing. BUT I would suggest that there may be actually interesting stories to be had in fighting tournament games, if the designers had characters that actually adhered to a tournament. Perhaps there are different conditions as you go through the tournament; just a thought.

Now, if you'll excuse me, I have a creepy island to head to for a tic-tac-toe competition...
