Happiness is DRY Code, or "Programming Language Stratification"

Cryptic prologue

As George Burns may or may not have said:

Happiness is DRY code in a good shell script

Your code—directed by Matthew Vaughn


In the age of devops and the full-stack developer, it’s common to have many layers of programming in your life. These layers are often treated as classes in the sociological sense, where an implicit hierarchy of resource allocation and standards is applied subconsciously.

To everything, there is a season


As in sociology, the hierarchy shifts, unshifts, pushes and pops over time.
JavaScript code could be said to be higher up in 2016 than in 2004.
Objective-C might be lower in 2016 than 2010.
Throughout the ages, though, it’s been common to see the same programmers who give incredible attention to detail in their code throw all caution and decorum to the wind when writing shell scripts.

Shell scripts—no longer just for 90s cyberpunk movies

As the Perl, Ruby and Node.js communities successively lowered the barrier to writing CLIs, the range of devs willing to consume and create CLIs has expanded. I credit this change with a renewed interest in pure shell programs written for bash and zsh. Alongside this, even within the shell script sphere there’s been a stratification—between public (OSS) and private (proprietary or personal) shell scripts. While proprietary Java code often receives similar scrutiny to OSS Java code, the same doesn’t hold true for scripts. It doesn’t have to, and shouldn’t, be this way though.

Walk the talk

I was compelled to write this post because of the immense benefit I’ve received over time from treating my zsh scripts much like I would JS packages. I put them in version control and all that jazz. But most importantly, I endeavor to make the code as readable and DRY as possible. How this benefits me day-to-day can be seen in the following real-life example. I have a script that’s part of the zsh setup I sync across multiple machines: a set of aliases for moving into directories I commonly develop within.

source $ZSH_SCRIPTS_DIR/bootstrap.zsh
alias gpom="git push origin master"
alias zscripts="c $ZSH_SCRIPTS_DIR"
# Personal / OSS
alias ghub="c $GHUB_DIR"
alias bitb="c $BITB_DIR"
alias ghjw="c $GHUB_DIR/jameswomack"
alias exti="c $GHUB_DIR/jameswomack/exploding-video-tiles"
# Netflix
alias stash="c $STASH_DIR"
# Netflix DEA
alias ignite="c $STASH_DIR/dseplat/ignite"
alias nmdash="c $STASH_DIR/nmdash/nmdash"
alias abacuse="c $STASH_DIR/abacus/abacus-editor"
alias abacusa="c $STASH_DIR/abacus/abacus-editor-app"

Now, yesterday I migrated a couple of Abacus repos from Stash to Github. That affects my aliases across several machines, and also affects scripts outside the above example. But because I’ve carefully put common paths into variables and already set up syncing across each of my machines, it’s a simple matter of find-and-replace (in one file), then commit and push. If I hadn’t broken my common paths out into variables… if I hadn’t modularized my scripts… if I hadn’t set up version control and syncing… this seemingly simple task would be rife with room for error. In the end I simply changed this

alias abacuse="c $STASH_DIR/abacus/abacus-editor"
alias abacusa="c $STASH_DIR/abacus/abacus-editor-app"

to this

alias abacuse="c $GHUB_DIR/abacus/abacus-editor"
alias abacusa="c $GHUB_DIR/abacus/abacus-editor-app"

and then

git commit -am "refactor(aka): Abacus Github migration"
ggpush

It’s the little things :)
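The same DRY-paths principle translates directly to Node scripts. As a sketch (the directory paths and the repoPath helper here are made up for illustration, not part of my actual setup), common roots can live in one object so a migration is again a one-line change:

```javascript
// Hypothetical sketch: repo roots extracted into one place, mirroring
// the $STASH_DIR / $GHUB_DIR variables used in the alias script.
const ROOTS = {
  ghub: '/Users/me/github',
  stash: '/Users/me/stash'
};

// Paths are always built from the roots, never hard-coded inline.
const repoPath = (root, ...segments) => [ROOTS[root], ...segments].join('/');

console.log(repoPath('stash', 'abacus', 'abacus-editor'));
// '/Users/me/stash/abacus/abacus-editor'

// After a Stash → Github migration, only the root key changes:
console.log(repoPath('ghub', 'abacus', 'abacus-editor'));
// '/Users/me/github/abacus/abacus-editor'
```

Every call site stays untouched when a repo moves hosts; the change is confined to one object in one file.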


The Reproducibility Crisis

Schooled


Ever since I was a boy, I’ve had an issue with textbooks. These sacred tomes filled with confident assertions offered a steadfast view of the universe I lived in, but when I peeked into the textbooks of others I learned we were each living in a different universe.

Different than my friends’

When I looked at the textbooks of my friends, who went to other schools, the science they learned was different than my science. It was as if they lived in a different universe. I say that because the books never said “this is what we think” or “this is our guess about how this works”. No no no no. They said “this is what we know” or “these are the facts”. So at Loma Verde Elementary, the facts were quite different than at Castle Park or Wolf Canyon. Never mind that the facts were in Spanglish at my Loma Verde—that wasn’t the big deal—it’s that they were as if from another universe than at CP or WC.

Different than my parents


Most of us have had the feeling that our parents are from a different universe. Well, I can say with assuredness that my parents truly are from another. How do I know this? I know because I’ve read my mother’s 6th grade science tome. In it, dinosaurs are lizards—not in any way related to birds. In my mom’s universe, the 1990s were filled with flying cars. My 1990s were filled with false promises of the type of bullet train Nihon seemingly had for decades, and a Britney Spears video that sent hormones into overdrive.

School’s Out

The localized reality distortion fields didn’t cease to exist outside of the educational sphere. I could barely turn the corner toward the local market or turn on the TV without another scientific “fact” being blurted out at me.

There is no truth, only sugary refreshment


When Mulder said the truth is out there, he should have also told me it takes as many forms as there are blog posts and subtweets. While it always struck me as odd that scientists always know the truth, the whole truth and nothing but the truth—and yet change their minds every 5 minutes—that doesn’t seem to strike anyone else as anything other than awesome. Take The Coca-Cola Company for instance! They spend millions of pounds each year on British scientists who use their exceptional Anglo-Saxon brain power to invent whatever universal truths Coca-Cola asks them to. That’s the Union Jack promise! And while we’re all grateful for CC proving that Mexican Coke is healthy, there’s a greater yarn-spinning factory out yonder. That factory is the medical research establishment.

Curing cancer one irreproducible study at a time

Here’s where I get serious—serious enough to stop telling a stupid story and get to the cold hard facts. Essentially, next time you hear there’s a new cure or big breakthrough in cancer research, that might not mean much beyond a scientist getting an award and a few attaboys stuffed with Canadian dollars and Prosecco popsicles. The reason?

“…Clinical trials in oncology have the highest failure rate…barriers to clinical development [are] lower than for other disease areas, and a larger number of drugs with suboptimal preclinical validation will enter oncology trials.”

Translation? You don’t need to prove your oncological results to make money or gain respect. But don’t listen to my irreproducible blog post. Read the facts:


The everlasting benefit of naming conventions

Take a look at this example .tern-project file

{
  "libs": [
    "browser",
    "ecma5",
    "ecma6"
  ],
  "loadEagerly": [
    "./node_modules/abacus-notepad-component/dist/*.js",
    "./node_modules/activity-component/lib/*.js",
    "./node_modules/component-popup/src/popup.jsx",
    "./server/**/*.js",
    "./server/*.js",
    "./client/src/js/**/*.js"
  ],
  "plugins": {
    "node": {}
  }
}

We’re using TernJS and its loadEagerly option to have intelligent & dynamic autocomplete available in our JavaScript editor setup. It works anywhere from vim to Visual Studio Code. But I digress.

As our project grows, we add more entries. As our number of projects grows, the file will likely get copied all over the place, including onto other developers’ machines that you do not control. Even if you did automate (and even control) the propagation and maintenance of this file, you’d have a problem: the module names, file paths and file names aren’t normalized. A file path is an address. Addresses are normalized in society because they serve a purpose that is not achieved if humans cannot make assumptions about them. When identifying structures within the United States of America, addresses are generally normalized to meet the following assumptions:

  • The first piece of the identifier is a real number, with the vast majority being integers (a small percentage have a vulgar fraction appended). Most importantly, the overwhelming majority of US streets have the odd numbers on one side of the street and the even numbers on the other. Anecdotal evidence shows this to be even more important than chromatic sequence when locating a structure
  • The second piece is almost always the name of the thoroughfare touching the land nearest the official entrance to the structure
  • The (optional) third piece is a sub-identifier representing the identity of the unit within the structure identified by the preceding and succeeding pieces
  • The fourth piece is the city name
  • The fifth piece is the state name
  • The sixth piece is the Zone Improvement Plan or ZIP code. It consists of a five-digit and a four-digit integer separated by a hyphen (or “dash”). This is, in my opinion, one of the weaker parts of the address system, as most folks do not know the four-digit appendage that was introduced in 1983, nor do they usually know many ZIP codes other than their own

Just as the home address system in the USA does, a file path convention will have stronger and weaker aspects. All the same, we all know that having an address system is better than none, so why would you not have one for files? In retrospect it becomes an obvious choice.

Take a look at an improved .tern-project file, taking into account the lessons of the address system

{
  "libs": [
    "browser",
    "ecma5",
    "ecma6"
  ],
  "loadEagerly": [
    "./node_modules/abacus-*-component/lib/**/*.js",
    "./lib/**/*.js"
  ],
  "plugins": {
    "node": {}
  }
}

What changes did we make?

  • Name all our team’s custom components using the format teamname-modulename-component
  • Always put our transpiled/consumable JavaScript files in a folder named lib, organized into appropriate subfolders & always using the extension .js. This is the convention, whether it’s a small package or an application

That’s it! We went from 6 entries to 2 just like that.
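A convention only pays off if it holds everywhere, so it’s worth checking mechanically. A minimal sketch (the abacus team name and the regex are assumptions for illustration, not a real tool we ship):

```javascript
// Sketch: validate package names against the teamname-modulename-component
// convention described above. Team and package names here are hypothetical.
const CONVENTION = /^abacus-[a-z0-9]+(-[a-z0-9]+)*-component$/;

const followsConvention = (name) => CONVENTION.test(name);

console.log(followsConvention('abacus-notepad-component')); // true
console.log(followsConvention('component-popup'));          // false
```

Run over the names in package.json, a check like this catches strays before they ever force an extra loadEagerly entry.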

Happy filemaking!


Automating the hiring process

A representative of an organization called TestDome Ltd contacted me a couple of days ago. Today, they pinged me again, asking “Any thoughts on this?”. What they wanted my thoughts on was their service, which automates the use of programming quizzes to filter out folks in the (often long and winding) hiring process. My answer was as follows:

My thought is that this type of automation cannot solve the type of hiring problems we have at Netflix. Programming tests are an extremely poor evaluation of senior programmers. Senior programmers leverage their past experience to effectively combine the best existing solutions in such a way that they’ve created something new, maintainable and sustainable. Senior programmers are good at working with others. They conform to, while incrementally improving, coding style and standards. The only way to assess these things in advance is to look through open source contributions, Stack Overflow answers and behavior on Twitter, and then pair program with them. An automated programming quiz merely tests how long they’ve spent doing automated programming quizzes.


The More JavaScript Changes...

..the more it stays the same.

It’s clear I love dynamic languages. I like metaprogramming. I like DRY code. I hate repeating myself (at least when I write code—I repeat myself a lot in person). Objective-C and JavaScript are both dynamic languages, and they are the two langs I’ve produced the most open source code with.

I also like reliable and fast languages. That’s why, although they’re dynamic, I don’t favor Ruby or Python. Python is the better of the two, but I still can’t justify using it to create the type of web services I write.

That was all setup for the following: Just as Swift is less dynamic than Objective-C, new JavaScript is less dynamic than old JavaScript. Examples:

  • import|export syntax vs. CommonJS
  • More types and implementations of type systems

Static analysis in the language itself isn’t the only reason we’ve gotten less dynamic. Build time tools such as Browserify, static analysis via ESLint, type checking via Flow and several other tools have given us greater safety at the expense of the former wild west freedom.

While I strongly dislike giving up dynamism, I have a much stronger dislike for unDRY (WET?) code. There are a couple of recent additions to JavaScript that require a little more thought up front but result in DRYer, safer and (after some re-training of your team) more expressive code. I’m talking about object shorthand syntax and computed property names. Technically, computed property names have a duality between safety and danger, but that’s why I love JS.

Object shorthand syntax reduces errors and the reader’s overhead by taking advantage of the fact that keys and the variables assigned to them should be the same anyway. I’ve always done it that way and viewed variance from that as evidence of not having thoroughly evaluated the why and how of semantics in your application. This syntax is especially valuable in React components, where passing props is common:

({ dimensions, mappings }) =>
  <Component
    {...{ dimensions, mappings }}
  />

vs.

({ dimensionz, mappingz }) =>
  <Component
    dimensions={dimensionz}
    mappings={mappingz}
  />

No joke, I’ve seen plenty of code where the variables were just as arbitrarily named differently from the keys.

Computed property names allow one to (dangerous but powerful skill) dynamically create method names on an object while declaring the object or (safe skill) use constants to name your methods while declaring the object. We’ve always been able to do the following:

const mappings = { }
mappings[SOME_CONSTANT] = '<3'
mappings['time_' + Date.now()] = new Date
explode(mappings)

but now we can do the following

explode({
  [SOME_CONSTANT]: '<3',
  ['time_' + Date.now()]: new Date
})

New safety, but also new power depending on which aspect of computed properties you focus on!
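To make that duality concrete, here’s a small sketch (the property names are arbitrary examples of my own): a constant-based computed key can’t silently drift, while a hand-typed string key can.

```javascript
// Safe: the key comes from a constant, so every usage site agrees on it.
const SOME_CONSTANT = 'favoriteShow';
const safe = { [SOME_CONSTANT]: '<3' };
console.log(safe[SOME_CONSTANT]); // '<3'

// Silent failure: a typo in a string-literal key just creates a wrong key.
const risky = { favoriteShoe: '<3' }; // oops — "Shoe", not "Show"
console.log(risky.favoriteShow); // undefined

// Dangerous-but-powerful: a fully dynamic key at declaration time.
const dynamic = { ['time_' + Date.now()]: new Date() };
console.log(Object.keys(dynamic).length); // 1
```

A misspelled constant name would throw a ReferenceError at the point of use; the misspelled string key fails with no error at all.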

The more JavaScript changes, the more it stays the same I guess.


A Proven Cure for JavaScript Fatigue

As an ADHD-addled obsessive who’s been writing JavaScript since 1998, the drive to stay avant-garde is nothing new to me. Whether it’s collecting the latest comic books, tech tomes or guitar effects pedals—I’ve always yearned to get my hands on the latest. From childhood on I’ve used a proven set of techniques and principles to guide the way I consume fresh information. The same techniques that helped me transition from a construction worker/dishwasher to a Netflix software architect are the ones I use today to upgrade from Babel 5 to Babel 6.

I'm not tired

Now, on to JavaScript Fatigue specifically. JSF has relatively recently entered the web developer’s lexicon. However, it’s been a part of technical work for much longer. The difficulty of keeping up-to-date in our industry has been there for many years, but only recently has it become a social pressure. An in-depth exploration of the reasons for the rise of “JavaScript knowledge as fashion” is not in the scope of this article, but the drivers include:

  • Github’s socialization of code
  • Twitter being a key forum in which we examine our place in the industry
  • A rapid increase in salaries—and the subsequent gold rush of smaller-better-faster code jocks. This includes the me-first virus in our industry (see this)

Regardless of the causes, the following 6 principles are the cure for JavaScript fatigue that works for me:

  1. Automate, automate, automate. Any sites that you repeatedly visit for information should be automated as feeds via services like IFTTT.com. An example of doing this can be seen here.
  2. Eliminate all information inputs that are not essential to being the best programmer/manager/artist/human you can be. I follow something resembling an inverted Mad Max version of the Pareto Principle here. If a Google Group, newsletter or Twitter user does not produce life-enhancing content for you at least 80% of the time, eliminate it. That Ruby on Rails user group that was really active in 2007 but is a shell of its former self? Unsubscribe. That high school buddy Bradley who was fun in 2003 but now posts 100% negative rants? Bye, Bradley.
  3. Constantly replace your previous realms with new ones. If you’ve already covered following 100s of mobile development brothers on Twitter, try following some of the web development sisters. Learn about new areas of life from as many different types of people as possible! Default to saying yes, then revisit and say no aggressively. Say yes to trying new meetups, modules & software, but don’t stay too long if they’re not working for you after you’ve given them the ol’ college try.
  4. Ensure you’re constantly around experts in different but related fields. Contrary to popular wisdom, only associating with teammates that are focused on your realm can result in suboptimal performance as you will spend more time debating choices than making them and executing on them. On my team at Netflix, I’ve an author & former Digital Humanities Specialist from Stanford, a design-savvy D3 expert & a Data Scientist with a business degree around me at all times. We’re all multi-disciplinary and hold one another accountable, but defer to one another on the implementation details of our respective areas of expertise. I learn about new things—so do they—and we focus more energy on learning than arguing.
  5. Allow the wisdom of the crowd to lead you to treasure, but don’t let mob mentality dictate which gems you put in your rucksack to take back to camp. Crowd-sourced wisdom is aplenty on the web and can be found at Product Hunt, by following key individuals on Twitter & by subscribing to the popularity feeds of Github.
  6. Never study when you’re fatigued unless you’re in-the-zone. Ensure your body and mind are primed to effectively take on new information. There’s no point in going through your info feeds when you’re too knackered to move any of the info into long-term memory. Physical health is deeply tied to mental health too—take walks in the sunshine frequently.

The Future of Editing JavaScript

I think the future lies in having an app like Visual Studio Code seamlessly integrate tools (Browserify, Travis et al.) in an agile and open manner. This’d be much like what Slack did in bringing Hubot-like hacker features into a user-friendly place. That’d offer the ease-of-use of something like Visual Studio, while also bringing the intelligent & agile hacker-friendly features of the Terminal.


Const(ant) Craving

“Humans avoid change unless they see immense and immediate personal benefit. const isn’t about personal benefit or immediate benefit. It’s about long term benefit to others—those that read your code.”

Prologue (before const)

While const has been available in both Chrome (20) & Firefox (13) since summer 2012, it didn’t become part of my core JS vocabulary until Mr. Schlueter announced Node v0.10 in the spring of 2013.
I’d seen const on MDN when I was still using Node v0.6, but:

  1. I wasn’t eager to fill my JS with the constraints I had when programming for iOS (types, static, private, et al)
  2. I was introduced to isomorphic JS early on and didn’t want to use transpilers for client-side code for what amounted to a minor readability aid

Why I Turned to const

In 2013 I started to not only use Node.js for important production apps, but I began preaching the good word of proper Node.js programming to Sony employees in both San Diego, USA and Tokyo, Japan. Being tasked with teaching, as well as producing modules that’d end up maintained by others, led me to reach for new levels of readability in my JavaScript code. My journey toward ultimate readability led to increased specificity in several areas of my code. Many of these were informed by things I read during my days as an iOS programmer. My colleagues at Sony had plenty of input as well. Examples include:

  1. Naming variables in such a way that, if entered into Google Translate, they would make sense in another language. I was teaching Japanese developers who weren’t fluent in English, and this is just a good way to train yourself to write meaningful variable names for folks who speak English as a first language as well!
  2. Always, always, always putting requires, constructors, prototype declarations, exports and other common statements in the same place in each module
  3. Taking (what I would call) full advantage of linters and style tools
  4. Endeavoring to unite approaches between client and server. This led to several new practices, but Browserify was perhaps the most important of those
  5. Creating or consuming documented modules for all repeated tasks. From checking if something is an Array, to connecting to a CouchDB instance, let’s keep things DRY, sharable & effortlessly learnable
  6. Most importantly, an overall guiding principle. That is, always side with code decisions that communicate more information over code decisions that decrease typing effort or simply impress others with their syntactic athleticism

Make Nostradamuses Out of Your Team (or how to write predictable code)

Nostradamus Yodamus
Number 6 above is where const comes in. Quite simply, var makes code harder to read (harder to predict) than const does. JavaScript developers have (unfortunately) often misused var by re-using pointers like it’s 1982 and they need to hoard them like Quaaludes in Jordan Belfort’s basement. The result is, when you see foo somewhere, it’s pretty tough to know if it’s what it was when var foo = 'FOO' ran, or something else it got assigned somewhere else in your app’s codebase. With const, that’s not true. While that is a seemingly minor change, the culmination of using const everywhere it’s appropriate is that your entire app is more predictable. It is, in part, for this same reason that using immutable data structures makes code more readable.
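A minimal illustration of that predictability (the variable names are arbitrary; this is standard ES2015 engine behavior, e.g. in Node):

```javascript
// var: silently reassigned — a later reader can no longer trust `mood`.
var mood = 'calm';
mood = 'frantic';
console.log(mood); // 'frantic'

// const: reassignment throws a TypeError at runtime.
const MOOD = 'calm';
let threw = false;
try {
  MOOD = 'frantic';
} catch (e) {
  threw = e instanceof TypeError;
}
console.log(threw, MOOD); // true 'calm'
```

Wherever you see MOOD afterwards, you know exactly what it points to.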

The Little Revolt (against constants in JavaScript)

Most JavaScript developers, even those at the top of our industry, have been using var exclusively for a long time. Thus, my practice of using const has met some resistance. And hey, it’s natural. Humans avoid change unless they see immense and immediate personal benefit. const isn’t about personal benefit or immediate benefit. It’s about long term benefit to others—those that read your code. Anyway, I’ve consistently heard a small set of arguments against using const. They are:

  1. What if I want to change the value later? I want my pointers to stay flexible
  2. const doesn’t freeze the object. That’s confusing!
  3. const is harder to type than var
  4. Won’t const fail on most browsers?

The Rebuttal

Each of those arguments / questions is unbelievably easy to address. In order:

  1. a) We don’t program for “just in case” b) You can create a new pointer or change const to let later
  2. That’s a fundamental misunderstanding of computer science / how memory works. Ignorance of the law is not an excuse ;)
  3. Really? If you will never have your code read by someone else, then feel free to save yourself 2 chars a line :)
  4. This is the most valid question. If you’re programming for Node, just use it. It works at least as far back as v0.6. Any remotely recent version of FF or Chrome supports it, but IE didn’t support it until 11. Now if you’re using any ESNext|ES6 features via Babel or related tools, you can certainly use const as well

Gotchas

Strict mode

It’s important to note that in some JavaScript engines you need 'use strict' to be off to use const, while in other engines you need 'use strict' to be on! The good news is, your chosen JS engine will loudly let you know when this is the case either way.

To SyntaxError is to be human

In most engines, const initially didn’t throw an error when you attempted to reassign the pointer! The assignment statement would return the value you attempted to assign as if it had worked, but when you accessed the value the const actually points to, it’d be the original value.

(As)sign on the line

It’s specified that a const statement must contain an actual assignment, unlike let and var statements, which can be empty. In earlier implementations of Firefox’s SpiderMonkey (and perhaps other engines?), an error was not raised when a const declaration didn’t end with a value assignment.
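In current engines the missing-initializer case is a hard SyntaxError. A quick check (eval is used here only so the parse error can be caught and inspected):

```javascript
// A const declaration without an assignment fails to parse per ES2015.
let errName = null;
try {
  eval('const x;');
} catch (e) {
  errName = e.name;
}
console.log(errName); // 'SyntaxError'

// let and var declarations, by contrast, may be left empty.
let later;
var whenever;
console.log(later, whenever); // undefined undefined
```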

Hopefully this has explained why I have Constant Craving!
K.D. Lang


Falcor & why you should care about your Bithound score

Falcor has been extremely successful.

Falcor core was immediately regarded as one of the major JavaScript modules, and its sister modules are well-liked as well. As part of helping Falcor be as successful as possible, I want to raise awareness of Bithound scores (which are already being used by the Falcor project) and point out why they matter.

What influences your Bithound score?

  • The security of packages used
  • Adherence to semver (Falcor doesn’t adhere)
  • (Up|Out)dated packages
  • Adherence to a consistent style
  • Whether or not issues are addressed in a timely manner

Most everyone knows the above are important to the success of an open source module.

Weaknesses of the Bithound score (Falcor’s score should be higher)

While all of the above are important, they are far less important for devDependencies than for dependencies proper. While Falcor locks down the few deps it has (this is shockingly rare, but good on the Falcor team), its dev deps are specified with the caret, and some of the modules used are way out of date. In other words, I think the Falcor score should be higher than it is.
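For illustration (the module names and versions here are hypothetical, not Falcor’s actual manifest), the distinction looks like this in a package.json: exact versions for dependencies proper, while caret ranges in devDependencies carry far less risk because they only affect contributors’ build environments:

```json
{
  "dependencies": {
    "some-runtime-dep": "1.2.3"
  },
  "devDependencies": {
    "some-build-tool": "^2.4.0"
  }
}
```

An exact version means every install of the published module resolves the same runtime code; the ^2.4.0 range never reaches consumers at all.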

We should still improve the little things that matter

In terms of adherence to Node & NPM best practices, however, Falcor core can improve. A quick comparison using Bithound:

Falcor Path Utils (another falcor project)

KeyKey (an insignificant project of my own)

Falcor core

Size is part of this, and stats aren’t everything. However, in my modules I’ve found a strong correlation between my maintenance of them and what the Bithound score ends up being.

I’m going to open some issues & PRs to address this, and I encourage others, both inside of Netflix and without, to do so as well.

Cheers!


Failure to Componentize (plus Stylescope & ModCSS)

Back on Aug 26, 2015, I posted hopefully about the future of web components in Netflix’s Digital Supply Chain. Since then, all fans of the open web have watched the group side with expediency (and therefore React). You can’t stop the mob’s forward march :). Seeing this coming a year ago, I offered Reactive Elements as a virtually cost-free way to bridge the gap ‘tween React and WC, but interop be damned I guess. The group recently had the joy of bringin’ on Tim Branyen from Bocoup—I hope he’ll help folks “see the light”.

Scoped CSS

As a side note, I’m developing an answer to one of the main open questions. It’s tentatively called Stylescope and serves as a more traditional (but versatile) companion to ModCSS.

ModCSS

ModCSS assists in modularizing styles by enabling a CommonJS-like system to pull in Stylus or CSS files as JSON. The primary use so far at Netflix has been assigning to style in JSX.
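As a rough sketch of the idea (this is not ModCSS’s actual API—the object shape and every name below are assumptions), styles consumed as plain objects can be assigned straight to style in JSX:

```javascript
// Sketch: the kind of object a ModCSS-style build step might hand back
// after converting a Stylus/CSS file. Shape and values are hypothetical.
const styles = {
  playButton: {
    color: '#ffffff',
    backgroundColor: '#e50914',
    borderRadius: '4px'
  }
};

// In JSX this would be consumed as:
//   <button style={styles.playButton}>Play</button>
console.log(styles.playButton.backgroundColor); // '#e50914'
```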

Stylescope

Stylescope assists in modularizing styles by enabling one to reliably assign style information to an HTMLElement with full confidence that the style won’t affect other elements. The pain of an Enterprise front-end developer attempting to add their own flair to a large CSS cascade is traumatizing, and I’m here to prevent others from experiencing the same fate :)

Stylescope is early alpha, but it already works well for one-deep style trees in its Mochify tests. For next steps, I plan to demonstrate distributing and sharing styles via ModCSS and then consuming them in a Web Component via Stylescope. I’ll need to locate a module containing all viable HTML tags to implement parent-child style relationships in Stylescope…