Using Dropbox as a Markdown Image Source

Penning a new README. Opening a good GitHub issue. Answering a Stack Overflow question. Creating a descriptive pull request in Bitbucket. All of these common developer actions and more require embedding images in Markdown documents. Each of these platforms uses a different Markdown flavor, but each supports the same general syntax for embedding image URLs inline:

![Alt text](https://domain.com/image-name.png)

One of the most common flows for incorporating images into these documents is as follows:

  • Take a screenshot
  • Upload said screenshot
  • Find & copy a URL to the uploaded screenshot
  • Type out the Markdown syntax for inline images and paste in the URL

One of the most common cloud storage services for folks who’ve been using the web for a long time is Dropbox. I fall in that camp, so it was incumbent upon me to find an efficient method for executing the above four steps in a maximally automated fashion. A way to obtain a public link to an image in Dropbox, rather than a link to an HTML page containing the image, wasn’t readily apparent to me. Thus, in accordance with my principles of sharing technology skills with others the way they’ve been shared with me, I’m posting about it! Here goes:

Steps required to install Dropbox w/ screenshot uploading (happens once)

  • Install the official Dropbox macOS menu bar app and log in
  • Press the Dropbox icon in the menu bar, opening a drop-down
  • Press the preferences/gear icon in the top right corner of the drop-down
  • Press the Import button at the top of the window that appears
  • Check “Share screenshots using Dropbox” in the Import view

Steps required for each image

  • Use Shift+Cmd+4 and drag a rectangle around whatever you want to capture as an image (or Shift+Cmd+3 for the whole screen, or Shift+Cmd+4 then Space for a single window)
  • Dropbox automatically uploads the screenshot and copies a URL for it to your clipboard (no action is required on your part)
  • Enter ![]() into your document and then paste the URL that was automatically copied to your clipboard in between the parens, creating something like ![](https://www.dropbox.com/s/sosick1984fuzzy/wuzzy.png?dl=0)
  • IMPORTANT: change the 0 at the end of the URL to a 1, e.g. ![](https://www.dropbox.com/s/sosick1984fuzzy/wuzzy.png?dl=1)

That’s it! It’d be even better if Dropbox could automatically use dl=1 & wrap the URL in a Markdown-friendly template. Still, this is already better than manually moving the screenshot into Dropbox & manually creating the link. Hit me up on Twitter if you’ve found a better Dropbox-friendly way to achieve this.
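In the meantime, here’s a minimal zsh sketch of that template step. It assumes the Dropbox share URL is already on your clipboard; the function name and the sed expression are mine, not anything Dropbox provides:

# mdshot: turn the Dropbox share URL on the clipboard into Markdown image syntax
mdshot () {
  local url
  # Flip the trailing dl=0 to dl=1 so the URL serves the raw image
  url="$(pbpaste | sed 's/dl=0$/dl=1/')"
  printf '![](%s)' "$url" | pbcopy
  printf 'Copied to clipboard: ![](%s)\n' "$url"
}

Run mdshot right after taking a screenshot and the finished Markdown snippet is ready to paste into your document.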


The One Where My MacBook FaceTime HD Camera Joined the Romulan Star Empire

Many have prognosticated Apple’s fall of late. Few have accurately predicted just how far.

[Image: Romulan Warbird]

Today, my MacBook Pro’s (15-inch, 2016) built-in FaceTime HD Camera didn’t activate when I began one of what are typically many Google Hangouts in a day. Given my history with Hangouts, I reloaded the page, assuming it was a networking or runtime issue with Hangouts itself. That failing, I opened up Photo Booth, only to see something like “No available USB camera devices”. “I must have lost my mind and not realized this newer MacBook doesn’t have a camera”, I thought. But the hardware design synecdoche that is the FaceTime HD Camera was right there. It’d simply disappeared from my system’s recognized internal “USB” devices, rendering remote work-life much more difficult.

A sudden disappearance of this sort has only been recorded in encounters with the Romulan Star Empire, so I cautiously moved forward on that hypothesis.

I’d obtained a leak of Section 31 intelligence on Romulan starship network protocols and decided to give some of its incantations a try. BOOM! One of them deactivated the cloaking device and brought the cam back into the fold. I really shouldn’t be telling you this, but here’s the command to run in Terminal or iTerm2:

sudo killall VDCAssistant

Next time your MacBook’s camera decides to cloak, run that command first and ask questions later.
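If you want evidence of the cloak before decloaking, this short sequence (assuming a stock macOS install) first asks the system what cameras it sees, then restarts the camera daemon:

# Does macOS currently see any camera hardware?
system_profiler SPCameraDataType
# Decloak: kill the camera assistant daemon; macOS relaunches it on demand
sudo killall VDCAssistant

If the camera still doesn’t show up after that, a reboot is the next-cheapest experiment.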


IFTTT & the Augmented Human

I’ve been an IFTTT user for a while, but only recently have I been trying to use it to its fullest extent. Apps like IFTTT & Zapier are an early form of human augmentation.


A Better Source View in Chrome

JSONView has been one of the first things I ensure is installed in Chrome for years now. It adds interactivity and highlighting whenever you navigate to a JSON resource. I wanted the same for JS, CSS, et al., and Sight fit the bill. Both extensions add CSS to JSON resources, and they initially conflicted. It turns out that you can edit the CSS styling for JSONView (and the themes of Sight). By doing so I was able to get them to play nicely with one another.


Programming Kraftwerkflows: git-worktree

git-worktree

git-worktree enables us to manage multiple working trees attached to the same repository. Like most (all?) git-* commands, you interact with it via git worktree rather than git-worktree. I came to know git-worktree via a Google search: “working on 2 git branches at the same time”. Thanks Google!

Why use worktree?

I’m glad you asked that. Most codebases are contributed to by a team. That means plenty of pull requests. Pull requests are important even if you’re not on a team, but you’ll inevitably have more of them when a team is involved. Now PRs are great, but often your teammates cannot review your changes immediately. Does that mean you can pack up and go home to drink clamatos preparados? Maybe, but maybe not.

I want to stay productive even when my PR will be sitting there for a couple days. Sometimes the right way to fill that time is with work on another project, sometimes not. Even when you do need to keep hacking on the same project, you can usually just check out another branch, get your work done there, and move back to the previous branch when your team starts providing feedback. There is another scenario, though, and that’s the scenario in which you want to use git-worktree. In this scenario, your team’s feedback starts trickling in while you’re not at a good stopping point on another branch. You essentially want to work on two branches at the same time. Some folks will just cp -r repo repo_2 in this situation. Those folks might wonder why that’s not good enough. The reasons are profound but quite simple to understand. There are essentially two:

  1. Copying a large project takes a long time. In a plain cp -r execution, there are at least two unnecessary things happening that take the bulk of that time.
    1. Copying all of node_modules. Using git-worktree in concert w/ a symbolic link can be better here (see the sketch after this list).
    2. Copying all of your git history. git-worktree doesn’t do this.
  2. When you cp -r, you’re more likely to end up with the two folders seriously out of sync. This isn’t merely anecdotal—git-worktree keeps your remotes etc. in sync.
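Here’s a sketch of that node_modules point, under a couple of assumptions of mine (the paths and branch name are illustrative, and some build tools dislike a symlinked node_modules, so verify this in your own project):

# Create a second working tree for the branch under review
git worktree add ../repo_review my-feature-branch
# Reuse the existing dependency install instead of copying it wholesale
ln -s "$PWD/node_modules" ../repo_review/node_modules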

How to use it

First, you need git 2.5+. Before becoming interested in git-worktree I was using git 2.2. A simple brew upgrade git gave me git 2.9 and it’s working great! git worktree add ../ignite_002 master will create a new folder named ignite_002 and set its HEAD to master, as long as master isn’t already checked out in another working tree.
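A quick tour of the whole lifecycle as it works for me on git 2.9 (the folder name is just the one from my project):

brew upgrade git                       # git-worktree needs git 2.5+
git worktree add ../ignite_002 master  # new working tree with HEAD at master
git worktree list                      # show every working tree for this repo
rm -rf ../ignite_002                   # finished? delete the folder...
git worktree prune                     # ...then clean up the stale bookkeeping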

Conclusion

I’m new to git-worktree, but so far I’m finding it a useful workflow upgrade that I’ll apply in a limited set of situations. It represents the biggest improvement to my git workflow in months.
If I’ve piqued your interest, git help worktree is the place to read on.

Cheers and happy Kraftwerking my fellow Gitlians!


Nightwatch Parlance

Nightwatch has its own little semantic world. It’s a world that, while not “on fleek”, makes a lot of sense to me. In a teamgramming context I add a gang of comments if introducing new tech or use thereof. In introducing a new type of UI test to a project at Netflix, I decided to explain some of the Nightwatch parlance. I’ve transposed some of that here for y’all.

Elements

Elements start with an @ and give you improved semantics within a Nightwatch test context. Raw CSS selectors are typically either more verbose or more terse than elements in Nightwatch parlance.

Commands

Commands give you the ability to extend the Nightwatch API w/ your own methods.

Tags

Tags allow you to flexibly group your tests according to your own organization principles, allowing you to execute subsets of all tests.
Within a spec it looks like this:

function createNotepadTestRunner () {
  return function (browser) { /* test steps elided */ }
}

module.exports = {
  tags: [ 'sanity', 'ancillary' ],
  NotepadTest: createNotepadTestRunner()
}

An example of leveraging this in concert w/ NPM scripts:

{
  "scripts": {
    "test:ancillary": "npm t -- --tag ancillary",
    "test": "nightwatch -c ./config/nightwatch.js --env chrome"
  }
}
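This composes on the command line too. To my knowledge the --tag flag can be repeated to union tag groups, though it’s worth verifying against your Nightwatch version:

# Run only the sanity-tagged tests
npm t -- --tag sanity
# Run tests tagged sanity OR ancillary
npm t -- --tag sanity --tag ancillary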

Web driver

A web driver is a piece of software that lets you manipulate a website through the browser’s native interface. Selenium is the Java-based web driver for which Nightwatch provides a beautiful JavaScript API. Nightwatch supports most everything Selenium does. It also achieves the remarkable feat of being familiar to both Selenium devs new to JavaScript and JavaScript devs unfamiliar w/ Selenium.


Hating on simplicity: A developer's passion?

It’s common to see talented engineers shit on egalitarian software solutions only because they’re not perfect for them.

That mentality is both socially and technically toxic in a larger successful company that writes enduring software.

There’s a great benefit + beauty in having a very simple system that’s easy to replicate and update—easy to propagate technically & socially—but it’s tough to love simplicity with a belief system that rewards complexity.


CORS 'R' US: CouchDB & Nginx teamwork

Many years after CouchDB debuted, we still see developer after developer—manager after manager—bypass CouchDB only to rebuild exactly what CouchDB offers using a collection of other technologies. As a Netflix employee I’m well aware of the diverse set of needs that can lead to any number of different combinations of datastores and web applications fronting them. However, the majority (a deafening majority, in fact) of applications need only a somewhat performant storage mechanism paired with a JSON-HTTP transport mechanism. Why use PostgreSQL w/ an ORM and Spring to achieve the exact same thing CouchDB does by itself? I don’t know if I’ll ever understand why so many teams made the wrong decision there.

The good news is that we don’t need to be one of those teams. You can get on the rapid development train with CouchDB and even take it up a notch by letting Nginx take care of a couple little things that you might be tempted to use Node.js for—but really do not need to. For now I’ll leave you with an Nginx site config that will allow you to use cross-origin resource sharing w/ CouchDB, effectively eliminating the need for any database server programming. Without further ado:

server {
    listen 80 default_server;
    listen [::]:80 default_server ipv6only=on;

    root /usr/share/nginx/html;
    index index.html index.htm;

    # Make site accessible from http://localhost/
    server_name localhost;

    # http://wiki.apache.org/couchdb/Nginx_As_a_Reverse_Proxy
    location / {
        # https://michielkalkman.com/snippets/nginx-cors-open-configuration.html
        if ($request_method = 'OPTIONS') {
            add_header 'Access-Control-Allow-Origin' '*';
            #
            # Om nom nom cookies
            #
            add_header 'Access-Control-Allow-Credentials' 'true';
            add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS, PUT';
            #
            # Custom headers and headers various browsers *should* be OK with but aren't
            #
            add_header 'Access-Control-Allow-Headers' 'DNT,X-CustomHeader,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Authorization';
            #
            # Tell client that this pre-flight info is valid for 20 days
            #
            add_header 'Access-Control-Max-Age' 1728000;
            add_header 'Content-Type' 'text/plain charset=UTF-8';
            add_header 'Content-Length' 0;
            return 204;
        }
        if ($request_method = 'PUT') {
            add_header 'Access-Control-Allow-Origin' '*';
            add_header 'Access-Control-Allow-Credentials' 'true';
            add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS, PUT';
            add_header 'Access-Control-Allow-Headers' 'DNT,X-CustomHeader,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Authorization';
        }
        if ($request_method = 'POST') {
            add_header 'Access-Control-Allow-Origin' '*';
            add_header 'Access-Control-Allow-Credentials' 'true';
            add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS, PUT';
            add_header 'Access-Control-Allow-Headers' 'DNT,X-CustomHeader,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Authorization';
        }
        if ($request_method = 'GET') {
            add_header 'Access-Control-Allow-Origin' '*';
            add_header 'Access-Control-Allow-Credentials' 'true';
            add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS, PUT';
            add_header 'Access-Control-Allow-Headers' 'DNT,X-CustomHeader,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Authorization';
        }

        proxy_pass http://localhost:5984;
        proxy_redirect off;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }

    # Don't buffer the continuous _changes feed
    location ~ ^/(.*)/_changes {
        proxy_pass http://localhost:5984;
        proxy_redirect off;
        proxy_buffering off;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
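To sanity-check the result, a preflight request through Nginx should come back 204 with the CORS headers above (assuming CouchDB on its default port 5984, this config answering on port 80, and a database named some_db):

curl -i -X OPTIONS http://localhost/some_db \
  -H 'Origin: http://example.com' \
  -H 'Access-Control-Request-Method: PUT'
# Expect HTTP/1.1 204 No Content plus the Access-Control-* headers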

Dictionarius: Thennable

The Womack Dictionarius word of the day is Thennable (sometimes spelled thenable).

Cases

There are two primary cases of this word. I’ve provided example sentences for each case:

Case 1 ~ Adjective

Is the return value of API.getDictionariusEntries() thennable? I want to try await with it.

Case 2 ~ Noun

Each of the methods of the API object is a Thennable. We feel this is a good convention for any async-heavy networking object.

Duck Reasons

You may ask “why do we need the word/concept thennable given we already have the word/concept Promise?”. I’m glad you asked, madam. The reason we need “thennable” is, well, JavaScript (or ducks). JavaScript, like Ruby & Objective-C, makes heavy use of duck typing. Despite typed-functional-language pundits manufacturing successive declarations of imminent conflagration vis-à-vis dynamically typed proglangs, dynamic typing that’s deferred to runtime (duck typing) has benefits in both unit testing and general programming. The Thennable is a supreme manifestation of said benefits. When a JavaScript function designed for a Promise receives a Thennable (often in a unit-testing scenario), things Just Work™. I don’t know about you, madam, but I don’t want to conform to a protocol just to unit test a simple function. Example:

// foo.js
export function promiseMe (prollyAPromiseRight, doSomethingElse) {
  return prollyAPromiseRight.then(doSomethingElse)
}
// foo.test.js
import spec from 'tape'
import { spy } from 'sinon'
import { promiseMe } from './foo'

spec('foo', ({ test, end : endSpec }) => {
  test('promiseMe', ({ ok, end : endTest }) => {
    // A Thennable is anything with a `then` method; this one invokes its callback
    const thennable = { then : spy(fn => fn()) }
    const doSomethingElse = spy(function () {
      ok(thennable.then.calledOnce, 'the object in the 1st position gets "thenned"')
      ok(thennable.then.calledWith(doSomethingElse), '`then` is called with the fn in the 2nd position')
      endTest()
    })
    promiseMe(thennable, doSomethingElse)
  })
  endSpec()
})

Now, I could have used a spied-on Promise in that test. But the experienced unit tester knows to write the simplest, fastest test that satisfies the requirements. A Thennable takes up fewer CPU cycles and less memory, and it more accurately reflects the requirements of the function than a Promise instance would.

May the “then” last forever!

Until next time in The Womack Dictionarius—keep on “thenning”.


Mass JSON Edits in a Monorepo

As mentioned in the previous post, I’ve migrated a couple key Abacus repos from Stash to GitHub. I’ve done this in preparation for open sourcing them, as well as to leverage greater tooling and infrastructure. I like Stash, but it cannot compete with the GitHub ecosystem.

One of the two repos I’ve migrated is the Abacus monorepo. For context, Abacus consists of 3 main repos:

  1. Abacus Editor App, a reference implementation and future basis for a hosted offering
  2. Abacus Viz Framework, a framework for laying out performant virtual DOM viz with academic principles
  3. Abacus, a monorepo containing many small modules that other projects, including the above, are made out of. Why a monorepo? To take advantage of centralized testing & tooling, as well as to see what Lerna is made out of

As part of migrating #3, I used The Silver Searcher to find all references to Stash. Most of them were in the Lerna packages folder at packages/abacus-*/package.json. I’m a fan of using the most precise tool you’re effective with, so I opted to use Trent Mick’s json instead of something like sed. Using zsh in concert with json, I was able to precisely edit those package files like so:

#!/usr/bin/env zsh
# pkedit: apply the json edit expression in $1 to every Lerna package manifest
for pkg in packages/abacus-*/package.json
do
  echo "Editing $pkg with $1"
  json -I -f "$pkg" -e "$1"
done
./pkedit 'this.repository.url="git@github.com:Netflix/abacus.git"'
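Afterwards, a quick hedged check that the sweep got everything (assuming all the old references contain the string “stash”):

# List any package manifests that still mention Stash; silence means success
ag -l -i stash packages/*/package.json || echo 'no Stash references remain'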

Hopefully that will help you out in your own massive JSON situations. Cheers y’all!