Recursive function binding in a JavaScript object

Despite the rise of ES6 and especially the sacrosanct fat arrow syntax, JavaScript developers might still find themselves in need of Function.prototype.bind in some edge cases; binding all functions in an object to the same this value, regardless of nesting depth, is one of them.

The example I have in mind is the building of a Vue.js plugin such as this one:

function install(Vue) {
  Vue.mixin({
    beforeCreate() {
      this.$myPlugin = {
        aMethod() {
          console.log(this);
        },

        yet: {
          anotherMethod() {
            console.log(this);
          }
        }
      };
    }
  });
}

export default { install };

Fairly logically, setting it up through Vue.use(MyPlugin) and calling this.$myPlugin.aMethod and this.$myPlugin.yet.anotherMethod logs the objects the methods are attached to (the plugin object and its nested yet object, respectively), rather than the Vue instance, which might not be what you want depending on what the plugin is supposed to do.

In such a situation, we therefore need to write a recursive binding function, which luckily is not all that hard:

function recursiveBind(obj, that) {
  return Object.keys(obj).reduce((acc, key) => {
    // Beware: typeof null === "object", so guard against it before recursing
    if (obj[key] !== null) {
      switch (typeof obj[key]) {
        case "object":
          return { ...acc, [key]: recursiveBind(obj[key], that) }; // recurse

        case "function":
          return { ...acc, [key]: obj[key].bind(that) }; // bind
      }
    }

    return { ...acc, [key]: obj[key] }; // leave untouched
  }, {});
}
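Here is a quick sanity check of the behaviour, with a plain object standing in for the Vue instance (someContext is a made-up stand-in, and the function is repeated so the snippet runs on its own):

```javascript
// recursiveBind repeated from above so this snippet is self-contained
function recursiveBind(obj, that) {
  return Object.keys(obj).reduce((acc, key) => {
    switch (typeof obj[key]) {
      case "object":
        return { ...acc, [key]: recursiveBind(obj[key], that) }; // recurse
      case "function":
        return { ...acc, [key]: obj[key].bind(that) }; // bind
    }
    return { ...acc, [key]: obj[key] }; // leave untouched
  }, {});
}

// Hypothetical stand-in for a Vue instance
const someContext = { name: "fake-vue-instance" };

const plugin = recursiveBind({
  aMethod() { return this; },
  yet: { anotherMethod() { return this; } }
}, someContext);

console.log(plugin.aMethod() === someContext);           // true
console.log(plugin.yet.anotherMethod() === someContext); // true
```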

All that remains to do then is to wrap our plugin object declaration with it:

function install(Vue) {
  Vue.mixin({
    beforeCreate() {
      this.$myPlugin = recursiveBind({
        aMethod() {
          console.log(this);
        },

        yet: {
          anotherMethod() {
            console.log(this);
          }
        }
      }, this);
    }
  });
}

And this now correctly refers to our Vue instance! Please share your plugin tips in the comment section, if any!

In-depth object prop validation in Vue.js

Vue.js offers a simple yet effective prop validation API:

export default {
  props: {
    config: {
      type: Object,
      required: true
    }
  }
};

For any given prop, you can specify the expected type, and whether the prop is required or has a default value. Doing so is actually best practice, just like defining data as a function in your components.

Regarding Object props, however, there is no built-in mechanism allowing one to validate the object's keys or structure; the closest thing we have is the ability to pass in a custom validation function, so let's build upon that!

The simplest implementation could be along the lines of the following:

const configKeys = ["some", "keys", "to", "check"];

export default {
  props: {
    config: {
      type: Object,
      required: true,
      validator: config => configKeys.every(key => key in config)
    }
  }
};
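Since the validator is plain JavaScript, it is easy to sanity-check outside of Vue:

```javascript
const configKeys = ["some", "keys", "to", "check"];
const validator = config => configKeys.every(key => key in config);

console.log(validator({ some: 1, keys: 2, to: 3, check: 4 })); // true
console.log(validator({ some: 1, keys: 2 }));                  // false
```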

That pretty much does it! A more thorough version could support deep key checking with a dotted notation, using this package for example.
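For illustration, a hand-rolled version of such a deep check could look like the following sketch (hasDeepKey is a hypothetical helper name, not that package's API):

```javascript
// Hypothetical helper: checks that a dotted path such as "a.b.c" exists in an object
function hasDeepKey(obj, path) {
  let current = obj;
  for (const segment of path.split(".")) {
    if (current === null || typeof current !== "object" || !(segment in current)) {
      return false;
    }
    current = current[segment];
  }
  return true;
}

console.log(hasDeepKey({ a: { b: { c: 1 } } }, "a.b.c")); // true
console.log(hasDeepKey({ a: { b: {} } }, "a.b.c"));       // false
```

A validator could then read: config => paths.every(path => hasDeepKey(config, path)).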

But let's go further! If you find yourself using this pattern on a regular basis, why would you not write a mixin - actually, a mixin factory - to make it easier?

/**
 * @param {String} propName
 * @param {String[]} keys
 *
 * @return {Object}
 */
function getObjectPropMixin(propName, keys) {
  // A mixin must be a component-options object, hence the props wrapper
  return {
    props: {
      [propName]: {
        type: Object,
        required: true,
        validator: value => keys.every(key => key in value)
      }
    }
  };
}

export default {
  mixins: [getObjectPropMixin("config", ["some", "keys", "to", "check"])]
};

You could even use it multiple times in the same component if you have more than one Object prop to check; or better yet:

/**
 * @param {Object} propDefs
 *
 * @return {Object}
 */
function getObjectPropsMixin(propDefs) {
  return {
    props: Object.keys(propDefs).reduce((props, key) => {
      return {
        ...props,
        [key]: {
          type: Object,
          required: true,
          validator: value => propDefs[key].every(propKey => propKey in value)
        }
      };
    }, {})
  };
}

export default {
  mixins: [getObjectPropsMixin({
    config: ["some", "keys", "to", "check"],
    anotherObject: ["and", "here", "are", "more", "dummy", "values"]
  })]
};

Using such helpers gives us the confidence to reference keys of Object props directly, without defensively checking for their presence in our code. If you have other ways to deal with this kind of thing, please let me know!

CLI scripts for ES6-powered packages with ESM

If you are developing your own front-end JavaScript packages, chances are you use ES6 syntax in your codebase, especially the long-awaited module system. Browser compatibility is pretty much painless thanks to Babel, but things can get harder if you want your package to expose a CLI script (to generate code or whatever) that is meant to be run through Node.js directly.

ESM to the rescue! Add esm to your project's package.json as a dependency (or peerDependency if you want to leave its installation up to your package's consumer), and your users will then be able to run your script with node -r esm node_modules/path/to/script.js.

The only detail to pay attention to is to account for the difference between ES6's module system and CommonJS's:

const someES6Module = require("./path/to/module").default;   // explicitly require default export if that is what you need
const namedExport = require("./path/to/module").namedExport; // named exports are to be referenced this way

// Or just use destructuring:
const { default: someES6Module, namedExport } = require("./path/to/module");
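To save consumers from retyping the -r flag, you can also suggest they wrap the command in an npm script; a minimal sketch, in which my-package and bin/generate.js are hypothetical names:

```json
{
  "scripts": {
    "generate": "node -r esm node_modules/my-package/bin/generate.js"
  }
}
```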

Making a Vue.js application usable from the outside: the widget pattern

Hello there! It's been a while since I posted anything, but today I'd like to take a moment to tell you about a simple yet quite efficient pattern we've been using at my current job regarding Vue.js applications.

As with any front-end framework, Vue.js isn't really opinionated regarding how you're supposed to inject configuration into your SPA, assuming this configuration comes from your back-end service. A common pattern in this era of API-centric services is to dedicate a JSON endpoint to configuration that your JavaScript code will fetch immediately upon boot. This allows us to keep all back-end considerations aside, but has several disadvantages as well:

  • limited flexibility if the network or your back-end itself fail (the more you can provide for an offline and/or degraded experience, the better)
  • somewhat extraneous network round-trip for something you could have provided locally to begin with
  • your JS code still needs to know which URL to request, which is already configuration; how do you provide that?

For all these reasons, we have begun to experiment with a little something we like to call the widget pattern. Nothing crazy here: we essentially wrap the new Vue declaration inside a build function, which is then called from public/index.html:

<body>
  <div id="app"></div>

  <script>
    document.addEventListener("DOMContentLoaded", function() {
      window.App.build("#app", { /* ... */ });
    });
  </script>
</body>

We then use its second argument in src/main.js to inject configuration into our app; these values can vary per environment if needed, for example by substituting placeholders with environment variables from Docker:

import Vue from "vue";
import App from "./components/App";

export default App; // more on that later

export function build(el, config) {
  new Vue({
    el,
    render: h => h(App, { props: { /* ... */ } })
  });
}

All we need to do to make this work is to expose our root module in vue.config.js:

module.exports = {
  configureWebpack: {
    output: {
      library: "App",
      libraryTarget: "umd"
    }
  }
};

This gives us the flexibility we need, and forces us to reflect on our app's root component's input, as its props are used to receive the aforementioned configuration values. We have thus also made a habit of exporting this root component as the default export, which means we can use our app on its own as before, or as a "widget" (hence the pattern's name) in a larger Vue.js codebase or a legacy application.

If you're using something similar or have something to say about this way of doing things, use the comment section below!

Handle peer dependencies with NPM 3

The latest major version of Node.js's package manager introduces a breaking change regarding the peerDependencies section of the package.json file. With NPM 2, these dependencies were automatically installed in the root node_modules folder; NPM 3 changes this default behaviour, and peerDependencies are no longer fetched by npm install. Instead, the tool will simply warn you if these dependencies are unmet. But what if you relied on this feature and have to support both versions?

Looking for an answer to this question, I stumbled upon this article that proposes a simple solution: duplicate peerDependencies as dependencies, which will result in installing the right stuff at the right place no matter the NPM version in use. This works, but has a few drawbacks, the most important in my eyes being the obvious code configuration duplication, which often rhymes with maintenance pain.

I therefore put up a Bash workaround, that will play nicely with Jenkins jobs and other install scripts:

# Install dependencies once, just to fetch packages with peerDependencies
npm install

# Check for current NPM version
NPM_MAJOR_VERSION=`npm -v | cut -d \. -f 1`

if [ "$NPM_MAJOR_VERSION" == "3" ]
then
  # Turn peerDependencies into dependencies for relevant packages
  sed -i 's/peerDependencies/dependencies/g' ./node_modules/package-with-peer-deps/package.json

  # Install again, for real this time
  npm install
fi

# Do stuff...

Kinda dirty, but it works with no need to think about it. Feel free to discuss this solution and propose something better in the comments!

Note: using npm shrinkwrap also seems to work around this once dependencies have been correctly installed at least once, which could cancel the need to perform the above trick on a regular basis.