Share JavaScript code between frontend apps in a monorepo through a local NPM package

If you have multiple frontend apps in a monorepo and want to share code between them to avoid duplication, you are probably better off doing it locally rather than going the whole "set up a private registry, publish a package on it and install it everywhere" route. Fortunately, NPM makes it pretty simple to use a filesystem-based package:

{
  // ...
  "dependencies": {
    // ...
    "@neemzy/common": "file:../common",
    // ...
  },
  // ...
}
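For context, this assumes a layout along these lines (directory names are illustrative):

```
monorepo/
├── common/          # the shared package, e.g. "@neemzy/common"
│   └── ...
└── my-app/          # a host app depending on "file:../common"
    ├── node_modules/
    └── package.json
```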

This simple setup is enough... if the module(s) you share do not import anything, which will not get you very far. Let's see how to handle dependencies within the common package!

Your first instinct might be to just install these dependencies through the package's own package.json file, but this might not actually be your best option:

- unless you are very thorough with version constraints, you will probably end up with different versions of these dependencies between common and the host app, which might cause trouble down the road
- this requires more unnecessary maintenance regarding common itself
- if you use React and there is code in common which imports react, your bundle will end up including two copies of it, which will not work at runtime (even if both are the exact same version)
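For the record, if you do declare dependencies inside common anyway, the usual mitigation for the duplicate-React problem is to list such singletons as peerDependencies, leaving the host app responsible for providing them. A hypothetical sketch (the version range is illustrative):

```json
{
  "name": "@neemzy/common",
  "peerDependencies": {
    "react": "^16.8.0"
  }
}
```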

It is far more convenient to just let the host app supply the dependencies, which makes sense if we consider that modules in common most likely originated in one of your apps and were moved there once they were needed in another. Since we will not be specifying common's dependencies directly, we need to tell our build tool to explicitly look for them in the app's node_modules directory when bundling our code. Here is how to do it with Webpack:

const path = require("path");

module.exports = {
  // ...
  resolve: {
    modules: [
      // ...
      "node_modules", // generic "resolve in relative node_modules directory"
      path.resolve(__dirname, "node_modules"), // explicitly check the app's node_modules as a backup
    ],
    // ...
  },
  // ...
};

And with Browserify (for e.g. Gulp):

const path = require("path");

// ...

const b = browserify({
  // ...
  paths: [
    path.resolve(__dirname, "node_modules"),
  ],
  // ...
});

// ...

Note that if you used create-react-app to bootstrap your app, this step is not necessary: the default Webpack configuration seems to have this taken care of.

This should cover it for runtime! Now, if the host app has unit tests for code imported from common, the same logic applies: we have to let our test runner (this example uses Jest) know that it needs to look for dependencies in the local node_modules:

const path = require("path");

module.exports = {
  // ...
  moduleDirectories: [
    "node_modules",
    path.join(__dirname, "node_modules"),
  ],

  // You might need this in order to transpile the package's code with Babel before Jest interprets it:
  transformIgnorePatterns: ["node_modules/(?!@neemzy/common)"],

  // ...
};
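The transformIgnorePatterns entry works through a negative lookahead: everything under node_modules is excluded from transformation, except the shared package. A quick sanity check of the regex itself (file paths are made up):

```javascript
// Jest leaves files matching transformIgnorePatterns untransformed;
// the (?!...) negative lookahead punches a hole for our shared package
const pattern = /node_modules\/(?!@neemzy\/common)/;

console.log(pattern.test("node_modules/lodash/index.js")); // true: left alone
console.log(pattern.test("node_modules/@neemzy/common/index.js")); // false: transformed by Babel
```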

Finally, we also need to handle this the other way around: if common has unit tests for modules which import stuff from third-party dependencies, these will also be needed in order for the tests to run; we will therefore tell Jest to fetch dependencies from neighbour apps:

const path = require("path");

module.exports = {
  // ...
  moduleDirectories: [
    path.join(__dirname, "../my-app/node_modules"),
    // add more of your apps here if needed
  ],
  // ...
};

With all of this, everything should be working as intended, and copied and pasted files now belong in the past!

Using a route config with React Router: the right way

When nesting routes, recent versions of React Router encourage you to declare child routes directly in the component they will render into. Although this is a perfectly valid way of doing things, it gets a bit more complicated if you are defining all your routes in a separate configuration file to begin with. The documentation includes an example of such a config, but it does not cover some pretty essential use cases, such as redirects, or routes using a render function rather than a static component declaration. Let's dive in and see how to handle those!

First, let's picture a basic app looking something like the following:

import React from "react";
import { Switch, Route } from "react-router-dom";
import Home from "./components/Home";
import Hello from "./components/Hello";
import World from "./components/World";

const routes = {
  home: {
    path: "/",
    exact: true,
    component: Home
  },

  hello: {
    path: "/hello",
    component: Hello
  },

  world: {
    path: "/hello/world",
    component: World
  }
};

export default function App() {
  return (
    <div className="app">
      <h1>Hello there!</h1>
      <Switch>
        {Object.entries(routes).map(([key, route]) => <Route key={key} {...route} />)}
      </Switch>
    </div>
  );
}

Now, imagine we want our World component to actually be rendered within the Hello component, which would better reflect the URL it is accessed at and allow it to share part of its newfound parent's markup, but we want to keep all routes declarations in our config, like so:

// ...

const routes = {
  home: {
    path: "/",
    exact: true,
    component: Home
  },

  hello: {
    path: "/hello",
    component: Hello,

    routes: {
      world: {
        path: "/hello/world",
        component: World
      }
    }
  }
};

// ...

To make this work, we need the routes object to be passed down to our Hello component, which we can achieve by replacing the standard Route component with a wrapper of our own:

import React from "react";
import { Route } from "react-router-dom";

export default function RecursiveRoute(route) {
  return (
    <Route
      path={route.path}
      exact={!!route.exact}
      render={props => (
        <route.component {...props} routes={route.routes} />
      )}
    />
  );
}

Then, after making use of this in our rendering logic:

import React from "react";
import { Switch } from "react-router-dom";
import RecursiveRoute from "./components/RecursiveRoute";
// ...

export default function App() {
  return (
    <div className="app">
      <h1>Hello there!</h1>
      <Switch>
        {Object.entries(routes).map(([key, route]) => <RecursiveRoute key={key} {...route} />)}
      </Switch>
    </div>
  );
}

We can do the same in Hello.js:

import React from "react";
import { Switch, Route } from "react-router-dom";
import RecursiveRoute from "./RecursiveRoute";

export default function Hello({ routes }) {
  return (
    <div>
      <div className="sidebar">{/* ... */}</div>
      <Switch>
        {Object.entries(routes).map(([key, route]) => <RecursiveRoute key={key} {...route} />)}
        <Route>I only appear if no child route matched</Route>
      </Switch>
    </div>
  );
}

Routes can now be nested on an infinite number of levels in our config!

So far, we're on par with the example from the React Router docs; but what if instead of showing default content when none of our child routes match, we wanted to redirect to one of them?

// ...

const routes = {
  home: {
    path: "/",
    exact: true,
    component: Home
  },

  hello: {
    path: "/hello",
    component: Hello,
    redirect: "/hello/world",

    routes: {
      world: {
        path: "/hello/world",
        component: World
      }
    }
  }
};

// ...

Let's move that Object.entries(routes).map call to a separate function and make it better:

import { Redirect } from "react-router-dom";
import RecursiveRoute from "./components/RecursiveRoute";

export default function renderRoutes(routes) {
  return Object.entries(routes).reduce((acc, [key, route]) => {
    if (route.redirect) {
      acc.push(<Redirect key={`${key}_redirect`} exact from={route.path} to={route.redirect} />);
    }

    return acc.concat(<RecursiveRoute key={key} {...route} />);
  }, []);
}
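With that helper in place, the rendering logic in App (and Hello) boils down to a single call. A sketch, assuming the helper lives next to the config:

```javascript
import React from "react";
import { Switch } from "react-router-dom";
import renderRoutes from "./renderRoutes"; // wherever you chose to put the helper
// ...

export default function App() {
  return (
    <div className="app">
      <h1>Hello there!</h1>
      <Switch>{renderRoutes(routes)}</Switch>
    </div>
  );
}
```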

Last but not least, if we need to support routes relying on render rather than component, all we have to do is to enhance RecursiveRoute a bit:

import React from "react";
import { Route } from "react-router-dom";

export default function RecursiveRoute(route) {
  return (
    <Route
      path={route.path}
      exact={!!route.exact}
      render={route.component
        // Render given component with extra "routes" prop
        ? props => (
          <route.component {...props} routes={route.routes} />
        )

        // Run given render function and inject extra "routes" prop
        : props => {
          const rendered = route.render(props);

          return {
            ...rendered,
            props: {
              ...rendered.props,
              routes: route.routes
            }
          };
        }
      }
    />
  );
}

This allows us to keep our routing logic fully contained in our config.

Automatically switching Node.js version upon cd with n

In a previous post, I was exploring the possibility of reading a Node.js version constraint from the package.json file of every directory upon entering it, and switching Node.js versions accordingly, relying on nvm's default version if a specific one was not required.

The script I hacked together does the job in most cases, but suffers from a few limitations:

  • it only looks for package.json in the (new) current directory, rather than the closest one in the tree
  • it is pretty bad at dealing with version ranges, as it just parses the first version number it can find in them, which turns out to be their lower end

I therefore dug a bit further and found out that n, nvm's main competitor, can perfectly handle those two aspects natively! Just run n auto anywhere and it will look for the closest version constraint in the tree and interpret it correctly, reading from package.json but also from other files - even .nvmrc!

Running it systematically has one major downside, though: n actually reinstalls Node.js upon every version switch, and even if it uses its local cache for versions it has already downloaded, it turns out to be pretty slow; and unfortunately, there does not seem to be an option to cancel the switch if the current version already matches the expected one. Well, in that case, why not write something of our own to do exactly that?

In order to properly handle ranges, we need a list of all available Node.js versions; fortunately, this is pretty easy to get by running n ls-remote --all, the result of which we will store in a file that then gets passed as a parameter to our script.

const fs = require("fs");
const { exec } = require("child_process");
const semver = require("semver");
const readPkgUp = require("read-pkg-up");

// Build array of available versions from input file
const versions = fs.readFileSync(process.argv.pop(), { encoding: "utf8" }).split("\n");

// Read closest package.json file
readPkgUp().then(file => {
  try {
    // Determine highest version satisfying its version constraint
    const target = semver.maxSatisfying(versions, file.packageJson.engines.node);

    // Check current node version and switch if necessary
    exec("node -v", (error, stdout) => {
      if (semver.clean(stdout) !== target) {
        console.log(`Switching to node v${target}`);
        exec(`n -p ${target}`);
      }
    });
  } catch (e) {} // fail silently
});
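If you are wondering what semver.maxSatisfying does here: it returns the highest version in the list that matches the constraint. A dependency-free, conceptual sketch of the idea, handling only bare "^MAJOR" ranges (the real semver package covers the full range syntax, which is why the script uses it):

```javascript
// Conceptual stand-in for semver.maxSatisfying, limited to "^MAJOR" ranges
function maxSatisfyingCaret(versions, range) {
  const major = parseInt(range.replace("^", ""), 10);

  return versions
    .map(v => v.split(".").map(Number))
    .filter(([maj]) => maj === major)
    .sort((a, b) => a[0] - b[0] || a[1] - b[1] || a[2] - b[2])
    .map(parts => parts.join("."))
    .pop() || null;
}

console.log(maxSatisfyingCaret(["10.24.1", "12.22.0", "12.9.1", "14.0.0"], "^12")); // → "12.22.0"
```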

Reusing our zsh bit from last time:

autoload -U add-zsh-hook
n ls-remote --all > path/to/.node-versions # update version list

switch-node-version() {
  node path/to/runn path/to/.node-versions
}

add-zsh-hook chpwd switch-node-version
switch-node-version

And there you have it! The script is available if you are interested.

Automatically switching Node.js version upon cd with nvm

As I was browsing search results for that exact sentence, I stumbled upon this post which proposes a zsh script automating the switching of Node.js versions when entering a directory using nvm, based on the presence of a .nvmrc file.

It works pretty well, but where I work we already specify Node version constraints in our package.json files (under the engines key) and weren't keen on duplicating that information, since said constraints mostly consist of a strict version anyway. I therefore adapted the original script to read from there instead:

autoload -U add-zsh-hook

switch-node-version() {
  if [[ -f package.json ]] && grep -q '"node"' package.json; then
    nvm use `cat package.json | sed -nE 's/"node": "[^0-9]*([0-9\.]*)[^"]*"/\1/p'`
  elif [[ $(nvm version) != $(nvm version default) ]]; then
    nvm use default
  fi
}

add-zsh-hook chpwd switch-node-version
switch-node-version

When using a range (e.g. ^11.9, >=10 <14, etc.), special characters are dropped (which would result in nvm use 10 with the latter example). This brings one caveat, which is that the script doesn't play nice with overly specific ranges, such as ^10.0.0: it would indeed try to nvm use 10.0.0, and that particular version might not be available on one's machine. In such cases, I would advise shortening the range to ^10 (which better conveys the idea anyway in my opinion).
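To see what the sed extraction actually yields, here is a quick demonstration against a throwaway file (the engines value is just an example):

```shell
# Hypothetical package.json fragment mimicking an "engines" entry
printf '%s\n' '{' '  "engines": {' '    "node": ">=10 <14"' '  }' '}' > /tmp/pkg-demo.json

# Same sed command as in the hook; tr strips the indentation sed leaves in place
version=$(sed -nE 's/"node": "[^0-9]*([0-9\.]*)[^"]*"/\1/p' /tmp/pkg-demo.json | tr -d '[:space:]')
echo "$version" # → 10
```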

Forcing an iframe to rerender with React

You may one day find yourself unlucky enough to work on a React application embedding an iframe. If that happens, chances are you will want to force it to refresh based on some change in the overlying component, that is to say have it rerender. But how would you do it?

An easy and not too shabby way is to abuse the key attribute, which serves as an identifier to tell a React component whether a given child element needs to be rerendered, and is typically used when building a list within a loop. How convenient! And even more so if you hide away all those shenanigans in a custom hook:

import { useState, useEffect } from "react";

export default function useChecksum(value) {
  const [checksum, setChecksum] = useState(0);

  useEffect(() => {
    setChecksum(checksum => checksum + 1);
  }, [value]);

  return checksum;
}

The "checksum" here is but a dumb counter, which does a perfect job at yielding unique values over time - you can go with something more convoluted if you so desire. Anyhow, here is our hook in action:

import React from "react";
import useChecksum from "../hooks/useChecksum";

export default function FrameContainer({ someProp }) {
  const checksum = useChecksum(someProp);

  return (
    <>
      <iframe key={checksum} src="..." title="I'm an outdated piece of technology!" />
    </>
  );
}

The iframe will thus be refreshed when someProp changes. If you have any better way to deal with this, please let me know!