Managing a variable configuration for a Flux application with gulp

Once you're done with the development of your Flux application, it is time to deploy it to your staging and/or production environment. But what if the app relies on configuration values that need to be specific to these environments? We will see how to address such a concern with Gulp.

We are going to store our configuration in JSON files. Values will be defined as environment variables, which can be set when packaging your app with a continuous integration tool, and will be applied by a good ol' Bash script, falling back to default values for development. I assume all files, including gulpfile.js, are located at the project's root. We will begin by writing a sample configuration file, named (for example) config.json.dist:

{
    "myParam1": "this value is environment-agnostic",
    "myParam2": "?MY_PARAM_2?"
}

Next, we need to copy this file to config.json, which we will then require in our JavaScript code (with the help of Browserify). This is where gulp comes into play:

var gulp = require('gulp'),
    rename = require('gulp-rename');

gulp.task('config', function() {
    return gulp.src('config.json.dist')
        .pipe(rename('config.json'))
        .pipe(gulp.dest('.'));
});

Make sure to npm install gulp-rename --save-dev if necessary.
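For reference, once config.json has been generated, any module in your Browserify bundle can require it directly, just as node would; the consuming module below is purely illustrative:

'use strict';

// Browserify resolves JSON files natively, so the configuration is just a require away
var config = require('./config.json');

console.log(config.myParam1);
console.log(config.myParam2);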

We then want the ?MY_PARAM_2? placeholder to be swapped for the actual value. Let's set up the envify script that does that:

#!/bin/bash

set -e
cd "`dirname "$0"`"

declare -A DEFAULTS

# Set up default configuration values
DEFAULTS["MY_PARAM_2"]="this one isn't"

# Push them as environment variables if they don't already exist
for key in "${!DEFAULTS[@]}"; do
if [ -z "${!key}" ]; then
declare -x "$key"="${DEFAULTS[$key]}"
fi
done

# Edit configuration files accordingly
sed -i "s,?MY_PARAM_2?,$MY_PARAM_2,g" config.json

Have gulp run this script after copying the sample file, with the help of the gulp-shell plugin (npm install gulp-shell --save-dev):

var shell = require('gulp-shell');

gulp.task('envify', ['config'], function() {
    return gulp.src('')
        .pipe(shell(['chmod +x envify', './envify']));
});

Note: the ['config'] dependency makes gulp run the config task prior to this one. For gulp to actually wait for that task to finish, it has to return its stream (as it does above); without that, gulp has no way of knowing when the copy is done, which explains the race conditions I sometimes encountered that forced me to make the shell sleep before actually running the script.

After running gulp envify, you should see a config.json file with the following contents:

{
    "myParam1": "this value is environment-agnostic",
    "myParam2": "this one isn't"
}

If you do it again after running export MY_PARAM_2="I love bananas", the file will look like this instead:

{
    "myParam1": "this value is environment-agnostic",
    "myParam2": "I love bananas"
}

You can thus export the appropriate environment variables in your CI job, and that's how you get environment-specific configuration in a Flux application. Stay tuned!

I would like to seize this opportunity to say that all my affection goes to victims of terrorism, from Paris last friday to other places in the world where such atrocities happen frequently, such as Beirut. Remember freedom and love cannot be killed; stay strong together and support each other in the hope of brighter days.

Handle events outside of your React components with Flux

Flux is an application architecture proposed by Facebook for React, its own UI library. It is designed to help inject data from the server (or any other data source) into your React components, and the other way around.

React components should deal with themselves and nothing else, in keeping with the separation of concerns principle. When using such an architecture, one might wonder how to handle JavaScript events happening outside of a component's scope, but still affecting it; a good example is a dropdown menu that needs to be closed whenever the user clicks anywhere on the page outside of it. Flux is actually able to answer such a problem: let's figure out how!

Note: I will use a basic Flux implementation as described by Facebook.

First, let's define our Dropdown component:

'use strict';

var React = require('react'),
    classNames = require('classnames');

module.exports = React.createClass({
    getInitialState: function() {
        return {
            open: false
        };
    },

    toggle: function() {
        this.setState({
            open: !this.state.open
        });
    },

    render: function() {
        return (
            <div className={classNames({ open: this.state.open })} onClick={this.toggle}>
                {this.props.children}
            </div>
        );
    }
});

We need to catch clicks happening throughout our app (which we assume lives in a Root component) and let Dropdown know when they happen, so it can call its own toggle to close itself up. In order to do so, we will simply follow the Flux pattern, and start by defining a constant for the click event in a browserConstants module:

'use strict';

module.exports = {
    ACTION_CLICK: 'ACTION_CLICK'
};

We will also set up the related action, in a browserActions module:

'use strict';

var browserConstants = require('../constants/browserConstants'),
    dispatcher = require('../dispatcher');

module.exports = {
    appClick: function() {
        dispatcher.handleAction({
            actionType: browserConstants.ACTION_CLICK
        });
    }
};
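Note: the dispatcher module required above is assumed here to be a thin wrapper around Facebook's Dispatcher (from the flux npm package). A minimal sketch could look like this - the handleAction helper merely wraps the action in a payload, which is what the store will expect below:

'use strict';

var Dispatcher = require('flux').Dispatcher,
    dispatcher = new Dispatcher();

// Wrap every action in a payload so stores can switch on payload.action.actionType
dispatcher.handleAction = function(action) {
    this.dispatch({ action: action });
};

module.exports = dispatcher;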

It is now time to implement our browserStore:

'use strict';

var EventEmitter2 = require('eventemitter2').EventEmitter2,
    dispatcher = require('../dispatcher'),
    browserConstants = require('../constants/browserConstants'),
    store = {};

Object.assign(
    store,
    EventEmitter2.prototype,
    {
        emitClick: function() {
            this.emit('click');
        },

        addClickListener: function(callback) {
            this.addListener('click', callback);
        },

        removeClickListener: function(callback) {
            this.removeListener('click', callback);
        }
    }
);

dispatcher.register(function(payload) {
    switch (payload.action.actionType) {
        case browserConstants.ACTION_CLICK:
            store.emitClick();
            break;
    }

    return true;
});

module.exports = store;

We are now able to dispatch the click event by calling an action, which is the next thing we will do in our Root component:

'use strict';

var React = require('react'),
    browserActions = require('../actions/browserActions');

module.exports = React.createClass({
    render: function() {
        return (
            <div onClick={browserActions.appClick}>
                {/* your content here */}
            </div>
        );
    }
});

The last thing to do is to listen and react to this event in Dropdown:

'use strict';

var React = require('react'),
    browserStore = require('../stores/browserStore'),
    classNames = require('classnames');

module.exports = React.createClass({
    getInitialState: function() {
        return {
            open: false
        };
    },

    toggle: function() {
        this.setState({
            open: !this.state.open
        });
    },

    onAppClick: function() {
        if (this.state.open) {
            this.toggle();
        }
    },

    componentDidMount: function() {
        browserStore.addClickListener(this.onAppClick);
    },

    componentWillUnmount: function() {
        browserStore.removeClickListener(this.onAppClick);
    },

    render: function() {
        return (
            <div className={classNames({ open: this.state.open })} onClick={this.toggle}>
                {this.props.children}
            </div>
        );
    }
});
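As a quick illustration, any component rendered under Root can now embed the dropdown; the wrapping component and the link children below are purely hypothetical:

'use strict';

var React = require('react'),
    Dropdown = require('./Dropdown');

module.exports = React.createClass({
    render: function() {
        return (
            <Dropdown>
                <a href="#profile">Profile</a>
                <a href="#logout">Log out</a>
            </Dropdown>
        );
    }
});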

Now, clicking anywhere will close any open instance of this component! Messing around with the Flux pattern in such a way might prove useful in other contexts; feel free to leave a comment if you have anything to share on the subject.

Does your JavaScript code still need IIFEs and explicit strict mode?

At a time when modules are becoming a de facto architectural solution for any serious frontend JavaScript codebase, and ES6 is at our doorstep, one may wonder if a systematic use of IIFEs and explicit strict mode declaration (using 'use strict';) is still relevant. This short note is thus designed to answer these questions.

Note: if you aren't using modules yet (be it with CommonJS or the native ES6 syntax supported by Babel), you should really start! In either case, Browserify is the way to go. Other solutions exist, of course, but these two are respectively the current standard (at least in the node.js world) and the upcoming one, and they are the most broadly used.

Here we go:
  • A module's code does not need to be wrapped in an IIFE, as it is already in isolation anyway
  • 'use strict'; is still needed in CommonJS modules, nothing new here; in ES6 modules, by contrast, it is implied
  • 'use strict'; is needed in your code's entry point, which thus must also be wrapped in an IIFE to prevent concatenation side-effects (see the sketch below)
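To make that concrete, here is a minimal sketch - the file names are made up, but they show a CommonJS module relying on module scope alone for isolation, next to an entry point that keeps both the pragma and the IIFE:

// greeter.js - a CommonJS module: no IIFE needed, but the pragma stays
'use strict';

module.exports = function greet(name) {
    return 'Hello, ' + name + '!';
};

// app.js - the entry point: 'use strict' stays inside an IIFE so it cannot
// leak into other scripts if the resulting bundle gets concatenated with them
(function() {
    'use strict';

    var greet = require('./greeter');
    console.log(greet('world'));
})();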

I hope this helped clarify what you still have to do to keep your JS tidy as of today.

Render React components on the server with PHP

Earlier today, I stumbled upon a tweet promoting Facebook's very own React rendering library for PHP, based on the V8Js extension. It is already more than a year and a half old and surprisingly unpopular - although its poor architecture and lack of flexibility may be a good start in a quest for reasons.

It is indeed designed to spit out, separately, the markup rendered from your root component and the JavaScript code required to bootstrap the application clientside. Because it writes the latter itself, React has to be declared in the global scope; the major PITA, however, is that it needs to be fed React's source and your own separately. Both of these make it utterly incompatible with modern JS build tools and their use of module systems - in my case, native ES6 modules transpiled by Babelify.

I thus decided to bypass it and see how difficult it would be to mimic its behaviour in a much simpler way: by simply feeding V8Js my bundled, plain-old-ES5 code, with its own module-based bootstrap code - the exact same bundle the client gets, actually.

It turned out great! A quick hack on the entry point:

import React from 'react';
import MyComponent from './components/my';

if ('undefined' !== typeof document) {
    // In the browser, mount the application as usual
    React.render(<MyComponent />, document.body);
} else {
    // Under V8Js there is no DOM: print() sends the markup to PHP's output buffer
    print(React.renderToString(<MyComponent />));
}

And you're good to go:

$v8js = new V8Js();
ob_start();
$v8js->executeString(file_get_contents('/path/to/app.js'));
$content = ob_get_clean();

An echo($content); in the client page then displays your gorgeous React app, rendered on the server.
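For reference, the bundle handed to V8Js above is just the regular clientside build; a minimal sketch using Browserify's node API with the Babelify transform (the paths are made up) could be:

// build.js - produces the single ES5 bundle consumed both by browsers and by V8Js
var fs = require('fs'),
    browserify = require('browserify'),
    babelify = require('babelify');

browserify('./src/entry.js')
    .transform(babelify)
    .bundle()
    .pipe(fs.createWriteStream('./web/app.js'));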

Have fun!

Deploy to your own server through SSH with Travis CI

Do you have small projects on the continuous integration service Travis CI and want to deploy them to your own server through SSH, but cannot stand the hassle of setting up an advanced tool like Capistrano? You have come to the right place! In this post, I will show you how to deploy a Symfony project with some Gulp-powered asset management, but the following "technique" should suit any technology stack you want to use just fine.

We want our production server to handle as few roles as possible, apart from serving our PHP and static content to our end users, which goes without saying - we absolutely do not want it to compile assets, for example. Travis will therefore do everything for us, and we will simply package the result and send it to our server, where we will also trigger a simple shell script to take care of the final adjustments.

For the sake of simplicity, we will authenticate against our server with a plain old username/password pair; we would obviously be better off using an SSH key. We will also assume your project lives at /home/project/www on this server, and that Composer is installed there at /usr/local/bin/composer.

First things first, here is our initial .travis.yml file:

language: php
sudo: false

before_script:
- composer install --prefer-source
- npm install

script:
- phpunit
- gulp --production
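As an aside, the --production flag is not something gulp provides by itself; the gulpfile has to read it from the CLI arguments. A possible sketch (the task, paths and plugins are purely illustrative) using minimist and gulp-if:

var gulp = require('gulp'),
    gulpif = require('gulp-if'),
    uglify = require('gulp-uglify'),
    argv = require('minimist')(process.argv.slice(2));

gulp.task('scripts', function() {
    return gulp.src('assets/js/**/*.js')
        // Only minify when invoked as "gulp --production"
        .pipe(gulpif(argv.production, uglify()))
        .pipe(gulp.dest('web/js'));
});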

Let's now append an after_success section to our .travis.yml; it will remove development dependencies and package our build:

after_success:
- export SYMFONY_ENV=prod && composer install --no-dev -o
- mkdir build
- mv * build
- tar -czf package.tgz build

Note: to exclude any production-irrelevant file or folder from the package, use the --exclude option.

It is time to send this package to its destination through scp. But wait! This file will eventually get committed to our repository, and we most certainly do not want to expose our production server's credentials for the world to see... So, unless our repository is private (and even then, it would still be a bad idea), we want this information to be encrypted in some way. Travis offers just that in the form of a command-line utility:

sudo gem install travis
cd /path/to/our/repository
travis encrypt DEPLOY_HOST=123.45.67.89
travis encrypt DEPLOY_PATH=/home/project
travis encrypt DEPLOY_USER=randy
travis encrypt DEPLOY_PASS=marshM4LL0W

Those last four commands will output some YAML with secure as the key, and our encrypted variable as the value. Paste these in a new env section in your .travis.yml:

env:
  global:
    - secure: "..."
    - secure: "..."
    - secure: "..."
    - secure: "..."

We are now able to reference these variables in our build! To ease things a little, we will make use of sshpass, which allows us to authenticate through SSH with a password in a one-line, non-interactive way. Tell Travis to install it:

addons:
  apt:
    packages:
      - sshpass

We will now make use of the tool's -e option, which will read the password from the eponymous environment variable:

after_success:
- export SYMFONY_ENV=prod && composer install --no-dev -o
- mkdir build
- mv * build
- tar -czf package.tgz build
- export SSHPASS=$DEPLOY_PASS
- sshpass -e scp package.tgz $DEPLOY_USER@$DEPLOY_HOST:$DEPLOY_PATH

Note: you can append -o StrictHostKeyChecking=no to the scp call to bypass host key verification if necessary.

Now that Travis can copy our package to the server, we just need to set up our deployment script in /home/project/deploy.sh:

#!/bin/bash
cd "`dirname "$0"`"

# Extract the package
tar -xzf package.tgz
rm package.tgz

# Copy any file we want to keep from build to build
cp www/app/config/parameters.yml build/app/config/parameters.yml

# Swap it all around, keeping the previous version aside in case something goes wrong
rm -rf www_old
mv www www_old
mv build www

We also tell Travis to run it as its final step:

after_success:
- export SYMFONY_ENV=prod && composer install --no-dev -o
- mkdir build
- mv * build
- tar -czf package.tgz build
- export SSHPASS=$DEPLOY_PASS
- sshpass -e scp package.tgz $DEPLOY_USER@$DEPLOY_HOST:$DEPLOY_PATH
- sshpass -e ssh $DEPLOY_USER@$DEPLOY_HOST $DEPLOY_PATH/deploy.sh

And here we are! Every successful Travis build will now trigger a production deployment.

Note: it is strongly advised to restrict this to direct pushes on the master branch by prefixing all after_success steps with test $TRAVIS_PULL_REQUEST == "false" && test $TRAVIS_BRANCH == "master" && .