Minimal Fullstack CRUD example

For a long time I have wanted to extend my fullstack knowledge; I think that even for a front-end developer it is beneficial to have backend knowledge.

Minimal Application

To learn more about the full stack, I first created a minimal React application that connects to a minimal REST API.

Slogans app (the React part)

The minimal application stores slogans to save the world. Since this is mostly a front-end blog, I will assume the React part is clear. This blog post will focus on the back-end part of this small application. Start the application with the following steps:

  1. clone the repo:
  2. run the command npm install && npm run start
  3. check the application at http://localhost:8055/
  4. the API should run at http://localhost:8044/api/slogans

Application structure

You can see the main structure of the application below; many files are left out, and only the most important files are listed in the overview.

  • server
    • database.js: initializes the sqlite db
    • server.js: rest endpoints created with express.js
  • site
    • services.js: implements the correct calls to the API with values from the front-end
    • App.js: the main React file, which uses services.js


The database.js file connects to the sqlite db; if the db is not yet available, it creates it with the name set in the code. The sqlite db file will be saved in the project root.

const sqlite3 = require('sqlite3').verbose();

const DBSOURCE = 'db.sqlite';

let db = new sqlite3.Database(DBSOURCE, (err) => {
    if (err) {
        console.error(err.message);
        throw err;
    }
});

If the db and the table are not yet created, SQL creates them with a simple statement, since the table has only one field besides the primary key. Once created, 2 items will be inserted.

    CREATE TABLE slogan (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        slogan TEXT
    )


The server.js file can be seen as the back-end of the application, or as the API. It is entirely based on the express.js framework. The first part initializes the express app and sets the needed response headers.

const express = require('express');
const bodyParser = require('body-parser');
const app = express();

// body parser
app.use(bodyParser.json());

// set headers
app.use(function(req, res, next) {
    res.header("Access-Control-Allow-Origin", "*"); // wildcard, only for localhost
    res.header("Access-Control-Allow-Methods", "GET, PUT, POST, DELETE");
    res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
    next();
});

For production use these headers will not be sufficient; they need to be stricter.
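As a rough sketch of what stricter production headers could look like, the middleware below only echoes back origins on an explicit whitelist instead of using the wildcard. The origin list is an assumption for illustration, not our real config:

```javascript
// Sketch of stricter CORS handling for production; the whitelist
// content is a made-up example.
const ALLOWED_ORIGINS = ['https://www.example.com'];

function strictCors(req, res, next) {
    const origin = req.headers.origin;
    // only echo back origins we explicitly trust, never the wildcard
    if (ALLOWED_ORIGINS.includes(origin)) {
        res.header('Access-Control-Allow-Origin', origin);
        res.header('Vary', 'Origin');
    }
    res.header('Access-Control-Allow-Methods', 'GET, PUT, POST, DELETE');
    res.header('Access-Control-Allow-Headers', 'Content-Type, Accept');
    next();
}
```

Untrusted origins simply get no Access-Control-Allow-Origin header, so the browser blocks the cross-origin response.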

For the rest, server.js has all the endpoints for the API. Each endpoint is an express.js function (app.get(), app.put()…) which executes an SQL statement against the database and returns an error or a success JSON response with optional data attached. See the method for fetching all slogans (http://localhost:8044/api/slogans):

// get all slogans
app.get("/api/slogans", (req, res, next) => {
    const sql = "select * from slogan";
    let params = [];
    db.all(sql, params, (err, rows) => {
        if (err) {
            res.status(400).json({ error: err.message });
            return;
        }
        res.json({ message: "success", data: rows });
    });
});
This, in a nutshell, is all there is to the server-side part of this application. Up next is the fetching in the front-end part of the application.


services.js is an abstraction of the calls to the API. For fine-grained control I created a helper method around the battle-tested, though awfully named, XMLHttpRequest. This helper method returns a Promise and makes it easy to apply the HTTP request headers in one place.

// ajax request helper
function get(url, type = 'GET', data) {
    return new Promise((resolve, reject) => {
        let req = new XMLHttpRequest();
        req.open(type, url, true);
        req.setRequestHeader('Accept', 'application/json');
        req.setRequestHeader("Content-type", "application/json");
        req.onload = () => {
            if (req.status == 200) {
                resolve(req.response);
            } else {
                reject(Error(req.statusText));
            }
        };
        req.onerror = () => {
            reject(Error('Network Error'));
        };
        (type === 'POST' || type === 'PUT') ? req.send(JSON.stringify(data)) : req.send();
    });
}

The methods using this helper can now be made very simple.

const services = {
    getSlogans: () => {
        return get(`${apiDomain}api/slogans`);
    },
    addSlogan: (slogan) => {
        return get(`${apiDomain}api/slogan`, 'POST', {slogan: slogan});
    },
};

These methods can now be used by any React/JavaScript component which imports this service file, for example in the main application file App.js.


App.js is the main React file, which most front-end developers will be familiar with. In this file I took some development shortcuts: getSlogans is called again after every update, delete, etc. This keeps the application really simple to reason about, with a single source of truth: perform a mutating CRUD operation, then refresh the state, and thus the whole view, from the database. For production use (with a large dataset) this could result in a badly performing application.

// the state to store the result from the API
const [slogans, setSlogans] = useState([]);

const getSlogans = () => {
    services.getSlogans()
        .then((response) => {
            let responseObject = JSON.parse(response);
            setSlogans(responseObject.data);
        }, (error) => {
            console.log(error);
        });
};

const addSlogan = (sloganText) => {
    services.addSlogan(sloganText)
        .then((response) => {
            getSlogans(); // refresh the whole list after the mutation
        }, (error) => {
            console.log(error);
        });
};
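The refresh-after-mutation pattern can be sketched framework-free; the service object here is a stand-in for services.js, and the setState callback stands in for React's state setter:

```javascript
// Refresh-after-mutation: every write is followed by a full re-fetch,
// so the view state always mirrors the database (single source of truth).
async function mutateAndRefresh(service, setState, mutation) {
    await mutation();                      // e.g. add, update or delete a slogan
    const response = await service.getSlogans();
    setState(JSON.parse(response).data);   // rebuild the state from the API
}
```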

Wrap up

Personally I think it is valuable for every front-end developer to create a small end-to-end application. It will definitely help with communicating with and understanding other developers, and with becoming a more T-shaped front-end developer. I myself have learned a lot and hope to find time to connect the same application to an external db, for example MySQL or PostgreSQL. Happy developing 👍

Improving performance

These days it’s very important for a site to have decent performance: for visitors, as well as to reduce server costs and to please Google 😉. Improving performance becomes a hard task when a website has a lot of machinery and complex parts from the past. In this article I will explain the low-hanging fruit we could harvest. We also moved to Akamai to make use of a CDN, since we serve the larger part of Europe.

Keep in mind that this process is based on our main site but it could be useful for other sites as well.

JS, try to decrease the JavaScript loaded

The main issue with lots of JavaScript is that it needs to be parsed, which is a problem for mid- and lower-end smartphones.
We took the following steps:

  • Gaining insight
  • Removing unneeded JavaScript
  • Discussing with marketing which marketing scripts are loaded by Google Tag Manager
  • Creating a split in transpiled bundles, one for legacy browsers and one for modern browsers

First we needed some insights; luckily there is a convenient package for that, `webpack-bundle-analyzer`. Once installed, I created an npm task to run webpack and analyze the results right away.

"webpack-analyze": "webpack --env.NODE_ENV=develop --profile --json > module/Eurocampings/analysis/stats.json && webpack-bundle-analyzer module/Eurocampings/analysis/stats.json"

This task runs webpack with some statistics flags, writing the output to JSON. After that, webpack-bundle-analyzer does the crunching and presents us with a nice Mondriaanesque webpage like the image below.

The first obvious thing for us was to remove the moment locales, which we did in webpack with the IgnorePlugin:

const ignorePlugin = new webpack.IgnorePlugin({
    resourceRegExp: /^\.\/locale$/,
    contextRegExp: /moment$/
});

This removed all the locales from the bundle; the several locales we still use are imported directly in the modules. For us this alone saved a few hundred kB.
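To make concrete what the plugin does with these two regexes, here is a simplified sketch of its matching rule: a module request is dropped when the request string matches resourceRegExp and the directory of the requesting module matches contextRegExp. The paths in the test cases are hypothetical:

```javascript
// Simplified sketch of IgnorePlugin's decision, using the same
// regexes as our webpack config above.
const resourceRegExp = /^\.\/locale$/;
const contextRegExp = /moment$/;

function isIgnored(resource, context) {
    // resource: the requested module path, e.g. './locale'
    // context: the directory of the module doing the require
    return resourceRegExp.test(resource) && contextRegExp.test(context);
}
```

So `require('./locale')` from inside moment is ignored, while explicit imports like `moment/locale/nl` still work, because `./locale/nl` does not match `^\.\/locale$`.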

For Google Tag Manager we first tested what the performance increase would be; it turned out that the increase was significant. Since we cannot market our site properly without GTM, we need to discuss with the business which things can be done within GTM to improve the performance of the site.

For the separate bundling of the JavaScript we need to decide which browser versions we still want to support. The gain in this process will be faster parsing for modern browsers. We need to calculate the business value before we take this route.
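One common way to serve such a split is the module/nomodule pattern, shown below with hypothetical bundle names; modern browsers load only the module bundle, while legacy browsers ignore `type="module"` and fall back to the transpiled one:

```html
<!-- modern browsers load the small, untranspiled bundle -->
<script type="module" src="bundle.modern.js"></script>
<!-- legacy browsers ignore type="module" and load this one instead -->
<script nomodule src="bundle.legacy.js"></script>
```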

CSS, remove unneeded things

  • Remove unneeded CSS
  • Check for mixins which generate lots of extra CSS
  • Remove vendor prefixes and, if needed, use PostCSS and Autoprefixer

Critical rendering, inline CSS

The principle of inline CSS is actually simple: use an npm package which uses a headless browser to scan a local or live HTML page, restricted by a width and a height. The class names of the elements inside that box are used to extract the styles to be inlined in the page, which improves critical rendering.

For our situation, with many dynamic pages, we needed an automated solution. We also posed the question: should we generate one generic inline CSS blob, or should each page have its own inline CSS section? The latter is more accurate but requires more processing time, since we would need to scan every possible page.

We went for the following solution.

  1. Scan the most important pages
  2. Combine the extracted CSS into one chunk of inline CSS
  3. Insert the inline CSS on all the pages
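Step 2 can be sketched with a naive combiner that concatenates the per-page extracts while skipping rules it has already seen. Real critical-CSS tools use a proper CSS parser; this string-based version only illustrates the idea:

```javascript
// Naive sketch: merge per-page critical CSS chunks into one blob,
// deduplicating identical rules. Not a real CSS parser.
function combineCritical(cssChunks) {
    const seen = new Set();
    const out = [];
    for (const chunk of cssChunks) {
        for (const rule of chunk.split('}')) {
            const trimmed = rule.trim();
            if (trimmed && !seen.has(trimmed)) {
                seen.add(trimmed);
                out.push(trimmed + '}');
            }
        }
    }
    return out.join('\n');
}
```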

The tool we use gives us a more low-level critical CSS module, which helps us fetch the styles from several pages and combine them into one critical CSS file.

Other assets

For the other assets we certainly had something to optimize as well. Fonts were high on the list and easy to fix.

The font preloading was actually very simple. We made the decision to only preload the woff2 fonts and keep the woff format as fallback for IE11, which we still support, though luckily its usage is slowly decreasing. We use an asset helper which reads a manifest.json generated by webpack. Since we do not require the fonts in JavaScript, we needed to explicitly bundle the font files like this:

const fontStagMedium = './module/Eurocampings/assets/fonts/stag-medium-webfont.woff2';
const fontStagLight = './module/Eurocampings/assets/fonts/stag-light-webfont.woff2';

entry: {
    stagMedium: path.resolve(__dirname, fontStagMedium),
    stagLight: path.resolve(__dirname, fontStagLight),
},

Now we could add preload links for the fonts to the head section:

<link rel="preload" href="<?php echo $this->assetPacker('stag-medium-webfont.woff2'); ?>" as="font" type="font/woff2" crossorigin>
<link rel="preload" href="<?php echo $this->assetPacker('stag-light-webfont.woff2'); ?>" as="font" type="font/woff2" crossorigin>
Without font preloading (fonts load via the stylesheet)
With font preloading (fonts load in parallel with the CSS)
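Our asset helper is PHP, but the idea translates directly: it resolves a logical asset name to the hashed filename from webpack's manifest.json. Here is a JavaScript sketch with a made-up manifest for illustration:

```javascript
// Sketch of an asset helper: map a logical asset name to the hashed
// filename from webpack's manifest.json (manifest content is made up).
const manifest = {
    'stag-medium-webfont.woff2': '/assets/stag-medium-webfont.3f2a1b.woff2',
    'stag-light-webfont.woff2': '/assets/stag-light-webfont.9c8d2e.woff2',
};

function assetPacker(name) {
    // fall back to the plain name when the asset is not in the manifest
    return manifest[name] || name;
}
```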

Wrap up

The business

We experienced that it is necessary to inform the business that, with a large and complicated site, there are no silver bullets. You have to improve the site (Lighthouse) point by point, and be transparent about the technical cost of each step. Sometimes you can combine the performance work with some refactoring, which helps to sell the step.

Things left to improve

One of the hardest parts we currently deal with is that our site has been around for many years and has a lot of stakeholders. The result is that we have a lot… a lot of HTML, JS and CSS. We have pointed this out to the business many times: the site needs focus, and afterwards we can strip things out. It turns out this is not easy. We decided to improve our user measurements and notify the business of parts that are never used and need to be stripped out. And of course we need to implement some service worker functionality to help with client-side caching, especially for mobile users.

Current improvements

We still have work to do to improve the scores even further, although our current measurements with Lighthouse and PageSpeed already show an improvement of at least 200%. We are still performing mediocre on mobile; luckily we know which steps we need to take.