
The power of SQLite

This blog post was published more than one year ago and might be outdated!
Daniel Ruf · 3 min read

In a few React.js projects, we have huge JSON files, which provide the filterable data in the frontend. To keep the frontend fast, we decided to create a small standalone API.

Normally we have very small JSON files that we load via AJAX and parse in JavaScript. Once such files become very large and complex, we can no longer load them this way, because doing so would drastically increase the time needed to load and parse them in the frontend.

Our requirements were to create a small and efficient API endpoint on the server, where we can use PHP. A complete MySQL or MariaDB database setup was not ideal, since the whole setup should work in local development environments without any further steps. So our only good option was a standalone PHP script for the API and SQLite as a portable database. Since we only read but never write any data in production, this was the best solution, and we avoid any concurrency issues.

The API consists of a single PHP file with a few lines of code, which retrieve, sort, and filter the data from the static SQLite database file. The results are then formatted with json_encode().
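A minimal sketch of what such a single-file API can look like. The table layout and the category filter below are illustrative assumptions, not the actual project schema, and an in-memory database seeded with sample rows stands in for the static .sqlite3 file (in the real script you would open that file read-only):

```php
<?php
// Illustrative single-file API: filter and sort in SQL, return JSON.
// Schema, sample data, and the "category" parameter are assumptions.
$db = new SQLite3(':memory:');
$db->exec('CREATE TABLE entries (id INTEGER PRIMARY KEY, name TEXT, category TEXT)');
$db->exec("INSERT INTO entries (name, category) VALUES ('Bravo', 'b'), ('Alpha', 'a')");

// Optional filter from the query string.
$category = $_GET['category'] ?? null;

// Let SQLite do the filtering and sorting natively.
$sql = 'SELECT id, name, category FROM entries';
if ($category !== null) {
    $sql .= ' WHERE category = :category';
}
$sql .= ' ORDER BY name';

$stmt = $db->prepare($sql);
if ($category !== null) {
    $stmt->bindValue(':category', $category, SQLITE3_TEXT);
}

$result = $stmt->execute();
$rows = [];
while ($row = $result->fetchArray(SQLITE3_ASSOC)) {
    $rows[] = $row;
}

header('Content-Type: application/json');
echo json_encode($rows);
```

Using a prepared statement with a bound value keeps the endpoint safe from SQL injection even though the database is read-only.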

With the help of the SQLite CLI we created the database file from a CSV file in a few simple steps:

# create new empty database
rm -f data_new.sqlite3
sqlite3 data_new.sqlite3 "VACUUM;"

# start sqlite cli
sqlite3 data_new.sqlite3

# one command per line, press enter after each one:
# create tables from dumped schema
# skip this step on the first import; afterwards run: sqlite3 data_new.sqlite3 .schema > schema.sql to export the schema
.read schema.sql
# switch to csv mode
.mode csv
# create entries table from csv data
.import entries.csv entries
.quit
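The interactive session above can also be scripted: sqlite3 accepts dot-commands as arguments and runs them in order. A sketch with a sample schema and CSV standing in for the real files (note the CSV has no header row, since .import into an existing table loads every row as data):

```shell
# Sample stand-ins for the real schema.sql and entries.csv
echo 'CREATE TABLE entries (id INTEGER, name TEXT);' > schema.sql
printf '1,Alpha\n2,Bravo\n' > entries.csv

# Non-interactive rebuild of the database file
rm -f data_new.sqlite3
sqlite3 data_new.sqlite3 ".read schema.sql" ".mode csv" ".import entries.csv entries"

# Verify the import
sqlite3 data_new.sqlite3 "SELECT COUNT(*) FROM entries;"
```

This makes the rebuild easy to put into a Makefile or CI job whenever the CSV changes.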

Thanks to PHP's bundled SQLite extension, we can now work with plain SQL commands, probably the most efficient way to work with the data, to get the right data in JSON format. This means native sorting and filtering without any significant performance impact.
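Such queries are easy to try out on the CLI before wiring them into PHP. A reasonably recent sqlite3 (3.33+) can even emit the result set as JSON via the -json flag, similar to what json_encode() produces; the table and data here are illustrative:

```shell
# Build a small throwaway database
rm -f demo.sqlite3
sqlite3 demo.sqlite3 "CREATE TABLE entries (id INTEGER, name TEXT);
  INSERT INTO entries VALUES (1, 'Bravo'), (2, 'Alpha');"

# -json prints the sorted, filtered result set as a JSON array
sqlite3 -json demo.sqlite3 "SELECT * FROM entries ORDER BY name;"
```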

In summary, this all worked quite well, and we found the perfect tools (SQLite and SQL) for the task. The result was fast filtering based on frontend inputs and a performant frontend. We also avoided an overly complex reimplementation of comparable features in pure JavaScript.

As a bonus, SQLite provides powerful JSON functions, which come in handy in different projects. I have written a small blog post about this.
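A small taste of those JSON functions, run as illustrative one-liners against an in-memory database:

```shell
# Extract a field from a JSON document
sqlite3 :memory: "SELECT json_extract('{\"name\":\"SQLite\",\"tags\":[\"db\",\"sql\"]}', '$.name');"

# Inspect a JSON array
sqlite3 :memory: "SELECT json_array_length('[1,2,3]');"
```

This lets you store semi-structured JSON columns and still filter on individual fields with plain SQL.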

Depending on your requirements, a custom PHP script like ours might be sufficient. For larger projects with more requirements, there are solutions like php-crud-api, which comes with many helpful features.