
JSON via SQLite (from CSV)

2 min read
Daniel Ruf
Full Stack Developer

Sometimes we get data from customers as Excel files. These are then converted to JSON files so that they can be processed further. In the past this was often a cumbersome task with many manual steps. With SQLite we made it much easier and reduced the number of steps needed.

The first step was to export the needed data as a CSV file. In our case, the columns and rows were swapped in the original data and we had to switch them back; this process is called transposing. After a short search we found gocsv and used it to transpose the data in the CSV file.

./gocsv transpose data.csv > data-transposed.csv
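To illustrate what transposing does, here is what the original export would look like for the example data used further below, with the records as columns:

number,1,2
Category,standard,premium
Attribute,yes,yes
Attribute 2,Now,yes

The gocsv call above turns each record into a row and the field names into the header line, producing the data-transposed.csv shown at the end of this post.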

Now the CSV file was ready to be converted to JSON. To prepare for this, we imported the file into a temporary SQLite database. This can be done by switching to CSV mode via .mode csv and then importing the CSV data into a table called entries via .import data-transposed.csv entries.
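Interactively, the import looks like this (temp.sqlite3 is just a throwaway file name; with the two-row example file shown further below, the count check returns 2):

sqlite3 temp.sqlite3
sqlite> .mode csv
sqlite> .import data-transposed.csv entries
sqlite> select count(*) from entries;
2
sqlite> .exit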

Then we processed the data and converted it to the needed JSON structure with a custom SQL script, csv-to-json.sql, using SQLite's JSON functions.

select '"' || `number` || '": ' || 
json_object('attributes',
json_object(
'customAttribute', iif(`Attribute` == 'yes', json('true'), json('false') ),
'customAttribute2', iif(`Attribute 2` == 'yes', json('true'), json('false') ),
),
'category', `Category`,
'name', `number`,
'link', '#'
) || ','
from entries;

Newer SQLite versions have the JSON functions built in (the json1 extension is compiled in by default since version 3.38.0, and iif() is available since 3.32.0), so the script should work without any problems.
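If you are not sure whether your sqlite3 binary supports this, a quick throwaway one-liner is enough to check:

sqlite3 :memory: "select json_object('ok', iif(1 == 1, json('true'), json('false')));"

It should print {"ok":true} if both the JSON functions and iif() are available.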

Our final script csv-to-json.sh looked like this:

#!/usr/bin/env bash

# remove any leftover database file from a previous run
rm -f -- db.sqlite3
sqlite3 db.sqlite3 ".mode csv" \
  ".import data-transposed.csv entries" \
  ".mode list" \
  ".once data.json" \
  ".read csv-to-json.sql" \
  ".exit"
# the database was only needed temporarily, so delete it again
rm -f -- db.sqlite3
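To run it, make the script executable and call it from the directory that contains data-transposed.csv and csv-to-json.sql:

chmod +x csv-to-json.sh
./csv-to-json.sh

The result is written to data.json.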

An alternative solution is to skip the database file entirely: when sqlite3 is started without a file name, it uses a temporary database that is discarded when the program exits, so there is nothing to clean up:

#!/usr/bin/env bash
sqlite3 -cmd ".mode csv" \
-cmd ".import data-transposed.csv entries" \
-cmd ".mode list" \
-cmd ".once data.json" \
-cmd ".read csv-to-json.sql" \
-cmd ".exit 1"

You can test it with the following example data-transposed.csv file:

number,Category,Attribute,Attribute 2
1,standard,yes,Now
2,premium,yes,yes

The generated data.json file will look like this:

"1": {"attributes":{"customAttribute":true,"customAttribute2":false},"category":"standard","name":"1","link":"#"},
"2": {"attributes":{"customAttribute":true,"customAttribute2":true},"category":"premium","name":"2","link":"#"},

After this, we removed the last trailing comma and wrapped everything in {...}. Whenever the source data changed, we regenerated the JSON file with the same steps.
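As a side note, this post-processing could also be avoided entirely: SQLite's json_group_object() aggregate can build the complete, valid JSON object in one go. A sketch of such a variant of csv-to-json.sql (not what we used, but it produces the same structure without the trailing comma and the manual wrapping):

select json_group_object(
  `number`,
  json_object(
    'attributes', json_object(
      'customAttribute', iif(`Attribute` == 'yes', json('true'), json('false')),
      'customAttribute2', iif(`Attribute 2` == 'yes', json('true'), json('false'))
    ),
    'category', `Category`,
    'name', `number`,
    'link', '#'
  )
)
from entries;

This returns a single row containing one valid JSON object for the whole table.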