Parsing 18 billion JSON lines with Go

At my employer Tjek, we recently decided to rebuild our event pipeline and move it to Google BigQuery, both to reduce complexity in the stack and to retire some services that were no longer maintainable.

BigQuery offers a solid set of tools for querying and visualizing our data, which would let several of our internal teams work directly with the data and share it with customers without going through the engineering department.

The new service that would replace the old HTTP service was easy to write, but then came the question of moving the historic data into BigQuery.

To migrate the old data, we had to backfill our entire dataset, accumulated over the last 10 years, into BigQuery. This is the story of how it was done.

https://itnext.io/parsing-18-billion-lines-json-with-go-738be6ee5ed2