Chrono Flow is a Laravel project focused on high-performance backend data processing. It lets you upload a large (≈80 MB) CSV file simulating a work log of 1,000,000 entries. Once submitted, the app quickly parses the file and inserts all records into a MySQL database. An import this large would preferably be handled in a queue, but to showcase the raw performance it is processed in a single request.
Key Technologies used:
- Laravel 12
- MySQL
- Nginx
- Docker + Docker Compose
- Alpine.js & Tailwind CSS
- PestPHP
Important: you must have Docker and Docker Compose installed on your machine.
- Clone the repository:

```shell
git clone https://github.com/benjamimWalker/chrono-flow.git
```

- Go to the project folder:

```shell
cd chrono-flow
```

- Prepare the environment file:

```shell
cp .env.example .env
```

- Build and start the containers:

```shell
docker compose up -d
```

- Install Composer dependencies:

```shell
docker compose exec app composer install
```

- Run the migrations:

```shell
docker compose exec app php artisan migrate
```

- Install npm dependencies:

```shell
docker compose run --rm npm install
```

- Build the assets:

```shell
docker compose run --rm npm run build
```

- You can now run the tests:

```shell
docker compose exec app php artisan test
```
There is a Python script in the root folder that generates a CSV file with 1,000,000 entries. You can run it and upload the resulting file to the application.
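The exact columns the bundled script emits depend on the app's import schema, but a minimal generator along these lines (column names here are hypothetical, not necessarily what the real script uses) gives the idea:

```python
import csv
import random
from datetime import datetime, timedelta

# Sketch of a work-log CSV generator. The column names (user, started_at,
# finished_at) are assumptions for illustration; check the bundled script
# for the schema the app actually expects.
def generate_work_log(path: str, rows: int = 1_000_000) -> None:
    start = datetime(2024, 1, 1)
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["user", "started_at", "finished_at"])
        for i in range(rows):
            begin = start + timedelta(minutes=i)
            end = begin + timedelta(minutes=random.randint(10, 480))
            writer.writerow([f"user_{i % 1000}", begin.isoformat(), end.isoformat()])

# Small run for a quick check; bump rows to 1_000_000 for the full file.
generate_work_log("work_log.csv", rows=1000)
```

Increasing `rows` to 1,000,000 produces a file in the same size range as the one the app is designed to ingest.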
Navigate to the home page at http://localhost and upload the CSV.
After importing, the app redirects to a results page that lists all entries paginated.
The main features of the application are:
- High-speed processing of large CSV files using Laravel 12's new concurrency features.
- Bulk insertion of up to 1,000,000 records into MySQL in a few seconds.
- Full test coverage with PestPHP.
- Clean, maintainable Laravel 12 code with proper architecture.
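The bulk-insert idea behind the import (the real implementation is PHP/Laravel against MySQL) boils down to reading the CSV in chunks and issuing one multi-row insert per chunk instead of one statement per record. A language-neutral sketch, using Python's stdlib `sqlite3` in place of MySQL:

```python
import csv
import io
import sqlite3

# Conceptual sketch of chunked bulk insertion; the actual app does this in
# PHP against MySQL. Table and column names here are illustrative only.
def bulk_import(conn: sqlite3.Connection, csv_text: str, chunk_size: int = 500) -> int:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS entries (user TEXT, started_at TEXT, finished_at TEXT)"
    )
    reader = csv.reader(io.StringIO(csv_text))
    next(reader)  # skip the header row
    total = 0
    chunk = []
    for row in reader:
        chunk.append(row)
        if len(chunk) >= chunk_size:
            # One multi-row statement per chunk, not one per record.
            conn.executemany("INSERT INTO entries VALUES (?, ?, ?)", chunk)
            total += len(chunk)
            chunk = []
    if chunk:  # flush the final partial chunk
        conn.executemany("INSERT INTO entries VALUES (?, ?, ?)", chunk)
        total += len(chunk)
    conn.commit()
    return total

conn = sqlite3.connect(":memory:")
data = "user,started_at,finished_at\n" + "\n".join(
    f"user_{i},2024-01-01T00:00,2024-01-01T08:00" for i in range(1200)
)
inserted = bulk_import(conn, data)
print(inserted)  # 1200
```

Batching keeps per-statement overhead constant while the row count grows, which is what makes million-row imports feasible in seconds.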
Benjamim - benjamim.sousamelo@gmail.com - GitHub: @benjamimWalker