How to Upload a Large CSV File to a Database in Laravel 12


Handling large CSV files in Laravel can be challenging. If you try to process everything at once, you may run into memory exhaustion or request timeouts 😵. But don't worry! Laravel has your back with chunking and queues.

In this article we will see how you can upload a large CSV file and import its data into your database effortlessly using Laravel's job queue mechanism in Laravel 12. I have created a simple form with an upload button to submit a CSV file, and a controller to process the uploaded file. Let's walk through how it works, step by step.
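Before diving into the Laravel setup, here is a minimal, framework-free PHP sketch of what chunked reading means in principle: the file is consumed a fixed number of rows at a time, so memory use stays flat no matter how big the file is. (`importInChunks` is a hypothetical helper for illustration only, not part of Laravel or any package.)

```php
<?php
// Plain-PHP illustration of chunked CSV reading.
// Only one chunk of rows is held in memory at any moment.
function importInChunks(string $path, int $chunkSize, callable $handle): int
{
    $fh = fopen($path, 'r');
    $chunk = [];
    $total = 0;

    while (($row = fgetcsv($fh)) !== false) {
        $chunk[] = $row;
        if (count($chunk) === $chunkSize) {
            $handle($chunk);       // e.g. one bulk insert per chunk
            $total += $chunkSize;
            $chunk = [];
        }
    }

    if ($chunk !== []) {           // flush the final partial chunk
        $handle($chunk);
        $total += count($chunk);
    }

    fclose($fh);

    return $total;
}
```

In the Laravel setup below, the Maatwebsite Excel package does this for us: `chunkSize()` plays the role of `$chunkSize`, and each chunk becomes its own queued job instead of being handled inline.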

📦 What We'll Use:

  • Laravel 12 🧱
  • Jobs and Queue System 🧵
  • Maatwebsite Laravel Excel package 📊

🔧 Step 1: Set up your routes

I have created two routes, as shown below: one to display the upload form and another to process the uploaded file.

Route::get('/import-users', [UserController::class, 'importIndex'])->name('users.form');
Route::post('/import-users', [UserController::class, 'importUsers'])->name('users.import');


📁 Step 2: Set up the upload form (ImportUser.blade.php)

Create a view file according to your needs, from which you can upload your CSV file. You can skip this step if you are testing via Postman or another API tool.

<!-- resources/views/ImportUser.blade.php -->
<!DOCTYPE html>
<html>
<head>
    <title>Import Users from CSV</title>
</head>
<body>
    <h2>Upload CSV File</h2>
    @if(session('success'))
        <div style="color: green;">{{ session('success') }}</div>
    @endif
    @if($errors->any())
        <div style="color: red;">
            <ul>
                @foreach($errors->all() as $error)
                    <li>{{ $error }}</li>
                @endforeach
            </ul>
        </div>
    @endif

    <form action="{{ route('users.import') }}" method="POST" enctype="multipart/form-data">
        @csrf
        <label>Select CSV File:</label><br>
        <input type="file" name="file" accept=".csv,.xlsx,.xls" required><br><br>
        <button type="submit">Upload</button>
    </form>
</body>
</html>

🧠Step 3: The Controller Logic (UserController.php)

Here is the controller logic that validates the upload, stores the file, and kicks off the import.

<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;
use Maatwebsite\Excel\Facades\Excel;
use App\Imports\UsersImport;

class UserController extends Controller
{
    public function importIndex()
    {
        return view('ImportUser');
    }
    public function importUsers(Request $request)
    {
        $request->validate([
            'file' => 'required|mimes:xlsx,xls,csv|max:51200' // 50MB
        ]);

        $file = $request->file('file');
        $fileName = time().'.'.$file->getClientOriginalExtension();
        $file->move(public_path('uploads'), $fileName);
        $fullPath = public_path('uploads/'.$fileName);
        
        Excel::import(new UsersImport, $fullPath);

        return back()->with('success', 'Chunked jobs dispatched!');
    }
}

🧩 Step 4: Importing with Chunks and Queues (UsersImport.php)

This is the main file where our import logic lives. Because the class implements ShouldQueue, Excel::import() automatically dispatches one queued job per chunk. Since we are using the Maatwebsite package, you can run the command below to scaffold this file; it will be created in the app/Imports directory.
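The import relies on the Laravel Excel package by Maatwebsite. If it is not in your project yet, install it via Composer first:

```shell
composer require maatwebsite/excel
```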

php artisan make:import UsersImport --model=User

<?php

namespace App\Imports;

use App\Models\User;
use Maatwebsite\Excel\Row;
use Illuminate\Support\Facades\Log;
use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\OnEachRow;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithStartRow;

class UsersImport implements OnEachRow, WithChunkReading, WithStartRow, ShouldQueue
{
    /**
     * Process a single row of the file.
     */
    public function onRow(Row $row)
    {
        $row = $row->toArray();

        try {
            // Match on email only: bcrypt() is salted, so matching on
            // the hashed password would never find an existing record.
            User::updateOrCreate(
                ['email' => $row[1]],
                [
                    'name'     => $row[0],
                    'password' => bcrypt($row[2]),
                ]
            );
        } catch (\Exception $e) {
            Log::error('Row import failed', ['row' => $row, 'error' => $e->getMessage()]);
        }
    }

    // Skip the header row of the CSV.
    public function startRow(): int
    {
        return 2;
    }

    public function chunkSize(): int
    {
        return 100; // Adjust based on available memory
    }
}

🧰Step 5: Queue Configuration

Make sure the queue is already configured in your project; otherwise, use the following steps to set it up.

Set the queue connection to database in your .env file (this is already the default in a fresh Laravel 12 install):

QUEUE_CONNECTION=database

Run the migration for the jobs table. A fresh Laravel 12 install already ships with this migration; if your project does not have it, generate it first:

php artisan queue:table
php artisan migrate

Start the queue worker with proper flags 🏃‍♂️:

php artisan queue:work --timeout=600 --tries=3

--timeout=600 → Give long-running jobs more time ⏳

--tries=3 → Retry failed jobs 3 times before marking as failed ❗
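If a chunk job still fails after all three tries, it lands in the failed_jobs table. These standard Artisan commands (run inside your Laravel app) let you inspect, retry, or discard failed jobs:

```shell
php artisan queue:failed      # list failed jobs
php artisan queue:retry all   # push all failed jobs back onto the queue
php artisan queue:flush       # delete all failed jobs
```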

🎉 Conclusion

Now you can upload and process large CSV files in chunks without breaking your app or your server 💪. Laravel’s queue and chunk reading support makes it super smooth.

If you're building a system that regularly imports large user data files — this pattern will save you a ton of headaches! 🔥

Thank you for reading this article 😊

For any query do not hesitate to comment 💬

