Optimizing Large Data Processing in Laravel with LazyCollection

Sandeeppant
2 min read · 3 days ago

Managing large datasets in Laravel can be tricky, particularly when it comes to memory usage. One effective approach is Laravel’s cursor() method, which retrieves records one at a time instead of loading the entire dataset into memory.

Understanding LazyCollection

LazyCollection helps handle large datasets efficiently by loading and processing data in small chunks instead of all at once. This prevents memory overload, making it great for large files or big database queries.

Here’s an example of reading a CSV file line by line:

use Illuminate\Support\LazyCollection;

LazyCollection::make(function () {
    $handle = fopen('data.csv', 'r');
    while (($row = fgets($handle)) !== false) {
        yield str_getcsv($row);
    }
    fclose($handle); // release the file handle once all rows are consumed
})->each(function ($row) {
    // Process each parsed CSV row
});
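Under the hood, LazyCollection wraps a plain PHP generator, so the same on-demand behavior can be seen without Laravel at all. A minimal sketch (the `csvRows` function and `demo.csv` file are illustrative, not from the example above): because rows are yielded one at a time, stopping early means the rest of the file is never even parsed.

```php
<?php
// A generator yields CSV rows on demand instead of reading the whole file.
function csvRows(string $path): Generator {
    $handle = fopen($path, 'r');
    try {
        while (($row = fgetcsv($handle)) !== false) {
            yield $row;
        }
    } finally {
        fclose($handle); // runs even if the consumer stops early
    }
}

// Only the lines actually consumed are read and parsed.
file_put_contents('demo.csv', "a,1\nb,2\nc,3\n");
$first = null;
foreach (csvRows('demo.csv') as $row) {
    $first = $row;
    break; // stop after one row; the remaining lines are never parsed
}
```

The `finally` block is what makes early termination safe: when the loop breaks, PHP destroys the generator and the file handle is still closed.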

Practical Application

Consider processing a large transaction log file to generate reports. Using LazyCollection, you can read the file line by line, transform each log entry, filter for completed transactions, and insert records in manageable chunks.

use App\Models\TransactionLog;
use Illuminate\Support\LazyCollection;

class TransactionProcessor
{
    public function processLogs(string $filename)
    {
        return LazyCollection::make(function () use ($filename) {
            $handle = fopen($filename, 'r');
            while (($line = fgets($handle)) !== false) {
                $log = json_decode($line, true);
                if ($log !== null) { // skip malformed lines
                    yield $log;
                }
            }
            fclose($handle);
        })
            ->map(function ($log) {
                return [
                    'transaction_id' => $log['id'],
                    'amount' => $log['amount'],
                    'status' => $log['status'],
                    'processed_at' => $log['timestamp'],
                ];
            })
            ->filter(function ($log) {
                return $log['status'] === 'completed';
            })
            ->chunk(500)
            ->each(function ($chunk) {
                TransactionLog::insert($chunk->all());
            });
    }
}
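The chunk(500) step is what keeps the final insert efficient: rather than one INSERT per row, each database call writes a batch of up to 500 rows. The batching itself is simple to sketch in bare PHP, without Laravel or a database (the `chunks` helper below is illustrative, not a Laravel API):

```php
<?php
// Group a lazy stream of items into fixed-size batches, yielding each
// batch as soon as it fills up — mirroring what chunk(500) does before
// each insert() call.
function chunks(iterable $items, int $size): Generator {
    $batch = [];
    foreach ($items as $item) {
        $batch[] = $item;
        if (count($batch) === $size) {
            yield $batch;
            $batch = [];
        }
    }
    if ($batch !== []) {
        yield $batch; // flush the final partial batch
    }
}

// 1050 rows become three batches: 500, 500, and 50.
$batches = iterator_to_array(chunks(range(1, 1050), 500));
```

Note that at no point does more than one batch sit in memory, which is the whole point of combining a lazy stream with chunked writes.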

Database Operations with cursor()

For database operations, Laravel’s cursor() method can be used to create lazy collections, ensuring efficient memory usage even when processing millions of records.

use App\Models\Transaction;
use Illuminate\Support\Facades\DB;

class ReportController extends Controller
{
    public function generateReport()
    {
        DB::transaction(function () {
            Transaction::cursor()
                ->filter(function ($transaction) {
                    return $transaction->amount > 1000;
                })
                ->each(function ($transaction) {
                    // Process each large transaction
                    $this->processHighValueTransaction($transaction);
                });
        });
    }
}
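What cursor() does underneath is stream rows from a single query one at a time instead of materializing the whole result set. A sketch of the same idea using bare PDO with an in-memory SQLite table (the table name and the 1000 threshold mirror the example above, but this is an illustration, not Laravel internals). It also shows the usually cheaper alternative of filtering in SQL with a WHERE clause, which in Laravel would look like Transaction::where('amount', '>', 1000)->cursor():

```php
<?php
// In-memory SQLite stand-in for the transactions table.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE transactions (id INTEGER PRIMARY KEY, amount REAL)');
$insert = $pdo->prepare('INSERT INTO transactions (amount) VALUES (?)');
foreach ([250, 1500, 3200, 900] as $amount) {
    $insert->execute([$amount]);
}

// Filtering in SQL keeps low-value rows from ever reaching PHP,
// rather than fetching everything and filtering in application code.
$stmt = $pdo->query('SELECT id, amount FROM transactions WHERE amount > 1000');
$highValue = [];
foreach ($stmt as $row) { // rows are fetched one at a time as iterated
    $highValue[] = (float) $row['amount'];
}
```

Pushing the filter into the query means the database does the work once, instead of every row crossing into PHP only to be discarded by ->filter().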

Implementing LazyCollection in Laravel applications can significantly improve performance and memory usage when dealing with large datasets.

For more in-depth information, check out Laravel’s official documentation.

Thank you for reading!

Before you go, don’t forget to clap, share, and follow me 😇😇😇
