Modern web applications often manage millions of records — whether it's product catalogs in e-commerce, activity feeds in social apps, or log data in analytics dashboards. Displaying this data efficiently without slowing down your server or frustrating users is critical.
In Laravel MongoDB applications, traditional offset-based pagination (using skip() and limit()) works fine for small datasets but degrades dramatically as your collection grows. Cursor-based pagination (also called keyset or seek pagination) offers a scalable, high-performance alternative.
In this guide, you'll learn:
How offset pagination works and why it fails at scale
The mechanics and advantages of cursor pagination
Practical implementations with Laravel + MongoDB
When to choose each method
Advanced tips for infinite scroll, APIs, and performance optimization
Offset Pagination in Laravel MongoDB: Simple but Problematic at Scale
Offset pagination is the most familiar method. You specify a page number and items per page, then skip previous records and take the next batch.
Laravel's Eloquent (with jenssegers/laravel-mongodb) makes this easy:
use App\Models\Product;
use Illuminate\Http\Request;

public function index(Request $request)
{
    $perPage = (int) $request->input('per_page', 20);

    $products = Product::orderBy('created_at', 'desc')
        ->paginate($perPage); // Uses skip() + limit() internally

    return response()->json($products);
}

You can also do it manually:
$page = max((int) $request->get('page', 1), 1);
$perPage = 20;
$skip = ($page - 1) * $perPage;

$products = Product::orderBy('created_at', 'desc')
    ->skip($skip)
    ->take($perPage)
    ->get();

Why offset pagination breaks with large datasets:
MongoDB does not "jump" to the offset. It must scan and discard every skipped document. For page 500 with 20 items per page, it scans ~10,000 documents before returning results.
Query time grows linearly with the offset. Deep pages become painfully slow, even with proper indexes.
Additionally, getting the total count ($products->total()) requires counting every matching document on each request (or using estimatedDocumentCount() for a cheap approximation).
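The linear cost is easy to see with quick arithmetic. A plain-PHP sketch (no framework required; the function name is illustrative):

```php
<?php

// Documents MongoDB must walk for one offset-paginated page: every
// skipped document is examined and discarded before results return.
function docsExamined(int $page, int $perPage): int
{
    return ($page - 1) * $perPage + $perPage;
}

echo docsExamined(1, 20), "\n";   // 20
echo docsExamined(500, 20), "\n"; // 10000
```

Page 1 and page 500 return the same 20 items, but page 500 makes the server examine roughly 500 times as many documents.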
When offset pagination is still acceptable:
Collections under ~100,000 documents
Admin panels or internal tools with heavy filtering
Cases where users need to jump to arbitrary pages (e.g., "Go to page 42")
Cursor-Based Pagination: The Scalable Solution for Large Datasets
Cursor pagination uses a "pointer" (cursor) to the last record from the previous page instead of counting offsets. It queries for records that come after that pointer using indexed fields.
This approach delivers consistent performance no matter how deep into the dataset you go — ideal for millions of records.
Core idea:
First page: Fetch the first N records, ordered by a unique/indexed field (usually _id, or created_at + _id)
Next pages: Add a where condition like where('_id', '>', $lastCursor)
Always maintain consistent sorting
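The cursor itself is just the sort-key values of the last row, serialized into an opaque token. A minimal sketch of the encode/decode round trip in plain PHP (helper names are illustrative, not Laravel APIs):

```php
<?php

// Encode the compound sort key (created_at + _id) of the last row
// into an opaque, URL-safe cursor string.
function encodeCursor(array $key): string
{
    return base64_encode(json_encode($key));
}

// Decode a cursor back into the sort-key array; returns null for
// malformed or tampered input (strict base64 + JSON validation).
function decodeCursor(string $cursor): ?array
{
    $decoded = json_decode(base64_decode($cursor, true) ?: '', true);

    return is_array($decoded) ? $decoded : null;
}

$cursor = encodeCursor(['created_at' => '2024-05-01T12:00:00Z', '_id' => '6631a1b2c3']);
$key = decodeCursor($cursor);
// $key === ['created_at' => '2024-05-01T12:00:00Z', '_id' => '6631a1b2c3']
```

Because the token is opaque to clients, you can change its internal shape later without breaking your API contract.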
Laravel provides a built-in cursorPaginate() method that works well with MongoDB via the jenssegers package:
$products = Product::orderBy('_id')
    ->cursorPaginate(20);

For descending order (newest first) and better uniqueness:
$products = Product::orderBy('created_at', 'desc')
    ->orderBy('_id', 'desc') // Compound sort for uniqueness
    ->cursorPaginate(20);

API-friendly response with metadata:
public function index(Request $request)
{
    $perPage = (int) $request->input('per_page', 20);

    $products = Product::orderBy('created_at', 'desc')
        ->orderBy('_id', 'desc')
        ->cursorPaginate($perPage);

    return response()->json([
        'data' => $products->items(),
        'meta' => [
            'next_cursor' => $products->nextCursor()?->encode(),
            'prev_cursor' => $products->previousCursor()?->encode(),
            'has_more' => $products->hasMorePages(),
            'per_page' => $products->perPage(),
        ],
    ]);
}

Custom manual cursor implementation (useful for full control):
public function index(Request $request)
{
    $perPage = 20;
    $cursor = $request->input('cursor'); // base64-encoded JSON payload

    $query = Product::orderBy('created_at', 'desc')
        ->orderBy('_id', 'desc');

    if ($cursor) {
        // Decode the compound cursor back into an array before comparing
        $decoded = json_decode(base64_decode($cursor), true);

        // Group the conditions so they don't clash with other filters:
        // created_at < X  OR  (created_at = X AND _id < Y)
        $query->where(function ($q) use ($decoded) {
            $q->where('created_at', '<', $decoded['created_at'])
              ->orWhere(function ($q) use ($decoded) {
                  $q->where('created_at', '=', $decoded['created_at'])
                    ->where('_id', '<', $decoded['_id']);
              });
        });
    }

    // Fetch one extra record to detect whether more pages exist
    $results = $query->limit($perPage + 1)->get();
    $hasMore = $results->count() > $perPage;

    if ($hasMore) {
        $results->pop();
    }

    $nextCursor = $hasMore
        ? base64_encode(json_encode([
            'created_at' => $results->last()->created_at,
            '_id' => $results->last()->_id,
        ]))
        : null;

    return response()->json([
        'data' => $results,
        'next_cursor' => $nextCursor,
        'has_more' => $hasMore,
    ]);
}

Real-World Examples
E-commerce Product Catalog (Large inventory)
Infinite Scroll Activity Feed (Social-style)
Order History API (User-specific large logs)
For infinite scroll frontends (Vue, React, Livewire), combine cursorPaginate() with "Load More" buttons and pass the next_cursor in subsequent requests.
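The client-side loop is simple: keep requesting with the last next_cursor until has_more is false. A self-contained plain-PHP sketch simulating that loop against an in-memory dataset (fetchPage stands in for your paginated API endpoint and uses a single integer _id-style cursor for brevity):

```php
<?php

// Simulated endpoint: returns one page of ids after the given cursor,
// mirroring the "fetch perPage + 1 to detect has_more" pattern above.
function fetchPage(array $ids, ?int $cursor, int $perPage): array
{
    $remaining = $cursor === null
        ? $ids
        : array_values(array_filter($ids, fn ($id) => $id > $cursor));

    $page = array_slice($remaining, 0, $perPage);
    $hasMore = count($remaining) > $perPage;

    return [
        'data' => $page,
        'next_cursor' => $hasMore ? end($page) : null,
        'has_more' => $hasMore,
    ];
}

// "Load More" loop: accumulate pages until the API reports no more.
$ids = range(1, 7);
$all = [];
$cursor = null;

do {
    $resp = fetchPage($ids, $cursor, 3);
    $all = array_merge($all, $resp['data']);
    $cursor = $resp['next_cursor'];
} while ($resp['has_more']);
// $all === [1, 2, 3, 4, 5, 6, 7] — three requests: pages of 3, 3, 1
```

In a real frontend, each iteration is one HTTP request triggered by a "Load More" click or a scroll event, with next_cursor carried in the query string.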