# Pagination

List endpoints return paginated results. This guide explains how to navigate through large datasets efficiently.

| Parameter  | Type    | Default | Description                |
| ---------- | ------- | ------- | -------------------------- |
| `page`     | integer | 1       | Page number (1-indexed)    |
| `pageSize` | integer | 100     | Items per page (max 1000)  |
```sh
curl -X GET "https://www.mando.fi/api/secure/plu?page=2&pageSize=50" \
  -H "Authorization: Bearer YOUR_API_KEY"
```

Pagination metadata is included in response headers:

| Header          | Description           |
| --------------- | --------------------- |
| `X-Total-Count` | Total number of items |
| `X-Page`        | Current page number   |
| `X-Page-Size`   | Items per page        |
| `X-Total-Pages` | Total number of pages |
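If you call the API directly with `fetch` rather than through an SDK, the metadata can be read straight off the response headers. A minimal sketch (the helper name and shape are illustrative, not part of the API):

```ts
// Sketch: parse pagination metadata out of a response's Headers.
interface PaginationMeta {
  totalCount: number;
  page: number;
  pageSize: number;
  totalPages: number;
}

function readPaginationMeta(headers: Headers): PaginationMeta {
  return {
    totalCount: Number(headers.get('X-Total-Count')),
    page: Number(headers.get('X-Page')),
    pageSize: Number(headers.get('X-Page-Size')),
    totalPages: Number(headers.get('X-Total-Pages')),
  };
}

// Example with a hand-built Headers object; in practice you would
// pass `response.headers` from a fetch() call.
const meta = readPaginationMeta(new Headers({
  'X-Total-Count': '250',
  'X-Page': '2',
  'X-Page-Size': '100',
  'X-Total-Pages': '3',
}));
```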
To fetch every page, loop until a page comes back with fewer items than `pageSize`:

```ts
async function getAllProducts(): Promise<Product[]> {
  const allProducts: Product[] = [];
  let page = 1;
  const pageSize = 100;

  while (true) {
    const response = await client.plu.list({ page, pageSize });
    allProducts.push(...response.data);

    // A short page means we've reached the end
    if (response.data.length < pageSize) {
      break;
    }
    page++;
  }

  return allProducts;
}
```
The same loop in Python:

```python
def get_all_products():
    all_products = []
    page = 1
    page_size = 100

    while True:
        response = client.plu.list(page=page, page_size=page_size)
        all_products.extend(response.data)

        # A short page means we've reached the end
        if len(response.data) < page_size:
            break
        page += 1

    return all_products
```

Some endpoints support cursor-based pagination for better performance with large datasets:

```sh
curl -X GET "https://www.mando.fi/api/secure/sales?cursor=abc123&limit=100" \
  -H "Authorization: Bearer YOUR_API_KEY"
```

The response includes a `next_cursor` field and a `has_more` flag:

```json
{
  "data": [...],
  "next_cursor": "def456",
  "has_more": true
}
```
To page through with cursors, keep requesting until `has_more` is false, passing each `next_cursor` into the next call:

```ts
async function getAllSales(): Promise<Sale[]> {
  const allSales: Sale[] = [];
  let cursor: string | undefined;

  while (true) {
    const response = await client.sales.list({ cursor, limit: 100 });
    allSales.push(...response.data);

    // The SDK exposes has_more / next_cursor as hasMore / nextCursor
    if (!response.hasMore) {
      break;
    }
    cursor = response.nextCursor;
  }

  return allSales;
}
```

Combine filters with pagination:

```ts
// Get active products, 50 per page
const products = await client.plu.list({
  page: 1,
  pageSize: 50,
  active: true,
  groupId: 'beverages'
});
```
1. **Use reasonable page sizes** - 100-500 items is typically optimal
2. **Request only needed fields** - Use the `fields` parameter if supported
3. **Cache results** - Avoid re-fetching unchanged data
4. **Use cursors for large sets** - More efficient than offset pagination
5. **Parallel requests** - Fetch multiple pages concurrently when possible
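Tip 2 depends on endpoint support. Assuming an endpoint accepts a comma-separated `fields` query parameter (check per endpoint), a request URL limited to IDs and names might be built like this; the helper is a sketch, not part of the SDK:

```ts
// Sketch of tip 2: only request the fields you need. The `fields`
// query parameter is assumed here - verify per-endpoint support.
function buildListUrl(
  base: string,
  opts: { page: number; pageSize: number; fields?: string[] }
): string {
  const params = new URLSearchParams({
    page: String(opts.page),
    pageSize: String(opts.pageSize),
  });
  if (opts.fields?.length) {
    // Comma-separated field list; URLSearchParams percent-encodes the commas
    params.set('fields', opts.fields.join(','));
  }
  return `${base}?${params.toString()}`;
}

const url = buildListUrl('https://www.mando.fi/api/secure/plu', {
  page: 1,
  pageSize: 100,
  fields: ['id', 'name'],
});
```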

When you know the total count, fetch pages in parallel:

```ts
async function getAllProductsFast(): Promise<Product[]> {
  // First request to get the total count
  const first = await client.plu.list({ page: 1, pageSize: 100 });
  const totalPages = Math.ceil(first.totalCount / 100);

  // Fetch the remaining pages in parallel
  const pagePromises = [];
  for (let page = 2; page <= totalPages; page++) {
    pagePromises.push(client.plu.list({ page, pageSize: 100 }));
  }
  const pages = await Promise.all(pagePromises);

  return [first.data, ...pages.map(p => p.data)].flat();
}
```