Pagination
List endpoints in the Consignly API return paginated results to efficiently handle large datasets. This guide explains how to work with paginated responses.
Pagination Parameters
All list endpoints accept the following query parameters:
| Parameter | Type | Default | Max | Description |
|---|---|---|---|---|
| PageIndex | integer | 1 | - | The page number to retrieve (1-based) |
| PageSize | integer | 25 | 500 | Number of records per page |
Paginated Response Structure
Paginated responses include metadata about the current page and total results:
```json
{
  "index": 1,
  "total": 150,
  "consignments": [
    // Array of items
  ]
}
```
| Field | Type | Description |
|---|---|---|
| index | integer | Current page index |
| total | integer | Total number of records matching the query |
| [items] | array | Array of records (field name varies by endpoint) |
Note: The `total` field represents all matching records, not the count of items in the current page. The array property name is resource-specific; inspect each endpoint's response example.
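Because `total` counts every matching record, the number of pages can be derived client-side. A minimal Python sketch of the arithmetic (no API call involved):

```python
import math

def page_count(total: int, page_size: int) -> int:
    """Number of pages needed to cover `total` records at `page_size` per page."""
    return math.ceil(total / page_size) if total else 0

print(page_count(150, 25))   # 6 pages
print(page_count(150, 100))  # 2 pages (the second page holds only 50 records)
```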
Basic Pagination Example
Request:
```shell
curl -X GET "https://api.consignlyhq.com/v1/consignments?PageIndex=1&PageSize=25" \
  -H "Authorization: Bearer ACCESS_TOKEN"
```
Response:
```json
{
  "index": 1,
  "total": 150,
  "consignments": [
    {
      "id": "550e8400-e29b-41d4-a716-446655440000",
      "consignmentNumber": "CON-001",
      "clientName": "Example Client",
      "status": 1
    },
    // ... 24 more items
  ]
}
```
Iterating Through Pages
To retrieve all records, iterate through pages until every matching record has been fetched:
Python Example
```python
import requests

def get_all_consignments(access_token):
    base_url = "https://api.consignlyhq.com/v1/consignments"
    headers = {"Authorization": f"Bearer {access_token}"}
    all_consignments = []
    page_index = 1
    page_size = 100
    while True:
        params = {
            "PageIndex": page_index,
            "PageSize": page_size
        }
        response = requests.get(base_url, headers=headers, params=params)
        response.raise_for_status()
        data = response.json()
        page_items = data["consignments"]
        if not page_items:
            break  # empty page: avoid looping forever if total is stale
        all_consignments.extend(page_items)
        # Check if we've fetched all records
        if len(all_consignments) >= data["total"]:
            break
        page_index += 1
    return all_consignments
```
JavaScript Example
```javascript
async function getAllConsignments(accessToken) {
  const baseUrl = 'https://api.consignlyhq.com/v1/consignments';
  const headers = { Authorization: `Bearer ${accessToken}` };
  const allConsignments = [];
  let pageIndex = 1;
  const pageSize = 100;
  while (true) {
    const url = `${baseUrl}?PageIndex=${pageIndex}&PageSize=${pageSize}`;
    const response = await fetch(url, { headers });
    const data = await response.json();
    if (data.consignments.length === 0) {
      break; // empty page: avoid looping forever if total is stale
    }
    allConsignments.push(...data.consignments);
    if (allConsignments.length >= data.total) {
      break;
    }
    pageIndex++;
  }
  return allConsignments;
}
```
C# Example
```csharp
public async Task<List<Consignment>> GetAllConsignmentsAsync(string accessToken)
{
    var client = new HttpClient();
    client.DefaultRequestHeaders.Authorization =
        new AuthenticationHeaderValue("Bearer", accessToken);

    var allConsignments = new List<Consignment>();
    var pageIndex = 1;
    var pageSize = 100;

    while (true)
    {
        var url = $"https://api.consignlyhq.com/v1/consignments?PageIndex={pageIndex}&PageSize={pageSize}";
        var response = await client.GetAsync(url);
        response.EnsureSuccessStatusCode();
        var data = await response.Content.ReadFromJsonAsync<ConsignmentPageModel>();

        if (data.Consignments.Count == 0)
            break; // empty page: avoid looping forever if total is stale

        allConsignments.AddRange(data.Consignments);
        if (allConsignments.Count >= data.Total)
            break;
        pageIndex++;
    }
    return allConsignments;
}
```
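The three loops above share the same shape, so the pattern can be factored into a reusable helper. A sketch of a Python generator that yields records one page at a time: `fetch_page` is an assumed caller-supplied function so the pagination logic stays independent of the HTTP client, and `item_key` names the resource-specific array field (`consignments` for this endpoint). The empty-page check guards against an infinite loop if `total` changes mid-iteration.

```python
def iter_pages(fetch_page, item_key, page_size=100):
    """Yield every record from a paginated list endpoint.

    `fetch_page(page_index, page_size)` must return the decoded JSON body;
    `item_key` names the resource-specific array field.
    """
    page_index, fetched = 1, 0
    while True:
        data = fetch_page(page_index, page_size)
        items = data[item_key]
        if not items:
            break  # empty page: stop even if `total` claims more records remain
        yield from items
        fetched += len(items)
        if fetched >= data["total"]:
            break
        page_index += 1

# With requests, fetch_page could look like (token and `import requests` assumed):
# def fetch_page(index, size):
#     r = requests.get("https://api.consignlyhq.com/v1/consignments",
#                      headers={"Authorization": f"Bearer {token}"},
#                      params={"PageIndex": index, "PageSize": size})
#     r.raise_for_status()
#     return r.json()
# all_consignments = list(iter_pages(fetch_page, "consignments"))
```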
Filtering with Pagination
Many list endpoints support filtering parameters that can be combined with pagination:
Example: Filtered Consignment Search
```shell
curl -X GET "https://api.consignlyhq.com/v1/consignments?\
PageIndex=1&\
PageSize=50&\
Status=1&\
ClientId=550e8400-e29b-41d4-a716-446655440000&\
SearchText=ORDER123" \
  -H "Authorization: Bearer ACCESS_TOKEN"
```
Common filter parameters by endpoint:
| Endpoint | Filter Parameters |
|---|---|
| /v1/consignments | SearchText, ClientId, CarrierId, WarehouseId, Status, EdiStatus, ExternalId |
| /v1/partners | SearchText, Relationship, Status |
| /v1/inventory | SearchText, PartnerId, PartnerProductId, WarehouseId, IncludeDepleted |
| /v1/warehouses | SearchText, Status |
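From code, combining filters with pagination is just a matter of merging the filter keys into the query parameters. A small Python sketch, assuming filters left unset are omitted from the request (the `build_query` helper name is illustrative, not part of any SDK):

```python
def build_query(page_index=1, page_size=50, **filters):
    """Merge pagination parameters with any filter parameters,
    dropping filters that were left unset (None)."""
    params = {"PageIndex": page_index, "PageSize": page_size}
    params.update({k: v for k, v in filters.items() if v is not None})
    return params

# Usage with requests (token and `import requests` assumed):
# params = build_query(Status=1, ClientId="550e8400-e29b-41d4-a716-446655440000")
# response = requests.get("https://api.consignlyhq.com/v1/consignments",
#                         headers={"Authorization": f"Bearer {token}"},
#                         params=params)
```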
Best Practices
- Use larger page sizes (100-500) when fetching all records to reduce API calls
- Use smaller page sizes (10-25) for interactive UIs with pagination controls
- Consider caching reference data that changes infrequently (tax rates, shipping ports, warehouses, etc.)
- Apply filters before pagination to reduce the total dataset and improve performance:

```
# Instead of fetching all and filtering client-side
GET /v1/consignments?PageSize=500

# Filter server-side for better performance
GET /v1/consignments?Status=1&ClientId=xxx&PageSize=100
```
Troubleshooting
Issue: Receiving Fewer Items Than Expected
The `total` field shows the total number of matching records, not the number returned in the current page. The last page may return fewer items than `PageSize`.
Issue: Duplicate or Missing Records
If records are added or removed while you are paginating, you may encounter duplicates or miss records. Use date filters or sort by a stable field to minimize this.
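When a stable sort or date filter is not available, deduplicating client-side by `id` is a pragmatic fallback. A sketch, assuming each record carries a unique `id` field as in the response examples above:

```python
def dedupe_by_id(records):
    """Drop duplicate records, keeping the first occurrence of each id."""
    seen = {}
    for record in records:
        seen.setdefault(record["id"], record)
    return list(seen.values())

pages = [
    [{"id": "a", "status": 1}, {"id": "b", "status": 1}],
    [{"id": "b", "status": 1}, {"id": "c", "status": 2}],  # "b" shifted pages
]
merged = dedupe_by_id([r for page in pages for r in page])
# merged contains ids "a", "b", "c" exactly once each
```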
Issue: Performance Degradation
Large page sizes with complex filters may cause slow responses. Try:
- Reducing page size
- Simplifying filters
- Adding more specific filters to reduce the dataset