Solving Pagination Issues in GraphQL Using Relay-Style Pagination in Apollo Client
GraphQL is a powerful tool with many benefits, but like any tool it has weak spots, and pagination is one of them. It rarely works well out of the box: you usually need custom cache merge functions and manual cache updates after mutations. In this article, we will walk through relay-style pagination using chat messages as an example.
Implementing Relay-Style Pagination in Apollo Client
Apollo Client ships a built-in cache merge function for relay-style pagination: the relayStylePagination helper, exported from @apollo/client/utilities.
To integrate it into your project, add the helper to the typePolicies of your InMemoryCache configuration:
import { InMemoryCache } from "@apollo/client";
import { relayStylePagination } from "@apollo/client/utilities";

export const cache = new InMemoryCache({
  typePolicies: {
    Dialog: {
      fields: {
        // An empty keyArgs array: all pages of messages for a dialog are
        // merged into a single cached list.
        messages: relayStylePagination([]),
      },
    },
  },
});
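relayStylePagination only takes care of merging pages inside the cache; the query itself still has to ask for a Relay-style connection (edges with cursors plus pageInfo) and pass the latest cursor to fetchMore. Below is a minimal sketch of how this might look for our chat; the schema is an assumption, so field and argument names such as dialog, messages, first, and after may differ in your API.

import { gql, useQuery } from "@apollo/client";

// Hypothetical messages connection query for a dialog (adjust to your schema).
const DIALOG_MESSAGES_QUERY = gql`
  query DialogMessages($dialogId: ID!, $first: Int!, $after: String) {
    dialog(id: $dialogId) {
      id
      messages(first: $first, after: $after) {
        edges {
          cursor
          node {
            id
            text
            createdAt
          }
        }
        pageInfo {
          hasNextPage
          endCursor
        }
      }
    }
  }
`;

function useDialogMessages(dialogId: string) {
  const { data, fetchMore } = useQuery(DIALOG_MESSAGES_QUERY, {
    variables: { dialogId, first: 20 },
  });

  // Request the next page; the relayStylePagination field policy merges it
  // into the cached list, so no updateQuery callback is needed.
  const loadMore = () =>
    fetchMore({
      variables: { after: data?.dialog?.messages?.pageInfo?.endCursor },
    });

  return { messages: data?.dialog?.messages?.edges ?? [], loadMore };
}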
Handling OptimisticResponse and Cache Updates
What if you want to add an optimisticResponse and update the cache in update? How can you combine this with pagination and cursors?
Optimistic Update
optimisticResponse allows you to temporarily write data to the cache as if the request had already succeeded. This is useful for building a smooth and responsive user interface.
Example of using optimisticResponse:
const [sendMessage] = useMutation(SEND_MESSAGE_MUTATION, {
  optimisticResponse: {
    __typename: "Mutation",
    sendMessage: {
      __typename: "Message",
      id: "temp-id", // Temporary identifier
      text: newMessageText,
      createdAt: new Date().toISOString(),
      sender: currentUser,
    },
  },
  update(cache, { data: { sendMessage } }) {
    // Cache update logic after mutation
  },
});
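The optimisticResponse must mirror the mutation's selection set, otherwise Apollo Client cannot write it into the cache. SEND_MESSAGE_MUTATION itself is not shown above, so here is a hypothetical sketch of what it could look like; the argument names and the selected sender fields are assumptions and should match your own schema.

import { gql } from "@apollo/client";

// Hypothetical mutation document matching the optimisticResponse shape above.
const SEND_MESSAGE_MUTATION = gql`
  mutation SendMessage($dialogId: ID!, $text: String!) {
    sendMessage(dialogId: $dialogId, text: $text) {
      id
      text
      createdAt
      sender {
        id
        name
      }
    }
  }
`;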
Writing a cache.modify Function for update
Let's walk through our cache.modify call for update step by step.
1. Create a mutation with the update parameter:

const [mutation] = useMutation(SEND_MESSAGE_MUTATION, {
  update(cache, { data }) {
    if (data?.sendMessage) {
      // Cache update code will be here
    }
  },
});
2. Get the dataId for our message dialog:

const dataId = cache.identify({
  __typename: dialog.__typename,
  id: dialog.id,
});
3. Check that dataId was obtained:

if (!dataId) return;
4. Modify the cache:

cache.modify<Dialog>({
  id: dataId,
  fields: {
    messages(existingMessages, { isReference, readField }) {
      // Message update logic will be here
    },
  },
});
5. Create a copy of the existing messages:

const existingEdges = (existingMessages.edges || []).slice();
6. If there is already a first edge, take its cursor and remove it from that edge. The new edge created in the next step reuses this cursor, and deleting it from the old edge keeps each cursor attached to a single edge:

let cursor: string | undefined;
if (existingEdges[0]) {
  const lastMessage = { ...existingEdges[0] };
  cursor = isReference(existingEdges[0])
    ? readField<string>("cursor", existingEdges[0])
    : existingEdges[0].cursor;
  delete lastMessage.cursor;
  existingEdges[0] = lastMessage;
}
7. Create a new edge for the message and add it to the beginning of the list; the mutation result data.sendMessage becomes its node:

const edge = {
  __typename: "MessageEdge",
  cursor,
  node: data.sendMessage,
};
existingEdges.unshift(edge);
8. Return the updated list of messages:

return {
  ...existingMessages,
  edges: existingEdges,
};
Full Example of the update Function
const [mutation] = useMutation(SEND_MESSAGE_MUTATION, {
  update(cache, { data }) {
    if (data?.sendMessage) {
      try {
        const dataId = cache.identify({
          __typename: dialog.__typename,
          id: dialog.id,
        });
        if (!dataId) return;
        cache.modify<Dialog>({
          id: dataId,
          fields: {
            messages(existingMessages, { isReference, readField }) {
              const existingEdges = (existingMessages.edges || []).slice();

              // Reuse the cursor of the current first edge (if any) for the
              // new edge, and remove it from the old edge so the same cursor
              // does not appear on two edges.
              let cursor: string | undefined;
              if (existingEdges[0]) {
                const lastMessage = { ...existingEdges[0] };
                cursor = isReference(existingEdges[0])
                  ? readField<string>("cursor", existingEdges[0])
                  : existingEdges[0].cursor;
                delete lastMessage.cursor;
                existingEdges[0] = lastMessage;
              }

              const edge = {
                __typename: "MessageEdge",
                cursor,
                node: data.sendMessage,
              };
              existingEdges.unshift(edge);

              return {
                ...existingMessages,
                edges: existingEdges,
              };
            },
          },
        });
      } catch (error) {
        console.error("Error updating cache:", error);
      }
    }
  },
});
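The example above references dialog, currentUser, and newMessageText from the surrounding component. A minimal sketch of that wiring is shown below; the useSendMessage hook name and the local Dialog and User types are assumptions made purely for illustration (in a real project they would come from your generated GraphQL types).

import { useState } from "react";
import { useMutation } from "@apollo/client";

// Minimal stand-ins for generated types, for illustration only.
type User = { __typename: "User"; id: string; name: string };
type Dialog = { __typename: "Dialog"; id: string };

// Assumed wiring: `dialog` and `currentUser` come from the calling component,
// `newMessageText` lives in local state, SEND_MESSAGE_MUTATION is the document
// sketched earlier, and `update` is the full example above.
function useSendMessage(dialog: Dialog, currentUser: User) {
  const [newMessageText, setNewMessageText] = useState("");

  const [sendMessage] = useMutation(SEND_MESSAGE_MUTATION, {
    optimisticResponse: {
      __typename: "Mutation",
      sendMessage: {
        __typename: "Message",
        id: "temp-id", // Temporary identifier, replaced by the server response
        text: newMessageText,
        createdAt: new Date().toISOString(),
        sender: currentUser,
      },
    },
    update(cache, { data }) {
      // ...cache.modify logic from the full example above
    },
  });

  const send = () =>
    sendMessage({ variables: { dialogId: dialog.id, text: newMessageText } });

  return { newMessageText, setNewMessageText, send };
}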
Deduplicating Messages When Updating the Cache
When new data arrives, you may end up with duplicate messages in the cache, for example when a message you have just written to the cache manually also comes back from the server in a later page or over a subscription. To avoid this, we will add deduplication to the relayStylePagination function.
Modified relayStylePagination Function with Deduplication
1. Import the necessary modules and define types:

import { __rest } from "tslib";
import { FieldPolicy, Reference } from "@apollo/client";
import {
  RelayFieldPolicy,
  TExistingRelay,
  TRelayEdge,
  TRelayPageInfo,
} from "@apollo/client/utilities/policies/pagination";
import { mergeDeep } from "@apollo/client/utilities";
import { ReadFieldFunction } from "@apollo/client/cache/core/types/common";

type KeyArgs = FieldPolicy<any>["keyArgs"];
2. Define helper functions:

- Function to get additional fields:

const notExtras = ["edges", "pageInfo"];
const getExtras = (obj: Record<string, any>) => __rest(obj, notExtras);

- Function to create an empty data object:

function makeEmptyData(): TExistingRelay<any> {
  return {
    edges: [],
    pageInfo: {
      hasPreviousPage: false,
      hasNextPage: true,
      startCursor: "",
      endCursor: "",
    },
  };
}

- Function to get the edge node ID:

type IsReferenceFunction = (obj: any) => obj is Reference;

type GetEdgeNodeIdPayload = {
  edge: TRelayEdge<Reference>;
  isReference: IsReferenceFunction;
  readField: ReadFieldFunction;
  idKey?: string;
};

function getEdgeNodeId({
  edge,
  isReference,
  readField,
  idKey,
}: GetEdgeNodeIdPayload): string | undefined {
  const node = isReference(edge) ? readField<string>("node", edge) : edge.node;
  if (node) {
    return isReference(node)
      ? readField<string>(idKey || "id", node)
      : (node as any)?.id;
  }
  return undefined;
}
3. The relayStylePagination function with deduplication (wiring it into the cache is sketched right after this list):

export function relayStylePagination<TNode extends Reference = Reference>(
  keyArgs: KeyArgs = false,
  idKey?: string
): RelayFieldPolicy<TNode> {
  return {
    keyArgs,
    read(existing, { canRead, readField }) {
      if (!existing) return existing;
      const edges: TRelayEdge<TNode>[] = [];
      let firstEdgeCursor = "";
      let lastEdgeCursor = "";
      existing.edges.forEach((edge) => {
        if (canRead(readField("node", edge))) {
          edges.push(edge);
          if (edge.cursor) {
            firstEdgeCursor = firstEdgeCursor || edge.cursor || "";
            lastEdgeCursor = edge.cursor || lastEdgeCursor;
          }
        }
      });
      if (edges.length > 1 && firstEdgeCursor === lastEdgeCursor) {
        firstEdgeCursor = "";
      }
      const { startCursor, endCursor } = existing.pageInfo || {};
      return {
        ...getExtras(existing),
        edges,
        pageInfo: {
          ...existing.pageInfo,
          startCursor: startCursor || firstEdgeCursor,
          endCursor: endCursor || lastEdgeCursor,
        },
      };
    },
    merge(existing, incoming, { args, isReference, readField }) {
      if (!existing) {
        existing = makeEmptyData();
      }
      if (!incoming) {
        return existing;
      }
      const incomingEdges: typeof incoming.edges = [];
      const incomingIds = new Set();
      if (incoming.edges) {
        incoming.edges.forEach((edge) => {
          if (isReference((edge = { ...edge }))) {
            edge.cursor = readField<string>("cursor", edge);
          }
          const nodeId = getEdgeNodeId({ edge, isReference, readField, idKey });
          if (nodeId) incomingIds.add(nodeId);
          incomingEdges.push(edge);
        });
      }
      if (incoming.pageInfo) {
        const { pageInfo } = incoming;
        const { startCursor, endCursor } = pageInfo;
        const firstEdge = incomingEdges[0];
        const lastEdge = incomingEdges[incomingEdges.length - 1];
        if (firstEdge && startCursor) {
          firstEdge.cursor = startCursor;
        }
        if (lastEdge && endCursor) {
          lastEdge.cursor = endCursor;
        }
        const firstCursor = firstEdge && firstEdge.cursor;
        if (firstCursor && !startCursor) {
          incoming = mergeDeep(incoming, {
            pageInfo: {
              startCursor: firstCursor,
            },
          });
        }
      }
      let prefix: typeof existing.edges = [];
      let afterIndex = -1;
      let beforeIndex = -1;
      existing.edges.forEach((edge, index) => {
        const nodeId = getEdgeNodeId({ edge, isReference, readField, idKey });
        /**
         * Remove duplicates
         */
        if (!(nodeId && incomingIds.has(nodeId))) prefix.push(edge);
        if (edge.cursor === args?.after) afterIndex = index;
        if (edge.cursor === args?.before) beforeIndex = index;
      });
      let suffix: typeof prefix = [];
      if (args && args.after) {
        if (afterIndex >= 0) {
          prefix = prefix.slice(0, afterIndex + 1);
        }
      } else if (args && args.before) {
        suffix = beforeIndex < 0 ? prefix : prefix.slice(beforeIndex);
        prefix = [];
      } else if (incoming.edges) {
        prefix = [];
      }
      const edges = [...prefix, ...incomingEdges, ...suffix];
      const pageInfo: TRelayPageInfo = {
        ...incoming.pageInfo,
        ...existing.pageInfo,
      };
      if (incoming.pageInfo) {
        const {
          hasPreviousPage,
          hasNextPage,
          startCursor,
          endCursor,
          ...extras
        } = incoming.pageInfo;
        Object.assign(pageInfo, extras);
        if (!prefix.length) {
          if (void 0 !== hasPreviousPage)
            pageInfo.hasPreviousPage = hasPreviousPage;
          if (void 0 !== startCursor) pageInfo.startCursor = startCursor;
        }
        if (!suffix.length) {
          if (void 0 !== hasNextPage) pageInfo.hasNextPage = hasNextPage;
          if (void 0 !== endCursor) pageInfo.endCursor = endCursor;
        }
      }
      return {
        ...getExtras(existing),
        ...getExtras(incoming),
        edges,
        pageInfo,
      };
    },
  };
}
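To use the customized helper, point the field policy at it instead of the built-in one. A minimal sketch, assuming the function above lives in a local relayStylePagination module and that message nodes are identified by an id field:

import { InMemoryCache } from "@apollo/client";
import { relayStylePagination } from "./relayStylePagination"; // the version defined above

export const cache = new InMemoryCache({
  typePolicies: {
    Dialog: {
      fields: {
        // The optional second argument tells getEdgeNodeId which node field
        // to deduplicate by.
        messages: relayStylePagination([], "id"),
      },
    },
  },
});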
How Deduplication Works
1. Creating a set of incoming node IDs (incomingIds): while processing the incoming data in incoming.edges, a nodeId is computed for each edge node and added to the incomingIds set.

const incomingEdges: typeof incoming.edges = [];
const incomingIds = new Set();
if (incoming.edges) {
  incoming.edges.forEach((edge) => {
    if (isReference((edge = { ...edge }))) {
      edge.cursor = readField<string>("cursor", edge);
    }
    const nodeId = getEdgeNodeId({ edge, isReference, readField, idKey });
    if (nodeId) incomingIds.add(nodeId);
    incomingEdges.push(edge);
  });
}
2. Removing duplicates from the existing data (existing.edges): while processing the existing edges, each edge node is checked against the incomingIds set. If the node is already present in the incoming data, it is not added to the resulting prefix array.

let prefix: typeof existing.edges = [];
existing.edges.forEach((edge, index) => {
  const nodeId = getEdgeNodeId({ edge, isReference, readField, idKey });
  if (!(nodeId && incomingIds.has(nodeId))) prefix.push(edge);
  if (edge.cursor === args?.after) afterIndex = index;
  if (edge.cursor === args?.before) beforeIndex = index;
});
3. Merging the arrays without duplicates: after removing duplicates from the existing data, the prefix, incomingEdges, and suffix arrays are combined into the new edges array, which is written back to the cache.

const edges = [...prefix, ...incomingEdges, ...suffix];
Thus, we avoid duplicating messages when updating the cache, maintaining data consistency in the client application.