
useOffsetInfiniteScrollQuery does not account for added/removed rows #428

Open
roobox opened this issue Mar 29, 2024 · 5 comments

roobox commented Mar 29, 2024

Describe the bug
I am using useOffsetInfiniteScrollQuery to achieve an infinite-scroll effect in my web application. The data is sorted by the created_at column (descending order), and loading more data (pageSize = 10) works perfectly until someone inserts or deletes a row. When I insert a row into my Supabase DB using a stored procedure and then add it manually to my SWR frontend cache via useUpsertItem, the item is added at index 0 of my data array, as it has the newest created_at value. It is also at position 0 in the DB table when sorted by created_at. Since the "offset" value of the SQL query stays the same, but all rows in the sorted table have shifted one position down, the first row of the next page is the same as the last row of the previous page (so I fetch the same row twice).

To Reproduce

  1. Set up the infinite scroll query:

     ```ts
     const { data, loadMore, isValidating, error } = useOffsetInfiniteScrollQuery(
       supabase
         .from('books')
         .select('id, name, created_at')
         .order('created_at', { ascending: false }) // multiple rows can have the same created_at value, e.g. when using batch inserts
         .order('id', { ascending: false }),
       { pageSize: 10 },
     )
     ```

  2. Add a row to the DB outside of this package, e.g. via a stored procedure (see the sketch after this list).

  3. Insert the added row into the SWR cache:

     ```ts
     const upsertItem = useUpsertItem({
       primaryKeys: ['id'],
       table: 'books',
       schema: 'public',
     })

     upsertItem({
       id: '231c220d-f4d7-4ca9-8761-d1beee69cbb7',
       name: 'testbook',
       created_at: new Date().toISOString(),
     })
     ```

  4. Load more data:

     ```ts
     loadMore() // This is where the bug occurs: the last row of the first page and the first row of the second page are identical, because the offset value has not changed but the underlying table has.
     ```
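
For step 2, the insert that bypasses the cache helpers could look roughly like this (`insert_book` is a hypothetical procedure name, only for illustration):

```ts
// Hypothetical stored procedure that inserts a row without going through
// supabase-cache-helpers, so the SWR cache does not know about it yet.
const { error } = await supabase.rpc('insert_book', { p_name: 'testbook' })
if (error) throw error
```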

Expected behavior
I would expect to be able to "manually" adjust the offset somewhere. In my case, where I add an element to the SWR cache, I want to increase the offset by one, so that instead of performing:

```sql
SELECT "id", "name", "created_at"
FROM "books"
ORDER BY ...
OFFSET 10
LIMIT 10
```

I want to perform:

```sql
SELECT "id", "name", "created_at"
FROM "books"
ORDER BY ...
OFFSET 11
LIMIT 10
```

And when I delete an item via useDeleteItem, I want to be able to decrease the offset by one.

Additional context
I can't use useCursorInfiniteScrollQuery, as the created_at column does not hold unique values (e.g. batch imports), and the package would need to be extended to support additional filters as well as >= created_at instead of > created_at (see https://the-guild.dev/blog/graphql-cursor-pagination-with-postgresql#cursor-pagination-with-additional-filters). Maybe this is also worth considering.
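
For reference, a keyset query with an id tie-breaker could be built with plain supabase-js along these lines (just a sketch; the nested or/and filter is PostgREST syntax, and cursor handling is simplified):

```ts
// Sketch of keyset pagination with a (created_at, id) tie-breaker using plain
// supabase-js. `cursor` holds the created_at and id of the last row of the
// previously loaded page; values containing reserved characters may need
// additional quoting/encoding inside the or() filter.
const loadNextPage = async (cursor: { created_at: string; id: string }) => {
  const { data, error } = await supabase
    .from('books')
    .select('id, name, created_at')
    // rows strictly "after" the cursor in (created_at DESC, id DESC) order
    .or(
      `created_at.lt.${cursor.created_at},and(created_at.eq.${cursor.created_at},id.lt.${cursor.id})`
    )
    .order('created_at', { ascending: false })
    .order('id', { ascending: false })
    .limit(10)

  if (error) throw error
  return data
}
```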

roobox added the bug label on Mar 29, 2024
pikameow420 commented
Hey @roobox,

I am running into the same issue. Were you able to find a workaround for this?

psteinroe (Owner) commented
Hey! Thanks for opening the issue. I understand what's going wrong, but I don't have a good fix for this. For now, I would propose to just revalidate the query.

I was thinking about refactoring the pagination and infinite scroll hooks, and this is good input. Let me know if you have more!
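
For reference, one way to force such a revalidation with plain SWR is a global `mutate` call with a key filter. This is only a sketch and assumes the cache keys for this query are strings containing the table name; the exact key format the library uses isn't shown here.

```ts
import { useSWRConfig } from 'swr'

// Sketch: revalidate every cached query whose key mentions the "books" table.
// Assumes string cache keys containing the table name; adjust the predicate
// to whatever key format is actually used.
function useRevalidateBooks() {
  const { mutate } = useSWRConfig()
  return () =>
    mutate(
      (key) => typeof key === 'string' && key.includes('books'),
      undefined,
      { revalidate: true }
    )
}
```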

roobox (Author) commented Feb 14, 2025

Hi @pikameow420,

I ended up developing my own infinite-scroll implementation to be able to handle "more complex" cases.

psteinroe (Owner) commented
> Hi @pikameow420,
> I ended up developing my own infinite-scroll implementation to be able to handle "more complex" cases.

Can you share it? :)

roobox (Author) commented Feb 16, 2025

> Hi @pikameow420,
> I ended up developing my own infinite-scroll implementation to be able to handle "more complex" cases.
>
> Can you share it? :)

Sure! I am using SWR from Vercel.
Here's my code:

```ts
import useSWR from 'swr'
import useSWRMutation from 'swr/mutation'
import { useState } from 'react'
import supabase from '../utils/supabaseClient'

/**
 * Fetch a list of books with an initial page load size.
 */
const getBooks = async (
  url: string,
  searchTerm: string,
  sortBy: string,
  sortAscending: boolean,
  initialPageLoad: number,
  discountId: string | null
) => {
  // Example: cacheKey = books_<categoryId>_optionalDiscountId => extract categoryId
  const categoryId = url.split('_')[1]
  if (!categoryId || categoryId === 'null' || categoryId === 'root') {
    return []
  }

  let query = supabase
    .from('books_overview')
    .select('*')
    .eq('category_id', categoryId)
    .or(`title.ilike.%${searchTerm}%,description.ilike.%${searchTerm}%`)
    .order(sortBy, { ascending: sortAscending })
    .range(0, initialPageLoad - 1)

  // Optionally filter by discountId
  if (discountId !== null) {
    query = query.contains('book_meta', { discount_id: discountId })
  }

  const { data, error } = await query
  if (error) {
    throw new Error('Failed to retrieve Books')
  }

  console.log('Fetched Books:', data)
  return data
}

/**
 * Fetch additional books for infinite scroll (subsequent pages).
 */
const loadMoreBooks = async (
  url: string,
  searchTerm: string,
  sortBy: string,
  sortAscending: boolean,
  offset: number,
  successivePageLoad: number,
  discountId: string | null
) => {
  const categoryId = url.split('_')[1]
  if (!categoryId || categoryId === 'null' || categoryId === 'root') {
    return []
  }

  let query = supabase
    .from('books_overview')
    .select('*')
    .eq('category_id', categoryId)
    .or(`title.ilike.%${searchTerm}%,description.ilike.%${searchTerm}%`)
    .order(sortBy, { ascending: sortAscending })
    .range(offset, offset + (successivePageLoad - 1))

  if (discountId !== null) {
    query = query.contains('book_meta', { discount_id: discountId })
  }

  const { data, error } = await query
  if (error) {
    throw new Error('Failed to load more Books')
  }

  return data
}

/**
 * Custom hook to manage infinite scroll of books from Supabase.
 */
export default function useBooks(
  categoryId: string | null,
  searchTerm: string,
  sortBy: string,
  sortAscending: boolean,
  initialPageLoad: number = 25,
  discountId: string | null
) {
  const [hasMore, setHasMore] = useState(true)

  // Define how many items to load on each subsequent fetch
  const successivePageLoad = 20

  // Build a cache key that includes discountId if it exists
  const cacheKey = discountId ? `books_${categoryId}_${discountId}` : `books_${categoryId}`

  // 1) Load the initial set of books
  const {
    data: books,
    error: booksError,
    isLoading: booksIsLoading,
    mutate: booksMutate,
  } = useSWR(
    cacheKey,
    (url) => getBooks(url, searchTerm, sortBy, sortAscending, initialPageLoad, discountId),
    {
      fallbackData: [],
      onSuccess: (data) => {
        // If we got exactly initialPageLoad items, there's probably more
        setHasMore(data.length === initialPageLoad)
      },
    }
  )

  // 2) Load more books when scrolling
  const {
    trigger: loadMore,
    isMutating: loadMoreRunning,
    error: loadMoreError,
  } = useSWRMutation(
    cacheKey,
    (url) =>
      loadMoreBooks(url, searchTerm, sortBy, sortAscending, books.length, successivePageLoad, discountId),
    {
      // Populate SWR's cache with the newly loaded items
      populateCache: (loadedBooks, currentBooks) => [...(currentBooks || []), ...loadedBooks],
      revalidate: false,
      rollbackOnError: true,
      onSuccess: (data) => {
        // If we got exactly successivePageLoad, there's probably more to load
        setHasMore(data.length === successivePageLoad)
      },
    }
  )

  // Optional helper to force-refresh the entire list
  const refreshBooks = async () => {
    await booksMutate(
      async () =>
        getBooks(cacheKey, searchTerm, sortBy, sortAscending, initialPageLoad, discountId),
      { revalidate: false }
    )
  }

  if (!categoryId) {
    return {
      books: null,
      loadMore: () => null,
      booksError: null,
      booksIsLoading: null,
      booksMutate: null,
      refreshBooks: null,
      hasMore: false,
    }
  }

  return {
    books,
    loadMore,
    hasMore,
    booksError,
    booksIsLoading,
    booksMutate,
    refreshBooks,
    loadMoreRunning,
    loadMoreError,
  }
}
```
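
A minimal, hypothetical consumer component showing how the hook could be wired up (names and props are illustrative only):

```tsx
// Hypothetical consumer, just to show how the hook is consumed.
function BookList({ categoryId }: { categoryId: string }) {
  const { books, loadMore, hasMore, booksIsLoading } = useBooks(
    categoryId,
    '', // searchTerm
    'created_at',
    false, // sortAscending
    25, // initialPageLoad
    null // discountId
  )

  if (booksIsLoading) return <p>Loading…</p>

  return (
    <div>
      {books?.map((book) => (
        <div key={book.id}>{book.name}</div>
      ))}
      {hasMore && <button onClick={() => loadMore()}>Load more</button>}
    </div>
  )
}
```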
