Looking for Best Practices for Optimizing Data Tables with 10,000+ Records!

Hey

I am currently building an internal dashboard for our operations team, and we are working with a few data tables that have 10,000+ records each. Performance is starting to lag, especially when applying multiple filters or loading complex pages with connected data.

I would love to get some advice from those with experience handling larger datasets in Tadabase. Specifically:

Are there recommended strategies to improve page load speed?

Do lookup fields or connected records significantly impact performance?

Would using pipelines or automations to archive older data help?

Is it better to split large tables into smaller linked ones?

I am also curious whether there's a known threshold where performance takes a noticeable dip.
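For context on the archiving question, the idea I had in mind is roughly this: periodically move records older than some cutoff out of the live table into an archive table, so filters only scan recent data. Here's a rough sketch of the partitioning logic. The record shape and the `created_at` field name are placeholders from my own schema, not anything Tadabase-specific, and the actual move would be done via a pipeline or the API:

```python
from datetime import datetime, timedelta

def split_for_archive(records, cutoff_days, date_field="created_at"):
    """Partition records into (live, archive) lists by a date threshold.

    `records` is a list of dicts; `date_field` holds an ISO date string.
    Field names are placeholders for my own schema, not Tadabase's API.
    """
    cutoff = datetime.now() - timedelta(days=cutoff_days)
    live, archive = [], []
    for rec in records:
        created = datetime.fromisoformat(rec[date_field])
        # Older than the cutoff -> candidate for the archive table.
        (archive if created < cutoff else live).append(rec)
    return live, archive

# Example: anything older than roughly two years would be archived.
sample = [
    {"id": 1, "created_at": "2020-01-15"},
    {"id": 2, "created_at": datetime.now().date().isoformat()},
]
live, archive = split_for_archive(sample, cutoff_days=730)
```

Not sure if this actually helps Tadabase's query performance in practice, which is part of what I'm asking.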

Appreciate any tips or lessons learned! I want to make sure the app scales smoothly as our data grows. I have also gone through this thread https://community.tadabase.io/t/assign-connection-fields-to-records-big-data-table-azure-training-in-hyderabad but I'm still looking for more help.

Thanks in advance!

With Regards,

Marcelo Salas


also following

I think the answer here is "it depends."

What are you doing with your datasets? What account level do you have? Are there any specific issues you are noticing?

I don't know that other users will be able to give you a best-practices list without knowing what's going on.