Hi,
I am experiencing issues with the TB native webhook. For some reason, it doesn’t save the mapping, even when I do it over and over. So the values are not saved to the database.
I can’t switch to Make, since the number of fields is really large.
Does anyone have the same experience and maybe a solution?
@moe: is there a limit on the number of fields which can be saved from a Tadabase webhook? I rebuilt my Tadabase webhook from scratch several times, and it stops saving the mapping at a certain point. So I do the mapping, and starting from field number 134 (for example) it stops saving my updates from number 135 onwards.
There must be something blocking it on the back end, since it stops saving at the same point every time!
Is it possible to map the fields directly by adding them in the JSON request, so that I can skip the mapping process?
Can you provide a little more context as to the purpose of the webhook and the data that you are trying to map to specific fields? We do a ton of data replication and webhooks in all of the apps that we have created for others.
Please feel free to reach out. asanchez@rmelas.com
Hi @bgedevteam ,
Thank you, but this is not about data structures, pipes, record rules or database setup. The issue is that the results from a pipe are not being saved by Tadabase. And it drives me crazy, because my users are complaining about incomplete results in their app.
I use the native TB webhooks, due to the amount of fields and the speed of processing.
For context, please check the Loom video below and see how the mapping for the “Type” fields is not being saved.
Loom: Fields not being saved video | Loom
I am really open to suggestions, because I haven’t figured out a solution yet.
Thanks
@bgedevteam,
Do you have experience with large webhook requests, both in terms of the number of fields and the number of webhooks? I am still figuring out a solution, but I haven’t found one yet. I am really worried about it; support says they’re working on it, but I haven’t got an answer yet.
Hi @slimpens ,
Although this hasn’t happened to me, it looks like you have hit a Tadabase limitation. I’m sure Tadabase engineers will find a solution soon, but in the meantime maybe you could explore how you could split the data. For example, could you set up 2 webhooks, get 70% of the fields from the first one and the rest after? Or get that first webhook, then call a pipe to get the rest?
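To make the splitting idea concrete, here’s a minimal sketch in Python. The field names, chunk size, and webhook URLs are all placeholders (this is not Tadabase’s actual API); the point is just to cut one large payload into two smaller webhook calls that each stay under the point where the mapping stops saving:

```python
import json

def split_payload(payload: dict, max_fields: int) -> list[dict]:
    """Split a flat field dict into chunks of at most max_fields keys each."""
    items = list(payload.items())
    return [dict(items[i:i + max_fields])
            for i in range(0, len(items), max_fields)]

# Hypothetical record with 200 fields, past the point where saving stopped.
record = {f"field_{n}": n for n in range(1, 201)}

# Keep each webhook well under the ~134-field point observed above.
chunks = split_payload(record, max_fields=100)

for i, chunk in enumerate(chunks, start=1):
    body = json.dumps(chunk)
    # POST each chunk to its own webhook endpoint (placeholder URL):
    # requests.post(f"https://example.tadabase.io/webhook/{i}", data=body,
    #               headers={"Content-Type": "application/json"})
    print(f"webhook {i}: {len(chunk)} fields, {len(body)} bytes")
```

Each chunk would then map to its own webhook in the builder, so no single mapping screen has to hold all 200 fields.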
Send me a PM or email if you’d like to discuss this, I’ll be happy to help.
Cheers
Martin
Hi @Gaudspeed ,
Yes, it looks like it, but nobody from Support, nor @moe, has confirmed whether I hit a limit. I haven’t read about a webhook limit anywhere.
I have asked multiple times for confirmation of whether I hit a limit. No problem if there is one, but then I have to rearrange my app. For now, I am completely in the dark, since my next steps depend on whether I hit a limit or not.
I will contact you if I need any help, but first I have to wait for the response from the TB team.
Regards,
Also, the reason I need a lot of fields is that I am using Chart.js to show the data in my app. Chart.js requires that the values are stored on the table itself; it’s not possible to use connection field values.
Hi @slimpens , I understand that you want an answer regarding the limitation first. I recommend that you also consider that whenever you are working with very large payloads, there is more risk: it might be a maximum number of fields, a payload size in bytes, a timeout on the record update, etc.
My opinion is that doing 2-3 smaller webhooks and/or API calls instead of a single very large one could improve the reliability of your app in the long term.
Keep in mind that I don’t know the size of your webhook, I’m just supposing it’s very large.
Cheers
Hi @Gaudspeed
Thank you, but the size of the webhook is 90 KB; at least, the webhook response mentions ‘90’ as the size. My assumption is that that shouldn’t be the issue. The field values are numbers, dates and text, and the total number of fields is 200.
I was under the assumption that TB could handle large amounts of data and several thousand row mutations instantly. But for now, I am not so sure about that.
Thank you for your suggestion; I’ll keep it in mind. For now, I am still awaiting support’s response on whether my issue is the result of a TB limit or a bug.
We have some applications that are pulling in 125 fields at a time and repurposing data into other fields. Let’s connect to review your specifics. I did watch your video, but I need a little more context.
@Gaudspeed @bgedevteam
I had contact with support, and the issue was indeed related to a webhook limit, so there was no bug. I restructured my app and now everything works fine.