🚀 Feature: Bulk Document Creation #3051
Comments
This is not achievable with functions due to the 8192-character limit.
I have achieved an okay workaround by creating a ".json" file inside a bucket that contains an array of documents, which is then read by a function that inserts the required documents.
It's currently taking about 46 seconds to process 15,863 simple JSON objects.
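A minimal sketch of that workaround, assuming a recent node-appwrite server SDK; the bucket, database, and collection IDs (and the env var names) are placeholders:

```ts
import { Client, Databases, Storage, ID } from 'node-appwrite';

const client = new Client()
  .setEndpoint(process.env.APPWRITE_ENDPOINT!) // e.g. https://cloud.appwrite.io/v1
  .setProject(process.env.APPWRITE_PROJECT_ID!)
  .setKey(process.env.APPWRITE_API_KEY!);

const storage = new Storage(client);
const databases = new Databases(client);

async function importFromBucket(
  bucketId: string,
  fileId: string,
  databaseId: string,
  collectionId: string
) {
  // Download the uploaded .json file and parse the array of documents.
  const raw = await storage.getFileDownload(bucketId, fileId);
  const docs: Record<string, unknown>[] = JSON.parse(
    Buffer.from(raw).toString('utf-8')
  );

  // Insert sequentially; parallel and batched variants come up later in the thread.
  for (const doc of docs) {
    await databases.createDocument(databaseId, collectionId, ID.unique(), doc);
  }
}
```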
Need this feature too.
Hi, can you point me to a demo file please? NodeJS if possible.
No worries. You can find it at the link below. It was thrown together pretty quickly. It expects a JSON file to be created in a bucket with the following structure:
https://gist.github.com/Shadowfita/b5ccd20f65566cb9f2b40d416c5201a2
Would also like a batch delete too.
Using functions and a bucket is a brilliant solution! Another potential workaround is to use a multithreaded client and upload documents in parallel; for instance, a client application developed in Kotlin. Or, another solution:
The best approach would be if the Appwrite API had functions such as export (permissions/objects/buckets/collections/functions) (JSON), import, etc.; in this case we "abstract" the underlying implementation details and can do more granular export/import. For example, right now we don't have an implementation-independent backup/restore (except that executing …)
It would be great if bulk update were also considered, as it'll greatly reduce the number of requests I make in my application.
Hey @stnguyen90,
@singhbhaskar, thanks for your interest! 🙏 However, it would be best for the core team to figure out how it should work.
Need this feature too.
I need bulk operations too, please.
I need this also; I need to create 18K documents, which are pincodes I should say...
I want to delete multiple docs by their IDs too.
So I am working on a scraping project with almost 2K records, and it takes almost 20 minutes to write them! Each record is about this size: {"ColumnRef":"Name","index":"0","value":"orange house"}. Is there any way to speed up the writes?
You should run all write requests asynchronously and wrap them in Promise.all, as sketched below. Appwrite is built to scale and will handle that many concurrent requests with no issue, and it will reduce your wait time dramatically.
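A rough illustration of that suggestion, assuming the node-appwrite server SDK; all IDs are placeholders (and note the caveat about failed promises in the next comment):

```ts
import { Client, Databases, ID } from 'node-appwrite';

const databases = new Databases(
  new Client()
    .setEndpoint(process.env.APPWRITE_ENDPOINT!)
    .setProject(process.env.APPWRITE_PROJECT_ID!)
    .setKey(process.env.APPWRITE_API_KEY!)
);

async function createAll(
  databaseId: string,
  collectionId: string,
  docs: Record<string, unknown>[]
) {
  // Fire all writes concurrently and wait for every one to finish;
  // note that Promise.all rejects as soon as any single write fails.
  await Promise.all(
    docs.map((doc) =>
      databases.createDocument(databaseId, collectionId, ID.unique(), doc)
    )
  );
}
```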
@Shadowfita Is there a guarantee that all promises will resolve if we do it with Promise.all? Last time I checked, about 2 months ago, I tried to do bulk document deletion with Promise.all and it seems some of them failed.
It depends, actually. Instead, you should delete in batches, along the lines of the sketch below.
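A sketch of batched deletion, again assuming node-appwrite; Promise.allSettled reports per-document failures instead of rejecting the whole batch, and the batch size is an arbitrary placeholder:

```ts
import { Databases } from 'node-appwrite';

async function deleteInBatches(
  databases: Databases,
  databaseId: string,
  collectionId: string,
  ids: string[],
  batchSize = 100
) {
  for (let i = 0; i < ids.length; i += batchSize) {
    const batch = ids.slice(i, i + batchSize);
    // allSettled resolves even when individual deletes fail,
    // so one bad document doesn't abort the rest of the batch.
    const results = await Promise.allSettled(
      batch.map((id) => databases.deleteDocument(databaseId, collectionId, id))
    );
    const failed = batch.filter((_, j) => results[j].status === 'rejected');
    if (failed.length > 0) {
      console.warn('Deletes to retry:', failed);
    }
  }
}
```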
Right now I am using that lib for bulk document-from-JSON creation.
I agree a CSV importer in the dashboard would be very useful. I had also gone down the road of CSV → JSON, then JSON → Appwrite, but throttling and processing just made it unbearable for larger workloads. This is using self-hosted Appwrite.
For _uid I had used LEFT(MD5(RAND()), 20) at first, but I got too many repeated values; a collision-resistant alternative is sketched below. For testing, it's best to copy the _3_database_x_collection_x table into another and feed that one, then …
This seems to work. YMMV; proceed at your own risk.
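For what it's worth, a sketch of generating 20-character IDs with enough entropy to avoid those repeats; the diagnosis (RAND()'s limited entropy feeding MD5) is an assumption, not the commenter's actual fix:

```ts
import { randomBytes } from 'node:crypto';

// 20 hex characters = 80 bits of randomness, so collisions are
// vanishingly unlikely even across millions of rows.
function uid(length = 20): string {
  return randomBytes(Math.ceil(length / 2)).toString('hex').slice(0, length);
}
```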
Any news?
Guys, use https://github.com/react-declarative/appwrite-backup-tool. The restore script can upload more than 10,000 documents without loading all of them into RAM. That means it is scalable and works with any size of data.
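Not the tool's actual code, but a sketch of the streaming idea it describes, assuming one JSON document per line (NDJSON) so the full dump never sits in memory at once:

```ts
import { createReadStream } from 'node:fs';
import { createInterface } from 'node:readline';
import { Databases, ID } from 'node-appwrite';

async function streamRestore(
  databases: Databases,
  databaseId: string,
  collectionId: string,
  dumpPath: string
) {
  // readline pulls the file through a stream, one line at a time.
  const lines = createInterface({ input: createReadStream(dumpPath) });
  for await (const line of lines) {
    if (!line.trim()) continue;
    const doc = JSON.parse(line);
    await databases.createDocument(databaseId, collectionId, ID.unique(), doc);
  }
}
```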
Also allow bulk UPDATE & DELETE too. This is a very critical feature; can you guys make it a priority?
Need this ASAP. Bulk actions should be a priority.
Planned for 1.7.
🙌 |
🔖 Feature description
Create a "createDocuments" post endpoint that takes an array of documents.
🎤 Pitch
In my project, I am trying to insert 12,000 documents in one go. It is inefficient to run 12,000 external createDocument API calls to achieve this.
To remedy this, there should be a "createDocuments" endpoint that accepts an array of documents. That way you could easily cut down the number of external calls required and allow the stack to process the creation of the large number of documents internally, quickly and efficiently. A hypothetical request shape is sketched below.
A potential workaround would be creating a function that does it locally, but I don't believe this is a proper solution.
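Something along these lines; the route, headers, and body shape are entirely hypothetical, illustrating the proposal rather than an existing Appwrite endpoint:

```ts
// All values below are placeholders for illustration only.
const endpoint = 'https://cloud.appwrite.io/v1';
const projectId = '<PROJECT_ID>';
const apiKey = '<API_KEY>';
const databaseId = '<DATABASE_ID>';
const collectionId = '<COLLECTION_ID>';

const response = await fetch(
  `${endpoint}/databases/${databaseId}/collections/${collectionId}/documents/bulk`,
  {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-Appwrite-Project': projectId,
      'X-Appwrite-Key': apiKey,
    },
    body: JSON.stringify({
      documents: [
        { name: 'orange house' },
        // ...the rest of the batch, up to some server-enforced limit
      ],
    }),
  }
);
```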
👀 Have you spent some time to check if this issue has been raised before?
🏢 Have you read the Code of Conduct?