I was wondering if it's possible to have one archive split into multiple smaller archives? Specifically, what I'm trying to do is serialize a lot of polymorphic pointers, but I'd like every pointer entry to be serialized to its own output file. Then, when deserializing, I'd like to load all of those archives, merge them, and have the pointers re-initialize themselves properly.
The end result would be that if 1 of 10,000 pointers changes, the whole archive doesn't have to be regenerated; only the archive file for that specific pointer would be remade.
You're likely to have better luck generating some kind of paginated archive class that contains an array of "pages", where each "page" object holds a list of polymorphic objects.
Have manual load/save methods that call the archive load/save code and rebuild the pointers after loading.
This isn't something I think you can expect to be built into cereal itself, though (at least that's my gut reaction based on your description).
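The paginated-archive idea above could be sketched roughly as follows. This is a minimal, cereal-free mock-up of the pattern only: the `Shape` hierarchy, `Page`, and `PaginatedArchive` are hypothetical stand-ins, and the hand-rolled tag/value text format is just a placeholder. In a real implementation, each page's `save`/`load` would instead call into a cereal archive (e.g. `cereal::BinaryOutputArchive` with the polymorphic-type support headers), and each page would be written to its own file.

```cpp
#include <memory>
#include <sstream>
#include <string>
#include <vector>

// Hypothetical polymorphic hierarchy standing in for the real objects.
struct Shape {
    virtual ~Shape() = default;
    virtual std::string tag() const = 0;
    virtual int value() const = 0;
};
struct Circle : Shape {
    int radius = 0;
    std::string tag() const override { return "Circle"; }
    int value() const override { return radius; }
};
struct Square : Shape {
    int side = 0;
    std::string tag() const override { return "Square"; }
    int value() const override { return side; }
};

// One "page": a small group of polymorphic objects that is saved on its
// own, so that changing one object only rewrites one page's output.
struct Page {
    std::vector<std::shared_ptr<Shape>> objects;

    // With cereal this would simply be archive(objects); here we write
    // one "tag value" line per object to keep the sketch self-contained.
    void save(std::ostream& os) const {
        for (const auto& obj : objects)
            os << obj->tag() << ' ' << obj->value() << '\n';
    }

    // Rebuilds the polymorphic pointers from the stored type tags.
    void load(std::istream& is) {
        objects.clear();
        std::string tag;
        int v = 0;
        while (is >> tag >> v) {
            if (tag == "Circle") {
                auto c = std::make_shared<Circle>();
                c->radius = v;
                objects.push_back(c);
            } else if (tag == "Square") {
                auto s = std::make_shared<Square>();
                s->side = v;
                objects.push_back(s);
            }
        }
    }
};

// The paginated archive: on load, every page is deserialized
// independently and the results are concatenated into one flat list,
// so the pointers are rebuilt only after all pages are in.
struct PaginatedArchive {
    static std::vector<std::shared_ptr<Shape>>
    load_all(const std::vector<std::string>& page_texts) {
        std::vector<std::shared_ptr<Shape>> merged;
        for (const auto& text : page_texts) {
            std::istringstream is(text);
            Page p;
            p.load(is);
            for (auto& obj : p.objects)
                merged.push_back(std::move(obj));
        }
        return merged;
    }
};
```

The key property is that each page round-trips independently, so regenerating one page does not touch the others; the merge step at load time is what gives you back the single flat collection the rest of the program expects.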
I suppose all I would have to do is defer the pointer rebuilding until all the individual archives have been loaded and combined into what cereal expects.