Could someone help me with chunking a large dataset in pandas and processing it into a nested dictionary?
Hi @ksam78727,
The following thread may help; it describes a better way to optimize the process.
Export pandas dataframe to a nested dictionary from multiple columns
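Just to illustrate the idea from that thread, a minimal sketch of building a nested dictionary from multiple columns might look like this (the column names country/city/population are only placeholders, not from your data):

```python
import pandas as pd

# Small example frame; replace the columns with your own.
df = pd.DataFrame({
    "country": ["US", "US", "IN"],
    "city": ["NYC", "LA", "Delhi"],
    "population": [8.4, 3.9, 32.9],
})

# Build a nested dict of the form {country: {city: population}}
nested = {
    country: group.set_index("city")["population"].to_dict()
    for country, group in df.groupby("country")
}

print(nested)  # {'IN': {'Delhi': 32.9}, 'US': {'NYC': 8.4, 'LA': 3.9}}
```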
And what about chunking?
Sir, I have a question: if I perform concat, the chunks are converted back into a single DataFrame, so it returns to its original form and occupies the same amount of memory. I want to reduce the memory usage, which is why I'm chunking in the first place.
Could you guide me on this? I want the chunked data converted into a nested dictionary.
I’m not sure about this though, will check and let you know. Until then, can you share the notebook that you’re working on?
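In the meantime, here is a rough sketch of the direction I would try: process each chunk as it is read and merge it into the dictionary, instead of concatenating the chunks back into one DataFrame (which would bring the full dataset into memory again). The CSV path and column names below are placeholders:

```python
import pandas as pd

nested = {}

# Read the file in chunks so only one chunk is in memory at a time.
for chunk in pd.read_csv("data.csv", chunksize=100_000):
    for country, group in chunk.groupby("country"):
        # Merge this chunk's rows into the nested dict,
        # rather than keeping the chunk itself around.
        nested.setdefault(country, {}).update(
            group.set_index("city")["population"].to_dict()
        )

print(len(nested))
```

Since each chunk is discarded after it is folded into the dictionary, peak memory stays roughly at one chunk plus the dictionary itself.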