geekpython.in

Dremor , to technology in Efficiently Manage Memory Usage in Pandas with Large Datasets
@Dremor@lemmy.world

This should probably be posted on a programming community.

phlegmy , to technology in Efficiently Manage Memory Usage in Pandas with Large Datasets

This could really do with an explanation of wtf ‘pandas’ is, and why this is relevant.

Nomecks , to technology in Efficiently Manage Memory Usage in Pandas with Large Datasets

Is there a benefit to doing CoW with Pandas vs. offloading it to the storage? Practically all modern storage systems support CoW snaps. The pattern I’m used to (Infra, not big data) is to leverage storage APIs to offload storage operations from client systems.

sem ,
@sem@lemmy.ml

If you’re doing data processing in pandas, CoW lets you avoid a lot of redundant copies at intermediate steps. Before CoW, any data processing in pandas required careful, manual work in the code to avoid the case described in the blog post. To be honest, I can’t imagine offloading the result of every operation in the pipeline to storage…
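For anyone following along, here’s a rough sketch of what this looks like in practice. Copy-on-Write is an opt-in mode in pandas 2.x (the names `df` and `subset` are just for illustration):

```python
import pandas as pd

# Copy-on-Write is opt-in in pandas 2.x (it becomes the default in 3.0).
pd.options.mode.copy_on_write = True

df = pd.DataFrame({"a": [1, 2, 3], "b": [4.0, 5.0, 6.0]})

# Under CoW, this selection shares memory with df instead of
# eagerly copying the data.
subset = df[["a"]]

# Writing to the subset triggers a copy of just its data at that
# moment, leaving the original DataFrame untouched.
subset.loc[0, "a"] = 100

assert df.loc[0, "a"] == 1        # original unchanged
assert subset.loc[0, "a"] == 100  # only the subset was modified
```

The point being: the intermediate steps in a pipeline stay cheap because copies only happen lazily, when (and if) something is actually mutated.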

Nomecks ,

So you would be using CoW in-memory in this case?

sem ,
@sem@lemmy.ml

If I’m already using pandas to process my data in-memory, CoW can significantly improve performance. That was my point.

LunarLoony ,
@LunarLoony@lemmy.sdf.org

I’m confused by all this talk of black-and-white animals. Can we instead use a Zebra node and put it behind a TuxedoCat cluster? I’ve also heard good things about barred-knifejaw as a data warehouse.

(Genuine question: what are Pandas and Cows in this context?)
