Running out of memory on a big dataset

Hello,

I need to process a large dataset (~400k cells) on my laptop. When I get to the QC step where I try to regress out covariates with scanpy.pp.regress_out, the kernel eventually dies. There is no error message; I think it simply runs out of memory. How can I fix this? I see there is an n_jobs argument for parallelization; will that make a difference? Do I need to somehow partition my memory before setting n_jobs? I'm on a 16 GB MacBook Pro, so I'm not sure it will help much. Thanks for your advice.
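
For context, the call is essentially the following (a minimal sketch; the file path and the obs column names are placeholders for my actual data and QC covariates):

```python
import scanpy as sc

# Placeholder path; the real dataset has ~400k cells
adata = sc.read_h5ad("my_dataset.h5ad")

# The step where the kernel dies. 'total_counts' and 'pct_counts_mt'
# stand in for whatever covariates I'm actually regressing out.
sc.pp.regress_out(adata, ["total_counts", "pct_counts_mt"], n_jobs=4)
```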
