Hello,
I was trying to extract the galaxy/PSF pairs from the COSMOS_23.5_training_sample catalog to build some datasets. Please find the code below.
The problem I have is that after processing about 28,000 galaxies (before reaching 29,000), the Python script crashes with an "Out of Memory" error reported by my local SLURM system.
Is there a "purge" call that I have missed somewhere?
import time
import numpy as np
import galsim

# path_catalog, sample, and out_catalog_dir are defined earlier in the script
real_galaxy_catalog = galsim.RealGalaxyCatalog(
    dir=path_catalog, sample=sample
)
n_total = real_galaxy_catalog.nobjects  # 56062
sequence = np.arange(n_total)
np.random.shuffle(sequence)

# Loop over the catalog in random order
t0 = time.time()
for i, gal_idx in enumerate(sequence):
    if i == 0 or i % 1000 == 0:
        print(f"Process {i}/{gal_idx} at t:{time.time()-t0:.1f}sec")
    # read the HST galaxy and PSF profiles
    gal_ori = galsim.RealGalaxy(real_galaxy_catalog, index=gal_idx)
    psf_hst = real_galaxy_catalog.getPSF(gal_idx)
    # get the corresponding galsim.Image objects
    gal_ori_image = real_galaxy_catalog.getGalImage(gal_idx)
    psf_ori_image = real_galaxy_catalog.getPSFImage(gal_idx)
    # save into FITS files
    gal_ori_image.write(out_catalog_dir + "/gal_" + str(i) + ".fits")
    psf_ori_image.write(out_catalog_dir + "/psf_" + str(i) + ".fits")
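To rule out ordinary Python reference buildup, I also considered sketching the loop body with explicit deletes and a periodic forced garbage collection (using the standard-library `gc` module). Whether this would also release GalSim's internal caches is exactly my question. Here `load_images` is a hypothetical stand-in for the `getGalImage`/`getPSFImage` calls above:

```python
import gc

def load_images(idx):
    # placeholder image buffers standing in for getGalImage/getPSFImage
    return bytearray(100_000), bytearray(100_000)

def process_all(n_total, gc_every=1000):
    """Process n_total entries, dropping references and collecting periodically."""
    processed = 0
    for i in range(n_total):
        gal_image, psf_image = load_images(i)
        # ... gal_image.write(...) / psf_image.write(...) would go here ...
        del gal_image, psf_image      # drop local references right away
        if i % gc_every == 0:
            gc.collect()              # force a full collection pass periodically
        processed += 1
    return processed
```

This keeps each iteration's images collectable immediately, but if the memory is held by a cache inside `RealGalaxyCatalog` itself, it would not help.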
Thanks for your attention.
Jean_Eric