Maximum file size

If you are allergic to bug trackers, you can post here any remarks, issues and potential bugs you encounter
skarab
Posts: 12
Joined: Wed Mar 22, 2017 9:42 am

Maximum file size

Post by skarab »

Hello guys,

I have some files that exceed 8 GB and I can't open them. Is there a size limit for .bin files?

Thanks
daniel
Site Admin
Posts: 7332
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France
Contact:

Re: Maximum file size

Post by daniel »

There's no limit on the file size (at least as far as I'm aware), but there is one on the maximum number of points (I think it's 2 billion points per cloud, but you'll generally hit the machine's memory limit before that).

You can increase the virtual memory limit, but in that case you'll get horrible performance (which will already be poor anyway with so many points). At least it will let you apply some decimation in command line mode, for instance...
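For reference, a command-line decimation run might look like the sketch below. The filename is a placeholder; the flags (`-SILENT`, `-O`, `-SS`, `-C_EXPORT_FMT`, `-SAVE_CLOUDS`) are the ones documented for CloudCompare's command line mode:

```shell
# Open a cloud and spatially subsample it to one point per 1 cm,
# then save the result as a (much smaller) BIN file.
# "huge_cloud.bin" is a placeholder filename.
CloudCompare -SILENT -O huge_cloud.bin \
             -SS SPATIAL 0.01 \
             -C_EXPORT_FMT BIN \
             -SAVE_CLOUDS
```

`-SS RANDOM <count>` is an alternative if you want a fixed number of output points instead of a minimum spacing.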
Daniel, CloudCompare admin
dope
Posts: 1
Joined: Wed Mar 29, 2017 5:00 pm

Re: Maximum file size

Post by dope »

Is the limit (2 billion) still valid for 2.9 beta? Is there a way to overcome this? I tried to decimate the cloud using the command line, but the program still crashes. :/
daniel
Site Admin
Posts: 7332
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France
Contact:

Re: Maximum file size

Post by daniel »

Yep, no change on this side.

If the file is a LAS file, then you can split it before loading it with the LAS 'split' tab in the dedicated loading dialog.
If it's ASCII, you can also try to split the file into multiple clouds with the ASCII/text loading dialog (at the bottom). But in this case you'll need a lot of memory, as CC will still try to load all the points in memory (just in several clouds).
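For the ASCII case, another option is to pre-split the file on disk before CloudCompare ever sees it, so each piece can be loaded independently. A minimal sketch with the standard `split` utility (the demo generates a tiny 10-point cloud; for an 8 GB file you would use a chunk size in the tens of millions of lines):

```shell
# Generate a small demo ASCII cloud (3 columns: x y z)...
seq 1 10 | awk '{print $1, $1*2, $1*3}' > cloud.xyz

# ...and split it into chunks of 4 lines each.
# This produces three chunks: cloud_part_aa (4 lines),
# cloud_part_ab (4 lines) and cloud_part_ac (2 lines).
split -l 4 cloud.xyz cloud_part_

wc -l < cloud_part_aa   # → 4
```

Each `cloud_part_*` file is a valid ASCII cloud on its own and can be opened (or decimated in command line mode) one at a time.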
Daniel, CloudCompare admin
skarab
Posts: 12
Joined: Wed Mar 22, 2017 9:42 am

Re: Maximum file size

Post by skarab »

Thanks for the answer. In the end, it was just my files that were corrupted.

But my question raises a new problem: my files are unusually fat. For 11,000 points I have a .bin of 1.8 GB. If I convert it to .e57 and resave it as .bin, the file is only a few MB.

Do you have any idea?

Sorry for my bad English, I'm French...
daniel
Site Admin
Posts: 7332
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France
Contact:

Re: Maximum file size

Post by daniel »

Where do these files come from? (which version of CC?)
Daniel, CloudCompare admin
skarab
Posts: 12
Joined: Wed Mar 22, 2017 9:42 am

Re: Maximum file size

Post by skarab »

daniel wrote:Where do these files come from? (which version of CC?)
These files were created from .fls and saved with CC 2.8.1. The original file, with no modification and around 6 million points, is around 1.7 GB. After a few operations (section, cut, clean...) and, of course, after deleting all the temp files, my final file is bigger than the source...

I found after a few manipulations that a resample reduces the file size, but it deletes the SF. So it's not a good option for me.
daniel
Site Admin
Posts: 7332
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France
Contact:

Re: Maximum file size

Post by daniel »

The original FLS file is 1.7 GB with only 6M points? Or is it the first BIN file?

For BIN files at least, the problem is that the scan grid (the gridded structure) is kept in memory with the point cloud. And each time you split the cloud, the scan grid is duplicated. If this grid is huge, the memory usage can explode quickly.
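To get a feel for why the duplicated grids matter, here is a rough back-of-the-envelope estimate. The assumption that each grid cell holds a 4-byte point index is mine, not from the thread, but it illustrates the order of magnitude:

```python
# Rough estimate of the extra memory a scan grid adds.
# Assumption: each grid cell stores a 4-byte point index.
def scan_grid_bytes(width: int, height: int, bytes_per_cell: int = 4) -> int:
    """Memory footprint of one width x height scan grid."""
    return width * height * bytes_per_cell

# A single (hypothetical) 10,000 x 5,000 scanner grid:
one_grid = scan_grid_bytes(10_000, 5_000)
print(one_grid / 1e9, "GB")      # 0.2 GB

# Splitting the cloud into 8 pieces duplicates the grid 8 times:
print(8 * one_grid / 1e9, "GB")  # 1.6 GB
```

So even a modest grid, copied once per extracted sub-cloud, can dwarf the actual point data in both memory and the saved BIN file.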

I believe the latest versions (2.8.1 or 2.9.beta) let you delete this scan grid (it's actually only useful to compute normals). The idea is to remove it right from the start to avoid duplication (and make the BIN file much smaller).

And as far as I know, resampling should not remove the scalar fields (maybe they're only deactivated?).
Daniel, CloudCompare admin
skarab
Posts: 12
Joined: Wed Mar 22, 2017 9:42 am

Re: Maximum file size

Post by skarab »

daniel wrote: The original FLS file is 1.7 GB with only 6M points? Or is it the first BIN file?
My first .bin, made from multiple .fls files.
daniel wrote: I believe the latest versions (2.8.1 or 2.9.beta) let you delete this scan grid (it's actually only useful to compute normals). The idea is to remove it right from the start to avoid duplication (and make the BIN file much smaller).
How can I delete it?
daniel
Site Admin
Posts: 7332
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France
Contact:

Re: Maximum file size

Post by daniel »

Something like 'Edit > Scan grids > Delete'?
Daniel, CloudCompare admin
Post Reply