Fast conversion of a text file to arrays in C++


I've been working on a project that involves large heightmaps (3000x3000, ~60 MB). I need to split the data into several 200x200 arrays (15x15 of them) and save them separately, but this time in a format that is as fast as possible to load again. I've tried using streams (I'm not an expert at C++, so don't exclude ideas involving streams), but it's agonizingly slow.

Stuff that might help (based on what I've seen while searching for an answer): the heightmaps are supplied as text files (.asc) with numbers written like "125.123" (without the quotes). Each entry has 3 decimals no matter the number ("0.123", "100.123"). As far as I know there are no negative numbers, and the size of the heightmap is known beforehand (usually 3000x3000).

So my questions are essentially:

  1. What's the best way to do this? (Preferably without Boost or the like, but if it helps a lot, why not.)
  2. What format (for the 200x200 arrays) would allow the fastest loading time?

Any help, ideas, code or links/literature would be appreciated.

Answer

If you will be reading the file back on the same type of system (same endianness), use a binary blittable format, i.e. store a straight binary dump of each 200x200 array. Multiply by 1000 and store ints, since integer I/O is typically faster. (You did not mention the range of values, nor the required precision; are the units feet, miles, nanometers?)

