Reading a huge ASCII text file quickly in Java: need help using MappedByteBuffer
I have a text file with thousands of lines of data like the following:
38.48,88.25 48.20,98.11 100.24,181.39 83.01,97.33 ... and the list keeps going (thousands of lines like that).
I figured out how to separate the data into usable tokens using a FileReader and a Scanner, but that method is far too slow.
I created the following delimiter: src.useDelimiter(",|\n");
and used the Scanner class's nextDouble() on each piece of data.
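For reference, this is roughly what my current (slow) approach looks like; the file name points.txt is just a placeholder:

import java.io.FileNotFoundException;
import java.io.FileReader;
import java.util.Scanner;

public class SlowScannerRead {
    public static void main(String[] args) throws FileNotFoundException {
        Scanner src = new Scanner(new FileReader("points.txt")); // placeholder file name
        // split tokens on a comma or a line break (assumes Unix-style \n line endings)
        src.useDelimiter(",|\n");
        while (src.hasNextDouble()) {
            double x = src.nextDouble(); // first value of the pair
            double y = src.nextDouble(); // second value of the pair
            // ... use x and y here
        }
        src.close();
    }
}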
I have done a lot of research, and it looks like the solution is to use a MappedByteBuffer to place the data in memory and access it there. The problem is that I don't know how to use a MappedByteBuffer to separate the data into usable tokens.
I found this site: http://javarevisited.blogspot.com/2012/01/memorymapped-file-and-io-in-java.html - it helps me map the file into memory and explains how to read the file, but it looks like the data comes back as bytes, or perhaps in binary form? The file I am trying to access is ASCII, and I need to be able to read the data as ASCII as well. Can anyone explain how to do that? Is there a way to scan the memory-mapped file in the same way I did with the Scanner in my previous FileReader method? Or is there another method that is faster? My current method takes 800x the amount of time it should.
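For context, here is a minimal sketch of the mapping step described in that kind of article, assuming Java 7+ and a placeholder file name; decoding the mapped bytes with a US-ASCII charset is one way to get the text back out:

import java.io.RandomAccessFile;
import java.nio.CharBuffer;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.charset.StandardCharsets;

public class MappedFileText {
    public static void main(String[] args) throws Exception {
        try (RandomAccessFile raf = new RandomAccessFile("points.txt", "r"); // placeholder name
             FileChannel channel = raf.getChannel()) {
            // map the whole file into memory as raw bytes
            MappedByteBuffer bytes = channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());
            // the file is plain ASCII, so US_ASCII decoding turns those bytes back into characters
            CharBuffer text = StandardCharsets.US_ASCII.decode(bytes);
            // text now exposes the whole file as a CharSequence/Readable
            System.out.println("mapped and decoded " + text.length() + " characters");
        }
    }
}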
I know I may be trying to reinvent the wheel, but this is for academic purposes, and thus I am not allowed to use external libraries.
Thank you!
Once the data is loaded into memory you can use a Scanner in the same way you did earlier, and store each row in a list like the following.
List<Pair> data = new ArrayList<Pair>();

where Pair is defined as

class Pair {
    private final double first;
    private final double second;

    public Pair(double first, double second) {
        this.first = first;
        this.second = second;
    }
    // ...
}
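A minimal sketch of how the pieces could fit together, assuming the Pair class above, Java 7+, and a placeholder file name points.txt; since CharBuffer implements Readable, a Scanner can read straight from the decoded, memory-mapped text:

import java.io.RandomAccessFile;
import java.nio.CharBuffer;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import java.util.Scanner;

public class LoadPairs {
    public static void main(String[] args) throws Exception {
        List<Pair> data = new ArrayList<Pair>();
        try (RandomAccessFile raf = new RandomAccessFile("points.txt", "r"); // placeholder name
             FileChannel channel = raf.getChannel()) {
            MappedByteBuffer bytes = channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());
            CharBuffer text = StandardCharsets.US_ASCII.decode(bytes); // ASCII bytes -> characters
            Scanner src = new Scanner(text);        // Scanner accepts any Readable
            src.useDelimiter(",|\\s+");             // commas or whitespace (handles \n and \r\n)
            while (src.hasNextDouble()) {
                double first = src.nextDouble();
                double second = src.nextDouble();
                data.add(new Pair(first, second));
            }
            src.close();
        }
        System.out.println("loaded " + data.size() + " pairs");
    }
}

Whether this beats the plain FileReader approach depends on where the time is actually going; Scanner's parsing is often the real bottleneck, so it is worth timing both versions.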