asp.net - The best way to store millions of ~100 KB (average) records


We are in a situation where we need to store millions of records every day.


Data structure model:

  • id
  • date
  • title
  • ...
  • data [raw text]

Our [raw text] is different each time, ranging from ~30 KB to 300 KB, and on average it's about 100 KB. We never need to search within the [raw text]; maybe once a month some of the data needs to be accessed by its id.

Right now we are storing all of it (attributes and data) in MongoDB because of MongoDB's great insert speed and performance. But our database size is growing rapidly, it's 85 GB now, and in the next few days this will become a problem for us.

So here is the question: how should we implement this?
Is it worth changing the database and software structure to store the data [raw text] in the file system (/datafiles/x/y/z/id.txt), roughly as sketched below?
Would this change have a significant impact on system performance?
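A minimal sketch of that kind of directory sharding, assuming a numeric id and the /datafiles root from the question (Python here purely for illustration; the same idea carries over to an ASP.NET app):

```python
import os

DATA_ROOT = "/datafiles"  # root path from the question

def record_path(record_id: int) -> str:
    """Map a numeric id onto /datafiles/x/y/z/<id>.txt.

    The id is split into buckets of 1000 so that no single directory
    ends up holding millions of files.
    """
    x = record_id // 1_000_000_000        # top-level bucket
    y = (record_id // 1_000_000) % 1_000  # second-level bucket
    z = (record_id // 1_000) % 1_000      # leaf bucket, <= 1000 files each
    return os.path.join(DATA_ROOT, str(x), str(y), str(z), f"{record_id}.txt")

def write_raw_text(record_id: int, raw_text: str) -> None:
    """Write one record's raw text to its sharded location."""
    path = record_path(record_id)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w", encoding="utf-8") as f:
        f.write(raw_text)
```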

If you're concerned about storage, why not compress the text data? Decent text compression should get you around 10:1.
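Whether 10:1 actually holds for your data is easy to check on a sample of real records; a quick sketch (zlib here as a stand-in for whatever compressor you'd actually use):

```python
import zlib

def compression_ratio(raw_text: str) -> float:
    """Return the raw/compressed size ratio for one record's text."""
    raw = raw_text.encode("utf-8")
    compressed = zlib.compress(raw, 6)  # moderate compression level
    return len(raw) / len(compressed)

# Run this over a few hundred real records and look at the average
# before committing to the 10:1 estimate.
```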

Personally, I'd take the file-based approach, because it sounds like the main function here is archiving. I'd write out to a file whatever info is needed to regenerate the database record, compress it, and store it in some kind of sensible directory structure based on the key. The reason is that it then becomes easy to start on a new disk or to move sections of the data off to archival storage.
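A rough sketch of that write path, assuming each record is a dict with the fields from the question (id, date, title, ..., raw text), a hypothetical /archive root, and a date-based key for the directory layout:

```python
import gzip
import json
import os
from datetime import date

ARCHIVE_ROOT = "/archive"  # hypothetical archive root

def archive_record(record: dict) -> str:
    """Serialize what is needed to regenerate the DB record, compress it,
    and store it under a key-based (here: date-based) directory tree.

    Returns the path written, so it can be kept in the database in place
    of the raw text.
    """
    d: date = record["date"]
    dir_path = os.path.join(
        ARCHIVE_ROOT, f"{d.year:04d}", f"{d.month:02d}", f"{d.day:02d}"
    )
    os.makedirs(dir_path, exist_ok=True)
    path = os.path.join(dir_path, f"{record['id']}.json.gz")

    # Assumes the remaining fields (title, raw text, ...) are plain strings.
    payload = json.dumps({**record, "date": d.isoformat()}).encode("utf-8")
    with gzip.open(path, "wb") as f:
        f.write(payload)
    return path
```

Sharding by date (or whatever key you pick) also means a whole month of data can be moved to another disk or to archival storage without touching anything else.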

If you are collecting 10 million records each day, then even with compression that amounts to roughly 100 GB per day (10 million × ~100 KB ≈ 1 TB raw, or ~100 GB at 10:1). You might want to make a 'disk id' form part of the key, because at that rate you'd fill a 2 TB disk in about 3 weeks, and a 20 TB RAID array in about 6 months.
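As a back-of-the-envelope check on those figures (the 10 million/day rate and the 10:1 ratio are the assumptions from above):

```python
RECORDS_PER_DAY = 10_000_000
AVG_RECORD_KB = 100        # ~100 KB average raw text
COMPRESSION_RATIO = 10     # assumed 10:1

raw_gb_per_day = RECORDS_PER_DAY * AVG_RECORD_KB / 1_000_000  # ~1000 GB raw
compressed_gb_per_day = raw_gb_per_day / COMPRESSION_RATIO    # ~100 GB stored

print(f"~{compressed_gb_per_day:.0f} GB per day")
print(f"2 TB disk fills in ~{2_000 / compressed_gb_per_day:.0f} days")
print(f"20 TB array fills in ~{20_000 / compressed_gb_per_day:.0f} days")
```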

