php - Approaches to load thousands of records into MySQL


I have been developing a PHP application in symfony 1 that loads thousands of records from an Excel file, evaluates them in a script, and inserts them into a MySQL database. I've watched the script struggle under the load. I'm using Doctrine 1 to insert the data inside transactions, roughly as in the sketch below.
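For reference, a minimal sketch of that insert loop; the Record model name and the $rows variable are hypothetical placeholders, not the actual code:

<?php
// One transaction wrapped around thousands of individual saves.
$conn = Doctrine_Manager::connection();
$conn->beginTransaction();
try {
    foreach ($rows as $row) {         // $rows = the evaluated Excel rows
        $record = new Record();       // hypothetical Doctrine 1 model
        $record->fromArray($row);     // map column => value pairs onto the model
        $record->save();              // each save() hydrates the object and fires events
    }
    $conn->commit();
} catch (Exception $e) {
    $conn->rollback();
    throw $e;
}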

My questions are:

Should I use the ORM or raw SQL for the insert task? Is there a performance difference between them?

Should I convert the Excel file to CSV so that it consumes less memory? (See the streaming-read sketch after these questions.)

Should I use another scripting language to load the files? The file sizes are between 10 MB and 50 MB.

Thanks.
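On the second question, the memory argument for CSV is that it can be streamed one row at a time, whereas most Excel readers parse the whole workbook into memory. A minimal sketch with a placeholder path:

<?php
// Stream a CSV row by row; only one line is held in memory at a time.
$handle = fopen('/path/to/records.csv', 'r'); // hypothetical path
if ($handle === false) {
    die('Could not open file');
}
while (($row = fgetcsv($handle)) !== false) {
    // evaluate and insert $row here
}
fclose($handle);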

Mauro, you can still create a friendly interface in your application for importing records into the database; using raw SQL is the best way to do so. Through PHP you can manipulate the system's resources: once you've got the file's path (I'd recommend using CSV), you can issue a command to the system such as:

<?php
shell_exec("mysql -uuser -ppassword -e 'your script to import the data here'");

or

<?php
shell_exec("mysqlimport -uuser -ppassword 'your criteria here'");
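To flesh out the first variant, a common raw-SQL route is LOAD DATA INFILE, which has the server bulk-load the CSV directly. This is only a sketch: the database name (mydb), table name (records), credentials, and path are all placeholders.

<?php
// Bulk-load the CSV server-side via the mysql client.
$sql = "LOAD DATA LOCAL INFILE '/path/to/records.csv' "
     . "INTO TABLE records "
     . "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' "
     . "LINES TERMINATED BY '\\n' "
     . "IGNORE 1 LINES";                // skip the header row
shell_exec('mysql -uuser -ppassword --local-infile=1 mydb -e ' . escapeshellarg($sql));

Because this skips the ORM's per-row hydration and event overhead entirely, it is typically far faster than Doctrine saves for files in the 10-50 MB range.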

I believe that's the best approach. Let me know how you sort it out!

