large data issues

I'm developing an app that uploads users' CSV files into a MySQL database with the following table structure:

 

mysql_query("create table `datasets`(
	dataset_id int auto_increment not null primary key,
	name varchar(255))");
	
mysql_query("create table `dataset_rows`(
	row_id int auto_increment not null primary key,
	dataset_id varchar(255))");

mysql_query("create table `dataset_cols`(
	col_id int auto_increment not null primary key,
	dataset_id varchar (255),
	field_name varchar (255))");
	
mysql_query("create table `data`(
	data_id int auto_increment not null primary key,
	row_id varchar(255),
	col_id varchar(255),
	data varchar (255))");

 

I'm having issues with execution times. I set the time limit to 0 so the script wouldn't time out and I could measure how long the process actually takes. When a large CSV file (>3 MB) is uploaded and a loop inserts the values into MySQL, it takes several minutes. When I then loop back through that data to display it in an HTML table, it takes far longer (>15 minutes). My questions are: how can I run the CSV import in the background after the user submits the file, and is there a suggested change to the database architecture that would speed things up?
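
For reference, this is roughly what the import loop does. It's a simplified sketch rather than my exact code: $file (the uploaded CSV's temp path) and $datasetId (the id created just before the import) are placeholders, and the real code has more validation.

// Simplified sketch of the current import loop (placeholders: $file, $datasetId).
$handle = fopen($file, 'r');

// First CSV line holds the field names -> one dataset_cols row per column.
$header = fgetcsv($handle);
$colIds = array();
foreach ($header as $fieldName) {
    mysql_query("insert into `dataset_cols` (dataset_id, field_name)
        values ('" . mysql_real_escape_string($datasetId) . "',
                '" . mysql_real_escape_string($fieldName) . "')");
    $colIds[] = mysql_insert_id();
}

// Every remaining line -> one dataset_rows row, plus one `data` row per cell.
while (($row = fgetcsv($handle)) !== false) {
    mysql_query("insert into `dataset_rows` (dataset_id)
        values ('" . mysql_real_escape_string($datasetId) . "')");
    $rowId = mysql_insert_id();

    foreach ($row as $i => $value) {
        mysql_query("insert into `data` (row_id, col_id, data)
            values ('$rowId', '" . $colIds[$i] . "',
                    '" . mysql_real_escape_string($value) . "')");
    }
}
fclose($handle);

(The display page then does roughly the reverse: it selects every `data` row for the dataset and echoes the values cell by cell into an HTML table.)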

 

Thanks for the input

