Turrable title. Trying to give this post an SEO boost? Maybe? Anyway, I had a pretty large dataset of about 50k+ records that I wanted to grab, store, and spit out as a CSV via a download link. Pretty straightforward process that goes as follows:
- hit up mysql for the data
- set the headers for the page to have a content-type of application/octet-stream
- force it to be downloaded via Content-Disposition: attachment; filename=blah (quick sketch of this below)
- echo out the records in CSV format, with a newline starting each row (e.g. so it can be opened in Excel)
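For reference, the header part of that is just a couple of header() calls before anything gets echoed, something like this (the filename here is just a placeholder):

// tell the browser to treat the response as a file download
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename=export.csv');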
This worked great until the dataset I was loading got too heavy. I was on a pretty low-memory server at the time, so I couldn't hit mysql, store all the records in an array, and then iterate over them. Here's the helper I'd been using up to that point:
// helper for running sql statements
function select($statement)
{
    $resource = mysql_query($statement);
    $results = array();

    // keep appending rows until mysql_fetch_assoc() returns false...
    while ($results[] = mysql_fetch_assoc($resource)) { }

    // ...then pop off the trailing false the loop tacked on
    array_pop($results);

    return $results;
}
As you can see, I'm looping through everything, storing it in the results array, and returning it. But since I needed a relatively low memory footprint, I skipped this helper and wrote a version that echoes each row as soon as it's fetched:
// same idea, but echo each row as CSV instead of storing it
function cselect($statement)
{
    $resource = mysql_query($statement);

    while ($row = mysql_fetch_assoc($resource)) {
        foreach ($row as $column => $value) {
            if ($column === 'language') {
                echo substr($value, 0, 2) . ',';
            }
            // bla bla bla, more conditions
        }
        echo "\n";
    }
}
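The download page itself then boils down to the two header() calls from earlier followed by a single call to the new helper (the table name here is made up):

// assumes the Content-Type / Content-Disposition headers were already sent
cselect('SELECT * FROM some_big_table');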
Seeing as php5 allows you to pass functions as parameters, you could optimize the first helper to accept one and, if it's given, run it as a callback on each row instead of storing and returning the records, but that would've been overkill for what I was trying to do.
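If you did want to go that route, it'd look roughly like this sketch (not code I actually ran):

// same helper, with an optional per-row callback; if one is passed, each row
// is handed to it as it's fetched instead of being buffered (rough sketch)
function select($statement, $callback = null)
{
    $resource = mysql_query($statement);

    if ($callback === null) {
        // old behaviour: buffer every row and return the lot
        $results = array();
        while ($row = mysql_fetch_assoc($resource)) {
            $results[] = $row;
        }
        return $results;
    }

    // streaming behaviour: hand each row off as soon as it arrives
    while ($row = mysql_fetch_assoc($resource)) {
        call_user_func($callback, $row);
    }
}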
So, if you're looking for a memory-efficient way to grab copious amounts of data, don't let php store it; use it right away.