LAMP: permissions questions

LotsOfZeros

I am responsible for creating a solution for a programmatic import of a CSV file into MySQL. I'm going to use fgetcsv for this. The CSV will be sent to my server via FTP on a schedule, so I'll need to cron a script to pull it into the database.

2 questions:
Will I need to run this script in the context of the root user account to pull the file from the FTP inbound directory, parse it, rename it, and move it to another directory (archive) or can I just use the same account I use for the MySQL db?

Does anyone have any code sample they might care to share with me?
 


I doubt the ftp user and your account will be the same. You could add your account to the FTP group; generally, running things as root isn't a good idea.

Howto: Linux Add User To Group should help. You'll need to find out what group ftp is in.

I think you have me on skype, feel free to ask, or PM me.
 
short: if the mysql user has access to your FTP source directory and the file destination directory, your idea of doing it as the mysql user might work.

long:
create a dedicated "cronimport" user and group for the import job
add the mysql user to the cronimport group
chown ftpuser:cronimport /ftp_folder
chown cronimport:cronimport /cron_archive_folder
chmod g+rwx /ftp_folder # the group needs write access on the directory to rename/move the file out
sudo -u cronimport crontab -e # add the cron job
mysql -e "GRANT {import permissions} ON import_table.* TO 'cronimport'@'localhost';"
 
I use the following all the time to pull in data from CSV files that I need to process and import into MySQL. Just pass it the path to your file (it will recognize a .gz version automatically) and an array of keys that your CSV fields should be mapped to. You'll need to modify it slightly if you don't have a header row (comment out the first fgets call).

PHP:
    /**
     * Load data from the given file. If the file exists with a .gz extension, the
     * data will be loaded from the .gz file.
     *
     * @param string $file The path to the data file.
     * @param array  $keys A list of keys to use for the data fields.
     * @return array
     */
    protected function _parseDataFile($file, $keys)
    {
        if (file_exists($file . '.gz')) {
            $file = $file . '.gz';
        }

        echo "Loading data from $file...\n";
        $fh = fopen("compress.zlib://$file", "r");
        if (false === $fh) {
            throw new Exception("Could not open $file for reading!");
        }

        fgets($fh); // Strip column headers from CSV data.

        $data = array();
        while (($row = fgetcsv($fh)) !== false) {

            // Trim whitespace and replace any empty values with NULL so the
            // NULLs survive into the mapped row below.
            foreach ($row as &$value) {
                $value = trim($value);
                if ('' === $value) {
                    $value = null;
                }
            }
            unset($value);

            $tmp = array();
            foreach ($keys as $idx => $key) {
                $tmp[$key] = isset($row[$idx]) ? $row[$idx] : null;
            }

            $data[] = $tmp;

        }

        return $data;
    }
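
A usage sketch for the above, called from inside the same class since the method is protected; the key list, table, columns, and credentials here are placeholders:

PHP:
    $keys = array('order_id', 'customer', 'address', 'total');
    $rows = $this->_parseDataFile('/cron_archive_folder/orders.csv', $keys);

    $pdo  = new PDO('mysql:host=localhost;dbname=import_db', 'cronimport', 'secret');
    $stmt = $pdo->prepare(
        'INSERT INTO orders (order_id, customer, address, total) VALUES (?, ?, ?, ?)'
    );

    foreach ($rows as $row) {
        // Each $row is keyed in the same order as $keys, so array_values()
        // lines up with the statement placeholders.
        $stmt->execute(array_values($row));
    }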
 
If you don't have root access (or the ability to amend access to the inbound FTP directory), bear in mind that you can also use PHP's FTP functions to transfer the CSV file to the archive directory, from which you can parse it directly into your DB.
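
Something along these lines, for instance; the hostname, login, and paths are placeholders, and this assumes the FTP server runs on the same box as the importer:

PHP:
    // Pull the CSV out of the inbound FTP area without needing shell access to it.
    $conn = ftp_connect('localhost');
    if (!$conn || !ftp_login($conn, 'ftpuser', 'secret')) {
        die("FTP connect/login failed\n");
    }
    ftp_pasv($conn, true);

    $local = '/cron_archive_folder/orders.csv';
    if (!ftp_get($conn, $local, 'inbound/orders.csv', FTP_BINARY)) {
        die("Could not download orders.csv\n");
    }
    ftp_close($conn);

    // $local can now be opened with fgetcsv and loaded into MySQL as usual.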
 
I don't have access to or any control over the sending server, and it's IIS, which means Windows. It's basically a proprietary e-commerce system without an API, but they can schedule FTP to send orders to another server (mine), where I plan to process them through address verification systems before passing them on to FedEx for shipping.

Basically, I'm creating middleware to make up for lack of functionality with an existing system.