
In general, an exported file can be imported either to the same database it came from or to any other database; the XML format is an exception to this and a workaround is given in the XML section later in this chapter. Also, a file generated from an older phpMyAdmin version should have no problem being imported by the current version, but the difference between the MySQL version at the time of export and the one at the time of import might play a bigger role regarding compatibility. It’s difficult to evaluate how future MySQL releases will change the language’s syntax, which could result in import challenges.

The import feature can be accessed from several panels:

  • The Import menu available from the homepage, the Database view, or the Table view
  • The Import files menu offered inside the Query window

An import file may contain the DELIMITER keyword. This enables phpMyAdmin to mimic the mysql command-line interpreter. The DELIMITER separator is used to delineate the part of the file containing a stored procedure, as these procedures can themselves contain semicolons.
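
As an illustration, an import file that defines a stored procedure might look like the following sketch (the procedure name and body are invented for this example):

DELIMITER //
-- the // delimiter lets the semicolon inside the procedure body pass through untouched
CREATE PROCEDURE count_books()
BEGIN
  SELECT COUNT(*) FROM book;
END //
DELIMITER ;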

The default values for the Import interface are defined in $cfg[‘Import’].
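
For instance, the following config.inc.php lines adjust some of these defaults (the values shown are examples; consult the phpMyAdmin documentation for the sub-keys supported by your version):

// config.inc.php (example values)
$cfg['Import']['format'] = 'sql';           // format pre-selected on the Import page
$cfg['Import']['charset'] = 'utf8';         // character set assumed for import files
$cfg['Import']['allow_interrupt'] = true;   // allow partial imports near the time limit
$cfg['Import']['skip_queries'] = 0;         // number of queries to skip from the start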

Before examining the actual import dialog, let’s discuss some limits issues.

Limits for the transfer

When we import, the source file is usually on our client machine and therefore must travel to the server via HTTP. This transfer takes time and uses resources that may be limited in the web server’s PHP configuration.

Instead of using HTTP, we can upload our file to the server by using a protocol such as FTP, as described in the Reading files from a web server upload directory section. This method circumvents the web server’s PHP upload limits.

Time limits

First, let’s consider the time limit. In config.inc.php, the $cfg[‘ExecTimeLimit’] configuration directive assigns, by default, a maximum execution time of 300 seconds (five minutes) for any phpMyAdmin script, including the scripts that process data after the file has been uploaded. A value of 0 removes the limit and, in theory, gives us infinite time to complete the import operation. If the PHP server is running in safe mode, modifying $cfg[‘ExecTimeLimit’] will have no effect. This is because the limits set in php.ini or in the user-related web server configuration file (such as .htaccess or the virtual host configuration files) take precedence over this parameter.
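
For example, to allow ten minutes per script, or to remove the limit entirely, config.inc.php could contain one of these lines (the values are illustrative):

// config.inc.php
$cfg['ExecTimeLimit'] = 600;   // maximum execution time in seconds
// or
$cfg['ExecTimeLimit'] = 0;     // no limit (use with caution)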

Of course, the time it effectively takes depends on two key factors:

  • Web server load
  • MySQL server load

The time taken by the file as it travels between the client and the server does not count as execution time, because the PHP script only starts to execute after the file has been received on the server. Therefore, the $cfg[‘ExecTimeLimit’] parameter has an impact only on the time used to process data (such as decompression or sending it to the MySQL server).

Other limits

The system administrator can use the php.ini file or the web server’s virtual host configuration file to control uploads on the server.

The upload_max_filesize parameter specifies the maximum size of a file that can be uploaded via HTTP. This one is obvious, but another, less obvious parameter is post_max_size. As HTTP uploading is done via the POST method, this parameter may limit our transfers. For more details about the POST method, please refer to http://en.wikipedia.org/wiki/Http#Request_methods.

The memory_limit parameter is provided to prevent web server child processes from grabbing too much of the server’s memory—phpMyAdmin runs inside a child process. Thus, the handling of normal file uploads, especially compressed dumps, can be compromised by giving this parameter a small value. Here, no preferred value can be recommended; the value depends on the size of uploaded data we want to handle and on the size of the physical memory. The memory limit can also be tuned via the $cfg[‘MemoryLimit’] parameter in config.inc.php, as seen in Chapter 6, Exporting Structure and Data (Backup).

Finally, file uploads must be allowed by setting file_uploads to On; otherwise, phpMyAdmin won’t even show the Location of the text file dialog. It would be useless to display this dialog, as the connection would be refused later by the PHP component of the web server.
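
As a sketch, a php.ini tuned for importing dumps of up to roughly a hundred megabytes might contain directives such as the following (the values are examples only and must be adapted to the server’s memory and disk resources):

; php.ini (example values)
file_uploads        = On      ; allow HTTP file uploads at all
upload_max_filesize = 128M    ; largest single file accepted for upload
post_max_size       = 128M    ; must be at least as large as upload_max_filesize
memory_limit        = 256M    ; memory available to each PHP child process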

Handling big export files

If the file is too big, there are ways in which we can resolve the situation. If the original data is still accessible via phpMyAdmin, we could use phpMyAdmin to generate smaller CSV export files, choosing the Dump n rows starting at record # n dialog. If this were not possible, we could use a spreadsheet program or a text editor to split the file into smaller sections. Another possibility is to use the upload directory mechanism, which accesses the directory defined in $cfg[‘UploadDir’].
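
A minimal sketch for the upload directory mechanism, assuming a directory named upload under phpMyAdmin’s main directory:

// config.inc.php (example path)
$cfg['UploadDir'] = './upload';   // files placed here (via FTP, for example) become selectable on the Import page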

In recent phpMyAdmin versions, the Partial import feature can also solve this file size problem. By selecting the Allow interrupt… checkbox, the import process will interrupt itself if it detects that it’s close to the time limit. We can also specify a number of queries to skip from the start, in case we successfully import a number of rows and wish to continue from that point.

Uploading into a temporary directory

On a server, a PHP security feature called open_basedir (which limits the files that can be opened by PHP to the specified directory tree) can impede the upload mechanism. In this case, or if uploads are problematic for any other reason, the $cfg[‘TempDir’] parameter can be set with the value of a temporary directory. This is probably a subdirectory of phpMyAdmin’s main directory, into which the web server is allowed to put the uploaded file.
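
A minimal sketch, assuming a tmp subdirectory that the web server is allowed to write to:

// config.inc.php (example path)
$cfg['TempDir'] = './tmp';   // uploaded files are stored here before being processed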

Importing SQL files

Any file containing MySQL statements can be imported via this mechanism. This format is more commonly used for backup/restore purposes. The relevant dialog is available in the Database view or the Table view, via the Import subpage, or in the Query window.

There is no relation between the currently selected table (here, author) and the actual contents of the SQL file that will be imported. All of the contents of the SQL file will be imported, and it is those contents that determine which tables or databases are affected. However, if the imported file does not contain any SQL statements to select a database, all statements in the imported file will be executed on the currently selected database.

Let’s try an import exercise. First, we make sure that we have a current SQL export of the book table (as explained in Chapter 6, Exporting Structure and Data (Backup)). This export file must contain the structure and the data. Then we drop the book table—yes, really! We could also simply rename it. (See Chapter 9, Performing Table and Database Operations, for the procedure.)
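
For reference, a structure-and-data export of the book table would look roughly like the following (the column definitions are illustrative only; the actual file depends on the table created earlier and on the export options chosen):

CREATE TABLE `book` (
  `isbn` varchar(25) NOT NULL,
  `title` varchar(100) NOT NULL,
  `author_id` int(11) NOT NULL,
  PRIMARY KEY (`isbn`)
);

INSERT INTO `book` VALUES ('1-234567-89-0', 'A sample book', 1);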

Now it’s time to import the file back. We should be on the Import subpage, where we can see the Location of the text file dialog. We just have to hit the Browse button and choose our file.

phpMyAdmin is able to detect which compression method (if any) has been applied to the file. Depending on the phpMyAdmin version and the extensions available in the PHP component of the web server, the formats that the program can decompress vary.

To import successfully, phpMyAdmin must be informed of the character set of the file to be imported. The default value is utf8; however, if we know that the import file was created with another character set, we should specify it here.

A SQL compatibility mode selector is available at import time. This mode should be adjusted to match the actual data that we are about to import, according to the type of the server where the data was previously exported.

Another option, Do not use AUTO_INCREMENT for zero values, is selected by default. If we have a value of zero in a primary key and we want it to stay zero instead of being auto-incremented, we should use this option.
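
This option corresponds to MySQL’s NO_AUTO_VALUE_ON_ZERO SQL mode; its effect can be reproduced manually with a statement like the following sketch:

SET SESSION sql_mode = 'NO_AUTO_VALUE_ON_ZERO';
-- while this mode is active, inserting 0 into an AUTO_INCREMENT column stores 0
-- instead of generating the next sequence value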

To start the import, we click on Go. The import procedure continues and we receive a message: Import has been successfully finished, 2 queries executed. We can browse our newly-created tables to confirm the success of the import operation.

The file could be imported for testing in a different database or even on another MySQL server.

Importing CSV files

In this section, we will examine how to import CSV files. There are two possible methods—CSV and CSV using LOAD DATA. The first method is implemented internally by phpMyAdmin and is the recommended one for its simplicity. With the second method, phpMyAdmin receives the file to be loaded and passes it to MySQL. In theory, this method should be faster. However, it has more requirements due to MySQL itself (see the Requirements subsection of the CSV using LOAD DATA section).
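
With the second method, phpMyAdmin ends up sending MySQL a statement similar to this sketch (the file path and the field and line options depend on the choices made in the dialog):

LOAD DATA LOCAL INFILE '/tmp/author.csv'
  INTO TABLE `author`
  FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '\\'
  LINES TERMINATED BY '\n';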

Differences between SQL and CSV formats

There are some differences between the SQL and CSV formats. The CSV file format contains data only, so we must already have an existing table in place. This table does not need to have the same structure as the original table (from which the data comes); the Column names dialog enables us to choose which columns are affected in the target table.

Because the table must exist prior to the import, the CSV import dialog is available only from the Import subpage in the Table view, and not in the Database view.

Exporting a test file

Before trying an import, let’s generate an author.csv export file from the author table. We use the default values in the CSV export options. We can then Empty the author table—we should avoid dropping this table because we still need the table structure.

CSV

From the author table menu, we select Import and then CSV.

We can influence the behavior of the import in a number of ways. By default, importing does not modify existing data (based on primary or unique keys). However, the Replace table data with file option instructs phpMyAdmin to use the REPLACE statement instead of the INSERT statement, so that existing rows are replaced with the imported data.

Using Ignore duplicate rows, INSERT IGNORE statements are generated. These cause MySQL to ignore any duplicate key problems during insertion. A duplicate key from the import file does not replace existing data, and the procedure continues for the next line of CSV data.
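
To illustrate the difference between these two options, here is roughly what gets generated for one line of the sample CSV data under each of them:

-- Replace table data with file: a row with the same key overwrites the existing row
REPLACE INTO `author` VALUES ('1', 'John Smith', '+01 445 789-1234');

-- Ignore duplicate rows: a duplicate key is skipped and the import continues
INSERT IGNORE INTO `author` VALUES ('1', 'John Smith', '+01 445 789-1234');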

We can also specify the character that terminates each field, the character that encloses data, and the character that escapes the enclosing character. Usually, this escaping character is the backslash (\). For example, with a double quote as the enclosing character, a data field that itself contains a double quote must be expressed as "some data \" some other data".

For Lines terminated by, recent versions of phpMyAdmin offer the auto choice, which should be tried first as it detects the end-of-line character automatically. We can also specify manually which characters terminate the lines. The usual choice is \n for UNIX-based systems, \r\n for DOS or Windows systems, and \r for Mac-based systems (up to Mac OS 9). If in doubt, we can use a hexadecimal file editor on our client computer (not part of phpMyAdmin) to examine the exact codes.

By default, phpMyAdmin expects a CSV file with the same number of fields and the same field order as the target table. However, this can be changed by entering a comma-separated list of column names in Column names, respecting the source file format. For example, let’s say our source file contains only the author ID and the author name information:

"1","John Smith"
"2","Maria Sunshine"


We’d have to put id, name in Column names in order to match the source file.

When we click on Go, the import is executed and we receive a confirmation. We might also see the actual INSERT queries generated if the total size of the file is not too big.

Import has been successfully finished, 2 queries executed.
INSERT INTO `author` VALUES ('1', 'John Smith', '+01 445 789-1234')
# 1 row(s) affected.

INSERT INTO `author` VALUES ('2', 'Maria Sunshine', '333-3333')
# 1 row(s) affected.
