export

The export tool exports a set of files from HDFS back to an RDBMS. The target table must already exist in the database. The input files are read and parsed into a set of records according to the user-specified delimiters.

The default operation is to transform these into a set of INSERT statements that inject the records into the database. In update mode, Sqoop generates UPDATE statements that replace existing records in the database; in call mode, Sqoop makes a stored procedure call for each record.

The tool usage is shown below.

$ sqoop export <generic-args> <export-args>
$ sqoop-export <generic-args> <export-args>

Although the generic Hadoop arguments must precede any export arguments, the export arguments can be specified in any order with respect to one another.
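A minimal invocation might look like the following; the host, database, table name, and HDFS path are placeholders, and the target table must already exist in the database:

```shell
# Export the files under /user/hive/warehouse/bar into the existing
# table "bar" in the database "foo" (all names are hypothetical).
$ sqoop export \
    --connect jdbc:mysql://db.example.com/foo \
    --table bar \
    --export-dir /user/hive/warehouse/bar
```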

Common arguments

--connect <jdbc-uri>

Specifies the JDBC connection string

--connection-manager <class-name>

Specifies the connection manager class to use

--connection-param-file <filename>

Specifies optional properties file that provides connection parameters

--driver <class-name>

Specifies the JDBC driver class to use

--hadoop-mapred-home <dir>

Overrides $HADOOP_MAPRED_HOME

--help

Prints usage instructions

--password-file

Sets the path to a file containing the authentication password

-P

Reads the password from the console

--password <password>

Specifies the authentication password

--username <username>

Specifies the authentication username

--verbose

Prints more information while working

--relaxed-isolation

Instructs Sqoop to use the read-uncommitted isolation level
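As a sketch of how the common arguments combine, the following passes credentials via a password file instead of the command line (the URI, username, and paths are placeholders):

```shell
# --password-file avoids exposing the password in the process list or
# shell history; the file should be readable only by the Sqoop user.
$ sqoop export \
    --connect jdbc:mysql://db.example.com/foo \
    --username sqoop_user \
    --password-file /user/sqoop/.db.password \
    --table bar \
    --export-dir /results/bar_data
```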

Validation arguments

--validate

Enables validation of the copied data; supports single-table copies only

--validator <class-name>

Specifies validator class to use

--validation-threshold <class-name>

Specifies validation threshold class to use

--validation-failurehandler <class-name>

Specifies validation failure handler class to use
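For example, appending --validate to an export compares the source and target row counts after the copy completes (connection details and names below are placeholders):

```shell
# Fails the job if the row counts in HDFS and the target table differ.
$ sqoop export \
    --connect jdbc:mysql://db.example.com/foo \
    --table bar \
    --export-dir /results/bar_data \
    --validate
```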

Export control arguments

--columns <col,col,col…>

Columns to export to table

--direct

Use direct export fast path

--export-dir <dir>

HDFS source path for the export

-m,--num-mappers <n>

Use n map tasks to export in parallel

--table <table-name>

Table to populate

--call <stored-proc-name>

Stored Procedure to call

--update-key <col-name>

Anchor column to use for updates. Use a comma-separated list of columns if there is more than one column

--update-mode <mode>

Specifies how updates are performed when new rows are found with non-matching keys in the database. Legal values for mode are updateonly (the default) and allowinsert

--input-null-string <null-string>

The string to be interpreted as null for string columns

--input-null-non-string <null-string>

The string to be interpreted as null for non-string columns

--staging-table <staging-table-name>

The table in which data will be staged before being inserted into the destination table

--clear-staging-table

Indicates that any data present in the staging table can be deleted

--batch

Specifies to use batch mode for underlying statement execution
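As a sketch of an upsert-style export, combining --update-key with --update-mode allowinsert updates rows whose key matches an existing row and inserts the rest (the key column "id" and other names are placeholders):

```shell
# Rows matching an existing "id" are UPDATEd; unmatched rows are
# INSERTed because of allowinsert mode.
$ sqoop export \
    --connect jdbc:mysql://db.example.com/foo \
    --table bar \
    --update-key id \
    --update-mode allowinsert \
    --export-dir /results/bar_data
```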

Input parsing arguments

--input-enclosed-by <char>

Sets a character that encloses the input

--input-escaped-by <char>

Sets an input escape character

--input-fields-terminated-by <char>

Sets an input field separator

--input-lines-terminated-by <char>

Sets an input end-of-line character

--input-optionally-enclosed-by <char>

Sets a field-enclosing character

Output line formatting arguments

--enclosed-by <char>

Sets a required field enclosing character

--escaped-by <char>

Sets an escape character

--fields-terminated-by <char>

Sets a field separator character

--lines-terminated-by <char>

Sets an end-of-line character

--mysql-delimiters

Uses MySQL's default delimiter set: fields: , lines: \n escaped-by: \ optionally-enclosed-by: '

--optionally-enclosed-by <char>

Sets an optional field enclosing character

Sqoop automatically generates the code to parse and interpret records of the files containing the data to be exported back to the database. If these files were created with non-default delimiters (the defaults are comma-separated fields with newline-separated records), you must specify the same delimiters again so that Sqoop can parse your files.

If you specify incorrect delimiters, Sqoop will fail to find enough columns per line, causing export map tasks to fail by throwing ParseExceptions.
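For instance, if the files were written with tab-separated fields, the input delimiter must be stated explicitly (connection details and names below are placeholders):

```shell
# Tell Sqoop the source files use tabs between fields, not commas.
$ sqoop export \
    --connect jdbc:mysql://db.example.com/foo \
    --table bar \
    --export-dir /results/bar_data \
    --input-fields-terminated-by '\t'
```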

Code generation arguments

--bindir <dir>

Sets the output directory for compiled objects

--class-name <name>

Specifies a name for generated class. This overrides --package-name. When combined with --jar-file, sets the input class

--jar-file <file>

Disables code generation; the provided JAR is used instead

--map-column-java <m>

Overrides the default mapping from SQL type to Java type for column <m>

--outdir <dir>

Sets the output directory for generated code

--package-name <name>

Puts auto-generated classes into the specified package

If the records to be exported were generated as the result of a previous import, then the original generated class can be used to read the data back. Specifying --jar-file and --class-name obviates the need to specify delimiters in this case.

The use of existing generated code is incompatible with --update-key; an update-mode export requires fresh code generation to perform the update, so you cannot use --jar-file and must fully specify any non-default delimiters.
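An insert-mode export that reuses the class from a previous import might look like this; the jar path and class name are placeholders for the artifacts that import produced:

```shell
# Skip code generation and reuse the class compiled during the import.
# Not valid together with --update-key (see above).
$ sqoop export \
    --connect jdbc:mysql://db.example.com/foo \
    --table bar \
    --export-dir /user/foo/bar \
    --jar-file bar.jar \
    --class-name bar
```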
