Sqoop: Error during export, job failed
We are trying to export data from HDFS to MySQL using Sqoop and are running into the following issue.
Sample data:
{(THB0000000018401231),(18401231),(+00000cbsadm),(000000000000),(00),(0000000018401231),(200001),(+013050120AA0+0000000000000000001),(+000000000000000000+00DEPOSIT),(+000000000000000000+000000),(2001+00000+000000000000000000+0000000000000000009257),(201305001#2001#20AA),(0),(ATSWC),(000000K03395),(00AD1#000141KRUNG),(TSDAAI),(~ZCM#2~ZIR#0),(K03395),(201474001),(91141),(312108004)}
{(THB0000000018401231),(18401231),(+00000cbsadm),(000000000000),(00),(0000000018401231),(200001),(+013050120AA0+0000000000000000001),(+000000000000000000+00DEPOSIT),(+000000000000000000+000000),(2001+00000+000000000000000000+0000000000000000009257),(201305001#2001#20AA),(0),(ATSWC),(000000K03395),(00AD1#000141KRUNG),(TSDAAI),(~ZCM#2~ZIR#0),(K03395),(201474001),(91141),(312108004)}
First, we tokenized the data using Pig to make it comma-separated, as shown above. We then replicated the record for testing purposes.
We used the following Sqoop program to export the data from HDFS to MySQL; the schema is defined in table b1:
import org.apache.sqoop.Sqoop;

// Export data from HDFS to MySQL using Sqoop
public class App {
    public static void main(String[] args) {
        String[] str = { "export", "--connect", "jdbc:mysql://-------/test",
                "--table", "b1", "--username", "root", "--password", "******",
                "--export-dir", "hdfs://-----/user/hdfs/token2/",
                "--input-fields-terminated-by", ",",
                "--input-lines-terminated-by", "\n"
        };
        Sqoop.runTool(str);
    }
}
Error after program execution:
11068 [main] ERROR org.apache.sqoop.tool.ExportTool - Error during export:
Export job failed!
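For reference, Sqoop.runTool returns the tool's exit status as an int, so the driver can report it rather than fail silently; a minimal variant of the program above that prints the status and passes Sqoop's --verbose flag for extra log output (same masked connection values) might look like this:

import org.apache.sqoop.Sqoop;

// Variant of the driver above: print the exit status and request verbose
// Sqoop logging. The connection values are the same masked placeholders.
public class AppVerbose {
    public static void main(String[] args) {
        String[] str = { "export", "--verbose",
                "--connect", "jdbc:mysql://-------/test",
                "--table", "b1", "--username", "root", "--password", "******",
                "--export-dir", "hdfs://-----/user/hdfs/token2/",
                "--input-fields-terminated-by", ",",
                "--input-lines-terminated-by", "\n"
        };
        int status = Sqoop.runTool(str); // 0 on success, non-zero on failure
        System.out.println("Sqoop export finished with status " + status);
    }
}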
Afterwards, we checked the MySQL table and only the first record is there. Even when we try to load multiple records, only one record gets exported to the table.
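A simple row count against the target table is enough to verify this; a minimal JDBC sketch of such a check (the host and password below are placeholders, since the real values are masked above):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Count the rows that actually landed in the target table b1.
// <mysql-host> and <password> stand in for the masked values.
public class CountB1 {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql://<mysql-host>/test";
        try (Connection conn = DriverManager.getConnection(url, "root", "<password>");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM b1")) {
            rs.next();
            System.out.println("Rows in b1: " + rs.getInt(1));
        }
    }
}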
Your help is highly appreciated.