Use incremental import in sqoop to load data from Oracle (Part II)

The last post covered the first part of creating an incremental import job in sqoop. This post continues with the second part of the series.

In the last few posts, I discussed the following:
1. Install Cloudera Hadoop Cluster using Cloudera Manager
2. Configurations after CDH Installation
3. Load Data to Hive Table
4. Import Data to Hive from Oracle Database
5. Export data from Hive table to Oracle Database
6. Use Impala to query a Hive table
7. Use incremental import in sqoop to load data from Oracle (Part I)

Although we can pass many arguments on the command line every time we run the import, just as I did in the last post, a better way to manage this kind of recurring work is to create a saved sqoop job, which does the same work as shown in the picture below.
(Figure: sqoop_incremental_2)

First, let me create a sqoop job using the following command:
sqoop job \
--create student_job \
-- import \
--connect jdbc:oracle:thin:@enkx3-scan:1521:dbm2 \
--username wzhou \
--password wzhou \
--table STUDENT \
--incremental append \
--check-column student_id \
--last-value 7 \
-m 4 \
--split-by major
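
Note that even though --password is given at creation time, sqoop does not store the password in the job; it prompts for it on every --show and --exec, and it warns that passwords on the command line are insecure (both show up in the output below). As a sketch of a way to avoid the interactive prompt, assuming the password is kept in a protected file on HDFS (the path here is only an example), the job can be defined with --password-file instead of --password:

echo -n "mypassword" > .oracle_pwd
hdfs dfs -put .oracle_pwd /user/wzhou/.oracle_pwd
hdfs dfs -chmod 400 /user/wzhou/.oracle_pwd

Then replace --password wzhou with --password-file /user/wzhou/.oracle_pwd in the job definition above.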

Once the sqoop job is created, I can use the following commands to list, show, execute, or delete it.
sqoop job --list
sqoop job --show student_job
sqoop job --exec student_job
sqoop job --delete student_job
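
One thing to keep in mind: by default these job definitions live in a private HSQLDB metastore under the current user's home directory (~/.sqoop), so a job created here is only visible to this user on this host. As a sketch, assuming a shared sqoop metastore service is running on a host named metastore-host (16000 is the default metastore port), the same commands can target it with --meta-connect:

sqoop job --meta-connect jdbc:hsqldb:hsql://metastore-host:16000/sqoop --list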

Here are the execution results:
[wzhou@vmhost1 ~]$ sqoop job \
> --create student_job \
> -- import \
> --connect jdbc:oracle:thin:@enkx3-scan:1521:dbm2 \
> --username wzhou \
> --password wzhou \
> --table STUDENT \
> --incremental append \
> --check-column student_id \
> --last-value 7 \
> -m 4 \
> --split-by major

Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/09/25 13:57:34 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.4.3
15/09/25 13:57:35 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.

[wzhou@vmhost1 ~]$ sqoop job --list
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/09/25 13:57:47 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.4.3
Available jobs:
student_job

[wzhou@vmhost1 ~]$ sqoop job --show student_job
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/09/25 13:58:07 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.4.3
Enter password:
Job: student_job
Tool: import
Options:
----------------------------
verbose = false
incremental.last.value = 7
db.connect.string = jdbc:oracle:thin:@enkx3-scan:1521:dbm2
codegen.output.delimiters.escape = 0
codegen.output.delimiters.enclose.required = false
codegen.input.delimiters.field = 0
hbase.create.table = false
db.require.password = true
hdfs.append.dir = true
db.table = STUDENT
codegen.input.delimiters.escape = 0
import.fetch.size = null
accumulo.create.table = false
codegen.input.delimiters.enclose.required = false
db.username = wzhou
codegen.output.delimiters.record = 10
import.max.inline.lob.size = 16777216
hbase.bulk.load.enabled = false
hcatalog.create.table = false
db.clear.staging.table = false
incremental.col = student_id
codegen.input.delimiters.record = 0
enable.compression = false
hive.overwrite.table = false
hive.import = false
codegen.input.delimiters.enclose = 0
accumulo.batch.size = 10240000
hive.drop.delims = false
codegen.output.delimiters.enclose = 0
hdfs.delete-target.dir = false
codegen.output.dir = .
codegen.auto.compile.dir = true
relaxed.isolation = false
mapreduce.num.mappers = 4
accumulo.max.latency = 5000
import.direct.split.size = 0
codegen.output.delimiters.field = 44
export.new.update = UpdateOnly
incremental.mode = AppendRows
hdfs.file.format = TextFile
codegen.compile.dir = /tmp/sqoop-wzhou/compile/5c52e95144738b92047bd2e1e37d1b1f
direct.import = false
db.split.column = major
hive.fail.table.exists = false
db.batch = false

[wzhou@vmhost1 ~]$ sqoop job --exec student_job
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/09/25 14:05:19 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.4.3
Enter password:
15/09/25 14:05:22 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
15/09/25 14:05:22 INFO manager.SqlManager: Using default fetchSize of 1000
15/09/25 14:05:22 INFO tool.CodeGenTool: Beginning code generation
15/09/25 14:05:23 INFO manager.OracleManager: Time zone has been set to GMT
15/09/25 14:05:23 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM STUDENT t WHERE 1=0
15/09/25 14:05:23 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-wzhou/compile/67a12da72d209152325de07df2841b68/STUDENT.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
15/09/25 14:05:25 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-wzhou/compile/67a12da72d209152325de07df2841b68/STUDENT.jar
15/09/25 14:05:26 INFO manager.OracleManager: Time zone has been set to GMT
15/09/25 14:05:26 INFO tool.ImportTool: Maximal id query for free form incremental import: SELECT MAX(student_id) FROM STUDENT
15/09/25 14:05:26 INFO tool.ImportTool: Incremental import based on column student_id
15/09/25 14:05:26 INFO tool.ImportTool: No new rows detected since last import.

As expected, no new rows were imported: the saved last value (7) already equals MAX(student_id) in the source table. So let me insert a few more rows and rerun the job.
Run the following SQL statements on the Oracle database to insert the rows:
insert into wzhou.student values ( 8, 'student8', 'history');
insert into wzhou.student values ( 9, 'student9', 'math');
insert into wzhou.student values ( 10, 'student10', 'computer' );
commit;

Rerun the job and it imports the three new rows. Notice that I did not have to update --last-value by hand: a saved sqoop job records the value in its metastore and updates it automatically after each successful run.
[wzhou@vmhost1 ~]$ sqoop job --exec student_job
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/09/25 14:08:49 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.4.3
Enter password:
15/09/25 14:08:53 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
15/09/25 14:08:53 INFO manager.SqlManager: Using default fetchSize of 1000
15/09/25 14:08:53 INFO tool.CodeGenTool: Beginning code generation
15/09/25 14:08:53 INFO manager.OracleManager: Time zone has been set to GMT
15/09/25 14:08:53 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM STUDENT t WHERE 1=0
15/09/25 14:08:53 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-wzhou/compile/b4ec871b912ecfebe6b03a510f000581/STUDENT.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
15/09/25 14:08:55 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-wzhou/compile/b4ec871b912ecfebe6b03a510f000581/STUDENT.jar
15/09/25 14:08:56 INFO manager.OracleManager: Time zone has been set to GMT
15/09/25 14:08:56 INFO tool.ImportTool: Maximal id query for free form incremental import: SELECT MAX(student_id) FROM STUDENT
15/09/25 14:08:56 INFO tool.ImportTool: Incremental import based on column student_id
15/09/25 14:08:56 INFO tool.ImportTool: Lower bound value: 7
15/09/25 14:08:56 INFO tool.ImportTool: Upper bound value: 10
15/09/25 14:08:56 INFO mapreduce.ImportJobBase: Beginning import of STUDENT
15/09/25 14:08:56 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
15/09/25 14:08:56 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
15/09/25 14:08:56 INFO client.RMProxy: Connecting to ResourceManager at vmhost1.local/192.168.56.71:8032
15/09/25 14:08:59 INFO db.DBInputFormat: Using read commited transaction isolation
15/09/25 14:08:59 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(major), MAX(major) FROM STUDENT WHERE ( student_id > 7 AND student_id <= 10 )
. . . . . .

After the MapReduce job completes, verify the result from the Hive external table:

hive> use test_oracle;
OK
Time taken: 1.548 seconds

hive> select * from student_ext;
OK
2 student2 computer
4 student4 accounting
1 student1 math
3 student3 math
5 student3 computer
7 student5 computer
6 student4 math
10 student10 computer
8 student8 history
9 student9 math

Time taken: 0.694 seconds, Fetched: 10 row(s)

We can see the three new rows for student8, student9 and student10.
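
Because the saved job tracks the last imported value itself, it is straightforward to run this import on a schedule. A sketch of a crontab entry, assuming the password is supplied through --password-file as mentioned earlier so the run does not block on a prompt (the log path is just an example):

0 1 * * * sqoop job --exec student_job >> /home/wzhou/logs/student_job.log 2>&1

This would append any new STUDENT rows to HDFS at 1 AM every night.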