Now export this data from HDFS to the MySQL retail_db.departments table. During the export, make sure existing departments are only updated and no new departments are inserted.


Answer:

Step 1: Create a CSV file named updated_departments.csv with the given content.
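The question's actual rows are not reproduced here, so the sketch below uses hypothetical department_id,department_name pairs as placeholders:

```shell
# Create the CSV locally; the rows are illustrative placeholders,
# not the data given in the question.
cat > updated_departments.csv <<'EOF'
2,Fitness Updated
3,Footwear Updated
4,Apparel Updated
EOF
```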

Step 2: Now upload this file to HDFS.

Create a directory called new_data.

hdfs dfs -mkdir new_data

hdfs dfs -put updated_departments.csv new_data/

Step 3: Check whether the file is uploaded or not.

hdfs dfs -ls new_data

Step 4: Export this file to departments table using sqoop.

sqoop export --connect jdbc:mysql://quickstart:3306/retail_db \
--username retail_dba \
--password cloudera \
--table departments \
--export-dir new_data \
--batch \
-m 1 \
--update-key department_id \
--update-mode allowinsert

Step 5: Check whether the required data upsert is done or not.

mysql --user=retail_dba --password=cloudera

show databases;

use retail_db;

show tables;

select * from departments;

Step 6: Update the updated_departments.csv file.

Step 7: Overwrite the existing file in HDFS (the -f flag forces the overwrite).

hdfs dfs -put -f updated_departments.csv new_data/

Step 8: Now do the Sqoop export as per the requirement.

sqoop export --connect jdbc:mysql://quickstart:3306/retail_db \
--username retail_dba \
--password cloudera \
--table departments \
--export-dir new_data \
--batch \
-m 1 \
--update-key department_id \
--update-mode updateonly
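The difference between the two modes is what matters for this question: updateonly modifies only rows whose --update-key already exists in the table, while allowinsert also inserts unmatched rows. As a rough local simulation of that behavior (file names table.txt and export.txt are illustrative stand-ins for the departments table and the export directory; this is awk, not Sqoop):

```shell
# Existing "table": department_id,department_name per line
cat > table.txt <<'EOF'
1,Golf
2,Fitness
EOF

# Incoming export data: one matching key (2) and one new key (9)
cat > export.txt <<'EOF'
2,Fitness Updated
9,Brand New
EOF

# updateonly semantics: rows with unmatched keys are skipped
awk -F, 'NR==FNR {upd[$1]=$2; next}
         {if ($1 in upd) $2=upd[$1]; print $1","$2}' export.txt table.txt

# allowinsert (upsert) semantics: unmatched keys become new rows
awk -F, 'NR==FNR {upd[$1]=$2; next}
         {if ($1 in upd) {$2=upd[$1]; seen[$1]=1} print $1","$2}
         END {for (k in upd) if (!(k in seen)) print k","upd[k]}' export.txt table.txt
```

Under updateonly, department 9 never appears in the output, which is why that mode satisfies the requirement that no new departments be inserted.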

Step 9: Check whether the required data update is done or not.

mysql --user=retail_dba --password=cloudera

show databases;

use retail_db;

show tables;

select * from departments;
