Now export data from the Hive table departments_hive01 into the MySQL table departments_hive02. While exporting, please note the following: wherever there is an empty string, it should be loaded as a NULL value in MySQL, and wherever an int field has the value -999, it should be loaded as a NULL value.

Answer: Solution: Step 1: Create...
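A minimal sketch of the export command, assuming the Hive warehouse directory /user/hive/warehouse/departments_hive01 and Hive's default field delimiter \001 (both assumptions, not stated in the question); the two --input-null-* flags do the NULL substitution:

# empty string -> NULL for string columns; -999 -> NULL for numeric columns
sqoop export --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba --password cloudera \
  --table departments_hive02 \
  --export-dir /user/hive/warehouse/departments_hive01 \
  --input-fields-terminated-by '\001' \
  --input-null-string "" \
  --input-null-non-string "-999"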

Now import the data from the following directory into the departments_export table: /user/cloudera/departments_new

Answer: Solution: Step 1: Log in to the MySQL db. mysql --user=retail_dba --password=cloudera show databases; use retail_db; show tables; Step 2: Create a table as given in the problem statement. CREATE TABLE departments_export (department_id int(11), department_name varchar(45), created_date TIMESTAMP DEFAULT NOW());...
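A minimal sketch of the Sqoop export for this step, assuming the files under /user/cloudera/departments_new are comma-delimited (an assumption; the delimiter is not stated here):

# push the HDFS directory into the MySQL table departments_export
sqoop export --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba --password cloudera \
  --table departments_export \
  --export-dir /user/cloudera/departments_new \
  --input-fields-terminated-by ','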

Problem Scenario 3: You have been given a MySQL DB with the following details. user=retail_dba password=cloudera database=retail_db table=retail_db.categories jdbc URL = jdbc:mysql://quickstart:3306/retail_db Please accomplish the following activities.

Answer: Solution: Step 1: Import a single table (subset of data). Note: the quote character used here is the backtick, on the same key as ~. sqoop import --connect jdbc:mysql://quickstart:3306/retail_db...
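A minimal sketch of the subset import; the --where filter and target directory below are illustrative placeholders, since the actual filter sits in the truncated answer:

# import only the rows matching the WHERE clause, with a single mapper
sqoop import --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba --password cloudera \
  --table categories \
  --where "category_id between 1 and 22" \
  --target-dir /user/cloudera/categories_subset \
  --num-mappers 1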

Also make sure you use the order_id column for Sqoop to use for boundary conditions.

Answer: Solutions: Step 1: Clean the HDFS file system: if these directories exist, remove them. hadoop fs -rm -R departments hadoop fs -rm -R categories hadoop fs -rm -R products hadoop fs -rm -R orders hadoop fs...
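A minimal sketch of an import that hands Sqoop the boundary condition on order_id explicitly; the table and target directory are assumptions:

# --boundary-query supplies the min/max Sqoop uses to compute input splits
sqoop import --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba --password cloudera \
  --table orders \
  --split-by order_id \
  --boundary-query "select min(order_id), max(order_id) from orders" \
  --target-dir /user/cloudera/orders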

Also make sure you have imported only two columns from the table, which are department_id and department_name.

Answer: Solutions: Step 1: Clean the HDFS file system: if these directories exist, remove them. hadoop fs -rm -R departments hadoop fs -rm -R categories hadoop fs -rm -R products hadoop fs -rm -R orders hadoop...
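A minimal sketch of a two-column import via --columns; the target directory is an assumption:

# restrict the import to the two named columns
sqoop import --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba --password cloudera \
  --table departments \
  --columns "department_id,department_name" \
  --target-dir /user/cloudera/departments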

Also make sure your result fields are terminated by '|' and lines are terminated by '\n'.

Answer: Solutions: Step 1: Clean the HDFS file system: if these directories exist, remove them. hadoop fs -rm -R departments hadoop fs -rm -R categories hadoop fs -rm -R products hadoop fs -rm -R orders hadoop...
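A minimal sketch showing the two delimiter flags; the table and target directory are assumptions:

# '|' between fields, newline between records
sqoop import --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba --password cloudera \
  --table departments \
  --target-dir /user/cloudera/departments \
  --fields-terminated-by '|' \
  --lines-terminated-by '\n'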

Now import data from the MySQL table departments_hive01 into this Hive table. Please make sure the data is visible using the Hive command below. Also, while importing, if a NULL value is found for the department_name column, replace it with "" (empty string), and for the id column replace it with -999. select * from departments_hive;

Answer: ...
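A minimal sketch of the Hive import with the NULL substitutions; the Hive table name departments_hive is taken from the query in the question:

# NULL -> "" for string columns, NULL -> -999 for numeric columns
sqoop import --connect jdbc:mysql://quickstart:3306/retail_db \
  --username retail_dba --password cloudera \
  --table departments_hive01 \
  --hive-import --hive-table departments_hive \
  --null-string "" \
  --null-non-string "-999"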
