SDET- QA Automation Techie

Software Testing Blog

Apache Sqoop Commands

Sqoop commands
  1. import-all-tables
  2. list-databases
  3. list-tables
  4. create-hive-table
  5. hive-import
  6. eval
  7. export



import-all-tables: imports data from all MySQL tables into HDFS.

[cloudera@quickstart ~]$ hadoop dfsadmin -safemode leave   // run this only if the import fails because HDFS is in safe mode
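
Before forcing safe mode off, you can first check whether HDFS is actually in safe mode:

[cloudera@quickstart ~]$ hadoop dfsadmin -safemode get   // reports whether safe mode is ON or OFF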

[cloudera@quickstart ~]$ sqoop import-all-tables --connect jdbc:mysql://192.168.13.135/retail_db --username cloudera --password cloudera
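
Once the job finishes, each table lands in its own directory under the user's HDFS home directory; a quick way to confirm the import (assuming the default target directory):

[cloudera@quickstart ~]$ hadoop fs -ls /user/cloudera                                    // one directory per imported table
[cloudera@quickstart ~]$ hadoop fs -cat /user/cloudera/customers/part-m-00000 | head    // sample a few imported rows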

list-databases: lists the available databases on the MySQL server.

[cloudera@quickstart ~]$ sqoop list-databases --connect jdbc:mysql://192.168.13.135/ --username cloudera --password cloudera

list-tables: lists the available tables in the database.

[cloudera@quickstart ~]$ sqoop list-tables --connect jdbc:mysql://192.168.13.135/retail_db --username cloudera --password cloudera
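
Note that --password exposes the password on the command line and in shell history. Sqoop also accepts -P, which prompts for the password interactively; the same list-tables call would then be:

[cloudera@quickstart ~]$ sqoop list-tables --connect jdbc:mysql://192.168.13.135/retail_db --username cloudera -P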


create-hive-table: imports a table definition into Hive.

Step 1) Import the MySQL table data into HDFS.
[cloudera@quickstart ~]$ sqoop import --connect jdbc:mysql://192.168.13.135/retail_db --username cloudera --password cloudera --table customers --m 1;

Step 2) Import the MySQL table definition into Hive (create-hive-table).

[cloudera@quickstart ~]$ sqoop create-hive-table --connect jdbc:mysql://192.168.13.135/retail_db --username cloudera --password cloudera --table customers --fields-terminated-by ',' --lines-terminated-by '\n';

Step 3) Load the data from HDFS into the Hive table.

hive> load data inpath '/user/cloudera/customers' into table customers;
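
To confirm the load, query the Hive table; a simple row-count check:

hive> select count(*) from customers;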

'hive-import' option for the import command (used to reduce the above three steps to a single command)

[cloudera@quickstart ~]$ sqoop import --connect jdbc:mysql://192.168.13.135/retail_db --username cloudera --password cloudera --table customers --m 1 --hive-import;
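
With --hive-import, Sqoop creates the Hive table and loads the data in a single job. If the Hive table already exists from an earlier run, the import also supports --hive-overwrite to replace its contents, for example:

[cloudera@quickstart ~]$ sqoop import --connect jdbc:mysql://192.168.13.135/retail_db --username cloudera --password cloudera --table customers --m 1 --hive-import --hive-overwrite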

eval: evaluates a SQL statement and displays the result.

[cloudera@quickstart ~]$ sqoop eval --connect jdbc:mysql://192.168.13.135/retail_db --username cloudera --password cloudera --query "select * from customers limit 10";
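
eval is also handy as a quick sanity check before a full import, for example verifying the row count of the source table:

[cloudera@quickstart ~]$ sqoop eval --connect jdbc:mysql://192.168.13.135/retail_db --username cloudera --password cloudera --query "select count(*) from customers"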

export: exports data from HDFS to MySQL. Two modes are supported:

  • insert mode
  • update mode

mysql> create database hr;   // create a new database in MySQL
mysql> use hr;
mysql> create table employees(name varchar(30),email varchar(40));  // create the target table
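
Export reads from an HDFS directory or file, so data must already exist at the --export-dir path before the job runs. A minimal sketch, using illustrative sample rows (the names and e-mail values below are examples only):

[cloudera@quickstart hivedata]$ echo "John,john@example.com" > Employees.csv
[cloudera@quickstart hivedata]$ echo "Asha,asha@example.com" >> Employees.csv
[cloudera@quickstart hivedata]$ hadoop fs -put Employees.csv /user/hive/warehouse/   // place the file where --export-dir points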

insert mode

[cloudera@quickstart hivedata]$ sqoop export --connect jdbc:mysql://192.168.13.135/hr --username cloudera --password cloudera --table employees --export-dir /user/hive/warehouse/Employees.csv;
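
After the export job completes, the rows can be verified in MySQL:

mysql> select * from employees;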

update mode

[cloudera@quickstart hivedata]$ sqoop export --connect jdbc:mysql://192.168.13.135/hr --username cloudera --password cloudera --table employees --export-dir /user/hive/warehouse/Employees.csv --update-key name;
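
Note that --update-key alone runs in the default updateonly mode: only rows whose key already exists in MySQL are updated, and rows with new keys are skipped. To update existing rows and insert new ones (upsert), Sqoop supports --update-mode allowinsert:

[cloudera@quickstart hivedata]$ sqoop export --connect jdbc:mysql://192.168.13.135/hr --username cloudera --password cloudera --table employees --export-dir /user/hive/warehouse/Employees.csv --update-key name --update-mode allowinsert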



