Detailed steps: exporting data from SQL Server to Azure HBase/Hive


The Hadoop on Azure Sqoop Import Sample Tutorial

Table of Contents
    • Overview
      • Goals
      • Key Technologies
      • Setup and Configuration
    • Tutorial
      • How to set up a SQL database
      • How to use Sqoop from Hadoop on Azure to import SQL Database query results to the HDFS cluster in Hadoop on Azure
    • Summary


Overview

This tutorial shows how to use Sqoop to import data from a SQL database on Windows Azure to a Hadoop on Azure HDFS cluster.

While Hadoop is a natural choice for processing unstructured and semi-structured data, such as logs and files, there may also be a need to process structured data stored in relational databases.

Sqoop is a tool designed to transfer data between Hadoop and relational databases. You can use it to import data from a relational database management system (RDBMS) such as SQL Server, MySQL, or Oracle into the Hadoop Distributed File System (HDFS), transform the data in Hadoop with MapReduce or Hive, and then export the data back into an RDBMS. In this tutorial, you use a SQL database as your relational database.
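As an illustrative sketch of these two directions (the host name, credentials, table names, and directories below are placeholders for this explanation, not values from this tutorial), an import and an export look like this from a Hadoop command shell:

sqoop import --connect "jdbc:mysql://dbhost:3306/sales" --username dbuser --password dbpass --table orders --target-dir /data/orders -m 1

sqoop export --connect "jdbc:mysql://dbhost:3306/sales" --username dbuser --password dbpass --table orders_summary --export-dir /data/orders_summary

Here -m 1 runs the transfer with a single map task; larger values split the work across parallel tasks.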

Sqoop is an open source software product of Cloudera, Inc. Software development for Sqoop has recently moved from GitHub to the Apache Sqoop site.

In Hadoop on Azure, Sqoop is deployed from the Hadoop Command Shell on the head node of the HDFS cluster. You use the Remote Desktop feature available in the Hadoop on Azure portal to access the head node of the cluster for this deployment.

Goals

In this tutorial, you will see three things:

    1. How to set up a SQL database on Windows Azure for use with the tutorial.

    2. How to use the Remote Desktop feature in Hadoop on Azure to access the head node of the HDFS cluster.

    3. How to import relational data from SQL Server to a Hadoop on Azure HDFS cluster by using Sqoop.

Key Technologies

    • Windows Azure
    • Hadoop on Azure
    • Sqoop

Setup and Configuration

You must have an account to access Hadoop on Azure and have created a cluster to work through this tutorial. To obtain an account and create a Hadoop cluster, follow the instructions outlined in the Getting Started with Microsoft Hadoop on Azure section of the Introduction to Hadoop on Azure topic.

You'll also need the outward-facing IP address of your current location when configuring your firewall on SQL Database. To obtain it, go to the site WhatIsMyIP and make a note of it. Later in the procedure, you will also need the outward-facing IP address of the head node of the Hadoop cluster. You can obtain that IP address in the same way.
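On a machine with curl installed, a public echo service returns the same information from the command line; for example (this service URL is one common choice, not part of the original tutorial):

curl https://api.ipify.org

The address printed is the outward-facing IP that the SQL Database firewall will see for requests from that machine.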

Tutorial

This tutorial is composed of the following segments:

    1. How to set up a SQL database.

    2. How to use Sqoop from Hadoop on Azure to import data to the HDFS cluster.

How to set up a SQL database

Log in to your Windows Azure account. To create a database server, click the Database icon in the lower left-hand corner of the page.

On the Getting Started page, click the Create a new SQL Database Server option.

Select the type of subscription (such as Pay-As-You-Go) associated with your account in the Create Server window and click Next.

Select the appropriate region in the Create Server window and click Next.

Specify the login and password of the server-level principal of your SQL Database server and then click Next.

Click Add to specify a firewall rule that allows your current location access to SQL Database so you can upload the AdventureWorks database. The firewall grants access based on the originating IP address of each request. Use the IP address found during the configuration preliminaries of this tutorial for the values to add. Specify a rule name, such as the one shown, but remember to use your own IP address, not the one used for illustration purposes below. (You must also add the outward-facing IP address of the head node in your Hadoop cluster. If you know it already, add it now.) Then click the Finish button.

Download the AdventureWorks2012 database onto your local machine from the Recommended Downloads link on the Adventure Works for SQL Database site.

Unzip the file, open an Administrator Command Prompt, and navigate to the AdventureWorks directory inside the AdventureWorks2012ForSQLAzure folder.

Run CreateAdventureWorksForSQLAzure.cmd by typing the following:

CreateAdventureWorksForSQLAzure.cmd servername username password

For example, if the assigned SQL Database server is named b1gl33p, the administrator user name is "Fred", and the password is "Secret", you would type the following:

CreateAdventureWorksForSQLAzure.cmd b1gl33p.database.windows.net [email protected] Secret

The script creates the database, installs the schema, and populates the database with sample data.

Return to the Windows Azure Platform portal page, click your subscription on the left-hand side (Pay-As-You-Go in the example below), and select your database (here named wq6xlbyoq0). The AdventureWorks2012 database should be listed in the Database Name column. Select it and click the Manage icon at the top of the page.

Enter the credentials for the SQL database when prompted and click Log on.

This opens the Web interface for the Adventure Works database on SQL Database. Click the New Query icon at the top to open the query editor.

Since Sqoop currently adds square brackets to the table name, we need to add a synonym to support two-part naming for SQL Server tables. To do so, run the following query:

CREATE SYNONYM [Sales.SalesOrderDetail] FOR Sales.SalesOrderDetail

Run the following query and review its result.

select top 200 * from [Sales.SalesOrderDetail]

How to use Sqoop from Hadoop on Azure to import SQL Database query results to the HDFS cluster in Hadoop on Azure

From your Account page, scroll down to the Open Ports icon in the Your Cluster section and click the icon to open the ODBC Server port on the head node in your cluster.

Return to your Account page, scroll down to the Your Cluster section, and click the Remote Desktop icon, this time to open the head node in your cluster.

Select Open when prompted to open the .rdp file.

Select Connect in the Remote Desktop Connection window.

Enter your credentials for the Hadoop cluster (not your Hadoop on Azure account) into the Windows Security window and select OK.

Open Internet Explorer and go to the WhatIsMyIP site to obtain the outward-facing IP address of the head node of the cluster. Return to the SQL Database Management page and add a firewall rule that allows your Hadoop cluster access to SQL Database. The firewall grants access based on the originating IP address of each request.

Double-click the Hadoop Command Shell icon in the upper left-hand corner of the desktop to open it. Navigate to the "C:\Apps\dist\sqoop\bin" directory and run the following command:

sqoop import --connect "jdbc:sqlserver://[serverName].database.windows.net;username=[userName]@[serverName];password=[password];database=AdventureWorks2012" --table Sales.SalesOrderDetail --target-dir /data/lineitemData -m 1

So, for example, for the following values:
* Server name: wq6xlbyoq0
* User name: hadooponazuresqoopadmin
* Password: Pa$$w0rd

The Sqoop command is:

sqoop import --connect "jdbc:sqlserver://wq6xlbyoq0.database.windows.net;[email protected];password=Pa$$w0rd;database=AdventureWorks2012" --table Sales.SalesOrderDetail --target-dir /data/lineitemData -m 1
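If the import fails, it can help to first confirm connectivity from the head node with Sqoop's eval tool, which runs a single query against the database and prints the result (shown here with the same illustrative server name and credentials as above; substitute your own values):

sqoop eval --connect "jdbc:sqlserver://wq6xlbyoq0.database.windows.net;[email protected];password=Pa$$w0rd;database=AdventureWorks2012" --query "SELECT TOP 5 * FROM Sales.SalesOrderDetail"

If this query succeeds but the import does not, the problem is likely in the table-name handling or target directory rather than in connectivity or the firewall rules.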

Return to the Account page of the Hadoop on Azure portal and open the Interactive Console this time. Run the #lsr command from the JavaScript console to list the files and directories on your HDFS cluster.

Run the #tail command to view selected results from the part-m-00000 file.

tail /user/RAdmin/data/SalesOrderDetail/part-m-00000

Summary

In this tutorial, you have seen how to use Sqoop to import data from a SQL database on Windows Azure to a Hadoop on Azure HDFS cluster.
