Simple CLI tool for RDBMS and Hadoop transfers
Use Cases and Deployment Scope
Sqoop is being used to offload relational databases into Hadoop HDFS, Hive, or HBase. From there, big data analysis jobs run, and Sqoop is then used to reload the results into tables in the source database, where external systems such as web applications can query them relationally.
Sqoop helps bridge the gap between traditional RDBMS systems and the Hadoop ecosystem.
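The round trip described above can be sketched with two Sqoop commands; the connection string, credentials, table names, and HDFS paths below are placeholders for illustration:

```shell
# Offload a relational table into HDFS (host/db/table names are hypothetical)
sqoop import \
  --connect jdbc:mysql://db.example.com/sales \
  --username etl_user \
  --password-file /user/etl/db.password \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4

# ... analysis jobs run on the cluster, writing results to /data/out/order_summary ...

# Reload the results into a table in the source database for external systems to query
sqoop export \
  --connect jdbc:mysql://db.example.com/sales \
  --username etl_user \
  --password-file /user/etl/db.password \
  --table order_summary \
  --export-dir /data/out/order_summary
```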
Pros
- Provides generalized JDBC extensions to migrate data between most database systems
- Generates Java classes upon reading database records for use in other code utilizing Hadoop's client libraries
- Allows for both import and export features
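The code-generation feature mentioned above can also be run on its own: `sqoop codegen` writes a Java class modeling a record of the given table, which can then be reused in custom code built on Hadoop's client libraries (table and directory names here are hypothetical):

```shell
# Generate an orders.java record class without performing an import
sqoop codegen \
  --connect jdbc:mysql://db.example.com/sales \
  --username etl_user \
  --password-file /user/etl/db.password \
  --table orders \
  --outdir ./generated-src \
  --bindir ./generated-bin
```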
Cons
- Sqoop2 development seems to have stalled. I have set it up outside of a Cloudera CDH installation, and I actually prefer its "Sqoop Server" model to the CLI-client-only model of Sqoop1. It works especially well in a microservices environment, where there is only one place to maintain the JDBC drivers Sqoop uses.
Likelihood to Recommend
Sqoop is great for sending data between a JDBC-compliant database and a Hadoop environment. Sqoop is built for those who need a few simple CLI options to import a selection of database tables into Hadoop, run large dataset analysis that could not commonly be done on that database system due to resource constraints, then export the results back into that database (or another). Sqoop falls short when extra, customized processing is needed between database extraction and Hadoop loading, in which case Apache Spark's JDBC utilities might be preferred.
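When only a slice of a table is needed, Sqoop's free-form query import covers simple selection cases before reaching for Spark. Note that `$CONDITIONS` is a literal token Sqoop requires in the WHERE clause so it can split the query across parallel mappers; the connection details and column names below are placeholders:

```shell
# Import only recent rows via a free-form query; --split-by tells Sqoop
# which column to use when partitioning the result set across mappers
sqoop import \
  --connect jdbc:mysql://db.example.com/sales \
  --username etl_user \
  --password-file /user/etl/db.password \
  --query 'SELECT id, total FROM orders WHERE created_at > "2020-01-01" AND $CONDITIONS' \
  --split-by id \
  --target-dir /data/raw/recent_orders
```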
