Moving Data Between Hadoop and Relational Databases: Apache Sqoop 1.99.4 Released


Sqoop is a tool for transferring data between Hadoop and relational databases. It can import data from a relational database (such as MySQL, Oracle, or PostgreSQL) into Hadoop's HDFS, and it can also export data from HDFS back into a relational database.
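For readers who have not used the tool, the snippet below is a minimal sketch of what a Sqoop transfer looks like, written as a small Python wrapper around the classic Sqoop 1 command-line client; the Sqoop2 line announced here (1.99.x) instead runs as a server and is driven through its own shell or REST/Java client. The JDBC URL, credentials, table names, and HDFS paths are hypothetical.

    import subprocess

    # Minimal sketch: drive the classic Sqoop 1 CLI from Python.
    # All connection details, table names, and HDFS paths are hypothetical,
    # and the `sqoop` binary is assumed to be on PATH.

    def sqoop_import(jdbc_url, username, password, table, target_dir):
        """Import one RDBMS table into HDFS as delimited files."""
        subprocess.run([
            "sqoop", "import",
            "--connect", jdbc_url,
            "--username", username,
            "--password", password,
            "--table", table,
            "--target-dir", target_dir,
        ], check=True)

    def sqoop_export(jdbc_url, username, password, table, export_dir):
        """Export files from HDFS back into an existing RDBMS table."""
        subprocess.run([
            "sqoop", "export",
            "--connect", jdbc_url,
            "--username", username,
            "--password", password,
            "--table", table,
            "--export-dir", export_dir,
        ], check=True)

    if __name__ == "__main__":
        # Hypothetical round trip: copy the `orders` table into HDFS,
        # then push the same files into another table.
        sqoop_import("jdbc:mysql://dbhost/sales", "etl", "secret",
                     "orders", "/user/etl/orders")
        sqoop_export("jdbc:mysql://dbhost/sales", "etl", "secret",
                     "orders_copy", "/user/etl/orders")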

Apache Sqoop 1.99.4 has been released. This is the fourth milestone release of Sqoop2 and a very significant one.

Improvements and new features in this release:

Improvement

  • [SQOOP-773] - Sqoop2: Batch execution support for client commands

  • [SQOOP-1144] - Sqoop2: Add fixVersion to PreCommit branch detection

  • [SQOOP-1189] - Sqoop2: Ensure that clone methods will correctly copy over all values from all parents

  • [SQOOP-1196] - Sqoop2: Add support for arbitrary compression codecs

  • [SQOOP-1211] - Sqoop2: Derby repo: Sync maximal length of versions

  • [SQOOP-1225] - Sqoop 2 documentation for connector development

  • [SQOOP-1290] - Sqoop2: Kill Tomcat in case that Sqoop Server fails to load

  • [SQOOP-1509] - Sqoop2: Sqoop2 Rest API refactoring

  • [SQOOP-1547] - Sqoop2: Connector API stabilization

  • [SQOOP-1557] - Sqoop2: SQ_CONFIGURABLE ( for entities who own configs)

  • [SQOOP-1566] - Sqoop2: Fix the upgrade logic for SQOOP-1498

  • [SQOOP-1585] - Sqoop2: Prefix mapreduce classes with MR ( no functionality change)

  • [SQOOP-1586] - Sqoop2: Rename leftovers from the SQOOP2 merge of 1367

  • [SQOOP-1597] - Sqoop2: Refactor DerbySchemaQuery into one for create/update and then CRUD operations

  • [SQOOP-1620] - Sqoop2: FileSystem should be configurable in HDFS connector


New Feature