Configuring Hive Server2 as a Connection

Configuring a HiveServer2 connection is similar to configuring Hive.

To create a HiveServer2 connector:

  1. Click the + (plus) button at the top left of the File Browser and select Connection, or right-click on a folder, select Create New then select Connection.
  2. From the drop-down list, select Hive Server2 as the connection type. Click Next.
  3. Enter the HiveServer2 connection address and port number. 
    Select TCP/Binary or HTTP as the transport mode. (The transport mode option is available as of Datameer v6.3.)
         - If selecting HTTP as the transport mode, the property hive.server2.thrift.http.path has the hardcoded value of cliservice. Learn more at Apache Hive.
         - HTTP mode using SSL is not supported.
         - HTTP mode can only be used if Hive supports token-based authentication. Hive versions 1.2 and later support this feature.
    Optional: 
         - Enter a database filter to limit the number of databases listed when importing with this connector.
         - Enter an export path where exports from this connector are written. 
         - Enter any additional custom properties.

    If the database filter for a connection is updated, any import or export job that uses a database excluded by the new filter fails when it runs.

    Security

    Kerberos

    Select Kerberos Secured Hive and enter both the Hive and HDFS principals.

    LDAP (As of Datameer v6.1)

    Select LDAP/AD Authentication and choose whether the HiveServer2 connector provides the authentication credentials or whether the credentials are provided by the individual import/export jobs.

    Sentry 

    Datameer respects the Sentry permissions of users running HiveServer2.

    How Datameer and Sentry interact with users:

    When you add an artifact to Hive through Datameer, it is added using the Datameer service account. The Datameer service account must have all Hive permissions in order to authenticate with Sentry.

    However, data on the Hive cluster is accessed using the Datameer user account. The Datameer user account must have permissions in Hive for the requested data in order for Datameer to authenticate with Sentry.

    At the transport level, HiveServer2 has multiple connection methods available. Datameer currently supports the following binary connection methods:

    • SASL
    • Non SSL
    • NOSASL
    • Kerberos
    • Plain

  4. Fill out a description and save the connector.
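As a point of reference for the address, port, and transport mode entered above, the following sketch composes the JDBC URL a HiveServer2 client would use for each transport mode. The host names and ports are placeholders; the transportMode and httpPath parameters follow the Apache Hive JDBC URL format, with httpPath matching the hardcoded cliservice value.

```python
def hive2_jdbc_url(host, port, database="default", transport="binary"):
    """Compose a HiveServer2 JDBC URL for the chosen transport mode."""
    url = f"jdbc:hive2://{host}:{port}/{database}"
    if transport == "http":
        # In HTTP mode, hive.server2.thrift.http.path is hardcoded to
        # "cliservice", so the httpPath parameter must match it.
        url += ";transportMode=http;httpPath=cliservice"
    return url

# Placeholder host/ports for illustration only
print(hive2_jdbc_url("hive.example.com", 10000))
print(hive2_jdbc_url("hive.example.com", 10001, transport="http"))
```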

Importing Data with a HiveServer2 Connector

Configuring the import job wizard with a HiveServer2 connector follows the same procedure as importing from Hive.

Configuring the Hive Plug-in

The Hive plug-in is provided by default with the installation of Datameer to import from and export to Hive servers. It can be found by opening the Admin tab and selecting Plug-ins from the menu.

Click the cog icon under Actions to configure Datameer's Hive plug-in.

Export Settings

There are two options for configuring the data field type mapping from Datameer to HiveServer2 upon export.

  • Datameer classic is the default setting for the plug-in. The mapping specifics can be found on Data Field Types in Datameer: Hive Server2 Mapping.
  • Hive specific is similar to the default setting with the change that Datameer's BigDecimal is mapped to Hive's Decimal and Datameer's Date is mapped to Hive's Timestamp data type.
    • BigDecimal
      • New Table - Datameer exports the BigDecimal type to the Hive type Decimal with a precision (total number of digits) of 38 and a scale (number of digits to the right of the point) of 18.
      • Existing Table - Datameer exports the BigDecimal type to the Hive type Decimal with the precision and scale defined on the Hive server. 
        • A maximum is set at a (precision, scale) of (38, 37).
      • If an exported value doesn't fit within the precision/scale of either a new or existing table, a failure occurs. 
    • Date
      • New Table - Datameer exports the Date type to the Hive type Timestamp.
      • Existing Table - Datameer exports the Date type to the Hive type Timestamp/Date/String depending on what is defined on the Hive server.

The mode has no influence when exporting into an existing partitioned Hive table.
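The precision/scale fit rule above can be checked ahead of an export with a small sketch. fits_hive_decimal is a hypothetical helper, not part of Datameer; it treats excess fractional digits as a misfit, whereas the server may instead round them.

```python
from decimal import Decimal

def fits_hive_decimal(value, precision=38, scale=18):
    """Rough check that `value` fits Hive DECIMAL(precision, scale):
    at most (precision - scale) digits left of the decimal point and
    at most `scale` digits right of it."""
    value = Decimal(value)
    exponent = value.as_tuple().exponent
    fractional_digits = max(0, -exponent)
    # adjusted() is the position of the most significant digit
    integer_digits = 0 if value == 0 else max(value.adjusted() + 1, 0)
    return integer_digits <= precision - scale and fractional_digits <= scale

# 20 integer digits is the most DECIMAL(38, 18) can hold
print(fits_hive_decimal("12345678901234567890.5"))   # True
print(fits_hive_decimal("123456789012345678901"))    # False: 21 integer digits
```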

Cache

Datameer caches Java objects that represent partitions on HiveServer2. These partitions contain locations, column names, and other information. Datameer stores these objects on a local disk. When Datameer needs to read Hive partitions, it uses the stored cache instead of pulling the same information each time it is needed, which increases performance.

The cached Hive partition data is created for all registered import job models when the plug-in starts and then updates automatically every 6 hours. Registration occurs when the import job or data link is created using the wizard as well as when the job is processed.

From the Hive plug-in configuration settings, you have the ability to clear the current cache or start filling it immediately without having to wait for the automated renewal.

  • The cache can be cleared to remove stored partition data that is no longer being used and is decreasing performance.
  • The cache can be manually filled before the auto update to cache new/changed Hive partition data to increase performance. 
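The cache lifecycle described above — serve stored data until it goes stale, refresh every 6 hours, with manual clear and fill — can be sketched as follows. PartitionCache and fetch_partitions are illustrative names standing in for the plug-in's on-disk cache and metastore call, not Datameer's actual API.

```python
import time

REFRESH_SECONDS = 6 * 60 * 60   # the plug-in refreshes the cache every 6 hours

class PartitionCache:
    """Illustrative in-memory stand-in for the plug-in's on-disk partition cache."""

    def __init__(self, fetch_partitions):
        self._fetch = fetch_partitions   # hypothetical loader for partition metadata
        self._store = {}                 # table -> (fetch time, partition metadata)

    def get(self, table):
        """Serve cached partition metadata, refetching once it goes stale."""
        entry = self._store.get(table)
        if entry is None or time.time() - entry[0] > REFRESH_SECONDS:
            self._store[table] = (time.time(), self._fetch(table))
        return self._store[table][1]

    def clear(self):
        """Drop all cached partition data ("clear the current cache")."""
        self._store.clear()

    def fill(self, tables):
        """Fetch fresh data now instead of waiting for the auto-refresh."""
        for table in tables:
            self._store[table] = (time.time(), self._fetch(table))
```

In this sketch, get() serves the cached copy until the 6-hour window lapses, while clear() and fill() mirror the two manual actions available in the plug-in settings.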

The feature is unavailable for HiveServer1.

