
INFO

These release notes provide information on improvements, bug fixes, and visual changes for Datameer X 11.2.x maintenance release versions.

Visit Datameer's Help Center for additional information or contact your customer service representative. 

Datameer X 11.2.9

Bug Fixes

1. Google Data Proc: Version 2.1 - Cannot create External System for Hive (DAP-42529)

Creating an External System for Hive works as expected.

(tick) Bug fixed.

2. Workbook: 'WorkbookRestoreManager' lazy initialization failure (DAP-42536)

After restarting Datameer, opening a Workbook via the FileBrowser now initializes the 'WorkbookRestoreManager' first, and the Workbook can be opened directly by its URL again.

(tick) Bug fixed.

3. Authentication: 'Group Filters' section is not opening (DAP-42537)

The 'Group Filters' section on the Admin's configuration page for external authentication can now be opened and closed again.

(tick) Bug fixed.

Datameer X 11.2.8

Improvements

1. Supported Hadoop Distributions: Add support for Google DataProc 2.0.67 and 2.1.14 (DAP-42396, DAP-42521)

Datameer now supports Google DataProc versions 2.0.67 and 2.1.14.
2. Plug-ins: plugin-snowflake - Use an existing Snowflake stage for export file staging (DAP-42505)

The plugin now provides the option to use a specific named staging location, defined by a variable in the export, as the storage option.

Bug Fixes

1. Workbook: Custom Workbook undo configuration is reverted to default after a Datameer restart (DAP-42520)

The custom 'Workbook Undo' configuration is kept after restarting Datameer. 

(tick) Bug fixed.

2. Performance: Datameer became unavailable because of high memory and CPU consumption (DAP-42522)

After patching the Hadoop AWS file, Datameer works again without performance loss.

(tick) Bug fixed.

3. Export: Tableau - Export job setup using JSON Web Token authentication method causes Null Pointer Exception (DAP-42533)

The Tableau plugin has been patched. The JWT authentication method can be used again to execute export jobs or to create new jobs.

(tick) Bug fixed.

Datameer X 11.2.7

Improvements

1. Plug-ins: plugin-snowflake - Use Snowflake's internal stage for exports (DAP-42491)

The plugin now provides the option to select Snowflake's internal stage as the default storage option. After writing the export data into the local file system, it will be imported to Snowflake via a 'PUT' command.
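Conceptually, this staging flow maps to Snowflake's standard PUT/COPY commands. A hedged sketch (the table name and file path are hypothetical examples, not what the plugin actually emits):

```sql
-- Upload the locally written export file to the table's internal stage
PUT file:///tmp/datameer_export.csv @%my_export_table;

-- Load the staged file into the target table
COPY INTO my_export_table FROM @%my_export_table;
```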

Bug Fixes

1. Timezone not resolving correctly in Mexico (DAP-42497)

After a code fix, the correct timezone is now displayed again for date and time assets in Mexico.

(tick) Bug fixed.

2. Workbook: Sorting by the 'Last Processed' time doesn't work (DAP-42501)

The sorting functionality now works as expected again in the File Browser's Artifacts bar.

(tick) Bug fixed.

Datameer X 11.2.6

Improvements

1. REST API: Output a response when creating a Workbook via the API (DAP-42481)

REST API v2 calls for creating a new Workbook now return the following information: status, configuration ID, file ID, file UUID, and file path.
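As a hedged illustration, a response from such a call might look like the following (field names and values are assumptions based on the fields listed above, not the exact payload):

```json
{
  "status": "created",
  "configuration-id": 1234,
  "file-id": 5678,
  "file-uuid": "a1b2c3d4-0000-0000-0000-000000000000",
  "file-path": "/Users/admin/MyWorkbook.wbk"
}
```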
2. Import/Export: Use the local Datameer timezone instead of UTC while converting numeric values into a date (DAP-42475)

Two new global properties, 'das.import.parquet.int64.timestamp.adjusted-to-utc' and 'das.import.parquet.int96.timestamp.adjusted-to-utc', ensure that Parquet int96/int64 coded timestamps are treated as being in Datameer's timezone when set to 'false'.
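To illustrate why this adjustment matters, the following standalone Python sketch (not Datameer code; the epoch value and the UTC-5 zone are arbitrary examples) shows how the same int64 epoch value yields different wall-clock times depending on whether it is interpreted as UTC or as a local timezone:

```python
from datetime import datetime, timezone, timedelta

# An int64 timestamp as commonly stored in Parquet: milliseconds since epoch.
epoch_millis = 1672531200000  # 2023-01-01 00:00:00 UTC

# Interpreted as UTC (the default behavior, 'adjusted-to-utc' left at true):
as_utc = datetime.fromtimestamp(epoch_millis / 1000, tz=timezone.utc)

# The same instant rendered in a fixed non-UTC zone (here UTC-5), which is
# the kind of local-timezone interpretation the new properties enable:
local_zone = timezone(timedelta(hours=-5))
as_local = as_utc.astimezone(local_zone)

print(as_utc.isoformat())    # 2023-01-01T00:00:00+00:00
print(as_local.isoformat())  # 2022-12-31T19:00:00-05:00
```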
3. Supported Hadoop Distributions: Add support for Cloudera Private Cloud/Cloudera Runtime 7.1.8 (DAP-42394)

Datameer now supports Cloudera Private Cloud/ Cloudera Runtime 7.1.8.

Bug Fixes

1. Administration: Housekeeping doesn't work - jobs stuck in the job scheduler (DAP-42485, DAP-42489)

Housekeeping can now be configured via the properties 'housekeeping.keep-deleted-data' and 'housekeeping.keep-deleted-data-max' to remove outdated files.

(tick) Bug fixed.
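For example, the two housekeeping properties might be set as follows (the values shown are illustrative assumptions, not documented defaults):

```properties
# Illustrative values only - consult the Datameer documentation for the
# supported value types and ranges of these properties.
housekeeping.keep-deleted-data=true
housekeeping.keep-deleted-data-max=30
```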

2. REST API: Workbooks with schedule created via REST API v2 are not triggered (DAP-42484)

Scheduled triggering for Workbooks created via the REST API v2 now works without the need to manually resave the Workbook's configuration in the User Interface.

(tick) Bug fixed.

3. Import/Export: Updating Snowflake drivers (DAP-42479)

The patched 'snowflake-jdbc' plugin now enables the Datameer - Snowflake connection again.

(tick) Bug fixed.

Datameer X 11.2.5

Improvements

1. Security: Upgrade Jetty to version 9.4.50 (DAP-42458)

Datameer has been upgraded to Jetty version 9.4.50 to ensure the services are not affected by any known vulnerabilities.

Bug Fixes

1. Security: Penetration tests - GetFiles metadata, get the folder content via the folder ID (DAP-42468)

A user is not allowed to view any metadata for a directory without proper access (sees no folder path or file ID). 

(tick) Bug fixed.

2. Security: Penetration tests - Users may not see other users' schema logs (DAP-42469)

A user is not allowed to see the parsing records from artifacts created by other users.

(tick) Bug fixed.
3. Security: Penetration tests - The '/browser/list-file' endpoint returns metadata for any file in the system (DAP-42470)

A user is not allowed to view artifact metadata without having at least a 'View' permission for these items.

(tick) Bug fixed.
4. Security: Penetration tests - Dependency graph information doesn't check the permission authorization (DAP-42471)

A user is not allowed to load an artifact’s dependency graph without having at least a 'View' permission for this item.

(tick) Bug fixed.
5. Security: Penetration tests - Detailed error messages displayed (DAP-42472)

Only generic error messages without error details are now returned to Datameer users. Stack traces are logged server-side and only accessible by developers or administrators. The associated property handles the error messages.

(tick) Bug fixed.
6. Security: Penetration tests - '/rest/user-management/authenticable-users' should return a list of usernames belonging to the same role only with the CHANGE_FOLDER_OWNER capability (DAP-42473)

Only users who belong to the same role and have the 'change folder owner' capability can view other usernames via the REST API call.

(tick) Bug fixed.

Datameer X 11.2.4

Improvements

1. Drivers: Update Redshift Native JDBC driver to use Amazon's JDBC42 driver (DAP-42443)

The Amazon Redshift Native JDBC driver is deprecated. The driver has therefore been updated to the Amazon Redshift JDBC42 driver.


Bug Fixes

1. Plugins: plugin-hbase+cdh-7.1.7.0 - Import fails (DAP-42440)

Setting the property 'hbase.server.sasl.provider.extras=org.apache.hadoop.hbase.security.provider.GssSaslClientAuthenticationProvider' bypasses the service loader lookup code for providing authentication providers.

(tick) Bug fixed.

Datameer X 11.2.3

Improvements

1. Importing Data: Add an option to configure the default value for "Records for Schema Detection" (DAP-42421)

The new custom property 'das.conductor.default.record.sample.size' controls what 'Records for schema detection' value is set by default whenever a user creates a new Import Job, Data Link or File Upload. 

Adjust it, e.g., when working with large JSON objects (decrease to 250 - 500).

Once updated, it affects only newly created Import Jobs; the 'Records for schema detection' value for existing artifacts remains intact.
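The property described above could, for instance, be set like this (the value 500 is an illustrative choice from the suggested range, not a documented default):

```properties
# Illustrative: lower the default schema-detection sample size,
# e.g. for workloads with large JSON objects.
das.conductor.default.record.sample.size=500
```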


Bug Fixes

1. Export: Snowflake - Exporting to a new table intermittently leads to a duplicated columns error (DAP-42012)

After fixing the plug-in, the export sheet can be executed without any errors again.

(tick) Bug fixed.

2. Export: Snowflake - Intermittent export failure, insert value list does not match the column list (DAP-42442)

After a plug-in fix, the column list now matches the insert value list as expected again.

(tick) Bug fixed.

Datameer X 11.2.2

Improvements

1. Supported Hadoop Distributions: Add support for Amazon EMR 6.8.0 (DAP-42400)

Datameer X now supports Amazon EMR version 6.8.0.


2. Properties: Make HadoopUtil#HADOOP_ACCESS_TIMEOUT configurable (DAP-42404)

The timeout in an S3A filesystem can now be changed via the properties 'fs.s3a.connection.establish.timeout' and 'fs.s3a.connection.timeout' in order to prevent job timeouts.
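For example, the S3A timeouts might be raised as follows (the values are illustrative assumptions in milliseconds, matching the units Hadoop uses for these S3A settings; tune them to your cluster):

```properties
# Illustrative: allow more time for establishing S3 connections
fs.s3a.connection.establish.timeout=50000
# Illustrative: allow more time for socket reads on open connections
fs.s3a.connection.timeout=200000
```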

Bug Fixes

1. Security: Spring framework CVE vulnerabilities (DAP-42402)

The fix has been backported to Spring framework 4.3.

(tick) Bug fixed.

Datameer X 11.2.1

Bug Fixes

1. Import/Export: Hive (with ADLS backed external table) - Failure on import (DAP-42389)

Hive import jobs now succeed again when executed on the cluster after setting several properties and adapting the Hadoop configuration.

(tick) Bug fixed.

2. Plug-ins: plugin-parquet - Datameer encodes DATE values in base64 while exporting into a Parquet file (DAP-42382)

After the 'plugin-parquet' was patched, the DATE fields are exported correctly.

(tick) Bug fixed.

3. Export: Tableau - Hyper jobs fail intermittently with "Hyper Server did not call back on the callback connection." (DAP-42380)

Users are now able to export Hyper formatted files to Tableau without any failures again after the associated plug-in has been updated.

(tick) Bug fixed.

Datameer X 11.2.0

Improvements

1. Supported Hadoop Distributions: Add support for Amazon EMR 5.36.0 (DAP-42369)

Datameer X now supports Amazon EMR version 5.36.0.


2. Supported Hadoop Distributions: Add support for Amazon EMR 6.7.0 (DAP-42375)

Datameer X now supports Amazon EMR version 6.7.0.


3. Backup & Restore: Performance and lack of logging problem (DAP-42370)

The backup and restore process now executes at the expected speed for a large number of artifacts, and logging has been improved.

Bug Fixes

1. Plug-ins: EMR - 'plugin-emr' High Availability discovery mode picks local running Yarn Resource Manager at 0.0.0.0:8088 (DAP-42378)

The EMR cluster can now be set up in High Availability mode. 

(tick) Bug fixed.

2. Plug-ins: EMR - 'plugin-emr' "default=obtain from ec2 environment" region option cannot be saved (DAP-42372)

When configuring the cluster mode in EMR, the default option for the region can now be saved.

(tick) Bug fixed.

3. Upgrade: Java upgrade 'Upgrade Filesystem Artifact To Delete' doesn't trigger the database upgrade (DAP-42)

The Java upgrade script now updates the table 'Upgrade Filesystem Artifact To Delete' as expected or throws a clear error message.

(tick) Bug fixed.