Supported Hadoop Distributions: Add Support for Cloudera CDP-DC/ Cloudera Runtime 7.1.1
DAP-40110
Datameer X now supports Cloudera Runtime version 7.1.1.
2
Workbook: Rename the 'SHIFTTIMEZONE' function arguments
DAP-40199
The argument descriptions in the formula builder have been changed to avoid ambiguity. The behavior of the function itself is unchanged.
3
Setup: Enable the 'DAS_USER' property in the 'etc/das-env.sh' file to reflect the assigned service account
DAP-40247
The 'DAS_USER' property must be enabled in the 'etc/das-env.sh' file. Setting it prevents an administrator from accidentally starting Datameer X without first assuming the assigned service account.
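The setting in 'etc/das-env.sh' might look as follows; the account name is a placeholder, not taken from the release notes:

```
# etc/das-env.sh
# Uncomment and set DAS_USER to the assigned service account to prevent
# an accidental startup of Datameer X under a different account.
# 'datameer' is an example account name only.
DAS_USER=datameer
```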
4
Workbook: Extend variable value length
DAP-40265
Within a fresh Datameer X distribution, it is now possible to create variable values with more than 255 characters.
5
Performance: Validate Datameer X license status once a day only
DAP-40272
A Datameer X license now gets checked with the first user authentication or scheduled job run after midnight (local time) and is considered valid for 24 hours afterwards.
6
Workbook: Validate cross joins for SQL sheets
DAP-40276
Datameer X validates the queries of SQL sheets and notifies analysts if they accidentally created a cross join in a 'FROM' clause. The explicit 'CROSS JOIN' syntax is still allowed. The validation can be enabled by setting the 'sql-cross-join.enabled' property in the 'default.properties' configuration file.
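A minimal sketch of enabling the validation, assuming 'default.properties' resides in Datameer's configuration directory and that the property takes a boolean value:

```
# default.properties
# Enable cross-join validation for SQL sheets
# (boolean value is an assumption).
sql-cross-join.enabled=true
```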
7
Setup: Document MariaDB as a supported database dialect for Datameer's metadata database
DAP-40293
MariaDB is now listed as a valid setting for the 'system.property.db.mode'.
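A sketch of the setting; the exact value literal ('mariadb' below) is an assumption and should be verified against the setup documentation:

```
# Metadata database dialect for Datameer X
# 'mariadb' is an assumed value literal for the MariaDB dialect.
system.property.db.mode=mariadb
```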
8
Supported Hadoop Distributions: Add Support for Amazon EMR v5.30.0
DAP-40355
Datameer X now supports Amazon EMR v5.30.0.
9
Supported Hadoop Distributions: Add Support for Amazon EMR v6.0.0
DAP-40354
Datameer X now supports Amazon EMR v6.0.0.
Bug fixes
1
Import/ Export: Add tooltips for long column names in Full Data and Export Wizard step 'Choose Sheet'
DAP-38627
Datameer X now renders tooltips for long column names.
Bug fixed.
2
Import Job: 'Browse all data' and job details are opened in separate tabs
DAP-39613
Datameer X stays in the same browser tab during navigation after running an import job.
Bug fixed.
3
Import Job: 'DelegateImportFormat' and 'SequenceFileType' create an empty configuration instead of using the provided one
DAP-39884
Datameer X now validates for properly configured 'io.serializations' settings for custom sequence file formats.
Bug fixed.
4
Import/ Export: Error during an ImportJob creation
DAP-40148
When creating a BigQuery-based import job in a Google BigQuery connection, an 'IllegalArgumentException' error is no longer shown in the 'Data Details' step.
Bug fixed.
5
Workbook: Can not rename a source column from a SQL joined sheet
DAP-40158
Renaming a source column of a SQL sheet no longer leads to an error.
Bug fixed.
6
Import: Can't import an AVRO file via HDFS HA Connection
DAP-40172
Importing AVRO files from HA HDFS connections now works.
Bug fixed.
7
Setup: Datameer X doesn't clean up temporary data
DAP-40242
Datameer X cleans up its temporary folder during start-up as long as the folder is configured under Datameer's installation directory, but keeps it untouched otherwise.
Bug fixed.
8
HiveServer2 Plug-in: Enable support of mid-column schema changes of an underlying Hive table (AVRO & Parquet)
DAP-40267
Datameer's Hive plug-in now supports mid-column schema changes for the Parquet and AVRO storage formats.
Bug fixed.
9
File Browser: Details of an error dialog box open a new browser tab
DAP-40292
Error details are now rendered in the current browser tab.
Bug fixed.
10
Import Job: An import job is still referenced in a workbook after deleting the datasource in the workbook
DAP-40321
When deleting a datasource, the current workbook holds no further references to it anymore.
Bug fixed.
11
Workbook: Workbook job run fails with an SQL sheet
DAP-40327
The 'deploy.mode' is now passed in correctly for SQL sheets and the job run no longer fails.
Bug fixed.
12
Administration: The "User can see every file and folder" permission grants 'View Full Results' without an explicit mention.
DAP-40366
This permission now grants access only to see files and folders, not to view their data. The ability to grant 'View Full Results' has become a separate option that can be modified underneath.
Bug fixed.
13
Connection: Issue connecting to Amazon Athena
DAP-40372
The connection to Amazon Athena now works when the 'das.jdbc.import.transaction-isolation=TRANSACTION_NONE' property is set in the connection's custom properties.
Bug fixed.
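The workaround property as it would appear among the connection's custom properties; the property name and value are quoted from the release note above:

```
# Custom properties of the Amazon Athena connection
das.jdbc.import.transaction-isolation=TRANSACTION_NONE
```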
14
File Browser: Abort in 'Duplicate Folder' or 'Paste Folder' does not work
DAP-40374
Clicking the 'Abort' button while duplicating or pasting a folder now cancels the operation.
Bug fixed.
15
REST API: REST API v1 vs v2 - inconsistent handling of workbook UUIDs
DAP-40400
The REST API v2 now generates a new random UUID during a POST request, although a valid UUID is still required in the payload (e.g. '00000000-0000-0000-0000-000000000000' can be used as a template).
Bug fixed.
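A hedged sketch of such a payload template; the field name 'uuid' and the surrounding payload shape are assumptions, only the template value itself is taken from the release note:

```
{
  "uuid": "00000000-0000-0000-0000-000000000000"
}
```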
16
Plug-ins: Errors are swallowed, making it look like the plug-in was successfully initialized
DAP-40406
A plug-in will not get registered if there has been an error during plug-in initialization.
Bug fixed.
17
HiveServer2: A workbook that contains a datalink drops the records after a schema change of the underlying Parquet table
DAP-40448
Datameer X now imports all records from before and after a schema change.
Bug fixed.
18
Job Status Notification: Validation of the e-mail address according to RFC2822 (in detail: e-mail can contain "&")
DAP-40453
Job status notifications for import, export and workbook jobs can now be sent to e-mail addresses that contain characters such as '&'.
Bug fixed.
19
Support Engineer Report: The property 'SupportEngineerReportController.filterFilesLastNDays()' must cut at midnight from the system default timezone, not UTC
DAP-40467
The 'Support Engineer Report' now respects the system default timezone.
Bug fixed.
20
Plug-in Tez: 'CONSUMED_BYTES/RECORDS' is incorrect in some cases
DAP-40471
The plug-in now reports the counters for all (aliased) inputs in the dataset.
Bug fixed.
21
Plug-in Tez: A 'job-conf.xml' debug artifact should be written only once per Tez job, not for each vertex
DAP-40475
The 'job-conf.xml' debug artifact is now written only once per job, so rate limits such as those on GCP/ GCS no longer occur.