Datameer's release notes describe the improvements, bug fixes, and visual changes in each 7.1.x maintenance release.
...
1 | Cross-site Request Forgery (CSRF) Tokens Are Ignored | DAP-22902 |
---|---|---|
Create CSRF Tokens per Request | DAP-36910 | |
A CSRF Token Should Have a Life Time | DAP-36911 | |
Show the User a Warning If the CSRF Token Is Expired - HTML Form | DAP-37041 | |
The License Upload in the UI Isn't Secured by a CSRF Token | DAP-37194 | |
Show the User a Warning If the CSRF Token Is Expired - AJAX | DAP-36969 | |
Workbook: 403 Responses Need to Be Handled Correctly | DAP-37458 | |
The External REST API Shouldn't Be Able to Work With a Valid Session Cookie | DAP-36914 | |
Avoid the Flash Message in Case of HTTP-403 Error | DAP-37472 |
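The items above describe a per-request, expiring CSRF token scheme. As a rough illustration of that pattern, here is a minimal sketch, assuming a single-use token store with a fixed lifetime; the names (`CsrfTokenStore`, `TOKEN_TTL_SECONDS`) and the lifetime value are hypothetical, not Datameer's actual API.

```python
import secrets
import time

TOKEN_TTL_SECONDS = 30 * 60  # assumed lifetime; illustrative only

class CsrfTokenStore:
    """Hypothetical sketch: one random token per request, with an expiry."""

    def __init__(self):
        self._tokens = {}  # token -> expiry timestamp

    def issue(self):
        """Create a fresh token for a single request."""
        token = secrets.token_urlsafe(32)
        self._tokens[token] = time.time() + TOKEN_TTL_SECONDS
        return token

    def validate(self, token):
        """Return 'ok', 'expired', or 'invalid' so the UI can warn the user."""
        expiry = self._tokens.pop(token, None)  # single use: consume on check
        if expiry is None:
            return "invalid"
        if time.time() > expiry:
            return "expired"
        return "ok"
```

The three-way result is what lets the HTML-form and AJAX paths show an explicit "token expired" warning instead of a bare 403.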
Improvements
1 | Connect to S3 Via a Network Proxy | DAP-37456 |
---|---|---|
Datameer can now connect to S3 via a network proxy. |
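In Datameer this is configured at the connection level; as a generic sketch of what routing S3 traffic through a proxy looks like, the standard-library equivalent is below. The proxy URL is a placeholder, not a Datameer setting.

```python
import urllib.request

# Hypothetical proxy endpoint; substitute your own network proxy.
PROXY_URL = "http://proxy.example.com:3128"

# Route both plain and TLS traffic through the proxy. Requests made via this
# opener (e.g. to an S3 HTTPS endpoint) are tunneled through PROXY_URL.
proxy_handler = urllib.request.ProxyHandler({
    "http": PROXY_URL,
    "https": PROXY_URL,
})
opener = urllib.request.build_opener(proxy_handler)
```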
...
37 | The Run Workbook Button Is Displayed but Doesn't Work in the Info Dialog for Users with No Permissions | DAP-37486 | |
---|---|---|---|
The Run Workbook button is no longer present in the info dialog for users without permissions. | Bug fixed. |
7.1.6.1 Release Notes
Improvements
1 | Upgrade Parquet to 1.8.3 | DAP-37498 |
---|---|---|
Datameer has upgraded to the more stable Parquet 1.8.3. |
Bug Fixes
1 | Copied Data-driven Workbooks Cause a Job Scheduler Issue When Triggered Together with the Original Artifact | DAP-37332 | |
---|---|---|---|
Copied workbooks no longer cause Job Scheduler issues. If any problems arise, clear warnings are shown about the needed configuration changes. | Bug fixed. |
1 | Data Driven Jobs Aren't Triggered If a Dependency Triggers a Permissions Exception | DAP-37605 | |
---|---|---|---|
The scheduling and triggering of subsequent jobs is now working as expected. This update continues to ensure that sharing and permissions security is respected. | Bug fixed. |
7.1.6 Release Notes
Improvements
1 | Add Support for MapR 6.0.1 along MEP 5.0 | DAP-36793 |
---|---|---|
Added support for MapR 6.0.1 along with MEP 5.0. |
...
19 | A 500 Error on Deinstallation of Plug-ins Isn't Handled By the UI | DAP-37027 | |
---|---|---|---|
Errors are now handled by the UI code and a message is shown to the user. | Bug fixed. |
7.1.5 Release Notes
Improvements
1 | Added Origin and Record Count Statistics to workbook-preview-service.log File | DAP-36779 |
---|---|---|
Origin and record count statistics have been added to the workbook-preview-service.log file. |
...
29 | Undo Is Disabled When a User Adds a SQL Sheet into a Workbook | DAP-36786 | |
---|---|---|---|
Undo is now enabled when users add a SQL sheet into a workbook. | Bug fixed. |
7.1.4 Release Notes
Improvements
1 | Don't Show Any Label in Infographic Table Widgets for Data with Null Values | DAP-36351 |
---|---|---|
Null cells are displayed as blank in tables and pivot tables. This includes empty date cells displaying as empty instead of showing "Invalid Date". |
...
27 | Computing an Expression from a Datameer Function Is Leading to Bad Performance | DAP-36706 | |
---|---|---|---|
Computations are now much faster. | Bug fixed. |
7.1.3 Release Notes
Improvements
1 | Support Snowflake Import | DAP-35693 |
---|---|---|
The Snowflake connector supports importing data. |
...
53 | Workbook Won't Open or Run after Upgrade | DAP-35783 | |
---|---|---|---|
Workbooks no longer fail to open or run with an error after upgrading from 5.11.30 through 6.1.x to 6.3.5. | Bug fixed. |
7.1.2 Release Notes
Epics
1 | Hive 2.1 Hortonworks Support | DAP-32623 |
---|---|---|
Make Hive 2.1 Default for HDP-2.6.X Distributions from V7.1 Onwards | DAP-34900 | |
Enhance Existing Distribution Support for HDP 2.6 to Have Hive 2.1.x | DAP-34076 | |
Packaging of Hive Plug-in Should Select Proper Java Class Automatically | DAP-34866 | |
Update Hives2Tasks.createDefaultView() to Work with Hive 2 | DAP-35183 |
2 | Export into String Partitioned Hive Table and Storing as Parquet | DAP-34136 |
---|---|---|
Replace Export Partitioned Temp Table Approach by Writing Data Directly into the HDFS Partition Locations | DAP-35544 | |
Use Existing Hive Type Specific ObjectInspectors to Write Data for Existing Table | DAP-35097 | |
Enable Export of Datameer Column into Hive Partitioned Table (Including Mapping) | DAP-34106 | |
Enable Preventing Schema Mismatch While Export into Hive Table | DAP-34119 | |
Enable Support for Different Parquet Versions on Export into Hive Table (Minimum CDH 5.11 and Parquet 2.1) | DAP-34120 | |
Support Custom Output Format (E.g. Parquet) When Exporting to an Existing Hive Table | DAP-34461 | |
Enable Type Mapping of Datameer Integer to Hive Bigint | DAP-35967 | |
Add Limitation for RecordWriter Cache to Prevent OOM Error | DAP-36001 |
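"Write data directly into the HDFS partition locations" (DAP-35544) relies on Hive's partition-directory layout, where each partition column/value pair becomes a `col=value` path segment. A minimal sketch of that layout, with a hypothetical helper name and percent-escaping standing in for Hive's own path escaping:

```python
import urllib.parse

def hive_partition_path(table_location, partition_values):
    """Build the HDFS directory for one Hive partition.

    partition_values is an ordered list of (column, value) pairs; values are
    percent-escaped here, similar in spirit to how Hive escapes special
    characters in partition paths.
    """
    segments = [
        f"{col}={urllib.parse.quote(str(val), safe='')}"
        for col, val in partition_values
    ]
    return "/".join([table_location.rstrip("/")] + segments)
```

For example, a table at `/warehouse/sales` partitioned by `country` and `ds` stores a partition's Parquet files under `/warehouse/sales/country=US/ds=2018-01-01`.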
3 | Implement New S3 Export Adapter | DAP-34485 |
---|---|---|
Export to Redshift via S3 Exporter and RR Copy | DAP-34887 | |
Support Single Stream Multipart Uploading with CSV File Format | DAP-34486 | |
Support Avro File Format | DAP-34886 | |
Support IAM Role | DAP-34888 | |
Support Encrypted Buckets | DAP-34889 | |
Extract Utility Method to Create Amazon S3 | DAP-35228 | |
Make Plugin-S3 Available in Datameer 6.4, 7.0, And 7.1 | DAP-35326 | |
Provide Infrastructure That Jobflow Can Deal with Server Side Operations | DAP-35863 | |
Provide a DatameerJobLifeCycleListener | DAP-36021 |
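The single-stream multipart uploading mentioned above (DAP-34486) follows S3's multipart rules: every part except the last must be at least 5 MiB, and an upload may have at most 10,000 parts. As an illustration of the part planning involved, here is a sketch with a hypothetical helper, not the plugin's actual code:

```python
MIN_PART_SIZE = 5 * 1024 * 1024  # S3 minimum for all parts but the last
MAX_PARTS = 10_000               # S3 maximum parts per multipart upload

def plan_parts(total_size, part_size=MIN_PART_SIZE):
    """Return (part_number, offset, length) tuples for one multipart upload."""
    if part_size < MIN_PART_SIZE:
        raise ValueError("part size below the S3 minimum of 5 MiB")
    parts = []
    offset = 0
    number = 1
    while offset < total_size:
        length = min(part_size, total_size - offset)
        parts.append((number, offset, length))
        offset += length
        number += 1
    if len(parts) > MAX_PARTS:
        raise ValueError("too many parts; increase the part size")
    return parts
```

With the default part size, a 12 MiB stream is uploaded as two 5 MiB parts plus one 2 MiB final part.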
Improvements
1 | Introduce the DatabaseCredentials Class That Holds All Properties Needed to Establish a Database Connection in Snowflake | DAP-36030 |
---|---|---|
This is used for the import case so that the different database types can configure this object as needed. |
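As a rough sketch of such a credentials holder, the following dataclass groups the connection properties and lets each database type derive its own connection URL; the field names and URL format are assumptions for illustration, not Datameer's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class DatabaseCredentials:
    """Hypothetical holder for the properties needed to open a connection."""
    host: str
    port: int
    database: str
    user: str
    password: str

    def jdbc_url(self, driver="snowflake"):
        """Build a JDBC-style URL; each database type can override the
        scheme and append its own extra properties as needed."""
        return f"jdbc:{driver}://{self.host}:{self.port}/{self.database}"
```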
...
57 | Downloading Data via REST in Native Multi User Mode Doesn't Return Data, Only the Column Names | DAP-36283 | |
---|---|---|---|
Downloading data via REST now returns the data along with the column names. | Bug fixed. |
7.1.1 Release Notes
Improvements
1 | Support Added for MapR 6.0 | DAP-34743 |
---|---|---|
Datameer works in unsecured/secured MapR 6.0 environments. |
...