Important API and SDK changes for developers by version.
7.0
Deprecated REST API calls
The following REST API calls have been deprecated as of Datameer v7.0 and are scheduled to be removed before Datameer v8.0.
- Read Workbook
- Replaced with Read Workbook v2
- Create Workbook
- Replaced with Create Workbook v2
- Update Workbook
- Replaced with Update Workbook v2
New job execution events for the SDK
Datameer introduced four new events related to job execution (e.g., running a data source or workbook); a listener sketch follows the list below.
- A job started event is published on the event bus with a job execution id, data directory, and the job metadata when a job starts (the job status changes to RUNNING).
- The metadata of a job is determined when the job switches from QUEUED to RUNNING; if a user makes changes during this time, those changes are picked up when the job switches to RUNNING.
- The SDK type is called JobExecutionStartedEvent.
- A job canceled event is published on the event bus with a job execution id, data directory, and owner when a job is canceled.
- The SDK type is called JobExecutionCanceledEvent.
- A job completed event is published on the event bus with a job execution id, data directory, and owner when a job completes successfully (with data).
- The SDK type is called JobExecutionCompletedEvent.
- A job failed event is published on the event bus with a job execution id, data directory, and owner when a job fails during execution.
- The SDK type is called JobExecutionFailedEvent.
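A plug-in that needs to react to these lifecycle events can register a listener on the event bus. The following is a minimal, self-contained sketch of that pattern using a Guava-style EventBus with placeholder event classes; only the four event type names above come from this section, while the packages, the registration mechanism, and the placeholder class shapes are assumptions to be checked against the actual SDK javadoc.

```java
import com.google.common.eventbus.EventBus;
import com.google.common.eventbus.Subscribe;

// Placeholder event classes standing in for the SDK types named above
// (JobExecutionStartedEvent, JobExecutionCanceledEvent,
// JobExecutionCompletedEvent, JobExecutionFailedEvent). The real classes
// live in the SDK and carry more fields (data directory, owner, metadata).
class StartedEvent   { final long jobExecutionId; StartedEvent(long id)   { this.jobExecutionId = id; } }
class CanceledEvent  { final long jobExecutionId; CanceledEvent(long id)  { this.jobExecutionId = id; } }
class CompletedEvent { final long jobExecutionId; CompletedEvent(long id) { this.jobExecutionId = id; } }
class FailedEvent    { final long jobExecutionId; FailedEvent(long id)    { this.jobExecutionId = id; } }

public class JobLifecycleListener {

    // One handler per lifecycle event; with a Guava-style bus, the parameter
    // type decides which events a method receives.
    @Subscribe public void onStarted(StartedEvent e)     { log("started", e.jobExecutionId); }
    @Subscribe public void onCanceled(CanceledEvent e)   { log("canceled", e.jobExecutionId); }
    @Subscribe public void onCompleted(CompletedEvent e) { log("completed", e.jobExecutionId); }
    @Subscribe public void onFailed(FailedEvent e)       { log("failed", e.jobExecutionId); }

    private void log(String outcome, long jobExecutionId) {
        System.out.println("Job execution " + jobExecutionId + " " + outcome);
    }

    public static void main(String[] args) {
        EventBus bus = new EventBus();
        bus.register(new JobLifecycleListener());
        bus.post(new StartedEvent(42L));   // -> "Job execution 42 started"
        bus.post(new CompletedEvent(42L)); // -> "Job execution 42 completed"
    }
}
```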
Details about the JobExecutionStartedEvent
- The executing user of a job is the user under which the job runs; this is always the job owner.
- The triggering user of a job is the user who performed the job start action. Who this is depends on how the job was started:
- If the job is triggered manually, the logged-in user is the triggering user.
- If the job is triggered by the scheduler, the job owner is also the triggering user.
- The JobExecutionStartedEvent published on the event bus consists of the following fields (modeled in the sketch after this list):
- dataDir with a URI where the data resides in HDFS.
- executingAs which contains information about the executing user (the running user).
- jobExecutionId which is the ID of the job execution.
- jobMetaData which contains the data about the job (e.g., the sheets and columns of a workbook).
- triggeredBy which is either USER, SCHEDULER, RESTAPI, IMPORTJOB, EXPORTJOB, or WORKBOOK.
- performedBy which contains information about the triggering user.
- Depending on the cluster mode, the executing user of a job may be used to authenticate against the cluster/HDFS.
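To make the payload concrete, the sketch below models the documented fields as a plain Java value class. It is not the SDK's class: the field types, the enum name, and the overall shape are assumptions; only the field names (dataDir, executingAs, jobExecutionId, jobMetaData, triggeredBy, performedBy) and the triggeredBy values come from the list above.

```java
import java.net.URI;

// Illustrative model of the JobExecutionStartedEvent payload described above.
// Field names and triggeredBy values come from this section; types and the
// enum name are assumptions, not the SDK API.
public class JobExecutionStartedEventModel {

    // Possible values of triggeredBy, as listed above.
    public enum Trigger { USER, SCHEDULER, RESTAPI, IMPORTJOB, EXPORTJOB, WORKBOOK }

    public final URI dataDir;         // where the data resides in HDFS
    public final String executingAs;  // the executing (running) user, i.e. the job owner
    public final long jobExecutionId; // ID of this job execution
    public final String jobMetaData;  // job metadata, e.g. the sheets and columns of a workbook
    public final Trigger triggeredBy; // how the job was started
    public final String performedBy;  // the triggering user

    public JobExecutionStartedEventModel(URI dataDir, String executingAs, long jobExecutionId,
                                         String jobMetaData, Trigger triggeredBy, String performedBy) {
        this.dataDir = dataDir;
        this.executingAs = executingAs;
        this.jobExecutionId = jobExecutionId;
        this.jobMetaData = jobMetaData;
        this.triggeredBy = triggeredBy;
        this.performedBy = performedBy;
    }
}
```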
Downloading data event for the SDK
- An event is published on the event bus when a user downloads data from a worksheet or a data source on the "Browse All Data" page.
- The event consists of the following (see the sketch after this list):
- The file UUID from the workbook or data source.
- A list of sheet data which contains one or more SheetData instances.
- Multiple SheetData instances are present when the import job is in append mode or partitioned.
- Multiple SheetData instances are present when the file upload is partitioned.
- Multiple SheetData instances are present when the workbook sheet is partitioned.
- Otherwise, only one SheetData instance is present.
- The user performing the download.
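A handler for this download event could look roughly like the sketch below, for example to build an audit trail of downloads. The event type name is not given in this section, so DataDownloadedEvent, SheetData's shape, and all field names are hypothetical placeholders; only the payload described above (file UUID, a list of SheetData, and the downloading user) is taken from this section.

```java
import java.util.List;
import java.util.UUID;

// Hypothetical stand-ins for the SDK types; the real names and shapes may differ.
class SheetData {
    final String sheetName;
    SheetData(String sheetName) { this.sheetName = sheetName; }
}

class DataDownloadedEvent {
    final UUID fileUuid;             // UUID of the workbook or data source file
    final List<SheetData> sheetData; // one instance, or several if appended/partitioned
    final String downloadedBy;       // the user performing the download
    DataDownloadedEvent(UUID fileUuid, List<SheetData> sheetData, String downloadedBy) {
        this.fileUuid = fileUuid;
        this.sheetData = sheetData;
        this.downloadedBy = downloadedBy;
    }
}

public class DownloadAuditLogger {

    // A single SheetData instance is the common case; append mode or partitioning
    // (import job, file upload, or workbook sheet) produces several instances.
    public void onDownload(DataDownloadedEvent event) {
        System.out.println(event.downloadedBy + " downloaded file " + event.fileUuid);
        for (SheetData sheet : event.sheetData) {
            System.out.println("  includes sheet data: " + sheet.sheetName);
        }
    }

    public static void main(String[] args) {
        DataDownloadedEvent event = new DataDownloadedEvent(
                UUID.randomUUID(),
                List.of(new SheetData("partition-2024-01"), new SheetData("partition-2024-02")),
                "analyst");
        new DownloadAuditLogger().onDownload(event);
    }
}
```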