Release 3.16.0 (50) - SQL Editor polishing (46.34.0 // 1.176.0)

Modified on Thu, 30 Jul 2020 at 09:58 AM

Features

The following lists all features that are (at least partly) included in this release. A feature is considered finished if no open parts (stories) remain; otherwise it is marked as ongoing.

Finished Features


Basic SQL editor functionality
Goal
Inside the SQL editor in Apps, users want basic SQL editor functionality, e.g. executing a selected/marked line of code.
Finished parts in the release
XLSX export
The result table (or only the selected/marked part of it) can be exported to an XLSX file; the SQL script is provided on a second worksheet.
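As a rough illustration of this export format (not the product's actual implementation), the sketch below writes a result table to an XLSX file with the SQL script on a second worksheet. It assumes pandas with an installed Excel writer backend; `result_df`, `sql_script` and the sheet names are placeholders.

```python
import pandas as pd

def export_result_to_xlsx(result_df: pd.DataFrame, sql_script: str, path: str = "result.xlsx") -> None:
    # Requires an Excel writer backend such as openpyxl or xlsxwriter.
    with pd.ExcelWriter(path) as writer:
        # First worksheet: the (possibly selected/marked) result table.
        result_df.to_excel(writer, sheet_name="Result", index=False)
        # Second worksheet: the SQL script that produced the result.
        pd.DataFrame({"SQL script": [sql_script]}).to_excel(writer, sheet_name="SQL", index=False)
```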


Bugs and TR
Goal
Collection of bugs and technical refinements.
Finished parts in the release
Enable SQL execution from the editor for database types other than Oracle/Impala/Postgres
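For illustration only, a generic execution path like the following keeps SQL execution independent of a specific database type. This is a hedged sketch using SQLAlchemy rather than the product's internal API; `connection_url` stands for whatever connection details the editor holds.

```python
from sqlalchemy import create_engine, text

def run_statement(connection_url: str, sql: str):
    # Works for any database type with an installed SQLAlchemy dialect/driver,
    # e.g. MySQL or MS SQL Server in addition to Oracle/Impala/Postgres.
    engine = create_engine(connection_url)
    with engine.connect() as conn:
        result = conn.execute(text(sql))
        return result.fetchall() if result.returns_rows else None
```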


Basic Cassandra datatables with primary keys
Goal
  • Create Cassandra connections
  • Create datatables from tables in a Cassandra connection by browsing the available tables
  • Read from and write to Cassandra datatables

Finished parts in the release

Support a custom connection type for Cassandra
  • Cassandra connections can be created with basic authentication (see the sketch below)
  • Datatables can be created from Cassandra connections and used as FRTs (Spark-backed) and within datatable save and load processors
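The sketch below shows what a Cassandra connection with basic authentication looks like at the driver level, using the DataStax Python driver; host, keyspace, credentials and the query are placeholders, and the product wraps this in its own connection framework.

```python
from cassandra.cluster import Cluster
from cassandra.auth import PlainTextAuthProvider

# Basic (username/password) authentication against a Cassandra cluster.
auth = PlainTextAuthProvider(username="app_user", password="secret")
cluster = Cluster(["cassandra-host"], port=9042, auth_provider=auth)
session = cluster.connect("my_keyspace")

# Read a few rows from a table that is exposed as a datatable.
for row in session.execute("SELECT * FROM customers LIMIT 10"):
    print(row)

cluster.shutdown()
```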

Support selection of primary keys for the Cassandra datatable save
  • Minimal: when configuring a datatable save for Cassandra, the primary keys can be selected and are applied (see the sketch below)
  • Optimal: the primary keys are stored in the datatable's metadata
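As a hypothetical illustration of the "minimal" variant, the helper below applies a user-selected primary key when the target Cassandra table is created for a datatable save. The function, table and column names are made up for this example and do not reflect the product's API.

```python
def create_table_with_primary_keys(session, keyspace, table, columns, primary_keys):
    # columns: mapping of column name -> CQL type
    # primary_keys: ordered list of column names forming the PRIMARY KEY
    cols = ", ".join(f"{name} {ctype}" for name, ctype in columns.items())
    pk = ", ".join(primary_keys)
    session.execute(
        f"CREATE TABLE IF NOT EXISTS {keyspace}.{table} ({cols}, PRIMARY KEY ({pk}))"
    )

# Usage with a session obtained as in the previous sketch:
# create_table_with_primary_keys(
#     session, "my_keyspace", "orders",
#     {"order_id": "uuid", "customer_id": "uuid", "amount": "decimal"},
#     ["order_id"],
# )
```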


Ongoing Features


Convenience features around connection & datatable management


Finalize UI for Debug-Mode
Goal

Most important points to be improved:

  • While executing a workflow in break-point mode, we want to visualize the exact flow of data in the workflow. The order of execution does not matter here.
  • Color edges after execution based on WARNING, FAILED, and SUCCESS status.
  • Show a debug icon on the ports that contain debug information.
  • The error icon in the workflow should change to a new icon.
  • data-debug-info should be removed and spark-debug-info should be moved to the new processor tab.

Finished parts in the release

While executing the workflow, only the executed sub-path should be dashed
While executing the workflow in partial mode, only the edges involved in the execution should be dashed. The order of execution does not matter here.
Debug results in a new processor tab
The debug results for each processor are displayed in a separate tab when the corresponding processor is opened.
Highlighting of WF paths depending on debug mode
Depending on the selected debug mode, the WF edges are highlighted differently (see the sketch below).
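For illustration, the highlighting rule can be thought of as a simple mapping from a processor's execution status to an edge color. The status names follow the list above, while the colors and the fallback for non-executed edges are placeholders, not the product's actual styling.

```python
# Hypothetical status-to-color mapping for workflow edge highlighting.
EDGE_COLORS = {
    "SUCCESS": "#2e7d32",  # green
    "WARNING": "#f9a825",  # amber
    "FAILED": "#c62828",   # red
}

def edge_color(status: str) -> str:
    # Edges that were not executed keep a neutral gray.
    return EDGE_COLORS.get(status, "#9e9e9e")
```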
