Content
- Intro
- DateTime-Variable in ONE DATA
- Data Type Conversion-Processor
- Modifiers
- DateTime Variables in Spark SQL
Intro
To work properly with DateTimes, ONE DATA provides an integrated DateTime variable type. It lets you store values in the format that best fits your use case and display them in different ways using modifiers. The table below lists all DateTime formats currently available in ONE DATA:
| Format | Example Value |
|---|---|
| SPARK_TIMESTAMP (default) | 2020-05-15 11:34:36 |
| STRING | 05/11/2020 06:42:39.8 |
| LONG | 1590133554 (milliseconds since 01.01.1970) |
| ORACLE_TIMESTAMP | TO_TIMESTAMP('2020-05-11 06:42:39','YYYY-MM-DD HH24:MI:SS') |
| ORACLE_DATE | TO_DATE('2020-05-11 06:42:39','YYYY-MM-DD HH24:MI:SS') |
DateTime-Variable in ONE DATA
In the Variable Manager of a workflow, you can create a new DateTime variable and select your desired format.
Then you can define its value in the Variable Manager using the date and time pickers that appear when you click on the respective fields.
Data Type Conversion-Processor
When you use a Data Type Conversion Processor to convert a string to DateTime, the result is returned in SPARK_TIMESTAMP format.
This means that in order to compare DateTimes obtained via a workflow variable with DateTimes in a dataset, you need to convert one of them so that both sides use the same format, as sketched below.
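For illustration, here is a minimal Spark SQL sketch of such a comparison. The table name inputTable, the column order_ts, its pattern, and the variable cutoffDate are assumptions for this example, not fixed names:
-- Assumed: order_ts is a string column stored as 'yyyy-MM-dd HH:mm:ss',
-- cutoffDate is a workflow variable of format SPARK_TIMESTAMP
SELECT *
FROM inputTable
WHERE to_timestamp(order_ts, 'yyyy-MM-dd HH:mm:ss') >= '@cutoffDate@'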
Modifiers
Within the query config elements of processors (e.g. the Query Processor), the display pattern of DateTime variables can be changed using modifiers. Variables are referenced in queries with the syntax @variableName@. To apply a modifier, put it in square brackets directly after the variable name, inside the surrounding @ signs. Some examples are shown below:
-- Modifier
@varName[MODIFIER]@
-- Modifier and string pattern
@varName[MODIFIER|yyyy-MM-dd]@
-- Example
@varName[SPARK_TIMESTAMP|yyyy-MM-dd]@
Note that the given pattern needs to follow the requirements of the Java DateTime.toString() method.
Also, not every pattern can be applied to the types 'ORACLE_DATE' and 'ORACLE_TIMESTAMP', because Oracle uses slightly different format patterns in some cases.
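As a complete query sketch, a modifier can be combined with a pattern to control how a variable is rendered. The variable name reportDate and the alias report_day are placeholders for this example:
-- Render the variable with the SPARK_TIMESTAMP modifier and a date-only pattern
SELECT '@reportDate[SPARK_TIMESTAMP|yyyy-MM-dd]@' AS report_day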
DateTime Variables in Spark SQL
When you want to use a DateTime variable in Spark SQL, for example in a Query Processor, there are some useful conversions and a few things to note that will make working with DateTime variables easier.
Syntax
A crucial point: all variables in the formats 'SPARK_TIMESTAMP' and 'STRING' need to be put in single quotes (apostrophes) when referenced within a query. Here's an example:
SELECT '@stringDateTimeVariable@'
Note that this also applies, for example, to a variable of format 'LONG' that is converted to 'STRING' with a modifier. The expression also needs to be wrapped in single quotes, like this: '@longDateTime[STRING]@'
This is not necessary inside the Database Connection Load Processor when targeting an Oracle database, because Oracle variables are replaced with functions that the database can interpret. For LONG it depends on how the variable is used.
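To make the quoting rule concrete, here is a sketch of a filter query. The table name inputTable, the column event_time, and the variable startDateTime are placeholders, and the variable is assumed to be of format SPARK_TIMESTAMP:
-- The variable is substituted as text, so it must sit inside single quotes
-- to form a valid timestamp literal in Spark SQL
SELECT *
FROM inputTable
WHERE event_time >= '@startDateTime@'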
Conversions and Manipulation
It is possible to manipulate the value of a DateTime variable within a query without altering its actual value. This only works for the format 'SPARK_TIMESTAMP', so if your variable has another type, you have to combine it with a modifier.
-- Subtract three hours from a DateTime variable of format SPARK_TIMESTAMP
SELECT '@dt_var@' - INTERVAL 3 HOURS AS exampleValue
-- Subtract three hours from a DateTime variable of any other format type
SELECT '@dt_var[SPARK_TIMESTAMP]@' - INTERVAL 3 HOURS AS exampleValue
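The same interval arithmetic can also be used in a filter, for example to keep only rows that fall within the three hours before the variable's value. Again, inputTable, event_time and dt_var are placeholder names for this sketch:
-- Keep only rows from the three hours preceding the variable's value
SELECT *
FROM inputTable
WHERE event_time BETWEEN '@dt_var@' - INTERVAL 3 HOURS AND '@dt_var@'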