[jira] [Created] (FLINK-13699) Fix TableFactory doesn't work with DDL when containing TIMESTAMP/DATE/TIME types

Shang Yuanchun (Jira)
Jark Wu created FLINK-13699:
-------------------------------

             Summary: Fix TableFactory doesn't work with DDL when containing TIMESTAMP/DATE/TIME types
                 Key: FLINK-13699
                 URL: https://issues.apache.org/jira/browse/FLINK-13699
             Project: Flink
          Issue Type: Bug
          Components: Table SQL / API, Table SQL / Planner
    Affects Versions: 1.9.0
            Reporter: Jark Wu
            Assignee: Jark Wu


Currently, in the Blink planner, we convert DDL to a {{TableSchema}} using the new type system, i.e. DataTypes.TIMESTAMP()/DATE()/TIME(), whose underlying TypeInformation is Types.LOCAL_DATETIME/LOCAL_DATE/LOCAL_TIME.

However, this breaks the existing connector implementations (Kafka, ES, CSV, etc.), because they only accept the old TypeInformation (Types.SQL_TIMESTAMP/SQL_DATE/SQL_TIME).

A simple solution is to encode DataTypes.TIMESTAMP() as "TIMESTAMP" when translating it to properties; when the properties are read back, "TIMESTAMP" will be converted to the old TypeInformation, Types.SQL_TIMESTAMP. This would fix all factories at once.
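The proposed mapping can be sketched roughly as follows. This is a minimal illustration only, not Flink's actual implementation: the class and method names ({{LegacyTypeMapping}}, {{toProperty}}, {{toLegacyTypeInformation}}) are hypothetical, and the types are represented as plain strings rather than real DataType/TypeInformation objects.

```java
import java.util.Map;

// Hypothetical sketch of the fix: encode the new data types as plain
// type strings in the connector properties, and decode those strings
// back to the legacy TypeInformation that existing factories expect.
public class LegacyTypeMapping {

    // new data type -> property string written during translation
    static final Map<String, String> ENCODE = Map.of(
        "TIMESTAMP(3)", "TIMESTAMP",
        "DATE", "DATE",
        "TIME(0)", "TIME");

    // property string -> legacy TypeInformation restored when reading back
    static final Map<String, String> DECODE = Map.of(
        "TIMESTAMP", "Types.SQL_TIMESTAMP",
        "DATE", "Types.SQL_DATE",
        "TIME", "Types.SQL_TIME");

    static String toProperty(String dataType) {
        return ENCODE.getOrDefault(dataType, dataType);
    }

    static String toLegacyTypeInformation(String property) {
        return DECODE.getOrDefault(property, property);
    }

    public static void main(String[] args) {
        // round trip: new type -> property string -> legacy TypeInformation
        System.out.println(
            toLegacyTypeInformation(toProperty("TIMESTAMP(3)")));
    }
}
```

Because the conversion happens centrally in the property translation layer, none of the individual connector factories would need to change.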




--
This message was sent by Atlassian JIRA
(v7.6.14#76016)