Leonard Xu created FLINK-14549:
----------------------------------
Summary: Use logicalType rather than conversionClass to give more detail in exception messages
Key: FLINK-14549
URL: https://issues.apache.org/jira/browse/FLINK-14549
Project: Flink
Issue Type: Bug
Components: Table SQL / Planner
Affects Versions: 1.9.1
Reporter: Leonard Xu
Fix For: 1.10.0
We use the DataType's conversionClass name when validating the query result's field types against the registered sink table schema. This is not precise when the DataType carries type parameters, such as DECIMAL(p, s) or TIMESTAMP(p): the parameters are lost, so in the exception below the query result schema and the TableSink schema print identically even though validation fails.
Exception in thread "main" org.apache.flink.table.api.ValidationException: Field types of query result and registered TableSink `default_catalog`.`default_database`.`q2_sinkTable` do not match.
Query result schema: [d_week_seq1: Long, EXPR$1: BigDecimal, EXPR$2: BigDecimal, EXPR$3: BigDecimal]
TableSink schema:    [d_week_seq1: Long, EXPR$1: BigDecimal, EXPR$2: BigDecimal, EXPR$3: BigDecimal]
	at org.apache.flink.table.planner.sinks.TableSinkUtils$.validateSink(TableSinkUtils.scala:68)
	at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$2.apply(PlannerBase.scala:179)
	at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$2.apply(PlannerBase.scala:178)
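A minimal sketch, using only the public DataTypes API rather than the planner's validation code, to illustrate the difference: the conversion class collapses to a plain Java class name, while the logical type keeps the precision/scale that the exception message should show. The class name LogicalTypeDetail is just for illustration.

    import org.apache.flink.table.api.DataTypes;
    import org.apache.flink.table.types.DataType;

    public class LogicalTypeDetail {
        public static void main(String[] args) {
            DataType decimal = DataTypes.DECIMAL(10, 2);
            DataType timestamp = DataTypes.TIMESTAMP(3);

            // conversionClass drops the type parameters: only the Java class name remains
            System.out.println(decimal.getConversionClass().getSimpleName());   // BigDecimal
            System.out.println(timestamp.getConversionClass().getSimpleName()); // LocalDateTime

            // logicalType keeps precision/scale, so mismatches stay visible
            System.out.println(decimal.getLogicalType().asSummaryString());     // DECIMAL(10, 2)
            System.out.println(timestamp.getLogicalType().asSummaryString());   // TIMESTAMP(3)
        }
    }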
--
This message was sent by Atlassian Jira
(v8.3.4#803005)