[jira] [Created] (FLINK-15158) Why is integer converted to BigDecimal for format-json when Kafka is used

Shang Yuanchun (Jira)
hehuiyuan created FLINK-15158:
---------------------------------

             Summary: Why is integer converted to BigDecimal for format-json when Kafka is used
                 Key: FLINK-15158
                 URL: https://issues.apache.org/jira/browse/FLINK-15158
             Project: Flink
          Issue Type: Wish
          Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile)
            Reporter: hehuiyuan


For example, I have a table `table1`:

root
 |-- name: STRING
 |-- age: INT
 |-- sex: STRING

 

Then I want to `insert into kafka select * from table1`:

jsonSchema:

{ "type": "object", "properties": { "name": { "type": "string" }, "age": { "type": "integer" }, "sex": { "type": "string" } } }

 

```
descriptor.withFormat(new Json().jsonSchema(jsonSchema)).withSchema(schema);
```
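
For context, the DECIMAL appears to come from the JSON-schema-to-type conversion behind `Json#jsonSchema(...)`. A minimal sketch, assuming Flink 1.9.x with the flink-json module on the classpath (the class name `JsonSchemaTypeCheck` is just for illustration), that prints the row type derived from the schema above:

```
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.formats.json.JsonRowSchemaConverter;
import org.apache.flink.types.Row;

public class JsonSchemaTypeCheck {
    public static void main(String[] args) {
        // Same JSON schema as above, written as a Java string literal.
        String jsonSchema =
            "{\"type\":\"object\",\"properties\":{"
                + "\"name\":{\"type\":\"string\"},"
                + "\"age\":{\"type\":\"integer\"},"
                + "\"sex\":{\"type\":\"string\"}}}";

        // Json#jsonSchema(...) is translated by this converter; the JSON-schema
        // "integer" type comes back as BigDecimal rather than Integer.
        TypeInformation<Row> rowType = JsonRowSchemaConverter.convert(jsonSchema);
        System.out.println(rowType);
        // Expected output along the lines of:
        // Row(name: String, age: BigDecimal, sex: String)
    }
}
```

That derived row type is what the sink registration ends up with, which is why the TableSink schema below reports `age: BigDecimal`, presumably because a JSON schema does not bound the precision of numeric values.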

 

Exception in thread "main" org.apache.flink.table.api.ValidationException: Field types of query result and registered TableSink [sink_example2] do not match.

*Query result schema: [name: String, age: Integer, sex: String]*

*TableSink schema:    [name: String, age: BigDecimal, sex: String]*

    at org.apache.flink.table.sinks.TableSinkUtils$.validateSink(TableSinkUtils.scala:65)
    at org.apache.flink.table.planner.StreamPlanner$$anonfun$2.apply(StreamPlanner.scala:156)
    at org.apache.flink.table.planner.StreamPlanner$$anonfun$2.apply(StreamPlanner.scala:155)
    at scala.Option.map(Option.scala:146)

 

I know that the `integer` type in the JSON schema is converted to BigDecimal. But for the above scenario, does it have to be forced to DECIMAL?
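
If the goal is only to keep `age` as INT end to end, one possible workaround (a sketch under the Flink 1.9 descriptor API, not a fix for this ticket) is to let the JSON format derive its schema from the table schema instead of passing a JSON schema string. The topic name and Kafka properties below are placeholders; `sink_example2` matches the sink name from the exception:

```
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.table.descriptors.Json;
import org.apache.flink.table.descriptors.Kafka;
import org.apache.flink.table.descriptors.Schema;

public class DeriveSchemaSinkExample {

    // Registers a Kafka JSON sink whose format schema mirrors the table schema,
    // so the INT column is not widened to BigDecimal by a JSON schema string.
    public static void registerSink(StreamTableEnvironment tableEnv) {
        tableEnv.connect(
                new Kafka()
                    .version("universal")
                    .topic("example-topic")                        // placeholder topic
                    .property("bootstrap.servers", "localhost:9092"))
            .withFormat(new Json().deriveSchema())                 // no JSON schema string
            .withSchema(
                new Schema()
                    .field("name", Types.STRING)
                    .field("age", Types.INT)                       // stays INT
                    .field("sex", Types.STRING))
            .inAppendMode()
            .registerTableSink("sink_example2");
    }
}
```

With the format schema derived this way, `tableEnv.sqlUpdate("INSERT INTO sink_example2 SELECT * FROM table1")` should pass validation, since both the query result and the sink report `age: Integer`.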

 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)