Jun Zhang created FLINK-16646:
---------------------------------
Summary: Flink throws a NullPointerException when reading ORC files
Key: FLINK-16646
URL: https://issues.apache.org/jira/browse/FLINK-16646
Project: Flink
Issue Type: Bug
Components: Connectors / Hive
Affects Versions: 1.10.0
Reporter: Jun Zhang
Fix For: 1.11.0
When I use OrcRowInputFormat to read multiple ORC files, the system throws a NullPointerException.
The code is like this:
{code:java}
// Reproduction: read ORC files from a directory with OrcRowInputFormat and write them out as text.
StreamExecutionEnvironment environment = StreamExecutionEnvironment.getExecutionEnvironment();
environment.setParallelism(1);

String path = "file://tmp/dir";
// ORC schema string (elided in the original report)
String schema = ..... ;
OrcRowInputFormat orcRowInputFormat = new OrcRowInputFormat(
        path,
        schema,
        new org.apache.hadoop.conf.Configuration());

DataStream<Row> dataStream = environment.createInput(orcRowInputFormat);
dataStream.writeAsText("file:///tmp/aaa", FileSystem.WriteMode.OVERWRITE);
environment.execute();
{code}
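A possible workaround (based on the untested assumption that the NPE below comes from an unset field projection on the input format) is to select the fields to read explicitly before creating the input. The indices here are placeholders for real columns of the schema above:

{code:java}
// Possible workaround sketch (assumption, not confirmed by the report): explicitly project
// the ORC columns to read so the input format does not rely on an unset projection.
// The indices 0, 1, 2 are placeholders for actual columns of the schema.
orcRowInputFormat.selectFields(0, 1, 2);

DataStream<Row> dataStream = environment.createInput(orcRowInputFormat);
{code}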
The exception is:
{code:java}
Caused by: java.lang.NullPointerException
	at org.apache.flink.orc.shim.OrcShimV200.computeProjectionMask(OrcShimV200.java:188)
	at org.apache.flink.orc.shim.OrcShimV200.createRecordReader(OrcShimV200.java:120)
	at org.apache.flink.orc.OrcSplitReader.<init>(OrcSplitReader.java:73)
	at org.apache.flink.orc.OrcRowSplitReader.<init>(OrcRowSplitReader.java:50)
	at org.apache.flink.orc.OrcRowInputFormat.open(OrcRowInputFormat.java:102)
	at org.apache.flink.streaming.api.functions.source.ContinuousFileReaderOperator$SplitReader.run(ContinuousFileReaderOperator.java:315)
{code}
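For context, the trace points at computeProjectionMask in OrcShimV200. The snippet below is only an illustrative sketch of that kind of logic, not the Flink source: building a column include-mask by iterating a selected-field array fails with exactly this NPE when the array is null, and a null guard that falls back to reading all columns would avoid it.

{code:java}
// Illustrative sketch only -- names and behaviour are assumptions, not the Flink implementation.
// Builds a boolean include-mask over the ORC columns from the requested field indices.
static boolean[] computeProjectionMaskSketch(int numColumns, int[] selectedFields) {
    boolean[] include = new boolean[numColumns + 1];
    include[0] = true; // ORC root struct column
    if (selectedFields == null) {
        // Guard: no explicit projection -> read every column instead of throwing.
        java.util.Arrays.fill(include, true);
        return include;
    }
    for (int field : selectedFields) { // an unguarded null array fails here with an NPE
        include[field + 1] = true;
    }
    return include;
}
{code}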