Hi guys,
I'm trying to enable Flink's checkpointing on our Flink app. I got the following Apache HttpComponents jar compatibility error and cannot figure out how to resolve it.

Here's the stack trace:

```
2017-06-19 15:07:39,828 INFO  org.apache.flink.runtime.taskmanager.Task - Source: Custom Source -> (Timestamps/Watermarks, Filter -> Map, Filter -> Map, Filter -> Map) (1/1) (37ab9429deda28e31fa0ed0ed1568654) switched from RUNNING to FAILED.
java.lang.NoSuchFieldError: INSTANCE
    at org.apache.flink.kinesis.shaded.com.amazonaws.http.conn.SdkConnectionKeepAliveStrategy.getKeepAliveDuration(SdkConnectionKeepAliveStrategy.java:48)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:535)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
    at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805)
    at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:837)
    at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:607)
    at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient.doExecute(AmazonHttpClient.java:376)
    at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient.executeWithTimer(AmazonHttpClient.java:338)
    at org.apache.flink.kinesis.shaded.com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:287)
    at org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.AmazonKinesisClient.doInvoke(AmazonKinesisClient.java:1940)
    at org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.AmazonKinesisClient.invoke(AmazonKinesisClient.java:1910)
    at org.apache.flink.kinesis.shaded.com.amazonaws.services.kinesis.AmazonKinesisClient.describeStream(AmazonKinesisClient.java:656)
    at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.describeStream(KinesisProxy.java:361)
    at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.getShardsOfStream(KinesisProxy.java:323)
    at org.apache.flink.streaming.connectors.kinesis.proxy.KinesisProxy.getShardList(KinesisProxy.java:231)
    at org.apache.flink.streaming.connectors.kinesis.internals.KinesisDataFetcher.discoverNewShardsToSubscribe(KinesisDataFetcher.java:430)
    at org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer.run(FlinkKinesisConsumer.java:202)
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:87)
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:55)
    at org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:95)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:262)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:702)
    at java.lang.Thread.run(Thread.java:745)
```

Here's my Flink environment setup:

- I'm using flink-connector-kinesis_2.11-1.3.0.jar, which I built from the 1.3.0 source code.
- I followed https://ci.apache.org/projects/flink/flink-docs-release-1.3/setup/aws.html#flink-for-hadoop-27 and added all necessary dependency jars.
- My application doesn't use Apache httpclient/httpcore.

Has anyone experienced a similar incompatibility issue?

Thanks!
Bowen
Here is the dependency tree in the flink-connector-kinesis module:

[INFO] +- com.amazonaws:aws-java-sdk-kinesis:jar:1.10.71:compile
[INFO] |  \- com.amazonaws:aws-java-sdk-core:jar:1.10.71:compile
[INFO] |     +- org.apache.httpcomponents:httpclient:jar:4.3.6:compile
[INFO] |     +- org.apache.httpcomponents:httpcore:jar:4.3.3:compile

Checking Flink's own dependency tree, the highest httpcomponents version there is 4.2.x.

You can try building Flink with a dependency on 4.3.y of httpclient / httpcore.

FYI

On Mon, Jun 19, 2017 at 4:52 PM, Bowen Li <[hidden email]> wrote:
> I'm trying to enable Flink's checkpointing on our Flink app. I got the following Apache HttpComponents jar compatibility error and cannot figure out how to resolve it.
Thanks, Ted! Wow, this is unexpected. It looks like https://ci.apache.org/projects/flink/flink-docs-release-1.3/setup/aws.html is out of date.

I bet anyone using Kinesis with Flink will run into this issue. I can try to build Flink myself and resolve this problem, but what about a feasible permanent solution for all flink-connector-kinesis users: shall we downgrade the aws-java-sdk-kinesis version in flink-connector-kinesis, or shall we upgrade the httpcomponents version in Flink?

Bowen

On Mon, Jun 19, 2017 at 7:02 PM, Ted Yu <[hidden email]> wrote:
> You can try building Flink with a dependency on 4.3.y of httpclient / httpcore.
I logged FLINK-6951, referencing this thread.
We can continue the discussion there.

Thanks

On Mon, Jun 19, 2017 at 9:06 PM, Bowen Li <[hidden email]> wrote:
> Shall we downgrade the aws-java-sdk-kinesis version in flink-connector-kinesis, or shall we upgrade the httpcomponents version in Flink?
Hi,
We’ve seen this issue before [1]. The usual reason is that the httpcomponents dependencies weren’t properly shaded into the flink-dist jar; having them properly shaded should solve the issue.

cc Bowen:
Are you building Flink yourself? If yes, what Maven version are you using? If you’re using 3.3.x+, after the first build under flink/, make sure to go to flink-dist/ and build a second time so that the dependencies are properly shaded. Alternatively, Maven 3.0.x is the recommended version, as 3.3.x has dependency shading issues.

If you’re not building Flink yourself, the cause could be that the Flink 1.3.0 flink-dist jar wasn’t shaded properly; that may need a double check.

Best,
Gordon

[1] https://issues.apache.org/jira/browse/FLINK-5013

On 20 June 2017 at 12:14:27 PM, Ted Yu ([hidden email]) wrote:
> I logged FLINK-6951, referencing this thread.
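One rough way to check whether a given flink-dist (or application) jar actually contains un-relocated httpcomponents classes is to scan its entries. The sketch below is only a hypothetical standalone helper, not something from Flink or this thread's tooling; the jar path is passed as a command-line argument:

```java
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

// Hypothetical helper: lists class entries under org/apache/http/, i.e.
// httpclient/httpcore classes that were NOT relocated by the shade plugin.
public class UnshadedHttpCheck {
    public static void main(String[] args) throws Exception {
        try (JarFile jar = new JarFile(args[0])) {   // e.g. a flink-dist or fat application jar
            jar.stream()
               .map(JarEntry::getName)
               .filter(name -> name.startsWith("org/apache/http/") && name.endsWith(".class"))
               .forEach(System.out::println);        // no output => no unshaded httpcomponents classes
        }
    }
}
```

Any hits mean an un-relocated copy of httpclient/httpcore is bundled in that jar and can clash with other copies on the classpath.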
Hi Gordon,
Here's what I use:

- Flink: I didn't build Flink myself. I downloaded http://apache.mirrors.lucidnetworks.net/flink/flink-1.3.0/flink-1.3.0-bin-hadoop27-scala_2.11.tgz from https://flink.apache.org/downloads.html (Hadoop® 2.7, Scala 2.11).
- flink-connector-kinesis: I built flink-connector-kinesis_2.11-1.3.0.jar myself, from the source code linked in the "Source" section of https://flink.apache.org/downloads.html.
- mvn -v: Apache Maven 3.2.5

In short, I didn't build Flink. Most likely the dependencies in either flink-dist or flink-connector-kinesis are not shaded properly?

Thanks!
Bowen

On Mon, Jun 19, 2017 at 9:28 PM, Tzu-Li (Gordon) Tai <[hidden email]> wrote:
> The usual reason is that the httpcomponents dependencies weren’t properly shaded into the flink-dist jar.
Hi Bowen,
Thanks for the info. I checked the 1.3.0 release jars, and they do not have unshaded httpcomponents dependencies, so that shouldn’t be the problem.

Looking back at the stack trace you posted, this seems to be a different problem: the clash appears to involve the aws-java-sdk version, not the httpcomponents dependency. The “INSTANCE” field actually does exist in the aws-java-sdk version that the Kinesis connector is using.

Could it be that you have other, conflicting aws-java-sdk versions in your jar?

Cheers,
Gordon

On 20 June 2017 at 12:55:17 PM, Bowen Li ([hidden email]) wrote:
> In short, I didn't build Flink. Most likely the dependencies in either flink-dist or flink-connector-kinesis are not shaded properly?
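To see which copy of a class actually wins at runtime (for example, whether org.apache.http or com.amazonaws classes come from the application jar or from something under lib/), the classloader can be asked where it loaded the class from. This is only a rough diagnostic sketch; the class name and the idea of running it with the same classpath as the task manager are assumptions, not something from this thread:

```java
// Hypothetical diagnostic: prints the jar/location a class is loaded from, e.g. pass
// "org.apache.http.impl.client.DefaultConnectionKeepAliveStrategy" or
// "com.amazonaws.http.AmazonHttpClient" to see which bundled copy is picked up.
public class WhichJar {
    public static void main(String[] args) throws ClassNotFoundException {
        Class<?> clazz = Class.forName(args[0]);
        java.security.CodeSource src = clazz.getProtectionDomain().getCodeSource();
        System.out.println(args[0] + " loaded from "
                + (src != null ? src.getLocation() : "<bootstrap/unknown>"));
    }
}
```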
Hi Gordon,

I double checked that I'm not using any httpclient/httpcore or aws-java-sdk-xxx jars in my application.

The only thing I did with aws-java-sdk is to put aws-java-sdk-1.7.4.jar into /lib, as described in https://ci.apache.org/projects/flink/flink-docs-release-1.3/setup/aws.html#flink-for-hadoop-27. Here's a screenshot of my /lib dir.

[image: Inline image 1]

Can the root cause be that the shaded aws-java-sdk in Flink is different from the shaded aws-java-sdk in flink-connector-kinesis?

Thanks!

On Mon, Jun 19, 2017 at 10:26 PM, Tzu-Li (Gordon) Tai <[hidden email]> wrote:
> Could it be that you have other, conflicting aws-java-sdk versions in your jar?
Bowen:
The picture didn't come through. Can you pastebin the contents of your /lib dir?

Cheers

On Mon, Jun 19, 2017 at 11:22 PM, Bowen Li <[hidden email]> wrote:
> Here's a screenshot of my /lib dir.
In aws-sdk-java, aws-java-sdk-core/src/main/java/com/amazonaws/http/conn/SdkConnectionKeepAliveStrategy.java imports org.apache.http.impl.client.DefaultConnectionKeepAliveStrategy.

I checked out the 4.2.x branch of httpcomponents-client: there is no INSTANCE field in httpclient/src/main/java/org/apache/http/impl/client/DefaultConnectionKeepAliveStrategy.java.

So the 4.2.x httpcomponents-client jar in the classpath got in the way of aws-java-sdk-core, which was built against a newer httpcomponents-client. In the master branch of httpcomponents-client, httpclient5/src/main/java/org/apache/hc/client5/http/impl/DefaultConnectionKeepAliveStrategy.java does contain INSTANCE.

FYI

On Mon, Jun 19, 2017 at 11:22 PM, Bowen Li <[hidden email]> wrote:
> Can the root cause be that the shaded aws-java-sdk in Flink is different from the shaded aws-java-sdk in flink-connector-kinesis?
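Following this analysis, the failure mode can be probed directly: the INSTANCE field of DefaultConnectionKeepAliveStrategy exists in httpclient 4.3+ but not in 4.2.x, so checking for it on the task manager's classpath shows which generation wins. A minimal sketch, assuming it is compiled and run against the same classpath as the job (a hypothetical helper, not part of Flink or the AWS SDK):

```java
// Hypothetical probe: the INSTANCE field was added to DefaultConnectionKeepAliveStrategy
// in httpclient 4.3, so its absence means an older (4.2.x) httpclient is shadowing the
// version that aws-java-sdk-core was built against.
public class KeepAliveStrategyProbe {
    public static void main(String[] args) throws Exception {
        Class<?> clazz = Class.forName(
                "org.apache.http.impl.client.DefaultConnectionKeepAliveStrategy");
        try {
            clazz.getField("INSTANCE");
            System.out.println("INSTANCE present -> httpclient 4.3 or newer is on the classpath");
        } catch (NoSuchFieldException e) {
            System.out.println("INSTANCE missing -> httpclient 4.2.x or older is on the classpath");
        }
    }
}
```

If the field is missing, the same NoSuchFieldError seen in the stack trace above is expected as soon as the AWS SDK makes an HTTP call.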
Hi Ted and Gordon,
I found the root cause and a solution. Basically https://ci.apache.org/projects/flink/flink-docs-release-1.3/setup/aws.html#flink-for-hadoop-27 is out of date. Adding httpclient-4.3.6.jar and httpcore-4.3.3.jar to /lib, rather than the httpclient-4.2.5.jar and httpcore-4.2.5.jar that page lists, fixed the version conflict.

I've taken https://issues.apache.org/jira/browse/FLINK-6951 and will submit a doc update.

Thank you for your help navigating this problem!
Bowen

On Tue, Jun 20, 2017 at 1:51 AM, Ted Yu <[hidden email]> wrote:

> From aws-sdk-java/aws-java-sdk-core/src/main/java/com/amazonaws/http/conn/SdkConnectionKeepAliveStrategy.java:
>
> import org.apache.http.impl.client.DefaultConnectionKeepAliveStrategy;
>
> I checked out the 4.2.x branch of httpcomponents-client. There is no INSTANCE field in httpclient/src/main/java/org/apache/http/impl/client/DefaultConnectionKeepAliveStrategy.java.
>
> So the 4.2.x httpcomponents-client jar on the classpath got in the way of aws-java-sdk-core, which was built against a newer httpcomponents-client.
>
> In the master branch of httpcomponents-client, httpclient5/src/main/java/org/apache/hc/client5/http/impl/DefaultConnectionKeepAliveStrategy.java does contain INSTANCE.
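To make the failure mode concrete: per the dependency tree quoted earlier in the thread, the aws-java-sdk-core used by the connector is built against httpclient 4.3.x, where DefaultConnectionKeepAliveStrategy exposes a public static INSTANCE field. When an older 4.2.x jar in flink/lib wins on the classpath, that field reference fails at runtime as NoSuchFieldError: INSTANCE. Below is a minimal, hypothetical diagnostic sketch (not part of the original thread) that reports which httpclient jar a JVM actually loads and whether it carries the field; the class and field names come from the stack trace, everything else is illustrative.

```
import java.lang.reflect.Field;

// Hypothetical diagnostic, assuming it is run with the same classpath as the
// TaskManager (e.g. everything in flink/lib plus the user jar).
public class HttpClientVersionCheck {

    public static void main(String[] args) throws Exception {
        Class<?> strategy = Class.forName(
                "org.apache.http.impl.client.DefaultConnectionKeepAliveStrategy");

        // Which jar the class was resolved from. With the outdated jars from the
        // AWS setup page this is expected to be httpclient-4.2.5.jar; after the
        // fix it should be httpclient-4.3.6.jar.
        System.out.println("DefaultConnectionKeepAliveStrategy loaded from: "
                + strategy.getProtectionDomain().getCodeSource().getLocation());

        try {
            // The public constant added in httpclient 4.3.x. Its absence in a
            // 4.2.x jar is what surfaces as java.lang.NoSuchFieldError: INSTANCE
            // inside the shaded SdkConnectionKeepAliveStrategy.
            Field instance = strategy.getField("INSTANCE");
            System.out.println("INSTANCE field present: " + instance);
        } catch (NoSuchFieldException e) {
            System.out.println("INSTANCE field missing: an httpclient 4.2.x jar "
                    + "is shadowing the version the AWS SDK expects");
        }
    }
}
```

Running it once with the old jars and once after swapping in httpclient-4.3.6.jar / httpcore-4.3.3.jar should confirm the classpath change without having to redeploy the Kinesis job.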
|
Guys,
This is the PR https://github.com/apache/flink/pull/4150

|
Thanks a lot for looking into this, Bowen!
|
I have some thoughts on the proper solution that I’ve left in the PR.
Let's continue the discussion directly there.

|