Hi guys,
I am facing a problem while connecting to a remote HBase from Apache Flink. I am able to connect successfully through a simple HBase Java program. However, when I try to connect and scan the table from Flink it fails with:

"java.lang.NoSuchMethodError: org.apache.hadoop.net.NetUtils.getInputStream(Ljava/net/Socket;)Ljava/io/InputStream;"

Regards,
Santosh
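For reference, a minimal standalone HBase 0.98 client scan of the kind described above might look roughly like this (a sketch; the table name is a placeholder):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.ResultScanner;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.util.Bytes;

    public class SimpleHBaseScan {
        public static void main(String[] args) throws Exception {
            // Picks up hbase-site.xml (ZooKeeper quorum etc.) from the classpath.
            Configuration conf = HBaseConfiguration.create();
            HTable table = new HTable(conf, "my-table"); // placeholder table name
            try {
                ResultScanner scanner = table.getScanner(new Scan());
                for (Result r : scanner) {
                    System.out.println(Bytes.toString(r.getRow()));
                }
                scanner.close();
            } finally {
                table.close();
            }
        }
    }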
Hi Santosh, which version of Flink are you using? And which version of
HBase?
Hi Flavio,

I managed to get the connection working. The problem was the version of the HBase client: I was using flink-hbase 0.8.1; when I changed it to 0.8.0 it works fine. The HBase server is 0.98.11.

One more question, Flavio: is there any particular mechanism to follow when inserting records into HBase using Flink?

Thanks and regards,
Santosh
Strange, I'm using Flink 0.8.1 and HBase 0.98.6 and everything works fine (at least for reading). Remember to put the correct hbase-site.xml on the classpath!

For writing data I'm still trying to find the best way to do it. It turned out that Flink's Hadoop compatibility layer probably doesn't initialize the output format correctly for HBase, so I'm debugging it with Fabian and will try to fix that asap. For the moment I created a temporary HBaseTableOutputFormat as a workaround for the problem; if I have time today and tomorrow I'll work on it and hopefully fix it.

For the moment I do something like this:

    Job job = Job.getInstance();
    job.getConfiguration().set(TableOutputFormat.OUTPUT_TABLE, outputTable);
    HBaseTableOutputFormat<Text> hbaseTOF = new HBaseTableOutputFormat<>();
    hbaseTOF.setConf(job.getConfiguration());
    HadoopOutputFormat<Text, Put> of = new HadoopOutputFormat<>(hbaseTOF, job);
    myDs.output(of);
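Here myDs is a DataSet<Tuple2<Text, Put>>. A rough sketch of how one might build it from an existing DataSet<Tuple2<String, String>> of (rowKey, value) pairs (the column family "cf" and qualifier "q" are placeholders):

    import org.apache.flink.api.common.functions.MapFunction;
    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;
    import org.apache.hadoop.io.Text;

    // Map each (rowKey, value) pair to the (Text, Put) pair expected by TableOutputFormat.
    DataSet<Tuple2<Text, Put>> myDs = input.map(
        new MapFunction<Tuple2<String, String>, Tuple2<Text, Put>>() {
            @Override
            public Tuple2<Text, Put> map(Tuple2<String, String> in) throws Exception {
                Put put = new Put(Bytes.toBytes(in.f0));
                // HBase 0.98 client API: Put.add(family, qualifier, value)
                put.add(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes(in.f1));
                return new Tuple2<Text, Put>(new Text(in.f0), put);
            }
        });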
Hi!
Also important: which Hadoop version are you using with Flink? The problem is a missing method in a Hadoop class, so I guess there is a Hadoop version mismatch. For all Flink versions there is a package for Hadoop 1.x and a package for Hadoop 2.x; make sure you pick the right one for HBase 0.98.6.

Stephan
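One quick way to check which Hadoop version actually ends up on the classpath at runtime (a small diagnostic sketch, not specific to Flink):

    import org.apache.hadoop.net.NetUtils;
    import org.apache.hadoop.util.VersionInfo;

    public class HadoopVersionCheck {
        public static void main(String[] args) {
            // Version of the Hadoop classes that are actually being loaded.
            System.out.println("Hadoop version: " + VersionInfo.getVersion());
            // Where the NetUtils class (the one missing the method) was loaded from.
            System.out.println("NetUtils from: "
                + NetUtils.class.getProtectionDomain().getCodeSource().getLocation());
        }
    }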