Hi all,
I am experiencing a problem while writing to jena-hbase. I am using a custom OutputFormat for writing, which reads a configuration file from the classpath. It works fine when running the job from Eclipse, but it throws an exception (the hbaserdf.ttl file cannot be found) when running it from the Flink web interface. hbaserdf.ttl is present in the JAR, and the configuration file is also present in the Store folder.

For Hadoop configurations, we can set fs.hdfs.hadoopconf to the path of the folder of configuration files. Is there a similar way to include other configuration files in Flink?

The error is the following:

37844 [Flink-IPC Server handler 7 on 6123] ERROR de.fraunhofer.fokus.odp.transformer.flink.iee2rdf.JenaHbaseOutputFormat - Not found: Store/hbaserdf-hybrid.ttl
com.hp.hpl.jena.shared.NotFoundException: Not found: Store/hbaserdf-hybrid.ttl
    at com.hp.hpl.jena.util.FileManager.readModelWorker(FileManager.java:388)
    at com.hp.hpl.jena.util.FileManager.loadModelWorker(FileManager.java:299)
    at com.hp.hpl.jena.util.FileManager.loadModel(FileManager.java:250)
    at com.talis.hbase.rdf.StoreDesc.read(StoreDesc.java:36)
    at com.talis.hbase.rdf.store.StoreFactory.create(StoreFactory.java:52)
    at com.talis.hbase.rdf.HBaseRdfFactory.connectStore(HBaseRdfFactory.java:61)
    at de.fraunhofer.fokus.odp.transformer.flink.iee2rdf.JenaHbaseOutputFormat.configure(JenaHbaseOutputFormat.java:74)
    at org.apache.flink.runtime.jobgraph.OutputFormatVertex.finalizeOnMaster(OutputFormatVertex.java:111)
    at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.subtaskInFinalState(ExecutionJobVertex.java:331)
    at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.vertexCancelled(ExecutionJobVertex.java:315)
    at org.apache.flink.runtime.executiongraph.ExecutionVertex.executionCanceled(ExecutionVertex.java:366)
    at org.apache.flink.runtime.executiongraph.Execution.cancel(Execution.java:346)
    at org.apache.flink.runtime.executiongraph.ExecutionVertex.cancel(ExecutionVertex.java:350)
    at org.apache.flink.runtime.executiongraph.ExecutionJobVertex.cancel(ExecutionJobVertex.java:246)
    at org.apache.flink.runtime.executiongraph.ExecutionGraph.fail(ExecutionGraph.java:377)
    at org.apache.flink.runtime.executiongraph.ExecutionGraph.lookupConnectionInfoAndDeployReceivers(ExecutionGraph.java:607)
    at org.apache.flink.runtime.jobmanager.JobManager.lookupConnectionInfo(JobManager.java:577)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.apache.flink.runtime.ipc.RPC$Server.call(RPC.java:420)
    at org.apache.flink.runtime.ipc.Server$Handler.run(Server.java:949)

Thanks and Regards,
Santosh
Hi Santosh,
Did you include the resource file in the JAR that you submit using the web interface? Does it work using the command-line interface?

Best regards,
Max

On Wed, Oct 7, 2015 at 3:31 PM, santosh_rajaguru <[hidden email]> wrote:
> Hi all,
>
> I am experiencing some problem while writing to the jena-hbase.
> I am using a custom OutPutformat for writing. This reads a configuration
> file from the classpath.
> it works while running the job from eclipse. but it throws an exception
> (hbaserdf.ttl file cannot found) while running it from flink web interface.
> hbaserdf is present in the jar.
>
> for hadoop configurations, we can use fs.hdfs.hadoopconf and path to the
> folder of configuration files.
> is there any similar way to include other configurations into flink?
>
> Thanks and Regards,
> Santosh
Yes, I have included the files in the JAR. It throws the same error when executing from the command prompt.
How do you load the resource? Could you supply the code section?
On Fri, Oct 9, 2015 at 3:53 PM, santosh_rajaguru <[hidden email]> wrote:
> Yes i have included the files in the jar. It throws the same error while
> executing from command prompt
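[Editor's sketch] The thread does not show the loading code, but the NotFoundException suggests Jena's FileManager is resolving "Store/hbaserdf-hybrid.ttl" as a plain filesystem path relative to the working directory, which works in Eclipse but not once the file exists only inside the submitted JAR. A common workaround is to resolve the resource through a classloader and, for APIs that insist on a real file path, copy the stream to a temp file first. The demo below builds its own throwaway classpath directory, so the names are illustrative and not taken from the thread:

```java
import java.io.InputStream;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class ResourceLoadDemo {
    public static void main(String[] args) throws Exception {
        // Simulate a resource packaged with the job: a config file that lives
        // on the classpath, NOT in the process's working directory.
        Path dir = Files.createTempDirectory("res");
        Path store = Files.createDirectories(dir.resolve("Store"));
        Files.write(store.resolve("hbaserdf-hybrid.ttl"),
                "# ttl config".getBytes(StandardCharsets.UTF_8));

        try (URLClassLoader cl = new URLClassLoader(new URL[]{dir.toUri().toURL()})) {
            // new FileInputStream("Store/hbaserdf-hybrid.ttl") would fail here,
            // because that path is resolved against the working directory.
            // The classloader lookup succeeds regardless of where the code runs:
            try (InputStream in = cl.getResourceAsStream("Store/hbaserdf-hybrid.ttl")) {
                // APIs that require a filesystem path (Jena's FileManager is one
                // example) can be handed a temp copy of the stream instead:
                Path copy = Files.createTempFile("hbaserdf-hybrid", ".ttl");
                Files.copy(in, copy, StandardCopyOption.REPLACE_EXISTING);
                System.out.println(Files.readAllLines(copy).get(0)); // prints: # ttl config
            }
        }
    }
}
```

Inside an OutputFormat, `getClass().getClassLoader().getResourceAsStream(...)` (or the user-code classloader Flink provides at runtime) plays the role of `cl` above; the URLClassLoader here only exists to make the demo self-contained.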