[jira] [Created] (FLINK-941) Possible deadlock after increasing my data set size
Posted by Shang Yuanchun (Jira) on Jun 16, 2014; 11:57am
URL: http://deprecated-apache-flink-mailing-list-archive.368.s1.nabble.com/jira-Created-FLINK-941-Possible-deadlock-after-increasing-my-data-set-size-tp279.html
Bastian Köcher created FLINK-941:
------------------------------------
Summary: Possible deadlock after increasing my data set size
Key: FLINK-941
URL: https://issues.apache.org/jira/browse/FLINK-941
Project: Flink
Issue Type: Bug
Affects Versions: pre-apache-0.5.1
Reporter: Bastian Köcher
If I increase my data set, my algorithm stops at some point and doesn't continue. I have already waited quite a while, but nothing happens. The Linux process explorer also shows that the process is sleeping and waiting for something to happen, so this could be a deadlock.
I attached the source of my program; the class HAC_2 is the actual algorithm.
Changing line 271 from "if(Integer.parseInt(tokens[0]) > 282)" to "if(Integer.parseInt(tokens[0]) > 283)" on my PC "enables" the bug. The numbers 282 and 283 correspond to the number of documents in my test data, and this line skips all documents with an id greater than that threshold.
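For reference, a minimal sketch of what the skip check around line 271 of HAC_2 presumably looks like (the attachment is not included here, so the class name, the tab delimiter, and the surrounding code are assumptions):

    // Hypothetical reconstruction of the document-skip check near line 271 of HAC_2;
    // the input format (tab-separated, document id in the first token) is assumed.
    public class DocumentSkipSketch {

        // Returns true when the record's document id exceeds maxDocId and should be skipped.
        static boolean skipDocument(String line, int maxDocId) {
            String[] tokens = line.split("\t");
            return Integer.parseInt(tokens[0]) > maxDocId;
        }

        public static void main(String[] args) {
            // With the threshold at 282 the reported hang appears; at 283 it does not.
            System.out.println(skipDocument("283\tsome document text", 282)); // true  -> skipped
            System.out.println(skipDocument("283\tsome document text", 283)); // false -> kept
        }
    }

The sketch only illustrates that the threshold controls how many documents enter the job; per the report, one additional document's worth of data is enough to trigger the hang.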