Internally, DDF uses Flink Table APIs to process the SQL queries.
I would say that DDF would be very useful for providing good data
virtualization when building an application platform.
- Henry
On Thu, Dec 3, 2015 at 8:48 AM, Kostas Tzoumas <[hidden email]> wrote:
> Hi Nam-Luc,
>
> I cc Rohit, who implemented the DDF framework.
>
> I would say that the main difference with the Table API is that DDF aims at
> portability (running the same code using Flink, Spark, or a database),
> whereas the Table API is meant to be part of Flink itself.
>
> Best,
> Kostas
>
>
> On Thu, Dec 3, 2015 at 11:23 AM, Nam-Luc Tran <[hidden email]> wrote:
>
>> Hello Everyone,
>>
>> We came across the Distributed DataFrame project (http://ddf.io) that aims
>> at implementing a dataframe representation targeting Spark and Flink.
>>
>> Has anybody already heard of or played with this project? How would you
>> position it with regard to Flink's Table API?
>>
>> Cheers,
>>
>> --
>>
>> *Nam-Luc TRAN*
>>
>> R&D Manager
>>
>> EURA NOVA
>>
>> (M) +32 498 37 36 23
>>
>> *euranova.eu <http://euranova.eu>*
>>