Hi all!
I want to integrate Flink with Ranger in our project. I have done some
investigation into the Hive/Spark/HBase/Hadoop Ranger plugins and found
that HBase/Hive/Hadoop all provide an interface for authorization, while
Spark SQL does it by injecting Spark optimizer rules. You can find it here:
https://github.com/yaooqinn/spark-ranger.
Here is my brief investigation (in Chinese):
https://www.yuque.com/jackylau-sc7w6/bve18l/kkhn9e. It also contains some
authentication information, which is not relevant here and can be ignored.
So if Flink is to integrate with Ranger, we have the following options:
1) Follow the spark-ranger approach and inject rules, without modifying
Flink code. But this has a problem: because injected rules only run in the
optimizer, we could only authorize DML/DQL, not DDL:
SQL string -> SqlNode -> Operation -> DDL Operation -> run
                                   |
                                   -> DML/DQL Operation -> RelNode (logical)
                                      -> optimizer rules -> ExecNode (physical)
                                      -> Transformation
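To make the limitation concrete, here is a toy model of the dispatch above. All class names are illustrative placeholders, not real Flink API: an injected rule only runs inside the optimizer, so DDL, which is executed directly, never passes through it.

```java
// Toy model of the dispatch sketched above. DdlOperation and
// QueryOperation are hypothetical placeholders, not real Flink classes.
public class DispatchSketch {

    interface Operation {}

    static class DdlOperation implements Operation {}

    static class QueryOperation implements Operation {}

    // Counts how many operations an injected optimizer rule got to see.
    static int seenByRules = 0;

    // Stand-in for a rule injected into the optimizer (the spark-ranger
    // style hook). Only invoked for plans that enter the optimizer.
    static void injectedAuthRule(QueryOperation query) {
        seenByRules++;
        // ... a real rule would check Ranger policies here ...
    }

    static void execute(Operation op) {
        if (op instanceof DdlOperation) {
            // DDL is run directly and never reaches the optimizer,
            // so a rule-based authorizer cannot intercept it.
        } else if (op instanceof QueryOperation) {
            injectedAuthRule((QueryOperation) op); // DML/DQL is interceptable
        }
    }

    public static void main(String[] args) {
        execute(new DdlOperation());   // bypasses the injected rule
        execute(new QueryOperation()); // hits the injected rule
        System.out.println("operations seen by injected rule: " + seenByRules);
    }
}
```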
Spark can do this because DDL is also represented as a logical plan in
Spark, so it can be intercepted by optimizer rules, e.g. (simplified from
spark-ranger):
    plan match {
      case c: Command => c match {
        case _: ShowColumnsCommand => SHOWCOLUMNS
        case _: ShowDatabasesCommand => SHOWDATABASES
        case _: ShowFunctionsCommand => SHOWFUNCTIONS
        case _: ShowPartitionsCommand => SHOWPARTITIONS
        case _: ShowTablesCommand => SHOWTABLES
      }
      case _ => QUERY
    }
2) I think Flink could do it the way Hive does: add an interface for
authorization that users can implement themselves. Could the Flink
community add such interfaces, or plan some future work to support
authorization?
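As a rough illustration of option 2, here is a minimal sketch of what such a pluggable interface might look like, loosely modeled on Hive's authorizer hook. All names here are hypothetical, not proposed Flink API; a Ranger-backed implementation would call the Ranger plugin where the demo check is.

```java
// Hypothetical sketch of a pluggable authorization interface for option 2.
// None of these names are real Flink API; they only illustrate the shape.
public class AuthorizerSketch {

    enum AccessType { SELECT, INSERT, CREATE, DROP, ALTER }

    // The hook Flink could expose; users supply their own implementation
    // (e.g. one that queries Ranger policies).
    interface Authorizer {
        void authorize(String user, AccessType access, String table)
                throws SecurityException;
    }

    // Demo implementation: only "admin" may run DDL. A real implementation
    // would delegate to the Ranger plugin instead of this hard-coded rule.
    static class DemoAuthorizer implements Authorizer {
        @Override
        public void authorize(String user, AccessType access, String table) {
            boolean ddl = access == AccessType.CREATE
                    || access == AccessType.DROP
                    || access == AccessType.ALTER;
            if (ddl && !"admin".equals(user)) {
                throw new SecurityException(
                        user + " may not " + access + " on " + table);
            }
        }
    }

    // Helper so callers can ask a yes/no question.
    static boolean isAllowed(Authorizer auth, String user,
                             AccessType access, String table) {
        try {
            auth.authorize(user, access, table);
            return true;
        } catch (SecurityException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        Authorizer auth = new DemoAuthorizer();
        System.out.println(isAllowed(auth, "admin", AccessType.CREATE, "orders")); // true
        System.out.println(isAllowed(auth, "alice", AccessType.DROP, "orders"));   // false
    }
}
```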
Cheers,
Jacky Lau