"Lambdifying" scala Function in Java -
Using Java with Apache Spark (which is written in Scala), I ran into an old API method (the org.apache.spark.rdd.JdbcRDD constructor) that takes an AbstractFunction1 as its argument:

    abstract class AbstractFunction1[@scala.specialized -T1, @scala.specialized +R]() extends scala.AnyRef with scala.Function1[T1, R] {}
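Since AbstractFunction1 is a class rather than an interface, the only pre-lambda option is a verbose anonymous subclass. A minimal self-contained sketch of the situation, using hypothetical stand-in names rather than the real Spark/Scala classes:

```java
public class AbstractFunctionDemo {
    // Hypothetical stand-in for scala.runtime.AbstractFunction1: an abstract
    // CLASS, so it can never be the target type of a Java 8 lambda.
    static abstract class AbstractFunction1Like<T1, R> {
        public abstract R apply(T1 v1);
    }

    // Stand-in for a JdbcRDD-style API that demands such a function.
    static <T, R> R doMagic(AbstractFunction1Like<T, R> f, T arg) {
        return f.apply(arg);
    }

    public static void main(String[] args) {
        // doMagic(x -> x * 2, 21) would NOT compile: lambdas need a
        // functional INTERFACE as their target type, never a class.
        int result = doMagic(new AbstractFunction1Like<Integer, Integer>() {
            @Override
            public Integer apply(Integer v1) {
                return v1 * 2;
            }
        }, 21);
        System.out.println(result); // 42
    }
}
```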
Because AbstractFunction1 is an abstract class, it can't be the target of a Java 8 lambda, so I decided to wrap the scala.Function1 trait in a functional interface, the same way java.util.function.Function is built. Since Function1 doesn't implement its andThen and compose methods (they are abstract from Java's point of view), I gave them default implementations. As a result, I created this interface:
    import scala.Function1;
    import java.io.Serializable;

    @FunctionalInterface
    public interface Funct<T, R> extends Function1<T, R>, Serializable {
        @Override
        default <A> Function1<A, R> compose(Function1<A, T> before) {
            return null;
        }

        @Override
        default <A> Function1<T, A> andThen(Function1<R, A> g) {
            return null;
        }
    }
The IDE has no problem with this interface, but while compiling I get:

    [error] Funct is not a functional interface
    [error] multiple non-overriding abstract methods found in interface Funct
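The error comes from the functional-interface rule: exactly one abstract method, where default methods don't count. A likely culprit is that in Scala 2.11 bytecode the Function1 interface also carries specialized abstract variants of apply (apply$mcII$sp and friends, generated by @specialized), so defaulting compose and andThen alone still leaves several abstract methods. A self-contained sketch of the rule, with one hypothetical extra abstract method standing in for those specialized variants:

```java
public class SamDemo {
    // Mimics the failing case: more than one non-defaulted abstract method.
    // Legal as an interface, but not usable as a lambda target.
    interface NotSam<T, R> {
        R apply(T t);
        R applySpecialized(int i); // hypothetical extra abstract method
        default <A> NotSam<T, A> andThen(NotSam<R, A> g) { return null; }
    }

    // Exactly one abstract method: a valid lambda target.
    // Default methods do not count toward the abstract-method total.
    @FunctionalInterface
    interface Sam<T, R> {
        R apply(T t);
        default <A> Sam<T, A> andThen(Sam<R, A> g) {
            return t -> g.apply(apply(t));
        }
    }

    public static void main(String[] args) {
        // NotSam<Integer, Integer> f = x -> x + 1; // would not compile
        Sam<Integer, Integer> inc = x -> x + 1;
        Sam<Integer, Integer> incThenDouble = inc.andThen(x -> x * 2);
        System.out.println(incThenDouble.apply(20)); // (20 + 1) * 2 = 42
    }
}
```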
Is it possible to wrap Scala's trait so that I can use lambdas for a method like:

    void doMagic(scala.Function1<T, V> arg)
So you want a functional-interface version of the Scala function traits so you can call them with Java 8 lambda syntax? You don't have to build that yourself; take a look at https://github.com/scala/scala-java8-compat. It's not quite as nice as using Java 8 lambda syntax directly, since you still have to write

    import static scala.compat.java8.JFunction.*;
    doMagic(func(x -> ...));

instead of

    doMagic(x -> ...);
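The trick behind a wrapper like JFunction.func can be mimicked in plain Java: declare a SAM sub-interface that defaults every abstract member except apply, and use it purely as the lambda's target type. A self-contained sketch with hypothetical mimic types (the real ones live in scala-library and scala-java8-compat):

```java
public class CompatDemo {
    // Mimic of a Scala-style function type: NOT a SAM, because andThen
    // is a second abstract method.
    interface ScalaFunction1<T, R> {
        R apply(T t);
        <A> ScalaFunction1<T, A> andThen(ScalaFunction1<R, A> g);
    }

    // The compat trick, mimicked: a SAM sub-interface that defaults
    // everything except apply, so lambdas can target it.
    @FunctionalInterface
    interface JFunction1Like<T, R> extends ScalaFunction1<T, R> {
        @Override
        default <A> ScalaFunction1<T, A> andThen(ScalaFunction1<R, A> g) {
            JFunction1Like<T, A> composed = t -> g.apply(apply(t));
            return composed;
        }
    }

    // Mimic of JFunction.func: does nothing at runtime, it only fixes
    // the lambda's target type at compile time.
    static <T, R> ScalaFunction1<T, R> func(JFunction1Like<T, R> f) {
        return f;
    }

    // API that insists on the Scala-style (non-SAM) type.
    static int doMagic(ScalaFunction1<Integer, Integer> f) {
        return f.apply(21);
    }

    public static void main(String[] args) {
        // doMagic(x -> x * 2) would not compile: ScalaFunction1 is not a SAM.
        System.out.println(doMagic(func(x -> x * 2))); // 42
    }
}
```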
But a big theme of Scala 2.12 is Java 8 compatibility: the Scala FunctionX classes are being redone as SAM interfaces, so you will be able to write the latter once Scala 2.12 comes out. At least that was the plan last time I checked.