java - HashMap as a Broadcast Variable in Spark Streaming?


I have data that needs to be classified in Spark Streaming. The classification key-values are loaded at the beginning of the program into a HashMap. Each incoming data packet therefore needs to be compared against these keys and tagged accordingly.

I realize Spark has broadcast variables and accumulators for distributing objects, but the examples in the tutorials only use simple variables.

How can I share a HashMap across the Spark workers? Alternatively, is there a better way to do this?

I am coding the Spark Streaming application in Java.

In Spark you can broadcast any serializable object the same way. This is the best approach, because the data is shipped to each worker only once and can then be used in any of your tasks.

Scala:

val br = ssc.sparkContext.broadcast(Map(1 -> 2))

Java:

Broadcast<HashMap<String, String>> br = ssc.sparkContext().broadcast(new HashMap<String, String>());

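For context, here is a minimal end-to-end sketch of the pattern in Java. It assumes a socket text source on localhost:9999 and a hypothetical key-to-tag map; adapt the lookup to your actual packet format and classification source. The important part is that the map is broadcast once and then read on the workers via br.value() inside the DStream transformation:

import java.util.HashMap;

import org.apache.spark.SparkConf;
import org.apache.spark.broadcast.Broadcast;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class BroadcastTaggingExample {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf().setAppName("BroadcastTagging").setMaster("local[2]");
        JavaStreamingContext ssc = new JavaStreamingContext(conf, Durations.seconds(5));

        // Hypothetical classification table: key -> tag, loaded once at startup
        HashMap<String, String> classification = new HashMap<>();
        classification.put("ERROR", "critical");
        classification.put("WARN", "warning");

        // Ship the map to every worker once
        Broadcast<HashMap<String, String>> br = ssc.sparkContext().broadcast(classification);

        // Hypothetical source: each line is one incoming packet, first field is the key
        JavaReceiverInputDStream<String> lines = ssc.socketTextStream("localhost", 9999);

        // Tag each record by looking up its key in the broadcast map on the worker
        JavaDStream<String> tagged = lines.map(line -> {
            String key = line.split(",")[0];
            String tag = br.value().getOrDefault(key, "unknown");
            return line + "," + tag;
        });

        tagged.print();

        ssc.start();
        ssc.awaitTermination();
    }
}

Note that the broadcast value is read-only on the workers; if the classification map needs to change at runtime, you would have to re-broadcast a new map rather than mutate the existing one.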