Spark Streaming provides a sliding window function to get the RDDs for the last k batches. But I want to try using the slice function instead to get the RDDs for the last k batches, for a case where I want to query RDDs over a time range.
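For context, a minimal sketch of what such a slice-based query might look like is shown below. The socket source, batch interval, and remember window are assumptions for illustration, not taken from the original post:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Minutes, Seconds, StreamingContext}

object SliceExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SliceExample").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Hypothetical source: a text stream from a local socket.
    val lines = ssc.socketTextStream("localhost", 9999)

    // Keep generated RDDs around for 10 minutes so slice() can still find them.
    lines.remember(Minutes(10))

    // Register an output operation before starting the context.
    lines.foreachRDD { (_, time) =>
      // Query the RDDs generated in the last 2 minutes ending at this batch time.
      val recent = lines.slice(time - Minutes(2), time)
      val total = recent.map(_.count()).sum
      println(s"Records in the last 2 minutes ending at $time: $total")
    }

    ssc.start()
    ssc.awaitTermination() // block here; do not stop the context before querying
  }
}
```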
Solution 1:
Based on the error message,
"Adding new inputs, transformations, and output operations after stopping a context is not supported"
it looks like ssc.stop() was called instead of ssc.awaitTermination().
Please provide more information about how the Spark StreamingContext (ssc) is set up in your program.
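A minimal lifecycle sketch of what the error message points to (names and source are assumed, not from the original post): all inputs, transformations, and output operations must be registered before start(), and the context should be kept alive with awaitTermination() if it still needs to serve queries.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object LifecycleExample {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(
      new SparkConf().setAppName("LifecycleExample").setMaster("local[2]"),
      Seconds(5))

    val lines = ssc.socketTextStream("localhost", 9999)
    lines.count().print() // output operation registered before start()

    ssc.start()
    ssc.awaitTermination() // blocks; the running context stays usable

    // If ssc.stop() had been called instead and the code then tried to add
    // another transformation or output operation, Spark would raise:
    // "Adding new inputs, transformations, and output operations after
    //  stopping a context is not supported"
  }
}
```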