Exercise 3: Relationship Strength Analytics using Spark
Now that you have the results, you can close the Spark shell with the 'exit' command (you may see many log messages from other nodes in the cluster as the application shuts down).
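At the prompt this is a single line (a minimal sketch of a Scala spark-shell session; in newer shells the equivalent REPL command is `:quit`, and the exact shutdown log output will vary):

```scala
scala> exit
```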
An experienced SQL user may recognize that this job could have been written in SQL as well. Like MapReduce, Spark can be used as an engine for executing SQL queries. But Spark also lets you write custom code interactively in a full programming language and preview the results at each step, which takes you well beyond what SQL alone offers.
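As a rough illustration, here is a minimal sketch of mixing the two styles in a Scala spark-shell (assuming a recent shell where the `spark` SparkSession is predefined; the `orders` dataset, its path, and the `customer_id` column are hypothetical and not part of this exercise):

```scala
// Register a DataFrame as a temporary view so it can be queried with SQL.
val orders = spark.read.json("/user/examples/orders.json")  // hypothetical input
orders.createOrReplaceTempView("orders")

// The same aggregation expressed as a SQL query...
val bySql = spark.sql(
  "SELECT customer_id, COUNT(*) AS n FROM orders GROUP BY customer_id")

// ...and as ordinary Scala code against the DataFrame API, where each
// intermediate value can be inspected interactively before moving on.
val byApi = orders.groupBy("customer_id").count()

bySql.show(5)
byApi.show(5)
```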
You can learn more about building advanced recommendation systems in Cloudera's Spark Guide, or get more in-depth, Spark-specific training with Cloudera's Developer Training for Spark and Hadoop.