Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/SparkSession
when submitting with
spark-submit
Check that the Spark dependency version in your project's pom.xml matches the version of Spark you are submitting to.
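For example, assuming the cluster runs Spark 2.1.1 built for Scala 2.11 (adjust both versions to your setup), the pom.xml dependency might look like this sketch:

```xml
<!-- spark-sql provides org.apache.spark.sql.SparkSession; its version must
     match the Spark installation you submit to. "provided" scope keeps it
     out of your fat jar so spark-submit supplies it at runtime. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <version>2.1.1</version>
  <scope>provided</scope>
</dependency>
```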
This may be because you have two Spark versions on the same machine.
In that case you can create two separate soft links and use the exact spark-submit matching the Spark version your project was built against:
spark1-submit -> /Users/test/sparks/spark-1.6.2-bin-hadoop2.6/bin/spark-submit
spark2-submit -> /Users/test/sparks/spark-2.1.1-bin-hadoop2.7/bin/spark-submit
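The soft links above can be created with `ln -s`; this is a sketch using the example install paths from above (adjust them to where your Spark distributions actually live):

```shell
# Version-specific soft links so each project can call the spark-submit
# it was built against. SPARK1/SPARK2 are the example install locations.
SPARK1=/Users/test/sparks/spark-1.6.2-bin-hadoop2.6
SPARK2=/Users/test/sparks/spark-2.1.1-bin-hadoop2.7
BIN="$HOME/bin"              # any directory on your PATH
mkdir -p "$BIN"
ln -sf "$SPARK1/bin/spark-submit" "$BIN/spark1-submit"
ln -sf "$SPARK2/bin/spark-submit" "$BIN/spark2-submit"
```

After this, `spark1-submit` and `spark2-submit` dispatch to the matching Spark installation.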
Or it may be because you built your project against a higher Spark version and deployed it on a cluster running a lower Spark version.
In that case you have to upgrade the Spark version on your cluster. Cloudera provides support for this: https://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/Multiple-Spark-version-on-the-same-cluster/td-p/39880