A Spark SQL job fails with the following error:
Exception in thread "main" org.apache.spark.sql.AnalysisException: Detected cartesian product for INNER join between logical plans
Fix: set spark.sql.crossJoin.enabled=true.
The reason is that Spark 2.x disallows cartesian products by default; they must be enabled explicitly via the spark.sql.crossJoin.enabled parameter.
To enable cartesian products in the application code, configure the session as follows:
import org.apache.spark.sql.SparkSession

val sc: SparkSession = SparkSession.builder
  .appName("My Spark Application") // optional; autogenerated if not specified
  .master("local[*]")              // avoid hardcoding the deployment environment
  .config("spark.debug.maxToStringFields", "200")
  .config("spark.sql.crossJoin.enabled", "true") // allow cartesian products
  .getOrCreate()

val rst = discountFinancial.run(sc)
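As an alternative to the global flag, since Spark 2.1 a cartesian product can be requested explicitly with Dataset.crossJoin, which succeeds even when spark.sql.crossJoin.enabled is left at its default. A minimal sketch (the object name and toy data below are illustrative, not from the original program):

```scala
import org.apache.spark.sql.SparkSession

object CrossJoinDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .appName("CrossJoinDemo")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    val left  = Seq(1, 2).toDF("a")
    val right = Seq("x", "y").toDF("b")

    // Explicit cross join: no spark.sql.crossJoin.enabled flag required.
    val product = left.crossJoin(right)
    println(product.count()) // 2 rows x 2 rows = 4 rows
    spark.stop()
  }
}
```

This is usually preferable to the global flag, because it documents the cartesian product at the call site instead of silently allowing accidental ones everywhere.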