Always set SPARK_HOME on Spark jobs
Phew! This one was nasty... yet simple. The answer was in front of me the whole time.
In this MR we:

- Partially revert commit 294f3a27. We only revert the code and not the `.expected` files, since they have evolved.
- Solve the issue of not being able to find the Iceberg jar when running `for_virtualenv()` with `deploy_mode='client'`, by always setting `SPARK_HOME` (see the sketch after this list).
- Update all `.expected` files accordingly.
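
As a rough illustration of the fix (not the actual MR code), the idea is to make sure `SPARK_HOME` is always present in the environment before the session is built, so that jars bundled with the Spark installation (such as the Iceberg runtime jar) can be resolved even in client deploy mode. The default path and function name below are hypothetical.

```python
import os
from pyspark.sql import SparkSession

# Hypothetical default; the real SPARK_HOME depends on the cluster setup.
DEFAULT_SPARK_HOME = "/usr/lib/spark3"

def get_session(deploy_mode: str = "client") -> SparkSession:
    """Build a Spark session, always setting SPARK_HOME first.

    Without SPARK_HOME, client-mode submission may fail to locate jars
    that live under the Spark installation (e.g. the Iceberg jar).
    """
    # Only set it if the caller has not already exported it.
    os.environ.setdefault("SPARK_HOME", DEFAULT_SPARK_HOME)

    return (
        SparkSession.builder
        .config("spark.submit.deployMode", deploy_mode)
        .getOrCreate()
    )
```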
Bug: T336800