Permission error with root scratch dir
By default, the Hive config attribute hive.exec.local.scratchdir is set to /tmp/hive on the local file system, NOT on HDFS, even though the error message suggests it is on HDFS.
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/tmp/hive</value>
  <description>Local scratch space for Hive jobs</description>
</property>
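Before changing anything, you can confirm the value your installation is actually using, either from the Hive CLI or by inspecting the config file directly (a quick check; the $HIVE_HOME path below is an assumption, adjust it to your install):

# Print the effective value from the Hive CLI
hive -e 'SET hive.exec.local.scratchdir;'

# Or look at the config file itself ($HIVE_HOME is assumed to point at your Hive install)
grep -A 2 'hive.exec.local.scratchdir' $HIVE_HOME/conf/hive-site.xml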
Simply change the value to:
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>~/tmp/hive</value>
  <description>Local scratch space for Hive jobs</description>
</property>
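Note that tilde expansion is done by the shell, so depending on your Hive version the ~ may not be expanded when the value is read from hive-site.xml. If it is not, an absolute path works as well; /home/hive below is only an assumed example, substitute the home directory of the user that owns Hive:

<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/home/hive/tmp/hive</value>
  <description>Local scratch space for Hive jobs</description>
</property>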
Then, in the OS, log in as the user that owns Hive and create the directory:
cd ~
mkdir tmp
cd tmp
mkdir hive
cd ~
chmod -R 777 tmp/hive
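The same thing can be done in one step, equivalent to the commands above:

mkdir -p ~/tmp/hive       # creates ~/tmp and ~/tmp/hive in one go
chmod -R 777 ~/tmp/hive   # open permissions so Hive jobs can write scratch files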
Relaunch Tableau to connect to Spark SQL.
Problem with ODBC driver: uninstall and reinstall
Simply uninstall the ODBC driver.
On Windows, uninstall the ODBC driver from Control Panel -> Programs and Features.
On Mac, delete or move the ODBC driver file out of the way and edit odbcinst.ini:
sudo cp /Library/ODBC/odbcinst.ini /Library/ODBC/odbcinst.ini.org
sudo vi /Library/ODBC/odbcinst.ini
[ODBC Drivers]
PostgreSQL Unicode = Installed
Simba Spark ODBC Driver = Installed

[PostgreSQL Unicode]
Description = PostgreSQL ODBC driver
Driver = /usr/local/lib/psqlodbcw.so

[Simba Spark ODBC Driver]
Driver = /Library/simba/spark/lib/libsparkodbc_sbu.dylib
Delete the "Simba Spark ODBC Driver = Installed" entry and the entire [Simba Spark ODBC Driver] section, so the file looks like this:
[ODBC Drivers]
PostgreSQL Unicode = Installed

[PostgreSQL Unicode]
Description = PostgreSQL ODBC driver
Driver = /usr/local/lib/psqlodbcw.so
Save and exit vi.
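As an optional sanity check, confirm that no Simba entries are left in the file:

grep -i simba /Library/ODBC/odbcinst.ini || echo "no Simba entries left"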
Next, move the /Library/simba folder out of the way:
sudo mv /Library/simba /Library/simba.org
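If you ever need the Simba driver back, the backup created above can be moved into place again:

sudo mv /Library/simba.org /Library/simba   # undo the rename above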
Once the simba folder is moved away, the Tableau Spark SQL login screen returns to its initial state, where it prompts you to download the ODBC driver by following the link on the screen.
or