Wednesday, August 5, 2020

Error: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-

Solution: /tmp/hive is a temporary directory; only temporary files are kept in this location. It is safe to delete it, as it will be recreated with the proper permissions when required.

Step 1) In HDFS, remove the /tmp/hive directory ==> "hdfs dfs -rm -r /tmp/hive"

Step 2) At the OS level too, delete the local /tmp/hive directory ==> "rm -rf /tmp/hive"
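The two steps above can be sketched as a single sequence. This is a sketch, not a definitive fix: it assumes the hdfs client is on your PATH and that you have permission to remove /tmp/hive on the cluster. The chmod alternative shown in the comments is a common variant that keeps the directory instead of deleting it.

```shell
# Step 1: remove the Hive scratch dir on HDFS (-f avoids an error if it is absent)
hdfs dfs -rm -r -f /tmp/hive

# Step 2: remove the local copy as well
rm -rf /tmp/hive

# Alternative (assumption, not from the original post): instead of deleting,
# widen the permissions so the scratch dir is writable by all users:
#   hdfs dfs -chmod -R 777 /tmp/hive
```

Either way, the directory is recreated with usable permissions the next time Spark's Hive support needs it.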

After this, I started spark-shell again and it worked fine.
