Thursday, July 30, 2020

In java -D what does the D stand for?

I've always assumed it was to define the value of a property... possibly a legacy from C compilers, which often treat -D as similar to #define in code.

EDIT: The closest I have to a source for this at the moment is some JDK 1.1 documentation which specifies the flag as:

Redefines a property value. propertyName is the name of the property whose value you want to change and newValue is the value to change it to. [...]

That at least contains the word "redefine" which is close to "define" :)

Hadoop Environment Setting


# Java Environment Variables
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH=$PATH:$JAVA_HOME/bin

# Cassandra Environment Variables
export CASSANDRA_HOME=/usr/local/softwares/cassandra-3.11.7
export PATH=$PATH:$CASSANDRA_HOME/bin

# Spark Environment Variables
export SPARK_HOME=/usr/local/softwares/spark-2.4.6-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin

# Kafka Environment Variables
export KAFKA_HOME=/usr/local/softwares/kafka_2.12-2.4.1
export PATH=$PATH:$KAFKA_HOME/bin

# Hadoop Environment Variables
export HADOOP_HOME=/usr/local/softwares/hadoop-2.8.5
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_HDFS_HOME=$HADOOP_HOME
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native

export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"

# Hive Environment Variables
export HIVE_HOME=/usr/local/softwares/apache-hive-2.3.5
export PATH=$PATH:$HIVE_HOME/bin
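After appending lines like these to a shell startup file (e.g. ~/.bashrc; where exactly they go is an assumption on my part), the expansions can be sanity-checked before reloading the shell. A minimal sketch using the post's own paths:

```shell
# Re-export the Hadoop variables and confirm they expand as intended
export HADOOP_HOME=/usr/local/softwares/hadoop-2.8.5
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin

echo "$HADOOP_OPTS"
# prints: -Djava.library.path=/usr/local/softwares/hadoop-2.8.5/lib/native

# the last two PATH entries should now be the Hadoop script directories
echo "$PATH" | tr ':' '\n' | tail -n 2
```

Note that HADOOP_OPTS is itself a -D flag, the same mechanism discussed above: it defines java.library.path as a system property for every JVM that Hadoop launches.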

Thursday, July 23, 2020

Reference for Learning Hadoop

http://blog.praveendeshmane.co.in/index.jsp  

HBase Error IllegalStateException when starting Master: hsync

I had a similar issue using HBase 2.2.3 with Hadoop 2.10.0 on a 3-node HBase cluster, with HDFS as the hbase.rootdir.

Solution

  1. Set hbase.unsafe.stream.capability.enforce to false in hbase-site.xml:
<property>
  <name>hbase.unsafe.stream.capability.enforce</name>
  <value>false</value>
</property>



  2. Not sure if this was necessary, but I also changed hbase.wal.dir to an HDFS directory outside of hbase.rootdir.
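In hbase-site.xml the second change would look something like the snippet below. The hdfs:// path is a placeholder, not the directory I actually used; any HDFS location outside hbase.rootdir should do.

```xml
<property>
  <name>hbase.wal.dir</name>
  <!-- placeholder path: pick any HDFS directory outside hbase.rootdir -->
  <value>hdfs://namenode:8020/hbase-wal</value>
</property>
```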

zookeeper.ClientCnxn: Opening socket connection to server localhost/192.168.43.203:2181. Will not attempt to authenticate using SASL

This applies whether the directories are on the local filesystem or on HDFS:

Solution : remove the /user/nagaraju/hbase directory
           and also remove the /user/nagaraju/zookeeper directory
