Thursday, April 30, 2020

openvpn --config client.ovpn fails with: RTNETLINK answers: File exists

I removed the line route remote_host 255.255.255.255 net_gateway from the client.ovpn file, which resolved the error.

OpenVPN: “Bad compression stub decompression header byte: 102”





OpenVPN 2.4 also became the default version used for connections. OpenVPN 2.4 adds many security and performance improvements.

Because OpenVPN 2.4 is a major release, some settings from your OpenVPN 2.3 configurations may be incompatible. The most common incompatibility involves compression settings and the deprecation of the comp-lzo option.
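For reference, a sketch of how the compression directive changed between the two versions (the client setting must match what the server uses):

# OpenVPN 2.3 style, deprecated in 2.4
comp-lzo

# OpenVPN 2.4 replacement
compress lzo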

As a result, our data traffic over the VPN halted, although the connection itself was not dropped.

After connecting to the Sophos VPN, we were able to ping the desired machine on the office network, but we could not connect to it via SSH or VNC.

When we tried to SSH, we got the following error:

Bad compression stub decompression header byte: 102

 

The following steps solve this issue and let you connect to your machine.

 

Step 1: Go to Settings and select Network.

 

Step 2: Install the following packages to configure the VPN.

sudo apt-get install network-manager-openvpn 
sudo apt-get install network-manager-openvpn-gnome  
sudo apt-get install network-manager-pptp 
sudo apt-get install network-manager-vpnc

  

Step 3: Open Ubuntu's Network settings and, under VPN, click the + button.

Step 4: Click Import from file and import the .ovpn file downloaded from Sophos.

If you get a plugin error such as

"error: the plugin does not support import capability"

when attempting to import the OpenVPN config file, open your .ovpn file and comment out the following line:

route remote_host 255.255.255.255 net_gateway
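In the file, the commented-out line looks like this (OpenVPN treats lines beginning with # or ; as comments):

# route remote_host 255.255.255.255 net_gateway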


Step 5: After importing successfully, enter your credentials (username and password).

Step 6: After entering them, click on Advanced Settings.

Step 7: Change the LZO data compression value from No to Adaptive.

Step 8: Click OK, apply the settings, connect to your VPN, and try to connect to your machine.

 

 

***********

Important Update

If you don't get internet access after connecting to the VPN, try this:

Go to the network settings and click the settings icon of your VPN connection.

VPN configuration > IPv4 > Routes > Use this connection only for resources on its network

On the IPv4 tab, tick "Use this connection only for resources on its network" and click Apply.

Sophos XG Firewall: How to configure SSL VPN client in Ubuntu

How to configure the SSL VPN on Ubuntu

OpenVPN should be installed. 

You can install OpenVPN by executing the following command:

# sudo apt-get install openvpn

Follow these steps to configure SSL VPN Client in Ubuntu:

  1. Log in to the Sophos Firewall User Portal by browsing to https://<WAN IP address of Sophos>:443
  2. Navigate to SSL VPN.
  3. Click Download Configuration for Other OSs.
  4. A configuration file with the .ovpn extension is downloaded.
  5. Navigate to the directory where the .ovpn file was downloaded and execute the following command:
     openvpn --config client.ovpn
  6. Enter your username and password.
  7. The client will now connect.
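Once connected, you can confirm that the tunnel interface came up (a sketch; the interface is usually tun0, but the name can vary):

ip addr show tun0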

Wednesday, April 22, 2020

Spring Boot SSL (HTTPS) Configuration

To enable SSL (HTTPS) for a Spring Boot web application, put the certificate file (.p12 or .jks) in the resources folder and declare the server.ssl.* values in application.properties.

Self-signed Certificate

For this example, we will use the JDK's keytool to generate a self-signed certificate in PKCS12 format. The command below creates a PKCS12 keystore named nagaraju.p12; put this file into the resources folder.

Terminal

$ keytool -genkeypair -keyalg RSA -keysize 2048 -storetype PKCS12 -keystore nagaraju.p12 -validity 365

Enter keystore password:  
Re-enter new password:


application.properties
# SSL
server.port=8443
server.ssl.key-store=classpath:nagaraju.p12
server.ssl.key-store-password=123456

# JKS or PKCS12
server.ssl.key-store-type=PKCS12

# Spring Security
# security.require-ssl=true

Done. Start the Spring Boot application and access https://localhost:8443.
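You can also verify from a terminal (a sketch; -k makes curl accept the self-signed certificate):

curl -k https://localhost:8443/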

Redirect all traffic from port 8080 to 8443.

StartApplication.java

package com.muthyatechnology.config;
import org.apache.catalina.Context;
import org.apache.catalina.connector.Connector;
import org.apache.tomcat.util.descriptor.web.SecurityCollection;
import org.apache.tomcat.util.descriptor.web.SecurityConstraint;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.web.embedded.tomcat.TomcatServletWebServerFactory;
import org.springframework.boot.web.servlet.server.ServletWebServerFactory;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class StartApplication {
    public static void main(String[] args) {
        SpringApplication.run(StartApplication.class, args);
    }

    // spring boot 2.x
    @Bean
    public ServletWebServerFactory servletContainer() {
        TomcatServletWebServerFactory tomcat = new TomcatServletWebServerFactory() {
            @Override
            protected void postProcessContext(Context context) {
                SecurityConstraint securityConstraint = new SecurityConstraint();
                securityConstraint.setUserConstraint("CONFIDENTIAL");
                SecurityCollection collection = new SecurityCollection();
                collection.addPattern("/*");
                securityConstraint.addCollection(collection);
                context.addConstraint(securityConstraint);
            }
        };
        tomcat.addAdditionalTomcatConnectors(redirectConnector());
        return tomcat;
    }

    private Connector redirectConnector() {
        Connector connector = new Connector("org.apache.coyote.http11.Http11NioProtocol");
        connector.setScheme("http");
        connector.setPort(8080);
        connector.setSecure(false);
        connector.setRedirectPort(8443);
        return connector;
    }

}
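Once the redirect connector is in place, you can check it from a terminal (a sketch; -I asks curl for the response headers only):

# Expect a redirect (typically 302) whose Location header points at https://localhost:8443/
curl -I http://localhost:8080/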


Generating a PKCS12 (.p12) Self-Signed Certificate Using OpenSSL on Ubuntu

nagaraju@nagaraju:~$ openssl req -newkey rsa:2048 -nodes -keyout key.pem -x509 -days 365 -out certificate.pem
Can't load /home/nagaraju/.rnd into RNG
140366584201664:error:2406F079:random number generator:RAND_load_file:Cannot open file:../crypto/rand/randfile.c:88:Filename=/home/nagaraju/.rnd
Generating a RSA private key
..................................................................+++++
................+++++
writing new private key to 'key.pem'
-----
You are about to be asked to enter information that will be incorporated
into your certificate request.
What you are about to enter is what is called a Distinguished Name or a DN.
There are quite a few fields but you can leave some blank
For some fields there will be a default value,
If you enter '.', the field will be left blank.
-----
Country Name (2 letter code) [AU]:91
State or Province Name (full name) [Some-State]:Karnataka
Locality Name (eg, city) []:Bengalore
Organization Name (eg, company) [Internet Widgits Pty Ltd]:TheprogrammersBook
Organizational Unit Name (eg, section) []:IT
Common Name (e.g. server FQDN or YOUR name) []:Nagaraju Gajula
Email Address []:nagaraju@gmail.com
nagaraju@nagaraju:~$ ls
certificate.pem   FromDrive          Pictures               Templates
Desktop           key.pem            Public                 Videos
nagaraju@nagaraju:~$ openssl pkcs12 -inkey key.pem -in certificate.pem -export -out certificate.p12
Enter Export Password:
Verifying - Enter Export Password:
nagaraju@nagaraju:~$ ls
certificate.p12  examples.desktop   Personals              Technology
certificate.pem  FromDrive          Pictures               Templates
Desktop          key.pem            Public                 Videos
nagaraju@nagaraju:~$ 
nagaraju@nagaraju:~$ openssl x509 -text -noout -in certificate.pem
nagaraju@nagaraju:~$ openssl pkcs12 -in certificate.p12 -noout -info
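The "Can't load /home/nagaraju/.rnd into RNG" message above is a harmless warning that OpenSSL 1.1 on Ubuntu prints when the random seed file does not yet exist. If you want to get rid of it, one option is to create the seed file first (adjust the path to your home directory):

openssl rand -writerand ~/.rnd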

Saturday, April 11, 2020

Set the log level in spark-shell (the Spark REPL environment)

Follow the steps below.

1. Import the Logger:

import org.apache.log4j.Logger

2. Import the Level:

import org.apache.log4j.Level

3. Set the log level (here we set the level for the "org" package; any level such as DEBUG, ERROR, TRACE, or OFF can be used):

Logger.getLogger("org").setLevel(Level.INFO)



How to use broadcast variables in Spark


Broadcast Variables

Broadcast variables allow the programmer to keep a read-only variable cached on each machine rather than shipping a copy of it with tasks. They can be used, for example, to give every node a copy of a large input dataset in an efficient manner. Spark also attempts to distribute broadcast variables using efficient broadcast algorithms to reduce communication cost.

Spark actions are executed through a set of stages, separated by distributed “shuffle” operations. Spark automatically broadcasts the common data needed by tasks within each stage. The data broadcasted this way is cached in serialized form and deserialized before running each task. This means that explicitly creating broadcast variables is only useful when tasks across multiple stages need the same data or when caching the data in deserialized form is important.

Broadcast variables are created from a variable v by calling SparkContext.broadcast(v). The broadcast variable is a wrapper around v, and its value can be accessed by calling the value method. The code below shows this:
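A minimal spark-shell sketch of the API first (the array values are just illustrative):

scala> val broadcastVar = sc.broadcast(Array(1, 2, 3))
broadcastVar: org.apache.spark.broadcast.Broadcast[Array[Int]] = Broadcast(0)

scala> broadcastVar.value
res0: Array[Int] = Array(1, 2, 3)

The larger program below uses the same pattern to ship a movie-name lookup map to every executor.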



import java.nio.charset.CodingErrorAction

import org.apache.log4j.{Level, Logger}
import org.apache.spark.SparkContext

import scala.io.{Codec, Source}

object AverageMovieRatings {

    def mapToTuple(line: String): (Int, (Float, Int)) = {
        val fields = line.split(',')
        return (fields(1).toInt, (fields(2).toFloat, 1))
    }

    def loadMovieNames(): Map[Int, String] = {

        // Handle character encoding issues
        implicit val codec = Codec("UTF-8")
        codec.onMalformedInput(CodingErrorAction.REPLACE)
        codec.onUnmappableCharacter(CodingErrorAction.REPLACE)

        var movieNames: Map[Int, String] = Map()

        // Read lines from movies.csv into Iterator. Drop the first (header) row.
        val lines = Source.fromFile("/tmp/ml-latest-small/movies.csv").getLines().drop(1)
        for (line <- lines) {
            val fields = line.split(',')
            movieNames += (fields(0).toInt -> fields(1))
        }
        return movieNames
    }

    def main(args: Array[String]): Unit = {
        Logger.getLogger("org").setLevel(Level.ERROR)

        val sc = new SparkContext("local[*]", "AverageMovieRatings")

        // Broadcast the movie-name map so each executor gets one read-only copy
        val names = sc.broadcast(loadMovieNames())

        // Read a text file
        var data = sc.textFile("/tmp/ml-latest-small/ratings.csv")

        // Extract the first row which is the header
        val header = data.first();

        // Filter out the header from the dataset
        data = data.filter(row => row != header)

        val result = data.map(mapToTuple)
          .reduceByKey((x, y) => (x._1 + y._1, x._2 + y._2))
          .map(x => (x._1, x._2._1 / x._2._2))
          .sortBy(_._2, false)
          .map(x => (names.value(x._1), x._2))
          .collect()

        result.foreach(println)
    }
}
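To run the program outside the REPL, a spark-submit sketch (the jar path is an assumption about your build layout; the master is already set to local[*] in the code):

spark-submit --class AverageMovieRatings target/scala-2.12/average-movie-ratings_2.12-1.0.jar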
