Set up environment variables: Mac or Linux


#1

This is the error message I get when launching spark-shell in Ubuntu.
Kindly assist with what went wrong, please.

Chris@DESKTOP-13E02IR:~$ spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.3
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_171)
Type in expressions to have them evaluated.
Type :help for more information.
18/07/08 16:59:55 WARN Utils: Your hostname, DESKTOP-13E02IR resolves to a loopback address: 127.0.1.1; using 192.168.8.22 instead (on interface wifi0)
18/07/08 16:59:55 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Spark context available as sc.
18/07/08 17:00:03 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
18/07/08 17:00:03 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
18/07/08 17:00:59 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
18/07/08 17:01:06 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
18/07/08 17:01:22 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
18/07/08 17:01:22 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
SQL context available as sqlContext.

scala>


#2

I see warnings but no errors. Everything looks ok.
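For reference, the log4j warnings at the top just mean no log4j.properties file is active yet. If you want quieter startup output, Spark reads conf/log4j.properties when it exists; a sketch, assuming a standard tarball install (the temp-directory demo stands in for $SPARK_HOME/conf so it is safe to run as-is):

```shell
# Scratch demo of the edit; on a real install, run the same cp/sed
# commands inside "$SPARK_HOME/conf" instead of a temp directory.
conf=$(mktemp -d)
# Stand-in for the log4j.properties.template that Spark ships:
printf 'log4j.rootCategory=INFO, console\n' > "$conf/log4j.properties.template"
# Activate the template, then lower the root log level from INFO to WARN:
cp "$conf/log4j.properties.template" "$conf/log4j.properties"
sed -i 's/rootCategory=INFO/rootCategory=WARN/' "$conf/log4j.properties"
cat "$conf/log4j.properties"    # prints: log4j.rootCategory=WARN, console
```

With that file in place, spark-shell starts without most of the INFO/WARN chatter.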


#3

After running this command to edit my dot-profile:

vi .profile

I got the warning message below. What do I do to continue the setup, please?

E325: ATTENTION
Found a swap file by the name ".profile.swp"
owned by: Chris dated: Thu Jul 5 22:57:09 2018
file name: ~Chris/.profile
modified: YES
user name: Chris host name: DESKTOP-13E02IR
process ID: 14
While opening file ".profile"
dated: Wed Jul 4 23:13:30 2018

(1) Another program may be editing the same file. If this is the case,
be careful not to end up with two different instances of the same
file when making changes. Quit, or continue with caution.
(2) An edit session for this file crashed.
If this is the case, use ":recover" or "vim -r .profile"
to recover the changes (see ":help recovery").
If you did this already, delete the swap file ".profile.swp"
to avoid this message.

Swap file ".profile.swp" already exists!
[O]pen Read-Only, (E)dit anyway, (R)ecover, (D)elete it, (Q)uit, (A)bort:
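That E325 screen is vim's standard stale-swap-file warning, and after a crashed or interrupted session it is almost always case (2) from the message. The sketch below demonstrates the delete route on a scratch file so it is safe to run as-is; the real commands are in the comments:

```shell
# Case (2) from the vim message. On the real file the commands are:
#   vim -r ~/.profile     # recover unsaved changes, then :wq and delete swap
#   rm ~/.profile.swp     # or press D at the prompt if nothing to recover
tmp=$(mktemp -d)
touch "$tmp/.profile.swp"        # stand-in for the leftover swap file
rm "$tmp/.profile.swp"           # same as: rm ~/.profile.swp
ls -A "$tmp"                     # nothing left - vi now opens without E325
```

If another vim session really is still open on .profile (case 1), close that session first instead of deleting the swap file.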


#4

The swap-file prompt appeared again, and after getting past it, vi shows the contents of .profile:

# ~/.profile: executed by the command interpreter for login shells.
# This file is not read by bash(1), if ~/.bash_profile or ~/.bash_login
# exists.
# see /usr/share/doc/bash/examples/startup-files for examples.
# the files are located in the bash-doc package.
# the default umask is set in /etc/profile; for setting the umask
# for ssh logins, install and configure the libpam-umask package.
#umask 022
-- INSERT --
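The usual reason to edit ~/.profile at this stage is to export the Spark environment variables. A sketch of the lines to append at the end of the file; the install path here is an assumption, so use wherever you actually unpacked Spark:

```shell
# Assumed install location - adjust to your actual Spark 1.6.3 directory.
export SPARK_HOME="$HOME/spark-1.6.3-bin-hadoop2.6"
export PATH="$PATH:$SPARK_HOME/bin"
```

Reload with `source ~/.profile` (or log out and back in) so the new variables take effect before running spark-shell again.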


#5

I have the Spark context available as given below:

Spark context available as sc.

while in the demo the Spark session is available as "spark".

This is confusing; please let me know if I am on track.


#6

When exiting I have this warning:

warning: there were 1 deprecation warning(s); re-run with -deprecation for details


#7

Everything seems OK. I get the same warnings, and everything works fine on my machine. You should do some test runs (like reading files and writing data to HDFS) to validate Spark.
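A minimal smoke test along those lines, run from the shell. This is a sketch: it assumes spark-shell is on your PATH and that a local README.md exists in the current directory, so adjust the file name to any text file you have:

```shell
# Feed a one-liner to the REPL: read a local text file and count its lines.
# A numeric result line (e.g. "res0: Long = ...") means reading works
# end to end.
if command -v spark-shell >/dev/null 2>&1; then
  echo 'sc.textFile("README.md").count()' | spark-shell 2>/dev/null | grep 'res0'
else
  echo 'spark-shell not on PATH - export PATH="$PATH:$SPARK_HOME/bin" first'
fi
```

For an HDFS round trip, the same idea applies with an `hdfs://` path instead of a local file name.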


#8

Thank you for the response. I thought I had lost track because I got a different message on my machine.


#9

I believe with this I am set to start training. How do I enter the lab to commence training, please?
Thank you.