Just wanted to check whether Scala IDE (Eclipse) for Spark is available and accessible in the lab, or does development have to be done using only the shell? Please advise.
As far as I know, there are no IDEs as such on the cluster for developing Scala code. You can use the shell for code development, or you can write a Scala file, build a jar out of it, and run it on the cluster. If you find it difficult to write code in the vi editor, install MobaXterm; it has a built-in code editor.
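For reference, here is a minimal sketch of that "write a Scala file, build a jar" workflow using the standard sbt layout. The project name, versions, and class name below are all assumptions; adjust them to match the cluster's Spark and Scala versions.

```shell
# Sketch of an sbt project for a simple Spark job (names/versions are assumptions).
mkdir -p sparkapp/src/main/scala

cat > sparkapp/build.sbt <<'EOF'
name := "sparkapp"
version := "0.1"
scalaVersion := "2.11.8"
// "provided": the cluster supplies Spark at runtime, so it is not bundled in the jar
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0" % "provided"
EOF

cat > sparkapp/src/main/scala/WordCount.scala <<'EOF'
import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("WordCount"))
    sc.textFile(args(0))
      .flatMap(_.split("\\s+"))
      .map((_, 1))
      .reduceByKey(_ + _)
      .saveAsTextFile(args(1))
    sc.stop()
  }
}
EOF

# With sbt installed on your laptop (not run here, requires sbt and a cluster):
#   cd sparkapp && sbt package
#   # then copy target/scala-2.11/sparkapp_2.11-0.1.jar to the cluster
```

The `provided` scope keeps the jar small, since the Spark libraries already exist on the cluster nodes.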
No, not yet. It will be taken care of in 2 days, when the cluster is upgraded to 5 nodes.
@venkatwilliams sbt is a developer tool. The IDE for Scala and sbt should be set up on your laptop itself.
That’s what I understood as well. But the message above made me think it is possible and that you are planning to provide it.
I understand we can’t have a Scala IDE in BigData-labs. No issues.
My suggestion: allow users to install git, scala, and sbt/maven in their personal accounts so they can build Spark jobs and submit them directly to the cluster nodes (without FTP).
Yes, I agree, it was my goof-up. I will have scala and probably sbt/maven installed.
@venkatwilliams, scala is now installed. Maven and git are already there. I will set up sbt as well.
Thanks for installing Scala. I found Maven and Git already installed on the cluster.
Appreciate your proactiveness in handling suggestions and requests.
@venkatwilliams, were you able to install sbt?
I do not think it is installed on the lab yet.
sbt is a build tool for developers. You have to build the jar using sbt, ship it to the cluster, and run the jar file there.
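A hedged sketch of that ship-and-run step is below. The jar path, gateway hostname, and class name are assumptions for illustration, not the lab's actual values; the copy and submit commands are shown as comments since they require the cluster.

```shell
# Assumed jar path produced by `sbt package` on your laptop; adjust to your build output.
JAR=target/scala-2.11/sparkapp_2.11-0.1.jar

# Ship the jar to the cluster's gateway node (hostname is an assumption):
#   scp "$JAR" user@gateway-host:~/

# On the gateway node, submit it to YARN (class name is an assumption):
#   spark-submit --master yarn --class WordCount "$(basename "$JAR")" input.txt output

echo "jar to submit: $JAR"
```

The same jar can be resubmitted with different inputs, so you only need to copy it again when the code changes.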
Hi @raj_sharma … while evaluating the big-data-labs setup, I was able to build a Scala application using Maven.
But with the current cluster setup, the best approach is to do the development, compilation, and jar build on your own development box, and then move the jar to big-data-labs for execution, as suggested by @itversity.
Hope this helps…