Step 01 - Setup environment and Sqoop import - Developer

As part of the first step, we will set up the environment (either by signing up for the lab or by accessing an existing cluster) and then perform the actions below:

  • If you do not have access to a cluster or do not want to sign up for the lab, you can follow this
  • Setting up a virtual environment on your PC - click here
  • Make sure all the technologies required for the certification are available for use - MySQL, Sqoop, Flume, HDFS, Spark with Scala, Spark with Python, Hive, Impala, avro-tools
  • Learn Sqoop import to get data from MySQL to HDFS
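The last bullet can be sketched with a minimal Sqoop import command. Everything here is a placeholder, not real lab credentials: the JDBC host, database (retail_db), user, table (orders), and target directory are assumptions you would replace with your own cluster's values.

```shell
# Minimal Sqoop import sketch: pull one MySQL table into HDFS.
# All connection details below are placeholders -- substitute your own.
sqoop import \
  --connect jdbc:mysql://localhost:3306/retail_db \
  --username retail_user \
  --password-file hdfs:///user/cloudera/.mysql-pass \
  --table orders \
  --target-dir /user/cloudera/orders \
  --num-mappers 2

# Verify what landed in HDFS (one part file per mapper):
hadoop fs -ls /user/cloudera/orders
```

Using `--password-file` instead of a plain-text `--password` keeps the database password out of your shell history; the file itself should be readable only by you.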

Hi Sir,
I have configured my desktop with the Cloudera QuickStart VM in standalone mode. My desktop currently has 16 GB RAM and an i5 processor. Is that fine, given that you mentioned in your live session that it should have an i7 processor?

A few months back I configured a multi-node Hadoop cluster using virtual machines with Ubuntu. But it does not have the other technologies like Sqoop, Flume, Spark, etc. So is it possible to configure all the technologies required for the certification on a Linux flavor such as Ubuntu or CentOS on my own, and set up a multi-node cluster using VMs with my current desktop resources, instead of using the Cloudera QuickStart VM?

An i5 processor will be slow if you set up a multi-node cluster with VMs. Even with the Cloudera QuickStart VM you might run into out-of-memory issues or other problems, such as services crashing.



I have created an account in the big data labs, and when trying to validate HDFS I am getting the error below. Can you please check this issue so that I can validate what is mentioned in the tutorial?

hadoop fs -copyFromLocal ~/cards /user/training/cards

copyFromLocal: Permission denied: user=vchembati, access=EXECUTE, inode="/user/training/cards":training:hdfs:drwx

You have to use /user/vchembati/cards, not /user/training/cards.

training is my id and vchembati is yours. You will not be able to copy files into my account; you have to use yours.
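In other words, copy into your own HDFS home directory, which your user owns. A short sketch using the user id from this thread (substitute your own id):

```shell
# Copy the local file into your own HDFS home directory
# (vchembati is the user from this thread -- use your own id).
hadoop fs -copyFromLocal ~/cards /user/vchembati/cards

# Confirm the file arrived
hadoop fs -ls /user/vchembati/cards
```

The original error (`access=EXECUTE` denied on `/user/training/cards`) appears because entering another user's directory requires the execute permission on it, which the `training` user's home directory does not grant to others.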

Thank you. I realized it after watching the videos. Thanks.