I am trying to work in PySpark. I am able to launch the pyspark shell with Spark 2.0, but one odd thing is that tab autocomplete does not work there. The same autocomplete feature works fine in spark-shell (Scala). Is something missing in my pyspark configuration? It would be a great help if someone could suggest how to get this resolved. It is a bit frustrating to work without autocomplete, because most of the time we don't remember the exact function/object name. Thanks in advance.
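For reference, here is a minimal sketch of the standard CPython way to turn on tab completion via `readline`/`rlcompleter`, which I assume the pyspark shell should be able to pick up through a `PYTHONSTARTUP` script (the file path below is just an example name, not anything pyspark requires):

```python
# Example PYTHONSTARTUP script (e.g. save as ~/.pythonrc.py -- name is arbitrary).
# Assumption: pyspark honors PYTHONSTARTUP the same way the plain
# python REPL does, so this runs when the shell starts.
import readline
import rlcompleter

# Bind the Tab key to readline's completion action.
readline.parse_and_bind("tab: complete")

# Install Python's identifier/attribute completer so Tab completes
# names like sc.<Tab> or spark.<Tab> in the REPL.
readline.set_completer(rlcompleter.Completer().complete)
```

Then something like `export PYTHONSTARTUP=~/.pythonrc.py` before launching `pyspark` would make the shell execute it on startup.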