Tab Autocomplete does not work in pyspark shell

Hi All,
I am trying to work in pyspark. I am able to launch pyspark with Spark 2.0, but one odd thing is that I am not able to use autocomplete by hitting Tab. The same autocomplete feature works fine in spark-shell (Scala). Is something missing in my pyspark configuration? It would be of great help if someone could suggest how to get this resolved. It is a bit frustrating not to have autocomplete, because most of the time we may not remember the exact function/object name. Thanks in advance.


There is no autocomplete option in the pyspark shell.
You can browse previously entered commands with the up/down arrow keys and reuse them.

Thank you for the reply, Rahul.
After searching online for some time, the import below did the trick.
Once at the pyspark prompt, I executed the following:

import rlcompleter, readline
readline.parse_and_bind('tab: complete')
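One caveat worth noting: `readline` is not present in every Python build (it is missing on some platforms, e.g. most Windows Pythons), in which case the bare import will fail. A slightly more defensive version of the same snippet, as a sketch:

```python
# Enable tab completion only when readline is actually available.
# On Python builds without readline this is simply a no-op.
try:
    import readline
    import rlcompleter  # importing it wires the completer into readline
    readline.parse_and_bind('tab: complete')
except ImportError:
    pass  # no readline: tab completion stays off, nothing else breaks
```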

To make this setting persistent every time pyspark is launched, I did the following.

Step 1: Add the following to ~/.pythonrc

import rlcompleter, readline
readline.parse_and_bind('tab: complete')

Step 2: Add the following to ~/.bash_profile
export PYTHONSTARTUP="$HOME/.pythonrc"
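For anyone curious what this actually enables: on Tab, readline asks `rlcompleter` for the names in the interpreter's namespace that match the text before the cursor. A small sketch of that lookup outside any shell (the names `sc_demo` and `sqlc_demo` are made up for illustration):

```python
import rlcompleter

# A toy namespace standing in for the pyspark shell's globals.
namespace = {'sc_demo': object(), 'sqlc_demo': object()}
completer = rlcompleter.Completer(namespace)

# readline calls complete('sc', 0), complete('sc', 1), ... until it gets None.
matches = []
state = 0
while True:
    match = completer.complete('sc', state)
    if match is None:
        break
    matches.append(match)
    state += 1

print(matches)  # the names in the namespace starting with 'sc'
```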

@itversity --> Is there a way to put this setting in a common startup file so that autocomplete is available to everyone by default? Thanks.

Pavan A


Thanks Pavan. It's very useful.


Small correction. The second line should use plain straight quotes, not the typographic curly quotes that word processors insert:

readline.parse_and_bind('tab: complete')

With curly quotes, Python raises a SyntaxError.

Hello Pavan,
very nice!!