JSON error in Spark SQL

Hi everyone

I am getting an error while working with JSON files in Spark SQL.

data=sqlContext.sql("Select * from Dep where department_id > 5 ")
pyspark.sql.utils.AnalysisException: u"cannot resolve '(department_id > 5)' due to data type mismatch: differing types in '(department_id > 5)' (struct<int:bigint> and int).;"
But if I run the query without the WHERE condition, it works:

data=sqlContext.sql("Select * from Dep ")
I am using the QuickStart VM.
My schema is:

departments.printSchema()
root
|-- department_id: struct (nullable = true)
| |-- int: long (nullable = true)
|-- department_name: struct (nullable = true)
| |-- string: string (nullable = true)
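If I read the schema right, each column is itself a struct whose single field is named after its type ("int", "string"), which would explain why `department_id > 5` compares a struct against an int. A minimal plain-Python sketch of what I think the data looks like (the sample values here are made up, only the field names come from the printSchema() output):

```python
import json

# Hypothetical records shaped the way the schema above suggests:
# every value is wrapped in a one-field struct keyed by its type.
rows = [
    '{"department_id": {"int": 2}, "department_name": {"string": "Fitness"}}',
    '{"department_id": {"int": 7}, "department_name": {"string": "Outdoors"}}',
]

# Because department_id is a struct, the integer lives one level down,
# so the comparison has to target the nested field:
matches = [r for r in (json.loads(x) for x in rows)
           if r["department_id"]["int"] > 5]
print([r["department_name"]["string"] for r in matches])  # ['Outdoors']
```

If that reading is correct, I assume the Spark SQL comparison would have to address the nested field too, something like `SELECT * FROM Dep WHERE department_id.int > 5` — but I am not sure this is the right fix.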

Kindly help