machine learning - pyspark : NameError: name spark is not defined

You can add

from pyspark.context import SparkContext
from pyspark.sql.session import SparkSession
sc = SparkContext('local')  # the master URL must be a string, e.g. 'local'
spark = SparkSession(sc)

to the beginning of your code to define a SparkSession; then spark.createDataFrame() should work.
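As a quick sanity check, here is a minimal sketch of the whole fix; the data and the 'features' column name are made up for illustration:

from pyspark.context import SparkContext
from pyspark.sql.session import SparkSession

sc = SparkContext('local')
spark = SparkSession(sc)

# Tiny throwaway DataFrame just to confirm the session works
data = [(1.0,), (2.0,), (3.0,)]
df = spark.createDataFrame(data, ['features'])
df.show()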

The answer by 率怀一 is good and will work the first time.
But the second time you try it, it will throw the following exception:

ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=pyspark-shell, master=local) created by __init__ at <ipython-input-3-786525f7559f>:10 

There are two ways to avoid it.

1) Using SparkContext.getOrCreate() instead of SparkContext():

from pyspark.context import SparkContext
from pyspark.sql.session import SparkSession
sc = SparkContext.getOrCreate()  # reuses the existing SparkContext if one is already running
spark = SparkSession(sc)

2) Using sc.stop() at the end, or before you start another SparkContext.
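A minimal sketch of the second approach, assuming sc is the SparkContext created earlier:

from pyspark.context import SparkContext
from pyspark.sql.session import SparkSession

sc = SparkContext('local')
spark = SparkSession(sc)
# ... do some work ...

sc.stop()                   # release the running SparkContext
sc = SparkContext('local')  # now a fresh one can be created without the ValueError
spark = SparkSession(sc)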

machine learning - pyspark : NameError: name spark is not defined

Since you are calling createDataFrame(), you need to do this:

df = sqlContext.createDataFrame(data, ['features'])

instead of this:

df = spark.createDataFrame(data, ['features'])

Here, spark plays the role of the sqlContext.

In general, some people name that object sc, so if the above didn't work, you could try:

df = sc.createDataFrame(data, ['features'])
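If no sqlContext has been defined yet, it can be built from a SparkContext; a minimal sketch (the 'features' column name is a placeholder, and SQLContext is a legacy entry point superseded by SparkSession):

from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext.getOrCreate()
sqlContext = SQLContext(sc)  # legacy entry point for DataFrame creation
df = sqlContext.createDataFrame([(1.0,), (2.0,)], ['features'])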
