PySpark on Windows

PySpark on Windows can be installed in two different ways. Although Spark is a distributed compute engine, it also runs standalone on a single machine. Many developers who are familiar with Jupyter Notebook prefer to keep that workflow, which means Jupyter has to be integrated with PySpark.

PySpark with Jupyter Notebook
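One common way to wire the two together is a minimal sketch like the following, assuming PySpark was installed via pip and Jupyter is available on the machine. `PYSPARK_DRIVER_PYTHON` and `PYSPARK_DRIVER_PYTHON_OPTS` are the environment variables PySpark reads to choose its driver front end; the exact values shown here are one common configuration, not the only one.

```shell
:: Install PySpark and Jupyter (assumes Python and pip are on PATH)
pip install pyspark jupyter

:: Tell PySpark to use Jupyter Notebook as its driver front end (cmd.exe syntax)
set PYSPARK_DRIVER_PYTHON=jupyter
set PYSPARK_DRIVER_PYTHONOPTS=notebook

:: Launching pyspark now opens a Jupyter Notebook; inside a notebook cell,
:: the predefined SparkSession `spark` is available for PySpark code.
pyspark
```

With this configuration in place, running `pyspark` starts Jupyter instead of the plain shell, and every notebook opened from it already has a Spark session attached.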

Other Python developers prefer an interactive shell for working with PySpark. For them, PySpark provides a REPL (Read-Eval-Print Loop), which starts a PySpark session in which they can write and run their code. Refer to the detailed documentation for installing the PySpark interactive shell.
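Starting the REPL can be sketched as follows, assuming Python, pip, and a Java runtime are already on PATH (Spark needs a JVM even in local mode):

```shell
:: Install PySpark via pip
pip install pyspark

:: Start the interactive shell; it prints a Spark banner and predefines
:: a SparkSession named `spark` and a SparkContext named `sc`
pyspark
```

Once the prompt appears, a quick sanity check such as `spark.range(5).count()` should return `5`, confirming the local Spark session is working.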