PySpark Jupyter Notebook Configuration On Windows
PySpark can be installed on Windows in two different ways. Although Spark is a distributed compute engine, it also runs perfectly well as a standalone, single-machine installation. Most developers who are used to working in Jupyter Notebook prefer to keep doing so, which means Jupyter Notebook has to be integrated with PySpark; a minimal setup is sketched below.
Other Python developers prefer to use an interactive shell, such as the pyspark shell, to work with Spark directly.
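One common way to wire the notebook and PySpark together is the findspark package, which locates the Spark installation and makes pyspark importable from the notebook kernel. The sketch below is only an illustration: the paths C:\spark and C:\hadoop are placeholders for your Spark distribution and for the folder holding winutils.exe, so substitute your own locations.

import os

# Placeholder paths -- point these at your actual Spark unpack directory and
# the folder that contains bin\winutils.exe on your Windows machine.
os.environ["SPARK_HOME"] = r"C:\spark"
os.environ["HADOOP_HOME"] = r"C:\hadoop"

import findspark
findspark.init()  # reads SPARK_HOME and adds PySpark to sys.path

from pyspark.sql import SparkSession

# Start a local (standalone) Spark session from inside the notebook.
spark = (
    SparkSession.builder
    .master("local[*]")
    .appName("jupyter-pyspark-demo")
    .getOrCreate()
)

# Quick smoke test: build a tiny DataFrame and display it.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
df.show()

An alternative approach is to set the environment variables PYSPARK_DRIVER_PYTHON=jupyter and PYSPARK_DRIVER_PYTHON_OPTS=notebook and then launch the pyspark command, which opens a notebook server with a Spark session already wired in.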
Additional PySpark Resources & Reading Material
PySpark Frequently Asked Questions
Refer to our PySpark FAQ space, where important questions are answered. It also links to important PySpark tutorial pages within this site.
PySpark Example Code
Find our GitHub repository, which lists PySpark examples with code snippets.
Interesting PySpark/Spark-Related Blogs
Here is a list of informative blogs and related articles that you might find interesting.