Clean and transform the data scraped from the wiki pages. The final result is a well-formatted CSV file containing each city's historical population data.
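For a sense of what that involves, here is a minimal sketch of a single scrape-and-clean step, assuming a hypothetical page URL and a simple year/population table layout (the actual URLs, selectors, and cleaning rules live in topos.ipynb):

```python
import requests
from bs4 import BeautifulSoup
import pandas as pd

# Hypothetical page; the real URLs are constructed inside topos.ipynb.
url = "https://en.wikipedia.org/wiki/Example_city"

resp = requests.get(url)
resp.raise_for_status()
soup = BeautifulSoup(resp.text, "html.parser")

# Assume the population history is the first "wikitable" on the page.
table = soup.find("table", class_="wikitable")

rows = []
for tr in table.find_all("tr")[1:]:  # skip the header row
    cells = [cell.get_text(strip=True) for cell in tr.find_all(["td", "th"])]
    if len(cells) >= 2:
        rows.append(cells[:2])

df = pd.DataFrame(rows, columns=["year", "population"])
# Drop thousands separators and footnote markers, then convert to numbers.
df["population"] = (
    df["population"]
    .str.replace(",", "", regex=False)
    .str.extract(r"(\d+)", expand=False)
)
df = df.dropna().astype({"population": int})
df.to_csv("result.csv", index=False)
```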
python=3.7.3
requests
beautifulsoup4
pandas
numpy
progressbar2
jupyter notebook
- Create a conda environment for this project to work in. (Installation instructions for conda can be found in the conda documentation.)
$ conda create -n envName python=3.7.3
- Activate the conda environment
$ conda activate envName
- Install the relevant packages
$ conda install requests beautifulsoup4 pandas numpy progressbar2 jupyter notebook
- Clone this repository to your local directory
$ git clone https://github.com/dw820/topos.git
- Go to the topos folder
$ cd topos
- Launch Jupyter Notebook
$ jupyter notebook
- Your browser should open the Jupyter interface, where you can start working on the topos.ipynb file
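If you would rather execute the notebook non-interactively, Jupyter's bundled nbconvert tool can run it from the command line (optional; not part of the repo's documented workflow):

$ jupyter nbconvert --to notebook --execute topos.ipynb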
- topos.ipynb: The Jupyter notebook that runs the full web scraping and data cleaning process and creates the result.csv file.
- topos.html: An HTML export of the notebook showing its executed results.
- result.csv: The final CSV output file for BigQuery.
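One hypothetical way to load result.csv into BigQuery is the bq command-line tool with schema autodetection (the dataset and table names below are placeholders; the repo does not prescribe a loading method):

$ bq load --autodetect --source_format=CSV my_dataset.city_populations result.csv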