
2019 Data Eng Intern Assignment

Clean and transform the data scraped from the wiki pages. The final result is a well-formatted CSV file containing each city's historical population data.
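The scrape-and-clean step can be sketched as follows — a minimal, self-contained example assuming a wiki-style HTML table with "Year" and "Population" columns. The table snippet, the `parse_population_table` helper, and the column names are illustrative assumptions, not the actual wiki markup or the notebook's code (the real pages would be fetched with requests):

```python
import pandas as pd
from bs4 import BeautifulSoup

# Hypothetical snippet of a city's wiki population table;
# in the notebook the HTML would come from a requests.get() call.
html = """
<table class="population">
  <tr><th>Year</th><th>Population</th></tr>
  <tr><td>1990</td><td>7,322,564</td></tr>
  <tr><td>2000</td><td>8,008,278</td></tr>
</table>
"""

def parse_population_table(html, city):
    """Parse a wiki-style population table into a tidy DataFrame."""
    soup = BeautifulSoup(html, "html.parser")
    rows = []
    for tr in soup.find("table").find_all("tr")[1:]:  # skip the header row
        year, pop = (td.get_text(strip=True) for td in tr.find_all("td"))
        # Strip thousands separators so the column can be stored as an integer.
        rows.append({"city": city, "year": int(year),
                     "population": int(pop.replace(",", ""))})
    return pd.DataFrame(rows)

df = parse_population_table(html, "New York")
```

Each row of the resulting DataFrame is one (city, year, population) observation, which concatenates cleanly across cities into a single CSV.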

Prerequisites

python=3.7.3

Dependencies

requests
beautifulsoup4
pandas
numpy
progressbar2
jupyter notebook

Steps to run the notebook

  1. Create a conda environment for this project. (See the conda documentation for installation instructions.)
$ conda create -n envName python=3.7.3
  2. Activate the conda environment
$ conda activate envName
  3. Install the required packages
$ conda install requests beautifulsoup4 pandas numpy progressbar2 jupyter notebook
  4. Clone this repository to your local directory
$ git clone https://github.com/dw820/topos.git
  5. Go to the topos folder
$ cd topos
  6. Launch Jupyter Notebook
$ jupyter notebook
  7. The notebook interface should open in your browser; open topos.ipynb to start working.

Files in this repository

  • topos.ipynb: The Jupyter notebook that runs the entire web scraping and data cleaning process and creates the result.csv file.
  • topos.html: An HTML export showing the executed results of the Jupyter notebook.
  • result.csv: The final CSV output file, ready for loading into BigQuery.
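A quick way to sanity-check that result.csv is well formed before loading it into BigQuery is to round-trip it through pandas. The rows and column names below (city, year, population) are illustrative assumptions; the actual schema is whatever topos.ipynb emits:

```python
import pandas as pd

# Illustrative rows matching the assumed schema;
# the real file is produced by topos.ipynb.
df = pd.DataFrame({
    "city": ["New York", "New York"],
    "year": [1990, 2000],
    "population": [7322564, 8008278],
})
df.to_csv("result.csv", index=False)  # no index column in the CSV for BigQuery

# Read it back to confirm the file parses cleanly with numeric dtypes intact.
check = pd.read_csv("result.csv")
```

If the round-tripped frame matches the original, the CSV has a single header row and consistently typed columns, which is what a BigQuery load job expects.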
