LIVE PORTFOLIO

Watch how some of the websites were scraped!

After scraping the websites, we can move on to storing the data, cleaning it, analyzing it, and then performing hypothesis testing before building predictive models!

Rightmove UK

Scraped using APIs

The main goal is to understand how the URL works: how location, rent/buy, pagination (next page), and other features are encoded, so that the process can be automated.
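
To make that concrete, here is a minimal sketch of how such a search URL could be built once those parameters are known. The base URL and parameter names are placeholders for illustration, not the site's real ones.

# Sketch: encoding location, rent/buy channel and page number into a search URL.
# All names below are hypothetical; the real ones come from inspecting the site's URLs.
from urllib.parse import urlencode

BASE_URL = "https://www.example-portal.co.uk/property-to-rent/find"  # placeholder

def build_search_url(location_id: str, channel: str, page_index: int) -> str:
    """Encode location, rent/buy channel and page number as query parameters."""
    params = {
        "locationIdentifier": location_id,  # assumed parameter name
        "channel": channel,                 # e.g. "RENT" or "BUY" (assumed values)
        "index": page_index * 24,           # assumed: 24 results per page, offset-based paging
    }
    return f"{BASE_URL}?{urlencode(params)}"

# Example: the URL for the third page of rental results in one location
print(build_search_url("REGION^123", "RENT", page_index=2))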

I. Understanding the website
II. Capturing the request API

The main goal of this video is to capture the API and turn it into Python code that we can use.
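
Below is a minimal sketch of what "turning the captured API into Python code" might look like, assuming the site exposes a JSON search endpoint. The endpoint URL, parameters, and response keys are assumptions; the real ones are read from the browser's network tab.

# Sketch: replaying a captured request with the requests library.
import requests

API_URL = "https://www.example-portal.co.uk/api/_search"  # placeholder endpoint

def fetch_page(location_id: str, page_index: int) -> dict:
    """Call the captured JSON API for one page of search results."""
    params = {
        "locationIdentifier": location_id,  # assumed parameter name
        "channel": "RENT",                  # assumed parameter name/value
        "index": page_index * 24,           # assumed page size of 24
    }
    headers = {"User-Agent": "Mozilla/5.0"}  # many sites reject the default requests user agent
    response = requests.get(API_URL, params=params, headers=headers, timeout=30)
    response.raise_for_status()
    return response.json()

# Quick test: fetch the first page and count the listings that came back
data = fetch_page("REGION^123", 0)
print(len(data.get("properties", [])))  # "properties" is an assumed key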

III. Writing the code

After capturing and testing the API, we're going to write code that extracts data from the website and adds it to a DataFrame.
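
A minimal sketch of that extraction loop is below. It reuses the hypothetical fetch_page() from the previous snippet and assumes the JSON response holds listings under a "properties" key; the field names are placeholders.

import pandas as pd

def scrape_listings(location_id: str, max_pages: int = 5) -> pd.DataFrame:
    """Collect listings page by page and return them as a DataFrame."""
    records = []
    for page_index in range(max_pages):
        data = fetch_page(location_id, page_index)
        listings = data.get("properties", [])  # assumed key name
        if not listings:
            break                              # stop when a page comes back empty
        for item in listings:
            records.append({
                # hypothetical field names; use whatever the real JSON exposes
                "address": item.get("displayAddress"),
                "price": (item.get("price") or {}).get("amount"),
                "bedrooms": item.get("bedrooms"),
            })
    return pd.DataFrame(records)

df = scrape_listings("REGION^123")
df.to_csv("listings.csv", index=False)  # keep a copy for the later cleaning/analysis steps
print(df.head())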

IV. Running the code and collecting data

We simply run the code and get all the data we want!

The next examples will be shorter

The next videos will be shorter and focus more on the results

Scraping PrimeLocation

I. Understanding the website
II. Testing the code and scraping the website (see the sketch below)
III. Running the code and collecting data
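
For step II, here is a minimal sketch of parsing an HTML results page with requests and BeautifulSoup, in case no convenient JSON API is used. The URL and CSS selectors are placeholders; the real ones come from inspecting the PrimeLocation pages in the browser.

import requests
import pandas as pd
from bs4 import BeautifulSoup

def scrape_results_page(url: str) -> pd.DataFrame:
    """Parse one search-results page into a DataFrame of listings."""
    response = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=30)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")

    records = []
    for card in soup.select("div.listing-card"):        # hypothetical selector
        price = card.select_one(".listing-price")       # hypothetical selector
        address = card.select_one(".listing-address")   # hypothetical selector
        records.append({
            "price": price.get_text(strip=True) if price else None,
            "address": address.get_text(strip=True) if address else None,
        })
    return pd.DataFrame(records)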

Creating a bot to scrape gas prices in the U.S.

GETTING GAS PRICES IN THE U.S.

Using Playwright only; a minimal sketch follows the steps below.
I. Understanding the website
II. Writing & testing the code
III. Running the code and extracting data
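
Here is a minimal sketch of such a Playwright-only bot. The page URL and selectors are placeholders for illustration; the real ones depend on the gas-price site used in the video.

from playwright.sync_api import sync_playwright

STATION_PAGE = "https://www.example-gas-prices.com/stations"  # placeholder URL

def scrape_gas_prices() -> list[dict]:
    """Launch a headless browser, load the page, and read station prices."""
    results = []
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(STATION_PAGE, wait_until="networkidle")
        for row in page.locator("div.station-row").all():             # hypothetical selector
            results.append({
                "station": row.locator(".station-name").inner_text(),  # hypothetical selector
                "price": row.locator(".price").inner_text(),           # hypothetical selector
            })
        browser.close()
    return results

if __name__ == "__main__":
    for station in scrape_gas_prices():
        print(station)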

That's it?

ABSOLUTELY NOT!
Following the scraping of the data, we can move directly to cleaning it, creating a database to store it and grant access to users, then analyzing the data and making predictions.
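
As a minimal sketch of the "create a database to store it" step, assuming the listings were saved to a CSV by the scraper, SQLite works as a simple stand-in database:

import sqlite3
import pandas as pd

df = pd.read_csv("listings.csv")  # data collected by the scraper above

with sqlite3.connect("listings.db") as conn:
    # write (or replace) the raw table; cleaning and analysis can read from it later
    df.to_sql("listings", conn, if_exists="replace", index=False)

    # quick sanity check before moving on to cleaning, analysis and modelling
    count = conn.execute("SELECT COUNT(*) FROM listings").fetchone()[0]
    print(f"{count} rows stored in listings.db")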