A while ago, I had a big list of website links saved in an Excel file—over 700 of them! I needed to find out the title of each website. Doing it by hand would mean opening every link, copying the title, and pasting it back into Excel. That would take forever, and I’d probably make mistakes along the way.
The Problem
When you have a lot of data, doing simple, repetitive work by hand is slow and error-prone: it's tedious, and it's easy to skip a link or paste a title into the wrong row.
My Solution: Using Python
So I decided to let my computer do the work for me. Python is a programming language that makes things like this much easier. I wrote a short script that reads all the links from my Excel file, visits each website on its own, grabs the title, and then saves everything back into Excel.
The whole thing only takes a few lines of Python.
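A minimal sketch of what such a script can look like, using pandas, requests, and BeautifulSoup. The filename `links.xlsx` and the column name `URL` are placeholders of mine, so adjust them to match your own file:

```python
import pandas as pd
import requests
from bs4 import BeautifulSoup

def fetch_title(url, timeout=10):
    """Fetch one page and return its <title> text, or a short error note."""
    try:
        resp = requests.get(url, timeout=timeout)
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        return soup.title.get_text(strip=True) if soup.title else "No title found"
    except requests.RequestException as exc:
        # Dead links shouldn't crash the whole run; record the error instead.
        return f"Error: {exc}"

if __name__ == "__main__":
    df = pd.read_excel("links.xlsx")              # reading .xlsx needs openpyxl
    df["Title"] = df["URL"].apply(fetch_title)    # visit each link, grab its title
    df.to_excel("links_with_titles.xlsx", index=False)
```

Wrapping each request in a `try`/`except` matters with 700+ links: some of them will inevitably be dead or slow, and you want an "Error" note in that row rather than a crash halfway through.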
How to Run the Script (Even If You Don’t Have Python Installed)
You don’t need to install anything on your computer. There’s a website called Google Colab where you can run Python scripts for free:
- Go to Google Colab (just search for it online).
- Upload your Excel file that has the URLs.
- Copy and paste the script above into a cell in the notebook.
- Run the code. It will go through all your links and get the webpage titles.
- You can then download your updated Excel file with an extra column for the page titles.
Automation can really make life easier!