
Python Automation Notes
So, I’ve been diving into Python automation lately, since it helps get rid of repetitive tasks. If you’ve ever found yourself manually renaming files, copy-pasting from spreadsheets, or scraping data from websites, Python can do it all for you.
Dropping my notes here, hope they help! :)
What We’ll Cover
- 📂 File Handling & Organization - Read, write, rename, and organize files effortlessly with Python.
- 🕵️‍♂️ Web Scraping with BeautifulSoup - Extract data from websites without manual copy-pasting.
- 🌐 Browser Automation with Selenium - Control web pages, fill forms, and interact with dynamic sites.
- 🔗 APIs & Data Fetching - Connect to external services, fetch live data, and automate app interactions.
- ⚡ Error Handling & Optimization - Write clean, efficient automation scripts that don’t break easily.
1. Automating File Handling & Organization
One of the first things I looked into was managing files and directories with Python. No more renaming 200 files by hand. Python makes it super easy:
1. Reading & Writing Files
Python makes file handling way too easy with the open() function.
# Reading a file
with open("example.txt", "r") as file:
    content = file.read()
    print(content)  # Boom! The whole file in one go

# Writing to a file
with open("new_file.txt", "w") as file:
    file.write("Hello, automation!")  # And just like that, a new file is created
This is especially useful when dealing with huge datasets: no need to open and edit CSVs by hand.
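For instance, here’s a minimal sketch of reading one with Python’s built-in csv module (the file name data.csv is just a placeholder):
import csv

# Read rows from a CSV file ("data.csv" is a placeholder)
with open("data.csv", newline="") as f:
    reader = csv.DictReader(f)
    for row in reader:
        print(row)  # Each row comes back as a dict keyed by the header row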
2. Working with Directories
Need to organize or clean up folders? Python’s os and shutil modules have your back.
import os
# List all files in a directory
files = os.listdir("my_folder")
print(files)
# Create a new directory
os.mkdir("new_folder")
# Rename a file
os.rename("old_file.txt", "new_file.txt")
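By the way, os only gets you so far; shutil handles the heavier lifting like copying, moving, and deleting whole folders. A minimal sketch (all paths here are placeholders):
import shutil

# Copy a file to a backup location
shutil.copy("report.txt", "backup/report.txt")

# Move a file into another folder
shutil.move("old_file.txt", "archive/old_file.txt")

# Delete an entire directory tree (careful: this is permanent!)
shutil.rmtree("temp_folder")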
Also, mass-renaming files is such a lifesaver when dealing with bulk image or dataset files:
import os

folder = "images"
for index, filename in enumerate(os.listdir(folder)):
    new_name = f"image_{index}.jpg"
    os.rename(os.path.join(folder, filename), os.path.join(folder, new_name))
No more IMG_0097.jpg, IMG_0098.jpg nonsense.
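One caveat: that loop slaps .jpg on everything, even files that aren’t JPEGs. A small tweak that keeps each file’s original extension:
import os

folder = "images"
for index, filename in enumerate(os.listdir(folder)):
    ext = os.path.splitext(filename)[1]  # Keep the original extension (.png, .csv, ...)
    os.rename(os.path.join(folder, filename), os.path.join(folder, f"file_{index}{ext}"))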
2. Web Scraping Like a Pro (or almost one? :P)
Scraping data from websites? No more manual copy-pasting. Python’s Beautiful Soup makes it stupidly easy to extract data from webpages.
1. Basic Web Scraping with BeautifulSoup
import requests
from bs4 import BeautifulSoup

url = "https://example.com"
response = requests.get(url)
soup = BeautifulSoup(response.text, "html.parser")

# Extract all links from the page
links = soup.find_all("a")
for link in links:
    print(link.get("href"))  # Prints each URL
You can use this to scrape articles, product listings, or even Instagram bios (not me, but you could 👀).
2. Scraping Multiple Pages
Got a site with pagination? Loop through the pages like this:
for page in range(1, 6):  # Scrape the first 5 pages
    url = f"https://example.com/page/{page}"
    response = requests.get(url)
    soup = BeautifulSoup(response.text, "html.parser")
    print(f"Scraping {url}...")
Web scraping is fun, but don’t be a villain: always check whether the site allows it in robots.txt.
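Python can even do that check for you. Here’s a minimal sketch using the standard library’s urllib.robotparser (the URLs and the "*" user agent are placeholders):
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()

# True if the site's rules allow this user agent to fetch the page
print(parser.can_fetch("*", "https://example.com/page/1"))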
3. Automating Browser Tasks with Selenium
Sometimes, Beautiful Soup isn’t enough, like when websites load dynamically with JavaScript. That’s when Selenium comes in, letting you control a web browser like a human, but faster.
1. Opening a Browser and Searching on Google
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys

driver = webdriver.Chrome()  # Make sure you have ChromeDriver installed
driver.get("https://www.google.com")
search_box = driver.find_element(By.NAME, "q")
search_box.send_keys("Python automation" + Keys.RETURN)
This literally types and presses enter for you. The power.
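One gotcha: dynamic pages don’t always finish loading before your script moves on, so explicit waits are worth knowing. A minimal sketch that continues from the driver above (the element id "search" is a placeholder):
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Wait up to 10 seconds for the element to show up before touching it
wait = WebDriverWait(driver, 10)
element = wait.until(EC.presence_of_element_located((By.ID, "search")))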
2. Logging into Websites Automatically
Imagine you never have to type passwords again (jk, use this responsibly 👀).
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/login")
username = driver.find_element(By.NAME, "username")
password = driver.find_element(By.NAME, "password")
username.send_keys("your_username")
password.send_keys("your_password")
password.submit()
If you’ve ever had to log into work portals that time out every 5 minutes, you need this.
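On the “use this responsibly” note: instead of hardcoding credentials in the script, it’s safer to pull them from environment variables. A quick sketch that plugs into the login snippet above (the variable names are placeholders):
import os

# Set these in your shell first, e.g. export PORTAL_USER=me
user = os.environ.get("PORTAL_USER")
pwd = os.environ.get("PORTAL_PASS")

username.send_keys(user)  # username/password come from the snippet above
password.send_keys(pwd)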
4. APIs: Connecting the Dots Between Apps
APIs are like secret tunnels between different apps. They let you grab live data, send messages, or even make bots.
1. Making an API Request
import requests
response = requests.get("https://api.example.com/data")
data = response.json() # Boom! Parsed JSON
print(data)
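Real-world requests fail, though, and this is where the error handling from the list up top comes in. A minimal sketch with a timeout and basic exception handling:
import requests

try:
    # timeout= stops the script from hanging forever on a dead server
    response = requests.get("https://api.example.com/data", timeout=10)
    response.raise_for_status()  # Raises an HTTPError for 4xx/5xx responses
    data = response.json()
except requests.exceptions.RequestException as e:
    print(f"Request failed: {e}")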
Want live weather updates?
import requests
api_key = "your_api_key"
city = "New York"
url = f"http://api.weatherapi.com/v1/current.json?key={api_key}&q={city}"
response = requests.get(url)
weather = response.json()
print(f"Current temperature in {city}: {weather['current']['temp_c']}°C")
APIs are insanely useful for things like:
✅ Fetching live data (weather, stock prices, news)
✅ Automating tweets/posts on social media
✅ Connecting different services together
This barely scratches the surface of what’s possible with Python automation. The moment I realized I never had to manually rename files or scrape data again, it changed the game.
Things I still want to explore:
- Automating emails & notifications
- Integrating Python with Excel/Google Sheets
- Building actual task bots with AI
If you’re starting with automation, just dive in: pick a task you hate doing manually and script it. Python will take care of the rest.