My electricity company provides a rather clunky web interface for viewing your electricity consumption: the plotting is awkward, and you need several clicks just to reach the data visualization. Thankfully, their web service turned out to have a way to export the consumption data as an Excel file, so I only needed to figure out the proper way of logging into their portal and making a request to download the Excel sheet.
I wrote a bit of Python code to download the consumption data. The requests library proved to be a great solution once again: I just needed to get the login page, parse a verification token from the HTML form, and submit my login details together with the token, and my session was logged in. Then I could request my sought-after Excel file to be generated. The wonderful pandas library made it easy to read the file and fix up a few things.
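The login-and-download flow can be sketched roughly like this. The URLs and the hidden field name are assumptions (the token parsing assumes an ASP.NET-style form, which Oomi's portal may or may not use), so treat this as a template rather than working code for their service:

```python
import io
import re

import pandas as pd
import requests

# Placeholder URLs -- the real portal endpoints will differ.
LOGIN_URL = "https://example-portal.invalid/login"
EXPORT_URL = "https://example-portal.invalid/consumption/export"


def extract_token(html: str) -> str:
    """Parse the hidden verification token out of the login form.

    Assumes an ASP.NET-style hidden input; adjust the field name if the
    portal uses something else.
    """
    match = re.search(
        r'name="__RequestVerificationToken"[^>]*value="([^"]+)"', html)
    if match is None:
        raise ValueError("no verification token found on the login page")
    return match.group(1)


def download_consumption(username: str, password: str) -> pd.DataFrame:
    """Log in with a session and fetch the consumption Excel file."""
    with requests.Session() as session:
        token = extract_token(session.get(LOGIN_URL).text)
        session.post(LOGIN_URL, data={
            "__RequestVerificationToken": token,
            "username": username,
            "password": password,
        })
        # The session now carries the login cookies, so the export
        # endpoint returns the generated Excel sheet.
        response = session.get(EXPORT_URL)
        return pd.read_excel(io.BytesIO(response.content))
```

Using a `requests.Session` is the key detail: it keeps the cookies from the login response, so the follow-up export request is made as a logged-in user.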
oomi by sremes
Download electricity consumption data from Oomi and upload it into Google Firestore.
I decided to store my electricity consumption data in a simple Google Cloud Firestore collection, with each hourly measurement being its own document within the collection.
{
    "time": timestamp,
    "consumption": float,
    "location": string
}
So I created a new service account with sufficient rights in the Google Cloud console so that I could push the data into Firestore. The Python APIs for Google Cloud proved to be very easy to use. I was now able to fetch data from the electricity company as well as upload (and download) data to Firestore.
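A minimal upload sketch with the google-cloud-firestore client, following the document schema above. The collection name and the timestamp-based document id are my own choices, not anything Oomi or Firestore requires:

```python
from datetime import datetime


def doc_id_for(measurement: dict) -> str:
    """Derive a stable document id from the measurement timestamp, so that
    re-running the upload overwrites documents instead of duplicating them
    (a design choice, not a requirement)."""
    return measurement["time"].strftime("%Y-%m-%dT%H")


def upload_measurements(rows, collection: str = "consumption") -> None:
    """Write each hourly measurement as its own document in Firestore.

    `rows` is an iterable of dicts matching the schema:
    {"time": datetime, "consumption": float, "location": str}.
    """
    # Imported lazily: needs the google-cloud-firestore package and
    # service-account credentials (e.g. GOOGLE_APPLICATION_CREDENTIALS).
    from google.cloud import firestore

    client = firestore.Client()
    batch = client.batch()  # batched writes; Firestore caps a batch at 500 ops
    for row in rows:
        ref = client.collection(collection).document(doc_id_for(row))
        batch.set(ref, row)
    batch.commit()
```

Batching the writes keeps one day's worth of hourly documents to a single round trip; for larger backfills the rows would need to be chunked into groups of at most 500.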
Next I wanted to automate this process. I deployed my code as a Cloud Function in GCP and set the function to be triggered by messages on a Cloud Pub/Sub topic I created for this purpose. Then I just needed to add a scheduled job in the Cloud Scheduler service to publish a message every day, triggering the Cloud Function to update the Firestore electricity data.
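The entry point of such a Pub/Sub-triggered background function looks roughly like this. The function and helper names here are hypothetical; the one real detail is that Pub/Sub hands the message body to the function base64-encoded:

```python
import base64


def decode_message(event: dict) -> str:
    """Pub/Sub delivers the message body base64-encoded in event["data"]."""
    return base64.b64decode(event.get("data", b"")).decode("utf-8")


def update_consumption(event, context):
    """Background Cloud Function entry point for a Pub/Sub trigger.

    Each scheduled message is just a tick that kicks off the update;
    its content is not important.
    """
    print(f"triggered by message: {decode_message(event)!r}")
    # fetch_and_upload() is a hypothetical name standing in for the code
    # that downloads the Excel sheet and writes the rows to Firestore.
```

The Cloud Scheduler job then only needs to publish any message (even an empty one) to the topic once a day; the function ignores the payload and simply runs the fetch-and-upload step.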