APIs & Automation
Letting Computers Do What They Do Best

Computers communicate through Application Programming Interfaces, or APIs for short. An API is a standard way for programs to send and receive data. The internet is built on the back of APIs, with most websites making API calls behind the scenes to render their pages.

Data vendors also offer their information through APIs, which allows developers to automate tasks that depend on certain data points.
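As a minimal illustration, the sketch below calls a REST endpoint with Python's requests library and reads the JSON response. The URL, query parameter, and response shape are placeholders for the example, not a real vendor's API.

```python
import requests

# Hypothetical endpoint; real data vendors document their own URLs and auth schemes.
URL = "https://api.example.com/v1/quotes"

# Most REST APIs accept query parameters and return JSON.
response = requests.get(URL, params={"symbol": "SPY"}, timeout=10)
response.raise_for_status()   # fail loudly on HTTP errors

data = response.json()        # parse the JSON body into a Python dict
print(data)
```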


Use Cases

Obtaining COVID Vaccines

During COVID, when the first vaccines were being released to the general population, appointments were extremely hard to find. Websites would offer time slots, but they appeared unpredictably and were often gone in a matter of minutes.

Tony built an application that plugged into the APIs behind major websites such as CVS, Walgreens, and major grocery store chains. It pulled data every minute to find available appointments within a certain radius of a zip code. Users could enter their location and email address, and as appointments became available, emails went out every minute with a link to the appointment until the user signed up or the slot was booked. This helped hundreds of family members, friends, and coworkers get the vaccine before doses became more widely available. It was especially helpful for elderly and at-risk individuals who were having trouble getting an appointment.
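A stripped-down version of that polling loop might look like the sketch below. The pharmacy endpoint, response fields, subscriber list, and mail setup are all placeholders, not the actual CVS or Walgreens APIs.

```python
import smtplib
import time
from email.message import EmailMessage

import requests

APPOINTMENT_URL = "https://api.example-pharmacy.com/v1/appointments"  # placeholder endpoint

def find_open_slots(zip_code: str, radius_miles: int) -> list[dict]:
    """Ask the (hypothetical) pharmacy API for open vaccine slots near a zip code."""
    resp = requests.get(
        APPOINTMENT_URL,
        params={"zip": zip_code, "radius": radius_miles},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("slots", [])

def notify(email: str, slot: dict) -> None:
    """Email the user a link to the open appointment."""
    msg = EmailMessage()
    msg["Subject"] = f"Vaccine slot open at {slot['location']}"
    msg["From"] = "alerts@example.com"
    msg["To"] = email
    msg.set_content(f"Book here: {slot['booking_url']}")
    with smtplib.SMTP("localhost") as smtp:   # assumes a local mail relay
        smtp.send_message(msg)

# Poll once a minute and alert subscribers until they book.
subscribers = [("user@example.com", "60601", 25)]
while True:
    for email, zip_code, radius in subscribers:
        for slot in find_open_slots(zip_code, radius):
            notify(email, slot)
    time.sleep(60)
```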

Trading APIs

For educational purposes, Tony started experimenting with Artificial Intelligence (AI) and Machine Learning (ML) models to try to day trade the stock market successfully. The models generated so many signals that it was impossible to execute all the required trades manually.

Tony realized that major retail brokerage firms offered trading APIs. He wrote an open source software package for both the E*TRADE and TD Ameritrade APIs that could programmatically execute the trades specified by the AI/ML models. He would launch an AWS EC2 server each morning and execute the necessary trades, and he built a dashboard to track his performance against a custom benchmark and the major market indices. Unfortunately, performance was mixed, so he could not retire early (yet?), but the automation worked perfectly!
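The core of that automation is a loop that turns model signals into broker orders. The sketch below uses a made-up BrokerClient stand-in rather than the real E*TRADE or TD Ameritrade SDKs, whose authentication and order formats are more involved.

```python
from dataclasses import dataclass

@dataclass
class Signal:
    symbol: str
    action: str      # "BUY" or "SELL"
    quantity: int

class BrokerClient:
    """Stand-in for a real brokerage API wrapper (auth, order placement, etc.)."""
    def place_market_order(self, symbol: str, action: str, quantity: int) -> str:
        # A real implementation would POST an order to the broker's REST API
        # and return the broker-assigned order id.
        print(f"{action} {quantity} {symbol}")
        return "order-id-placeholder"

def execute_signals(client: BrokerClient, signals: list[Signal]) -> list[str]:
    """Translate each model signal into a market order and collect order ids."""
    order_ids = []
    for s in signals:
        order_ids.append(client.place_market_order(s.symbol, s.action, s.quantity))
    return order_ids

# Each morning the models emit a batch of signals; execute them in one pass.
todays_signals = [Signal("AAPL", "BUY", 10), Signal("MSFT", "SELL", 5)]
execute_signals(BrokerClient(), todays_signals)
```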

Data Gathering

There is valuable data spread across the internet. Some of it is made available through REST APIs; other times a website displays data and then discards it once it becomes history. If the data is not captured at the time, it is lost forever.
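When a site only displays a value on a page and offers no API, a small scraper can capture it before it disappears. The sketch below assumes a hypothetical page URL and CSS selector; a real scraper has to match each site's actual page structure and respect its terms of service.

```python
import csv
import datetime as dt

import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://example.com/daily-rates"   # hypothetical page
SELECTOR = "span.todays-rate"                  # hypothetical CSS selector

def capture_rate() -> None:
    """Scrape today's value and append it to a CSV so the history is never lost."""
    html = requests.get(PAGE_URL, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    value = soup.select_one(SELECTOR).text.strip()
    with open("daily_rates.csv", "a", newline="") as f:
        csv.writer(f).writerow([dt.date.today().isoformat(), value])

capture_rate()
```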

To support the blog he started, Tony began collecting and downloading data from dozens of sites and data providers across the internet. He gathered data from open APIs such as the Treasury website and the St. Louis Federal Reserve site. For data points that were not offered over an API, he scraped the information on a periodic basis and saved it. All of this was fully automated and ran nightly, with a report listing any jobs that had failed in the last 24 hours. This gave Tony all the data he needed to feed his dashboards and articles.
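That kind of nightly automation boils down to running a list of collection jobs and reporting which ones failed. A minimal, scheduler-agnostic sketch (kicked off by cron or a similar tool) might look like this; the job names and functions are illustrative, not the actual pipeline.

```python
import logging
import traceback

logging.basicConfig(filename="nightly.log", level=logging.INFO)

def pull_treasury_yields():
    ...  # call an open API and save the response

def scrape_other_site():
    ...  # scrape a page that has no API

JOBS = {
    "treasury_yields": pull_treasury_yields,
    "other_site": scrape_other_site,
}

def run_all() -> list[str]:
    """Run every collection job; return the names of any that failed."""
    failures = []
    for name, job in JOBS.items():
        try:
            job()
            logging.info("job %s succeeded", name)
        except Exception:
            logging.error("job %s failed:\n%s", name, traceback.format_exc())
            failures.append(name)
    return failures

if __name__ == "__main__":
    failed = run_all()
    # In the real setup this would be emailed as a report; printing keeps the sketch simple.
    print("Failed jobs in the last run:", failed or "none")
```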