Introduction
Looking for a remote job as a developer can feel overwhelming. There are so many job boards, and checking them regularly takes time we don’t always have. What if we could automate this process?
In this tutorial, we’ll use a Job Posting API on RapidAPI and Bash commands to fetch remote job postings, format them into a clean table, and set up a recurring email alert. By the end, we’ll have a simple yet powerful tool to keep us informed about new opportunities without the hassle of manual searches.
Prerequisites
Let’s start by setting up our environment. We’ll use Ubuntu or a similar Linux distribution for this tutorial. If you’re on Windows, you can use WSL or a virtual machine to follow along.
Creating a RapidAPI Account and Subscribing to the API
First, we need a RapidAPI account to access the job postings API - don’t worry, it’s free to sign up, and we’ll grab an API key once we’re logged in.
- Create a RapidAPI Account
  - Go to RapidAPI and click on the Sign Up button in the top-right corner.
  - You can sign up using your email, Google account, or GitHub account. Follow the prompts to complete the registration process.
- Subscribe to the Job Postings API
  - Once logged in, visit the API’s page using this link: Daily International Job Postings API.
  - On the API’s page, you can read up on the API’s features and go to the Pricing section.
  - Select the Basic plan, which offers 25 free requests per month, with each request returning up to 10 job postings.
  - Click the Subscribe button and confirm your selection.
- Get Your API Key
  - After subscribing, navigate to the Endpoints section on the API’s page.
  - Look for the Headers section in any endpoint example. You’ll see two required headers:
    - `x-rapidapi-key`: This is your unique API key.
    - `x-rapidapi-host`: This should be set to `daily-international-job-postings.p.rapidapi.com`.
  - Copy your `x-rapidapi-key` and keep it secure - we’ll use it in our script to authenticate API requests.
Now that we’ve set up our RapidAPI account and subscribed to the Basic plan, we’re ready to start building our script to fetch and process remote job postings. Let’s move on to the next step!
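If you’d rather not paste the key directly into every script, one option is to export it as an environment variable in your shell profile. This is just a sketch - the full script later in this tutorial hardcodes the key for simplicity, and its comment notes that an external variable needs special handling under cron:

```shell
# Append to ~/.bashrc (or ~/.profile); replace the placeholder with your real key
export RAPIDAPI_KEY='your-rapidapi-key-here'
```

After reloading your shell (for example with `source ~/.bashrc`), the key is available in any script as `$RAPIDAPI_KEY` without appearing in the script file itself.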
Installing Required Bash Commands
Next, we’ll install some command-line tools to make our lives easier. We’ll use `curl` to fetch data from the API, `jq` to process the JSON responses, and `ssmtp` or `mailutils` to send emails. If you don’t already have these installed, you’ll need to run a few quick commands to get them set up. Additionally, we’ll configure `ssmtp` with our email credentials so we can send the job postings directly to our inbox.
- Check if `curl` is Installed
  `curl` is a command-line tool for making HTTP requests. It’s commonly pre-installed on most Linux distributions, but let’s verify by running `curl --version`. If you see version information, `curl` is installed. If not, install it with:

  ```shell
  sudo apt update
  sudo apt install curl
  ```

- Check if `jq` is Installed
  `jq` is a lightweight JSON processor that we’ll use to parse and format the API responses. Check if `jq` is installed by running `jq --version`. If it’s not installed, install it with:

  ```shell
  sudo apt update
  sudo apt install jq
  ```
- Check if `ssmtp` or `mailutils` is Installed
  We’ll use either `ssmtp` or `mailutils` to send emails from the command line. Check if `ssmtp` is installed by running `ssmtp --version`. If it’s not installed, install it with:

  ```shell
  sudo apt update
  sudo apt install ssmtp
  ```

  If you’d rather use `mail`, check if `mailutils` is installed by running `mail --version`. If it’s not installed, install it with:

  ```shell
  sudo apt update
  sudo apt install mailutils
  ```
- Configure Email Sending
  If you’re using `mailutils`, no additional configuration is required for basic usage. If you’re using `ssmtp`, you’ll need to configure it with your email credentials. Open the configuration file in a text editor by running `sudo nano /etc/ssmtp/ssmtp.conf` and add the following lines, replacing the placeholders with your email provider’s SMTP settings (e.g., Gmail):

  ```shell
  root=your_email@gmail.com
  mailhub=smtp.gmail.com:587
  AuthUser=your_email@gmail.com
  AuthPass=your_email_password
  ```

  Save and exit the file (`Ctrl+O`, `Enter`, `Ctrl+X`).
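With the installs done, a short loop can confirm that everything is in place before we start scripting. This is a minimal sketch - swap `mail` for `ssmtp` in the list depending on which mailer you chose:

```shell
# Report which of the required tools are available on this machine
for tool in curl jq mail; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: installed"
  else
    echo "$tool: MISSING - install it before continuing"
  fi
done
```

`command -v` prints the path of a command if the shell can find it and returns a non-zero exit code otherwise, which makes it a reliable, POSIX-portable way to test for installed tools.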
Finally, we’ll need a basic understanding of Bash scripting. If you’ve ever written or run a shell script, you’re good to go. If not, don’t worry - we’ll keep the code simple and explain each step as we go. By the end, we'll have a script that fetches remote job postings, formats them, and sends them to our email account at the end of the month. Let’s get everything set up so we can start building!
Overview of the Job Posting API
The Job Posting API, as documented at https://api.techmap.io/, provides comprehensive access to job postings aggregated from a wide range of sources, including career pages, job boards, job aggregators, and employment offices. The API allows users to search and filter job postings based on various parameters such as job title, skills, company, location, industry, and more. It supports advanced querying capabilities, including Boolean searches, keyphrase searches, and geographic filtering, enabling precise and flexible job searches. The API returns results in JSON format, making it easy to integrate into applications for backfilling job boards, detailed analysis, company monitoring, or automated job agents.
One of the key features of the API is its ability to handle complex queries with multiple parameters. For example, users can search for remote jobs in specific time zones, jobs in particular industries, or jobs requiring specific skills. The API also supports pagination, allowing users to retrieve job postings in manageable chunks, with each request returning up to 10 job postings per page. Additionally, the API provides metadata endpoints in higher subscription plans that offer statistics and distinct values for specific fields, such as workPlace or industry, which can be useful for analyzing trends or building filters in job search applications.
The API is designed to be developer-friendly, with clear documentation and examples for each endpoint. It also includes structured data in JSON-LD format, which can be directly integrated into webpages to enhance visibility and indexing by search engines. Overall, the Job Posting API is a powerful tool for developers looking to build or enhance job search functionalities, offering a rich set of features and flexible querying options to meet various use cases.
For our goal, we can use the `/api/v2/jobs/search` endpoint with the following query parameters:

- `workPlace` to filter jobs by place of work (e.g., on-site, hybrid, or remote),
- `title` to select jobs for a specific position (e.g., developer) or technology (e.g., JavaScript),
- `language` to constrain the search to job postings written in English, and
- `page` to paginate through larger lists of jobs.
Fetching Job Postings from the API
To fetch remote job postings for Java developers using the Job Posting API, we’ll construct a query that filters jobs based on specific criteria. The API allows us to search for jobs using parameters like `workPlace` and `title`. For this tutorial, we’ll focus on remote jobs (`workPlace=remote`) that mention Java (`title=Java`).
Using `curl` with URL parameters, we’ll construct a `GET` request that includes the API endpoint, the query parameters directly in the URL, and the required headers for authentication. For example, to fetch remote Java developer jobs posted in February 2025, we’ll use the following `curl` command:
```shell
curl --request GET \
  --url 'https://daily-international-job-postings.p.rapidapi.com/api/v2/jobs/search?dateCreated=2025-02&workPlace=remote&title=Java&page=1' \
  --header 'x-rapidapi-host: daily-international-job-postings.p.rapidapi.com' \
  --header 'x-rapidapi-key: YOUR_API_KEY'
```
This command fetches the first page of remote Java developer jobs in JSON format. The response also includes the total number of jobs matching this query (in the `totalCount` field), which can be used for pagination (divide by 10 and round up). To retrieve more results, we can loop through additional pages by incrementing the `page` parameter (e.g., `page=2`, `page=3`, etc.). The response can then be processed further to extract and format the job postings.
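The page math deserves a quick illustration. With up to 10 postings per request, the number of pages is the ceiling of `totalCount / 10`, which Bash’s integer arithmetic gives us via the `(n + 9) / 10` trick - the same calculation the full script below uses (the `totalCount` value here is made up):

```shell
TOTAL_COUNT=42   # hypothetical totalCount taken from an API response

# Ceiling division: (42 + 9) / 10 = 5 pages of up to 10 postings each
MAX_PAGES=$(( (TOTAL_COUNT + 9) / 10 ))
echo "$MAX_PAGES"   # prints 5
```

Because Bash arithmetic truncates toward zero, adding 9 before dividing by 10 rounds any partial page up, so 41–50 jobs all yield 5 pages.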
Creating the Full Script
Finally, we’ll create the script that connects to the Job Postings API, queries for remote job postings, and stores the results in a file. To stay within the free tier of the subscription plan, the script will loop through up to 25 pages, fetching a maximum of 250 job postings. These results are then combined into a single JSON file, which will later be reformatted and sent via email to keep you updated on new job opportunities.
The script leverages `curl` to make HTTP requests and `jq` to parse and manipulate the JSON responses. It iterates through multiple pages, retrieving up to 10 job postings per page and appending them to a combined JSON array. Once all pages are processed, the aggregated job postings are saved to a file named `jobs.json`. This ensures you have a structured, up-to-date list of remote job opportunities ready for further processing.
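The combining step can be tried in isolation. Each page’s `result` field is a JSON array; printing the per-page arrays back-to-back and slurping them with `jq -s 'add'` merges them into a single array, which is exactly what the script does before writing `jobs.json` (the postings here are made up):

```shell
# Two hypothetical per-page "result" arrays
PAGE1='[{"title":"Java Developer"},{"title":"Backend Engineer"}]'
PAGE2='[{"title":"Senior Java Engineer"}]'

# jq -s slurps the concatenated JSON values into a list of arrays; add merges them
printf "%s" "$PAGE1" "$PAGE2" | jq -s -c 'add'
# → [{"title":"Java Developer"},{"title":"Backend Engineer"},{"title":"Senior Java Engineer"}]
```

`jq` happily reads a stream of concatenated JSON values, so no separator between the arrays is needed.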
After fetching and formatting the job postings, the next step is to send this information to your email in a clean, readable table. Using `jq`, we’ll extract key fields such as the date posted, job title, company, and URL. These details are then formatted into an HTML table for improved readability and sent to the specified email address.
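The extraction can be tested on a tiny sample file first. The field names below follow the ones the script reads (`dateCreated`, `title`, `company`, and the URL nested under `jsonLD`); the sample values are invented:

```shell
# A one-posting sample shaped like the fields the script reads
cat > sample_jobs.json <<'EOF'
[{"dateCreated":"2025-02-03T10:00:00","title":"Java Developer","company":"Acme","jsonLD":{"url":"https://example.com/job/1"}}]
EOF

# Keep only the date part of the timestamp and join the fields with pipes
jq -r '.[] | "\(.dateCreated | split("T") | .[0]) | \(.title) | \(.company) | \(.jsonLD.url)"' sample_jobs.json
# → 2025-02-03 | Java Developer | Acme | https://example.com/job/1
```

The `split("T") | .[0]` step is the same trick the script uses to turn an ISO timestamp into a plain date for the table.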
To get started, create a script named `fetch_jobs.sh` and insert the code provided below. Before running the script, ensure you set your API key in the `RAPIDAPI_KEY` variable and your email address in `EMAIL_RECIPIENT`. Keep in mind that the free tier allows only 25 requests per month, so adjust your query parameters carefully to avoid exceeding this limit.
```shell
#!/bin/bash

# Define general values
EMAIL_RECIPIENT='your_email@gmail.com'

# Define the API values
RAPIDAPI_KEY='My-RapidAPI-API-Key' # Can be defined outside the script but needs special handling for the cronjob
API_URL="https://daily-international-job-postings.p.rapidapi.com/api/v2/jobs/search"
HEADERS=(
  -H "X-RapidAPI-Key: $RAPIDAPI_KEY"
  -H "X-RapidAPI-Host: daily-international-job-postings.p.rapidapi.com"
)

# REQUEST_LIMIT=25 # if calling it monthly (free requests)
REQUEST_LIMIT=1 # for testing purposes
CURRENT_YEAR_MONTH=$(date +"%Y-%m")

# Initialize an empty array to store all job postings
ALL_JOBS=()
MAX_PAGES=1
PAGE=1

while (( PAGE <= MAX_PAGES )); do
  printf "Fetching page %s for %s ... " "$PAGE" "$CURRENT_YEAR_MONTH"

  RAW_JSON=$(curl -s -G "$API_URL" \
    --data-urlencode "dateCreated=$CURRENT_YEAR_MONTH" \
    --data-urlencode "workPlace=remote" \
    --data-urlencode "title=Java" \
    --data-urlencode "language=en" \
    --data-urlencode "page=$PAGE" \
    "${HEADERS[@]}" | jq -c .)

  TOTAL_COUNT=$(printf "%s" "$RAW_JSON" | jq -r '.totalCount // 0')
  MAX_PAGES=$(( (TOTAL_COUNT + 9) / 10 )) # calculate max pages and round up
  (( MAX_PAGES > REQUEST_LIMIT )) && MAX_PAGES=$REQUEST_LIMIT
  printf "found %s jobs resulting in %s max pages we load for free\n" "$TOTAL_COUNT" "$MAX_PAGES"

  JOBS=$(printf "%s" "$RAW_JSON" | jq -c '.result') || {
    echo " Error processing JSON on page $PAGE."
    break
  }
  ALL_JOBS+=("$JOBS")
  ((PAGE++))
done

# Combine all job postings into a single JSON array
COMBINED_JOBS=$(printf "%s" "${ALL_JOBS[@]}" | jq -s -c 'add')

# Save the combined job postings to a file
printf "%s" "$COMBINED_JOBS" > jobs.json
echo "Job postings fetched and saved to jobs.json"

# Extract job data, create HTML table and send email
HTML_TABLE=$(jq -r 'to_entries
  | map(" <tr> <td>\(.value.dateCreated | split("T") | .[0])</td> <td>\(.value.title)</td> <td>\(.value.company)</td> <td>\(.value.jsonLD.url)</td> </tr>")
  | join("\n")
' jobs.json)

HTML_TABLE="<table border='1'>
<tr> <th>Date Posted</th> <th>Title</th> <th>Company</th> <th>URL</th> </tr>
$HTML_TABLE
</table>"

echo -e "Subject: New Remote Job Postings\nContent-Type: text/html\n\n$HTML_TABLE" | sendmail "$EMAIL_RECIPIENT"
echo "Job postings sent to $EMAIL_RECIPIENT"
```
Automating the Script with Cron
To fully automate our job-fetching script, we can leverage Cron - the built-in scheduler in Unix-based systems like Linux and macOS - to run the script on a regular basis without manual intervention. Cron enables us to schedule commands or scripts to execute at specified intervals, ensuring that our remote job postings are fetched automatically. By simply adding an entry to the crontab, we can have the script run monthly on the 25th day, so we always receive the latest remote job postings.
Before setting up the cron job, ensure that key parameters in the script are configured, such as `REQUEST_LIMIT` (e.g., 25 if run monthly or 1 if run every workday) and your `RAPIDAPI_KEY`. To schedule the script, open your crontab editor by running:
```shell
crontab -e
```
Then, append the following line to the file:
```shell
0 0 25 * * /path/to/your/fetch_jobs.sh >> /tmp/fetch_jobs.log 2>&1
```
This line instructs Cron to run the script at midnight on the 25th of every month, with all output (including errors) appended to `/tmp/fetch_jobs.log` for easy debugging. Finally, ensure that the script has execute permissions by running `chmod +x fetch_jobs.sh`. With this setup in place, our personalized job alert system will automatically fetch and deliver fresh remote job postings each month - maybe even the one that lets you quit your old job and apply for a new one.
Conclusion
By completing this tutorial, we’ve not only automated a job search but also gained valuable skills in Bash scripting, API integration, and task automation. This project is a testament to how a little bit of coding can make a big difference in your daily life.
To take it a step further, try customizing the script to suit your specific needs - whether that’s adding more precise filters, tweaking the email format, or scheduling it to run more frequently. If you encounter any issues, refer to the API documentation for more advanced features. Don’t forget to share your success with others or contribute to the project by improving the script. Happy automating!