Introduction
In my work with AWS Systems Manager (SSM) documents, I faced a challenge that might resonate with anyone managing complex automation: passing information between steps. Each step in an SSM document executes independently, and I wanted to keep my commands clean, organized, and effective. However, without an easy way to carry over data from one step to the next, maintaining structure and efficiency in the document became tricky.
I eventually found a solution that kept my SSM documents not only orderly but also flexible: writing key information into a JSON file stored in the /tmp directory. Each step can easily access this file to read or update the necessary data, allowing information to be passed throughout the entire automation process.
In my previous article, I encountered exactly this issue and solved it with a similar workaround.
This method, while not perfect, has been effective for ensuring smooth data flow between steps in SSM documents.
Example: Passing Information Between Steps Using a JSON File
Let's consider a scenario where we want to store the instance ID and a timestamp in one step, then read that information in a following step. We'll start by writing this information into a JSON file in the /tmp directory, then read from that same file in the next step.
Here’s how this might look in an SSM document:
{
  "schemaVersion": "2.2",
  "description": "Example SSM Document to pass data between steps using a JSON file.",
  "mainSteps": [
    {
      "action": "aws:runShellScript",
      "name": "WriteToJSONFile",
      "inputs": {
        "runCommand": [
          "#!/bin/bash",
          "INSTANCE_ID=$(curl -s http://169.254.169.254/latest/meta-data/instance-id)",
          "DATE_TIME=$(date +'%Y-%m-%d %H:%M:%S')",
          "echo '{\"InstanceId\":\"'$INSTANCE_ID'\", \"Timestamp\":\"'$DATE_TIME'\"}' > /tmp/ssm_data.json",
          "echo 'Data written to /tmp/ssm_data.json'"
        ]
      }
    },
    {
      "action": "aws:runShellScript",
      "name": "ReadFromJSONFile",
      "inputs": {
        "runCommand": [
          "#!/bin/bash",
          "if [ -f /tmp/ssm_data.json ]; then",
          "  INSTANCE_ID=$(jq -r '.InstanceId' /tmp/ssm_data.json)",
          "  DATE_TIME=$(jq -r '.Timestamp' /tmp/ssm_data.json)",
          "  echo \"Instance ID from JSON: $INSTANCE_ID\"",
          "  echo \"Timestamp from JSON: $DATE_TIME\"",
          "else",
          "  echo '/tmp/ssm_data.json does not exist!'",
          "  exit 1",
          "fi"
        ]
      }
    }
  ]
}
Explanation of Each Step
- WriteToJSONFile
  - This step gathers the instance ID from the instance metadata service and the current timestamp.
  - It then writes these values into a JSON file at /tmp/ssm_data.json.
  - By storing this data in JSON format, we can easily retrieve it in the next step or in any steps that follow.
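As a side note, building JSON by concatenating strings in echo, as the step above does, is fragile if a value ever contains quotes or special characters. Since the reading step already depends on jq, a safer variant (my own sketch, not from the original document) is to let jq construct the file. The instance ID below is a hypothetical placeholder standing in for the metadata-service lookup:

```shell
#!/bin/bash
# Hypothetical values; on a real instance INSTANCE_ID would come from
# the metadata service as in the WriteToJSONFile step above.
INSTANCE_ID="i-0123456789abcdef0"
DATE_TIME="2024-01-01 00:00:00"

# jq -n builds the JSON document itself and handles all quoting and
# escaping, instead of hand-assembling the string with echo.
jq -n \
  --arg id "$INSTANCE_ID" \
  --arg ts "$DATE_TIME" \
  '{InstanceId: $id, Timestamp: $ts}' > /tmp/ssm_data.json

cat /tmp/ssm_data.json
```

Each line of this script would go into the `runCommand` array exactly like the original commands.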
- ReadFromJSONFile
  - This step reads the JSON file created in the previous step.
  - Using jq, it extracts the values of InstanceId and Timestamp and stores them in variables.
  - The script then echoes these values, demonstrating that the information has been successfully passed down from the previous step.
Limitations of Using JSON Files Between Steps
Using a JSON file to pass data between steps is a reliable solution, but it’s worth mentioning its downside. Ideally, setting environment variables would be a cleaner approach for passing data within a script. However, SSM documents don’t allow environment variables to persist across steps, which limits that option.
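A closely related workaround, which I'd consider a variation on the same idea rather than a fix for the limitation, is to persist the values as a sourceable shell file instead of JSON: one step writes `export` lines, and a later step sources them to recreate the variables. The path /tmp/ssm_env.sh is my own illustrative choice:

```shell
#!/bin/bash
# Step 1 would write a file of export statements (values shown here
# are hypothetical placeholders).
{
  echo "export INSTANCE_ID='i-0123456789abcdef0'"
  echo "export DATE_TIME='2024-01-01 00:00:00'"
} > /tmp/ssm_env.sh

# Step 2 would source the file to restore the variables in its own shell.
. /tmp/ssm_env.sh
echo "Instance ID: $INSTANCE_ID"
echo "Timestamp: $DATE_TIME"
```

This trades jq's structured access for simpler scripts; the JSON approach stays easier to inspect and extend when more fields accumulate.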
This JSON method has been the most effective solution I’ve found so far, as it keeps data organized and easy to access in each step. If there’s a more efficient way to achieve this without external files, I’d love to hear about it!