The Problem
When running CI/CD pipelines in a monorepo environment, you can encounter unexpected errors. Our team ran into an Argument list too long error in a GitHub Actions file validation workflow.
This workflow verifies that files changed in a PR are only within the service directory corresponding to the branch. For example, on a service-a branch, only files in the serviceA directory should be modified.
The original implementation used tj-actions/changed-files to get the list of changed files, then iterated through the file list passed via environment variables in a Bash for loop.
for file in $ALL_CHANGED_FILES; do
  service_type=$(echo "$file" | cut -d'/' -f2)
  if [ "$service_type" != "$service_directory" ]; then
    echo "The file $file is not allowed to be changed in the $SERVICE_NAME branch."
    exit 1
  fi
done
The problem occurs when there are many changed files. Linux caps the total size of the arguments and environment passed to a new process at ARG_MAX (typically about 2 MB), and it additionally caps any single argument or environment string at 128 KiB (MAX_ARG_STRLEN). When the file paths stored in the $ALL_CHANGED_FILES environment variable exceed these limits, the Argument list too long error is triggered. As the monorepo grows, it becomes increasingly common for a single PR to change hundreds or thousands of files, making it easy to hit this limit.
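These limits are easy to inspect directly. A minimal sketch, assuming a Linux shell (the byte counts shown in the comments are typical defaults, not guarantees):

```shell
# Inspect the limits behind "Argument list too long" (Linux).
getconf ARG_MAX   # total bytes allowed for argv + environ, commonly 2097152

# A single argument or environment string is also capped at MAX_ARG_STRLEN
# (128 KiB on Linux); one oversized argument is enough to make execve() fail:
big=$(head -c 200000 /dev/zero | tr '\0' 'a')
/bin/echo "$big" || echo "echo failed: argument too long"
```

This is why the error appears only once the changed-file list crosses the threshold: the Bash loop itself is fine, but expanding the huge variable into a command invocation is not.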
Solution
The solution has two key parts.
1. Save the Changed File List to a File
We used the write_output_files option in tj-actions/changed-files to save the changed file list to a file instead of an environment variable.
- name: Get changed files
  id: changed-files
  uses: tj-actions/changed-files@…
  with:
    files: |
      **
    write_output_files: true
    output_dir: /tmp
This writes the changed file list to /tmp/all_changed_files.txt. Since data is exchanged through the filesystem, it’s not affected by the OS argument length limit.
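For comparison, even staying in Bash, reading this file line by line would sidestep the limit, since nothing large ever lands on a command line or in the environment. A hypothetical sketch (the sample file contents and the serviceA mapping are illustrative, not from the actual workflow):

```shell
# Create a sample list for illustration; in CI the action writes this file.
printf '%s\n' 'apps/serviceA/main.ts' 'apps/serviceA/util.ts' > /tmp/all_changed_files.txt

service_directory="serviceA"   # assumed result of the branch-to-directory mapping
while IFS= read -r file; do
  service_type=$(echo "$file" | cut -d'/' -f2)   # second path segment
  if [ "$service_type" != "$service_directory" ]; then
    echo "The file $file is not allowed to be changed in this branch."
    exit 1
  fi
done < /tmp/all_changed_files.txt
echo "all changed files validated"
```

We chose to go further and move the logic out of Bash entirely, as described next.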
2. Switch from Bash to JavaScript
We rewrote the file reading and processing logic in JavaScript using actions/github-script instead of Bash.
- name: Validate file changes
  uses: actions/github-script@v7
  with:
    script: |
      const fs = require('fs');
      const serviceDirectoryMap = {
        'service-a': 'serviceA',
        'service-b': 'serviceB',
        'service-c': 'serviceC',
        // ...
      };
      const serviceName = process.env.SERVICE_NAME;
      const serviceDirectory = serviceDirectoryMap[serviceName];
      // ...
      const fileContent = fs.readFileSync('/tmp/all_changed_files.txt', 'utf8');
      const files = fileContent.trim().split(/\s+/).filter(file => file.trim() !== '');
      for (const filename of files) {
        // Check allowed files and validate service directory
      }
The benefits of this transition are as follows:
| Aspect | Bash (Before) | JavaScript (After) |
|---|---|---|
| File list delivery | Environment variable (length limited) | File I/O (no practical limit) |
| Service mapping | case statement | Object-literal lookup |
| Error handling | exit 1 | core.setFailed() |
| Readability | Shell-specific syntax | Plain JavaScript |
Key Takeaways
Don’t pass large amounts of data through environment variables. This isn’t a problem limited to GitHub Actions — it’s a constraint across Unix/Linux systems in general. Shell command argument lengths have OS-level limits, and if the number of changed files in your CI environment is unpredictable, you must use a file-based approach.
Additionally, actions/github-script is a great alternative to Bash for handling complex logic in GitHub Actions. It runs on a Node.js runtime, so you can freely use standard modules like fs and path, and you can directly leverage Actions-specific APIs like core.setFailed().
Conclusion
It was a small change, but one that significantly improved the stability of our CI pipeline. As the monorepo grows, the likelihood of encountering system-level constraints like this increases. It reminded us that designing for scalability is important, rather than settling for “it works for now.”
To learn more about the overall structure of the file validation workflow, see A GitHub Actions Workflow to Restrict File Change Scope by Branch in a Monorepo.