perl.gg / hidden-gems

<!-- category: hidden-gems -->

Hijacking STDOUT Into a String

2026-04-02

A function prints to STDOUT. You did not write the function. You cannot change the function. But you need what it prints as a string variable, not as text flying past on a terminal.
```perl
use Some::Module qw(noisy_function);

# noisy_function() calls print 47 times internally
# you need ALL of that output in $captured
```
The shell programmer reaches for backticks and a temp file. The sysadmin pipes to a file and slurps it back. The Perl programmer does something far more elegant: redirects STDOUT to a string variable in memory. No files. No processes. No shell.
```perl
my $captured;
{
    open my $fh, '>', \$captured or die $!;
    local *STDOUT = $fh;
    noisy_function();    # all print output goes into $captured
}
say "Got: $captured";
```
Everything that function printed is now sitting in $captured. The function had no idea. It called print like it always does. Perl just pointed print somewhere else for a moment.

Part 1: IN-MEMORY FILEHANDLES

The core of this trick is the in-memory filehandle. When you open a filehandle to a reference to a scalar, Perl creates a filehandle that writes to (or reads from) that scalar's memory instead of a file on disk:
```perl
my $buffer = '';
open my $fh, '>', \$buffer or die "Cannot open: $!";
print $fh "Hello ";
print $fh "World\n";
close $fh;
say $buffer;    # Hello World
```
The \$buffer is the key. That backslash-reference tells open "this is not a filename, this is a scalar variable." Perl creates a fake filehandle that appends to $buffer on every write.

No temp files created. No disk I/O. Everything stays in RAM. The scalar grows as you write to it, just like a file would grow on disk.

This has been in core since Perl 5.8. No modules needed.

Part 2: THE LOCAL STDOUT TRICK

An in-memory filehandle is nice, but how do you make someone else's print statements use it? You cannot go edit their code to say print $fh "stuff".

The answer is local *STDOUT:

```perl
my $output = '';
{
    open my $fh, '>', \$output or die $!;
    local *STDOUT = $fh;
    print "This goes into \$output\n";
    print "So does this\n";
}
# STDOUT is restored here
print "This goes to the terminal\n";
say "Captured: $output";
```
The local *STDOUT = $fh line temporarily replaces the STDOUT filehandle with your in-memory filehandle. Every bare print (which writes to the currently selected filehandle, STDOUT by default) now writes to your scalar. When the block exits, local restores the original STDOUT automatically.

This is dynamic scoping at its finest. Any code called from within that block, no matter how deep the call stack goes, sees the replaced STDOUT. The function you are capturing from does not need to know or cooperate.
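To make the dynamic scoping concrete, here is a small sketch with a made-up three-level call chain (the function names are hypothetical, not from any real module). Only the innermost function prints, yet the capture still works from the outermost call:

```perl
use strict;
use warnings;
use feature 'say';

# three levels deep -- only the innermost function prints
sub inner  { print "deep output\n" }
sub middle { inner() }
sub outer  { middle() }

my $captured = '';
{
    open my $fh, '>', \$captured or die $!;
    local *STDOUT = $fh;
    outer();    # three calls down, print still lands in $captured
}
say "Got: $captured";
```

Lexical scoping would have required passing the filehandle down through every call. Dynamic scoping means the replacement is visible to everything on the call stack, for exactly as long as the block runs.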

Part 3: CAPTURING MODULE OUTPUT

Here is the real use case. You are using a module that prints diagnostic information, and you want to capture it:
```perl
use Data::Dumper;

my $dump_text;
{
    open my $fh, '>', \$dump_text or die $!;
    local *STDOUT = $fh;
    print Dumper({ name => 'perl', version => 5.40 });
}

# $dump_text now contains the Dumper output
# parse it, log it, email it, whatever
if ($dump_text =~ m~'perl'~) {
    say "Found perl in the dump";
}
```
This works with any module. Legacy code that prints reports. Third-party libraries that dump status. Old CGI scripts that blast HTML to STDOUT. Wrap the call, capture the output, do what you want with it.

You never touch the module's source code. You never fork a process. You just temporarily bend STDOUT.

Part 4: A CLEAN CAPTURE FUNCTION

Wrap the pattern in a reusable function:
```perl
sub capture_stdout {
    my ($code) = @_;
    my $output = '';
    {
        open my $fh, '>', \$output or die "Cannot redirect STDOUT: $!";
        local *STDOUT = $fh;
        $code->();
    }
    return $output;
}

# usage
my $text = capture_stdout(sub {
    print "line one\n";
    print "line two\n";
    some_noisy_function();
});
say "Captured " . length($text) . " bytes";
```
Pass a coderef. Get a string back. Clean interface, zero boilerplate at the call site. The block scoping and local restoration happen inside the function where you never have to think about them again.

Part 5: CAPTURING STDERR TOO

STDOUT is not the only filehandle you can hijack. STDERR works the same way:
```perl
my $errors = '';
{
    open my $fh, '>', \$errors or die $!;
    local *STDERR = $fh;
    warn "This warning goes into \$errors\n";
    system("ls /nonexistent 2>&1");    # more on system() later
}
say "Captured errors: $errors";
```
Or capture both at once into separate variables:
```perl
my ($out, $err) = ('', '');
{
    open my $out_fh, '>', \$out or die $!;
    open my $err_fh, '>', \$err or die $!;
    local *STDOUT = $out_fh;
    local *STDERR = $err_fh;
    print "normal output\n";
    warn "error output\n";
    do_something_complex();
}
say "STDOUT got: $out";
say "STDERR got: $err";
```
Two streams, two variables, completely separated. No temp files, no pipes, no shell gymnastics.

Part 6: TESTING PRINT-HEAVY CODE

This is a killer application for testing. You have a function that prints a report and you need to verify the output:
```perl
use Test::More;

sub generate_report {
    print "=== Report ===\n";
    print "Items: 42\n";
    print "Status: OK\n";
}

# capture and test the output
my $report;
{
    open my $fh, '>', \$report or die $!;
    local *STDOUT = $fh;
    generate_report();
}

like($report, qr~=== Report ===~, 'has header');
like($report, qr~Items: \d+~,     'has item count');
like($report, qr~Status: OK~,     'status is OK');

done_testing();
```
No more "did the test print the right thing?" eyeball testing. Capture it. Assert against it. Automate it.

The module Capture::Tiny on CPAN does this same thing with a nicer API, but understanding the mechanism means you can do it anywhere without installing anything.
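For comparison, here is a sketch of the Capture::Tiny interface, assuming the module is installed from CPAN. Its capture block returns STDOUT, STDERR, and the block's own return values in one call (and, unlike the local *STDOUT trick, it redirects the real file descriptors, so it catches external commands too):

```perl
use strict;
use warnings;
use feature 'say';
use Capture::Tiny qw(capture);

# capture returns ($stdout, $stderr, @return_values)
my ($stdout, $stderr, @result) = capture {
    print "to stdout\n";
    warn "to stderr\n";
    return 42;
};

say "out: $stdout";
say "err: $stderr";
say "got back: $result[0]";
```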

Part 7: THE SYSTEM() GOTCHA

Here is the big trap. The local *STDOUT trick does NOT capture output from system() calls or backticks:
```perl
my $output = '';
{
    open my $fh, '>', \$output or die $!;
    local *STDOUT = $fh;
    print "this IS captured\n";
    system("echo 'this is NOT captured'");    # goes to terminal!
}
say $output;    # only contains "this IS captured\n"
```
Why? Because system() forks a child process. That child has its own file descriptors, inherited from the parent. The in-memory filehandle exists only inside the Perl interpreter. The child process knows nothing about it. It writes to the real file descriptor 1, which still points at your terminal.

If you need to capture external command output, use backticks or open with a pipe:

```perl
my $cmd_output = `echo 'hello from the shell'`;

# or with open
open my $pipe, '-|', 'echo', 'hello from the pipe' or die $!;
my $piped = do { local $/; <$pipe> };
close $pipe;
```
The local *STDOUT trick is for Perl-level print statements only. Anything that calls write(2) at the C level bypasses it.
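If you really do need system() output and cannot use backticks or a pipe, one workaround is to redirect the real file descriptor 1 to a temp file, run the command, restore, and slurp. This is a sketch using core File::Temp, not the article's main technique; note it reintroduces the disk I/O the in-memory approach avoids:

```perl
use strict;
use warnings;
use feature 'say';
use File::Temp qw(tempfile);

my ($tmp_fh, $tmp_name) = tempfile(UNLINK => 1);
close $tmp_fh;

# dup the real STDOUT so we can restore it, then point fd 1 at the file
open my $saved_stdout, '>&', \*STDOUT or die "Cannot dup STDOUT: $!";
open STDOUT, '>', $tmp_name           or die "Cannot redirect: $!";

system("echo captured for real");    # child inherits the redirected fd 1

open STDOUT, '>&', $saved_stdout or die "Cannot restore STDOUT: $!";

open my $in, '<', $tmp_name or die $!;
my $cmd_output = do { local $/; <$in> };
close $in;

say "Got: $cmd_output";
```

Because this rebinds the actual file descriptor rather than a Perl-level glob, the child process sees the redirection too.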

Part 8: READING FROM IN-MEMORY STRINGS

The reverse works too. You can create a filehandle that reads from a string:
```perl
my $data = "line one\nline two\nline three\n";
open my $fh, '<', \$data or die $!;
while (my $line = <$fh>) {
    chomp $line;
    say "Read: $line";
}
close $fh;
```
This is useful for testing code that reads from STDIN:
```perl
my $fake_input = "yes\n42\nquit\n";
{
    open my $fh, '<', \$fake_input or die $!;
    local *STDIN = $fh;

    # code that reads from STDIN now reads from $fake_input
    my $answer = <STDIN>;    # gets "yes\n"
    my $number = <STDIN>;    # gets "42\n"
    my $cmd    = <STDIN>;    # gets "quit\n"
}
```
Feed fake input to interactive code without actually typing anything. Combine with STDOUT capture and you can test a complete interactive session in memory.
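Putting the two together, here is a sketch of testing a full interactive round trip in memory. The greet() routine is a hypothetical stand-in for any prompt-and-respond code:

```perl
use strict;
use warnings;
use feature 'say';

# a toy interactive routine: reads a name from STDIN, prints a greeting
sub greet {
    my $name = <STDIN>;
    chomp $name;
    print "Hello, $name!\n";
}

my $fake_input = "World\n";
my $captured   = '';
{
    open my $in,  '<', \$fake_input or die $!;
    open my $out, '>', \$captured   or die $!;
    local *STDIN  = $in;
    local *STDOUT = $out;
    greet();
}
say "Session output: $captured";
```

Input comes from one scalar, output lands in another, and no terminal is involved at any point.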

Part 9: APPENDING AND LAYERING

You can open in-memory filehandles in append mode too:
```perl
my $log = "=== Start ===\n";
open my $fh, '>>', \$log or die $!;
print $fh "Entry 1\n";
print $fh "Entry 2\n";
close $fh;
say $log;
# === Start ===
# Entry 1
# Entry 2
```
The >> mode appends to whatever is already in the scalar. Useful when you want to build up output across multiple capture sessions:
```perl
my $combined = '';
for my $func (\&report_users, \&report_disk, \&report_services) {
    open my $fh, '>>', \$combined or die $!;
    local *STDOUT = $fh;
    $func->();
}
# $combined now has all three reports concatenated
```
Each function prints its report. Each report appends to the same scalar. One variable, three captures, zero temp files.

Part 10: THE FULL PICTURE

Here is everything together. A utility that captures STDOUT and STDERR from arbitrary code, with timing:
```perl
#!/usr/bin/env perl
use strict;
use warnings;
use feature 'say';
use Time::HiRes qw(gettimeofday tv_interval);

sub capture_all {
    my ($code) = @_;
    my ($stdout, $stderr) = ('', '');
    my $t0 = [gettimeofday];
    {
        open my $out_fh, '>', \$stdout or die $!;
        open my $err_fh, '>', \$stderr or die $!;
        local *STDOUT = $out_fh;
        local *STDERR = $err_fh;
        $code->();
    }
    my $elapsed = tv_interval($t0);
    return {
        stdout  => $stdout,
        stderr  => $stderr,
        elapsed => $elapsed,
    };
}

# usage
my $result = capture_all(sub {
    print "Processing...\n";
    warn "Minor issue detected\n";
    print "Done.\n";
});

say "STDOUT: $result->{stdout}";
say "STDERR: $result->{stderr}";
say "Took: $result->{elapsed}s";
```
```
    .--.
   |o_o |    "STDOUT goes where I tell it.
   |:_/ |     Not where it wants to go."
  //   \ \
 (|     | )
/'\_   _/`\
\___)=(___/
```
The local *STDOUT trick is pure Perl. No modules. No external tools. No temp files littering your filesystem. You reach into the interpreter, point a filehandle somewhere else for a moment, and everything that prints during that moment goes where you want it.

Functions that print become functions that return strings. Code that was untestable becomes trivially testable. Output you could only see on a terminal becomes data you can parse, filter, and transform.

It is one of those Perl features that makes you wonder why every language does not work this way. Open a filehandle to a variable. Redirect output into it. Get your data. Done.

perl.gg*