Limiting maximum line length during line-based read

I'm throwing together a simple "server" that will run from inetd, and
since quick and simple was the plan, I decided to use a line-based
protocol.  I figured perl handles line-based data pretty easily, so it
should amount to a trivial amount of development time.  Anyway, since
it's line-based, the main handler loop looks similar to this:

$line = <STDIN>;    # blocks until a "\n" arrives - potentially forever
( $code, $message ) = $line =~ /^(\d+)\s+(\S.*)\r\n$/;

Offtopic for this post, but I chose a number/message thing because it
would let the program read the code and act on it while logging a more
readable message for the administrator.

So this will be run on the local network, but the server is routable,
so anyone on the internet can connect to it; that made me consider the
security issues it would bring up.  I'm sure everyone hates buffer overflows as
much as the next guy, but I figured I'm safe from that because perl will
expand the size of $line dynamically as it needs to.  That's when I
realized that would be a problem - what if the client *never* sent the
line terminator?  Well, of course perl will just read and read and read
until the OS kills it or the machine freezes, depending on how the VM
handles low memory availability.  A one-liner to demonstrate (if you
want to run this, be ready to hit Ctrl-C quickly):

perl -e '$line=<>' /dev/zero

So here, it would fill up $line with null bytes and keep reading
and reading.  For me, it takes about 5 seconds before the system becomes
mostly unusable.  Knowing this, and knowing my "server" is line-based
and exhibits this behaviour, mounting a DoS attack against it needs
nothing more than netcat or the netpipes tools:

hose my.server.com 1234 -out cat /dev/zero

And the server will merrily thrash its VM in short order on a decent
connection.  BTW, I've noticed lots of the Net:: server modules do
blindly read as well, so they will have this problem too.

What I wanted was something like fgets from C.  I looked, and the
POSIX module docs say to use IO::Handle->fgets, which now appears to be
deprecated and calls getline() after spewing a warning.  Well,
getline() is just a sub that returns <$fh>, bringing me back to my
original problem.
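To illustrate that last point: from memory, getline() amounts to little more than the following (a paraphrase, not the actual module source):

```perl
use strict;
use warnings;

# Roughly what IO::Handle::getline boils down to: a scalar-context
# readline on the handle -- the same unbounded <$fh> read as before,
# with no cap on how much it will buffer.
sub getline_like {
    my ($fh) = @_;
    return scalar <$fh>;
}
```

So wrapping the read in getline() buys nothing against a client that never sends the terminator.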

What I've done is write a little sub called line_in that uses getc()
over and over and terminates if the string gets too long or if a \n is
received.  I know this is about as inefficient as it gets; I could use
read() instead and do it block by block, but then there would be other,
more irritating issues, like jamming leftover bytes back into the buffer
and worrying about timeouts if the whole block isn't read.  So I picked
the easy way, but I have to believe there is a better one.  I've checked
FAQs, a few of the O'Reilly books, and even #perl on EFnet, but haven't
found an answer.  Any ideas?
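For reference, my line_in looks roughly like the following (the 1024-byte default cap and the undef-on-overflow behaviour are just choices I made, nothing standard):

```perl
use strict;
use warnings;

# Read one line from $fh one character at a time, but give up and
# return undef if more than $max bytes arrive before a newline.
# Slow, but memory use is bounded no matter what the client sends.
sub line_in {
    my ( $fh, $max ) = @_;
    $max ||= 1024;                       # arbitrary per-line cap
    my $line = '';
    while ( defined( my $ch = getc($fh) ) ) {
        $line .= $ch;
        return $line if $ch eq "\n";              # complete line: done
        return undef if length($line) >= $max;    # too long: bail out
    }
    return length($line) ? $line : undef;  # EOF: partial line or nothing
}
```

The handler loop then becomes `defined( my $line = line_in(\*STDIN) ) or exit;` followed by the regex match as before.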

(apologies in advance if this shows up twice; mozilla crashed while
trying to send and I don't see it in the header list yet)
--

http://www.*-*-*.com/
Remove the junk to email me



Mon, 31 May 2004 04:57:02 GMT  
 
