Web Programming, Linux System Administration, and Entrepreneurship in Athens, Georgia

Author: Brandon (Page 23 of 29)

Tracking down how hackers gain access through web apps

Hackers commonly use vulnerabilities in web applications to gain access to a server. Sometimes, though, it can be difficult to track down exactly how they got in, especially if the server hosts a bunch of websites and there are lots of potentially vulnerable scripts.

I’ve tracked down more of these than I can count, and have sort of developed a pattern for investigating. Here are some useful things to try:

1- Look in /tmp and /var/tmp for possibly malicious files. These directories are usually world-writable and commonly used to temporarily store files. Sometimes the files are disguised with leading dots, or they may be named to blend in with other files in the directory, like “. ” (dot-space) or a session file named sess_something.

If you find any suspicious files, you can use their timestamps to look through the Apache logs for the exact hit they came from.
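A minimal sketch of that first pass, using standard coreutils (adjust the paths for your system):

```shell
# List the temp directories with dotfiles included, newest last,
# so recently dropped files stand out at the bottom of the listing.
ls -lart /tmp /var/tmp 2>/dev/null

# Separately, flag anything modified in the last 24 hours.
find /tmp /var/tmp -type f -mtime -1 2>/dev/null
```

The modification times that turn up here are the ones worth grepping for in the Apache access logs later.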

2- If a rogue process is still running, look at its /proc entry to learn more about it. The files in /proc/<PID> will tell you things like the executable that created the process, its working directory, its environment, and plenty more. Usually, rogue processes run as the Apache user (httpd, nobody, apache).

If all of the rogue processes were run by the Apache user, then the hacker likely didn’t gain root access. If you have rogue processes that were run by root, the cleanup is much harder. Usually the only truly safe method is to start over with a clean installation.
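For example, a quick tour of /proc for a suspect PID might look like this (here our own shell's PID stands in for the rogue one):

```shell
pid=$$    # substitute the rogue PID found via ps or netstat

ls -l /proc/$pid/exe     # symlink to the executable that started the process
ls -l /proc/$pid/cwd     # symlink to its current working directory
tr '\0' '\n' < /proc/$pid/environ | head   # environment, one variable per line
ps -o user= -p $pid      # the user it runs as (httpd/nobody/apache vs. root)
```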

3- netstat -l will help you identify processes that are listening for incoming connections. Often these are Perl scripts. Sometimes they are named to look legitimate, like ‘httpd’, so pay close attention. netstat -n will show you the current connections your server has to others.

4- Look in your error logs for files being downloaded with wget. A common tactic is for hackers to run a wget command to download another file with more malicious instructions. Fortunately, wget writes to STDERR, so its output usually shows up in the error logs. Something like this is evidence of a successful wget:

--20:30:40--  http://somehackedsite.com/badfile.txt
           => `badfile.txt'
Resolving somehackedsite.com... 12.34.56.78
Connecting to somehackedsite.com[12.34.56.78]:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 12,345 [text/plain]

     0K .......... ......                                     100%  263.54 KB/s

20:30:50 (263.54 KB/s) - `badfile.txt' saved [12,345/12,345]

You can use this information to try to recreate what the hacker did. Look at the file they downloaded (badfile.txt in this case) and see what it does. You can also use these timestamps to look through the access logs to find the vulnerable script.

Since wget is a commonly used tool for this, I like to create a .wgetrc file that contains bogus proxy information, so that even if a hacker is able to attempt a download, it won’t work. Create a .wgetrc file in Apache’s home directory with this content:

http_proxy = http://bogus.dontresolveme.com:19999/
ftp_proxy = http://bogus.dontresolveme.com:19999/

5- If you were able to identify any timestamps, you can grep through the Apache logs to find requests from that time. If you have a well-structured server where the logs live in a consistent place, then you can use a command like this to search all of the log files at once:

grep "01/Jun/2007:10:20:" /home/*/logs/access_log

I usually leave out the seconds field because requests sometimes take several seconds to execute. If you found a server name or file name used in a wget, you can try searching for those too:

grep "somehackedsite.com" /home/*/logs/access_log

6- Turn off PHP’s register_globals by default and only enable it where truly needed. If you write PHP apps, learn to program securely and never rely on register_globals being on.
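For reference, the global switch lives in php.ini (`register_globals = Off`), and with mod_php you can also pin it off for a particular vhost so a stray local override can’t re-enable it (the vhost name and paths below are placeholders):

```
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /home/example/public_html
    # Force register_globals off regardless of php.ini or .htaccess
    php_admin_flag register_globals off
</VirtualHost>
```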

Disabling PHP processing for an individual file

I sometimes want to post samples of PHP scripts on my website. Since the web server is configured to parse files that end in .php, simply linking to the PHP file will execute it instead of displaying its contents. In the past, I’ve always made a copy of the file with a .txt extension to have it served as text/plain. That way is kind of clumsy though: if users want to save the file, they download it as a .txt and have to rename it to .php.
Fortunately, Apache has a way to do just about anything. To configure it not to parse a specific PHP file, you can use this in your Apache configuration:

<Files "some.file.php">
   RemoveHandler .php
   ForceType text/plain
</Files>

If you have AllowOverride FileInfo enabled, this can also be placed in a .htaccess file. It should work for other file types like .cgi or .pl files as well. You can substitute a FilesMatch directive to match multiple file names based on a regular expression.

What a difference a blank line can make

I had a customer today who had problems with a PHP script that output a Microsoft Word document. The script was pretty simple and just did some authentication before sending the file to the client. But, when the document was opened in Word, it tried to convert it into a different format and would only display gibberish.

The customer had posted his problem on some forums and was told that upgrading from PHP 5.1.4 to PHP 5.2 should fix it. Well, it didn’t. In fact, the PHP 5.2 version had a weird bug where a PDO object would overwrite data at the wrong memory location. In this case, a call to fetchAll() was overwriting the username stored in the $_SESSION variable, which in turn was breaking all of the site’s authentication. After digging in far enough to find that out, it seemed best to revert to PHP 5.1. Once that was done, we were back to the original problem with the Word document.

The headers he was sending all looked okay. Here’s the relevant code to download a document:

$file = "/path/to/some_file.doc";
header("Pragma: public");
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: private",false); // required for certain browsers
header("Content-Type: application/msword");
header("Content-Disposition: attachment; filename=\"".basename($file)."\";" );
header("Content-Transfer-Encoding: binary");
header("Content-Length: ".filesize($file));
readfile($file);

I tried tweaking them a little to match a known-working site, but to no avail. I finally downloaded a copy of the file directly from the web server, bypassing the PHP script, and another copy through the PHP script, and saved them both for comparison. Looking at them side-by-side in vi, I noticed an extra line at the top of the bad one. I removed the extra line, and the fixed copy opened fine in Word. After that, it was just a matter of finding the included file with the extra line. Sure enough, one of the configuration files had an extra line after the closing ?> tag. Removed that and everything worked correctly.
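That kind of stray whitespace is easy to hunt for with a few lines of shell. Here’s a rough sketch (the helper name is made up): PHP swallows exactly one newline immediately after the closing ?>, so a file ending in “?>\n” is harmless, but a second newline becomes output that corrupts downloads like this one.

```shell
# check_trailing: report PHP files whose last bytes are "?>" plus a blank line.
check_trailing() {
    for f in "$@"; do
        # Render the last 4 bytes printably, e.g. "?>\n\n".
        last=$(tail -c 4 "$f" | od -An -c | tr -d ' ')
        case "$last" in
            *'?>\n\n') echo "trailing blank line: $f" ;;
        esac
    done
}

# Demo on two throwaway files: only bad.php should be reported.
dir=$(mktemp -d)
printf '<?php /* config */ ?>\n'   > "$dir/good.php"
printf '<?php /* config */ ?>\n\n' > "$dir/bad.php"
check_trailing "$dir"/*.php
```

Running the demo prints a warning for bad.php only.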

Experience with eJabberd

I spent a couple hours today trying to install a jabber server called ‘ejabberd’. The latest release was version 1.1.3, so of course I downloaded and installed that. It came as a Linux binary installation, so I just ran that, and it seemed to install okay. The program even started up fine, and it looked like everything was working. I hit a little snag when trying to log in to the web interface the first time: the installation program had asked for an admin password but evidently didn’t set it. I had to set the password on my own, but then was able to log in to the admin web interface okay. Then I created a user account to log in as, and it went downhill from there.

ejabberd is written in a high-level language called Erlang. Evidently there is some problem with the version of Erlang that the binary installation includes: it has problems with the cryptography functions. I spent a couple hours searching the ejabberd forums, and the posts they referenced on the Erlang development forums. After much wasted time, and several attempts at compiling Erlang separately, I was about to give up.

Then, I noticed a link for previous versions of ejabberd on their downloads page and decided to give version 1.0.0 a shot. I downloaded, installed, and had it running in under ten minutes. I guess sometimes older is better.

Stupid Advertisers

Advertisers can be dumb. I don’t read a whole lot of magazines, but a couple of advertisements in eWeek have made me laugh.

First is the latest VeriSign ad that says ‘Introducing the biggest advancement to Internet security in the last ten years’. Turns out that this amazing new advancement is that they now sell an ‘Extended Validation SSL’ certificate. The amazing feature they’ve managed to incorporate is that modern browsers will display a green address bar when they detect a site using one of these new certificates.

So that’s it? The big advancement in security is that the address bar turns green? You’ve gotta be kidding me. Sounds to me like just a way for VeriSign to make more money.

And my favorite stupid-funny advertisement is a Microsoft ad that used to be inside the front cover of eWeek. You’ll notice the text that claims five-nines uptime with a star next to it. Then read the fine print in the footnote for the star: “Results Not Typical”. It still makes me laugh. Too bad they don’t run this ad anymore.

[Images: the funny Microsoft ad, the ad text, and the fine print]

Avazio.com it is

After spending far too many hours looking up possible domain names, I’ve finally settled on avazio.com. This will be a place for me to sell programs that I’ve written, and to advertise System Administration and Programming services. There is no special meaning or anything to the name. It’s just something that sounded cool and was available. I’ve spent a little time putting up a website there with a little bit of information about the products and services that I’m hoping to sell.

I’m actually quite happy with the look of the site. It’s nothing too complicated, but I have created all of the graphics for it myself using an old version of Paint Shop Pro. Considering that I know nothing about graphics, I think that it looks pretty good. I picked the colors from colorschemer.com (although I forget which one).

Gonna give Asterisk a try

I attended a meeting tonight at my local Linux Users Group in Columbia, Maryland. The topic was getting an Asterisk VoIP server up and running, presented by a couple of members (John and Terry) who have both been playing with different Asterisk distributions. John went the route of the full-featured trixbox, and Terry went for a minimal installation on an embedded device, showing just how flexible and customizable Asterisk is.

My work has recently agreed to migrate our phone system over to an Asterisk server, so I’m hoping to get that set up in the next week or two so that I can start playing with it.

Redirecting WordPress ‘numeric’ permalinks

When I set up my blog, I configured it to use the ‘numeric’ permalinks that look something like https://www.brandonchecketts.com/archives/61

Of course, that is pretty straightforward, but it isn’t very readable, and we can probably get a few extra SEO points by putting the post’s title in the URL. However, just changing the permalink format is a bad idea, since I have a bunch (okay, a few) of incoming links that I don’t want to break.

So, I wrote a quick WordPress Plugin that redirects these old numeric links to my new format. Simply create a file named ‘numeric_permalink_redirect.php’ in your wp-content/plugins directory with this content:

<?php
/*
Plugin Name: BC Rewriter
Plugin URI: https://www.brandonchecketts.com/
Description: Redirect old requests to new permalink format
Author: Brandon Checketts
Version: 1.5
Author URI: https://www.brandonchecketts.com/
*/

// redirect old "numeric" type archives to our current permalink structure
function check_numeric_permalink()
{
  if (preg_match('#^/archives/([0-9]+)#', $_SERVER['REQUEST_URI'], $matches)) {
    $post_id = $matches[1];
    $url = get_permalink($post_id);
    header("HTTP/1.1 301 Moved Permanently");
    header("Location: $url");
    exit;
  }
}

add_action('init', 'check_numeric_permalink');

?>

That will do a 301 permanent redirect to the new URL, so you shouldn’t lose anybody, and the search engines should update their incoming links.

Upgrading CentOS 4.4 to CentOS 5 with yum

CentOS 5 has been out for a little while now, and I still hadn’t had a chance to install and play with it. Today, though, I was able to make an excuse to start on it. I did it on a virtual server, since it’s much easier for me to copy an existing file system over to a new virtual server than to create an entirely new one. So, I started googling for ways to upgrade rather than doing a fresh install.

I came across these instructions on how to upgrade an x86_64 architecture and kind of adapted them to the i386 architecture that I was using. There was lots of trial and error to get it working, but overall it only took about two hours. I’m looking back through my history to make sure I caught all of the important steps; these instructions are based off of that. I’ll revise them to be more exact the next time I try this.

Note: the CentOS 5 mirror that I used can be swapped for a mirror of your choice. If you are going to be doing this more than a couple of times, I’d suggest setting up your own private repository so that you only download everything once. You can use this command to rsync to your local mirror server:

/usr/bin/rsync -aqzH --delete mirror.chpc.utah.edu::CentOS/5.0/ /home/mirrors/centos/5.0

Again, adjust the mirror server to one that is close to you. You can find a good one on this list.

Remove unnecessary packages:

yum erase http* php* mysql* autofs glibc-kernheaders

Download and install the centos-release stuff:

wget  https://centos.sd2.mirrors.redwire.net/5.0/os/i386/CentOS/centos-release-notes-5.0.0-2.i386.rpm
wget https://centos.sd2.mirrors.redwire.net/5.0/os/i386/CentOS/centos-release-5-0.0.el5.centos.2.i386.rpm
rpm -Uvh centos-release*

Install the CentOS5 GPG Key

rpm --import https://mirrors.kernel.org/centos/RPM-GPG-KEY-CentOS-5

Create a file called ‘files’ and add this content:

rpm-libs-4.4.2-37.el5.i386.rpm
rpm-4.4.2-37.el5.i386.rpm
yum-3.0.5-1.el5.centos.2.noarch.rpm
rpm-python-4.4.2-37.el5.i386.rpm
popt-1.10.2-37.el5.i386.rpm
glibc-2.5-12.i386.rpm
glibc-common-2.5-12.i386.rpm
beecrypt-4.1.2-10.1.1.i386.rpm
glibc-headers-2.5-12.i386.rpm
glibc-devel-2.5-12.i386.rpm
binutils-2.17.50.0.6-2.el5.i386.rpm
elfutils-libelf-0.125-3.el5.i386.rpm
elfutils-0.125-3.el5.i386.rpm
elfutils-libs-0.125-3.el5.i386.rpm
beecrypt-python-4.1.2-10.1.1.i386.rpm
python-2.4.3-19.el5.i386.rpm
python-devel-2.4.3-19.el5.i386.rpm
python-elementtree-1.2.6-5.i386.rpm
python-sqlite-1.1.7-1.2.1.i386.rpm
python-urlgrabber-3.1.0-2.noarch.rpm
neon-0.25.5-5.1.i386.rpm
libxml2-2.6.26-2.1.2.i386.rpm
libxml2-python-2.6.26-2.1.2.i386.rpm
db4-4.3.29-9.fc6.i386.rpm
libselinux-1.33.4-2.el5.i386.rpm
libsepol-1.15.2-1.el5.i386.rpm
mcstrans-0.1.10-1.el5.i386.rpm
m2crypto-0.16-6.el5.1.i386.rpm
krb5-libs-1.5-17.i386.rpm
openssl-0.9.8b-8.3.el5.i386.rpm
readline-5.1-1.1.i386.rpm

Then run this to download them all:

for file in `cat files`
do
wget https://centos.sd2.mirrors.redwire.net/5.0/os/i386/CentOS/$file
done

Then install them all:

rpm -Uvh *.rpm  --nodeps

Clean up some old stuff:

rm -f /var/lib/rpm/__*
rpm --rebuilddb
yum clean all

Download and install the new kernel. (This doesn’t do anything for me, since I’m on a virtual server whose kernel is controlled by the host. But it’s already running 2.6.20, so it should have any features that 2.6.18 uses.)

wget https://centos.sd2.mirrors.redwire.net/5.0/os/i386/CentOS/kernel-2.6.18-8.el5.i686.rpm
wget https://centos.sd2.mirrors.redwire.net/5.0/os/i386/CentOS/kernel-headers-2.6.18-8.el5.i686.rpm
rpm -ivh kernel-* --nodeps

Now upgrade all of the packages:

yum upgrade

And re-install stuff that you removed before starting:

yum install mysql mysql-server httpd php

Google Problems

I read about some Google problems the other day and just experienced some of my own. Yesterday, when trying to log into my Gmail account, I received an error that the service was unavailable. It’s amazing how much we have come to rely on these free services. It does make me glad that I still use a thick mail client for my main email usage though.

The recent problems reinforced something that I realized about them when I was out there recently: Google is essentially the same as most other IT companies. They have problems just like anybody else does. Fortunately, they have built things in a way that usually allows the problems to go unnoticed by users. It seems, however, that with the added complexity of their applications, and with all of the new people being hired, some of their rough edges are starting to show a little.
