Hey everyone. A lot of people wonder how to find an admin page. When I was just beginning web hacking, I started with SQL injection, as many of you probably did as well. The problem is, you can rarely find the proper admin page. Either you can't find it at all, or it won't work.
First of all, I'd like to get this through EVERYONE's head: there is a difference between a CPANEL login and an ADMIN login. The cPanel login manages EVERYTHING on the site. You have access to FTP and everything else. Unless the admin is VERY VERY VERY careless (it happens, but it's rare), the admin page will not use the same credentials as the cPanel. So don't even bother trying that login.
On a website, the webmaster will usually create a standard admin page so the admins can manage everything more easily.
Now. Let's start listing our possibilities.
This is by far the most popular way of finding an admin page: a lot of people simply guess what it is. Most of the time, it will be either a directory OR a file. For example, a directory would look like /admin/ and a file would look like admin.php. A directory can contain other files, but a file is just one page.
A lot of the time, the admin page is simply "admin". So you can try adding /admin to the end of the site's URL. If you get a 404 error (which usually means the file does not exist), then that's not it. After I try "admin", I usually move on to a different method. This next one is probably my favorite.
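If you'd rather not type each guess by hand, a few lines of Python can build the candidate URLs for you. This is just a sketch; the path list below is my own guess at common names, not something from this tutorial.

```python
# Hypothetical sketch: build candidate admin URLs to check one by one.
# COMMON_PATHS is just a guess at typical names, not an exhaustive list.
COMMON_PATHS = ["admin", "admin.php", "administrator", "login", "cpanel"]

def candidate_urls(base):
    """Join each common path onto the site's base URL."""
    base = base.rstrip("/")  # avoid a double slash like site.com//admin
    return [f"{base}/{p}" for p in COMMON_PATHS]

for url in candidate_urls("http://site.com/"):
    print(url)
```

You'd then open each printed URL in a browser and watch for anything that isn't a 404.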
What is Robots.txt? Robots.txt is a file that asks crawlers and scanners not to index certain pages. If they don't want a crawler to find something, it's for a reason, right? Obviously it's an important file, so sometimes they will have the admin page listed in there. This is what a Robots.txt page looks like:
Code:
User-Agent: *
Disallow: /moderation.php
Disallow: /ratethread.php
Disallow: /report.php
Disallow: /reputation.php
Disallow: /sendthread.php
Disallow: /usercp.php
Disallow: /usercp2.php
Disallow: /newreply.php
Disallow: /newthread.php
Disallow: /editpost.php
Disallow: /private.php
Disallow: /search.php
Disallow: /refer.php
Disallow: /myawards.php
Disallow: /stats.php
Disallow: /member.php
Disallow: /memberlist.php
Disallow: /showteam.php
Disallow: /upgrade.php
Disallow: /showratings.php
User-agent: dotbot
Disallow: /
User-agent: 008
Disallow: /
Even if it says "Disallow", we can usually still open the files directly; the file only asks crawlers to stay away, it doesn't actually block anyone. So go ahead and add /robots.txt to your target, and see what you find!
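If the robots.txt is long, you don't have to eyeball it; a small parser can pull out every Disallow entry for you. A minimal sketch; the sample text is a shortened version of the listing above.

```python
def disallowed_paths(robots_txt):
    """Return every path named in a Disallow: line of a robots.txt body."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.strip()
        # Case-insensitive match, since "Disallow" capitalization varies
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:
                paths.append(path)
    return paths

sample = """User-Agent: *
Disallow: /moderation.php
Disallow: /usercp.php
User-agent: dotbot
Disallow: /
"""
print(disallowed_paths(sample))  # ['/moderation.php', '/usercp.php', '/']
```

Each path it prints is something the site didn't want indexed, which is exactly the list worth checking by hand.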
Web crawlers are, and always will be, a hacker's best friend. A web crawler will crawl a website and list its directories and files. I DEFINITELY recommend Acunetix. It's one of the best web crawlers out there, don't even bother trying to say different.
Even Robots.txt won't stop Acunetix's web-crawler (Which is very important if we actually want to get at useful files).
If the webmaster is smart, they will sometimes use subdomains to hide certain admin pages, or even files. You can tell what they may or may not have running by scanning the ports with Nmap. Nmap will list the website's open ports. If it has an SMTP port open, that may mean you can reach an email login, which may or may not contain valuable information.
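If you don't have Nmap handy, you can do a crude single-port check straight from Python's standard library. This is a minimal sketch, not a replacement for a real port scanner:

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a plain TCP connect to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Refused, timed out, unreachable, etc. all count as "not open to us"
        return False

# e.g. port_open("site.com", 25) would tell you if SMTP is listening
```

It only answers "did a TCP connect succeed", so it misses everything Nmap adds (service detection, filtered vs. closed, UDP), but it's enough for a quick yes/no on one port.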
Again, you can use Acunetix to check for subdomains. I don't have any cracks for Acunetix (that I've posted), but I have seen some here on HF. I might post a crack at some point, and an easy one, so you don't have to replace files and shit.
FTP
What is FTP? FTP stands for "File Transfer Protocol". If you have access to FTP, you can manage the site's files directly. You will not have permissions unless you supply a username and password, but sometimes the FTP will be WIDE open for you to see, and sometimes they'll list the admin page in there. To reach the FTP, you can try something like:

site.com:21
Why did I just add ":21" to the end of the site? Because 21 is the standard port for FTP. With ":21", you connect to whatever port you put after the ":". This is a very useful method, and I definitely recommend it.
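If you want to build that kind of address programmatically, here's a tiny helper, just a sketch using Python's standard library, that rewrites a URL to carry an explicit port:

```python
from urllib.parse import urlsplit, urlunsplit

def with_port(url, port):
    """Rewrite a URL so the host part carries an explicit port number."""
    parts = urlsplit(url)
    netloc = f"{parts.hostname}:{port}"  # drop any port already present
    return urlunsplit((parts.scheme, netloc, parts.path, parts.query, parts.fragment))

print(with_port("http://site.com", 21))  # http://site.com:21
```

The same helper works for any port you want to poke at, not just 21.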
Scanners are programs that connect to the target and test certain pages of the website.
If you're looking for another program that scans, and you want a very simple one, check out "Havij". I don't endorse its SQL injection features, because they're pretty "nooby", but using it to find admin pages is completely acceptable.
Google dorks are keywords you can use to search for exact things. Like this:
Code:
inurl:admin.php

That will look for any site that has admin.php in the URL. I usually use these dorks if I'm looking for an admin page.
site:site.com inurl:admin
site:site.com intext:login
site:site.com intext:admin
site:site.com intitle:login
site:site.com intitle:admin
Those should help! Well, that's pretty much it. Thanks for reading the tutorial, and I hope it helps you out! These methods are extremely useful; I find admin pages A LOT with them, so don't doubt them until you try them. Thanks!