
Information Gathering For Web Hacking




Information Gathering
When you set out to attack a website, you must first know exactly what you are dealing with; if you know the enemy you are about to face, you can prepare yourself accordingly.
That is why Information Gathering is the first phase of penetration testing. But this raises the question: what information are we going to collect, and where are we going to get it from? "Where and how", well, I will walk you through both of these, step by step.
In this tutorial we will look at what we are aiming to achieve through Information Gathering and how that information is going to help us later in our penetration test.

Whois Information:
This is the most basic information about a domain: the registration details of the website, which commonly show who registered the domain, on which date it was registered, when it will expire, and so on. This information can sometimes help you in social engineering, for example sending an email to the registrant's registered address, or using their name, postal address, or contact number in various social engineering tasks.
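Under the hood a WHOIS lookup is very simple: the protocol (RFC 3912) is just the domain name sent over TCP port 43, with the server replying in plain text. Here is a minimal sketch in Python, assuming the public whois.iana.org server as a starting point (the IANA server will usually refer you on to the registry's own WHOIS server):

```python
import socket

def build_query(domain: str) -> bytes:
    # The WHOIS protocol expects the query terminated by CRLF
    return domain.strip().encode("idna") + b"\r\n"

def whois_query(domain: str, server: str = "whois.iana.org") -> str:
    """Send a raw WHOIS query over TCP port 43 and read the full reply."""
    with socket.create_connection((server, 43), timeout=10) as sock:
        sock.sendall(build_query(domain))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

if __name__ == "__main__":
    print(whois_query("example.com"))
```

In practice you would follow the "refer:" line in the reply to query the authoritative registry server, or just use the `whois` command-line tool.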

Login Pages:
While pentesting, when you find a login page or admin login page that requires a username and password, that is nothing to get sad about. Finding a login page is like finding the locked door of a secure house: to break inside, you can use a master key, or you can even break the lock. In the same manner, login pages can be tested against many known attacks.
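One of the simplest of those known attacks is trying default credentials. The sketch below is purely illustrative: the URL is hypothetical, and the form field names `username`/`password` are assumptions; in a real test you would read the field names out of the actual login form.

```python
import urllib.parse
import urllib.request

# A few default credential pairs commonly tried first (illustrative only)
DEFAULT_CREDS = [("admin", "admin"), ("admin", "password"), ("root", "toor")]

def build_login_request(url: str, username: str, password: str) -> urllib.request.Request:
    """Build a POST request for a typical username/password form.
    Field names 'username'/'password' are assumptions; read them from the real form."""
    body = urllib.parse.urlencode({"username": username, "password": password}).encode()
    return urllib.request.Request(url, data=body, method="POST")

def try_default_creds(url: str):
    for user, pwd in DEFAULT_CREDS:
        req = build_login_request(url, user, pwd)
        with urllib.request.urlopen(req, timeout=10) as resp:
            # A redirect or a changed page body usually signals success;
            # exact detection depends on the application
            print(user, pwd, resp.status)

if __name__ == "__main__":
    try_default_creds("http://testsite.example/admin/login")  # hypothetical URL
```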

Web Application:
Many times what you are targeting is a public web application like Joomla, WordPress, or another well-known platform. We need to gather all the information we can about the web application, so we can look up known vulnerabilities for that particular version, or else find a vulnerability ourselves in the source code, which is available online.

Sub Domain:
If you do not know what subdomains are: a subdomain is a domain maintained under another domain. For example, google.com is a domain name, and mail.google.com is a subdomain of it. We need to collect all available subdomains of the target website. In many cases you may find a hidden or private subdomain hosting something internal, and such applications are often left vulnerable and exposed because of the assumption that no one can reach them.
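The most basic way to collect subdomains is brute-force enumeration: prepend candidate words to the domain and keep the ones that resolve in DNS. A minimal sketch, with a tiny illustrative wordlist (real enumeration uses lists of thousands of entries, plus sources like certificate transparency logs):

```python
import socket

# A tiny illustrative wordlist; real enumeration uses thousands of entries
WORDLIST = ["www", "mail", "dev", "staging", "admin", "vpn"]

def candidates(domain: str, words=WORDLIST):
    """Build candidate subdomains like mail.example.com."""
    return [f"{w}.{domain}" for w in words]

def enumerate_subdomains(domain: str):
    """Resolve each candidate; the ones that resolve exist in DNS."""
    found = []
    for host in candidates(domain):
        try:
            ip = socket.gethostbyname(host)
            found.append((host, ip))
        except socket.gaierror:
            pass  # name does not resolve
    return found

if __name__ == "__main__":
    for host, ip in enumerate_subdomains("example.com"):
        print(host, ip)
```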

IP Address:
This one is for newbies: the IP address is the real address behind any domain name, resolved by the nameservers. Every box (that is, every system) has a unique IP address, for example 203.0.113.88 (each of the four octets ranges from 0 to 255), and computers use these addresses to communicate with each other. The IP address lets us target the network itself and find open ports and other exploitable services on the system while pentesting.
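Resolving a name to its IP is one line in Python, and it is worth knowing what a valid IPv4 address even looks like: four decimal octets, each 0 to 255. A small sketch:

```python
import socket

def is_valid_ipv4(addr: str) -> bool:
    """An IPv4 address is four decimal octets, each in the range 0-255."""
    parts = addr.split(".")
    if len(parts) != 4:
        return False
    return all(p.isdigit() and 0 <= int(p) <= 255 for p in parts)

if __name__ == "__main__":
    # Resolve a domain to the IP address the nameservers return for it
    print(socket.gethostbyname("example.com"))
```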

NameServers:
These are the DNS resolvers. For example, when you type google.com into your browser, the DNS resolver finds the real IP behind it, carries your request to the server, and brings back the response. We can later target the nameservers for DNS-based attack testing in our pentest.
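Python's standard library cannot query NS records directly, so one common approach is to shell out to the system's `nslookup` tool and parse its output. A sketch, assuming the common `nameserver = host` output format (it varies slightly across platforms):

```python
import subprocess

def parse_ns(output: str) -> list:
    """Pull hostnames out of nslookup's 'nameserver = host' lines."""
    servers = []
    for line in output.splitlines():
        if "nameserver =" in line:
            servers.append(line.split("nameserver =")[1].strip().rstrip("."))
    return servers

def ns_records(domain: str) -> list:
    """Query NS records via the system's nslookup tool."""
    out = subprocess.run(["nslookup", "-type=NS", domain],
                         capture_output=True, text=True, timeout=15).stdout
    return parse_ns(out)

if __name__ == "__main__":
    print(ns_records("example.com"))
```

For anything serious, a dedicated DNS library such as dnspython is the cleaner choice.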

Web Server:
Webservers the one we are dealing over here is an application which is running over an Operating system and serves to the web requests coming to the system. Like Apache, Tomcat, IIS etc are webservers running on an operating system when any web request is sent to a system they handle it and they are responsible for giving out the response. Many times you can get Exploits related to a webserver and get a way into the sytem using that exploit, and if you know which webserver is bieng used then it will help you to find out the default directories or known vulnerabilities for that web server.
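The quickest fingerprint is the `Server` response header, which often names the product and version, e.g. `Apache/2.4.41 (Ubuntu)`; keep in mind that administrators can strip or fake it. A sketch using only the standard library:

```python
import http.client

def server_header(host: str, use_https: bool = True) -> str:
    """Fetch the Server response header from a HEAD request to /."""
    conn_cls = http.client.HTTPSConnection if use_https else http.client.HTTPConnection
    conn = conn_cls(host, timeout=10)
    conn.request("HEAD", "/")
    resp = conn.getresponse()
    header = resp.getheader("Server", "")
    conn.close()
    return header

def split_server(value: str):
    """Split a value like 'Apache/2.4.41 (Ubuntu)' into (product, version)."""
    product, _, version = value.partition("/")
    return product, version.split()[0] if version else ""

if __name__ == "__main__":
    print(split_server(server_header("example.com")))
```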

Operating System:
Most of you know what an operating system is, but if anyone is still wondering why we need to identify the OS, let me clarify: once we know the operating system, we can work out the right attacks, open ports, exploits, common services, and so on, all of which will help us later in the pentest.
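A classic quick-and-dirty OS hint is the TTL value in a ping reply: different OS families start from different initial TTLs, and each router hop decrements it by one. This is only a heuristic (the value is configurable), but a sketch of the usual mapping:

```python
def guess_os_from_ttl(ttl: int) -> str:
    """Rough OS hint from a ping reply's TTL.
    Common initial TTLs: 64 (Linux/Unix), 128 (Windows), 255 (many network devices).
    The observed value is the initial TTL minus the hop count, so round up
    to the nearest common initial value."""
    for initial, label in ((64, "Linux/Unix"), (128, "Windows"),
                           (255, "network device/Unix")):
        if ttl <= initial:
            return label
    return "unknown"

if __name__ == "__main__":
    # e.g. a reply with TTL 57 most likely started at 64, seven hops away
    print(guess_os_from_ttl(57))
```

Proper OS fingerprinting tools like Nmap combine many such signals rather than relying on TTL alone.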

Other Domains on Same Server:
Many times you cannot find a vulnerability in the target website itself. In that case you can perform a reverse IP domain lookup to find other websites hosted on the same server, pwn one of those to get access to the server, and make your way towards the target.

Web Application Firewall:
We can also test whether the target is behind a firewall, so that we know what we are going to face and whether there are any ways to bypass it.
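Many WAFs and protective CDNs betray themselves in response headers. A sketch of signature-based detection over a headers dictionary; the signature list here is a small illustrative subset, not an exhaustive database like the one tools such as wafw00f maintain:

```python
# Header fragments that commonly betray a WAF/CDN in front of the site
# (an illustrative subset, not exhaustive)
WAF_SIGNATURES = {
    "cloudflare": "Cloudflare",
    "awselb": "AWS ELB",
    "sucuri": "Sucuri",
    "incapsula": "Imperva Incapsula",
}

def detect_waf(headers: dict) -> list:
    """Check response headers for known WAF/CDN fingerprints."""
    blob = " ".join(f"{k}: {v}" for k, v in headers.items()).lower()
    return [name for sig, name in WAF_SIGNATURES.items() if sig in blob]
```

Other signals worth checking include characteristic cookies, block-page bodies, and how the site responds to an obviously malicious probe request.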

Robots.txt
Robots.txt is a file websites use to disallow crawlers from crawling some of their sensitive data or admin panels. Since it can be viewed publicly, it is useful to us: the very paths the site wants hidden are listed there for us to find and use later on.
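Pulling those "hidden" paths out is straightforward: fetch `/robots.txt` and collect every `Disallow:` line. A minimal sketch:

```python
import urllib.request

def fetch_robots(domain: str) -> str:
    """Fetch the site's robots.txt over HTTPS."""
    with urllib.request.urlopen(f"https://{domain}/robots.txt", timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def disallowed_paths(robots_txt: str) -> list:
    """Pull every Disallow: path out of a robots.txt body."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:
                paths.append(path)
    return paths

if __name__ == "__main__":
    print(disallowed_paths(fetch_robots("example.com")))
```

Every path that turns up here is worth visiting manually, since it is exactly what the site did not want indexed.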


