Saturday, January 4, 2020

Web Application Hacker’s Toolkit


Here we cover some basics of the web application hacker's toolkit.

Some attacks on web applications can be performed using only a standard web browser; the majority, however, require additional tools.

The Intercepting Proxy
The most important item in your toolkit falls into this latter category: it operates as an intercepting web proxy, enabling you to view and modify all of the HTTP messages passing between your browser and the target application.
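To make the idea concrete, here is a minimal sketch of request and response interception using the open-source mitmproxy tool's addon API (mitmproxy is not one of the suites examined below, but its scripting hooks illustrate what an intercepting proxy does):

# intercept.py - a minimal mitmproxy addon that views and modifies the
# HTTP messages passing between the browser and the target application.
# Run with: mitmproxy -s intercept.py  (listens on localhost:8080 by default)
from mitmproxy import http

def request(flow: http.HTTPFlow) -> None:
    # Log every request the browser sends through the proxy.
    print(f"[>] {flow.request.method} {flow.request.pretty_url}")
    # Example modification: tag each outgoing request with a custom header.
    flow.request.headers["X-Intercepted"] = "1"

def response(flow: http.HTTPFlow) -> None:
    # Log each response's status code before it reaches the browser.
    print(f"[<] {flow.response.status_code} {flow.request.pretty_url}")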
The Web Application Scanner
The second main category of tool is the web application scanner. This is a product designed to automate many of the tasks involved in attacking a web application, from initial mapping through to probing for vulnerabilities.
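As a simple illustration of the kind of check such a scanner automates, the sketch below (with a hypothetical target URL and parameters) injects a marker string into each query parameter and reports any parameter that reflects it back unencoded, a crude first probe for reflected XSS:

# Minimal probe: inject a marker into each parameter and check for
# unencoded reflection in the response (a crude reflected-XSS check).
import requests

MARKER = '"><xss-probe-1337>'

def probe_reflection(url, params):
    for name in params:
        test = dict(params, **{name: MARKER})
        body = requests.get(url, params=test, timeout=10).text
        if MARKER in body:
            print(f"[!] parameter {name!r} reflects input unencoded")

# Hypothetical target; probe only applications you are authorized to test.
probe_reflection("http://target.example/search", {"q": "hello", "page": "1"})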

Web Browsers

A web browser is not exactly a hacking tool; it is the standard means by which web applications are designed to be accessed. Nevertheless, different browsers display and handle applications differently, which matters when choosing one for testing.

Internet Explorer 
Microsoft’s Internet Explorer (IE) is currently the most widely used web browser, comprising approximately 60% of the market at the time of writing. Virtually all web applications are designed for and tested on IE, making it a good choice for an attacker, because most applications’ content and functionality will be correctly displayed and usable within IE.

Firefox 
Firefox is currently the second most widely used web browser, comprising approximately 35% of the market at the time of writing. The majority of web applications work correctly on Firefox; however, there is no native support for ActiveX controls.

Opera
Opera is a relatively little-used browser, having less than 2% of the market share at the time of this writing. Relatively few applications are specifically tested on Opera. 

Integrated Testing Suites
After the essential web browser, the most useful item in your toolkit when attacking a web application is an intercepting proxy.
There are three leading suites in widespread use, which we will examine in this section:

>> Burp Suite

>> Paros

>> WebScarab


Configuring Your Browser
If you have never set up your browser to use a proxy server, it is trivial to do in any browser. First, start your proxy tool and note the local port on which it is listening (Burp Suite, for example, defaults to port 8080). Then perform the steps required for your browser:

Internet Explorer
In Internet Explorer, go to Tools ➪ Internet Options ➪ Connections ➪ LAN Settings, and then:

>> Ensure that the Automatically Detect Settings and Use Automatic Configuration Script boxes are not checked.

>> Ensure that the Use a Proxy Server for Your LAN box is checked. In the Address field, enter localhost, and in the Port field, enter the port used by your proxy.

>> Click the Advanced button, and ensure that the Use the Same Proxy Server for All Protocols box is checked.

>> If the hostname of the application you are attacking is matched by any of the expressions in the Do Not Use Proxy Server for Addresses Beginning With box, remove those expressions.

>> Click OK on all the dialogs to confirm the new configuration.
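The same proxy can also intercept scripted traffic. As a minimal sketch, assuming the proxy is listening on localhost port 8080, Python's requests library can be pointed at it like this:

# Route scripted HTTP traffic through a local intercepting proxy.
# Assumes the proxy listens on 127.0.0.1:8080 - adjust to your setup.
import requests

PROXIES = {
    "http": "http://127.0.0.1:8080",
    "https": "http://127.0.0.1:8080",
}

# verify=False because intercepting proxies re-sign TLS traffic with their
# own CA; in a real setup, install and trust the proxy's CA certificate.
resp = requests.get("http://example.com/", proxies=PROXIES, verify=False)
print(resp.status_code, len(resp.content))

Requests issued this way should appear in the proxy's history alongside your normal browser traffic.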

Web Application Spiders
Web application spiders work in a similar way to traditional web spiders — by requesting web pages, parsing these for links to other pages, and then requesting those pages, continuing recursively until all of a site’s content has been discovered.
To accommodate the differences between functional web applications and traditional web sites, application spiders must go beyond this core function and address various other challenges, such as the following (a minimal sketch of the core spidering loop follows the list):

>> Forms-based navigation, using drop-down lists, text input, and other methods.

>> JavaScript-based navigation, such as dynamically generated menus.

>> Multistage functions requiring actions to be performed in a defined sequence.

>> Authentication and sessions.

>> The use of parameter-based identifiers, rather than the URL, to specify different content and functionality.

>> The appearance of tokens and other volatile parameters within the URL query string, leading to problems identifying unique content.

>> Checking for the robots.txt file, which is intended to provide a blacklist of URLs that should not be spidered, but which an attacking spider can use to discover additional content.

>> Automatic retrieval of the root of all enumerated directories. This can be useful to check for directory listings or default content (see Chapter 17).

>> Automatic processing and use of cookies issued by the application, to enable spidering to be performed in the context of an authenticated session.

>> Automatic testing of session-dependence of individual pages. This involves requesting each page both with and without any cookies that have been received. If the same content is retrieved, then the page does not require a session or authentication. This can be useful when probing for some kinds of access control flaw (see Chapter 8).

>>  Automatic use of the correct Referer header when issuing requests. Some applications may check the contents of this header, and this function ensures that the spider behaves as far as possible like an ordinary browser. 

>> Control of other HTTP headers used in automated spidering.

>> Control over the speed and order of automated spider requests, to avoid overwhelming the target, and if necessary behave in a stealthy manner.
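The following sketch shows the core spidering loop referred to above, plus a naive version of the session-dependence check from the list. The start URL is a placeholder, cookie handling comes free with requests.Session, and everything else (forms, JavaScript, volatile tokens) is deliberately omitted:

# Minimal web application spider: request pages, parse them for links,
# and recurse until all same-site content reachable via <a href> links
# has been discovered. Cookies are handled automatically by the Session.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
import requests

class LinkParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def spider(start_url, max_pages=100):
    session = requests.Session()          # retains cookies across requests
    host = urlparse(start_url).netloc
    seen, queue = set(), [start_url]
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            resp = session.get(url, timeout=10)
        except requests.RequestException:
            continue
        if "html" not in resp.headers.get("Content-Type", ""):
            continue
        parser = LinkParser()
        parser.feed(resp.text)
        for link in parser.links:
            absolute = urljoin(url, link).split("#")[0]
            if urlparse(absolute).netloc == host:
                queue.append(absolute)
    return seen

def session_dependent(url, session):
    # Naive session-dependence test: fetch the page with and without the
    # session's cookies; identical content suggests no authentication is
    # required. Real tools compare more loosely, since timestamps and
    # tokens vary between responses.
    with_cookies = session.get(url, timeout=10).text
    without_cookies = requests.get(url, timeout=10).text
    return with_cookies != without_cookies

# Hypothetical target; spider only applications you are authorized to test.
for page in sorted(spider("http://target.example/")):
    print(page)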
[Table: feature support in Burp Suite, Paros, and WebScarab]


Application Fuzzers and Scanners
While it is possible to perform a successful attack using only manual techniques, to become a truly accomplished web application hacker you need to use automation, enhancing the speed and effectiveness of your attacks.

The following features are implemented in the different tool suites (a minimal fuzzing sketch follows the list):

>> Automated scans to detect common vulnerabilities.

>> Manually configured scanning for common vulnerabilities.

>> A set of built-in attack payloads and versatile functions to generate arbitrary payloads in user-defined ways — for example, based on malformed encoding, character substitution, brute force, data retrieved in a previous attack, and so on.

>> Ability to save scan response data to use in reports or incorporate into further attacks.

>> Customizable functions for viewing and analyzing responses — for example, based on the appearance of specific expressions or the attack payload itself.

>> Functions for extracting useful data from the application’s responses — for example, by parsing out the username and password fields in a My Details page.

>> Functions for analyzing cookies and other session tokens for predictable sequences.
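To make the flavor of these features concrete, here is a minimal fuzzing sketch; the target URL, parameter name, payload list, and error signatures are all illustrative assumptions, and real tool suites implement far richer payload generation and response analysis:

# Minimal parameter fuzzer: send attack payloads (plus simple generated
# variants) in a query parameter and flag responses that look anomalous.
import requests

BASE_PAYLOADS = ["'", '"', "<script>alert(1)</script>", "../../etc/passwd"]

def generate_payloads():
    # Built-in payloads plus simple character-substitution variants.
    for p in BASE_PAYLOADS:
        yield p
        yield p.replace("<", "%3C").replace(">", "%3E")   # encoded variant
        yield p.upper()                                    # case variant

# Illustrative error signatures to search for in responses.
SIGNATURES = ["SQL syntax", "ODBC", "Traceback", "root:x:"]

def fuzz(url, param):
    for payload in generate_payloads():
        resp = requests.get(url, params={param: payload}, timeout=10)
        hits = [s for s in SIGNATURES if s in resp.text]
        if hits or resp.status_code >= 500:
            print(f"[!] {payload!r} -> status {resp.status_code}, matched {hits}")

# Hypothetical target; fuzz only applications you are authorized to test.
fuzz("http://target.example/item", "id")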


