Cyber Detective💙💛

@cyb_detective

26 Tweets 12 reads Jan 25, 2022
From this thread you will learn about 12 key #OSINT services for gathering information about a website.
I'll show them using the example of the most famous Russian search engine, yandex.ru, and its subdomains.
Step 1
Collect basic information about the domain:
IP address lookup, WHOIS records, DNS records, ping, traceroute, nslookup.
centralops.net
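If you want to script this step instead of (or alongside) the web service, a minimal Python sketch could look like this. It assumes the whois and nslookup command-line tools are installed on your machine.

```python
# Minimal sketch: resolve a domain's IP addresses and pull WHOIS/NS records
# from the command line. Assumes the `whois` and `nslookup` utilities are
# installed; centralops.net does all of this in one web page.
import socket
import subprocess

domain = "yandex.ru"

# IP address lookup via the system resolver
hostname, aliases, addresses = socket.gethostbyname_ex(domain)
print(f"{domain} resolves to: {', '.join(addresses)}")

# WHOIS and NS records via external tools (their availability is an assumption)
for cmd in (["whois", domain], ["nslookup", "-type=NS", domain]):
    result = subprocess.run(cmd, capture_output=True, text=True, timeout=30)
    print(f"\n$ {' '.join(cmd)}\n{result.stdout[:500]}")  # first 500 chars
```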
Step 2
Find out what technologies were used to build the site: frameworks, #javascript libraries, analytics and tracking tools, widgets, payment systems, content delivery networks, etc.
builtwith.com
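A very rough way to approximate what builtwith.com does is to fetch the page and look for hints in the HTTP headers and HTML source. The signature list below is only a small illustrative assumption.

```python
# Rough sketch of technology fingerprinting: fetch the page and look for
# hints in HTTP headers and HTML source. builtwith.com does this far more
# thoroughly; the signature list here is a tiny illustrative assumption.
import requests

url = "https://yandex.ru"
resp = requests.get(url, timeout=15)

print("Server header:", resp.headers.get("Server"))
print("X-Powered-By:", resp.headers.get("X-Powered-By"))

signatures = {
    "jQuery": "jquery",
    "React": "react",
    "Google Analytics": "google-analytics.com",
    "Yandex.Metrika": "mc.yandex.ru/metrika",
}
html = resp.text.lower()
for name, marker in signatures.items():
    if marker in html:
        print(f"Possible use of {name} (marker: {marker!r})")
```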
Step 3
Get a list of sites belonging to the same owner (sharing the same Yandex.Metrika and Google Analytics counter IDs, as well as other common identifiers).
builtwith.com
Find sites with the same Facebook App ID
analyzeid.com
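To see which counter IDs a site exposes before searching for them in builtwith.com or analyzeid.com, you can pull them out of the HTML yourself. The regexes below are simplified assumptions, not complete patterns.

```python
# Sketch: extract analytics counter IDs from a page's HTML. The IDs can then
# be searched in builtwith.com / analyzeid.com to find other sites using the
# same counters. The regexes are simplified assumptions.
import re
import requests

html = requests.get("https://yandex.ru", timeout=15).text

patterns = {
    "Google Analytics (UA)": r"UA-\d{4,10}-\d{1,4}",
    "Google Analytics 4": r"G-[A-Z0-9]{8,12}",
    "Google Tag Manager": r"GTM-[A-Z0-9]{5,8}",
    "Yandex.Metrika": r"mc\.yandex\.ru/watch/(\d+)",
}

for name, pattern in patterns.items():
    matches = set(re.findall(pattern, html))
    if matches:
        print(f"{name}: {', '.join(sorted(matches))}")
```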
Step 4
Map subdomains.
#domainmap" target="_blank" rel="noopener" onclick="event.stopPropagation()">dnsdumpster.com
Step 5
Look for email addresses associated with the domain or its subdomains
hunter.io
or
snov.io
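hunter.io also exposes an API (a free key is required). The sketch below follows its documented domain-search endpoint; verify the response fields against the current docs, and note that HUNTER_API_KEY is just a hypothetical environment variable name.

```python
# Sketch: query hunter.io's domain-search API for addresses tied to a domain.
# Requires an API key; the response fields used below follow the public docs
# and should be checked against the current API version.
import os
import requests

api_key = os.environ["HUNTER_API_KEY"]  # hypothetical env var holding your key
resp = requests.get(
    "https://api.hunter.io/v2/domain-search",
    params={"domain": "yandex.ru", "api_key": api_key},
    timeout=30,
)
data = resp.json().get("data", {})
for item in data.get("emails", []):
    print(item.get("value"), "-", item.get("type"))
```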
Step 6
Collect data on search engine rankings and approximate traffic.
alexa.com
similarweb.com
Step 7
Download documents (PDF, docx, xlsx, pptx) from the site and analyze their metadata. This way you can find the names of the organization's employees, their usernames in the system, and email addresses.
github.com
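Once the documents are downloaded, the metadata can be extracted locally. A sketch using the third-party pypdf package; the downloads/ folder is just an example path, and for docx/xlsx/pptx the equivalent metadata lives in docProps/core.xml inside the zip container.

```python
# Sketch: extract author/creator metadata from already-downloaded PDFs.
# Uses the third-party pypdf package (an assumption about your toolchain).
from pathlib import Path

from pypdf import PdfReader

for pdf_path in Path("downloads").glob("*.pdf"):
    meta = PdfReader(pdf_path).metadata
    print(pdf_path.name)
    print("  Author:  ", meta.author if meta else None)
    print("  Creator: ", meta.creator if meta else None)
    print("  Producer:", meta.producer if meta else None)
```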
Step 8
Use Google Dorks to look for database dumps, office documents, log files, and potentially vulnerable pages.
dorks.faisalahmed.me
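A few classic dork patterns for a target domain, expressed here as a small Python list (examples only, nowhere near exhaustive):

```python
# A handful of classic Google dorks for a target domain.
domain = "yandex.ru"
dorks = [
    f"site:{domain} filetype:pdf",
    f"site:{domain} filetype:xlsx OR filetype:docx",
    f"site:{domain} ext:log OR ext:sql OR ext:bak",
    f'site:{domain} intitle:"index of"',
    f"site:{domain} inurl:admin OR inurl:login",
]
for dork in dorks:
    print(dork)
```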
Step 9
Calculate a website fingerprint to search for it in Shodan, Censys, BinaryEdge, Onyphe, and other "hacker" search engines.
mmhdan.herokuapp.com
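One widely used fingerprint is the favicon hash: Shodan's http.favicon.hash filter uses the MurmurHash3 of the base64-encoded favicon. A sketch using the third-party mmh3 and requests packages:

```python
# Sketch of the favicon-hash fingerprint used by Shodan-style engines:
# MurmurHash3 over the newline-wrapped base64 of the favicon bytes.
import base64

import mmh3
import requests

favicon = requests.get("https://yandex.ru/favicon.ico", timeout=15).content
b64 = base64.encodebytes(favicon)  # newline-wrapped base64, as Shodan expects
favicon_hash = mmh3.hash(b64)
print(f"Search Shodan for: http.favicon.hash:{favicon_hash}")
```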
Step 10
Look for old versions of the site in web archives and search engine caches (sometimes this way you can find owners' addresses and contact details that have since been removed from the site).
cipher387.github.io
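The Wayback Machine can also be queried directly. A sketch using archive.org's availability API:

```python
# Sketch: ask the Wayback Machine for the archived copy of a page closest to
# a given timestamp, via archive.org's documented availability API.
import requests

resp = requests.get(
    "https://archive.org/wayback/available",
    params={"url": "yandex.ru", "timestamp": "2005"},
    timeout=30,
)
closest = resp.json().get("archived_snapshots", {}).get("closest")
if closest:
    print("Closest snapshot:", closest["timestamp"], closest["url"])
else:
    print("No archived snapshot found")
```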
Step 11
Partially automate the process of finding important data in the archives. Download archived copies of pages from web.archive.org with Waybackpack
github.com
Search them for phone numbers, emails, and nicknames using Grep for OSINT
github.com
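A rough Python equivalent of the grep step, run over the files that waybackpack downloaded (e.g. after `waybackpack https://yandex.ru -d archive/`); the regexes are deliberately simple assumptions.

```python
# Sketch: grep archived page copies for emails and phone numbers.
import re
from pathlib import Path

email_re = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
phone_re = re.compile(r"\+?\d[\d\-\s()]{8,15}\d")

for path in Path("archive").rglob("*"):
    if not path.is_file():
        continue
    text = path.read_text(errors="ignore")
    for label, regex in (("email", email_re), ("phone", phone_re)):
        for match in set(regex.findall(text)):
            print(f"{path}: {label}: {match}")
```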
Step 12
Find out the approximate geographical location of the site
iplocation.net
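A scripted variant of this step, using ip-api.com as one example of a free geolocation API (its free tier is HTTP-only and rate-limited):

```python
# Sketch: resolve the domain and query a free geolocation API for the
# server's approximate location.
import socket

import requests

ip = socket.gethostbyname("yandex.ru")
geo = requests.get(f"http://ip-api.com/json/{ip}", timeout=15).json()
print(ip, "-", geo.get("country"), geo.get("city"), geo.get("isp"))
```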
(There is a separate 12-step thread about gathering information about a place)
This short thread is over.
But there are many more tools for gathering information about domains. My OSINT collection already contains more than 60 of them:
#domainiplinks" target="_blank" rel="noopener" onclick="event.stopPropagation()">cipher387.github.io
Follow @cyb_detective to learn about new tools every day.
vstat.info
Get detailed info about website traffic (sources, keywords, linked sites, etc.)
RobotTester
A simple #Python script that enumerates all URLs present in robots.txt files and tests whether they can be accessed.
github.com
Creator @podalirius_
#opensource
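A minimal sketch of the same idea (this is an illustration, not the linked script): pull robots.txt, collect the Allow/Disallow paths, and check which ones actually respond.

```python
# Sketch: enumerate robots.txt paths and test whether they are reachable.
from urllib.parse import urljoin

import requests

base = "https://yandex.ru"
robots = requests.get(urljoin(base, "/robots.txt"), timeout=15).text

paths = set()
for line in robots.splitlines():
    line = line.split("#", 1)[0].strip()
    if line.lower().startswith(("allow:", "disallow:")):
        path = line.split(":", 1)[1].strip()
        if path and "*" not in path and "$" not in path:
            paths.add(path)

for path in sorted(paths):
    status = requests.get(urljoin(base, path), timeout=15).status_code
    print(status, path)
```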