Sometimes you may be curious about the subdomains of a particular domain, or you may need to see all the URLs hyperlinked on a website. If that's the case, JSFinder is a tool that can help you.
A developer with the username Threezh1 has created a repository on GitHub containing a Python script that quickly extracts subdomains and URLs from a website. All you have to do is clone the repository, or save the script's contents to a file on your machine, and run a simple command that includes your domain of choice:
python JSFinder.py -u domain
You can also save the URLs and subdomains to text files using the following command:
python JSFinder.py -u domain -d -ou url_list.txt -os subdomain_list.txt
This command will save URLs in a new text file named url_list.txt and subdomains in a new text file named subdomain_list.txt, both located in the directory where you run the script (the -d flag tells JSFinder to perform a deeper crawl).
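Under the hood, JSFinder works by scanning the JavaScript referenced by a page and pulling out strings that look like URLs or paths. The sketch below illustrates that core idea in Python; the regular expression here is a deliberately simplified stand-in, not JSFinder's actual pattern, and the sample input is made up for demonstration.

```python
import re

# Simplified illustration of JSFinder's approach: find quoted strings
# in JavaScript source that look like absolute URLs or root-relative paths.
# (JSFinder's real regex is considerably more elaborate.)
URL_PATTERN = re.compile(r'["\']((?:https?://|/)[A-Za-z0-9_./-]+)["\']')

def extract_links(js_source):
    """Return the unique, sorted URL-like strings found in a JS snippet."""
    return sorted(set(URL_PATTERN.findall(js_source)))

sample = 'fetch("/api/v1/users"); var cdn = "https://cdn.example.com/app.js";'
print(extract_links(sample))
# ['/api/v1/users', 'https://cdn.example.com/app.js']
```

JSFinder automates this across every script file a page loads, then separates the matches into full URLs and bare subdomains before writing them out.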
You can see a sample of the output in the image associated with this article.
You can also find the tool on GitHub, where a more detailed description is available in Chinese.