
Websites and Directories Questions

PostPosted: Fri Nov 18, 2011 5:31 am
by Veos
Hey guys,

I've just been wondering if there is a way to find all the webpages a website has to offer. Or is there a way to view the website's folder structure? Not to do anything bad, I just want to know if it's possible. If yes, what do you suggest I read or look up to work this out?

Thanks.

Re: Websites and Directories Questions

PostPosted: Fri Nov 18, 2011 1:39 pm
by mShred
There are a few different ways. If you find a vulnerability, you could probably get a directory listing. For just random sites, you could use Google hacks: search site:example.org and see what that turns up. You could also look at the site's robots.txt file to see which directories they don't want crawlers seeing.
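For the robots.txt part, here's a minimal Python 3 sketch that fetches the file and prints the Allow/Disallow entries (example.org is just a placeholder target):

    # Fetch a site's robots.txt and list the paths it mentions.
    # Assumes Python 3; example.org is a placeholder target.
    import urllib.request

    url = "http://example.org/robots.txt"
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode("utf-8", errors="replace")

    for line in body.splitlines():
        line = line.strip()
        # Allow/Disallow lines name paths the owner flagged for crawlers
        if line.lower().startswith(("disallow:", "allow:")):
            print(line)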

Re: Websites and Directories Questions

PostPosted: Fri Nov 18, 2011 6:03 pm
by smithmetal
You could also try to smartly brute-force the filenames and/or directories of the web server. It's more of a script-kiddie approach, but these tools are often highly customizable, and if you know what you're doing they can save you a great deal of time.

Here are some examples of tools made for this (with a rough sketch of the underlying idea after the list):
  • https://github.com/initnull/tachyon
  • http://sourceforge.net/projects/dirbuster/
  • http://cirt.net/nikto2
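To give a sense of what those tools automate, here's a hedged Python 3 sketch of the basic brute-force loop. The base URL and the tiny wordlist are placeholders; the real tools add big wordlists, threading, recursion and throttling:

    # Request each candidate path and note which ones don't 404.
    # Placeholder target and wordlist; a sketch of the idea only.
    import urllib.request
    import urllib.error

    base = "http://example.org"
    wordlist = ["admin", "backup", "images", "uploads", "login.php"]

    for name in wordlist:
        url = "{}/{}".format(base, name)
        try:
            with urllib.request.urlopen(url) as resp:
                print(resp.getcode(), url)   # found something
        except urllib.error.HTTPError as e:
            if e.code != 404:
                print(e.code, url)           # a 403 etc. can still be telling
        except urllib.error.URLError:
            pass                             # unreachable, skip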

Re: Websites and Directories Questions

PostPosted: Wed Nov 23, 2011 2:58 pm
by tremor77
You could also apply this to images and other files, not just served pages. Also, if you can determine that a website is built with a specific editor or CMS, you get inherent knowledge of its directory structure, which helps you locate more things.
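For example, here's a rough Python 3 sketch of that CMS idea: probe a few paths that particular CMSes are known to serve. The path-to-CMS table below is a tiny illustrative sample, not a real signature database:

    # Probe well-known CMS paths; a 200 hints at which CMS is installed.
    # Note urlopen follows redirects, so results are only a rough signal.
    import urllib.request
    import urllib.error

    signatures = {
        "/wp-login.php":   "WordPress",
        "/administrator/": "Joomla",
        "/user/login":     "Drupal",
    }

    base = "http://example.org"
    for path, cms in signatures.items():
        try:
            with urllib.request.urlopen(base + path) as resp:
                if resp.getcode() == 200:
                    print("Possible {} install: {}".format(cms, base + path))
        except (urllib.error.HTTPError, urllib.error.URLError):
            pass  # path absent or unreachable; no signal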