Tutorial: HTTP File Browsing Tool?


Is there any client tool that lets me browse files hosted on a web site?

For ftp://example.com/folder/files, you can browse these files using FileZilla. But what about http://example.com/folder/files, hosted under a web server such as IIS or Apache? I am looking for a tool to browse through that kind of hosted files.


WinSCP supports FTP, SFTP and SCP, and of course PuTTY supports SSH and Telnet.

Both applications are mature and free; I've been using them for years, as does everyone else at my office.

I've also found WinSCP to be more reliable than FileZilla.


If what you are really after is a GUI window for an SFTP session, look no further than Bitvise's Tunnelier, provided your host is running a Linux distribution. With Tunnelier you can open a terminal session and an SFTP window that supports file transfer and file browsing on the remote server, and it also offers profile management to make it easier to work with multiple remote servers.


Let me see if I understand correctly:

You have a file URL (e.g. http://example.com/somefolder/pretty_image.jpg) and you want to see other images from the same folder, but when you go to the parent directory (/somefolder/) you either get a message that file listings are not allowed (or similar), or you are shown some web page rather than a listing of the files you are interested in?

If this is what you want, you are probably out of luck: a web server will only serve the files you explicitly request unless it has been specifically configured to generate a directory listing (via `Options +Indexes` in Apache httpd, for example). There is no way around this while connecting "from the web". As suggested previously, you need to access the file system by logging in with, for example, FTP, SSH or similar.
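For completeness, if you control the server yourself, Apache httpd can be told to generate such listings. A minimal sketch (the directory path is hypothetical; this assumes mod_autoindex is loaded, which it is in most default builds):

```apache
# Hypothetical directory block: let mod_autoindex generate a file
# listing for /somefolder when no index.html is present.
<Directory "/var/www/html/somefolder">
    Options +Indexes
    # Optional: render the listing as a sortable table
    IndexOptions FancyIndexing
</Directory>
```

But as a visitor you have no influence over this; the decision is entirely on the server side.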


No, there isn't really. A web server typically does not tell you about all the files it knows about. When you request some URL, you are often not even browsing actual files, but are just making the server invoke some scripts that in the end give you some HTML back. In that HTML you would find references to images etcetera, which are often real files on the web server (but sometimes are just scripts as well), but in general there's no way to make the web server give you a list of all other images.

There are ways to get a lot of information from a web server, such as tools that download many pages and their media from a website. But that is still all based on the links the web server itself exposes: such programs simply follow all the links and gather as much information as they can from them. That is basically how search engines work too.
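The link-following approach described above can be sketched in a few lines. This is a toy illustration, not a real crawler: the HTML string stands in for a page that would normally be downloaded with something like urllib.request, and it only shows that you can discover the links a page chooses to expose, never the files it does not mention.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href attribute of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Stand-in for a fetched page; a real crawler would download this.
sample_page = """
<html><body>
  <a href="pretty_image.jpg">pretty_image.jpg</a>
  <a href="other_image.jpg">other_image.jpg</a>
</body></html>
"""

collector = LinkCollector()
collector.feed(sample_page)
print(collector.links)  # only the links the page chose to expose
```

Any file in the same folder that no page links to stays invisible to this technique, which is exactly the limitation described above.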

(Sometimes a server will show you all files in a folder if you request the folder URL itself. But that is often prohibited.)
