Wget is a very small download manager; so small, in fact, that it runs only from the command line. It takes some getting used to. You cannot (I repeat, cannot) just run the EXE, because it needs a URL as an argument. So what I would do is place the executable in the root directory (like C:\) or at least close to it (like C:\Downloads). Then open the Command Prompt (go to "Run", type "cmd" without the quotes, and hit "OK"), type "cd C:\" or "cd C:\Downloads" depending on where you put wget.exe, and finally type "wget URL", where URL is the address of the file you want, for example "http://www.website.com/monkey.exe". Remember that in the Command Prompt you can right-click and choose "Paste" (Ctrl+V does not work there), so you don't have to type URLs in by hand.
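Putting those steps together, a typical first download looks like this (the URL here is just the placeholder example from above, not a real file):

```shell
REM change into the folder where wget.exe lives
cd C:\Downloads

REM download a single file into the current folder
wget http://www.website.com/monkey.exe
```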
So those are the basics. Here are a few arguments that make it handy. If you want to see ALL the arguments, (a) download wget and type "wget --help", (b) visit the wget website, conveniently linked at the bottom of this post, or (c) visit this site, which lays them out a little more clearly.
-c = continue: One of the nicest features of wget is its simple ability to resume downloads. If you are downloading a large file and it gets interrupted, no sweat. Type in exactly what you typed when you first started the download, but add "-c" before the URL, and it will continue where it left off.
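As a sketch, resuming is just the original command run again with "-c" added (same placeholder URL as above):

```shell
REM first attempt, interrupted partway through
wget http://www.website.com/monkey.exe

REM run it again with -c to pick up where it left off
wget -c http://www.website.com/monkey.exe
```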
-i = get URLs from file: If you have a lot of URLs you want to download, paste them into a plain TXT file, one per line, and you won't have to enter them one at a time. Just use "wget -i TEXTFILE.TXT", and it will automatically fetch every URL in the file. Or, alternatively, you can use....
-i - = prompt for multiple URLs: This is slightly different from the above. Instead of typing -i followed by the name of a text file, type "-i -", which tells wget that you are going to paste the URLs in now. It won't prompt you, but paste in as many URLs as you want, press Ctrl+Z (it will show up as "^Z"), then press Enter, and the downloads should start.
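Here are both batch styles side by side, assuming a hypothetical urls.txt whose contents are just placeholder addresses:

```shell
REM urls.txt contains one URL per line, for example:
REM   http://www.website.com/monkey.exe
REM   http://www.website.com/song.mp3
wget -i urls.txt

REM or read URLs from the prompt: paste them in, then press Ctrl+Z and Enter
wget -i -
```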
-r = recursive: This is the other most useful option. If you want to download everything on a page, just add "-r" before the URL, and wget will grab it all. You can also use....
-A = accept only: If you want to download everything on a page but only want a certain type of file, type "-A FILETYPE", like "-A mp3", and it will download only MP3s. If you want more than one type, separate them with commas, like "-A mp3,mp4,ogg", and so on and so forth.
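For example, combining the two (website.com is a placeholder, as before):

```shell
REM grab everything linked from the page
wget -r http://www.website.com/

REM grab the page recursively, but keep only the listed audio/video types
wget -r -A mp3,mp4,ogg http://www.website.com/
```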
-nd = no directories: By default, wget recreates the website's directory structure on your disk. If you just want the files dumped into one folder, add "-nd" before the URL.
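So a full "rip the MP3s off a page into one folder" command might look like this (placeholder URL again):

```shell
REM recursive, MP3s only, no directory structure:
REM everything lands in the current folder
wget -r -nd -A mp3 http://www.website.com/
```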
Those are some of the options, but there are a lot more covering proxies, passwords, and other utilities.
So, in summary, there are a few reasons wget is nice:
-Tiny: a 400 KB file is all you need.
-Portable: because it's 400 KB.
-Open source: for those who care.
-Downloads everything on a page: using the "-r" argument, which can be narrowed further with filters like "-A".
-Resumes downloads: if they're interrupted.
The only downside to this program is that it has no GUI (Graphical User Interface, for you non-techies: it's just white text on a black background). However, people have made GUIs to make it a little easier, like Jack's WGET GUI, Wget:gui, Kiwi Wget GUI, or, what personally looks best to me, WinWGet, which appears to have Firefox integration and almost all of the arguments in the form of checkboxes. And there are probably tons more. Just google "wget gui".
For more info on wget, check out Jimmy Ruska's video on YouTube. He's the man. Here is wget's official website.
Visit Wget website for download