Welcome To Snipplr


Everyone's Recent Bash Snippets Tagged download



AIX does not have a package manager like YUM for open source software, so I made a script to automatically install RPM packages on an AIX box by downloading them from the www.oss4aix.org site via FTP. It is a very first version; it does not have all the necess...
0 2211 posted 10 years ago by teterkin
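A minimal sketch of what such a fetch-and-install helper might look like. The function name `install_rpm`, the mirror path, and the wget/rpm flags are assumptions for illustration, not the original script:

```shell
# Hypothetical sketch: fetch an RPM from the oss4aix.org mirror via FTP
# and install it with rpm. The mirror path is an assumption.
install_rpm() {
    pkg="$1"
    if [ -z "$pkg" ]; then
        echo "usage: install_rpm package.rpm" >&2
        return 1
    fi
    # wget understands plain ftp:// URLs; -q keeps it quiet.
    wget -q "ftp://www.oss4aix.org/$pkg" -O "/tmp/$pkg" || return 1
    # -U upgrades if an older version is installed, installs otherwise.
    rpm -Uvh "/tmp/$pkg"
}
```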
simple ftp download example
0 2028 posted 11 years ago by ktrout
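The description gives no code, but a simple FTP download in the shell might look like the following sketch; the function name and URL shape are placeholders:

```shell
# Minimal sketch of a simple FTP download using curl, which speaks
# ftp:// natively; -O saves under the remote file name, -s hides
# the progress meter.
ftp_get() {
    if [ -z "$1" ]; then
        echo "usage: ftp_get ftp://host/path/to/file" >&2
        return 1
    fi
    curl -s -O "$1"
}
```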
Download a remote directory into a new-dir-name folder under your local home directory.
0 1938 posted 13 years ago by zackn9ne
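A sketch of how this could be done with wget; the function name and the example URL are placeholders, and the exact flags are an assumption about the snippet:

```shell
# Hypothetical sketch: recursively download a remote directory into
# ~/new-dir-name on the local host.
fetch_dir() {
    # -r  recurse into the directory;
    # -np never ascend to the parent directory;
    # -P  save everything under the given prefix directory.
    wget -r -np -P "$HOME/new-dir-name" "$1"
}
```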
1. save as filename.sh 2. chmod u+x filename.sh 3. usage: ./filename.sh "url"
1 1800 posted 15 years ago by kentoy
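The steps above describe a small downloader script taking one URL argument. A sketch under those assumptions (the function name and the `wget -c` flag are mine, not confirmed by the snippet):

```shell
#!/bin/sh
# Hypothetical sketch of the downloader: one URL argument, fetched with wget.
fetch_url() {
    if [ $# -ne 1 ]; then
        echo "usage: $0 \"url\"" >&2
        return 1
    fi
    # -c resumes a partially downloaded file instead of starting over.
    wget -c "$1"
}
```

When saved as filename.sh, add `fetch_url "$@"` as the last line so the script forwards its arguments to the function.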
Source: Linux Journal. This command downloads the web site www.website.org/tutorials/html/. The options are: --recursive: download the entire web site; --domains website.org: don't follow links outside website.org....
1 4861 posted 15 years ago by abhiomkar
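The listed options assembled into the full command (the URL is the article's own example; wrapping it in a function is just for convenience):

```shell
# Mirror the example site, staying on its domain and below the start URL.
mirror_site() {
    # --recursive: download the entire web site;
    # --domains website.org: don't follow links outside website.org;
    # --no-parent: never ascend above the starting directory.
    wget --recursive --domains website.org --no-parent \
         http://www.website.org/tutorials/html/
}
```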
Download all tutorial pages in HTML format recursively. Ex: http://www.moraware.com/help/ Get the URL of the tree frame (left) and use it in wget. Have fun :)
0 1129 posted 16 years ago by abhiomkar
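A sketch of the wget invocation this describes; the `-A` accept-list approach is an assumption about how the snippet filtered for HTML:

```shell
# Hypothetical sketch: recursively fetch only HTML pages below a start URL.
fetch_html() {
    # -r recurse; -np stay below the start URL;
    # -A html,htm accept only files with these extensions.
    wget -r -np -A html,htm "$1"
}
```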
There is a bug after counting 100.
0 952 posted 16 years ago by zxeem
Obviously, change the URL. You'll get (better formatted) something like... curl -o /dev/null -w "%{time_total}\n" http://www.google.com % Total % Received % Xferd Average Speed Time Time Time Current D...
0 835 posted 16 years ago by nategood
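The timing one-liner from the description, with the shell escaping fixed and an `-s` added to suppress the progress table shown above (the function wrapper is just for convenience; google.com is the snippet's example URL):

```shell
# Print only the total transfer time for a request.
time_total() {
    # -o /dev/null discards the response body; -s hides the progress meter;
    # -w prints curl's total transfer time once the request finishes.
    curl -o /dev/null -s -w '%{time_total}\n' "$1"
}
```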
For ExpressionEngine, I placed this with the main set of .htaccess rules in the root.
1 1184 posted 16 years ago by joeymarchy
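The snippet itself is not shown; a minimal sketch of the kind of .htaccess rule commonly used to force downloads, assuming mod_headers is available (the file extensions are placeholders, not the original rules):

```apache
# Hypothetical sketch: serve matching files as downloads instead of
# rendering them inline. Requires mod_headers.
<FilesMatch "\.(pdf|zip|mp3)$">
    Header set Content-Disposition attachment
</FilesMatch>
```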
Simply paste the list of URLs into stdin. You can add the option `-P 4` to parallelize the downloads.
1 1148 posted 16 years ago by xenonite
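The pattern this describes, sketched as a small wrapper (the function name is mine; the `-n 1`/`-P 4` xargs flags are standard):

```shell
# Read URLs from stdin, one per line, and download each with wget.
parallel_get() {
    # -n 1: pass one URL per wget invocation;
    # -P 4: run up to four downloads in parallel.
    xargs -n 1 -P 4 wget -q
}
```

Usage: `parallel_get < urls.txt`, or paste the URLs directly into the terminal and end with Ctrl-D.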
2 1288 posted 17 years ago by pmaciver
-nd: do not create a hierarchy of directories (save all recursively retrieved files in the current directory). -r: recursive retrieving. -l1: set maximum recursion depth to 1 (stay in that folder); set to 2 if necessary. --no-parent: do not...
3 1696 posted 17 years ago by iblis
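The options above assembled into the full command (the URL argument and the function wrapper are placeholders):

```shell
# Flatten-fetch one directory level of a site into the current directory.
flat_fetch() {
    # -nd: no directory hierarchy; -r: recursive; -l1: recursion depth 1;
    # --no-parent: never ascend above the start URL.
    wget -nd -r -l1 --no-parent "$1"
}
```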