Welcome To Snipplr


Everyone's Recent Bash Snippets Tagged download



AIX does not have a package manager like YUM for open-source software, so I made a script to automatically install RPM packages on an AIX box by downloading them from the www.oss4aix.org site via FTP. It is a very first version; it does not have all the necess...
0 141 posted 5 years ago by teterkin
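The script itself is not shown in this listing, so the following is only a hedged sketch of such an installer: build a download URL for a package on the mirror, fetch it with wget, and install it with rpm. The "rpmdb" directory name and the example RPM file name are assumptions, not the site's real layout.

```shell
#!/bin/sh
# Hypothetical sketch of an oss4aix.org installer for AIX.
# The mirror path ("rpmdb") below is an assumption.
oss4aix_url() {
    base="ftp://www.oss4aix.org/rpmdb"   # assumed path on the mirror
    echo "$base/$1"
}

install_rpm() {
    url=$(oss4aix_url "$1")
    echo "fetching $url"
    wget "$url" && rpm -Uvh "$1"         # -Uvh upgrades or installs the package
}

# Example (needs network access and root, so left commented out;
# the file name is hypothetical):
# install_rpm wget-1.21.4-1.aix6.1.ppc.rpm
```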
simple ftp download example
0 102 posted 6 years ago by ktrout
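The snippet body is not included in this listing; a minimal FTP download along these lines can be done with curl (host and path here are placeholders):

```shell
# Minimal sketch of an FTP download with curl; host and path are placeholders.
host="ftp.example.com"
file="pub/readme.txt"
# -O saves under the remote file name; add --user name:pass for non-anonymous FTP.
cmd="curl -O ftp://$host/$file"
echo "$cmd"   # drop the echo to actually fetch
```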
download directory to localhost home folder new-dir-name
0 84 posted 8 years ago by zackn9ne
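The description suggests pulling a remote directory into a renamed folder under the local home directory; one common way to do that is scp -r. A hedged sketch, with the remote host and path as placeholders:

```shell
# Sketch: copy a remote directory to ~/new-dir-name (names are placeholders).
remote="user@example.com:/var/www/site"
dest="$HOME/new-dir-name"
# -r copies recursively; if $dest does not exist, the copied tree gets that name.
echo scp -r "$remote" "$dest"   # remove the echo to actually run the copy
```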
1. Save as filename.sh
2. chmod u+x filename.sh
3. Usage: ./filename.sh "url"
1 110 posted 9 years ago by kentoy
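The usage steps above suggest filename.sh is a tiny wget wrapper. The actual script is not shown, so this is a guess at its contents, written as a function so it can be sourced and tested:

```shell
# Hypothetical reconstruction of filename.sh: hand the quoted URL to wget,
# which is why the usage line is ./filename.sh "url".
download() {
    if [ -z "$1" ]; then
        echo 'usage: filename.sh "url"' >&2
        return 1
    fi
    wget "$1"
}

# Example (network required, so commented out):
# download "http://example.com/file.tar.gz"
```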
Source: Linux Journal. This command downloads the Web site www.website.org/tutorials/html/. The options are:
* --recursive: download the entire Web site.
* --domains website.org: don't follow links outside website.org.
...
1 497 posted 10 years ago by abhiomkar
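Assembled from only the options the description names (the original Linux Journal command carried more flags, which are truncated in this excerpt):

```shell
# Sketch built from the two options described above; more flags existed in
# the original command but are cut off in the listing.
url="http://www.website.org/tutorials/html/"
cmd="wget --recursive --domains website.org $url"
echo "would run: $cmd"   # drop the echo/quotes to actually download
```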
Download all tutorial pages of HTML format recursively. Ex: http://www.moraware.com/help/ (get the URL of the tree frame on the left and use it in wget). Have fun :)
0 75 posted 11 years ago by abhiomkar
Note: there is a bug after counting to 100.
0 67 posted 11 years ago by zxeem
Obviously, change the URL. The command is curl -o /dev/null -w '%{time_total}\n' http://www.google.com and you'll get (better formatted) something like: % Total % Received % Xferd Average Speed Time Time Time Current D...
0 47 posted 11 years ago by nategood
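The same one-liner with the write-out string quoted, so the shell leaves %{time_total} and \n for curl to interpret. %{time_total} is curl's total-transfer-time variable, and -o /dev/null discards the response body:

```shell
# Timing probe: print only the total transfer time in seconds.
# Single quotes keep %{time_total}\n intact for curl's -w option.
fmt='%{time_total}\n'
cmd="curl -o /dev/null -w $fmt http://www.google.com"
echo "$cmd"   # drop the echo to actually time the request
```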
For ExpressionEngine, I placed this with the main set of .htaccess rules in the root.
1 57 posted 11 years ago by joeymarchy
Simply paste the list of URLs into stdin. You can add the option `-P 4` to parallelize the downloads.
1 83 posted 11 years ago by xenonite
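The snippet itself is not shown, but the description fits a wget-over-xargs pipeline. A hedged reconstruction, demonstrated with echo in place of wget so nothing touches the network (urls.txt is a placeholder file of one URL per line):

```shell
# Hedged reconstruction: read URLs from stdin, one wget per URL.
# -n 1 hands each URL to its own wget; -P 4 (GNU xargs) runs four in parallel:
# xargs -n 1 -P 4 wget -q < urls.txt
# Demonstration with echo instead of wget:
printf '%s\n' http://a.example http://b.example | xargs -n 1 echo GET
```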
2 71 posted 11 years ago by pmaciver
* -nd: do not create a hierarchy of directories (save all recursively retrieved files in the current directory)
* -r: recursive retrieving
* -l1: set maximum recursion depth to 1 (stay in that folder); set to 2 if necessary
* --no-parent: do not...
3 109 posted 12 years ago by iblis
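The flags listed above assembled into one command (the --no-parent explanation is cut off in the listing; it stops wget from ascending above the starting directory). The URL is a placeholder:

```shell
# The listed flags in one command; the URL is a placeholder.
# -nd: flat output dir, -r: recurse, -l1: one level deep,
# --no-parent: never climb above the starting directory.
cmd="wget -nd -r -l1 --no-parent http://example.com/files/"
echo "$cmd"   # drop the echo to actually download
```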