Download by browsing with bash and wget

Been pretty busy for the last couple of days, and will be busy for another few days, but here's something to munch on in the meantime. (The script may need changes depending on the type of file you want to download or the site you are browsing, but the changes should be minimal.)

# Needs bash or ksh for the [[ ]] pattern match below
file="/tmp/temp.dir"
url="URL Here"
# Split command output on newlines only, so filenames with spaces survive
IFS='
'
cont="y"
while [ "$cont" != "n" ]
do
    name=""
    # Fetch the current directory listing quietly
    wget "$url" -O "$file" -o /dev/null
    for i in $(grep href "$file" | grep -v Parent)
    do
        # Extract the link target from href="..."
        name=${i##*href=\"}
        name=${name%%\">*}
        echo "$name"
        # Stop browsing once a tarball shows up in the listing
        if [[ $name == *gz ]]
        then
            cont="n"
        fi
    done
    if [ -z "$name" ]
    then
        echo "No files here.. Exiting"
        exit 1
    fi
    echo
    if [ "$cont" = "n" ]
    then
        echo "Enter the filename for download :"
        read file
    fi
    echo "Select one of the options:"
    read product
    url="$url/$product"
    echo "About to get $url"
done
wget "$url" -O "$file" -o /dev/null

I have used the script on Solaris 8/9 and Fedora 9/10.
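The two parameter expansions in the loop do all the HTML parsing: `${i##*href=\"}` strips the longest prefix ending in `href="`, and `${name%%\">*}` strips everything from the closing `">` onward. A minimal sketch of that step on a made-up directory-index line (the HTML here is illustrative, not from a real listing):

```shell
# A made-up line from an Apache-style directory index
i='<td><a href="foo-1.2.tar.gz">foo-1.2.tar.gz</a></td>'

# Drop everything up to and including href="
name=${i##*href=\"}
# Drop everything from the closing "> onward
name=${name%%\">*}

echo "$name"    # foo-1.2.tar.gz
```

The same pair of expansions works for any `href="..."` attribute, which is why the script gets away without sed or awk in the loop.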

Transfer all the Google Reader feed URLs to rss2email.

The next step in restoring my settings was to get the working rules and all the blogs into rss2email. I subscribe to more than 150 blogs, so adding all of them manually was difficult.

For a time like this, I subscribe to all the blogs using Google Reader as well. So I quickly exported the Google Reader subscription list and then had to find a way to extract the blog URLs and hand them to rss2email. The command to add a URL is r2e add, so I wrote this one-liner just now to do the task for me.

Hope it helps someone:

for i in $(cat google-reader-subscriptions.xml | grep xmlUrl | sed -e 's@.*xmlUrl="\(.*\) .*@\1@' | sed -e 's/"//'); do r2e add $i; done
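To see what the pipeline extracts, here it is run over a single made-up OPML outline line (the feed URL is illustrative): the first sed captures everything between `xmlUrl="` and the next space, and the second sed removes the closing quote that the capture drags along.

```shell
# A sample line of the kind Google Reader's OPML export contains
line='<outline text="Example Blog" xmlUrl="http://example.com/feed.xml" htmlUrl="http://example.com"/>'

url=$(echo "$line" | grep xmlUrl \
    | sed -e 's@.*xmlUrl="\(.*\) .*@\1@' \
    | sed -e 's/"//')

echo "$url"    # http://example.com/feed.xml
```

Note the capture is greedy up to the last space that still lets the pattern match, so this relies on `xmlUrl` being followed by exactly one more attribute separator, as it is in the export.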

Now all I needed to do was to set up procmail again to deliver the messages to my home directory. I have already posted an article on procmailrc, so that was simple. So, all done. Hope this helps someone.
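For completeness, a recipe of the general shape I mean, as a sketch only: it assumes rss2email's messages can be matched on a User-Agent header mentioning rss2email and that they should land in a feeds mbox under the home directory. Check the headers your rss2email version actually emits before using anything like this.

```
# Illustrative ~/.procmailrc fragment, not my exact rules
MAILDIR=$HOME/Mail

:0:
* ^User-Agent:.*rss2email
feeds
```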