Get all the URLs in an HTML file (local or on a server)

2014-02-17 1 min read bash Fedora
To use this, you will need the lynx tool, so install that first:

sudo yum install lynx

Now, to get a list of all the URLs in a local HTML file or at some URL, just execute this:

lynx -dump -listonly <file-or-URL>
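For example, a minimal sketch (lynx installed as above; example.com and page.html are placeholders for your own targets):

# Dump the list of links from a live page
lynx -dump -listonly https://example.com/

# Same for a local file; awk strips lynx's "N. URL" numbering
# so only the bare URLs remain
lynx -dump -listonly page.html | awk '/https?:/ {print $2}'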

phpMyAdmin access problem and changing the server from the URL

2012-09-30 1 min read Database Fedora
I was having issues with one of my phpMyAdmin setups. The first server defined in the config file was not accessible, and the theme I was using did not allow changing the server from the first page. This left me in a state where I could not access phpMyAdmin at all: I could not connect to any of the servers. So I had to hack my way into phpMyAdmin to find out how to connect to the other servers from the URL. Continue reading
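The short version, as a sketch (host and path are placeholders; this assumes phpMyAdmin's server GET parameter, where the number is the index of the server block in config.inc.php, counting from 1):

# Open phpMyAdmin connected to the second server from config.inc.php,
# bypassing the unreachable first one
http://localhost/phpmyadmin/index.php?server=2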

Mirror or download a website with a tool much better than wget

2010-05-19 1 min read bash Fedora Linux
If you want to mirror a website, the simplest tool I know of is wget. Once you have used wget, you know the troubles associated with it. You also end up with a couple of extra files, such as search HTML pages, which are useless and a waste of bandwidth for you as well as for the webserver. Instead, use lftp:

lftp -e "mirror -c" <site-URL>

This will mirror the website for you and will not leave you with all those unnecessary files. Continue reading
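A fuller invocation, as a sketch (example.com is a placeholder; -c resumes a partially completed mirror, and exit quits lftp once it finishes instead of leaving you at its prompt):

# Mirror the site into the current directory, resuming if interrupted
lftp -e "mirror -c; exit" http://example.com/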