ruby - How do I download a web page into a single file from a specified URL?


I'm trying to scrape web pages.

I want to download a web page by providing a URL and save it for offline reading, including its images. I can't manage this with wget since it creates so many directories.

Is this possible with wget? Is there something like the "Save As" option in Firefox, which creates a directory and puts the required resources there alongside the HTML page?

Would it be possible with Nokogiri or Mechanize?

You can use wget and run it from within a Ruby script.

Here's an example that rips the homepage of a site, skrimp.ly, and puts the contents into a single directory named "download". Everything ends up at the top level, and the links embedded in the HTML are rewritten to point to the local copies:

wget -E -H -k -K -p -nH -nd -Pdownload -e robots=off http://skrimp.ly
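
To run this from Ruby as suggested above, here is a minimal sketch (not part of the original answer) that shells out to wget with the same flags; the URL and output directory are just the ones from the example:

    url     = "http://skrimp.ly"   # example URL from the answer
    out_dir = "download"

    args = [
      "wget",
      "-E",               # adjust extensions (e.g. append .html where needed)
      "-H",               # span hosts so requisites served elsewhere are fetched
      "-k",               # convert links in the saved HTML to local references
      "-K",               # keep a backup of each file before link conversion
      "-p",               # download all page requisites (images, CSS, JS)
      "-nH",              # no host-name directories
      "-nd",              # no directory hierarchy at all
      "-P", out_dir,      # save everything under ./download
      "-e", "robots=off",
      url
    ]

    # Passing the arguments as a list avoids shell interpolation of the URL.
    system(*args) or abort "wget failed"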

Note: you should check out some of the wget docs. It can do crazy stuff like going down multiple levels. If you do that sort of thing, please be cautious -- it can be pretty heavy on a web server and in some cases cost the webmaster a lot of $$$$.

http://www.gnu.org/software/wget/manual/html_node/advanced-usage.html#advanced-usage
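
If you do go down multiple levels, a hedged sketch of a gentler invocation (again shelling out from Ruby, with a placeholder URL) would limit the recursion depth and pace the requests so the crawl stays light on the server:

    url = "http://example.com"   # placeholder -- use a site you have permission to mirror

    system(
      "wget",
      "-r",                 # recursive download
      "-l", "2",            # ...but only two levels deep
      "--wait=1",           # pause one second between requests
      "--limit-rate=200k",  # cap the bandwidth used
      "-k", "-p",           # convert links and grab page requisites
      "-P", "download",     # keep everything under ./download
      url
    )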

