A feature to follow the links on a page and download the linked pages all together. For example, if a page contains a set of valid URL links, SingleFile could follow those links and save all of the pages together in a single .html file.

This would let SingleFile crawl and download every available page of a small site (say, 10-20 pages) into one .html file, which would act like an offline book: you could navigate the saved site just as you would online.

Thank you
Thank you for the suggestion. In fact, this is possible via single-file-cli, cf. the last examples in the README.
From my point of view, it requires a rather complex user interface to supervise and control website crawling. So, ideally, I think it would be better to code another extension based on SingleFile to handle this functionality.
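For reference, the crawling support in single-file-cli that the reply points to looks roughly like the following. This is a sketch based on the crawl options shown in the single-file-cli README; flag names and behavior should be verified against your installed version, as options may change between releases. Note that each crawled page is saved as its own .html file rather than merged into one, which differs slightly from the request above.

```shell
# Save a page and follow its links, restricted to the same site,
# up to two levels deep (flags per the single-file-cli README;
# verify against your installed version):
single-file "https://www.example.com" \
  --crawl-links=true \
  --crawl-inner-links-only=true \
  --crawl-max-depth=2
```

Combined with relative link rewriting, the resulting set of files can be browsed offline much like the "offline book" described in the request.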