Using a wget or curl command, you can retrieve a web page and save it to the file ~/.regexpsearcher_web.
The downloaded content is then searched for regexp matches.
This is useful for checking whether specific content has been published on a web page, either every time your machine starts or whenever you click the "update" button.
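As a minimal sketch of this workflow (the URL and pattern below are placeholders, not part of the tool; the network commands are shown commented out so the demo runs offline against a locally generated file):

```shell
OUT="$HOME/.regexpsearcher_web"          # where regexpsearcher reads the page

# Fetch a page with wget or curl (hypothetical URL):
# wget -q -O "$OUT" "https://example.com/news"
# curl -s -o "$OUT" "https://example.com/news"

# Offline stand-in for the downloaded page:
printf 'regexpsearcher 1.0 released\n' > "$OUT"

# Search the saved copy for a regexp and report the result:
if grep -qE 'released' "$OUT"; then
    echo "match found"
else
    echo "no match"
fi
```

The same grep step works no matter how the file was produced, which is what makes the command-based approach flexible.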
Using a command makes things more flexible:
- The file can be generated locally or simply copied from another file; there is no need to download anything from the Internet.
- You can simulate POSTs or other types of HTTP requests if you like.
- You can use other protocols, like FTP.
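A few hedged examples of such alternative commands (all URLs and paths are placeholders; the network variants are commented out so only the local one actually runs):

```shell
OUT="$HOME/.regexpsearcher_web"

# Simulate a POST request with curl (hypothetical endpoint and form field):
# curl -s -o "$OUT" -d "query=news" "https://example.com/search"

# Fetch over FTP instead of HTTP (hypothetical server):
# wget -q -O "$OUT" "ftp://example.com/pub/CHANGELOG"

# Or skip the network entirely and use a locally generated file:
date > "$OUT"
```

In each case the tool only sees the final contents of ~/.regexpsearcher_web, so any command that writes that file can feed the regexp search.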