Hi,
I need to download websites automatically from a batch script, exactly as the browser would save them with Ctrl+S.
I used curl for that and it worked fine for many years, but now the source website has changed and curl can no longer get the content. The problem is that the HTML the browser initially downloads does not contain the information.
It is loaded later by JavaScript.
So when I open the URL in the browser and look at the source code, I see something like
<script src="/example.js"></script>
When I press Ctrl+S to save the page, the complete site is saved, because the browser has executed the JavaScript.
curl does not execute JavaScript, so it saves only the raw source code containing the "<script src="/example.js"></script>" tag.
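This behavior can be demonstrated with a local file (the temp paths and the test page below are made up for illustration): curl copies the page source byte for byte, so a script tag is stored verbatim, never executed.

```shell
#!/bin/sh
# Write a tiny test page containing only a script tag (illustrative content).
cat > /tmp/demo.html <<'EOF'
<html><body>
<script src="/example.js"></script>
</body></html>
EOF

# Fetch it with curl, exactly as one would fetch an https:// URL.
curl -s "file:///tmp/demo.html" -o /tmp/saved.html

# The saved copy still contains the unexecuted script tag,
# so any content the script would have inserted is missing.
grep '<script src="/example.js"></script>' /tmp/saved.html
```

The same applies to a real https:// URL: curl stores what the server sends, not what a browser would render from it.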
My question is:
How can I download the URL with the JavaScript already executed, so that the information is included too?
I hope anybody can answer that.
Thank you.
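One approach that matches this requirement is a headless browser: Chromium and Chrome accept --headless --dump-dom, which prints the DOM after scripts have run, so the output can be redirected to a file from a script much like curl's. A minimal sketch, assuming a Chromium-family browser is installed (the URL, output file, and binary names below are placeholders):

```shell
#!/bin/sh
# Sketch: save a page after JavaScript has run, via headless Chromium/Chrome.
# URL and OUT are placeholders; adjust them for the real site.
URL="https://example.com"
OUT="page.html"

# The browser binary name varies by system; try the common ones.
BROWSER=""
for c in chromium chromium-browser google-chrome; do
    command -v "$c" >/dev/null 2>&1 && { BROWSER="$c"; break; }
done

if [ -n "$BROWSER" ]; then
    # --dump-dom prints the DOM *after* scripts have executed
    "$BROWSER" --headless --disable-gpu --dump-dom "$URL" > "$OUT" \
        || echo "fetch failed (offline or page error)"
else
    echo "no Chromium/Chrome binary found; install one or extend the list"
fi
```

On Windows, the equivalent batch line would invoke chrome.exe (or msedge.exe) with the same --headless --dump-dom flags and redirect stdout to a file.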