Hello,
when I scrape a website, Helium Scraper runs out of memory.
On the top page I jump to the links with "navigate each", and on each sub-page I collect only a single link (no downloads). The top page contains only 24 links. After every "navigate each" step, Helium Scraper's memory usage grows by 3-5 MB, and after 2-3 websites it runs out of memory.
What's going wrong here? Can anyone help me?
Thanks,
matze