Requiring human intervention for Multiple Processes Extract

Questions and answers about anything related to Helium Scraper

Requiring human intervention for Multiple Processes Extract

Post by broadway34th » Tue Oct 14, 2014 1:54 pm

Hello,

I love your product and am making very good use of it. Thanks for making it; well worth the $$$.

I have a couple of problems, though:

1) When using the multiple processes feature (see the attached screenshot), it requires human intervention every time a child process returns. My extraction takes about 30 minutes to an hour, which means I have to stand by and be ready to hit Enter 100 times. Is there any way to avoid this and automate the entire process?

2) One of the child processes sometimes hangs for no obvious reason, showing a black child window. The symptom seems to go away when I make all of the child windows very small (so they don't flash as much) every time they appear, but doing that by hand is tedious.

I have to deal with (1) and (2) for every one of my routine website scan/extractions. It would help very much if you could give me a clue on how to mitigate these issues.

Thanks!

Soo


Re: Requiring human intervention for Multiple Processes Extract

Post by broadway34th » Tue Oct 14, 2014 1:56 pm

Oops, I forgot to attach the screenshot.
Here we go:
[Attachment: helium_screenshot.png (Screenshot for my action)]


Re: Requiring human intervention for Multiple Processes Extract

Post by webmaster » Wed Oct 15, 2014 7:33 pm

Hi,

Why exactly do you need to press enter for every child process? Do they show any message box you need to accept?

Also, I don't think that setup is going to work, since you're clearing the SubMenuURL table while another process may be using it as a source. What I'd do is first extract all the available URLs into the SubMenuURL table and, after checking the table and making sure all the URLs are there, set up another actions tree that spawns all the child processes using this table (without ever clearing it).
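
To see why the order matters, here's the same idea as a rough Python sketch (plain multiprocessing, not Helium Scraper itself; the URLs and function names are made up): finish building the work list, verify it, and only then let the workers consume it. Nothing clears or rewrites the list once the child processes have started reading from it.

```python
import multiprocessing as mp

def scrape(url):
    # Stand-in for whatever extraction a child process performs.
    print(f"extracting {url}")

if __name__ == "__main__":
    # Phase 1: gather every URL first (this plays the role of filling
    # the SubMenuURL table and checking that it is complete).
    submenu_urls = [f"http://example.com/menu/{i}" for i in range(2000)]

    # Phase 2: only after the list is final, hand it to the workers.
    # Nothing clears or rewrites the list while they are reading it.
    with mp.Pool(processes=4) as pool:
        pool.map(scrape, submenu_urls)
```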
Juan Soldi
The Helium Scraper Team


Re: Requiring human intervention for Multiple Processes Extract

Post by broadway34th » Thu Oct 16, 2014 2:31 pm

Hi, thanks very much for the response.

Yes, every time a child process returns, there's a pop-up box for which I have to press Enter. I normally have to close about 100 of these pop-ups for every extraction. :-( I'm using the latest version of Helium Scraper and IE 11.

As for the SubMenuURL table, I doubt it gets referred to by the child processes while they're running; the extraction itself seems to be working okay. I'd like to try your suggestion, however. Once the table with all the URLs is ready (let's say about 2000 URLs), how many processes would you recommend I spawn? And assuming I'd have to allocate the data in the SubMenuURL table to each child process, is there an easy way of doing this?

Cheers,
Soo


Re: Requiring human intervention for Multiple Processes Extract

Post by webmaster » Thu Oct 16, 2014 6:08 pm

Hi,

What does this popup say?

About 4-5 processes should be OK. You'd really just do the same thing you're doing now: in Actions tree 1, keep the Execute tree: Clear Tables action but remove the Start Processes at: SubMenuURL, URL action. Then, in another actions tree, add a single Start Processes at: SubMenuURL, URL action, which you'd run after you've extracted all the URLs.
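
If it helps to picture the allocation, the general pattern behind a single Start Processes action over one table is a shared work queue that each child drains on its own, so no manual splitting of the 2000 URLs is needed. Here's that pattern as a rough Python sketch (only an analogy with made-up names, not Helium Scraper's actual implementation):

```python
import multiprocessing as mp

def worker(queue):
    # Each child pulls the next unclaimed URL; no pre-splitting needed.
    while True:
        url = queue.get()
        if url is None:          # Sentinel: no work left for this child.
            break
        print(f"processing {url}")

if __name__ == "__main__":
    num_workers = 4
    queue = mp.Queue()
    for i in range(2000):
        queue.put(f"http://example.com/menu/{i}")
    for _ in range(num_workers):
        queue.put(None)          # One stop sentinel per worker.

    children = [mp.Process(target=worker, args=(queue,))
                for _ in range(num_workers)]
    for p in children:
        p.start()
    for p in children:
        p.join()
```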
Juan Soldi
The Helium Scraper Team
