Questions and answers about anything related to Helium Scraper
Posts: 1
Joined: Thu Nov 10, 2011 9:26 am


Post by dude123r24 » Thu Nov 10, 2011 9:34 am


I'd appreciate some guidance on the below.

I'm trying to scrape a website that has a few links. I need to go to each link, collect some data, go back to the previous page (i.e. where the original link was), and then proceed to the next link. It does not seem to work.

I'm using a Navigate Each: in my actions I have a Navigate Each, under which I have the extraction routine.

One thing I noticed while creating the project: while building the kind, when I press the back button it should ask me if I want to resubmit/resend the data, but instead I just get a "webpage has expired" message. So I cannot give multiple examples for the kind.

Hope I'm making some sense. If not please shoot me down :)

Posts: 21
Joined: Tue May 31, 2011 5:12 pm

Re: Resend

Post by caliman » Thu Nov 10, 2011 2:39 pm

Hey dude, I know you will receive an official answer from Juan, but here's my two cents: first extract all the URLs, and then use "Navigate URLs" to extract what you want from each page.

Site Admin
Posts: 501
Joined: Mon Dec 06, 2010 8:39 am

Re: Resend

Post by webmaster » Thu Nov 10, 2011 7:04 pm


Make sure your extraction routine actions are child actions of your Navigate Each (instead of being below it). Using caliman's suggestion can also help you simplify the whole process, since you would have a flat list of URLs from which to perform your extraction. Finally, you can try adding a Wait action after your Navigate Each action in case the original page loads dynamically. Other than this, I'd need to look at the site and see what's going on.
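Helium Scraper itself is configured through the GUI, but the two-phase approach caliman describes (gather every link first, then visit each URL directly instead of navigating back) can be sketched in plain Python for illustration. The listing markup and URLs below are hypothetical, and the stdlib `html.parser` stands in for whatever extraction the real project would do:

```python
from html.parser import HTMLParser

# Hypothetical listing-page markup; in practice this would be fetched
# from the target site.
LISTING_HTML = """
<ul>
  <li><a href="/item/1">Item 1</a></li>
  <li><a href="/item/2">Item 2</a></li>
  <li><a href="/item/3">Item 3</a></li>
</ul>
"""

class LinkCollector(HTMLParser):
    """Phase 1: gather every link on the listing page into a flat list."""
    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.urls.append(value)

collector = LinkCollector()
collector.feed(LISTING_HTML)

# Phase 2: visit each collected URL directly. Because every page is
# reached by its own URL, no "back" navigation is needed, so expired
# form-resubmission pages never come up.
for url in collector.urls:
    print(url)  # here you would fetch the page and run the extraction
```

The key point is that the back button (and its resubmit prompt) disappears from the workflow entirely, which is exactly what "Navigate URLs" over a flat list gives you inside Helium Scraper.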

Regarding going back: you'll probably need to refresh the page to resend the data. This is just the default behavior of the browser. Perhaps upgrading your Internet Explorer will change this behavior.
Juan Soldi
The Helium Scraper Team
