

SQLite

Posted: Fri May 15, 2020 10:04 am
by BalooBear
Finally getting to grips with the new program and I think it's great. :D

My scrape process can take a long time, and during that run I can hit OS/Internet errors that are outside the control of the running HSS script.

Sometimes, if the script crashes, I lose all the data scraped so far (ouch :x ). I'd also like to process some of the data while the scrape is still running, but when I examine the SQLite file the latest data hasn't been committed yet.

Is there a way to "flush" the data at regular intervals or programmatically? Currently I have to stop the scrape and save the file, and only then does the journal file commit the data (I could be wrong on this).

Also, another very minor problem you may know a workaround for: Access doesn't like "." in table names, so I cannot link/import from them directly. My current workaround is to create another table in SQLite with a name Access can handle, but it would be great if I could control the naming convention of the tables at creation time, or by some other method.
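For the record, my workaround is basically this: copy the dotted-name table into a new table with an underscore instead of the dot. A minimal sketch using Python's sqlite3 module (the table names here are just illustrative, not the actual names my scrape produces):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # use the real .db path in practice
cur = conn.cursor()

# A table whose name contains a dot, as the scraper might create it.
# SQLite allows this as long as the identifier is double-quoted.
cur.execute('CREATE TABLE "Products.Items" (id INTEGER, name TEXT)')
cur.execute('INSERT INTO "Products.Items" VALUES (?, ?)', (1, "widget"))

# Copy it under a name Access will accept (dot replaced by underscore).
cur.execute('CREATE TABLE Products_Items AS SELECT * FROM "Products.Items"')
conn.commit()

rows = cur.execute("SELECT * FROM Products_Items").fetchall()
print(rows)  # [(1, 'widget')]
```

It works, but it doubles the data in the file, which is why a naming option at creation time would be nicer.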

Looking forward to your response.

Re: SQLite

Posted: Tue May 19, 2020 10:31 pm
by webmaster
There's no way to automatically flush the data, but you can save the project with File -> Save while it's still running without having to stop the extraction.
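Once saved, the committed data can be inspected without disturbing the running extraction, for example with a read-only SQLite connection. A sketch using Python's sqlite3 module (the file and table here are placeholders standing in for the scraper's actual output):

```python
import os
import sqlite3
import tempfile

# Placeholder database standing in for the scraper's saved file;
# in practice, point db_path at the real .db the program writes.
db_path = os.path.join(tempfile.mkdtemp(), "scrape_demo.db")

w = sqlite3.connect(db_path)
w.execute("CREATE TABLE pages (url TEXT)")
w.execute("INSERT INTO pages VALUES ('http://example.com')")
w.commit()  # only committed data is visible to other connections

# mode=ro opens the file read-only, so inspecting it cannot write to
# or corrupt the database the scraper is still using.
ro = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)
rows = ro.execute("SELECT url FROM pages").fetchall()
print(rows)  # [('http://example.com',)]
ro.close()
w.close()
```

Anything not yet committed by the writer simply won't show up in the read-only view, which matches what you observed when examining the file mid-scrape.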

Regarding the dot, I'm not 100% sure about this, but I think you can use brackets like "[Some.Thing]" in Access. In any case, it would make sense to be able to use a different separator; I'll think about it and see how feasible this option is.

Re: SQLite

Posted: Wed May 20, 2020 9:59 am
by BalooBear