
My scrape process can take a long time, and during that time I can run into OS/Internet errors outside the control of the running HSS script.
Sometimes, if the script crashes, I lose all the data scraped so far (ouch!).

Is there a way to "flush" the data at regular intervals, or programmatically? Currently I have to stop the scrape and save the file, and only then does the journal file commit the data (I could be wrong about this).
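To illustrate what I mean by flushing: in plain Python `sqlite3` (I'm assuming HSS does something similar under the hood; the file, table, and column names below are all made up for the example), committing every N rows caps how much a crash can lose:

```python
import sqlite3

# Hypothetical database and table; stand-ins for whatever HSS creates.
conn = sqlite3.connect("scrape.db")
cur = conn.cursor()
cur.execute("DROP TABLE IF EXISTS results")
cur.execute("CREATE TABLE results (url TEXT, data TEXT)")

# Fake scraped rows, standing in for a long-running scrape.
rows = [(f"http://example.com/{i}", f"payload {i}") for i in range(250)]

for i, row in enumerate(rows, start=1):
    cur.execute("INSERT INTO results VALUES (?, ?)", row)
    if i % 100 == 0:
        # Flush every 100 rows: a crash now loses at most 100 rows,
        # not the whole run.
        conn.commit()

conn.commit()  # final flush for the remainder
conn.close()
```

Something equivalent exposed as a setting (commit every N records, or a scriptable "flush now") is what I'm hoping for.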
Also, a very minor problem you may know a workaround for: Access doesn't like "." in table names, so I cannot link to or import those tables directly. My current workaround is to create another table in SQLite with a name Access can handle, but it would be great if I could control the table naming convention at creation time, or by some other method.
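For reference, here is a sketch of the workaround I'm doing now, in plain `sqlite3` (the dotted table name and columns are invented stand-ins for what HSS actually produces). Renaming inside SQLite is enough to make Access happy:

```python
import sqlite3

conn = sqlite3.connect("scrape_rename.db")  # hypothetical database file
conn.execute('DROP TABLE IF EXISTS "site.products"')
conn.execute("DROP TABLE IF EXISTS site_products")

# Suppose the scraper produced a table literally named "site.products",
# which Access refuses to link to because of the dot.
conn.execute('CREATE TABLE "site.products" (name TEXT, price REAL)')
conn.execute('INSERT INTO "site.products" VALUES (?, ?)', ("widget", 9.99))

# Rename it to something Access accepts (underscore instead of dot).
conn.execute('ALTER TABLE "site.products" RENAME TO site_products')
conn.commit()
conn.close()
```

Being able to set that dot-free name at creation time would save this extra step.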
Looking forward to your response.