TypeScript + Helium Scraper

Since its inception, Helium Scraper has supported injecting and running JavaScript code. This makes it possible to perform complex calculations and access information that is not directly accessible to Helium Scraper, such as JSON data stored in elements and variables. But mapping between JavaScript and Helium Scraper types has always been a hassle. Since JavaScript

Low-Code Web Scraping

Developers like myself love dark-mode code editors, terminals and cryptic-looking command lines, but certain tasks, such as web development, data sheet calculations and, of course, web scraping, don’t require this level of technical expertise. Low-code is an approach to software development that allows users of any level of technical knowledge to participate in the

Scraping from LinkedIn

We’ve created a ready-made template that can be used to extract people and company information from LinkedIn. An account is required for the extraction to work. Check with support to see how many profiles/companies you’re allowed to view per day; otherwise, your account could get banned. Getting started To get started, download the template and

The Plus Operator

There are many operators in Helium Scraper, but the plus (+) operator deserves its own tutorial, given the number of uses it has. This is because it doesn’t just represent addition, but also concatenation of strings and sequences. Simple Cases Helium Scraper will treat the operator differently, depending on the type of data that is
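This kind of type-dependent overloading can be illustrated with a TypeScript analogy (a sketch only; Helium Scraper’s expression language is its own, and the sequence case here uses `concat` because TypeScript’s `+` does not apply to arrays):

```typescript
// The same "+" symbol behaves differently depending on operand types,
// much like Helium Scraper's plus operator.
const sum: number = 2 + 3;                    // numeric addition
const text: string = "Helium " + "Scraper";   // string concatenation
// TypeScript arrays have no "+", so concat stands in for
// Helium Scraper's sequence concatenation:
const seq: number[] = [1, 2].concat([3, 4]);
```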

Introducing Common Crawler

Common Crawler is a free version of Helium Scraper that, instead of loading pages from the web, loads them from the Common Crawl database. Aimed at both developers and non-developers, it makes it easy to query the Common Crawl data and then create selectors and actions that extract structured data from the target HTML

The Web Scraping Dilemma

The web scraping community seems to be divided into two sub-worlds. One is the world of programmers, who often use Python or JavaScript to carefully craft their agents down to the details in a time-consuming but ultimately rewarding process. And the other is the world of layman users, who must choose between a