Local `define` defs can override those inside a module's `define`.
## How to wire this stuff up?

### inputs
Create an object in each `ScrapeStep` that came from a module. The object should map full input keys to the module's internal keys; the internal keys are the ones actually used in the handlebars templates. E.g.
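A minimal sketch of what that per-step mapping could look like. The names here (`InputMap`, `resolveModuleInputs`, the `twitter-login.*` key prefix) are assumptions for illustration, not actual scrape-pages internals:

```typescript
// Hypothetical mapping for a module step: full input keys (what the
// top-level config passes in) -> the module's internal keys (what its
// handlebars templates reference).
type InputMap = Record<string, string>

const twitterLoginInputMap: InputMap = {
  'twitter-login.username': 'username',
  'twitter-login.password': 'password',
}

// Resolve the inputs a module step actually receives, renaming each
// full key down to the module's internal key.
function resolveModuleInputs(
  map: InputMap,
  inputs: Record<string, string>
): Record<string, string> {
  const resolved: Record<string, string> = {}
  for (const [fullKey, internalKey] of Object.entries(map)) {
    if (fullKey in inputs) resolved[internalKey] = inputs[fullKey]
  }
  return resolved
}
```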
### scrape

Two options:

- create a `flow.ts` instance for a module and hook that up to whatever is above/below it
- use the `scrapeEach` arrays and reattach the rest of the structure there

### stateful values

There may be times when a local/module scraper gets a value that you want for the rest of the run. Most often this will be an auth/access token.

This is essentially global state: whenever `'@community-scrapers/twitter-login'` gives us a value, we update the input value for `'accessToken'` and replace the passed-down value with `''`.
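One way that global-state update could be sketched. `absorbStatefulValue` and its parameters are hypothetical names, not part of scrape-pages:

```typescript
// Hypothetical sketch: when the login module yields a value, store it as
// the run-wide 'accessToken' input and blank out the passed-down value.
type RunInputs = Record<string, string>

function absorbStatefulValue(
  inputs: RunInputs,
  sourceModule: string,
  value: string
): string {
  if (sourceModule === '@community-scrapers/twitter-login') {
    inputs['accessToken'] = value // update the run-wide input
    return ''                     // replace the passed-down value with ''
  }
  return value // values from other modules flow through untouched
}
```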
### organizing dependencies
It is possible to keep module scrapers in a separate directory and run them with `worker_threads`:

```shell
mkdir scrape-pages-runners
cd scrape-pages-runners
npm init
npm i scrape-pages @community-scrapers/twitter-feed @community-scrapers/twitter-login
```
The dream here is to let other users maintain scrapers in a community repo, or on their own GitHubs, and let developers simply install them via npm.
Your main Node.js process can run something like:
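A hedged sketch of what that main process could do with `worker_threads`. The worker body is inlined via `eval: true` so the example is self-contained; in practice it would be a script inside `scrape-pages-runners/` that requires the installed module scrapers, and `runModuleInWorker` is a name invented here for illustration:

```typescript
import { Worker } from 'node:worker_threads'

// Stand-in worker code; a real runner would require and execute a
// community scraper module such as '@community-scrapers/twitter-login'.
const workerSource = `
const { parentPort } = require('node:worker_threads')
parentPort.postMessage({
  module: '@community-scrapers/twitter-login',
  value: 'fake-token',
})
`

// Spawn the module scraper in its own thread and resolve with the first
// value it posts back to the main process.
function runModuleInWorker(): Promise<{ module: string; value: string }> {
  return new Promise((resolve, reject) => {
    const worker = new Worker(workerSource, { eval: true })
    worker.once('message', (msg: { module: string; value: string }) => {
      resolve(msg)
      void worker.terminate()
    })
    worker.once('error', reject)
  })
}
```

The main process could then feed each posted value into the stateful-input handling described above.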