Richard Jedlička Posted December 18, 2021

Hi there,

I have a ProcessWire website which I want to use as a CMS. I will also have a separate scraper application that retrieves data from the web and needs to store it in that PW website as pages. The catch is that the scraper is written in JS, not PHP, and runs on a different server (though it has access to PW's server). One thing to consider: the scraper will generate a lot of data.

I am thinking about the best solution to this, and I have some ideas:

1) Create a REST API in the CMS and let the scraper send requests over HTTP.
2) Use websockets.
3) Install another PW instance alongside the scraper, pointing at the same database as the CMS, and call it over the command line.
4) Insert data from the scraper directly into the database (create pages).

Which approach do you think is best? I want the CMS UI to stay usable while the scraper sends many requests (multiple per second). I am interested in option 3), but I'm not sure whether PW supports running multiple instances against the same database.

Thanks
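To make option 1) concrete, here is a minimal sketch of the scraper side in JS. Everything in it is an assumption, not something ProcessWire ships with: PW has no built-in REST API, so the CMS would need to expose a custom endpoint (e.g. via a template or module), and the URL, `X-Api-Key` header, field names, and `scraped-item` template are all illustrative.

```javascript
// Hypothetical scraper-side client for option 1): POSTing scraped
// items to an assumed REST endpoint exposed by the ProcessWire CMS.

// Build the JSON body for one scraped item. The template name,
// parent path, and field names are placeholders for illustration.
function buildPagePayload(item) {
  return {
    template: 'scraped-item',
    parent: '/scraped/',
    title: item.title,
    body: item.body,
  };
}

// POST a single item to the CMS. An API-key header is assumed here
// for authentication; the real endpoint would define its own scheme.
async function postPage(apiUrl, apiKey, item) {
  const res = await fetch(apiUrl, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-Api-Key': apiKey,
    },
    body: JSON.stringify(buildPagePayload(item)),
  });
  if (!res.ok) throw new Error(`CMS returned ${res.status}`);
  return res.json();
}
```

Since the scraper fires multiple requests per second, batching several items into one request (an array payload) would cut the per-request overhead on the CMS side and help keep the admin UI responsive.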