Module that outputs a static site?


owzim


Hi,

Is there something like that, or would you consider it useful? Maybe there's another way, but I think it would be great.

Since I tinkered around a bit with Jekyll (a static site generator), I love the idea of not depending on PHP or database access for clients who don't want to change content that often (or even almost never). There would still be the huge benefit of PW letting me organize the data in a very comfortable way; when finished, push a button and the site is output into a directory, ready to be used in a static way. Also, if a client didn't pay for CMS integration, they shouldn't get it =)

On a side note: I have so many initial questions, but I don't know if I should ask them each separately (better in my opinion) or all in one post.

cheers


Yes, Pro Cache might be an option, but the pages are still passed through PHP, and the .htaccess is necessary to mimic the static site structure. Perhaps I will use Pro Cache to make a module that takes care of that.


ProCache builds a static site in its cache. I'm not sure, but maybe it could just be copied to the root and voilà.

When the cache is fully populated, you could do this. You could literally take everything in the /site/assets/ProCache/ dir, move it to the root on another web server and it would function statically. You would also need your other static assets copied over, which I think would be limited to these dirs:

/site/assets/files/*

/site/templates/styles/*

/site/templates/scripts/*
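
A minimal sketch of that copy step (assuming default paths; DEST is a hypothetical target directory):

# Assemble a static copy: ProCache's pre-rendered HTML becomes the
# document root; static assets keep their original paths so links keep working.
DEST=/path/to/static-site
mkdir -p "$DEST/site/assets" "$DEST/site/templates"
rsync -av site/assets/ProCache/       "$DEST/"
rsync -av site/assets/files/          "$DEST/site/assets/files/"
rsync -av site/templates/styles/      "$DEST/site/templates/styles/"
rsync -av site/templates/scripts/     "$DEST/site/templates/scripts/"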


Would you consider this, Ryan?

Sure. Though it's a little problematic because in order to be complete, every single page in the site would have to be visited and have a cache file generated for it. It's certainly possible to do on your own, assuming you can easily visit every page in your site (i.e. it's not too big). But if I build it as a "feature", then it has to be scalable to very large situations… not sure I'm up for that.
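
For a small site, one way to script that visiting yourself (a rough sketch, assuming the site exposes a sitemap.xml, e.g. via a sitemap module; example.com is a placeholder):

# Request every URL listed in the sitemap so a cache file gets generated for each page.
wget -q -O - https://example.com/sitemap.xml \
  | grep -o '<loc>[^<]*</loc>' \
  | sed -e 's/<loc>//' -e 's|</loc>||' \
  | xargs -n1 curl -s -o /dev/null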


Another option, if what you want to do is simply a static copy of your whole site, is to use something like HTTrack:

HTTrack is a free (GPL, libre/free software) and easy-to-use offline browser utility.
 
It allows you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer. HTTrack arranges the original site's relative link-structure.

It's available as a Windows and Linux version plus a command-line tool, and (based on my experience so far) it does a good job most of the time. The Windows version can sometimes require a bit of playing around with settings, but other than that it's simple, powerful, very configurable and completely automatic :)
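
Typical command-line usage is something along these lines (example.com as a placeholder):

# Mirror the site into ./mirror, following only links within the domain.
httrack "https://www.example.com/" -O ./mirror "+*.example.com/*" -v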


Sure. Though it's a little problematic because in order to be complete, every single page in the site would have to be visited and have a cache file generated for it. It's certainly possible to do on your own, assuming you can easily visit every page in your site (i.e. it's not too big). But if I build it as a "feature", then it has to be scalable to very large situations… not sure I'm up for that.

Is there a way to fake that? Effectively run a crawler over the entire site based on a generated map or something?

I don't see this as high priority, but interesting all the same!


Sure. Though it's a little problematic because in order to be complete, every single page in the site would have to be visited and have a cache file generated for it.

....

If you want to use the ProCache module for that and want to play it safe, I recommend a run with Xenu's Link Sleuth ^-^

Also, as Teppo mentioned, HTTrack is a good solution for creating offline versions of sites :)


Is there a way to fake that? Effectively run a crawler over the entire site based on a generated map or something?

Oh, Joss, sorry, I hadn't read your post before replying.

Xenu is a crawler that checks for broken links. I don't think there's any need for a map; just let Xenu start at your root and wait until it emails you the result.


I have to admit a static site generator is something I would also be interested in. I have used similar plugins for other systems (WordPress, MODX, etc.) and found them useful in various situations.


I often use ProcessWire for prototyping an idea on my local machine, and being able to quickly share that with clients or team members to get feedback requires copying the DB and code over to the server (not a massive task, but it still takes time and effort). Having a one-click export to a folder or zip file for sharing would be a huge time saver. If you set that up with a Dropbox folder and a service like site44 (http://www.site44.com/), you could have a pretty neat publishing workflow. Instant one-click publishing - just like FrontPage 98!


Another interesting use case, if you made the static site generator flexible enough, would be to use it for exporting to epub, pdf and mobi formats - a kind of ebook export module. This would fit in nicely with an idea I have been thinking about - a simple book publishing platform with PW - something similar to the booktype project by sourcefabric ( http://www.sourcefabric.org/en/booktype/ ) or the Leanpub site ( https://leanpub.com/ ), which could be used for writing books or documentation.

Also, similar to the HTTrack application suggested by Teppo: if you are on a Mac and want to create a local copy of a website, SiteSucker is very good (and free):

http://sitesucker.us


Another option, if what you want to do is simply a static copy of your whole site, is to use something like HTTrack:

It's available as a Windows and Linux version plus a command-line tool, and (based on my experience so far) it does a good job most of the time. The Windows version can sometimes require a bit of playing around with settings, but other than that it's simple, powerful, very configurable and completely automatic :)

Tested it and it works quite well; this solution is the best for me right now. Looking forward to seeing what might be coming to the ProCache module.

Link to comment
Share on other sites

There's also SiteSucker (for Mac people), which I think does the same kind of thing as HTTrack: http://www.sitesucker.us/mac/mac.html

That's more convenient and seems to work even better, thanks.

Just for the people who also use this kind of solution and want to strip all "index.html" from the URLs:

find . -name '*.html' -print0 | xargs -0 sed -i '' 's/index.html//g'
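
(The empty '' after -i is the BSD/macOS sed syntax; on GNU/Linux drop it and just use sed -i.)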

I stumbled upon this by mistake.

I was actually looking for a method that would export a static page with a CSS extension, thereby allowing total control of "editable style sections" of a site.

I'm gonna check out Ryan's ProCache module. Maybe that'll give me some ideas.



Sure. Though it's a little problematic because in order to be complete, every single page in the site would have to be visited and have a cache file generated for it. It's certainly possible to do on your own, assuming you can easily visit every page in your site (i.e. it's not too big). But if I build it as a "feature", then it has to be scalable to very large situations… not sure I'm up for that.

Do you have a hint for anybody who wants to automate this visiting task? I have a client with a rather small website that he wants to download rather than just visit online, and I would love to offer him a one-click solution. If this can be done using ProCache, I would love to buy the module for this project.

EDIT: I do not know ProCache yet. Is it possible to add a hook on page save that somehow calls ProCache and automatically indexes the page after it has been saved?

Thanks!



Hi everyone!

I come here after a very ugly situation in which the MySQL server went down on a pretty big server holding around 50+ ProcessWire websites. So I'm starting to see where all this static site generation hype comes from. Has anyone moved further on this aspect?


3 hours ago, elabx said:

I come here after a very ugly situation in which the MySQL server went down on a pretty big server holding around 50+ ProcessWire websites. So I'm starting to see where all this static site generation hype comes from. Has anyone moved further on this aspect?

Not an answer really, but technically just having ProCache could help with situations like these. The idea is to bypass PHP and MySQL entirely, after all. In fact in the past I've had a situation where MySQL was dead but the site I was monitoring kept working due to ProCache, so the issue didn't become apparent for quite a while...

Another thing to consider would be something like Cloudflare (the "Always Online" feature in particular), or adding Squid or Varnish (or some other proxy/cache/accelerator) in front of the site.


As @teppo already mentioned: ProCache and/or Cloudflare... but...

If you really want to establish a static site, just use the pre-rendered files from ProCache and upload them to your host. Images/assets should already be in place, which makes things easier.

With a Linux/*nix setup you might get it done with a few custom rsync/rclone/git setups that push only files that are new or changed.
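
The rsync variant could be as simple as this (host and paths are placeholders):

# Incrementally push the ProCache output; only new/changed files are
# transferred, and --delete removes files for pages that no longer exist.
rsync -avz --delete site/assets/ProCache/ user@statichost:/var/www/html/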

There is one site (a client site) that I manage through ProcessWire locally; I run Screaming Frog to generate the static files and push all changes via git to the repository, which is then published by Netlify.

Yes... there are a few steps involved, but it's still way easier to go this way than anything else I know (Jekyll, Ghost, etc.).
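
The publish step in that workflow then boils down to something like (branch name assumed):

# Commit the regenerated static files; Netlify rebuilds and deploys on push.
git add -A
git commit -m "Rebuild static site"
git push origin master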


1 hour ago, teppo said:

Not an answer really, but technically just having ProCache could help with situations like these. The idea is to bypass PHP and MySQL entirely, after all.

Just pushed ProCache's cache time to like 100 years lol.

That's also what I want to achieve at least, not letting the site completely die even if forms/backend are not working. 

42 minutes ago, wbmnfktr said:

With a Linux/*nix setup you might get it done with a few custom rsync/rclone/git setups that push only files that are new or changed.

 

So, as mentioned above, a "build" step is something I'd need to work on. I guess I could put the PW backend on a subdomain (possible?) and put the rendered static files in another folder of the server, served on the main domain.

Also, I guess there's a lot of other stuff to consider, like 404s, FormBuilder forms in iframes... Maybe I should just get MySQL to behave, but I guess that's gonna take me a couple of years at least :P

 

