
SearchEngine


teppo


@teppo I don't think this is to do with my setup. I have tried three different character sets and encodings, all throwing errors. In particular, if I dump the result of processIndex() just before returning it, the "à" character is getting encoded as \xC3 when it should be \xC3\xA0. You can check this here: https://mothereff.in/utf-8.

I don't understand enough about how you are prepping the data before saving the field, but I think this is an issue with multibyte substrings.
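For illustration, here's the kind of multibyte clipping I mean (a generic sketch, not the module's code): byte-oriented string functions can cut a multibyte character in half, since "à" is two bytes in UTF-8 (0xC3 0xA0).

```php
<?php
// "à" is 0xC3 0xA0 in UTF-8.
$s = "\xC3\xA0";

// Byte-oriented substr() returns half a character...
echo bin2hex(substr($s, 0, 1)), "\n";         // c3

// ...while the multibyte-aware mb_substr() keeps it whole.
echo bin2hex(mb_substr($s, 0, 1, 'UTF-8')), "\n"; // c3a0
```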


Sorry for the spam, but I found a solution. I am not an expert on prepping strings for the database, but replacing the line here with the Unicode-aware version below fixes things for me:

$processed_index = preg_replace('/\s+/u', ' ', $processed_index);
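To illustrate what the /u modifier buys here, a standalone sketch (generic strings, not the module's actual data): with /u, PCRE treats the subject as UTF-8 and matches whole code points, so it can never split the two bytes of "à" apart.

```php
<?php
// Two runs of whitespace around a multibyte character (0xC3 0xA0 = "à").
$text = "foo  \xC3\xA0  bar";

// With /u the whitespace runs collapse and the character stays intact.
$collapsed = preg_replace('/\s+/u', ' ', $text);
echo $collapsed, "\n"; // foo à bar
```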

 


 

1 hour ago, teppo said:

I'll take a closer look at this ASAP 🙂

Coolio, no rush. I think this might also have been to do with having HTML entity encoders set on those CKEditor fields. I have no idea why I did that; maybe I did it while testing something else.


21 hours ago, Mikie said:

@teppo I don't think this is to do with my setup. I have tried three different character sets and encodings, all throwing errors. In particular, if I dump the result of processIndex() just before returning it, the "à" character is getting encoded as \xC3 when it should be \xC3\xA0. You can check this here: https://mothereff.in/utf-8.

I don't understand enough about how you are prepping the data before saving the field, but I think this is an issue with multibyte substrings.

Heya!

I've looked a bit into this, but to be honest I'd like to gain a better understanding of the situation before applying the fix. Any chance you could check the charset and collation of the field_search_index table (assuming search_index is your search index field)? The output of "SHOW FULL COLUMNS FROM field_search_index" should be enough.

The "u" modifier for preg_replace() does some things I'm slightly worried about, i.e. it's documented as "not compatible with Perl", it changes how matches are treated, and it may also result in warnings if the subject string is invalid UTF-8 — so at the very least it may require a bit of extra validation as well to account for that. Before going there I'd like to figure out how to reproduce this issue first. I've tried all sorts of special characters with no luck; so far everything has worked just fine here 🙂
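If the extra validation turns out to be needed, it could look something like this (a hypothetical helper, not the module's current code): preg_replace() with /u returns NULL when the subject is not valid UTF-8, so checking the encoding first avoids that failure mode.

```php
<?php
// Hypothetical defensive wrapper around the /u-based whitespace collapse.
function collapse_whitespace(string $subject): string {
    if (!mb_check_encoding($subject, 'UTF-8')) {
        // Re-encoding UTF-8 to UTF-8 scrubs invalid byte sequences,
        // replacing them with a substitution character.
        $subject = mb_convert_encoding($subject, 'UTF-8', 'UTF-8');
    }
    return preg_replace('/\s+/u', ' ', $subject);
}
```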

Also, when you say that the "à" character "is getting encoded as \xC3 when it should be \xC3\xA0", what do you mean exactly? I mean, do you literally see \xC3 somewhere, or do I have to grab the value and pass it through some sort of inspection process to see that it's wrong? If I dump the result of processIndex(), I see the "à" character on the screen, and that's also what's being stored in the database.

Sorry, I'm easily confused when it comes to things like character sets etc. 😅

Edit: forgot to mention that based on StackOverflow this definitely looks like a character set issue, i.e. the typical case where this error occurs is when you're trying to store UTF-8 data into a latin1 table. Assuming that the CKEditor field in question is some form of UTF-8, the index field's data column should definitely also be UTF-8 — and if it's not, that sounds really weird.

Edited by teppo

On 2/6/2020 at 12:11 AM, Mikie said:

Cheers! I am storing magazine-style story credits (role, name, website URL, etc.) in the Table field. I feel that since Table only accepts text-based fields, this is an OK candidate for indexing. I can try to hack away at your module myself for now, no rush.

Table field is now one of the supported fieldtypes in SearchEngine 0.17.0. The indexing part makes use of TableRows::render(); I may have to revisit this at some point, but this approach seemed to work quite well in my initial tests, and this way I don't have to identify each possible value but can rather let the fieldtype do all the heavy lifting 🙂


8 hours ago, teppo said:

Table field is now one of the supported fieldtypes in SearchEngine 0.17.0. The indexing part makes use of TableRows::render(); I may have to revisit this at some point, but this approach seemed to work quite well in my initial tests, and this way I don't have to identify each possible value but can rather let the fieldtype do all the heavy lifting 🙂

I've edited this reply: I double-checked, and it is happening both with and without entity encoders active on CKEditor fields when trying to save the search index.

See the screenshots below, with the text Testing “testing” à 123 in a CKEditor field. The curly quotes get converted from HTML encoding to UTF-8 correctly, but the UTF-8 for the "à" symbol gets clipped in half. I can confirm PW / DB / db table / db column are all using utf8mb4 + unicode_ci (I learnt a lot about this stuff in the past few days!).

[Four screenshot attachments]


Thanks, I'll try to dedicate a bit more time to this later today. I'm still confused as to why it's happening (I can only assume that there's some difference in the environment), but perhaps the "u" flag is indeed the correct fix. I'll have to check that it doesn't cause additional issues in cases where the module is now working as expected... 🙂


12 minutes ago, Mikie said:

Here is the exact issue: setlocale() seems to be the problem when combined with preg_replace() on whitespace...

https://github.com/silverstripe/silverstripe-framework/issues/7132

 

7 minutes ago, Mikie said:

And it only happens on mac as well! Wow, what an edge case.

Awesome — thanks for digging these out! 🙂


6 minutes ago, teppo said:

Awesome — thanks for digging these out! 🙂

No worries! I can confirm I had setlocale(LC_ALL, 'en_US.UTF-8'); in my site config. I only do this when PW tells me to; I haven't taken the time to even understand why. Turning that off also fixed the issue.

There's enough discussion within that SilverStripe GitHub issue about the alternatives. A very, very edge case; I'll leave it up to you!
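For the record, the failure mode can be sketched like this (platform-dependent: per the linked SilverStripe issue, the clipping only reproduces with a UTF-8 locale on macOS, where byte-oriented PCRE can classify the lone byte 0xA0, the second byte of "à", as whitespace):

```php
<?php
// The combination that triggered the bug. On systems without the
// en_US.UTF-8 locale, setlocale() simply returns false and nothing changes.
setlocale(LC_ALL, 'en_US.UTF-8');

// Without /u this may strip the 0xA0 byte on macOS, leaving just "\xC3".
$without_u = preg_replace('/\s+/', ' ', "\xC3\xA0");

// With /u the subject is treated as UTF-8 and "à" always stays intact.
$with_u = preg_replace('/\s+/u', ' ', "\xC3\xA0");
```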


The aforementioned issue should be fixed now. As mentioned above, this could only be replicated under specific circumstances on macOS; nevertheless, it seems that defining the "u" flag for preg_replace() is a relatively safe thing to do, so I've gone ahead and done that. If it ends up causing trouble, I may have to reconsider, but at least for now it seems to be all good 🙂

Thanks @Mikie for tracking this down!


  • 1 month later...

Hi @teppo,

Thanks a lot for your module!

Whenever I check "Index pages now?" in the module's backend config and save to build/rebuild the index field, PW throws a lengthy error (see attached PNG). I've selected a couple of text/textarea fields to index and included the index field in my templates. Calling

$modules->get('SearchEngine')->indexPages();

from the search template seems to work fine though. Am I making some newbie mistake here or is that an actual bug?

[Attachment: Screenshot_2020-03-30.png]


Hello @teppo and all,

I currently run the same ProcessWire site on multiple servers with a shared database and shared asset resources. This has been working fine for years, but we've been using Elasticsearch, which has required feeding index updates from our multiple servers into a single Elasticsearch index. It looks to me like this module will eliminate the need for that additional complexity, and here's where I would like someone to correct me if I'm wrong or to point out any flaws in my understanding. I see that the page indexes are updated with a hook upon page save, and the index info is stored in the database as a page field. It seems to me that this should work fine in a multi-server environment sharing a single database, as the page save event will only happen once, from the server the page is saved on, and the index will be updated for all servers sharing that database.

Short of something extreme like simultaneously running a complete re-index from multiple servers (which would probably still work out OK...), does anyone see any problem with this approach, or any considerations in this scenario that I may be missing? Your input is appreciated.

Best Regards,
David


15 hours ago, CalleRosa40 said:

Whenever I check "Index pages now?" in the module's backend config and save to build/rebuild the index field, PW throws a lengthy error (see attached PNG). I've selected a couple of text/textarea fields to index and included the index field in my templates.

It looks like you're using Hanna Code with one or more of your indexed fields. Is that correct?

Here something is trying to resize what it expects to be a Pageimage object while it's actually a Pageimages object, which usually means that output formatting is off. If so, you could fix this in the Hanna Code snippet itself, by checking for Pageimages and getting the first Pageimage from it. I'll see if there's something I can do to make this work better, but that's the quick fix anyway.

(Assuming I understood the stack trace correctly...)
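The quick fix could be sketched roughly like this (a hypothetical helper, not the module's or the snippet's actual code; the class names are ProcessWire's):

```php
<?php
// With output formatting off, an images field returns a Pageimages
// collection rather than a single Pageimage, so normalize the value
// before resizing.
function firstImage($value) {
    if ($value instanceof Pageimages) {
        // WireArray::first() returns the first item in the collection.
        return $value->first();
    }
    return $value; // already a single Pageimage, or null
}
```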


On 3/30/2020 at 3:21 PM, Confluent Design said:

I currently run the same ProcessWire site on multiple servers with a shared database and shared asset resources. [...] Does anyone see any problem with this approach, or see considerations in this scenario that I may be missing?

Any thoughts on this, @teppo? If I am doing something wrong in terms of the form or placement of my inquiry, please let me know that as well. I want to do things the right way. Thanks for your time, and apologies for any inconvenience.

Best Regards,
David


Hey @Confluent Design,

Sorry for the delay. Your message slipped my mind, thanks for pinging me 🙂

Quote

[...] but we've been using elasticsearch, which has required feeding index updates from our multiple servers to a single elasticsearch index.  It looks to me like this module will eliminate the need for that additional complexity [...]

You're correct in that SearchEngine stores its index in a ProcessWire field, so yes — since your database is already shared, there's nothing else to do in that regard; unless I've very much misunderstood something here, it should work right out of the box.

Quote

Short of something extreme like simultaneously running a complete re-index from multiple servers (which probably would still work out ok...), does anyone see any problem with this approach, or see considerations in this scenario that I may be missing?  Your input is appreciated.

From a basic technical point of view, no, I don't see any issues here. It's worth noting, though, that elasticsearch is quite a bit more complex and feature-rich than this little module here. Depending on your needs that might or might not be an issue.

Basically with SearchEngine you get a searchable blob of all page content that can be converted to text without toying around with additional APIs or libraries (so no file data at the moment), and the search itself is done using a simple selector string — no advanced weighting, stemming, etc. I do have a few "advanced" features on my todo list, but at the moment there's no timeline for any of that 🙂

