Jozsef

Members
  • Posts: 245
  • Joined

  • Last visited

1 Follower

Contact Methods

  • Website URL
    http://www.focusweb.ie

Profile Information

  • Gender
    Male
  • Location
    Northern Ireland (UK)

Recent Profile Visitors

2,580 profile views

Jozsef's Achievements

Sr. Member

Sr. Member (5/6)

108

Reputation

  1. Thanks everyone, a lot of learning to do; I'll look into the suggestions. I've never used UIkit but Sortable fits the bill. Updating the data would be the real challenge with my skills. @bernhard, such a module would be awesome.
  2. Hi everyone. My client wants me to build a CRM app connected to their existing client site so they have all their data in one place. One of their requirements is a drag-and-drop kanban board interface for updating their sales statuses. Does anyone have a recommendation for an existing JS or jQuery library that could be integrated into the PW backend and update data in real time? Have you done anything like this before? Or does PW have built-in tools for this? I know you can update page parents by drag and drop, but can that be reused somewhere else? I'm sorry for the lame question, I'm not very fluent in JS.
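For what it's worth, the server side of such a board can be quite small in ProcessWire. A minimal sketch, assuming each deal is a page with a hypothetical sales_status field, and a SortableJS onEnd handler on the client that POSTs the page ID and the target column's status (all names here are illustrative, not an existing API):

```php
<?php namespace ProcessWire;
// Hypothetical endpoint template (e.g. mapped to /kanban-update/)
// that receives the POST sent by the SortableJS onEnd handler.

if(!$user->isLoggedin()) throw new Wire404Exception();

// Sanitize incoming values; the field and parameter names are assumptions
$id     = $sanitizer->int($input->post->page_id);
$status = $sanitizer->pageName($input->post->sales_status);

$deal = $pages->get($id);
if(!$deal->id || !$deal->editable()) throw new WirePermissionException();

// Update only the one field that the drag-and-drop changed
$deal->of(false);
$deal->sales_status = $status; // e.g. an Options or text field
$deal->save('sales_status');

header('Content-Type: application/json');
echo json_encode(['success' => true, 'id' => $deal->id]);
```

The sanitizer and permission checks matter here, since the endpoint accepts writes from the browser.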
  3. I'm in a similar situation, where the client asked me to integrate their sales into the existing client-facing site since half of the data is already there. Silly question, but would it be bad practice to store the data as pages? I mean, ProcessWire can already do all the content types, fields and page references; would it still be better to store the data in the database directly?
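Storing records as pages is a common ProcessWire pattern, and it keeps the normal API and selectors available. A rough sketch, assuming a hypothetical `deal` template with a page-reference `client` field and a `sales_status` field (all names made up for illustration):

```php
<?php namespace ProcessWire;
// Create a CRM record as an ordinary page under a /deals/ parent
$deal = new Page();
$deal->template = 'deal';
$deal->parent = $pages->get('/deals/');
$deal->title = 'Acme Ltd - annual licence';
$deal->client = $pages->get('/clients/acme/'); // page reference to the client
$deal->sales_status = 'negotiation';
$deal->save();

// Querying then stays plain ProcessWire selectors:
$open = $pages->find("template=deal, sales_status!=closed, sort=-modified");
```

The trade-off is that every record carries the overhead of a page (rows across several core tables), which matters at very high record counts but buys you admin editing, access control and the selector engine for free.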
  4. Thanks @wbmnfktr, I definitely will. I was also looking into email relay services such as Mailgun or SendGrid, but their privacy policies were not acceptable to the client. Anyway, thanks to everyone for the ideas. We are going off topic though, so it will be the WireMailSMTP module for now.
  5. I'm using PrivacyWire with Google Analytics. Looking at their documentation, it is possible to load GA4 without cookies. How would I go about updating the GA settings when statistics cookies get approved, instead of blocking the analytics script altogether? https://developers.google.com/tag-platform/devguides/consent Is there a JS variable I can check, or do I have to parse local storage to find the answer? I understand that I can run a script when consent is changed; I was just wondering if there's anything already exposed.
  6. Thank you @fliwire, that might be another consideration for the client.
  7. Thanks, these are valid points. I'm not a module developer, hence the "simple way" in my question. No ideas, unfortunately. The site is for cyber security experts; their mindset is to minimise the risk when something happens. Their previous WordPress site was compromised, so a breach is not an "if", it's more like a "when" for them, as they deal with this all day, every day.
  8. I completely agree, I really think that saving the SMTP password as plain text is a big security compromise. Is there a simple way to change this?
  9. Thanks Robin, I didn't know I could use mysqldump directly in a cron job. As for the site, they are debs (prom) organisers for high schools, hence the large number of customers and ticket sales on the site. Otherwise the site is performant and works well within the shared hosting limits; as far as shared hosting goes, it's a really good one with guaranteed resources. Thanks, I always wanted to check out your module, I'll give it a try.
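For anyone landing here later, the mysqldump-in-cron idea can be sketched roughly like this. It is an assumption-laden example (the bootstrap path, shell access and gzip availability vary per host), but the point is that mysqldump streams the dump itself, so the PHP row loop that hits the time limit in WireDatabaseBackup never runs:

```php
<?php namespace ProcessWire;
// Hypothetical cron-run script: bootstrap ProcessWire only to read the
// DB credentials, then let mysqldump write the dump directly to disk.
include '/path/to/site/index.php'; // adjust to your installation

$config = wire('config');
$file = $config->paths->assets . 'backups/database/db-' . date('Y-m-d-His') . '.sql.gz';

// --single-transaction avoids locking InnoDB tables during the dump;
// --quick streams rows instead of buffering whole tables in memory
$cmd = sprintf(
    'mysqldump --single-transaction --quick -h %s -u %s -p%s %s | gzip > %s',
    escapeshellarg($config->dbHost),
    escapeshellarg($config->dbUser),
    escapeshellarg($config->dbPass),
    escapeshellarg($config->dbName),
    escapeshellarg($file)
);
exec($cmd, $output, $exitCode);
if($exitCode !== 0) wire('log')->error("Backup failed with exit code $exitCode");
```

Note that a host-side limit on MySQL processing time could still apply to the mysqldump session itself, so this is worth testing on the actual hosting first.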
  10. Can I reduce the database size or break up the backup somehow? The site's database grew over 100 MB, with over 1M rows in total. There are over 150k pages (a high number of payments and orders, and over 30k users). I use the great CronjobDatabaseBackup module by kixe to trigger hourly backups; it uses the core WireDatabaseBackup class. The backup now triggers a PHP error every time, despite max_execution_time being set to 300 on the shared hosting (I think they enforce a maximum of 120 seconds on database processing, though): Fatal Error: Maximum execution time of 120 seconds exceeded, on line 854 of /wire/core/WireDatabaseBackup.php or line 961 of /wire/core/WireDatabasePDO.php
  11. @aagd Yes, it's a shared hosting but upgrading is not an option at the moment.
  12. Thanks, that doesn't work unfortunately; according to hosting support, the hard limit applies specifically to database processing. max_execution_time has no effect on it, it was already set much higher. I wish it were that simple.
  13. I have a site where the database grew over 100 MB, and the hosting imposes a 120-second hard limit on MySQL processing. As a result, all scheduled backups trigger a PHP error: Fatal Error: Maximum execution time of 120 seconds exceeded, line 854 of /wire/core/WireDatabaseBackup.php Any suggestions? The site has over 150k pages (orders and payments on top of content) and over 30k users, and I need an hourly backup. Because of the PHP error the automatic cleanup is not happening, and backups are eventually filling up my hosting storage. Can I reduce the database size? Or speed up the backup process? (Note: the server is quite fast.)
  14. Thank you for the detailed answer, it's very educational. I was aware of some of the considerations, but I didn't think about the cache, for example. I understand that the DeepL service takes time, especially with a 5000-word document. I did go through the readme multiple times before. Great module, and thanks again for the brilliant support here.
  15. Thank you @FireWire, after some wrestling I got it working. My issue was that I assumed the returned translation was a simple string, while it's not. For anyone interested, this is the working function I put together. It returns the translated field value, or translates a multi-language field (headline in this case) on demand if it's not yet translated. You can call it simply with <?= translate($page, 'headline'); ?>. The value is saved and displayed right away, so it can also be used in a foreach loop where you want to display a field from a different page. Please let me know if something could be improved or simplified, I'm not a PHP guru. Do you think it will take a toll on performance for already translated fields?

```php
/**
 * Translate a field if it's not available in the current language
 *
 * @param Page $page
 * @param string $field
 * @return string
 */
function translate($page, $field) {
    // If the field is empty there's nothing to translate
    if(!$page->$field) return '';
    // Get the current language
    $lang = wire('user')->language;
    $page->of(false);
    // Check if the field is already translated
    $local_value = $page->$field->getLanguageValue($lang);
    if(!$local_value) {
        $fluency = wire('modules')->get('Fluency');
        // If not, translate it from the default language
        $translate = $fluency->translate('en', $page->$field, $lang->language_code);
        // Get the translated string from the response object
        $translated = $translate->data->translations[0]->text;
        if($translated) {
            // Save the translated value back to the field
            $page->$field = $translated;
            $page->save($field);
            $local_value = $translated;
        } else {
            // If translation fails and returns an empty string,
            // fall back to the default language
            $local_value = $page->$field;
        }
    }
    $page->of(true);
    return $local_value;
}
```

There's still the issue of updated content in the original language, but I'm not sure how to deal with that.