
wbmnfktr

Members
  • Posts: 2,063
  • Joined
  • Last visited
  • Days Won: 51

Everything posted by wbmnfktr

  1. I don't want to show the banner. That's the point.

     auto-accept mode + allow user to manage + user blocks cookies = banner does show up
     auto-accept mode + allow user to manage + added line to .js + user blocks cookies = banner does not show up
  2. The negative viewCount becomes relevant (at least in my case) around line ~257:

```js
// if they haven't explicitly accepted it (ie: auto-accept) then display the banner
if (pwcmb_settings.auto_accept && cookieMonster.cfg.viewCount != -1) {
    cookieMonster.ui.show();
}
```

     I don't know if it's really necessary for you to get your head around it at this point. Maybe in the future. To be honest, I think my actual case is kind of special.
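     Purely as an illustration (the wrapper function below is my own sketch, not part of the module; only the auto-accept and viewCount ideas come from the snippet above), the check boils down to a small predicate:

```javascript
// Illustrative sketch only: mirrors the banner check above as a pure function.
// shouldShowBanner is an assumed name, not the module's API.
function shouldShowBanner(autoAccept, viewCount) {
  // in auto-accept mode the banner shows unless viewCount was forced to -1
  return Boolean(autoAccept) && viewCount != -1;
}

console.log(shouldShowBanner(true, 0));   // true  -> banner appears
console.log(shouldShowBanner(true, -1));  // false -> banner suppressed
console.log(shouldShowBanner(false, 0));  // false -> auto-accept disabled
```

     That is why forcing viewCount to -1 in the block function (see below) keeps the banner away.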
  3. In my case the banner doesn't show up again. It works as expected. Banner is gone. Cookies are set. Values for cookies are fine.
  4. I can confirm that the latest version only supports either auto-accept mode or allowing users to manage cookies. By now, however, I use both options (auto-accept mode and allow users to manage) without any problems. I added one line to the cookieMonster.block function (jQuery version) to make it work:

```js
// set cookieMonster variables when user blocks
cookieMonster.block = function() {
    // added to disable banner while blocking cookies in auto-accept mode
    cookieMonster.cfg.viewCount = -1;
    // cookieMonster.cfg.allowCookies = "n";
    cookieMonster.cfg.selectionMade = "y";
    cookieMonster.cfg.storedVersion = cookieMonster.cfg.version;
    cookieMonster.sendActionBeacon();
    cookieMonster.updateStatus();
}
```

     I'm not sure anymore that auto-accept mode was the real deal-breaker here; it was rather that missing line. Maybe the original author had something in mind when he/she decided not to set that negative viewCount in the block function. In my use case I need it to get my expected behaviour while keeping all the options (auto-accept and blocking) I want and need.
  5. Bug or feature: denying cookies works differently from what I expected. I allow page visitors to manage cookie settings. Those who deny cookies get a success message, but afterwards the cookie banner shows up again. I would expect visitors who deny cookies to get the same experience as those who allow them, i.e. the banner doesn't show up again. Tested with version 0.4.0 in the jQuery and vanilla flavours. So... is it my expectation or the module/banner behaviour that needs a fix?

     +++ UPDATE +++

     The option auto-accept mode interferes here. Enabling it results in the slightly unexpected behaviour; disabling it results in the expected behaviour. I think the bug/feature question is answered now.
  6. A URL field that intermittently verifies itself sounds like something I would really like (for external links/URLs). Right now, with external URLs in mind, I think of a list of URLs that shows the status codes of all links/URLs, similar to what the Jumplinks module does for redirects. Would this be possible?
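     Purely as a sketch of what such an overview could report (the function and data below are my own assumptions, not the Jumplinks module's API; a real implementation would actually request each URL):

```javascript
// Hypothetical helper: groups URLs by the status code a link checker reported.
function summarizeLinkStatuses(statuses) {
  const summary = { ok: [], redirect: [], broken: [] };
  for (const [url, code] of Object.entries(statuses)) {
    if (code >= 200 && code < 300) summary.ok.push(url);
    else if (code >= 300 && code < 400) summary.redirect.push(url);
    else summary.broken.push(url);
  }
  return summary;
}

const report = summarizeLinkStatuses({
  'https://example.com/': 200,
  'https://example.com/old-page': 301,
  'https://example.com/gone': 404,
});
console.log(report.broken); // [ 'https://example.com/gone' ]
```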
  7. As long as I use it for page names that will never see the light of day, I will probably be fine with it.
  8. This sounds like something I could use in some projects, where it would replace my buildCustomUniquePageName() function. Will definitely give it a try.
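     buildCustomUniquePageName() is my own project-specific helper; purely as an assumed illustration of the kind of thing it does (not the actual code):

```javascript
// Assumed sketch: slugify a title and append a counter until the name is unique.
function buildCustomUniquePageName(title, existingNames) {
  const base = title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')   // non-alphanumerics become hyphens
    .replace(/^-+|-+$/g, '');      // trim leading/trailing hyphens
  let name = base;
  let counter = 1;
  // bump a numeric suffix until no sibling page uses the name
  while (existingNames.includes(name)) {
    counter += 1;
    name = `${base}-${counter}`;
  }
  return name;
}

console.log(buildCustomUniquePageName('My New Page', []));              // my-new-page
console.log(buildCustomUniquePageName('My New Page', ['my-new-page'])); // my-new-page-2
```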
  9. Woah... that's a stunning backend!
  10. Are there any modules or hooks in place that could be responsible for that behaviour? How do you create those painting-pages? Which version of ProcessWire do you use? I just tried to reproduce that behaviour but without success.
  11. With ProcessWire 3.0.98 I can't confirm your bug.
      How do you create your pages? (in browser or via API)
      Which versions do you use? (ProcessWire and modules)
      What modules do you use?
      Do you have hooks in place?
  12. I would agree with that up to a point. Imagine your client uploads a few hundred images of 12 MB+ each. If they explicitly ask for the possibility to upload super high-res images, it's another story.
  13. In the case of images I would limit upload sizes both in height/width and in MB. In the case of videos it depends on where they are used. As a movie I wouldn't care that much about size, but I would load it only on demand. As a background video I would limit the size.
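     Just to make the idea concrete (the limits and function below are assumptions for illustration; real limits belong in the field/server configuration):

```javascript
// Hypothetical upload guard with assumed per-type limits in megabytes.
const LIMITS_MB = { image: 4, video: 50 };

function isUploadAllowed(type, sizeInBytes) {
  const sizeMb = sizeInBytes / (1024 * 1024);
  const limit = LIMITS_MB[type];
  return limit !== undefined && sizeMb <= limit;
}

console.log(isUploadAllowed('image', 12 * 1024 * 1024)); // false: a 12 MB image is too large
console.log(isUploadAllowed('video', 12 * 1024 * 1024)); // true: well under the video limit
```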
  14. You only get about 22 MB? The video I get is already way larger than that.
  15. Just out of curiosity: will you try to reduce the homepage's size in the future? 82.6 MB is quite a big number.
  16. I guess... we have to thank you!
  17. This works as expected. Tested in Chrome, EDGE, Firefox, Opera (all latest versions).
  18. Ok... now I understand what they tried. Didn't see that very first line. I'm using 3.3.1 right now.
  19. I know [method] very well, but only from documentation no one understands, not from production code. Especially with jQuery I often miss new things, so maybe this is the new way to get things done.
  20. I can confirm the behaviour @iank describes, but I'm quite surprised that I noticed this change only as of today. I changed line 182 from [method] to .prop to get the functionality back:

```js
$(this).parents('.pwcmb-option-wrapper').siblings().find('.pwcmb-widget__row-cb').prop('checked', !checked);
```

     Is it me, or is [method] wrong at that and some other places?
  21. In my case it wasn't cropping the image that led to unexpected results; it was only resizing. It took me hours without any results until I used a different image. I don't know whether you test with one or several images, but if you use only one image, please try another one. My .jpg was more of a corrupted-Frankenstein-PNG-saved-to.jpg file. It may sound ridiculous, but you never know.
  22. Clean code, a good code-to-content ratio, and fast content delivery aren't a guarantee of good SERP results. You can build the smallest, cleanest, bestest website ever and get outranked by a crappy WordPress instance. Sometimes other things matter more: links, links, links, spammy content, PBN links, more PBN links, more spammy content... all those grey-to-black techniques still work for almost every site.

      Old domain with trust but spammy content and a WordPress footprint? Perfect! We already love it. New domain with better content, better UI, better load times? Are you kidding me? We will never rank that!

      You build clean, fast sites as a base for more. Good SERP results are something you have to accomplish with several other things: spend a few hundred dollars on a good, old, trustworthy domain; create 100 pages of optimized content for another 200-300 dollars; get lots of links from trusted sources (Reddit, LinkedIn, blogs, ...); buy 10-30 more good old domains; build spammy sites with matching content; create backlinks; reach out to other spammers bloggers; get more links, and you are in the Top 10 to Top 3. Don't play fair on money keywords. That won't work.
  23. No one wants to blame small offices and businesses that need a website that won't break the bank. I blame those professional scammers that sell 80-dollar-theme-based sites as custom-made [whatever buzzword fits here] websites and either charge way too much or dump prices with them.
  24. Oh yeah... I love those super creative full-stack and full-service brand-entrepreneur happy-guys agencies that sell websites this way and claim they are the best in the market, doing highly coughstomized websites and brand building. I know agencies that charge five-figure numbers for 80-dollar-theme-based sites. That only works if you do basic website jobs without any real custom data: pages and posts slightly customized with visual page builders, but that's it. A win-win situation for both sides.