Everything posted by teppo

  1. @muzzer: most fields (almost all of them) are only loaded as required. You can check "Autoload" within the field settings (Advanced tab) to always include a field with the page, though. This makes sense especially if the field is always used when the page is loaded. You should also take a look at the built-in FieldtypeCache; it sounds like it could be exactly what you're looking for here. It's a core module, but not installed by default. I wrote a quick post about caching in ProcessWire a while ago, you can find it here. It's not too in-depth (and doesn't mention the aforementioned FieldtypeCache at all), so I'm not sure it provides you any new information.
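     If you'd rather flip the autoload (autojoin) setting from the API instead of the admin, a minimal, untested sketch could look like this; 'myfield' is a made-up field name:

     $field = $fields->get('myfield');                       // hypothetical field name
     $field->flags = $field->flags | Field::flagAutojoin;    // always load this field with the page
     $field->save();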
  2. $event = new Event();
     $event->date = "1979-10-12 00:42:00";
     $event->location = "Vogsphere";
     $event->notes = "The homeworld of the Vogons";
     $page->events->add($event);

     That's one way to do it, at least. Since $page->events here is an Events field, it returns an instance of EventArray, which in turn can contain multiple Event objects. For the most part EventArray acts just like a PageArray or Pagefiles or any other object extending WireArray. This is just the most basic example of what you can do with it.
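     To actually persist the change and read it back, something along these lines should work (a sketch only, using the 'events' field from above):

     $page->of(false);           // turn off output formatting before saving
     $page->save('events');      // save just the events field
     foreach ($page->events as $event) {
         echo "{$event->date}: {$event->location} -- {$event->notes}\n";
     }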
  3. RT @lukew: 90% of the time, smartphone use is all thumbs. Design accordingly. http://t.co/dcQMmTJVxd

  4. I haven't had much (enough) time to work on my mailer module and haven't looked at how WireMailSMTP handles these particular things, but in general I'd have to agree with Pete. For things that are commonly used, it'd be best if there was a "standard" way to do them -- one that doesn't depend on which module extending WireMail happens to be installed at the time. I believe that we're talking about interfaces here, though that's not technically what it's going to be... or what it is at the moment, at least. Then again, if @horst has implemented this feature already, I'll probably take a look at his implementation anyway and use similar public methods if possible to provide some consistency.
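     For context, the shared baseline is the handful of methods the WireMail base class itself provides; a minimal sketch (addresses are placeholders):

     $mail = wireMail();
     $mail->to('user@example.com')
          ->from('noreply@example.com')
          ->subject('Hello')
          ->body('Plain text body');
     $numSent = $mail->send();    // returns the number of messages sent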
  5. RT @brad_frost: Yes, and that voice says "get this goddamn thing out of my face." http://t.co/KIPgjpg20M

  6. On a slightly related note, I've found clients to be quite happy with "I don't know, I'll find out and get back to you", but if they've got an issue and can't reach anyone at your side, they won't be happy at all. Be clear about when you're available and stick to that. The absolute worst thing you can do is make promises and set expectations you won't be able to fulfil. An easy way to increase customer satisfaction is to exceed expectations... and the trick to exceeding expectations is setting them at a realistic level in the first place.
  7. As far as I know, the only way to do this would be applying an RTE with a module. Hooking into InputfieldImage::renderItem would give you access to the inputfield output, but I'm not exactly sure how TinyMCE (or CKEditor, if you're using that) is configured by default, i.e. whether it's enough to add a class to the description textarea (which should appear once the field is configured to hold more than one row of description data) or whether you actually have to add custom JavaScript to apply it to image descriptions.
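     In case it helps, here's a very rough sketch of what that hook could look like inside an autoload module (the data attribute is just an assumption; you'd still need your own admin JS/CSS to actually apply the editor):

     public function init() {
         $this->addHookAfter('InputfieldImage::renderItem', $this, 'tagDescriptionField');
     }

     public function tagDescriptionField(HookEvent $event) {
         // naive illustration: mark description textareas so custom admin JS can target them
         $event->return = str_replace('<textarea', '<textarea data-apply-rte="1"', $event->return);
     }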
  8. Take a look at the example below "Custom PHP code to find selectable pages":

     return $page->parent->parent->children("name=locations")->first()->children();

     Isn't that almost exactly what you need here? Of course you'd have to use ...->child("template=categories|tags")->children() or something like that.
  9. Pete: I haven't had the opportunity (?) to deal with truly large databases myself, but I remember discussing this with someone more experienced a few years ago. They had built a system for a local university and hospitals for managing medical research data (or something like that, the details are a bit hazy). I don't really know what that data was like (probably never asked), but according to him they started running into various performance issues at the database level after just a few million rows.

     Indexes in general make searches fast, and having results in memory makes them even faster, but there's always a limit on buffer size, searching a huge index takes time too, and in some rare cases indexes can actually make things worse. (Luckily the optimizer is usually smart enough to identify the best approach for each specific scenario.) The downside of indexes is that they also take space and need to be updated when data changes -- they're not a silver bullet that makes all performance issues vanish, and can actually add to the problem. This also means that you'll need to consider the ratio of inserts/updates to searches when creating your indexes, not just the fields to index and their order.

     This is starting to get almost theoretical, and you're probably right that none of it matters to any of us here anyway (although what do I know, someone here might just be building the next Google, Craigslist or PayPal) -- just wanted to point out that it's not quite that simple when it comes to really large amounts of data. And no matter what you claim, (database) size does matter.

     Edit: just saw your edits, apparently we're pretty much on the same page here after all.
  10. @rusjoan: for converting the content of a page to JSON, you could do something like this:

     $data = array();
     foreach ($page->template->fields as $field) {
         $data[$field->name] = (string) $page->$field;
     }
     $json = json_encode($data);

     Of course this isn't a complete solution and won't work in anything but the most limited use cases (think about images, files, page references etc.) but it's a start.
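     As one example of handling those special cases, image fields could be converted to arrays of URLs instead of being cast to strings (also just a sketch, not a complete solution):

     $data = array();
     foreach ($page->template->fields as $field) {
         $value = $page->get($field->name);
         if ($value instanceof Pageimages) {
             // collect URLs for each image in the field
             $urls = array();
             foreach ($value as $image) $urls[] = $image->url;
             $data[$field->name] = $urls;
         } else {
             $data[$field->name] = (string) $value;
         }
     }
     $json = json_encode($data);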
  11. @rusjoan: that error message is a bit vague, but it means that the name of your module is invalid. This is where it originates from. ProcessWire expects each module name to start with a single letter (uppercase or lowercase), followed by one or more lowercase letters. As a result, "VkAPI", "Vkapi", "vkapi", "RusjoanVKAPI" etc. are valid names, while "VKAPI" is not. I'd suggest renaming your module to comply with that requirement, as that's the easiest solution here. Edit: for the record, I just submitted a pull request about adding a more descriptive error message.
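     Just to illustrate the rule described above (this is not the actual core check, only a demonstration of the same pattern):

     $names = array('VkAPI', 'Vkapi', 'vkapi', 'RusjoanVKAPI', 'VKAPI');
     foreach ($names as $name) {
         // one leading letter followed by at least one lowercase letter
         echo $name . ': ' . (preg_match('/^[A-Za-z][a-z]+/', $name) ? 'valid' : 'invalid') . "\n";
     }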
  12. @Pete: actually archiving database content elsewhere could have its merits, in some cases. Imagine a huge and constantly changing database of invoices, classifieds, messages, history data etc. Perhaps not the best possible examples, but anyway something that can grow into a vast mass. Unless you keep adding extra muscle to the machine running your database (in which case there would only be theoretical limits to worry about), operations could become unbearably slow in the long run. To avoid that you could decide not to keep records older than, say, two years, in your production database. In case you don't actually want to completely destroy old records, you'd need a way to move them aside (or archive them) in a way that enables you to later fetch something (doesn't have to be easy, though). Admittedly not the most common use case, but not entirely unimaginable either.

     As for the solution, there are quite a few possibilities. In addition to deleting pages periodically you could do one or more of these:

     - exporting pages via API into CSV or XML file(s)
     - duplicating existing tables for local "snapshots"
     - performing regular SQL dumps (typically exporting content into .sql files)
     - using pages to store data from other pages in large chunks of CSV/JSON (or a custom fieldtype per Pete's idea)

     In any case, none of this is really going to be an issue before you've got a lot of data, and by a lot I mean millions of pages, even. Like Pete said, caching methods, either built-in ones or ProCache, will make typical sites very slick even with huge amounts of content. If your content structure is static (unchanged, new fields added and old ones removed or renamed very rarely), a custom fieldtype is a good option, and so is a custom database table. These depend on the kind of content you're storing and the features of the service you're building.
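     A very rough sketch of the first option (exporting via the API into CSV, optionally deleting afterwards); the template and field names ('invoice', 'date', 'total') and the file path are made up for illustration:

     $cutoff = strtotime('-2 years');
     $old = $pages->find("template=invoice, created<$cutoff, limit=500");
     $fp = fopen('/path/to/archive.csv', 'a');
     foreach ($old as $p) {
         fputcsv($fp, array($p->id, $p->name, $p->date, $p->total));
         // $pages->delete($p); // uncomment to actually remove archived pages
     }
     fclose($fp);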
  13. RT @vruba: Idea: a high-level public agency with the mandate and funding to review and patch popular code. A national security agency, if y…

  14. RT @beep: Despite what so-called “climate change scientists” say, my walk through Harvard Square shows the brozone layer hasn’t depleted on…

  15. If you need to handle a large quantity of pages, I'd probably rely on SQL. Sounds like a rather trivial task that way, though this, of course, depends on what you're actually after. If I'm reading your post correctly and it's just selected pages you're looking for:

     SELECT GROUP_CONCAT(data SEPARATOR '|') data
     FROM (SELECT DISTINCT data FROM field_myfield ORDER BY data LIMIT 5) f;

     After that you've got a list of pages you can pass to $pages->find()... though I don't quite understand why you'd want to do this with the limit, so there's probably something I'm misinterpreting here. I hope you get the point anyway.

     IMHO it's questionable whether selectors should even be able to handle every imaginable task. This, for example, seems like quite a rare need to me (and is already easily solved by either a loop or SQL). Selectors are good at finding pages in general, while finding distinct values, even when those values are later used for finding other pages, sounds like a job for something entirely different -- or some kind of combination of SQL and selectors.
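     A sketch of gluing the two together (assuming PW 2.4+ with the $database API variable and a field named 'myfield'; adjust the selector to whatever the field actually stores):

     $sql = "SELECT GROUP_CONCAT(data SEPARATOR '|') data
             FROM (SELECT DISTINCT data FROM field_myfield ORDER BY data LIMIT 5) f";
     $values = $database->query($sql)->fetchColumn();
     // note: values containing commas or quotes would need sanitizing before use in a selector
     $results = $pages->find("myfield=$values");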
  16. +1 for some of these, especially the force delete field/template ones. It's apparently a safety feature to make it harder to lose important data, but also very, very annoying at times. IMHO a proper warning would be quite enough: "this will remove [the field from following templates / following pages using this template], are you sure you want to continue?"
  17. Heartbleed

    Thanks, netcarver -- the article was great and the video linked from there was even better. Every good explanation needs a hand (or mouse) drawn diagram.
  18. Heartbleed

    I found this post kind of interesting: http://article.gmane.org/gmane.os.openbsd.misc/211963. I haven't checked the facts myself, so I can't really vouch for it, but if it's true... well, it does tell something about the mindsets of the developers working on this particular product. Security in general is a very complicated thing, like Matthew already pointed out, but too often vulnerabilities are (at least partly) a result of laziness, general ignorance and/or bad practices.
  19. RT @sandofsky: The data is in: hamburger menus kill engagement. "It was a disaster!" http://t.co/DlxaM1JIXM

  20. @bwakad: once again this depends on your use case, but you can always add page-specific notes in a field that you simply omit from page output. For template-specific notes Soma's module does a good job.
  21. Already a bunch of good resources here, but here's a couple more. Especially if you're at the "absolute beginner" level, there's a ton of useful video tutorials floating around. This one, titled "Learn PHP in 15 minutes", for example, seems pretty good. For more background there are also more thorough videos, such as this Harvard Extension School lecture. Once you get up to speed, you should most definitely take a look at PHP: The Right Way. If you're really into it, I'd strongly suggest Programming PHP from O'Reilly. It's available as an ebook too. Another book worth checking out is Essential PHP Security. Short but good -- if every PHP developer knew at least this much about security best practices, the PHP world would be a much safer place. Just my two cents. Totally random fact: Designing Web Graphics by Lynda Weinman was one of the very first web design books I could get my hands on. That was some fifteen years ago, when quality learning material was still kind of scarce, at least around here. Should probably take another look at that book one of these days, could be sort of fun.
  22. RT @HeyyThereDalia: So Nyan Cat and LulzSec are in my sociology textbook http://t.co/KQLrIP0ccZ

  23. $pages->find(...) returns a PageArray, not a regular array, hence the issue with array_merge(). Also there's no need for array_unique() (even if it did work); PW handles this already by grouping results by page IDs. Apart from that, I believe that DaveP is right; you'll have to do separate queries and merge the results:

     $ps = $pages->find("customer=$page->customer");
     $ps->add($pages->find("service=$page->service"));
     $ps->add($pages->find("business=$page->business"));
     // etc.

     One problem with this approach is that, depending on your situation, you might not be able to use limits and built-in pager functions to their full extent. Anyway, hope this helps a bit.
  24. Take a look at mb_substr() for setting a character limit. For a word limiter you can use something like this: http://snipplr.com/view/12987/limit-words-in-a-string/.

     // basic mb_substr example
     $value = $page->my_field;
     if (mb_strlen($value) > 255) {
         echo mb_substr($value, 0, 255) . "...";
     } else {
         echo $value;
     }
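     And for completeness, a simple word limiter along the same lines as the linked snippet (not the snippet itself), cutting after the first 50 words:

     $words = explode(' ', $page->my_field);
     if (count($words) > 50) {
         echo implode(' ', array_slice($words, 0, 50)) . "...";
     } else {
         echo $page->my_field;
     }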
  25. “Submit your poster in PowerPoint format”. Whoever wrote that must hate Mac users. Two hours well spent. http://t.co/m7asD0KzI6
