gebeer

Members
  • Posts: 1,542
  • Days Won: 47

gebeer last won the day on February 22

gebeer had the most liked content!

1 Follower

Profile Information
  • Gender: Male
  • Location: Thailand

16,727 profile views

gebeer's Achievements
  • Hero Member (6/6)
  • Reputation: 1.9k

Community Answers: 28

  1. I wanted to add: this gives you basically something like context7, but locally, with your very specific knowledge, and implemented not as an MCP but as a skill, which has less overhead in the context window. And you could easily modify it for other knowledge: different frameworks, whatever.
  2. Skill is up at https://github.com/gebeer/processwire-ai-docs/tree/main/skills/processwire-memory You need to install memvid-cli; it's all in the README and the skill. You can build your memory file with the docs you need from https://github.com/phlppschrr/processwire-api-docs/blob/main/api-docs/index.md If you want me to share my mem file (~35MB), I can do that, too. I haven't used it a lot yet, but it seems to work quite well. It maybe needs some work on the proactive part, so that agents know when to look things up even when not explicitly prompted. Implementing that depends very much on the AI tools you're using. For Claude Code, hooks would be a good place. Others like Cursor, I don't know.
  3. Yes, very good results. It's fast and pretty token-efficient. You can connect it to an already open browser with a logged-in session etc., no need for an auto-login route. Let your agent read the MCP instructions; mine said to start Chrome with a debug flag. Let me pull that up through my conversation-search MCP quickly. Here it is :-) Launch command (detached from terminal): setsid chromium --remote-debugging-port=9222 &>/dev/null &
      • --remote-debugging-port=9222 — enables CDP so the MCP can connect
      • setsid — creates a new session, fully detached from the terminal's process group (plain nohup doesn't survive terminal close because the terminal sends SIGTERM, not just SIGHUP)
      • &>/dev/null & — suppresses output and backgrounds it
     That's for Linux but should work on your Mac, too. In that browser you open your PW project's backend, and then the MCP can connect. That's the way it's supposed to be done, I guess.
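The detachment part of that launch command can be tried without a browser. Here's a minimal sketch of the same pattern with `sleep` standing in for chromium (assumes Linux with procps `ps` and `pgrep`; this is an illustration, not the actual MCP setup):

```shell
# Same pattern as the chromium launch, with `sleep 30` standing in.
# setsid puts the child in a brand-new session, detached from this shell.
setsid sleep 30 >/dev/null 2>&1 &
sleep 1  # give setsid a moment to fork the detached child

# Compare our shell's session id with the detached process's session id:
sid_shell=$(ps -o sid= -p $$ | tr -d ' ')
pid_child=$(pgrep -n -x sleep)
sid_child=$(ps -o sid= -p "$pid_child" | tr -d ' ')
echo "shell session: $sid_shell / detached session: $sid_child"
```

Closing the terminal tears down the shell's session but leaves the detached one alive, which is why the browser survives where plain `nohup` often doesn't.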
  4. Hi fellow devs, this is a somewhat different post, a little essay. Take it with a grain of salt and some humor. Maybe some of you share a similar experience. I don't really mean to poop on a certain group with certain preferences, but then, that's what I'm doing here. I needed to write it to unload some frustration. No offense intended. Good Sunday read :-)

     React Is NPC Technology

     Have you ever really looked at React code? Not the tutorial. Not the "Hello World." An actual production component from an actual codebase someone is actually proud of? Because the first time I did, I thought there'd been a mistake. A failed merge. HTML bleeding into JavaScript, strings that weren't strings, logic and markup performing some kind of violation you'd normally catch in code review before it got anywhere near main. "Fix this," I thought. "Someone broke this."

     It looks broken because it is broken. That's the first thing you need to understand. JSX is a category error. Mixing markup and logic at the syntax level - not as an abstraction, not behind an interface, but visually, literally, right there in the file - is the kind of decision that should have ended careers. Instead it ended up in 40% of job postings.

     And here's the part that actually matters, the part that explains everything: nobody can tell you why. "Everyone uses it." Go ahead, ask. That's the answer. That's the complete sentence, delivered with the confidence of someone who has never once questioned whether a thing should exist before learning how it works. The argument for React is React's market share. The case for Next.js is that your tech lead saw it in a conference talk in 2021 and it was already too late. You're supposed to hear this and nod - because if everyone's doing something, there must be a reason, right? The herd doesn't just run toward cliffs. Except. That's literally what herds do.

     The web development community, bless its heart, has a category of decision I can only call NPC behavior. Not an insult - a technical description. An NPC doesn't evaluate options. An NPC reads the room, finds the dominant pattern, and propagates it. React is on every job posting = React is what employers want = React is what I need to know = React is what I reach for. The loop closes. Nobody along the chain asked if it was right. They asked if it was safe. Safe to put on a resume. Safe to recommend. Safe to defend at the standup. React is the framework you choose when you've stopped choosing and started inheriting.

     The 10% who actually think about their tools - they're out there running Alpine.js. Which is 8kb. Does the same job. No build step required. Add an attribute, the thing works. Revolutionary concept. They're running htmx, which understood something profound: the web already has a protocol for moving data, and it was fine. You didn't need to rebuild HTTP in JavaScript. You just needed to reach for the right thing instead of the fashionable one.

     Let's talk performance, because "everyone uses it" is already bad enough before you look at what it actually does. React ships 40-100kb of runtime JavaScript before your application does a single thing. Your users wait while React bootstraps itself. Then it hydrates - a word that sounds refreshing and means "React redoes on the client what the server already did, because React can't help it." Then they invented Server Components to fix the problem of shipping too much JavaScript. The solution: ship different JavaScript, handled differently, with new mental models, new abstractions, new ways to get it wrong. They called it an innovation.

     I once worked with WordPress and React together. I want you to sit with that. Two philosophies, neither of which is actually correct, stacked on each other like a complexity casserole nobody ordered. WordPress solving 2003's problems with 2003's patterns. React solving 2003's problems with 2013's patterns that created 2023's problems. Together they achieved something genuinely special: all the drawbacks of both, and none of the advantages of either. The PHP you want but in a different way, and the hydration you couldn't prevent, serving pages that load like they're apologizing for something.

     Twenty years building for the web and I've watched frameworks rise and fall like geological events. ColdFusion, anyone? Remember when Java applets were going to be everywhere? Flash was going to be the web. Then jQuery saved us. Then Angular saved us from jQuery. Then React saved us from Angular. Rescue upon rescue, each one leaving more complexity than it cleared, each one defended by exactly the same people who defended the last one, now wearing a different conference lanyard.

     ProcessWire. That's what I build with. Most developers have never heard of it - which is not a criticism, that's the evidence. You find ProcessWire because you went looking for something specific, evaluated it, and it fit. It doesn't have conference talks. It doesn't have a VC-funded developer relations team. It has a forum full of people who chose it. That's a different category of thing entirely. The same 10% who find ProcessWire find Alpine. Find htmx. Make decisions that don't optimize for defensibility in interviews. Build websites that load fast because they don't carry React around everywhere they go.

     There's a physics concept called a local minimum. A place where a system settles because the immediate neighborhood looks stable - the energy gradient points upward in every direction, so the system stops. Stays. Convinces itself it's home. Even if a global minimum exists somewhere else, at lower energy, lighter, simpler - you'd have to climb first, and the herd doesn't climb. React is a local minimum. The web settled here when it got tired of looking. Stable enough. Defended by enough career investment. Surrounded by enough tooling and tutorials and framework-specific bootcamps that switching costs feel existential. The ground state - simpler, faster, closer to what the web actually is - sits somewhere else, past a hill that looks too steep from inside the valley. The ground state is always simpler. That's not a philosophical position. That's thermodynamics. They don't want you to know that.
  5. I use https://github.com/ChromeDevTools/chrome-devtools-mcp for that. Very fast. The thing about the plan is that it's supposed to be reviewed before it is applied, haha. But if you trust it without at least a quick glance, OK. I think Cursor can play sounds when it needs your attention. You could use that to notify you when you have to click the button. Wow, the "Full ProcessWire API access - Query, create, update, and delete pages" is the most interesting part for me here. I'm working right now on a single-file PW-API-docs database based on https://github.com/memvid/memvid. It has semantic vector search (local embedding model), BM25 and all that good stuff. It also supports CRUD. I fed it a good part of https://github.com/phlppschrr/processwire-api-docs/blob/main/api-docs/index.md . The file is currently around 35MB. Search is blazingly fast. I'm implementing it as a portable skill, not as an MCP. It needs a little more love and testing, but I'll drop it soonish.
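Combining a semantic ranking with a BM25 ranking, as a hybrid search like this does, can be sketched with Reciprocal Rank Fusion. Everything below (the doc ids, the rankings, the fusion choice) is hypothetical illustration, not memvid's actual API or internals:

```python
# Minimal sketch: fuse a semantic (vector) ranking with a BM25 ranking
# via Reciprocal Rank Fusion (RRF). Doc ids here are made up.

def rrf_fuse(rankings, k=60):
    """Combine several ranked lists of doc ids into one fused ranking."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking):
            # documents ranked high in any list accumulate more score
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank + 1)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical top results from two retrievers over the same docs corpus:
semantic_ranking = ["pages-api", "hooks", "fields-api"]
bm25_ranking = ["hooks", "selectors", "pages-api"]

fused = rrf_fuse([semantic_ranking, bm25_ranking])
print(fused[0])  # → "hooks" (ranked well by both retrievers)
```

RRF is a common choice here because it needs no score normalization between the two retrievers, only their rank orders.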
  6. Contents of related skills are not included in the docs that context7 parsed from your repo; those are separate. It should be safe to use as is, @bernhard, FYI. If you want to be 100% sure that you can trust those snippets, you'd need to go through https://github.com/phlppschrr/processwire-knowledge-base/tree/master/docs and look for prompt injections. But I think that would be overkill, tbh.
  7. Tried several of them, including Kilo Code (from NVIDIA, I think), which uses a clean spec-driven workflow. Currently working on my own version of that with prompt templates, verification through hooks and all that good stuff. Spec-driven is a good approach, especially for larger features. For small things I'm still using good old chat in Claude Code.
  8. Would love to have @Jonathan Lahijani chime in here. Maybe he's got news about his MCP project :-)
  9. I just published https://github.com/gebeer/conversation-search-mcp It's a very minimal and fast MCP server that can search over JSONL session transcripts. It can be pointed at a folder with those sessions and then do BM25 searches for relevant context. Claude Code sessions all get stored in ~/.claude/projects/* folders, one folder per project. I have pointed mine at a folder in there that contains all my ProcessWire projects, so it has all past conversations I've had with Claude Code in these projects. There's a lot of valuable data in there: how the assistant applied fixes, what context I gave it, what it did wrong, what corrections I had to make, etc. When the MCP server is active, the assistant can use it to search for relevant context. Currently I'm working on a hook system that auto-injects relevant context from a search based on the current prompt. It's an experiment for now, but it might be a good way to enhance the assistant's understanding with relevant context.
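The BM25-over-JSONL idea can be sketched in stdlib Python. The one-JSON-object-per-line format with a "text" field below is a simplification I made up for illustration; the real Claude Code transcript schema is richer, and the actual server's code is in the repo, not here:

```python
# Minimal BM25 search over JSONL transcript lines (stdlib only).
# Assumed toy format: one JSON object per line with a "text" field.
import json
import math
import re
from collections import Counter

def tokenize(text):
    return re.findall(r"[a-z0-9]+", text.lower())

def bm25_search(lines, query, k1=1.5, b=0.75):
    """Return indices of matching lines, best score first."""
    docs = [tokenize(json.loads(line).get("text", "")) for line in lines]
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / max(N, 1)
    # document frequency: in how many docs each term appears
    df = Counter(t for d in docs for t in set(d))
    scored = []
    for i, d in enumerate(docs):
        tf = Counter(d)
        score = 0.0
        for t in set(tokenize(query)):
            if t not in tf:
                continue
            idf = math.log(1 + (N - df[t] + 0.5) / (df[t] + 0.5))
            score += idf * tf[t] * (k1 + 1) / (
                tf[t] + k1 * (1 - b + b * len(d) / avgdl)
            )
        scored.append((score, i))
    return [i for score, i in sorted(scored, reverse=True) if score > 0]

transcript = [
    '{"text": "fixed the hook priority bug in ready.php"}',
    '{"text": "discussed template caching options"}',
    '{"text": "hook into Pages::saveReady to set the title"}',
]
print(bm25_search(transcript, "hook priority"))  # → [0, 2]
```

Lines that share no term with the query score zero and are dropped, which is what makes BM25 a cheap first-pass filter before anything semantic.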
  10. Slight understanding on a high level. The greatest challenge is training data collection and formatting, not so much the LoRA training itself. I spent some time about a year ago on how to approach the data side of things, couldn't find any good sources back then, and gave up on it. IMO it's better to work with ICL (In-Context Learning) and a SoTA model than to have a specialized but weaker one. That might not be true anymore, though.
  11. Makes total sense :-)
  12. Wow. That was quick. Thank you!
  13. You can surely do that; context7 will accept it no problem. Could add a note: unofficial but AI-friendly docs for the great ProcessWire CMS/CMF lol
  14. Good idea, thank you! You might want to create a separate thread for this for further discussion. I'm sure people are interested in working together on this.
  15. This is great! I see you have a much more sophisticated setup for API/docs extraction than me. I had a go at producing markdown docs for ProcessWire a while ago and took a different approach (scrape the API docs instead of parsing source). Partial results are here: https://gebeer.github.io/mkdocs-processwire/ I really wish @ryan would adopt the md format for the official API docs so that AI assistants can easily parse them and context7 can index them. And the collection of blog posts you have is impressive. As for the skill itself, it doesn't currently follow the recommended format: there should be no README, the docs dir could be renamed to references, etc. Other than that it looks amazing. Love the scripts section. Def will give this a go and let you know how it goes.