Frank Vèssia Posted October 27, 2011

I have a page that I use for embedding streamed media, YouTube-style, in this format: www.domain.com/embed/videoid. The videoid is an encrypted string I generate (used as a urlSegment), and it normally ends with "=", e.g. "/embed/mZ2Xp5g=". The urlSegment doesn't work with this type of string... the problem is the "=". My solution for now is to substr() out the "=" and add it back when I need to decrypt the string, but it would be nice to have the entire string working...
Adam Kiss Posted October 27, 2011

1. Don't urlencode() and urldecode() help?
2. Can't you change the random (I guess) ID generation to include only lowercase/uppercase letters and numerals, for instance?
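(A quick sketch of point 1, my own addition: urlencode() escapes the '=' as '%3D', but '%' itself is outside ProcessWire's allowed segment characters, so this only helps for GET vars, not URL segments.)

```php
<?php
// urlencode() turns '=' into '%3D', but the '%' is itself
// outside ProcessWire's allowed URL-segment character set,
// so the encoded form is still rejected as a segment.
$id = 'mZ2Xp5g=';
$encoded = urlencode($id);        // "mZ2Xp5g%3D"
$decoded = urldecode($encoded);   // "mZ2Xp5g=" again
```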
ryan Posted October 27, 2011

In addition to the expected forward slash that separates directories, ProcessWire only accepts these [ASCII] characters in its URLs: abcdefghijklmnopqrstuvwxyz 0123456789 . - _ The same goes for page names (which make up URLs) and URL segments. But it doesn't apply to GET vars, of course. In your case, I would probably use a GET var to hold that video ID. That would be more appropriate, because URL segments are really more worthwhile for creating readable URLs, and hashes aren't by nature very readable. But if you wanted to support those hashes in the URL, you could rtrim($urlSegment, '=') and add it back later, or just replace the "=" with a dash or underscore and add it back if needed.
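(A minimal sketch of the trim-and-restore idea; the helper names are mine, not from the thread. Base64 output is always padded with "=" to a multiple of four characters, so the padding can be stripped for the URL and recomputed before decoding.)

```php
<?php
// Strip the base64 '=' padding so the ID is safe as a URL segment.
function idToSegment($id) {
    return rtrim($id, '=');
}

// Restore the padding: base64 strings are always a multiple of 4 long,
// so the missing '=' count can be recomputed from the length.
function segmentToId($segment) {
    $pad = (4 - strlen($segment) % 4) % 4;
    return $segment . str_repeat('=', $pad);
}

$segment = idToSegment('mZ2Xp5g=');   // "mZ2Xp5g"
$id      = segmentToId($segment);     // "mZ2Xp5g=" again
```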
Frank Vèssia (Author) Posted October 28, 2011

Thanks. I use base64 encoding and a secret key to generate that code, and it always finishes with "=" or a double "==". I'm using rtrim and it works well... I prefer this solution over a GET value.
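(Worth noting, my addition rather than something from the thread: standard base64 can also emit '+' and '/', which are likewise outside ProcessWire's allowed set. A URL-safe base64 variant maps them to '-' and '_', which ProcessWire does allow, and drops the '=' padding in one step.)

```php
<?php
// URL-safe base64: swap '+/' for '-_' and drop the '=' padding.
// '-', '_', letters and digits are all allowed in PW URL segments.
function base64UrlEncode($data) {
    return rtrim(strtr(base64_encode($data), '+/', '-_'), '=');
}

function base64UrlDecode($segment) {
    $b64 = strtr($segment, '-_', '+/');
    $pad = (4 - strlen($b64) % 4) % 4;
    return base64_decode($b64 . str_repeat('=', $pad));
}
```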
ryan Posted October 28, 2011

I've used base64 encoding in URLs before (though in GET vars) and had to do the same thing, trimming out the equals signs at the end.
er314 Posted September 14, 2012

> In addition to the expected forward slash that separates directories, ProcessWire only accepts these [ASCII] characters in its URLs: abcdefghijklmnopqrstuvwxyz 0123456789 . - _ The same goes for page names (which make up URLs) and URL segments. But it doesn't apply to GET vars, of course. [...]

I have URL segments which may contain characters outside this allowed list. Did you consider also allowing the '%' character in URLs? Having '%' would permit use of the urlencode/urldecode functions. Instead, I guess I have to write my own urlencode/urldecode functions, or use base64_encode/base64_decode plus handling of the '=' characters, which is somewhat overkill ;-) Or am I overlooking a simpler approach?
ryan Posted September 14, 2012

> Did you consider also allowing the '%' character in URLs?

The intention is to keep things within a well-known set for security, but also so we have consistency with ProcessWire's pageName filters.

> Or am I overlooking a simpler approach?

GET variables?
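(In a template, the GET-variable route might look roughly like this. Plain PHP is shown for illustration; in ProcessWire you would typically read the value through the $input API variable instead of $_GET. The whitelist pattern is my own sketch, not a prescribed approach.)

```php
<?php
// GET values are not restricted like URL segments, so the '='
// survives intact: /embed/?id=mZ2Xp5g%3D arrives as "mZ2Xp5g=".
$id = isset($_GET['id']) ? $_GET['id'] : '';

// Whitelist base64 characters before using the value anywhere.
$id = preg_replace('/[^A-Za-z0-9+\/=]/', '', $id);
```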
er314 Posted September 14, 2012

Ok, many thanks for the advice. I want to benefit from caching and indexing, so I'll stick to URL segments and some encoding mechanism.
ryan Posted September 16, 2012

From the caching side, you can always use the MarkupCache module to cache your own stuff. But you just have to be careful to limit the possible pages that can be rendered. Unbridled caching of pages generated from values in GET vars is a bit of a security hole, as someone could take advantage and leave you with millions of cache files, potentially filling up the disk or slowing things down. This is the reason why PW doesn't support caching of output generated from URLs with GET variables.
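(For reference, MarkupCache is typically used inside a template roughly like this, a sketch based on the module's documented get/save pattern; the cache key and expiry value are illustrative, and $modules is ProcessWire's API variable, so this fragment only runs inside PW.)

```php
<?php
// Sketch of MarkupCache usage inside a ProcessWire template.
// $modules is the ProcessWire modules API variable.
$cache = $modules->get("MarkupCache");

// get() returns the cached markup, or false if expired/missing.
// Key and 3600-second expiry are illustrative values.
if(!$data = $cache->get("video_list", 3600)) {
    $data = "";
    // ...build your expensive markup here...
    $cache->save($data); // store it for the next request
}
echo $data;
```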