GiZZ All American 6982 Posts |
was thinking: since we quite frequently forget (or don't know how) to do the [link]s, it would be cool if it automatically added the tags around http addresses. just a thought, dunno how hard it would be to implement.
btw - glad to see you added a feedback forum! been wanting a place to give constructive criticism for a long time. keep up the good work. 10/17/2001 2:11:27 PM |
InsaneMan All American 22802 Posts |
it should be pretty easy.
anything that contains any of these: .com .edu .net .gov www.
don't add http://www.thewolfweb.com/ to it; just add http:// 10/18/2001 3:49:50 AM |
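A minimal C# sketch of that heuristic, purely for illustration (the LooksLikeUrl/Linkify names and the anchor-tag format are assumptions, not how the forum actually renders posts):

```csharp
using System;

class AutoLinkHeuristic
{
    // hints from the post above: anything containing one of these is treated as a URL
    static readonly string[] Hints = { ".com", ".edu", ".net", ".gov", "www." };

    static bool LooksLikeUrl(string word)
    {
        foreach (string hint in Hints)
            if (word.IndexOf(hint, StringComparison.OrdinalIgnoreCase) >= 0)
                return true;
        return false;
    }

    static string Linkify(string word)
    {
        if (!LooksLikeUrl(word))
            return word;

        // don't prepend the site's own base URL; just add the scheme if it's missing
        string href = word.StartsWith("http://", StringComparison.OrdinalIgnoreCase)
            ? word
            : "http://" + word;

        return "<a href=\"" + href + "\">" + word + "</a>";
    }

    static void Main()
    {
        Console.WriteLine(Linkify("www.thewolfweb.com"));  // gets wrapped in a link
        Console.WriteLine(Linkify("hello"));               // left alone
    }
}
```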
CrazyJ The Boss 2453 Posts |
i made a thing before that used regular expressions to do it. it looked for http:// or www. and made it into a link. there was something wrong with it, though, so i decided not to use it. i forget what. i'll look into rewriting it and putting it back in! 10/18/2001 2:29:42 PM |
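Something along those lines, as a rough C# sketch using .NET regular expressions (the pattern and tag format are guesses at the idea, not the code that was actually shelved):

```csharp
using System;
using System.Text.RegularExpressions;

class RegexAutoLink
{
    static string Linkify(string text)
    {
        // look for http:// or www. and run to the next whitespace,
        // then wrap whatever was matched in an anchor tag
        return Regex.Replace(text, @"(?:http://|www\.)\S+", m =>
        {
            string url = m.Value;
            string href = url.StartsWith("http://") ? url : "http://" + url;
            return "<a href=\"" + href + "\">" + url + "</a>";
        });
    }

    static void Main()
    {
        Console.WriteLine(Linkify("check out www.thewolfweb.com sometime"));
    }
}
```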
bavander All American 1567 Posts |
wouldn't you have a problem finding the end of the link, or the beginning for that matter? since some places have spaces in their file names, whitespace couldn't always be used as a delimiter. Also, you would have to make sure to convert the spaces into their hex equivalents, like %20 for a space
I've been working on a perl script that is supposed to do the same thing, except it takes a file for the input and spits out a new file. but for urls that don't end at just the domain and also have a path, it has been impossible to implement 10/19/2001 12:19:56 AM
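For the space issue specifically, a tiny C# sketch of the %20 conversion (the URL below is made up for illustration; the .NET Uri class also has escaping helpers if you want something more thorough):

```csharp
using System;

class SpaceEscape
{
    static string EscapeSpaces(string url)
    {
        // percent-encode spaces so the href stays a legal URL
        return url.Replace(" ", "%20");
    }

    static void Main()
    {
        // hypothetical path with a space in the file name
        Console.WriteLine(EscapeSpaces("http://www.example.com/my file.html"));
    }
}
```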
CrazyJ The Boss 2453 Posts |
well, to find the beginning of the link, I would look for:
" http://" or " www."
this wouldn't find anything in parentheses, but I think it would do the job for the most part. to find the end of the link, we would search for the nearest whitespace. whitespace isn't allowed in URLs (http://www.w3.org/Addressing/rfc1738.txt - listed as an "unsafe" character). you'll see in the appendix that whitespace is allowed for breaking a long URL apart into multiple lines. anyways, it would be a royal bitch to intelligently sense this. people can fucking deal with it
with the regular expression support in .NET, it really only takes one line of code to accomplish this! 10/19/2001 11:01:02 AM |
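For what it's worth, a sketch of what that one-liner might look like (a guess at the idea, not the site's actual code):

```csharp
using System;
using System.Text.RegularExpressions;

class OneLineLinkify
{
    static void Main()
    {
        string post = "the spec is at http://www.w3.org/Addressing/rfc1738.txt if anyone cares";

        // one possible one-line version: match http:// up to the next whitespace
        // and wrap the whole match ($0) in an anchor tag
        string linked = Regex.Replace(post, @"http://\S+", "<a href=\"$0\">$0</a>");

        Console.WriteLine(linked);
    }
}
```

Handling the bare www. case would need a second pattern or a MatchEvaluator as in the earlier sketch, but the whitespace-terminated match is the core of it.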
GiZZ All American 6982 Posts |
yeah, it shouldn't be too hard, and whitespace would be the delimiter. thanks for considering it, jake. 10/23/2001 7:03:01 PM |