00:00:07 <calamari> mediawiki seems like a heavyweight wiki
00:01:22 <kipple_> heavyweight in what way? system requirements, features, lines of code... ?
00:01:40 <calamari> system requirements (ram), and database
00:02:38 <kipple_> graue: what hardware are you running on?
00:03:31 <kipple_> it's not going to have lots of traffic, so I don't think it will be a problem...
00:04:18 <calamari> lament: I think it's scary, because we ultimately lose control and if wikicities goes down, all is lost
00:04:33 <lament> if graue is using BT to download and upload lots of porn, the site will be slow no matter how little traffic it has
00:04:40 <pgimeno> mirrors are a way of dealing with network traffic
00:05:06 <kipple_> you can get (daily I think) db-dumps from wikicities too
00:05:06 <lament> calamari: you can get the DB dump on wikicities, no?
00:05:16 <graue> i don't own the server, it's shared, specs are here: http://textdrive.com/specs/
00:05:33 <lament> calamari: and i trust wikimedia more than any individual with a server
00:05:49 <kipple_> me too. that's why we mirror it
00:06:25 <calamari> lament: who is going to store them? Unless people are actively involved, I foresee the db dumps eventually ending up on a dead server; then wikicities goes offline, someone tries to retrieve the dumps, and only then is it discovered that the person keeping them was gone long ago
00:06:38 <kipple_> I trust 3 mirrors more than wikicities (which is NOT part of wikimedia)
00:07:15 <lament> calamari: anybody involved in anything esoteric is likely to very suddenly stop being involved
00:07:43 <pgimeno> lament: btw, wikicities is not a wikimedia project
00:08:25 <GregorR> With full-on mirrors rather than just backups, if the main goes down, you don't have to go begging after somebody for content, it's still there.
00:08:28 <kipple_> I fear what will happen if the ads on wikicities aren't enough to pay the hosting expenses
00:09:58 <kipple_> anybody know if there is a read-only setting in MediaWiki?
00:12:12 <wooby> well, i have this usermode linux box running and i'm willing to devote it to the site
00:12:23 <wooby> just might need help administering it from time to time as i'm about to travel
00:15:10 <calamari> kipple_: yes there is a read only setting
00:15:29 <kipple_> then read-only mirrors should be easy
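For reference, the read-only mode mentioned above is a single setting in MediaWiki's LocalSettings.php. A minimal sketch of what a mirror's config might add (the message text is just an illustration):

    # In the mirror's LocalSettings.php: put the whole wiki into read-only mode.
    # Anyone who tries to edit sees this message instead of the edit form.
    $wgReadOnly = 'This is a read-only mirror; please edit on the main site.';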
00:15:41 <calamari> kipple: has edit redirecting been abandoned?
00:16:35 <calamari> kipple: not without modifying the source code
00:16:47 <kipple_> we should try to avoid that IMHO
00:16:49 <pgimeno> I think it's not an option
00:17:05 <pgimeno> there's a lot of problems involved
00:17:53 <calamari> there should at least be a way for people on the mirrors to edit, or they will never be used for anything
00:18:25 <calamari> then nobody will know when the mirrors go down, because they are unused :)
00:18:26 <kipple_> they don't have to be used. only be there in case the main goes down
00:19:56 <GregorR> I agree with calamari - there should at least be a simple header("Location: <master>/edit.php");
00:20:43 <pgimeno> guest users can't edit anyway
00:22:09 <calamari> each mirror can redirect to the main site, unless the main site is down (pretty sure there's a way to test for that), in which case the read only local copy is shown. The main site would only allow connections that were from the mirrors
00:22:46 <calamari> what is it called, the referrer?
00:23:39 <calamari> that way people are forced to use the mirrors, rather than accessing the main site directly
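A rough sketch of the scheme calamari describes, assuming the mirror puts a small PHP script in front of its pages; the main-site URL, script name, and paths below are made up for illustration, not anything that exists:

    <?php
    // Hypothetical mirror front-end (e.g. go.php): send visitors to the main
    // site while it is reachable, otherwise fall back to the local read-only copy.
    $main = 'http://main-wiki.example.org';                    // assumed main-site URL
    $page = isset($_GET['title']) ? $_GET['title'] : 'Main_Page';

    // Crude availability check: try to open a socket to the main server.
    $sock = @fsockopen(parse_url($main, PHP_URL_HOST), 80, $errno, $errstr, 5);
    if ($sock) {
        fclose($sock);
        header('Location: ' . $main . '/index.php?title=' . urlencode($page));
    } else {
        // Main site down: serve the local read-only MediaWiki install instead.
        header('Location: /wiki/index.php?title=' . urlencode($page));
    }

This only covers the redirect half of the idea; locking the main site down to mirror traffic (the referrer check discussed next) would be a separate step.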
00:23:52 <kipple_> I don't know. Would make it more complicated to set up. KISS?
00:24:06 <calamari> is it really that complicated?
00:24:26 <pgimeno> anything involving touching that code is risky
00:24:57 <kipple_> how would you prevent connections from non-mirrors?
00:25:06 <pgimeno> then the server needs a list of mirrors
00:25:09 <calamari> kipple: you'd only prevent them from the main page
00:25:24 <graue> make an Esolang:Mirrors page on the wiki
00:25:29 <calamari> kipple: yeah it would.. but it'll need that anyways to send out dumps
00:25:29 <graue> listing the URL of the original, and the URLs of the mirrors
00:25:48 <graue> since that page will be mirrored along with everything else, any copy can be found from anywhere
00:26:09 <kipple_> um, the main shouldn't send out dumps. Only provide them for download
00:26:52 <kipple_> aha. anyway, that's at least my opinion
00:27:28 <kipple_> makes it easier to set up a mirror (i.e. no configuration is needed on the main site)
00:28:30 <calamari> I see little reason to set up a mirror if nobody is going to use it, though :)
00:28:54 <calamari> that's not what I meant.. but ok
00:29:13 <pgimeno> I more or less agree with calamari, but I don't see why people are not going to use mirrors
00:29:27 <kipple_> I don't see a reason to use a mirror?
00:29:34 <calamari> pgimeno: because you can't edit
00:29:50 <kipple_> why use a mirror if you can use the main?
00:29:53 <pgimeno> if only editors can edit, that's not a problem
00:30:08 <pgimeno> kipple_: "Please use a server near to you"
00:31:05 <pgimeno> similar to what one gets when downloading a file from sourceforge
00:31:31 <kipple_> I don't think traffic will reach the point where that is necessary
00:32:06 <calamari> I don't either.. it's not really a traffic problem.. I'm just concerned that the mirrors will evaporate without anyone knowing it
00:32:34 <pgimeno> a rotating DNS could perhaps also help
00:32:40 <pgimeno> but that's harder to set up
00:32:51 <pgimeno> (similar to irc.freenode.net)
00:33:08 <calamari> yeah that's too much trouble I think
00:33:29 <kipple_> here's how I see it: if you are afraid of something disappearing, take a backup yourself. All we should do is make sure the main site facilitates that
00:33:36 <calamari> people aren't going to want to change their config, well unless it can be automated
00:34:07 <calamari> that's the whole point tho.. otherwise, what's wrong with graue's site?
00:34:50 <kipple_> I don't follow you. Have I said there's something wrong with graue's site?
00:35:08 <GregorR> As long as there are enough alternative modes of communication (DirectNet and IRC and DirectNet and email and ... DirectNet :P) and enough auto-downloaders, I think just having one active main isn't a problem.
00:35:45 <calamari> one thing they taught us in first aid training is never to say "someone call for help", because then no one does (everyone assumes someone else has already done it). The better alternative is to pick someone (or in our case multiple people) to do it
00:36:27 <GregorR> Hence auto-downloaders rather than humans.
00:36:37 <calamari> kipple: ?? not trying to imply anything wrong with his site
00:36:47 <kipple_> I'm not saying "someone call for help". I'm saying "YOU call for help" ;)
00:37:17 <kipple_> but, of course, you have a point....
00:37:32 <kipple_> how can we be 100% sure a backup is taken.....
00:37:33 <calamari> kipple: I didn't realize you were speaking directly to me individually
00:38:22 <kipple_> dang, it's hard sometimes to communicate by IRC :)
00:38:56 <calamari> I'm glad graue has his site up.. it looks really nice
00:39:07 <pgimeno> kipple_: by checking the mirror site ;P
00:39:12 <GregorR> The best way to guarantee it would be to upload from the main rather than trusting the mirrors to download.
00:39:52 <kipple_> if the mirrors go down, upload fails as much as download
00:40:11 <GregorR> But the main would know it, and could put a big red banner on the page saying "THIS MIRROR IS DOWN!!!!!!!!"
00:40:32 <kipple_> you could still do that, without uploading
00:40:48 <GregorR> Hmm, I suppose you could check whether a mirror has downloaded...
00:41:06 <kipple_> I would like to be able to take a backup of the site WITHOUT being dependent on the current admin to give it to you
00:41:46 <GregorR> The problem with the download model is that you can't trust people to download - if the main goes down, it's possible that nobody would have backed it up. (cont. next line)
00:42:12 <GregorR> The problem with the upload model is that the (possibly non-existent) administrator of the main site needs to make changes for a new mirror to spring up.
00:42:36 <GregorR> So make the process of adding oneself to the upload list automated.
00:42:41 <GregorR> You win the typing contest.
00:42:41 <pgimeno> downloads can be monitored
00:42:56 <GregorR> pgimeno: That would be significantly more difficult I think ...
00:43:13 <GregorR> Especially if it's via HTTP or whatnot...
00:43:56 <pgimeno> really? "Last downloads: 80.35.19.122 2005-05-25 17:20"
00:44:59 <GregorR> Oh, so the download would be through a PHP script?
00:45:11 <GregorR> I thought there would just be a file floating on a server somewhere that got updated now and then X-D
00:45:46 <pgimeno> even so, that file could be gotten through a PHP script
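Something like pgimeno's monitored-download idea could indeed be a short PHP script sitting in front of the dump; the file names below are assumptions for illustration only:

    <?php
    // Hypothetical getdump.php: hand out the database dump and remember who
    // fetched it, so a "last downloads" page can show whether backups are being taken.
    $dump = 'wikidump.sql';                  // assumed name of the dump file
    $log  = 'download-log.txt';              // assumed location of the download log

    // Record the requester's IP and the time of this download.
    file_put_contents($log, $_SERVER['REMOTE_ADDR'] . ' ' . gmdate('Y-m-d H:i') . "\n", FILE_APPEND);

    // Stream the dump to the client.
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($dump) . '"');
    readfile($dump);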
00:45:52 <calamari> mmm php in a nutshell, to be published July 2005.. I'll have to ask for that one for Christmas :)
00:46:22 <pgimeno> "gotten"? is that correct?
00:46:22 <GregorR> However, then there's another problem. 1) You'd need a daemon to actually do anything with that info, 2) there would still need to be a main-site mirror list for that to be useful.
00:46:45 <GregorR> Hmm, "gotten" ... I think so? Me not talk English.
00:47:10 <GregorR> AHH! LONG WORDS HURT GREGOR!
00:48:12 <pgimeno> anyway, just a special page with the last downloads seems sensible
00:48:59 <GregorR> Then it falls back to trusting humans - who's going to check that page to make sure everything is in order?
00:49:17 <pgimeno> it can be in the same page as mirrors
00:50:30 <GregorR> I don't know whether people would react when they saw "Last update/backup: <2002>"
00:50:31 <calamari> then have a link on how to become a mirror site
00:50:54 <GregorR> Wait, didn't we decide that they wouldn't be mirrors proper, just backup sites?
00:51:13 <calamari> I didn't realize that was decided
00:51:44 <kipple_> the point is, you can do whatever you want with your own backup/mirror
00:52:01 <kipple_> for me a backup is sufficient
00:52:30 <calamari> is it possible to download without downloading the entire database each time?
00:53:13 <calamari> oh? cool.. didn't realize rsync could work with databases
00:53:20 <kipple_> the dump is a plain text file
00:53:25 <pgimeno> <kipple> it works on MySQL dumps as well :) (unless they are zipped)
00:53:55 <kipple_> though it might get big if not zipped
00:54:02 <kipple_> probably not a problem for us
00:54:37 <calamari> well, hopefully not, since rsync only sends the files that changed, right? or does it even do better than that and send a patch?
00:55:15 <pgimeno> well, a forum with lots of daily traffic has a 50 MB database
00:55:37 <kipple_> yes, but I don't think we'll get close to that
00:56:16 <kipple_> I think we're in the ballpark of some weekly traffic
00:56:19 <pgimeno> I think a 50 MB dump is reasonable
00:56:43 <kipple_> is that before or after zipping it? (the forum example)
00:58:37 <calamari> I like the "Last backup" idea, but put it on the main page where everyone sees it
01:01:17 <calamari> I wonder if it'd be possible to determine the last time anything was edited on the wiki.. then after a week it could say "Backup out of date, please help to preserve this wiki" or something like that :)
01:02:03 <kipple_> There could be a list of backups taken in the last week. Then people could be encouraged to take weekly backups, picking days when few others do
01:03:04 <pgimeno> I'm too sleepy to go on discussing
01:03:07 <kipple_> the "recent changes" page shows the last time something was edited
01:03:18 <calamari> it'd also be good to require some kind of contact information, like an e-mail address
01:04:16 <calamari> although, that might raise privacy concerns
01:04:32 <calamari> since the only way it would be useful is if it was also in the dump :)
01:04:47 <graue> yes, the dump contains every user's email address
01:05:18 <kipple_> I don't see that as a problem. WikiPedia does the same. if
01:05:19 -!- GregorR has quit (Remote closed the connection).
01:06:49 -!- GregorR has joined.
01:07:05 <GregorR> Well, that was a pointless quit :P
01:07:07 <kipple_> the email should be optional anyway
01:08:35 <GregorR> BTW, if the database was sent compressed, rsync would be pointless, since the diffs would be irrelevant.
01:08:59 <kipple_> yes. that's why it shouldn't be compressed
01:09:17 <GregorR> Plus, rsync's traffic can be compressed, rather than the DB itself.
01:09:19 <calamari> the question is how much bandwidth would it take to prepare the patch vs just sending the zipped file?
01:10:09 <kipple_> well, the whole point of rsync seems to be to conserve bandwidth, so I think it is worth it
01:10:54 <GregorR> The very first download would be significantly higher-bandwidth.
01:10:59 <GregorR> After that it would be far far less.
01:11:06 <graue> i don't think a couple megabytes a week matters to anyone in the first place
01:11:55 <GregorR> I live in the happy world where the esowiki is about 1.6GB and there's more data there than anybody could swim through in a lifetime :P
01:12:32 <calamari> that post pgimeno made is the first in how many months? :)
01:13:25 <kipple_> FYI, the zipped dump for current pages of the english wikipedia is 900MB, so 1.6GB is perhaps a bit optimistic...
01:17:01 <GregorR> YAAAAAAAAAAAAAAAAAAAAAAAAAAAAAY! GIKI PROJECT APPROVED!!! :)
01:18:35 <calamari> google says: Ghulam Ishaq Khan Institute of Engineering Sciences and Technology.. GIKI is a center of excellence in Pakistan for the natural sciences and Computing.
01:25:58 <GregorR> kipple_: It was you who suggested the name, right?
01:26:21 <GregorR> Well, it is a rawx0r name :)
01:26:30 <GregorR> Gregor's Wiki = Giki = Geeky = yeeeeh haw :P
01:28:08 <GregorR> I've had experience with bad project names, and this isn't one
01:28:24 <kipple_> just made a user on graue's wiki and got user ID 2 :) I take it not many have registered....
01:28:25 <GregorR> OBLISK's original name was SupaRun ... it's embarrassing just to say that ... what a stupid name.
01:31:20 <kipple_> in a couple of years, maybe I can sell that user on ebay for lots of $$$$$ ;)
01:31:58 <kipple_> I've heard people have actually sold slashdot users with low IDs on ebay
01:44:51 <GregorR> I despise anybody stupid enough to actually buy those, and idolize anybody with the marketing genius to be able to sell them :P
01:48:15 <kipple_> Yay! I've finished zeroing the harddrive on my web server. and it only took 13 hours.... :D
01:48:47 <GregorR> There may be people who would suggest that that's a bit extreme ;)
01:49:08 <kipple_> it was suggested as a way to get rid of the bad sectors
01:49:57 <kipple_> at least I didn't get any r/w errors, like I got with various HD utilities
01:50:59 <kipple_> "bit extreme" it is though! more than a trillion bits is definitely extremely many :)
02:40:09 -!- pgimeno has quit (Connection reset by peer).
02:51:02 -!- pgimeno has joined.
02:51:17 -!- wooby has quit.
03:06:51 -!- calamari has quit ("Leaving").
03:10:05 -!- kipple_ has quit (Read error: 110 (Connection timed out)).
03:57:13 -!- wooby has joined.
03:59:10 -!- GregorR-L has joined.
03:59:38 * GregorR-L is just getting the Giki page up :)
04:11:21 -!- graue has quit ("Are you a Schweinpenis? If so, type "I am not a Schweinpenis."").
04:18:06 <wooby> been tinkering with moinmoin myself
04:18:12 <wooby> wikis are so awesome :)
04:21:02 -!- malaprop has quit ("quit").
05:39:09 <GregorR-L> So, what semi-common Wiki features should I adapt as Giki plugins next :P
06:43:36 -!- puzzlet has joined.
06:47:39 <puzzlet> the first programming language ever written in Hangul
06:48:01 <puzzlet> http://puzzlet.org/puzzlet/%EC%95%84%ED%9D%AC~Ahui
06:50:39 <GregorR-L> I will probably fail to write anything in this language :P
06:50:46 <GregorR-L> Seeing as that I don't even have the keyboard :P
06:51:34 <GregorR-L> Does the "Hello World" program print "Hello World," or "Hello World" in Korean?
06:53:01 <GregorR-L> I assume that it can produce Hangul output?
06:53:34 <puzzlet> it reads a Unicode code point and prints it
06:54:21 <GregorR-L> So where's the "??, ???!" program? (If Babelfish is smart ;) )
06:55:25 <puzzlet> http://puzzlet.org/puzzlet/%EC%95%84%ED%9D%AC~%EC%95%88%EB%85%95%ED%95%98%EC%84%B8%EC%9A%94
06:55:47 <puzzlet> since "Hello, world!" translated literally is an awkward expression.
07:02:09 -!- tokigun has joined.
07:05:11 <puzzlet> he wrote an Ahui interpreter in Python
07:06:38 <tokigun> puzzlet: i have to update the interpreter for the new spec
07:11:45 <cpressey> pgimeno: hmm. i'm still holding the opinion that "provide wiki for esolangs" and "preserve esolangs" are two different tasks, and it just feels like this problem is being shoehorned into a solution that doesn't fit it.
07:12:16 <cpressey> ftp sites preserve content just fine.
07:12:50 <cpressey> what could be more KISS than that?
07:15:01 <puzzlet> i have seen <br>, <font>, <table> so far
07:16:37 <puzzlet> see http://moinmoin.wikiwikiweb.de/CategoryMarket?action=fullsearch&value=linkto%3A%22CategoryMarket%22&context=180
07:34:16 <lament> puzzlet: ahui sounds incredibly dirty in russian
07:40:26 <GregorR-L> Anybody want to add any more wikis to Giki's "other wiki software" list?
07:40:32 <GregorR-L> http://giki.sourceforge.net/index.php?title=other%20wiki%20software
07:41:13 <puzzlet> http://moniwiki.sourceforge.net/wiki.php
07:42:34 <puzzlet> does enabling html codes with like [[HTML(<font>)]] count?
07:42:53 <GregorR-L> Any means of injecting HTML into the wiki *shrugs*
07:44:00 <puzzlet> Cikiwiki, tokigun's ioccc entry - http://page.tokigun.net/obfuscation/cikiwiki.php
07:52:35 -!- puzlet has joined.
07:54:11 <puzlet> wondering why i have been disconnected
07:59:59 -!- clog has quit (ended).
08:00:00 -!- clog has joined.
08:00:11 -!- puzzlet has quit (Read error: 60 (Operation timed out)).
08:02:29 -!- puzlet has changed nick to puzzlet.
08:09:59 <puzzlet> musician's extinction, maybe
08:16:59 -!- puzzlet has quit (clarke.freenode.net irc.freenode.net).
08:16:59 -!- cpressey has quit (clarke.freenode.net irc.freenode.net).
08:17:00 -!- cmeme has quit (clarke.freenode.net irc.freenode.net).
08:17:00 -!- tokigun has quit (clarke.freenode.net irc.freenode.net).
08:17:00 -!- wooby has quit (clarke.freenode.net irc.freenode.net).
08:17:00 -!- GregorR-L has quit (clarke.freenode.net irc.freenode.net).
08:17:01 -!- lament has quit (clarke.freenode.net irc.freenode.net).
08:17:01 -!- lindi- has quit (clarke.freenode.net irc.freenode.net).
08:17:15 -!- puzzlet has joined.
08:17:15 -!- tokigun has joined.
08:17:15 -!- GregorR-L has joined.
08:17:15 -!- wooby has joined.
08:17:15 -!- lindi- has joined.
08:17:15 -!- cpressey has joined.
08:17:15 -!- cmeme has joined.
08:17:15 -!- lament has joined.
08:26:04 -!- GregorR-L has quit ("Leaving").
10:15:59 -!- comet_11 has quit (Read error: 110 (Connection timed out)).
10:28:54 -!- puzzlet has quit ("reboot").
11:53:36 -!- tokigun has quit ("leaving").
12:03:22 -!- puzzlet has joined.
12:30:10 -!- kipple has joined.
12:59:26 -!- CXI has joined.
13:02:26 <pgimeno> cpressey: I agree that ftp hosting can perfectly cope with mere preservation; however the wiki is also a means to publish additional information about the language(s) which would otherwise require downloading files. In that sense, graue's idea about separating the wiki and the files deals with both preservation and additional information (in a too disconnected way for my taste, but it does)
13:33:40 -!- kipple_ has joined.
13:49:19 -!- malaprop has joined.
13:52:11 -!- kipple has quit (Read error: 110 (Connection timed out)).
14:33:54 <kipple_> yay. my website is finally up again! :D
14:34:00 <kipple_> (http://rune.krokodille.com/lang/)
14:47:33 <pgimeno> it seems you finally managed to teach the HD which sectors to skip
14:48:21 <pgimeno> yes, if some rebel sectors appear
14:48:24 <kipple_> I left the last 30 gigs of the drive unpartitioned this time (that is where the problems were)
14:49:23 <pgimeno> hm, it might be a problem of underventilation
14:50:07 <kipple_> I've changed the IDE cable, which was suggested on Seagate's web site
14:51:12 <pgimeno> I've had temperature problems with disks
14:51:42 <kipple_> maybe I should put a thermometer inside the box to check
14:52:48 <pgimeno> (and with cpu's; check http://www.formauri.es/personal/pgimeno/temp/dsc02325.jpg )
14:53:51 <kipple_> the cpu is not the problem. It has never crashed that way (even though it doesn't have a CPU-fan)
14:57:51 <pgimeno> have you noticed the placement of the disk in the shot? there were two disks together before, but it seems that the lack of ventilation caused the temperature to rise to the point where touching the disk could even cause injury
14:58:12 <pgimeno> (plus the fact that each disk raised the other's temperature)
14:58:48 <kipple_> (the others made too much noise, so I removed them)
14:59:05 <pgimeno> hum, my theory doesn't hold very well in your case then
14:59:46 <kipple_> could still be too warm in the cabinet. the lack of a cpu fan could be a problem
15:01:14 <pgimeno> I kind of doubt it but of course if you check it you'll be more confident
15:02:23 <kipple_> maybe. the disk is mounted in a bracket in a 5.25" bay, so it has room on all sides as well.
15:33:06 -!- puzzlet has quit (Remote closed the connection).
16:13:23 <pgimeno> yesterday (more than 12 hours ago anyway) someone said that Martijn van der Heide's work getting permission from authors for distribution on WoS was hardly a huge amount of work... that struck a nerve with me
16:13:48 <pgimeno> http://www.worldofspectrum.org/permits/publishers.html
17:08:54 <wooby> so i was tinkering yesterday with moin moin, and got it working if anyone wants to check it out
17:09:42 <wooby> http://wiki.esolangs.org/
17:23:05 <wooby> i know there are other ones, and i don't want to further divide effort... so i may or may not keep it up
17:23:14 <wooby> in any case moinmoin is nice
17:23:28 <pgimeno> wait until the final decision is taken
17:24:18 <CXI> what was that about the GFDL being restrictive?
17:25:19 <lindi-> CXI: GFDL can have the 'invariant sections'
17:25:22 <malaprop> CXI: it has hassles about immutable portions.
17:25:57 <CXI> oh, sure, but the GFDL as used by wikipedia specifies no invariant sections
17:26:57 <malaprop> It also has anti-DRM requirements.
17:27:46 <malaprop> http://people.debian.org/~srivasta/Position_Statement.html
17:28:15 <CXI> mm, I remember that article
17:28:56 <CXI> the only reason I mention it is that wikipedia compatibility would be nice
17:29:03 <CXI> dual-license, maybe?
17:29:32 <CXI> actually, hmm, that would only work in one direction
17:30:09 <CXI> gragh, licensing is a pain :(
17:32:56 <malaprop> It's a pity Wikipedia has such a bad license, ya.
17:33:11 <CXI> *goes to bed instead*
17:36:39 <lindi-> malaprop: they have "or later" clause there
18:49:50 <kipple_> hey. has anybody seen the BF mandelbrot program by Eric Bosman? It's really cool!
18:50:23 <kipple_> http://www.microlyrix.com/software/bfdev/output.jpg
18:56:10 <kipple_> runs awfully slow in my java interpreter though... should have compiled it....
19:02:02 <pgimeno> kipple_: could you try if my optimizing interpreter makes it run faster?
19:03:46 <kipple_> how could it not.... my interpreter doesn't optimize
19:04:43 <kipple_> I don't have GCC on my win box. Will it compile with Visual Studio?
19:05:02 <pgimeno> don't know, maybe it does with a bit of makefile tweaking
19:06:16 <pgimeno> malaprop: I've just seen your message in lang
19:06:39 <kipple_> ha. I can run it on my linux box. pedro's optimizing compiler on a 187 MHz box vs. kipple's lousy java interpreter on a 1.4GHz box
19:07:40 <pgimeno> actually it just tokenizes (no compiling)
19:08:56 <pgimeno> anyway... I'm also curious about the BF compiler written in BF :)
19:09:10 <pgimeno> where's the BF code doing the output?
19:09:25 <pgimeno> the Mandelbrot output I mean
19:10:48 <kipple_> the java interpreter runs it about 3 times faster than brfd
19:11:05 <kipple_> considering the difference in hardware that's quite good for brfd
19:12:18 <pgimeno> hm, not bad but not as good as I expected
19:13:03 <kipple_> there's more than MHz that counts
19:13:09 <pgimeno> lament: I didn't explain myself, sorry
19:13:25 <pgimeno> I mean where to find the program
19:14:32 <pgimeno> lament: did you write Smallfuck?
19:14:32 <kipple_> http://brainfuck.kicks-ass.net/files/mandelbrot.bf
19:15:32 <pgimeno> I'm interested in whether Smetana can be made Turing-complete
19:15:43 <lament> that's exactly what i made smallfuck for
19:16:16 <pgimeno> I've read about that and apparently the conclusion was that it wasn't
19:16:39 <lament> smetana programs can only have limited "memory" since the size of memory is limited by the size of the code
19:16:59 <lament> but within that constraint a smetana program can emulate a BF machine of arbitrary size
19:17:33 <lament> i.e. it's as "turing-complete" as any physical computer :)
19:17:54 <pgimeno> can I read the whole story somewhere? what I read in the voxelperfect wiki is not accurate it seems
19:18:12 <lament> what does it say there?
19:18:33 <pgimeno> er, want to check yourself?
19:18:43 <pgimeno> basically that some programs don't stop or something
19:18:52 <pgimeno> and that it's shown to not be Turing-complete
19:19:36 <pgimeno> http://esoteric.voxelperfect.net/wiki/Smallfuck
19:21:19 <lament> that just looks like an error in my compiler :)
19:22:53 <kipple_> pgimeno: I've started a test running both interpreters on the same machine :)
19:23:31 * lament tries to figure out how to operate the smallfuck compiler
19:23:35 <pgimeno> I'm looking in the backlog for that link with an optimizing compiler for Linux
19:23:40 <lament> man, when i wrote this i was still in high school :)
19:24:07 <kipple_> you mean this one: http://www.nada.kth.se/~matslina/awib/
19:24:40 * pgimeno bookmarks the link this time
19:27:29 <lament> ummmmmmmmmmmmmmmmmmmmmmm
19:27:33 <pgimeno> a nice aspect of that compiler is its ability to compile itself
19:27:46 <lament> it works perfectly in my compiler
19:27:53 <lament> terminates once it reaches the end of memory
19:28:03 <lament> so the info on wiki is simply wrong
19:28:20 <pgimeno> oh, do you mean an smetana version?
19:28:47 * pgimeno considers taking out smetana from the non-Turing-complete category
19:28:51 <lament> it sets all elements of memory to * and terminates
19:29:13 <lament> well, it's still not turing-complete :)
19:29:22 <lament> there should be a word for this particular type of ability
19:29:30 <lament> turing-complete with a memory constraint
19:29:34 <lament> maybe there even is a word
19:29:43 <lament> it seems a very common situation
19:29:49 -!- calamari has joined.
19:31:33 <pgimeno> yeah, actually bound is not a proper term
19:32:26 <malaprop> And binding already has a definition in languages, so... potential confusion.
19:32:59 <pgimeno> yup, wrong choice on my side
19:33:50 <lament> if you want i can give you the smetana/smallfuck files
19:34:22 <malaprop> lament: Which of the wikis are you editing?
19:34:58 <pgimeno> argh, that requires immediate attention
19:35:48 <pgimeno> I was writing another message to the list on the evolution of the proposals but was having a break
19:36:11 <lament> malaprop: voxelperfect
19:38:49 <pgimeno> well, it's 4 lines total, I don't think it will be annoying
19:38:53 <pgimeno> $ time ../brfd-1.0/brfd awib-1.0rc4.b < awib-1.0rc4.b > awib1
19:39:00 <pgimeno> $ time ./awib < awib-1.0rc4.b > awib2
19:40:29 <pgimeno> it's computing the mandelbrot now
19:42:32 <pgimeno> 6.77 secs... but it seems it doesn't like my terminal
19:42:45 <lindi-> what language is that ".b"? befunge?
19:43:14 <lindi-> maybe i should write fast brainfuck interpreter then
19:43:43 <pgimeno> argh, I'm really dumb... the output was the compiled code rather than the executed one
19:43:55 <kipple_> pgimeno: mandelbrot in 6.77 secs?? impressive
19:44:08 <pgimeno> that was compiling time O:)
19:44:49 <pgimeno> 11 secs will surely make more sense :)
19:45:35 <pgimeno> and yes, the output is a beautiful Mandelbrot set
19:45:46 <kipple_> the mandelbrot or something else?
19:46:37 <kipple_> I don't think brfd will do it in 11 minutes here :)
19:47:47 <kipple_> it's obvious that it isn't cheating, at least :)
19:48:14 <kipple_> you can see it noticably slowing down when it gets to the edges
19:48:50 <pgimeno> yeah, the set itself is the slowest
19:49:11 <kipple_> as the code is not exactly readable, it could have been nothing more than an advanced Hello World
19:49:13 <lament> of course the set itself has to be the slowest
19:49:50 <pgimeno> actually something like a bf compiler in bf is dangerous
19:50:10 <pgimeno> it turns out to be honest but it could have generated a virus
19:50:12 <lament> not if it compiles to bf :)
19:51:38 <kipple_> and the compiled one was 11 secs? wow
19:52:18 <kipple_> brfd is still running here. I estimate it will take about 25 mins...
19:53:05 <kipple_> wonder how long the unoptimizing java interpreter will take....
19:54:20 <kipple_> perhaps I should compile it to binary to get a more fair comparison
19:54:23 <pgimeno> what I'm wondering is how long would it take for a non-optimizing compiler
19:57:33 <pgimeno> calamari: have you tried it? how hard is MediaWiki to set up?
19:57:58 <malaprop> MediaWiki is not hard to set up.
20:06:44 <pgimeno> have you tried it, malaprop
20:13:16 <malaprop> MediaWiki? Yes, I run a couple. None public ATM, tho.
20:16:29 <calamari> pgimeno: nope.. no time. We will just be taking dumps, anyways, though.. right? :)
20:16:59 <pgimeno> how easy is it to rebuild the database?
20:17:13 <pgimeno> that question is for you, malaprop
20:17:15 <malaprop> mysql -u user -p -h host db_name < dump.sql
20:17:38 <calamari> I like MoinMoin better, so I'll be sticking with it (won't be using it for esowiki though)
20:18:06 <calamari> I'm happy to download the mediawiki dumps tho
20:18:08 <pgimeno> given an empty database that's ok, but what if all you want is to update the database with the last changes?
20:18:32 -!- calamari has quit ("Leaving").
20:20:16 <malaprop> A partial update is probably possible, but a drop & reload would be simpler and saner. We're unlikely to ever have a db so large that it takes more than a few minutes to do this.
20:22:16 <malaprop> For an extreme example, the English Wikipedia (http://download.wikimedia.org/) is currently ~36G, and a reload usually runs in under 12h. So it's really really unlikely we'll have any kind of painful downtime.
20:22:42 <pgimeno> so do you think that this scheme is feasible?
20:23:28 <malaprop> Entirely, yes. One of the advantages of MediaWiki is that it's immensely popular and featureful -- so we'll get nearly any feature we want without having to do it ourselves, and there's no worry that the maintainers will disappear.
20:24:05 <pgimeno> I hope that graue cares a bit about the look tho
20:24:35 <kipple_> well, I want a feature to include java applets in the wiki. I think we might have to do that one ourselves...
20:24:46 <pgimeno> am I the only one who finds the left bar distracting?
20:26:08 <pgimeno> kipple_: I don't think that the wiki is the best place to host java apps; a regular server seems to make more sense
20:26:29 <pgimeno> anyway it's probably already done
20:27:07 <pgimeno> calamari's moinmoin server does not have it and I find it cleaner
20:27:15 <kipple_> I think it could do with some changing of its elements, but otherwise it's nice
20:27:29 <kipple_> The language list should be in it, for instance
20:27:50 -!- GregorR-L has joined.
20:31:54 <kipple_> hmm. does that mean that my unoptimizing interpreter will take hours?
20:32:20 <pgimeno> you can try to make an estimation
20:34:08 <pgimeno> so I think that most votes here favor MediaWiki, right?
20:35:03 <pgimeno> wooby went the moinmoin way also with success
20:35:46 <pgimeno> but yes, most people agree on mediawiki
20:35:58 <pgimeno> okay, I'll follow the trend in the message
20:37:28 <pgimeno> were you reading the log or something, Gregor?
20:44:35 <pgimeno> malaprop: so are you offering to make backups?
20:48:03 <pgimeno> er, you offered that in the list; do you want to set up a mirror?
21:06:59 <pgimeno> what's the status of file uploading?
21:26:54 <malaprop> pgimeno: I'm happy to do hosting, mirror, or backup as needed. So if we have a host and want mirrors, I'll be a mirror.
21:27:47 <wooby> i also have this domain, esolangs.org... which i'd point at where we eventually decide the main site should go
21:28:31 <GregorR-L> Dern, this fopen-ing of a web site in PHP is not working right for me >_<
21:28:55 <malaprop> GregorR-L: Does the local install permit fopen_wrapper?
21:29:39 <pgimeno> malaprop: the problem is that so far there's no mirror
21:29:39 <malaprop> erm, is allow_url_fopen, not fopen_wrapper, pardon
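If it helps with the fopen trouble above, here is a minimal sketch that checks the setting malaprop mentions and falls back to the cURL extension when URL wrappers are disabled; the URL is just a placeholder:

    <?php
    // Fetch a remote page, first checking whether allow_url_fopen is enabled.
    $url = 'http://example.org/somepage.html';    // placeholder URL

    if (ini_get('allow_url_fopen')) {
        $html = file_get_contents($url);          // URL wrappers are available
    } elseif (function_exists('curl_init')) {
        // Fallback: use the cURL extension if the host disabled URL fopen.
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $html = curl_exec($ch);
        curl_close($ch);
    } else {
        $html = false;                            // neither method is available
    }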
21:30:02 <pgimeno> wooby: would you try MediaWiki?
21:30:18 <wooby> pgimeno: sure i'll have some time to set it up later tonight
21:30:27 <malaprop> pgimeno: Of voxelperfect? Shall I contact graue and work it out with him?
21:31:13 <GregorR-L> malaprop: Yeah, I use it elsewhere, I just don't know what's causing it to fail in this one situation.
21:31:44 <malaprop> Ah. Were you venting or looking for help? :)
21:32:34 <pgimeno> at the moment I need to complete the message reporting the current status
21:34:18 <pgimeno> so you don't need to do that yet
21:37:56 <kipple_> pgimeno: about the left menu in MediaWiki. You can disable it if you're logged in
21:40:22 <kipple_> I would also suggest changing the default skin.
21:40:25 <malaprop> It is also possible to add skins with very different appearances.
21:40:53 <kipple_> I think the one WikiPedia uses looks best of the 5
21:41:08 <malaprop> kipple_: Yes, that's Monobook.
21:42:04 <kipple_> looks very much like Wikipedia then, but at least it's familiar to most
21:43:00 <pgimeno> "This [Quickbar settings] preference only works in the 'Standard' and the 'CologneBlue' skin."
21:43:30 <kipple_> well, then what's the problem?
21:48:21 <pgimeno> if it only had a bit of a margin... :)
21:48:37 <malaprop> pgimeno: Yeesh, install GreaseMonkey already. :P
21:55:28 <pgimeno> nostalgia seems to be Good enough(tm)
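For the skin discussion above, the default skin is another LocalSettings.php knob; a small sketch, assuming a MediaWiki recent enough to ship MonoBook (option names can vary between versions):

    # In LocalSettings.php: make MonoBook (the Wikipedia-style skin) the site default.
    $wgDefaultSkin = 'monobook';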
22:01:10 <GregorR-L> Grr. I'm trying to make an InterWiki Content system, but I can't seem to fopen URLs, even though I know SF lets you >_<
22:02:02 <GregorR-L> Repeat: "even though I know SF lets you"
22:31:15 <wooby> anyone aware of a language based on the idea of data redirection, like piping?
22:31:29 <wooby> i recall seeing a BF derivative that did something similar
23:07:05 -!- GregorR-L has quit (Read error: 113 (No route to host)).
23:14:57 -!- wooby has quit.
23:36:45 <kipple_> pgimeno: I finished running mandelbrot.b in the java interpreter:
23:37:07 <kipple_> which means your interpreter is about 3.7 times faster
23:48:15 <pgimeno> I've tried my interpreter with the -s option and the speed difference is not significant
23:48:31 <kipple_> what does the -s option do?
23:48:43 <pgimeno> disable optimization (the s stands for slow)
23:49:08 <kipple_> what kind of optimization do you do?
23:49:31 <pgimeno> I don't remember that very well :)
23:49:59 <pgimeno> I think that it optimizes copy operations and things like that
23:50:24 <pgimeno> addition to multiple cells, zero out cells...
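As a rough illustration of the kind of optimizations pgimeno is recalling, here is a generic pre-pass sketch (illustrative, not brfd's actual code): collapse runs of +, -, >, < into counted operations and turn the common [-] idiom into a single clear-cell instruction.

    <?php
    // Generic Brainfuck pre-optimization sketch (not taken from brfd).
    function optimize($src) {
        $ops = array();
        $len = strlen($src);
        for ($i = 0; $i < $len; $i++) {
            $c = $src[$i];
            if ($c === '+' || $c === '-' || $c === '>' || $c === '<') {
                $n = 1;                                    // run-length encode repeats
                while ($i + 1 < $len && $src[$i + 1] === $c) { $n++; $i++; }
                $ops[] = array($c, $n);
            } elseif (substr($src, $i, 3) === '[-]') {
                $ops[] = array('0', 1);                    // "[-]" means: clear this cell
                $i += 2;
            } elseif (strpos('[].,', $c) !== false) {
                $ops[] = array($c, 1);                     // keep other real commands as-is
            }                                              // anything else is a comment
        }
        return $ops;                                       // list of (op, count) pairs
    }

An interpreter can then execute each (op, count) pair in one step instead of walking the source character by character.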
23:51:28 -!- GregorR-L has joined.
23:52:07 <pgimeno> kipple_: it's embarrassing not being able to answer your question :)
23:52:18 <GregorR-L> Slash anybody else who knows if it's possible to open a web page to get its content in Javascript?
23:53:21 <pgimeno> GregorR-L: what do you mean by "get its content in JavaScript"? do you mean get the JS code which it has embedded?
23:53:57 <GregorR-L> No, I mean something like read the HTML from http://www.google.com/
23:54:24 <pgimeno> sure, wget http://www.google.com/ ; less index.html
23:54:49 <kipple_> you mean from another web page?
23:54:58 <GregorR-L> I'm not good at explaining this obviously :P
23:55:18 <pgimeno> oh, do you mean the JS code to *connect* to another page to grab the text?
23:55:42 <GregorR-L> Other than, say, opening it up in an iframe and loading that (which may or may not work)
23:56:44 <pgimeno> to me it sounds like it would be a security risk, say the page is in file://...