00:06:49 -!- GreaseMonkey has joined.
00:45:53 <bsmntbombdood> $ihopes_new_nick: so where's your onoz interpreter?
01:10:46 -!- ab5tract has joined.
01:13:31 -!- BeholdMyGlory has quit ("godnatt!").
01:29:41 -!- Slereah has joined.
01:30:07 -!- Slereah2 has quit (Read error: 60 (Operation timed out)).
01:30:58 -!- unlambda has quit (Read error: 113 (No route to host)).
01:41:47 <lament> http://www.youtube.com/watch?v=BtGlQHAEwVo
01:45:49 -!- kwufo has quit ("Leaving.").
01:47:03 -!- kwufo has joined.
01:56:01 <GregorR> It's sad that I'm feeling good because I made this software fail in the same way on multiple platforms X_X
01:57:07 <kerlo> comex: did you want to compliment me on my prophetic ability?
01:57:58 -!- kwufo1 has joined.
02:04:24 -!- ab5tract has quit.
02:05:12 -!- kwufo1 has quit ("Leaving.").
02:17:00 -!- FireFly has quit ("Later").
02:21:41 -!- kwufo has quit (Connection timed out).
02:28:02 -!- kwufo has joined.
02:59:09 -!- jix_ has joined.
03:03:04 -!- kwufo has quit ("Leaving.").
03:06:35 -!- whoppix has joined.
03:12:16 -!- jix has quit (Read error: 110 (Connection timed out)).
03:26:44 -!- kwufo has joined.
03:44:12 -!- kwufo1 has joined.
03:58:48 <comex> reddit programming (3 clicks -->) ihope
03:59:08 -!- kwufo has quit (Connection timed out).
03:59:37 <kerlo> Oh, you were answering bsmntbombdood's question.
03:59:54 <kerlo> Let me find out what onoz is.
04:00:38 <kerlo> bsmntbombdood: write one. ehird would tell you that it's easy.
04:00:53 <kerlo> (Disclaimer: I don't actually know enough about ehird to truly make that statement.)
04:01:55 <comex> reddit --> banana scheme --> brainhype --> onoz (ihope)
04:02:02 <comex> in the unimplemented category :(
04:02:45 <kerlo> So my Reddit number is more like 4.
04:03:03 <kerlo> Indeed, onoz --> http://esoteric.voxelperfect.net/wiki/User:Ihope127
04:03:06 <kerlo> And http://esoteric.voxelperfect.net/wiki/User:Ihope127 = me.
04:03:17 <comex> 3 clicks to see the name ihope
04:03:24 <comex> and yesterday there was bf joust
04:03:30 <comex> what's with Agorans on reddit?
04:03:40 <kerlo> There are Agorans on reddit?
04:04:17 <comex> s/(?=on)/linked to /
04:04:46 <lament> banana scheme was on reddit?
04:05:03 <kerlo> Well, I have to go. See y'allz.
04:05:05 <comex> it's on /r/programming at the moment
04:05:17 <kerlo> Does RProgrammer have anything to do with that?
04:05:19 <comex> I'd upvote it, but I don't have an account :p
04:05:32 <lament> wow, it's indeed on reddit
04:10:39 <lament> it really belongs on http://www.reddit.com/r/Marijuana/
04:18:43 <kerlo> (Retroactively. It was not a lie when I told it, but it is now.)
04:18:58 <kerlo> I have returned so that I may confess a sin in ##sl4.
04:40:32 -!- kwufo1 has quit (Remote closed the connection).
05:13:18 -!- icefox has quit.
05:46:30 -!- icefox has joined.
07:50:50 -!- upyr[emacs] has joined.
07:59:59 -!- clog has quit (ended).
08:00:00 -!- clog has joined.
08:13:32 -!- SpaceManPlusPlus has joined.
08:13:52 -!- SpaceManPlusPlus has quit (Client Quit).
08:14:59 -!- kwufo has joined.
09:34:45 -!- GreaseMonkey has quit ("Client Excited").
10:11:02 -!- sebbu has joined.
10:13:12 <oklofyug> lament: you have pretty fingers
10:13:42 <oklofyug> i always figured your skin was full of boils
10:18:35 * oklofyug writes his first youtube comment
10:19:51 -!- oklofyug has changed nick to okloflaeg.
10:20:48 -!- ktne has joined.
10:27:53 -!- sebbu2 has quit (Read error: 110 (Connection timed out)).
10:33:48 -!- sebbu2 has joined.
10:47:53 -!- Slereah2 has joined.
10:51:30 -!- sebbu has quit (Read error: 113 (No route to host)).
10:58:34 -!- ktne has quit (Read error: 104 (Connection reset by peer)).
11:00:17 -!- ktne has joined.
11:00:26 <upyr[emacs]> my first `program` brainfuck http://pastebin.com/m469e5a92
11:00:44 -!- Slereah has quit (Read error: 110 (Connection timed out)).
11:15:04 <upyr[emacs]> i want to do a summator for two numbers, but to begin i did that ^
11:28:40 -!- KingOfKarlsruhe has joined.
11:44:58 <okloflaeg> >----< <--> <<< which one is longer? (you'll be so surprised!)
11:46:50 -!- KingOfKarlsruhe has quit (Remote closed the connection).
12:07:37 -!- kwufo has quit (Read error: 110 (Connection timed out)).
12:18:39 -!- Hiato has joined.
12:28:06 -!- FireFly has joined.
12:49:07 -!- KingOfKarlsruhe has joined.
12:51:55 -!- BeholdMyGlory has joined.
12:58:44 -!- Slereah has joined.
13:10:49 -!- Slereah2 has quit (Read error: 110 (Connection timed out)).
13:22:08 -!- KingOfKarlsruhe has quit (Remote closed the connection).
13:50:00 -!- Hiato has quit ("Leaving.").
14:09:57 -!- psygnisfive has quit (Remote closed the connection).
14:15:42 -!- Hiato has joined.
14:22:13 -!- ktne has quit (Remote closed the connection).
14:24:28 -!- BeholdMyGlory has quit ("restaring!").
14:27:50 -!- BeholdMyGlory has joined.
14:30:40 -!- ais523 has joined.
14:57:21 <ehird> <kerlo> I have returned so that I may confess a sin in ##sl4.
14:57:42 <ais523> what is ##sl4 about, anyway?
14:57:52 <ais523> ehird: you are aware that "wat" is technically speaking a spelling error?
14:58:14 <ehird> ais523: ##sl4 is the IRC channel for sl4.org, a mailing list about the technological Singularity
15:01:46 <ehird> but yes, I'm rather wondering what the heck ##sl4 has to do with sins
15:04:40 -!- okloflaeg has changed nick to oklopol.
15:05:09 <oklopol> are you groaning at me, or with me?
15:05:10 <ehird> i hope the singularity doesn't make us all too intellectual for terrible puns.
15:05:14 <ehird> oerjan and oklopol would go out of business.
15:06:16 <oklopol> should probably go bite my cell phone
15:06:19 <ehird> I wouldn't put it past ihope to be a rapidly self-improving strong AI.
15:06:38 <oklopol> well if someone here is, it's him
15:07:05 <ehird> oklopol: or fungot
15:07:06 <fungot> ehird: i really feel like going fast and such. i suggested a holy grail of web applications, and expressions that manipulate them or store to/ load from memory.
15:07:14 <ehird> "i really feel like going fast and such."
15:07:18 <ehird> he's talking about his self-improvement
15:07:45 -!- kar8nga has joined.
15:07:51 <fungot> Available: agora alice darwin discworld europarl ff7 fisher ic irc* lovecraft pa speeches ss wp
15:08:58 <ehird> <fungot> Available: agora alice darwin discworld europarl ff7 fisher ic irc* lovecraft pa speeches ss wp allofsentience
15:08:58 <fungot> ehird: riastradh, i don't think so...
15:15:24 <ehird> Unfortunately for the singularity, we still haven't asked AC how to reverse entropy.
15:15:29 <ehird> maybe AC is the singularity.
15:18:18 <pikhq> I ask *every* computer how to reverse entropy.
15:18:28 <ehird> pikhq: how's that going for you
15:18:29 <pikhq> You never know when it'll come up. ;)
15:19:10 <pikhq> ehird: So far, I've gotten answers ranging from "syntax error" to "Insufficient information."
15:19:39 <ehird> pikhq: Wire your brain up to a serial cable and ask it.
15:21:26 <pikhq> Sorry; I'm waiting for ethernet jacks.
15:26:02 <ehird> I bet google is the singularity.
15:26:15 <ehird> As soon as they try and make it improve its indexing algorithm through the pages it indexes.
15:35:11 <ehird> http://twitter.com/OHHDL
15:43:05 <comex> His Holiness the Dalai Lama had been experiencing some mild discomfort in one of his arms over the last three days. On the advice of his personal physician, His Holiness left Dharamsala early this afternoon and arrived in New Delhi. After undergoing medical tests at Apollo Hospital, His Holiness left the hospital early this evening after having been diagnosed to be suffering from a pinched nerve. Doctors have advised some medication. His Holiness is spending
15:43:06 <comex> the night in New Delhi before returning to Dharamsala tomorrow to resume his normal schedule from Wednesday. - Office of His Holiness the Dalai Lama
15:43:29 <comex> why do I feel the urge to laugh after reading so many 'His Holiness'
15:47:09 -!- alex89ru has joined.
15:49:17 <fizzie> fungot: So, how *do* we reverse entropy?
15:49:18 <fungot> fizzie: how do you " work on a syntactic closures srfi?) address. :p
15:50:27 <ehird> So, work on a syntactic closures srfi, give it the address, and it'll tell you how.
15:55:23 <comex> fungot: so what source is this from?
15:55:25 <fungot> comex: i think i use security by obscurity anyway?) ( allthough, lispme does let you look at all the colors and stuff are authored in latin. like fnord and rain.
15:55:41 <comex> ah, never knew fnord and rain were latin
15:55:45 <ais523> comex: clearly, it won't tell you
15:56:06 <fungot> Selected style: agora (a large selection of Agora rules, both current and historical)
15:56:11 <ehird> comex: greeeeeeeeeeeeeeeeeeen
15:56:16 <ehird> fungot: greeeeeeeeeeeeeeeeeen
15:56:16 <fungot> ehird: a zombie, has the lowest non-zero stain. ( f) the associate director of personnel may declare an interested proposal is
15:56:39 <comex> honestly, fungot would be impressive even if it were written in Python
15:56:40 <fungot> comex: mark awards and penalties allowed by the assessor with respect to entities in the same month.
15:57:56 <ehird> meh, megahal is better :P
16:17:27 <ehird> ais523: I forget, was Vaughan Pratt right in the end?
16:17:47 <ais523> at least, his original claim was based on incorrect data, therefore false
16:18:01 <ehird> this one: http://cs.nyu.edu/pipermail/fom/2007-October/012156.html
16:18:04 <ais523> there followed a rather inconclusive argument after that, in which in the end we agreed that the original statement was ambiguous
16:18:18 <ais523> yep, that one's completely fallacious
16:18:29 <ais523> at least, the argument's correct but the premises are wrong, so it's inapplicable to the situation
16:18:38 <ehird> ais523: but, confirm this:
16:19:05 <ehird> "It is possible, with a 2,3 machine and one sub-turing machine, to make the 2,3 compute something only a turing machine can"
16:19:54 <ais523> yes, I can confirm that
16:20:02 <ehird> that's good enough for me
16:20:12 <ehird> ais523: and thus, it follows that
16:20:22 <ais523> with a reasonable definition of "compute", I think the one in my original proof was good enough but I created a more clear-cut demonstration later
16:20:34 <ehird> "it is possible, with a 2,3 machine and one sub-turing machine with the property that after one use, it self-destructs, to make the 2,3 provide an environment for generating the programs thereforth"
16:20:39 <ehird> that is, you only need a sub-turing machine once
16:20:52 <ais523> well, if you have storage for an infinite amount of data
16:20:59 <ehird> we're in platonic land.
16:21:02 <ehird> that's good enough for me, then
16:21:09 <ais523> in platonic land, correct
16:21:11 -!- Hiato has quit ("Leaving.").
16:21:23 <ehird> ais523: I think it may be more accurate to say that {2,3 + sub turing machine} might be the thing that is TC, though
16:21:41 <ehird> kind of like the chinese box thing, it's the whole system combined that's intelligent
16:21:46 <ais523> I think what came out of this is that turing-completeness was badly defined
16:21:51 <ais523> after all, BF needs an input program to be turing complete
16:21:56 <ais523> and you have to write that program in BF
16:22:10 <ais523> so it's actually BF, plus the process of translating a program into BF, that's TC
16:26:35 -!- Judofyr has quit (Remote closed the connection).
16:32:22 <ehird> ha ha paradoxes are so funny.
16:32:54 <ehird> also, it's not a paradox.
16:33:16 <ehird> it's just psychology
16:35:39 <AnMaster> ehird, no, because if more than half answers no, then the result will actually be more than half, and no will be incorrect
16:35:53 <AnMaster> if more than half answer yes, then it will be incorrect as well
16:36:01 <ehird> only the minority can win
16:36:09 <ehird> but it's not impossible to win
16:36:16 <ehird> it's just weighing up human psychology to find the answer
16:36:43 <AnMaster> that depends on how you interpret it
16:37:11 <ehird> it's not a paradox
16:43:11 * ehird plans out dream language
16:43:26 <whoppix> yay, my underload interpreter is mostly complete.
16:43:35 <ehird> It's a total FP language that has no compilation/runtime distinction, dependent types in the same language to infinite levels,
16:43:42 <ais523> which lang are you writing it in?
16:43:43 <ehird> an extensible syntax and implementation at the base level
16:43:58 <whoppix> I couldn't find any underload implementation on cpan, so I thought i'd write one.
16:43:59 <ais523> I'd be interested to see it
16:44:23 <ais523> I think I have a Perl version of my own lying around somewhere, but it probably isn't very robust
16:44:31 <whoppix> I'll paste it for you, although its not complete yet.
16:44:59 <whoppix> I'll yet have to implement a way to throw in custom callbacks for outputting.
16:45:21 <ehird> ais523 has a 300-character or so Underload interp in perl
16:45:36 <ais523> ah yes, I was golfing it for anagolf
16:45:40 <ais523> it wasn't a robust one, though
16:45:42 <whoppix> http://codepad.org/hYbMoOl2
16:45:48 <ehird> ais523: I can't imagine it being much more verbose
16:45:49 <ais523> just one that was good enough to "legitimately" win
16:45:54 <ehird> even if it was robust
16:46:00 <ehird> 50 lines, max, I'd say
16:46:10 <whoppix> The specification is a bit unclear about how to handle ", so I haven't implemented quoting yet
16:46:13 <ehird> whoppix: wow, that's... thoroughly overengineered.
16:46:14 <ais523> also, mine was terribly inefficient
16:46:18 <ais523> whoppix: don't, nobody else does
16:46:21 <ais523> I should just take that out of the spec
16:46:28 <whoppix> ais523, just a random namespace I threw it into, eclipse is always a bitch about those things.
16:46:32 <ehird> you really don't need a separate parser
16:46:41 <ais523> ehird: you do if you want it to run fast
16:46:54 <ehird> because you never skip past
16:47:00 <ehird> parsing a quote is one-pass
16:47:03 <whoppix> ais523, I'll put it under Language::Underload or so, when I upload it to cpan.
16:47:14 <ehird> the delay in an extra parsing step actually slows it down
16:47:15 <ais523> also, shouldn't those dies be carps?
16:47:45 <ais523> ehird: finding the other end of a heavily-nested (()) can be rather slow
16:47:53 <ais523> especially if there are a huge number of bracketed elements inside it
16:47:56 <ais523> parsing avoids that problem
16:48:00 <ehird> ais523: yes, but parsing before running is identical, unlike in BF
16:48:07 <ehird> except the overhead of an extra pass delays program execution
16:48:22 * ehird writes a perl interp himself to demonstrate
16:48:24 <whoppix> ais523, yepp, i just did this in an hour or so, quick and dirty. also, I could probably improve performance greatly by removing the is_valid_token check
16:48:41 <whoppix> which would then lead to runtime exceptions, rather than compile-time-errors
16:49:01 <whoppix> (not that it matters for most, anyway, since you have to re-compile code at runtime all the time)
16:49:06 <ais523> you could probably speed it up further using a jump hash for the commands
16:49:12 <whoppix> (and not that performance matters at all anyway, its just for fun.)
16:49:27 <whoppix> ais523, a jump hash? what do you mean?
16:49:58 <ais523> { 'a' => \&paranthesize, '^' => \&execute
16:50:10 <ais523> then use the hash rather than an if/else if chain
16:50:15 <whoppix> ah, well, yes, but whatever, I don't care about performance.
16:50:20 <ais523> not sure which is faster when you have so few commands, though
16:50:35 <whoppix> direct hash access would be significantly faster
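[Editor's note: the dispatch-table idea ais523 suggests, plus ehird's point that quotations can be scanned in one pass with no separate parse step, can be sketched compactly. This is an illustrative Python sketch, not whoppix's Perl module or ais523's golfed interpreter; the command semantics follow the Underload spec (~ : ! * a ^ S), and unrecognized characters are simply ignored.]

```python
def run(prog):
    """Tiny Underload evaluator: a dispatch dict for the simple commands
    instead of an if/elsif chain, and one-pass scanning for (...)
    quotations (no separate parse step)."""
    stack, out = [], []

    def scan(src, i):
        # find the ')' matching the '(' at src[i] in a single pass
        depth, j = 1, i + 1
        while depth:
            if src[j] == '(':
                depth += 1
            elif src[j] == ')':
                depth -= 1
            j += 1
        return src[i + 1:j - 1], j  # quotation body, index after ')'

    ops = {
        '~': lambda: stack.append(stack.pop(-2)),                # swap
        ':': lambda: stack.append(stack[-1]),                    # dup
        '!': lambda: stack.pop(),                                # discard
        '*': lambda: stack.append(stack.pop(-2) + stack.pop()),  # concat
        'a': lambda: stack.append('(' + stack.pop() + ')'),      # enclose
        'S': lambda: out.append(stack.pop()),                    # output
    }

    todo = [prog]  # fragments of program text still to execute
    while todo:
        src, i = todo.pop(), 0
        while i < len(src):
            c = src[i]
            if c == '(':
                lit, i = scan(src, i)
                stack.append(lit)
            elif c == '^':
                # eval: run the quoted program, then the rest of this one
                todo.append(src[i + 1:])
                todo.append(stack.pop())
                break
            else:
                if c in ops:        # anything else is ignored
                    ops[c]()
                i += 1
    return ''.join(out)
```

For instance, run("(:aSS):aSS") reproduces the classic Underload quine, and run("((A)S)^") exercises eval on a quotation.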
16:50:53 <ais523> my compiler worked completely differently
16:51:02 <ais523> let me try to find a link to it
16:51:10 <whoppix> ais523, that'd be interesting to look at.
16:51:33 <whoppix> my first thought was to create an EBNF for some sort of higher-level language, which I could then compile down to underload
16:51:44 <whoppix> but giving it a second thought, I think I'd rather compile it down to brainfuck
16:51:48 <ais523> http://golf.shinh.org/reveal.rb?Underload+interpreter/ais523/1202246125&pl
16:52:01 <ais523> sorry about the lack of whitespace, I was going for a length record
16:52:09 -!- Hiato has joined.
16:52:17 <ais523> run that through B::Deparse and it should look a lot nicer
16:52:21 <whoppix> ais523, as you can see, I'm not :P
16:52:47 <ais523> whoppix's link is more readable perl
16:52:56 <ais523> also, I did all sorts of cheating things like $/=$]
16:53:05 <ais523> which breaks if Perl's version number happens to be in the input Underload source code
16:53:10 <ais523> but it's a few chars shorter than undef $/
16:54:15 <whoppix> i really think though, I should do something with the quotes, it wouldn't be much work, since I drag in Text::Balanced anyway, and the specification says something about quoting.
16:54:26 <whoppix> (but does not really define where or how to use it)
16:54:33 <ais523> yes, that's because nobody ever did
16:54:50 <ais523> low Underlambda tiers won't have it
16:55:02 <ais523> (Underlambda lowest tier is a "fixed" version of Underload)
16:55:15 <AnMaster> ais523, what about the perl winner, just 34 chars?
16:55:22 <ais523> AnMaster: it was cheating
16:55:35 <ais523> it memorised the outputs that the test was looking for, and pasted one or the other back
16:55:41 <ais523> whoppix: no, unlambda is quite different
16:55:50 <whoppix> ais523, ok, I don't know underlambda.
16:56:07 <ais523> neither does anyone else
16:56:10 <ais523> it's my vaporware language
16:56:24 <ais523> which will one day not be vaporware, honest
16:56:45 <whoppix> ais523, so underlambda can be compiled to underload, or something like that?
16:56:55 <ais523> whoppix: yes, and compiled from Unlambda
16:57:05 <ais523> although I want it to be compilable to and from anything without too much difficulty
16:57:14 <ais523> it's invented as an intermediary language to compile esolangs via
16:57:23 <whoppix> ais523, that sounds interesting, did you make a spec or implementation?
16:57:26 <ais523> anyway, http://golf.shinh.org/reveal.rb?Underload+interpreter/ais523(genuine)/1202731346&sed is another of my Underload interps
16:57:33 <ais523> whoppix: I change the spec frequently
16:57:45 <ais523> and I have an implementation I keep more or less in synch with the spec
16:57:53 <ais523> I'll put it on esolang when it's ready, which is not yet
16:58:15 <ais523> whoppix: regexes tend to be shorter than anything else for golfed esolang interps, I find
16:58:28 <ais523> I don't use them so much in larger projects
16:58:49 <whoppix> I never really played perl golf.
16:59:07 <ais523> http://golf.shinh.org/reveal.rb?Underload+interpreter/yshl/1201872465&ps seems to be a genuine Underload interp in Postscript
16:59:32 <ais523> and http://golf.shinh.org/reveal.rb?Underload+interpreter/hinoe(mugoi)/1202159887&c is a crazily short one in C, but it doesn't always work
16:59:41 <ais523> as far as I can tell, it just uses memory without allocating it, or something
17:00:12 <whoppix> I haven't really done a lot of tests with mine yet, but it computes fibonacci sequences just fine.
17:01:39 <whoppix> I think I'll put that module into my bot and make a !underload command. It understands about 20 languages now.
17:01:53 <ais523> how many are esolangs?
17:02:00 <ais523> we could do with an egobot replacement
17:02:14 <whoppix> ais523, depends on whether you count perl as esolang :D
17:02:41 <ais523> Perl in general isn't an esolang; various restrictions and modifications of it (such as golfed Perl) are
17:02:58 <whoppix> it supports perl, python, ruby, javascript, J, haskell, lua, that common kind of stuff, as well as perl6 (pixie/rakudo etc) farnsworth, and now underload
17:03:16 <whoppix> farnsworth doesn't work quite right yet
17:03:24 <ais523> mostly non-esolangs, then
17:03:34 <ais523> could you bring it in here so we could test?
17:03:38 -!- Judofyr has joined.
17:03:41 <ais523> actually, better not, I'd just get it into a botloop with fungot
17:03:41 <fungot> ais523: an entity, unless the term " possess" and " owner" are unambiguous synonyms for " off hold" are synonymous. a player you name must play or incur a debt of the
17:04:25 <whoppix> ais523, its not irc, but rather in a little chatnetwork that me and a friend made on our own, with a JSON-based protocol, TLS encryption, and a lot of other stuff. I'm writing a GTK+ client for it, currently, as well as maintaining the perl network libraries to access it.
17:04:59 <AnMaster> whoppix, make it ignore fungot?
17:05:00 <fungot> AnMaster: highest point total for the ambassador is authorized to perform a certain action " by paradox if e is allowed to stand. as soon as possible after
17:05:04 <whoppix> we made it just for fun, and its still beta, but it has a lot of interesting features, as for example writing latex into the chat.
17:05:19 <whoppix> so that you can post formulas, sheet music, stuff like that
17:05:30 <AnMaster> ais523, btw since I know you use emacs a lot, have you ever used pymacs?
17:05:54 <whoppix> oh, and its going to support PGP for private conversations.
17:06:32 <ehird> whoppix: I wrote my own underload that uses a jump table and has no pre-parsing step and is 100 lines shorter than yours
17:06:44 <ehird> (and checks the stack size too)
17:06:57 <ehird> it's an OO module, too
17:06:58 <AnMaster> what about reusing something like that IM encryption that offered deniability(sp?) too
17:07:05 <ehird> you can run multiple programs, check the stack, and replace the outputter
17:07:18 <whoppix> ehird, bet it doesn't even pass the cpan kwalitee requirements :P
17:07:44 <ehird> about as much as yours is
17:07:56 <whoppix> yeah yeah, I know, its just for fun anyways.
17:08:43 <whoppix> Hm, I could implement a max stack size option.
17:08:57 <ehird> that's easy in mine, too
17:09:53 <whoppix> not that that would be a useful feature, but anyway.
17:10:05 <AnMaster> irritating, if I open something in firefox and it asks to confirm cookie, (and blink the process bar once), when I switch to it the "confirm cookie dialog" ends up behind the main firefox window
17:10:10 <AnMaster> sure non-modal dialogs are good, but...
17:10:47 <SimonRC> Windows loves to put pop-up balloons behind the start bar where I can't see them grr
17:11:27 <AnMaster> and it doesn't happen with konqueror, only with firefox
17:11:35 <ehird> simonrc is referring to a related but different situation
17:11:40 <ehird> we all know you don't use windows goddammit
17:11:41 <SimonRC> whoppix: I have my bar on the LHS
17:11:54 <AnMaster> ehird, was just clarifying what happened to me
17:12:43 <AnMaster> well that is what LHC is. No idea about LHS
17:15:15 <AnMaster> actually when I google define:LHC it seems like for once there is a TLA that is not used for more than one thing
17:15:46 <AnMaster> (oh the irony, TLA is a TLA with more than one meaning....)
17:16:51 <SimonRC> I recently discovered that the Java typesystem is terribly useful when one is trying to set up mocks using reflection.
17:17:12 <SimonRC> you just end up telling it to bugger off
17:17:42 <SimonRC> and compilers disagree about what is allowed grr
17:22:40 -!- KingOfKarlsruhe has joined.
17:25:04 <ehird> my underload interp works
17:25:08 <ehird> I only had one bug, and that was a missing char.
17:25:21 <ehird> it's also fast, and short, and robust, what point was I trying to prove again? XD
17:26:03 <ehird> also, you can control how many instructions can run, with (...) counting as 1 instruction
17:26:08 <ehird> and how large the stack can be
17:26:30 <ehird> and you can control where the outputter goes, and run a program on a custom stack
17:26:50 <ehird> ah, ^ has a small bug
17:30:17 -!- oerjan has joined.
17:32:48 -!- upyr[ema` has joined.
17:35:20 <whoppix> Off to the concerts, see you around
17:35:51 <oerjan> your ears will be toast, though
17:36:26 <oerjan> * oklofyug writes his first youtube comment
17:37:37 <oklopol> yes, i wanted to join the stereotype
17:39:56 <oklopol> that's one sinister character mister.
17:40:00 <oerjan> interesting, the virus scanner finally managed to start a scheduled scan
17:41:15 <ehird> http://www.youtube.com/watch?v=BtGlQHAEwVo
17:41:29 -!- ais523 has quit ("http://www.mibbit.com ajax IRC Client").
17:41:44 <ehird> LAMENT REVERSES HIS AGE
17:42:37 <oerjan> i don't want to reverse my age. i think oklopol agrees.
17:43:07 <oerjan> with the maybe being dead and all
17:44:06 <oerjan> <ehird> but yes, I'm rather wondering what the heck ##sl4 has to do with sins
17:44:30 <oerjan> i think kerlo must have made an AI without ensuring it was friendly
17:45:35 <oerjan> fortunately it decided earth was too boring to care about, so went to conquer the andromeda galaxy instead
17:46:55 <oerjan> <ehird> i hope the singularity doesn't make us all too intellectual for terrible puns.
17:47:23 <oerjan> of course not. but the new terrible puns will be incomprehensible to who we are now.
17:47:53 -!- upyr[emacs] has quit (Read error: 110 (Connection timed out)).
17:48:29 <oerjan> also, fart jokes will be replaced with nebula jokes
17:49:14 <oerjan> and jokes about the methane creatures on Titan
17:49:50 <oerjan> who will protest that they are not actually smelly
17:51:59 <oerjan> <ehird> Unfortunately for the singularity, we still haven't asked AC how to reverse entropy.
17:52:21 <oerjan> someone asked that billions of years ago. it designed us as the answer.
17:52:37 <oerjan> also, we really shouldn't turn on LHC.
17:52:59 -!- upyr[ema` has quit (Remote closed the connection).
17:53:04 <Slereah> Hey, it's my internship >:|
17:53:09 -!- KingOfKarlsruhe has quit (Remote closed the connection).
17:53:14 <SimonRC> oerjan: heh, I was listening to that last night too
17:53:17 <Slereah> Without the LHC, I ain't getting ma master!
17:54:04 <oerjan> note those comments were connected, btw
17:54:40 <oklopol> slerry what shall ya do there?
17:54:40 <SimonRC> oerjan: http://www.bbc.co.uk/programmes/b007jwp4
17:54:51 <oerjan> "I've _got_ to destroy the world, otherwise I won't graduate"
17:55:48 -!- upyr[emacs] has joined.
17:55:56 <oklopol> oerjan: i might want to reverse my age shortly though, you know, retry.
17:57:41 <oerjan> oklopol: you want to go back to diapers?
17:57:42 <ehird> oklopol: your jew tube profile says you're 21
17:57:59 -!- ktne has joined.
17:58:16 * oerjan had the impression oklopol was 19
17:58:20 <ktne> anyone here who actually understands how chicken scheme implements call/cc?
17:58:28 <oklopol> i probably registered while i was underage?
17:58:46 <oerjan> ktne: ehird, possibly?
17:58:54 <ktne> oerjan: well, is he around?
17:58:57 <oklopol> and didn't know how youtube works regarding underage + viewing big-boy vids
17:59:03 <ktne> ehird: i kept reading that
17:59:08 <ehird> and all functions therefore get the continuation as the last argument
17:59:13 <ktne> ehird: i still don't get how call/cc works
17:59:19 <ehird> in the implementation:
17:59:33 <ktne> ehird: but what about local variables?
17:59:33 <ehird> primitive-function(f, k) { call(f, arguments=[k], continuation=k) }
17:59:43 <ehird> read the cheney on the mta paper, it explains it
17:59:49 <ktne> i've read that numerous times :)
17:59:57 <ktne> i just don't get it :(
17:59:58 <ehird> not hard enough :-)
18:00:03 <ehird> read the example compilation
18:00:06 <ktne> i understand what it does
18:00:14 <ktne> except i don't really understand how tail calls work
18:00:22 <ktne> how is that there is no stack explosion with this method?
18:00:32 <ktne> since the stack frames are still pushed on each function call
18:00:33 <ehird> because you clear the stack periodically
18:00:40 <ehird> with setjmp/longjmp
18:00:40 <ktne> so it does explode
18:00:44 <ehird> read the example compilation
18:00:52 <ehird> ktne: no, tail recursion works forever
18:00:53 <ktne> well i understand what happens when you fill up the stack
18:01:01 <ehird> read the example compilation
18:01:12 <ehird> you GC, then clear the stack
18:01:19 <ktne> i understand that
18:01:21 <ehird> read the example compilation
18:01:28 <ehird> it explains all this
18:01:31 <ktne> but then the tailcall will trash memory
18:01:36 <SimonRC> the idea of a tail call is that instead of "call foo; return" you do "goto foo"
18:01:55 <ktne> SimonRC: i understand that, but this method seems to use normal C functions
18:02:06 <ktne> the tailcall continues filling up the stack
18:02:16 <ktne> ehird: i've read that
18:02:29 <ehird> it answers your questions...
18:02:35 <ktne> as a detail, why doesn't it use the paging mechanism to detect overflow?
18:02:49 <ehird> it can if you want
18:03:16 * oerjan somehow wonders what Dick Cheney has to do with functional programming
18:03:35 <ktne> ehird: but will the stack grow if you have a tailcall? that's what i'm asking
18:03:48 <ktne> or will the stack pointer stay constant?
18:03:50 <ehird> ktne: yes but then it'll be emptied periodically
18:03:59 <ktne> that was the whole thing
18:04:01 <ehird> so tail calls still work
18:04:07 <ktne> so basically it does trash your memory
18:04:18 <ehird> read the example compilation!!!
18:04:19 -!- KingOfKarlsruhe has joined.
18:04:21 <ktne> well it keeps allocating
18:04:25 <ktne> until it fills up
18:04:29 <ktne> then it clears
18:04:41 <ehird> besides you can do tail call optimization with it anyway
18:04:52 <ehird> so it isn't even an issue if it was an issue which it wasn't
18:04:55 <ktne> so is it possible?
18:05:21 <ktne> to use in-place execution
18:05:37 <ktne> because this method, naively implemented, still walks around the stack
18:05:44 <ktne> it's just that it clears the stack periodically
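[Editor's note: a rough illustration of why the periodic stack clearing loses no work. In CPS every call is a tail call, so when the stack budget is spent you can abandon all pending frames and resume from a single saved thunk; the continuation chain lives in heap-allocated closures. This Python sketch mimics Chicken's setjmp/longjmp bail-out with an exception; StackFull, LIMIT, call, and run are made-up names for the demo, and real Cheney on the MTA additionally allocates closures on the C stack and garbage-collects them at the bail-out, which this sketch does not model.]

```python
class StackFull(Exception):
    """Carries a pending thunk back to the driver (stand-in for longjmp)."""
    def __init__(self, thunk):
        self.thunk = thunk

LIMIT = 200  # pretend C-stack budget; an arbitrary number for the demo
depth = 0

def call(f, *args):
    # Every CPS call goes through here.  When the budget is spent we
    # abandon ALL pending frames: in CPS each call is a tail call, so
    # no abandoned frame holds unfinished work -- nothing is lost.
    global depth
    depth += 1
    if depth > LIMIT:
        raise StackFull(lambda: f(*args))
    return f(*args)

def fact_cps(n, k):
    # factorial in continuation-passing style: the continuation k is an
    # explicit argument, so no call ever needs to return to its caller
    if n == 0:
        return call(k, 1)
    return call(fact_cps, n - 1, lambda r: call(k, n * r))

def run(thunk):
    # the driver: "clear the stack" and resume, like Chicken's GC + longjmp
    global depth
    while True:
        try:
            depth = 0
            return thunk()
        except StackFull as e:
            thunk = e.thunk

result = run(lambda: fact_cps(1000, lambda r: r))
```

Run directly, fact_cps(1000, ...) would typically blow past Python's native recursion limit; with the bail-out the real stack stays bounded while the chain of k closures carries all state across resets. Memory does keep getting allocated until a clear, which is exactly the "keeps allocating until it fills up, then clears" behavior discussed above.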
18:06:00 <ehird> so don't implement it naively
18:06:23 <ktne> well this was my problem
18:06:29 <ktne> because it doesn't look efficient at all
18:07:04 <ktne> it looks horrible if you don't implement tailcall optimisations
18:07:21 <ktne> because the stack pointer keeps walking around
18:07:25 <ehird> look, you ask for suggestions on implementing a dynamic language functionally quickly
18:07:30 <ehird> I tell you about cheney on the mta, point you to chicken
18:07:32 <ehird> all you do is complain
18:07:37 <ehird> "doesn't look efficient to me" etc etc
18:07:38 <ktne> no, i do not complain
18:07:45 <ktne> it's just that you told me that it's there
18:07:51 <ktne> and it wasn't there
18:08:21 <ktne> ok, but what about call/cc
18:08:26 <ktne> that definitely isn't there
18:08:44 <ktne> how do you save your current continuation
18:08:49 <ktne> i understand how you pass it
18:08:53 <ktne> but how do you save it
18:08:57 <ktne> it's not there
18:09:05 <ehird> you're not making any sense
18:09:38 <ktne> this paper makes no sense
18:09:45 <ktne> it's just a quick inaccurate description
18:10:02 <ehird> that's rather accurate, accurate enough for a computer at least
18:10:13 <ktne> the problem with the paper is that it's too terse
18:10:21 <ktne> it doesn't really tell anything except that stack trick
18:10:24 <ehird> and the example compilation
18:10:43 <ktne> maybe i'm dumb :(
18:11:32 <ehird> just read chicken's source
18:12:59 <ktne> i'm opening a project right now
18:13:58 -!- alex89ru has quit (Remote closed the connection).
18:19:03 <ktne> btw, is chicken scheme state-of-the-art?
18:19:12 <ktne> or is it just a fast enough implementation?
18:20:19 <ktne> i read that chez scheme is fastest implementation available
18:23:07 <ehird> chicken scheme is pretty fast
18:23:12 <ehird> chez is the fastest but is $$$$$$$$
18:23:29 <ktne> i'm asking this because i'm designing my own language
18:23:44 <ktne> and i'm not sure whether to add continuations or not, they are very powerful
18:24:00 <SimonRC> maybe some kind of restricted continuation?
18:24:00 <ktne> but the naive way i can implement them is so slow
18:24:03 <ehird> Well, you've made closures awful for the sake of speed, I don't think continuations are worth it.
18:24:11 <ehird> Of course, you're just building C here.
18:24:25 -!- KingOfKarlsruhe has quit (Remote closed the connection).
18:24:57 <ktne> ehird: i'm looking for something "possible if you want" instead of clean, and a keyword for those closure vars is not that bad i think
18:25:14 <ktne> especially since you gain code readability too
18:25:19 <ehird> it is, i would just ignore closures altogether if forced to program in such a language
18:26:50 <ktne> what is the name of the method specified in "cheney on the mta" paper?
18:27:03 <ehird> Cheney on the mta.
18:27:17 <ehird> ktne: also google: Cons should not cons its arguments
18:27:32 <ktne> ok, whatever with this method
18:27:37 -!- kwufo has joined.
18:27:45 <ktne> you can allocate all closure variables on stack if their size is known
18:27:56 <ktne> or even if their size is variable
18:28:07 <ehird> that's what the example compilation does
18:28:08 <ktne> except if they are very large, in which case you have to allocate them in the heap
18:28:24 <ktne> so i guess i could get clean closures at no cost
18:28:53 <ktne> ok, so i could remove that limitation
18:29:31 <ktne> can you provide me a hint on where to start looking in chicken scheme?
18:29:43 <ktne> i'm looking for the call/cc implementation, the place where the current continuation is saved
18:29:51 <ehird> ktne: here's how you implement call/cc
18:30:12 <ehird> (define (call/cc f #k) (call-with-specified-continuation (f #k) #k))
18:30:22 <ehird> where #k is the continuation argument passed to the end of all compiled functions
18:30:31 <ehird> as you can tell, that call/cc is O(1).
18:30:44 <ehird> ktne: http://en.wikipedia.org/wiki/Continuation-passing_style
18:31:01 <ehird> you transform the input program to CPS
18:31:03 <ktne> do languages that do not use CPS
18:31:07 <ktne> have to copy the stack?
18:31:12 <ktne> or how do they save the continuation?
18:31:22 <ehird> that's why you use CPS :-)
18:31:46 <ehird> a bonus of CPS: since you make tons and tons of closures for the continuation, you're forced to make your closures really fast
18:31:50 <ehird> which is a big win for closure use
18:32:39 <ktne> ok, thanks, i have to do some work for 15min now
18:33:34 <ktne> i was confused about copying the stack
18:33:49 <ehird> yeah, this is all pretty tricky until you just "get" it
18:34:03 <ehird> CPS + cheney on the mta reduces the amount of places you have to optimize, though
18:34:13 <ehird> cps lets you not optimize continuations, focusing instead on closures
18:34:27 <ehird> cheney on the mta lets you get really fast calls, focusing your optimization instead on the gc
18:34:41 <ehird> so in the end, most optimization boils down to closures and the gc
18:34:53 <ehird> and both have quite a bit of literature on optimizing them
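A minimal sketch of the CPS idea ehird describes above, in Python rather than Scheme (the names `fact_cps`, `call_cc`, and `product` are illustrative, not taken from Chicken or any real compiler): in CPS every function takes an extra continuation argument `k` and "returns" by calling it, so the current continuation is always already a first-class value, and call/cc just hands it over in O(1).

```python
def fact_cps(n, k):
    # factorial in CPS: nothing ever "returns"; results are passed to k
    if n == 0:
        k(1)
    else:
        fact_cps(n - 1, lambda r: k(n * r))

def call_cc(f, k):
    # call/cc in CPS: the current continuation k already exists as a
    # value, so we just reify it and hand it to f -- no stack copying
    f(lambda v, _k2: k(v), k)

result = []
fact_cps(5, result.append)        # result becomes [120]

# using call_cc to escape early: the product short-circuits on the first 0
def product(xs, k):
    def body(escape, k2):
        def go(ys, k3):
            if not ys:
                k3(1)
            elif ys[0] == 0:
                escape(0, k3)     # invoke the outer continuation: abort,
                                  # all pending multiplications are skipped
            else:
                go(ys[1:], lambda r: k3(ys[0] * r))
        go(xs, k2)
    call_cc(body, k)

out = []
product([3, 0, 4], out.append)    # out becomes [0]
```

Note that because languages without native CPS have no such ready-made continuation value, they must instead copy or segment the stack to implement call/cc, which is the point ehird makes about why CPS is used.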
18:36:55 -!- Somebody123 has joined.
18:48:41 -!- KingOfKarlsruhe has joined.
18:52:36 -!- kar8nga has quit (Read error: 110 (Connection timed out)).
19:01:08 -!- kar8nga has joined.
19:04:37 <ktne> is it possible to implement retriable exceptions?
19:04:52 <ehird> they're continuations
19:05:06 <ehird> the exception handler is a continuation, and if you pass a continuation at the point of the start of the try block to them,
19:05:09 <ehird> they can jump back in
19:05:15 <ehird> isn't it nice how everything reduces to continuations :P
19:05:29 <ktne> that's why i would like to have them
19:05:34 -!- Somebody123 has quit ("Leaving.").
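The retriable-exception idea ehird sketches above can be made concrete in the same CPS style (all names here, `try_retriable`, `flaky`, etc., are hypothetical): the handler receives the continuation of the *start* of the try block, so instead of only unwinding it can jump back in and re-run the body.

```python
def try_retriable(body, handler, k):
    # 'retry' IS the continuation at the start of the try block:
    # calling it re-enters the body with the same success/failure setup
    def retry():
        body(k, lambda err: handler(err, retry, k))
    retry()

# a flaky computation: fails the first two times, then succeeds
attempts = []
def flaky(ok, fail):
    attempts.append(1)
    if len(attempts) < 3:
        fail("not yet")
    else:
        ok(42)

log = []
def handler(err, retry, k):
    log.append(err)
    retry()          # jump back to the start of the try block

result = []
try_retriable(flaky, handler, result.append)
# result becomes [42] after two retries; log holds the two errors
```

A non-retriable handler would simply call `k` (or some other escape continuation) instead of `retry`, which is why everything here reduces to continuations.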
19:07:07 <ktne> how is cheney on the mta compatible with optimized tailcalls?
19:09:23 <ehird> just do it like normal
19:11:38 <ktne> but what happens if you have a try block inside a tailcall-optimized function?
19:11:54 <ktne> how would the exception be thrown?
19:12:22 <ktne> because there are missing stack frames
19:12:27 <ktne> (optimized away)
19:12:35 <oerjan> obviously inside a try block is not a tail position
19:13:05 <ktne> hmm, let me think
19:13:34 -!- Corun has joined.
19:13:54 -!- Corun has quit (Client Quit).
19:13:59 <ktne> let's suppose we use an obtuse fibonacci implementation
19:14:31 <ktne> fib(2) throws an exception
19:14:54 <ktne> and if that exception is caught then you return the precomputed fib(2) constant
19:15:13 <ktne> that indeed appears not to be a tailcall
19:15:46 <ktne> because you must execute the code that pops the exception handler from the stack before "returning"
19:15:55 <ktne> but then, what about a loop?
19:16:02 <ktne> a loop that has a try block inside
19:16:07 -!- Corun has joined.
19:16:56 <ktne> for(i=i;i<n;i++) { try { ... } catch { ... } };
19:18:36 -!- KingOfKarlsruhe has quit (Remote closed the connection).
19:21:49 -!- KingOfKarlsruhe has joined.
19:22:45 <oerjan> but it's a bit too small to see if there's a try block inside
19:23:40 -!- sebbu has joined.
19:28:05 <oklopol> great way to hide you're not actually executing that
19:28:40 <oklopol> in a hypothetical language that doesn't initialize variables to anything coherent automatically that is
19:29:52 <oerjan> in a very hypothetical language, couldn't i = i cause a crash? :D
19:30:46 <SimonRC> if you try to evaluate that i, you'd get a <<<loop>>> error
19:30:47 <oerjan> er i was assuming C-like
19:31:36 <SimonRC> how about if i is an uninitialised reference?
19:31:44 <kerlo> i = i: set i to a fixed point of the function returning i given i.
19:32:06 <oerjan> something that doesn't initialize, and in which assignment can look at contents
19:32:29 <ktne> i meant i=0 :)
19:32:39 <kerlo> It's easy to prove that the identity function has a fixed point because it's a rotation. :-P
19:32:40 <oerjan> does C++ allow this? i don't know it
19:34:52 -!- sebbu2 has quit (Read error: 110 (Connection timed out)).
19:36:01 <oerjan> a very twisted concept
19:36:11 <kerlo> For any given axis, the identity function is a rotation by 0 about that axis.
19:38:47 <ktne> is it possible to optimize loops such as the above one in scheme?
19:39:31 <oerjan> well the goto the beginning isn't actually inside the try block...
19:39:47 <oerjan> unless you use some escape command
19:40:04 <oerjan> in which case that should probably break out of the try block too
19:40:08 <ktne> " goto the beginning" i'm not sure i understand this
19:41:24 <ktne> i expect a lot of code to be written in this form in my language
19:41:30 <ktne> mostly for array processing
19:42:29 <oerjan> ktne: if you implement loops as tail recursion, the try block there won't actually be _part_ of the recursion
19:42:54 <oerjan> just something done before recursing
19:43:06 <ktne> i'm not sure i understand it
19:43:19 <ktne> i will have to translate that loop in CPS form first
19:51:05 <ktne> is there a good paper on transforming C-like languages in CPS form?
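A hand-made CPS translation of ktne's loop, `for(i=0;i<n;i++) { try { ... } catch { ... } }`, illustrating oerjan's point: once the loop becomes tail recursion, the try block is just something done *before* the tail call, so the handler is already gone by the time the recursive call happens and tail-call optimization still applies. This is an illustrative sketch, not output of any real CPS transformer (and Python itself does not do TCO; a real compiler would emit these calls as jumps).

```python
trace = []

def loop(i, n, k):
    if i >= n:
        k(trace)                # loop finished: resume whatever follows it
        return
    def after(_):
        loop(i + 1, n, k)       # the tail call: the try/catch is finished here
    def handler(err, resume):
        trace.append("caught " + err)
        resume(None)            # catch block done; fall through to after()
    body(i, after, lambda err: handler(err, after))

def body(i, ok, fail):
    # the try block: pretend odd i's throw an exception
    if i % 2:
        fail("odd " + str(i))
    else:
        trace.append("ok " + str(i))
        ok(None)

done = []
loop(0, 4, done.append)
# trace becomes ["ok 0", "caught odd 1", "ok 2", "caught odd 3"]
```

The handler is installed fresh for each iteration and consumed before `after` runs, so no per-iteration state accumulates: the "missing stack frames" ktne worries about were never needed, because the exception continuation is a closure, not a stack frame.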
19:55:56 -!- alex89ru has joined.
20:03:47 -!- mib_mksvta has joined.
20:05:57 <mib_mksvta> Current holdup: how to nicely load/unload plugins at runtime (they're written in haskell)
20:06:26 <oerjan> well there's a hackage library iirc
20:08:13 <mib_mksvta> oerjan: it just calls ghc, doesn't it?
20:08:19 <mib_mksvta> I could shell out to ghc, then use something like dlopen.
20:08:44 <oerjan> it also uses some internal module loading stuff i think
20:09:02 <mib_mksvta> yeah, dlopen won't get the instance of Plugin which is important
20:09:41 <oerjan> anyhow, i haven't looked much into it
20:10:10 <mib_mksvta> http://www.haskell.org/ghc/docs/latest/html/libraries/unix/System-Posix-DynamicLinker.html
20:10:32 -!- Hiato has quit (Read error: 60 (Operation timed out)).
20:11:47 <mib_mksvta> This library is distributed under the terms of the LGPL:
20:12:31 <mib_mksvta> http://www.haskell.org/ghc/docs/latest/html/libraries/ghc/index.html eek
20:13:02 <oerjan> no, ehird is ehird. you are an EVIL IMPOSTER.
20:13:34 <oerjan> taking advantage of ehird's awayitude
20:14:57 <mib_mksvta> http://www.haskell.org/ghc/docs/latest/html/libraries/ghc/GHC.html Promising
20:15:09 -!- KingOfKarlsruhe has quit (Remote closed the connection).
20:16:27 <SimonRC> GregorR: how long have you been reading Spamusement?
20:18:22 <GregorR> Idonno, I think I read through them first shortly before stevenf stopped, then somebody recently sent me a link to a comic on the forums and so I started reading those :P
20:19:24 <mib_mksvta> kerlo: what was the sin you told ##sl4
20:19:31 * kerlo mumbles something about sparse error correction codes
20:19:49 <kerlo> mib_mksvta: believing the conclusion more strongly than the premises.
20:20:27 <kerlo> For example: "It is probably raining. If it is raining, Daniel is probably carrying an umbrella. Therefore, Daniel is almost certainly carrying an umbrella."
20:24:15 <kerlo> If understanding AI is possible, Friendly AI is almost certainly possible. Understanding AI is probably possible. Therefore, Friendly AI is almost certainly possible.
20:25:17 <oerjan> puck up the fun, i say
20:26:43 <mib_mksvta> I'm pretty sure kerlo will be responsible for the singularity which is why I said that
20:27:08 <mib_mksvta> so kerlo, make sure stupid puns stay intact. also, omit oklopol from the singularity. he's hilarious enough as is. we need him post-singularity, you see, otherwise the world will collapse. of lack of oko.
20:27:09 <SimonRC> GregorR: I have been known to link to their forums, so maybe me
20:27:17 <oerjan> kerlo: it follows by a similar argument to yours
20:27:23 <mib_mksvta> now give fungot self-improvement routines
20:27:24 <fungot> mib_mksvta: that it is. if this still does not possess less than the maximum
20:27:30 <GregorR> SimonRC: Who are you on the forums?
20:27:39 -!- Slereah2 has joined.
20:27:59 <GregorR> SimonRC: (And no, "somebody" != "I don't remember who", I do remember who it's just not somebody relevant to this audience)
20:28:13 <kerlo> Something like this: I am possibly the smartest person in the world. It's likely that the smartest person in the world will be responsible for the Singularity. Therefore, I will almost certainly be responsible for the Singularity.
20:28:24 <oerjan> it is recommended that you _don't_ include oklopol among the goals of the AI. we don't want the solar system tiled with o's and k's.
20:28:39 <mib_mksvta> kerlo: No, more like only you're batshit insane enough to actually get the singularity going.
20:28:44 <kerlo> That would only happen if I decided to give the Solar System to oklopol for some reason. :-)
20:28:50 <mib_mksvta> The others don't write their AI by starting with fungot, you see.
20:28:51 <fungot> mib_mksvta: 3) entities explicitly specified by the clerk of the
20:29:02 <kerlo> Ah, yes, I do have a reputation for being batshit insane.
20:29:15 <mib_mksvta> oerjan: I dunno, the whole universe replaced by okokokokoko over and over again would be abso-frickin-lutely hilarious.
20:29:41 <kerlo> You'd have to be batshit insane to mumble something about low-density parity check codes.
20:29:59 <mib_mksvta> Incidentally, I don't really buy the Singularitarian view of a point of infinite improvement.
20:30:12 <kerlo> What's the Singularitaritaritarian view?
20:30:20 <oerjan> mib_mksvta: but if an oko falls in the forest and there is noone there to see it...
20:30:47 <mib_mksvta> kerlo: The singularity is defined as the point where the recursive exponential self-improvement of the AI hits a point where it improves itself an infinite amount of times.
20:31:04 <mib_mksvta> I argue that there cannot be such a point, due to the planck time and other universal limits.
20:31:07 <kerlo> That's a really weird definition.
20:31:10 <mib_mksvta> It can go very, very fast, but not infinitely.
20:31:14 <kerlo> Anyone who subscribes to it is silly.
20:31:38 <kerlo> Don't tell me EY subscribes to it.
20:31:39 <mib_mksvta> It's in one of his essays. Go find it or something.
20:31:55 <ktne> does anyone here know of a human-readable paper on CPS transformation? From a C-like language to CPS form. The papers i have found are quite dense.
20:32:30 <kerlo> Rather: "I'm sure you are correct, but I believe that in circumstances like this, it happens that the burden of proof falls on you, not me, for which I apologize deeply."
20:32:54 <SimonRC> Singularitarianism, like many other things, ranges from people who believe the obvious (Computers are going to get a hell of a lot more powerful), through those that believe the reasonable (There are several internet-level societal revolutions to come), to the ridiculous (suddenly we all ascend to a higher plane of existence and spread across the galaxy within the century)
20:33:17 <oerjan> a quintillion apologies, tiled onto the universe
20:33:31 <kerlo> Higher plane of existence, sort of. Spreading across the galaxy, maybe not. :-P
20:34:07 <SimonRC> humans are not going to seriously go to the stars
20:34:13 <SimonRC> something derived from them? maybe
20:34:26 <ktne> what about mind uploading?
20:34:33 <ktne> i think that would be more beneficial
20:34:37 <SimonRC> I'm not counting them under humans
20:34:39 <ktne> and with that you could go to the stars
20:34:47 <ktne> well they are persons
20:34:52 <ktne> even if they are not biological
20:34:59 <SimonRC> but if physical humans ever get there it will be for the amusement of some other more powerful type of entity
20:35:06 <oerjan> what is reasonable depends on unknown physical limits
20:35:14 <SimonRC> ktne: I mean, not space opera
20:37:01 <SimonRC> mib_mksvta: given that the strictness analyser is the most important part of the GHC optimiser, I agree that that is an unfortunate name
20:37:30 <oerjan> SimonRC: well i would certainly hope it is being anal about what it does
20:39:01 <SimonRC> I don't think Freud had computer programs in mind
20:39:55 <oerjan> no he had his mother in mind, obviously
20:40:04 -!- Slereah has quit (Read error: 110 (Connection timed out)).
20:40:40 <SimonRC> Sigmund Freud is his grandson, y'know.
20:41:53 <kerlo> I'd say that pre-Singularity biological humans are deserving ipso facto.
20:42:09 <kerlo> (Raise your hand if you think that was a batshit insane thing to say.)
20:43:59 <kerlo> Do you know what "ipso facto" means?
20:46:25 <oerjan> while half-humans must ipso facto half not be
20:47:02 <kerlo> Half-humans are probably not pre-Singularity.
20:50:26 * SimonRC considers the riots of the 40s
20:50:50 <SimonRC> where 10000 furries want the right to GM their children
20:51:38 <oerjan> and then the riots of the 2060, when the children reverse that
20:53:13 <oerjan> although by 2070 it will be moot since people can reengineer themselves on the fly
20:54:35 <Ilari> Yes, it will be fun if one's idea of fun is sufficiently twisted... :-)
20:56:46 <SimonRC> oerjan: if you believe Accellerando, the Earth will have been disassembled by the end of the century.
20:57:46 <oerjan> by 2100 everyone will be so sick of it that they will live in virtual simulations of close to the year 2000.
20:58:13 <oerjan> and coincidentally, this is recursive, and we're not the first iteration.
21:00:55 <Ilari> I say by 2100, society can't even sustain the current level of technological development... :-/
21:04:47 <SimonRC> Orion's Arm gives some interesting reasons why any non-related civilisations we eventually meet are likely to be at a technological level similar to our own.
21:04:58 <mib_mksvta> <SimonRC> Singularitarianism, like many other things, ranges from people who believe the obvious (Computers are going to get a hell of a lot more powerful), through those that believe the reasonable (There are several internet-level societal revolutions to come), to the ridiculous (suddenly we all ascend to a higher plane of existence and spread across the galaxy within the century)
21:05:06 <mib_mksvta> I am referring to Eliezer Yudkowsky's beliefs, mainly.
21:06:35 <SimonRC> (The idea is they are so distant that we will meet them by contact of high-speed wormhole ends being carted around. Time dilation means that ultra-long-distance exploration will still not take much time at our end.)
21:06:38 <mib_mksvta> Now take the Transcended version of S{n}, starting at 2. Half a time-unit later, we have 3. A third of a time-unit after that, 6. A sixth later - one whole unit after this function started - we have 64. A sixty-fourth later, 10^80. An unimaginably tiny fraction of a second later... Singularity.
21:06:39 <mib_mksvta> http://yudkowsky.net/obsolete/singularity.html
21:06:52 <mib_mksvta> Yes, it is marked as Obsolete, though.
21:07:53 <Ilari> What I think about those: "Computers are going to get a hell of a lot more powerful": Doubtful. "There are several internet-level societal revolutions to come": There are multiple large societal revolutions coming. "suddenly we all ascend to a higher plane of existence and spread across the galaxy within the century": Utterly ridiculous.
21:09:11 <mib_mksvta> "Computers are going to get a hell of a lot more powerful": Doubtful?!?!?!?!
21:10:06 <oklopol> i don't find that an unreasonable assumption
21:10:48 <oklopol> well we've hit the wall already, who says there's a simple way out.
21:11:43 <oklopol> i mean i don't believe that, but i don't think it's a ridiculous assumption.
21:12:49 <mib_mksvta> One ridiculous thing EY's said: both "we cannot comprehend higher-than-human intelligence, duhh" and "we can make a higher-than-human intelligence AI"
21:15:25 <oerjan> mib_mksvta: i think that means we cannot comprehend the _consequences_, but we can still set up the _initial_ concept
21:15:52 <oklopol> well that's the science fiction aspect, X is cool and higher-level, and X could exist
21:16:08 <mib_mksvta> No, he's said that if we could understand things more intelligent than ourselves we'd be that intelligent.
21:17:12 <mib_mksvta> Ilari: care to justify that "doubtful"?
21:18:31 <Ilari> mib_mksvta: Essentially running against technological limits in multiple ways, and not being able to deal with the consequences.
21:18:51 <mib_mksvta> Ilari: I don't disagree that there are limits, I just think that current computers are very far from them.
21:19:55 <Ilari> mib_mksvta: Some other limitations make how far current computers are from theoretical limits pretty much irrelevant...
21:22:17 <Ilari> And besides, increasingly advanced semiconductor fabs are becoming exponentially more expensive...
21:23:04 <Ilari> The most significant one: Energy.
21:23:19 <Ilari> More precisely, technologically usable energy.
21:23:43 <mib_mksvta> But we can come up with more efficient energy, too, no?
21:24:07 <Ilari> Energy use efficiency improvements don't yield that much.
21:24:33 <oerjan> there is plenty of energy from the sun
21:24:53 <oerjan> just the other day there was this energy chart on reddit
21:25:11 <Ilari> Yes, on order of 100PW, but how much of that is technologically usable?
21:25:37 <oerjan> one thing sticking in the mind: energy from sun per DAY > all electricity used since tesla
21:25:50 <mib_mksvta> Note that I'm not so sure about the Singularity: I find the prospect of an X level intelligence being able to create an X+Y intelligence unlikely. kerlo will probably argue with me about this. :P
21:25:56 <Ilari> (Electricity is 'technologically usable energy').
21:26:01 <mib_mksvta> I do think computers are going to get a lot more powerful, though.
21:27:34 -!- GreaseMonkey has joined.
21:28:01 <mib_mksvta> hi GreaseMonkey. We're discussing the technological singularity.
21:28:30 <Ilari> My opinion of the technological singularity is that it's an utter pile of crock...
21:28:58 <GreaseMonkey> hmm, i think that we should have some common standards
21:29:06 <mib_mksvta> Ilari: I think you're one of the two extremes in this channel, the other of which is kerlo, as far as I know.
21:30:12 <GreaseMonkey> so yeah, uh, what do you mean by "technological singularity"?
21:31:21 <mib_mksvta> The creation of a smarter-than-human AI which recursively self-improves.
21:31:25 <mib_mksvta> http://en.wikipedia.org/wiki/Technological_singularity
21:32:59 <mib_mksvta> tl;dr so far: kerlo talks about it like it's obviously happening, I say some things, SimonRC is extremely ... moderate, Ilari says it's a bunch of bullshit
21:33:38 <Ilari> And doing stuff that we have no idea of consequences of gives me creeps (fortunately I don't see this one happening)...
21:34:04 <mib_mksvta> Ilari: Thank god you've never been around when someone's trying to make progress.
21:34:04 <GreaseMonkey> while someone may be able to create a program which can fix its own bugs and the bugs of others, i don't think it will be able to be a heck of a lot more intelligent than us.
21:34:19 <GreaseMonkey> it would have to be based on a neural network for it to do that
21:34:29 <GreaseMonkey> and then again, you'll probably run out of RAM.
21:34:44 <oerjan> always practice extreme moderation
21:34:53 <mib_mksvta> a recursive self-improver more intelligent than us could find RAM, surely
21:35:12 <mib_mksvta> GreaseMonkey: that's rather ridiculous, neural networks can't do all that much
21:35:28 <mib_mksvta> they're as much thinking as markov chains are conversationing
21:36:09 <mib_mksvta> GreaseMonkey: so you think AI will _never_ progress beyond neural networks?
21:36:26 <GreaseMonkey> i'm saying that neural networks are probably the path to go along
21:36:44 <Ilari> Having no idea of the consequences is pretty much different from understanding the consequences even poorly...
21:37:47 <mib_mksvta> I'd say progress we had no idea of the consequences of has happened before.
21:38:00 <mib_mksvta> But if it's going to happen, it's not going to be something we can choose...
21:38:30 <Ilari> And worse yet, understanding the consequences but not paying attention to them...
21:38:57 <mib_mksvta> The singularity would pretty much be the definition of not understanding the consequences, so that doesn't really apply
21:39:16 <GreaseMonkey> also, if you want a good degree of intelligence, you could use a neural network for triggering behaviours based on "emotions", and "emotions" based on input
21:39:22 <mib_mksvta> Although if we're going to kill ourselves off, it'll probably happen in some simpler way.
21:39:36 <mib_mksvta> GreaseMonkey: That's kind of simplistic. Whereby kind of I mean really
21:40:16 <kerlo> Neural nets are weird.
21:40:31 <mib_mksvta> kerlo: so what -is- your opinion on the singularity
21:41:21 <kerlo> Bayesian networks are theoretically nice. The problem with the neural networks we have is that they don't seem to be self-modifying in any way.
21:41:37 <kerlo> mib_mksvta: my opinion is "yes".
21:41:45 <kerlo> Are you asking what I think the consequences will be?
21:42:01 <kerlo> mib_mksvta: more detailed question plz?
21:42:01 <mib_mksvta> Yeah. What consequences, how will it come about, ...
21:42:28 <mib_mksvta> <mib_mksvta> Yeah. What consequences, how will it come about, ...
21:42:32 <Ilari> There is pretty big difference between just inventing some technology and actually using that technology.
21:42:34 <kerlo> Let me go ahead and design a neural net real quick, 'kay? :-P
21:44:23 <kerlo> Sorry, I got disconnected for a moment.
21:44:52 <mib_mksvta> <kerlo> mib_mksvta: more detailed question plz? 21:42 <mib_mksvta> Yeah. What consequences, how will it come about, ...
21:44:53 <kerlo> mib_mksvta: well, I can't say how it'll come about. Might emerge relatively spontaneously from a collection of relatively intelligent things.
21:45:32 <kerlo> It might be created by a Manhattan Project, it might be created by an educated genius, it might be created by an ignorant genius.
21:46:34 <kerlo> Within 100 years is likely, it seems.
21:47:07 <mib_mksvta> Alright then... Consequences? (Let's say 'immediately after' for a time frame).
21:48:32 <mib_mksvta> As in, Mr. My First Singularity hits enter on his keyboard after typing "ghc --make smarter_than_human_ai; ./smarter_than_human_ai".
21:48:40 <lament> all my body parts are pretty.
21:49:08 <kerlo> Perhaps it would discreetly take over the Internet.
21:49:14 <mib_mksvta> SimonRC: he's replying to an ancient comment.
21:50:01 <oerjan> mib_mksvta: it could be trying to assure there's no unfriendly ones out there :D
21:50:09 <kerlo> Perhaps it would discreetly take over the Internet, set up an oracle service, earn money, and repay the people it took computer stuff from.
21:50:42 <mib_mksvta> I would say that discreetly taking over the internet is not a Friendly task regardless of how it pays back.
21:51:22 <kerlo> That's comparable to saying that cutting people open is evil regardless of how it pays back.
21:51:42 <mib_mksvta> kerlo: If the singularity kills someone, then makes 10 babies, that isn't Friendly.
21:51:55 <Ilari> If it wanted to do a really hostile act, perhaps it would attack the control computers of all kinds of real-world important systems. Attack and successfully disable the power grid and it's game over.
21:51:56 <kerlo> What if the Singularity kills someone, then saves ten?
21:52:19 <mib_mksvta> Ilari: the idea is to make it not do that.
21:52:41 <mib_mksvta> kerlo: How do you define Friendly AI? I'd define it as an AI applying utilitarianism over humanity.
21:53:05 <mib_mksvta> So killing one person, then saving ten -- if they must be interlinked -- is probably Friendly.
21:53:17 <mib_mksvta> But doing harm to the internet, and merely paying back, is still an unfriendly act.
21:53:37 <kerlo> Utilitarianism does not value equality at all.
21:53:52 <mib_mksvta> That is true. It values the group as a whole.
21:54:07 <kerlo> Then again, people wouldn't like extreme inequality.
21:54:24 * SimonRC dislikes this area of conversation.
21:54:28 <mib_mksvta> kerlo: Yeah, but think of what would happen if Ayn Rand wrote the seed AI.
21:54:40 <kerlo> So I would probably consider utilitarianism a valid approach.
21:54:57 <kerlo> Is Ayn Rand an extreme deregulationist libertarian dude?
21:56:53 <kerlo> With deregulation stuff, all humans get laid off and starve.
21:57:14 <mib_mksvta> kerlo: I wonder how mainstream news organizations and politicians would react?
21:57:47 <kerlo> That would be fun.
21:58:10 <mib_mksvta> "A terrist is trying to take over the world with his computer! We must bomb him and his network before it is too late!"
21:58:24 <kerlo> It would be quite terroristic.
21:58:54 <kerlo> "A human is trying to step on our hill! We must bite him repeatedly bite him before it is too late!" --an ant
21:59:17 <mib_mksvta> We must repeatedly before it is too late!
21:59:33 <mib_mksvta> "The 'Singularity Institute' for 'Artificial Intelligence' today announced that they had 'created "smarter than human AI"'. SOME SCIENTISTS say that this is in fact a load of rubb*evaporates into the stars as a God*"
21:59:36 <kerlo> There was no g at the end.
22:01:04 <mib_mksvta> So, prediction: kerlo will cause the singularity by modifying fungot. oklopol will be the only human left behind, as he is ominipotent and oko and immune.
22:01:05 <fungot> mib_mksvta: an agoran decision has an honor of each week,
22:01:12 <mib_mksvta> All opinions to the contrary are wrong.
22:01:59 <kerlo> Wait, isn't fungot just a Markov chain bot?
22:02:00 <fungot> kerlo: their debts to each officer with a list of all shareholders.) the delegated player ceases to to
22:02:14 <mib_mksvta> kerlo: It can also execute Brainfuck and Underload.
22:02:28 <mib_mksvta> because you, kerlo, are batshit insane.
22:02:42 <mib_mksvta> kerlo: http://zem.fi/~fis/fungot.b98.txt Get to work
22:02:43 <fungot> mib_mksvta: ( c) a player who makes further play impossible by eir actions or lack thereof, or
22:02:50 <oerjan> this is obviously logical
22:03:50 <kerlo> Can't I just run a random subleq program instead?
22:04:29 <kerlo> Anyway, suppose I were to batshit insanely start trying to make AI right now.
22:04:52 <mib_mksvta> kerlo: btw, since you think the singularity will happen, does that mean you think intelligence X can comprehend X+Y intelligence? (well, you must)
22:05:26 <kerlo> To an extent, certainly.
22:05:50 <kerlo> Well, it depends on whether "intelligence" includes capacity for improvement.
22:05:55 <mib_mksvta> kerlo: so you believe there is a lower bound on X and the higher bound on Y?
22:06:08 <mib_mksvta> That is, a monkey can't understand a human but a human could understand the seed AI to write it?
22:06:09 <kerlo> Do you see anything inherently wrong with creating an idiot that becomes a genius?
22:06:29 <mib_mksvta> No, I'm just not sure humans can understand smarter-than-human intelligence to create an AI that is
22:06:37 <Ilari> It is possible to get to X+Y intelligence without X intelligence understanding it. But that's likely too complicated to be practical with technological stuff...
22:07:03 <mib_mksvta> if humans can create smarter-than-human AI, we don't have to
22:07:05 <kerlo> We don't need to create smarter-than-us intelligence, only more-flexible-than-us intelligence that can make itself smarter than us.
22:07:06 <mib_mksvta> we just have to create human-intelligence AI
22:07:13 <mib_mksvta> which can then make the smarter-than-us intelligence for us.
22:07:35 <mib_mksvta> Making something in equal intelligence to us' easiness >>>>>>>>>>>>>>>>>>>>>>>>>>>>> Making something more intelligent than us's easiness
22:09:25 <kerlo> If I have something whose intelligence is equal to mine, improving it by anything at all will make it more intelligent than me.
22:10:22 <kerlo> It seems that *the* problem with Friendly AI, and perhaps strong AI as well, is making it so that the AI will recognize any change to itself that would change its supergoal before making such a change.
22:11:53 <kerlo> So we ought to find a class of changes to AI guaranteed not to change its supergoal.
22:11:56 <mib_mksvta> If the AI box worked, it could just simulate it to see. Unfortunately...
22:12:42 <kerlo> Even if it worked, it would have to simulate a modified version of itself, plus, presumably, bits of the universe.
22:13:59 <mib_mksvta> These things sure are complicated, huh?
22:14:35 <kerlo> That's why I refuse to think about anything complicated.
22:14:42 <mib_mksvta> :( I tried to teach a neural network addition on two bits
22:14:46 <kerlo> If you want to be batshit insane, you have to learn to ignore these things.
22:15:17 <kerlo> So we ought to find a class of changes to AI guaranteed not to change its supergoal. And then, um...
22:15:37 <kerlo> Oh, we also have to find a way to prove that the supergoal of an AI is in fact a given thing.
22:17:24 <oerjan> iiuc there is also the problem of finding the right supergoal
22:19:24 <kerlo> Can an AI really be said to have precisely one supergoal?
22:19:43 <ktne> well, if it has multiple goals
22:20:01 <ktne> then it needs some comparison metric to use to decide which one to follow
22:20:08 <ktne> at least for the moment
22:20:15 <mib_mksvta> i think the supergoal is meant to be the overriding goal
22:20:19 <ktne> but this means that there is just one weighted supergoal
22:20:32 <kerlo> CEV: "Give us (humanity) what we would wish for if we knew more, thought faster, were more the people we wished we were, had grown up farther together; where the extrapolation converges rather than diverges, where our wishes cohere rather than interfere; extrapolated as we wish that extrapolated, interpreted as we wish that interpreted."
22:20:59 <kerlo> That's not what I mean.
22:21:30 <kerlo> If the AI wants the diamond, and it believes that the only way to get the diamond is by getting box A, and so it decides to get box A, what is its supergoal?
22:21:40 <kerlo> I guess I need more iffage.
22:22:03 -!- olsner_ has quit ("Leaving").
22:22:05 <kerlo> If all that and it also decided that it would not change its mind about its decision to get box A no matter what.
22:23:27 <kerlo> The thing is, I guess, its supergoal is to get the diamond, but it's incompetent at doing so.
22:23:39 <mib_mksvta> If you want the diamond and think the only way to get the diamond is via box A, and you realise you can't get the diamond, and you have no other reasons for getting box A, then getting box A is an act of sheer stupidity.
22:24:29 <kerlo> Suppose an AI came to believe that from then on, its senses would attempt to deceive it.
22:24:38 <kerlo> The best course of action would be to ignore its senses completely.
22:25:06 <mib_mksvta> The best course of action would be to believe what its senses say to it is reversed.
22:25:54 <kerlo> So that if its senses told it that the sky is not green, it would come to believe that the sky is green?
22:26:09 <kerlo> Saying nothing but falsehoods is not the best way to deceive a person.
22:26:25 <mib_mksvta> kerlo: how about it'd never believe that because its senses aren't sentient
22:26:33 <mib_mksvta> i mean unless they are which they shouldn't be
22:27:07 <kerlo> My senses are sentient.
22:27:34 <mib_mksvta> I disbelieve that your ears, eyes, mouth are sentient.
22:27:42 <mib_mksvta> I disbelieve that your skin is sentient.
22:27:54 <kerlo> Okay, s/senses/sensations/g
22:28:55 <Ilari> If one really wants to deceive a person, expose them to the same falsehood from all directions and often (much more often than they hear the truth)...
22:29:27 <mib_mksvta> Yeah, but global falsehood is not deceiving: just flip it.
22:30:05 <kerlo> I'd write an AI, but unfortunately, this would require sheer gibbering stupidity, blank incomprehension of the Singularity, and total uncaring recklessness.
22:30:24 <Ilari> Yeah, but how does one tell which way is correct?
22:30:25 <kerlo> Sorry, that was a typo. I meant to type "I can't".
22:31:00 <kerlo> Don't you just hate it when you go into a trance and quote Eliezer Yudkowsky when trying to press the shift button?
22:31:06 <mib_mksvta> Ilari: if you know it's going to lie to you in that way, ...
22:31:12 <mib_mksvta> kerlo: I think you have an obsession problem.
22:31:34 <mib_mksvta> Also, um, didn't EY advocate writing a seed AI?
22:32:09 <kerlo> He was talking about writing AI with one of our subgoals (e.g. solve the Riemann hypothesis) as its supergoal.
22:32:11 <mib_mksvta> The other way to get a Riemann Hypothesis Catastrophe is to make solving the Riemann Hypothesis a direct supergoal of the AI - perhaps the only supergoal of the AI. This would require sheer gibbering stupidity, blank incomprehension of the Singularity, and total uncaring recklessness.
22:32:13 <Ilari> mib_mksvta: I don't think you can know that (unless you find out)...
22:32:27 <mib_mksvta> Ilari: It was part of kerlo's question.
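[ed.: mib_mksvta's "just flip it" point, and kerlo's objection that pure falsehood is poor deception, amount to a small information-theoretic observation: a sensor that *always* lies is a deterministic, invertible channel, so no information is lost and the receiver can decode it perfectly, whereas a sensor that lies unpredictably genuinely destroys information. A minimal sketch (all names here are mine, purely illustrative):]

```python
import random

def always_lie(bit):
    # Deterministic lie: an invertible channel, so nothing is lost.
    return 1 - bit

def half_truth(bit):
    # Lies with probability 0.5: output is independent of input,
    # so the report carries no information at all.
    return bit if random.random() < 0.5 else 1 - bit

truth = [random.randint(0, 1) for _ in range(1000)]

# A consistent liar is trivially decoded: flip every report.
decoded = [1 - always_lie(b) for b in truth]
assert decoded == truth  # perfect recovery despite 100% falsehood
```

[ed.: this is why "global falsehood is not deceiving": the worst case for the listener is not a liar but a source whose reports are uncorrelated with the truth, which is Ilari's "how does one tell which way is correct?" problem.]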
22:33:48 <mib_mksvta> Hmm... I think I'll stop bothering the people who don't care/dislike the singularity in here. kerlo, others: if you want to continue, I've set up a treehouse in #zot.
22:35:07 -!- mib_mksvta has quit ("http://www.mibbit.com ajax IRC Client").
22:37:23 -!- kar8nga has quit (Read error: 110 (Connection timed out)).
22:45:34 -!- oerjan has quit ("Good night").
22:47:00 <kerlo> ehird is promoting artificial artificial artificial artificial artificial intelligence intelligence intelligence intelligence.
22:51:23 -!- alex89ru has quit ("Verlassend").
23:02:08 -!- abc has joined.
23:03:19 -!- abc has quit (Remote closed the connection).
23:06:31 -!- jix_ has quit ("...").
23:21:27 -!- oklopol has quit (Read error: 104 (Connection reset by peer)).
23:21:47 -!- oklopol has joined.
23:24:05 -!- FireFly has quit ("Later").
23:24:09 -!- BeholdMyGlory has quit ("bye").
23:26:25 -!- ktne has quit ("Leaving.").
23:33:29 <ehird> #zot's quiet. Someone should join.
23:39:03 <Ilari> At least it's one (pretty rare) metasyntactic variable...
23:39:14 <ehird> bsmntbombdood: Zotalicious, for one.
23:39:40 <ehird> But it's just a random name for discussion of a variety of topics including AI, cake, #zot, and the topics applicable in #zot.
23:39:42 <ehird> Mostly the first one.
23:45:36 <GregorR> I'LL HAVE YOU KNOW THAT I AM 0% ZOT
23:49:32 <ehird> lament: the pianidio is awesome.
23:56:53 -!- radioactivity has joined.
23:58:27 -!- Corun has quit ("This computer has gone to sleep").
23:58:59 -!- macondo has joined.
23:59:26 -!- macondo has left (?).