00:04:33 <ehird`> is toboge an egobot clone or something
00:05:55 <GregorR> He's trying to outcompete EgoBot :(
00:06:26 <oerjan> GregorR: you have gone too long without adding new languages :/
00:07:06 -!- immibis_ has joined.
00:08:34 <GregorR> oerjan: People haven't been very persistent in telling me to add them :P
00:09:25 <GregorR> I won't add any languages with file I/O, and most non-esolangs have that.
00:10:49 <bsmntbombdood> there are only a few file operators that you have to remove
00:11:51 * ehird` thinks blahbot is supreme!
00:12:11 <bsmntbombdood> just wrap the code in something like: (define (fuck-you . ignored) (write-to-channel "fuck you, hax0r")) (let ((with-output-file fuck-you) et cetera) ...)
00:12:55 * ihope continues pondering stuff
00:13:30 <ihope> \x is a function from a to a binder of x to a...
00:13:46 <bsmntbombdood> http://www.schemers.org/Documents/Standards/R5RS/HTML/r5rs-Z-H-9.html#%_sec_6.6.1
00:13:54 <ihope> x is an a given a binding of x to a.
00:14:02 -!- immibis has quit (Nick collision from services.).
00:14:05 -!- immibis_ has quit (Nick collision from services.).
00:14:09 -!- immibis has joined.
00:14:39 <ihope> \x :: all a. a -> Bind \x\ a
00:15:18 <immibis> if anyone could tell me how to fix my connection, that would be useful.
00:15:31 -!- toBogE has quit (Nick collision from services.).
00:15:35 <ihope> immibis: duct tape.
00:16:28 <bsmntbombdood> call-with-input-file, call-with-output-file, with-input-from-file, with-output-to-file, open-input-file, open-output-file, load, transcript-on
00:16:56 <ihope> Er, \x\ : a, not x : a.
00:17:55 <ihope> \x :: all a. a -> Bind \x\ a; x :: [x : a] => a; (::) :: a -> * -> Dec;
00:17:57 <oerjan> bsmntbombdood: _provided_ he knows his scheme implementation has no i/o extensions.
00:18:55 <bsmntbombdood> oerjan: so he just needs to read his implementation's docs
00:19:43 <edwardk> ihope: seems kinda authmathy
00:19:55 <ihope> edwardk: automathy?
00:19:55 <bsmntbombdood> or he could write his own scheme and be absolutely sure
00:20:09 <ihope> bsmntbombdood: he could write himself a Scheme in 48 hours!
00:20:17 <edwardk> like automath, the grand-daddy of modern functional languages
00:20:32 <edwardk> on the non-lisp side of the family tree ;)
00:20:56 <edwardk> where we got this strange notion of type systems from, etc =)
00:20:59 <oerjan> and not miss anything that puts surprising I/O access into an "obviously" safe place.
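A rough Python transposition of the idea bsmntbombdood sketches above — shadow the dangerous procedures before handing the expression to the evaluator — might look like this. All names are made up for the illustration; it is a sketch, not a real sandbox, and as oerjan notes it only helps if the implementation has no other I/O back doors.

    def deny(*_args, **_kwargs):
        raise PermissionError("file I/O is disabled for bot-evaluated code")

    SAFE_GLOBALS = {
        "__builtins__": {
            # whitelist a few harmless builtins...
            "abs": abs, "len": len, "range": range,
            # ...and explicitly shadow the dangerous one
            "open": deny,
        }
    }

    def run_untrusted(expression: str):
        return eval(expression, SAFE_GLOBALS, {})

    print(run_untrusted("len(range(10))"))   # 10
    # run_untrusted("open('/etc/passwd')")   # raises PermissionError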
00:22:04 <ihope> \x :: all (\a. a -> Bind \x\ a); x :: all (\a. [\x\ : a] => a); (::) :: all (\a. a -> * -> Dec); all :: all (\a. (a -> *) -> *); * :: *; \x\ :: Id; Bind :: Id -> *-> *; Id :: *; (:) :: Id -> * -> Req; (=>) :: List Req -> * -> *; List :: * -> *; Req :: *
00:23:41 <ihope> Oh, and (.) :: all (\a. all (\i. (a -> Bind i a) -> [i : a] => b -> a -> b))
00:24:48 <ihope> And then there's let...
00:26:06 <ihope> Trying to invent a language.
00:26:14 <ihope> Of the programming kind.
00:28:43 <immibis> ihope: sorry for the late reply, but how can duct tape ensure a wireless connection stays connected?
00:28:56 <ihope> immibis: duct tape'll connect anything!
00:29:10 <immibis> even a wireless connection?
00:29:23 <immibis> even a wireless, ducttapeless connection?
00:29:54 <ihope> See if you can find wireless duct tape.
00:31:27 <immibis> ihope: see if you can find wired duct tape
00:37:38 -!- toBogE has joined.
00:43:06 <SimonRC> all the bizarre bits of Haskell support one another
00:43:17 <SimonRC> without type inference, monads are useless
00:43:29 <SimonRC> hell, without types they are useless
00:43:55 <oerjan> i am not quite sure of that.
00:44:46 <oerjan> you _could_ have objects with a bind method.
00:46:32 <SimonRC> But it is a PITA to have to write type sigs all over the place
00:47:10 <SimonRC> >> is polymorphic, remember
00:47:12 <oerjan> i am talking about in a dynamically typed language
00:47:39 <oerjan> >> would call the bind method of its left argument.
00:47:51 <SimonRC> but think of all those functions which work for any monad
00:48:29 <SimonRC> liftM :: a -> b -> m a -> m b
00:49:30 <SimonRC> <lambdabot> forall a1 r (m :: * -> *). (Monad m) => (a1 -> r) -> m a1 -> m r
00:50:27 <oerjan> anyway, liftM f x = x.bind(\t -> return (f t))
00:50:31 <SimonRC> write the type signature for *that*
00:50:46 <SimonRC> oerjan: I know that, but that is verbose
00:51:12 <oerjan> well, first you have x >>= f = x.bind(f), of course.
00:52:08 <SimonRC> readThingy >>= liftM (+2) >>= writeThingy
00:53:12 -!- immibis has quit ("Download IceChat at www.icechat.net").
00:53:22 -!- toBogE has quit (Read error: 104 (Connection reset by peer)).
00:53:26 <oerjan> SimonRC: i meant to make liftM f x a _function_
00:53:33 <oerjan> defined by the right hand side
00:53:51 <SimonRC> ah, wait, i can see how that might work
00:54:07 <oerjan> i actually thought about this before a bit
00:54:32 <SimonRC> It is occasionally handy to be able to dispatch on return type
00:55:05 <oerjan> yes, that is hard to get. also, this method works only for monads strict in the left argument of >>=
00:55:18 <oerjan> but it does work for a number of monads.
00:55:21 <SimonRC> How would one go about writing enumFromTo?
00:55:43 <SimonRC> :: forall a. (Enum a) => a -> a -> [a]
00:55:43 <oerjan> that would be a method too, of course.
00:55:57 <SimonRC> ah, I can see how this works
00:56:08 <oerjan> Scala has operators as methods of their first argument.
00:56:40 <oerjan> it also has a bit of comprehension syntax, which is thinly disguised monads.
00:56:56 <oerjan> although the type system doesn't support the full concept.
00:57:15 <SimonRC> ok, now a pathological example: "makeIntoZeros = map (\x -> 0)"
00:58:04 <SimonRC> :: forall a, n. (Num n) => [a] -> [n]
00:58:09 <oerjan> btw you can /msg lambdabot
00:58:27 <oerjan> bsmntbombdood: in some cases you can do it with type classes
00:58:34 <oerjan> printf exists, for example
00:58:44 <SimonRC> it conflicts interestingly with currying
00:59:12 <oerjan> basically, the final result of the function cannot be a function, i guess
00:59:46 <SimonRC> the problem is if there is a type in the return value that cannot be deduced from the arguments
01:00:17 <oerjan> printf is polymorphic on the return value :D
01:00:40 <oerjan> oh, i thought you were still talking about variadic functions
01:01:03 <SimonRC> and many of the types that are in one sense types of arguments end up as part of the type of the return value when you start currying.
01:01:11 <SimonRC> e.g. readThingy >>= liftM (+2) >>= writeThingy
01:01:34 <SimonRC> the monad type does not appear in the single argument to liftM
01:01:39 <SimonRC> but it does in the return type
01:02:11 <oerjan> as i see it, the monad is found from the first object in the >>= chain which is not return _
01:03:01 <SimonRC> and obviously you lose majorly if you get rid of currying
01:03:35 <oerjan> well, getting rid of currying was not part of the original specification :)
01:04:11 <SimonRC> if you allow currying the Java and C# programmers will kill you
01:05:16 <SimonRC> suppose you have a function getStream :: m a -> m [a]
01:05:57 <SimonRC> and because it is used deep inside an abstraction, for elegance you want to pass in "return 0", which eventually gets passed as the first argument of getStream...
01:06:30 <SimonRC> you have a naked return, so you must specify the type somehow
01:06:51 <SimonRC> and any hard-coded type will reduce generality
01:07:34 <oerjan> well, there _would_ have to be default return(x) objects
01:08:08 <oerjan> which would know how to insert themselves into a >>= chain
01:08:36 <SimonRC> getStream f = do { x <- f ; xs <- getStream f ; return x : xs } -- I think
01:08:37 <oerjan> no worse than having numerical conversions
01:09:53 <oerjan> basically, you are making the wrapped Identity monad a supertype of the others.
01:12:37 <ihope> What's all this about?
01:13:01 <oerjan> how much of monads can be done in a dynamically typed language
01:13:36 <oerjan> with code polymorphic over the monad
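A minimal sketch of that idea in Python, another dynamically typed language: each monad is an object exposing unit/bind methods, and liftM-style code stays polymorphic because it only ever calls those methods on the value it is handed. Class and method names here are invented, and — as oerjan says — this only covers monads that are strict in the left argument of >>=.

    class Maybe:
        def __init__(self, value, just=True):
            self.value, self.just = value, just
        def unit(self, x):            # "return" for this monad
            return Maybe(x)
        def bind(self, f):            # >>= : feed the wrapped value to f
            return f(self.value) if self.just else self

    class ListM:
        def __init__(self, items):
            self.items = list(items)
        def unit(self, x):
            return ListM([x])
        def bind(self, f):
            return ListM([y for x in self.items for y in f(x).items])

    def lift_m(f, mx):
        # works for any object with bind/unit -- dispatch is on the argument,
        # which is exactly why return-type polymorphism stays the hard part
        return mx.bind(lambda t: mx.unit(f(t)))

    print(lift_m(lambda n: n + 2, Maybe(40)).value)        # 42
    print(lift_m(lambda n: n + 2, ListM([1, 2, 3])).items) # [3, 4, 5]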
01:31:09 -!- ehird` has quit (Read error: 104 (Connection reset by peer)).
01:31:18 -!- marvinbot` has quit (Remote closed the connection).
01:47:10 -!- immibis has joined.
01:49:07 -!- toBogE has joined.
01:50:23 -!- toBogE has quit (Read error: 104 (Connection reset by peer)).
03:09:30 -!- ihope has quit (Read error: 110 (Connection timed out)).
03:36:30 -!- GreaseMonkey has joined.
04:14:11 -!- andreou has quit (Read error: 113 (No route to host)).
04:34:14 -!- oerjan has quit ("Good night").
05:14:30 <Sukoshi> Hey, if you read() from a Reader, does it always pull in the next byte?
05:14:35 -!- Arrogant has joined.
05:16:11 <Sukoshi> It reads an int, actually :P
05:16:18 <RodgerTheGreat> "The character read, as an integer in the range 0 to 65535"
05:16:47 <Sukoshi> It's the integer representation of the next unicode character.
05:18:07 <Sukoshi> Now I have to like, completely redesign half my classes.
05:19:05 <Sukoshi> Wait no, I don't. Only 1 class I need to redesign.
05:19:59 <Sukoshi> I need to use an InputStream now, so I have to make sure the bytes are converted to their appropriate types before I have the classes perform the internal magic to represent the types I need.
05:20:25 <Sukoshi> Can you test against bytes like (blah == -1) ?
05:20:31 <Sukoshi> Or do you have to cast to int?
05:20:50 <RodgerTheGreat> you should be able to make the comparison you have above
05:22:04 <RodgerTheGreat> on an unrelated note, I've come up with a bunch of monsters and things for the player and fluffy, his faithful genetically engineered pencil-sharpener, to face in my RPG: http://rodger.nonlogic.org/images/CRPG%20combat.png
05:23:25 <RodgerTheGreat> ideas not shown here include staple removers, peeps(TM) candy and the ghost of Edsger Dijkstra.
05:25:31 <Sukoshi> Also, if you cast byte to char, does it do the auto-conversion for you?
05:25:40 <RodgerTheGreat> Dijkstra's attacks will include "Shunting yard", "FOR loop", "A case against the GO TO statement" and "exhaustive proof"
05:26:00 <Sukoshi> This primitives business is what really confuses me. I'm so used to C primitives ;-;
05:26:14 <RodgerTheGreat> char literals are dealt with internally as if they instantly become integers
05:26:43 <RodgerTheGreat> that's how I always think about it- single-quotes are just an alias
05:27:29 <Sukoshi> I haven't written ASM in a *looong* time.
05:27:37 <Sukoshi> I want to do a low level project. Methinks an emulator.
05:28:14 <Sukoshi> You can't write a substantial emulator in a high-level language and expect it to be fast, though.
05:28:49 <Sukoshi> Plus you need cheap bit-flipping hacks that is total C-lurv :3
05:29:02 <immibis> you can't write anything in java and expect it to be fast, that includes emulators
05:29:57 <Sukoshi> Funny how a byte-code compiled language can't be fast, no?
05:30:09 <Sukoshi> OSS anti-Java stigma, when unfounded, is funny.
05:30:22 <immibis> at least, not on my computer
05:30:46 <Sukoshi> I'll bet my machine is worse than yours.
05:30:59 <RodgerTheGreat> I'm with Sukoshi on this one, immibis- Java has a tremendous amount of technology behind it to *make* it fast, even when it's innately at a disadvantage
05:32:22 <immibis> really? i must have a slow computer then
05:32:31 * immibis checks in System Properties
05:33:19 <RodgerTheGreat> there is no excuse to have that little RAM. It's a travesty.
05:33:53 <RodgerTheGreat> ram IS NOT EXPENSIVE. It's the most affordable upgrade you can make to your computer these days.
05:34:11 <immibis> 8MB is used by onboard graphics
05:34:27 <immibis> i think there is actually 256MB in the box
05:34:44 <Sukoshi> I have a slower computer, and yet it runs fine.
05:34:55 <Sukoshi> You even run Windows, and the Linux JVM has historically been known for being crappy.
05:36:40 <RodgerTheGreat> I can attest to this- applet compatibility on linux is absolute shit
05:37:04 <RodgerTheGreat> unreliable keylisteners, improper graphics buffering, and a host of other intermittent problems
05:38:53 <RodgerTheGreat> I've had numerous programs run on OSX and windows flawlessly, and then utterly fail when I test them out on one of the fedora-based lab machines up here in the CS department
05:39:25 <Sukoshi> It's gonna improve now that Java is OSSing the thing.
05:39:57 <Sukoshi> Never doubt the power of horrendous numbers of OSS coders.
05:39:59 <RodgerTheGreat> OR, we'll wind up with a ton of slightly broken and weird forks of the language
05:40:14 <Sukoshi> Read the GNU Classpath mailing list. It's *really* active.
05:40:23 <RodgerTheGreat> "Woo I should add operator overloading to Java FOR NO REASON! Whoopeee!"
05:40:37 <GreaseMonkey> i reckon they'd have a fork of Java with built-in "Hello World!" support
05:40:58 <Sukoshi> But Java 1.6 really upped Linux VM awesomeness.
05:41:07 <Sukoshi> Much faster/lighter on the memory.
06:02:13 -!- GreaseMonkey has quit ("custom quit messages --> xchat.org <-- hydrairc sucks").
06:04:47 <Sukoshi> Although I used to find it aggravating in the beginning, now I'm starting to like Java's restriction of one class per file and the class should have the same name as the filename.
06:05:04 <Sukoshi> I remember hunting typedefs in large globs of C code and shuddering.
06:07:32 <RodgerTheGreat> although in cases where it makes some sense (like non-public classes), it *is* sometimes possible to have more than one in a file
06:11:16 -!- RodgerTheGreat has quit.
06:56:43 <immibis> could someone please indicate what is wrong with the following bf program: +[,>[-]+.<[.,]+.[-]+++++++++++++.---.]
06:57:31 <immibis> it is supposed to read from standard input until end-of-file and echo it putting the character with code 1 before and after it
06:57:50 <immibis> in other words, it is meant to translate plain text into a CTCP request when run on EgoBot as a daemon
07:05:23 <immibis> left over from an earlier revision
07:05:37 <immibis> what about the CRLF though?
07:15:18 -!- Arrogant has quit ("Leaving").
07:59:59 -!- clog has quit (ended).
08:00:00 -!- clog has joined.
08:00:09 -!- sebbu has joined.
08:58:48 <immibis> !daemon ctcp bf8 +[.[-],[.,]+.++++++++++++.---.]
08:59:24 <immibis> !daemon ctcp bf8 +[.[-],[.,]+.+++++++++.]
08:59:48 <immibis> anyone know what is happening?
09:00:30 <EgoBot> Use: usertrig <command> <trig-command> Function: manage user triggers. <command> may be add, del, list or show.
09:00:54 <immibis> !usertrig add ctcp bf8 +.,[.,]+.
09:06:57 <immibis> why does the goat go woof?
09:10:33 -!- immibis has quit ("I cna ytpe 300 wrods pre mniuet!!!").
10:19:10 <bsmntbombdood> "If you are caught downloading copyrighted material, you will lose your ResNet privileges forever.", then, later on the page, "Copyright © 2005 by the University of Kansas". Ouch.
10:33:40 <oklopol> <oerjan> Ach, du lieber! <<< OMG, rather you?
10:36:04 <oklopol> hmph, why is everyone gone when i need them
10:36:23 <oklopol> okay, i admit i didn't need oerjan that much
10:44:44 -!- ehird` has joined.
11:00:34 -!- jix has joined.
11:03:50 <ehird`> where's that video about procedures in c2bf again?
11:20:58 -!- Cesque has joined.
11:22:12 -!- Cesque has quit (Client Quit).
11:48:05 -!- andreou has joined.
11:53:37 <oklopol> made a language with static typing
11:54:30 <oklopol> actually, i solved my problem
12:05:46 -!- jix has quit (Nick collision from services.).
12:06:00 -!- jix has joined.
12:36:55 -!- RedDak has joined.
12:40:01 -!- andreou has quit ("Leaving.").
12:48:29 -!- oklofok has joined.
12:48:56 <oklofok> so okay, i make a language, then try creating i using s, k and i -combinators.
12:49:11 <oklofok> WHY CAN'T MY I COMBINATOR USE ITSELF RECURSIVELY???
12:49:23 <oklofok> this kept me occupied for quite a while
12:50:12 <oklofok> (don't use recursion if you don't know it or just happen to be a miserably failish person.)
12:53:05 -!- ololobot has joined.
12:54:07 <oklofok> >>> numbda s={a->{b->{c->(a!c)!(b!c)}}};k={a->{b->a}};i={a->s!k!k!a};i!7
12:54:24 <oklofok> the i combinator via ``skk in numbda
12:55:09 <oklofok> (the language i created to make it possible to write lambdas using parentheses while still having them for normal grouping)
12:55:26 <oklofok> and no, this feature hasn't been done yet
12:55:36 <oklofok> and yes, i know no one is interested in whether it is
12:55:48 <oklofok> and now, gonna eat something funnish ->
12:56:46 <oklofok> crack it if you wish, tell me if you do
13:01:50 -!- oklobot has joined.
13:02:04 <EgoBot> help ps kill i eof flush show ls bf_txtgen usertrig daemon undaemon
13:02:07 <EgoBot> 1l 2l adjust axo bch bf{8,[16],32,64} funge93 fyb fybs glass glypho kipple lambda lazyk linguine malbolge pbrain qbf rail rhotor sadol sceql trigger udage01 unlambda whirl
13:03:28 <oklofok> oklobot sucks, i just wanted 4 nicks here for the hell of it
13:03:41 <oklofok> now, retry at the going away thing ->
13:03:47 -!- ihope__ has joined.
13:04:12 -!- ihope__ has changed nick to ihope.
13:28:26 <ehird`> numbda looks like oklotalk
13:29:21 <ehird`> wait how does egobot do befunge
13:35:11 <ehird`> is there a precompiled binary of fukyorbrane for windows anywhere?
13:37:51 <ehird`> FukYorBrane combined with self-replicating brainfuck?
13:38:01 <ehird`> you could easily replace an opponent's code with your own.
13:40:10 <oklofok> ololobot has a new language now
13:40:19 <oklofok> >>> bs 33<11<=!Hello> world>:
13:40:49 <oklofok> now, perhaps, i'm going ->
13:52:14 <ehird`> one thing i don't understand about bf function calls like in c2bf
13:52:49 <ehird`> is that the only way to call a function is to put the function id in the current cell, and then >end the loop< (i.e. return from the current function.) so how do you handle my_function() { a_func(); more_code; }? you'd return right after a_func
13:53:05 <ehird`> and you can't use a call stack since you can't represent a certain part of a function
14:18:26 <ehird`> do many brainfuck compilers optimize x[x] to a do..while?
14:30:03 <ihope> oklofok: what's that language?
14:30:12 -!- RedDak has quit (Remote closed the connection).
14:34:04 <ehird`> hmm, x[x] optimization could really speed up some code
14:50:16 <ihope> Ponders how to write that without x being present twice.
14:50:51 <ehird`> ... i think it'd be hard
14:50:58 <ehird`> which is why lovely compilers should do it for us!
14:51:44 <ihope> Perhaps AI means a good compiler.
14:52:10 * ehird` ponders writing a bf-to-c compiler in C, optimizing - yeah it's been done before, but they're short affairs, and you can optimize so much in BF
14:52:49 <ehird`> (wow -- i'm stupid, i just realised that cell-wrapping is just modulo 256)
15:03:05 -!- oerjan has joined.
15:34:44 <ehird`> i might write that bf compiler.
16:33:43 <oklofok> (ihope) oklofok: what's that language?
16:33:53 <oklofok> that one i call oklobot :)
16:34:18 <oklofok> that's a language of my friends
16:34:55 <oklofok> it's kinda like brainfuck, except you have bitwise logic and basic arithmetic for adjacent cells
16:35:04 <ehird`> is there a page on the wiki describing most of the good brainfuck-compilation optimization techniques known?
16:35:24 <oklofok> my friend's knowledge about esoteric languages is pretty much limited to brainfuck
16:35:33 <ihope> >>> bs 33<11<=!Hello> world>:
16:35:42 <oklofok> ehird`: that while -> do while thing isn't possible in general, methinks
16:36:13 <ehird`> just match on a parse tree x[x], where x is matched as what's in the [], then convert
16:36:16 <oklofok> because you can't keep a cell for the while in store if you don't know where in memory x will land
16:36:42 <oklofok> i was thinking about what ihope said
16:36:47 <oklofok> and answered to him, actually
16:36:56 <ehird`> well you said "ehird: that while -> do..."
16:37:10 <oklofok> i did, because i forgot who asked what.
16:37:20 <ehird`> what i mean is, instead of x[x] being e.g. x; while(*p){x} it's do{x}while(*p)
16:37:21 <oklofok> optimizing that is just a string match
16:37:32 <ehird`> or a parse tree match for more advanced compilers :P
16:37:46 <oklofok> essentially the same in the case of brainfuck
16:38:00 <ehird`> maybe x[xy] could be optimized too
16:38:02 <oklofok> because in brainfuck you can't play with syntax
16:38:07 <ehird`> that is if x isn't just one character or something silly
16:39:02 <oklofok> >>> numbda "Hello, world!"
16:39:16 <oklofok> i realized my static scoping is broken when i was eating
16:39:31 <oklofok> recursion in general will not work
16:39:55 <oklofok> but you can't notice it yet, really, since there aren't control flow operators to make recursion usable
16:40:08 <ehird`> i also think that the algorithms to set the ptr to a certain value can be optimized
16:40:15 <ehird`> things like copying, too
16:40:31 <ehird`> you just need either some heuristics or some hard-coded snippits to optimize
16:40:39 <oklofok> you mean [-]+++++ can be made into cell=5
16:40:53 <oklofok> i think my brainfuck compiler does that
16:41:23 <ehird`> yes, and: [>+<-] can be optimized too
16:41:36 <ehird`> it's *(p + 1) += *p; *p = 0;
16:41:41 <oklofok> my brainfuck compiler optimizes that methinks
16:41:47 <ehird`> to do it completely requires solving the halting problem of course
16:42:03 <ehird`> but you can try some heuristics, and use hardcoded optimizations for a few ways.
16:42:16 <oklofok> any [] that has right_moves-left_moves==0 can be completely optimized.
16:42:22 <oklofok> and my compiler does that methinks
16:42:30 <ehird`> also you can optimize every single one on http://esolangs.org/wiki/Brainfuck_constants :)
16:42:34 <oklofok> if i actually implemented the last optimization
16:42:46 <ehird`> oklofok, and that has no IO right, you mean :)
16:43:05 <ehird`> "and that has no I/O, right"
16:43:11 <ehird`> and, how do you do it?
16:43:16 <ehird`> do you interpret it at compile-time?
16:43:24 <ehird`> otherwise nested loops such that r-l==0 might be hard..
16:43:38 <oklofok> you just sum up the +'s and -'s for each level
16:44:08 <oklofok> and an optimized [] will just be a list like [ccell-4]+=4, [ccell]-=3
16:44:21 <ehird`> so [+++[---]] would be compiled as while (*p) { *p += 3; while (*p) { *p -= 3; } }
16:44:31 <ehird`> i was thinking you'd flatten the loop somehow and i was confused
16:44:43 <oklofok> err, [---] would be optimized as [-] = NULLIFY
16:44:52 <ehird`> i mean in the context of this optimizations
16:44:53 <oklofok> [+++NULLIFY]==[-]= nullify
16:45:34 <oklofok> you can flatten a thing like [+-+-+-+->-+-+-++---->-+-++-<--+--<<-+++-<-+++++++>]
16:45:49 <oklofok> [+-+-+-+->-+-+-++---->-+-++-<--+--<>-+++-<-+++++++>]
16:46:16 <oklofok> and nullifications can usually be there as well and can be optimized
16:46:38 <ehird`> so [>++<-[+>-<]] would be while (*p) { *(p + 1) += 2; *p--; while (*p) { *p++; *(p + 1)--; }} right
16:47:12 <oklofok> but that's a pretty obvious optimization anyway
16:47:21 <oklofok> of course, i was wrong there
16:47:23 <ehird`> i see these optimizations would be much easier with the code as a nested list (for loops) and a language with pattern matching ;)
16:47:30 <ehird`> this would be quite verbose in C
16:47:34 <ihope> Did somebody say Haskell?
16:47:45 <oklofok> a non recursive one with num(>)-num(<)=0 can always be fully optimized
16:47:56 <oklofok> i mean, with no nested []'s
16:48:00 <ihope> I've hardly heard of SML.
16:48:05 <oklofok> but obvious obvious, that doesn't really help
16:48:06 <ehird`> ihope, i think it looks nice
16:48:09 <ehird`> i haven't used it much
16:48:20 <ihope> Related to ML, probably.
16:48:50 <oklofok> and of course you have the code as a nested list
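As a rough illustration of how short these peephole patterns are once the program is just a sequence you can match on, here is a Python sketch covering the clear/set-constant loop ([-]+++++ -> cell = 5) and the simple copy loop ([->+<] / [>+<-] -> move into the next cell). The pseudo-ops ('set', 'move') are invented for the example.

    import re

    def optimize(bf):
        ops = []
        bf = re.sub(r"[^+\-<>\[\],.]", "", bf)   # strip non-command characters
        i = 0
        while i < len(bf):
            if bf.startswith("[-]", i) or bf.startswith("[+]", i):
                # clear loop: zeroes the cell; fold a following run of +/- into it
                i += 3
                n = 0
                while i < len(bf) and bf[i] in "+-":
                    n += 1 if bf[i] == "+" else -1
                    i += 1
                ops.append(("set", n % 256))        # [-]+++++  ->  cell = 5
            elif bf.startswith("[->+<]", i) or bf.startswith("[>+<-]", i):
                # copy/move loop: add the current cell into the next one, then clear it
                ops.append(("move", +1))
                i += 6
            else:
                ops.append(("op", bf[i]))
                i += 1
        return ops

    print(optimize("[-]+++++[->+<]"))
    # [('set', 5), ('move', 1)]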
16:48:51 <ehird`> oklofok, what about initialization optimizations
16:49:12 <oklofok> you mean stuff like constants?
16:49:20 <ehird`> >+++<- at the start of the program makes e.g. the char tape[3000] be char tape[3000] = { 255, 3 };
16:49:31 <ehird`> instead of tape[3000]; <setting stuff here>
16:50:57 <oklofok> the only thing that can't completely be optimized is stuff where a part of a code uses a cell whose value isn't surely known at that point
16:51:31 <oklofok> so everything done before an input can trivially be encoded in the starting patterns
16:52:26 <ehird`> you mean, things like >+++<->[code] is optimized as code not being conditional at all?
16:52:49 <oklofok> a program that doesn't take input is optimized into its result.
16:52:54 <oklofok> if you do compiling/optimizing.
16:53:01 <ehird`> no matter what that program is?
16:53:09 <ehird`> a factorial program with a fixed input would be evaluated at compile time?
16:53:14 <ehird`> that doesn't take input.
16:53:20 <ehird`> but that, at compile time, is insane
16:53:23 <ehird`> you're not writing a compiler
16:53:38 <oklofok> i see it as the best optimization possible.
16:53:39 <ehird`> you're writing an interpreter which sometimes delegates input to the code outputted by it
16:53:53 <ehird`> seriously, no compiler would run a whole factorial program and then just compile the result
16:54:06 <oklofok> well, i'm not talking about a compiler
16:54:15 <oklofok> i'm talking about what you *can* optimize away
16:54:24 <oklofok> i don't care about what's actually feasible
16:54:34 <ehird`> the "optimization" you have described has a name it's called interpretation :)
16:55:14 <ehird`> interpretation really just optimizes source code into a more optimal form - it does a pretty good job, too - it produces output requiring no computation. :-)
16:55:24 <oklofok> you can't compile, run, recompile because... you'd get scared?
16:56:05 <oklofok> if a code always produces the same output, the best optimization is to have it just return that output
16:56:35 <ehird`> Yes, and that falls under the subclass of optimizations known as "interpretation"
16:56:43 <oklofok> if you don't want to optimize that because of your ideology, that's fine
16:56:50 <oklofok> but do not start bugging me about it :)
16:57:00 <ehird`> However, interpretation is generally not a good optimization for a compiler to perform, as compilers are designed to generate code which goes through the optimization process of interpretation
16:57:06 <ehird`> Doing it before the output defeats what a compiler is meant to do.
16:57:22 <ehird`> i'm not bugging you :) just saying
16:57:22 <oklofok> aha, so you can't optimize constants
16:57:35 <oklofok> you said you would like it to do that earlier
16:57:37 <ehird`> you can, because that is not interpretation in its strictest sense
16:57:44 <oklofok> i'm not sure where we went a different way.
16:57:47 <ehird`> (Really, everything is interpretation. But, let's think of it stricter)
16:58:03 <ehird`> we went a different way when you said that all programs without input should be optimized fully to their output
16:58:09 <ehird`> because that is interpretation in its strictest sense :)
16:58:21 <oklofok> i said that's how far you get in optimization
16:58:33 <oklofok> it's just you can choose any level between 0...that
16:59:16 <oklofok> any loop that always does just the same thing can be optimized, that's the most basic idea of optimization, you can choose to optimize it away fully, or just optimize some of it
16:59:45 <oklofok> i'm just saying there's nothing superturing about optimizing code that produces the same output every time
17:00:51 <oklofok> i know you mean you want +++++(<- input there) [code to calculate f(x) for any x indicated by the number of +'s in the beginning] to actually just have the loop optimized
17:00:58 <oklofok> so that the first +'s could be changed
17:00:59 <ehird`> my definition of a very-highly-optimizing compiler is that it optimizes up to everything but complete interpretation - the point of a compiler, IMO, is to produce code which you can then apply that final optimization on
17:01:14 <oklofok> and it would have the same functionality, just change its first few bytes
17:01:40 <oklofok> you can't know which +'s in the code are input hardcoded by the programmer.
17:01:49 <oklofok> so you can't optimize anything.
17:02:01 <oklofok> and i know i'm not being clear :)
17:02:12 <ehird`> optimization, is all about heuristics
17:02:33 <ehird`> true optimization - to make code completely "optimal" - is impossible.
17:03:16 <oklofok> yes, you can't optimize fully a code that can take infinite input
17:03:26 <oklofok> i mean, any length input that happens to be given
17:03:44 <oklofok> but you can always trivially optimize anything that does not take input
17:03:52 <oklofok> unless you have ideological problems with that
17:04:10 <ehird`> i think our definition of input is mixed up
17:04:13 <oklofok> i don't care about that stuff, i just care about the fact you can optimize a constant.
17:04:26 <oklofok> by input you also mean hardcoded input, i know
17:04:39 <oerjan> there is a subtlety if your non-input taking expression doesn't terminate.
17:04:55 <ehird`> compilation in code without errors should ALWAYS succeed
17:04:57 <ehird`> even if it doesn't halt.
17:05:23 <oklofok> if it doesn't terminate quickly, of course you can't optimize it
17:05:54 * ehird` does the halting problem dance
17:05:58 <oklofok> you define it when you make your optimizer.
17:06:44 <oerjan> also, there is a subtlety if the result is actually much larger than the expression creating it, and isn't always used.
17:06:47 <oklofok> anyway, i just meant constants, and a program taking no input can always be optimized into its result if you have its result
17:07:05 <ehird`> so i could have some code that takes hours to compile but less than a second to run
17:07:21 <oklofok> oerjan: stop making points :)
17:07:41 <ehird`> also i could have code that, just because it takes a long time to execute, is denied optimization -- Oh a-ha! This can result in /different output for the same input on different machine specs/
17:07:45 <ehird`> Which is fundamentally wrong
17:07:49 <oerjan> oklofok: i am saying, partial evaluation is a well-known optimization technique but it has limits.
17:07:49 <oklofok> ehird`: if it takes an hour to compile, it takes an hour to run
17:08:03 <oklofok> oerjan: yes, but i didn't think of that
17:08:15 <oklofok> stop being cleverer than me, is my point :D
17:08:21 <ehird`> sure but i might want to have some sort of automatic build process so people working on something can test the code
17:08:28 <ehird`> if its run at build time they can't
17:09:43 <oklofok> though i was wrong about the fact you can always optimize a non input taking program, which i now find very very dumb, i was right in saying if you can do it, you should
17:09:57 <ehird`> a team of people are working on software A
17:10:00 <oklofok> or then you are just making a bad optimization for fun
17:10:05 <ehird`> they agree to each test each new release
17:10:22 <ehird`> so, automated program B compiles the new version of A, so that the team can test it (hint: it has a bug - it loops forever!)
17:10:26 <oklofok> and of course, true, you shouldn't optimize if the output is very complex compared to the code
17:10:38 <ehird`> however the compilation process runs on the automated program, so each coder only gets the output produced
17:10:40 <oklofok> in which case you just optimize some parts
17:10:41 <ehird`> they cannot test the software.
17:10:51 <ehird`> define "very complex compared to"
17:10:52 <oerjan> oklofok: never! especially when i am having trouble with #haskellers outclevering me :)
17:11:08 <oklofok> oerjan: i'll become better then, okay?
17:11:41 <oklofok> len(code)>len(memory state)
17:12:05 <oklofok> code being the unoptimized code, memory state being after the run
17:12:09 <ehird`> string:length(compile(code)) > string:length(compile(memory state))?
17:12:30 <ehird`> if so, you could have some really complex code that doesn't get optimized just because of its output size -- this seems like a bad heuristic
17:12:46 <oklofok> ehird`: so you want an optimization that's still possible to turn into the original brainfuck code?
17:12:57 <ehird`> (AND, of course, you get a longer compile time)
17:13:07 <ehird`> (Since it has to compile BOTH (running one segment of code that may be complex), THEN compare the results)
17:13:08 <oklofok> i get that impression from the teams-working-on-something example
17:13:19 <ehird`> (If it decides against optimizatin, then it has to execute AGAIN at run-time - zzzz snore)
17:13:52 <oklofok> ehird`: compiling oughtta be fast?
17:15:14 <oklofok> you mean if the original program runs T seconds, and the compiler runs U seconds, the resulting code must run <= T-U seconds?
17:15:24 <oklofok> i can't think of another criteria
17:16:11 <oklofok> i'm not sure where i got that impression, you never said anything about a criteria
17:16:55 <oklofok> anyway, i don't see how a compiler shouldn't try to run the code fully
17:17:54 <ehird`> well, then why compile at all? :)
17:18:10 <oklofok> to make the program faster?
17:19:01 <oklofok> if you do precompilation, of course you don't optimize even +++>--<++
17:19:07 <oklofok> in the beginning of the program
17:19:17 <oklofok> it's faster just to execute one instruction at a time.
17:19:30 <oklofok> i mean, if you do interpretation
17:19:38 <oklofok> s/precompilation/interpretation
17:19:55 <oklofok> if you interpret the code, then my arguments about this have been wrong
17:20:01 <oklofok> but you were talking about compilation.
17:20:12 <oklofok> unless you have mixed the two concepts
17:20:24 <oklofok> *confused the two concepts
17:22:35 <ehird`> anyway a compiler is an interpreter and an interpreter is a compiler.
17:23:07 <ehird`> Wow, a BF compiler that warns if < and > aren't balanced...
17:23:33 <oklofok> sounds like a sucky compiler :P
17:23:49 <ehird`> http://home.arcor.de/partusch/html_en/bfd.html
17:24:08 <oklofok> okay... well guess you often have them balanced
17:24:28 <oklofok> but i'd prefer syntax highlighting for those loops that have them balanced
17:24:50 <ehird`> a stack in brainfuck is 1 (item 1) ... 0 isn't it?
17:25:25 <ehird`> [1][my item][1][my item][0]
17:25:40 <ehird`> and you navigate it with [>process item>], and push with [>>]+>(CALCULATE VALUE HERE)
17:25:47 <ehird`> (assuming you're on the starting 1)
17:25:58 <oklofok> well, you can't really ask "what a stack is in brainfuck", but yes, i've done stacks that way, usually
17:26:01 <ehird`> and pop with [>>]<<->(USE VALUE)
17:26:10 <ehird`> well, i meant what's a common, kinda-efficient way :)
17:27:08 <oklofok> if you use multiple stacks, you might wanna have them interleaved
17:27:09 <ehird`> the initial 1, of course, is to seperate stacks
17:27:17 <ehird`> so two stacks, non-interleaved is:
17:27:35 <ehird`> [1][item][1][item][0][1][item][1][item][0]
17:27:58 <ehird`> whereas [item][1][item][0][item][1][item][0] is ambiguous, depending on where you start etc
17:28:18 <oklofok> and a cell for index carrying if you do random access memory
17:28:34 <ehird`> you mean, a "where I am"?
17:28:37 <oklofok> [1][value][for calculation][1][value][for calculation][1][value][for calculation][0]
17:28:50 <oklofok> those are always 0 but can be played with
17:28:54 <ehird`> so like, you do all your destructive operations involving value in [for calculation]
17:28:55 <oklofok> you can use the 1-cell for that
17:28:58 <ehird`> so as not to disturb it
17:29:03 <oklofok> and then make it one after your calculation
17:29:16 <ehird`> (What if you need more cells? Sounds a bit silly... maybe there's a better way)
17:29:20 <ehird`> Well, i guess one cell is goodo
17:29:22 <oklofok> yes, but i just realized you can use the 1-cell for that
17:29:42 <ehird`> you mean, use the interspersing [1]s?
17:29:50 <ehird`> and then do [-]+ once you move it out of the way?
17:30:16 <ehird`> so, you pop off the stack, compute a little bit, move that barrier cell to the top of the stack, go to that cell, repeat
17:30:42 <oklofok> when you move into index n, you carry n with you and each time you go one cell right in your vector, you decrease n until it's zero and you have your value
17:31:07 <ehird`> also pushing should be [>>][-]+>[-](CALCULATE VALUE), you need the [-]s since popped values stay on the tape, just after the end marker
17:31:42 <ehird`> i'll write a short doc explaining it
17:32:17 <oklofok> i was talking about a random access vector, not a stack
17:32:25 <oklofok> unless i wasn't clear about that
17:32:35 <oklofok> which i most likely wasn't
17:37:52 <ehird`> this describes the stack representation i was talking about: http://pastie.caboo.se/80941
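A toy Python model of the tape layout described in that paste — [1][item][1][item][0] — just to make the invariant visible. This models the data layout only and is not brainfuck code; in the real thing, as noted above, popped values linger on the tape past the end marker.

    def push(tape, value):
        end = tape.index(0)                   # the 0 cell marks the top of the stack
        tape[end] = 1                         # old end marker becomes a boundary 1
        tape[end + 1:end + 3] = [value, 0]    # new item, new end marker

    def pop(tape):
        end = tape.index(0)
        value = tape[end - 1]
        tape[end - 2:end + 1] = [0]           # drop boundary + item, 0 is the new end
        return value

    tape = [0]          # empty stack
    push(tape, 7)
    push(tape, 9)
    print(tape)         # [1, 7, 1, 9, 0]
    print(pop(tape))    # 9
    print(tape)         # [1, 7, 0]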
17:45:10 -!- i-- has joined.
17:46:43 -!- trepliev has joined.
17:47:13 -!- i-- has left (?).
17:59:54 <oklofok> ehird`: you want a way to get the value out of the stack as well, in some cases
18:00:06 <ehird`> you mean, navigate to a specific element?
18:00:09 <oklofok> i mean, a way to move it to the beginning of the stack
18:00:25 <ehird`> it's <<[<<], while on a boundary
18:00:28 <oklofok> sorry, i didn't actually read it thorough yet xD
18:00:43 <ehird`> (Well, <<<[<<] is better, but meh)
18:00:52 <oklofok> you don't move it out of the stack
18:01:25 <oklofok> you must be able to get the value from the top of the stack to somewhere else entirely
18:01:43 <ehird`> that's not part of the stack itself.
18:01:57 <oklofok> i mean, traverse the stack down carrying the value
18:02:10 <oklofok> so that you get it *out of the stack*
18:03:12 -!- oerjan has quit ("Dinner, probably").
18:06:44 <oklofok> i used it when making my brainfuck-brainfuck interpreter
18:06:55 <oklofok> i should finish that some day
18:07:12 <oklofok> the interpreter i was making it with was just goddamn crappy
18:07:26 <oklofok> infinite loop -> crash, negative value -> crash
18:11:01 <oklofok> it was about two years ago and i was a total noob, so i'm not actually sure it would even be that much of a challenge
18:11:11 <oklofok> anyway, i'll follow oerjan's footsteps ->
18:11:29 <oklofok> (or in them, if that's the way to say it in english)
18:12:52 <ehird`> hey wow i managed to design a non-esoteric functional language
18:12:58 <ehird`> and it doesn't even look much like haskell!
18:16:22 <ehird`> mine doesn't look as esoteric: http://pastie.caboo.se/80953
18:16:25 <oklofok> though numbda wasn't really designed, it's a result of me starting to code.
18:17:47 <ehird`> (Also, f(x, y) is not a shortcut for f(x)(y) right now, although it is always equivalent. Thinking about adding va-args later.)
18:18:13 <ehird`> (Currying va-arg functions once you have already supplied enough args will require explicit curry(f, list), i guess)
18:18:13 <oklofok> ehird`: i'd say that looks quite a lot like haskell
18:18:20 <ehird`> oklofok, SML is closer :)
18:18:25 <ehird`> SML and Haskell look eerily similar
18:18:31 <oklofok> but then again, haskell doesn't have a "look", really
18:18:33 <ehird`> (Hint: because haskell is inspired by SML)
18:18:48 <ehird`> oklofok, one major difference is how i always use f(x, y) instead of (f x y)
18:18:52 <ehird`> i like it more that way
18:19:10 <oklofok> in oklotalk, those two parse as the same thing, but for a different reason :)
18:19:48 <ehird`> so f(x, y) is f (x , y), which has x y, so it's f x y
18:20:27 <oklofok> (if f isn't a funcoken, that parses differently)
18:20:31 <ehird`> one of the advantages of my syntax is that there's no pesky left-associative-messing-around
18:20:45 <oklofok> (but since objoken and funcoken are my own terms, you don't know what they are)
18:20:56 <ehird`> also, you don't need to do e.g. (*) to get the * function (because f * x is f times x, not f (function *) x)
18:21:03 <ehird`> you can just do f(*, x)
18:21:13 <ehird`> i think x * y binary operators will be `*`(x, y)
18:22:22 <oklofok> i love how everything like that just arises from the underlying structure of oklotalk
18:22:28 <oklofok> but i hate how i can't stop talking about it
18:22:43 <oklofok> really, i'm an irc-a-holic
18:22:51 <oklofok> can't live without irc-a-hole
18:23:09 <oklofok> (i prefer holes over hols.)
18:23:29 <ehird`> heh, i think my language has been heavily influenced by merd: http://merd.sourceforge.net/
18:23:47 <ehird`> except my language has no "if"
18:25:43 <oklofok> i had this idea for a language when speccing numbda
18:25:58 <oklofok> a language called yawn, for its excessive laziness
18:26:12 <oklofok> but i just have ideas for it
18:26:58 <oklofok> (so basically i was just telling the name which is trying to be clever, you have fun with that...)
18:27:13 <oklofok> (i should filter what i say)
18:31:36 <ihope> Excessive laziness?
18:32:09 <ehird`> http://pastie.caboo.se/80960 i should write a spec for this, shouldn't I?
18:32:09 <ihope> oklofok: do you have an oklotalk spec anywhere?
18:32:17 <ehird`> ihope, he only has a parsing spec.
18:32:52 <ihope> ehird`: how do you curry that there?
18:45:21 <ehird`> ihope, you just apply to not enough arguments
18:45:25 <ehird`> note product -> fold(*, 1) ;
18:45:57 <ehird`> if you want to do va-args, when i implement va-args, then you'd have to do curry(vaFunc, [my, curried, args])
18:46:10 <ehird`> (same with default arguments)
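In Python terms (not ehird`'s language), the same two ideas — currying by supplying too few arguments, and an explicit curry(f, args) helper — roughly correspond to functools.partial; fold and product below are stand-ins invented for the example.

    from functools import partial, reduce

    def fold(f, init, xs):
        return reduce(f, xs, init)

    # "apply to not enough arguments": partial plays the role of curry(f, args)
    product = partial(fold, lambda a, b: a * b, 1)
    print(product([2, 3, 4]))    # 24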
18:51:51 <oklofok> ihope: i was thinking there'd be two separate threads evaluating, one so lazy it evaluates nothing, and the other dependant on that
18:52:06 <oklofok> i have some ideas on how to make that work
18:52:14 <oklofok> but not enough to be interesting to tell
18:58:48 <ehird`> ihope, the idea for implementing my language is for it to be interpreted ONLY
18:58:51 <ehird`> well, most of the time
18:59:09 <ehird`> and to have a small C base, and as much as possible written in the language itself (no matter how strained the low-level code might look in it)
18:59:35 <ehird`> then, another version of the base, written in the language itself - so if a compiler is ever written, you can have a self-hosted interpreter
19:03:01 <ehird`> you mean a spec of the C language?
19:03:04 <ehird`> if so, you'll have to pay
19:05:04 <ihope> You have to pay to look at specifications?
19:05:37 <ihope> Okay then, where's a GCC C spec? :-P
19:12:25 <ihope> Or maybe I should compile for GHC if there's no reason to go with C instead.
19:13:43 <ehird`> gcc c spec doesn't exist
19:13:47 <ehird`> you have to pay iso to get the spec
19:13:53 <ehird`> how do you think standards agencies make their money
19:13:59 <ehird`> it costs $80 for C89, iirc
19:14:21 <Sukoshi> Why not just get the C Programming Language?
19:14:27 <ehird`> cause that's not a spec
19:14:36 <Sukoshi> Why do you need a spec? :P
19:14:42 <ehird`> because ihope is compiling to c
19:15:03 <Sukoshi> Does he not know C, or something?
19:15:19 <ihope> Couldn't you rewrite a spec to get an equivalent spec not protected by anything?
19:15:23 <ehird`> You need a spec to reliably compile
19:15:30 <ehird`> ihope, Yes, but it's a pain in the butt so nobody will
19:15:42 <ihope> You don't want your compiler to produce invalid code in obscure circumstances.
19:17:29 <ehird`> http://pastie.caboo.se/80978 more examples!
19:17:31 <ehird`> i need to write a spec.
19:20:28 <ehird`> the comments on 99-bottles-of-beer are almost as stupid as on youtube. http://99-bottles-of-beer.net/language-java-1162.html "Alex Mirchev That language is definatly java.. btw, why is your code so weird... it doesnt look like a correct syntax..."
19:21:42 <ehird`> Also: http://web.mit.edu/kenta/www/two/beer_i_m.html "Java is a machine independent compiler based on C++ which targets to pseudo-code." "Java Script Interpretive Java." grrrrr
19:24:34 <oklofok> i didn't know there was a language called Microsoft Word xD
19:24:49 <oklofok> i know the language, however
19:37:17 <ihope> What's the usual way of making a language "system-complete"?
19:38:11 <ihope> ...as in being able to make all the operating system calls and such?
19:38:43 -!- atrapado has joined.
19:39:43 <ihope> I guess I could reserve some identifier space for... I/O extensions.
19:40:32 <ehird`> write a primitive like syscall() in your target language,
19:40:46 <ehird`> or, wrap around cstdlib or equiv. functions manually
19:40:53 <ihope> Or do one of those.
19:44:13 -!- Sgeo has joined.
19:55:12 -!- RodgerTheGreat has joined.
19:57:23 <oklofok> great, i was just looking for ya
20:08:24 <Sukoshi> Why am I getting a NoSuchMethodError?
20:08:40 <Sukoshi> When the thing is obviously compiling correctly, and the method exists.
20:13:52 <oklopol> Sukoshi: i don't believe you.
20:14:47 <oklopol> i think the compiler is more reliable than you
20:15:05 <oklopol> (and a bit about the chirp)
20:15:46 <Sukoshi> Well, my top Emacs buffer is viewing the method *right* now so :P
20:16:34 <oklofok> then i guess you are both a bit crooked
20:18:58 <Sukoshi> ... Thanks for the help? :D
20:20:00 <oklofok> hey, no problem, that's what i'm for
20:22:18 <oklofok> Sukoshi: i can't really believe that can happen if you aren't doing something very very weird
20:23:02 <Sukoshi> Well, I've been purposefully avoiding generics because I'm not sure if GCJ supports them.
20:23:34 <Sukoshi> So I've been doing a whole bunch of casts.
20:23:53 <oklofok> i like looking at code and i know some java, so if you isolate the problem, i'd love to look ;)
20:28:19 <RodgerTheGreat> Sukoshi: GCJ? eep. Good luck debugging that thing's output. :S
20:29:51 <Sukoshi> RodgerTheGreat: I'm using Sun's JVM right now.
20:31:19 <Sukoshi> GregorR: How's D for writing an emulator?
20:31:34 <Sukoshi> GregorR: I need to use heavy pointer-foo and ASM, so.
20:47:11 <GregorR> Sukoshi: D certainly gives you heavy pointer-foo and ASM if you want it.
20:48:16 <bsmntbombdood> my brother got the harry potter book, he's gone all spastic
20:50:49 <RodgerTheGreat> yeah, that was more of a "why the fuck did he scream" question mark
20:51:11 <Sukoshi> GregorR: How's native D speed compared to C and C++ ?
20:51:18 <Sukoshi> Now I'm really going to shower, heh.
20:51:28 <Sukoshi> (Before this was shower preparation :P)
20:52:14 <GregorR> Sukoshi: That sort of depends on how heavily you use the GC. You can choose to stop the GC and do manual deletion, in which case it's as fast. If you use the GC, it'll stop the world on occasion. That being said, the GC-stopping functions are in there for purposes exactly like emulators, so :P
21:04:04 <ehird`> what's the most noobish form of GC currently known?
21:10:08 <GregorR> Calling reference counting GC is an insult to GC :)
21:12:15 <bsmntbombdood> reference counting works perfectly in languages without mutators
21:12:16 <RodgerTheGreat> sometimes you can build garbage collection into the compiler around some complicated scoping rules
21:13:40 <Sukoshi> GregorR: Yeah, I want to stop the GC.
21:14:23 <Sukoshi> GregorR: Got any good tutorials on it?
21:26:15 <ehird`> i mean non-referencecounting
21:26:27 <ehird`> ref counting is simple but ineffective for e.g. circular objects
21:40:47 <RodgerTheGreat> now I must find a USB adapter to plug this beauty into my mac
21:41:06 <ehird`> that's like, harsh dissonance in hardware form, man!
21:41:39 <RodgerTheGreat> Model M + OSX: beautiful interface for your eyes, and beautiful interface for your hands. :D
21:54:00 <ihope> Circular objects...
21:54:25 <ihope> Yes, there's sort of failure there.
21:57:01 <ihope> Well, if an object contains a pointer to itself, but nothing else contains a pointer to it, the reference counter is still 1.
21:57:31 <ehird`> that's why ref counting is not usable
21:57:40 <ehird`> Python only uses it with hacks (circular detection)
21:57:55 <oklofok> ihope: then there is a pointer to it, let the poor object be, he obviously wants to live
21:57:59 <ihope> It's sort of like determining whether an object is supported based on whether there's something directly under it. Put something under itself, and boom, support.
21:58:44 <oklofok> ihope: are you implying i'm not strong enough to lift myself in the air?
21:59:16 <ihope> oklofok: don't jump; you'll get garbage collected.
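A quick CPython illustration of the circular-object problem: the cycle keeps its own reference count above zero, so reference counting alone never frees it, and the separate cycle detector in the gc module — the "hack" mentioned above — has to step in.

    import gc, weakref

    class Node:
        pass

    a = Node()
    a.ref = a                  # the self-referencing object ihope describes
    probe = weakref.ref(a)
    del a                      # refcount stays at 1 because of the self-pointer
    print(probe() is None)     # False: pure refcounting would leak it forever
    gc.collect()               # CPython's cycle detector breaks the cycle
    print(probe() is None)     # True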
21:59:47 <ehird`> who wants to help design an analog computer rube goldberg machine
22:00:04 <oklofok> ehird`: been my plan for years :)
22:00:13 <ehird`> oklofok, then help design its fruitition :)
22:00:23 <oklofok> i just somehow feel that can't be made over irc :)
22:00:31 <ehird`> it can be designed over the internet
22:00:40 <ehird`> plus final plans can be made and a guide to make your own
22:02:40 <ehird`> noooo! sleep is useless!
22:02:48 <oklofok> true, it's the cousin of death
22:06:09 <oklofok> i do know for sure i should either do something or sleep
22:06:28 <oklofok> hmm, gonna go buy something
22:10:17 <ehird`> hmm, how useful would a computer with a tape of 6 two-state cells be?
22:10:21 <ehird`> i imagine not useful for actual computation
22:10:28 <Sukoshi> RodgerTheGreat: You need an expensive one, by the way.
22:10:41 <ehird`> (assuming a programming language something like a highly simplified boolfuck)
22:11:20 <Sukoshi> ehird`: If you can prove that it's Turing Complete, then you can do anything in it ;)
22:12:37 <ehird`> well obviously 6 two-state cells isn't TC
22:12:46 <ehird`> but is it enough to perform some simple calculations?
22:12:54 <ehird`> since, you can't store many numbers for one
22:13:07 <oklofok> didn't ya read its homepage!?
22:13:21 <ehird`> because that TC claim is a joke by the author
22:13:36 <ehird`> i'm considering 6 0-9 cells
22:13:40 <ehird`> that'd be a bit more useful
22:13:46 <oklofok> errr..... don't think so :|
22:13:57 <ehird`> maybe i could squeeze up to MAX 20 0-9 cells
22:14:09 <ehird`> that should be useful for, i dunno, adding two small numbers together
22:14:19 <oklofok> well, you want the memory to be easily extendable
22:14:29 <oklofok> so you can make it tc when you get an infinite universe
22:14:45 <oklofok> ehird`: still easier to do base-2.
22:15:26 <ehird`> yeah but 20 0-9s offer more computing potential than 20 0-1s
22:15:55 <oklofok> but you can make 100 0-1's easier than 20 0-9's
22:16:12 <oklofok> and you can actually make them calculate stuff without doing something very incredibly hard
22:16:40 <ehird`> well, 100 0-1's will be hard, this IS a rube goldberg machine
22:17:03 <ehird`> i mean, i have to incorporate tennis balls as a main part - making 100 binary registers will not exactly be easy/fun
22:17:21 <oklofok> you think a 0-9 is even possible, then?
22:17:31 <ehird`> they made a difference engine in lego..
22:17:54 <oklofok> does that use 10 base for other than output?
22:18:04 <oklofok> but i didn't understand the pic, so...
22:18:24 <oklofok> anyway, wtf am i still doing here? ----->
22:18:25 <ehird`> well my registers will be primarily output i guess
22:18:39 <ehird`> maybe, 10 output, 10 data
22:19:16 <ehird`> 10 base-2 data cells give me only 1024 combinations of state...
22:19:34 <ehird`> 10 base-10 data cells give me 10000000000.
22:20:07 <oklofok> but a 10-base one cannot be used for computation, too complicated
22:20:29 <ehird`> it can be used for computation, albeit not too simply
22:20:55 <ehird`> though base-2 is easier, as i just need a switch
22:21:14 <ehird`> 20 switches = 1048576 states, which is good
22:21:44 <ehird`> then 10 base-10 output displays.
22:23:44 -!- atrapado has quit ("zadflksdg sh fhjd").
22:27:38 <Sukoshi> The structs in D are so ... easy.
22:30:28 <Sukoshi> Well, there are stuff you get used to like wrapping stuff in structs for type checking, or doing union/struct combos and such.
22:30:42 <Sukoshi> And this new named-struct assignment thinger is waaay cheap.
22:30:53 <Sukoshi> Whatever happened to programmer skill? :|
22:36:57 <oklofok> Sukoshi: invent a worse language and use that one?
22:39:38 <ehird`> D is fun, but sometimes lame
22:40:02 <Sukoshi> GregorR: I'm concerned about all the stuff D takes care of for you.
22:40:09 <Sukoshi> How's the performance hit from that?
22:40:20 -!- sebbu has quit ("@+").
22:40:31 <GregorR> Well, everything it "takes care of for you" you have to ask for except for GC.
22:40:31 <oklopol> Sukoshi: i don't think anything else than gc really affects anything
22:40:54 <Sukoshi> I... don't ... believe you.
22:40:58 <GregorR> Things like dynamic array concatenation et cetera involve a malloc, but you pretty much have to ask for it.
22:41:03 <ehird`> Sukoshi, well, the runtime type system
22:41:08 <Sukoshi> Yeah, there we are GregorR.
22:41:18 <Sukoshi> Most of emulator stuff won't even deal with string concatenation and all.
22:41:28 <Sukoshi> It's just that, OOP is a godsend with that sorta stuff.
22:41:28 <GregorR> Doing type-checking is a fairly quick lookup into the vptr, I've never seen /anyone/ complain about the speed there.
22:41:34 <GregorR> Plus, you can just compile with -release to get rid of that.
22:41:57 <GregorR> [That is, once you're sure that you're not doing anything stupid in runtime type checking, just use -release and it all assumes it's OK]
22:42:05 <oklopol> it's constant time usually, that's like a negative number of clock cycles
22:42:18 <Sukoshi> What's this delegate stuff?
22:43:11 <Sukoshi> So will -release compile out the dynamic array stuff?
22:43:24 <Sukoshi> Or is there a little marker you can give static arrays?
22:44:07 <Sukoshi> And lastly, how do you interface with ASM code?
22:44:58 -!- oerjan has joined.
22:45:16 <GregorR> suifur: Uh, the dynamic array stuff can't be compiled out ..
22:45:26 <GregorR> Sukoshi: But it will compile out the bounds-checking of it.
22:45:47 <GregorR> Sukoshi: As per interfacing with ASM, see http://www.digitalmars.com/d/1.0/iasm.html
22:46:01 <Sukoshi> When was the last time suifur even talked? :D
22:46:13 <GregorR> Damn you tab-completion! :P
22:48:27 <bsmntbombdood> one that uses last-talked order for tab completion
22:49:53 <oklopol> a generic tab completion would be nice
22:50:05 <oklopol> last word used beginning with what you typed
22:51:53 -!- pikhq has joined.
22:53:36 <bsmntbombdood> some text editors and word processors have tab completion of all words in their spellcheck database or previously typed
23:00:44 <Sukoshi> Grrr. NoSuchMethodError!!!
23:08:50 <bsmntbombdood> i thought java methods were looked up at compile time
23:10:55 <ihope> Sukoshi: sprinkle your code with assertions.
23:11:34 <ehird`> bsmntbombdood, the compiler uses exceptions as errors
23:14:13 <ehird`> well, java folks CAN'T code java
23:17:15 <Sukoshi> bsmntbombdood: Didn't I say that I'm a C coder?
23:17:19 * ihope CTCP TIMEs himself because he doesn't feel like double-clicking the clock
23:17:31 <Sukoshi> (I mean, when it comes to static languages.)
23:17:44 <Sukoshi> Well, I've found out the error ... and it's ... weird.
23:19:08 <ehird`> ihope, did it for you.
23:19:12 <ehird`> now you can be even more lazy :)
23:19:53 <ihope> Though my client tosses CTCP requests.
23:20:05 <ihope> data LCTerm = Var Label | Apply LCTerm LCTerm | Lambda Label LCTerm; data SKITerm = Apply SKITerm SKITerm | S | K | I
23:20:14 <ihope> (Never mind the fact that I used "Apply" twice.)
23:20:40 <ihope> Now, continuations would probably help in compiling from LCTerm to SKITerm, though I'm not sure just how.
23:22:06 <ehird`> there's a binary clock but no hexadecimal clock
23:22:18 <ihope> Maybe delimited continuations.
23:22:20 <ihope> compile (Apply t1 t2) = do t1' <- compile t1; t2' <- compile t2; return (Apply t1' t2
23:22:34 <ihope> ...gah, left off the last two characters?
23:22:37 <ihope> compile (Apply t1 t2) = do t1' <- compile t1; t2' <- compile t2; return (Apply t1' t2')
23:22:57 <ihope> compile (Lambda l t1) = do t1' <- compile t1; return (Apply K t1')
23:22:58 <ehird`> i think you are just reinventing hsakell
23:23:01 <oerjan> actually, you want an intermediate format that includes Vars.
23:23:26 <ihope> ehird`: writing something in Haskell is reinventing Haskell?
23:23:40 <ehird`> i thought you were still going on about your language :P
23:24:18 <oerjan> abstraction elimination is just simple recursion if you have vars on both sides.
23:24:42 <ihope> I may be able to come up with a clever way of doing this.
23:25:58 <ihope> compile (Var l) should somehow look for the corresponding compile (Lambda l t1) and... do something with it.
23:26:19 <oklofok> i'd like to do D but i can't install the compiler
23:26:26 <oklofok> these computers are so hard to use :\
23:26:32 <oerjan> you _don't_ want to consider more than one variable at one time. Trust me.
23:27:10 <ihope> Can you prove there's no really clever way of doing this? :-P
23:27:19 <Sukoshi> oklobot: Wanna help with an NES emulator?
23:27:31 <oerjan> of course not. But having a common data structure makes it so much simpler.
23:29:04 <oklofok> Sukoshi: you mean oerjan or me?
23:29:13 <oklofok> i wanna help, oerjan can help.
23:29:30 <oklofok> i haven't done D but it looks awesome
23:29:34 <oerjan> among other things, you want to give the result of translating a sublambda _back_ into the simplification of the outer ones
23:29:44 <ihope> Hmm. Somehow, my mind read that as <Sukoshi> oerjan: you mean ihope or me?
23:29:45 <oklofok> someone install me the compiler and tell me how to use it :)
23:30:17 <ihope> That makes sense as long as Sukoshi said "oerjan: Wanna help with an NES emulator?"
23:30:38 <ihope> Well, I don't have much to lose by trying to come up with a clever way of doing this.
23:30:41 <oklofok> err... you sure it would make sense then?
23:31:00 <oerjan> which means that needs to be in the intersection of the before and after formats
23:32:03 <oerjan> now if you want to be _clever_, come up with an algorithm which doesn't grow exponentially as you nest lambdas.
23:35:07 <ihope> Hmm, I think cleverness is coming vaguely...
23:35:29 <oerjan> yes, although the initial overhead is greater.
23:36:09 <oerjan> you can pass a list of variables to look up in
23:36:38 <oerjan> it resembles deBruijn notation...
23:38:06 <oerjan> i am sure you could even do binary lookup somehow.
23:38:37 <oerjan> (logarithmic growth but horrible overhead, i guess)
23:38:48 <bsmntbombdood> you can always do the naive algorithm but then reduce afterwards
23:39:32 <oerjan> might be easier to choose while you still have lambdas to analyze
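For reference, the naive bracket-abstraction algorithm mentioned here fits in a few lines; below is a sketch in Python rather than Haskell, handling one variable at a time as oerjan suggests, with term constructors as tuples ("var", x) / ("lam", x, body) / ("app", f, a) plus the constants "S", "K", "I" (all names invented for the illustration). It is the textbook algorithm, so it grows quickly as lambdas nest.

    def free_in(x, t):
        if t in ("S", "K", "I"):
            return False
        if t[0] == "var":
            return t[1] == x
        if t[0] == "lam":
            return t[1] != x and free_in(x, t[2])
        return free_in(x, t[1]) or free_in(x, t[2])

    def abstract(x, t):                      # T[\x. t]
        if not free_in(x, t):
            return ("app", "K", compile_ski(t))
        if t[0] == "var":                    # t is exactly x
            return "I"
        if t[0] == "lam":                    # eliminate the inner lambda first
            return abstract(x, abstract(t[1], t[2]))
        return ("app", ("app", "S", abstract(x, t[1])), abstract(x, t[2]))

    def compile_ski(t):                      # eliminate every lambda in t
        if t in ("S", "K", "I") or t[0] == "var":
            return t
        if t[0] == "app":
            return ("app", compile_ski(t[1]), compile_ski(t[2]))
        return abstract(t[1], t[2])

    # \x. \y. x  (the K combinator written as a lambda term)
    print(compile_ski(("lam", "x", ("lam", "y", ("var", "x")))))
    # ('app', ('app', 'S', ('app', 'K', 'K')), 'I')   i.e. S (K K) I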
23:39:47 <ihope> Hmm, a monadic hole...
23:40:21 <Sukoshi> I've found a Microsoft way to fix this error.
23:40:47 <ihope> ...a monadic version of LCTerm that can have holes in it?
23:41:07 <Sukoshi> /* For some reason, the Hashtable contains an extra null element that is useless. When returning number of entries, decrease Hashtable entries by 1 */
23:42:11 <ihope> Zippers are what I'm reminded of, yes...
23:42:37 <oerjan> although zippers with several holes are far more complicated
23:42:41 <pikhq> Sukoshi: Call it a null-terminated Hashtable. :p
23:42:56 <ihope> But so far, I don't think this actually has anything to do with zippers.
23:43:12 <oerjan> i think Oleg (TM) has done a tiny bit on it.
23:43:14 <Sukoshi> But because I want to deliver this code, I think I will do exactly that and do some more heuristics later.
23:43:43 <oklofok> Sukoshi: i do want to help.
23:44:00 <Sukoshi> oklobot: How much ASM do you know?
23:44:00 <oklofok> NES emulator? that gamie thing
23:44:11 <ihope> Cool, we're butchering trademarks...
23:44:26 <oklofok> i haven't written a line of assembly since i never got a compiler set up :)
23:44:31 <oerjan> and vincenz in #haskell was doing something the other day
23:44:38 <Sukoshi> If I wanted theory, I'd use Haskell, not ASM :D
23:44:48 <oklofok> i know a lot of theory about asm
23:44:56 <pikhq> Why are you doing stuff in ASM?
23:44:58 <Sukoshi> Phaw. Be an engineer. Just Do It.
23:45:12 <Sukoshi> pikhq: Because this is practice for a GBA emulator I plan to fork from VBA.
23:45:19 <Sukoshi> Because the Linux VBA is bleh.
23:46:03 <Sukoshi> If you have a brain, and can imagine stacks and registers... it shouldn't be too hard.
23:46:17 <oklofok> i've read a few books about asm, and an intel processor manual or something half-way through
23:46:47 <oklofok> really, i just didn't get tasm and masm to work
23:47:12 <oklofok> installing programs is reeeeal hard
23:47:39 <oklofok> i have nasm and masm on my hd
23:47:56 <oklofok> Figs i think did some assembly... or who was it
23:49:15 <ihope> What makes me happy is that what I'm trying to do would probably be entirely non-obvious without monads :-)
23:49:25 <oklofok> i recall making a program play random sounds with the pc beeper
23:49:29 <oklofok> but i didn't know asm then
23:50:02 <oklofok> that's all i ever made with it
23:50:13 <Sukoshi> Then grab a good tutorial around, and play with it.
23:52:09 <oklofok> uh you gotta love assembly
23:52:25 <oklofok> grab a tutorial, try the hello world program, get 7 errors <3
23:54:06 <oerjan> ihope: with monads, you can make it entirely incomprehensible! :D
23:54:12 <oklofok> Sukoshi: isn't making an NES emulator rather a huge challenge?
23:54:34 <oklofok> though i agree those are the best ones
23:55:07 <ihope> Indeed, Haskell is probably capable of writing extremely short stuff that doesn't make any sense at all until you've thought it over a few days.
23:56:10 <oklofok> i love it how i can just skip @ anywhere in a tutorial, see immediately what's happening and remember reading about how that's done (the basic bit and jmp fun i mean), but i have absolutely no idea how to make a "Hello world" program
23:58:34 <oerjan> what you say three times is true