00:02:21 movement :o 00:04:33 afk dessert :X 00:07:25 see you 00:14:45 o hai 00:16:42 o hai? 00:16:58 oh, hi 00:18:44 -!- bsmntbombdood_ has joined. 00:18:51 əʊ haɪ? 00:20:03 youre canadian arent you 00:20:47 New York born and raised 00:21:04 huh. then your ling training was better than i expected. or something. 00:21:21 tho i indeed wouldve said @U, i probably wouldve transcribed it as oU 00:23:19 more like copying and pasting from the local IPA-friendly dictionary 00:23:35 I can read IPA. (hardly) 00:24:10 I can speak LaTeX, though. got my IDE to be ∈ the mode for ∞ LaTeX enjoyment 00:24:38 latex made me reveal my password to irc the other day :| 00:24:45 i'd been doing my semantics homework in latex 00:24:53 lots of \f{x} shit all day 00:24:55 for like two days 00:25:05 then i go to irc and notice im nicked as augur_ 00:25:08 so i type in 00:25:18 \msg nickserv id flibble 00:25:22 in a channel 00:25:23 ... 00:25:45 cute password. *blows a kiss to it* 00:25:51 oh, yeah. backslash is not natural. 00:25:56 NOT NATURAL MAN 00:25:57 well, thats not what it was, right 00:26:03 i just chose it as an example 00:26:04 :P 00:26:22 well, well, my fake password is bubbles. and it's cuter than yours! >:| 00:26:30 :P 00:26:38 but mine is MISTER flibble! :| 00:27:49 oh god, looks like I'm dealing with a professional here. *backs away slowly* 00:28:50 * Gracenotes browses reddit 00:30:34 I'm watching Buffy. 
yes, THE VAMPIRE SLAYER 00:30:39 D: 00:30:55 it is keeping me quite interested 00:37:33 ok so heres my idea 00:37:45 you'd have the expression 1 + 2 00:37:52 "1 + 2" 00:38:10 with the "abstract" structure ["1", ["+", "2"]] 00:39:07 "1" = 1 :: n, "2" = 2 :: n, "+" = \s:SUM(s) -> true :: s -> t / n,n,n 00:39:38 where n,n,n means "if s is SUM(s) then you need three more participants to evaluate this expression" 00:40:20 ["+", "2"] = \s:SUM(s) -> true && first(s,2) :: s -> t / n,n 00:40:48 ["1", ["+", "2"]] = \s:SUM(s) -> true && first(s,2) && second(s,1) :: s -> t / n 00:41:04 the evaluator says, ok this is the top of the tree 00:41:06 so we lift it now: 00:41:24 \s:SUM(s) -> true && first(s,2) && second(s,1) && third(s,w) :: s -> t 00:41:39 but this isnt a value type so close over it 00:41:48 Es:SUM(s)[true && first(s,2) && second(s,1) && third(s,w)] 00:41:57 but this has an open variable, so question-close that 00:42:06 Ww[Es:SUM(s)[true && first(s,2) && second(s,1) && third(s,w)]] 00:42:22 and this can be evaluated because this is a structure that the evaluator understands 00:42:28 namely, it means 1+2 00:42:39 so it evaluates it Ww[Es:SUM(s)[true && first(s,2) && second(s,1) && third(s,w)]] and returns 3 00:43:07 (well, first it does some logic to conjunction reduce Ww[Es:SUM(s)[true && first(s,2) && second(s,1) && third(s,w)]] to Ww[Es:SUM(s)[first(s,2) && second(s,1) && third(s,w)]] 00:43:08 :p 00:46:00 * Gracenotes makes a note to take semantics 00:46:07 "Seriously messes with your mind" 00:46:12 :p 00:47:14 tell me about semantics 00:52:17 why, back in my day, the whole family came together to talk about semantics in front of the fireplace, we exchanged many arguments about.. er.. whatever semantics is about.. and lambdas.. ∀ dog. 
∃ person and stuff 00:52:27 o_o 00:52:40 :P 00:56:12 motherfucking damn 00:56:18 i've got a couple of sticky pixels 00:57:07 ok ive refined the ideas some more 00:57:35 brb 00:58:27 ok 00:58:28 so 00:58:43 imagine you have the expression "a one + a two" 00:59:03 for simplicity actually, lets say just "1 plus a two" instead 00:59:31 lets say "1" ~ 1 :: n 00:59:55 "plus" ~ \s:SUM(s).true :: s -> t / n,n,n 01:00:07 -!- calamari has joined. 01:00:19 "two" ~ \y.y = 2 :: n -> t 01:00:43 and "a" ~ \p.Ey.p(y) :: (n -> t) -> t 01:01:27 or even, lets say, "a" ~ \p.\q.Ey.p(y) & q(y) :: (n -> t) -> (n -> t) -> t 01:01:42 Gracenotes: semantics messes with your mind? 01:01:56 (uorygl: look at what im saying! its messed with MY mind! :D) 01:02:41 you put together "a" with "two" to get 01:03:08 It would be nice if I knew what you were talking about. 01:03:09 \q.Ey[y = 2 & q(y)] :: (n -> t) -> t 01:03:21 And what your notation is. 01:03:47 then you TRY to put together \s:SUM(s).true :: s -> t / n,n,n with \q.Ey[y = 2 & q(y)] :: (n -> t) -> t 01:03:50 but this fails 01:03:56 then you notice, hold on 01:04:24 (n -> t) -> t?! maybe if i replace the expression "a two" with some unbound variable z :: n 01:04:37 So, um, "1 plus a two". 01:05:53 eh no sorry my explanations are sucky :d 01:05:54 :D 01:06:02 my logic isnt quite worked out yet 01:06:25 1 is a DP; it denotes a specific value. 01:06:32 Two is a noun; it denotes a type of thing. 01:06:53 A is a determiner; it takes a type of thing and denotes a specific instance of it. 01:07:01 actually, DP's dont denote specific values 01:07:02 Though it doesn't specify the specific instance. :-P 01:07:03 but nevermind 01:07:14 DPs totally denote specific values! 01:07:34 actually they dont 01:07:39 but thats ok 01:08:26 Plus is... a conjunction? A preposition? Anyway, it takes two DPs and becomes a DP denoting their sum. 
01:08:48 plus is a lambda with a body, and a restriction on its variables 01:09:00 "plus a two" is another lambda 01:09:03 ok i think i have it 01:09:19 Well, figure out whether "plus" is a conjunction or a preposition. 01:09:32 I guess maybe it's both. 01:10:17 So, DPs don't denote specific values? 01:10:37 Explain. 01:10:37 "a two" then has the value \q.Ey[y = 2 & q(y)] :: (n -> t) -> t 01:10:51 so you notice, ok fuck, this doesnt combine with "plus" :: s -> t / n,n,n 01:11:07 but then you notice, hey, "plus" needs an n participant, and i'm an (n -> t) -> t 01:11:30 so lets replace "a two" with an unbound variable z :: n, and hold "a two" off on the side for a while 01:11:36 Hang on. 01:11:43 (n -> t) -> t is the same as C n for some monad C. 01:11:47 hush 01:11:55 Go on. 01:11:57 so we instead pretend this is "plus z" where z :: n 01:12:04 holding "a two" on the side 01:12:40 so now we combine \s:SUM(s).true :: s -> t / n,n,n with z :: n, returning 01:13:15 \s:SUM(s).(true && first(s,z)) :: s -> t / n,n 01:13:31 then we combine 1 :: n with that, returning 01:13:52 \s:SUM(s).(true && first(s,z) && second(s,1)) :: s -> t / n 01:14:24 then we say, ok, so this is the top of the tree, but thats not a value i can deal with 01:14:53 so let me pull this up to an existentially closed statement 01:15:03 er sorry, no 01:15:04 :p 01:15:16 first, let me add the last n: 01:15:29 \s:SUM(s).(true && first(s,z) && second(s,1) && third(s,w)) :: s -> t 01:15:42 so now i can existentially close s: 01:15:55 Es:SUM(s).(true && first(s,z) && second(s,1) && third(s,w)) :: t 01:16:31 but z is associated with "a two" and i havent handled that yet 01:16:52 so "a two" is \q.Ez[z = 2 && q(z)] 01:17:06 and we can bind z on the whole tree: 01:17:14 \z.Es:SUM(s).(true && first(s,z) && second(s,1) && third(s,w)) :: n -> t 01:17:20 and hey, n -> t! 01:17:26 this is what q needs to be! 
01:17:31 so lets use this as the value for q 01:17:44 combining "a two" with "1 plus z" 01:17:46 giving back 01:18:02 Ez[z = 2 && Es:SUM(s).(true && first(s,z) && second(s,1) && third(s,w))] :: t 01:18:05 lovely 01:18:24 but w is open and we introduced it to satisfy the participant needs of s:SUM(s) 01:18:38 so we have to bind that with a question operator 01:18:52 Ww[Ez[z = 2 && Es:SUM(s).(true && first(s,z) && second(s,1) && third(s,w))]] 01:18:56 ok, lovely 01:19:29 but this isnt something we know how to comprehend, so lets do some logic: minimize Ez to exist around the smallest thing it can: 01:19:49 Ww[Es:SUM(s)[true && Ez[z = 2 && first(s,z)] && second(s,1) && third(s,w)]] 01:20:04 -!- soupdragon has quit ("Leaving"). 01:20:09 now apply some general theorem that Ex[x = y && p(x)] :- p(y) 01:20:10 to give 01:20:31 Ww[Es:SUM(s)[true && first(s,2) && second(s,1) && third(s,w)]] 01:20:36 then conjunction reduce true out of existence 01:20:40 Ww[Es:SUM(s)[first(s,2) && second(s,1) && third(s,w)]] 01:20:56 and now notice that this is precisely the kind of statement that the interpreter understands by default. 01:20:56 :"D 01:30:30 So, what's the purpose of all this? 01:30:37 Are we trying to make computers understand English? 01:30:41 no 01:30:43 quite the opposite 01:30:45 its an esolang 01:30:52 Oh. 01:30:55 designed to have syntacto-semantics like natural language 01:31:19 I guess I can see why in this sense DPs don't denote specific values. 01:34:22 So in your language thing, a DP has type (n -> t) -> t? 01:34:22 It's a predicate on predicates? 01:34:22 its not a DP 01:34:22 What do you mean by "it"? 01:34:22 "a two" is just a convenient use of english words for function names 01:34:22 In English, "a two" is a DP. 01:34:22 yes 01:34:22 but this isnt english! 01:34:22 True. 01:34:22 "a two" is just a function of type (n -> t) -> t 01:34:22 So by DP, I mean a thing that looks like an English DP. 01:34:22 So why is it an (n -> t) -> t? 
01:34:22 because i wanted to force QR 01:34:22 If you're trying to be English-like, I would expect a DP to be either a value or a predicate on values. 01:34:22 Anyway, what's QR? 01:34:22 quantifier raising 01:34:22 "a two" is actually a quantifier, right 01:34:22 \q.Ex[x = 2 && q(x)] 01:34:41 just like you might have some quantifier in haskell 01:34:59 or haskellish 01:35:18 Can you use pseudo-Haskell notation here? I'm not used to whatever you're using. 01:35:18 all xs f = Ex:(x in xs)[f(x)] 01:35:39 or, for the sake of being pure 01:35:49 lets define 01:35:56 well no we cant define 01:35:57 haha 01:36:00 were TRYING to define 01:36:01 :D 01:36:02 ok 01:36:04 I'm not following a lot of what you're saying. 01:36:21 all xs f = foldr (&&) True (map f xs) 01:37:06 exists xs f = foldr (||) False (map f xs) 01:37:16 exists ~ some 01:37:17 whatever 01:37:28 Did you tell me what the value of "a two" is? 01:37:33 not yet 01:37:35 im getting ther e:D 01:37:39 :D 01:37:47 I think I'd rather you just tell me and we work backwards. 01:38:28 a f g = exists [x : x <- [1..], f x] g 01:38:40 If I want to learn why the gostak distims what it does, I'd like to know what the gostak distims first. 01:38:51 two x = x == 2 01:39:05 a two == exists [x : x <- [1..], x == 2] g 01:39:24 s/a two/a two g/, I'm assuming. 01:39:31 er, yeah sure 01:39:40 a two = exists [x : x <- [1..], x == 2] 01:39:54 now we have plus 01:40:16 So "a two" takes a predicate and returns whether 2 satisfies that predicate, right? 01:40:23 yep 01:40:27 plus is something i cant write in haskell, but lets modify haskell a bit 01:41:07 lets say that in haskell, the argument in a lambda can have a restriction which is _almost_ like a type but instead tells you what kind of thing this is for semantic interpretability 01:41:11 we'll denote it like so 01:41:16 f x:R = ... 
01:41:21 where R is the restriction 01:41:33 so lets say that plus s:SUM = true 01:41:40 So in your type system, n means values and t means Booleans, right? 01:41:49 n means Ints, t means Bools 01:41:53 s means States 01:42:03 so plus s:SUM = true :: s -> t 01:42:06 That means a DP is C n, where C is the continuation monad C x = (x -> t) -> t. 01:42:34 but since s:SUM, and SUMs need 3 "participants" in order to be semantically _sensible_ 01:42:40 we can say 01:42:49 plus s:SUM = true :: s -> t / 3 01:43:03 "s:SUM" can be read "s which is a SUM", yes? 01:43:10 or s -> t / n,n,n specifically 01:43:16 sure, s-which-is-a-SUM 01:43:20 but its not a type restriction, keep in mind 01:43:23 Right. 01:43:41 ok so now we want to combine "plus" with "a two" 01:43:49 In "plus s:SUM = true :: s -> t / 3", are you using s consistently? 01:43:55 It looks like you're using it inconsistently. 01:44:07 What does "t / 3" mean? 01:44:15 its (s -> t) / 3 01:44:24 Oh. 01:44:29 the / 3 means "needs three participants in order to be sensible" 01:44:45 im using s as both the name of the type and the name of the var 01:44:50 Don't do that. 01:44:56 if you want, plus s:SUM = true :: S -> T / N,N,N 01:45:02 Wonderful. 01:45:03 anyway 01:45:17 and "a two" is (N -> T) -> T 01:45:21 so this doesnt match up 01:45:24 It looks to me like you're being really complicated. :-P 01:45:31 "a two" cant be an arg to "plus" 01:45:39 so lets be hopeful: 01:45:41 What is a State, anyway? 01:45:45 dont worry ;P 01:45:55 in the end, the primitive types wont matter 01:46:02 and we can make it untyped 01:46:05 or mono-typed 01:46:14 well, mono-primitive 01:46:19 anyway 01:46:23 so lets be hopeful 01:46:27 I would have plus :: ((N,N,N) -> T) -> T or something. 01:46:33 no shush 01:46:34 listen 01:46:36 be quite 01:46:37 Yes, sir. 01:46:49 since "a two" looks like (a -> b) -> b 01:46:58 this just looks like a type-lifted a 01:47:03 Right. 
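The pseudo-Haskell quantifiers from the log (all, exists, two, a) can be made runnable with one change: the infinite domain [1..] is cut to a finite one so the folds terminate. This is a sketch under that assumption; allQ/existsQ are renamed only to avoid clashing with Prelude's all.

```haskell
-- Runnable sketch of the log's quantifier definitions, assuming a
-- finite domain in place of [1..] so that evaluation terminates.
domain :: [Int]
domain = [0 .. 100]

-- the log's "all xs f = foldr (&&) True (map f xs)"
allQ :: [a] -> (a -> Bool) -> Bool
allQ xs f = foldr (&&) True (map f xs)

-- the log's "exists xs f = foldr (||) False (map f xs)"
existsQ :: [a] -> (a -> Bool) -> Bool
existsQ xs f = foldr (||) False (map f xs)

-- "two" is a noun: a predicate picking out the number 2
two :: Int -> Bool
two x = x == 2

-- "a" is a determiner: it turns a noun into a quantifier,
-- so "a two" :: (Int -> Bool) -> Bool is a predicate on predicates
a :: (Int -> Bool) -> (Int -> Bool) -> Bool
a f g = existsQ [x | x <- domain, f x] g
```

As in the log, `a two g` asks whether 2 satisfies the predicate g.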
01:47:05 so "a two" looks like a type-lifted N 01:47:18 so lets replace "a two" with some place holder, call it tz 01:47:19 of type N 01:47:28 and hold "a two" off to the side 01:47:30 Great. 01:47:46 so now what we're really trying to do is combine plus :: (S -> T) / N,N,N with tz :: N 01:47:56 well, still, tz isnt an S so it cant be an argument to plus 01:48:03 but maybe its one of those three participants! 01:48:09 -!- Pthing has quit (Remote closed the connection). 01:48:20 Do you want to explain why plus has that type? 01:48:23 no 01:48:30 its magic, youll see 01:48:38 so we can specify this. its the first of the three participants to be added, sooo 01:48:42 we combine the two like so: 01:49:22 plus ^ tz = \s:SUM -> true && first(s,z) :: S -> T / N,N 01:49:55 pretend i said z everywhere before where i said tz :p 01:50:11 Sure... 01:50:21 (im mixing domains of discourse here, so im thinking partially in terms of words with meanings, and partially in terms of pure meanings, so its difficult. anyway...) 01:50:36 so plus ^ tz = \s:SUM -> true && first(s,z) :: S -> T / N,N 01:50:41 Sure. 01:50:57 now 1 is :: N 01:51:02 cause remember we're doing "1 plus a two" 01:51:12 (As you can see, the ellipsis denoted that I was responding to your second to last message. :-P) 01:51:12 N again can be used as a participant 01:51:21 :P 01:51:57 You know, it seems to me that natural language has a value embed :: (Monad m) => m a -> a 01:52:05 so we combine (plus ^ z) with 1: 1 ^ (plus ^ z) = \s:SUM -> true && first(s,z) && second(s,1) :: S -> T / N 01:52:31 and now we've handled the whole of the _expression_ "1 plus z" 01:52:33 but wait 01:52:38 we have some problems! 01:52:49 first, we didnt handle that whole "a two" thing, we just left it off to the side 01:53:10 second we still dont have a last N participant for this function 01:53:15 so we must proceed! 01:53:17 I don't think I can continue listening unless you tell me why you gave plus that type. 01:53:17 what can we do? 
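uorygl's aside that a DP of type (N -> T) -> T is "C n for some monad C", with C x = (x -> t) -> t, is the continuation monad, and the type-lifting being described falls out of it. The block below is an illustration of that remark only, not part of the esolang: T is fixed to Bool and the domain is a small finite list, both assumptions of this sketch.

```haskell
-- The continuation monad C x = (x -> Bool) -> Bool from uorygl's remark.
newtype C a = C { runC :: (a -> Bool) -> Bool }

instance Functor C where
  fmap f (C k) = C (\p -> k (p . f))

instance Applicative C where
  pure x = C (\p -> p x)          -- "type-lifting" a plain value
  C kf <*> C kx = C (\p -> kf (\f -> kx (p . f)))

instance Monad C where
  C k >>= f = C (\p -> k (\x -> runC (f x) p))

-- "a two" as a quantifier: feed its continuation every 2 in the domain
aTwo :: C Int
aTwo = C (\q -> any q [x | x <- [0 .. 10], x == 2])

-- "1" lifted into the monad, as the log lifts N to (N -> T) -> T
one :: C Int
one = pure 1

-- combining inside the monad plays the role of the log's hold-aside trick
onePlusATwo :: C Int
onePlusATwo = do { x <- one; y <- aTwo; return (x + y) }
```

Asking `runC onePlusATwo (== 3)` then checks whether "1 plus a two" can denote 3.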
01:53:25 i will when im done :p 01:53:38 well, we can existentially close off the s variable! 01:53:43 \s:SUM -> true && first(s,z) && second(s,1) :: S -> T / N 01:53:46 but this then becomes.... 01:53:48 Then say my name when you're done. 01:54:19 :| 01:54:22 no keep reading 01:54:23 were almost there 01:54:47 \z -> exists states (\s:SUM -> true && first(s,z) && second(s,1)) :: N -> T / N 01:54:54 er sorry 01:54:57 not quite that far :p 01:55:03 exists states (\s:SUM -> true && first(s,z) && second(s,1)) :: T / N 01:55:12 Darn, no alternative activity comes to mind. I'll have to keep reading. 01:55:27 but now wait, z isnt _technically_ a bound variable 01:55:48 so we have to bind it, but we dont want to do existential, since we got z by removing "a two" 01:55:50 so we do a lambda: 01:55:55 \z -> exists states (\s:SUM -> true && first(s,z) && second(s,1)) :: N -> T / N 01:56:18 and now we have an N -> T that "a two" :: (N -> T) -> T can combine with! 01:56:34 so we put them together to get 01:56:42 Though I'm not reading so much as checking to see whether you're answering my question yet. 01:56:53 exists [z : z <- [1..], z = 2] (\z -> exists states (\s:SUM -> true && first(s,z) && second(s,1))) :: T / N 01:57:02 UORYGL 01:57:11 fucking fine 01:57:21 let me stop this ALMOST FINISHED EXPLANATION 01:57:26 so i can answer your question 01:57:31 Thank you very much. 01:57:38 LETS HOPE YOU REMEMBER WHERE WE FUCKING WERE 01:58:32 the idea is to try and force the language to allow _only_ primitive combinations of functions (ie the kind of combinations you can type into a REPL) which involve ONLY forks of two functions 01:58:49 Forks? 01:58:51 yes 01:58:53 like in J 01:59:01 I'm not familiar with J. 01:59:15 fork f g h x = f (g x) (h x) 01:59:22 so the classic definition of avg is 01:59:39 avg = fork (+) len (/) 01:59:44 er 01:59:45 well 01:59:50 fork (/) (+) len 01:59:51 :p 02:00:23 where + should be the summation operation 02:00:24 whatever 02:00:27 Right. 
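The J-style fork just defined can be written directly in real Haskell. In the sketch below, the log's "+" (really the summation operation) and "len" become sum and a Double-returning length:

```haskell
-- J's "fork": apply g and h to the same argument, then combine
-- the two results with f. This is the log's fork f g h x = f (g x) (h x).
fork :: (b -> c -> d) -> (a -> b) -> (a -> c) -> a -> d
fork f g h x = f (g x) (h x)

-- the classic example: average as a fork of sum and length
avg :: [Double] -> Double
avg = fork (/) sum (fromIntegral . length)
```

So `avg xs` is `sum xs / fromIntegral (length xs)` without ever naming xs.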
02:00:45 since fork (/) (foldr (+) 0) len = \xs -> (/) (foldr (+) 0 xs) (len xs) 02:01:04 so then how do you force plus to be a fork-able function? 02:01:12 well, thats where the language-y stuff comes in 02:01:27 just say plus is a predicate that specifies the kind of thing its argument is 02:01:29 e.g. 02:01:40 all plus does is \s:SUM -> true 02:01:52 so that if you do 02:01:58 plus a_state 02:02:07 this is true 02:02:16 and, by assertion, s is a SUM 02:02:31 (so the :SUM thing kind of is an assertion of type, rather than a restriction on type) 02:02:36 Though I've not thought about it, it sounds a lot like you're saying, "I want my language to allow only oranges. Now, how do we represent an apple as an orange?" 02:02:44 exactly. 02:03:12 if you want your language to be purely logical (Prolog) you have to represent addition some special way 02:03:25 Where "apple" and "orange" are mutually exclusive concepts. 02:03:25 if you want your language to be purely functional (Haskell) you have to represent loops some special way 02:03:26 etc etc 02:03:33 nope, theyre NOT mutually exclusive 02:03:41 theyre just very different perspectives on things 02:03:43 so, to continue 02:04:14 you've now figured out how to satisfy "a two"'s argument 02:04:15 "Apple" and "orange" are mutually exclusive. You're denying that my analogy applies. 02:04:19 Which is precisely what I expected. 02:04:23 -!- oerjan has quit ("Good night"). 02:04:26 BOTH ARE FRUITS 02:04:28 :| 02:04:29 >| 02:04:39 you've done so and gotten back exists [z : z <- [1..], z = 2] (\z -> exists states (\s:SUM -> true && first(s,z) && second(s,1))) :: T / N 02:04:48 which is GREAT, because this is a basic value 02:04:58 but wait, its not a sensible meaning 02:05:06 it still needs an N 02:05:07 :( 02:05:13 but we dont HAVE an N! 02:05:14 what can we do 02:05:15 ? 02:05:26 well, we can ask which N will make it sensible! 02:05:32 I think I'm not your intended audience. 02:05:41 but now? 
stick third(s,w) in there at the appropriate place 02:05:52 exists [z : z <- [1..], z = 2] (\z -> exists states (\s:SUM -> true && first(s,z) && second(s,1) && third(s,z))) :: T 02:06:01 but now this has an unbound variable z 02:06:04 so we have to close it 02:06:07 Which means I'm just being distracted and should get back to something useful. 02:06:09 but how can we close it? well, lambdas. 02:06:29 woops, this should be w 02:06:32 since we have z already 02:06:36 \w -> exists [z : z <- [1..], z = 2] (\z -> exists states (\s:SUM -> true && first(s,z) && second(s,1) && third(s,w))) :: N -> T 02:06:47 but alas, this is _still_ not a value 02:06:58 what can we do? 02:07:06 we can "question" it: 02:07:16 which [1..] (\w -> exists [z : z <- [1..], z = 2] (\z -> exists states (\s:SUM -> true && first(s,z) && second(s,1) && third(s,w)))) :: t 02:07:38 where which is like 02:07:56 * uorygl nods in a way that expresses that he believes that his suspicion has been confirmed. 02:08:16 which x:xs f = if (f x) then x else which xs f 02:08:38 and which [1..] (\w -> ...) is of type N 02:08:53 and we can definitely display one of those! 02:08:54 3 02:09:17 this is how my esolang is going to calculate 1 + 2 :D 02:09:23 er 02:09:25 1 plus a two 02:09:34 similar for 1 + 2 but less crazy 02:10:10 or, since we're being haskellish 02:10:17 1 `plus` (a two) 02:10:26 I keep coming back to this conversation. It would be easier for me to leave if I explicitly closed my end and you acknowledged the closure. 02:10:29 So, see you. 02:10:36 aww :( 02:10:46 uorygl doesnt like my esolang :| 02:11:17 Your esolang is probably interesting, but I must admit I can't find interest in what you're saying. 02:11:29 well, i was giving an evaluation trace 02:11:33 who COULD find interest 02:11:44 If your purpose is to interest me, you should answer my questions and say little else. 02:11:55 well ask me a question then 02:12:07 you sort of interrupted an explanation i was giving to Gracenotes, so... 
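The whole trace, from "a two" through "which", can be simulated end to end. Everything below is an assumption-level model, not the esolang itself: a SUM state is encoded as a triple (x, y, x + y), first/second/third check its participants, and the infinite domains are cut to finite ones so the search terminates.

```haskell
-- A minimal model of the log's evaluation of "1 plus a two".
-- A "state" is a SUM event, modelled here (an assumption of this
-- sketch) as a triple (x, y, z) with z == x + y.
type SumState = (Int, Int, Int)

domain :: [Int]
domain = [0 .. 20]

states :: [SumState]
states = [(x, y, x + y) | x <- domain, y <- domain]

first, second, third :: SumState -> Int -> Bool
first  (a, _, _) n = a == n
second (_, b, _) n = b == n
third  (_, _, c) n = c == n

exists :: [a] -> (a -> Bool) -> Bool
exists xs f = any f xs

-- the log's "which": the first element of the list satisfying f
which :: [a] -> (a -> Bool) -> a
which (x:xs) f = if f x then x else which xs f
which []     _ = error "no sensible answer"

-- which [1..] (\w -> Ez[z = 2 && Es:SUM[first(s,z) && second(s,1)
--                                       && third(s,w)]])
onePlusATwo :: Int
onePlusATwo =
  which domain (\w ->
    exists [z | z <- domain, z == 2] (\z ->
      exists states (\s -> first s z && second s 1 && third s w)))
```

The only SUM state with first participant 2 and second participant 1 is (2, 1, 3), so the question-closure is answered by w = 3.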
02:12:12 you got the rest of it 02:12:17 then you started asking queeestions 02:12:18 and its like 02:12:26 i was in the middle of explaining something, you demanded i restart 02:12:27 fine 02:12:28 :| 02:12:30 Okay. I felt I would have been interrupting you, since you were saying so much. 02:13:01 You don't have to be interesting. Did I imply that I wanted that? 02:13:10 s/be interesting/interest me/ 02:13:36 Anyway, my question, if you want to answer it, is what it means for a fork to be "of" two functions. 02:13:48 In fork f g h x, what's that a fork of? 02:13:55 anyway 02:13:57 uh 02:14:02 what do you mean 02:14:25 Well, you said "only primitive combinations of functions which involve only forks of two functions". 02:14:37 fork :: forall a, b, c. (a -> b) -> (a -> b) -> (b -> c) -> c 02:14:45 i defined fork earlier :| 02:14:56 fork f g h x = f (g x) (h x) 02:15:05 Right. 02:15:13 my types are not matching my definitions but whatever 02:15:15 you get the point 02:15:35 So is fork (/) (+) len a fork of (+) and len, or what? 02:16:02 fork :: forall a, b, c. (b -> c) -> (a -> b) -> (a -> b) -> a -> c 02:16:31 (b -> b -> c), not (b -> b), I think. 02:16:42 fork (/) (+) is a higher order function \f -> \x -> (+ x) / (f x) 02:16:53 no, its (b -> c) 02:17:19 remember tho, i mistyped + instead of like... foldr (+) 02:17:55 What I'm trying to understand is your usage of "of", though. 02:18:05 oh ok 02:18:14 you mean when i said that i wanted everything to be forks of functions 02:18:14 right 02:18:15 ok 02:18:17 what i meant is 02:18:55 a fork of f and g is: forsome h. \x -> h (f x) (g x) 02:19:28 So fork f g h x is a fork of g and h. 02:19:46 yes 02:22:11 * uorygl ponders fork. 02:22:46 -!- adam_d_ has quit (Read error: 110 (Connection timed out)). 02:23:33 There's an arrow operation that does that. Something like (&&&) :: ar a b -> ar c d -> ar (a,c) (b,d) 02:23:36 But anyway! 02:23:56 maybe. 
i dont know what its like in haskell 02:24:13 Are you familiar with arrows in Haskell? 02:24:18 nope 02:25:39 They're things that look like functions. Something is an arrow if it has three operations: turning a function into one, composing two of them, and "fork"ing two of them into one. 02:26:07 Except that's not really fork; it's more like (&&&) f g (x,y) = (f x, g y). You can make fork out of it. 02:26:41 Lessee. fork f g h = f . (g &&& h) . (\x -> (x,x)) 02:26:50 Where . is the composition operator. 02:26:55 But anyway! :-P 02:27:17 i know what . is in haskell :P 02:27:34 and i think you mean 02:28:01 eh.. no youre right. :D 02:28:08 so yeah 02:28:10 anyway 02:28:21 Well, I said that because . only works for actual functions, not arrow things in general. 02:28:55 the evaluation of "1 + a two" is such that a lot of it is not written (e.g. the extraction of "a two" is not explicit) 02:29:06 (in that its done by the interpreter) 02:29:17 * uorygl ponders what "1 plus a two" would look like if it were done with arrows. 02:29:24 but infact maybe you _could_ do it yourself 02:29:31 a two: 1 + 02:29:47 and this evaluates to 3 as well 02:29:56 Let's use @ for an arrow: a @ b is a "function" taking a and returning b. 02:30:00 which lets you use functions normally 02:30:07 except that + is not defined as Int -> Int -> Int 02:30:24 but rather as State -> Bool / Int, Int, Int 02:31:21 "Plus" could be (N,N) @ N, "1" could be () @ N, "a two" could likewise be () @ N... 02:31:43 You can use arrow operators to combine those into () @ N the right way. 02:32:16 i really should write the interp for this 02:32:20 Yes! 02:33:19 i like the idea tho man 02:33:20 omg 02:33:29 we have to extend it to quantifiers tho 02:33:52 by we i mean me 02:33:55 but im talking to you, so 02:34:13 nah there can be real functions 02:34:31 "A two" isn't really an N, though; you said that. 02:34:31 no! haha! 
there wont be 02:34:36 right 02:34:40 "a two" is (N -> T) -> T 02:35:03 If we define a @ b as a -> (b -> T) -> T, then "a two" is () @ N. 02:35:23 specifically, "a two" == \q.Ex:(x = 2)[q(x)] 02:35:57 hmm yes 02:38:43 Though what you say makes me want to say stuff of my own. :-) 02:38:54 :) 02:39:57 Let's call P a point concept, a specific thing, where anything you might refer to is exactly one point concept. 02:40:12 so i think "a" would infact be defined as E* 02:40:49 "That orange over there" is a phrase referring to a point concept. There is exactly one thing that you're talking about. 02:41:09 yes 02:41:11 its a deictic 02:41:12 :P 02:41:29 deictic terms are the only referentials really 02:41:45 but even "that", while being deictic, is a quantifier 02:41:48 * uorygl lets his browser look up the word "deictic". 02:42:00 its just a quantifier that has a use of a bound variable 02:42:06 P -> T is a sort of region concept, a type of thing. "Orange" is a region concept, I guess. 02:42:07 bound globals, really 02:42:52 that f g = exists [x : x == the_global, f x] g 02:43:04 Let's say R = P -> T. 02:43:10 ok 02:43:18 Nouns tend to denote Rs, and DPs tend to denote Ps. 02:43:30 "point" concepts are denoted E, btw 02:43:31 :p 02:43:36 well, in semantics, e 02:44:12 (you know, we dont use -> in semantics? we use <,>, following church. its horrible. a -> b -> c == <a,<b,c>>) 02:45:09 Something that's attested is DPs denoting Rs instead of Ps, which is too bad. 02:45:43 Now, let's pretend English is a subject-dropping language; it doesn't really matter. 02:46:17 its actually rare that DPs actually point 02:46:22 Verbs can stand alone as sentences, but they are usually modified by subjects, direct objects, indirect objects, and prepositional phrases. 02:46:23 ignoring "that dog" 02:46:35 "the cat" does not actually point 02:46:37 nor does "John" 02:46:53 "John" doesn't actually point? 02:46:56 nope! 
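Backing up to the arrow detour above: in Haskell's Control.Arrow, (&&&) already duplicates its input, so (g &&& h) x == (g x, h x) and the explicit \x -> (x,x) step is unneeded; the pairwise combinator uorygl described, (f x, g y) on a pair, is (***). A sketch:

```haskell
import Control.Arrow ((&&&), (***))

-- fork via arrows: (g &&& h) feeds the argument to both g and h,
-- and uncurry f combines the resulting pair.
fork :: (b -> c -> d) -> (a -> b) -> (a -> c) -> a -> d
fork f g h = uncurry f . (g &&& h)

avg :: [Double] -> Double
avg = fork (/) sum (fromIntegral . length)

-- the pairwise version acts on each component of a pair separately
pairwise :: (Int, Int) -> (Int, Int)
pairwise = (+ 1) *** (* 2)
```

For plain functions this fork behaves exactly like the direct definition fork f g h x = f (g x) (h x); (&&&) also generalizes to other arrows, which functions alone do not.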
02:47:03 well, it sort of does actually 02:47:15 "John" can be one of two things 02:47:21 either an NP, which is a predicate like "cat" 02:47:40 -!- Slereah has quit (Read error: 60 (Operation timed out)). 02:47:41 or a deictic DP, with a covert deictic existential quantifier 02:47:54 -!- BeholdMyGlory has quit (Remote closed the connection). 02:47:56 Intuitively, "John is eating" and "Somebody named John is eating" mean different things. 02:47:58 the idea is that when you say "John" as a DP, as in "John danced" 02:48:11 -!- Slereah has joined. 02:48:22 sure sure but "Somebody named 'John' is eating" is overly something even more complex 02:48:33 -!- jpc has joined. 02:48:46 we think that "John danced" is really "NOM John danced" 02:48:55 where NOM is roughly "Some" only a covert some 02:48:57 notice, 02:49:03 nom nom nom 02:49:06 "John danced" is the same as "Some 'John' danced" 02:49:14 except where you have someone in mind 02:49:34 "Some 'John' who I'm thinking of danced" 02:49:53 at least, this is what you have to say to fully account for the semantics of english 02:50:07 without being obnoxious and saying that there's either 02:50:17 a) 10 different versions of the verb "eat" 02:50:41 I still think that intuitively, "John is eating" and "Some John who I'm thinking of is eating" mean different things. The former is ambiguous, the latter is inspecific. 02:50:48 b) 10 different versions of "the" 02:50:59 well, it DOES mean something different 02:51:15 because the meaning of "Some John who I'm thinking of is eating" is an actual english sentence with a complex meaning 02:51:26 see the problem is that like 02:51:44 the quantifier inside of "NOM" is not the SAME as the quantifier "some ... who I'm thinking of" 02:51:44 You seem to be claiming that they do mean the same thing. 02:51:52 theyre roughly the same, right 02:52:02 but insofar as your internal representation is concerned, they're not 02:52:18 Huh. 
02:52:23 and representationally they're different 02:52:32 in the same way that "1 + 2" and "2 + 1" are different haskell expressions 02:52:38 but if you evaluate them they mean the same thing 02:53:00 if you "evaluate" the two sentences "John is eating" and "Some John that I'm thinking of is eating", WHILE you're thinking of that particular John 02:53:10 you wont be able to find a difference in meaning 02:53:49 Let's say that "John is eating" does refer to a specific John, though which John it is may be undeterminable. 02:54:00 undeterminable by the listener 02:54:01 not by you 02:54:30 when we reflect upon the meanings of sentences, we're not looking at whether or not the sentence is true, nor what makes the sentences true, we're looking at the form of the meaning 02:55:19 the form of the meaning of "John is eating" really does seem to have something like a quantifier in it, at when you represent the meaning using the normal modes of meaning representation. 02:55:46 if all we ever said were things like "John is eating" this wouldnt be an issue 02:56:00 hell, if all we said were things like "John is eating" and "all dogs are brown" this wouldnt be an issue 02:56:17 the issue is "John is eating something" 02:56:29 and 02:56:39 So I'm trying to create the concept of "the P referred to by a DP". I think in order to do that, I have to find three mutually correlated properties, and say that this new concept is the thing that they are related by being correlated with. 02:56:39 "The John that I know is eating something" 02:57:08 what? 02:57:10 I think I'm getting caught up in Philosopher's Molasses. 02:57:19 no, semanticists molasses :p 02:57:22 but such is language! 02:57:37 to account for object quantification you need a whole mess of shit 02:57:40 Rationalist's Molasses. 
02:57:42 there are a number of options 02:57:52 in ONE option, you could ALMOST get away with John being deictic 02:58:04 except then you have to have as many words "John" as you know people named John 02:58:19 AND you also have to have another word "John" that means "person named 'John'" 02:58:37 I'm pondering when it's valid to postulate a concept. 02:59:11 and at the same time you STILL need to have purely deictic terms like "that" 02:59:27 or some covert terms that seem to be genuinely deictic 03:00:01 and then theres the problem that in many languages, even PROPER NOUNS must be used with "the" 03:00:02 e.g. 03:00:12 in greek, you dont say "John is dancing" you say "The John is dancing" 03:00:32 Anyway, we should stop trying to find the Deep Theoretical Platonic Truth until we figure out how it's related to the Shallow Empirical Observable Truth. 03:00:40 the same "the" (an 'o' or thereabouts eg 'o janos') that you use to say "the cat" 03:00:53 there is no shallow empirical observable truth :D 03:01:08 quantum mechanics and that berkeley dude has shown us this 03:01:20 On the contrary! 03:01:30 look, the evidence that "John" is really a deictic quantifier is quite strong 03:02:02 yes, it seems like "John is eating" points to some particular person John (because it does) and that it has a different meaning than "Some John is eating" (because it does) 03:02:16 If I look at a glass, I know that I see a glass. 03:02:23 Is there really a glass there? Almost certainly. 03:02:38 The presence of that glass is the shallow empirical observable truth. 03:02:38 but that does NOT mean that "John" here by itself, as the smallest meaningful object that contains the letters "John", is infact the thing doing the pointing. 
03:02:54 you dont know you see a glass, you THINK you see a glass 03:03:15 the most you know is that you're experiencing what it would be like to see a glass 03:03:24 or what you would call a glass 03:04:15 thats not to say you dont have a very good reason to think that 03:04:15 Sure. In any case, we know *something* upon seeing a glass; whatever logic is based on, let us call that knowledge. 03:04:27 s/logic is based on/the premises of logical reasoning are/ 03:04:41 but that reason is merely 500 million years of evolution hard-coding some very sophisticated philosophical argumentation 03:04:51 so seeing a glass is NOT shallow at all 03:05:12 its just that the depth is masked by the fact that conscious experience is a very recent thing that has very LITTLE access to cognitive processes 03:05:24 It's shallow in that it's a very familiar and intuitive sort of thing. 03:05:29 yes 03:05:38 but familiar and intuitive is irrelevant 03:05:48 when we're doing semantics, we're not asking what is intuitive 03:05:54 we're asking what is real 03:05:58 or accurate, at least. 03:06:22 science has revealed again and again that the intuitive understanding of the world is usually wrong 03:06:23 I guess it depends on whose perspective you're speaking of. From a human's perspective, it's a shallow truth. From a computer's perspective, it's presumably a pretty deep one. 03:06:44 motion is not newtonian, there is no such thing as simultaneous, and you are not the best authority on the contents of your own mindbrainthing 03:06:54 sure, its a shallow truth 03:06:56 like i said 03:06:57 that means nothing 03:07:05 because were not ASKING what the shallow truth is 03:07:16 when you ask a person what does "fido is a dog" mean, they just repeat the sentence back at you 03:07:22 "why, it means that fido is a dog! duh!" 03:07:33 but thats not a meaning 03:07:43 thats not what your mental representation of the meaning is 03:07:45 surely not! 
03:08:00 because "fido is a dog" is a string of letters/sounds! 03:08:22 and its almost certainly not the case that our representations of meaning are in terms of abstract syntax trees 03:08:56 because then bilingual speakers would never be able to say that "rex is a cat" and "rex est un chat" mean the same thing 03:08:57 But our deep knowledge is based on our shallow knowledge and it exists for the purpose of predicting what our next shallow knowledge will be. 03:09:08 because french and english probably dont have the same abstract syntax in that regard 03:09:13 yes but guess what 03:09:25 our shallow knowledge of LANGUAGE is buttressed by an enormously complex linguistic faculty 03:09:50 what reason do you have to believe that when you learn the word "fido" that you're just learning what it points to? 03:10:00 this is NOT part of your shallow experience 03:10:09 because now we're in the realm of cognitive science/psychology/whatever 03:10:22 your shallow experience of the world does not include the details of name acquisition 03:10:33 it includes the experience of seeing someone point to an animal and go "fido" 03:10:36 "fido fido fido" 03:10:38 "this is fido" 03:10:44 "c'mere fido!" 03:10:47 "fido want a treat?" 03:10:51 "thats a good fido" 03:11:26 (oops, sorry, i just used a quantifier in your primary linguistic data! D: you might get the impression that fido is really a predicate!) 03:11:45 Hmm, I think we're talking about the deep truth again. 03:11:48 no, we're not 03:11:59 because you're saying that the "Shallow" meaning of "fido" is the dog itself 03:12:02 but how do you know this? 03:12:09 theres nothing in your experience of the world that shows this 03:12:25 you have to make an ASSUMPTION about how people learn the meaning of words 03:12:52 your ASSUMPTION is that "fido" is by default going to be mapped to the dog in question, not to some abstract "fidoness" quality 03:12:59 but thats the whole thing you're ARGUING!
03:13:00 We're talking about things that are themselves deep truth. 03:13:19 The shallow truth of semantics is whatever we can observe and feel about it. 03:13:27 but you CANT observe semantics 03:13:41 and you cant FEEL everything either 03:13:44 like i said earlier 03:13:51 you _FEEL_ that when you typed "John" 03:13:52 If semantics has no observable or feelable consequences, there's no reason to discuss it. 03:13:56 that this whole thing referred to someone 03:14:02 well, sure, in a way it DID refer to someone in particular 03:14:14 but that doesnt mean that "John" those four letters are really all there is to that phrase 03:14:22 nor that that person is all there is to that meaning 03:14:37 Great. So I can see some shallow truth there. 03:14:43 no, you cant 03:14:57 Um... 03:15:07 all you can see is the fact that the string of letters "John" seems to refer to a particular person. 03:15:08 thats it. 03:15:14 that doesnt tell you whether or not thats ALL it does 03:15:27 Right. The shallow truth is that "John" probably refers to a particular person. 03:15:42 like i said 03:15:44 deictics DO refer 03:15:51 and "John" can be a deictic 03:15:55 but its STILL a quantifier as well 03:16:06 because its a quantifier which has a deictic element 03:16:10 or, more accurately 03:16:22 its a predicate that combines with a quantifier that has a deictic element inside it 03:16:38 suppose that J was _that particular John_ 03:16:54 the current best guess is that "John" in "John is eating" has the meaning roughly like 03:17:33 \p -> Ex[x = J & calledJohn(x)] 03:17:47 you've GOT the deicticy thing in there 03:18:04 you're referring directly to the person, in that there IS the use of J in there 03:18:18 but that does NOT mean that the whole thing isnt a quantifier, ok 03:18:55 infact, the root word "John" (not the phrase "John" in "John is eating") is NOT the thing thats contributing the referentiality 03:19:00 its the quantifier!
03:19:21 \q, p -> Ex[x = J & p(x) & q(x)] 03:19:47 and this quantifier in english (but not in greek maybe?) happens to be phonologically empty 03:20:02 So, you mentioned "the current best guess". What's the question we're guessing the answer to? 03:20:15 what the meaning of "John" looks like 03:20:29 is it the person john, or some quantifier, or whatever 03:20:32 thats the question 03:21:02 the assumption that its just a reference to some person in particular just _doesnt work_ with the rest of the meanings of sentences 03:21:26 now we can waffle about the meaning of "meaning" 03:21:26 So, "What does the meaning of 'John' look like?" Now, I can't look at a meaning and see what it looks like, so what's the shallow truth we're trying to figure out by answering that question? 03:21:35 and say that meaning is the intention behind the expression 03:21:39 but thats way too loose 03:21:52 because then the meaning of an expression is whatever the person intends it to be, regardless of the conventions of the language 03:22:08 "John is eating" means "Mary danced" because i INTENDED it to 03:22:12 well no, sorry, it doesnt 03:22:22 like i said 03:22:28 semantics is NOT a shallow truth 03:22:48 which is why you cant just tell me that John refers to the person because thats the shallow truth of it 03:22:53 I find that instead of waffling about meanings, it's better to say what we'd like to say without using the word at all. 03:22:57 because you're saying the meaning of John is a reference to a person 03:23:08 but the meaning of expressions is not a shallow truth! 03:23:22 well, uorygl 03:23:32 when you can figure out what you're trying to say without using the word meaning, be my guess :P 03:23:34 guest* 03:24:01 or without using any words that covertly incorporate notions of meaning 03:24:06 What should we conclude upon hearing "John is eating"? 03:24:21 What do people want when they say "John is eating"? 
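The denotation sketched above, \q, p -> Ex[x = J & p(x) & q(x)], can be checked mechanically over a toy finite model. The sketch below is a Python stand-in, illustrative only: the domain, the individuals, and the predicate sets are invented for this example, not taken from the log.

```python
# A toy finite model: hypothetical individuals and predicates, chosen for
# illustration only (not from the log).
domain = ["j", "m"]            # j = a particular John, m = Mary
called_john = {"j"}            # the predicate contributed by the root word "John"
is_eating = {"j"}              # the predicate contributed by "is eating"

# The quantifier \q, p -> Ex[x = J & p(x) & q(x)], with J fixed to "j":
# a deictic element (x = J) wrapped inside an existential quantifier.
def john_quantifier(j):
    return lambda p, q: any(x == j and x in p and x in q for x in domain)

john_is_eating = john_quantifier("j")(called_john, is_eating)
print(john_is_eating)  # True: there is an x equal to j, called John, and eating
```

The point of the shape is that the phrase as a whole is a quantifier even though a direct reference to the individual (the deictic J) sits inside it.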
03:24:54 Combine those: What do people want us to conclude when they say "John is eating"? 03:25:17 we should conclude that the speaker believes there is a person that the speaker believes is called john whom the speaker knows, and assumes the hearer knows as well, and that this person is, at this very moment, engaged in a process of eating something or other which the speaker doesnt know the identity of, or which the speaker does not feel the identity of to be relevant 03:25:44 we should also conclude that the speaker believes that this is somehow relevant to the conversation were having and that i will be able to integrate this into my understanding of the situation 03:25:46 Indeed, there's a whole bunch we should conclude. 03:25:51 yes 03:25:58 except that is not the meaning of the sentence "John is eating" 03:26:18 because when someone says "John is eating" they are not conveying to you "I believe you know who John is" 03:26:25 because if they were 03:27:03 well, there are all sorts of because 03:27:04 but 03:27:20 if I said that Frank was wrong when he said that "John is eating" 03:27:55 this is definitely not going to be the case if Frank didn't actually think I knew who John was 03:28:08 Right. 03:28:18 but all of this hinges on the WORD "meaning" 03:28:31 i mean, we can just ask what we should conclude 03:28:52 but that leads to a whole world of shit beyond just LANGUAGE 03:29:01 it leads to psychology of action 03:29:06 inferring why people do things 03:29:06 etc 03:29:14 and that can lead to inferences about the physical world 03:29:39 because "john died after jumping off the bridge" leads us to conclude that the speaker believes something about jumping off of bridges and death 03:29:41 like say falling 03:29:42 and gravity 03:29:47 and that yadda yadda 03:30:02 No shallow truth can hinge upon the definition of a word. Unless, of course, said truth is about the word itself.
03:30:23 what we can or should conclude from a sentence is not the meaning of the sentence, its the meaning of the sentence plus a whole shittone of other crap thats brought into the task of understanding 03:30:35 ah but thats the thing see 03:30:43 you cant even ask what the shallow truth of blah blah blah is 03:30:50 because there is no shallow truth of "John" 03:30:54 its just a word 03:30:56 Anyway: 03:30:59 truth is a property of claims 03:31:02 or propositions 03:31:07 and "John" is not a proposition 03:31:18 Let's call the "real" definition of meaning meaning1, and my definition of meaning meaning2. 03:31:23 well 03:31:28 lets not use the word meaning at all 03:31:35 So the meaning1 of "John is eating" does not include the fact that I know who John is, but the meaning2 does. 03:31:41 lets do what semanticists do and say this: 03:31:55 Why do we care about the meaning1? 03:32:20 watching Maru yawn in slow motion is perhaps the most satisfying thing I've done today 03:32:20 if you know the "meaning" of a sentence like "John is eating", you know what is required of the world for it to be true or not 03:32:35 you also know some other stuff, but at the very least you know what is required for it to be "literally true" 03:33:17 semantics is the study of the literal content of expressions 03:33:23 Well, let me take a different route. 03:33:31 and, it turns out, in order to understand the WHOLE of the "meaning" of "John is eating" 03:33:36 that is, to include your meaning1 as well 03:33:39 I think you would agree that "Oh no!" doesn't have a truth value. 03:33:45 How do you know that it doesn't have a truth value? 03:33:46 you have to know the literal content 03:33:56 And why do you care that it doesn't have a truth value? 03:34:06 it doesnt have a truth value because you cant say "No, you're wrong." when someone says "Oh no!" 03:34:20 since I've sort of been passively wondering what the scope of it all is, how about, say, "John is ugly"?
03:34:27 the question is whether or not "Oh no!" is language. 03:34:37 its a bunch of sounds you made with your mouth 03:34:41 but so is a sneeze 03:34:44 or instead of ugly, some other thing that is 100% an opinion of the speaker 03:35:02 now we might actually accept that its language, sure 03:35:13 lets say its language, ok 03:35:28 So we feel like there's a notion of contradictability. 03:35:33 not at all. 03:35:43 i didnt say that ALL linguistic expressions have truth values 03:35:53 i said that expressions like "John is dancing" have truth values 03:35:53 When somebody says "You're wrong!" in response to "Oh no!", that makes us feel a specific way. 03:36:13 oh oh sorry i see what you mean 03:36:20 well yes, i mean 03:36:22 you cant say 03:36:28 "yes thats true" to someone's "oh no" 03:36:34 augur: again, how's "John is ugly?" 03:36:35 nor can you say "no thats false" 03:36:37 or no thats wrong 03:36:38 or whatever 03:36:42 (gracenotes: what about it) 03:36:45 make that a "?, not ?" 03:37:00 oh, in-situ questions. what about it? 03:37:13 augur: it's not something that is provably true or false, yes, what we learned about in grade school as truth vs. opinions 03:37:27 or "Some say John is fugly" 03:37:46 "John is ugly" is an assertion 03:37:51 well, that is verifiable, although semantics must still make meaning about what the claim is 03:37:55 and, in some very real sense, it can be considered true or false 03:38:10 "John is ugly?" with a question mark is a question 03:38:15 and so doesnt have a truth value 03:38:24 So... 03:38:27 so! 03:38:35 sou desu 03:38:41 sou ka? 03:38:54 what, uorygl 03:38:56 If I know a certain thing, and I hear the sentence "John is eating", I will agree. 03:39:03 sure 03:39:09 this is the kind of meaning that semantics is concerned with 03:39:17 the literal sorts of meaning 03:39:20 Yes, I think we've come up with a definition of meaning. 
03:39:29 not what you can infer from the expression 03:39:37 just what the literal meaning is 03:39:41 john be fugly 03:39:46 the literal content, if you will 03:39:52 just know that, when you hear him used in example questions 03:39:58 there is a whole field of exploring the non-literal meanings behind things 03:40:01 he's not truly beautiful :_: 03:40:02 part of that is pragmatics 03:40:06 part of that is psychology 03:40:07 I think we can confuse the issue a bit, though. 03:40:47 actually we cant. there are very clear lines that can be drawn between literal meaning, pragmatics, and psychology. 03:40:56 Suppose that I and a friend are walking along, and we both see something disgusting. My friend says, "Eew." Am I agreeing? 03:41:02 and the problematic aspects are all at the line between pragmatics and psychology. 03:41:23 What if my friend says "Disgusting!"? 03:41:28 that depends on what you mean by "agree"! 03:42:00 Well, this definition of "meaning" that I came up with hinges on what "agreement" is. 03:42:10 there are at least two words in english that are written "agree" 03:42:13 or two senses of the same word 03:42:20 one is "share an opinion" 03:42:23 as in 03:42:54 "Alexis expressed disgust at the food. I agreed inside, but didn't say anything." 03:42:56 or 03:43:09 -!- FireFly has quit ("Leaving"). 03:43:12 augur: if you work in semantics, you might have to explain this depressingly often :| but better to introduce non-linguists, I suppose. 03:43:25 the other can mean "express concurrence on a matter" 03:43:26 as in 03:43:34 "Alexis didn't agree to the terms of the deal" 03:43:43 * uorygl nods. 03:43:53 "even if inside Alexis really liked the deal, her bosses insisted that she not agree to it" 03:44:46 Gracenotes: SO often. you have no idea how many times, when working on an extraordinarily complicated issue like what the meaning of "Fido is a dog" is, someone will say "Fido is a dog just means Fido is a dog! It's simple!"
03:44:50 well no, its not simple, sorry 03:44:55 you're just DESIGNED TO UNDERSTAND IT 03:45:01 it is a black art 03:45:08 if it were truly simple, my calculator could understand what it means 03:45:10 you must have secret semanticist rituals 03:45:14 thats the true test 03:45:24 simplicity is demonstrable: program it. 03:45:36 thats how we know that calculus, for instance, is mindnumbingly simple, when it comes down to it 03:46:01 all the calculus youve ever done in your life to date could be done in a computer program in a few minutes or hours 03:46:05 ALL of it 03:46:16 and a computer isnt anywhere near as smart as a person 03:46:31 well if calculus is so easy that a computer can do it, it must be easy 03:46:36 regardless of how difficult humans find it 03:46:37 Is it okay if I interpret "program" as meaning "express in a computer language", where a "computer language" is any language that expresses details that a computer deals with intimately? 03:46:48 sure! 03:46:53 infact 03:46:55 Great. 03:46:58 so, has anyone bothered making a language --(parse)--> syntax --(interpret)--> semantics tool, at some point? 03:47:05 lets say that "program" means "express in some form of logic" 03:47:20 since programming languages are just logics. 03:47:31 I'd imagine there must be. But fragments of it, which I'm sure is not perfect, otherwise it's what we'd all be using 03:47:34 Well, that's kind of ambiguous, isn't it? Straightforward English is arguably a form of logic. 03:47:34 yeah, they have 03:47:37 but they tend to suck. 03:47:43 well no 03:47:46 english is far from it 03:47:51 Euclid wrote in straightforward English, for strange values of "English". 03:48:03 Euclid wrote in straightforward Greek ;) 03:48:17 what i should have said is 03:48:20 "in some Logic" 03:48:28 where a Logic is a well defined formal system 03:48:34 "If English was good enough for Jesus, it's good enough for America!" 
--a straw man 03:48:38 like first order predicate calculus or whatever 03:48:49 Okay, that works. 03:48:50 but sure, programming language 03:48:52 i dont care 03:49:16 its just easier to expression certain things in a language like PC2 or HPC or whatever 03:49:29 ~ SOL/HOL 03:49:37 to express** 03:49:43 but thats what i was doing earlier! 03:49:46 and you complained! 03:49:47 :P 03:49:58 Hmm. :-) 03:50:04 if you try to give a good, well defined meaning for "John" 03:50:13 and your well defined meaning is a referential 03:50:33 (that is, a symbol referring to, or behaving as if it were, the person) 03:50:46 (e.g. your logic says the meaning of "John" is J) 03:51:03 then you will, inevitably, conclude that there must be an infinite number of distinct words "John" 03:51:33 "John_1" refers to John Lennon, "John_2" refers to JFK, "John_3" refers to the pizza delivery boy, ... 03:51:40 for all of the Johns that you know 03:51:50 Why would those have to be distinct words? 03:52:05 It would make a lot of sense, I think, for them to be distinct senses of a single word. 03:52:07 well explain how you'd distinguish which is meant! 03:52:21 tell me how a single word "John" could be used to refer to INDIVIDUAL people 03:52:27 when there are multiple Johns 03:52:32 I dunno, the same way that we distinguish between different senses of a word like "set"? 03:52:52 ah but there are probably multiple words "set"! 03:53:01 just because they sound the same doesnt mean theyre the same word, keep in mind 03:53:02 Why would we call those multiple words? 03:53:06 I agree. 03:53:16 "set" like a movie set 03:53:24 "set" like what you do to a variable 03:53:30 probably different words 03:53:34 But I like to define words by the conjunction of spelling, pronunciation, and etymology.
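One way to read the John_1/John_2 problem above in code: instead of infinitely many referential tokens, treat the single word "John" as a predicate over a context and require uniqueness. This is a rough Python sketch under invented names; the context structure, the `resolve` helper, and the individuals are all hypothetical, not from the log.

```python
# Hypothetical context: salient individuals paired with their names.
context = [("lennon", "John"), ("mary", "Mary")]

def resolve(name, ctx):
    """One word "John": a predicate over the context, not a fixed token.
    Succeeds only when exactly one salient individual bears the name."""
    candidates = [ind for ind, n in ctx if n == name]
    if len(candidates) != 1:
        raise ValueError("cannot resolve %r: %d candidates" % (name, len(candidates)))
    return candidates[0]

print(resolve("John", context))  # lennon

# With two salient Johns, the same single word no longer picks anyone out:
ambiguous = context + [("jfk", "John")]
# resolve("John", ambiguous) would raise ValueError
```

The deictic work is done by the context, so one word suffices; the alternative, one word per John, is what the indexed "John_1", "John_2", ... tokens amount to.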
03:53:35 now maybe you have a smaller kind of difference 03:53:38 "mostly the same but not quite" 03:53:55 ok, so now "John" is a single word with multiple subtly different senses 03:53:56 ok fine 03:54:03 If they have the same spelling, they're pronounced the same way, and their reason for being a word is the same, they're the same word. 03:54:09 now tell me what the representation in your logic is. 03:54:19 well yes, words are like 03:54:27 words are pairs, at least, of form and sound 03:54:29 er 03:54:30 form and meaning 03:54:34 what saussure called a Sign 03:54:50 theres probably a bit more to it than that 03:55:11 Hmm, what you're doing is a lot like asking me for the simplest explanation of a phenomenon. 03:55:17 i AM, yes 03:55:22 i mean 03:55:22 look 03:55:37 you seemed to be up to the challenge of giving me pseudocode for this 03:55:39 so 03:55:48 if "John" is one word with multiple senses 03:55:52 give me pseudocode. 03:56:15 give me pseudocode for "John is dancing" 03:56:29 and ignore all the extraneous shit like what the "is" and the "ing" and the tense and shit are doing 03:56:45 i dont care about that (unless its relevant to your explanation of the pseudocode for "John") 03:57:04 just give me, right now, the pseudocode. 03:57:05 do it. 03:57:13 * uorygl ponders. 03:57:42 A person is dancing. That person's name is John. Nothing else I could be referring to is named John. Therefore, I say "John is dancing". 03:59:43 Did I explain the phenomenon? 04:00:51 nope! 04:01:01 not in the way you'd accept it 04:01:06 because you turned "John" into a predicate! 04:01:11 "That person's name is John" 04:01:23 Did I attempt to explain the right phenomenon? 04:01:23 but thats what _I_ said 04:01:30 sure 04:01:44 whatever you want 04:01:53 just give me the meaning in your pseudocode 04:02:11 -!- Pthing has joined. 04:02:21 ph man, pthing 04:02:29 were talking linguistics 04:02:29 * uorygl tries again. 
04:02:30 hardcore like 04:02:36 you might not want to stay D: 04:02:44 shut up augur 04:02:47 :| 04:02:55 y'mutha 04:03:13 is the perfect relish to any and all conversations in which augur plays a vital part 04:03:27 your mother is the perfect relish? 04:03:34 no 04:03:38 shut up augur 04:03:39 is 04:03:40 shut up augur 04:04:36 A person is dancing. "John" evokes that person. "John" does not evoke anything else. Q.E.D. 04:04:40 Er. 04:04:57 A person is dancing. "John" evokes that person. "John" does not evoke anything else. I want to use a word that evokes only that person. Q.E.D. 04:05:12 this is your pseudocode? 04:05:23 Yes? 04:05:36 but it does not do what you said! 04:05:48 or, rather, it does not do what you wanted it to do 04:06:19 Given the statement I just made, one can conclude that I will use the word "John" to refer to the dancing person. 04:06:29 in that, the word "John" here surely maps to "'John' evokes that person. 'John' does not evoke anything else." 04:06:43 and minimally also, 04:06:51 "a person" 04:07:07 that is to say, the word john maybe denotes a particular person 04:07:50 but the majority of the words contribution, infact, ALL of the contribution of the words distinguishing feature (being written 'John' instead of 'Mary') is NOT the particular person 04:07:51 but rather 04:08:13 '"John" evokes that person. "John" does not evoke anything else. I want to use a word that evokes only that person.' 04:08:28 you see what i mean? 04:08:34 Hmm. 04:08:50 hell, even 'a person' doesnt do what you want unless we interpret it to be some atomic element that means that person 04:08:56 the whole thing is nothing but a giant quantifier! 04:09:25 "A person that is called John who is the only person in this context that could intend... is dancing" 04:09:33 well, thats pretty much what i said earlier. :) 04:09:36 How about we ask a different question, like... 04:10:28 How could an intelligent computer represent the sentence "John is dancing"? 
04:10:36 sure 04:10:38 whatever you want 04:10:41 i dont care how you do it 04:10:44 just give me pseudocode. 04:11:00 Awesome. 04:12:27 IsDancing(JohnThePizzaDeliveryGuy) 04:12:35 (Look, suggestively-named Lisp tokens!) 04:12:47 and thats the literal meaning of "John"? 04:13:36 Well, presumably, this computer knows some things that are related to each other, and has chosen this label for the relation JohnThePizzaDeliveryGuy. 04:13:45 sure sure ok 04:14:04 so then this computer will always-and-forever interpret "John" as JohnThePizzaDeliveryGuy 04:14:30 Well, there's further reason that the computer interprets "John" as this token. 04:14:41 ah, well then, you havent given me the literal meaning! 04:14:47 I haven't? 04:14:59 youve given me a meaning that has some interpretation of meaning already in it 04:15:20 I feel like this has turned into a discussion of rationality, which makes me want to take it to #lesswrong, our rationality channel. 04:15:41 no, if we did that, thered be no convo here 04:15:58 since it was quiet until gracenotes and i moved the convo here from proggit 04:16:08 although it was here originally 04:16:19 well sure but we werent talking about esolangs then :D 04:16:29 we were talking about proggitty things 04:16:36 i mean, maybe #lesswrong 04:16:52 if you have a name for your PL, you get a *free* freenode channel for it! 04:16:55 fancy that 04:17:12 indeed! 04:17:16 hence #antigravity! :D 04:17:28 you don't say 04:18:10 i do 04:18:16 but i dont go there anymore since i was bored of it 04:25:57 quite so 04:26:39 now, I'm going to get one of the delicious melted mint chocolate kiss sugar cookies my mom made 04:47:33 -!- soupdragon has joined. 05:06:18 -!- adu has joined. 05:19:11 -!- ehirdiphone has joined. 05:19:26 http://journal.stuffwithstuff.com/2009/12/26/conditional-binding-with-let-in-magpie/ Interesting control structure.
05:20:25 erm 05:20:30 i should write a similar tutorial in ruby 05:20:47 what's interesting about it 05:20:53 if value = "1234".to_i then ... else puts "Couldn't parse string." end 05:21:08 SO AMAZING 05:22:28 hate to be a detractor but afaict it's like a bad version of do notation? 05:22:45 everything is a bad version of do notation 05:23:37 oh, well, actually ruby wont do it correctly because ruby defaults to 0 if the string starts with a letter, but... 05:29:16 ehirdiphone: did you hear about my awesome idea for an esolang? 05:32:35 augur: Your snippet was nothing like the actual construct. 05:32:45 ey? 05:33:14 ehird am I missing something or were you being sarcastic? 05:33:42 soupdragon: Just because a concept is reducible to another concept does not mean it is inferior. 05:33:55 what 05:33:59 ehirdiphone: what are you talking about 05:34:02 I agree and I didn't mean to suggest that 05:34:03 He acks the prior art at the end of the post. 05:34:08 humblest apologies bro 05:34:13 augur: His do notation remark 05:34:25 soupdragon: Apologies for what? 05:34:25 oh 05:34:27 wait, what? 05:34:36 for implying that 05:34:55 I don't know what you're talking about ehird. 05:51:09 anyway 05:51:17 did you see my ideas for an esolang, ehird? 05:51:54 I still don't get what's interesting about the language construct... 05:52:17 it's basically the same as try/catch 05:52:28 i agree, its nothing amazing 05:52:35 but its useful i suppose? 05:52:48 not more useful or less useful than existing constructs, say.. 05:52:49 but 05:58:24 -!- ehirdiphone has quit ("Get Colloquy for iPhone! http://mobile.colloquy.info"). 06:06:42 -!- augur has quit (Read error: 104 (Connection reset by peer)). 06:06:53 -!- augur has joined. 06:18:27 -!- bsmntbombdood_ has quit ("Leaving"). 06:19:47 That conditional binding thing sounds exactly like Haskell's 'maybe', which is a defined function, not a language feature. 06:20:42 Oh, the author said so.
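For comparison with the Magpie construct and Haskell's `maybe`, here is a minimal conditional-binding sketch in Python. The `to_int` helper is a stand-in written for this example; note that, unlike Ruby's String#to_i, it signals failure with None rather than defaulting to 0, which is the behavior complained about above.

```python
def to_int(s):
    """Parse-or-None, standing in for a failable binding.
    (Unlike Ruby's String#to_i, it does not default to 0.)"""
    try:
        return int(s)
    except ValueError:
        return None

# "let value = parse(s) then ... else ..." as a plain conditional binding:
def describe(s):
    value = to_int(s)
    if value is not None:
        return "parsed %d" % value
    return "Couldn't parse string."

print(describe("1234"))  # parsed 1234
print(describe("abc"))   # Couldn't parse string.
```

This is essentially `maybe default f` applied to an optional value, written out as an if/else, which is why the construct reads as a defined function rather than a language feature.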
06:21:09 uorygl, yeah doesn't seem to be anything interesting about it not sure what ehird was meaning 06:28:08 i... 06:28:12 uorygle 06:28:45 I'm not sleepy any more? >.>] 06:28:48 s/]// 06:28:51 uorygl: i think i just made a translation routing from the semantics of functional language to prolog to my esolang 06:28:57 routine** 06:29:02 unintentionally 06:29:19 Huh, neat. 06:34:01 first [1] -> first 1:[] -> first (List 1 []) -> first(s,l,w) && list(l,1,[]) -> first(s) && P0(s,l) && P1(s,w) && list(l) && P0(l,1) && P1(l,[]) 06:34:15 with closures: 06:34:22 -!- calamari has quit ("Leaving"). 06:35:26 Ww[Es[first(s) && P1(s,w) && El:list(l)[P0(s,l) && P0(l,1) && P1(l,[])]]] 06:36:14 and you'd define, in your code, Ww[Es[first(s) && P1(s,w) && El:list(l)[P0(s,1) && P0(l,x)]] = x 06:36:38 and thats how you define functions :X 06:37:39 er 06:37:41 that should really be 06:37:58 Ww[Es:FIRST[P1(s,w) && El:LIST[P0(s,1) && P0(l,x)]] = x 06:41:30 the translation should actually be Ww[Es:FIRST[El:LIST[P0(s,l) && P1(l,1) && P0(l,[])] && P1(s,w)]] sorry 06:42:47 WHATCHU THINK SON 07:19:24 -!- coppro has quit (Read error: 54 (Connection reset by peer)). 07:37:30 augur what's W? 07:37:48 I guess E is exists, W is forall? 07:37:54 an operator that essentially says "for which w is it true that ..." 07:38:02 hmm 07:38:07 so like 07:38:15 Wn[1 + 1 = n] => 2 07:38:38 ah I get it, but what about Wn[n*n = 4], multivalued? 07:47:27 you know curry has a backend that compiles to prolog? 07:48:12 -!- MizardX has joined. 07:48:35 -!- coppro has joined. 07:49:27 -!- Gracenotes has quit (Read error: 104 (Connection reset by peer)). 07:51:05 -!- Gracenotes has joined. 07:59:59 -!- clog has quit (ended). 08:00:00 -!- clog has joined. 08:05:37 wonder what structures you can decide W for..?
08:06:04 probably only really trivial stuff like finite sets 08:13:14 soupdragon: yes, Wn[n*n = 4] would be multiply valued 08:13:25 infact, it would be like asking that question in prolog 08:13:31 ?- X*X = 4. 08:13:31 I see 08:13:52 X = 2. 08:13:55 X = -2. 08:24:16 W with + and * is TC isn't it? you can define 0 = Wn[n+n=n] but I can't see how to define 1 without (.. & ~n = 0) or using A 08:24:30 (on natural numbers) 08:33:20 that's so irritating that you can't define 1... 08:35:25 0 = Wn[n+n=n], 1 = W[n+n=n*n], ~0=1 08:36:36 Prime = Wp[En[Em[nm=p]]] 08:36:49 uh 08:36:54 you wouldnt define shit like that 08:36:56 though that includes 1 and 0 08:37:07 you would just start by defining succ and shit 08:38:02 Wn[succ {} == n] = {{}} 08:38:56 Am[Wn[succ m == n] = {m}] 08:39:41 do A and W commute? 08:40:12 no 08:40:24 actually, this is not a proper quantifier 08:40:29 i shouldnt even say it like that 08:40:34 its a quantifier over rules 08:40:39 so i should just use variables 08:40:52 Wn[succ M == n] == {M} 08:40:56 = {M} ** 08:40:57 this is stronger than TC I think 08:42:29 you can immediately solve all diophantine equations 08:42:35 no, its not 08:42:40 oh? 08:42:41 its just prolog 08:42:49 I mean in theory 08:42:50 well 08:42:53 ok let me clarify 08:42:55 it's like definability 08:43:14 its just quantified predicate calculus 08:43:33 using what is essentially a prolog base interpreter 08:43:44 as the algo to interpret it 08:44:02 logics aren't "turing complete", algorithms that calculate logics are 08:44:35 the existential and universal quantifiers dont quantify over actual things, they're actually sort of more like assertions 08:45:18 when you say Ex[F(x)] in this language, you're essentially specifying an axiom in a logic 08:45:30 the axiom being 08:45:35 :- Ex[F(x)].
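If W is read as "for which n is it true that ...", a bounded stand-in is easy to write by enumerating a finite domain, which also illustrates why the operator would only be decidable for trivial structures like finite sets. The `W` function and the domains below are illustrative assumptions in Python, not the esolang's actual evaluator.

```python
def W(pred, domain):
    """Wn[pred(n)]: every n in a finite domain for which pred holds.
    A bounded stand-in for the query operator above; multivalued by design."""
    return [n for n in domain if pred(n)]

nats = range(0, 100)                    # illustrative finite slice of N
print(W(lambda n: 1 + 1 == n, nats))    # [2]
print(W(lambda n: n * n == 4, nats))    # [2]

ints = range(-100, 100)
print(W(lambda n: n * n == 4, ints))    # [-2, 2]
```

Over the naturals Wn[n*n = 4] has one value; over the integers it is multivalued, matching the X = 2, X = -2 style of answer a Prolog query enumerates.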
08:45:51 where :- is the logic turnstile operator 08:45:56 oh okay 08:46:04 which says, "anything can prove that Ex[F(x)]" 08:46:42 which means you can use it whenever you want and its guaranteed to be true. 08:46:43 I was thinking about it as a (paradoxically strong) language for computing sets 08:47:13 so really, in this "logic", using Ex[F(x)] in a truth conditional statement is always acceptable 08:47:37 you know twelf? 08:47:38 for instance, the expression [1] is equivalent to El[list(l) && p0(l,1) && p1(l,[])] 08:47:43 twelf, no. 08:47:47 damn 08:47:59 I was going to ask, is it like twelf? :p 08:48:02 no 08:48:24 this is me attempting to design a proglang that has the compositional semantics system of natural language. 08:49:15 cool 08:49:16 except in natural language, Ex[F(x)] is a normal truth-functional quantifier, not an assertion. 08:49:32 I'm not totally sure what that means but I think my book gets into this later on 08:49:32 well, maybe. :) 08:49:50 what it means is that ive got some crazy shit in this language im designing :D 08:50:36 its got a transformational grammar 08:51:01 -!- Asztal has joined. 08:51:12 so that, to the extent that you can define generic functions, they will, almost inevitably, result in a rearrangement of the code 08:52:49 so that if you type lets say "1 == this 1", where "this 1" is the function \p.p(1) 08:53:02 == doesnt take functions as its right hand argument 08:53:21 but the right hand argument does take Int -> Bool's 08:53:40 so you shift "this 1" to the front and turn the rest into a big lambda: 08:53:59 this 1 (\x.1 == x) 08:54:19 and infact both of these are valid in the language 08:54:43 infact, even "this 1 (1 ==) is valid in the language, in a kind of haskellish fashion 08:54:56 but the means of generating it are more interesting :P 08:55:58 I don't really see what's going here: What is 'this' and is 1 just an arbitrary symbol 08:56:10 well 08:56:14 1 is the value 1 08:56:15 this isn't anaphora resolution is it? 
08:56:22 this is the function \x.\p.p(x) 09:07:51 -!- poiuy_qwert has quit ("This computer has gone to sleep"). 09:12:09 -!- neuDialect has joined. 09:15:24 -!- neuDialect has left (?). 09:25:26 uorygl 09:25:30 ! 09:25:37 check it out 09:25:44 MoveHaskell 09:25:49 MoveHaskell? 09:26:07 a derivative of haskell with one very simple extra piece of functionality 09:26:14 in an expression 09:26:24 f (g (... h)) 09:26:49 which is equivalent to (f . g . ...) h 09:27:02 and where (f . g . ...) :: a -> b 09:27:09 and h is of type (a -> b) -> c 09:27:29 Are you sure you didn't mix those types up? 09:27:31 you can turn the expression into h (f . g . ...) 09:27:33 yes, im sure :) 09:27:45 the idea is 09:27:49 in a type mismatch situation 09:27:51 where the argument 09:28:03 looks like it should be the function for some higher chain of args 09:28:18 where the chain looks like it should be the argument of the actual argument 09:28:23 you can reverse them 09:28:25 in the simplest case 09:28:46 Huh. That looks difficult to implement and not necessarily useful. 09:28:48 f g where f :: a -> b, g :: (a -> b) -> c 09:28:54 this is a type mismatch 09:28:58 but it wouldnt be if it were g f 09:29:00 so make it g f 09:29:27 thus you can do shit like, say 09:30:06 "a" ++ "b" ++ "c" == which ["foo", "abc", "bar"] 09:30:10 => 1 09:30:51 actually the simplest case is all you need to solve since the rest just pops out of it by iterated Move 09:31:16 which, i guess, is something like indexOf or whatever 09:32:07 or it functions like it, but is a wh word for ease of understanding 09:34:22 -!- jpc has quit (Read error: 60 (Operation timed out)). 09:41:57 hm 09:42:13 my example was bad, because ++ is left associative and so ruins things 09:42:15 but 09:43:11 which ["foo", "abc", "bar"] == "abc" is just the same 09:44:36 -!- ais523 has joined.
09:45:22 i wanted to give an example where the shit before "which" couldnt be used wrapped in ()'s and have it work 09:45:31 heh 09:45:54 hi; which lang is this? 09:45:57 which ["foo"] == "f" ++ "b" ++ "c" i think exemplefies this 09:46:01 ais523: MoveHaskell! 09:46:17 eso? 09:46:22 slight mod to haskell. 09:46:27 incredibly minor, infact. 09:46:42 type mismatches of a certain sort are resolvable 09:47:03 ah 09:47:14 so in this example 09:47:30 which ["foo"] :: String -> Int 09:47:50 but it cant find that as its next arg 09:48:19 so the interpreter replaces which ["foo"] with a temporary unbound variable x :: String 09:48:37 giving x == ... 09:49:32 actually sorry, which ["foo"] should really be more like (String -> Bool) -> Int 09:49:54 if it turns out that (x == ...) :: (String -> Bool) -> Int 09:50:16 the interp lambda binds x and feeds the result in as the argument to which ["foo"] 09:51:54 ais523: merp? 09:52:42 (string -> bool) -> int is a weird typesig 09:53:09 well 09:53:12 the type sig is really 09:53:19 weird? 09:53:29 -!- FireFly has joined. 09:53:29 which :: forall a. [a] -> (a -> Bool) -> Int 09:53:42 its essentially just firstIndexOf 09:53:55 except with a test function instead of a value 09:54:26 Data.List.findIndex :: (a -> Bool) -> [a] -> Maybe Int 09:54:42 you know what 09:54:44 fuck you :| 09:55:08 ok so which = findIndex 09:55:24 findIndex ["foo", "bar", "baz"] == "foo" 09:55:34 but thats ugly, hence the rename to which :P 09:58:01 its cool tho, because you can do craaazy shit 09:58:07 i think 09:58:08 lol 09:58:21 basically, anything where you'd normally do 09:58:38 f \x -> ... 09:58:48 you can turn that into ...f... 09:59:06 assuming theres only one occurance of x in ... 09:59:12 dunno about multiple occurances 09:59:20 maybe thered be another rule for that :p 09:59:28 or a more general rule 09:59:48 makes me think of shift/reset 10:00:03 f \x -> ...x...x...x... -> ...f... ... ... 
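The `which` the conversation settles on is `Data.List.findIndex` with its arguments flipped; a minimal sketch (note real `findIndex` returns `Maybe Int`, not the bare `Int` in the log's type signature):

```haskell
import Data.List (findIndex)

which :: [a] -> (a -> Bool) -> Maybe Int
which = flip findIndex

-- the running example: "abc" sits at index 1
ix :: Maybe Int
ix = which ["foo", "abc", "bar"] (== "abc")
```

This matches the `=> 1` result quoted earlier, wrapped in `Just`.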
10:00:06 or something 10:00:11 I guess that's hinting toward some of cc shans notes on linguisticd 10:01:02 so you could end up doing: which xs f && g && h 10:01:26 like say 10:01:40 which [1,2,3,4] > 1 && < 5 && odd 10:02:51 because there are missing vars in... 10:03:06 which _ [1,2,3,4] _ > 1 && _ < 5 && odd _ 10:03:30 so it goes through building up \x -> x > 1 && x < 5 && odd x 10:04:59 it also builds up \f -> which f [1,2,3,4] 10:05:53 actually, im not sure how it'd do that, so lets pretend its which [1,2,3,4] f not which f [1,2,3,4] 10:06:19 ive never understood why they put the xs last in all these functions 10:06:32 map f xs, foldr f z xs 10:06:34 so annoying 10:07:10 i guess maybe you have more curried (map f) and (foldr f z) than you do (map xs) and (foldr xs z) 10:23:30 -!- MigoMipo has joined. 10:23:33 join #jbopre 10:23:41 Dammit! 10:23:42 MigoMipo: missing slash 10:23:58 morning 10:24:06 Good morning! 10:24:38 ais523, see /msg 10:26:33 I wonder how suspend to ram/disk interacts with cron jobs 10:26:52 I mean, if some cron job is running right then 10:28:17 also why is my dns so slow suddenly 10:28:30 slow = takes several seconds to resolve anything 10:29:28 -!- adam_d_ has joined. 10:29:42 AnMaster: it gets suspended and restarted just like any other process 10:29:47 hm okay 10:30:03 ais523, what about missed cron jobs? 10:30:08 if you see what I mean 10:30:15 they never get run do they 10:30:27 depends on which cron implementation you have 10:30:36 ais523, vixie-cron here 10:30:46 I have anacron here, which runs them when it gets round to them 10:30:54 as in, late rather than never 10:31:00 but it's unusual in that respect 10:31:00 ais523, anacron needs normal cron to run it iirc 10:31:10 and can only handle once per day stuff 10:31:48 oh another cron question: how does cron handle with leap seconds. 10:32:51 it would run on :59, surely, if that's what the seconds field said? 10:32:57 and :00 if /that's/ what the second's field said? 
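The lambda said to be built up from `which [1,2,3,4] > 1 && < 5 && odd` can be written directly; a sketch of the desugared result using real `findIndex`:

```haskell
import Data.List (findIndex)

-- the predicate assembled from the three gapped tests
test :: Int -> Bool
test x = x > 1 && x < 5 && odd x

built :: Maybe Int
built = findIndex test [1, 2, 3, 4]   -- 3 is the first match, at index 2
```

Element 1 fails `> 1`, element 2 fails `odd`, so the answer is `Just 2`.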
10:33:10 so, no differently from normal 10:34:24 hm does cron do local time or utc? 10:34:54 not sure offhand 10:34:55 if local time, how does it handle switch to/from daylight saving 10:35:03 * AnMaster forgot which one leaps back an hour 10:35:14 heh, all you need is a VM and an insane ntp server 10:35:17 and you can find out by experiment 10:36:45 I'm not sure where to find an insane ntp server ;P 10:37:04 actually all I need is a VM without time syncing. 10:37:10 so I can set it manually in there 10:37:39 but I'm not THAT interested 10:38:43 is it wrong that I always try to think up the most insane non-destructive solution to stuff, when I'm in here? 10:38:44 -!- Pthing has quit (Remote closed the connection). 10:38:46 ais523, btw about that language on the wiki we talked about a few days ago, using two different interpreters. do you remember it's name? 10:38:51 I have an idea for it you see 10:39:02 Dupdog? 10:39:05 ah yes thanks 10:39:06 it isn't really two different interps 10:39:11 it's just alternate commands are interpreted with different meanings 10:39:28 as in, semantics of a command depend on whether an even or an odd number of commands were executed before it 10:39:36 my idea is not to prove it tc by implementing something in it, my idea is to prove it not tc by implementing it in a sub-tc language 10:39:40 wouldn't that be worth a try? 10:40:04 yep 10:40:11 it's one of the standard methods to prove something sub-tc 10:40:16 now, what would be a good language for it 10:40:16 hm 10:40:46 if you want vapourware unreleased languages that I've never told the channel about, you could try ACK 10:41:08 it's a vague attempt to make a language that's powerful enough to do pretty much anything sub-TC you'd want to do, but isn't TC 10:41:08 befunge93 won't do, since I would need to store the source somewhere 10:41:19 it clearly has infinite storage, is the issue 10:41:30 and sub-TC things with infinite storage are kind-of rare 10:41:41 is ACK such a language? 
10:41:47 you could try to implement it in Splinter, I suppose 10:42:05 AnMaster: not exactly, basically you have to calculate how much storage you're going to use in advance 10:42:05 * AnMaster reads the splinter page 10:42:12 and it's hard to see how to do that for dupdog 10:42:42 (splinter's a push-down automaton) 10:43:06 hm "push-down automaton". what exactly can it do that a FSM can't? 10:43:09 PDAs are bad at duplicating unboundedly large values, though 10:43:16 AnMaster: matching brackets is the most famous example 10:43:21 ah 10:43:26 a PDA can match to any depth of nesting, an FSM has to have some limit 10:47:39 but PDA can't match a^nb^nc^n? 10:49:08 this is correct. 10:49:30 soupdragon, as a literal string? regex? 10:49:40 * AnMaster can't find any meaning in that line 10:49:43 {,abc,aabbcc,aaabbbccc,...} 10:49:48 interestingly, there are a small class of languages which can match a^n b^n c^n trivially, but cant match certain context free languages 10:50:06 soupdragon, ah, how would you write that as PCRE? 10:50:08 there is** 10:50:35 AnMaster might take me a few hours to figure that out :p 10:50:57 soupdragon, I strongly suspects PCRE might be TC 10:51:21 google books is such a tease 10:51:34 it has recursion and what not after all 10:56:06 ais523, can you explain how that PDA proof works? 10:56:25 it seems to be: N{N{N{}}I}Z{B{B{A{}}A{ZBAI}}A{O{ZBAO}I{ZBAI}N}}ZA\ 10:56:30 (followed by a newline) 10:56:51 you could try stepping through it in an interpreter 10:57:41 the basic idea is: you can compile a program that uses n splinters into a program that uses only a limited number by having a large object holding all of them, and a method to step through it 10:57:53 and then store things on the call stack Underload-style 10:58:19 ah 10:59:30 -!- puzzlet has quit (Read error: 60 (Operation timed out)). 10:59:37 -!- puzzlet has joined. 
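The language {aⁿbⁿcⁿ} discussed here is the classic example of something no PDA can recognise; with unrestricted memory the membership check is trivial, as this sketch shows:

```haskell
-- membership test for {a^n b^n c^n}
abc :: String -> Bool
abc s = all (== 'c') cs
     && length as == length bs
     && length bs == length cs
  where
    (as, r)  = span (== 'a') s   -- leading run of a's
    (bs, cs) = span (== 'b') r   -- then b's; the rest must be all c's
```

So `abc "aabbcc"` is `True` while `abc "aabbc"` is `False`; the empty string is also in the language.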
10:59:55 pda-splinter.pl: Lisp/Scheme program text 11:00:01 file gone crazy heh 11:03:48 ais523, so what about implementing dupdog in that PDA language and then using the PDA->splinter compiler? would that be easier or harder than splinter directly do you think? 11:03:58 no idea 11:10:52 nah I can't figure it out 11:13:59 soupdragon, can't figure what out? 11:14:05 the regex 11:14:13 soupdragon, ah 11:14:18 well I suspect it is possible 11:14:50 ais523, meh, I can't think of a way to do it it either the PDA language or splinter directly 11:15:02 I suspect dupdog can't be done by a PDA 11:15:05 basically I guess the issue is understanding splinter 11:15:08 although, ofc, that doesn't mean it's TC 11:15:16 ais523, what is there between a PDA and TC? 11:16:16 LBA, but that only really works if you have a concept of input 11:16:21 and a whole load of classes that don't have their own names 11:16:25 because they don't come up very often 11:16:33 (there are infinitely many classes, after all) 11:17:28 http://www.stanford.edu/~laurik/fsmbook/examples/Einstein%27sPuzzle.html 11:17:31 clever 11:17:41 ais523, but dupdog doesn't have stdin? 11:18:02 ais523, also which one is LBA? 11:18:03 AnMaster: exactly, which is why LBA doesn't even make any sense wrt dupdog 11:18:07 linear-bounded automaton 11:18:11 ah 11:18:19 no hits on the wiki 11:19:00 oh why did searching for LBA not find http://esolangs.org/wiki/Linear_bounded_automaton 11:19:01 ais523, ^ 11:19:25 no redirect, you can add one if you like 11:19:35 mediawiki default search tends to rely a lot on human help 11:19:41 hm 11:40:25 -!- rodgort has quit (Client Quit). 11:40:35 -!- rodgort has joined. 11:55:31 -!- |MigoMipo| has joined. 12:10:40 i just had an interesting idea 12:11:36 protons are functions from electrons to 1 12:12:53 e.g. p = \e -> 1 :: Electron -> Int 12:13:19 concatenation of protons as in p0^p1 is sort of like fold: 12:13:28 forked fold? 
who knows 12:14:52 p0^p1 = \x -> \y -> p1 x + p0 y 12:14:55 or something like that 12:15:45 hm.. 12:15:54 well maybe it should track charge not charge-balances.. 12:17:15 hm hm 12:17:23 obviously im sleep deprived :D 12:17:43 -!- MigoMipo has quit (Read error: 110 (Connection timed out)). 12:20:58 -!- soupdragon has quit ("Leaving"). 12:26:36 -!- soupdragon has joined. 12:37:14 hm 12:37:21 hm! 12:38:10 augur, BNF? 12:38:15 ?? 12:38:19 e.g. p = \e -> 1 :: Electron -> Int 12:38:23 no 12:38:25 bad haskell 12:38:27 ah 12:38:42 well it would have been heavily modified bnf 12:40:20 :P 12:40:24 so not bnf 12:40:37 and I don't know haskell 12:40:43 been planning to learn it 12:40:45 never had time 12:40:56 and never found a nice tutorial that extended into the depths too 12:41:03 heres haskell in a micosecond: 12:41:11 math f(x) is haskell f x 12:41:13 welcome to haskell 12:41:15 :P 12:41:21 augur, I'm pretty sure there is more than that 12:41:26 the monads and what not 12:41:34 augur, also, what about f(x,y) 12:41:39 shh! you'll wake the ehird! 12:41:45 f x y 12:41:54 hm 12:42:04 functions are all monadic in haskell 12:42:06 augur, then f(g(x),y) 12:42:06 hardcore LC 12:42:09 and f(g,x,y) 12:42:12 how do they differ 12:42:16 f (g x) y 12:42:19 f g x y 12:42:21 functions are all monadic in haskell ??? 12:42:21 okay that's lisp now 12:42:29 ohh like J monadic 12:42:31 yeah I got it 12:42:32 nah, its just bracketted when need-be 12:42:32 well not fully 12:42:44 no not all functions are monadic in haskell 12:42:50 where did you get that idea 12:42:54 augur, is addition + x y 12:42:55 too? 12:42:58 no 12:43:00 meh 12:43:01 sugared to x + y 12:43:09 but you can do it that way if you want: (+) x y 12:43:14 -!- |MigoMipo| has quit ("co'o rodo"). 12:43:15 "huh" 12:43:22 -!- MigoMipo has joined. 
12:43:24 also what about all those type things and such 12:43:36 see, haskell has sugar for operators right 12:43:41 well okay 12:43:42 but operators are still just functions 12:43:46 so if you want to get the function 12:43:51 stick it in parens to "call" it on nothing 12:43:55 that is, return just the function 12:43:58 augur, but (+ x y) wouldn't work? 12:44:01 well now its the function, not the operator 12:44:02 no. 12:44:05 right 12:44:11 (+) x y 12:44:18 ((+) x y) ? 12:44:20 sure. 12:44:36 tho that might also be interpretable as a tuple with one elemtn 12:44:45 im not sure how haskell disambiguates these things 12:44:57 augur, I suspect that by mixing the prefix and infix notations it is possible to create something that makes IOCCC seem quite sane and easy to read 12:45:00 just a hunch 12:45:21 you have no clue 12:45:24 it gets far far worse 12:45:28 ouch 12:45:56 augur, anyway, what is all the stuff about types and such. 12:45:57 mostly because haskell can be highly pointsfree 12:46:03 types 12:46:03 ok 12:46:05 pointsfree? 12:46:08 yes 12:46:11 tacit 12:46:17 * AnMaster googles 12:46:20 like J again 12:46:27 f . g = \x.f (g x) 12:46:27 soupdragon, which I don't know 12:46:37 oh that's too bad J is fantastic 12:46:43 I know erlang, lisp, C, bash, and a few other languages 12:46:43 AnMaster: im really surprised 12:46:46 youve been here longer than me 12:46:49 (like python) 12:46:50 and i know of J 12:46:51 cmon 12:46:54 dont be a slacker 12:46:56 anyway 12:46:59 so with haskell 12:47:01 lets say 12:47:04 f x = x + 1 12:47:10 right 12:47:12 then f has the type Int -> Int 12:47:13 that seems easy enough 12:47:18 augur, makes sense 12:47:19 well 12:47:21 wait no 12:47:25 augur, why can't x be a float 12:47:34 or any other numeric type 12:47:40 well, 1 has the type Int 12:47:43 1.0 has the type Float 12:47:44 i think 12:47:45 so.. 
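The operator/function duality in this exchange is concrete in real Haskell; a small sketch (one correction hedged in a comment: literals like `1` are actually polymorphic, not fixed to `Int`):

```haskell
-- operators are just functions; parentheses give the prefix form
a, b :: Int
a = 1 + 2        -- infix sugar
b = (+) 1 2      -- same function, used prefix

-- a section: partial application of an operator
inc :: Int -> Int
inc = (+ 1)

-- note: a literal like 1 has type Num a => a, which is why
-- x + 1 works at Int, Float, and so on
```

`a`, `b` are both 3, and `inc 41` is 42.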
12:47:48 hm 12:47:49 okay 12:47:51 you have to convert i guess 12:47:57 anyway 12:48:06 f x = x + 1 is desugard to f = \x -> x + 1 12:48:16 mhm 12:48:20 and so the type looks like whats on either side 12:48:22 x is an Int 12:48:27 what does the \ signify? 12:48:28 and (x + 1) is an Int too 12:48:42 f x y = x + y is desugared to \x -> \y -> x + 1 12:48:50 er 12:48:51 x + y** 12:48:55 which is Int -> (Int -> Int) 12:49:05 err 12:49:09 since x is an Int and \y -> x + 1 is an Int -> Int 12:49:50 what exactly do you mean by -> here? 12:50:03 X -> Y ~ "Take and X and return a Y" 12:50:11 right 12:50:22 if this were an untyped lambda calculus 12:50:30 \x -> \y -> x + y is trivially obvious, right 12:50:33 apply it to the number 2 12:50:34 and you get 12:50:35 then why isn't f x y = x + y "takes 2 integers and returns one" 12:50:38 \y -> 2 + y 12:50:44 hm 12:50:52 why, because haskell is lambda calculus. 12:50:54 with types. 12:50:55 and fun. 12:50:56 augur, so it returns a lambda that takes the other parameter? 12:51:00 yes 12:51:09 okay 12:51:10 hence the following is completely valid haskell: 12:51:11 f 2 12:51:16 f 2 == \y -> 2 + y 12:51:29 so it returns an "add 2" function 12:51:31 fun 12:51:35 sure. 12:52:02 augur, so what about functions/operators like + - and such that can take various types 12:52:11 or does haskell have separate + operator for ints and floats? 12:52:17 have a* 12:52:23 well, so theres type polymorphism right 12:52:26 for instance 12:52:28 uhu 12:52:31 you might have the function 12:52:39 f x = [x] 12:52:43 hm 12:52:44 now, lists are typed in haskell 12:52:45 but 12:52:49 surprise surprise 12:52:52 [] being list? right 12:52:56 lists are polymorphically typed 12:52:59 so theyre like 12:53:05 lists of whatever type you give it 12:53:12 so f x is also polymorphic 12:53:19 what is its type specifically? 12:53:21 augur, like list of foo? well okay 12:53:26 f :: forall a. 
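The currying walked through above, runnable as written:

```haskell
f :: Int -> Int -> Int     -- i.e. Int -> (Int -> Int)
f x y = x + y              -- desugars to f = \x -> \y -> x + y

-- "f 2" really is the \y -> 2 + y from the conversation
addTwo :: Int -> Int
addTwo = f 2
```

Applying the partial result finishes the computation: `addTwo 40` is 42, the same as `f 2 40`.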
a -> [a] 12:53:27 where foo can vary 12:53:42 where [a] is the sugar for list of a's 12:53:48 augur, what does the "a. a" bit mean. 12:53:53 or rather 12:53:56 what does the . there mean 12:53:57 a. a doesnt mean anything :P 12:54:05 forall a. a -> [a] 12:54:05 means 12:54:06 augur, okay how should one parse that bit then 12:54:27 for all types a, f can be a -> [a] 12:54:46 so if you give f an int 12:54:54 f is behaving like its of type Int -> [Int] 12:55:04 give it a string and its behaving like its of type String -> [String] 12:55:05 etc 12:55:06 that is 12:55:16 for ALL types a, f can behave like a -> [a] 12:55:22 ah 12:55:23 or, f :: forall a. a -> [a] 12:55:41 augur, so this is not exactly the same as predicate logic ∀ then? 12:55:48 well 12:55:51 yes and no 12:55:59 if you're doing a typed lambda calculus 12:56:03 same procnounciation 12:56:04 this is a quantifier over types 12:56:12 rather than a quantifier over values 12:56:17 String -> [String] <-- is that a list of 1 or more strings? 12:56:21 or 0 or more strings 12:56:26 0-or-more 12:56:28 ah 12:56:32 the list [] is of type 12:56:36 forall a. a -> [a] 12:56:50 [a] just means "a list of a's" 12:56:57 and a list of no a's is still a list of a's 12:56:59 anyway 12:57:05 well [] being of any type makes sense. After all the empty set is a subset of every other set 12:57:14 so, as for polymorphisms 12:57:23 in general 12:57:36 i think you can have multiple functions with the same name 12:57:43 so long as they have different type signatures 12:57:46 e.g. you could define 12:57:48 ah 12:58:06 (+) :: Int -> Int -> Int 12:58:07 and also 12:58:12 (+) :: Float -> Float -> Float 12:58:13 and so on 12:58:15 augur, couldn't there be some confusion about which one to use in certain cases? If there are implicit casts available in haskell? 12:58:22 if there isn't I can't imagine it being an issue 12:58:28 no, i dont think there are implicit casts, so 12:58:47 oh, there are. hah. 12:58:50 oh? 
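The polymorphic singleton function from this passage, as a runnable sketch (the `forall` is implicit in the signature, as the discussion says):

```haskell
-- implicitly: forall a. a -> [a]
single :: a -> [a]
single x = [x]

ints :: [Int]
ints = single 1        -- behaves as Int -> [Int]

strs :: [String]
strs = single "foo"    -- behaves as String -> [String]
```

One definition, instantiated at whatever element type the caller supplies.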
12:58:54 well, the compiler is smart 12:59:04 it rounds down to the most applicable type 12:59:05 so 12:59:33 infact, it seems that haskell has a generic type class Num 12:59:36 I can still imagining that being an issue sometimes 12:59:43 and Int and Float are both Nums 12:59:55 augur, so the types form a hierarchy? sensible 13:00:08 and (+) is required to be of type (Num a) => a -> a -> a 13:00:20 okay 13:00:29 where (Num a) => ... just means sort of 13:00:40 for all types a, which are of the typeclass Num, a -> a -> a 13:00:57 augur, btw, couldn't you pass a tuple or such to make a function take two arguments (instead of returning a lambda) 13:01:10 sure 13:01:31 then its a 1-arg function that requires a 2-tuple 13:01:36 augur, also that leaves some interesting related questions: how you build and take apart tuples 13:01:37 and it has pattern matching over the tuple 13:01:38 so like 13:01:42 f (x,y) = x + y 13:01:43 as in, how do I get the second member of a tuple 13:01:45 aha 13:01:46 this is fine 13:01:53 but you cant then do f 1 13:01:57 because 1 is not a 2-tuple 13:02:00 augur, so similar to pattern matching? 13:02:10 (in the erlang sense of that) 13:02:11 its pattern matched, yes. 13:02:16 right 13:02:21 but shallowly 13:02:25 oh? 13:02:28 how do you mean 13:02:34 so you can, for instance, do 13:02:46 first x:xs = x 13:02:57 first x:xs would mean? 13:03:04 well, x:xs is cons 13:03:06 ah 13:03:12 a cons pair 13:03:15 its sugar for roughly 13:03:17 Cons x xs 13:03:22 augur, and what are the car and cdr functions? 13:03:36 where Cons is a data constructor 13:03:46 augur, I know scheme :P 13:03:53 no no no 13:03:56 i dont mean its a cons function 13:04:01 oh= 13:04:02 its a special thing 13:04:04 s/=/?/ 13:04:05 a data constructor 13:04:11 like 13:04:20 augur, is it like erlang's [H|T] then? 
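The tuple-argument function and the pair accessors asked about here ("car/cdr" of a pair) look like this in real Haskell:

```haskell
-- one argument that is a 2-tuple, taken apart by pattern matching
add :: (Int, Int) -> Int
add (x, y) = x + y

-- the standard pair accessors are fst and snd
p :: (Int, Int)
p = (1, 2)
```

As noted in the log, `add 1` is a type error: `1` is not a 2-tuple. `fst p` is 1 and `snd p` is 2.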
13:04:22 it forms the basis of the ASTs in haskell 13:04:25 yea sure 13:04:32 lets pretend its that 13:04:33 its not, but 13:04:33 augur, on the left or the right side of = ? 13:04:35 ah 13:04:39 also 13:04:45 people dont usually use car and cdr 13:04:53 because they can just match them out 13:05:02 so you'd never define first = car 13:05:04 you'd just do 13:05:05 like erlang's [H|T] then for head/tail 13:05:13 first x:xs = x 13:05:20 right 13:05:38 augur, xs being a variable name that isn't used I guess? 13:05:47 it would be the cdr I imagine? 13:05:52 well 13:05:53 yes 13:06:01 i mean 13:06:04 you can just like 13:06:07 type 1 : [] 13:06:08 doesn't the compiler warn about unused variable then? 13:06:12 and this conses together 1 and [] 13:06:15 to give you [1] 13:06:28 1 : 2 : [] = 1 : [2] = [1,2] 13:06:33 type being a keyword to define a type? 13:06:42 no no no 13:06:46 oh 13:06:47 i mean if you type 1 : [] 13:06:48 ah 13:06:56 right, it wasn't part of the code, heh sorry 13:07:08 :P 13:07:15 ill use > on newlines 13:07:19 since lambdabot isnt here 13:07:22 hm 13:07:24 > 1 : [] 13:07:25 [1] 13:07:36 > 1 : 2 : [] 13:07:37 [1,2] 13:07:43 right 13:08:01 > let first x:xs = x in first [1,2,3] 13:08:02 1 13:08:13 > let rest x:xs = xs in rest [1,2,3] 13:08:15 [2,3] 13:08:36 augur, so 1 : 2 : [] is (cons 1 (cons 2 '())) ? 13:08:40 this is because [1,2,3] is really really just sugar for 1:2:[] 13:08:46 sure 13:08:52 sensible 13:08:54 er 13:08:56 1:2:3:[] 13:09:02 (if you are insisting on infix ;P) 13:09:04 : being right associative 13:09:14 well, yes 13:09:25 if we were using prefixing: 13:09:43 [1,2,3] is sugar for (:) 1 ((:) 2 ((:) 3 [])) 13:09:45 :P 13:09:46 if you are going to prefix you might as well use lisp instead ;P 13:09:53 and if we were using the List monad: 13:10:00 [1,2,3] is sugar for uh 13:10:08 List 1 (List 2 (List 3 [])) 13:10:11 ah now we come to "monad"... now what on earth is that exactly? 
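The cons examples above, with one syntactic correction hedged in a comment: the pattern in `first x:xs = x` needs parentheses to parse in real Haskell.

```haskell
-- head/tail by pattern matching; "first x:xs = x" without the
-- parentheses is a parse error in actual Haskell
first :: [a] -> a
first (x : _) = x

rest :: [a] -> [a]
rest (_ : xs) = xs

-- [1,2,3] really is sugar for 1:2:3:[]
desugared :: Bool
desugared = [1, 2, 3] == 1 : 2 : 3 : ([] :: [Int])
```

So `first [1,2,3]` is 1 and `rest [1,2,3]` is `[2,3]`, matching the GHCi transcript.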
13:10:16 noone knows 13:10:22 no im kidding 13:10:27 Monad is a type class 13:10:37 like Int or Num? 13:10:41 no 13:10:44 Num 13:10:50 Int is an instance of the type class Num 13:10:52 aha 13:11:07 so as an instance of the Monad type class 13:11:10 augur, so you can only have 2 layers then? type class and type? 13:11:16 or can type classes inherit each other 13:11:19 i think so 13:11:26 i think you only get types and type classes 13:11:28 ah 13:11:29 right 13:11:35 and you just multi-class a type 13:11:39 mhm 13:11:46 okay so monad is a type class 13:11:47 right 13:11:56 on an instance of Monad you can (must?) have defined certain functions 13:12:06 makes sense. 13:12:09 i think the three are >>=, return, and ... maybe thats it? 13:12:27 ... being an operator?? 13:12:34 no no sorry :p 13:12:35 or function I guess 13:12:38 >>= and return 13:12:40 ah 13:12:46 are the only functions. 13:12:56 and those do what 13:13:09 (and where does IO and state come into this) 13:13:16 well, theyre, by convention, required to follow certain rules 13:13:18 namely 13:13:29 (return a) >> k == k a 13:13:38 m >>= return == m 13:13:42 hrrm >> ? 13:13:52 xs >>= (return . f) == fmap f xs 13:14:08 x >>= (\x -> (k x) >>= h) == (m >>= k) >>= h 13:14:24 those are the facts that must hold of >>= and return 13:14:41 when i say must, i mean "by convention of what a monad is" 13:14:51 you can violate them, but dont expect your code to behave monadically 13:14:51 augur, writing those in English would help, I'm not very used to the multitude of use of punctuation that haskell uses (possibly even exceeding perl at that!) 13:14:59 er 13:15:01 well look 13:15:05 >>= is just an operator right 13:15:08 right 13:15:10 like + or % 13:15:18 sure, I'm okay with that 13:15:22 so (return a) >>= k == k a 13:15:43 means that (return a) >>= k must return the same value as k a 13:15:47 hm okay 13:15:54 so the >> instead of >>= was a typo? 13:15:59 yes, obviously. 13:16:00 :P 13:16:04 that explains a bit 13:16:04 . 
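The monad laws quoted above (with the `>>` typo corrected to `>>=`, as acknowledged in the log) can be checked concretely at the `Maybe` instance; the particular `m`, `k`, `h` below are arbitrary choices:

```haskell
m :: Maybe Int
m = Just 2

k, h :: Int -> Maybe Int
k x = Just (x + 1)
h x = Just (x * 2)

leftIdentity, rightIdentity, associativity :: Bool
leftIdentity  = (return 1 >>= k) == k 1
rightIdentity = (m >>= return) == m
associativity = ((m >>= k) >>= h) == (m >>= \x -> k x >>= h)
```

These equalities are not enforced by the compiler; as the conversation says, they hold "by convention of what a monad is".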
is the compose operator 13:16:06 defined as 13:16:16 augur, okay, I'm more used to seeing the compose operator as a mid dot 13:16:18 also which way is it 13:16:19 f . g = \x -> f (g x) 13:16:23 ah 13:16:32 . IS a mid dot 13:16:35 just ascii-ish :P 13:16:42 I'm pretty sure I have seem f g mean g(f(x)) somewhere 13:16:48 yes 13:16:49 in math 13:16:54 but haskell isnt APL 13:16:56 so. 13:17:00 augur, yes, which is where I have seen the mid dot! 13:17:22 f o g in SML 13:17:27 ring 13:17:33 so those are the monad laws 13:17:47 and thats what defines a monad 13:17:48 augur, what is fmap? 13:17:58 a functor map 13:18:06 uh you lost me there 13:18:20 you can define fmap usint >>= and return 13:18:20 is it related to maping a lambda over a list in any way? 13:18:33 if m = [] then it is the list map 13:18:45 hm okay 13:18:45 fmap is i think probably somethat you can think of as like 13:18:58 a monad-specific definable version of map 13:19:05 a generic map over lists as well as other types? 13:19:26 soupdragon, are monads all Functors? they are right? 13:19:41 yes 13:19:50 then where does state and IO come into monads? 13:19:54 so, monads are all Functors, Functor being another type class 13:20:06 and so in order to be a Monad you also have to be a Functor which means you have to have fmap defined. 13:20:10 yes, another (equivalent) definition of monad is Functor with join and return 13:20:31 which obeys its own set of laws 13:20:37 hm 13:20:42 namely 13:20:58 fmap id x = x 13:20:58 and 13:21:19 fmap (f . g) x = fmap f (fmap g x) 13:21:44 IO and state who knows. 13:22:14 hm 13:22:16 i mean, the way you do it, afaik, is that like 13:22:36 fmap takes a lambda and whatever it is to map over? 13:22:43 or how do you mean 13:22:46 if you just call into existence an IO monad, itll sort of be the same monad every time. 
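The compose operator and the two functor laws stated here, checked at the list instance:

```haskell
-- f . g = \x -> f (g x)
composed :: Int -> Int
composed = (+ 1) . (* 2)        -- \x -> (x * 2) + 1

functorId, functorComp :: Bool
functorId   = fmap id [1, 2, 3 :: Int] == [1, 2, 3]
functorComp = fmap ((+ 1) . (* 2)) [1, 2, 3 :: Int]
           == fmap (+ 1) (fmap (* 2) [1, 2, 3])
```

For lists, `fmap` is exactly `map`, as the conversation notes; `composed 3` is 7.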
13:22:57 but if you pipe an IO monad through some magic functions 13:22:59 fmap f m = do x <- m ; return (f x) 13:23:03 you get new IO monads 13:23:10 augur, sounds like black compiler magic to me... 13:23:13 it is 13:23:18 -!- MigoMipo has quit ("co'o rodo"). 13:23:26 the idea is basically that the monads you're getting back are not the SAME monad you put in 13:23:32 and therefore can be different 13:23:36 augur, and why is that important? 13:23:48 well, because it provides the illusion of purity 13:23:56 mhm 13:24:04 suppose you just called forth an IO monad from nothing 13:24:07 and assigned it to x 13:24:17 and then you did some shit with it, printing a line to the screen 13:24:18 augur, how is this better than just making most of your code pure and having a tiny bit with all the unpure stuff 13:24:25 thus getting back a new monad that you assign to y 13:24:29 like you commonly do in scheme or erlang for example 13:24:55 if you then tried to print to the screen again using x, i think nothing would happen on screen 13:25:03 augur, hm okay 13:25:04 (im not entirely sure on this; i dont do this IO shit in haskell :P) 13:25:11 instead you'd have to use the monad stored in y 13:25:22 augur, how do you then get the result from your computation back? ;P 13:25:28 and you'd keep chaining together the monads you get back from previous IO function calls 13:25:30 always using the REPL? 
13:25:40 right so input is sort of the same, right 13:25:47 mhm 13:25:55 if you ask the monad stored in y, now, for an input 13:26:08 itll give you back a string paired with a new IO monad i think 13:26:12 okay 13:26:15 the new monad you can do shit with however you want 13:26:21 if you ask y for input AGAIN 13:26:26 you get back the same string-monad pair as before 13:26:35 it doesnt go ask the user for input again 13:26:41 augur, I sure hope haskell's GC is good then 13:26:54 im fairly certain that its spot on 13:26:56 since it sounds like reading a file would take an awful lot of memory 13:27:13 nah, i think read ops are compiler-internally single operations 13:27:14 well, hoepfully it shares the string read in question 13:27:20 you get back a monad and the content string 13:27:40 like i said, im not entirely clear on this IO stuff 13:27:43 but 13:27:44 right 13:27:50 what about the state monad then? 13:28:05 state is more black magic 13:28:14 even more than IO? hard to imagine 13:28:22 state isn't black magic at all :( 13:28:24 i think its vaguely like a monad that has hash-like magic 13:28:29 oh? 13:28:33 ST is magic, State isn't. 13:28:35 ST vs State 13:28:43 Asztal, what is ST and what is State? 13:28:44 so that when you "assign" you're taking a var-binding hash, and deriving a new hash with the "bound" var changed 13:28:59 e.g. you're going from, say, { "x" => 1 } to { "x" => 2 } 13:29:05 hm oaky 13:29:06 in a purely functional fashion 13:29:06 okay* 13:29:11 augur, what is {} here? 
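The picture painted here (each operation hands back a result paired with a *new* world, and asking the same old world twice gives the same answer) can be modelled purely; everything below is a made-up toy, not GHC's actual IO machinery:

```haskell
-- a pure, hypothetical "world": pending input lines and printed output
data World = World { pending :: [String], printed :: [String] }
  deriving (Eq, Show)

-- reading returns the string paired with a new world
readLineW :: World -> (String, World)
readLineW (World (i : is) os) = (i, World is os)
readLineW (World [] _)        = error "no more input"

-- printing likewise yields a new world with the line appended
printW :: String -> World -> ((), World)
printW s (World is os) = ((), World is (os ++ [s]))
```

With `(s, w1) = readLineW w0`, applying `readLineW` to `w0` again returns the very same pair, which is exactly the repeat-read behaviour described in the log.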
13:29:16 you haven't used that above 13:29:19 nothing, im just making shit up 13:29:22 ah 13:29:35 ST means State Thread, it kind of ensures that there's only one "world state" in existence at any time, so you can mutate the world as you please 13:29:40 it looked slightly python-dict-ish 13:29:56 its supposed to be ruby-esque but whatever 13:30:15 augur, well not knowing ruby I guess python was the next best I could manage ;P 13:30:17 i dont know how these magic monads are implemented so, yeah. 13:30:30 IO and ST are implemented in a very similar way :) 13:30:37 hm 13:30:57 ST is elite 13:31:12 Asztal, why "thread"? 13:31:34 does it actually act like a erlang style of state process. (being a common idiom) 13:31:37 IO is sort of a special instance of ST where the state you're manipulating is... the real world! 13:31:38 because its a state monad that threads through your whole program! :P 13:31:50 ah so not thread as in separate process then? 13:31:57 i dont know :P 13:32:00 probably not 13:32:00 not a thread in that sense, no. 13:32:03 right 13:32:06 i dont think haskell has explicit threading 13:32:31 because its lazy and functional, you can spin off arbitrarily many threads in your compiled program 13:32:37 and im pretty sure it works fine 13:32:43 because using a separate process in erlang is a common idiom to keep the state. basically you send/receive messages to/from that process to access the state 13:32:46 parallelize the shit out of everything 13:32:53 and that one just does a tail recursive loop 13:32:57 with it's state as a parameter 13:33:06 nothing needs to be explicit except, i think, in your compile params 13:33:24 hm 13:33:32 augur, well, one issue is where to thread 13:33:50 I guess the compiler is smart enough to figure out where it is an overhead and where it isn't? 
13:34:40 sure 13:34:49 a heuristic I can't imagine as being 100% fool-proof ;P 13:34:59 guys 13:35:00 where is lambdabot 13:35:02 The expression (x `par` y) sparks the evaluation of x (to weak head normal form) and returns y. Sparks are queued for execution in FIFO order, but are not executed immediately. If the runtime detects that there is an idle CPU, then it may convert a spark into a real thread, and run the new thread on the idle CPU. In this way the available parallelism is spread amongst the real CPUs. 13:35:03 we need lambdabot 13:35:17 http://www.haskell.org/ghc/docs/latest/html/users_guide/lang-parallel.html 13:35:33 Asztal, ah interesting 13:35:37 well there you go. 13:35:44 wheres lambdabot 13:35:45 :| 13:35:53 augur, I have ghci in a window here 13:35:55 no issue 13:36:42 well still 13:36:53 augur, does haskell have any sort of macros? In the meaning of lisp macros I mean 13:37:00 i dont think so 13:37:04 ah okay 13:37:17 it's not quite as cool as lisp, but it does have template haskell 13:37:31 hm okay 13:37:35 theres some crazy shit that people do with haskell tho 13:37:40 Asztal, nothing like C++ templates I hope? 13:37:41 somehow they get reactive programming 13:37:44 which is like 13:37:49 the epitome of anti-laziness 13:37:52 programming that bites back? 13:37:58 ;P 13:39:14 augur, so how would you do something like a memoizing Fibonacci function in haskell? 13:39:33 that is, one which counds up, not down, and reuses the calculations 13:39:41 so it doesn't have to calculate everything a lot of times 13:39:45 no clue 13:39:49 i mean 13:39:54 you could probably do something stupid like 13:39:58 fib n memo = ... 
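The `par` combinator quoted from the GHC manual above is usable directly; a sketch (without an idle CPU or the threaded runtime the spark is simply never converted, and evaluation proceeds normally):

```haskell
import GHC.Conc (par)

expensive :: Integer
expensive = sum [1 .. 1000000]

-- x `par` y sparks x for possible parallel evaluation and returns y
result :: Integer
result = expensive `par` (expensive + 1)
```

`result` is 500000500001 either way; `par` only changes how the work may be scheduled, never the answer.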
13:40:05 and just carry your memos with you 13:40:26 maybe you'd do a memoized-fib monad 13:40:38 MemFib Int [Int] 13:40:53 and use do notation to peel it open 13:40:55 augur, in scheme I would do it as a tail recursive function 13:41:14 you don't get TCO in haskell 13:41:29 do (n, memo) <- mfib (5, []) 13:41:32 or something 13:41:34 http://www.haskell.org/haskellwiki/Memoization#Memoization_with_recursion :P 13:41:36 soupdragon, oh? how do you implement something like a main loop then? 13:41:56 main = do ... 13:41:59 oops 13:42:01 main = do ... ; main 13:42:09 soupdragon, looks like a tail call? 13:42:17 yeah but it's not 13:42:24 Asztal's memoized fib there looks like magic 13:42:26 then what is it 13:42:26 because it is 13:42:29 haskell is magic. 13:42:45 I can explain the memoization 13:42:48 if you want 13:42:54 soupdragon, to me? or augur ? 13:42:58 anyone 13:43:15 its the magic of infinite lists and such 13:43:24 soupdragon, the one Asztal linked seems fairly easy to understand, apart from the !! bit 13:43:28 which I have no idea what it means 13:43:58 "list !! i" picks the i'th element from the list. 
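The memoised Fibonacci behind the haskellwiki link mentioned above, spelled out: an infinite list defined in terms of itself, so each element is computed at most once.

```haskell
-- fibs !! n is the n'th Fibonacci number, cached in the list itself
fibs :: [Integer]
fibs = map fib' [0 ..]
  where
    fib' 0 = 0
    fib' 1 = 1
    fib' n = fibs !! (n - 1) + fibs !! (n - 2)

fib :: Int -> Integer
fib = (fibs !!)   -- "list !! i" picks the i'th element
```

Because `fibs` is a top-level value, already-computed elements stay computed, so `fib 30` is cheap the second time around.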
13:43:59 actually, what its doing is defining an infinite list thats calculated over some other infinite list 13:44:02 hm okay 13:44:11 and by virtue of it being defined once, and in terms of itself 13:44:14 huh 13:44:38 when you go to pluck some elements off the list, it just traces back the list, building it up as far as it needs to 13:44:55 and, since it IS the same list, it only ever gets built up once 13:44:57 I have to say that the variant I thought of in scheme is a lot more sensible, just tail recursion and pass the value of the last calculations along 13:45:09 and by built i mean in the interpreter when it tries to evaluate it 13:45:21 yes, thats what you could do in haskell too 13:45:34 but then you have to explicitly carry this around 13:45:39 well okay 13:45:47 what the one here doesnt let you do tho 13:45:51 is memoize across calls to fib 13:45:54 so if you did like 13:45:56 well okay 13:45:57 fib 1000000 13:45:59 then did it again 13:46:04 it would take the same amount of time 13:46:13 augur, then I would write it in another way indeed 13:46:22 if I needed that 13:46:22 whereas if you were explicitly carrying it around, you could do like 13:46:44 augur, you could use the ST monad to store it somehow? :D 13:46:46 let (x, memo) = fib 1000000 [] in fib 1000000 memo 13:46:55 AnMaster: probably 13:46:59 state monad 13:47:00 i dunno 13:47:04 its magic! 13:47:07 right 13:47:18 so what is this infinite list stuff about. 13:47:21 but you'd still need the explicit stuff i think 13:47:24 infinite lists! 13:47:28 well ok so haskell is lazy, right 13:47:35 it doesnt evaluate it unless it needs to 13:47:43 so lets do the classic infinite list 13:47:44 well okay 13:47:46 ones = 1:ones 13:47:59 hm 13:48:02 this is fine, because haskell doesnt need to evaluate ones when DEFINING ones 13:48:07 it doesnt need to evaluate anything, infact 13:48:11 of course 13:48:18 I have no problems with that idea at all 13:48:22 so that is all it is then? 
13:48:33 or can you do more interesting stuff with it? 13:48:33 tho i think itll check for scopes 13:48:41 well, you can do interesting stuff, but it all comes down to that 13:48:48 if you do first ones 13:48:59 it doesnt need to evaluate ones beyond its first value 13:49:10 -!- MizardX- has joined. 13:49:18 augur, what about using it to calculate 1/2+1/4+1/8+1/16+... 13:49:20 when you do third ones it then goes through and evals what it needs to eval to get there 13:49:38 ok so that is probably something like lets see 13:49:50 doesn't that go to 1 when the number of elements summed goes to infinity iirc? 13:49:53 AnMaster that's kind of impossible 13:49:58 -!- MizardX has quit (Read error: 104 (Connection reset by peer)). 13:50:07 -!- MizardX- has changed nick to MizardX. 13:50:18 soupdragon, not in math. I think the limit when n goes towards infinity is 1. Unless I misremember 13:50:28 AnMaster, ?? 13:50:33 in haskell 13:50:40 i want to say 13:50:41 soupdragon, I meant http://upload.wikimedia.org/math/e/c/2/ec2190a57b685add5a4e43c3ecfeed94.png 13:50:43 that the definition is 13:51:10 soupdragon, it would be cool if you could use haskell to do that kind of stuff 13:51:11 suppose you do sum [1/2,1/4,1/8,..] 13:51:25 but I guess you are better off with mathematica or some other CAS 13:51:25 halves = 1 : map (\x -> x/2) (drop 1 halves) 13:51:31 but i killed ghci doing that 13:51:32 so 13:51:36 there's no way + can know that the 100th element of that sequence isn't suddenly 62662 13:51:44 soupdragon, true. 13:51:57 that will get you [1,1/2,1/4,...] 13:52:24 then you'd just take however many out you want for precision and fold them down 13:52:30 i guess you could also do this the smarter way 13:52:31 which is 13:52:33 if you defined it in the same way as mathematica you could compute it 13:52:52 (as a series, rather than an infinite sum) 13:52:58 [x ** n : n <- [0..]] 13:53:36 augur: Lose the drop 1 13:53:40 whoops, | 13:53:45 Deewiant: but it NEEDS drop 1! 
13:53:50 no it doesnt 13:53:51 :D 13:53:55 Only for the infinite loop :-P 13:53:59 im a silly person i am 13:54:43 huh? 13:54:46 what does the drop 1 mean? 13:54:55 drop 1 (x:xs) = xs 13:55:03 soupdragon, atm I'm trying to work out what it would be in mathematica 13:55:06 drop n (x:xs) = drop (n-1) xs 13:55:27 hm it gives me (pi^2)/6 13:56:15 AnMaster, I mean if you represent it symbolically, like Sigma (\n -> Div 1 (Exp 2 n)) then you could write an algorithm to try and solve sums (like mathematica) 13:56:29 soupdragon, right 13:56:42 soupdragon, what was that about 1/(1+1/(1+1/... 13:56:53 huh? 13:56:58 or something like that 13:57:03 *tries to remember the series* 13:57:10 that ones golden ratio 13:57:10 it was some rather neat series of infinite divisions 13:57:17 but that's not what I was talking about 13:57:22 soupdragon, well true 13:57:33 I was just jumping to a different thought 13:57:49 anyway it wasn't the golden ratio I meant 13:57:52 phi = 1/(1+phi) also diverges in haskell :P 13:57:58 soupdragon, the point was that it added up to exactly 2 13:58:03 so it must have been a different one 13:58:16 ok im going to sleep for a little bit guys 13:58:16 afk 13:58:20 it was a continued fraction of some sort 14:02:31 (an infinite one at that) 14:05:46 oh wait it was not 14:06:07 it was 1 + 1/2 + 1/4 + 1/8 + 1/16 + 1/32 14:06:10 and so on 14:07:17 soupdragon, that adds up to 2 14:07:23 yeah 14:07:27 0.1111111... = 1 14:07:29 er 14:07:32 1.1111111... = 10 14:07:39 err? 14:07:44 binary 14:07:47 soupdragon, oh right 14:07:58 soupdragon, as a floating point number? 14:08:06 a real number 14:08:17 -!- coppro has quit (Read error: 110 (Connection timed out)). 14:08:24 how do you mean it would be stored then 14:08:35 if decimal 1.11111... is 10 in binary? 14:08:58 oh you mean binary for both? 14:09:01 right yeah 14:09:46 anyway that sum was Sum[1/2^i, {i, 0, Infinity}] I think 14:10:05 is that mathematica code for it? 
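[A minimal sketch of the halves list from the discussion above, with the extra `drop 1` removed as Deewiant suggests, so the definition is productive instead of looping. The `partialSum` helper is added here for illustration; its partial sums approach the series limit 2.]

```haskell
-- The corrected halves list: 1, 1/2, 1/4, 1/8, ...
-- Without the extra "drop 1", each element is simply half the previous one.
halves :: [Double]
halves = 1 : map (/ 2) halves

-- Sum of the first n terms; tends to 2 as n grows.
partialSum :: Int -> Double
partialSum n = sum (take n halves)
```

[As the log notes, `sum halves` itself can never terminate: laziness lets you build the infinite list, but a fold over all of it diverges, so you take however many terms you want for precision and fold those.]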
14:10:17 soupdragon, well yes for 1 + 1/2 + 1/4 + 1/8 + 1/16 + 1/32 + ... 14:10:45 cool does it compute it to 2? 14:10:54 soupdragon, it does 14:11:07 that's pretty cool I wonder what algorithm it uses 14:11:24 soupdragon, in latex \sum _{i=0}^{\infty } \frac{1}{2^i} I think 14:11:34 well mathematica claims it is 14:11:36 with convert to latex 14:22:16 -!- BeholdMyGlory has joined. 14:45:07 soupdragon, well that is not public for mathematica mostly iirc 14:45:16 :( 14:45:17 * AnMaster tests with maxima 14:46:08 soupdragon, maxima manages it too 14:46:10 and that is open source 14:46:38 sum(1/(2^k), k, 0, inf), simpsum; 14:46:41 gives 2 14:46:47 soupdragon, so go look at that 15:03:13 -!- ais523 has quit (Connection timed out). 15:19:36 -!- ais523 has joined. 15:35:31 -!- soupdragon has quit ("Leaving"). 15:44:34 -!- FireFly has quit (Connection timed out). 15:45:42 -!- FireFly has joined. 16:47:04 Gregor, btw, that "linux binaries on OS X" idea with your elfloader, did you do any work of that? 16:47:48 s/of/on/ 17:00:09 -!- adam_d_ has changed nick to adam_d. 17:43:52 -!- FireFly has quit (Read error: 60 (Operation timed out)). 17:59:34 -!- FireFly has joined. 18:49:55 -!- jpc1 has joined. 19:19:13 -!- bsmntbombdood has changed nick to bsmntbombgirl. 19:32:23 -!- Desmo has joined. 19:38:38 -!- ais523 has quit (Remote closed the connection). 19:38:57 -!- Desmo has left (?). 21:06:50 -!- oerjan has joined. 21:32:59 hm interesting stuff happens when you delete part of the stuff while tar is still extracting: 21:33:24 tar: linux-2.6.32.2/arch/microblaze: Function "stat" failed: File or directory not found 21:33:40 I was trying to save some space by deleting all the arches I didn't need 21:37:26 -!- coppro has joined. 21:37:50 Presumably it was extracting into that directory, so of course it expects it to still exist. 22:10:54 -!- jpc1 has changed nick to jpc. 22:30:27 * uorygl goes to see if Gregor has any simpler opuses. 22:33:08 Er, opera. 
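[soupdragon's earlier suggestion of representing the sum symbolically, roughly "Sigma (\n -> Div 1 (Exp 2 n))", could look something like the sketch below. The constructor and function names here are invented for illustration; a real solver in the Mathematica/maxima sense would manipulate the term symbolically, whereas this sketch only approximates numerically.]

```haskell
-- A hypothetical symbolic representation of an infinite series:
-- the term is kept as a function of the index rather than as a list.
newtype Series = Sigma (Integer -> Double)

-- Numerically approximate the series by summing terms 0..n.
approxSum :: Int -> Series -> Double
approxSum n (Sigma f) = sum [f i | i <- [0 .. fromIntegral n]]

-- Sum[1/2^i, {i, 0, Infinity}] from the discussion; the exact value is 2.
geometric :: Series
geometric = Sigma (\i -> 0.5 ^ i)
```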
22:34:48 Well, Opus 10 sounds simpler, but it's less interesting and has some icky patches. 22:35:32 hm 22:35:36 "Support for old Pentium 5 / WinChip machine checks" 22:35:39 a kernel config option 22:35:44 what the hell is a pentium 5 22:36:00 * uorygl looks at the score for Opus 9 in an attempt to judge its quality from just that. 22:37:04 * uorygl listens to the horrible-sounding MIDI version of Opus 9 instead. 22:37:18 pentium 5fV 22:37:31 er 22:37:35 *5eV 22:37:35 Why am I listening to this at all? It's not for piano. 22:38:43 oerjan, what? 22:38:55 uorygl, why is it horrible sounding? 22:39:10 uorygl, not a good enough sound font loaded into your hardware midi? 22:39:42 oerjan, thing is, if this is a typo for pentium 3 then it would affect the kernel I'm configuring 22:41:02 -!- augur has quit ("Leaving..."). 22:41:46 oh it seems the original pentium was codenamed P5 22:41:47 AnMaster: punny puns of puun 22:41:48 that explains stuff 22:41:56 meh 22:42:08 oerjan, too worried about this to bother with puns 22:42:42 AnMaster: nope. It's probably difficult to get a good soundfont for a string trio. 22:42:47 and I'm not in a happy mood atm: frozen water in pipes tends to make you rather unhappy 22:43:00 eek 22:43:18 oerjan, thankfully only for the garden hose connection on the outside of the wall 22:43:33 so no expensive leak inside some wall 22:43:40 but still, water has to be turned off 22:43:51 until a plumber is available 22:43:53 and they are hard to get 22:44:26 not a single one within 40 km radius was able before tomorrow (and this happened yesterday morning) 22:46:06 mhm 22:46:50 btw googling for pentium 5 found some funny old stuff 22:47:03 rumours about a 15 GHz pentium 5 by 2010 for example :D 22:47:08 was from 2002 22:47:40 -!- MigoMipo has joined. 
22:50:27 AnMaster: Well, Intel did say they were going to go to 10 GHz and higher, around the time when the Prescott came out 22:50:43 When they noticed AMD's 2 GHz chips performed better than theirs they kinda changed plans 22:51:04 (Where "theirs" == 5 GHz and up.) 22:51:06 -!- augur has joined. 22:52:51 http://en.wikipedia.org/wiki/Tejas_and_Jayhawk suggests they had 7 GHz stuff ready to be built in 2004. 23:03:59 -!- Asztal has quit (Read error: 60 (Operation timed out)). 23:19:35 -!- soupdragon has joined. 23:23:22 heh 23:23:43 Deewiant, but I thought there was some wall around 5-6 GHz? 23:23:44 iirc 23:29:24 What was the highest clock frequency seen in any sold Intel X86/X64 series processor? 23:36:54 hiya 23:37:00 I wanted to ask you guys a question here 23:37:44 Given a linear recurrence over the integers: a_n = c_1 * a_(n-1) + ... + c_d * a_(n-d) 23:38:10 with initial conditions, does there exist a k so that a_k = 0? He only asks for a decision procedure: he says that it is ... faintly outrageous that this problem is still open; it is saying that we do not know how to decide the halting problem even for “linear” automata! 23:38:30 Apparently 3.8GHz was the highest. 23:38:42 doh 23:38:45 I just realized 23:39:03 I was going to ask, doesn't rule 110 explain why there is no decision procedure 23:39:10 but actually it's not a linear recurrence.. 23:39:18 so nevermind 23:40:01 That question is reminiscent of the subset-sum problem. 23:40:24 soupdragon: um that looked linear to me... 23:40:38 Rule 110 looks linear? 23:40:44 oh, no. 23:41:08 I think that linear recurrence plus initial conditions decision looks solvable... 23:41:59 soupdragon: "faintly outrageous that this problem is still open" means that whoever gave you the problem claims it's open? 23:42:05 Hmm, all those linear recurrences can be represented as matrices. 23:42:05 Ah yeah, I figured out what the problem is... 
23:42:50 yes the claim is that it's open, and I thought 110 gave an argument against: but then realized that 1D CA is a different format 23:43:06 hm 23:43:40 a_n = c_1 * a_(n-1) + ... + c_d * a_(n-d) plus initial conditions can be written in form a_n = e_1 * (b_1)^n + ... + e_d * (b_d)^n. 23:43:44 well that obviously means it's _hard_ 23:44:36 Ilari: that may still make it _worse_, since those b_i are not integers 23:44:39 -!- MigoMipo has quit (Read error: 60 (Operation timed out)). 23:44:55 oerjan: In fact b's and e's are not even guaranteed to be real. 23:45:05 right. 23:46:57 -!- MizardX has quit ("zzz"). 23:47:10 so you can't encode a turing machine with a linear recurrence? 23:50:03 -!- adu has quit (Read error: 110 (Connection timed out)). 23:51:43 soupdragon: um if we knew how to do that we would have solved that unsolved problem. you think that is something done on the spot in an irc channel? 23:51:55 haha 23:52:11 if you know you can't encode a turing machine into the linear recurrence, it still doesn't solve the problem 23:52:33 hm could be 23:52:47 if you know you can then the problem is closed, but it's open -- so there's some kind of epistomogical argument that you can't encode turing machines into it 23:52:56 er epistemological 23:53:16 QED! :P 23:53:35 i don't think so. it could just be possible, but very hard 23:53:59 that's how famous unsolved problems are usually solved, after all 23:54:26 well that or someone comes up with an argument from something completely unrelated 23:54:49 or both. 23:56:05 it can be easily turned into a number of special cases of a vector problem 23:56:34 Here's another problem that's surprisingly hard (but not unsolvable): Given an nxn table of bits, how many ways are there to choose n all-1 bits such that no two chosen bits are on the same column or row? 23:56:34 given matrix A, vector v, is there any n such that A^n v has first coordinate 0 23:57:12 although that could still be harder, or could it... 23:57:45 Ilari, <= n! 
I guess 23:58:05 well that's obviously solvable in the computational sense, just use brute force 23:58:12 ^ Ilari 23:58:23 oerjan, hm.. I have seen people compute fibs using the matrix power like that so yeah I see a link
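[oerjan's reformulation (is there an n such that A^n v has first coordinate 0?) and the closing remark about computing fibs via matrix powers can be sketched for the 2x2 companion matrix. This is illustrative only: it computes the recurrence quickly by squaring, but says nothing about the open decision problem (known as the Skolem problem).]

```haskell
-- 2x2 integer matrices as nested pairs; enough for a second-order
-- recurrence such as Fibonacci.
type Mat = ((Integer, Integer), (Integer, Integer))

mmul :: Mat -> Mat -> Mat
mmul ((a, b), (c, d)) ((e, f), (g, h)) =
  ((a*e + b*g, a*f + b*h), (c*e + d*g, c*f + d*h))

-- Exponentiation by repeated squaring: O(log n) multiplications.
mpow :: Mat -> Int -> Mat
mpow m 1 = m
mpow m n
  | even n    = let h = mpow m (n `div` 2) in mmul h h
  | otherwise = mmul m (mpow m (n - 1))

-- fib n is the top-right entry of [[1,1],[1,0]]^n.
fibMat :: Int -> Integer
fibMat 0 = 0
fibMat n = snd (fst (mpow ((1, 1), (1, 0)) n))
```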