00:18:44 -!- bsmntbombdood_ has joined.
00:20:03 <augur> youre canadian arent you
00:21:04 <augur> huh. then your ling training was better than i expected. or something.
00:21:21 <augur> tho i indeed wouldve said @U, i probably wouldve transcribed it as oU
00:23:19 <Gracenotes> more like copying and pasting from the local IPA-friendly dictionary
00:24:10 <Gracenotes> I can speak LaTeX, though. got my IDE to be ∈ the mode for ∞ LaTeX enjoyment
00:24:38 <augur> latex made me reveal my password to irc the other day :|
00:24:45 <augur> i'd been doing my semantics homework in latex
00:24:53 <augur> lots of \f{x} shit all day
00:25:05 <augur> then i go to irc and notice im nicked as augur_
00:25:18 <augur> \msg nickserv id flibble
00:25:57 <augur> well, thats not what it was, right
00:26:03 <augur> i just chose it as an example
00:26:22 <Gracenotes> well, well, my fake password is bubbles. and it's cuter than yours! >:|
00:26:38 <augur> but mine is MISTER flibble! :|
00:27:49 <Gracenotes> oh god, looks like I'm dealing with a professional here. *backs away slowly*
00:30:34 <Gracenotes> I'm watching Buffy. yes, THE VAMPIRE SLAYER
00:37:33 <augur> ok so heres my idea
00:37:45 <augur> you'd have the expression 1 + 2
00:38:10 <augur> with the "abstract" structure ["1", ["+", "2"]]
00:39:07 <augur> "1" = 1 :: n, "2" = 2 :: n, "+" = \s:SUM(s) -> true :: s -> t / n,n,n
00:39:38 <augur> where n,n,n means "if s is SUM(s) then you need three more participants to evaluate this expression"
00:40:20 <augur> ["+", "2"] = \s:SUM(s) -> true && first(s,2) :: s -> t / n,n
00:40:48 <augur> ["1", ["+", "2"]] = \s:SUM(s) -> true && first(s,2) && second(s,1) :: s -> t / n
00:41:04 <augur> the evaluator says, ok this is the top of the tree
00:41:06 <augur> so we lift it now:
00:41:24 <augur> \s:SUM(s) -> true && first(s,2) && second(s,1) && third(s,w) :: s -> t
00:41:39 <augur> but this isnt a value type so close over it
00:41:48 <augur> Es:SUM(s)[true && first(s,2) && second(s,1) && third(s,w)]
00:41:57 <augur> but this has an open variable, so question-close that
00:42:06 <augur> Ww[Es:SUM(s)[true && first(s,2) && second(s,1) && third(s,w)]]
00:42:22 <augur> and this can be evaluated because this is a structure that the evaluator understands
00:42:28 <augur> namely, it means 1+2
00:42:39 <augur> so it evaluates it Ww[Es:SUM(s)[true && first(s,2) && second(s,1) && third(s,w)]] and returns 3
00:43:07 <augur> (well, first it does some logic to conjunction reduce Ww[Es:SUM(s)[true && first(s,2) && second(s,1) && third(s,w)]] to Ww[Es:SUM(s)[first(s,2) && second(s,1) && third(s,w)]])
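A rough, runnable Haskell sketch of how that final closed formula could be made executable. The representation of a SUM state as a triple of its participants, the names firstP/secondP/thirdP, and the finite domains are assumptions added here so the search terminates, not something from the conversation.

    -- assumption: a SUM "state" is just its three participants,
    -- with the third being the sum of the first two, over a small finite domain
    type N = Int
    data SumState = SumState N N N

    firstP, secondP, thirdP :: SumState -> N -> Bool
    firstP  (SumState a _ _) x = a == x
    secondP (SumState _ b _) x = b == x
    thirdP  (SumState _ _ c) x = c == x

    sums :: [SumState]                        -- the SUM(s) domain
    sums = [SumState a b (a + b) | a <- [0..10], b <- [0..10]]

    -- Es:SUM(s)[first(s,2) && second(s,1) && third(s,w)], for a candidate w
    body :: N -> Bool
    body w = any (\s -> firstP s 2 && secondP s 1 && thirdP s w) sums

    -- the question closure Ww[...]: which w makes the body true?
    answer :: N                               -- evaluates to 3
    answer = head [w | w <- [0..20], body w]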
00:52:17 <Gracenotes> why, back in my day, the whole family came to together to talk about semantics in front of the fireplace, we exchanged many arguments about.. er.. whatever semantics is about.. and lambdas.. ∀ dog. ∃ person and stuff
00:57:07 <augur> ok ive refined the ideas some more
00:58:43 <augur> imagine you have the expression "a one + a two"
00:59:03 <augur> for simplicity actually, lets say just "1 plus a two" instead
00:59:31 <augur> lets say "1" ~ 1 :: n
00:59:55 <augur> "plus" ~ \s:SUM(s).true :: s -> t / n,n,n
01:00:07 -!- calamari has joined.
01:00:19 <augur> "two" ~ \y.y = 2 :: n -> t
01:00:43 <augur> and "a" ~ \p.Ey.p(y) :: (n -> t) -> t
01:01:27 <augur> or even, lets say, "a" ~ \p.\q.Ey.p(y) & q(y) :: (n -> t) -> (n -> t) -> t
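A hedged sketch of those lexical entries as ordinary Haskell values. The names, the bounded domain for the existential, and the omission of "plus" (which needs the participant machinery) are choices made here, not part of the log.

    type N = Int
    type T = Bool

    one :: N                                 -- "1" ~ 1 :: n
    one = 1

    two :: N -> T                            -- "two" ~ \y.y = 2 :: n -> t
    two y = y == 2

    -- "a" ~ \p.\q.Ey.p(y) & q(y) :: (n -> t) -> (n -> t) -> t,
    -- with the existential cut off at a finite domain so it can be computed
    a :: (N -> T) -> (N -> T) -> T
    a p q = any (\y -> p y && q y) [0..100]

    aTwo :: (N -> T) -> T                    -- "a two", still waiting for its q
    aTwo = a two                             -- e.g. aTwo (> 0) == True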
01:01:42 <uorygl> Gracenotes: semantics messes with your mind?
01:01:56 <augur> (uorygl: look at what im saying! its messed with MY mind! :D)
01:02:41 <augur> you put together "a" with "two" to get
01:03:08 <uorygl> It would be nice if I knew what you were talking about.
01:03:09 <augur> \q.Ey[y = 2 & q(y)] :: (n -> t) -> t
01:03:21 <uorygl> And what your notation is.
01:03:47 <augur> then you TRY to put together \s:SUM(s).true :: s -> t / n,n,n with \q.Ey[y = 2 & q(y)] :: (n -> t) -> t
01:03:56 <augur> then you notice, hold on
01:04:24 <augur> (n -> t) -> t?! maybe if i replace the expression "a two" with some unbound variable z :: n
01:04:37 <uorygl> So, um, "1 plus a two".
01:05:53 <augur> eh no sorry my explanations are sucky :d
01:06:02 <augur> my logic isnt quite worked out yet
01:06:25 <uorygl> 1 is a DP; it denotes a specific value.
01:06:32 <uorygl> Two is a noun; it denotes a type of thing.
01:06:53 <uorygl> A is a determiner; it takes a type of thing and denotes a specific instance of it.
01:07:01 <augur> actually, DP's dont denote specific values
01:07:02 <uorygl> Though it doesn't specify the specific instance. :-P
01:07:14 <uorygl> DPs totally denote specific values!
01:07:34 <augur> actually they dont
01:08:26 <uorygl> Plus is... a conjunction? A preposition? Anyway, it takes two DPs and becomes a DP denoting their sum.
01:08:48 <augur> plus is a lambda with a body, and a restriction on its variables
01:09:00 <augur> "plus a two" is another lambda
01:09:03 <augur> ok i think i have it
01:09:19 <uorygl> Well, figure out whether "plus" is a conjunction or a preposition.
01:09:32 <uorygl> I guess maybe it's both.
01:10:17 <uorygl> So, DPs don't denote specific values?
01:10:37 <augur> "a two" then has the value \q.Ey[y = 2 & q(y)] :: (n -> t) -> t
01:10:51 <augur> so you notice, ok fuck, this doesnt combine with "plus" :: s -> t / n,n,n
01:11:07 <augur> but then you notice, hey, "plus" needs an n participant, and i'm an (n -> t) -> t
01:11:30 <augur> so lets replace "a two" with an unbound z, and hold "a two" off on the side for a while
01:11:43 <uorygl> (n -> t) -> t is the same as C n for some monad C.
01:11:57 <augur> so we instead pretend this is "plus z" where z :: n
01:12:04 <augur> holding "a two" on the side
01:12:40 <augur> so now we combine \s:SUM(s).true :: s -> t / n,n,n with z :: n, returning
01:13:15 <augur> \s:SUM(s).(true && first(s,z)) :: s -> t / n,n
01:13:31 <augur> then we combine 1 :: n with that, returning
01:13:52 <augur> \s:SUM(s).(true && first(s,z) && second(s,1)) :: s -> t / n
01:14:24 <augur> then we say, ok, so this is the top of the tree, but thats not a value i can deal with
01:14:53 <augur> so let me pull this up to an existentially closed statement
01:15:16 <augur> first, let me add the last n:
01:15:29 <augur> \s:SUM(s).(true && first(s,z) && second(s,1) && third(s,w)) :: s -> t
01:15:42 <augur> so now i can existentially close s:
01:15:55 <augur> Es:SUM(s).(true && first(s,z) && second(s,1) && third(s,w)) :: t
01:16:31 <augur> but z is associated with "a two" and i havent handled that yet
01:16:52 <augur> so "a two" is \q.Ez[z = 2 && q(z)]
01:17:06 <augur> and we can bind z on the whole tree:
01:17:14 <augur> \z.Es:SUM(s).(true && first(s,z) && second(s,1) && third(s,w)) :: n -> t
01:17:26 <augur> this is what q needs to be!
01:17:31 <augur> so lets use this as the value for q
01:17:44 <augur> combining "a two" with "1 plus z"
01:18:02 <augur> Ez[z = 2 && Es:SUM(s).(true && first(s,z) && second(s,1) && third(s,w))] :: t
01:18:24 <augur> but w is open and we introduced it to satisfy the participant needs of s:SUM(s)
01:18:38 <augur> so we have to bind that with a question operator
01:18:52 <augur> Ww[Ez[z = 2 && Es:SUM(s).(true && first(s,z) && second(s,1) && third(s,w))]]
01:19:29 <augur> but this isnt something we know how to comprehend, so lets do some logic: minimize Ez to exist around the smallest thing it can:
01:19:49 <augur> Ww[Es:SUM(s)[true && Ez[z = 2 && first(s,z)] && second(s,1) && third(s,w)]]
01:20:04 -!- soupdragon has quit ("Leaving").
01:20:09 <augur> now apply some general theorem that Ex[x = y && p(x)] :- p(y)
01:20:31 <augur> Ww[Es:SUM(s)[true && first(s,2) && second(s,1) && third(s,w)]]
01:20:36 <augur> then conjunction reduce true out of existence
01:20:40 <augur> Ww[Es:SUM(s)[first(s,2) && second(s,1) && third(s,w)]]
01:20:56 <augur> and now notice that this is precisely the kind of statement that the interpreter understands by default.
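The two logic steps used just above can be stated as small Haskell checks over a finite domain. This is only a sketch of the laws themselves (the domain bound and the names are assumptions), not the interpreter.

    existsIn :: [a] -> (a -> Bool) -> Bool
    existsIn dom p = any p dom

    -- Ex[x = y && p(x)]  :-  p(y), provided y is in the domain
    identityElim :: Int -> (Int -> Bool) -> Bool
    identityElim y p = existsIn [0..10] (\x -> x == y && p x) == p y

    -- "conjunction reduce true out of existence": true && q == q
    dropTrue :: Bool -> Bool
    dropTrue q = (True && q) == q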
01:30:30 <uorygl> So, what's the purpose of all this?
01:30:37 <uorygl> Are we trying to make computers understand English?
01:30:43 <augur> quite the opposite
01:30:55 <augur> its designed to have syntacto-semantics like natural language
01:31:19 <uorygl> I guess I can see why in this sense DPs don't denote specific values.
01:34:22 <uorygl> So in your language thing, a DP has type (n -> t) -> t?
01:34:22 <uorygl> It's a predicate on predicates?
01:34:22 <uorygl> What do you mean by "it"?
01:34:22 <augur> "a two" is just a convenient use of english words for function names
01:34:22 <uorygl> In English, "a two" is a DP.
01:34:22 <augur> but this isnt english!
01:34:22 <augur> "a two" is just a function of type (n -> t) -> t
01:34:22 <uorygl> So by DP, I mean a thing that looks like an English DP.
01:34:22 <uorygl> So why is it an (n -> t) -> t?
01:34:22 <augur> because i wanted to force QR
01:34:22 <uorygl> If you're trying to be English-like, I would expect a DP to be either a value or a predicate on values.
01:34:22 <augur> quantifier raising
01:34:22 <augur> "a two" is actually a quantifier, right
01:34:22 <augur> \q.Ex[x = 2 && q(x)]
01:34:41 <augur> just like you might have some quantifier in haskell
01:35:18 <uorygl> Can you use pseudo-Haskell notation here? I'm not used to whatever you're using.
01:35:18 <augur> all xs f = Ax:(x in xs)[f(x)]
01:35:39 <augur> or, for the sake of being pure
01:35:56 <augur> well no we cant define
01:36:00 <augur> were TRYING to define
01:36:04 <uorygl> I'm not following a lot of what you're saying.
01:36:21 <augur> all xs f = foldr (&&) True (map f xs)
01:37:06 <augur> exists xs f = foldr (||) False (map f xs)
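For reference, those two folds compile as written; one caveat worth adding (not augur's point) is that over an infinite list like [1..] each can only short-circuit one way.

    -- all' can only ever answer False on an infinite list,
    -- exists' can only ever answer True
    all', exists' :: [a] -> (a -> Bool) -> Bool
    all'    xs f = foldr (&&) True  (map f xs)
    exists' xs f = foldr (||) False (map f xs)

    -- exists' [1..]   (== 2)  ==> True
    -- all'    [1..10] even    ==> False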
01:37:28 <uorygl> Did you tell me what the value of "a two" is?
01:37:35 <augur> im getting there :D
01:37:47 <uorygl> I think I'd rather you just tell me and we work backwards.
01:38:28 <augur> a f g = exists [x : x <- [1..], f x] g
01:38:40 <uorygl> If I want to learn why the gostak distims what it does, I'd like to know what the gostak distims first.
01:39:05 <augur> a two == exists [x : x <- [1..], x == 2] g
01:39:24 <uorygl> s/a two/a two g/, I'm assuming.
01:39:40 <augur> a two = exists [x : x <- [1..], x == 2]
01:40:16 <uorygl> So "a two" takes a predicate and returns whether 2 satisfies that predicate, right?
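A runnable version of that determiner, with two assumptions made here: the comprehension is written with | rather than :, and the domain is cut off at 100 because with [1..] the existential could never answer False.

    a' :: (Int -> Bool) -> (Int -> Bool) -> Bool
    a' f g = or [g x | x <- [1..100], f x]

    aTwo' :: (Int -> Bool) -> Bool           -- "a two"
    aTwo' = a' (== 2)

    -- aTwo' (> 1) ==> True,  aTwo' odd ==> False:
    -- it takes a predicate and says whether some 2 in the domain satisfies it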
01:40:27 <augur> plus is something i cant write in haskell, but lets modify haskell a bit
01:41:07 <augur> lets say that in haskell, the argument in a lambda can have a restriction which is _almost_ like a type but instead tells you what kind of thing this is for semantic interpretability
01:41:11 <augur> we'll denote it like so: \x:R -> body
01:41:21 <augur> where R is the restriction
01:41:33 <augur> so lets say that plus s:SUM = true
01:41:40 <uorygl> So in your type system, n means values and t means Booleans, right?
01:41:49 <augur> n means Ints, t means Bools
01:42:03 <augur> so plus s:SUM = true :: s -> t
01:42:06 <uorygl> That means a DP is C n, where C is the continuation monad C x = (x -> t) -> t.
01:42:34 <augur> but since s:SUM, and SUMs need 3 "participants" in order to be semantically _sensible_
01:42:49 <augur> plus s:SUM = true :: s -> t / 3
01:43:03 <uorygl> "s:SUM" can be read "s which is a SUM", yes?
01:43:10 <augur> or s -> t / n,n,n specifically
01:43:16 <augur> sure, s-which-is-a-SUM
01:43:20 <augur> but its not a type restriction, keep in mind
01:43:41 <augur> ok so now we want to combine "plus" with "a two"
01:43:49 <uorygl> In "plus s:SUM = true :: s -> t / 3", are you using s consistently?
01:43:55 <uorygl> It looks like you're using it inconsistently.
01:44:07 <uorygl> What does "t / 3" mean?
01:44:29 <augur> the / 3 means "needs three participants in order to be sensible"
01:44:45 <augur> im using s as both the name of the type and the name of the var
01:44:56 <augur> if you want, plus s:SUM = true :: S -> T / N,N,N
01:45:17 <augur> and "a two" is (N -> T) -> T
01:45:21 <augur> so this doesnt match up
01:45:24 <uorygl> It looks to me like you're being really complicated. :-P
01:45:31 <augur> "a two" cant be an arg to "plus"
01:45:39 <augur> so lets be hopeful:
01:45:41 <uorygl> What is a State, anyway?
01:45:55 <augur> in the end, the primitive types wont matter
01:46:02 <augur> and we can make it untyped
01:46:14 <augur> well, mono-primitive
01:46:23 <augur> so lets be hopeful
01:46:27 <uorygl> I would have plus :: ((N,N,N) -> T) -> T or something.
01:46:49 <augur> since "a two" looks like (a -> b) -> b
01:46:58 <augur> this just looks like a type-lifted a
01:47:05 <augur> so "a two" looks like a type-lifted N
01:47:18 <augur> so lets replace "a two" with some place holder, call it tz
01:47:28 <augur> and hold "a two" off to the side
01:47:46 <augur> so now what we're really trying to do is combine plus :: (S -> T) / N,N,N with tz :: N
01:47:56 <augur> well, still, tz isnt an S so it cant be an argument to plus
01:48:03 <augur> but maybe its one of those three participants!
01:48:09 -!- Pthing has quit (Remote closed the connection).
01:48:20 <uorygl> Do you want to explain why plus has that type?
01:48:30 <augur> its magic, youll see
01:48:38 <augur> so we can specify this. its the first of the three participants to be added, sooo
01:48:42 <augur> we combine the two like so:
01:49:22 <augur> plus ^ tz = \s:SUM -> true && first(s,z) :: S -> T / N,N
01:49:55 <augur> pretend i said z everywhere before where i said tz :p
01:50:21 <augur> (im mixing domains of discourse here, so im thinking partially in terms of words with meanings, and partially in terms of pure meanings, so its difficult. anyway...)
01:50:36 <augur> so plus ^ tz = \s:SUM -> true && first(s,z) :: S -> T / N,N
01:51:02 <augur> cause remmber we're doing "1 plus a two"
01:51:12 <uorygl> (As you can see, the ellipsis denoted that I was responding to your second to last message. :-P)
01:51:12 <augur> N again can be used as a participant
01:51:57 <uorygl> You know, it seems to me that natural language has a value embed :: (Monad m) => m a -> a
01:52:05 <augur> so we combine (plus ^ z) with 1: 1 ^ (plus ^ z) = \s:SUM -> true && first(s,z) && second(s,1) :: S -> T / N
01:52:31 <augur> and now we've handled the whole of the _expression_ "1 plus z"
01:52:38 <augur> we have some problems!
01:52:49 <augur> first, we didnt handle that whole "a two" thing, we just left it off to the side
01:53:10 <augur> second we still dont have a last N participant for this function
01:53:15 <augur> so we must proceed!
01:53:17 <uorygl> I don't think I can continue listening unless you tell me why you gave plus that type.
01:53:25 <augur> i will when im done :p
01:53:38 <augur> well, we can existentially close off the s variable!
01:53:43 <augur> \s:SUM -> true && first(s,z) && second(s,1) :: S -> T / N
01:53:46 <augur> but this then becomes....
01:53:48 <uorygl> Then say my name when you're done.
01:54:47 <augur> \z -> exists states (\s:SUM -> true && first(s,z) && second(s,1)) :: N -> T / N
01:54:57 <augur> not quite that far :p
01:55:03 <augur> exists states (\s:SUM -> true && first(s,z) && second(s,1)) :: T / N
01:55:12 <uorygl> Darn, no alternative activity comes to mind. I'll have to keep reading.
01:55:27 <augur> but now wait, z isnt _technically_ a bound variable
01:55:48 <augur> so we have to bind it, but we dont want to do existential, since we got z by removing "a two"
01:55:50 <augur> so we do a lambda:
01:55:55 <augur> \z -> exists states (\s:SUM -> true && first(s,z) && second(s,1)) :: N -> T / N
01:56:18 <augur> and now we have an N -> T that "a two" :: (N -> T) -> T can combine with!
01:56:34 <augur> so we put them together to get
01:56:42 <uorygl> Though I'm not reading so much as checking to see whether you're answering my question yet.
01:56:53 <augur> exists [z : z <- [1..], z = 2] (\z -> exists states (\s:SUM -> true && first(s,z) && second(s,1))) :: T / N
01:57:21 <augur> let me stop this ALMOST FINISHED EXPLANATION
01:57:26 <augur> so i can answer your question
01:57:38 <augur> LETS HOPE YOU REMEMBER WHERE WE FUCKING WERE
01:58:32 <augur> the idea is to try and force the language to allow _only_ primitive combinations of functions (ie the kind of combinations you can type into a REPL) which involve ONLY forks of two functions
01:59:01 <uorygl> I'm not familiar with J.
01:59:15 <augur> fork f g h x = f (g x) (h x)
01:59:22 <augur> so the classic definition of avg is
01:59:39 <augur> avg = fork (/) (+) len
02:00:23 <augur> where + should be the summation operation
02:00:45 <augur> since fork (/) (foldr (+) 0) len = \xs -> (/) (foldr (+) 0 xs) (len xs)
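Putting the self-corrections together, a runnable form of the fork and of avg might look like the following; the concrete types are a choice made here, not from the log.

    -- fork f g h applies g and h to the argument and combines the results with f
    fork :: (b -> b -> c) -> (a -> b) -> (a -> b) -> a -> c
    fork f g h x = f (g x) (h x)

    -- average as a fork of the sum and the length
    avg :: [Double] -> Double
    avg = fork (/) (foldr (+) 0) (fromIntegral . length)

    -- avg [1,2,3,4] ==> 2.5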
02:01:04 <augur> so then how do you force plus to be a fork-able function?
02:01:12 <augur> well, thats where the language-y stuff comes in
02:01:27 <augur> just say plus is a predicate that specifies what kind of thing its argument is
02:01:40 <augur> all plus does is \s:SUM -> true
02:02:16 <augur> and, by assertion, s is a SUM
02:02:31 <augur> (so the :SUM thing kind of is an assertion of type, rather than a restriction on type)
02:02:36 <uorygl> Though I've not thought about it, it sounds a lot like you're saying, "I want my language to allow only oranges. Now, how do we represent an apple as an orange?"
02:03:12 <augur> if you want your language to be purely logical (Prolog) you have to represent addition some special way
02:03:25 <uorygl> Where "apple" and "orange" are mutually exclusive concepts.
02:03:25 <augur> if you want your language to be purely functional (Haskell) you have to represent loops some special way
02:03:33 <augur> nope, theyre NOT mutually exclusive
02:03:41 <augur> theyre just very different perspectives on things
02:04:14 <augur> you've now figured out how to satisfy "a two"'s argument
02:04:15 <uorygl> "Apple" and "orange" are mutually exclusive. You're denying that my analogy applies.
02:04:19 <uorygl> Which is precisely what I expected.
02:04:23 -!- oerjan has quit ("Good night").
02:04:39 <augur> you've done so and gotten back exists [z : z <- [1..], z = 2] (\z -> exists states (\s:SUM -> true && first(s,z) && second(s,1))) :: T / N
02:04:48 <augur> which is GREAT, because this is a basic value
02:04:58 <augur> but wait, its not a sensible meaning
02:05:06 <augur> it still needs an N
02:05:13 <augur> but we dont HAVE an N!
02:05:26 <augur> well, we can ask which N will make it sensible!
02:05:32 <uorygl> I think I'm not your intended audience.
02:05:41 <augur> but now? stick third(s,w) in there at the appropriate place
02:05:52 <augur> exists [z : z <- [1..], z = 2] (\z -> exists states (\s:SUM -> true && first(s,z) && second(s,1) && third(s,z))) :: T
02:06:01 <augur> but now this has an unbound variable z
02:06:04 <augur> so we have to close it
02:06:07 <uorygl> Which means I'm just being distracted and should get back to something useful.
02:06:09 <augur> but how can we close it? well, lambdas.
02:06:29 <augur> woops, this should be w
02:06:32 <augur> since we have z already
02:06:36 <augur> \w -> exists [z : z <- [1..], z = 2] (\z -> exists states (\s:SUM -> true && first(s,z) && second(s,1) && third(s,w))) :: N -> T
02:06:47 <augur> but alas, this is _still_ not a value
02:07:06 <augur> we can "question" it:
02:07:16 <augur> which [1..] (\w -> exists [z : z <- [1..], z = 2] (\z -> exists states (\s:SUM -> true && first(s,z) && second(s,1) && third(s,w)))) :: T
02:07:38 <augur> where which is like
02:07:56 * uorygl nods in a way that expresses that he believes that his suspicion has been confirmed.
02:08:16 <augur> which x:xs f = if (f x) then x else which xs f
02:08:38 <augur> and which [1..] (\w -> ...) is of type N
02:08:53 <augur> and we can definitely display one of those!
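A runnable form of that which operator, with the caveat (added here) that it is partial and over [1..] only terminates when a witness actually exists.

    which :: [a] -> (a -> Bool) -> a
    which (x:xs) f = if f x then x else which xs f
    which []     _ = error "which: no witness"

    -- which [1..] (\w -> 1 + 2 == w)  ==> 3
    -- this is exactly the shape of the question closure:
    -- ask for the w that makes the existentially closed body come out true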
02:09:17 <augur> this is how my esolang is going to calculate 1 plus a two :D
02:09:34 <augur> similar for 1 + 2 but less crazy
02:10:10 <augur> or, since we're being haskellish
02:10:26 <uorygl> I keep coming back to this conversation. It would be easier for me to leave if I explicitly closed my end and you acknowledged the closure.
02:10:46 <augur> uorygl doesnt like my esolang :|
02:11:17 <uorygl> Your esolang is probably interesting, but I must admit I can't find interest in what you're saying.
02:11:29 <augur> well, i was giving an evaluation trace
02:11:33 <augur> who COULD find interest
02:11:44 <uorygl> If your purpose is to interest me, you should answer my questions and say little else.
02:11:55 <augur> well ask me a question then
02:12:07 <augur> you sort of interrupted an explanation i was giving to Gracenotes, so...
02:12:12 <augur> you got the rest of it
02:12:17 <augur> then you started asking queeestions
02:12:26 <augur> i was in the middle of explaining something, you demanded i restart
02:12:30 <uorygl> Okay. I felt I would have been interrupting you, since you were saying so much.
02:13:01 <uorygl> You don't have to be interesting. Did I imply that I wanted that?
02:13:10 <uorygl> s/be interesting/interest me/
02:13:36 <uorygl> Anyway, my question, if you want to answer it, is what it means for a fork to be "of" two functions.
02:13:48 <uorygl> In fork f g h x, what's that a fork of?
02:14:25 <uorygl> Well, you said "only primitive combinations of functions which involve only forks of two functions".
02:14:37 <augur> fork :: forall a, b, c. (a -> b) -> (a -> b) -> (b -> c) -> c
02:14:45 <augur> i defined fork earlier :|
02:14:56 <augur> fork f g h x = f (g x) (h x)
02:15:13 <augur> my types are not matching my definitions but whatever
02:15:35 <uorygl> So is fork (/) (+) len a fork of (+) and len, or what?
02:16:02 <augur> fork :: forall a, b, c. (b -> c) -> (a -> b) -> (a -> b) -> a -> c
02:16:31 <uorygl> (b -> b -> c), not (b -> b), I think.
02:16:42 <augur> fork (/) (+) is a higher order function \f -> \x -> (+ x) / (f x)
02:17:19 <augur> remember tho, i mistyped + instead of like... foldr (+)
02:17:55 <uorygl> What I'm trying to understand is your usage of "of", though.
02:18:14 <augur> you mean when i said that i wanted everything to be forks of functions
02:18:55 <augur> a fork of f and g is: forsome h. \x -> h (f x) (g x)
02:19:28 <uorygl> So fork f g h x is a fork of g and h.
02:22:46 -!- adam_d_ has quit (Read error: 110 (Connection timed out)).
02:23:33 <uorygl> There's an arrow operation that does that. Something like (&&&) :: ar a b -> ar c d -> ar (a,c) (b,d)
02:23:56 <augur> maybe. i dont know what its like in haskell
02:24:13 <uorygl> Are you familiar with arrows in Haskell?
02:25:39 <uorygl> They're things that look like functions. Something is an arrow if it has three operations: turning a function into one, composing two of them, and "fork"ing two of them into one.
02:26:07 <uorygl> Except that's not really fork; it's more like (&&&) f g (x,y) = (f x, g y). You can make fork out of it.
02:26:41 <uorygl> Lessee. fork f g h = uncurry f . (g &&& h) . (\x -> (x,x))
02:26:50 <uorygl> Where . is the composition operator.
02:27:17 <augur> i know what . is in haskell :P
02:27:34 <augur> and i think you mean
02:28:01 <augur> eh.. no youre right. :D
02:28:21 <uorygl> Well, I said that because . only works for actual functions, not arrow things in general.
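For what it's worth, a sketch with the actual Control.Arrow (&&&) on plain functions, which already duplicates the input, so only an uncurry is needed; the name forkA and the example are assumptions here.

    import Control.Arrow ((&&&))

    forkA :: (b -> b' -> c) -> (a -> b) -> (a -> b') -> a -> c
    forkA f g h = uncurry f . (g &&& h)    -- (g &&& h) x == (g x, h x)

    -- forkA (/) (foldr (+) 0) (fromIntegral . length) [1,2,3,4] ==> 2.5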
02:28:55 <augur> the evaluation of "1 + a two" is such that a lot of it is not written (e.g. the extraction of "a two" is not explicit)
02:29:06 <augur> (in that its done by the interpreter)
02:29:17 * uorygl ponders what "1 plus a two" would look like if it were done with arrows.
02:29:24 <augur> but infact maybe you _could_ do it yourself
02:29:47 <augur> and this evaluates to 3 as well
02:29:56 <uorygl> Let's use @ for an arrow: a @ b is a "function" taking a and returning b.
02:30:00 <augur> which lets you use functions normally
02:30:07 <augur> except that + is not defined as Int -> Int -> Int
02:30:24 <augur> but rather as State -> Bool / Int, Int, Int
02:31:21 <uorygl> "Plus" could be (N,N) @ N, "1" could be () @ N, "a two" could likewise be () @ N...
02:31:43 <uorygl> You can use arrow operators to combine those into () @ N the right way.
02:32:16 <augur> i really should write the interp for this
02:33:19 <augur> i like the idea tho man
02:33:29 <augur> we have to extend it to quantifiers tho
02:33:55 <augur> but im talking to you, so
02:34:13 <augur> nah there can be real functions
02:34:31 <uorygl> "A two" isn't really an N, though; you said that.
02:34:31 <augur> no! haha! there wont be
02:34:40 <augur> "a two" is (N -> T) -> T
02:35:03 <uorygl> If we define a @ b as a -> (b -> T) -> T, then "a two" is () @ N.
02:35:23 <augur> specifically, "a two" == \q.Ex:(x = 2)[q(x)]
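A hedged sketch of that "a @ b" reading: a DP as a value of type (N -> T) -> T, i.e. the continuation monad C that uorygl mentioned, with "a two" and a lifted "1" as instances. The names and the bounded domain are choices made here.

    type T = Bool
    type Dp a = (a -> T) -> T        -- C a = (a -> t) -> t

    aTwoC :: Dp Int                  -- "a two" == \q.Ex:(x = 2)[q(x)]
    aTwoC q = any (\x -> x == 2 && q x) [1..100]

    liftDp :: a -> Dp a              -- lifting a plain value, e.g. "1"
    liftDp x q = q x

    -- aTwoC even ==> True,  liftDp 1 even ==> False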
02:38:43 <uorygl> Though what you say makes me want to say stuff of my own. :-)
02:39:57 <uorygl> Let's call P a point concept, a specific thing, where anything you might refer to is exactly one point concept.
02:40:12 <augur> so i think "a" would infact be defined as E*
02:40:49 <uorygl> "That orange over there" is a phrase referring to a point concept. There is exactly one thing that you're talking about.
02:41:29 <augur> deictic terms are the only referentials really
02:41:45 <augur> but even "that", while being deictic, is a quantifier
02:41:48 * uorygl lets his browser look up the word "deictic".
02:42:00 <augur> its just a quantifier that has a use of a bound variable
02:42:06 <uorygl> P -> T is a sort of region concept, a type of thing. "Orange" is a region concept, I guess.
02:42:07 <augur> bound globals, really
02:42:52 <augur> that f g = exists [x : x <- [1..], x == the_global, f x] g
02:43:18 <uorygl> Nouns tend to denote Rs, and DPs tend to denote Ps.
02:43:30 <augur> "point" concepts are denoted E, btw
02:43:36 <augur> well, in semantics, e
02:44:12 <augur> (you know, we dont use -> in semantics? we use <,>, following church. its horrible. a -> b -> c == <a,<b,c>>)
02:45:09 <uorygl> Something that's attested is DPs denoting Rs instead of Ps, which is too bad.
02:45:43 <uorygl> Now, let's pretend English is a subject-dropping language; it doesn't really matter.
02:46:17 <augur> its actually rare that DPs actually point
02:46:22 <uorygl> Verbs can stand alone as sentences, but they are usually modified by subjects, direct objects, indirect objects, and prepositional phrases.
02:46:23 <augur> ignoring "that dog"
02:46:35 <augur> "the cat" does not actually point
02:46:53 <uorygl> "John" doesn't actually point?
02:47:03 <augur> well, it sort of does actually
02:47:15 <augur> "John" can be one of two things
02:47:21 <augur> either an NP, which is a predicate like "cat"
02:47:40 -!- Slereah has quit (Read error: 60 (Operation timed out)).
02:47:41 <augur> or a deictic DP, with a covert deictic existential quantifier
02:47:54 -!- BeholdMyGlory has quit (Remote closed the connection).
02:47:56 <uorygl> Intuitively, "John is eating" and "Somebody named John is eating" mean different things.
02:47:58 <augur> the idea is that when you say "John" as a DP, as in "John danced"
02:48:11 -!- Slereah has joined.
02:48:22 <augur> sure sure but "Somebody named 'John' is eating" is overtly something even more complex
02:48:33 -!- jpc has joined.
02:48:46 <augur> we think that "John danced" is really "NOM John danced"
02:48:55 <augur> where NOM is roughly "Some" only a covert some
02:49:06 <augur> "John danced" is the same as "Some 'John' danced"
02:49:14 <augur> except where you have someone in mind
02:49:34 <augur> "Some 'John' who I'm thinking of danced"
02:49:53 <augur> at least, this is what you have to say to fully account for the semantics of english
02:50:07 <augur> without being obnoxious and saying that there's either
02:50:17 <augur> a) 10 different versions of the verb "eat"
02:50:41 <uorygl> I still think that intuitively, "John is eating" and "Some John who I'm thinking of is eating" mean different things. The former is ambiguous, the latter is inspecific.
02:50:48 <augur> b) 10 different versions of "the"
02:50:59 <augur> well, it DOES mean something different
02:51:15 <augur> because the meaning of "Some John who I'm thinking of is eating" is an actual english sentence with a complex meaning
02:51:26 <augur> see the problem is that like
02:51:44 <augur> the quantifier inside of "NOM" is not the SAME as the quantifier "some ... who I'm thinking of"
02:51:44 <uorygl> You seem to be claiming that they do mean the same thing.
02:51:52 <augur> theyre roughly the same, right
02:52:02 <augur> but insofar as your internal representation is concerned, they're not
02:52:23 <augur> and representationally they're different
02:52:32 <augur> in the same way that "1 + 2" and "2 + 1" are different haskell expressions
02:52:38 <augur> but if you evaluate them they mean the same thing
02:53:00 <augur> if you "evaluate" the two sentences "John is eating" and "Some John that I'm thinking of is eating", WHILE you're thinking of that particular John
02:53:10 <augur> you wont be able to find a difference in meaning
02:53:49 <uorygl> Let's say that "John is eating" does refer to a specific John, though which John it is may be undeterminable.
02:54:00 <augur> undeterminable by the listener
02:54:30 <augur> when we reflect upon the meanings of sentences, we're not looking at whether or not the sentence is true, nor what makes the sentences true, we're looking at the form of the meaning
02:55:19 <augur> the form of the meaning of "John is eating" really does seem to have something like a quantifier in it, at least when you represent the meaning using the normal modes of meaning representation.
02:55:46 <augur> if all we ever said were things like "John is eating" this wouldnt be an issue
02:56:00 <augur> hell, if all we said were things like "John is eating" and "all dogs are brown" this wouldnt be an issue
02:56:17 <augur> the issue is "John is eating something"
02:56:39 <uorygl> So I'm trying to create the concept of "the P referred to by a DP". I think in order to do that, I have to find three mutually correlated properties, and say that this new concept is the thing that they are related by being correlated with.
02:56:39 <augur> "The John that I know is eating something"
02:57:10 <uorygl> I think I'm getting caught up in Philosopher's Molasses.
02:57:19 <augur> no, semanticists molasses :p
02:57:22 <augur> but such is language!
02:57:37 <augur> to account for object quantification you need a whole mess of shit
02:57:40 <uorygl> Rationalist's Molasses.
02:57:42 <augur> there are a number of options
02:57:52 <augur> in ONE option, you could ALMOST get away with John being deictic
02:58:04 <augur> except then you have to have as many words "John" as you know people named John
02:58:19 <augur> AND you also have to have another word "John" that means "person named 'John'"
02:58:37 <uorygl> I'm pondering when it's valid to postulate a concept.
02:59:11 <augur> and at the same time you STILL need to have purely deictic terms like "that"
02:59:27 <augur> or some covert terms that seem to be genuinely deictic
03:00:01 <augur> and then theres the problem that in many languages, even PROPER NOUNS must be used with "the"
03:00:12 <augur> in greek, you dont say "John is dancing" you say "The John is dancing"
03:00:32 <uorygl> Anyway, we should stop trying to find the Deep Theoretical Platonic Truth until we figure out how it's related to the Shallow Empirical Observable Truth.
03:00:40 <augur> the same "the" (an 'o' or there abouts eg 'o janos') that you use to say "the cat"
03:00:53 <augur> there is no shallow empirical observable truth :D
03:01:08 <augur> quantum mechanics and that berkeley dude has shown us this
03:01:30 <augur> look, the evidence that "John" is really a deictic quantifier is quite strong
03:02:02 <augur> yes, it seems like "John is eating" points to some particular person John (because it does) and that it has a different meaning than "Some John is eating" (because it does)
03:02:16 <uorygl> If I look at a glass, I know that I see a glass.
03:02:23 <uorygl> Is there really a glass there? Almost certainly.
03:02:38 <uorygl> The presence of that glass is the shallow empirical observable truth.
03:02:38 <augur> but that does NOT mean that "John" here by itself, as the smallest meaningful object that contains the letters "John", is infact the thing doing the pointing.
03:02:54 <augur> you dont know you see a glass, you THINK you see a glass
03:03:15 <augur> the most you know is that you're experiencing what it would be like to see a glass
03:03:24 <augur> or what you would call a glass
03:04:15 <augur> thats not to say you dont have a very good reason to think that
03:04:15 <uorygl> Sure. In any case, we know *something* upon seeing a glass; whatever logic is based on, let us call that knowledge.
03:04:27 <uorygl> s/logic is based on/the premises of logical reasoning are/
03:04:41 <augur> but that reason is merely 500 million years of evolution hard-coding some very sophisticated philosophical argumentation
03:04:51 <augur> so seeing a glass is NOT shallow at all
03:05:12 <augur> its just that the depth is masked by the fact that conscious experience is a very recent thing that has very LITTLE access to cognitive processes
03:05:24 <uorygl> It's shallow in that it's a very familiar and intuitive sort of thing.
03:05:38 <augur> but familiar and intuitive is irrelevant
03:05:48 <augur> when we're doing semantics, we're not asking what is intuitive
03:05:54 <augur> we're asking what is real
03:05:58 <augur> or accurate, at least.
03:06:22 <augur> science has revealed again and again that the intuitive understanding of the world is usually wrong
03:06:23 <uorygl> I guess it depends on whose perspective you're speaking of. From a human's perspective, it's a shallow truth. From a computer's perspective, it's presumably a pretty deep one.
03:06:44 <augur> motion is not newtonian, there is no such thing as simultaneous, and you are not the best authority on the contents of your own mindbrainthing
03:06:54 <augur> sure, its a shallow truth
03:06:57 <augur> that means nothing
03:07:05 <augur> because were not ASKING what the shallow truth is
03:07:16 <augur> when you ask a person what does "fido is a dog" mean, they just repeat the sentence back at you
03:07:22 <augur> "why, it means that fido is a dog! duh!"
03:07:33 <augur> but thats not a meaning
03:07:43 <augur> thats not what your mental representation of the meaning is
03:08:00 <augur> because "fido is a dog" is a string of letters/sounds!
03:08:22 <augur> and its almost certainly not the case that our representations of meaning are in terms of abstract syntax trees
03:08:56 <augur> because then bilingual speakers would never be able to say that "rex is a cat" and "rex est un chat" mean the same thing
03:08:57 <uorygl> But our deep knowledge is based on our shallow knowledge and it exists for the purpose of predicting what our next shallow knowledge will be.
03:09:08 <augur> because french and english probably dont have the same abstract syntax in that regard
03:09:13 <augur> yes but guess what
03:09:25 <augur> our shallow knowledge of LANGUAGE is buttressed by an enormously complex linguistic faculty
03:09:50 <augur> what reason do you have to believe that when you learn the word "fido" that you're just learning what it points to?
03:10:00 <augur> this is NOT part of your shallow experience
03:10:09 <augur> because now we're in the realm of cognitive science/psychology/whatever
03:10:22 <augur> your shallow experience of the world does not include the details of name acquisition
03:10:33 <augur> it includes the experience of seeing someone point to an animal and go "fido"
03:10:47 <augur> "fido want a treat?"
03:10:51 <augur> "thats a good fido"
03:11:26 <augur> (oops, sorry, i just used a quantifier in your primary linguistic data! D: you might get the impression that fido is really a predicate!)
03:11:45 <uorygl> Hmm, I think we're talking about the deep truth again.
03:11:59 <augur> because you're saying that the "Shallow" meaning of "fido" is the dog itself
03:12:02 <augur> but how do you know this?
03:12:09 <augur> theres nothing in your experience of the world that shows this
03:12:25 <augur> you have to make an ASSUMPTION about how people learn the meaning of words
03:12:52 <augur> your ASSUMPTION is that "fido" is by default going to be mapped to the dog in question, not to some abstract "fidoness" quality
03:12:59 <augur> but thats the whole thing you're ARGUING!
03:13:00 <uorygl> We're talking about things that are themselves deep truth.
03:13:19 <uorygl> The shallow truth of semantics is whatever we can observe and feel about it.
03:13:27 <augur> but you CANT observe semantics
03:13:41 <augur> and you cant FEEL everything either
03:13:44 <augur> like i said earlier
03:13:51 <augur> you _FEEL_ that when you typed "John"
03:13:52 <uorygl> If semantics has no observable or feelable consequences, there's no reason to discuss it.
03:13:56 <augur> that this whole thing referred to someone
03:14:02 <augur> well, sure, in a way it DID refer to someone in particular
03:14:14 <augur> but that doesnt mean that "John" those four letters are really all there is to that phrase
03:14:22 <augur> nor that that person is all there is to that meaning
03:14:37 <uorygl> Great. So I can see some shallow truth there.
03:15:07 <augur> all you can see is the fact that the string of letters "John" seems to refer to a particular person.
03:15:14 <augur> that doesnt tell you whether or not thats ALL it does
03:15:27 <uorygl> Right. The shallow truth is that "John" probably refers to a particular person.
03:15:51 <augur> and "John" can be a deictic
03:15:55 <augur> but its STILL a quantifier as well
03:16:06 <augur> because its a quantifier with has a deictic element
03:16:10 <augur> or, more accurately
03:16:22 <augur> its a predicate that combines with a quantifier that has a deictic element inside it
03:16:38 <augur> suppose that J was _that particular John_
03:16:54 <augur> the current best guess is that "John" in "John is eating" has the meaning roughly like
03:17:33 <augur> \p -> Ex[x = J & calledJohn(x) & p(x)]
03:17:47 <augur> you've GOT the deicticy thing in there
03:18:04 <augur> you're referring directly to the person, in that there IS the use of J in there
03:18:18 <augur> but that does NOT mean that the whole thing isnt a quantifier, ok
03:18:55 <augur> infact, the root word "John" (not the phrase "John" in "John is eating") is NOT the thing thats contributing the referentiality
03:19:00 <augur> its the quantifier!
03:19:21 <augur> \q, p -> Ex[x = J & p(x) & q(x)]
03:19:47 <augur> and this quantifier in english (but not in greek maybe?) happens to be phonologically empty
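A hedged Haskell sketch of that analysis, with a toy domain of entities; the names, the toy situation, and the finite domain are assumptions added here. The point it illustrates: the root word contributes only the predicate calledJohn, the covert quantifier contributes the particular individual J, and the whole DP is still of type (E -> T) -> T.

    data Entity = J | M | Fido deriving (Eq, Enum, Bounded, Show)
    type T = Bool

    domain :: [Entity]
    domain = [minBound .. maxBound]

    calledJohn :: Entity -> T         -- the root word "John"
    calledJohn x = x == J

    -- the covert deictic quantifier: \q.\p.Ex[x = J & q(x) & p(x)]
    nomJ :: (Entity -> T) -> (Entity -> T) -> T
    nomJ q p = any (\x -> x == J && q x && p x) domain

    isEating :: Entity -> T           -- toy situation
    isEating x = x `elem` [J, Fido]

    johnIsEating :: T                 -- "John is eating" ==> True here
    johnIsEating = nomJ calledJohn isEating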
03:20:02 <uorygl> So, you mentioned "the current best guess". What's the question we're guessing the answer to?
03:20:15 <augur> what the meaning of "John" looks like
03:20:29 <augur> is it the person john, or some quantifier, or whatever
03:20:32 <augur> thats the question
03:21:02 <augur> the assumption that its just a reference to some person in particular just _doesnt work_ with the rest of the meanings of sentences
03:21:26 <augur> now we can waffle about the meaning of "meaning"
03:21:26 <uorygl> So, "What does the meaning of 'John' look like?" Now, I can't look at a meaning and see what it looks like, so what's the shallow truth we're trying to figure out by answering that question?
03:21:35 <augur> and say that meaning is the intention behind the expression
03:21:39 <augur> but thats way too loose
03:21:52 <augur> because then the meaning of an expression is whatever the person intends it to be, regardless of the conventions of the language
03:22:08 <augur> "John is eating" means "Mary danced" because i INTENDED it to
03:22:12 <augur> well no, sorry, it doesnt
03:22:28 <augur> semantics is NOT a shallow truth
03:22:48 <augur> which is why you cant just tell me that John refers to the person because thats the shallow truth of it
03:22:53 <uorygl> I find that instead of waffling about meanings, it's better to say what we'd like to say without using the word at all.
03:22:57 <augur> because you're saying the meaning of John is a reference to a person
03:23:08 <augur> but the meaning of expressions is not a shallow truth!
03:23:32 <augur> when you can figure out what you're trying to say without using the word meaning, be my guest :P
03:24:01 <augur> or without using any words that covertly incorporate notions of meaning
03:24:06 <uorygl> What should we conclude upon hearing "John is eating"?
03:24:21 <uorygl> What do people want when they say "John is eating"?
03:24:54 <uorygl> Combine those: What do people want us to conclude when they say "John is eating"?
03:25:17 <augur> we should conclude that the speaker believes there is a person that the speaker believes is called john whom the speaker knows, and assumes the hearer knows as well, and that this person is, at this very moment, engaged in a process of eating something or other which the speaker doesnt know the identity of, or which the speaker does not feel the identity of to be relevant
03:25:44 <augur> we should also conclude that the speaker believes that this is somehow relevant to the conversation were having and that i will be able to integrate this into my understanding of the situation
03:25:46 <uorygl> Indeed, there's a whole bunch we should conclude.
03:25:58 <augur> except that is not the meaning of the sentence "John is eating"
03:26:18 <augur> because when someone says "John is eating" they are not conveying to you "I believe you know who John is"
03:26:25 <augur> because if they were
03:27:03 <augur> well, there are all sorts of because
03:27:20 <augur> if I said that Frank was wrong when he said that "John is eating"
03:27:55 <augur> this is definitely not going to be the case if Frank didn't actually think I knew who John was
03:28:18 <augur> but all of this hinges on the WORD "meaning"
03:28:31 <augur> i mean, we can just ask what we should conclude
03:28:52 <augur> but that leads to a whole world of shit beyond just LANGUAGE
03:29:01 <augur> it leads to psychology of action
03:29:06 <augur> inferring why people do things
03:29:14 <augur> and that can lead to inferences about the physical world
03:29:39 <augur> because "john died after jumping off the bridge" leads us to conclude that the speaker believes something about jumping off of bridges and death
03:29:47 <augur> and that yadda yadda
03:30:02 <uorygl> No shallow truth can hinge upon the definition of a word. Unless, of course, said truth is about the word itself.
03:30:23 <augur> what we can or should conclude from a sentence is not the meaning of the sentence, its the meaning of the sentence plus a whole shittone of other crap thats brought into the task of understanding
03:30:35 <augur> ah but thats the thing see
03:30:43 <augur> you cant even ask what the shallow truth of blah blah blah is
03:30:50 <augur> because there is no shallow truth of "John"
03:30:59 <augur> truth is a property of claims
03:31:07 <augur> and "John" is not a proposition
03:31:18 <uorygl> Let's call the "real" definition of meaning meaning1, and my definition of meaning meaning2.
03:31:28 <augur> lets not use the word meaning at all
03:31:35 <uorygl> So the meaning1 of "John is eating" does not include the fact that I know who John is, but the meaning2 does.
03:31:41 <augur> lets do what semanticists do and say this:
03:31:55 <uorygl> Why do we care about the meaning1?
03:32:20 <Gracenotes> watching Maru yawn in slow motion is perhaps the most satisfying thing I've done today
03:32:20 <augur> if you know the "meaning" of a sentence like "John is eating", you know what is required of the world for it to be true or not
03:32:35 <augur> you also know some other stuff, but at the very least you know what is required for it to be "literally true"
03:33:17 <augur> semantics is the study of the literal content of expressions
03:33:23 <uorygl> Well, let me take a different route.
03:33:31 <augur> and, it turns out, in order to understand the WHOLE of the "meaning" of "John is eating"
03:33:36 <augur> that is, to include your meaning1 as well
03:33:39 <uorygl> I think you would agree that "Oh no!" doesn't have a truth value.
03:33:45 <uorygl> How do you know that it doesn't have a truth value?
03:33:46 <augur> you have to know the literal content
03:33:56 <uorygl> And why do you care that it doesn't have a truth value?
03:34:06 <augur> it doesnt have a truth value because you cant say "No, you're wrong." when someone says "Oh no!"
03:34:20 <Gracenotes> since I've sort of been passively wondering what the scope of it all is, how about, say, "John is ugly"?
03:34:27 <augur> the question is whether or not "Oh no!" is language.
03:34:37 <augur> its a bunch of sounds you made with your mouth
03:34:41 <augur> but so is a sneeze
03:34:44 <Gracenotes> or instead of ugly, some other thing that is 100% an opinion of the speaker
03:35:02 <augur> now we might actually accept that its language, sure
03:35:13 <augur> lets say its language, ok
03:35:28 <uorygl> So we feel like there's a notion of contradictability.
03:35:43 <augur> i didnt say that ALL linguistic expressions have truth values
03:35:53 <augur> i said that expressions like "John is dancing" have truth values
03:35:53 <uorygl> When somebody says "You're wrong!" in response to "Oh no!", that makes us feel a specific way.
03:36:13 <augur> oh oh sorry i see what you mean
03:36:28 <augur> you cant say "yes thats true" to someone's "oh no"
03:36:35 <augur> nor can you say "no thats false"
03:36:42 <augur> (gracenotes: what about it)
03:37:00 <augur> oh, in-situ questions. what about it?
03:37:13 <Gracenotes> augur: it's not something that is provably true or false, yes, what we learned about in grade school as truth vs. opinions
03:37:46 <augur> "John is ugly" is an assertion
03:37:51 <Gracenotes> well, that is verifiable, although semantics must still make meaning about what the claim is
03:37:55 <augur> and, in some very real sense, it can be considered true or false
03:38:10 <augur> "John is ugly?" with a question mark is a question
03:38:15 <augur> and so doesnt have a truth value
03:38:56 <uorygl> If I know a certain thing, and I hear the sentence "John is eating", I will agree.
03:39:09 <augur> this is the kind of meaning that semantics is concerned with
03:39:17 <augur> the literal sorts of meaning
03:39:20 <uorygl> Yes, I think we've come up with a definition of meaning.
03:39:29 <augur> not what you can infer from the expression
03:39:37 <augur> just what the literal meaning is
03:39:46 <augur> the literal content, if you will
03:39:52 <Gracenotes> just know that, when you hear him used in example questions
03:39:58 <augur> there is a whole field of exploring the non-literal meanings behind things
03:40:02 <augur> part of that is pragmatics
03:40:06 <augur> part of that is psychology
03:40:07 <uorygl> I think we can confuse the issue a bit, though.
03:40:47 <augur> actually we cant. there are very clear lines that can be drawn between literal meaning, pragmatics, and psychology.
03:40:56 <uorygl> Suppose that I and a friend are walking along, and we both see something disgusting. My friend says, "Eew." Am I agreeing?
03:41:02 <augur> and the problematic aspects are all at the line between pragmatics and psychology.
03:41:23 <uorygl> What if my friend says "Disgusting!"?
03:41:28 <augur> that depends on what you mean by "agree"!
03:42:00 <uorygl> Well, this definition of "meaning" that I came up with hinges on what "agreement" is.
03:42:10 <augur> there are at least two words in english that are written "agree"
03:42:13 <augur> or two senses of the same word
03:42:20 <augur> one is "share an opinion"
03:42:54 <augur> "Alexis expressed disgust at the food. I agreed inside, but didn't say anything."
03:43:09 -!- FireFly has quit ("Leaving").
03:43:12 <Gracenotes> augur: if you work in semantics, you might have to explain this depressingly often :| but better to introduce non-linguists, I suppose.
03:43:25 <augur> the other can mean "express concurrence on a matter"
03:43:34 <augur> "Alexis didn't agree to the terms of the deal"
03:43:53 <augur> "even if inside Alexis really liked the deal, her bosses insisted that she not agree to it"
03:44:46 <augur> Gracenotes: SO often. you have no idea how many times, when working on an extraordinarily complicated issue like what the meaning of "Fido is a dog" is, someone will say "Fido is a dog just means Fido is a dog! It's simple!"
03:44:50 <augur> well no, its not simple, sorry
03:44:55 <augur> you're just DESIGNED TO UNDERSTAND IT
03:45:08 <augur> if it were truly simple, my calculator could understand what it means
03:45:10 <Gracenotes> you must have secret semanticist rituals
03:45:14 <augur> thats the true test
03:45:24 <augur> simplicity is demonstrable: program it.
03:45:36 <augur> thats how we know that calculus, for instance, is mindnumbingly simple, when it comes down to it
03:46:01 <augur> all the calculus youve ever done in your life to date could be done in a computer program in a few minutes or hours
03:46:16 <augur> and a computer isnt anywhere near as smart as a person
03:46:31 <augur> well if calculus is so easy that a computer can do it, it must be easy
03:46:36 <augur> regardless of how difficult humans find it
03:46:37 <uorygl> Is it okay if I interpret "program" as meaning "express in a computer language", where a "computer language" is any language that expresses details that a computer deals with intimately?
03:46:58 <Gracenotes> so, has anyone bothered making a language --(parse)--> syntax --(interpret)--> semantics tool, at some point?
03:47:05 <augur> lets say that "program" means "express in some form of logic"
03:47:20 <augur> since programming languages are just logics.
03:47:31 <Gracenotes> I'd imagine there must be. But fragments of it, which I'm sure is not perfect, otherwise it's what we'd all be using
03:47:34 <uorygl> Well, that's kind of ambiguous, isn't it? Straightforward English is arguably a form of logic.
03:47:37 <augur> but they tend to suck.
03:47:46 <augur> english is far from it
03:47:51 <uorygl> Euclid wrote in straightforward English, for strange values of "English".
03:48:03 <augur> Euclid wrote in straightforward Greek ;)
03:48:17 <augur> what i should have said is
03:48:28 <augur> where a Logic is a well defined formal system
03:48:34 <uorygl> "If English was good enough for Jesus, it's good enough for America!" --a straw man
03:48:38 <augur> like first order predicate calculus or whatever
03:48:50 <augur> but sure, programming language
03:49:16 <augur> its just easier to express certain things in a language like PC2 or HPC or whatever
03:49:43 <augur> but thats what i was doing earlier!
03:49:46 <augur> and you complained!
03:50:04 <augur> if you try to give a good, well defined meaning for "John"
03:50:13 <augur> and your well defined meaning is a referential
03:50:33 <augur> (that is, a symbol referring to, or behaving as if it were, the person)
03:50:46 <augur> (e.g. your logic says the meaning of "John" is J)
03:51:03 <augur> then you will, inevitably, conclude that there must be an infinite number of distinct words "John"
03:51:33 <augur> "John_1" refers to John Lennon, "John_2" refers to JFK, "John_3" refers to the pizza delivery boy, ...
03:51:40 <augur> for all of the John's that you know
03:51:50 <uorygl> Why would those have to be distinct words?
03:52:05 <uorygl> It would make a lot of sense, I think, for them to be distinct senses of a single word.
03:52:07 <augur> well explain how you'd distinguish which is meant!
03:52:21 <augur> tell me how a single word "John" could be used to refer to INDIVIDUAL people
03:52:27 <augur> when there are multiple Johns
03:52:32 <uorygl> I dunno, the same way that we distinguish between different senses of a word like "set"?
03:52:52 <augur> ah but there are probably multiple words "set"!
03:53:01 <augur> just because they sound the same doesnt mean theyre the same word, keep in mind
03:53:02 <uorygl> Why would we call those multiple words?
03:53:16 <augur> "set" like a movie set
03:53:24 <augur> "set" like what you do to a variable
03:53:30 <augur> probably different words
03:53:34 <uorygl> But I like to define words by the conjunction of spelling, pronunciation, and etymology.
03:53:35 <augur> now maybe you have a smaller kind of difference
03:53:38 <augur> "mostly the same but not quite"
03:53:55 <augur> ok, so now "John" is a single word with multiple subtly different senses
03:54:03 <uorygl> If they have the same spelling, they're pronounced the same way, and their reason for being a word is the same, they're the same word.
03:54:09 <augur> now tell me what the representation in your logic is.
03:54:19 <augur> well yes, words are like
03:54:27 <augur> words are pairs, at least, of sound and meaning
03:54:34 <augur> what saussure called a Sign
03:54:50 <augur> theres probably a bit more to it than that
03:55:11 <uorygl> Hmm, what you're doing is a lot like asking me for the simplest explanation of a phenomenon.
03:55:37 <augur> you seemed to be up to the challenge of giving me pseudocode for this
03:55:48 <augur> if "John" is one word with multiple senses
03:55:52 <augur> give me pseudocode.
03:56:15 <augur> give me pseudocode for "John is dancing"
03:56:29 <augur> and ignore all the extraneous shit like what the "is" and the "ing" and the tense and shit are doing
03:56:45 <augur> i dont care about that (unless its relevant to your explanation of the pseudocode for "John")
03:57:04 <augur> just give me, right now, the pseudocode.
03:57:42 <uorygl> A person is dancing. That person's name is John. Nothing else I could be referring to is named John. Therefore, I say "John is dancing".
03:59:43 <uorygl> Did I explain the phenomenon?
04:01:01 <augur> not in the way you'd accept it
04:01:06 <augur> because you turned "John" into a predicate!
04:01:11 <augur> "That person's name is John"
04:01:23 <uorygl> Did I attempt to explain the right phenomenon?
04:01:23 <augur> but thats what _I_ said
04:01:53 <augur> just give me the meaning in your pseudocode
04:02:11 -!- Pthing has joined.
04:02:29 <augur> were talking linguistics
04:02:36 <augur> you might not want to stay D:
04:03:13 <Pthing> is the perfect relish to any and all conversations in which augur plays a vital part
04:03:27 <augur> your mother is the perfect relish?
04:04:36 <uorygl> A person is dancing. "John" evokes that person. "John" does not evoke anything else. Q.E.D.
04:04:57 <uorygl> A person is dancing. "John" evokes that person. "John" does not evoke anything else. I want to use a word that evokes only that person. Q.E.D.
04:05:12 <augur> this is your pseudocode?
04:05:36 <augur> but it does not do what you said!
04:05:48 <augur> or, rather, it does not do what you wanted it to do
04:06:19 <uorygl> Given the statement I just made, one can conclude that I will use the word "John" to refer to the dancing person.
04:06:29 <augur> in that, the word "John" here surely maps to "'John' evokes that person. 'John' does not evoke anything else."
04:06:43 <augur> and minimally also,
04:07:07 <augur> that is to say, the word john maybe denotes a particular person
04:07:50 <augur> but the majority of the words contribution, infact, ALL of the contribution of the words distinguishing feature (being written 'John' instead of 'Mary') is NOT the particular person
04:08:13 <augur> '"John" evokes that person. "John" does not evoke anything else. I want to use a word that evokes only that person.'
04:08:28 <augur> you see what i mean?
04:08:50 <augur> hell, even 'a person' doesnt do what you want unless we interpret it to be some atomic element that means that person
04:08:56 <augur> the whole thing is nothing but a giant quantifier!
04:09:25 <augur> "A person that is called John who is the only person in this context that could intend... is dancing"
04:09:33 <augur> well, thats pretty much what i said earlier. :)
04:09:36 <uorygl> How about we ask a different question, like...
04:10:28 <uorygl> How could an intelligent computer represent the sentence "John is dancing"?
04:10:41 <augur> i dont care how you do it
04:10:44 <augur> just give me pseudocode.
04:12:27 <uorygl> IsDancing(JohnThePizzaDeliveryGuy)
04:12:35 <uorygl> (Look, suggestively-named Lisp tokens!)
04:12:47 <augur> and thats the literal meaning of "John"?
04:13:36 <uorygl> Well, presumably, this computer knows some things that are related to each other, and has chosen to label the relation JohnThePizzaDeliveryGuy.
04:14:04 <augur> so then this computer will always-and-forever interpret "John" as JohnThePizzaDeliveryGuy
04:14:30 <uorygl> Well, there's further reason that the computer interprets "John" as this token.
04:14:41 <augur> ah, well then, you havent given me the literal meaning!
04:14:59 <augur> youve given me a meaning that has some interpretation of meaning already in it
04:15:20 <uorygl> I feel like this has turned into a discussion of rationality, which makes me want to take it to #lesswrong, our rationality channel.
04:15:41 <augur> no, if we did that, thered be no convo here
04:15:58 <augur> since it was quiet until gracenotes and i moved the convo here from proggit
04:16:19 <augur> well sure but we werent talking about esolangs then :D
04:16:29 <augur> we were talking aboud proggitty things
04:16:36 <augur> i mean, maybe #lesswrong
04:16:52 <Gracenotes> if you have a name for your PL, you get a *free* freenode channel for it!
04:17:16 <augur> hence #antigravity! :D
04:18:16 <augur> but i dont go there anymore since i was bored of it
04:26:39 <Gracenotes> now, I'm going to get one of the delicious melted mint chocolate kiss sugar cookies my mom made
04:47:33 -!- soupdragon has joined.
05:06:18 -!- adu has joined.
05:19:11 -!- ehirdiphone has joined.
05:19:26 <ehirdiphone> http://journal.stuffwithstuff.com/2009/12/26/conditional-binding-with-let-in-magpie/ Interesting control structure.
05:20:30 <augur> i should write a similar tutorial in ruby
05:20:53 <augur> if value = "1234".to_i then ... else puts "Couldn't parse string." end
05:22:28 <soupdragon> hate to be a detractor but afaict it's like a bad version of do notation?
05:22:45 <augur> everything is a bad version of do notation
05:23:37 <augur> oh, well, actually ruby wont do it correctly because ruby defaults to 0 if the string starts with a letter, but...
05:29:16 <augur> ehirdiphone: did you hear about my awesome idea for an esolang?
05:32:35 <ehirdiphone> augur: Your snippet was nothing like the actual construct.
05:33:14 <soupdragon> ehird am I missing something or were you being sarcastic?
05:33:42 <ehirdiphone> soupdragon: Just because a concept is reducible to another concept does not mean it is inferior.
05:33:59 <augur> ehirdiphone: what are you talking about
05:34:02 <soupdragon> I agree and I didn't mean to suggest that
05:34:03 <ehirdiphone> He acks the prior art at the end of the post.
05:34:55 <augur> I don't know what you're talking about ehird.
05:51:17 <augur> did you see my ideas for an esolang, ehird?
05:51:54 <soupdragon> I still don't get what's interesting about the language construct...
05:52:28 <augur> i agree, its nothing amazing
05:52:35 <augur> but its useful i suppose?
05:52:48 <augur> not more useful or less useful than existing constructs, say..
05:58:24 -!- ehirdiphone has quit ("Get Colloquy for iPhone! http://mobile.colloquy.info").
06:06:42 -!- augur has quit (Read error: 104 (Connection reset by peer)).
06:06:53 -!- augur has joined.
06:18:27 -!- bsmntbombdood_ has quit ("Leaving").
06:19:47 <uorygl> That conditional binding thing sounds exactly like Haskell's 'maybe', which is a defined function, not a language feature.
06:20:42 <uorygl> Oh, the author said so.
06:21:09 <soupdragon> uorygl, yeah doesn't seem to be anything interesting about it not sure what ehird was meaning
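A minimal sketch of the conditional-binding pattern under discussion, in plain Haskell: parse failure becomes Nothing (rather than Ruby's 0) and the success case binds the value. readMaybe is the standard parser; describe and describe' are made-up names for the example.

    import Text.Read (readMaybe)

    -- Bind the parsed value only if parsing succeeded.
    describe :: String -> String
    describe s = case readMaybe s :: Maybe Int of
      Just value -> "Parsed: " ++ show value
      Nothing    -> "Couldn't parse string."

    -- The same thing with the library function 'maybe' that uorygl mentions.
    describe' :: String -> String
    describe' s = maybe "Couldn't parse string." (("Parsed: " ++) . show)
                        (readMaybe s :: Maybe Int)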
06:28:45 <uorygl> I'm not sleepy any more? >.>]
06:28:51 <augur> uorygl: i think i just made a translation routine from the semantics of functional language to prolog to my esolang
06:34:01 <augur> first [1] -> first 1:[] -> first (List 1 []) -> first(s,l,w) && list(l,1,[]) -> first(s) && P0(s,l) && P1(s,w) && list(l) && P0(l,1) && P1(l,[])
06:34:22 -!- calamari has quit ("Leaving").
06:35:26 <augur> Ww[Es[first(s) && P1(s,w) && El:list(l)[P0(s,l) && P0(l,1) && P1(l,[])]]]
06:36:14 <augur> and you'd define, in your code, Ww[Es[first(s) && P1(s,w) && El:list(l)[P0(s,1) && P0(l,x)]] = x
06:36:38 <augur> and thats how you define functions :X
06:37:41 <augur> that should really be
06:37:58 <augur> Ww[Es:FIRST[P1(s,w) && El:LIST[P0(s,1) && P0(l,x)]] = x
06:41:30 <augur> the translation should actually be Ww[Es:FIRST[El:LIST[P0(s,l) && P1(l,1) && P0(l,[])] && P1(s,w)]] sorry
07:19:24 -!- coppro has quit (Read error: 54 (Connection reset by peer)).
07:37:54 <augur> an operator that essentially says "for which w is it true that ..."
07:38:15 <augur> Wn[1 + 1 = n] => 2
07:38:38 <soupdragon> ah I get it, but what about Wn[n*n = 4], multivalued?
07:47:27 <soupdragon> you know curry has a backend that compiles to prolog?
07:48:12 -!- MizardX has joined.
07:48:35 -!- coppro has joined.
07:49:27 -!- Gracenotes has quit (Read error: 104 (Connection reset by peer)).
07:51:05 -!- Gracenotes has joined.
07:59:59 -!- clog has quit (ended).
08:00:00 -!- clog has joined.
08:05:37 <soupdragon> wonder what structures you can decide W for..?
08:06:04 <soupdragon> probably only really trivial stuff like finite sets
08:13:14 <augur> soupdragon: yes, Wn[n*n = 4] would be multiply valued
08:13:25 <augur> infact, it would be like asking that question in prolog
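A rough sketch (an interpretation, not augur's actual design) of the W operator read as a search over the naturals; wAll and wFirst are invented names.

    -- wAll returns every witness in increasing order, wFirst the least one.
    -- Both assume the predicate is decidable; wFirst also assumes a witness exists.
    wAll :: (Integer -> Bool) -> [Integer]
    wAll p = [ n | n <- [0..], p n ]

    wFirst :: (Integer -> Bool) -> Integer
    wFirst = head . wAll

    -- wFirst (\n -> 1 + 1 == n)        ==>  2
    -- take 1 (wAll (\n -> n * n == 4)) ==>  [2]   (over the integers the answer
    --                                              would be multivalued: -2 as well)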
08:24:16 <soupdragon> W with + and * is TC isn't it? you can define 0 = Wn[n+n=n] but I can't see how to define 1 without (.. & ~n = 0) or using A
08:33:20 <soupdragon> that's so irritating that you can't define 1...
08:36:54 <augur> you wouldnt define shit like that
08:37:07 <augur> you would just start by defining succ and shit
08:38:02 <augur> Wn[succ {} == n] = {{}}
08:38:56 <augur> Am[Wn[succ m == n] = {m}]
08:40:24 <augur> actually, this is not a proper quantifier
08:40:29 <augur> i shouldnt even say it like that
08:40:34 <augur> its a quantifier over rules
08:40:39 <augur> so i should just use variables
08:40:52 <augur> Wn[succ M == n] == {M}
08:42:29 <soupdragon> you can immediately solve all diophantine equations
08:43:14 <augur> its just quantified predicate calculus
08:43:33 <augur> using what is essentially a prolog base interpreter
08:43:44 <augur> as the algo to interpret it
08:44:02 <augur> logics arent "turing complete", algorithms that calculate logics are
08:44:35 <augur> the existential and universal quantifiers dont quantify over actual things, they're actually sort of more like assertions
08:45:18 <augur> when you say Ex[F(x)] in this language, you're essentially specifying an axiom in a logic
08:45:51 <augur> where :- is the logic turnstile operator
08:46:04 <augur> which says, "anything can prove that Ex[F(x)]"
08:46:42 <augur> which means you can use it whenever you want and its guaranteed to be true.
08:46:43 <soupdragon> I was thinking about it as a (paradoxically strong) language for computing sets
08:47:13 <augur> so really, in this "logic", using Ex[F(x)] in a truth conditional statement is always acceptable
08:47:38 <augur> for instance, the expression [1] is equivalent to El[list(l) && p0(l,1) && p1(l,[])]
08:47:59 <soupdragon> I was going to ask, is it like twelf? :p
08:48:24 <augur> this is me attempting to design a proglang that has the compositional semantics system of natural language.
08:49:16 <augur> except in natural language, Ex[F(x)] is a normal truth-functional quantifier, not an assertion.
08:49:32 <soupdragon> I'm not totally sure what that means but I think my book gets into this later on
08:49:50 <augur> what it means is that ive got some crazy shit in this language im designing :D
08:50:36 <augur> its got a transformational grammar
08:51:01 -!- Asztal has joined.
08:51:12 <augur> so that, to the extent that you can define generic functions, they will, almost inevitably, result in a rearrangement of the code
08:52:49 <augur> so that if you type lets say "1 == this 1", where "this 1" is the function \p.p(1)
08:53:02 <augur> == doesnt take functions as its right hand argument
08:53:21 <augur> but the right hand argument does take Int -> Bool's
08:53:40 <augur> so you shift "this 1" to the front and turn the rest into a big lambda:
08:53:59 <augur> this 1 (\x.1 == x)
08:54:19 <augur> and infact both of these are valid in the language
08:54:43 <augur> infact, even "this 1 (1 ==) is valid in the language, in a kind of haskellish fashion
08:54:56 <augur> but the means of generating it are more interesting :P
08:55:58 <soupdragon> I don't really see what's going here: What is 'this' and is 1 just an arbitrary symbol
08:56:22 <augur> this is the function \x.\p.p(x)
09:07:51 -!- poiuy_qwert has quit ("This computer has gone to sleep").
09:12:09 -!- neuDialect has joined.
09:15:24 -!- neuDialect has left (?).
09:26:07 <augur> a derivative of haskell with one very simple extra piece of functionality
09:26:49 <augur> which is equivalent to (f . g . ...) h
09:27:02 <augur> and where (f . g . ...) :: a -> b
09:27:09 <augur> and h is of type (a -> b) -> c
09:27:29 <uorygl> Are you sure you didn't mix those types up?
09:27:31 <augur> you can turn the expression into h (f . g . ...)
09:27:49 <augur> in a type mismatch situation
09:27:51 <augur> where the argument
09:28:03 <augur> looks like it should be the function for some higher chain of args
09:28:18 <augur> where the chain looks like it should be the argument of the actual argument
09:28:23 <augur> you can reverse them
09:28:25 <augur> in the simplest case
09:28:46 <uorygl> Huh. That looks difficult to implement and not necessarily useful.
09:28:48 <augur> f g where f :: a -> b, g :: (a -> b) -> c
09:28:54 <augur> this is a type mismatch
09:28:58 <augur> but it wouldnt be if it were g f
09:29:27 <augur> thus you can do shit like, say
09:30:06 <augur> "a" ++ "b" ++ "c" == which ["foo", "abc", "bar"]
09:30:51 <augur> actually the simplest case is all you need to solve since the rest just pops out of it by iterated Move
09:31:16 <augur> which, i guess, is something like indexOf or whatever
09:32:07 <augur> or it functions like it, but is a wh word for ease of understanding
09:34:22 -!- jpc has quit (Read error: 60 (Operation timed out)).
09:42:13 <augur> my example was bad, because ++ is left associative and so ruins things
09:43:11 <augur> which ["foo", "abc", "bar"] == "abc" is just the same
09:44:36 -!- ais523 has joined.
09:45:22 <augur> i wanted to give an example where the shit before "which" couldnt be used wrapped in ()'s and have it work
09:45:54 <ais523> hi; which lang is this?
09:45:57 <augur> which ["foo"] == "f" ++ "b" ++ "c" i think exemplefies this
09:46:01 <augur> ais523: MoveHaskell!
09:46:22 <augur> slight mod to haskell.
09:46:27 <augur> incredibly minor, infact.
09:46:42 <augur> type mismatches of a certain sort are resolvable
09:47:14 <augur> so in this example
09:47:30 <augur> which ["foo"] :: String -> Int
09:47:50 <augur> but it cant find that as its next arg
09:48:19 <augur> so the interpreter replaces which ["foo"] with a temporary unbound variable x :: String
09:49:32 <augur> actually sorry, which ["foo"] should really be more like (String -> Bool) -> Int
09:49:54 <augur> if it turns out that (x == ...) :: (String -> Bool) -> Int
09:50:16 <augur> the interp lambda binds x and feeds the result in as the argument to which ["foo"]
09:52:42 <ais523> (string -> bool) -> int is a weird typesig
09:53:12 <augur> the type sig is really
09:53:29 -!- FireFly has joined.
09:53:29 <augur> which :: forall a. [a] -> (a -> Bool) -> Int
09:53:42 <augur> its essentially just firstIndexOf
09:53:55 <augur> except with a test function instead of a value
09:54:26 <Deewiant> Data.List.findIndex :: (a -> Bool) -> [a] -> Maybe Int
09:55:08 <augur> ok so which = findIndex
09:55:24 <augur> findIndex ["foo", "bar", "baz"] == "foo"
09:55:34 <augur> but thats ugly, hence the rename to which :P
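A sketch of which in ordinary Haskell, using the Data.List.findIndex that Deewiant points to, with the arguments flipped into the order augur writes them; unlike augur's rename it returns Maybe Int rather than Int.

    import Data.List (findIndex)

    -- The list first, then the test, as in augur's examples.
    which :: [a] -> (a -> Bool) -> Maybe Int
    which = flip findIndex

    -- which ["foo", "abc", "bar"] (== "abc")  ==>  Just 1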
09:58:01 <augur> its cool tho, because you can do craaazy shit
09:58:21 <augur> basically, anything where you'd normally do
09:58:48 <augur> you can turn that into ...f...
09:59:06 <augur> assuming theres only one occurrence of x in ...
09:59:12 <augur> dunno about multiple occurrences
09:59:20 <augur> maybe thered be another rule for that :p
09:59:28 <augur> or a more general rule
10:00:03 <augur> f \x -> ...x...x...x... -> ...f... ... ...
10:00:11 <soupdragon> I guess that's hinting toward some of cc shan's notes on linguistics
10:01:02 <augur> so you could end up doing: which xs f && g && h
10:01:40 <augur> which [1,2,3,4] > 1 && < 5 && odd
10:02:51 <augur> because there are missing vars in...
10:03:06 <augur> which _ [1,2,3,4] _ > 1 && _ < 5 && odd _
10:03:30 <augur> so it goes through building up \x -> x > 1 && x < 5 && odd x
10:04:59 <augur> it also builds up \f -> which f [1,2,3,4]
10:05:53 <augur> actually, im not sure how it'd do that, so lets pretend its which [1,2,3,4] f not which f [1,2,3,4]
10:06:19 <augur> ive never understood why they put the xs last in all these functions
10:06:32 <augur> map f xs, foldr f z xs
10:07:10 <augur> i guess maybe you have more curried (map f) and (foldr f z) than you do (map xs) and (foldr xs z)
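For comparison, the same example written without the Move rule, with the lambda that Move would build spelled out by hand (reusing the which sketch above); example is a made-up name.

    example :: Maybe Int
    example = which [1,2,3,4] (\x -> x > 1 && x < 5 && odd x)
    -- ==> Just 2   (the first match is the element 3, at index 2)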
10:23:30 -!- MigoMipo has joined.
10:23:42 <ais523> MigoMipo: missing slash
10:26:33 <AnMaster> I wonder how suspend to ram/disk interacts with cron jobs
10:26:52 <AnMaster> I mean, if some cron job is running right then
10:28:17 <AnMaster> also why is my dns so slow suddenly
10:28:30 <AnMaster> slow = takes several seconds to resolve anything
10:29:28 -!- adam_d_ has joined.
10:29:42 <ais523> AnMaster: it gets suspended and restarted just like any other process
10:30:03 <AnMaster> ais523, what about missed cron jobs?
10:30:27 <ais523> depends on which cron implementation you have
10:30:46 <ais523> I have anacron here, which runs them when it gets round to them
10:30:54 <ais523> as in, late rather than never
10:31:00 <ais523> but it's unusual in that respect
10:31:00 <AnMaster> ais523, anacron needs normal cron to run it iirc
10:31:10 <AnMaster> and can only handle once per day stuff
10:31:48 <AnMaster> oh another cron question: how does cron handle with leap seconds.
10:32:51 <ais523> it would run on :59, surely, if that's what the seconds field said?
10:32:57 <ais523> and :00 if /that's/ what the second's field said?
10:33:10 <ais523> so, no differently from normal
10:34:24 <AnMaster> hm does cron do local time or utc?
10:34:55 <AnMaster> if local time, how does it handle switch to/from daylight saving
10:35:03 * AnMaster forgot which one leaps back an hour
10:35:14 <ais523> heh, all you need is a VM and an insane ntp server
10:35:17 <ais523> and you can find out by experiment
10:36:45 <AnMaster> I'm not sure where to find an insane ntp server ;P
10:37:04 <AnMaster> actually all I need is a VM without time syncing.
10:37:10 <AnMaster> so I can set it manually in there
10:38:43 <ais523> is it wrong that I always try to think up the most insane non-destructive solution to stuff, when I'm in here?
10:38:44 -!- Pthing has quit (Remote closed the connection).
10:38:46 <AnMaster> ais523, btw about that language on the wiki we talked about a few days ago, using two different interpreters. do you remember its name?
10:39:06 <ais523> it isn't really two different interps
10:39:11 <ais523> it's just alternate commands are interpreted with different meanings
10:39:28 <ais523> as in, semantics of a command depend on whether an even or an odd number of commands were executed before it
10:39:36 <AnMaster> my idea is not to prove it tc by implementing something in it, my idea is to prove it not tc by implementing it in a sub-tc language
10:40:11 <ais523> it's one of the standard methods to prove something sub-tc
10:40:16 <AnMaster> now, what would be a good language for it
10:40:46 <ais523> if you want vapourware unreleased languages that I've never told the channel about, you could try ACK
10:41:08 <ais523> it's a vague attempt to make a language that's powerful enough to do pretty much anything sub-TC you'd want to do, but isn't TC
10:41:08 <AnMaster> befunge93 won't do, since I would need to store the source somewhere
10:41:19 <ais523> it clearly has infinite storage, is the issue
10:41:30 <ais523> and sub-TC things with infinite storage are kind-of rare
10:41:47 <ais523> you could try to implement it in Splinter, I suppose
10:42:05 <ais523> AnMaster: not exactly, basically you have to calculate how much storage you're going to use in advance
10:42:12 <ais523> and it's hard to see how to do that for dupdog
10:42:42 <ais523> (splinter's a push-down automaton)
10:43:06 <AnMaster> hm "push-down automaton". what exactly can it do that a FSM can't?
10:43:09 <ais523> PDAs are bad at duplicating unboundedly large values, though
10:43:16 <ais523> AnMaster: matching brackets is the most famous example
10:43:26 <ais523> a PDA can match to any depth of nesting, an FSM has to have some limit
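A minimal sketch of the bracket-matching point: one counter (the essence of the push-down stack for this language) handles any depth, whereas a finite-state machine would need a separate state for every possible depth; balanced is an invented name.

    balanced :: String -> Bool
    balanced = go 0
      where
        go :: Int -> String -> Bool
        go depth []       = depth == 0                      -- everything opened was closed
        go depth ('(':cs) = go (depth + 1) cs
        go depth (')':cs) = depth > 0 && go (depth - 1) cs  -- reject a close with nothing open
        go depth (_:cs)   = go depth cs

    -- balanced "(()())"  ==> True
    -- balanced "(()"     ==> False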
10:49:30 <AnMaster> soupdragon, as a literal string? regex?
10:49:40 * AnMaster can't find any meaning in that line
10:49:48 <augur> interestingly, there is a small class of languages which can match a^n b^n c^n trivially, but cant match certain context free languages
10:50:06 <AnMaster> soupdragon, ah, how would you write that as PCRE?
10:50:35 <soupdragon> AnMaster might take me a few hours to figure that out :p
10:50:57 <AnMaster> soupdragon, I strongly suspects PCRE might be TC
10:51:34 <AnMaster> it has recursion and what not after all
10:56:06 <AnMaster> ais523, can you explain how that PDA proof works?
10:56:25 <AnMaster> it seems to be: N{N{N{}}I}Z{B{B{A{}}A{ZBAI}}A{O{ZBAO}I{ZBAI}N}}ZA\
10:56:51 <ais523> you could try stepping through it in an interpreter
10:57:41 <ais523> the basic idea is: you can compile a program that uses n splinters into a program that uses only a limited number by having a large object holding all of them, and a method to step through it
10:57:53 <ais523> and then store things on the call stack Underload-style
10:59:30 -!- puzzlet has quit (Read error: 60 (Operation timed out)).
10:59:37 -!- puzzlet has joined.
10:59:55 <AnMaster> pda-splinter.pl: Lisp/Scheme program text
11:03:48 <AnMaster> ais523, so what about implementing dupdog in that PDA language and then using the PDA->splinter compiler? would that be easier or harder than splinter directly do you think?
11:13:59 <AnMaster> soupdragon, can't figure what out?
11:14:50 <AnMaster> ais523, meh, I can't think of a way to do it in either the PDA language or splinter directly
11:15:02 <ais523> I suspect dupdog can't be done by a PDA
11:15:05 <AnMaster> basically I guess the issue is understanding splinter
11:15:08 <ais523> although, ofc, that doesn't mean it's TC
11:15:16 <AnMaster> ais523, what is there between a PDA and TC?
11:16:16 <ais523> LBA, but that only really works if you have a concept of input
11:16:21 <ais523> and a whole load of classes that don't have their own names
11:16:25 <ais523> because they don't come up very often
11:16:33 <ais523> (there are infinitely many classes, after all)
11:17:28 <soupdragon> http://www.stanford.edu/~laurik/fsmbook/examples/Einstein%27sPuzzle.html
11:17:41 <AnMaster> ais523, but dupdog doesn't have stdin?
11:18:02 <AnMaster> ais523, also which one is LBA?
11:18:03 <ais523> AnMaster: exactly, which is why LBA doesn't even make any sense wrt dupdog
11:18:07 <ais523> linear-bounded automaton
11:19:00 <AnMaster> oh why did searching for LBA not find http://esolangs.org/wiki/Linear_bounded_automaton
11:19:25 <ais523> no redirect, you can add one if you like
11:19:35 <ais523> mediawiki default search tends to rely a lot on human help
11:40:25 -!- rodgort has quit (Client Quit).
11:40:35 -!- rodgort has joined.
11:55:31 -!- |MigoMipo| has joined.
12:10:40 <augur> i just had an interesting idea
12:11:36 <augur> protons are functions from electrons to 1
12:12:53 <augur> e.g. p = \e -> 1 :: Electron -> Int
12:13:19 <augur> concatenation of protons as in p0^p1 is sort of like fold:
12:13:28 <augur> forked fold? who knows
12:14:52 <augur> p0^p1 = \x -> \y -> p1 x + p0 y
12:14:55 <augur> or something like that
12:15:54 <augur> well maybe it should track charge not charge-balances..
12:17:23 <augur> obviously im sleep deprived :D
12:17:43 -!- MigoMipo has quit (Read error: 110 (Connection timed out)).
12:20:58 -!- soupdragon has quit ("Leaving").
12:26:36 -!- soupdragon has joined.
12:38:19 <AnMaster> <augur> e.g. p = \e -> 1 :: Electron -> Int
12:38:42 <AnMaster> well it would have been heavily modified bnf
12:40:56 <AnMaster> and never found a nice tutorial that extended into the depths too
12:41:03 <augur> heres haskell in a microsecond:
12:41:11 <augur> math f(x) is haskell f x
12:41:13 <augur> welcome to haskell
12:41:21 <AnMaster> augur, I'm pretty sure there is more than that
12:41:34 <AnMaster> augur, also, what about f(x,y)
12:41:39 <augur> shh! you'll wake the ehird!
12:42:04 <augur> functions are all monadic in haskell
12:42:21 <soupdragon> functions are all monadic in haskell ???
12:42:32 <augur> nah, its just bracketted when need-be
12:42:44 <augur> no not all functions are monadic in haskell
12:42:50 <augur> where did you get that idea
12:43:09 <augur> but you can do it that way if you want: (+) x y
12:43:14 -!- |MigoMipo| has quit ("co'o rodo").
12:43:22 -!- MigoMipo has joined.
12:43:24 <AnMaster> also what about all those type things and such
12:43:36 <augur> see, haskell has sugar for operators right
12:43:42 <augur> but operators are still just functions
12:43:46 <augur> so if you want to get the function
12:43:51 <augur> stick it in parens to "call" it on nothing
12:43:55 <augur> that is, return just the function
12:43:58 <AnMaster> augur, but (+ x y) wouldn't work?
12:44:01 <augur> well now its the function, not the operator
12:44:36 <augur> tho that might also be interpretable as a tuple with one element
12:44:45 <augur> im not sure how haskell disambiguates these things
12:44:57 <AnMaster> augur, I suspect that by mixing the prefix and infix notations it is possible to create something that makes IOCCC seem quite sane and easy to read
12:45:24 <augur> it gets far far worse
12:45:56 <AnMaster> augur, anyway, what is all the stuff about types and such.
12:45:57 <augur> mostly because haskell can be highly pointsfree
12:46:27 <augur> f . g = \x.f (g x)
12:46:27 <AnMaster> soupdragon, which I don't know
12:46:43 <AnMaster> I know erlang, lisp, C, bash, and a few other languages
12:46:43 <augur> AnMaster: im really surprised
12:46:46 <augur> youve been here longer than me
12:47:12 <augur> then f has the type Int -> Int
12:47:40 <augur> well, 1 has the type Int
12:47:43 <augur> 1.0 has the type Float
12:47:51 <augur> you have to convert i guess
12:48:06 <augur> f x = x + 1 is desugared to f = \x -> x + 1
12:48:20 <augur> and so the type looks like whats on either side
12:48:28 <augur> and (x + 1) is an Int too
12:48:42 <augur> f x y = x + y is desugared to \x -> \y -> x + y
12:48:55 <augur> which is Int -> (Int -> Int)
12:49:09 <augur> since x is an Int and \y -> x + y is an Int -> Int
12:49:50 <AnMaster> what exactly do you mean by -> here?
12:50:03 <augur> X -> Y ~ "Take an X and return a Y"
12:50:22 <augur> if this were an untyped lambda calculus
12:50:30 <augur> \x -> \y -> x + y is trivially obvious, right
12:50:33 <augur> apply it to the number 2
12:50:35 <AnMaster> then why isn't f x y = x + y "takes 2 integers and returns one"
12:50:52 <augur> why, because haskell is lambda calculus.
12:50:56 <AnMaster> augur, so it returns a lambda that takes the other parameter?
12:51:10 <augur> hence the following is completely valid haskell:
12:51:16 <augur> f 2 == \y -> 2 + y
12:51:29 <AnMaster> so it returns an "add 2" function
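A tiny example of the currying just described; f and addTwo are made-up names.

    f :: Int -> Int -> Int
    f x y = x + y

    -- f 2 is itself a function, the "add 2" just mentioned.
    addTwo :: Int -> Int
    addTwo = f 2

    -- addTwo 3  ==>  5,  i.e.  (f 2) 3 == f 2 3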
12:52:02 <AnMaster> augur, so what about functions/operators like + - and such that can take various types
12:52:11 <AnMaster> or does haskell have separate + operator for ints and floats?
12:52:23 <augur> well, so theres type polymorphism right
12:52:31 <augur> you might have the function
12:52:44 <augur> now, lists are typed in haskell
12:52:56 <augur> lists are polymorphically typed
12:53:05 <augur> lists of whatever type you give it
12:53:12 <augur> so f x is also polymorphic
12:53:19 <augur> what is its type specifically?
12:53:21 <AnMaster> augur, like list of foo? well okay
12:53:26 <augur> f :: forall a. a -> [a]
12:53:42 <augur> where [a] is the sugar for list of a's
12:53:48 <AnMaster> augur, what does the "a. a" bit mean.
12:53:57 <augur> a. a doesnt mean anything :P
12:54:05 <augur> forall a. a -> [a]
12:54:06 <AnMaster> augur, okay how should one parse that bit then
12:54:27 <augur> for all types a, f can be a -> [a]
12:54:46 <augur> so if you give f an int
12:54:54 <augur> f is behaving like its of type Int -> [Int]
12:55:04 <augur> give it a string and its behaving like its of type String -> [String]
12:55:16 <augur> for ALL types a, f can behave like a -> [a]
12:55:23 <augur> or, f :: forall a. a -> [a]
12:55:41 <AnMaster> augur, so this is not exactly the same as predicate logic ∀ then?
12:55:59 <augur> if you're doing a typed lambda calculus
12:56:04 <augur> this is a quantifier over types
12:56:12 <augur> rather than a quantifier over values
12:56:17 <AnMaster> String -> [String] <-- is that a list of 1 or more strings?
12:56:32 <augur> the list [] is of type
12:56:36 <augur> forall a. a -> [a]
12:56:50 <augur> [a] just means "a list of a's"
12:56:57 <augur> and a list of no a's is still a list of a's
12:57:05 <AnMaster> well [] being of any type makes sense. After all the empty set is a subset of every other set
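The defining line for the polymorphic f seems to have dropped out of the log; presumably it was something like the following, which has the quantified type being discussed.

    f :: a -> [a]        -- i.e. forall a. a -> [a]
    f x = [x]

    -- f (1 :: Int)  ==> [1]       (here f behaves as Int -> [Int])
    -- f "foo"       ==> ["foo"]   (here f behaves as String -> [String])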
12:57:14 <augur> so, as for polymorphisms
12:57:36 <augur> i think you can have multiple functions with the same name
12:57:43 <augur> so long as they have different type signatures
12:57:46 <augur> e.g. you could define
12:58:06 <augur> (+) :: Int -> Int -> Int
12:58:12 <augur> (+) :: Float -> Float -> Float
12:58:15 <AnMaster> augur, couldn't there be some confusion about which one to use in certain cases? If there are implicit casts available in haskell?
12:58:22 <AnMaster> if there isn't I can't imagine it being an issue
12:58:28 <augur> no, i dont think there are implicit casts, so
12:58:47 <augur> oh, there are. hah.
12:58:54 <augur> well, the compiler is smart
12:59:04 <augur> it rounds down to the most applicable type
12:59:33 <augur> infact, it seems that haskell has a generic type class Num
12:59:36 <AnMaster> I can still imagining that being an issue sometimes
12:59:43 <augur> and Int and Float are both Nums
12:59:55 <AnMaster> augur, so the types form a hierarchy? sensible
13:00:08 <augur> and (+) is required to be of type (Num a) => a -> a -> a
13:00:29 <augur> where (Num a) => ... just means sort of
13:00:40 <augur> for all types a, which are of the typeclass Num, a -> a -> a
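A small example of what a (Num a) => constraint buys: one definition that works at every type in the Num class; double is a made-up name.

    double :: Num a => a -> a
    double x = x + x

    -- double (3 :: Int)      ==> 6
    -- double (1.5 :: Float)  ==> 3.0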
13:00:57 <AnMaster> augur, btw, couldn't you pass a tuple or such to make a function take two arguments (instead of returning a lambda)
13:01:31 <augur> then its a 1-arg function that requires a 2-tuple
13:01:36 <AnMaster> augur, also that leaves some interesting related questions: how you build and take apart tuples
13:01:37 <augur> and it has pattern matching over the tuple
13:01:43 <AnMaster> as in, how do I get the second member of a tuple
13:01:53 <augur> but you cant then do f 1
13:01:57 <augur> because 1 is not a 2-tuple
13:02:00 <AnMaster> augur, so similar to pattern matching?
13:02:11 <augur> its pattern matched, yes.
13:02:34 <augur> so you can, for instance, do
13:03:04 <augur> well, x:xs is cons
13:03:15 <augur> its sugar for roughly
13:03:22 <AnMaster> augur, and what are the car and cdr functions?
13:03:36 <augur> where Cons is a data constructor
13:03:56 <augur> i dont mean its a cons function
13:04:02 <augur> its a special thing
13:04:05 <augur> a data constructor
13:04:20 <AnMaster> augur, is it like erlang's [H|T] then?
13:04:22 <augur> it forms the basis of the ASTs in haskell
13:04:32 <augur> lets pretend its that
13:04:33 <AnMaster> augur, on the left or the right side of = ?
13:04:45 <augur> people dont usually use car and cdr
13:04:53 <augur> because they can just match them out
13:05:02 <augur> so you'd never define first = car
13:05:05 <AnMaster> like erlang's [H|T] then for head/tail
13:05:38 <AnMaster> augur, xs being a variable name that isn't used I guess?
13:05:47 <AnMaster> it would be the cdr I imagine?
13:06:08 <AnMaster> doesn't the compiler warn about unused variable then?
13:06:12 <augur> and this conses together 1 and []
13:06:28 <augur> 1 : 2 : [] = 1 : [2] = [1,2]
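A short sketch answering the tuple and car/cdr questions in plain Haskell: both tuples and lists are taken apart by pattern matching (fst, snd, head and tail exist in the Prelude, but the patterns are what people usually write); all names here are illustrative.

    addPair :: (Int, Int) -> Int
    addPair (x, y) = x + y        -- one argument, which happens to be a 2-tuple

    secondOfPair :: (a, b) -> b
    secondOfPair (_, y) = y       -- the Prelude calls this snd

    car :: [a] -> a
    car (x:_) = x                 -- the Prelude calls this head

    cdr :: [a] -> [a]
    cdr (_:xs) = xs               -- the Prelude calls this tail

    -- addPair (1, 2)       ==> 3
    -- secondOfPair (1, 2)  ==> 2
    -- cdr (1 : 2 : [])     ==> [2]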
13:06:33 <AnMaster> type being a keyword to define a type?
13:06:47 <augur> i mean if you type 1 : []
13:06:56 <AnMaster> right, it wasn't part of the code, heh sorry
13:07:15 <augur> ill use > on newlines
13:07:19 <augur> since lambdabot isnt here
13:08:01 <augur> > let first (x:xs) = x in first [1,2,3]
13:08:13 <augur> > let rest (x:xs) = xs in rest [1,2,3]
13:08:36 <AnMaster> augur, so 1 : 2 : [] is (cons 1 (cons 2 '())) ?
13:08:40 <augur> this is because [1,2,3] is really really just sugar for 1:2:3:[]
13:09:02 <AnMaster> (if you are insisting on infix ;P)
13:09:04 <augur> : being right associative
13:09:25 <augur> if we were using prefixing:
13:09:43 <augur> [1,2,3] is sugar for (:) 1 ((:) 2 ((:) 3 []))
13:09:46 <AnMaster> if you are going to prefix you might as well use lisp instead ;P
13:09:53 <augur> and if we were using the List monad:
13:10:00 <augur> [1,2,3] is sugar for uh
13:10:08 <augur> List 1 (List 2 (List 3 []))
13:10:11 <AnMaster> ah now we come to "monad"... now what on earth is that exactly?
13:10:27 <augur> Monad is a type class
13:10:50 <augur> Int is an instance of the type class Num
13:11:07 <augur> so as an instance of the Monad type class
13:11:10 <AnMaster> augur, so you can only have 2 layers then? type class and type?
13:11:16 <AnMaster> or can type classes inherit each other
13:11:26 <augur> i think you only get types and type classes
13:11:35 <augur> and you just multi-class a type
13:11:56 <augur> on an instance of Monad you can (must?) have defined certain functions
13:12:09 <augur> i think the three are >>=, return, and ... maybe thats it?
13:12:46 <augur> are the only functions.
13:13:09 <AnMaster> (and where does IO and state come into this)
13:13:16 <augur> well, theyre, by convention, required to follow certain rules
13:13:29 <augur> (return a) >> k == k a
13:13:52 <augur> xs >>= (return . f) == fmap f xs
13:14:08 <augur> m >>= (\x -> (k x) >>= h) == (m >>= k) >>= h
13:14:24 <augur> those are the facts that must hold of >>= and return
13:14:41 <augur> when i say must, i mean "by convention of what a monad is"
13:14:51 <augur> you can violate them, but dont expect your code to behave monadically
13:14:51 <AnMaster> augur, writing those in English would help, I'm not very used to the multitude of use of punctuation that haskell uses (possibly even exceeding perl at that!)
13:15:05 <augur> >>= is just an operator right
13:15:22 <augur> so (return a) >>= k == k a
13:15:43 <augur> means that (return a) >>= k must return the same value as k a
13:15:54 <AnMaster> so the >> instead of >>= was a typo?
13:16:04 <augur> . is the compose operator
13:16:16 <AnMaster> augur, okay, I'm more used to seeing the compose operator as a mid dot
13:16:19 <augur> f . g = \x -> f (g x)
13:16:42 <AnMaster> I'm pretty sure I have seen f <mid dot> g mean g(f(x)) somewhere
13:16:54 <augur> but haskell isnt APL
13:17:00 <AnMaster> augur, yes, which is where I have seen the mid dot!
13:17:33 <augur> so those are the monad laws
13:17:47 <augur> and thats what defines a monad
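The three monad laws gathered in one place (the >> in the first one above was, as noted, a typo for >>=), plus a small check of the first law in the list monad.

    -- For any m :: m a, a :: a, and k, h :: a -> m b:
    --   return a >>= k           ==  k a                   -- left identity
    --   m >>= return             ==  m                     -- right identity
    --   m >>= (\x -> k x >>= h)  ==  (m >>= k) >>= h       -- associativity
    --
    -- The fmap equation above follows from these:  fmap f xs == xs >>= (return . f)

    checkLeftIdentity :: Bool
    checkLeftIdentity = (return 3 >>= k) == k 3
      where
        k :: Integer -> [Integer]
        k x = [x, x + 1]
    -- checkLeftIdentity  ==>  True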
13:18:20 <soupdragon> you can define fmap using >>= and return
13:18:20 <AnMaster> is it related to maping a lambda over a list in any way?
13:18:45 <augur> fmap is i think probably something that you can think of as like
13:18:58 <augur> a monad-specific definable version of map
13:19:05 <AnMaster> a generic map over lists as well as other types?
13:19:26 <augur> soupdragon, are monads all Functors? they are right?
13:19:50 <AnMaster> then where does state and IO come into monads?
13:19:54 <augur> so, monads are all Functors, Functor being another type class
13:20:06 <augur> and so in order to be a Monad you also have to be a Functor which means you have to have fmap defined.
13:20:10 <soupdragon> yes, another (equivalent) definition of monad is Functor with join and return
13:20:31 <augur> which obeys its own set of laws
13:21:19 <augur> fmap (f . g) x = fmap f (fmap g x)
13:21:44 <augur> IO and state who knows.
13:22:16 <augur> i mean, the way you do it, afaik, is that like
13:22:36 <AnMaster> fmap takes a lambda and whatever it is to map over?
13:22:46 <augur> if you just call into existence an IO monad, itll sort of be the same monad every time.
13:22:57 <augur> but if you pipe an IO monad through some magic functions
13:23:03 <augur> you get new IO monads
13:23:10 <AnMaster> augur, sounds like black compiler magic to me...
13:23:18 -!- MigoMipo has quit ("co'o rodo").
13:23:26 <augur> the idea is basically that the monads you're getting back are not the SAME monad you put in
13:23:32 <augur> and therefore can be different
13:23:36 <AnMaster> augur, and why is that important?
13:23:48 <augur> well, because it provides the illusion of purity
13:24:04 <augur> suppose you just called forth an IO monad from nothing
13:24:07 <augur> and assigned it to x
13:24:17 <augur> and then you did some shit with it, printing a line to the screen
13:24:18 <AnMaster> augur, how is this better than just making most of your code pure and having a tiny bit with all the impure stuff
13:24:25 <augur> thus getting back a new monad that you assign to y
13:24:29 <AnMaster> like you commonly do in scheme or erlang for example
13:24:55 <augur> if you then tried to print to the screen again using x, i think nothing would happen on screen
13:25:04 <augur> (im not entirely sure on this; i dont do this IO shit in haskell :P)
13:25:11 <augur> instead you'd have to use the monad stored in y
13:25:22 <AnMaster> augur, how do you then get the result from your computation back? ;P
13:25:28 <augur> and you'd keep chaining together the monads you get back from previous IO function calls
13:25:40 <augur> right so input is sort of the same, right
13:25:55 <augur> if you ask the monad stored in y, now, for an input
13:26:08 <augur> itll give you back a string paired with a new IO monad i think
13:26:15 <augur> the new monad you can do shit with however you want
13:26:21 <augur> if you ask y for input AGAIN
13:26:26 <augur> you get back the same string-monad pair as before
13:26:35 <augur> it doesnt go ask the user for input again
13:26:41 <AnMaster> augur, I sure hope haskell's GC is good then
13:26:54 <augur> im fairly certain that its spot on
13:26:56 <AnMaster> since it sounds like reading a file would take an awful lot of memory
13:27:13 <augur> nah, i think read ops are compiler-internally single operations
13:27:14 <AnMaster> well, hopefully it shares the string read in question
13:27:20 <augur> you get back a monad and the content string
13:27:40 <augur> like i said, im not entirely clear on this IO stuff
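A toy sketch of the "pass the world along" picture being gestured at here. It is only a conceptual model (GHC's actual IO is opaque, and an old world value can never be reused); World, IOish, putStrLnW and twoLines are invented names.

    newtype World = World Int          -- stand-in for "the state of the outside world"

    type IOish a = World -> (a, World)

    -- Pretend that printing changes the world; the string itself is ignored here.
    putStrLnW :: String -> IOish ()
    putStrLnW _ (World n) = ((), World (n + 1))

    -- Chaining has to thread the new world into the next action by hand.
    twoLines :: IOish ()
    twoLines w0 =
      let ((), w1) = putStrLnW "first"  w0
          ((), w2) = putStrLnW "second" w1
      in  ((), w2)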
13:27:50 <AnMaster> what about the state monad then?
13:28:05 <augur> state is more black magic
13:28:14 <AnMaster> even more than IO? hard to imagine
13:28:22 <Asztal> state isn't black magic at all :(
13:28:24 <augur> i think its vaguely like a monad that has hash-like magic
13:28:33 <Asztal> ST is magic, State isn't.
13:28:43 <AnMaster> Asztal, what is ST and what is State?
13:28:44 <augur> so that when you "assign" you're taking a var-binding hash, and deriving a new hash with the "bound" var changed
13:28:59 <augur> e.g. you're going from, say, { "x" => 1 } to { "x" => 2 }
13:29:06 <augur> in a purely functional fashion
13:29:19 <augur> nothing, im just making shit up
13:29:35 <Asztal> ST means State Thread, it kind of ensures that there's only one "world state" in existence at any time, so you can mutate the world as you please
13:29:40 <AnMaster> it looked slightly python-dict-ish
13:29:56 <augur> its supposed to be ruby-esque but whatever
13:30:15 <AnMaster> augur, well not knowing ruby I guess python was the next best I could manage ;P
13:30:17 <augur> i dont know how these magic monads are implemented so, yeah.
13:30:30 <Asztal> IO and ST are implemented in a very similar way :)
13:31:34 <AnMaster> does it actually act like a erlang style of state process. (being a common idiom)
13:31:37 <Asztal> IO is sort of a special instance of ST where the state you're manipulating is... the real world!
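A minimal sketch of the State idea being described (not the actual library implementation): a "stateful computation" is just a function from the old state to a result paired with the new state, so nothing is ever mutated.

    newtype State s a = State { runState :: s -> (a, s) }

    get :: State s s
    get = State (\s -> (s, s))          -- result is the current state, state unchanged

    put :: s -> State s ()
    put s' = State (\_ -> ((), s'))     -- discard the old state, install the new one

    -- runState get 5        ==> (5, 5)
    -- runState (put 9) 5    ==> ((), 9)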
13:31:38 <augur> because its a state monad that threads through your whole program! :P
13:31:50 <AnMaster> ah so not thread as in separate process then?
13:32:00 <Asztal> not a thread in that sense, no.
13:32:06 <augur> i dont think haskell has explicit threading
13:32:31 <augur> because its lazy and functional, you can spin off arbitrarily many threads in your compiled program
13:32:37 <augur> and im pretty sure it works fine
13:32:43 <AnMaster> because using a separate process in erlang is a common idiom to keep the state. basically you send/receive messages to/from that process to access the state
13:32:46 <augur> parallelize the shit out of everything
13:32:53 <AnMaster> and that one just does a tail recursive loop
13:32:57 <AnMaster> with its state as a parameter
13:33:06 <augur> nothing needs to be explicit except, i think, in your compile params
13:33:32 <AnMaster> augur, well, one issue is where to thread
13:33:50 <AnMaster> I guess the compiler is smart enough to figure out where it is an overhead and where it isn't?
13:34:49 <AnMaster> a heuristic I can't imagine as being 100% fool-proof ;P
13:35:00 <augur> where is lambdabot
13:35:02 <Asztal> The expression (x `par` y) sparks the evaluation of x (to weak head normal form) and returns y. Sparks are queued for execution in FIFO order, but are not executed immediately. If the runtime detects that there is an idle CPU, then it may convert a spark into a real thread, and run the new thread on the idle CPU. In this way the available parallelism is spread amongst the real CPUs.
13:35:17 <Asztal> http://www.haskell.org/ghc/docs/latest/html/users_guide/lang-parallel.html
13:35:37 <augur> well there you go.
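A small example of the spark mechanism just quoted; it assumes the parallel package (Control.Parallel) and a threaded runtime (compile with ghc -threaded, run with +RTS -N). parSum is a made-up name.

    import Control.Parallel (par, pseq)

    -- Spark the evaluation of x, make sure y is evaluated, then combine them.
    parSum :: Integer
    parSum = x `par` (y `pseq` (x + y))
      where
        x = sum [1 .. 1000000]
        y = sum [1000001 .. 2000000]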
13:35:53 <AnMaster> augur, I have ghci in a window here
13:36:53 <AnMaster> augur, does haskell have any sort of macros? In the meaning of lisp macros I mean
13:37:17 <Asztal> it's not quite as cool as lisp, but it does have template haskell
13:37:35 <augur> theres some crazy shit that people do with haskell tho
13:37:40 <AnMaster> Asztal, nothing like C++ templates I hope?
13:37:41 <augur> somehow they get reactive programming
13:37:49 <augur> the epitome of anti-laziness
13:39:14 <AnMaster> augur, so how would you do something like a memoizing Fibonacci function in haskell?
13:39:33 <AnMaster> that is, one which counts up, not down, and reuses the calculations
13:39:41 <AnMaster> so it doesn't have to calculate everything a lot of times
13:39:54 <augur> you could probably do something stupid like
13:40:05 <augur> and just carry your memos with you
13:40:26 <augur> maybe you'd do a memoized-fib monad
13:40:53 <augur> and use do notation to peel it open
13:40:55 <AnMaster> augur, in scheme I would do it as a tail recursive function
13:41:29 <augur> do (n, memo) <- mfib (5, [])
13:41:34 <Asztal> http://www.haskell.org/haskellwiki/Memoization#Memoization_with_recursion :P
13:41:36 <AnMaster> soupdragon, oh? how do you implement something like a main loop then?
13:42:09 <AnMaster> soupdragon, looks like a tail call?
13:42:24 <augur> Asztal's memoized fib there looks like magic
13:43:15 <augur> its the magic of infinite lists and such
13:43:24 <AnMaster> soupdragon, the one Asztal linked seems fairly easy to understand, apart from the !! bit
13:43:28 <AnMaster> which I have no idea what it means
13:43:58 <Asztal> "list !! i" picks the i'th element from the list.
13:43:59 <augur> actually, what its doing is defining an infinite list thats calculated over some other infinite list
13:44:11 <augur> and by virtue of it being defined once, and in terms of itself
13:44:38 <augur> when you go to pluck some elements off the list, it just traces back the list, building it up as far as it needs to
13:44:55 <augur> and, since it IS the same list, it only ever gets built up once
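A sketch of the shared-list memoization being described, in the spirit of the page Asztal linked: fibs is one list, defined in terms of itself, so each element is computed at most once and later lookups just reuse it.

    fibs :: [Integer]
    fibs = 0 : 1 : zipWith (+) fibs (tail fibs)

    fib :: Int -> Integer
    fib n = fibs !! n

    -- fib 10  ==> 55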
13:44:57 <AnMaster> I have to say that the variant I thought of in scheme is a lot more sensible, just tail recursion and pass the value of the last calculations along
13:45:09 <augur> and by built i mean in the interpreter when it tries to evaluate it
13:45:21 <augur> yes, thats what you could do in haskell too
13:45:34 <augur> but then you have to explicitly carry this around
13:45:47 <augur> what the one here doesnt let you do tho
13:45:51 <augur> is memoize across calls to fib
13:45:54 <augur> so if you did like
13:46:04 <augur> it would take the same amount of time
13:46:13 <AnMaster> augur, then I would write it in another way indeed
13:46:22 <augur> whereas if you were explicitely carrying it around, you could do like
13:46:44 <AnMaster> augur, you could use the ST monad to store it somehow? :D
13:46:46 <augur> let (x, memo) = fib 1000000 [] in fib 1000000 memo
13:46:55 <augur> AnMaster: probably
13:47:18 <AnMaster> so what is this infinte list stuff about.
13:47:21 <augur> but you'd still need the explicit stuff i think
13:47:28 <augur> well ok so haskell is lazy, right
13:47:35 <augur> it doesnt evaluate it unless it needs to
13:47:43 <augur> so lets do the classic infinite list
13:48:02 <augur> this is fine, because haskell doesnt need to evaluate ones when DEFINING ones
13:48:07 <augur> it doesnt need to evaluate anything, infact
13:48:18 <AnMaster> I have no problems with that idea at all
13:48:33 <AnMaster> or can you do more interesting stuff with it?
13:48:33 <augur> tho i think itll check for scopes
13:48:41 <augur> well, you can do interesting stuff, but it all comes down to that
13:48:48 <augur> if you do first ones
13:48:59 <augur> it doesnt need to evaluate ones beyond its first value
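The classic infinite list being referred to (its defining line doesn't appear in the log):

    ones :: [Integer]
    ones = 1 : ones

    -- take 5 ones  ==> [1,1,1,1,1]
    -- head ones    ==> 1    (laziness: only as much of ones is built as is needed)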
13:49:10 -!- MizardX- has joined.
13:49:18 <AnMaster> augur, what about using it to calculate 1/2+1/4+1/8+1/16+...
13:49:20 <augur> when you do third ones it then goes through and evals what it needs to eval to get there
13:49:38 <augur> ok so that is probably something like lets see
13:49:50 <AnMaster> doesn't that go to 1 when the number of elements summed goes to infinity iirc?
13:49:58 -!- MizardX has quit (Read error: 104 (Connection reset by peer)).
13:50:07 -!- MizardX- has changed nick to MizardX.
13:50:18 <AnMaster> soupdragon, not in math. I think the limit when n goes towards infinite is 1. Unless I misremember
13:50:41 <AnMaster> soupdragon, I meant http://upload.wikimedia.org/math/e/c/2/ec2190a57b685add5a4e43c3ecfeed94.png
13:50:43 <augur> that the definition is
13:51:10 <AnMaster> soupdragon, it would be cool if you could use haskell to do that kind of stuff
13:51:25 <AnMaster> but I guess you are better off with mathematica or some other CAS
13:51:25 <augur> halves = 1 : map (\x -> x/2) (drop 1 halves)
13:51:31 <augur> but i killed ghci doing that
13:51:36 <soupdragon> there's no way + can know that the 100th element of that sequence isn't suddenly 62662
13:51:57 <augur> that will get you [1,1/2,1/4,...]
13:52:24 <augur> then you'd just take however many out you want for precision and fold them down
13:52:30 <augur> i guess you could also do this the smarter way
13:52:33 <soupdragon> if you defined it in the same way as mathematica you could compute it
13:52:52 <soupdragon> (as a series, rather than an infinite sum)
13:52:58 <augur> [x ** n | n <- [0..]]
13:53:45 <augur> Deewiant: but it NEEDS drop 1!
13:53:55 <Deewiant> Only for the infinite loop :-P
13:53:59 <augur> im a silly person i am
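The halves list without the extra drop 1 (which is what makes the version above loop), and how its partial sums behave:

    halves :: [Double]
    halves = 1 : map (/ 2) halves     -- each element is half the previous one

    -- take 5 halves         ==> [1.0, 0.5, 0.25, 0.125, 6.25e-2]
    -- sum (take 30 halves)  ==> just under 2.0
    -- (1 + 1/2 + 1/4 + ... converges to 2; the tail starting at 1/2 sums to 1,
    --  which is the limit AnMaster asks about above)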
13:55:03 <AnMaster> soupdragon, atm I'm trying to work out what it would be in mathematica
13:55:06 <augur> drop n (x:xs) = drop (n-1) xs
13:56:15 <soupdragon> AnMaster, I mean if you represent it symbolically, like Sigma (\n -> Div 1 (Exp 2 n)) then you could write an algorithm to try and solve sums (like mathematica)
13:56:42 <AnMaster> soupdragon, what was that about 1/(1+1/(1+1/...
13:57:03 <AnMaster> *tries to remember the series*
13:57:10 <AnMaster> it was some rather neat series of infinite divisions
13:57:17 <soupdragon> but that's not what I was talking about
13:57:33 <AnMaster> I was just jumping to a different thought
13:57:49 <AnMaster> anyway it wasn't the golden ratio I meant
13:57:52 <soupdragon> phi = 1/(1+phi) also diverges in haskell :P
13:57:58 <AnMaster> soupdragon, the point was that it added up to exactly 2
13:58:03 <AnMaster> so it must have been a different one
13:58:16 <augur> ok im going to sleep for a little bit guys
13:58:20 <AnMaster> it was a continued fraction of some sort
14:06:07 <AnMaster> it was 1 + 1/2 + 1/4 + 1/8 + 1/16 + 1/32
14:07:58 <AnMaster> soupdragon, as a floating point number?
14:08:17 -!- coppro has quit (Read error: 110 (Connection timed out)).
14:08:24 <AnMaster> how do you mean it would be stored then
14:08:35 <AnMaster> if decimal 1.11111... is 10 in binary?
14:09:46 <AnMaster> anyway that sum was Sum[1/2^i, {i, 0, Infinity}] I think
14:10:17 <AnMaster> soupdragon, well yes for 1 + 1/2 + 1/4 + 1/8 + 1/16 + 1/32 + ...
14:11:07 <soupdragon> that's pretty cool I wonder what algorithm it uses
14:11:24 <AnMaster> soupdragon, in latex \sum _{i=0}^{\infty } \frac{1}{2^i} I think
14:22:16 -!- BeholdMyGlory has joined.
14:45:07 <AnMaster> soupdragon, well that is not public for mathematica mostly iirc
14:46:08 <AnMaster> soupdragon, maxima manages it too
14:46:38 <AnMaster> sum(1/(2^k), k, 0, inf), simpsum;
14:46:47 <AnMaster> soupdragon, so go look at that
15:03:13 -!- ais523 has quit (Connection timed out).
15:19:36 -!- ais523 has joined.
15:35:31 -!- soupdragon has quit ("Leaving").
15:44:34 -!- FireFly has quit (Connection timed out).
15:45:42 -!- FireFly has joined.
16:47:04 <AnMaster> Gregor, btw, that "linux binaries on OS X" idea with your elfloader, did you do any work of that?
17:00:09 -!- adam_d_ has changed nick to adam_d.
17:43:52 -!- FireFly has quit (Read error: 60 (Operation timed out)).
17:59:34 -!- FireFly has joined.
18:49:55 -!- jpc1 has joined.
19:19:13 -!- bsmntbombdood has changed nick to bsmntbombgirl.
19:32:23 -!- Desmo has joined.
19:38:38 -!- ais523 has quit (Remote closed the connection).
19:38:57 -!- Desmo has left (?).
21:06:50 -!- oerjan has joined.
21:32:59 <AnMaster> hm interesting stuff happens when you delete part of the stuff while tar is still extracting:
21:33:24 <AnMaster> tar: linux-2.6.32.2/arch/microblaze: Function "stat" failed: File or directory not found
21:33:40 <AnMaster> I was trying to save some space by deleting all the arches I didn't need
21:37:26 -!- coppro has joined.
21:37:50 <Deewiant> Presumably it was extracting into that directory, so of course it expects it to still exist.
22:10:54 -!- jpc1 has changed nick to jpc.
22:30:27 * uorygl goes to see if Gregor has any simpler opuses.
22:34:48 <uorygl> Well, Opus 10 sounds simpler, but it's less interesting and has some icky patches.
22:35:36 <AnMaster> "Support for old Pentium 5 / WinChip machine checks"
22:36:00 * uorygl looks at the score for Opus 9 in an attempt to judge its quality from just that.
22:37:04 * uorygl listens to the horrible-sounding MIDI version of Opus 9 instead.
22:37:35 <uorygl> Why am I listening to this at all? It's not for piano.
22:38:55 <AnMaster> uorygl, why is it horrible sounding?
22:39:10 <AnMaster> uorygl, not a good enough sound font loaded into your hardware midi?
22:39:42 <AnMaster> oerjan, thing is, if this is a typo for pentium 3 then it would affect the kernel I'm configuring
22:41:02 -!- augur has quit ("Leaving...").
22:41:46 <AnMaster> oh it seems the original pentium was codenamed P5
22:41:47 <oerjan> AnMaster: punny puns of puun
22:42:08 <AnMaster> oerjan, too worried about this to bother with puns
22:42:42 <uorygl> AnMaster: nope. It's probably difficult to get a good soundfont for a string trio.
22:42:47 <AnMaster> and I'm not in a happy mood atm: frozen water in pipes tends to make you rather unhappy
22:43:18 <AnMaster> oerjan, thankfully only for the garden hose connection on the outside of the wall
22:43:33 <AnMaster> so no expensive leak inside some wall
22:43:40 <AnMaster> but still, water has to be turned off
22:44:26 <AnMaster> not a single one within 40 km radius was able before tomorrow (and this happened yesterday morning)
22:46:50 <AnMaster> btw googling for pentium 5 found some funny old stuff
22:47:03 <AnMaster> rumours about a 15 GHz pentium 5 by 2010 for example :D
22:47:40 -!- MigoMipo has joined.
22:50:27 <Deewiant> AnMaster: Well, Intel did say they were going to go to 10 GHz and higher, around the time when the Prescott came out
22:50:43 <Deewiant> When they noticed AMD's 2 GHz chips performed better than theirs they kinda changed plans
22:51:04 <Deewiant> (Where "theirs" == 5 GHz and up.)
22:51:06 -!- augur has joined.
22:52:51 <Deewiant> http://en.wikipedia.org/wiki/Tejas_and_Jayhawk suggests they had 7 GHz stuff ready to be built in 2004.
23:03:59 -!- Asztal has quit (Read error: 60 (Operation timed out)).
23:19:35 -!- soupdragon has joined.
23:23:43 <AnMaster> Deewiant, but I thought there was some wall around 5-6 Ghz?
23:29:24 <Ilari> What was the highest clock frequency seen in any sold Intel X86/X64 series processor?
23:37:00 <soupdragon> I wanted to ask you guys a question here
23:37:44 <soupdragon> Given a linear recurrence over the integers: a_n = c_1 * a_(n-1) + ... + c_d * a_(n-d)
23:38:10 <soupdragon> with initial conditions, does there exist a k so that a_k = 0? He only asks for a decision procedure: he says that it is ... faintly outrageous that this problem is still open; it is saying that we do not know how to decide the halting problem even for “linear” automata!
23:38:30 <Ilari> Apparently 3.8GHz was the highest.
23:39:03 <soupdragon> I was going to ask, doesn't rule 110 explain why there is no decision procedure
23:39:10 <soupdragon> but actually it's not a linear recurrence..
23:40:01 <uorygl> That question is reminiscent of the subset-sum problem.
23:40:24 <oerjan> soupdragon: um that looked linear to me...
23:40:38 <uorygl> Rule 110 looks linear?
23:41:08 <Ilari> I think that linear recurrence plus initial conditions decision looks solvable...
23:41:59 <oerjan> soupdragon: "faintly outrageous that this problem is still open" means that whoever gave you the problem claims it's open?
23:42:05 <uorygl> Hmm, all those linear recurrences can be represented as matrices.
23:42:05 <Ilari> Ah yeah, I figured out what the problem is...
23:42:50 <soupdragon> yes the claim is that it's open, and I thought 110 gave an argument against: but then realized that 1D CA is a different format
23:43:40 <Ilari> a_n = c_1 * a_(n-1) + ... + c_d * a_(n-d) plus initial conditions can be written in the form a_n = e_1 * (b_1)^n + ... + e_d * (b_d)^n.
23:43:44 <oerjan> well that obviously means it's _hard_
23:44:36 <oerjan> Ilari: that may still make it _worse_, since those b_i are not integers
23:44:39 -!- MigoMipo has quit (Read error: 60 (Operation timed out)).
23:44:55 <Ilari> oerjan: In fact b's and e's are not even guaranteed to be real.
23:46:57 -!- MizardX has quit ("zzz").
23:47:10 <soupdragon> so you can't encode a turing machine with a linear recurrence?
23:50:03 -!- adu has quit (Read error: 110 (Connection timed out)).
23:51:43 <oerjan> soupdragon: um if we knew how to do that we would have solved that unsolved problem. you think that is something done on the spot in an irc channel?
23:52:11 <soupdragon> if you know you can't encode a turing machine into the linear recurrence, it still doesn't solve the problem
23:52:47 <soupdragon> if you know you can then the problem is closed, but it's open -- so there's some kind of epistemological argument that you can't encode turing machines into it
23:53:35 <oerjan> i don't think so. it could just be possible, but very hard
23:53:59 <oerjan> that's how famous unsolved problems are usually solved, after all
23:54:26 <oerjan> well that or someone comes up with an argument from something completely unrelated
23:56:05 <oerjan> it can be easily turned into a number of special cases of a vector problem
23:56:34 <Ilari> Here's another problem that's surprisingly hard (but not unsolvable): Given an nxn table of bits, how many ways are there to choose n all-1 bits such that no two chosen bits are on the same column or row?
23:56:34 <oerjan> given matrix A, vector v, is there any n such that A^n v has first coordinate 0
23:57:12 <oerjan> although that could still be harder, or could it...
23:58:05 <oerjan> well that's obviously solvable in the computational sense, just use brute force
23:58:23 <soupdragon> oerjan, hm.. I have seen people compute fibs using the matrix power like that so yeah I see a link
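A sketch of the matrix-power view soupdragon mentions: a linear recurrence is one step of multiplying by its companion matrix, so computing Fibonacci reduces to fast matrix exponentiation. All names here are illustrative.

    type Mat = ((Integer, Integer), (Integer, Integer))

    mmul :: Mat -> Mat -> Mat
    mmul ((a, b), (c, d)) ((e, f), (g, h)) =
      ((a*e + b*g, a*f + b*h), (c*e + d*g, c*f + d*h))

    -- Exponentiation by squaring.
    mpow :: Mat -> Int -> Mat
    mpow _ 0 = ((1, 0), (0, 1))
    mpow m n
      | even n    = let h = mpow m (n `div` 2) in mmul h h
      | otherwise = mmul m (mpow m (n - 1))

    -- The companion matrix of a_n = a_(n-1) + a_(n-2); its n-th power holds F(n).
    fibMat :: Int -> Integer
    fibMat n = snd (fst (mpow ((1, 1), (1, 0)) n))

    -- fibMat 10  ==> 55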