00:00:26 <dulla> :^)
00:03:05 <MDude> http://orteil.dashnet.org/randomgen/?gen=http://mdude1350.webs.com/generators/random-code/one-liner.txt
00:07:05 -!- wool has joined.
00:09:13 <MDude> http://wurstcaptures.untergrund.net/music/?oneliner=(Math.log(sin(215%26t)))-(Math.sqrt(124%7Ct))%2Bcos((tan(t))%26201%7Ct)%20&oneliner2=(Math.log(sin(215%26t)))-(Math.sqrt(124%7Ct))%2Bsin(sin(cos(117%7Ct)))&t0=0&tmod=0&duration=30&separation=100&rate=8000
00:11:39 <dulla> fucking byte bytes
00:11:45 <dulla> byte beats
00:14:29 <MDude> Random number generator is not always the best composer.
00:14:47 <MDude> But it did come up with this as I was typing that: http://wurstcaptures.untergrund.net/music/?oneliner=t%3E%3E(sin(t*28))%7Ct%20&oneliner2=t%3E%3E(198-t*(Math.log(t%7C62)))%2F(cos(cos(233%3E%3Et)))&t0=0&tmod=0&duration=30&separation=100&rate=8000
00:16:23 <oerjan> girl genius late
00:18:33 <dulla> that weird comic?
00:23:34 <MDude> http://wurstcaptures.untergrund.net/music/?oneliner=Math.sqrt(81*t)%3E%3Et%25121%20&oneliner2=Math.sqrt(81*t)%3E%3Et&t0=0&tmod=0&duration=30&separation=100&rate=8000
00:31:33 -!- adu has joined.
00:32:29 -!- kcm1700 has quit (Read error: Connection reset by peer).
00:32:34 -!- kcm1700_ has joined.
00:38:21 <Gregor> oerjan: I see my name highlighted multiple times.
00:38:29 <oerjan> it's alive!
00:38:40 <oerjan> Gregor: the wiki server is/was down
00:39:33 <oerjan> make that is
00:40:50 -!- shikhin_ has joined.
00:41:05 <dulla> install gentoo
00:41:25 <Sgeo> Why do resolution changes permanently resize windows?
00:41:41 <Gregor> Wow, it's very down. Weird.
00:41:44 <Gregor> I'll kick the server.
00:41:56 * Sgeo has his oculus rift
00:42:03 -!- Lymia has joined.
00:43:22 -!- shikhin has quit (Ping timeout: 240 seconds).
00:44:08 * Gregor has his occultus rift.
00:44:10 -!- Sgeo has quit (Read error: Connection reset by peer).
00:44:31 <oerjan> Gregor: is this the portal-to-another-dimension kind
00:44:51 <Gregor> It's a portal to the esolangs server panel, anyway.
00:47:20 -!- mitchs has quit (Quit: mitchs).
00:49:16 <MDude> http://wurstcaptures.untergrund.net/music/?oneliner=Math.sqrt(t*253)%2Btan(Math.sqrt(166%7Ct))%20&oneliner2=Math.sqrt(t*253)%2BMath.log(tan(cos(57-t)))&t0=1000&tmod=0&duration=30&separation=100&rate=8000
00:52:57 -!- mihow has quit (Quit: mihow).
00:53:55 -!- hjulle has quit (Ping timeout: 250 seconds).
00:59:41 -!- mitchs has joined.
01:00:11 <oerjan> oh it's back
01:00:28 <oerjan> but HackEgo hasn
01:00:34 <oerjan> 't rejoined the channel
01:06:46 <oerjan> MDude: syllogisms hth
01:09:51 <MDude> Oh right, that's the other thing I was thinking of.
01:09:56 <MDude> Thanks oerjan.
01:10:04 <oerjan> yw
01:12:04 <MDude> Anyway, what I was thinking with categorical propositions was to have it used in OOP to make inheritance better/worse.
01:13:39 -!- Lymee has joined.
01:14:08 <oerjan> All JavaBeanVisitorFactories are Factories
01:14:11 <MDude> I'm pretty sure that's what I was thinking.
01:14:22 <MDude> Not that.
01:14:26 <oerjan> aww
01:14:28 <MDude> Maybe?
01:14:41 * oerjan _was_ joking hth hth
01:14:52 <oerjan> also forgetting the script bug
01:15:04 <MDude> I just meant you entered that right before me.
01:16:20 -!- Lymia has quit (Ping timeout: 265 seconds).
01:16:53 <MDude> But yeah, the idea was mostly that if you have "some" or "not all" of a class as something, you could more easily have exceptions to things.
01:17:31 <MDude> And other parts would know that such exceptions are possible, as opposed to if it was "all" or "none".
01:18:23 <MDude> And then you could have conditionals like "if (instance) might be a (class)".
01:19:08 -!- Phantom_Hoover has quit (Read error: Connection reset by peer).
01:19:50 <MDude> And that is the fine joke.
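(A minimal, hypothetical Haskell sketch of the "some"/"not all" membership idea with a "might be a" conditional, as I read MDude's description; every name below is made up for illustration.)

```haskell
-- Hypothetical sketch: class-to-superclass relations carry a quantifier
-- instead of being all-or-nothing, so exceptions are representable.
data Quantifier = All | Some | NotAll | None
  deriving (Show, Eq)

-- A toy hierarchy: each edge says how much of the subclass is also an
-- instance of the superclass.
relations :: [(String, String, Quantifier)]
relations =
  [ ("Penguin", "Bird",  All)
  , ("Bird",    "Flier", NotAll)   -- exceptions are possible here
  ]

-- "if (instance of c) might be a (superclass s)": anything short of None.
mightBe :: String -> String -> Bool
mightBe c s = case [q | (c', s', q) <- relations, c' == c, s' == s] of
  (q:_) -> q /= None
  []    -> False

main :: IO ()
main = print (mightBe "Bird" "Flier")   -- True: some birds fly, some don't
```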
01:25:19 -!- Sgeo|phone has joined.
01:25:35 <Sgeo|phone> My computer wont turb on
01:25:43 <Sgeo|phone> It just shows a black screeb
01:26:12 -!- Tritonio has quit (Ping timeout: 252 seconds).
01:29:08 <mitchs> not a burnt out backlight on a laptop is it?
01:29:41 <Sgeo|phone> How would i tell?
01:29:58 <Sgeo|phone> The screen does turb on
01:30:06 <Sgeo|phone> Turn
01:30:36 <Sgeo|phone> I don't even see POST stuff
01:31:32 <mitchs> you could tell by shining a flashlight on the screen or hooking up an external monitor
01:33:24 -!- Thedarkb has joined.
01:34:20 <Sgeo|phone> Rift counts as external monitor i think
01:34:39 <Sgeo|phone> But it didn't receive video info when I do that
01:35:59 <Sgeo|phone> Last time this happened, one of my many restarts was black as usual but eventually showed the Windows login screen
01:36:30 <Sgeo|phone> I may have been pressing F8 or something
01:36:39 <Sgeo|phone> I'm about to lose my mind
01:36:54 -!- Thedarkb has left ("Konversation terminated!").
01:40:46 -!- oren has joined.
01:41:48 <oren> So I disassembled my laptop to see if I could fix the screen problems. Now the screen seems to have stopped acting up... but the touchpad doesn't work.
01:42:51 <Sgeo|phone> Most recent restart seems to be working
01:43:02 <Sgeo|phone> In the sense of getting the Lenovo logo
01:43:19 <Sgeo|phone> Going to try to be patient now
01:44:04 <oren> I've decided I can live with using an external mouse all the time
01:47:29 <MDude> http://wurstcaptures.untergrund.net/music/?oneliner=((t%3E%3E88*(Math.log(6%2Ft)))%2B(t%7C(cos(sin(242%25t)))))%25(t%7C78)%26(83%25t)%20&oneliner2=%20((t%3E%3E88*(Math.log(6%2Ft)))%2B(t%7C(cos(sin(242%25t)))))%25(234%3E%3Et-227%7Ct)*(sin(sin(t)))&t0=1000&tmod=0&duration=30&separation=100&rate=8000
01:49:36 -!- AndoDaan has quit (Quit: Bye).
01:49:43 -!- AndoDaan_ has joined.
01:49:52 <tswett> MDude: well... that's interesting.
01:50:00 -!- AndoDaan_ has changed nick to AndoDaan.
01:51:17 <MDude> I'm trying to figure out how to have it put parens.
01:52:37 -!- FreeFull has quit (Ping timeout: 255 seconds).
01:54:22 -!- FreeFull has joined.
01:55:06 -!- wool has quit (Quit: Page closed).
01:55:50 -!- Sgeo has joined.
01:57:54 -!- Lymee has quit (Ping timeout: 265 seconds).
01:59:31 <tswett> http://mrob.com/pub/ries/zeta.html - this is one of the best-sounding simple mathematical things I know of.
02:00:06 -!- Sgeo has quit (Ping timeout: 244 seconds).
02:02:27 -!- Lymia has joined.
02:15:48 -!- Sgeo has joined.
02:27:52 <Sgeo> I am currently typing this from in a nebula of some sort
02:28:08 <Sgeo> But the resolution is crap, I can barely make out the words I am typing
02:28:21 <oerjan> Sgeo: is it omg full of stars?
02:28:36 <Sgeo> Did you say "omg full of stars"?
02:28:48 <Sgeo> There's a starfield option, but it's crap, it looks like a posterboard near me
02:30:20 <oerjan> so not a real nebula then
02:45:46 <Sgeo> Apparently I should put my glasses on
02:45:51 <Sgeo> Closed right eye, everything got fuzzy
02:45:56 <Sgeo> Which is what reality does too
02:51:11 -!- Lymia has quit (Ping timeout: 250 seconds).
03:07:02 -!- Tritonio has joined.
03:31:28 -!- MDude has quit (Quit: later chat).
03:31:51 -!- MDude has joined.
03:42:11 -!- Lymia has joined.
03:42:38 -!- oren has quit (Quit: Lost terminal).
03:48:15 -!- Sgeo|phone has quit (Quit: AndroIRC - Android IRC Client ( http://www.androirc.com )).
04:48:11 -!- bb010g has quit (Quit: Connection closed for inactivity).
05:06:09 -!- Oolicile has joined.
05:09:55 <Oolicile> Hi
05:10:26 <MDude> Hi
05:11:20 <Oolicile> CDXX
05:13:11 -!- heroux has quit (Remote host closed the connection).
05:14:07 <MDude> CDIII
05:14:15 -!- MDude has changed nick to MDream.
05:16:29 <Oolicile> Ayyy :)
05:16:57 -!- Sgeo has quit (Ping timeout: 244 seconds).
05:17:12 <Oolicile> return: None
05:17:41 <Oolicile> I literally am complete shit at coding, I have no idea how I got here
05:17:56 <Oolicile> Ill just go practice and whateverrr
05:18:38 <MDream> That sounds cool. I'd stay to talk about stuff, but I'm probably better off getting to bed earlier.
05:18:48 <MDream> In terms of being more alert to get things done.
05:19:57 -!- ProofTechnique has quit (Ping timeout: 256 seconds).
05:21:19 <Oolicile> Go sleep MDream, ya need it for the stuff we gonna do tomorrow
05:34:47 -!- zzo38 has joined.
05:39:44 -!- Oolicile has quit (Ping timeout: 246 seconds).
05:40:00 -!- GeekDude has quit (Quit: ZNC - http://znc.in).
05:43:01 -!- Sgeo has joined.
06:01:00 -!- bb010g has joined.
06:09:50 -!- ProofTechnique has joined.
06:10:19 -!- contrapumpkin has joined.
06:11:49 -!- copumpkin has quit (Ping timeout: 250 seconds).
06:23:45 -!- copumpkin has joined.
06:25:24 -!- contrapumpkin has quit (Ping timeout: 252 seconds).
06:41:15 -!- shikhin has joined.
06:44:31 -!- shikhin_ has quit (Ping timeout: 265 seconds).
07:10:41 -!- contrapumpkin has joined.
07:13:55 -!- copumpkin has quit (Ping timeout: 255 seconds).
07:14:19 -!- contrapumpkin has changed nick to copumpkin.
07:21:11 -!- shikhin has quit (Ping timeout: 246 seconds).
07:34:58 -!- Patashu has joined.
08:14:13 -!- heroux has joined.
08:37:21 -!- AndoDaan has quit (Quit: Going, going, gone.).
08:39:47 -!- vanila has joined.
08:39:50 <vanila> hello
09:07:46 -!- oerjan has quit (Quit: leaving).
09:35:44 -!- adu has quit (Quit: adu).
10:13:18 -!- TieSoul_ has joined.
10:13:56 -!- TieSoul has quit (Disconnected by services).
10:14:01 -!- TieSoul_ has changed nick to TieSoul.
10:14:23 -!- hjulle has joined.
10:18:18 -!- Tritonio has quit (Remote host closed the connection).
10:23:57 -!- MoALTz has quit (Ping timeout: 246 seconds).
10:34:55 <b_jonas> zzo38: ping
10:36:04 -!- vanila has quit (Quit: Leaving).
10:39:16 -!- TieSoul_ has joined.
10:39:40 -!- AnotherTest has joined.
10:39:52 -!- TieSoul has quit (Ping timeout: 245 seconds).
10:40:07 -!- TieSoul_ has changed nick to TieSoul.
10:42:10 <b_jonas> zzo38: You have 6 life, opponent has 20. Opponent has Arcane Laboratory and Asceticism in play, you have a Soltari Foot Soldier. It's the start of the opponent's turn, both of you have no lands, but have some suspended Lotus Blooms so that the opponent can play a spell his turn, you can play a spell your turn, and a spell in your next turn, but that's all.
10:42:45 <b_jonas> no wait
10:43:32 <b_jonas> this might not work, I need some more cards
10:43:36 <b_jonas> let me think
10:46:06 -!- AnotherTest has quit (Ping timeout: 252 seconds).
10:46:18 <b_jonas> Right, let you also have a Bedlam in play. Both of you have some matching 1/1 creatures that are vanilla or have no relevant abilities (flying or first strike are ok), and you also have an Echoing Decay.
10:46:51 <b_jonas> Opponent will play a creature, and the only way you can win is by playing the same creature then playing Echoing Decay on your own creature next turn, then beating him with the Soltari.
10:47:10 <b_jonas> zzo38: I think that works for your Babson thing, though it's not very elegant.
10:56:09 -!- AnotherTest has joined.
10:59:02 <b_jonas> (The key is Echoing)
11:00:24 -!- AnotherTest has quit (Ping timeout: 252 seconds).
11:00:39 -!- shikhin has joined.
11:20:46 -!- boily has joined.
11:27:26 -!- MoALTz has joined.
11:30:44 <mroman> @tell AndoDaan https://github.com/FMNSSun/P/tree/master/psrc/std
11:30:44 <lambdabot> Consider it noted.
11:32:19 -!- MoALTz_ has joined.
11:33:47 -!- mhi^ has joined.
11:34:02 -!- MoALTz has quit (Ping timeout: 245 seconds).
11:34:22 -!- MoALTz__ has joined.
11:37:17 -!- MoALTz_ has quit (Ping timeout: 252 seconds).
11:37:56 -!- shikhin has changed nick to shikhout.
11:39:18 -!- shikhout has changed nick to shikhin.
11:40:00 <mroman> My lisp dialect has a module system now
11:40:55 <b_jonas> mroman: is it just the scheme r7rs module system?
11:42:43 <mroman> No there are *.pm files
11:42:54 <mroman> which contain a list of files to load
11:43:02 <mroman> i.e. other *.pm files or *.p files
11:43:06 <mroman> *.p files actually contain code
11:43:13 <mroman> a *.pm is just a list of includes
11:43:17 <mroman> for example
11:43:27 <mroman> map_par.pm includes prelude.pm and map_par.p
11:43:34 <mroman> and prelude.pm includes prelude.p
11:43:49 <mroman> so including map_par.pm results in loading map_par.p and prelude.p
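(A rough sketch of how such a loader could resolve *.pm include lists, assuming a *.pm file is simply one included file name per line; this is guessed from the description above, not mroman's implementation.)

```haskell
import Data.List (nub, isSuffixOf)

-- Hypothetical loader sketch: a *.pm file is assumed to be a plain list of
-- file names (one per line), each either another *.pm or a *.p file with code.
resolve :: FilePath -> IO [FilePath]
resolve path
  | ".pm" `isSuffixOf` path = do
      includes <- lines <$> readFile path
      concat <$> mapM resolve (filter (not . null) includes)
  | otherwise = return [path]          -- *.p: actual code, load as-is

-- Loading map_par.pm would then yield e.g. ["prelude.p", "map_par.p"],
-- deduplicated so prelude.p is only loaded once.
loadOrder :: FilePath -> IO [FilePath]
loadOrder = fmap nub . resolve

-- Expects the .pm/.p files to exist in the working directory.
main :: IO ()
main = loadOrder "map_par.pm" >>= mapM_ putStrLn
```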
11:47:37 <b_jonas> wait what? why do you call them *.pm ? that's the extension we use for perl
11:47:49 -!- Patashu has quit (Ping timeout: 264 seconds).
11:49:31 -!- Frooxius has quit (Read error: Connection reset by peer).
11:49:49 -!- Frooxius has joined.
11:55:10 <boily> @massages-loud
11:55:11 <lambdabot> You don't have any messages
11:58:40 <lifthrasiir> but you have some massages.
12:00:36 <mroman> b_jonas: Yeah
12:00:42 <boily> it's been a long time since I've had a massage. I'm overdue to have my feet lightly marinated, then forcefully reflexed by strong thumbs.
12:00:51 <mroman> with one to three letter extensions it's hard to take anything that hasn't been used yet
12:04:14 <b_jonas> mroman: sure, but still, perl is already complicated because *.pl is used for two things
12:15:52 <mroman> prolog
12:15:57 <mroman> yep
12:16:39 <mroman> @tell boily I can massage you over TCP.
12:16:39 <lambdabot> Consider it noted.
12:16:52 <mroman> elliott taught me
12:17:28 <elliott> I only over UDP-based massages.
12:21:39 <boily> ah, the feeling of a lower ACK massage...
12:21:47 <boily> @clear
12:21:47 <lambdabot> Maybe you meant: clear-auto-reply clear-messages clear-topic learn
12:21:52 <boily> @clear-massages
12:21:52 <lambdabot> Messages cleared.
12:22:17 -!- boily has quit (Quit: ANATOMIC CHICKEN).
12:31:44 <elliott> *offer
12:35:45 -!- int-e has left ("ERUDITE POULTRY").
12:35:48 -!- int-e has joined.
12:41:10 -!- shikhin_ has joined.
12:44:11 -!- shikhin has quit (Ping timeout: 264 seconds).
12:44:11 -!- Koen_ has joined.
12:49:59 <b_jonas> these fast convolution algorithms (various kinds of fast fourier transforms on various types of elements, plus the Nussbaumer convolution algorithm) are magical, and I think I should try to understand them at least a little.
12:50:10 <b_jonas> I probably won't fully understand them, but I should try to understand them partially.
12:55:35 <int-e> b_jonas: are you familiar with polynomial rings?
12:59:57 <int-e> If so, http://cr.yp.to/papers.html#m3 is a very terse but uniform description of various tricks surrounding fast multiplication (including FFT).
13:02:32 <b_jonas> int-e: there's also the Cormen-Leiserson-Rivest-Stein book which gives a nice description
13:02:36 <b_jonas> (and Knuth of course)
13:03:20 <b_jonas> thanks for the reference
13:07:38 <b_jonas> in particular, I'd like to understand whether fourier transforms on real or complex numbers (whether approximated with fixed point or floating point) are really always more useful for convolution than ones on finite fields (called "number theoretic transforms" for some reason), or just more popular.
13:07:49 <b_jonas> But I might not be able to answer this.
13:09:26 -!- shikhin_ has changed nick to shikhin.
13:12:17 -!- MDude has joined.
13:15:59 -!- MDream has quit (Ping timeout: 264 seconds).
13:19:47 <int-e> b_jonas: as a data point, https://gmplib.org/repo/gmp-6.0/file/2ff56d3c5dfe/mpn/generic/mul_fft.c does FFT modulo various 2^N+1 (note that for normal FFT's, you need 2 to be invertible, and a suitable root of unity; you don't actually need a field.)
13:20:58 <b_jonas> int-e: I see
13:21:38 <int-e> The existence of the root is ensured by making N divisible by a large enough power of 2. (If N = 2^k l, then 2^l is a 2^(k+1)-th root of unity modulo 2^N+1)
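(A quick sanity check of that claim in Haskell, with illustrative parameters N = 2^4 * 29 = 464; this only verifies the algebra and is not how GMP's mul_fft.c does it.)

```haskell
-- With N = 2^k * l, 2^l is a 2^(k+1)-th root of unity mod 2^N+1,
-- because (2^l)^(2^k) = 2^N, which is -1 modulo 2^N+1.
k, l, n :: Int
k = 4
l = 29
n = 2 ^ k * l               -- N = 464

m :: Integer
m = 2 ^ n + 1               -- the modulus 2^N+1

-- Multiplying by the root 2^l is just a shift; reducing mod 2^N+1 is
-- basically one subtraction, since 2^N is congruent to -1.
mulRoot :: Integer -> Integer
mulRoot x = let (hi, lo) = (x * 2 ^ l) `divMod` (2 ^ n)
            in (lo - hi) `mod` m

main :: IO ()
main = do
  print (iterate mulRoot 1 !! (2 ^ (k + 1)) == 1)      -- a 2^(k+1)-th root of unity
  print (iterate mulRoot 1 !! (2 ^ k)       == m - 1)  -- ... of exactly that order
```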
13:23:38 <b_jonas> int-e: I think I can see the basic problem, namely that for many practical applications, you'd have to use a 32-bit integer to represent an element of the field, but then to do four multiplications in that field, you need not one instruction, but like ten: you need two separate 32-bit-to-64-bit multiplications, some rearrangements, then two 32-bit-to-32-bit multiplications and lots of extra instructions to do the modulus by a constant.
13:25:02 <int-e> Yes, finite fields are a bit awkward. :)
13:26:17 <int-e> Intel is doing something about that for small GF(2^k) at least: http://en.wikipedia.org/wiki/CLMUL_instruction_set
13:26:24 <b_jonas> with real numbers, these smart modern algorithms avoid _most_ of the penalty of having to use complexes rather than reals, and you just have to use single multiplications with reals, or a few instructions with reals.
13:27:17 <b_jonas> int-e: that's for cryptography, but is GF(2^k) relevant for convolutions? I thought you need a field with characteristic at least as large as the size of data in my input and output,
13:27:30 <int-e> b_jonas: the modulo 2^N+1 thing is really quite clever: Multiplying by the root of unity is a shift (or perhaps just an offset for addresses) and reduction modulo 2^N+1 is basically one subtraction.
13:27:43 <b_jonas> e.g. if I want to convolve image data with 8-bit deep pixels, I need at least 16-bit deep data, in fact more because you have to add the logarithm of the array size.
13:27:56 <int-e> b_jonas: probably not :)
13:28:49 <b_jonas> I know you said you don't really need a field, but still, I don't see how GF(2^k) would work here
13:30:40 <int-e> b_jonas: I'm a victim of free association :P
13:32:19 <int-e> b_jonas: Perhaps one can implement multiplication over fields of order 2^(kn).
13:32:40 <int-e> But mostly it just doesn't work. (You can't even use the base 2 FFTs, because 2 = 0)
13:45:39 <Jafet> A cute use for FFT is string searching with single-letter wildcards.
13:47:27 <Jafet> Some people (such as GMP developers) prefer finite fields because they don't want to deal with floating point rounding.
13:50:00 -!- MoALTz__ has quit (Ping timeout: 244 seconds).
13:50:12 -!- MoALTz has joined.
13:52:46 <Jafet> (or finite-field-like-rings)
13:52:57 <Jafet> `thanks rings
13:53:20 <b_jonas> Jafet: it needn't be floating point. you can represent the complexes with fixed point numbers. the Knuth book explains how that's supposed to be enough to work for Schonhage-Strassen integer multiplication.
13:53:31 <int-e> Jafet: they're only using that for really large numbers; the smallest N for i386 is 464, which has to be multiplied by approximately 32, I guess.
13:54:03 <int-e> (to get a number of bits)
13:54:04 <Jafet> You still need to consider rounding for fixed-point numbers.
13:54:10 <b_jonas> int-e: yes, because there's other algorithms to multiply integers
13:54:12 <b_jonas> hmm, I wonder
13:54:18 <int-e> Hmm. "sufficiently field-like rings".
13:54:57 <b_jonas> are there also simple algorithms to convolve polynomials that aren't O(n*polylog(n)) operations like FFT-based ones, but instead just O(n**p) where p is between 1 and 2?
13:55:20 <b_jonas> probably there are, analogous to integer multiplication
13:55:49 <int-e> b_jonas: But you're probably right that there's no floating point FFT in libgmp. At least I don't recall seeing one.
13:56:16 <b_jonas> the question is whether these can be faster than FFT stuff in reality
13:56:28 <b_jonas> maybe they can, I should try to read up on that
13:59:11 <int-e> b_jonas: sure; the integer multiplications are implemented as polynomial multiplications after all. So the Karatsuba trick is: (ax + b)(cx + d) = ac x^2 + ((a+b)(c+d) - ac - bd) x + bd
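(For concreteness, a compact, unoptimized Haskell sketch of that Karatsuba recurrence on coefficient lists; the cutoff and representation are arbitrary illustrative choices.)

```haskell
-- Karatsuba multiplication on coefficient lists (lowest degree first).
import Data.List (foldl')

addP, subP :: [Integer] -> [Integer] -> [Integer]
addP = zipWithLong (+)
subP = zipWithLong (-)

-- Pointwise combine with implicit zero padding to the longer length.
zipWithLong :: (Integer -> Integer -> Integer) -> [Integer] -> [Integer] -> [Integer]
zipWithLong f xs ys = take (max (length xs) (length ys))
                           (zipWith f (xs ++ repeat 0) (ys ++ repeat 0))

-- Schoolbook O(n^2) multiplication, used below the cutoff.
mulNaive :: [Integer] -> [Integer] -> [Integer]
mulNaive xs ys = foldl' addP [] [replicate i 0 ++ map (x *) ys | (i, x) <- zip [0..] xs]

-- (a x^h + b)(c x^h + d) = ac x^2h + ((a+b)(c+d) - ac - bd) x^h + bd
karatsuba :: [Integer] -> [Integer] -> [Integer]
karatsuba xs ys
  | length xs < 8 || length ys < 8 = mulNaive xs ys
  | otherwise = shift (2 * h) ac `addP` shift h mid `addP` bd
  where
    h         = max (length xs) (length ys) `div` 2
    (b, a)    = splitAt h xs             -- xs = a x^h + b
    (d, c)    = splitAt h ys             -- ys = c x^h + d
    ac        = karatsuba a c
    bd        = karatsuba b d
    mid       = karatsuba (addP a b) (addP c d) `subP` ac `subP` bd
    shift k p = replicate k 0 ++ p

main :: IO ()
main = print (karatsuba [1..10] [1..10] == mulNaive [1..10] [1..10])
```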
13:59:48 <b_jonas> int-e: right, the question is how fast that could be for the kind of operations I'm interested in
14:00:22 <b_jonas> most likely, you still can't get away with only 16-bit multiplications, even if the input is only 8 bits in size
14:00:43 <b_jonas> so you need 32 bits for all intermediate results
14:00:54 <int-e> b_jonas: what kind of platform are you targeting?
14:01:22 <int-e> (32 bit arithmetic doesn't scare me)
14:01:40 <b_jonas> int-e: let's say modern x86_64 cpus with SSE4_1 instructions and a decent cache, so 32 bit arithmetic is available, it's just more expensive than 16 bit one
14:01:47 <b_jonas> because you need twice the memory
14:02:28 <b_jonas> but of course I'd like to know a bit in general, not just answers for this
14:03:00 -!- ProofTechnique has quit (Ping timeout: 252 seconds).
14:04:00 <b_jonas> it may matter a lot about the speed how the intermediate results fit in the caches
14:05:10 <b_jonas> and of course it matters whether I'm doing merely 1D convolution or 2D
14:05:13 <int-e> have you mentioned the degree of the involved polynomials?
14:05:13 <Jafet> Since the convolution algorithms don't use division, it doesn't matter what the register size is, you should still get the remainder of the correct result (for integer polynomials)
14:05:46 <int-e> I gathered that the coefficients are 8 bit numbers (unsigned? signed? it probably doesn't matter much.)
14:06:27 <b_jonas> Jafet: yes, but even then, the final result in each digit is the sum of n multiplications of two 8-bit results, where n is your array size, which is still more than 16 bits
14:06:33 <b_jonas> Jafet: so no
14:07:01 <b_jonas> int-e: 8 bit unsigned, but I'm also interested in cases where you want more precision than that
14:07:29 <int-e> b_jonas: I don't know how you're hoping to fit the final result into 16 bits in the first place.
14:07:30 <b_jonas> and I'm thinking mostly of degrees up to 2048
14:07:33 <Jafet> In that case, you need more than 16 bits to store the result no matter what algorithm is used.
14:07:46 <b_jonas> int-e: yes, even the final result doesn't fit
14:07:52 <b_jonas> pity
14:09:08 <int-e> 28 bits, hmm. that's tight.
14:09:32 <Jafet> Another cute thing about the FFT mod 2^N+1 is that it always gives you the correct result mod 2^N+1, which is useful for people playing with Fermat numbers.
14:09:48 <Jafet> s/FFT/FFT multiplication/
14:12:18 -!- Koen_ has quit (Quit: The struct held his beloved integer in his strong, protecting arms, his eyes like sapphire orbs staring into her own. "W-will you... Will you union me?").
14:19:02 <int-e> Ah, the CHICKEN! have a history of Poulet! which appeared out of nowhere on 2012-08-08.
14:19:54 <mroman> A sufficiently round chicken in space is just a dwarf-planet.
14:20:25 <mroman> That can spawn asteroids.
14:22:04 <int-e> http://int-e.eu/~bf3/tmp/eboilution.txt
14:36:55 -!- ProofTechnique has joined.
14:42:59 <b_jonas> you're right, using 16 bit arithmetic would be totally unrealistic
14:43:39 <b_jonas> I should try to imagine how the Karatsuba-based convolution would really work, as in, how much temporary space it uses and then what operations it does on what arrays in what order
14:53:55 -!- Frooxius has quit (Quit: *bubbles away*).
14:58:17 -!- AnotherTest has joined.
15:00:39 -!- Frooxius has joined.
15:01:00 -!- `^_^v has joined.
15:05:20 -!- GeekDude has joined.
15:05:20 -!- GeekDude has quit (Changing host).
15:05:20 -!- GeekDude has joined.
15:20:12 -!- skj3gg has joined.
15:28:28 -!- skj3gg has quit (Quit: ZZZzzz…).
15:29:49 -!- skj3gg has joined.
15:35:46 -!- AnotherTest has quit (Ping timeout: 252 seconds).
15:39:45 -!- skj3gg has quit (Quit: welp, see you later.).
15:52:57 -!- newsham has quit (Read error: Connection reset by peer).
15:53:14 -!- newsham has joined.
16:05:52 -!- koo7 has quit (Ping timeout: 240 seconds).
16:07:38 -!- koo7 has joined.
16:16:18 -!- trn has quit (Remote host closed the connection).
16:20:35 -!- trn has joined.
16:29:40 -!- shikhin has quit (Ping timeout: 255 seconds).
16:29:42 -!- Koen_ has joined.
16:37:30 -!- cpressey has joined.
16:38:42 * cpressey is beating his head over what exactly is "total" about total functional programming that uses corecursion and codata to work on "infinite" data structures
16:39:25 <cpressey> they do realize that on a finitary computer, those "infinite" data structures aren't actually infinite, and when you try to realize them, it... doesn't terminate. don't they?
16:39:43 <cpressey> hi, btw
16:41:26 <Koen_> hi cpressey
16:44:18 <cpressey> hi Koen_
16:44:46 -!- mihow has joined.
16:44:54 -!- Phantom_Hoover has joined.
16:45:15 <Koen_> I witnessed a Master 2 class (Master 2 meaning fifth year of college) where the students were asked the complexity of an algorithm about inverting n x n matrices
16:45:24 <Koen_> apparently they agreed the complexity was n
16:45:28 <int-e> cpressey: they produce böhm trees without bottoms. that's nice.
16:45:52 <int-e> (in the limit, of course)
16:46:12 <elliott> cpressey: well, the distinguishing property of codata is productivity; you don't consider the fully unrolled trees, but only what you get after "one step"
16:46:28 <int-e> > [1..] -- and initial segments can be computed in finite time
16:46:30 <lambdabot> [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,...
16:46:43 <int-e> (that is productivity)
16:46:47 <elliott> "looking further" is considered an explicit operation rather than something you can ascribe to the result in the abstract
16:47:23 <Koen_> so if you try to "observe" it you're disturbing it? sounds quantum
16:47:32 <int-e> elliott: that's pretty lazy :)
16:47:36 <elliott> if you have "f : A -> Colist B", you know that (f x) is always something, you can look at as much of it as you want
16:47:42 <elliott> (whereas in Haskell it could bomb out)
16:48:22 <int-e> > filter odd [2,4..]
16:48:25 <cpressey> elliott: that grates against my idea of "total". w.r.t. how "total" is used in computability, anyway. *how do you know how much of it you want*
16:48:26 <lambdabot> mueval-core: Time limit exceeded
16:48:27 <elliott> cpressey: an alternative way of thinking about it is that functions are codata too. (Nat -> A) is exactly the same as (Colist A), and it's still "total" because we can look at as many "elements" as we want, even if the tree of them is infinitely deep
16:49:00 <b_jonas> cpressey: it's funny how people still try to invent total turing complete languages
16:49:34 <elliott> cpressey: er, Stream A.
16:49:37 <cpressey> b_jonas: https://github.com/wouter-swierstra/
16:49:38 <elliott> Colist can end.
16:49:59 <cpressey> sorry, wrong url
16:49:59 <int-e> codata allows that. the non-termination comes from trying to write a codata structure (like a stream) out completely.
16:50:02 <elliott> Colist A would be, like, {f : Nat -> Maybe A | forall i, isNothing (f i) -> forall j, j >= i -> isNothing (f j)}.
16:50:31 <elliott> you can probably come up with a function type for, like, potentially infinite binary trees too, it'll just be annoying. this gets kinda domain theory-ish
16:50:33 <cpressey> https://github.com/wouter-swierstra/Brainfuck/ and my inquiry https://github.com/wouter-swierstra/Brainfuck/issues/1
16:51:06 <elliott> like, you can do Partial A as {f : Nat -> Maybe A | forall i j, i >= j -> f i >= f j}, where Just >= Just, Just >= Nothing, Nothing >= Nothing
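(The shape of that codata partiality monad, sketched in Haskell; Haskell is not total, so this only illustrates the structure, and the names Now/Later/runFor are illustrative.)

```haskell
-- The codata partiality monad: a value either exists Now, or is one step Later.
data Partial a = Now a | Later (Partial a)

instance Functor Partial where
  fmap f (Now a)   = Now (f a)
  fmap f (Later p) = Later (fmap f p)

instance Applicative Partial where
  pure = Now
  Now f   <*> p = fmap f p
  Later f <*> p = Later (f <*> p)

instance Monad Partial where
  Now a   >>= f = f a
  Later p >>= f = Later (p >>= f)

-- The "observe n steps" view: the same thing as a monotone Nat -> Maybe a.
runFor :: Int -> Partial a -> Maybe a
runFor _ (Now a)   = Just a
runFor 0 (Later _) = Nothing
runFor n (Later p) = runFor (n - 1) p

-- The never-terminating value, "fix later" in the discussion further down.
never :: Partial a
never = Later never

main :: IO ()
main = print (runFor 5 (Later (Later (Now "done"))))   -- Just "done"
```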
16:51:19 <cpressey> afaict the author wrote this to demonstrate that you can have something which is both total and Turing-complete
16:51:55 <cpressey> which may be true, for his definition of total -- but it's hard to fit it into my idea of total.
16:51:57 <int-e> "A system is total iff it always produces a final result." <-- but the totality refers to functions whose codomain can be codata.
16:52:01 <elliott> my definition of "total functional programming" is just that every function is total
16:52:05 <elliott> not considering it as a global property
16:52:18 <cpressey> int-e: right. and to me, an infinite list is not a final result
16:52:28 <elliott> cpressey: let's put it this way -- it's the same argument as haskell being pure
16:52:37 <elliott> from the point of view of running a haskell program, obviously it's not
16:52:42 <elliott> the trick is you have an RTS to walk the pure results
16:52:55 <elliott> er, *the pure descriptions of effects
16:52:58 <elliott> here, you have [something] to walk the brainfuck execution trail
16:53:10 <int-e> (This is closing a circle. In math, it's perfectly clear what a total function is. If you consider functions on natural numbers, then that notion happens to coincide with terminating functions on a Turing machine. But *defining* totality through termination is wrong once infinite data is involved.)
16:53:15 <elliott> (Agda's IO is actually codata, so you can write a full Agda brainfuck interpreter and have the RTS chase the codata for you, potentially forever.)
16:53:26 <int-e> (I'm *very* sloppy)
16:54:19 <int-e> (what I mean is that total, partial recursive functions on naturals are terminating)
16:54:29 <elliott> https://personal.cis.strath.ac.uk/conor.mcbride/pub/Totality.pdf this is about this exact topic, and might help.
16:54:41 -!- hjulle has quit (Ping timeout: 256 seconds).
16:55:00 <elliott> McBride is probably the strongest advocate of totality =/= turing-incompleteness. he even told me off about it once even though I agree :(
16:55:22 <int-e> (Oh I see that Wouter is attacking the same point.)
16:55:40 <cpressey> I will probably just stop using the term "total", honestly.
16:56:40 <elliott> I mean, totality in this sense is still a good guarantee.
16:56:49 <int-e> As I wrote, it's a perfectly sensible term for saying that a partial function, is, in fact, nowhere undefined.
16:56:51 <elliott> you know your programs won't get "stuck"
16:56:57 <cpressey> int-e: well yes
16:56:58 <cpressey> ok look
16:57:05 <cpressey> i can't talk to both of you at once
16:57:13 <cpressey> i can barely talk to one of you at once
16:57:15 <elliott> (though of course if you just go back and model general recursion again you don't gain anything from the internal POV)x
16:57:18 <elliott> heh
16:57:23 <cpressey> ok nevermind
16:57:30 <elliott> I'll shut up
16:57:31 <elliott> *-x
16:57:57 <int-e> I'm afk for a bit anyway.
16:57:59 <int-e> :-P
16:58:10 -!- bb010g has quit (Quit: Connection closed for inactivity).
16:58:13 <MDude> Often the limit as to how many computation steps a computer can afford to take involves factors beyond the architecture of the computer itself.
16:59:11 <Koen_> heat comes to mind
16:59:51 <cpressey> I'm going to at least stop using "total" to describe Turing machines that always halt (sorry Kozen), because if you want to look at it as *each transition* of the TM, with the *option* to GET the next transition, well then of COURSE each transition terminates.
16:59:55 <cpressey> it's not helpful.
17:00:09 <cpressey> (to look at it that way. at least, not to me.)
17:00:13 <MDude> Until zeno machines exist, I'm not too worried about the contradictions that might happen if my turing machines runs an infinite amount of time.
17:00:39 <elliott> I guess this use of "total" to describe languages comes from Turner's Total Functional Programming, as the obvious generalisation of talking about writing total functions.
17:00:50 <MDude> *run
17:01:03 <cpressey> elliott: yes. linked to in that github issue. i read it (mostly). some parts i like. others, not so much
17:01:51 <cpressey> like... you can't write a self-interpreter in this... but you are thinking you might write an operating system in it? really?
17:02:01 <MDude> Well, it is distinct from a machine that could get stuck infinitely trying to process one command.
17:03:39 <cpressey> MDude: I'm a bit "worried" about such contradictions, but they don't keep me up at night.
17:03:45 <elliott> I mean, you can write a self-interpreter in a way.
17:03:53 <elliott> you can't write ProgramWithType A -> A.
17:03:54 -!- oren has joined.
17:03:55 <MDude> I guess you could call such machines micricode-total.
17:04:01 <elliott> but you can write ProgramWithType -> Partial A, with the codata partiality monad.
17:04:01 <MDude> *microcode-total.
17:04:11 <MDude> Or just micro-total.
17:04:30 <elliott> you can write the computation internally (like you can model IO in Haskell), you just can't prove that it actually always finishes, because Goedel.
17:04:52 <elliott> I think that paper defines 0 / 0 = 0 instead of doing something reasonable though. and probably doesn't cover codata or dependent types? I wouldn't want to program in "Haskell with a termination checker"
17:04:56 <cpressey> elliott: partiality monad in a total functional language, is that right?
17:05:08 <elliott> cpressey: like IO monad in a pure functional language, sure
17:05:23 <b_jonas> zzo38: ping?
17:05:31 <cpressey> entirely black horse with white legs
17:05:33 -!- ais523 has joined.
17:05:34 <elliott> you can define it easily with a normal codata mechanism, even prove things terminate and so on
17:05:48 <cpressey> a.k.a. oxymoron
17:05:50 <elliott> if you have an RTS that will chase it forever, you can use it to write any potentially non-terminating program you want, like that brainfuck interpreter does
17:06:35 <ais523> ooh, cpressey is here?
17:06:40 <cpressey> ais523: no
17:06:42 <ais523> that might be enough to make me pay attention to the channle
17:06:43 <cpressey> ais523: I'm Phantom_Hoover
17:06:45 <ais523> ah right
17:06:46 <elliott> cpressey: you can define the partiality monad regardless of language support for doing things for it. it's like saying the ability to define "data MyIO = GetChar (Char -> MyIO) | PutChar Char MyIO | Halt" makes Haskell an impure language *shrug*
17:06:48 <cpressey> Phantom_Hoover: eat that
17:07:01 <Phantom_Hoover> i'm cool with it, i'm actually aloril
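(For illustration, that MyIO description together with a small interpreter that plays the role of the RTS walking the pure description of effects; the run function and example program are added here, not quoted from the channel.)

```haskell
-- A pure description of effects, as in elliott's example.
data MyIO = GetChar (Char -> MyIO) | PutChar Char MyIO | Halt

-- A toy program described purely: echo one character.
echoOnce :: MyIO
echoOnce = GetChar (\c -> PutChar c Halt)

-- The impure part lives entirely in the interpreter ("RTS").
run :: MyIO -> IO ()
run (GetChar k)   = getChar >>= run . k
run (PutChar c k) = putChar c >> run k
run Halt          = return ()

main :: IO ()
main = run echoOnce
```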
17:11:43 <cpressey> having a total function from partial objects to partial objects is nice. i'm not contesting that. i'm trying to say that it doesn't tell you much about the properties of the program that uses them. and that's really the important part, isn't it
17:11:47 <cpressey> ?
17:12:18 <elliott> I don't personally see, e.g. (Colist Nat) as a partial object
17:12:23 <elliott> this is really deep into definition-arguing though
17:12:52 <elliott> like, I can't see any way to claim (f : Colist A -> Colist B) can't be total without claiming (f : (Nat -> A) -> (Nat -> B)) can't be total, since the two are exactly isomorphic.
17:13:35 <elliott> erm. *Stream for Colist there (sigh)
17:13:56 * cpressey sobs
17:14:07 * elliott too old for this
17:14:15 <cpressey> har
17:14:37 <elliott> how do you still have the ability to program computers without burning out
17:14:37 <cpressey> in other news, no one here will care, but Cat's Eye Technologies has a Twitter account now. https://twitter.com/catseye_tc
17:14:54 <cpressey> um... I eat a lot of fruit?
17:14:54 <pikhq> The best of news.
17:14:57 <cpressey> actually I don't
17:15:02 <elliott> this is a lot of fancy unicode quotes
17:15:20 -!- Froox has joined.
17:15:30 -!- Frooxius has quit (Read error: No route to host).
17:15:30 <cpressey> not as many as some of the weirdos I'm following
17:15:51 <cpressey> they're much better at it than I
17:18:58 <ais523> unicode quotes as in
17:19:00 <ais523> !quote unicode
17:19:05 <ais523> or as in “”?
17:19:08 <ais523> err
17:19:10 <ais523> `quote unicode
17:19:12 <cpressey> elliott: would learning about corecursion help me with my consternation here, or should I just give up? keep in mind that my understanding of monads is, after all these years, still fuzzy, at best
17:20:07 <elliott> cpressey: it might, yes, but you might still just come down to not liking how the terminology is used
17:20:20 <elliott> I don't know of a good introduction to codata though
17:20:32 <elliott> http://homepages.inf.ed.ac.uk/wadler/papers/free-rectypes/free-rectypes.txt is classic, at least
17:20:38 <cpressey> isn't codata just the subject of corecursion?
17:20:44 <elliott> but, uh, pretty technical
17:20:46 <elliott> cpressey: sure
17:21:22 <cpressey> i confess, there's another angle i'm interesting in corecursion for
17:21:39 <elliott> the two-sentence explanation is "data is easy to produce but you have to consume it carefully (i.e., structural recursion)", "codata is easy to consume but you have to produce it carefully (i.e., "guarded" corecursion; making sure you're "productive")"
17:21:42 <cpressey> /ing/ed/
17:21:45 <elliott> corresponding to folds and unfolds specifically. blah blah blah F-coalgebras
17:21:57 <cpressey> right ok sure
17:22:41 <elliott> I'm sorry for getting into a terminology argument, anyway. I really don't care about it, heh
17:23:24 <elliott> I do think "languages where every function is total and your data recursion is structural and your only infinite data is properly guarded codata are useless for doing actual things, because they're not TC" is too prevalent an attitude but people should stop equating general usefulness with TCness anyway
17:23:44 <cpressey> I agree wholeheartedly with that
17:24:55 -!- shikhin has joined.
17:25:04 <cpressey> btw, I think I saw an article on HN about HTML+CSS3 being TC -- it came back from an unrelated search & I did not click on it but ISTR there was some discussion on that at one point in here
17:25:20 <elliott> yeah it was one of those "by TC I mean FSA" type things
17:25:31 <cpressey> thought it might be
17:25:37 <cpressey> there's a reason i don't get my news from HN
17:25:52 <elliott> well, clearly you do :p
17:26:11 <ais523> cpressey: that article made me sign up to Reddit to debunk it
17:26:18 <ais523> it's not TC due to not having infinite memory
17:26:25 <cpressey> ais523: thank you
17:26:30 <ais523> assuming it's the one I'm thinking of
17:26:45 <cpressey> i really *don't* read HN, so I don't know, but it probably was
17:26:49 <ais523> it's possible you could salvage it using WebForms, but nobody's figured out how yet
17:27:22 -!- Phantom_Hoover has quit (Ping timeout: 245 seconds).
17:27:28 -!- Phantom__Hoover has joined.
17:27:42 <Phantom__Hoover> i've had some really frustrating conversations with people who can't accept that the human brain isn't TC
17:28:19 <ais523> I think you need to talk about things like quantum uncertainty to prove it isn't, though
17:28:30 <ais523> human brain is analog in time, and you can store an infinite amount in a single real number
17:28:33 <cpressey> I think we need about 8 more words that are variations on TC
17:28:37 <ais523> but you can prove that it can't measure a time delay accurately enough
17:29:00 <cpressey> even then, maybe it's just pointless
17:29:03 <ais523> cpressey: well I've been thinking a lot about how to define TCness because of the 2,3 Turing machine thing
17:29:31 <cpressey> ais523: I have a something too, based on thinking about Wang tiles
17:29:52 <cpressey> actually's it's just an old thing
17:30:10 <ais523> my latest approach has been along the lines of "a language is TC if you can compile any TC-language program to it, via a function that can output the 'nth' byte of the resulting program in finite time"
17:30:14 <ais523> err, O(1) time, that is
17:30:28 <ais523> but I'm getting bogged down in details of what operations you're allowed
17:30:44 <cpressey> ais523: ah, so I give the "compiler" the source program, and n, and it outputs "byte" number n ?
17:30:54 <cpressey> in constant time
17:31:05 <ais523> yep
17:31:06 <elliott> ais523: so you mean, the program that produces the output program has to give a productive colist of bytes back, huh
17:31:09 <ais523> well, symbol, not byte
17:31:11 <elliott> how appropriate
17:31:15 <cpressey> or maybe O(x) where x is the length of the source program, no?
17:31:21 <cpressey> because it's gotta read it
17:31:24 <cpressey> and it's gotta read n
17:31:28 <ais523> cpressey: oh, I'm treating the source program as fixed here
17:31:33 <cpressey> ah, ok
17:31:47 <ais523> as in, for any given source program, we can produce the nth symbol of the target-language program in finite time
17:31:49 <cpressey> but the "compiler" is invariant of the choice of source program?
17:32:10 <ais523> oh, ooh, yeah we have to be careful here
17:32:30 <ais523> definitions get really complex when you're trying to avoid exploits while still allowing as much as possible
17:32:30 <cpressey> i assume we can't just OPTOMIZE it for each program we want to compil
17:32:31 <cpressey> e
17:32:55 * cpressey should really be "optomizing" a database query right now
17:33:16 <Phantom__Hoover> ais523, so are you saying that the compiler has to be 'local' when emitting a byte, basically?
17:33:30 <ais523> I guess the compiler has to be a fixed program, which can scale with the size of the source program, but not the target program
17:34:11 <ais523> Phantom__Hoover: I'm not quite sure what you mean
17:34:25 <cpressey> well, in comparison, my insight was simple: Wang tiling ~= Conway's GoL ~= CA-complete ~= can simulate any Turing machine which *never* halts
17:34:28 <cpressey> that's all
17:34:53 <Phantom__Hoover> well, that the time it takes to emit the nth byte can't increase without bound
17:34:56 <ais523> aha, we define it like this: a compiler is a single program, which, for any given source program, there is some finite bound within which it can produce any given byte of the target program
17:35:00 <cpressey> ~= meaning "computationally equivalent" because I don't do fancy unicode in IRC (yet)
17:35:23 <ais523> I do fancy Unicode in IRC all the time, but that one isn't on my compose key
17:35:54 <Phantom__Hoover> i.e. what you're trying to prevent is situations where the compilation process is introducing an unbounded amount of computational 'strenght'
17:35:56 <Phantom__Hoover> *strength
17:36:39 <ais523> right
17:37:08 <ais523> so for example, imagine a programming language which takes an infinitely long program as input, and only has two commands, "halt" and "nop"
17:37:30 <ais523> you can compile into that language by running your input program for n steps, then outputting "halt" or "nop" based on whether it halted within those n steps
17:37:32 <ais523> to output the nth command
17:37:49 <Phantom__Hoover> presumably because the trouble with the 2,3 turing machine was that, in some sense, most of the computation was done in that way?
17:37:53 <ais523> the problem I'm having is that, to define O(1), you need to define which operations you're allowed
17:37:59 <ais523> Phantom__Hoover: actually it wasn't
17:38:04 <ais523> my problem is proving that it wasn't thouhg
17:38:06 <ais523> *though
17:38:13 <ais523> which means defining what it means to do that
17:38:18 <Phantom__Hoover> ah
17:39:27 -!- vanila has joined.
17:39:34 <vanila> hello
17:40:18 <ais523> hi vanila
17:42:03 <cpressey> ais523: this sounds reminiscent of one part of Minsky's old book where he frets about mappings between languages wrt TC-ness. He's not conclusive. I think he says the mapping function should not be more than primitive-recursive
17:42:31 <ais523> that's close to the argument I made in the first submission of the 2,3 Turing machine proof
17:42:32 <cpressey> if you allow any PR function, that still gives you an awful lot of leeway
17:42:55 <ais523> however, primitive recursive programs can't run for infinite time
17:42:59 <ais523> so they can't produce infinite output
17:43:00 <cpressey> and if we're trying to find the "smallest" TC things, it seems likely we're going to get into the territory of these "powerfulish" mappings
17:43:12 <cpressey> ais523: yeah, but a mapping doesn't need to
17:43:20 <cpressey> you're just mapping one finite program to another
17:43:27 <ais523> no, the problem with the 2,3 proof is
17:43:30 <ais523> the input program is infinitely long
17:43:38 <cpressey> ?!
17:43:38 <lambdabot> Maybe you meant: v @ ? .
17:43:56 <cpressey> no lambdabot, I meant an interrobang, but as I said, no fancy unicode yet
17:43:59 <ais523> because it has to initialize the whole tape of the Turing machine
17:44:09 <ais523> and the pattern it initializes it with isn't finite or repeating
17:44:27 <cpressey> yeah that ain't right
17:44:39 <cpressey> in my view, it needs to lazily create that pattern as it uses that part of the tape
17:44:46 <cpressey> and the complexity of doing so needs to be factored in
17:45:08 <ais523> so you would consider the 2,3 machine to be sub-TC, then
17:45:29 <ais523> I still haven't found the simplest pattern that works
17:45:31 <cpressey> ("needs to" ... er... simpler to consider systems that only do that, rather than ones that need a "prepared tape")
17:45:31 <vanila> An operating system made in common lisp - https://dl.dropboxusercontent.com/u/46753018/Screen%20Shot%202015-01-19%20at%2001.29.31.png
17:45:39 <cpressey> (is that like a prepared piano, John Cage...?)
17:46:01 <ais523> but suspect it's along the lines of "infinitely repeating pattern, except that all cells with index 2^n are replaced with the same specific value as an exception to the pattern"
17:46:15 <cpressey> ais523: I guess I... not exactly; I would consider it to be an abbreviated description of a TC system which, if not abbreviated, would be >2,3
17:46:22 <cpressey> or in other words, yes
17:46:55 <ais523> have you heard of the language "sequential tag"? it's something that #esoteric sometimes discusses
17:47:19 <cpressey> vanila: nice! (is the nethack written in lisp too?)
17:47:22 -!- arjanb has joined.
17:47:23 <ais523> basically, it has infinitely long programs; each command in the program is a sequence of subcommands, either "run" or "skip"
17:47:39 <cpressey> ais523: I... don't think I have
17:47:49 <ais523> all data is stored in one queue, which is initialized as "run"
17:47:50 <cpressey> oh, infinitely long THIS, infinitely long THAT, what madness is this
17:48:07 <cpressey> (might make more sense if you saw my rant about "total" earlier)
17:48:33 <vanila> it's just telnet talking to nethack server, but the mandelbrot and stuff is in lisp :)
17:48:40 <ais523> then repeatedly, you pop the start of the queue, if it's "run" you append the current command of the program to the end of the queue and move to the next command, if it's "skip" you move to the next command of the program without appending anything to the queue
17:48:47 <ais523> and it halts if the queue ever becomes empty
17:49:19 <ais523> it's one of the easiest languages to implement, if you have the ability to read an infinitely long program
17:49:25 <ais523> because it's so simple
17:49:41 <ais523> and it's TC if you allow an infinitely repeating input program
17:50:39 <cpressey> um, hm.
17:51:01 <ais523> (because with an infinitely repeating input program, it becomes cyclic tag)
17:51:34 <cpressey> wait, i totally do not follow. the queue is not a queue of... maybe i should look this up on the wiki
17:51:55 <vanila> I still find it hard to accept that cyclic tag machines can be TC, I get how they are but.. they shouldn't be..
17:52:04 <ais523> I'm not sure it's on the wiki
17:52:07 <ais523> but the queue is of "run" and "skip"
17:52:29 <ais523> vanila: they "feel" sub-TC?
17:52:37 <cpressey> ok, so you pop "run", and the "current command" is -- the next thing on the queue?
17:53:13 <cpressey> does not appear to be on the wiki, btw
17:53:46 <vanila> yes - but just because my intuition is wrong I guess
17:54:12 <ais523> cpressey: the next command in the program
17:54:18 <ais523> the queue tells you whether to run or skip each command in the program
17:54:34 <ais523> and when you run a command, it shifts a specific sequence onto the end of the queue
17:54:49 <cpressey> ok, so the queue is not infinite, and the program is not on the queue
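(A direct Haskell transcription of the sequential tag rules as described above; the example program in main is arbitrary, and an infinitely repeating program is obtained with cycle.)

```haskell
-- Sequential tag, as read from the description: the program is a (lazy,
-- possibly infinite) list of commands, each command a list of run/skip
-- subcommands; the queue of run/skip symbols starts as [Run].
data Sym = Run | Skip deriving (Show, Eq)

-- Returns the successive queue states; the run halts when the queue empties.
sequentialTag :: [[Sym]] -> [[Sym]]
sequentialTag prog = go prog [Run]
  where
    go _      []     = []                  -- queue empty: halt
    go []     _      = []                  -- (only reachable for finite programs)
    go (c:cs) (q:qs) = (q:qs) : case q of
      Run  -> go cs (qs ++ c)              -- append the current command's symbols
      Skip -> go cs qs                     -- move on without appending

-- With an infinitely repeating program this becomes cyclic tag.
cyclicTag :: [[Sym]] -> [[Sym]]
cyclicTag = sequentialTag . cycle

main :: IO ()
main = mapM_ print (take 10 (cyclicTag [[Run, Run], [Skip]]))  -- first 10 queue states of an endless run
```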
17:55:13 <cpressey> replace "infinite but repeating" with a "goto" and I'm fine with it
17:55:32 <cpressey> infinitely long programs, though... no.
17:55:40 <cpressey> they defeat the point
17:55:55 <vanila> an infinitely long program could contain something very dangerous... unless you know that it was generated by a simple program
17:56:25 <ais523> cpressey: right
17:56:42 <ais523> now, the problem with things like cellular automata and turing machines
17:56:51 <ais523> is that the programs are inherently infinitely long
17:57:12 <cpressey> ais523: i disagree. unbounded, yes. infinite, no.
17:57:12 <ais523> so the conventional wisdom is that you have to initialize them with a repeating pattern
17:57:28 <ais523> hmm
17:57:37 <ais523> for a Turing machine, you can have an unbounded tape, plus a rule for initializing it
17:57:53 <ais523> and I assume you'd treat that initialization rule as part of the Turing machine, rather than part of the input to it?
17:57:54 <cpressey> ais523: yes. and yes, it's a bit more difficult to work that into a CA, but
17:58:02 <cpressey> ais523: and yes
17:58:06 <cpressey> oh irc
17:58:33 <cpressey> i consider a tape cell of a TM to essentially not exist until the head gets to it
17:58:44 <cpressey> but the head can always get to a new tape cell and make it exist
17:58:48 <elliott> "This is a little surprising: greatest fixpoints allow infinite objects, such as streams, yet the strong normalisation property is preserved." I guess wadler sums up the totality thing pretty well
17:58:50 <ais523> hmm, this viewpoint is quite at odds to mine
17:59:35 <ais523> elliott: I find it a little hard to define strong normalization if you have infinite input
17:59:54 <ais523> about the strongest concept I can manage to define on that is weak confluence
17:59:57 <elliott> cpressey: that's sort of like how the next cell of an infinite codata stream doesn't exist until you observe it, but you can always observe a new one and make it exist in finite time :)
18:00:18 <vanila> I hope wadler is ok, he was ill
18:01:04 <cpressey> elliott: yes, but this is happening *inside a Turing machine*, not *in my program*
18:01:27 <cpressey> it's only to avoid questions about infinite tapes
18:02:07 <cpressey> and a way to collect the complexity of "prepared tape" into one place, where you were already caring about complexity -- the running of the TM itself
18:02:44 -!- Phantom_Hoover has joined.
18:03:11 <elliott> with codata, semantically, the observation is explicit
18:03:21 <elliott> that can get hidden in sugar and bad implementation of it, though.
18:03:37 -!- Phantom__Hoover has quit (Ping timeout: 245 seconds).
18:04:31 <cpressey> elliott: question: could you formulate codata that represents a Wang tiling?
18:04:52 <cpressey> a "total" tiling of the plane
18:04:53 <cpressey> oh god
18:04:56 <cpressey> someone shoot me
18:04:59 <vanila> hahaha
18:05:53 <elliott> cpressey: I guess I don't see why not. "current tile" and four neighbours, say.
18:06:00 <elliott> well
18:06:09 <elliott> that gets annoying because of overlapping ways to observe the same square
18:06:13 <cpressey> http://catseye.tc/installation/Backtracking_Wang_Tiler fwiw
18:06:21 <elliott> cpressey: okay, alternate answer: sure, it's just (Nat,Nat) -> Tile
18:06:22 <cpressey> fun to watch for about 50 seconds
18:06:38 <elliott> functions are one type of codata (the best-supported, usually) and codata can be implemented as functions, so
18:06:52 <elliott> okay, (Z,Z) -> Tile
18:07:19 <elliott> cpressey: uh, or do you mean something that ensures you're tiling properly
18:07:31 <elliott> that sounds like you'd end up with proof objects and stuff and this is out of my pay grade
18:07:32 <cpressey> no elliott, I want an invalid Wang tiling!
18:07:49 <elliott> I'm an invalid Wang tiling :(
18:09:03 <cpressey> I'm trying very hard not to lol
18:10:03 <cpressey> #esoteric: the channel where everyone can disagree about infinite
18:10:07 <cpressey> infinity*
18:11:46 <olsner> oh, cpressey is here
18:11:59 <cpressey> hi olsner
18:13:27 <cpressey> corecursion looks interesting, because i've been hanging out with people in generative art circles, and they're interested in grammars, but they're more interested in *generating* things from them than *parsing* things with them
18:13:48 <cpressey> which is easy, programming-wise, in an RDP: replace scan("x") with emit("x") {more or less}
18:14:02 <cpressey> but, theory?
18:14:06 <cpressey> this might be it
18:15:32 <vanila> also logic programming to run recognizers backwards to generate things
18:17:27 <cpressey> vanila: yes, I'm reading the wikipedia article on it now, and it's looking a bit like that
18:18:41 <cpressey> elliott: maybe I'm a finitist? would that be terrible? istr that might be terrible
18:18:58 <cpressey> an infinite list is REALLY a finite list plus a continuation!
18:19:11 <elliott> that's more or less the codata perspective though
18:19:22 <elliott> thinking of Colist A as an infinite list is up to you
18:20:16 <cpressey> ok, maybe that will help me think about what i don't like about this "TC and total too!" thing
18:20:29 <vanila> oh that's funny, I just learned about TC & total
18:20:35 <elliott> a Colist A is just something you can construct from an X and (X -> Maybe (A, X)) (for any X) (e.g. nats = unfold 0 (\n -> Just (n, n+1))) and that gives you Colist A -> Maybe (A, Colist A)
18:20:43 <vanila> I got my own view on it
18:20:46 <elliott> meaning is for humans
18:21:07 <elliott> or more simply, Stream A from X and (X -> (A, X)), yielding head : Stream A -> A and tail : Stream A -> Stream A
18:21:13 <elliott> (eliding the possibility of stopping)
18:22:09 <cpressey> oh wait
18:22:14 <cpressey> no n/m
18:22:32 <cpressey> "total" ~= "well-typed"?
18:22:35 <cpressey> look just ignore me
18:22:47 <elliott> depends on your type system :p
18:22:53 <cpressey> yes
18:22:54 <elliott> Agda and Coq rely on a termination checker separate to their type system
18:22:55 <vanila> Here's what I came to understand:
18:23:01 <elliott> I prefer type systems that ensure termination by construction.
18:23:25 <elliott> fwiw: map f stream = unfold stream (\strm -> (f (head strm), tail strm))
18:23:49 <elliott> map f colist = unfold colist (\cl -> case observe cl of Nothing -> Nothing; Just (hd,tl) -> Just (f hd, tl))
18:23:52 <elliott> fun times
18:24:00 <elliott> (of course your function has to be, um, total.)
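(The unfolds above, rendered as a small self-contained Haskell sketch; this is only suggestive, since Haskell does not check guardedness or totality.)

```haskell
-- Stream A from a seed X and a step X -> (A, X); Colist may also stop.
data Stream a = Cons a (Stream a)
newtype Colist a = Colist { observe :: Maybe (a, Colist a) }

unfoldStream :: (x -> (a, x)) -> x -> Stream a
unfoldStream f x = let (a, x') = f x in Cons a (unfoldStream f x')

unfoldColist :: (x -> Maybe (a, x)) -> x -> Colist a
unfoldColist f x = Colist (fmap (fmap (unfoldColist f)) (f x))

headS :: Stream a -> a
headS (Cons a _) = a

tailS :: Stream a -> Stream a
tailS (Cons _ s) = s

-- nats = unfold 0 (\n -> Just (n, n+1)), as a Colist
nats :: Colist Integer
nats = unfoldColist (\n -> Just (n, n + 1)) 0

-- map as an unfold, following the two one-liners above
mapStream :: (a -> b) -> Stream a -> Stream b
mapStream f = unfoldStream (\s -> (f (headS s), tailS s))

mapColist :: (a -> b) -> Colist a -> Colist b
mapColist f = unfoldColist (\cl -> case observe cl of
                              Nothing       -> Nothing
                              Just (hd, tl) -> Just (f hd, tl))

-- Observing an initial segment in finite time (the "productivity" point).
takeColist :: Int -> Colist a -> [a]
takeColist 0 _  = []
takeColist n cl = case observe cl of
  Nothing       -> []
  Just (a, cl') -> a : takeColist (n - 1) cl'

main :: IO ()
main = print (takeColist 5 (mapColist (* 2) nats))   -- [0,2,4,6,8]
```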
18:24:11 <vanila> My idea of 'turing complete' was you could write any nat -> nat function that you can with 'recursive functions' (primitive rec + mu-minimization)
18:25:03 <vanila> agda isn't turing complete in that sense, but if you extend the notion to allow functions like nat -> partial nat then it is turing complete
18:25:09 -!- mihow has quit (Quit: mihow).
18:26:22 <cpressey> elliott: I... almost see what you're saying
18:26:27 <cpressey> it's like a negative photograph
18:26:33 -!- mihow has joined.
18:28:12 <cpressey> that was why I was thinking total = well-typed; if the function maps every value in X to some value in Y, it is total, and you can type it X->Y
18:28:27 <elliott> sure
18:28:33 <cpressey> if for some values in X it doesn't, well, no Y for it then
18:28:42 <vanila> you can have well-typed programs that infinitely loop, e.g. omega = omega
18:28:46 <elliott> then the question is what type in Haskell corresponds to the naturals
18:28:52 <cpressey> YEAH BARRING BOTTOM AND ALL THAT
18:28:53 <elliott> data Nat = Z | S !Nat is so close, but it has _|_
18:28:59 <elliott> but bottom is exactly what makes things non-total >_>
18:29:17 <elliott> since it means you don't have total mathematical \mathbb{N} -> \mathBB{N} functions
18:29:45 <cpressey> well, but f : Nat -> {0, _|_} is total right?
18:29:52 <elliott> sure
18:30:03 <elliott> also the functions have to be monotone/continuous/whatever blah blah blah
18:30:08 * elliott transforms into dana scott
18:31:42 <cpressey> unfortunately I need to leave no
18:31:45 <cpressey> *now
18:31:55 <elliott> unfortunately, I'm sure :p
18:33:05 <elliott> ttyl
18:33:26 <cpressey> my AMAZINGLY ORIGINAL AND NOVEL AND EARTH-SHATTERING THEORY about how IRC destroys productivity has been proven
18:33:36 <cpressey> see you later :)
18:33:38 -!- cpressey has quit (Quit: leaving).
18:35:12 <vanila> i dont like to think of agda as TC, it just seems misleading and flame-war-inviting to me
18:35:41 <ais523> now I'm wondering what goes wrong if you try to get Agda to prove itself consistent
18:35:59 <elliott> the termination checker won't buy it
18:36:02 -!- mihow has quit (Quit: mihow).
18:36:47 <vanila> you can't get conAgda :: Proof (Consistent Agda)
18:37:00 <vanila> but you can produce conAgda' :: partiality (Proof (Consistent Agda))
18:37:14 <elliott> won't that just be fix later
18:37:38 <elliott> I don't believe that can ever work if it's not (clasically), eventually you have to produce a now (p : Proof (Consistent Agda))
18:37:49 <vanila> it's just the same as a lisp program that tries fitting together axioms randomly to prove agda consistent
18:37:52 <elliott> *ever work since if it's not fix later (classically),
18:38:03 <elliott> well, it'll never find such a proof though
18:38:06 <elliott> but sure
18:38:12 <elliott> the simpler way to write that value is fix later :p
18:38:42 <vanila> sometimes you can construct a value of type partiality Foo
18:38:43 <int-e> well, it might terminate after all
18:38:52 <vanila> and then actually provide a proof that it terminates, to get a Foo out
18:38:53 <elliott> it won't
18:39:01 <int-e> unless you are, for some reason, convinced that Agda is in fact consistent.
18:39:04 <vanila> in the case of this Proof, you will not be able to prove that it terminates inside agda
18:39:17 <elliott> int-e: I am convinced that Agda will never find an internal proof of its own consistency...
18:39:35 <elliott> okay, I guess this is pointless without defining Proof here
18:39:38 <vanila> but you can make it to do arbitrary computation
18:40:03 <elliott> okay, here's what I mean: meta-theoretically, we can prove that conAgda' ~~ fix later
18:40:10 <vanila> i dont think so
18:40:13 <elliott> but that might not be true depending on what Proof is.
18:40:51 <elliott> if we can prove meta-theoretically that conAgda' ~~ later^n (now p) for some n and p, though, then there is an Agda term conAgda : Proof (Consistent Agda)
18:41:09 <elliott> if our metatheory is classical we can prove one of those.
18:41:18 <elliott> okay, it might not give us an n and p
18:41:24 <elliott> arguing about constructivism is too hard
18:42:43 <elliott> okay: there is no expression e : Partial A such that not not (e = later e, or there is an expression e' : A). I hope that's double-negated enough
18:42:47 <vanila> oh! I get you now
18:42:49 <elliott> again metatheoretically
18:43:25 <elliott> so if you say, there is no e : A, but I have e' : partial A, then I don't believe that e' can be anything other than later e', with enough "not not"s in there
18:43:31 <elliott> so it will not not not halt :p
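A Haskell sketch of the partiality type and the "fix later" value being argued about. Agda's actual partiality monad is coinductive; this is only an illustration, and Partial, Now, Later, never and runFor are made-up names.

data Partial a = Now a | Later (Partial a)

-- "fix later": always one more step, never an answer.
-- elliott's claim above is that a would-be conAgda' can only ever be this.
never :: Partial a
never = Later never

-- run with a step budget; Nothing means "hasn't halted yet"
runFor :: Int -> Partial a -> Maybe a
runFor _ (Now x) = Just x
runFor n (Later p)
  | n > 0     = runFor (n - 1) p
  | otherwise = Nothing
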
18:57:21 <int-e> @djinn (((a -> r) -> r) -> r) -> (a -> r)
18:57:22 <lambdabot> f a b = a (\ c -> c b)
19:08:57 <int-e> elliott: I still think your assumption that A is empty is a tad optimistic.
19:09:11 <elliott> int-e: what do you mean?
19:09:29 <int-e> if the logic is inconsistent then it might well "prove" its own consistency
19:09:37 <elliott> well, sure
19:09:41 <elliott> Agda has had proofs of _|_ before.
19:09:57 <elliott> what I said applies regardless of A, though
19:10:03 <int-e> (and djinn just showed that not not not foo is the same as not foo, even intuitionistically)
19:10:21 <elliott> yes, that was what we call a joke >_>
19:10:26 <int-e> ah
19:10:39 <int-e> it was not not funny
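The term lambdabot printed above is the usual intuitionistic witness that triple negation implies single negation; spelled out with a signature, reading (-> r) as negation:

-- not not not a -> not a, constructively
tripleNeg :: (((a -> r) -> r) -> r) -> (a -> r)
tripleNeg f a = f (\k -> k a)
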
19:18:19 -!- Phantom_Hoover has quit (Ping timeout: 250 seconds).
19:19:41 -!- Phantom_Hoover has joined.
19:20:38 -!- dianne has joined.
19:22:45 -!- mihow has joined.
19:26:34 -!- ais523 has quit.
19:26:41 -!- shikhin_ has joined.
19:26:41 -!- ais523 has joined.
19:26:51 -!- shikhin_ has quit (Read error: Connection reset by peer).
19:28:25 -!- shikhin has quit (Ping timeout: 244 seconds).
19:28:26 -!- shikhin_ has joined.
19:35:21 -!- shikhin_ has quit (Read error: Connection reset by peer).
19:36:39 -!- shikhin has joined.
19:40:07 -!- shikhin has quit (Read error: Connection reset by peer).
19:41:42 -!- shikhin has joined.
19:45:59 -!- `^_^v has quit (Quit: This computer has gone to sleep).
19:49:25 -!- `^_^v has joined.
19:50:54 -!- shikhin has quit (Read error: Connection reset by peer).
19:51:29 -!- shikhin has joined.
20:01:46 -!- Phantom__Hoover has joined.
20:01:54 -!- Phantom_Hoover has quit (Ping timeout: 245 seconds).
20:03:17 <b_jonas> ok, so I'm not sure about this, but I believe that for polynomial convolution, Karatsuba multiplication needs about as much space as twice the output plus once the input (though it might still be a bit too slow), and FFT multiplication needs twice the space of the output. so luckily, neither needs too much extra space, which is very useful.
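For reference, a small Haskell sketch of Karatsuba multiplication on coefficient lists (lowest degree first). All names here are made up, and it only shows the recurrence; it does not attempt the in-place buffer accounting b_jonas is estimating.

addP :: Num a => [a] -> [a] -> [a]
addP xs []         = xs
addP [] ys         = ys
addP (x:xs) (y:ys) = (x + y) : addP xs ys

subP :: Num a => [a] -> [a] -> [a]
subP xs ys = addP xs (map negate ys)

shiftP :: Num a => Int -> [a] -> [a]     -- multiply by x^m
shiftP m p = replicate m 0 ++ p

karatsuba :: Num a => [a] -> [a] -> [a]
karatsuba as bs
  | length as <= 16 || length bs <= 16 = naive as bs
  | otherwise =
      let m        = min (length as) (length bs) `div` 2
          (a0, a1) = splitAt m as      -- A = a0 + x^m * a1
          (b0, b1) = splitAt m bs      -- B = b0 + x^m * b1
          z0 = karatsuba a0 b0
          z2 = karatsuba a1 b1
          z1 = subP (karatsuba (addP a0 a1) (addP b0 b1)) (addP z0 z2)
      in  addP z0 (addP (shiftP m z1) (shiftP (2 * m) z2))
  where
    naive xs ys = foldr addP [] [ shiftP i (map (x *) ys) | (i, x) <- zip [0 ..] xs ]
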
20:04:51 -!- Patashu has joined.
20:05:54 -!- shikhin has quit (Read error: Connection reset by peer).
20:07:05 -!- shikhin has joined.
20:10:04 -!- shikhin has quit (Read error: Connection reset by peer).
20:11:37 -!- Phantom__Hoover has quit (Ping timeout: 250 seconds).
20:12:05 -!- shikhin has joined.
20:21:48 -!- shikhin has quit (Read error: Connection reset by peer).
20:22:31 -!- shikhin has joined.
20:25:02 -!- shikhin has quit (Read error: Connection reset by peer).
20:27:33 -!- shikhin has joined.
20:29:15 -!- shikhin has quit (Read error: Connection reset by peer).
20:32:37 -!- shikhin has joined.
20:32:38 -!- J_Arcane_ has joined.
20:32:56 -!- MoALTz_ has joined.
20:35:22 -!- J_Arcane has quit (Ping timeout: 255 seconds).
20:35:32 -!- J_Arcane_ has changed nick to J_Arcane.
20:35:49 -!- shikhin has quit (Read error: Connection reset by peer).
20:35:49 -!- MoALTz has quit (Ping timeout: 255 seconds).
20:36:37 -!- Patashu has quit (Ping timeout: 244 seconds).
20:41:08 -!- shikhin has joined.
20:50:50 -!- shikhin has quit (Read error: Connection reset by peer).
20:51:43 -!- shikhin has joined.
20:53:46 -!- shikhin has quit (Read error: Connection reset by peer).
20:54:58 -!- bb010g has joined.
20:56:41 -!- shikhin has joined.
20:59:56 -!- shikhin has quit (Read error: Connection reset by peer).
21:01:44 -!- shikhin has joined.
21:05:32 -!- shikhin has quit (Read error: Connection reset by peer).
21:06:57 -!- shikhin has joined.
21:08:21 -!- shikhin has quit (Read error: Connection reset by peer).
21:11:55 -!- shikhin has joined.
21:12:38 -!- shikhin has quit (Read error: Connection reset by peer).
21:16:57 -!- shikhin has joined.
21:20:58 -!- shikhin has quit (Read error: Connection reset by peer).
21:25:35 -!- ChanServ has set channel mode: +o elliott.
21:25:43 -!- elliott has set channel mode: +b shikhin!*@*$##fixyourconnection.
21:25:45 -!- elliott has set channel mode: -o elliott.
21:28:53 -!- PinealGlandOptic has joined.
21:36:33 -!- aretecode has quit (Read error: Connection reset by peer).
21:39:50 -!- aretecode has joined.
21:41:18 <oren> what does _|_ mean?
21:41:36 <oren> Is it the upside-down T?
21:41:45 <b_jonas> oren: yes, it's called "bottom"
21:43:48 <oren> I suppose it is smaller on the page than the word "false".
21:44:17 <`^_^v> it's a middle finger to logic
21:44:17 <vanila> http://en.wikipedia.org/wiki/False_%28logic%29
21:44:20 <vanila> it's an ASCII picture of this
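For oren's question: in Haskell source the value is usually spelled with undefined or a self-referential definition, both of which denote the same bottom (the name below is illustrative).

bottom :: a
bottom = bottom        -- or: bottom = undefined; forcing it never finishes
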
21:44:40 <oren> Maybe the easiest logical literals for me to type are "0=0" and "0=1"
21:45:56 <oren> Idea: an algolike language which lacks literals other than 0 and 1.
21:46:28 <oren> Wait... That would be the peano axioms...
21:46:30 <ais523> PCF? although that doesn't even have 1 really
21:46:31 <ais523> just +1
21:46:36 <vanila> (1+1)*(1+(1+1)*(1+(1+1))) I can do binary like this
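One way to read vanila's expression is Horner's rule over binary digits, using only the literals 0 and 1; a Haskell sketch, with fromBits as a made-up name:

-- fromBits [1,1,1,0] == (1+1)*(1+(1+1)*(1+(1+1))) == 14
fromBits :: [Integer] -> Integer
fromBits = foldl (\acc b -> (1 + 1) * acc + b) 0
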
21:47:36 <oren> assignment should be <- . It should always have been <- .
21:47:46 <MDude> I think those +s and *s might need switched around?
21:47:50 <MDude> I agree oren.
21:48:38 <MDude> ChucK knows how to assign properly.
21:49:25 <MDude> Though I would also accept "set [var] to [value]" if allowing the language to be more verbose.
21:50:37 <oren> I made a calculator once that did exactly what ChucK does.
21:50:50 <oren> (in assignment, not in anything else)
21:52:42 <MDude> Oh, ChucK actually uses =>
21:53:04 <oren> In math, assignment is "let x be [value]"
21:53:53 <oren> But in math, the evaluation is explicitly controlled, like "now subsitutute in the value of x"...
21:54:18 <oren> God damn it how do you spell subsitute?
21:54:19 <vanila> there's no evaluation
21:54:31 <MDude> I was imagining let would be for lazy evaluation statements, while set would be for eager evaluation.
21:54:33 <vanila> you just treat all equal values as identical
21:54:41 <b_jonas> "substitute" I think
21:54:46 <MDude> Because obviously having both in a language makes sense.
21:55:47 <oren> MDude: that is a cool idea.
21:57:59 <oren> vanila: A math paper is a syntax-tree-rewriting program with verification, which a mathematician's brain runs and checks.
22:00:11 <MDude> Oh, good then. I wasn't sure if that was a good idea or not.
22:01:04 <vanila> I don't agree
22:01:13 <oren> Eager evaluation is a good idea for time-critical parts of a program
22:02:13 <oren> E.g. we don't want to go out of the polygon drawing loop to lazy-evaluate the enemies' positions.
22:05:29 <oren> But in other parts of a program, being lazy can help performance. So having both is a good idea
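A sketch of how MDude's lazy "let" / eager "set" split already looks in Haskell (names are made up): a plain let binding stays a thunk, while a banged binding is forced at the binding site.

{-# LANGUAGE BangPatterns #-}

lazyThenEager :: Integer -> (Integer, Integer)
lazyThenEager n =
  let x  = square n          -- "let": not computed until someone inspects x
      !y = square (n + 1)    -- "set": computed right here
  in (x, y)
  where square k = k * k
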
22:07:44 -!- `^_^v has quit (Quit: This computer has gone to sleep).
22:08:15 -!- ais523 has quit (Quit: dinner).
22:08:24 <elliott> 22:02:13 <oren> E.g. we don't want to go out of the polygon drawing loop to lazy-evaluate the enemies' positions.
22:08:29 <elliott> this isn't what lazy evaluation is.
22:08:36 <elliott> I mean that is one possible non-strict evaluation strategy.
22:08:47 <elliott> but that is just not something that really happens when you write in Haskell or whatever.
22:09:00 <elliott> unless you're writing your polygon drawing loop extremely weirdly
22:09:03 <elliott> well
22:09:13 <elliott> ok I can see how that would happen with certain kinds of data flow, I misread what you meant
22:09:18 <oren> Why not? You don't need to evaluate anything until it would have visible effects, right?
22:09:29 <elliott> there are tools for solving that within a lazy evaluation discipline though
22:09:34 <elliott> like, the thing is that making the polygon drawing loop eager wouldn't solve that
22:09:39 <elliott> because it's a non-local property you're talking about here
22:09:47 <elliott> it would be the position evaluation prior to that that would need to be strict
22:11:53 <oren> Why not have an evaluate operator that forces the value to be stored?
22:12:07 <vanila> haskell tried that
22:12:10 <vanila> its awful
22:12:17 <oren> Why?
22:15:03 <vanila> i guess it just doesn't seem aesthetic to me
22:15:04 <elliott> oren: well, you can't quite have a -> a
22:15:08 <elliott> but you can have a -> b -> b
22:15:30 <elliott> in haskell (a `seq` b) is b, but where a must be evaluated (to WHNF)
22:15:34 <elliott> note: no guarantees as to the ordering of the two
22:15:42 <elliott> and (f $! x) is (x `seq` f x) which is useful
22:15:49 <elliott> it can be annoying to juggle though.
22:15:57 <elliott> but you can use it to enforce invariants about evaluation.
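A small sketch of what elliott describes: seq makes its result depend on the first argument reaching WHNF, and ($!) is the strict application built from it. The accumulator example (with made-up names) is the classic invariant it enforces.

-- lazy accumulator: builds a chain of (+) thunks before anything is added
sumLazy :: Integer -> Integer -> Integer
sumLazy acc 0 = acc
sumLazy acc n = sumLazy (acc + n) (n - 1)

-- forcing the accumulator each step keeps it a plain number
sumStrict :: Integer -> Integer -> Integer
sumStrict acc 0 = acc
sumStrict acc n = let acc' = acc + n
                  in acc' `seq` sumStrict acc' (n - 1)

-- (f $! x) is (x `seq` f x): evaluate the argument, then apply
applyStrictly :: (Integer -> Integer) -> Integer -> Integer
applyStrictly f x = f $! x
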
22:16:49 <vanila> It's cool how you can do strictness analysis on code and automatically use strict evaluation instead of lazy
22:17:06 <elliott> except when it doesn't work :(
22:17:21 -!- ProofTechnique has quit (Ping timeout: 265 seconds).
22:17:35 <vanila> yeah it's undecidable but you can hopefully get a good result a lot of the time by abstract interpretation
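A toy version of the abstract interpretation vanila mentions, over the two-point domain (Bot = definitely undefined, Top = possibly defined). GHC's demand analysis is far more elaborate; this only shows the shape of the idea, including the conservative "except when it doesn't work" case. Everything here is a made-up example language.

data Abs = Bot | Top deriving (Eq, Show)

-- expressions in one variable; an If condition is an integer, nonzero = true
data Expr = Var | Lit Integer | Add Expr Expr | If Expr Expr Expr

-- least upper / greatest lower bound on the two-point domain (Bot <= Top)
joinA, meetA :: Abs -> Abs -> Abs
joinA Bot Bot = Bot
joinA _   _   = Top
meetA Top Top = Top
meetA _   _   = Bot

-- what do we know about the result if the variable's abstract value is v?
absEval :: Abs -> Expr -> Abs
absEval v Var        = v
absEval _ (Lit _)    = Top
absEval v (Add a b)  = absEval v a `meetA` absEval v b   -- (+) needs both sides
absEval v (If c t e) = case absEval v c of
  Bot -> Bot                                 -- forcing the condition diverges
  Top -> absEval v t `joinA` absEval v e     -- either branch might be taken

-- a function is strict iff an undefined argument forces an undefined result
strictInVar :: Expr -> Bool
strictInVar e = absEval Bot e == Bot

-- strictInVar (Add Var (Lit 1))        == True
-- strictInVar (If (Lit 1) Var (Lit 2)) == False, even though that expression
--   always evaluates Var: the analysis can't see the condition's value, so it
--   stays safely conservative
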
22:20:19 -!- Lymia has quit (Ping timeout: 250 seconds).
22:22:29 -!- Lymia has joined.
22:24:05 -!- `^_^v has joined.
22:44:17 -!- `^_^v has quit (Quit: This computer has gone to sleep).
22:47:50 <oren> I find it intensely annoying that I can get all the ingredients for gunpowder in Dwarf Fortress, but can't make cannon or shot.
22:50:13 <Taneb> DF has saltpetre?
22:50:56 <oren> Yup.
22:51:19 <oren> And brimstone (i.e. sulphur), and charcoal.
22:51:51 <Taneb> I am not sure that DF does have saltpetre
22:52:21 <oren> it has "saltpeter"
22:52:40 <pikhq> That's the US spelling.
22:52:55 <Taneb> Aaah
22:57:00 <oren> Currently I am building my fortress hovering over a freshwater lake.
23:00:54 -!- vanila has quit (Quit: Leaving).
23:18:02 -!- ais523 has joined.
23:18:37 -!- adu has joined.
23:19:40 -!- PinealGlandOptic has quit (Quit: leaving).
23:25:11 -!- Froox has quit (Quit: *bubbles away*).
23:32:02 -!- Phantom_Hoover has joined.
23:40:30 -!- Tritonio has joined.
23:42:57 -!- PinealGlandOptic has joined.