00:25:43 -!- ais523 has joined.
00:26:17 zzo38: I use EX_CONFIG in cases where the program couldn't meaningfully start because a configuration file it depends on was malformed or missing required information
00:27:32 there's a more detailed explanation of sysexits.h in the BSD man page, https://man.openbsd.org/sysexits
00:27:58 although not much more detailed
00:29:58 -!- ais523 has quit (Client Quit).
00:30:58 -!- GeekDude has joined.
00:54:05 -!- MDude has joined.
00:58:02 -!- arseniiv has quit (Ping timeout: 240 seconds).
01:32:09 -!- Sgeo__ has joined.
01:35:18 -!- Sgeo_ has quit (Ping timeout: 245 seconds).
01:41:03 -!- oerjan has joined.
01:54:22 y'know what, assume you have the full pos/neg range. <-- not possible in a bounded number of instructions, i think.
01:55:43 well, you didn't say you had looping instructions, so not possible at all.
01:57:03 the thing is, subtraction commutes with negating all numbers involved. so the only thing to distinguish things if you negate everything is that AND gives 1 and not -1 (i assume)
01:57:36 *only way
01:59:25 imagine if the numbers you start with are multiples of a googolplex
02:03:36 it will take you more than a googol operations to magnify a 1 into something of similar size - and only then do you have any chance of a distinguishing test result.
02:07:42 that is, if you start with numbers x, y, then every number you can construct will be of the form i*x + j*y + k (where the k comes from AND results)
02:08:05 -!- xkapastel has quit (Quit: Connection closed for inactivity).
02:08:45 and this will be zero exactly when i*(-x) + j*(-y) + k is, _unless_ k is divisible by gcd(x,y).
02:09:05 oh and nonzero
02:09:19 in which case it _might_ not be, if you're lucky.
02:10:23 and it will take you on the order of log (gcd(x,y)) operations to get a k that large.
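[Editor's note: a small illustrative sketch (names and details invented) of the argument above: with only subtraction, every value reachable from x and y has the form i*x + j*y, so negating both inputs negates every reachable value, and no subtraction-only computation can tell (x, y) apart from (-x, -y).]

```python
# Values reachable from {x, y} using only subtraction.
def reachable(x, y, depth=3):
    vals = {x, y}
    for _ in range(depth):
        vals |= {a - b for a in vals for b in vals}
    return vals

# Negating both inputs negates the whole reachable set,
# so no test built from subtraction alone can distinguish them.
x, y = 7, 12
assert reachable(x, y) == {-v for v in reachable(-x, -y)}
```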
02:32:41 -!- Lykaina has joined.
02:40:57 assume I have looping instructions.
02:41:13 including "break if == 0"
02:46:04 hi
02:47:15 so hard to get certain people's attention in irc. (i refer to another channel on another network)
02:52:00 imode: ok then you can basically do something like: construct 1 using AND, then use it to increment/decrement counters to search for x and y among all numbers.
02:52:53 and keeping track of which you find first and whether they're >0 or <0
02:53:36 yikes. there's no way to speed that process up? I have access to constants and everything...
02:54:15 i don't really think so
02:54:39 interesting... you seem so limited. :\
02:54:48 if you don't have <0 or >0.
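[Editor's note: a minimal Python sketch of oerjan's search idea (function name and structure invented): with only an equality-with-zero test available, determine the sign of x by counting outward from 0 in both directions until one counter hits it.]

```python
# Search for x among 0, 1, -1, 2, -2, ... using only a "== 0" test.
# The increment of 1 would itself be constructed with AND, per oerjan.
def sign_via_search(x):
    up, down = 0, 0
    while True:
        if up - x == 0:        # "break if == 0" applied to up - x
            return 0 if x == 0 else 1
        if down - x == 0:
            return -1
        up += 1
        down -= 1

assert sign_via_search(5) == 1
assert sign_via_search(-3) == -1
assert sign_via_search(0) == 0
```

The search takes on the order of |x| iterations, which matches oerjan's point that there's no obvious way to speed it up without a <0 or >0 test.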
03:13:32 -!- atslash has quit (Ping timeout: 265 seconds).
03:14:27 -!- hppavilion[1] has joined.
03:25:20 -!- atslash has joined.
03:30:32 -!- Lykaina has quit (Quit: leaving).
04:10:27 -!- imode has quit (Ping timeout: 245 seconds).
04:17:06 -!- sprocklem has joined.
04:19:59 -!- tromp_ has quit (Read error: Connection reset by peer).
04:20:50 -!- tromp has joined.
04:21:09 [[1+]] M https://esolangs.org/w/index.php?diff=66317&oldid=66313 * TwilightSparkle * (+2) /* Fibonacci sequence */
04:33:38 -!- imode has joined.
04:42:50 -!- Lord_of_Life has quit (Ping timeout: 240 seconds).
04:44:14 -!- tromp has quit (Ping timeout: 276 seconds).
04:44:23 -!- Lord_of_Life has joined.
04:45:35 [[1+]] https://esolangs.org/w/index.php?diff=66318&oldid=66317 * TwilightSparkle * (+53)
04:59:44 [[1+]] M https://esolangs.org/w/index.php?diff=66319&oldid=66318 * TwilightSparkle * (+45)
05:07:37 -!- tromp has joined.
05:19:06 -!- tromp has quit (Read error: Connection timed out).
05:20:01 [[1+]] https://esolangs.org/w/index.php?diff=66320&oldid=66319 * TwilightSparkle * (+39) /* Examples */
05:48:31 -!- tromp has joined.
06:04:32 -!- tromp has quit (Read error: Connection timed out).
06:05:56 [[Keg]] https://esolangs.org/w/index.php?diff=66321&oldid=66035 * JonoCode9374 * (+81) /* See also */
06:05:59 -!- oerjan has quit (Quit: Nite).
06:26:50 -!- imode has quit (Ping timeout: 240 seconds).
06:27:26 -!- tromp has joined.
06:43:11 -!- tromp has quit (Read error: Connection timed out).
06:56:04 -!- tromp has joined.
07:11:14 -!- tromp has quit (Read error: Connection reset by peer).
07:11:41 -!- tromp has joined.
07:44:11 -!- cpressey has joined.
07:53:03 Good morning. There was something very important I needed to tell you about all your base, but I've forgotten what it was.
08:45:44 I made Robin's ref impl able to run on Hugs, even the random numbers and I/O parts. I'm running the test suite on it now. It's taking... a while to finish.
08:53:44 26 minutes, to be precise. (compared to about 1 minute 8 seconds when it's compiled by ghc.)
09:46:49 [[An Odd Rewriting System/Odd.hs]] N https://esolangs.org/w/index.php?oldid=66322 * Chris Pressey * (+1791) Add Haskell implementation of An Odd Rewriting System.
09:48:47 [[An Odd Rewriting System]] M https://esolangs.org/w/index.php?diff=66323&oldid=58485 * Chris Pressey * (+71) Link to Haskell implementation.
09:54:24 [[Talk:Deque]] N https://esolangs.org/w/index.php?oldid=66324 * YamTokTpaFa * (+186) /* Isn't this article's expression unified? */ new section
10:10:15 Consider a language where programs automatically loop after N steps, and on each step, N grows. Programs can modify themselves, so that they can grow on each loop.
10:11:21 cpressey: consider me considering
10:11:46 If this is the only way to loop, and if N grows by a constant additive amount, this could be TC. But if N grows by a multiplicative factor, is it?
10:11:56 (That is, could it be TC?)
10:12:18 You'd need an inner loop to make it grow enough to get to the next loop.
10:13:02 idk, not terribly interesting by itself, but maybe it could be developed
10:13:32 Do you have to have enough instructions to make it to N to loop?
10:13:39 Or can it autopad with no-ops
10:14:11 (noöps)
10:14:17 I was thinking of having some fairly generic instruction set which could include no-ops.
10:15:48 It seems like N could start out "too small" as well
10:17:02 One could also try to make an "evil" version where N grows, but not in a way that is easy to predict
10:47:46 Or otherwise make it difficult to construct a no-op. In a lot of languages it's easy, in some it might be impossible, but ideally you'd want to strike some middle ground
10:51:27 -!- tromp_ has joined.
10:53:53 -!- tromp has quit (Ping timeout: 246 seconds).
10:55:53 [[Keg]] M https://esolangs.org/w/index.php?diff=66325&oldid=66321 * A * (+24) /* See also */ This link also includes irrelevant questions.
11:00:15 "A Turing-complete language that admits a quine also admits a way to build a no-op" -- I wonder if that's true?
11:00:37 My gut feeling is that it's probably true
11:02:59 thats a tough one
11:04:01 where is that from
11:12:37 Not really from anywhere, just from me thinking about how one might make a language where it's not possible to write a program that does nothing
11:16:08 -!- nfd9001 has joined.
11:52:02 -!- hppavilion[1] has quit (Ping timeout: 240 seconds).
12:08:38 What's a no-op?
12:09:00 Oh, there's a whole discussion above.
12:11:11 Does no-op mean a program of length n+k that computes the same thing as a program of length n?
12:15:37 I was using the word to mean the identity function, more or less. A program that does nothing. If you tack it on to some other program, it doesn't change the meaning of that program.
12:16:13 There are sort of a lot of assumptions about programs in there.
12:17:13 Oh, a whole program, not an instruction.
12:17:19 That seems easier.
12:17:48 But a language could easily be Turing-complete and still require programs to always print something?
12:17:58 I think that came from, if you don't have a no-op instruction, can you build one out of the other instructions.
12:20:05 Maybe. If you force the program to always output something, then you say that's Turing complete because you can reduce every program to a TM, you're kind of pushing the "no-op-ness" into the TC reduction.
12:21:12 Well, OK.
12:21:38 I'm not sure the TC reduction is such a useful way to talk about programs for other purposes.
12:24:48 I think that's why I started thinking about quines.
12:33:02 Trying to think of it another way: say you have an algebra, and it doesn't have an identity element (perhaps it's a semigroup). It's claimed that one can express any computation they like in this algebra. Could this claim be true?
12:40:52 How can it be said that an algebra can express a computation?
12:47:12 -!- xkapastel has joined.
12:49:09 That, itself, is an interesting question.
12:53:35 I once saw a paper that showed that the Hercules-Hydra battle could be expressed in braid theory.
12:53:39 is there a good proof of rice theorem
12:55:15 rain2: My rough version of Rice's theorem is: It's undecidable to show that a Turing machine does X, because you can take that Turing machine and replace "X" with "halt" and you get the Halting Problem, which you already know is undecidable.
12:55:36 I know that's not the same thing as a good proof.
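[Editor's note: the rough version above can be sketched as the standard reduction; this is illustrative pseudo-Python only, since `decides_X` and `witness_of_X` are hypothetical oracles that cannot actually exist.]

```python
# Pseudocode sketch: a decider for any non-trivial semantic property X
# would yield a halting decider, which is a contradiction.
def halts(program, inp, decides_X):
    def combined(x):
        program(inp)             # diverges iff `program` never halts on inp
        return witness_of_X(x)   # hypothetical program known to have property X
    # `combined` has property X iff `program` halts on `inp`,
    # so the X-decider would answer the halting problem.
    return decides_X(combined)
```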
12:56:17 -!- sprocklem has quit (Ping timeout: 276 seconds).
12:57:09 A rock solid proof
13:02:44 I'm pretty sure there's been a machine-checked proof of it in something like Lean or Coq but I have been unable to find anything with a quick web search.
13:05:58 https://github.com/leanprover-community/mathlib/blob/master/src/computability/halting.lean#L165
13:06:47 -!- imode has joined.
13:06:55 Apparently they prove the undecidability of HP based *on* Rice's theorem in that
13:08:17 wow
13:08:48 i'll try to understand these theorem statements
13:09:29 Do you like "Rice's theorem for the computable reals"?
13:10:52 yes
13:10:54 do you?
13:17:04 I think I like it.
13:21:06 It could be useful for thinking about infinite programs.
13:21:17 are infinite programs superturing
13:21:33 -!- Sgeo__ has quit (Read error: Connection reset by peer).
13:22:25 I would say uncountably infinite programs have to count as "superturing" in some sense. Countable ones, probably not.
13:24:04 https://cs.stackexchange.com/questions/77487/decidable-properties-of-computable-reals
13:24:10 Rice's theorem for reals holds in every reasonable version of computable reals
13:26:32 rain2: Yes, I like it.
13:26:58 It's interesting that it's a version of Rice's theorem that holds for programs that are guaranteed to halt.
13:28:13 What are other things you can say about programs that you know some things about?
13:29:08 For example what properties of a total function are decidable given a guaranteed-to-halt program that computes it?
13:30:02 oh interesting
13:31:03 Also: The computable reals are countable, but they're not computably countable.
13:31:45 why aren't they computably countabl?
13:33:01 I'm a bit confused: The program that generates a computable real isn't guaranteed to halt. Quite the opposite, it has to be able to run indefinitely to compute some computable numbers, such as square roots.
13:36:24 Guaranteed to halt = "total" = "recursive" = R; Computable real = "computably enumerable" = "recursively enumerable" = RE; R < RE.
13:37:48 R ⊊ RE
13:38:16 cpressey: Oh, well, if you define it that way it's productive.
13:38:31 I was thinking you give the program a required precision and it gives you an approximation within that precision.
13:38:40 In which case it halts on every input.
13:39:15 Sure, but you can write an analogous program for uncomputable reals too.
13:39:29 It halts because it's a finite approximation.
13:40:01 Not because of some property that the real it's approximating, has.
13:40:15 * cpressey grammars
13:40:49 What do you mean?
13:40:57 You can't compute uncomputable reals.
13:41:39 You can write a program that asks for a precision and computes an approximation of Chaitin's omega to that precision.
13:42:13 You certainly can't.
13:42:25 You can. It has in fact been done.
13:43:10 I don't believe you, cpressey
13:43:18 OK, let me find the paper.
13:43:18 In fact, I believe you are mistaken here
13:43:43 With N bits of that constant you can solve the halting problem of programs of a size proportional to N
13:44:18 -!- arseniiv has joined.
13:44:44 There are Ns known to have programs of that size where it's undecidable whether they halt or not
13:45:13 What does it mean for it to be undecidable whether a particular program halts?
13:46:08 (Obviously I Agreeneb.)
13:47:12 Taneb: the key word is "approximation". Perhaps there is a limit to the amount of precision you can ask for.
13:47:21 A proof of whether it halts or not is outside ZFC (or some other axiom scheme of your choice, but that might require a different N)
13:47:55 cpressey: I think the limit might be about 5
13:47:55 (Or rather, expect to get, when you ask for it.)
13:48:15 Oh, I'd call that "unprovable", not "undecidable".
13:48:44 But, hmm, that still doesn't make sense to me.
13:49:01 Surely if the program halts, you can prove that much in ZFC.
13:49:19 I guess you're not naming a particular program but only saying (metatheoretically?) that one exists.
13:49:49 There are theorems outside ZFC. We can construct a Turing machine that halts iff one such theorem is true.
13:51:15 I'm having trouble finding the paper but I wouldn't be surprised if 5 bits was all you could get so, OK, I withdraw any implication I might have made that you can get an arbitrary precision that you ask for in that case.
13:51:29 That wasn't exactly what I was trying to drive at in any case.
13:52:21 Any function that produces a finite approximation of anything will of course terminate always. That's not what we're talking about when we talk about computable reals. We're distinguishing them from uncomputable reals, ones that no TM can produce.
13:53:19 cpressey: The point is that it can produce approximations to any precision you ask.
13:53:34 An infinite stream of bits can be represented as a function : N -> 2 that always halts in the same way.
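[Editor's note: a sketch of the representation shachaf describes, with invented helper names: a computable real as a total function from a requested precision to a rational approximation. Addition then needs no digit revisions, because each summand is simply asked for half the precision.]

```python
from fractions import Fraction

def const(q):
    """An exact rational, packaged as a precision-query function."""
    return lambda eps: Fraction(q)

def add(a, b):
    """Sum of two computable reals: query each to half the tolerance."""
    return lambda eps: a(eps / 2) + b(eps / 2)

two_thirds = const(Fraction(2, 3))   # "0.666..."
one_third  = const(Fraction(1, 3))   # "0.333..."
one = add(two_thirds, one_third)
assert abs(one(Fraction(1, 1000)) - 1) < Fraction(1, 1000)
```

Every query halts, yet the function as a whole carries arbitrarily much information about the real, which is the contrast with a single Turing machine trying to emit a fixed digit expansion.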
13:54:58 -!- ais523 has joined.
13:55:04 Okay. I see it differently, in a way that has nothing to do with approximations.
13:55:46 I don't think the way you see it really matters because however you construct the computable reals they come out the same.
13:55:49 How do you like representing them?
13:55:50 cpressey: assuming you can't append to the program faster than 1 instruction per instruction, your multiplicatively growing language would probably want lots of imps to work
13:56:11 (an imp is an instruction that adds a copy of itself to the running program; this is Core Wars terminology but could generalise to other languages)
13:56:26 the imps are more useful than no-ops because they tend to multiply, whereas no-ops just sit there
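[Editor's note: a toy sketch (all semantics invented) of ais523's point: in a language whose step budget N grows multiplicatively between forced restarts, a program of imps doubles in length each round and so keeps pace with the budget, where no-ops could not.]

```python
# Toy interpreter: the program re-runs from the start each round,
# executing at most n instructions; n doubles every round.
# "imp" appends a copy of itself to the running program.
def run(program, n=4, rounds=5):
    for _ in range(rounds):
        for op in program[:n]:
            if op == "imp":
                program.append("imp")
        n *= 2
    return len(program)

# One imp doubles each round: 1 -> 2 -> 4 -> 8 -> 16 -> 32.
assert run(["imp"]) == 32
```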
13:57:38 shachaf: How do I like representing a computable real? With the Turing machine that writes it to its tape, I suppose.
13:57:58 Writes what?
13:58:32 There's a TM that writes pi to its tape, there's a TM that writes 1 to its tape, there's a TM that writes e to its tape, and so forth, for any computable real you can pick. Would you agree?
13:58:36 "A Turing-complete language that admits a quine also admits a way to build a no-op" -- I wonder if that's true? ← what about a language where all commands echo when run? quining is trivial, no-ops don't exist
13:58:51 I don't know what you mean by writing a number to its tape, so maybe I agree.
13:59:26 -!- ais523 has quit (Client Quit).
13:59:38 -!- ais523 has joined.
14:00:08 Turing machines write symbols on their tapes. Suppose the alphabet of symbols is "0123456789.". Then there's a TM that writes "3.14159..." on its tape, where "..." is standing in for the rest of pi.
14:00:38 Oh, a decimal expansion.
14:00:41 No, I don't agree.
14:00:44 Decimal is of course not necessary, but it's the base I tend to use the most in my daily like.
14:00:52 No base works.
14:00:52 *life
14:01:14 Well, if you want addition to be computable.
14:01:29 I believe you can use bases which have redundant representations
14:01:33 e.g. binary with digits 0, 1, 2
14:01:58 the problem with using a traditional base (say decimal) is that you might end up with a computable number that happens to be a terminating decimal but you can't prove it
14:02:03 I don't care about the base. I'm trying to communicate the general definition.
14:02:26 No base works.
14:02:44 https://en.wikipedia.org/wiki/Richardson%27s_theorem is the relevant theorem
14:02:46 Well, maybe the thing ais523 said works, I'm not sure.
14:03:15 shachaf: basically, the thing you can't do with computable reals is determine for certain whether two things that are actually equal happen to be <= or >= each other
14:03:21 if two things are actually not equal, you can tell them apart
14:03:31 ais523: You can't determine for certain *anything* about computable reals.
14:03:37 That's the "Rice's theorem for computable reals" above.
14:03:59 Oh, you mean determine for certain in the weaker sense of semidecidability, never mind.
14:04:02 shachaf: with the definitions I'm aware of, a < b is semi-decidable: it's always decidable if it happens to be true
14:04:09 but might not be decidable if it happens to be false
14:04:13 Yes.
14:04:26 cpressey: You certainly want addition to be computable. Say I add "0.666..." and "0.333...". What should the answer be?
14:04:50 this means that a general "base" representation of a computable real you use has to be one with semi-decidable equality, e.g. in hyperbinary, 022222… and 10000… might or might not be equal
14:04:51 shachaf: Did I claim computable reals can be compared to each other, or added to each other?
14:05:12 Maybe not, but then you're wildly deviating from anything anyone calls the computable reals.
14:05:18 cpressey: the ability to produce a base-N expansion (for any N) implies the abilty to compare
14:05:36 You *can* compare decimal expansions, that's exactly the problem.
14:05:59 e.g. in the case of pi, you need to compare to 3.1415 and 3.1416 to determine that the fourth digit after the decimal point is a 5
14:07:02 I guess I should ask what you mean by the reals in the first place.
14:07:22 Presumably you don't construct the reals in maths as decimal expansions but as Cauchy sequences or Dedekind cuts or something.
14:09:21 You can presumably construct the reals as equivalence classes of decimal expansions.
14:09:32 ais523: How do you figure that implication? It's not so much that the TM "does produce" the expansion, as, if the TM were to run forever somehow, it "would produce" it.
14:10:11 cpressey: suppose you have a number for which it's undecidable how it compares to 3.1415; the Turing machine won't get past three digits past the decimal point no matter how long you run it
14:10:47 ais523: Where did the undecidable number come from?
14:11:04 I thought you were talking about computable numbers
14:11:14 cpressey: I thought we were discussing the computable reals; for any rational number there's a computable real number that's equal to it, but undecidably so
14:11:22 by Richardson's Theorem, which I linked above
14:12:03 OK, as usual, my brain is too small to follow everything that is going on in one of these conversations.
14:12:36 Fine, I surrender. I only thought this was how computable reals were defined, because it's how I recall them being defined in Turing's original paper.
14:12:43 But I might be mis-remembering how he defined them.
14:13:28 If decimal (etc) expansions don't work, pick some expansion that does work. Does any expansion work?
14:13:46 If there is one, make a TM that writes all the parts of it onto its tape, "forever".
14:14:39 The point about 0.666... + 0.333... is that when you see the trillionth digit you might have to change your mind about the first digit.
14:14:46 hyperbinary is the most common expansion that works
14:14:51 Because the answer might be either 0.999...8 or 1.000...1
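[Editor's note: shachaf's 0.666... + 0.333... point can be made concrete with intervals: after any k digits of each summand, the set of possible sums still straddles 1, so an adder working on digit streams can never commit even the first output digit.]

```python
from fractions import Fraction

def partial_interval(k):
    """Possible sums given only the first k digits of 0.666... and 0.333..."""
    a = Fraction(int("6" * k), 10**k)   # first k digits of 0.666...
    b = Fraction(int("3" * k), 10**k)   # first k digits of 0.333...
    lo = a + b                          # remaining digits could all be 0
    hi = a + b + Fraction(2, 10**k)     # ...or all be 9
    return lo, hi

for k in (1, 10, 50):
    lo, hi = partial_interval(k)
    assert lo < 1 < hi   # 1 is always strictly inside: no digit can be fixed
```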
14:15:05 shachaf: Sure, the TM might have to revise its tape, all the way back to the beginning.
14:15:10 AFAIR there are several computable real definitions: weaker ones, stronger ones
14:15:14 Any representation of the computable reals might have to change its mind arbitrarily far in.
14:15:25 arseniiv: that doesn't surprise me
14:15:29 That's why I talked about finite approximations.
14:16:04 I'm not interested in approximations and I don't think you need to invoke them to talk about the objects themselves.
14:16:11 so some phenomena may exist only for those sufficiently weak/strong
14:16:18 cpressey: But Cauchy sequences are sequences of approximations.
14:16:38 All I'm talking about here is a particular kind of Cauchy sequence.
14:17:00 fwiw, Wikipedia seems to agree with cpressey's definition of computable reals
14:17:23 in the original definition by Minsky
14:17:41 but says that that definition was since found to be flawed
14:18:08 there's a section discussing the difference: https://en.wikipedia.org/wiki/Rounding#Table-maker's_dilemma
14:19:17 This is all somewhat related to something I thought of recently: Consider the computable, infinite initial GoL configurations. Some, like "infinite barberpole" or "empty space" can be simulated by a TM; others can't even though they are computable.
14:19:33 The latter relates to "Rice's theorem for computable reals".
14:19:37 There is no approximation going on here.
14:20:23 Er well, I should say, you could probably try to "approximately evolve" a GoL form.
14:20:24 to me it's not obvious that a TM can't simulate all computable GoL configurations: can't it compute the value of any given cell after n steps via computing the initial state of the (2n+1)×(2n+1) neighbourhood around it and simulating that?
14:20:31 I'm not sure on this, though
14:20:37 You might be even be able to get more than 5 bits of precision out of it, who knows.
14:21:11 shachaf: re the 5 bits thing: do you know what the smallest/simplest program is for which it isn't known whether or not it halts?
14:21:22 I guess in some mathematical formalization
14:21:32 ais523: BB(5) is unknown, right?
14:21:35 we ran that competition at CGCC and I'm currently winning with a 3-byte Brachylog program
14:21:39 With a few candidates.
14:21:41 shachaf: that sounds about right
14:21:43 ais523: Every column of the GoL playfield could be a different irrational number. Every cell needs to be evolved to the next state. The effects of all the cells propagate at the speed of light. I don't think a TM can do all that.
14:22:10 "different irrational number" = infinitely tall binary expansion of such, in dead/alive cells, of course
14:22:19 cpressey: it can compute the value of any specific cell after any specific number of steps
14:22:24 https://en.wikipedia.org/wiki/Busy_Beaver_game#Exact_values_and_lower_bounds
14:22:25 perhaps you're using a different definition?
14:22:55 ais523: Maybe I don't know what I'm talking about.
14:23:44 I think thinking of the reals as infinite sequences of digits is a very common computer scientist thing to do.
14:23:51 And I think it's really pretty wrong.
14:23:52 it's interesting… a TM seems incapable of holding "the whole thing" in its memory at any given time, but for any concrete question you have about a particular cell and a particular time, the TM can answer it
14:24:14 simply by computing only the relevant parts and discarding all the data it doesn't need
14:24:36 this is due to the speed-of-light issues: in a language like RUBE you couldn't do it because data transmission can be arbitrarily fast there
14:25:01 In what sense is that not holding "the whole thing" in memory?
14:25:01 I guess because you depend on the oracle?
14:25:58 much of the information about the state of the GoL automaton is never computed
14:26:57 Sure, but so what? If a program computes an infinite sequence of bits, it doesn't hold "the whole thing" in memory, but it can answer questions about any particular bit.
14:27:22 I guess the speed of light thing makes it slightly different.
14:27:31 This is the same trick that Hashlife uses.
14:28:00 ais523: There does seem to be an inherent slowdown at least; every time you need some information about a cell you haven't computed yet, you need to simulate its entire history so that you know for sure what it is
14:28:26 yes
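[Editor's note: a sketch of ais523's locality argument, with an invented example configuration: a cell's state after n GoL steps depends only on the initial (2n+1)x(2n+1) square around it, so any concrete cell/time query against a computable initial configuration can be answered without holding the whole plane.]

```python
def life_cell(initial, x, y, n):
    """State of cell (x, y) after n Game of Life steps.

    initial: function (i, j) -> 0/1 giving the (computable) starting
    configuration; only a (2n+1)x(2n+1) window around (x, y) is read.
    """
    size = 2 * n + 1
    grid = {(i, j): initial(x + i - n, y + j - n)
            for i in range(size) for j in range(size)}
    for _ in range(n):
        new = {}
        for (i, j), v in grid.items():
            nb = sum(grid.get((i + di, j + dj), 0)
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)
                     if (di, dj) != (0, 0))
            new[(i, j)] = 1 if nb == 3 or (v == 1 and nb == 2) else 0
        grid = new
    return grid[(n, n)]   # centre cell is inside the light cone throughout

# Hypothetical example: a 2x2 block (a still life) stays alive.
block = lambda i, j: 1 if 0 <= i <= 1 and 0 <= j <= 1 else 0
assert life_cell(block, 0, 0, 3) == 1
```

The cost of re-deriving the window from scratch for every query is exactly the "inherent slowdown" noted above.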
14:29:00 actually this is one of the main reasons I don't like Turing machines as the standard basis of computation
14:29:10 they have inherent slowdowns in a weird and artificial-feeling way
14:29:39 I prefer counter machines because they are at least consistently slow, which feels more natural than Turing machines being slow at some things and fast at others
14:31:29 counter machines ftw!
14:31:41 what about queue automata?
14:31:49 I have a hard time believing simulating such a GoL playfield on a counter machine would not also have the same slowdown
14:32:10 TMs but without the need for data shifting.
14:32:16 cpressey: it does, probably worse
14:32:30 my complaints about TMs aren't that the slowdown exists, but that it's applied inconsistently
14:32:36 Okay
14:36:22 Still, say there is some expansion of a real that a TM can write to a tape. A real is computable if some TM writes its expansion to its tape "eventually". Then: if A and B are computable reals, then why is A + B not a computable real?
14:36:41 A + B will also be written "eventually"
14:36:56 What's your answer to my 0.666... + 0.333... question?
14:37:35 That if you have a TM computing 0.666... and a TM computing 0.333... you can make a TM that computes 0.666... + 0.333 ?
14:37:42 Oh, wait, you said something I didn't expect, which is that a program can change its mind and erase digits after it writes them.
14:37:51 I missed the final ellipsis there
14:37:59 I thought the output was append-only.
14:38:13 shachaf: hmm, I think that works
14:38:14 If it's not then these things don't seem useful.
14:38:36 What does?