00:20:21 <esolangs> [[Make me blush]] https://esolangs.org/w/index.php?diff=117913&oldid=116815 * Kaveh Yousefi * (+207) Added a hyperlink to my implementation of the Make me blush programming language on GitHub and introduced the category tag Implemented.
00:21:16 <esolangs> [[Make me blush]] https://esolangs.org/w/index.php?diff=117914&oldid=117913 * Kaveh Yousefi * (+4) Rectified the ASCII Loop examples, the same was inflicted with two mistakes: (1) The token becuase in lieu of because, (2) a missing and.
00:25:46 <esolangs> [[Make me blush]] https://esolangs.org/w/index.php?diff=117915&oldid=117914 * Kaveh Yousefi * (+499) Supplemented a character-based cat program and introduced a truth-machine example.
00:30:48 <esolangs> [[Make me blush]] M https://esolangs.org/w/index.php?diff=117916&oldid=117915 * Kaveh Yousefi * (+202) Reformatted the command listing as a table.
00:55:57 -!- Lord_of_Life has quit (Ping timeout: 260 seconds).
00:55:59 -!- Lord_of_Life_ has joined.
00:57:19 -!- Lord_of_Life_ has changed nick to Lord_of_Life.
01:20:06 <esolangs> [[Three Star Programmer]] https://esolangs.org/w/index.php?diff=117917&oldid=117912 * Ais523 * (-60) merge "implementations" and "external resources" sections, these are normally combined and there isn't an obvious reason for them to be separate
01:52:42 -!- craigo_ has quit (Read error: Connection reset by peer).
01:53:00 -!- craigo_ has joined.
02:33:16 <esolangs> [[Trampolines]] https://esolangs.org/w/index.php?diff=117918&oldid=117889 * Aadenboy * (+479) new commands
04:33:01 -!- b_jonas has quit (Ping timeout: 255 seconds).
05:02:10 <esolangs> [[B sharp]] M https://esolangs.org/w/index.php?diff=117919&oldid=75182 * Lilchiky * (+38) wrongtitle
05:02:47 -!- ais523 has quit (Quit: quit).
05:39:52 <esolangs> [[A?b.]] M https://esolangs.org/w/index.php?diff=117920&oldid=105539 * Lilchiky * (+13) 'not capitalised'
07:16:35 -!- Koen has joined.
07:55:55 -!- cpressey has joined.
08:14:50 -!- awewsomegamer has joined.
08:16:32 -!- awewsomegamer has quit (Client Quit).
08:21:23 -!- cpressey has quit (Ping timeout: 245 seconds).
08:51:09 -!- cpressey has joined.
09:59:19 -!- Sgeo_ has quit (Read error: Connection reset by peer).
09:59:49 -!- arseniiv has joined.
10:15:09 -!- cpressey has quit (Quit: Client closed).
10:42:43 -!- FireFly has changed nick to Luci-ghoule.
11:08:37 -!- b_jonas has joined.
11:56:25 -!- Thelie has joined.
13:12:46 -!- Koen has quit (Remote host closed the connection).
13:15:32 -!- Koen has joined.
13:20:07 -!- Koen has quit (Ping timeout: 264 seconds).
13:31:00 -!- op_4 has quit (Quit: ZNC - https://znc.in).
13:31:21 -!- Thelie has quit (Quit: Leaving.).
13:31:41 -!- op_4 has joined.
13:41:09 <esolangs> [[F!--]] https://esolangs.org/w/index.php?diff=117921&oldid=112716 * Kaveh Yousefi * (+159) Added a hyperlink to my implementation of the F!-- programming language on GitHub and changed the category tag Unimplemented to Implemented.
13:44:23 <esolangs> [[F!--]] https://esolangs.org/w/index.php?diff=117922&oldid=117921 * Kaveh Yousefi * (+281) Supplemented a juxtaposition of the commands defined for F!--, F!, and Deadfish.
13:50:54 <esolangs> [[B2C]] https://esolangs.org/w/index.php?diff=117923&oldid=115061 * None1 * (+21) /* Hello World (last because it's the hardest one) */ Fixed Hello World program that previously prints "Hdkkn Mehbz/"
13:52:42 <esolangs> [[B2C]] https://esolangs.org/w/index.php?diff=117924&oldid=117923 * None1 * (+2576) Added JavaScript interpreter and implemented category tag
13:55:45 <esolangs> [[B2C]] M https://esolangs.org/w/index.php?diff=117925&oldid=117924 * None1 * (-10) /* Interpreter */
14:07:37 <esolangs> [[Interpret Esolangs Online]] https://esolangs.org/w/index.php?diff=117926&oldid=115598 * None1 * (+10) /* Introduction */ Interpret Esolangs Online now supports B2C
14:40:13 -!- cpressey has joined.
15:48:30 <esolangs> [[Brainfuck]] https://esolangs.org/w/index.php?diff=117927&oldid=117750 * Hakerh400 * (+163) Add an implementation in Haskell
15:56:41 <cpressey> OK, new plan. Scrapping the Scheme compiler. Just gonna write the state transformation functions in Lua.
16:22:20 -!- craigo_ has quit (Quit: Leaving).
16:30:27 <b_jonas> ah yes, that's a good way to get an esoteric language. plan a domain-specific language that you want to use for some particular purpose, then find that you don't want to use it after all, ends up unused and esoteric
16:33:17 -!- Europe2048 has joined.
16:52:25 -!- Europe2048 has quit (Quit: Client closed).
16:54:35 -!- FortyTwoBB has joined.
16:59:15 <FortyTwoBB> @ais523 does flooding waterfall get tripped up by having clocks with minimum values? Because Xathrid Necromancer must share a type with the creature that triggers it, effectively every waterclock starts at a minimum of [the sum of their row]. We can add more indestructible creatures to flatten the disruption for each clock.
17:00:53 <esolangs> [[NONE]] https://esolangs.org/w/index.php?diff=117928&oldid=117616 * Jaip * (-20)
17:02:00 <FortyTwoBB> So if every clock has 413612 or whatever dummy creatures pumping it up, the first one to fall to 413611 will be the first to zero. The dummy creatures don't affect the multiplier because they don't die.
17:09:40 <FortyTwoBB> I think this shift works in a similar way to how it did for normal FWC?
17:15:58 -!- ais523 has joined.
17:18:31 <ais523> so the problem is that the tokens come in with a fairly large toughness, based on the number of zeroing triggers that mention them, rather than 1 like they're supposed to
17:19:22 <ais523> that adjusts the length of time between first token creation and token death, but without adjusting the multiplier on the zeroing trigger
17:20:31 <ais523> so this can be compensated for by making all the tokens have the same toughness boost, yes – this will spread the cycles out in time more, but the computation will still proceed the same way with the same numbers, just with a small delay as it changes from one cycle to the next
17:20:51 <ais523> (this fix doesn't work for arbitrary Flooding Waterfall Model programs but does work for the programs generated by the compiler)
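[Editor's note: the waterclock/zeroing-trigger mechanics under discussion are easier to see in code. Below is a minimal sketch of an interpreter for the original The Waterfall Model (not the Flooding variant), based on the esolangs.org description; the function name, argument layout, and halting convention are my own choices, and real programs must ensure no two clocks ever zero simultaneously.]

```python
# Minimal sketch of The Waterfall Model: every waterclock ticks down by
# 1 each step; when one hits zero, its zeroing trigger adds a fixed row
# of values to every clock (including itself, which is how it resets).
# Halting convention here: the program halts when clock `halt` zeroes.
def run_waterfall(clocks, trigger, halt=0, max_steps=10_000):
    """Run until the `halt` clock zeroes; return steps taken, or None."""
    clocks = list(clocks)
    for step in range(max_steps):
        clocks = [c - 1 for c in clocks]
        zeroed = [i for i, c in enumerate(clocks) if c == 0]
        assert len(zeroed) <= 1, "two clocks zeroing at once is invalid"
        if zeroed:
            i = zeroed[0]
            if i == halt:
                return step + 1
            clocks = [c + t for c, t in zip(clocks, trigger[i])]
    return None
```

Adding the same constant to every clock (the dummy-creature toughness boost discussed above) delays each zeroing but, per ais523, leaves the sequence of zeroing triggers unchanged for compiler-generated programs.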
17:21:08 <ais523> also I suspect you don't need multiple copies of Coat of Arms in order to handle steady damage values greater than 1
17:21:40 <ais523> I would expect you could just make the zeroing triggers larger to compensate
17:22:04 <ais523> the baseline counters used by the compiler can compensate for almost anything, it is a pretty flexible construction
17:22:05 <FortyTwoBB> But that requires more material than just making 2 artifacts
17:22:36 <ais523> you want to save on tokens, so a token Coat of Arms is cheaper than a token Xathrid Necromancer
17:22:54 <ais523> or, at least, not massively more expensive, so one Coat of Arms is cheaper than millions of Necromancers
17:23:47 <ais523> incidentally, as long as your Bishop of Wings / Xathrid Necromancer / etc. has two creature types
17:23:54 <ais523> it does not need to be indestructible in order to make the construction work
17:24:24 <ais523> it is a bit hard to remember what has and hasn't been discovered in the thread
17:24:30 <ais523> seeing as it's over 200 pages long now
17:25:22 <FortyTwoBB> but we need exactly either !Martyr of Spores or !Resolute Watchdog and might as well go with the indestructible option.
17:25:33 <ais523> I tried to reread it yesterday to figure out how the iterated (and hyperiterated) busy beavers work, but got a little confused
17:25:37 <ais523> `card-by-name martyr of spores
17:25:39 <HackEso> Martyr of Spores \ G \ Creature -- Human Shaman \ 1/1 \ {1}, Reveal X green cards from your hand, Sacrifice Martyr of Spores: Target creature gets +X/+X until end of turn. \ CSP-C
17:25:47 <ais523> `card-by-name resolute watchdog
17:26:00 <ais523> the bot is somewhat outdated in terms of its M:tG knowledge
17:26:22 <FortyTwoBB> 1 sac self: target creature gains indestructible until eot
17:26:37 <ais523> fwiw, the hardest part of the construction seems to me to be to make sure that there are no infinite loops in it
17:28:10 <ais523> btw, do you know of any good substitutes for Arcbond? I have been a bit frustrated with my "competitive Turing-complete deck" project because I am so close to getting it down to 6 cards (beyond those that already exist in the deck I'm basing this on)
17:28:51 <ais523> but need a seventh to prevent the turn player from having the option to die from their own Arcbond triggers while their lifegain triggers from Bishop of Wings are still on the stack
17:28:53 <FortyTwoBB> that's why I'm not sure Vesuvan Duplimancy is ok, because now there can be waiting triggers to make a token of something that would have fizzled if the copy effect was a spell like Fated Infatuation
17:29:28 <ais523> there are four obviously required cards (Arcbond, Bishop of Wings or an equivalent, Coat of Arms, Artificial Evolution)
17:29:56 <ais523> and two cards is not *quite* enough to cover token creation, token donation, and keeping the turn player alive
17:30:48 <ais523> (also stock Ruby Storm doesn't have any infinite loops in it, unless I missed one, so I had the opposite problem from your thread – I needed to add a way to create an infinite loop so that I could set up arbitrarily large programs)
17:31:12 <ais523> my current attempt is to add Fractured Identity, Riftsweeper, and any random lifelinker
17:32:00 <ais523> (the deck naturally contains Inspired Tinkering , Past In Flames and Bonus Round, which collectively give you an infinite but somewhat stupid combo with Fractured Identity and Riftsweeper)
17:32:43 <ais523> (in addition to Fractured Identity + Riftsweeper being able to give the opponent tokens without losing the original card)
17:35:04 <ais523> `card-by-name Soulfire Grand Master
17:35:05 <HackEso> Soulfire Grand Master \ 1W \ Creature -- Human Monk \ 2/2 \ Lifelink \ Instant and sorcery spells you control have lifelink. \ {2}{(u/r)}{(u/r)}: The next time you cast an instant or sorcery spell from your hand this turn, put that card into your hand instead of into your graveyard as it resolves. \ FRF-M
17:35:15 <ais523> that has the most relevant ability I could find on a lifelinker
17:35:33 <ais523> having both lifelink and an infinite
17:35:49 <FortyTwoBB> yeah, I was trying to remember that exact card lol
17:35:51 <ais523> but I couldn't find a single other card to pair it with
17:37:39 <ais523> ah, I was wondering if that lingering effect could be used as a counter for some sort of stage construction, but it can't, it has the Netrunner-style wording where it doesn't stack with itself properly
17:39:01 <FortyTwoBB> yeah it just goes infinite or doesn't do much
17:40:50 <ais523> I've been kind-of wondering whether I should just try to find a maindeck slot for The One Ring
17:42:18 <ais523> ooh, Last Laugh is probably usable, but seems to have no advantages over Arcbond
17:42:22 <ais523> `card-by-name Last Laugh
17:42:23 <HackEso> Last Laugh \ 2BB \ Enchantment \ Whenever a permanent other than Last Laugh is put into a graveyard from the battlefield, Last Laugh deals 1 damage to each creature and each player. \ When no creatures are on the battlefield, sacrifice Last Laugh. \ TOR-R
17:43:41 <ais523> Massacre Girl might actually work for my construction
17:43:49 <ais523> because it doesn't hurt players
17:43:50 <FortyTwoBB> arcbond is unique in that it always triggers itself the same amount no matter how many creatures die
17:44:02 -!- FreeFull has joined.
17:44:22 <ais523> for me, the actual number of triggers doesn't matter as long as there are enough of them, because they just pile up on the bottom of the stack
17:44:24 <FortyTwoBB> yeah you just need to do a bit more setup to have a clock that always dies
17:45:13 <ais523> but I agree that making the clock die would be a problem – you probably have to kill all the token creators
17:47:57 <FortyTwoBB> well with bishop of wings, you get to keep them alive
17:48:44 <ais523> but arcbond damages yourself
17:49:07 <FortyTwoBB> yeah but you can keep a bishop to keep yourself alive
17:49:20 <ais523> the problem is that you get to stack the bishop triggers and arcbond triggers
17:49:29 <ais523> and if you always stack the bishop triggers on the bottom, you lose
17:49:40 <ais523> so it isn't a perfect choiceless loop
17:49:49 <ais523> of course, this doesn't matter for your construction, because you can choose to stack them correctly
17:50:40 <ais523> there's more than one definition of Turing-complete
17:50:59 <FortyTwoBB> because a nondeterministic Turing machine that has the option to catch fire at any step would still be Turing-complete, no?
17:51:01 <ais523> for the Netrunner Turing-complete proof, I had to resort to "it's Turing-complete unless a player makes a decision that causes them to immediately lose the game"
17:51:48 <ais523> for Magic, it's possible to get a zero-choices Turing-completeness construction, which is more interesting than a "Turing complete unless you choose to lose" construction, although both would normally be considered to be Turing-complete
17:52:13 <ais523> but zero-choices is nice because you can F6 (or the in-person equivalent) and just have the program run itself
17:52:30 <APic> Good old self-running Programs alias Multiverses 😉
17:52:32 <ais523> (Flooding Waterfall Model doesn't run on MTGO, incidentally, because the numbers get too large too quickly)
17:52:47 <FortyTwoBB> yeah you can set triggers to auto stack in a certain order
17:53:36 <APic> ais523: Can You explain the Niagara-Falls to me, please? 😉
17:53:58 <ais523> APic: what, the real life geographical landmark? or the various waterfall-based esolangs?
17:54:17 <ais523> the esolangs are defined at https://esolangs.org/wiki/The_Waterfall_Model and https://esolangs.org/wiki/Flooding_Waterfall_Model
17:54:25 <APic> Some People seem to like going down with wooden Barrels
17:54:42 <ais523> and there is a tutorial for the former at http://nethack4.org/esolangs/waterfall/
17:54:56 <ais523> unfortunately I can't explain the real-life waterfall
17:55:24 <APic> Aaah, good old Magick
17:55:27 <ais523> (Flooding Waterfall Model is rather harder to understand than the original because each of the waterclocks is associated with two counters rather than one)
17:55:40 <APic> Okay, at least You tried, ktnx 😌
17:56:14 <APic> Good old JSON ♥
17:56:36 <FortyTwoBB> yeah mtgo/mtga etc really don't like large numbers
17:57:42 <FortyTwoBB> I did the polyraptor forerunner of the empire combo in limited to get several million 5/5
17:58:23 <ais523> I don't think I've ever done a large combo in limited – the best I ever managed was winning both games of the same match with Coalition Victory
17:58:46 <ais523> but, my opponent had mostly only played multiplayer, meaning that they were playing much more defensively than a typical limited player would, so it doesn't really count
18:05:02 -!- Koen has joined.
18:06:31 <ais523> ooh, actually getting halting to work is pretty easy using Massacre Girl and Bishop of Wings – Bishop of Wings shares no creature types with the creatures it creates, so you just use Human or Cleric as the halt counter
18:07:14 <ais523> this is probably best for my construction because I don't actually need an output from it, just halt / non-halt
18:08:34 <ais523> it does have the problem of your combo finisher giving the opponent double-exponentially large amounts of life, making it hard to cause an automatic game win
18:09:13 <ais523> but it's probably possible to set up a computation that creates a token army on one side of the field or the other
18:09:28 <ais523> depending on the result
18:09:42 <FortyTwoBB> yeah and then you make more coat of arms and use the giant creatures to win
18:13:11 <ais523> I'm not sure if the coat of arms would even matter at that point, quadratic doesn't put a dent in double-exponential
18:13:54 <ais523> likewise, Flooding Waterfall Model doesn't give you a meaningful amount of extra output compared to the original (and may even give less) because exponential growth is trivially small compared to busy beaver numbers
18:17:43 <FortyTwoBB> oh right you can't convert the bb output into coat of arms
18:21:13 -!- Europe2048 has joined.
18:29:35 <ais523> FortyTwoBB: by the way, what's the best way for me to contact the rest of you when I have something to say?
18:32:36 <APic> ais523: Just let Your Client stay here 24/7
18:32:56 <ais523> APic: my computer isn't switched on 24/7, nor is my Internet connection
18:33:01 <ais523> and I read the logs quite a lot
18:33:10 <ais523> Europe2048: tired, as usual
18:33:52 <ais523> I am fed up with existing parser generators, and feel like I could do better
18:33:55 <Europe2048> What's a parser generator? Also, what language?
18:34:39 <b_jonas> wait, it's Bishop of Wings now?
18:34:47 <b_jonas> I thought it was one of two other similar cards
18:34:48 <ais523> a parser generator is a program that generates a parser, and a parser is a program or subroutine that converts text input into a machine-readable form
18:34:53 <APic> ais523: I switch my Zarniwoop off when i go to sleep too, but i have a VM in the Switzerlands
18:35:00 <ais523> b_jonas: it depends on what specific construction we're talking about
18:35:23 <ais523> Bishop of Wings is the most convenient in most respects, but the lifegain trigger often screws things up
18:35:27 <b_jonas> I'm not following Magic at all these days so I shouldn't be surprised that there are useful cards that I hadn't heard of
18:35:29 <Europe2048> ais523: So it's like a [language]-to-assembly converter.
18:35:43 <ais523> so some constructions use, e.g., Xathrid Necromancer instead
18:35:46 <ais523> Europe2048: not to assembly
18:35:54 <ais523> just to a data structure that represents the original program
18:36:15 <ais523> going all the way to assembly is called a compiler; compilers will normally contain a parser but they have other parts too
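[Editor's note: as a concrete illustration of ais523's point that a parser turns text into a data structure, and nothing more, here is a hand-written recursive-descent parser for toy arithmetic. The tuple AST shape is an arbitrary choice of mine; note how the grammar's stratification into expr/term/factor is what encodes `*` binding tighter than `+`.]

```python
# Text in, data structure out -- no code generation involved.
# Grammar: expr = term ('+' term)*, term = factor ('*' factor)*,
# factor = NUMBER. Produces nested tuples like ("+", 1, ("*", 2, 3)).
import re

def parse(src):
    tokens = re.findall(r"\d+|[+*]", src)
    pos = 0
    def peek():
        return tokens[pos] if pos < len(tokens) else None
    def eat():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        return tok
    def factor():
        return int(eat())            # NUMBER
    def term():
        node = factor()
        while peek() == "*":         # left-associative '*' chain
            eat()
            node = ("*", node, factor())
        return node
    def expr():
        node = term()
        while peek() == "+":         # left-associative '+' chain
            eat()
            node = ("+", node, term())
        return node
    return expr()
```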
18:36:41 <Europe2048> b_jonas: I don't have any Pokemon, Yu-Gi-Oh, etc. cards.
18:36:47 <APic> Good old Lexer
18:37:09 <ais523> Europe2048: that's probably for the best, they are not good value for money
18:37:10 <Europe2048> APic: I don't have any Pokemon, Yu-Gi-Oh, etc. cards.
18:37:18 <APic> Europe2048: So?
18:42:31 <APic> Does reading in the IRCs count as hearing?
18:43:19 <int-e> depends on how literal you are
18:43:34 <zzo38> I sometimes play Pokemon card, since I have some older cards
18:44:15 <b_jonas> wait, *another* parser generator?
18:45:11 <ais523> ayacc was primarily intended as something that could be used to compile programs that depend on POSIX yacc
18:45:55 <b_jonas> making more output templates for ayacc sounded plausible, eg. I might want a rust one, or a stackless C++ one
18:46:24 <ais523> whereas with the new one, I'm trying to create something much better than yacc – able to handle more grammars, better at detecting mistakes in the grammar, and the resulting parsers run faster
18:47:03 <ais523> one principle I want is that the parser generator should be powerful enough to handle a combined parser/lexer that's produced simply by writing the lexer rules as parser rules
18:47:03 <b_jonas> ok, though ayacc is already better at detecting mistakes than yacc, and the resulting parsers likely run faster
18:47:08 <ais523> current yacc completely fails at that
18:47:27 <ais523> and ayacc uses the same algorithm for compatibility, so it fails too
18:47:38 <b_jonas> the lexer rules as parser rules? I specifically don't want that, that would just make the grammar harder for humans to understand
18:47:55 <b_jonas> if you want to make a separate lexer generator, that could make sense of course
18:48:57 <ais523> b_jonas: say you don't have the same tokens in every place in your document
18:49:10 <b_jonas> also someone pointed out on this channel that the problem with having the parser grammar handle lexing is that it's harder to tell it to ignore whitespace and comments between almost any two tokens
18:49:28 <ais523> that is a problem I've been thinking a lot about
18:50:08 <b_jonas> sure, I want that for my Python syntax extension, but if I have control over both the parser and the lexer then that's not a hard problem
18:50:10 <ais523> I think the correct solution to both of these problems is to have parser-ish rules and lexer-ish rules separate in the input format, but you're allowed to mix them in ways that a separate parser and lexer normally wouldn't be able to
18:50:38 <ais523> b_jonas: it's not a hard problem with respect to correctness, but it is a hard problem with respect to efficiency
18:50:52 <ais523> because if the parser is going to tell the lexer what sort of tokens to lex, that limits the evaluation order of the parser
18:51:15 <ais523> which invalidates a lot of possible optimisations
18:51:33 <b_jonas> right, and it gets worse if you also want the execution stage to tell the parser or lexer what to accept
18:52:10 <ais523> C-INTERCAL's parser currently does spark/ears matching in the lexer
18:52:41 <ais523> I'm not sure whether it's possible to write it in pure yacc+lex
18:52:54 <ais523> but if it is it'd probably involve duplicating a lot of rules
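[Editor's note: the kind of context-dependent lexing being discussed, where the parser's position determines which token rules apply, can be sketched with an explicit lexer mode, much like lex start conditions. The toy language here is invented for illustration and assumes well-formed input of lowercase letters, '-', and brackets.]

```python
# A toy mode-switching lexer: the same character, '-', is a separate
# MINUS token at top level but an ordinary word character inside
# brackets. Real lex expresses this with start conditions.
import re

def lex(src):
    tokens = []
    mode = "top"
    i = 0
    while i < len(src):
        c = src[i]
        if c == "[":
            tokens.append(("LBRACK", c)); mode = "inner"; i += 1
        elif c == "]":
            tokens.append(("RBRACK", c)); mode = "top"; i += 1
        elif mode == "top":
            m = re.match(r"[a-z]+|-", src[i:])
            tokens.append(("MINUS" if m.group() == "-" else "WORD",
                           m.group()))
            i += m.end()
        else:  # inside brackets, '-' is part of a word
            m = re.match(r"[a-z-]+", src[i:])
            tokens.append(("WORD", m.group()))
            i += m.end()
    return tokens
```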
18:54:08 <b_jonas> I don't like lex. I only used it like once, after a teacher teaching the course involving lex and yacc told me that he can't give me full marks for homework if my simple tokenizer does not actually use lex.
18:54:36 <ais523> now I'm wondering how good yacc is at tokenising
18:54:42 <ais523> (apart from the output format, which is wrong)
18:54:51 <ais523> just in terms of the algorithms
18:55:11 <b_jonas> that's how geo ended up with a flex tokenizer
18:55:18 <ais523> I think it is very difficult to specify things like "identifier that isn't a keyword" in yacc
18:55:46 <b_jonas> and scan too. then I used lex twice.
18:55:59 <ais523> (not impossible but you have to basically write a trie of every prefix that no keyword starts with manually)
18:57:12 <b_jonas> I think the original use case of lex was to create a tokenizer that efficiently recognizes a dozen to a few hundred keywords, which look like identifiers until you learn the list
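[Editor's note: for the keyword-vs-identifier case specifically, the standard lexer-side answer avoids the trie ais523 mentions entirely: match the identifier shape first, then demote to a keyword token by table lookup. A sketch; the token names and keyword set are mine.]

```python
# "Identifier that isn't a keyword", the lexer way: one rule matches
# the identifier shape, and a lookup table reclassifies keywords.
import re

KEYWORDS = {"if", "else", "while"}

def lex(src):
    tokens = []
    # match identifier-shaped runs, otherwise single non-space chars
    for m in re.finditer(r"[A-Za-z_]\w*|\S", src):
        text = m.group()
        if text in KEYWORDS:
            tokens.append((text.upper(), text))      # keyword token
        elif re.fullmatch(r"[A-Za-z_]\w*", text):
            tokens.append(("IDENT", text))
        else:
            tokens.append(("PUNCT", text))
    return tokens
```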
18:57:27 <ais523> C-INTERCAL has a pre-lexer that expands ! to '. (and 🐰 to ".)
18:57:54 <b_jonas> that's a pre-lexer, not a post-lexer?
18:58:01 -!- Sgeo has joined.
18:58:15 <ais523> I think it runs before the lexer, not sure though
18:58:17 <zzo38> I have mostly just written the parser and lexer in C, although having them separate might be useful sometimes too.
18:58:54 <b_jonas> I guess it can be a pre-lexer, since intercal doesn't have string/format literals
18:58:56 <ais523> I have handwritten a few parsers recently
18:59:10 <ais523> b_jonas: if it did the expansion would have to be before the lexer
18:59:22 <ais523> the whole point is that a ! can match a later '
19:00:37 <b_jonas> ais523: I don't mean string literals delimited by apostrophes or rabbit ears, I mean string literals in which an exclamation mark has to be preserved as is without transforming to apostrophe and dot
19:01:05 <b_jonas> oh, you're saying that the lexer determines which apostrophe is opening or closing
19:01:20 <b_jonas> yes, then it makes sense to expand exclams after that
19:01:31 <b_jonas> expand exclams *before* that
19:01:46 <ais523> anyway, my point is that if INTERCAL did have '-delimited string literals, an exclamation mark would end the string
19:02:11 <zzo38> Note that keywords need not look like identifiers if they are distinguished somehow (like they are in LLVM by adding sigils, for example; some of my own designs of programming languages do something similar, too).
19:02:46 <ais523> zzo38: in the specification for Algol, identifiers and keywords are written in different fonts, and are allowed to have the same spelling
19:02:57 <ais523> that created some interesting challenges for implementations
19:03:32 <ais523> I think the portable standard was to write a . before keywords so that they could be distinguished from identifiers formed from the same letters
19:03:34 -!- arseniiv has quit (Quit: gone too far).
19:03:35 <cpressey> I avoid most "parser generators". I either handcode the lexer and parser, or I want to use something more solidly theory-based than lex and yacc
19:03:53 <ais523> cpressey: what do you mean by "solidly theory-based"?
19:03:56 <b_jonas> they probably got that from the fortran .LE. operator then
19:04:15 <cpressey> ais523: like a parser combinator library, or an attribute-grammar-based formalism
19:04:42 <cpressey> lex and yacc be all like "let's intersperse some C code with some other junk in an expedient fashion"
19:05:07 <ais523> cpressey: I am hugely opposed to parser combinator libraries, for two reasons: a) they make it easy to write an ambiguous grammar without realising it, b) they make it easy to write a very slow parser without realising it
19:05:33 <ais523> I also disagree with many decisions lex/yacc have made, including that one
19:05:51 <zzo38> You could use a character set that uses different character codes for keywords vs identifiers, if such a thing is necessary
19:05:56 <ais523> …this is part of why I want to write my own parser generator
19:05:58 <APic> My Friend who studies computer science at the FernUni Hagen has to write Pascal code
19:06:12 <APic> Very easy to make a Pascal Compiler
19:06:17 <APic> Very ugly to program in it
19:06:21 <ais523> zzo38: a common nonportable approach at the time was to use uppercase vs. lowercase
19:06:26 <b_jonas> well, in theory you could have a parser combinator library that doesn't have full backtracking alternation, but only alternation where you explicitly have to specify the lookahead and the lookahead condition is more limited than the rest of the grammar
19:06:28 <cpressey> ais523: I don't really disagree, a lot of parser combinator libraries are annoying to use
19:06:37 <ais523> but in 1968 I'm not sure that all computers even supported two cases
19:07:23 <ais523> the thing I am most opposed to in yacc is the way it handles precedence
19:07:28 <zzo38> I wrote this file: http://sprunge.us/y2f0gi Could something like this be possible for parsing?
19:07:40 -!- Thelie has joined.
19:07:47 <ais523> basically because it is very easy to handle precedence in a mathematically rigorous way, but yacc does something else
19:07:55 <b_jonas> I find yacc precedences confusing, but that might be only because I don't understand its rules
19:08:27 <ais523> zzo38: most parser generators have input that looks something like that, but typically not with exactly that syntax
19:08:33 <b_jonas> I think you explained to me at one point that they don't work anything like I thought they work
19:08:49 <ais523> b_jonas: no, you're right to be confused, its rules are designed for implementation convenience rather than making any sense
19:08:53 <zzo38> (Although, this file I did does not have precedence since it is for a programming language that does not have precedence.)
19:09:19 <b_jonas> anyway, if you get this new parser generator to work I'll be interested
19:09:52 <ais523> zzo38: any grammar that can be expressed using precedence can be expressed without it, it just becomes bigger and harder to read
19:10:18 <ais523> as such, my opinion about precedence is that parser generators should simply be written to do that transformation before generating the parser
19:11:30 <ais523> which would give the most intuitive possible behaviour – it would not be tied to the parser internals at all, and there would never be a case where the parser generator doesn't tell you about an ambiguity because it thought you had resolved it with precedence rules, when you were actually trying to resolve some other ambiguity
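[Editor's note: the transformation ais523 describes is the textbook one. For the usual ambiguous expression grammar with yacc-style precedence declarations, the equivalent precedence-free grammar stratifies into one nonterminal per precedence level; the two grammars below are shown side by side for illustration, not as a single valid yacc input.]

```yacc
/* with yacc-style precedence declarations */
%left '+'
%left '*'        /* declared later, so binds tighter */
%%
expr : expr '+' expr
     | expr '*' expr
     | NUMBER
     ;

/* the same language with precedence eliminated:
   one nonterminal per precedence tier, left recursion
   giving left associativity */
%%
expr   : expr '+' term   | term   ;
term   : term '*' factor | factor ;
factor : NUMBER ;
```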
19:13:49 <ais523> b_jonas: there's a great example in the paper about IELR where yacc (and Bison in LALR mode) can't parse some inputs because there is a lossy optimisation in LALR that is normally not a problem because it causes an ambiguity to be reported in cases where the optimisation is applied lossily, but that ambiguity ends up getting overridden by a precedence rule, with the consequence that some valid input just doesn't parse
19:14:08 <ais523> and this was actually discovered to affect at least one real-world program
19:16:48 <b_jonas> ais523: is that in a situation when the token that is in the yacc precedence rule is used in more than one different way, and the precedence was supposed to have affected only one of them? because that's the obvious easy way to mess up yacc precedence, but I assume it's not the only way
19:17:31 <ais523> b_jonas: probably; I can't remember the details, but that is a very common way to go wrong in yacc
19:18:39 <APic> What exactly differentiates Lexers from Parsers, apart from the Input-Chain-Length?
19:20:23 <ais523> APic: the distinction is sort-of artificial; but normally lexer generators generate finite state machines and parser generators generate push-down automata, so lexers have to be finite state transducers to be generated with a typical lexer generator
19:22:14 <ais523> although, nowadays languages often have fancy forms of string literal that can't be parsed finite-state, e.g. raw string literals which are surrounded by arbitrarily long sequences so you can pick one that doesn't appear in the string itself, or here-documents
19:22:32 <ais523> so it's unclear whether there's actually a mathematical distinction at this point or whether it's just a matter of tradition
19:23:10 <ais523> (that said, a push-down automaton can't do the raw string literals either, so some sort of generalisation is needed)
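[Editor's note: the raw-string case is easy to make concrete. The closing delimiter must repeat the opening one verbatim, which no finite automaton (and no plain push-down automaton) can check for unbounded delimiters; a regex backreference, itself already beyond regular languages, handles it. The 16-character bound below is the C++ standard's limit on the delimiter; the pattern is my sketch, not production-grade C++ lexing.]

```python
# Match a C++-style raw string literal R"delim( ... )delim".
# \1 is a backreference: the close must repeat the open delimiter.
import re

RAW = re.compile(r'R"([^()\s\\]{0,16})\((.*?)\)\1"', re.S)

m = RAW.search('x = R"abc(not )" the end)abc";')
# m.group(1) is the delimiter, m.group(2) the raw contents,
# including the embedded )" that a naive lexer would stop at.
```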
19:23:51 <APic> Just noticed „unclear“ has „nuclear“ as an anagram
19:32:36 <b_jonas> if you make a parser generator that knows how to make a lexer too, that could be useful, because it could tell you if you want to introduce a new digraph symbol when that is ambiguous with the existing tokens in a way that can occur according to the parser, eg. it could tell you if : > could occur in C++ so you might not want to use :> as a digraph (it's way too late not to use :> as a digraph in C++)
19:35:50 <b_jonas> funnily, push-down automata are enough for Lua's long delimited string literals, but not for C++'s long delimited string literals (unless you want to add like 2**128 states, because the standard says that the extra delimiter is at most 16 characters long)
19:36:37 <b_jonas> and then there's mime's delimited format, which uses even longer delimiters, so that you can just make the delimiter long and random and hope it doesn't occur in the data that you're delimiting
19:38:34 <cpressey> ais523: the ease with which you can accidentally make a backtracking parser with most parser combinator libraries reminds me that I thought about taking measures to avoid that in my AG formalism: a production has to be marked as "backtracking ok" otherwise it won't be allowed to. (haven't worked out the details yet though.)
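[Editor's note: cpressey's "backtracking ok" marking can be prototyped in a few lines of combinators: alternation refuses to rewind past a branch that consumed input unless the production is explicitly flagged, much like PEG cut or parsec's `try`. Everything here, the names and the three-tuple protocol, is invented for illustration.]

```python
# A parser is a function (src, pos) -> (ok, value, newpos); on failure,
# newpos records how far the branch got before failing.
def lit(s):
    def p(src, pos):
        if src.startswith(s, pos):
            return True, s, pos + len(s)
        return False, None, pos          # failed without consuming
    return p

def seq(*parsers):
    def p(src, pos):
        values, cur = [], pos
        for q in parsers:
            ok, v, cur = q(src, cur)
            if not ok:
                return False, None, cur  # cur > pos if input consumed
            values.append(v)
        return True, values, cur
    return p

def alt(*parsers, backtracking_ok=False):
    def p(src, pos):
        for q in parsers:
            ok, v, newpos = q(src, pos)
            if ok:
                return True, v, newpos
            if newpos > pos and not backtracking_ok:
                # the branch consumed input before failing: rewinding
                # here is exactly the hidden backtracking being banned
                raise ValueError("would backtrack; mark backtracking_ok")
        return False, None, pos
    return p
```

Without the flag, a grammar whose alternatives share a prefix fails loudly at generation-free runtime instead of silently going quadratic.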
19:38:36 <ais523> b_jonas: I have actually been seriously considering the "add like 2**128 states" solution, and simply storing the parser table compressed in such a way that you can use the compressed format directly
19:39:11 <cpressey> zzo38: is sprunge.us a good pastebin?
19:39:16 <ais523> cpressey: hmm, that's interesting
19:39:38 <ais523> sprunge is pretty good for short-term pastes, as long as you're OK with pasting from the command line rather than using a web interface
19:39:47 <ais523> the pastes seem not to last forever, though
19:39:58 <b_jonas> the problem about parser generators for me is that, if I invent a language then I'll make one for which I understand the grammar enough that I know how to handwrite a parser for it, so parser generators are mostly useful for the ugly cases when you want to parse some existing language with lots of silly historical cruft, like SQL
19:40:00 <ais523> which can be a problem in some cases
19:40:29 <ais523> b_jonas: what about things like generating good error messages for every possible invalid input?
19:40:43 <ais523> I find that a pain to do when writing by hand, admittedly many parser generators aren't too good at it either
19:45:04 <cpressey> Good error messages contain line and column and source file name. Those can be a chore to track, too.
19:52:14 <ais523> yes, to the extent that it makes sense to have a utility class/structure/library that does file reading and tracks the line/column as it goes (and source file if you have more than one of them)
19:52:28 <ais523> (but most of the time I need a parser, I only have one source file anyway)
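The position-tracking utility ais523 describes can be sketched as a small reader wrapper; a Python illustration (class and method names are invented for the example, not taken from any library):

```python
class TrackingReader:
    """Wrap an input string, tracking line/column as characters are read.

    Error messages can then cite a precise source position via where().
    Line and column are 1-based; a newline resets the column.
    """
    def __init__(self, text, filename="<input>"):
        self.text = text
        self.filename = filename
        self.pos = 0
        self.line = 1
        self.col = 1

    def next(self):
        """Return the next character, or None at end of input."""
        if self.pos >= len(self.text):
            return None
        ch = self.text[self.pos]
        self.pos += 1
        if ch == '\n':
            self.line += 1
            self.col = 1
        else:
            self.col += 1
        return ch

    def where(self):
        """A file:line:col string suitable for an error message."""
        return f"{self.filename}:{self.line}:{self.col}"
```

Keeping the counters inside the reader means the parser proper never touches them, which is the point of factoring this out.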
19:53:14 <b_jonas> ais523: yes, perhaps a parser generator might help generate useful error messages, though I think only if the input for the parser generator has useful hints for that
19:54:20 <ais523> I think a lot can be done without hints, but maybe not everything
19:55:01 <b_jonas> I usually add good error messages lazily, as in my program is full of cases where it just gives up, but until I actually encounter such a give-up case or I expect to encounter it, the error message that it prints isn't informative
19:55:52 <b_jonas> if my program dies and I don't understand why, then I make it print useful info to that error message, and possibly print debug info earlier too
19:56:32 <b_jonas> of course this won't work well with programs that you can't rerun reproducibly, or with programs that other people have to run without me being available
20:02:31 <cpressey> In the olden days, people wanted to see as many syntax errors as possible in an error message, so they could fix them all before trying to compile again.
20:02:52 <cpressey> That's also what the semicolons were for, to help "error recovery".
20:03:17 <b_jonas> cpressey: semicolons are still useful for that
20:03:23 <cpressey> So that was another thing a parser generator could take off your hands.
20:03:40 <b_jonas> well, not specifically error recovery, but to get useful error messages if you forget a closing paren
20:03:51 <cpressey> But compiling is so much cheaper these days, these things matter so much less.
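The "error recovery" use of semicolons cpressey mentions is classic panic-mode recovery: on a parse error, skip ahead to the next synchronization token (typically the semicolon) and resume, so one compile reports every malformed statement rather than stopping at the first. A toy Python sketch (the one-statement grammar here is invented purely for illustration):

```python
def parse_statements(tokens):
    """Panic-mode error recovery on a toy grammar: NAME '=' NUM ';'.

    On a malformed statement, record an error and skip to just past the
    next ';' (the synchronization token), then keep parsing, so a single
    pass collects all the errors.
    """
    stmts, errors, i = [], [], 0
    while i < len(tokens):
        start = i
        try:
            name, eq, num, semi = tokens[i:i+4]   # ValueError if too few
            if not (name.isidentifier() and eq == '='
                    and num.isdigit() and semi == ';'):
                raise ValueError
            stmts.append((name, int(num)))
            i += 4
        except ValueError:
            errors.append(f"bad statement at token {start}")
            while i < len(tokens) and tokens[i] != ';':
                i += 1                            # resynchronize
            i += 1                                # step past the ';'
    return stmts, errors
```

With input `x = 1 ; y y 2 ; z = 3 ;` this yields the two good statements plus one error for the middle one, instead of aborting after `x = 1 ;`.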
20:05:06 <cpressey> I still use semicolons in JS but I feel like I get nasty looks for doing so
20:06:08 <ais523> wait, there are people who don't use semicolons in JS?
20:06:25 <ais523> is this some sort of micro-golf in order to save a few pennies on bandwidth?
20:07:18 <b_jonas> I expect there are people who don't use semicolons (most of the time) because JS has some quite complicated grammar rules for when you can omit semicolons, and they wouldn't have added those unless someone wanted to use them
20:10:05 <fizzie> It looks more modern without semicolons.
20:11:32 <fizzie> But I don't think *using* them is uncommon either. The style guide at work mandates them, and MDN says it's "considered best practice, however, to always write a semicolon after a statement, even when it is not strictly needed."
20:12:28 <b_jonas> fizzie: do you have an interpreter or linter that can warn you when you miss a semicolon?
20:22:23 -!- Koen has quit (Remote host closed the connection).
20:42:26 <FortyTwoBB> ais523: yeah, I'm not sure what the best way to connect with us would be.
20:44:45 <esolangs> [[QwertyScript]] M https://esolangs.org/w/index.php?diff=117929&oldid=105078 * PythonshellDebugwindow * (+93) Categories
20:58:02 <esolangs> [[Expressive]] M https://esolangs.org/w/index.php?diff=117930&oldid=116118 * PythonshellDebugwindow * (+24) Category
21:03:55 <esolangs> [[Trampolines]] https://esolangs.org/w/index.php?diff=117931&oldid=117918 * Aadenboy * (+1943) a
21:09:33 -!- Europe2048 has quit (Quit: Client closed).
21:17:23 <esolangs> [[User talk:Europe2048]] https://esolangs.org/w/index.php?diff=117932&oldid=116933 * PixelatedStarfish * (+97) /* Project Euler problem 10 implementation */
21:17:57 <esolangs> [[User talk:Europe2048]] https://esolangs.org/w/index.php?diff=117933&oldid=117932 * PixelatedStarfish * (+27)
21:24:23 -!- cpressey has quit (Quit: Client closed).
21:27:07 <zzo38> cpressey: Sprunge seems good enough to me. (Although, in the past I think once I tried to send a file with a few non-ASCII characters and they were deleted, so that might be a consideration)
21:29:29 <zzo38> (Also, both command-line and web interface work OK)
21:32:47 <fizzie> b_jonas: I don't remember if I've actually tried writing JS at work, but I would imagine there is something. Other languages definitely do.
21:41:58 <zzo38> Why does C have fmemopen and open_memstream but not a function to open a new in-memory file for reading and writing (without needing to specify a pointer to the buffer or a pointer to a variable to store the pointer to the buffer) which is automatically destroyed when the file is closed?
21:42:17 <zzo38> I think that the automatic semicolon insertion is one of the bad ideas of JavaScript.
21:43:49 <zzo38> (The fmemopen function allows opening a fixed-size in-memory file similar to the above, but not with dynamic sizing, which is what open_memstream does, which requires specifying the pointers and is meant only for writing and not for reading)
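For comparison only: the behaviour zzo38 is asking C for — an anonymous, dynamically sized, read/write in-memory stream whose buffer is released automatically on close — is what Python's io.BytesIO provides, which makes the desired semantics concrete (this is not a claim that any C library offers an equivalent):

```python
import io

# io.BytesIO behaves like the hypothetical C function described above:
# an anonymous in-memory file, readable and writable through one handle,
# growing as needed, with its buffer freed automatically when closed.
f = io.BytesIO()
f.write(b"hello ")
f.write(b"world")      # the buffer grows dynamically
f.seek(0)              # rewind: the same handle supports reading back
data = f.read()        # b"hello world"
f.close()              # buffer released; no caller-managed pointer
```

In standard C one can approximate this with tmpfile(), which is also anonymous, read/write, and destroyed on close, but it is a disk-backed temporary file rather than a purely in-memory one.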
22:37:35 -!- FreeFull has quit.
22:45:23 <esolangs> [[F!--]] M https://esolangs.org/w/index.php?diff=117934&oldid=117922 * None1 * (+59) /* Commands */ The output command in F!-- exists, but outputs an integer
22:52:21 <esolangs> [[F!--]] https://esolangs.org/w/index.php?diff=117935&oldid=117934 * None1 * (+111) Added an example (XKCD Random Number) to show that F!-- has integer output
22:53:15 -!- Thelie has quit (Remote host closed the connection).
22:57:02 <esolangs> [[F!--]] M https://esolangs.org/w/index.php?diff=117936&oldid=117935 * None1 * (+82) /* Interpreter */ The lisp interpreter is wrong, I changed it to a Python interpreter
23:19:01 <Noisytoot> I don't use unnecessary semicolons in JS
23:19:34 <esolangs> [[--yay]] M https://esolangs.org/w/index.php?diff=117937&oldid=82414 * PythonshellDebugwindow * (+8) Link, categories
23:39:21 -!- ais523 has quit (Remote host closed the connection).
23:40:34 -!- ais523 has joined.
23:55:05 -!- GreenHat has joined.