00:38:17 -!- Melvar`` has quit (Ping timeout: 256 seconds). 00:42:42 -!- Cale_ has changed nick to Cale. 00:51:23 -!- Melvar`` has joined. 00:54:13 -!- Phantom_Hoover has quit (Read error: Connection reset by peer). 01:16:48 -!- oerjan has joined. 01:29:16 -!- variable has joined. 01:53:41 Cale: What's your opinion on the higher-dimensional polynomial thing? 01:54:07 -!- augur has quit (Remote host closed the connection). 01:55:00 -!- augur has joined. 02:00:06 What thing? 02:01:11 Hmm, any thing. How would you define these things? 02:02:22 You can say, for instance, that a polynomial is something of the form f(x) = A + B(x) + C(x,x) + D(x,x,x) etc. 02:02:31 Where A,B,C,D are multilinear. 02:08:43 Of course you know you can define a polynomial ring in terms of an arbitrary other ring 02:09:24 But your generalisation goes a bit farther than that I suppose 02:09:28 Does that work here? 02:09:42 I... don't know what we're trying to do really :) 02:09:58 Fair enough. 02:10:07 But the usual way to define polynomials in multiple indeterminates is just to iterate the polynomial ring construction 02:10:16 Doesn't quite work for infinitely many 02:10:47 Even in finite-dimensional vector spaces I'd like some sort of answer. 02:11:06 Anyway, let's see. I was trying to figure out what the inverse of the Hessian matrix is. 02:11:19 Since people represent it as a matrix, but it "really" ought to be a quadratic form, right? 02:12:22 So say you have a generalized quadratic thing : R^n -> R 02:12:37 It'll be of the form f(x) = A + B(x) + C(x,x), right? 02:13:11 yeah, it really makes more sense as something that eats a couple of vectors 02:13:29 Should I require C to be symmetric here? 02:14:03 If you have a quadratic form like g(x,y) = ax^2 + bxy + cy^2, that seems to correspond to a symmetric tensor (/matrix). 02:15:08 Anyway, either way, the derivative of f is f'(x)(dx) = B(dx) + C(dx,x) + C(x,dx). Right? 02:16:20 -!- Soni has quit (Ping timeout: 276 seconds). 
02:16:24 Sounds plausible 02:16:39 (If C is symmetric then it's just f'(x)(dx) = B(dx) + 2C(x,dx), of course.) 02:17:05 dx here is a whole vector, not a basis covector or anything like that. 02:17:19 right 02:17:28 Anyway this is just like the "one-hole context" thing for type derivatives. 02:17:56 yeah 02:18:06 The derivative of a multilinear form is the linear map you get when you substitute one linear parameter in at each possible spot. 02:18:31 That seems pretty nice. 02:19:07 -!- Soni has joined. 02:20:00 Can you *define* derivatives this way somehow, as "varying one occurrence of x linearly"? Presumably it at least works for polynomials and power series, assuming I know what those are. 02:20:34 So I was trying to figure out how they work. 02:23:15 Anyway, back to the derivative of the Hessian, is there a general notion of the derivative of a tensor? I think there should be, at least in some cases. 02:24:55 f''(x)(dx)(dy) = C(dx,dy) + C(dy,dx) is a (0,2) tensor at each point. But you can probably represent it as a linear map : V -o (V -o F) and say that its inverse is a linear map : (V -o F) -o V. Which is a (2,0) tensor? 02:31:34 So yeah, exactly, the space of linear maps is just another vector space, so it's totally possible to talk about derivatives of functions whose codomain is that 02:31:52 And of course you can talk about the (2,0) tensor which numerically has the same values as the inverse of the Hessian, and when contracted with the second derivative you get the (1,1) identity tensor : V -o V 02:32:45 The derivative of a map V -> W is going to be a function V -> (V -o W), giving the best linear approximation to the original function at each point in V 02:33:00 and then that (V -o W) can be your W 02:33:24 Right, so you get f'' : V -> (V -o (V -o W)), or f'' : V -> (V⊗V -o W) 02:33:27 -!- variable has quit (Quit: /dev/null is full). 02:33:32 I know that much. 02:33:35 right 02:38:47 There's a really cute thing I found you can do at one point... 
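[The Hessian discussion above is easy to sanity-check numerically. A minimal numpy sketch, not from the log (the participants used mathematica); all values are illustrative. It checks f'(x)(dx) = B(dx) + 2C(x,dx) for symmetric C against a finite difference, and that contracting the inverse of the Hessian with the Hessian gives the identity.]

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# A quadratic map f : R^n -> R of the form f(x) = A + B(x) + C(x,x),
# with C symmetric (symmetrize a random matrix).
A = rng.standard_normal()
B = rng.standard_normal(n)
C0 = rng.standard_normal((n, n))
C = (C0 + C0.T) / 2

def f(x):
    return A + B @ x + x @ C @ x

# Derivative as a linear map: f'(x)(dx) = B(dx) + 2 C(x, dx) for symmetric C.
def df(x, dx):
    return B @ dx + 2 * (x @ C @ dx)

# Check against a central finite difference (exact for quadratics, up to rounding).
x = rng.standard_normal(n)
dx = rng.standard_normal(n)
h = 1e-6
fd = (f(x + h * dx) - f(x - h * dx)) / (2 * h)
assert abs(fd - df(x, dx)) < 1e-6

# The second derivative is the constant (0,2)-tensor 2C; numerically, its
# inverse is a (2,0)-tensor, and contracting the two gives the (1,1) identity.
H = 2 * C
H_inv = np.linalg.inv(H)
assert np.allclose(H_inv @ H, np.eye(n))
```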
02:38:50 -!- variable has joined. 02:40:08 (Just trying to remember it clearly enough to explain it properly... I also have some mathematica code... somewhere) 02:41:12 But basically, you can take a curve (or generally one of these polynomial surfaces), and find the best lower degree approximation to it at any point 02:41:59 which specialises to the tangent line in the case of finding a linear approximation (or the polar line in the case of conic sections and a point not on the curve) 02:45:00 How does that work? Is it something other than the truncated Taylor series? 02:45:22 It's more like plain application :D 02:45:57 You construct a symmetric tensor which results in the polynomial when applied repeatedly to the same vector 02:46:32 Do you have an example? 02:46:51 yeah... actually, do you have/use mathematica? 02:47:08 No. 02:47:24 I have a mathematica notebook which I might have to take a moment to decipher to remember what it was that I was actually doing 02:47:51 (this was over a decade ago now) 02:48:44 But actually, let's start with some context, because this is fun 02:50:50 Are you familiar with projective coordinates? 02:52:09 Not very. 02:53:20 Okay, so here's the basic idea (I'm just going to talk about plane geometry because everything's kinda nice and easy there to start out) 02:54:17 The projective plane has as its points the lines through the origin in R^3 02:54:32 and as its lines, the planes through the origin 02:55:04 We'll use the coordinates of any nonzero point on such a line through the origin as coordinates for the point in the projective plane 02:55:37 and we can think of the ordinary Euclidean R^2 as belonging to the projective plane by considering points with coordinates of the form (x,y,1) 02:56:44 (i.e. lines through the origin which aren't parallel to z = 0) 02:59:14 OK... 02:59:52 So, let's think about the equation for a line: in Euclidean geometry, i.e. where z = 1, it's just a x + b y - c = 0. 
03:00:09 or even a x + b y + c = 0, let's not mess around with signs 03:00:54 Now what we'll do is make this a polynomial which is homogeneously of degree 1 (i.e. every term has the same degree of 1) in x, y, z, by adding z's 03:01:12 a x + b y + c z = 0 03:01:39 which we could write as <(a,b,c), (x,y,z)> = 0 03:01:58 and this jives with what we said a projective line should be 03:02:22 a plane through the origin consists of all the vectors (x,y,z) whose inner product with some (a,b,c) is 0 03:03:32 We'll take the [a,b,c] (with square brackets because I'll usually write it as a row vector rather than a column) to be the coordinates of the line 03:03:49 and so it's easy to check if a point lies on our line, just take the dot product 03:04:11 Now, suppose we want to take two points with their projective coordinates and find the line through them 03:04:52 e.g. say our points are (x,y,z) and (x',y',z') 03:05:15 we want to find [a,b,c] such that [a,b,c](x,y,z) = 0 and [a,b,c](x',y',z') = 0 03:05:28 -!- tromp has joined. 03:05:31 and of course, this is well known, it's the cross product :) 03:06:06 similarly, if we have two lines and want their intersection, it's the same thing, cross product again (but everything is transposed) 03:06:36 okay, so that's phase 1 03:06:38 This sounds believable though I should probably work through the detail. 03:06:45 Now conic sections... 03:07:53 It turns out that you can represent conic sections really nicely 03:09:15 Given a symmetric matrix A, a conic section is just the set of points x for which x^t A x = 0 03:09:33 i.e. given a symmetric bilinear form B, it's just B(x,x) = 0 03:09:46 and the determinant tells you if the conic section is degenerate 03:10:09 -!- tromp has quit (Ping timeout: 256 seconds). 03:11:17 -!- xkapastel has quit (Quit: Connection closed for inactivity). 
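[The "phase 1" claims above — line through two points and intersection of two lines are both cross products in projective coordinates — can be checked directly. A small numpy sketch, illustrative and not part of the log:]

```python
import numpy as np

# Euclidean points embedded as (x, y, 1); a line [a, b, c] contains a point
# exactly when the dot product is zero.
p = np.array([0.0, 0.0, 1.0])   # the origin
q = np.array([1.0, 2.0, 1.0])   # the point (1, 2)

# Line through two points: the cross product.
line = np.cross(p, q)
assert abs(line @ p) < 1e-12 and abs(line @ q) < 1e-12

# Intersection of two lines: cross product again (roles transposed).
l1 = np.array([1.0, 0.0, -1.0])  # the line x = 1
l2 = np.array([0.0, 1.0, -2.0])  # the line y = 2
meet = np.cross(l1, l2)
meet = meet / meet[2]            # rescale back to the z = 1 plane
assert np.allclose(meet, [1.0, 2.0, 1.0])

# Parallel lines meet at a "point at infinity", i.e. z-coordinate 0.
l3 = np.array([1.0, 0.0, -3.0])  # the line x = 3
inf_pt = np.cross(l1, l3)
assert inf_pt[2] == 0
```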
03:11:19 Given a conic section with matrix A, we can do the following cute thing: we can map points to lines by (x,y,z) |--> [x,y,z] A 03:11:42 and we can map lines to points by [a,b,c] |--> A^-1 (a,b,c) 03:12:11 It turns out that this computes what's known as the pole and polar 03:12:16 https://en.wikipedia.org/wiki/Pole_and_polar 03:14:08 and, really, the important thing is that if the point happens to lie on the conic, then the line you get is the tangent line 03:14:39 of course, you sort of already mentioned this in a different way :) 03:15:18 You mean that the derivative of x |-> C(x,x) is x |-> (dx |-> 2C(x,dx))? 03:15:27 Or something that leads to that -- differentiating a symmetric bilinear map looks kind of like partially applying it 03:15:30 yeah 03:15:56 Or equivalently the derivative of f(x) = x^T C x is f'(x)(y) = x^T (C + C^T) y 03:16:02 Or 2C if C is symmetric. 03:16:10 especially so when you're then looking at zeroes of these things and so the 2 goes away 03:16:47 So I was thinking about how to maybe do all this again, but with more interesting polynomials 03:16:58 Not just conic sections 03:17:24 and it turns out that all this geometry of conic sections really does generalise 03:18:48 -!- Remavas has quit (Remote host closed the connection). 03:19:12 So, what we want to do is to take our arbitrary polynomial, say something like x^4 + x^3 y - x^2 y^2 + y^4 - 1 03:19:20 -!- Remavas has joined. 03:19:51 By the way, I was reading a paper that talked about eigenvectors and an eigenbasis of a symmetric positive definite matrix representing a quadratic form. I suppose there's some equivalent notion for quadratic forms represented with the "proper" variance? 03:20:44 and then we homogenize that with z to get... well, I picked a kind of boring one, we just get x^4 + x^3 y - x^2 y^2 + y^4 - z^4 here... 
in general we'd multiply each term by a large enough power of z to make the total degree the same in each term 03:21:16 -!- Remavas has quit (Read error: Connection reset by peer). 03:21:22 Right. 03:22:43 and then we need to construct a corresponding symmetric tensor by taking each term and basically splitting it up over all the ways of permuting the word 03:23:08 dividing by the number of permutations 03:23:17 and adding things up 03:23:18 This is https://en.wikipedia.org/wiki/Symmetric_tensor#Symmetric_part_of_a_tensor , right? 03:23:53 Kind of, except we're building an appropriate tensor from our polynomial 03:24:38 Right. 03:24:49 The goal is to get a symmetric multilinear form such that B(v,v,...,v) is our polynomial again 03:25:52 and then what you can do is just start to apply that thing, and then go back the other way to get lower-degree things 03:26:17 if you apply it n-1 times to a point that lies on the original curve, you get a tangent line, remarkably 03:26:47 Oh, this works because it's homogenized. 03:26:52 yeah 03:27:13 The thing I was doing with "general" polynomials didn't work, or had to consider each term separately. 03:27:24 and if you apply it n-2 times, you get a tangent... conic section 03:28:21 I was just playing around with this in mathematica and making plots... really I should prove some things about it and write a paper or something, but lazy. 03:28:37 Even too lazy to check if someone else did it properly ;) 03:28:57 This is like how you represent can affine maps as linear by adding one dimension. 03:29:04 can represent 03:29:24 ah, yeah, that's related to this projective stuff 03:29:45 (it is projective stuff) 03:29:49 But I'm kind of surprised that one extra dimension can work for any degree. 03:30:46 Actually I shouldn't be very surprised. 03:32:37 So these homogenized multilinear things are special in that by doing n applications you get the same thing as taking the nth derivative. 
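[The whole construction above — homogenize, symmetrize each monomial over its index permutations into a tensor T with T(v,...,v) equal to the polynomial, then apply T to a point on the curve n-1 times to get the tangent line — can be reproduced for a small example. A numpy sketch, assuming a simpler curve than the quartic in the log (the cubic y = x^3, homogenized as y z^2 - x^3 = 0); this is a reconstruction, not the original mathematica notebook:]

```python
import numpy as np
from itertools import permutations

# Homogenize y = x^3 as  y z^2 - x^3 = 0  in coordinates (x, y, z) = (0, 1, 2),
# then build the symmetric order-3 tensor T with T(v, v, v) equal to it:
# each monomial's coefficient is spread evenly over all index permutations.
n, deg = 3, 3
T = np.zeros((n,) * deg)
monomials = {(0, 0, 0): -1.0,   # -x^3
             (1, 2, 2):  1.0}   #  y z^2
for idx, coeff in monomials.items():
    perms = set(permutations(idx))
    for perm in perms:
        T[perm] += coeff / len(perms)

def apply_once(t, v):
    # Contract the last slot of the tensor with the vector v.
    return np.tensordot(t, v, axes=([t.ndim - 1], [0]))

# Applying T three times to the same vector recovers the polynomial,
# so a point on the curve gives zero.
a = 0.7
pt = np.array([a, a**3, 1.0])          # homogenized point on the curve
val = apply_once(apply_once(apply_once(T, pt), pt), pt)
assert abs(val) < 1e-12

# Applying n-1 = 2 times yields the coefficients of the tangent line.
line = apply_once(apply_once(T, pt), pt)
# Classical tangent to y = x^3 at x = a: y = 3 a^2 x - 2 a^3,
# i.e. [-3a^2, 1, 2a^3] up to scale.
expected = np.array([-3 * a**2, 1.0, 2 * a**3])
assert np.allclose(line / line[1], expected)
```

[For a degree-2 curve this is exactly the pole/polar map from earlier in the discussion.]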
03:55:22 -!- variable has quit (Quit: /dev/null is full). 04:00:19 -!- tromp has joined. 04:04:57 -!- tromp has quit (Ping timeout: 240 seconds). 05:22:39 -!- sleffy has quit (Ping timeout: 248 seconds). 05:47:57 -!- tromp has joined. 05:53:21 -!- tromp has quit (Ping timeout: 268 seconds). 06:42:19 -!- sleffy has joined. 07:02:15 -!- oerjan has quit (Quit: Nite). 07:10:23 -!- sleffy has quit (Ping timeout: 248 seconds). 07:34:26 -!- Melvar has joined. 07:37:03 -!- Melvar`` has quit (Ping timeout: 248 seconds). 08:11:51 -!- tromp has joined. 09:53:23 -!- tromp has quit (Remote host closed the connection). 09:54:09 -!- tromp has joined. 09:54:10 -!- tromp has quit (Remote host closed the connection). 09:54:24 -!- tromp has joined. 10:14:00 -!- augur has quit (Remote host closed the connection). 10:14:13 -!- augur has joined. 10:15:09 -!- Melvar has quit (Ping timeout: 256 seconds). 10:28:30 -!- Melvar has joined. 10:35:57 -!- laerling has joined. 10:37:55 -!- LKoen has joined. 11:01:51 -!- wob_jonas has joined. 11:02:02 Are you talking about conic sections in the projective 2-plane? 11:02:26 I learnt some interesting facts about them that are rarely taught. 11:02:45 Let me review your discussion 11:07:44 -!- Melvar has quit (Ping timeout: 256 seconds). 11:13:41 ah yes. symmetric tensor. sounds so much fancier than a quadratic form. 11:17:12 "symmetric tensor" can have a higher degree than quadratic. 11:17:36 shachaf: yeah, though I think the context was a quadratic one 11:17:59 Also it might not be a form, I guess. 11:18:50 Good to know. 'Britain will not be "plunged into a Mad Max-style world borrowed from dystopian fiction" after it leaves the EU, the Brexit secretary has said.' 11:18:56 `" 11:18:57 1015) Should I watch Watchmen or read Watchmen? [...] Please, who *watches* The Watchmen? \ 104) * Phantom_Hoover wonders where the size of the compiled Linux kernel comes from. To comply with the GFDL, there's a copy of Wikipedia in there. 
11:19:33 lol 11:19:46 104 is nice 11:20:43 * int-e recently became aware of the fact that the kernel patches itself with various workarounds on bootup. Patching as in, actually replace fragments of code with something else... I thought they were only adjusting addresses. 11:21:05 -!- Melvar has joined. 11:23:33 -!- augur has quit (Remote host closed the connection). 11:28:52 -!- laerling has quit (Quit: Leaving). 11:30:27 -!- laerling has joined. 11:36:25 -!- boily has joined. 11:38:51 -!- AnotherTest has joined. 11:49:52 fungot: nostril. 11:49:52 boily: compile it yourself? this isnt some kind of anonymous function, and just presented a big canvas on which to draw. 11:50:14 fungot: no, I won't compile any of your organs. 11:50:14 boily: it's because i'm copying and pasting out of ' em. 11:50:26 fungot: just how many nostrils do you have? 11:50:29 boily: w/ rt a screen shot no, not that 12:01:50 -!- augur has joined. 12:05:58 -!- AnotherTest has quit (Ping timeout: 256 seconds). 12:06:18 -!- augur has quit (Ping timeout: 240 seconds). 12:22:14 -!- boily has quit (Quit: ARMOURED CHICKEN). 12:22:29 `recipe 12:22:30 over the batter. Place on a lightly floured board in \ a bowl. Add the remaining ingredients in boiling water to combine. \ Cover and bake until the roasted to the boil. Stir in the cream \ of the liquid and spread with peanut oil in a small bowl. Pour into \ serving plate. Then, for a baking sheet and allow to cool. \ \ Preheat oven to 350. 12:22:35 `starwars 8 12:22:36 Moff Jerjerrod \ Captain Panaka \ Jyn Erso \ Jango Fett \ BB-9E \ Chirrut Îmwe \ Owen Lars \ Nute Gunray 12:23:06 hmm, what's that in centigrades? 12:23:10 `ftoc 350 12:23:10 350.00°F = 176.67°C 12:23:13 ok 12:25:24 onion's on 350 12:25:29 -!- Melvar has quit (Ping timeout: 256 seconds). 12:25:35 whuh 12:25:58 -!- Melvar has joined. 
12:29:08 ask boily hth 12:30:15 boily sounds like someone who knows a lot about high temperatures 12:30:24 In particular their effects on liquids 12:42:38 -!- augur has joined. 12:44:59 -!- Melvar has quit (Ping timeout: 276 seconds). 12:47:01 -!- augur has quit (Ping timeout: 256 seconds). 12:50:27 -!- augur has joined. 12:57:19 -!- Melvar has joined. 13:02:57 -!- Melvar has quit (Ping timeout: 260 seconds). 13:03:26 -!- Melvar has joined. 13:13:27 -!- LKoen has quit (Remote host closed the connection). 13:14:40 int-e: It's a low bar for what constitutes a successful brexit. 13:30:28 It's possible for brexit to be successful? 13:30:29 -!- skankyyoda has joined. 13:33:07 -!- skankyyoda has quit (Max SendQ exceeded). 13:33:45 -!- skankyyoda has joined. 13:36:48 -!- xkapastel has joined. 13:51:51 Taneb: Apparently, avoiding the apocalypse is sufficient. 13:52:35 fizzie: on the other hand, fewer cool car chases 13:55:35 are you in the uk? 13:55:41 Yeah 13:55:45 where? 13:55:49 Currently, Cambridge 13:55:53 me too 13:55:55 :O 13:56:02 i moved here in november 13:56:19 is anyone else in cambridge uk? 13:56:28 we should have a #esoteric meeting 13:56:30 I presume there's more people in the city than just us two 13:56:34 no 13:56:40 that's unreasonable 13:56:49 -!- skankyyoda has quit (Ping timeout: 248 seconds). 13:56:52 But in the intersection of the population of Cambridge and of #esoteric? 13:56:56 That's going to be smaller 13:57:17 One of my coworkers used to commute from Cambridge to London until not long ago, but now lives in London. 13:57:42 Taneb: you're studying or working? 13:57:46 izabera: working 13:57:55 at arm or microsoft? 13:57:58 No 13:58:02 where? 13:58:03 Small company called Myrtle 13:58:36 We do neural networks to FPGAs via Haskell 13:58:50 How about you? 13:58:50 Taneb: oh, what are they for? 13:59:16 wob_jonas: the neural networks? 
13:59:21 i work in a company called undo and we do fancy debuggers 13:59:41 Taneb: the neural networks, yes 13:59:46 izabera: reverse debuggers? 13:59:49 yes 14:00:13 wob_jonas: mostly automotive applications but we're expanding into other areas, I can't really talk about it much 14:00:28 Taneb: I see 14:00:29 *beep* #CONFIDENTIAL INFORMATION DETECTED 14:01:10 what is this page https://www.myrtlesoftware.com/no-access/ 14:01:31 izabera: somewhat inaccessible 14:02:10 izabera: I'd be up for an #esoteric meetup 14:02:17 yay :D 14:02:30 We might be able to convince fizzie to come up from London too 14:02:46 Do you use neural networks on time series data where you need some short-term persistent state? 14:03:08 That's a very specific question 14:03:25 Makes me think there's a reason for it other than curiosity 14:04:44 Taneb: yeah. I'm curious because some of the hard recognition stuff I did at my job had time series input, and that is one reason why it's so difficult 14:04:56 The answer is "not yet but soon" 14:04:58 you can never tell how much of the past input is significant 14:05:22 but automotive applications would probably also involve that 14:05:41 Everything's frustratingly early days yet 14:06:18 I don't really trust neural networks, or most other kinds of machine learning, I'm prejudiced against them 14:06:38 I didn't enjoy when co-workers tried to use them 14:06:54 although admittedly they do sometimes work, there are really good proofs that they are applicable in some situations 14:09:57 They feel like suspicious magic and alchemy to me 14:10:31 Like, they work well enough that there's something going on, but there's no real science for when they work, what sort of architecture you need, etc 14:12:35 Taneb: the reason I don't like them is because I feel they're overused as one of those magical bullets, used in situations when some more domain-specific solution would be much better 14:12:48 same reason why I hate multi-threading, GPU computations, and JIT 
14:13:50 eh, I feel at least multi-threading and GPU computations are a lot better understood than neural networks with regards to when they're applicable 14:16:45 Although whether they're sufficiently well-understood by the people who put them into practice is another matter 14:17:31 jit is the same as regular compiling 14:18:21 Taneb: that's true 14:18:56 I understand enough when they should be used that I can recognize they're often used in vain 14:19:17 not that I'm immune to premature optimizations in other ways, mind you, not even close 14:19:27 I just use other silver bullets 14:22:43 -!- AnotherTest has joined. 14:43:59 `? procrastination 14:44:00 The Procrastination is destined to rule the world... right after watching this last funny cat clip on youtube. 14:44:42 `slwd procrastination//sblastbfinalb 14:44:44 procrastination//The Procrastination is destined to rule the world... right after watching this final funny cat clip on youtube. 14:45:41 Taneb: I completely agree that neural networks are alchemy. 14:46:08 int-e: luckily for me, they're alchemy that some people are willing to pay a lot of money for 14:47:21 (http://neuralnetworksanddeeplearning.com/chap5.html and chap5.html helped me demystify "deep learning" a bit) 14:51:05 -!- `^_^v has joined. 14:53:37 There *is* progress. But the solution to two fundamental challenges here is boring. I) deeper layers in deep networks learn slowly (solution: throw more computing power at the problem, oh, and reduce the number of coefficients by duplicating neurons ("convolutional networks")), and II) they are prone to overfitting to a much higher degree than very badly (solution: regularize by a) penalizing... 14:53:43 ...big coefficients and b) rerandomizing neurons in networks during training, and probably lots of other stuff, something that people have already done for shallow networks). 
Which explains why people see the real advances in recurrent networks and things like LSTM (a decaying memory whose appropriate decay can be trained by gradient descent) and other small innovations in network architecture. 14:54:40 to a much higher degree than single hidden layer networks... sorry, my edit buffer was too short for this statement. 14:58:52 -!- AnotherTest has quit (Ping timeout: 265 seconds). 15:12:19 -!- moei has joined. 15:25:02 -!- LKoen has joined. 16:01:49 -!- sparr has quit (Quit: WeeChat 1.9.1). 16:03:38 -!- Melvar` has joined. 16:05:19 -!- Melvar has quit (Ping timeout: 248 seconds). 16:05:58 -!- sleffy has joined. 16:22:46 -!- erkin has joined. 16:28:00 -!- contrapumpkin has joined. 16:35:03 -!- contrapumpkin has quit (Quit: My MacBook Pro has gone to sleep. ZZZzzz…). 16:54:25 -!- Melvar` has quit (Ping timeout: 268 seconds). 17:07:25 -!- Melvar` has joined. 17:16:33 -!- AnotherTest has joined. 17:18:29 -!- mroman has joined. 17:18:47 is there a free digital circuit simulator for windows? 17:19:12 um... probably? 17:19:34 I mean, there's got to be. That sounds like something someone would make if it doesn't exist yet. 17:20:24 maybe 17:20:26 but one that doesn't suck 17:20:33 it's the suck part that not everybody can make 17:20:43 then probably no 17:20:51 there's logisim 17:20:53 but you know 17:20:58 I wanna do large scale stuff 17:37:24 <\oren\> mroman: buy a fpga? 17:37:50 \oren\: those are harder to debug than a software simulation 17:38:01 <\oren\> i guess so 17:38:07 lel 17:38:08 I mean, I'm prejudiced, I'm a software guy 17:38:12 logisim confuses east with west 17:38:35 I believe the main advantage of hardware over software is that you can kick hardware when you're angry at it not working 17:39:19 and north with south 17:39:24 for some reason everything is flipped 17:39:24 wth. 
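[int-e's two regularization fixes above — (a) penalizing big coefficients, and (b) "rerandomizing neurons during training", read here as dropout — reduce to a few lines. A toy numpy sketch with illustrative constants, not any particular framework's API:]

```python
import numpy as np

rng = np.random.default_rng(0)

# (a) Penalizing big coefficients ("weight decay"): the L2 penalty
# (lam/2)*||W||^2 adds lam*W to the gradient, shrinking weights each step.
W = rng.standard_normal((8, 4))
lam, lr = 1e-2, 1e-1
data_grad = np.zeros_like(W)          # pretend the data loss is flat here
W_next = W - lr * (data_grad + lam * W)
assert np.linalg.norm(W_next) < np.linalg.norm(W)

# (b) Dropout: randomly zero activations during training and rescale the
# survivors so the expected activation is unchanged.
keep = 0.8
acts = np.ones(10_000)
mask = rng.random(acts.shape) < keep
dropped = np.where(mask, acts / keep, 0.0)
assert abs(dropped.mean() - 1.0) < 0.05
```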
17:39:32 <\oren\> wob_jonas: a reliable way to disable a hard disk is to dd /dev/zero to it and then hit it several times 17:39:53 <\oren\> with the blunt end of a screwdriver 17:40:23 ffpga would be cool. 17:41:33 modelsim costs 1.5k or something 17:41:38 too expensive. 17:43:15 <\oren\> the only hardware I've ever hit and had it work better than before is the screen on my old old laptop 17:44:03 <\oren\> which required taps to the bottom centre to keep it working 17:47:26 industrial strength shredder will do the trick too 17:47:34 wipe it, shred it 17:50:29 * int-e is so tempted to suggest the KONCTPYKTOP as a circuit simulator 17:51:48 logisim doesn't even have toggle buttons as input 17:51:53 now I gotta write my own toggle button 17:52:43 oh wait. pins are toggleable 17:52:44 alright. 17:55:03 mroman: the last four weeks had 4, 3, 80 and 4 accesses to the Burlesque shell. 17:56:54 hm. but how do I combine 8 wires into one wire of width 8 17:57:01 seems logisim can only do the splitting part 17:57:04 not the combining part. 17:57:45 use a logic or of wires? 18:04:06 ah, stupid me. 18:05:09 mroman: use a splitter backwards? 18:05:40 reverse its polarity 18:07:01 http://www.cburch.com/logisim/docs/2.7/en/html/guide/bundles/splitting.html suggests that this is possible indeed 18:08:42 (in fact the second example mixes directions on a single wire, IIUIC) 18:09:33 -!- Phantom_Hoover has joined. 18:10:43 oh 18:10:47 so a splitter can be used for both? 18:11:57 this channel would probably call it a splerger. 18:13:33 [[Numberwang/Implementations]] https://esolangs.org/w/index.php?diff=54156&oldid=53777 * Unt * (+71) Infinite recursion crash countermeasures. 18:27:47 alright. 18:27:52 I guess I can do a BF CPU with this. 18:31:11 I kinda like hardware stuff. 18:33:17 NOOOOO! 18:33:19 why BF? 18:33:25 seriously? 18:33:25 what's with BF? 18:35:03 -!- LKoen has quit (Quit: “It’s only logical. First you learn to talk, then you learn to think. 
Too bad it’s not the other way round.”). 18:38:31 https://i.imgur.com/OarOcTl.png 18:38:32 :D 18:39:09 It can count upwards a location in memory :D 18:58:00 hm. 18:58:03 but timing's a bitch. 19:04:31 <\oren\> stupid postgresql 19:04:47 <\oren\> why can't I transpose dimensions 19:06:36 hu 19:06:53 IN THE SUMMER TIME 19:11:22 -!- LKoen has joined. 19:19:51 <\oren\> I have data in the form 19:20:16 <\oren\> name_of_foo, name_of_bar, 4.56 19:20:20 <\oren\> ... 19:20:39 <\oren\> and I want it to be a table with a column for each bar and a row for each foo 19:21:28 \oren\: because it would break the type safety of your data model :-) 19:23:59 * \oren\ wonders if postgres has an EVAL function he can pass a procedurally generated string to 19:33:25 to... 19:34:44 [[Hieroglyphics]] https://esolangs.org/w/index.php?diff=54157&oldid=54145 * Plokmijnuhby * (+2233) 19:35:33 [[Hieroglyphics]] https://esolangs.org/w/index.php?diff=54158&oldid=54157 * Plokmijnuhby * (+17) 19:35:45 I was planning to do a Befunge coprocessor for a computer hardware architecture course (which I think was about doing a MIPSy thing in VHDL, with a coprocessor as an extra-points objective), but I think I had to drop that course. 19:47:12 a what? 19:47:18 a befunge coprocessor? 19:47:34 I guess that might make slightly more sense than all that brainfuck thing 19:52:40 -!- sleffy has quit (Remote host closed the connection). 19:55:23 -!- sleffy has joined. 20:03:05 -!- augur has quit (Remote host closed the connection). 20:03:32 -!- augur has joined. 20:08:01 -!- augur has quit (Ping timeout: 248 seconds). 20:21:46 [[Hieroglyphics]] M https://esolangs.org/w/index.php?diff=54159&oldid=54158 * Plokmijnuhby * (+270) 20:26:56 hcrh.pchr.rp.h 20:26:58 how dare you 20:28:02 I'll create a CPU for 2D shit! 20:33:37 like game of life? 20:37:06 -!- S1R has joined. 20:40:19 I wish machine learning people would stop using the word "neuron" and all brain analogies. 20:40:25 They are so bad. 
21:04:42 -!- `^_^v has quit (Quit: This computer has gone to sleep). 21:10:11 -!- augur has joined. 21:10:23 -!- augur has quit (Remote host closed the connection). 21:14:38 [[Bitter]] https://esolangs.org/w/index.php?diff=54160&oldid=54127 * DMC * (-23) /* Description */ 21:16:33 -!- izabera has quit (Ping timeout: 264 seconds). 21:28:03 -!- `^_^v has joined. 21:46:29 Cale: This homogeneous polynomial thing is pretty neat. 21:54:57 -!- Melvar` has quit (Ping timeout: 240 seconds). 21:58:59 -!- mroman has quit (Ping timeout: 260 seconds). 22:08:20 -!- Remavas has joined. 22:08:35 -!- Melvar` has joined. 22:13:07 -!- moony has left. 22:13:17 -!- erkin has quit (Quit: Ouch! Got SIGIRL, dying...). 22:26:46 -!- izabera has joined. 22:26:51 meow 22:28:12 -!- laerling has quit (Quit: Leaving). 22:37:14 -!- SIR has joined. 22:41:02 -!- S1R has quit (Ping timeout: 276 seconds). 22:41:53 Cale: So can you represent power series this way too? 22:45:02 -!- Remavas has quit (Quit: Leaving). 22:45:48 -!- LKoen has quit (Remote host closed the connection). 23:04:07 -!- SIR has quit (Read error: Connection reset by peer). 23:04:31 -!- SIR has joined. 23:06:44 -!- `^_^v has quit (Quit: This computer has gone to sleep). 23:07:35 -!- AnotherTest has quit (Ping timeout: 276 seconds). 23:14:58 -!- moei has quit (Quit: Leaving...). 23:30:31 <\oren\> holy shitballs 23:30:38 <\oren\> "Thanks, but in the future please just provide the information about how to reproduce a problem, not a suggested fix. I don't read suggested fixes," 23:32:59 "Patches not welcome." 23:33:26 Pooches welcome. 23:35:48 fungot: Do you welcome patches? 23:35:48 fizzie: which one? i can't 23:35:58 Guess that's a no 23:37:12 fungot needs no patches 23:37:13 shachaf: how do you implement the macro ellipse? in r5rs, but many people arn't. at least as well as 23:37:37 Those poor people that arn't in r5rs. 
23:38:49 fungot is in r98rf, not r5rs 23:38:49 shachaf: in bristol we'd say " fnord fnord 23:43:43 -!- augur has joined. 23:48:19 -!- augur has quit (Ping timeout: 256 seconds). 23:54:03 -!- wob_jonas has quit (Quit: http://www.kiwiirc.com/ - A hand crafted IRC client).