00:38:17 -!- Melvar`` has quit (Ping timeout: 256 seconds).
00:42:42 -!- Cale_ has changed nick to Cale.
00:51:23 -!- Melvar`` has joined.
00:54:13 -!- Phantom_Hoover has quit (Read error: Connection reset by peer).
01:16:48 -!- oerjan has joined.
01:29:16 -!- variable has joined.
01:53:41 <shachaf> Cale: What's your opinion on the higher-dimensional polynomial thing?
01:54:07 -!- augur has quit (Remote host closed the connection).
01:55:00 -!- augur has joined.
02:01:11 <shachaf> Hmm, any thing. How would you define these things?
02:02:22 <shachaf> You can say, for instance, that a polynomial is something of the form f(x) = A + B(x) + C(x,x) + D(x,x,x) etc.
02:02:31 <shachaf> Where A,B,C,D are multilinear.
02:08:43 <Cale> Of course you know you can define a polynomial ring in terms of an arbitrary other ring
02:09:24 <Cale> But your generalisation goes a bit farther than that I suppose
02:09:42 <Cale> I... don't know what we're trying to do really :)
02:10:07 <Cale> But the usual way to define polynomials in multiple indeterminates is just to iterate the polynomial ring construction
02:10:16 <Cale> Doesn't quite work for infinitely many
02:10:47 <shachaf> Even in finite-dimensional vector spaces I'd like some sort of answer.
02:11:06 <shachaf> Anyway, let's see. I was trying to figure out what the inverse of the Hessian matrix is.
02:11:19 <shachaf> Since people represent it as a matrix, but it "really" ought to be a quadratic form, right?
02:12:22 <shachaf> So say you have a generalized quadratic thing : R^n -> R
02:12:37 <shachaf> It'll be of the form f(x) = A + B(x) + C(x,x), right?
02:13:11 <Cale> yeah, it really makes more sense as something that eats a couple of vectors
02:13:29 <shachaf> Should I require C to be symmetric here?
02:14:03 <shachaf> If you have a quadratic form like g(x,y) = ax^2 + bxy + cy^2, that seems to correspond to a symmetric tensor (/matrix).
02:15:08 <shachaf> Anyway, either way, the derivative of f is f'(x)(dx) = B(dx) + C(dx,x) + C(x,dx). Right?
02:16:20 -!- Soni has quit (Ping timeout: 276 seconds).
02:16:24 <Cale> Sounds plausible
02:16:39 <shachaf> (If C is symmetric then it's just f'(x)(dx) = B(dx) + 2C(x,dx), of course.)
02:17:05 <shachaf> dx here is a whole vector, not a basis covector or anything like that.
02:17:28 <shachaf> Anyway this is just like the "one-hole context" thing for type derivatives.
02:18:06 <shachaf> The derivative of a multilinear form is the linear map you get when you substitute one linear parameter in at each possible spot.
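The derivative formula being discussed can be sanity-checked numerically; a minimal Python sketch (the concrete A, B, C values and all names here are illustrative, not from the discussion):

```python
# Sanity check of the formula above: for f(x) = A + B(x) + C(x, x) with C
# symmetric, the derivative is f'(x)(dx) = B(dx) + 2*C(x, dx).

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

A = 3.0
Bvec = [1.0, -2.0]                  # B(x) = <Bvec, x>
Cmat = [[2.0, 1.0], [1.0, 4.0]]     # symmetric; C(u, v) = u^T Cmat v

def B(x): return dot(Bvec, x)
def C(u, v): return sum(u[i] * Cmat[i][j] * v[j] for i in range(2) for j in range(2))
def f(x): return A + B(x) + C(x, x)
def fprime(x, dx): return B(dx) + 2 * C(x, dx)

# Compare against a finite difference along the direction dx.
x, dx, h = [1.0, 2.0], [0.5, -1.0], 1e-6
numeric = (f([x[0] + h * dx[0], x[1] + h * dx[1]]) - f(x)) / h
assert abs(numeric - fprime(x, dx)) < 1e-4
```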
02:19:07 -!- Soni has joined.
02:20:00 <shachaf> Can you *define* derivatives this way somehow, as "varying one occurrence of x linearly"? Presumably it at least works for polynomials and power series, assuming I know what those are.
02:20:34 <shachaf> So I was tryinng to figure out how they work.
02:23:15 <shachaf> Anyway, back to the derivative of the Hessian, is there a general notion of the derivative of a tensor? I think there should be, at least in some cases.
02:24:55 <shachaf> f''(x)(dx)(dy) = C(dx,dy) + C(dy,dx) is a (0,2) tensor at each point. But you can probably represent it as a linear map : V -o (V -o F) and say that its inverse is a linear map : (V -o F) -o V. Which is a (2,0) tensor?
02:31:34 <Cale> So yeah, exactly, the space of linear maps is just another vector space, so it's totally possible to talk about derivatives of functions whose codomain is that
02:31:52 <shachaf> And of course you can talk about the (2,0) tensor which numerically has the same values as the inverse of the Hessian, and when contracted with the second derivative you get the (1,1) identity tensor : V -o V
02:32:45 <Cale> The derivative of a map V -> W is going to be a function V -> (V -o W), giving the best linear approximation to the original function at each point in V
02:33:00 <Cale> and then that (V -o W) can be your W
02:33:24 <shachaf> Right, so you get f'' : V -> (V -o (V -o W)), or f'' : V -> (V⊗V -o W)
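The contraction shachaf describes (inverse Hessian contracted with the Hessian giving the (1,1) identity) can be checked numerically; a 2x2 sketch with an arbitrary invertible symmetric matrix:

```python
# Numeric version of the contraction above: the inverse-Hessian (2,0) tensor,
# contracted with the Hessian (0,2) tensor, gives the identity map V -o V.
# The matrix entries here are arbitrary.

H = [[2.0, 1.0], [1.0, 3.0]]                      # Hessian, symmetric
det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
Hinv = [[ H[1][1] / det, -H[0][1] / det],         # its inverse
        [-H[1][0] / det,  H[0][0] / det]]

prod = [[sum(Hinv[i][k] * H[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]
for i in range(2):
    for j in range(2):
        assert abs(prod[i][j] - (1.0 if i == j else 0.0)) < 1e-12
```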
02:33:27 -!- variable has quit (Quit: /dev/null is full).
02:38:47 <Cale> There's a really cute thing I found you can do at one point...
02:38:50 -!- variable has joined.
02:40:08 <Cale> (Just trying to remember it clearly enough to explain it properly... I also have some mathematica code... somewhere)
02:41:12 <Cale> But basically, you can take a curve (or generally one of these polynomial surfaces), and find the best lower degree approximation to it at any point
02:41:59 <Cale> which specialises to the tangent line in the case of finding a linear approximation (or the polar line in the case of conic sections and a point not on the curve)
02:45:00 <shachaf> How does that work? Is it something other than the truncated Taylor series?
02:45:22 <Cale> It's more like plain application :D
02:45:57 <Cale> You construct a symmetric tensor which results in the polynomial when applied repeatedly to the same vector
02:46:51 <Cale> yeah... actually, do you have/use mathematica?
02:47:24 <Cale> I have a mathematica notebook which I might have to take a moment to decipher to remember what it was that I was actually doing
02:47:51 <Cale> (this was over a decade ago now)
02:48:44 <Cale> But actually, let's start with some context, because this is fun
02:50:50 <Cale> Are you familiar with projective coordinates?
02:53:20 <Cale> Okay, so here's the basic idea (I'm just going to talk about plane geometry because everything's kinda nice and easy there to start out)
02:54:17 <Cale> The projective plane has as its points the lines through the origin in R^3
02:54:32 <Cale> and as its lines, the planes through the origin
02:55:04 <Cale> We'll use the coordinates of any nonzero point on such a line through the origin as coordinates for the point in the projective plane
02:55:37 <Cale> and we can think of the ordinary Euclidean R^2 as belonging to the projective plane by considering points with coordinates of the form (x,y,1)
02:56:44 <Cale> (i.e. lines through the origin which aren't parallel to z = 0)
02:59:52 <Cale> So, let's think about the equation for a line: in Euclidean geometry, i.e. where z = 1, it's just a x + b y - c = 0.
03:00:09 <Cale> or even a x + b y + c = 0, let's not mess around with signs
03:00:54 <Cale> Now what we'll do is make this a polynomial which is homogeneously of degree 1 (i.e. every term has the same degree of 1) in x, y, z, by adding z's
03:01:12 <Cale> a x + b y + c z = 0
03:01:39 <Cale> which we could write as <(a,b,c), (x,y,z)> = 0
03:01:58 <Cale> and this jibes with what we said a projective line should be
03:02:22 <Cale> a plane through the origin consists of all the vectors (x,y,z) whose inner product with some (a,b,c) is 0
03:03:32 <Cale> We'll take the [a,b,c] (with square brackets because I'll usually write it as a row vector rather than a column) to be the coordinates of the line
03:03:49 <Cale> and so it's easy to check if a point lies on our line, just take the dot product
03:04:11 <Cale> Now, suppose we want to take two points with their projective coordinates and find the line through them
03:04:52 <Cale> e.g. say our points are (x,y,z) and (x',y',z')
03:05:15 <Cale> we want to find [a,b,c] such that [a,b,c](x,y,z) = 0 and [a,b,c](x',y',z') = 0
03:05:28 -!- tromp has joined.
03:05:31 <Cale> and of course, this is well known, it's the cross product :)
03:06:06 <Cale> similarly, if we have two lines and want their intersection, it's the same thing, cross product again (but everything is transposed)
03:06:36 <Cale> okay, so that's phase 1
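The join-and-meet-via-cross-product facts above can be sketched directly; the specific points and lines in this Python example are my own:

```python
# In projective coordinates, the line through two points -- and dually the
# intersection point of two lines -- is the cross product, as described above.

def cross(u, v):
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

p, q = [0, 0, 1], [1, 1, 1]    # Euclidean points (0,0) and (1,1), embedded at z = 1
line = cross(p, q)             # the line through them: [-1, 1, 0], i.e. y = x
assert dot(line, p) == 0 and dot(line, q) == 0

l1, l2 = [1, 0, 0], [0, 1, -1]  # the lines x = 0 and y = 1
pt = cross(l1, l2)              # their intersection as a projective point: (0, 1)
assert dot(l1, pt) == 0 and dot(l2, pt) == 0
```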
03:06:38 <shachaf> This sounds believable though I should probably work through the detail.
03:06:45 <Cale> Now conic sections...
03:07:53 <Cale> It turns out that you can represent conic sections really nicely
03:09:15 <Cale> Given a symmetric matrix A, a conic section is just the set of points x for which x^t A x = 0
03:09:33 <Cale> i.e. given a symmetric bilinear form B, it's just B(x,x) = 0
03:09:46 <Cale> and the determinant tells you if the conic section is degenerate
03:10:09 -!- tromp has quit (Ping timeout: 256 seconds).
03:11:17 -!- xkapastel has quit (Quit: Connection closed for inactivity).
03:11:19 <Cale> Given a conic section with matrix A, we can do the following cute thing: we can map points to lines by (x,y,z) |--> [x,y,z] A
03:11:42 <Cale> and we can map lines to points by [a,b,c] |--> A^-1 (a,b,c)
03:12:11 <Cale> It turns out that this computes what's known as the pole and polar
03:12:16 <Cale> https://en.wikipedia.org/wiki/Pole_and_polar
03:14:08 <Cale> and, really, the important thing is that if the point happens to lie on the conic, then the line you get is the tangent line
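A concrete instance of the pole/polar map just described, taking the unit circle as the conic (the example point is mine):

```python
# For a conic x^T A x = 0, the map p |-> p^T A sends a point on the conic to
# its tangent line. Here A encodes the unit circle x^2 + y^2 - z^2 = 0.

A = [[1, 0, 0],
     [0, 1, 0],
     [0, 0, -1]]

def mat_vec(M, v):
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

p = [3, 4, 5]                       # the point (3/5, 4/5) on the unit circle
assert dot(p, mat_vec(A, p)) == 0   # p lies on the conic
polar = mat_vec(A, p)               # [3, 4, -5]: the tangent line 3x + 4y - 5z = 0
assert dot(polar, p) == 0           # and the tangent passes through p
```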
03:14:39 <Cale> of course, you sort of already mentioned this in a different way :)
03:15:18 <shachaf> You mean that the derivative of x |-> C(x,x) is x |-> (dx |-> 2C(x,dx))?
03:15:27 <Cale> Or something that leads to that -- differentiating a symmetric bilinear map looks kind of like partially applying it
03:15:56 <shachaf> Or equivalently the derivative of f(x) = x^T C x is f'(x)(y) = x^T (C + C^T) y
03:16:10 <Cale> especially so when you're then looking at zeroes of these things and so the 2 goes away
03:16:47 <Cale> So I was thinking about how to maybe do all this again, but with more interesting polynomials
03:16:58 <Cale> Not just conic sections
03:17:24 <Cale> and it turns out that all this geometry of conic sections really does generalise
03:18:48 -!- Remavas has quit (Remote host closed the connection).
03:19:12 <Cale> So, what we want to do is to take our arbitrary polynomial, say something like x^4 + x^3 y - x^2 y^2 + y^4 - 1
03:19:20 -!- Remavas has joined.
03:19:51 <shachaf> By the way, I was reading a paper that talked about eigenvectors and an eigenbasis of a symmetric positive definite matrix representing a quadratic form. I suppose there's some equivalent notion for quadratic forms represented with the "proper" variance?
03:20:44 <Cale> and then we homogenize that with z to get... well, I picked a kind of boring one, we just get x^4 + x^3 y - x^2 y^2 + y^4 - z^4 here... in general we'd multiply each term by a large enough power of z to make the total degree the same in each term
03:21:16 -!- Remavas has quit (Read error: Connection reset by peer).
03:22:43 <Cale> and then we need to construct a corresponding symmetric tensor by taking each term and basically splitting it up over all the ways of permuting the word
03:23:08 <Cale> dividing by the number of permutations
03:23:17 <Cale> and adding things up
03:23:18 <shachaf> This is https://en.wikipedia.org/wiki/Symmetric_tensor#Symmetric_part_of_a_tensor , right?
03:23:53 <Cale> Kind of, except we're building an appropriate tensor from our polynomial
03:24:49 <Cale> The goal is to get a symmetric multilinear form such that B(v,v,...,v) is our polynomial again
03:25:52 <Cale> and then what you can do is just start to apply that thing, and then go back the other way to get lower-degree things
03:26:17 <Cale> if you apply it n-1 times to a point that lies on the original curve, you get a tangent line, remarkably
03:26:47 <shachaf> Oh, this works because it's homogenized.
03:27:13 <shachaf> The thing I was doing with "general" polynomials didn't work, or had to consider each term separately.
03:27:24 <Cale> and if you apply it n-2 times, you get a tangent... conic section
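The construction Cale outlines (homogenize, symmetrize into a tensor, apply it repeatedly at a point on the curve) can be sketched end to end. The dict-based representation and all names below are mine, and the diagonal cubic is chosen for brevity; nothing here is Cale's actual Mathematica code:

```python
from itertools import permutations

def symmetric_tensor(poly):
    """Spread each monomial's coefficient evenly over all orderings of its index word."""
    T = {}
    for exps, c in poly.items():
        # e.g. exponents (2, 1, 0) for x^2 y become the index word (0, 0, 1)
        word = tuple(i for i, e in enumerate(exps) for _ in range(e))
        orderings = set(permutations(word))
        for w in orderings:
            T[w] = T.get(w, 0) + c / len(orderings)
    return T

def apply_once(T, v):
    """Contract one slot of the tensor with the vector v, lowering the degree by one."""
    out = {}
    for idx, c in T.items():
        out[idx[:-1]] = out.get(idx[:-1], 0) + c * v[idx[-1]]
    return out

# Homogenized cubic x^3 + y^3 - z^3 = 0, i.e. x^3 + y^3 = 1 in the chart z = 1.
poly = {(3, 0, 0): 1, (0, 3, 0): 1, (0, 0, 3): -1}
T = symmetric_tensor(poly)

v = [0.0, 1.0, 1.0]                      # the point (0, 1) lies on x^3 + y^3 = 1
line = apply_once(apply_once(T, v), v)   # degree 3, applied n-1 = 2 times: a line
coeffs = [line.get((i,), 0) for i in range(3)]
assert coeffs == [0.0, 1.0, -1.0]        # y - z = 0: the horizontal tangent y = 1
```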
03:28:21 <Cale> I was just playing around with this in mathematica and making plots... really I should prove some things about it and write a paper or something, but lazy.
03:28:37 <Cale> Even too lazy to check if someone else did it properly ;)
03:28:57 <shachaf> This is like how you can represent affine maps as linear by adding one dimension.
03:29:24 <Cale> ah, yeah, that's related to this projective stuff
03:29:45 <Cale> (it is projective stuff)
03:29:49 <shachaf> But I'm kind of surprised that one extra dimension can work for any degree.
03:30:46 <shachaf> Actually I shouldn't be very surprised.
03:32:37 <shachaf> So these homogenized multilinear things are special in that by doing n applications you get the same thing as taking the nth derivative.
03:55:22 -!- variable has quit (Quit: /dev/null is full).
04:00:19 -!- tromp has joined.
04:04:57 -!- tromp has quit (Ping timeout: 240 seconds).
05:22:39 -!- sleffy has quit (Ping timeout: 248 seconds).
05:47:57 -!- tromp has joined.
05:53:21 -!- tromp has quit (Ping timeout: 268 seconds).
06:42:19 -!- sleffy has joined.
07:02:15 -!- oerjan has quit (Quit: Nite).
07:10:23 -!- sleffy has quit (Ping timeout: 248 seconds).
07:34:26 -!- Melvar has joined.
07:37:03 -!- Melvar`` has quit (Ping timeout: 248 seconds).
08:11:51 -!- tromp has joined.
09:53:23 -!- tromp has quit (Remote host closed the connection).
09:54:09 -!- tromp has joined.
09:54:10 -!- tromp has quit (Remote host closed the connection).
09:54:24 -!- tromp has joined.
10:14:00 -!- augur has quit (Remote host closed the connection).
10:14:13 -!- augur has joined.
10:15:09 -!- Melvar has quit (Ping timeout: 256 seconds).
10:28:30 -!- Melvar has joined.
10:35:57 -!- laerling has joined.
10:37:55 -!- LKoen has joined.
11:01:51 -!- wob_jonas has joined.
11:02:02 <wob_jonas> Are you talking about conic sections in the projective 2-plane?
11:02:26 <wob_jonas> I learnt some interesting facts about them that are rarely taught.
11:07:44 -!- Melvar has quit (Ping timeout: 256 seconds).
11:13:41 <wob_jonas> ah yes. symmetric tensor. sounds so much fancier than a quadratic form.
11:17:12 <shachaf> "symmetric tensor" can have a higher degree than quadratic.
11:17:36 <wob_jonas> shachaf: yeah, though I think the context was a quadratic one
11:17:59 <shachaf> Also it might not be a form, I guess.
11:18:50 <int-e> Good to know. 'Britain will not be "plunged into a Mad Max-style world borrowed from dystopian fiction" after it leaves the EU, the Brexit secretary has said.'
11:18:57 <HackEgo> 1015) <Sgeo> Should I watch Watchmen or read Watchmen? [...] <NihilistDandy> Please, who *watches* The Watchmen? \ 104) * Phantom_Hoover wonders where the size of the compiled Linux kernel comes from. <cpressey> To comply with the GFDL, there's a copy of Wikipedia in there.
11:20:43 * int-e recently became aware of the fact that the kernel patches itself with various workarounds on bootup. Patching as in, actually replacing fragments of code with something else... I thought they were only adjusting addresses.
11:21:05 -!- Melvar has joined.
11:23:33 -!- augur has quit (Remote host closed the connection).
11:28:52 -!- laerling has quit (Quit: Leaving).
11:30:27 -!- laerling has joined.
11:36:25 -!- boily has joined.
11:38:51 -!- AnotherTest has joined.
11:49:52 <fungot> boily: compile it yourself? this isnt some kind of anonymous function, and just presented a big canvas on which to draw.
11:50:14 <boily> fungot: no, I won't compile any of your organs.
11:50:14 <fungot> boily: it's because i'm copying and pasting out of ' em.
11:50:26 <boily> fungot: just how many nostrils do you have?
11:50:29 <fungot> boily: w/ rt a screen shot no, not that
12:01:50 -!- augur has joined.
12:05:58 -!- AnotherTest has quit (Ping timeout: 256 seconds).
12:06:18 -!- augur has quit (Ping timeout: 240 seconds).
12:22:14 -!- boily has quit (Quit: ARMOURED CHICKEN).
12:22:30 <HackEgo> over the batter. Place on a lightly floured board in \ a bowl. Add the remaining ingredients in boiling water to combine. \ Cover and bake until the roasted to the boil. Stir in the cream \ of the liquid and spread with peanut oil in a small bowl. Pour into \ serving plate. Then, for a baking sheet and allow to cool. \ \ Preheat oven to 350.
12:22:36 <HackEgo> Moff Jerjerrod \ Captain Panaka \ Jyn Erso \ Jango Fett \ BB-9E \ Chirrut Îmwe \ Owen Lars \ Nute Gunray
12:25:29 -!- Melvar has quit (Ping timeout: 256 seconds).
12:25:58 -!- Melvar has joined.
12:30:15 <Taneb> boily sounds like someone who knows a lot about high temperatures
12:30:24 <Taneb> In particular their effects on liquids
12:42:38 -!- augur has joined.
12:44:59 -!- Melvar has quit (Ping timeout: 276 seconds).
12:47:01 -!- augur has quit (Ping timeout: 256 seconds).
12:50:27 -!- augur has joined.
12:57:19 -!- Melvar has joined.
13:02:57 -!- Melvar has quit (Ping timeout: 260 seconds).
13:03:26 -!- Melvar has joined.
13:13:27 -!- LKoen has quit (Remote host closed the connection).
13:14:40 <fizzie> int-e: It's a low bar for what constitutes a successful brexit.
13:30:28 <Taneb> It's possible for brexit to be successful?
13:30:29 -!- skankyyoda has joined.
13:33:07 -!- skankyyoda has quit (Max SendQ exceeded).
13:33:45 -!- skankyyoda has joined.
13:36:48 -!- xkapastel has joined.
13:51:51 <fizzie> Taneb: Apparently, avoiding the apocalypse is sufficient.
13:52:35 <Taneb> fizzie: on the other hand, fewer cool car chases
13:55:49 <Taneb> Currently, Cambridge
13:56:19 <izabera> is anyone else in cambridge uk?
13:56:28 <izabera> we should have a #esoteric meeting
13:56:30 <Taneb> I presume there's more people in the city than just us two
13:56:49 -!- skankyyoda has quit (Ping timeout: 248 seconds).
13:56:52 <Taneb> But in the intersection of the population of Cambridge and of #esoteric?
13:56:56 <Taneb> That's going to be smaller
13:57:17 <fizzie> One of my coworkers used to commute from Cambridge to London until not long ago, but now lives in London.
13:57:42 <izabera> Taneb: you're studying or working?
13:58:03 <Taneb> Small company called Myrtle
13:58:36 <Taneb> We do neural networks to FPGAs via Haskell
13:59:16 <Taneb> wob_jonas: the neural networks?
13:59:21 <izabera> i work in a company called undo and we do fancy debuggers
14:00:13 <Taneb> wob_jonas: mostly automotive applications but we're expanding into other areas, I can't really talk about i tmuch
14:00:29 <fizzie> *beep* #CONFIDENTIAL INFORMATION DETECTED
14:01:10 <izabera> what is this page https://www.myrtlesoftware.com/no-access/
14:01:31 <Taneb> izabera: somewhat inaccessible
14:02:10 <Taneb> izabera: I'd be up for an #esoteric meetup
14:02:30 <Taneb> We might be able to convince fizzie to come up from London too
14:02:46 <wob_jonas> Do you use neural networks on time series data where you need some short-term persistent state?
14:03:08 <Taneb> That's a very specific question
14:03:25 <Taneb> Makes me think there's a reason for it other than curiosity
14:04:44 <wob_jonas> Taneb: yeah. I'm curious because some of the hard recognition stuff I did at my job had time series input, and that is one reason why it's so difficult
14:04:56 <Taneb> The answer is "not yet but soon"
14:04:58 <wob_jonas> you can never tell how much of the past input is significant
14:05:22 <wob_jonas> but automotive applications would probably also involve that
14:05:41 <Taneb> Everything's frustratingly early days yet
14:06:18 <wob_jonas> I don't really trust neural networks, or most other kinds of machine learning, I'm prejudiced against them
14:06:38 <wob_jonas> I didn't enjoy when co-workers tried to use them
14:06:54 <wob_jonas> although admittedly they do sometimes work, there's really good evidence that they are applicable in some situations
14:09:57 <Taneb> They feel like suspicious magic and alchemy to me
14:10:31 <Taneb> Like, they work well enough that there's something going on, but there's no real science for when they work, what sort of architecture you need, etc
14:12:35 <wob_jonas> Taneb: the reason I don't like them is because I feel they're overused as one of those magical bullets, used in situations when some more domain-specific solution would be much better
14:12:48 <wob_jonas> same reason why I hate multi-threading, GPU computations, and JIT
14:13:50 <Taneb> eh, I feel at least multi-threading and GPU computations are a lot better understood than neural networks with regards to when they're applicable
14:16:45 <Taneb> Although whether they're sufficiently well-understood by the people who put them into practise is another matter
14:17:31 <izabera> jit is the same as regular compiling
14:18:56 <wob_jonas> I understand enough about when they should be used that I can recognize they're often used in vain
14:19:17 <wob_jonas> not that I'm immune to premature optimizations in other ways, mind you, not even close
14:22:43 -!- AnotherTest has joined.
14:43:59 <int-e> `? procrastination
14:44:00 <HackEgo> The Procrastination is destined to rule the world... right after watching this last funny cat clip on youtube.
14:44:42 <int-e> `slwd procrastination//sblastbfinalb
14:44:44 <HackEgo> procrastination//The Procrastination is destined to rule the world... right after watching this final funny cat clip on youtube.
14:45:41 <int-e> Taneb: I completely agree that neural networks are alchemy.
14:46:08 <Taneb> int-e: luckily for me, they're alchemy that some people are willing to pay a lot of money for
14:47:21 <int-e> (http://neuralnetworksanddeeplearning.com/chap5.html and chap5.html helped me demystify "deep learning" a bit)
14:51:05 -!- `^_^v has joined.
14:53:37 <int-e> There *is* progress. But the solutions to two fundamental challenges here are boring. I) deeper layers in deep networks learn slowly (solution: throw more computing power at the problem, oh, and reduce the number of coefficients by duplicating neurons ("convolutional networks")), and II) they are prone to overfitting to a much higher degree than single-hidden-layer networks (solution: regularize by a) penalizing...
14:53:43 <int-e> ...big coefficients and b) rerandomizing neurons in networks during training, and probably lots of other stuff that people have already done for shallow networks). Which explains why people see the real advances in recurrent networks and things like LSTM (a decaying memory whose appropriate decay can be trained by gradient descent) and other small innovations in network architecture.
14:58:52 -!- AnotherTest has quit (Ping timeout: 265 seconds).
15:12:19 -!- moei has joined.
15:25:02 -!- LKoen has joined.
16:01:49 -!- sparr has quit (Quit: WeeChat 1.9.1).
16:03:38 -!- Melvar` has joined.
16:05:19 -!- Melvar has quit (Ping timeout: 248 seconds).
16:05:58 -!- sleffy has joined.
16:22:46 -!- erkin has joined.
16:28:00 -!- contrapumpkin has joined.
16:35:03 -!- contrapumpkin has quit (Quit: My MacBook Pro has gone to sleep. ZZZzzz…).
16:54:25 -!- Melvar` has quit (Ping timeout: 268 seconds).
17:07:25 -!- Melvar` has joined.
17:16:33 -!- AnotherTest has joined.
17:18:29 -!- mroman has joined.
17:18:47 <mroman> is there a free digital circuit simulator for windows?
17:19:34 <wob_jonas> I mean, there's got to be. That sounds like something someone would make if it doesn't exist yet.
17:20:26 <mroman> but one that doesn't suck
17:20:33 <mroman> it's the suck part that not everybody can make
17:20:58 <mroman> I wanna do large scale stuff
17:37:50 <wob_jonas> \oren\: those are harder to debug than a software simulation
17:38:08 <wob_jonas> I mean, I'm prejudiced, I'm a software guy
17:38:12 <mroman> logisim confuses east with west
17:38:35 <wob_jonas> I believe the main advantage of hardware over software is that you can kick hardware when you're angry at it not working
17:39:24 <mroman> for some reason everything is flipped
17:39:32 <\oren\> wob_jonas: a reliable way to disable a hard disk is to dd /dev/zero to it and then hit it several times
17:39:53 <\oren\> with the blunt end of a screwdriver
17:41:33 <mroman> modelsim costs 1.5k or something
17:43:15 <\oren\> the only hardware I've ever hit and had it work better than before is the screen on my old old laptop
17:44:03 <\oren\> which required taps to the bottom centre to keep it working
17:47:26 <mroman> industrial strength shredder will do the trick too
17:50:29 * int-e is so tempted to suggest the KOHCTPYKTOP as a circuit simulator
17:51:48 <mroman> logisim doesn't even have toggle buttons as input
17:51:53 <mroman> now I gotta write my own toggle button
17:52:43 <mroman> oh wait. pins are toggleable
17:55:03 <int-e> mroman: the last four weeks had 4, 3, 80 and 4 accesses to the Burlesque shell.
17:56:54 <mroman> hm. but how do I combine 8 wires into one wire of width 8
17:57:01 <mroman> seems logisim can only do the splitting part
17:57:04 <mroman> not the combining part.
17:57:45 <int-e> use a logic or of wires?
18:07:01 <int-e> http://www.cburch.com/logisim/docs/2.7/en/html/guide/bundles/splitting.html suggests that this is possible indeed
18:08:42 <int-e> (in fact the second example mixes directions on a single wire, IIUIC)
18:09:33 -!- Phantom_Hoover has joined.
18:10:47 <mroman> so a splitter can be used for both?
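The splitter-used-both-ways behaviour being discussed amounts to plain bit packing and unpacking; an illustrative sketch (this is just the idea, not Logisim's actual API):

```python
# Eight 1-bit wires in, one 8-bit bus out, and the reverse.

def combine(bits):            # bits[0] is the least significant wire
    return sum(b << i for i, b in enumerate(bits))

def split(value, width=8):
    return [(value >> i) & 1 for i in range(width)]

assert combine([1, 0, 1, 1, 0, 0, 0, 0]) == 0b00001101
assert split(0b00001101) == [1, 0, 1, 1, 0, 0, 0, 0]
```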
18:11:57 <int-e> this channel would probably call it a splerger.
18:13:33 <esowiki> [[Numberwang/Implementations]] https://esolangs.org/w/index.php?diff=54156&oldid=53777 * Unt * (+71) Infinite recursion crash countermeasures.
18:27:52 <mroman> I guess I can do a BF CPU with this.
18:31:11 <mroman> I kinda like hardware stuff.
18:35:03 -!- LKoen has quit (Quit: “It’s only logical. First you learn to talk, then you learn to think. Too bad it’s not the other way round.”).
18:38:31 <mroman> https://i.imgur.com/OarOcTl.png
18:39:09 <mroman> It can count upwards a location in memory :D
19:04:47 <\oren\> why can't I transpose dimensions
19:11:22 -!- LKoen has joined.
19:19:51 <\oren\> I have data in the form
19:20:16 <\oren\> name_of_foo, name_of_bar, 4.56
19:20:39 <\oren\> and I want it to be a table with a column for each bar and a row for each foo
19:21:28 <wob_jonas> \oren\: because it would break the type safety of your data model :-)
19:23:59 * \oren\ wonders if postgres has an EVAL function he can pass a procedurally generated string to
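In Postgres this kind of pivot is usually done with crosstab from the tablefunc extension; the reshaping itself, sketched in Python with made-up sample data:

```python
# Reshape (foo, bar, value) triples into a table with one row per foo and one
# column per bar, as \oren\ describes. The sample rows are invented.

rows = [("a", "x", 4.56), ("a", "y", 1.0), ("b", "x", 2.5)]

foos = sorted({f for f, _, _ in rows})
bars = sorted({b for _, b, _ in rows})
cell = {(f, b): v for f, b, v in rows}

# Missing (foo, bar) combinations come out as None, like NULLs in SQL.
table = {f: [cell.get((f, b)) for b in bars] for f in foos}
assert bars == ["x", "y"]
assert table["a"] == [4.56, 1.0]
assert table["b"] == [2.5, None]
```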
19:34:44 <esowiki> [[Hieroglyphics]] https://esolangs.org/w/index.php?diff=54157&oldid=54145 * Plokmijnuhby * (+2233)
19:35:33 <esowiki> [[Hieroglyphics]] https://esolangs.org/w/index.php?diff=54158&oldid=54157 * Plokmijnuhby * (+17)
19:35:45 <fizzie> I was planning to do a Befunge coprocessor for a computer hardware architecture course (which I think was about doing a MIPSy thing in VHDL, with a coprocessor as an extra-points objective), but I think I had to drop that course.
19:47:34 <wob_jonas> I guess that might make slightly more sense than all that brainfuck thing
19:52:40 -!- sleffy has quit (Remote host closed the connection).
19:55:23 -!- sleffy has joined.
20:03:05 -!- augur has quit (Remote host closed the connection).
20:03:32 -!- augur has joined.
20:08:01 -!- augur has quit (Ping timeout: 248 seconds).
20:21:46 <esowiki> [[Hieroglyphics]] M https://esolangs.org/w/index.php?diff=54159&oldid=54158 * Plokmijnuhby * (+270)
20:28:02 <mroman> I'll create a CPU for 2D shit!
20:33:37 <int-e> like game of life?
20:37:06 -!- S1R has joined.
20:40:19 <shachaf> I wish machine learning people would stop using the word "neuron" and all brain analogies.
21:04:42 -!- `^_^v has quit (Quit: This computer has gone to sleep).
21:10:11 -!- augur has joined.
21:10:23 -!- augur has quit (Remote host closed the connection).
21:14:38 <esowiki> [[Bitter]] https://esolangs.org/w/index.php?diff=54160&oldid=54127 * DMC * (-23) /* Description */
21:16:33 -!- izabera has quit (Ping timeout: 264 seconds).
21:28:03 -!- `^_^v has joined.
21:46:29 <shachaf> Cale: This homogeneous polynomial thing is pretty neat.
21:54:57 -!- Melvar` has quit (Ping timeout: 240 seconds).
21:58:59 -!- mroman has quit (Ping timeout: 260 seconds).
22:08:20 -!- Remavas has joined.
22:08:35 -!- Melvar` has joined.
22:13:07 -!- moony has left.
22:13:17 -!- erkin has quit (Quit: Ouch! Got SIGIRL, dying...).
22:26:46 -!- izabera has joined.
22:28:12 -!- laerling has quit (Quit: Leaving).
22:37:14 -!- SIR has joined.
22:41:02 -!- S1R has quit (Ping timeout: 276 seconds).
22:41:53 <shachaf> Cale: So can you represent power series this way too?
22:45:02 -!- Remavas has quit (Quit: Leaving).
22:45:48 -!- LKoen has quit (Remote host closed the connection).
23:04:07 -!- SIR has quit (Read error: Connection reset by peer).
23:04:31 -!- SIR has joined.
23:06:44 -!- `^_^v has quit (Quit: This computer has gone to sleep).
23:07:35 -!- AnotherTest has quit (Ping timeout: 276 seconds).
23:14:58 -!- moei has quit (Quit: Leaving...).
23:30:38 <\oren\> "Thanks, but in the future please just provide the information about how to reproduce a problem, not a suggested fix. I don't read suggested fixes,"
23:32:59 <fizzie> "Patches not welcome."
23:35:48 <fizzie> fungot: Do you welcome patches?
23:35:48 <fungot> fizzie: which one? i can't
23:37:13 <fungot> shachaf: how do you implement the macro ellipse? in r5rs, but many people arn't. at least as well as
23:37:37 <fizzie> Those poor people that arn't in r5rs.
23:38:49 <shachaf> fungot is in r98rf, not r5rs
23:38:49 <fungot> shachaf: in bristol we'd say " fnord fnord
23:43:43 -!- augur has joined.
23:48:19 -!- augur has quit (Ping timeout: 256 seconds).
23:54:03 -!- wob_jonas has quit (Quit: http://www.kiwiirc.com/ - A hand crafted IRC client).