
matrix multiplication is just regular multiplication a bunch of times it's retard tier

>I can't multiply

Yeah, there are lots of other incels on this board too.

It's literally just

np.matmul(x1, x2)

what's so hard about it?

Have you tried memorizing how to do the multiplication?

I calculated the partial derivatives of the loss function wrt all of the weights and made update equations for scalar weight values, then looped through each layer/neuron with nested for loops. It gives the right values, but the person grading will know I'm retarded. I can't get the right values with matrices no matter what I do.
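For what it's worth, here's a sketch of the loop-to-matrix translation (everything here is made up for illustration: a single linear layer y = W @ x with squared-error loss, not anon's actual assignment). The per-weight double loop turns out to be exactly an outer product:

```python
import numpy as np

# Hypothetical single linear layer y = W @ x, loss L = 0.5 * ||y - t||^2.
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 4))
x = rng.standard_normal(4)
t = rng.standard_normal(3)

y = W @ x
err = y - t  # dL/dy

# Scalar version: one weight at a time, like the for-loop approach.
grad_loop = np.zeros_like(W)
for i in range(W.shape[0]):        # each output neuron
    for j in range(W.shape[1]):    # each incoming weight
        grad_loop[i, j] = err[i] * x[j]   # dL/dW[i,j] = dL/dy[i] * x[j]

# Matrix version: the double loop above is exactly an outer product.
grad_matrix = np.outer(err, x)

assert np.allclose(grad_loop, grad_matrix)
```

If the scalar equation for each weight is "some per-neuron factor times some per-input factor", the matrix form is usually an outer product (or a matmul, once you batch it).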

I see the problem. It's the frog.

Delete all frog jpegs off of your electronic devices and your IQ will instantly be raised by 15-20 points.

You can thank me later.

Thanks bro, I just replaced all of them with 'jaks and I'm feeling much smarter

Yes, shapes are correct and everything. I'm just incapable of visualizing the network in such a way that I can translate the scalar equation to a matrix equation. Probably deserve this for taking an ML course without ever having taken linear algebra. Not that I won't get an A anyway, but I still feel very guilty.

Are you remembering to transpose your matrix? Because the gradient of matrix multiplication is computed with just another matrix multiplication.
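For anyone else stuck on this, a quick numpy sketch of that claim (all names made up; dC stands in for whatever upstream gradient the next layer hands you):

```python
import numpy as np

# If C = A @ B and dC is the upstream gradient dL/dC, then the gradients
# are themselves matmuls, just with a transpose:
#   dL/dA = dC @ B.T
#   dL/dB = A.T @ dC
rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))
dC = rng.standard_normal((2, 4))  # pretend upstream gradient

dA = dC @ B.T   # (2, 4) @ (4, 3) -> (2, 3), same shape as A
dB = A.T @ dC   # (3, 2) @ (2, 4) -> (3, 4), same shape as B

# Numerical check on one entry, using the scalar loss L = sum((A @ B) * dC)
def loss(A_):
    return float(np.sum((A_ @ B) * dC))

eps = 1e-6
A_bump = A.copy()
A_bump[0, 0] += eps
numeric = (loss(A_bump) - loss(A)) / eps
assert abs(numeric - dA[0, 0]) < 1e-4
```

The transpose is what makes the shapes line up, which is why "shapes are correct" is necessary but not sufficient: you can get the right shape with the wrong matmul.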

SO TRAAAAAAAAAAAAAAAAANNNNNNNNNNSPOSEEEEEEEEEEEEEEEEEEEEEEEEEE

what does this have to do with matrix multiplication?

I know that feeling brother. Debugging neural nets is an absolute shit fest, and I never learned how to cope.

if you're getting filtered at matrix multiplication, it's only going to get harder

Something that helped me was the observation that any given column in C is influenced by only one column of B (and any one row of C is influenced by only one row of A). May seem obvious written here but can be easy to forget if you're in the weeds of manually doing the calculations.

Also watch 3blue1brown's "essence of linear algebra" series.
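That column/row observation is easy to check directly in numpy (toy matrices, just to convince yourself):

```python
import numpy as np

# For C = A @ B: column j of C depends only on column j of B,
# and row i of C depends only on row i of A.
rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
C = A @ B

# Column j of C is just A applied to column j of B:
for j in range(3):
    assert np.allclose(C[:, j], A @ B[:, j])

# Row i of C is just row i of A applied to B:
for i in range(3):
    assert np.allclose(C[i, :], A[i, :] @ B)
```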

I gotta give 3blue1brown another try, most of what he says flies over my head since it's all a little condensed but he might be more helpful after I've thought about everything a bit, thanks.

hes a fag ass john green talking retard

The problem with 3B1B is that he's like 140IQ and he doesn't seem to realise that he's abnormally smart, and presents his information like the average person should be able to understand it.

but I understand it just fine?

Have you done an official IQ test?

Anyone over 120IQ can probably understand his videos, but midwits will really struggle.

120-130 iq is midwit

i struggle at doing things i really struggle at being a doer

undergrad math is like 0iq-90iq. A computer can do all of it.

so what youre saying is, your brain isnt a GPU?

Good thing you can just do it in Excel

you go across the row on the first one and down the column of the second one

How? It's one rule: if you have A and B and want to make AB, you rotate A 90 degrees and slide it down over B, multiplying each term that overlaps.
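The "across the row, down the column" rule, spelled out as code (toy example, not anyone's homework): each entry is C[i, j] = sum over k of A[i, k] * B[k, j], and the triple loop matches np.matmul exactly.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))

C = np.zeros((2, 4))
for i in range(2):          # walk across row i of A
    for j in range(4):      # walk down column j of B
        for k in range(3):  # multiply overlapping terms and accumulate
            C[i, j] += A[i, k] * B[k, j]

assert np.allclose(C, np.matmul(A, B))
```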