Giants meet

I am actually not surprised about this. I'm just wondering why it didn't happen sooner.


A thought on neural networks

Neural networks are a (currently not very successful) attempt to programmatically mimic the learning behaviour of the human brain. Research in neural networks has mostly focused on topologies and on the transfer functions in the nodes. However, the aspect of time has been neglected: human nerves transport signals at (relatively slow) speeds between 0.5 and 120 m/s (quoting a random reference from the web).

Could it be that our learning capability depends not only on particular signal values (the part that artificial neural networks simulate), but also on the propagation time between neurons in the brain? A signal might have a different effect on a neuron depending on the time of its arrival.

This would add another dimension to artificial neural networks: temporally changing transfer functions in the nodes. This is just an idea for further research; maybe someone has already looked into it.


x86-64 ISA oddity

Newer processors have FXSAVE and FXRSTOR to save and restore the floating-point state of both the x87 and SSE units. In 64-bit mode, however, the PUSHAD and POPAD instructions raise an invalid-opcode exception (#UD), there is no new instruction to save and restore all general-purpose registers, and hardware task switching is disabled. I'm wondering why the AMD designers decided to cripple the CPU in such a way (alongside disabling segmentation, but that's a story for another post).