It's incredible to see this much Transputer work without, seemingly, any mention of either Occam or the microcoded channel instructions. The fact that the Transputer was intended to be a high-performance design is so often lost, and the development of the floating-point units proved to be one of the more successful parts of the entire enterprise.
The impressive part here isn't so much the emulator, but all the rest. A Pascal compiler for the Transputer as a teenager in early-'90s Mexico? That's brilliantly unlikely.
I have fond memories of implementing a variety of parallelized search algorithms with Occam for the one Transputer we had at the school lab. I loved it. The professors and TAs thought I was nuts.
I too loved Occam. It was a real eye-opener as to how you could structure things, and something more people should be exposed to.
Ooh I love this.
Seeing a DOS transpiled into JavaScript is pretty cool. Seeing a DOS written in the '90s by a native Spanish speaker who wanted everything written in Spanish is AWESOME. Being told to write AYUDA is great. I love the sense of ownership it implies the author had as a teen. Just super fun.
TIL about ">>>" giving an unsigned result, very useful!
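For anyone unfamiliar, a quick illustration (my own example, not from the article) of what ">>>" does in JavaScript:

    (-1) >>  0   // -1          (">>" preserves the sign bit)
    (-1) >>> 0   // 4294967295  (">>>" zero-fills, so the result is an unsigned 32-bit integer)

    // Handy in an emulator when a 32-bit value stored as a signed number
    // needs to be read back as unsigned (variable name made up for illustration):
    const reg = 0xFFFFFFFF | 0;   // stored as -1
    console.log(reg >>> 0);       // 4294967295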
>In the old times, Javascript was an interpreted language, but since many years ago it is implemented as a JIT (Just-In-Time) compiler, so it can approach speeds closer to the C language.
Looking at many benchmarks, I've seen C#, Java, and Go approaching the speed of C, but I have never seen JavaScript doing that.
Perhaps it's because it's dynamically typed, which makes it harder for the compiler and VM to optimize the generated code?
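To make that concrete, here's a rough sketch (my own, with made-up names) of the kind of pattern that forces a JS engine to abandon its optimized code, something a statically typed compiler never has to guard against:

    // V8 specializes add() for one object "shape" (hidden class) while the loop is hot.
    function add(p) { return p.x + p.y; }

    for (let i = 0; i < 1e6; i++) add({ x: i, y: i });  // monomorphic call site: fast path
    add({ y: 1, x: 2 });        // different property order = different shape
    add({ x: 1, y: 2, z: 3 });  // yet another shape
    // The call site is now polymorphic, so the engine falls back to slower generic
    // property lookups and may deoptimize the specialized code it had compiled.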
JS runtimes are probably the most advanced, powerful, and heavily optimized of any dynamically typed language. The number of things that both can be and are done with JS today is really impressive. Some might say horrible; how dare people do productive things in JS...
It's also that computing capability is that much greater today than it was in the '80s and '90s. Up until 2002 or so, processing power close to doubled every other year; since then it has slowed down a bit, but I still went from 64 MB of RAM in my computer around 1998 to 64 GB in my computer from 6 years ago. Processing capability has gone up just as much. Of course, a lot of it went into parallelization this past decade and a half, since squeezing more out of each node/generation has been decreasingly fruitful.