Two screens of the same size in front, a laptop on a riser to my left at a right angle, and a third, smaller screen to the left of that, connected to a second laptop.
Lol my setup is a 22" 1080p on the left for basic web viewing, a 28" 1440p in the center for games, and a 32" 1080p on the right for videos/Twitch while I'm playing games.
Not a programmer, but I can definitely attest that TV medicine is wildly inaccurate. From CPR (method, success rate, reasons for initiating it) to the usefulness of various imaging modalities, it's like watching a gardener plant a whole watermelon and grow a sky-high beanstalk the following morning.
A really fun intersection of both of these was the episode of Bones I saw once when visiting my dad, where some corpse had a microscopic code etched in its femur that then hijacked/hacked the CT machine computer when scanned.
To everyone commenting that you have to convert to binary to represent numbers because computers can't deal with decimal number representations: this isn't true! Floating-point arithmetic could totally have been implemented with decimal numbers instead of binary. Computers have no problem with decimal numbers - integers exist. Binary-based floating-point numbers are perhaps a bit simpler, but they're not a necessity. It just happens that the common floating-point standards use binary.
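A quick demonstration of the point above, using Python only because its standard-library `decimal` module happens to implement base-10 floating point alongside the usual binary doubles:

```python
from decimal import Decimal

# Binary (IEEE 754 double) floating point: 0.1 has no exact
# representation in base 2, so the familiar rounding artifact appears.
print(0.1 + 0.2 == 0.3)   # False
print(0.1 + 0.2)          # 0.30000000000000004

# Decimal floating point: the same values, stored in base 10,
# behave the way a human doing decimal arithmetic expects.
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
```

Same concept (significand, exponent, finite precision), different base - which is exactly why the base is a design choice rather than a necessity.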
Wrong. Sounds like you think only fixed-point/fixed-precision arithmetic could be implemented in decimal. There's nothing about floating point that would make it impossible to implement in decimal; in fact, it's a common form of floating point. See the C# `decimal` type docs.
I generally interpret “decimal” to mean “real numbers” in the context of computer science rather than “base 10 numbers”. But yes, of course you can implement floating point in base 10, that’s what scientific notation is!
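To make the "that's what scientific notation is" point concrete, here's a minimal sketch of a base-10 float as an integer significand scaled by a power of ten (all names invented for illustration):

```python
# A toy base-10 "floating point" value is a (significand, exponent) pair,
# meaning significand * 10**exponent -- exactly scientific notation.

def dec_mul(a, b):
    """Multiply two (significand, exponent) pairs: multiply the
    significands, add the exponents -- e.g. 1.5e2 * 2e-1 = 3.0e1."""
    return (a[0] * b[0], a[1] + b[1])

def to_float(x):
    """Convert a (significand, exponent) pair to a regular float."""
    return x[0] * 10.0 ** x[1]

# 150 (15e1) times 0.2 (2e-1) is exactly 30 (30e0), no rounding needed:
print(dec_mul((15, 1), (2, -1)))  # (30, 0)
```

Real decimal formats (like IEEE 754-2008 decimal64) add normalization, rounding, and fixed-width storage on top, but the core representation is the same.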
programmer_humor