Fun topics all around so far
3.5" disks inhaled upon the proverbial equine of short stature from the day they were introduced. The increase in capacity was offset by goofy "doors" that usually broke, their springs scratching up the media; that stupid metal shutter that invariably got magnetized, ruining track 0; and even when they didn't mechanically fail, the handful of working media had a lifespan measured in weeks -- and that's assuming more than half a box of disks worked new. And that was twenty years ago, when the tech was in its "prime".
Today, new drives -- if you can find them -- are less reliable than pulls from the dump that sat outside in the snow for a week, format support for anything other than 1.44 meg is nonexistent in many an OS (yes windblows, I'm looking at you), and nobody has actually manufactured new media in so long that the "reject rate" in my experience has surpassed 75%. For every pack of ten I open, regardless of the source, I'm lucky if more than half of them work at all, much less find one without at least one track that's totally banjaxed.
That makes writing .img files with rawrite an exercise in futility.
... meanwhile I still have original 5.25" disks in formats ranging from 135k TRS-80 to 1.2 meg PC that work today as well as they did thirty years ago! The only time I've had 5.25" disks go bad is when something stupid happened, like coworkers covering their system in refrigerator magnets or some dipshit stapling them to a report (actual incidents from 20+ years ago). In the very first box of 3.5" disks I ever had, half of them wouldn't format to full capacity and two of them wouldn't format at all.
It was NEVER a good technology, sure as hell wasn't reliable, and I'm thankful every day that I'm not stuck HAVING to use them just because the industry suddenly got a raging chodo for slapping them into every blasted system out there... and curse every time I have to deal with something like a PS/2 and drag out the external 5.25" just so I have a WORKING floppy drive.
To be fair, optical wasn't exactly a real improvement in reliability, given the microwave art I have decorating one wall and my set of drink 'coasters' in the living room... sad when flash, with its write limits, is a more reliable technology; sadder still when ZIP was better, EVEN taking "click of death" into account!
Plow asunder 3.5" -- whoever engineered that crap deserves a massive pimp-slap with a wet trout.
Admittedly, I say the same thing about SATA connectors.
In many ways it's what I'd classify as false ruggedization, a close cousin to false simplicity. False simplicity is where something is so simplified it makes the job it was designed to do harder; false ruggedization is the other end of the spectrum, where you over-engineer to the point that there's just so much more to go wrong it breaks every time a breeze wafts in the window!
That's kind of what the programming mantra I had drilled into my head by a nun with a ruler -- "the less code you use, the less there is to break" -- was meant to address. Something lost on today's coding dipshits who dive for massive "frameworks" to do everything for them, usually ending up writing more code to make the framework do what they want than if they'd simply written it without the framework in the first blasted place!
As to daylight savings, it's idiotic nonsense. Who the **** wants it to still be sunny and hot at ten o'clock at night in the summer?!? Honestly, in my experience it's not just that the summer numbers would make more sense in winter; the winter numbers would make more sense in summer! I don't see why anyone would want it pitch black out at 4:30 in the afternoon in -9F weather any more than broiling hot in the middle of the night; hence why I think it should be fall forwards and spring back -- which is why even the saying meant to "help" you remember is gibberish to me, since usually when I fall it's forwards flat on my face, and if I were to spring, it'd be back in horror.
Hz vs. cycles? I was always taught that if you say cycles, no time is involved; if you say Hz, you have a time interval. Say "20 cycles" and the question becomes "over how long?"; say "20 cycles at 2 Hz" and it means something. They're two different but related things.
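The distinction can be shown in three lines of arithmetic -- a toy sketch of my own, with hypothetical numbers, not anything from the post above:

```python
# Cycles alone are just a count; Hz means cycles PER SECOND.
# Only by combining the two do you get an actual duration.
cycles = 20        # a bare count -- "over how long?" is unanswered
frequency_hz = 2   # 2 cycles per second ties the count to time

duration_s = cycles / frequency_hz  # now it means something
print(duration_s)                   # 10.0 seconds
```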
Then of course there's metric -- and for all of you out there who are fans of said idiocy, do the world a favor and sierra tango foxtrot uniform! It is the MOST arbitrary numbering system I could imagine, since there is no legitimate math supporting it! It is entirely based on the mouth-breathing halfwit practice of counting on one's fingers in a most inefficient manner -- the pinnacle of "Ooh I cans haz teh ten fingarz!!!"
Even the ancient Sumerians made more sense: they used base 60. That's 2*2*3*5, so at every step you can divide by 2, 3, 4, 5 or 6 without resorting to fractions. There's a reason we still use it on clocks and for navigation.
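Easy enough to check the claim yourself -- a quick sketch of my own, not part of the original rant:

```python
# Count the whole-number divisors of each base: the more divisors,
# the more ways you can split the base without fractions.
def divisors(n):
    return [d for d in range(1, n + 1) if n % d == 0]

print(divisors(10))  # base ten:   [1, 2, 5, 10] -- a measly 4
print(divisors(60))  # base sixty: [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]
```

Twelve clean divisors for 60 versus four for 10 -- which is why halves, thirds, quarters and fifths of an hour all come out as whole minutes.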
In that way, inches, feet, yards and chains make SENSE, since you can at least divide a foot by 2, 3, 4 or 6 without resorting to fractions or decimal places. Mixing a 5 and maybe a 7 in would be nice, but oh no3z, th4t mite envulves actchewal maths... which, with Joe Sixpack and Susie Sunshine making Teen Talk "Math is hard" Barbie look like Alan freaking Turing, GOOD LUCK THERE!
If metric is so superior, why aren't headings and angles measured in 100 "degrees" to a circle? If metric is so superior, why has most everyone in aviation except the Chinese and Koreans switched back to feet and miles for altitude and speed, INCLUDING THE RUSSIANS, WHO SWITCHED AWAY FROM METRIC IN 2011?! If metric is so superior, why haven't we switched to ten decons a day, where each decon is broken into 100 centons?
Or are people that afraid of Cylons?!?
See why Celsius pisses me off? Joe forbid you call them degrees and have 180 of them between freezing and boiling... but sure, THAT decision was "arbitrary" -- certainly SO MUCH more arbitrary than "I have ten fingers" as the basis for an entire numbering system, regardless of how needlessly complicated it makes things...
RIGHT... What's a cubit?
From a purely mathematical standpoint, decimal is just plain stupid; that we've accomplished anything meaningful with it is more a testament to human determination and persistence than to its usefulness as a numbering system.
I've often wondered if it was given to the West by the Arabic culture in the same way C and Unix was unleashed upon the world.
But there's a LOT of stuff people do, or think saves time, or think is more efficient, that I just shake my head at in either utter and complete disbelief or outright disgust to the point of nausea.
See HTML/CSS frameworks, JS frameworks, PHP frameworks, trying to time a single iteration of operations with interrupts disabled instead of counting how many iterations can be done over a fixed period of time with interrupts enabled (so you have some clue what the BIU invalidation impact is)... Oh this is easier, and that is easier, and some other damned thing is easier...
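The benchmarking approach I mean -- count iterations over a fixed window with the system running normally, rather than stopwatching one pass in isolation -- looks roughly like this. A sketch of my own (the `work` function is a hypothetical stand-in, not anyone's real workload):

```python
import time

def work():
    # Stand-in for the operation under test (hypothetical).
    sum(range(100))

def iterations_per_second(window_s=0.25):
    # Run flat out until the window expires, counting completed passes.
    # Interrupts, cache eviction, bus contention etc. all land inside the
    # window, so their cost is averaged into the result instead of hidden.
    deadline = time.perf_counter() + window_s
    count = 0
    while time.perf_counter() < deadline:
        work()
        count += 1
    return count / window_s

print(f"{iterations_per_second():.0f} iterations/sec")
```

Timing one iteration with everything quiesced tells you the best case on an empty machine; counting iterations over a window tells you the throughput you'll actually get.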
Seriously folks, what's in this kool aid you're all sipping and who did you get it from?