
XP Forever?

A real production Linux server starts with a pretty minimal base and gets loaded up with just the packages it needs, just like an OpenBSD or FreeBSD or whatever system.
Does it? I admit, I'm not really familiar with Linux in a production environment, but unless they're building from source or using radically different distributions from any I've seen, it seems to be pretty much impossible to get a modern, updated Linux setup with less than about 600 different packages, thanks to dependency hell. (GNOME and KDE are the patron sinners for this, but there are plenty of other offenders in the repositories.) Even cutting out the graphical desktop I doubt it could go much below 400.
 
In today's world, I'm not sure that I buy the argument that "if it's open source, the number of maintainers is much larger". I've seen small bugs on the various support forums hang around for years. I suspect that well over 90% of the user base of a given distro lacks the ability to run down and correct a code bug. So usually, bug-chasing falls to the same small group of people.
 
it seems to be pretty much impossible to get a modern, updated Linux setup with less than about 600 different packages, thanks to dependency hell

A completely stripped Debian install is about 60, but you're comparing apples and oranges again. In a Linux distribution the entire thing is modular, so dozens of things that in a BSD distribution would be lumped into one big hairy tarball come as separate items. Ultimately you're still dumping about the same amount of code on the hard disk; it's just that BSD lumps everything into one big fat combo meal while Linux keeps track of each individual McNugget. The BSD people pretend that their approach is somehow superior because their entire base lives in one big fat CVS server that's only directly touched by an anointed few rather than collected from various separate projects, but... that, again, seems to be an idea based more on "feelings" than anything else. (And it also contributes to the notoriously slow technical evolution of the BSDs compared to Linux.)
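For what it's worth, the "how many packages" argument is easy to check for yourself. Here's a minimal sketch that counts installed packages by filtering the output of `dpkg -l` on a Debian-family system; the `sample` text below is made up for illustration, and in practice you'd feed it the real command output:

```python
def count_installed(dpkg_l_output: str) -> int:
    """Count lines of `dpkg -l` output flagged 'ii' (installed)."""
    return sum(1 for line in dpkg_l_output.splitlines()
               if line.startswith("ii"))

# In real use, something like:
#   subprocess.run(["dpkg", "-l"], capture_output=True, text=True).stdout
sample = """\
ii  base-files  11.1  amd64  Debian base system files
ii  bash        5.1   amd64  GNU Bourne Again SHell
rc  old-pkg     1.0   amd64  removed, config remains
"""
print(count_installed(sample))  # prints 2
```

The `rc` line is deliberately excluded: `dpkg` keeps entries for removed packages whose config files remain, which is one reason naive line counts of `dpkg -l` output overstate the real install.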

BSD's monolithic construction also makes it awkward to fix problems discovered in the base distribution without rebuilding the whole **** thing. It's only relatively recently that FreeBSD introduced a system for doing binary updates of the base, and so far as I'm aware OpenBSD doesn't have that at all. (Updating involves untarring a new set of base packages over the old, all at once.) It's sort of ironic to me that these systems have better mechanisms for handling upgrades to add-on application software than the core OS.

The "hard to upgrade" thing doesn't matter a whole lot if all you're doing with BSD is building disposable appliances that are replaced when outdated rather than large systems that need to keep running indefinitely, but it is a data point in favor of arguing that Linux scales "up" better than the BSDs do. Haven't heard of anyone running FreeBSD on their IBM mainframe while Linux is a fairly big chunk of that business these days...

In today's world, I'm not sure that I buy the argument that "if it's open source, the number of maintainers is much larger". I've seen small bugs on the various support forums hang around for years. I suspect that well over 90% of the user base of a given distro lacks the ability to run down and correct a code bug. So usually, bug-chasing falls to the same small group of people.

But if you use a distribution like RedHat there are at least companies paying money to hire some people to at least pretend that they care about the overall security posture of all software that goes into the product. I don't see how this is an argument in favor of the BSDs, where the pool of eyes looking at the code will be even smaller.
 
But if you use a distribution like RedHat there are at least companies paying money to hire some people to at least pretend that they care about the overall security posture of all software that goes into the product. I don't see how this is an argument in favor of the BSDs, where the pool of eyes looking at the code will be even smaller.

But the people paying money for support are usually corporate "bleeding edge" types. If it's not bleeding, it gets demoted to irrelevancy. Where's my support for VIA chipsets (e.g. 8257)? How's the legacy floppy support? FWIW, BSD still runs those quite well. Windows 7 still does respectably well (I don't know about 10, but I suspect Microsoft has forced the issue of "Upgrade or die" even more than Linux.)

Is recompiling a kernel a big thing to today's world? I hadn't noticed.
 
But the people paying money for support are usually corporate "bleeding edge" types. If it's not bleeding, it gets demoted to irrelevancy. Where's my support for VIA chipsets (e.g. 8257)? How's the legacy floppy support?

But I thought the topic was security and fitness for "mission critical" applications? I doubt many enterprises are running their mission-critical networked services on VIA chipsets anymore. ;)

Seriously, I'm not arguing that Linux is the greatest thing for every application ever and hasn't been involved with all sorts of really irritatingly teeth-hurting fiascoes over the years. All I'm saying is that Linux and BSD are simply tools and like any tool the skill and wisdom of the person wielding it is going to be a greater factor in the success or failure of the endeavor than the qualities of the tool itself. There might indeed be corner cases where because of driver support or hardware overhead or whatever only one of the two tools will work at all so, sure, use the only one that works there, but if you're discussing a general case of "I need to run application X on commodity hardware in a position where it might be exposed to remote users with hostile intent" they're *pretty much* the same hammer and which one is actually "better" in a given instance isn't something that should be decided based on a misleading advertising slogan.

Honestly, I've worked for BSD heavy shops for *years* and built many a FreeBSD-based firewall(*) to protect production network services (in part because we were too cheap to buy commercial solutions) so don't get me wrong, I by no means dislike BSDs. I just happen to think that the differences between them and Linux are seriously overblown from a practical standpoint and that slogans that imply your OS somehow preemptively fixes security problems because it ships with no daemons enabled are silly and potentially dangerous.

(* When it was ported over my favorite combination was FreeBSD with the kernel compiled to enable the OpenBSD PF firewall. I didn't use OpenBSD itself because at the time it was *comically* broken on the servers I had to work with. Didn't support the Broadcom ethernet cards, the SCSI driver was broken, SMP was broken... okay, I'll admit it, OpenBSD is sort of a trigger word. There are better BSDs.) ;P
 

I wasn't defending OpenBSD.
My point is that the notion that open source in general would be of better quality than closed source, and bugs will be rare and found quickly, is a myth.

I personally think in the average case there's no advantage of open source over closed source.
In the case where a large company pays highly skilled software engineers to write code and review that of their peers, I think closed source is better than open source.
I say that because apparently open source has failed to deliver decent alternatives to highly specific software such as Photoshop, 3dsmax, Havok/PhysX, etc.
Likewise, in my own area of expertise, that of realtime 3d graphics, the open source 3d engines available are quite laughable in terms of features, performance etc compared to the stuff that well-paid developers write.
As far as I can tell, open source development is just not in the same league.

I think the same logic would apply to security-related issues.
 
In the case where a large company pays highly skilled software engineers to write code and review that of their peers, I think closed source is better than open source.

Well, the thing about that is you can't actually directly compare data, can you? When something idiotic makes it into the Linux kernel (or OpenSSL, or Bash, or whatever) the horror is right out there to be seen once it's actually found and the whole Internet can go crazy condemning the developers as idiots for not finding something SO DANG OBVIOUS earlier. However, we as outsiders don't get to point and laugh at how outrageously boneheadedly awful Microsoft's source code is because we're not allowed to see it, so... I guess you can go ahead and assume the quality is better without any actual evidence to support that?

It's not as if people are *not* finding exploitable security bugs in Windows... all the time... constantly... forever...
 
Well, I've used BSD since the 4.2BSD days. The occasion was that the local DEC sales guy pushed a VAX11/750 and BSD with promised CSRG support. He also promised that supporting HASP over a leased line to a S/370 would be no problem. CSRG tried to make it work for at least a year or two--they never could before we dropped the VAX and did our HASP using a PC/AT. Otherwise, BSD was surprisingly robust and we had very few crashes.

I'd been exposed somewhat to Unix V7 as part of a port to the then-not-ready-for-primetime 80286, but Intel largely did all of the kernel work. We had a PDP 11/70 to run the real thing, but my role was far from systems programming--mostly trying to port our existing 8085 program base over. Somewhere, I have a set of SysVR4 distribution tapes, but they're mostly of academic interest.

I started myself on Linux with a very early Slackware distro and eventually settled in with RH 3 then 5, then changed to Debian (I think). RH back in the day was the way to go because of the good organized support--and you got a book with it.

But as far as nuclear power plants go, a VIA chipset would be "futuristic" or "bleeding edge" at best, considering that the last US nuclear plant to go online was in 1996 (construction started in 1973). How's Linux for paper tape support? :)

As far as security, I'm pretty old school. Nowadays, it seems that everyone wants to connect everything to the net, no matter that any benefit might be dubious at best. That just seems to be inviting the black-hat guys to fool with your stuff. After every report of an intrusion, I ask "Exactly why did all of this information need to be online to the outside world 24/7?"

So yeah, I'm a fuddy-duddy--and proudly so.
 
As far as security, I'm pretty old school. Nowadays, it seems that everyone wants to connect everything to the net, no matter that any benefit might be dubious at best. That just seems to be inviting the black-hat guys to fool with your stuff. After every report of an intrusion, I ask "Exactly why did all of this information need to be online to the outside world 24/7?"
Madness, Chuck, madness! Don't you know that the Internet of Things is the future of...things? It must be, because people keep talking about it!
 
It's pretty hard to beat a locked safe and an armed guard for real security.

Not impossible however. I think I related the story of a good friend who worked for Lockheed and did his stuff in a guarded screen room. Rules were pretty strict--no magnetic media, no cameras, no tape recorders, no radios allowed inside; paper notes were strictly forbidden. However, seeing as he was an engineer, calculators were perfectly okay. So he brought his loaded-to-the-gills HP-41 and simply stuck his stuff in memory. The security guys just thought it was a plain non-programmable calculator...
 
So yeah, I'm a fuddy-duddy--and proudly so.

Be proud of it, man. I'm working hard on perfecting being a prickly old fuddy-duddy myself.

(We just all have to be prickly over a different subset of things; I think it's in the contract you sign but somehow forget about when you turn "not-young".)

I've literally had nightmares about things like OpenStack so I totally feel the pain of anyone who's shocked by the current state of Internet-Cloud-App-Big-Data-Madness. It's still a decently paying gig to pretend like those are good and useful and not nightmarishly hackish things though so... sorta got to play along and make the best of it sometimes.
 
Getting back on topic, the third and final version of the unofficial Windows XP Service Pack 4 has been released.

Original thread with documentation and download link:
http://www.ryanvm.net/forum/viewtopic.php?t=10321

What's new in version 3.0:
1. POSReady updates through January 2016 have been included
2. A 16-bit app emulation issue when installing from a slipstreamed CD has been fixed
3. Fixed the Physical Address Extension patch (to provide support for more than 4 GB of RAM)
4. Some other minor fixes
5. Fixed the false positive "malware" files in SP4 v2

SP4 also claims to include registry updates to enhance security and address issues not fixed by Microsoft updates.

On a fully configured and updated XP system with the POSReady registry patch applied, SP4 is not of much benefit, but it's great for new installations, so you don't have to wait for all of the updates to download from Microsoft's servers.
 
Found another XP install 'gotcha'. One of my antivirus software packages wreaked havoc on my big box. I'm thinking it was 'SpyBot - Search & Destroy', but I can't be sure. In any event, XP's PC Health function and the REGEDIT editor took a hit. I was able to partially reinstall PC Health but there are a few problems left over. While the system does run, and the all-important games side of it functions okay, there are still a few things going on that aren't up to par.

My solution was to reinstall and get on with life. So, I booted with my XP install CD and made it up to "Starting Windows", then got the BSOD:

***STOP: 0x0000007B (0xF78D2524, 0xc000034, 0x00000000, 0x00000000)

What is key here is 0x0000007B. This means XP is possibly not detecting the HD. There are reams written on this failure, and if by chance you get hit by it, things can become frustrating real quick. It should be noted that my motherboard, an Asus Sabertooth FX990, was initially set up as AHCI, not IDE. When I decided to add XP on a separate WD 600 GB Velociraptor, the XP install went well without having to fiddle with the UEFI/BIOS settings; i.e., AHCI or IDE. Both W10 on a 500 GB SSD and XP on the 600 GB drive functioned perfectly as AHCI. So what changed when attempting to reinstall XP? I figure it must be with the way the BIOS treats a storage device on the 'initial' install. It looks like you would have to FDISK the target HD in order to get back to square 1, or simply go into the BIOS and change the SATA setting to IDE. Long story short, that worked. I'm thinking XP's handling of AHCI may not be up to today's standards. Hope this helps someone down the line.
 
Here's another one for the XP-ers. On one of my systems, the XP install consistently got stuck at "Time remaining: 34 minutes". Restarts and repeated install sessions did nothing.

The trick here is to restart the installation and hit Shift-F10 at the first install prompt. This brings up a command prompt. You can then use Notepad to examine the installation log. Find the last message relating to a device installation and note the .INF file it references. Delete the INF file and restart. There's information from Microsoft on this. I ran into this the other day.
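The log-scanning step above can be sketched as a few lines of code. This is a hypothetical helper, assuming the log is plain text and that the offending driver is the last .inf file it mentions (the `sample` log lines below are made up):

```python
import re

def last_inf_reference(log_text: str):
    """Return the last .inf filename mentioned in a setup log, or None."""
    matches = re.findall(r"(\S+\.inf)\b", log_text, flags=re.IGNORECASE)
    return matches[-1] if matches else None

sample = ("Installed device using netcard.inf\n"
          "Error installing device from badvid.inf\n")
print(last_inf_reference(sample))  # prints badvid.inf
```

On the real machine you'd do the equivalent by eyeball in Notepad, as described above; the point is simply to isolate the last .inf the installer touched before it hung.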
 
Windows 10 has finally nudged out Windows XP to become the second most popular desktop operating system, at least among computers used to browse the Internet:

https://www.netmarketshare.com/operating-system-market-share.aspx?qprid=10&qpcustomd=0

However, if you look at the numbers, the market share of XP actually increased :wow: from December 2015 to January 2016, from 10.93% to 11.42%.

Meanwhile, Linux fans can finally jump for joy that "Linux is more popular than Windows".... Vista.
 
Most likely XP usage is increasing due to it being the holiday season.

Meanwhile in this household Android OS is gaining ground on a number of platforms, including the desktop.

I was given some "broken" 7 inch Android tablets to play with. Managed to get two working. Despite their being basically throwaway consumer items, I enjoyed the experience of learning about Android OS.
 
Here's an XP attaboy. I found an old/new HP Windows Mobile 2003 PDA the other day, and needed to install MS ActiveSync on it. In order to get it into the programming mode, you need an XP box, and I chose my Asus A7 gaming rig for the task. To install new programs through ActiveSync, you need at least XP SP2 with Outlook 2002. I couldn't find a stand-alone copy of Outlook 2002 anywhere, but turned up a 3-CD package of Office XP Pro that had been hiding in my stash.

Office XP Pro installed without incident and I was able to access the PDA with no problems. As part of the install procedure, you are asked to provide the Office Pro CD key and some personal info. Today, I fired up the A7, clicked on Windows Update, and was surprised to see about 28 ready updates for Office XP Pro. Upon successfully installing the updates, I clicked on the Office Pro 'activation' tab and was pleased to see that the install had been properly activated. So, something must still be alive and well on the MS server side. :thumbsup:
 
I would still use XP, but it's become pretty much impossible to browse the web with it.

Chrome and Firefox both crash constantly for me, and if they don't, the plugins I need do...
 