
You know what seems to be extra rare? Pentium II and III Xeons.

I am familiar with the Precisions, as I worked for Dell for 5 years. They are pretty good.

That's pretty cool. Since I was first introduced to Dells (way, way back, actually, in that really short window of time when you could buy a Dell at Sam's Club and the 386SX was still sold; a 333s/L if I remember correctly) I've liked the build quality. The clamshell Precisions and Inspirons (Precision 530MT/650/670 timeframe) could be a bit 'fun' to get the case back together, but even today I can get information from the service tag on those old boxes. I think I have a 433-series thinline LPX form-factor desktop still around somewhere...

I have more experience with HP's Z400, Z800, etc. line, as we used them extensively in the physics department at the hospital I worked at.

I'll admit to no experience with HP's workstation line.
 
Is there a different link for "legacy" Dell service tags? I have a PIII Dell I was trying to get info on the other day and couldn't make it pop.
 
I wonder if it was an attempt at market manipulation. I remember that the share price dropped like 50% on the news. After I unbent the needle on my bullshitometer, I bought in. I could buy myself a pretty damn nice Threadripper workstation on the gains. 🤑
Let's hear it for the Threadripper! :)
 
I remember that story as having been many more devices than Supermicro. I was working in enterprise Wi-Fi support at the time, fielding calls from irate customers demanding proof that our hardware did not contain these chips. 'Twas funny.
 
Saturday, for grins and giggles, I powered up my IBM eServer xSeries 330 1RU server (dual PIII-S 1.4GHz, 4GB RAM, Debian Linux) and ran it for a bit. It needs a rare PCI-X GPU, though, for best performance; the fastest 32-bit PCI GPU is pretty slow for workstation purposes.
See if you can find the PCI-X Matrox Parhelia. It runs around $200 USD but is probably the best PCI-X graphics card out there.


Perhaps I've seen some niche stuff, but I've seen quite a few Precision workstations, along with some Intel S5000X dual-socket Xeon boards (a dozen of those at my current $dayjob, in the Intel chassis built for that series of board). These can handle PCIe GPUs with no problem; for workstations you'll likely get a Quadro or a FirePro instead of a GeForce or Radeon.

DUAL-socket workstations are extremely common, having been pretty much the norm not all that long ago. That's why big makers like Dell had commercial lines like the Precision. There was even an entire enthusiast community dedicated exclusively to dual-socket systems; it went defunct a few years back.

Quad+ is what is extremely niche. You need a very CPU-intensive workload to justify one, so the market is dominated by do-it-yourselfers and a few boutique system builders.

At $dayjob, my one Windows XP 64-bit experience was with a Dell Precision 690 with high-end dual Xeons, 4x 300GB 15K RPM SAS drives in RAID 5, a big ATI FirePro GPU with dual 24-inch monitors, and 32GB of RAM (ECC FB-DIMMs, of course). It was used for astronomical photographic plate scanning and data reduction. A >$10,000 workstation, grant-funded.

Lol, you basically just described my workstation, except I based mine on a Supermicro motherboard and built it myself. I even had one of those Precisions once, but that was WELL after they were cutting-edge. I want to buy another one for a project sometime, but shipping is a pain.
 
...I just "ripped" a Threadripper out thanks to extremely poor CPU socket design. Thanks a lot, AMD.
First I ever heard of the Threadripper series being poor anything. In most quarters it's considered the cock of the walk, so to speak. Also, I'm not sure that the PCI-X Matrox Parhelia would be my first choice. Kind of an oddball that didn't hang around any too long. You'd want something that plays well with DX9 on an XP machine. Of course, if gaming is of no concern then just about anything goes.
 
First I ever heard of the Threadripper series being poor anything.

Starting with Socket 775 (circa 2004), Intel introduced a CPU retention bracket to stop the CPU from getting ripped out of the socket when the heatsink grease inevitably turns into glue. This fate befell many Pentium 4s and is the chief reason why my P4 will remain in its motherboard under its CPU cooler until the sun explodes.

AMD did not introduce such a feature until freaking Socket AM5 in 2022. So, whilst doing some pro bono work on a friend's daughter's PC, that Threadripper got "ripped" right out of the socket and bent a dozen of the pins. It took two of us and a screwdriver to pry it off the bottom of the heatsink, a feat which would have been completely impossible while it was still in the socket. The CPU is completely trashed, all because it took AMD 18 years to rip off the most "Well, DUH!" feature of CPU socket design ever invented.

I'm only out $150, but I'm certainly not bitter. He said, sarcastically.

Also, I'm not sure that the PCI-X Matrox Parhelia would be my first choice. Kind of an oddball that didn't hang around any too long. You'd want something that plays well with DX9 on an XP machine. Of course, if gaming is of no concern then just about anything goes.

The Parhelia occupies a really interesting space in that you can take those 2 DVI ports and turn them into 3 VGA ports, then create a contiguous desktop with a resolution of 3072x768. You could then play surround games on it at a time when neither GeForce nor ATI supported such a capability. Games tailored to the Parhelia's Surround feature are rare, but much like Glide games, it is a unique experience you cannot have any other way.
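
Quick back-of-the-envelope on that 3072x768 number, just to make it concrete (my own illustration, not anything from Matrox's docs): it's simply three 1024x768 panels joined side by side into one framebuffer.

```c
#include <stdio.h>

int main(void) {
    /* Three 4:3 panels at 1024x768, joined horizontally. */
    const int panels = 3, panel_w = 1024, panel_h = 768;
    const int desktop_w = panels * panel_w; /* 3 * 1024 = 3072 */
    const int desktop_h = panel_h;          /* height unchanged: 768 */

    /* 3072x768 works out to a 4:1 desktop built out of 4:3 screens. */
    printf("surround desktop: %dx%d\n", desktop_w, desktop_h);
    return 0;
}
```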
 
At a place I worked, I ripped a lot of P4s out of their sockets while replacing Dell motherboards during the capacitor plague, but I don't recall any being damaged by it. Eventually I did start giving the heatsink a little twist first so that it wouldn't happen. Of course, that was when they were a lot newer than they are now; maybe the "glue" has set up a lot more these days.
 
Starting with Socket 775 (circa 2004), Intel introduced a CPU retention bracket to stop the CPU from getting ripped out of the socket when the heatsink grease inevitably turns into glue. This fate befell many Pentium 4s and is the chief reason why my P4 will remain in its motherboard under its CPU cooler until the sun explodes.
I've done my fair share of work with P4s (and AM3, AM2, etc.), have a drawer full of CPUs to prove it, and have never had that happen. Always remove the CPU and heatsink together, then soak the interface layer in a mild solvent like WD-40 and wait. You mean to tell me that there is a sufficient density of ijits that a production design change was necessary?
 
At a place I worked, I ripped a lot of P4s out of their sockets while replacing Dell motherboards during the capacitor plague, but I don't recall any being damaged by it. Eventually I did start giving the heatsink a little twist first so that it wouldn't happen. Of course, that was when they were a lot newer than they are now; maybe the "glue" has set up a lot more these days.

P4s tended to survive OK; mine made it through 2 such extractions. But when the pins got smaller and closer together, it's game over. eBay is full of "for parts" Threadrippers with bent pins.
 
The issue behind your Threadripper mishap is now 20 years old, and sockets have come a long way. Lots of bad luck seems to have gone your way on that.

I'm sure there must have been a following of some sort for the Parhelia. The resolution that you pointed out, 3072x768, was probably out of reach for most of us at about $400, but it did have great bandwidth for its time. The ATI Radeon 9700 Pro came along shortly after the Parhelia's release and seemed to take over in the $399 range. I believe Battlefield put that card over the top. I may have splurged and jumped in there myself with a 9800, and then did a major build with a Gigabyte 990 and a 1055T with a pair of 7970s in CrossFire. Mediocre performance at best, but I lived with it for a long time, and then the Z170 and i7-6700K with the 1080 came along years later and changed all of that. The Z170 ushered in a new era in chipset design and performance, and the 1080 took all of the guesswork out of high-end gaming. All of that within the space of 11 or 12 years. Loads of other builds for friends and family, and more spare parts than I know what to do with.

My XP machine has been dusted off and new games are now being loaded in. Found a new 24" Acer monitor on sale at Micro Center for $89 and it fits in nicely. Your posts are interesting.
 

I should note that the Parhelia was not targeted at gamers; it was first and foremost a workstation graphics card aimed at the same people buying Nvidia Quadros. I believe it was the first to market to feature dual DVI ports, and it could run 3 monitors on a single graphics card at a time when no other manufacturer's cards could do that (Matrox also had 4-port VGA cards at the time, but those were aimed at yet another market segment you've never heard of).

Now here's why Matrox tried to get a piece of the gaming market: the way their drivers worked, Windows actually "saw" the 3 monitors as 1 monitor. I know that doesn't immediately sound that interesting, but it means a lot of games could be played in surround mode out of the box. So Matrox opted to lean into that and convinced some developers to actually build support for this feature into their games.
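
To make that concrete, here's a minimal Win32 sketch I put together (my own illustration, not Matrox driver code, and it assumes surround is already enabled in the driver). The point is that the standard display query just reports one big monitor:

```c
#include <windows.h>
#include <stdio.h>

int main(void) {
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Ask Windows for the mode currently in use on the primary display.
       Games of the era did essentially this, directly or via DirectX. */
    if (EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
        /* With Parhelia surround active, this reports a single
           3072x768 mode; the game never learns there are 3 panels. */
        printf("primary display: %lux%lu\n",
               (unsigned long)dm.dmPelsWidth,
               (unsigned long)dm.dmPelsHeight);
    }
    return 0;
}
```

Any game that simply asked for the current desktop mode like this would happily render across all three panels without knowing it.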

This also happened at a time when 4:3 screens were still the standard and had become cheap enough that lots of people already had 2 of them on their desk. It wasn't that big of a stretch to say "hey, buy a third monitor and you can play surround games!" It certainly brought me in.

Of course, then along came 16:9/16:10 monitors and it all sort of fell apart.

But this brings us back to why I suggest the Parhelia for people looking for a card to put in a PCI-X slot. It's not the "best" on paper; the best is some generic PCI-X version of a GeForce or ATI card that has fast specs but still gets its pants beaten off by better-supported AGP cards. You get a Parhelia for that build because it lets you do something unique.

...Of course, you can also get the same experience with the later-released and much more readily available TripleHead2Go by Matrox, but come on, live a little!
 
Much like with multi-CPU systems, I cannot think of many games that recognized more than one monitor being installed without patches or crazy console commands.
 

Exactly. This is what makes the Parhelia and TripleHead2Go such unique devices. The game only sees 1 monitor, just at a crazy resolution.
 
While on the subject of Matrox: I hated the CGA on my 1000SX. I didn't want to jump into VGA in 1987/88, as the price of a VGA card and monitor easily eclipsed the total investment in my meager system. An opportunity arose and I grabbed a Matrox EGA card for substantially below cost, as I was dealing with suppliers through my fed job. So I got the card and had no monitor for a while. Eventually I spotted a CDW 12" EGA monitor through a flyer at work. I think I gave about $350 with a discount, and it was shipped to the back dock where I worked. That EGA setup changed everything; i.e. 640x350 with beaucoup colors. The amazing thing with the Matrox EGA card was that it came with a software package that let you actually change system fonts and styles. I have since misplaced/lost that disk but still have the card. Matrox was on the cutting edge back then.
 
Matrox is always on the cutting edge. The problem is they tend to be on the cutting edge of the part of the knife that's not supposed to be sharp.
 
Matrox's market is multi-monitor configurations. They have almost always been the king of 2D and video-accelerated cards for kiosks, airport displays, and large control operations.
 