Scali
Veteran Member
So you've now moved the goalposts from CPU performance to GPU performance?
What are you talking about? I haven't moved anything; it's the overall that counts.
You need to get your head out of the benchmarks and into a real machine, and then you might have an argument based on your own practical experience. You are not going to convince anyone here that they should scrap their AMD gaming rig because you can quote a few obscure figures off some Intel-leaning forum - not me, anyway.
I'm quite sure this thread was about CPU cores, not about GPUs.
Except that there is no 'the overall'.
You are using some AMD slides that cherry-picked games that weren't CPU-limited, so that AMD could show off its faster integrated GPU compared to Intel's.
There are also games where CPU performance matters more, and there the Intel chip delivers better framerates despite its GPU being slower than the AMD one on paper.
Firstly, I have not suggested that people scrap their AMD machines. The fact that you interpret my posts that way says more about you than about me.
Secondly, I suggest you drop the attitude. You don't know who you're talking to.
I have been programming graphics for more than 20 years, and I run my own company, which develops realtime video processing software and does video warping/mapping, dome projections and whatnot.
I am intimately familiar with developing graphics engines using the latest Direct3D or OpenGL on multicore/multi-GPU systems.
In fact, my company is participating in the DirectX 12 early access program.
So who are you to tell me about "practical experiences"? I probably know more about this topic than anyone else on this forum.
Oh, and I'll let you in on a little secret: the PC we use for the dome projection is actually an AMD Phenom X4 system which had a set of AMD Radeon GPUs until a recent upgrade to some GeForce cards.
You are a fine example of why I dislike AMD cheerleaders so much.
For most folks on this forum, it's a hobby, mostly for fun and games, so why so snobby?
BTW, those stats that you trotted out above show some fairly dated GPUs for the Batman game, which was released in late 2011.
Indeed, AMD used to be THE chip to have.
I still wouldn't use an Intel integrated GPU for anything graphics-intensive.
At one small moment in time, yes.
This forum goes a bit further back than most.
Prior to the K7 architecture, AMD never threatened Intel's fastest CPUs, and was merely price-fighting the budget models, which in those days generally meant last-gen designs. E.g., when AMD had its first successes with the 386DX-40, Intel was already offering a 486DX-50.
AMD hit the sweet spot for users on a budget (it was very popular among students, for example).
Heh, and while Intel was busy with a single-chip 8-bit slowpoke, AMD was powering some of the largest minicomputers of the day with its cutting-edge Am2900-series bit-slice chipsets... computers built with them walked all over the 8080 competition.
I always love the 'who's got the bigger machine' epeen debates, when both epeens will be shriveled and flaccid in 6 months. The correct answer is whatever is on sale, and do it again in a year.
Yes, and IBM only made it worse by wanting to stick with the 8080-family 825x support chips for their PC, and therefore choosing the 8088 over the 8086.
The 8086 was actually a half-decent 16-bit CPU... but putting it on an 8-bit bus pretty much cut the performance in half. The thing struggles just to get its instructions read from memory. All that while the competition was already starting to adopt the Motorola 68000.
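To put some rough numbers on it (back-of-envelope, assuming the 4-clock minimum bus cycle both chips use and zero wait states): an 8086 at 5 MHz can fetch 5/4 x 2 bytes = about 2.5 MB/s over its 16-bit bus, while the 8088 at the PC's 4.77 MHz manages 4.77/4 x 1 byte = about 1.2 MB/s - less than half the fetch bandwidth. And IIRC the prefetch queue also shrank from 6 bytes on the 8086 to 4 on the 8088, so it has even less slack to hide those slow fetches.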
Z80 machines were still being purchased as personal computers for quite some time after the 5150 came out.
Yes, and IBM only made it worse by wanting to stick with the 8080-family 825x support chips for their PC, and therefore choosing the 8088 over the 8086.
Yet for some bizarre reason they fitted the Displaywriter "Word Processor" with the 8086. (While at the same time sticking the System/23 Datamaster with an 8085 and bank-switched memory.) One starts to wonder if, circa 1980, IBM engineers were simply told to play "spin the bottle" when choosing CPUs for small computer projects.