Lame products & lazy engineering

Admin

Administrator
Moderator
Messages
3,905
#1
One big issue I have noticed with technology in general is that companies are not really pushing for excellence; instead they just try to be a bit better than the competition for most people, and that's about it. There is no real push for greatness anymore.

Often new products are stripped of features. One particularly egregious example of this is Apple and Samsung removing headphone jacks from their phones.

Sometimes features or performance are intentionally withheld from a cheaper product to push people toward a more expensive variant instead.
 

Admin

Administrator
Moderator
Messages
3,905
#2
Nvidia GPUs are made not to be used fully
If you actually fully load an Nvidia GPU (for example via FurMark) it will throttle really badly. The big issue here is that the voltage of the card is set not to drop below 0.712 V (for the TUF 3090 OC), which is really bad when the power of the card needs to drop further to remain below the power limit. You end up with lowered frequency but the same voltage, which is very far from ideal.


I was able to replicate the issue on the TUF 3090 OC using the "extreme burn-in" option. With a 375 W power limit it ended up very far away from the 1600 MHz I had overclocked it to at 0.712 V, dropping down to 1100 MHz, which is well below the stock base clock.

(Attachment: Extended.PNG)

Some people have claimed that 0.712 V is some minimum for the architecture, but that is extremely unlikely: at that voltage I was able to increase the frequency by 300 MHz (a 20% overclock), showing that the voltage applied at stock is a lot higher than needed. Most likely this issue could have been resolved simply by allowing a lower minimum voltage (likely specified in the VBIOS). Unfortunately, even when you extend the voltage curve in MSI Afterburner (picture above) the card will not actually follow it.
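To illustrate how costly a clamped minimum voltage is under a power limit, here is a rough sketch using the standard dynamic-power approximation P ≈ k·V²·f. The 0.65 V alternative floor is a hypothetical value for comparison, not something the card actually supports; the constant k is just calibrated from the throttle point observed above.

```python
# Sketch: why a voltage floor wastes frequency under a power limit.
# Dynamic power model: P ~ k * V^2 * f. k is calibrated so that the
# observed throttle point (0.712 V at 1100 MHz = 375 W) fits the model.
# The 0.65 V floor below is a hypothetical "what if" value.

def max_freq_mhz(power_w: float, voltage: float, k: float) -> float:
    """Highest frequency sustainable within `power_w` at `voltage`."""
    return power_w / (k * voltage ** 2)

POWER_LIMIT_W = 375.0
# Calibrate k from the observed throttle point.
k = POWER_LIMIT_W / (0.712 ** 2 * 1100.0)

f_at_floor = max_freq_mhz(POWER_LIMIT_W, 0.712, k)  # ~1100 MHz by construction
f_at_lower = max_freq_mhz(POWER_LIMIT_W, 0.65, k)   # ~1320 MHz, if stable there

print(f"0.712 V floor: {f_at_floor:.0f} MHz")
print(f"0.650 V floor: {f_at_lower:.0f} MHz")
```

Under this toy model, merely allowing the voltage to drop to 0.65 V would buy roughly 20% more frequency inside the same 375 W budget, which is the whole point of a proper voltage/frequency curve.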

While this is a real issue, it does not mean that Nvidia GPUs are lame; the root cause seems to be that they had to go with the Samsung 8 nm process, which resulted in power draw a lot higher than ideal even at low voltages (such as 0.718 V).

You can still prevent down-clocking simply by increasing the power budget a lot (if the VRM, PSU and cooler can handle it), but that might require modding the BIOS, which will void the warranty (depending on what model you got). You would need over 550 W of continuous power to support a fully loaded 3090 Ti at 0.712 V and 1600 MHz.
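As a back-of-the-envelope sanity check on that figure (a sketch, assuming dynamic power scales roughly linearly with frequency at a fixed voltage, and using the 375 W / 1100 MHz throttle point from the earlier test):

```python
# Rough check of the >550 W claim: at a constant 0.712 V, dynamic power
# scales ~linearly with frequency, so scale the observed throttle point
# (375 W at 1100 MHz) up to the 1600 MHz overclock target.
observed_power_w = 375.0
observed_freq_mhz = 1100.0
target_freq_mhz = 1600.0

required_power_w = observed_power_w * (target_freq_mhz / observed_freq_mhz)
print(f"~{required_power_w:.0f} W needed")  # ~545 W
```

That lands around 545 W before accounting for static leakage and VRM losses, which is consistent with needing "over 550 W" at the wall of the power limit.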

This also means that if you want to reduce the power consumption of your card, the better approach is simply to reduce the max voltage and accept that your power consumption will not be stable. That might also help reduce transient loads, since the maximum height of the power spikes depends on the max voltage; it might still be worse if you are PSU- or cooling-constrained, though. It also means the ability to increase fps/W is rather limited.
 

Admin

Administrator
Moderator
Messages
3,905
#3
How Intel ruined emulation performance
Since Alder Lake, Intel has used a hybrid architecture mixing strong P-cores with weak E-cores. The concept of mixing strong and weak cores is not too bad by itself, but the issue is that Intel's CPUs simply do not offer enough performance cores, due to Intel abandoning high-end desktop. Another big issue is that the E-cores don't have AVX-512, and instead of making sure only the P-cores would run AVX-512 instructions, Intel just disabled it from the factory and later started physically fusing it off for no good reason. This is disastrous for emulation performance.

wccftech.com/amd-zen-4-avx-512-major-benefits-in-emulators-such-as-yuzu-citra-xenia-vita3k/

It's unclear how much of this is lazy engineering and how much is market segmentation; either way, they are doing a good job losing to AMD. It's probably too late for Intel to re-enable AVX-512 properly on Raptor Lake by now (even though it's badly needed).
Experts in the area have pointed out that technically the chip could be designed to catch the error and hand off the thread to the right core, but Intel hasn’t done this here as it adds complexity. By disabling AVX-512 in Alder Lake, it means that both the P-cores and the E-cores have a unified common instruction set, and they can both run all software supported on either.
www.anandtech.com/show/17047/the-intel-12th-gen-core-i912900k-review-hybrid-performance-brings-hybrid-complexity/2
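A crude user-space approximation of the trap-and-migrate idea from that quote is to detect AVX-512 yourself and pin AVX-512-heavy work to the P-cores. This is a Linux-only sketch; the P-core CPU IDs are an assumption (on Alder Lake the P-core hyperthreads typically enumerate first) and should be verified against your own topology with lscpu before use:

```python
# Sketch (Linux-only): detect the AVX-512F feature flag and pin work to
# an assumed set of P-core logical CPUs. The core IDs are hypothetical;
# verify with `lscpu --extended` on your own machine.
import os

def cpu_has_avx512f() -> bool:
    """Return True if /proc/cpuinfo lists the avx512f feature flag."""
    try:
        with open("/proc/cpuinfo") as f:
            return any("avx512f" in line for line in f if line.startswith("flags"))
    except OSError:
        return False  # not Linux, or /proc unavailable

# Assumption: logical CPUs 0-15 are the 8 P-cores with hyperthreading.
ASSUMED_P_CORE_CPUS = set(range(16))

def pin_to_p_cores(pid: int = 0) -> None:
    """Restrict `pid` (0 = current process) to the assumed P-core CPUs."""
    os.sched_setaffinity(pid, ASSUMED_P_CORE_CPUS)

if __name__ == "__main__":
    print("AVX-512F available:", cpu_has_avx512f())
```

Of course this only controls scheduling; it does nothing on chips where AVX-512 is fused off, which is exactly the problem.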
 

Admin

Administrator
Moderator
Messages
3,905
#4
AMD chiplets
For unclear reasons AMD likes to shove non-monolithic designs down people's throats. The only potential advantage of not having a monolithic design is slightly lower manufacturing cost.

While the defect rate per die increases with die size, that does not mean you have to throw away a lot of dies if you go monolithic: you can simply disable defective parts of the die and sell it as a cheaper model (such as a 3080 Ti instead of a 3090 Ti). Nvidia in particular has been very effective at making good use of defective dies rather than just throwing them away.
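The salvage-binning argument can be sketched with a simple Poisson defect model (the die area and defect density below are illustrative assumptions, not figures for any real chip or process):

```python
# Sketch of salvage binning with a Poisson defect model: the probability
# of exactly k random defects on a die with expected defect count lam is
# exp(-lam) * lam**k / k!. Area and defect density are assumed values.
from math import exp, factorial

D0_PER_CM2 = 0.1    # assumed defect density (defects per cm^2)
DIE_AREA_CM2 = 6.0  # assumed 600 mm^2 monolithic die

def poisson_pmf(k: int, lam: float) -> float:
    """Probability of exactly k random defects on one die."""
    return exp(-lam) * lam ** k / factorial(k)

lam = D0_PER_CM2 * DIE_AREA_CM2  # expected defects per die

perfect = poisson_pmf(0, lam)                           # fully intact dies
salvageable = sum(poisson_pmf(k, lam) for k in (0, 1))  # <= 1 defect

print(f"fully intact: {perfect:.0%}")      # ~55% of dies
print(f"<=1 defect:   {salvageable:.0%}")  # ~88% usable if one block can be disabled
```

Under these assumed numbers only about 55% of big monolithic dies come out perfect, but tolerating a single defective block (and selling that die as a cut-down SKU) recovers the usable fraction to roughly 88%, which is the mechanism described above.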

One big issue with chiplets instead of a monolithic die is added latency, but there can also be significant added power consumption due to the interconnects required. Almost all Zen 2 CPUs suffered a significant performance loss (around 6%) due to the L3 cache not being unified.

The notion that smaller chips would be cheaper relative to their size isn't reflected in GPU prices; we actually see the opposite, with GA102 chips giving the most active transistors per dollar. The worst-value GPU is actually the 3070 Ti, based on the smaller GA104.

Note that it should be possible to deliver great performance with multiple dies working together, but the high-performance packaging options for that are still too expensive, so it's generally more cost-effective to use one monolithic die if the goal is maximum performance.
 

Admin

Administrator
Moderator
Messages
3,905
#5
Xbox consoles
Microsoft has a proven track record of screwing up their hardware designs, with the only real exceptions being the original Xbox and the Xbox One X.

https://www.youtube.com/watch?v=YzIAYszH_X4

The original Xbox One came with a very weak GPU and was very clearly a much worse option than the PS4. The Xbox fanboys used the games promised for the system to build hype, but all of these later ended up on PC (where you got multiple times better performance).


They sort of learned from their mistake with the Xbox Series X, but it's still not great hardware design: they ended up using a much larger chip while not actually getting any clear overall performance advantage over the PS5. They also don't have proper Gen4 NVMe storage and force their customers to buy a proprietary expansion card for more than double the price; for that premium you of course also get more than double the time it takes for the card to read and write data.

Their controller lacks important features such as gyro aiming and, to a lesser extent, haptic feedback.
 

Mr.Andrews

Well-known member
Messages
94
#6
In the smartphone field, companies run psychology studies. On YouTube you see garbage-tier psychologists, people like Jordan Peterson who embarrass themselves and their field with their endless quotations of wisdom literature that they interpret like priests. But psychology was rescued from fiction-narrative styles, and in the corporate world it uses the scientific method. Companies like Scamsung and Apple want to know how to sell better. Apple ran studies for a long time (not the bullshit fictionalism of Jordan Peterson, real studies that need real money and real adults), and they discovered that if you remove features then consumers feel like they are buying "premium" goods. Apple in particular was a leader in this type of innovation; they sold iPhones with only in-ear headphones and a power adapter, nothing else. Previously, companies tried to include dozens of accessories, thinking that would increase their product's value.

In the field of graphics cards, on the other hand... I don't know. There are companies that basically do what they want, since what they sell is scarce.
 

Mr.Andrews

Well-known member
Messages
94
#7
Xbox consoles
Microsoft has a proven track record of screwing up their hardware designs, with the only real exceptions being the original Xbox and the Xbox One X.

https://www.youtube.com/watch?v=YzIAYszH_X4

The original Xbox One came with a very weak GPU and was very clearly a much worse option than the PS4. The Xbox fanboys used the games promised for the system to build hype, but all of these later ended up on PC (where you got multiple times better performance).


They sort of learned from their mistake with the Xbox Series X, but it's still not great hardware design: they ended up using a much larger chip while not actually getting any clear overall performance advantage over the PS5. They also don't have proper Gen4 NVMe storage and force their customers to buy a proprietary expansion card for more than double the price; for that premium you of course also get more than double the time it takes for the card to read and write data.

Their controller lacks important features such as gyro aiming and, to a lesser extent, haptic feedback.
Seriously?

I stopped at reason 10. How come my old shitty PC from 10 years ago already had an SSD while that crap console had a hard disk? Damn, that stinks. I hope the new Xbox isn't so shit.
 

Admin

Administrator
Moderator
Messages
3,905
#8
They discovered that if you remove features then consumers feel like they are buying "premium" goods. Apple in particular was a leader in this type of innovation; they sold iPhones with only in-ear headphones and a power adapter, nothing else. Previously, companies tried to include dozens of accessories, thinking that would increase their product's value.
That does explain the recent developments in the smartphone market. They axe everything from the removable battery to the headphone jack because consumers confuse lack of features with actual simplicity and quality.

The best way to deal with the smartphone nonsense is to just buy old cheap models from eBay or similar. I bought an LG V20 a while back; it's OK but a bit flawed, and works as a secondary phone.
 

Admin

Administrator
Moderator
Messages
3,905
#9
Most Nvidia GPUs don't have enough VRAM
It's pathetic how most members of the "PC master race" have less VRAM than what you get with a $500 console; it really illustrates how far PC gaming has fallen from grace.

The 3090 and 3090 Ti were the only great 30-series GPUs.

Of course you are still a peasant if you have 16 GiB of VRAM or less; then you are no better than the PS5/XSX owners. Sure, you can still use more VRAM for graphics (since it isn't shared), but that's a rather lame cope when the alternative is buying a nice 4090 (or a used 3090/3090 Ti).

https://www.tomshardware.com/news/jedi-survivor-prelaunch-vram-rtx-4090

It's a shame the 4070 Ti only offers 12 GiB of VRAM; it ruins what would otherwise have been a great card for the price.

You want to use better graphics settings and get higher FPS than the console peasants, so you very much want at least 20 GiB of VRAM these days. 16 GiB is OK for a budget card, such as a 4080 at $1,000 (not the $1,200 Nvidia tried charging for it).
 

Admin

Administrator
Moderator
Messages
3,905
#10
Speakers
One common dogma in speaker design is that it would be enough to cover the range from 20 Hz to 20,000 Hz. There are however two big problems with that notion.

0. Two studies have found that humans can detect frequencies above 20 kHz. These studies used a separate tweeter for the ultrasonics to eliminate the possibility of people merely hearing intermodulation distortion from the speakers/amplifiers.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4005747/

https://journals.physiology.org/doi/full/10.1152/jn.2000.83.6.3548

This is in line with my own blind testing, where I did find that the ultrasonic content makes a statistically significant difference, and I am pretty sure it wasn't merely intermodulation distortion I heard.

1. You can still perceive frequencies below 20 Hz; there isn't any hard lower limit here, since faithful reproduction of a very low frequency would feel like wind, which is something humans can perceive. Reproducing the wind requires fan subwoofers, while the pressure itself can be reproduced with sealed subwoofers in a sealed room.

Most commercial speakers use very cheap drivers and instead rely on marketing and design to con people into buying poorly performing speakers. Many reviewers don't even measure performance, since then the products would look bad, which is not what they want.
 

Admin

Administrator
Moderator
Messages
3,905
#11
Intel CPUs are failing now
Some people have blamed motherboard manufacturers for not following Intel's recommended power/current limits, but that's not the real issue at hand. Sure, motherboard manufacturers are all bad without exception, but here the fault lies 100% with Intel.

You would expect a CPU like the i9-14900K to offer great silicon quality, since it's the second-highest bin (only after the 14900KS), but unfortunately that's not always the case.

I had one of my cores go bad even though I power-limited my 13900KF to just 200 W and was thermally limited to significantly less than that with my AS500 Plus (it has gotten worse over time). Having a power limit within the recommendation is not going to stop your Intel CPU from going bad; at best it will simply delay the failure until you no longer have warranty.

So the rational response as a consumer is to push your Intel CPU as hard as your cooler allows and hope it breaks before the warranty is over, so you can get a replacement.

If you no longer have warranty, you will simply need to downclock at least one core or raise the voltage.

 

Admin

Administrator
Moderator
Messages
3,905
#12
The Raptor Lake situation keeps getting worse
Now we are seeing servers fail due to Intel's unsafe default voltages slowly killing the chips.


https://www.reddit.com/r/pcmasterrace/comments/1ed3da8/the_13th_and_14th_gen_news_just_keeps_getting/

vintologi24 wrote:

Raptor Lake at launch was cheaper than AMD's Zen 4 offering for arguably better overall performance.

And people naturally assume that the default settings are going to be perfectly safe.

I actually disabled TVB pretty early on since I was worried about the high voltage it pushed, but my CPU degraded anyway (one of the cores went bad; I had to limit it to 5 GHz).

angrycoffeeuser wrote:

Instability issues in the form of BSODs, 'out of video memory issues', random freezes, game crashes and so on with 13th and 14th generation Intel CPUs, mostly the high end 13900k(f/s) and 14900k(f/s), but not limited to.

At first, the instability was attributed to a sort of automatic overclock on the motherboards themselves.
Because of this overclock, Intel tried to shift the blame to their partners for not adhering to the "recommended specs" for these chips (they sure didn't mind it when the tech outlets were first benchmarking them though). Board partners responded with a BIOS update providing profiles adhering to these baselines based on what chip you are using, decreasing performance but promising stability at least.

Then it turns out the issue is not limited to just user-grade CPU/motherboard combos. Data centers, gaming companies and so on are also having these issues (based on research by the channel Level1Techs). They are returning HUGE amounts of these CPUs, to the point that support contracts are being rewritten, with Intel support costing something like 10 times more than AMD. Worth noting here is that data centers, for example, use much more conservative settings.

Then it turns out just adhering to the Intel baseline is still not enough. People have to downclock, undervolt, down-everything just to be able to do the most basic tasks on their PC. Intel then released a microcode BIOS update, stating they found an issue/bug with the automatic core boost (for when two cores are boosting; not sure of the exact terminology). They also state that while this is a problem, it is not the root cause.

Now at this point Gamers Nexus comes out swinging, with several different sources naming via oxidation as one of the possible causes, a manufacturing-grade defect that no amount of microcode is going to fix, and doing their own independent inspection of several degraded CPUs.

Intel finally responded a few days later stating they root-caused the issue as "the CPUs incorrectly requesting voltage", then later edited a Reddit post regarding the same response to confirm that there was indeed such an oxidation issue, but that it was root-caused in 2023 and fixed. Now a new BIOS fixing the voltages is expected mid-August. That's the TL;DR version, pretty much. You can check Level1Techs' and Gamers Nexus' latest videos on the subject for more details.
 