Why does it say "not a valid score" for all of us with AMD GPUs? Because the drivers are in beta?
Obviously that's it
yeah, but why do you care... the score is what matters
does anyone have 3DMark Time Spy? Send it over via PM!
what's the problem, just buy it, you cheapskate!
mirkoj -------------- 12132 ------ 4-way SLI Titan X --- stock clocks ---- i7-5960X @ 4.2
no idea what this error means:
Your score was invalid for the following reasons
And I didn't even install that RivaTuner thing to track how much of the GPUs it actually uses... but whatever, it was just for fun anyway
you have more: 12,897
What kind of system is this now? What CPU score, what GPU score? The overall result was always what counted, not this fiddling around.
I'm skipping this test precisely because of your made-up categories, which I consider nonsense, because at the end of the day, what is a GPU without a CPU and vice versa?
If you really want to fiddle, then make one list for the overall score and separate ones for GPU and CPU.
the system is that Time Spy is a plain scam, heavily paid for by Nvidia so that AMD cards come out looking bad again; Time Spy isn't a real DX12 benchmark at all, it's an Nvidia benchmark
here, for anyone interested:
http://www.overclock-and-game.com/news/pc-gaming/50-analyzing-futuremark-time-spy-fiasco
It doesn't support parallel async. It supports single-process pre-emption context switching, and it's operated from the driver level (and that's why it can be turned off on Maxwell cards). This has nothing to do with the low-level DX12 API (closer to the metal) and with parallel async compute, which processes many queues at the same time. It uses 2 queues exactly the way context switching does: processing one queue, then STOPPING it to pick some other task (pre-emption) from the second queue. That's why this is a single process that can't do both in parallel. But this is all NV can do, so the utilization of AMD's async is maybe like 50% in this scenario and has nothing to do with real gaming performance. Creating a DX12 game without hard coding is pointless, so saying "every vendor will do this" (use pre-emption) is mumbo jumbo. I wonder how you'll change your mind in 2018 when Volta has hardware ACEs (async compute units) like AMD has. BTW, it is not even DX12 software from the ground up using one code path: http://www.overclock.net/content/type/61/id/2832197

Quote: This benchmark's async part is a fraud. The critics are justified. This is the typical reason I don't like synthetic benchmarks: they are not applicable to real-world situations. The problem is that it implies the Pascal architecture is waaay better than Maxwell at async, which is not true (it's basically the same thing in games), and that it gets solid gains from async (almost like AMD cards), which is also not true (it can't be seen in any async games).

Basically, Time Spy is the best result you'll ever get for DX12 on Nvidia hardware, which does not correlate with a proper DX12 game result, since Nvidia does not support parallel async.

14:17 - finally the proverbial nitty-gritty: the results from this benchmark application really aren't using async compute. By the way, given id Software's FAQ on Doom, particularly the part about async compute where they said async compute is not enabled on Nvidia cards, I think Nvidia is completely lying about async compute altogether.

There is no way a driver from the operating system can partition a GPU so that one segment does compute and the other does graphics. I think it's complete bullshit, and Nvidia put out that whole thing about Dynamic Load Balancing on Pascal to obfuscate, so no one would realize they don't have async compute support and were doing false advertising, i.e. lawsuit material. Don't any of you find it strange that Nvidia purposely didn't talk about DX12/Vulkan this time around during the launch of the 1080/1070? When they launched the 980/970, they couldn't shut the fuck up about DX12, so why are they so quiet about it now?

Yeah, so basically the benchmark is a complete fraud, as everyone has been saying. Time Spy best shows how 80% of the people out there, the Nvidia users, will see DirectX 12. The simple fact of the matter is that most developers will code DX12 in a way that benefits the vast majority of people; that means they will do the vast majority of their optimization for Nvidia, since they hold the major market share. A developer isn't going to spend an overwhelming amount of time optimizing for the 20% of machines out there at the expense of the 80%. Like it or not, that is the reality of business. MaxROI: Maximum Return On Investment.

Exactly, and that's why it's quite useless as a benchmark. You can check 100% of what one company is capable of and like 50% of the other when talking about async. So what is the point of making any comparison using this DX11/12 software?

Nvidia's pre-emptive async is fine and it works OK, but it's not async compute: shader work has to stop, then compute work starts, then compute stops and shader work continues again. GCN can run them both at the same time, i.e. asynchronously.
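A crude way to picture the scheduling argument from the quoted posts: with pre-emption, graphics and compute share one timeline, so frame time is roughly their sum plus switching overhead; with truly parallel async compute, the two overlap and frame time approaches the longer of the two. This toy model is my own illustration, not anything from Time Spy; every name and number in it is made up.

```python
# Toy model of the two async-compute strategies argued about above.
# All function names and millisecond figures are invented for illustration.

def preemptive(graphics_ms, compute_ms, switch_ms=0.1, slices=10):
    """Single engine with pre-emption: compute slices interrupt graphics
    work, so the two workloads are serialized, and every interruption
    adds context-switch overhead (stop graphics, resume graphics)."""
    switches = 2 * slices
    return graphics_ms + compute_ms + switches * switch_ms

def parallel_async(graphics_ms, compute_ms):
    """Independent hardware queues: graphics and compute genuinely
    overlap, so frame time is bounded by the longer workload."""
    return max(graphics_ms, compute_ms)

# Hypothetical per-frame workloads, in milliseconds.
gfx, cmp = 12.0, 4.0
print(f"pre-emptive: {preemptive(gfx, cmp):.1f} ms")
print(f"parallel:    {parallel_async(gfx, cmp):.1f} ms")
```

In this model, the parallel path wins whenever the compute work fits inside the graphics work, which is exactly the gap the posters claim driver-level pre-emption cannot close.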
as I said, always a step behind the competition... Nvidia will again grab a chunk of the market with its own version of async, aka "enhanced asynchronous compute", and devs will again be stuck at the crossroads of bribery and corruption
See, I was skeptical for a reason, because the way those results came out just didn't make sense. What a bunch of clowns.
If that's true (for now it's still unproven speculation), then how come AMD doesn't issue a statement explaining to the public how things stand with Time Spy and that it doesn't reflect the true DX12 async performance picture of their cards versus Nvidia's?!
So for now this is all at the level of conspiracy theories...
And which picture is the true one?
i5-3570K, Sapphire Toxic R9 270X 2GB OC, score 1758
http://www.3dmark.com/3dm/13485813
Edit: core 1150 MHz, memory 1500 MHz (stock)
A small OC and it's right at the level of a 1070...
http://www.3dmark.com/3dm/13582340
i7-4820K @ 4.6, GTX 980 Ti, core clock 1,743 MHz, mem 3,069 MHz
Score 5961
Well, the core is most certainly not over 1700 MHz, because the result is way too weak for those clocks; it's somewhere between 1400 and 1500 MHz...
The core is 1524 MHz. In Afterburner the memory is below 2000 MHz, yet Time Spy shows 3069 MHz???
Look at the link. The frequencies are impossible. That's exactly why I posted the link to the result. I've seen it report values different from the actual frequencies on a buddy's machine too.
Well, that's the same way Unigine Valley reads nonsense. You can't rely on that; trust proper diagnostic and monitoring tools instead.
nothing can be seen in your picture; upload it to some external host
Is anyone else having the problem where the bench refuses to launch Time Spy, and when it does launch, it only runs the first GPU test and then quits?
I've been trying for half a day now and it just won't work; Firestrike on Ultra runs fine, but this one won't run at all.
Solved. It seems the problem was Samsung Magician.
http://www.3dmark.com/3dm/13633731
So, 10830 graphics score.
Works fine for me.
I have Samsung Magician running in the background and the bench passes fine. It's not that.
Maybe one of the options in its OS optimization was causing him problems.
Where did an i7 Sandy Bridge come from all of a sudden? Is that your computer?