NVIDIA Ampere 30xx - featured topic

posts: 11,274
|
read: 2,167,441
|
moderators: DrNasty, pirat, XXX-Man, Lazarus Long, vincimus
6 years
banned
offline
Re: NVIDIA Ampere 30xx
picajzla0707 says...

you type it into Google and read - link

The 3080 has 8704 CUDA cores vs the 2080 Ti with 4352, so you work out for yourself how much stronger it is even without RT and DLSS

anyway, we'll see very soon

They doubled the CUDA cores per SM, so what? In the best case that's 50% more performance, if we can even say that before we see proper, complete reviews, because CUDA doesn't necessarily scale that linearly. Definitely not ~80%... Content creators will profit the most from these cards, given the huge CUDA count and that I/O feature which, by that logic, will shorten or even eliminate the loading of projects before rendering.
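
A rough back-of-the-envelope sketch of that scaling argument; only the two official CUDA core counts come from the posts above, the efficiency factors are made-up values for illustration:

```python
# Raw core ratio vs. assumed scaling efficiencies (illustrative only).
cores_3080 = 8704        # official RTX 3080 CUDA core count
cores_2080ti = 4352      # official RTX 2080 Ti CUDA core count

raw_ratio = cores_3080 / cores_2080ti            # 2.0x on paper
for efficiency in (1.0, 0.75, 0.5):              # hypothetical scaling efficiencies
    speedup = 1 + (raw_ratio - 1) * efficiency
    print(f"at {efficiency:.0%} scaling efficiency: ~{speedup:.2f}x a 2080 Ti")
```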

16 years
inactive
offline
Re: NVIDIA Ampere 30xx
Vladitor says...

How old are you people? Playing games instead of being men and making something of your lives. Spending thousands of kuna on a piece of plastic while people around the world are starving. A DISGRACE!!

Why did you buy that piece of plastic then, why didn't you feed the hungry?

14 years
suspended
offline
Re: NVIDIA Ampere 30xx
Vladitor says...

How old are you people? Playing games instead of being men and making something of your lives. Spending thousands of kuna on a piece of plastic while people around the world are starving. A DISGRACE!!

 

 

 

17 years
offline
Re: NVIDIA Ampere 30xx
Envy says...
rambox says...

* * *

 

They've only * * *

They've only introduced a better product than AMD, and they've been doing that since when... 2500 BC.

IPC gain equal to zero, you say? As far as I'm concerned it can even be negative, like the price of oil when it went into the negative.

And that's just how it is.. it's on us to play games, and on you to announce something new from AMD one fine sunny day.

Well yeah, realistically Ampere's IPC improvement is zero - you just forgot to mention the word "improvement". Not only is there no IPC gain, I see a certain IPC regression vs the old Turing GPU is now being mentioned as well.

 

https://forum.bug.hr/forum/topic/graficke-kartice/nvidia-ampere-30xx/265501.aspx?page=19&jumpto=6188699&sort=asc&view=flat

 

Nvidia has, marketing-wise, very elegantly inflated the Ampere CUDA core count sky-high. In reality there aren't actually 10,000+ classic CUDA cores on the RTX 3090.

 

I've updated the post below; I now have most of the important information, so the whole situation is a bit clearer.

 

https://forum.bug.hr/forum/topic/graficke-kartice/amd-navi/265345.aspx?page=99&jumpto=6189227&sort=asc&view=flat

 

 

AMD,........is that Epyc or what?
Last edited Thu 3.9.2020 1:17 (rambox).
11 years
offline
NVIDIA Ampere 30xx

A buddy and I are playing through Tom Clancy's The Division together right now, and we ran some benchmarks

 

Ultra details, 1440p resolution for both of us

 

He's running 32 GB of RAM, an i7 9700K at 5.0 GHz and an RTX 2080 Ti

I'm running 16 GB of RAM, a stock R5 3600 and a GTX 1080 Ti

 

 

So there's no way the 3600 would bottleneck a 3080 at 1440p and higher resolutions. I stopped caring about 1080p long ago, and nobody is going to buy a 6-7 thousand kuna card to game at 1080p anyway. And when you play COD or Battlefield multiplayer at 1440p you lower the graphics settings and game at 130+ fps. Look, he gets a whole 10 fps more in The Division, yet he paid 3x more for his CPU than I did for mine, and on top of that his GPU is faster... Ryzen is unbeatable in value for 200 euros spent. And these 500-euro Intel power guzzlers apparently only matter to the guys who fancy themselves pros, set 4:3 resolutions in games and play at 720p to get 800 fps while their K/D is still negative...

My bench
His bench
My PC
4 0 thanks 0
12 years
offline
Re: NVIDIA Ampere 30xx
What do you mean no way - with a 2080 Ti at 1440p, in Far Cry 5 for example, an Intel i7 10700 gets a quarter higher fps than a 3600X at 4.4 GHz. With a 3080 the difference will be even bigger, and there are certainly more games like that, with more and more to come. Personally a difference like that doesn't matter to me and I wouldn't even consider that combination, but whoever gets an RTX 3080 for 1440p probably wants to use such a card to the fullest, and for that an Intel or the new Ryzen 4xxx series will still be the better pairing.
8 years
banned
offline
Re: NVIDIA Ampere 30xx
rambox says...
Envy says...
rambox says...

* * *

 

They've only * * *

They've only introduced a better product than AMD, and they've been doing that since when... 2500 BC.

IPC gain equal to zero, you say? As far as I'm concerned it can even be negative, like the price of oil when it went into the negative.

And that's just how it is.. it's on us to play games, and on you to announce something new from AMD one fine sunny day.

Well yeah, realistically Ampere's IPC improvement is zero - you just forgot to mention the word "improvement". Not only is there no IPC gain, I see a certain IPC regression vs the old Turing GPU is now being mentioned as well.

 

https://forum.bug.hr/forum/topic/graficke-kartice/nvidia-ampere-30xx/265501.aspx?page=19&jumpto=6188699&sort=asc&view=flat

 

Nvidia has, marketing-wise, very elegantly inflated the Ampere CUDA core count sky-high. In reality there aren't actually 10,000+ classic CUDA cores on the RTX 3090.

 

I've updated the post below; I now have most of the important information, so the whole situation is a bit clearer.

 

https://forum.bug.hr/forum/topic/graficke-kartice/amd-navi/265345.aspx?page=99&jumpto=6189227&sort=asc&view=flat

 

 

you two should start some tech podcast, you can't tell which of you is crazier in his own direction

11 years
offline
Re: NVIDIA Ampere 30xx
picajzla0707 says...
What do you mean no way - with a 2080 Ti at 1440p, in Far Cry 5 for example, an Intel i7 10700 gets a quarter higher fps than a 3600X at 4.4 GHz. With a 3080 the difference will be even bigger, and there are certainly more games like that, with more and more to come. Personally a difference like that doesn't matter to me and I wouldn't even consider that combination, but whoever gets an RTX 3080 for 1440p probably wants to use such a card to the fullest, and for that an Intel or the new Ryzen 4xxx series will still be the better pairing.

Well that's just one game; in tons of other new titles the difference is minimal. Of course it will run better with an Intel, that's true, but the question is whether it's enough to justify spending 3500 kn on a new CPU. I think definitely not. The RTX 3000 series interests me because of RTX - knowing myself I'll sink 150+ hours into Cyberpunk if the game turns out to be what I think it will be - and because of DLSS, which significantly improves performance. When you add all that up, I couldn't care less that the card would run somewhat better paired with an Intel...

6 years
banned
offline
NVIDIA Ampere 30xx

Far Cry 5 is an "Intel" game, so the differences in it will always be quite a bit bigger than in "neutral" games.

 
0 0 thanks 0
10 years
offline
NVIDIA Ampere 30xx

Nvidia answered some gamer questions on Reddit

 

 

RTX 30-Series

Why only 10 GB of memory for RTX 3080? How was that determined to be a sufficient number, when it is stagnant from the previous generation?

[Justin Walker] We’re constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games. The goal of 3080 is to give you great performance at up to 4k resolution with all the settings maxed out at the best possible price.

In order to do this, you need a very powerful GPU with high speed memory and enough memory to meet the needs of the games. A few examples - if you look at Shadow of the Tomb Raider, Assassin’s Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4k with Max settings (including any applicable high res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory.

Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance.

When the slide says RTX 3070 is equal or faster than 2080 Ti, are we talking about traditional rasterization or DLSS/RT workloads? Very important if you could clear it up, since no traditional rasterization benchmarks were shown, only RT/DLSS supporting games.

[Justin Walker] We are talking about both. Games that only support traditional rasterization and games that support RTX (RT+DLSS). You can see this in our launch article here

Does Ampere support HDMI 2.1 with the full 48Gbps bandwidth?

[Qi Lin] Yes. The NVIDIA Ampere Architecture supports the highest HDMI 2.1 link rate of 12Gb/s per lane across all 4 lanes, and supports Display Stream Compression (DSC) to be able to power up to 8K, 60Hz in HDR.
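
A quick sanity check on how that per-lane rate adds up to the 48 Gbps asked about in the question:

```python
# HDMI 2.1 aggregate bandwidth: 12 Gb/s per lane across 4 lanes.
lanes = 4
rate_per_lane_gbps = 12
print(lanes * rate_per_lane_gbps, "Gb/s total")   # 48 Gb/s
```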

Could you elaborate a little on this doubling of CUDA cores? How does it affect the general architectures of the GPCs? How much of a challenge is it to keep all those FP32 units fed? What was done to ensure high occupancy?

[Tony Tamasi] One of the key design goals for the Ampere 30-series SM was to achieve twice the throughput for FP32 operations compared to the Turing SM. To accomplish this goal, the Ampere SM includes new datapath designs for FP32 and INT32 operations. One datapath in each partition consists of 16 FP32 CUDA Cores capable of executing 16 FP32 operations per clock. Another datapath consists of both 16 FP32 CUDA Cores and 16 INT32 Cores. As a result of this new design, each Ampere SM partition is capable of executing either 32 FP32 operations per clock, or 16 FP32 and 16 INT32 operations per clock. All four SM partitions combined can execute 128 FP32 operations per clock, which is double the FP32 rate of the Turing SM, or 64 FP32 and 64 INT32 operations per clock.

Doubling the processing speed for FP32 improves performance for a number of common graphics and compute operations and algorithms. Modern shader workloads typically have a mixture of FP32 arithmetic instructions such as FFMA, floating point additions (FADD), or floating point multiplications (FMUL), combined with simpler instructions such as integer adds for addressing and fetching data, floating point compare, or min/max for processing results, etc. Performance gains will vary at the shader and application level depending on the mix of instructions. Ray tracing denoising shaders are good examples that might benefit greatly from doubling FP32 throughput.

Doubling math throughput required doubling the data paths supporting it, which is why the Ampere SM also doubled the shared memory and L1 cache performance for the SM. (128 bytes/clock per Ampere SM versus 64 bytes/clock in Turing). Total L1 bandwidth for GeForce RTX 3080 is 219 GB/sec versus 116 GB/sec for GeForce RTX 2080 Super.
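
To put those per-SM figures in context, here is a minimal throughput sketch; the SM count and boost clock below are assumed RTX 3080-class values, not numbers from the answer:

```python
# Peak FP32 throughput implied by the per-SM rate quoted above.
fp32_ops_per_clock_per_sm = 128   # Ampere SM (Turing: 64), from the answer above
sm_count = 68                     # assumed SM count for an RTX 3080-class GPU
boost_clock_hz = 1.71e9           # assumed boost clock
flops = 2 * fp32_ops_per_clock_per_sm * sm_count * boost_clock_hz  # x2: an FMA counts as mul + add
print(f"~{flops / 1e12:.1f} TFLOPS peak FP32")    # roughly 29.8 TFLOPS under these assumptions
```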

Like prior NVIDIA GPUs, Ampere is composed of Graphics Processing Clusters (GPCs), Texture Processing Clusters (TPCs), Streaming Multiprocessors (SMs), Raster Operators (ROPS), and memory controllers.

The GPC is the dominant high-level hardware block with all of the key graphics processing units residing inside the GPC. Each GPC includes a dedicated Raster Engine, and now also includes two ROP partitions (each partition containing eight ROP units), which is a new feature for NVIDIA Ampere Architecture GA10x GPUs. More details on the NVIDIA Ampere architecture can be found in NVIDIA’s Ampere Architecture White Paper, which will be published in the coming days.

Any idea if the dual airflow design is going to be messed up for inverted cases? More than previous designs? Seems like it would blow it down on the cpu. But the CPU cooler would still blow it out the case. Maybe it’s not so bad.

Second question. 10x quieter than the Titan for the 3090 is more or less quieter than a 2080 Super (Evga ultra fx for example)?

[Qi Lin] The new flow through cooling design will work great as long as chassis fans are configured to bring fresh air to the GPU, and then move the air that flows through the GPU out of the chassis. It does not matter if the chassis is inverted.

The Founders Edition RTX 3090 is quieter than both the Titan RTX and the Founders Edition RTX 2080 Super. We haven’t tested it against specific partner designs, but I think you’ll be impressed with what you hear… or rather, don’t hear. :-)

Will the 30 series cards be supporting 10bit 444 120fps ? Traditionally Nvidia consumer cards have only supported 8bit or 12bit output, and don’t do 10bit. The vast majority of hdr monitors/TVs on the market are 10bit.

[Qi Lin] The 30 series supports 10bit HDR. In fact, HDMI 2.1 can support up to 8K@60Hz with 12bit HDR, and that covers 10bit HDR displays.

What breakthrough in tech let you guys massively jump to the 3xxx line from the 2xxx line? I knew it would be scary, but it's insane to think about how much more efficient and powerful these cards are. Can these cards handle 4k 144hz?

[Justin Walker] There were major breakthroughs in GPU architecture, process technology and memory technology to name just a few. An RTX 3080 is powerful enough to run certain games maxed out at 4k 144fps - Doom Eternal, Forza 4, Wolfenstein Youngblood to name a few. But others - Red Dead Redemption 2, Control, Borderlands 3 for example are closer to 4k 60fps with maxed out settings.

Will customers find a performance degradation on PCIE 3.0?

System performance is impacted by many factors and the impact varies between applications. The impact is typically less than a few percent going from a x16 PCIE 4.0 to x16 PCIE 3.0. CPU selection often has a larger impact on performance. We look forward to new platforms that can fully take advantage of Gen4 capabilities for potential performance increases.

RTX IO

Could we see RTX IO coming to machine learning libraries such as Pytorch? This would be great for performance in real-time applications

[Tony Tamasi] NVIDIA delivered high-speed I/O solutions for a variety of data analytics platforms roughly a year ago with NVIDIA GPU DirectStorage. It provides for high-speed I/O between the GPU and storage, specifically for AI and HPC type applications and workloads. For more information please check out: https://developer.nvidia.com/blog/gpudirect-storage/

Does RTX IO allow use of SSD space as VRAM? Or am I completely misunderstanding?

[Tony Tamasi] RTX IO allows reading data from SSD’s at much higher speed than traditional methods, and allows the data to be stored and read in a compressed format by the GPU, for decompression and use by the GPU. It does not allow the SSD to replace frame buffer memory, but it allows the data from the SSD to get to the GPU, and GPU memory much faster, with much less CPU overhead.

Will there be a certain ssd speed requirement for RTX I/O?

[Tony Tamasi] There is no SSD speed requirement for RTX IO, but obviously, faster SSD’s such as the latest generation of Gen4 NVMe SSD’s will produce better results, meaning faster load times, and the ability for games to stream more data into the world dynamically. Some games may have minimum requirements for SSD performance in the future, but those would be determined by the game developers. RTX IO will accelerate SSD performance regardless of how fast it is, by reducing the CPU load required for I/O, and by enabling GPU-based decompression, allowing game assets to be stored in a compressed format and offloading potentially dozens of CPU cores from doing that work. Compression ratios are typically 2:1, so that would effectively amplify the read performance of any SSD by 2x.
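
A small sketch of the read-amplification point at the end of that answer; the SSD speed is an assumed Gen4 NVMe figure, only the 2:1 ratio comes from the answer:

```python
# Effective asset throughput when the GPU decompresses a 2:1 compressed stream.
ssd_read_gb_per_s = 7.0        # assumed raw sequential read of a fast Gen4 NVMe SSD
compression_ratio = 2.0        # typical ratio quoted in the answer above
effective_gb_per_s = ssd_read_gb_per_s * compression_ratio
print(f"effective throughput after GPU decompression: ~{effective_gb_per_s:.0f} GB/s")
```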

Will the new GPUs and RTX IO work on Windows 7/8.1?

[Tony Tamasi] RTX 30-series GPUs are supported on Windows 7 and Windows 10, RTX IO is supported on Windows 10.

I am excited for the RTX I/O feature but I partially don't get how exactly it works? Let's say I have a NVMe SSD, a 3070 and the latest Nvidia drivers, do I just now have to wait for the windows update with the DirectStorage API to drop at some point next year and then I am done or is there more?

[Tony Tamasi] RTX IO and DirectStorage will require applications to support those features by incorporating the new API’s. Microsoft is targeting a developer preview of DirectStorage for Windows for game developers next year, and NVIDIA RTX gamers will be able to take advantage of RTX IO enhanced games as soon as they become available.

RTX Broadcast

What is the scope of the "Nvidia Broadcast" program? Is it intended to replace current GFE/Shadowplay for local recordings too?

[Gerardo Delgado] NVIDIA Broadcast is a universal plugin app that enhances your microphone, speakers and camera with AI features such as noise reduction, virtual background, and auto frame. You basically select your devices as input, decide what AI effect to apply to them, and then NVIDIA Broadcast exposes virtual devices in your system that you can use with popular livestream, video chat, or video conference apps.

NVIDIA Broadcast does not record or stream video and is not a replacement for GFE/Shadowplay

Jason, Will there be any improvements to the RTX encoder in the Ampere series cards, similar to what we saw for the Turing Release? I did see info on the Broadcast software, but I'm thinking more along the lines of improvements in overall image quality at same bitrate.

[Jason Paul] Hi Carmen813, for RTX 30 Series, we decided to focus improvements on the video decode side of things and added AV1 decode support. On the encode side, RTX 30 Series has the same great encoder as our RTX 20 Series GPU. We have also recently updated our NVIDIA Encoder SDK. In the coming months, livestream applications will be updating to this new version of the SDK, unlocking new performance options for streamers.

I would like to know more about the new NVENC -- were there any upgrades made to this technology in the 30 series? It seems to be the future of streaming, and for many it's the reason to buy nvidia card rather than any other.

[Gerardo Delgado] The GeForce RTX 30 Series leverages the same great hardware encoder as the GeForce RTX 20 Series. We have also recently updated our Video Codec SDK to version 10.0. In the coming months, applications will be updating to this new version of the SDK, unlocking new performance options.

Regarding AV1 decode, is that supported on 3xxx series cards other than the 3090? In fact can this question and dylan522p question on support level be merged into: What are the encode/decode features of Ampere and do these change based on which 3000 series card is bought?

[Gerardo Delgado] All of the GeForce RTX 30 Series GPUs that we announced today have the same encoding and decoding capabilities:

They all feature the 7th Gen NVIDIA Encoder (the one that we released with the RTX 20 Series), which will use our newly released Video Codec SDK 10.0. This new SDK will be integrated in the coming months by the live streaming apps, unlocking new presets with more performance options.

They all have the new 5th Gen NVIDIA Decoder, which enables AV1 hardware accelerated decode on GPU. AV1 consumes 50% less bandwidth and unlocks up to 8K HDR video playback without a big performance hit on your CPU.

NVIDIA Machinima

How active is the developer support for Machinima? As it's cloud based, I'm assuming that the developers/publishers have to be involved for it to really take off (at least indirectly through modding community support or directly with asset access). Alongside this, what is the benefit of having it cloud based, short of purely desktop?

[Richard Kerris] We are actively working with game developers on support for Omniverse Machinima and will have more details to share along with public beta in October.

Omniverse Machinima can be run locally on a GeForce RTX desktop PC or in the cloud. The benefit of running Omniverse from the cloud is easier real-time collaboration across users.

NVIDIA Studio

Content creator here. Will these cards be compatible with GPU renderers like Octane/Arnold/Redshift/etc from launch? I know with previous generations, a new CUDA version coincided with the launch and made the cards inert for rendering until the 3rd-party software patched it in, but I'm wondering if I will be able to use these on launch day using existing CUDA software.

[Stanley Tack] A CUDA update will be needed for some renderers. We have been working closely with the major creative apps on these updates and expect the majority (hopefully all!) to be ready on the day these cards hit the shelves.

NVIDIA Reflex

Will Nvidia Reflex be a piece of hardware in new monitors or will it be a software that other nvidia gpus can use?

[Seth Schneider] NVIDIA Reflex is both. The NVIDIA Reflex Latency Analyzer is a revolutionary new addition to the G-SYNC Processor that enables end to end system latency measurement. Additionally, NVIDIA Reflex SDK is integrated into games and enables a Low Latency mode that can be used by GeForce GTX 900 GPUs and up to reduce system latency. Each of these features can be used independently.

My PC
5 0 thanks 4
13 years
offline
NVIDIA Ampere 30xx

I believe the R5 3600 won't be that much of a bottleneck; I game at 1440p 144 Hz, but the only thing that matters to me is fps staying above 60 given FreeSync. Worst case I'll get a Ryzen 4xxx if there's a need, without changing the motherboard. The RAM I've already sorted out :)

~beware the beast, but enjoy the feast he offers~
My PC
0 0 thanks 0
12 years
offline
Re: NVIDIA Ampere 30xx
Cansee says...

* * *

Fantastic, full HDMI 2.1, 10-bit HDR support, FINALLY.

16 years
online
NVIDIA Ampere 30xx

I don't know about you, but I'm going for a 3080 the moment it arrives :D preorder, no question!

It is always darkest just before the dawn.
My PC
3 0 thanks 0
12 years
offline
Re: NVIDIA Ampere 30xx
mrsmith says...

I don't know about you, but I'm going for a 3080 the moment it arrives :D preorder, no question!

Which model are you going to get?

 

There's already talk there will be an Ampere shortage throughout all of 2021; I have to grab one as soon as I spot a good model that fits lengthwise in my case.

17 years
offline
Re: NVIDIA Ampere 30xx
BeLLicus says...

A buddy and I are playing through Tom Clancy's The Division together right now, and we ran some benchmarks

 

Ultra details, 1440p resolution for both of us

 

He's running 32 GB of RAM, an i7 9700K at 5.0 GHz and an RTX 2080 Ti

I'm running 16 GB of RAM, a stock R5 3600 and a GTX 1080 Ti

 

 

So there's no way the 3600 would bottleneck a 3080 at 1440p and higher resolutions. I stopped caring about 1080p long ago, and nobody is going to buy a 6-7 thousand kuna card to game at 1080p anyway. And when you play COD or Battlefield multiplayer at 1440p you lower the graphics settings and game at 130+ fps. Look, he gets a whole 10 fps more in The Division, yet he paid 3x more for his CPU than I did for mine, and on top of that his GPU is faster... Ryzen is unbeatable in value for 200 euros spent. And these 500-euro Intel power guzzlers apparently only matter to the guys who fancy themselves pros, set 4:3 resolutions in games and play at 720p to get 800 fps while their K/D is still negative...

is that the first one or the second? If it's the second one I can do a comparison with an i7 and a 1080 Ti so you can see where you stand

Bnet: mNik987#2148
16 years
online
Re: NVIDIA Ampere 30xx
MEGATAMA says...
mrsmith says...

I don't know about you, but I'm going for a 3080 the moment it arrives :D preorder, no question!

Which model are you going to get?

 

There's already talk there will be an Ampere shortage throughout all of 2021; I have to grab one as soon as I spot a good model that fits lengthwise in my case.

There will probably be a shortage, yeah; I'd like to snag an MSI or Asus model.. but we'll see what turns up on CU or MF.de

 

I called the local shops yesterday but they don't have any info yet - they just say the cards arrive at their stores in October.

(and the margin will probably be sky-high.....)

It is always darkest just before the dawn.
Last edited Thu 3.9.2020 9:58 (mrsmith).
11 years
offline
Re: NVIDIA Ampere 30xx
toxic says...
BeLLicus says...

A buddy and I are playing through Tom Clancy's The Division together right now, and we ran some benchmarks

 

Ultra details, 1440p resolution for both of us

 

He's running 32 GB of RAM, an i7 9700K at 5.0 GHz and an RTX 2080 Ti

I'm running 16 GB of RAM, a stock R5 3600 and a GTX 1080 Ti

 

 

So there's no way the 3600 would bottleneck a 3080 at 1440p and higher resolutions. I stopped caring about 1080p long ago, and nobody is going to buy a 6-7 thousand kuna card to game at 1080p anyway. And when you play COD or Battlefield multiplayer at 1440p you lower the graphics settings and game at 130+ fps. Look, he gets a whole 10 fps more in The Division, yet he paid 3x more for his CPU than I did for mine, and on top of that his GPU is faster... Ryzen is unbeatable in value for 200 euros spent. And these 500-euro Intel power guzzlers apparently only matter to the guys who fancy themselves pros, set 4:3 resolutions in games and play at 720p to get 800 fps while their K/D is still negative...

is that the first one or the second? If it's the second one I can do a comparison with an i7 and a 1080 Ti so you can see where you stand

It's the first one; it's free on Uplay so I figured I'd play it, never have before.

12 years
offline
Re: NVIDIA Ampere 30xx
mrsmith says...
MEGATAMA says...
mrsmith says...

I don't know about you, but I'm going for a 3080 the moment it arrives :D preorder, no question!

Which model are you going to get?

 

There's already talk there will be an Ampere shortage throughout all of 2021; I have to grab one as soon as I spot a good model that fits lengthwise in my case.

There will probably be a shortage, yeah; I'd like to snag an MSI or Asus model.. but we'll see what turns up on CU or MF.de

 

I called the local shops yesterday but they don't have any info yet - they just say the cards arrive at their stores in October.

(and the margin will probably be sky-high.....)

I'd gladly go MSI but those are a few millimetres too long, at least the 3-fan ones I want; maybe I'll get the Asus ROG 3080 Strix if it isn't too power-hungry for my PSU, since it says the ROG cards are rated at 400 W

7 years
offline
NVIDIA Ampere 30xx

I don't know why you imagine our shops call each other up and agree: come on, let's slap a 50% margin on the new cards and make a killing

 

they can't stand each other, they're direct competitors, and prices simply don't work the way you think they do

 

you compare prices with shops in Germany, where VAT is currently 16% while ours is 25%, and then there are the banks, which in Germany take 1-2% on card payments while ours take 4-6%

 

and even if Croatian shops got the same purchase price as the German ones, which is never the case because Germany is a huge market and its shops buy larger quantities and therefore get better purchase prices, all of the differences above add up to a big difference in the retail price here versus Germany

 

it's not our greedy retailers jacking up the margins; graphics cards are sold at around a 15% margin, and once you subtract the bank fees from that you get a clear picture of how much is left for our retailers, because there's no business logic in pushing prices up the way you imagine - the competition would then be cheaper and nobody would come to your shop to buy

 

with all the crap we have in Croatia, I'd say our prices are overall more than decent

 

also, a webshop and a brick-and-mortar shop will never have the same prices, because webshops - the ones without physical stores - don't have anywhere near the same operating costs as the others, so they can afford to work on a 5-6% margin

Last edited Thu 3.9.2020 10:09 (3dfx).
My PC
6 0 thanks 1
16 years
online
Re: NVIDIA Ampere 30xx
MEGATAMA says...

I'd gladly go MSI but those are a few millimetres too long, at least the 3-fan ones I want; maybe I'll get the Asus ROG 3080 Strix if it isn't too power-hungry for my PSU, since it says the ROG cards are rated at 400 W

I think cards of any length fit in my case; honestly I didn't worry about it at all, since the case is literally clear along its entire length because the HDD/SSD mounts sit below in the compartment with the PSU. Even this huge STRIX 1080 Ti takes up a lot of space and there's still at least 10 cm free. Yes, the STRIX OC will probably be around 380-400 W TDP since the default is 320 W.

 

I'm fully hyped for Cyberpunk 2077 and RTX/DLSS 2.0 at 3440x1440@100Hz

 

3dfx, sorry - I'm not familiar with how our shops operate, that was just an offhand remark since we have crazy prices :) no hard feelings

It is always darkest just before the dawn.
Last edited Thu 3.9.2020 10:13 (mrsmith).
12 years
offline
Re: NVIDIA Ampere 30xx
mrsmith says...
MEGATAMA says...

I'd gladly go MSI but those are a few millimetres too long, at least the 3-fan ones I want; maybe I'll get the Asus ROG 3080 Strix if it isn't too power-hungry for my PSU, since it says the ROG cards are rated at 400 W

I think cards of any length fit in my case; honestly I didn't worry about it at all, since the case is literally clear along its entire length because the HDD/SSD mounts sit below in the compartment with the PSU. Even this huge STRIX 1080 Ti takes up a lot of space and there's still at least 10 cm free. Yes, the STRIX OC will probably be around 380-400 W TDP since the default is 320 W.

 

I'm fully hyped for Cyberpunk 2077 and RTX/DLSS 2.0 at 3440x1440@100Hz

 

3dfx, sorry - I'm not familiar with how our shops operate, that was just an offhand remark since we have crazy prices :) no hard feelings

This Strix of mine is 300 mm and it literally goes almost to the end of the case; there's maybe a centimetre of free space left.

I have to check out the Gigabyte models - this one interests me, and hopefully it won't cost more than 1000 EUR

 

https://www.gigabyte.com/Graphics-Card/GV-N3080AORUS-M-10GD

17 years
inactive
offline
NVIDIA Ampere 30xx

Does anyone know if Nvidia's Marbles demo can be downloaded anywhere?

From what I can see, their official site has everything except that one (and that demo already existed for the 2xxx series of cards)

Best regards ;)
My PC
0 0 thanks 0
16 years
online
Re: NVIDIA Ampere 30xx
MEGATAMA says...

This Strix of mine is 300 mm and it literally goes almost to the end of the case; there's maybe a centimetre of free space left.

I have to check out the Gigabyte models - this one interests me, and hopefully it won't cost more than 1000 EUR

 

https://www.gigabyte.com/Graphics-Card/GV-N3080AORUS-M-10GD

A good model, but it will surely be 100-150€ more expensive than the others. Honestly I'm counting on a price of ~6k kn (STRIX / Aorus probably closer to 7k), but we'll see.

It is always darkest just before the dawn.
11 years
offline
NVIDIA Ampere 30xx

What do you mean October? The RTX 3080 hits the shelves on 17.9; custom models from the leading manufacturers are ready: Aorus, MSI, Asus, Gigabyte, Zotac, EVGA. We are in the EU after all, and if the release date is 17.9 then that's when it is, not October. Besides, two years ago when Turing arrived our stores had the cards the same day as the rest of the EU, at least the serious IT shops did.

My PC
1 0 thanks 0
12 years
offline
Re: NVIDIA Ampere 30xx
Dado_ZG78 says...

Does anyone know if Nvidia's Marbles demo can be downloaded anywhere?

From what I can see, their official site has everything except that one (and that demo already existed for the 2xxx series of cards)

That wasn't a demo for the 20 series (it ran on those professional cards, if I remember correctly) and they haven't said anything about a publicly available demo version.

16 years
online
Re: NVIDIA Ampere 30xx
BeLLicus says...

What do you mean October? The RTX 3080 hits the shelves on 17.9; custom models from the leading manufacturers are ready: Aorus, MSI, Asus, Gigabyte, Zotac, EVGA. We are in the EU after all, and if the release date is 17.9 then that's when it is, not October. Besides, two years ago when Turing arrived our stores had the cards the same day as the rest of the EU, at least the serious IT shops did.

The only ones who gave me any info were Links and Instar.. they told me the cards arrive at their stores in October. If they're lying, then so am I

It is always darkest just before the dawn.
12 years
offline
Re: NVIDIA Ampere 30xx
mrsmith says...
MEGATAMA says...

This Strix of mine is 300 mm and it literally goes almost to the end of the case; there's maybe a centimetre of free space left.

I have to check out the Gigabyte models - this one interests me, and hopefully it won't cost more than 1000 EUR

 

https://www.gigabyte.com/Graphics-Card/GV-N3080AORUS-M-10GD

A good model, but it will surely be 100-150€ more expensive than the others. Honestly I'm counting on a price of ~6k kn (STRIX / Aorus probably closer to 7k), but we'll see.

850 EUR would be fantastic, but I expect it to be more expensive; this Strix of mine cost a round 1000 EUR at the end of 2018.

11 years
offline
NVIDIA Ampere 30xx

We can easily compare: the RTX 2080 Super has the same MSRP as the 3080, 699 dollars. The MSI RTX 2080 Super Gaming X Trio is currently 771 euros in Germany, i.e. about 5800 kn; here the same model is 6500 kn. The VAT difference is 9 percentage points, which is about 500 kn. Prices for the 3080 will be similar, meaning under 6000 kn abroad, which is excellent. Especially if you have someone who regularly comes over from Germany, so you pay at their VAT rate. When you order from here through their websites they immediately switch the VAT to 25%.
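
A quick sketch of that VAT re-pricing; the 771 EUR German price and the two VAT rates come from the post, the exchange rate is an assumption:

```python
# Re-price a German listing (16% VAT) at the Croatian VAT rate (25%).
price_de_gross_eur = 771          # German retail price incl. 16% VAT (from the post)
vat_de, vat_hr = 0.16, 0.25
eur_to_kn = 7.53                  # assumed EUR -> kn exchange rate

net_eur = price_de_gross_eur / (1 + vat_de)              # strip German VAT
price_hr_gross_kn = net_eur * (1 + vat_hr) * eur_to_kn   # add Croatian VAT, convert to kn
print(f"same net price at 25% VAT: ~{price_hr_gross_kn:.0f} kn")   # about 6250 kn
```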

My PC
1 0 thanks 0
17 years
inactive
offline
Re: NVIDIA Ampere 30xx
MEGATAMA says...
Dado_ZG78 says...

Does anyone know if Nvidia's Marbles demo can be downloaded anywhere?

From what I can see, their official site has everything except that one (and that demo already existed for the 2xxx series of cards)

That wasn't a demo for the 20 series (it ran on those professional cards, if I remember correctly) and they haven't said anything about a publicly available demo version.

Ah, could be... maybe they'll release it now along with the 30 series....

Best regards ;)
6 years
banned
offline
NVIDIA Ampere 30xx

Who buys a 2080 Ti for 1080p resolution? I migrated to 1440p long ago.

 
0 0 thanks 0