michaelmikado's forum posts

Avatar image for michaelmikado
michaelmikado

406

Forum Posts

0

Wiki Points

2

Followers

Reviews: 0

User Lists: 5

#1 michaelmikado
Member since 2019 • 406 Posts

@AIIison said:

Seriously picking up saves from streamers defeats the challenge of a game and would ruin any trophy/achievement system in place.

What an awful device and I hope it fails hard and Google regrets ever challenging Microsoft for the crown.

You can make games a service without ruining the experience. "When everyone plays, we win" is Xbox's slogan, "play anywhere" is pretty much Nintendo's, and Sony's is giving us epic cinematic experiences.

This is going to be a pain in the ass for anyone with data caps or slow internet.

At least with Xbox or PS4 (if you bring a monitor alongside, you get odd looks, but it is worth it) you can download at a place with free WiFi, then take the fun home with you.

Sony already does this with SharePlay, Remote Play, and PS Now. SharePlay lets a friend remotely play your PS4, or even join local multiplayer. It's been around for 5 years now and hasn't ruined the industry.


#2 michaelmikado
Member since 2019 • 406 Posts

Peak speed is the least reliable metric for judging readiness for game streaming. It will come down to latency and consistency; raw throughput is almost a non-factor at today's standard speeds.
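To put rough numbers on that, here's a quick frame-budget sketch. The figures are plain arithmetic and my own illustration, not anything Google has published:

```python
# Frame-time budget vs. network round-trip: a rough illustration of why
# latency, not peak speed, is the limiting factor for game streaming.

def frame_time_ms(fps: int) -> float:
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

def ping_fits_in_frame(fps: int, round_trip_ms: float) -> bool:
    """Naive check: does the network round trip fit inside one frame interval?"""
    return round_trip_ms <= frame_time_ms(fps)

if __name__ == "__main__":
    print(f"30 fps budget: {frame_time_ms(30):.1f} ms")  # ~33.3 ms
    print(f"60 fps budget: {frame_time_ms(60):.1f} ms")  # ~16.7 ms
    # Even a decent 30 ms ping overflows a single 60 fps frame interval,
    # so jitter and consistency matter far more than raw throughput.
    print(ping_fits_in_frame(60, 30.0))
```

A connection with huge peak bandwidth but jittery round-trip times still streams badly, which is the point of the post above.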


#3  Edited By michaelmikado
Member since 2019 • 406 Posts

I'm still waiting for details on costs, but I've already got my estimates:

20 hours of gaming per month, in 3 tiers:

$15/month 720p x 30fps - (A) games & free-to-play included

$25/month 1080p x 30fps - (AA) games included

$40/month 4K x 60fps - (AAA) games included
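Those guessed tiers work out to the following per-hour rates (simple division on the speculative prices above, nothing announced):

```python
# Implied per-hour cost of the speculative tiers above, at 20 hours/month.
# The prices are guesses from the post, not announced figures.

HOURS_PER_MONTH = 20
tiers = {"720p30": 15.0, "1080p30": 25.0, "4K60": 40.0}  # USD per month

per_hour = {name: price / HOURS_PER_MONTH for name, price in tiers.items()}
for name, rate in per_hour.items():
    print(f"{name}: ${rate:.2f}/hour")  # 0.75, 1.25, 2.00
```

That puts the top tier right at the $2/hour mark discussed later in the thread.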


#4 michaelmikado
Member since 2019 • 406 Posts

@goldenelementxl said:

If anyone has the cash to take a hit on hardware, it's Google. And the difference here is that Google doesn't have to pay retailers or ship anything across the globe. This pretty much confirms a 10-12 TFLOP range in terms of power next gen. Let's just hope this 2.7 GHz hyperthreaded chip isn't a sign of things to come CPU wise.

No, that's based on AMD's old Epyc 7601 server chip. Epyc Rome would be the equivalent, and Google will upgrade whenever those are available. Consoles will get the 3.2GHz variant, and cloud servers will follow soon afterwards.


#5  Edited By michaelmikado
Member since 2019 • 406 Posts

@xantufrog said:

@michaelmikado: as far as I can tell, the information you have provided only addresses one part of my post:

I speculated they might have some way to distribute rendering on the back end without code changes. You've asserted that's impossible.

Maybe you are right - I don't do this line of work. But SLI/Crossfire-type software-side resource support is absolutely possible. Not only did they show a demo using 3 instances in this manner, but I can do this at home on my own computer.

You might argue they "won't" do that, but can't and won't are not the same thing, and they just got on a stage and advertised they will support that to all the developers.

Their system has to be scalable to survive. They cannot offer competitive long-term games support if they don't encourage developers to take advantage of multi-GPU rendering - unless they want to do what us PC gamers do and upgrade all their gpus every few years. Which is an enormous waste of the potential of a server system like this

That's correct: there would be a different type of interface, and it wouldn't be networked GPU resources. The servers they are talking about have 8 GPUs on a PCIe bus, 2 per V340. It's still one physical server, with a theoretical 448 CUs across the quad V340s. These servers run at least $50K-$80K each, so they definitely wouldn't allot one to a single user. But they aren't doing networked GPUs for real-time gaming; that remains outside what's possible at this point. There is a world of difference between distributing rendering over a PCIe bus on identical cards and distributing it over a LAN on varying hardware.

I keep bringing up this server in my posts because understanding how this all actually works is better than speculating. From AMD's own announcement:

At VMworld 2018 in Las Vegas in the AMD booth, we demonstrated how A + A makes an astounding duet. For the first time, we showed a technology demonstration of a virtualized environment driven by an HPE ProLiant DL385 Gen10 from AMD's Santa Clara, CA datacenter. Inside the server we have two 32-core, 64-thread Epyc CPUs and a recently announced dual-GPU solution based on the "Vega" architecture, the Radeon Pro V340. The result: an awesome system for virtualized workloads, and a GPU that delivers 33% greater user density than our competition.

Also in the booth was our A + A gorilla. It's a dual Epyc server with 4 Radeon Pro V340s: 8 GPUs (2 per card), each with 56 compute units thumping graphics, supported by 2 Epyc 32-core CPUs.
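To see why PCIe-local multi-GPU and LAN-distributed rendering are in different leagues, compare the raw link bandwidth. The figures below are approximate theoretical rates from public specs, not measurements of any streaming hardware:

```python
# Approximate one-direction bandwidth of the links involved, in GB/s.
PCIE3_X16_GBPS = 15.75  # PCIe 3.0 x16, roughly theoretical per direction
TEN_GBE_GBPS = 1.25     # 10 Gb/s Ethernet line rate divided by 8 bits

ratio = PCIE3_X16_GBPS / TEN_GBE_GBPS
print(f"PCIe 3.0 x16 moves roughly {ratio:.1f}x the bytes of 10 GbE")
# And that ignores Ethernet's far higher and far less predictable latency,
# which is the real killer for frame-by-frame workload splitting.
```

Even before latency enters the picture, a LAN link carries an order of magnitude fewer bytes per second than the local bus the V340s sit on.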


#6 michaelmikado
Member since 2019 • 406 Posts

@ronvalencia said:

@qx0d:

https://wccftech.com/stadia-is-googles-cloud-based-game-platform-powered-by-amd-linux-and-vulkan-due-in-2019/

Google Stadia is powered by AMD technology and custom built for this use case. It surpasses the likes of PlayStation 4 Pro and Xbox One X put together with its 10.7 teraflops of computing power, 56 compute units and HBM2 memory. The CPU is a custom x86 processor clocked at 2.7 GHz which also supports hyperthreading and AVX2. Memory totals 16GB, with up to 484 GB/s of transfer speed, and 9.5 MB of L2 + L3 cache.

Specs decode

GPU: RX Vega 56 with Vega 64's HBM v2 memory bandwidth... Unknown if this Vega 56 is a semi-custom VII with 56 CUs at 10.7 TFLOPS and two HBM v2 stacks. Normal Vega 56 has 10.56 TFLOPS.

CPU: custom ZEN ... AVX2 full hardware from ZEN 2 or half baked version from ZEN v1.x??? Clock speed set at 2.7 Ghz.

Nah, this is known. The full hardware was announced in October by AMD.

https://community.amd.com/community/radeon-pro-graphics/blog/2018/11/13/amd-server-cpus-gpus-the-ultimate-virtualization-solution

Anyone who is doing game streaming will be using these servers except for MS.

Footnote from the link.

Estimates based on SPECfp®_rate_base2017 using the GCC-02 v7.2 compiler. AMD- based system scored 201 in tests conducted in AMD labs using an “Ethanol” reference platform configured with 2 x AMD EPYC 7601 SOC’s ($4200 each at AMD 1 ku pricing), 512GB memory (16 x 32GB 2R DDR4 2666MHz), Ubuntu 17.04, BIOS 1002E. Intel- based Supermicro SYS-1029U-TRTP server scored 164 in tests conducted by AMD, configured with 2 x 8160 CPU’s (2 x $4702 each per ark.intel.com), 768GB memory (24 x 32GB 2R DDR4 2666MHz), SLES 12 SP3 4.4.92-6.18-default kernel, BIOS set to Extreme performance setting. NAP-77

CPU Specs: Epyc 7601

CPU Cores: 32
Threads: 64
Base Clock: 2.2GHz
Max Boost Clock: 3.2GHz
All-Core Boost: 2.7GHz
Socket Count: 1P/2P
PCI Express: 128 lanes
Default TDP: 180W

GPU Specs: V340

GPU Architecture: Vega
Lithography: 14nm FinFET
Stream Processors: 7168
Compute Units: 112 (2x Vega 56)
Memory Size: 32 GB
Memory Type: HBM2
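As a sanity check on those numbers: GCN single-precision throughput is CUs x 64 shaders x 2 FLOPs per clock x clock speed. Plugging in 56 CUs at roughly 1.49 GHz (my estimated clock, since Google only quoted the TFLOPS) lands right at the advertised figure:

```python
# Single-precision TFLOPS for a GCN/Vega GPU:
# CUs * 64 stream processors * 2 FLOPs per clock (FMA) * clock in GHz.

def vega_tflops(compute_units: int, clock_ghz: float) -> float:
    stream_processors = compute_units * 64
    return stream_processors * 2 * clock_ghz / 1000.0

# 56 CUs at ~1.49 GHz comes out to ~10.7 TFLOPS, matching the Stadia spec.
print(round(vega_tflops(56, 1.49), 2))
```

The same formula on all 112 CUs of a V340 gives the doubled theoretical figure for the whole card, which is why per-user allocation is half the board.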



#7 michaelmikado
Member since 2019 • 406 Posts

@Grey_Eyed_Elf:

The cost of the service is a huge concern.

These streaming services are generally priced per hour of streaming. When Netflix streaming first launched, users only got 20 hours of streaming per month.

PS Now, when it first launched, charged $2.99 for a 4-hour rental, or about $0.75 per hour.

Shadow charges $35 for 20 hours, and Nvidia Shield is rumored to be priced at $25 for 20 hours of use.

Neither Shadow nor the consumer version of Shield includes games unless you already own them through a digital store. The betas include some games.

So the breakdown is:

PS Now at launch: $0.75 per hour

Nvidia Shield (rumored): $1.25 per hour, no games

Shadow: $1.75 per hour, no games

PS Now can now be had for $99 a year, or about $8 per month. So, like Netflix, after several years in the streaming business they were able to offer an unlimited model for under $10 a month that includes the content itself.

Neither Google nor MS has said anything about their streaming prices, but the going rate appears to be between $1.25 and $1.75 an hour. Anything higher than $2 per hour may be a hard sell, even though we pay $5 to rent 2-hour movies for 72 hours... so new games will likely be out of the question unless the service is very highly priced.
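The per-hour figures above come straight from dividing price by hours; a quick check, using the prices as quoted in this thread (which may be off):

```python
# Per-hour cost of the streaming offers quoted above: price / hours.
services = {
    "PS Now launch rental": (2.99, 4),     # USD, hours of play
    "Nvidia Shield (rumored)": (25.00, 20),
    "Shadow": (35.00, 20),
}

rates = {name: price / hours for name, (price, hours) in services.items()}
for name, rate in rates.items():
    print(f"{name}: ${rate:.2f}/hour")
```

The rounded results match the $0.75 / $1.25 / $1.75 breakdown in the post.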


#8 michaelmikado
Member since 2019 • 406 Posts

@BassMan:

PhysX is just a software API; it can still be used if a developer wants to. I'm sure they have their reasons, but I'm assuming that because they'll be using AMD hardware and running PC games, they have some tie-in with Microsoft, who owns Havok and makes a lot of Xbox games.

Basically, Google needs PC games, which may already be cross-developed for Xbox, so getting them running on AMD hardware is likely easier thanks to that experience with Xbox and MS. Remember, the games would need to support the API in the first place, and few devs will have PhysX running on AMD GPUs since it only recently went open source.


#9 michaelmikado
Member since 2019 • 406 Posts

@IMAHAPYHIPPO:

It comes down to the underlying servers these run on, and how much of those server resources they will allocate to a single user at any given time. The published specs are really for reference and marketing, so consumers and devs can compare against current offerings.


#10 michaelmikado
Member since 2019 • 406 Posts

@Grey_Eyed_Elf:

That's correct, and 60fps is ideal because, believe it or not, it actually helps mask the lag. I assume that's because a single frame skipped due to packet loss is slightly less noticeable at the higher frame rate.
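That intuition can be put in numbers: if one frame is lost, the previous image stays on screen for two frame intervals, and that gap is half as long at 60fps. Simple arithmetic, my own illustration:

```python
# On-screen gap when a single frame is dropped: the last good frame
# persists for two frame intervals instead of one.

def dropped_frame_gap_ms(fps: int) -> float:
    return 2 * 1000.0 / fps

print(f"60 fps: {dropped_frame_gap_ms(60):.1f} ms gap")  # ~33 ms
print(f"30 fps: {dropped_frame_gap_ms(30):.1f} ms gap")  # ~67 ms
# The 60 fps hiccup lasts half as long, so a skipped frame is less visible.
```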

They will likely scale up more in the future as they get better server hardware and bandwidth costs go down.