Not long ago we opened discussion about AMD's OCAT tool, which was a PresentMon update: AMD made it and added an interface to PresentMon, which made it a lot easier to use for new API testing, and it's a sort of replacement for FRAPS. We were sitting on that in a beta stage of the software, and we've also been working with NVIDIA for the last few months on their new software, FCAT VR. Before getting to FCAT VR, this coverage is brought to you by Thermaltake and the new Contact Silent 12 cooler, which is a $25 cooler and a competitor to the Hyper 212.
So VR benchmarking is actually really hard to do properly. We don't have the full numbers and can't release them yet, but very shortly we'll have the full benchmarks of VR on different video cards. The reason VR benchmarking is hard is that it's not the normal type of testing we do. First of all, the headset has a forced vsync of 90Hz, and that is for motion sickness reasons, among other technical reasons. This means it's difficult to actually measure the true FPS; you've got to do some calculations on the software side to guess at what the FPS is beyond 90Hz.
And then if you're below 90Hz there's another problem, and that problem is this: in VR, if you're not meeting the runtime within the designated amount of time, normally about 11 milliseconds, to produce a frame, the runtime then has to decide what to do for that next frame, because it's got a few options. One, it can repeat an old frame with no updates; that's what we're calling a warp miss, and that scenario is sort of a worst case. That is when the user experiences absolutely no updates to their frames at all. It means you're seeing the previous frame, which would be comparable to a stutter in traditional FPS benchmarking or gaming, and if you see a previous frame in VR a couple of times, it will eventually start to induce sickness, uneasiness, or just not-good feelings, which is bad, because you're wearing a display on your face and it's the only thing you can see. So that's a warp miss. The other thing that we have to measure is a dropped frame. A dropped frame is when the runtime decides to take a previous frame and then synthesize a new frame by updating the head tracking, but the animation within the frame remains the same. So if you're looking at a character in a sort of firing position, that animation will not change, but if you move your head around, the head tracking will update, and that's important for reducing the chance of user sickness. It's not an ideal experience, just like tearing in a game is not an ideal experience, but it's probably not going to make you vomit, so that's a good thing.
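To make that distinction concrete, here is a minimal sketch in Python of how those per-refresh outcomes could be labeled. The two boolean inputs and the labels are hypothetical illustrations for this explanation, not the actual FCAT VR data format.

```python
# Minimal sketch (hypothetical inputs, not NVIDIA's actual FCAT VR format):
# classify each 90Hz display interval based on what the runtime had available.

REFRESH_HZ = 90
FRAME_BUDGET_MS = 1000.0 / REFRESH_HZ  # ~11.1 ms per refresh

def classify_interval(app_frame_ready: bool, reprojection_applied: bool) -> str:
    """Label one refresh interval.

    app_frame_ready      -- the game delivered a new frame within the ~11 ms budget
    reprojection_applied -- the runtime synthesized a frame by re-warping the
                            previous one with updated head tracking
    """
    if app_frame_ready:
        return "delivered"        # new frame: new animation, new head tracking
    if reprojection_applied:
        return "dropped frame"    # old animation, but head tracking still updates
    return "warp miss"            # the previous frame is shown again, untouched

# Example: three intervals where the GPU fell behind on the last two.
for ready, reprojected in [(True, False), (False, True), (False, False)]:
    print(classify_interval(ready, reprojected))
```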
These things, it turns out, are not easy to measure with traditional tools like FRAPS or PresentMon; we actually have no way to measure them with those. You can use things like SteamVR's FPS monitoring and frame time tracking tools, but that's not the best solution and it's got some limitations. So the FCAT VR approach is twofold. One, there is a hardware solution: we set up a secondary hardware capture machine, and that's partly for validation. But two, there's a software solution that measures dozens of different data points and variables and creates a massive CSV file (we have some b-roll of that, I think), and then we can take that data and start crunching it into graphs: how many warp misses are there, are there dropped frames, what's the average latency frame to frame, what's the worst latency, what's the best latency? All of that we can get from there, along with our traditional 1% and 0.1% lows.
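As a rough sketch of that kind of number crunching, assuming a hypothetical CSV with one row per frame and a single "frametime_ms" column (the real FCAT VR output has many more fields), the basic statistics could be computed like this:

```python
# Sketch only: summarize frame-to-frame latency and 1% / 0.1% lows from a
# hypothetical per-frame CSV. Column and file names are made up for the example.
import csv
import statistics

def summarize(path: str) -> dict:
    with open(path, newline="") as f:
        frametimes = [float(row["frametime_ms"]) for row in csv.DictReader(f)]

    worst_first = sorted(frametimes, reverse=True)   # longest frametimes first

    def low_fps(pct: float) -> float:
        n = max(1, int(len(worst_first) * pct))      # slowest pct of all frames
        return 1000.0 / statistics.mean(worst_first[:n])

    return {
        "avg_ms": statistics.mean(frametimes),
        "worst_ms": max(frametimes),
        "best_ms": min(frametimes),
        "1%_low_fps": low_fps(0.01),
        "0.1%_low_fps": low_fps(0.001),
    }

print(summarize("fcatvr_run.csv"))  # hypothetical file name
```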
On the hardware capture side, the really important thing is that FCAT is an NVIDIA tool. Of course they say that it's fair, and they are planning to release it as open source, which will allow you to look into the code, but it's still an NVIDIA tool, so we have to validate it and make sure that it's being fair, and one of the best ways to do that is with hardware capture. So we set up a secondary PC that has a Vision capture card, I think it's an SD-HD4 or something; it's a pretty expensive capture card with very high bandwidth. Out of that card there's a splitter, and into the splitter goes the HDMI from a splitter box. We're actually getting into the complicated part here: the cables alone are the hard part of VR benchmarking. The way the cables are set up and the connection order does matter, by the way. You have the benchmarking system with the video card under test; out of that comes DisplayPort to your monitor and HDMI to a splitter box. Out of the splitter box come two other cables. One goes to the link box for the headset, so we've got two boxes in there, and that goes to the headset, the head-mounted display, the HMD being the Vive. The other cable coming out of the splitter box (not the link box, the splitter box) goes to the capture system, into that splitter, and then that goes into the capture card. From there you're able to record, using VirtualDub, which is a third-party tool, the actual output to the head-mounted display.
That output has an overlay on it, a VR benchmarking overlay, which cycles through something like eight colors, so we can step through each of those colors one frame at a time and see if there are any that are missing, or if there are any other problems within the frame. Traditionally you would use this for looking at tearing within a frame, but with FCAT VR we're mostly looking at whether there are any dropped frames.
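As an illustration of that frame-stepping process, here is a small sketch that scans a captured sequence of overlay colors for repeats and skips. The eight-color palette and its order are made up for the example, not NVIDIA's actual overlay.

```python
# Sketch: check the cycling overlay colors frame by frame in a hardware capture.
OVERLAY_CYCLE = ["red", "green", "blue", "cyan",
                 "magenta", "yellow", "white", "black"]  # hypothetical order

def find_anomalies(captured_colors):
    """Compare each captured frame's overlay color against the expected cycle.

    A repeated color means the same frame was scanned out twice; a skipped
    color means a frame never reached the display.
    """
    anomalies = []
    for i in range(1, len(captured_colors)):
        prev_idx = OVERLAY_CYCLE.index(captured_colors[i - 1])
        expected = OVERLAY_CYCLE[(prev_idx + 1) % len(OVERLAY_CYCLE)]
        if captured_colors[i] == captured_colors[i - 1]:
            anomalies.append((i, "repeat"))
        elif captured_colors[i] != expected:
            anomalies.append((i, "skip"))
    return anomalies

print(find_anomalies(["red", "green", "green", "cyan"]))  # [(2, 'repeat'), (3, 'skip')]
```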
There can also be dropped frames because of improper testing. The big problem with VR testing is that it's such high bandwidth that we have to, one, use a high-bandwidth capture card and, two, use either RAIDed SSDs or something like an Intel 750. We have an Intel 750 1.2TB drive from VS Mods (thank you, Bob and Rod, for that), and that is capable of keeping up with the recording and the capture; it's also, at 1.2 terabytes, capable of storing everything. That's a big issue too, because each one-minute file or so is about 50 gigabytes.
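For a rough sense of why the bandwidth and storage requirements get this extreme, here is a back-of-the-envelope sketch assuming an uncompressed Vive-class signal (2160x1200 at 90Hz, 24-bit color); the actual capture format will shift these numbers somewhat.

```python
# Back-of-the-envelope estimate of raw capture bandwidth (assumed signal specs).
WIDTH, HEIGHT = 2160, 1200
FPS = 90
BYTES_PER_PIXEL = 3  # 24-bit RGB

bytes_per_second = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS
gb_per_minute = bytes_per_second * 60 / 1e9

print(f"{bytes_per_second / 1e6:.0f} MB/s, ~{gb_per_minute:.0f} GB per minute")
# ~700 MB/s and ~42 GB per minute: in the same ballpark as the ~50 GB one-minute
# files mentioned above, and well beyond what a single SATA SSD can sustain.
```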
From there we analyze the file and output a CSV that says if anything is missing; we can see if it's holding 90Hz and whether there are any drops, and use that to validate against the software capture, which has recorded all of the frames, synthetic and otherwise, from the video card. So we can use that for validation. The next step is to compress the 50-gigabyte file, which we can do (you're probably watching one of them at some point in this video), and that is done with a script that we wrote; we use it for our own media production as well. The script compresses it to 80% or smaller of the original size, or I should say it shrinks it by 80% or more of the original size. We sometimes go down to a couple hundred megabytes, depending, and we don't really lose any quality. So that's thanks to a script that we wrote in-house.
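Our in-house script isn't published, but as a sketch of the general idea, re-encoding a raw capture with ffmpeg (assuming ffmpeg is installed and on the PATH) looks something like this:

```python
# Sketch of a capture-compression step: re-encode a raw recording to H.264.
# File names are examples; tune CRF/preset to trade size against quality.
import subprocess
import sys

def compress(src: str, dst: str, crf: int = 18) -> None:
    """Re-encode a raw capture with x264; higher CRF = smaller file, lower quality."""
    subprocess.run(
        ["ffmpeg", "-i", src, "-c:v", "libx264", "-preset", "slow",
         "-crf", str(crf), dst],
        check=True,
    )

if __name__ == "__main__":
    compress(sys.argv[1], sys.argv[2])  # e.g. python compress.py raw_capture.avi out.mp4
```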
I can't release the full numbers yet; you've probably seen some examples while going through all this stuff in the video, but we'll have numbers shortly if you want to look into this. The real main news item here is that there is now a good way to benchmark VR. It has taken several months of back and forth with NVIDIA to get it functional. Originally, rather than this nice interface they have now, we were working with actual code: we had to modify Perl code to output the charts, we used code to align the benchmark data, and we also used things like regular expression editing to do anything more complex. That was really not easy to work with, but it's a lot better now. It's a UI, and they're going to be releasing it to the public; I think it's going to reviewers first, and we'll have numbers soon if you're curious about it. We'll try to put a tutorial out there as well so that you can use it yourself. In terms of fairness, we'll validate that, but the hardware capture should pretty much resolve any concerns there, because we're going to be able to see, on the capture system, the actual raw output going to the headset, and that will validate it. That's all for now. We'll have a lot more on this; it's more depth than we were really planning to get into, but go to patreon.com/gamersnexus to help out directly and help fund our testing efforts. We'll be looking at FCAT VR shortly, and hopefully at OCAT once it's in a more final state. Subscribe for more; I'll see you all next time.