Video Codec Testing and Quality Measurement
draft-ietf-netvc-testing-09

Document history

Date Rev. By Action
2020-08-03
09 (System) Document has expired
2020-05-14
09 Tero Kivinen Request closed, assignment withdrawn: Alan DeKok Last Call SECDIR review
2020-05-14
09 Tero Kivinen Closed request for Last Call review by SECDIR with state 'Overtaken by Events'
2020-03-25
09 Adam Roach IESG state changed to Dead from DNP-waiting for AD note
2020-03-25
09 Adam Roach After repeated attempts to get IESG evaluation DISCUSS comments addressed, we have concluded that the NETVC working group lacks sufficient momentum to complete this document.
2020-03-25
09 Adam Roach IESG state changed to DNP-waiting for AD note from IESG Evaluation::Revised I-D Needed
2020-03-10
09 Adam Roach IESG state changed to IESG Evaluation::Revised I-D Needed from IESG Evaluation::AD Followup
2020-02-07
09 Alissa Cooper [Ballot comment]
Thank you for addressing my DISCUSS.

Please respond to the Gen-ART review.
2020-02-07
09 Alissa Cooper [Ballot Position Update] Position for Alissa Cooper has been changed to No Objection from Discuss
2020-01-31
09 (System) Sub state has been changed to AD Followup from Revised ID Needed
2020-01-31
09 (System) IANA Review state changed to Version Changed - Review Needed from IANA OK - No Actions Needed
2020-01-31
09 Thomas Daede New version available: draft-ietf-netvc-testing-09.txt
2020-01-31
09 (System) New version approved
2020-01-31
09 (System) Request for posting confirmation emailed to previous authors: Andrey Norkin , Thomas Daede , Ilya Brailovskiy
2020-01-31
09 Thomas Daede Uploaded new revision
2019-06-14
08 Linda Dunbar Request for Telechat review by OPSDIR Completed: Has Nits. Reviewer: Linda Dunbar. Sent review to list.
2019-06-13
08 Cindy Morgan IESG state changed to IESG Evaluation::Revised I-D Needed from IESG Evaluation
2019-06-13
08 Roman Danyliw
[Ballot discuss]
(1) There appear to be deep and implicit dependencies in the document to the references [DAALA-GIT] and [TESTSEQUENCES].  I applaud the intent to provide tangible advice on testing and evaluation to the community with them.  I have a few questions around their use.

(1.a) Why aren’t [DAALA-GIT] and [TESTSEQUENCES] normative references as they are needed to fully understand the testing approach and provide the test data?

(1.b) What should readers of the RFC do if these external references are no longer available?  How is the change control of these references handled?

(1.c) In the case of [DAALA-GIT] which version of the code in the repo should be used?  Formally, what version of C is in that repo? 

(1.d) Per the observation that there are implicit assumptions made by the document about familiarity with [DAALA-GIT] and [TESTSEQUENCES], here are a few places where additional clarity is required:

-- Section 4.3, Per “For individual feature changes in libaom or libvpx , the overlap BD-Rate method with quantizers 20, 32, 43, and 55 must be used”, what are libaom and libvpx and what is their role?

-- Section 5.3.  Multiple subsections in 5.3.* list what look like settings for tools (e.g., “av1: -codec=av1 -ivf -frame-parallel=0 …”). What exactly are those?  How should they be read and used?

(2) The full details of some of the testing regimes need to be more fully specified (or cited as normative):
-- Section 3.1. The variable MAX is not explained in either equation.

-- Section 3.1.  This section doesn’t explain or provide a reference to calculate PSNR.  I’m not sure how to calculate or implement it.

-- Section 4.2.  Reference needed for Bjontegaard rate difference to explain its computation

-- The references [SSIM], [MSSIM], [CIEDE2000] and [VMAF] are needed to fully explain a given testing metric so they need to be normative

(3) An IANA Considerations section isn’t present in the document.

(4) A Security Considerations section isn't present in the document.
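For context on item (2) above: the conventional PSNR definition is PSNR = 10·log10(MAX²/MSE), where MAX is the peak sample value (255 for 8-bit video, 1023 for 10-bit). A minimal sketch, not the draft's dump_psnr.c tool:

```python
import math

def psnr(ref, rec, max_val=255):
    # MAX in the equations is the peak sample value:
    # 255 for 8-bit video, 1023 for 10-bit.
    mse = sum((a - b) ** 2 for a, b in zip(ref, rec)) / len(ref)
    if mse == 0:
        return float("inf")  # identical signals: no distortion
    return 10 * math.log10(max_val ** 2 / mse)

# A uniform error of 10 per 8-bit sample gives MSE = 100:
print(round(psnr([0, 0, 0, 0], [10, 10, 10, 10]), 2))  # 28.13
```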
2019-06-13
08 Roman Danyliw
[Ballot comment]
A few comments:

(5) Consider qualifying the title to more accurately capture the substance of this draft “Video Codec Testing and Quality Measurement {using the Daala Tool Suite or Xiph Tools and Data}.”

(6) Section 3.1 and 3.3.  Cite a reference for the source code file names in question – dump_psnr.c and dump_pnsrhvs.c (which are somewhere in the [DAALA-GIT] repo?)

(7) Editorial Nits
-- Section 2.1.  Expand PMF (Probability Mass Function) on first use.

-- Section 2.1. Explain floor.

-- Section 2.2.  Typo.  s/vidoes/videos/

-- Section 2.2. Typo. s/rewatched/re-watched/

-- Section 2.3.  Typo.  s/comparisions/comparisons/

-- Section 3.1.  Expand PSNR (Peak signal to noise ratio) on first use.

-- Section 3.1.  Typo.  s/drived/derived/
2019-06-13
08 Roman Danyliw [Ballot Position Update] New position, Discuss, has been recorded for Roman Danyliw
2019-06-12
08 Benjamin Kaduk
[Ballot discuss]
I suspect I will end up balloting Abstain on this document, given how
far it is from something I could support publishing (e.g., a
freestanding clear description of test procedures), but I do think
there are some key issues that need to be resolved before publication.
Perhaps some of them stem from a misunderstanding of the intended goal
of the document -- I am reading this document as attempting to lay out
procedures that are of general utility in evaluating a codec or codecs,
but it is possible that (e.g.) it is intended as an informal summary of
some choices made in a specific operating environment to make a
specific decision.  Additional text to set the scope of the discussion
could go a long way.

Section 2

There's a lot of assertions here without any supporting evidence or
reasoning.  Why is subjective better than objective?  What if objective
gets a lot better in the future?  What if a test should be important but
the interested people don't have the qualifications and the qualified
people are too busy doing other things?

Section 2.1

Why is p<0.5 an appropriate criterion?  Even where p-values are still
used in the scientific literature (which is decreasing in popularity),
the threshold is more often 0.05, or even 0.00001 (e.g., for high-energy
physics).
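For comparison, the p-value for a forced-choice subjective test of the kind Section 2.1 describes can be computed exactly from the binomial distribution under the null hypothesis that both codecs are equally likely to be preferred. A minimal one-sided sketch (illustrative only, not the draft's tooling):

```python
from math import comb

def sign_test_p(k, n):
    # One-sided p-value: probability of observing k or more
    # preferences for one codec out of n trials when the null
    # hypothesis (both codecs equally good, p = 0.5) is true.
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

# 14 of 16 subjects preferring one codec clears even a 0.05 threshold:
print(sign_test_p(14, 16))  # ~0.0021
```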

Section 3

Normative C code contained outside of the RFC being published is hardly
an archival way to describe an algorithm.  There isn't even a git commit
hash listed to ensure that the referenced material doesn't change!

Section 3.5, 3.6, 3.7

I don't see how MSSSIM, CIEDE2000, VMAF, etc. are not normative
references.  If you want to use the indicated metric, you have to follow
the reference.

Section 4.2

There is a dearth of references here.  This document alone is far from
sufficient to perform these calculations.

Section 4.3

There is a dearth of references here as well.  What are libaom and
libvpx?  What is the overlap "BD-Rate method" and where is it specified?

Section 5.2

This mention of "[a]ll current test sets" seems to imply that this
document is part of a broader set of work.  The Introduction should make
clear what broader context this document is to be interpreted within.
(I only note this once in the Discuss portion, but noted some other
examples in the Comment section.)
2019-06-12
08 Benjamin Kaduk
[Ballot comment]
Section 1

Please give the reader a background reading list to get up to speed with
the general concepts, terminology, etc.  (E.g., I happen to know what
the "luma plane" is, but that's not the case for all consumers of the
RFC series.)

Section 2.1

It seems likely that we should note that the ordering of the algorithms
in question should be randomized (presented as left vs. right,
first vs. second, etc.)

Section 2.3

  A Mean Opinion Score (MOS) viewing test is the preferred method of
  evaluating the quality.  The subjective test should be performed as
  either consecutively showing the video sequences on one screen or on
  two screens located side-by-side.  The testing procedure should

When would it be appropriate to perform the test differently?

  normally follow rules described in [BT500] and be performed with non-
  expert test subjects.  The result of the test will be (depending on

(I couldn't follow the links to [BT500] and look; is this a
restricted-distribution document?)

Section 3.4

A forward reference or other expansion for BD-Rate would be helpful.

Section 3.7

  perception of video quality [VMAF].  This metric is focused on
  quality degradation due compression and rescaling.  VMAF estimates

nit: "due to"

Section 4.1

Decibel is a logarithmic scale that requires a fixed reference value in
order for numerical values to be defined (i.e., to "cancel out the
units" before the transcendental logarithmic function is applied).  I
assume this is intended to take the reference as the full-fidelity
unprocessed original signal, but it may be worth making that explicit.

Section 4.2

Why is it necessary to mandate trapezoidal integration for the numerical
integration?  There are fairly cheap numerical methods available that
have superior performance and are well-known.
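As background, the BD-Rate calculation in question integrates the difference of the two log-rate curves over their overlapping quality range and converts the average back to a percent bitrate change. A minimal sketch using piecewise-linear interpolation and the trapezoid rule (the curve fit and step count here are illustrative assumptions; real implementations typically fit cubic polynomials or splines):

```python
import math

def bd_rate(anchor, test):
    # Approximate Bjontegaard rate difference: percent bitrate change
    # of `test` relative to `anchor` at equal quality.  Each curve is
    # a list of (bitrate, quality) points with quality ascending.
    def log_rate(points, q):
        # Piecewise-linear interpolation of log10(bitrate) at quality q.
        for (r0, q0), (r1, q1) in zip(points, points[1:]):
            if q0 <= q <= q1:
                t = (q - q0) / (q1 - q0)
                return (1 - t) * math.log10(r0) + t * math.log10(r1)
        raise ValueError("quality outside the overlap interval")

    # Integrate only over the quality range covered by both curves.
    lo = max(anchor[0][1], test[0][1])
    hi = min(anchor[-1][1], test[-1][1])
    n = 100  # trapezoidal integration steps
    total = 0.0
    for i in range(n):
        q0 = lo + (hi - lo) * i / n
        q1 = lo + (hi - lo) * (i + 1) / n
        d0 = log_rate(test, q0) - log_rate(anchor, q0)
        d1 = log_rate(test, q1) - log_rate(anchor, q1)
        total += (d0 + d1) / 2 * (q1 - q0)  # trapezoid rule
    avg_log_diff = total / (hi - lo)
    return (10 ** avg_log_diff - 1) * 100

# A test codec needing half the bitrate at every quality: BD-Rate = -50%.
anchor = [(100, 30), (200, 35), (400, 40)]
halved = [(50, 30), (100, 35), (200, 40)]
print(round(bd_rate(anchor, halved), 6))  # -50.0
```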

Section 5.2.x

How important is it to have what is effectively a directory listing in
the final RFC?

Section 5.2.2, 5.2.3

              This test set requires compiling with high bit depth
  support.

Compiling?  Compiling what?  Again, this needs to be set in the broader
context.

Section 5.3

Please expand CQP on first usage.  I don't think the broader scope in
which the "operating modes" are defined has been made clear.

Section 5.3.4, 5.3.5

  supported.  One parameter is provided to adjust bitrate, but the
  units are arbitrary.  Example configurations follow:

Example configurations *of what*?

Section 6.2

  Normally, the encoder should always be run at the slowest, highest
  quality speed setting (cpu-used=0 in the case of AV1 and VP9).
  However, in the case of computation time, both the reference and

What is "the case of computation time"?

  changed encoder can be built with some options disabled.  For AV1,
  -disable-ext_partition and -disable-ext_partition_types can be passed
  to the configure script to substantially speed up encoding, but the
  usage of these options must be reported in the test results.

Again, this is assuming some context of command-line tools that is not
clear from the document.
2019-06-12
08 Benjamin Kaduk [Ballot Position Update] New position, Discuss, has been recorded for Benjamin Kaduk
2019-06-12
08 Ignas Bagdonas [Ballot Position Update] New position, No Objection, has been recorded for Ignas Bagdonas
2019-06-12
08 Alissa Cooper [Ballot discuss]
RFC 7322 requires all Internet-drafts to have IANA considerations and security considerations sections. Please add them.
2019-06-12
08 Alissa Cooper [Ballot comment]
Please respond to the Gen-ART review.
2019-06-12
08 Alissa Cooper [Ballot Position Update] New position, Discuss, has been recorded for Alissa Cooper
2019-06-12
08 Deborah Brungard [Ballot Position Update] New position, No Objection, has been recorded for Deborah Brungard
2019-06-07
08 Gunter Van de Velde Request for Telechat review by OPSDIR is assigned to Linda Dunbar
2019-06-07
08 Gunter Van de Velde Request for Telechat review by OPSDIR is assigned to Linda Dunbar
2019-06-05
08 Mirja Kühlewind
[Ballot comment]
Update: This document has no security considerations section, while having this section is required.

This document reads more like a user manual of the Daala tools repository (together with the test sequences). I wonder why this is not simply archived within the repo? What’s the benefit of having this in an RFC? I’m especially worried that this document is basically useless if the repo and test sequences disappear, and are therefore no longer available in the future, or change significantly. I understand that this is referenced by AOM and therefore publication is desired; however, I don't think that makes my concern about the standalone usefulness of this document invalid. If you really want to publish in the RFC series, I would recommend reducing the dependencies on these repos and trying to make this document more useful as a standalone test description (which would probably mean removing most of section 4 and adding some additional information to other parts).

Also, the shepherd write-up seems to indicate that this document has an IPR disclosure that was filed after WG last call. Is the wg aware of this? Has this been discussed in the wg?

Other more concrete comments:
1) Quick question on 2.1: Is the tester supposed to view one image after the other or both at the same time? And if one after the other, could the order impact the results (and should it therefore be randomly chosen)?

2) Sec 2.3: Would it make sense to provide a (normative) reference to MOS? Or is that supposed to be so well known that it is not even necessary?

3) Sec 3.1: maybe spell out PSNR on first occurrence. And would it make sense to provide a reference for PSNR?

4) Sec 3.2: “ The weights used by the dump_pnsrhvs.c tool in
  the Daala repository have been found to be the best match to real MOS
  scores.”
Maybe document these weights in this document as well…?

5) Sec 5.3: Maybe spell out CQP at first occurrence
2019-06-05
08 Mirja Kühlewind Ballot comment text updated for Mirja Kühlewind
2019-06-05
08 Mirja Kühlewind
[Ballot comment]
Update: This document has no security considerations section, while having this section is required.

This document reads more like a user manual of the Daala tools repository (together with the test sequences). I wonder why this is not simply archived within the repo? What’s the benefit of having this in an RFC? I’m especially worried that this document is basically useless if the repo and test sequences disappear, and are therefore no longer available in the future, or change significantly. I understand that this is referenced by AOM and therefore publication is desired; however, I don't think that makes my concern about the standalone usefulness of this document invalid. If you really want to publish in the RFC series, I would recommend reducing the dependencies on these repos and trying to make this document more useful as a standalone test description (which would probably mean removing most of section 4 and adding some additional information to other parts).

Also, the shepherd write-up seems to indicate that this document has an IPR disclosure that was filed after WG last call. Is the wg aware of this? Has this been discussed in the wg?

Other more concrete comments:
1) Quick question on 2.1: Is the tester supposed to view one image after the other or both at the same time? And if one after the other, could the order impact the results (and should it therefore be randomly chosen)?

2) Sec 2.3: Would it make sense to provide a (normative) reference to MOS? Or is that supposed to be so well known that it is not even necessary?

3) Sec 3.1: maybe spell out PSNR on first occurrence. And would it make sense to provide a reference for PSNR?

4) Sec 3.2: “ The weights used by the dump_pnsrhvs.c tool in
  the Daala repository have been found to be the best match to real MOS
  scores.”
Maybe document these weights in this document as well…?

5) Sec 5.3: Maybe spell out CQP at first occurrence
2019-06-05
08 Mirja Kühlewind Ballot comment text updated for Mirja Kühlewind
2019-06-05
08 Mirja Kühlewind
[Ballot comment]
This document reads more like a user manual of the Daala tools repository (together with the test sequences). I wonder why this is not simply archived within the repo? What’s the benefit of having this in an RFC? I’m especially worried that this document is basically useless if the repo and test sequences disappear, and are therefore no longer available in the future, or change significantly. I understand that this is referenced by AOM and therefore publication is desired; however, I don't think that makes my concern about the standalone usefulness of this document invalid. If you really want to publish in the RFC series, I would recommend reducing the dependencies on these repos and trying to make this document more useful as a standalone test description (which would probably mean removing most of section 4 and adding some additional information to other parts).

Also, the shepherd write-up seems to indicate that this document has an IPR disclosure that was filed after WG last call. Is the wg aware of this? Has this been discussed in the wg?

Other more concrete comments:
1) Quick question on 2.1: Is the tester supposed to view one image after the other or both at the same time? And if one after the other, could the order impact the results (and should it therefore be randomly chosen)?

2) Sec 2.3: Would it make sense to provide a (normative) reference to MOS? Or is that supposed to be so well known that it is not even necessary?

3) Sec 3.1: maybe spell out PSNR on first occurrence. And would it make sense to provide a reference for PSNR?

4) Sec 3.2: “ The weights used by the dump_pnsrhvs.c tool in
  the Daala repository have been found to be the best match to real MOS
  scores.”
Maybe document these weights in this document as well…?

5) Sec 5.3: Maybe spell out CQP at first occurrence
2019-06-05
08 Mirja Kühlewind [Ballot Position Update] New position, Abstain, has been recorded for Mirja Kühlewind
2019-06-05
08 Amy Vezza Placed on agenda for telechat - 2019-06-13
2019-06-04
08 Adam Roach IESG state changed to IESG Evaluation from Waiting for Writeup
2019-06-04
08 Adam Roach Ballot has been issued
2019-06-04
08 Adam Roach [Ballot Position Update] New position, Yes, has been recorded for Adam Roach
2019-06-04
08 Adam Roach Created "Approve" ballot
2019-06-04
08 Adam Roach Ballot writeup was changed
2019-06-04
08 (System) IESG state changed to Waiting for Writeup from In Last Call
2019-06-03
08 (System) IANA Review state changed to IANA OK - No Actions Needed from IANA - Review Needed
2019-06-03
08 Sabrina Tanamal
(Via drafts-lastcall@iana.org): IESG/Authors/WG Chairs:

The IANA Functions Operator has reviewed draft-ietf-netvc-testing-08, which is currently in Last Call, and has the following comments:

The IANA Functions Operator notes that this document does not contain a standard IANA Considerations section. After examining the draft, we understand that, upon approval of this document, there are no IANA Actions that need completion.

If this assessment is not accurate, please respond as soon as possible.

Thank you,

Sabrina Tanamal
Senior IANA Services Specialist
2019-06-03
08 Francesca Palombini Request for Last Call review by GENART Completed: Ready with Nits. Reviewer: Francesca Palombini. Sent review to list.
2019-05-23
08 Tero Kivinen Request for Last Call review by SECDIR is assigned to Alan DeKok
2019-05-23
08 Tero Kivinen Request for Last Call review by SECDIR is assigned to Alan DeKok
2019-05-21
08 Jean Mahoney Request for Last Call review by GENART is assigned to Francesca Palombini
2019-05-21
08 Jean Mahoney Request for Last Call review by GENART is assigned to Francesca Palombini
2019-05-21
08 Amy Vezza IANA Review state changed to IANA - Review Needed
2019-05-21
08 Amy Vezza
The following Last Call announcement was sent out (ends 2019-06-04):

From: The IESG
To: IETF-Announce
CC: video-codec@ietf.org, draft-ietf-netvc-testing@ietf.org, netvc-chairs@ietf.org, Matthew Miller , adam@nostrum.com, linuxwolf+ietf@outer-planes.net
Reply-To: ietf@ietf.org
Sender:
Subject: Last Call:  (Video Codec Testing and Quality Measurement) to Informational RFC


The IESG has received a request from the Internet Video Codec WG (netvc) to
consider the following document: 'Video Codec Testing and Quality
Measurement' as an Informational RFC.

The IESG plans to make a decision in the next few weeks, and solicits final
comments on this action. Please send substantive comments to the
ietf@ietf.org mailing lists by 2019-06-04. Exceptionally, comments may be
sent to iesg@ietf.org instead. In either case, please retain the beginning of
the Subject line to allow automated sorting.

Abstract


  This document describes guidelines and procedures for evaluating a
  video codec.  This covers subjective and objective tests, test
  conditions, and materials used for the test.




The file can be obtained via
https://datatracker.ietf.org/doc/draft-ietf-netvc-testing/

IESG discussion can be tracked via
https://datatracker.ietf.org/doc/draft-ietf-netvc-testing/ballot/

The following IPR Declarations may be related to this I-D:

  https://datatracker.ietf.org/ipr/3392/
  https://datatracker.ietf.org/ipr/3393/
  https://datatracker.ietf.org/ipr/3394/
  https://datatracker.ietf.org/ipr/3389/
  https://datatracker.ietf.org/ipr/3390/
  https://datatracker.ietf.org/ipr/3391/





2019-05-21
08 Amy Vezza IESG state changed to In Last Call from Last Call Requested
2019-05-21
08 Amy Vezza Last call announcement was changed
2019-05-20
08 Adam Roach Last call was requested
2019-05-20
08 Adam Roach Ballot approval text was generated
2019-05-20
08 Adam Roach Ballot writeup was generated
2019-05-20
08 Adam Roach IESG state changed to Last Call Requested from Publication Requested
2019-05-20
08 Adam Roach Last call announcement was generated
2019-05-20
08 Adam Roach Changed consensus to Yes from Unknown
2019-01-25
08 Matthew Miller
This document provides guidance and procedures for testing video
codecs, to evaluate their quality and performance.  This document
is to be published as an Informational RFC as it describes
methods for implementers of video codecs to compare against known
baselines for objective and subjective quality as well as measurements
for encoding/decoding performance.

Matthew A. Miller is the document shepherd, Adam Roach is the
responsible AD.

## Review and Consensus

The document has been reviewed both within the NETVC working group
and the Alliance for Open Media (AOM).  While there were no
comments at all during Working Group Last Call, consensus is
demonstrated by the fact that AOM uses this document to evaluate
the readiness of AV1 implementations.

## IPR

Andrey Norkin disclosed IPR held by Netflix that he was aware of
when asked for his acknowledgement in accordance with BCP 78 and
79.  It had not been filed with the IETF disclosure tools prior to
this writeup starting, so the shepherd filed it all as
third-party disclosures.  Andrey noted to the shepherd that Netflix
licenses all patents using an Apache Public License.

The other authors have declared they are not aware of any IPR
claims over the procedures described in this document, in
accordance with BCP 78 and 79.

Further, this document has numerous references to video samples
critical to providing a baseline for subjective quality testing.
At least two of the authors have acknowledged that these video
samples carry a BSD-like license that allows open and
royalty-free use by implementers for testing.

## Other Points

This document is describing procedures and guidelines for how to
evaluate the performance and quality of video codecs; no Security
Considerations or IANA Considerations apply.

A number of references within the document are stale; it is
expected those references will be corrected as part of the RFC
publication process, if not addressed beforehand.

The document has expired numerous times, but without significant
changes.  The delays are largely around gathering acknowledgements
from authors.  The publication is still worthwhile as it is
already in use by AV1 codec implementers.
2019-01-25
08 Matthew Miller Responsible AD changed to Adam Roach
2019-01-25
08 Matthew Miller IETF WG state changed to Submitted to IESG for Publication from WG Consensus: Waiting for Write-Up
2019-01-25
08 Matthew Miller IESG state changed to Publication Requested from I-D Exists
2019-01-25
08 Matthew Miller IESG process started in state Publication Requested
2019-01-25
08 Matthew Miller Intended Status changed to Informational from None
2019-01-25
08 Thomas Daede New version available: draft-ietf-netvc-testing-08.txt
2019-01-25
08 (System) New version approved
2019-01-25
08 (System) Request for posting confirmation emailed to previous authors: Andrey Norkin , Thomas Daede , Ilya Brailovskiy
2019-01-25
08 Thomas Daede Uploaded new revision
2019-01-22
Jenny Bui Posted related IPR disclosure: Matthew A. Miller's Statement about IPR related to draft-ietf-netvc-testing belonging to Netflix
2019-01-22
Jenny Bui Posted related IPR disclosure: Matthew A. Miller's Statement about IPR related to draft-ietf-netvc-testing belonging to Netflix
2019-01-22
Jenny Bui Posted related IPR disclosure: Matthew A. Miller's Statement about IPR related to draft-ietf-netvc-testing belonging to Netflix
2019-01-22
Jenny Bui Posted related IPR disclosure: Matthew A. Miller's Statement about IPR related to draft-ietf-netvc-testing belonging to Netflix
2019-01-22
Jenny Bui Posted related IPR disclosure: Matthew A. Miller's Statement about IPR related to draft-ietf-netvc-testing belonging to Netflix
2019-01-22
Jenny Bui Posted related IPR disclosure: Matthew A. Miller's Statement about IPR related to draft-ietf-netvc-testing belonging to Netflix
2019-01-21
07 Matthew Miller
This document provides guidance and procedures for testing video
codecs, to evaluate their quality and performance.  This document
is to be published as an Informational RFC as it describes
methods for implementers of video codecs to compare against known
baselines for objective and subjective quality as well as measurements
for encoding/decoding performance.

Matthew A. Miller is the document shepherd, Adam Roach is the
responsible AD.

## Review and Consensus

The document has been reviewed both within the NETVC working group
and the Alliance for Open Media (AOM).  While there were no
comments at all during Working Group Last Call, consensus is
demonstrated by the fact that AOM uses this document to evaluate
the readiness of AV1 implementations.

## IPR

Andrey Norkin disclosed IPR held by Netflix that he was aware of
when asked for his acknowledgement in accordance with BCP 78 and
79.  It had not been filed with the IETF disclosure tools prior to
this writeup starting, so the shepherd filed it all as
third-party disclosures.  Andrey noted to the shepherd that Netflix
licenses all patents using an Apache Public License.

The other authors have declared they are not aware of any IPR
claims over the procedures described in this document, in
accordance with BCP 78 and 79.

Further, this document has numerous references to video samples
critical to providing a baseline for subjective quality testing.
At least two of the authors have acknowledged that these video
samples carry a BSD-like license that allows open and
royalty-free use by implementers for testing.

## Other Points

This document is describing procedures and guidelines for how to
evaluate the performance and quality of video codecs; no Security
Considerations or IANA Considerations apply.

A number of references within the document are stale; it is
expected those references will be corrected as part of the RFC
publication process, if not addressed beforehand.

The document has expired numerous times, but without significant
changes.  The delays are largely around gathering acknowledgements
from authors.  The publication is still worthwhile as it is
already in use by AV1 codec implementers.
2019-01-03
07 Matthew Miller IETF WG state changed to WG Consensus: Waiting for Write-Up from In WG Last Call
2019-01-03
07 (System) Document has expired
2018-07-19
07 Matthew Miller Notification list changed to Matthew Miller <linuxwolf+ietf@outer-planes.net>
2018-07-19
07 Matthew Miller Document shepherd changed to Matthew A. Miller
2018-07-19
07 Matthew Miller IETF WG state changed to In WG Last Call from WG Document
2018-07-02
07 Thomas Daede New version available: draft-ietf-netvc-testing-07.txt
2018-07-02
07 (System) New version approved
2018-07-02
07 (System) Request for posting confirmation emailed to previous authors: Andrey Norkin , Thomas Daede , Ilya Brailovskiy
2018-07-02
07 Thomas Daede Uploaded new revision
2018-05-03
06 (System) Document has expired
2017-10-30
06 Thomas Daede New version available: draft-ietf-netvc-testing-06.txt
2017-10-30
06 (System) New version approved
2017-10-30
06 (System) Request for posting confirmation emailed to previous authors: Andrey Norkin , Thomas Daede , Ilya Brailovskiy
2017-10-30
06 Thomas Daede Uploaded new revision
2017-09-28
05 (System) Document has expired
2017-03-27
05 Thomas Daede New version available: draft-ietf-netvc-testing-05.txt
2017-03-27
05 (System) New version approved
2017-03-27
05 (System) Request for posting confirmation emailed to previous authors: Thomas Daede , Andrey Norkin , Ilya Brailovskiy
2017-03-27
05 Thomas Daede Uploaded new revision
2016-10-31
04 Thomas Daede New version available: draft-ietf-netvc-testing-04.txt
2016-10-31
04 (System) New version approved
2016-10-31
03 (System) Request for posting confirmation emailed to previous authors: "Thomas Daede" , "Andrey Norkin" , "Ilya Brailovskiy"
2016-10-31
03 Thomas Daede Uploaded new revision
2016-10-31
03 Adam Roach Added to session: IETF-97: netvc  Tue-1550
2016-07-08
03 Thomas Daede New version available: draft-ietf-netvc-testing-03.txt
2016-03-31
02 Adam Roach Added to session: IETF-95: netvc  Thu-1000
2016-03-15
02 Thomas Daede New version available: draft-ietf-netvc-testing-02.txt
2016-02-29
01 Thomas Daede New version available: draft-ietf-netvc-testing-01.txt
2015-11-30
00 Adam Roach This document now replaces draft-daede-netvc-testing instead of None
2015-11-30
00 Thomas Daede New version available: draft-ietf-netvc-testing-00.txt