1. Challenge motivation and description
End-to-end image compression has been a research focus for both academia and industry for more than 5 years. A number of technologies have been developed, such as more expressive neural networks and more accurate probability estimation schemes for entropy coding. Recently, the performance of end-to-end image compression has surpassed that of H.266/Versatile Video Coding (VVC). To promote its practical use, we believe it is time to consider the complexities of end-to-end image compression schemes, especially the decoding complexity.
This challenge calls for the development of novel end-to-end image compression algorithms that achieve a good balance between performance and decoding complexity. There will be three tracks, differing in the weight placed on decoding complexity. Participants are required to compress all images defined in the Test Dataset. The actual bits per pixel (bpp) must not exceed a target bpp, which is set to the bpp of the test image coded by BPG with quantization parameter 28.
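The bpp constraint can be checked with simple arithmetic: the size of the bitstream in bits divided by the number of pixels in the image. The following sketch illustrates the check; the function names are hypothetical, not part of the official evaluation code.

```python
def bits_per_pixel(num_bytes: int, width: int, height: int) -> float:
    """Actual bpp = total bits in the bitstream / number of pixels."""
    return num_bytes * 8 / (width * height)

def within_target(num_bytes: int, width: int, height: int,
                  target_bpp: float) -> bool:
    """A submission is valid only if the actual bpp does not exceed the
    target bpp (the bpp of the same image coded by BPG at QP 28)."""
    return bits_per_pixel(num_bytes, width, height) <= target_bpp
```

For example, a 1,000,000-byte bitstream for a 3840x2160 image corresponds to roughly 0.96 bpp.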
2. Datasets
Training and Validation Dataset: A collection of about 1600 high-resolution images will be provided as the training and validation dataset. Participants are free to split the provided images into training and validation sets. Participants are also free to use other datasets for training and validation; if so, the extra datasets must be documented clearly.
Test Dataset: 20 4K-resolution images will be used for the evaluation. All images will be in the RGB color space and PNG file format. These images will be distributed to all participants before a certain date, and participants are required to compress them within 72 hours.
3. Evaluation metrics
The performance Q will be evaluated as a weighted combination of the PSNR improvement and the decoding time,
Q = w · ∆PSNR - dTime,
where PSNR is calculated as the average PSNR of the R, G, and B components. ∆PSNR is calculated by subtracting the PSNR of BPG from that of the proposed method. dTime is the time in seconds used for both entropy decoding and image reconstruction on a V100 GPU provided by the organizers; the submitted methods are therefore required to decode successfully on that GPU. w is set to 1, 10000, and 1000000 for the three tracks, respectively. The track with w = 1 favors the fastest decoders; the track with w = 10000 provides a good balance between complexity and performance; the track with w = 1000000 places more emphasis on compression performance. A winner will be announced for each track.
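The scoring can be illustrated as follows. This is a minimal sketch, not the official evaluation code: the function names are hypothetical, and averaging the three per-channel PSNRs is our reading of "the average PSNR of the R, G, and B components".

```python
import math

def psnr_rgb(mse_r: float, mse_g: float, mse_b: float,
             peak: float = 255.0) -> float:
    """Average of the per-channel PSNRs of the R, G, and B components."""
    def psnr(mse: float) -> float:
        return 10 * math.log10(peak ** 2 / mse)
    return (psnr(mse_r) + psnr(mse_g) + psnr(mse_b)) / 3

def challenge_score(delta_psnr: float, dtime_seconds: float,
                    w: float) -> float:
    """Q = w * dPSNR - dTime, with w = 1, 10000, or 1000000 per track."""
    return w * delta_psnr - dtime_seconds
```

For instance, a method that gains 0.5 dB over BPG but takes 2 seconds to decode scores 0.5 - 2 = -1.5 in the w = 1 track, but 10000 x 0.5 - 2 = 4998 in the w = 10000 track, showing how the track weight shifts the trade-off.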
4. Submission requirements
Participants are requested to submit a decoder along with a docker environment and the corresponding script that runs the decoder.
Participants are requested to submit the compressed bitstreams. The bitstreams shall be named like I01.bin.
Participants are requested to submit the decoded images. The decoded images shall be named like I01dec.png.
Participants may choose whether to submit a paper describing their end-to-end image compression scheme to the VCIP challenge session. Please follow the VCIP paper template to prepare your manuscript if you want to submit a paper.
5. Important dates
June 15, registration for the competition. Teams should send the team name, team members, and institution to firstname.lastname@example.org to register
June 15, release of the training and validation dataset
October 15, submission of the challenge paper manuscript
October 23, notification of the challenge paper acceptance
October 30, submission of the camera ready paper
November 15, submission of the decoder and docker environment
November 16, release of the Test Dataset
November 19, submission of the compressed bitstreams and decoded images
November 30, notification of winners and leaderboards
December 13-16, challenge session at the VCIP conference. The winners will receive winner certificates provided by the VCIP organizing committee. All teams can present their work at the conference.
6. Organizers
Li Li, University of Science and Technology of China
Chuanmin Jia, Peking University
For any inquiries, please email us at: email@example.com; firstname.lastname@example.org