Sketch-based 3D Animal Fine-Grained Retrieval (SketchANIMAR)
SketchANIMAR Challenge
The main objective of this challenge is to promote artificial intelligence research on 3D animal models. The rapid development of 3D technologies has produced a remarkable number of 3D models. Therefore, 3D model retrieval has drawn significant attention and is beneficial to real-life applications such as video games, art, films, and virtual reality. Compared with searching for general 3D objects of a given category, fine-grained retrieval of 3D animal models is much more challenging due to the large variation in animal breeds and poses.
This track proposes a realistic and promising setting for fine-grained retrieval of 3D animal models: searching a dataset for relevant 3D animal models using sketch queries. It can help users quickly access 3D models through readily available drawings.
Tentative Schedule
January 2, 2023: Track announcement
January 25, 2023: Preliminary training data released [Download]
February 15, 2023: Updated training data released [Registration Required To Access] (Extended to February 20, 2023)
February 15, 2023: Testing data released [Registration Required To Access] (Extended to February 25, 2023)
March 10, 2023: Team registration deadline
March 15, 2023: Team submission deadline (together with 4-page brief reports)
March 18, 2023: Result notification
March 28, 2023: Track paper submitted for review at Computers and Graphics journal
June 1, 2023: SketchANIMAR dataset is publicly released for academic purposes only
Participant Information
Please contact the task organizers with any questions on these points.
Signing up: Fill in the registration form. (The registration link is not available; if you want to use the dataset, please send an email to ltnghia@fit.hcmus.edu.vn.)
Making your submission: To be announced (check the task readme)
Preparing your working notes paper: Instructions on preparing your working notes paper can be found in Submission Instructions.
Dataset Description
Our SketchANIMAR2023 dataset is structured as follows (a minimal loading sketch is given after the list):
3D Model References: 711 animal 3D models (<Model ID>.obj)
References.csv: list of <Model ID>
SketchQuery: 74 sketch images for training and 66 sketch images for testing (<Sketch Query ID>.jpg)
SketchQuery_Train.csv: list of <Sketch Query ID> for training
SketchQuery_GT_Train.csv: matching pairs of <Sketch Query ID> and <Model ID> for training
SketchQuery_Test.csv: list of <Sketch Query ID> for testing
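For reference, here is a minimal Python sketch for loading the dataset metadata. It assumes the directory and file names listed above, that the CSV files contain no header row, and that the data root is a local folder named SketchANIMAR2023; these are assumptions, so adjust them to match the released package.

import csv
from pathlib import Path

DATA_ROOT = Path("SketchANIMAR2023")  # hypothetical local folder name

def read_ids(csv_path):
    # One-column CSV of IDs (assumed to have no header row).
    with open(csv_path, newline="") as f:
        return [row[0].strip() for row in csv.reader(f) if row]

def read_gt(csv_path):
    # <Sketch Query ID>, <Model ID> pairs -> {query: [models]}.
    gt = {}
    with open(csv_path, newline="") as f:
        for row in csv.reader(f):
            if len(row) >= 2:
                gt.setdefault(row[0].strip(), []).append(row[1].strip())
    return gt

model_ids = read_ids(DATA_ROOT / "References.csv")
train_queries = read_ids(DATA_ROOT / "SketchQuery_Train.csv")
train_gt = read_gt(DATA_ROOT / "SketchQuery_GT_Train.csv")
test_queries = read_ids(DATA_ROOT / "SketchQuery_Test.csv")

# 3D models and sketch images are expected at, e.g.:
#   DATA_ROOT / "3D Model References" / (model_id + ".obj")
#   DATA_ROOT / "SketchQuery" / (sketch_id + ".jpg")
print(len(model_ids), "models,", len(train_queries), "train queries,", len(test_queries), "test queries")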
Submission Instructions
Participants are required to submit a CSV file named <Team Name>_SketchANIMAR2023.csv to CodaLab (https://codalab.lisn.upsaclay.fr/competitions/11073). The file must be compressed into submission.zip before being submitted to CodaLab. Each team is allowed a maximum of 25 submissions.
Given N models and Q queries, each row must list the retrieval results for one query in descending order of relevance. A sample .csv layout is shown below, followed by a short submission-writing sketch.
<Query ID 1>, <Model ID top-1>, <Model ID top-2>, <Model ID top-3>, ..., <Model ID top-N>
<Query ID 2>, <Model ID top-1>, <Model ID top-2>, <Model ID top-3>, ..., <Model ID top-N>
...
<Query ID Q>, <Model ID top-1>, <Model ID top-2>, <Model ID top-3>, ..., <Model ID top-N>
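The following is a minimal Python sketch that writes a submission in this format and compresses it for CodaLab. It assumes a dictionary rankings that maps each test query ID to a full ranking of all N model IDs produced by your retrieval method; the function and variable names are illustrative, not part of the official tooling.

import csv
import zipfile

def write_submission(rankings, team_name):
    # rankings: {query_id: [model_id_top1, model_id_top2, ..., model_id_topN]}
    csv_name = team_name + "_SketchANIMAR2023.csv"
    with open(csv_name, "w", newline="") as f:
        writer = csv.writer(f)
        for query_id, ranked_models in rankings.items():
            writer.writerow([query_id] + list(ranked_models))
    # CodaLab expects the CSV compressed into submission.zip.
    with zipfile.ZipFile("submission.zip", "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(csv_name)

# Example call (hypothetical IDs):
# write_submission({"Q001": ["M042", "M007", "M311"]}, "MyTeam")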
Each team also needs to send a working notes paper to the organizers (ltnghia@fit.hcmus.edu.vn) by the submission deadline with the email title "[SketchANIMAR2023] <Team Name> Working notes paper submission".
The working notes paper must be four pages long, in two-column IEEE format. You are allowed to add a fifth page containing only references. Your paper should cite the Challenge Overview paper written by the organizers, which contains all the necessary information on the challenge definition and the dataset. Therefore, you do not need to repeat the description of the challenge or the dataset; instead, you can devote the four pages exclusively to presenting the motivation for your approach, explaining your method, showing and analyzing your results, and giving an outlook on future work.
Evaluation Methodology
The metrics used for this track are listed below (see the sketch after this list for a reference computation):
Nearest Neighbor (NN): top-1 accuracy.
Precision-at-10 (P@10): ratio of relevant items in the top-10 returned results.
Normalized Discounted Cumulative Gain (NDCG).
Mean average precision (mAP).
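For a rough self-check before submitting, the sketch below computes these metrics for a single query under binary relevance. This is an assumption on our part rather than the official scorer; the organizers' NDCG formulation and tie handling may differ.

import math

def evaluate(ranked, relevant):
    # ranked: list of model IDs in predicted order; relevant: set of ground-truth model IDs.
    rel = [1 if m in relevant else 0 for m in ranked]

    nn = float(rel[0]) if rel else 0.0          # Nearest Neighbor (top-1 accuracy)
    p_at_10 = sum(rel[:10]) / 10.0              # Precision-at-10

    # Average precision for this query (mAP = mean of AP over all queries).
    hits, precisions = 0, []
    for i, r in enumerate(rel, start=1):
        if r:
            hits += 1
            precisions.append(hits / i)
    ap = sum(precisions) / max(len(relevant), 1)

    # NDCG with binary gains: DCG divided by the ideal DCG.
    dcg = sum(r / math.log2(i + 1) for i, r in enumerate(rel, start=1))
    idcg = sum(1 / math.log2(i + 1) for i in range(1, min(len(relevant), len(rel)) + 1))
    ndcg = dcg / idcg if idcg > 0 else 0.0

    return {"NN": nn, "P@10": p_at_10, "NDCG": ndcg, "AP": ap}

# Track-level scores are obtained by averaging each metric over all queries.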
Leaderboard
Public Test:
Private Test:
Organizers
Trung-Nghia Le, University of Science, VNU-HCM, Vietnam
Minh-Triet Tran, University of Science, VNU-HCM, Vietnam
Minh-Quan Le, University of Science, VNU-HCM, Vietnam
Nhat Hoang-Xuan, University of Science, VNU-HCM, Vietnam
Thang-Long Nguyen-Ho, University of Science, VNU-HCM, Vietnam
Trong-Thuan Nguyen, University of Science, VNU-HCM, Vietnam
Viet-Tham Huynh, University of Science, VNU-HCM, Vietnam
Mai-Khiem Tran, University of Science, VNU-HCM, Vietnam
Trong-Le Do, University of Science, VNU-HCM, Vietnam
Khanh-Duy Le, University of Science, VNU-HCM, Vietnam
Vinh-Tiep Nguyen, University of Information Technology, VNU-HCM, Vietnam
Tam V. Nguyen, University of Dayton, U.S.A.
Akihiro Sugimoto, National Institute of Informatics, Japan
References
Trung-Nghia Le, Tam V. Nguyen, Minh-Quan Le, Trong-Thuan Nguyen, Viet-Tham Huynh, Trong-Le Do, Khanh-Duy Le, Mai-Khiem Tran, Nhat Hoang-Xuan, Thang-Long Nguyen-Ho, Vinh-Tiep Nguyen, Nhat-Quynh Le-Pham, Huu-Phuc Pham, Trong-Vu Hoang, Quang-Binh Nguyen, Trong-Hieu Nguyen-Mau, Tuan-Luc Huynh, Thanh-Danh Le, Ngoc-Linh Nguyen-Ha, Tuong-Vy Truong-Thuy, Truong Hoai Phong, Tuong-Nghiem Diep, Khanh-Duy Ho, Xuan-Hieu Nguyen, Thien-Phuc Tran, Tuan-Anh Yang, Kim-Phat Tran, Nhu-Vinh Hoang, Minh-Quang Nguyen, Hoai-Danh Vo, Minh-Hoa Doan, Hai-Dang Nguyen, Akihiro Sugimoto, Minh-Triet Tran, "SketchANIMAR: Sketch-based 3D Animal Fine-Grained Retrieval", arXiv preprint arXiv:2304.05731, 2023. [PDF]
© Copyright Software Engineering Laboratory, University of Science, VNU-HCM, Vietnam