Knowledge-aware Zero-shot Learning (K-ZSL):
Concepts, Methods and Resources

Tutorial of ISWC 2022 (Online)
October 23, 2022

See the new K-ZSL tutorial in IJCAI'23

Description

Zero-shot Learning (ZSL), which enables machine learning models to predict new targets without seeing their training samples, has attracted wide research interest in many communities such as computer vision (CV) and natural language processing (NLP). An effective solution is to use external knowledge such as text, attribute descriptions and Knowledge Graphs (KGs) to bridge the gap between the targets that have training samples and the targets that do not. This tutorial introduces ZSL from the perspective of external knowledge, especially KGs. We will first present the background of KGs and ZSL, then give an overall picture of KG-aware ZSL and some representative paradigms with case studies, and finally provide a hands-on session on benchmarks and code. This tutorial is at an intermediate level.
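To make the core idea concrete, below is a minimal, hypothetical sketch of the mapping-based paradigm covered in the tutorial: visual features are mapped into a class semantic space (here, made-up attribute-style vectors standing in for KG or attribute embeddings), and an unseen class is predicted by nearest-neighbour search. All class names, vectors and dimensions are toy assumptions for illustration, not from any benchmark.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy class "semantic vectors" (stand-ins for attribute or KG embeddings).
class_semantics = {
    "horse": np.array([1.0, 0.0, 1.0]),   # seen class
    "tiger": np.array([0.0, 1.0, 1.0]),   # seen class
    "zebra": np.array([1.0, 1.0, 1.0]),   # unseen: no training images
}
seen, unseen = ["horse", "tiger"], ["zebra"]

# Pretend the visual features of training images are their class semantics
# plus noise (a simplification so the example stays self-contained).
X_train = np.stack([class_semantics[c] for c in seen for _ in range(50)])
X_train += 0.1 * rng.standard_normal(X_train.shape)
S_train = np.stack([class_semantics[c] for c in seen for _ in range(50)])

# Learn a linear map W from visual space to semantic space by least squares,
# using only the seen classes.
W, *_ = np.linalg.lstsq(X_train, S_train, rcond=None)

def predict(x, candidates):
    """Project a visual feature into semantic space; return the nearest class."""
    proj = x @ W
    return min(candidates, key=lambda c: np.linalg.norm(proj - class_semantics[c]))

# A "zebra" image can be classified even though no zebra image was used
# in training, because its semantics relate it to the seen classes.
x_test = class_semantics["zebra"] + 0.1 * rng.standard_normal(3)
print(predict(x_test, seen + unseen))
```

The same nearest-neighbour step underlies many mapping-based methods; what varies is how the class semantics are built (attributes, text, KG embeddings) and how the mapping is learned.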

Learning outcome

Through this tutorial, we expect attendees to gain an overall picture of ZSL, learn the different paradigms and typical methods of KG-based ZSL, and acquire the skills to evaluate KG-based ZSL methods with typical benchmarks for image classification and inductive KG completion.
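As a small taste of the benchmarking skills covered in Part III, the sketch below computes macro-averaged (per-class) top-1 accuracy, a metric commonly used in zero-shot image classification so that classes with many test images do not dominate the score. The labels are invented for illustration; concrete benchmarks and evaluation protocols are introduced in the hands-on session.

```python
import numpy as np

def per_class_accuracy(y_true, y_pred):
    """Macro-averaged top-1 accuracy: the mean of each class's accuracy.

    Unlike plain accuracy, this weights every class equally, which matters
    when unseen classes have very different numbers of test images.
    """
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    classes = np.unique(y_true)
    accs = [np.mean(y_pred[y_true == c] == c) for c in classes]
    return float(np.mean(accs))

# Toy predictions: "zebra" is right 2 of 3 times, "okapi" 1 of 1.
y_true = ["zebra", "zebra", "zebra", "okapi"]
y_pred = ["zebra", "zebra", "horse", "okapi"]
print(per_class_accuracy(y_true, y_pred))  # (2/3 + 1/1) / 2 ≈ 0.833
```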

Schedule

Length    Content                                          Speaker

Part I - Introduction and Background
25 mins   ZSL definitions and concepts                     Jiaoyan Chen
15 mins   An introduction to KGs                           Jiaoyan Chen
15 mins   A brief review on knowledge-aware ZSL            Jiaoyan Chen
10 mins   Break

Part II - KG-aware ZSL Method
20 mins   Mapping-based paradigm                           Yuxia Geng
20 mins   Data augmentation-based paradigm                 Yuxia Geng
20 mins   Propagation-based paradigm                       Yuxia Geng
20 mins   Feature fusion paradigm                          Jiaoyan Chen
20 mins   Explanation and ZSL                              Jiaoyan Chen
10 mins   Break

Part III - Hands-on: Resources and Benchmarking
15 mins   A brief review on evaluation resources           Jiaoyan Chen
15 mins   Benchmarking zero-shot image classification      Yuxia Geng
15 mins   Benchmarking zero-shot KG completion             Yuxia Geng
10 mins   Resource demonstration                           Yufeng Huang
5 mins    Summary and discussion                           Huajun Chen

Material

The materials presented in the tutorial are as follows:

Presenters

Jiaoyan Chen is a Senior Researcher at the Department of Computer Science, University of Oxford, where he works in the Knowledge Representation and Reasoning Group. Over the past ten years, he has worked on KG construction and curation technologies, knowledge-aware machine learning (including KG-based low-resource learning, inductive KG completion, concept drift in ontology stream learning, and knowledge-augmented explanation), as well as the applications of these semantic and machine learning technologies.

Yuxia Geng is a PhD student in the Knowledge Engine Group at Zhejiang University, advised by Prof. Huajun Chen. Her research interests include inductive KG completion, KG-aware ZSL and neuro-symbolic integration. She has been contributing to knowledge-driven ZSL, with several papers published as a main author in WWW, ISWC, ACL, the Semantic Web Journal and IJCAI.

Yufeng Huang is an MSc student in the Knowledge Engine Group at Zhejiang University, advised by Prof. Huajun Chen. His research interests include KGs, ZSL and vision-language models. He is experienced in a wide range of machine learning and KG programming, and will mainly contribute to the hands-on session.

Huajun Chen is a full professor at the College of Computer Science and Technology, Zhejiang University. He has over 20 years of research experience on topics such as the Semantic Web, KGs, low-resource learning, Big Data and NLP. His research on KGs has won several international and national awards, including the Best Paper Award at ISWC 2006. Prof. Chen serves as an Area Editor of the Journal of Web Semantics and leads OpenKG, an open KG initiative launched in 2015. He has been teaching at Zhejiang University for over 15 years, offering courses such as Introduction to Knowledge Graph and Object-Oriented Programming.

References

1. Chen, J., Geng, Y., Chen, Z., Horrocks, I., Pan, J.Z., et al.: Knowledge-aware zero-shot learning: Survey and perspective. In: IJCAI (2021)
2. Geng, Y., Chen, J., Zhuang, X., Chen, Z., Pan, J.Z., Li, J., Yuan, Z., Chen, H.: Benchmarking knowledge-driven zero-shot learning. Journal of Web Semantics (2022)
3. Geng, Y., Chen, J., Chen, Z., Pan, J.Z., Ye, Z., Yuan, Z., Jia, Y., Chen, H.: OntoZSL: Ontology-enhanced zero-shot learning. In: WWW. pp. 3325-3336 (2021)
4. Wang, X., Ye, Y., Gupta, A.: Zero-shot recognition via semantic embeddings and knowledge graphs. In: CVPR. pp. 6857-6866 (2018)
5. Li, J., Wang, R., Zhang, N., Chen, H., et al.: Logic-guided semantic representation learning for zero-shot relation classification. In: COLING. pp. 2967-2978 (2020)
6. Geng, Y., Chen, J., Ye, Z., Zhang, W., Chen, H.: Explainable zero-shot learning via attentive graph convolutional network and knowledge graphs. Semantic Web (2021)
7. Chen, Z., Chen, J., Geng, Y., Pan, J.Z., Yuan, Z., Chen, H.: Zero-shot visual question answering using knowledge graph. In: ISWC. pp. 146-162 (2021)
8. Geng, Y., Chen, J., Zhang, W., Xu, Y., Chen, Z., Pan, J.Z., Huang, Y., Xiong, F., Chen, H.: Disentangled ontology embedding for zero-shot learning. In: KDD (2022)
9. Chen, J., Geng, Y., Chen, Z., Pan, J.Z., He, Y., Zhang, W., et al.: Low-resource learning with knowledge graphs: A comprehensive survey. arXiv preprint arXiv:2112.10006 (2021)
10. Geng, Y., Chen, J., Zhang, W., Pan, J.Z., Yang, M., Chen, H., Jiang, S.: Relational message passing for fully inductive knowledge graph completion. arXiv preprint arXiv:2210.03994 (2022)


Acknowledgement

We would like to thank all the researchers who participated in our work or contributed comments and discussion to our tutorial, including Jeff Z. Pan from the University of Edinburgh, Wen Zhang and Zhuo Chen from Zhejiang University, and Ian Horrocks from the University of Oxford, among others.