Zi Lin
Number of Papers: 10
Number of Citations: 0
First ACL Paper: 2018
Latest ACL Paper: 2023
Venues: ACL, CL, EMNLP, Findings-EMNLP, IJCNLP, WOAH, WS
Co-Authors:
Dan Roth
Di He
Fei Tian
Ian Kivlichan
Jeremiah Liu
Similar Authors:
Fabrice Evrard
Raja Ayed
Filip Van Aelten
Warren J Plath
Naim Terbeh
Retrieval-Augmented Parsing for Complex Graphs by Exploiting Structure and Uncertainty
Findings-EMNLP
Zi Lin, Quan Yuan, Panupong Pasupat, Jeremiah Liu, Jingbo Shang
ToxicChat: Unveiling Hidden Challenges of Toxicity Detection in Real-World User-AI Conversation
Findings-EMNLP
Zi Lin, Zihan Wang, Yongqi Tong, Yangkun Wang, Yuxin Guo, Yujia Wang, Jingbo Shang
Towards Collaborative Neural-Symbolic Graph Semantic Parsing via Uncertainty
Findings-ACL
Zi Lin, Jeremiah Zhe Liu, Jingbo Shang
Neural-Symbolic Inference for Robust Autoregressive Graph Parsing via Compositional Uncertainty Quantification
EMNLP
Zi Lin, Jeremiah Liu, Jingbo Shang
Measuring and Improving Model-Moderator Collaboration using Uncertainty Estimation
ACL-IJCNLP WOAH
Ian Kivlichan, Zi Lin, Jeremiah Liu, Lucy Vasserman
Comparing Knowledge-Intensive and Data-Intensive Models for English Resource Semantic Parsing
CL
Junjie Cao, Zi Lin, Weiwei Sun, Xiaojun Wan
Pruning Redundant Mappings in Transformer Models via Spectral-Normalized Identity Prior
Findings-EMNLP
Zi Lin, Jeremiah Liu, Zi Yang, Nan Hua, Dan Roth
Hint-Based Training for Non-Autoregressive Machine Translation
EMNLP
Zhuohan Li, Zi Lin, Di He, Fei Tian, Tao Qin, Liwei Wang, Tie-Yan Liu
Parsing Meaning Representations: Is Easier Always Better?
ACL WS
Zi Lin, Nianwen Xue
Semantic Role Labeling for Learner Chinese: the Importance of Syntactic Parsing and L2-L1 Parallel Data
EMNLP
Zi Lin, Yuguang Duan, Yuanyuan Zhao, Weiwei Sun, Xiaojun Wan