NLPExplorer
Ting-Han Fan
Number of Papers: 5
Number of Citations: 0
First ACL Paper: 2023
Latest ACL Paper: 2024
Venues: NAACL, ACL, Findings-NAACL, Findings-EMNLP
Co-Authors:
Alexander Rudnicky
Li-Wei Chen
Peter Ramadge
Ta-Chung Chi
2024

Advancing Regular Language Reasoning in Linear Recurrent Neural Networks
NAACL
Ting-Han Fan | Ta-Chung Chi | Alexander Rudnicky

Attention Alignment and Flexible Positional Embeddings Improve Transformer Length Extrapolation
Findings-NAACL
Ta-Chung Chi | Ting-Han Fan | Alexander Rudnicky

2023

Transformer Working Memory Enables Regular Language Reasoning And Natural Language Length Extrapolation
Findings-EMNLP
Ta-Chung Chi | Ting-Han Fan | Alexander Rudnicky | Peter Ramadge

Latent Positional Information is in the Self-Attention Variance of Transformer Language Models Without Positional Embeddings
ACL
Ta-Chung Chi | Ting-Han Fan | Li-Wei Chen | Alexander Rudnicky | Peter Ramadge

Dissecting Transformer Length Extrapolation via the Lens of Receptive Field Analysis
ACL
Ta-Chung Chi | Ting-Han Fan | Alexander Rudnicky | Peter Ramadge