Conference item

Universal in-context approximation by prompting fully recurrent models

Abstract:

Zero-shot and in-context learning enable solving tasks without model fine-tuning, making them essential for developing generative model solutions. Therefore, it is crucial to understand whether a pretrained model can be prompted to approximate any function, i.e., whether it is a universal in-context approximator. While it was recently shown that transformer models do possess this property, these results rely on their attention mechanism. Hence, these findings do not apply to fully recurrent a...

Publication status: Accepted
Peer review status: Peer reviewed

Authors

Author 1:
Institution: University of Oxford
Division: MPLS
Department: Engineering Science
Role: Author

Author 2:
Institution: University of Oxford
Division: MPLS
Department: Engineering Science
Role: Author

Author 3:
Institution: University of Oxford
Division: MPLS
Department: Engineering Science
Role: Author

Author 4:
Institution: University of Oxford
Division: MPLS
Department: Engineering Science
Role: Author
ORCID: 0009-0006-0259-5732

Author 5:
Institution: University of Oxford
Division: MPLS
Department: Engineering Science
Role: Author
Funder

Name: Engineering and Physical Sciences Research Council
Funder identifier: https://ror.org/0439y7842
Grant: EP/W002981/1
Acceptance date: 2024-05-02
Event title: 41st International Conference on Machine Learning (ICML 2024)
Event location: Vienna, Austria
Event website: https://icml.cc/
Event start date: 2024-07-21
Event end date: 2024-07-27
Language: English
Pubs id: 2013220
Local pid: pubs:2013220
Deposit date: 2024-07-09
