arXiv
Open Access
2020
Challenges and Thrills of Legal Arguments
Anurag Pallaprolu
Radha Vaidya
Aditya Swaroop Attawar
Abstract
State-of-the-art attention-based models, mostly centered around the transformer architecture, solve the problem of sequence-to-sequence translation using so-called scaled dot-product attention. While this technique is highly effective for estimating inter-token attention, it does not answer the question of inter-sequence attention when we deal with conversation-like scenarios. We propose an extension, HumBERT, that attempts to perform continuous contextual argument generation using locally trained transformers.
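The scaled dot-product attention the abstract refers to is the standard transformer formulation, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. As context, a minimal NumPy sketch of that standard operation (not code from this paper; all names here are illustrative):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # numerically stable softmax over the key dimension
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 query tokens, key dimension d_k = 8
K = rng.normal(size=(6, 8))   # 6 key tokens
V = rng.normal(size=(6, 8))   # one value vector per key
out = scaled_dot_product_attention(Q, K, V)  # shape (4, 8)
```

Each output row is a convex combination of the value vectors, weighted by how strongly the corresponding query attends to each key; the √d_k scaling keeps the logits from saturating the softmax as dimensionality grows.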