Hacker News
Mehvix
3 days ago | on: 'Attention is all you need' coauthor says he's 'si...'
> None of that is specialized to run only transformers at this point

isn't this what [etched](https://www.etched.com/) is doing?
imtringued
3 days ago
Only being able to run transformers is a silly concept, because attention consists of two matrix multiplications, which are the standard operation in feed-forward and convolutional layers. Basically, you get transformers for free.
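The "two matrix multiplications" point can be sketched in a few lines of NumPy (a generic scaled dot-product attention, not any particular chip's kernel — the function and variable names here are illustrative, not from the thread):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scaled dot-product attention: matmul #1 (Q K^T), a softmax,
    # then matmul #2 (weights V). The matmuls are the same primitive
    # used by feed-forward and (im2col'd) convolutional layers.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # matmul #1
    weights = softmax(scores, axis=-1)   # rows sum to 1
    return weights @ V                   # matmul #2

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Everything besides the softmax is a plain GEMM, which is the sense in which generic matmul hardware "gets transformers for free".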
kadushka
3 days ago
devil is in the details