> It's basically the same "exposure" as importing a random npm package, but it has the benefit of being explicit when you do it.
I'm not sure how this works in detail here, but with npm you at least get a chance to download packages, inspect them, and pin their versions if so desired. Importantly, that also gives you control over your transitive dependencies.
This seems more like the `curl | bash` school of package management.
The good thing about this is that you can effectively run a registry service that provides the same level of trust npm does, because at the end of the day trust in the registry is the only differentiator in this scenario: npm can just as well return malicious code.
AFAIK there is no option to allow a website to read and write arbitrary files anywhere on my hard drive, period. At most, a website can ask the user to select a file or offer one for download. In the future it might be granted a domain-specific folder.
That's not true here. If I'm running a web server, I'm going to need to give the app permission to read the files being served and access to the database. That's something that never happens in the browser.
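To be fair, Deno does let you scope those grants at launch time via permission flags rather than handing the process blanket access (the paths and host below are illustrative, not from any real setup):

```shell
# Grant read access only to the directory being served, and network
# access only to the database host:port -- everything else stays denied,
# and the process fails with a PermissionDenied error if it tries more.
deno run \
  --allow-read=./public \
  --allow-net=db.example.com:5432 \
  server.ts
```

It's still far more than a browser tab can do, but the grant is at least explicit and enumerable.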
The tl;dr is that Deno also gives you a chance to download and inspect packages, and then lock your dependencies. The import mechanism is different, but the tooling is good.
Edit: This is explained in more detail at https://deno.land/manual/linking_to_external_code and indeed seems a lot more sane.
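Concretely, the convention described there is to pin the version in the import URL, funnel all external imports through one file, and record hashes in a lock file (the module and version here are just an example):

```typescript
// deps.ts -- single place to pin every external URL import.
// The version is part of the URL, so re-exports elsewhere in the
// codebase never need to repeat it. A lock file generated with
//   deno cache --lock=lock.json --lock-write deps.ts
// records subresource hashes for the transitive dependency tree,
// and later runs can verify against it with --lock=lock.json.
export { serve } from "https://deno.land/std@0.140.0/http/server.ts";
```

That gets you roughly the same reproducibility story as a committed package-lock.json, just without a registry in the middle.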
> It's also exactly what the websites you visit do. ;)
Well yes, and it causes huge problems there already - see the whole mess we have with trackers and page bloat.