atoav
If you truly worry about the performance of your webpage, there are many, many places to optimize before ever considering the problematic CDN option. I suggest optimizing everything else first. The number of pages I have seen that load 3 fonts with 8 variants each while actually using one font with two variants is too damn high. Same for the number of people who don't know how to scale and compress images.

I know this is not how most sites operate these days, but consider that your visitor wants to visit you and get your website. Whenever you embed stuff from other servers you not only give away your users' data and breach their trust, you also double your attack surface and lower the reliability of your site. And for what exactly?

My suspicion is that developers find it easier to paste a CDN include than to download the file and include it themselves. Because performance my ass.

That's a bit like the cookie notice thing. Guess what: if you don't collect personal data and store it on your users' computers, you don't need to ask them for consent, and suddenly your site looks a lot cleaner and needs to deliver less data.

antifa
Regarding the alleged performance benefits of using a public CDN: if there ever was a cache hit, because a user visited one site and then yours, and both coincidentally used the same CDN and the exact same version of the library (pro tip: never happened), within a short enough window to avoid cache eviction, the user did not notice.

When the CDN was slow, every user noticed and thought your website was slow.

You gave away free analytics and made your website worse; there wasn't even a trade-off.

mrinfinitiesx
https://addons.mozilla.org/en-US/firefox/addon/localcdn-fork...

Supply chain attacks will cause catastrophic damage and massive internet problems one day, as they already have. A DDoS or outage hits the CDN/JS resource suppliers and websites come to a standstill. Why not host your own .js files? Lazy upkeep?

I don't have a solution for it all, but it's better to think about solutions now than after a massive hack has made them urgently needed.

flippy_flops
I’ve always wished a movie hacker would hack a CDN to take over every web page in the world. I don’t know what the real percentage would be, but it’d be more believable than a lot of the hacks in movies.
fieldcny
This isn’t news, this is exactly what we said when the don’t-be-evil company was pushing them in the name of page loading speed (b/c the faster they display an ad, the more time there is to sell another ad!)
linuxalien
Does anyone have tool suggestions for managing these dependencies for legacy applications that aren't set up for webpack and similar tools? Lots of legacy sites still use things like jQuery and jquery-ui, which don't work nicely in webpack without changing how your JS works. But I also don't want to be manually downloading the libraries and committing them to the repo.

Something like npm, but specifically for browser libraries, that can install the JS resources straight into a "public assets" folder. Bonus if it can create a JSON file with the paths to the libraries and the file hashes so we can reference them from server-side code. All my attempts at npm and webpack/parcel etc. fall apart with things like jquery-ui.

Edit: years ago Bower fit this requirement. I'm not sure it's fit for the current state of JS libraries now, though, and they seem to recommend moving away from it.
jauntywundrkind
Yeah, Storage Partitioning has made the shared-cache upside useless, as the article says. https://developers.google.com/privacy-sandbox/3pcd/storage-p... https://developer.mozilla.org/en-US/docs/Web/Privacy/State_P...

Still, I see the allure of having someone else CDN for you: they have a significantly better serving system than many of us, and it's traffic we don't have to serve.

A naive usage of cdn will have both information-leak problems and expose a security issue, as described in the article. But.

You can basically eliminate the information-leak problem by using a restrictive referrerPolicy (which can be set on a fetch or on a <script> tag). This quite effectively blinds the CDN to where specifically the traffic is coming from.
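
A minimal sketch of both forms (the CDN URL is a placeholder):

```javascript
// "no-referrer" strips the Referer header entirely, so the CDN
// can't see which page triggered the request.
const req = new Request("https://cdn.example.com/lib.js", {
  referrerPolicy: "no-referrer",
});
// Pass req to fetch(req) when you actually want the asset.

// The equivalent on a tag:
// <script src="https://cdn.example.com/lib.js"
//         referrerpolicy="no-referrer"></script>
```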

You can eliminate the security risk by specifying a subresource integrity hash for your assets. This prevents the CDN from serving a file different from the one you expect.
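
For example (a sketch assuming openssl is installed; the stand-in file keeps it self-contained, in practice you'd hash the real library you're pinning):

```shell
# Stand-in for the real library file you want to pin.
printf 'alert("hi");' > lib.js

# sha384, base64-encoded: the format browsers expect in `integrity`.
HASH=$(openssl dgst -sha384 -binary lib.js | openssl base64 -A)
echo "sha384-$HASH"

# Then pin it in the tag (URL is a placeholder):
# <script src="https://cdn.example.com/lib.js"
#         integrity="sha384-$HASH" crossorigin="anonymous"></script>
```

If the CDN ever serves bytes that don't match the hash, the browser refuses to execute them.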

croes
Isn't the same true for popular cloud services?

One security flaw and thousands are affected.

Maybe going back to on-prem would be better.