HTTP3.xhtml (2693B)


<article xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en" class="h-entry">
<a href="/articles/HTTP3"><h1>HTTP3</h1></a>
<p>First things first: Well done, this is the first article where I had to drop a character from the title to keep the title and the filename consistent.</p>
<p>I went to the HTTP/3 talk at FOSDEM, and it was quite interesting until I got reminded that the Web can't get its shit right: QUIC basically builds in tracking of how good your connection/browser/… is, hello fingerprinting.</p>
<p>So none of my computers will have support for HTTP/3 or QUIC. I run Gentoo and I have my own browser which reuses existing parts of the system; I wish other browsers would do the same, but I have no hope there.
At worst I will have a reduced implementation of the protocol (for example no 0-RTT "handshake") for compatibility if I get forced to use it.
But I don't see that happening, other than maybe for less pain with Google ReCaptcha (fuck your website if it's using it), as I still support HTTP/0.9 through HTTP/1.1, and HTTP/2 is only enabled on my HTTP server because nginx has support for it.</p>
<hr />
<p>If there is <em>one</em> thing to fix in your broken protocol it's the fact that <code>ETag</code> is also great at being a <a href="https://en.wikipedia.org/wiki/HTTP_ETag#Tracking_using_ETags">fucking tracker</a>, and <code>HTTP 304 Not Modified</code> has the same problem, so congrats, we get caching at the price of also getting tracked. And of course the lawsuits went against KISSmetrics and Hulu instead of browser vendors or protocol designers; if I had time for this shit (and any trust in the justice system) I would probably sue them, not the ones merely watching their logs.</p>
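<p>To make the mechanism concrete, here is a minimal sketch (Python, with a hypothetical handler name; not code I actually run anywhere) of how a server can hand out a per-visitor <code>ETag</code> and recognise it again on the next conditional request:</p>
<pre><code># Sketch: abusing ETag as a persistent per-visitor identifier.
# Hypothetical server, for illustration only.
import uuid
from http.server import BaseHTTPRequestHandler, HTTPServer

class TrackingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        tag = self.headers.get("If-None-Match")
        if tag:
            # Returning visitor: the browser sent back the unique tag
            # it was handed earlier, identifying it like a cookie would.
            self.log_message("seen again: %s", tag)
            self.send_response(304)  # Not Modified: cache "works", tracking too
            self.send_header("ETag", tag)
            self.end_headers()
            return
        # First visit: hand out a unique "validator" that is really an ID.
        tag = '"' + uuid.uuid4().hex + '"'
        body = b"tracked pixel"
        self.send_response(200)
        self.send_header("ETag", tag)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), TrackingHandler).serve_forever()
</code></pre>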
<p>The client should only do a <code>HEAD</code> to get new metadata and then handle its own side-effects. It's not tracking-proof, but it would at least mean tracking has to be done across multiple requests, with a risk of false positives (<code>HEAD</code> followed sometimes by <code>GET</code> is also used by some software for link previews), while currently you can basically be 100% sure because it's part of the protocol.</p>
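<p>A rough sketch of that flow (Python again, with a placeholder URL and a made-up cached value, purely to illustrate the idea):</p>
<pre><code># Sketch: fetch only metadata with HEAD, then decide locally whether
# a full GET is worth doing. Not a complete cache implementation.
import urllib.request

URL = "https://example.org/articles/HTTP3"  # placeholder URL
cached_last_modified = "Sun, 01 Mar 2020 02:00:00 GMT"  # from a previous fetch

head = urllib.request.Request(URL, method="HEAD")
with urllib.request.urlopen(head) as resp:
    last_modified = resp.headers.get("Last-Modified")

if last_modified and last_modified != cached_last_modified:
    # Only now fetch the body; correlating the two requests is the
    # server's problem, with false positives from software that also
    # does HEAD then GET for link previews.
    with urllib.request.urlopen(URL) as resp:
        body = resp.read()
</code></pre>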
<p>The solution adopted by most frontend folks for cache management was to put a hash into the filename, and in their case it's quite a good way to do it. But it should have gone into headers rather than into the filename, so that other folks could use it too and hashes/versions in filenames would become rarer, which would mean better caching.</p>
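<p>For reference, the filename trick boils down to something like this (hypothetical build-step sketch, file names made up); shipping the same digest in a response header instead would keep the filename stable:</p>
<pre><code># Sketch: the common frontend cache-busting trick, hashing an asset's
# content into its published filename. Hypothetical build step.
import hashlib
import shutil

src = "style.css"
with open(src, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()[:12]

# "style.css" becomes e.g. "style.3f7a9c01b2de.css"; any content change
# yields a new name, so the old one can be cached forever.
dst = f"style.{digest}.css"
shutil.copyfile(src, dst)

# The alternative: keep the filename stable and ship the digest in a
# response header instead, so non-frontend software could use it too.
</code></pre>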
<p><a href="https://queer.hacktivis.me/objects/62260db6-e278-46eb-ad79-97a7ad924320">Fediverse post for comments</a>, published on 2020-03-01T02:00:00Z, last updated on 2020-03-01T02:01:00Z</p>
</article>