demo: http/2+ may make first page load faster (and more) #4
This is a good suggestion. It seems trivial to make express work with http2, but I'll study how to make websocket (currently using ws on the server side) work with http2. Also, the demo server is running behind http-proxy; I will upgrade it to http2-proxy to make the whole thing work.
I'm reading about http2. To my understanding, the benefit of http2 is that it allows the server to actively push resources (js/css) to the browser's cache upon the first GET request, allowing the whole download process to finish earlier. However, if we inline the script and style in the initial http response, wouldn't the benefit be minimal? Also, we need websocket to pass client events to the server and DOM update commands to the client, which doesn't seem to be supported by http2. If we need to fire another http request to upgrade to websocket, it may not be beneficial overall?
I prefer to use websocket to pass client events instead of using ajax, to save the overhead of an http request.
For the first request, inlining style and script may be beneficial, but for subsequent visits, having the styles and scripts in separate files may be more cache-friendly (and able to leverage the benefits of http2?).
thanks for looking into the details! yes, you are right: one main advantage of http2 is server push. imho there should be two other advantages from which your ts-liveview would benefit, but only on first page load:
when looking into the future: http3 should be the winner...
I would prefer separate files, since in the real world there is also css and maybe some other js file for animations, or at least for swapping the very light "preview image" for the full images once these have finished loading in the background...
just stumbled on this detailed answer to the question:
The stackoverflow post also talked about server-sent events. SSE looks good, but it has a per-domain limitation (at most 6 connections), so it won't work well if multiple tabs are open. But it seems this limitation is not imposed on http2 connections?
If SSE is usable, it may be preferable, because it is more reliable. With websocket, we need to manually detect and resend dropped messages.
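To make the "manually detect and resend" point concrete, here is a minimal sketch (hypothetical names, not from the project) of the sequence-numbered, ack-based bookkeeping a websocket layer would need on top of `ws`:

```javascript
// Tracks outgoing messages by sequence number so that, after a
// reconnect, anything the peer never acknowledged can be replayed.
class ReliableOutbox {
  constructor() {
    this.seq = 0;
    this.pending = new Map(); // seq -> message, kept until acked
  }
  send(payload, transport) {
    const msg = { seq: ++this.seq, payload };
    this.pending.set(msg.seq, msg);
    transport(msg); // e.g. msg => ws.send(JSON.stringify(msg))
    return msg.seq;
  }
  ack(seq) {
    // peer confirms it has received everything up to `seq`
    for (const key of this.pending.keys()) {
      if (key <= seq) this.pending.delete(key);
    }
  }
  resend(transport) {
    // after reconnect, replay everything not yet acknowledged
    for (const msg of this.pending.values()) transport(msg);
  }
}
```

SSE gets the equivalent behaviour for free from `EventSource` reconnection plus the `Last-Event-ID` header, which is the reliability advantage referred to above.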
hmm... https://caniuse.com/websockets
if one really thinks about swapping WS for another tech: https://www.cloudflare.com/learning/performance/what-is-http3/
maybe not yet, but soon ;-)
http/2 in node.js:
in this 3-part series some good details are described, not only about http/3 as the title suggests, but also (indirectly) about http/2: https://www.smashingmagazine.com/2021/09/http3-practical-deployment-options-part3/
last one: great technical detail about the benefits of each http version (1/2/3):
really the last comment ;-) DEMO: one of the fastest sites on first load I could find (the site is not light; using a fast connection, same as for #4 (comment)):
https://pulsar.apache.org/case-studies/ (of course, measured latency also depends on the distance between browser and server/datacenter)
sorry for spamming (maybe move from issues to discussions?). Regarding using http/3 with a "fallback" to http/2 to support all browsers: and a minimal server config for even further clarification (yes, it's for nginx and old, but the main principle should stay the same...): => with this in mind, it may not be too early to think about http/3 :-)
Thanks for the updates. It seems safari doesn't support http3 yet. The example on how to push static content with express is helpful.

With http2, the server actively pushes the static resources (css/js) to the browser's cache, which seems to give similar performance to inlining the styles and scripts in the http response, so the overall performance for the initial page load should be similar? For subsequent routing, when the user switches to another view, part of the dom should be updated, and the new view will require new styles. In that case, if the dom update command is sent over websocket while the styles are pushed over the earlier http2 connection, the setup seems trickier than keeping the styles inline with the rest of the dom elements (in json over ws). The server push behaviour may be beneficial for pushing images referenced by img elements, though. In this case, maybe it's better to run the server with multiple versions of http at the same time and upgrade the connection when supported [1].

Thanks again for the links. It seems switching to http3 (quic over UDP) would be beneficial in terms of performance even when we do not leverage the server push feature, because with QUIC we get shorter handshaking overhead, and lost packets are resent at a lower layer (hence earlier, when needed) [2].

Also, it seems great that the QUIC server doesn't have to listen on port 443 when the http1/http2 server responds with the header
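On that last point: the response header alluded to is presumably `Alt-Svc` (an assumption from context; the comment leaves the name out). It lets an HTTP/1.1 or HTTP/2 response advertise an HTTP/3 endpoint on port 443 or any other port, for example:

```http
Alt-Svc: h3=":8443"; ma=86400
```

A browser that supports HTTP/3 can retry subsequent requests over QUIC on the advertised port (here 8443, cached for 86400 seconds), while other browsers simply ignore the header.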
good to read that there were some interesting points included :-) just some advertising for 3 tiny tools:
https://addons.mozilla.org/de/firefox/addon/indicatetls/ https://addons.mozilla.org/de/firefox/addon/http2-indicator/ https://addons.mozilla.org/de/firefox/addon/websocket-detector/
Thanks for sharing the tools. I was not aware the linked image was using an older TLS version. I'll update the link to use an https image proxy like https://images.weserv.nl
just another thought on relying on 2 different connection types (http and ws): for ws and security, this one was interesting: WebSocket Security: Top 8 Vulnerabilities and How to Solve Them https://brightsec.com/blog/websocket-security-top-vulnerabilities/
I'm considering sse over http2 vs ws. When using sse (server-sent events) over http2, it seems there can be at most 100 connections among all the tabs, which seems plenty. The EventSource in the browser will auto-reconnect when the network fails, and automatically ask for the events missed between two connections. Even if we need to fall back to http 1 for some browsers, we can work around the 6-connection limit with the storage event [1]. However, it may incur more latency on interaction, since sse doesn't support sending messages from the client side (hence we will need ajax to send events from client to server, with additional http headers in the request and response). [1] https://fastmail.blog/historical/inter-tab-communication-using-local-storage/
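As an illustration of why SSE resumption is comparatively cheap (hypothetical helper, not from the project): the `text/event-stream` wire format carries an `id:` field per event, and after an auto-reconnect the browser resends the last seen id in the `Last-Event-ID` request header, so the server can replay what was missed:

```javascript
// Produces one event in the text/event-stream wire format; the id:
// line is what EventSource echoes back via the Last-Event-ID header
// when it reconnects after a network failure.
function formatSseEvent(id, data) {
  return `id: ${id}\ndata: ${JSON.stringify(data)}\n\n`;
}

// Server side (sketch):
//   res.writeHead(200, { 'content-type': 'text/event-stream' });
//   res.write(formatSseEvent(nextId++, { selector: '#app', html: '...' }));
```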
as suggested in #4 (comment), not using the images.weserv.nl image proxy, because it seems the proxy doesn't support gif, and inlining the resource should load faster, as it requires the browser to do https handshaking with fewer parties
very interesting details and background. I'm not sure about the fallback and workaround described in [1] from 2012. Modern browsers (2022) may throttle background tabs/connections aggressively to save battery on mobile devices; this could break the master-tab solution suggested in [1], or at least add complexity, since these things depend highly on the browser type and could change over time... Regarding latency, one may need a test... of course there was a great reason why ws was chosen by Phoenix LiveView (and you): latency, but also some others..
hmm, everything is a compromise :-)
just looked through the linked sources again: http/2 seems to support full bidirectional streaming, so there seem to be no latency disadvantages (no need for ajax). As far as I get it, one needs to use:
Details:
...
this and more details in: edit:
It would be very interesting if we can do bidirectional streaming with http2 (fetch push from client, event source push from server, or a streaming response to the previous fetch). I'm following the demo on https://web.dev/fetch-upload-streaming. If it really works, performance will improve and the security part will be easier to cater for!
Update: I cannot get the client-to-server stream to work with http1/http2 yet; the body sent from Firefox and Chrome appears to be a stringified object. Maybe it needs to be multiple ajax requests instead of a single streaming ajax at the moment?
hmm
could you open a stream after the connection is established, starting from the client side to the server?
some more background on http/2 bidi streaming:
This looks interesting:
This guide shows how to create an http2 stream from node.js, which is helpful. I'm still looking for a way to open the stream from the browser. Maybe I misunderstood; it seems the browser doesn't expose an http2 stream API. Do we open an http2 stream from the browser with an ordinary ajax request?
Streaming requests with the fetch API:
From the website:
It seems the latest available versions of firefox (v101) and chrome (v102) on archlinux aur don't support request streams at the moment. Will keep exploring alternative approaches 💪
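For completeness, the request-streaming pattern from the web.dev article looks roughly like this (browser-side sketch with hypothetical names; it requires `duplex: 'half'`, and support is limited to browsers that implement request streams at all):

```javascript
// Forward a queue of client events to the server over one long-lived
// streaming POST body (newline-delimited JSON), instead of one ajax
// request per event. `eventQueue` is a hypothetical array of events.
function streamEvents(url, eventQueue) {
  const body = new ReadableStream({
    pull(controller) {
      const event = eventQueue.shift();
      if (event === undefined) controller.close();
      else controller.enqueue(new TextEncoder().encode(JSON.stringify(event) + '\n'));
    },
  });
  return fetch(url, {
    method: 'POST',
    headers: { 'content-type': 'application/x-ndjson' },
    body,
    duplex: 'half', // required for streaming request bodies
  });
}
```

Without `duplex: 'half'`, or in browsers without request-stream support, the stream body is not sent incrementally, which would be consistent with the stringified-object behaviour reported above.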
interesting, but there should be a way, since:
there is also a compatibility table:
and there is also a link to many fetch examples:
another reason for using http/2: you do not have to rely, for the speed of first meaningful paint, on possibly unsafe inlining of resources
is there anything one can do to help on this topic, besides what was tried/done above? Would be glad to help :-)
I generally agree http2 is better than http in terms of performance. I just need to update my deployment setup (local port-forwarding http proxy) to allow a liveview app to serve https directly (instead of relying on the proxy to provide https).
meanwhile, I'm referring to this article on how to develop with https locally
This is pretty interesting! Thanks!
The demo site is now working with http2 (handled by the proxy in front of it). And I can get websocket (using http/1.1) working with the http2 server when testing locally. Will consolidate everything, then include it in the next release under the current major version 💪
sounds great :-)
now ts-liveview comes with http2 support out-of-the-box in v4.4.0 🎉 Thanks for your suggestions and encouragement!
With http2, latency seems lower even though the payload increased. Between the two versions, I also disabled compression on the deployed version. In the original deployment, compression was performed twice: once between the public network and the proxy, and once between the proxy and the web server. (Now the connection between the web server and the proxy is not compressed.)
Now with http2 enabled, we can investigate the potential benefit of using ajax+sse vs websocket with lost-message handling.
yeah! Following the latest sources mentioned from June 2+, I still believe there is a way to use
:-)
Since the original goal of this issue is reached (make first page load faster), to move on with http/2 vs ws I have opened a fresh new issue: #13
another tiny idea to make page load faster, whose first step is very easy to implement:
I agree. We've moved on to adopt http2, and have more concrete ideas to explore :)
demo: using http2 for all resources may make first page load even faster (maybe also protocol switch to ws)
(yes, of course this is not the focus of the demo, but it helps the overall impression ;-)
actual state: