
demo: http/2+ may make first page load faster (and more) #4

Closed
hpvd opened this issue May 11, 2022 · 53 comments

@hpvd

hpvd commented May 11, 2022

demo: using http/2 for all resources may make the first page load even faster (maybe also the protocol switch to ws)
(yes, of course this is not the focus of the demo, but it helps the overall impression ;-)

actual state:

(screenshots: 2022-05-11_11h39_38, 2022-05-11_12h30_48)

@beenotung (Owner)

This is a good suggestion. It seems trivial to make express work with http2, but I'll study how to make the websocket (currently using ws on the server side) work with http2.

Also, the demo server is running behind http-proxy; I will upgrade it to http2-proxy to make the whole thing work.

@beenotung (Owner)

I'm reading about http2. As I understand it, the benefit of http2 is that it allows the server to actively push resources (js/css) into the browser's cache upon the first GET request, letting the whole download process finish earlier.

However, if we inline the script and style in the initial http response, the benefit would be minimal?

Also, we need websocket to pass client events to the server, and to pass DOM update commands to the client, which seems not to be supported by http2. If we need to fire another http request to upgrade to websocket, it may not be beneficial overall?

@beenotung (Owner)

I prefer to use websocket to pass client events instead of ajax, to save the overhead of an http request.
I've learned that http headers are encoded in a binary format with better compression in http2, but is that still more overhead than sending a websocket message?

@beenotung (Owner)

For the first request, inlined style and script may be beneficial, but for subsequent visits, keeping the styles and scripts in separate files may be more cache-friendly (and able to leverage the benefit of http2?)

@hpvd (Author)

hpvd commented May 15, 2022

thanks for looking into the details!

yes, you are right, one main advantage of http2 is server push.
But as far as I can tell, using websocket should still deliver superior speed.

imho there are 2 other advantages where your ts-liveview would benefit from http2, but only on first page load:

  • one may save a roundtrip at the very beginning when using http2 with TLS 1.3
  • it should be possible to start another transfer before the first has ended (in the screenshot above: start the js download before the html has finished)

when looking into the future: http3 should be the winner...

@hpvd (Author)

hpvd commented May 15, 2022

For first time request, inline style and script may be beneficial, but for subsequent visits, having the styles and scripts in separate file may be more cache-friendly (and be able to leverage the benefit of http2?)

I would prefer separate files, since in the real world there is also css and maybe another js file for animations, or at least for swapping the very light "preview image" for the full image once it has finished loading in the background...

@hpvd (Author)

hpvd commented May 15, 2022

just stumbled upon this detailed answer to the question:
does http/2 make websockets obsolete?
(the answer was continuously updated and improved from 2015 to 2021)
https://stackoverflow.com/questions/28582935/does-http-2-make-websockets-obsolete/42465368#42465368

@beenotung (Owner)

The stackoverflow post also talks about server-sent events. SSE looks good, but it has a per-domain limit (at most 6 connections), so it won't work well if multiple tabs are open. But it seems this limit is not imposed on http2 connections?

@beenotung (Owner)

If SSE is usable, it may be preferable, because it is more reliable. With websocket, we need to manually detect and resend dropped messages.
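A sketch of how that reliability works with SSE: the server tags each message with an id, and on reconnect the browser automatically sends the last id it saw in a Last-Event-ID header, so the server can replay what was missed (the in-memory log and message shape here are illustrative):

```javascript
// Serialize one event in the SSE wire format; the id line lets the browser resume.
function formatSseEvent(event) {
  return `id: ${event.id}\ndata: ${JSON.stringify(event.data)}\n\n`
}

// On reconnect, replay only the events the client has not seen yet.
function eventsSince(log, lastEventId) {
  const lastId = Number(lastEventId)
  if (!Number.isFinite(lastId)) return log // first connection: send everything
  return log.filter(event => event.id > lastId)
}
```

On the server, the handler would set Content-Type: text/event-stream, read req.headers['last-event-id'], write the result of eventsSince, then stream new events as they occur.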

@hpvd (Author)

hpvd commented May 15, 2022

if one really thinks about replacing WS with another technology
-> maybe one should also directly look into http/3 (for direct use, or to prepare an upgrade path)?

https://www.cloudflare.com/learning/performance/what-is-http3/

@hpvd (Author)

hpvd commented May 15, 2022

maybe not yet, but soon ;-)
https://caniuse.com/?search=http%2F3

@hpvd (Author)

hpvd commented May 16, 2022

in this 3-part series, some good details are described, not only about http/3 as the title suggests, but also (indirectly) about http/2: https://www.smashingmagazine.com/2021/09/http3-practical-deployment-options-part3/

@hpvd (Author)

hpvd commented May 16, 2022

last one: great technical detail about the benefits of each http version (1/2/3):
https://www.toptal.com/web/performance-working-with-http-3

@hpvd (Author)

hpvd commented May 16, 2022

really the last comment ;-)

DEMO:

one of the fastest sites on first load I could find (the site is not light; using a fast connection, same as for #4 (comment)):

  • static
  • http/2 and
  • TLSv1.3

https://pulsar.apache.org/case-studies/

(of course, measured latency also depends on the distance between browser and server/datacenter)

(screenshots: 2022-05-16_13h49_33, 2022-05-16_13h08_59)

@hpvd (Author)

hpvd commented May 16, 2022

detailed comparison for the first element:

of course, measured latency also depends on the distance between browser and server/datacenter,

but the structure differs between http/1.1 with TLSv1.2 and http/2.0 with TLSv1.3

(screenshot: 2022-05-16_13h56_21)

@hpvd (Author)

hpvd commented May 16, 2022

sorry for spamming (-> maybe move from issues to discussions?),
but this may be interesting?!

regarding using http/3 with a "fallback" to http/2 to support all browsers:
https://stackoverflow.com/questions/61172217/does-http3-quic-fall-back-to-tls-1-2-if-the-browser-doesnt-support-quic

and a minimal server config for even further clarification (yes, it's for nginx and old, but the main principle should stay the same...):
https://blog.cloudflare.com/experiment-with-http-3-using-nginx-and-quiche/

=> with this in mind, it may not be too early to think about http/3 :-)

@hpvd hpvd changed the title demo: http2 may make first page load faster demo: http2+ may make first page load faster (and more) May 16, 2022
@hpvd hpvd changed the title demo: http2+ may make first page load faster (and more) demo: http/2+ may make first page load faster (and more) May 16, 2022
@beenotung (Owner)

beenotung commented May 21, 2022

Thanks for the updates. It seems safari doesn't support http3 yet.

The example on how to push static content with express is helpful.
Your example used spdy instead of the http2 module to create the web server. It seems the difference between spdy and http2 is that they use different algorithms to compress the http headers.

With http2, the server actively pushes the static resources (css/js) into the browser's cache, which seems to have similar performance to inlining the styles and scripts in the http response. The overall performance for the initial page load should be similar?

For subsequent routing, when the user switches to another view, part of the dom should be updated, and the new view will require new styles. In that case, if the dom update command is sent over websocket and the styles get pushed over the previous http2 connection, the setup seems trickier than having the styles inline with the rest of the dom elements (in json over ws).

The server push behaviour may be beneficial for pushing images referenced by the img elements, though.

In this case, maybe it's better to run the server with multiple versions of http at the same time, and upgrade the connection when supported [1]

Thanks again for the links. It seems switching to http3 (quic over UDP) would be beneficial in terms of performance even when we do not leverage the server push feature, because with QUIC we get shorter handshaking overhead and lost packets are resent at a lower layer (hence earlier, when needed) [2]

Also, it seems great that the QUIC server doesn't have to listen on port 443 when the http1/http2 server responds with the header Alt-Svc: h3=":_the_port_".
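That Alt-Svc advertisement could be sketched like this (the port and max-age values are illustrative):

```javascript
// Build an Alt-Svc header value advertising an http/3 (h3) endpoint.
// `ma` is how long (in seconds) the browser may cache the alternative.
function altSvcHeader(h3Port, maxAgeSeconds = 86400) {
  return `h3=":${h3Port}"; ma=${maxAgeSeconds}`
}

// In a Node request handler on the http1/http2 server:
// res.setHeader('Alt-Svc', altSvcHeader(4433))
```
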

@hpvd (Author)

hpvd commented May 21, 2022

good to read that some interesting points were included :-)

just some advertising for 3 tiny tools:
when working in this field and using firefox, there are three addons that help perfectly to get an overview of the needed info regarding the current

  • TLS version (shown) and details (on click)
  • http version (indicated by color) and
  • WS usage (symbol appears when established)

(screenshot: 2022-05-21_21h26_57)

https://addons.mozilla.org/de/firefox/addon/indicatetls/
source: https://github.com/jannispinter/indicatetls

https://addons.mozilla.org/de/firefox/addon/http2-indicator/
source: https://github.com/bsiegel/http-version-indicator

https://addons.mozilla.org/de/firefox/addon/websocket-detector/
source: https://github.com/mbnuqw/ws-detector

@hpvd (Author)

hpvd commented May 21, 2022

examples:
using these tools one easily stumbles across:

  • your website uses http/1.1 (which led to this issue...)
  • the gif on the editor page seems to be the only resource on your site using "only" TLS v1.2 (all others use v1.3)
    ps: it's only loaded if the browser cache is deactivated...

(screenshot: 2022-05-21_22h22_34)

@beenotung (Owner)

beenotung commented May 23, 2022

Thanks for sharing the tools. I was not aware the linked image was using an older TLS version.

I'll update the link with an https image proxy like https://images.weserv.nl
This proxy server uses TLS 1.3

@hpvd (Author)

hpvd commented May 28, 2022

just another thought on relying on 2 different connection types (http and ws):
you have to take care of security and abuse for 2 different stacks.
I stumbled across this topic when searching for background on how to protect against DDoS and unfriendly bots...

for ws and security, this one was interesting: WebSocket Security: Top 8 Vulnerabilities and How to Solve Them https://brightsec.com/blog/websocket-security-top-vulnerabilities/

@beenotung (Owner)

beenotung commented May 31, 2022

I'm considering sse over http2 vs ws.

When using sse (server-sent events) over http2, it seems there can be at most 100 connections among all the tabs, which seems plenty. The EventSource in the browser will auto-reconnect when the network fails, and automatically ask for the events missed between two connections.

Even if we need to fall back to http/1 for some browsers, we can work around the 6-connection limit with storage events [1]

However, it may incur more latency on interaction, as sse doesn't support sending messages from the client side (hence we would need ajax to send events from client to server, with additional http headers in the request and response)

[1] https://fastmail.blog/historical/inter-tab-communication-using-local-storage/
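The storage-event workaround from [1] can be sketched roughly like this (the key name and message shape are my own illustration): one "master" tab holds the single SSE connection and relays each event to the other tabs through localStorage, which fires a storage event in every other tab of the same origin:

```javascript
// Master tab: re-publish each SSE message for the other tabs.
function relayEvent(storage, event) {
  // setItem triggers a `storage` event in all *other* tabs of this origin
  storage.setItem('sse-relay', JSON.stringify(event))
}

// Non-master tabs: pick relayed messages out of the storage events.
function handleStorageEvent(e, onMessage) {
  if (e.key !== 'sse-relay' || e.newValue == null) return false
  onMessage(JSON.parse(e.newValue))
  return true
}

// Browser wiring (not run here):
// window.addEventListener('storage', e => handleStorageEvent(e, applyUpdate))
```
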

beenotung added a commit that referenced this issue Jun 1, 2022
as suggested in #4 (comment)

not using the images.weserv.nl image proxy because it seems the proxy doesn't support gif, and inlining the resource should load faster, as it requires the browser to do https handshaking with fewer parties
@hpvd (Author)

hpvd commented Jun 1, 2022

very interesting details and background.

I'm not sure about the fallback and workaround described in [1] from 2012. Modern browsers (2022) may throttle background tabs/connections aggressively to save battery on mobile devices; this may break the master-tab solution suggested in [1], or at least add complexity, since these things depend heavily on the browser and could change over time...
just an impression on this topic: https://news.ycombinator.com/item?id=13471543

Regarding latency, one may need a test... of course there was a great reason why ws was chosen by Phoenix LiveView (and you): latency, but also some others...
The question is:

  • is this the best solution for latency today, or could it be matched by other protocols and advanced configurations?
  • is it the best solution in this environment (the node world, not the elixir/phoenix world)?
  • does its advantage compensate for other things (e.g. additional complexity in setup, maintenance, security...)?

hmm, everything is a compromise :-)

@hpvd (Author)

hpvd commented Jun 2, 2022

just looked through the linked sources again:

http/2 seems to support full bidi streaming, so there seem to be no latency disadvantages (no need for ajax)

as far as I can tell, one needs to use:

Details:

Articles like this (linked in another answer) are wrong about this aspect of HTTP/2. They say it's not bidi.
Look, there is one thing that can't happen with HTTP/2: After the connection is opened, the server can't initiate a regular stream, only a push stream. But once the client opens a stream by sending a request, both sides can send DATA frames across a persistent socket at any time - full bidi.

That's not much different from websockets: the client has to initiate a websocket upgrade request before the server can send data across, too.

...

If you need to build a real-time chat app, let's say, where you need to broadcast new chat messages to all the clients in the chat room that have open connections, you can (and probably should) do this without websockets.

You would use Server-Sent Events to push messages down and the Fetch api to send requests up. Server-Sent Events (SSE) is a little-known but well supported API that exposes a message-oriented server-to-client stream. Although it doesn't look like it to the client JavaScript, under the hood your browser (if it supports HTTP/2) will reuse a single TCP connection to multiplex all of those messages. There is no efficiency loss and in fact it's a gain over websockets because all the other requests on your page are also sharing that same TCP connection. Need multiple streams? Open multiple EventSources! They'll be automatically multiplexed for you.

this and more details in:
https://stackoverflow.com/a/42465368

edit:
if this can really be confirmed and it works like this, I was exactly right yesterday with my casual comment "could it be matched by other protocols and advanced configurations" :-)
-> at first, the client has to open the stream, similar to how the client has to initiate the WS upgrade...
(so you could/should even keep the WS indicator badge on your demo; it only has to be renamed to: stream open)
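A browser-side sketch of that combination (the endpoint paths and message shape are illustrative): EventSource for the server-to-client stream, fetch for client-to-server events, both multiplexed over the same http/2 connection by the browser:

```javascript
// Encode a client event as a request body; the shape is illustrative.
function encodeClientEvent(type, payload) {
  return JSON.stringify({ type, payload })
}

// Browser wiring (not run here):
// const events = new EventSource('/events')        // server -> client
// events.onmessage = e => applyDomUpdate(JSON.parse(e.data))
// function send(type, payload) {                   // client -> server
//   return fetch('/client-event', {
//     method: 'POST',
//     headers: { 'Content-Type': 'application/json' },
//     body: encodeClientEvent(type, payload),
//   })
// }
```
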

@beenotung (Owner)

beenotung commented Jun 3, 2022

It would be very interesting if we could do bidirectional streaming with http2 (fetch push from client, event source push from server, or a streaming response to the previous fetch).

I'm following the demo on https://web.dev/fetch-upload-streaming.
The demo uses TextDecoderStream and stream.pipeThrough(), which seem not to be supported by the majority distribution of firefox, but it seems possible with other approaches.

If it really works, the performance will be improved and the security part will be easier to cater for!

@beenotung (Owner)

Update: I cannot get a client-to-server stream to work with http1/http2 yet; the body sent from Firefox and Chrome appears to be the stringified object [object ReadableStream], not the actual stream content.

Maybe it needs to be multiple ajax requests instead of a single streaming ajax at the moment?

@hpvd (Author)

hpvd commented Jun 3, 2022

hmm,
does this work?

After the connection is opened, the server can't initiate a regular stream, only a push stream. But once the client opens a stream by sending a request...

could you open a stream from the client side to the server after the connection is established?

@hpvd (Author)

hpvd commented Jun 3, 2022

some more background on http/2 bidi streaming:
https://web.dev/performance-http2/#streams-messages-and-frames

@hpvd (Author)

hpvd commented Jun 3, 2022

This looks interesting:
A Complete Guide to HTTP/2 in Node.js (With Example Code)
https://www.sohamkamani.com/nodejs/http2/

@beenotung (Owner)

beenotung commented Jun 3, 2022

https://www.sohamkamani.com/nodejs/http2/

This guide shows how to create an http2 stream from node.js, which is helpful.

I'm still looking for a way to open the stream from the browser. Maybe I misunderstood; it seems the browser doesn't expose an http2 stream API. Do we open an http2 stream from the browser with an ordinary ajax request?

@hpvd (Author)

hpvd commented Jun 3, 2022

Streaming requests with the fetch API:
https://web.dev/fetch-upload-streaming/

@beenotung (Owner)

Streaming requests with the fetch API: https://web.dev/fetch-upload-streaming/

From the website:

If the browser doesn't support a particular body type, it calls toString() on the object and uses the result as the body. So, if the browser doesn't support request streams, the request body becomes the string "[object ReadableStream]". When a string is used as a body, it conveniently sets the Content-Type header to text/plain;charset=UTF-8. So, if that header is set, then we know the browser doesn't support streams in request objects, and we can exit early.

It seems the latest available versions of firefox (v101) and chrome (v102) on the archlinux aur don't support request streams at the moment.

Will keep exploring alternative approaches 💪
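The early-exit check described in that quote can be wrapped as a feature-detection helper, roughly like this (a sketch adapted from the quoted description; the duplex option is what Chromium requires for streaming request bodies):

```javascript
// Detect request-stream support: an unsupporting browser falls back to
// toString(), which sets Content-Type to text/plain for the string body.
function supportsRequestStreams() {
  let duplexAccessed = false
  const hasContentType = new Request('https://example.com', {
    method: 'POST',
    body: new ReadableStream(),
    // `duplex` is only read when the engine actually supports streaming bodies
    get duplex() {
      duplexAccessed = true
      return 'half'
    },
  }).headers.has('Content-Type')
  return duplexAccessed && !hasContentType
}
```
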

@hpvd (Author)

hpvd commented Jun 3, 2022

interesting, but there should be a way, since:

"fetch api to send requests up"
has 93% coverage on can-I-use: https://caniuse.com/mdn-api_fetch
and there also seems to be a polyfill,
if one really!? wants/needs more browser support (e.g. IE10): https://github.com/github/fetch

@hpvd (Author)

hpvd commented Jun 3, 2022

and there is also a link to many fetch examples:
https://github.com/mdn/fetch-examples/

@hpvd (Author)

hpvd commented Jun 5, 2022

another reason for using http/2:

you do not have to rely on possibly unsafe inlining of resources for the speed of the first meaningful paint
see #8 (comment)

@hpvd (Author)

hpvd commented Aug 1, 2022

is there anything one can do to help on this topic, beside what was tried/done above?

Would be glad to help :-)

@beenotung (Owner)

I generally agree http2 is better than http/1 in terms of performance.

I just need to update my deployment setup (a local port-forwarding http proxy) to allow a liveview app to serve https directly (instead of relying on the proxy to provide https).

@beenotung (Owner)

meanwhile, I'm referring to this article on how to develop with https locally

@hpvd (Author)

hpvd commented Aug 6, 2022

This is pretty interesting! Thanks!

@beenotung (Owner)

The demo site is now working with http2 (handled by the proxy in front of it).

And I can get websocket (using http/1.1) working with the http2 server when testing locally.

Will consolidate everything and include it in the next release under the current major version 💪

@hpvd (Author)

hpvd commented Aug 8, 2022

sounds great :-)

@beenotung (Owner)

now ts-liveview comes with http2 support out-of-the-box in v4.4.0 🎉

Thanks for your suggestions and encouragements!

@hpvd (Author)

hpvd commented Aug 8, 2022

just for comparison, some numbers: looks like it was worth it :-)

(sorry for the different scaling factors)

(screenshot: 2022-08-08_14h40_09)

@beenotung (Owner)

With http2, it seems to have less latency even when the payload is increased.

Between the two versions, I also disabled compression on the deployed version. In the original deployment, compression was performed twice: once between the public network and the proxy, and once between the proxy and the web server. (Now the connection between the web server and the proxy is not compressed.)

@beenotung (Owner)

beenotung commented Aug 8, 2022

Now with http2 enabled, we can investigate the potential benefit of using ajax+sse vs websocket with lost message handling.

@hpvd (Author)

hpvd commented Aug 8, 2022

Now with http2 enabled, we can investigate the potential benefit of using ajax+sse vs websocket with lost message handling.

yeah!

Following the latest sources mentioned since June 2, I still believe there is a way to use

  • SSE to push messages down
  • fetch api to send requests up

:-)

@hpvd (Author)

hpvd commented Aug 8, 2022

Since the original goal of this issue has been reached (make the first page load faster),
maybe we can close this issue and enjoy the sense of accomplishment :-D

To move on with http/2 vs ws, I have opened a fresh new one: #13

@hpvd (Author)

hpvd commented Aug 8, 2022

another tiny idea to make the page load faster, whose first step is very easy to implement:
#14

@beenotung (Owner)

I agree, we've moved on to adopt http2, and have more concrete ideas to explore :)
