A webserver framework which addresses both formal and informal ( machine and meat ) users can deduplicate effort by (A) always sending the formal Response, and then, for informality, adding an extra script that deformalises a corresponding interface for meat use. But this is a formality-oriented design decision, which is machine-oriented and contrary to lay-human intuition. A meat-oriented design, conversely, would (B) always send the informal Response, and only send a formal Response upon escalation.
All that being said, it seems that pre-emptive deduplication is going to be a poor design pattern overall. Instead, (C) there should be distinct Request parameters demanding either formal/machine or informal/meat Responses. Response-formality should therefore be specific to Requested-formality. And finally, every Response must include the hyperlink which sends the same Response, parameterized for the alternative degree of formality.
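A minimal sketch of option (C), assuming a hypothetical Express-style handler in TypeScript ; the `formality` parameter and the link field names are illustrative, not part of any existing framework :

```typescript
import express from "express";

const app = express();

app.get("/orders/:id", (req, res) => {
  // The Request parameter demands either a formal/machine or an
  // informal/meat Response ; machine is the assumed default here.
  const formality = req.query.formality === "meat" ? "meat" : "machine";
  const order = { id: req.params.id, status: "shipped" };

  // Every Response carries the hyperlink to the same Resource at the
  // alternative degree of formality.
  const alternate = formality === "machine" ? "meat" : "machine";
  const alternateLink = `/orders/${order.id}?formality=${alternate}`;

  if (formality === "machine") {
    // Formal Response : structured data plus the alternative-formality link.
    res.json({ ...order, _links: { alternate: alternateLink } });
  } else {
    // Informal Response : human-readable markup carrying the same link.
    res.send(
      `<p>Order ${order.id} is ${order.status}.</p>` +
      `<a href="${alternateLink}">machine-readable version</a>`
    );
  }
});

app.listen(3000);
```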
#hci #restful #www #hateoas
( In case you have no idea what I just wrote above, here is some context. )
---
2024 : link
---
2025-02-23 :
HATEOAS is efficient for use-cases where Client access is infrequent, because every Server Response has to convey an Adjacency Map of available Server state to the Client. This requires some disambiguation.
- 1. The human-usable web browser interface of, say, LinkedIn is indeed HATEOAS compliant. However, it is not Uniform-Interface compliant.
- 2. The Ruby on Rails framework lets you toggle between HTML and JSON Server Responses for the same route ... where the JSON Responses are nevertheless not HATEOAS compliant, whereas the HTML Responses are HATEOAS compliant.
- 3. The view I have had about this since working on the Ruthenium framework in 2020 ( I haven't done much work on it since ) is that HATEOAS can be enforced at the framework level, using something similar to the concept of 'webpage hydration' which has recently been trendy. Instead of rehydrating markup with application state, I think a framework may be designed to rehydrate a HATEOAS-compliant tun with the human-interactive adornments : HTML, CSS, and JavaScript.
E.g., roughly :
- Request POST : I want X
- Response in JSON :
- - structure of webpage resources ( images, links, text nodes )
- - further links to secondary rendering resources ( CSS, JavaScript, ornamental images, etc. ) ( but if the Request had been flagged, say, "pre-render", then the Server can do the hydration first and send a single Response )
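Concretely, the dehydrated Response body for the example above might look like this sketch ( the type and field names are hypothetical, not drawn from Ruthenium or any existing framework ) :

```typescript
// The dehydrated ( "tun" ) Response : structure first, rendering later.
interface DehydratedResponse {
  resource: {
    // Structure of webpage resources : images, links, text nodes.
    nodes: Array<
      | { kind: "text"; content: string }
      | { kind: "link"; href: string; label: string }
      | { kind: "image"; src: string; alt: string }
    >;
  };
  // Further links to secondary rendering resources, which the Client
  // fetches only if it intends to hydrate for meat use.
  renderingResources: {
    stylesheets: string[];
    scripts: string[];
    ornamentalImages: string[];
  };
}

const example: DehydratedResponse = {
  resource: {
    nodes: [
      { kind: "text", content: "X has been created." },
      { kind: "link", href: "/x/42", label: "view X" },
    ],
  },
  renderingResources: {
    stylesheets: ["/css/x.css"],
    scripts: ["/js/x-widget.js"],
    ornamentalImages: [],
  },
};
```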
I think I was recently reading the term 'dynamic programming', which is ( roughly ) about breaking a large problem down into distinct sub-problems and reusing their solutions. A webpage should be like that ... it is a browser web app that can be decomposed into sub-apps / widgets. And of course, nowadays we want to reduce latency, so sub-programs can interact on the client side and update the UI before the Server is updated ... but then it's a matter of how cleanly the programmer writes the sync code to the Server. ( And a proper framework should make it both accurate and easy to decompose the synchronisation between the client app and the Server. )
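A sketch of one such decomposed sub-app / widget, updating the UI optimistically before syncing to the Server ( the class and endpoint names are hypothetical ) :

```typescript
// One widget in a decomposed web app : it owns a fragment of UI state,
// updates the display immediately, then synchronises with the Server.
class CounterWidget {
  private count = 0;

  constructor(private element: HTMLElement, private syncUrl: string) {}

  async increment(): Promise<void> {
    // Client-side interaction : update the UI before the Server knows.
    this.count += 1;
    this.render();
    try {
      // Then sync to the Server ; roll back the UI on failure.
      await fetch(this.syncUrl, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ count: this.count }),
      });
    } catch {
      this.count -= 1;
      this.render();
    }
  }

  private render(): void {
    this.element.textContent = `Count : ${this.count}`;
  }
}
```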
thinking aloud :
- level 1 : JSON tun ( dehydrated tardigrade ) of response RESOURCE and sub-resources
- level 2 : additional sub-resource/s for static HTML hydration
- level 3 : additional sub-resource/s for dynamic web UI hydration : potentially displaying other widgets
... so here's where my imagination got stuck in 2020 : I didn't know how to design this, and postponed this level of consideration. So I have to get to it now.
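To make the three levels concrete, here is one possible shape for the progressively hydrated payloads ( the type names are hypothetical, just thinking aloud in code ) :

```typescript
// Level 1 : the JSON tun ( dehydrated tardigrade ) of the Response
// RESOURCE and its sub-resources.
interface Level1Tun {
  resource: unknown;
  subResourceLinks: string[]; // hyperlinks, per HATEOAS
}

// Level 2 : additional sub-resource/s for static HTML hydration.
interface Level2StaticHtml extends Level1Tun {
  htmlTemplate: string; // static markup the tun is poured into
  stylesheets: string[];
}

// Level 3 : additional sub-resource/s for dynamic web UI hydration,
// potentially displaying other widgets.
interface Level3DynamicUi extends Level2StaticHtml {
  scripts: string[]; // client-side behaviour
  widgetLinks: string[]; // sibling widgets this one may display
}

// The framework would select the level from the Requested formality.
function hydrationLevel(requested: "machine" | "static" | "dynamic"): 1 | 2 | 3 {
  const levels = { machine: 1, static: 2, dynamic: 3 } as const;
  return levels[requested];
}
```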