2019-09-07 at

Comment: on getting people to buy "local" goods and services

Let's go back to a recent lens I provided on someone's comment about "how to build communities." The notion I raised is that communities exist as a result of economic pressures, and the construction of fundamentally sustainable communities begins with a review of the economic pressures faced by each potential member of the constructed community. Goodwill is born of economic pressures; it doesn't just pop out of the good books.

The phenomenon described by Iqbal Ameer is not new. Malaysians have been complaining about Malaysians' lack of support for Malaysian things since time immemorial (I admit I'm not very old). Just yesterday, I think, someone on Twitter was asking how individual citizens can help to prop up the exchange rate of Malaysia's currency - and I'm like, dude, there's only one way to do that... you buy locally produced goods and services, and if you're a producer you avoid selling them domestically. This may seem like a bit of a digression, because artists may be less sentimental about money than they are about community, and no one's saying that we should go to more local artists' concerts in an effort to prop up the Ringgit.

But it's not a digression, because we're talking about the exact same underlier. Whether a scrooge gets sentimental about their Ringgit investments, or whether a Malaysian consumer gets sentimental about local dingbat cartoon characters, the underlier is the common identity shared by the scrooge, the art consumer, the Ringgit, the Upin and the Ipin... the national identity, as Malaysian.

Now as we come back to the economic pressures faced by consumers, we find that many consumers don't identify with other Malaysians. Read that again. It's literally a case of Malaysian#1 and Malaysian#2 not believing that they're actually on the same team. Now why is that the case?

We're left with no option but to review a fundamental component of Malaysia's social psychology: xenophobia is grounded in the Constitution (yes, the national constitution), and furthermore it has been encouraged for decades by the Executive branch of government. Xenophobia is omnipresent in our federal propaganda, to date. There's no shorter way to phrase it. Regardless of superficial talk about national unity, Malaysians go to their back-channels and say, "well we tried, but after all, the other blokes are not like us."

So back to your question about petty traders - read: local brands, bands, and businesses. The only way to market something as being BETTER because it is LOCAL (or domestic) is to focus on value propositions that emphasise a common threat vis-à-vis a failure to support the local thing (or a common incentive vis-à-vis supporting the local thing).

Going back to the little example of the scrooge, the art consumer, the Ringgit, and the cartoon characters ("art"): the narrative of a brand that is selling local cartoons needs to demonstrate to a scrooge why his Ringgit will depreciate if he doesn't buy local cartoons... and the narrative of a brand that is selling Ringgit investments needs to demonstrate to art consumers why Ringgit appreciation is beneficial to the development of local art.

This may seem like an obvious pattern, but it is actually ignored by marketers everywhere. Oh well. :P

2019-09-05 at

Decoupling Web Application Eventing from the DOM

Sometimes I think I write things which make no sense at all. Then I frame them up for further consideration.

---

TL;DR: Can you guys tell me about programs which:
1. implement "events" and "event listeners" ...
2. ... on web clients... (not servers)
3. ... outside of the DOM (events are not attached to DOM elements)
?

---

I just found a bug in common web applications. This has been bugging me for some time, and I'm relieved to have finally nailed it (I think). Caveat: this is an architectural bug that is pervasive throughout the web application ecosystem, so it may not be easily viewed as a bug.

The DOM is supposed to be a document, not a business processor. But for quite a few years now, people have been using DOM events & event-handlers as business-process machinery, treating the document as a processor rather than as a mere document. This has royally fucked up the separation of concerns between the DOM tree, Web Application APIs (Javascript), and Rendering (CSS, etc.).

It just so happens that, usually, the only place to attach event listeners, and the only place to trigger events, is on DOM elements. So what happened is that in pursuit of writing event-driven business processes into code, people have been writing event-driven business processes into the DOM.

The data structure of a web application is NOT supposed to depend on a DOM tree. Therefore close coupling between [data models in a web application] (read in 2019 as "stores") and [specific node trees in a DOM] is an anti-pattern.

But, you might ask, isn't the DOM supposed to mediate such user-interface actions as receiving input from the user, and sending feedback to the user? Yes, of course: these can be modelled as events, and the DOM provides us with data structures pertaining to those events, as well as ways to operate on those structures (listeners).

But that does NOT mean that the DOM is also supposed to handle traffic control for events throughout an entire web application. A web application should have an event-driven control centre which lies OUTSIDE the DOM. (That is, outside any light, shadow, or virtual DOM, in case it wasn't clear.) We generally know how to do this on the server, since NodeJS popularised the pattern. But web applications are not always supposed to run on the server - oftentimes we want a web application to run robustly on the client side, and to depend on servers only for federated data stores. So how do we go about writing web application CLIENTS that have EVENT-driven traffic control OUTSIDE the DOM?
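To make that concrete, here is a minimal sketch of the kind of thing I mean: an application event bus with no dependency on any DOM node. All the names here (AppBus, "order:created", the #buy button) are hypothetical, purely for illustration.

    // A minimal application event bus, living entirely outside the DOM.
    class AppBus {
      constructor() {
        this.listeners = new Map(); // event name -> Set of handlers
      }
      on(event, handler) {
        if (!this.listeners.has(event)) this.listeners.set(event, new Set());
        this.listeners.get(event).add(handler);
        return () => this.listeners.get(event).delete(handler); // unsubscribe
      }
      emit(event, payload) {
        const handlers = this.listeners.get(event);
        if (handlers) handlers.forEach((handler) => handler(payload));
      }
    }

    const bus = new AppBus();

    // Business logic subscribes to application events, not to DOM nodes.
    bus.on('order:created', (order) => {
      console.log('running business process for order', order.id);
    });

    // The DOM's only job is to translate UI gestures into application events.
    document.querySelector('#buy')
      .addEventListener('click', () => bus.emit('order:created', { id: 42 }));

(Incidentally, modern browsers also let you construct a free-standing new EventTarget() that isn't attached to any element, which gets you some of this behaviour for free.)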

With that in mind, I searched for implementations of event listeners outside the DOM, and was pointed to the Reactor pattern by this thread.

I look forward to examining it more closely, perhaps with your input.

2019-09-04 at

My Limited Experiences in Machine Computation / FFWD: Web Dev in 2019

Past


In the 1990s, I used to play games on a monochrome XT. It also ran QBasic, but I never learnt how to use that properly. I first got involved in website development in secondary school, on voluntary projects.

In the summer of 2004, if memory serves me right, I developed websites industrially at an internship. This was a mix of design and development work, involving HTML, CSS, and Flash/ActionScript. We used Windows machines.

In 2006, I learnt how to use spreadsheets at a management consulting firm. In 2007, I learnt how to write scripts in Excel.

In early 2009, I found myself helping an entrepreneur to assemble, from untrained talent, a small web development product, built with CakePHP on LAMP, and jQuery. Here I also learnt how to use Vim, VirtualBox, and SSH.

In 2012, I took some time off to study Git, MongoDB, Erlang, and Haskell, among other things. I developed a superficial understanding of these things, and their related fields of technology.

In 2014, I worked in a few software development operations, none of which could be called robust. I practised quite a bit of responsive web design, and learnt how to work very superficially on cloud infrastructure.

In 2015, I got around to starting work on AWS.

It is now 2019. Web standards were a mess ten years ago, and they appear to remain a mess now, albeit somewhat cleaner.



Present


In preparation for web development at work, I've been reading to catch up on lost time.

In terms of separation of concerns, the dichotomy of HTML for content and CSS for presentation... seems to be continually abused.

Fortunately, with the advent of Custom Elements v1, we can now officially rant at people who redefine vanilla HTML elements to do absurdly irrelevant things - not that this is going to stop everyone from trying to distort the semantics of the given tags.
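For instance, under Custom Elements v1 the sanctioned way to get new semantics is to define a new element rather than to repurpose a native tag. A minimal sketch (the tag name user-card is made up):

    // Define a new element instead of distorting a vanilla HTML tag.
    class UserCard extends HTMLElement {
      connectedCallback() {
        // Render the element's content when it is attached to a document.
        this.textContent = `User: ${this.getAttribute('name') || 'anonymous'}`;
      }
    }
    customElements.define('user-card', UserCard);
    // Usage in HTML: <user-card name="Ali"></user-card>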

There remains no standard [data storage model] for [web applications].

There remains no standard [application architecture] for [web applications].

In the debates on UX, web developers seem obsessed with web rendering performance, and little else.

In the debates on DX, web developers seem obsessed with workflow ergonomics, to the detriment of considerations on what the standards should be.

In short, little has changed. I look forward to reviewing this all tomorrow when I get back to work on it.

A Legalistic View


I accept that many people will not agree with me on the following point, and I think that's perfectly reasonable for them.

First and foremost, I think a web developer needs to forget about commercial concerns when embarking upon the selection of tools for developing a web site or a web application. The domain of web development is politically governed by commercial implementations of renderers, i.e. browser implementations - and the only thing holding that Mexican standoff in place is the good old W3C. When the limitations of browser implementations become the primary concern of web developers, it is then that web developers lose sight of the end-game... which is the eventual redundancy of such wars, and the eventual emergence of a boringly staid technology stack. However, given the way technology works, that's probably only going to happen when the stack has been tooled down to the molecules. So for the time being, I guess we're stuck in this decades-long race to be the first to grasp at infinitely durable design patterns.

That being said, I think that at this time I want to approach web development with a modular lens. A web development framework should bridge the standard semantics of the media with the availability of innovative programs that enable the standard media to do various things. I think an explicit plumbing skeleton which lets users plug in their preferred data store, state management rules, router, node differ, renderer, etc. is in order - see the sketch below. (Poor old Project Ara comes to mind.) All of this should explicitly warn users of the framework where and when standards-endorsed architectural patterns are being either encouraged or denied.
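A hand-wavy sketch of what such a plumbing skeleton might look like - every name below is hypothetical, and each slot is deliberately naive:

    // Hypothetical plumbing skeleton: each concern is a plug-in slot.
    function createApp({ store, router, differ, renderer }) {
      let currentTree = null;
      function update() {
        const nextTree = router.resolve(store.getState()); // derive a view tree
        const patches = differ.diff(currentTree, nextTree); // compute changes
        renderer.apply(patches);                            // commit changes
        currentTree = nextTree;
      }
      store.subscribe(update); // re-run whenever the data store changes
      return { update };
    }

    // Users then plug in whatever implementations they prefer, e.g.:
    // createApp({ store: myStore, router: myRouter,
    //             differ: myNodeDiffer, renderer: myDomRenderer });

The point is not this particular shape, but that the seams between the slots are explicit, so swapping one part out doesn't mean adopting somebody's entire worldview.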

2019-09-03 at

Svelte.dev's Approach to Runtime Libraries

This is my short take on the Svelte 3 author's presentation, given comments that Svelte's compile-time code is "not really Javascript"...

Any non-ECMA-native templating, decorators, Typescript, etc. require an x-pilation step. But what's the benefit of it?

The approach that React/ Vue/ Hyperapp/ Choo/ lit-html/ etc. take is that:
1. you write x-pile-time code
2. your (1.) is parsed by an x-piler to run-time code
3. your x-piled run-time code calls a run-time library
3b. with some frameworks, you may also write (1.) which does not need an x-piler to convert DSLs to JS... but that JS remains subject to the API of (3.)...
Whereas the approach taken by Svelte is that:
1. you write x-pile-time code
2. your (1.) is parsed by an x-piler to run-time code
3. there is no run-time library **
3b. there is thus never an API limiting the run-time code ****
** You could feasibly argue that Svelte reifies the library code and puts it into the x-piled JS. But we can only discuss the value of this if discussants all do their homework and read the pre-x-piled/post-x-piled code for all the frameworks under discussion :P

**** Subject to ** above.

All in, instead of x-piling moderately complex x-pile-time code to moderately complex run-time code... Svelte tries to x-pile relatively simple x-pile-time code to much more complex run-time code, WITH BETTER PERFORMANCE... and since this is generally how code x-pilation has worked in computing prior to the web, I'm going to go out on a limb with my limited experience and say that I welcome Svelte's approach.
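To caricature the difference - this is a hand-wavy sketch, not the real output of any of these compilers - take a template like <h1>{name}</h1>. The runtime-library approach x-piles it to calls against a library API, while the Svelte approach x-piles it to plain imperative DOM code:

    // Runtime-library style: the x-piled code calls a library at run time.
    // "h" is a stand-in for a createElement-style API; the library
    // diffs and patches the DOM for you on every state change.
    const h = (tag, props, ...children) => ({ tag, props, children });
    function view(state) {
      return h('h1', null, state.name);
    }

    // Svelte style (pseudo-output, not Svelte's real emitted code):
    // the x-piled code IS the DOM manipulation, with surgical updates.
    function create(target, state) {
      const h1 = document.createElement('h1');
      const text = document.createTextNode(state.name);
      h1.appendChild(text);
      target.appendChild(h1);
      return {
        update(newState) {
          if (newState.name !== state.name) {
            text.data = newState.name; // update just this text node
            state = newState;
          }
        },
      };
    }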

Caveat: I'm a bit out of date with these things, and I may have misunderstood a lot of things which I referred to.

2019-09-02 at

Classism Among Poor Humans, and Rich Humans

I have friends who keep cats and dogs. We have debates about whether humans and other mammals are very different. I do not think humans are very different from other mammals, and I also think that it is very easy to build empathetic machines.

The [set of people] includes humans, other animals, and other machines. A lot of animal lovers don't actually deal with people very well... they interact with their friends who are dogs and cats, but they are not actually capable of socialising with the dogs and cats who are not their friends. I find that they carry the same patterns into their commercial workplaces... where they are only able to appreciate the existence of their friends, separately from their non-friends.

In my case, I don't really think that my lovers, friends, family, business partners, whatnot... should be treated any differently from other humans, animals, or machines that I encounter daily. (You can argue that I have no friends, and that may be quite right from certain points of view.)

As for my friends who can't handle their non-friends... I find that many of them are incapable of working closely with random persons (human or otherwise). They feel that my work in the minimum-wage sector must be painful - but I do not feel more pain in the minimum-wage sector than I do in the white collar echelons of society.

Work in the minimum-wage sector does indeed involve manipulating your own and other people's meat, where many of your colleagues are foolish, uncompetitive, illiterate, handicapped, mentally unstable, socially maladapted, or otherwise of ill health. It is really, and I quote myself verbatim, "as close to training dogs as you can get." But this is nothing to be ashamed of.

My higher-classist friends feel less pain dealing with their own kind. But I feel that [the intellectual distance between myself and a celebrity or management consultant] is quite similar to [the intellectual distance between myself and a floor sweeper], which is why I have a comparative advantage at working in the minimum-wage sector.

TL;DR: Javascript has matured. What shall be born next?

Javascript may always have been complex under the hood, even in its early incarnations - but the commonly used surface was a rather small general-purpose language. In recent years, the addition of features which make the language more robust has also increased the number of commonly used keywords and concepts... to a point where I can no longer refer to Javascript as a "small" language.

This is good for engineering professionals, and bad for artisanal users (small timers / noobs). In light of this, I have some context for appreciating the complaints and debates on language politics in Javascript framework communities.

I will continue to study Javascript as an ongoing concern... and I expect that more language features will be added to it, as it turns into a high-level cross-platform glue language like C and Java before it. (I am familiar with the difference between compiled and interpreted code.)

However, I think, at some point we will find that economic pressures will encourage people to seek/design smaller languages (which may or may not compile down to Javascript) for introductory-level computing projects.

2019-09-01 at

Delivery Drivers should Own the Last-mile, and the Customer (opinion)

I've written recently on why outsourcing the last mile results in lousy customer experience. Delivery services really need to be pivoted towards letting drivers own the last mile, in order to improve customer service.

(1) Comment on Article (linked):

Legend:

Customer (C)
Seller (S)
Amazon (AMZN)
Third-party logistics company (3PLC)
Driver (D)

The CX under discussion is timely delivery. The value chain for this currently looks like:

[Cs pay >a> Ss pay >b> AMZN pays >c> 3PLCs pay >d> Ds]

>c> is where AMZN hands off highly variable costs to 3PLCs for a less variable fee

>d> is where 3PLCs hand off highly variable costs to Ds for a less variable fee

AMZN used to have to deal with >d> directly, but didn't want to do the dirty work, so it handed the gun to 3PLCs, and they've dutifully performed. The rudimentary argument for encouraging this pattern is to say that AMZN has made the market more efficient by accruing value to Cs via reduced prices. In light of that, the article raises the contrary concern that this value accrued to Cs has costs to Ds.

Forecasting the effects of that: systematically, the market will turn around to Cs and say "your cheap prices are resulting in road accidents, please pay more for a better world." In other words, AMZN has conned the system into allowing inefficiently low C prices, by hiding the true cost of low C prices. (AMZN may or may not have done this intentionally - which is a completely separate concern from whether AMZN should bear legal liability for it - which is again a separate concern from whether AMZN will be fined by regulators or the courts for allowing this to happen.)

(2) What I previously wrote:

Delivery brokers like BERUEats do the exact same thing, in certain markets.

[Cs pay >a> Ss pay >b> BERU pays >d> Ds]

This isn't efficient. The value chain should be:

[Cs pay >e> Ds pay >a> Ss] and
[Ds pay >f> BERU only for SaaS]

This essentially repositions Ds as dropshippers enabled by BERU as a software platform - where the Ds are able to own the customer, own their own independent businesses, charge whatever they want, and add truer value to the market by reducing BERU's unnecessarily high margins.