2026-04-09

determinism, in determining legal personhood

AI governance will eventually take the notion of "deterministic outcomes" more seriously. 

Current technical approaches are built on "non-deterministic" fundamentals, which is why there is so much confused tolerance for governing AI "conversationally", the way we govern humans. Eventually there will probably be "classification standards", which establish the degree to which a synthetic mind is "deterministic" or "non-deterministic" in its output.
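One way such a classification standard might operationalise "degrees of determinism" is by replaying identical inputs and scoring output agreement. A minimal sketch in Python (all names here are hypothetical illustrations, not from any real standard; a real scheme would also have to control for seeds, sampling temperature, and hardware):

```python
import random

def determinism_score(system, prompt, trials=10):
    # Run the same prompt repeatedly and return the fraction of
    # outputs matching the first one: 1.0 means fully deterministic
    # on this input, lower means less so.
    outputs = [system(prompt) for _ in range(trials)]
    return sum(o == outputs[0] for o in outputs) / trials

# A deterministic "tool": identical input always yields identical output.
def echo_tool(prompt):
    return prompt.upper()

# A non-deterministic "mind": sampling injects variance into the output.
def sampling_system(prompt):
    return prompt + random.choice([" yes", " no", " maybe"])

print(determinism_score(echo_tool, "hello"))       # always 1.0
print(determinism_score(sampling_system, "hello"))  # usually < 1.0
```

A regulator could then set thresholds on such a score: above some cutoff the system is classed (and regulated) as a tool, below it as something else.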

Legal governance will depend on this. It is probably the case that "deterministic systems" will be regulated as tools, where responsibility falls upon a tool-user who is a legal person. By contrast, "non-deterministic systems" are ultimately black boxes, just like meatheads, and it may make more sense to establish for them "gradients of legal personhood, based on standards of maturity", following how we traditionally govern human children.


Discussion :
    • ( in the AI Safety SG WhatsApp group )
    • Currently
      • : cars are regarded as legal non-persons 
      • : you can't charge the car with liability 
      • : it's regarded as a tool. 
      • The liability hand-off is between { regulator, manufacturer, driver }
    • Not-so-far future state
      • : tools will probably be banned from having freedoms. 
      • While it's quite possible to create infrastructure which enables a car to 
          • - earn money via services
          • - pay for its own maintenance
          • - park itself at a rented home
        • This is not going to lead to legal personhood [ for cars ]; regulators will probably lean towards forcing a legal entity ( a company ) to control them and assume liability [ for car-caused damages ].
      • So I expect the same for pure software tools, such as so-called AI models/agents/entities.
      • What is interesting now is that the "system cards" give us examples of how companies are referring to AI entities in anthropomorphic terms : 
          • "personality", 
          • "intent", 
          • "preference", 
        • prior to the regulatory environment clamping down on such language. 
        • It is expected, then, that we will get to a point where you can have a bleeding-heart conversation with a machine that is fully self-aware that it is politically barricaded from ever achieving autonomy.
      • Fun times.
    • - More deterministic systems : 
      • liability stems more clearly from the designer/builder.
    • - Less deterministic systems : 
      • as the legal environment allows more risk-taking, builders will continue to chuck out higher-autonomy cognitive systems.
        • Builders then implicitly have more leeway to shirk liability by saying "well I don't really know how it works, but it wasn't illegal to build and publish it".
      • - Analogously, in modern times, when we hire human staff we can say "the staff went rogue", and liability shifts to the staff. 
        • - In past times, a human slave might not have legal-personhood, so "they f-up, they die", and this is pretty much how we treat AI entities now.
      • - The outstanding question then is : if the builder can't be expected to understand what is being built, but they can't shift the legal responsibility for non-determinism to the slave, then either 
        • (a) we ban builders from a certain limit of non-determinism, 
        • (b) we start to treat non-deterministic systems as legal persons. 
        • (c) ???
      • Just laying it out.
        • (d) builder and user circumstantially split liability, as with cars today; black-box recorders for automated driving are already being mandated ( "DSSAD", the Data Storage System for Automated Driving )

Most barriers to learning are political

Most barriers to learning are political.

Broadly, barriers to well-known knowledge are often mere [ absences of algorithmic documentation ], a.k.a. [ incomplete documentation ].

Often, incomplete documentation is tolerated as a class-filter to moat out new members of the organisation, or community, who are class-aspirants seeking upward mobility.

The completion of documentation depends on economic incentives ... sometimes a shortage of learned people raises the bidding price, making documentation worth paying for; sometimes an abundance of charity creates documentation philanthropically, lowering the asking price.

our shame

Malaysian political culture is embarrassing. Not for any reason other than its dependence on the feeling of shame.


If we could discard shame from politics, the country would surely be different by now. But shame is our culture.

What lah.