2025 February TC39 presentation update #393

acutmore opened this issue Feb 24, 2025 · 112 comments
@acutmore (Collaborator) commented Feb 24, 2025

At TC39 last week (2025/02) we discussed R&T.
Slides here: https://docs.google.com/presentation/d/1uONn7T91lfZDV4frCsxpwd1QB_pU3P7F6V2j9jEPnA8/.
Minutes are typically available three weeks after plenary.


After the feedback received from committee in 2023, the proposal has been looking for a new design that:

  • Does not add new primitives (no new typeof)
  • Does not overload ===
  • Still adds immutable data structures
  • Still adds nested/composite equality operations

After discussing some designs with various delegates last week, I currently have:

// Creation:
const recordLike = Composite({ x: 1, [Symbol()]: 2 });
const tupleLike = Composite([1, 2, 3]);
typeof recordLike; // "object"
try { new Composite([]) } catch { /* throws */ }

Object.isFrozen(recordLike); // true
Object.isFrozen(tupleLike);  // true
Array.isArray(tupleLike);    // true

Object.getPrototypeOf(recordLike); // `Object.prototype`
Object.getPrototypeOf(tupleLike);  // `Array.prototype`
recordLike instanceof Composite;   // false
Object.getPrototypeOf(Composite(new Map())); // `Object.prototype`

Composite.fromEntries(Object.entries(someObj));
Composite.fromIterable(someIterable);
Composite.of(1, 2, 3);

// Syntax (sugar)
#{ x: 1, y: 2 }; // Composite({ x: 1, y: 2 });
#[1, 2, 3];      // Composite([1, 2, 3]);
// Generic containers
const mutableObj = new Set();
const t = #[];
const c = #{ o: mutableObj, t, z: -0 };
c.o === mutableObj;    // true
c.t === t;             // true
Object.is(c.z, -0);    // true
// Predicates
Composite.isComposite(#{}); // true
Composite.isComposite(0);   // false

#{} === #{};         // false
#{} == #{};          // false
Object.is(#[], #[]); // false

const obj = {};
Composite.equal(#[obj], #[obj]); // true

Composite.equal(#{ x: #[0, NaN] }, #{ x: #[-0, NaN] }); // true

const a = "a", z = "z";
Composite.equal(#{ a, z }, #{ z, a }); // true

Composite.equal(1, 1); // true

Composite.equal([obj], [obj]);        // false
Composite.equal(#{ length: 0 }, #[]); // false
// Language integration
new Set([#[], #[]]).size; // 1

let m = new Map();
m.set(#[], true);
m.has(#[]); // true

Map.groupBy(iter, v => #[v.x, v.y]);

Composite.isComposite(#[1, 2, 3].filter(v => v > 1)); // true

[#[]].includes(#[]); // true
[#[]].indexOf(#[]);  // 0

// There are various options on how to handle composites in weak positions.
// Option 1: reject
try { new WeakSet([#[]]) } catch { /* throws */ }
// Option 2: referential (same as regular objects)
new WeakSet([#[]]).has(#[]); // false; each #[] is a distinct object
// Option 3: composite weak key (extension of option 1, as that example would still throw)
new WeakSet([#[1, globalThis]]).has(#[1, globalThis]); // true; keyed by Composite.equal
// Option 4:
// do option 2, and have option 3 semantics as an opt-in when constructing the `Weak{Set/Map}`
@slorber commented Feb 24, 2025

Thanks for the update!

const obj = {};

Composite.equal(#[obj], #[obj]); // true

Composite.equal(#{ x: #[0, NaN] }, #{ x: #[-0, NaN] }); // true

This looks like a quite interesting behavior.

Is this expected to perform faster than deepEqual(obj1,obj2)?

Although I would have hoped for === to work, I guess this could be a good enough equivalent. A framework like React could use this for useEffect dependency array or when comparing props in React.memo.

Are there other integrations considered/planned? For example, reading API responses as composites, or converting a deeply nested object to a composite? Is JSON.parseImmutable() still on the table?

How does this get represented in TypeScript? Can we encode in the type system that something is a Composite/Record/Tuple?

I'm curious to know more about the possible WeakSet options. Depending on the behavior, this could probably be useful to implement composite interning in userland.

@littledan (Member)

Composite.equal is expected to be linear time, like any deepEqual function.

@slorber commented Feb 24, 2025

Composite.equal is expected to be linear time, like any deepEqual function.

I see thanks.

When using map.get(), set.has() and other operations with large maps/sets, can't this be a performance problem? Or is there a way to index those composites for fast lookup?

@nicolo-ribaudo (Member)

When using map.get(), set.has() and other operations with large maps/sets, can't this be a performance problem? Or is there a way to index those composites for fast lookup?

The way maps/sets work is that you need to:

  1. compute the hash of the key
  2. lookup the hash in the map
  3. check that the key indeed matches

1 and 3 are O(size of the key), and 2 is O(1). So maps/sets using composite keys would still be constant time with respect to the number of elements in the map.
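
A minimal userland sketch of those three steps (Composite.equal is the proposed API from this thread; hashComposite is a hypothetical helper standing in for the engine's internal hash):

class CompositeKeyMap {
  #buckets = new Map(); // hash -> array of { key, value } entries
  get(key) {
    const hash = hashComposite(key);        // step 1: O(size of the key)
    const bucket = this.#buckets.get(hash); // step 2: O(1)
    // step 3: confirm a candidate key really matches, O(size of the key)
    return bucket?.find(e => Composite.equal(e.key, key))?.value;
  }
  set(key, value) {
    const hash = hashComposite(key);
    const bucket = this.#buckets.get(hash) ?? [];
    const existing = bucket.find(e => Composite.equal(e.key, key));
    if (existing) existing.value = value;
    else bucket.push({ key, value });
    this.#buckets.set(hash, bucket);
    return this;
  }
}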

@rauschma commented Feb 24, 2025

This looks great!

A few ideas:

  • I’d prefer to have separate factory functions:
    const recordLike = Record({ x: 1, [Symbol()]: 2 });
    const tupleLike = Tuple([1, 2, 3]);
    
    // Predicates
    Record.isRecord(#{}); // true
    Record.isRecord(#[]); // false
    Record.isRecord(0); // false
  • Loosely related to the previous point – I think having two iterator methods instead of a single .toComposite() makes sense – e.g. when you have an iterable over pairs or an empty iterable:
    Iterator.prototype.toTuple()
    Iterator.prototype.toRecord()
    
  • Could Composite.equal() be Object.equal() and work for any values?
    • Downside: It would be different from the operation used by .indexOf(), .includes(), etc. – which then should also be exposed in some manner.
    • Upside: It’d be really useful to be able to compare arbitrary, potentially nested, values and the algorithm would mostly be the same(?)
  • That is interesting and very useful:
    // Language integration
    new Set([#[], #[]]).size; // 1
    
    let m = new Map();
    m.set(#[], true);
    m.has(#[]); // true
  • Could this lay the foundation for classes that produce immutable objects that are compared by value in collections? That also seems useful.

@demurgos commented Feb 24, 2025

Thank you for the update. I'm happy that it's moving forward, but I'm still disappointed that it requires custom integrations and comparisons instead of relying on existing mechanisms (===, indexOf, etc.). Integration with sets and maps is probably the most important use case, and Composite.equal can become the new "universal equivalence check".

const obj = {};
Composite.equal(#[obj], #[obj]); // true

Is this right? Regular objects would be allowed inside composites?

Composite.equal is expected to be linear time, like any deepEqual function.

One reason for native support of immutable structs was better integration with the engine to support stuff like interning or a cached hash. The worst case would still be linear obviously, but I assume that there may be faster paths eventually for certain patterns.

@ljharb (Member) commented Feb 24, 2025

Personally I like composite keys quite a lot, but I don't think syntax is worth it without ===, which was the vast majority of the benefit of the R&T proposal imo.

@acutmore (Collaborator, Author)

but I assume that there may be faster paths eventually for certain patterns.

Fast paths are theoretically possible: when the values are not equal and their hash values have already been computed and cached, the equality check may bail out immediately. The existence of such fast paths would not be part of the spec and should not be relied upon, so the equality check should always be thought of as being O(n). This is somewhat similar to how strings work today, though their data structure is simpler, so it is not exactly the same.

One advantage of composites is that they are guaranteed to not trigger user code (no proxies, no getters), which at least means that these aspects don't need to be guarded against. So what can be relied upon is that, given two values, the check will always return the same result and will never throw an exception.
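
A minimal sketch of the fast-path bail-out described above, assuming a hypothetical Composite.hash that returns a cached hash: unequal hashes prove inequality immediately, while equal hashes still require the full linear comparison.

function equalWithFastPath(a, b) {
  if (Composite.hash(a) !== Composite.hash(b)) return false; // fast bail-out
  return Composite.equal(a, b); // still O(n) in the worst case
}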

@acutmore (Collaborator, Author)

Is this right? Regular objects would be allowed inside composites?

That is correct. A composite would not be a guarantee of deep immutability. At this point in JS's life I think the ship has sailed; essentially every object in JS is mutable either internally or publicly. Even newer proposals such as Temporal are only internally immutable; they are still extensible.

It's possible to imagine an additional API Composite.isDeep, though this may also be a linear check unless engines found space to store an extra bit of information on every composite.
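
A hedged sketch of what a userland version of that check could look like (Composite.isComposite is the proposed predicate; the walk is linear in the size of the value, as noted above):

function isDeeplyImmutable(value) {
  if (value === null || (typeof value !== "object" && typeof value !== "function")) {
    return true; // primitives are immutable
  }
  if (!Composite.isComposite(value)) return false; // ordinary objects may mutate
  // composites are shallowly immutable; recurse into their own values
  return Reflect.ownKeys(value).every(key => isDeeplyImmutable(value[key]));
}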

@mindplay-dk

Since composites are objects, and not instanceof Composite (and since the function works on objects and not only on composites) shouldn't it be Object.isComposite rather than Composite.isComposite?

@acutmore (Collaborator, Author)

Is JSON.parseImmutable() still on the table?

Personally I would still like it to happen, similar to how it is now it would be a follow on piece of work to investigate.

How does this get represented in TypeScript? Can we encode in the type system that something is a Composite/Record/Tuple?

It would need a new marker from TypeScript to encode as structurally they are no different from regular objects/arrays. Maybe it would be composite & { readonly x: 1 }. I'll chat with the TS team to get their thoughts.

I'm curious to know more about the possible WeakSet options

  1. Always reject composites
    • simple, yet possibly surprising
  2. Always accept composites (A)
    • they are keyed by referential equality (match the way objects work today)
  3. Always accept composites (B)
    • they are keyed by Composite.equal
      • composites with no objects and no symbols* would effectively 'leak' when put in a WeakMap
      • composites with some objects/symbols* would only be removed when one of those objects/symbols is collected - otherwise they 'leak'
  4. Sometimes accept composites
    • they are keyed by Composite.equal
    • they are only accepted if they contain at least one lifetime-bearing value (a non-composite object or non-registered symbol)
    • composites with some objects/symbols* would only be removed when one of those objects/symbols is collected - otherwise they 'leak'

Note: 3 and 4 can be done in userland but 2 can't. 2 would break the similarity between Map and WeakMap.

shouldn't it be Object.isComposite rather than Composite.isComposite?

Potentially. Technically all of the APIs could be on Object, with no new Composite namespace. Putting everything on Object makes it quite crowded IMO; it felt better to have a new namespace. I'm sure the location and name of the APIs will be discussed at length.

@HappyStriker

I really hope there is still time to reconsider dropping the idea of using ===, which would be awesome and elegant.
Also, like @rauschma suggested, the dedicated Record and Tuple factory functions are more straightforward in my opinion.
What I do not get, please help me understand this, is why instanceof shall not work. The records, tuples, or even composites could just be instances of their own with Object in their prototype chain, right?

@demurgos

Also like @rauschma suggested, the dedicated Record and Tuple factory functions are more straightforward in my opinion.

I'm eagerly waiting for the minutes to see what was discussed; however, having a single constructor is consistent with dropping support for ===, and it should be considered as a whole. Except for equal, all methods on Composite are used for construction or as an "instanceof" replacement, so they could live on separate constructors/namespaces. For the equality check, though, the slides already call out that it's a 5th equality. Splitting it into Record.equal and Tuple.equal would be pure inconvenience. You could also keep the split Record/Tuple for constructors and instance checks, with equality somewhere else (e.g. Object.compositeEqual), but that feels like a worse solution compared to either:

  • Split constructor, and equivalence check using ===
  • Merged constructor, and equivalence check using a single namespaced method

Is this right? Regular objects would be allowed inside composites?

That is correct. A composite would not be a guarantee of deep immutability. At this point in JS's life I think the ship has sailed, essentially every object in JS is mutable either internally or publicly.

Thanks for the confirmation. Shallow immutability can easily be checked recursively in user code to enforce it deeply. In particular, a recursive check still allows getting rid of defensive copies, which was one of the goals of the proposal. It's cheaper to check once when the value is received rather than copying on each access. 👍

The existence of such fast paths would not be part of the spec and should not be relied upon, so the equality check should always be thought of as being O(n). This is somewhat similar to how strings work today, though their data structure is simpler so not exactly the same.

I understand that it's not exactly the same, but the way strings are handled today is proof to me that the perf discussion (and === support) is not so black and white. I agree however that the spec should not require any guarantees from implementations, and O(n) should be assumed.

@Maxdamantus commented Feb 25, 2025

// Language integration
new Set([#[], #[]]).size; // 1

let m = new Map();
m.set(#[], true);
m.has(#[]); // true

Map.groupBy(iter, v => #[v.x, v.y]);

Composite.isComposite(#[1, 2, 3].filter(v => v > 1)); // true

[#[]].includes(#[]); // true
[#[]].indexOf(#[]);  // 0

So just to clarify, does this mean practically every current use of equality is being updated except the equality operators (==, ===, !=, !==) and Object.is? That is, as if SameValueZero is being updated to treat different composite objects as equal, and === would use a new definition of equality that distinguishes them?

Same as some others here, I'm also very much in favour of using === for R/T/composite values (so as with strings, repeated evaluations of #[] would produce indistinguishable values, even if the engine uses different underlying memory allocations).

Disregarding the ergonomic advantages of R/T ===, wouldn't this make an R/T polyfill a lot more intrusive, since it would need to replace every constructor/function that uses equality? With R/T ===, the polyfill needs to maintain an inefficient trie of WeakMaps (and there are some issues around -0/+0 values), but the scope of the polyfill is at least minimal, and if R/T values are never used, there's practically no impact.
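
A simplified sketch of that interning trie (tuples of primitive values only; a real polyfill would use WeakMap nodes for object keys to avoid leaks, and would need the -0/+0 handling mentioned above):

const CANONICAL = Symbol("canonical");
const root = new Map();
function internTuple(values) {
  let node = root;
  for (const v of values) {
    if (!node.has(v)) node.set(v, new Map());
    node = node.get(v);
  }
  if (!node.has(CANONICAL)) node.set(CANONICAL, Object.freeze(values.slice()));
  return node.get(CANONICAL);
}

internTuple([1, 2]) === internTuple([1, 2]); // true: === "just works"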

@mhofman (Member) commented Feb 25, 2025

What I do not get, please help me understand this, is why instanceof shall not work.

Technically, instanceof could be made to work by having Composite[@@hasInstance] simply alias Composite.isComposite.
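
A sketch of that aliasing, assuming the proposed Composite namespace:

Object.defineProperty(Composite, Symbol.hasInstance, {
  value: (value) => Composite.isComposite(value),
});
#[1, 2] instanceof Composite; // would then be true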

That is, as if SameValueZero is being updated to treat different composite objects as equal

I believe that's the hope.

I'm curious to know more about the possible WeakSet options

Note: 3 and 4 can be done in userland but 2 can't. 2 would break the similarity between Map and WeakMap.

For the record, I stated last week that 3/4 currently seem unacceptable to me, since I expect they would break too much code that expects Weak collections to use the identity of the key. I do not, however, expect much code to rely on WeakMap and Map having interchangeable keying semantics, and since you can implement 3/4 in userland with a trie of WeakMaps, I'm OK with either 1 or 2.

@spartanatreyu commented Feb 25, 2025

Not keeping Record and Tuple separate, and merging them into Composite, seems like it's going to create a bunch of boilerplate that I'll have to remember, which I'm not going to want to write for every project and which will just push me away from using them.

If I have a function that receives a "thing", and I want to do something different depending on if it's a record, tuple or something else, I would rather write:

// nice and neat

function measureThing(thing) {
    if (typeof thing === "string") {
        return ...
    }
    if (typeof thing === "record") {
        return ...
    }
    if (typeof thing === "tuple") {
        return ...
    }
    return ...
}

or even:

// slightly less neat

function measureThing(thing) {
    if (typeof thing === "string") {
        return ...
    }
    if (Record.isRecord(thing)) {
        return ...
    }
    if (Tuple.isTuple(thing)) {
        return ...
    }
    return ...
}

rather than:

// painful

function measureThing(thing) {
    if (typeof thing === "string") {
        return ...
    }
    if (Composite.isComposite(thing)) {
        if (Array.isArray(thing)) {
            return ...
        }
        return ...
    }
    return ...
}

My first code example is really easy to read and write, plus it could be easily refactored into something like this when pattern matching becomes available:

// heaven

function measureThing(thing) {
    return match(typeof thing) {
        when "string": ...;
        when "record": ...;
        when "tuple": ...;
        default: ...;
    }
}

@Maxdamantus commented Feb 25, 2025

What I do not get, please help me understand this, is why instanceof shall not work. The records, tuples, or even composites could just be instances of their own with Object in their prototype chain, right?

I suspect this isn't the rationale, but as @rauschma alluded to, I like the idea of having records with custom prototypes (assuming this would need to be a future proposal).

The linked slide deck has a hidden slide with:

#{ __proto__: vector2DProto, x, y }

(personally, I'd prefer just specifying the prototype as a second argument to Composite or Record, eg Composite({ x, y }, vector2DProto), since the ugly __proto__ stuff shouldn't be propagated further than Object.prototype, but this is a minor nit)

I can imagine if custom prototypes are used, there would also be some syntax like composite class Vector2D { ... }, and Composite.isComposite(Vector2D(4, 5)) would be the reliable test for composite values.

This can also be compared with other "reliable" ways of detecting certain types of values:

  • Array.isArray(a) rather than a instanceof Array, since array objects might have prototypes other than Array.prototype
  • typeof o === "object" rather than o instanceof Object, since objects might lack a prototype, or have a prototype that doesn't have Object.prototype in its chain
  • Error.isError(e) rather than e instanceof Error, since it works across realms (also applies to the other examples above)

Similar to Array.isArray and Error.isError, the Composite.isComposite pattern also correctly classifies objects that might be impersonating special types of values. eg, Object.create(Array.prototype) instanceof Array, but that's not actually an array object (it doesn't have the special behaviour around setting the .length property).

@acutmore (Collaborator, Author) commented Feb 25, 2025

Disregarding the ergonomic advantages of R/T ===, wouldn't this make an R/T polyfill a lot more intrusive

Yes, === can be 'polyfilled' without having to modify existing APIs, whereas the design in this thread requires the polyfill to modify Array, Map, Set, and potentially Weak{Set,Map}, WeakRef, and FinalizationRegistry. While this is good to note, I think that the complexity here is justified. The proposal has been researching a ===-based design for many years and has come to the conclusion that it is not a design that can move forwards.

if (typeof thing === "record") {
return ...
}
if (typeof thing === "tuple") {
return ...
}

This is actually the type of code that the proposal is trying to discourage. Most functions that can work on records/tuples can also operate on objects/arrays. Conversely, there is already lots of code that currently only works for objects/arrays and would not work if passed something that had a different typeof. That said, if code did find itself needing to switch on composite objects/arrays, it could use a small custom $typeof(v) utility to clean up the switch (see the sketch after the example below). And for the pattern matching proposal, maybe custom matchers mean that it could look something like this:

function measureThing(thing) {
    return match(thing) {
        when String(str) -> ...,
        when Composite([...tupleLike]) -> ...,
        when Composite(recordLike) -> ...,
    }
}
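
And a sketch of the small $typeof utility mentioned above (the name is illustrative; Composite.isComposite is the proposed predicate):

function $typeof(value) {
  if (Composite.isComposite(value)) {
    return Array.isArray(value) ? "tuple" : "record";
  }
  return typeof value;
}

switch ($typeof(thing)) {
  case "string": /* ... */ break;
  case "record": /* ... */ break;
  case "tuple":  /* ... */ break;
  default:       /* ... */ break;
}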

@mAAdhaTTah

Personally I like composite keys quite a lot, but I don't think syntax is worth it without ===, which was the vast majority of the benefit of the R&T proposal imo.

FWIW, I feel similarly, and would note that if we were to drop the syntax from the proposal, that could free up the # for the pipeline operator's placeholder (related issue).

@spartanatreyu

if (typeof thing === "record") {
return ...
}
if (typeof thing === "tuple") {
return ...
}

This is actually the type of code that the proposal is trying to discourage.

Why?

Discouraging that style of code seems like a problem to me.

I want an easy way to tell if a variable is a record, or a tuple (or something else).

Combining both record and tuple into Composite() does not help me with that issue.

I suggested both typeof and Record.isRecord() and Tuple.isTuple() to tell them apart but typeof is preferred.

We currently use typeof for all immutable primitives (e.g. strings, numbers, symbols, etc.), and as the proposal states: You could think of Records and Tuples as "compound primitives". Since records and tuples are already immutable, it only makes sense to use typeof to tell them apart. And adding new returnable values to the typeof operator has happened before (i.e. bigint, symbol), so it's not out of the realm of possibility.

Conversely there is already lots of code that currently only works for objects/arrays and would not work if passed something that had a different typeof.

Everything that comes to my mind (for loops, searching methods, Set and Map constructors) should work similarly to the iterators and instance methods of frozen objects/arrays. Can you give an example of where this would be an issue?

That said, if code did find itself needing to switch on composite objects/arrays, it could use a small custom $typeof(v) utility to clean up the switch.

That is the exact issue I brought up. I don't want to have to write boilerplate code to tell records and tuples apart. I want it to be easy and ergonomic.

typeof does that for me.

Having to write code to distinguish Arrays from Objects, then Arrays from Frozen Arrays and Objects from Frozen Objects, is the exact reason I'm not using Object.freeze() now. It's just too much hassle. I'd rather just annotate something with as const in TypeScript, and design systems that keep my may-be-mutated and will-never-be-mutated data structures separated so I never have to differentiate between them.

If it's not easy and if it's not ergonomic, then I'm not going to use it.

@mindplay-dk

Having to write code to distinguish Arrays from Objects, then Arrays from Frozen Arrays and Objects from Frozen Objects, is the exact reason I'm not using Object.freeze() now. It's just too much hassle. I'd rather just annotate something with as const in TypeScript, and design systems that keep my may-be-mutated and will-never-be-mutated data structures separated so I never have to differentiate between them.

If it's not easy and if it's not ergonomic, then I'm not going to use it.

this. 💯

I honestly don't know about typeof though - I mean, this obviously looks great on the surface:

// nice and neat

function measureThing(thing) {
    if (typeof thing === "string") {
        return ...
    }
    if (typeof thing === "record") {
        return ...
    }
    if (typeof thing === "tuple") {
        return ...
    }
    return ...
}

but what about code that's already testing broadly for typeof thing === "object"?

if typeof is changed and begins to return more specific types, I suspect code that is trying to determine the fundamental type of something is likely going to subtly break.

For example, a simple and widely used utility like fast-deep-equal - is this comparison going to work as intended?

How about this?

I mean, I completely see your point, but I also feel like maybe the type-checking problem in JS is a problem that has been allowed to compound over many, many years - it might be time to step back and reflect on the problem itself, rather than coming up with yet another case-specific solution for a new feature.

I realize that's completely beyond the scope of this proposal, I'm just putting the thought out there. I think, no matter what we do here, ergonomically, it's going to be less than ideal - it's important however that this doesn't disrupt the ecosystem. I'd be extremely concerned and careful about changing the behavior of typeof - and, in a sense, changing the behavior of typeof perhaps ought to be seen as equally beyond the scope of this proposal as the broader type checking problem I'm talking about. 🤔

@acutmore (Collaborator, Author)

I don't think syntax is worth it without ===

One of the reasons I'm still including syntax is that otherwise the API ends up creating two objects every time. Considering how many developers wanted to use R&T to help with performance concerns, I think they would also appreciate that #{} directly allocates one object while Composite({}) allocates two.

@bloodyowl

Overall I'm not sure it'd be worth adding the composites concept at all if it (roughly) boils down to sugar for Object.freeze, a deepEqual function (isComposite) and a hashing mechanism for maps & sets.

IMO, the main value of the initial proposal is to leverage the inherent properties that deep-immutability give us.

I'd say we could have the best of both worlds in terms of ergonomics with the following (and ideally computing the record/tuple hash at creation, to allow for really fast comparisons):

const record = #{ x: 1 };
const tuple = #[1]

tuple === #[1]       // true
record === #{ x: 1 } // true

Object.is(tuple, #[1])       // true
Object.is(record, #{ x: 1 }) // true

typeof record; // "object"
typeof tuple;  // "object"

Object.isFrozen(record); // true
Object.isFrozen(tuple);  // true
Array.isArray(tuple);    // true

Object.getPrototypeOf(record); // `Object.prototype`
Object.getPrototypeOf(tuple);  // `Array.prototype`

This would make the semantics more consistent, and provide backward compatibility with existing code in the ecosystem (the syntax is opt-in and doesn't introduce a new typeof value).

@nicolo-ribaudo (Member)

The reason that this proposal was stuck for so long was the desire of having === perform comparison between records/tuples. The fundamental question is: is it worth having this proposal if #{ x: 1 } !== #{ x: 1 }?

Any approach where they are === will leave the proposal in the current stuck state.

@bloodyowl

Is there anywhere we can read about what exactly the problem with having === is? What were the considered designs?

@slorber commented Feb 26, 2025

@bloodyowl

=== behavior has been discussed here lately: #387


One of the reasons I'm still including syntax is that otherwise the API ends up creating two objects every time. Considering how many developers wanted to use R&T to help with performance concerns, I think they would also appreciate that #{} directly allocates one object while Composite({}) allocates two.

@acutmore, if we don't have === then it's not clear how R&T helps with performance concerns at all.

I also agree with @ljharb that it might not be worth it to add syntax sugar since the use-case is much more limited to things like composite keys in maps. Without ===, I feel like this feature is less likely to receive mainstream adoption, and I don't necessarily think I'll use inline records using #{} syntax in as many places as with === support.

However, there's still an interesting property that's useful for semantics more than performance. In an algorithm like shallowEqual(obj1, obj2), frameworks like React could use Composite.equal() instead of === to compare object properties. It won't be particularly fast (according to what you said above), but it remains valuable as something that conveys meaning to the underlying tools relying on comparisons to detect changes.

One example: it could be a possible replacement for something like React's use-deep-compare-effect lib with a natively supported variant:

// Effect called after each React re-render
React.useEffect(fn,[{hello: "world"}])

// Effect called only after React first render
React.useEffect(fn,[#{hello: "world"}])

The fundamental question is: is it worth having this proposal if #{ x: 1 } !== #{ x: 1 }?

I think it could be, eventually.

But for that, we also need a way to compare large Records & Tuples efficiently. Otherwise, it becomes quite useless for performant change detection systems that many frameworks rely on.

Ultimately, what I want is:

const payload1 = await apiCall("/").then(toComposite)
const payload2 = await apiCall("/").then(toComposite)

compare(payload1,payload2); // This should be fast

As far as I understand, the spec doesn't guarantee any fast comparison mechanism. And the exposed primitives do not permit us to easily implement one ourselves.

If there's already a hashing mechanism involved to index records in maps/sets buckets, exposing it could enable faster comparisons in userland.

function compare(composite1, composite2) {
  return Composite.hash(composite1) === Composite.hash(composite2) 
           && Composite.equal(composite1, composite2);
}

If there was a way to put a composite in a WeakMap, we could memoize the hashes to avoid recomputing them on every comparison.

But it would be better if this efficient comparison system was built in. On composite creation, why not lazily compute a hash in the background and memoize it? Afaik the objects are immutable, so the hash should not change. Yes, there's a cost to doing so, but it can be done with low priority, or even be delayed until the first comparison.

@acutmore (Collaborator, Author)

and ideally computing the record/tuple hash at creation, to allow for really fast comparisons):

A hash value does not make comparisons faster in general. It only allows for a fast bail-out when two values have different hashes. When two values have the same hash value, they still need to be compared in their entirety. So the equality check is a linear operation.

@bakkot (Contributor) commented Mar 12, 2025

My inclination would be to add a new prototype with copies of all the Array methods which make sense for immutable structures. (Some of them could be aliases, I guess, like .some; others would be omitted entirely, like .push.) Adding special cases to Array.prototype.map and friends seems quite fraught.

We could say these things are new tuple types but then no existing patterns work, all code has to change

I don't actually think that's true? If you have .map on tuples and .map on arrays, then you don't really have to care if you're working with a tuple or an array. I'm hard-pressed to think of an example where you'd actually need to change your patterns, except places which are relying on mutability (which of course have to update anyway).

@acutmore (Collaborator, Author)

I'm thinking specifically about code that is checking Array.isArray(input). But yeah, if the code is just assuming an array and directly reaches for .map, that would still work.

@ljharb (Member) commented Mar 12, 2025

There's no need to check that because the return value of Array.prototype.slice is always an array - the code that attempts to mutate that return value, though, would break, because it'd be the first time in decades it didn't work.

@acutmore (Collaborator, Author)

I mean other patterns that do check for Array.isArray, e.g. a function that can be passed either a number or an array of numbers and decides which one it has been passed by checking for arrays instead of checking typeof; that could break if composite arrays were not arrays.

@ljharb (Member) commented Mar 12, 2025

Sure - they can be arrays without making existing Array methods return them.

@Maxdamantus

I'm also in favour of using a separate prototype for tuples (or "composite arrays").

Might be worth noting that having a separate prototype means this pattern probably goes both ways, where the caller can decide what type of array they want, particularly if they're trying to avoid allocating intermediate array objects:

Array.prototype.filter.call(arrayLike, predicate); // returns mutable array
Tuple.prototype.filter.call(arrayLike, predicate); // returns composite array

@DominoPivot commented Mar 13, 2025

It does seem difficult to support both passing composites to code that expects arrays and using typical methods to map composites to composites without having a separate prototype.

Consider these options:

  • A composite array is a frozen array with an extra internal property used by Composite.equal. Calling a generic array method like Array.prototype.map on a composite array returns a normal mutable array, since these methods do not check the internal property. You must manually turn any mapping of a composite into a composite 👎.
  • A composite array has its own prototype and constructor which inherits from Array. Calling a generic array method like Array.prototype.map on a composite array throws, because such methods rely on the mutability of the arrays returned by the constructor. 👎
  • A composite array is a frozen array, but the specification is changed so that the generic array methods take the internal composite property into account and return composite arrays themselves if needed. This is not web compatible; it breaks existing code which relies on generic array methods to produce mutable arrays. 👎
  • A composite array has its own prototype and constructor, but its Symbol.species is still Array such that generic array methods can be applied to it, returning normal arrays. The composite array prototype has methods that shadow the generic array methods and return composite arrays.

EDIT: Corrected option 2 after @bakkot's post. I meant to imply it in the first place.

@acutmore (Collaborator, Author) commented Mar 14, 2025

On the topic of Symbol.species, just to point out that it is not an existing invariant of the language that Array.prototype.slice.call(arrayLike) returns a mutable array.

class SealedArray extends Array {
    constructor() {
        super();
        Object.seal(this);
    }
}

let copy = Array.prototype.slice.call(new SealedArray())
copy.push(0); // Uncaught TypeError: Cannot add property 0, object is not extensible

@bakkot (Contributor) commented Mar 14, 2025

  • A composite array has its own prototype and constructor. Calling a generic array method like Array.prototype.map on a composite array throws, because such methods rely on the mutability of the arrays returned by the constructor. 👎

Only if composite arrays define Symbol.species, e.g. by inheriting from Array. But we can (and should) simply not do that. Then Array.prototype.map will return a regular, non-frozen array, and it all works out fine.

@ljharb (Member) commented Mar 14, 2025

@acutmore fair point, but i think that subclassing array at all is rare enough, let alone to lock down the receiver, that it hasn't caused a problem - whereas i would expect/hope that the usage of this proposal would be drastically higher.

@DominoPivot

@bakkot Good catch, I should have been explicit about that; I was working under the assumption that a composite array must be an array. But I could have added a fifth option:

  • The composite of an array is not an array, but an array-like Tuple object whose methods return other Tuple objects. Array.isArray(#[]) returns false. As is typical of array-like objects, applying a generic array method like Array.prototype.map to a Tuple returns a normal array.

One benefit of making Tuple objects array-like and not true arrays is it avoids issues such as:

if (Array.isArray(val))
    return val.slice(); // oops, we called Tuple.prototype.slice but wanted a mutable copy

What I meant to draw attention to earlier was the fact that generic array methods themselves rely on mutability when applied to arrays. Even less intuitive than the example by @acutmore is this behavior:

class SealedArray extends Array {
    constructor(...args) {
        super(...args);
        Object.seal(this);
    }
}

let copy = Array.prototype.slice.call(new SealedArray("a", "b", "c"));
// Uncaught TypeError: Cannot add property 0, object is not extensible

We can't even copy a SealedArray with slice because slice calls new SealedArray(length) first, then tries to set the elements on the already sealed array.

To fix this, either composite arrays must have Array as their species, or they must not inherit from Array in the first place.

@mindplay-dk

To fix this, either composite arrays must have Array as their species, or they must not inherit from Array in the first place.

What if tuples were partially compatible with arrays?

A composite array can't implement splice - anything that expects a mutation would quietly fail or misbehave.

But it can implement toSpliced - and there's probably no reason it shouldn't.

But if something type-checks for arrays, clearly these are not arrays, since they can't satisfy all of an array method's contracts - pretending to be arrays (whether by falsifying type-checks or pretending to implement mutable methods) is almost definitely worse than not.

I think this is just the downside of OOP and not really something we can or should attempt to fix?

It'll be frustrating and you will likely have to do some data type juggling when integrating with third-party code that expects arrays, but that's probably just the reality of introducing a new collection type into an object oriented standard library.

At least popular methods like filter and map can be supported, which probably means a lot of JS code will "just work". Perhaps the tuple type could throw helpful error messages for the unsupported array methods.

Presumably we'll have an IDL type for the subset of the array API supported by tuples? Which, in terms of Typescript, means we could change typehints from Array to whatever the intersection of array and tuple would be called, in library code that doesn't require mutable arrays.

It will likely take a long time for the ecosystem to adapt, but I don't think there's any realistic alternative? Tuples are not arrays.

@Maxdamantus commented Mar 21, 2025

I guess the point of making them arrays is that this version of the proposal seems to say that your "composite" objects can be just like any other objects you create, just immutable. This means you don't need a special concept for tuples, since it's just a "composite array". I'm not sure I agree with this however.

Like arrays, a lot of existing types of objects are designed to be mutable, so it's not clear what it would mean to create "composite" versions of these.

For "ordinary objects" (spec term) we create ourselves (using object literal expressions, or Object.create, or new expressions on normal classes (ones that don't define a Symbol.species property)) I guess we can figure out what happens: the properties (and prototype) of the object are copied into the new object, but this raises at least two issues.

First, usually when I write a class, I'll intend for it to be mutable or immutable. It seems a bit weird to me expecting to do Composite(new SomeClass()), since the mutability should be decided by SomeClass, not an external user. Similarly to the issues with arrays, if a user does Composite(new SomeClass()).someMethod(), they're passing an unexpected value for this to that method. What if someMethod is expecting to mutate this, or worse, what if it was relying on the identity of this being the same as in the constructor? Good luck debugging issues like this.

Second, presumably for "exotic objects", there'll be internal properties that are not copied. This includes functions, maps, regexp objects, wrapper objects. Will Composite do the naïve thing and only copy the ordinary properties, or will it conservatively throw in cases that are not handled by the spec (eg, arrays)?

Regarding Composite(new SomeClass()), I think it's a lot cleaner if we expect constructors to internally create composite/record values instead of the caller:

function SomeClass(x, y) {
    return Record({ x, y }, SomeClass.prototype);
}
SomeClass.prototype.someMethod = ...

(or using some special syntax)

record class SomeClass {
    constructor(x, y) {
        // maybe the constructor is called with a temporary object that is used to construct the composite value afterwards?
        this.x = x;
        this.y = y;
        return;
        // or maybe there's syntax for constructing a record with the prototype based on the class we're syntactically within?
        return #this{ x, y };
    }
    someMethod() { ... }
}

Regarding arrays specifically, we've already got the "array-like" concept for things that are basically arrays but not really arrays, and to me it seems like tuples should fit in this category.

@mhofman (Member) commented Mar 21, 2025

A composite array can be an array (i.e. Array.isArray(#[]) is true) while having a different prototype than Array.prototype. Thinking about this, I prefer that to overloading the behavior of Array.prototype methods to recognize composite receivers.

A composite array is effectively the equivalent of a frozen array with a different prototype, except it was born frozen, so can have stable equality semantics making it usable as collection keys.

Regarding composite objects, we had some discussions for custom behavior / prototype for them. Fundamentally classes are incompatible with composite objects because the initialization of classes is not "one shot".
At the lowest level, a custom behavior for a composite object is really declaring a prototype when creating the object, e.g. #{ __proto__: vector2DProto, x, y }, but we could also imagine Composite({x, y}, vector2DProto).

What's important to understand is that Composite(foo) does not modify foo to make it a composite, but creates a new object from foo by copying its keys. As such, Composite(new SomeClass()) would be completely detached from the SomeClass instance, and should never IMO create a composite object copying the prototype.

Now there may be use cases for declaring "immutable classes". The most likely avenue for that is possibly the structs proposal. Structs are one-shot initialization, making them a good candidate for a starting point. There are discussions to split the proposal between structs and shared structs, and one of the potential things to look into for normal structs is how to declare immutability of fields. It wouldn't be a stretch to extend that to make the whole instance immutable, in which case the instance would have all the properties of a composite object.
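
A hedged sketch of the Composite({ x, y }, vector2DProto) form mentioned above (the optional prototype argument is a discussed idea, not a settled API):

const vector2DProto = {
  get magnitude() {
    return Math.hypot(this.x, this.y);
  },
};
const v = Composite({ x: 3, y: 4 }, vector2DProto);
v.magnitude;                                // 5
Object.getPrototypeOf(v) === vector2DProto; // true
Composite.isComposite(v);                   // true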

@Maxdamantus commented Mar 21, 2025

A Composite array can be an array (aka Array.isArray(#[]) while having a different prototype than Array.prototype. Thinking about this, I prefer this than overloading the behavior of Arrray.prototype methods to recognize composite receivers.

Right, but as alluded to above about subclassing Array, I think this would be surprising. The conventional way for checking if something is a normal array is currently Array.isArray(a) (and then we assume its prototype is Array.prototype), but maybe with this change we should expect to start using a instanceof Array again? Or Array.isArray(a) && a instanceof Array? Or Array.isArray(a) && !Composite.isComposite(a)?!

As such Composite(new SomeClass()) would be completely detached the SomeClass instance, and should never IMO create a composite object copying the prototype.

Right, I've always been assuming the composite object is detached from the original object, which leads to most of the points in my previous comment about which parts are copied. When I said "the properties (and prototype) of the object are copied", I actually meant the prototype property, so the composite has the same prototype as the original object, not a copy of the prototype object itself. This is based on the initial post in this thread, where Object.getPrototypeOf(Composite({})) === Object.prototype and Object.getPrototypeOf(Composite([])) === Array.prototype.

Now there may be use cases for declaring "immutable classes". The most likely avenue for that is possibly the structs proposal.

Perhaps, but then all of these discussions about identity/equality will apply to that proposal.

@mhofman (Member) commented Mar 21, 2025

Right, but as alluded to above about subclassing Array, I think this would be surprising.

I'm not so sure. If you're receiving an array as parameter, you probably shouldn't be mutating it, unless it's extremely clear that's what your function does. As such you should not be calling mutating methods on the array you receive.

If you receive an array as parameter, nothing guarantees it's not been frozen, in which case mutating methods would throw. Would an error trying to use a missing method be that different (assuming error messages these days are pretty helpful in debugging these cases)?

The last part is that the prototype methods would create further composite arrays. So const copy = arr.slice(); copy.push(123) would now fail. That would be a case where code expecting an array could trip over.

When I said "the properties (and prototype) of the object are copied", I actually meant the prototype property, so the composite has the same prototype as the original object, not a copy of the prototype object itself.

Sorry that is also what I meant, and I don't think Composite should "copy" __proto__ in any circumstances. The fact that a composite would have an array or object prototype by default would be based on the Array.isArray() nature of the object being copied, not its __proto__.

@Maxdamantus commented Mar 21, 2025

If you receive an array as parameter, nothing guarantees it's not been frozen, in which case mutating methods would throw.

I think frozen arrays are fairly niche. Presumably these discussions should focus on common idioms/conventions. JS is a very reflective language, so of course you can do lots of weird stuff to surprise consumers of your values, whether that's freezing objects or adding properties that override members from the prototype, but developers just .. shouldn't .. do .. these .. things.

The last part is that the prototype methods would create further composite arrays. ... That would be a case where code expecting an array could trip over.

👍

Sorry that is also what I meant, and I don't think Composite should "copy" __proto__ in any circumstances.

I agree with this in some sense. I'm not entirely sure what the intention is in the first post in this thread, but it looks to me like the intention is to copy __proto__. This goes to what I'm saying about just automatically creating "composite" versions of any old object.

Personally, I prefer having a Record constructor which expects the caller to specify the properties of the record. The natural way of conveying properties is using a plain object (usually with prototype Object.prototype or null), where the receiver will iterate over its "own" properties with Object.getOwnPropertyNames/Object.getOwnPropertySymbols. An existing example of this pattern would be Object.assign.

I am also in favour of records having custom prototypes, but I think when this is done through the Record constructor it should be a second optional parameter (Record(properties, prototype)).

There's a (hidden) slide from the deck at the beginning of this thread that contains #{ __proto__: vector2DProto, x, y } so obviously the idea has been at least considered. Personally I don't like this syntax. I'd prefer all syntactic members of the record literal to correspond to own keys rather than be potentially invocations of a setter on Object.prototype. Note that in current JS there's a distinction between

({ __proto__: foo })

and

({ ["__proto__"]: foo })

where I think the former behaviour should be considered an historical mistake.
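
To demonstrate the distinction: the literal __proto__ key sets the prototype, while the computed ["__proto__"] key creates an own property.

const foo = { marker: true };
Object.getPrototypeOf({ __proto__: foo }) === foo;                  // true
Object.hasOwn({ __proto__: foo }, "__proto__");                     // false
Object.getPrototypeOf({ ["__proto__"]: foo }) === Object.prototype; // true
Object.hasOwn({ ["__proto__"]: foo }, "__proto__");                 // true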

@acutmore (Collaborator, Author) commented Mar 21, 2025

Personally, I prefer having a Record constructor which expects the caller to specify the properties of the record.

This is what the original post was meant to show. Just like with the main proposal, the constructors don't look at the [[Prototype]] of the input, only its properties. The difference between getting Object.prototype vs Array.prototype was only happening due to an Array.isArray check. I've updated the post with an extra example to clarify this.

think when this is done through the Record constructor it should be a second optional parameter (Record(properties, prototype)).

Yes, I was thinking the same: something like Composite(obj, prototype).

@mindplay-dk

I'm not so sure. If you're receiving an array as parameter, you probably shouldn't be mutating it, unless it's extremely clear that's what your function does. As such you should not be calling mutating methods on the array you receive.

This is an argument about code quality or code standards.

I'm talking about correctness, especially as it relates to types - in terms of types, something is not an array if it offers a splice method that doesn't mutate the array in place, simple as that.

If it type-checks (in any way, whether Array.isArray or instanceof Array) as an array, but doesn't behave like an array, you have a serious category error which can't lead to anything good, only to unexpected errors, exceptions and bugs.

For one, this would literally break TypeScript - a block such as if (x instanceof Array) will have the x variable locally typed as Array, and this is supposed to be safe. It's a fundamental feature of the language. What are they going to do with a change like this, make the splice method of the Array type nullable? Force you into even weirder runtime checks like "splice" in x to see if an array is really really an array? Yikes.

There needs to be a distinct type check for these types, e.g. instanceof Tuple, and likely an IDL type like ImmutableArray that both Tuple and Array could then implement. Perhaps Tuple.isTuple could return true for arrays, assuming it adheres to the same public interface.

The type hierarchy needs to be carefully designed. Frozen arrays already are not arrays, and hopefully a proper tuple feature takes the place of frozen arrays; since tuples are constructed immutable from the start, they provide an opportunity to resolve this type problem (same for frozen objects vs records).

@mhofman (Member) commented Mar 22, 2025

Frozen arrays already are not arrays

Uh? In what way is a frozen array not an array?

If it type-checks (in any way, whether Array.isArray or instanceof Array) as an array, but doesn't behave like an array

There is a question of what these checks actually tell you. Array.isArray says that the object is an array exotic object, which means accessing index properties has special behavior. In particular, it updates the length property if appropriate.

instanceof Array checks if Array.prototype is on the prototype chain, giving access to its methods.

There is technically nothing preventing an array exotic object from having a different prototype. This can happen today already, although it takes some effort so it's rare.

While I agree that it would be surprising for some code that an object passing an Array.isArray check would lack some methods found on Array.prototype, my argument was that quality code that works with any input array would keep working, as that code should accept frozen arrays and not perform operations that mutate the input array.

@mindplay-dk

Frozen arrays already are not arrays

Uh? In what way is a frozen array not an array?

in terms of typing, a frozen array is no longer an array.

look here - it type-checks like an array, but it doesn't behave like an array.

even with run-time type-checking, a frozen array does not work like an array.

even duck typing falls apart here: it doesn't quack like a duck, so it's not a duck.

all of this type-checks (in strict mode) in TypeScript, too. 😿

While I agreed that it would be surprising for some code that an object passing an Array.isArray check would lack some methods found on Array.prototype, my argument was that quality code that would work with any input array would keep working, as that code should accept frozen arrays and not perform operations that mutate the input array.

don't get me wrong: I completely agree that functions should not have side-effects.

but mine or your opinion is really completely irrelevant to this question - arrays are mutable, and JS code does mutate arrays, so a decision like this comes down to a question of correctness and soundness, and backwards compatibility, not one of code quality or opinions about what someone should or should not do with arrays.

unless you want to suggest deprecating mutable methods on arrays, I don't see how we could even discuss such a thing.

@Maxdamantus commented Mar 23, 2025

look here - it type-checks like an array, but it doesn't behave like an array.

To add to your overall point, TypeScript does actually have different types for mutable and frozen arrays (actually, "readonly", since the same array could be mutable and readonly in different contexts). It's just not handled very nicely in the example (sample code abbreviated below for longevity):

function stuff(x: Array<number>): void { ... }
const b = [1,2,3];
Object.freeze(b);
stuff(b);

If it instead uses const b = Object.freeze([1, 2, 3]);, it gets the type readonly number[] aka readonly Array<number>, so TypeScript raises a static error when passing it to stuff.
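
A sketch of that distinction (TypeScript's lib typings give Object.freeze the overload freeze<T>(a: T[]): readonly T[]):

function stuff(x: number[]): void {
  x.push(4); // requires mutability
}
const frozen = Object.freeze([1, 2, 3]); // inferred as readonly number[]
stuff(frozen); // compile error: 'readonly number[]' is not assignable to 'number[]'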

Anyway, the question of "is a frozen array an array" is less important than "is a tuple an array" if the actual array methods (ie, the ones that return mutable arrays) are missing or wrong (because they return tuples instead).

Even TypeScript includes this declaration for Array.isArray, where the result being true is considered an assertion that it is a normal array (which extends Array.prototype): source

interface ArrayConstructor {
    ...
    isArray(arg: any): arg is any[];
    ...
}

One might say that this is technically incorrect because array exotic objects don't always extend Array.prototype, but this is very rare. As I've said already, the convention is that Array.isArray(a) implies that a is a normal array.

If some code uses that check and you pass an array exotic object that's frozen, or that doesn't extend Array.prototype, or extends a subtype that misbehaves, or includes own properties that override members from Array.prototype, those are probably considered to be misuses of the interface since those are thought of as weird values.

Tuples shouldn't be thought of as weird values. If Array.isArray(#[]) === true where #[] doesn't behave like an array (eg, where #[].slice() returns a new mutable array), then they are weird values that are inappropriate in existing APIs.

@acutmore (Collaborator, Author)

This discussion reminded me of: https://bsky.app/profile/searyanc.dev/post/3lfd4q6htes2p

I agree that theory says that a readonly array is different from a mutable array. If we were designing the language from scratch we could have interfaces/protocols be the norm. However we are not starting from scratch and JS has one of the strongest backwards compatibility goals, so we can't separate the methods on the Array prototype into a non-mutating base class with a mutating child class.

Engines have also asked that other proposals reduce the number of new methods to keep the core footprint of the language down (most recently, Temporal was shrunk). A whole new Tuple.prototype would be a cost.

All that said, I am now re-considering the semantics here. Looking for precedent from other proposals I was reminded of https://github.com/tc39/proposal-immutable-arraybuffer. Immutable array buffers have the same prototype and the immutability is not preserved for methods that return new instances. This suggests that composite arrays, like frozen arrays, would still have mutating methods (that throw) and methods that produce new instances would return a new non-composite, non-frozen, array exactly as they are already defined today.

@bakkot (Contributor) commented Mar 24, 2025

FWIW I personally think immutable ArrayBuffers are closer to regular ArrayBuffers than Composites are to Arrays, such that it makes sense not to have a new prototype for immutable ArrayBuffers but to have one for Composites.

@mindplay-dk

If it instead uses const b = Object.freeze([1, 2, 3]);, it gets the type readonly number[] aka readonly Array<number>

My example was a "worst case" example of the runtime issue - the typing issue was more an aside.

Whether it's TS or plain JS, the problem is the same when it comes to runtime type checking:

function stuff(thing: unknown): void {
    if (thing instanceof Array) {
        thing.splice(3,0,4);
    }
}

const b = [1,2,3];

const c = Object.freeze(b)

stuff(c) // Error: "cannot add property 3,  object is not extensible"

You can check with Object.isFrozen, but my point is that's like asking "is it really an array?" after asking if it's an array. 😌

@ljharb (Member) commented Mar 29, 2025

instanceof doesn’t tell you it’s an array, it tells you it has this realm’s Array.prototype. Array.isArray tells you it’s an array, but that doesn’t tell you it’s mutable.

@mindplay-dk

instanceof doesn’t tell you it’s an array, it tells you it has this realm’s Array.prototype.

Which, in terms of typing, means you can expect it to behave like an array. 🤷‍♂️

Array.isArray tells you it’s an array, but that doesn’t tell you it’s mutable.

In TS, Array.isArray tells you the type is Array, which implies it's mutable.

Do you want Array.isArray(x) to infer the type of x as readonly number[]?

Should we be forced to additionally check if an array is also mutable, in order to remove the readonly brand from the type?

In practice, you'll almost never see a (TS or plain JS) library performing this level of defensive type-checking.

My point is, this doesn't matter much right now, because you'll almost never see anybody actually using Object.freeze - but with the introduction of first class language constructs, you can expect this to become a much greater problem. Simply pretending that immutable arrays/objects have mutation methods isn't going to work out well for anybody. 😕

@mhofman (Member) commented Mar 29, 2025

TypeScript is notoriously bad at handling mutable and readonly annotations, which causes headaches. We actually tried to document our APIs as accepting readonly types to indicate that they never attempt mutation, but that caused a ton of problems.

That said, here we're talking about plain JavaScript. Knowing an object is an array (Array.isArray) does not inform anything about the mutability of the array, the same as knowing a value is an object doesn't inform anything about its extensibility or whether some properties are mutable or not.

I understand some code assumes it can mutate objects it receives, but I argue this is usually not the case. And when it does, it's often clear that that is the intent of the method/function.

@jayaddison

In delayed response to this comment:

I'm not sure what the point of a composite is or what it brings to the table if they're not deeply immutable.

What is the point of a record/tuple/composite, if it holds mutable data?

From what I understand, the proposal allows for -- but does not require -- Composite objects containing mutable values.

So, for example, an application could construct (or load) a Composite declared without any mutable references, and its developers could rely on that value remaining immutable. If the application were particularly risk-averse, it might also assert that the composite is deeply immutable before using it.
