2025 February TC39 presentation update #393
Comments
Thanks for the update!

```js
const obj = {};
Composite.equal(#[obj], #[obj]); // true
Composite.equal(#{ x: #[0, NaN] }, #{ x: #[-0, NaN] }); // true
```

This looks like quite interesting behavior. Is this expected to perform faster than …? Although I would have hoped for ….

Are there other integrations considered/planned? For example, reading API responses as composites, or converting a deeply nested object to a composite? Is JSON.parseImmutable() still on the table?

How does this get represented in TypeScript? Can we encode in the type system that something is a Composite/Record/Tuple? I'm curious to know more about the possible … |
|
I see, thanks. When using … |
The way maps/sets work is that you need to:

1. hash the key,
2. find the bucket for that hash,
3. compare the key for equality against the entries in that bucket.

1 and 3 are O(size of the key), and 2 is O(1). So maps/sets using a composite key would still be constant time with respect to the number of elements in the map. |
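Those three steps can be sketched in userland. This is a toy illustration only, not the engine's actual implementation; `CompositeKeyMap`, `hashKey`, and `sameKey` are hypothetical names invented for the sketch:

```js
// Toy structural map keyed by arrays of primitives.
// Step 1 (hash) and step 3 (equality) walk the whole key: O(key size).
// Step 2 (bucket lookup) is a plain Map access: O(1) in the map's size.
function hashKey(key) {
  let h = 0;
  for (const v of key) {
    h = (h * 31 + String(v).length + (typeof v).length) | 0; // step 1
  }
  return h;
}

function sameKey(a, b) {
  return a.length === b.length && a.every((v, i) => Object.is(v, b[i])); // step 3
}

class CompositeKeyMap {
  #buckets = new Map();
  set(key, value) {
    const h = hashKey(key);
    const bucket = this.#buckets.get(h) ?? []; // step 2
    const entry = bucket.find((e) => sameKey(e.key, key));
    if (entry) entry.value = value;
    else bucket.push({ key, value });
    this.#buckets.set(h, bucket);
    return this;
  }
  get(key) {
    const bucket = this.#buckets.get(hashKey(key)) ?? []; // step 2
    return bucket.find((e) => sameKey(e.key, key))?.value; // step 3
  }
}

const m = new CompositeKeyMap();
m.set([1, 2, 3], "a");
console.log(m.get([1, 2, 3])); // "a", a distinct but structurally equal key still hits
```

Note how growing the map only adds buckets; the per-lookup cost stays proportional to the key, not to the number of entries.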
This looks great! A few ideas:
|
Thank you for the update. I'm happy that it's moving forward, but I'm still disappointed that it requires custom integrations and comparisons instead of relying on existing mechanisms. (…)
Is this right? Regular objects would be allowed inside composites?
One reason for native support of immutable structs was better integration with the engine to support stuff like interning or a cached hash. The worst case would still be linear obviously, but I assume that there may be faster paths eventually for certain patterns. |
Personally I like composite keys quite a lot, but I don't think syntax is worth it without … |
Fast paths are theoretically possible: when the values are not equal, and their hash values have already been computed and cached, then the equality check may bail out immediately. The existence of such fast paths would not be part of the spec and should not be relied upon, so the equality check should always be thought of as being linear.

One advantage of composites is that they are guaranteed not to trigger user code (no proxies, no getters), which at least means that these aspects don't need to be guarded against. So what can be relied upon is that, given two values, the check will always return the same result and will never throw an exception. |
That is correct. A composite would not be a guarantee of deep immutability. At this point in JS's life I think that ship has sailed: essentially every object in JS is mutable either internally or publicly. Even newer proposals such as … It's possible to imagine an additional API … |
Since composites are objects, and not … |
Personally I would still like it to happen; similar to how it is now, it would be a follow-on piece of work to investigate.
It would need a new marker from TypeScript to encode, as structurally they are no different from regular objects/arrays. Maybe it would be …
Note: 3 and 4 can be done in userland but 2 can't. 2 would break the similarity between Map and WeakMap.
Potentially. Technically all of the APIs could be on … |
I really hope there is still time to reconsider dropping the idea of using … |
I'm eagerly waiting for the minutes to see what was discussed; however, having a single constructor is consistent with dropping support for …
Thanks for the confirmation. Shallow immutability can easily be checked recursively in user code to enforce it deeply. In particular, a recursive check still allows getting rid of defensive copies, which was one of the goals of the proposal. It's cheaper to check once when the value is received than to copy on each access. 👍
I understand that it's not exactly the same, but the way strings are handled today is proof to me that the perf discussion (and …) |
```js
// Language integration
new Set([#[], #[]]).size; // 1

let m = new Map();
m.set(#[], true);
m.has(#[]); // true

Map.groupBy(iter, v => #[v.x, v.y]);

Composite.isComposite(#[1, 2, 3].filter(v => v > 1)); // true

[#[]].includes(#[]); // true
[#[]].indexOf(#[]); // 0
```

So just to clarify, does this mean practically every current use of equality is being updated except the equality operators (…)?

Same as some others here, I'm also very much in favour of using …. Disregarding the ergonomic advantages of R/T … |
Technically
I believe that's the hope.
For the record, I stated last week that 3/4 currently seem unacceptable to me since I expect it to break too much code that expects Weak collections to use the identity of the key. I do not, however, expect much code to rely on WeakMap and Map having interchangeable keying semantics, and since you can implement 3/4 in userland with a trie of WeakMap, I'm ok with either 1 or 2. |
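The "trie of WeakMap" approach mentioned here can be sketched in userland. This is a simplified take on the composite-key pattern; `compositeKey` and the node layout are illustrative names, not a proposed API:

```js
// Userland composite keys: a trie of Maps/WeakMaps that returns one
// canonical object per sequence of parts, so Map/Set identity works.
const root = { primitives: new Map(), objects: new WeakMap(), key: null };

function node(parent, part) {
  // Object parts go in a WeakMap branch (held weakly); primitives in a Map branch.
  const branch =
    typeof part === "object" && part !== null ? parent.objects : parent.primitives;
  let next = branch.get(part);
  if (!next) {
    next = { primitives: new Map(), objects: new WeakMap(), key: null };
    branch.set(part, next);
  }
  return next;
}

function compositeKey(...parts) {
  let current = root;
  for (const part of parts) current = node(current, part);
  return (current.key ??= Object.freeze({})); // one canonical key per path
}

const a = compositeKey(1, "x");
const b = compositeKey(1, "x");
console.log(a === b); // true, same parts yield the same key object

const lookup = new Map();
lookup.set(compositeKey(1, "x"), "hit");
console.log(lookup.get(compositeKey(1, "x"))); // "hit"
```

Because object parts live in WeakMap branches, entries keyed (in part) by an object can be collected once that object is unreachable, which is the property the userland version shares with option 1/2.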
Not keeping … If I have a function that receives a "thing", and I want to do something different depending on whether it's a record, a tuple, or something else, I would rather write:

```js
// nice and neat
function measureThing(thing) {
  if (typeof thing === "string") {
    return ...
  }
  if (typeof thing === "record") {
    return ...
  }
  if (typeof thing === "tuple") {
    return ...
  }
  return ...
}
```

or even:

```js
// slightly less neat
function measureThing(thing) {
  if (typeof thing === "string") {
    return ...
  }
  if (Record.isRecord(thing)) {
    return ...
  }
  if (Tuple.isTuple(thing)) {
    return ...
  }
  return ...
}
```

rather than:

```js
// painful
function measureThing(thing) {
  if (typeof thing === "string") {
    return ...
  }
  if (Composite.isComposite(thing)) {
    if (Array.isArray(thing)) {
      return ...
    }
    return ...
  }
  return ...
}
```

My first code example is really easy to read and write, plus it could be easily refactored into something like this when pattern matching becomes available:

```js
// heaven
function measureThing(thing) {
  return match(typeof thing) {
    when "string": ...;
    when "record": ...;
    when "tuple": ...;
    default: ...;
  }
}
```
|
I suspect this isn't the rationale, but as @rauschma alluded to, I like the idea of having records with custom prototypes (assuming this would need to be a future proposal). The linked slide deck has a hidden slide with:

```js
#{ __proto__: vector2DProto, x, y }
```

(Personally, I'd prefer just specifying the prototype as a second argument to ….) I can imagine that if custom prototypes are used, there would also be some syntax like …. This can also be compared with other "reliable" ways of detecting certain types of values:
Similar to … |
Yes,
This is actually the type of code that the proposal is trying to discourage. Most functions that can work on records/tuples can also operate on objects/arrays. Conversely, there is already lots of code that currently only works for objects/arrays and would not work if passed something that had a different ….

```js
function measureThing(thing) {
  return match(thing) {
    when String(str) -> ...,
    when Composite([...tupleLike]) -> ...,
    when Composite(recordLike) -> ...,
  }
}
```
|
FWIW, I feel similar, and would note that if we were to drop the syntax from the proposal, that could free up the … |
Why? Discouraging that style of code seems like a problem to me. I want an easy way to tell if a variable is a record, or a tuple (or something else). Combining both record and tuple into … I suggested both … We currently use …
Everything that comes to my mind (for loops, searching methods, Set and Map constructors) should work similarly to the iterators and instance methods of frozen objects/arrays. Can you give an example of where this would be an issue?
That is the exact issue I brought up. I don't want to have to write boilerplate code to tell records and tuples apart. I want it to be easy and ergonomic.
Having to write code to distinguish Arrays from Objects, then Arrays from Frozen Arrays, and Objects from Frozen Objects, is the exact reason I'm not using …. If it's not easy and it's not ergonomic, then I'm not going to use it. |
this. 💯 I honestly don't know about …

```js
// nice and neat
function measureThing(thing) {
  if (typeof thing === "string") {
    return ...
  }
  if (typeof thing === "record") {
    return ...
  }
  if (typeof thing === "tuple") {
    return ...
  }
  return ...
}
```

but what about code that's already testing broadly for …? If …. For example, a simple and widely used utility like ….

How about this? I mean, I completely see your point, but I also feel like maybe the type-checking problem in JS is a problem that has been allowed to compound over many, many years. It might be time to step back and reflect on the problem itself, rather than coming up with yet another case-specific solution for a new feature. I realize that's completely beyond the scope of this proposal, I'm just putting the thought out there.

I think, no matter what we do here, ergonomically it's going to be less than ideal; it's important, however, that this doesn't disrupt the ecosystem. I'd be extremely concerned and careful about changing the behavior of … |
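For illustration, here is the shape of a typical deep-equality utility (a hypothetical minimal version, not any particular library). Note how much such code leans on `typeof` and `Array.isArray`, which is why decisions about what composites report to those checks matter for existing code:

```js
// A minimal deep-equal sketch. Code like this is widespread in the
// ecosystem; a new `typeof` result or a composite-specific prototype
// would change which branch a record/tuple falls into.
function deepEqual(a, b) {
  if (Object.is(a, b)) return true;
  if (typeof a !== "object" || typeof b !== "object" || a === null || b === null) {
    return false;
  }
  if (Array.isArray(a) !== Array.isArray(b)) return false;
  const keysA = Object.keys(a);
  const keysB = Object.keys(b);
  if (keysA.length !== keysB.length) return false;
  return keysA.every((k) => deepEqual(a[k], b[k]));
}

console.log(deepEqual({ x: [1, 2] }, { x: [1, 2] })); // true
console.log(deepEqual([1], { 0: 1 })); // false, array vs plain object
```

If `typeof` for composites stayed `"object"` and `Array.isArray` answered `true` for composite arrays, a utility like this would keep working unchanged; any other combination requires updates.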
One of the reasons I'm still including syntax is that otherwise the API ends up creating two objects every time. Considering how many developers wanted to use R&T to help with performance concerns, I think they would also appreciate that … |
Overall I'm not sure it'd be worth adding the composites concept at all if it (roughly) boils down to sugar for …. IMO, the main value of the initial proposal is to leverage the inherent properties that deep immutability gives us. I'd say we could have the best of both worlds in terms of ergonomics with the following (and ideally computing the record/tuple hash at creation, to allow for really fast comparisons):

```js
const record = #{ x: 1 };
const tuple = #[1];

tuple === #[1]; // true
record === #{ x: 1 }; // true
Object.is(tuple, #[1]); // true
Object.is(record, #{ x: 1 }); // true

typeof record; // "object"
typeof tuple; // "object"

Object.isFrozen(record); // true
Object.isFrozen(tuple); // true
Array.isArray(tuple); // true

Object.getPrototypeOf(record); // `Object.prototype`
Object.getPrototypeOf(tuple); // `Array.prototype`
```

This would make the semantics more consistent, and provide retro-compatibility with existing code in the ecosystem (syntax is opt-in and doesn't introduce a new typeof value). |
The reason that this proposal was stuck for so long was the desire of having …. Any approach where they are … |
Is there anywhere we can read about what exactly is the problem with having … |
@acutmore, if we don't have … I also agree with @ljharb that it might not be worth it to add syntax sugar, since the use-case is much more limited to things like composite keys in maps. Without …

However, there's still an interesting property that's useful for semantics more than performance. In an algo like … One example: it could be a possible replacement of something like the React use-deep-compare-effect lib with a natively supported variant:

```js
// Effect called after each React re-render
React.useEffect(fn, [{ hello: "world" }]);

// Effect called only after React's first render
React.useEffect(fn, [#{ hello: "world" }]);
```

I think it could be, eventually. But for that, we also need a way to compare large Records & Tuples efficiently. Otherwise, it becomes quite useless for the performant change-detection systems that many frameworks rely on. Ultimately, what I want is:

```js
const payload1 = await apiCall("/").then(toComposite);
const payload2 = await apiCall("/").then(toComposite);
compare(payload1, payload2); // This should be fast
```

As far as I understand, the spec doesn't guarantee any fast comparison mechanism, and the exposed primitives do not permit us to easily implement one ourselves. If there's already a hashing mechanism involved to index records in map/set buckets, exposing it could enable faster comparisons in userland:

```js
function compare(composite1, composite2) {
  return Composite.hash(composite1) === Composite.hash(composite2)
    && Composite.equal(composite1, composite2);
}
```

If there was a way to put a composite in a WeakMap, we could memoize the hashes to avoid recomputing them on every comparison. But it would be better if this efficient comparison system was built in. On composite creation, why not lazily compute a hash in the background and memoize it? Afaik the objects are immutable, so the hash should not change. Yes, there's a cost to doing so, but it can be done with low priority, or even delayed until the first comparison. |
A hash value does not make comparisons faster in general. It only allows for a fast bail-out when two values have different hashes. When two values have the same hash value, they still need to be compared in their entirety. So the equality check is a linear operation. |
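That bail-out behavior can be sketched in userland with plain frozen objects standing in for composites. `structuralHash`, `fastEqual`, and `deepCompare` are toy helpers invented for this sketch, with a WeakMap memoizing hashes as suggested in the comment above:

```js
// Memoize a structural hash per object; equal hashes still require a
// full (linear) comparison, but different hashes bail out in O(1).
const hashCache = new WeakMap();

function structuralHash(value) {
  if (typeof value !== "object" || value === null) {
    return String(value).length * 31 + (typeof value).length; // toy primitive hash
  }
  if (hashCache.has(value)) return hashCache.get(value);
  let h = 17;
  for (const key of Object.keys(value).sort()) {
    h = (h * 31 + key.length + structuralHash(value[key])) | 0;
  }
  hashCache.set(value, h);
  return h;
}

function deepCompare(a, b) {
  if (Object.is(a, b)) return true;
  if (typeof a !== "object" || a === null || typeof b !== "object" || b === null) {
    return false;
  }
  const ka = Object.keys(a);
  const kb = Object.keys(b);
  return ka.length === kb.length && ka.every((k) => deepCompare(a[k], b[k]));
}

function fastEqual(a, b) {
  if (structuralHash(a) !== structuralHash(b)) return false; // fast bail-out
  return deepCompare(a, b); // slow path: linear in the value's size
}

console.log(fastEqual({ x: 1 }, { x: 1 })); // true
console.log(fastEqual({ x: 1 }, { y: 2 })); // false
```

Note that the toy hash collides easily; collisions are fine for correctness because the slow path still runs, which is exactly why equal values never get faster than linear.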
My inclination would be to add a new prototype with copies of all the Array methods which make sense for immutable structures. (Some of them could be aliases, I guess, like ….)
I don't actually think that's true? If you have … |
I'm thinking specifically about code that is checking … |
There's no need to check that because the return value of … |
I mean, other patterns that do check for … |
Sure - they can be arrays without making existing Array methods return them. |
I'm also in favour of using a separate prototype for tuples (or "composite arrays"). Might be worth noting that having a separate prototype means this pattern probably goes both ways, where the caller can decide what type of array they want, particularly if they're trying to avoid allocating intermediate array objects:

```js
Array.prototype.filter.call(arrayLike, predicate); // returns a mutable array
Tuple.prototype.filter.call(arrayLike, predicate); // returns a composite array
```
|
It does seem difficult to support both passing composites to code that expects arrays and using typical methods to map composites to composites without having a separate prototype. Consider these options: …
EDIT: Corrected option 2 after @bakkot's post. I meant to imply it in the first place. |
On the topic of …

```js
class SealedArray extends Array {
  constructor() {
    super();
    Object.seal(this);
  }
}

let copy = Array.prototype.slice.call(new SealedArray());
copy.push(0); // Uncaught TypeError: Cannot add property 0, object is not extensible
```
|
Only if composite arrays define … |
@acutmore fair point, but i think that subclassing array at all is rare enough, let alone to lock down the receiver, that it hasn't caused a problem - whereas i would expect/hope that the usage of this proposal would be drastically higher. |
@bakkot Good catch, I should have been explicit about that; I was working under the assumption that a composite array must be an array. But I could have added a fifth option: …

One benefit of making Tuple objects array-like and not true arrays is that it avoids issues such as:

```js
if (Array.isArray(val))
  return val.slice(); // oops, we called Tuple.prototype.slice but wanted a mutable copy
```

What I meant to draw attention to earlier was the fact that generic array methods themselves rely on mutability when applied to arrays. Even less intuitive than the example by @acutmore is this behavior:

```js
class SealedArray extends Array {
  constructor(...args) {
    super(...args);
    Object.seal(this);
  }
}

let copy = Array.prototype.slice.call(new SealedArray("a", "b", "c"));
// Uncaught TypeError: Cannot add property 0, object is not extensible
```

We can't even copy a …

To fix this, either composite arrays must have … |
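One way to see the mechanics involved is `Symbol.species`, the standard hook that derived-array creation consults. This is only a sketch of how a subclass can opt its derived objects back into plain mutable Arrays, not something the proposal specifies:

```js
// A sealed Array subclass whose derived objects (slice, map, filter, ...)
// are plain mutable Arrays, selected via Symbol.species.
class SealedArray extends Array {
  constructor(...args) {
    super(...args);
    Object.seal(this);
  }
  static get [Symbol.species]() {
    return Array; // derived objects are ordinary mutable arrays
  }
}

const sealed = new SealedArray("a", "b", "c");
const copy = sealed.slice(); // no longer throws: species produces a plain Array
copy.push("d");
console.log(copy); // [ 'a', 'b', 'c', 'd' ]
```

With the species override, `slice` allocates an ordinary extensible array to copy into, so the sealed receiver only needs to be readable, which mirrors the "methods produce plain arrays" option discussed in this thread.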
What if tuples were partially compatible with arrays? A composite array can't implement …, but it can implement …. But if something type-checks for arrays, clearly these are not arrays, since they can't satisfy all of an array method's contracts; pretending to be arrays (whether by falsifying type-checks or pretending to implement mutable methods) is almost definitely worse than not. I think this is just the downside of OOP and not really something we can or should attempt to fix. It'll be frustrating, and you will likely have to do some data-type juggling when integrating with third-party code that expects arrays, but that's probably just the reality of introducing a new collection type into an object-oriented standard library. At least popular methods like … Presumably we'll have an IDL type for the subset of the array API supported by tuples? Which, in terms of TypeScript, means we could change typehints from … It will likely take a long time for the ecosystem to adapt, but I don't think there's any realistic alternative? Tuples are not arrays. |
I guess the point of making them arrays is that this version of the proposal seems to say that your "composite" objects can be just like any other objects you create, just immutable. This means you don't need a special concept for tuples, since a tuple is just a "composite array". I'm not sure I agree with this, however. Like arrays, a lot of existing types of objects are designed to be mutable, so it's not clear what it would mean to create "composite" versions of these. For "ordinary objects" (spec term) we create ourselves (using object literal expressions, or …

First, usually when I write a class, I'll intend for it to be mutable or immutable. It seems a bit weird to me expecting to do …

Second, presumably for "exotic objects", there'll be internal properties that are not copied. This includes functions, maps, regexp objects, wrapper objects. Will …

Regarding …:

```js
function SomeClass(x, y) {
  return Record({ x, y }, SomeClass.prototype);
}
SomeClass.prototype.someMethod = ...
```

(or using some special syntax)

```js
record class SomeClass {
  constructor(x, y) {
    // maybe the constructor is called with a temporary object that is used to construct the composite value afterwards?
    this.x = x;
    this.y = y;
    return;
    // or maybe there's syntax for constructing a record with the prototype based on the class we're syntactically within?
    return #this{ x, y };
  }
  someMethod() { ... }
}
```

Regarding arrays specifically, we've already got the "array-like" concept for things that are basically arrays but not really arrays, and to me it seems like tuples should fit in this category. |
A Composite array can be an array (aka …). A composite array is effectively the equivalent of a frozen array with a different prototype, except it was born frozen, so it can have stable equality semantics, making it usable as collection keys.

Regarding composite objects, we had some discussions about custom behavior / prototypes for them. Fundamentally, classes are incompatible with composite objects because the initialization of classes is not "one shot". What's important to understand is that …

Now there may be use cases for declaring "immutable classes". The most likely avenue for that is possibly the structs proposal. Structs are one-shot initialization, making them a good candidate for a starting point. There are discussions to split the proposal between structs and shared structs, and one of the potential things to look into for normal structs is how to declare immutability of fields. It wouldn't be a stretch to extend that to make the whole instance immutable, in which case the instance would have all the properties of a composite object. |
Right, but as alluded to above about subclassing …
Right, I've always been assuming the composite object is detached from the original object, which leads to most of the points in my previous comment about which parts are copied. When I said "the properties (and prototype) of the object are copied", I actually meant the prototype property, so the composite has the same prototype as the original object, not a copy of the prototype object itself. This is based on the initial post in this thread, where …
Perhaps, but then all of these discussions about identity/equality will apply to that proposal. |
I'm not so sure. If you're receiving an array as a parameter, you probably shouldn't be mutating it, unless it's extremely clear that's what your function does. As such, you should not be calling mutating methods on the array you receive. If you receive an array as a parameter, nothing guarantees it hasn't been frozen, in which case mutating methods would throw. Would an error from trying to use a missing method be that different (assuming error messages these days are pretty helpful in debugging these cases)? The last part is that the prototype methods would create further composite arrays. So …
Sorry, that is also what I meant, and I don't think … |
I think frozen arrays are fairly niche. Presumably these discussions should focus on common idioms/conventions. JS is a very reflective language, so of course you can do lots of weird stuff to surprise consumers of your values, whether that's freezing objects or adding properties that override members from the prototype, but developers just .. shouldn't .. do .. these .. things.
👍
I agree with this in some sense. I'm not entirely sure what the intention is in the first post in this thread, but it looks to me like the intention is to copy … Personally, I prefer having a … I am also in favour of records having custom prototypes, but I think when this is done through the … There's a (hidden) slide from the deck at the beginning of this thread that contains `({ __proto__: foo })` and `({ ["__proto__"]: foo })`, where I think the former behaviour should be considered an historical mistake. |
This is what the original post was meant to show. Just like with the main proposal, the constructors don't look at the …
|
This is an argument about code quality or code standards. I'm talking about correctness, especially as it relates to types. In terms of types, something is not an array if it offers a … If it type-checks (in any way, whether …) For one, this would literally break TypeScript: a block such as … There needs to be a distinct type check for these types, e.g. …. The type hierarchy needs to be carefully designed. Frozen arrays already are not arrays, and hopefully a proper tuple feature takes the place of frozen arrays, which, since they are constructed that way, provide an opportunity to resolve this type problem. (Same for frozen objects vs records.) |
Uh? In what way is a frozen array not an array?
There is a question of what these checks actually tell.
There is technically nothing preventing an array exotic object from having a different prototype. This can happen today already, although it takes some effort, so it's rare. While I agree that it would be surprising for some code that an object passing an … |
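That can be demonstrated today with a quick runnable check, using `Object.setPrototypeOf` to detach an array from `Array.prototype`:

```js
// An array exotic object with a different prototype: Array.isArray still
// reports true (it checks internal array-ness, not the prototype chain),
// while instanceof and inherited methods no longer work.
const arr = [1, 2, 3];
Object.setPrototypeOf(arr, null);

console.log(Array.isArray(arr)); // true
console.log(arr instanceof Array); // false
console.log(typeof arr.map); // "undefined": no Array.prototype methods
```

So `Array.isArray` and `instanceof Array` already answer different questions; the prototype is not what makes something an array exotic object.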
in terms of typing, a frozen array is no longer an array. look here - it type-checks like an array, but it doesn't behave like an array. even with run-time type-checking, a frozen array does not work like an array. even duck typing falls apart here: it doesn't quack like a duck, so it's not a duck. all of this type-checks (in strict mode) in TypeScript, too. 😿
don't get me wrong: I completely agree that functions should not have side-effects. but mine or your opinion is really completely irrelevant to this question - arrays are mutable, and JS code does mutate arrays, so a decision like this comes down to a question of correctness and soundness, and backwards compatibility, not one of code quality or opinions about what someone should or should not do with arrays. unless you want to suggest deprecating mutable methods on arrays, I don't see how we could even discuss such a thing. |
To add to your overall point, TypeScript does actually have different types for mutable and frozen arrays (actually, "readonly", since the same array could be mutable and readonly in different contexts). It's just not handled very nicely in the example (sample code abbreviated below for brevity):

```ts
function stuff(x: Array<number>): void { ... }

const b = [1, 2, 3];
Object.freeze(b);
stuff(b);
```

If it instead uses … Anyway, the question of "is a frozen array an array" is less important than "is a tuple an array" if the actual array methods (i.e., the ones that return mutable arrays) are missing or wrong (because they return tuples instead). Even TypeScript includes this declaration for …:

```ts
interface ArrayConstructor {
  ...
  isArray(arg: any): arg is any[];
  ...
}
```

One might say that this is technically incorrect because array exotic objects don't always extend … If some code uses that check and you pass an array exotic object that's frozen, or that doesn't extend … Tuples shouldn't be thought of as weird values. If … |
This discussion reminded me of: https://bsky.app/profile/searyanc.dev/post/3lfd4q6htes2p

I agree that theory says a readonly array is different from a mutable array. If we were designing the language from scratch, we could have interfaces/protocols be the norm. However, we are not starting from scratch, and JS has one of the strongest backwards-compatibility goals, so we can't separate the methods on the Array prototype into a non-mutating base class with a mutating child class. Engines have also asked that other proposals reduce the number of new methods to keep the core footprint of the language down (most recently, Temporal was shrunk). A whole new …

All that said, I am now re-considering the semantics here. Looking for precedent from other proposals, I was reminded of https://github.com/tc39/proposal-immutable-arraybuffer. Immutable array buffers have the same prototype, and the immutability is not preserved by methods that return new instances. This suggests that composite arrays, like frozen arrays, would still have mutating methods (that throw), and methods that produce new instances would return a new non-composite, non-frozen array exactly as they are already defined today. |
FWIW I personally think immutable ArrayBuffers are closer to regular ArrayBuffers than Composites are to Arrays, such that it makes sense not to have a new prototype for immutable ArrayBuffers but to have one for Composites. |
My example was a "worst case" example of the runtime issue; the typing issue was more an aside. Whether it's TS or plain JS, the problem is the same when it comes to runtime type checking:

```ts
function stuff(thing: unknown): void {
  if (thing instanceof Array) {
    thing.splice(3, 0, 4);
  }
}

const b = [1, 2, 3];
const c = Object.freeze(b);
stuff(c); // Error: "cannot add property 3, object is not extensible"
```

You can check with … |
instanceof doesn’t tell you it’s an array, it tells you it has this realm’s Array.prototype. Array.isArray tells you it’s an array, but that doesn’t tell you it’s mutable. |
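A quick runnable illustration of that distinction (plain JS, nothing proposal-specific):

```js
// Array.isArray reports array-ness, not mutability: a frozen array
// still passes the check but rejects mutation.
const frozen = Object.freeze([1, 2, 3]);

console.log(Array.isArray(frozen)); // true
console.log(Object.isFrozen(frozen)); // true

let threw = false;
try {
  frozen.push(4); // Array.prototype.push throws on a non-extensible array
} catch (e) {
  threw = e instanceof TypeError;
}
console.log(threw); // true
```

So any code wanting a mutable array has to check both conditions today; `Array.isArray` alone has never promised writability.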
Which, in terms of typing, means you can expect it to behave like an array. 🤷♂️
In TS, … Do you want …? Should we be forced to additionally check whether an array is also mutable, in order to remove the … In practice, you'll almost never see a (TS or plain JS) library performing this level of defensive type-checking. My point is, this doesn't matter much right now, because you'll almost never see anybody actually using … |
TypeScript is notoriously bad here: mismatched mutable and readonly annotations cause headaches. We actually tried to document our APIs as accepting readonly types to indicate that they never attempt mutation, but that caused a ton of problems.

That said, here we're talking about plain JavaScript. Knowing an object is an array (…) … I understand some code assumes it can mutate objects it receives, but I argue this is usually not the case. And when it does, it's often clear that is the intent of the method/function. |
In delayed response to this comment:
From what I understand, the proposal allows for -- but does not require -- … So, for example, an application could construct (or load) a … |
At TC39 last week (2025/02) we discussed R&T.
Slides here: https://docs.google.com/presentation/d/1uONn7T91lfZDV4frCsxpwd1QB_pU3P7F6V2j9jEPnA8/.
Minutes are typically available three weeks after plenary.
After the feedback received from committee in 2023, the proposal has been looking for a new design that:

- does not add a new primitive type (no new `typeof` result)
- does not change `===`

After discussing some designs with various delegates last week, I currently have: …