// luau/Analysis/src/Clone.cpp

// This file is part of the Luau programming language and is licensed under MIT License; see LICENSE.txt for details
#include "Luau/Clone.h"
#include "Luau/NotNull.h"
#include "Luau/RecursionCounter.h"
#include "Luau/TxnLog.h"
#include "Luau/Type.h"
#include "Luau/TypePack.h"
#include "Luau/Unifiable.h"
LUAU_FASTFLAG(DebugLuauCopyBeforeNormalizing)
LUAU_FASTFLAG(DebugLuauReadWriteProperties)
LUAU_FASTFLAG(DebugLuauDeferredConstraintResolution)
LUAU_FASTINTVARIABLE(LuauTypeCloneRecursionLimit, 300)
LUAU_FASTFLAGVARIABLE(LuauCloneCyclicUnions, false)
LUAU_FASTFLAGVARIABLE(LuauStacklessTypeClone2, false)
LUAU_FASTINTVARIABLE(LuauTypeCloneIterationLimit, 100'000)
namespace Luau
{
namespace
{
using Kind = Variant<TypeId, TypePackId>;
template<typename T>
const T* get(const Kind& kind)
{
return get_if<T>(&kind);
}
class TypeCloner2
{
NotNull<TypeArena> arena;
NotNull<BuiltinTypes> builtinTypes;
// A queue of kinds that have been shallowly cloned, but whose interior types
// have not yet been updated to point at their new clones. Once all of a kind's
// interior types have been updated, it is removed from the queue.
std::vector<Kind> queue;
NotNull<SeenTypes> types;
NotNull<SeenTypePacks> packs;
int steps = 0;
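// Overall flow: clone() seeds the traversal by shallow-cloning the root kind,
// which records a mapping in `types`/`packs` and pushes the fresh clone onto
// `queue`. run() then drains the queue, calling cloneChildren() on each pending
// entry to rewrite its interior TypeIds/TypePackIds into shallow clones of
// their own, which enqueues further work as needed.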
public:
TypeCloner2(NotNull<TypeArena> arena, NotNull<BuiltinTypes> builtinTypes, NotNull<SeenTypes> types, NotNull<SeenTypePacks> packs)
: arena(arena)
, builtinTypes(builtinTypes)
, types(types)
, packs(packs)
{
}
TypeId clone(TypeId ty)
{
shallowClone(ty);
run();
if (hasExceededIterationLimit())
{
TypeId error = builtinTypes->errorRecoveryType();
(*types)[ty] = error;
return error;
}
return find(ty).value_or(builtinTypes->errorRecoveryType());
}
TypePackId clone(TypePackId tp)
{
shallowClone(tp);
run();
if (hasExceededIterationLimit())
{
TypePackId error = builtinTypes->errorRecoveryTypePack();
(*packs)[tp] = error;
return error;
}
return find(tp).value_or(builtinTypes->errorRecoveryTypePack());
}
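// Illustrative usage, mirroring the free clone() overloads at the bottom of
// this file (the surrounding CloneState setup is shown schematically):
//
//   TypeCloner2 cloner{NotNull{&destArena}, cloneState.builtinTypes,
//       NotNull{&cloneState.seenTypes}, NotNull{&cloneState.seenTypePacks}};
//   TypeId copy = cloner.clone(someTy);
//
// Sharing the seen maps across calls lets repeated clones into the same arena
// reuse previously cloned types instead of duplicating them.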
private:
bool hasExceededIterationLimit() const
{
if (FInt::LuauTypeCloneIterationLimit == 0)
return false;
return steps + queue.size() >= size_t(FInt::LuauTypeCloneIterationLimit);
}
void run()
{
while (!queue.empty())
{
++steps;
if (hasExceededIterationLimit())
break;
Kind kind = queue.back();
queue.pop_back();
if (find(kind))
continue;
cloneChildren(kind);
}
}
std::optional<TypeId> find(TypeId ty) const
{
ty = follow(ty, FollowOption::DisableLazyTypeThunks);
if (auto it = types->find(ty); it != types->end())
return it->second;
return std::nullopt;
}
std::optional<TypePackId> find(TypePackId tp) const
{
tp = follow(tp);
if (auto it = packs->find(tp); it != packs->end())
return it->second;
return std::nullopt;
}
std::optional<Kind> find(Kind kind) const
{
if (auto ty = get<TypeId>(kind))
return find(*ty);
else if (auto tp = get<TypePackId>(kind))
return find(*tp);
else
{
LUAU_ASSERT(!"Unknown kind?");
return std::nullopt;
}
}
private:
TypeId shallowClone(TypeId ty)
{
// We want to [`Luau::follow`] but without forcing the expansion of [`LazyType`]s.
ty = follow(ty, FollowOption::DisableLazyTypeThunks);
if (auto clone = find(ty))
return *clone;
else if (ty->persistent)
return ty;
TypeId target = arena->addType(ty->ty);
asMutable(target)->documentationSymbol = ty->documentationSymbol;
(*types)[ty] = target;
queue.push_back(target);
return target;
}
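// Note that `target` above is only a copy of the source's variant payload, so
// any TypeIds/TypePackIds stored inside it still point into the source arena
// until cloneChildren() later pops it off the queue and rewrites them.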
TypePackId shallowClone(TypePackId tp)
{
tp = follow(tp);
if (auto clone = find(tp))
return *clone;
else if (tp->persistent)
return tp;
TypePackId target = arena->addTypePack(tp->ty);
(*packs)[tp] = target;
queue.push_back(target);
return target;
}
Property shallowClone(const Property& p)
{
if (FFlag::DebugLuauReadWriteProperties)
{
std::optional<TypeId> cloneReadTy;
if (auto ty = p.readType())
cloneReadTy = shallowClone(*ty);
std::optional<TypeId> cloneWriteTy;
if (auto ty = p.writeType())
cloneWriteTy = shallowClone(*ty);
std::optional<Property> cloned = Property::create(cloneReadTy, cloneWriteTy);
LUAU_ASSERT(cloned);
cloned->deprecated = p.deprecated;
cloned->deprecatedSuggestion = p.deprecatedSuggestion;
cloned->location = p.location;
cloned->tags = p.tags;
cloned->documentationSymbol = p.documentationSymbol;
return *cloned;
}
else
{
return Property{
shallowClone(p.type()),
p.deprecated,
p.deprecatedSuggestion,
p.location,
p.tags,
p.documentationSymbol,
};
}
}
void cloneChildren(TypeId ty)
{
return visit(
[&](auto&& t) {
return cloneChildren(&t);
},
asMutable(ty)->ty);
}
void cloneChildren(TypePackId tp)
{
return visit(
[&](auto&& t) {
return cloneChildren(&t);
},
asMutable(tp)->ty);
}
void cloneChildren(Kind kind)
{
if (auto ty = get<TypeId>(kind))
return cloneChildren(*ty);
else if (auto tp = get<TypePackId>(kind))
return cloneChildren(*tp);
else
LUAU_ASSERT(!"Item holds neither TypeId nor TypePackId when enqueuing its children?");
}
// ErrorType and ErrorTypePack are aliases of this type.
void cloneChildren(Unifiable::Error* t)
{
// noop.
}
void cloneChildren(BoundType* t)
{
t->boundTo = shallowClone(t->boundTo);
}
void cloneChildren(FreeType* t)
{
// TODO: clone lower and upper bounds.
// TODO: In the new solver, we should ice.
}
void cloneChildren(GenericType* t)
{
// TODO: clone upper bounds.
}
void cloneChildren(PrimitiveType* t)
{
// noop.
}
void cloneChildren(BlockedType* t)
{
// TODO: In the new solver, we should ice.
}
void cloneChildren(PendingExpansionType* t)
{
// TODO: In the new solver, we should ice.
}
void cloneChildren(SingletonType* t)
{
// noop.
}
void cloneChildren(FunctionType* t)
{
for (TypeId& g : t->generics)
g = shallowClone(g);
for (TypePackId& gp : t->genericPacks)
gp = shallowClone(gp);
t->argTypes = shallowClone(t->argTypes);
t->retTypes = shallowClone(t->retTypes);
}
void cloneChildren(TableType* t)
{
if (t->indexer)
{
t->indexer->indexType = shallowClone(t->indexer->indexType);
t->indexer->indexResultType = shallowClone(t->indexer->indexResultType);
}
for (auto& [_, p] : t->props)
p = shallowClone(p);
for (TypeId& ty : t->instantiatedTypeParams)
ty = shallowClone(ty);
for (TypePackId& tp : t->instantiatedTypePackParams)
tp = shallowClone(tp);
}
void cloneChildren(MetatableType* t)
{
t->table = shallowClone(t->table);
t->metatable = shallowClone(t->metatable);
}
void cloneChildren(ClassType* t)
{
for (auto& [_, p] : t->props)
p = shallowClone(p);
if (t->parent)
t->parent = shallowClone(*t->parent);
if (t->metatable)
t->metatable = shallowClone(*t->metatable);
if (t->indexer)
{
t->indexer->indexType = shallowClone(t->indexer->indexType);
t->indexer->indexResultType = shallowClone(t->indexer->indexResultType);
}
}
void cloneChildren(AnyType* t)
{
// noop.
}
void cloneChildren(UnionType* t)
{
for (TypeId& ty : t->options)
ty = shallowClone(ty);
}
void cloneChildren(IntersectionType* t)
{
for (TypeId& ty : t->parts)
ty = shallowClone(ty);
}
void cloneChildren(LazyType* t)
{
if (auto unwrapped = t->unwrapped.load())
t->unwrapped.store(shallowClone(unwrapped));
}
void cloneChildren(UnknownType* t)
{
// noop.
}
void cloneChildren(NeverType* t)
{
// noop.
}
void cloneChildren(NegationType* t)
{
t->ty = shallowClone(t->ty);
}
void cloneChildren(TypeFamilyInstanceType* t)
{
// TODO: In the new solver, we should ice.
}
void cloneChildren(FreeTypePack* t)
{
// TODO: clone lower and upper bounds.
// TODO: In the new solver, we should ice.
}
void cloneChildren(GenericTypePack* t)
{
// TODO: clone upper bounds.
}
void cloneChildren(BlockedTypePack* t)
{
// TODO: In the new solver, we should ice.
}
void cloneChildren(BoundTypePack* t)
{
t->boundTo = shallowClone(t->boundTo);
}
void cloneChildren(VariadicTypePack* t)
{
t->ty = shallowClone(t->ty);
}
void cloneChildren(TypePack* t)
{
for (TypeId& ty : t->head)
ty = shallowClone(ty);
if (t->tail)
t->tail = shallowClone(*t->tail);
}
void cloneChildren(TypeFamilyInstanceTypePack* t)
{
// TODO: In the new solver, we should ice.
}
};
} // namespace
namespace
{
Property clone(const Property& prop, TypeArena& dest, CloneState& cloneState)
{
if (FFlag::DebugLuauReadWriteProperties)
{
std::optional<TypeId> cloneReadTy;
if (auto ty = prop.readType())
cloneReadTy = clone(*ty, dest, cloneState);
std::optional<TypeId> cloneWriteTy;
if (auto ty = prop.writeType())
cloneWriteTy = clone(*ty, dest, cloneState);
std::optional<Property> cloned = Property::create(cloneReadTy, cloneWriteTy);
LUAU_ASSERT(cloned);
cloned->deprecated = prop.deprecated;
cloned->deprecatedSuggestion = prop.deprecatedSuggestion;
cloned->location = prop.location;
cloned->tags = prop.tags;
cloned->documentationSymbol = prop.documentationSymbol;
return *cloned;
}
else
{
return Property{
clone(prop.type(), dest, cloneState),
prop.deprecated,
prop.deprecatedSuggestion,
prop.location,
prop.tags,
prop.documentationSymbol,
};
}
}
static TableIndexer clone(const TableIndexer& indexer, TypeArena& dest, CloneState& cloneState)
{
return TableIndexer{clone(indexer.indexType, dest, cloneState), clone(indexer.indexResultType, dest, cloneState)};
}
struct TypePackCloner;
/*
* Both TypeCloner and TypePackCloner work by depositing the requested type variable into the appropriate 'seen' set.
* They do not return anything because their sole consumer (the deepClone function) already has a pointer into this storage.
*/
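// For example, on the non-stackless path, clone(TypeId, TypeArena&, CloneState&)
// below looks up cloneState.seenTypes[typeId]; if the slot is still null, it
// constructs a TypeCloner and visits typeId->ty, and the visitor deposits the
// finished clone into that same slot.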
struct TypeCloner
{
TypeCloner(TypeArena& dest, TypeId typeId, CloneState& cloneState)
: dest(dest)
, typeId(typeId)
, seenTypes(cloneState.seenTypes)
, seenTypePacks(cloneState.seenTypePacks)
, cloneState(cloneState)
{
}
TypeArena& dest;
TypeId typeId;
SeenTypes& seenTypes;
SeenTypePacks& seenTypePacks;
CloneState& cloneState;
template<typename T>
void defaultClone(const T& t);
void operator()(const FreeType& t);
void operator()(const GenericType& t);
void operator()(const BoundType& t);
void operator()(const ErrorType& t);
void operator()(const BlockedType& t);
void operator()(const PendingExpansionType& t);
void operator()(const PrimitiveType& t);
void operator()(const SingletonType& t);
void operator()(const FunctionType& t);
void operator()(const TableType& t);
void operator()(const MetatableType& t);
void operator()(const ClassType& t);
void operator()(const AnyType& t);
void operator()(const UnionType& t);
void operator()(const IntersectionType& t);
void operator()(const LazyType& t);
void operator()(const UnknownType& t);
void operator()(const NeverType& t);
void operator()(const NegationType& t);
void operator()(const TypeFamilyInstanceType& t);
};
struct TypePackCloner
{
TypeArena& dest;
TypePackId typePackId;
SeenTypes& seenTypes;
SeenTypePacks& seenTypePacks;
CloneState& cloneState;
TypePackCloner(TypeArena& dest, TypePackId typePackId, CloneState& cloneState)
: dest(dest)
, typePackId(typePackId)
, seenTypes(cloneState.seenTypes)
, seenTypePacks(cloneState.seenTypePacks)
, cloneState(cloneState)
{
}
template<typename T>
void defaultClone(const T& t)
{
TypePackId cloned = dest.addTypePack(TypePackVar{t});
seenTypePacks[typePackId] = cloned;
}
void operator()(const FreeTypePack& t)
{
defaultClone(t);
}
void operator()(const GenericTypePack& t)
{
defaultClone(t);
}
void operator()(const ErrorTypePack& t)
{
defaultClone(t);
}
void operator()(const BlockedTypePack& t)
{
defaultClone(t);
}
// While we are a-cloning, we can flatten out bound Types and make things a bit tighter.
// We just need to be sure that we rewrite pointers both to the binder and the bindee to the same pointer.
void operator()(const Unifiable::Bound<TypePackId>& t)
{
TypePackId cloned = clone(t.boundTo, dest, cloneState);
if (FFlag::DebugLuauCopyBeforeNormalizing)
cloned = dest.addTypePack(TypePackVar{BoundTypePack{cloned}});
seenTypePacks[typePackId] = cloned;
}
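// For instance, when pack P is bound to pack Q, cloning P ordinarily yields the
// clone of Q directly and records it under P in seenTypePacks; only under
// DebugLuauCopyBeforeNormalizing is the extra BoundTypePack layer preserved.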
void operator()(const VariadicTypePack& t)
{
TypePackId cloned = dest.addTypePack(TypePackVar{VariadicTypePack{clone(t.ty, dest, cloneState), /*hidden*/ t.hidden}});
seenTypePacks[typePackId] = cloned;
}
void operator()(const TypePack& t)
{
TypePackId cloned = dest.addTypePack(TypePack{});
TypePack* destTp = getMutable<TypePack>(cloned);
LUAU_ASSERT(destTp != nullptr);
seenTypePacks[typePackId] = cloned;
for (TypeId ty : t.head)
destTp->head.push_back(clone(ty, dest, cloneState));
if (t.tail)
destTp->tail = clone(*t.tail, dest, cloneState);
}
void operator()(const TypeFamilyInstanceTypePack& t)
{
TypePackId cloned = dest.addTypePack(TypeFamilyInstanceTypePack{t.family, {}, {}});
TypeFamilyInstanceTypePack* destTp = getMutable<TypeFamilyInstanceTypePack>(cloned);
LUAU_ASSERT(destTp);
seenTypePacks[typePackId] = cloned;
destTp->typeArguments.reserve(t.typeArguments.size());
for (TypeId ty : t.typeArguments)
destTp->typeArguments.push_back(clone(ty, dest, cloneState));
destTp->packArguments.reserve(t.packArguments.size());
for (TypePackId tp : t.packArguments)
destTp->packArguments.push_back(clone(tp, dest, cloneState));
}
};
template<typename T>
void TypeCloner::defaultClone(const T& t)
{
TypeId cloned = dest.addType(t);
seenTypes[typeId] = cloned;
}
void TypeCloner::operator()(const FreeType& t)
{
if (FFlag::DebugLuauDeferredConstraintResolution)
{
FreeType ft{t.scope, clone(t.lowerBound, dest, cloneState), clone(t.upperBound, dest, cloneState)};
TypeId res = dest.addType(ft);
seenTypes[typeId] = res;
}
else
defaultClone(t);
}
void TypeCloner::operator()(const GenericType& t)
{
defaultClone(t);
}
void TypeCloner::operator()(const Unifiable::Bound<TypeId>& t)
{
TypeId boundTo = clone(t.boundTo, dest, cloneState);
if (FFlag::DebugLuauCopyBeforeNormalizing)
boundTo = dest.addType(BoundType{boundTo});
seenTypes[typeId] = boundTo;
}
void TypeCloner::operator()(const Unifiable::Error& t)
{
defaultClone(t);
}
void TypeCloner::operator()(const BlockedType& t)
{
defaultClone(t);
}
void TypeCloner::operator()(const PendingExpansionType& t)
{
TypeId res = dest.addType(PendingExpansionType{t.prefix, t.name, t.typeArguments, t.packArguments});
PendingExpansionType* petv = getMutable<PendingExpansionType>(res);
LUAU_ASSERT(petv);
seenTypes[typeId] = res;
std::vector<TypeId> typeArguments;
for (TypeId arg : t.typeArguments)
typeArguments.push_back(clone(arg, dest, cloneState));
std::vector<TypePackId> packArguments;
for (TypePackId arg : t.packArguments)
packArguments.push_back(clone(arg, dest, cloneState));
petv->typeArguments = std::move(typeArguments);
petv->packArguments = std::move(packArguments);
}
void TypeCloner::operator()(const PrimitiveType& t)
{
defaultClone(t);
}
void TypeCloner::operator()(const SingletonType& t)
{
defaultClone(t);
}
void TypeCloner::operator()(const FunctionType& t)
{
// FISHY: We always erase the scope when we clone things. clone() was
// originally written so that we could copy a module's type surface into an
// export arena. This probably dates to that.
TypeId result = dest.addType(FunctionType{TypeLevel{0, 0}, {}, {}, nullptr, nullptr, t.definition, t.hasSelf});
FunctionType* ftv = getMutable<FunctionType>(result);
LUAU_ASSERT(ftv != nullptr);
seenTypes[typeId] = result;
for (TypeId generic : t.generics)
ftv->generics.push_back(clone(generic, dest, cloneState));
for (TypePackId genericPack : t.genericPacks)
ftv->genericPacks.push_back(clone(genericPack, dest, cloneState));
ftv->tags = t.tags;
ftv->argTypes = clone(t.argTypes, dest, cloneState);
ftv->argNames = t.argNames;
ftv->retTypes = clone(t.retTypes, dest, cloneState);
ftv->hasNoFreeOrGenericTypes = t.hasNoFreeOrGenericTypes;
}
void TypeCloner::operator()(const TableType& t)
{
// If table is now bound to another one, we ignore the content of the original
if (!FFlag::DebugLuauCopyBeforeNormalizing && t.boundTo)
{
TypeId boundTo = clone(*t.boundTo, dest, cloneState);
seenTypes[typeId] = boundTo;
return;
}
TypeId result = dest.addType(TableType{});
TableType* ttv = getMutable<TableType>(result);
LUAU_ASSERT(ttv != nullptr);
*ttv = t;
seenTypes[typeId] = result;
ttv->level = TypeLevel{0, 0};
if (FFlag::DebugLuauCopyBeforeNormalizing && t.boundTo)
ttv->boundTo = clone(*t.boundTo, dest, cloneState);
for (const auto& [name, prop] : t.props)
ttv->props[name] = clone(prop, dest, cloneState);
if (t.indexer)
ttv->indexer = clone(*t.indexer, dest, cloneState);
for (TypeId& arg : ttv->instantiatedTypeParams)
arg = clone(arg, dest, cloneState);
for (TypePackId& arg : ttv->instantiatedTypePackParams)
arg = clone(arg, dest, cloneState);
ttv->definitionModuleName = t.definitionModuleName;
ttv->definitionLocation = t.definitionLocation;
ttv->tags = t.tags;
}
void TypeCloner::operator()(const MetatableType& t)
{
TypeId result = dest.addType(MetatableType{});
MetatableType* mtv = getMutable<MetatableType>(result);
seenTypes[typeId] = result;
mtv->table = clone(t.table, dest, cloneState);
mtv->metatable = clone(t.metatable, dest, cloneState);
}
void TypeCloner::operator()(const ClassType& t)
{
TypeId result = dest.addType(ClassType{t.name, {}, std::nullopt, std::nullopt, t.tags, t.userData, t.definitionModuleName});
ClassType* ctv = getMutable<ClassType>(result);
seenTypes[typeId] = result;
for (const auto& [name, prop] : t.props)
ctv->props[name] = clone(prop, dest, cloneState);
if (t.parent)
ctv->parent = clone(*t.parent, dest, cloneState);
if (t.metatable)
ctv->metatable = clone(*t.metatable, dest, cloneState);
if (t.indexer)
ctv->indexer = clone(*t.indexer, dest, cloneState);
}
void TypeCloner::operator()(const AnyType& t)
{
defaultClone(t);
}
void TypeCloner::operator()(const UnionType& t)
2022-04-07 22:29:01 +01:00
{
if (FFlag::LuauCloneCyclicUnions)
{
// We're just using this FreeType as a placeholder until we've finished
// cloning the parts of this union, so it is okay that its bounds are
// nullptr. We'll never dereference them.
TypeId result = dest.addType(FreeType{nullptr, /*lowerBound*/ nullptr, /*upperBound*/ nullptr});
seenTypes[typeId] = result;
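// Registering the placeholder before cloning the options is what breaks
// cycles: if an option leads back to this union, the recursive clone() call
// finds `result` in seenTypes and stops instead of recursing forever.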
std::vector<TypeId> options;
options.reserve(t.options.size());
for (TypeId ty : t.options)
options.push_back(clone(ty, dest, cloneState));
asMutable(result)->ty.emplace<UnionType>(std::move(options));
}
else
{
std::vector<TypeId> options;
options.reserve(t.options.size());
for (TypeId ty : t.options)
options.push_back(clone(ty, dest, cloneState));
TypeId result = dest.addType(UnionType{std::move(options)});
seenTypes[typeId] = result;
}
}
void TypeCloner::operator()(const IntersectionType& t)
{
TypeId result = dest.addType(IntersectionType{});
seenTypes[typeId] = result;
IntersectionType* option = getMutable<IntersectionType>(result);
LUAU_ASSERT(option != nullptr);
for (TypeId ty : t.parts)
option->parts.push_back(clone(ty, dest, cloneState));
}
void TypeCloner::operator()(const LazyType& t)
{
if (TypeId unwrapped = t.unwrapped.load())
{
seenTypes[typeId] = clone(unwrapped, dest, cloneState);
}
else
{
defaultClone(t);
}
}
void TypeCloner::operator()(const UnknownType& t)
{
defaultClone(t);
}
void TypeCloner::operator()(const NeverType& t)
{
defaultClone(t);
}
void TypeCloner::operator()(const NegationType& t)
{
TypeId result = dest.addType(AnyType{});
seenTypes[typeId] = result;
TypeId ty = clone(t.ty, dest, cloneState);
asMutable(result)->ty = NegationType{ty};
}
void TypeCloner::operator()(const TypeFamilyInstanceType& t)
{
TypeId result = dest.addType(TypeFamilyInstanceType{
t.family,
{},
{},
});
seenTypes[typeId] = result;
TypeFamilyInstanceType* tfit = getMutable<TypeFamilyInstanceType>(result);
LUAU_ASSERT(tfit != nullptr);
tfit->typeArguments.reserve(t.typeArguments.size());
for (TypeId p : t.typeArguments)
tfit->typeArguments.push_back(clone(p, dest, cloneState));
tfit->packArguments.reserve(t.packArguments.size());
for (TypePackId p : t.packArguments)
tfit->packArguments.push_back(clone(p, dest, cloneState));
}
} // anonymous namespace
TypePackId clone(TypePackId tp, TypeArena& dest, CloneState& cloneState)
{
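// Persistent types and packs are shared, interned values (typically the
// builtin types), so they are handed back unchanged rather than copied into
// `dest`.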
if (tp->persistent)
return tp;
if (FFlag::LuauStacklessTypeClone2)
{
TypeCloner2 cloner{NotNull{&dest}, cloneState.builtinTypes, NotNull{&cloneState.seenTypes}, NotNull{&cloneState.seenTypePacks}};
return cloner.clone(tp);
}
else
{
RecursionLimiter _ra(&cloneState.recursionCount, FInt::LuauTypeCloneRecursionLimit);
TypePackId& res = cloneState.seenTypePacks[tp];
if (res == nullptr)
{
TypePackCloner cloner{dest, tp, cloneState};
Luau::visit(cloner, tp->ty); // Mutates the storage that 'res' points into.
}
return res;
}
}
TypeId clone(TypeId typeId, TypeArena& dest, CloneState& cloneState)
{
if (typeId->persistent)
return typeId;
if (FFlag::LuauStacklessTypeClone2)
{
TypeCloner2 cloner{NotNull{&dest}, cloneState.builtinTypes, NotNull{&cloneState.seenTypes}, NotNull{&cloneState.seenTypePacks}};
return cloner.clone(typeId);
}
else
{
RecursionLimiter _ra(&cloneState.recursionCount, FInt::LuauTypeCloneRecursionLimit);
TypeId& res = cloneState.seenTypes[typeId];
if (res == nullptr)
{
TypeCloner cloner{dest, typeId, cloneState};
Luau::visit(cloner, typeId->ty); // Mutates the storage that 'res' points into.
// Persistent types are not being cloned and we get the original type back which might be read-only
if (!res->persistent)
{
asMutable(res)->documentationSymbol = typeId->documentationSymbol;
}
}
return res;
}
}
TypeFun clone(const TypeFun& typeFun, TypeArena& dest, CloneState& cloneState)
{
if (FFlag::LuauStacklessTypeClone2)
{
TypeCloner2 cloner{NotNull{&dest}, cloneState.builtinTypes, NotNull{&cloneState.seenTypes}, NotNull{&cloneState.seenTypePacks}};
TypeFun copy = typeFun;
for (auto& param : copy.typeParams)
{
param.ty = cloner.clone(param.ty);
if (param.defaultValue)
param.defaultValue = cloner.clone(*param.defaultValue);
}
for (auto& param : copy.typePackParams)
{
param.tp = cloner.clone(param.tp);
if (param.defaultValue)
param.defaultValue = cloner.clone(*param.defaultValue);
}
copy.type = cloner.clone(copy.type);
return copy;
}
else
{
TypeFun result;
for (auto param : typeFun.typeParams)
{
TypeId ty = clone(param.ty, dest, cloneState);
std::optional<TypeId> defaultValue;
if (param.defaultValue)
defaultValue = clone(*param.defaultValue, dest, cloneState);
result.typeParams.push_back({ty, defaultValue});
}
for (auto param : typeFun.typePackParams)
{
TypePackId tp = clone(param.tp, dest, cloneState);
std::optional<TypePackId> defaultValue;
if (param.defaultValue)
defaultValue = clone(*param.defaultValue, dest, cloneState);
result.typePackParams.push_back({tp, defaultValue});
}
result.type = clone(typeFun.type, dest, cloneState);
return result;
}
}
} // namespace Luau