Compare commits

..

148 commits

Author SHA1 Message Date
daimond113
4f48963da3
chore(release): prepare for v0.6.1 2025-03-09 21:29:49 +01:00
daimond113
00bc83b4e5
docs: update logo url to 0.6 branch 2025-03-09 21:21:57 +01:00
daimond113
412ce90e7f
feat: add proper versioning to lockfiles
Lockfiles now store a format field, which is used
to migrate them easily without brute-forcing
the version they're at.
2025-03-09 18:57:20 +01:00
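The idea, in a sketch: the lockfile carries an explicit format number, and each migration step bumps it until the current format is reached. A minimal Rust illustration, assuming hypothetical `Lockfile`/`format`/`CURRENT_FORMAT` names rather than pesde's actual definitions:

```rust
use serde::{Deserialize, Serialize};

// Hypothetical shape; only the `format` field is the point here.
#[derive(Serialize, Deserialize)]
struct Lockfile {
    // Missing in older lockfiles, so default to 0 and migrate upward.
    #[serde(default)]
    format: u32,
    // ... remaining lockfile fields ...
}

// Illustrative current format version.
const CURRENT_FORMAT: u32 = 1;

fn migrate(mut lockfile: Lockfile) -> Lockfile {
    // Each step raises `format` by exactly one, so a lockfile can be
    // upgraded across several releases without guessing its version.
    while lockfile.format < CURRENT_FORMAT {
        // ... apply the migration for this step, then:
        lockfile.format += 1;
    }
    lockfile
}
```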
daimond113
b8c4f7486b
refactor: switch from sync Path::exists() method
This commit disallows the method through Clippy
and switches to the async equivalents, so as not
to block the async runtime.
2025-03-09 17:41:38 +01:00
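The enforcement lives in the new clippy.toml added later in this diff (`disallowed-methods = ["std::path::Path::exists"]`); call sites then switch to an awaited check. A sketch of the swap (whether pesde uses `tokio::fs` or fs-err's tokio wrappers at a given call site is an assumption):

```rust
use std::path::Path;

// Before: path.exists() does blocking IO on the async runtime's thread.
// After: the check is awaited, keeping worker threads free.
async fn is_present(path: &Path) -> bool {
    // `try_exists` also surfaces IO errors instead of silently
    // reporting `false`; here we collapse them for brevity.
    tokio::fs::try_exists(path).await.unwrap_or(false)
}
```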
daimond113
af93b7d584
fix: correctly get aliases for bin linkers
Fixes two issues with getting the aliases. The
first was that direct dependency aliases weren't
collected. The second was that these aliases were
handled in a case-insensitive manner, which is
incorrect.
2025-03-09 17:13:23 +01:00
daimond113
292565b647
feat: add @generated to lockfiles
Adds the @generated marker to lockfiles to make
them easily recognizable as auto-generated.
2025-03-09 16:16:44 +01:00
daimond113
e6ee935c11
feat: show available targets when none fit
This will help prevent user confusion around
targets. Instead of a cryptic "no matching
versions available" error, it'll now list the
available targets (if the source decides to output
them, i.e. if the specifier has a target option).
2025-03-09 16:01:54 +01:00
daimond113
0e73db2831
fix: add missing run alias behaviour
The `run <alias>` behaviour was documented yet
missing. This commit adds it.
2025-03-09 14:53:22 +01:00
daimond113
12c62d315d
refactor: build search without async
It didn't benefit from being async (nothing in
it is a future). For some reason, Clippy didn't
complain despite the unused_async lint being
enabled.
2025-03-09 12:59:59 +01:00
daimond113
19d6a07851
fix: make bin linkers for non-direct dependencies
Previously, binary linkers weren't made for
non-direct dependencies despite this being the
documented & expected behaviour.
2025-03-09 01:33:41 +01:00
daimond113
9a75ebf637
refactor: handle bin linkers in rust
Previously, binary linkers were handled by a Luau
script. This was not cross-runtime portable, and
forced us to do many "hacks" in order to be able
to implement them. To solve these issues, they are
now handled with Rust, which allows us to use our
existing infrastructure.
2025-03-09 01:18:12 +01:00
daimond113
41337ac96a
fix: don't inherit workspace by path dependencies 2025-03-08 23:24:35 +01:00
daimond113
9ad691ee94
refactor: undo send archive changes 2025-03-08 23:15:36 +01:00
daimond113
31f74cff65
docs: document adding wally dependencies 2025-03-08 22:38:35 +01:00
daimond113
c7a8f919e2
refactor: apply clippy lints 2025-03-08 22:37:42 +01:00
daimond113
ff5c2e5d61
refactor: specify many new clippy lints
Adds quite a lot of Clippy lints which fit my
personal taste for how pesde's codebase should
look. Stylistic lints are mostly warns, and
behavioural lints are mostly denied.
2025-03-08 22:00:52 +01:00
daimond113
7520e8e8fc
chore(registry-release): prepare for v0.2.1 2025-03-02 02:58:43 +01:00
daimond113
86a111cf83
feat(registry): print more error info
Previously, only the root error's Display was
printed. Now we print the Display of the root
error and of every parent error as well.
2025-03-02 02:55:28 +01:00
daimond113
308320602f
fix(website): use v1 api for docs 2025-02-23 12:43:54 +01:00
daimond113
cb3c47477a
docs: update security policy 2025-02-22 22:22:04 +01:00
daimond113
1b984f2d82
docs: use list for engine support req 2025-02-22 21:50:00 +01:00
daimond113
c9faef256a
chore(release): prepare for v0.6.0 2025-02-22 11:50:49 +01:00
daimond113
e2f4e35a02
docs: rename config.yaml to config.yml 2025-02-18 00:33:19 +01:00
daimond113
5d58b9dffb
docs: update issue templates 2025-02-18 00:31:21 +01:00
dai
467a000c5c
docs: create issue templates 2025-02-18 00:25:32 +01:00
daimond113
8510fcf030
chore(release): prepare for v0.6.0-rc.8 2025-02-17 00:35:59 +01:00
daimond113
ef471069d8
fix: don't panic on SIGINT exits
Unwrapping the process code caused panics on
SIGINTs. Now we default to code 1 if there is
none.
2025-02-17 00:33:49 +01:00
daimond113
1bc5defd9c
docs: document roblox_server target 2025-02-16 14:23:47 +01:00
daimond113
f6f0cb48f8
chore: add missing changelog entries 2025-02-16 11:38:36 +01:00
daimond113
aa4f283e4c
refactor: allow publishing other packages on error
Previously, if a package failed to publish, the
whole publishing process would halt. Then, if a
re-run of publishing got to an already-published
package, it would error, requiring users to
manually publish the missing packages. This commit
fixes that by printing errors but still allowing
the other members to be published.
2025-02-16 11:32:16 +01:00
daimond113
9b70929e02
chore(release): prepare for v0.6.0-rc.7 2025-02-15 00:09:07 +01:00
daimond113
ff6449a340
chore: fix formatting 2025-02-15 00:08:47 +01:00
daimond113
509838bb08
fix: remove typo in duplicate key error
The message previously contained "at line", but
we don't have a way of knowing the line number so
there was nothing after that. This commit removes
the "at line" part of the message.
2025-02-15 00:03:05 +01:00
daimond113
d6e2f611d8
perf: use or_else variants for expensive errors
This commit changes the error construction in
multiple places from the `*_or` to the `*_or_else`
variants. This is done to avoid the heap
allocation (for example, `to_string`) when there
is no need to.
2025-02-14 23:42:33 +01:00
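The difference in a nutshell: `*_or` builds its fallback value eagerly, even on the success path, while `*_or_else` defers construction to a closure that only runs on failure. An illustrative example, not pesde's actual code:

```rust
fn find_version(versions: &[String], wanted: &str) -> Result<String, String> {
    versions
        .iter()
        .find(|v| v.as_str() == wanted)
        .cloned()
        // Eager: the `format!` heap allocation happens on every call:
        //   .ok_or(format!("no version matching {wanted}"))
        // Lazy: the allocation only happens when the lookup fails:
        .ok_or_else(|| format!("no version matching {wanted}"))
}
```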
daimond113
5ea86e632e
feat(website): add archive download button 2025-02-14 23:28:15 +01:00
daimond113
a4f2829db2
refactor: remove unused trait 2025-02-14 23:09:44 +01:00
daimond113
5c43e3cb95
refactor: improve outdated command output
Adds some colour to the outdated command's output,
fitting in with other places such as the CLI's
update checker. Also switches `->` to `→` for the
same reason.
2025-02-13 20:16:42 +01:00
daimond113
82e0d38483
fix: print update available message to stderr
Previously, the update available message was
printed to stdout, which is not the correct
place for such messages. This commit changes
the message to be printed to stderr instead,
which will prevent potential issues with
piping the output of the command to another
command.
2025-02-13 20:10:08 +01:00
daimond113
7150f6a7da
style: enable lints in registry 2025-02-12 23:42:16 +01:00
daimond113
2e02fecd46
chore(registry): add missing changelog entries 2025-02-12 23:15:43 +01:00
daimond113
ae3530126f
feat: add i alias to install command
Adds an alias `i` to the `install` command, which
is common in a few other package managers.
2025-02-12 23:15:01 +01:00
daimond113
4786adf187
refactor: make aliases case-insensitive
Previously, the aliases `foo` and `Foo` were
treated as different aliases. This could cause
issues with linking on case-insensitive
filesystems. This commit makes aliases
case-insensitive, but they are still stored in the
case they were defined in.
2025-02-12 23:14:05 +01:00
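A hypothetical sketch of those semantics: equality and hashing ignore ASCII case, while the originally written string is kept for storage and display (the `Alias` type here is illustrative, not pesde's):

```rust
use std::hash::{Hash, Hasher};

#[derive(Debug, Clone)]
struct Alias(String); // keeps the case the alias was defined in

impl PartialEq for Alias {
    fn eq(&self, other: &Self) -> bool {
        self.0.eq_ignore_ascii_case(&other.0)
    }
}
impl Eq for Alias {}

impl Hash for Alias {
    fn hash<H: Hasher>(&self, state: &mut H) {
        // Hash must agree with Eq, so hash the case-folded form.
        for b in self.0.bytes() {
            state.write_u8(b.to_ascii_lowercase());
        }
    }
}
```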
daimond113
04aaa40c69
refactor: make specifier index not an option
Instead, we use the `default` serde attribute
to deserialize the index as `DEFAULT_INDEX_NAME`
if it is not present. This removes a lot of
redundancy around the codebase when getting
the index name.
2025-02-12 17:41:49 +01:00
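Roughly, the serde attribute moves the defaulting into deserialization, so downstream code never sees an `Option`. A sketch, with an illustrative struct shape and an assumed `"default"` value for `DEFAULT_INDEX_NAME`:

```rust
use serde::Deserialize;

const DEFAULT_INDEX_NAME: &str = "default"; // assumed value

fn default_index() -> String {
    DEFAULT_INDEX_NAME.to_string()
}

#[derive(Deserialize)]
struct Specifier {
    version: String,
    // Previously Option<String>, forcing `unwrap_or(DEFAULT_INDEX_NAME)`
    // at every use site; now the default is filled in exactly once.
    #[serde(default = "default_index")]
    index: String,
}
```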
daimond113
b384dbe6a0
chore: fix changelog rc diff link 2025-02-10 23:05:33 +01:00
daimond113
8a8bbcbd02
chore(release): prepare for v0.6.0-rc.6 2025-02-10 23:04:41 +01:00
daimond113
c6e7e74a53
fix: fix double long prefix paths on windows
Our previous attempt at fixing the long-path issue
on Windows caused a regression. This commit
attempts to fix both issues by simply
canonicalizing the path.
2025-02-10 22:52:00 +01:00
daimond113
041b14f404
chore(release): prepare for v0.6.0-rc.5 2025-02-10 09:01:41 +01:00
daimond113
ea9c7e6b39
fix: compile out patches without feature 2025-02-10 09:01:26 +01:00
daimond113
b6799e7400
fix: compile out patches without feature 2025-02-10 08:55:41 +01:00
daimond113
faaa76cf0d
fix: include new files in patches
Previously, newly created files would not be
included in patches. This commit fixes that.
Additionally, it removes some leftover `dbg!` calls.
2025-02-10 08:43:48 +01:00
daimond113
1d131e98c6
refactor: patch improvements
Patches now work much better with incremental
installs. They use a Git repository to reset
the patch to the original state before applying
it. They are now also properly cleaned up after
being removed. Also, to allow more flexibility,
the patching process is now done before type
extraction, so that types can be patched. Lastly,
the patch-commit command now doesn't delete the
patch directory until it has done everything else,
to prevent data loss.
2025-02-10 00:35:51 +01:00
daimond113
8bb8888de8
fix: correct windows script linker require paths
The pathdiff crate doesn't behave properly when
one path begins with a long path prefix
and the other doesn't, so we always put the
prefix on paths.
2025-02-09 20:10:55 +01:00
daimond113
9c75bff65e
chore: remove buymeacoffee funding 2025-02-08 17:20:36 +01:00
daimond113
72c020efd3
chore: update dependencies 2025-02-08 15:43:14 +01:00
daimond113
e2d10ac72b
chore(registry): remove native tls dependency 2025-02-08 15:05:12 +01:00
daimond113
73146e6f64
chore(release): prepare for v0.6.0-rc.4 2025-02-08 14:05:45 +01:00
daimond113
1a79326ebf
fix: refresh sources before reading from them
Previously, if a package was modified without the
index having been cloned (for example, through a
remote Git change), pesde would be unable to read
the package's metadata, whether because the index
was outdated or because it wasn't cloned at all.
Sources are now refreshed as needed, like
everywhere else.
2025-02-08 14:03:43 +01:00
daimond113
399c63cc8c
chore(release): prepare for v0.6.0-rc.3 2025-02-08 01:19:06 +01:00
daimond113
a4927bf4be
fix: fix types not being re-exported
Code responsible for removing the file at the
destination had been removed, which broke the
re-exporting of types. This commit reverts
that change, but also silences Windows errors
when removing the file.
2025-02-08 00:48:42 +01:00
daimond113
4d39ddae04
fix: point to the right path when fresh installing engines 2025-02-08 00:29:55 +01:00
daimond113
7f21131415
chore(release): prepare for v0.6.0-rc.2 2025-02-07 20:59:48 +01:00
daimond113
c71e879bfd
fix: fix linux zbus panicking
Fixes zbus on Linux panicking due to the crate
spawning a runtime inside our own runtime. This
is avoided by using the sync mode of the crate
instead of async. Additionally, keyring operations
have been wrapped in spawn_blocking to avoid
blocking the async runtime.
2025-02-07 20:53:31 +01:00
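The keyring part of the fix follows the standard tokio pattern: blocking work (here, platform secret-service IO) is moved onto the blocking thread pool. A sketch with illustrative service/user names:

```rust
// Assumed names; pesde's actual entry identifiers may differ.
async fn get_token() -> Option<String> {
    tokio::task::spawn_blocking(|| {
        // Both the entry lookup and the password read can block.
        let entry = keyring::Entry::new("pesde", "token").ok()?;
        entry.get_password().ok()
    })
    .await // resolves when the blocking closure finishes
    .ok()
    .flatten()
}
```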
daimond113
daf0861eb3
feat: colour deprecation message
Colours the text in the deprecation message to
match the yank command's UI.
2025-02-07 11:26:59 +01:00
daimond113
5939050ee3
chore: remove version-management from default features 2025-02-06 23:51:10 +01:00
daimond113
51fc6c3abd
refactor: move schema gen to test
Moves schema generation over to a test instead of
a feature. This allows us to publish the crate,
since we use schemars from Git, and Git
dependencies are not supported by crates.io.
2025-02-06 23:49:25 +01:00
daimond113
c698969f76
chore(release): prepare for v0.6.0-rc.1 2025-02-06 23:24:11 +01:00
daimond113
70a4dc3226
chore: fix typo in changelog 2025-02-06 23:23:43 +01:00
daimond113
c6242b8569
refactor: use better colors for the publish command
Switches the background color of the publish
command's package announcement to a better-looking
color. The design of the command's UI may change
in the future.
2025-02-06 23:00:03 +01:00
daimond113
ff6d37bf27
docs: update docs for 0.6 2025-02-04 22:36:42 +01:00
daimond113
43e2d1f325
chore: add missing changelog entries 2025-02-04 21:51:39 +01:00
daimond113
ba6a02e13b
fix: exit with code 1 from invalid directory bin linkers
Fixes #24.
2025-02-03 15:52:38 +01:00
daimond113
7ad4c6f5c6
chore: fix clippy lints 2025-02-03 15:52:24 +01:00
daimond113
0b5c233734
style: remove comma from ,) expressions 2025-02-02 15:25:11 +01:00
daimond113
692ae1521d
docs: reword sync config explanation 2025-02-02 15:24:20 +01:00
daimond113
6ae16a7dac
feat: add list and remove commands
Fixes #22.
2025-02-02 15:23:10 +01:00
daimond113
f0e69a08e2
refactor: use iteration over recursion
Replaces the recursive implementation of fallback
Wally registries with an iterative approach.
2025-02-02 14:06:38 +01:00
daimond113
6856746ae2
refactor: crate optimizations
Replaces chrono with jiff (already used due to
gix). Switches from the async-io to the tokio
feature in keyring. Removes dependency on
serde-with. Optimizes release mode executable
size by using aborting panics.
2025-02-02 00:17:11 +01:00
daimond113
24049d60a2
feat: ignore submodules in git dependencies
Previously, if a repository contained a submodule,
pesde would throw an error since there are no
entries for them. Now, they are simply ignored.
2025-02-01 16:25:00 +01:00
daimond113
ca550eee3a
refactor: rename PackageFs::CAS to Cas
The name now fits in with the rest of the codebase.
2025-02-01 01:02:48 +01:00
daimond113
2154fc0e84
refactor: reorder commands in help message
Reorders the commands in the help message so they
appear in a more logical order.
2025-02-01 01:01:40 +01:00
daimond113
b30f9ecdeb
feat: cas pruning
Squashed commit of the following:

commit 82b4b858e5
Author: daimond113 <contact@daimond113.com>
Date:   Sat Feb 1 00:46:31 2025 +0100

    feat: remove unused directories when purging cas

    Now purging the CAS will also clean up unused
    folders. Additionally, since concurrent removal
    of directories seems to throw a PermissionDenied
    error on Windows, those are ignored. Needs
    investigation into why that happens.

commit 75d6aa5443
Author: daimond113 <contact@daimond113.com>
Date:   Fri Jan 31 23:24:11 2025 +0100

    feat: finish prune command implementation

    The prune command now discovers packages in the
    CAS, removes individual unused files, and then
    removes packages which used those files, since
    that means they're unused.

commit 333eb3bdd9
Author: daimond113 <contact@daimond113.com>
Date:   Sun Jan 26 23:30:52 2025 +0100

    chore: fix clippy lint

commit a38da43670
Author: daimond113 <contact@daimond113.com>
Date:   Sun Jan 26 23:02:52 2025 +0100

    feat: add cas pruning command

    Removes unused files from the CAS. Still needs to
    remove individual package index entries to be
    complete.
2025-02-01 00:51:43 +01:00
daimond113
5cc64f38ec
refactor: include more linking debug info
Logs every path used in the linking process.
2025-01-30 23:46:58 +01:00
daimond113
4009313281
fix: do not include incompatible files in workspace packages
Fixes `default.project.json` being copied if it
is present in a workspace package.
2025-01-30 23:45:31 +01:00
daimond113
3e4ef00f4a
chore: fix formatting 2025-01-24 23:45:26 +01:00
daimond113
801acb0264
fix: remove scripts linkers in incremental installs
Additionally, this commit changes the linking
process to be much less blocking, which should
bring a not-insignificant speedup to the
installation process.
2025-01-24 23:39:15 +01:00
daimond113
8835156b76
chore: update dependencies
2025-01-19 22:35:11 +01:00
daimond113
446aa748a6
chore: fix clippy lint 2025-01-19 22:33:23 +01:00
daimond113
fe979f26c5
refactor: remove unnecessary asyncness from download_and_link
Additionally fixes stack overflows by building the
`miniz_oxide` crate with release-level
optimizations.

Signed-off-by: daimond113 <contact@daimond113.com>
2025-01-19 22:29:29 +01:00
daimond113
95896091cd
refactor: switch out dependencies
Switches from the `colored` crate to the `console`
crate. Additionally, to optimize the compiled
program's size, switches the `inquire` crate's
backend from `crossterm` to `console`. Console was
picked because we depend on `indicatif`, which
only supports `console`.

Also switches from `winreg` to `windows-registry`,
which `reqwest` depends on, to optimize size even
further. This currently duplicates the dependency,
as `reqwest` depends on an older version, but it
will become optimized once `reqwest` updates to
the latest version of the crate.

Signed-off-by: daimond113 <contact@daimond113.com>
2025-01-19 22:29:27 +01:00
daimond113
b9a105cec4
docs: improve init scripts explanation wording
"default Roblox compatibility scripts" ->
"Roblox compatibility scripts"
2025-01-18 17:36:51 +01:00
daimond113
a53ae657e1
Merge branch '0.5' into 0.6
2025-01-18 16:47:20 +01:00
daimond113
5ad3339535
chore: add missing changelog entries 2025-01-18 15:21:26 +01:00
daimond113
941bb79ea6
refactor: improve code tidiness
Switches to the `cas_path` function when reading
CAS files. Asyncifies IO operations when reusing
package folders.
2025-01-18 15:16:36 +01:00
daimond113
0dfc3ef5bd
fix: scope CAS package indices to index of package
Fixes conflicts between multiple packages with the
same name in different indices.
2025-01-18 15:14:58 +01:00
daimond113
a2ce747879
feat: update instead of recreating packages folders
Instead of recreating the packages folders, we now
update the existing ones. Additionally switches
a few APIs from accepting `&TargetKind` to `TargetKind`.
2025-01-18 14:18:46 +01:00
daimond113
53bdf0ced6
fix: do gix operations inside spawn_blocking
Additionally refactors the source's code to be
much neater and easier to read.
2025-01-17 21:15:12 +01:00
daimond113
9e6fa4294f
fix: asyncify exists check 2025-01-17 20:37:38 +01:00
daimond113
3d659161e6
fix(resolver): handle infinite loop in resolver
Fixes a recursive loop in the resolver when resolving
dependencies that depend on themselves from
an existing lockfile.
2025-01-16 23:18:23 +01:00
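A guard for this class of bug can be sketched as a "seen" set: a package already on the current resolution path is skipped instead of recursed into forever. This is an illustration of the idea, not pesde's resolver:

```rust
use std::collections::HashSet;

fn resolve(
    id: &str,
    deps_of: &dyn Fn(&str) -> Vec<String>,
    seen: &mut HashSet<String>,
) {
    // `insert` returns false if the ID was already present:
    // a self-dependency or cycle, so stop instead of looping.
    if !seen.insert(id.to_string()) {
        return;
    }
    for dep in deps_of(id) {
        resolve(&dep, deps_of, seen);
    }
}
```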
daimond113
805a257a76
feat: switch to version_matches for semver comparison
Fixes `*` not resolving to versions which are
pre-releases.
2025-01-16 22:48:43 +01:00
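Background for the fix: the semver crate's stock `VersionReq::matches` only considers pre-release versions when the requirement itself names a pre-release, so `*` skips them. A `version_matches`-style check can opt in by also testing the version with its pre-release tag stripped (a sketch; pesde's actual helper may differ):

```rust
use semver::{Prerelease, Version, VersionReq};

fn version_matches(req: &VersionReq, version: &Version) -> bool {
    let mut stripped = version.clone();
    stripped.pre = Prerelease::EMPTY;
    // Accept if either the version itself or its pre-release-stripped
    // form satisfies the requirement, so `*` also matches `1.0.0-rc.1`.
    req.matches(version) || req.matches(&stripped)
}
```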
daimond113
6ae7e5078c
feat(engines): do not silence manifest read errors
Now manifest read errors are only silenced if
the cause is the manifest not being found.
2025-01-16 22:44:05 +01:00
daimond113
684f711d93
ci: add linux aarch64 builds
Squashed commit of the following:

commit 4d455d4015
Author: daimond113 <72147841+daimond113@users.noreply.github.com>
Date:   Thu Jan 16 22:24:20 2025 +0100

    ci: install aarch64 build deps and fix typo

commit 549374d34c
Author: daimond113 <72147841+daimond113@users.noreply.github.com>
Date:   Thu Jan 16 22:20:43 2025 +0100

    ci: add linux aarch64 builds
2025-01-16 22:33:44 +01:00
daimond113
57afa4c593
chore: update dependencies
2025-01-16 20:27:17 +01:00
daimond113
380a716200
feat(engines): report version resolving
Improves user experience when running engines by
reporting that pesde is currently resolving the
version of the engine to download.
2025-01-16 20:18:10 +01:00
daimond113
f4050abec8
feat: add engines
Squashed commit of the following:

commit 5767042964
Author: daimond113 <72147841+daimond113@users.noreply.github.com>
Date:   Thu Jan 16 18:28:52 2025 +0100

    fix(engines): correct engine detection on unix

    The `current_exe` function doesn't return the
    symlinked path on Unix, so the engine detection
    was failing there. This commit fixes that by
    using the 0th argument of the program to get
    the path of the executable on Unix.

commit b51c9d9571
Author: daimond113 <72147841+daimond113@users.noreply.github.com>
Date:   Wed Jan 15 22:43:50 2025 +0100

    refactor: print deprecated warning on CLI side

    Prints the deprecated warning on the CLI side
    which means it'll have a more consistent look
    with the rest of the CLI output.

commit 5ace844035
Author: daimond113 <72147841+daimond113@users.noreply.github.com>
Date:   Wed Jan 15 22:21:36 2025 +0100

    feat: add alias validation

    Ensures aliases don't contain characters which could
    cause issues. They are now also forbidden from being
    the same as an engine name, to avoid conflicts.

commit a33302aff9
Author: daimond113 <72147841+daimond113@users.noreply.github.com>
Date:   Wed Jan 15 21:23:40 2025 +0100

    refactor: apply clippy lints

commit 2d534a534d
Author: daimond113 <72147841+daimond113@users.noreply.github.com>
Date:   Wed Jan 15 21:22:14 2025 +0100

    feat(engines): print incompatibility warning for dependencies

    Adds a warning message when a dependency depends
    on an incompatible engine.

commit 4946a19f8b
Author: daimond113 <72147841+daimond113@users.noreply.github.com>
Date:   Wed Jan 15 18:33:38 2025 +0100

    feat(engines): create linkers at install time

    Additionally fixes engines being executed as scripts,
    and fixes downloading pesde from GitHub.

commit e3177eeb75
Author: daimond113 <72147841+daimond113@users.noreply.github.com>
Date:   Tue Jan 14 14:33:26 2025 +0100

    fix(engines): store & link engines correctly

    Fixes issues with how engines were stored
    which resulted in errors. Also makes outdated
    linkers get updated.

commit 037ead66bb
Author: daimond113 <72147841+daimond113@users.noreply.github.com>
Date:   Mon Jan 13 12:26:19 2025 +0100

    docs: remove prerequisites

commit ddb496ff7d
Author: daimond113 <72147841+daimond113@users.noreply.github.com>
Date:   Mon Jan 13 12:25:53 2025 +0100

    ci: remove tar builds

commit e9f0c25554
Author: daimond113 <72147841+daimond113@users.noreply.github.com>
Date:   Mon Jan 13 12:25:11 2025 +0100

    chore(docs): update astro and starlight

commit fc349e6f21
Author: daimond113 <72147841+daimond113@users.noreply.github.com>
Date:   Sun Jan 12 23:12:27 2025 +0100

    feat: add engines

    Adds the initial implementation of the engines feature.
    Not tested yet. Requires documentation and
    more work for non-pesde engines to be usable.
2025-01-16 19:11:16 +01:00
daimond113
d4979bbdb2
feat: switch lib & cli to v1 api
2025-01-13 13:21:22 +01:00
daimond113
1eef6078bf
fix(registry): keep v0 api backwards compatible 2025-01-13 13:19:15 +01:00
daimond113
72c1c39401
fix: use urlencoding crate for wally packages
2025-01-10 16:06:59 +01:00
daimond113
076f5564ee
feat(registry): set content-length header for fs storage 2025-01-10 16:06:27 +01:00
daimond113
a39b1bb60a
feat(website): escape url parts 2025-01-10 14:07:52 +01:00
daimond113
dcc869c025
fix(registry): avoid race condition in search
2025-01-10 09:24:33 +01:00
daimond113
6f4c7137c0
feat: add name.scope and name.name apis
2025-01-10 00:01:02 +01:00
daimond113
e8c3a66524
feat(registry): add individual job endpoints for package data 2025-01-10 00:00:24 +01:00
daimond113
6ab334c904
feat: use url encoding crate to ensure validity of urls 2025-01-09 23:04:06 +01:00
daimond113
be6410443f
perf(registry): asyncify reading data of top search packages 2025-01-09 22:59:20 +01:00
daimond113
685700f572
perf(registry): use rwlock over mutex for repository data 2025-01-09 22:40:41 +01:00
daimond113
217ca238ff
feat: remove cli side dependency checks 2025-01-09 22:36:53 +01:00
daimond113
e61aeb5da0
feat(registry): add more info in auth & storage logs 2025-01-09 22:31:20 +01:00
daimond113
9bab997992
docs: add missing 'required' annotation 2025-01-09 22:13:41 +01:00
daimond113
325453450b
feat: add deprecating & yanking 2025-01-09 22:09:28 +01:00
daimond113
243dd39e14
Merge branch '0.5' into 0.6
2025-01-06 13:33:46 +01:00
daimond113
ca5a8f108d
fix: install dev packages and remove them after use
2025-01-03 00:05:32 +01:00
daimond113
de43d2ce42
chore: add git blame ignore file 2025-01-02 22:38:29 +01:00
daimond113
0ceb2f6653
style: enable hard_tabs rustfmt option 2025-01-02 22:37:27 +01:00
daimond113
a627a7253f
feat: add utility function to reduce code duplication 2025-01-02 22:21:18 +01:00
daimond113
6f5e2a2473
feat: improve linking process 2025-01-02 21:30:25 +01:00
daimond113
e5b629e0c5
feat: remove unused errors 2025-01-02 19:09:57 +01:00
daimond113
9bf2af6454
fix: compile without feature flags
2025-01-02 15:41:29 +01:00
daimond113
5d62549817
feat: switch to JoinSet over join_all
2025-01-01 18:46:00 +01:00
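The practical difference: `join_all` polls every future inside a single task, while a `JoinSet` spawns each unit of work as its own task, so they run in parallel across the runtime's workers and can be reaped as they finish. A small illustrative sketch:

```rust
use tokio::task::JoinSet;

async fn process_all(items: Vec<String>) {
    let mut set = JoinSet::new();
    for item in items {
        // Each item becomes a separate tokio task.
        set.spawn(async move {
            // ... do the actual work for `item` ...
            item
        });
    }
    // Results arrive in completion order, not submission order.
    while let Some(res) = set.join_next().await {
        let item = res.expect("task panicked");
        println!("done: {item}");
    }
}
```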
daimond113
83fa22f7de
feat: remove more data redundancy from lockfiles 2025-01-01 17:16:41 +01:00
daimond113
78e58d63fa
feat: improve container_folder api 2025-01-01 16:35:59 +01:00
daimond113
d0169976cd
feat: store dependency over downloaded graphs 2025-01-01 16:28:53 +01:00
daimond113
6a8dfe0ba3
feat: switch to flat graph handling
2025-01-01 00:34:21 +01:00
daimond113
80b8b151d7
fix: switch to schemars attribute
2024-12-31 01:37:42 +01:00
daimond113
fd5a038d8b
feat: add schema generation 2024-12-31 01:35:28 +01:00
daimond113
7f15264f48
feat: inherit pesde-managed scripts from workspace root
2024-12-30 21:05:34 +01:00
daimond113
2700fe9e07
feat: remove data redundancy for workspace pkg refs 2024-12-30 19:06:53 +01:00
daimond113
c3d2c768db
feat: add path dependencies
Fixes #13
2024-12-30 18:33:48 +01:00
daimond113
ccb2924362
feat: remove old includes compat
2024-12-30 13:37:30 +01:00
daimond113
6cf9f14649
Merge branch '0.5' into 0.6
2024-12-30 01:12:48 +01:00
daimond113
634ef013da
docs: add missing changelog entry 2024-12-29 23:27:34 +01:00
daimond113
30b4459de0
docs: add override by alias docs 2024-12-29 23:27:25 +01:00
daimond113
a4d7b4d6e0
feat: allow ignoring syntax errors in file parsing
Fixes #16
2024-12-29 23:09:24 +01:00
daimond113
2aeee9de34
feat: add override by alias support 2024-12-29 22:31:06 +01:00
daimond113
2936f88a99
feat: use paths instead of pathbufs where applicable
2024-12-29 12:47:36 +01:00
daimond113
aabb353d25
perf: lazily format error messages
2024-12-28 18:13:53 +01:00
daimond113
a091d06f36
perf: remove unnecessary wally mutex 2024-12-28 18:09:20 +01:00
daimond113
8e6d877241
perf: use arcs where possible, remove unnecessary cloning 2024-12-28 16:50:14 +01:00
Luka
a41d9950f8
feat: better install (#17)
* feat: better install

* feat: support progress reporting for wally

* chore: remove tracing-indicatif

* chore: fix Cargo.toml

* fix: indentation in bin link script

* fix: spinner tick chars

* feat: change progress message color

* fix: remove pretty from fmt_layer

Co-authored-by: dai <72147841+daimond113@users.noreply.github.com>

* style: format code

---------

Co-authored-by: dai <72147841+daimond113@users.noreply.github.com>
2024-12-27 22:04:47 +01:00
161 changed files with 16936 additions and 11302 deletions

@@ -1,2 +1,2 @@
 PUBLIC_REGISTRY_URL= # url of the registry API, this must have a trailing slash and include the version
-# example: https://registry.pesde.daimond113.com/v0/
+# example: https://registry.pesde.daimond113.com/v1/

.git-blame-ignore-revs (new file)
@@ -0,0 +1,3 @@
# .git-blame-ignore-revs
# Enabled the `hard_tabs` option in rustfmt.toml
0ceb2f6653b12e8261533ef528d78e3dde7ed757

.github/FUNDING.yml
@@ -1,2 +1 @@
-buy_me_a_coffee: daimond113
 ko_fi: daimond113

.github/ISSUE_TEMPLATE/01-bug-report.md (new file)
@@ -0,0 +1,30 @@
---
name: Bug report
about: Create a report to help us improve
title: ""
labels: bug
assignees: daimond113
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
If the issue is with:
- pesde's library (e.g. linking):
1. include your project's manifest (pesde.toml)
2. if possible, a repository with a minimal reproduction of the issue.
- pesde's CLI (e.g. authentication):
1. include a screenshot or copy-paste the output of the CLI.
**Expected behavior**
A clear and concise description of what you expected to happen.
**Please complete the following information:**
- pesde: (e.g. 0.6.0)
- Operating system: (e.g. Windows 11, macOS 15, Ubuntu 24.04)
**Additional context**
Add any other context about the problem here.

@@ -0,0 +1,19 @@
---
name: Feature request
about: Suggest an idea for this project
title: ""
labels: enhancement
assignees: daimond113
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.

@@ -0,0 +1,28 @@
---
name: Engine support request
about: Suggest a new engine
title: "[ENGINE]"
labels: enhancement
assignees: daimond113
---
**General information about the engine**
Name: Lune
Repository: https://github.com/lune-org/lune
**Checklist**
- [ ] Is this engine versioned according to [SemVer](https://semver.org/spec/v2.0.0.html)?
- [ ] Does this engine provide pre-compiled builds?
- [ ] Will the engine get support for at least a year after it gets added to pesde?
- [ ] Does the engine have any notable differences to plain Luau (e.g. IO APIs)?
- [ ] Does the engine correctly implement `require` (according to Luau's RFCs)?
- [ ] Is the engine open-source under a reasonably permissive license?
If the answer is "no" to any of these questions, the engine will not be added.
**Additional information**
- [ ] Are you willing to submit a pull request adding the engine (and a target for it)?
These questions can help fast-track the engine being added.

.github/ISSUE_TEMPLATE/config.yml (new file)
@@ -0,0 +1 @@
blank_issues_enabled: false

@@ -40,6 +40,11 @@ jobs:
 runs-on: ubuntu-latest
 artifact-name: pesde-debug-${{ needs.get-version.outputs.version }}-linux-x86_64
+- job-name: linux-aarch64
+target: aarch64-unknown-linux-gnu
+runs-on: ubuntu-24.04-arm
+artifact-name: pesde-debug-${{ needs.get-version.outputs.version }}-linux-aarch64
 - job-name: macos-x86_64
 target: x86_64-apple-darwin
 runs-on: macos-13
@@ -58,7 +63,7 @@ jobs:
 uses: actions/checkout@v4
 - name: Install Linux build dependencies
-if: ${{ matrix.runs-on == 'ubuntu-latest' }}
+if: ${{ startsWith(matrix.runs-on, 'ubuntu') }}
 run: |
 sudo apt-get update
 sudo apt-get install libdbus-1-dev pkg-config

@@ -51,6 +51,11 @@ jobs:
 arch: x86_64
 target: x86_64-unknown-linux-gnu
+- os: ubuntu-24.04-arm
+host: linux
+arch: aarch64
+target: aarch64-unknown-linux-gnu
 - os: windows-latest
 host: windows
 arch: x86_64
@@ -96,11 +101,9 @@ jobs:
 if [ ${{ matrix.host }} = "windows" ]; then
 mv target/${{ matrix.target }}/release/${{ env.BIN_NAME }}.exe ${{ env.BIN_NAME }}.exe
 7z a ${{ env.ARCHIVE_NAME }}.zip ${{ env.BIN_NAME }}.exe
-tar -czf ${{ env.ARCHIVE_NAME }}.tar.gz ${{ env.BIN_NAME }}.exe
 else
 mv target/${{ matrix.target }}/release/${{ env.BIN_NAME }} ${{ env.BIN_NAME }}
 zip -r ${{ env.ARCHIVE_NAME }}.zip ${{ env.BIN_NAME }}
-tar -czf ${{ env.ARCHIVE_NAME }}.tar.gz ${{ env.BIN_NAME }}
 fi
 - name: Upload zip artifact
@@ -109,12 +112,6 @@ jobs:
 name: ${{ env.ARCHIVE_NAME }}.zip
 path: ${{ env.ARCHIVE_NAME }}.zip
-- name: Upload tar.gz artifact
-uses: actions/upload-artifact@v4
-with:
-name: ${{ env.ARCHIVE_NAME }}.tar.gz
-path: ${{ env.ARCHIVE_NAME }}.tar.gz
 publish:
 name: Publish to crates.io
 runs-on: ubuntu-latest

.gitignore
@@ -5,3 +5,5 @@ cobertura.xml
 tarpaulin-report.html
 build_rs_cov.profraw
 registry/data
+data
+manifest.schema.json

@@ -5,6 +5,75 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
+## [0.6.1] - 2025-03-09
+### Fixed
+- Fix path dependencies using project's workspace dependencies by @daimond113
+- Fix binary linkers not being created for non-direct dependencies by @daimond113
+- Add missing `run <alias>` behaviour by @daimond113
+### Changed
+- Binary linkers are now done in Rust to simplify their implementation and cross-runtime portability by @daimond113
+- Show available targets in add and install commands if the specifier hasn't matched any by @daimond113
+- Add `@generated` marker to lockfiles by @daimond113
+## [0.6.0] - 2025-02-22
+### Added
+- Improve installation experience by @lukadev-0
+- Support using aliases of own dependencies for overrides by @daimond113
+- Support ignoring parse errors in Luau files by @daimond113
+- Add path dependencies by @daimond113
+- Inherit pesde-managed scripts from workspace root by @daimond113
+- Allow using binaries from workspace root in member packages by @daimond113
+- Add yanking & deprecating by @daimond113
+- Add engines as a form of managing runtimes by @daimond113
+- Modify existing installed packages instead of always reinstalling by @daimond113
+- Add `cas prune` command to remove unused CAS files & packages by @daimond113
+- Add `list` and `remove` commands to manage packages in the manifest by @daimond113
+### Fixed
+- Install dev packages in prod mode and remove them after use to allow them to be used in scripts by @daimond113
+- Fix infinite loop in the resolver in packages depending on themselves by @daimond113
+- Do Git operations inside spawn_blocking to avoid performance issues by @daimond113
+- Scope CAS package indices to the source by @daimond113
+- Do not copy `default.project.json` in workspace dependencies by @daimond113
+- Colour deprecate output to match yank output by @daimond113
+- Fix zbus panic on Linux by @daimond113
+- Fix `self-upgrade` using the wrong path when doing a fresh download by @daimond113
+- Fix types not being re-exported by @daimond113
+- Refresh sources before reading package data to ensure the index is even cloned (remote changes to lockfile) by @daimond113
+- Correct script linker require paths on Windows by @daimond113
+- Improve patches in incremental installs by @daimond113
+- Patches now include newly created files by @daimond113
+- Fix double path long prefix issues on Windows by @daimond113
+- Fix panic when using SIGINT by @daimond113
+### Changed
+- Change handling of graphs to a flat structure by @daimond113
+- Store dependency over downloaded graphs in the lockfile by @daimond113
+- Improve linking process by @daimond113
+- Use a proper url encoding library to ensure compatibility with all characters by @daimond113
+- The `*` specifier now matches all versions, even prereleases by @daimond113
+- Switch CLI dependencies to ones used by other dependencies to optimize the binary size by @daimond113
+- Reorder the `help` command by @daimond113
+- Ignore submodules instead of failing when using Git dependencies with submodules by @daimond113
+- Exit with code 1 from invalid directory binary linkers by @daimond113
+- Patches are now applied before type extraction to allow patches to modify types by @daimond113
+- Make aliases case-insensitive by @daimond113
+- Print "update available" message to stderr by @daimond113
+- Improve output of the `outdated` command by @daimond113
+- Allow publishing other packages even if an error occurred by @daimond113
+### Removed
+- Remove old includes format compatibility by @daimond113
+- Remove data redundancy for workspace package references by @daimond113
+- Remove dependency checks from CLI in publish command in favor of registry checks by @daimond113
+### Performance
+- Use `Arc` for more efficient cloning of multiple structs by @daimond113
+- Avoid cloning where possible by @daimond113
+- Remove unnecessary mutex in Wally package download by @daimond113
+- Lazily format error messages by @daimond113
 ## [0.5.3] - 2024-12-30
 ### Added
 - Add meta field in index files to preserve compatibility with potential future changes by @daimond113
@@ -112,6 +181,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 - Asyncify dependency linking by @daimond113
 - Use `exec` in Unix bin linking to reduce the number of processes by @daimond113
+[0.6.1]: https://github.com/daimond113/pesde/compare/v0.6.0%2Bregistry.0.2.1..v0.6.1%2Bregistry.0.2.2
+[0.6.0]: https://github.com/daimond113/pesde/compare/v0.5.3%2Bregistry.0.1.2..v0.6.0%2Bregistry.0.2.0
 [0.5.3]: https://github.com/daimond113/pesde/compare/v0.5.2%2Bregistry.0.1.1..v0.5.3%2Bregistry.0.1.2
 [0.5.2]: https://github.com/daimond113/pesde/compare/v0.5.1%2Bregistry.0.1.0..v0.5.2%2Bregistry.0.1.1
 [0.5.1]: https://github.com/daimond113/pesde/compare/v0.5.0%2Bregistry.0.1.0..v0.5.1%2Bregistry.0.1.0

Cargo.lock (generated)

File diff suppressed because it is too large.

@@ -1,6 +1,6 @@
 [package]
 name = "pesde"
-version = "0.5.3"
+version = "0.6.1"
 edition = "2021"
 license = "MIT"
 authors = ["daimond113 <contact@daimond113.com>"]
@@ -10,28 +10,29 @@ repository = "https://github.com/pesde-pkg/pesde"
 include = ["src/**/*", "Cargo.toml", "Cargo.lock", "README.md", "LICENSE", "CHANGELOG.md"]
 [features]
+default = ["wally-compat", "patches"]
 bin = [
 "dep:clap",
 "dep:dirs",
 "dep:tracing-subscriber",
-"reqwest/json",
 "dep:indicatif",
-"dep:tracing-indicatif",
 "dep:inquire",
 "dep:toml_edit",
-"dep:colored",
+"dep:console",
 "dep:anyhow",
 "dep:keyring",
 "dep:open",
-"gix/worktree-mutation",
+"dep:paste",
 "dep:serde_json",
-"dep:winreg",
+"dep:windows-registry",
+"dep:windows",
+"gix/worktree-mutation",
 "fs-err/expose_original_error",
 "tokio/rt",
 "tokio/rt-multi-thread",
 "tokio/macros",
 ]
-wally-compat = ["dep:async_zip", "dep:serde_json"]
+wally-compat = ["dep:serde_json"]
 patches = ["dep:git2"]
 version-management = ["bin"]
@@ -40,54 +41,205 @@ name = "pesde"
 path = "src/main.rs"
 required-features = ["bin"]
-[lints.clippy]
+[workspace.lints.clippy]
+zero_sized_map_values = "warn"
+while_float = "deny"
+useless_let_if_seq = "warn"
+unused_trait_names = "warn"
+unused_result_ok = "warn"
+unused_peekable = "warn"
+unused_async = "warn"
+unreadable_literal = "warn"
+unnested_or_patterns = "warn"
+unneeded_field_pattern = "warn"
+unnecessary_wraps = "warn"
+unnecessary_semicolon = "warn"
+unnecessary_self_imports = "warn"
+unnecessary_literal_bound = "warn"
+unnecessary_join = "warn"
+unnecessary_box_returns = "warn"
 uninlined_format_args = "warn"
+type_repetition_in_bounds = "warn"
+try_err = "warn"
+trivially_copy_pass_by_ref = "warn"
+trait_duplication_in_bounds = "warn"
+todo = "deny"
+suspicious_operation_groupings = "warn"
+suboptimal_flops = "deny"
+struct_field_names = "warn"
+string_to_string = "warn"
+string_lit_chars_any = "warn"
+string_lit_as_bytes = "warn"
+str_split_at_newline = "warn"
+stable_sort_primitive = "warn"
+single_option_map = "warn"
+single_match_else = "warn"
+single_char_pattern = "warn"
+significant_drop_tightening = "warn"
+significant_drop_in_scrutinee = "warn"
+set_contains_or_insert = "deny"
+separated_literal_suffix = "warn"
+semicolon_inside_block = "warn"
+semicolon_if_nothing_returned = "warn"
+self_named_module_files = "warn"
+same_functions_in_if_condition = "warn"
+return_and_then = "warn"
+renamed_function_params = "warn"
+ref_patterns = "deny"
+ref_option = "deny"
+ref_binding_to_reference = "deny"
+redundant_type_annotations = "deny"
+redundant_else = "warn"
+redundant_closure_for_method_calls = "warn"
+redundant_clone = "deny"
+read_zero_byte_vec = "warn"
+rc_buffer = "deny"
+range_plus_one = "deny"
+range_minus_one = "deny"
+pub_without_shorthand = "deny"
+pub_underscore_fields = "deny"
+precedence_bits = "deny"
+pathbuf_init_then_push = "warn"
+path_buf_push_overwrite = "warn"
+option_option = "deny"
+option_as_ref_cloned = "deny"
+nonstandard_macro_braces = "deny"
+non_zero_suggestions = "deny"
+no_effect_underscore_binding = "warn"
+needless_raw_string_hashes = "warn"
+needless_pass_by_value = "deny"
+needless_pass_by_ref_mut = "warn"
+needless_for_each = "deny"
+needless_continue = "deny"
+needless_collect = "deny"
+needless_bitwise_bool = "deny"
+mut_mut = "deny"
+must_use_candidate = "warn"
+mem_forget = "deny"
+maybe_infinite_iter = "deny"
+match_wildcard_for_single_variants = "deny"
+match_bool = "warn"
+map_unwrap_or = "warn"
+map_err_ignore = "warn"
+manual_midpoint = "warn"
+manual_let_else = "warn"
+manual_is_variant_and = "warn"
+manual_is_power_of_two = "warn"
+lossy_float_literal = "deny"
+literal_string_with_formatting_args = "warn"
+large_types_passed_by_value = "warn"
+large_stack_frames = "warn"
+large_stack_arrays = "warn"
+large_digit_groups = "deny"
+iter_with_drain = "deny"
+iter_on_single_items = "deny"
+iter_on_empty_collections = "deny"
+iter_filter_is_some = "deny"
+iter_filter_is_ok = "deny"
+invalid_upcast_comparisons = "deny"
+integer_division = "deny"
+infinite_loop = "deny"
+inefficient_to_string = "warn"
+index_refutable_slice = "deny"
+inconsistent_struct_constructor = "warn"
+imprecise_flops = "deny"
+implicit_clone = "warn"
+if_then_some_else_none = "warn"
+if_not_else = "warn"
+get_unwrap = "warn"
+from_iter_instead_of_collect = "warn"
+format_push_string = "warn"
+format_collect = "warn"
+fn_to_numeric_cast_any = "deny"
+float_cmp_const = "deny"
+float_cmp = "deny"
+float_arithmetic = "warn"
+flat_map_option = "warn"
+filter_map_next = "warn"
+filetype_is_file = "deny"
+explicit_iter_loop = "warn"
+explicit_into_iter_loop = "warn"
+explicit_deref_methods = "warn"
+equatable_if_let = "warn"
+enum_glob_use = "warn"
+empty_structs_with_brackets = "warn"
+empty_enum_variants_with_brackets = "warn"
+empty_drop = "warn"
+elidable_lifetime_names = "warn"
+doc_link_with_quotes = "warn"
+doc_link_code = "warn"
+doc_include_without_cfg = "warn"
+disallowed_script_idents = "warn"
+derive_partial_eq_without_eq = "warn"
+deref_by_slicing = "warn"
+default_numeric_fallback = "warn"
+dbg_macro = "deny"
+comparison_chain = "warn"
+collection_is_never_read = "warn"
+cloned_instead_of_copied = "warn"
+clear_with_drain = "warn"
+cfg_not_test = "warn"
+cast_sign_loss = "deny"
+cast_precision_loss = "deny"
+cast_possible_wrap = "deny"
+case_sensitive_file_extension_comparisons = "warn"
+branches_sharing_code = "warn"
+bool_to_int_with_if = "warn"
+assigning_clones = "warn"
+as_underscore = "warn"
+[lints]
+workspace = true
 [dependencies]
-serde = { version = "1.0.216", features = ["derive"] }
+serde = { version = "1.0.217", features = ["derive"] }
-toml = "0.8.19"
+toml = "0.8.20"
-serde_with = "3.11.0"
-gix = { version = "0.68.0", default-features = false, features = ["blocking-http-transport-reqwest-rust-tls", "revparse-regex", "credentials", "parallel"] }
+gix = { version = "0.70.0", default-features = false, features = ["blocking-http-transport-reqwest-rust-tls", "revparse-regex", "credentials", "parallel"] }
-semver = { version = "1.0.24", features = ["serde"] }
+semver = { version = "1.0.25", features = ["serde"] }
-reqwest = { version = "0.12.9", default-features = false, features = ["rustls-tls"] }
+reqwest = { version = "0.12.12", default-features = false, features = ["rustls-tls", "stream", "json"] }
 tokio-tar = "0.3.1"
 async-compression = { version = "0.4.18", features = ["tokio", "gzip"] }
 pathdiff = "0.2.3"
 relative-path = { version = "1.9.3", features = ["serde"] }
 tracing = { version = "0.1.41", features = ["attributes"] }
-thiserror = "2.0.7"
+thiserror = "2.0.11"
-tokio = { version = "1.42.0", features = ["process"] }
+tokio = { version = "1.43.0", features = ["process", "macros"] }
 tokio-util = "0.7.13"
 async-stream = "0.3.6"
 futures = "0.3.31"
-full_moon = { version = "1.1.2", features = ["luau"] }
+full_moon = { version = "1.2.0", features = ["luau"] }
 url = { version = "2.5.4", features = ["serde"] }
-chrono = { version = "0.4.39", features = ["serde"] }
+jiff = { version = "0.1.29", default-features = false, features = ["serde", "std"] }
 sha2 = "0.10.8"
-tempfile = "3.14.0"
+tempfile = "3.16.0"
 wax = { version = "0.6.0", default-features = false }
-fs-err = { version = "3.0.0", features = ["tokio"] }
+fs-err = { version = "3.1.0", features = ["tokio"] }
+urlencoding = "2.1.3"
+async_zip = { version = "0.0.17", features = ["tokio", "deflate", "deflate64", "tokio-fs"] }
 # TODO: remove this when gitoxide adds support for: committing, pushing, adding
-git2 = { version = "0.19.0", optional = true }
+git2 = { version = "0.20.0", optional = true }
-async_zip = { version = "0.0.17", features = ["tokio", "deflate", "deflate64", "tokio-fs"], optional = true }
-serde_json = { version = "1.0.133", optional = true }
+serde_json = { version = "1.0.138", optional = true }
-anyhow = { version = "1.0.94", optional = true }
+anyhow = { version = "1.0.95", optional = true }
-open = { version = "5.3.1", optional = true }
+open = { version = "5.3.2", optional = true }
-keyring = { version = "3.6.1", features = ["crypto-rust", "windows-native", "apple-native", "async-secret-service", "async-io"], optional = true }
+keyring = { version = "3.6.1", features = ["crypto-rust", "windows-native", "apple-native", "sync-secret-service"], optional = true }
-colored = { version = "2.1.0", optional = true }
+console = { version = "0.15.10", optional = true }
-toml_edit = { version = "0.22.22", optional = true }
+toml_edit = { version = "0.22.23", optional = true }
-clap = { version = "4.5.23", features = ["derive"], optional = true }
+clap = { version = "4.5.28", features = ["derive"], optional = true }
-dirs = { version = "5.0.1", optional = true }
+dirs = { version = "6.0.0", optional = true }
 tracing-subscriber = { version = "0.3.19", features = ["env-filter"], optional = true }
-indicatif = { version = "0.17.9", optional = true }
+indicatif = { version = "0.17.11", optional = true }
-tracing-indicatif = { version = "0.3.8", optional = true }
-inquire = { version = "0.7.5", optional = true }
+inquire = { version = "0.7.5", default-features = false, features = ["console", "one-liners"], optional = true }
+paste = { version = "1.0.15", optional = true }
 [target.'cfg(target_os = "windows")'.dependencies]
-winreg = { version = "0.52.0", optional = true }
+windows-registry = { version = "0.4.0", optional = true }
+windows = { version = "0.59.0", features = ["Win32_Storage", "Win32_Storage_FileSystem", "Win32_Security"], optional = true }
+[dev-dependencies]
+schemars = { git = "https://github.com/daimond113/schemars", rev = "bc7c7d6", features = ["semver1", "url2"] }
 [workspace]
 resolver = "2"
@@ -96,11 +248,15 @@ members = ["registry"]
 [profile.dev.package.full_moon]
 opt-level = 3
+[profile.dev.package.miniz_oxide]
+opt-level = 3
 [profile.release]
 opt-level = "s"
 lto = true
 incremental = true
 codegen-units = 1
+panic = "abort"
 [profile.release.package.pesde-registry]
 # add debug symbols for Sentry stack traces


@ -1,7 +1,7 @@
<br> <br>
<div align="center"> <div align="center">
<img src="https://raw.githubusercontent.com/pesde-pkg/pesde/0.5/assets/logotype.svg" alt="pesde logo" width="200" /> <img src="https://raw.githubusercontent.com/pesde-pkg/pesde/0.6/assets/logotype.svg" alt="pesde logo" width="200" />
</div> </div>
<br> <br>


@ -3,20 +3,23 @@
## Supported Versions ## Supported Versions
As pesde is currently in version 0.x, we can only guarantee security for: As pesde is currently in version 0.x, we can only guarantee security for:
- **The latest minor** (currently 0.5).
- **The latest minor** (currently 0.6).
- **The latest release candidate for the next version**, if available. - **The latest release candidate for the next version**, if available.
When a new minor version is released, the previous version will immediately lose security support. When a new minor version is released, the previous version will immediately lose security support.
> **Note:** This policy will change with the release of version 1.0, which will include an extended support period for versions >=1.0. > **Note:** This policy will change with the release of version 1.0, which will include an extended support period for versions >=1.0.
| Version | Supported | | Version | Supported |
| ------- | ------------------ | | ------- | ------------------ |
| 0.5.x | :white_check_mark: | | 0.6.x | :white_check_mark: |
| < 0.5 | :x: | | < 0.6 | :x: |
## Reporting a Vulnerability ## Reporting a Vulnerability
We encourage all security concerns to be reported at [pesde@daimond113.com](mailto:pesde@daimond113.com), using the following format: We encourage all security concerns to be reported at [pesde@daimond113.com](mailto:pesde@daimond113.com), using the following format:
- **Subject**: The subject must be prefixed with `[SECURITY]` to ensure it is prioritized as a security concern. - **Subject**: The subject must be prefixed with `[SECURITY]` to ensure it is prioritized as a security concern.
- **Content**: - **Content**:
- **Affected Versions**: Clearly specify which versions are affected by the issue. - **Affected Versions**: Clearly specify which versions are affected by the issue.

clippy.toml Normal file

@ -0,0 +1,4 @@
avoid-breaking-exported-api = false
disallowed-methods = [
"std::path::Path::exists"
]
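
The `disallowed-methods` entry enforces the switch away from the blocking `std::path::Path::exists`. Below is a minimal sketch of the async pattern such a lint pushes toward, assuming a tokio runtime; it is illustrative, not pesde's actual code:

```rust
use std::path::Path;

// Hedged sketch: check for existence without blocking the async executor.
// `tokio::fs::try_exists` returns io::Result<bool>; here I/O errors are
// treated as "does not exist", which may not match pesde's error handling.
async fn path_exists(path: &Path) -> bool {
    tokio::fs::try_exists(path).await.unwrap_or(false)
}
```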

Binary file not shown.


@ -10,20 +10,20 @@
"astro": "astro" "astro": "astro"
}, },
"dependencies": { "dependencies": {
"@astrojs/check": "^0.9.3", "@astrojs/check": "0.9.4",
"@astrojs/starlight": "^0.28.2", "@astrojs/starlight": "0.30.6",
"@astrojs/starlight-tailwind": "^2.0.3", "@astrojs/starlight-tailwind": "3.0.0",
"@astrojs/tailwind": "^5.1.1", "@astrojs/tailwind": "5.1.4",
"@fontsource-variable/nunito-sans": "^5.1.0", "@fontsource-variable/nunito-sans": "^5.1.1",
"@shikijs/rehype": "^1.21.0", "@shikijs/rehype": "^1.26.2",
"astro": "^4.15.9", "astro": "5.1.5",
"sharp": "^0.33.5", "sharp": "^0.33.5",
"shiki": "^1.21.0", "shiki": "^1.26.2",
"tailwindcss": "^3.4.13", "tailwindcss": "^3.4.17",
"typescript": "^5.6.2" "typescript": "^5.7.3"
}, },
"devDependencies": { "devDependencies": {
"prettier-plugin-astro": "^0.14.1", "prettier-plugin-astro": "^0.14.1",
"prettier-plugin-tailwindcss": "^0.6.8" "prettier-plugin-tailwindcss": "^0.6.9"
} }
} }


@ -3,12 +3,7 @@
href="https://pesde.daimond113.com/" href="https://pesde.daimond113.com/"
class="flex text-[var(--sl-color-text-accent)] hover:opacity-80" class="flex text-[var(--sl-color-text-accent)] hover:opacity-80"
> >
<svg <svg viewBox="0 0 56 28" class="h-7" fill="none" xmlns="http://www.w3.org/2000/svg">
viewBox="0 0 56 28"
class="h-7"
fill="none"
xmlns="http://www.w3.org/2000/svg"
>
<title>pesde</title> <title>pesde</title>
<path <path
d="M0 28V26.3156H2.25652V12.2361H0.0635639V10.5517H4.44947L4.48125 11.9819L3.78205 12.3315C4.41769 11.6746 5.16986 11.1661 6.03857 10.8059C6.92846 10.4245 7.82895 10.2338 8.74003 10.2338C9.863 10.2338 10.88 10.4775 11.7911 10.9648C12.7234 11.4522 13.4544 12.1726 13.9841 13.126C14.5349 14.0795 14.8104 15.2448 14.8104 16.6221C14.8104 18.0416 14.5138 19.26 13.9205 20.277C13.3272 21.2728 12.5327 22.0356 11.5368 22.5653C10.5622 23.095 9.5028 23.3598 8.35865 23.3598C7.72301 23.3598 7.11916 23.2751 6.54708 23.1056C5.99619 22.9361 5.50887 22.7242 5.08511 22.4699C4.66135 22.1945 4.34353 21.8873 4.13165 21.5483L4.60838 21.4529L4.5766 26.3156H7.02381V28H0ZM7.94549 21.6118C9.19558 21.6118 10.2444 21.2092 11.0919 20.4041C11.9394 19.5778 12.3632 18.3807 12.3632 16.8127C12.3632 15.2872 11.9606 14.1113 11.1555 13.2849C10.3503 12.4586 9.3333 12.0454 8.1044 12.0454C7.72301 12.0454 7.26747 12.1196 6.73777 12.2679C6.20807 12.395 5.67837 12.6069 5.14867 12.9035C4.61898 13.2002 4.17403 13.5922 3.81383 14.0795L4.5766 12.7446L4.60838 20.7219L3.8774 19.7367C4.42828 20.3299 5.06392 20.7961 5.78431 21.1351C6.5047 21.4529 7.2251 21.6118 7.94549 21.6118Z" d="M0 28V26.3156H2.25652V12.2361H0.0635639V10.5517H4.44947L4.48125 11.9819L3.78205 12.3315C4.41769 11.6746 5.16986 11.1661 6.03857 10.8059C6.92846 10.4245 7.82895 10.2338 8.74003 10.2338C9.863 10.2338 10.88 10.4775 11.7911 10.9648C12.7234 11.4522 13.4544 12.1726 13.9841 13.126C14.5349 14.0795 14.8104 15.2448 14.8104 16.6221C14.8104 18.0416 14.5138 19.26 13.9205 20.277C13.3272 21.2728 12.5327 22.0356 11.5368 22.5653C10.5622 23.095 9.5028 23.3598 8.35865 23.3598C7.72301 23.3598 7.11916 23.2751 6.54708 23.1056C5.99619 22.9361 5.50887 22.7242 5.08511 22.4699C4.66135 22.1945 4.34353 21.8873 4.13165 21.5483L4.60838 21.4529L4.5766 26.3156H7.02381V28H0ZM7.94549 21.6118C9.19558 21.6118 10.2444 21.2092 11.0919 20.4041C11.9394 19.5778 12.3632 18.3807 12.3632 16.8127C12.3632 15.2872 11.9606 14.1113 11.1555 13.2849C10.3503 12.4586 9.3333 12.0454 8.1044 12.0454C7.72301 12.0454 7.26747 12.1196 6.73777 12.2679C6.20807 12.395 5.67837 12.6069 5.14867 12.9035C4.61898 13.2002 4.17403 13.5922 3.81383 14.0795L4.5766 12.7446L4.60838 20.7219L3.8774 19.7367C4.42828 20.3299 5.06392 20.7961 5.78431 21.1351C6.5047 21.4529 7.2251 21.6118 7.94549 21.6118Z"
@ -27,8 +22,7 @@
fill="currentColor"></path> fill="currentColor"></path>
</svg> </svg>
</a> </a>
<span class="-mt-px ml-2.5 mr-2 text-xl text-[var(--sl-color-gray-5)]">/</span <span class="-mt-px ml-2.5 mr-2 text-xl text-[var(--sl-color-gray-5)]">/</span>
>
<a <a
class="font-medium text-[var(--sl-color-gray-2)] no-underline hover:opacity-80 md:text-lg" class="font-medium text-[var(--sl-color-gray-2)] no-underline hover:opacity-80 md:text-lg"
href="/">docs</a href="/">docs</a


@ -1,6 +1,7 @@
import { defineCollection } from "astro:content" import { defineCollection } from "astro:content"
import { docsLoader } from "@astrojs/starlight/loaders"
import { docsSchema } from "@astrojs/starlight/schema" import { docsSchema } from "@astrojs/starlight/schema"
export const collections = { export const collections = {
docs: defineCollection({ schema: docsSchema() }), docs: defineCollection({ loader: docsLoader(), schema: docsSchema() }),
} }


@ -42,6 +42,9 @@ hello
# Hello, pesde! (pesde/hello@1.0.0, lune) # Hello, pesde! (pesde/hello@1.0.0, lune)
``` ```
Note that binaries are scoped to the nearest `pesde.toml` file. However, you can
use the workspace root's binaries from member packages.
## Making a binary package ## Making a binary package
To make a binary package you must use a target compatible with binary exports. To make a binary package you must use a target compatible with binary exports.


@ -137,6 +137,24 @@ pesde add workspace:acme/bar
href="/guides/workspaces/" href="/guides/workspaces/"
/> />
## Path Dependencies
Path dependencies are dependencies located anywhere accessible to the operating system.
They are useful for local development, but are forbidden in published packages.
The path must be absolute and point to a directory containing a `pesde.toml` file.
```toml title="pesde.toml"
[dependencies]
foo = { path = "/home/user/foo" }
```
You can also add a path dependency by running the following command:
```sh
pesde add path:/home/user/foo
```
## Peer Dependencies ## Peer Dependencies
Peer dependencies are dependencies that are not installed automatically when Peer dependencies are dependencies that are not installed automatically when


@ -32,6 +32,29 @@ foo = { name = "acme/foo", version = "^1.0.0" }
Now, when you run `pesde install`, `bar` 2.0.0 will be used instead of 1.0.0. Now, when you run `pesde install`, `bar` 2.0.0 will be used instead of 1.0.0.
Overrides are also able to use aliases to share the specifier you use for your
own dependencies:
```toml title="pesde.toml"
[dependencies]
foo = { name = "acme/foo", version = "^1.0.0" }
bar = { name = "acme/bar", version = "^2.0.0" }
[overrides]
"foo>bar" = "bar"
```
This is the same as if you had written:
```toml title="pesde.toml"
[dependencies]
foo = { name = "acme/foo", version = "^1.0.0" }
bar = { name = "acme/bar", version = "^2.0.0" }
[overrides]
"foo>bar" = { name = "acme/bar", version = "^2.0.0" }
```
You can learn more about the syntax for dependency overrides in the You can learn more about the syntax for dependency overrides in the
[reference](/reference/manifest#overrides). [reference](/reference/manifest#overrides).


@ -91,6 +91,13 @@ For example, you may publish a package that can be used in both Roblox and
Luau environments by publishing two versions of the package, one for each Luau environments by publishing two versions of the package, one for each
environment. environment.
<Aside type="caution">
Packages for different targets but on the same version must have
the same description.
</Aside>
## Documentation ## Documentation
The `README.md` file in the root of the package will be displayed on the The `README.md` file in the root of the package will be displayed on the


@ -0,0 +1,56 @@
---
title: Removing Packages
description: Learn how to remove packages from the registry.
---
pesde doesn't support removing packages from the registry. This is to ensure
that the registry remains a reliable source of packages for everyone. However,
pesde provides other mechanisms to handle packages that are no longer needed.
## Yanking
Yanking is limited to a specific version (and target) of a package. It is used
to mark a version as broken or deprecated. Yanked versions can no longer be
downloaded fresh, but they can still be installed if they are present in the
lockfile of a project.
To yank a package, you can use the `pesde yank` command:
```sh
pesde yank <PACKAGE>@<VERSION> <TARGET>
```
You can leave out the target if you want to yank all targets of the version:
```sh
pesde yank <PACKAGE>@<VERSION>
```
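
For instance, with a hypothetical package:

```sh
# Yanks only the luau artifact of version 1.2.3 (package name is illustrative):
pesde yank acme/old-package@1.2.3 luau
```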
## Deprecating
Deprecating, on the other hand, marks an entire package as deprecated in the
registry. This is useful when you want to discourage users from using a
package, but don't want to break existing projects that depend on it. Unlike
yanking, the package can still be installed fresh; however, a warning will be
shown to the user whenever it is installed.
To deprecate a package, you can use the `pesde deprecate` command:
```sh
pesde deprecate <PACKAGE> [REASON]
```
You must provide a non-empty reason when deprecating a package. This is to
inform users why the package is deprecated. For example, if your package
has been replaced by another package, you can provide a reason like:
```sh
pesde deprecate acme/old-package "This package has been replaced by acme/new-package."
```
## Other Options
There are other situations in which you might want to remove a package from
the registry. Please refer to the policies of the registry you are using for
more information on how to handle these situations. The process for the official
registry is described [here](/registry/policies/#package-removal).


@ -188,10 +188,13 @@ This will cause the `src` directory to be directly synced into Roblox.
In pesde, you should not have a `default.project.json` file in your package. In pesde, you should not have a `default.project.json` file in your package.
Instead, you are required to use the `build_files` field to specify a 1:1 match Instead, you are required to use the `build_files` field to specify a 1:1 match
between Roblox and the file system. pesde forbids `default.project.json` to be between Roblox and the file system. These are given to the
part of a published package, and regenerates it when installing a pesde git `roblox_sync_config_generator` script to generate the configuration for the sync
dependency. This allows the consumer of your package to choose the sync tool tool the user is using. pesde forbids `default.project.json` to be part of a
they want to use, instead of being constrained to only using Rojo. published package, as well as ignoring them from Git dependencies. This allows
the consumer of your package to choose the sync tool they want to use, instead
of being constrained to only using Rojo as well as preventing broken packages
from being published (for example, if the project is configured as a DataModel).
This has the effect that the structure of the files in the file system ends up This has the effect that the structure of the files in the file system ends up
being reflected inside Roblox. being reflected inside Roblox.
@ -227,3 +230,19 @@ Whereas with pesde, it looks like this:
- dependency (roblox_packages/dependency.luau) - dependency (roblox_packages/dependency.luau)
</FileTree> </FileTree>
### The `roblox_server` target
Although optimizing a server-only dependency with the `roblox_server` target
might sound like a good idea, it is not recommended, since it complicates
linking and makes your package unnecessarily harder to use. On a public registry
it is also redundant, since the package can be downloaded by anyone. Keeping the
scripts from syncing to the client may also come up as a reason, but that is a
micro-optimization whose effect is very hard to observe, so it is unnecessary.
The target exists for a reason, that is
[private registries](/guides/self-hosting-registries). You might want to have
internal packages, such as configs or otherwise sensitive code which you do not
want clients to see. This is where the `roblox_server` target comes in handy.
If you're not using a private registry you should use the standard `roblox`
target instead.


@ -23,7 +23,7 @@ the following content:
api = "https://registry.acme.local/" api = "https://registry.acme.local/"
# package download URL (optional) # package download URL (optional)
download = "{API_URL}/v0/packages/{PACKAGE}/{PACKAGE_VERSION}/{PACKAGE_TARGET}" download = "{API_URL}/v1/packages/{PACKAGE}/{PACKAGE_VERSION}/{PACKAGE_TARGET}/archive"
# the client ID of the GitHub OAuth app (optional) # the client ID of the GitHub OAuth app (optional)
github_oauth_client_id = "a1d648966fdfbdcd9295" github_oauth_client_id = "a1d648966fdfbdcd9295"
@ -58,7 +58,7 @@ scripts_packages = ["pesde/scripts_rojo"]
- `{PACKAGE_VERSION}`: The package version. - `{PACKAGE_VERSION}`: The package version.
- `{PACKAGE_TARGET}`: The package target. - `{PACKAGE_TARGET}`: The package target.
Defaults to `{API_URL}/v0/packages/{PACKAGE}/{PACKAGE_VERSION}/{PACKAGE_TARGET}`. Defaults to `{API_URL}/v1/packages/{PACKAGE}/{PACKAGE_VERSION}/{PACKAGE_TARGET}/archive`.
- **github_oauth_client_id**: This is required if you use GitHub OAuth for - **github_oauth_client_id**: This is required if you use GitHub OAuth for
authentication. See below for more information. authentication. See below for more information.
@ -115,11 +115,11 @@ for this purpose.
`GITHUB_USERNAME`. This is required. `GITHUB_USERNAME`. This is required.
- **COMMITTER_GIT_NAME**: The name to use for the committer when updating the - **COMMITTER_GIT_NAME**: The name to use for the committer when updating the
index repository.\ index repository. This is required.\
Example: `pesde index updater` Example: `pesde index updater`
- **COMMITTER_GIT_EMAIL**: The email to use for the committer when updating the - **COMMITTER_GIT_EMAIL**: The email to use for the committer when updating the
index repository.\ index repository. This is required.\
Example: `pesde@localhost` Example: `pesde@localhost`
- **DATA_DIR**: The directory where the registry stores miscellaneous data. - **DATA_DIR**: The directory where the registry stores miscellaneous data.


@ -5,22 +5,11 @@ description: Install pesde
import { Aside, Steps, TabItem, Tabs } from "@astrojs/starlight/components" import { Aside, Steps, TabItem, Tabs } from "@astrojs/starlight/components"
## Prerequisites
pesde requires [Lune](https://lune-org.github.io/docs) to be installed on your
system in order to function properly.
You can follow the installation instructions in the
[Lune documentation](https://lune-org.github.io/docs/getting-started/1-installation).
## Installing pesde
<Steps> <Steps>
1. Go to the [GitHub releases page](https://github.com/pesde-pkg/pesde/releases/latest). 1. Go to the [GitHub releases page](https://github.com/pesde-pkg/pesde/releases/latest).
2. Download the corresponding archive for your operating system. You can choose 2. Download the corresponding archive for your operating system.
whether to use the `.zip` or `.tar.gz` files.
3. Extract the downloaded archive to a folder on your computer. 3. Extract the downloaded archive to a folder on your computer.
@ -76,6 +65,7 @@ You can follow the installation instructions in the
</TabItem> </TabItem>
</Tabs> </Tabs>
<br />
5. Verify that pesde is installed by running the following command: 5. Verify that pesde is installed by running the following command:
@ -92,8 +82,8 @@ You can follow the installation instructions in the
It is not recommended to use toolchain managers (such as Rokit or Aftman) to It is not recommended to use toolchain managers (such as Rokit or Aftman) to
install pesde. You can use `pesde self-upgrade` if you need to update pesde. install pesde. You can use `pesde self-upgrade` if you need to update pesde.
If you need everyone to use the same version of pesde, you can use the If you need everyone to use a compatible version of pesde, you can use the
`pesde_version` field in `pesde.toml` to specify the version of pesde to use `[engines.pesde]` field in `pesde.toml` to specify the version of pesde to use
for the current project. for the current project.
</Aside> </Aside>
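
A minimal sketch of such a pin, using the `[engines]` syntax documented elsewhere in this changeset (the version range is illustrative):

```toml
[engines]
pesde = "^0.6.0"
```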


@ -33,7 +33,7 @@ pesde init
# what is the repository URL of this project? # what is the repository URL of this project?
# what is the license of this project? MIT # what is the license of this project? MIT
# what environment are you targeting for your package? luau # what environment are you targeting for your package? luau
# would you like to setup default Roblox compatibility scripts? No # would you like to setup Roblox compatibility scripts? No
``` ```
The command will create a `pesde.toml` file in the current folder. Go ahead The command will create a `pesde.toml` file in the current folder. Go ahead


@ -55,10 +55,85 @@ is printed.
The default index is [`pesde-index`](https://github.com/pesde-pkg/index). The default index is [`pesde-index`](https://github.com/pesde-pkg/index).
## `pesde cas`
Content-addressable storage (CAS) related commands.
### `pesde cas prune`
Removes unused CAS files and packages.
## `pesde init` ## `pesde init`
Initializes a new pesde project in the current directory. Initializes a new pesde project in the current directory.
## `pesde add`
```sh
pesde add <PACKAGE>
```
Adds a package to the dependencies of the current project.
- `-i, --index <INDEX>`: The index in which to search for the package.
- `-t, --target <TARGET>`: The target environment for the package.
- `-a, --alias <ALIAS>`: The alias to use for the package, defaults to the
package name.
- `-p, --peer`: Adds the package as a peer dependency.
- `-d, --dev`: Adds the package as a dev dependency.
The following formats are supported:
```sh
pesde add pesde/hello
pesde add pesde/hello@1.2.3
pesde add wally#pesde/hello
pesde add wally#pesde/hello@1.2.3
pesde add gh#acme/package#main
pesde add https://git.acme.local/package.git#aeff6
pesde add workspace:pesde/hello
pesde add workspace:pesde/hello@1.2.3
pesde add path:/home/user/package
```
## `pesde remove`
```sh
pesde remove <ALIAS>
```
Removes a package from the dependencies of the current project.
## `pesde install`
Installs dependencies for the current project.
- `--locked`: Whether to error if the lockfile is out of date.
- `--prod`: Whether to skip linking dev dependencies.
- `--network-concurrency <CONCURRENCY>`: The maximum number of concurrent
network requests. Defaults to 16.
- `--force`: Whether to forcibly reinstall all packages even if they are
already installed (useful if there is any issue with the current installation).
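
For example, a typical CI invocation combining these flags:

```sh
# Fail if the lockfile is out of date and skip linking dev dependencies.
pesde install --locked --prod
```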
## `pesde update`
Updates the dependencies of the current project.
- `--no-install`: Whether to only update the lockfile without installing the
dependencies.
- `--network-concurrency <CONCURRENCY>`: The maximum number of concurrent
network requests. Defaults to 16.
- `--force`: Whether to forcibly reinstall all packages even if they are
already installed (useful if there is any issue with the current installation).
## `pesde outdated`
Lists outdated dependencies of the current project.
## `pesde list`
Lists the dependencies of the current project.
## `pesde run` ## `pesde run`
Runs a script from the current project using Lune. Runs a script from the current project using Lune.
@ -83,13 +158,6 @@ Arguments can be passed to the script by using `--` followed by the arguments.
pesde run foo -- --arg1 --arg2 pesde run foo -- --arg1 --arg2
``` ```
## `pesde install`
Installs dependencies for the current project.
- `--locked`: Whether to error if the lockfile is out of date.
- `--prod`: Whether to skip installing dev dependencies.
## `pesde publish` ## `pesde publish`
Publishes the current project to the pesde registry. Publishes the current project to the pesde registry.
@ -99,18 +167,26 @@ Publishes the current project to the pesde registry.
publish it. publish it.
- `-y, --yes`: Whether to skip the confirmation prompt. - `-y, --yes`: Whether to skip the confirmation prompt.
- `-i, --index`: Name of the index to publish to. Defaults to `default`. - `-i, --index`: Name of the index to publish to. Defaults to `default`.
- `--no-verify`: Whether to skip syntax validation of the exports of the
package.
## `pesde self-install` ## `pesde yank`
Performs the pesde installation process. This should be the first command run Yanks a version of a package from the registry.
after downloading the pesde binary.
## `pesde self-upgrade` - `--undo`: Whether to unyank the package.
- `-i, --index`: Name of the index to yank from. Defaults to `default`.
Upgrades the pesde binary to the latest version. ## `pesde deprecate`
- `--use-cached`: Whether to use the version displayed in the "upgrade available" ```sh
message instead of checking for the latest version. pesde deprecate <PACKAGE> [REASON]
```
Deprecates a package in the registry. A non-empty reason must be provided.
- `--undo`: Whether to undeprecate the package.
- `-i, --index`: Name of the index to deprecate from. Defaults to `default`.
## `pesde patch` ## `pesde patch`
@ -137,33 +213,6 @@ pesde patch-commit <PATH>
Applies the changes made in the patching environment created by `pesde patch`. Applies the changes made in the patching environment created by `pesde patch`.
## `pesde add`
```sh
pesde add <PACKAGE>
```
Adds a package to the dependencies of the current project.
- `-i, --index <INDEX>`: The index in which to search for the package.
- `-t, --target <TARGET>`: The target environment for the package.
- `-a, --alias <ALIAS>`: The alias to use for the package, defaults to the
package name.
- `-p, --peer`: Adds the package as a peer dependency.
- `-d, --dev`: Adds the package as a dev dependency.
The following formats are supported:
```sh
pesde add pesde/hello
pesde add gh#acme/package#main
pesde add https://git.acme.local/package.git#aeff6
```
## `pesde update`
Updates the dependencies of the current project.
## `pesde x` ## `pesde x`
Runs a one-off binary package. Runs a one-off binary package.
@ -178,3 +227,15 @@ a pesde project.
```sh ```sh
pesde x pesde/hello pesde x pesde/hello
``` ```
## `pesde self-install`
Performs the pesde installation process. This should be the first command run
after downloading the pesde binary.
## `pesde self-upgrade`
Upgrades the pesde binary to the latest version.
- `--use-cached`: Whether to use the version displayed in the "upgrade available"
message instead of checking for the latest version.


@ -84,11 +84,6 @@ includes = [
] ]
``` ```
### `pesde_version`
The version of pesde to use within this project. The `pesde` CLI will look at
this field and run the correct version of pesde for this project.
### `workspace_members` ### `workspace_members`
A list of globs containing the members of this workspace. A list of globs containing the members of this workspace.
@ -273,10 +268,27 @@ version `1.0.0`, and the `bar` and `baz` dependencies of the `foo` package with
version `2.0.0`. version `2.0.0`.
Each key in the overrides table is a comma-separated list of package paths. The Each key in the overrides table is a comma-separated list of package paths. The
path is a list of package names separated by `>`. For example, `foo>bar>baz` path is a list of aliases separated by `>`. For example, `foo>bar>baz`
refers to the `baz` dependency of the `bar` package, which is a dependency of refers to the `baz` dependency of the `bar` package, which is a dependency of
the `foo` package. the `foo` package.
The value of an override entry can be either a specifier or an alias. If it is an
alias (a string), it will be equivalent to putting the specifier of the dependency
under that alias. For example, the following two overrides are equivalent:
```toml
[dependencies]
bar = { name = "acme/bar", version = "2.0.0" }
[overrides]
"foo>bar" = "bar"
```
```toml
[overrides]
"foo>bar" = { name = "acme/bar", version = "2.0.0" }
```
<LinkCard <LinkCard
title="Overrides" title="Overrides"
description="Learn more about overriding and patching packages." description="Learn more about overriding and patching packages."
@ -399,18 +411,19 @@ foo = { workspace = "acme/foo", version = "^" }
href="/guides/workspaces/#workspace-dependencies" href="/guides/workspaces/#workspace-dependencies"
/> />
## `[peer_dependencies]` ### Path
The `[peer_dependencies]` section contains a list of peer dependencies for the
package. These are dependencies that are required by the package, but are not
installed automatically. Instead, they must be installed by the user of the
package.
```toml ```toml
[peer_dependencies] [dependencies]
foo = { name = "acme/foo", version = "1.2.3" } foo = { path = "/home/user/foo" }
``` ```
**Path dependencies** contain the following fields:
- `path`: The path to the package on the local filesystem.
Path dependencies are forbidden in published packages.
## `[dev_dependencies]` ## `[dev_dependencies]`
The `[dev_dependencies]` section contains a list of development dependencies for The `[dev_dependencies]` section contains a list of development dependencies for
@ -430,3 +443,31 @@ foo = { name = "acme/foo", version = "1.2.3" }
description="Learn more about specifying dependencies in pesde." description="Learn more about specifying dependencies in pesde."
href="/guides/dependencies/" href="/guides/dependencies/"
/> />
## `[peer_dependencies]`
The `[peer_dependencies]` section contains a list of peer dependencies for the
package. These are dependencies that are required by the package, but are not
installed automatically. Instead, they must be installed by the user of the
package.
```toml
[peer_dependencies]
foo = { name = "acme/foo", version = "1.2.3" }
```
## `[engines]`
The `[engines]` section contains a list of engines that the package is compatible
with.
```toml
[engines]
pesde = "^0.6.0"
lune = "^0.8.9"
```
Currently, the only engines that can be specified are `pesde` and `lune`.
Additionally, the engines you declare in your project will be installed when
you run `pesde install`. A version of the engine that satisfies the specified
version range will then be used whenever you run the engine.


@ -5,18 +5,46 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html). and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [0.1.2] ## [0.2.2]
### Changed
- Make search building not async (as it didn't benefit from it) by @daimond113
## [0.2.1] - 2025-03-02
### Changed
- Print more error info by @daimond113
## [0.2.0] - 2025-02-22
### Added
- Support deprecating and yanking packages by @daimond113
- Add yanking & deprecating to registry by @daimond113
- Log more information about configured auth & storage by @daimond113
- Add individual endpoints for package data over using `Accept` header conditional returns by @daimond113
- Set `Content-Length` header for FS storage backend by @daimond113
### Changed
- Remove native-tls dependency by @daimond113
- Make aliases case-insensitive by @daimond113
### Performance
- Switch to using a `RwLock` over a `Mutex` to store repository data by @daimond113
- Asyncify blocking operations by @daimond113
- Asyncify reading of package data of top search results by @daimond113
## [0.1.2] - 2024-12-30
### Changed ### Changed
- Update to pesde lib API changes by @daimond113 - Update to pesde lib API changes by @daimond113
## [0.1.1] - 2024-12-19 ## [0.1.1] - 2024-12-19
### Changed ### Changed
- Switch to traccing for logging by @daimond113 - Switch to tracing for logging by @daimond113
## [0.1.0] - 2024-12-14 ## [0.1.0] - 2024-12-14
### Added ### Added
- Rewrite registry for pesde v0.5.0 by @daimond113 - Rewrite registry for pesde v0.5.0 by @daimond113
[0.2.2]: https://github.com/daimond113/pesde/compare/v0.6.0%2Bregistry.0.2.1..v0.6.1%2Bregistry.0.2.2
[0.2.1]: https://github.com/daimond113/pesde/compare/v0.6.0%2Bregistry.0.2.0..v0.6.0%2Bregistry.0.2.1
[0.2.0]: https://github.com/daimond113/pesde/compare/v0.5.3%2Bregistry.0.1.2..v0.6.0%2Bregistry.0.2.0
[0.1.2]: https://github.com/daimond113/pesde/compare/v0.5.2%2Bregistry.0.1.1..v0.5.3%2Bregistry.0.1.2 [0.1.2]: https://github.com/daimond113/pesde/compare/v0.5.2%2Bregistry.0.1.1..v0.5.3%2Bregistry.0.1.2
[0.1.1]: https://github.com/daimond113/pesde/compare/v0.5.1%2Bregistry.0.1.0..v0.5.2%2Bregistry.0.1.1 [0.1.1]: https://github.com/daimond113/pesde/compare/v0.5.1%2Bregistry.0.1.0..v0.5.2%2Bregistry.0.1.1
[0.1.0]: https://github.com/daimond113/pesde/compare/v0.4.7..v0.5.0%2Bregistry.0.1.0 [0.1.0]: https://github.com/daimond113/pesde/compare/v0.4.7..v0.5.0%2Bregistry.0.1.0


@ -1,40 +1,44 @@
[package] [package]
name = "pesde-registry" name = "pesde-registry"
version = "0.1.2" version = "0.2.2"
edition = "2021" edition = "2021"
repository = "https://github.com/pesde-pkg/index" repository = "https://github.com/pesde-pkg/index"
publish = false publish = false
[lints]
workspace = true
[dependencies] [dependencies]
actix-web = "4.9.0" actix-web = "4.9.0"
actix-cors = "0.7.0" actix-cors = "0.7.0"
actix-governor = "0.8.0" actix-governor = "0.8.0"
dotenvy = "0.15.7" dotenvy = "0.15.7"
thiserror = "2.0.7" thiserror = "2.0.11"
tantivy = "0.22.0" tantivy = "0.22.0"
semver = "1.0.24" semver = "1.0.25"
chrono = { version = "0.4.39", features = ["serde"] } jiff = { version = "0.1.29", features = ["serde"] }
futures = "0.3.31" futures = "0.3.31"
tokio = "1.42.0" tokio = "1.43.0"
tempfile = "3.14.0" tokio-util = "0.7.13"
fs-err = { version = "3.0.0", features = ["tokio"] } tempfile = "3.16.0"
fs-err = { version = "3.1.0", features = ["tokio"] }
async-stream = "0.3.6" async-stream = "0.3.6"
git2 = "0.19.0" git2 = "0.20.0"
gix = { version = "0.68.0", default-features = false, features = [ gix = { version = "0.70.0", default-features = false, features = [
"blocking-http-transport-reqwest-rust-tls", "blocking-http-transport-reqwest-rust-tls",
"credentials", "credentials",
] } ] }
serde = "1.0.216" serde = "1.0.217"
serde_json = "1.0.133" serde_json = "1.0.138"
serde_yaml = "0.9.34" serde_yaml = "0.9.34"
toml = "0.8.19" toml = "0.8.20"
convert_case = "0.6.0" convert_case = "0.7.1"
sha2 = "0.10.8" sha2 = "0.10.8"
rusty-s3 = "0.5.0" rusty-s3 = "0.7.0"
reqwest = { version = "0.12.9", features = ["json", "rustls-tls"] } reqwest = { version = "0.12.12", default-features = false, features = ["json", "rustls-tls"] }
constant_time_eq = "0.3.1" constant_time_eq = "0.3.1"
tokio-tar = "0.3.1" tokio-tar = "0.3.1"
@ -44,7 +48,7 @@ tracing = { version = "0.1.41", features = ["attributes"] }
tracing-subscriber = { version = "0.3.19", features = ["env-filter"] } tracing-subscriber = { version = "0.3.19", features = ["env-filter"] }
tracing-actix-web = "0.7.15" tracing-actix-web = "0.7.15"
sentry = { version = "0.35.0", default-features = false, features = ["backtrace", "contexts", "debug-images", "panic", "reqwest", "rustls", "tracing"] } sentry = { version = "0.36.0", default-features = false, features = ["backtrace", "contexts", "debug-images", "panic", "reqwest", "rustls", "tracing"] }
sentry-actix = "0.35.0" sentry-actix = "0.36.0"
pesde = { path = "..", features = ["wally-compat"] } pesde = { path = "..", default-features = false, features = ["wally-compat"] }


@ -1,6 +1,6 @@
use crate::{ use crate::{
auth::{get_token_from_req, AuthImpl, UserId}, auth::{get_token_from_req, AuthImpl, UserId},
error::ReqwestErrorExt, error::{display_error, ReqwestErrorExt as _},
}; };
use actix_web::{dev::ServiceRequest, Error as ActixError}; use actix_web::{dev::ServiceRequest, Error as ActixError};
use reqwest::StatusCode; use reqwest::StatusCode;
@ -21,9 +21,8 @@ struct TokenRequestBody {
impl AuthImpl for GitHubAuth { impl AuthImpl for GitHubAuth {
async fn for_write_request(&self, req: &ServiceRequest) -> Result<Option<UserId>, ActixError> { async fn for_write_request(&self, req: &ServiceRequest) -> Result<Option<UserId>, ActixError> {
let token = match get_token_from_req(req) { let Some(token) = get_token_from_req(req) else {
Some(token) => token, return Ok(None);
None => return Ok(None),
}; };
let response = match self let response = match self
@ -46,14 +45,14 @@ impl AuthImpl for GitHubAuth {
} }
Err(_) => { Err(_) => {
tracing::error!( tracing::error!(
"failed to get user: {}", "failed to get user info: {}",
response.into_error().await.unwrap_err() display_error(response.into_error().await.unwrap_err())
); );
return Ok(None); return Ok(None);
} }
}, },
Err(e) => { Err(e) => {
tracing::error!("failed to get user: {e}"); tracing::error!("failed to send user info request: {}", display_error(e));
return Ok(None); return Ok(None);
} }
}; };
@ -61,7 +60,7 @@ impl AuthImpl for GitHubAuth {
let user_id = match response.json::<UserResponse>().await { let user_id = match response.json::<UserResponse>().await {
Ok(resp) => resp.user.id, Ok(resp) => resp.user.id,
Err(e) => { Err(e) => {
tracing::error!("failed to get user: {e}"); tracing::error!("failed to parse user info response: {}", display_error(e));
return Ok(None); return Ok(None);
} }
}; };
@ -72,7 +71,7 @@ impl AuthImpl for GitHubAuth {
impl Display for GitHubAuth { impl Display for GitHubAuth {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "GitHub") write!(f, "GitHub (client id: {})", self.client_id)
} }
} }
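
The `display_error` helper imported above is not shown in this diff. A hedged sketch of what such a helper might do, walking the `std::error::Error::source` chain so the root error and every parent error are printed; pesde's actual implementation may differ:

```rust
use std::error::Error;

// Hedged sketch: format the root error followed by each error in its
// `source()` chain, so nested failures are not silently dropped.
fn display_error(err: impl Error) -> String {
    let mut out = err.to_string();
    let mut source = err.source();
    while let Some(inner) = source {
        out.push_str(&format!("\n\tcaused by: {inner}"));
        source = inner.source();
    }
    out
}

fn main() {
    let err = std::io::Error::new(std::io::ErrorKind::Other, "root failure");
    println!("{}", display_error(err));
}
```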


@ -11,11 +11,11 @@ use actix_web::{
error::Error as ActixError, error::Error as ActixError,
http::header::AUTHORIZATION, http::header::AUTHORIZATION,
middleware::Next, middleware::Next,
web, HttpMessage, HttpResponse, web, HttpMessage as _, HttpResponse,
}; };
use pesde::source::pesde::IndexConfig; use pesde::source::pesde::IndexConfig;
use sentry::add_breadcrumb; use sentry::add_breadcrumb;
use sha2::{Digest, Sha256}; use sha2::{Digest as _, Sha256};
use std::fmt::Display; use std::fmt::Display;
#[derive(Debug, Copy, Clone, Hash, PartialOrd, PartialEq, Eq, Ord)] #[derive(Debug, Copy, Clone, Hash, PartialOrd, PartialEq, Eq, Ord)]
@ -93,10 +93,10 @@ impl AuthImpl for Auth {
impl Display for Auth { impl Display for Auth {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self { match self {
Auth::GitHub(github) => write!(f, "{}", github), Auth::GitHub(github) => write!(f, "{github}"),
Auth::None(none) => write!(f, "{}", none), Auth::None(none) => write!(f, "{none}"),
Auth::Token(token) => write!(f, "{}", token), Auth::Token(token) => write!(f, "{token}"),
Auth::RwToken(rw_token) => write!(f, "{}", rw_token), Auth::RwToken(rw_token) => write!(f, "{rw_token}"),
} }
} }
} }
@ -106,13 +106,10 @@ pub async fn write_mw(
req: ServiceRequest, req: ServiceRequest,
next: Next<impl MessageBody + 'static>, next: Next<impl MessageBody + 'static>,
) -> Result<ServiceResponse<impl MessageBody>, ActixError> { ) -> Result<ServiceResponse<impl MessageBody>, ActixError> {
let user_id = match app_state.auth.for_write_request(&req).await? { let Some(user_id) = app_state.auth.for_write_request(&req).await? else {
Some(user_id) => user_id,
None => {
return Ok(req return Ok(req
.into_response(HttpResponse::Unauthorized().finish()) .into_response(HttpResponse::Unauthorized().finish())
.map_into_right_body()) .map_into_right_body());
}
}; };
add_breadcrumb(sentry::Breadcrumb { add_breadcrumb(sentry::Breadcrumb {
@ -124,7 +121,9 @@ pub async fn write_mw(
req.extensions_mut().insert(user_id); req.extensions_mut().insert(user_id);
next.call(req).await.map(|res| res.map_into_left_body()) next.call(req)
.await
.map(ServiceResponse::map_into_left_body)
} }
pub async fn read_mw( pub async fn read_mw(
@ -133,13 +132,10 @@ pub async fn read_mw(
next: Next<impl MessageBody + 'static>, next: Next<impl MessageBody + 'static>,
) -> Result<ServiceResponse<impl MessageBody>, ActixError> { ) -> Result<ServiceResponse<impl MessageBody>, ActixError> {
if app_state.auth.read_needs_auth() { if app_state.auth.read_needs_auth() {
let user_id = match app_state.auth.for_read_request(&req).await? { let Some(user_id) = app_state.auth.for_read_request(&req).await? else {
Some(user_id) => user_id,
None => {
return Ok(req return Ok(req
.into_response(HttpResponse::Unauthorized().finish()) .into_response(HttpResponse::Unauthorized().finish())
.map_into_right_body()) .map_into_right_body());
}
}; };
add_breadcrumb(sentry::Breadcrumb { add_breadcrumb(sentry::Breadcrumb {
@ -154,7 +150,9 @@ pub async fn read_mw(
req.extensions_mut().insert(None::<UserId>); req.extensions_mut().insert(None::<UserId>);
} }
next.call(req).await.map(|res| res.map_into_left_body()) next.call(req)
.await
.map(ServiceResponse::map_into_left_body)
} }
pub fn get_auth_from_env(config: &IndexConfig) -> Auth { pub fn get_auth_from_env(config: &IndexConfig) -> Auth {
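
The refactors above replace `match` statements over `Option` with `let`-`else`. A self-contained sketch of the idiom, with illustrative names:

```rust
// `let`-`else` binds on the happy path and forces an early return
// otherwise, keeping the success path unindented.
fn authorize(token: Option<&str>) -> Result<String, &'static str> {
    let Some(token) = token else {
        return Err("unauthorized");
    };
    Ok(format!("token accepted: {token}"))
}

fn main() {
    assert_eq!(authorize(None), Err("unauthorized"));
    assert!(authorize(Some("abc")).is_ok());
}
```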


@ -1,7 +1,7 @@
use crate::auth::{get_token_from_req, AuthImpl, UserId}; use crate::auth::{get_token_from_req, AuthImpl, UserId};
use actix_web::{dev::ServiceRequest, Error as ActixError}; use actix_web::{dev::ServiceRequest, Error as ActixError};
use constant_time_eq::constant_time_eq_32; use constant_time_eq::constant_time_eq_32;
use sha2::{Digest, Sha256}; use sha2::{Digest as _, Sha256};
use std::fmt::Display; use std::fmt::Display;
#[derive(Debug)] #[derive(Debug)]
@ -12,33 +12,23 @@ pub struct RwTokenAuth {
impl AuthImpl for RwTokenAuth { impl AuthImpl for RwTokenAuth {
async fn for_write_request(&self, req: &ServiceRequest) -> Result<Option<UserId>, ActixError> { async fn for_write_request(&self, req: &ServiceRequest) -> Result<Option<UserId>, ActixError> {
let token = match get_token_from_req(req) { let Some(token) = get_token_from_req(req) else {
Some(token) => token, return Ok(None);
None => return Ok(None),
}; };
let token: [u8; 32] = Sha256::digest(token.as_bytes()).into(); let token: [u8; 32] = Sha256::digest(token.as_bytes()).into();
Ok(if constant_time_eq_32(&self.write_token, &token) { Ok(constant_time_eq_32(&self.write_token, &token).then_some(UserId::DEFAULT))
Some(UserId::DEFAULT)
} else {
None
})
} }
async fn for_read_request(&self, req: &ServiceRequest) -> Result<Option<UserId>, ActixError> { async fn for_read_request(&self, req: &ServiceRequest) -> Result<Option<UserId>, ActixError> {
let token = match get_token_from_req(req) { let Some(token) = get_token_from_req(req) else {
Some(token) => token, return Ok(None);
None => return Ok(None),
}; };
let token: [u8; 32] = Sha256::digest(token.as_bytes()).into(); let token: [u8; 32] = Sha256::digest(token.as_bytes()).into();
Ok(if constant_time_eq_32(&self.read_token, &token) { Ok(constant_time_eq_32(&self.read_token, &token).then_some(UserId::DEFAULT))
Some(UserId::DEFAULT)
} else {
None
})
} }
fn read_needs_auth(&self) -> bool { fn read_needs_auth(&self) -> bool {


@ -1,7 +1,7 @@
use crate::auth::{get_token_from_req, AuthImpl, UserId}; use crate::auth::{get_token_from_req, AuthImpl, UserId};
use actix_web::{dev::ServiceRequest, Error as ActixError}; use actix_web::{dev::ServiceRequest, Error as ActixError};
use constant_time_eq::constant_time_eq_32; use constant_time_eq::constant_time_eq_32;
use sha2::{Digest, Sha256}; use sha2::{Digest as _, Sha256};
use std::fmt::Display; use std::fmt::Display;
#[derive(Debug)] #[derive(Debug)]
@ -12,18 +12,13 @@ pub struct TokenAuth {
impl AuthImpl for TokenAuth { impl AuthImpl for TokenAuth {
async fn for_write_request(&self, req: &ServiceRequest) -> Result<Option<UserId>, ActixError> { async fn for_write_request(&self, req: &ServiceRequest) -> Result<Option<UserId>, ActixError> {
let token = match get_token_from_req(req) { let Some(token) = get_token_from_req(req) else {
Some(token) => token, return Ok(None);
None => return Ok(None),
}; };
let token: [u8; 32] = Sha256::digest(token.as_bytes()).into(); let token: [u8; 32] = Sha256::digest(token.as_bytes()).into();
Ok(if constant_time_eq_32(&self.token, &token) { Ok(constant_time_eq_32(&self.token, &token).then_some(UserId::DEFAULT))
Some(UserId::DEFAULT)
} else {
None
})
} }
} }


@ -0,0 +1,76 @@
use crate::{
auth::UserId,
error::{ErrorResponse, RegistryError},
git::push_changes,
package::{read_package, read_scope_info},
search::search_version_changed,
AppState,
};
use actix_web::{http::Method, web, HttpRequest, HttpResponse};
use pesde::names::PackageName;
use std::collections::HashMap;
pub async fn deprecate_package_version(
request: HttpRequest,
app_state: web::Data<AppState>,
path: web::Path<PackageName>,
bytes: web::Bytes,
user_id: web::ReqData<UserId>,
) -> Result<HttpResponse, RegistryError> {
let deprecated = request.method() != Method::DELETE;
let reason = if deprecated {
match String::from_utf8(bytes.to_vec()).map(|s| s.trim().to_string()) {
Ok(reason) if !reason.is_empty() => reason,
Err(e) => {
return Ok(HttpResponse::BadRequest().json(ErrorResponse {
error: format!("invalid utf-8: {e}"),
}))
}
_ => {
return Ok(HttpResponse::BadRequest().json(ErrorResponse {
error: "deprecating must have a non-empty reason".to_string(),
}))
}
}
} else {
String::new()
};
let name = path.into_inner();
let source = app_state.source.write().await;
let Some(scope_info) = read_scope_info(&app_state, name.scope(), &source).await? else {
return Ok(HttpResponse::NotFound().finish());
};
if !scope_info.owners.contains(&user_id.0) {
return Ok(HttpResponse::Forbidden().finish());
}
let Some(mut file) = read_package(&app_state, &name, &source).await? else {
return Ok(HttpResponse::NotFound().finish());
};
if file.meta.deprecated == reason {
return Ok(HttpResponse::Conflict().finish());
}
file.meta.deprecated = reason;
let file_string = toml::to_string(&file)?;
push_changes(
&app_state,
&source,
name.scope().to_string(),
HashMap::from([(name.name().to_string(), file_string.into_bytes())]),
format!("{}deprecate {name}", if deprecated { "" } else { "un" }),
)
.await?;
search_version_changed(&app_state, &name, &file);
Ok(HttpResponse::Ok().body(format!(
"{}deprecated {name}",
if deprecated { "" } else { "un" },
)))
}
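
A hedged sketch of how a client might call this handler; the route prefix and auth scheme are not shown in this diff, so the URLs below are illustrative:

```sh
# Any non-DELETE method with a non-empty body sets the deprecation reason:
curl -X PUT --data "Replaced by acme/new-package." \
  -H "Authorization: Bearer $TOKEN" \
  "https://registry.acme.local/v1/packages/acme%2Fold-package/deprecate"

# DELETE clears the reason, i.e. undeprecates (the body is ignored):
curl -X DELETE -H "Authorization: Bearer $TOKEN" \
  "https://registry.acme.local/v1/packages/acme%2Fold-package/deprecate"
```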


@ -1,4 +1,9 @@
pub mod deprecate_version;
pub mod package_archive;
pub mod package_doc;
pub mod package_readme;
pub mod package_version; pub mod package_version;
pub mod package_versions; pub mod package_versions;
pub mod publish_version; pub mod publish_version;
pub mod search; pub mod search;
pub mod yank_version;


@ -0,0 +1,27 @@
use actix_web::{web, HttpResponse};
use crate::{
error::RegistryError,
package::read_package,
request_path::{resolve_version_and_target, AnyOrSpecificTarget, LatestOrSpecificVersion},
storage::StorageImpl as _,
AppState,
};
use pesde::names::PackageName;
pub async fn get_package_archive(
app_state: web::Data<AppState>,
path: web::Path<(PackageName, LatestOrSpecificVersion, AnyOrSpecificTarget)>,
) -> Result<HttpResponse, RegistryError> {
let (name, version, target) = path.into_inner();
let Some(file) = read_package(&app_state, &name, &*app_state.source.read().await).await? else {
return Ok(HttpResponse::NotFound().finish());
};
let Some(v_id) = resolve_version_and_target(&file, version, &target) else {
return Ok(HttpResponse::NotFound().finish());
};
app_state.storage.get_package(&name, v_id).await
}
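
Based on the `v1` download template documented earlier in this changeset, a hypothetical fetch could look like this (host, package, and the encoding of the scope separator are illustrative; `latest` and `any` are accepted in place of a specific version and target, matching the request-path types above):

```sh
curl -L -o foo.tar.gz \
  "https://registry.acme.local/v1/packages/acme%2Ffoo/latest/any/archive"
```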


@ -0,0 +1,66 @@
use crate::{
error::RegistryError,
package::read_package,
request_path::{resolve_version_and_target, AnyOrSpecificTarget, LatestOrSpecificVersion},
storage::StorageImpl as _,
AppState,
};
use actix_web::{web, HttpResponse};
use pesde::{
names::PackageName,
source::{
ids::VersionId,
pesde::{DocEntryKind, IndexFile},
},
};
use serde::Deserialize;
pub fn find_package_doc<'a>(
file: &'a IndexFile,
v_id: &VersionId,
doc_name: &str,
) -> Option<&'a str> {
let mut queue = file.entries[v_id]
.docs
.iter()
.map(|doc| &doc.kind)
.collect::<Vec<_>>();
while let Some(doc) = queue.pop() {
match doc {
DocEntryKind::Page { name, hash } if name == doc_name => return Some(hash.as_str()),
DocEntryKind::Category { items, .. } => {
queue.extend(items.iter().map(|item| &item.kind));
}
DocEntryKind::Page { .. } => {}
}
}
None
}
#[derive(Debug, Deserialize)]
pub struct Query {
doc: String,
}
pub async fn get_package_doc(
app_state: web::Data<AppState>,
path: web::Path<(PackageName, LatestOrSpecificVersion, AnyOrSpecificTarget)>,
request_query: web::Query<Query>,
) -> Result<HttpResponse, RegistryError> {
let (name, version, target) = path.into_inner();
let Some(file) = read_package(&app_state, &name, &*app_state.source.read().await).await? else {
return Ok(HttpResponse::NotFound().finish());
};
let Some(v_id) = resolve_version_and_target(&file, version, &target) else {
return Ok(HttpResponse::NotFound().finish());
};
let Some(hash) = find_package_doc(&file, v_id, &request_query.doc) else {
return Ok(HttpResponse::NotFound().finish());
};
app_state.storage.get_doc(hash).await
}


@ -0,0 +1,27 @@
use actix_web::{web, HttpResponse};
use crate::{
error::RegistryError,
package::read_package,
request_path::{resolve_version_and_target, AnyOrSpecificTarget, LatestOrSpecificVersion},
storage::StorageImpl as _,
AppState,
};
use pesde::names::PackageName;
pub async fn get_package_readme(
app_state: web::Data<AppState>,
path: web::Path<(PackageName, LatestOrSpecificVersion, AnyOrSpecificTarget)>,
) -> Result<HttpResponse, RegistryError> {
let (name, version, target) = path.into_inner();
let Some(file) = read_package(&app_state, &name, &*app_state.source.read().await).await? else {
return Ok(HttpResponse::NotFound().finish());
};
let Some(v_id) = resolve_version_and_target(&file, version, &target) else {
return Ok(HttpResponse::NotFound().finish());
};
app_state.storage.get_readme(&name, v_id).await
}


@ -1,137 +1,43 @@
use actix_web::{http::header::ACCEPT, web, HttpRequest, HttpResponse, Responder}; use actix_web::{http::header::ACCEPT, web, HttpRequest, HttpResponse};
use semver::Version; use serde::Deserialize;
use serde::{Deserialize, Deserializer};
use crate::{error::Error, package::PackageResponse, storage::StorageImpl, AppState}; use crate::{
use pesde::{ endpoints::package_doc::find_package_doc,
manifest::target::TargetKind, error::RegistryError,
names::PackageName, package::{read_package, PackageResponse},
source::{ request_path::{resolve_version_and_target, AnyOrSpecificTarget, LatestOrSpecificVersion},
git_index::{read_file, root_tree, GitBasedSource}, storage::StorageImpl as _,
pesde::{DocEntryKind, IndexFile}, AppState,
},
}; };
use pesde::names::PackageName;
#[derive(Debug)]
pub enum VersionRequest {
Latest,
Specific(Version),
}
impl<'de> Deserialize<'de> for VersionRequest {
fn deserialize<D>(deserializer: D) -> Result<VersionRequest, D::Error>
where
D: Deserializer<'de>,
{
let s = String::deserialize(deserializer)?;
if s.eq_ignore_ascii_case("latest") {
return Ok(VersionRequest::Latest);
}
s.parse()
.map(VersionRequest::Specific)
.map_err(serde::de::Error::custom)
}
}
#[derive(Debug)]
pub enum TargetRequest {
Any,
Specific(TargetKind),
}
impl<'de> Deserialize<'de> for TargetRequest {
fn deserialize<D>(deserializer: D) -> Result<TargetRequest, D::Error>
where
D: Deserializer<'de>,
{
let s = String::deserialize(deserializer)?;
if s.eq_ignore_ascii_case("any") {
return Ok(TargetRequest::Any);
}
s.parse()
.map(TargetRequest::Specific)
.map_err(serde::de::Error::custom)
}
}
#[derive(Debug, Deserialize)] #[derive(Debug, Deserialize)]
pub struct Query { pub struct Query {
doc: Option<String>, doc: Option<String>,
} }
pub async fn get_package_version( pub async fn get_package_version_v0(
request: HttpRequest, request: HttpRequest,
app_state: web::Data<AppState>, app_state: web::Data<AppState>,
path: web::Path<(PackageName, VersionRequest, TargetRequest)>, path: web::Path<(PackageName, LatestOrSpecificVersion, AnyOrSpecificTarget)>,
query: web::Query<Query>, request_query: web::Query<Query>,
) -> Result<impl Responder, Error> { ) -> Result<HttpResponse, RegistryError> {
let (name, version, target) = path.into_inner(); let (name, version, target) = path.into_inner();
let (scope, name_part) = name.as_str(); let Some(file) = read_package(&app_state, &name, &*app_state.source.read().await).await? else {
let file: IndexFile = {
let source = app_state.source.lock().await;
let repo = gix::open(source.path(&app_state.project))?;
let tree = root_tree(&repo)?;
match read_file(&tree, [scope, name_part])? {
Some(versions) => toml::de::from_str(&versions)?,
None => return Ok(HttpResponse::NotFound().finish()),
}
};
let Some((v_id, entry, targets)) = ({
let version = match version {
VersionRequest::Latest => match file.entries.keys().map(|k| k.version()).max() {
Some(latest) => latest.clone(),
None => return Ok(HttpResponse::NotFound().finish()),
},
VersionRequest::Specific(version) => version,
};
let versions = file
.entries
.iter()
.filter(|(v_id, _)| *v_id.version() == version);
match target {
TargetRequest::Any => versions.clone().min_by_key(|(v_id, _)| *v_id.target()),
TargetRequest::Specific(kind) => versions
.clone()
.find(|(_, entry)| entry.target.kind() == kind),
}
.map(|(v_id, entry)| {
(
v_id,
entry,
versions.map(|(_, entry)| (&entry.target).into()).collect(),
)
})
}) else {
return Ok(HttpResponse::NotFound().finish()); return Ok(HttpResponse::NotFound().finish());
}; };
if let Some(doc_name) = query.doc.as_deref() { let Some(v_id) = resolve_version_and_target(&file, version, &target) else {
let hash = 'finder: {
let mut hash = entry.docs.iter().map(|doc| &doc.kind).collect::<Vec<_>>();
while let Some(doc) = hash.pop() {
match doc {
DocEntryKind::Page { name, hash } if name == doc_name => {
break 'finder hash.clone()
}
DocEntryKind::Category { items, .. } => {
hash.extend(items.iter().map(|item| &item.kind))
}
_ => continue,
};
}
return Ok(HttpResponse::NotFound().finish()); return Ok(HttpResponse::NotFound().finish());
}; };
return app_state.storage.get_doc(&hash).await; if let Some(doc_name) = request_query.doc.as_deref() {
let Some(hash) = find_package_doc(&file, v_id, doc_name) else {
return Ok(HttpResponse::NotFound().finish());
};
return app_state.storage.get_doc(hash).await;
} }
let accept = request let accept = request
@ -152,20 +58,22 @@ pub async fn get_package_version(
}; };
} }
let response = PackageResponse { Ok(HttpResponse::Ok().json(PackageResponse::new(&name, v_id, &file)))
name: name.to_string(), }
version: v_id.version().to_string(),
targets, pub async fn get_package_version(
description: entry.description.clone().unwrap_or_default(), app_state: web::Data<AppState>,
published_at: entry.published_at, path: web::Path<(PackageName, LatestOrSpecificVersion, AnyOrSpecificTarget)>,
license: entry.license.clone().unwrap_or_default(), ) -> Result<HttpResponse, RegistryError> {
authors: entry.authors.clone(), let (name, version, target) = path.into_inner();
repository: entry.repository.clone().map(|url| url.to_string()),
let Some(file) = read_package(&app_state, &name, &*app_state.source.read().await).await? else {
return Ok(HttpResponse::NotFound().finish());
}; };
let mut value = serde_json::to_value(response)?; let Some(v_id) = resolve_version_and_target(&file, version, &target) else {
value["docs"] = serde_json::to_value(entry.docs.clone())?; return Ok(HttpResponse::NotFound().finish());
value["dependencies"] = serde_json::to_value(entry.dependencies.clone())?; };
Ok(HttpResponse::Ok().json(value)) Ok(HttpResponse::Ok().json(PackageResponse::new(&name, v_id, &file)))
} }


@ -1,54 +1,55 @@
use std::collections::{BTreeMap, BTreeSet}; use crate::{
error::RegistryError,
use actix_web::{web, HttpResponse, Responder}; package::{read_package, PackageResponse, PackageVersionsResponse},
AppState,
use crate::{error::Error, package::PackageResponse, AppState};
use pesde::{
names::PackageName,
source::{
git_index::{read_file, root_tree, GitBasedSource},
pesde::IndexFile,
},
}; };
use actix_web::{web, HttpResponse, Responder};
use pesde::{names::PackageName, source::ids::VersionId};
use semver::Version;
use std::collections::{btree_map::Entry, BTreeMap};
pub async fn get_package_versions_v0(
app_state: web::Data<AppState>,
path: web::Path<PackageName>,
) -> Result<impl Responder, RegistryError> {
let name = path.into_inner();
let Some(file) = read_package(&app_state, &name, &*app_state.source.read().await).await? else {
return Ok(HttpResponse::NotFound().finish());
};
let mut versions = BTreeMap::<&Version, &VersionId>::new();
for v_id in file.entries.keys() {
match versions.entry(v_id.version()) {
Entry::Vacant(entry) => {
entry.insert(v_id);
}
Entry::Occupied(mut entry) => {
if entry.get() < &v_id {
entry.insert(v_id);
}
}
}
}
let responses = versions
.into_values()
.map(|v_id| PackageResponse::new(&name, v_id, &file))
.collect::<Vec<_>>();
Ok(HttpResponse::Ok().json(responses))
}
pub async fn get_package_versions( pub async fn get_package_versions(
app_state: web::Data<AppState>, app_state: web::Data<AppState>,
path: web::Path<PackageName>, path: web::Path<PackageName>,
) -> Result<impl Responder, Error> { ) -> Result<impl Responder, RegistryError> {
let name = path.into_inner(); let name = path.into_inner();
let (scope, name_part) = name.as_str(); let Some(file) = read_package(&app_state, &name, &*app_state.source.read().await).await? else {
return Ok(HttpResponse::NotFound().finish());
let file: IndexFile = {
let source = app_state.source.lock().await;
let repo = gix::open(source.path(&app_state.project))?;
let tree = root_tree(&repo)?;
match read_file(&tree, [scope, name_part])? {
Some(versions) => toml::de::from_str(&versions)?,
None => return Ok(HttpResponse::NotFound().finish()),
}
}; };
let mut responses = BTreeMap::new(); Ok(HttpResponse::Ok().json(PackageVersionsResponse::new(&name, &file)))
for (v_id, entry) in file.entries {
let info = responses
.entry(v_id.version().clone())
.or_insert_with(|| PackageResponse {
name: name.to_string(),
version: v_id.version().to_string(),
targets: BTreeSet::new(),
description: entry.description.unwrap_or_default(),
published_at: entry.published_at,
license: entry.license.unwrap_or_default(),
authors: entry.authors.clone(),
repository: entry.repository.clone().map(|url| url.to_string()),
});
info.targets.insert(entry.target.into());
info.published_at = info.published_at.max(entry.published_at);
}
Ok(HttpResponse::Ok().json(responses.into_values().collect::<Vec<_>>()))
} }

registry/src/endpoints/publish_version.rs

@@ -1,60 +1,39 @@
 use crate::{
 	auth::UserId,
-	benv,
-	error::{Error, ErrorResponse},
-	search::update_version,
-	storage::StorageImpl,
+	error::{ErrorResponse, RegistryError},
+	git::push_changes,
+	package::{read_package, read_scope_info},
+	search::update_search_version,
+	storage::StorageImpl as _,
 	AppState,
 };
-use actix_web::{web, web::Bytes, HttpResponse, Responder};
+use actix_web::{web, web::Bytes, HttpResponse};
 use async_compression::Level;
-use convert_case::{Case, Casing};
+use convert_case::{Case, Casing as _};
 use fs_err::tokio as fs;
-use futures::{future::join_all, join};
-use git2::{Remote, Repository, Signature};
 use pesde::{
-	manifest::Manifest,
+	manifest::{DependencyType, Manifest},
 	source::{
-		git_index::{read_file, root_tree, GitBasedSource},
-		pesde::{DocEntry, DocEntryKind, IndexFile, IndexFileEntry, ScopeInfo, SCOPE_INFO_FILE},
+		git_index::GitBasedSource as _,
+		ids::VersionId,
+		pesde::{DocEntry, DocEntryKind, IndexFileEntry, ScopeInfo, SCOPE_INFO_FILE},
 		specifiers::DependencySpecifiers,
-		version_id::VersionId,
-		IGNORED_DIRS, IGNORED_FILES,
+		traits::RefreshOptions,
+		ADDITIONAL_FORBIDDEN_FILES, IGNORED_DIRS, IGNORED_FILES,
 	},
 	MANIFEST_FILE_NAME,
 };
 use sentry::add_breadcrumb;
 use serde::Deserialize;
-use sha2::{Digest, Sha256};
+use sha2::{Digest as _, Sha256};
 use std::{
 	collections::{BTreeSet, HashMap},
-	io::{Cursor, Write},
+	io::Cursor,
 };
-use tokio::io::{AsyncReadExt, AsyncWriteExt};
+use tokio::{
+	io::{AsyncReadExt as _, AsyncWriteExt as _},
+	task::JoinSet,
+};
 
-fn signature<'a>() -> Signature<'a> {
-	Signature::now(
-		&benv!(required "COMMITTER_GIT_NAME"),
-		&benv!(required "COMMITTER_GIT_EMAIL"),
-	)
-	.unwrap()
-}
-
-fn get_refspec(repo: &Repository, remote: &mut Remote) -> Result<String, git2::Error> {
-	let upstream_branch_buf = repo.branch_upstream_name(repo.head()?.name().unwrap())?;
-	let upstream_branch = upstream_branch_buf.as_str().unwrap();
-
-	let refspec_buf = remote
-		.refspecs()
-		.find(|r| r.direction() == git2::Direction::Fetch && r.dst_matches(upstream_branch))
-		.unwrap()
-		.rtransform(upstream_branch)?;
-	let refspec = refspec_buf.as_str().unwrap();
-
-	Ok(refspec.to_string())
-}
-
-const ADDITIONAL_FORBIDDEN_FILES: &[&str] = &["default.project.json"];
-
 #[derive(Debug, Deserialize, Default)]
 struct DocEntryInfo {
@@ -70,9 +49,14 @@ pub async fn publish_package(
 	app_state: web::Data<AppState>,
 	bytes: Bytes,
 	user_id: web::ReqData<UserId>,
-) -> Result<impl Responder, Error> {
-	let source = app_state.source.lock().await;
-	source.refresh(&app_state.project).await.map_err(Box::new)?;
+) -> Result<HttpResponse, RegistryError> {
+	let source = app_state.source.write().await;
+	source
+		.refresh(&RefreshOptions {
+			project: app_state.project.clone(),
+		})
+		.await
+		.map_err(Box::new)?;
 	let config = source.config(&app_state.project).await?;
 
 	let package_dir = tempfile::tempdir()?;
@@ -94,12 +78,14 @@ pub async fn publish_package(
 		let file_name = entry
 			.file_name()
 			.to_str()
-			.ok_or_else(|| Error::InvalidArchive("file name contains non UTF-8 characters".into()))?
+			.ok_or_else(|| {
+				RegistryError::InvalidArchive("file name contains non UTF-8 characters".into())
+			})?
 			.to_string();
 
 		if entry.file_type().await?.is_dir() {
 			if IGNORED_DIRS.contains(&file_name.as_str()) {
-				return Err(Error::InvalidArchive(format!(
+				return Err(RegistryError::InvalidArchive(format!(
 					"archive contains forbidden directory: {file_name}"
 				)));
 			}
@@ -117,7 +103,7 @@ pub async fn publish_package(
 					.file_name()
 					.to_str()
 					.ok_or_else(|| {
-						Error::InvalidArchive(
+						RegistryError::InvalidArchive(
 							"file name contains non UTF-8 characters".into(),
 						)
 					})?
@@ -158,7 +144,7 @@ pub async fn publish_package(
 				);
 				let mut bytes = vec![];
 				gz.read_to_end(&mut bytes).await?;
-				docs_pages.insert(hash.to_string(), bytes);
+				docs_pages.insert(hash.clone(), bytes);
 
 				let mut lines = content.lines().peekable();
 				let front_matter = if lines.peek().filter(|l| **l == "---").is_some() {
@@ -180,17 +166,20 @@ pub async fn publish_package(
 				let h1 = lines
 					.find(|l| !l.trim().is_empty())
 					.and_then(|l| l.strip_prefix("# "))
-					.map(|s| s.to_string());
+					.map(ToString::to_string);
 
 				let info: DocEntryInfo =
-					serde_yaml::from_str(&front_matter).map_err(|_| {
-						Error::InvalidArchive(format!(
+					serde_yaml::from_str(&front_matter).map_err(|_e| {
+						RegistryError::InvalidArchive(format!(
 							"doc {file_name}'s frontmatter isn't valid YAML"
 						))
 					})?;
 
 				set.insert(DocEntry {
-					label: info.label.or(h1).unwrap_or(file_name.to_case(Case::Title)),
+					label: info
+						.label
+						.or(h1)
+						.unwrap_or_else(|| file_name.to_case(Case::Title)),
 					position: info.sidebar_position,
 					kind: DocEntryKind::Page {
 						name: entry
@@ -200,12 +189,12 @@ pub async fn publish_package(
 							.with_extension("")
 							.to_str()
 							.ok_or_else(|| {
-								Error::InvalidArchive(
+								RegistryError::InvalidArchive(
 									"file name contains non UTF-8 characters".into(),
 								)
 							})?
 							// ensure that the path is always using forward slashes
-							.replace("\\", "/"),
+							.replace('\\', "/"),
 						hash,
 					},
 				});
@@ -240,7 +229,7 @@ pub async fn publish_package(
 			if IGNORED_FILES.contains(&file_name.as_str())
 				|| ADDITIONAL_FORBIDDEN_FILES.contains(&file_name.as_str())
 			{
-				return Err(Error::InvalidArchive(format!(
+				return Err(RegistryError::InvalidArchive(format!(
 					"archive contains forbidden file: {file_name}"
 				)));
 			}
@@ -256,7 +245,7 @@ pub async fn publish_package(
 				.is_some()
 			{
 				if readme.is_some() {
-					return Err(Error::InvalidArchive(
+					return Err(RegistryError::InvalidArchive(
 						"archive contains multiple readme files".into(),
 					));
 				}
@@ -271,7 +260,7 @@ pub async fn publish_package(
 	}
 
 	let Some(manifest) = manifest else {
-		return Err(Error::InvalidArchive(
+		return Err(RegistryError::InvalidArchive(
 			"archive doesn't contain a manifest".into(),
 		));
 	};
@@ -292,117 +281,105 @@ pub async fn publish_package(
 	{
 		let dependencies = manifest.all_dependencies().map_err(|e| {
-			Error::InvalidArchive(format!("manifest has invalid dependencies: {e}"))
+			RegistryError::InvalidArchive(format!("manifest has invalid dependencies: {e}"))
 		})?;
 
-		for (specifier, _) in dependencies.values() {
+		for (specifier, ty) in dependencies.values() {
+			// we need not verify dev dependencies, as they won't be installed
+			if *ty == DependencyType::Dev {
+				continue;
+			}
+
 			match specifier {
 				DependencySpecifiers::Pesde(specifier) => {
-					if specifier
-						.index
-						.as_deref()
-						.filter(|index| match gix::Url::try_from(*index) {
-							Ok(url) => config
-								.other_registries_allowed
-								.is_allowed_or_same(source.repo_url().clone(), url),
-							Err(_) => false,
-						})
-						.is_none()
-					{
-						return Err(Error::InvalidArchive(format!(
+					let allowed = match gix::Url::try_from(&*specifier.index) {
+						Ok(url) => config
+							.other_registries_allowed
+							.is_allowed_or_same(source.repo_url().clone(), url),
+						Err(_) => false,
+					};
+
+					if !allowed {
+						return Err(RegistryError::InvalidArchive(format!(
 							"invalid index in pesde dependency {specifier}"
 						)));
 					}
 				}
 				DependencySpecifiers::Wally(specifier) => {
-					if specifier
-						.index
-						.as_deref()
-						.filter(|index| match gix::Url::try_from(*index) {
-							Ok(url) => config.wally_allowed.is_allowed(url),
-							Err(_) => false,
-						})
-						.is_none()
-					{
-						return Err(Error::InvalidArchive(format!(
+					let allowed = match gix::Url::try_from(&*specifier.index) {
+						Ok(url) => config.wally_allowed.is_allowed(url),
+						Err(_) => false,
+					};
+
+					if !allowed {
+						return Err(RegistryError::InvalidArchive(format!(
 							"invalid index in wally dependency {specifier}"
 						)));
 					}
 				}
 				DependencySpecifiers::Git(specifier) => {
 					if !config.git_allowed.is_allowed(specifier.repo.clone()) {
-						return Err(Error::InvalidArchive(
+						return Err(RegistryError::InvalidArchive(
 							"git dependencies are not allowed".into(),
 						));
 					}
 				}
 				DependencySpecifiers::Workspace(_) => {
 					// workspace specifiers are to be transformed into pesde specifiers by the sender
-					return Err(Error::InvalidArchive(
+					return Err(RegistryError::InvalidArchive(
 						"non-transformed workspace dependency".into(),
 					));
 				}
+				DependencySpecifiers::Path(_) => {
+					return Err(RegistryError::InvalidArchive(
+						"path dependencies are not allowed".into(),
+					));
+				}
 			}
 		}
 
-		let repo = Repository::open_bare(source.path(&app_state.project))?;
-		let gix_repo = gix::open(repo.path())?;
-		let gix_tree = root_tree(&gix_repo)?;
-
-		let (scope, name) = manifest.name.as_str();
-		let mut oids = vec![];
-
-		match read_file(&gix_tree, [scope, SCOPE_INFO_FILE])? {
-			Some(info) => {
-				let info: ScopeInfo = toml::de::from_str(&info)?;
-				if !info.owners.contains(&user_id.0) {
-					return Ok(HttpResponse::Forbidden().finish());
-				}
-			}
-			None => {
-				let scope_info = toml::to_string(&ScopeInfo {
-					owners: BTreeSet::from([user_id.0]),
-				})?;
-
-				let mut blob_writer = repo.blob_writer(None)?;
-				blob_writer.write_all(scope_info.as_bytes())?;
-				oids.push((SCOPE_INFO_FILE, blob_writer.commit()?));
-			}
-		};
+		let mut files = HashMap::new();
+
+		let scope = read_scope_info(&app_state, manifest.name.scope(), &source).await?;
+		if let Some(info) = scope {
+			if !info.owners.contains(&user_id.0) {
+				return Ok(HttpResponse::Forbidden().finish());
+			}
+		} else {
+			let scope_info = toml::to_string(&ScopeInfo {
+				owners: BTreeSet::from([user_id.0]),
+			})?;
+
+			files.insert(SCOPE_INFO_FILE.to_string(), scope_info.into_bytes());
+		}
 
-		let mut file: IndexFile =
-			toml::de::from_str(&read_file(&gix_tree, [scope, name])?.unwrap_or_default())?;
+		let mut file = read_package(&app_state, &manifest.name, &source)
+			.await?
+			.unwrap_or_default();
 
 		let new_entry = IndexFileEntry {
 			target: manifest.target.clone(),
-			published_at: chrono::Utc::now(),
+			published_at: jiff::Timestamp::now(),
+			engines: manifest.engines.clone(),
 			description: manifest.description.clone(),
 			license: manifest.license.clone(),
 			authors: manifest.authors.clone(),
 			repository: manifest.repository.clone(),
+			yanked: false,
 			docs,
 			dependencies,
 		};
 
-		let this_version = file
+		let same_version = file
 			.entries
-			.keys()
-			.find(|v_id| *v_id.version() == manifest.version);
-		if let Some(this_version) = this_version {
-			let other_entry = file.entries.get(this_version).unwrap();
-
+			.iter()
+			.find(|(v_id, _)| *v_id.version() == manifest.version);
+		if let Some((_, other_entry)) = same_version {
 			// description cannot be different - which one to render in the "Recently published" list?
-			// the others cannot be different because what to return from the versions endpoint?
-			if other_entry.description != new_entry.description
-				|| other_entry.license != new_entry.license
-				|| other_entry.authors != new_entry.authors
-				|| other_entry.repository != new_entry.repository
-			{
+			if other_entry.description != new_entry.description {
 				return Ok(HttpResponse::BadRequest().json(ErrorResponse {
-					error: "same version with different description or license already exists"
-						.to_string(),
+					error: "same versions with different descriptions are forbidden".to_string(),
 				}));
 			}
 		}
@@ -418,90 +395,66 @@ pub async fn publish_package(
 		return Ok(HttpResponse::Conflict().finish());
 	}
 
-	let mut remote = repo.find_remote("origin")?;
-	let refspec = get_refspec(&repo, &mut remote)?;
-
-	let reference = repo.find_reference(&refspec)?;
-
-	{
-		let index_content = toml::to_string(&file)?;
-		let mut blob_writer = repo.blob_writer(None)?;
-		blob_writer.write_all(index_content.as_bytes())?;
-		oids.push((name, blob_writer.commit()?));
-	}
-
-	let old_root_tree = reference.peel_to_tree()?;
-	let old_scope_tree = match old_root_tree.get_name(scope) {
-		Some(entry) => Some(repo.find_tree(entry.id())?),
-		None => None,
-	};
-
-	let mut scope_tree = repo.treebuilder(old_scope_tree.as_ref())?;
-	for (file, oid) in oids {
-		scope_tree.insert(file, oid, 0o100644)?;
-	}
-
-	let scope_tree_id = scope_tree.write()?;
-	let mut root_tree = repo.treebuilder(Some(&repo.find_tree(old_root_tree.id())?))?;
-	root_tree.insert(scope, scope_tree_id, 0o040000)?;
-
-	let tree_oid = root_tree.write()?;
-
-	repo.commit(
-		Some("HEAD"),
-		&signature(),
-		&signature(),
-		&format!(
+	files.insert(
+		manifest.name.name().to_string(),
+		toml::to_string(&file)?.into_bytes(),
+	);
+
+	push_changes(
+		&app_state,
+		&source,
+		manifest.name.scope().to_string(),
+		files,
+		format!(
 			"add {}@{} {}",
 			manifest.name, manifest.version, manifest.target
 		),
-		&repo.find_tree(tree_oid)?,
-		&[&reference.peel_to_commit()?],
-	)?;
-
-	let mut push_options = git2::PushOptions::new();
-	let mut remote_callbacks = git2::RemoteCallbacks::new();
-	let git_creds = app_state.project.auth_config().git_credentials().unwrap();
-	remote_callbacks.credentials(|_, _, _| {
-		git2::Cred::userpass_plaintext(&git_creds.username, &git_creds.password)
-	});
-	push_options.remote_callbacks(remote_callbacks);
-
-	remote.push(&[refspec], Some(&mut push_options))?;
-
-	update_version(&app_state, &manifest.name, new_entry);
+	)
+	.await?;
+
+	drop(source);
+
+	update_search_version(&app_state, &manifest.name, &new_entry);
 	}
 
 	let version_id = VersionId::new(manifest.version.clone(), manifest.target.kind());
 
-	let (a, b, c) = join!(
-		app_state
-			.storage
-			.store_package(&manifest.name, &version_id, bytes.to_vec()),
-		join_all(
-			docs_pages
-				.into_iter()
-				.map(|(hash, content)| app_state.storage.store_doc(hash, content)),
-		),
-		async {
-			if let Some(readme) = readme {
-				app_state
-					.storage
-					.store_readme(&manifest.name, &version_id, readme)
-					.await
-			} else {
-				Ok(())
-			}
-		}
-	);
-	a?;
-	b.into_iter().collect::<Result<(), _>>()?;
-	c?;
-
-	Ok(HttpResponse::Ok().body(format!(
-		"published {}@{} {}",
-		manifest.name, manifest.version, manifest.target
-	)))
+	let mut tasks = docs_pages
+		.into_iter()
+		.map(|(hash, content)| {
+			let app_state = app_state.clone();
+			async move { app_state.storage.store_doc(hash, content).await }
+		})
+		.collect::<JoinSet<_>>();
+
+	{
+		let app_state = app_state.clone();
+		let name = manifest.name.clone();
+		let version_id = version_id.clone();
+
+		tasks.spawn(async move {
+			app_state
+				.storage
+				.store_package(&name, &version_id, bytes.to_vec())
+				.await
+		});
+	}
+
+	if let Some(readme) = readme {
+		let app_state = app_state.clone();
+		let name = manifest.name.clone();
+		let version_id = version_id.clone();
+
+		tasks.spawn(async move {
+			app_state
+				.storage
+				.store_readme(&name, &version_id, readme)
+				.await
+		});
+	}
+
+	while let Some(res) = tasks.join_next().await {
+		res.unwrap()?;
+	}
+
+	Ok(HttpResponse::Ok().body(format!("published {}@{version_id}", manifest.name)))
 }
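The hunk above replaces the fixed three-way futures::join! with a tokio JoinSet, so the doc, package, and readme uploads become uniformly spawned tasks and the first failure surfaces as soon as any task finishes with an error. A minimal self-contained sketch of that pattern, with numeric "uploads" standing in for the storage calls:

use tokio::task::JoinSet;

#[tokio::main]
async fn main() -> Result<(), std::io::Error> {
    // Collecting an iterator of futures into a JoinSet spawns each one as
    // its own task, like the store_doc calls above.
    let mut tasks = (0..3)
        .map(|i| async move { Ok::<u32, std::io::Error>(i * 10) })
        .collect::<JoinSet<_>>();

    // More work can be added later, like the store_package/store_readme spawns.
    tasks.spawn(async { Ok(99) });

    // join_next yields results in completion order; the first Err propagates.
    while let Some(res) = tasks.join_next().await {
        let value = res.expect("task panicked")?;
        println!("stored {value}");
    }
    Ok(())
}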

registry/src/endpoints/search.rs

@@ -1,36 +1,34 @@
-use std::collections::HashMap;
-
-use actix_web::{web, HttpResponse, Responder};
-use serde::Deserialize;
-use tantivy::{collector::Count, query::AllQuery, schema::Value, DateTime, Order};
-
-use crate::{error::Error, package::PackageResponse, AppState};
-use pesde::{
-	names::PackageName,
-	source::{
-		git_index::{read_file, root_tree, GitBasedSource},
-		pesde::IndexFile,
-	},
-};
+use crate::{
+	error::RegistryError,
+	package::{read_package, PackageResponse},
+	search::find_max_searchable,
+	AppState,
+};
+use actix_web::{web, HttpResponse};
+use pesde::names::PackageName;
+use serde::Deserialize;
+use std::{collections::HashMap, sync::Arc};
+use tantivy::{collector::Count, query::AllQuery, schema::Value as _, DateTime, Order};
+use tokio::task::JoinSet;
 
 #[derive(Deserialize)]
 pub struct Request {
 	#[serde(default)]
-	query: Option<String>,
+	query: String,
 	#[serde(default)]
-	offset: Option<usize>,
+	offset: usize,
 }
 
 pub async fn search_packages(
 	app_state: web::Data<AppState>,
-	request: web::Query<Request>,
-) -> Result<impl Responder, Error> {
+	request_query: web::Query<Request>,
+) -> Result<HttpResponse, RegistryError> {
 	let searcher = app_state.search_reader.searcher();
 	let schema = searcher.schema();
 	let id = schema.get_field("id").unwrap();
 
-	let query = request.query.as_deref().unwrap_or_default().trim();
+	let query = request_query.query.trim();
 
 	let query = if query.is_empty() {
 		Box::new(AllQuery)
@@ -44,64 +42,50 @@ pub async fn search_packages(
 		&(
 			Count,
 			tantivy::collector::TopDocs::with_limit(50)
-				.and_offset(request.offset.unwrap_or_default())
+				.and_offset(request_query.offset)
 				.order_by_fast_field::<DateTime>("published_at", Order::Desc),
 		),
 	)
 	.unwrap();
 
-	let source = app_state.source.lock().await;
-	let repo = gix::open(source.path(&app_state.project))?;
-	let tree = root_tree(&repo)?;
+	let source = Arc::new(app_state.source.clone().read_owned().await);
 
-	let top_docs = top_docs
+	let mut results = top_docs
+		.iter()
+		.map(|_| None::<PackageResponse>)
+		.collect::<Vec<_>>();
+
+	let mut tasks = top_docs
 		.into_iter()
-		.map(|(_, doc_address)| {
+		.enumerate()
+		.map(|(i, (_, doc_address))| {
+			let app_state = app_state.clone();
 			let doc = searcher.doc::<HashMap<_, _>>(doc_address).unwrap();
+			let source = source.clone();
 
-			let id = doc
-				.get(&id)
-				.unwrap()
-				.as_str()
-				.unwrap()
-				.parse::<PackageName>()
-				.unwrap();
-			let (scope, name) = id.as_str();
-
-			let file: IndexFile =
-				toml::de::from_str(&read_file(&tree, [scope, name]).unwrap().unwrap()).unwrap();
-
-			let (latest_version, entry) = file
-				.entries
-				.iter()
-				.max_by_key(|(v_id, _)| v_id.version())
-				.unwrap();
-
-			PackageResponse {
-				name: id.to_string(),
-				version: latest_version.version().to_string(),
-				targets: file
-					.entries
-					.iter()
-					.filter(|(v_id, _)| v_id.version() == latest_version.version())
-					.map(|(_, entry)| (&entry.target).into())
-					.collect(),
-				description: entry.description.clone().unwrap_or_default(),
-				published_at: file
-					.entries
-					.values()
-					.map(|entry| entry.published_at)
-					.max()
-					.unwrap(),
-				license: entry.license.clone().unwrap_or_default(),
-				authors: entry.authors.clone(),
-				repository: entry.repository.clone().map(|url| url.to_string()),
-			}
+			async move {
+				let id = (&doc[&id])
+					.as_str()
+					.unwrap()
+					.parse::<PackageName>()
+					.unwrap();
+
+				let file = read_package(&app_state, &id, &source).await?.unwrap();
+
+				let (version_id, _) = find_max_searchable(&file).unwrap();
+
+				Ok::<_, RegistryError>((i, PackageResponse::new(&id, version_id, &file)))
+			}
 		})
-		.collect::<Vec<_>>();
+		.collect::<JoinSet<_>>();
+
+	while let Some(res) = tasks.join_next().await {
+		let (i, res) = res.unwrap()?;
+		results[i] = Some(res);
+	}
 
 	Ok(HttpResponse::Ok().json(serde_json::json!({
-		"data": top_docs,
+		"data": results,
 		"count": count,
 	})))
 }

registry/src/endpoints/yank_version.rs Normal file

@ -0,0 +1,83 @@
use crate::{
auth::UserId,
error::RegistryError,
git::push_changes,
package::{read_package, read_scope_info},
request_path::AllOrSpecificTarget,
search::search_version_changed,
AppState,
};
use actix_web::{http::Method, web, HttpRequest, HttpResponse};
use pesde::names::PackageName;
use semver::Version;
use std::collections::HashMap;
pub async fn yank_package_version(
request: HttpRequest,
app_state: web::Data<AppState>,
path: web::Path<(PackageName, Version, AllOrSpecificTarget)>,
user_id: web::ReqData<UserId>,
) -> Result<HttpResponse, RegistryError> {
let yanked = request.method() != Method::DELETE;
let (name, version, target) = path.into_inner();
let source = app_state.source.write().await;
let Some(scope_info) = read_scope_info(&app_state, name.scope(), &source).await? else {
return Ok(HttpResponse::NotFound().finish());
};
if !scope_info.owners.contains(&user_id.0) {
return Ok(HttpResponse::Forbidden().finish());
}
let Some(mut file) = read_package(&app_state, &name, &source).await? else {
return Ok(HttpResponse::NotFound().finish());
};
let mut targets = vec![];
for (v_id, entry) in &mut file.entries {
if *v_id.version() != version {
continue;
}
match target {
AllOrSpecificTarget::Specific(kind) if entry.target.kind() != kind => continue,
_ => {}
}
if entry.yanked == yanked {
continue;
}
targets.push(entry.target.kind().to_string());
entry.yanked = yanked;
}
if targets.is_empty() {
return Ok(HttpResponse::Conflict().finish());
}
let file_string = toml::to_string(&file)?;
push_changes(
&app_state,
&source,
name.scope().to_string(),
HashMap::from([(name.name().to_string(), file_string.into_bytes())]),
format!(
"{}yank {name}@{version} {}",
if yanked { "" } else { "un" },
targets.join(", "),
),
)
.await?;
search_version_changed(&app_state, &name, &file);
Ok(HttpResponse::Ok().body(format!(
"{}yanked {name}@{version} {}",
if yanked { "" } else { "un" },
targets.join(", "),
)))
}
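Since the handler keys its behaviour on the HTTP method, yanking and un-yanking are the same call with the verb flipped: PUT yanks, DELETE un-yanks. A hypothetical client sketch (the base URL, encoded package path, and token are invented for illustration; they are not defined by this diff):

// Hypothetical: assumes a registry at localhost:8080 and a valid write token.
async fn set_yanked(yanked: bool) -> Result<(), reqwest::Error> {
    let client = reqwest::Client::new();
    let url = "http://localhost:8080/v1/packages/some_scope%2Fsome_name/0.1.0/all/yank";

    // PUT yanks, DELETE un-yanks, mirroring `request.method() != Method::DELETE`.
    let req = if yanked { client.put(url) } else { client.delete(url) };

    let res = req.bearer_auth("WRITE_TOKEN").send().await?;
    println!("{}", res.status());
    Ok(())
}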

registry/src/error.rs

@@ -1,10 +1,11 @@
 use actix_web::{body::BoxBody, HttpResponse, ResponseError};
 use pesde::source::git_index::errors::{ReadFile, RefreshError, TreeError};
 use serde::Serialize;
+use std::error::Error;
 use thiserror::Error;
 
 #[derive(Debug, Error)]
-pub enum Error {
+pub enum RegistryError {
 	#[error("failed to parse query")]
 	Query(#[from] tantivy::query::QueryParserError),

@@ -53,20 +54,20 @@ pub struct ErrorResponse {
 	pub error: String,
 }
 
-impl ResponseError for Error {
+impl ResponseError for RegistryError {
 	fn error_response(&self) -> HttpResponse<BoxBody> {
 		match self {
-			Error::Query(e) => HttpResponse::BadRequest().json(ErrorResponse {
+			RegistryError::Query(e) => HttpResponse::BadRequest().json(ErrorResponse {
 				error: format!("failed to parse query: {e}"),
 			}),
-			Error::Tar(_) => HttpResponse::BadRequest().json(ErrorResponse {
+			RegistryError::Tar(_) => HttpResponse::BadRequest().json(ErrorResponse {
 				error: "corrupt archive".to_string(),
 			}),
-			Error::InvalidArchive(e) => HttpResponse::BadRequest().json(ErrorResponse {
+			RegistryError::InvalidArchive(e) => HttpResponse::BadRequest().json(ErrorResponse {
 				error: format!("archive is invalid: {e}"),
 			}),
 			e => {
-				tracing::error!("unhandled error: {e:?}");
+				tracing::error!("unhandled error: {}", display_error(e));
 				HttpResponse::InternalServerError().finish()
 			}
 		}

@@ -74,16 +75,33 @@ impl ResponseError for Error {
 }
 
 pub trait ReqwestErrorExt {
-	async fn into_error(self) -> Result<Self, Error>
+	async fn into_error(self) -> Result<Self, RegistryError>
 	where
 		Self: Sized;
 }
 
 impl ReqwestErrorExt for reqwest::Response {
-	async fn into_error(self) -> Result<Self, Error> {
+	async fn into_error(self) -> Result<Self, RegistryError> {
 		match self.error_for_status_ref() {
 			Ok(_) => Ok(self),
-			Err(e) => Err(Error::ReqwestResponse(self.text().await?, e)),
+			Err(e) => Err(RegistryError::ReqwestResponse(self.text().await?, e)),
 		}
 	}
 }
+
+pub fn display_error<E: Error>(err: E) -> String {
+	let mut causes = vec![];
+	let mut source = err.source();
+	while let Some(src) = source {
+		causes.push(format!("\t- {src}"));
+		source = src.source();
+	}
+	format!(
+		"{err}{}",
+		if causes.is_empty() {
+			"".into()
+		} else {
+			format!("\n{}", causes.join("\n"))
+		}
+	)
+}
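display_error walks err.source() recursively, so a log line now carries the whole cause chain instead of only the outermost Display. A self-contained sketch of what it produces, using an invented two-level error (display_error itself is the function above):

use std::error::Error;
use std::fmt;

#[derive(Debug)]
struct IndexReadError(std::io::Error);

impl fmt::Display for IndexReadError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "failed to read index")
    }
}

impl Error for IndexReadError {
    fn source(&self) -> Option<&(dyn Error + 'static)> {
        Some(&self.0)
    }
}

fn main() {
    let err = IndexReadError(std::io::Error::new(
        std::io::ErrorKind::PermissionDenied,
        "permission denied",
    ));
    // Prints the root message, then one tab-indented "- cause" line per source:
    // failed to read index
    //     - permission denied
    println!("{}", display_error(err));
}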

registry/src/git.rs Normal file

@ -0,0 +1,98 @@
use crate::{benv, error::RegistryError, AppState};
use git2::{Remote, Repository, Signature};
use pesde::source::{git_index::GitBasedSource as _, pesde::PesdePackageSource};
use std::collections::HashMap;
use tokio::task::spawn_blocking;
fn signature<'a>() -> Signature<'a> {
Signature::now(
&benv!(required "COMMITTER_GIT_NAME"),
&benv!(required "COMMITTER_GIT_EMAIL"),
)
.unwrap()
}
fn get_refspec(repo: &Repository, remote: &mut Remote) -> Result<String, git2::Error> {
let upstream_branch_buf = repo.branch_upstream_name(repo.head()?.name().unwrap())?;
let upstream_branch = upstream_branch_buf.as_str().unwrap();
let refspec_buf = remote
.refspecs()
.find(|r| r.direction() == git2::Direction::Fetch && r.dst_matches(upstream_branch))
.unwrap()
.rtransform(upstream_branch)?;
let refspec = refspec_buf.as_str().unwrap();
Ok(refspec.to_string())
}
const FILE_FILEMODE: i32 = 0o100_644;
const DIR_FILEMODE: i32 = 0o040_000;
pub async fn push_changes(
app_state: &AppState,
source: &PesdePackageSource,
directory: String,
files: HashMap<String, Vec<u8>>,
message: String,
) -> Result<(), RegistryError> {
let path = source.path(&app_state.project);
let auth_config = app_state.project.auth_config().clone();
spawn_blocking(move || {
let repo = Repository::open_bare(path)?;
let mut oids = HashMap::new();
let mut remote = repo.find_remote("origin")?;
let refspec = get_refspec(&repo, &mut remote)?;
let reference = repo.find_reference(&refspec)?;
for (name, contents) in files {
let oid = repo.blob(&contents)?;
oids.insert(name, oid);
}
let old_root_tree = reference.peel_to_tree()?;
let old_dir_tree = match old_root_tree.get_name(&directory) {
Some(entry) => Some(repo.find_tree(entry.id())?),
None => None,
};
let mut dir_tree = repo.treebuilder(old_dir_tree.as_ref())?;
for (file, oid) in oids {
dir_tree.insert(file, oid, FILE_FILEMODE)?;
}
let dir_tree_id = dir_tree.write()?;
let mut root_tree = repo.treebuilder(Some(&repo.find_tree(old_root_tree.id())?))?;
root_tree.insert(directory, dir_tree_id, DIR_FILEMODE)?;
let tree_oid = root_tree.write()?;
repo.commit(
Some("HEAD"),
&signature(),
&signature(),
&message,
&repo.find_tree(tree_oid)?,
&[&reference.peel_to_commit()?],
)?;
let mut push_options = git2::PushOptions::new();
let mut remote_callbacks = git2::RemoteCallbacks::new();
let git_creds = auth_config.git_credentials().unwrap();
remote_callbacks.credentials(|_, _, _| {
git2::Cred::userpass_plaintext(&git_creds.username, &git_creds.password)
});
push_options.remote_callbacks(remote_callbacks);
remote.push(&[refspec], Some(&mut push_options))?;
Ok(())
})
.await
.unwrap()
}
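push_changes batches the blob writes, the tree rewrite, the commit, and the push into a single spawn_blocking closure, since git2 is fully synchronous. A hedged sketch of how a caller stages files for it (the file name and TOML bytes are made up; AppState, PesdePackageSource, and RegistryError are the real items from this diff):

use std::collections::HashMap;

// Illustrative caller: stages one invented index file under `some_scope`.
async fn stage_and_push(
    app_state: &AppState,
    source: &pesde::source::pesde::PesdePackageSource,
) -> Result<(), RegistryError> {
    let mut files: HashMap<String, Vec<u8>> = HashMap::new();
    files.insert("some_package".to_string(), b"# index entries".to_vec());

    push_changes(
        app_state,
        source,
        "some_scope".to_string(),
        files,
        "add some_scope/some_package@1.0.0 luau".to_string(),
    )
    .await
}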

registry/src/main.rs

@@ -14,22 +14,28 @@ use actix_web::{
 };
 use fs_err::tokio as fs;
 use pesde::{
-	source::{pesde::PesdePackageSource, traits::PackageSource},
+	source::{
+		pesde::PesdePackageSource,
+		traits::{PackageSource as _, RefreshOptions},
+	},
 	AuthConfig, Project,
 };
-use std::{env::current_dir, path::PathBuf};
+use std::{env::current_dir, path::PathBuf, sync::Arc};
 use tracing::level_filters::LevelFilter;
 use tracing_subscriber::{
-	fmt::format::FmtSpan, layer::SubscriberExt, util::SubscriberInitExt, EnvFilter,
+	fmt::format::FmtSpan, layer::SubscriberExt as _, util::SubscriberInitExt as _, EnvFilter,
 };
 
 mod auth;
 mod endpoints;
 mod error;
+mod git;
 mod package;
+mod request_path;
 mod search;
 mod storage;
 
+#[must_use]
 pub fn make_reqwest() -> reqwest::Client {
 	reqwest::ClientBuilder::new()
 		.user_agent(concat!(
@@ -42,7 +48,7 @@ pub fn make_reqwest() -> reqwest::Client {
 }
 
 pub struct AppState {
-	pub source: tokio::sync::Mutex<PesdePackageSource>,
+	pub source: Arc<tokio::sync::RwLock<PesdePackageSource>>,
 	pub project: Project,
 	pub storage: Storage,
 	pub auth: Auth,
@@ -58,7 +64,7 @@ macro_rules! benv {
 		std::env::var($name)
 	};
 	($name:expr => $default:expr) => {
-		benv!($name).unwrap_or($default.to_string())
+		benv!($name).unwrap_or_else(|_| $default.to_string())
 	};
 	(required $name:expr) => {
 		benv!($name).expect(concat!("Environment variable `", $name, "` must be set"))
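The unwrap_or → unwrap_or_else change in the $default arm is about evaluation order: unwrap_or builds its fallback eagerly, so $default.to_string() allocated on every call even when the variable was set, while unwrap_or_else defers the allocation to the error path. A minimal demonstration:

fn expensive_default() -> String {
    println!("building default");
    "fallback".to_string()
}

fn main() {
    let var: Result<String, ()> = Ok("from env".to_string());

    // Eager: expensive_default() runs (and prints) although `var` is Ok.
    let _a = var.clone().unwrap_or(expensive_default());

    // Lazy: the closure only runs on the Err path, so nothing prints here.
    let _b = var.unwrap_or_else(|_| expensive_default());
}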
@@ -106,7 +112,9 @@ async fn run() -> std::io::Result<()> {
 	);
 
 	let source = PesdePackageSource::new(benv!(required "INDEX_REPO_URL").try_into().unwrap());
 	source
-		.refresh(&project)
+		.refresh(&RefreshOptions {
+			project: project.clone(),
+		})
 		.await
 		.expect("failed to refresh source");
 	let config = source
let config = source let config = source
@ -114,7 +122,7 @@ async fn run() -> std::io::Result<()> {
.await .await
.expect("failed to get index config"); .expect("failed to get index config");
let (search_reader, search_writer, query_parser) = make_search(&project, &source).await; let (search_reader, search_writer, query_parser) = make_search(&project, &source);
let app_data = web::Data::new(AppState { let app_data = web::Data::new(AppState {
storage: { storage: {
@@ -127,7 +135,7 @@ async fn run() -> std::io::Result<()> {
 			tracing::info!("auth: {auth}");
 			auth
 		},
-		source: tokio::sync::Mutex::new(source),
+		source: Arc::new(tokio::sync::RwLock::new(source)),
 		project,
 
 		search_reader,
@@ -143,6 +151,8 @@ async fn run() -> std::io::Result<()> {
 		.finish()
 		.unwrap();
 
+	let publish_payload_config = PayloadConfig::new(config.max_archive_size);
+
 	HttpServer::new(move || {
 		App::new()
 			.wrap(sentry_actix::Sentry::with_transaction())
@@ -159,6 +169,38 @@ async fn run() -> std::io::Result<()> {
 			)
 			.service(
 				web::scope("/v0")
+					.route(
+						"/search",
+						web::get()
+							.to(endpoints::search::search_packages)
+							.wrap(from_fn(auth::read_mw)),
+					)
+					.route(
+						"/packages/{name}",
+						web::get()
+							.to(endpoints::package_versions::get_package_versions_v0)
+							.wrap(from_fn(auth::read_mw)),
+					)
+					.route(
+						"/packages/{name}/{version}/{target}",
+						web::get()
+							.to(endpoints::package_version::get_package_version_v0)
+							.wrap(from_fn(auth::read_mw)),
+					)
+					.service(
+						web::scope("/packages")
+							.app_data(publish_payload_config.clone())
+							.route(
+								"",
+								web::post()
+									.to(endpoints::publish_version::publish_package)
+									.wrap(Governor::new(&publish_governor_config))
+									.wrap(from_fn(auth::write_mw)),
+							),
+					),
+			)
+			.service(
+				web::scope("/v1")
 					.route(
 						"/search",
 						web::get()
@@ -171,15 +213,45 @@ async fn run() -> std::io::Result<()> {
 							.to(endpoints::package_versions::get_package_versions)
 							.wrap(from_fn(auth::read_mw)),
 					)
+					.service(
+						web::resource("/packages/{name}/deprecate")
+							.put(endpoints::deprecate_version::deprecate_package_version)
+							.delete(endpoints::deprecate_version::deprecate_package_version)
+							.wrap(from_fn(auth::write_mw)),
+					)
 					.route(
 						"/packages/{name}/{version}/{target}",
 						web::get()
 							.to(endpoints::package_version::get_package_version)
 							.wrap(from_fn(auth::read_mw)),
 					)
+					.route(
+						"/packages/{name}/{version}/{target}/archive",
+						web::get()
+							.to(endpoints::package_archive::get_package_archive)
+							.wrap(from_fn(auth::read_mw)),
+					)
+					.route(
+						"/packages/{name}/{version}/{target}/doc",
+						web::get()
+							.to(endpoints::package_doc::get_package_doc)
+							.wrap(from_fn(auth::read_mw)),
+					)
+					.route(
+						"/packages/{name}/{version}/{target}/readme",
+						web::get()
+							.to(endpoints::package_readme::get_package_readme)
+							.wrap(from_fn(auth::read_mw)),
+					)
+					.service(
+						web::resource("/packages/{name}/{version}/{target}/yank")
+							.put(endpoints::yank_version::yank_package_version)
+							.delete(endpoints::yank_version::yank_package_version)
+							.wrap(from_fn(auth::write_mw)),
+					)
 					.service(
 						web::scope("/packages")
-							.app_data(PayloadConfig::new(config.max_archive_size))
+							.app_data(publish_payload_config.clone())
 							.route(
 								"",
 								web::post()

registry/src/package.rs

@@ -1,27 +1,33 @@
-use chrono::{DateTime, Utc};
-use pesde::manifest::target::{Target, TargetKind};
+use crate::AppState;
+use pesde::{
+	manifest::{
+		target::{Target, TargetKind},
+		Alias, DependencyType,
+	},
+	names::PackageName,
+	source::{
+		git_index::{read_file, root_tree, GitBasedSource as _},
+		ids::VersionId,
+		pesde::{IndexFile, IndexFileEntry, PesdePackageSource, ScopeInfo, SCOPE_INFO_FILE},
+		specifiers::DependencySpecifiers,
+	},
+};
+use semver::Version;
 use serde::Serialize;
-use std::collections::BTreeSet;
+use std::collections::{BTreeMap, BTreeSet};
+use tokio::task::spawn_blocking;
 
 #[derive(Debug, Serialize, Eq, PartialEq)]
-pub struct TargetInfo {
-	kind: TargetKind,
+struct TargetInfoInner {
 	lib: bool,
 	bin: bool,
 	#[serde(skip_serializing_if = "BTreeSet::is_empty")]
 	scripts: BTreeSet<String>,
 }
 
-impl From<Target> for TargetInfo {
-	fn from(target: Target) -> Self {
-		(&target).into()
-	}
-}
-
-impl From<&Target> for TargetInfo {
-	fn from(target: &Target) -> Self {
-		TargetInfo {
-			kind: target.kind(),
+impl TargetInfoInner {
+	fn new(target: &Target) -> Self {
+		TargetInfoInner {
 			lib: target.lib_path().is_some(),
 			bin: target.bin_path().is_some(),
 			scripts: target
@@ -32,6 +38,25 @@ impl From<&Target> for TargetInfo {
 	}
 }
 
+#[derive(Debug, Serialize, Eq, PartialEq)]
+pub struct TargetInfo {
+	kind: TargetKind,
+	#[serde(skip_serializing_if = "std::ops::Not::not")]
+	yanked: bool,
+	#[serde(flatten)]
+	inner: TargetInfoInner,
+}
+
+impl TargetInfo {
+	fn new(target: &Target, yanked: bool) -> Self {
+		TargetInfo {
+			kind: target.kind(),
+			yanked,
+			inner: TargetInfoInner::new(target),
+		}
+	}
+}
+
 impl Ord for TargetInfo {
 	fn cmp(&self, other: &Self) -> std::cmp::Ordering {
 		self.kind.cmp(&other.kind)
@@ -44,18 +69,199 @@ impl PartialOrd for TargetInfo {
 	}
 }
 
+#[derive(Debug, Serialize, Ord, PartialOrd, Eq, PartialEq)]
+#[serde(untagged)]
+pub enum RegistryDocEntryKind {
+	Page {
+		name: String,
+	},
+	Category {
+		#[serde(default, skip_serializing_if = "BTreeSet::is_empty")]
+		items: BTreeSet<RegistryDocEntry>,
+		#[serde(default, skip_serializing_if = "std::ops::Not::not")]
+		collapsed: bool,
+	},
+}
+
+#[derive(Debug, Serialize, Ord, PartialOrd, Eq, PartialEq)]
+pub struct RegistryDocEntry {
+	label: String,
+	#[serde(default, skip_serializing_if = "Option::is_none")]
+	position: Option<usize>,
+	#[serde(flatten)]
+	kind: RegistryDocEntryKind,
+}
+
+impl From<pesde::source::pesde::DocEntry> for RegistryDocEntry {
+	fn from(entry: pesde::source::pesde::DocEntry) -> Self {
+		Self {
+			label: entry.label,
+			position: entry.position,
+			kind: match entry.kind {
+				pesde::source::pesde::DocEntryKind::Page { name, .. } => {
+					RegistryDocEntryKind::Page { name }
+				}
+				pesde::source::pesde::DocEntryKind::Category { items, collapsed } => {
+					RegistryDocEntryKind::Category {
+						items: items.into_iter().map(Into::into).collect(),
+						collapsed,
+					}
+				}
+			},
+		}
+	}
+}
+
+#[derive(Debug, Serialize)]
+pub struct PackageResponseInner {
+	published_at: jiff::Timestamp,
+	#[serde(skip_serializing_if = "String::is_empty")]
+	license: String,
+	#[serde(skip_serializing_if = "Vec::is_empty")]
+	authors: Vec<String>,
+	#[serde(skip_serializing_if = "Option::is_none")]
+	repository: Option<String>,
+	#[serde(skip_serializing_if = "BTreeSet::is_empty")]
+	docs: BTreeSet<RegistryDocEntry>,
+	#[serde(skip_serializing_if = "BTreeMap::is_empty")]
+	dependencies: BTreeMap<Alias, (DependencySpecifiers, DependencyType)>,
+}
+
+impl PackageResponseInner {
+	pub fn new(entry: &IndexFileEntry) -> Self {
+		PackageResponseInner {
+			published_at: entry.published_at,
+			license: entry.license.clone().unwrap_or_default(),
+			authors: entry.authors.clone(),
+			repository: entry.repository.clone().map(|url| url.to_string()),
+			docs: entry.docs.iter().cloned().map(Into::into).collect(),
+			dependencies: entry.dependencies.clone(),
+		}
+	}
+}
+
 #[derive(Debug, Serialize)]
 pub struct PackageResponse {
-	pub name: String,
-	pub version: String,
-	pub targets: BTreeSet<TargetInfo>,
+	name: String,
+	version: String,
+	targets: BTreeSet<TargetInfo>,
 	#[serde(skip_serializing_if = "String::is_empty")]
-	pub description: String,
-	pub published_at: DateTime<Utc>,
+	description: String,
 	#[serde(skip_serializing_if = "String::is_empty")]
-	pub license: String,
-	#[serde(skip_serializing_if = "Vec::is_empty")]
-	pub authors: Vec<String>,
-	#[serde(skip_serializing_if = "Option::is_none")]
-	pub repository: Option<String>,
+	deprecated: String,
+	#[serde(flatten)]
+	inner: PackageResponseInner,
+}
+
+impl PackageResponse {
+	pub fn new(name: &PackageName, version_id: &VersionId, file: &IndexFile) -> Self {
+		let entry = &file.entries[version_id];
+
+		PackageResponse {
+			name: name.to_string(),
+			version: version_id.version().to_string(),
+			targets: file
+				.entries
+				.iter()
+				.filter(|(ver, _)| ver.version() == version_id.version())
+				.map(|(_, entry)| TargetInfo::new(&entry.target, entry.yanked))
+				.collect(),
+			description: entry.description.clone().unwrap_or_default(),
+			deprecated: file.meta.deprecated.clone(),
+			inner: PackageResponseInner::new(entry),
+		}
+	}
+}
+
+#[derive(Debug, Serialize)]
+struct PackageVersionsResponseVersionInner {
+	target: TargetInfoInner,
+	#[serde(skip_serializing_if = "std::ops::Not::not")]
+	yanked: bool,
+	#[serde(flatten)]
+	inner: PackageResponseInner,
+}
+
+#[derive(Debug, Serialize, Default)]
+struct PackageVersionsResponseVersion {
+	#[serde(skip_serializing_if = "String::is_empty")]
+	description: String,
+	targets: BTreeMap<TargetKind, PackageVersionsResponseVersionInner>,
+}
+
+#[derive(Debug, Serialize)]
+pub struct PackageVersionsResponse {
+	name: String,
+	#[serde(skip_serializing_if = "String::is_empty")]
+	deprecated: String,
+	versions: BTreeMap<Version, PackageVersionsResponseVersion>,
+}
+
+impl PackageVersionsResponse {
+	pub fn new(name: &PackageName, file: &IndexFile) -> Self {
+		let mut versions = BTreeMap::<Version, PackageVersionsResponseVersion>::new();
+
+		for (v_id, entry) in &file.entries {
+			let versions_resp = versions.entry(v_id.version().clone()).or_default();
+
+			versions_resp.description = entry.description.clone().unwrap_or_default();
+			versions_resp.targets.insert(
+				entry.target.kind(),
+				PackageVersionsResponseVersionInner {
+					target: TargetInfoInner::new(&entry.target),
+					yanked: entry.yanked,
+					inner: PackageResponseInner::new(entry),
+				},
+			);
+		}
+
+		PackageVersionsResponse {
+			name: name.to_string(),
+			deprecated: file.meta.deprecated.clone(),
+			versions,
+		}
+	}
+}
+
+pub async fn read_package(
+	app_state: &AppState,
+	package: &PackageName,
+	source: &PesdePackageSource,
+) -> Result<Option<IndexFile>, crate::error::RegistryError> {
+	let path = source.path(&app_state.project);
+	let package = package.clone();
+	spawn_blocking(move || {
+		let (scope, name) = package.as_str();
+		let repo = gix::open(path)?;
+		let tree = root_tree(&repo)?;
+
+		let Some(versions) = read_file(&tree, [scope, name])? else {
+			return Ok(None);
+		};
+
+		toml::de::from_str(&versions).map_err(Into::into)
+	})
+	.await
+	.unwrap()
+}
+
+pub async fn read_scope_info(
+	app_state: &AppState,
+	scope: &str,
+	source: &PesdePackageSource,
+) -> Result<Option<ScopeInfo>, crate::error::RegistryError> {
+	let path = source.path(&app_state.project);
+	let scope = scope.to_string();
+	spawn_blocking(move || {
+		let repo = gix::open(path)?;
+		let tree = root_tree(&repo)?;
+
+		let Some(versions) = read_file(&tree, [&*scope, SCOPE_INFO_FILE])? else {
+			return Ok(None);
+		};
+
+		toml::de::from_str(&versions).map_err(Into::into)
+	})
+	.await
+	.unwrap()
 }
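The response types above lean on two serde attributes: #[serde(flatten)] splices PackageResponseInner's fields into the parent JSON object, and skip_serializing_if drops empty fields so v1 responses stay compact. A small self-contained sketch of the same behaviour (Inner and Response are invented demo types, not part of the diff):

use serde::Serialize;

#[derive(Serialize)]
struct Inner {
    #[serde(skip_serializing_if = "String::is_empty")]
    license: String,
    published_at: u64,
}

#[derive(Serialize)]
struct Response {
    name: String,
    #[serde(flatten)]
    inner: Inner,
}

fn main() {
    let r = Response {
        name: "some_scope/some_name".into(),
        inner: Inner { license: String::new(), published_at: 1_700_000_000 },
    };
    // The empty license is omitted and Inner's fields appear at the top level:
    // {"name":"some_scope/some_name","published_at":1700000000}
    println!("{}", serde_json::to_string(&r).unwrap());
}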

registry/src/request_path.rs Normal file

@ -0,0 +1,101 @@
use pesde::{
manifest::target::TargetKind,
source::{ids::VersionId, pesde::IndexFile},
};
use semver::Version;
use serde::{Deserialize, Deserializer};
#[derive(Debug)]
pub enum LatestOrSpecificVersion {
Latest,
Specific(Version),
}
impl<'de> Deserialize<'de> for LatestOrSpecificVersion {
fn deserialize<D>(deserializer: D) -> Result<LatestOrSpecificVersion, D::Error>
where
D: Deserializer<'de>,
{
let s = String::deserialize(deserializer)?;
if s.eq_ignore_ascii_case("latest") {
return Ok(LatestOrSpecificVersion::Latest);
}
s.parse()
.map(LatestOrSpecificVersion::Specific)
.map_err(serde::de::Error::custom)
}
}
#[derive(Debug)]
pub enum AnyOrSpecificTarget {
Any,
Specific(TargetKind),
}
impl<'de> Deserialize<'de> for AnyOrSpecificTarget {
fn deserialize<D>(deserializer: D) -> Result<AnyOrSpecificTarget, D::Error>
where
D: Deserializer<'de>,
{
let s = String::deserialize(deserializer)?;
if s.eq_ignore_ascii_case("any") {
return Ok(AnyOrSpecificTarget::Any);
}
s.parse()
.map(AnyOrSpecificTarget::Specific)
.map_err(serde::de::Error::custom)
}
}
pub fn resolve_version_and_target<'a>(
file: &'a IndexFile,
version: LatestOrSpecificVersion,
target: &AnyOrSpecificTarget,
) -> Option<&'a VersionId> {
let version = match version {
LatestOrSpecificVersion::Latest => {
match file.entries.keys().map(VersionId::version).max() {
Some(latest) => latest.clone(),
None => return None,
}
}
LatestOrSpecificVersion::Specific(version) => version,
};
let mut versions = file
.entries
.iter()
.filter(|(v_id, _)| *v_id.version() == version);
match target {
AnyOrSpecificTarget::Any => versions.min_by_key(|(v_id, _)| v_id.target()),
AnyOrSpecificTarget::Specific(kind) => {
versions.find(|(_, entry)| entry.target.kind() == *kind)
}
}
.map(|(v_id, _)| v_id)
}
#[derive(Debug)]
pub enum AllOrSpecificTarget {
All,
Specific(TargetKind),
}
impl<'de> Deserialize<'de> for AllOrSpecificTarget {
fn deserialize<D>(deserializer: D) -> Result<AllOrSpecificTarget, D::Error>
where
D: Deserializer<'de>,
{
let s = String::deserialize(deserializer)?;
if s.eq_ignore_ascii_case("all") {
return Ok(AllOrSpecificTarget::All);
}
s.parse()
.map(AllOrSpecificTarget::Specific)
.map_err(serde::de::Error::custom)
}
}
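Taken together, the deserializers and resolve_version_and_target define the v1 path grammar: the version segment is either "latest" or an exact semver, and the target segment is either "any" or a concrete target kind. A hedged walk-through of the resolution rules, with an invented set of index entries:

// Suppose an IndexFile holds entries 1.0.0+luau, 1.1.0+luau, and 1.1.0+roblox.
//
// GET .../latest/any
//   -> version = max over all entries = 1.1.0
//   -> target  = min_by_key over the 1.1.0 entries' target kinds
//
// GET .../1.0.0/roblox
//   -> version = 1.0.0, but no 1.0.0 entry has a roblox target -> None,
//      which the handlers above turn into a 404.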

registry/src/search.rs

@ -1,14 +1,14 @@
use crate::AppState; use crate::AppState;
use async_stream::stream;
use futures::{Stream, StreamExt};
use pesde::{ use pesde::{
names::PackageName, names::PackageName,
source::{ source::{
git_index::{root_tree, GitBasedSource}, git_index::{root_tree, GitBasedSource as _},
ids::VersionId,
pesde::{IndexFile, IndexFileEntry, PesdePackageSource, SCOPE_INFO_FILE}, pesde::{IndexFile, IndexFileEntry, PesdePackageSource, SCOPE_INFO_FILE},
}, },
Project, Project,
}; };
use std::collections::BTreeMap;
use tantivy::{ use tantivy::{
doc, doc,
query::QueryParser, query::QueryParser,
@@ -16,58 +16,94 @@ use tantivy::{
 	tokenizer::TextAnalyzer,
 	DateTime, IndexReader, IndexWriter, Term,
 };
-use tokio::pin;
 
-pub async fn all_packages(
-	source: &PesdePackageSource,
-	project: &Project,
-) -> impl Stream<Item = (PackageName, IndexFile)> {
-	let path = source.path(project);
-
-	stream! {
-		let repo = gix::open(&path).expect("failed to open index");
-		let tree = root_tree(&repo).expect("failed to get root tree");
-
-		for entry in tree.iter() {
-			let entry = entry.expect("failed to read entry");
-			let object = entry.object().expect("failed to get object");
-
-			// directories will be trees, and files will be blobs
-			if !matches!(object.kind, gix::object::Kind::Tree) {
-				continue;
-			}
-
-			let package_scope = entry.filename().to_string();
-
-			for inner_entry in object.into_tree().iter() {
-				let inner_entry = inner_entry.expect("failed to read inner entry");
-				let object = inner_entry.object().expect("failed to get object");
-
-				if !matches!(object.kind, gix::object::Kind::Blob) {
-					continue;
-				}
-
-				let package_name = inner_entry.filename().to_string();
-
-				if package_name == SCOPE_INFO_FILE {
-					continue;
-				}
-
-				let blob = object.into_blob();
-				let string = String::from_utf8(blob.data.clone()).expect("failed to parse utf8");
-				let file: IndexFile = toml::from_str(&string).expect("failed to parse index file");
-
-				// if this panics, it's an issue with the index.
-				let name = format!("{package_scope}/{package_name}").parse().unwrap();
-
-				yield (name, file);
-			}
-		}
-	}
-}
+type Entries = BTreeMap<String, gix::ObjectId>;
+
+struct TreeIterator<'repo> {
+	repo: &'repo gix::Repository,
+	entries: Entries,
+	current: Option<(String, Entries)>,
+}
+
+fn collect_entries(tree: &gix::Tree) -> Result<Entries, gix::objs::decode::Error> {
+	tree.iter()
+		.map(|res| res.map(|r| (r.filename().to_string(), r.object_id())))
+		.collect()
+}
+
+impl Iterator for TreeIterator<'_> {
+	type Item = (PackageName, IndexFile);
+
+	fn next(&mut self) -> Option<Self::Item> {
+		if self
+			.current
+			.as_ref()
+			.is_none_or(|(_, entries)| entries.is_empty())
+		{
+			loop {
+				let (scope_name, scope_oid) = self.entries.pop_last()?;
+
+				let object = self
+					.repo
+					.find_object(scope_oid)
+					.expect("failed to get scope object");
+				if object.kind != gix::objs::Kind::Tree {
+					continue;
+				}
+
+				let tree = object.into_tree();
+				let mut entries = collect_entries(&tree).expect("failed to read scope entries");
+				entries.remove(SCOPE_INFO_FILE);
+
+				if entries.is_empty() {
+					continue;
+				}
+
+				self.current = Some((scope_name, entries));
+				break;
+			}
+		}
+
+		let (scope_name, entries) = self.current.as_mut()?;
+		let (file_name, file_oid) = entries.pop_last()?;
+
+		let object = self
+			.repo
+			.find_object(file_oid)
+			.expect("failed to get scope entry object");
+		if object.kind != gix::objs::Kind::Blob {
+			return None;
+		}
+
+		let mut blob = object.into_blob();
+		let string = String::from_utf8(blob.take_data()).expect("failed to parse utf8");
+
+		let file = toml::from_str(&string).expect("failed to parse index file");
+
+		Some((
+			// if this panics, it's an issue with the index.
+			format!("{scope_name}/{file_name}").parse().unwrap(),
+			file,
+		))
+	}
+}
 
-pub async fn make_search(
+pub fn find_max_searchable(file: &IndexFile) -> Option<(&VersionId, &IndexFileEntry)> {
+	file.entries
+		.iter()
+		.filter(|(_, entry)| !entry.yanked)
+		.max_by(|(v_id_a, entry_a), (v_id_b, entry_b)| {
+			v_id_a
+				.version()
+				.cmp(v_id_b.version())
+				.then(entry_a.published_at.cmp(&entry_b.published_at))
+		})
+}
+
+pub fn make_search(
 	project: &Project,
 	source: &PesdePackageSource,
 ) -> (IndexReader, IndexWriter, QueryParser) {
@@ -80,6 +116,7 @@ pub async fn make_search(
 	);
 
 	let id_field = schema_builder.add_text_field("id", STRING | STORED);
+
 	let scope = schema_builder.add_text_field("scope", field_options.clone());
 	let name = schema_builder.add_text_field("name", field_options.clone());
 	let description = schema_builder.add_text_field("description", field_options);
@@ -100,22 +137,34 @@ pub async fn make_search(
 		.unwrap();
 
 	let mut search_writer = search_index.writer(50_000_000).unwrap();
 
-	let stream = all_packages(source, project).await;
-	pin!(stream);
+	let path = source.path(project);
+	let repo = gix::open(path).expect("failed to open index");
+	let tree = root_tree(&repo).expect("failed to get root tree");
 
-	while let Some((pkg_name, mut file)) = stream.next().await {
-		let Some((_, latest_entry)) = file.entries.pop_last() else {
-			tracing::error!("no versions found for {pkg_name}");
+	let iter = TreeIterator {
+		entries: collect_entries(&tree).expect("failed to read entries"),
+		repo: &repo,
+		current: None,
+	};
+
+	for (pkg_name, file) in iter {
+		if !file.meta.deprecated.is_empty() {
+			continue;
+		}
+
+		let Some((_, latest_entry)) = find_max_searchable(&file) else {
 			continue;
 		};
 
-		search_writer.add_document(doc!(
-			id_field => pkg_name.to_string(),
-			scope => pkg_name.as_str().0,
-			name => pkg_name.as_str().1,
-			description => latest_entry.description.unwrap_or_default(),
-			published_at => DateTime::from_timestamp_secs(latest_entry.published_at.timestamp()),
-		)).unwrap();
+		search_writer
+			.add_document(doc!(
+				id_field => pkg_name.to_string(),
+				scope => pkg_name.scope(),
+				name => pkg_name.name(),
+				description => latest_entry.description.clone().unwrap_or_default(),
+				published_at => DateTime::from_timestamp_nanos(latest_entry.published_at.as_nanosecond() as i64),
+			))
+			.unwrap();
 	}
 
 	search_writer.commit().unwrap();
@@ -128,7 +177,7 @@ pub async fn make_search(
 	(search_reader, search_writer, query_parser)
 }
 
-pub fn update_version(app_state: &AppState, name: &PackageName, entry: IndexFileEntry) {
+pub fn update_search_version(app_state: &AppState, name: &PackageName, entry: &IndexFileEntry) {
 	let mut search_writer = app_state.search_writer.lock().unwrap();
 	let schema = search_writer.index().schema();
 	let id_field = schema.get_field("id").unwrap();
@@ -137,12 +186,36 @@ pub fn update_version(app_state: &AppState, name: &PackageName, entry: IndexFile
 	search_writer.add_document(doc!(
 		id_field => name.to_string(),
-		schema.get_field("scope").unwrap() => name.as_str().0,
-		schema.get_field("name").unwrap() => name.as_str().1,
-		schema.get_field("description").unwrap() => entry.description.unwrap_or_default(),
-		schema.get_field("published_at").unwrap() => DateTime::from_timestamp_secs(entry.published_at.timestamp())
+		schema.get_field("scope").unwrap() => name.scope(),
+		schema.get_field("name").unwrap() => name.name(),
+		schema.get_field("description").unwrap() => entry.description.clone().unwrap_or_default(),
+		schema.get_field("published_at").unwrap() => DateTime::from_timestamp_nanos(entry.published_at.as_nanosecond() as i64)
 	)).unwrap();
 
 	search_writer.commit().unwrap();
+	drop(search_writer);
 
 	app_state.search_reader.reload().unwrap();
 }
+
+pub fn search_version_changed(app_state: &AppState, name: &PackageName, file: &IndexFile) {
+	let entry = if file.meta.deprecated.is_empty() {
+		find_max_searchable(file)
+	} else {
+		None
+	};
+
+	let Some((_, entry)) = entry else {
+		let mut search_writer = app_state.search_writer.lock().unwrap();
+		let schema = search_writer.index().schema();
+		let id_field = schema.get_field("id").unwrap();
+
+		search_writer.delete_term(Term::from_field_text(id_field, &name.to_string()));
+		search_writer.commit().unwrap();
+		drop(search_writer);
+		app_state.search_reader.reload().unwrap();
+
+		return;
+	};
+
+	update_search_version(app_state, name, entry);
+}
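find_max_searchable is the rule that keeps yanked releases out of search: the indexed document always describes the newest non-yanked entry, and search_version_changed deletes the document outright when nothing qualifies. A standalone illustration of the selection rule, with (version, yanked) tuples standing in for real index entries:

fn main() {
    // (version, yanked) stand-ins; the real code compares semver::Version values.
    let entries = [("1.0.0", false), ("1.1.0", false), ("1.2.0", true)];

    let max_searchable = entries
        .iter()
        .filter(|(_, yanked)| !*yanked)
        .max_by_key(|(version, _)| *version);

    // 1.2.0 is yanked, so the search index would advertise 1.1.0.
    assert_eq!(max_searchable, Some(&("1.1.0", false)));
}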

registry/src/storage/fs.rs

@ -1,26 +1,31 @@
use crate::{error::Error, storage::StorageImpl}; use crate::{error::RegistryError, storage::StorageImpl};
use actix_web::{ use actix_web::{
http::header::{CONTENT_ENCODING, CONTENT_TYPE}, http::header::{CONTENT_ENCODING, CONTENT_LENGTH, CONTENT_TYPE},
HttpResponse, HttpResponse,
}; };
use fs_err::tokio as fs; use fs_err::tokio as fs;
use pesde::{names::PackageName, source::version_id::VersionId}; use pesde::{names::PackageName, source::ids::VersionId};
use std::{ use std::{
fmt::Display, fmt::Display,
path::{Path, PathBuf}, path::{Path, PathBuf},
}; };
use tokio_util::io::ReaderStream;
#[derive(Debug)] #[derive(Debug)]
pub struct FSStorage { pub struct FSStorage {
pub root: PathBuf, pub root: PathBuf,
} }
async fn read_file_to_response(path: &Path, content_type: &str) -> Result<HttpResponse, Error> { async fn read_file_to_response(
Ok(match fs::read(path).await { path: &Path,
Ok(contents) => HttpResponse::Ok() content_type: &str,
) -> Result<HttpResponse, RegistryError> {
Ok(match fs::File::open(path).await {
Ok(file) => HttpResponse::Ok()
.append_header((CONTENT_TYPE, content_type)) .append_header((CONTENT_TYPE, content_type))
.append_header((CONTENT_ENCODING, "gzip")) .append_header((CONTENT_ENCODING, "gzip"))
.body(contents), .append_header((CONTENT_LENGTH, file.metadata().await?.len()))
.streaming(ReaderStream::new(file)),
Err(e) if e.kind() == std::io::ErrorKind::NotFound => HttpResponse::NotFound().finish(), Err(e) if e.kind() == std::io::ErrorKind::NotFound => HttpResponse::NotFound().finish(),
Err(e) => return Err(e.into()), Err(e) => return Err(e.into()),
}) })
@ -32,7 +37,7 @@ impl StorageImpl for FSStorage {
package_name: &PackageName, package_name: &PackageName,
version: &VersionId, version: &VersionId,
contents: Vec<u8>, contents: Vec<u8>,
) -> Result<(), Error> { ) -> Result<(), RegistryError> {
let (scope, name) = package_name.as_str(); let (scope, name) = package_name.as_str();
let path = self let path = self
@ -52,7 +57,7 @@ impl StorageImpl for FSStorage {
&self, &self,
package_name: &PackageName, package_name: &PackageName,
version: &VersionId, version: &VersionId,
) -> Result<HttpResponse, Error> { ) -> Result<HttpResponse, RegistryError> {
let (scope, name) = package_name.as_str(); let (scope, name) = package_name.as_str();
let path = self let path = self
@ -70,7 +75,7 @@ impl StorageImpl for FSStorage {
package_name: &PackageName, package_name: &PackageName,
version: &VersionId, version: &VersionId,
contents: Vec<u8>, contents: Vec<u8>,
) -> Result<(), Error> { ) -> Result<(), RegistryError> {
let (scope, name) = package_name.as_str(); let (scope, name) = package_name.as_str();
let path = self let path = self
@ -90,7 +95,7 @@ impl StorageImpl for FSStorage {
&self, &self,
package_name: &PackageName, package_name: &PackageName,
version: &VersionId, version: &VersionId,
) -> Result<HttpResponse, Error> { ) -> Result<HttpResponse, RegistryError> {
let (scope, name) = package_name.as_str(); let (scope, name) = package_name.as_str();
let path = self let path = self
@ -103,7 +108,7 @@ impl StorageImpl for FSStorage {
read_file_to_response(&path.join("readme.gz"), "text/plain").await read_file_to_response(&path.join("readme.gz"), "text/plain").await
} }
async fn store_doc(&self, doc_hash: String, contents: Vec<u8>) -> Result<(), Error> { async fn store_doc(&self, doc_hash: String, contents: Vec<u8>) -> Result<(), RegistryError> {
let path = self.root.join("Doc"); let path = self.root.join("Doc");
fs::create_dir_all(&path).await?; fs::create_dir_all(&path).await?;
@ -112,7 +117,7 @@ impl StorageImpl for FSStorage {
Ok(()) Ok(())
} }
async fn get_doc(&self, doc_hash: &str) -> Result<HttpResponse, Error> { async fn get_doc(&self, doc_hash: &str) -> Result<HttpResponse, RegistryError> {
let path = self.root.join("Doc"); let path = self.root.join("Doc");
read_file_to_response(&path.join(format!("{doc_hash}.gz")), "text/plain").await read_file_to_response(&path.join(format!("{doc_hash}.gz")), "text/plain").await
@ -121,6 +126,6 @@ impl StorageImpl for FSStorage {
impl Display for FSStorage { impl Display for FSStorage {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "FS") write!(f, "FS ({})", self.root.display())
} }
} }
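The hunk above stops buffering whole archives in memory: it opens the file, advertises its on-disk (already gzipped) size via CONTENT_LENGTH, and streams the body chunk by chunk. A minimal sketch of the same pattern, assuming actix-web, tokio, and tokio-util as dependencies; the content type argument is a placeholder:

```rust
use actix_web::{
	http::header::{CONTENT_ENCODING, CONTENT_LENGTH, CONTENT_TYPE},
	HttpResponse,
};
use std::path::Path;
use tokio_util::io::ReaderStream;

async fn stream_gzip_file(path: &Path, content_type: &str) -> std::io::Result<HttpResponse> {
	let file = tokio::fs::File::open(path).await?;
	// the stored file is already gzipped, so its on-disk length is the body length
	let len = file.metadata().await?.len();
	Ok(HttpResponse::Ok()
		.append_header((CONTENT_TYPE, content_type))
		.append_header((CONTENT_ENCODING, "gzip"))
		.append_header((CONTENT_LENGTH, len))
		// chunks are read on demand instead of loading the whole file up front
		.streaming(ReaderStream::new(file)))
}
```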


@ -1,6 +1,6 @@
use crate::{benv, error::Error, make_reqwest}; use crate::{benv, error::RegistryError, make_reqwest};
use actix_web::HttpResponse; use actix_web::HttpResponse;
use pesde::{names::PackageName, source::version_id::VersionId}; use pesde::{names::PackageName, source::ids::VersionId};
use rusty_s3::{Bucket, Credentials, UrlStyle}; use rusty_s3::{Bucket, Credentials, UrlStyle};
use std::fmt::Display; use std::fmt::Display;
@ -19,31 +19,27 @@ pub trait StorageImpl: Display {
package_name: &PackageName, package_name: &PackageName,
version: &VersionId, version: &VersionId,
contents: Vec<u8>, contents: Vec<u8>,
) -> Result<(), crate::error::Error>; ) -> Result<(), RegistryError>;
async fn get_package( async fn get_package(
&self, &self,
package_name: &PackageName, package_name: &PackageName,
version: &VersionId, version: &VersionId,
) -> Result<HttpResponse, crate::error::Error>; ) -> Result<HttpResponse, RegistryError>;
async fn store_readme( async fn store_readme(
&self, &self,
package_name: &PackageName, package_name: &PackageName,
version: &VersionId, version: &VersionId,
contents: Vec<u8>, contents: Vec<u8>,
) -> Result<(), crate::error::Error>; ) -> Result<(), RegistryError>;
async fn get_readme( async fn get_readme(
&self, &self,
package_name: &PackageName, package_name: &PackageName,
version: &VersionId, version: &VersionId,
) -> Result<HttpResponse, crate::error::Error>; ) -> Result<HttpResponse, RegistryError>;
async fn store_doc( async fn store_doc(&self, doc_hash: String, contents: Vec<u8>) -> Result<(), RegistryError>;
&self, async fn get_doc(&self, doc_hash: &str) -> Result<HttpResponse, RegistryError>;
doc_hash: String,
contents: Vec<u8>,
) -> Result<(), crate::error::Error>;
async fn get_doc(&self, doc_hash: &str) -> Result<HttpResponse, crate::error::Error>;
} }
impl StorageImpl for Storage { impl StorageImpl for Storage {
@ -52,7 +48,7 @@ impl StorageImpl for Storage {
package_name: &PackageName, package_name: &PackageName,
version: &VersionId, version: &VersionId,
contents: Vec<u8>, contents: Vec<u8>,
) -> Result<(), Error> { ) -> Result<(), RegistryError> {
match self { match self {
Storage::S3(s3) => s3.store_package(package_name, version, contents).await, Storage::S3(s3) => s3.store_package(package_name, version, contents).await,
Storage::FS(fs) => fs.store_package(package_name, version, contents).await, Storage::FS(fs) => fs.store_package(package_name, version, contents).await,
@ -63,7 +59,7 @@ impl StorageImpl for Storage {
&self, &self,
package_name: &PackageName, package_name: &PackageName,
version: &VersionId, version: &VersionId,
) -> Result<HttpResponse, Error> { ) -> Result<HttpResponse, RegistryError> {
match self { match self {
Storage::S3(s3) => s3.get_package(package_name, version).await, Storage::S3(s3) => s3.get_package(package_name, version).await,
Storage::FS(fs) => fs.get_package(package_name, version).await, Storage::FS(fs) => fs.get_package(package_name, version).await,
@ -75,7 +71,7 @@ impl StorageImpl for Storage {
package_name: &PackageName, package_name: &PackageName,
version: &VersionId, version: &VersionId,
contents: Vec<u8>, contents: Vec<u8>,
) -> Result<(), Error> { ) -> Result<(), RegistryError> {
match self { match self {
Storage::S3(s3) => s3.store_readme(package_name, version, contents).await, Storage::S3(s3) => s3.store_readme(package_name, version, contents).await,
Storage::FS(fs) => fs.store_readme(package_name, version, contents).await, Storage::FS(fs) => fs.store_readme(package_name, version, contents).await,
@ -86,21 +82,21 @@ impl StorageImpl for Storage {
&self, &self,
package_name: &PackageName, package_name: &PackageName,
version: &VersionId, version: &VersionId,
) -> Result<HttpResponse, Error> { ) -> Result<HttpResponse, RegistryError> {
match self { match self {
Storage::S3(s3) => s3.get_readme(package_name, version).await, Storage::S3(s3) => s3.get_readme(package_name, version).await,
Storage::FS(fs) => fs.get_readme(package_name, version).await, Storage::FS(fs) => fs.get_readme(package_name, version).await,
} }
} }
async fn store_doc(&self, doc_hash: String, contents: Vec<u8>) -> Result<(), Error> { async fn store_doc(&self, doc_hash: String, contents: Vec<u8>) -> Result<(), RegistryError> {
match self { match self {
Storage::S3(s3) => s3.store_doc(doc_hash, contents).await, Storage::S3(s3) => s3.store_doc(doc_hash, contents).await,
Storage::FS(fs) => fs.store_doc(doc_hash, contents).await, Storage::FS(fs) => fs.store_doc(doc_hash, contents).await,
} }
} }
async fn get_doc(&self, doc_hash: &str) -> Result<HttpResponse, Error> { async fn get_doc(&self, doc_hash: &str) -> Result<HttpResponse, RegistryError> {
match self { match self {
Storage::S3(s3) => s3.get_doc(doc_hash).await, Storage::S3(s3) => s3.get_doc(doc_hash).await,
Storage::FS(fs) => fs.get_doc(doc_hash).await, Storage::FS(fs) => fs.get_doc(doc_hash).await,
@ -111,8 +107,8 @@ impl StorageImpl for Storage {
impl Display for Storage { impl Display for Storage {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self { match self {
Storage::S3(s3) => write!(f, "{}", s3), Storage::S3(s3) => write!(f, "{s3}"),
Storage::FS(fs) => write!(f, "{}", fs), Storage::FS(fs) => write!(f, "{fs}"),
} }
} }
} }
@ -120,14 +116,14 @@ impl Display for Storage {
pub fn get_storage_from_env() -> Storage { pub fn get_storage_from_env() -> Storage {
if let Ok(endpoint) = benv!(parse "S3_ENDPOINT") { if let Ok(endpoint) = benv!(parse "S3_ENDPOINT") {
Storage::S3(s3::S3Storage { Storage::S3(s3::S3Storage {
s3_bucket: Bucket::new( bucket: Bucket::new(
endpoint, endpoint,
UrlStyle::Path, UrlStyle::Path,
benv!(required "S3_BUCKET_NAME"), benv!(required "S3_BUCKET_NAME"),
benv!(required "S3_REGION"), benv!(required "S3_REGION"),
) )
.unwrap(), .unwrap(),
s3_credentials: Credentials::new( credentials: Credentials::new(
benv!(required "S3_ACCESS_KEY"), benv!(required "S3_ACCESS_KEY"),
benv!(required "S3_SECRET_KEY"), benv!(required "S3_SECRET_KEY"),
), ),


@ -1,20 +1,20 @@
use crate::{ use crate::{
error::{Error, ReqwestErrorExt}, error::{RegistryError, ReqwestErrorExt as _},
storage::StorageImpl, storage::StorageImpl,
}; };
use actix_web::{http::header::LOCATION, HttpResponse}; use actix_web::{http::header::LOCATION, HttpResponse};
use pesde::{names::PackageName, source::version_id::VersionId}; use pesde::{names::PackageName, source::ids::VersionId};
use reqwest::header::{CONTENT_ENCODING, CONTENT_TYPE}; use reqwest::header::{CONTENT_ENCODING, CONTENT_TYPE};
use rusty_s3::{ use rusty_s3::{
actions::{GetObject, PutObject}, actions::{GetObject, PutObject},
Bucket, Credentials, S3Action, Bucket, Credentials, S3Action as _,
}; };
use std::{fmt::Display, time::Duration}; use std::{fmt::Display, time::Duration};
#[derive(Debug)] #[derive(Debug)]
pub struct S3Storage { pub struct S3Storage {
pub s3_bucket: Bucket, pub bucket: Bucket,
pub s3_credentials: Credentials, pub credentials: Credentials,
pub reqwest_client: reqwest::Client, pub reqwest_client: reqwest::Client,
} }
@ -26,10 +26,10 @@ impl StorageImpl for S3Storage {
package_name: &PackageName, package_name: &PackageName,
version: &VersionId, version: &VersionId,
contents: Vec<u8>, contents: Vec<u8>,
) -> Result<(), Error> { ) -> Result<(), RegistryError> {
let object_url = PutObject::new( let object_url = PutObject::new(
&self.s3_bucket, &self.bucket,
Some(&self.s3_credentials), Some(&self.credentials),
&format!( &format!(
"{package_name}/{}/{}/pkg.tar.gz", "{package_name}/{}/{}/pkg.tar.gz",
version.version(), version.version(),
@ -55,10 +55,10 @@ impl StorageImpl for S3Storage {
&self, &self,
package_name: &PackageName, package_name: &PackageName,
version: &VersionId, version: &VersionId,
) -> Result<HttpResponse, Error> { ) -> Result<HttpResponse, RegistryError> {
let object_url = GetObject::new( let object_url = GetObject::new(
&self.s3_bucket, &self.bucket,
Some(&self.s3_credentials), Some(&self.credentials),
&format!( &format!(
"{package_name}/{}/{}/pkg.tar.gz", "{package_name}/{}/{}/pkg.tar.gz",
version.version(), version.version(),
@ -77,10 +77,10 @@ impl StorageImpl for S3Storage {
package_name: &PackageName, package_name: &PackageName,
version: &VersionId, version: &VersionId,
contents: Vec<u8>, contents: Vec<u8>,
) -> Result<(), Error> { ) -> Result<(), RegistryError> {
let object_url = PutObject::new( let object_url = PutObject::new(
&self.s3_bucket, &self.bucket,
Some(&self.s3_credentials), Some(&self.credentials),
&format!( &format!(
"{package_name}/{}/{}/readme.gz", "{package_name}/{}/{}/readme.gz",
version.version(), version.version(),
@ -106,10 +106,10 @@ impl StorageImpl for S3Storage {
&self, &self,
package_name: &PackageName, package_name: &PackageName,
version: &VersionId, version: &VersionId,
) -> Result<HttpResponse, Error> { ) -> Result<HttpResponse, RegistryError> {
let object_url = GetObject::new( let object_url = GetObject::new(
&self.s3_bucket, &self.bucket,
Some(&self.s3_credentials), Some(&self.credentials),
&format!( &format!(
"{package_name}/{}/{}/readme.gz", "{package_name}/{}/{}/readme.gz",
version.version(), version.version(),
@ -123,12 +123,12 @@ impl StorageImpl for S3Storage {
.finish()) .finish())
} }
async fn store_doc(&self, doc_hash: String, contents: Vec<u8>) -> Result<(), Error> { async fn store_doc(&self, doc_hash: String, contents: Vec<u8>) -> Result<(), RegistryError> {
let object_url = PutObject::new( let object_url = PutObject::new(
&self.s3_bucket, &self.bucket,
Some(&self.s3_credentials), Some(&self.credentials),
// capitalize Doc to prevent conflicts with scope names // capitalize Doc to prevent conflicts with scope names
&format!("Doc/{}.gz", doc_hash), &format!("Doc/{doc_hash}.gz"),
) )
.sign(S3_SIGN_DURATION); .sign(S3_SIGN_DURATION);
@ -145,11 +145,11 @@ impl StorageImpl for S3Storage {
Ok(()) Ok(())
} }
async fn get_doc(&self, doc_hash: &str) -> Result<HttpResponse, Error> { async fn get_doc(&self, doc_hash: &str) -> Result<HttpResponse, RegistryError> {
let object_url = GetObject::new( let object_url = GetObject::new(
&self.s3_bucket, &self.bucket,
Some(&self.s3_credentials), Some(&self.credentials),
&format!("Doc/{}.gz", doc_hash), &format!("Doc/{doc_hash}.gz"),
) )
.sign(S3_SIGN_DURATION); .sign(S3_SIGN_DURATION);
@ -161,6 +161,6 @@ impl StorageImpl for S3Storage {
impl Display for S3Storage { impl Display for S3Storage {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "S3") write!(f, "S3 (bucket name: {})", self.bucket.name())
} }
} }
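For the S3 backend the registry never proxies package bytes itself: it presigns a short-lived URL and answers with a redirect to it. A hedged sketch of that flow with rusty_s3; the five-minute validity and the 307 status are illustrative choices, not necessarily what the code above uses:

```rust
use actix_web::{http::header::LOCATION, HttpResponse};
use rusty_s3::{actions::GetObject, Bucket, Credentials, S3Action as _};
use std::time::Duration;

fn redirect_to_object(bucket: &Bucket, credentials: &Credentials, key: &str) -> HttpResponse {
	// presign a GET that stays valid just long enough for the client to follow it
	let object_url = GetObject::new(bucket, Some(credentials), key).sign(Duration::from_secs(300));
	HttpResponse::TemporaryRedirect()
		.append_header((LOCATION, object_url.to_string()))
		.finish()
}
```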


@ -1 +1,2 @@
imports_granularity = "Crate" imports_granularity = "Crate"
hard_tabs = true


@ -1,13 +1,14 @@
use crate::cli::config::{read_config, write_config}; use crate::cli::config::{read_config, write_config};
use anyhow::Context; use anyhow::Context as _;
use gix::bstr::BStr; use gix::bstr::BStr;
use keyring::Entry; use keyring::Entry;
use reqwest::header::AUTHORIZATION; use reqwest::header::AUTHORIZATION;
use serde::{ser::SerializeMap, Deserialize, Serialize}; use serde::{ser::SerializeMap as _, Deserialize, Serialize};
use std::collections::BTreeMap; use std::collections::BTreeMap;
use tokio::task::spawn_blocking;
use tracing::instrument; use tracing::instrument;
#[derive(Debug, Clone)] #[derive(Debug, Clone, Default)]
pub struct Tokens(pub BTreeMap<gix::Url, String>); pub struct Tokens(pub BTreeMap<gix::Url, String>);
impl Serialize for Tokens { impl Serialize for Tokens {
@ -46,41 +47,54 @@ pub async fn get_tokens() -> anyhow::Result<Tokens> {
return Ok(config.tokens); return Ok(config.tokens);
} }
match Entry::new("tokens", env!("CARGO_PKG_NAME")) { let keyring_tokens = spawn_blocking(|| match Entry::new("tokens", env!("CARGO_PKG_NAME")) {
Ok(entry) => match entry.get_password() { Ok(entry) => match entry.get_password() {
Ok(token) => { Ok(token) => serde_json::from_str(&token)
tracing::debug!("using tokens from keyring"); .map(Some)
return serde_json::from_str(&token).context("failed to parse tokens"); .context("failed to parse tokens"),
} Err(keyring::Error::PlatformFailure(_) | keyring::Error::NoEntry) => Ok(None),
Err(keyring::Error::PlatformFailure(_) | keyring::Error::NoEntry) => {} Err(e) => Err(e.into()),
Err(e) => return Err(e.into()),
}, },
Err(keyring::Error::PlatformFailure(_)) => {} Err(keyring::Error::PlatformFailure(_)) => Ok(None),
Err(e) => return Err(e.into()), Err(e) => Err(e.into()),
})
.await
.unwrap()?;
if let Some(tokens) = keyring_tokens {
tracing::debug!("using tokens from keyring");
return Ok(tokens);
} }
Ok(Tokens(BTreeMap::new())) Ok(Tokens::default())
} }
#[instrument(level = "trace")] #[instrument(level = "trace")]
pub async fn set_tokens(tokens: Tokens) -> anyhow::Result<()> { pub async fn set_tokens(tokens: Tokens) -> anyhow::Result<()> {
let entry = Entry::new("tokens", env!("CARGO_PKG_NAME"))?;
let json = serde_json::to_string(&tokens).context("failed to serialize tokens")?; let json = serde_json::to_string(&tokens).context("failed to serialize tokens")?;
let to_keyring = spawn_blocking(move || {
let entry = Entry::new("tokens", env!("CARGO_PKG_NAME"))?;
match entry.set_password(&json) { match entry.set_password(&json) {
Ok(()) => { Ok(()) => Ok::<_, anyhow::Error>(true),
Err(keyring::Error::PlatformFailure(_) | keyring::Error::NoEntry) => Ok(false),
Err(e) => Err(e.into()),
}
})
.await
.unwrap()?;
if to_keyring {
tracing::debug!("tokens saved to keyring"); tracing::debug!("tokens saved to keyring");
return Ok(()); return Ok(());
} }
Err(keyring::Error::PlatformFailure(_) | keyring::Error::NoEntry) => {}
Err(e) => return Err(e.into()),
}
tracing::debug!("tokens saved to config"); tracing::debug!("saving tokens to config");
let mut config = read_config().await?; let mut config = read_config().await?;
config.tokens = tokens; config.tokens = tokens;
write_config(&config).await.map_err(Into::into) write_config(&config).await
} }
pub async fn set_token(repo: &gix::Url, token: Option<&str>) -> anyhow::Result<()> { pub async fn set_token(repo: &gix::Url, token: Option<&str>) -> anyhow::Result<()> {
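keyring's API is synchronous and can stall on the platform's secret service, so the hunk above wraps both the read and the write in spawn_blocking to keep the async runtime free. A minimal sketch of the read side; the "my-cli" service name is a placeholder for the real env!("CARGO_PKG_NAME") pair:

```rust
use keyring::Entry;
use tokio::task::spawn_blocking;

async fn load_keyring_tokens() -> anyhow::Result<Option<String>> {
	spawn_blocking(|| -> anyhow::Result<Option<String>> {
		match Entry::new("tokens", "my-cli") {
			Ok(entry) => match entry.get_password() {
				Ok(json) => Ok(Some(json)),
				// a missing entry or an unusable platform store just means "no tokens"
				Err(keyring::Error::PlatformFailure(_) | keyring::Error::NoEntry) => Ok(None),
				Err(e) => Err(e.into()),
			},
			Err(keyring::Error::PlatformFailure(_)) => Ok(None),
			Err(e) => Err(e.into()),
		}
	})
	.await? // a panicked blocking task surfaces as a JoinError here
}
```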


@ -1,23 +1,25 @@
use std::{collections::HashSet, str::FromStr}; use std::str::FromStr as _;
use anyhow::Context; use anyhow::Context as _;
use clap::Args; use clap::Args;
use colored::Colorize;
use semver::VersionReq; use semver::VersionReq;
use crate::cli::{config::read_config, AnyPackageIdentifier, VersionedPackageName}; use crate::cli::{
config::read_config, dep_type_to_key, AnyPackageIdentifier, VersionedPackageName,
};
use pesde::{ use pesde::{
manifest::target::TargetKind, manifest::{target::TargetKind, Alias, DependencyType},
names::PackageNames, names::PackageNames,
source::{ source::{
git::{specifier::GitDependencySpecifier, GitPackageSource}, git::{specifier::GitDependencySpecifier, GitPackageSource},
path::{specifier::PathDependencySpecifier, PathPackageSource},
pesde::{specifier::PesdeDependencySpecifier, PesdePackageSource}, pesde::{specifier::PesdeDependencySpecifier, PesdePackageSource},
specifiers::DependencySpecifiers, specifiers::DependencySpecifiers,
traits::PackageSource, traits::{PackageSource as _, RefreshOptions, ResolveOptions},
workspace::WorkspacePackageSource, workspace::{specifier::WorkspaceDependencySpecifier, WorkspacePackageSource},
PackageSources, PackageSources,
}, },
Project, DEFAULT_INDEX_NAME, Project, RefreshedSources, DEFAULT_INDEX_NAME,
}; };
#[derive(Debug, Args)] #[derive(Debug, Args)]
@ -36,7 +38,7 @@ pub struct AddCommand {
/// The alias to use for the package /// The alias to use for the package
#[arg(short, long)] #[arg(short, long)]
alias: Option<String>, alias: Option<Alias>,
/// Whether to add the package as a peer dependency /// Whether to add the package as a peer dependency
#[arg(short, long)] #[arg(short, long)]
@ -63,8 +65,7 @@ impl AddCommand {
.cloned(); .cloned();
if let Some(index) = self.index.as_ref().filter(|_| index.is_none()) { if let Some(index) = self.index.as_ref().filter(|_| index.is_none()) {
println!("{}: index {index} not found", "error".red().bold()); anyhow::bail!("index {index} not found");
return Ok(());
} }
let index = match index { let index = match index {
@ -76,7 +77,7 @@ impl AddCommand {
let specifier = DependencySpecifiers::Pesde(PesdeDependencySpecifier { let specifier = DependencySpecifiers::Pesde(PesdeDependencySpecifier {
name: name.clone(), name: name.clone(),
version: version.clone().unwrap_or(VersionReq::STAR), version: version.clone().unwrap_or(VersionReq::STAR),
index: self.index, index: self.index.unwrap_or_else(|| DEFAULT_INDEX_NAME.to_string()),
target: self.target, target: self.target,
}); });
@ -90,8 +91,7 @@ impl AddCommand {
.cloned(); .cloned();
if let Some(index) = self.index.as_ref().filter(|_| index.is_none()) { if let Some(index) = self.index.as_ref().filter(|_| index.is_none()) {
println!("{}: wally index {index} not found", "error".red().bold()); anyhow::bail!("wally index {index} not found");
return Ok(());
} }
let index = index.context("no wally index found")?; let index = index.context("no wally index found")?;
@ -102,7 +102,7 @@ impl AddCommand {
pesde::source::wally::specifier::WallyDependencySpecifier { pesde::source::wally::specifier::WallyDependencySpecifier {
name: name.clone(), name: name.clone(),
version: version.clone().unwrap_or(VersionReq::STAR), version: version.clone().unwrap_or(VersionReq::STAR),
index: self.index, index: self.index.unwrap_or_else(|| DEFAULT_INDEX_NAME.to_string()),
}, },
); );
@ -119,36 +119,59 @@ impl AddCommand {
), ),
AnyPackageIdentifier::Workspace(VersionedPackageName(name, version)) => ( AnyPackageIdentifier::Workspace(VersionedPackageName(name, version)) => (
PackageSources::Workspace(WorkspacePackageSource), PackageSources::Workspace(WorkspacePackageSource),
DependencySpecifiers::Workspace( DependencySpecifiers::Workspace(WorkspaceDependencySpecifier {
pesde::source::workspace::specifier::WorkspaceDependencySpecifier {
name: name.clone(), name: name.clone(),
version: version.clone().unwrap_or_default(), version: version.clone().unwrap_or_default(),
target: self.target, target: self.target,
}, }),
), ),
AnyPackageIdentifier::Path(path) => (
PackageSources::Path(PathPackageSource),
DependencySpecifiers::Path(PathDependencySpecifier { path: path.clone() }),
), ),
}; };
source
.refresh(&project) let refreshed_sources = RefreshedSources::new();
refreshed_sources
.refresh(
&source,
&RefreshOptions {
project: project.clone(),
},
)
.await .await
.context("failed to refresh package source")?; .context("failed to refresh package source")?;
let Some(version_id) = source let (_, mut versions, suggestions) = source
.resolve( .resolve(
&specifier, &specifier,
&project, &ResolveOptions {
manifest.target.kind(), project: project.clone(),
&mut HashSet::new(), target: manifest.target.kind(),
refreshed_sources,
loose_target: false,
},
) )
.await .await
.context("failed to resolve package")? .context("failed to resolve package")?;
.1
.pop_last()
.map(|(v_id, _)| v_id)
else {
println!("{}: no versions found for package", "error".red().bold());
return Ok(()); let Some((version_id, _)) = versions.pop_last() else {
anyhow::bail!(
"no matching versions found for package{}",
if suggestions.is_empty() {
"".into()
} else {
format!(
". available targets: {}",
suggestions
.into_iter()
.map(|t| t.to_string())
.collect::<Vec<_>>()
.join(", ")
)
}
);
}; };
let project_target = manifest.target.kind(); let project_target = manifest.target.kind();
@ -159,74 +182,81 @@ impl AddCommand {
.context("failed to read manifest")?, .context("failed to read manifest")?,
) )
.context("failed to parse manifest")?; .context("failed to parse manifest")?;
let dependency_key = if self.peer { let dependency_key = dep_type_to_key(if self.peer {
"peer_dependencies" DependencyType::Peer
} else if self.dev { } else if self.dev {
"dev_dependencies" DependencyType::Dev
} else { } else {
"dependencies" DependencyType::Standard
}; });
let alias = self.alias.unwrap_or_else(|| match self.name.clone() { let alias = match self.alias {
AnyPackageIdentifier::PackageName(versioned) => versioned.0.as_str().1.to_string(), Some(alias) => alias,
None => match &self.name {
AnyPackageIdentifier::PackageName(versioned) => versioned.0.name().to_string(),
AnyPackageIdentifier::Url((url, _)) => url AnyPackageIdentifier::Url((url, _)) => url
.path .path
.to_string() .to_string()
.split('/') .split('/')
.last() .next_back()
.map(|s| s.to_string()) .map_or_else(|| url.path.to_string(), ToString::to_string),
.unwrap_or(url.path.to_string()), AnyPackageIdentifier::Workspace(versioned) => versioned.0.name().to_string(),
AnyPackageIdentifier::Workspace(versioned) => versioned.0.as_str().1.to_string(), AnyPackageIdentifier::Path(path) => path
}); .file_name()
.map(|s| s.to_string_lossy().to_string())
.expect("path has no file name"),
}
.parse()
.context("auto-generated alias is invalid. use --alias to specify one")?,
};
let field = &mut manifest[dependency_key] let field = &mut manifest[dependency_key]
.or_insert(toml_edit::Item::Table(toml_edit::Table::new()))[&alias]; .or_insert(toml_edit::Item::Table(toml_edit::Table::new()))[alias.as_str()];
match specifier { match specifier {
DependencySpecifiers::Pesde(spec) => { DependencySpecifiers::Pesde(spec) => {
field["name"] = toml_edit::value(spec.name.clone().to_string()); field["name"] = toml_edit::value(spec.name.to_string());
field["version"] = toml_edit::value(format!("^{}", version_id.version())); field["version"] = toml_edit::value(format!("^{}", version_id.version()));
if *version_id.target() != project_target { if version_id.target() != project_target {
field["target"] = toml_edit::value(version_id.target().to_string()); field["target"] = toml_edit::value(version_id.target().to_string());
} }
if let Some(index) = spec.index.filter(|i| i != DEFAULT_INDEX_NAME) { if spec.index != DEFAULT_INDEX_NAME {
field["index"] = toml_edit::value(index); field["index"] = toml_edit::value(spec.index);
} }
println!( println!(
"added {}@{} {} to {}", "added {}@{} {} to {dependency_key}",
spec.name, spec.name,
version_id.version(), version_id.version(),
version_id.target(), version_id.target()
dependency_key
); );
} }
#[cfg(feature = "wally-compat")] #[cfg(feature = "wally-compat")]
DependencySpecifiers::Wally(spec) => { DependencySpecifiers::Wally(spec) => {
field["wally"] = toml_edit::value(spec.name.clone().to_string()); let name_str = spec.name.to_string();
let name_str = name_str.trim_start_matches("wally#");
field["wally"] = toml_edit::value(name_str);
field["version"] = toml_edit::value(format!("^{}", version_id.version())); field["version"] = toml_edit::value(format!("^{}", version_id.version()));
if let Some(index) = spec.index.filter(|i| i != DEFAULT_INDEX_NAME) { if spec.index != DEFAULT_INDEX_NAME {
field["index"] = toml_edit::value(index); field["index"] = toml_edit::value(spec.index);
} }
println!( println!(
"added wally {}@{} to {}", "added wally {name_str}@{} to {dependency_key}",
spec.name, version_id.version()
version_id.version(),
dependency_key
); );
} }
DependencySpecifiers::Git(spec) => { DependencySpecifiers::Git(spec) => {
field["repo"] = toml_edit::value(spec.repo.to_bstring().to_string()); field["repo"] = toml_edit::value(spec.repo.to_bstring().to_string());
field["rev"] = toml_edit::value(spec.rev.clone()); field["rev"] = toml_edit::value(spec.rev.clone());
println!("added git {}#{} to {}", spec.repo, spec.rev, dependency_key); println!("added git {}#{} to {dependency_key}", spec.repo, spec.rev);
} }
DependencySpecifiers::Workspace(spec) => { DependencySpecifiers::Workspace(spec) => {
field["workspace"] = toml_edit::value(spec.name.clone().to_string()); field["workspace"] = toml_edit::value(spec.name.to_string());
if let AnyPackageIdentifier::Workspace(versioned) = self.name { if let AnyPackageIdentifier::Workspace(versioned) = self.name {
if let Some(version) = versioned.1 { if let Some(version) = versioned.1 {
field["version"] = toml_edit::value(version.to_string()); field["version"] = toml_edit::value(version.to_string());
@ -234,10 +264,15 @@ impl AddCommand {
} }
println!( println!(
"added workspace {}@{} to {}", "added workspace {}@{} to {dependency_key}",
spec.name, spec.version, dependency_key spec.name, spec.version
); );
} }
DependencySpecifiers::Path(spec) => {
field["path"] = toml_edit::value(spec.path.to_string_lossy().to_string());
println!("added path {} to {dependency_key}", spec.path.display());
}
} }
project project
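All of the manifest edits in this command go through toml_edit, which mutates the TOML document in place while preserving the user's comments and formatting. A small sketch of the write pattern; the key, alias, name, and version below are made-up values:

```rust
use toml_edit::{value, DocumentMut, Item, Table};

fn add_dependency(doc: &mut DocumentMut, key: &str, alias: &str, name: &str, version: &str) {
	// create the dependency table on first use, then write under the alias
	let field = &mut doc[key].or_insert(Item::Table(Table::new()))[alias];
	field["name"] = value(name);
	field["version"] = value(format!("^{version}"));
}
```

Calling `add_dependency(&mut doc, "dev_dependencies", "scripts", "scope/scripts", "0.1.0")` (hypothetical values) leaves every untouched line of the manifest byte-for-byte intact, which is why the command edits a `DocumentMut` rather than re-serializing the parsed manifest.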


@ -1,18 +1,23 @@
use anyhow::Context; use anyhow::Context as _;
use clap::Args; use clap::Args;
use colored::Colorize; use console::style;
use serde::Deserialize; use serde::Deserialize;
use std::thread::spawn; use std::thread::spawn;
use tokio::time::sleep; use tokio::time::sleep;
use url::Url; use url::Url;
use crate::cli::{
auth::{get_token_login, set_token},
style::URL_STYLE,
};
use pesde::{ use pesde::{
source::{pesde::PesdePackageSource, traits::PackageSource}, source::{
pesde::PesdePackageSource,
traits::{PackageSource as _, RefreshOptions},
},
Project, Project,
}; };
use crate::cli::auth::{get_token_login, set_token};
#[derive(Debug, Args)] #[derive(Debug, Args)]
pub struct LoginCommand { pub struct LoginCommand {
/// The token to use for authentication, skipping login /// The token to use for authentication, skipping login
@ -57,7 +62,9 @@ impl LoginCommand {
let source = PesdePackageSource::new(index_url.clone()); let source = PesdePackageSource::new(index_url.clone());
source source
.refresh(project) .refresh(&RefreshOptions {
project: project.clone(),
})
.await .await
.context("failed to refresh index")?; .context("failed to refresh index")?;
@ -85,8 +92,8 @@ impl LoginCommand {
println!( println!(
"copy your one-time code: {}\npress enter to open {} in your browser...", "copy your one-time code: {}\npress enter to open {} in your browser...",
response.user_code.bold(), style(response.user_code).bold(),
response.verification_uri.as_str().blue() URL_STYLE.apply_to(response.verification_uri.as_str())
); );
spawn(move || { spawn(move || {
@ -138,12 +145,11 @@ impl LoginCommand {
return Ok(access_token); return Ok(access_token);
} }
AccessTokenResponse::Error(e) => match e { AccessTokenResponse::Error(e) => match e {
AccessTokenError::AuthorizationPending => continue, AccessTokenError::AuthorizationPending => {}
AccessTokenError::SlowDown { AccessTokenError::SlowDown {
interval: new_interval, interval: new_interval,
} => { } => {
interval = std::time::Duration::from_secs(new_interval); interval = std::time::Duration::from_secs(new_interval);
continue;
} }
AccessTokenError::ExpiredToken => { AccessTokenError::ExpiredToken => {
break; break;
@ -180,7 +186,7 @@ impl LoginCommand {
let token = format!("Bearer {token}"); let token = format!("Bearer {token}");
println!( println!(
"logged in as {} for {index_url}", "logged in as {} for {index_url}",
get_token_login(&reqwest, &token).await?.bold() style(get_token_login(&reqwest, &token).await?).bold()
); );
token token
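The login loop above is a standard OAuth device flow: poll the token endpoint on an interval, do nothing while authorization is pending, and widen the interval when the server asks to slow down. A self-contained sketch of just that loop; `PollResult` and `poll_once` are stand-ins for the real `AccessTokenResponse` handling:

```rust
use std::time::Duration;
use tokio::time::sleep;

enum PollResult {
	Ready(String),      // access token granted
	Pending,            // user has not finished authorizing yet
	SlowDown(Duration), // server requested a longer polling interval
}

async fn poll_once() -> anyhow::Result<PollResult> {
	// the real implementation POSTs the device code to the token endpoint
	Ok(PollResult::Pending)
}

async fn wait_for_token(mut interval: Duration) -> anyhow::Result<String> {
	loop {
		sleep(interval).await;
		match poll_once().await? {
			PollResult::Ready(token) => return Ok(token),
			PollResult::Pending => {}
			PollResult::SlowDown(new_interval) => interval = new_interval,
		}
	}
}
```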


@ -2,7 +2,7 @@ use crate::cli::auth::set_token;
use clap::Args; use clap::Args;
#[derive(Debug, Args)] #[derive(Debug, Args)]
pub struct LogoutCommand {} pub struct LogoutCommand;
impl LogoutCommand { impl LogoutCommand {
pub async fn run(self, index_url: gix::Url) -> anyhow::Result<()> { pub async fn run(self, index_url: gix::Url) -> anyhow::Result<()> {


@ -1,6 +1,6 @@
use crate::cli::config::read_config; use crate::cli::get_index;
use clap::{Args, Subcommand}; use clap::{Args, Subcommand};
use pesde::{errors::ManifestReadError, Project, DEFAULT_INDEX_NAME}; use pesde::Project;
mod login; mod login;
mod logout; mod logout;
@ -32,36 +32,7 @@ pub enum AuthCommands {
impl AuthSubcommand { impl AuthSubcommand {
pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> { pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
let manifest = match project.deser_manifest().await { let index_url = get_index(&project, self.index.as_deref()).await?;
Ok(manifest) => Some(manifest),
Err(e) => match e {
ManifestReadError::Io(e) if e.kind() == std::io::ErrorKind::NotFound => None,
e => return Err(e.into()),
},
};
let index_url = match self.index.as_deref() {
Some(index) => match index.try_into() {
Ok(url) => Some(url),
Err(_) => None,
},
None => match manifest {
Some(_) => None,
None => Some(read_config().await?.default_index),
},
};
let index_url = match index_url {
Some(url) => url,
None => {
let index_name = self.index.as_deref().unwrap_or(DEFAULT_INDEX_NAME);
match manifest.unwrap().indices.get(index_name) {
Some(index) => index.clone(),
None => anyhow::bail!("index {index_name} not found in manifest"),
}
}
};
match self.command { match self.command {
AuthCommands::Login(login) => login.run(index_url, project, reqwest).await, AuthCommands::Login(login) => login.run(index_url, project, reqwest).await,


@ -2,17 +2,14 @@ use crate::cli::auth::get_tokens;
use clap::Args; use clap::Args;
#[derive(Debug, Args)] #[derive(Debug, Args)]
pub struct TokenCommand {} pub struct TokenCommand;
impl TokenCommand { impl TokenCommand {
pub async fn run(self, index_url: gix::Url) -> anyhow::Result<()> { pub async fn run(self, index_url: gix::Url) -> anyhow::Result<()> {
let tokens = get_tokens().await?; let tokens = get_tokens().await?;
let token = match tokens.0.get(&index_url) { let Some(token) = tokens.0.get(&index_url) else {
Some(token) => token,
None => {
println!("not logged in into {index_url}"); println!("not logged in into {index_url}");
return Ok(()); return Ok(());
}
}; };
println!("token for {index_url}: \"{token}\""); println!("token for {index_url}: \"{token}\"");


@ -1,24 +1,21 @@
use crate::cli::auth::{get_token_login, get_tokens}; use crate::cli::auth::{get_token_login, get_tokens};
use clap::Args; use clap::Args;
use colored::Colorize; use console::style;
#[derive(Debug, Args)] #[derive(Debug, Args)]
pub struct WhoAmICommand {} pub struct WhoAmICommand;
impl WhoAmICommand { impl WhoAmICommand {
pub async fn run(self, index_url: gix::Url, reqwest: reqwest::Client) -> anyhow::Result<()> { pub async fn run(self, index_url: gix::Url, reqwest: reqwest::Client) -> anyhow::Result<()> {
let tokens = get_tokens().await?; let tokens = get_tokens().await?;
let token = match tokens.0.get(&index_url) { let Some(token) = tokens.0.get(&index_url) else {
Some(token) => token,
None => {
println!("not logged in into {index_url}"); println!("not logged in into {index_url}");
return Ok(()); return Ok(());
}
}; };
println!( println!(
"logged in as {} into {index_url}", "logged in as {} into {index_url}",
get_token_login(&reqwest, token).await?.bold() style(get_token_login(&reqwest, token).await?).bold()
); );
Ok(()) Ok(())


@ -0,0 +1,18 @@
use clap::Subcommand;
use pesde::Project;
mod prune;
#[derive(Debug, Subcommand)]
pub enum CasCommands {
/// Removes unused files from the CAS
Prune(prune::PruneCommand),
}
impl CasCommands {
pub async fn run(self, project: Project) -> anyhow::Result<()> {
match self {
CasCommands::Prune(prune) => prune.run(project).await,
}
}
}


@ -0,0 +1,346 @@
use crate::{
cli::{
reporters::run_with_reporter,
style::{INFO_STYLE, SUCCESS_STYLE},
},
util::remove_empty_dir,
};
use anyhow::Context as _;
use async_stream::try_stream;
use clap::Args;
use fs_err::tokio as fs;
use futures::{future::BoxFuture, FutureExt as _, Stream, StreamExt as _};
use pesde::{
source::fs::{FsEntry, PackageFs},
Project,
};
use std::{
collections::{HashMap, HashSet},
future::Future,
path::{Path, PathBuf},
};
use tokio::task::JoinSet;
#[derive(Debug, Args)]
pub struct PruneCommand;
async fn read_dir_stream(
dir: &Path,
) -> std::io::Result<impl Stream<Item = std::io::Result<fs::DirEntry>>> {
let mut read_dir = fs::read_dir(dir).await?;
Ok(try_stream! {
while let Some(entry) = read_dir.next_entry().await? {
yield entry;
}
})
}
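`read_dir_stream` bridges tokio's pull-based `next_entry()` API into a `Stream`, so directory entries can be pushed through combinators like `map` and `collect`. The same shape in isolation, assuming the async-stream and futures crates; note that `?` inside `try_stream!` surfaces errors as the stream's `Err` items instead of aborting the caller:

```rust
use async_stream::try_stream;
use futures::Stream;
use std::path::Path;
use tokio::fs;

fn dir_entries(dir: &Path) -> impl Stream<Item = std::io::Result<fs::DirEntry>> + '_ {
	try_stream! {
		let mut read_dir = fs::read_dir(dir).await?;
		while let Some(entry) = read_dir.next_entry().await? {
			yield entry;
		}
	}
}
```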
#[allow(unreachable_code)]
async fn get_nlinks(path: &Path) -> anyhow::Result<u64> {
#[cfg(unix)]
{
use std::os::unix::fs::MetadataExt as _;
let metadata = fs::metadata(path).await?;
return Ok(metadata.nlink());
}
// life would be easier if rust had stabilized the nightly feature from 2019
#[cfg(windows)]
{
use std::os::windows::ffi::OsStrExt as _;
use windows::{
core::PWSTR,
Win32::{
Foundation::CloseHandle,
Storage::FileSystem::{
CreateFileW, GetFileInformationByHandle, FILE_ATTRIBUTE_NORMAL,
FILE_GENERIC_READ, FILE_SHARE_READ, OPEN_EXISTING,
},
},
};
let path = path.to_path_buf();
return tokio::task::spawn_blocking(move || unsafe {
let handle = CreateFileW(
PWSTR(
path.as_os_str()
.encode_wide()
.chain(std::iter::once(0))
.collect::<Vec<_>>()
.as_mut_ptr(),
),
FILE_GENERIC_READ.0,
FILE_SHARE_READ,
None,
OPEN_EXISTING,
FILE_ATTRIBUTE_NORMAL,
None,
)?;
let mut info =
windows::Win32::Storage::FileSystem::BY_HANDLE_FILE_INFORMATION::default();
let res = GetFileInformationByHandle(handle, &mut info);
CloseHandle(handle)?;
res?;
Ok(info.nNumberOfLinks as u64)
})
.await
.unwrap();
}
#[cfg(not(any(unix, windows)))]
{
compile_error!("unsupported platform");
}
anyhow::bail!("unsupported platform")
}
#[derive(Debug)]
struct ExtendJoinSet<T: Send + 'static>(JoinSet<T>);
impl<T: Send + 'static, F: Future<Output = T> + Send + 'static> Extend<F> for ExtendJoinSet<T> {
fn extend<I: IntoIterator<Item = F>>(&mut self, iter: I) {
for item in iter {
self.0.spawn(item);
}
}
}
impl<T: Send + 'static> Default for ExtendJoinSet<T> {
fn default() -> Self {
Self(JoinSet::new())
}
}
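`ExtendJoinSet` exists so a stream of futures can be collected straight into spawned tasks: `StreamExt::collect` only requires `Default + Extend`, and this `Extend` impl spawns each future as it arrives. A usage sketch building on the definition above:

```rust
use futures::{stream, StreamExt as _};

#[tokio::main]
async fn main() {
	// every mapped future is spawned the moment collect() sees it
	let mut tasks = stream::iter(1..=4)
		.map(|n| async move { n * n })
		.collect::<ExtendJoinSet<i32>>()
		.await
		.0;
	while let Some(result) = tasks.join_next().await {
		println!("{}", result.unwrap());
	}
}
```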
async fn discover_cas_packages(cas_dir: &Path) -> anyhow::Result<HashMap<PathBuf, PackageFs>> {
fn read_entry(
entry: fs::DirEntry,
) -> BoxFuture<'static, anyhow::Result<HashMap<PathBuf, PackageFs>>> {
async move {
if entry
.metadata()
.await
.context("failed to read entry metadata")?
.is_dir()
{
let mut tasks = read_dir_stream(&entry.path())
.await
.context("failed to read entry directory")?
.map(|entry| async move {
read_entry(entry.context("failed to read inner cas index dir entry")?).await
})
.collect::<ExtendJoinSet<Result<_, anyhow::Error>>>()
.await
.0;
let mut res = HashMap::new();
while let Some(entry) = tasks.join_next().await {
res.extend(entry.unwrap()?);
}
return Ok(res);
}
let contents = fs::read_to_string(entry.path()).await?;
let fs = toml::from_str(&contents).context("failed to deserialize PackageFs")?;
Ok(HashMap::from([(entry.path(), fs)]))
}
.boxed()
}
let mut tasks = ["index", "wally_index", "git_index"]
.into_iter()
.map(|index| cas_dir.join(index))
.map(|index| async move {
let mut res = HashMap::new();
let tasks = match read_dir_stream(&index).await {
Ok(tasks) => tasks,
Err(e) if e.kind() == std::io::ErrorKind::NotFound => return Ok(res),
Err(e) => return Err(e).context("failed to read cas index directory"),
};
let mut tasks = tasks
.map(|entry| async move {
read_entry(entry.context("failed to read cas index dir entry")?).await
})
.collect::<ExtendJoinSet<Result<_, anyhow::Error>>>()
.await
.0;
while let Some(task) = tasks.join_next().await {
res.extend(task.unwrap()?);
}
Ok(res)
})
.collect::<JoinSet<Result<_, anyhow::Error>>>();
let mut cas_entries = HashMap::new();
while let Some(task) = tasks.join_next().await {
cas_entries.extend(task.unwrap()?);
}
Ok(cas_entries)
}
async fn remove_hashes(cas_dir: &Path) -> anyhow::Result<HashSet<String>> {
let mut res = HashSet::new();
let tasks = match read_dir_stream(cas_dir).await {
Ok(tasks) => tasks,
Err(e) if e.kind() == std::io::ErrorKind::NotFound => return Ok(res),
Err(e) => return Err(e).context("failed to read cas directory"),
};
let mut tasks = tasks
.map(|cas_entry| async move {
let cas_entry = cas_entry.context("failed to read cas dir entry")?;
let prefix = cas_entry.file_name();
let Some(prefix) = prefix.to_str() else {
return Ok(None);
};
// we only want hash directories
if prefix.len() != 2 {
return Ok(None);
}
let mut tasks = read_dir_stream(&cas_entry.path())
.await
.context("failed to read hash directory")?
.map(|hash_entry| {
let prefix = prefix.to_string();
async move {
let hash_entry = hash_entry.context("failed to read hash dir entry")?;
let hash = hash_entry.file_name();
let hash = hash.to_str().expect("non-UTF-8 hash").to_string();
let hash = format!("{prefix}{hash}");
let path = hash_entry.path();
let nlinks = get_nlinks(&path)
.await
.context("failed to count file usage")?;
if nlinks > 1 {
return Ok(None);
}
fs::remove_file(&path)
.await
.context("failed to remove unused file")?;
if let Some(parent) = path.parent() {
remove_empty_dir(parent).await?;
}
Ok(Some(hash))
}
})
.collect::<ExtendJoinSet<Result<_, anyhow::Error>>>()
.await
.0;
let mut removed_hashes = HashSet::new();
while let Some(removed_hash) = tasks.join_next().await {
let Some(hash) = removed_hash.unwrap()? else {
continue;
};
removed_hashes.insert(hash);
}
Ok(Some(removed_hashes))
})
.collect::<ExtendJoinSet<Result<_, anyhow::Error>>>()
.await
.0;
while let Some(removed_hashes) = tasks.join_next().await {
let Some(removed_hashes) = removed_hashes.unwrap()? else {
continue;
};
res.extend(removed_hashes);
}
Ok(res)
}
impl PruneCommand {
pub async fn run(self, project: Project) -> anyhow::Result<()> {
// CAS structure:
// /2 first chars of hash/rest of hash
// /index/hash/name/version/target
// /wally_index/hash/name/version
// /git_index/hash/hash
// the last thing in the path is the serialized PackageFs
let (cas_entries, removed_hashes) = run_with_reporter(|_, root_progress, _| async {
let root_progress = root_progress;
root_progress.reset();
root_progress.set_message("discover packages");
let cas_entries = discover_cas_packages(project.cas_dir()).await?;
root_progress.reset();
root_progress.set_message("remove unused files");
let removed_hashes = remove_hashes(project.cas_dir()).await?;
Ok::<_, anyhow::Error>((cas_entries, removed_hashes))
})
.await?;
let mut tasks = JoinSet::new();
let mut removed_packages = 0usize;
'entry: for (path, fs) in cas_entries {
let PackageFs::Cas(entries) = fs else {
continue;
};
for entry in entries.into_values() {
let FsEntry::File(hash) = entry else {
continue;
};
if removed_hashes.contains(&hash) {
let cas_dir = project.cas_dir().to_path_buf();
tasks.spawn(async move {
fs::remove_file(&path)
.await
.context("failed to remove unused file")?;
// remove empty directories up to the cas dir
let mut path = &*path;
while let Some(parent) = path.parent() {
if parent == cas_dir {
break;
}
remove_empty_dir(parent).await?;
path = parent;
}
Ok::<_, anyhow::Error>(())
});
removed_packages += 1;
// if at least one file is removed, the package is not used
continue 'entry;
}
}
}
while let Some(task) = tasks.join_next().await {
task.unwrap()?;
}
println!(
"{} removed {} unused packages and {} individual files!",
SUCCESS_STYLE.apply_to("done!"),
INFO_STYLE.apply_to(removed_packages),
INFO_STYLE.apply_to(removed_hashes.len())
);
Ok(())
}
}
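The whole pruning strategy rests on hard-link counts: installed packages hard-link their files out of the CAS, so a CAS file whose link count is back down to one is referenced by nothing but the CAS itself and can be deleted. On Unix the check is a one-liner, as in this sketch; the Windows branch above has to go through `GetFileInformationByHandle` to get the same number:

```rust
use std::path::Path;

#[cfg(unix)]
async fn is_unused(path: &Path) -> std::io::Result<bool> {
	use std::os::unix::fs::MetadataExt as _;
	// nlink == 1 means only the CAS entry itself still points at the inode
	Ok(tokio::fs::metadata(path).await?.nlink() == 1)
}
```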


@ -0,0 +1,100 @@
use crate::cli::{get_index, style::SUCCESS_STYLE};
use anyhow::Context as _;
use clap::Args;
use pesde::{
names::PackageName,
source::{
pesde::PesdePackageSource,
traits::{PackageSource as _, RefreshOptions},
},
Project,
};
use reqwest::{header::AUTHORIZATION, Method, StatusCode};
#[derive(Debug, Args)]
pub struct DeprecateCommand {
/// Whether to undeprecate the package
#[clap(long)]
undo: bool,
/// The index to deprecate the package in
#[clap(short, long)]
index: Option<String>,
/// The package to deprecate
#[clap(index = 1)]
package: PackageName,
/// The reason for deprecating the package
#[clap(index = 2, required_unless_present = "undo")]
reason: Option<String>,
}
impl DeprecateCommand {
pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
let index_url = get_index(&project, self.index.as_deref()).await?;
let source = PesdePackageSource::new(index_url.clone());
source
.refresh(&RefreshOptions {
project: project.clone(),
})
.await
.context("failed to refresh source")?;
let config = source
.config(&project)
.await
.context("failed to get index config")?;
let mut request = reqwest.request(
if self.undo {
Method::DELETE
} else {
Method::PUT
},
format!(
"{}/v1/packages/{}/deprecate",
config.api(),
urlencoding::encode(&self.package.to_string()),
),
);
if !self.undo {
request = request.body(
self.reason
.map(|reason| reason.trim().to_string())
.filter(|reason| !reason.is_empty())
.context("deprecating must have non-empty a reason")?,
);
}
if let Some(token) = project.auth_config().tokens().get(&index_url) {
tracing::debug!("using token for {index_url}");
request = request.header(AUTHORIZATION, token);
}
let response = request.send().await.context("failed to send request")?;
let status = response.status();
let text = response
.text()
.await
.context("failed to get response text")?;
let prefix = if self.undo { "un" } else { "" };
match status {
StatusCode::CONFLICT => {
anyhow::bail!("version is already {prefix}deprecated");
}
StatusCode::FORBIDDEN => {
anyhow::bail!("unauthorized to {prefix}deprecate under this scope");
}
code if !code.is_success() => {
anyhow::bail!("failed to {prefix}deprecate package: {code} ({text})");
}
_ => {
println!("{}", SUCCESS_STYLE.apply_to(text));
}
}
Ok(())
}
}
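The deprecate command is a thin client over the registry API: PUT with the reason as the body to deprecate, DELETE to undo, with the index token in the Authorization header. A hedged sketch of the request; the URL shape mirrors the code above, while the token handling and error reporting are simplified:

```rust
use reqwest::{header::AUTHORIZATION, Client, Method};

async fn set_deprecated(
	client: &Client,
	api: &str,
	package: &str,
	token: &str,
	reason: Option<&str>, // Some(..) deprecates, None undoes it
) -> anyhow::Result<()> {
	let method = if reason.is_some() { Method::PUT } else { Method::DELETE };
	let url = format!("{api}/v1/packages/{}/deprecate", urlencoding::encode(package));
	let mut request = client.request(method, url).header(AUTHORIZATION, token);
	if let Some(reason) = reason {
		request = request.body(reason.trim().to_string());
	}
	let response = request.send().await?;
	anyhow::ensure!(
		response.status().is_success(),
		"failed to update deprecation: {}",
		response.status()
	);
	Ok(())
}
```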


@ -1,22 +1,36 @@
use crate::cli::{config::read_config, progress_bar, VersionedPackageName}; use crate::cli::{
use anyhow::Context; config::read_config,
reporters::{self, CliReporter},
VersionedPackageName,
};
use anyhow::Context as _;
use clap::Args; use clap::Args;
use console::style;
use fs_err::tokio as fs; use fs_err::tokio as fs;
use indicatif::MultiProgress;
use pesde::{ use pesde::{
download_and_link::DownloadAndLinkOptions,
linking::generator::generate_bin_linking_module, linking::generator::generate_bin_linking_module,
manifest::target::TargetKind, manifest::target::TargetKind,
names::PackageName, names::{PackageName, PackageNames},
source::{ source::{
ids::PackageId,
pesde::{specifier::PesdeDependencySpecifier, PesdePackageSource}, pesde::{specifier::PesdeDependencySpecifier, PesdePackageSource},
traits::PackageSource, traits::{
DownloadOptions, GetTargetOptions, PackageSource as _, RefreshOptions, ResolveOptions,
}, },
Project, PackageSources,
},
Project, RefreshedSources, DEFAULT_INDEX_NAME,
}; };
use semver::VersionReq; use semver::VersionReq;
use std::{ use std::{
collections::HashSet, env::current_dir, ffi::OsString, io::Write, process::Command, sync::Arc, env::current_dir,
ffi::OsString,
io::{Stderr, Write as _},
process::Command,
sync::Arc,
}; };
use tokio::sync::Mutex;
#[derive(Debug, Args)] #[derive(Debug, Args)]
pub struct ExecuteCommand { pub struct ExecuteCommand {
@ -35,57 +49,71 @@ pub struct ExecuteCommand {
impl ExecuteCommand { impl ExecuteCommand {
pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> { pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
let multi_progress = MultiProgress::new();
crate::PROGRESS_BARS
.lock()
.unwrap()
.replace(multi_progress.clone());
let refreshed_sources = RefreshedSources::new();
let (tempdir, bin_path) = reporters::run_with_reporter_and_writer(
std::io::stderr(),
|multi_progress, root_progress, reporter| async {
let multi_progress = multi_progress;
let root_progress = root_progress;
root_progress.set_message("resolve");
let index = match self.index { let index = match self.index {
Some(index) => Some(index), Some(index) => Some(index),
None => read_config().await.ok().map(|c| c.default_index), None => read_config().await.ok().map(|c| c.default_index),
} }
.context("no index specified")?; .context("no index specified")?;
let source = PesdePackageSource::new(index); let source = PesdePackageSource::new(index);
source refreshed_sources
.refresh(&project) .refresh(
&PackageSources::Pesde(source.clone()),
&RefreshOptions {
project: project.clone(),
},
)
.await .await
.context("failed to refresh source")?; .context("failed to refresh source")?;
let version_req = self.package.1.unwrap_or(VersionReq::STAR); let version_req = self.package.1.unwrap_or(VersionReq::STAR);
let Some((version, pkg_ref)) = ('finder: { let Some((v_id, pkg_ref)) = source
let specifier = PesdeDependencySpecifier { .resolve(
&PesdeDependencySpecifier {
name: self.package.0.clone(), name: self.package.0.clone(),
version: version_req.clone(), version: version_req.clone(),
index: None, index: DEFAULT_INDEX_NAME.into(),
target: None, target: None,
}; },
&ResolveOptions {
if let Some(res) = source project: project.clone(),
.resolve(&specifier, &project, TargetKind::Lune, &mut HashSet::new()) target: TargetKind::Luau,
refreshed_sources: refreshed_sources.clone(),
loose_target: true,
},
)
.await .await
.context("failed to resolve package")? .context("failed to resolve package")?
.1 .1
.pop_last() .pop_last()
{ else {
break 'finder Some(res);
}
source
.resolve(&specifier, &project, TargetKind::Luau, &mut HashSet::new())
.await
.context("failed to resolve package")?
.1
.pop_last()
}) else {
anyhow::bail!( anyhow::bail!(
"no Lune or Luau package could be found for {}@{version_req}", "no compatible package could be found for {}@{version_req}",
self.package.0, self.package.0,
); );
}; };
println!("using {}@{version}", pkg_ref.name);
let tmp_dir = project.cas_dir().join(".tmp"); let tmp_dir = project.cas_dir().join(".tmp");
fs::create_dir_all(&tmp_dir) fs::create_dir_all(&tmp_dir)
.await .await
.context("failed to create temporary directory")?; .context("failed to create temporary directory")?;
let tempdir = let tempdir = tempfile::tempdir_in(tmp_dir)
tempfile::tempdir_in(tmp_dir).context("failed to create temporary directory")?; .context("failed to create temporary directory")?;
let project = Project::new( let project = Project::new(
tempdir.path(), tempdir.path(),
@ -95,49 +123,71 @@ impl ExecuteCommand {
project.auth_config().clone(), project.auth_config().clone(),
); );
let (fs, target) = source let id = Arc::new(PackageId::new(
.download(&pkg_ref, &project, &reqwest) PackageNames::Pesde(self.package.0.clone()),
v_id,
));
let fs = source
.download(
&pkg_ref,
&DownloadOptions {
project: project.clone(),
reqwest: reqwest.clone(),
reporter: Arc::new(()),
id: id.clone(),
},
)
.await .await
.context("failed to download package")?; .context("failed to download package")?;
let bin_path = target.bin_path().context("package has no binary export")?;
fs.write_to(tempdir.path(), project.cas_dir(), true) fs.write_to(tempdir.path(), project.cas_dir(), true)
.await .await
.context("failed to write package contents")?; .context("failed to write package contents")?;
let mut refreshed_sources = HashSet::new(); let target = source
.get_target(
&pkg_ref,
&GetTargetOptions {
project: project.clone(),
path: Arc::from(tempdir.path()),
id: id.clone(),
},
)
.await
.context("failed to get target")?;
let bin_path = target.bin_path().context("package has no binary export")?;
let graph = project let graph = project
.dependency_graph(None, &mut refreshed_sources, true) .dependency_graph(None, refreshed_sources.clone(), true)
.await .await
.context("failed to build dependency graph")?; .context("failed to build dependency graph")?;
let graph = Arc::new(graph);
let (rx, downloaded_graph) = project multi_progress.suspend(|| {
eprintln!("{}", style(format!("using {}", style(id).bold())).dim());
});
root_progress.reset();
root_progress.set_message("download");
root_progress.set_style(reporters::root_progress_style_with_progress());
project
.download_and_link( .download_and_link(
&graph, &Arc::new(graph),
&Arc::new(Mutex::new(refreshed_sources)), DownloadAndLinkOptions::<CliReporter<Stderr>, ()>::new(reqwest)
&reqwest, .reporter(reporter)
true, .refreshed_sources(refreshed_sources)
true, .prod(true),
|_| async { Ok::<_, std::io::Error>(()) },
) )
.await .await
.context("failed to download dependencies")?; .context("failed to download and link dependencies")?;
progress_bar( anyhow::Ok((tempdir, bin_path.to_relative_path_buf()))
graph.values().map(|versions| versions.len() as u64).sum(), },
rx,
"📥 ".to_string(),
"downloading dependencies".to_string(),
"downloaded dependencies".to_string(),
) )
.await?; .await?;
downloaded_graph
.await
.context("failed to download & link dependencies")?;
let mut caller = let mut caller =
tempfile::NamedTempFile::new_in(tempdir.path()).context("failed to create tempfile")?; tempfile::NamedTempFile::new_in(tempdir.path()).context("failed to create tempfile")?;
caller caller
@ -162,6 +212,6 @@ impl ExecuteCommand {
drop(caller); drop(caller);
drop(tempdir); drop(tempdir);
std::process::exit(status.code().unwrap_or(1)) std::process::exit(status.code().unwrap_or(1i32))
} }
} }


@ -1,25 +1,29 @@
use crate::cli::config::read_config; use crate::cli::{
use anyhow::Context; config::read_config,
style::{ERROR_PREFIX, INFO_STYLE, SUCCESS_STYLE},
};
use anyhow::Context as _;
use clap::Args; use clap::Args;
use colored::Colorize;
use inquire::validator::Validation; use inquire::validator::Validation;
use pesde::{ use pesde::{
errors::ManifestReadError, errors::ManifestReadError,
manifest::{target::TargetKind, DependencyType}, manifest::{target::TargetKind, DependencyType},
names::PackageName, names::{PackageName, PackageNames},
source::{ source::{
git_index::GitBasedSource, git_index::GitBasedSource as _,
ids::PackageId,
pesde::{specifier::PesdeDependencySpecifier, PesdePackageSource}, pesde::{specifier::PesdeDependencySpecifier, PesdePackageSource},
specifiers::DependencySpecifiers, specifiers::DependencySpecifiers,
traits::PackageSource, traits::{GetTargetOptions, PackageSource as _, RefreshOptions, ResolveOptions},
PackageSources,
}, },
Project, DEFAULT_INDEX_NAME, SCRIPTS_LINK_FOLDER, Project, RefreshedSources, DEFAULT_INDEX_NAME, SCRIPTS_LINK_FOLDER,
}; };
use semver::VersionReq; use semver::VersionReq;
use std::{collections::HashSet, fmt::Display, str::FromStr}; use std::{fmt::Display, path::Path, str::FromStr as _, sync::Arc};
#[derive(Debug, Args)] #[derive(Debug, Args)]
pub struct InitCommand {} pub struct InitCommand;
#[derive(Debug)] #[derive(Debug)]
enum PackageNameOrCustom { enum PackageNameOrCustom {
@ -40,12 +44,11 @@ impl InitCommand {
pub async fn run(self, project: Project) -> anyhow::Result<()> { pub async fn run(self, project: Project) -> anyhow::Result<()> {
match project.read_manifest().await { match project.read_manifest().await {
Ok(_) => { Ok(_) => {
println!("{}", "project already initialized".red()); anyhow::bail!("project already initialized");
return Ok(());
} }
Err(ManifestReadError::Io(e)) if e.kind() == std::io::ErrorKind::NotFound => {} Err(ManifestReadError::Io(e)) if e.kind() == std::io::ErrorKind::NotFound => {}
Err(e) => return Err(e.into()), Err(e) => return Err(e.into()),
}; }
let mut manifest = toml_edit::DocumentMut::new(); let mut manifest = toml_edit::DocumentMut::new();
@ -128,13 +131,19 @@ impl InitCommand {
manifest["indices"].or_insert(toml_edit::Item::Table(toml_edit::Table::new())) manifest["indices"].or_insert(toml_edit::Item::Table(toml_edit::Table::new()))
[DEFAULT_INDEX_NAME] = toml_edit::value(source.repo_url().to_bstring().to_string()); [DEFAULT_INDEX_NAME] = toml_edit::value(source.repo_url().to_bstring().to_string());
let refreshed_sources = RefreshedSources::new();
if target_env.is_roblox() if target_env.is_roblox()
|| inquire::prompt_confirmation( || inquire::prompt_confirmation("would you like to setup Roblox compatibility scripts?")
"would you like to setup default Roblox compatibility scripts?",
)
.unwrap() .unwrap()
{ {
PackageSource::refresh(&source, &project) refreshed_sources
.refresh(
&PackageSources::Pesde(source.clone()),
&RefreshOptions {
project: project.clone(),
},
)
.await .await
.context("failed to refresh package source")?; .context("failed to refresh package source")?;
let config = source let config = source
@ -188,14 +197,17 @@ impl InitCommand {
let (v_id, pkg_ref) = source let (v_id, pkg_ref) = source
.resolve( .resolve(
&PesdeDependencySpecifier { &PesdeDependencySpecifier {
name: scripts_pkg_name, name: scripts_pkg_name.clone(),
version: VersionReq::STAR, version: VersionReq::STAR,
index: None, index: DEFAULT_INDEX_NAME.into(),
target: None, target: None,
}, },
&project, &ResolveOptions {
TargetKind::Lune, project: project.clone(),
&mut HashSet::new(), target: TargetKind::Luau,
refreshed_sources,
loose_target: true,
},
) )
.await .await
.context("failed to resolve scripts package")? .context("failed to resolve scripts package")?
@ -203,12 +215,26 @@ impl InitCommand {
.pop_last() .pop_last()
.context("scripts package not found")?; .context("scripts package not found")?;
let Some(scripts) = pkg_ref.target.scripts().filter(|s| !s.is_empty()) else { let id = Arc::new(PackageId::new(PackageNames::Pesde(scripts_pkg_name), v_id));
anyhow::bail!("scripts package has no scripts. this is an issue with the index")
let target = source
.get_target(
&pkg_ref,
&GetTargetOptions {
project: project.clone(),
// HACK: the pesde package source doesn't use the path, so we can just use an empty one
path: Arc::from(Path::new("")),
id: id.clone(),
},
)
.await?;
let Some(scripts) = target.scripts().filter(|s| !s.is_empty()) else {
anyhow::bail!("scripts package has no scripts.")
}; };
let scripts_field = &mut manifest["scripts"] let scripts_field =
.or_insert(toml_edit::Item::Table(toml_edit::Table::new())); manifest["scripts"].or_insert(toml_edit::Item::Table(toml_edit::Table::new()));
for script_name in scripts.keys() { for script_name in scripts.keys() {
scripts_field[script_name] = toml_edit::value(format!( scripts_field[script_name] = toml_edit::value(format!(
@ -216,13 +242,13 @@ impl InitCommand {
)); ));
} }
let dev_deps = &mut manifest["dev_dependencies"] let dev_deps = manifest["dev_dependencies"]
.or_insert(toml_edit::Item::Table(toml_edit::Table::new())); .or_insert(toml_edit::Item::Table(toml_edit::Table::new()));
let field = &mut dev_deps["scripts"]; let field = &mut dev_deps["scripts"];
field["name"] = toml_edit::value(pkg_ref.name.to_string()); field["name"] = toml_edit::value(id.name().to_string());
field["version"] = toml_edit::value(format!("^{}", v_id.version())); field["version"] = toml_edit::value(format!("^{}", id.version_id().version()));
field["target"] = toml_edit::value(v_id.target().to_string()); field["target"] = toml_edit::value(id.version_id().target().to_string());
for (alias, (spec, ty)) in pkg_ref.dependencies { for (alias, (spec, ty)) in pkg_ref.dependencies {
if ty != DependencyType::Peer { if ty != DependencyType::Peer {
@ -233,16 +259,18 @@ impl InitCommand {
continue; continue;
}; };
let field = &mut dev_deps[alias]; let field = &mut dev_deps[alias.as_str()];
field["name"] = toml_edit::value(spec.name.to_string()); field["name"] = toml_edit::value(spec.name.to_string());
field["version"] = toml_edit::value(spec.version.to_string()); field["version"] = toml_edit::value(spec.version.to_string());
field["target"] = field["target"] = toml_edit::value(
toml_edit::value(spec.target.unwrap_or_else(|| *v_id.target()).to_string()); spec.target
.unwrap_or_else(|| id.version_id().target())
.to_string(),
);
} }
} else { } else {
println!( println!(
"{}", "{ERROR_PREFIX}: no scripts package configured, this can cause issues with Roblox compatibility"
"no scripts package configured, this can cause issues with Roblox compatibility".red()
); );
if !inquire::prompt_confirmation("initialize regardless?").unwrap() { if !inquire::prompt_confirmation("initialize regardless?").unwrap() {
return Ok(()); return Ok(());
@ -254,8 +282,8 @@ impl InitCommand {
println!( println!(
"{}\n{}: run `install` to fully finish setup", "{}\n{}: run `install` to fully finish setup",
"initialized project".green(), SUCCESS_STYLE.apply_to("initialized project"),
"tip".cyan().bold() INFO_STYLE.apply_to("tip")
); );
Ok(()) Ok(())
} }


@ -1,20 +1,10 @@
use crate::cli::{ use crate::cli::{
bin_dir, files::make_executable, progress_bar, run_on_workspace_members, up_to_date_lockfile, install::{install, InstallOptions},
run_on_workspace_members,
}; };
use anyhow::Context;
use clap::Args; use clap::Args;
use colored::{ColoredString, Colorize}; use pesde::Project;
use fs_err::tokio as fs; use std::num::NonZeroUsize;
use futures::future::try_join_all;
use pesde::{
download_and_link::filter_graph, lockfile::Lockfile, manifest::target::TargetKind, Project,
MANIFEST_FILE_NAME,
};
use std::{
collections::{BTreeSet, HashMap, HashSet},
sync::Arc,
};
use tokio::sync::Mutex;
#[derive(Debug, Args, Copy, Clone)] #[derive(Debug, Args, Copy, Clone)]
pub struct InstallCommand { pub struct InstallCommand {
@ -25,303 +15,40 @@ pub struct InstallCommand {
/// Whether to not install dev dependencies /// Whether to not install dev dependencies
#[arg(long)] #[arg(long)]
prod: bool, prod: bool,
}
fn bin_link_file(alias: &str) -> String { /// The maximum number of concurrent network requests
let mut all_combinations = BTreeSet::new(); #[arg(long, default_value = "16")]
network_concurrency: NonZeroUsize,
for a in TargetKind::VARIANTS { /// Whether to re-install all dependencies even if they are already installed
for b in TargetKind::VARIANTS { #[arg(long)]
all_combinations.insert((a, b)); force: bool,
}
}
let all_folders = all_combinations
.into_iter()
.map(|(a, b)| format!("{:?}", a.packages_folder(b)))
.collect::<BTreeSet<_>>()
.into_iter()
.collect::<Vec<_>>()
.join(", ");
format!(
r#"local process = require("@lune/process")
local fs = require("@lune/fs")
local stdio = require("@lune/stdio")
local project_root = process.cwd
local path_components = string.split(string.gsub(project_root, "\\", "/"), "/")
for i = #path_components, 1, -1 do
local path = table.concat(path_components, "/", 1, i)
if fs.isFile(path .. "/{MANIFEST_FILE_NAME}") then
project_root = path
break
end
end
for _, packages_folder in {{ {all_folders} }} do
local path = `{{project_root}}/{{packages_folder}}/{alias}.bin.luau`
if fs.isFile(path) then
require(path)
return
end
end
stdio.ewrite(stdio.color("red") .. "binary `{alias}` not found. are you in the right directory?" .. stdio.color("reset") .. "\n")
"#,
)
}
#[cfg(feature = "patches")]
const JOBS: u8 = 5;
#[cfg(not(feature = "patches"))]
const JOBS: u8 = 4;
fn job(n: u8) -> ColoredString {
format!("[{n}/{JOBS}]").dimmed().bold()
} }
#[derive(Debug, thiserror::Error)] #[derive(Debug, thiserror::Error)]
#[error(transparent)] #[error(transparent)]
struct CallbackError(#[from] anyhow::Error); struct CallbackError(#[from] anyhow::Error);
impl InstallCommand { impl InstallCommand {
pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> { pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
let mut refreshed_sources = HashSet::new(); let options = InstallOptions {
locked: self.locked,
let manifest = project prod: self.prod,
.deser_manifest() write: true,
.await network_concurrency: self.network_concurrency,
.context("failed to read manifest")?; use_lockfile: true,
force: self.force,
let lockfile = if self.locked {
match up_to_date_lockfile(&project).await? {
None => {
anyhow::bail!(
"lockfile is out of sync, run `{} install` to update it",
env!("CARGO_BIN_NAME")
);
}
file => file,
}
} else {
match project.deser_lockfile().await {
Ok(lockfile) => {
if lockfile.overrides != manifest.overrides {
tracing::debug!("overrides are different");
None
} else if lockfile.target != manifest.target.kind() {
tracing::debug!("target kind is different");
None
} else {
Some(lockfile)
}
}
Err(pesde::errors::LockfileReadError::Io(e))
if e.kind() == std::io::ErrorKind::NotFound =>
{
None
}
Err(e) => return Err(e.into()),
}
}; };
println!( install(&options, &project, reqwest.clone(), true).await?;
"\n{}\n",
format!("[now installing {} {}]", manifest.name, manifest.target)
.bold()
.on_bright_black()
);
println!("{} ❌ removing current package folders", job(1)); run_on_workspace_members(&project, |project| {
{
let mut deleted_folders = HashMap::new();
for target_kind in TargetKind::VARIANTS {
let folder = manifest.target.kind().packages_folder(target_kind);
let package_dir = project.package_dir();
deleted_folders
.entry(folder.to_string())
.or_insert_with(|| async move {
tracing::debug!("deleting the {folder} folder");
if let Some(e) = fs::remove_dir_all(package_dir.join(&folder))
.await
.err()
.filter(|e| e.kind() != std::io::ErrorKind::NotFound)
{
return Err(e).context(format!("failed to remove the {folder} folder"));
};
Ok(())
});
}
try_join_all(deleted_folders.into_values())
.await
.context("failed to remove package folders")?;
}
let old_graph = lockfile.map(|lockfile| {
lockfile
.graph
.into_iter()
.map(|(name, versions)| {
(
name,
versions
.into_iter()
.map(|(version, node)| (version, node.node))
.collect(),
)
})
.collect()
});
println!("{} 📦 building dependency graph", job(2));
let graph = project
.dependency_graph(old_graph.as_ref(), &mut refreshed_sources, false)
.await
.context("failed to build dependency graph")?;
let graph = Arc::new(graph);
let bin_folder = bin_dir().await?;
let downloaded_graph = {
let (rx, downloaded_graph) = project
.download_and_link(
&graph,
&Arc::new(Mutex::new(refreshed_sources)),
&reqwest,
self.prod,
true,
|graph| {
let graph = graph.clone();
async move {
try_join_all(
graph
.values()
.flat_map(|versions| versions.values())
.filter(|node| node.target.bin_path().is_some())
.filter_map(|node| node.node.direct.as_ref())
.map(|(alias, _, _)| alias)
.filter(|alias| {
if *alias == env!("CARGO_BIN_NAME") {
tracing::warn!(
"package {alias} has the same name as the CLI, skipping bin link"
);
return false;
}
true
})
.map(|alias| {
let bin_folder = bin_folder.clone();
async move {
let bin_exec_file = bin_folder.join(alias).with_extension(std::env::consts::EXE_EXTENSION);
let impl_folder = bin_folder.join(".impl");
fs::create_dir_all(&impl_folder).await.context("failed to create bin link folder")?;
let bin_file = impl_folder.join(alias).with_extension("luau");
fs::write(&bin_file, bin_link_file(alias))
.await
.context("failed to write bin link file")?;
#[cfg(windows)]
{
fs::copy(
std::env::current_exe()
.context("failed to get current executable path")?,
&bin_exec_file,
)
.await
.context("failed to copy bin link file")?;
}
#[cfg(not(windows))]
{
fs::write(
&bin_exec_file,
format!(r#"#!/bin/sh
exec lune run "$(dirname "$0")/.impl/{alias}.luau" -- "$@""#
),
)
.await
.context("failed to link bin link file")?;
}
make_executable(&bin_exec_file).await.context("failed to make bin link file executable")?;
Ok::<_, CallbackError>(())
}
}),
)
.await
.map(|_| ())
}
}
)
.await
.context("failed to download dependencies")?;
progress_bar(
graph.values().map(|versions| versions.len() as u64).sum(),
rx,
format!("{} 📥 ", job(3)),
"downloading dependencies".to_string(),
"downloaded dependencies".to_string(),
)
.await?;
downloaded_graph
.await
.context("failed to download & link dependencies")?
};
#[cfg(feature = "patches")]
{
let rx = project
.apply_patches(&filter_graph(&downloaded_graph, self.prod))
.await
.context("failed to apply patches")?;
progress_bar(
manifest.patches.values().map(|v| v.len() as u64).sum(),
rx,
format!("{} 🩹 ", job(JOBS - 1)),
"applying patches".to_string(),
"applied patches".to_string(),
)
.await?;
}
println!("{} 🧹 finishing up", job(JOBS));
project
.write_lockfile(Lockfile {
name: manifest.name,
version: manifest.version,
target: manifest.target.kind(),
overrides: manifest.overrides,
graph: downloaded_graph,
workspace: run_on_workspace_members(&project, |project| {
let reqwest = reqwest.clone(); let reqwest = reqwest.clone();
async move { Box::pin(self.run(project, reqwest)).await } async move {
install(&options, &project, reqwest, false).await?;
Ok(())
}
}) })
.await?, .await?;
})
.await
.context("failed to write lockfile")?;
Ok(()) Ok(())
} }
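
Note: the command now just assembles an options struct and hands off to a shared install routine, so the workspace root and each member run the same pipeline instead of recursing through InstallCommand::run via Box::pin. A rough sketch of that shape, with signatures paraphrased from this diff rather than copied from pesde's API:

    use std::num::NonZeroUsize;

    /// Knobs shared by install-like commands (paraphrased from the diff).
    #[derive(Debug, Clone, Copy)]
    struct InstallOptions {
        locked: bool,
        prod: bool,
        write: bool,
        use_lockfile: bool,
        force: bool,
        network_concurrency: NonZeroUsize,
    }

    /// Single shared entry point; the boolean mirrors the diff's last
    /// argument, which distinguishes the workspace root from members.
    async fn install(options: &InstallOptions, is_root: bool) -> anyhow::Result<()> {
        // resolve -> download -> link would live here
        println!("installing (root: {is_root}) with {options:?}");
        Ok(())
    }

    #[tokio::main]
    async fn main() -> anyhow::Result<()> {
        let options = InstallOptions {
            locked: false,
            prod: false,
            write: true,
            use_lockfile: true,
            force: false,
            network_concurrency: NonZeroUsize::new(16).unwrap(),
        };
        install(&options, true).await?; // root first
        install(&options, false).await // then each workspace member
    }
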

src/cli/commands/list.rs Normal file (+51)

@ -0,0 +1,51 @@
use std::collections::BTreeMap;
use anyhow::Context as _;
use clap::Args;
use crate::cli::{
dep_type_to_key,
style::{INFO_STYLE, SUCCESS_STYLE},
};
use pesde::{
manifest::{Alias, DependencyType},
source::specifiers::DependencySpecifiers,
Project,
};
#[derive(Debug, Args)]
pub struct ListCommand;
impl ListCommand {
pub async fn run(self, project: Project) -> anyhow::Result<()> {
let manifest = project
.deser_manifest()
.await
.context("failed to read manifest")?;
let all_deps = manifest
.all_dependencies()
.context("failed to get all dependencies")?
.into_iter()
.fold(
BTreeMap::<DependencyType, BTreeMap<Alias, DependencySpecifiers>>::new(),
|mut acc, (alias, (spec, ty))| {
acc.entry(ty).or_default().insert(alias, spec);
acc
},
);
for (dep_ty, deps) in all_deps {
let dep_key = dep_type_to_key(dep_ty);
println!("{}", INFO_STYLE.apply_to(dep_key));
for (alias, spec) in deps {
println!("{}: {spec}", SUCCESS_STYLE.apply_to(alias));
}
println!();
}
Ok(())
}
}
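
Note: the interesting bit in this new file is the fold that groups the flat alias → (spec, type) map into per-type buckets before printing. The same pattern with plain strings standing in for pesde's Alias, DependencyType, and specifier types:

    use std::collections::BTreeMap;

    fn main() {
        let all_deps = [
            ("scripts", ("dev", "scope/scripts ^0.1.0")),
            ("task", ("standard", "scope/task ^1.0.0")),
            ("log", ("standard", "scope/log ^0.3.2")),
        ];

        // Two-level BTreeMap keeps both the dependency kinds and the
        // aliases inside each kind sorted, so output order is stable.
        let grouped = all_deps.into_iter().fold(
            BTreeMap::<&str, BTreeMap<&str, &str>>::new(),
            |mut acc, (alias, (ty, spec))| {
                acc.entry(ty).or_default().insert(alias, spec);
                acc
            },
        );

        for (ty, deps) in grouped {
            println!("{ty}");
            for (alias, spec) in deps {
                println!("{alias}: {spec}");
            }
            println!();
        }
    }
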

src/cli/commands/mod.rs

@ -2,22 +2,27 @@ use pesde::Project;
mod add; mod add;
mod auth; mod auth;
mod cas;
mod config; mod config;
mod deprecate;
mod execute; mod execute;
mod init; mod init;
mod install; mod install;
mod list;
mod outdated; mod outdated;
#[cfg(feature = "patches")] #[cfg(feature = "patches")]
mod patch; mod patch;
#[cfg(feature = "patches")] #[cfg(feature = "patches")]
mod patch_commit; mod patch_commit;
mod publish; mod publish;
mod remove;
mod run; mod run;
#[cfg(feature = "version-management")] #[cfg(feature = "version-management")]
mod self_install; mod self_install;
#[cfg(feature = "version-management")] #[cfg(feature = "version-management")]
mod self_upgrade; mod self_upgrade;
mod update; mod update;
mod yank;
#[derive(Debug, clap::Subcommand)] #[derive(Debug, clap::Subcommand)]
pub enum Subcommand { pub enum Subcommand {
@ -28,21 +33,43 @@ pub enum Subcommand {
#[command(subcommand)] #[command(subcommand)]
Config(config::ConfigCommands), Config(config::ConfigCommands),
/// CAS-related commands
#[command(subcommand)]
Cas(cas::CasCommands),
/// Initializes a manifest file in the current directory /// Initializes a manifest file in the current directory
Init(init::InitCommand), Init(init::InitCommand),
/// Adds a dependency to the project
Add(add::AddCommand),
/// Removes a dependency from the project
Remove(remove::RemoveCommand),
/// Installs all dependencies for the project
#[clap(name = "install", visible_alias = "i")]
Install(install::InstallCommand),
/// Updates the project's lockfile. Run install to apply changes
Update(update::UpdateCommand),
/// Checks for outdated dependencies
Outdated(outdated::OutdatedCommand),
/// Lists all dependencies in the project
List(list::ListCommand),
/// Runs a script, an executable package, or a file with Lune /// Runs a script, an executable package, or a file with Lune
Run(run::RunCommand), Run(run::RunCommand),
/// Installs all dependencies for the project
Install(install::InstallCommand),
/// Publishes the project to the registry /// Publishes the project to the registry
Publish(publish::PublishCommand), Publish(publish::PublishCommand),
/// Installs the pesde binary and scripts /// Yanks a package from the registry
#[cfg(feature = "version-management")] Yank(yank::YankCommand),
SelfInstall(self_install::SelfInstallCommand),
/// Deprecates a package from the registry
Deprecate(deprecate::DeprecateCommand),
/// Sets up a patching environment for a package /// Sets up a patching environment for a package
#[cfg(feature = "patches")] #[cfg(feature = "patches")]
@ -52,22 +79,17 @@ pub enum Subcommand {
#[cfg(feature = "patches")] #[cfg(feature = "patches")]
PatchCommit(patch_commit::PatchCommitCommand), PatchCommit(patch_commit::PatchCommitCommand),
/// Installs the latest version of pesde
#[cfg(feature = "version-management")]
SelfUpgrade(self_upgrade::SelfUpgradeCommand),
/// Adds a dependency to the project
Add(add::AddCommand),
/// Updates the project's lockfile. Run install to apply changes
Update(update::UpdateCommand),
/// Checks for outdated dependencies
Outdated(outdated::OutdatedCommand),
/// Executes a binary package without needing to be run in a project directory /// Executes a binary package without needing to be run in a project directory
#[clap(name = "x", visible_alias = "execute", visible_alias = "exec")] #[clap(name = "x", visible_alias = "execute", visible_alias = "exec")]
Execute(execute::ExecuteCommand), Execute(execute::ExecuteCommand),
/// Installs the pesde binary and scripts
#[cfg(feature = "version-management")]
SelfInstall(self_install::SelfInstallCommand),
/// Installs the latest version of pesde
#[cfg(feature = "version-management")]
SelfUpgrade(self_upgrade::SelfUpgradeCommand),
} }
impl Subcommand { impl Subcommand {
@ -75,22 +97,27 @@ impl Subcommand {
match self { match self {
Subcommand::Auth(auth) => auth.run(project, reqwest).await, Subcommand::Auth(auth) => auth.run(project, reqwest).await,
Subcommand::Config(config) => config.run().await, Subcommand::Config(config) => config.run().await,
Subcommand::Cas(cas) => cas.run(project).await,
Subcommand::Init(init) => init.run(project).await, Subcommand::Init(init) => init.run(project).await,
Subcommand::Run(run) => run.run(project).await, Subcommand::Add(add) => add.run(project).await,
Subcommand::Remove(remove) => remove.run(project).await,
Subcommand::Install(install) => install.run(project, reqwest).await, Subcommand::Install(install) => install.run(project, reqwest).await,
Subcommand::Update(update) => update.run(project, reqwest).await,
Subcommand::Outdated(outdated) => outdated.run(project).await,
Subcommand::List(list) => list.run(project).await,
Subcommand::Run(run) => run.run(project).await,
Subcommand::Publish(publish) => publish.run(project, reqwest).await, Subcommand::Publish(publish) => publish.run(project, reqwest).await,
#[cfg(feature = "version-management")] Subcommand::Yank(yank) => yank.run(project, reqwest).await,
Subcommand::SelfInstall(self_install) => self_install.run().await, Subcommand::Deprecate(deprecate) => deprecate.run(project, reqwest).await,
#[cfg(feature = "patches")] #[cfg(feature = "patches")]
Subcommand::Patch(patch) => patch.run(project, reqwest).await, Subcommand::Patch(patch) => patch.run(project, reqwest).await,
#[cfg(feature = "patches")] #[cfg(feature = "patches")]
Subcommand::PatchCommit(patch_commit) => patch_commit.run(project).await, Subcommand::PatchCommit(patch_commit) => patch_commit.run(project).await,
Subcommand::Execute(execute) => execute.run(project, reqwest).await,
#[cfg(feature = "version-management")]
Subcommand::SelfInstall(self_install) => self_install.run().await,
#[cfg(feature = "version-management")] #[cfg(feature = "version-management")]
Subcommand::SelfUpgrade(self_upgrade) => self_upgrade.run(reqwest).await, Subcommand::SelfUpgrade(self_upgrade) => self_upgrade.run(reqwest).await,
Subcommand::Add(add) => add.run(project).await,
Subcommand::Update(update) => update.run(project, reqwest).await,
Subcommand::Outdated(outdated) => outdated.run(project).await,
Subcommand::Execute(execute) => execute.run(project, reqwest).await,
} }
} }
} }
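
Note: the reordered Subcommand enum drives both --help ordering and dispatch; shorthands like `i` and `x` come from clap's visible_alias, repeated once per alias exactly as the diff does. A self-contained sketch of that mechanism, assuming clap 4 with the derive feature (the commands shown are a subset for illustration):

    use clap::{Parser, Subcommand};

    #[derive(Parser)]
    #[command(name = "demo")]
    struct Cli {
        #[command(subcommand)]
        subcommand: Cmd,
    }

    #[derive(Subcommand)]
    enum Cmd {
        /// Installs all dependencies for the project
        #[clap(name = "install", visible_alias = "i")]
        Install,
        /// Executes a binary package
        #[clap(name = "x", visible_alias = "execute", visible_alias = "exec")]
        Execute,
    }

    fn main() {
        // `demo i` and `demo install` parse to the same variant,
        // and the alias is listed in `demo --help`.
        match Cli::parse().subcommand {
            Cmd::Install => println!("install"),
            Cmd::Execute => println!("execute"),
        }
    }
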

src/cli/commands/outdated.rs

@ -1,19 +1,18 @@
use crate::cli::up_to_date_lockfile; use crate::cli::{
use anyhow::Context; style::{ADDED_STYLE, INFO_STYLE, REMOVED_STYLE, SUCCESS_STYLE},
up_to_date_lockfile,
};
use anyhow::Context as _;
use clap::Args; use clap::Args;
use futures::future::try_join_all;
use pesde::{ use pesde::{
refresh_sources,
source::{ source::{
refs::PackageRefs,
specifiers::DependencySpecifiers, specifiers::DependencySpecifiers,
traits::{PackageRef, PackageSource}, traits::{PackageRef as _, PackageSource as _, RefreshOptions, ResolveOptions},
}, },
Project, Project, RefreshedSources,
}; };
use semver::VersionReq; use semver::VersionReq;
use std::{collections::HashSet, sync::Arc}; use tokio::task::JoinSet;
use tokio::sync::Mutex;
#[derive(Debug, Args)] #[derive(Debug, Args)]
pub struct OutdatedCommand { pub struct OutdatedCommand {
@ -40,95 +39,95 @@ impl OutdatedCommand {
.context("failed to read manifest")?; .context("failed to read manifest")?;
let manifest_target_kind = manifest.target.kind(); let manifest_target_kind = manifest.target.kind();
let mut refreshed_sources = HashSet::new(); let refreshed_sources = RefreshedSources::new();
refresh_sources( let mut tasks = graph
&project,
graph
.iter()
.flat_map(|(_, versions)| versions.iter())
.map(|(_, node)| node.node.pkg_ref.source()),
&mut refreshed_sources,
)
.await?;
let refreshed_sources = Arc::new(Mutex::new(refreshed_sources));
if try_join_all(
graph
.into_iter() .into_iter()
.flat_map(|(_, versions)| versions.into_iter()) .map(|(current_id, node)| {
.map(|(current_version_id, node)| {
let project = project.clone(); let project = project.clone();
let refreshed_sources = refreshed_sources.clone(); let refreshed_sources = refreshed_sources.clone();
async move { async move {
let Some((alias, mut specifier, _)) = node.node.direct else { let Some((alias, mut specifier, _)) = node.direct else {
return Ok::<bool, anyhow::Error>(true); return Ok::<_, anyhow::Error>(None);
}; };
if matches!( if matches!(
specifier, specifier,
DependencySpecifiers::Git(_) | DependencySpecifiers::Workspace(_) DependencySpecifiers::Git(_)
| DependencySpecifiers::Workspace(_)
| DependencySpecifiers::Path(_)
) { ) {
return Ok(true); return Ok(None);
} }
let source = node.node.pkg_ref.source(); let source = node.pkg_ref.source();
refreshed_sources
.refresh(
&source,
&RefreshOptions {
project: project.clone(),
},
)
.await?;
if !self.strict { if !self.strict {
match specifier { match &mut specifier {
DependencySpecifiers::Pesde(ref mut spec) => { DependencySpecifiers::Pesde(spec) => {
spec.version = VersionReq::STAR; spec.version = VersionReq::STAR;
} }
#[cfg(feature = "wally-compat")] #[cfg(feature = "wally-compat")]
DependencySpecifiers::Wally(ref mut spec) => { DependencySpecifiers::Wally(spec) => {
spec.version = VersionReq::STAR; spec.version = VersionReq::STAR;
} }
DependencySpecifiers::Git(_) => {} DependencySpecifiers::Git(_) => {}
DependencySpecifiers::Workspace(_) => {} DependencySpecifiers::Workspace(_) => {}
}; DependencySpecifiers::Path(_) => {}
}
} }
let version_id = source let new_id = source
.resolve( .resolve(
&specifier, &specifier,
&project, &ResolveOptions {
manifest_target_kind, project: project.clone(),
&mut *refreshed_sources.lock().await, target: manifest_target_kind,
refreshed_sources: refreshed_sources.clone(),
loose_target: false,
},
) )
.await .await
.context("failed to resolve package versions")? .context("failed to resolve package versions")?
.1 .1
.pop_last() .pop_last()
.map(|(v_id, _)| v_id) .map(|(v_id, _)| v_id)
.context(format!("no versions of {specifier} found"))?; .with_context(|| format!("no versions of {specifier} found"))?;
Ok(Some((alias, current_id, new_id))
.filter(|(_, current_id, new_id)| current_id.version_id() != new_id))
}
})
.collect::<JoinSet<_>>();
let mut all_up_to_date = true;
while let Some(task) = tasks.join_next().await {
let Some((alias, current_id, new_id)) = task.unwrap()? else {
continue;
};
all_up_to_date = false;
if version_id != current_version_id {
println!( println!(
"{} {} ({alias}) {} -> {}", "{} ({}) {} → {}",
match node.node.pkg_ref { current_id.name(),
PackageRefs::Pesde(pkg_ref) => pkg_ref.name.to_string(), INFO_STYLE.apply_to(alias),
#[cfg(feature = "wally-compat")] REMOVED_STYLE.apply_to(current_id.version_id()),
PackageRefs::Wally(pkg_ref) => pkg_ref.name.to_string(), ADDED_STYLE.apply_to(new_id),
_ => unreachable!(),
},
current_version_id.target(),
current_version_id.version(),
version_id.version()
); );
return Ok(false);
} }
Ok(true) if all_up_to_date {
} println!("{}", SUCCESS_STYLE.apply_to("all packages are up to date"));
}),
)
.await?
.into_iter()
.all(|b| b)
{
println!("all packages are up to date");
} }
Ok(()) Ok(())
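
Note: the rewrite swaps futures::future::try_join_all for tokio::task::JoinSet, so each resolution runs as its own spawned task and results are consumed in completion order via join_next. A minimal sketch of the pattern (the per-package work is faked with a number):

    use tokio::task::JoinSet;

    #[tokio::main]
    async fn main() -> anyhow::Result<()> {
        // `collect::<JoinSet<_>>()` spawns one task per future.
        let mut tasks = (1u32..=4)
            .map(|n| async move {
                // stand-in for "resolve the latest version of package n"
                Ok::<_, anyhow::Error>(n * 10)
            })
            .collect::<JoinSet<_>>();

        // Results arrive as tasks finish, not in submission order.
        while let Some(joined) = tasks.join_next().await {
            // The outer layer is a JoinError (panic/cancellation); unwrap it
            // like the diff does, then propagate the task's own error.
            let resolved = joined.unwrap()?;
            println!("resolved {resolved}");
        }
        Ok(())
    }
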

src/cli/commands/patch.rs

@ -1,13 +1,18 @@
use crate::cli::{up_to_date_lockfile, VersionedPackageName}; use std::sync::Arc;
use anyhow::Context;
use crate::cli::{
style::{CLI_STYLE, INFO_STYLE, WARN_PREFIX},
up_to_date_lockfile, VersionedPackageName,
};
use anyhow::Context as _;
use clap::Args; use clap::Args;
use colored::Colorize; use console::style;
use fs_err::tokio as fs; use fs_err::tokio as fs;
use pesde::{ use pesde::{
patches::setup_patches_repo, patches::setup_patches_repo,
source::{ source::{
refs::PackageRefs, refs::PackageRefs,
traits::{PackageRef, PackageSource}, traits::{DownloadOptions, PackageRef as _, PackageSource as _},
}, },
Project, MANIFEST_FILE_NAME, Project, MANIFEST_FILE_NAME,
}; };
@ -27,31 +32,38 @@ impl PatchCommand {
anyhow::bail!("outdated lockfile, please run the install command first") anyhow::bail!("outdated lockfile, please run the install command first")
}; };
let (name, version_id) = self.package.get(&graph)?; let id = self.package.get(&graph)?;
let node = graph let node = graph.get(&id).context("package not found in graph")?;
.get(&name)
.and_then(|versions| versions.get(&version_id))
.context("package not found in graph")?;
if matches!(node.node.pkg_ref, PackageRefs::Workspace(_)) { if matches!(
anyhow::bail!("cannot patch a workspace package") node.pkg_ref,
PackageRefs::Workspace(_) | PackageRefs::Path(_)
) {
anyhow::bail!("cannot patch a workspace or a path package")
} }
let source = node.node.pkg_ref.source(); let source = node.pkg_ref.source();
let directory = project let directory = project
.data_dir() .data_dir()
.join("patches") .join("patches")
.join(name.escaped()) .join(id.name().escaped())
.join(version_id.escaped()) .join(id.version_id().escaped())
.join(chrono::Utc::now().timestamp().to_string()); .join(jiff::Timestamp::now().as_second().to_string());
fs::create_dir_all(&directory).await?; fs::create_dir_all(&directory).await?;
source source
.download(&node.node.pkg_ref, &project, &reqwest) .download(
&node.pkg_ref,
&DownloadOptions {
project: project.clone(),
reqwest,
reporter: Arc::new(()),
id: Arc::new(id),
},
)
.await? .await?
.0
.write_to(&directory, project.cas_dir(), false) .write_to(&directory, project.cas_dir(), false)
.await .await
.context("failed to write package contents")?; .context("failed to write package contents")?;
@ -59,17 +71,13 @@ impl PatchCommand {
setup_patches_repo(&directory)?; setup_patches_repo(&directory)?;
println!( println!(
concat!( r"done! modify the files in the directory, then run {} {}{} to apply.
"done! modify the files in the directory, then run `", {WARN_PREFIX}: do not commit these changes
env!("CARGO_BIN_NAME"), {}: the {MANIFEST_FILE_NAME} file will be ignored when patching",
r#" patch-commit {}` to apply. CLI_STYLE.apply_to(concat!("`", env!("CARGO_BIN_NAME"), " patch-commit")),
{}: do not commit these changes style(format!("'{}'", directory.display())).cyan().bold(),
{}: the {} file will be ignored when patching"# CLI_STYLE.apply_to("`"),
), INFO_STYLE.apply_to("note")
directory.display().to_string().bold().cyan(),
"warning".yellow(),
"note".blue(),
MANIFEST_FILE_NAME
); );
open::that(directory)?; open::that(directory)?;
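
Note: the patch workspace directory is now stamped with jiff instead of chrono. The replacement is call-for-call; a tiny sketch assuming the jiff crate:

    fn main() {
        // Same value chrono::Utc::now().timestamp() used to provide:
        // whole seconds since the Unix epoch, used here as a directory name.
        let stamp = jiff::Timestamp::now().as_second().to_string();
        println!("patches/example_pkg/1.0.0/{stamp}");
    }
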

src/cli/commands/patch_commit.rs

@ -1,9 +1,14 @@
use crate::cli::up_to_date_lockfile; use crate::cli::up_to_date_lockfile;
use anyhow::Context; use anyhow::Context as _;
use clap::Args; use clap::Args;
use fs_err::tokio as fs; use fs_err::tokio as fs;
use pesde::{names::PackageNames, patches::create_patch, source::version_id::VersionId, Project}; use pesde::{
use std::{path::PathBuf, str::FromStr}; names::PackageNames,
patches::create_patch,
source::ids::{PackageId, VersionId},
Project,
};
use std::{path::PathBuf, str::FromStr as _};
#[derive(Debug, Args)] #[derive(Debug, Args)]
pub struct PatchCommitCommand { pub struct PatchCommitCommand {
@ -20,7 +25,7 @@ impl PatchCommitCommand {
anyhow::bail!("outdated lockfile, please run the install command first") anyhow::bail!("outdated lockfile, please run the install command first")
}; };
let (name, version_id) = ( let id = PackageId::new(
PackageNames::from_escaped( PackageNames::from_escaped(
self.directory self.directory
.parent() .parent()
@ -43,10 +48,7 @@ impl PatchCommitCommand {
)?, )?,
); );
graph graph.get(&id).context("package not found in graph")?;
.get(&name)
.and_then(|versions| versions.get(&version_id))
.context("package not found in graph")?;
let mut manifest = toml_edit::DocumentMut::from_str( let mut manifest = toml_edit::DocumentMut::from_str(
&project &project
@ -57,28 +59,26 @@ impl PatchCommitCommand {
.context("failed to parse manifest")?; .context("failed to parse manifest")?;
let patch = create_patch(&self.directory).context("failed to create patch")?; let patch = create_patch(&self.directory).context("failed to create patch")?;
fs::remove_dir_all(self.directory)
.await
.context("failed to remove patch directory")?;
let patches_dir = project.package_dir().join("patches"); let patches_dir = project.package_dir().join("patches");
fs::create_dir_all(&patches_dir) fs::create_dir_all(&patches_dir)
.await .await
.context("failed to create patches directory")?; .context("failed to create patches directory")?;
let patch_file_name = format!("{}-{}.patch", name.escaped(), version_id.escaped()); let patch_file_name = format!(
"{}-{}.patch",
id.name().escaped(),
id.version_id().escaped()
);
let patch_file = patches_dir.join(&patch_file_name); let patch_file = patches_dir.join(&patch_file_name);
if patch_file.exists() {
anyhow::bail!("patch file already exists: {}", patch_file.display());
}
fs::write(&patch_file, patch) fs::write(&patch_file, patch)
.await .await
.context("failed to write patch file")?; .context("failed to write patch file")?;
manifest["patches"].or_insert(toml_edit::Item::Table(toml_edit::Table::new())) manifest["patches"].or_insert(toml_edit::Item::Table(toml_edit::Table::new()))
[&name.to_string()][&version_id.to_string()] = [&id.name().to_string()][&id.version_id().to_string()] =
toml_edit::value(format!("patches/{patch_file_name}")); toml_edit::value(format!("patches/{patch_file_name}"));
project project
@ -86,6 +86,10 @@ impl PatchCommitCommand {
.await .await
.context("failed to write manifest")?; .context("failed to write manifest")?;
fs::remove_dir_all(self.directory)
.await
.context("failed to remove patch directory")?;
println!(concat!( println!(concat!(
"done! run `", "done! run `",
env!("CARGO_BIN_NAME"), env!("CARGO_BIN_NAME"),

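Note: this file leans on the same toml_edit idiom as the init changes earlier: or_insert to create a missing table, then chained indexing with dynamic string keys, all while preserving the manifest's formatting. A standalone sketch (the key values are made up):

    use std::str::FromStr as _;

    fn main() -> Result<(), toml_edit::TomlError> {
        let mut manifest = toml_edit::DocumentMut::from_str("name = \"demo\"\n")?;

        let name = "scope_pkg"; // stand-in for id.name().escaped()
        let version = "1.0.0";  // stand-in for id.version_id().escaped()

        // Create [patches] if absent, then index two levels deep;
        // intermediate tables are created implicitly on assignment.
        manifest["patches"].or_insert(toml_edit::Item::Table(toml_edit::Table::new()))
            [name][version] = toml_edit::value(format!("patches/{name}-{version}.patch"));

        println!("{manifest}");
        Ok(())
    }
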
src/cli/commands/publish.rs

@ -1,32 +1,44 @@
use crate::cli::{display_err, run_on_workspace_members, up_to_date_lockfile}; use crate::cli::{
use anyhow::Context; display_err, run_on_workspace_members,
style::{ERROR_PREFIX, ERROR_STYLE, SUCCESS_STYLE, WARN_PREFIX},
up_to_date_lockfile,
};
use anyhow::Context as _;
use async_compression::Level; use async_compression::Level;
use clap::Args; use clap::Args;
use colored::Colorize; use console::style;
use fs_err::tokio as fs; use fs_err::tokio as fs;
#[allow(deprecated)]
use pesde::{ use pesde::{
manifest::{target::Target, DependencyType}, manifest::{target::Target, DependencyType},
matching_globs_old_behaviour, matching_globs,
scripts::ScriptName, scripts::ScriptName,
source::{ source::{
git_index::GitBasedSource,
pesde::{specifier::PesdeDependencySpecifier, PesdePackageSource}, pesde::{specifier::PesdeDependencySpecifier, PesdePackageSource},
specifiers::DependencySpecifiers, specifiers::DependencySpecifiers,
traits::PackageSource, traits::{
GetTargetOptions, PackageRef as _, PackageSource as _, RefreshOptions, ResolveOptions,
},
workspace::{ workspace::{
specifier::{VersionType, VersionTypeOrReq}, specifier::{VersionType, VersionTypeOrReq},
WorkspacePackageSource, WorkspacePackageSource,
}, },
IGNORED_DIRS, IGNORED_FILES, PackageSources, ADDITIONAL_FORBIDDEN_FILES, IGNORED_DIRS, IGNORED_FILES,
}, },
Project, DEFAULT_INDEX_NAME, MANIFEST_FILE_NAME, Project, RefreshedSources, DEFAULT_INDEX_NAME, MANIFEST_FILE_NAME,
}; };
use relative_path::RelativePath;
use reqwest::{header::AUTHORIZATION, StatusCode}; use reqwest::{header::AUTHORIZATION, StatusCode};
use semver::VersionReq; use semver::VersionReq;
use std::{collections::HashSet, path::PathBuf}; use std::{
collections::{BTreeMap, BTreeSet},
path::PathBuf,
sync::Arc,
};
use tempfile::Builder; use tempfile::Builder;
use tokio::io::{AsyncSeekExt, AsyncWriteExt}; use tokio::{
io::{AsyncSeekExt as _, AsyncWriteExt as _},
task::JoinSet,
};
#[derive(Debug, Args, Clone)] #[derive(Debug, Args, Clone)]
pub struct PublishCommand { pub struct PublishCommand {
@ -41,14 +53,38 @@ pub struct PublishCommand {
/// The index to publish to /// The index to publish to
#[arg(short, long, default_value_t = DEFAULT_INDEX_NAME.to_string())] #[arg(short, long, default_value_t = DEFAULT_INDEX_NAME.to_string())]
index: String, index: String,
/// Whether to skip syntax validation
#[arg(long)]
no_verify: bool,
} }
impl PublishCommand { impl PublishCommand {
fn validate_luau_file(&self, name: &str, contents: &str) -> anyhow::Result<()> {
if self.no_verify {
return Ok(());
}
if let Err(err) = full_moon::parse(contents) {
eprintln!(
"{ERROR_PREFIX}: {name} is not a valid Luau file:\n{}",
err.into_iter()
.map(|err| format!("\t- {}", ERROR_STYLE.apply_to(err)))
.collect::<Vec<_>>()
.join("\n")
);
anyhow::bail!("failed to validate Luau file");
}
Ok(())
}
async fn run_impl( async fn run_impl(
self, self,
project: &Project, project: &Project,
reqwest: reqwest::Client, reqwest: reqwest::Client,
is_root: bool, refreshed_sources: &RefreshedSources,
) -> anyhow::Result<()> { ) -> anyhow::Result<()> {
let mut manifest = project let mut manifest = project
.deser_manifest() .deser_manifest()
@ -57,22 +93,26 @@ impl PublishCommand {
println!( println!(
"\n{}\n", "\n{}\n",
format!("[now publishing {} {}]", manifest.name, manifest.target) style(format!(
"[now publishing {} {}]",
manifest.name, manifest.target
))
.bold() .bold()
.on_bright_black() .on_color256(235)
); );
if manifest.private { if manifest.private {
if !is_root { println!(
println!("{}", "package is private, cannot publish".red().bold()); "{}",
} ERROR_STYLE.apply_to("package is private, refusing to publish")
);
return Ok(()); return Ok(());
} }
if manifest.target.lib_path().is_none() if manifest.target.lib_path().is_none()
&& manifest.target.bin_path().is_none() && manifest.target.bin_path().is_none()
&& manifest.target.scripts().is_none_or(|s| s.is_empty()) && manifest.target.scripts().is_none_or(BTreeMap::is_empty)
{ {
anyhow::bail!("no exports found in target"); anyhow::bail!("no exports found in target");
} }
@ -81,25 +121,67 @@ impl PublishCommand {
manifest.target, manifest.target,
Target::Roblox { .. } | Target::RobloxServer { .. } Target::Roblox { .. } | Target::RobloxServer { .. }
) { ) {
if manifest.target.build_files().is_none_or(|f| f.is_empty()) { if manifest.target.build_files().is_none_or(BTreeSet::is_empty) {
anyhow::bail!("no build files found in target"); anyhow::bail!("no build files found in target");
} }
match up_to_date_lockfile(project).await? { match up_to_date_lockfile(project).await? {
Some(lockfile) => { Some(lockfile) => {
if lockfile let mut tasks = lockfile
.graph .graph
.values() .iter()
.flatten() .filter(|(_, node)| node.direct.is_some())
.filter_map(|(_, node)| node.node.direct.as_ref().map(|_| node)) .map(|(id, node)| {
.any(|node| { let project = project.clone();
node.target.build_files().is_none() let container_folder = node.container_folder_from_project(
&& !matches!(node.node.resolved_ty, DependencyType::Dev) id,
&project,
manifest.target.kind(),
);
let id = Arc::new(id.clone());
let node = node.clone();
let refreshed_sources = refreshed_sources.clone();
async move {
let source = node.pkg_ref.source();
refreshed_sources
.refresh(
&source,
&RefreshOptions {
project: project.clone(),
},
)
.await
.context("failed to refresh source")?;
let target = source
.get_target(
&node.pkg_ref,
&GetTargetOptions {
project,
path: Arc::from(container_folder),
id,
},
)
.await?;
Ok::<_, anyhow::Error>(
target.build_files().is_none()
&& !matches!(node.resolved_ty, DependencyType::Dev),
)
}
}) })
{ .collect::<JoinSet<_>>();
while let Some(result) = tasks.join_next().await {
let result = result
.unwrap()
.context("failed to get target of dependency node")?;
if result {
anyhow::bail!("roblox packages may not depend on non-roblox packages"); anyhow::bail!("roblox packages may not depend on non-roblox packages");
} }
} }
}
None => { None => {
anyhow::bail!("outdated lockfile, please run the install command first") anyhow::bail!("outdated lockfile, please run the install command first")
} }
@ -118,8 +200,14 @@ impl PublishCommand {
let mut display_build_files: Vec<String> = vec![]; let mut display_build_files: Vec<String> = vec![];
let (lib_path, bin_path, scripts, target_kind) = ( let (lib_path, bin_path, scripts, target_kind) = (
manifest.target.lib_path().cloned(), manifest
manifest.target.bin_path().cloned(), .target
.lib_path()
.map(RelativePath::to_relative_path_buf),
manifest
.target
.bin_path()
.map(RelativePath::to_relative_path_buf),
manifest.target.scripts().cloned(), manifest.target.scripts().cloned(),
manifest.target.kind(), manifest.target.kind(),
); );
@ -130,20 +218,17 @@ impl PublishCommand {
_ => None, _ => None,
}; };
#[allow(deprecated)] let mut paths = matching_globs(
let mut paths = matching_globs_old_behaviour(
project.package_dir(), project.package_dir(),
manifest.includes.iter().map(|s| s.as_str()), manifest.includes.iter().map(String::as_str),
true, true,
false,
) )
.await .await
.context("failed to get included files")?; .context("failed to get included files")?;
if paths.insert(PathBuf::from(MANIFEST_FILE_NAME)) { if paths.insert(PathBuf::from(MANIFEST_FILE_NAME)) {
println!( println!("{WARN_PREFIX}: {MANIFEST_FILE_NAME} was not included, adding it");
"{}: {MANIFEST_FILE_NAME} was not included, adding it",
"warn".yellow().bold()
);
} }
if paths.iter().any(|p| p.starts_with(".git")) { if paths.iter().any(|p| p.starts_with(".git")) {
@ -156,30 +241,33 @@ impl PublishCommand {
"readme" | "readme.md" | "readme.txt" "readme" | "readme.md" | "readme.txt"
) )
}) { }) {
println!( println!("{WARN_PREFIX}: no README file included, consider adding one");
"{}: no README file included, consider adding one",
"warn".yellow().bold()
);
} }
if !paths.iter().any(|p| p.starts_with("docs")) { if !paths.iter().any(|p| p.starts_with("docs")) {
println!( println!("{WARN_PREFIX}: docs directory not included, consider adding one");
"{}: docs directory not included, consider adding one",
"warn".yellow().bold()
);
} }
for path in &paths { for path in &paths {
if path let Some(file_name) = path.file_name() else {
.file_name() continue;
.is_some_and(|n| n == "default.project.json") };
{
if ADDITIONAL_FORBIDDEN_FILES.contains(&file_name.to_string_lossy().as_ref()) {
if file_name == "default.project.json" {
anyhow::bail!( anyhow::bail!(
"default.project.json was included at `{}`, this should be generated by the {} script upon dependants installation", "default.project.json was included at `{}`, this should be generated by the {} script upon dependants installation",
path.display(), path.display(),
ScriptName::RobloxSyncConfigGenerator ScriptName::RobloxSyncConfigGenerator
); );
} }
anyhow::bail!(
"forbidden file {} was included at `{}`",
file_name.to_string_lossy(),
path.display()
);
}
} }
for ignored_path in IGNORED_FILES.iter().chain(IGNORED_DIRS.iter()) { for ignored_path in IGNORED_FILES.iter().chain(IGNORED_DIRS.iter()) {
@ -188,9 +276,9 @@ impl PublishCommand {
.any(|ct| ct == std::path::Component::Normal(ignored_path.as_ref())) .any(|ct| ct == std::path::Component::Normal(ignored_path.as_ref()))
}) { }) {
anyhow::bail!( anyhow::bail!(
r#"forbidden file {ignored_path} was included. r"forbidden file {ignored_path} was included.
info: if this was a toolchain manager's manifest file, do not include it due to it possibly messing with user scripts info: if this was a toolchain manager's manifest file, do not include it due to it possibly messing with user scripts
info: otherwise, the file was deemed unnecessary, if you don't understand why, please contact the maintainers"#, info: otherwise, the file was deemed unnecessary, if you don't understand why, please contact the maintainers",
); );
} }
} }
@ -217,25 +305,17 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
let export_path = export_path let export_path = export_path
.canonicalize() .canonicalize()
.context(format!("failed to canonicalize {name}"))?; .with_context(|| format!("failed to canonicalize {name}"))?;
if let Err(err) = full_moon::parse(&contents).map_err(|errs| { self.validate_luau_file(&format!("file at {name}"), &contents)?;
errs.into_iter()
.map(|err| err.to_string())
.collect::<Vec<_>>()
.join(", ")
}) {
anyhow::bail!("{name} is not a valid Luau file: {err}");
}
let first_part = relative_export_path let first_part = relative_export_path
.components() .components()
.next() .next()
.context(format!("{name} must contain at least one part"))?; .with_context(|| format!("{name} must contain at least one part"))?;
let first_part = match first_part { let relative_path::Component::Normal(first_part) = first_part else {
relative_path::Component::Normal(part) => part, anyhow::bail!("{name} must be within project directory");
_ => anyhow::bail!("{name} must be within project directory"),
}; };
if paths.insert( if paths.insert(
@ -244,20 +324,14 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
.unwrap() .unwrap()
.to_path_buf(), .to_path_buf(),
) { ) {
println!( println!("{WARN_PREFIX}: {name} was not included, adding {relative_export_path}");
"{}: {name} was not included, adding {relative_export_path}",
"warn".yellow().bold()
);
} }
if roblox_target if roblox_target
.as_mut() .as_mut()
.is_some_and(|build_files| build_files.insert(first_part.to_string())) .is_some_and(|build_files| build_files.insert(first_part.to_string()))
{ {
println!( println!("{WARN_PREFIX}: {name} was not in build files, adding {first_part}");
"{}: {name} was not in build files, adding {first_part}",
"warn".yellow().bold()
);
} }
} }
@ -265,8 +339,7 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
for build_file in build_files.iter() { for build_file in build_files.iter() {
if build_file.eq_ignore_ascii_case(MANIFEST_FILE_NAME) { if build_file.eq_ignore_ascii_case(MANIFEST_FILE_NAME) {
println!( println!(
"{}: {MANIFEST_FILE_NAME} is in build files, please remove it", "{WARN_PREFIX}: {MANIFEST_FILE_NAME} is in build files, please remove it",
"warn".yellow().bold()
); );
continue; continue;
@ -274,7 +347,7 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
let build_file_path = project.package_dir().join(build_file); let build_file_path = project.package_dir().join(build_file);
if !build_file_path.exists() { if fs::metadata(&build_file_path).await.is_err() {
anyhow::bail!("build file {build_file} does not exist"); anyhow::bail!("build file {build_file} does not exist");
} }
@ -309,16 +382,9 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
let script_path = script_path let script_path = script_path
.canonicalize() .canonicalize()
.context(format!("failed to canonicalize script {name}"))?; .with_context(|| format!("failed to canonicalize script {name}"))?;
if let Err(err) = full_moon::parse(&contents).map_err(|errs| { self.validate_luau_file(&format!("the `{name}` script"), &contents)?;
errs.into_iter()
.map(|err| err.to_string())
.collect::<Vec<_>>()
.join(", ")
}) {
anyhow::bail!("script {name} is not a valid Luau file: {err}");
}
if paths.insert( if paths.insert(
script_path script_path
@ -326,10 +392,7 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
.unwrap() .unwrap()
.to_path_buf(), .to_path_buf(),
) { ) {
println!( println!("{WARN_PREFIX}: script {name} was not included, adding {path}");
"{}: script {name} was not included, adding {path}",
"warn".yellow().bold()
);
} }
} }
} }
@ -337,7 +400,7 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
for relative_path in &paths { for relative_path in &paths {
let path = project.package_dir().join(relative_path); let path = project.package_dir().join(relative_path);
if !path.exists() { if fs::metadata(&path).await.is_err() {
anyhow::bail!("included file `{}` does not exist", path.display()); anyhow::bail!("included file `{}` does not exist", path.display());
} }
@ -358,7 +421,9 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
&relative_path, &relative_path,
fs::File::open(&path) fs::File::open(&path)
.await .await
.context(format!("failed to read `{}`", relative_path.display()))? .with_context(|| {
format!("failed to read `{}`", relative_path.display())
})?
.file_mut(), .file_mut(),
) )
.await?; .await?;
@ -373,40 +438,36 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
{ {
match specifier { match specifier {
DependencySpecifiers::Pesde(specifier) => { DependencySpecifiers::Pesde(specifier) => {
let index_name = specifier specifier.index = manifest
.index
.as_deref()
.unwrap_or(DEFAULT_INDEX_NAME)
.to_string();
specifier.index = Some(
manifest
.indices .indices
.get(&index_name) .get(&specifier.index)
.context(format!("index {index_name} not found in indices field"))? .with_context(|| {
.to_string(), format!("index {} not found in indices field", specifier.index)
); })?
.to_string();
} }
#[cfg(feature = "wally-compat")] #[cfg(feature = "wally-compat")]
DependencySpecifiers::Wally(specifier) => { DependencySpecifiers::Wally(specifier) => {
let index_name = specifier specifier.index = manifest
.index
.as_deref()
.unwrap_or(DEFAULT_INDEX_NAME)
.to_string();
specifier.index = Some(
manifest
.wally_indices .wally_indices
.get(&index_name) .get(&specifier.index)
.context(format!( .with_context(|| {
"index {index_name} not found in wally_indices field" format!("index {} not found in wally_indices field", specifier.index)
))? })?
.to_string(), .to_string();
);
} }
DependencySpecifiers::Git(_) => {} DependencySpecifiers::Git(_) => {}
DependencySpecifiers::Workspace(spec) => { DependencySpecifiers::Workspace(spec) => {
let pkg_ref = WorkspacePackageSource let pkg_ref = WorkspacePackageSource
.resolve(spec, project, target_kind, &mut HashSet::new()) .resolve(
spec,
&ResolveOptions {
project: project.clone(),
target: target_kind,
refreshed_sources: refreshed_sources.clone(),
loose_target: false,
},
)
.await .await
.context("failed to resolve workspace package")? .context("failed to resolve workspace package")?
.1 .1
@ -435,24 +496,30 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
VersionReq::STAR VersionReq::STAR
} }
VersionTypeOrReq::Req(r) => r, VersionTypeOrReq::Req(r) => r,
v => VersionReq::parse(&format!("{v}{}", manifest.version)) VersionTypeOrReq::VersionType(v) => {
.context(format!("failed to parse version for {v}"))?, VersionReq::parse(&format!("{v}{}", manifest.version))
.with_context(|| format!("failed to parse version for {v}"))?
}
}, },
index: Some( index: manifest
manifest
.indices .indices
.get(DEFAULT_INDEX_NAME) .get(DEFAULT_INDEX_NAME)
.context("missing default index in workspace package manifest")? .context("missing default index in workspace package manifest")?
.to_string(), .to_string(),
),
target: Some(spec.target.unwrap_or(manifest.target.kind())), target: Some(spec.target.unwrap_or(manifest.target.kind())),
}); });
} }
DependencySpecifiers::Path(_) => {
anyhow::bail!("path dependencies are not allowed in published packages")
}
} }
} }
{ {
println!("\n{}", "please confirm the following information:".bold()); println!(
"\n{}",
style("please confirm the following information:").bold()
);
println!("name: {}", manifest.name); println!("name: {}", manifest.name);
println!("version: {}", manifest.version); println!("version: {}", manifest.version);
println!( println!(
@ -476,8 +543,7 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
manifest manifest
.repository .repository
.as_ref() .as_ref()
.map(|r| r.as_str()) .map_or("(none)", url::Url::as_str)
.unwrap_or("(none)")
); );
let roblox_target = roblox_target.is_some_and(|_| true); let roblox_target = roblox_target.is_some_and(|_| true);
@ -488,7 +554,7 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
manifest manifest
.target .target
.lib_path() .lib_path()
.map_or("(none)".to_string(), |p| p.to_string()) .map_or_else(|| "(none)".to_string(), ToString::to_string)
); );
if roblox_target { if roblox_target {
@ -499,7 +565,7 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
manifest manifest
.target .target
.bin_path() .bin_path()
.map_or("(none)".to_string(), |p| p.to_string()) .map_or_else(|| "(none)".to_string(), ToString::to_string)
); );
println!( println!(
"\tscripts: {}", "\tscripts: {}",
@ -507,9 +573,10 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
.target .target
.scripts() .scripts()
.filter(|s| !s.is_empty()) .filter(|s| !s.is_empty())
.map_or("(none)".to_string(), |s| { .map_or_else(
s.keys().cloned().collect::<Vec<_>>().join(", ") || "(none)".to_string(),
}) |s| { s.keys().cloned().collect::<Vec<_>>().join(", ") }
)
); );
} }
@ -523,10 +590,9 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
); );
if !self.dry_run if !self.dry_run
&& !self.yes && !self.yes && !inquire::Confirm::new("is this information correct?").prompt()?
&& !inquire::Confirm::new("is this information correct?").prompt()?
{ {
println!("\n{}", "publish aborted".red().bold()); println!("\n{}", ERROR_STYLE.apply_to("publish aborted"));
return Ok(()); return Ok(());
} }
@ -573,9 +639,15 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
let index_url = manifest let index_url = manifest
.indices .indices
.get(&self.index) .get(&self.index)
.context(format!("missing index {}", self.index))?; .with_context(|| format!("missing index {}", self.index))?;
let source = PesdePackageSource::new(index_url.clone()); let source = PesdePackageSource::new(index_url.clone());
PackageSource::refresh(&source, project) refreshed_sources
.refresh(
&PackageSources::Pesde(source.clone()),
&RefreshOptions {
project: project.clone(),
},
)
.await .await
.context("failed to refresh source")?; .context("failed to refresh source")?;
let config = source let config = source
@ -591,38 +663,19 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
); );
} }
let deps = manifest.all_dependencies().context("dependency conflict")?;
if let Some((disallowed, _)) = deps.iter().find(|(_, (spec, _))| match spec {
DependencySpecifiers::Pesde(spec) => {
!config.other_registries_allowed.is_allowed_or_same(
source.repo_url().clone(),
gix::Url::try_from(spec.index.as_deref().unwrap()).unwrap(),
)
}
DependencySpecifiers::Git(spec) => !config.git_allowed.is_allowed(spec.repo.clone()),
#[cfg(feature = "wally-compat")]
DependencySpecifiers::Wally(spec) => !config
.wally_allowed
.is_allowed(gix::Url::try_from(spec.index.as_deref().unwrap()).unwrap()),
_ => false,
}) {
anyhow::bail!("dependency `{disallowed}` is not allowed on this index");
}
if self.dry_run { if self.dry_run {
fs::write("package.tar.gz", archive).await?; fs::write("package.tar.gz", archive).await?;
println!( println!(
"{}", "{}",
"(dry run) package written to package.tar.gz".green().bold() SUCCESS_STYLE.apply_to("(dry run) package written to package.tar.gz")
); );
return Ok(()); return Ok(());
} }
let mut request = reqwest let mut request = reqwest
.post(format!("{}/v0/packages", config.api())) .post(format!("{}/v1/packages", config.api()))
.body(archive); .body(archive);
if let Some(token) = project.auth_config().tokens().get(index_url) { if let Some(token) = project.auth_config().tokens().get(index_url) {
@ -639,22 +692,19 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
.context("failed to get response text")?; .context("failed to get response text")?;
match status { match status {
StatusCode::CONFLICT => { StatusCode::CONFLICT => {
println!("{}", "package version already exists".red().bold()); anyhow::bail!("package version already exists");
} }
StatusCode::FORBIDDEN => { StatusCode::FORBIDDEN => {
println!( anyhow::bail!("unauthorized to publish under this scope");
"{}",
"unauthorized to publish under this scope".red().bold()
);
} }
StatusCode::BAD_REQUEST => { StatusCode::BAD_REQUEST => {
println!("{}: {text}", "invalid package".red().bold()); anyhow::bail!("invalid package: {text}");
} }
code if !code.is_success() => { code if !code.is_success() => {
anyhow::bail!("failed to publish package: {code} ({text})"); anyhow::bail!("failed to publish package: {code} ({text})");
} }
_ => { _ => {
println!("{text}"); println!("{}", SUCCESS_STYLE.apply_to(text));
} }
} }
@ -662,17 +712,27 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
} }
pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> { pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
let result = self.clone().run_impl(&project, reqwest.clone(), true).await; let refreshed_sources = RefreshedSources::new();
let result = self
.clone()
.run_impl(&project, reqwest.clone(), &refreshed_sources)
.await;
if project.workspace_dir().is_some() { if project.workspace_dir().is_some() {
return result; return result;
} else {
display_err(result, " occurred publishing workspace root");
} }
display_err(result, " occurred publishing workspace root");
run_on_workspace_members(&project, |project| { run_on_workspace_members(&project, |project| {
let reqwest = reqwest.clone(); let reqwest = reqwest.clone();
let this = self.clone(); let this = self.clone();
async move { this.run_impl(&project, reqwest, false).await } let refreshed_sources = refreshed_sources.clone();
async move {
let res = this.run_impl(&project, reqwest, &refreshed_sources).await;
display_err(res, " occurred publishing workspace member");
Ok(())
}
}) })
.await .await
.map(|_| ()) .map(|_| ())

src/cli/commands/remove.rs Normal file (+59)

@ -0,0 +1,59 @@
use std::str::FromStr as _;
use anyhow::Context as _;
use clap::Args;
use crate::cli::{
dep_type_to_key,
style::{INFO_STYLE, SUCCESS_STYLE},
};
use pesde::{
manifest::{Alias, DependencyType},
Project,
};
#[derive(Debug, Args)]
pub struct RemoveCommand {
/// The alias of the package to remove
#[arg(index = 1)]
alias: Alias,
}
impl RemoveCommand {
pub async fn run(self, project: Project) -> anyhow::Result<()> {
let mut manifest = toml_edit::DocumentMut::from_str(
&project
.read_manifest()
.await
.context("failed to read manifest")?,
)
.context("failed to parse manifest")?;
let Some(dep_key) = DependencyType::VARIANTS
.iter()
.copied()
.map(dep_type_to_key)
.find(|dependency_key| {
manifest[dependency_key]
.as_table_mut()
.is_some_and(|table| table.remove(self.alias.as_str()).is_some())
})
else {
anyhow::bail!("package under alias `{}` not found in manifest", self.alias)
};
project
.write_manifest(manifest.to_string())
.await
.context("failed to write manifest")?;
println!(
"{} removed {} from {}!",
SUCCESS_STYLE.apply_to("success!"),
INFO_STYLE.apply_to(self.alias),
INFO_STYLE.apply_to(dep_key)
);
Ok(())
}
}

src/cli/commands/run.rs

@ -1,16 +1,20 @@
use crate::cli::up_to_date_lockfile; use crate::cli::{style::WARN_STYLE, up_to_date_lockfile};
use anyhow::Context; use anyhow::Context as _;
use clap::Args; use clap::Args;
use futures::{StreamExt, TryStreamExt}; use fs_err::tokio as fs;
use futures::{StreamExt as _, TryStreamExt as _};
use pesde::{ use pesde::{
errors::{ManifestReadError, WorkspaceMembersError},
linking::generator::generate_bin_linking_module, linking::generator::generate_bin_linking_module,
manifest::Alias,
names::{PackageName, PackageNames}, names::{PackageName, PackageNames},
Project, MANIFEST_FILE_NAME, PACKAGES_CONTAINER_NAME, source::traits::{GetTargetOptions, PackageRef as _, PackageSource as _, RefreshOptions},
Project, MANIFEST_FILE_NAME,
}; };
use relative_path::RelativePathBuf; use relative_path::RelativePathBuf;
use std::{ use std::{
collections::HashSet, env::current_dir, ffi::OsString, io::Write, path::PathBuf, collections::HashSet, env::current_dir, ffi::OsString, io::Write as _, path::Path,
process::Command, process::Command, sync::Arc,
}; };
#[derive(Debug, Args)] #[derive(Debug, Args)]
@ -26,7 +30,7 @@ pub struct RunCommand {
impl RunCommand { impl RunCommand {
pub async fn run(self, project: Project) -> anyhow::Result<()> { pub async fn run(self, project: Project) -> anyhow::Result<()> {
let run = |root: PathBuf, file_path: PathBuf| { let run = |root: &Path, file_path: &Path| -> ! {
let mut caller = tempfile::NamedTempFile::new().expect("failed to create tempfile"); let mut caller = tempfile::NamedTempFile::new().expect("failed to create tempfile");
caller caller
.write_all( .write_all(
@ -49,21 +53,22 @@ impl RunCommand {
drop(caller); drop(caller);
std::process::exit(status.code().unwrap_or(1)) std::process::exit(status.code().unwrap_or(1i32))
}; };
let Some(package_or_script) = self.package_or_script else { let Some(package_or_script) = self.package_or_script else {
if let Some(script_path) = project.deser_manifest().await?.target.bin_path() { if let Some(script_path) = project.deser_manifest().await?.target.bin_path() {
run( run(
project.package_dir().to_owned(), project.package_dir(),
script_path.to_path(project.package_dir()), &script_path.to_path(project.package_dir()),
); );
return Ok(());
} }
anyhow::bail!("no package or script specified, and no bin path found in manifest") anyhow::bail!("no package or script specified, and no bin path found in manifest")
}; };
let mut package_info = None;
	if let Ok(pkg_name) = package_or_script.parse::<PackageName>() {
		let graph = if let Some(lockfile) = up_to_date_lockfile(&project).await? {
			lockfile.graph
@@ -73,51 +78,84 @@ impl RunCommand {
		let pkg_name = PackageNames::Pesde(pkg_name);

-		for (version_id, node) in graph.get(&pkg_name).context("package not found in graph")? {
-			if node.node.direct.is_none() {
-				continue;
+		let mut versions = graph
+			.into_iter()
+			.filter(|(id, node)| *id.name() == pkg_name && node.direct.is_some())
+			.collect::<Vec<_>>();
+
+		package_info = Some(match versions.len() {
+			0 => anyhow::bail!("package not found"),
+			1 => versions.pop().unwrap(),
+			_ => anyhow::bail!("multiple versions found. use the package's alias instead."),
+		});
+	} else if let Ok(alias) = package_or_script.parse::<Alias>() {
+		if let Some(lockfile) = up_to_date_lockfile(&project).await? {
+			package_info = lockfile
+				.graph
+				.into_iter()
+				.find(|(_, node)| node.direct.as_ref().is_some_and(|(a, _, _)| alias == *a));
+		} else {
+			eprintln!(
+				"{}",
+				WARN_STYLE.apply_to(
+					"outdated lockfile, please run the install command first to use an alias"
+				)
+			);
+		};
	}

-		let Some(bin_path) = node.target.bin_path() else {
+	if let Some((id, node)) = package_info {
+		let container_folder = node.container_folder_from_project(
+			&id,
+			&project,
+			project
+				.deser_manifest()
+				.await
+				.context("failed to deserialize manifest")?
+				.target
+				.kind(),
+		);
+
+		let source = node.pkg_ref.source();
+		source
+			.refresh(&RefreshOptions {
+				project: project.clone(),
+			})
+			.await
+			.context("failed to refresh source")?;
+
+		let target = source
+			.get_target(
+				&node.pkg_ref,
+				&GetTargetOptions {
+					project,
+					path: Arc::from(container_folder.as_path()),
+					id: Arc::new(id),
+				},
+			)
+			.await?;
+
+		let Some(bin_path) = target.bin_path() else {
			anyhow::bail!("package has no bin path");
		};

-		let base_folder = project
-			.deser_manifest()
-			.await?
-			.target
-			.kind()
-			.packages_folder(version_id.target());
-		let container_folder = node.node.container_folder(
-			&project
-				.package_dir()
-				.join(base_folder)
-				.join(PACKAGES_CONTAINER_NAME),
-			&pkg_name,
-			version_id.version(),
-		);
-
		let path = bin_path.to_path(&container_folder);

-		run(path.clone(), path);
+		run(&path, &path);
		return Ok(());
	}
	}

	if let Ok(manifest) = project.deser_manifest().await {
		if let Some(script_path) = manifest.scripts.get(&package_or_script) {
			run(
-				project.package_dir().to_path_buf(),
-				script_path.to_path(project.package_dir()),
+				project.package_dir(),
+				&script_path.to_path(project.package_dir()),
			);
			return Ok(());
		}
-	};
+	}

	let relative_path = RelativePathBuf::from(package_or_script);
	let path = relative_path.to_path(project.package_dir());

-	if !path.exists() {
+	if fs::metadata(&path).await.is_err() {
		anyhow::bail!("path `{}` does not exist", path.display());
	}
@@ -125,9 +163,9 @@ impl RunCommand {
		.workspace_dir()
		.unwrap_or_else(|| project.package_dir());

-	let members = match project.workspace_members(workspace_dir, false).await {
+	let members = match project.workspace_members(false).await {
		Ok(members) => members.boxed(),
-		Err(pesde::errors::WorkspaceMembersError::ManifestMissing(e))
+		Err(WorkspaceMembersError::ManifestParse(ManifestReadError::Io(e)))
			if e.kind() == std::io::ErrorKind::NotFound =>
		{
			futures::stream::empty().boxed()
@@ -137,8 +175,10 @@ impl RunCommand {
	let members = members
		.map(|res| {
-			res.map_err(anyhow::Error::from)
-				.and_then(|(path, _)| path.canonicalize().map_err(Into::into))
+			res.map_err(anyhow::Error::from)?
+				.0
+				.canonicalize()
+				.map_err(anyhow::Error::from)
		})
		.chain(futures::stream::once(async {
			workspace_dir.canonicalize().map_err(Into::into)
@@ -148,14 +188,16 @@ impl RunCommand {
		.context("failed to collect workspace members")?;

	let root = 'finder: {
-		let mut current_path = path.to_path_buf();
+		let mut current_path = path.clone();
		loop {
			let canonical_path = current_path
				.canonicalize()
				.context("failed to canonicalize parent")?;

			if members.contains(&canonical_path)
-				&& canonical_path.join(MANIFEST_FILE_NAME).exists()
+				&& fs::metadata(canonical_path.join(MANIFEST_FILE_NAME))
+					.await
+					.is_ok()
			{
				break 'finder canonical_path;
			}
@@ -170,8 +212,6 @@ impl RunCommand {
		project.package_dir().to_path_buf()
	};

-	run(root, path);
+	run(&root, &path);
	Ok(())
	}
}

src/cli/commands/self_install.rs

@@ -1,8 +1,13 @@
-use crate::cli::{version::update_bin_exe, HOME_DIR};
-use anyhow::Context;
+use crate::cli::{
+	style::{ADDED_STYLE, CLI_STYLE},
+	version::replace_pesde_bin_exe,
+	HOME_DIR,
+};
+use anyhow::Context as _;
use clap::Args;
-use colored::Colorize;
+use console::style;
use std::env::current_exe;

#[derive(Debug, Args)]
pub struct SelfInstallCommand {
	/// Skip adding the bin directory to the PATH
@@ -16,61 +21,57 @@ impl SelfInstallCommand {
	#[cfg(windows)]
	{
		if !self.skip_add_to_path {
-			use anyhow::Context;
-			use winreg::{enums::HKEY_CURRENT_USER, RegKey};
+			use crate::cli::style::WARN_STYLE;
+			use anyhow::Context as _;
+			use windows_registry::CURRENT_USER;

-			let current_user = RegKey::predef(HKEY_CURRENT_USER);
-			let env = current_user
-				.create_subkey("Environment")
-				.context("failed to open Environment key")?
-				.0;
-			let path: String = env.get_value("Path").context("failed to get Path value")?;
			let bin_dir = crate::cli::bin_dir().await?;
+			let env = CURRENT_USER
+				.create("Environment")
+				.context("failed to open Environment key")?;
+			let path = env.get_string("Path").context("failed to get Path value")?;
			let bin_dir = bin_dir.to_string_lossy();

			let exists = path.split(';').any(|part| *part == bin_dir);

			if !exists {
				let new_path = format!("{path};{bin_dir}");
-				env.set_value("Path", &new_path)
+				env.set_string("Path", &new_path)
					.context("failed to set Path value")?;

				println!(
-					"\nin order to allow binary exports as executables {}.\n\n{}",
-					format!("`~/{HOME_DIR}/bin` was added to PATH").green(),
-					"please restart your shell for this to take effect"
-						.yellow()
-						.bold()
+					"\nin order to allow proper functionality {} was added to PATH.\n\n{}",
+					style(format!("`~/{HOME_DIR}/bin`")).green(),
+					WARN_STYLE.apply_to("please restart your shell for this to take effect")
				);
			}
		}

		println!(
			"installed {} {}!",
-			env!("CARGO_BIN_NAME").cyan(),
-			env!("CARGO_PKG_VERSION").yellow(),
+			CLI_STYLE.apply_to(env!("CARGO_BIN_NAME")),
+			ADDED_STYLE.apply_to(env!("CARGO_PKG_VERSION")),
		);
-	}
+	};

	#[cfg(unix)]
	{
		println!(
-			r#"installed {} {}! add the following line to your shell profile in order to get the binary and binary exports as executables usable from anywhere:
+			r"installed {} {}! add the following line to your shell profile in order to get the binary and binary exports as executables usable from anywhere:

	{}

and then restart your shell.
-"#,
+",
-			env!("CARGO_BIN_NAME").cyan(),
-			env!("CARGO_PKG_VERSION").yellow(),
-			format!(r#"export PATH="$PATH:~/{HOME_DIR}/bin""#)
-				.bold()
-				.green()
+			CLI_STYLE.apply_to(env!("CARGO_BIN_NAME")),
+			ADDED_STYLE.apply_to(env!("CARGO_PKG_VERSION")),
+			style(format!(r#"export PATH="$PATH:$HOME/{HOME_DIR}/bin""#)).green(),
		);
-	}
+	};

-	update_bin_exe(&current_exe().context("failed to get current exe path")?).await?;
+	replace_pesde_bin_exe(&current_exe().context("failed to get current exe path")?).await?;

	Ok(())
}

src/cli/commands/self_upgrade.rs

@@ -1,13 +1,17 @@
-use crate::cli::{
-	config::read_config,
+use crate::{
+	cli::{
+		config::read_config,
+		style::{ADDED_STYLE, CLI_STYLE, REMOVED_STYLE},
		version::{
-			current_version, get_or_download_version, get_remote_version, no_build_metadata,
-			update_bin_exe, TagInfo, VersionType,
+			current_version, find_latest_version, get_or_download_engine, replace_pesde_bin_exe,
		},
+	},
+	util::no_build_metadata,
};
-use anyhow::Context;
+use anyhow::Context as _;
use clap::Args;
-use colored::Colorize;
+use pesde::engine::EngineKind;
+use semver::VersionReq;

#[derive(Debug, Args)]
pub struct SelfUpgradeCommand {
@@ -25,7 +29,7 @@ impl SelfUpgradeCommand {
			.context("no cached version found")?
			.1
	} else {
-		get_remote_version(&reqwest, VersionType::Latest).await?
+		find_latest_version(&reqwest).await?
	};

	let latest_version_no_metadata = no_build_metadata(&latest_version);
@@ -35,21 +39,25 @@ impl SelfUpgradeCommand {
		return Ok(());
	}

-	let display_latest_version = latest_version_no_metadata.to_string().yellow().bold();
+	let display_latest_version = ADDED_STYLE.apply_to(latest_version_no_metadata);

-	if !inquire::prompt_confirmation(format!(
+	let confirmed = inquire::prompt_confirmation(format!(
		"are you sure you want to upgrade {} from {} to {display_latest_version}?",
-		env!("CARGO_BIN_NAME").cyan(),
-		env!("CARGO_PKG_VERSION").yellow().bold()
-	))? {
+		CLI_STYLE.apply_to(env!("CARGO_BIN_NAME")),
+		REMOVED_STYLE.apply_to(env!("CARGO_PKG_VERSION"))
+	))?;
+	if !confirmed {
		println!("cancelled upgrade");
		return Ok(());
	}

-	let path = get_or_download_version(&reqwest, &TagInfo::Complete(latest_version), true)
-		.await?
-		.unwrap();
-	update_bin_exe(&path).await?;
+	let path = get_or_download_engine(
+		&reqwest,
+		EngineKind::Pesde,
+		VersionReq::parse(&format!("={latest_version}")).unwrap(),
+	)
+	.await?;
+	replace_pesde_bin_exe(&path).await?;

	println!("upgraded to version {display_latest_version}!");

src/cli/commands/update.rs

@@ -1,84 +1,47 @@
-use crate::cli::{progress_bar, run_on_workspace_members};
-use anyhow::Context;
+use crate::cli::{
+	install::{install, InstallOptions},
+	run_on_workspace_members,
+};
use clap::Args;
-use colored::Colorize;
-use pesde::{lockfile::Lockfile, Project};
-use std::{collections::HashSet, sync::Arc};
-use tokio::sync::Mutex;
+use pesde::Project;
+use std::num::NonZeroUsize;

#[derive(Debug, Args, Copy, Clone)]
-pub struct UpdateCommand {}
+pub struct UpdateCommand {
+	/// Update the dependencies but don't install them
+	#[arg(long)]
+	no_install: bool,
+
+	/// The maximum number of concurrent network requests
+	#[arg(long, default_value = "16")]
+	network_concurrency: NonZeroUsize,
+
+	/// Whether to re-install all dependencies even if they are already installed
+	#[arg(long)]
+	force: bool,
+}

impl UpdateCommand {
	pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
-		let mut refreshed_sources = HashSet::new();
-
-		let manifest = project
-			.deser_manifest()
-			.await
-			.context("failed to read manifest")?;
-
-		println!(
-			"\n{}\n",
-			format!("[now updating {} {}]", manifest.name, manifest.target)
-				.bold()
-				.on_bright_black()
-		);
-
-		let graph = project
-			.dependency_graph(None, &mut refreshed_sources, false)
-			.await
-			.context("failed to build dependency graph")?;
-		let graph = Arc::new(graph);
-
-		project
-			.write_lockfile(Lockfile {
-				name: manifest.name,
-				version: manifest.version,
-				target: manifest.target.kind(),
-				overrides: manifest.overrides,
-
-				graph: {
-					let (rx, downloaded_graph) = project
-						.download_and_link(
-							&graph,
-							&Arc::new(Mutex::new(refreshed_sources)),
-							&reqwest,
-							false,
-							false,
-							|_| async { Ok::<_, std::io::Error>(()) },
-						)
-						.await
-						.context("failed to download dependencies")?;
-
-					progress_bar(
-						graph.values().map(|versions| versions.len() as u64).sum(),
-						rx,
-						"📥 ".to_string(),
-						"downloading dependencies".to_string(),
-						"downloaded dependencies".to_string(),
-					)
-					.await?;
-
-					downloaded_graph
-						.await
-						.context("failed to download dependencies")?
-				},
-
-				workspace: run_on_workspace_members(&project, |project| {
-					let reqwest = reqwest.clone();
-					async move { Box::pin(self.run(project, reqwest)).await }
-				})
-				.await?,
-			})
-			.await
-			.context("failed to write lockfile")?;
-
-		println!(
-			"\n\n{}. run `{} install` in order to install the new dependencies",
-			"✅ done".green(),
-			env!("CARGO_BIN_NAME")
-		);
+		let options = InstallOptions {
+			locked: false,
+			prod: false,
+			write: !self.no_install,
+			network_concurrency: self.network_concurrency,
+			use_lockfile: false,
+			force: self.force,
+		};
+
+		install(&options, &project, reqwest.clone(), true).await?;
+
+		run_on_workspace_members(&project, |project| {
+			let reqwest = reqwest.clone();
+			async move {
+				install(&options, &project, reqwest, false).await?;
+				Ok(())
+			}
+		})
+		.await?;

		Ok(())
	}

src/cli/commands/yank.rs (new file, 148 lines)

@@ -0,0 +1,148 @@
use crate::cli::{get_index, style::SUCCESS_STYLE};
use anyhow::Context as _;
use clap::Args;
use pesde::{
manifest::target::TargetKind,
names::PackageName,
source::{
pesde::PesdePackageSource,
traits::{PackageSource as _, RefreshOptions},
},
Project,
};
use reqwest::{header::AUTHORIZATION, Method, StatusCode};
use semver::Version;
use std::{fmt::Display, str::FromStr};
#[derive(Debug, Clone)]
enum TargetKindOrAll {
All,
Specific(TargetKind),
}
impl Display for TargetKindOrAll {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
TargetKindOrAll::All => write!(f, "all"),
TargetKindOrAll::Specific(kind) => write!(f, "{kind}"),
}
}
}
impl FromStr for TargetKindOrAll {
type Err = anyhow::Error;
fn from_str(s: &str) -> Result<Self, Self::Err> {
if s.eq_ignore_ascii_case("all") {
return Ok(TargetKindOrAll::All);
}
s.parse()
.map(TargetKindOrAll::Specific)
.context("failed to parse target kind")
}
}
#[derive(Debug, Clone)]
struct YankId(PackageName, Version, TargetKindOrAll);
impl FromStr for YankId {
type Err = anyhow::Error;
fn from_str(s: &str) -> Result<Self, Self::Err> {
let (package, version) = s
.split_once('@')
.context("package is not in format of `scope/name@version target`")?;
let target = match version.split(' ').nth(1) {
Some(target) => target
.parse()
.context("package is not in format of `scope/name@version target`")?,
None => TargetKindOrAll::All,
};
Ok(YankId(
package.parse().context("failed to parse package name")?,
version.parse().context("failed to parse version")?,
target,
))
}
}
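
// Illustrative example (hypothetical package name): "acme/lib@1.2.3" parses to
// YankId(acme/lib, 1.2.3, TargetKindOrAll::All); per the error messages above,
// the documented input format is `scope/name@version target`, where the
// optional trailing word selects a specific TargetKind instead of All.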
#[derive(Debug, Args)]
pub struct YankCommand {
/// Whether to unyank the package
#[clap(long)]
undo: bool,
/// The index to yank the package from
#[clap(short, long)]
index: Option<String>,
/// The package to yank
#[clap(index = 1)]
package: YankId,
}
impl YankCommand {
pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
let YankId(package, version, target) = self.package;
let index_url = get_index(&project, self.index.as_deref()).await?;
let source = PesdePackageSource::new(index_url.clone());
source
.refresh(&RefreshOptions {
project: project.clone(),
})
.await
.context("failed to refresh source")?;
let config = source
.config(&project)
.await
.context("failed to get index config")?;
let mut request = reqwest.request(
if self.undo {
Method::DELETE
} else {
Method::PUT
},
format!(
"{}/v1/packages/{}/{}/{}/yank",
config.api(),
urlencoding::encode(&package.to_string()),
urlencoding::encode(&version.to_string()),
urlencoding::encode(&target.to_string()),
),
);
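
// Illustrative request (hypothetical values): yanking acme/lib@1.2.3 for all
// targets sends PUT {api}/v1/packages/acme%2Flib/1.2.3/all/yank, and --undo
// sends DELETE to the same URL; the package name is percent-encoded, so `/`
// becomes `%2F`.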
if let Some(token) = project.auth_config().tokens().get(&index_url) {
tracing::debug!("using token for {index_url}");
request = request.header(AUTHORIZATION, token);
}
let response = request.send().await.context("failed to send request")?;
let status = response.status();
let text = response
.text()
.await
.context("failed to get response text")?;
let prefix = if self.undo { "un" } else { "" };
match status {
StatusCode::CONFLICT => {
anyhow::bail!("version is already {prefix}yanked");
}
StatusCode::FORBIDDEN => {
anyhow::bail!("unauthorized to {prefix}yank under this scope");
}
code if !code.is_success() => {
anyhow::bail!("failed to {prefix}yank package: {code} ({text})");
}
_ => {
println!("{}", SUCCESS_STYLE.apply_to(text));
}
}
Ok(())
}
}

src/cli/config.rs

@@ -1,5 +1,5 @@
use crate::cli::{auth::Tokens, home_dir};
-use anyhow::Context;
+use anyhow::Context as _;
use fs_err::tokio as fs;
use serde::{Deserialize, Serialize};
use tracing::instrument;
@@ -16,7 +16,7 @@ pub struct CliConfig {
	pub tokens: Tokens,

	#[serde(default, skip_serializing_if = "Option::is_none")]
-	pub last_checked_updates: Option<(chrono::DateTime<chrono::Utc>, semver::Version)>,
+	pub last_checked_updates: Option<(jiff::Timestamp, semver::Version)>,
}

impl Default for CliConfig {
@@ -24,7 +24,7 @@ impl Default for CliConfig {
	Self {
		default_index: "https://github.com/pesde-pkg/index".try_into().unwrap(),
-		tokens: Tokens(Default::default()),
+		tokens: Tokens::default(),
		last_checked_updates: None,
	}

src/cli/files.rs

@@ -3,9 +3,9 @@ use std::path::Path;
pub async fn make_executable<P: AsRef<Path>>(_path: P) -> anyhow::Result<()> {
	#[cfg(unix)]
	{
-		use anyhow::Context;
+		use anyhow::Context as _;
		use fs_err::tokio as fs;
-		use std::os::unix::fs::PermissionsExt;
+		use std::os::unix::fs::PermissionsExt as _;

		let mut perms = fs::metadata(&_path)
			.await

src/cli/install.rs (new file, 574 lines)

@@ -0,0 +1,574 @@
use super::files::make_executable;
use crate::cli::{
bin_dir, dep_type_to_key,
reporters::{self, CliReporter},
resolve_overrides, run_on_workspace_members,
style::{ADDED_STYLE, REMOVED_STYLE, WARN_PREFIX},
up_to_date_lockfile,
};
use anyhow::Context as _;
use console::style;
use fs_err::tokio as fs;
use pesde::{
download_and_link::{DownloadAndLinkHooks, DownloadAndLinkOptions},
engine::EngineKind,
graph::{DependencyGraph, DependencyGraphWithTarget},
lockfile::Lockfile,
manifest::{DependencyType, Manifest},
names::PackageNames,
source::{
pesde::PesdePackageSource,
refs::PackageRefs,
traits::{PackageRef as _, RefreshOptions},
PackageSources,
},
version_matches, Project, RefreshedSources, MANIFEST_FILE_NAME,
};
use std::{
collections::{BTreeMap, BTreeSet, HashMap, HashSet},
num::NonZeroUsize,
path::Path,
sync::Arc,
time::Instant,
};
use tokio::task::JoinSet;
pub struct InstallHooks {
pub bin_folder: std::path::PathBuf,
}
#[derive(Debug, thiserror::Error)]
#[error(transparent)]
pub struct InstallHooksError(#[from] anyhow::Error);
impl DownloadAndLinkHooks for InstallHooks {
type Error = InstallHooksError;
async fn on_bins_downloaded(
&self,
graph: &DependencyGraphWithTarget,
) -> Result<(), Self::Error> {
let binary_packages = graph
.iter()
.filter_map(|(id, node)| node.target.bin_path().is_some().then_some(id))
.collect::<HashSet<_>>();
let aliases = graph
.iter()
.flat_map(|(_, node)| node.node.dependencies.iter())
.filter_map(|(id, alias)| binary_packages.contains(id).then_some(alias.as_str()))
.chain(
graph
.iter()
.filter_map(|(_, node)| node.node.direct.as_ref())
.map(|(alias, _, _)| alias.as_str()),
)
.collect::<HashSet<_>>();
let curr_exe: Arc<Path> = std::env::current_exe()
.context("failed to get current executable path")?
.as_path()
.into();
let mut tasks = aliases
.into_iter()
.map(|alias| {
let bin_exec_file = self
.bin_folder
.join(alias)
.with_extension(std::env::consts::EXE_EXTENSION);
let curr_exe = curr_exe.clone();
async move {
// TODO: remove this in a major release
#[cfg(unix)]
if fs::metadata(&bin_exec_file)
.await
.is_ok_and(|m| !m.is_symlink())
{
fs::remove_file(&bin_exec_file)
.await
.context("failed to remove outdated bin linker")?;
}
#[cfg(windows)]
let res = fs::symlink_file(curr_exe, &bin_exec_file).await;
#[cfg(unix)]
let res = fs::symlink(curr_exe, &bin_exec_file).await;
match res {
Ok(_) => {}
Err(e) if e.kind() == std::io::ErrorKind::AlreadyExists => {}
e => e.context("failed to symlink bin link file")?,
}
make_executable(&bin_exec_file)
.await
.context("failed to make bin link file executable")?;
Ok::<_, anyhow::Error>(())
}
})
.collect::<JoinSet<_>>();
while let Some(task) = tasks.join_next().await {
task.unwrap()?;
}
Ok(())
}
}
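
// Net effect (sketch): every direct-dependency alias and every alias pointing
// at a package that exports a binary ends up as `<bin_folder>/<alias>[.exe]`,
// a symlink to the currently running pesde executable itself rather than to
// the package's files directly.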
#[derive(Debug, Clone, Copy)]
pub struct InstallOptions {
pub locked: bool,
pub prod: bool,
pub write: bool,
pub use_lockfile: bool,
pub network_concurrency: NonZeroUsize,
pub force: bool,
}
pub async fn install(
options: &InstallOptions,
project: &Project,
reqwest: reqwest::Client,
is_root: bool,
) -> anyhow::Result<()> {
let start = Instant::now();
let refreshed_sources = RefreshedSources::new();
let manifest = project
.deser_manifest()
.await
.context("failed to read manifest")?;
let mut has_irrecoverable_changes = false;
let lockfile = if options.locked {
match up_to_date_lockfile(project).await? {
None => {
anyhow::bail!(
"lockfile is out of sync, run `{} install` to update it",
env!("CARGO_BIN_NAME")
);
}
file => file,
}
} else {
match project.deser_lockfile().await {
Ok(lockfile) => {
if lockfile.overrides != resolve_overrides(&manifest)? {
tracing::debug!("overrides are different");
has_irrecoverable_changes = true;
None
} else if lockfile.target != manifest.target.kind() {
tracing::debug!("target kind is different");
has_irrecoverable_changes = true;
None
} else {
Some(lockfile)
}
}
Err(pesde::errors::LockfileReadError::Io(e))
if e.kind() == std::io::ErrorKind::NotFound =>
{
None
}
Err(e) => return Err(e.into()),
}
};
let overrides = resolve_overrides(&manifest)?;
let (new_lockfile, old_graph) =
reporters::run_with_reporter(|multi, root_progress, reporter| async {
let multi = multi;
let root_progress = root_progress;
root_progress.set_prefix(format!("{} {}: ", manifest.name, manifest.target));
#[cfg(feature = "version-management")]
{
root_progress.reset();
root_progress.set_message("update engine linkers");
let mut tasks = manifest
.engines
.keys()
.map(|engine| crate::cli::version::make_linker_if_needed(*engine))
.collect::<JoinSet<_>>();
while let Some(task) = tasks.join_next().await {
task.unwrap()?;
}
}
root_progress.reset();
root_progress.set_message("resolve");
let old_graph = lockfile.map(|lockfile| lockfile.graph);
let graph = project
.dependency_graph(
old_graph.as_ref().filter(|_| options.use_lockfile),
refreshed_sources.clone(),
false,
)
.await
.context("failed to build dependency graph")?;
let mut tasks = graph
.iter()
.filter_map(|(id, node)| {
let PackageSources::Pesde(source) = node.pkg_ref.source() else {
return None;
};
#[allow(irrefutable_let_patterns)]
let PackageNames::Pesde(name) = id.name().clone() else {
panic!("unexpected package name");
};
let project = project.clone();
let refreshed_sources = refreshed_sources.clone();
Some(async move {
refreshed_sources
.refresh(
&PackageSources::Pesde(source.clone()),
&RefreshOptions {
project: project.clone(),
},
)
.await
.context("failed to refresh source")?;
let file = source.read_index_file(&name, &project)
.await
.context("failed to read package index file")?
.context("package not found in index")?;
Ok::<_, anyhow::Error>(if file.meta.deprecated.is_empty() {
None
} else {
Some((name, file.meta.deprecated))
})
})
})
.collect::<JoinSet<_>>();
while let Some(task) = tasks.join_next().await {
let Some((name, reason)) = task.unwrap()? else {
continue;
};
multi.suspend(|| {
println!("{WARN_PREFIX}: package {name} is deprecated: {reason}");
});
}
let graph = Arc::new(graph);
if options.write {
root_progress.reset();
root_progress.set_length(0);
root_progress.set_message("download");
root_progress.set_style(reporters::root_progress_style_with_progress());
let hooks = InstallHooks {
bin_folder: bin_dir().await?,
};
#[allow(unused_variables)]
let downloaded_graph = project
.download_and_link(
&graph,
DownloadAndLinkOptions::<CliReporter, InstallHooks>::new(reqwest.clone())
.reporter(reporter)
.hooks(hooks)
.refreshed_sources(refreshed_sources.clone())
.prod(options.prod)
.network_concurrency(options.network_concurrency)
.force(options.force || has_irrecoverable_changes),
)
.await
.context("failed to download and link dependencies")?;
#[cfg(feature = "version-management")]
{
let mut tasks = manifest
.engines
.into_iter()
.map(|(engine, req)| async move {
Ok::<_, anyhow::Error>(
crate::cli::version::get_installed_versions(engine)
.await?
.into_iter()
.filter(|version| version_matches(&req, version))
.next_back()
.map(|version| (engine, version)),
)
})
.collect::<JoinSet<_>>();
let mut resolved_engine_versions = HashMap::new();
while let Some(task) = tasks.join_next().await {
let Some((engine, version)) = task.unwrap()? else {
continue;
};
resolved_engine_versions.insert(engine, version);
}
let manifest_target_kind = manifest.target.kind();
let mut tasks = downloaded_graph.iter()
.map(|(id, node)| {
let id = id.clone();
let node = node.clone();
let project = project.clone();
let refreshed_sources = refreshed_sources.clone();
async move {
let engines = match &node.node.pkg_ref {
PackageRefs::Pesde(pkg_ref) => {
let source = PesdePackageSource::new(pkg_ref.index_url.clone());
refreshed_sources
.refresh(
&PackageSources::Pesde(source.clone()),
&RefreshOptions {
project: project.clone(),
},
)
.await
.context("failed to refresh source")?;
#[allow(irrefutable_let_patterns)]
let PackageNames::Pesde(name) = id.name() else {
panic!("unexpected package name");
};
let mut file = source.read_index_file(name, &project)
.await
.context("failed to read package index file")?
.context("package not found in index")?;
file
.entries
.remove(id.version_id())
.context("package version not found in index")?
.engines
}
#[cfg(feature = "wally-compat")]
PackageRefs::Wally(_) => Default::default(),
_ => {
let path = node.node.container_folder_from_project(
&id,
&project,
manifest_target_kind,
);
match fs::read_to_string(path.join(MANIFEST_FILE_NAME)).await {
Ok(manifest) => match toml::from_str::<Manifest>(&manifest) {
Ok(manifest) => manifest.engines,
Err(e) => return Err(e).context("failed to read package manifest"),
},
Err(e) if e.kind() == std::io::ErrorKind::NotFound => Default::default(),
Err(e) => return Err(e).context("failed to read package manifest"),
}
}
};
Ok((id, engines))
}
})
.collect::<JoinSet<_>>();
while let Some(task) = tasks.join_next().await {
let (id, required_engines) = task.unwrap()?;
for (engine, req) in required_engines {
if engine == EngineKind::Pesde {
continue;
}
let Some(version) = resolved_engine_versions.get(&engine) else {
tracing::debug!("package {id} requires {engine} {req}, but it is not installed");
continue;
};
if !version_matches(&req, version) {
multi.suspend(|| {
println!("{WARN_PREFIX}: package {id} requires {engine} {req}, but {version} is installed");
});
}
}
}
}
}
root_progress.reset();
root_progress.set_message("finish");
let new_lockfile = Lockfile {
name: manifest.name.clone(),
version: manifest.version,
target: manifest.target.kind(),
overrides,
graph: Arc::into_inner(graph).unwrap(),
workspace: run_on_workspace_members(project, |_| async { Ok(()) }).await?,
};
project
.write_lockfile(&new_lockfile)
.await
.context("failed to write lockfile")?;
anyhow::Ok((new_lockfile, old_graph.unwrap_or_default()))
})
.await?;
let elapsed = start.elapsed();
if is_root {
println!();
}
print_package_diff(
&format!("{} {}:", manifest.name, manifest.target),
&old_graph,
&new_lockfile.graph,
);
println!("done in {:.2}s", elapsed.as_secs_f64());
println!();
Ok(())
}
/// Prints the difference between two graphs.
pub fn print_package_diff(prefix: &str, old_graph: &DependencyGraph, new_graph: &DependencyGraph) {
let mut old_pkg_map = BTreeMap::new();
let mut old_direct_pkg_map = BTreeMap::new();
let mut new_pkg_map = BTreeMap::new();
let mut new_direct_pkg_map = BTreeMap::new();
for (id, node) in old_graph {
old_pkg_map.insert(id, node);
if node.direct.is_some() {
old_direct_pkg_map.insert(id, node);
}
}
for (id, node) in new_graph {
new_pkg_map.insert(id, node);
if node.direct.is_some() {
new_direct_pkg_map.insert(id, node);
}
}
let added_pkgs = new_pkg_map
.iter()
.filter(|(key, _)| !old_pkg_map.contains_key(*key))
.map(|(key, &node)| (key, node))
.collect::<Vec<_>>();
let removed_pkgs = old_pkg_map
.iter()
.filter(|(key, _)| !new_pkg_map.contains_key(*key))
.map(|(key, &node)| (key, node))
.collect::<Vec<_>>();
let added_direct_pkgs = new_direct_pkg_map
.iter()
.filter(|(key, _)| !old_direct_pkg_map.contains_key(*key))
.map(|(key, &node)| (key, node))
.collect::<Vec<_>>();
let removed_direct_pkgs = old_direct_pkg_map
.iter()
.filter(|(key, _)| !new_direct_pkg_map.contains_key(*key))
.map(|(key, &node)| (key, node))
.collect::<Vec<_>>();
let prefix = style(prefix).bold();
let no_changes = added_pkgs.is_empty()
&& removed_pkgs.is_empty()
&& added_direct_pkgs.is_empty()
&& removed_direct_pkgs.is_empty();
if no_changes {
println!("{prefix} already up to date");
} else {
let mut change_signs = [
(!added_pkgs.is_empty()).then(|| {
ADDED_STYLE
.apply_to(format!("+{}", added_pkgs.len()))
.to_string()
}),
(!removed_pkgs.is_empty()).then(|| {
REMOVED_STYLE
.apply_to(format!("-{}", removed_pkgs.len()))
.to_string()
}),
]
.into_iter()
.flatten()
.collect::<Vec<_>>()
.join(" ");
let changes_empty = change_signs.is_empty();
if changes_empty {
change_signs = style("(no changes)").dim().to_string();
}
println!("{prefix} {change_signs}");
if !changes_empty {
println!(
"{}{}",
ADDED_STYLE.apply_to("+".repeat(added_pkgs.len())),
REMOVED_STYLE.apply_to("-".repeat(removed_pkgs.len()))
);
}
let dependency_groups = added_direct_pkgs
.iter()
.map(|(key, node)| (true, key, node))
.chain(
removed_direct_pkgs
.iter()
.map(|(key, node)| (false, key, node)),
)
.filter_map(|(added, key, node)| {
node.direct.as_ref().map(|(_, _, ty)| (added, key, ty))
})
.fold(
BTreeMap::<DependencyType, BTreeSet<_>>::new(),
|mut map, (added, key, &ty)| {
map.entry(ty).or_default().insert((key, added));
map
},
);
for (ty, set) in dependency_groups {
println!();
println!(
"{}",
style(format!("{}:", dep_type_to_key(ty))).yellow().bold()
);
for (id, added) in set {
println!(
"{} {} {}",
if added {
ADDED_STYLE.apply_to("+")
} else {
REMOVED_STYLE.apply_to("-")
},
id.name(),
style(id.version_id()).dim()
);
}
}
println!();
}
}
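
// Sample output (illustrative, with made-up packages), for two additions and
// one removal where both direct changes are standard dependencies:
//
//   acme/app luau: +2 -1
//   ++-
//
//   dependencies:
//   + acme/bar 1.0.0 luau
//   - acme/baz 0.2.1 luau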

src/cli/mod.rs

@@ -1,13 +1,23 @@
-use anyhow::Context;
-use colored::Colorize;
+use crate::cli::{
+	config::read_config,
+	style::{ERROR_STYLE, INFO_STYLE, WARN_STYLE},
+};
+use anyhow::Context as _;
use fs_err::tokio as fs;
-use futures::StreamExt;
+use futures::StreamExt as _;
use pesde::{
+	errors::ManifestReadError,
	lockfile::Lockfile,
-	manifest::target::TargetKind,
+	manifest::{
+		overrides::{OverrideKey, OverrideSpecifier},
+		target::TargetKind,
+		DependencyType, Manifest,
+	},
	names::{PackageName, PackageNames},
-	source::{version_id::VersionId, workspace::specifier::VersionTypeOrReq},
-	Project,
+	source::{
+		ids::VersionId, specifiers::DependencySpecifiers, workspace::specifier::VersionTypeOrReq,
+	},
+	Project, DEFAULT_INDEX_NAME,
};
use relative_path::RelativePathBuf;
use std::{
@@ -15,7 +25,6 @@ use std::{
	future::Future,
	path::PathBuf,
	str::FromStr,
-	time::Duration,
};
use tokio::pin;
use tracing::instrument;
@@ -24,6 +33,9 @@ pub mod auth;
pub mod commands;
pub mod config;
pub mod files;
+pub mod install;
+pub mod reporters;
+pub mod style;
#[cfg(feature = "version-management")]
pub mod version;
@@ -43,6 +55,40 @@ pub async fn bin_dir() -> anyhow::Result<PathBuf> {
	Ok(bin_dir)
}
pub fn resolve_overrides(
manifest: &Manifest,
) -> anyhow::Result<BTreeMap<OverrideKey, DependencySpecifiers>> {
let mut dependencies = None;
let mut overrides = BTreeMap::new();
for (key, spec) in &manifest.overrides {
overrides.insert(
key.clone(),
match spec {
OverrideSpecifier::Specifier(spec) => spec,
OverrideSpecifier::Alias(alias) => {
if dependencies.is_none() {
dependencies = Some(
manifest
.all_dependencies()
.context("failed to get all dependencies")?,
);
}
&dependencies
.as_ref()
.and_then(|deps| deps.get(alias))
.with_context(|| format!("alias `{alias}` not found in manifest"))?
.0
}
}
.clone(),
);
}
Ok(overrides)
}
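
// Illustrative behaviour: an override specifier written as an alias (e.g.
// "foo") is replaced here by the concrete specifier of the dependency that
// `foo` names in the manifest, so downstream comparisons (and the lockfile)
// only ever deal in resolved DependencySpecifiers.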
#[instrument(skip(project), ret(level = "trace"), level = "debug")]
pub async fn up_to_date_lockfile(project: &Project) -> anyhow::Result<Option<Lockfile>> {
	let manifest = project.deser_manifest().await?;
@@ -56,7 +102,7 @@ pub async fn up_to_date_lockfile(project: &Project) -> anyhow::Result<Option<Lockfile>> {
		Err(e) => return Err(e.into()),
	};

-	if manifest.overrides != lockfile.overrides {
+	if resolve_overrides(&manifest)? != lockfile.overrides {
		tracing::debug!("overrides are different");
		return Ok(None);
	}
@@ -74,10 +120,8 @@ pub async fn up_to_date_lockfile(project: &Project) -> anyhow::Result<Option<Lockfile>> {
	let specs = lockfile
		.graph
		.iter()
-		.flat_map(|(_, versions)| versions)
		.filter_map(|(_, node)| {
-			node.node
-				.direct
+			node.direct
				.as_ref()
				.map(|(_, spec, source_ty)| (spec, source_ty))
		})
@@ -91,11 +135,7 @@ pub async fn up_to_date_lockfile(project: &Project) -> anyhow::Result<Option<Lockfile>> {
	tracing::debug!("dependencies are the same: {same_dependencies}");

-	Ok(if same_dependencies {
-		Some(lockfile)
-	} else {
-		None
-	})
+	Ok(same_dependencies.then_some(lockfile))
}

#[derive(Debug, Clone)]
@@ -126,30 +166,31 @@ impl VersionedPackageName {
	#[cfg(feature = "patches")]
	fn get(
		self,
-		graph: &pesde::lockfile::DownloadedGraph,
-	) -> anyhow::Result<(PackageNames, VersionId)> {
-		let version_id = match self.1 {
-			Some(version) => version,
-			None => {
-				let versions = graph.get(&self.0).context("package not found in graph")?;
-				if versions.len() == 1 {
-					let version = versions.keys().next().unwrap().clone();
-					tracing::debug!("only one version found, using {version}");
-					version
-				} else {
-					anyhow::bail!(
-						"multiple versions found, please specify one of: {}",
-						versions
-							.keys()
-							.map(|v| v.to_string())
-							.collect::<Vec<_>>()
-							.join(", ")
-					);
-				}
-			}
-		};
+		graph: &pesde::graph::DependencyGraph,
+	) -> anyhow::Result<pesde::source::ids::PackageId> {
+		let version_id = if let Some(version) = self.1 {
+			version
+		} else {
+			let versions = graph
+				.keys()
+				.filter(|id| *id.name() == self.0)
+				.collect::<Vec<_>>();
+
+			match versions.len() {
+				0 => anyhow::bail!("package not found"),
+				1 => versions[0].version_id().clone(),
+				_ => anyhow::bail!(
+					"multiple versions found, please specify one of: {}",
+					versions
+						.iter()
+						.map(ToString::to_string)
+						.collect::<Vec<_>>()
+						.join(", ")
+				),
+			}
+		};

-		Ok((self.0, version_id))
+		Ok(pesde::source::ids::PackageId::new(self.0, version_id))
	}
}
@@ -158,6 +199,7 @@ enum AnyPackageIdentifier<V: FromStr = VersionId, N: FromStr = PackageNames> {
	PackageName(VersionedPackageName<V, N>),
	Url((gix::Url, String)),
	Workspace(VersionedPackageName<VersionTypeOrReq, PackageName>),
+	Path(PathBuf),
}

impl<V: FromStr<Err = E>, E: Into<anyhow::Error>, N: FromStr<Err = F>, F: Into<anyhow::Error>>
@@ -176,6 +218,8 @@ impl<V: FromStr<Err = E>, E: Into<anyhow::Error>, N: FromStr<Err = F>, F: Into<anyhow::Error>>
			)))
		} else if let Some(rest) = s.strip_prefix("workspace:") {
			Ok(AnyPackageIdentifier::Workspace(rest.parse()?))
+		} else if let Some(rest) = s.strip_prefix("path:") {
+			Ok(AnyPackageIdentifier::Path(rest.into()))
		} else if s.contains(':') {
			let (url, rev) = s.split_once('#').context("missing revision")?;
@@ -193,39 +237,6 @@ pub fn parse_gix_url(s: &str) -> Result<gix::Url, gix::url::parse::Error> {
	s.try_into()
}

-pub async fn progress_bar<E: std::error::Error + Into<anyhow::Error>>(
-	len: u64,
-	mut rx: tokio::sync::mpsc::Receiver<Result<String, E>>,
-	prefix: String,
-	progress_msg: String,
-	finish_msg: String,
-) -> anyhow::Result<()> {
-	let bar = indicatif::ProgressBar::new(len)
-		.with_style(
-			indicatif::ProgressStyle::default_bar()
-				.template("{prefix}[{elapsed_precise}] {bar:40.208/166} {pos}/{len} {msg}")?
-				.progress_chars("█▓▒░ "),
-		)
-		.with_prefix(prefix)
-		.with_message(progress_msg);
-	bar.enable_steady_tick(Duration::from_millis(100));
-
-	while let Some(result) = rx.recv().await {
-		bar.inc(1);
-
-		match result {
-			Ok(text) => {
-				bar.set_message(text);
-			}
-			Err(e) => return Err(e.into()),
-		}
-	}
-
-	bar.finish_with_message(finish_msg);
-
-	Ok(())
-}
pub fn shift_project_dir(project: &Project, pkg_dir: PathBuf) -> Project {
	Project::new(
		pkg_dir,
@@ -246,9 +257,7 @@ pub async fn run_on_workspace_members<F: Future<Output = anyhow::Result<()>>>(
		return Ok(Default::default());
	}

-	let members_future = project
-		.workspace_members(project.package_dir(), true)
-		.await?;
+	let members_future = project.workspace_members(true).await?;
	pin!(members_future);

	let mut results = BTreeMap::<PackageName, BTreeMap<TargetKind, RelativePathBuf>>::new();
@@ -273,14 +282,17 @@ pub async fn run_on_workspace_members<F: Future<Output = anyhow::Result<()>>>(
pub fn display_err(result: anyhow::Result<()>, prefix: &str) {
	if let Err(err) = result {
-		eprintln!("{}: {err}\n", format!("error{prefix}").red().bold());
+		eprintln!(
+			"{}: {err}\n",
+			ERROR_STYLE.apply_to(format!("error{prefix}"))
+		);

		let cause = err.chain().skip(1).collect::<Vec<_>>();

		if !cause.is_empty() {
-			eprintln!("{}:", "caused by".red().bold());
+			eprintln!("{}:", ERROR_STYLE.apply_to("caused by"));
			for err in cause {
-				eprintln!(" - {err}");
+				eprintln!("\t- {err}");
			}
		}
@@ -289,15 +301,53 @@ pub fn display_err(result: anyhow::Result<()>, prefix: &str) {
			std::backtrace::BacktraceStatus::Disabled => {
				eprintln!(
					"\n{}: set RUST_BACKTRACE=1 for a backtrace",
-					"help".yellow().bold()
+					INFO_STYLE.apply_to("help")
				);
			}
			std::backtrace::BacktraceStatus::Captured => {
-				eprintln!("\n{}:\n{backtrace}", "backtrace".yellow().bold());
+				eprintln!("\n{}:\n{backtrace}", WARN_STYLE.apply_to("backtrace"));
			}
			_ => {
-				eprintln!("\n{}: not captured", "backtrace".yellow().bold());
+				eprintln!("\n{}: not captured", WARN_STYLE.apply_to("backtrace"));
			}
		}
	}
}
pub async fn get_index(project: &Project, index: Option<&str>) -> anyhow::Result<gix::Url> {
let manifest = match project.deser_manifest().await {
Ok(manifest) => Some(manifest),
Err(e) => match e {
ManifestReadError::Io(e) if e.kind() == std::io::ErrorKind::NotFound => None,
e => return Err(e.into()),
},
};
let index_url = match index {
Some(index) => index.try_into().ok(),
None => match manifest {
Some(_) => None,
None => Some(read_config().await?.default_index),
},
};
if let Some(url) = index_url {
return Ok(url);
}
let index_name = index.unwrap_or(DEFAULT_INDEX_NAME);
manifest
.unwrap()
.indices
.remove(index_name)
.with_context(|| format!("index {index_name} not found in manifest"))
}
pub fn dep_type_to_key(dep_type: DependencyType) -> &'static str {
match dep_type {
DependencyType::Standard => "dependencies",
DependencyType::Dev => "dev_dependencies",
DependencyType::Peer => "peer_dependencies",
}
}
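
// Resolution order in get_index above, summarized: an explicit --index value
// is first tried as a URL; failing that, it is treated as an index name in
// the manifest (falling back to DEFAULT_INDEX_NAME), and when no manifest
// exists at all, the CLI config's default index is used.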

src/cli/reporters.rs (new file, 211 lines)

@@ -0,0 +1,211 @@
//! Progress reporters for the CLI
use std::{
future::Future,
io::{Stdout, Write},
sync::{Arc, Mutex, Once, OnceLock},
time::Duration,
};
use indicatif::{MultiProgress, ProgressBar, ProgressStyle};
use pesde::reporters::{
DownloadProgressReporter, DownloadsReporter, PatchProgressReporter, PatchesReporter,
};
pub const TICK_CHARS: &str = "⣷⣯⣟⡿⢿⣻⣽⣾";
pub fn root_progress_style() -> ProgressStyle {
ProgressStyle::with_template("{prefix:.dim}{msg:>8.214/yellow} {spinner} [{elapsed_precise}]")
.unwrap()
.tick_chars(TICK_CHARS)
}
pub fn root_progress_style_with_progress() -> ProgressStyle {
ProgressStyle::with_template(
"{prefix:.dim}{msg:>8.214/yellow} {spinner} [{elapsed_precise}] {bar:20} {pos}/{len}",
)
.unwrap()
.tick_chars(TICK_CHARS)
}
pub async fn run_with_reporter_and_writer<W, F, R, Fut>(writer: W, f: F) -> R
where
W: Write + Send + Sync + 'static,
F: FnOnce(MultiProgress, ProgressBar, Arc<CliReporter<W>>) -> Fut,
Fut: Future<Output = R>,
{
let multi_progress = MultiProgress::new();
crate::PROGRESS_BARS
.lock()
.unwrap()
.replace(multi_progress.clone());
let root_progress = multi_progress.add(ProgressBar::new(0));
root_progress.set_style(root_progress_style());
root_progress.enable_steady_tick(Duration::from_millis(100));
let reporter = Arc::new(CliReporter::with_writer(
writer,
multi_progress.clone(),
root_progress.clone(),
));
let result = f(multi_progress.clone(), root_progress.clone(), reporter).await;
root_progress.finish();
multi_progress.clear().unwrap();
crate::PROGRESS_BARS.lock().unwrap().take();
result
}
pub async fn run_with_reporter<F, R, Fut>(f: F) -> R
where
F: FnOnce(MultiProgress, ProgressBar, Arc<CliReporter<Stdout>>) -> Fut,
Fut: Future<Output = R>,
{
run_with_reporter_and_writer(std::io::stdout(), f).await
}
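
// Typical usage, mirroring install.rs above (sketch):
//
//   let result = run_with_reporter(|multi, root_progress, reporter| async move {
//       root_progress.set_message("resolve");
//       // ... do the work, reporting progress through `reporter` ...
//       Ok::<_, anyhow::Error>(())
//   })
//   .await;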
pub struct CliReporter<W = Stdout> {
writer: Mutex<W>,
child_style: ProgressStyle,
child_style_with_bytes: ProgressStyle,
child_style_with_bytes_without_total: ProgressStyle,
multi_progress: MultiProgress,
root_progress: ProgressBar,
}
impl<W> CliReporter<W> {
#[allow(unknown_lints, clippy::literal_string_with_formatting_args)]
pub fn with_writer(
writer: W,
multi_progress: MultiProgress,
root_progress: ProgressBar,
) -> Self {
Self {
writer: Mutex::new(writer),
child_style: ProgressStyle::with_template("{msg:.dim}").unwrap(),
child_style_with_bytes: ProgressStyle::with_template(
"{msg:.dim} {bytes:.dim}/{total_bytes:.dim}",
)
.unwrap(),
child_style_with_bytes_without_total: ProgressStyle::with_template(
"{msg:.dim} {bytes:.dim}",
)
.unwrap(),
multi_progress,
root_progress,
}
}
}
pub struct CliDownloadProgressReporter<W> {
root_reporter: Arc<CliReporter<W>>,
name: String,
progress: OnceLock<ProgressBar>,
set_progress: Once,
}
impl<W: Write + Send + Sync + 'static> DownloadsReporter for CliReporter<W> {
type DownloadProgressReporter = CliDownloadProgressReporter<W>;
fn report_download(self: Arc<Self>, name: String) -> Self::DownloadProgressReporter {
self.root_progress.inc_length(1);
CliDownloadProgressReporter {
root_reporter: self,
name,
progress: OnceLock::new(),
set_progress: Once::new(),
}
}
}
impl<W: Write + Send + Sync + 'static> DownloadProgressReporter for CliDownloadProgressReporter<W> {
fn report_start(&self) {
let progress = self.root_reporter.multi_progress.add(ProgressBar::new(0));
progress.set_style(self.root_reporter.child_style.clone());
progress.set_message(format!("- {}", self.name));
self.progress
.set(progress)
.expect("report_start called more than once");
}
fn report_progress(&self, total: u64, len: u64) {
if let Some(progress) = self.progress.get() {
progress.set_length(total);
progress.set_position(len);
self.set_progress.call_once(|| {
if total > 0 {
progress.set_style(self.root_reporter.child_style_with_bytes.clone());
} else {
progress.set_style(
self.root_reporter
.child_style_with_bytes_without_total
.clone(),
);
}
});
}
}
fn report_done(&self) {
if let Some(progress) = self.progress.get() {
if progress.is_hidden() {
writeln!(
self.root_reporter.writer.lock().unwrap(),
"downloaded {}",
self.name
)
.unwrap();
}
progress.finish();
self.root_reporter.multi_progress.remove(progress);
self.root_reporter.root_progress.inc(1);
}
}
}
pub struct CliPatchProgressReporter<W> {
root_reporter: Arc<CliReporter<W>>,
name: String,
progress: ProgressBar,
}
impl<W: Write + Send + Sync + 'static> PatchesReporter for CliReporter<W> {
type PatchProgressReporter = CliPatchProgressReporter<W>;
fn report_patch(self: Arc<Self>, name: String) -> Self::PatchProgressReporter {
let progress = self.multi_progress.add(ProgressBar::new(0));
progress.set_style(self.child_style.clone());
progress.set_message(format!("- {name}"));
self.root_progress.inc_length(1);
CliPatchProgressReporter {
root_reporter: self,
name,
progress,
}
}
}
impl<W: Write + Send + Sync + 'static> PatchProgressReporter for CliPatchProgressReporter<W> {
fn report_done(&self) {
if self.progress.is_hidden() {
writeln!(
self.root_reporter.writer.lock().unwrap(),
"patched {}",
self.name
)
.unwrap();
}
self.progress.finish();
self.root_reporter.multi_progress.remove(&self.progress);
self.root_reporter.root_progress.inc(1);
}
}

src/cli/style.rs (new file, 54 lines)

@@ -0,0 +1,54 @@
use console::{Style, StyledObject};
use paste::paste;
use std::{fmt::Display, sync::LazyLock};
#[derive(Debug)]
pub struct LazyStyle<T>(LazyLock<T>);
impl LazyStyle<Style> {
pub fn apply_to<D>(&self, text: D) -> StyledObject<D> {
LazyLock::force(&self.0).apply_to(text)
}
}
impl<T: Display> Display for LazyStyle<T> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", LazyLock::force(&self.0))
}
}
macro_rules! make_style {
($name:ident, $color:ident) => {
make_style!($name, $color());
};
($name:ident, $($color:tt)+) => {
paste! {
pub static [<$name _STYLE>]: LazyStyle<Style> = LazyStyle(LazyLock::new(||
Style::new().$($color)+.bold()
));
}
};
}
macro_rules! make_prefix {
($name:ident) => {
paste! {
pub static [<$name:upper _PREFIX>]: LazyStyle<StyledObject<&'static str>> = LazyStyle(LazyLock::new(||
[<$name:upper _STYLE>].apply_to(stringify!($name))
));
}
};
}
pub const CLI_COLOR_256: u8 = 214;
make_style!(INFO, cyan);
make_style!(WARN, yellow);
make_prefix!(warn);
make_style!(ERROR, red);
make_prefix!(error);
make_style!(SUCCESS, green);
make_style!(CLI, color256(CLI_COLOR_256));
make_style!(ADDED, green);
make_style!(REMOVED, red);
make_style!(URL, blue().underlined());
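
// For reference, `make_style!(INFO, cyan)` expands via `paste` to:
//
//   pub static INFO_STYLE: LazyStyle<Style> =
//       LazyStyle(LazyLock::new(|| Style::new().cyan().bold()));
//
// and `make_prefix!(warn)` builds WARN_PREFIX, the literal "warn" pre-styled
// with WARN_STYLE.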

src/cli/version.rs

@@ -1,113 +1,77 @@
-use crate::cli::{
+use crate::{
+	cli::{
		bin_dir,
		config::{read_config, write_config, CliConfig},
		files::make_executable,
		home_dir,
+		reporters::run_with_reporter,
+		style::{ADDED_STYLE, CLI_STYLE, REMOVED_STYLE, URL_STYLE},
+	},
+	util::no_build_metadata,
};
-use anyhow::Context;
-use colored::Colorize;
+use anyhow::Context as _;
+use console::Style;
use fs_err::tokio as fs;
-use futures::StreamExt;
-use reqwest::header::ACCEPT;
-use semver::Version;
-use serde::Deserialize;
+use jiff::SignedDuration;
+use pesde::{
+	engine::{
+		source::{
+			traits::{DownloadOptions, EngineSource as _, ResolveOptions},
+			EngineSources,
+		},
+		EngineKind,
+	},
+	reporters::DownloadsReporter as _,
+	version_matches,
+};
+use semver::{Version, VersionReq};
use std::{
+	collections::BTreeSet,
	env::current_exe,
	path::{Path, PathBuf},
+	sync::Arc,
};
-use tokio::io::AsyncWrite;
use tracing::instrument;

pub fn current_version() -> Version {
	Version::parse(env!("CARGO_PKG_VERSION")).unwrap()
}

-#[derive(Debug, Deserialize)]
-struct Release {
-	tag_name: String,
-	assets: Vec<Asset>,
-}
-
-#[derive(Debug, Deserialize)]
-struct Asset {
-	name: String,
-	url: url::Url,
-}
-
-#[instrument(level = "trace")]
-fn get_repo() -> (String, String) {
-	let mut parts = env!("CARGO_PKG_REPOSITORY").split('/').skip(3);
-	let (owner, repo) = (
-		parts.next().unwrap().to_string(),
-		parts.next().unwrap().to_string(),
-	);
-
-	tracing::trace!("repository for updates: {owner}/{repo}");
-
-	(owner, repo)
-}
-
-#[derive(Debug)]
-pub enum VersionType {
-	Latest,
-	Specific(Version),
-}
-
-#[instrument(skip(reqwest), level = "trace")]
-pub async fn get_remote_version(
-	reqwest: &reqwest::Client,
-	ty: VersionType,
-) -> anyhow::Result<Version> {
-	let (owner, repo) = get_repo();
-
-	let mut releases = reqwest
-		.get(format!(
-			"https://api.github.com/repos/{owner}/{repo}/releases",
-		))
-		.send()
-		.await
-		.context("failed to send request to GitHub API")?
-		.error_for_status()
-		.context("failed to get GitHub API response")?
-		.json::<Vec<Release>>()
-		.await
-		.context("failed to parse GitHub API response")?
-		.into_iter()
-		.filter_map(|release| Version::parse(release.tag_name.trim_start_matches('v')).ok());
-
-	match ty {
-		VersionType::Latest => releases.max(),
-		VersionType::Specific(version) => {
-			releases.find(|v| no_build_metadata(v) == no_build_metadata(&version))
-		}
-	}
-	.context("failed to find latest version")
-}
-
-pub fn no_build_metadata(version: &Version) -> Version {
-	let mut version = version.clone();
-	version.build = semver::BuildMetadata::EMPTY;
-	version
-}
-
-const CHECK_INTERVAL: chrono::Duration = chrono::Duration::hours(6);
+const CHECK_INTERVAL: SignedDuration = SignedDuration::from_hours(6);
+
+pub async fn find_latest_version(reqwest: &reqwest::Client) -> anyhow::Result<Version> {
+	let version = EngineSources::pesde()
+		.resolve(
+			&VersionReq::STAR,
+			&ResolveOptions {
+				reqwest: reqwest.clone(),
+			},
+		)
+		.await
+		.context("failed to resolve version")?
+		.pop_last()
+		.context("no versions found")?
+		.0;
+
+	Ok(version)
+}

#[instrument(skip(reqwest), level = "trace")]
pub async fn check_for_updates(reqwest: &reqwest::Client) -> anyhow::Result<()> {
	let config = read_config().await?;

	let version = if let Some((_, version)) = config
		.last_checked_updates
-		.filter(|(time, _)| chrono::Utc::now() - *time < CHECK_INTERVAL)
+		.filter(|(time, _)| jiff::Timestamp::now().duration_since(*time) < CHECK_INTERVAL)
	{
		tracing::debug!("using cached version");
		version
	} else {
		tracing::debug!("checking for updates");
-		let version = get_remote_version(reqwest, VersionType::Latest).await?;
+		let version = find_latest_version(reqwest).await?;

		write_config(&CliConfig {
-			last_checked_updates: Some((chrono::Utc::now(), version.clone())),
+			last_checked_updates: Some((jiff::Timestamp::now(), version.clone())),
			..config
		})
		.await?;
@@ -121,215 +85,163 @@ pub async fn check_for_updates(reqwest: &reqwest::Client) -> anyhow::Result<()> {
		return Ok(());
	}

-	let name = env!("CARGO_BIN_NAME");
+	let alert_style = Style::new().yellow();
	let changelog = format!("{}/releases/tag/v{version}", env!("CARGO_PKG_REPOSITORY"));

-	let unformatted_messages = [
-		"".to_string(),
-		format!("update available! {current_version} → {version_no_metadata}"),
-		format!("changelog: {changelog}"),
-		format!("run `{name} self-upgrade` to upgrade"),
-		"".to_string(),
-	];
-
-	let width = unformatted_messages
-		.iter()
-		.map(|s| s.chars().count())
-		.max()
-		.unwrap()
-		+ 4;
-
-	let column = "│".bright_magenta();
-
-	let message = [
-		"".to_string(),
-		format!(
-			"update available! {} → {}",
-			current_version.to_string().red(),
-			version_no_metadata.to_string().green()
-		),
-		format!("changelog: {}", changelog.blue()),
-		format!(
-			"run `{} {}` to upgrade",
-			name.blue(),
-			"self-upgrade".yellow()
-		),
-		"".to_string(),
-	]
-	.into_iter()
-	.enumerate()
-	.map(|(i, s)| {
-		let text_length = unformatted_messages[i].chars().count();
-		let padding = (width as f32 - text_length as f32) / 2f32;
-		let padding_l = " ".repeat(padding.floor() as usize);
-		let padding_r = " ".repeat(padding.ceil() as usize);
-		format!("{column}{padding_l}{s}{padding_r}{column}")
-	})
-	.collect::<Vec<_>>()
-	.join("\n");
-
-	let lines = "─".repeat(width).bright_magenta();
-	let tl = "╭".bright_magenta();
-	let tr = "╮".bright_magenta();
-	let bl = "╰".bright_magenta();
-	let br = "╯".bright_magenta();
-	println!("\n{tl}{lines}{tr}\n{message}\n{bl}{lines}{br}\n");
+	let messages = [
+		format!(
+			"{} {} → {}",
+			alert_style.apply_to("update available!").bold(),
+			REMOVED_STYLE.apply_to(current_version),
+			ADDED_STYLE.apply_to(version_no_metadata)
+		),
+		format!(
+			"run {} to upgrade",
+			CLI_STYLE.apply_to(concat!("`", env!("CARGO_BIN_NAME"), " self-upgrade`")),
+		),
+		"".to_string(),
+		format!("changelog: {}", URL_STYLE.apply_to(changelog)),
+	];
+
+	let column = alert_style.apply_to("│");
+
+	let message = messages
+		.into_iter()
+		.map(|s| format!("{column} {s}"))
+		.collect::<Vec<_>>()
+		.join("\n");
+
+	eprintln!("\n{message}\n");

	Ok(())
}

-#[instrument(skip(reqwest, writer), level = "trace")]
-pub async fn download_github_release<W: AsyncWrite + Unpin>(
-	reqwest: &reqwest::Client,
-	version: &Version,
-	mut writer: W,
-) -> anyhow::Result<()> {
-	let (owner, repo) = get_repo();
-
-	let release = reqwest
-		.get(format!(
-			"https://api.github.com/repos/{owner}/{repo}/releases/tags/v{version}",
-		))
-		.send()
-		.await
-		.context("failed to send request to GitHub API")?
-		.error_for_status()
-		.context("failed to get GitHub API response")?
-		.json::<Release>()
-		.await
-		.context("failed to parse GitHub API response")?;
-
-	let asset = release
-		.assets
-		.into_iter()
-		.find(|asset| {
-			asset.name.ends_with(&format!(
-				"-{}-{}.tar.gz",
-				std::env::consts::OS,
-				std::env::consts::ARCH
-			))
-		})
-		.context("failed to find asset for current platform")?;
-
-	let bytes = reqwest
-		.get(asset.url)
-		.header(ACCEPT, "application/octet-stream")
-		.send()
-		.await
-		.context("failed to send request to download asset")?
-		.error_for_status()
-		.context("failed to download asset")?
-		.bytes()
-		.await
-		.context("failed to download asset")?;
-
-	let mut decoder = async_compression::tokio::bufread::GzipDecoder::new(bytes.as_ref());
-	let mut archive = tokio_tar::Archive::new(&mut decoder);
-
-	let mut entry = archive
-		.entries()
-		.context("failed to read archive entries")?
-		.next()
-		.await
-		.context("archive has no entry")?
-		.context("failed to get first archive entry")?;
-
-	tokio::io::copy(&mut entry, &mut writer)
-		.await
-		.context("failed to write archive entry to file")
-		.map(|_| ())
-}
-
-#[derive(Debug)]
-pub enum TagInfo {
-	Complete(Version),
-	Incomplete(Version),
-}
+const ENGINES_DIR: &str = "engines";
+
+#[instrument(level = "trace")]
+pub async fn get_installed_versions(engine: EngineKind) -> anyhow::Result<BTreeSet<Version>> {
+	let source = engine.source();
+	let path = home_dir()?.join(ENGINES_DIR).join(source.directory());
+	let mut installed_versions = BTreeSet::new();
+
+	let mut read_dir = match fs::read_dir(&path).await {
+		Ok(read_dir) => read_dir,
+		Err(e) if e.kind() == std::io::ErrorKind::NotFound => return Ok(installed_versions),
+		Err(e) => return Err(e).context("failed to read engines directory"),
+	};
+
+	while let Some(entry) = read_dir.next_entry().await? {
+		let path = entry.path();
+
+		let Some(version) = path.file_name().and_then(|s| s.to_str()) else {
+			continue;
+		};
+
+		if let Ok(version) = Version::parse(version) {
+			installed_versions.insert(version);
+		}
+	}
+
+	Ok(installed_versions)
+}

#[instrument(skip(reqwest), level = "trace")]
-pub async fn get_or_download_version(
+pub async fn get_or_download_engine(
	reqwest: &reqwest::Client,
-	tag: &TagInfo,
-	always_give_path: bool,
-) -> anyhow::Result<Option<PathBuf>> {
-	let path = home_dir()?.join("versions");
-	fs::create_dir_all(&path)
-		.await
-		.context("failed to create versions directory")?;
-
-	let version = match tag {
-		TagInfo::Complete(version) => version,
-		// don't fetch the version since it could be cached
-		TagInfo::Incomplete(version) => version,
-	};
-
-	let path = path.join(format!(
-		"{}{}",
-		no_build_metadata(version),
-		std::env::consts::EXE_SUFFIX
-	));
-
-	let is_requested_version = !always_give_path && *version == current_version();
-
-	if path.exists() {
-		tracing::debug!("version already exists");
-		return Ok(if is_requested_version {
-			None
-		} else {
-			Some(path)
-		});
-	}
-
-	if is_requested_version {
-		tracing::debug!("copying current executable to version directory");
-		fs::copy(current_exe()?, &path)
-			.await
-			.context("failed to copy current executable to version directory")?;
-	} else {
-		let version = match tag {
-			TagInfo::Complete(version) => version.clone(),
-			TagInfo::Incomplete(version) => {
-				get_remote_version(reqwest, VersionType::Specific(version.clone()))
-					.await
-					.context("failed to get remote version")?
-			}
-		};
-
-		tracing::debug!("downloading version");
-		download_github_release(
-			reqwest,
-			&version,
-			fs::File::create(&path)
-				.await
-				.context("failed to create version file")?,
-		)
-		.await?;
-	}
-
-	make_executable(&path)
-		.await
-		.context("failed to make downloaded version executable")?;
-
-	Ok(if is_requested_version {
-		None
-	} else {
-		Some(path)
-	})
-}
+	engine: EngineKind,
+	req: VersionReq,
+) -> anyhow::Result<PathBuf> {
+	let source = engine.source();
+	let path = home_dir()?.join(ENGINES_DIR).join(source.directory());
+
+	let installed_versions = get_installed_versions(engine).await?;
+
+	let max_matching = installed_versions
+		.iter()
+		.filter(|v| version_matches(&req, v))
+		.next_back();
+	if let Some(version) = max_matching {
+		return Ok(path
+			.join(version.to_string())
+			.join(source.expected_file_name())
+			.with_extension(std::env::consts::EXE_EXTENSION));
+	}
+
+	run_with_reporter(|_, root_progress, reporter| async {
+		let root_progress = root_progress;
+		let reporter = reporter;
+
+		root_progress.set_message("resolve version");
+		let mut versions = source
+			.resolve(
+				&req,
+				&ResolveOptions {
+					reqwest: reqwest.clone(),
+				},
+			)
+			.await
+			.context("failed to resolve versions")?;
+		let (version, engine_ref) = versions.pop_last().context("no matching versions found")?;
+
+		root_progress.set_message("download");
+		let reporter = reporter.report_download(format!("{engine} v{version}"));
+
+		let archive = source
+			.download(
+				&engine_ref,
+				&DownloadOptions {
+					reqwest: reqwest.clone(),
+					reporter: Arc::new(reporter),
+					version: version.clone(),
+				},
+			)
+			.await
+			.context("failed to download engine")?;
+
+		let path = path.join(version.to_string());
+		fs::create_dir_all(&path)
+			.await
+			.context("failed to create engine container folder")?;
+		let path = path
+			.join(source.expected_file_name())
+			.with_extension(std::env::consts::EXE_EXTENSION);
+
+		let mut file = fs::File::create(&path)
+			.await
+			.context("failed to create new file")?;
+
+		tokio::io::copy(
+			&mut archive
+				.find_executable(source.expected_file_name())
+				.await
+				.context("failed to find executable")?,
+			&mut file,
+		)
+		.await
+		.context("failed to write to file")?;
+
+		make_executable(&path)
+			.await
+			.context("failed to make downloaded version executable")?;
+
+		if engine != EngineKind::Pesde {
+			make_linker_if_needed(engine).await?;
+		}
+
+		Ok::<_, anyhow::Error>(path)
+	})
+	.await
}

#[instrument(level = "trace")]
-pub async fn update_bin_exe(downloaded_file: &Path) -> anyhow::Result<()> {
-	let bin_exe_path = bin_dir().await?.join(format!(
-		"{}{}",
-		env!("CARGO_BIN_NAME"),
-		std::env::consts::EXE_SUFFIX
-	));
-	let mut downloaded_file = downloaded_file.to_path_buf();
+pub async fn replace_pesde_bin_exe(with: &Path) -> anyhow::Result<()> {
+	let bin_exe_path = bin_dir()
+		.await?
+		.join(EngineKind::Pesde.to_string())
+		.with_extension(std::env::consts::EXE_EXTENSION);

-	let exists = bin_exe_path.exists();
+	let exists = fs::metadata(&bin_exe_path).await.is_ok();

	if cfg!(target_os = "linux") && exists {
		fs::remove_file(&bin_exe_path)
@@ -339,23 +251,41 @@ pub async fn update_bin_exe(downloaded_file: &Path) -> anyhow::Result<()> {
		let tempfile = tempfile::Builder::new()
			.make(|_| Ok(()))
			.context("failed to create temporary file")?;
-		let path = tempfile.into_temp_path().to_path_buf();
+		let temp_path = tempfile.into_temp_path().to_path_buf();
		#[cfg(windows)]
-		let path = path.with_extension("exe");
+		let temp_path = temp_path.with_extension("exe");

-		let current_exe = current_exe().context("failed to get current exe path")?;
-		if current_exe == downloaded_file {
-			downloaded_file = path.to_path_buf();
-		}
-
-		fs::rename(&bin_exe_path, &path)
-			.await
-			.context("failed to rename current executable")?;
+		match fs::rename(&bin_exe_path, &temp_path).await {
+			Ok(_) => {}
+			Err(e) if e.kind() == std::io::ErrorKind::NotFound => {}
+			Err(e) => return Err(e).context("failed to rename existing executable"),
+		}
	}

-	fs::copy(downloaded_file, &bin_exe_path)
+	fs::copy(with, &bin_exe_path)
		.await
		.context("failed to copy executable to bin folder")?;

	make_executable(&bin_exe_path).await
}
#[instrument(level = "trace")]
pub async fn make_linker_if_needed(engine: EngineKind) -> anyhow::Result<()> {
	let bin_dir = bin_dir().await?;
	let linker = bin_dir
		.join(engine.to_string())
		.with_extension(std::env::consts::EXE_EXTENSION);

	if fs::metadata(&linker).await.is_err() {
		let exe = current_exe().context("failed to get current exe path")?;

		#[cfg(windows)]
		let result = fs::symlink_file(exe, linker);
		#[cfg(not(windows))]
		let result = fs::symlink(exe, linker);

		result.await.context("failed to create symlink")?;
	}

	Ok(())
}
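The fast path above picks the greatest installed version satisfying the requirement by filtering a sorted collection and taking the last match. A standalone sketch of that selection, using `semver` directly in place of the crate's `version_matches` helper (names and versions here are illustrative):

use semver::{Version, VersionReq};
use std::collections::BTreeSet;

fn main() {
	// A BTreeSet keeps versions sorted ascending, so the last match is the greatest.
	let installed: BTreeSet<Version> = ["0.8.9", "0.9.0", "0.9.1"]
		.iter()
		.map(|v| v.parse().unwrap())
		.collect();

	let req: VersionReq = "^0.9".parse().unwrap();

	// Equivalent to the `.filter(...).next_back()` chain in `get_or_download_engine`.
	let max_matching = installed.iter().filter(|v| req.matches(v)).next_back();

	assert_eq!(max_matching.map(ToString::to_string).as_deref(), Some("0.9.1"));
}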


@ -1,161 +1,182 @@
use crate::{
	graph::{DependencyGraph, DependencyGraphNode},
	reporters::{DownloadProgressReporter as _, DownloadsReporter},
	source::{
		fs::PackageFs,
		ids::PackageId,
		traits::{DownloadOptions, PackageRef as _, PackageSource as _, RefreshOptions},
	},
	Project, RefreshedSources,
};
use async_stream::try_stream;
use futures::Stream;
use std::{num::NonZeroUsize, sync::Arc};
use tokio::{sync::Semaphore, task::JoinSet};
use tracing::{instrument, Instrument as _};

/// Options for downloading.
#[derive(Debug)]
pub(crate) struct DownloadGraphOptions<Reporter> {
	/// The reqwest client.
	pub reqwest: reqwest::Client,
	/// The downloads reporter.
	pub reporter: Option<Arc<Reporter>>,
	/// The refreshed sources.
	pub refreshed_sources: RefreshedSources,
	/// The max number of concurrent network requests.
	pub network_concurrency: NonZeroUsize,
}

impl<Reporter> DownloadGraphOptions<Reporter>
where
	Reporter: DownloadsReporter + Send + Sync + 'static,
{
	/// Creates a new download options with the given reqwest client and reporter.
	pub(crate) fn new(reqwest: reqwest::Client) -> Self {
		Self {
			reqwest,
			reporter: None,
			refreshed_sources: Default::default(),
			network_concurrency: NonZeroUsize::new(16).unwrap(),
		}
	}

	/// Sets the downloads reporter.
	pub(crate) fn reporter(mut self, reporter: impl Into<Arc<Reporter>>) -> Self {
		self.reporter.replace(reporter.into());
		self
	}

	/// Sets the refreshed sources.
	pub(crate) fn refreshed_sources(mut self, refreshed_sources: RefreshedSources) -> Self {
		self.refreshed_sources = refreshed_sources;
		self
	}

	/// Sets the max number of concurrent network requests.
	pub(crate) fn network_concurrency(mut self, network_concurrency: NonZeroUsize) -> Self {
		self.network_concurrency = network_concurrency;
		self
	}
}

impl<Reporter> Clone for DownloadGraphOptions<Reporter> {
	fn clone(&self) -> Self {
		Self {
			reqwest: self.reqwest.clone(),
			reporter: self.reporter.clone(),
			refreshed_sources: self.refreshed_sources.clone(),
			network_concurrency: self.network_concurrency,
		}
	}
}

impl Project {
	/// Downloads a graph of dependencies.
	#[instrument(skip_all, level = "debug")]
	pub(crate) async fn download_graph<Reporter>(
		&self,
		graph: &DependencyGraph,
		options: DownloadGraphOptions<Reporter>,
	) -> Result<
		impl Stream<
			Item = Result<(PackageId, DependencyGraphNode, PackageFs), errors::DownloadGraphError>,
		>,
		errors::DownloadGraphError,
	>
	where
		Reporter: DownloadsReporter + Send + Sync + 'static,
	{
		let DownloadGraphOptions {
			reqwest,
			reporter,
			refreshed_sources,
			network_concurrency,
		} = options;

		let semaphore = Arc::new(Semaphore::new(network_concurrency.get()));

		let mut tasks = graph
			.iter()
			.map(|(package_id, node)| {
				let span = tracing::info_span!("download", package_id = package_id.to_string());
				let project = self.clone();
				let reqwest = reqwest.clone();
				let reporter = reporter.clone();
				let refreshed_sources = refreshed_sources.clone();
				let semaphore = semaphore.clone();
				let package_id = Arc::new(package_id.clone());
				let node = node.clone();

				async move {
					let progress_reporter = reporter
						.clone()
						.map(|reporter| reporter.report_download(package_id.to_string()));

					let _permit = semaphore.acquire().await;

					if let Some(progress_reporter) = &progress_reporter {
						progress_reporter.report_start();
					}

					let source = node.pkg_ref.source();
					refreshed_sources
						.refresh(
							&source,
							&RefreshOptions {
								project: project.clone(),
							},
						)
						.await?;

					tracing::debug!("downloading");

					let fs = match progress_reporter {
						Some(progress_reporter) => {
							source
								.download(
									&node.pkg_ref,
									&DownloadOptions {
										project: project.clone(),
										reqwest,
										id: package_id.clone(),
										reporter: Arc::new(progress_reporter),
									},
								)
								.await
						}
						None => {
							source
								.download(
									&node.pkg_ref,
									&DownloadOptions {
										project: project.clone(),
										reqwest,
										id: package_id.clone(),
										reporter: Arc::new(()),
									},
								)
								.await
						}
					}
					.map_err(Box::new)?;

					tracing::debug!("downloaded");

					Ok((Arc::into_inner(package_id).unwrap(), node, fs))
				}
				.instrument(span)
			})
			.collect::<JoinSet<Result<_, errors::DownloadGraphError>>>();

		let stream = try_stream! {
			while let Some(res) = tasks.join_next().await {
				yield res.unwrap()?;
			}
		};

		Ok(stream)
	}
}
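A design note on `DownloadGraphOptions`: `Clone` is implemented by hand because a derived impl would add a `Reporter: Clone` bound, even though only the `Option<Arc<Reporter>>` is actually cloned. A self-contained sketch of that pattern (illustrative types, not the crate's):

use std::sync::Arc;

struct Options<R> {
	tag: String,
	reporter: Option<Arc<R>>,
}

// #[derive(Clone)] would demand `R: Clone`; cloning the Arc does not.
impl<R> Clone for Options<R> {
	fn clone(&self) -> Self {
		Self {
			tag: self.tag.clone(),
			reporter: self.reporter.clone(),
		}
	}
}

struct NotClone; // a reporter type that is deliberately not Clone

fn main() {
	let opts = Options::<NotClone> {
		tag: "download".into(),
		reporter: Some(Arc::new(NotClone)),
	};
	let copy = opts.clone(); // compiles even though `NotClone` is not Clone
	assert_eq!(copy.tag, "download");
}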
@ -167,13 +188,9 @@ pub mod errors {
	#[derive(Debug, Error)]
	#[non_exhaustive]
	pub enum DownloadGraphError {
		/// An error occurred refreshing a package source
		#[error("failed to refresh package source")]
		RefreshFailed(#[from] crate::source::errors::RefreshError),

		/// Error interacting with the filesystem
		#[error("error interacting with the filesystem")]

@ -182,9 +199,5 @@

		/// Error downloading a package
		#[error("failed to download package")]
		DownloadFailed(#[from] Box<crate::source::errors::DownloadError>),
	}
}
} }


@ -1,155 +1,434 @@
use crate::{
	all_packages_dirs,
	download::DownloadGraphOptions,
	graph::{
		DependencyGraph, DependencyGraphNode, DependencyGraphNodeWithTarget,
		DependencyGraphWithTarget,
	},
	manifest::{target::TargetKind, DependencyType},
	reporters::{DownloadsReporter, PatchesReporter},
	source::{
		ids::PackageId,
		traits::{GetTargetOptions, PackageRef as _, PackageSource as _},
	},
	Project, RefreshedSources, SCRIPTS_LINK_FOLDER,
};
use fs_err::tokio as fs;
use futures::TryStreamExt as _;
use std::{
	collections::HashMap,
	convert::Infallible,
	future::{self, Future},
	num::NonZeroUsize,
	path::PathBuf,
	sync::Arc,
};
use tokio::{pin, task::JoinSet};
use tracing::{instrument, Instrument as _};

/// Hooks to perform actions after certain events during download and linking.
#[allow(unused_variables)]
pub trait DownloadAndLinkHooks: Send + Sync {
	/// The error type for the hooks.
	type Error: std::error::Error + Send + Sync + 'static;

	/// Called after scripts have been downloaded. The `downloaded_graph`
	/// contains all downloaded packages.
	fn on_scripts_downloaded(
		&self,
		graph: &DependencyGraphWithTarget,
	) -> impl Future<Output = Result<(), Self::Error>> + Send {
		future::ready(Ok(()))
	}

	/// Called after binary dependencies have been downloaded. The
	/// `downloaded_graph` contains all downloaded packages.
	fn on_bins_downloaded(
		&self,
		graph: &DependencyGraphWithTarget,
	) -> impl Future<Output = Result<(), Self::Error>> + Send {
		future::ready(Ok(()))
	}

	/// Called after all dependencies have been downloaded. The
	/// `downloaded_graph` contains all downloaded packages.
	fn on_all_downloaded(
		&self,
		graph: &DependencyGraphWithTarget,
	) -> impl Future<Output = Result<(), Self::Error>> + Send {
		future::ready(Ok(()))
	}
}

impl DownloadAndLinkHooks for () {
	type Error = Infallible;
}

/// Options for downloading and linking.
#[derive(Debug)]
pub struct DownloadAndLinkOptions<Reporter = (), Hooks = ()> {
	/// The reqwest client.
	pub reqwest: reqwest::Client,
	/// The downloads reporter.
	pub reporter: Option<Arc<Reporter>>,
	/// The download and link hooks.
	pub hooks: Option<Arc<Hooks>>,
	/// The refreshed sources.
	pub refreshed_sources: RefreshedSources,
	/// Whether to skip dev dependencies.
	pub prod: bool,
	/// The max number of concurrent network requests.
	pub network_concurrency: NonZeroUsize,
	/// Whether to re-install all dependencies even if they are already installed
	pub force: bool,
}

impl<Reporter, Hooks> DownloadAndLinkOptions<Reporter, Hooks>
where
	Reporter: DownloadsReporter + PatchesReporter + Send + Sync + 'static,
	Hooks: DownloadAndLinkHooks + Send + Sync + 'static,
{
	/// Creates a new download options with the given reqwest client and reporter.
	#[must_use]
	pub fn new(reqwest: reqwest::Client) -> Self {
		Self {
			reqwest,
			reporter: None,
			hooks: None,
			refreshed_sources: Default::default(),
			prod: false,
			network_concurrency: NonZeroUsize::new(16).unwrap(),
			force: false,
		}
	}

	/// Sets the downloads reporter.
	#[must_use]
	pub fn reporter(mut self, reporter: impl Into<Arc<Reporter>>) -> Self {
		self.reporter.replace(reporter.into());
		self
	}

	/// Sets the download and link hooks.
	#[must_use]
	pub fn hooks(mut self, hooks: impl Into<Arc<Hooks>>) -> Self {
		self.hooks.replace(hooks.into());
		self
	}

	/// Sets the refreshed sources.
	#[must_use]
	pub fn refreshed_sources(mut self, refreshed_sources: RefreshedSources) -> Self {
		self.refreshed_sources = refreshed_sources;
		self
	}

	/// Sets whether to skip dev dependencies.
	#[must_use]
	pub fn prod(mut self, prod: bool) -> Self {
		self.prod = prod;
		self
	}

	/// Sets the max number of concurrent network requests.
	#[must_use]
	pub fn network_concurrency(mut self, network_concurrency: NonZeroUsize) -> Self {
		self.network_concurrency = network_concurrency;
		self
	}

	/// Sets whether to re-install all dependencies even if they are already installed
	#[must_use]
	pub fn force(mut self, force: bool) -> Self {
		self.force = force;
		self
	}
}

impl Clone for DownloadAndLinkOptions {
	fn clone(&self) -> Self {
		Self {
			reqwest: self.reqwest.clone(),
			reporter: self.reporter.clone(),
			hooks: self.hooks.clone(),
			refreshed_sources: self.refreshed_sources.clone(),
			prod: self.prod,
			network_concurrency: self.network_concurrency,
			force: self.force,
		}
	}
}
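The hooks trait above leans on return-position `impl Trait` in traits (stable since Rust 1.75) with no-op defaults, so implementors override only the events they care about and `()` opts out of everything. A simplified, self-contained sketch of the same pattern (not the crate's exact trait):

use std::convert::Infallible;
use std::future::{self, Future};

trait Hooks: Send + Sync {
	type Error: std::error::Error + Send + Sync + 'static;

	// Defaults to an already-ready no-op future; implementors may override.
	fn on_done(&self) -> impl Future<Output = Result<(), Self::Error>> + Send {
		future::ready(Ok(()))
	}
}

// The unit type opts out of all hooks, mirroring `impl DownloadAndLinkHooks for ()`.
impl Hooks for () {
	type Error = Infallible;
}

struct Logger;

impl Hooks for Logger {
	type Error = std::io::Error;

	fn on_done(&self) -> impl Future<Output = Result<(), Self::Error>> + Send {
		async {
			println!("all downloads finished");
			Ok(())
		}
	}
}

fn main() {}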
impl Project {
	/// Downloads a graph of dependencies and links them in the correct order
	#[instrument(skip_all, fields(prod = options.prod), level = "debug")]
	pub async fn download_and_link<Reporter, Hooks>(
		&self,
		graph: &Arc<DependencyGraph>,
		options: DownloadAndLinkOptions<Reporter, Hooks>,
	) -> Result<DependencyGraphWithTarget, errors::DownloadAndLinkError<Hooks::Error>>
	where
		Reporter: DownloadsReporter + PatchesReporter + 'static,
		Hooks: DownloadAndLinkHooks + 'static,
	{
		let DownloadAndLinkOptions {
			reqwest,
			reporter,
			hooks,
			refreshed_sources,
			prod,
			network_concurrency,
			force,
		} = options;

		let graph = graph.clone();
		let reqwest = reqwest.clone();
		let manifest = self.deser_manifest().await?;

		if force {
			async fn remove_dir(dir: PathBuf) -> std::io::Result<()> {
				tracing::debug!("force deleting the `{}` folder", dir.display());

				match fs::remove_dir_all(dir).await {
					Ok(()) => Ok(()),
					Err(e) if e.kind() == std::io::ErrorKind::NotFound => Ok(()),
					Err(e) => Err(e),
				}
			}

			let mut tasks = all_packages_dirs()
				.into_iter()
				.map(|folder| remove_dir(self.package_dir().join(&folder)))
				.chain(std::iter::once(remove_dir(
					self.package_dir().join(SCRIPTS_LINK_FOLDER),
				)))
				.collect::<JoinSet<_>>();

			while let Some(task) = tasks.join_next().await {
				task.unwrap()?;
			}
		}

		// step 1. download dependencies
		let graph_to_download = {
			let mut download_graph_options = DownloadGraphOptions::<Reporter>::new(reqwest.clone())
				.refreshed_sources(refreshed_sources.clone())
				.network_concurrency(network_concurrency);

			if let Some(reporter) = reporter.clone() {
				download_graph_options = download_graph_options.reporter(reporter);
			}

			let mut downloaded_graph = DependencyGraph::new();

			let graph_to_download = if force {
				graph.clone()
			} else {
				let mut tasks = graph
					.iter()
					.map(|(id, node)| {
						let id = id.clone();
						let node = node.clone();
						let container_folder =
							node.container_folder_from_project(&id, self, manifest.target.kind());

						async move {
							return (id, node, fs::metadata(&container_folder).await.is_ok());
						}
					})
					.collect::<JoinSet<_>>();

				let mut graph_to_download = DependencyGraph::new();
				while let Some(task) = tasks.join_next().await {
					let (id, node, installed) = task.unwrap();
					if installed {
						downloaded_graph.insert(id, node);
						continue;
					}

					graph_to_download.insert(id, node);
				}

				Arc::new(graph_to_download)
			};

			let downloaded = self
				.download_graph(&graph_to_download, download_graph_options.clone())
				.instrument(tracing::debug_span!("download"))
				.await?;
			pin!(downloaded);

			let mut tasks = JoinSet::new();

			while let Some((id, node, fs)) = downloaded.try_next().await? {
				let container_folder =
					node.container_folder_from_project(&id, self, manifest.target.kind());

				downloaded_graph.insert(id, node);

				let cas_dir = self.cas_dir().to_path_buf();
				tasks.spawn(async move {
					fs::create_dir_all(&container_folder).await?;
					fs.write_to(container_folder, cas_dir, true).await
				});
			}

			while let Some(task) = tasks.join_next().await {
				task.unwrap()?;
			}

			downloaded_graph
		};

		let (wally_graph_to_download, other_graph_to_download) = graph_to_download
			.into_iter()
			.partition::<HashMap<_, _>, _>(|(_, node)| node.pkg_ref.is_wally_package());

		let mut graph = Arc::new(DependencyGraphWithTarget::new());

		async fn get_graph_targets<Hooks: DownloadAndLinkHooks>(
			graph: &mut Arc<DependencyGraphWithTarget>,
			project: &Project,
			manifest_target_kind: TargetKind,
			downloaded_graph: HashMap<PackageId, DependencyGraphNode>,
		) -> Result<(), errors::DownloadAndLinkError<Hooks::Error>> {
			let mut tasks = downloaded_graph
				.into_iter()
				.map(|(id, node)| {
					let source = node.pkg_ref.source();
					let path = Arc::from(
						node.container_folder_from_project(&id, project, manifest_target_kind)
							.as_path(),
					);
					let id = Arc::new(id);
					let project = project.clone();

					async move {
						let target = source
							.get_target(
								&node.pkg_ref,
								&GetTargetOptions {
									project,
									path,
									id: id.clone(),
								},
							)
							.await?;

						Ok::<_, errors::DownloadAndLinkError<Hooks::Error>>((
							Arc::into_inner(id).unwrap(),
							DependencyGraphNodeWithTarget { target, node },
						))
					}
				})
				.collect::<JoinSet<_>>();

			while let Some(task) = tasks.join_next().await {
				let (id, node) = task.unwrap()?;
				Arc::get_mut(graph).unwrap().insert(id, node);
			}

			Ok(())
		}

		// step 2. get targets for non Wally packages (Wally packages require the scripts packages to be downloaded first)
		get_graph_targets::<Hooks>(
			&mut graph,
			self,
			manifest.target.kind(),
			other_graph_to_download,
		)
		.instrument(tracing::debug_span!("get targets (non-wally)"))
		.await?;

		self.link_dependencies(graph.clone(), false)
			.instrument(tracing::debug_span!("link (non-wally)"))
			.await?;

		if let Some(hooks) = &hooks {
			hooks
				.on_scripts_downloaded(&graph)
				.await
				.map_err(errors::DownloadAndLinkError::Hook)?;

			hooks
				.on_bins_downloaded(&graph)
				.await
				.map_err(errors::DownloadAndLinkError::Hook)?;
		}

		// step 3. get targets for Wally packages
		get_graph_targets::<Hooks>(
			&mut graph,
			self,
			manifest.target.kind(),
			wally_graph_to_download,
		)
		.instrument(tracing::debug_span!("get targets (wally)"))
		.await?;

		#[cfg(feature = "patches")]
		{
			use crate::patches::apply_patch;
			let mut tasks = manifest
				.patches
				.iter()
				.flat_map(|(name, versions)| {
					versions
						.iter()
						.map(|(v_id, path)| (PackageId::new(name.clone(), v_id.clone()), path))
				})
				.filter_map(|(id, patch_path)| graph.get(&id).map(|node| (id, node, patch_path)))
				.map(|(id, node, patch_path)| {
					let patch_path = patch_path.to_path(self.package_dir());
					let container_folder = node
						.node
						.container_folder_from_project(&id, self, manifest.target.kind());
					let reporter = reporter.clone();

					async move {
						match reporter {
							Some(reporter) => {
								apply_patch(&id, container_folder, &patch_path, reporter.clone())
									.await
							}
							None => {
								apply_patch(&id, container_folder, &patch_path, Arc::new(())).await
							}
						}
					}
				})
				.collect::<JoinSet<_>>();

			while let Some(task) = tasks.join_next().await {
				task.unwrap()?;
			}
		}

		// step 4. link ALL dependencies. do so with types
		self.link_dependencies(graph.clone(), true)
			.instrument(tracing::debug_span!("link (all)"))
			.await?;

		if let Some(hooks) = &hooks {
			hooks
				.on_all_downloaded(&graph)
				.await
				.map_err(errors::DownloadAndLinkError::Hook)?;
		}

		let mut graph = Arc::into_inner(graph).unwrap();

		if prod {
			graph.retain(|_, node| node.node.resolved_ty != DependencyType::Dev);
		}

		if prod || !force {
			self.remove_unused(&graph).await?;
		}

		Ok(graph)
	}
}
@ -161,6 +440,10 @@ pub mod errors {
	#[derive(Debug, Error)]
	#[non_exhaustive]
	pub enum DownloadAndLinkError<E> {
		/// Reading the manifest failed
		#[error("error reading manifest")]
		ManifestRead(#[from] crate::errors::ManifestReadError),

		/// An error occurred while downloading the graph
		#[error("error downloading graph")]
		DownloadGraph(#[from] crate::download::errors::DownloadGraphError),

@ -170,7 +453,24 @@

		Linking(#[from] crate::linking::errors::LinkingError),

		/// An error occurred while executing the pesde callback
		#[error("error executing hook")]
		Hook(#[source] E),

		/// IO error
		#[error("io error")]
		Io(#[from] std::io::Error),

		/// Error getting a target
		#[error("error getting target")]
		GetTarget(#[from] crate::source::errors::GetTargetError),

		/// Removing unused dependencies failed
		#[error("error removing unused dependencies")]
		RemoveUnused(#[from] crate::linking::incremental::errors::RemoveUnusedError),

		/// Patching a package failed
		#[cfg(feature = "patches")]
		#[error("error applying patch")]
		Patch(#[from] crate::patches::errors::ApplyPatchError),
	}
}
} }

src/engine/mod.rs (new file)

@ -0,0 +1,62 @@
/// Sources of engines
pub mod source;
use crate::{engine::source::EngineSources, ser_display_deser_fromstr};
use std::{fmt::Display, str::FromStr};
/// All supported engines
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, PartialOrd, Ord)]
#[cfg_attr(test, derive(schemars::JsonSchema))]
#[cfg_attr(test, schemars(rename_all = "snake_case"))]
pub enum EngineKind {
/// The pesde package manager
Pesde,
/// The Lune runtime
Lune,
}
ser_display_deser_fromstr!(EngineKind);
impl Display for EngineKind {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
EngineKind::Pesde => write!(f, "pesde"),
EngineKind::Lune => write!(f, "lune"),
}
}
}
impl FromStr for EngineKind {
type Err = errors::EngineKindFromStrError;
fn from_str(s: &str) -> Result<Self, Self::Err> {
match s.to_lowercase().as_str() {
"pesde" => Ok(EngineKind::Pesde),
"lune" => Ok(EngineKind::Lune),
_ => Err(errors::EngineKindFromStrError::Unknown(s.to_string())),
}
}
}
impl EngineKind {
/// Returns the source to get this engine from
#[must_use]
pub fn source(self) -> EngineSources {
match self {
EngineKind::Pesde => EngineSources::pesde(),
EngineKind::Lune => EngineSources::lune(),
}
}
}
/// Errors related to engine kinds
pub mod errors {
use thiserror::Error;
/// Errors which can occur while using the FromStr implementation of EngineKind
#[derive(Debug, Error)]
pub enum EngineKindFromStrError {
/// The string isn't a recognized EngineKind
#[error("unknown engine kind {0}")]
Unknown(String),
}
}
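Since `FromStr` lowercases its input and `Display` emits the canonical lowercase name, parsing is case-insensitive and values round-trip through `ser_display_deser_fromstr!`. A usage sketch against the `EngineKind` defined above:

fn main() {
	use std::str::FromStr as _;

	// Parsing is case-insensitive thanks to `to_lowercase`.
	let engine = EngineKind::from_str("Lune").unwrap();
	assert_eq!(engine, EngineKind::Lune);

	// Display is the canonical lowercase name, so "lune" round-trips.
	assert_eq!(engine.to_string(), "lune");

	assert!(EngineKind::from_str("deno").is_err()); // unknown engine kind
}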


@ -0,0 +1,320 @@
use futures::StreamExt as _;
use std::{
collections::BTreeSet,
mem::ManuallyDrop,
path::{Path, PathBuf},
pin::Pin,
str::FromStr,
task::{Context, Poll},
};
use tokio::{
io::{AsyncBufRead, AsyncRead, AsyncReadExt as _, ReadBuf},
pin,
};
use tokio_util::compat::{Compat, FuturesAsyncReadCompatExt as _};
/// The kind of encoding used for the archive
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub enum EncodingKind {
/// Gzip
Gzip,
}
/// The kind of archive
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub enum ArchiveKind {
/// Tar
Tar,
/// Zip
Zip,
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub(crate) struct ArchiveInfo(ArchiveKind, Option<EncodingKind>);
impl FromStr for ArchiveInfo {
type Err = errors::ArchiveInfoFromStrError;
fn from_str(s: &str) -> Result<Self, Self::Err> {
let parts = s.split('.').collect::<Vec<_>>();
Ok(match &*parts {
[.., "tar", "gz"] => ArchiveInfo(ArchiveKind::Tar, Some(EncodingKind::Gzip)),
[.., "tar"] => ArchiveInfo(ArchiveKind::Tar, None),
[.., "zip", "gz"] => {
return Err(errors::ArchiveInfoFromStrError::Unsupported(
ArchiveKind::Zip,
Some(EncodingKind::Gzip),
))
}
[.., "zip"] => ArchiveInfo(ArchiveKind::Zip, None),
_ => return Err(errors::ArchiveInfoFromStrError::Invalid(s.to_string())),
})
}
}
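A few concrete parses of the descriptor grammar above; `ArchiveInfo` and its fields are crate-internal, so this sketch assumes it runs inside the same module:

fn demo() {
	let tar_gz: ArchiveInfo = "pesde-0.6.1-linux-x86_64.tar.gz".parse().unwrap();
	assert_eq!(tar_gz, ArchiveInfo(ArchiveKind::Tar, Some(EncodingKind::Gzip)));

	let zip: ArchiveInfo = "lune-0.8.9-windows-x86_64.zip".parse().unwrap();
	assert_eq!(zip, ArchiveInfo(ArchiveKind::Zip, None));

	// Gzipped zips are rejected rather than silently mis-read.
	assert!("tool.zip.gz".parse::<ArchiveInfo>().is_err());
}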
pub(crate) type ArchiveReader = Pin<Box<dyn AsyncBufRead + Send>>;
/// An archive
pub struct Archive {
pub(crate) info: ArchiveInfo,
pub(crate) reader: ArchiveReader,
}
enum TarReader {
Gzip(async_compression::tokio::bufread::GzipDecoder<ArchiveReader>),
Plain(ArchiveReader),
}
// TODO: try to see if we can avoid the unsafe blocks
impl AsyncRead for TarReader {
fn poll_read(
self: Pin<&mut Self>,
cx: &mut Context<'_>,
buf: &mut ReadBuf<'_>,
) -> Poll<std::io::Result<()>> {
unsafe {
match self.get_unchecked_mut() {
Self::Gzip(r) => Pin::new_unchecked(r).poll_read(cx, buf),
Self::Plain(r) => Pin::new_unchecked(r).poll_read(cx, buf),
}
}
}
}
enum ArchiveEntryInner {
Tar(Box<tokio_tar::Entry<tokio_tar::Archive<TarReader>>>),
Zip {
archive: *mut async_zip::tokio::read::seek::ZipFileReader<std::io::Cursor<Vec<u8>>>,
reader: ManuallyDrop<
Compat<
async_zip::tokio::read::ZipEntryReader<
'static,
std::io::Cursor<Vec<u8>>,
async_zip::base::read::WithoutEntry,
>,
>,
>,
},
}
impl Drop for ArchiveEntryInner {
fn drop(&mut self) {
match self {
Self::Tar(_) => {}
Self::Zip { archive, reader } => unsafe {
ManuallyDrop::drop(reader);
drop(Box::from_raw(*archive));
},
}
}
}
/// An entry in an archive. Usually the executable
pub struct ArchiveEntry(ArchiveEntryInner);
impl AsyncRead for ArchiveEntry {
fn poll_read(
self: Pin<&mut Self>,
cx: &mut Context<'_>,
buf: &mut ReadBuf<'_>,
) -> Poll<std::io::Result<()>> {
unsafe {
match &mut self.get_unchecked_mut().0 {
ArchiveEntryInner::Tar(r) => Pin::new_unchecked(r).poll_read(cx, buf),
ArchiveEntryInner::Zip { reader, .. } => {
Pin::new_unchecked(&mut **reader).poll_read(cx, buf)
}
}
}
}
}
impl Archive {
/// Finds the executable in the archive and returns it as an [`ArchiveEntry`]
pub async fn find_executable(
self,
expected_file_name: &str,
) -> Result<ArchiveEntry, errors::FindExecutableError> {
#[derive(Debug, PartialEq, Eq)]
struct Candidate {
path: PathBuf,
file_name_matches: bool,
extension_matches: bool,
has_permissions: bool,
}
impl Candidate {
fn new(path: PathBuf, perms: u32, expected_file_name: &str) -> Self {
Self {
file_name_matches: path
.file_name()
.is_some_and(|name| name == expected_file_name),
extension_matches: match path.extension() {
Some(ext) if ext == std::env::consts::EXE_EXTENSION => true,
None if std::env::consts::EXE_EXTENSION.is_empty() => true,
_ => false,
},
path,
has_permissions: perms & 0o111 != 0,
}
}
fn should_be_considered(&self) -> bool {
// if nothing matches, we should not consider this candidate as it is most likely not the executable
self.file_name_matches || self.extension_matches || self.has_permissions
}
}
impl Ord for Candidate {
fn cmp(&self, other: &Self) -> std::cmp::Ordering {
self.file_name_matches
.cmp(&other.file_name_matches)
.then(self.extension_matches.cmp(&other.extension_matches))
.then(self.has_permissions.cmp(&other.has_permissions))
}
}
impl PartialOrd for Candidate {
fn partial_cmp(&self, other: &Self) -> Option<std::cmp::Ordering> {
Some(self.cmp(other))
}
}
let mut candidates = BTreeSet::new();
match self.info {
ArchiveInfo(ArchiveKind::Tar, encoding) => {
use async_compression::tokio::bufread as decoders;
let reader = match encoding {
Some(EncodingKind::Gzip) => {
TarReader::Gzip(decoders::GzipDecoder::new(self.reader))
}
None => TarReader::Plain(self.reader),
};
let mut archive = tokio_tar::Archive::new(reader);
let mut entries = archive.entries()?;
while let Some(entry) = entries.next().await.transpose()? {
if entry.header().entry_type().is_dir() {
continue;
}
let candidate = Candidate::new(
entry.path()?.to_path_buf(),
entry.header().mode()?,
expected_file_name,
);
if candidate.should_be_considered() {
candidates.insert(candidate);
}
}
let Some(candidate) = candidates.pop_last() else {
return Err(errors::FindExecutableError::ExecutableNotFound);
};
let mut entries = archive.entries()?;
while let Some(entry) = entries.next().await.transpose()? {
if entry.header().entry_type().is_dir() {
continue;
}
let path = entry.path()?;
if path == candidate.path {
return Ok(ArchiveEntry(ArchiveEntryInner::Tar(Box::new(entry))));
}
}
}
ArchiveInfo(ArchiveKind::Zip, _) => {
let reader = self.reader;
pin!(reader);
// TODO: would be lovely to not have to read the whole archive into memory
let mut buf = vec![];
reader.read_to_end(&mut buf).await?;
let archive = async_zip::base::read::seek::ZipFileReader::with_tokio(
std::io::Cursor::new(buf),
)
.await?;
for entry in archive.file().entries() {
if entry.dir()? {
continue;
}
let path: &Path = entry.filename().as_str()?.as_ref();
let candidate = Candidate::new(
path.to_path_buf(),
entry.unix_permissions().unwrap_or(0) as u32,
expected_file_name,
);
if candidate.should_be_considered() {
candidates.insert(candidate);
}
}
let Some(candidate) = candidates.pop_last() else {
return Err(errors::FindExecutableError::ExecutableNotFound);
};
for (i, entry) in archive.file().entries().iter().enumerate() {
if entry.dir()? {
continue;
}
let path: &Path = entry.filename().as_str()?.as_ref();
if candidate.path == path {
let ptr = Box::into_raw(Box::new(archive));
let reader = (unsafe { &mut *ptr }).reader_without_entry(i).await?;
return Ok(ArchiveEntry(ArchiveEntryInner::Zip {
archive: ptr,
reader: ManuallyDrop::new(reader.compat()),
}));
}
}
}
}
Err(errors::FindExecutableError::ExecutableNotFound)
}
}
/// Errors that can occur when working with archives
pub mod errors {
use thiserror::Error;
/// Errors that can occur when parsing archive info
#[derive(Debug, Error)]
#[non_exhaustive]
pub enum ArchiveInfoFromStrError {
/// The string is not a valid archive descriptor. E.g. `{name}.tar.gz`
#[error("string `{0}` is not a valid archive descriptor")]
Invalid(String),
/// The archive type is not supported. E.g. `{name}.zip.gz`
#[error("archive type {0:?} with encoding {1:?} is not supported")]
Unsupported(super::ArchiveKind, Option<super::EncodingKind>),
}
/// Errors that can occur when finding an executable in an archive
#[derive(Debug, Error)]
#[non_exhaustive]
pub enum FindExecutableError {
/// The executable was not found in the archive
#[error("failed to find executable in archive")]
ExecutableNotFound,
/// An IO error occurred
#[error("IO error")]
Io(#[from] std::io::Error),
/// An error occurred reading the zip archive
#[error("failed to read zip archive")]
Zip(#[from] async_zip::error::ZipError),
}
}
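Candidate selection reduces to an ordering over three booleans, with a `BTreeSet` doing the ranking and `pop_last()` taking the best entry. A standalone sketch of just that ranking; a derived `Ord` over fields in this declaration order is equivalent to the handwritten `cmp` above:

use std::collections::BTreeSet;

#[derive(Debug, PartialEq, Eq, PartialOrd, Ord)]
struct Rank {
	file_name_matches: bool,
	extension_matches: bool,
	has_permissions: bool,
}

fn main() {
	let mut candidates = BTreeSet::new();
	// Looks executable, but has the wrong name:
	candidates.insert(Rank { file_name_matches: false, extension_matches: true, has_permissions: true });
	// Named like the engine, but no executable bit (e.g. a zip entry):
	candidates.insert(Rank { file_name_matches: true, extension_matches: false, has_permissions: false });

	// The name match dominates, as in `Candidate::cmp`.
	let best = candidates.pop_last().unwrap();
	assert!(best.file_name_matches);
}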


@ -0,0 +1,19 @@
use serde::Deserialize;
/// A GitHub release
#[derive(Debug, Eq, PartialEq, Hash, Clone, Deserialize)]
pub struct Release {
/// The tag name of the release
pub tag_name: String,
/// The assets of the release
pub assets: Vec<Asset>,
}
/// An asset of a GitHub release
#[derive(Debug, Eq, PartialEq, Hash, Clone, Deserialize)]
pub struct Asset {
/// The name of the asset
pub name: String,
/// The download URL of the asset
pub url: url::Url,
}
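A hedged sketch of deserializing the GitHub releases payload into these types, assuming `serde_json` and the `url` crate's `serde` feature are available (neither is named in this diff):

fn main() {
	let json = r#"{
		"tag_name": "v0.6.1",
		"assets": [
			{
				"name": "pesde-0.6.1-linux-x86_64.zip",
				"url": "https://api.github.com/repos/owner/repo/releases/assets/1"
			}
		]
	}"#;

	// `url::Url` deserializes from a plain string here.
	let release: Release = serde_json::from_str(json).unwrap();
	assert_eq!(release.tag_name, "v0.6.1");
	assert_eq!(release.assets[0].name, "pesde-0.6.1-linux-x86_64.zip");
}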


@ -0,0 +1,146 @@
/// The GitHub engine reference
pub mod engine_ref;
use crate::{
engine::source::{
archive::Archive,
github::engine_ref::Release,
traits::{DownloadOptions, EngineSource, ResolveOptions},
},
reporters::{response_to_async_read, DownloadProgressReporter},
util::no_build_metadata,
version_matches,
};
use reqwest::header::ACCEPT;
use semver::{Version, VersionReq};
use std::{collections::BTreeMap, path::PathBuf};
/// The GitHub engine source
#[derive(Debug, Eq, PartialEq, Hash, Clone)]
pub struct GitHubEngineSource {
/// The owner of the repository to download from
pub owner: String,
/// The repository of which to download releases from
pub repo: String,
/// The template for the asset name. `{VERSION}` will be replaced with the version
pub asset_template: String,
}
impl EngineSource for GitHubEngineSource {
type Ref = Release;
type ResolveError = errors::ResolveError;
type DownloadError = errors::DownloadError;
fn directory(&self) -> PathBuf {
PathBuf::from("github").join(&self.owner).join(&self.repo)
}
fn expected_file_name(&self) -> &str {
&self.repo
}
async fn resolve(
&self,
requirement: &VersionReq,
options: &ResolveOptions,
) -> Result<BTreeMap<Version, Self::Ref>, Self::ResolveError> {
let ResolveOptions { reqwest, .. } = options;
Ok(reqwest
.get(format!(
"https://api.github.com/repos/{}/{}/releases",
urlencoding::encode(&self.owner),
urlencoding::encode(&self.repo),
))
.send()
.await?
.error_for_status()?
.json::<Vec<Release>>()
.await?
.into_iter()
.filter_map(
|release| match release.tag_name.trim_start_matches('v').parse() {
Ok(version) if version_matches(requirement, &version) => {
Some((version, release))
}
_ => None,
},
)
.collect())
}
async fn download<R: DownloadProgressReporter + 'static>(
&self,
engine_ref: &Self::Ref,
options: &DownloadOptions<R>,
) -> Result<Archive, Self::DownloadError> {
let DownloadOptions {
reqwest,
reporter,
version,
..
} = options;
let desired_asset_names = [
self.asset_template
.replace("{VERSION}", &version.to_string()),
self.asset_template
.replace("{VERSION}", &no_build_metadata(version).to_string()),
];
let asset = engine_ref
.assets
.iter()
.find(|asset| {
desired_asset_names
.iter()
.any(|name| asset.name.eq_ignore_ascii_case(name))
})
.ok_or(errors::DownloadError::AssetNotFound)?;
reporter.report_start();
let response = reqwest
.get(asset.url.clone())
.header(ACCEPT, "application/octet-stream")
.send()
.await?
.error_for_status()?;
Ok(Archive {
info: asset.name.parse()?,
reader: Box::pin(response_to_async_read(response, reporter.clone())),
})
}
}
/// Errors that can occur when working with the GitHub engine source
pub mod errors {
use thiserror::Error;
/// Errors that can occur when resolving a GitHub engine
#[derive(Debug, Error)]
#[non_exhaustive]
pub enum ResolveError {
/// Handling the request failed
#[error("failed to handle GitHub API request")]
Request(#[from] reqwest::Error),
}
/// Errors that can occur when downloading a GitHub engine
#[derive(Debug, Error)]
#[non_exhaustive]
pub enum DownloadError {
/// An asset for the current platform could not be found
#[error("failed to find asset for current platform")]
AssetNotFound,
/// Handling the request failed
#[error("failed to handle GitHub API request")]
Request(#[from] reqwest::Error),
/// The asset's name could not be parsed
#[error("failed to parse asset name")]
ParseAssetName(#[from] crate::engine::source::archive::errors::ArchiveInfoFromStrError),
}
}
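Asset matching tries both the full version and the build-metadata-stripped one, case-insensitively. A standalone sketch of the template expansion (version strings made up; `no_build_metadata` is the crate helper that strips the `+...` suffix):

fn main() {
	let asset_template = "pesde-{VERSION}-linux-x86_64.zip";
	let version = "0.6.1+abc123"; // illustrative; `no_build_metadata` would yield "0.6.1"

	let desired_asset_names = [
		asset_template.replace("{VERSION}", version),
		asset_template.replace("{VERSION}", "0.6.1"),
	];

	// Matching is case-insensitive, as in `download` above.
	let available = "PESDE-0.6.1-linux-x86_64.zip";
	assert!(desired_asset_names
		.iter()
		.any(|name| available.eq_ignore_ascii_case(name)));
}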

src/engine/source/mod.rs (new file)

@ -0,0 +1,145 @@
use crate::{
engine::source::{
archive::Archive,
traits::{DownloadOptions, EngineSource, ResolveOptions},
},
reporters::DownloadProgressReporter,
};
use semver::{Version, VersionReq};
use std::{collections::BTreeMap, path::PathBuf};
/// Archives
pub mod archive;
/// The GitHub engine source
pub mod github;
/// Traits for engine sources
pub mod traits;
/// Engine references
#[derive(Debug, Eq, PartialEq, Hash, Clone)]
pub enum EngineRefs {
/// A GitHub engine reference
GitHub(github::engine_ref::Release),
}
/// Engine sources
#[derive(Debug, Eq, PartialEq, Hash, Clone)]
pub enum EngineSources {
/// A GitHub engine source
GitHub(github::GitHubEngineSource),
}
impl EngineSource for EngineSources {
type Ref = EngineRefs;
type ResolveError = errors::ResolveError;
type DownloadError = errors::DownloadError;
fn directory(&self) -> PathBuf {
match self {
EngineSources::GitHub(source) => source.directory(),
}
}
fn expected_file_name(&self) -> &str {
match self {
EngineSources::GitHub(source) => source.expected_file_name(),
}
}
async fn resolve(
&self,
requirement: &VersionReq,
options: &ResolveOptions,
) -> Result<BTreeMap<Version, Self::Ref>, Self::ResolveError> {
match self {
EngineSources::GitHub(source) => source
.resolve(requirement, options)
.await
.map(|map| {
map.into_iter()
.map(|(version, release)| (version, EngineRefs::GitHub(release)))
.collect()
})
.map_err(Into::into),
}
}
async fn download<R: DownloadProgressReporter + 'static>(
&self,
engine_ref: &Self::Ref,
options: &DownloadOptions<R>,
) -> Result<Archive, Self::DownloadError> {
match (self, engine_ref) {
(EngineSources::GitHub(source), EngineRefs::GitHub(release)) => {
source.download(release, options).await.map_err(Into::into)
}
// for the future
#[allow(unreachable_patterns)]
_ => Err(errors::DownloadError::Mismatch),
}
}
}
impl EngineSources {
/// Returns the source for the pesde engine
#[must_use]
pub fn pesde() -> Self {
let mut parts = env!("CARGO_PKG_REPOSITORY").split('/').skip(3);
let (owner, repo) = (
parts.next().unwrap().to_string(),
parts.next().unwrap().to_string(),
);
EngineSources::GitHub(github::GitHubEngineSource {
owner,
repo,
asset_template: format!(
"pesde-{{VERSION}}-{}-{}.zip",
std::env::consts::OS,
std::env::consts::ARCH
),
})
}
/// Returns the source for the lune engine
#[must_use]
pub fn lune() -> Self {
EngineSources::GitHub(github::GitHubEngineSource {
owner: "lune-org".into(),
repo: "lune".into(),
asset_template: format!(
"lune-{{VERSION}}-{}-{}.zip",
std::env::consts::OS,
std::env::consts::ARCH
),
})
}
}
/// Errors that can occur when working with engine sources
pub mod errors {
use thiserror::Error;
/// Errors that can occur when resolving an engine
#[derive(Debug, Error)]
#[non_exhaustive]
pub enum ResolveError {
/// Failed to resolve the GitHub engine
#[error("failed to resolve github engine")]
GitHub(#[from] super::github::errors::ResolveError),
}
/// Errors that can occur when downloading an engine
#[derive(Debug, Error)]
#[non_exhaustive]
pub enum DownloadError {
/// Failed to download the GitHub engine
#[error("failed to download github engine")]
GitHub(#[from] super::github::errors::DownloadError),
/// Mismatched engine reference
#[error("mismatched engine reference")]
Mismatch,
}
}
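`EngineSources::pesde()` derives the owner and repository from the crate's repository URL by skipping the scheme, the empty segment after `//`, and the host. A standalone sketch with a stand-in URL:

fn main() {
	// Stand-in for env!("CARGO_PKG_REPOSITORY"); the real value comes from Cargo.toml.
	let repository = "https://github.com/pesde-pkg/pesde";

	// split('/') yields ["https:", "", "github.com", owner, repo]; skip(3) drops the first three.
	let mut parts = repository.split('/').skip(3);
	let (owner, repo) = (parts.next().unwrap(), parts.next().unwrap());

	assert_eq!((owner, repo), ("pesde-pkg", "pesde"));
}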


@ -0,0 +1,51 @@
use crate::{engine::source::archive::Archive, reporters::DownloadProgressReporter};
use semver::{Version, VersionReq};
use std::{collections::BTreeMap, fmt::Debug, future::Future, path::PathBuf, sync::Arc};
/// Options for resolving an engine
#[derive(Debug, Clone)]
pub struct ResolveOptions {
/// The reqwest client to use
pub reqwest: reqwest::Client,
}
/// Options for downloading an engine
#[derive(Debug, Clone)]
pub struct DownloadOptions<R: DownloadProgressReporter> {
/// The reqwest client to use
pub reqwest: reqwest::Client,
/// The reporter to use
pub reporter: Arc<R>,
/// The version of the engine to be downloaded
pub version: Version,
}
/// A source of engines
pub trait EngineSource: Debug {
/// The reference type for this source
type Ref;
/// The error type for resolving an engine from this source
type ResolveError: std::error::Error + Send + Sync + 'static;
/// The error type for downloading an engine from this source
type DownloadError: std::error::Error + Send + Sync + 'static;
/// Returns the folder to store the engine's versions in
fn directory(&self) -> PathBuf;
/// Returns the expected file name of the engine in the archive
fn expected_file_name(&self) -> &str;
/// Resolves a requirement to a reference
fn resolve(
&self,
requirement: &VersionReq,
options: &ResolveOptions,
) -> impl Future<Output = Result<BTreeMap<Version, Self::Ref>, Self::ResolveError>> + Send + Sync;
/// Downloads an engine
fn download<R: DownloadProgressReporter + 'static>(
&self,
engine_ref: &Self::Ref,
options: &DownloadOptions<R>,
) -> impl Future<Output = Result<Archive, Self::DownloadError>> + Send + Sync;
}

src/graph.rs (new file)

@ -0,0 +1,100 @@
use crate::{
manifest::{
target::{Target, TargetKind},
Alias, DependencyType,
},
source::{
ids::{PackageId, VersionId},
refs::PackageRefs,
specifiers::DependencySpecifiers,
traits::PackageRef as _,
},
Project, PACKAGES_CONTAINER_NAME,
};
use serde::{Deserialize, Serialize};
use std::{collections::BTreeMap, path::PathBuf};
/// A graph of dependencies
pub type Graph<Node> = BTreeMap<PackageId, Node>;
/// A dependency graph node
#[derive(Serialize, Deserialize, Debug, Clone)]
pub struct DependencyGraphNode {
/// The alias, specifier, and original (as in the manifest) type for the dependency, if it is a direct dependency (i.e. used by the current project)
#[serde(default, skip_serializing_if = "Option::is_none")]
pub direct: Option<(Alias, DependencySpecifiers, DependencyType)>,
/// The dependencies of the package
#[serde(default, skip_serializing_if = "BTreeMap::is_empty")]
pub dependencies: BTreeMap<PackageId, Alias>,
/// The resolved (transformed, for example Peer -> Standard) type of the dependency
pub resolved_ty: DependencyType,
/// Whether the resolved type should be Peer if this isn't depended on
#[serde(default, skip_serializing_if = "std::ops::Not::not")]
pub is_peer: bool,
/// The package reference
pub pkg_ref: PackageRefs,
}
impl DependencyGraphNode {
pub(crate) fn dependencies_dir(
&self,
version_id: &VersionId,
project_target: TargetKind,
) -> String {
if self.pkg_ref.use_new_structure() {
version_id.target().packages_folder(project_target)
} else {
"..".to_string()
}
}
/// Returns the folder to store the contents of the package in
#[must_use]
pub fn container_folder(&self, package_id: &PackageId) -> PathBuf {
let (name, v_id) = package_id.parts();
if self.pkg_ref.is_wally_package() {
return PathBuf::from(format!(
"{}_{}@{}",
name.scope(),
name.name(),
v_id.version()
))
.join(name.name());
}
PathBuf::from(name.escaped())
.join(v_id.version().to_string())
.join(name.name())
}
/// Returns the folder to store the contents of the package in starting from the project's package directory
#[must_use]
pub fn container_folder_from_project(
&self,
package_id: &PackageId,
project: &Project,
manifest_target_kind: TargetKind,
) -> PathBuf {
project
.package_dir()
.join(manifest_target_kind.packages_folder(package_id.version_id().target()))
.join(PACKAGES_CONTAINER_NAME)
.join(self.container_folder(package_id))
}
}
/// A graph of `DependencyGraphNode`s
pub type DependencyGraph = Graph<DependencyGraphNode>;
/// A dependency graph node with a `Target`
#[derive(Debug, Clone)]
pub struct DependencyGraphNodeWithTarget {
/// The target of the package
pub target: Target,
/// The node
pub node: DependencyGraphNode,
}
/// A graph of `DependencyGraphNodeWithTarget`s
pub type DependencyGraphWithTarget = Graph<DependencyGraphNodeWithTarget>;
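The two layouts produced by `container_folder` differ so that Wally packages keep the directory shape Wally tooling expects. A standalone sketch with made-up values; the pesde name escaping is assumed here (see `PackageName::escaped` for the real scheme):

use std::path::PathBuf;

fn main() {
	let (scope, name, version) = ("acme", "transport", "1.2.3");

	// Wally packages keep Wally's `scope_name@version/name` layout:
	let wally = PathBuf::from(format!("{scope}_{name}@{version}")).join(name);
	assert_eq!(wally, PathBuf::from("acme_transport@1.2.3/transport"));

	// pesde packages nest by escaped name, then version, then name:
	let escaped = format!("{scope}+{name}"); // assumed escaping, for illustration only
	let pesde = PathBuf::from(escaped).join(version).join(name);
	assert_eq!(pesde, PathBuf::from("acme+transport/1.2.3/transport"));
}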


@ -1,29 +1,40 @@
#![warn(missing_docs)]
//! A package manager for the Luau programming language, supporting multiple runtimes including Roblox and Lune.
//! pesde has its own registry, however it can also use Wally, and Git repositories as package sources.
//! It has been designed with multiple targets in mind, namely Roblox, Lune, and Luau.

use crate::{
	lockfile::Lockfile,
	manifest::{target::TargetKind, Manifest},
	source::{
		traits::{PackageSource as _, RefreshOptions},
		PackageSources,
	},
};
use async_stream::try_stream;
use fs_err::tokio as fs;
use futures::Stream;
use gix::sec::identity::Account;
use semver::{Version, VersionReq};
use std::{
	collections::{HashMap, HashSet},
	fmt::Debug,
	hash::{Hash as _, Hasher as _},
	path::{Path, PathBuf},
	sync::Arc,
};
use tokio::io::AsyncReadExt as _;
use tracing::instrument;
use wax::Pattern as _;

/// Downloading packages
pub mod download;
/// Utility for downloading and linking in the correct order
pub mod download_and_link;
/// Handling of engines
pub mod engine;
/// Graphs
pub mod graph;
/// Linking packages
pub mod linking;
/// Lockfile
@ -35,6 +46,7 @@ pub mod names;
/// Patching packages
#[cfg(feature = "patches")]
pub mod patches;
pub mod reporters;
/// Resolving packages
pub mod resolver;
/// Running scripts
@ -55,25 +67,37 @@ pub(crate) const LINK_LIB_NO_FILE_FOUND: &str = "____pesde_no_export_file_found"
/// The folder in which scripts are linked
pub const SCRIPTS_LINK_FOLDER: &str = ".pesde";

pub(crate) fn default_index_name() -> String {
	DEFAULT_INDEX_NAME.into()
}

#[derive(Debug, Default)]
struct AuthConfigShared {
	tokens: HashMap<gix::Url, String>,
	git_credentials: Option<Account>,
}

/// Struct containing the authentication configuration
#[derive(Debug, Clone, Default)]
pub struct AuthConfig {
	shared: Arc<AuthConfigShared>,
}

impl AuthConfig {
	/// Create a new `AuthConfig`
	#[must_use]
	pub fn new() -> Self {
		AuthConfig::default()
	}

	/// Set the tokens
	/// Panics if the `AuthConfig` is shared
	#[must_use]
	pub fn with_tokens<I: IntoIterator<Item = (gix::Url, S)>, S: AsRef<str>>(
		mut self,
		tokens: I,
	) -> Self {
		Arc::get_mut(&mut self.shared).unwrap().tokens = tokens
			.into_iter()
			.map(|(url, s)| (url, s.as_ref().to_string()))
			.collect();

@ -81,79 +105,97 @@ impl AuthConfig {

	/// Set the git credentials
	/// Panics if the `AuthConfig` is shared
	#[must_use]
	pub fn with_git_credentials(mut self, git_credentials: Option<Account>) -> Self {
		Arc::get_mut(&mut self.shared).unwrap().git_credentials = git_credentials;
		self
	}

	/// Get the tokens
	#[must_use]
	pub fn tokens(&self) -> &HashMap<gix::Url, String> {
		&self.shared.tokens
	}

	/// Get the git credentials
	#[must_use]
	pub fn git_credentials(&self) -> Option<&Account> {
		self.shared.git_credentials.as_ref()
	}
}
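The "Panics if the `AuthConfig` is shared" caveat on the builders comes from `Arc::get_mut`, which only yields a mutable reference while the `Arc` has exactly one owner. A standalone demonstration:

use std::sync::Arc;

fn main() {
	let mut unique = Arc::new(String::from("token"));
	assert!(Arc::get_mut(&mut unique).is_some()); // sole owner: mutation allowed

	let _second = unique.clone();
	assert!(Arc::get_mut(&mut unique).is_none()); // shared: `.unwrap()` here would panic
}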
#[derive(Debug)]
struct ProjectShared {
	package_dir: PathBuf,
	workspace_dir: Option<PathBuf>,
	data_dir: PathBuf,
	cas_dir: PathBuf,
	auth_config: AuthConfig,
}

/// The main struct of the pesde library, representing a project
/// Unlike `ProjectShared`, this struct is `Send` and `Sync` and is cheap to clone because it is `Arc`-backed
#[derive(Debug, Clone)]
pub struct Project {
	shared: Arc<ProjectShared>,
}

impl Project {
	/// Create a new `Project`
	#[must_use]
	pub fn new(
		package_dir: impl AsRef<Path>,
		workspace_dir: Option<impl AsRef<Path>>,
		data_dir: impl AsRef<Path>,
		cas_dir: impl AsRef<Path>,
		auth_config: AuthConfig,
	) -> Self {
		Project {
			shared: Arc::new(ProjectShared {
				package_dir: package_dir.as_ref().to_path_buf(),
				workspace_dir: workspace_dir.map(|d| d.as_ref().to_path_buf()),
				data_dir: data_dir.as_ref().to_path_buf(),
				cas_dir: cas_dir.as_ref().to_path_buf(),
				auth_config,
			}),
		}
	}

	/// The directory of the package
	#[must_use]
	pub fn package_dir(&self) -> &Path {
		&self.shared.package_dir
	}

	/// The directory of the workspace this package belongs to, if any
	#[must_use]
	pub fn workspace_dir(&self) -> Option<&Path> {
		self.shared.workspace_dir.as_deref()
	}

	/// The directory to store general-purpose data
	#[must_use]
	pub fn data_dir(&self) -> &Path {
		&self.shared.data_dir
	}

	/// The CAS (content-addressable storage) directory
	#[must_use]
	pub fn cas_dir(&self) -> &Path {
		&self.shared.cas_dir
	}

	/// The authentication configuration
	#[must_use]
	pub fn auth_config(&self) -> &AuthConfig {
		&self.shared.auth_config
	}

	/// Read the manifest file
	#[instrument(skip(self), ret(level = "trace"), level = "debug")]
	pub async fn read_manifest(&self) -> Result<String, errors::ManifestReadError> {
		let string = fs::read_to_string(self.package_dir().join(MANIFEST_FILE_NAME)).await?;
		Ok(string)
	}
@@ -161,155 +203,86 @@ impl Project {
 	/// Deserialize the manifest file
 	#[instrument(skip(self), ret(level = "trace"), level = "debug")]
 	pub async fn deser_manifest(&self) -> Result<Manifest, errors::ManifestReadError> {
-		let string = fs::read_to_string(self.package_dir.join(MANIFEST_FILE_NAME)).await?;
-		Ok(toml::from_str(&string)?)
+		deser_manifest(self.package_dir()).await
+	}
+
+	/// Deserialize the manifest file of the workspace root
+	#[instrument(skip(self), ret(level = "trace"), level = "debug")]
+	pub async fn deser_workspace_manifest(
+		&self,
+	) -> Result<Option<Manifest>, errors::ManifestReadError> {
+		let Some(workspace_dir) = self.workspace_dir() else {
+			return Ok(None);
+		};
+
+		deser_manifest(workspace_dir).await.map(Some)
 	}

 	/// Write the manifest file
 	#[instrument(skip(self, manifest), level = "debug")]
 	pub async fn write_manifest<S: AsRef<[u8]>>(&self, manifest: S) -> Result<(), std::io::Error> {
-		fs::write(self.package_dir.join(MANIFEST_FILE_NAME), manifest.as_ref()).await
+		fs::write(
+			self.package_dir().join(MANIFEST_FILE_NAME),
+			manifest.as_ref(),
+		)
+		.await
 	}
 	/// Deserialize the lockfile
 	#[instrument(skip(self), ret(level = "trace"), level = "debug")]
 	pub async fn deser_lockfile(&self) -> Result<Lockfile, errors::LockfileReadError> {
-		let string = fs::read_to_string(self.package_dir.join(LOCKFILE_FILE_NAME)).await?;
-		Ok(toml::from_str(&string)?)
+		let string = fs::read_to_string(self.package_dir().join(LOCKFILE_FILE_NAME)).await?;
+		lockfile::parse_lockfile(&string).map_err(Into::into)
 	}

 	/// Write the lockfile
 	#[instrument(skip(self, lockfile), level = "debug")]
 	pub async fn write_lockfile(
 		&self,
-		lockfile: Lockfile,
+		lockfile: &Lockfile,
 	) -> Result<(), errors::LockfileWriteError> {
-		let string = toml::to_string(&lockfile)?;
-		fs::write(self.package_dir.join(LOCKFILE_FILE_NAME), string).await?;
+		let lockfile = toml::to_string(lockfile)?;
+		let lockfile = format!(
+			r"# This file is automatically @generated by pesde.
+# It is not intended for manual editing.
+format = {}
+{lockfile}",
+			lockfile::CURRENT_FORMAT
+		);
+
+		fs::write(self.package_dir().join(LOCKFILE_FILE_NAME), lockfile).await?;
 		Ok(())
 	}
 	/// Get the workspace members
 	#[instrument(skip(self), level = "debug")]
-	pub async fn workspace_members<P: AsRef<Path> + Debug>(
+	pub async fn workspace_members(
 		&self,
-		dir: P,
 		can_ref_self: bool,
 	) -> Result<
 		impl Stream<Item = Result<(PathBuf, Manifest), errors::WorkspaceMembersError>>,
 		errors::WorkspaceMembersError,
 	> {
-		let dir = dir.as_ref().to_path_buf();
-		let manifest = fs::read_to_string(dir.join(MANIFEST_FILE_NAME))
-			.await
-			.map_err(errors::WorkspaceMembersError::ManifestMissing)?;
-		let manifest = toml::from_str::<Manifest>(&manifest).map_err(|e| {
-			errors::WorkspaceMembersError::ManifestDeser(dir.to_path_buf(), Box::new(e))
-		})?;
+		let dir = self.workspace_dir().unwrap_or(self.package_dir());
+		let manifest = deser_manifest(dir).await?;

 		let members = matching_globs(
 			dir,
-			manifest.workspace_members.iter().map(|s| s.as_str()),
+			manifest.workspace_members.iter().map(String::as_str),
 			false,
 			can_ref_self,
 		)
 		.await?;

-		Ok(stream! {
+		Ok(try_stream! {
 			for path in members {
-				let manifest = fs::read_to_string(path.join(MANIFEST_FILE_NAME))
-					.await
-					.map_err(errors::WorkspaceMembersError::ManifestMissing)?;
-				let manifest = toml::from_str::<Manifest>(&manifest).map_err(|e| {
-					errors::WorkspaceMembersError::ManifestDeser(path.clone(), Box::new(e))
-				})?;
-				yield Ok((path, manifest));
+				let manifest = deser_manifest(&path).await?;
+				yield (path, manifest);
 			}
 		})
 	}
 }
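workspace_members now resolves the directory itself (the workspace root if there is one, else the package dir) and yields manifests lazily via try_stream!. A minimal consumption sketch, assuming futures::StreamExt is available and that the manifest's name field implements Display:

use futures::StreamExt;

async fn print_members(project: &Project) -> Result<(), Box<dyn std::error::Error>> {
	let members = project.workspace_members(false).await?;
	// Streams from try_stream! are not Unpin, so pin before iterating.
	let mut members = std::pin::pin!(members);

	while let Some(member) = members.next().await {
		let (path, manifest) = member?;
		println!("{} -> {}", manifest.name, path.display());
	}

	Ok(())
}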
-/// Gets all matching paths in a directory
-#[deprecated(
-	since = "0.5.0-rc.13",
-	note = "use `matching_globs` instead, which does not have the old behaviour of including whole directories by their name (`src` instead of `src/**`)"
-)]
-#[instrument(ret, level = "trace")]
-pub async fn matching_globs_old_behaviour<
-	'a,
-	P: AsRef<Path> + Debug,
-	I: IntoIterator<Item = &'a str> + Debug,
->(
-	dir: P,
-	globs: I,
-	relative: bool,
-) -> Result<HashSet<PathBuf>, errors::MatchingGlobsError> {
-	let (negative_globs, positive_globs) = globs
-		.into_iter()
-		.partition::<Vec<_>, _>(|glob| glob.starts_with('!'));
-
-	let negative_globs = wax::any(
-		negative_globs
-			.into_iter()
-			.map(|glob| wax::Glob::new(&glob[1..]))
-			.collect::<Result<Vec<_>, _>>()?,
-	)?;
-
-	let (positive_globs, file_names) = positive_globs
-		.into_iter()
-		// only globs we can be sure of (maintaining compatibility with old "only file/dir name" system)
-		.partition::<Vec<_>, _>(|glob| glob.contains('/'));
-	let file_names = file_names.into_iter().collect::<HashSet<_>>();
-
-	let positive_globs = wax::any(
-		positive_globs
-			.into_iter()
-			.map(wax::Glob::new)
-			.collect::<Result<Vec<_>, _>>()?,
-	)?;
-
-	let mut read_dirs = vec![(fs::read_dir(dir.as_ref().to_path_buf()).await?, false)];
-	let mut paths = HashSet::new();
-
-	let mut is_root = true;
-
-	while let Some((mut read_dir, is_entire_dir_included)) = read_dirs.pop() {
-		while let Some(entry) = read_dir.next_entry().await? {
-			let path = entry.path();
-			let relative_path = path.strip_prefix(dir.as_ref()).unwrap();
-			let file_name = path.file_name().unwrap();
-			let is_filename_match =
-				is_root && file_name.to_str().is_some_and(|s| file_names.contains(s));
-
-			if entry.file_type().await?.is_dir() {
-				read_dirs.push((
-					fs::read_dir(&path).await?,
-					is_entire_dir_included || is_filename_match,
-				));
-				if is_filename_match {
-					tracing::warn!("directory name usage found for {}. this is deprecated and will be removed in the future", path.display());
-				}
-			}
-
-			if (is_entire_dir_included || is_filename_match)
-				|| (positive_globs.is_match(relative_path)
-					&& !negative_globs.is_match(relative_path))
-			{
-				paths.insert(if relative {
-					relative_path.to_path_buf()
-				} else {
-					path.to_path_buf()
-				});
-			}
-		}
-		is_root = false;
-	}
-
-	Ok(paths)
-}
 /// Gets all matching paths in a directory
 #[instrument(ret, level = "trace")]
 pub async fn matching_globs<'a, P: AsRef<Path> + Debug, I: IntoIterator<Item = &'a str> + Debug>(
@@ -360,7 +333,7 @@ pub async fn matching_globs<'a, P: AsRef<Path> + Debug, I: IntoIterator<Item = &
 			paths.insert(if relative {
 				relative_path.to_path_buf()
 			} else {
-				path.to_path_buf()
+				path.clone()
 			});
 		}
 	}
@@ -369,24 +342,138 @@ pub async fn matching_globs<'a, P: AsRef<Path> + Debug, I: IntoIterator<Item = &
 	Ok(paths)
 }
-/// Refreshes the sources asynchronously
-pub async fn refresh_sources<I: Iterator<Item = PackageSources>>(
-	project: &Project,
-	sources: I,
-	refreshed_sources: &mut HashSet<PackageSources>,
-) -> Result<(), Box<source::errors::RefreshError>> {
-	try_join_all(sources.map(|source| {
-		let needs_refresh = refreshed_sources.insert(source.clone());
-		async move {
-			if needs_refresh {
-				source.refresh(project).await.map_err(Box::new)
-			} else {
-				Ok(())
-			}
-		}
-	}))
-	.await
-	.map(|_| ())
-}
+/// A struct containing sources already having been refreshed
+#[derive(Debug, Clone, Default)]
+pub struct RefreshedSources(Arc<tokio::sync::Mutex<HashSet<u64>>>);
+
+impl RefreshedSources {
+	/// Create a new empty `RefreshedSources`
+	#[must_use]
+	pub fn new() -> Self {
+		RefreshedSources::default()
+	}
+
+	/// Refreshes the source asynchronously if it has not already been refreshed.
+	/// Will prevent more refreshes of the same source.
+	pub async fn refresh(
+		&self,
+		source: &PackageSources,
+		options: &RefreshOptions,
+	) -> Result<(), source::errors::RefreshError> {
+		let mut hasher = std::hash::DefaultHasher::new();
+		source.hash(&mut hasher);
+		let hash = hasher.finish();
+
+		let mut refreshed_sources = self.0.lock().await;
+
+		if refreshed_sources.insert(hash) {
+			source.refresh(options).await
+		} else {
+			Ok(())
+		}
+	}
+}
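RefreshedSources replaces both the old free function and the &mut HashSet that callers had to thread through every call site. Because the set now sits behind Arc<Mutex<…>>, the handle can be cloned into concurrent tasks and still deduplicate work, keyed by the source's hash rather than the source itself. A hypothetical call site:

let refreshed_sources = RefreshedSources::new();

let tasks = sources.iter().map(|source| {
	// Cloning the handle clones the Arc, not the HashSet.
	let refreshed_sources = refreshed_sources.clone();
	async move { refreshed_sources.refresh(source, &options).await }
});

// Each distinct source is refreshed at most once, however the tasks race.
futures::future::try_join_all(tasks).await?;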
+async fn deser_manifest(path: &Path) -> Result<Manifest, errors::ManifestReadError> {
+	let string = fs::read_to_string(path.join(MANIFEST_FILE_NAME)).await?;
+	toml::from_str(&string).map_err(|e| errors::ManifestReadError::Serde(path.to_path_buf(), e))
+}
+
+/// Find the project & workspace directory roots
+pub async fn find_roots(
+	cwd: PathBuf,
+) -> Result<(PathBuf, Option<PathBuf>), errors::FindRootsError> {
+	let mut current_path = Some(cwd.clone());
+	let mut project_root = None::<PathBuf>;
+	let mut workspace_dir = None::<PathBuf>;
+
+	async fn get_workspace_members(
+		manifest_file: &mut fs::File,
+		path: &Path,
+	) -> Result<HashSet<PathBuf>, errors::FindRootsError> {
+		let mut manifest = String::new();
+		manifest_file
+			.read_to_string(&mut manifest)
+			.await
+			.map_err(errors::ManifestReadError::Io)?;
+		let manifest: Manifest = toml::from_str(&manifest)
+			.map_err(|e| errors::ManifestReadError::Serde(path.to_path_buf(), e))?;
+
+		if manifest.workspace_members.is_empty() {
+			return Ok(HashSet::new());
+		}
+
+		matching_globs(
+			path,
+			manifest.workspace_members.iter().map(String::as_str),
+			false,
+			false,
+		)
+		.await
+		.map_err(errors::FindRootsError::Globbing)
+	}
+
+	while let Some(path) = current_path {
+		current_path = path.parent().map(Path::to_path_buf);
+
+		if workspace_dir.is_some() {
+			if let Some(project_root) = project_root {
+				return Ok((project_root, workspace_dir));
+			}
+		}
+
+		let mut manifest = match fs::File::open(path.join(MANIFEST_FILE_NAME)).await {
+			Ok(manifest) => manifest,
+			Err(e) if e.kind() == std::io::ErrorKind::NotFound => continue,
+			Err(e) => return Err(errors::ManifestReadError::Io(e).into()),
+		};
+
+		match (project_root.as_ref(), workspace_dir.as_ref()) {
+			(Some(project_root), None) => {
+				if get_workspace_members(&mut manifest, &path)
+					.await?
+					.contains(project_root)
+				{
+					workspace_dir = Some(path);
+				}
+			}
+			(None, None) => {
+				if get_workspace_members(&mut manifest, &path)
+					.await?
+					.contains(&cwd)
+				{
+					// initializing a new member of a workspace
+					return Ok((cwd, Some(path)));
+				}

+				project_root = Some(path);
+			}
+			(_, _) => unreachable!(),
+		}
+	}
+
+	// we mustn't expect the project root to be found, as that would
+	// disable the ability to run pesde in a non-project directory (for example to init it)
+	Ok((project_root.unwrap_or(cwd), workspace_dir))
+}
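find_roots walks upward from cwd, remembering the first manifest it meets as the project root and then looking for an ancestor whose workspace_members globs contain that root. A worked example under a hypothetical layout (comments only; the manifest file name and field are the ones used above):

// /ws/pesde.toml            workspace_members = ["members/*"]
// /ws/members/a/pesde.toml  (no workspace_members)
//
// find_roots("/ws/members/a".into()).await?
//   -> ("/ws/members/a".into(), Some("/ws".into()))
//
// find_roots("/ws".into()).await?
//   -> ("/ws".into(), None)  // a workspace root is its own project
//
// From a directory with no manifest anywhere above it, the fall-through
// at the end returns (cwd, None), which is what keeps `pesde init`
// usable in an empty directory.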
+/// Returns whether a version matches a version requirement
+/// Differs from `VersionReq::matches` in that EVERY version matches `*`
+#[must_use]
+pub fn version_matches(req: &VersionReq, version: &Version) -> bool {
+	*req == VersionReq::STAR || req.matches(version)
+}
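The helper exists because in the semver crate a bare `*` requirement does not match pre-release versions, while pesde wants `*` to accept anything. A small sketch:

use semver::{Version, VersionReq};

let pre: Version = "1.0.0-alpha.1".parse().unwrap();

assert!(!VersionReq::STAR.matches(&pre)); // semver: `*` skips pre-releases
assert!(version_matches(&VersionReq::STAR, &pre)); // pesde: `*` matches all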
+pub(crate) fn all_packages_dirs() -> HashSet<String> {
+	let mut dirs = HashSet::new();
+	for target_kind_a in TargetKind::VARIANTS {
+		for target_kind_b in TargetKind::VARIANTS {
+			dirs.insert(target_kind_a.packages_folder(*target_kind_b));
+		}
+	}
+	dirs
+}
 /// Errors that can occur when using the pesde library
@@ -403,8 +490,8 @@ pub mod errors {
 		Io(#[from] std::io::Error),

 		/// An error occurred while deserializing the manifest file
-		#[error("error deserializing manifest file")]
-		Serde(#[from] toml::de::Error),
+		#[error("error deserializing manifest file at {0}")]
+		Serde(PathBuf, #[source] toml::de::Error),
 	}

 	/// Errors that can occur when reading the lockfile
@@ -415,9 +502,9 @@ pub mod errors {
 		#[error("io error reading lockfile")]
 		Io(#[from] std::io::Error),

-		/// An error occurred while deserializing the lockfile
-		#[error("error deserializing lockfile")]
-		Serde(#[from] toml::de::Error),
+		/// An error occurred while parsing the lockfile
+		#[error("error parsing lockfile")]
+		Parse(#[from] crate::lockfile::errors::ParseLockfileError),
 	}

 	/// Errors that can occur when writing the lockfile
@@ -437,13 +524,9 @@ pub mod errors {
 	#[derive(Debug, Error)]
 	#[non_exhaustive]
 	pub enum WorkspaceMembersError {
-		/// The manifest file could not be found
-		#[error("missing manifest file")]
-		ManifestMissing(#[source] std::io::Error),
-
-		/// An error occurred deserializing the manifest file
-		#[error("error deserializing manifest file at {0}")]
-		ManifestDeser(PathBuf, #[source] Box<toml::de::Error>),
+		/// An error occurred parsing the manifest file
+		#[error("error parsing manifest file")]
+		ManifestParse(#[from] ManifestReadError),

 		/// An error occurred interacting with the filesystem
 		#[error("error interacting with the filesystem")]
@@ -466,4 +549,17 @@ pub mod errors {
 		#[error("error building glob")]
 		BuildGlob(#[from] wax::BuildError),
 	}
+
+	/// Errors that can occur when finding project roots
+	#[derive(Debug, Error)]
+	#[non_exhaustive]
+	pub enum FindRootsError {
+		/// Reading the manifest failed
+		#[error("error reading manifest")]
+		ManifestRead(#[from] ManifestReadError),
+
+		/// Globbing failed
+		#[error("error globbing")]
+		Globbing(#[from] MatchingGlobsError),
+	}
 }


@@ -2,7 +2,8 @@ use std::path::{Component, Path};
 use crate::manifest::{target::TargetKind, Manifest};
 use full_moon::{ast::luau::ExportedTypeDeclaration, visitors::Visitor};
-use relative_path::RelativePathBuf;
+use relative_path::RelativePath;
+use tracing::instrument;

 struct TypeVisitor {
 	types: Vec<String>,
@@ -17,13 +18,13 @@ impl Visitor for TypeVisitor {
 		let mut declaration_generics = vec![];
 		let mut generics = vec![];

-		for generic in declaration.generics().iter() {
+		for generic in declaration.generics() {
 			declaration_generics.push(generic.to_string());

 			if generic.default_type().is_some() {
-				generics.push(generic.parameter().to_string())
+				generics.push(generic.parameter().to_string());
 			} else {
-				generics.push(generic.to_string())
+				generics.push(generic.to_string());
 			}
 		}
@@ -41,16 +42,29 @@ impl Visitor for TypeVisitor {
 	}
 }

-/// Get the types exported by a file
-pub fn get_file_types(file: &str) -> Result<Vec<String>, Vec<full_moon::Error>> {
-	let ast = full_moon::parse(file)?;
+pub(crate) fn get_file_types(file: &str) -> Vec<String> {
+	let ast = match full_moon::parse(file) {
+		Ok(ast) => ast,
+		Err(err) => {
+			tracing::error!(
+				"failed to parse file to extract types:\n{}",
+				err.into_iter()
+					.map(|err| format!("\t- {err}"))
+					.collect::<Vec<_>>()
+					.join("\n")
+			);
+
+			return vec![];
+		}
+	};
 	let mut visitor = TypeVisitor { types: vec![] };
 	visitor.visit_ast(&ast);

-	Ok(visitor.types)
+	visitor.types
 }
 /// Generate a linking module for a library
+#[must_use]
 pub fn generate_lib_linking_module<I: IntoIterator<Item = S>, S: AsRef<str>>(
 	path: &str,
 	types: I,
@@ -104,11 +118,13 @@ fn luau_style_path(path: &Path) -> String {
 // This function should be simplified (especially to reduce the number of arguments),
 // but it's not clear how to do that while maintaining the current functionality.
 /// Get the require path for a library
+#[instrument(skip(project_manifest), level = "trace")]
 #[allow(clippy::too_many_arguments)]
+#[must_use]
 pub fn get_lib_require_path(
-	target: &TargetKind,
+	target: TargetKind,
 	base_dir: &Path,
-	lib_file: &RelativePathBuf,
+	lib_file: &RelativePath,
 	destination_dir: &Path,
 	use_new_structure: bool,
 	root_container_dir: &Path,
@@ -116,11 +132,10 @@ pub fn get_lib_require_path(
 	project_manifest: &Manifest,
 ) -> Result<String, errors::GetLibRequirePath> {
 	let path = pathdiff::diff_paths(destination_dir, base_dir).unwrap();
-	tracing::debug!("diffed lib path: {}", path.display());
 	let path = if use_new_structure {
-		tracing::debug!("using new structure for require path with {lib_file:?}");
 		lib_file.to_path(path)
 	} else {
-		tracing::debug!("using old structure for require path with {lib_file:?}");
 		path
 	};
@@ -169,48 +184,55 @@ pub fn get_lib_require_path(
 				}
 				_ => None,
 			})
-			.collect::<Vec<_>>()
-			.join("");
+			.collect::<String>();

 		return Ok(format!("{prefix}{path}"));
-	};
+	}

 	Ok(luau_style_path(&path))
 }
 /// Generate a linking module for a binary
+#[must_use]
 pub fn generate_bin_linking_module<P: AsRef<Path>>(package_root: P, require_path: &str) -> String {
 	format!(
-		r#"_G.PESDE_ROOT = {:?}
-return require({require_path})"#,
+		r"_G.PESDE_ROOT = {:?}
+return require({require_path})",
 		package_root.as_ref().to_string_lossy()
 	)
 }
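The raw-string change is purely cosmetic, so the emitted module is unchanged. For concreteness, a hypothetical invocation and the string it produces (require_path is substituted verbatim; the `{:?}` quotes the root):

// Hypothetical values, purely illustrative.
let module = generate_bin_linking_module("/proj/.pesde/foo", "script.Parent.foo");

// module now contains:
//   _G.PESDE_ROOT = "/proj/.pesde/foo"
//   return require(script.Parent.foo)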
 /// Get the require path for a binary
+#[instrument(level = "trace")]
+#[must_use]
 pub fn get_bin_require_path(
 	base_dir: &Path,
-	bin_file: &RelativePathBuf,
+	bin_file: &RelativePath,
 	destination_dir: &Path,
 ) -> String {
 	let path = pathdiff::diff_paths(destination_dir, base_dir).unwrap();
-	tracing::debug!("diffed bin path: {}", path.display());
 	let path = bin_file.to_path(path);

 	luau_style_path(&path)
 }
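All three get_*_require_path helpers share the same path arithmetic, now taking &RelativePath instead of &RelativePathBuf so callers can pass borrowed paths. A sketch of the two crate calls involved, with hypothetical values:

use relative_path::RelativePath;
use std::path::Path;

// pathdiff answers "how do I get from base_dir to destination_dir?"
let diffed = pathdiff::diff_paths("/proj/.pesde/pkg", "/proj").unwrap();
assert_eq!(diffed, Path::new(".pesde/pkg"));

// RelativePath::to_path then appends the file onto that base.
let full = RelativePath::new("src/main.luau").to_path(&diffed);
assert_eq!(full, Path::new(".pesde/pkg/src/main.luau"));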
 /// Generate a linking module for a script
+#[must_use]
 pub fn generate_script_linking_module(require_path: &str) -> String {
-	format!(r#"return require({require_path})"#)
+	format!(r"return require({require_path})")
 }
 /// Get the require path for a script
+#[instrument(level = "trace")]
+#[must_use]
 pub fn get_script_require_path(
 	base_dir: &Path,
-	script_file: &RelativePathBuf,
+	script_file: &RelativePath,
 	destination_dir: &Path,
 ) -> String {
 	let path = pathdiff::diff_paths(destination_dir, base_dir).unwrap();
-	tracing::debug!("diffed script path: {}", path.display());
 	let path = script_file.to_path(path);

 	luau_style_path(&path)

Some files were not shown because too many files have changed in this diff.