Compare commits

...

44 commits

Author SHA1 Message Date
daimond113
32906400ec
docs: update scripts docs
Some checks failed
Debug / Get build version (push) Has been cancelled
Test & Lint / lint (push) Has been cancelled
Debug / Build for linux-x86_64 (push) Has been cancelled
Debug / Build for macos-aarch64 (push) Has been cancelled
Debug / Build for macos-x86_64 (push) Has been cancelled
Debug / Build for windows-x86_64 (push) Has been cancelled
2025-01-18 16:47:07 +01:00
Nidoxs
5c2f831c26
docs: add an aside for symlink errors on Windows (#20)
* Add an aside for symlink errors on Windows

* Remove redundant whitespace

* Inline URL

Co-authored-by: dai <72147841+daimond113@users.noreply.github.com>

* Revert titles to "Caution" instead of "Warning"

* Use inline code block for error message

* Update docs/src/content/docs/installation.mdx

Co-authored-by: dai <72147841+daimond113@users.noreply.github.com>

* Update docs/src/content/docs/installation.mdx

Co-authored-by: dai <72147841+daimond113@users.noreply.github.com>

---------

Co-authored-by: dai <72147841+daimond113@users.noreply.github.com>
2025-01-05 19:25:52 +01:00
daimond113
97d9251f69
docs: remove branches from git revs
2025-01-03 18:09:07 +01:00
daimond113
89a2103164
chore(release): prepare for v0.5.3
2024-12-30 00:56:58 +01:00
daimond113
0c159e7689
docs: add missing changelog entries 2024-12-30 00:56:03 +01:00
daimond113
4f75af88b7
feat: add meta in index file to preserve future compat 2024-12-30 00:49:24 +01:00
daimond113
f009c957ca
feat: remove verbosity from release mode logs
2024-12-26 22:51:00 +01:00
3569ff32cd
ci: debug builds action (#15)
* chore(actions): create debug build action

* chore(actions): remove unneeded targets

Also do the following:
* Use v4 of artifact upload action
* Install Linux-specific build dependencies
* Do not include version-management feature while building
* Fix cargo build command
* Include native mac x86 target instead of cross compilation

* chore(actions): fix bad compile command

Turns out I hallucinated `--exclude-features` into existence.

* chore(actions): add job to shorten github commit SHA

* chore(actions): use bash patterns for commit SHA trimming

* chore(actions): fix bash pattern syntax being improper

* chore(actions): use `tee` to write trimmed version to stdout for debugging

* chore(actions): include full semver version including git commit SHA

* chore(actions): checkout code first in `get-version` job

* chore(actions): write `trimmed_sha` to `GITHUB_OUTPUT` correctly

* chore(actions): add name for `get-version` job

* chore(actions): make matrix `job-name`s be consistent with release workflow

* chore(actions): provide `exe` for windows manually instead of glob

Also makes step now error on no matching files.
2024-12-25 15:45:29 +01:00
daimond113
c3e764ddda
fix: display spans outside debug
2024-12-22 12:43:42 +01:00
dai
db3335bbf7
docs: add SECURITY.md
2024-12-20 19:06:35 +01:00
Aristosis
711b0009cb
docs: fix improper assignment to PATH (#8)
* Fix improper assignment to path installation.mdx

* Use the home variable installation.mdx

* Remove leading slash

---------

Co-authored-by: dai <72147841+daimond113@users.noreply.github.com>
2024-12-19 21:21:32 +01:00
daimond113
f88b800d51
chore(release): prepare for v0.5.2
2024-12-19 16:19:07 +01:00
daimond113
28df3bcca4
feat(registry): add sentry tracing 2024-12-19 16:18:26 +01:00
daimond113
0f74e2efa3
fix: do not error on missing deps until full linking
2024-12-18 23:34:49 +01:00
daimond113
a6c1108d5b
feat: switch registry to tracing logging 2024-12-18 22:29:10 +01:00
daimond113
9535175a45
feat: add more tracing info 2024-12-18 22:00:58 +01:00
daimond113
d9d27cf45b
fix: resolve pesde_version tags properly
2024-12-18 16:03:50 +01:00
daimond113
60fb68fcf3
fix: change dependency types for removed peers
2024-12-17 14:58:21 +01:00
daimond113
78976834b2
docs(changelog): add missing changelog entry for logging switch 2024-12-17 14:57:38 +01:00
daimond113
52603ea43e
feat: switch to tracing for logging
2024-12-16 23:00:37 +01:00
daimond113
0dde647042
fix(website): render imgs inline
2024-12-15 12:37:24 +01:00
daimond113
3196a83b25
chore(release): prepare for v0.5.1
2024-12-15 00:38:22 +01:00
daimond113
d387c27f16
fix: ignore build metadata when comparing cli versions 2024-12-15 00:35:16 +01:00
daimond113
a6846597ca
docs: correct changelog diff link 2024-12-15 00:01:35 +01:00
daimond113
3810a3b9ff
ci: attempt to fix release ci 2024-12-14 23:59:58 +01:00
daimond113
52c502359b
chore(release): prepare for v0.5.0 2024-12-14 23:53:59 +01:00
daimond113
7d1e20da8c
chore: update dependencies 2024-12-14 23:51:37 +01:00
daimond113
d35f34e8f0
fix: gracefully handle unparsable versions & dont display metadata 2024-12-14 19:57:33 +01:00
daimond113
9ee75ec9c9
feat: remove lower bound limit on pesde package name length 2024-12-14 17:41:57 +01:00
daimond113
919b0036e5
feat: display included scripts in publish command
2024-12-13 23:52:45 +01:00
daimond113
7466131f04
fix: link with types without roblox_sync_config_generator script
2024-12-13 17:06:47 +01:00
dai
0be7dd4d0e
chore(release): prepare for v0.5.0-rc.18
2024-12-12 16:23:27 +01:00
dai
f8d0bc6c4d
fix: correctly get index URLs in publish command 2024-12-12 16:23:11 +01:00
daimond113
381740d2ce
chore(release): prepare for v0.5.0-rc.17
2024-12-11 21:41:09 +01:00
daimond113
a7ea8eb9c1
docs: add missing changelog entry 2024-12-11 21:40:03 +01:00
daimond113
4a3619c26e
docs: document scripts packages 2024-12-11 21:37:59 +01:00
daimond113
16ab05ec72
feat(registry): support granular allowance of specifier types 2024-12-11 21:31:42 +01:00
daimond113
36e6f16ca6
fix: remove deny_unknown_fields from index config 2024-12-09 11:43:56 +01:00
daimond113
4843424dba
fix: dont prompt when no packages are configured 2024-12-09 11:41:54 +01:00
daimond113
e51bc9f9bb
feat: allow multiple customisable scripts packages in init 2024-12-09 11:35:02 +01:00
daimond113
6d8731f1e5
perf: use exec in unix bin linkers 2024-12-08 19:19:43 +01:00
daimond113
49a42dc931
docs: remove note about rc.15 2024-12-08 14:10:43 +01:00
daimond113
13594d6103
chore(release): prepare for v0.5.0-rc.16 2024-12-08 13:56:41 +01:00
daimond113
eab46e4ee5
fix: allow publishing packages with scripts with no lib or bin 2024-12-08 13:55:18 +01:00
60 changed files with 1601 additions and 976 deletions

.github/workflows/debug.yml (new file, 79 lines)

@@ -0,0 +1,79 @@
name: Debug
on:
push:
pull_request:
jobs:
get-version:
name: Get build version
runs-on: ubuntu-latest
outputs:
version: v${{ steps.get_version.outputs.value }}+rev.g${{ steps.trim_sha.outputs.trimmed_sha }}
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Get package version
uses: SebRollen/toml-action@v1.2.0
id: get_version
with:
file: Cargo.toml
field: package.version
- name: Trim commit SHA
id: trim_sha
run: |
commit_sha=${{ github.sha }}
echo "trimmed_sha=${commit_sha:0:7}" | tee $GITHUB_OUTPUT
build:
strategy:
matrix:
include:
- job-name: windows-x86_64
target: x86_64-pc-windows-msvc
runs-on: windows-latest
artifact-name: pesde-debug-${{ needs.get-version.outputs.version }}-windows-x86_64
- job-name: linux-x86_64
target: x86_64-unknown-linux-gnu
runs-on: ubuntu-latest
artifact-name: pesde-debug-${{ needs.get-version.outputs.version }}-linux-x86_64
- job-name: macos-x86_64
target: x86_64-apple-darwin
runs-on: macos-13
artifact-name: pesde-debug-${{ needs.get-version.outputs.version }}-macos-x86_64
- job-name: macos-aarch64
target: aarch64-apple-darwin
runs-on: macos-latest
artifact-name: pesde-debug-${{ needs.get-version.outputs.version }}-macos-aarch64
name: Build for ${{ matrix.job-name }}
runs-on: ${{ matrix.runs-on }}
needs: get-version
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Install Linux build dependencies
if: ${{ matrix.runs-on == 'ubuntu-latest' }}
run: |
sudo apt-get update
sudo apt-get install libdbus-1-dev pkg-config
- name: Install Rust toolchain
uses: dtolnay/rust-toolchain@stable
- name: Compile in debug mode
run: cargo build --bins --no-default-features --features bin,patches,wally-compat --target ${{ matrix.target }} --locked
- name: Upload artifact
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.artifact-name }}
if-no-files-found: error
path: |
target/${{ matrix.target }}/debug/pesde.exe
target/${{ matrix.target }}/debug/pesde
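The `Trim commit SHA` step above relies on Bash substring expansion plus `tee` to both persist and print the step output. A standalone sketch of that step, assuming a hypothetical SHA (in the workflow it comes from `${{ github.sha }}`) and pointing `GITHUB_OUTPUT` at a temp file since the Actions runner normally provides it:

```shell
#!/usr/bin/env bash
# Standalone sketch of the "Trim commit SHA" step from debug.yml.
set -euo pipefail

commit_sha="0123456789abcdef0123456789abcdef01234567"  # hypothetical SHA
GITHUB_OUTPUT="$(mktemp)"  # provided by the Actions runner in real CI

# ${var:offset:length} keeps the first 7 characters (the conventional short
# SHA length); `tee` writes the key=value pair to the output file while also
# echoing it to the job log for debugging.
echo "trimmed_sha=${commit_sha:0:7}" | tee "$GITHUB_OUTPUT"
# -> trimmed_sha=0123456
```

Note that plain `tee` truncates its target; since the runner's `$GITHUB_OUTPUT` file is per-step this is harmless here, but `tee -a` would be the appending equivalent of `>> "$GITHUB_OUTPUT"`.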

@@ -4,8 +4,44 @@ on:
tags:
- v*
env:
CRATE_NAME: pesde
BIN_NAME: pesde
jobs:
prepare:
name: Prepare
runs-on: ubuntu-latest
outputs:
version: ${{ steps.extract_version.outputs.VERSION }}
found: ${{ steps.ensure_not_published.outputs.FOUND }}
steps:
- uses: actions/checkout@v4
- name: Extract version
id: extract_version
shell: bash
run: |
VERSION=$(echo ${{ github.ref_name }} | cut -d'+' -f1 | cut -c 2-)
echo "VERSION=$VERSION" >> "$GITHUB_OUTPUT"
- name: Ensure not published
id: ensure_not_published
shell: bash
env:
VERSION: ${{ steps.extract_version.outputs.VERSION }}
run: |
CRATE_NAME="${{ env.CRATE_NAME }}"
if [ ${#CRATE_NAME} -eq 1 ]; then
DIR="1"
elif [ ${#CRATE_NAME} -eq 2 ]; then
DIR="2"
elif [ ${#CRATE_NAME} -eq 3 ]; then
DIR="3/${CRATE_NAME:0:1}"
else
DIR="${CRATE_NAME:0:2}/${CRATE_NAME:2:2}"
fi
FOUND=$(curl -sSL --fail-with-body "https://index.crates.io/$DIR/${{ env.CRATE_NAME }}" | jq -s 'any(.[]; .vers == "${{ env.VERSION }}")')
echo "FOUND=$FOUND" >> "$GITHUB_OUTPUT"
build:
strategy:
matrix:
@@ -31,13 +67,17 @@ jobs:
target: aarch64-apple-darwin
runs-on: ${{ matrix.os }}
name: Build for ${{ matrix.host }}-${{ matrix.arch }}
needs: [ prepare ]
if: ${{ needs.prepare.outputs.found == 'false' }}
env:
VERSION: ${{ needs.prepare.outputs.version }}
steps:
- uses: actions/checkout@v4
- uses: dtolnay/rust-toolchain@stable
- name: Set env
shell: bash
run: |
ARCHIVE_NAME=${{ env.BIN_NAME }}-$(echo ${{ github.ref_name }} | cut -c 2-)-${{ matrix.host }}-${{ matrix.arch }}
ARCHIVE_NAME=${{ env.BIN_NAME }}-${{ env.VERSION }}-${{ matrix.host }}-${{ matrix.arch }}
echo "ARCHIVE_NAME=$ARCHIVE_NAME" >> $GITHUB_ENV
@@ -91,7 +131,9 @@ jobs:
permissions:
contents: write
pull-requests: read
needs: [ build, publish ]
needs: [ prepare, publish ]
env:
VERSION: ${{ needs.prepare.outputs.version }}
steps:
- uses: actions/checkout@v4
with:
@@ -107,7 +149,7 @@ jobs:
with:
token: ${{ secrets.GITHUB_TOKEN }}
tag_name: ${{ github.ref_name }}
name: ${{ github.ref_name }}
name: v${{ env.VERSION }}
draft: true
prerelease: ${{ startsWith(github.ref_name, 'v0') }}
prerelease: ${{ startsWith(env.VERSION, '0') }}
files: artifacts/*
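The `Ensure not published` step in the release workflow encodes crates.io's sparse-index layout, where the shard directory depends on the crate name's length (1 → `1/`, 2 → `2/`, 3 → `3/<first letter>`, otherwise `<chars 1-2>/<chars 3-4>`). A minimal sketch of just that path rule; the crate names below other than `pesde` are illustrative:

```shell
#!/usr/bin/env bash
# Sketch of the crates.io sparse-index path rule used by the workflow's
# `prepare` job to decide which index file to fetch.
set -euo pipefail

index_dir() {
  local name="$1"
  case "${#name}" in
    1) echo "1" ;;
    2) echo "2" ;;
    3) echo "3/${name:0:1}" ;;              # 3 chars: "3/" + first letter
    *) echo "${name:0:2}/${name:2:2}" ;;    # otherwise: chars 1-2 "/" chars 3-4
  esac
}

index_dir pesde   # -> pe/sd
index_dir toml    # -> to/ml
index_dir cfg-if  # -> cf/g-
```

The workflow then fetches `https://index.crates.io/$DIR/$CRATE_NAME` and uses `jq` to test whether any published entry's `vers` field matches the tag's version, skipping the build if it was already released.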

@@ -5,144 +5,93 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [0.5.0-rc.15] - 2024-12-08
## [0.5.3] - 2024-12-30
### Added
- Add improved CLI styling by @daimond113
- Install pesde dependencies before Wally to support scripts packages by @daimond113
- Support packages exporting scripts by @daimond113
- Support using workspace root as a member by @daimond113
- Add meta field in index files to preserve compatibility with potential future changes by @daimond113
### Removed
- Remove special scripts repo handling to favour standard packages by @daimond113
### Changed
- Remove verbosity from release mode logging by @daimond113
## [0.5.2] - 2024-12-19
### Fixed
- Link dependencies before type extraction to support more use cases by @daimond113
- Strip `.luau` extension from linker modules' require paths to comply with Luau by @daimond113
- Correctly handle graph paths for resolving overridden packages by @daimond113
- Do not require `--` in bin package executables on Unix by @daimond113
- Change dependency types for removed peer dependencies by @daimond113
- Resolve version to correct tag for `pesde_version` field by @daimond113
- Do not error on missing dependencies until full linking by @daimond113
## [0.5.0-rc.14] - 2024-11-30
### Changed
- Switch from `log` to `tracing` for logging by @daimond113
## [0.5.1] - 2024-12-15
### Fixed
- Fix `includes` not supporting root files by @daimond113
- Ignore build metadata when comparing CLI versions by @daimond113
## [0.5.0-rc.13] - 2024-11-28
## [0.5.0] - 2024-12-14
### Added
- Add support for multiple targets under the same package name in workspace members by @daimond113
- Add `yes` argument to skip all prompts in publish command by @daimond113
- Publish all workspace members when publishing a workspace by @daimond113
- Inform user about not finding any bin package when using its bin invocation by @daimond113
- Support full version requirements in workspace version field by @daimond113
- Improved authentication system for registry changes by @daimond113
- New website by @lukadev-0
- Add `--index` flag to `publish` command to publish to a specific index by @daimond113
- Support fallback Wally registries by @daimond113
- Print that no updates are available in `outdated` command by @daimond113
- Support negated globs in `workspace_members` field by @daimond113
- Make `includes` use glob patterns by @daimond113
- Use symlinks for workspace dependencies to not require reinstalling by @daimond113
- Add `auth token` command to print the auth token for the index by @daimond113
- Support specifying which external registries are allowed on registries by @daimond113
- Add improved CLI styling by @daimond113
- Install pesde dependencies before Wally to support scripts packages by @daimond113
- Support packages exporting scripts by @daimond113
- Support using workspace root as a member by @daimond113
- Allow multiple, user selectable scripts packages to be selected (& custom packages inputted) in `init` command by @daimond113
- Support granular control over which repositories are allowed in various specifier types by @daimond113
- Display included scripts in `publish` command by @daimond113
### Fixed
- Install dependencies of packages in `x` command by @daimond113
### Performance
- Asyncify dependency linking by @daimond113
## [0.5.0-rc.12] - 2024-11-22
### Added
- Support fallback Wally registries by @daimond113
### Fixed
- Fix peer dependencies being resolved incorrectly by @daimond113
- Set PESDE_ROOT to the correct path in `pesde run` by @daimond113
## [0.5.0-rc.11] - 2024-11-20
### Fixed
- Add back mistakenly removed updates check caching by @daimond113
- Set download error source to inner error to propagate the error by @daimond113
- Correctly copy workspace packages by @daimond113
## [0.5.0-rc.10] - 2024-11-16
### Fixed
- Fix `self-install` doing a cross-device move by @daimond113
### Changed
- Only store `pesde_version` executables in the version cache by @daimond113
## [0.5.0-rc.9] - 2024-11-16
### Fixed
- Correctly link Wally server packages by @daimond113
### Changed
- `self-upgrade` now will check for updates by itself by default by @daimond113
## [0.5.0-rc.8] - 2024-11-12
### Added
- Add `--index` flag to `publish` command to publish to a specific index by @daimond113
### Fixed
- Fix versions with dots not being handled correctly by @daimond113
- Use workspace specifiers' `target` field when resolving by @daimond113
- Add feature gates to `wally-compat` specific code in init command by @daimond113
- Remove duplicated manifest file name in `publish` command by @daimond113
- Allow use of Luau packages in `execute` command by @daimond113
- Fix `self-upgrade` overwriting its own binary by @daimond113
- Correct `pesde.toml` inclusion message in `publish` command by @daimond113
- Allow writes to files when `link` is false in PackageFS::write_to by @daimond113
- Handle missing revisions in AnyPackageIdentifier::from_str by @daimond113
- Make GitHub OAuth client ID config optional by @daimond113
- Use updated aliases when reusing lockfile dependencies by @daimond113
- Listen for device flow completion without requiring pressing enter by @daimond113
- Sync scripts repo in background by @daimond113
- Don't make CAS files read-only on Windows (file removal is disallowed if the file is read-only) by @daimond113
- Validate package names are lowercase by @daimond113
- Use a different algorithm for finding a CAS directory to avoid issues with mounted drives by @daimond113
- Remove default.project.json from Git pesde dependencies by @daimond113
- Correctly (de)serialize workspace specifiers by @daimond113
- Fix CAS finder algorithm issues with Windows by @daimond113
- Fix CAS finder algorithm's AlreadyExists error by @daimond113
- Use moved path when setting file to read-only by @daimond113
- Correctly link Wally server packages by @daimond113
- Fix `self-install` doing a cross-device move by @daimond113
- Add back mistakenly removed updates check caching by @daimond113
- Set download error source to inner error to propagate the error by @daimond113
- Correctly copy workspace packages by @daimond113
- Fix peer dependencies being resolved incorrectly by @daimond113
- Set PESDE_ROOT to the correct path in `pesde run` by @daimond113
- Install dependencies of packages in `x` command by @daimond113
- Fix `includes` not supporting root files by @daimond113
- Link dependencies before type extraction to support more use cases by @daimond113
- Strip `.luau` extension from linker modules' require paths to comply with Luau by @daimond113
- Correctly handle graph paths for resolving overridden packages by @daimond113
- Do not require `--` in bin package executables on Unix by @daimond113
- Do not require lib or bin exports if package exports scripts by @daimond113
- Correctly resolve URLs in `publish` command by @daimond113
- Add Roblox types in linker modules even with no config generator script by @daimond113
### Changed
- Switched to fs-err for better errors with file system operations by @daimond113
- Use body bytes over multipart for publishing packages by @daimond113
### Removed
- Remove special scripts repo handling to favour standard packages by @daimond113
### Performance
- Switch to async Rust by @daimond113
## [0.5.0-rc.7] - 2024-10-30
### Added
- New website by @lukadev-0
### Fixed
- Use updated aliases when reusing lockfile dependencies by @daimond113
- Listen for device flow completion without requiring pressing enter by @daimond113
- Sync scripts repo in background by @daimond113
- Don't make CAS files read-only on Windows (file removal is disallowed if the file is read-only) by @daimond113
- Validate package names are lowercase by @daimond113
### Performance
- Clone dependency repos shallowly by @daimond113
### Changed
- Optimize boolean expression in `publish` command by @daimond113
## [0.5.0-rc.6] - 2024-10-14
### Added
- Support full version requirements in workspace version field by @daimond113
- Improved authentication system for registry changes by @daimond113
### Fixed
- Correct `pesde.toml` inclusion message in `publish` command by @daimond113
- Allow writes to files when `link` is false in PackageFS::write_to by @daimond113
- Handle missing revisions in AnyPackageIdentifier::from_str by @daimond113
- Make GitHub OAuth client ID config optional by @daimond113
## [0.5.0-rc.5] - 2024-10-12
### Added
- Inform user about not finding any bin package when using its bin invocation by @daimond113
### Fixed
- Fix `self-upgrade` overwriting its own binary by @daimond113
- Allow use of Luau packages in `execute` command by @daimond113
- Remove duplicated manifest file name in `publish` command by @daimond113
## [0.5.0-rc.4] - 2024-10-12
### Added
- Add `yes` argument to skip all prompts in publish command by @daimond113
- Publish all workspace members when publishing a workspace by @daimond113
### Fixed
- Add feature gates to `wally-compat` specific code in init command by @daimond113
## [0.5.0-rc.3] - 2024-10-06
### Fixed
- Use workspace specifiers' `target` field when resolving by @daimond113
## [0.5.0-rc.2] - 2024-10-06
### Added
- Add support for multiple targets under the same package name in workspace members by @daimond113
### Fixed
- Fix versions with dots not being handled correctly by @daimond113
## [0.5.0-rc.1] - 2024-10-06
### Changed
- Rewrite the entire project in a more maintainable way by @daimond113
- Support workspaces by @daimond113
@@ -150,19 +99,20 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
- Support multiple targets for a single package by @daimond113
- Make registry much easier to self-host by @daimond113
- Start maintaining a changelog by @daimond113
- Optimize boolean expression in `publish` command by @daimond113
- Switched to fs-err for better errors with file system operations by @daimond113
- Use body bytes over multipart for publishing packages by @daimond113
- `self-upgrade` now will check for updates by itself by default by @daimond113
- Only store `pesde_version` executables in the version cache by @daimond113
- Remove lower bound limit of 3 characters for pesde package names by @daimond113
[0.5.0-rc.15]: https://github.com/daimond113/pesde/compare/v0.5.0-rc.14..v0.5.0-rc.15
[0.5.0-rc.14]: https://github.com/daimond113/pesde/compare/v0.5.0-rc.13..v0.5.0-rc.14
[0.5.0-rc.13]: https://github.com/daimond113/pesde/compare/v0.5.0-rc.12..v0.5.0-rc.13
[0.5.0-rc.12]: https://github.com/daimond113/pesde/compare/v0.5.0-rc.11..v0.5.0-rc.12
[0.5.0-rc.11]: https://github.com/daimond113/pesde/compare/v0.5.0-rc.10..v0.5.0-rc.11
[0.5.0-rc.10]: https://github.com/daimond113/pesde/compare/v0.5.0-rc.9..v0.5.0-rc.10
[0.5.0-rc.9]: https://github.com/daimond113/pesde/compare/v0.5.0-rc.8..v0.5.0-rc.9
[0.5.0-rc.8]: https://github.com/daimond113/pesde/compare/v0.5.0-rc.7..v0.5.0-rc.8
[0.5.0-rc.7]: https://github.com/daimond113/pesde/compare/v0.5.0-rc.6..v0.5.0-rc.7
[0.5.0-rc.6]: https://github.com/daimond113/pesde/compare/v0.5.0-rc.5..v0.5.0-rc.6
[0.5.0-rc.5]: https://github.com/daimond113/pesde/compare/v0.5.0-rc.4..v0.5.0-rc.5
[0.5.0-rc.4]: https://github.com/daimond113/pesde/compare/v0.5.0-rc.3..v0.5.0-rc.4
[0.5.0-rc.3]: https://github.com/daimond113/pesde/compare/v0.5.0-rc.2..v0.5.0-rc.3
[0.5.0-rc.2]: https://github.com/daimond113/pesde/compare/v0.5.0-rc.1..v0.5.0-rc.2
[0.5.0-rc.1]: https://github.com/daimond113/pesde/compare/v0.4.7..v0.5.0-rc.1
### Performance
- Clone dependency repos shallowly by @daimond113
- Switch to async Rust by @daimond113
- Asyncify dependency linking by @daimond113
- Use `exec` in Unix bin linking to reduce the number of processes by @daimond113
[0.5.3]: https://github.com/daimond113/pesde/compare/v0.5.2%2Bregistry.0.1.1..v0.5.3%2Bregistry.0.1.2
[0.5.2]: https://github.com/daimond113/pesde/compare/v0.5.1%2Bregistry.0.1.0..v0.5.2%2Bregistry.0.1.1
[0.5.1]: https://github.com/daimond113/pesde/compare/v0.5.0%2Bregistry.0.1.0..v0.5.1%2Bregistry.0.1.0
[0.5.0]: https://github.com/daimond113/pesde/compare/v0.4.7..v0.5.0%2Bregistry.0.1.0
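The new compare links percent-encode the `+` that introduces semver build metadata (`v0.5.2+registry.0.1.1` becomes `v0.5.2%2Bregistry.0.1.1`), presumably because a literal `+` in a URL can otherwise be decoded as a space. A Bash parameter expansion reproduces the encoding:

```shell
#!/usr/bin/env bash
# Encode "+" in a tag name the way the changelog's compare URLs do.
set -euo pipefail

tag='v0.5.2+registry.0.1.1'
encoded="${tag//+/%2B}"  # replace every "+" with "%2B"
echo "$encoded"
# -> v0.5.2%2Bregistry.0.1.1
```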

Cargo.lock (generated, 354 lines changed)

@ -36,9 +36,9 @@ dependencies = [
[[package]]
name = "actix-governor"
version = "0.7.0"
version = "0.8.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "072a3d7907b945b0956f9721e01c117ad5765ce5be2fd9bb1e44a117c669de22"
checksum = "4a0cb8586d3fa368d00ef643e8ef77f5d3d5dfe5c7b333415a556bc12eb1c41a"
dependencies = [
"actix-http",
"actix-web",
@ -357,6 +357,12 @@ version = "1.7.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "69f7f8c3906b62b754cd5326047894316021dcfe5a194c8ea52bdd94934a3457"
[[package]]
name = "arrayvec"
version = "0.7.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7c02d123df017efcdfbd739ef81735b36c5ba83ec3c59c80a9d7ecc718f92e50"
[[package]]
name = "async-broadcast"
version = "0.7.1"
@ -680,7 +686,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1a68f1f47cdf0ec8ee4b941b2eee2a80cb796db73118c0dd09ac63fbe405be22"
dependencies = [
"memchr",
"regex-automata",
"regex-automata 0.4.9",
"serde",
]
@ -757,9 +763,9 @@ checksum = "613afe47fcd5fac7ccf1db93babcb082c5994d996f20b8b159f2ad1658eb5724"
[[package]]
name = "chrono"
version = "0.4.38"
version = "0.4.39"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a21f936df1771bf62b77f047b726c4625ff2e8aa607c01ec06e5a05bd8463401"
checksum = "7e36cc9d416881d2e24f9a963be5fb1cd90966419ac844274161d10488b3e825"
dependencies = [
"android-tzdata",
"iana-time-zone",
@ -1288,19 +1294,6 @@ dependencies = [
"syn 2.0.90",
]
[[package]]
name = "env_logger"
version = "0.10.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4cd405aab171cb85d6735e5c8d9db038c17d3ca007a4d2c25f337935c3d90580"
dependencies = [
"humantime",
"is-terminal",
"log",
"regex",
"termcolor",
]
[[package]]
name = "equivalent"
version = "1.0.1"
@ -1696,7 +1689,7 @@ dependencies = [
"once_cell",
"regex",
"smallvec",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -1709,7 +1702,7 @@ dependencies = [
"gix-date",
"gix-utils",
"itoa",
"thiserror 2.0.5",
"thiserror 2.0.7",
"winnow",
]
@ -1726,7 +1719,7 @@ dependencies = [
"gix-trace",
"kstring",
"smallvec",
"thiserror 2.0.5",
"thiserror 2.0.7",
"unicode-bom",
]
@ -1736,7 +1729,7 @@ version = "0.2.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d48b897b4bbc881aea994b4a5bbb340a04979d7be9089791304e04a9fbc66b53"
dependencies = [
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -1745,7 +1738,7 @@ version = "0.4.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c6ffbeb3a5c0b8b84c3fe4133a6f8c82fa962f4caefe8d0762eced025d3eb4f7"
dependencies = [
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -1771,7 +1764,7 @@ dependencies = [
"gix-features",
"gix-hash",
"memmap2",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -1790,7 +1783,7 @@ dependencies = [
"memchr",
"once_cell",
"smallvec",
"thiserror 2.0.5",
"thiserror 2.0.7",
"unicode-bom",
"winnow",
]
@ -1805,7 +1798,7 @@ dependencies = [
"bstr",
"gix-path",
"libc",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -1822,7 +1815,7 @@ dependencies = [
"gix-sec",
"gix-trace",
"gix-url",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -1834,7 +1827,7 @@ dependencies = [
"bstr",
"itoa",
"jiff",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -1846,7 +1839,7 @@ dependencies = [
"bstr",
"gix-hash",
"gix-object",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -1862,7 +1855,7 @@ dependencies = [
"gix-path",
"gix-ref",
"gix-sec",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -1883,7 +1876,7 @@ dependencies = [
"parking_lot",
"prodash",
"sha1_smol",
"thiserror 2.0.5",
"thiserror 2.0.7",
"walkdir",
]
@ -1905,7 +1898,7 @@ dependencies = [
"gix-trace",
"gix-utils",
"smallvec",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -1938,7 +1931,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0b5eccc17194ed0e67d49285e4853307e4147e95407f91c1c3e4a13ba9f4e4ce"
dependencies = [
"faster-hex",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -1990,7 +1983,7 @@ dependencies = [
"memmap2",
"rustix",
"smallvec",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2001,7 +1994,7 @@ checksum = "1cd3ab68a452db63d9f3ebdacb10f30dba1fa0d31ac64f4203d395ed1102d940"
dependencies = [
"gix-tempfile",
"gix-utils",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2017,7 +2010,7 @@ dependencies = [
"gix-object",
"gix-revwalk",
"smallvec",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2036,7 +2029,7 @@ dependencies = [
"gix-validate",
"itoa",
"smallvec",
"thiserror 2.0.5",
"thiserror 2.0.7",
"winnow",
]
@ -2058,7 +2051,7 @@ dependencies = [
"gix-quote",
"parking_lot",
"tempfile",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2078,7 +2071,7 @@ dependencies = [
"memmap2",
"parking_lot",
"smallvec",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2090,7 +2083,7 @@ dependencies = [
"bstr",
"faster-hex",
"gix-trace",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2102,7 +2095,7 @@ dependencies = [
"bstr",
"faster-hex",
"gix-trace",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2115,7 +2108,7 @@ dependencies = [
"gix-trace",
"home",
"once_cell",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2130,7 +2123,7 @@ dependencies = [
"gix-config-value",
"gix-glob",
"gix-path",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2143,7 +2136,7 @@ dependencies = [
"gix-config-value",
"parking_lot",
"rustix",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2160,7 +2153,7 @@ dependencies = [
"gix-transport",
"gix-utils",
"maybe-async",
"thiserror 2.0.5",
"thiserror 2.0.7",
"winnow",
]
@ -2172,7 +2165,7 @@ checksum = "64a1e282216ec2ab2816cd57e6ed88f8009e634aec47562883c05ac8a7009a63"
dependencies = [
"bstr",
"gix-utils",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2192,7 +2185,7 @@ dependencies = [
"gix-utils",
"gix-validate",
"memmap2",
"thiserror 2.0.5",
"thiserror 2.0.7",
"winnow",
]
@ -2207,7 +2200,7 @@ dependencies = [
"gix-revision",
"gix-validate",
"smallvec",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2225,7 +2218,7 @@ dependencies = [
"gix-object",
"gix-revwalk",
"gix-trace",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2240,7 +2233,7 @@ dependencies = [
"gix-hashtable",
"gix-object",
"smallvec",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2267,7 +2260,7 @@ dependencies = [
"gix-pathspec",
"gix-refspec",
"gix-url",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2305,7 +2298,7 @@ dependencies = [
"gix-sec",
"gix-url",
"reqwest",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2322,7 +2315,7 @@ dependencies = [
"gix-object",
"gix-revwalk",
"smallvec",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2334,7 +2327,7 @@ dependencies = [
"bstr",
"gix-features",
"gix-path",
"thiserror 2.0.5",
"thiserror 2.0.7",
"url",
]
@ -2355,7 +2348,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cd520d09f9f585b34b32aba1d0b36ada89ab7fefb54a8ca3fe37fc482a750937"
dependencies = [
"bstr",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
@ -2394,14 +2387,14 @@ dependencies = [
"gix-path",
"gix-worktree",
"io-close",
"thiserror 2.0.5",
"thiserror 2.0.7",
]
[[package]]
name = "governor"
version = "0.7.0"
version = "0.8.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0746aa765db78b521451ef74221663b57ba595bf83f75d0ce23cc09447c8139f"
checksum = "842dc78579ce01e6a1576ad896edc92fca002dd60c9c3746b7fc2bec6fb429d0"
dependencies = [
"cfg-if",
"dashmap",
@ -2608,12 +2601,6 @@ version = "1.0.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "df3b46402a9d5adb4c86a0cf463f42e19994e3ee891101b1841f30a545cb49a9"
[[package]]
name = "humantime"
version = "2.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9a3a5bfb195931eeb336b2a7b4d761daec841b97f947d34394601737a7bba5e4"
[[package]]
name = "hyper"
version = "1.5.1"
@ -2893,19 +2880,10 @@ dependencies = [
"number_prefix",
"portable-atomic",
"unicode-width 0.2.0",
"vt100",
"web-time",
]
[[package]]
name = "indicatif-log-bridge"
version = "0.2.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "63703cf9069b85dbe6fe26e1c5230d013dee99d3559cd3d02ba39e099ef7ab02"
dependencies = [
"indicatif",
"log",
]
[[package]]
name = "inout"
version = "0.1.3"
@ -2970,17 +2948,6 @@ dependencies = [
"once_cell",
]
[[package]]
name = "is-terminal"
version = "0.4.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "261f68e344040fbd0edea105bef17c66edf46f984ddb1115b775ce31be948f4b"
dependencies = [
"hermit-abi 0.4.0",
"libc",
"windows-sys 0.52.0",
]
[[package]]
name = "is-wsl"
version = "0.4.0"
@ -3240,6 +3207,15 @@ version = "0.11.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "75761162ae2b0e580d7e7c390558127e5f01b4194debd6221fd8c207fc80e3f5"
[[package]]
name = "matchers"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8263075bb86c5a1b1427b5ae862e8889656f126e9f77c484496e8b47cf5c5558"
dependencies = [
"regex-automata 0.1.10",
]
[[package]]
name = "maybe-async"
version = "0.2.10"
@ -3346,6 +3322,12 @@ version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2195bf6aa996a481483b29d62a7663eed3fe39600c460e323f8ff41e90bdd89b"
[[package]]
name = "mutually_exclusive_features"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e94e1e6445d314f972ff7395df2de295fe51b71821694f0b0e1e79c4f12c8577"
[[package]]
name = "native-tls"
version = "0.2.12"
@ -3407,6 +3389,16 @@ version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "38bf9645c8b145698bb0b18a4637dcacbc421ea49bef2317e4fd8065a387cf21"
[[package]]
name = "nu-ansi-term"
version = "0.46.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "77a8165726e8236064dbb45459242600304b42a5ea24ee2948e18e023bf7ba84"
dependencies = [
"overload",
"winapi",
]
[[package]]
name = "num"
version = "0.4.3"
@ -3606,6 +3598,12 @@ dependencies = [
"windows-sys 0.52.0",
]
[[package]]
name = "overload"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b15813163c1d831bf4a13c3610c05c0d03b39feb07f7e09fa234dac9b15aaf39"
[[package]]
name = "ownedbytes"
version = "0.7.0"
@ -3664,7 +3662,7 @@ checksum = "e3148f5046208a5d56bcfc03053e3ca6334e51da8dfb19b6cdc8b306fae3283e"
[[package]]
name = "pesde"
version = "0.5.0-rc.15"
version = "0.5.3"
dependencies = [
"anyhow",
"async-compression",
@ -3680,13 +3678,10 @@ dependencies = [
"git2",
"gix",
"indicatif",
"indicatif-log-bridge",
"inquire",
"keyring",
"log",
"open",
"pathdiff",
"pretty_env_logger",
"relative-path",
"reqwest",
"semver",
@ -3695,12 +3690,15 @@ dependencies = [
"serde_with",
"sha2",
"tempfile",
"thiserror 2.0.5",
"thiserror 2.0.7",
"tokio",
"tokio-tar",
"tokio-util",
"toml",
"toml_edit",
"tracing",
"tracing-indicatif",
"tracing-subscriber",
"url",
"wax",
"winreg",
@ -3708,7 +3706,7 @@ dependencies = [
[[package]]
name = "pesde-registry"
version = "0.7.0"
version = "0.1.2"
dependencies = [
"actix-cors",
"actix-governor",
@ -3723,9 +3721,7 @@ dependencies = [
"futures",
"git2",
"gix",
"log",
"pesde",
"pretty_env_logger",
"reqwest",
"rusty-s3",
"semver",
@ -3737,11 +3733,13 @@ dependencies = [
"sha2",
"tantivy",
"tempfile",
"thiserror 2.0.5",
"thiserror 2.0.7",
"tokio",
"tokio-tar",
"toml",
"url",
"tracing",
"tracing-actix-web",
"tracing-subscriber",
]
[[package]]
@ -3838,16 +3836,6 @@ dependencies = [
"zerocopy",
]
[[package]]
name = "pretty_env_logger"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "865724d4dbe39d9f3dd3b52b88d859d66bcb2d6a0acfd5ea68a65fb66d4bdc1c"
dependencies = [
"env_logger",
"log",
]
[[package]]
name = "proc-macro-crate"
version = "3.2.0"
@ -3914,7 +3902,7 @@ dependencies = [
"rustc-hash 2.1.0",
"rustls",
"socket2",
"thiserror 2.0.5",
"thiserror 2.0.7",
"tokio",
"tracing",
]
@ -3933,7 +3921,7 @@ dependencies = [
"rustls",
"rustls-pki-types",
"slab",
"thiserror 2.0.5",
"thiserror 2.0.7",
"tinyvec",
"tracing",
"web-time",
@ -4068,8 +4056,17 @@ checksum = "b544ef1b4eac5dc2db33ea63606ae9ffcfac26c1416a2806ae0bf5f56b201191"
dependencies = [
"aho-corasick",
"memchr",
"regex-automata",
"regex-syntax",
"regex-automata 0.4.9",
"regex-syntax 0.8.5",
]
[[package]]
name = "regex-automata"
version = "0.1.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6c230d73fb8d8c1b9c0b3135c5142a8acee3a0558fb8db5cf1cb65f8d7862132"
dependencies = [
"regex-syntax 0.6.29",
]
[[package]]
@ -4080,7 +4077,7 @@ checksum = "809e8dc61f6de73b46c85f4c96486310fe304c434cfa43669d7b40f711150908"
dependencies = [
"aho-corasick",
"memchr",
"regex-syntax",
"regex-syntax 0.8.5",
]
[[package]]
@ -4089,6 +4086,12 @@ version = "0.1.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "53a49587ad06b26609c52e423de037e7f57f20d53535d66e08c695f347df952a"
[[package]]
name = "regex-syntax"
version = "0.6.29"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f162c6dd7b008981e4d40210aca20b4bd0f9b60ca9271061b07f78537722f2e1"
[[package]]
name = "regex-syntax"
version = "0.8.5"
@ -4368,9 +4371,9 @@ dependencies = [
[[package]]
name = "semver"
version = "1.0.23"
version = "1.0.24"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "61697e0a1c7e512e84a621326239844a24d8207b4669b41bc18b32ea5cbf988b"
checksum = "3cb6eb87a131f756572d7fb904f6e7b68633f09cca868c5df1c4b8d1a694bbba"
dependencies = [
"serde",
]
@ -4388,7 +4391,6 @@ dependencies = [
"sentry-contexts",
"sentry-core",
"sentry-debug-images",
"sentry-log",
"sentry-panic",
"sentry-tracing",
"tokio",
@ -4457,16 +4459,6 @@ dependencies = [
"sentry-core",
]
[[package]]
name = "sentry-log"
version = "0.35.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "efcbfbb74628eaef033c1154d4bb082437c7592ce2282c7c5ccb455c4c97a06d"
dependencies = [
"log",
"sentry-core",
]
[[package]]
name = "sentry-panic"
version = "0.35.0"
@ -4508,18 +4500,18 @@ dependencies = [
[[package]]
name = "serde"
version = "1.0.215"
version = "1.0.216"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6513c1ad0b11a9376da888e3e0baa0077f1aed55c17f50e7b2397136129fb88f"
checksum = "0b9781016e935a97e8beecf0c933758c97a5520d32930e460142b4cd80c6338e"
dependencies = [
"serde_derive",
]
[[package]]
name = "serde_derive"
version = "1.0.215"
version = "1.0.216"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ad1e866f866923f252f05c889987993144fb74e722403468a4ebd70c3cd756c0"
checksum = "46f859dbbf73865c6627ed570e78961cd3ac92407a2d117204c49232485da55e"
dependencies = [
"proc-macro2",
"quote",
@ -4641,6 +4633,15 @@ dependencies = [
"digest",
]
[[package]]
name = "sharded-slab"
version = "0.1.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f40ca3c46823713e0d4209592e8d6e826aa57e928f09752619fc696c499637f6"
dependencies = [
"lazy_static",
]
[[package]]
name = "shell-words"
version = "1.1.0"
@ -4925,7 +4926,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d60769b80ad7953d8a7b2c70cdfe722bbcdcac6bccc8ac934c40c034d866fc18"
dependencies = [
"byteorder",
"regex-syntax",
"regex-syntax 0.8.5",
"utf8-ranges",
]
@ -4983,15 +4984,6 @@ dependencies = [
"windows-sys 0.59.0",
]
[[package]]
name = "termcolor"
version = "1.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "06794f8f6c5c898b3275aebefa6b8a1cb24cd2c6c79397ab15774837a0bc5755"
dependencies = [
"winapi-util",
]
[[package]]
name = "thiserror"
version = "1.0.69"
@ -5003,11 +4995,11 @@ dependencies = [
[[package]]
name = "thiserror"
version = "2.0.5"
version = "2.0.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "643caef17e3128658ff44d85923ef2d28af81bb71e0d67bbfe1d76f19a73e053"
checksum = "93605438cbd668185516ab499d589afb7ee1859ea3d5fc8f6b0755e1c7443767"
dependencies = [
"thiserror-impl 2.0.5",
"thiserror-impl 2.0.7",
]
[[package]]
@ -5023,9 +5015,9 @@ dependencies = [
[[package]]
name = "thiserror-impl"
version = "2.0.5"
version = "2.0.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "995d0bbc9995d1f19d28b7215a9352b0fc3cd3a2d2ec95c2cadc485cdedbcdde"
checksum = "e1d8749b4531af2117677a5fcd12b1348a3fe2b81e36e61ffeac5c4aa3273e36"
dependencies = [
"proc-macro2",
"quote",
@ -5239,6 +5231,19 @@ dependencies = [
"tracing-core",
]
[[package]]
name = "tracing-actix-web"
version = "0.7.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "54a9f5c1aca50ebebf074ee665b9f99f2e84906dcf6b993a0d0090edb835166d"
dependencies = [
"actix-web",
"mutually_exclusive_features",
"pin-project",
"tracing",
"uuid",
]
[[package]]
name = "tracing-attributes"
version = "0.1.28"
@ -5260,13 +5265,45 @@ dependencies = [
"valuable",
]
[[package]]
name = "tracing-indicatif"
version = "0.3.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "74ba258e9de86447f75edf6455fded8e5242704c6fccffe7bf8d7fb6daef1180"
dependencies = [
"indicatif",
"tracing",
"tracing-core",
"tracing-subscriber",
]
[[package]]
name = "tracing-log"
version = "0.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ee855f1f400bd0e5c02d150ae5de3840039a3f54b025156404e34c23c03f47c3"
dependencies = [
"log",
"once_cell",
"tracing-core",
]
[[package]]
name = "tracing-subscriber"
version = "0.3.19"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e8189decb5ac0fa7bc8b96b7cb9b2701d60d48805aca84a238004d665fcc4008"
dependencies = [
"matchers",
"nu-ansi-term",
"once_cell",
"regex",
"sharded-slab",
"smallvec",
"thread_local",
"tracing",
"tracing-core",
"tracing-log",
]
[[package]]
@ -5437,6 +5474,39 @@ version = "0.9.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0b928f33d975fc6ad9f86c8f283853ad26bdd5b10b7f1542aa2fa15e2289105a"
[[package]]
name = "vt100"
version = "0.15.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "84cd863bf0db7e392ba3bd04994be3473491b31e66340672af5d11943c6274de"
dependencies = [
"itoa",
"log",
"unicode-width 0.1.14",
"vte",
]
[[package]]
name = "vte"
version = "0.11.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f5022b5fbf9407086c180e9557be968742d839e68346af7792b8592489732197"
dependencies = [
"arrayvec",
"utf8parse",
"vte_generate_state_changes",
]
[[package]]
name = "vte_generate_state_changes"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2e369bee1b05d510a7b4ed645f5faa90619e05437111783ea5848f28d97d3c2e"
dependencies = [
"proc-macro2",
"quote",
]
[[package]]
name = "walkdir"
version = "2.5.0"


@ -1,6 +1,6 @@
[package]
name = "pesde"
version = "0.5.0-rc.15"
version = "0.5.3"
edition = "2021"
license = "MIT"
authors = ["daimond113 <contact@daimond113.com>"]
@ -13,10 +13,10 @@ include = ["src/**/*", "Cargo.toml", "Cargo.lock", "README.md", "LICENSE", "CHAN
bin = [
"dep:clap",
"dep:dirs",
"dep:pretty_env_logger",
"dep:tracing-subscriber",
"reqwest/json",
"dep:indicatif",
"dep:indicatif-log-bridge",
"dep:tracing-indicatif",
"dep:inquire",
"dep:toml_edit",
"dep:colored",
@ -44,25 +44,25 @@ required-features = ["bin"]
uninlined_format_args = "warn"
[dependencies]
serde = { version = "1.0.215", features = ["derive"] }
serde = { version = "1.0.216", features = ["derive"] }
toml = "0.8.19"
serde_with = "3.11.0"
gix = { version = "0.68.0", default-features = false, features = ["blocking-http-transport-reqwest-rust-tls", "revparse-regex", "credentials", "parallel"] }
semver = { version = "1.0.23", features = ["serde"] }
semver = { version = "1.0.24", features = ["serde"] }
reqwest = { version = "0.12.9", default-features = false, features = ["rustls-tls"] }
tokio-tar = "0.3.1"
async-compression = { version = "0.4.18", features = ["tokio", "gzip"] }
pathdiff = "0.2.3"
relative-path = { version = "1.9.3", features = ["serde"] }
log = "0.4.22"
thiserror = "2.0.5"
tracing = { version = "0.1.41", features = ["attributes"] }
thiserror = "2.0.7"
tokio = { version = "1.42.0", features = ["process"] }
tokio-util = "0.7.13"
async-stream = "0.3.6"
futures = "0.3.31"
full_moon = { version = "1.1.2", features = ["luau"] }
url = { version = "2.5.4", features = ["serde"] }
chrono = { version = "0.4.38", features = ["serde"] }
chrono = { version = "0.4.39", features = ["serde"] }
sha2 = "0.10.8"
tempfile = "3.14.0"
wax = { version = "0.6.0", default-features = false }
@ -81,9 +81,9 @@ colored = { version = "2.1.0", optional = true }
toml_edit = { version = "0.22.22", optional = true }
clap = { version = "4.5.23", features = ["derive"], optional = true }
dirs = { version = "5.0.1", optional = true }
pretty_env_logger = { version = "0.5.0", optional = true }
tracing-subscriber = { version = "0.3.19", features = ["env-filter"], optional = true }
indicatif = { version = "0.17.9", optional = true }
indicatif-log-bridge = { version = "0.2.3", optional = true }
tracing-indicatif = { version = "0.3.8", optional = true }
inquire = { version = "0.7.5", optional = true }
[target.'cfg(target_os = "windows")'.dependencies]

SECURITY.md Normal file

@ -0,0 +1,25 @@
# Security Policy
## Supported Versions
As pesde is currently in version 0.x, we can only guarantee security for:
- **The latest minor** (currently 0.5).
- **The latest release candidate for the next version**, if available.
When a new minor version is released, the previous version will immediately lose security support.
> **Note:** This policy will change with the release of version 1.0, which will include an extended support period for versions >=1.0.
| Version | Supported |
| ------- | ------------------ |
| 0.5.x | :white_check_mark: |
| < 0.5 | :x: |
## Reporting a Vulnerability
We encourage all security concerns to be reported to [pesde@daimond113.com](mailto:pesde@daimond113.com), using the following format:
- **Subject**: The subject must be prefixed with `[SECURITY]` to ensure it is prioritized as a security concern.
- **Content**:
- **Affected Versions**: Clearly specify which versions are affected by the issue.
- **Issue Details**: Provide a detailed description of the issue, including reproduction steps and/or a simple example, if applicable.
We will try to respond as soon as possible.


@ -38,17 +38,17 @@ Git dependencies are dependencies on packages hosted on a Git repository.
```toml title="pesde.toml"
[dependencies]
acme = { repo = "acme/package", rev = "main" }
acme = { repo = "acme/package", rev = "aeff6" }
```
In this example, we're specifying a dependency on the package contained within
the `acme/package` GitHub repository at the `main` branch.
the `acme/package` GitHub repository at the `aeff6` commit.
You can also use a URL to specify the Git repository and a specific commit.
You can also use a URL to specify the Git repository and a tag for the revision.
```toml title="pesde.toml"
[dependencies]
acme = { repo = "https://git.acme.local/package.git", rev = "aeff6" }
acme = { repo = "https://git.acme.local/package.git", rev = "v0.1.0" }
```
You can also specify a path if the package is not at the root of the repository.
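A sketch of what that can look like, using a hypothetical monorepo layout:

```toml title="pesde.toml"
[dependencies]
acme = { repo = "acme/monorepo", rev = "aeff6", path = "packages/acme" }
```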


@ -20,15 +20,15 @@ to get it added.
Studio.
Running `pesde init` will prompt you to select a target, select
`roblox` or `roblox_server` in this case. This will setup the configuration
needed to use pesde in a project using Rojo.
`roblox` or `roblox_server` in this case. You will be prompted to pick out a
scripts package. Select `pesde/scripts_rojo` to get started with Rojo.
## Usage with other tools
If you are using a different sync tool, you should look for it's scripts in the
pesde-scripts repository. If you cannot find them, you can write your own and
optionally submit a PR to help others using the same tool as you get started
quicker.
If you are using a different sync tool, you should look for its scripts
package on the registry. If you cannot find it, you can write your own and
optionally submit a PR to pesde-scripts so that others using the same tool
can get started quicker.
Scaffold your project with `pesde init`, select the `roblox` or `roblox_server`
target, and then create a `.pesde/roblox_sync_config_generator.luau` script


@ -0,0 +1,53 @@
---
title: Using Scripts Packages
description: Learn how to use scripts packages.
---
A **scripts package** is a package that contains scripts. The scripts provided
by the package are linked in `.pesde/{alias}/{script_name}.luau` of the project
that uses the package.
## Using a scripts package
Scripts packages can be installed using the `pesde add` and `pesde install`
commands.
This requires a `pesde.toml` file to be present in the current directory, and
will add the scripts package to the `dependencies` section of the file.
```sh
pesde add pesde/scripts_rojo
pesde install
```
This will add the scripts package to your project, and installing will put the
scripts at `.pesde/scripts_rojo/{script_name}.luau`. You can then add the scripts
to your manifest, for example:
```toml title="pesde.toml"
[scripts]
roblox_sync_config_generator = ".pesde/scripts_rojo/roblox_sync_config_generator.luau"
```
## Making a scripts package
To make a scripts package you must use a target compatible with scripts exports.
These currently are `lune` and `luau`.
Here is an example of a scripts package:
```toml title="pesde.toml"
name = "pesde/scripts_rojo"
version = "1.0.0"
license = "MIT"
[target]
environment = "lune"
[target.scripts]
roblox_sync_config_generator = "roblox_sync_config_generator.luau"
```
The `scripts` table in the target is a map of script names to the path of the
script in the package. The scripts will be linked in the project that uses the
package at `.pesde/{alias}/{script_name}.luau`.


@ -19,10 +19,10 @@ To create an index, create a new repository and add a `config.toml` file with
the following content:
```toml title="config.toml"
# The URL of the registry API
# the URL of the registry API
api = "https://registry.acme.local/"
# Package download URL (optional)
# package download URL (optional)
download = "{API_URL}/v0/packages/{PACKAGE}/{PACKAGE_VERSION}/{PACKAGE_TARGET}"
# the client ID of the GitHub OAuth app (optional)
@ -33,13 +33,16 @@ git_allowed = true
# whether to allow packages which depend on packages from other registries
# (default: false)
other_registries_allowed = true
other_registries_allowed = ["https://git.acme.local/index"]
# whether to allow packages with Wally dependencies (default: false)
wally_allowed = false
# the maximum size of the archive in bytes (default: 4MB)
max_archive_size = 4194304
# the scripts packages present in the `init` command selection by default
scripts_packages = ["pesde/scripts_rojo"]
```
- **api**: The URL of the registry API. See below for more information.
@ -60,18 +63,24 @@ max_archive_size = 4194304
- **github_oauth_client_id**: This is required if you use GitHub OAuth for
authentication. See below for more information.
- **git_allowed**: Whether to allow packages with Git dependencies. This is
optional and defaults to `false`.
- **git_allowed**: Whether to allow packages with Git dependencies. This can be
either a bool or a list of allowed repository URLs. This is optional and
defaults to `false`.
- **other_registries_allowed**: Whether to allow packages which depend on
packages from other registries. This is optional and defaults to `false`.
packages from other registries. This can be either a bool or a list of
allowed index repository URLs. This is optional and defaults to `false`.
- **wally_allowed**: Whether to allow packages with Wally dependencies. This is
- **wally_allowed**: Whether to allow packages with Wally dependencies. This can
be either a bool or a list of allowed index repository URLs. This is
optional and defaults to `false`.
- **max_archive_size**: The maximum size of the archive in bytes. This is
optional and defaults to `4194304` (4MB).
- **scripts_packages**: The scripts packages present in the `init` command
selection by default. This is optional and defaults to none.
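Since `git_allowed`, `other_registries_allowed`, and `wally_allowed` each accept either a bool or a list, both of the following forms are valid (the URL here is a placeholder):

```toml title="config.toml"
# allow Git dependencies from any repository
git_allowed = true

# or restrict them to specific repositories
git_allowed = ["https://git.acme.local/package.git"]
```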
You should then push this repository to [GitHub](https://github.com/).
## Configuring the registry
@ -88,8 +97,8 @@ has access to the index repository. We recommend using a separate account
for this purpose.
<Aside>
For a GitHub account the password **must** be a personal access token. For
instructions on how to create a personal access token, see the [GitHub
For a GitHub account the password **must** be a personal access token. For instructions on how to
create a personal access token, see the [GitHub
documentation](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens).
The access token must have read and write access to the index repository.
</Aside>


@ -41,6 +41,16 @@ You can follow the installation instructions in the
pesde should now be installed on your system. You may need to restart your
computer for the changes to take effect.
<Aside type="caution">
pesde uses symlinks, which are an administrator-level operation on Windows.
To ensure proper functionality, enable [Developer Mode](https://learn.microsoft.com/en-us/windows/apps/get-started/enable-your-device-for-development).
If you are getting errors such as `Failed to symlink file, a required
privilege is not held by the client`, then enabling this setting will fix
them.
</Aside>
</TabItem>
<TabItem label="Linux & macOS">
@ -59,7 +69,7 @@ You can follow the installation instructions in the
environment variable.
```sh title=".zshrc"
export PATH = "$PATH:/home/user/.pesde/bin"
export PATH="$PATH:$HOME/.pesde/bin"
```
You should then be able to run `pesde` after restarting your shell.


@ -155,6 +155,19 @@ build_files = [
These files are passed to [`roblox_sync_config_generator`](#roblox_sync_config_generator)
when the package is installed in order to generate the necessary configuration.
### `scripts`
**Allowed in:** `luau`, `lune`
A map of scripts that will be linked in the dependant's `.pesde` directory, and
copied over to the [scripts](#scripts-1) section when initialising a project with
this package as the scripts package.
```toml
[target.scripts]
roblox_sync_config_generator = "scripts/roblox_sync_config_generator.luau"
```
## `[scripts]`
The `[scripts]` section contains scripts that can be run using the `pesde run`
@ -177,10 +190,6 @@ sync tools.
of files specified within the [`target.build_files`](#build_files) of the
package.
You can find template scripts inside the
[`pesde-scripts` repository](https://github.com/pesde-pkg/scripts)
for various sync tools.
<LinkCard
title="Roblox"
description="Learn more about using pesde in Roblox projects."
@ -360,14 +369,14 @@ foo = { wally = "acme/foo", version = "1.2.3", index = "acme" }
```toml
[dependencies]
foo = { repo = "acme/packages", rev = "main", path = "foo" }
foo = { repo = "acme/packages", rev = "aeff6", path = "foo" }
```
**Git dependencies** contain the following fields:
- `repo`: The URL of the Git repository.
This can either be `<owner>/<name>` for a GitHub repository, or a full URL.
- `rev`: The Git revision to install. This can be a branch, tag, or commit hash.
- `rev`: The Git revision to install. This can be a tag or commit hash.
- `path`: The path within the repository to install. If not specified, the root
of the repository is used.
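Putting these fields together, a Git dependency on a tagged release in a subdirectory of a repository given by full URL might look like this (names are illustrative):

```toml
[dependencies]
foo = { repo = "https://git.acme.local/packages.git", rev = "v0.1.0", path = "foo" }
```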

registry/CHANGELOG.md Normal file

@ -0,0 +1,22 @@
# Changelog
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [0.1.2]
### Changed
- Update to pesde lib API changes by @daimond113
## [0.1.1] - 2024-12-19
### Changed
- Switch to tracing for logging by @daimond113
## [0.1.0] - 2024-12-14
### Added
- Rewrite registry for pesde v0.5.0 by @daimond113
[0.1.2]: https://github.com/daimond113/pesde/compare/v0.5.2%2Bregistry.0.1.1..v0.5.3%2Bregistry.0.1.2
[0.1.1]: https://github.com/daimond113/pesde/compare/v0.5.1%2Bregistry.0.1.0..v0.5.2%2Bregistry.0.1.1
[0.1.0]: https://github.com/daimond113/pesde/compare/v0.4.7..v0.5.0%2Bregistry.0.1.0


@ -1,6 +1,6 @@
[package]
name = "pesde-registry"
version = "0.7.0"
version = "0.1.2"
edition = "2021"
repository = "https://github.com/pesde-pkg/index"
publish = false
@ -8,13 +8,12 @@ publish = false
[dependencies]
actix-web = "4.9.0"
actix-cors = "0.7.0"
actix-governor = "0.7.0"
actix-governor = "0.8.0"
dotenvy = "0.15.7"
thiserror = "2.0.5"
thiserror = "2.0.7"
tantivy = "0.22.0"
semver = "1.0.23"
chrono = { version = "0.4.38", features = ["serde"] }
url = "2.5.4"
semver = "1.0.24"
chrono = { version = "0.4.39", features = ["serde"] }
futures = "0.3.31"
tokio = "1.42.0"
tempfile = "3.14.0"
@ -27,7 +26,7 @@ gix = { version = "0.68.0", default-features = false, features = [
"credentials",
] }
serde = "1.0.215"
serde = "1.0.216"
serde_json = "1.0.133"
serde_yaml = "0.9.34"
toml = "0.8.19"
@ -41,10 +40,11 @@ constant_time_eq = "0.3.1"
tokio-tar = "0.3.1"
async-compression = { version = "0.4.18", features = ["tokio", "gzip"] }
log = "0.4.22"
pretty_env_logger = "0.5.0"
tracing = { version = "0.1.41", features = ["attributes"] }
tracing-subscriber = { version = "0.3.19", features = ["env-filter"] }
tracing-actix-web = "0.7.15"
sentry = { version = "0.35.0", default-features = false, features = ["backtrace", "contexts", "debug-images", "panic", "reqwest", "rustls", "log"] }
sentry = { version = "0.35.0", default-features = false, features = ["backtrace", "contexts", "debug-images", "panic", "reqwest", "rustls", "tracing"] }
sentry-actix = "0.35.0"
pesde = { path = "..", features = ["wally-compat"] }


@ -45,7 +45,7 @@ impl AuthImpl for GitHubAuth {
return Ok(None);
}
Err(_) => {
log::error!(
tracing::error!(
"failed to get user: {}",
response.into_error().await.unwrap_err()
);
@ -53,7 +53,7 @@ impl AuthImpl for GitHubAuth {
}
},
Err(e) => {
log::error!("failed to get user: {e}");
tracing::error!("failed to get user: {e}");
return Ok(None);
}
};
@ -61,7 +61,7 @@ impl AuthImpl for GitHubAuth {
let user_id = match response.json::<UserResponse>().await {
Ok(resp) => resp.user.id,
Err(e) => {
log::error!("failed to get user: {e}");
tracing::error!("failed to get user: {e}");
return Ok(None);
}
};


@ -71,7 +71,7 @@ pub async fn get_package_version(
let (scope, name_part) = name.as_str();
let entries: IndexFile = {
let file: IndexFile = {
let source = app_state.source.lock().await;
let repo = gix::open(source.path(&app_state.project))?;
let tree = root_tree(&repo)?;
@ -84,14 +84,15 @@ pub async fn get_package_version(
let Some((v_id, entry, targets)) = ({
let version = match version {
VersionRequest::Latest => match entries.keys().map(|k| k.version()).max() {
VersionRequest::Latest => match file.entries.keys().map(|k| k.version()).max() {
Some(latest) => latest.clone(),
None => return Ok(HttpResponse::NotFound().finish()),
},
VersionRequest::Specific(version) => version,
};
let versions = entries
let versions = file
.entries
.iter()
.filter(|(v_id, _)| *v_id.version() == version);


@ -19,7 +19,7 @@ pub async fn get_package_versions(
let (scope, name_part) = name.as_str();
let versions: IndexFile = {
let file: IndexFile = {
let source = app_state.source.lock().await;
let repo = gix::open(source.path(&app_state.project))?;
let tree = root_tree(&repo)?;
@ -32,7 +32,7 @@ pub async fn get_package_versions(
let mut responses = BTreeMap::new();
for (v_id, entry) in versions {
for (v_id, entry) in file.entries {
let info = responses
.entry(v_id.version().clone())
.or_insert_with(|| PackageResponse {


@ -304,7 +304,7 @@ pub async fn publish_package(
.filter(|index| match gix::Url::try_from(*index) {
Ok(url) => config
.other_registries_allowed
.is_allowed(source.repo_url().clone(), url),
.is_allowed_or_same(source.repo_url().clone(), url),
Err(_) => false,
})
.is_none()
@ -315,16 +315,13 @@ pub async fn publish_package(
}
}
DependencySpecifiers::Wally(specifier) => {
if !config.wally_allowed {
return Err(Error::InvalidArchive(
"wally dependencies are not allowed".into(),
));
}
if specifier
.index
.as_ref()
.filter(|index| index.parse::<url::Url>().is_ok())
.as_deref()
.filter(|index| match gix::Url::try_from(*index) {
Ok(url) => config.wally_allowed.is_allowed(url),
Err(_) => false,
})
.is_none()
{
return Err(Error::InvalidArchive(format!(
@ -332,15 +329,15 @@ pub async fn publish_package(
)));
}
}
DependencySpecifiers::Git(_) => {
if !config.git_allowed {
DependencySpecifiers::Git(specifier) => {
if !config.git_allowed.is_allowed(specifier.repo.clone()) {
return Err(Error::InvalidArchive(
"git dependencies are not allowed".into(),
));
}
}
DependencySpecifiers::Workspace(_) => {
// workspace specifiers are to be transformed into Pesde specifiers by the sender
// workspace specifiers are to be transformed into pesde specifiers by the sender
return Err(Error::InvalidArchive(
"non-transformed workspace dependency".into(),
));
@ -374,7 +371,7 @@ pub async fn publish_package(
}
};
let mut entries: IndexFile =
let mut file: IndexFile =
toml::de::from_str(&read_file(&gix_tree, [scope, name])?.unwrap_or_default())?;
let new_entry = IndexFileEntry {
@ -389,11 +386,12 @@ pub async fn publish_package(
dependencies,
};
let this_version = entries
let this_version = file
.entries
.keys()
.find(|v_id| *v_id.version() == manifest.version);
if let Some(this_version) = this_version {
let other_entry = entries.get(this_version).unwrap();
let other_entry = file.entries.get(this_version).unwrap();
// description cannot be different - which one to render in the "Recently published" list?
// the others cannot be different because what to return from the versions endpoint?
@ -409,7 +407,8 @@ pub async fn publish_package(
}
}
if entries
if file
.entries
.insert(
VersionId::new(manifest.version.clone(), manifest.target.kind()),
new_entry.clone(),
@ -425,7 +424,7 @@ pub async fn publish_package(
let reference = repo.find_reference(&refspec)?;
{
let index_content = toml::to_string(&entries)?;
let index_content = toml::to_string(&file)?;
let mut blob_writer = repo.blob_writer(None)?;
blob_writer.write_all(index_content.as_bytes())?;
oids.push((name, blob_writer.commit()?));


@ -68,10 +68,11 @@ pub async fn search_packages(
.unwrap();
let (scope, name) = id.as_str();
let versions: IndexFile =
let file: IndexFile =
toml::de::from_str(&read_file(&tree, [scope, name]).unwrap().unwrap()).unwrap();
let (latest_version, entry) = versions
let (latest_version, entry) = file
.entries
.iter()
.max_by_key(|(v_id, _)| v_id.version())
.unwrap();
@ -79,17 +80,19 @@ pub async fn search_packages(
PackageResponse {
name: id.to_string(),
version: latest_version.version().to_string(),
targets: versions
targets: file
.entries
.iter()
.filter(|(v_id, _)| v_id.version() == latest_version.version())
.map(|(_, entry)| (&entry.target).into())
.collect(),
description: entry.description.clone().unwrap_or_default(),
published_at: versions
published_at: file
.entries
.values()
.max_by_key(|entry| entry.published_at)
.unwrap()
.published_at,
.map(|entry| entry.published_at)
.max()
.unwrap(),
license: entry.license.clone().unwrap_or_default(),
authors: entry.authors.clone(),
repository: entry.repository.clone().map(|url| url.to_string()),


@ -1,5 +1,4 @@
use actix_web::{body::BoxBody, HttpResponse, ResponseError};
use log::error;
use pesde::source::git_index::errors::{ReadFile, RefreshError, TreeError};
use serde::Serialize;
use thiserror::Error;
@ -67,7 +66,7 @@ impl ResponseError for Error {
error: format!("archive is invalid: {e}"),
}),
e => {
log::error!("unhandled error: {e:?}");
tracing::error!("unhandled error: {e:?}");
HttpResponse::InternalServerError().finish()
}
}


@ -6,19 +6,22 @@ use crate::{
use actix_cors::Cors;
use actix_governor::{Governor, GovernorConfigBuilder};
use actix_web::{
middleware::{from_fn, Compress, Logger, NormalizePath, TrailingSlash},
middleware::{from_fn, Compress, NormalizePath, TrailingSlash},
rt::System,
web,
web::PayloadConfig,
App, HttpServer,
};
use fs_err::tokio as fs;
use log::info;
use pesde::{
source::{pesde::PesdePackageSource, traits::PackageSource},
AuthConfig, Project,
};
use std::{env::current_dir, path::PathBuf};
use tracing::level_filters::LevelFilter;
use tracing_subscriber::{
fmt::format::FmtSpan, layer::SubscriberExt, util::SubscriberInitExt, EnvFilter,
};
mod auth;
mod endpoints;
@ -116,12 +119,12 @@ async fn run() -> std::io::Result<()> {
let app_data = web::Data::new(AppState {
storage: {
let storage = get_storage_from_env();
info!("storage: {storage}");
tracing::info!("storage: {storage}");
storage
},
auth: {
let auth = get_auth_from_env(&config);
info!("auth: {auth}");
tracing::info!("auth: {auth}");
auth
},
source: tokio::sync::Mutex::new(source),
@ -140,14 +143,12 @@ async fn run() -> std::io::Result<()> {
.finish()
.unwrap();
info!("listening on {address}:{port}");
HttpServer::new(move || {
App::new()
.wrap(sentry_actix::Sentry::with_transaction())
.wrap(NormalizePath::new(TrailingSlash::Trim))
.wrap(Cors::permissive())
.wrap(Logger::default())
.wrap(tracing_actix_web::TracingLogger::default())
.wrap(Compress::default())
.app_data(app_data.clone())
.route(
@ -200,12 +201,26 @@ async fn run() -> std::io::Result<()> {
fn main() -> std::io::Result<()> {
let _ = dotenvy::dotenv();
let mut log_builder = pretty_env_logger::formatted_builder();
log_builder.parse_env(pretty_env_logger::env_logger::Env::default().default_filter_or("info"));
let tracing_env_filter = EnvFilter::builder()
.with_default_directive(LevelFilter::INFO.into())
.from_env_lossy()
.add_directive("reqwest=info".parse().unwrap())
.add_directive("rustls=info".parse().unwrap())
.add_directive("tokio_util=info".parse().unwrap())
.add_directive("goblin=info".parse().unwrap())
.add_directive("tower=info".parse().unwrap())
.add_directive("hyper=info".parse().unwrap())
.add_directive("h2=info".parse().unwrap());
let logger = sentry::integrations::log::SentryLogger::with_dest(log_builder.build());
log::set_boxed_logger(Box::new(logger)).unwrap();
log::set_max_level(log::LevelFilter::Info);
tracing_subscriber::registry()
.with(tracing_env_filter)
.with(
tracing_subscriber::fmt::layer()
.compact()
.with_span_events(FmtSpan::NEW | FmtSpan::CLOSE),
)
.with(sentry::integrations::tracing::layer())
.init();
let guard = sentry::init(sentry::ClientOptions {
release: sentry::release_name!(),
@ -218,9 +233,9 @@ fn main() -> std::io::Result<()> {
if guard.is_enabled() {
std::env::set_var("RUST_BACKTRACE", "full");
info!("sentry initialized");
tracing::info!("sentry initialized");
} else {
info!("sentry **NOT** initialized");
tracing::info!("sentry **NOT** initialized");
}
System::new().block_on(run())


@ -8,6 +8,8 @@ pub struct TargetInfo {
kind: TargetKind,
lib: bool,
bin: bool,
#[serde(skip_serializing_if = "BTreeSet::is_empty")]
scripts: BTreeSet<String>,
}
impl From<Target> for TargetInfo {
@ -22,6 +24,10 @@ impl From<&Target> for TargetInfo {
kind: target.kind(),
lib: target.lib_path().is_some(),
bin: target.bin_path().is_some(),
scripts: target
.scripts()
.map(|scripts| scripts.keys().cloned().collect())
.unwrap_or_default(),
}
}
}


@ -104,8 +104,8 @@ pub async fn make_search(
pin!(stream);
while let Some((pkg_name, mut file)) = stream.next().await {
let Some((_, latest_entry)) = file.pop_last() else {
log::warn!("no versions found for {pkg_name}");
let Some((_, latest_entry)) = file.entries.pop_last() else {
tracing::error!("no versions found for {pkg_name}");
continue;
};


@ -5,6 +5,7 @@ use keyring::Entry;
use reqwest::header::AUTHORIZATION;
use serde::{ser::SerializeMap, Deserialize, Serialize};
use std::collections::BTreeMap;
use tracing::instrument;
#[derive(Debug, Clone)]
pub struct Tokens(pub BTreeMap<gix::Url, String>);
@ -37,15 +38,20 @@ impl<'de> Deserialize<'de> for Tokens {
}
}
#[instrument(level = "trace")]
pub async fn get_tokens() -> anyhow::Result<Tokens> {
let config = read_config().await?;
if !config.tokens.0.is_empty() {
tracing::debug!("using tokens from config");
return Ok(config.tokens);
}
match Entry::new("tokens", env!("CARGO_PKG_NAME")) {
Ok(entry) => match entry.get_password() {
Ok(token) => return serde_json::from_str(&token).context("failed to parse tokens"),
Ok(token) => {
tracing::debug!("using tokens from keyring");
return serde_json::from_str(&token).context("failed to parse tokens");
}
Err(keyring::Error::PlatformFailure(_) | keyring::Error::NoEntry) => {}
Err(e) => return Err(e.into()),
},
@ -56,16 +62,22 @@ pub async fn get_tokens() -> anyhow::Result<Tokens> {
Ok(Tokens(BTreeMap::new()))
}
#[instrument(level = "trace")]
pub async fn set_tokens(tokens: Tokens) -> anyhow::Result<()> {
let entry = Entry::new("tokens", env!("CARGO_PKG_NAME"))?;
let json = serde_json::to_string(&tokens).context("failed to serialize tokens")?;
match entry.set_password(&json) {
Ok(()) => return Ok(()),
Ok(()) => {
tracing::debug!("tokens saved to keyring");
return Ok(());
}
Err(keyring::Error::PlatformFailure(_) | keyring::Error::NoEntry) => {}
Err(e) => return Err(e.into()),
}
tracing::debug!("tokens saved to config");
let mut config = read_config().await?;
config.tokens = tokens;
write_config(&config).await.map_err(Into::into)
@ -86,6 +98,7 @@ struct UserResponse {
login: String,
}
#[instrument(level = "trace")]
pub async fn get_token_login(
reqwest: &reqwest::Client,
access_token: &str,


@ -2,6 +2,7 @@ use std::{collections::HashSet, str::FromStr};
use anyhow::Context;
use clap::Args;
use colored::Colorize;
use semver::VersionReq;
use crate::cli::{config::read_config, AnyPackageIdentifier, VersionedPackageName};
@ -62,7 +63,7 @@ impl AddCommand {
.cloned();
if let Some(index) = self.index.as_ref().filter(|_| index.is_none()) {
log::error!("index {index} not found");
println!("{}: index {index} not found", "error".red().bold());
return Ok(());
}
@ -89,7 +90,7 @@ impl AddCommand {
.cloned();
if let Some(index) = self.index.as_ref().filter(|_| index.is_none()) {
log::error!("wally index {index} not found");
println!("{}: wally index {index} not found", "error".red().bold());
return Ok(());
}
@ -145,7 +146,7 @@ impl AddCommand {
.pop_last()
.map(|(v_id, _)| v_id)
else {
log::error!("no versions found for package {specifier}");
println!("{}: no versions found for package", "error".red().bold());
return Ok(());
};


@ -2,7 +2,6 @@ use crate::cli::{config::read_config, progress_bar, VersionedPackageName};
use anyhow::Context;
use clap::Args;
use fs_err::tokio as fs;
use indicatif::MultiProgress;
use pesde::{
linking::generator::generate_bin_linking_module,
manifest::target::TargetKind,
@ -35,12 +34,7 @@ pub struct ExecuteCommand {
}
impl ExecuteCommand {
pub async fn run(
self,
project: Project,
multi: MultiProgress,
reqwest: reqwest::Client,
) -> anyhow::Result<()> {
pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
let index = match self.index {
Some(index) => Some(index),
None => read_config().await.ok().map(|c| c.default_index),
@ -84,7 +78,7 @@ impl ExecuteCommand {
);
};
log::info!("found package {}@{version}", pkg_ref.name);
println!("using {}@{version}", pkg_ref.name);
let tmp_dir = project.cas_dir().join(".tmp");
fs::create_dir_all(&tmp_dir)
@ -134,7 +128,6 @@ impl ExecuteCommand {
progress_bar(
graph.values().map(|versions| versions.len() as u64).sum(),
rx,
&multi,
"📥 ".to_string(),
"downloading dependencies".to_string(),
"downloaded dependencies".to_string(),


@ -16,11 +16,26 @@ use pesde::{
Project, DEFAULT_INDEX_NAME, SCRIPTS_LINK_FOLDER,
};
use semver::VersionReq;
use std::{collections::HashSet, str::FromStr};
use std::{collections::HashSet, fmt::Display, str::FromStr};
#[derive(Debug, Args)]
pub struct InitCommand {}
#[derive(Debug)]
enum PackageNameOrCustom {
PackageName(PackageName),
Custom,
}
impl Display for PackageNameOrCustom {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
PackageNameOrCustom::PackageName(n) => write!(f, "{n}"),
PackageNameOrCustom::Custom => write!(f, "custom"),
}
}
}
impl InitCommand {
pub async fn run(self, project: Project) -> anyhow::Result<()> {
match project.read_manifest().await {
@ -127,7 +142,49 @@ impl InitCommand {
.await
.context("failed to get source config")?;
if let Some(scripts_pkg_name) = config.scripts_package {
let scripts_package = if config.scripts_packages.is_empty() {
PackageNameOrCustom::Custom
} else {
inquire::Select::new(
"which scripts package do you want to use?",
config
.scripts_packages
.into_iter()
.map(PackageNameOrCustom::PackageName)
.chain(std::iter::once(PackageNameOrCustom::Custom))
.collect(),
)
.prompt()
.unwrap()
};
let scripts_package = match scripts_package {
PackageNameOrCustom::PackageName(p) => Some(p),
PackageNameOrCustom::Custom => {
let name = inquire::Text::new("which scripts package to use?")
.with_validator(|name: &str| {
if name.is_empty() {
return Ok(Validation::Valid);
}
Ok(match PackageName::from_str(name) {
Ok(_) => Validation::Valid,
Err(e) => Validation::Invalid(e.to_string().into()),
})
})
.with_help_message("leave empty for none")
.prompt()
.unwrap();
if name.is_empty() {
None
} else {
Some(PackageName::from_str(&name).unwrap())
}
}
};
if let Some(scripts_pkg_name) = scripts_package {
let (v_id, pkg_ref) = source
.resolve(
&PesdeDependencySpecifier {
@ -185,7 +242,7 @@ impl InitCommand {
} else {
println!(
"{}",
"configured index hasn't a configured scripts package".red()
"no scripts package configured, this can cause issues with Roblox compatibility".red()
);
if !inquire::prompt_confirmation("initialize regardless?").unwrap() {
return Ok(());


@ -6,7 +6,6 @@ use clap::Args;
use colored::{ColoredString, Colorize};
use fs_err::tokio as fs;
use futures::future::try_join_all;
use indicatif::MultiProgress;
use pesde::{
download_and_link::filter_graph, lockfile::Lockfile, manifest::target::TargetKind, Project,
MANIFEST_FILE_NAME,
@ -89,12 +88,7 @@ fn job(n: u8) -> ColoredString {
struct CallbackError(#[from] anyhow::Error);
impl InstallCommand {
pub async fn run(
self,
project: Project,
multi: MultiProgress,
reqwest: reqwest::Client,
) -> anyhow::Result<()> {
pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
let mut refreshed_sources = HashSet::new();
let manifest = project
@ -116,10 +110,10 @@ impl InstallCommand {
match project.deser_lockfile().await {
Ok(lockfile) => {
if lockfile.overrides != manifest.overrides {
log::debug!("overrides are different");
tracing::debug!("overrides are different");
None
} else if lockfile.target != manifest.target.kind() {
log::debug!("target kind is different");
tracing::debug!("target kind is different");
None
} else {
Some(lockfile)
@ -153,7 +147,7 @@ impl InstallCommand {
deleted_folders
.entry(folder.to_string())
.or_insert_with(|| async move {
log::debug!("deleting the {folder} folder");
tracing::debug!("deleting the {folder} folder");
if let Some(e) = fs::remove_dir_all(package_dir.join(&folder))
.await
@ -219,7 +213,7 @@ impl InstallCommand {
.map(|(alias, _, _)| alias)
.filter(|alias| {
if *alias == env!("CARGO_BIN_NAME") {
log::warn!(
tracing::warn!(
"package {alias} has the same name as the CLI, skipping bin link"
);
return false;
@ -257,7 +251,7 @@ impl InstallCommand {
fs::write(
&bin_exec_file,
format!(r#"#!/bin/sh
lune run "$(dirname "$0")/.impl/{alias}.luau" -- "$@""#
exec lune run "$(dirname "$0")/.impl/{alias}.luau" -- "$@""#
),
)
.await
@ -281,7 +275,6 @@ lune run "$(dirname "$0")/.impl/{alias}.luau" -- "$@""#
progress_bar(
graph.values().map(|versions| versions.len() as u64).sum(),
rx,
&multi,
format!("{} 📥 ", job(3)),
"downloading dependencies".to_string(),
"downloaded dependencies".to_string(),
@ -303,7 +296,6 @@ lune run "$(dirname "$0")/.impl/{alias}.luau" -- "$@""#
progress_bar(
manifest.patches.values().map(|v| v.len() as u64).sum(),
rx,
&multi,
format!("{} 🩹 ", job(JOBS - 1)),
"applying patches".to_string(),
"applied patches".to_string(),
@ -323,9 +315,8 @@ lune run "$(dirname "$0")/.impl/{alias}.luau" -- "$@""#
graph: downloaded_graph,
workspace: run_on_workspace_members(&project, |project| {
let multi = multi.clone();
let reqwest = reqwest.clone();
async move { Box::pin(self.run(project, multi, reqwest)).await }
async move { Box::pin(self.run(project, reqwest)).await }
})
.await?,
})


@ -1,4 +1,3 @@
use indicatif::MultiProgress;
use pesde::Project;
mod add;
@ -72,18 +71,13 @@ pub enum Subcommand {
}
impl Subcommand {
pub async fn run(
self,
project: Project,
multi: MultiProgress,
reqwest: reqwest::Client,
) -> anyhow::Result<()> {
pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
match self {
Subcommand::Auth(auth) => auth.run(project, reqwest).await,
Subcommand::Config(config) => config.run().await,
Subcommand::Init(init) => init.run(project).await,
Subcommand::Run(run) => run.run(project).await,
Subcommand::Install(install) => install.run(project, multi, reqwest).await,
Subcommand::Install(install) => install.run(project, reqwest).await,
Subcommand::Publish(publish) => publish.run(project, reqwest).await,
#[cfg(feature = "version-management")]
Subcommand::SelfInstall(self_install) => self_install.run().await,
@ -94,9 +88,9 @@ impl Subcommand {
#[cfg(feature = "version-management")]
Subcommand::SelfUpgrade(self_upgrade) => self_upgrade.run(reqwest).await,
Subcommand::Add(add) => add.run(project).await,
Subcommand::Update(update) => update.run(project, multi, reqwest).await,
Subcommand::Update(update) => update.run(project, reqwest).await,
Subcommand::Outdated(outdated) => outdated.run(project).await,
Subcommand::Execute(execute) => execute.run(project, multi, reqwest).await,
Subcommand::Execute(execute) => execute.run(project, reqwest).await,
}
}
}


@ -4,11 +4,13 @@ use async_compression::Level;
use clap::Args;
use colored::Colorize;
use fs_err::tokio as fs;
#[allow(deprecated)]
use pesde::{
manifest::{target::Target, DependencyType},
matching_globs_old_behaviour,
scripts::ScriptName,
source::{
git_index::GitBasedSource,
pesde::{specifier::PesdeDependencySpecifier, PesdePackageSource},
specifiers::DependencySpecifiers,
traits::PackageSource,
@ -68,7 +70,10 @@ impl PublishCommand {
return Ok(());
}
if manifest.target.lib_path().is_none() && manifest.target.bin_path().is_none() {
if manifest.target.lib_path().is_none()
&& manifest.target.bin_path().is_none()
&& manifest.target.scripts().is_none_or(|s| s.is_empty())
{
anyhow::bail!("no exports found in target");
}
@ -125,6 +130,7 @@ impl PublishCommand {
_ => None,
};
#[allow(deprecated)]
let mut paths = matching_globs_old_behaviour(
project.package_dir(),
manifest.includes.iter().map(|s| s.as_str()),
@ -359,10 +365,6 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
}
}
#[cfg(feature = "wally-compat")]
let mut has_wally = false;
let mut has_git = false;
for specifier in manifest
.dependencies
.values_mut()
@ -386,8 +388,6 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
}
#[cfg(feature = "wally-compat")]
DependencySpecifiers::Wally(specifier) => {
has_wally = true;
let index_name = specifier
.index
.as_deref()
@ -403,9 +403,7 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
.to_string(),
);
}
DependencySpecifiers::Git(_) => {
has_git = true;
}
DependencySpecifiers::Git(_) => {}
DependencySpecifiers::Workspace(spec) => {
let pkg_ref = WorkspacePackageSource
.resolve(spec, project, target_kind, &mut HashSet::new())
@ -503,6 +501,16 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
.bin_path()
.map_or("(none)".to_string(), |p| p.to_string())
);
println!(
"\tscripts: {}",
manifest
.target
.scripts()
.filter(|s| !s.is_empty())
.map_or("(none)".to_string(), |s| {
s.keys().cloned().collect::<Vec<_>>().join(", ")
})
);
}
println!(
@ -567,8 +575,7 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
.get(&self.index)
.context(format!("missing index {}", self.index))?;
let source = PesdePackageSource::new(index_url.clone());
source
.refresh(project)
PackageSource::refresh(&source, project)
.await
.context("failed to refresh source")?;
let config = source
@ -584,15 +591,23 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
);
}
manifest.all_dependencies().context("dependency conflict")?;
let deps = manifest.all_dependencies().context("dependency conflict")?;
if !config.git_allowed && has_git {
anyhow::bail!("git dependencies are not allowed on this index");
}
#[cfg(feature = "wally-compat")]
if !config.wally_allowed && has_wally {
anyhow::bail!("wally dependencies are not allowed on this index");
if let Some((disallowed, _)) = deps.iter().find(|(_, (spec, _))| match spec {
DependencySpecifiers::Pesde(spec) => {
!config.other_registries_allowed.is_allowed_or_same(
source.repo_url().clone(),
gix::Url::try_from(spec.index.as_deref().unwrap()).unwrap(),
)
}
DependencySpecifiers::Git(spec) => !config.git_allowed.is_allowed(spec.repo.clone()),
#[cfg(feature = "wally-compat")]
DependencySpecifiers::Wally(spec) => !config
.wally_allowed
.is_allowed(gix::Url::try_from(spec.index.as_deref().unwrap()).unwrap()),
_ => false,
}) {
anyhow::bail!("dependency `{disallowed}` is not allowed on this index");
}
if self.dry_run {
@ -611,7 +626,7 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
.body(archive);
if let Some(token) = project.auth_config().tokens().get(index_url) {
log::debug!("using token for {index_url}");
tracing::debug!("using token for {index_url}");
request = request.header(AUTHORIZATION, token);
}


@ -1,7 +1,8 @@
use crate::cli::{
config::read_config,
version::{
current_version, get_latest_remote_version, get_or_download_version, update_bin_exe,
current_version, get_or_download_version, get_remote_version, no_build_metadata,
update_bin_exe, TagInfo, VersionType,
},
};
use anyhow::Context;
@ -24,33 +25,33 @@ impl SelfUpgradeCommand {
.context("no cached version found")?
.1
} else {
get_latest_remote_version(&reqwest).await?
get_remote_version(&reqwest, VersionType::Latest).await?
};
if latest_version <= current_version() {
let latest_version_no_metadata = no_build_metadata(&latest_version);
if latest_version_no_metadata <= current_version() {
println!("already up to date");
return Ok(());
}
let display_latest_version = latest_version_no_metadata.to_string().yellow().bold();
if !inquire::prompt_confirmation(format!(
"are you sure you want to upgrade {} from {} to {}?",
"are you sure you want to upgrade {} from {} to {display_latest_version}?",
env!("CARGO_BIN_NAME").cyan(),
current_version().to_string().yellow().bold(),
latest_version.to_string().yellow().bold()
env!("CARGO_PKG_VERSION").yellow().bold()
))? {
println!("cancelled upgrade");
return Ok(());
}
let path = get_or_download_version(&reqwest, &latest_version, true)
let path = get_or_download_version(&reqwest, &TagInfo::Complete(latest_version), true)
.await?
.unwrap();
update_bin_exe(&path).await?;
println!(
"upgraded to version {}!",
latest_version.to_string().yellow().bold()
);
println!("upgraded to version {display_latest_version}!");
Ok(())
}


@ -2,7 +2,6 @@ use crate::cli::{progress_bar, run_on_workspace_members};
use anyhow::Context;
use clap::Args;
use colored::Colorize;
use indicatif::MultiProgress;
use pesde::{lockfile::Lockfile, Project};
use std::{collections::HashSet, sync::Arc};
use tokio::sync::Mutex;
@ -11,12 +10,7 @@ use tokio::sync::Mutex;
pub struct UpdateCommand {}
impl UpdateCommand {
pub async fn run(
self,
project: Project,
multi: MultiProgress,
reqwest: reqwest::Client,
) -> anyhow::Result<()> {
pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
let mut refreshed_sources = HashSet::new();
let manifest = project
@ -60,7 +54,6 @@ impl UpdateCommand {
progress_bar(
graph.values().map(|versions| versions.len() as u64).sum(),
rx,
&multi,
"📥 ".to_string(),
"downloading dependencies".to_string(),
"downloaded dependencies".to_string(),
@ -73,9 +66,8 @@ impl UpdateCommand {
},
workspace: run_on_workspace_members(&project, |project| {
let multi = multi.clone();
let reqwest = reqwest.clone();
async move { Box::pin(self.run(project, multi, reqwest)).await }
async move { Box::pin(self.run(project, reqwest)).await }
})
.await?,
})


@ -2,6 +2,7 @@ use crate::cli::{auth::Tokens, home_dir};
use anyhow::Context;
use fs_err::tokio as fs;
use serde::{Deserialize, Serialize};
use tracing::instrument;
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(default)]
@ -30,6 +31,7 @@ impl Default for CliConfig {
}
}
#[instrument(level = "trace")]
pub async fn read_config() -> anyhow::Result<CliConfig> {
let config_string = match fs::read_to_string(home_dir()?.join("config.toml")).await {
Ok(config_string) => config_string,
@ -44,6 +46,7 @@ pub async fn read_config() -> anyhow::Result<CliConfig> {
Ok(config)
}
#[instrument(level = "trace")]
pub async fn write_config(config: &CliConfig) -> anyhow::Result<()> {
let config_string = toml::to_string(config).context("failed to serialize config")?;
fs::write(home_dir()?.join("config.toml"), config_string)


@ -2,7 +2,6 @@ use anyhow::Context;
use colored::Colorize;
use fs_err::tokio as fs;
use futures::StreamExt;
use indicatif::MultiProgress;
use pesde::{
lockfile::Lockfile,
manifest::target::TargetKind,
@ -19,6 +18,7 @@ use std::{
time::Duration,
};
use tokio::pin;
use tracing::instrument;
pub mod auth;
pub mod commands;
@ -43,6 +43,7 @@ pub async fn bin_dir() -> anyhow::Result<PathBuf> {
Ok(bin_dir)
}
#[instrument(skip(project), ret(level = "trace"), level = "debug")]
pub async fn up_to_date_lockfile(project: &Project) -> anyhow::Result<Option<Lockfile>> {
let manifest = project.deser_manifest().await?;
let lockfile = match project.deser_lockfile().await {
@ -56,17 +57,17 @@ pub async fn up_to_date_lockfile(project: &Project) -> anyhow::Result<Option<Loc
};
if manifest.overrides != lockfile.overrides {
log::debug!("overrides are different");
tracing::debug!("overrides are different");
return Ok(None);
}
if manifest.target.kind() != lockfile.target {
log::debug!("target kind is different");
tracing::debug!("target kind is different");
return Ok(None);
}
if manifest.name != lockfile.name || manifest.version != lockfile.version {
log::debug!("name or version is different");
tracing::debug!("name or version is different");
return Ok(None);
}
@ -88,7 +89,7 @@ pub async fn up_to_date_lockfile(project: &Project) -> anyhow::Result<Option<Loc
.iter()
.all(|(_, (spec, ty))| specs.contains(&(spec, ty)));
log::debug!("dependencies are the same: {same_dependencies}");
tracing::debug!("dependencies are the same: {same_dependencies}");
Ok(if same_dependencies {
Some(lockfile)
@ -133,7 +134,7 @@ impl VersionedPackageName {
let versions = graph.get(&self.0).context("package not found in graph")?;
if versions.len() == 1 {
let version = versions.keys().next().unwrap().clone();
log::debug!("only one version found, using {version}");
tracing::debug!("only one version found, using {version}");
version
} else {
anyhow::bail!(
@ -195,21 +196,18 @@ pub fn parse_gix_url(s: &str) -> Result<gix::Url, gix::url::parse::Error> {
pub async fn progress_bar<E: std::error::Error + Into<anyhow::Error>>(
len: u64,
mut rx: tokio::sync::mpsc::Receiver<Result<String, E>>,
multi: &MultiProgress,
prefix: String,
progress_msg: String,
finish_msg: String,
) -> anyhow::Result<()> {
let bar = multi.add(
indicatif::ProgressBar::new(len)
.with_style(
indicatif::ProgressStyle::default_bar()
.template("{prefix}[{elapsed_precise}] {bar:40.208/166} {pos}/{len} {msg}")?
.progress_chars("█▓▒░ "),
)
.with_prefix(prefix)
.with_message(progress_msg),
);
let bar = indicatif::ProgressBar::new(len)
.with_style(
indicatif::ProgressStyle::default_bar()
.template("{prefix}[{elapsed_precise}] {bar:40.208/166} {pos}/{len} {msg}")?
.progress_chars("█▓▒░ "),
)
.with_prefix(prefix)
.with_message(progress_msg);
bar.enable_steady_tick(Duration::from_millis(100));
while let Some(result) = rx.recv().await {


@ -15,7 +15,8 @@ use std::{
env::current_exe,
path::{Path, PathBuf},
};
use tokio::io::AsyncReadExt;
use tokio::io::AsyncWrite;
use tracing::instrument;
pub fn current_version() -> Version {
Version::parse(env!("CARGO_PKG_VERSION")).unwrap()
@ -33,18 +34,33 @@ struct Asset {
url: url::Url,
}
#[instrument(level = "trace")]
fn get_repo() -> (String, String) {
let mut parts = env!("CARGO_PKG_REPOSITORY").split('/').skip(3);
(
let (owner, repo) = (
parts.next().unwrap().to_string(),
parts.next().unwrap().to_string(),
)
);
tracing::trace!("repository for updates: {owner}/{repo}");
(owner, repo)
}
pub async fn get_latest_remote_version(reqwest: &reqwest::Client) -> anyhow::Result<Version> {
#[derive(Debug)]
pub enum VersionType {
Latest,
Specific(Version),
}
#[instrument(skip(reqwest), level = "trace")]
pub async fn get_remote_version(
reqwest: &reqwest::Client,
ty: VersionType,
) -> anyhow::Result<Version> {
let (owner, repo) = get_repo();
let releases = reqwest
let mut releases = reqwest
.get(format!(
"https://api.github.com/repos/{owner}/{repo}/releases",
))
@ -55,17 +71,28 @@ pub async fn get_latest_remote_version(reqwest: &reqwest::Client) -> anyhow::Res
.context("failed to get GitHub API response")?
.json::<Vec<Release>>()
.await
.context("failed to parse GitHub API response")?;
releases
.context("failed to parse GitHub API response")?
.into_iter()
.map(|release| Version::parse(release.tag_name.trim_start_matches('v')).unwrap())
.max()
.context("failed to find latest version")
.filter_map(|release| Version::parse(release.tag_name.trim_start_matches('v')).ok());
match ty {
VersionType::Latest => releases.max(),
VersionType::Specific(version) => {
releases.find(|v| no_build_metadata(v) == no_build_metadata(&version))
}
}
.context("failed to find latest version")
}
pub fn no_build_metadata(version: &Version) -> Version {
let mut version = version.clone();
version.build = semver::BuildMetadata::EMPTY;
version
}
const CHECK_INTERVAL: chrono::Duration = chrono::Duration::hours(6);
#[instrument(skip(reqwest), level = "trace")]
pub async fn check_for_updates(reqwest: &reqwest::Client) -> anyhow::Result<()> {
let config = read_config().await?;
@ -73,9 +100,11 @@ pub async fn check_for_updates(reqwest: &reqwest::Client) -> anyhow::Result<()>
.last_checked_updates
.filter(|(time, _)| chrono::Utc::now() - *time < CHECK_INTERVAL)
{
tracing::debug!("using cached version");
version
} else {
let version = get_latest_remote_version(reqwest).await?;
tracing::debug!("checking for updates");
let version = get_remote_version(reqwest, VersionType::Latest).await?;
write_config(&CliConfig {
last_checked_updates: Some((chrono::Utc::now(), version.clone())),
@ -86,72 +115,77 @@ pub async fn check_for_updates(reqwest: &reqwest::Client) -> anyhow::Result<()>
version
};
let current_version = current_version();
let version_no_metadata = no_build_metadata(&version);
if version > current_version {
let name = env!("CARGO_BIN_NAME");
let changelog = format!("{}/releases/tag/v{version}", env!("CARGO_PKG_REPOSITORY"),);
let unformatted_messages = [
"".to_string(),
format!("update available! {current_version} → {version}"),
format!("changelog: {changelog}"),
format!("run `{name} self-upgrade` to upgrade"),
"".to_string(),
];
let width = unformatted_messages
.iter()
.map(|s| s.chars().count())
.max()
.unwrap()
+ 4;
let column = "│".bright_magenta();
let message = [
"".to_string(),
format!(
"update available! {} → {}",
current_version.to_string().red(),
version.to_string().green()
),
format!("changelog: {}", changelog.blue()),
format!(
"run `{} {}` to upgrade",
name.blue(),
"self-upgrade".yellow()
),
"".to_string(),
]
.into_iter()
.enumerate()
.map(|(i, s)| {
let text_length = unformatted_messages[i].chars().count();
let padding = (width as f32 - text_length as f32) / 2f32;
let padding_l = " ".repeat(padding.floor() as usize);
let padding_r = " ".repeat(padding.ceil() as usize);
format!("{column}{padding_l}{s}{padding_r}{column}")
})
.collect::<Vec<_>>()
.join("\n");
let lines = "─".repeat(width).bright_magenta();
let tl = "╭".bright_magenta();
let tr = "╮".bright_magenta();
let bl = "╰".bright_magenta();
let br = "╯".bright_magenta();
println!("\n{tl}{lines}{tr}\n{message}\n{bl}{lines}{br}\n");
if version_no_metadata <= current_version {
return Ok(());
}
let name = env!("CARGO_BIN_NAME");
let changelog = format!("{}/releases/tag/v{version}", env!("CARGO_PKG_REPOSITORY"));
let unformatted_messages = [
"".to_string(),
format!("update available! {current_version} → {version_no_metadata}"),
format!("changelog: {changelog}"),
format!("run `{name} self-upgrade` to upgrade"),
"".to_string(),
];
let width = unformatted_messages
.iter()
.map(|s| s.chars().count())
.max()
.unwrap()
+ 4;
let column = "│".bright_magenta();
let message = [
"".to_string(),
format!(
"update available! {} → {}",
current_version.to_string().red(),
version_no_metadata.to_string().green()
),
format!("changelog: {}", changelog.blue()),
format!(
"run `{} {}` to upgrade",
name.blue(),
"self-upgrade".yellow()
),
"".to_string(),
]
.into_iter()
.enumerate()
.map(|(i, s)| {
let text_length = unformatted_messages[i].chars().count();
let padding = (width as f32 - text_length as f32) / 2f32;
let padding_l = " ".repeat(padding.floor() as usize);
let padding_r = " ".repeat(padding.ceil() as usize);
format!("{column}{padding_l}{s}{padding_r}{column}")
})
.collect::<Vec<_>>()
.join("\n");
let lines = "─".repeat(width).bright_magenta();
let tl = "╭".bright_magenta();
let tr = "╮".bright_magenta();
let bl = "╰".bright_magenta();
let br = "╯".bright_magenta();
println!("\n{tl}{lines}{tr}\n{message}\n{bl}{lines}{br}\n");
Ok(())
}
pub async fn download_github_release(
#[instrument(skip(reqwest, writer), level = "trace")]
pub async fn download_github_release<W: AsyncWrite + Unpin>(
reqwest: &reqwest::Client,
version: &Version,
) -> anyhow::Result<Vec<u8>> {
mut writer: W,
) -> anyhow::Result<()> {
let (owner, repo) = get_repo();
let release = reqwest
@ -202,19 +236,22 @@ pub async fn download_github_release(
.context("archive has no entry")?
.context("failed to get first archive entry")?;
let mut result = Vec::new();
entry
.read_to_end(&mut result)
tokio::io::copy(&mut entry, &mut writer)
.await
.context("failed to read archive entry bytes")?;
Ok(result)
.context("failed to write archive entry to file")
.map(|_| ())
}
#[derive(Debug)]
pub enum TagInfo {
Complete(Version),
Incomplete(Version),
}
#[instrument(skip(reqwest), level = "trace")]
pub async fn get_or_download_version(
reqwest: &reqwest::Client,
version: &Version,
tag: &TagInfo,
always_give_path: bool,
) -> anyhow::Result<Option<PathBuf>> {
let path = home_dir()?.join("versions");
@ -222,11 +259,23 @@ pub async fn get_or_download_version(
.await
.context("failed to create versions directory")?;
let path = path.join(format!("{version}{}", std::env::consts::EXE_SUFFIX));
let version = match tag {
TagInfo::Complete(version) => version,
// don't fetch the version since it could be cached
TagInfo::Incomplete(version) => version,
};
let path = path.join(format!(
"{}{}",
no_build_metadata(version),
std::env::consts::EXE_SUFFIX
));
let is_requested_version = !always_give_path && *version == current_version();
if path.exists() {
tracing::debug!("version already exists");
return Ok(if is_requested_version {
None
} else {
@ -235,14 +284,29 @@ pub async fn get_or_download_version(
}
if is_requested_version {
tracing::debug!("copying current executable to version directory");
fs::copy(current_exe()?, &path)
.await
.context("failed to copy current executable to version directory")?;
} else {
let bytes = download_github_release(reqwest, version).await?;
fs::write(&path, bytes)
.await
.context("failed to write downloaded version file")?;
let version = match tag {
TagInfo::Complete(version) => version.clone(),
TagInfo::Incomplete(version) => {
get_remote_version(reqwest, VersionType::Specific(version.clone()))
.await
.context("failed to get remote version")?
}
};
tracing::debug!("downloading version");
download_github_release(
reqwest,
&version,
fs::File::create(&path)
.await
.context("failed to create version file")?,
)
.await?;
}
make_executable(&path)
@ -256,6 +320,7 @@ pub async fn get_or_download_version(
})
}
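The cached filename above is derived via `no_build_metadata(version)`, so builds that differ only in semver build metadata share one cache entry. A hypothetical stdlib-only sketch of that idea (the real helper presumably operates on `semver::Version`; strings keep this self-contained):

```rust
// Strip the semver build-metadata suffix (everything after `+`), so
// "0.5.3+commit.abc" and "0.5.3" map to the same cached binary name.
fn no_build_metadata(version: &str) -> &str {
    version.split('+').next().unwrap_or(version)
}

fn main() {
    assert_eq!(no_build_metadata("0.5.3+commit.abc"), "0.5.3");
    assert_eq!(no_build_metadata("0.5.3"), "0.5.3");
}
```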
#[instrument(level = "trace")]
pub async fn update_bin_exe(downloaded_file: &Path) -> anyhow::Result<()> {
let bin_exe_path = bin_dir().await?.join(format!(
"{}{}",


@ -13,6 +13,7 @@ use std::{
collections::HashSet,
sync::{Arc, Mutex},
};
use tracing::{instrument, Instrument};
type MultithreadedGraph = Arc<Mutex<DownloadedGraph>>;
@ -23,6 +24,7 @@ pub(crate) type MultithreadDownloadJob = (
impl Project {
/// Downloads a graph of dependencies
#[instrument(skip(self, graph, refreshed_sources, reqwest), level = "debug")]
pub async fn download_graph(
&self,
graph: &DependencyGraph,
@ -69,76 +71,87 @@ impl Project {
let version_id = version_id.clone();
let node = node.clone();
let span = tracing::info_span!(
"download",
name = name.to_string(),
version_id = version_id.to_string()
);
let project = project.clone();
let reqwest = reqwest.clone();
let downloaded_graph = downloaded_graph.clone();
let package_dir = self.package_dir().to_path_buf();
tokio::spawn(async move {
let source = node.pkg_ref.source();
tokio::spawn(
async move {
let source = node.pkg_ref.source();
let container_folder = node.container_folder(
&package_dir
.join(manifest_target_kind.packages_folder(version_id.target()))
.join(PACKAGES_CONTAINER_NAME),
&name,
version_id.version(),
);
let container_folder = node.container_folder(
&package_dir
.join(manifest_target_kind.packages_folder(version_id.target()))
.join(PACKAGES_CONTAINER_NAME),
&name,
version_id.version(),
);
match fs::create_dir_all(&container_folder).await {
Ok(_) => {}
Err(e) => {
tx.send(Err(errors::DownloadGraphError::Io(e)))
.await
.unwrap();
return;
}
}
let project = project.clone();
log::debug!("downloading {name}@{version_id}");
let (fs, target) =
match source.download(&node.pkg_ref, &project, &reqwest).await {
Ok(target) => target,
match fs::create_dir_all(&container_folder).await {
Ok(_) => {}
Err(e) => {
tx.send(Err(Box::new(e).into())).await.unwrap();
tx.send(Err(errors::DownloadGraphError::Io(e)))
.await
.unwrap();
return;
}
};
}
log::debug!("downloaded {name}@{version_id}");
let project = project.clone();
if write {
if !prod || node.resolved_ty != DependencyType::Dev {
match fs.write_to(container_folder, project.cas_dir(), true).await {
Ok(_) => {}
tracing::debug!("downloading");
let (fs, target) =
match source.download(&node.pkg_ref, &project, &reqwest).await {
Ok(target) => target,
Err(e) => {
tx.send(Err(errors::DownloadGraphError::WriteFailed(e)))
.await
.unwrap();
tx.send(Err(Box::new(e).into())).await.unwrap();
return;
}
};
} else {
log::debug!("skipping writing {name}@{version_id} to disk, dev dependency in prod mode");
tracing::debug!("downloaded");
if write {
if !prod || node.resolved_ty != DependencyType::Dev {
match fs.write_to(container_folder, project.cas_dir(), true).await {
Ok(_) => {}
Err(e) => {
tx.send(Err(errors::DownloadGraphError::WriteFailed(e)))
.await
.unwrap();
return;
}
};
} else {
tracing::debug!(
"skipping write to disk, dev dependency in prod mode"
);
}
}
let display_name = format!("{name}@{version_id}");
{
let mut downloaded_graph = downloaded_graph.lock().unwrap();
downloaded_graph
.entry(name)
.or_default()
.insert(version_id, DownloadedDependencyGraphNode { node, target });
}
tx.send(Ok(display_name)).await.unwrap();
}
let display_name = format!("{name}@{version_id}");
{
let mut downloaded_graph = downloaded_graph.lock().unwrap();
downloaded_graph
.entry(name)
.or_default()
.insert(version_id, DownloadedDependencyGraphNode { node, target });
}
tx.send(Ok(display_name)).await.unwrap();
});
.instrument(span),
);
}
}


@ -11,6 +11,7 @@ use std::{
sync::{Arc, Mutex as StdMutex},
};
use tokio::sync::Mutex;
use tracing::{instrument, Instrument};
/// Filters a graph to only include production dependencies, if `prod` is `true`
pub fn filter_graph(graph: &DownloadedGraph, prod: bool) -> DownloadedGraph {
@ -33,8 +34,16 @@ pub fn filter_graph(graph: &DownloadedGraph, prod: bool) -> DownloadedGraph {
.collect()
}
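`filter_graph` drops dev dependencies only when producing a production install. A simplified stand-in showing the filtering rule (the real graph is nested `name -> version -> node`; a flat map keeps the sketch short):

```rust
use std::collections::BTreeMap;

#[derive(Clone, Copy, PartialEq, Debug)]
enum DepType { Standard, Dev }

// Keep a node unless it is a dev dependency and `prod` is set.
fn filter_graph(graph: &BTreeMap<String, DepType>, prod: bool) -> BTreeMap<String, DepType> {
    graph
        .iter()
        .filter(|(_, ty)| !(prod && **ty == DepType::Dev))
        .map(|(name, ty)| (name.clone(), *ty))
        .collect()
}

fn main() {
    let mut g = BTreeMap::new();
    g.insert("serde".to_string(), DepType::Standard);
    g.insert("insta".to_string(), DepType::Dev);
    assert_eq!(filter_graph(&g, true).len(), 1);
    assert_eq!(filter_graph(&g, false).len(), 2);
}
```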
/// Receiver for dependencies downloaded and linked
pub type DownloadAndLinkReceiver =
tokio::sync::mpsc::Receiver<Result<String, crate::download::errors::DownloadGraphError>>;
impl Project {
/// Downloads a graph of dependencies and links them in the correct order
#[instrument(
skip(self, graph, refreshed_sources, reqwest, pesde_cb),
level = "debug"
)]
pub async fn download_and_link<
F: FnOnce(&Arc<DownloadedGraph>) -> R + Send + 'static,
R: Future<Output = Result<(), E>> + Send,
@ -49,9 +58,7 @@ impl Project {
pesde_cb: F,
) -> Result<
(
tokio::sync::mpsc::Receiver<
Result<String, crate::download::errors::DownloadGraphError>,
>,
DownloadAndLinkReceiver,
impl Future<Output = Result<DownloadedGraph, errors::DownloadAndLinkError<E>>>,
),
errors::DownloadAndLinkError<E>,
@ -78,6 +85,7 @@ impl Project {
// step 1. download pesde dependencies
let (mut pesde_rx, pesde_graph) = this
.download_graph(&graph, &mut refreshed_sources, &reqwest, prod, write, false)
.instrument(tracing::debug_span!("download (pesde)"))
.await?;
while let Some(result) = pesde_rx.recv().await {
@ -89,6 +97,7 @@ impl Project {
// step 2. link pesde dependencies. do so without types
if write {
this.link_dependencies(&filter_graph(&pesde_graph, prod), false)
.instrument(tracing::debug_span!("link (pesde)"))
.await?;
}
@ -103,6 +112,7 @@ impl Project {
// step 3. download wally dependencies
let (mut wally_rx, wally_graph) = this
.download_graph(&graph, &mut refreshed_sources, &reqwest, prod, write, true)
.instrument(tracing::debug_span!("download (wally)"))
.await?;
while let Some(result) = wally_rx.recv().await {
@ -132,6 +142,7 @@ impl Project {
// step 4. link ALL dependencies. do so with types
if write {
this.link_dependencies(&filter_graph(&graph, prod), true)
.instrument(tracing::debug_span!("link (all)"))
.await?;
}


@ -14,8 +14,10 @@ use futures::{future::try_join_all, Stream};
use gix::sec::identity::Account;
use std::{
collections::{HashMap, HashSet},
fmt::Debug,
path::{Path, PathBuf},
};
use tracing::instrument;
use wax::Pattern;
/// Downloading packages
@ -149,29 +151,35 @@ impl Project {
}
/// Read the manifest file
#[instrument(skip(self), ret(level = "trace"), level = "debug")]
pub async fn read_manifest(&self) -> Result<String, errors::ManifestReadError> {
let string = fs::read_to_string(self.package_dir.join(MANIFEST_FILE_NAME)).await?;
Ok(string)
}
// TODO: cache the manifest
/// Deserialize the manifest file
#[instrument(skip(self), ret(level = "trace"), level = "debug")]
pub async fn deser_manifest(&self) -> Result<Manifest, errors::ManifestReadError> {
let string = fs::read_to_string(self.package_dir.join(MANIFEST_FILE_NAME)).await?;
Ok(toml::from_str(&string)?)
}
/// Write the manifest file
#[instrument(skip(self, manifest), level = "debug")]
pub async fn write_manifest<S: AsRef<[u8]>>(&self, manifest: S) -> Result<(), std::io::Error> {
fs::write(self.package_dir.join(MANIFEST_FILE_NAME), manifest.as_ref()).await
}
/// Deserialize the lockfile
#[instrument(skip(self), ret(level = "trace"), level = "debug")]
pub async fn deser_lockfile(&self) -> Result<Lockfile, errors::LockfileReadError> {
let string = fs::read_to_string(self.package_dir.join(LOCKFILE_FILE_NAME)).await?;
Ok(toml::from_str(&string)?)
}
/// Write the lockfile
#[instrument(skip(self, lockfile), level = "debug")]
pub async fn write_lockfile(
&self,
lockfile: Lockfile,
@ -182,7 +190,8 @@ impl Project {
}
/// Get the workspace members
pub async fn workspace_members<P: AsRef<Path>>(
#[instrument(skip(self), level = "debug")]
pub async fn workspace_members<P: AsRef<Path> + Debug>(
&self,
dir: P,
can_ref_self: bool,
@ -222,7 +231,16 @@ impl Project {
}
/// Gets all matching paths in a directory
pub async fn matching_globs_old_behaviour<'a, P: AsRef<Path>, I: IntoIterator<Item = &'a str>>(
#[deprecated(
since = "0.5.0-rc.13",
note = "use `matching_globs` instead, which does not have the old behaviour of including whole directories by their name (`src` instead of `src/**`)"
)]
#[instrument(ret, level = "trace")]
pub async fn matching_globs_old_behaviour<
'a,
P: AsRef<Path> + Debug,
I: IntoIterator<Item = &'a str> + Debug,
>(
dir: P,
globs: I,
relative: bool,
@ -270,7 +288,7 @@ pub async fn matching_globs_old_behaviour<'a, P: AsRef<Path>, I: IntoIterator<It
is_entire_dir_included || is_filename_match,
));
if is_filename_match {
log::warn!("directory name usage found for {}. this is deprecated and will be removed in the future", path.display());
tracing::warn!("directory name usage found for {}. this is deprecated and will be removed in the future", path.display());
}
}
@ -293,7 +311,8 @@ pub async fn matching_globs_old_behaviour<'a, P: AsRef<Path>, I: IntoIterator<It
}
/// Gets all matching paths in a directory
pub async fn matching_globs<'a, P: AsRef<Path>, I: IntoIterator<Item = &'a str>>(
#[instrument(ret, level = "trace")]
pub async fn matching_globs<'a, P: AsRef<Path> + Debug, I: IntoIterator<Item = &'a str> + Debug>(
dir: P,
globs: I,
relative: bool,


@ -117,10 +117,10 @@ pub fn get_lib_require_path(
) -> Result<String, errors::GetLibRequirePath> {
let path = pathdiff::diff_paths(destination_dir, base_dir).unwrap();
let path = if use_new_structure {
log::debug!("using new structure for require path with {:?}", lib_file);
tracing::debug!("using new structure for require path with {lib_file:?}");
lib_file.to_path(path)
} else {
log::debug!("using old structure for require path with {:?}", lib_file);
tracing::debug!("using old structure for require path with {lib_file:?}");
path
};


@ -20,6 +20,7 @@ use std::{
sync::Arc,
};
use tokio::task::spawn_blocking;
use tracing::{instrument, Instrument};
/// Generates linking modules for a project
pub mod generator;
@ -44,6 +45,7 @@ async fn write_cas(destination: PathBuf, cas_dir: &Path, contents: &str) -> std:
impl Project {
/// Links the dependencies of the project
#[instrument(skip(self, graph), level = "debug")]
pub async fn link_dependencies(
&self,
graph: &DownloadedGraph,
@ -55,7 +57,7 @@ impl Project {
// step 1. link all non-wally packages (and their dependencies) temporarily without types
// we do this separately to allow the required tools for the scripts to be installed
self.link(graph, &manifest, &Arc::new(Default::default()))
self.link(graph, &manifest, &Arc::new(Default::default()), false)
.await?;
if !with_types {
@ -110,7 +112,7 @@ impl Project {
}
};
log::debug!("{name}@{version_id} has {} exported types", types.len());
tracing::debug!("contains {} exported types", types.len());
types
} else {
@ -122,8 +124,8 @@ impl Project {
.and_then(|t| t.build_files())
{
let Some(script_path) = roblox_sync_config_gen_script else {
log::warn!("not having a `{}` script in the manifest might cause issues with Roblox linking", ScriptName::RobloxSyncConfigGenerator);
return Ok((version_id, vec![]));
tracing::warn!("not having a `{}` script in the manifest might cause issues with Roblox linking", ScriptName::RobloxSyncConfigGenerator);
return Ok((version_id, types));
};
execute_script(
@ -143,7 +145,7 @@ impl Project {
}
Ok((version_id, types))
}))
}.instrument(tracing::debug_span!("extract types", name = name.to_string(), version_id = version_id.to_string()))))
.await?
.into_iter()
.collect::<HashMap<_, _>>(),
@ -154,7 +156,8 @@ impl Project {
.collect::<HashMap<_, _>>();
// step 3. link all packages (and their dependencies), this time with types
self.link(graph, &manifest, &Arc::new(package_types)).await
self.link(graph, &manifest, &Arc::new(package_types), true)
.await
}
#[allow(clippy::too_many_arguments)]
@ -243,6 +246,7 @@ impl Project {
graph: &DownloadedGraph,
manifest: &Arc<Manifest>,
package_types: &Arc<HashMap<&PackageNames, HashMap<&VersionId, Vec<String>>>>,
is_complete: bool,
) -> Result<(), errors::LinkingError> {
try_join_all(graph.iter().flat_map(|(name, versions)| {
versions.iter().map(|(version_id, node)| {
@ -250,6 +254,12 @@ impl Project {
let manifest = manifest.clone();
let package_types = package_types.clone();
let span = tracing::info_span!(
"link",
name = name.to_string(),
version_id = version_id.to_string()
);
async move {
let (node_container_folder, node_packages_folder) = {
let base_folder = create_and_canonicalize(
@ -291,10 +301,14 @@ impl Project {
.get(dependency_name)
.and_then(|v| v.get(dependency_version_id))
else {
return Err(errors::LinkingError::DependencyNotFound(
dependency_name.to_string(),
dependency_version_id.to_string(),
));
if is_complete {
return Err(errors::LinkingError::DependencyNotFound(
format!("{dependency_name}@{dependency_version_id}"),
format!("{name}@{version_id}"),
));
}
continue;
};
let base_folder = create_and_canonicalize(
@ -338,6 +352,7 @@ impl Project {
Ok(())
}
.instrument(span)
})
}))
.await
@ -362,7 +377,7 @@ pub mod errors {
Io(#[from] std::io::Error),
/// A dependency was not found
#[error("dependency not found: {0}@{1}")]
#[error("dependency `{0}` of `{1}` not found")]
DependencyNotFound(String, String),
/// The library file was not found


@ -14,7 +14,7 @@ use relative_path::RelativePathBuf;
use semver::Version;
use serde::{Deserialize, Serialize};
use std::{
collections::{btree_map::Entry, BTreeMap},
collections::BTreeMap,
path::{Path, PathBuf},
};
@ -32,6 +32,9 @@ pub struct DependencyGraphNode {
pub dependencies: BTreeMap<PackageNames, (VersionId, String)>,
/// The resolved (transformed, for example Peer -> Standard) type of the dependency
pub resolved_ty: DependencyType,
/// Whether the resolved type should be Peer if this isn't depended on
#[serde(default, skip_serializing_if = "std::ops::Not::not")]
pub is_peer: bool,
/// The package reference
pub pkg_ref: PackageRefs,
}
@ -74,45 +77,6 @@ impl DependencyGraphNode {
/// A graph of `DependencyGraphNode`s
pub type DependencyGraph = Graph<DependencyGraphNode>;
pub(crate) fn insert_node(
graph: &mut DependencyGraph,
name: PackageNames,
version: VersionId,
mut node: DependencyGraphNode,
is_top_level: bool,
) {
if !is_top_level && node.direct.take().is_some() {
log::debug!(
"tried to insert {name}@{version} as direct dependency from a non top-level context",
);
}
match graph
.entry(name.clone())
.or_default()
.entry(version.clone())
{
Entry::Vacant(entry) => {
entry.insert(node);
}
Entry::Occupied(existing) => {
let current_node = existing.into_mut();
match (&current_node.direct, &node.direct) {
(Some(_), Some(_)) => {
log::warn!("duplicate direct dependency for {name}@{version}");
}
(None, Some(_)) => {
current_node.direct = node.direct;
}
(_, _) => {}
}
}
}
}
/// A downloaded dependency graph node, i.e. a `DependencyGraphNode` with a `Target`
#[derive(Serialize, Deserialize, Debug, Clone)]
pub struct DownloadedDependencyGraphNode {


@ -1,17 +1,20 @@
#[cfg(feature = "version-management")]
use crate::cli::version::{check_for_updates, get_or_download_version};
use crate::cli::version::{check_for_updates, get_or_download_version, TagInfo};
use crate::cli::{auth::get_tokens, display_err, home_dir, HOME_DIR};
use anyhow::Context;
use clap::{builder::styling::AnsiColor, Parser};
use fs_err::tokio as fs;
use indicatif::MultiProgress;
use indicatif_log_bridge::LogWrapper;
use pesde::{matching_globs, AuthConfig, Project, MANIFEST_FILE_NAME};
use std::{
collections::HashSet,
path::{Path, PathBuf},
};
use tempfile::NamedTempFile;
use tracing::instrument;
use tracing_indicatif::{filter::IndicatifFilter, IndicatifLayer};
use tracing_subscriber::{
filter::LevelFilter, layer::SubscriberExt, util::SubscriberInitExt, EnvFilter, Layer,
};
mod cli;
pub mod util;
@ -38,6 +41,7 @@ struct Cli {
subcommand: cli::commands::Subcommand,
}
#[instrument(level = "trace")]
async fn get_linkable_dir(path: &Path) -> PathBuf {
let mut curr_path = PathBuf::new();
let file_to_try = NamedTempFile::new_in(path).expect("failed to create temporary file");
@ -68,7 +72,7 @@ async fn get_linkable_dir(path: &Path) -> PathBuf {
if fs::hard_link(file_to_try.path(), &try_path).await.is_ok() {
if let Err(err) = fs::remove_file(&try_path).await {
log::warn!(
tracing::warn!(
"failed to remove temporary file at {}: {err}",
try_path.display()
);
@ -129,6 +133,39 @@ async fn run() -> anyhow::Result<()> {
std::process::exit(status.code().unwrap());
}
let indicatif_layer = IndicatifLayer::new().with_filter(IndicatifFilter::new(false));
let tracing_env_filter = EnvFilter::builder()
.with_default_directive(LevelFilter::INFO.into())
.from_env_lossy()
.add_directive("reqwest=info".parse().unwrap())
.add_directive("rustls=info".parse().unwrap())
.add_directive("tokio_util=info".parse().unwrap())
.add_directive("goblin=info".parse().unwrap())
.add_directive("tower=info".parse().unwrap())
.add_directive("hyper=info".parse().unwrap())
.add_directive("h2=info".parse().unwrap());
let fmt_layer =
tracing_subscriber::fmt::layer().with_writer(indicatif_layer.inner().get_stderr_writer());
#[cfg(debug_assertions)]
let fmt_layer = fmt_layer.with_timer(tracing_subscriber::fmt::time::uptime());
#[cfg(not(debug_assertions))]
let fmt_layer = fmt_layer
.pretty()
.with_timer(())
.with_line_number(false)
.with_file(false)
.with_target(false);
tracing_subscriber::registry()
.with(tracing_env_filter)
.with(fmt_layer)
.with(indicatif_layer)
.init();
let (project_root_dir, project_workspace_dir) = 'finder: {
let mut current_path = Some(cwd.clone());
let mut project_root = None::<PathBuf>;
@ -191,16 +228,13 @@ async fn run() -> anyhow::Result<()> {
(project_root.unwrap_or_else(|| cwd.clone()), workspace_dir)
};
let multi = {
let logger = pretty_env_logger::formatted_builder()
.parse_env(pretty_env_logger::env_logger::Env::default().default_filter_or("info"))
.build();
let multi = MultiProgress::new();
LogWrapper::new(multi.clone(), logger).try_init().unwrap();
multi
};
tracing::trace!(
"project root: {}\nworkspace root: {}",
project_root_dir.display(),
project_workspace_dir
.as_ref()
.map_or("none".to_string(), |p| p.display().to_string())
);
let home_dir = home_dir()?;
let data_dir = home_dir.join("data");
@ -217,7 +251,7 @@ async fn run() -> anyhow::Result<()> {
}
.join("cas");
log::debug!("using cas dir in {}", cas_dir.display());
tracing::debug!("using cas dir in {}", cas_dir.display());
let project = Project::new(
project_root_dir,
@ -256,7 +290,7 @@ async fn run() -> anyhow::Result<()> {
.and_then(|manifest| manifest.pesde_version);
let exe_path = if let Some(version) = target_version {
get_or_download_version(&reqwest, &version, false).await?
get_or_download_version(&reqwest, &TagInfo::Incomplete(version), false).await?
} else {
None
};
@ -278,7 +312,7 @@ async fn run() -> anyhow::Result<()> {
let cli = Cli::parse();
cli.subcommand.run(project, multi, reqwest).await
cli.subcommand.run(project, reqwest).await
}
#[tokio::main]


@ -1,13 +1,13 @@
use relative_path::RelativePathBuf;
use semver::Version;
use serde::{Deserialize, Serialize};
use std::collections::{BTreeMap, HashMap};
use crate::{
manifest::{overrides::OverrideKey, target::Target},
names::PackageName,
source::specifiers::DependencySpecifiers,
};
use relative_path::RelativePathBuf;
use semver::Version;
use serde::{Deserialize, Serialize};
use std::collections::{BTreeMap, HashMap};
use tracing::instrument;
/// Overrides
pub mod overrides;
@ -107,6 +107,7 @@ pub enum DependencyType {
impl Manifest {
/// Get all dependencies from the manifest
#[instrument(skip(self), ret(level = "trace"), level = "debug")]
pub fn all_dependencies(
&self,
) -> Result<


@ -35,8 +35,16 @@ impl FromStr for PackageName {
.ok_or(Self::Err::InvalidFormat(s.to_string()))?;
for (reason, part) in [(ErrorReason::Scope, scope), (ErrorReason::Name, name)] {
if part.len() < 3 || part.len() > 32 {
return Err(Self::Err::InvalidLength(reason, part.to_string()));
let min_len = match reason {
ErrorReason::Scope => 3,
ErrorReason::Name => 1,
};
if !(min_len..=32).contains(&part.len()) {
return Err(match reason {
ErrorReason::Scope => Self::Err::InvalidScopeLength(part.to_string()),
ErrorReason::Name => Self::Err::InvalidNameLength(part.to_string()),
});
}
if part.chars().all(|c| c.is_ascii_digit()) {
@ -231,9 +239,13 @@ pub mod errors {
#[error("package {0} `{1}` starts or ends with an underscore")]
PrePostfixUnderscore(ErrorReason, String),
/// The package name is not within 3-32 characters long
#[error("package {0} `{1}` is not within 3-32 characters long")]
InvalidLength(ErrorReason, String),
/// The package name's scope part is not within 3-32 characters long
#[error("package scope `{0}` is not within 3-32 characters long")]
InvalidScopeLength(String),
/// The package name's name part is not within 1-32 characters long
#[error("package name `{0}` is not within 1-32 characters long")]
InvalidNameLength(String),
}
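The change above relaxes the length rule per part: scopes still need 3..=32 characters, while names now only need 1..=32. A small sketch of the check, mirroring the `min_len` match in `PackageName::from_str`:

```rust
// Validate one part of a `scope/name` package name. Scopes keep the old
// minimum of 3; names may be as short as 1 character.
fn part_ok(part: &str, is_scope: bool) -> bool {
    let min_len = if is_scope { 3 } else { 1 };
    (min_len..=32).contains(&part.len())
}

fn main() {
    assert!(part_ok("acme", true));
    assert!(!part_ok("ab", true)); // scope too short
    assert!(part_ok("a", false));  // one-character names are now allowed
    assert!(!part_ok(&"x".repeat(33), false));
}
```

Splitting the old `InvalidLength` variant into `InvalidScopeLength` and `InvalidNameLength` lets each error message state its own bounds.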
/// Errors that can occur when working with Wally package names


@ -3,6 +3,7 @@ use fs_err::tokio as fs;
use git2::{ApplyLocation, Diff, DiffFormat, DiffLineType, Repository, Signature};
use relative_path::RelativePathBuf;
use std::path::Path;
use tracing::instrument;
/// Set up a git repository for patches
pub fn setup_patches_repo<P: AsRef<Path>>(dir: P) -> Result<Repository, git2::Error> {
@ -69,6 +70,7 @@ pub fn create_patch<P: AsRef<Path>>(dir: P) -> Result<Vec<u8>, git2::Error> {
impl Project {
/// Apply patches to the project's dependencies
#[instrument(skip(self, graph), level = "debug")]
pub async fn apply_patches(
&self,
graph: &DownloadedGraph,
@ -97,7 +99,7 @@ impl Project {
.get(&name)
.and_then(|versions| versions.get(&version_id))
else {
log::warn!(
tracing::warn!(
"patch for {name}@{version_id} not applied because it is not in the graph"
);
tx.send(Ok(format!("{name}@{version_id}"))).await.unwrap();
@ -114,7 +116,7 @@ impl Project {
);
tokio::spawn(async move {
log::debug!("applying patch to {name}@{version_id}");
tracing::debug!("applying patch to {name}@{version_id}");
let patch = match fs::read(&patch_path).await {
Ok(patch) => patch,
@ -195,7 +197,9 @@ impl Project {
}
}
log::debug!("patch applied to {name}@{version_id}, removing .git directory");
tracing::debug!(
"patch applied to {name}@{version_id}, removing .git directory"
);
if let Err(e) = fs::remove_dir_all(container_folder.join(".git")).await {
tx.send(Err(errors::ApplyPatchesError::DotGitRemove(e)))


@ -1,5 +1,5 @@
use crate::{
lockfile::{insert_node, DependencyGraph, DependencyGraphNode},
lockfile::{DependencyGraph, DependencyGraphNode},
manifest::DependencyType,
names::PackageNames,
source::{
@ -11,10 +11,55 @@ use crate::{
},
Project, DEFAULT_INDEX_NAME,
};
use std::collections::{HashMap, HashSet, VecDeque};
use std::collections::{btree_map::Entry, HashMap, HashSet, VecDeque};
use tracing::{instrument, Instrument};
fn insert_node(
graph: &mut DependencyGraph,
name: PackageNames,
version: VersionId,
mut node: DependencyGraphNode,
is_top_level: bool,
) {
if !is_top_level && node.direct.take().is_some() {
tracing::debug!(
"tried to insert {name}@{version} as direct dependency from a non top-level context",
);
}
match graph
.entry(name.clone())
.or_default()
.entry(version.clone())
{
Entry::Vacant(entry) => {
entry.insert(node);
}
Entry::Occupied(existing) => {
let current_node = existing.into_mut();
match (&current_node.direct, &node.direct) {
(Some(_), Some(_)) => {
tracing::warn!("duplicate direct dependency for {name}@{version}");
}
(None, Some(_)) => {
current_node.direct = node.direct;
}
(_, _) => {}
}
}
}
}
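The merge rule in `insert_node` hinges on the `btree_map::Entry` API: a vacant slot takes the node as-is, while an occupied slot only inherits the `direct` marker when it did not already have one (the real function also warns on a duplicate direct dependency). A simplified stand-in with `Node` as a stand-in type:

```rust
use std::collections::btree_map::Entry;
use std::collections::BTreeMap;

#[derive(Clone, Debug, PartialEq)]
struct Node { direct: Option<String> }

// Insert or merge: never overwrite an existing `direct` marker.
fn insert_node(graph: &mut BTreeMap<String, Node>, name: String, node: Node) {
    match graph.entry(name) {
        Entry::Vacant(entry) => {
            entry.insert(node);
        }
        Entry::Occupied(existing) => {
            let current = existing.into_mut();
            if current.direct.is_none() {
                current.direct = node.direct;
            }
        }
    }
}

fn main() {
    let mut graph = BTreeMap::new();
    insert_node(&mut graph, "pkg".into(), Node { direct: None });
    insert_node(&mut graph, "pkg".into(), Node { direct: Some("alias".into()) });
    assert_eq!(graph["pkg"].direct.as_deref(), Some("alias"));
}
```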
impl Project {
/// Create a dependency graph from the project's manifest
#[instrument(
skip(self, previous_graph, refreshed_sources),
ret(level = "trace"),
level = "debug"
)]
pub async fn dependency_graph(
&self,
previous_graph: Option<&DependencyGraph>,
@ -39,7 +84,7 @@ impl Project {
if let Some(previous_graph) = previous_graph {
for (name, versions) in previous_graph {
for (version, node) in versions {
let Some((_, specifier, source_ty)) = &node.direct else {
let Some((old_alias, specifier, source_ty)) = &node.direct else {
// this is not a direct dependency, will be added if it's still being used later
continue;
};
@ -51,13 +96,16 @@ impl Project {
let Some(alias) = all_specifiers.remove(&(specifier.clone(), *source_ty))
else {
log::debug!(
"dependency {name}@{version} from old dependency graph is no longer in the manifest",
tracing::debug!(
"dependency {name}@{version} (old alias {old_alias}) from old dependency graph is no longer in the manifest",
);
continue;
};
log::debug!("resolved {}@{} from old dependency graph", name, version);
let span = tracing::info_span!("resolve from old graph", alias);
let _guard = span.enter();
tracing::debug!("resolved {}@{} from old dependency graph", name, version);
insert_node(
&mut graph,
name.clone(),
@ -72,22 +120,24 @@ impl Project {
let mut queue = node
.dependencies
.iter()
.map(|(name, (version, _))| (name, version, 0usize))
.map(|(name, (version, dep_alias))| {
(
name,
version,
vec![alias.to_string(), dep_alias.to_string()],
)
})
.collect::<VecDeque<_>>();
while let Some((dep_name, dep_version, depth)) = queue.pop_front() {
while let Some((dep_name, dep_version, path)) = queue.pop_front() {
let inner_span =
tracing::info_span!("resolve dependency", path = path.join(">"));
let _inner_guard = inner_span.enter();
if let Some(dep_node) = previous_graph
.get(dep_name)
.and_then(|v| v.get(dep_version))
{
log::debug!(
"{}resolved dependency {}@{} from {}@{}",
"\t".repeat(depth),
dep_name,
dep_version,
name,
version
);
tracing::debug!("resolved sub-dependency {dep_name}@{dep_version}");
insert_node(
&mut graph,
dep_name.clone(),
@ -99,15 +149,20 @@ impl Project {
dep_node
.dependencies
.iter()
.map(|(name, (version, _))| (name, version, depth + 1))
.map(|(name, (version, alias))| {
(
name,
version,
path.iter()
.cloned()
.chain(std::iter::once(alias.to_string()))
.collect(),
)
})
.for_each(|dep| queue.push_back(dep));
} else {
log::warn!(
"dependency {}@{} from {}@{} not found in previous graph",
dep_name,
dep_version,
name,
version
tracing::warn!(
"dependency {dep_name}@{dep_version} not found in previous graph"
);
}
}
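The queue items above now carry the full chain of aliases instead of a bare `depth` counter, so each span can log a path like `root>http>url`. A sketch of that bookkeeping over illustrative edge data:

```rust
use std::collections::VecDeque;

// Breadth-first walk that records, for every reached node, the alias path
// that led to it (joined with ">", as in the tracing spans).
fn alias_paths(edges: &[(&str, &str)], root: &str) -> Vec<String> {
    let mut queue: VecDeque<(&str, Vec<String>)> =
        VecDeque::from([(root, vec![root.to_string()])]);
    let mut paths = Vec::new();
    while let Some((node, path)) = queue.pop_front() {
        paths.push(path.join(">"));
        for (from, to) in edges {
            if *from == node {
                let mut next = path.clone();
                next.push(to.to_string());
                queue.push_back((*to, next));
            }
        }
    }
    paths
}

fn main() {
    let paths = alias_paths(&[("root", "http"), ("http", "url")], "root");
    assert_eq!(paths, ["root", "root>http", "root>http>url"]);
}
```

Cloning the path per queued child costs more than an integer depth, but the extra context makes the "resolve dependency" spans unambiguous when the same package appears under several aliases.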
@ -130,223 +185,232 @@ impl Project {
.collect::<VecDeque<_>>();
while let Some((specifier, ty, dependant, path, overridden, target)) = queue.pop_front() {
let alias = path.last().unwrap().clone();
let depth = path.len() - 1;
async {
let alias = path.last().unwrap().clone();
let depth = path.len() - 1;
log::debug!(
"{}resolving {specifier} from {}",
"\t".repeat(depth),
path.join(">")
);
let source = match &specifier {
DependencySpecifiers::Pesde(specifier) => {
let index_url = if !is_published_package && (depth == 0 || overridden) {
let index_name = specifier.index.as_deref().unwrap_or(DEFAULT_INDEX_NAME);
tracing::debug!("resolving {specifier} ({ty:?})");
let source = match &specifier {
DependencySpecifiers::Pesde(specifier) => {
let index_url = if !is_published_package && (depth == 0 || overridden) {
let index_name = specifier.index.as_deref().unwrap_or(DEFAULT_INDEX_NAME);
manifest
.indices
.get(index_name)
.ok_or(errors::DependencyGraphError::IndexNotFound(
index_name.to_string(),
))?
.clone()
manifest
.indices
.get(index_name)
.ok_or(errors::DependencyGraphError::IndexNotFound(
index_name.to_string(),
))?
.clone()
} else {
let index_url = specifier.index.clone().unwrap();
index_url
.clone()
.try_into()
// specifiers in indices store the index url in this field
.unwrap()
};
PackageSources::Pesde(PesdePackageSource::new(index_url))
}
#[cfg(feature = "wally-compat")]
DependencySpecifiers::Wally(specifier) => {
let index_url = if !is_published_package && (depth == 0 || overridden) {
let index_name = specifier.index.as_deref().unwrap_or(DEFAULT_INDEX_NAME);
manifest
.wally_indices
.get(index_name)
.ok_or(errors::DependencyGraphError::WallyIndexNotFound(
index_name.to_string(),
))?
.clone()
} else {
let index_url = specifier.index.clone().unwrap();
index_url
.clone()
.try_into()
// specifiers in indices store the index url in this field
.unwrap()
};
PackageSources::Wally(crate::source::wally::WallyPackageSource::new(index_url))
}
DependencySpecifiers::Git(specifier) => PackageSources::Git(
crate::source::git::GitPackageSource::new(specifier.repo.clone()),
),
DependencySpecifiers::Workspace(_) => {
PackageSources::Workspace(crate::source::workspace::WorkspacePackageSource)
}
};
if refreshed_sources.insert(source.clone()) {
source.refresh(self).await.map_err(|e| Box::new(e.into()))?;
}
let (name, resolved) = source
.resolve(&specifier, self, target, refreshed_sources)
.await
.map_err(|e| Box::new(e.into()))?;
let Some(target_version_id) = graph
.get(&name)
.and_then(|versions| {
versions
.keys()
// only consider versions that are compatible with the specifier
.filter(|ver| resolved.contains_key(ver))
.max()
})
.or_else(|| resolved.last_key_value().map(|(ver, _)| ver))
.cloned()
else {
return Err(Box::new(errors::DependencyGraphError::NoMatchingVersion(
format!("{specifier} ({target})"),
)));
};
let resolved_ty = if (is_published_package || depth == 0) && ty == DependencyType::Peer
{
DependencyType::Standard
} else {
ty
};
if let Some((dependant_name, dependant_version_id)) = dependant {
graph
.get_mut(&dependant_name)
.and_then(|versions| versions.get_mut(&dependant_version_id))
.and_then(|node| {
node.dependencies
.insert(name.clone(), (target_version_id.clone(), alias.clone()))
});
}
let pkg_ref = &resolved[&target_version_id];
if let Some(already_resolved) = graph
.get_mut(&name)
.and_then(|versions| versions.get_mut(&target_version_id))
{
tracing::debug!(
"{}@{} already resolved",
name,
target_version_id
);
if std::mem::discriminant(&already_resolved.pkg_ref)
!= std::mem::discriminant(pkg_ref)
{
tracing::warn!(
"resolved package {name}@{target_version_id} has a different source than previously resolved one, this may cause issues",
);
}
if already_resolved.resolved_ty == DependencyType::Peer {
already_resolved.resolved_ty = resolved_ty;
}
if ty == DependencyType::Peer && depth == 0 {
already_resolved.is_peer = true;
}
if already_resolved.direct.is_none() && depth == 0 {
already_resolved.direct = Some((alias.clone(), specifier.clone(), ty));
}
return Ok(());
}
let node = DependencyGraphNode {
direct: if depth == 0 {
Some((alias.clone(), specifier.clone(), ty))
} else {
None
},
pkg_ref: pkg_ref.clone(),
dependencies: Default::default(),
resolved_ty,
is_peer: if depth == 0 {
false
} else {
ty == DependencyType::Peer
},
};
insert_node(
&mut graph,
name.clone(),
target_version_id.clone(),
node.clone(),
depth == 0,
);
let index_url = specifier.index.clone().unwrap();
index_url
.clone()
.try_into()
// specifiers in indices store the index url in this field
.unwrap()
};
PackageSources::Pesde(PesdePackageSource::new(index_url))
}
#[cfg(feature = "wally-compat")]
DependencySpecifiers::Wally(specifier) => {
let index_url = if !is_published_package && (depth == 0 || overridden) {
let index_name = specifier.index.as_deref().unwrap_or(DEFAULT_INDEX_NAME);
manifest
.wally_indices
.get(index_name)
.ok_or(errors::DependencyGraphError::WallyIndexNotFound(
index_name.to_string(),
))?
.clone()
} else {
let index_url = specifier.index.clone().unwrap();
index_url
.clone()
.try_into()
// specifiers in indices store the index url in this field
.unwrap()
};
PackageSources::Wally(crate::source::wally::WallyPackageSource::new(index_url))
}
DependencySpecifiers::Git(specifier) => PackageSources::Git(
crate::source::git::GitPackageSource::new(specifier.repo.clone()),
),
DependencySpecifiers::Workspace(_) => {
PackageSources::Workspace(crate::source::workspace::WorkspacePackageSource)
}
};
if refreshed_sources.insert(source.clone()) {
source.refresh(self).await.map_err(|e| Box::new(e.into()))?;
}
let (name, resolved) = source
.resolve(&specifier, self, target, refreshed_sources)
.await
.map_err(|e| Box::new(e.into()))?;
let Some(target_version_id) = graph
.get(&name)
.and_then(|versions| {
versions
.keys()
// only consider versions that are compatible with the specifier
.filter(|ver| resolved.contains_key(ver))
.max()
})
.or_else(|| resolved.last_key_value().map(|(ver, _)| ver))
.cloned()
else {
return Err(Box::new(errors::DependencyGraphError::NoMatchingVersion(
format!("{specifier} ({target})"),
)));
};
let resolved_ty = if (is_published_package || depth == 0) && ty == DependencyType::Peer
{
DependencyType::Standard
} else {
ty
};
if let Some((dependant_name, dependant_version_id)) = dependant {
graph
.get_mut(&dependant_name)
.and_then(|versions| versions.get_mut(&dependant_version_id))
.and_then(|node| {
node.dependencies
.insert(name.clone(), (target_version_id.clone(), alias.clone()))
});
}
let pkg_ref = &resolved[&target_version_id];
if let Some(already_resolved) = graph
.get_mut(&name)
.and_then(|versions| versions.get_mut(&target_version_id))
{
log::debug!(
"{}{}@{} already resolved",
"\t".repeat(depth),
name,
target_version_id
);
if std::mem::discriminant(&already_resolved.pkg_ref)
!= std::mem::discriminant(pkg_ref)
{
log::warn!(
"resolved package {name}@{target_version_id} has a different source than the previously resolved one at {}, this may cause issues",
path.join(">")
);
}
if already_resolved.resolved_ty == DependencyType::Peer
&& resolved_ty == DependencyType::Standard
{
already_resolved.resolved_ty = resolved_ty;
}
if already_resolved.direct.is_none() && depth == 0 {
already_resolved.direct = Some((alias.clone(), specifier.clone(), ty));
}
continue;
}
let node = DependencyGraphNode {
direct: if depth == 0 {
Some((alias.clone(), specifier.clone(), ty))
} else {
None
},
pkg_ref: pkg_ref.clone(),
dependencies: Default::default(),
resolved_ty,
};
insert_node(
&mut graph,
name.clone(),
target_version_id.clone(),
node.clone(),
depth == 0,
);
log::debug!(
"{}resolved {}@{} from new dependency graph",
"\t".repeat(depth),
name,
target_version_id
);
for (dependency_alias, (dependency_spec, dependency_ty)) in
pkg_ref.dependencies().clone()
{
if dependency_ty == DependencyType::Dev {
// dev dependencies of dependencies are to be ignored
continue;
}
let overridden = manifest.overrides.iter().find_map(|(key, spec)| {
key.0.iter().find_map(|override_path| {
// if the path up until the last element is the same as the current path,
// and the last element in the path is the dependency alias,
// then the specifier is to be overridden
(path.len() == override_path.len() - 1
&& path == override_path[..override_path.len() - 1]
&& override_path.last() == Some(&dependency_alias))
.then_some(spec)
})
});
if overridden.is_some() {
log::debug!(
"{}overridden specifier found for {} ({dependency_spec})",
"\t".repeat(depth),
path.iter()
.map(|s| s.as_str())
.chain(std::iter::once(dependency_alias.as_str()))
.collect::<Vec<_>>()
.join(">"),
);
}
queue.push_back((
overridden.cloned().unwrap_or(dependency_spec),
dependency_ty,
Some((name.clone(), target_version_id.clone())),
path.iter()
.cloned()
.chain(std::iter::once(dependency_alias))
.collect(),
overridden.is_some(),
*target_version_id.target(),
));
Ok(())
}
.instrument(tracing::info_span!("resolve new/changed", path = path.join(">")))
.await?;
}
for (name, versions) in &graph {
for (name, versions) in &mut graph {
for (version_id, node) in versions {
if node.is_peer && node.direct.is_none() {
node.resolved_ty = DependencyType::Peer;
}
if node.resolved_ty == DependencyType::Peer {
log::warn!("peer dependency {name}@{version_id} was not resolved");
tracing::warn!("peer dependency {name}@{version_id} was not resolved");
}
}
}
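The override check in the resolver loop above is compact; the same rule, as a standalone sketch (hypothetical `is_overridden` helper with plain `String` paths in place of the manifest's key types): an override registered at `override_path` applies when the resolver sits at `path` and is about to resolve the dependency named `alias`.

```rust
/// Sketch of the override-path rule commented in the resolver above:
/// the override applies when `path` equals `override_path` minus its last
/// element, and that last element is the dependency's alias.
fn is_overridden(path: &[String], override_path: &[String], alias: &str) -> bool {
    !override_path.is_empty()
        && path.len() == override_path.len() - 1
        && path == &override_path[..override_path.len() - 1]
        && override_path.last().map(String::as_str) == Some(alias)
}

fn main() {
    let path = vec!["react".to_string()];
    let over = vec!["react".to_string(), "scheduler".to_string()];
    // the override targets `scheduler` as pulled in via `react`
    assert!(is_overridden(&path, &over, "scheduler"));
    // a different alias at the same depth is not overridden
    assert!(!is_overridden(&path, &over, "other"));
}
```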

View file

@ -1,7 +1,7 @@
use crate::Project;
use std::{
ffi::OsStr,
fmt::{Display, Formatter},
fmt::{Debug, Display, Formatter},
path::Path,
process::Stdio,
};
@ -9,6 +9,7 @@ use tokio::{
io::{AsyncBufReadExt, BufReader},
process::Command,
};
use tracing::instrument;
/// Script names used by pesde
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, PartialOrd, Ord)]
@ -30,7 +31,8 @@ impl Display for ScriptName {
}
}
pub(crate) async fn execute_script<A: IntoIterator<Item = S>, S: AsRef<OsStr>>(
#[instrument(skip(project), level = "debug")]
pub(crate) async fn execute_script<A: IntoIterator<Item = S> + Debug, S: AsRef<OsStr> + Debug>(
script_name: ScriptName,
script_path: &Path,
args: A,
@ -59,10 +61,10 @@ pub(crate) async fn execute_script<A: IntoIterator<Item = S>, S: AsRef<OsStr>>(
while let Some(line) = stderr.next_line().await.transpose() {
match line {
Ok(line) => {
log::error!("[{script}]: {line}");
tracing::error!("[{script}]: {line}");
}
Err(e) => {
log::error!("ERROR IN READING STDERR OF {script}: {e}");
tracing::error!("ERROR IN READING STDERR OF {script}: {e}");
break;
}
}
@ -78,11 +80,11 @@ pub(crate) async fn execute_script<A: IntoIterator<Item = S>, S: AsRef<OsStr>>(
stdout_str.push_str(&line);
stdout_str.push('\n');
} else {
log::info!("[{script_2}]: {line}");
tracing::info!("[{script_2}]: {line}");
}
}
Err(e) => {
log::error!("ERROR IN READING STDOUT OF {script_2}: {e}");
tracing::error!("ERROR IN READING STDOUT OF {script_2}: {e}");
break;
}
}
@ -95,7 +97,7 @@ pub(crate) async fn execute_script<A: IntoIterator<Item = S>, S: AsRef<OsStr>>(
}
}
Err(e) if e.kind() == std::io::ErrorKind::NotFound => {
log::warn!("Lune could not be found in PATH: {e}");
tracing::warn!("Lune could not be found in PATH: {e}");
Ok(None)
}

View file

@ -9,6 +9,7 @@ use serde::{Deserialize, Serialize};
use sha2::{Digest, Sha256};
use std::{
collections::BTreeMap,
fmt::Debug,
future::Future,
path::{Path, PathBuf},
};
@ -17,6 +18,7 @@ use tokio::{
io::{AsyncReadExt, AsyncWriteExt},
pin,
};
use tracing::instrument;
/// A file system entry
#[derive(Debug, Clone, Serialize, Deserialize)]
@ -125,7 +127,8 @@ pub(crate) async fn store_in_cas<
impl PackageFS {
/// Write the package to the given destination
pub async fn write_to<P: AsRef<Path>, Q: AsRef<Path>>(
#[instrument(skip(self), level = "debug")]
pub async fn write_to<P: AsRef<Path> + Debug, Q: AsRef<Path> + Debug>(
&self,
destination: P,
cas_path: Q,
@ -211,7 +214,8 @@ impl PackageFS {
}
/// Returns the contents of the file with the given hash
pub async fn read_file<P: AsRef<Path>, H: AsRef<str>>(
#[instrument(skip(self), ret(level = "trace"), level = "debug")]
pub async fn read_file<P: AsRef<Path> + Debug, H: AsRef<str> + Debug>(
&self,
file_hash: H,
cas_path: P,

View file

@ -27,6 +27,7 @@ use std::{
sync::Arc,
};
use tokio::{sync::Mutex, task::spawn_blocking};
use tracing::instrument;
/// The Git package reference
pub mod pkg_ref;
@ -70,10 +71,12 @@ impl PackageSource for GitPackageSource {
type ResolveError = errors::ResolveError;
type DownloadError = errors::DownloadError;
#[instrument(skip_all, level = "debug")]
async fn refresh(&self, project: &Project) -> Result<(), Self::RefreshError> {
GitBasedSource::refresh(self, project).await
}
#[instrument(skip_all, level = "debug")]
async fn resolve(
&self,
specifier: &Self::Specifier,
@ -329,6 +332,7 @@ impl PackageSource for GitPackageSource {
))
}
#[instrument(skip_all, level = "debug")]
async fn download(
&self,
pkg_ref: &Self::Ref,
@ -343,7 +347,7 @@ impl PackageSource for GitPackageSource {
match fs::read_to_string(&index_file).await {
Ok(s) => {
log::debug!(
tracing::debug!(
"using cached index file for package {}#{}",
pkg_ref.repo,
pkg_ref.tree_id
@ -487,7 +491,7 @@ impl PackageSource for GitPackageSource {
}
if pkg_ref.use_new_structure() && name == "default.project.json" {
log::debug!(
tracing::debug!(
"removing default.project.json from {}#{} at {path} - using new structure",
pkg_ref.repo,
pkg_ref.tree_id

View file

@ -1,8 +1,11 @@
#![allow(async_fn_in_trait)]
use crate::{util::authenticate_conn, Project};
use fs_err::tokio as fs;
use gix::remote::Direction;
use std::fmt::Debug;
use tokio::task::spawn_blocking;
use tracing::instrument;
/// A trait for sources that are based on Git repositories
pub trait GitBasedSource {
@ -90,7 +93,11 @@ pub trait GitBasedSource {
}
/// Reads a file from a tree
pub fn read_file<I: IntoIterator<Item = P> + Clone, P: ToString + PartialEq<gix::bstr::BStr>>(
#[instrument(skip(tree), ret, level = "trace")]
pub fn read_file<
I: IntoIterator<Item = P> + Clone + Debug,
P: ToString + PartialEq<gix::bstr::BStr>,
>(
tree: &gix::Tree,
file_path: I,
) -> Result<Option<String>, errors::ReadFile> {
@ -120,6 +127,7 @@ pub fn read_file<I: IntoIterator<Item = P> + Clone, P: ToString + PartialEq<gix:
}
/// Gets the root tree of a repository
#[instrument(skip(repo), level = "trace")]
pub fn root_tree(repo: &gix::Repository) -> Result<gix::Tree, errors::TreeError> {
// this is a bare repo, so this is the actual path
let path = repo.path().to_path_buf();

View file

@ -30,6 +30,7 @@ use crate::{
use fs_err::tokio as fs;
use futures::StreamExt;
use tokio::task::spawn_blocking;
use tracing::instrument;
/// The pesde package reference
pub mod pkg_ref;
@ -73,6 +74,7 @@ impl PesdePackageSource {
}
/// Reads the config file
#[instrument(skip_all, ret(level = "trace"), level = "debug")]
pub async fn config(&self, project: &Project) -> Result<IndexConfig, errors::ConfigError> {
let repo_url = self.repo_url.clone();
let path = self.path(project);
@ -99,10 +101,12 @@ impl PackageSource for PesdePackageSource {
type ResolveError = errors::ResolveError;
type DownloadError = errors::DownloadError;
#[instrument(skip_all, level = "debug")]
async fn refresh(&self, project: &Project) -> Result<(), Self::RefreshError> {
GitBasedSource::refresh(self, project).await
}
#[instrument(skip_all, level = "debug")]
async fn resolve(
&self,
specifier: &Self::Specifier,
@ -124,10 +128,10 @@ impl PackageSource for PesdePackageSource {
}
};
let entries: IndexFile = toml::from_str(&string)
let IndexFile { entries, .. } = toml::from_str(&string)
.map_err(|e| Self::ResolveError::Parse(specifier.name.to_string(), e))?;
log::debug!("{} has {} possible entries", specifier.name, entries.len());
tracing::debug!("{} has {} possible entries", specifier.name, entries.len());
Ok((
PackageNames::Pesde(specifier.name.clone()),
@ -155,6 +159,7 @@ impl PackageSource for PesdePackageSource {
))
}
#[instrument(skip_all, level = "debug")]
async fn download(
&self,
pkg_ref: &Self::Ref,
@ -171,7 +176,7 @@ impl PackageSource for PesdePackageSource {
match fs::read_to_string(&index_file).await {
Ok(s) => {
log::debug!(
tracing::debug!(
"using cached index file for package {}@{} {}",
pkg_ref.name,
pkg_ref.version,
@ -192,7 +197,7 @@ impl PackageSource for PesdePackageSource {
let mut request = reqwest.get(&url).header(ACCEPT, "application/octet-stream");
if let Some(token) = project.auth_config.tokens().get(&self.repo_url) {
log::debug!("using token for {}", self.repo_url);
tracing::debug!("using token for {}", self.repo_url);
request = request.header(AUTHORIZATION, token);
}
@ -274,28 +279,35 @@ impl Default for AllowedRegistries {
}
}
impl AllowedRegistries {
/// Whether the given URL is allowed
pub fn is_allowed(&self, mut this: Url, mut external: Url) -> bool {
// strip .git suffix to allow for more flexible matching
this.path = this.path.strip_suffix(b".git").unwrap_or(&this.path).into();
external.path = external
.path
.strip_suffix(b".git")
.unwrap_or(&external.path)
.into();
// strips .git suffix to allow for more flexible matching
fn simplify_url(mut url: Url) -> Url {
url.path = url.path.strip_suffix(b".git").unwrap_or(&url.path).into();
url
}
this == external
|| (match self {
Self::All(all) => *all,
Self::Specific(urls) => urls.contains(&this) || urls.contains(&external),
})
impl AllowedRegistries {
fn _is_allowed(&self, url: &Url) -> bool {
match self {
Self::All(all) => *all,
Self::Specific(urls) => urls.contains(url),
}
}
/// Whether the given URL is allowed
pub fn is_allowed(&self, url: Url) -> bool {
self._is_allowed(&simplify_url(url))
}
/// Whether the given URL is allowed, or is the same as the given URL
pub fn is_allowed_or_same(&self, this: Url, external: Url) -> bool {
let this = simplify_url(this);
let external = simplify_url(external);
(this == external) || self._is_allowed(&external) || self._is_allowed(&this)
}
}
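The refactor above centralizes the `.git`-suffix stripping in `simplify_url` so both registry URLs are normalized before comparison. The same idea on plain strings (illustrative helper, not the `gix` `Url` type used in the real code):

```rust
// Strip a trailing `.git` so `https://host/repo` and `https://host/repo.git`
// compare equal, mirroring `simplify_url` above.
fn simplify(url: &str) -> &str {
    url.strip_suffix(".git").unwrap_or(url)
}

fn main() {
    assert_eq!(
        simplify("https://example.com/org/index.git"),
        "https://example.com/org/index"
    );
    // URLs without the suffix pass through unchanged
    assert_eq!(
        simplify("https://example.com/org/index"),
        "https://example.com/org/index"
    );
}
```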
/// The configuration for the pesde index
#[derive(Deserialize, Debug, Clone)]
#[serde(deny_unknown_fields)]
pub struct IndexConfig {
/// The URL of the API
pub api: url::Url,
@ -303,22 +315,22 @@ pub struct IndexConfig {
pub download: Option<String>,
/// Whether Git is allowed as a source for publishing packages
#[serde(default)]
pub git_allowed: bool,
pub git_allowed: AllowedRegistries,
/// Whether other registries are allowed as a source for publishing packages
#[serde(default)]
pub other_registries_allowed: AllowedRegistries,
/// Whether Wally is allowed as a source for publishing packages
#[serde(default)]
pub wally_allowed: bool,
pub wally_allowed: AllowedRegistries,
/// The OAuth client ID for GitHub
#[serde(default)]
pub github_oauth_client_id: Option<String>,
/// The maximum size of an archive in bytes
#[serde(default = "default_archive_size")]
pub max_archive_size: usize,
/// The package to use for default script implementations
/// The packages to display in the CLI for default script implementations
#[serde(default)]
pub scripts_package: Option<PackageName>,
pub scripts_packages: Vec<PackageName>,
}
impl IndexConfig {
@ -420,8 +432,20 @@ pub struct IndexFileEntry {
pub dependencies: BTreeMap<String, (DependencySpecifiers, DependencyType)>,
}
/// The package metadata in the index file
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq, Default)]
pub struct IndexMetadata {}
/// The index file for a package
pub type IndexFile = BTreeMap<VersionId, IndexFileEntry>;
#[derive(Debug, Serialize, Deserialize, Clone, PartialEq, Eq)]
pub struct IndexFile {
/// Any package-wide metadata
#[serde(default, skip_serializing_if = "crate::util::is_default")]
pub meta: IndexMetadata,
/// The entries in the index file
#[serde(flatten)]
pub entries: BTreeMap<VersionId, IndexFileEntry>,
}
/// Errors that can occur when interacting with the pesde package source
pub mod errors {

View file

@ -11,6 +11,7 @@ use crate::{
Project, LINK_LIB_NO_FILE_FOUND,
};
use fs_err::tokio as fs;
use tracing::instrument;
#[derive(Deserialize)]
#[serde(rename_all = "camelCase")]
@ -19,7 +20,8 @@ struct SourcemapNode {
file_paths: Vec<RelativePathBuf>,
}
pub(crate) async fn find_lib_path(
#[instrument(skip(project, package_dir), level = "debug")]
async fn find_lib_path(
project: &Project,
package_dir: &Path,
) -> Result<Option<RelativePathBuf>, errors::FindLibPathError> {
@ -29,7 +31,7 @@ pub(crate) async fn find_lib_path(
.scripts
.get(&ScriptName::SourcemapGenerator.to_string())
else {
log::warn!("no sourcemap generator script found in manifest");
tracing::warn!("no sourcemap generator script found in manifest");
return Ok(None);
};
@ -55,6 +57,7 @@ pub(crate) async fn find_lib_path(
pub(crate) const WALLY_MANIFEST_FILE_NAME: &str = "wally.toml";
#[instrument(skip(project, tempdir), level = "debug")]
pub(crate) async fn get_target(
project: &Project,
tempdir: &TempDir,

View file

@ -1,13 +1,13 @@
use std::collections::BTreeMap;
use semver::{Version, VersionReq};
use serde::{Deserialize, Deserializer};
use crate::{
manifest::{errors, DependencyType},
names::wally::WallyPackageName,
source::{specifiers::DependencySpecifiers, wally::specifier::WallyDependencySpecifier},
};
use semver::{Version, VersionReq};
use serde::{Deserialize, Deserializer};
use tracing::instrument;
#[derive(Deserialize, Clone, Debug)]
#[serde(rename_all = "lowercase")]
@ -63,6 +63,7 @@ pub struct WallyManifest {
impl WallyManifest {
/// Get all dependencies from the manifest
#[instrument(skip(self), ret(level = "trace"), level = "debug")]
pub fn all_dependencies(
&self,
) -> Result<

View file

@ -30,6 +30,7 @@ use std::{
use tempfile::tempdir;
use tokio::{io::AsyncWriteExt, sync::Mutex, task::spawn_blocking};
use tokio_util::compat::FuturesAsyncReadCompatExt;
use tracing::instrument;
pub(crate) mod compat_util;
pub(crate) mod manifest;
@ -68,6 +69,7 @@ impl WallyPackageSource {
}
/// Reads the config file
#[instrument(skip_all, ret(level = "trace"), level = "debug")]
pub async fn config(&self, project: &Project) -> Result<WallyIndexConfig, errors::ConfigError> {
let repo_url = self.repo_url.clone();
let path = self.path(project);
@ -94,10 +96,12 @@ impl PackageSource for WallyPackageSource {
type ResolveError = errors::ResolveError;
type DownloadError = errors::DownloadError;
#[instrument(skip_all, level = "debug")]
async fn refresh(&self, project: &Project) -> Result<(), Self::RefreshError> {
GitBasedSource::refresh(self, project).await
}
#[instrument(skip_all, level = "debug")]
async fn resolve(
&self,
specifier: &Self::Specifier,
@ -111,7 +115,7 @@ impl PackageSource for WallyPackageSource {
let string = match read_file(&tree, [scope, name]) {
Ok(Some(s)) => s,
Ok(None) => {
log::debug!(
tracing::debug!(
"{} not found in wally registry. searching in backup registries",
specifier.name
);
@ -134,7 +138,7 @@ impl PackageSource for WallyPackageSource {
.await
{
Ok((name, results)) => {
log::debug!("found {} in backup registry {registry}", name);
tracing::debug!("found {} in backup registry {registry}", name);
return Ok((name, results));
}
Err(errors::ResolveError::NotFound(_)) => {
@ -162,7 +166,7 @@ impl PackageSource for WallyPackageSource {
.collect::<Result<_, _>>()
.map_err(|e| Self::ResolveError::Parse(specifier.name.to_string(), e))?;
log::debug!("{} has {} possible entries", specifier.name, entries.len());
tracing::debug!("{} has {} possible entries", specifier.name, entries.len());
Ok((
PackageNames::Wally(specifier.name.clone()),
@ -192,6 +196,7 @@ impl PackageSource for WallyPackageSource {
))
}
#[instrument(skip_all, level = "debug")]
async fn download(
&self,
pkg_ref: &Self::Ref,
@ -207,7 +212,7 @@ impl PackageSource for WallyPackageSource {
let tempdir = match fs::read_to_string(&index_file).await {
Ok(s) => {
log::debug!(
tracing::debug!(
"using cached index file for package {}@{}",
pkg_ref.name,
pkg_ref.version
@ -240,7 +245,7 @@ impl PackageSource for WallyPackageSource {
);
if let Some(token) = project.auth_config.tokens().get(&self.repo_url) {
log::debug!("using token for {}", self.repo_url);
tracing::debug!("using token for {}", self.repo_url);
request = request.header(AUTHORIZATION, token);
}

View file

@ -13,6 +13,7 @@ use relative_path::RelativePathBuf;
use reqwest::Client;
use std::collections::{BTreeMap, HashSet};
use tokio::pin;
use tracing::instrument;
/// The workspace package reference
pub mod pkg_ref;
@ -35,6 +36,7 @@ impl PackageSource for WorkspacePackageSource {
Ok(())
}
#[instrument(skip_all, level = "debug")]
async fn resolve(
&self,
specifier: &Self::Specifier,
@ -126,6 +128,7 @@ impl PackageSource for WorkspacePackageSource {
))
}
#[instrument(skip_all, level = "debug")]
async fn download(
&self,
pkg_ref: &Self::Ref,

View file

@ -19,7 +19,7 @@ impl DependencySpecifier for WorkspaceDependencySpecifier {}
impl Display for WorkspaceDependencySpecifier {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "workspace:{}{}", self.version, self.name)
write!(f, "{}@workspace:{}", self.name, self.version)
}
}

View file

@ -83,3 +83,7 @@ pub fn deserialize_git_like_url<'de, D: Deserializer<'de>>(
pub fn hash<S: AsRef<[u8]>>(struc: S) -> String {
format!("{:x}", Sha256::digest(struc.as_ref()))
}
pub fn is_default<T: Default + Eq>(t: &T) -> bool {
t == &T::default()
}

View file

@ -26,6 +26,7 @@ export type TargetInfo = {
kind: TargetKind
lib: boolean
bin: boolean
scripts?: string[]
}
export type TargetKind = "roblox" | "roblox_server" | "lune" | "luau"

View file

@ -2,7 +2,7 @@
import { page } from "$app/stores"
import GitHub from "$lib/components/GitHub.svelte"
import type { TargetInfo } from "$lib/registry-api"
import { BinaryIcon, Globe, Icon, LibraryIcon, Mail } from "lucide-svelte"
import { BinaryIcon, Globe, Icon, LibraryIcon, Mail, ScrollIcon } from "lucide-svelte"
import type { ComponentType } from "svelte"
import TargetSelector from "../../TargetSelector.svelte"
import Command from "./Command.svelte"
@ -36,11 +36,13 @@
const exportNames: Partial<Record<keyof TargetInfo, string>> = {
lib: "Library",
bin: "Binary",
scripts: "Scripts",
}
const exportIcons: Partial<Record<keyof TargetInfo, ComponentType<Icon>>> = {
lib: LibraryIcon,
bin: BinaryIcon,
scripts: ScrollIcon,
}
const exportEntries = $derived(
@ -92,20 +94,30 @@
<ul class="mb-6 space-y-0.5">
{#each exportEntries as [exportKey, exportName]}
{@const Icon = exportIcons[exportKey as keyof TargetInfo]}
<li class="flex items-center">
<Icon aria-hidden="true" class="text-primary mr-2 size-5" />
{exportName}
<li>
<div class="flex items-center">
<Icon aria-hidden="true" class="text-primary mr-2 size-5" />
{exportName}
</div>
{#if exportKey === "bin"}
<p class="text-body/80 mb-4 mt-3 text-sm">
This package provides a binary that can be executed after installation, or globally
via:
</p>
<Command command={xCommand} class="mb-6" />
{:else if exportKey === "scripts"}
<div class="text-body/80 mt-3 flex flex-wrap gap-2 text-sm">
{#each currentTarget?.scripts ?? [] as script}
<div class="bg-card text-heading w-max truncate rounded px-3 py-2" title={script}>
{script}
</div>
{/each}
</div>
{/if}
</li>
{/each}
</ul>
{#if currentTarget?.bin}
<p class="text-body/80 -mt-3 mb-4 text-sm">
This package provides a binary that can be executed after installation, or globally via:
</p>
<Command command={xCommand} class="mb-6" />
{/if}
{#if data.pkg.authors && data.pkg.authors.length > 0}
<h2 class="text-heading mb-2 text-lg font-semibold">Authors</h2>
<ul>

View file

@ -2,7 +2,9 @@
const { data } = $props()
</script>
<div class="prose min-w-0 py-8 prose-pre:w-full prose-pre:overflow-auto">
<div
class="prose prose-pre:w-full prose-pre:overflow-auto prose-img:inline-block prose-img:m-0 prose-video:inline-block prose-video:m-0 min-w-0 py-8"
>
<!-- eslint-disable-next-line svelte/no-at-html-tags -->
{@html data.readmeHtml}
</div>