Compare commits


33 commits

daimond113
32906400ec
docs: update scripts docs
2025-01-18 16:47:07 +01:00
Nidoxs
5c2f831c26
docs: add an aside for symlink errors on Windows (#20)
* Add an aside for symlink errors on Windows

* Remove redundant whitespace

* Inline URL

Co-authored-by: dai <72147841+daimond113@users.noreply.github.com>

* Revert titles to "Caution" instead of "Warning"

* Use inline code block for error message

* Update docs/src/content/docs/installation.mdx

Co-authored-by: dai <72147841+daimond113@users.noreply.github.com>

* Update docs/src/content/docs/installation.mdx

Co-authored-by: dai <72147841+daimond113@users.noreply.github.com>

---------

Co-authored-by: dai <72147841+daimond113@users.noreply.github.com>
2025-01-05 19:25:52 +01:00
daimond113
97d9251f69
docs: remove branches from git revs
2025-01-03 18:09:07 +01:00
daimond113
89a2103164
chore(release): prepare for v0.5.3
2024-12-30 00:56:58 +01:00
daimond113
0c159e7689
docs: add missing changelog entries
2024-12-30 00:56:03 +01:00
daimond113
4f75af88b7
feat: add meta in index file to preserve future compat
2024-12-30 00:49:24 +01:00
daimond113
f009c957ca
feat: remove verbosity from release mode logs
2024-12-26 22:51:00 +01:00
3569ff32cd
ci: debug builds action (#15)
* chore(actions): create debug build action

* chore(actions): remove unneeded targets

Also do the following:
* Use v4 of artifact upload action
* Install Linux-specific build dependencies
* Do not include version-management feature while building
* Fix cargo build command
* Include native mac x86 target instead of cross compilation

* chore(actions): fix bad compile command

Turns out I hallucinated `--exclude-features` into existence.

* chore(actions): add job to shorten github commit SHA

* chore(actions): use bash patterns for commit SHA trimming

* chore(actions): fix bash pattern syntax being improper

* chore(actions): use `tee` to write trimmed version to stdout for debugging

* chore(actions): include full semver version including git commit SHA

* chore(actions): checkout code first in `get-version` job

* chore(actions): write `trimmed_sha` to `GITHUB_OUTPUT` correctly

* chore(actions): add name for `get-version` job

* chore(actions): make matrix `job-name`s be consistent with release workflow

* chore(actions): provide `exe` for windows manually instead of glob

Also makes step now error on no matching files.
2024-12-25 15:45:29 +01:00
daimond113
c3e764ddda
fix: display spans outside debug
2024-12-22 12:43:42 +01:00
dai
db3335bbf7
docs: add SECURITY.md
2024-12-20 19:06:35 +01:00
Aristosis
711b0009cb
docs: fix improper assignment to PATH (#8)
* Fix improper assignment to path installation.mdx

* Use the home variable installation.mdx

* Remove leading slash

---------

Co-authored-by: dai <72147841+daimond113@users.noreply.github.com>
2024-12-19 21:21:32 +01:00
daimond113
f88b800d51
chore(release): prepare for v0.5.2
2024-12-19 16:19:07 +01:00
daimond113
28df3bcca4
feat(registry): add sentry tracing
2024-12-19 16:18:26 +01:00
daimond113
0f74e2efa3
fix: do not error on missing deps until full linking
2024-12-18 23:34:49 +01:00
daimond113
a6c1108d5b
feat: switch registry to tracing logging
2024-12-18 22:29:10 +01:00
daimond113
9535175a45
feat: add more tracing info
2024-12-18 22:00:58 +01:00
daimond113
d9d27cf45b
fix: resolve pesde_version tags properly
2024-12-18 16:03:50 +01:00
daimond113
60fb68fcf3
fix: change dependency types for removed peers
2024-12-17 14:58:21 +01:00
daimond113
78976834b2
docs(changelog): add missing changelog entry for logging switch
2024-12-17 14:57:38 +01:00
daimond113
52603ea43e
feat: switch to tracing for logging
2024-12-16 23:00:37 +01:00
daimond113
0dde647042
fix(website): render imgs inline
2024-12-15 12:37:24 +01:00
daimond113
3196a83b25
chore(release): prepare for v0.5.1
2024-12-15 00:38:22 +01:00
daimond113
d387c27f16
fix: ignore build metadata when comparing cli versions
2024-12-15 00:35:16 +01:00
daimond113
a6846597ca
docs: correct changelog diff link
2024-12-15 00:01:35 +01:00
daimond113
3810a3b9ff
ci: attempt to fix release ci
2024-12-14 23:59:58 +01:00
daimond113
52c502359b
chore(release): prepare for v0.5.0
2024-12-14 23:53:59 +01:00
daimond113
7d1e20da8c
chore: update dependencies
2024-12-14 23:51:37 +01:00
daimond113
d35f34e8f0
fix: gracefully handle unparsable versions & dont display metadata
2024-12-14 19:57:33 +01:00
daimond113
9ee75ec9c9
feat: remove lower bound limit on pesde package name length
2024-12-14 17:41:57 +01:00
daimond113
919b0036e5
feat: display included scripts in publish command
2024-12-13 23:52:45 +01:00
daimond113
7466131f04
fix: link with types without roblox_sync_config_generator script
2024-12-13 17:06:47 +01:00
dai
0be7dd4d0e
chore(release): prepare for v0.5.0-rc.18
2024-12-12 16:23:27 +01:00
dai
f8d0bc6c4d
fix: correctly get index URLs in publish command
2024-12-12 16:23:11 +01:00
54 changed files with 1377 additions and 931 deletions

.github/workflows/debug.yml (new file, 79 lines)

@@ -0,0 +1,79 @@
name: Debug
on:
  push:
  pull_request:

jobs:
  get-version:
    name: Get build version
    runs-on: ubuntu-latest
    outputs:
      version: v${{ steps.get_version.outputs.value }}+rev.g${{ steps.trim_sha.outputs.trimmed_sha }}
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Get package version
        uses: SebRollen/toml-action@v1.2.0
        id: get_version
        with:
          file: Cargo.toml
          field: package.version

      - name: Trim commit SHA
        id: trim_sha
        run: |
          commit_sha=${{ github.sha }}
          echo "trimmed_sha=${commit_sha:0:7}" | tee $GITHUB_OUTPUT

  build:
    strategy:
      matrix:
        include:
          - job-name: windows-x86_64
            target: x86_64-pc-windows-msvc
            runs-on: windows-latest
            artifact-name: pesde-debug-${{ needs.get-version.outputs.version }}-windows-x86_64
          - job-name: linux-x86_64
            target: x86_64-unknown-linux-gnu
            runs-on: ubuntu-latest
            artifact-name: pesde-debug-${{ needs.get-version.outputs.version }}-linux-x86_64
          - job-name: macos-x86_64
            target: x86_64-apple-darwin
            runs-on: macos-13
            artifact-name: pesde-debug-${{ needs.get-version.outputs.version }}-macos-x86_64
          - job-name: macos-aarch64
            target: aarch64-apple-darwin
            runs-on: macos-latest
            artifact-name: pesde-debug-${{ needs.get-version.outputs.version }}-macos-aarch64

    name: Build for ${{ matrix.job-name }}
    runs-on: ${{ matrix.runs-on }}
    needs: get-version
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Install Linux build dependencies
        if: ${{ matrix.runs-on == 'ubuntu-latest' }}
        run: |
          sudo apt-get update
          sudo apt-get install libdbus-1-dev pkg-config

      - name: Install Rust toolchain
        uses: dtolnay/rust-toolchain@stable

      - name: Compile in debug mode
        run: cargo build --bins --no-default-features --features bin,patches,wally-compat --target ${{ matrix.target }} --locked

      - name: Upload artifact
        uses: actions/upload-artifact@v4
        with:
          name: ${{ matrix.artifact-name }}
          if-no-files-found: error
          path: |
            target/${{ matrix.target }}/debug/pesde.exe
            target/${{ matrix.target }}/debug/pesde
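The `Trim commit SHA` step above relies on bash substring expansion rather than an external tool. A standalone sketch of the same idea, with a made-up SHA standing in for `${{ github.sha }}`:

```shell
#!/usr/bin/env bash
# ${var:offset:length} trims the 40-character commit SHA down to the
# conventional 7-character short form.
commit_sha="0123456789abcdef0123456789abcdef01234567"  # stand-in for ${{ github.sha }}
trimmed_sha="${commit_sha:0:7}"

# In the workflow, piping through `tee $GITHUB_OUTPUT` both records the
# step output and echoes it to the job log for debugging; here we print it.
echo "trimmed_sha=${trimmed_sha}"
```

This is why the later commit "use bash patterns for commit SHA trimming" could drop any dependency on `cut` or `git rev-parse --short`.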

Release workflow

@@ -4,8 +4,44 @@ on:
     tags:
       - v*
 env:
+  CRATE_NAME: pesde
   BIN_NAME: pesde
 jobs:
+  prepare:
+    name: Prepare
+    runs-on: ubuntu-latest
+    outputs:
+      version: ${{ steps.extract_version.outputs.VERSION }}
+      found: ${{ steps.ensure_not_published.outputs.FOUND }}
+    steps:
+      - uses: actions/checkout@v4
+      - name: Extract version
+        id: extract_version
+        shell: bash
+        run: |
+          VERSION=$(echo ${{ github.ref_name }} | cut -d'+' -f1 | cut -c 2-)
+          echo "VERSION=$VERSION" >> "$GITHUB_OUTPUT"
+      - name: Ensure not published
+        id: ensure_not_published
+        shell: bash
+        env:
+          VERSION: ${{ steps.extract_version.outputs.VERSION }}
+        run: |
+          CRATE_NAME="${{ env.CRATE_NAME }}"
+          if [ ${#CRATE_NAME} -eq 1 ]; then
+            DIR="1"
+          elif [ ${#CRATE_NAME} -eq 2 ]; then
+            DIR="2"
+          elif [ ${#CRATE_NAME} -eq 3 ]; then
+            DIR="3/${CRATE_NAME:0:1}"
+          else
+            DIR="${CRATE_NAME:0:2}/${CRATE_NAME:2:2}"
+          fi
+          FOUND=$(curl -sSL --fail-with-body "https://index.crates.io/$DIR/${{ env.CRATE_NAME }}" | jq -s 'any(.[]; .vers == "${{ env.VERSION }}")')
+          echo "FOUND=$FOUND" >> "$GITHUB_OUTPUT"
   build:
     strategy:
       matrix:
@@ -31,13 +67,17 @@ jobs:
           target: aarch64-apple-darwin
     runs-on: ${{ matrix.os }}
     name: Build for ${{ matrix.host }}-${{ matrix.arch }}
+    needs: [ prepare ]
+    if: ${{ needs.prepare.outputs.found == 'false' }}
+    env:
+      VERSION: ${{ needs.prepare.outputs.version }}
     steps:
       - uses: actions/checkout@v4
       - uses: dtolnay/rust-toolchain@stable
       - name: Set env
         shell: bash
         run: |
-          ARCHIVE_NAME=${{ env.BIN_NAME }}-$(echo ${{ github.ref_name }} | cut -c 2-)-${{ matrix.host }}-${{ matrix.arch }}
+          ARCHIVE_NAME=${{ env.BIN_NAME }}-${{ env.VERSION }}-${{ matrix.host }}-${{ matrix.arch }}
           echo "ARCHIVE_NAME=$ARCHIVE_NAME" >> $GITHUB_ENV
@@ -91,7 +131,9 @@ jobs:
     permissions:
       contents: write
       pull-requests: read
-    needs: [ build, publish ]
+    needs: [ prepare, publish ]
+    env:
+      VERSION: ${{ needs.prepare.outputs.version }}
     steps:
       - uses: actions/checkout@v4
         with:
@@ -107,7 +149,7 @@ jobs:
         with:
           token: ${{ secrets.GITHUB_TOKEN }}
           tag_name: ${{ github.ref_name }}
-          name: ${{ github.ref_name }}
+          name: v${{ env.VERSION }}
           draft: true
-          prerelease: ${{ startsWith(github.ref_name, 'v0') }}
+          prerelease: ${{ startsWith(env.VERSION, '0') }}
           files: artifacts/*
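The `Ensure not published` step works because crates.io's sparse index shards crate files by name length: 1- and 2-character names live under `1/` and `2/`, 3-character names under `3/<first letter>/`, and longer names under `<first two>/<next two>/`. A standalone sketch of the version extraction and directory computation; `index_dir` is a helper name introduced here, and the actual crates.io lookup is left as a comment since it needs network access:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Strip any "+build.metadata" and the leading "v" from a tag name,
# mirroring the workflow's `cut -d'+' -f1 | cut -c 2-` pipeline.
ref_name="v0.5.3+registry.0.1.2"   # stand-in for ${{ github.ref_name }}
version="${ref_name%%+*}"          # drop build metadata -> v0.5.3
version="${version#v}"             # drop leading v      -> 0.5.3
echo "VERSION=$version"

# Sparse-index directory for a crate name, same branching as the
# workflow step above.
index_dir() {
  local name="$1"
  case "${#name}" in
    1) echo "1" ;;
    2) echo "2" ;;
    3) echo "3/${name:0:1}" ;;
    *) echo "${name:0:2}/${name:2:2}" ;;
  esac
}

echo "pesde lives at $(index_dir pesde)/pesde"

# The workflow then queries the index (network access required):
#   curl -sSL --fail-with-body "https://index.crates.io/$(index_dir pesde)/pesde" \
#     | jq -s 'any(.[]; .vers == "'"$version"'")'
```

The `jq -s` slurp is needed because each index file is newline-delimited JSON, one object per published version.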

CHANGELOG.md

@@ -5,156 +5,93 @@ (context: the "Keep a Changelog" / "Semantic Versioning" preamble is unchanged)

This hunk removes the seventeen pre-release sections (`0.5.0-rc.1` through `0.5.0-rc.17`, dated 2024-10-06 through 2024-12-11) together with their compare links, and consolidates their entries under the stable releases. The replacement text reads:

## [0.5.3] - 2024-12-30
### Added
- Add meta field in index files to preserve compatibility with potential future changes by @daimond113

### Changed
- Remove verbosity from release mode logging by @daimond113

## [0.5.2] - 2024-12-19
### Fixed
- Change dependency types for removed peer dependencies by @daimond113
- Resolve version to correct tag for `pesde_version` field by @daimond113
- Do not error on missing dependencies until full linking by @daimond113

### Changed
- Switch from `log` to `tracing` for logging by @daimond113

## [0.5.1] - 2024-12-15
### Fixed
- Ignore build metadata when comparing CLI versions by @daimond113

## [0.5.0] - 2024-12-14
### Added
- Add support for multiple targets under the same package name in workspace members by @daimond113
- Add `yes` argument to skip all prompts in publish command by @daimond113
- Publish all workspace members when publishing a workspace by @daimond113
- Inform user about not finding any bin package when using its bin invocation by @daimond113
- Support full version requirements in workspace version field by @daimond113
- Improved authentication system for registry changes by @daimond113
- New website by @lukadev-0
- Add `--index` flag to `publish` command to publish to a specific index by @daimond113
- Support fallback Wally registries by @daimond113
- Print that no updates are available in `outdated` command by @daimond113
- Support negated globs in `workspace_members` field by @daimond113
- Make `includes` use glob patterns by @daimond113
- Use symlinks for workspace dependencies to not require reinstalling by @daimond113
- Add `auth token` command to print the auth token for the index by @daimond113
- Support specifying which external registries are allowed on registries by @daimond113
- Add improved CLI styling by @daimond113
- Install pesde dependencies before Wally to support scripts packages by @daimond113
- Support packages exporting scripts by @daimond113
- Support using workspace root as a member by @daimond113
- Allow multiple, user selectable scripts packages to be selected (& custom packages inputted) in `init` command by @daimond113
- Support granular control over which repositories are allowed in various specifier types by @daimond113
- Display included scripts in `publish` command by @daimond113

### Fixed
- Fix versions with dots not being handled correctly by @daimond113
- Use workspace specifiers' `target` field when resolving by @daimond113
- Add feature gates to `wally-compat` specific code in init command by @daimond113
- Remove duplicated manifest file name in `publish` command by @daimond113
- Allow use of Luau packages in `execute` command by @daimond113
- Fix `self-upgrade` overwriting its own binary by @daimond113
- Correct `pesde.toml` inclusion message in `publish` command by @daimond113
- Allow writes to files when `link` is false in PackageFS::write_to by @daimond113
- Handle missing revisions in AnyPackageIdentifier::from_str by @daimond113
- Make GitHub OAuth client ID config optional by @daimond113
- Use updated aliases when reusing lockfile dependencies by @daimond113
- Listen for device flow completion without requiring pressing enter by @daimond113
- Sync scripts repo in background by @daimond113
- Don't make CAS files read-only on Windows (file removal is disallowed if the file is read-only) by @daimond113
- Validate package names are lowercase by @daimond113
- Use a different algorithm for finding a CAS directory to avoid issues with mounted drives by @daimond113
- Remove default.project.json from Git pesde dependencies by @daimond113
- Correctly (de)serialize workspace specifiers by @daimond113
- Fix CAS finder algorithm issues with Windows by @daimond113
- Fix CAS finder algorithm's AlreadyExists error by @daimond113
- Use moved path when setting file to read-only by @daimond113
- Correctly link Wally server packages by @daimond113
- Fix `self-install` doing a cross-device move by @daimond113
- Add back mistakenly removed updates check caching by @daimond113
- Set download error source to inner error to propagate the error by @daimond113
- Correctly copy workspace packages by @daimond113
- Fix peer dependencies being resolved incorrectly by @daimond113
- Set PESDE_ROOT to the correct path in `pesde run` by @daimond113
- Install dependencies of packages in `x` command by @daimond113
- Fix `includes` not supporting root files by @daimond113
- Link dependencies before type extraction to support more use cases by @daimond113
- Strip `.luau` extension from linker modules' require paths to comply with Luau by @daimond113
- Correctly handle graph paths for resolving overridden packages by @daimond113
- Do not require `--` in bin package executables on Unix by @daimond113
- Do not require lib or bin exports if package exports scripts by @daimond113
- Correctly resolve URLs in `publish` command by @daimond113
- Add Roblox types in linker modules even with no config generator script by @daimond113

### Removed
- Remove special scripts repo handling to favour standard packages by @daimond113

### Changed
- Rewrite the entire project in a more maintainable way by @daimond113
- Support workspaces by @daimond113
- Support multiple targets for a single package by @daimond113
- Make registry much easier to self-host by @daimond113
- Start maintaining a changelog by @daimond113
- Optimize boolean expression in `publish` command by @daimond113
- Switched to fs-err for better errors with file system operations by @daimond113
- Use body bytes over multipart for publishing packages by @daimond113
- `self-upgrade` now will check for updates by itself by default by @daimond113
- Only store `pesde_version` executables in the version cache by @daimond113
- Remove lower bound limit of 3 characters for pesde package names by @daimond113

### Performance
- Clone dependency repos shallowly by @daimond113
- Switch to async Rust by @daimond113
- Asyncify dependency linking by @daimond113
- Use `exec` in Unix bin linking to reduce the number of processes by @daimond113

[0.5.3]: https://github.com/daimond113/pesde/compare/v0.5.2%2Bregistry.0.1.1..v0.5.3%2Bregistry.0.1.2
[0.5.2]: https://github.com/daimond113/pesde/compare/v0.5.1%2Bregistry.0.1.0..v0.5.2%2Bregistry.0.1.1
[0.5.1]: https://github.com/daimond113/pesde/compare/v0.5.0%2Bregistry.0.1.0..v0.5.1%2Bregistry.0.1.0
[0.5.0]: https://github.com/daimond113/pesde/compare/v0.4.7..v0.5.0%2Bregistry.0.1.0
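The compare links above percent-encode the `+` of semver build metadata as `%2B`, since a literal `+` in a URL can be misinterpreted. Every other character in these tags is URL-safe, so a single bash substitution suffices (a minimal sketch):

```shell
#!/usr/bin/env bash
# Percent-encode the '+' in a tag name for use in a compare URL.
tag="v0.5.2+registry.0.1.1"   # example tag from the links above
encoded="${tag//+/%2B}"       # global replace: + -> %2B
echo "$encoded"
```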

Cargo.lock (generated; 354 lines changed)

The lockfile diff (truncated in this view) amounts to:

- actix-governor: 0.7.0 → 0.8.0 (checksum updated)
- chrono: 0.4.38 → 0.4.39
- thiserror: 2.0.5 → 2.0.7, updated across the many gix-* crates that depend on it
- arrayvec 0.7.6: added
- env_logger 0.10.2: removed (together with its humantime, is-terminal, log, regex, and termcolor dependency entries)
- bstr: now depends on the explicitly versioned regex-automata 0.4.9
] ]
[[package]] [[package]]
@ -2102,7 +2095,7 @@ dependencies = [
"bstr", "bstr",
"faster-hex", "faster-hex",
"gix-trace", "gix-trace",
"thiserror 2.0.5", "thiserror 2.0.7",
] ]
[[package]] [[package]]
@ -2115,7 +2108,7 @@ dependencies = [
"gix-trace", "gix-trace",
"home", "home",
"once_cell", "once_cell",
"thiserror 2.0.5", "thiserror 2.0.7",
] ]
[[package]] [[package]]
@ -2130,7 +2123,7 @@ dependencies = [
"gix-config-value", "gix-config-value",
"gix-glob", "gix-glob",
"gix-path", "gix-path",
"thiserror 2.0.5", "thiserror 2.0.7",
] ]
[[package]] [[package]]
@ -2143,7 +2136,7 @@ dependencies = [
"gix-config-value", "gix-config-value",
"parking_lot", "parking_lot",
"rustix", "rustix",
"thiserror 2.0.5", "thiserror 2.0.7",
] ]
[[package]] [[package]]
@ -2160,7 +2153,7 @@ dependencies = [
"gix-transport", "gix-transport",
"gix-utils", "gix-utils",
"maybe-async", "maybe-async",
"thiserror 2.0.5", "thiserror 2.0.7",
"winnow", "winnow",
] ]
@ -2172,7 +2165,7 @@ checksum = "64a1e282216ec2ab2816cd57e6ed88f8009e634aec47562883c05ac8a7009a63"
dependencies = [ dependencies = [
"bstr", "bstr",
"gix-utils", "gix-utils",
"thiserror 2.0.5", "thiserror 2.0.7",
] ]
[[package]] [[package]]
@ -2192,7 +2185,7 @@ dependencies = [
"gix-utils", "gix-utils",
"gix-validate", "gix-validate",
"memmap2", "memmap2",
"thiserror 2.0.5", "thiserror 2.0.7",
"winnow", "winnow",
] ]
@ -2207,7 +2200,7 @@ dependencies = [
"gix-revision", "gix-revision",
"gix-validate", "gix-validate",
"smallvec", "smallvec",
"thiserror 2.0.5", "thiserror 2.0.7",
] ]
[[package]] [[package]]
@ -2225,7 +2218,7 @@ dependencies = [
"gix-object", "gix-object",
"gix-revwalk", "gix-revwalk",
"gix-trace", "gix-trace",
"thiserror 2.0.5", "thiserror 2.0.7",
] ]
[[package]] [[package]]
@ -2240,7 +2233,7 @@ dependencies = [
"gix-hashtable", "gix-hashtable",
"gix-object", "gix-object",
"smallvec", "smallvec",
"thiserror 2.0.5", "thiserror 2.0.7",
] ]
[[package]] [[package]]
@ -2267,7 +2260,7 @@ dependencies = [
"gix-pathspec", "gix-pathspec",
"gix-refspec", "gix-refspec",
"gix-url", "gix-url",
"thiserror 2.0.5", "thiserror 2.0.7",
] ]
[[package]] [[package]]
@ -2305,7 +2298,7 @@ dependencies = [
"gix-sec", "gix-sec",
"gix-url", "gix-url",
"reqwest", "reqwest",
"thiserror 2.0.5", "thiserror 2.0.7",
] ]
[[package]] [[package]]
@ -2322,7 +2315,7 @@ dependencies = [
"gix-object", "gix-object",
"gix-revwalk", "gix-revwalk",
"smallvec", "smallvec",
"thiserror 2.0.5", "thiserror 2.0.7",
] ]
[[package]] [[package]]
@ -2334,7 +2327,7 @@ dependencies = [
"bstr", "bstr",
"gix-features", "gix-features",
"gix-path", "gix-path",
"thiserror 2.0.5", "thiserror 2.0.7",
"url", "url",
] ]
@ -2355,7 +2348,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cd520d09f9f585b34b32aba1d0b36ada89ab7fefb54a8ca3fe37fc482a750937" checksum = "cd520d09f9f585b34b32aba1d0b36ada89ab7fefb54a8ca3fe37fc482a750937"
dependencies = [ dependencies = [
"bstr", "bstr",
"thiserror 2.0.5", "thiserror 2.0.7",
] ]
[[package]] [[package]]
@ -2394,14 +2387,14 @@ dependencies = [
"gix-path", "gix-path",
"gix-worktree", "gix-worktree",
"io-close", "io-close",
"thiserror 2.0.5", "thiserror 2.0.7",
] ]
[[package]] [[package]]
name = "governor" name = "governor"
version = "0.7.0" version = "0.8.0"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0746aa765db78b521451ef74221663b57ba595bf83f75d0ce23cc09447c8139f" checksum = "842dc78579ce01e6a1576ad896edc92fca002dd60c9c3746b7fc2bec6fb429d0"
dependencies = [ dependencies = [
"cfg-if", "cfg-if",
"dashmap", "dashmap",
@ -2608,12 +2601,6 @@ version = "1.0.3"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "df3b46402a9d5adb4c86a0cf463f42e19994e3ee891101b1841f30a545cb49a9" checksum = "df3b46402a9d5adb4c86a0cf463f42e19994e3ee891101b1841f30a545cb49a9"
[[package]]
name = "humantime"
version = "2.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9a3a5bfb195931eeb336b2a7b4d761daec841b97f947d34394601737a7bba5e4"
[[package]] [[package]]
name = "hyper" name = "hyper"
version = "1.5.1" version = "1.5.1"
@ -2893,19 +2880,10 @@ dependencies = [
"number_prefix", "number_prefix",
"portable-atomic", "portable-atomic",
"unicode-width 0.2.0", "unicode-width 0.2.0",
"vt100",
"web-time", "web-time",
] ]
[[package]]
name = "indicatif-log-bridge"
version = "0.2.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "63703cf9069b85dbe6fe26e1c5230d013dee99d3559cd3d02ba39e099ef7ab02"
dependencies = [
"indicatif",
"log",
]
[[package]] [[package]]
name = "inout" name = "inout"
version = "0.1.3" version = "0.1.3"
@ -2970,17 +2948,6 @@ dependencies = [
"once_cell", "once_cell",
] ]
[[package]]
name = "is-terminal"
version = "0.4.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "261f68e344040fbd0edea105bef17c66edf46f984ddb1115b775ce31be948f4b"
dependencies = [
"hermit-abi 0.4.0",
"libc",
"windows-sys 0.52.0",
]
[[package]] [[package]]
name = "is-wsl" name = "is-wsl"
version = "0.4.0" version = "0.4.0"
@ -3240,6 +3207,15 @@ version = "0.11.3"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "75761162ae2b0e580d7e7c390558127e5f01b4194debd6221fd8c207fc80e3f5" checksum = "75761162ae2b0e580d7e7c390558127e5f01b4194debd6221fd8c207fc80e3f5"
[[package]]
name = "matchers"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8263075bb86c5a1b1427b5ae862e8889656f126e9f77c484496e8b47cf5c5558"
dependencies = [
"regex-automata 0.1.10",
]
[[package]] [[package]]
name = "maybe-async" name = "maybe-async"
version = "0.2.10" version = "0.2.10"
@ -3346,6 +3322,12 @@ version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2195bf6aa996a481483b29d62a7663eed3fe39600c460e323f8ff41e90bdd89b" checksum = "2195bf6aa996a481483b29d62a7663eed3fe39600c460e323f8ff41e90bdd89b"
[[package]]
name = "mutually_exclusive_features"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e94e1e6445d314f972ff7395df2de295fe51b71821694f0b0e1e79c4f12c8577"
[[package]] [[package]]
name = "native-tls" name = "native-tls"
version = "0.2.12" version = "0.2.12"
@ -3407,6 +3389,16 @@ version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "38bf9645c8b145698bb0b18a4637dcacbc421ea49bef2317e4fd8065a387cf21" checksum = "38bf9645c8b145698bb0b18a4637dcacbc421ea49bef2317e4fd8065a387cf21"
[[package]]
name = "nu-ansi-term"
version = "0.46.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "77a8165726e8236064dbb45459242600304b42a5ea24ee2948e18e023bf7ba84"
dependencies = [
"overload",
"winapi",
]
[[package]] [[package]]
name = "num" name = "num"
version = "0.4.3" version = "0.4.3"
@ -3606,6 +3598,12 @@ dependencies = [
"windows-sys 0.52.0", "windows-sys 0.52.0",
] ]
[[package]]
name = "overload"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b15813163c1d831bf4a13c3610c05c0d03b39feb07f7e09fa234dac9b15aaf39"
[[package]] [[package]]
name = "ownedbytes" name = "ownedbytes"
version = "0.7.0" version = "0.7.0"
@ -3664,7 +3662,7 @@ checksum = "e3148f5046208a5d56bcfc03053e3ca6334e51da8dfb19b6cdc8b306fae3283e"
[[package]] [[package]]
name = "pesde" name = "pesde"
version = "0.5.0-rc.17" version = "0.5.3"
dependencies = [ dependencies = [
"anyhow", "anyhow",
"async-compression", "async-compression",
@ -3680,13 +3678,10 @@ dependencies = [
"git2", "git2",
"gix", "gix",
"indicatif", "indicatif",
"indicatif-log-bridge",
"inquire", "inquire",
"keyring", "keyring",
"log",
"open", "open",
"pathdiff", "pathdiff",
"pretty_env_logger",
"relative-path", "relative-path",
"reqwest", "reqwest",
"semver", "semver",
@ -3695,12 +3690,15 @@ dependencies = [
"serde_with", "serde_with",
"sha2", "sha2",
"tempfile", "tempfile",
"thiserror 2.0.5", "thiserror 2.0.7",
"tokio", "tokio",
"tokio-tar", "tokio-tar",
"tokio-util", "tokio-util",
"toml", "toml",
"toml_edit", "toml_edit",
"tracing",
"tracing-indicatif",
"tracing-subscriber",
"url", "url",
"wax", "wax",
"winreg", "winreg",
@ -3708,7 +3706,7 @@ dependencies = [
[[package]] [[package]]
name = "pesde-registry" name = "pesde-registry"
version = "0.7.0" version = "0.1.2"
dependencies = [ dependencies = [
"actix-cors", "actix-cors",
"actix-governor", "actix-governor",
@ -3723,9 +3721,7 @@ dependencies = [
"futures", "futures",
"git2", "git2",
"gix", "gix",
"log",
"pesde", "pesde",
"pretty_env_logger",
"reqwest", "reqwest",
"rusty-s3", "rusty-s3",
"semver", "semver",
@ -3737,11 +3733,13 @@ dependencies = [
"sha2", "sha2",
"tantivy", "tantivy",
"tempfile", "tempfile",
"thiserror 2.0.5", "thiserror 2.0.7",
"tokio", "tokio",
"tokio-tar", "tokio-tar",
"toml", "toml",
"url", "tracing",
"tracing-actix-web",
"tracing-subscriber",
] ]
[[package]] [[package]]
@ -3838,16 +3836,6 @@ dependencies = [
"zerocopy", "zerocopy",
] ]
[[package]]
name = "pretty_env_logger"
version = "0.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "865724d4dbe39d9f3dd3b52b88d859d66bcb2d6a0acfd5ea68a65fb66d4bdc1c"
dependencies = [
"env_logger",
"log",
]
[[package]] [[package]]
name = "proc-macro-crate" name = "proc-macro-crate"
version = "3.2.0" version = "3.2.0"
@ -3914,7 +3902,7 @@ dependencies = [
"rustc-hash 2.1.0", "rustc-hash 2.1.0",
"rustls", "rustls",
"socket2", "socket2",
"thiserror 2.0.5", "thiserror 2.0.7",
"tokio", "tokio",
"tracing", "tracing",
] ]
@ -3933,7 +3921,7 @@ dependencies = [
"rustls", "rustls",
"rustls-pki-types", "rustls-pki-types",
"slab", "slab",
"thiserror 2.0.5", "thiserror 2.0.7",
"tinyvec", "tinyvec",
"tracing", "tracing",
"web-time", "web-time",
@ -4068,8 +4056,17 @@ checksum = "b544ef1b4eac5dc2db33ea63606ae9ffcfac26c1416a2806ae0bf5f56b201191"
dependencies = [ dependencies = [
"aho-corasick", "aho-corasick",
"memchr", "memchr",
"regex-automata", "regex-automata 0.4.9",
"regex-syntax", "regex-syntax 0.8.5",
]
[[package]]
name = "regex-automata"
version = "0.1.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6c230d73fb8d8c1b9c0b3135c5142a8acee3a0558fb8db5cf1cb65f8d7862132"
dependencies = [
"regex-syntax 0.6.29",
] ]
[[package]] [[package]]
@ -4080,7 +4077,7 @@ checksum = "809e8dc61f6de73b46c85f4c96486310fe304c434cfa43669d7b40f711150908"
dependencies = [ dependencies = [
"aho-corasick", "aho-corasick",
"memchr", "memchr",
"regex-syntax", "regex-syntax 0.8.5",
] ]
[[package]] [[package]]
@ -4089,6 +4086,12 @@ version = "0.1.6"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "53a49587ad06b26609c52e423de037e7f57f20d53535d66e08c695f347df952a" checksum = "53a49587ad06b26609c52e423de037e7f57f20d53535d66e08c695f347df952a"
[[package]]
name = "regex-syntax"
version = "0.6.29"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f162c6dd7b008981e4d40210aca20b4bd0f9b60ca9271061b07f78537722f2e1"
[[package]] [[package]]
name = "regex-syntax" name = "regex-syntax"
version = "0.8.5" version = "0.8.5"
@ -4368,9 +4371,9 @@ dependencies = [
[[package]] [[package]]
name = "semver" name = "semver"
version = "1.0.23" version = "1.0.24"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "61697e0a1c7e512e84a621326239844a24d8207b4669b41bc18b32ea5cbf988b" checksum = "3cb6eb87a131f756572d7fb904f6e7b68633f09cca868c5df1c4b8d1a694bbba"
dependencies = [ dependencies = [
"serde", "serde",
] ]
@ -4388,7 +4391,6 @@ dependencies = [
"sentry-contexts", "sentry-contexts",
"sentry-core", "sentry-core",
"sentry-debug-images", "sentry-debug-images",
"sentry-log",
"sentry-panic", "sentry-panic",
"sentry-tracing", "sentry-tracing",
"tokio", "tokio",
@ -4457,16 +4459,6 @@ dependencies = [
"sentry-core", "sentry-core",
] ]
[[package]]
name = "sentry-log"
version = "0.35.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "efcbfbb74628eaef033c1154d4bb082437c7592ce2282c7c5ccb455c4c97a06d"
dependencies = [
"log",
"sentry-core",
]
[[package]] [[package]]
name = "sentry-panic" name = "sentry-panic"
version = "0.35.0" version = "0.35.0"
@ -4508,18 +4500,18 @@ dependencies = [
[[package]] [[package]]
name = "serde" name = "serde"
version = "1.0.215" version = "1.0.216"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6513c1ad0b11a9376da888e3e0baa0077f1aed55c17f50e7b2397136129fb88f" checksum = "0b9781016e935a97e8beecf0c933758c97a5520d32930e460142b4cd80c6338e"
dependencies = [ dependencies = [
"serde_derive", "serde_derive",
] ]
[[package]] [[package]]
name = "serde_derive" name = "serde_derive"
version = "1.0.215" version = "1.0.216"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ad1e866f866923f252f05c889987993144fb74e722403468a4ebd70c3cd756c0" checksum = "46f859dbbf73865c6627ed570e78961cd3ac92407a2d117204c49232485da55e"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
@ -4641,6 +4633,15 @@ dependencies = [
"digest", "digest",
] ]
[[package]]
name = "sharded-slab"
version = "0.1.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f40ca3c46823713e0d4209592e8d6e826aa57e928f09752619fc696c499637f6"
dependencies = [
"lazy_static",
]
[[package]] [[package]]
name = "shell-words" name = "shell-words"
version = "1.1.0" version = "1.1.0"
@ -4925,7 +4926,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d60769b80ad7953d8a7b2c70cdfe722bbcdcac6bccc8ac934c40c034d866fc18" checksum = "d60769b80ad7953d8a7b2c70cdfe722bbcdcac6bccc8ac934c40c034d866fc18"
dependencies = [ dependencies = [
"byteorder", "byteorder",
"regex-syntax", "regex-syntax 0.8.5",
"utf8-ranges", "utf8-ranges",
] ]
@ -4983,15 +4984,6 @@ dependencies = [
"windows-sys 0.59.0", "windows-sys 0.59.0",
] ]
[[package]]
name = "termcolor"
version = "1.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "06794f8f6c5c898b3275aebefa6b8a1cb24cd2c6c79397ab15774837a0bc5755"
dependencies = [
"winapi-util",
]
[[package]] [[package]]
name = "thiserror" name = "thiserror"
version = "1.0.69" version = "1.0.69"
@ -5003,11 +4995,11 @@ dependencies = [
[[package]] [[package]]
name = "thiserror" name = "thiserror"
version = "2.0.5" version = "2.0.7"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "643caef17e3128658ff44d85923ef2d28af81bb71e0d67bbfe1d76f19a73e053" checksum = "93605438cbd668185516ab499d589afb7ee1859ea3d5fc8f6b0755e1c7443767"
dependencies = [ dependencies = [
"thiserror-impl 2.0.5", "thiserror-impl 2.0.7",
] ]
[[package]] [[package]]
@ -5023,9 +5015,9 @@ dependencies = [
[[package]] [[package]]
name = "thiserror-impl" name = "thiserror-impl"
version = "2.0.5" version = "2.0.7"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "995d0bbc9995d1f19d28b7215a9352b0fc3cd3a2d2ec95c2cadc485cdedbcdde" checksum = "e1d8749b4531af2117677a5fcd12b1348a3fe2b81e36e61ffeac5c4aa3273e36"
dependencies = [ dependencies = [
"proc-macro2", "proc-macro2",
"quote", "quote",
@ -5239,6 +5231,19 @@ dependencies = [
"tracing-core", "tracing-core",
] ]
[[package]]
name = "tracing-actix-web"
version = "0.7.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "54a9f5c1aca50ebebf074ee665b9f99f2e84906dcf6b993a0d0090edb835166d"
dependencies = [
"actix-web",
"mutually_exclusive_features",
"pin-project",
"tracing",
"uuid",
]
[[package]] [[package]]
name = "tracing-attributes" name = "tracing-attributes"
version = "0.1.28" version = "0.1.28"
@ -5260,13 +5265,45 @@ dependencies = [
"valuable", "valuable",
] ]
[[package]]
name = "tracing-indicatif"
version = "0.3.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "74ba258e9de86447f75edf6455fded8e5242704c6fccffe7bf8d7fb6daef1180"
dependencies = [
"indicatif",
"tracing",
"tracing-core",
"tracing-subscriber",
]
[[package]]
name = "tracing-log"
version = "0.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ee855f1f400bd0e5c02d150ae5de3840039a3f54b025156404e34c23c03f47c3"
dependencies = [
"log",
"once_cell",
"tracing-core",
]
[[package]] [[package]]
name = "tracing-subscriber" name = "tracing-subscriber"
version = "0.3.19" version = "0.3.19"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e8189decb5ac0fa7bc8b96b7cb9b2701d60d48805aca84a238004d665fcc4008" checksum = "e8189decb5ac0fa7bc8b96b7cb9b2701d60d48805aca84a238004d665fcc4008"
dependencies = [ dependencies = [
"matchers",
"nu-ansi-term",
"once_cell",
"regex",
"sharded-slab",
"smallvec",
"thread_local",
"tracing",
"tracing-core", "tracing-core",
"tracing-log",
] ]
[[package]] [[package]]
@ -5437,6 +5474,39 @@ version = "0.9.5"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0b928f33d975fc6ad9f86c8f283853ad26bdd5b10b7f1542aa2fa15e2289105a" checksum = "0b928f33d975fc6ad9f86c8f283853ad26bdd5b10b7f1542aa2fa15e2289105a"
[[package]]
name = "vt100"
version = "0.15.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "84cd863bf0db7e392ba3bd04994be3473491b31e66340672af5d11943c6274de"
dependencies = [
"itoa",
"log",
"unicode-width 0.1.14",
"vte",
]
[[package]]
name = "vte"
version = "0.11.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f5022b5fbf9407086c180e9557be968742d839e68346af7792b8592489732197"
dependencies = [
"arrayvec",
"utf8parse",
"vte_generate_state_changes",
]
[[package]]
name = "vte_generate_state_changes"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2e369bee1b05d510a7b4ed645f5faa90619e05437111783ea5848f28d97d3c2e"
dependencies = [
"proc-macro2",
"quote",
]
[[package]] [[package]]
name = "walkdir" name = "walkdir"
version = "2.5.0" version = "2.5.0"


@ -1,6 +1,6 @@
[package] [package]
name = "pesde" name = "pesde"
version = "0.5.0-rc.17" version = "0.5.3"
edition = "2021" edition = "2021"
license = "MIT" license = "MIT"
authors = ["daimond113 <contact@daimond113.com>"] authors = ["daimond113 <contact@daimond113.com>"]
@ -13,10 +13,10 @@ include = ["src/**/*", "Cargo.toml", "Cargo.lock", "README.md", "LICENSE", "CHAN
bin = [ bin = [
"dep:clap", "dep:clap",
"dep:dirs", "dep:dirs",
"dep:pretty_env_logger", "dep:tracing-subscriber",
"reqwest/json", "reqwest/json",
"dep:indicatif", "dep:indicatif",
"dep:indicatif-log-bridge", "dep:tracing-indicatif",
"dep:inquire", "dep:inquire",
"dep:toml_edit", "dep:toml_edit",
"dep:colored", "dep:colored",
@ -44,25 +44,25 @@ required-features = ["bin"]
uninlined_format_args = "warn" uninlined_format_args = "warn"
[dependencies] [dependencies]
serde = { version = "1.0.215", features = ["derive"] } serde = { version = "1.0.216", features = ["derive"] }
toml = "0.8.19" toml = "0.8.19"
serde_with = "3.11.0" serde_with = "3.11.0"
gix = { version = "0.68.0", default-features = false, features = ["blocking-http-transport-reqwest-rust-tls", "revparse-regex", "credentials", "parallel"] } gix = { version = "0.68.0", default-features = false, features = ["blocking-http-transport-reqwest-rust-tls", "revparse-regex", "credentials", "parallel"] }
semver = { version = "1.0.23", features = ["serde"] } semver = { version = "1.0.24", features = ["serde"] }
reqwest = { version = "0.12.9", default-features = false, features = ["rustls-tls"] } reqwest = { version = "0.12.9", default-features = false, features = ["rustls-tls"] }
tokio-tar = "0.3.1" tokio-tar = "0.3.1"
async-compression = { version = "0.4.18", features = ["tokio", "gzip"] } async-compression = { version = "0.4.18", features = ["tokio", "gzip"] }
pathdiff = "0.2.3" pathdiff = "0.2.3"
relative-path = { version = "1.9.3", features = ["serde"] } relative-path = { version = "1.9.3", features = ["serde"] }
log = "0.4.22" tracing = { version = "0.1.41", features = ["attributes"] }
thiserror = "2.0.5" thiserror = "2.0.7"
tokio = { version = "1.42.0", features = ["process"] } tokio = { version = "1.42.0", features = ["process"] }
tokio-util = "0.7.13" tokio-util = "0.7.13"
async-stream = "0.3.6" async-stream = "0.3.6"
futures = "0.3.31" futures = "0.3.31"
full_moon = { version = "1.1.2", features = ["luau"] } full_moon = { version = "1.1.2", features = ["luau"] }
url = { version = "2.5.4", features = ["serde"] } url = { version = "2.5.4", features = ["serde"] }
chrono = { version = "0.4.38", features = ["serde"] } chrono = { version = "0.4.39", features = ["serde"] }
sha2 = "0.10.8" sha2 = "0.10.8"
tempfile = "3.14.0" tempfile = "3.14.0"
wax = { version = "0.6.0", default-features = false } wax = { version = "0.6.0", default-features = false }
@ -81,9 +81,9 @@ colored = { version = "2.1.0", optional = true }
toml_edit = { version = "0.22.22", optional = true } toml_edit = { version = "0.22.22", optional = true }
clap = { version = "4.5.23", features = ["derive"], optional = true } clap = { version = "4.5.23", features = ["derive"], optional = true }
dirs = { version = "5.0.1", optional = true } dirs = { version = "5.0.1", optional = true }
pretty_env_logger = { version = "0.5.0", optional = true } tracing-subscriber = { version = "0.3.19", features = ["env-filter"], optional = true }
indicatif = { version = "0.17.9", optional = true } indicatif = { version = "0.17.9", optional = true }
indicatif-log-bridge = { version = "0.2.3", optional = true } tracing-indicatif = { version = "0.3.8", optional = true }
inquire = { version = "0.7.5", optional = true } inquire = { version = "0.7.5", optional = true }
[target.'cfg(target_os = "windows")'.dependencies] [target.'cfg(target_os = "windows")'.dependencies]

SECURITY.md Normal file

@ -0,0 +1,25 @@
# Security Policy
## Supported Versions
As pesde is currently in version 0.x, we can only guarantee security for:
- **The latest minor** (currently 0.5).
- **The latest release candidate for the next version**, if available.
When a new minor version is released, the previous version will immediately lose security support.
> **Note:** This policy will change with the release of version 1.0, which will include an extended support period for versions >=1.0.
| Version | Supported |
| ------- | ------------------ |
| 0.5.x | :white_check_mark: |
| < 0.5 | :x: |
## Reporting a Vulnerability
We encourage all security concerns to be reported at [pesde@daimond113.com](mailto:pesde@daimond113.com), using the following format:
- **Subject**: The subject must be prefixed with `[SECURITY]` to ensure it is prioritized as a security concern.
- **Content**:
- **Affected Versions**: Clearly specify which versions are affected by the issue.
- **Issue Details**: Provide a detailed description of the issue, including reproduction steps and/or a simple example, if applicable.
We will try to respond as soon as possible.


@ -38,17 +38,17 @@ Git dependencies are dependencies on packages hosted on a Git repository.
```toml title="pesde.toml" ```toml title="pesde.toml"
[dependencies] [dependencies]
acme = { repo = "acme/package", rev = "main" } acme = { repo = "acme/package", rev = "aeff6" }
``` ```
In this example, we're specifying a dependency on the package contained within In this example, we're specifying a dependency on the package contained within
the `acme/package` GitHub repository at the `main` branch. the `acme/package` GitHub repository at the `aeff6` commit.
You can also use a URL to specify the Git repository and a specific commit. You can also use a URL to specify the Git repository and a tag for the revision.
```toml title="pesde.toml" ```toml title="pesde.toml"
[dependencies] [dependencies]
acme = { repo = "https://git.acme.local/package.git", rev = "aeff6" } acme = { repo = "https://git.acme.local/package.git", rev = "v0.1.0" }
``` ```
You can also specify a path if the package is not at the root of the repository. You can also specify a path if the package is not at the root of the repository.


@ -20,15 +20,15 @@ to get it added.
Studio. Studio.
Running `pesde init` will prompt you to select a target, select Running `pesde init` will prompt you to select a target, select
`roblox` or `roblox_server` in this case. This will setup the configuration `roblox` or `roblox_server` in this case. You will be prompted to pick out a
needed to use pesde in a project using Rojo. scripts package. Select `pesde/scripts_rojo` to get started with Rojo.
## Usage with other tools ## Usage with other tools
If you are using a different sync tool, you should look for its scripts in the If you are using a different sync tool, you should look for its scripts
pesde-scripts repository. If you cannot find them, you can write your own and package on the registry. If you cannot find it, you can write your own and
optionally submit a PR to help others using the same tool as you get started optionally submit a PR to pesde-scripts to help others using the same tool as
quicker. you get started quicker.
Scaffold your project with `pesde init`, select the `roblox` or `roblox_server` Scaffold your project with `pesde init`, select the `roblox` or `roblox_server`
target, and then create a `.pesde/roblox_sync_config_generator.luau` script target, and then create a `.pesde/roblox_sync_config_generator.luau` script


@ -41,6 +41,16 @@ You can follow the installation instructions in the
pesde should now be installed on your system. You may need to restart your pesde should now be installed on your system. You may need to restart your
computer for the changes to take effect. computer for the changes to take effect.
<Aside type="caution">
pesde uses symlinks which are an administrator-level operation on Windows.
To ensure proper functionality, enable [Developer Mode](https://learn.microsoft.com/en-us/windows/apps/get-started/enable-your-device-for-development).
If you are getting errors such as `Failed to symlink file, a required
privilege is not held by the client`, then enabling this setting will fix
them.
</Aside>
</TabItem> </TabItem>
<TabItem label="Linux & macOS"> <TabItem label="Linux & macOS">
@ -59,7 +69,7 @@ You can follow the installation instructions in the
environment variable. environment variable.
```sh title=".zshrc" ```sh title=".zshrc"
export PATH = "$PATH:/home/user/.pesde/bin" export PATH="$PATH:$HOME/.pesde/bin"
``` ```
You should then be able to run `pesde` after restarting your shell. You should then be able to run `pesde` after restarting your shell.


@ -159,12 +159,13 @@ when the package is installed in order to generate the necessary configuration.
**Allowed in:** `luau`, `lune` **Allowed in:** `luau`, `lune`
A list of scripts that will be linked to the project's `.pesde` directory, and A list of scripts that will be linked to the dependant's `.pesde` directory, and
copied over to the [scripts](#scripts-1) section when initialising a project with copied over to the [scripts](#scripts-1) section when initialising a project with
this package. this package as the scripts package.
```toml ```toml
scripts = { roblox_sync_config_generator = "scripts/roblox_sync_config_generator.luau" } [target.scripts]
roblox_sync_config_generator = "scripts/roblox_sync_config_generator.luau"
``` ```
## `[scripts]` ## `[scripts]`
@ -189,10 +190,6 @@ sync tools.
of files specified within the [`target.build_files`](#build_files) of the of files specified within the [`target.build_files`](#build_files) of the
package. package.
You can find template scripts inside the
[`pesde-scripts` repository](https://github.com/pesde-pkg/scripts)
for various sync tools.
<LinkCard <LinkCard
title="Roblox" title="Roblox"
description="Learn more about using pesde in Roblox projects." description="Learn more about using pesde in Roblox projects."
@ -372,14 +369,14 @@ foo = { wally = "acme/foo", version = "1.2.3", index = "acme" }
```toml ```toml
[dependencies] [dependencies]
foo = { repo = "acme/packages", rev = "main", path = "foo" } foo = { repo = "acme/packages", rev = "aeff6", path = "foo" }
``` ```
**Git dependencies** contain the following fields: **Git dependencies** contain the following fields:
- `repo`: The URL of the Git repository. - `repo`: The URL of the Git repository.
This can either be `<owner>/<name>` for a GitHub repository, or a full URL. This can either be `<owner>/<name>` for a GitHub repository, or a full URL.
- `rev`: The Git revision to install. This can be a branch, tag, or commit hash. - `rev`: The Git revision to install. This can be a tag or commit hash.
- `path`: The path within the repository to install. If not specified, the root - `path`: The path within the repository to install. If not specified, the root
of the repository is used. of the repository is used.

registry/CHANGELOG.md Normal file

@ -0,0 +1,22 @@
# Changelog
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [0.1.2]
### Changed
- Update to pesde lib API changes by @daimond113
## [0.1.1] - 2024-12-19
### Changed
- Switch to tracing for logging by @daimond113
## [0.1.0] - 2024-12-14
### Added
- Rewrite registry for pesde v0.5.0 by @daimond113
[0.1.2]: https://github.com/daimond113/pesde/compare/v0.5.2%2Bregistry.0.1.1..v0.5.3%2Bregistry.0.1.2
[0.1.1]: https://github.com/daimond113/pesde/compare/v0.5.1%2Bregistry.0.1.0..v0.5.2%2Bregistry.0.1.1
[0.1.0]: https://github.com/daimond113/pesde/compare/v0.4.7..v0.5.0%2Bregistry.0.1.0


@@ -1,6 +1,6 @@
 [package]
 name = "pesde-registry"
-version = "0.7.0"
+version = "0.1.2"
 edition = "2021"
 repository = "https://github.com/pesde-pkg/index"
 publish = false
@@ -8,13 +8,12 @@ publish = false
 [dependencies]
 actix-web = "4.9.0"
 actix-cors = "0.7.0"
-actix-governor = "0.7.0"
+actix-governor = "0.8.0"
 dotenvy = "0.15.7"
-thiserror = "2.0.5"
+thiserror = "2.0.7"
 tantivy = "0.22.0"
-semver = "1.0.23"
+semver = "1.0.24"
-chrono = { version = "0.4.38", features = ["serde"] }
+chrono = { version = "0.4.39", features = ["serde"] }
-url = "2.5.4"
 futures = "0.3.31"
 tokio = "1.42.0"
 tempfile = "3.14.0"
@@ -27,7 +26,7 @@ gix = { version = "0.68.0", default-features = false, features = [
 "credentials",
 ] }
-serde = "1.0.215"
+serde = "1.0.216"
 serde_json = "1.0.133"
 serde_yaml = "0.9.34"
 toml = "0.8.19"
@@ -41,10 +40,11 @@ constant_time_eq = "0.3.1"
 tokio-tar = "0.3.1"
 async-compression = { version = "0.4.18", features = ["tokio", "gzip"] }
-log = "0.4.22"
-pretty_env_logger = "0.5.0"
+tracing = { version = "0.1.41", features = ["attributes"] }
+tracing-subscriber = { version = "0.3.19", features = ["env-filter"] }
+tracing-actix-web = "0.7.15"
-sentry = { version = "0.35.0", default-features = false, features = ["backtrace", "contexts", "debug-images", "panic", "reqwest", "rustls", "log"] }
+sentry = { version = "0.35.0", default-features = false, features = ["backtrace", "contexts", "debug-images", "panic", "reqwest", "rustls", "tracing"] }
 sentry-actix = "0.35.0"
 pesde = { path = "..", features = ["wally-compat"] }


@@ -45,7 +45,7 @@ impl AuthImpl for GitHubAuth {
 return Ok(None);
 }
 Err(_) => {
-log::error!(
+tracing::error!(
 "failed to get user: {}",
 response.into_error().await.unwrap_err()
 );
@@ -53,7 +53,7 @@ impl AuthImpl for GitHubAuth {
 }
 },
 Err(e) => {
-log::error!("failed to get user: {e}");
+tracing::error!("failed to get user: {e}");
 return Ok(None);
 }
 };
@@ -61,7 +61,7 @@ impl AuthImpl for GitHubAuth {
 let user_id = match response.json::<UserResponse>().await {
 Ok(resp) => resp.user.id,
 Err(e) => {
-log::error!("failed to get user: {e}");
+tracing::error!("failed to get user: {e}");
 return Ok(None);
 }
 };


@@ -71,7 +71,7 @@ pub async fn get_package_version(
 let (scope, name_part) = name.as_str();
-let entries: IndexFile = {
+let file: IndexFile = {
 let source = app_state.source.lock().await;
 let repo = gix::open(source.path(&app_state.project))?;
 let tree = root_tree(&repo)?;
@@ -84,14 +84,15 @@ pub async fn get_package_version(
 let Some((v_id, entry, targets)) = ({
 let version = match version {
-VersionRequest::Latest => match entries.keys().map(|k| k.version()).max() {
+VersionRequest::Latest => match file.entries.keys().map(|k| k.version()).max() {
 Some(latest) => latest.clone(),
 None => return Ok(HttpResponse::NotFound().finish()),
 },
 VersionRequest::Specific(version) => version,
 };
-let versions = entries
+let versions = file
+.entries
 .iter()
 .filter(|(v_id, _)| *v_id.version() == version);


@@ -19,7 +19,7 @@ pub async fn get_package_versions(
 let (scope, name_part) = name.as_str();
-let versions: IndexFile = {
+let file: IndexFile = {
 let source = app_state.source.lock().await;
 let repo = gix::open(source.path(&app_state.project))?;
 let tree = root_tree(&repo)?;
@@ -32,7 +32,7 @@ pub async fn get_package_versions(
 let mut responses = BTreeMap::new();
-for (v_id, entry) in versions {
+for (v_id, entry) in file.entries {
 let info = responses
 .entry(v_id.version().clone())
 .or_insert_with(|| PackageResponse {


@@ -371,7 +371,7 @@ pub async fn publish_package(
 }
 };
-let mut entries: IndexFile =
+let mut file: IndexFile =
 toml::de::from_str(&read_file(&gix_tree, [scope, name])?.unwrap_or_default())?;
 let new_entry = IndexFileEntry {
@@ -386,11 +386,12 @@ pub async fn publish_package(
 dependencies,
 };
-let this_version = entries
+let this_version = file
+.entries
 .keys()
 .find(|v_id| *v_id.version() == manifest.version);
 if let Some(this_version) = this_version {
-let other_entry = entries.get(this_version).unwrap();
+let other_entry = file.entries.get(this_version).unwrap();
 // description cannot be different - which one to render in the "Recently published" list?
 // the others cannot be different because what to return from the versions endpoint?
@@ -406,7 +407,8 @@ pub async fn publish_package(
 }
 }
-if entries
+if file
+.entries
 .insert(
 VersionId::new(manifest.version.clone(), manifest.target.kind()),
 new_entry.clone(),
@@ -422,7 +424,7 @@ pub async fn publish_package(
 let reference = repo.find_reference(&refspec)?;
 {
-let index_content = toml::to_string(&entries)?;
+let index_content = toml::to_string(&file)?;
 let mut blob_writer = repo.blob_writer(None)?;
 blob_writer.write_all(index_content.as_bytes())?;
 oids.push((name, blob_writer.commit()?));


@@ -68,10 +68,11 @@ pub async fn search_packages(
 .unwrap();
 let (scope, name) = id.as_str();
-let versions: IndexFile =
+let file: IndexFile =
 toml::de::from_str(&read_file(&tree, [scope, name]).unwrap().unwrap()).unwrap();
-let (latest_version, entry) = versions
+let (latest_version, entry) = file
+.entries
 .iter()
 .max_by_key(|(v_id, _)| v_id.version())
 .unwrap();
@@ -79,17 +80,19 @@ pub async fn search_packages(
 PackageResponse {
 name: id.to_string(),
 version: latest_version.version().to_string(),
-targets: versions
+targets: file
+.entries
 .iter()
 .filter(|(v_id, _)| v_id.version() == latest_version.version())
 .map(|(_, entry)| (&entry.target).into())
 .collect(),
 description: entry.description.clone().unwrap_or_default(),
-published_at: versions
+published_at: file
+.entries
 .values()
-.max_by_key(|entry| entry.published_at)
-.unwrap()
-.published_at,
+.map(|entry| entry.published_at)
+.max()
+.unwrap(),
 license: entry.license.clone().unwrap_or_default(),
 authors: entry.authors.clone(),
 repository: entry.repository.clone().map(|url| url.to_string()),
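The `published_at` rewrite in the search response above swaps `.max_by_key(|entry| entry.published_at).unwrap().published_at` for `.map(|entry| entry.published_at).max().unwrap()`; for a non-empty map the two are equivalent. A minimal sketch (plain integers standing in for timestamps, the function name is hypothetical):

```rust
use std::collections::BTreeMap;

// Toy stand-in for IndexFile.entries: version id -> published_at timestamp.
fn latest_published_at(entries: &BTreeMap<&str, u64>) -> (u64, u64) {
    // Old form: keep the whole entry with the greatest published_at, then read the field.
    let old = *entries.values().max_by_key(|t| **t).unwrap();
    // New form: project to published_at first, then take the plain max.
    let new = entries.values().copied().max().unwrap();
    (old, new)
}

fn main() {
    let entries = BTreeMap::from([("1.0.0", 100), ("1.0.1", 175), ("1.1.0", 250)]);
    let (old, new) = latest_published_at(&entries);
    assert_eq!(old, new); // both forms agree on the newest timestamp
    println!("{new}");
}
```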


@@ -1,5 +1,4 @@
 use actix_web::{body::BoxBody, HttpResponse, ResponseError};
-use log::error;
 use pesde::source::git_index::errors::{ReadFile, RefreshError, TreeError};
 use serde::Serialize;
 use thiserror::Error;
@@ -67,7 +66,7 @@ impl ResponseError for Error {
 error: format!("archive is invalid: {e}"),
 }),
 e => {
-log::error!("unhandled error: {e:?}");
+tracing::error!("unhandled error: {e:?}");
 HttpResponse::InternalServerError().finish()
 }
 }


@@ -6,19 +6,22 @@ use crate::{
 use actix_cors::Cors;
 use actix_governor::{Governor, GovernorConfigBuilder};
 use actix_web::{
-middleware::{from_fn, Compress, Logger, NormalizePath, TrailingSlash},
+middleware::{from_fn, Compress, NormalizePath, TrailingSlash},
 rt::System,
 web,
 web::PayloadConfig,
 App, HttpServer,
 };
 use fs_err::tokio as fs;
-use log::info;
 use pesde::{
 source::{pesde::PesdePackageSource, traits::PackageSource},
 AuthConfig, Project,
 };
 use std::{env::current_dir, path::PathBuf};
+use tracing::level_filters::LevelFilter;
+use tracing_subscriber::{
+fmt::format::FmtSpan, layer::SubscriberExt, util::SubscriberInitExt, EnvFilter,
+};
 mod auth;
 mod endpoints;
@@ -116,12 +119,12 @@ async fn run() -> std::io::Result<()> {
 let app_data = web::Data::new(AppState {
 storage: {
 let storage = get_storage_from_env();
-info!("storage: {storage}");
+tracing::info!("storage: {storage}");
 storage
 },
 auth: {
 let auth = get_auth_from_env(&config);
-info!("auth: {auth}");
+tracing::info!("auth: {auth}");
 auth
 },
 source: tokio::sync::Mutex::new(source),
@@ -140,14 +143,12 @@ async fn run() -> std::io::Result<()> {
 .finish()
 .unwrap();
-info!("listening on {address}:{port}");
 HttpServer::new(move || {
 App::new()
 .wrap(sentry_actix::Sentry::with_transaction())
 .wrap(NormalizePath::new(TrailingSlash::Trim))
 .wrap(Cors::permissive())
-.wrap(Logger::default())
+.wrap(tracing_actix_web::TracingLogger::default())
 .wrap(Compress::default())
 .app_data(app_data.clone())
 .route(
@@ -200,12 +201,26 @@ async fn run() -> std::io::Result<()> {
 fn main() -> std::io::Result<()> {
 let _ = dotenvy::dotenv();
-let mut log_builder = pretty_env_logger::formatted_builder();
-log_builder.parse_env(pretty_env_logger::env_logger::Env::default().default_filter_or("info"));
+let tracing_env_filter = EnvFilter::builder()
+.with_default_directive(LevelFilter::INFO.into())
+.from_env_lossy()
+.add_directive("reqwest=info".parse().unwrap())
+.add_directive("rustls=info".parse().unwrap())
+.add_directive("tokio_util=info".parse().unwrap())
+.add_directive("goblin=info".parse().unwrap())
+.add_directive("tower=info".parse().unwrap())
+.add_directive("hyper=info".parse().unwrap())
+.add_directive("h2=info".parse().unwrap());
-let logger = sentry::integrations::log::SentryLogger::with_dest(log_builder.build());
-log::set_boxed_logger(Box::new(logger)).unwrap();
-log::set_max_level(log::LevelFilter::Info);
+tracing_subscriber::registry()
+.with(tracing_env_filter)
+.with(
+tracing_subscriber::fmt::layer()
+.compact()
+.with_span_events(FmtSpan::NEW | FmtSpan::CLOSE),
+)
+.with(sentry::integrations::tracing::layer())
+.init();
 let guard = sentry::init(sentry::ClientOptions {
 release: sentry::release_name!(),
@@ -218,9 +233,9 @@ fn main() -> std::io::Result<()> {
 if guard.is_enabled() {
 std::env::set_var("RUST_BACKTRACE", "full");
-info!("sentry initialized");
+tracing::info!("sentry initialized");
 } else {
-info!("sentry **NOT** initialized");
+tracing::info!("sentry **NOT** initialized");
 }
 System::new().block_on(run())


@@ -104,8 +104,8 @@ pub async fn make_search(
 pin!(stream);
 while let Some((pkg_name, mut file)) = stream.next().await {
-let Some((_, latest_entry)) = file.pop_last() else {
+let Some((_, latest_entry)) = file.entries.pop_last() else {
-log::warn!("no versions found for {pkg_name}");
+tracing::error!("no versions found for {pkg_name}");
 continue;
 };


@@ -5,6 +5,7 @@ use keyring::Entry;
 use reqwest::header::AUTHORIZATION;
 use serde::{ser::SerializeMap, Deserialize, Serialize};
 use std::collections::BTreeMap;
+use tracing::instrument;
 #[derive(Debug, Clone)]
 pub struct Tokens(pub BTreeMap<gix::Url, String>);
@@ -37,15 +38,20 @@ impl<'de> Deserialize<'de> for Tokens {
 }
 }
+#[instrument(level = "trace")]
 pub async fn get_tokens() -> anyhow::Result<Tokens> {
 let config = read_config().await?;
 if !config.tokens.0.is_empty() {
+tracing::debug!("using tokens from config");
 return Ok(config.tokens);
 }
 match Entry::new("tokens", env!("CARGO_PKG_NAME")) {
 Ok(entry) => match entry.get_password() {
-Ok(token) => return serde_json::from_str(&token).context("failed to parse tokens"),
+Ok(token) => {
+tracing::debug!("using tokens from keyring");
+return serde_json::from_str(&token).context("failed to parse tokens");
+}
 Err(keyring::Error::PlatformFailure(_) | keyring::Error::NoEntry) => {}
 Err(e) => return Err(e.into()),
 },
@@ -56,16 +62,22 @@ pub async fn get_tokens() -> anyhow::Result<Tokens> {
 Ok(Tokens(BTreeMap::new()))
 }
+#[instrument(level = "trace")]
 pub async fn set_tokens(tokens: Tokens) -> anyhow::Result<()> {
 let entry = Entry::new("tokens", env!("CARGO_PKG_NAME"))?;
 let json = serde_json::to_string(&tokens).context("failed to serialize tokens")?;
 match entry.set_password(&json) {
-Ok(()) => return Ok(()),
+Ok(()) => {
+tracing::debug!("tokens saved to keyring");
+return Ok(());
+}
 Err(keyring::Error::PlatformFailure(_) | keyring::Error::NoEntry) => {}
 Err(e) => return Err(e.into()),
 }
+tracing::debug!("tokens saved to config");
 let mut config = read_config().await?;
 config.tokens = tokens;
 write_config(&config).await.map_err(Into::into)
@@ -86,6 +98,7 @@ struct UserResponse {
 login: String,
 }
+#[instrument(level = "trace")]
 pub async fn get_token_login(
 reqwest: &reqwest::Client,
 access_token: &str,


@@ -2,6 +2,7 @@ use std::{collections::HashSet, str::FromStr};
 use anyhow::Context;
 use clap::Args;
+use colored::Colorize;
 use semver::VersionReq;
 use crate::cli::{config::read_config, AnyPackageIdentifier, VersionedPackageName};
@@ -62,7 +63,7 @@ impl AddCommand {
 .cloned();
 if let Some(index) = self.index.as_ref().filter(|_| index.is_none()) {
-log::error!("index {index} not found");
+println!("{}: index {index} not found", "error".red().bold());
 return Ok(());
 }
@@ -89,7 +90,7 @@ impl AddCommand {
 .cloned();
 if let Some(index) = self.index.as_ref().filter(|_| index.is_none()) {
-log::error!("wally index {index} not found");
+println!("{}: wally index {index} not found", "error".red().bold());
 return Ok(());
 }
@@ -145,7 +146,7 @@ impl AddCommand {
 .pop_last()
 .map(|(v_id, _)| v_id)
 else {
-log::error!("no versions found for package {specifier}");
+println!("{}: no versions found for package", "error".red().bold());
 return Ok(());
 };


@@ -2,7 +2,6 @@ use crate::cli::{config::read_config, progress_bar, VersionedPackageName};
 use anyhow::Context;
 use clap::Args;
 use fs_err::tokio as fs;
-use indicatif::MultiProgress;
 use pesde::{
 linking::generator::generate_bin_linking_module,
 manifest::target::TargetKind,
@@ -35,12 +34,7 @@ pub struct ExecuteCommand {
 }
 impl ExecuteCommand {
-pub async fn run(
-self,
-project: Project,
-multi: MultiProgress,
-reqwest: reqwest::Client,
-) -> anyhow::Result<()> {
+pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
 let index = match self.index {
 Some(index) => Some(index),
 None => read_config().await.ok().map(|c| c.default_index),
@@ -84,7 +78,7 @@ impl ExecuteCommand {
 );
 };
-log::info!("found package {}@{version}", pkg_ref.name);
+println!("using {}@{version}", pkg_ref.name);
 let tmp_dir = project.cas_dir().join(".tmp");
 fs::create_dir_all(&tmp_dir)
@@ -134,7 +128,6 @@ impl ExecuteCommand {
 progress_bar(
 graph.values().map(|versions| versions.len() as u64).sum(),
 rx,
-&multi,
 "📥 ".to_string(),
 "downloading dependencies".to_string(),
 "downloaded dependencies".to_string(),


@@ -6,7 +6,6 @@ use clap::Args;
 use colored::{ColoredString, Colorize};
 use fs_err::tokio as fs;
 use futures::future::try_join_all;
-use indicatif::MultiProgress;
 use pesde::{
 download_and_link::filter_graph, lockfile::Lockfile, manifest::target::TargetKind, Project,
 MANIFEST_FILE_NAME,
@@ -89,12 +88,7 @@ fn job(n: u8) -> ColoredString {
 struct CallbackError(#[from] anyhow::Error);
 impl InstallCommand {
-pub async fn run(
-self,
-project: Project,
-multi: MultiProgress,
-reqwest: reqwest::Client,
-) -> anyhow::Result<()> {
+pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
 let mut refreshed_sources = HashSet::new();
 let manifest = project
@@ -116,10 +110,10 @@ impl InstallCommand {
 match project.deser_lockfile().await {
 Ok(lockfile) => {
 if lockfile.overrides != manifest.overrides {
-log::debug!("overrides are different");
+tracing::debug!("overrides are different");
 None
 } else if lockfile.target != manifest.target.kind() {
-log::debug!("target kind is different");
+tracing::debug!("target kind is different");
 None
 } else {
 Some(lockfile)
@@ -153,7 +147,7 @@ impl InstallCommand {
 deleted_folders
 .entry(folder.to_string())
 .or_insert_with(|| async move {
-log::debug!("deleting the {folder} folder");
+tracing::debug!("deleting the {folder} folder");
 if let Some(e) = fs::remove_dir_all(package_dir.join(&folder))
 .await
@@ -219,7 +213,7 @@ impl InstallCommand {
 .map(|(alias, _, _)| alias)
 .filter(|alias| {
 if *alias == env!("CARGO_BIN_NAME") {
-log::warn!(
+tracing::warn!(
 "package {alias} has the same name as the CLI, skipping bin link"
 );
 return false;
@@ -281,7 +275,6 @@ exec lune run "$(dirname "$0")/.impl/{alias}.luau" -- "$@""#
 progress_bar(
 graph.values().map(|versions| versions.len() as u64).sum(),
 rx,
-&multi,
 format!("{} 📥 ", job(3)),
 "downloading dependencies".to_string(),
 "downloaded dependencies".to_string(),
@@ -303,7 +296,6 @@ exec lune run "$(dirname "$0")/.impl/{alias}.luau" -- "$@""#
 progress_bar(
 manifest.patches.values().map(|v| v.len() as u64).sum(),
 rx,
-&multi,
 format!("{} 🩹 ", job(JOBS - 1)),
 "applying patches".to_string(),
 "applied patches".to_string(),
@@ -323,9 +315,8 @@ exec lune run "$(dirname "$0")/.impl/{alias}.luau" -- "$@""#
 graph: downloaded_graph,
 workspace: run_on_workspace_members(&project, |project| {
-let multi = multi.clone();
 let reqwest = reqwest.clone();
-async move { Box::pin(self.run(project, multi, reqwest)).await }
+async move { Box::pin(self.run(project, reqwest)).await }
 })
 .await?,
 })


@@ -1,4 +1,3 @@
-use indicatif::MultiProgress;
 use pesde::Project;
 mod add;
@@ -72,18 +71,13 @@ pub enum Subcommand {
 }
 impl Subcommand {
-pub async fn run(
-self,
-project: Project,
-multi: MultiProgress,
-reqwest: reqwest::Client,
-) -> anyhow::Result<()> {
+pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
 match self {
 Subcommand::Auth(auth) => auth.run(project, reqwest).await,
 Subcommand::Config(config) => config.run().await,
 Subcommand::Init(init) => init.run(project).await,
 Subcommand::Run(run) => run.run(project).await,
-Subcommand::Install(install) => install.run(project, multi, reqwest).await,
+Subcommand::Install(install) => install.run(project, reqwest).await,
 Subcommand::Publish(publish) => publish.run(project, reqwest).await,
 #[cfg(feature = "version-management")]
 Subcommand::SelfInstall(self_install) => self_install.run().await,
@@ -94,9 +88,9 @@ impl Subcommand {
 #[cfg(feature = "version-management")]
 Subcommand::SelfUpgrade(self_upgrade) => self_upgrade.run(reqwest).await,
 Subcommand::Add(add) => add.run(project).await,
-Subcommand::Update(update) => update.run(project, multi, reqwest).await,
+Subcommand::Update(update) => update.run(project, reqwest).await,
 Subcommand::Outdated(outdated) => outdated.run(project).await,
-Subcommand::Execute(execute) => execute.run(project, multi, reqwest).await,
+Subcommand::Execute(execute) => execute.run(project, reqwest).await,
 }
 }
 }


@@ -4,6 +4,7 @@ use async_compression::Level;
 use clap::Args;
 use colored::Colorize;
 use fs_err::tokio as fs;
+#[allow(deprecated)]
 use pesde::{
 manifest::{target::Target, DependencyType},
 matching_globs_old_behaviour,
@@ -129,6 +130,7 @@ impl PublishCommand {
 _ => None,
 };
+#[allow(deprecated)]
 let mut paths = matching_globs_old_behaviour(
 project.package_dir(),
 manifest.includes.iter().map(|s| s.as_str()),
@@ -499,6 +501,16 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
 .bin_path()
 .map_or("(none)".to_string(), |p| p.to_string())
 );
+println!(
+"\tscripts: {}",
+manifest
+.target
+.scripts()
+.filter(|s| !s.is_empty())
+.map_or("(none)".to_string(), |s| {
+s.keys().cloned().collect::<Vec<_>>().join(", ")
+})
+);
 }
 println!(
@@ -585,22 +597,14 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
 DependencySpecifiers::Pesde(spec) => {
 !config.other_registries_allowed.is_allowed_or_same(
 source.repo_url().clone(),
-manifest
-.indices
-.get(spec.index.as_deref().unwrap_or(DEFAULT_INDEX_NAME))
-.unwrap()
-.clone(),
+gix::Url::try_from(spec.index.as_deref().unwrap()).unwrap(),
 )
 }
 DependencySpecifiers::Git(spec) => !config.git_allowed.is_allowed(spec.repo.clone()),
 #[cfg(feature = "wally-compat")]
-DependencySpecifiers::Wally(spec) => !config.wally_allowed.is_allowed(
-manifest
-.wally_indices
-.get(spec.index.as_deref().unwrap_or(DEFAULT_INDEX_NAME))
-.unwrap()
-.clone(),
-),
+DependencySpecifiers::Wally(spec) => !config
+.wally_allowed
+.is_allowed(gix::Url::try_from(spec.index.as_deref().unwrap()).unwrap()),
 _ => false,
 }) {
 anyhow::bail!("dependency `{disallowed}` is not allowed on this index");
@@ -622,7 +626,7 @@ info: otherwise, the file was deemed unnecessary, if you don't understand why, p
 .body(archive);
 if let Some(token) = project.auth_config().tokens().get(index_url) {
-log::debug!("using token for {index_url}");
+tracing::debug!("using token for {index_url}");
 request = request.header(AUTHORIZATION, token);
 }


@@ -1,7 +1,8 @@
 use crate::cli::{
 config::read_config,
 version::{
-current_version, get_latest_remote_version, get_or_download_version, update_bin_exe,
+current_version, get_or_download_version, get_remote_version, no_build_metadata,
+update_bin_exe, TagInfo, VersionType,
 },
 };
 use anyhow::Context;
@@ -24,33 +25,33 @@ impl SelfUpgradeCommand {
 .context("no cached version found")?
 .1
 } else {
-get_latest_remote_version(&reqwest).await?
+get_remote_version(&reqwest, VersionType::Latest).await?
 };
-if latest_version <= current_version() {
+let latest_version_no_metadata = no_build_metadata(&latest_version);
+if latest_version_no_metadata <= current_version() {
 println!("already up to date");
 return Ok(());
 }
+let display_latest_version = latest_version_no_metadata.to_string().yellow().bold();
 if !inquire::prompt_confirmation(format!(
-"are you sure you want to upgrade {} from {} to {}?",
+"are you sure you want to upgrade {} from {} to {display_latest_version}?",
 env!("CARGO_BIN_NAME").cyan(),
-current_version().to_string().yellow().bold(),
-latest_version.to_string().yellow().bold()
+env!("CARGO_PKG_VERSION").yellow().bold()
 ))? {
 println!("cancelled upgrade");
 return Ok(());
 }
-let path = get_or_download_version(&reqwest, &latest_version, true)
+let path = get_or_download_version(&reqwest, &TagInfo::Complete(latest_version), true)
 .await?
 .unwrap();
 update_bin_exe(&path).await?;
-println!(
-"upgraded to version {}!",
-latest_version.to_string().yellow().bold()
-);
+println!("upgraded to version {display_latest_version}!");
 Ok(())
 }


@@ -2,7 +2,6 @@ use crate::cli::{progress_bar, run_on_workspace_members};
 use anyhow::Context;
 use clap::Args;
 use colored::Colorize;
-use indicatif::MultiProgress;
 use pesde::{lockfile::Lockfile, Project};
 use std::{collections::HashSet, sync::Arc};
 use tokio::sync::Mutex;
@@ -11,12 +10,7 @@ use tokio::sync::Mutex;
 pub struct UpdateCommand {}
 impl UpdateCommand {
-pub async fn run(
-self,
-project: Project,
-multi: MultiProgress,
-reqwest: reqwest::Client,
-) -> anyhow::Result<()> {
+pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
 let mut refreshed_sources = HashSet::new();
 let manifest = project
@@ -60,7 +54,6 @@ impl UpdateCommand {
 progress_bar(
 graph.values().map(|versions| versions.len() as u64).sum(),
 rx,
-&multi,
 "📥 ".to_string(),
 "downloading dependencies".to_string(),
 "downloaded dependencies".to_string(),
@@ -73,9 +66,8 @@ impl UpdateCommand {
 },
 workspace: run_on_workspace_members(&project, |project| {
-let multi = multi.clone();
 let reqwest = reqwest.clone();
-async move { Box::pin(self.run(project, multi, reqwest)).await }
+async move { Box::pin(self.run(project, reqwest)).await }
 })
 .await?,
 })

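The command above hands a channel receiver to `progress_bar`, which drains per-package results, ticking the bar on each `Ok` and aborting on the first `Err`. A std-only sketch of that consume loop, with `std::sync::mpsc` standing in for tokio's channel and `drain_progress` as a hypothetical name:

```rust
use std::sync::mpsc;

/// Drain a channel of per-item results, counting successes and stopping
/// at the first error — the shape of `progress_bar`'s receive loop.
fn drain_progress(rx: mpsc::Receiver<Result<String, String>>) -> Result<u64, String> {
    let mut done = 0;
    while let Ok(result) = rx.recv() {
        result?; // propagate the first failure to the caller
        done += 1; // in the real code: bar.inc(1)
    }
    Ok(done)
}

fn main() {
    let (tx, rx) = mpsc::channel();
    for name in ["a@1.0.0", "b@2.0.0"] {
        tx.send(Ok(name.to_string())).unwrap();
    }
    drop(tx); // close the channel so recv() stops returning Ok
    assert_eq!(drain_progress(rx), Ok(2));
}
```

Dropping every sender is what terminates the loop, which is why the real code keeps exactly one `tx` clone per spawned download task.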
View file

@@ -2,6 +2,7 @@ use crate::cli::{auth::Tokens, home_dir};
 use anyhow::Context;
 use fs_err::tokio as fs;
 use serde::{Deserialize, Serialize};
+use tracing::instrument;

 #[derive(Debug, Clone, Serialize, Deserialize)]
 #[serde(default)]
@@ -30,6 +31,7 @@ impl Default for CliConfig {
 	}
 }

+#[instrument(level = "trace")]
 pub async fn read_config() -> anyhow::Result<CliConfig> {
 	let config_string = match fs::read_to_string(home_dir()?.join("config.toml")).await {
 		Ok(config_string) => config_string,
@@ -44,6 +46,7 @@ pub async fn read_config() -> anyhow::Result<CliConfig> {
 	Ok(config)
 }

+#[instrument(level = "trace")]
 pub async fn write_config(config: &CliConfig) -> anyhow::Result<()> {
 	let config_string = toml::to_string(config).context("failed to serialize config")?;
 	fs::write(home_dir()?.join("config.toml"), config_string)

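`read_config` treats a missing `config.toml` as "use defaults" rather than an error. A minimal std-only sketch of that `NotFound` branch (the real code additionally deserializes the TOML into `CliConfig`; `read_or_default` is a hypothetical name):

```rust
use std::{fs, io, path::Path};

/// Read a config file, falling back to a default when it does not
/// exist — the same match shape `read_config` uses on NotFound.
fn read_or_default(path: &Path) -> io::Result<String> {
    match fs::read_to_string(path) {
        Ok(s) => Ok(s),
        // a missing file is not an error: start from the default config
        Err(e) if e.kind() == io::ErrorKind::NotFound => Ok(String::new()),
        Err(e) => Err(e),
    }
}

fn main() {
    let missing = Path::new("definitely-not-here.toml");
    assert_eq!(read_or_default(missing).unwrap(), "");
}
```

Only genuinely unexpected I/O errors (permissions, corruption) propagate; a fresh install simply gets the `Default` config.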
View file

@@ -2,7 +2,6 @@ use anyhow::Context;
 use colored::Colorize;
 use fs_err::tokio as fs;
 use futures::StreamExt;
-use indicatif::MultiProgress;
 use pesde::{
 	lockfile::Lockfile,
 	manifest::target::TargetKind,
@@ -19,6 +18,7 @@ use std::{
 	time::Duration,
 };
 use tokio::pin;
+use tracing::instrument;

 pub mod auth;
 pub mod commands;
@@ -43,6 +43,7 @@ pub async fn bin_dir() -> anyhow::Result<PathBuf> {
 	Ok(bin_dir)
 }

+#[instrument(skip(project), ret(level = "trace"), level = "debug")]
 pub async fn up_to_date_lockfile(project: &Project) -> anyhow::Result<Option<Lockfile>> {
 	let manifest = project.deser_manifest().await?;
 	let lockfile = match project.deser_lockfile().await {
@@ -56,17 +57,17 @@ pub async fn up_to_date_lockfile(project: &Project) -> anyhow::Result<Option<Loc
 	};

 	if manifest.overrides != lockfile.overrides {
-		log::debug!("overrides are different");
+		tracing::debug!("overrides are different");
 		return Ok(None);
 	}

 	if manifest.target.kind() != lockfile.target {
-		log::debug!("target kind is different");
+		tracing::debug!("target kind is different");
 		return Ok(None);
 	}

 	if manifest.name != lockfile.name || manifest.version != lockfile.version {
-		log::debug!("name or version is different");
+		tracing::debug!("name or version is different");
 		return Ok(None);
 	}

@@ -88,7 +89,7 @@ pub async fn up_to_date_lockfile(project: &Project) -> anyhow::Result<Option<Loc
 		.iter()
 		.all(|(_, (spec, ty))| specs.contains(&(spec, ty)));

-	log::debug!("dependencies are the same: {same_dependencies}");
+	tracing::debug!("dependencies are the same: {same_dependencies}");

 	Ok(if same_dependencies {
 		Some(lockfile)
@@ -133,7 +134,7 @@ impl VersionedPackageName {
 			let versions = graph.get(&self.0).context("package not found in graph")?;
 			if versions.len() == 1 {
 				let version = versions.keys().next().unwrap().clone();
-				log::debug!("only one version found, using {version}");
+				tracing::debug!("only one version found, using {version}");
 				version
 			} else {
 				anyhow::bail!(
@@ -195,21 +196,18 @@ pub fn parse_gix_url(s: &str) -> Result<gix::Url, gix::url::parse::Error> {
 pub async fn progress_bar<E: std::error::Error + Into<anyhow::Error>>(
 	len: u64,
 	mut rx: tokio::sync::mpsc::Receiver<Result<String, E>>,
-	multi: &MultiProgress,
 	prefix: String,
 	progress_msg: String,
 	finish_msg: String,
 ) -> anyhow::Result<()> {
-	let bar = multi.add(
-		indicatif::ProgressBar::new(len)
-			.with_style(
-				indicatif::ProgressStyle::default_bar()
-					.template("{prefix}[{elapsed_precise}] {bar:40.208/166} {pos}/{len} {msg}")?
-					.progress_chars("█▓▒░ "),
-			)
-			.with_prefix(prefix)
-			.with_message(progress_msg),
-	);
+	let bar = indicatif::ProgressBar::new(len)
+		.with_style(
+			indicatif::ProgressStyle::default_bar()
+				.template("{prefix}[{elapsed_precise}] {bar:40.208/166} {pos}/{len} {msg}")?
+				.progress_chars("█▓▒░ "),
+		)
+		.with_prefix(prefix)
+		.with_message(progress_msg);
 	bar.enable_steady_tick(Duration::from_millis(100));

 	while let Some(result) = rx.recv().await {

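`up_to_date_lockfile` above is a chain of field-by-field comparisons between manifest and lockfile, each mismatch returning `Ok(None)` to force a re-resolve. A std-only sketch of that shape with `Meta` as a hypothetical stand-in for the compared fields (the real types also compare overrides, target kind, and dependency specs):

```rust
/// Minimal stand-in for the manifest/lockfile fields compared by
/// `up_to_date_lockfile`.
#[derive(PartialEq)]
struct Meta {
    name: String,
    version: String,
}

/// Return the lockfile only when every compared field matches the
/// manifest — otherwise `None` forces dependency resolution to rerun.
fn up_to_date(manifest: &Meta, lockfile: Meta) -> Option<Meta> {
    if manifest != &lockfile {
        return None; // in the real code: tracing::debug! + Ok(None)
    }
    Some(lockfile)
}

fn main() {
    let manifest = Meta { name: "pkg".into(), version: "1.0.0".into() };
    let stale = Meta { name: "pkg".into(), version: "0.9.0".into() };
    assert!(up_to_date(&manifest, stale).is_none());

    let fresh = Meta { name: "pkg".into(), version: "1.0.0".into() };
    assert!(up_to_date(&manifest, fresh).is_some());
}
```

Returning `Option` rather than `bool` lets the caller reuse the already-deserialized lockfile when it is current.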
View file

@@ -15,7 +15,8 @@ use std::{
 	env::current_exe,
 	path::{Path, PathBuf},
 };
-use tokio::io::AsyncReadExt;
+use tokio::io::AsyncWrite;
+use tracing::instrument;

 pub fn current_version() -> Version {
 	Version::parse(env!("CARGO_PKG_VERSION")).unwrap()
@@ -33,18 +34,33 @@ struct Asset {
 	url: url::Url,
 }

+#[instrument(level = "trace")]
 fn get_repo() -> (String, String) {
 	let mut parts = env!("CARGO_PKG_REPOSITORY").split('/').skip(3);
-	(
+	let (owner, repo) = (
 		parts.next().unwrap().to_string(),
 		parts.next().unwrap().to_string(),
-	)
+	);
+
+	tracing::trace!("repository for updates: {owner}/{repo}");
+
+	(owner, repo)
 }

-pub async fn get_latest_remote_version(reqwest: &reqwest::Client) -> anyhow::Result<Version> {
+#[derive(Debug)]
+pub enum VersionType {
+	Latest,
+	Specific(Version),
+}
+
+#[instrument(skip(reqwest), level = "trace")]
+pub async fn get_remote_version(
+	reqwest: &reqwest::Client,
+	ty: VersionType,
+) -> anyhow::Result<Version> {
 	let (owner, repo) = get_repo();

-	let releases = reqwest
+	let mut releases = reqwest
 		.get(format!(
 			"https://api.github.com/repos/{owner}/{repo}/releases",
 		))
@@ -55,17 +71,28 @@ pub async fn get_latest_remote_version(reqwest: &reqwest::Client) -> anyhow::Res
 		.context("failed to get GitHub API response")?
 		.json::<Vec<Release>>()
 		.await
-		.context("failed to parse GitHub API response")?;
-
-	releases
+		.context("failed to parse GitHub API response")?
 		.into_iter()
-		.map(|release| Version::parse(release.tag_name.trim_start_matches('v')).unwrap())
-		.max()
-		.context("failed to find latest version")
+		.filter_map(|release| Version::parse(release.tag_name.trim_start_matches('v')).ok());
+
+	match ty {
+		VersionType::Latest => releases.max(),
+		VersionType::Specific(version) => {
+			releases.find(|v| no_build_metadata(v) == no_build_metadata(&version))
+		}
+	}
+	.context("failed to find latest version")
+}
+
+pub fn no_build_metadata(version: &Version) -> Version {
+	let mut version = version.clone();
+	version.build = semver::BuildMetadata::EMPTY;
+	version
 }

 const CHECK_INTERVAL: chrono::Duration = chrono::Duration::hours(6);

+#[instrument(skip(reqwest), level = "trace")]
 pub async fn check_for_updates(reqwest: &reqwest::Client) -> anyhow::Result<()> {
 	let config = read_config().await?;
@@ -73,9 +100,11 @@ pub async fn check_for_updates(reqwest: &reqwest::Client) -> anyhow::Result<()>
 		.last_checked_updates
 		.filter(|(time, _)| chrono::Utc::now() - *time < CHECK_INTERVAL)
 	{
+		tracing::debug!("using cached version");
 		version
 	} else {
-		let version = get_latest_remote_version(reqwest).await?;
+		tracing::debug!("checking for updates");
+		let version = get_remote_version(reqwest, VersionType::Latest).await?;

 		write_config(&CliConfig {
 			last_checked_updates: Some((chrono::Utc::now(), version.clone())),
@@ -86,72 +115,77 @@ pub async fn check_for_updates(reqwest: &reqwest::Client) -> anyhow::Result<()>
 		version
 	};
 	let current_version = current_version();
+	let version_no_metadata = no_build_metadata(&version);

-	if version > current_version {
-		let name = env!("CARGO_BIN_NAME");
-		let changelog = format!("{}/releases/tag/v{version}", env!("CARGO_PKG_REPOSITORY"),);
-
-		let unformatted_messages = [
-			"".to_string(),
-			format!("update available! {current_version} → {version}"),
-			format!("changelog: {changelog}"),
-			format!("run `{name} self-upgrade` to upgrade"),
-			"".to_string(),
-		];
-
-		let width = unformatted_messages
-			.iter()
-			.map(|s| s.chars().count())
-			.max()
-			.unwrap()
-			+ 4;
-
-		let column = "│".bright_magenta();
-
-		let message = [
-			"".to_string(),
-			format!(
-				"update available! {} → {}",
-				current_version.to_string().red(),
-				version.to_string().green()
-			),
-			format!("changelog: {}", changelog.blue()),
-			format!(
-				"run `{} {}` to upgrade",
-				name.blue(),
-				"self-upgrade".yellow()
-			),
-			"".to_string(),
-		]
-		.into_iter()
-		.enumerate()
-		.map(|(i, s)| {
-			let text_length = unformatted_messages[i].chars().count();
-			let padding = (width as f32 - text_length as f32) / 2f32;
-
-			let padding_l = " ".repeat(padding.floor() as usize);
-			let padding_r = " ".repeat(padding.ceil() as usize);
-
-			format!("{column}{padding_l}{s}{padding_r}{column}")
-		})
-		.collect::<Vec<_>>()
-		.join("\n");
-
-		let lines = "─".repeat(width).bright_magenta();
-
-		let tl = "╭".bright_magenta();
-		let tr = "╮".bright_magenta();
-		let bl = "╰".bright_magenta();
-		let br = "╯".bright_magenta();
-
-		println!("\n{tl}{lines}{tr}\n{message}\n{bl}{lines}{br}\n");
+	if version_no_metadata <= current_version {
+		return Ok(());
 	}
+
+	let name = env!("CARGO_BIN_NAME");
+	let changelog = format!("{}/releases/tag/v{version}", env!("CARGO_PKG_REPOSITORY"));
+
+	let unformatted_messages = [
+		"".to_string(),
+		format!("update available! {current_version} → {version_no_metadata}"),
+		format!("changelog: {changelog}"),
+		format!("run `{name} self-upgrade` to upgrade"),
+		"".to_string(),
+	];
+
+	let width = unformatted_messages
+		.iter()
+		.map(|s| s.chars().count())
+		.max()
+		.unwrap()
+		+ 4;
+
+	let column = "│".bright_magenta();
+
+	let message = [
+		"".to_string(),
+		format!(
+			"update available! {} → {}",
+			current_version.to_string().red(),
+			version_no_metadata.to_string().green()
+		),
+		format!("changelog: {}", changelog.blue()),
+		format!(
+			"run `{} {}` to upgrade",
+			name.blue(),
+			"self-upgrade".yellow()
+		),
+		"".to_string(),
+	]
+	.into_iter()
+	.enumerate()
+	.map(|(i, s)| {
+		let text_length = unformatted_messages[i].chars().count();
+		let padding = (width as f32 - text_length as f32) / 2f32;

+		let padding_l = " ".repeat(padding.floor() as usize);
+		let padding_r = " ".repeat(padding.ceil() as usize);
+
+		format!("{column}{padding_l}{s}{padding_r}{column}")
+	})
+	.collect::<Vec<_>>()
+	.join("\n");
+
+	let lines = "─".repeat(width).bright_magenta();
+
+	let tl = "╭".bright_magenta();
+	let tr = "╮".bright_magenta();
+	let bl = "╰".bright_magenta();
+	let br = "╯".bright_magenta();
+
+	println!("\n{tl}{lines}{tr}\n{message}\n{bl}{lines}{br}\n");

 	Ok(())
 }

-pub async fn download_github_release(
+#[instrument(skip(reqwest, writer), level = "trace")]
+pub async fn download_github_release<W: AsyncWrite + Unpin>(
 	reqwest: &reqwest::Client,
 	version: &Version,
-) -> anyhow::Result<Vec<u8>> {
+	mut writer: W,
+) -> anyhow::Result<()> {
 	let (owner, repo) = get_repo();

 	let release = reqwest
@@ -202,19 +236,22 @@ pub async fn download_github_release(
 		.context("archive has no entry")?
 		.context("failed to get first archive entry")?;

-	let mut result = Vec::new();
-	entry
-		.read_to_end(&mut result)
+	tokio::io::copy(&mut entry, &mut writer)
 		.await
-		.context("failed to read archive entry bytes")?;
-
-	Ok(result)
+		.context("failed to write archive entry to file")
+		.map(|_| ())
 }

+#[derive(Debug)]
+pub enum TagInfo {
+	Complete(Version),
+	Incomplete(Version),
+}
+
+#[instrument(skip(reqwest), level = "trace")]
 pub async fn get_or_download_version(
 	reqwest: &reqwest::Client,
-	version: &Version,
+	tag: &TagInfo,
 	always_give_path: bool,
 ) -> anyhow::Result<Option<PathBuf>> {
 	let path = home_dir()?.join("versions");
@@ -222,11 +259,23 @@ pub async fn get_or_download_version(
 		.await
 		.context("failed to create versions directory")?;

-	let path = path.join(format!("{version}{}", std::env::consts::EXE_SUFFIX));
+	let version = match tag {
+		TagInfo::Complete(version) => version,
+		// don't fetch the version since it could be cached
+		TagInfo::Incomplete(version) => version,
+	};
+
+	let path = path.join(format!(
+		"{}{}",
+		no_build_metadata(version),
+		std::env::consts::EXE_SUFFIX
+	));

 	let is_requested_version = !always_give_path && *version == current_version();

 	if path.exists() {
+		tracing::debug!("version already exists");
+
 		return Ok(if is_requested_version {
 			None
 		} else {
@@ -235,14 +284,29 @@ pub async fn get_or_download_version(
 	}

 	if is_requested_version {
+		tracing::debug!("copying current executable to version directory");
 		fs::copy(current_exe()?, &path)
 			.await
 			.context("failed to copy current executable to version directory")?;
 	} else {
-		let bytes = download_github_release(reqwest, version).await?;
-		fs::write(&path, bytes)
-			.await
-			.context("failed to write downloaded version file")?;
+		let version = match tag {
+			TagInfo::Complete(version) => version.clone(),
+			TagInfo::Incomplete(version) => {
+				get_remote_version(reqwest, VersionType::Specific(version.clone()))
+					.await
+					.context("failed to get remote version")?
+			}
+		};
+
+		tracing::debug!("downloading version");
+		download_github_release(
+			reqwest,
+			&version,
+			fs::File::create(&path)
+				.await
+				.context("failed to create version file")?,
+		)
+		.await?;
 	}

 	make_executable(&path)
@@ -256,6 +320,7 @@ pub async fn get_or_download_version(
 	})
 }

+#[instrument(level = "trace")]
 pub async fn update_bin_exe(downloaded_file: &Path) -> anyhow::Result<()> {
 	let bin_exe_path = bin_dir().await?.join(format!(
 		"{}{}",

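The update banner in `check_for_updates` measures width on the uncolored messages (ANSI color codes would skew character counts) and centers each line by splitting the leftover space floor/ceil. A std-only sketch of that layout math, without the `colored` calls (`center` is a hypothetical name):

```rust
/// Center `text` in a row of `width` columns between `│` borders,
/// splitting the leftover space floor/ceil as the update banner does.
fn center(text: &str, width: usize) -> String {
    let padding = (width as f32 - text.chars().count() as f32) / 2.0;
    let left = " ".repeat(padding.floor() as usize);
    let right = " ".repeat(padding.ceil() as usize);
    format!("│{left}{text}{right}│")
}

fn main() {
    let messages = [
        "update available! 0.5.2 → 0.5.3",
        "run `pesde self-upgrade` to upgrade",
    ];
    // width is the widest *uncolored* message plus 4 columns of margin
    let width = messages.iter().map(|s| s.chars().count()).max().unwrap() + 4;
    for m in &messages {
        // every row is exactly width + 2 border characters wide
        assert_eq!(center(m, width).chars().count(), width + 2);
    }
}
```

Keeping a parallel uncolored array is the trick: lengths are computed from it, while the printed strings carry the escape codes.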
View file

@@ -13,6 +13,7 @@ use std::{
 	collections::HashSet,
 	sync::{Arc, Mutex},
 };
+use tracing::{instrument, Instrument};

 type MultithreadedGraph = Arc<Mutex<DownloadedGraph>>;

@@ -23,6 +24,7 @@ pub(crate) type MultithreadDownloadJob = (

 impl Project {
 	/// Downloads a graph of dependencies
+	#[instrument(skip(self, graph, refreshed_sources, reqwest), level = "debug")]
 	pub async fn download_graph(
 		&self,
 		graph: &DependencyGraph,
@@ -69,76 +71,87 @@ impl Project {
 				let version_id = version_id.clone();
 				let node = node.clone();

+				let span = tracing::info_span!(
+					"download",
+					name = name.to_string(),
+					version_id = version_id.to_string()
+				);
+
 				let project = project.clone();
 				let reqwest = reqwest.clone();
 				let downloaded_graph = downloaded_graph.clone();

 				let package_dir = self.package_dir().to_path_buf();

-				tokio::spawn(async move {
-					let source = node.pkg_ref.source();
-
-					let container_folder = node.container_folder(
-						&package_dir
-							.join(manifest_target_kind.packages_folder(version_id.target()))
-							.join(PACKAGES_CONTAINER_NAME),
-						&name,
-						version_id.version(),
-					);
-
-					match fs::create_dir_all(&container_folder).await {
-						Ok(_) => {}
-						Err(e) => {
-							tx.send(Err(errors::DownloadGraphError::Io(e)))
-								.await
-								.unwrap();
-							return;
-						}
-					}
-
-					let project = project.clone();
-
-					log::debug!("downloading {name}@{version_id}");
-
-					let (fs, target) =
-						match source.download(&node.pkg_ref, &project, &reqwest).await {
-							Ok(target) => target,
-							Err(e) => {
-								tx.send(Err(Box::new(e).into())).await.unwrap();
-								return;
-							}
-						};
-
-					log::debug!("downloaded {name}@{version_id}");
-
-					if write {
-						if !prod || node.resolved_ty != DependencyType::Dev {
-							match fs.write_to(container_folder, project.cas_dir(), true).await {
-								Ok(_) => {}
-								Err(e) => {
-									tx.send(Err(errors::DownloadGraphError::WriteFailed(e)))
-										.await
-										.unwrap();
-									return;
-								}
-							};
-						} else {
-							log::debug!("skipping writing {name}@{version_id} to disk, dev dependency in prod mode");
-						}
-					}
-
-					let display_name = format!("{name}@{version_id}");
-
-					{
-						let mut downloaded_graph = downloaded_graph.lock().unwrap();
-						downloaded_graph
-							.entry(name)
-							.or_default()
-							.insert(version_id, DownloadedDependencyGraphNode { node, target });
-					}
-
-					tx.send(Ok(display_name)).await.unwrap();
-				});
+				tokio::spawn(
+					async move {
+						let source = node.pkg_ref.source();
+
+						let container_folder = node.container_folder(
+							&package_dir
+								.join(manifest_target_kind.packages_folder(version_id.target()))
+								.join(PACKAGES_CONTAINER_NAME),
+							&name,
+							version_id.version(),
+						);
+
+						match fs::create_dir_all(&container_folder).await {
+							Ok(_) => {}
+							Err(e) => {
+								tx.send(Err(errors::DownloadGraphError::Io(e)))
+									.await
+									.unwrap();
+								return;
+							}
+						}
+
+						let project = project.clone();
+
+						tracing::debug!("downloading");
+
+						let (fs, target) =
+							match source.download(&node.pkg_ref, &project, &reqwest).await {
+								Ok(target) => target,
+								Err(e) => {
+									tx.send(Err(Box::new(e).into())).await.unwrap();
+									return;
+								}
+							};
+
+						tracing::debug!("downloaded");
+
+						if write {
+							if !prod || node.resolved_ty != DependencyType::Dev {
+								match fs.write_to(container_folder, project.cas_dir(), true).await {
+									Ok(_) => {}
+									Err(e) => {
+										tx.send(Err(errors::DownloadGraphError::WriteFailed(e)))
+											.await
+											.unwrap();
+										return;
+									}
+								};
+							} else {
+								tracing::debug!(
+									"skipping write to disk, dev dependency in prod mode"
+								);
+							}
+						}
+
+						let display_name = format!("{name}@{version_id}");
+
+						{
+							let mut downloaded_graph = downloaded_graph.lock().unwrap();
+							downloaded_graph
+								.entry(name)
+								.or_default()
+								.insert(version_id, DownloadedDependencyGraphNode { node, target });
+						}
+
+						tx.send(Ok(display_name)).await.unwrap();
+					}
+					.instrument(span),
+				);
 			}
 		}

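Each download task above inserts its result into a shared, mutex-guarded graph and reports a display name over a channel. A std-only sketch of that fan-out pattern, with OS threads standing in for `tokio::spawn` and plain strings for the graph nodes:

```rust
use std::collections::HashMap;
use std::sync::{mpsc, Arc, Mutex};
use std::thread;

fn main() {
    // shared "downloaded graph", guarded by a mutex as in download_graph
    let graph = Arc::new(Mutex::new(HashMap::new()));
    let (tx, rx) = mpsc::channel();

    let packages = [("a", "1.0.0"), ("b", "2.0.0")];
    let mut handles = Vec::new();
    for (name, version) in packages {
        let graph = Arc::clone(&graph);
        let tx = tx.clone();
        handles.push(thread::spawn(move || {
            // worker: record the result, then report progress
            graph.lock().unwrap().insert(name, version);
            tx.send(format!("{name}@{version}")).unwrap();
        }));
    }
    drop(tx); // close our sender so the receiver loop can end

    let reported: Vec<_> = rx.iter().collect();
    for handle in handles {
        handle.join().unwrap();
    }
    assert_eq!(reported.len(), 2);
    assert_eq!(graph.lock().unwrap().len(), 2);
}
```

The lock is held only for the insert; progress reporting happens over the channel so the consumer (the progress bar) never touches the graph.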
View file

@@ -11,6 +11,7 @@ use std::{
 	sync::{Arc, Mutex as StdMutex},
 };
 use tokio::sync::Mutex;
+use tracing::{instrument, Instrument};

 /// Filters a graph to only include production dependencies, if `prod` is `true`
 pub fn filter_graph(graph: &DownloadedGraph, prod: bool) -> DownloadedGraph {
@@ -33,8 +34,16 @@ pub fn filter_graph(graph: &DownloadedGraph, prod: bool) -> DownloadedGraph {
 		.collect()
 }

+/// Receiver for dependencies downloaded and linked
+pub type DownloadAndLinkReceiver =
+	tokio::sync::mpsc::Receiver<Result<String, crate::download::errors::DownloadGraphError>>;
+
 impl Project {
 	/// Downloads a graph of dependencies and links them in the correct order
+	#[instrument(
+		skip(self, graph, refreshed_sources, reqwest, pesde_cb),
+		level = "debug"
+	)]
 	pub async fn download_and_link<
 		F: FnOnce(&Arc<DownloadedGraph>) -> R + Send + 'static,
 		R: Future<Output = Result<(), E>> + Send,
@@ -49,9 +58,7 @@ impl Project {
 		pesde_cb: F,
 	) -> Result<
 		(
-			tokio::sync::mpsc::Receiver<
-				Result<String, crate::download::errors::DownloadGraphError>,
-			>,
+			DownloadAndLinkReceiver,
 			impl Future<Output = Result<DownloadedGraph, errors::DownloadAndLinkError<E>>>,
 		),
 		errors::DownloadAndLinkError<E>,
@@ -78,6 +85,7 @@ impl Project {
 			// step 1. download pesde dependencies
 			let (mut pesde_rx, pesde_graph) = this
 				.download_graph(&graph, &mut refreshed_sources, &reqwest, prod, write, false)
+				.instrument(tracing::debug_span!("download (pesde)"))
 				.await?;

 			while let Some(result) = pesde_rx.recv().await {
@@ -89,6 +97,7 @@ impl Project {
 			// step 2. link pesde dependencies. do so without types
 			if write {
 				this.link_dependencies(&filter_graph(&pesde_graph, prod), false)
+					.instrument(tracing::debug_span!("link (pesde)"))
 					.await?;
 			}

@@ -103,6 +112,7 @@ impl Project {
 			// step 3. download wally dependencies
 			let (mut wally_rx, wally_graph) = this
 				.download_graph(&graph, &mut refreshed_sources, &reqwest, prod, write, true)
+				.instrument(tracing::debug_span!("download (wally)"))
 				.await?;

 			while let Some(result) = wally_rx.recv().await {
@@ -132,6 +142,7 @@ impl Project {
 			// step 4. link ALL dependencies. do so with types
 			if write {
 				this.link_dependencies(&filter_graph(&graph, prod), true)
+					.instrument(tracing::debug_span!("link (all)"))
 					.await?;
 			}

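`filter_graph` drops dev dependencies from the graph when `prod` is true and passes everything through otherwise. A std-only sketch of that rule over a flat map (the real graph is nested by package name and version):

```rust
use std::collections::HashMap;

#[derive(Clone, Copy, PartialEq, Debug)]
enum DependencyType {
    Standard,
    Dev,
}

/// Keep everything when not in prod; drop dev dependencies when in
/// prod — the rule `filter_graph` applies per node.
fn filter_graph(
    graph: &HashMap<&'static str, DependencyType>,
    prod: bool,
) -> HashMap<&'static str, DependencyType> {
    graph
        .iter()
        .filter(|(_, ty)| !prod || **ty != DependencyType::Dev)
        .map(|(k, v)| (*k, *v))
        .collect()
}

fn main() {
    let mut graph = HashMap::new();
    graph.insert("runtime-dep", DependencyType::Standard);
    graph.insert("test-helper", DependencyType::Dev);

    assert_eq!(filter_graph(&graph, false).len(), 2);
    let prod = filter_graph(&graph, true);
    assert!(!prod.contains_key("test-helper"));
}
```

This filtering is why the linker now needs the `is_complete` flag further down: a prod-filtered graph legitimately lacks dev nodes, so a missing dependency is only an error when linking the full graph.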
View file

@@ -14,8 +14,10 @@ use futures::{future::try_join_all, Stream};
 use gix::sec::identity::Account;
 use std::{
 	collections::{HashMap, HashSet},
+	fmt::Debug,
 	path::{Path, PathBuf},
 };
+use tracing::instrument;
 use wax::Pattern;

 /// Downloading packages
@@ -149,29 +151,35 @@ impl Project {
 	}

 	/// Read the manifest file
+	#[instrument(skip(self), ret(level = "trace"), level = "debug")]
 	pub async fn read_manifest(&self) -> Result<String, errors::ManifestReadError> {
 		let string = fs::read_to_string(self.package_dir.join(MANIFEST_FILE_NAME)).await?;
 		Ok(string)
 	}

+	// TODO: cache the manifest
 	/// Deserialize the manifest file
+	#[instrument(skip(self), ret(level = "trace"), level = "debug")]
 	pub async fn deser_manifest(&self) -> Result<Manifest, errors::ManifestReadError> {
 		let string = fs::read_to_string(self.package_dir.join(MANIFEST_FILE_NAME)).await?;
 		Ok(toml::from_str(&string)?)
 	}

 	/// Write the manifest file
+	#[instrument(skip(self, manifest), level = "debug")]
 	pub async fn write_manifest<S: AsRef<[u8]>>(&self, manifest: S) -> Result<(), std::io::Error> {
 		fs::write(self.package_dir.join(MANIFEST_FILE_NAME), manifest.as_ref()).await
 	}

 	/// Deserialize the lockfile
+	#[instrument(skip(self), ret(level = "trace"), level = "debug")]
 	pub async fn deser_lockfile(&self) -> Result<Lockfile, errors::LockfileReadError> {
 		let string = fs::read_to_string(self.package_dir.join(LOCKFILE_FILE_NAME)).await?;
 		Ok(toml::from_str(&string)?)
 	}

 	/// Write the lockfile
+	#[instrument(skip(self, lockfile), level = "debug")]
 	pub async fn write_lockfile(
 		&self,
 		lockfile: Lockfile,
@@ -182,7 +190,8 @@ impl Project {
 	}

 	/// Get the workspace members
-	pub async fn workspace_members<P: AsRef<Path>>(
+	#[instrument(skip(self), level = "debug")]
+	pub async fn workspace_members<P: AsRef<Path> + Debug>(
 		&self,
 		dir: P,
 		can_ref_self: bool,
@@ -222,7 +231,16 @@ impl Project {
 }

 /// Gets all matching paths in a directory
-pub async fn matching_globs_old_behaviour<'a, P: AsRef<Path>, I: IntoIterator<Item = &'a str>>(
+#[deprecated(
+	since = "0.5.0-rc.13",
+	note = "use `matching_globs` instead, which does not have the old behaviour of including whole directories by their name (`src` instead of `src/**`)"
+)]
+#[instrument(ret, level = "trace")]
+pub async fn matching_globs_old_behaviour<
+	'a,
+	P: AsRef<Path> + Debug,
+	I: IntoIterator<Item = &'a str> + Debug,
+>(
 	dir: P,
 	globs: I,
 	relative: bool,
@@ -270,7 +288,7 @@ pub async fn matching_globs_old_behaviour<'a, P: AsRef<Path>, I: IntoIterator<It
 					is_entire_dir_included || is_filename_match,
 				));
 				if is_filename_match {
-					log::warn!("directory name usage found for {}. this is deprecated and will be removed in the future", path.display());
+					tracing::warn!("directory name usage found for {}. this is deprecated and will be removed in the future", path.display());
 				}
 			}

@@ -293,7 +311,8 @@ pub async fn matching_globs_old_behaviour<'a, P: AsRef<Path>, I: IntoIterator<It
 }

 /// Gets all matching paths in a directory
-pub async fn matching_globs<'a, P: AsRef<Path>, I: IntoIterator<Item = &'a str>>(
+#[instrument(ret, level = "trace")]
+pub async fn matching_globs<'a, P: AsRef<Path> + Debug, I: IntoIterator<Item = &'a str> + Debug>(
 	dir: P,
 	globs: I,
 	relative: bool,

View file

@@ -117,10 +117,10 @@ pub fn get_lib_require_path(
 ) -> Result<String, errors::GetLibRequirePath> {
 	let path = pathdiff::diff_paths(destination_dir, base_dir).unwrap();
 	let path = if use_new_structure {
-		log::debug!("using new structure for require path with {:?}", lib_file);
+		tracing::debug!("using new structure for require path with {lib_file:?}");
 		lib_file.to_path(path)
 	} else {
-		log::debug!("using old structure for require path with {:?}", lib_file);
+		tracing::debug!("using old structure for require path with {lib_file:?}");
 		path
 	};

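`get_lib_require_path` builds require paths from `pathdiff::diff_paths(destination_dir, base_dir)`, i.e. the destination expressed relative to the base. A narrow std-only sketch of what that computes in the common ancestor case (`pathdiff` also handles `..` traversal between sibling directories, omitted here):

```rust
use std::path::{Path, PathBuf};

/// A narrow stand-in for `pathdiff::diff_paths`: when `base` is an
/// ancestor of `destination`, the relative path is a plain
/// strip_prefix. The real crate also produces `..` components when
/// the paths only share a common ancestor.
fn diff_paths(destination: &Path, base: &Path) -> Option<PathBuf> {
    destination.strip_prefix(base).ok().map(Path::to_path_buf)
}

fn main() {
    let base = Path::new("/project/packages");
    let dest = Path::new("/project/packages/dep/lib");
    assert_eq!(diff_paths(dest, base), Some(PathBuf::from("dep/lib")));

    // not an ancestor: this sketch gives up where pathdiff would emit `..`
    assert_eq!(diff_paths(Path::new("/other"), base), None);
}
```

The "new structure" branch then joins `lib_file` onto that relative path, while the old structure uses the directory diff as-is.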
View file

@ -20,6 +20,7 @@ use std::{
sync::Arc, sync::Arc,
}; };
use tokio::task::spawn_blocking; use tokio::task::spawn_blocking;
use tracing::{instrument, Instrument};
/// Generates linking modules for a project /// Generates linking modules for a project
pub mod generator; pub mod generator;
@ -44,6 +45,7 @@ async fn write_cas(destination: PathBuf, cas_dir: &Path, contents: &str) -> std:
impl Project { impl Project {
/// Links the dependencies of the project /// Links the dependencies of the project
#[instrument(skip(self, graph), level = "debug")]
pub async fn link_dependencies( pub async fn link_dependencies(
&self, &self,
graph: &DownloadedGraph, graph: &DownloadedGraph,
@ -55,7 +57,7 @@ impl Project {
// step 1. link all non-wally packages (and their dependencies) temporarily without types // step 1. link all non-wally packages (and their dependencies) temporarily without types
         // we do this separately to allow the required tools for the scripts to be installed
-        self.link(graph, &manifest, &Arc::new(Default::default()))
+        self.link(graph, &manifest, &Arc::new(Default::default()), false)
             .await?;
         if !with_types {
@@ -110,7 +112,7 @@ impl Project {
                             }
                         };
-                        log::debug!("{name}@{version_id} has {} exported types", types.len());
+                        tracing::debug!("contains {} exported types", types.len());
                         types
                     } else {
@@ -122,8 +124,8 @@ impl Project {
                         .and_then(|t| t.build_files())
                     {
                         let Some(script_path) = roblox_sync_config_gen_script else {
-                            log::warn!("not having a `{}` script in the manifest might cause issues with Roblox linking", ScriptName::RobloxSyncConfigGenerator);
-                            return Ok((version_id, vec![]));
+                            tracing::warn!("not having a `{}` script in the manifest might cause issues with Roblox linking", ScriptName::RobloxSyncConfigGenerator);
+                            return Ok((version_id, types));
                         };
                         execute_script(
@@ -143,7 +145,7 @@ impl Project {
                     }
                     Ok((version_id, types))
-                }))
+                }.instrument(tracing::debug_span!("extract types", name = name.to_string(), version_id = version_id.to_string()))))
                 .await?
                 .into_iter()
                 .collect::<HashMap<_, _>>(),
@@ -154,7 +156,8 @@ impl Project {
             .collect::<HashMap<_, _>>();
         // step 3. link all packages (and their dependencies), this time with types
-        self.link(graph, &manifest, &Arc::new(package_types)).await
+        self.link(graph, &manifest, &Arc::new(package_types), true)
+            .await
     }
     #[allow(clippy::too_many_arguments)]
@@ -243,6 +246,7 @@ impl Project {
         graph: &DownloadedGraph,
         manifest: &Arc<Manifest>,
         package_types: &Arc<HashMap<&PackageNames, HashMap<&VersionId, Vec<String>>>>,
+        is_complete: bool,
     ) -> Result<(), errors::LinkingError> {
         try_join_all(graph.iter().flat_map(|(name, versions)| {
             versions.iter().map(|(version_id, node)| {
@@ -250,6 +254,12 @@ impl Project {
                 let manifest = manifest.clone();
                 let package_types = package_types.clone();
+                let span = tracing::info_span!(
+                    "link",
+                    name = name.to_string(),
+                    version_id = version_id.to_string()
+                );
                 async move {
                     let (node_container_folder, node_packages_folder) = {
                         let base_folder = create_and_canonicalize(
@@ -291,10 +301,14 @@ impl Project {
                             .get(dependency_name)
                             .and_then(|v| v.get(dependency_version_id))
                         else {
-                            return Err(errors::LinkingError::DependencyNotFound(
-                                dependency_name.to_string(),
-                                dependency_version_id.to_string(),
-                            ));
+                            if is_complete {
+                                return Err(errors::LinkingError::DependencyNotFound(
+                                    format!("{dependency_name}@{dependency_version_id}"),
+                                    format!("{name}@{version_id}"),
+                                ));
+                            }
+                            continue;
                         };
                         let base_folder = create_and_canonicalize(
@@ -338,6 +352,7 @@ impl Project {
                     Ok(())
                 }
+                .instrument(span)
             })
         }))
         .await
@@ -362,7 +377,7 @@ pub mod errors {
         Io(#[from] std::io::Error),
         /// A dependency was not found
-        #[error("dependency not found: {0}@{1}")]
+        #[error("dependency `{0}` of `{1}` not found")]
         DependencyNotFound(String, String),
         /// The library file was not found
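The two-pass linking behavior introduced by the `is_complete` flag can be sketched with a std-only stand-in (the `link_dependency` function and `String`-keyed map here are hypothetical simplifications, not pesde's actual types): the first pass tolerates a dependency whose types are not yet known, while the second pass treats a missing entry as a hard error.

```rust
use std::collections::HashMap;

// Simplified stand-in for the dependency lookup inside `link`:
// - is_complete = false: missing entries are skipped (types come in pass 2)
// - is_complete = true: a missing entry is an error
fn link_dependency(
    package_types: &HashMap<String, Vec<String>>,
    dependency: &str,
    is_complete: bool,
) -> Result<Option<Vec<String>>, String> {
    match package_types.get(dependency) {
        Some(types) => Ok(Some(types.clone())),
        None if is_complete => Err(format!("dependency `{dependency}` not found")),
        None => Ok(None), // first pass: silently skip
    }
}

fn main() {
    let mut types = HashMap::new();
    types.insert("scope/pkg".to_string(), vec!["Foo".to_string()]);

    // the first pass tolerates the missing dependency...
    assert_eq!(link_dependency(&types, "scope/missing", false), Ok(None));
    // ...the second (complete) pass does not
    assert!(link_dependency(&types, "scope/missing", true).is_err());
    println!("ok");
}
```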


@@ -14,7 +14,7 @@ use relative_path::RelativePathBuf;
 use semver::Version;
 use serde::{Deserialize, Serialize};
 use std::{
-    collections::{btree_map::Entry, BTreeMap},
+    collections::BTreeMap,
     path::{Path, PathBuf},
 };
@@ -32,6 +32,9 @@ pub struct DependencyGraphNode {
     pub dependencies: BTreeMap<PackageNames, (VersionId, String)>,
     /// The resolved (transformed, for example Peer -> Standard) type of the dependency
     pub resolved_ty: DependencyType,
+    /// Whether the resolved type should be Peer if this isn't depended on
+    #[serde(default, skip_serializing_if = "std::ops::Not::not")]
+    pub is_peer: bool,
     /// The package reference
     pub pkg_ref: PackageRefs,
 }
@@ -74,45 +77,6 @@ impl DependencyGraphNode {
 /// A graph of `DependencyGraphNode`s
 pub type DependencyGraph = Graph<DependencyGraphNode>;
-pub(crate) fn insert_node(
-    graph: &mut DependencyGraph,
-    name: PackageNames,
-    version: VersionId,
-    mut node: DependencyGraphNode,
-    is_top_level: bool,
-) {
-    if !is_top_level && node.direct.take().is_some() {
-        log::debug!(
-            "tried to insert {name}@{version} as direct dependency from a non top-level context",
-        );
-    }
-
-    match graph
-        .entry(name.clone())
-        .or_default()
-        .entry(version.clone())
-    {
-        Entry::Vacant(entry) => {
-            entry.insert(node);
-        }
-        Entry::Occupied(existing) => {
-            let current_node = existing.into_mut();
-            match (&current_node.direct, &node.direct) {
-                (Some(_), Some(_)) => {
-                    log::warn!("duplicate direct dependency for {name}@{version}");
-                }
-                (None, Some(_)) => {
-                    current_node.direct = node.direct;
-                }
-                (_, _) => {}
-            }
-        }
-    }
-}
 /// A downloaded dependency graph node, i.e. a `DependencyGraphNode` with a `Target`
 #[derive(Serialize, Deserialize, Debug, Clone)]
 pub struct DownloadedDependencyGraphNode {


@@ -1,17 +1,20 @@
 #[cfg(feature = "version-management")]
-use crate::cli::version::{check_for_updates, get_or_download_version};
+use crate::cli::version::{check_for_updates, get_or_download_version, TagInfo};
 use crate::cli::{auth::get_tokens, display_err, home_dir, HOME_DIR};
 use anyhow::Context;
 use clap::{builder::styling::AnsiColor, Parser};
 use fs_err::tokio as fs;
-use indicatif::MultiProgress;
-use indicatif_log_bridge::LogWrapper;
 use pesde::{matching_globs, AuthConfig, Project, MANIFEST_FILE_NAME};
 use std::{
     collections::HashSet,
     path::{Path, PathBuf},
 };
 use tempfile::NamedTempFile;
+use tracing::instrument;
+use tracing_indicatif::{filter::IndicatifFilter, IndicatifLayer};
+use tracing_subscriber::{
+    filter::LevelFilter, layer::SubscriberExt, util::SubscriberInitExt, EnvFilter, Layer,
+};
 mod cli;
 pub mod util;
@@ -38,6 +41,7 @@ struct Cli {
     subcommand: cli::commands::Subcommand,
 }
+#[instrument(level = "trace")]
 async fn get_linkable_dir(path: &Path) -> PathBuf {
     let mut curr_path = PathBuf::new();
     let file_to_try = NamedTempFile::new_in(path).expect("failed to create temporary file");
@@ -68,7 +72,7 @@ async fn get_linkable_dir(path: &Path) -> PathBuf {
     if fs::hard_link(file_to_try.path(), &try_path).await.is_ok() {
         if let Err(err) = fs::remove_file(&try_path).await {
-            log::warn!(
+            tracing::warn!(
                 "failed to remove temporary file at {}: {err}",
                 try_path.display()
             );
@@ -129,6 +133,39 @@ async fn run() -> anyhow::Result<()> {
         std::process::exit(status.code().unwrap());
     }
+    let indicatif_layer = IndicatifLayer::new().with_filter(IndicatifFilter::new(false));
+    let tracing_env_filter = EnvFilter::builder()
+        .with_default_directive(LevelFilter::INFO.into())
+        .from_env_lossy()
+        .add_directive("reqwest=info".parse().unwrap())
+        .add_directive("rustls=info".parse().unwrap())
+        .add_directive("tokio_util=info".parse().unwrap())
+        .add_directive("goblin=info".parse().unwrap())
+        .add_directive("tower=info".parse().unwrap())
+        .add_directive("hyper=info".parse().unwrap())
+        .add_directive("h2=info".parse().unwrap());
+    let fmt_layer =
+        tracing_subscriber::fmt::layer().with_writer(indicatif_layer.inner().get_stderr_writer());
+    #[cfg(debug_assertions)]
+    let fmt_layer = fmt_layer.with_timer(tracing_subscriber::fmt::time::uptime());
+    #[cfg(not(debug_assertions))]
+    let fmt_layer = fmt_layer
+        .pretty()
+        .with_timer(())
+        .with_line_number(false)
+        .with_file(false)
+        .with_target(false);
+    tracing_subscriber::registry()
+        .with(tracing_env_filter)
+        .with(fmt_layer)
+        .with(indicatif_layer)
+        .init();
     let (project_root_dir, project_workspace_dir) = 'finder: {
         let mut current_path = Some(cwd.clone());
         let mut project_root = None::<PathBuf>;
@@ -191,16 +228,13 @@ async fn run() -> anyhow::Result<()> {
         (project_root.unwrap_or_else(|| cwd.clone()), workspace_dir)
     };
-    let multi = {
-        let logger = pretty_env_logger::formatted_builder()
-            .parse_env(pretty_env_logger::env_logger::Env::default().default_filter_or("info"))
-            .build();
-        let multi = MultiProgress::new();
-        LogWrapper::new(multi.clone(), logger).try_init().unwrap();
-        multi
-    };
+    tracing::trace!(
+        "project root: {}\nworkspace root: {}",
+        project_root_dir.display(),
+        project_workspace_dir
+            .as_ref()
+            .map_or("none".to_string(), |p| p.display().to_string())
+    );
     let home_dir = home_dir()?;
     let data_dir = home_dir.join("data");
@@ -217,7 +251,7 @@ async fn run() -> anyhow::Result<()> {
     }
     .join("cas");
-    log::debug!("using cas dir in {}", cas_dir.display());
+    tracing::debug!("using cas dir in {}", cas_dir.display());
     let project = Project::new(
         project_root_dir,
@@ -256,7 +290,7 @@ async fn run() -> anyhow::Result<()> {
         .and_then(|manifest| manifest.pesde_version);
     let exe_path = if let Some(version) = target_version {
-        get_or_download_version(&reqwest, &version, false).await?
+        get_or_download_version(&reqwest, &TagInfo::Incomplete(version), false).await?
     } else {
         None
     };
@@ -278,7 +312,7 @@ async fn run() -> anyhow::Result<()> {
     let cli = Cli::parse();
-    cli.subcommand.run(project, multi, reqwest).await
+    cli.subcommand.run(project, reqwest).await
 }
 #[tokio::main]


@@ -1,13 +1,13 @@
-use relative_path::RelativePathBuf;
-use semver::Version;
-use serde::{Deserialize, Serialize};
-use std::collections::{BTreeMap, HashMap};
 use crate::{
     manifest::{overrides::OverrideKey, target::Target},
     names::PackageName,
     source::specifiers::DependencySpecifiers,
 };
+use relative_path::RelativePathBuf;
+use semver::Version;
+use serde::{Deserialize, Serialize};
+use std::collections::{BTreeMap, HashMap};
+use tracing::instrument;
 /// Overrides
 pub mod overrides;
@@ -107,6 +107,7 @@ pub enum DependencyType {
 impl Manifest {
     /// Get all dependencies from the manifest
+    #[instrument(skip(self), ret(level = "trace"), level = "debug")]
     pub fn all_dependencies(
         &self,
     ) -> Result<


@@ -35,8 +35,16 @@ impl FromStr for PackageName {
             .ok_or(Self::Err::InvalidFormat(s.to_string()))?;
         for (reason, part) in [(ErrorReason::Scope, scope), (ErrorReason::Name, name)] {
-            if part.len() < 3 || part.len() > 32 {
-                return Err(Self::Err::InvalidLength(reason, part.to_string()));
+            let min_len = match reason {
+                ErrorReason::Scope => 3,
+                ErrorReason::Name => 1,
+            };
+            if !(min_len..=32).contains(&part.len()) {
+                return Err(match reason {
+                    ErrorReason::Scope => Self::Err::InvalidScopeLength(part.to_string()),
+                    ErrorReason::Name => Self::Err::InvalidNameLength(part.to_string()),
+                });
             }
             if part.chars().all(|c| c.is_ascii_digit()) {
@@ -231,9 +239,13 @@ pub mod errors {
         #[error("package {0} `{1}` starts or ends with an underscore")]
         PrePostfixUnderscore(ErrorReason, String),
-        /// The package name is not within 3-32 characters long
-        #[error("package {0} `{1}` is not within 3-32 characters long")]
-        InvalidLength(ErrorReason, String),
+        /// The package name's scope part is not within 3-32 characters long
+        #[error("package scope `{0}` is not within 3-32 characters long")]
+        InvalidScopeLength(String),
+        /// The package name's name part is not within 1-32 characters long
+        #[error("package name `{0}` is not within 1-32 characters long")]
+        InvalidNameLength(String),
     }
 /// Errors that can occur when working with Wally package names
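The new per-part length rule (scope 3-32 characters, name 1-32) can be sketched in isolation; `Reason` and `check_part` below are illustrative stand-ins, not pesde's actual `ErrorReason` or error types:

```rust
// Illustrative stand-in for the length validation in `PackageName::from_str`:
// the minimum length now depends on which part is being checked.
#[derive(Debug, Clone, Copy)]
enum Reason {
    Scope,
    Name,
}

fn check_part(reason: Reason, part: &str) -> Result<(), String> {
    let min_len = match reason {
        Reason::Scope => 3,
        Reason::Name => 1,
    };
    // same range check as the diff: min_len..=32, inclusive
    if !(min_len..=32).contains(&part.len()) {
        return Err(format!("{reason:?} `{part}` has invalid length"));
    }
    Ok(())
}

fn main() {
    assert!(check_part(Reason::Scope, "ab").is_err()); // scope still needs 3+
    assert!(check_part(Reason::Name, "a").is_ok()); // 1-char names now pass
    assert!(check_part(Reason::Name, &"a".repeat(33)).is_err()); // over 32
    println!("ok");
}
```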


@@ -3,6 +3,7 @@ use fs_err::tokio as fs;
 use git2::{ApplyLocation, Diff, DiffFormat, DiffLineType, Repository, Signature};
 use relative_path::RelativePathBuf;
 use std::path::Path;
+use tracing::instrument;
 /// Set up a git repository for patches
 pub fn setup_patches_repo<P: AsRef<Path>>(dir: P) -> Result<Repository, git2::Error> {
@@ -69,6 +70,7 @@ pub fn create_patch<P: AsRef<Path>>(dir: P) -> Result<Vec<u8>, git2::Error> {
 impl Project {
     /// Apply patches to the project's dependencies
+    #[instrument(skip(self, graph), level = "debug")]
     pub async fn apply_patches(
         &self,
         graph: &DownloadedGraph,
@@ -97,7 +99,7 @@ impl Project {
                 .get(&name)
                 .and_then(|versions| versions.get(&version_id))
             else {
-                log::warn!(
+                tracing::warn!(
                     "patch for {name}@{version_id} not applied because it is not in the graph"
                 );
                 tx.send(Ok(format!("{name}@{version_id}"))).await.unwrap();
@@ -114,7 +116,7 @@ impl Project {
             );
             tokio::spawn(async move {
-                log::debug!("applying patch to {name}@{version_id}");
+                tracing::debug!("applying patch to {name}@{version_id}");
                 let patch = match fs::read(&patch_path).await {
                     Ok(patch) => patch,
@@ -195,7 +197,9 @@ impl Project {
                     }
                 }
-                log::debug!("patch applied to {name}@{version_id}, removing .git directory");
+                tracing::debug!(
+                    "patch applied to {name}@{version_id}, removing .git directory"
+                );
                 if let Err(e) = fs::remove_dir_all(container_folder.join(".git")).await {
                     tx.send(Err(errors::ApplyPatchesError::DotGitRemove(e)))


@@ -1,5 +1,5 @@
 use crate::{
-    lockfile::{insert_node, DependencyGraph, DependencyGraphNode},
+    lockfile::{DependencyGraph, DependencyGraphNode},
     manifest::DependencyType,
     names::PackageNames,
     source::{
@@ -11,10 +11,55 @@ use crate::{
     },
     Project, DEFAULT_INDEX_NAME,
 };
-use std::collections::{HashMap, HashSet, VecDeque};
+use std::collections::{btree_map::Entry, HashMap, HashSet, VecDeque};
+use tracing::{instrument, Instrument};
+
+fn insert_node(
+    graph: &mut DependencyGraph,
+    name: PackageNames,
+    version: VersionId,
+    mut node: DependencyGraphNode,
+    is_top_level: bool,
+) {
+    if !is_top_level && node.direct.take().is_some() {
+        tracing::debug!(
+            "tried to insert {name}@{version} as direct dependency from a non top-level context",
+        );
+    }
+
+    match graph
+        .entry(name.clone())
+        .or_default()
+        .entry(version.clone())
+    {
+        Entry::Vacant(entry) => {
+            entry.insert(node);
+        }
+        Entry::Occupied(existing) => {
+            let current_node = existing.into_mut();
+            match (&current_node.direct, &node.direct) {
+                (Some(_), Some(_)) => {
+                    tracing::warn!("duplicate direct dependency for {name}@{version}");
+                }
+                (None, Some(_)) => {
+                    current_node.direct = node.direct;
+                }
+                (_, _) => {}
+            }
+        }
+    }
+}
 impl Project {
     /// Create a dependency graph from the project's manifest
+    #[instrument(
+        skip(self, previous_graph, refreshed_sources),
+        ret(level = "trace"),
+        level = "debug"
+    )]
     pub async fn dependency_graph(
         &self,
         previous_graph: Option<&DependencyGraph>,
@@ -39,7 +84,7 @@ impl Project {
         if let Some(previous_graph) = previous_graph {
             for (name, versions) in previous_graph {
                 for (version, node) in versions {
-                    let Some((_, specifier, source_ty)) = &node.direct else {
+                    let Some((old_alias, specifier, source_ty)) = &node.direct else {
                         // this is not a direct dependency, will be added if it's still being used later
                         continue;
                     };
@@ -51,13 +96,16 @@ impl Project {
                     let Some(alias) = all_specifiers.remove(&(specifier.clone(), *source_ty))
                     else {
-                        log::debug!(
-                            "dependency {name}@{version} from old dependency graph is no longer in the manifest",
+                        tracing::debug!(
+                            "dependency {name}@{version} (old alias {old_alias}) from old dependency graph is no longer in the manifest",
                         );
                         continue;
                     };
-                    log::debug!("resolved {}@{} from old dependency graph", name, version);
+                    let span = tracing::info_span!("resolve from old graph", alias);
+                    let _guard = span.enter();
+
+                    tracing::debug!("resolved {}@{} from old dependency graph", name, version);
                     insert_node(
                         &mut graph,
                         name.clone(),
@@ -72,22 +120,24 @@ impl Project {
                     let mut queue = node
                         .dependencies
                         .iter()
-                        .map(|(name, (version, _))| (name, version, 0usize))
+                        .map(|(name, (version, dep_alias))| {
+                            (
+                                name,
+                                version,
+                                vec![alias.to_string(), dep_alias.to_string()],
+                            )
+                        })
                         .collect::<VecDeque<_>>();
-                    while let Some((dep_name, dep_version, depth)) = queue.pop_front() {
+                    while let Some((dep_name, dep_version, path)) = queue.pop_front() {
+                        let inner_span =
+                            tracing::info_span!("resolve dependency", path = path.join(">"));
+                        let _inner_guard = inner_span.enter();
                         if let Some(dep_node) = previous_graph
                             .get(dep_name)
                             .and_then(|v| v.get(dep_version))
                         {
-                            log::debug!(
-                                "{}resolved dependency {}@{} from {}@{}",
-                                "\t".repeat(depth),
-                                dep_name,
-                                dep_version,
-                                name,
-                                version
-                            );
+                            tracing::debug!("resolved sub-dependency {dep_name}@{dep_version}");
                             insert_node(
                                 &mut graph,
                                 dep_name.clone(),
@@ -99,15 +149,20 @@ impl Project {
                             dep_node
                                 .dependencies
                                 .iter()
-                                .map(|(name, (version, _))| (name, version, depth + 1))
+                                .map(|(name, (version, alias))| {
+                                    (
+                                        name,
+                                        version,
+                                        path.iter()
+                                            .cloned()
+                                            .chain(std::iter::once(alias.to_string()))
+                                            .collect(),
+                                    )
+                                })
                                 .for_each(|dep| queue.push_back(dep));
                         } else {
-                            log::warn!(
-                                "dependency {}@{} from {}@{} not found in previous graph",
-                                dep_name,
-                                dep_version,
-                                name,
-                                version
-                            );
+                            tracing::warn!(
+                                "dependency {dep_name}@{dep_version} not found in previous graph"
+                            );
                         }
                     }
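The merge behavior of `insert_node` above (a later insertion may contribute the `direct` marker, but a transitive insertion never overwrites an existing one) can be illustrated with a std-only miniature; the `Node` struct and string keys here are stand-ins for pesde's `DependencyGraphNode` and package-name/version keys:

```rust
use std::collections::{btree_map::Entry, BTreeMap};

// Miniature of `insert_node`: keep the first inserted node, but let a later
// insertion fill in a missing `direct` (top-level alias) marker.
#[derive(Debug, Clone)]
struct Node {
    direct: Option<String>,
}

fn insert_node(graph: &mut BTreeMap<String, Node>, name: &str, node: Node) {
    match graph.entry(name.to_string()) {
        Entry::Vacant(entry) => {
            entry.insert(node);
        }
        Entry::Occupied(existing) => {
            let current = existing.into_mut();
            // (None, Some(_)) is the only case that mutates the existing node
            if current.direct.is_none() && node.direct.is_some() {
                current.direct = node.direct;
            }
        }
    }
}

fn main() {
    let mut graph = BTreeMap::new();
    // first seen as a transitive dependency...
    insert_node(&mut graph, "scope/pkg", Node { direct: None });
    // ...then again as a top-level one: the alias is adopted
    insert_node(&mut graph, "scope/pkg", Node { direct: Some("alias".into()) });
    assert_eq!(graph["scope/pkg"].direct.as_deref(), Some("alias"));
    println!("ok");
}
```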
@@ -130,223 +185,232 @@ impl Project {
.collect::<VecDeque<_>>(); .collect::<VecDeque<_>>();
while let Some((specifier, ty, dependant, path, overridden, target)) = queue.pop_front() { while let Some((specifier, ty, dependant, path, overridden, target)) = queue.pop_front() {
let alias = path.last().unwrap().clone(); async {
let depth = path.len() - 1; let alias = path.last().unwrap().clone();
let depth = path.len() - 1;
log::debug!( tracing::debug!("resolving {specifier} ({ty:?})");
"{}resolving {specifier} from {}", let source = match &specifier {
"\t".repeat(depth), DependencySpecifiers::Pesde(specifier) => {
path.join(">") let index_url = if !is_published_package && (depth == 0 || overridden) {
); let index_name = specifier.index.as_deref().unwrap_or(DEFAULT_INDEX_NAME);
let source = match &specifier {
DependencySpecifiers::Pesde(specifier) => {
let index_url = if !is_published_package && (depth == 0 || overridden) {
let index_name = specifier.index.as_deref().unwrap_or(DEFAULT_INDEX_NAME);
manifest manifest
.indices .indices
.get(index_name) .get(index_name)
.ok_or(errors::DependencyGraphError::IndexNotFound( .ok_or(errors::DependencyGraphError::IndexNotFound(
index_name.to_string(), index_name.to_string(),
))? ))?
.clone() .clone()
} else {
let index_url = specifier.index.clone().unwrap();
index_url
.clone()
.try_into()
// specifiers in indices store the index url in this field
.unwrap()
};
PackageSources::Pesde(PesdePackageSource::new(index_url))
}
#[cfg(feature = "wally-compat")]
DependencySpecifiers::Wally(specifier) => {
let index_url = if !is_published_package && (depth == 0 || overridden) {
let index_name = specifier.index.as_deref().unwrap_or(DEFAULT_INDEX_NAME);
manifest
.wally_indices
.get(index_name)
.ok_or(errors::DependencyGraphError::WallyIndexNotFound(
index_name.to_string(),
))?
.clone()
} else {
let index_url = specifier.index.clone().unwrap();
index_url
.clone()
.try_into()
// specifiers in indices store the index url in this field
.unwrap()
};
PackageSources::Wally(crate::source::wally::WallyPackageSource::new(index_url))
}
DependencySpecifiers::Git(specifier) => PackageSources::Git(
crate::source::git::GitPackageSource::new(specifier.repo.clone()),
),
DependencySpecifiers::Workspace(_) => {
PackageSources::Workspace(crate::source::workspace::WorkspacePackageSource)
}
};
if refreshed_sources.insert(source.clone()) {
source.refresh(self).await.map_err(|e| Box::new(e.into()))?;
}
let (name, resolved) = source
.resolve(&specifier, self, target, refreshed_sources)
.await
.map_err(|e| Box::new(e.into()))?;
let Some(target_version_id) = graph
.get(&name)
.and_then(|versions| {
versions
.keys()
// only consider versions that are compatible with the specifier
.filter(|ver| resolved.contains_key(ver))
.max()
})
.or_else(|| resolved.last_key_value().map(|(ver, _)| ver))
.cloned()
else {
return Err(Box::new(errors::DependencyGraphError::NoMatchingVersion(
format!("{specifier} ({target})"),
)));
};
let resolved_ty = if (is_published_package || depth == 0) && ty == DependencyType::Peer
{
DependencyType::Standard
} else {
ty
};
if let Some((dependant_name, dependant_version_id)) = dependant {
graph
.get_mut(&dependant_name)
.and_then(|versions| versions.get_mut(&dependant_version_id))
.and_then(|node| {
node.dependencies
.insert(name.clone(), (target_version_id.clone(), alias.clone()))
});
}
let pkg_ref = &resolved[&target_version_id];
if let Some(already_resolved) = graph
.get_mut(&name)
.and_then(|versions| versions.get_mut(&target_version_id))
{
tracing::debug!(
"{}@{} already resolved",
name,
target_version_id
);
if std::mem::discriminant(&already_resolved.pkg_ref)
!= std::mem::discriminant(pkg_ref)
{
tracing::warn!(
"resolved package {name}@{target_version_id} has a different source than previously resolved one, this may cause issues",
);
}
if already_resolved.resolved_ty == DependencyType::Peer {
already_resolved.resolved_ty = resolved_ty;
}
if ty == DependencyType::Peer && depth == 0 {
already_resolved.is_peer = true;
}
if already_resolved.direct.is_none() && depth == 0 {
already_resolved.direct = Some((alias.clone(), specifier.clone(), ty));
}
return Ok(());
}
let node = DependencyGraphNode {
direct: if depth == 0 {
Some((alias.clone(), specifier.clone(), ty))
} else { } else {
let index_url = specifier.index.clone().unwrap(); None
},
index_url pkg_ref: pkg_ref.clone(),
.clone() dependencies: Default::default(),
.try_into() resolved_ty,
// specifiers in indices store the index url in this field is_peer: if depth == 0 {
.unwrap() false
};
PackageSources::Pesde(PesdePackageSource::new(index_url))
}
#[cfg(feature = "wally-compat")]
DependencySpecifiers::Wally(specifier) => {
let index_url = if !is_published_package && (depth == 0 || overridden) {
let index_name = specifier.index.as_deref().unwrap_or(DEFAULT_INDEX_NAME);
manifest
.wally_indices
.get(index_name)
.ok_or(errors::DependencyGraphError::WallyIndexNotFound(
index_name.to_string(),
))?
.clone()
} else { } else {
let index_url = specifier.index.clone().unwrap(); ty == DependencyType::Peer
},
};
insert_node(
&mut graph,
name.clone(),
target_version_id.clone(),
node.clone(),
depth == 0,
);
index_url tracing::debug!(
.clone() "resolved {}@{} from new dependency graph",
.try_into()
// specifiers in indices store the index url in this field
.unwrap()
};
PackageSources::Wally(crate::source::wally::WallyPackageSource::new(index_url))
}
DependencySpecifiers::Git(specifier) => PackageSources::Git(
crate::source::git::GitPackageSource::new(specifier.repo.clone()),
),
DependencySpecifiers::Workspace(_) => {
PackageSources::Workspace(crate::source::workspace::WorkspacePackageSource)
}
};
if refreshed_sources.insert(source.clone()) {
source.refresh(self).await.map_err(|e| Box::new(e.into()))?;
}
let (name, resolved) = source
.resolve(&specifier, self, target, refreshed_sources)
.await
.map_err(|e| Box::new(e.into()))?;
let Some(target_version_id) = graph
.get(&name)
.and_then(|versions| {
versions
.keys()
// only consider versions that are compatible with the specifier
.filter(|ver| resolved.contains_key(ver))
.max()
})
.or_else(|| resolved.last_key_value().map(|(ver, _)| ver))
.cloned()
else {
return Err(Box::new(errors::DependencyGraphError::NoMatchingVersion(
format!("{specifier} ({target})"),
)));
};
let resolved_ty = if (is_published_package || depth == 0) && ty == DependencyType::Peer
{
DependencyType::Standard
} else {
ty
};
if let Some((dependant_name, dependant_version_id)) = dependant {
graph
.get_mut(&dependant_name)
.and_then(|versions| versions.get_mut(&dependant_version_id))
.and_then(|node| {
node.dependencies
.insert(name.clone(), (target_version_id.clone(), alias.clone()))
});
}
let pkg_ref = &resolved[&target_version_id];
if let Some(already_resolved) = graph
.get_mut(&name)
.and_then(|versions| versions.get_mut(&target_version_id))
{
log::debug!(
"{}{}@{} already resolved",
"\t".repeat(depth),
name, name,
target_version_id target_version_id
); );
if std::mem::discriminant(&already_resolved.pkg_ref) for (dependency_alias, (dependency_spec, dependency_ty)) in
!= std::mem::discriminant(pkg_ref) pkg_ref.dependencies().clone()
{ {
log::warn!( if dependency_ty == DependencyType::Dev {
"resolved package {name}@{target_version_id} has a different source than the previously resolved one at {}, this may cause issues", // dev dependencies of dependencies are to be ignored
path.join(">") continue;
); }
}
if already_resolved.resolved_ty == DependencyType::Peer let overridden = manifest.overrides.iter().find_map(|(key, spec)| {
&& resolved_ty == DependencyType::Standard key.0.iter().find_map(|override_path| {
{ // if the path up until the last element is the same as the current path,
already_resolved.resolved_ty = resolved_ty; // and the last element in the path is the dependency alias,
} // then the specifier is to be overridden
(path.len() == override_path.len() - 1
&& path == override_path[..override_path.len() - 1]
&& override_path.last() == Some(&dependency_alias))
.then_some(spec)
})
});
if already_resolved.direct.is_none() && depth == 0 { if overridden.is_some() {
already_resolved.direct = Some((alias.clone(), specifier.clone(), ty)); tracing::debug!(
} "overridden specifier found for {} ({dependency_spec})",
path.iter()
.map(|s| s.as_str())
.chain(std::iter::once(dependency_alias.as_str()))
.collect::<Vec<_>>()
.join(">"),
);
}
continue; queue.push_back((
} overridden.cloned().unwrap_or(dependency_spec),
dependency_ty,
let node = DependencyGraphNode { Some((name.clone(), target_version_id.clone())),
direct: if depth == 0 {
Some((alias.clone(), specifier.clone(), ty))
} else {
None
},
pkg_ref: pkg_ref.clone(),
dependencies: Default::default(),
resolved_ty,
};
insert_node(
&mut graph,
name.clone(),
target_version_id.clone(),
node.clone(),
depth == 0,
);
log::debug!(
"{}resolved {}@{} from new dependency graph",
"\t".repeat(depth),
name,
target_version_id
);
for (dependency_alias, (dependency_spec, dependency_ty)) in
pkg_ref.dependencies().clone()
{
if dependency_ty == DependencyType::Dev {
// dev dependencies of dependencies are to be ignored
continue;
}
let overridden = manifest.overrides.iter().find_map(|(key, spec)| {
key.0.iter().find_map(|override_path| {
// if the path up until the last element is the same as the current path,
// and the last element in the path is the dependency alias,
// then the specifier is to be overridden
(path.len() == override_path.len() - 1
&& path == override_path[..override_path.len() - 1]
&& override_path.last() == Some(&dependency_alias))
.then_some(spec)
})
});
if overridden.is_some() {
log::debug!(
"{}overridden specifier found for {} ({dependency_spec})",
"\t".repeat(depth),
path.iter() path.iter()
.map(|s| s.as_str()) .cloned()
.chain(std::iter::once(dependency_alias.as_str())) .chain(std::iter::once(dependency_alias))
.collect::<Vec<_>>() .collect(),
.join(">"), overridden.is_some(),
); *target_version_id.target(),
));
} }
queue.push_back(( Ok(())
overridden.cloned().unwrap_or(dependency_spec),
dependency_ty,
Some((name.clone(), target_version_id.clone())),
path.iter()
.cloned()
.chain(std::iter::once(dependency_alias))
.collect(),
overridden.is_some(),
*target_version_id.target(),
));
} }
.instrument(tracing::info_span!("resolve new/changed", path = path.join(">")))
.await?;
} }
-        for (name, versions) in &graph {
+        for (name, versions) in &mut graph {
             for (version_id, node) in versions {
+                if node.is_peer && node.direct.is_none() {
+                    node.resolved_ty = DependencyType::Peer;
+                }
                 if node.resolved_ty == DependencyType::Peer {
-                    log::warn!("peer dependency {name}@{version_id} was not resolved");
+                    tracing::warn!("peer dependency {name}@{version_id} was not resolved");
                 }
             }
         }
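The override-matching rule described in the resolver's comments (the key's path up to its last element must equal the current alias path, and the key's last element must be the dependency alias) is self-contained enough to sketch directly; `override_applies` is an illustrative helper, not pesde's actual API:

```rust
// Sketch of the override-key match: a key like ["app", "ui", "react"]
// applies when the current alias path is ["app", "ui"] and the dependency
// alias being resolved is "react".
fn override_applies(path: &[String], alias: &str, override_path: &[String]) -> bool {
    !override_path.is_empty()
        && path.len() == override_path.len() - 1
        && path == &override_path[..override_path.len() - 1]
        && override_path.last().map(String::as_str) == Some(alias)
}

fn main() {
    let path = vec!["app".to_string(), "ui".to_string()];
    let key = vec!["app".to_string(), "ui".to_string(), "react".to_string()];

    assert!(override_applies(&path, "react", &key));
    assert!(!override_applies(&path, "vue", &key)); // wrong alias
    assert!(!override_applies(&path[..1], "react", &key)); // wrong depth
    println!("ok");
}
```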


@@ -1,7 +1,7 @@
 use crate::Project;
 use std::{
     ffi::OsStr,
-    fmt::{Display, Formatter},
+    fmt::{Debug, Display, Formatter},
     path::Path,
     process::Stdio,
 };
@@ -9,6 +9,7 @@ use tokio::{
     io::{AsyncBufReadExt, BufReader},
     process::Command,
 };
+use tracing::instrument;
 /// Script names used by pesde
 #[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, PartialOrd, Ord)]
@@ -30,7 +31,8 @@ impl Display for ScriptName {
     }
 }
-pub(crate) async fn execute_script<A: IntoIterator<Item = S>, S: AsRef<OsStr>>(
+#[instrument(skip(project), level = "debug")]
+pub(crate) async fn execute_script<A: IntoIterator<Item = S> + Debug, S: AsRef<OsStr> + Debug>(
     script_name: ScriptName,
     script_path: &Path,
     args: A,
@@ -59,10 +61,10 @@ pub(crate) async fn execute_script<A: IntoIterator<Item = S>, S: AsRef<OsStr>>(
         while let Some(line) = stderr.next_line().await.transpose() {
             match line {
                 Ok(line) => {
-                    log::error!("[{script}]: {line}");
+                    tracing::error!("[{script}]: {line}");
                 }
                 Err(e) => {
-                    log::error!("ERROR IN READING STDERR OF {script}: {e}");
+                    tracing::error!("ERROR IN READING STDERR OF {script}: {e}");
                     break;
                 }
             }
@@ -78,11 +80,11 @@ pub(crate) async fn execute_script<A: IntoIterator<Item = S>, S: AsRef<OsStr>>(
                     stdout_str.push_str(&line);
                     stdout_str.push('\n');
                 } else {
-                    log::info!("[{script_2}]: {line}");
+                    tracing::info!("[{script_2}]: {line}");
                 }
             }
             Err(e) => {
-                log::error!("ERROR IN READING STDOUT OF {script_2}: {e}");
+                tracing::error!("ERROR IN READING STDOUT OF {script_2}: {e}");
                 break;
             }
         }
@@ -95,7 +97,7 @@ pub(crate) async fn execute_script<A: IntoIterator<Item = S>, S: AsRef<OsStr>>(
         }
     }
     Err(e) if e.kind() == std::io::ErrorKind::NotFound => {
-        log::warn!("Lune could not be found in PATH: {e}");
+        tracing::warn!("Lune could not be found in PATH: {e}");
         Ok(None)
     }

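The stderr branch of `execute_script` reads the child's output line by line and forwards each line through the logging macros with a `[{script}]` prefix. The same forwarding pattern can be sketched with std-only synchronous I/O (a `Cursor` stands in for the child's stderr and a `Vec` stands in for the log sink — both are assumptions for illustration, not pesde's actual types):

```rust
use std::io::{BufRead, Cursor};

// Forward every line of a reader, prefixed with the script name, mirroring
// the `while let Some(line) = stderr.next_line().await` loop above.
// On a read error, log it and stop — same as the `Err(e) => { ...; break }` arm.
fn forward_stderr<R: BufRead>(script: &str, stderr: R, sink: &mut Vec<String>) {
    for line in stderr.lines() {
        match line {
            Ok(line) => sink.push(format!("[{script}]: {line}")),
            Err(e) => {
                sink.push(format!("ERROR IN READING STDERR OF {script}: {e}"));
                break;
            }
        }
    }
}

fn main() {
    // "my_script" is a hypothetical script name for the demo.
    let mut sink = Vec::new();
    forward_stderr("my_script", Cursor::new("oops\nstill going\n"), &mut sink);
    assert_eq!(sink, ["[my_script]: oops", "[my_script]: still going"]);
}
```

The async version in the diff additionally interleaves a stdout loop, which is why the function clones the script name into `script_2`.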
View file

@@ -9,6 +9,7 @@ use serde::{Deserialize, Serialize};
 use sha2::{Digest, Sha256};
 use std::{
     collections::BTreeMap,
+    fmt::Debug,
     future::Future,
     path::{Path, PathBuf},
 };
@@ -17,6 +18,7 @@ use tokio::{
     io::{AsyncReadExt, AsyncWriteExt},
     pin,
 };
+use tracing::instrument;

 /// A file system entry
 #[derive(Debug, Clone, Serialize, Deserialize)]
@@ -125,7 +127,8 @@ pub(crate) async fn store_in_cas<
 impl PackageFS {
     /// Write the package to the given destination
-    pub async fn write_to<P: AsRef<Path>, Q: AsRef<Path>>(
+    #[instrument(skip(self), level = "debug")]
+    pub async fn write_to<P: AsRef<Path> + Debug, Q: AsRef<Path> + Debug>(
         &self,
         destination: P,
         cas_path: Q,
@@ -211,7 +214,8 @@ impl PackageFS {
     }

     /// Returns the contents of the file with the given hash
-    pub async fn read_file<P: AsRef<Path>, H: AsRef<str>>(
+    #[instrument(skip(self), ret(level = "trace"), level = "debug")]
+    pub async fn read_file<P: AsRef<Path> + Debug, H: AsRef<str> + Debug>(
         &self,
         file_hash: H,
         cas_path: P,
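`write_to` and `read_file` both resolve package files through the content-addressable store, which is why `read_file` takes a `cas_path` alongside the `file_hash`. The hash-to-path mapping is not visible in this hunk; a common CAS layout (assumed here purely for illustration, the real sharding may differ) splits off the first two hex characters as a directory:

```rust
use std::path::{Path, PathBuf};

// Hypothetical layout: `<cas>/<first two hex chars>/<rest of the hash>`.
// Sharding by prefix keeps any single directory from accumulating
// every stored blob. Panics if the hash is shorter than two characters.
fn cas_file_path(cas_path: &Path, file_hash: &str) -> PathBuf {
    let (prefix, rest) = file_hash.split_at(2);
    cas_path.join(prefix).join(rest)
}

fn main() {
    let path = cas_file_path(Path::new("cas"), "ab12cd34");
    assert_eq!(path, PathBuf::from("cas/ab/12cd34"));
}
```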

View file

@@ -27,6 +27,7 @@ use std::{
     sync::Arc,
 };
 use tokio::{sync::Mutex, task::spawn_blocking};
+use tracing::instrument;

 /// The Git package reference
 pub mod pkg_ref;
@@ -70,10 +71,12 @@ impl PackageSource for GitPackageSource {
     type ResolveError = errors::ResolveError;
     type DownloadError = errors::DownloadError;

+    #[instrument(skip_all, level = "debug")]
     async fn refresh(&self, project: &Project) -> Result<(), Self::RefreshError> {
         GitBasedSource::refresh(self, project).await
     }

+    #[instrument(skip_all, level = "debug")]
     async fn resolve(
         &self,
         specifier: &Self::Specifier,
@@ -329,6 +332,7 @@ impl PackageSource for GitPackageSource {
         ))
     }

+    #[instrument(skip_all, level = "debug")]
     async fn download(
         &self,
         pkg_ref: &Self::Ref,
@@ -343,7 +347,7 @@ impl PackageSource for GitPackageSource {
         match fs::read_to_string(&index_file).await {
             Ok(s) => {
-                log::debug!(
+                tracing::debug!(
                     "using cached index file for package {}#{}",
                     pkg_ref.repo,
                     pkg_ref.tree_id
@@ -487,7 +491,7 @@ impl PackageSource for GitPackageSource {
         }

         if pkg_ref.use_new_structure() && name == "default.project.json" {
-            log::debug!(
+            tracing::debug!(
                 "removing default.project.json from {}#{} at {path} - using new structure",
                 pkg_ref.repo,
                 pkg_ref.tree_id
View file

@@ -1,8 +1,11 @@
 #![allow(async_fn_in_trait)]

 use crate::{util::authenticate_conn, Project};
 use fs_err::tokio as fs;
 use gix::remote::Direction;
+use std::fmt::Debug;
 use tokio::task::spawn_blocking;
+use tracing::instrument;

 /// A trait for sources that are based on Git repositories
 pub trait GitBasedSource {
@@ -90,7 +93,11 @@ pub trait GitBasedSource {
 }

 /// Reads a file from a tree
-pub fn read_file<I: IntoIterator<Item = P> + Clone, P: ToString + PartialEq<gix::bstr::BStr>>(
+#[instrument(skip(tree), ret, level = "trace")]
+pub fn read_file<
+    I: IntoIterator<Item = P> + Clone + Debug,
+    P: ToString + PartialEq<gix::bstr::BStr>,
+>(
     tree: &gix::Tree,
     file_path: I,
 ) -> Result<Option<String>, errors::ReadFile> {
@@ -120,6 +127,7 @@ pub fn read_file<I: IntoIterator<Item = P> + Clone, P: ToString + PartialEq<gix:
 }

 /// Gets the root tree of a repository
+#[instrument(skip(repo), level = "trace")]
 pub fn root_tree(repo: &gix::Repository) -> Result<gix::Tree, errors::TreeError> {
     // this is a bare repo, so this is the actual path
     let path = repo.path().to_path_buf();

View file

@@ -30,6 +30,7 @@ use crate::{
 use fs_err::tokio as fs;
 use futures::StreamExt;
 use tokio::task::spawn_blocking;
+use tracing::instrument;

 /// The pesde package reference
 pub mod pkg_ref;
@@ -73,6 +74,7 @@ impl PesdePackageSource {
     }

     /// Reads the config file
+    #[instrument(skip_all, ret(level = "trace"), level = "debug")]
     pub async fn config(&self, project: &Project) -> Result<IndexConfig, errors::ConfigError> {
         let repo_url = self.repo_url.clone();
         let path = self.path(project);
@@ -99,10 +101,12 @@ impl PackageSource for PesdePackageSource {
     type ResolveError = errors::ResolveError;
     type DownloadError = errors::DownloadError;

+    #[instrument(skip_all, level = "debug")]
     async fn refresh(&self, project: &Project) -> Result<(), Self::RefreshError> {
         GitBasedSource::refresh(self, project).await
     }

+    #[instrument(skip_all, level = "debug")]
     async fn resolve(
         &self,
         specifier: &Self::Specifier,
@@ -124,10 +128,10 @@ impl PackageSource for PesdePackageSource {
             }
         };

-        let entries: IndexFile = toml::from_str(&string)
+        let IndexFile { entries, .. } = toml::from_str(&string)
             .map_err(|e| Self::ResolveError::Parse(specifier.name.to_string(), e))?;

-        log::debug!("{} has {} possible entries", specifier.name, entries.len());
+        tracing::debug!("{} has {} possible entries", specifier.name, entries.len());

         Ok((
             PackageNames::Pesde(specifier.name.clone()),
@@ -155,6 +159,7 @@ impl PackageSource for PesdePackageSource {
         ))
     }

+    #[instrument(skip_all, level = "debug")]
     async fn download(
         &self,
         pkg_ref: &Self::Ref,
@@ -171,7 +176,7 @@ impl PackageSource for PesdePackageSource {
         match fs::read_to_string(&index_file).await {
             Ok(s) => {
-                log::debug!(
+                tracing::debug!(
                     "using cached index file for package {}@{} {}",
                     pkg_ref.name,
                     pkg_ref.version,
@@ -192,7 +197,7 @@ impl PackageSource for PesdePackageSource {
         let mut request = reqwest.get(&url).header(ACCEPT, "application/octet-stream");

         if let Some(token) = project.auth_config.tokens().get(&self.repo_url) {
-            log::debug!("using token for {}", self.repo_url);
+            tracing::debug!("using token for {}", self.repo_url);
             request = request.header(AUTHORIZATION, token);
         }
@@ -427,8 +432,20 @@ pub struct IndexFileEntry {
     pub dependencies: BTreeMap<String, (DependencySpecifiers, DependencyType)>,
 }

+/// The package metadata in the index file
+#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq, Default)]
+pub struct IndexMetadata {}
+
 /// The index file for a package
-pub type IndexFile = BTreeMap<VersionId, IndexFileEntry>;
+#[derive(Debug, Serialize, Deserialize, Clone, PartialEq, Eq)]
+pub struct IndexFile {
+    /// Any package-wide metadata
+    #[serde(default, skip_serializing_if = "crate::util::is_default")]
+    pub meta: IndexMetadata,
+    /// The entries in the index file
+    #[serde(flatten)]
+    pub entries: BTreeMap<VersionId, IndexFileEntry>,
+}

 /// Errors that can occur when interacting with the pesde package source
 pub mod errors {
View file

@@ -11,6 +11,7 @@ use crate::{
     Project, LINK_LIB_NO_FILE_FOUND,
 };
 use fs_err::tokio as fs;
+use tracing::instrument;

 #[derive(Deserialize)]
 #[serde(rename_all = "camelCase")]
@@ -19,7 +20,8 @@ struct SourcemapNode {
     file_paths: Vec<RelativePathBuf>,
 }

-pub(crate) async fn find_lib_path(
+#[instrument(skip(project, package_dir), level = "debug")]
+async fn find_lib_path(
     project: &Project,
     package_dir: &Path,
 ) -> Result<Option<RelativePathBuf>, errors::FindLibPathError> {
@@ -29,7 +31,7 @@ pub(crate) async fn find_lib_path(
         .scripts
         .get(&ScriptName::SourcemapGenerator.to_string())
     else {
-        log::warn!("no sourcemap generator script found in manifest");
+        tracing::warn!("no sourcemap generator script found in manifest");
         return Ok(None);
     };
@@ -55,6 +57,7 @@ pub(crate) async fn find_lib_path(
 pub(crate) const WALLY_MANIFEST_FILE_NAME: &str = "wally.toml";

+#[instrument(skip(project, tempdir), level = "debug")]
 pub(crate) async fn get_target(
     project: &Project,
     tempdir: &TempDir,

View file

@@ -1,13 +1,13 @@
 use std::collections::BTreeMap;

-use semver::{Version, VersionReq};
-use serde::{Deserialize, Deserializer};
-
 use crate::{
     manifest::{errors, DependencyType},
     names::wally::WallyPackageName,
     source::{specifiers::DependencySpecifiers, wally::specifier::WallyDependencySpecifier},
 };
+use semver::{Version, VersionReq};
+use serde::{Deserialize, Deserializer};
+use tracing::instrument;

 #[derive(Deserialize, Clone, Debug)]
 #[serde(rename_all = "lowercase")]
@@ -63,6 +63,7 @@ pub struct WallyManifest {
 impl WallyManifest {
     /// Get all dependencies from the manifest
+    #[instrument(skip(self), ret(level = "trace"), level = "debug")]
     pub fn all_dependencies(
         &self,
     ) -> Result<

View file

@@ -30,6 +30,7 @@ use std::{
 use tempfile::tempdir;
 use tokio::{io::AsyncWriteExt, sync::Mutex, task::spawn_blocking};
 use tokio_util::compat::FuturesAsyncReadCompatExt;
+use tracing::instrument;

 pub(crate) mod compat_util;
 pub(crate) mod manifest;
@@ -68,6 +69,7 @@ impl WallyPackageSource {
     }

     /// Reads the config file
+    #[instrument(skip_all, ret(level = "trace"), level = "debug")]
     pub async fn config(&self, project: &Project) -> Result<WallyIndexConfig, errors::ConfigError> {
         let repo_url = self.repo_url.clone();
         let path = self.path(project);
@@ -94,10 +96,12 @@ impl PackageSource for WallyPackageSource {
     type ResolveError = errors::ResolveError;
     type DownloadError = errors::DownloadError;

+    #[instrument(skip_all, level = "debug")]
     async fn refresh(&self, project: &Project) -> Result<(), Self::RefreshError> {
         GitBasedSource::refresh(self, project).await
     }

+    #[instrument(skip_all, level = "debug")]
     async fn resolve(
         &self,
         specifier: &Self::Specifier,
@@ -111,7 +115,7 @@ impl PackageSource for WallyPackageSource {
         let string = match read_file(&tree, [scope, name]) {
             Ok(Some(s)) => s,
             Ok(None) => {
-                log::debug!(
+                tracing::debug!(
                     "{} not found in wally registry. searching in backup registries",
                     specifier.name
                 );
@@ -134,7 +138,7 @@ impl PackageSource for WallyPackageSource {
             .await
             {
                 Ok((name, results)) => {
-                    log::debug!("found {} in backup registry {registry}", name);
+                    tracing::debug!("found {} in backup registry {registry}", name);
                     return Ok((name, results));
                 }
                 Err(errors::ResolveError::NotFound(_)) => {
@@ -162,7 +166,7 @@ impl PackageSource for WallyPackageSource {
             .collect::<Result<_, _>>()
             .map_err(|e| Self::ResolveError::Parse(specifier.name.to_string(), e))?;

-        log::debug!("{} has {} possible entries", specifier.name, entries.len());
+        tracing::debug!("{} has {} possible entries", specifier.name, entries.len());

         Ok((
             PackageNames::Wally(specifier.name.clone()),
@@ -192,6 +196,7 @@ impl PackageSource for WallyPackageSource {
         ))
     }

+    #[instrument(skip_all, level = "debug")]
     async fn download(
         &self,
         pkg_ref: &Self::Ref,
@@ -207,7 +212,7 @@ impl PackageSource for WallyPackageSource {
         let tempdir = match fs::read_to_string(&index_file).await {
             Ok(s) => {
-                log::debug!(
+                tracing::debug!(
                     "using cached index file for package {}@{}",
                     pkg_ref.name,
                     pkg_ref.version
@@ -240,7 +245,7 @@ impl PackageSource for WallyPackageSource {
         );

         if let Some(token) = project.auth_config.tokens().get(&self.repo_url) {
-            log::debug!("using token for {}", self.repo_url);
+            tracing::debug!("using token for {}", self.repo_url);
             request = request.header(AUTHORIZATION, token);
         }
View file

@@ -13,6 +13,7 @@ use relative_path::RelativePathBuf;
 use reqwest::Client;
 use std::collections::{BTreeMap, HashSet};
 use tokio::pin;
+use tracing::instrument;

 /// The workspace package reference
 pub mod pkg_ref;
@@ -35,6 +36,7 @@ impl PackageSource for WorkspacePackageSource {
         Ok(())
     }

+    #[instrument(skip_all, level = "debug")]
     async fn resolve(
         &self,
         specifier: &Self::Specifier,
@@ -126,6 +128,7 @@ impl PackageSource for WorkspacePackageSource {
         ))
     }

+    #[instrument(skip_all, level = "debug")]
     async fn download(
         &self,
         pkg_ref: &Self::Ref,

View file

@@ -19,7 +19,7 @@ impl DependencySpecifier for WorkspaceDependencySpecifier {}

 impl Display for WorkspaceDependencySpecifier {
     fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
-        write!(f, "workspace:{}{}", self.version, self.name)
+        write!(f, "{}@workspace:{}", self.name, self.version)
     }
 }
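The reordered `write!` changes how a workspace specifier displays: from `workspace:<version><name>` (which fused the two fields with no separator) to `<name>@workspace:<version>`. A self-contained sketch of the new format, using a hypothetical two-field stand-in for `WorkspaceDependencySpecifier` (the real struct's fields are typed names and version requirements, not plain strings):

```rust
use std::fmt::{self, Display, Formatter};

// Hypothetical stand-in; only the Display ordering is the point here.
struct Specifier {
    name: String,
    version: String,
}

impl Display for Specifier {
    fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
        // New order: name first, then the workspace version requirement.
        write!(f, "{}@workspace:{}", self.name, self.version)
    }
}

fn main() {
    let s = Specifier {
        name: "scope/pkg".into(),
        version: "^1.0.0".into(),
    };
    assert_eq!(s.to_string(), "scope/pkg@workspace:^1.0.0");
}
```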

View file

@@ -83,3 +83,7 @@ pub fn deserialize_git_like_url<'de, D: Deserializer<'de>>(
 pub fn hash<S: AsRef<[u8]>>(struc: S) -> String {
     format!("{:x}", Sha256::digest(struc.as_ref()))
 }
+
+pub fn is_default<T: Default + Eq>(t: &T) -> bool {
+    t == &T::default()
+}
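The new `is_default` helper backs the `#[serde(skip_serializing_if = "crate::util::is_default")]` attribute on `IndexFile::meta`: the field is written out only when it differs from its `Default` value. Because it is generic over any `Default + Eq` type, the same helper works for future metadata fields too:

```rust
// Serialize-skipping predicate: true when the value equals its type's default.
// serde calls this by path (as a string) via `skip_serializing_if`.
pub fn is_default<T: Default + Eq>(t: &T) -> bool {
    t == &T::default()
}

fn main() {
    assert!(is_default(&0u32)); // u32::default() == 0
    assert!(is_default(&String::new())); // String::default() is empty
    assert!(!is_default(&vec![1, 2, 3])); // non-empty Vec differs from default
}
```

Note the `Eq` bound keeps float fields out, since `f64` is only `PartialEq`; that is fine for the empty `IndexMetadata` struct it currently guards.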

View file

@@ -2,7 +2,9 @@
     const { data } = $props()
 </script>

-<div class="prose min-w-0 py-8 prose-pre:w-full prose-pre:overflow-auto">
+<div
+    class="prose prose-pre:w-full prose-pre:overflow-auto prose-img:inline-block prose-img:m-0 prose-video:inline-block prose-video:m-0 min-w-0 py-8"
+>
     <!-- eslint-disable-next-line svelte/no-at-html-tags -->
     {@html data.readmeHtml}
 </div>