Compare commits

...

301 commits
v0.4.0 ... 0.5

Author SHA1 Message Date
daimond113
32906400ec
docs: update scripts docs
2025-01-18 16:47:07 +01:00
Nidoxs
5c2f831c26
docs: add an aside for symlink errors on Windows (#20)
* Add an aside for symlink errors on Windows

* Remove redundant whitespace

* Inline URL

Co-authored-by: dai <72147841+daimond113@users.noreply.github.com>

* Revert titles to "Caution" instead of "Warning"

* Use inline code block for error message

* Update docs/src/content/docs/installation.mdx

Co-authored-by: dai <72147841+daimond113@users.noreply.github.com>

* Update docs/src/content/docs/installation.mdx

Co-authored-by: dai <72147841+daimond113@users.noreply.github.com>

---------

Co-authored-by: dai <72147841+daimond113@users.noreply.github.com>
2025-01-05 19:25:52 +01:00
daimond113
97d9251f69
docs: remove branches from git revs
2025-01-03 18:09:07 +01:00
daimond113
89a2103164
chore(release): prepare for v0.5.3
2024-12-30 00:56:58 +01:00
daimond113
0c159e7689
docs: add missing changelog entries 2024-12-30 00:56:03 +01:00
daimond113
4f75af88b7
feat: add meta in index file to preserve future compat 2024-12-30 00:49:24 +01:00
daimond113
f009c957ca
feat: remove verbosity from release mode logs
2024-12-26 22:51:00 +01:00
3569ff32cd
ci: debug builds action (#15)
* chore(actions): create debug build action

* chore(actions): remove unneeded targets

Also do the following:
* Use v4 of artifact upload action
* Install Linux-specific build dependencies
* Do not include version-management feature while building
* Fix cargo build command
* Include native mac x86 target instead of cross compilation

* chore(actions): fix bad compile command

Turns out I hallucinated `--exclude-features` into existence.

* chore(actions): add job to shorten github commit SHA

* chore(actions): use bash patterns for commit SHA trimming

* chore(actions): fix improper bash pattern syntax

* chore(actions): use `tee` to write trimmed version to stdout for debugging

* chore(actions): include full semver version including git commit SHA

* chore(actions): checkout code first in `get-version` job

* chore(actions): write `trimmed_sha` to `GITHUB_OUTPUT` correctly

* chore(actions): add name for `get-version` job

* chore(actions): make matrix `job-name`s consistent with the release workflow

* chore(actions): provide `exe` for windows manually instead of glob

Also makes the step error when no files match.
2024-12-25 15:45:29 +01:00
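The SHA-trimming steps this PR describes can be sketched roughly as follows (a minimal sketch assuming the standard GitHub Actions environment variables; the actual workflow steps are not reproduced here):

```shell
# Sketch: trim the full commit SHA to the short 7-character form with
# a bash substring pattern, echo it via `tee` for debugging, and
# append it to the job's outputs file.
GITHUB_SHA="${GITHUB_SHA:-3569ff32cd0000000000000000000000000000aa}"
trimmed_sha="${GITHUB_SHA::7}"
echo "trimmed_sha=$trimmed_sha" | tee -a "${GITHUB_OUTPUT:-/dev/null}"
```

Later jobs can then read the value as `needs.get-version.outputs.trimmed_sha` (the job name is taken from the PR bullets above).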
daimond113
c3e764ddda
fix: display spans outside debug
2024-12-22 12:43:42 +01:00
dai
db3335bbf7
docs: add SECURITY.md
2024-12-20 19:06:35 +01:00
Aristosis
711b0009cb
docs: fix improper assignment to PATH (#8)
* Fix improper assignment to PATH in installation.mdx

* Use the home variable in installation.mdx

* Remove leading slash

---------

Co-authored-by: dai <72147841+daimond113@users.noreply.github.com>
2024-12-19 21:21:32 +01:00
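The fix above concerns how the installation docs tell users to extend PATH; a hypothetical before/after sketch (the `~/.pesde/bin` directory is an assumed example, not necessarily the documented path):

```shell
# Broken form: a leading slash plus an unexpanded "~" points nowhere,
# and assigning without the old value drops every existing PATH entry:
#   PATH="/~/.pesde/bin"
# Fixed form: use the $HOME variable and append to the existing PATH.
export PATH="$PATH:$HOME/.pesde/bin"
echo "$PATH"
```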
daimond113
f88b800d51
chore(release): prepare for v0.5.2
2024-12-19 16:19:07 +01:00
daimond113
28df3bcca4
feat(registry): add sentry tracing 2024-12-19 16:18:26 +01:00
daimond113
0f74e2efa3
fix: do not error on missing deps until full linking
2024-12-18 23:34:49 +01:00
daimond113
a6c1108d5b
feat: switch registry to tracing logging 2024-12-18 22:29:10 +01:00
daimond113
9535175a45
feat: add more tracing info 2024-12-18 22:00:58 +01:00
daimond113
d9d27cf45b
fix: resolve pesde_version tags properly
2024-12-18 16:03:50 +01:00
daimond113
60fb68fcf3
fix: change dependency types for removed peers
2024-12-17 14:58:21 +01:00
daimond113
78976834b2
docs(changelog): add missing changelog entry for logging switch 2024-12-17 14:57:38 +01:00
daimond113
52603ea43e
feat: switch to tracing for logging
2024-12-16 23:00:37 +01:00
daimond113
0dde647042
fix(website): render imgs inline
2024-12-15 12:37:24 +01:00
daimond113
3196a83b25
chore(release): prepare for v0.5.1
2024-12-15 00:38:22 +01:00
daimond113
d387c27f16
fix: ignore build metadata when comparing cli versions 2024-12-15 00:35:16 +01:00
daimond113
a6846597ca
docs: correct changelog diff link 2024-12-15 00:01:35 +01:00
daimond113
3810a3b9ff
ci: attempt to fix release ci 2024-12-14 23:59:58 +01:00
daimond113
52c502359b
chore(release): prepare for v0.5.0 2024-12-14 23:53:59 +01:00
daimond113
7d1e20da8c
chore: update dependencies 2024-12-14 23:51:37 +01:00
daimond113
d35f34e8f0
fix: gracefully handle unparsable versions & dont display metadata 2024-12-14 19:57:33 +01:00
daimond113
9ee75ec9c9
feat: remove lower bound limit on pesde package name length 2024-12-14 17:41:57 +01:00
daimond113
919b0036e5
feat: display included scripts in publish command
2024-12-13 23:52:45 +01:00
daimond113
7466131f04
fix: link with types without roblox_sync_config_generator script
2024-12-13 17:06:47 +01:00
dai
0be7dd4d0e
chore(release): prepare for v0.5.0-rc.18
2024-12-12 16:23:27 +01:00
dai
f8d0bc6c4d
fix: correctly get index URLs in publish command 2024-12-12 16:23:11 +01:00
daimond113
381740d2ce
chore(release): prepare for v0.5.0-rc.17
2024-12-11 21:41:09 +01:00
daimond113
a7ea8eb9c1
docs: add missing changelog entry 2024-12-11 21:40:03 +01:00
daimond113
4a3619c26e
docs: document scripts packages 2024-12-11 21:37:59 +01:00
daimond113
16ab05ec72
feat(registry): support granular allowance of specifier types 2024-12-11 21:31:42 +01:00
daimond113
36e6f16ca6
fix: remove deny_unknown_fields from index config 2024-12-09 11:43:56 +01:00
daimond113
4843424dba
fix: dont prompt when no packages are configured 2024-12-09 11:41:54 +01:00
daimond113
e51bc9f9bb
feat: allow multiple customisable scripts packages in init 2024-12-09 11:35:02 +01:00
daimond113
6d8731f1e5
perf: use exec in unix bin linkers 2024-12-08 19:19:43 +01:00
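On Unix, a bin linker script that ends in `exec` replaces the wrapper shell with the target process instead of forking a child, saving one process per invocation. A hypothetical sketch (the function name and target command are stand-ins, not pesde's actual linker output; the real linker would `exec` at top level):

```shell
#!/bin/sh
# Hypothetical Unix bin linker sketch. `exec` replaces the shell with
# the target command; it is wrapped in a function and invoked in a
# command-substitution subshell here so this demo script survives.
run_bin() {
  # `echo` stands in for the package's actual runtime command.
  exec echo "running package bin with:" "$@"
}
out=$( run_bin --flag )   # the exec replaces only the subshell
echo "$out"
```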
daimond113
49a42dc931
docs: remove note about rc.15 2024-12-08 14:10:43 +01:00
daimond113
13594d6103
chore(release): prepare for v0.5.0-rc.16 2024-12-08 13:56:41 +01:00
daimond113
eab46e4ee5
fix: allow publishing packages with scripts with no lib or bin 2024-12-08 13:55:18 +01:00
daimond113
7311427518
chore(release): prepare for v0.5.0-rc.15 2024-12-08 13:19:05 +01:00
daimond113
c94f0e55ec
chore: update dependencies 2024-12-08 13:16:21 +01:00
daimond113
15af291f84
fix: use specifier target by default in init 2024-12-08 12:47:52 +01:00
daimond113
2b2d280fe0
feat: support luau scripts 2024-12-08 12:18:41 +01:00
daimond113
0fa17a839f
feat: support using workspace root as a member 2024-12-08 12:15:30 +01:00
daimond113
e30ec8a6cf
fix: do not create scripts folders for packages without scripts 2024-12-08 10:13:09 +01:00
daimond113
f6fce8be9e
fix: emulate wally's fs placement for wally deps
Fixes #14
2024-12-07 20:59:01 +01:00
daimond113
4d3ddd50cb
feat: copy over peer pesde deps of scripts pkg in init command 2024-12-07 20:48:28 +01:00
daimond113
5513ef41a3
fix: do not require -- in bin executables on unix 2024-12-07 20:33:39 +01:00
daimond113
ac74c57709
feat: add scripts packages 2024-12-07 15:08:52 +01:00
daimond113
5ba8c5dbb4
style: run rustfmt 2024-12-04 15:35:28 +01:00
daimond113
7b592bb719
fix: correctly override dependencies 2024-12-04 01:09:57 +01:00
daimond113
f7d2d7cbb0
fix: strip .luau extension from require paths 2024-12-03 21:29:39 +01:00
daimond113
91a3a9b122
fix: enable tokio process feature 2024-12-03 00:45:28 +01:00
daimond113
b53457c42c
feat: install pesde packages before wally 2024-12-02 23:39:39 +01:00
daimond113
a4162cd300
fix: link deps before type extraction 2024-12-02 13:05:19 +01:00
daimond113
e807c261a2
feat(cli): add better styles 2024-12-01 13:36:55 +01:00
Luka
11a356c99a
fix: wasm loading on cloudflare (#12) 2024-11-30 22:22:14 +01:00
Luka
af30701a21
feat: add admonitions to main website (#11)
* feat: add admonitions

* feat: better admonition styles

* feat: more admonition styles
2024-11-30 19:23:14 +01:00
daimond113
81ecd02df2
chore: add observability to website 2024-11-30 12:59:26 +01:00
Luka
70f3bec275
feat: use cloudflare adapter for main site (#10) 2024-11-30 12:42:41 +01:00
daimond113
385e36f1e4
feat: build docs site statically 2024-11-30 12:26:12 +01:00
daimond113
f69c05a05a
chore(release): prepare for v0.5.0-rc.14 2024-11-30 11:41:16 +01:00
daimond113
564d9de675
fix: allow includes to include top level files 2024-11-30 11:31:17 +01:00
daimond113
e5e2bbeeb4
chore(registry): update sentry 2024-11-30 11:29:27 +01:00
Paficent
f0aafe212d
docs: correct quickstart.mdx typo (#9) 2024-11-30 10:55:41 +01:00
daimond113
9b31718a0e
chore: update dependencies 2024-11-28 22:44:42 +01:00
daimond113
083bf3badd
chore: add missing 'by' to changelog 2024-11-28 18:41:26 +01:00
daimond113
43d0949a45
chore(release): prepare for v0.5.0-rc.13 2024-11-28 18:28:20 +01:00
daimond113
b475ff40e5
feat: support specifying allowed external registries in config 2024-11-28 18:18:40 +01:00
daimond113
cb17c419d0
feat: add auth token command 2024-11-28 16:19:31 +01:00
daimond113
3aadebf3ea
fix(registry): handle 404 over 401 error code 2024-11-28 15:51:52 +01:00
daimond113
56579e38b2
fix: install peer dependencies & silence warn in x command 2024-11-27 21:22:58 +01:00
daimond113
4eeced440d
chore: update urls to use pesde org 2024-11-27 21:08:33 +01:00
daimond113
60dafa0114
perf: asyncify linking 2024-11-27 20:54:11 +01:00
daimond113
a9243b0214
fix: support old includes behaviour 2024-11-26 18:12:56 +01:00
daimond113
97cc58afcf
fix: link dependencies in x command 2024-11-26 12:59:43 +01:00
daimond113
b5b3257cac
fix: install dependencies of packages in x command 2024-11-26 12:50:14 +01:00
daimond113
15d6655889
style: apply clippy lints 2024-11-26 12:25:31 +01:00
daimond113
80c47aa0e4
feat: use symlinks for workspace dependencies 2024-11-24 20:22:15 +01:00
daimond113
2c003c62aa
fix: switch to wax for better globs 2024-11-24 20:21:16 +01:00
daimond113
b6a4d39c51
docs: update docs to use globs in includes 2024-11-24 17:42:28 +01:00
daimond113
37a7c34084
feat: use globs in includes field 2024-11-23 22:54:28 +01:00
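With glob support, the manifest's `includes` field can match whole trees instead of listing every file; a hypothetical `pesde.toml` fragment (field placement and paths are assumptions — check the pesde docs for the exact schema):

```toml
# Hypothetical fragment; paths are illustrative only.
includes = ["pesde.toml", "README.md", "src/**/*.luau"]
```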
daimond113
dad3fad402
feat: support negated globs in workspace members 2024-11-23 11:41:17 +01:00
daimond113
33917424a8
feat: print no updates available in outdated command 2024-11-22 20:42:09 +01:00
daimond113
9268159dc6
chore(release): prepare for v0.5.0-rc.12 2024-11-22 19:43:38 +01:00
daimond113
3d662a6de3
fix: set PESDE_ROOT to correct path in run command 2024-11-22 19:42:18 +01:00
daimond113
bb92a06d64
feat: support fallback wally registries 2024-11-22 19:40:20 +01:00
daimond113
a067fbd4bd
fix: correctly resolve peer dependencies 2024-11-22 15:52:54 +01:00
daimond113
e9bb21835c
docs: document making bin packages 2024-11-21 17:03:20 +01:00
daimond113
85312525f1
chore(release): prepare for v0.5.0-rc.11 2024-11-20 20:09:41 +01:00
daimond113
ac73a15c9d
fix: correctly copy workspace packages 2024-11-20 20:03:16 +01:00
daimond113
745828f926
fix: propagate inner download error 2024-11-20 18:41:09 +01:00
daimond113
00d4515849
docs: link making binary packages 2024-11-20 18:40:36 +01:00
daimond113
d52a9cb615
fix: re-add updates check caching 2024-11-19 00:32:08 +01:00
daimond113
4866559025
chore: optimize exe size in release mode 2024-11-17 23:02:54 +01:00
daimond113
c5d60549c9
chore(release): prepare for v0.5.0-rc.10 2024-11-16 18:41:53 +01:00
daimond113
763bf2698f
refactor: only store pesde_version exes in version cache 2024-11-16 18:39:57 +01:00
daimond113
5a82f8616f
fix: fix self-install cross-device move 2024-11-16 18:36:45 +01:00
daimond113
00b470b173
chore(release): prepare for v0.5.0-rc.9 2024-11-16 15:33:10 +01:00
daimond113
24ad379b7c
feat: make self-upgrade check for updates by itself 2024-11-16 13:47:35 +01:00
daimond113
0ae1797ead
style: run prettier on docs 2024-11-16 13:47:35 +01:00
daimond113
be46042b51
fix: correctly link Wally server packages 2024-11-16 13:47:34 +01:00
dai
4965d172be
chore: add Ko-Fi to FUNDING.yml 2024-11-13 19:57:13 +01:00
dai
d8304d56a6
chore: create FUNDING.yml 2024-11-13 18:22:44 +01:00
daimond113
d68a1389ab
style: apply rustfmt 2024-11-12 18:00:24 +01:00
daimond113
9f93cb93d6
feat(registry): return more info in error responses 2024-11-12 17:58:20 +01:00
daimond113
1be3bf505e
chore(release): prepare for v0.5.0-rc.8 2024-11-12 16:05:14 +01:00
daimond113
dcd6a2a107
fix(cli): correct async-related panic 2024-11-12 16:00:37 +01:00
daimond113
72eb48de07
fix: use new path when setting file read-only 2024-11-11 18:59:34 +01:00
daimond113
9f3017742e
fix: correct cas finding algorithm 2024-11-11 18:57:44 +01:00
daimond113
dca495a467
chore: add missing changelog entries 2024-11-11 13:27:56 +01:00
daimond113
b180bea063
feat(registry): move to body bytes over multipart for uploading 2024-11-11 13:22:09 +01:00
daimond113
19aa5eb52c
feat(registry): log errors with more details 2024-11-11 13:22:09 +01:00
Luka
f1c9cbc9fd
docs: documentation improvements (#6)
* docs: improve docs for publishing roblox packages

* docs: add workspace dependencies to reference

* docs: reword roblox publishing docs

* docs: improve documentation for package docs
2024-11-11 12:45:41 +01:00
daimond113
1369fe990b
feat: display packages in progress bars 2024-11-10 18:08:59 +01:00
daimond113
f7808e452d
chore: update dependencies 2024-11-10 17:20:03 +01:00
daimond113
15868acce0
feat: patch before linking 2024-11-10 17:16:35 +01:00
daimond113
a9b1fa655f
refactor: update update message 2024-11-10 16:47:19 +01:00
daimond113
d490c0a6f3
feat: continue change to async 2024-11-10 16:43:25 +01:00
daimond113
e2fe1c50b8
fix(registry): ignore search query casing 2024-11-06 21:22:57 +01:00
daimond113
ab9124e02c
fix: add missing awaits 2024-11-05 21:02:23 +01:00
daimond113
b7444566dd
docs: add missing todo 2024-11-05 20:47:19 +01:00
daimond113
2b0f29a2f9
feat: begin switch to async 2024-11-05 20:44:24 +01:00
daimond113
37072eda24
docs(changelog): document publish index arg 2024-11-04 19:17:35 +01:00
daimond113
9bc80a43db
refactor: allow specifying different index when publishing 2024-11-04 14:53:59 +01:00
daimond113
e53de00120
style: apply rustfmt 2024-11-03 15:32:11 +01:00
daimond113
32d5f8c517
feat(registry): add DATA_DIR variable 2024-11-03 13:40:06 +01:00
daimond113
237d6e67e3
refactor(registry): rename git config to be generic 2024-11-03 13:26:53 +01:00
daimond113
fde2ba1021
fix: correctly (de)serialize workspace specifiers versions 2024-11-03 13:09:06 +01:00
daimond113
620777cacf
fix: remove default.project.json from git pesde dependencies 2024-11-03 13:06:59 +01:00
daimond113
09820e322c
refactor: use fs-err for fs operations 2024-11-01 20:57:32 +01:00
daimond113
c9dc788056
refactor: separate IO errors more for pesde source download 2024-11-01 19:50:11 +01:00
daimond113
00ea56745e
fix: cleanup temp files for cas search algorithm 2024-11-01 19:10:18 +01:00
daimond113
397ea11ef5
fix: use different algorithm for finding cas dir 2024-11-01 18:31:11 +01:00
daimond113
5232abc1d5
ci: update stable toolchain for clippy 2024-10-30 20:11:32 +01:00
daimond113
4b623da2db
style: apply rustfmt formatting 2024-10-30 20:08:13 +01:00
daimond113
699727793e
ci: specify nightly toolchain for rustfmt 2024-10-30 20:03:36 +01:00
daimond113
76a78f462a
ci: use nightly toolchain for rustfmt 2024-10-30 20:01:15 +01:00
daimond113
7057211564
ci: use rust action 2024-10-30 19:56:12 +01:00
daimond113
678430f96f
chore(release): prepare for v0.5.0-rc.7 2024-10-30 19:35:05 +01:00
daimond113
f30e59e4b0
docs: correct git specifier docs 2024-10-30 19:13:01 +01:00
daimond113
241e667bdc
fix(registry): handle not found errors for FS storage 2024-10-30 18:55:39 +01:00
Luka
1640dab0c4
fix(website): shiki fallback language patch (#5) 2024-10-30 18:20:24 +01:00
daimond113
236edff6a0
fix: try to fix dependency patching on vercel 2024-10-30 17:48:20 +01:00
daimond113
b6f35b6209
fix: validate package names are lowercase 2024-10-30 17:07:17 +01:00
Luka
b4c447a129
fix(website): broken search page (#4) 2024-10-29 21:34:17 +01:00
Luka
f0d04fc87c
feat: website
* feat(website): init

* feat(website): home page

* feat(website): make page more responsive

* feat(website): layout

* feat(website): package page

* feat(website): update PackageResponse type

* feat(website): display package readme

* feat(website): use new /latest/any endpoint

* feat(website): make website lg instead of xl

* fix(website): use NodeJS.Timeout

* feat(website): versions page

* feat(website): add latest version indicator

* feat(website): add target select menu

* feat(website): indicate current version

* feat(website): add package metadata

* feat(website): add hamburger

* fix(website): header responsiveness

* feat(website): better package layout

* feat(website): display authors on package page

* fix(website): only display relative dates on client

* feat(docs): init docs site

* chore(website): read .env from project root

* feat(website): add gemoji support

* fix(website): overflow on code blocks

* chore(docs): read .env from project root

* feat(docs): config changes

* fix: authors not displaying

* fix(website): use fallback language

* refactor(website): use predefined target names

* refactor(website): change Github to GitHub

* chore: remove starter readmes

* chore(docs): remove .vscode

* chore(docs): remove unused assets folder

* fix(website): fix missing datetime attribute

* feat(website): switch to universal loaders

* feat(docs): search

* fix(website): type errors

* fix(website): use provided fetch instead of global

* feat(website): remove isr

* chore(website): add .env.example

* feat(website): add icons and metadata

* chore(website): add debug logs

* chore(website): remove shiki temporarily

* fix(website): rehype shiki lazy load

* fix(website): use custom highlighter

* fix(website): move highlighter creation into load

* docs: write docs

* feat(website): add og image

* feat(website): fix accessibility issues

* fix(website): no target selector on mobile

* fix(website): close dialog on navigation

* fix(website): logo is not a link in hamburger menu

* feat(website): dependencies tab

* fix(website): use correct dependency target

* fix(website): navigation links

* feat(website): support wally dependencies

* feat(website): metadata + case insensitivity

* fix(website): manually implement groupBy

`Object.groupBy` isn't supported on Vercel right now.

* fix(website): code block with an unknown language

* docs(policies): explain & cover more cases

* docs: update cli reference

* docs: add self hosting registries guide

* docs: update README

* docs: add more configs to registry guide

* fix: favicon and logomark

* feat(website): package documentation

* fix(website): missing $derive for toc

* docs: change SENTRY_URL to SENTRY_DSN

* chore(website): remove unused file

* chore: remove favicon.zip

* fix(website): strip wally# prefix

* chore: add changelog entry

---------

Co-authored-by: daimond113 <72147841+daimond113@users.noreply.github.com>
2024-10-29 20:06:00 +01:00
daimond113
b1ae6aebda
fix: don't make cas files read-only on windows 2024-10-26 17:45:18 +02:00
daimond113
2e62d07265
refactor(registry): update sentry api usage 2024-10-25 12:07:07 +02:00
daimond113
50c7b4e542
chore: update dependencies 2024-10-25 10:59:33 +02:00
daimond113
8afc75d543
fix(registry): add debug symbols for Sentry 2024-10-25 10:58:21 +02:00
daimond113
901b450a6c
refactor: GITHUB_AUTH -> GITHUB_CLIENT_SECRET for GitHub auth in registry 2024-10-22 17:58:31 +02:00
daimond113
d346fe1d34
refactor: optimize boolean expression in publish command 2024-10-20 18:15:14 +02:00
daimond113
70d07feb70
fix: sync scripts repo in background 2024-10-20 18:13:08 +02:00
daimond113
92c6120d24
perf: shallow clone dependency repos 2024-10-20 17:13:58 +02:00
daimond113
f0d19bb5e1
chore: add gitattributes 2024-10-20 13:37:39 +02:00
daimond113
8aa41e7b73
fix(registry): prevent time desync 2024-10-20 13:37:29 +02:00
daimond113
14aeabeed2
fix: listen for device flow completion without requiring enter 2024-10-17 22:07:55 +02:00
daimond113
051e062c39
style: remove comma before paren 2024-10-15 23:52:52 +02:00
daimond113
87a45c0429
fix(resolver): use new aliases when reusing lockfile deps 2024-10-15 23:52:00 +02:00
daimond113
beaf143679
style: use default index name constant 2024-10-14 23:28:02 +02:00
daimond113
73a63c3664
chore(release): prepare for v0.5.0-rc.6 2024-10-14 20:17:14 +02:00
daimond113
441235e159
chore: update dependencies 2024-10-14 20:16:15 +02:00
daimond113
c7c1daab36
feat: improve auth system for registry changes 2024-10-14 19:40:02 +02:00
daimond113
66a885b4e6
fix(registry): prevent token usage from unauthorized apps 2024-10-14 17:55:11 +02:00
daimond113
756b5c8257
fix: make github oauth client id optional 2024-10-14 14:59:16 +02:00
daimond113
d2e04f49d0
feat(registry): return package dependencies 2024-10-14 13:21:09 +02:00
daimond113
b3f0a3cbfb
fix: handle missing revision properly 2024-10-13 15:18:05 +02:00
daimond113
15df417472
feat: allow full version reqs in workspace spec version field 2024-10-13 14:13:21 +02:00
daimond113
e6773144db
fix: allow writes to copied cas files 2024-10-13 13:57:55 +02:00
daimond113
bf77b69b0b
fix: correct pesde.toml inclusion message in publish 2024-10-13 11:59:31 +02:00
daimond113
25260e5ab7
fix: report S3 errors 2024-10-13 11:55:25 +02:00
daimond113
258850e9ea
chore(release): prepare for v0.5.0-rc.5 2024-10-12 19:17:37 +02:00
daimond113
e430bdf89f
feat: inform user about not finding any bin package when invoking from bin 2024-10-12 19:14:47 +02:00
daimond113
43a8d6272a
fix: remove duplicate manifest name in publish command 2024-10-12 18:59:57 +02:00
daimond113
b6459623b7
fix: allow luau packages in exec command 2024-10-12 18:51:01 +02:00
daimond113
48c5d159b4
fix: prevent self-upgrade from overwriting itself 2024-10-12 18:07:05 +02:00
daimond113
85d2300c6a
chore(release): prepare for v0.5.0-rc.4 2024-10-12 16:06:00 +02:00
daimond113
fa00a97f8b
feat: publish members when publishing workspace 2024-10-12 16:00:21 +02:00
daimond113
9a64a12f8e
fix: add feature gates to init command 2024-10-07 16:42:49 +02:00
daimond113
aee036b998
chore(release): prepare for v0.5.0-rc.3 2024-10-07 00:05:59 +02:00
daimond113
1c2a232aba
fix: use workspace specifiers target field 2024-10-07 00:01:10 +02:00
daimond113
6c76a21b14
chore(release): prepare for v0.5.0-rc.2 2024-10-06 23:52:46 +02:00
daimond113
36d9ca0634
fix: handle versions with dots 2024-10-06 23:50:58 +02:00
daimond113
962482962b
feat: support multitarget workspace members 2024-10-06 23:49:59 +02:00
daimond113
fc123ee537
chore(release): prepare for v0.5.0-rc.1 2024-10-06 22:15:22 +02:00
daimond113
9f08c7a794
ci: fix release ci 2024-09-29 17:33:18 +02:00
daimond113
d608fa141f
ci: try to fix errors 2024-09-29 17:25:53 +02:00
daimond113
e0dccbf568
ci: try to fix errors 2024-09-29 17:24:24 +02:00
daimond113
ed0ea11db0
ci: remove outdated code 2024-09-29 09:29:35 +02:00
daimond113
bfc4e6ec31
chore: update dependencies 2024-09-29 00:44:39 +02:00
daimond113
8bb07bea60
ci: delete git cliff config 2024-09-29 00:40:29 +02:00
daimond113
2b6a731ea9
feat: add version management flag 2024-09-29 00:37:38 +02:00
daimond113
821cc989ef
feat: remove unnecessary feature flags 2024-09-29 00:26:46 +02:00
daimond113
afee3a0c2a
docs: update tagline 2024-09-29 00:04:35 +02:00
daimond113
863a09ef65
refactor: always use target names in packages folders 2024-09-28 23:46:09 +02:00
daimond113
bc86c2f1c5
refactor: remove is_compatible_with 2024-09-27 14:14:42 +02:00
daimond113
10c804e2f3
feat: add roblox server target 2024-09-06 23:38:44 +02:00
daimond113
30c4d0c391
feat(registry): make storage & auth customisable 2024-09-05 21:57:47 +02:00
daimond113
702153d81b
feat: proper update command 2024-09-04 19:48:37 +02:00
daimond113
c08dfb9965
feat(registry): use env variable for repo 2024-09-04 16:19:09 +02:00
daimond113
d321b8b0aa
feat: add PESDE_ROOT for bin packages 2024-09-03 23:08:08 +02:00
daimond113
71eacb8673
feat: support adding workspace packages 2024-09-03 18:19:01 +02:00
daimond113
5236be53fd
feat: add path param in git specifiers 2024-09-03 16:47:38 +02:00
daimond113
f1ce6283d8
feat: implement workspace/monorepo support 2024-09-03 16:01:48 +02:00
daimond113
bd7e1452b0
feat(registry): store package docs 2024-09-02 16:49:40 +02:00
daimond113
f631c1deb9
refactor: use digest method in hash util 2024-08-29 23:30:14 +02:00
daimond113
2a136f11db
feat: make cas files readonly 2024-08-29 23:28:09 +02:00
daimond113
00db2a51c2
fix: parse versions on non windows platforms correctly 2024-08-29 00:24:52 +02:00
daimond113
f732451252
feat: use standard tables by default 2024-08-28 12:03:03 +02:00
daimond113
b1a0cf6637
feat: remove ratelimits on info queries 2024-08-27 21:10:24 +02:00
daimond113
f2deb64f1c
refactor: better windows UX 2024-08-27 19:49:52 +02:00
daimond113
fd69ced633
fix: remove unwrap 2024-08-27 19:49:11 +02:00
daimond113
8168bb5124
fix: handle disabled features 2024-08-27 19:07:11 +02:00
daimond113
e14f350336
feat: return all targets from version endpoint 2024-08-23 13:56:33 +02:00
daimond113
07ee4b9617
feat: allow any target request 2024-08-14 20:55:10 +02:00
daimond113
7aaea85a2d
feat: store more package info in index 2024-08-14 19:55:58 +02:00
daimond113
cc85135a8e
feat: keep dev dependencies in manifest file 2024-08-13 23:52:46 +02:00
daimond113
159d0bafc1
fix: correct endpoint data 2024-08-13 21:26:02 +02:00
daimond113
2ae368d97e
fix: correctly add authors array 2024-08-13 20:29:26 +02:00
daimond113
17c2196522
fix: don't add luau extension for unix 2024-08-13 20:15:22 +02:00
daimond113
4f03155af0
feat: give more info from registry api 2024-08-13 20:14:41 +02:00
daimond113
83286ff758
fix: overwrite request headers 2024-08-12 23:16:26 +02:00
daimond113
836870f1ce
fix: improve login command behaviour 2024-08-12 22:31:05 +02:00
daimond113
30c9be8366
feat: implement token overrides 2024-08-12 22:28:37 +02:00
daimond113
a2865523a0
feat: always send target info 2024-08-12 20:41:09 +02:00
daimond113
3272f8aa88
fix: set deep dependencies' index urls 2024-08-12 11:55:21 +02:00
daimond113
6442030f93
feat: support toolchain managers in scripts 2024-08-12 01:17:26 +02:00
daimond113
08e808b4b9
fix: ignore possibly functionality breaking files 2024-08-11 22:57:42 +02:00
daimond113
7267dc2300
feat: support adding git dependencies with the cli 2024-08-11 20:08:27 +02:00
daimond113
876fa21bcb
fix: use correct name field name for wally dependencies 2024-08-11 18:57:04 +02:00
daimond113
7a8376ebad
fix: create directory before canonicalizing 2024-08-11 18:29:48 +02:00
daimond113
c413d8aa22
refactor: increase version check interval 2024-08-11 18:23:43 +02:00
daimond113
48f591a48a
feat: implement default sourcemap_generator script 2024-08-11 17:45:48 +02:00
daimond113
09307276b0
feat: implement execute command 2024-08-11 17:27:51 +02:00
daimond113
79bbe11cab
feat: implement git dependency support 2024-08-11 16:16:25 +02:00
daimond113
d8db84a751
fix: support wally dependency patching 2024-08-10 14:25:46 +02:00
daimond113
957689c13a
revert: re-implement scripts repo updating 2024-08-09 22:14:33 +02:00
daimond113
d0aecbdabc
perf: don't load entire files into memory 2024-08-08 20:37:51 +02:00
daimond113
a8a8ffcbe2
feat: implement wally support 2024-08-08 17:59:59 +02:00
daimond113
a24a440a84
docs: add missing docs 2024-08-03 22:50:09 +02:00
daimond113
431c2b634f
docs: add package docs 2024-08-03 22:18:38 +02:00
daimond113
e07ec4e859
chore: add example .env 2024-07-30 12:39:03 +02:00
daimond113
c481826d77
feat: implement registry 2024-07-30 12:37:54 +02:00
daimond113
ea887e56ef
feat: implement utility commands 2024-07-28 21:08:35 +02:00
daimond113
97b6a69688
fix: update binary when upgrading 2024-07-28 18:34:24 +02:00
daimond113
37cc86f028
feat: content addressable storage 2024-07-28 18:19:54 +02:00
daimond113
d8cd78e7e2
fix: correctly generate require paths 2024-07-26 19:16:47 +02:00
daimond113
7f8b2761ab
feat: self managed versioning 2024-07-26 18:47:53 +02:00
daimond113
0fcddedad6
feat: support binary scripts in PATH 2024-07-25 16:32:48 +02:00
daimond113
986196ac67
refactor: use gix urls in more places 2024-07-24 13:19:20 +02:00
daimond113
b10e7667f0
chore: reorganize manifest module 2024-07-24 11:55:15 +02:00
daimond113
f0cd53a2c9
refactor: add debug messages 2024-07-24 00:53:34 +02:00
daimond113
5661194721
feat: implement patching 2024-07-23 16:37:47 +02:00
daimond113
d4371519c2
feat: multithreaded dependency downloading 2024-07-23 01:20:50 +02:00
daimond113
2898b02e1c
feat: implement multi target packages 2024-07-22 23:16:04 +02:00
daimond113
14463b7205
feat: implement dependency overrides 2024-07-22 22:00:09 +02:00
daimond113
64ed3931cf
fix: allow multiple of the same version with different targets 2024-07-22 20:12:56 +02:00
daimond113
7feb9055e7
fix: correctly add scripts to manifest 2024-07-22 19:45:28 +02:00
daimond113
67d662939f
feat: use lockfile + transition to toml 2024-07-22 19:40:30 +02:00
daimond113
d81f2350df
feat: roblox sync config script 2024-07-22 16:41:45 +02:00
daimond113
10ca24a0cc
feat: implement linking 2024-07-17 19:38:01 +02:00
daimond113
fdad8995a4
feat: implement dependency resolver and lune scripts 2024-07-14 15:19:15 +02:00
daimond113
b73bf418c5
feat: begin rewrite 2024-07-13 00:09:37 +02:00
daimond113
88e87b03b9
chore(release): prepare for v0.4.7 2024-05-12 16:00:52 +02:00
daimond113
06ed5d94ae
feat: parallel sourcemap generation 2024-05-12 16:00:08 +02:00
daimond113
39102908cb
ci: update macos x86_64 image (the default is now arm) 2024-05-12 13:15:51 +02:00
daimond113
3082d69583
chore(release): prepare for v0.4.6 2024-05-12 13:00:36 +02:00
daimond113
d7c421a90e
fix: create package folders 2024-05-12 12:57:32 +02:00
daimond113
5fd29cc264
fix: colour single tick codeblocks properly 2024-05-05 21:35:12 +02:00
daimond113
22a93f99f7
chore: make registry image weigh less 2024-05-05 21:31:21 +02:00
daimond113
d6c80ac2a1
feat(website): allow images in readmes 2024-04-08 18:15:21 +02:00
daimond113
10f0042ac3
docs(website): bring back convert index note 2024-04-08 18:07:48 +02:00
daimond113
b01384807a
feat(website): add exports to sidebar 2024-04-08 18:06:18 +02:00
daimond113
f6a3cdd4b2
docs(website): correct manifest dependency format 2024-04-08 17:19:33 +02:00
daimond113
b3b4ee113f
docs: link documentation in readme 2024-04-03 21:16:21 +02:00
daimond113
3527413745
chore(release): prepare for v0.4.5 2024-04-01 10:52:51 +02:00
daimond113
aec6162330
fix(patches): remove manifest requirement 2024-04-01 10:52:09 +02:00
daimond113
68f9021b2c
chore(release): prepare for v0.4.4 2024-03-31 17:54:24 +02:00
daimond113
e0468069f3
fix(overrides): use project indices in specifier 2024-03-31 17:52:56 +02:00
daimond113
57e6aedfd1
fix(wally-compat): correctly update sync tool files 2024-03-31 17:45:07 +02:00
daimond113
10a420b177
chore(release): prepare for v0.4.3 2024-03-31 14:45:01 +02:00
daimond113
de35d5906a
feat: support manifest-less repos & running local package bin export 2024-03-31 14:23:08 +02:00
daimond113
8dfdc6dfa8
fix(resolver): ensure version is root 2024-03-31 14:18:36 +02:00
daimond113
344416da3d
refactor(wally-compat): improve manifest parsing 2024-03-31 14:18:14 +02:00
daimond113
5a8907c335
docs(website): document exports field of manifest 2024-03-31 14:13:38 +02:00
dai
7acbceb093
Merge pull request #1 from Foorack/repo-patch
Update repository field in Cargo.toml
2024-03-31 11:10:28 +02:00
Foorack / Max Faxälv
5a91ed1061
Update repository field in Cargo.toml 2024-03-30 22:45:07 +01:00
daimond113
dcecd84f17
chore(release): prepare for v0.4.2 2024-03-29 20:28:36 +01:00
daimond113
12407fa6de
fix(cli config): create folder for config 2024-03-29 20:27:47 +01:00
daimond113
056d1dce36
feat(website): add documentation meta tags 2024-03-28 19:52:45 +01:00
daimond113
a7947f6fbf
feat(website): add documentation 2024-03-28 19:49:46 +01:00
daimond113
f897d5bc4a
chore(release): prepare for v0.4.1 2024-03-28 13:58:38 +01:00
daimond113
e829cd8143
ci: 🚩 remove macos aarch64 due to costs 2024-03-28 13:58:33 +01:00
daimond113
47a83b5ed9
chore(dependencies): 👽 fix compilation due to zstd-sys 2024-03-28 13:29:25 +01:00
daimond113
fe5630af57
fix(resolver): 🐛 correctly insert packages from lockfile 2024-03-28 13:28:28 +01:00
247 changed files with 21303 additions and 12923 deletions


@@ -2,7 +2,7 @@
!src
!registry
registry/.env
registry/cache
registry/data
!Cargo.lock
!Cargo.toml
!rust-toolchain.toml

.env.example Normal file (2 changed lines)

@@ -0,0 +1,2 @@
PUBLIC_REGISTRY_URL= # url of the registry API, this must have a trailing slash and include the version
# example: https://registry.pesde.daimond113.com/v0/

.gitattributes vendored Normal file (1 changed line)

@@ -0,0 +1 @@
* text=auto

.github/FUNDING.yml vendored Normal file (2 changed lines)

@@ -0,0 +1,2 @@
buy_me_a_coffee: daimond113
ko_fi: daimond113

.github/workflows/debug.yml vendored Normal file (79 changed lines)

@@ -0,0 +1,79 @@
name: Debug
on:
push:
pull_request:
jobs:
get-version:
name: Get build version
runs-on: ubuntu-latest
outputs:
version: v${{ steps.get_version.outputs.value }}+rev.g${{ steps.trim_sha.outputs.trimmed_sha }}
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Get package version
uses: SebRollen/toml-action@v1.2.0
id: get_version
with:
file: Cargo.toml
field: package.version
- name: Trim commit SHA
id: trim_sha
run: |
commit_sha=${{ github.sha }}
echo "trimmed_sha=${commit_sha:0:7}" | tee $GITHUB_OUTPUT
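The two steps above compose the debug build version as `v<package.version>+rev.g<short SHA>`, with the SHA trimmed to its first seven characters. A minimal Python sketch of that composition (the function name is illustrative, not part of the workflow):

```python
def build_version(pkg_version: str, commit_sha: str) -> str:
    # Mirror the workflow: "v" + Cargo package version + "+rev.g" + first 7 SHA chars
    return f"v{pkg_version}+rev.g{commit_sha[:7]}"

print(build_version("0.5.3", "32906400ec1f4b2a"))  # → v0.5.3+rev.g3290640
```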
build:
strategy:
matrix:
include:
- job-name: windows-x86_64
target: x86_64-pc-windows-msvc
runs-on: windows-latest
artifact-name: pesde-debug-${{ needs.get-version.outputs.version }}-windows-x86_64
- job-name: linux-x86_64
target: x86_64-unknown-linux-gnu
runs-on: ubuntu-latest
artifact-name: pesde-debug-${{ needs.get-version.outputs.version }}-linux-x86_64
- job-name: macos-x86_64
target: x86_64-apple-darwin
runs-on: macos-13
artifact-name: pesde-debug-${{ needs.get-version.outputs.version }}-macos-x86_64
- job-name: macos-aarch64
target: aarch64-apple-darwin
runs-on: macos-latest
artifact-name: pesde-debug-${{ needs.get-version.outputs.version }}-macos-aarch64
name: Build for ${{ matrix.job-name }}
runs-on: ${{ matrix.runs-on }}
needs: get-version
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Install Linux build dependencies
if: ${{ matrix.runs-on == 'ubuntu-latest' }}
run: |
sudo apt-get update
sudo apt-get install libdbus-1-dev pkg-config
- name: Install Rust toolchain
uses: dtolnay/rust-toolchain@stable
- name: Compile in debug mode
run: cargo build --bins --no-default-features --features bin,patches,wally-compat --target ${{ matrix.target }} --locked
- name: Upload artifact
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.artifact-name }}
if-no-files-found: error
path: |
target/${{ matrix.target }}/debug/pesde.exe
target/${{ matrix.target }}/debug/pesde


@@ -3,48 +3,90 @@ on:
push:
tags:
- v*
env:
CRATE_NAME: pesde
BIN_NAME: pesde
jobs:
prepare:
name: Prepare
runs-on: ubuntu-latest
outputs:
version: ${{ steps.extract_version.outputs.VERSION }}
found: ${{ steps.ensure_not_published.outputs.FOUND }}
steps:
- uses: actions/checkout@v4
- name: Extract version
id: extract_version
shell: bash
run: |
VERSION=$(echo ${{ github.ref_name }} | cut -d'+' -f1 | cut -c 2-)
echo "VERSION=$VERSION" >> "$GITHUB_OUTPUT"
- name: Ensure not published
id: ensure_not_published
shell: bash
env:
VERSION: ${{ steps.extract_version.outputs.VERSION }}
run: |
CRATE_NAME="${{ env.CRATE_NAME }}"
if [ ${#CRATE_NAME} -eq 1 ]; then
DIR="1"
elif [ ${#CRATE_NAME} -eq 2 ]; then
DIR="2"
elif [ ${#CRATE_NAME} -eq 3 ]; then
DIR="3/${CRATE_NAME:0:1}"
else
DIR="${CRATE_NAME:0:2}/${CRATE_NAME:2:2}"
fi
FOUND=$(curl -sSL --fail-with-body "https://index.crates.io/$DIR/${{ env.CRATE_NAME }}" | jq -s 'any(.[]; .vers == "${{ env.VERSION }}")')
echo "FOUND=$FOUND" >> "$GITHUB_OUTPUT"
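The `Ensure not published` step above queries the crates.io sparse index, whose directory layout depends on the crate-name length. A Python sketch of the same path scheme the shell branches implement:

```python
def index_dir(crate_name: str) -> str:
    # crates.io index layout: "1/", "2/", "3/<first char>/",
    # or "<first two chars>/<next two chars>/" for longer names
    n = len(crate_name)
    if n == 1:
        return "1"
    if n == 2:
        return "2"
    if n == 3:
        return f"3/{crate_name[0]}"
    return f"{crate_name[:2]}/{crate_name[2:4]}"

print(index_dir("pesde"))  # → pe/sd
```

The full index URL is then `https://index.crates.io/<dir>/<crate_name>`, which the workflow filters through `jq` to see whether the extracted version is already published.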
build:
strategy:
matrix:
include:
- os: ubuntu-latest
host: linux
label: linux-x86_64
target: x86_64-unknown-linux-gnu
include:
- os: ubuntu-latest
host: linux
arch: x86_64
target: x86_64-unknown-linux-gnu
- os: windows-latest
host: windows
label: windows-x86_64
target: x86_64-pc-windows-msvc
- os: windows-latest
host: windows
arch: x86_64
target: x86_64-pc-windows-msvc
- os: macos-latest
host: macos
label: macos-x86_64
target: x86_64-apple-darwin
- os: macos-13
host: macos
arch: x86_64
target: x86_64-apple-darwin
- os: macos-latest-xlarge
host: macos
label: macos-aarch64
target: aarch64-apple-darwin
- os: macos-latest
host: macos
arch: aarch64
target: aarch64-apple-darwin
runs-on: ${{ matrix.os }}
name: Build for ${{ matrix.label }}
name: Build for ${{ matrix.host }}-${{ matrix.arch }}
needs: [ prepare ]
if: ${{ needs.prepare.outputs.found == 'false' }}
env:
VERSION: ${{ needs.prepare.outputs.version }}
steps:
- uses: actions/checkout@v4
- name: Set up Rust
uses: moonrepo/setup-rust@v1
with:
targets: ${{ matrix.target }}
- uses: dtolnay/rust-toolchain@stable
- name: Set env
shell: bash
run: |
BIN_NAME=pesde
ARCHIVE_NAME=$BIN_NAME-$(echo ${{ github.ref_name }} | cut -c 2-)-${{ matrix.label }}.zip
echo "BIN_NAME=$BIN_NAME" >> $GITHUB_ENV
ARCHIVE_NAME=${{ env.BIN_NAME }}-${{ env.VERSION }}-${{ matrix.host }}-${{ matrix.arch }}
echo "ARCHIVE_NAME=$ARCHIVE_NAME" >> $GITHUB_ENV
- name: Install OS dependencies
if: ${{ matrix.host == 'linux' }}
run: |
sudo apt-get update
sudo apt-get install libdbus-1-dev pkg-config
- name: Build
run: cargo build --bins --all-features --release --target ${{ matrix.target }} --locked
@@ -52,18 +94,36 @@ jobs:
shell: bash
run: |
if [ ${{ matrix.host }} = "windows" ]; then
cp target/${{ matrix.target }}/release/${{ env.BIN_NAME }}.exe ${{ env.BIN_NAME }}.exe
7z a ${{ env.ARCHIVE_NAME }} ${{ env.BIN_NAME }}.exe
mv target/${{ matrix.target }}/release/${{ env.BIN_NAME }}.exe ${{ env.BIN_NAME }}.exe
7z a ${{ env.ARCHIVE_NAME }}.zip ${{ env.BIN_NAME }}.exe
tar -czf ${{ env.ARCHIVE_NAME }}.tar.gz ${{ env.BIN_NAME }}.exe
else
cp target/${{ matrix.target }}/release/${{ env.BIN_NAME }} ${{ env.BIN_NAME }}
zip -r ${{ env.ARCHIVE_NAME }} ${{ env.BIN_NAME }}
mv target/${{ matrix.target }}/release/${{ env.BIN_NAME }} ${{ env.BIN_NAME }}
zip -r ${{ env.ARCHIVE_NAME }}.zip ${{ env.BIN_NAME }}
tar -czf ${{ env.ARCHIVE_NAME }}.tar.gz ${{ env.BIN_NAME }}
fi
- name: Upload assets
- name: Upload zip artifact
uses: actions/upload-artifact@v4
with:
name: ${{ env.ARCHIVE_NAME }}
path: ${{ env.ARCHIVE_NAME }}
name: ${{ env.ARCHIVE_NAME }}.zip
path: ${{ env.ARCHIVE_NAME }}.zip
- name: Upload tar.gz artifact
uses: actions/upload-artifact@v4
with:
name: ${{ env.ARCHIVE_NAME }}.tar.gz
path: ${{ env.ARCHIVE_NAME }}.tar.gz
publish:
name: Publish to crates.io
runs-on: ubuntu-latest
needs: [ build ]
steps:
- uses: actions/checkout@v4
- uses: dtolnay/rust-toolchain@stable
- name: Publish
run: cargo publish --token ${{ secrets.CRATES_IO_TOKEN }} --allow-dirty --locked
create_release:
name: Create Release
@@ -71,7 +131,9 @@ jobs:
permissions:
contents: write
pull-requests: read
needs: [build]
needs: [ prepare, publish ]
env:
VERSION: ${{ needs.prepare.outputs.version }}
steps:
- uses: actions/checkout@v4
with:
@@ -81,27 +143,13 @@ jobs:
path: artifacts
merge-multiple: true
- name: Generate a changelog
uses: orhun/git-cliff-action@v3
id: git-cliff
with:
config: cliff.toml
args: --verbose --current --strip header
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_REPO: ${{ github.repository }}
- name: Create Release
id: create_release
uses: softprops/action-gh-release@v1
with:
token: ${{ secrets.GITHUB_TOKEN }}
tag_name: ${{ github.ref_name }}
name: ${{ github.ref_name }}
body: ${{ steps.git-cliff.outputs.content }}
name: v${{ env.VERSION }}
draft: true
prerelease: false
files: artifacts/*
- name: Publish on crates.io
run: cargo publish --token ${{ secrets.CRATES_IO_TOKEN }} --allow-dirty --locked
prerelease: ${{ startsWith(env.VERSION, '0') }}
files: artifacts/*
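Two small pieces of logic recur across this workflow: the tag-to-version extraction (`cut -d'+' -f1 | cut -c 2-`) and the prerelease check (`startsWith(env.VERSION, '0')`). A Python sketch of both (function names are illustrative):

```python
def extract_version(ref_name: str) -> str:
    # Drop build metadata after "+", then strip the leading "v"
    return ref_name.split("+")[0][1:]

def is_prerelease(version: str) -> bool:
    # Any 0.x release is marked as a prerelease on GitHub
    return version.startswith("0")

print(extract_version("v0.5.3+registry.0.1.2"))  # → 0.5.3
```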


@@ -11,25 +11,33 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Rust
uses: moonrepo/setup-rust@v1
# we use some nightly rustfmt features, so we need nightly toolchain
- uses: dtolnay/rust-toolchain@nightly
with:
bins: cargo-tarpaulin
components: rustfmt, clippy
components: rustfmt
- uses: dtolnay/rust-toolchain@stable
with:
components: clippy
- name: Run tests
run: cargo test --all
- name: Install OS dependencies
run: |
sudo apt-get update
sudo apt-get install libdbus-1-dev pkg-config
# pesde currently does not have any tests. Bring this back when (if) tests are added.
# - name: Run tests
# run: cargo test --all --all-features
- name: Check formatting
run: cargo fmt --all -- --check
run: cargo +nightly fmt --all -- --check
- name: Run clippy
run: cargo clippy --all --all-targets --all-features -- -D warnings
- name: Generate coverage report
run: cargo tarpaulin --all-features --out xml --exclude-files src/cli/* --exclude-files registry/* --exclude-files src/main.rs --skip-clean
- name: Upload coverage reports to Codecov
uses: codecov/codecov-action@v4.0.1
with:
token: ${{ secrets.CODECOV_TOKEN }}
# - name: Generate coverage report
# run: cargo tarpaulin --all-features --out xml --exclude-files src/cli/* --exclude-files registry/* --exclude-files src/main.rs --skip-clean
#
# - name: Upload coverage reports to Codecov
# uses: codecov/codecov-action@v4.0.1
# with:
# token: ${{ secrets.CODECOV_TOKEN }}

.gitignore vendored (2 changed lines)

@@ -4,4 +4,4 @@
cobertura.xml
tarpaulin-report.html
build_rs_cov.profraw
registry/cache
registry/data


@@ -5,95 +5,114 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [0.4.0] - 2024-03-27
### Details
#### Bug Fixes
- :bug: link root dependencies to their dependents aswell by @daimond113
## [0.5.3] - 2024-12-30
### Added
- Add meta field in index files to preserve compatibility with potential future changes by @daimond113
#### Features
- :sparkles: add dependency names
- :sparkles: add dependency overrides by @daimond113
### Changed
- Remove verbosity from release mode logging by @daimond113
#### Refactor
- :art: improve lockfile format by @daimond113
## [0.5.2] - 2024-12-19
### Fixed
- Change dependency types for removed peer dependencies by @daimond113
- Resolve version to correct tag for `pesde_version` field by @daimond113
- Do not error on missing dependencies until full linking by @daimond113
#### Styling
- :art: apply clippy & rustfmt by @daimond113
### Changed
- Switch from `log` to `tracing` for logging by @daimond113
## [0.3.2] - 2024-03-24
### Details
#### Bug Fixes
- :bug: correct linking file paths by @daimond113
- :bug: correctly enable fields with features by @daimond113
## [0.5.1] - 2024-12-15
### Fixed
- Ignore build metadata when comparing CLI versions by @daimond113
## [0.3.1] - 2024-03-24
### Details
#### Features
- :sparkles: automatically find file to use as lib by @daimond113
## [0.5.0] - 2024-12-14
### Added
- Add support for multiple targets under the same package name in workspace members by @daimond113
- Add `yes` argument to skip all prompts in publish command by @daimond113
- Publish all workspace members when publishing a workspace by @daimond113
- Inform user about not finding any bin package when using its bin invocation by @daimond113
- Support full version requirements in workspace version field by @daimond113
- Improved authentication system for registry changes by @daimond113
- New website by @lukadev-0
- Add `--index` flag to `publish` command to publish to a specific index by @daimond113
- Support fallback Wally registries by @daimond113
- Print that no updates are available in `outdated` command by @daimond113
- Support negated globs in `workspace_members` field by @daimond113
- Make `includes` use glob patterns by @daimond113
- Use symlinks for workspace dependencies to not require reinstalling by @daimond113
- Add `auth token` command to print the auth token for the index by @daimond113
- Support specifying which external registries are allowed on registries by @daimond113
- Add improved CLI styling by @daimond113
- Install pesde dependencies before Wally to support scripts packages by @daimond113
- Support packages exporting scripts by @daimond113
- Support using workspace root as a member by @daimond113
- Allow multiple, user selectable scripts packages to be selected (& custom packages inputted) in `init` command by @daimond113
- Support granular control over which repositories are allowed in various specifier types by @daimond113
- Display included scripts in `publish` command by @daimond113
## [0.3.0] - 2024-03-24
### Details
#### Features
- :sparkles: multi-index + wally support by @daimond113
### Fixed
- Fix versions with dots not being handled correctly by @daimond113
- Use workspace specifiers' `target` field when resolving by @daimond113
- Add feature gates to `wally-compat` specific code in init command by @daimond113
- Remove duplicated manifest file name in `publish` command by @daimond113
- Allow use of Luau packages in `execute` command by @daimond113
- Fix `self-upgrade` overwriting its own binary by @daimond113
- Correct `pesde.toml` inclusion message in `publish` command by @daimond113
- Allow writes to files when `link` is false in PackageFS::write_to by @daimond113
- Handle missing revisions in AnyPackageIdentifier::from_str by @daimond113
- Make GitHub OAuth client ID config optional by @daimond113
- Use updated aliases when reusing lockfile dependencies by @daimond113
- Listen for device flow completion without requiring pressing enter by @daimond113
- Sync scripts repo in background by @daimond113
- Don't make CAS files read-only on Windows (file removal is disallowed if the file is read-only) by @daimond113
- Validate package names are lowercase by @daimond113
- Use a different algorithm for finding a CAS directory to avoid issues with mounted drives by @daimond113
- Remove default.project.json from Git pesde dependencies by @daimond113
- Correctly (de)serialize workspace specifiers by @daimond113
- Fix CAS finder algorithm issues with Windows by @daimond113
- Fix CAS finder algorithm's AlreadyExists error by @daimond113
- Use moved path when setting file to read-only by @daimond113
- Correctly link Wally server packages by @daimond113
- Fix `self-install` doing a cross-device move by @daimond113
- Add back mistakenly removed updates check caching by @daimond113
- Set download error source to inner error to propagate the error by @daimond113
- Correctly copy workspace packages by @daimond113
- Fix peer dependencies being resolved incorrectly by @daimond113
- Set PESDE_ROOT to the correct path in `pesde run` by @daimond113
- Install dependencies of packages in `x` command by @daimond113
- Fix `includes` not supporting root files by @daimond113
- Link dependencies before type extraction to support more use cases by @daimond113
- Strip `.luau` extension from linker modules' require paths to comply with Luau by @daimond113
- Correctly handle graph paths for resolving overriden packages by @daimond113
- Do not require `--` in bin package executables on Unix by @daimond113
- Do not require lib or bin exports if package exports scripts by @daimond113
- Correctly resolve URLs in `publish` command by @daimond113
- Add Roblox types in linker modules even with no config generator script by @daimond113
#### Miscellaneous Tasks
- :pencil2: correct env variable names by @daimond113
### Removed
- Remove special scripts repo handling to favour standard packages by @daimond113
## [0.2.0] - 2024-03-17
### Details
#### Features
- :children_crossing: add wally conversion by @daimond113
- :sparkles: add embed metadata by @daimond113
### Changed
- Rewrite the entire project in a more maintainable way by @daimond113
- Support workspaces by @daimond113
- Improve CLI by @daimond113
- Support multiple targets for a single package by @daimond113
- Make registry much easier to self-host by @daimond113
- Start maintaining a changelog by @daimond113
- Optimize boolean expression in `publish` command by @daimond113
- Switched to fs-err for better errors with file system operations by @daimond113
- Use body bytes over multipart for publishing packages by @daimond113
- `self-upgrade` now will check for updates by itself by default by @daimond113
- Only store `pesde_version` executables in the version cache by @daimond113
- Remove lower bound limit of 3 characters for pesde package names by @daimond113
#### Miscellaneous Tasks
- :bug: show logo on all platforms by @daimond113
### Performance
- Clone dependency repos shallowly by @daimond113
- Switch to async Rust by @daimond113
- Asyncify dependency linking by @daimond113
- Use `exec` in Unix bin linking to reduce the number of processes by @daimond113
#### Refactor
- :art: use static variables by @daimond113
- :zap: store index files as btreemaps by @daimond113
## [0.1.4] - 2024-03-16
### Details
#### Features
- :sparkles: add repository field by @daimond113
- :rocket: create website by @daimond113
- :sparkles: add listing newest packages by @daimond113
## [0.1.3] - 2024-03-10
### Details
#### Features
- :sparkles: add init, add, remove, and outdated commands by @daimond113
- :sparkles: package versions endpoint by @daimond113
## [0.1.2] - 2024-03-06
### Details
#### Features
- :sparkles: add ratelimits by @daimond113
#### Miscellaneous Tasks
- :rocket: setup crates.io publishing by @daimond113
## [0.1.1] - 2024-03-04
### Details
#### Bug Fixes
- :passport_control: properly handle missing api token entry by @daimond113
#### Documentation
- :memo: update README by @daimond113
## [0.1.0] - 2024-03-04
### Details
#### Features
- :tada: initial commit by @daimond113
[0.4.0]: https://github.com/daimond113/pesde/compare/v0.3.2..v0.4.0
[0.3.2]: https://github.com/daimond113/pesde/compare/v0.3.1..v0.3.2
[0.3.1]: https://github.com/daimond113/pesde/compare/v0.3.0..v0.3.1
[0.3.0]: https://github.com/daimond113/pesde/compare/v0.2.0..v0.3.0
[0.2.0]: https://github.com/daimond113/pesde/compare/v0.1.4..v0.2.0
[0.1.4]: https://github.com/daimond113/pesde/compare/v0.1.3..v0.1.4
[0.1.3]: https://github.com/daimond113/pesde/compare/v0.1.2..v0.1.3
[0.1.2]: https://github.com/daimond113/pesde/compare/v0.1.1..v0.1.2
[0.1.1]: https://github.com/daimond113/pesde/compare/v0.1.0..v0.1.1
<!-- generated by git-cliff -->
[0.5.3]: https://github.com/daimond113/pesde/compare/v0.5.2%2Bregistry.0.1.1..v0.5.3%2Bregistry.0.1.2
[0.5.2]: https://github.com/daimond113/pesde/compare/v0.5.1%2Bregistry.0.1.0..v0.5.2%2Bregistry.0.1.1
[0.5.1]: https://github.com/daimond113/pesde/compare/v0.5.0%2Bregistry.0.1.0..v0.5.1%2Bregistry.0.1.0
[0.5.0]: https://github.com/daimond113/pesde/compare/v0.4.7..v0.5.0%2Bregistry.0.1.0
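The compare links above percent-encode the `+` that introduces SemVer build metadata (e.g. `v0.5.3+registry.0.1.2` becomes `v0.5.3%2Bregistry.0.1.2`), since a literal `+` is not a safe character in that URL position. The same encoding can be reproduced with Python's standard library:

```python
from urllib.parse import quote

tag = "v0.5.2+registry.0.1.1"
print(quote(tag))  # → v0.5.2%2Bregistry.0.1.1
```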

Cargo.lock generated (4911 changed lines); file diff suppressed because it is too large


@@ -1,68 +1,107 @@
[package]
name = "pesde"
version = "0.4.0"
version = "0.5.3"
edition = "2021"
license = "MIT"
authors = ["daimond113 <contact@daimond113.com>"]
description = "A package manager for Roblox"
description = "A package manager for the Luau programming language, supporting multiple runtimes including Roblox and Lune"
homepage = "https://pesde.daimond113.com"
repository = "https://github.com/pesde-pkg/pesde"
include = ["src/**/*", "Cargo.toml", "Cargo.lock", "README.md", "LICENSE", "CHANGELOG.md"]
[features]
bin = ["clap", "directories", "keyring", "anyhow", "ignore", "pretty_env_logger", "serde_json", "reqwest/json", "reqwest/multipart", "lune", "futures-executor", "indicatif", "auth-git2", "indicatif-log-bridge", "inquire", "once_cell"]
wally = ["toml", "zip", "serde_json"]
bin = [
"dep:clap",
"dep:dirs",
"dep:tracing-subscriber",
"reqwest/json",
"dep:indicatif",
"dep:tracing-indicatif",
"dep:inquire",
"dep:toml_edit",
"dep:colored",
"dep:anyhow",
"dep:keyring",
"dep:open",
"gix/worktree-mutation",
"dep:serde_json",
"dep:winreg",
"fs-err/expose_original_error",
"tokio/rt",
"tokio/rt-multi-thread",
"tokio/macros",
]
wally-compat = ["dep:async_zip", "dep:serde_json"]
patches = ["dep:git2"]
version-management = ["bin"]
[[bin]]
name = "pesde"
path = "src/main.rs"
required-features = ["bin"]
[lints.clippy]
uninlined_format_args = "warn"
[dependencies]
serde = { version = "1.0.197", features = ["derive"] }
serde_yaml = "0.9.33"
git2 = "0.18.3"
semver = { version = "1.0.22", features = ["serde"] }
reqwest = { version = "0.12.1", default-features = false, features = ["rustls-tls", "blocking"] }
tar = "0.4.40"
flate2 = "1.0.28"
pathdiff = "0.2.1"
relative-path = { version = "1.9.2", features = ["serde"] }
log = "0.4.21"
thiserror = "1.0.58"
threadpool = "1.8.1"
full_moon = { version = "0.19.0", features = ["stacker", "roblox"] }
url = { version = "2.5.0", features = ["serde"] }
cfg-if = "1.0.0"
serde = { version = "1.0.216", features = ["derive"] }
toml = "0.8.19"
serde_with = "3.11.0"
gix = { version = "0.68.0", default-features = false, features = ["blocking-http-transport-reqwest-rust-tls", "revparse-regex", "credentials", "parallel"] }
semver = { version = "1.0.24", features = ["serde"] }
reqwest = { version = "0.12.9", default-features = false, features = ["rustls-tls"] }
tokio-tar = "0.3.1"
async-compression = { version = "0.4.18", features = ["tokio", "gzip"] }
pathdiff = "0.2.3"
relative-path = { version = "1.9.3", features = ["serde"] }
tracing = { version = "0.1.41", features = ["attributes"] }
thiserror = "2.0.7"
tokio = { version = "1.42.0", features = ["process"] }
tokio-util = "0.7.13"
async-stream = "0.3.6"
futures = "0.3.31"
full_moon = { version = "1.1.2", features = ["luau"] }
url = { version = "2.5.4", features = ["serde"] }
chrono = { version = "0.4.39", features = ["serde"] }
sha2 = "0.10.8"
tempfile = "3.14.0"
wax = { version = "0.6.0", default-features = false }
fs-err = { version = "3.0.0", features = ["tokio"] }
toml = { version = "0.8.12", optional = true }
zip = { version = "0.6.6", optional = true }
# TODO: remove this when gitoxide adds support for: committing, pushing, adding
git2 = { version = "0.19.0", optional = true }
# chrono-lc breaks because of https://github.com/chronotope/chrono/compare/v0.4.34...v0.4.35#diff-67de5678fb5c14378bbff7ecf7f8bfab17cc223c4726f8da3afca183a4e59543
chrono = { version = "=0.4.34", features = ["serde"] }
async_zip = { version = "0.0.17", features = ["tokio", "deflate", "deflate64", "tokio-fs"], optional = true }
serde_json = { version = "1.0.133", optional = true }
clap = { version = "4.5.3", features = ["derive"], optional = true }
directories = { version = "5.0.1", optional = true }
keyring = { version = "2.3.2", optional = true }
anyhow = { version = "1.0.81", optional = true }
ignore = { version = "0.4.22", optional = true }
pretty_env_logger = { version = "0.5.0", optional = true }
serde_json = { version = "1.0.114", optional = true }
lune = { version = "0.8.2", optional = true }
futures-executor = { version = "0.3.30", optional = true }
indicatif = { version = "0.17.8", optional = true }
auth-git2 = { version = "0.5.4", optional = true }
indicatif-log-bridge = { version = "0.2.2", optional = true }
inquire = { version = "0.7.3", optional = true }
once_cell = { version = "1.19.0", optional = true }
anyhow = { version = "1.0.94", optional = true }
open = { version = "5.3.1", optional = true }
keyring = { version = "3.6.1", features = ["crypto-rust", "windows-native", "apple-native", "async-secret-service", "async-io"], optional = true }
colored = { version = "2.1.0", optional = true }
toml_edit = { version = "0.22.22", optional = true }
clap = { version = "4.5.23", features = ["derive"], optional = true }
dirs = { version = "5.0.1", optional = true }
tracing-subscriber = { version = "0.3.19", features = ["env-filter"], optional = true }
indicatif = { version = "0.17.9", optional = true }
tracing-indicatif = { version = "0.3.8", optional = true }
inquire = { version = "0.7.5", optional = true }
[dev-dependencies]
tempfile = "3.10.1"
[target.'cfg(target_os = "windows")'.dependencies]
winreg = { version = "0.52.0", optional = true }
[workspace]
resolver = "2"
members = [
"registry"
]
members = ["registry"]
[profile.dev.package.full_moon]
opt-level = 3
opt-level = 3
[profile.release]
opt-level = "s"
lto = true
incremental = true
codegen-units = 1
[profile.release.package.pesde-registry]
# add debug symbols for Sentry stack traces
debug = "full"


@@ -1,9 +1,13 @@
FROM rust:1.76
FROM rust:1.82-bookworm AS builder
COPY . .
WORKDIR /registry
RUN cargo build --release -p pesde-registry
RUN cargo install --path .
FROM debian:bookworm-slim
CMD ["pesde-registry"]
COPY --from=builder /target/release/pesde-registry /usr/local/bin/
RUN apt-get update && apt-get install -y ca-certificates
CMD ["/usr/local/bin/pesde-registry"]


@@ -1,74 +1,52 @@
<br>
<div align="center">
<img src="https://raw.githubusercontent.com/daimond113/pesde/master/website/static/logo.svg" alt="pesde" width="200" />
<img src="https://raw.githubusercontent.com/pesde-pkg/pesde/0.5/assets/logotype.svg" alt="pesde logo" width="200" />
</div>
<br>
pesde is a package manager for Roblox that is designed to be feature-rich and easy to use.
Currently, pesde is in a very early stage of development, but already supports the following features:
- Managing dependencies
- Re-exporting types
- `bin` exports (ran with Lune)
- Patching packages
- Downloading packages from Wally registries
pesde is a package manager for the Luau programming language, supporting
multiple runtimes including Roblox and Lune. pesde has its own registry, however
it can also use Wally, and Git repositories as package sources. It has been
designed with multiple targets in mind, namely Roblox, Lune, and Luau.
## Installation
pesde can be installed from GitHub Releases. You can find the latest release [here](https://github.com/daimond113/pesde/releases).
It can also be installed by using [Aftman](https://github.com/LPGhatguy/aftman).
## Usage
pesde is designed to be easy to use. Here are some examples of how to use it:
pesde can be installed from GitHub Releases. You can find the latest release
[here](https://github.com/pesde-pkg/pesde/releases). Once you have downloaded
the binary, run the following command to install it:
```sh
# Initialize a new project
pesde init
# Install a package
pesde add daimond113/pesde@0.1.0
# Remove a package
pesde remove daimond113/pesde
# List outdated packages
pesde outdated
# Install all packages
pesde install
# Search for a package
pesde search pesde
# Run a binary
pesde run daimond113/pesde
# Run a binary with arguments
pesde run daimond113/pesde -- --help
pesde self-install
```
## Preparing to publish
Note that pesde manages its own versions, so you can update it by running the
following command:
To publish you must first initialize a new project with `pesde init`. You can then use the other commands to manipulate dependencies, and edit the file
manually to add metadata such as authors, description, and license.
```sh
pesde self-upgrade
```
> **Warning**
> The pesde CLI respects the `.gitignore` file and will not include files that are ignored. The `.pesdeignore` file has more power over the `.gitignore` file, so you can unignore files by prepending a `!` to the pattern.
## Documentation
The pesde CLI supports the `.pesdeignore` file, which is similar to `.gitignore`. It can be used to include or exclude files from the package.
For more information about its usage, you can check the
[documentation](https://docs.pesde.daimond113.com).
## Registry
The main pesde registry is hosted on [fly.io](https://fly.io). You can find it at https://registry.pesde.daimond113.com.
The main pesde registry is hosted on [fly.io](https://fly.io). You can find it
at https://registry.pesde.daimond113.com.
### Self-hosting
You can self-host the registry by using the default implementation in the `registry` folder, or by creating your own implementation. The API
must be compatible with the default implementation, which can be found in the `main.rs` file.
The registry tries to require no modifications to be self-hosted. Please refer
to the
[documentation](http://docs.pesde.daimond113.com/guides/self-hosting-registries)
for more information.
## Previous art
pesde is heavily inspired by [npm](https://www.npmjs.com/), [pnpm](https://pnpm.io/), [Wally](https://wally.run), and [Cargo](https://doc.rust-lang.org/cargo/).
pesde is heavily inspired by [npm](https://www.npmjs.com/),
[pnpm](https://pnpm.io/), [Wally](https://wally.run), and
[Cargo](https://doc.rust-lang.org/cargo/).

SECURITY.md Normal file (25 changed lines)

@@ -0,0 +1,25 @@
# Security Policy
## Supported Versions
As pesde is currently in version 0.x, we can only guarantee security for:
- **The latest minor** (currently 0.5).
- **The latest release candidate for the next version**, if available.
When a new minor version is released, the previous version will immediately lose security support.
> **Note:** This policy will change with the release of version 1.0, which will include an extended support period for versions >=1.0.
| Version | Supported |
| ------- | ------------------ |
| 0.5.x | :white_check_mark: |
| < 0.5 | :x: |
## Reporting a Vulnerability
We encourage all security concerns to be reported at [pesde@daimond113.com](mailto:pesde@daimond113.com), along the following format:
- **Subject**: The subject must be prefixed with `[SECURITY]` to ensure it is prioritized as a security concern.
- **Content**:
- **Affected Versions**: Clearly specify which versions are affected by the issue.
- **Issue Details**: Provide a detailed description of the issue, including reproduction steps and/or a simple example, if applicable.
We will try to respond as soon as possible.

3
assets/logomark.svg Normal file

@ -0,0 +1,3 @@
<svg viewBox="0 0 100 100" fill="none" xmlns="http://www.w3.org/2000/svg">
<path fill-rule="evenodd" clip-rule="evenodd" d="M49.6025 0L92.9038 25V75L49.6025 100L6.30127 75V25L49.6025 0ZM14.3013 29.6188L49.6025 9.2376L84.9038 29.6188V70.3812L49.6025 90.7624L33.6148 81.5319V67.3848C34.5167 68.5071 35.6388 69.4215 36.981 70.1279C38.9701 71.148 41.0357 71.658 43.1779 71.658C46.442 71.658 49.1452 70.8929 51.2873 69.3629C53.4805 67.7818 55.1126 65.7672 56.1836 63.319C57.0915 61.3382 57.632 59.274 57.8054 57.1263C59.8723 57.7457 62.2157 58.0554 64.8356 58.0554C67.6918 58.0554 70.3695 57.6473 72.8686 56.8313C75.3678 55.9642 77.4079 54.8167 78.989 53.3886L75.7758 47.8038C74.5517 48.9258 72.9961 49.8439 71.109 50.5579C69.2219 51.221 67.2073 51.5525 65.0652 51.5525C61.3929 51.5525 58.6643 50.6854 56.8792 48.9513C56.7195 48.7962 56.567 48.6365 56.4217 48.472C55.6102 47.5539 55.0211 46.4896 54.6546 45.2791L54.6443 45.2452L54.669 45.2791H79.2185V41.9894C79.2185 39.0313 78.5555 36.3536 77.2294 33.9565C75.9543 31.5593 74.0927 29.6467 71.6445 28.2186C69.2474 26.7395 66.3657 26 62.9995 26C59.6843 26 56.8027 26.7395 54.3545 28.2186C51.9064 29.6467 50.0193 31.5593 48.6932 33.9565C47.6743 35.7983 47.0469 37.8057 46.8108 39.9788C45.6888 39.728 44.4778 39.6026 43.1779 39.6026C41.0357 39.6026 38.9701 40.1127 36.981 41.1327C35.3162 41.9651 33.9902 43.1549 33.0028 44.7023V40.3677H20.6855V46.2585H25.8113V77.0266L14.3013 70.3812V29.6188ZM55.1961 36.0986C54.6528 37.1015 54.3321 38.1216 54.234 39.1588H71.7976C71.7976 38.0367 71.4405 36.9401 70.7265 35.8691C70.0634 34.747 69.0689 33.8035 67.7428 33.0384C66.4677 32.2734 64.8867 31.8908 62.9995 31.8908C61.1124 31.8908 59.5058 32.2989 58.1798 33.1149C56.9047 33.88 55.9101 34.8745 55.1961 36.0986ZM49.6451 51.5692C49.3076 50.6641 48.8381 49.871 48.2367 49.1898C48.0885 49.0219 47.9323 48.8609 47.7681 48.7067C46.085 47.0746 44.0449 46.2585 41.6478 46.2585C40.1177 46.2585 38.6131 46.5645 37.134 47.1766C35.8594 47.6773 34.6863 48.5438 33.6148 49.7759V61.47C34.6863 62.6664 35.8594 63.5378 37.134 64.084C38.6131 64.6961 40.1177 
65.0021 41.6478 65.0021C44.0449 65.0021 46.085 64.1861 47.7681 62.554C49.4512 60.9219 50.2928 58.6012 50.2928 55.5921C50.2928 54.0679 50.0769 52.727 49.6451 51.5692Z" fill="#F19D1E"></path>
</svg>


7
assets/logotype.svg Normal file

@ -0,0 +1,7 @@
<svg viewBox="0 0 72 36" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M0 36V33.8343H2.90124V15.7321H0.081725V13.5664H5.72075L5.76161 15.4052L4.86264 15.8547C5.67989 15.0102 6.64697 14.3564 7.76387 13.8933C8.90802 13.403 10.0658 13.1578 11.2372 13.1578C12.681 13.1578 13.9886 13.4711 15.16 14.0976C16.3586 14.7242 17.2985 15.6504 17.9795 16.8763C18.6878 18.1022 19.0419 19.6005 19.0419 21.3712C19.0419 23.1964 18.6605 24.7628 17.8978 26.0704C17.135 27.3508 16.1134 28.3315 14.8331 29.0125C13.58 29.6935 12.2179 30.0341 10.7468 30.0341C9.92959 30.0341 9.1532 29.9251 8.41768 29.7072C7.70939 29.4892 7.08283 29.2168 6.538 28.8899C5.99317 28.5358 5.58454 28.1408 5.31212 27.7049L5.92506 27.5823L5.8842 33.8343H9.03061V36H0ZM10.2156 27.7866C11.8229 27.7866 13.1713 27.269 14.261 26.2339C15.3507 25.1714 15.8955 23.6323 15.8955 21.6164C15.8955 19.655 15.3779 18.143 14.3427 17.0806C13.3076 16.0182 12 15.487 10.4199 15.487C9.92959 15.487 9.34389 15.5823 8.66285 15.773C7.98181 15.9365 7.30077 16.2089 6.61973 16.5903C5.93868 16.9716 5.36661 17.4756 4.9035 18.1022L5.8842 16.3859L5.92506 26.6425L4.98522 25.3757C5.69351 26.1385 6.51076 26.7378 7.43697 27.1737C8.36319 27.5823 9.28941 27.7866 10.2156 27.7866Z" fill="#F19D1E"/>
<path d="M23.6185 22.8148C21.9295 22.8148 20.3904 22.4606 19.0011 21.7523C17.639 21.044 16.5493 20.0497 15.7321 18.7694C14.9421 17.489 14.547 16.0043 14.547 14.3153C14.547 12.5719 14.9284 11.0736 15.6912 9.82045C16.454 8.54009 17.4755 7.55939 18.7559 6.87835C20.0362 6.17007 21.4664 5.81593 23.0464 5.81593C24.6809 5.81593 26.0839 6.15645 27.2553 6.83749C28.4539 7.51853 29.3665 8.48561 29.9931 9.73873C30.6196 10.9918 30.9329 12.4629 30.9329 14.1519V14.9283H17.2031L17.244 13.0486H27.9499C27.9499 11.9862 27.732 11.0872 27.2962 10.3517C26.8875 9.5889 26.3154 9.01682 25.5799 8.63544C24.8444 8.22681 23.9999 8.0225 23.0464 8.0225C22.0385 8.0225 21.1259 8.26768 20.3087 8.75803C19.4914 9.22113 18.8376 9.90218 18.3473 10.8012C17.8842 11.6729 17.6526 12.7353 17.6526 13.9884C17.6526 15.3233 17.8978 16.4811 18.3881 17.4618C18.8785 18.4152 19.5595 19.1644 20.4312 19.7092C21.3302 20.2268 22.3518 20.4856 23.4959 20.4856C24.7218 20.4856 25.8523 20.2404 26.8875 19.7501C27.9227 19.2597 28.8217 18.6468 29.5844 17.9112L30.6469 19.7501C29.9113 20.5946 28.9579 21.3165 27.7865 21.9158C26.6151 22.5151 25.2258 22.8148 23.6185 22.8148Z" fill="#F19D1E"/>
<path d="M36.9257 29.0942C36.0812 29.0942 35.2639 28.9853 34.4739 28.7673C33.7111 28.5494 33.0437 28.2497 32.4716 27.8683C31.8996 27.487 31.4637 27.0375 31.164 26.5199L31.777 26.8876V28.6856H29.4069V23.1283H31.5318L31.777 25.7844L31.2049 23.8638C31.5046 24.8718 32.172 25.6482 33.2072 26.193C34.2696 26.7378 35.3729 27.0102 36.517 27.0102C37.5522 27.0102 38.4103 26.8059 39.0914 26.3973C39.7996 25.9887 40.1538 25.4166 40.1538 24.6811C40.1538 23.8093 39.8541 23.21 39.2548 22.8831C38.6827 22.529 37.7565 22.2157 36.4762 21.9433L34.2287 21.4529C32.7304 21.0716 31.5454 20.5676 30.6737 19.941C29.802 19.2872 29.3661 18.2111 29.3661 16.7129C29.3661 15.269 29.9245 14.1657 31.0415 13.403C32.1856 12.613 33.6022 12.218 35.2912 12.218C35.945 12.218 36.626 12.3133 37.3343 12.504C38.0698 12.6675 38.7372 12.9262 39.3365 13.2804C39.9359 13.6345 40.3581 14.0704 40.6033 14.588L39.9903 14.3837V12.5449H42.4012V18.0613H40.2764L39.7452 14.1794L40.4398 16.8763C40.3309 16.3859 40.0176 15.9637 39.5 15.6096C39.0096 15.2282 38.4239 14.9285 37.7429 14.7106C37.0891 14.4926 36.4353 14.3837 35.7815 14.3837C34.9098 14.3837 34.1198 14.5608 33.4115 14.9149C32.7304 15.269 32.3899 15.8411 32.3899 16.6311C32.3899 17.2304 32.6215 17.6936 33.0846 18.0205C33.5749 18.3474 34.3649 18.647 35.4546 18.9195L37.5386 19.3689C38.5738 19.6141 39.5 19.9138 40.3172 20.2679C41.1617 20.5948 41.8292 21.0852 42.3195 21.739C42.8099 22.3655 43.055 23.2645 43.055 24.4359C43.055 25.4983 42.7554 26.3837 42.1561 27.092C41.584 27.773 40.8212 28.277 39.8678 28.6039C38.9415 28.9308 37.9608 29.0942 36.9257 29.0942Z" fill="#F19D1E"/>
<path d="M47.7134 23.8637C46.2423 23.8637 44.8802 23.5368 43.6271 22.883C42.4012 22.202 41.4069 21.2213 40.6441 19.9409C39.8814 18.6606 39.5 17.135 39.5 15.3643C39.5 13.5391 39.8541 12.0136 40.5624 10.7877C41.2979 9.53458 42.265 8.58112 43.4636 7.92732C44.6623 7.27352 45.9699 6.94662 47.3865 6.94662C48.5306 6.94662 49.6611 7.16456 50.778 7.60042C51.9222 8.00905 52.9301 8.59475 53.8019 9.35751L52.8212 9.72527V2.12485H49.9608V0H55.845V21.2485H58.3785V23.4142H52.8212V21.2894L53.761 21.4528C52.971 22.2156 52.072 22.8149 51.0641 23.2508C50.0834 23.6594 48.9665 23.8637 47.7134 23.8637ZM48.3672 21.5754C49.4296 21.5754 50.4103 21.3711 51.3093 20.9625C52.2355 20.5539 53.08 20.009 53.8427 19.328L52.8212 20.7173V10.093L53.8427 11.5641C52.9982 10.8286 52.0448 10.2565 50.9823 9.84786C49.9472 9.43924 48.9529 9.23492 47.9994 9.23492C46.9642 9.23492 46.038 9.4801 45.2207 9.97045C44.4307 10.4608 43.8042 11.1827 43.3411 12.1362C42.878 13.0624 42.6464 14.2202 42.6464 15.6095C42.6464 16.8899 42.9052 17.9795 43.4228 18.8785C43.9404 19.7502 44.6214 20.4176 45.4659 20.8808C46.3377 21.3439 47.3047 21.5754 48.3672 21.5754Z" fill="#F19D1E"/>
<path d="M64.6955 31.1148C63.0066 31.1148 61.4674 30.7607 60.0781 30.0524C58.716 29.3441 57.6263 28.3498 56.8091 27.0694C56.0191 25.7891 55.6241 24.3044 55.6241 22.6154C55.6241 20.872 56.0055 19.3736 56.7682 18.1205C57.531 16.8402 58.5526 15.8594 59.8329 15.1784C61.1133 14.4701 62.5435 14.116 64.1235 14.116C65.758 14.116 67.1609 14.4565 68.3323 15.1375C69.5309 15.8186 70.4435 16.7857 71.0701 18.0388C71.6966 19.2919 72.0099 20.763 72.0099 22.452V23.2284H58.2801L58.321 21.3487H69.027C69.027 20.2863 68.809 19.3872 68.3732 18.6517C67.9645 17.889 67.3925 17.3169 66.6569 16.9355C65.9214 16.5269 65.0769 16.3226 64.1235 16.3226C63.1155 16.3226 62.2029 16.5677 61.3857 17.0581C60.5684 17.5212 59.9146 18.2022 59.4243 19.1012C58.9612 19.9729 58.7296 21.0354 58.7296 22.2885C58.7296 23.6234 58.9748 24.7811 59.4651 25.7618C59.9555 26.7153 60.6365 27.4644 61.5083 28.0093C62.4072 28.5269 63.4288 28.7857 64.573 28.7857C65.7988 28.7857 66.9294 28.5405 67.9645 28.0501C68.9997 27.5598 69.8987 26.9468 70.6615 26.2113L71.7239 28.0501C70.9884 28.8946 70.0349 29.6165 68.8635 30.2158C67.6921 30.8152 66.3028 31.1148 64.6955 31.1148Z" fill="#F19D1E"/>
</svg>



@ -1,86 +0,0 @@
[changelog]
header = """
# Changelog\n
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).\n
"""
body = """
{%- macro remote_url() -%}
https://github.com/{{ remote.github.owner }}/{{ remote.github.repo }}
{%- endmacro -%}
{% if version -%}
## [{{ version | trim_start_matches(pat="v") }}] - {{ timestamp | date(format="%Y-%m-%d") }}
{% else -%}
## [Unreleased]
{% endif -%}
### Details\
{% for group, commits in commits | group_by(attribute="group") %}
#### {{ group | upper_first }}
{%- for commit in commits %}
- {{ commit.message | upper_first | trim }}\
{% if commit.github.username %} by @{{ commit.github.username }}{%- endif -%}
{% if commit.github.pr_number %} in \
[#{{ commit.github.pr_number }}]({{ self::remote_url() }}/pull/{{ commit.github.pr_number }}) \
{%- endif -%}
{% endfor %}
{% endfor %}
{%- if github.contributors | filter(attribute="is_first_time", value=true) | length != 0 %}
## New Contributors
{%- endif -%}
{% for contributor in github.contributors | filter(attribute="is_first_time", value=true) %}
* @{{ contributor.username }} made their first contribution
{%- if contributor.pr_number %} in \
[#{{ contributor.pr_number }}]({{ self::remote_url() }}/pull/{{ contributor.pr_number }}) \
{%- endif %}
{%- endfor %}\n
"""
footer = """
{%- macro remote_url() -%}
https://github.com/{{ remote.github.owner }}/{{ remote.github.repo }}
{%- endmacro -%}
{% for release in releases -%}
{% if release.version -%}
{% if release.previous.version -%}
[{{ release.version | trim_start_matches(pat="v") }}]: \
{{ self::remote_url() }}/compare/{{ release.previous.version }}..{{ release.version }}
{% endif -%}
{% else -%}
[unreleased]: {{ self::remote_url() }}/compare/{{ release.previous.version }}..HEAD
{% endif -%}
{% endfor %}
<!-- generated by git-cliff -->
"""
trim = true
[git]
conventional_commits = true
filter_unconventional = true
split_commits = false
commit_parsers = [
{ message = "^feat", group = "Features" },
{ message = "^fix", group = "Bug Fixes" },
{ message = "^doc", group = "Documentation", default_scope = "unscoped" },
{ message = "^perf", group = "Performance" },
{ message = "^refactor", group = "Refactor" },
{ message = "^style", group = "Styling" },
{ message = "^test", group = "Testing" },
{ message = "^chore\\(release\\): prepare for", skip = true },
{ message = "^chore", group = "Miscellaneous Tasks" },
{ body = ".*security", group = "Security" },
]
protect_breaking_commits = true
filter_commits = true
tag_pattern = "v[0-9].*"
ignore_tags = ""
topo_order = true
sort_commits = "newest"

22
docs/.gitignore vendored Normal file

@ -0,0 +1,22 @@
# build output
dist/
# generated types
.astro/
.vercel/
# dependencies
node_modules/
# logs
npm-debug.log*
yarn-debug.log*
yarn-error.log*
pnpm-debug.log*
# environment variables
.env
.env.production
# macOS-specific files
.DS_Store

14
docs/.prettierrc Normal file

@ -0,0 +1,14 @@
{
"useTabs": true,
"printWidth": 100,
"semi": false,
"plugins": ["prettier-plugin-astro", "prettier-plugin-tailwindcss"],
"overrides": [
{
"files": "*.astro",
"options": {
"parser": "astro"
}
}
]
}

107
docs/astro.config.mjs Normal file

@ -0,0 +1,107 @@
import starlight from "@astrojs/starlight"
import tailwind from "@astrojs/tailwind"
import { defineConfig } from "astro/config"
// https://astro.build/config
export default defineConfig({
site: "https://docs.pesde.daimond113.com",
integrations: [
starlight({
title: "pesde docs",
social: {
github: "https://github.com/pesde-pkg/pesde",
},
sidebar: [
{
label: "Intro",
items: [{ slug: "" }, { slug: "installation" }, { slug: "quickstart" }],
},
{
label: "Guides",
autogenerate: { directory: "guides" },
},
{
label: "Reference",
autogenerate: { directory: "reference" },
},
{
label: "Registry",
autogenerate: { directory: "registry" },
},
],
components: {
SiteTitle: "./src/components/SiteTitle.astro",
},
customCss: ["./src/tailwind.css", "@fontsource-variable/nunito-sans"],
favicon: "/favicon.ico",
head: [
{
tag: "meta",
attrs: {
name: "theme-color",
content: "#F19D1E",
},
},
{
tag: "meta",
attrs: {
property: "og:image",
content: "/favicon-48x48.png",
},
},
{
tag: "meta",
attrs: {
name: "twitter:card",
content: "summary",
},
},
{
tag: "link",
attrs: {
rel: "icon",
type: "image/png",
href: "/favicon-48x48.png",
sizes: "48x48",
},
},
{
tag: "link",
attrs: {
rel: "icon",
type: "image/svg+xml",
href: "/favicon.svg",
},
},
{
tag: "link",
attrs: {
rel: "apple-touch-icon",
sizes: "180x180",
href: "/apple-touch-icon.png",
},
},
{
tag: "meta",
attrs: {
name: "apple-mobile-web-app-title",
content: "pesde docs",
},
},
{
tag: "link",
attrs: {
rel: "manifest",
href: "/site.webmanifest",
},
},
],
}),
tailwind({
applyBaseStyles: false,
}),
],
vite: {
envDir: "..",
},
})

BIN
docs/bun.lockb Executable file


29
docs/package.json Normal file

@ -0,0 +1,29 @@
{
"name": "docs",
"type": "module",
"version": "0.0.1",
"scripts": {
"dev": "astro dev",
"start": "astro dev",
"build": "astro check && astro build",
"preview": "astro preview",
"astro": "astro"
},
"dependencies": {
"@astrojs/check": "^0.9.3",
"@astrojs/starlight": "^0.28.2",
"@astrojs/starlight-tailwind": "^2.0.3",
"@astrojs/tailwind": "^5.1.1",
"@fontsource-variable/nunito-sans": "^5.1.0",
"@shikijs/rehype": "^1.21.0",
"astro": "^4.15.9",
"sharp": "^0.33.5",
"shiki": "^1.21.0",
"tailwindcss": "^3.4.13",
"typescript": "^5.6.2"
},
"devDependencies": {
"prettier-plugin-astro": "^0.14.1",
"prettier-plugin-tailwindcss": "^0.6.8"
}
}


BIN
docs/public/favicon.ico Normal file


3
docs/public/favicon.svg Normal file

@ -0,0 +1,3 @@
<svg viewBox="0 0 100 100" fill="none" xmlns="http://www.w3.org/2000/svg">
<path fill-rule="evenodd" clip-rule="evenodd" d="M49.6025 0L92.9038 25V75L49.6025 100L6.30127 75V25L49.6025 0ZM14.3013 29.6188L49.6025 9.2376L84.9038 29.6188V70.3812L49.6025 90.7624L33.6148 81.5319V67.3848C34.5167 68.5071 35.6388 69.4215 36.981 70.1279C38.9701 71.148 41.0357 71.658 43.1779 71.658C46.442 71.658 49.1452 70.8929 51.2873 69.3629C53.4805 67.7818 55.1126 65.7672 56.1836 63.319C57.0915 61.3382 57.632 59.274 57.8054 57.1263C59.8723 57.7457 62.2157 58.0554 64.8356 58.0554C67.6918 58.0554 70.3695 57.6473 72.8686 56.8313C75.3678 55.9642 77.4079 54.8167 78.989 53.3886L75.7758 47.8038C74.5517 48.9258 72.9961 49.8439 71.109 50.5579C69.2219 51.221 67.2073 51.5525 65.0652 51.5525C61.3929 51.5525 58.6643 50.6854 56.8792 48.9513C56.7195 48.7962 56.567 48.6365 56.4217 48.472C55.6102 47.5539 55.0211 46.4896 54.6546 45.2791L54.6443 45.2452L54.669 45.2791H79.2185V41.9894C79.2185 39.0313 78.5555 36.3536 77.2294 33.9565C75.9543 31.5593 74.0927 29.6467 71.6445 28.2186C69.2474 26.7395 66.3657 26 62.9995 26C59.6843 26 56.8027 26.7395 54.3545 28.2186C51.9064 29.6467 50.0193 31.5593 48.6932 33.9565C47.6743 35.7983 47.0469 37.8057 46.8108 39.9788C45.6888 39.728 44.4778 39.6026 43.1779 39.6026C41.0357 39.6026 38.9701 40.1127 36.981 41.1327C35.3162 41.9651 33.9902 43.1549 33.0028 44.7023V40.3677H20.6855V46.2585H25.8113V77.0266L14.3013 70.3812V29.6188ZM55.1961 36.0986C54.6528 37.1015 54.3321 38.1216 54.234 39.1588H71.7976C71.7976 38.0367 71.4405 36.9401 70.7265 35.8691C70.0634 34.747 69.0689 33.8035 67.7428 33.0384C66.4677 32.2734 64.8867 31.8908 62.9995 31.8908C61.1124 31.8908 59.5058 32.2989 58.1798 33.1149C56.9047 33.88 55.9101 34.8745 55.1961 36.0986ZM49.6451 51.5692C49.3076 50.6641 48.8381 49.871 48.2367 49.1898C48.0885 49.0219 47.9323 48.8609 47.7681 48.7067C46.085 47.0746 44.0449 46.2585 41.6478 46.2585C40.1177 46.2585 38.6131 46.5645 37.134 47.1766C35.8594 47.6773 34.6863 48.5438 33.6148 49.7759V61.47C34.6863 62.6664 35.8594 63.5378 37.134 64.084C38.6131 64.6961 40.1177 
65.0021 41.6478 65.0021C44.0449 65.0021 46.085 64.1861 47.7681 62.554C49.4512 60.9219 50.2928 58.6012 50.2928 55.5921C50.2928 54.0679 50.0769 52.727 49.6451 51.5692Z" fill="#F19D1E"></path>
</svg>



@ -0,0 +1,21 @@
{
"name": "pesde",
"short_name": "pesde",
"icons": [
{
"src": "/web-app-manifest-192x192.png",
"sizes": "192x192",
"type": "image/png",
"purpose": "maskable"
},
{
"src": "/web-app-manifest-512x512.png",
"sizes": "512x512",
"type": "image/png",
"purpose": "maskable"
}
],
"theme_color": "#f19d1e",
"background_color": "#0a0704",
"display": "standalone"
}



@ -0,0 +1,36 @@
<div class="flex items-center">
<a
href="https://pesde.daimond113.com/"
class="flex text-[var(--sl-color-text-accent)] hover:opacity-80"
>
<svg
viewBox="0 0 56 28"
class="h-7"
fill="none"
xmlns="http://www.w3.org/2000/svg"
>
<title>pesde</title>
<path
d="M0 28V26.3156H2.25652V12.2361H0.0635639V10.5517H4.44947L4.48125 11.9819L3.78205 12.3315C4.41769 11.6746 5.16986 11.1661 6.03857 10.8059C6.92846 10.4245 7.82895 10.2338 8.74003 10.2338C9.863 10.2338 10.88 10.4775 11.7911 10.9648C12.7234 11.4522 13.4544 12.1726 13.9841 13.126C14.5349 14.0795 14.8104 15.2448 14.8104 16.6221C14.8104 18.0416 14.5138 19.26 13.9205 20.277C13.3272 21.2728 12.5327 22.0356 11.5368 22.5653C10.5622 23.095 9.5028 23.3598 8.35865 23.3598C7.72301 23.3598 7.11916 23.2751 6.54708 23.1056C5.99619 22.9361 5.50887 22.7242 5.08511 22.4699C4.66135 22.1945 4.34353 21.8873 4.13165 21.5483L4.60838 21.4529L4.5766 26.3156H7.02381V28H0ZM7.94549 21.6118C9.19558 21.6118 10.2444 21.2092 11.0919 20.4041C11.9394 19.5778 12.3632 18.3807 12.3632 16.8127C12.3632 15.2872 11.9606 14.1113 11.1555 13.2849C10.3503 12.4586 9.3333 12.0454 8.1044 12.0454C7.72301 12.0454 7.26747 12.1196 6.73777 12.2679C6.20807 12.395 5.67837 12.6069 5.14867 12.9035C4.61898 13.2002 4.17403 13.5922 3.81383 14.0795L4.5766 12.7446L4.60838 20.7219L3.8774 19.7367C4.42828 20.3299 5.06392 20.7961 5.78431 21.1351C6.5047 21.4529 7.2251 21.6118 7.94549 21.6118Z"
fill="currentColor"></path>
<path
d="M18.37 17.7448C17.0563 17.7448 15.8592 17.4694 14.7786 16.9185C13.7192 16.3676 12.8717 15.5942 12.236 14.5984C11.6216 13.6026 11.3144 12.4478 11.3144 11.1341C11.3144 9.77811 11.611 8.61277 12.2043 7.63813C12.7975 6.64229 13.5921 5.87953 14.5879 5.34983C15.5837 4.79894 16.6961 4.5235 17.925 4.5235C19.1963 4.5235 20.2875 4.78835 21.1986 5.31805C22.1308 5.84775 22.8406 6.59992 23.3279 7.57456C23.8153 8.54921 24.0589 9.69336 24.0589 11.007V11.6109H13.3802L13.412 10.1489H21.7388C21.7388 9.32257 21.5693 8.62337 21.2303 8.05129C20.9125 7.45803 20.4676 7.01308 19.8955 6.71645C19.3234 6.39863 18.6666 6.23972 17.925 6.23972C17.1411 6.23972 16.4313 6.43042 15.7956 6.8118C15.16 7.17199 14.6515 7.70169 14.2701 8.4009C13.9099 9.07891 13.7298 9.90524 13.7298 10.8799C13.7298 11.9181 13.9205 12.8186 14.3019 13.5814C14.6833 14.3229 15.213 14.9056 15.891 15.3294C16.5902 15.732 17.3847 15.9332 18.2746 15.9332C19.2281 15.9332 20.1074 15.7425 20.9125 15.3612C21.7177 14.9798 22.4169 14.503 23.0101 13.931L23.8365 15.3612C23.2644 16.018 22.5228 16.5795 21.6117 17.0456C20.7006 17.5117 19.6201 17.7448 18.37 17.7448Z"
fill="currentColor"></path>
<path
d="M28.7199 22.6288C28.0631 22.6288 27.4275 22.5441 26.813 22.3746C26.2198 22.2051 25.7007 21.972 25.2557 21.6754C24.8108 21.3788 24.4718 21.0292 24.2387 20.6266L24.7154 20.9126V22.311H22.8721V17.9887H24.5247L24.7154 20.0545L24.2705 18.5608C24.5035 19.3447 25.0227 19.9486 25.8278 20.3723C26.6541 20.7961 27.5122 21.008 28.4021 21.008C29.2073 21.008 29.8747 20.8491 30.4044 20.5312C30.9553 20.2134 31.2307 19.7685 31.2307 19.1964C31.2307 18.5184 30.9977 18.0522 30.5315 17.798C30.0866 17.5225 29.3662 17.2789 28.3703 17.067L26.6223 16.6856C25.457 16.389 24.5353 15.997 23.8573 15.5097C23.1793 15.0012 22.8403 14.1642 22.8403 12.9989C22.8403 11.8759 23.2746 11.0178 24.1434 10.4245C25.0332 9.81009 26.135 9.50286 27.4487 9.50286C27.9572 9.50286 28.4869 9.57702 29.0378 9.72534C29.6098 9.85246 30.129 10.0538 30.5951 10.3292C31.0612 10.6046 31.3896 10.9436 31.5803 11.3462L31.1036 11.1873V9.75712H32.9787V14.0477H31.3261L30.9129 11.0284L31.4532 13.126C31.3684 12.7446 31.1248 12.4162 30.7222 12.1408C30.3408 11.8441 29.8853 11.6111 29.3556 11.4416C28.8471 11.2721 28.3386 11.1873 27.8301 11.1873C27.152 11.1873 26.5376 11.325 25.9867 11.6005C25.457 11.8759 25.1922 12.3209 25.1922 12.9353C25.1922 13.4015 25.3723 13.7617 25.7325 14.0159C26.1138 14.2702 26.7283 14.5033 27.5758 14.7151L29.1967 15.0647C30.0018 15.2554 30.7222 15.4885 31.3579 15.7639C32.0147 16.0182 32.5338 16.3996 32.9152 16.9081C33.2966 17.3954 33.4872 18.0946 33.4872 19.0057C33.4872 19.832 33.2542 20.5206 32.788 21.0715C32.3431 21.6012 31.7498 21.9932 31.0083 22.2475C30.2879 22.5017 29.5251 22.6288 28.7199 22.6288Z"
fill="currentColor"></path>
<path
d="M37.1104 18.5607C35.9662 18.5607 34.9068 18.3064 33.9322 17.7979C32.9787 17.2682 32.2054 16.5054 31.6121 15.5096C31.0188 14.5138 30.7222 13.3272 30.7222 11.95C30.7222 10.5304 30.9977 9.34389 31.5485 8.39043C32.1206 7.41579 32.8728 6.67421 33.8051 6.1657C34.7373 5.65719 35.7544 5.40293 36.8561 5.40293C37.746 5.40293 38.6253 5.57243 39.494 5.91144C40.3839 6.22926 41.1679 6.6848 41.8459 7.27807L41.0831 7.5641V1.65266H38.8584V0H43.435V16.5266H45.4055V18.2111H41.0831V16.5584L41.8141 16.6855C41.1997 17.2788 40.5005 17.7449 39.7165 18.0839C38.9537 18.4018 38.085 18.5607 37.1104 18.5607ZM37.6189 16.7809C38.4452 16.7809 39.208 16.622 39.9072 16.3042C40.6276 15.9863 41.2844 15.5626 41.8777 15.0329L41.0831 16.1135V7.85014L41.8777 8.99429C41.2208 8.42221 40.4793 7.97727 39.6529 7.65945C38.8478 7.34163 38.0744 7.18272 37.3329 7.18272C36.5277 7.18272 35.8073 7.37341 35.1717 7.75479C34.5572 8.13618 34.0699 8.69766 33.7097 9.43924C33.3495 10.1596 33.1694 11.0601 33.1694 12.1407C33.1694 13.1366 33.3707 13.9841 33.7733 14.6833C34.1759 15.3613 34.7056 15.8804 35.3624 16.2406C36.0404 16.6008 36.7926 16.7809 37.6189 16.7809Z"
fill="currentColor"></path>
<path
d="M50.3188 24.2004C49.0051 24.2004 47.808 23.925 46.7274 23.3741C45.668 22.8232 44.8205 22.0498 44.1848 21.054C43.5704 20.0582 43.2632 18.9034 43.2632 17.5898C43.2632 16.2337 43.5598 15.0684 44.1531 14.0937C44.7463 13.0979 45.5409 12.3351 46.5367 11.8054C47.5325 11.2545 48.6449 10.9791 49.8738 10.9791C51.1451 10.9791 52.2363 11.2439 53.1474 11.7736C54.0796 12.3033 54.7894 13.0555 55.2767 14.0302C55.7641 15.0048 56.0077 16.149 56.0077 17.4626V18.0665H45.329L45.3608 16.6045H53.6876C53.6876 15.7782 53.5181 15.079 53.1791 14.5069C52.8613 13.9136 52.4164 13.4687 51.8443 13.172C51.2722 12.8542 50.6154 12.6953 49.8738 12.6953C49.0899 12.6953 48.3801 12.886 47.7444 13.2674C47.1088 13.6276 46.6003 14.1573 46.2189 14.8565C45.8587 15.5345 45.6786 16.3609 45.6786 17.3355C45.6786 18.3737 45.8693 19.2742 46.2507 20.037C46.6321 20.7786 47.1617 21.3612 47.8398 21.785C48.539 22.1876 49.3335 22.3888 50.2234 22.3888C51.1769 22.3888 52.0562 22.1982 52.8613 21.8168C53.6665 21.4354 54.3657 20.9587 54.9589 20.3866L55.7853 21.8168C55.2132 22.4736 54.4716 23.0351 53.5605 23.5012C52.6494 23.9673 51.5688 24.2004 50.3188 24.2004Z"
fill="currentColor"></path>
</svg>
</a>
<span class="-mt-px ml-2.5 mr-2 text-xl text-[var(--sl-color-gray-5)]">/</span
>
<a
class="font-medium text-[var(--sl-color-gray-2)] no-underline hover:opacity-80 md:text-lg"
href="/">docs</a
>
</div>


@ -0,0 +1,6 @@
import { defineCollection } from "astro:content"
import { docsSchema } from "@astrojs/starlight/schema"
export const collections = {
docs: defineCollection({ schema: docsSchema() }),
}


@ -0,0 +1,74 @@
---
title: Using Binary Packages
description: Learn how to use binary packages.
---
A **binary package** is a package that contains a binary export.
Binary packages can be run like a normal program. There are several ways to use
binary packages with pesde.
## Using a binary package
### With `pesde x`
The `pesde x` command can be used to run a one-off binary package. This is
useful for running a binary package without installing it or outside of a pesde
project.
```sh
pesde x pesde/hello
# Hello, pesde! (pesde/hello@1.0.0, lune)
```
### By installing
Binary packages can be installed using the `pesde add` and `pesde install`
commands.
This requires a `pesde.toml` file to be present in the current directory, and
will add the binary package to the `dependencies` section of the file.
```sh
pesde add pesde/hello
pesde install
```
This will add the binary package to your `PATH`, meaning that it can be run
anywhere in a project which has it installed under that alias!
```sh
hello
# Hello, pesde! (pesde/hello@1.0.0, lune)
```
## Making a binary package
To make a binary package, you must use a target that supports binary exports.
Currently, these are `lune` and `luau`.
Here is an example of a binary package:
```toml title="pesde.toml"
name = "pesde/hello"
version = "1.0.0"
license = "MIT"
[target]
environment = "lune"
bin = "main.luau"
```
The `bin` field specifies the entry point for the binary package. This file
will be run when the binary package is executed.
```luau title="main.luau"
print("Hello, pesde!")
```
Binary packages get access to custom variables provided by pesde. You can find
them in the `_G` table. These are:
- `PESDE_ROOT`: The root of the installed package (the directory containing its
  `pesde.toml`). This will be a temporary directory if the package is run with
  `pesde x`.
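For example, a binary's entry point could use this variable to resolve files relative to the package root. This is a minimal sketch assuming the `lune` target and a hypothetical `assets/greeting.txt` file shipped with the package:

```luau title="main.luau"
-- PESDE_ROOT is injected into _G by pesde before the script runs
local root = _G.PESDE_ROOT
print("Running from: " .. root)

-- resolve a bundled asset relative to the package root (hypothetical file)
local fs = require("@lune/fs")
local greeting = fs.readFile(root .. "/assets/greeting.txt")
print(greeting)
```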


@ -0,0 +1,170 @@
---
title: Specifying Dependencies
description: Learn how to specify dependencies in your pesde project.
---
import { Aside, FileTree, LinkCard } from "@astrojs/starlight/components"
The `[dependencies]` section of your `pesde.toml` file is where you specify the
dependencies of your project.
pesde supports multiple types of dependencies.
## pesde Dependencies
The most common type of dependency is a pesde dependency: a dependency on a
package published to a [pesde registry](https://pesde.daimond113.com).
```toml title="pesde.toml"
[indices]
default = "https://github.com/pesde-pkg/index"
[dependencies]
hello = { name = "pesde/hello", version = "^1.0.0" }
```
In this example, we're specifying a dependency on the `pesde/hello` package on
the official pesde registry with a version constraint of `^1.0.0`.
You can also add a dependency by running the following command:
```sh
pesde add pesde/hello
```
## Git Dependencies
Git dependencies are dependencies on packages hosted on a Git repository.
```toml title="pesde.toml"
[dependencies]
acme = { repo = "acme/package", rev = "aeff6" }
```
In this example, we're specifying a dependency on the package contained within
the `acme/package` GitHub repository at the `aeff6` commit.
You can also use a URL to specify the Git repository and a tag for the revision.
```toml title="pesde.toml"
[dependencies]
acme = { repo = "https://git.acme.local/package.git", rev = "v0.1.0" }
```
You can also specify a path if the package is not at the root of the repository.
<FileTree>
- acme/package.git
- pkgs/
- **foo/**
- pesde.toml
- ...
</FileTree>
```toml title="pesde.toml"
[dependencies]
foo = { repo = "acme/package", rev = "main", path = "pkgs/foo" }
```
The path specified by the Git dependency must either be a valid pesde package or
a [Wally][wally] package.
You can also add a Git dependency by running the following command:
```sh
# From Git URL
pesde add https://git.acme.local/package.git#aeff6
# From GitHub repository
pesde add gh#acme/package#main
```
## Wally Dependencies
Wally dependencies are dependencies on packages published to a
[Wally registry][wally]. Wally is a package manager for Roblox and thus Wally
dependencies should only be used in Roblox projects.
```toml title="pesde.toml"
[wally_indices]
default = "https://github.com/UpliftGames/wally-index"
[dependencies]
foo = { wally = "acme/package", version = "^1.0.0" }
```
In this example, we're specifying a dependency on the `acme/package` package
on the official Wally registry with a version constraint of `^1.0.0`.
<Aside type="note">
In order to get proper types support for Wally dependencies, you need to have
a [`sourcemap_generator` script](/reference/manifest#sourcemap_generator)
specified in your `pesde.toml` file.
</Aside>
You can also add a Wally dependency by running the following command:
```sh
pesde add wally#acme/package
```
[wally]: https://wally.run/
## Workspace Dependencies
Packages within a workspace can depend on each other. For example, if `foo`
and `bar` are both packages in the same workspace, you can add a dependency to
`bar` in the `foo/pesde.toml` file:
```toml title="foo/pesde.toml"
[dependencies]
bar = { workspace = "acme/bar", version = "^" }
```
You can also add a workspace dependency by running the following command:
```sh
pesde add workspace:acme/bar
```
<LinkCard
title="Workspaces"
description="Learn more about using workspaces in pesde."
href="/guides/workspaces/"
/>
## Peer Dependencies
Peer dependencies are dependencies that are not installed automatically when
used by another package. They need to be installed by the user of the package.
```toml title="pesde.toml"
[peer_dependencies]
foo = { name = "acme/foo", version = "^1.0.0" }
```
You can add a peer dependency by passing `--peer` to the `pesde add` command:
```sh
pesde add --peer acme/foo
```
## Dev Dependencies
Dev dependencies are dependencies that are only used during development. They
are not installed when the package is used as a dependency.
```toml title="pesde.toml"
[dev_dependencies]
foo = { name = "acme/foo", version = "^1.0.0" }
```
You can add a dev dependency by passing `--dev` to the `pesde add` command:
```sh
pesde add --dev acme/foo
```


@ -0,0 +1,79 @@
---
title: Overriding Dependencies
description: Learn how to override and patch dependencies in pesde.
---
import { Aside } from "@astrojs/starlight/components"
pesde has several ways to override or patch dependencies in your project.
## Dependency Overrides
Dependency overrides allow you to replace a dependency of a dependency with a
different version or package.
Let's say you have a project with the following dependencies:
```toml title="pesde.toml"
[dependencies]
foo = { name = "acme/foo", version = "^1.0.0" }
```
But `foo` depends on `bar` 1.0.0, and you want to use `bar` 2.0.0 instead. You
can override the `bar` dependency in your `pesde.toml` file:
```toml title="pesde.toml"
[dependencies]
foo = { name = "acme/foo", version = "^1.0.0" }
[overrides]
"foo>bar" = { name = "acme/bar", version = "^2.0.0" }
```
Now, when you run `pesde install`, `bar` 2.0.0 will be used instead of 1.0.0.
You can learn more about the syntax for dependency overrides in the
[reference](/reference/manifest#overrides).
## Patching Dependencies
Patching allows you to modify the source code of a dependency.
To patch a dependency, you can use the `pesde patch` and `pesde patch-commit`
commands.
Let's say you have the following dependency in your `pesde.toml` file:
```toml title="pesde.toml"
[target]
environment = "luau"
[dependencies]
foo = { name = "acme/foo", version = "^1.0.0" }
```
And you want to patch `foo` to fix a bug. You can run the following command:
```sh
pesde patch "acme/foo@1.0.0 luau"
# done! modify the files in the directory, then run `pesde patch-commit /x/y/z`
# to apply.
# warning: do not commit these changes
# note: the pesde.toml file will be ignored when patching
```
pesde will copy the source code of `foo` to a temporary directory, in this case
`/x/y/z`. You can then modify the files in this directory. Once you're done,
run `pesde patch-commit /x/y/z` to apply the changes.
This will create a patch within the `patches` directory of your project, and
add an entry to `[patches]`. Then, next time you run `pesde install`, the patch
will be applied to the dependency.
<Aside type="caution">
Make sure not to commit or stage the changes made in the temporary directory.
Otherwise pesde may not be able to create the patch correctly.
</Aside>

---
title: Publishing Packages
description: Learn how to publish packages to the pesde registry.
---
import { Aside, LinkCard } from "@astrojs/starlight/components"
## Configuration
Before you can publish a package, you must configure the required fields in your
`pesde.toml` file.
### `includes`
The `includes` field is a list of globs that should be included in the package.
```toml
includes = [
"pesde.toml",
"README.md",
"LICENSE",
"src/**/*.luau",
]
```
### `target`
The `target` field defines the environment where the package can be run.
Here, you must also specify the `lib` and/or `bin` fields to indicate the path
of the exported library or binary.
```toml
[target]
environment = "luau"
lib = "init.luau"
```
#### Roblox
`bin` is not supported in Roblox packages. You must also specify a list of
`build_files`. These are the files that should be synced into Roblox. They are
passed to the `roblox_sync_config_generator` script.
```toml
[target]
environment = "roblox"
lib = "src/init.luau"
build_files = ["src"]
```
<LinkCard
title="Roblox"
description="Learn more about authoring packages for Roblox."
href="/guides/roblox/#authoring-packages"
/>
## Authentication
Before you can publish a package, you must authenticate with your GitHub
account.
```sh
pesde auth login
```
You will be given a code and prompted to open the GitHub authentication page in
your browser. You must enter the code to authenticate.
## Publishing
To publish a package, run the following command:
```sh
pesde publish
```
You will be prompted to confirm the package details before publishing.
Once a package is published, others will be able to install it. You may not
remove a package once it has been published. You may not publish a package with
an already existing version.
## Multi-target Packages
You may publish packages under the same name and version but with different
targets. This allows you to publish a package that can be used in multiple
environments.
For example, you may publish a package that can be used in both Roblox and
Luau environments by publishing two versions of the package, one for each
environment.
## Documentation
The `README.md` file in the root of the package will be displayed on the
[pesde registry website](https://pesde.daimond113.com/).
You can include a `docs` directory in the package containing markdown files
and they will be available on the pesde registry website. You can see an example
in [`pesde/hello`](https://pesde.daimond113.com/packages/pesde/hello/latest/any/docs).
### Customizing the sidebar
You can include frontmatter with a `sidebar_position` to customize the order
of the pages on the sidebar.
```md title="docs/getting-started.md"
---
sidebar_position: 2
---
# Getting Started
Lorem ipsum odor amet, consectetuer adipiscing elit. Eleifend consectetur id
consequat conubia fames curae?
```
You can have directories in the `docs` directory to create nested pages. These
will show up as collapsible sections in the sidebar. You can include a
`_category_.json` file inside the nested directories to customize the label and
the ordering in the sidebar.
```json title="docs/guides/_category_.json"
{
"label": "Guides",
"position": 3
}
```
<Aside type="tip">
Make sure to include `docs` inside the `includes` field in `pesde.toml`
otherwise they won't be published with your package.
</Aside>

---
title: Roblox
description: Using pesde in a Roblox project.
---
import { FileTree } from "@astrojs/starlight/components"
pesde can be used in Roblox projects; however, this requires some extra setup.
Namely, you need to specify a `roblox_sync_config_generator` script in order
to generate the adequate configuration for the sync tool you are using.
The [`pesde-scripts`](https://github.com/pesde-pkg/scripts)
repository contains a list of scripts for different sync tools. If the tool
you are using is not supported, you can write your own script and submit a PR
to get it added.
## Usage with Rojo
[Rojo](https://rojo.space/) is a popular tool for syncing files into Roblox
Studio.
Running `pesde init` will prompt you to select a target; select `roblox` or
`roblox_server` in this case. You will then be prompted to pick a scripts
package. Select `pesde/scripts_rojo` to get started with Rojo.
## Usage with other tools
If you are using a different sync tool, you should look for its scripts
package on the registry. If you cannot find one, you can write your own and
optionally submit a PR to pesde-scripts to help others using the same tool
get started quicker.
Scaffold your project with `pesde init`, select the `roblox` or `roblox_server`
target, and then create a `.pesde/roblox_sync_config_generator.luau` script
and put its path in the manifest.
## Authoring packages
When authoring packages for Roblox, it is recommended to have your code inside
of a `src` directory (or any other directory you prefer).
Inside of your `pesde.toml` you must specify the `roblox` environment and the
`lib` field with the path to your main script. You must also specify a list of
`build_files`. This list should contain names of top level files or directories
that should be synced into Roblox by a sync tool, such as Rojo.
Let's say you have a package with the following structure:
<FileTree>
- roblox_packages/
- dependency.luau
- ...
- src/
- init.luau
- foo.luau
- bar.luau
- ...
- LICENSE
- pesde.toml
- README.md
- selene.toml
- stylua.toml
</FileTree>
There are lots of files in the root directory that are not needed in Roblox,
such as configuration files, READMEs, and licenses. We only want the `src` and
the `roblox_packages` directory to be synced into Roblox.
<FileTree>
- roblox_packages/
- dependency (roblox_packages/dependency.luau)
- ...
- src/ (src/init.luau)
- foo (src/foo.luau)
- bar (src/bar.luau)
- ...
</FileTree>
This is where `build_files` comes in: we can specify a list of files that should
be synced into Roblox. In this case, we only want the `src` directory to be
synced. We do not need to specify the `roblox_packages` directory, as it is
always synced.
So for our package, the `pesde.toml` file would roughly look like this:
```toml title="pesde.toml" {15}
name = "acme/package"
version = "1.0.0"
license = "MIT"
includes = [
"pesde.toml",
"LICENSE",
"README.md",
"src/**/*.luau",
]
[target]
environment = "roblox"
lib = "src/init.luau"
build_files = ["src"]
[dependencies]
dependency = "acme/library"
```
When a consumer of your package installs it, the `roblox_sync_config_generator`
script they are using will generate the configuration needed for their sync
tool. For example, a Rojo user would get a `default.project.json` with the
following contents:
```json title="default.project.json"
{
"tree": {
"src": {
"$path": "src"
},
"roblox_packages": {
"$path": "roblox_packages"
}
}
}
```
The linker scripts that pesde generates will then point to the `src` module.
Then, to publish your package, you can follow the steps in the
["Publishing Packages"](/guides/publishing/) guide.
### Test place with Rojo
You might want to create a "test place" where you can test your package inside
Roblox, or to get proper LSP support when developing your package.
To do this, you can create a `test-place.project.json` file which includes your
package and the `roblox_packages` directory.
```json title="test-place.project.json"
{
"tree": {
"$className": "DataModel",
"ReplicatedStorage": {
"package": {
"$className": "Folder",
"src": {
"$path": "src"
},
"roblox_packages": {
"$path": "roblox_packages"
}
}
}
}
}
```
You can then run `rojo serve` with this project file:
```sh
rojo serve test-place.project.json
```
If you are using [Luau LSP](https://github.com/JohnnyMorganz/luau-lsp) you can
change the `luau-lsp.sourcemap.rojoProjectFile` extension setting to
`test-place.project.json` to get proper LSP support when developing your
package.
### Differences from Wally
Those coming from [Wally](https://wally.run/) may be a bit confused by the
way pesde handles Roblox packages.
In Wally, it is standard to have a `default.project.json` with the following:
```json
{
"tree": {
"$path": "src"
}
}
```
This will cause the `src` directory to be directly synced into Roblox.
In pesde, you should not have a `default.project.json` file in your package.
Instead, you are required to use the `build_files` field to specify a 1:1 match
between Roblox and the file system. pesde forbids `default.project.json` from
being part of a published package, and regenerates it when installing a pesde git
dependency. This allows the consumer of your package to choose the sync tool
they want to use, instead of being constrained to only using Rojo.
This has the effect that the structure of the files in the file system is
reflected inside Roblox.
With Wally, the structure inside Roblox ends up looking like this:
<FileTree>
- Packages/
- \_Index/
- acme_package@1.0.0/
- package/ (src/init.luau)
- foo (src/foo.luau)
- bar (src/bar.luau)
- ...
- dependency
</FileTree>
Whereas with pesde, it looks like this:
<FileTree>
- roblox_packages/
- .pesde/
- acme+package/
- 1.0.0/
- src/ (src/init.luau)
- foo (src/foo.luau)
- bar (src/bar.luau)
- ...
- roblox_packages/
- dependency (roblox_packages/dependency.luau)
</FileTree>

---
title: Using Scripts Packages
description: Learn how to use scripts packages.
---
A **scripts package** is a package that contains scripts. The scripts provided
by the package are linked in `.pesde/{alias}/{script_name}.luau` of the project
that uses the package.
## Using a scripts package
Scripts packages can be installed using the `pesde add` and `pesde install`
commands.
This requires a `pesde.toml` file to be present in the current directory, and
will add the scripts package to the `dependencies` section of the file.
```sh
pesde add pesde/scripts_rojo
pesde install
```
This will add the scripts package to your project, and installing will put the
scripts at `.pesde/scripts_rojo/{script_name}.luau`. You can then add the scripts
to your manifest, for example:
```toml title="pesde.toml"
[scripts]
roblox_sync_config_generator = ".pesde/scripts_rojo/roblox_sync_config_generator.luau"
```
## Making a scripts package
To make a scripts package you must use a target compatible with scripts exports.
These currently are `lune` and `luau`.
Here is an example of a scripts package:
```toml title="pesde.toml"
name = "pesde/scripts_rojo"
version = "1.0.0"
license = "MIT"
[target]
environment = "lune"
[target.scripts]
roblox_sync_config_generator = "roblox_sync_config_generator.luau"
```
The `scripts` table in the target is a map of script names to the path of the
script in the package. The scripts will be linked in the project that uses the
package at `.pesde/{alias}/{script_name}.luau`.

---
title: Self Hosting Registries
description: Learn how to self host registries for pesde.
---
import { Aside } from "@astrojs/starlight/components"
You can self-host registries for pesde. This is useful if you want a private
registry, or a separate registry for other reasons.
## Making the index repository
The index is a repository that contains metadata about all the packages in the
registry.
An index contains a `config.toml` file with configuration options.
To create an index, create a new repository and add a `config.toml` file with
the following content:
```toml title="config.toml"
# the URL of the registry API
api = "https://registry.acme.local/"
# package download URL (optional)
download = "{API_URL}/v0/packages/{PACKAGE}/{PACKAGE_VERSION}/{PACKAGE_TARGET}"
# the client ID of the GitHub OAuth app (optional)
github_oauth_client_id = "a1d648966fdfbdcd9295"
# whether to allow packages with Git dependencies (default: false)
git_allowed = true
# whether to allow packages which depend on packages from other registries
# (default: false)
other_registries_allowed = ["https://git.acme.local/index"]
# whether to allow packages with Wally dependencies (default: false)
wally_allowed = false
# the maximum size of the archive in bytes (default: 4MB)
max_archive_size = 4194304
# the scripts packages present in the `init` command selection by default
scripts_packages = ["pesde/scripts_rojo"]
```
- **api**: The URL of the registry API. See below for more information.
- **download**: The URL to download packages from. This is optional and
defaults to the correct URL for the official pesde registry implementation.
You only need this if you are using a custom registry implementation.
This string can contain the following placeholders:
- `{API_URL}`: The API URL (as specified in the `api` field).
- `{PACKAGE}`: The package name.
- `{PACKAGE_VERSION}`: The package version.
- `{PACKAGE_TARGET}`: The package target.
Defaults to `{API_URL}/v0/packages/{PACKAGE}/{PACKAGE_VERSION}/{PACKAGE_TARGET}`.
- **github_oauth_client_id**: This is required if you use GitHub OAuth for
authentication. See below for more information.
- **git_allowed**: Whether to allow packages with Git dependencies. This can be
either a bool or a list of allowed repository URLs. This is optional and
defaults to `false`.
- **other_registries_allowed**: Whether to allow packages which depend on
packages from other registries. This can be either a bool or a list of
allowed index repository URLs. This is optional and defaults to `false`.
- **wally_allowed**: Whether to allow packages with Wally dependencies. This can
be either a bool or a list of allowed index repository URLs. This is
optional and defaults to `false`.
- **max_archive_size**: The maximum size of the archive in bytes. This is
optional and defaults to `4194304` (4MB).
- **scripts_packages**: The scripts packages present in the `init` command
selection by default. This is optional and defaults to none.
You should then push this repository to [GitHub](https://github.com/).
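As a rough sketch, creating the index locally and preparing it for GitHub might look like this (the repository name, remote URL, and committer identity are all placeholders):

```sh
# Hypothetical walkthrough; substitute your own repository name and remote URL.
mkdir acme-index && cd acme-index
git init --quiet
cat > config.toml <<'EOF'
api = "https://registry.acme.local/"
EOF
git add config.toml
git -c user.name="you" -c user.email="you@acme.local" commit --quiet -m "Add registry config"
# Then create an empty repository on GitHub and push:
# git remote add origin https://github.com/acme/index.git
# git push -u origin main
```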
## Configuring the registry
The registry is a web server that provides package downloads and the ability to
publish packages.
The official registry implementation is available in the
[pesde GitHub repository](https://github.com/pesde-pkg/pesde/tree/0.5/registry).
The registry is configured using environment variables. To allow the registry
to update the index repository, you must provide credentials for an account
with access to it. We recommend using a separate account for this purpose.
<Aside>
For a GitHub account the password **must** be a personal access token. For instructions on how to
create a personal access token, see the [GitHub
documentation](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens).
The access token must have read and write access to the index repository.
</Aside>
### General configuration
- **INDEX_REPO_URL**: The URL of the index repository. This is required.\
Example: `https://github.com/pesde-pkg/index.git`
- **GIT_USERNAME**: The username of a Git account that has push access to the
index repository. This is required.
- **GIT_PASSWORD**: The password of the account specified by
`GIT_USERNAME`. This is required.
- **COMMITTER_GIT_NAME**: The name to use for the committer when updating the
index repository.\
Example: `pesde index updater`
- **COMMITTER_GIT_EMAIL**: The email to use for the committer when updating the
index repository.\
Example: `pesde@localhost`
- **DATA_DIR**: The directory where the registry stores miscellaneous data.
This value can use `{CWD}` to refer to the current working directory.\
Default: `{CWD}/data`
- **ADDRESS**: The address to bind the server to.\
Default: `127.0.0.1`
- **PORT**: The port to bind the server to.\
Default: `8080`
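Putting the general options together, a minimal environment might look like the following (all values here are illustrative placeholders, not real credentials):

```sh
export INDEX_REPO_URL="https://github.com/acme/index.git"
export GIT_USERNAME="acme-registry-bot"
export GIT_PASSWORD="<personal access token>"
export COMMITTER_GIT_NAME="acme index updater"
export COMMITTER_GIT_EMAIL="pesde@acme.local"
export DATA_DIR="{CWD}/data"
export ADDRESS="0.0.0.0"
export PORT="8080"
```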
### Authentication configuration
The registry supports multiple authentication methods, which are documented
below.
#### General configuration
- **READ_NEEDS_AUTH**: If set to any value, reading data requires
authentication. If not set, anyone can read from the registry.
This is optional.
#### Single token authentication
Allows read and write access to the registry using a single token.
- **ACCESS_TOKEN**: The token to use for authentication.
#### Multiple token authentication
Allows read and write access to the registry using different tokens.
- **READ_ACCESS_TOKEN**: The token that grants read access.
- **WRITE_ACCESS_TOKEN**: The token that grants write access.
#### GitHub OAuth authentication
Allows clients to get read and write access to the registry using GitHub OAuth.
This requires a GitHub OAuth app, instructions to create one can be found
in the [GitHub documentation](https://docs.github.com/en/apps/oauth-apps/building-oauth-apps/creating-an-oauth-app).
- **GITHUB_CLIENT_SECRET**: The client secret of the GitHub OAuth app.
#### No authentication
If none of the above variables are set, **anyone** will be able to read and
write to the registry.
### Storage configuration
The registry supports multiple storage backends, which are documented below.
#### File system storage
Stores packages on the file system.
- **FS_STORAGE_ROOT**: The root directory where packages are stored.
#### S3 storage
Stores packages on an S3 compatible storage service, such as
[Amazon S3](https://aws.amazon.com/s3/) or
[Cloudflare R2](https://www.cloudflare.com/r2/).
- **S3_ENDPOINT**: The endpoint of the S3 bucket to store packages in.
- **S3_BUCKET_NAME**: The name of the bucket.
- **S3_REGION**: The region of the bucket.
- **S3_ACCESS_KEY**: The access key to use.
- **S3_SECRET_KEY**: The secret key to use.
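For example, a hypothetical Cloudflare R2 setup might look like this (endpoint, bucket name, and keys are placeholders):

```sh
export S3_ENDPOINT="https://<account-id>.r2.cloudflarestorage.com"
export S3_BUCKET_NAME="pesde-packages"
export S3_REGION="auto"
export S3_ACCESS_KEY="<access key>"
export S3_SECRET_KEY="<secret key>"
```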
### Sentry configuration
The registry supports [Sentry](https://sentry.io/) for error tracking.
- **SENTRY_DSN**: The DSN of the Sentry instance.
## Running the registry
First clone the repository and navigate to the repository directory:
```sh
git clone https://github.com/pesde-pkg/pesde.git
cd pesde
```
You can then build the registry using the following command:
```sh
cargo build --release -p pesde-registry
```
This will build the registry. The resulting binary will be located at
`target/release/pesde-registry` or `target/release/pesde-registry.exe`.
After setting the environment variables, you can run the registry by
executing the binary.
The registry must be exposed at the URL specified in the `api` field of the
index repository configuration.

---
title: Workspaces
description: Learn how to use workspaces in pesde.
---
import { FileTree, LinkCard } from "@astrojs/starlight/components"
Workspaces allow you to work with multiple pesde projects within a single
repository. Packages within a workspace can depend on each other, and you can
run commands like `install` or `publish` on every package in the workspace at once.
Let's say you have a repository with the following structure:
<FileTree>
- pesde.toml
- pkgs/
- foo/
- pesde.toml
- ...
- bar/
- pesde.toml
- ...
</FileTree>
Within the root `pesde.toml` file, we can define a workspace:
```toml title="pesde.toml"
name = "acme/root"
version = "0.0.0"
private = true
workspace_members = ["pkgs/*"]
[target]
environment = "luau"
```
Now, each folder within the `pkgs/` directory is considered a package in the
workspace. You can run commands like `pesde install` or `pesde publish` from
the root of the repository to run them on every package in the workspace.
## Workspace Dependencies
Packages within a workspace can depend on each other. For example, if `foo`
depends on `bar`, you can add a dependency to `bar` in the `foo/pesde.toml` file:
```toml title="pkgs/foo/pesde.toml"
name = "acme/foo"
version = "1.0.0"
[dependencies]
bar = { workspace = "acme/bar", version = "^" }
```
Workspace dependencies are replaced with normal pesde dependencies when
publishing.
The `version` field can either contain `^`, `*`, `=`, `~`, or a specific version
requirement, such as `^1.0.0`. If you use `^`, `=`, or `~`, it will be replaced
with the version of the package in the workspace when publishing.
For example, if you had the following:
```toml title="pesde.toml"
[dependencies]
bar = { workspace = "acme/bar", version = "^" }
qux = { workspace = "acme/qux", version = "=" }
qar = { workspace = "acme/qar", version = "~" }
zoo = { workspace = "acme/zoo", version = "^2.1.0" }
baz = { workspace = "acme/baz", version = "*" }
```
If `bar`, `baz`, `qux`, `qar`, and `zoo` are all at version `2.1.5` in the
workspace, the `pesde.toml` file will be transformed into the following when
publishing.
```toml title="pesde.toml"
[dependencies]
bar = { name = "acme/bar", version = "^2.1.5" }
qux = { name = "acme/qux", version = "=2.1.5" }
qar = { name = "acme/qar", version = "~2.1.5" }
zoo = { name = "acme/zoo", version = "^2.1.0" }
baz = { name = "acme/baz", version = "*" }
```
A `target` field can be added to the `dependencies` table to specify a target
environment for the dependency.
```toml title="pesde.toml"
[dependencies]
bar = { workspace = "acme/bar", version = "^", target = "luau" }
```
<LinkCard
title="Specifying Dependencies"
description="Learn more about specifying dependencies in pesde."
href="/guides/dependencies/"
/>

---
title: What is pesde?
description: A package manager for the Luau programming language, supporting multiple runtimes including Roblox and Lune.
---
pesde is a package manager for the Luau programming language.
## Why use pesde?
When you write code, you often want to use libraries or frameworks that others
have written. Manually downloading and managing these can be cumbersome.
These libraries or frameworks can be distributed as packages. You can then
easily install and use these packages using pesde. pesde will automatically
download and manage the packages, and their dependencies, for you.
## Multi-target support
Luau can run in a lot of different places, such as on [Roblox][roblox], or in
[Lune][lune].
pesde is designed to work with all of these runtimes. Packages can publish
multiple versions of themselves, each tailored to a specific runtime.
[registry]: https://pesde.daimond113.com/
[roblox]: https://www.roblox.com/
[lune]: https://lune-org.github.io/docs
## The pesde registry
The [pesde registry][registry] is where anyone can publish their packages for
others to use.

---
title: Installation
description: Install pesde
---
import { Aside, Steps, TabItem, Tabs } from "@astrojs/starlight/components"
## Prerequisites
pesde requires [Lune](https://lune-org.github.io/docs) to be installed on your
system in order to function properly.
You can follow the installation instructions in the
[Lune documentation](https://lune-org.github.io/docs/getting-started/1-installation).
## Installing pesde
<Steps>
1. Go to the [GitHub releases page](https://github.com/pesde-pkg/pesde/releases/latest).
2. Download the corresponding archive for your operating system. You can choose
whether to use the `.zip` or `.tar.gz` files.
3. Extract the downloaded archive to a folder on your computer.
4. Open a terminal and locate the path of the extracted `pesde` binary.
<Tabs syncKey="os">
<TabItem label="Windows">
If you extracted the archive to `C:\Users\User\Downloads`, the path to the
`pesde` binary would be `C:\Users\User\Downloads\pesde.exe`.
You can then run the `self-install` command:
```ps
C:\Users\User\Downloads\pesde.exe self-install
```
pesde should now be installed on your system. You may need to restart your
computer for the changes to take effect.
<Aside type="caution">
pesde uses symlinks which are an administrator-level operation on Windows.
To ensure proper functionality, enable [Developer Mode](https://learn.microsoft.com/en-us/windows/apps/get-started/enable-your-device-for-development).
If you are getting errors such as `Failed to symlink file, a required
privilege is not held by the client`, then enabling this setting will fix
them.
</Aside>
</TabItem>
<TabItem label="Linux & macOS">
If you extracted the archive to `~/Downloads`, the path to the `pesde`
binary would be `~/Downloads/pesde`.
You must then add execute permissions and run the `self-install` command:
```sh
chmod +x ~/Downloads/pesde
~/Downloads/pesde self-install
```
pesde should now be installed on your system. You will need to update your
shell configuration file to add the pesde binary to your `PATH`
environment variable.
```sh title=".zshrc"
export PATH="$PATH:$HOME/.pesde/bin"
```
You should then be able to run `pesde` after restarting your shell.
</TabItem>
</Tabs>
5. Verify that pesde is installed by running the following command:
```sh
pesde -v
```
This command should output the version of pesde that you installed.
</Steps>
<Aside type="caution">
It is not recommended to use toolchain managers (such as Rokit or Aftman) to
install pesde. You can use `pesde self-upgrade` if you need to update pesde.
If you need everyone to use the same version of pesde, you can use the
`pesde_version` field in `pesde.toml` to specify the version of pesde to use
for the current project.
</Aside>
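For instance, pinning the project to a specific pesde version might look like this in the manifest (the version shown is illustrative):

```toml
pesde_version = "0.5.3"
```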

---
title: Quickstart
description: Start using pesde
---
import { FileTree } from "@astrojs/starlight/components"
Let's make a simple Luau program that uses the `pesde/hello` package to print
hello to the terminal.
## Scaffolding the project
In your terminal, run the following commands to create a folder and navigate
into it.
```sh
mkdir hello-pesde
cd hello-pesde
```
Then, we'll use `pesde init` to scaffold a new pesde project. The command will
ask you a few questions to set up the project. Our project will be named
`<username>/hello_pesde`; replace `<username>` with a username of your choice.
The name may only contain lowercase letters, numbers, and underscores. The
environment we're targeting is `luau`.
```sh
pesde init
# what is the name of the project? <username>/hello_pesde
# what is the description of the project?
# who are the authors of this project?
# what is the repository URL of this project?
# what is the license of this project? MIT
# what environment are you targeting for your package? luau
# would you like to setup default Roblox compatibility scripts? No
```
The command will create a `pesde.toml` file in the current folder. Go ahead
and open this file in your text editor of choice.
## Adding a main script
Under the `[target]` section, we're going to add a `bin` field to specify
the path to the main script of our package.
```diff lang="toml" title="pesde.toml"
name = "<username>/hello_pesde"
version = "0.1.0"
license = "MIT"
[target]
environment = "luau"
+ bin = "main.luau"
[indices]
default = "https://github.com/pesde-pkg/index"
```
Don't forget to save the file after making the changes.
Now, let's create a `main.luau` file in the project folder and add the following
code to it.
```luau title="main.luau"
print("Hello, pesde!")
```
## Running the script
Then, we can run the following command to run the script.
```sh
pesde run
```
You should see `Hello, pesde!` printed to the terminal.
## Install a dependency
Let's use the `pesde/hello` package instead of printing ourselves.
Run the following command to add the package to `pesde.toml`.
```sh
pesde add pesde/hello
```
You should see that `pesde.toml` has been updated with the new dependency.
```diff lang="toml" title="pesde.toml"
name = "<username>/hello_pesde"
version = "0.1.0"
license = "MIT"
[target]
environment = "luau"
bin = "main.luau"
[indices]
default = "https://github.com/pesde-pkg/index"
+ [dependencies]
+ hello = { name = "pesde/hello", version = "^1.0.0" }
```
Run the following command to install the new dependency.
```sh
pesde install
```
You should see that pesde has created a `luau_packages` folder containing the
newly installed package. It has also created a `pesde.lock` file, which records
the exact versions of the dependencies that were installed so that they can be
installed again in the future.
<FileTree>
- luau_packages/
- hello.luau
- ...
- main.luau
- pesde.lock
- pesde.toml
</FileTree>
Let's update the `main.luau` file to use the `pesde/hello` package.
```luau title="main.luau"
local hello = require("./luau_packages/hello")
hello()
```
If we run the script again, we should see something printed to the terminal.
```sh
pesde run
# Hello, pesde! (pesde/hello@1.0.0, luau)
```

---
title: pesde CLI
description: Reference for the pesde CLI.
---
import { LinkCard } from "@astrojs/starlight/components"
The pesde CLI is the primary way to interact with pesde projects. It provides
commands for installing dependencies, running scripts, and more.
## `pesde auth`
Authentication-related commands.
- `-i, --index`: The index whose token to manipulate. May be a URL or an alias.
  Defaults to the default index of the current project, or the default index set
  in the config.
### `pesde auth login`
Sets the token for the index.
- `-t, --token`: The token to set.
If no token is provided, you will be prompted to authenticate with GitHub. A
code will be provided that you can paste into the GitHub authentication prompt.
### `pesde auth logout`
Removes the stored token for the index.
### `pesde auth whoami`
Prints the username of the currently authenticated user of the index. Only
works if the token is a GitHub token.
### `pesde auth token`
Prints the token for the index.
## `pesde config`
Configuration-related commands.
### `pesde config default-index`
```sh
pesde config default-index [INDEX]
```
Configures the default index. If no index is provided, the current default index
is printed.
- `-r, --reset`: Resets the default index.
The default index is [`pesde-index`](https://github.com/pesde-pkg/index).
## `pesde init`
Initializes a new pesde project in the current directory.
## `pesde run`
Runs a script from the current project using Lune.
```sh
pesde run [SCRIPT] [ -- <ARGS>...]
```
If no script is provided, it will run the script specified by `target.bin`
in `pesde.toml`.
If a path is provided, it will run the script at that path.
If a script defined in `[scripts]` is provided, it will run that script.
If a package name is provided, it will run the script specified by `target.bin`
in that package.
Arguments can be passed to the script by using `--` followed by the arguments.
```sh
pesde run foo -- --arg1 --arg2
```
## `pesde install`
Installs dependencies for the current project.
- `--locked`: Whether to error if the lockfile is out of date.
- `--prod`: Whether to skip installing dev dependencies.
## `pesde publish`
Publishes the current project to the pesde registry.
- `-d, --dry-run`: Whether to perform a dry run. This will output a
tarball containing the package that would be published, but will not actually
publish it.
- `-y, --yes`: Whether to skip the confirmation prompt.
- `-i, --index`: Name of the index to publish to. Defaults to `default`.
## `pesde self-install`
Performs the pesde installation process. This should be the first command run
after downloading the pesde binary.
## `pesde self-upgrade`
Upgrades the pesde binary to the latest version.
- `--use-cached`: Whether to use the version displayed in the "upgrade available"
message instead of checking for the latest version.
## `pesde patch`
```sh
pesde patch <PACKAGE>
```
Prepares a patching environment for a package. This will copy the source code of
the package to a temporary directory.
The package specified must be in the format `<name>@<version> <target>`.
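For example, to patch a hypothetical `acme/foo` package:

```sh
pesde patch acme/foo@1.0.0 luau
```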
<LinkCard
title="Overrides"
description="Learn more about overriding and patching packages."
href="/guides/overrides/"
/>
## `pesde patch-commit`
```sh
pesde patch-commit <PATH>
```
Applies the changes made in the patching environment created by `pesde patch`.
## `pesde add`
```sh
pesde add <PACKAGE>
```
Adds a package to the dependencies of the current project.
- `-i, --index <INDEX>`: The index in which to search for the package.
- `-t, --target <TARGET>`: The target environment for the package.
- `-a, --alias <ALIAS>`: The alias to use for the package, defaults to the
package name.
- `-p, --peer`: Adds the package as a peer dependency.
- `-d, --dev`: Adds the package as a dev dependency.
The following formats are supported:
```sh
pesde add pesde/hello
pesde add gh#acme/package#main
pesde add https://git.acme.local/package.git#aeff6
```
## `pesde update`
Updates the dependencies of the current project.
## `pesde x`
Runs a one-off binary package.
```sh
pesde x <PACKAGE>
```
This is useful for running a binary package without installing it or outside of
a pesde project.
```sh
pesde x pesde/hello
```


@ -0,0 +1,432 @@
---
title: pesde.toml
description: Reference for `pesde.toml`
---
import { LinkCard } from "@astrojs/starlight/components"
`pesde.toml` is the manifest file for a pesde package. It contains metadata about
the package and its dependencies.
## Top-level fields
```toml
name = "acme/package"
version = "1.2.3"
description = "A package that does foo and bar"
license = "MIT"
authors = ["John Doe <john.doe@acme.local> (https://acme.local)"]
repository = "https://github.com/acme/package"
```
### `name`
The name of the package. This is used to identify the package in the registry.
The name consists of a scope and a package name, separated by a slash (`/`). It
may only contain lowercase letters, numbers, and underscores.
The first one to publish to a given scope gets to own it. If you want multiple
people to be able to publish to the same scope, you can send a pull request to
the [pesde-index GitHub repository](https://github.com/pesde-pkg/index)
and add the GitHub user ID of the other person to the `owners` field of the
`scope.toml` file of the given scope. For more information, see
[policies](/registry/policies#package-ownership).
### `version`
The version of the package. This must be a valid [SemVer](https://semver.org/)
version, such as `1.2.3`.
### `description`
A short description of the package. This is displayed on the package page in the
registry.
### `license`
The license of the package. It is recommended to use a
[SPDX license identifier](https://spdx.org/licenses/), such as `MIT` or
`Apache-2.0`.
### `authors`
A list of authors of the package. Each author is a string containing the name of
the author, optionally followed by an email address in angle brackets, and a
website URL in parentheses. For example:
```toml
authors = ["John Doe <john.doe@acme.local> (https://acme.local)"]
```
### `repository`
The URL of the repository where the package is hosted. This is displayed on the
package page in the registry.
### `private`
A boolean indicating whether the package is private. If set to `true`, the
package cannot be published to the registry.
### `includes`
List of globs to include in the package when publishing. Files and directories
not listed here will not be published.
```toml
includes = [
"pesde.toml",
"README.md",
"LICENSE",
"init.luau",
"docs/**/*.md",
]
```
### `pesde_version`
The version of pesde to use within this project. The `pesde` CLI will look at
this field and run the correct version of pesde for this project.
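For example (the version shown is illustrative):

```toml
pesde_version = "0.5.3"
```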
### `workspace_members`
A list of globs containing the members of this workspace.
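For example, a workspace whose members live under a `packages` directory (a hypothetical layout) might use:

```toml
workspace_members = ["packages/*"]
```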
<LinkCard
title="Workspaces"
description="Learn more about workspaces in pesde."
href="/guides/workspaces/"
/>
## `[target]`
The `[target]` section contains information about the target platform for the
package.
```toml
[target]
environment = "luau"
lib = "init.luau"
```
### `environment`
The target environment for the package. This can be one of the following:
- `luau`: Standalone Luau code that can be run using the `luau` CLI.
- `lune`: Luau code that requires the Lune runtime.
- `roblox`: Luau code that must be run in Roblox.
- `roblox_server`: Same as `roblox`, but only for server-side code.
### `lib`
**Allowed in:** `luau`, `lune`, `roblox`, `roblox_server`
The entry point of the library exported by the package. This file is what will
be required when the package is loaded using `require`.
### `bin`
**Allowed in:** `luau`, `lune`
The entry point of the binary exported by the package. This file is what will be
run when the package is executed as a binary.
<LinkCard
title="Using Binary Packages"
description="Learn more about using binary packages in pesde."
href="/guides/binary-packages/"
/>
### `build_files`
**Allowed in:** `roblox`, `roblox_server`
A list of files that should be synced to Roblox when the package is installed.
```toml
build_files = [
"init.luau",
"foo.luau",
]
```
These files are passed to [`roblox_sync_config_generator`](#roblox_sync_config_generator)
when the package is installed in order to generate the necessary configuration.
### `scripts`
**Allowed in:** `luau`, `lune`
A list of scripts that will be linked to the dependant's `.pesde` directory, and
copied over to the [scripts](#scripts-1) section when initialising a project with
this package as the scripts package.
```toml
[target.scripts]
roblox_sync_config_generator = "scripts/roblox_sync_config_generator.luau"
```
## `[scripts]`
The `[scripts]` section contains scripts that can be run using the `pesde run`
command. These scripts are run using [Lune](https://lune-org.github.io/docs).
```toml
[scripts]
build = "scripts/build.luau"
test = "scripts/test.luau"
```
There are also a few special scripts that are run in certain cases by pesde.
### `roblox_sync_config_generator`
This is responsible for generating adequate configuration files for Roblox
sync tools.
`process.args` will contain the directory containing the package, and the list
of files specified within the [`target.build_files`](#build_files) of the
package.
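A minimal generator might look like the following sketch, which assumes Lune's `process`, `fs`, and `serde` builtins and emits a hypothetical Rojo project file:

```luau
local process = require("@lune/process")
local fs = require("@lune/fs")
local serde = require("@lune/serde")

-- first argument: the package directory; the rest: the target.build_files
local packageDir = process.args[1]
local tree = { ["$className"] = "Folder" }
for i = 2, #process.args do
	local file = process.args[i]
	-- map each build file into the sync tree under its name without the extension
	tree[file:gsub("%.luau$", "")] = { ["$path"] = file }
end

fs.writeFile(
	packageDir .. "/default.project.json",
	serde.encode("json", { tree = tree })
)
```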
<LinkCard
title="Roblox"
description="Learn more about using pesde in Roblox projects."
href="/guides/roblox/"
/>
<LinkCard
title="Example script for Rojo"
description="An example script for generating configuration for Rojo."
href="https://github.com/pesde-pkg/scripts/blob/master/src/generators/rojo/sync_config.luau"
/>
### `sourcemap_generator`
This is responsible for generating source maps for packages that are installed.
This is required to get proper types support when using
[Wally dependencies](/guides/dependencies/#wally-dependencies).
The script will receive the path to the package directory as the first argument
through `process.args`.
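A minimal sketch, assuming Lune's `process.spawn` API and a `rojo` binary available on the `PATH`:

```luau
local process = require("@lune/process")

local packageDir = process.args[1]
-- hypothetical invocation: have Rojo emit a sourcemap for the package
process.spawn("rojo", {
	"sourcemap",
	packageDir,
	"--output",
	packageDir .. "/sourcemap.json",
})
```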
<LinkCard
title="Example script for Rojo"
description="An example script for generating configuration for Rojo."
href="https://github.com/pesde-pkg/scripts/blob/master/src/generators/rojo/sourcemap.luau"
/>
## `[indices]`
The `[indices]` section contains a list of pesde indices where packages can be
installed from.
```toml
[indices]
default = "https://github.com/pesde-pkg/index"
acme = "https://github.com/acme/pesde-index"
```
These can then be referenced in the [`dependencies`](#dependencies) of the
package. The `default` index is used if no index is specified.
```toml
[dependencies]
foo = { name = "acme/foo", version = "1.2.3", index = "acme" }
```
## `[wally_indices]`
The `[wally_indices]` section contains a list of Wally indices where packages
can be installed from. This is used for
[Wally dependencies](/guides/dependencies/#wally-dependencies).
```toml
[wally_indices]
default = "https://github.com/UpliftGames/wally-index"
acme = "https://github.com/acme/wally-index"
```
These can then be referenced in the [`dependencies`](#dependencies) of the
package. The `default` index is used if no index is specified.
```toml
[dependencies]
foo = { wally = "acme/foo", version = "1.2.3", index = "acme" }
```
## `[overrides]`
The `[overrides]` section contains a list of overrides for dependencies. This
allows you to replace certain dependencies with different versions or even
different packages.
```toml
[overrides]
"bar>baz" = { name = "acme/baz", version = "1.0.0" }
"foo>bar,baz>bar" = { name = "acme/bar", version = "2.0.0" }
```
The above example will replace the `baz` dependency of the `bar` package with
version `1.0.0`, and the `bar` and `baz` dependencies of the `foo` package with
version `2.0.0`.
Each key in the overrides table is a comma-separated list of package paths. The
path is a list of package names separated by `>`. For example, `foo>bar>baz`
refers to the `baz` dependency of the `bar` package, which is a dependency of
the `foo` package.
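For instance, a hypothetical override targeting a transitive dependency two levels deep:

```toml
[overrides]
# replace the baz dependency of bar, which is itself a dependency of foo
"foo>bar>baz" = { name = "acme/baz", version = "1.1.0" }
```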
<LinkCard
title="Overrides"
description="Learn more about overriding and patching packages."
href="/guides/overrides/"
/>
## `[patches]`
The `[patches]` section contains a list of patches for dependencies. This allows
you to modify the source code of dependencies.
```toml
[patches]
"acme/foo" = { "1.0.0 luau" = "patches/acme+foo-1.0.0+luau.patch" }
```
The above example will patch version `1.0.0` with the `luau` target of the
`acme/foo` package using the `patches/acme+foo-1.0.0+luau.patch` file.
Each key in the patches table is the package name, and the value is a table
where the keys are the version and target, and the value is the path to the
patch.
The patches can be generated using the `pesde patch` command.
<LinkCard
title="Overrides"
description="Learn more about overriding and patching packages."
href="/guides/overrides/"
/>
## `[place]`
This is used in Roblox projects to specify where packages are located in the
Roblox datamodel.
```toml
[place]
shared = "game.ReplicatedStorage.Packages"
server = "game.ServerScriptService.Packages"
```
## `[dependencies]`
The `[dependencies]` section contains a list of dependencies for the package.
```toml
[dependencies]
foo = { name = "acme/foo", version = "1.2.3" }
bar = { wally = "acme/bar", version = "2.3.4" }
baz = { repo = "acme/baz", rev = "main" }
```
Each key in the dependencies table is the name of the dependency, and the value
is a dependency specifier.
There are several types of dependency specifiers.
### pesde
```toml
[dependencies]
foo = { name = "acme/foo", version = "1.2.3", index = "acme", target = "lune" }
```
**pesde dependencies** contain the following fields:
- `name`: The name of the package.
- `version`: The version of the package.
- `index`: The [pesde index](#indices) to install the package from. If not
specified, the `default` index is used.
- `target`: The target platform for the package. If not specified, the target
platform of the current package is used.
### Wally
```toml
[dependencies]
foo = { wally = "acme/foo", version = "1.2.3", index = "acme" }
```
**Wally dependencies** contain the following fields:
- `wally`: The name of the package.
- `version`: The version of the package.
- `index`: The [Wally index](#wally_indices) to install the package from. If not
specified, the `default` index is used.
### Git
```toml
[dependencies]
foo = { repo = "acme/packages", rev = "aeff6", path = "foo" }
```
**Git dependencies** contain the following fields:
- `repo`: The URL of the Git repository.
This can either be `<owner>/<name>` for a GitHub repository, or a full URL.
- `rev`: The Git revision to install. This can be a tag or commit hash.
- `path`: The path within the repository to install. If not specified, the root
of the repository is used.
### Workspace
```toml
[dependencies]
foo = { workspace = "acme/foo", version = "^" }
```
**Workspace dependencies** contain the following fields:
- `workspace`: The name of the package in the workspace.
- `version`: The version requirement for the package. This can be `^`, `*`, `=`,
`~`, or a specific version requirement such as `^1.2.3`.
<LinkCard
title="Workspaces"
description="Learn more about workspace dependencies in pesde."
href="/guides/workspaces/#workspace-dependencies"
/>
## `[peer_dependencies]`
The `[peer_dependencies]` section contains a list of peer dependencies for the
package. These are dependencies that are required by the package, but are not
installed automatically. Instead, they must be installed by the user of the
package.
```toml
[peer_dependencies]
foo = { name = "acme/foo", version = "1.2.3" }
```
## `[dev_dependencies]`
The `[dev_dependencies]` section contains a list of development dependencies for
the package. These are dependencies that are only required during development,
such as testing libraries or build tools. They are not installed when the
package is used by another package.
```toml
[dev_dependencies]
foo = { name = "acme/foo", version = "1.2.3" }
```
<br />
<LinkCard
title="Specifying Dependencies"
description="Learn more about specifying dependencies in pesde."
href="/guides/dependencies/"
/>


@ -0,0 +1,96 @@
---
title: Policies
description: Policies for the pesde registry
---
The following policies apply to the [official public pesde registry](https://registry.pesde.daimond113.com)
and its related services, such as the index repository or websites.
They may not apply to other registries. By using the pesde registry, you agree
to these policies.
If anything is unclear, please [contact us](#contact-us), and we will be happy
to help.
## Contact Us
You can contact us at [pesde@daimond113.com](mailto:pesde@daimond113.com). In
case of a security issue, please prefix the subject with `[SECURITY]`.
## Permitted content
The pesde registry is a place for Luau-related packages. This includes:
- Libraries
- Frameworks
- Tools
The following content is forbidden:
- Malicious, vulnerable code
- Illegal, harmful content
- Miscellaneous files unrelated to the package (this does not include configuration files, documentation, etc.)
pesde is not responsible for the content of packages; the scope owner is. It
is the responsibility of the scope owner to ensure that the content of their
packages is compliant with the permitted content policy.
If you believe a package is breaking these requirements, please [contact us](#contact-us).
## Package removal
pesde does not support removing packages for reasons such as abandonment. A
package may only be removed for the following reasons:
- The package is breaking the permitted content policy
- The package contains security vulnerabilities
- The package must be removed for legal reasons (e.g. DMCA takedown)
In case a secret has been published to the registry, it must be invalidated.
If you believe a package should be removed, please [contact us](#contact-us).
We will review your request and take action if necessary.
If we find that a package is breaking the permitted content policy, we will
exercise our right to remove it from the registry without notice.
pesde reserves the right to remove any package from the registry at any time for
any or no reason, without notice.
## Package ownership
Packages are owned by scopes. Scope ownership is determined by the first person
to publish a package to the scope. The owner of the scope may send a pull request
to the index repository adding team members' user IDs to the scope's `scope.toml`
file to give them access to the scope; however, at least one package must be
published to the scope before this can be done. The owner may also remove team
members from the scope.
A scope's true owner's ID must appear first in the `owners` field of the scope's
`scope.toml` file. Ownership may be transferred by the current owner sending a
pull request to the index repository, and the new owner confirming the transfer.
Only the owner may add or remove team members from the scope.
pesde reserves the right to override scope ownership in the case of a dispute,
such as if the original owner is unresponsive or multiple parties claim ownership.
## Scope squatting
Scope squatting is the act of creating a scope with the intent of preventing
others from using it, without any intention of using it yourself. This is
forbidden and can result in the removal (release) of the scope and its packages
from the registry without notice.
If you believe a scope is being squatted, please [contact us](#contact-us).
We will review your request and take action if necessary.
## API Usage
The pesde registry has an API for querying, downloading, and publishing packages.
Only non-malicious use is permitted. Malicious uses include:
- **Service Degradation**: this includes sending an excessive amount of requests
to the registry in order to degrade the service
- **Exploitation**: this includes trying to break the security of the registry
in order to gain unauthorized access
- **Harmful content**: this includes publishing harmful (non-law compliant,
purposefully insecure) content

docs/src/env.d.ts vendored Normal file

@ -0,0 +1,2 @@
/// <reference path="../.astro/types.d.ts" />
/// <reference types="astro/client" />

docs/src/tailwind.css Normal file

@ -0,0 +1,11 @@
@tailwind base;
@tailwind components;
@tailwind utilities;
:root[data-theme="light"] {
--sl-color-bg: rgb(255 245 230);
}
:root[data-theme="light"] .sidebar-pane {
background-color: var(--sl-color-bg);
}

docs/tailwind.config.ts Normal file

@ -0,0 +1,36 @@
import starlightPlugin from "@astrojs/starlight-tailwind"
import type { Config } from "tailwindcss"
import defaultTheme from "tailwindcss/defaultTheme"
export default {
content: ["./src/**/*.{astro,html,js,jsx,md,mdx,svelte,ts,tsx,vue}"],
theme: {
extend: {
fontFamily: {
sans: ["Nunito Sans Variable", ...defaultTheme.fontFamily.sans],
},
colors: {
accent: {
200: "rgb(241 157 30)",
600: "rgb(120 70 10)",
900: "rgb(24 16 8)",
950: "rgb(10 7 4)",
},
gray: {
100: "rgb(245 230 210)",
200: "rgb(228 212 192)",
300: "rgb(198 167 140)",
400: "rgb(142 128 112)",
500: "rgb(84 70 50)",
600: "rgb(65 50 41)",
700: "rgb(50 42 35)",
800: "rgb(28 22 17)",
900: "rgb(10 7 4)",
},
},
},
},
plugins: [starlightPlugin()],
} as Config

docs/tsconfig.json Normal file

@ -0,0 +1,3 @@
{
"extends": "astro/tsconfigs/strict"
}


@ -1,8 +1,3 @@
# fly.toml app configuration file generated for pesde-registry on 2024-03-04T20:57:13+01:00
#
# See https://fly.io/docs/reference/configuration/ for information about how to use this file.
#
app = 'pesde-registry'
primary_region = 'waw'
kill_signal = 'SIGINT'
@ -13,14 +8,14 @@ kill_timeout = '5s'
[env]
ADDRESS = '0.0.0.0'
PORT = '8080'
INDEX_REPO_URL = 'https://github.com/daimond113/pesde-index'
COMMITTER_GIT_NAME = 'Pesde Index Updater'
COMMITTER_GIT_NAME = 'pesde index updater'
COMMITTER_GIT_EMAIL = 'pesde@daimond113.com'
INDEX_REPO_URL = 'https://github.com/pesde-pkg/index'
[http_service]
internal_port = 8080
force_https = true
auto_stop_machines = true
auto_stop_machines = "stop"
auto_start_machines = true
min_machines_running = 0
processes = ['app']


@ -1,11 +1,43 @@
INDEX_REPO_URL=# url of the git repository to be used as the package index
S3_ENDPOINT=# endpoint of the s3 bucket
S3_BUCKET_NAME=# name of the s3 bucket
S3_REGION=# region of the s3 bucket
S3_ACCESS_KEY=# access key of the s3 bucket
S3_SECRET_KEY=# secret key of the s3 bucket
COMMITTER_GIT_NAME=# name of the committer used for index updates
COMMITTER_GIT_EMAIL=# email of the committer used for index updates
GITHUB_USERNAME=# username of github account with push access to the index repository
GITHUB_PAT=# personal access token of github account with push access to the index repository
SENTRY_URL=# optional url of sentry error tracking
INDEX_REPO_URL = # url of the index repository
GIT_USERNAME= # username of a Git account with push access to the index repository
GIT_PASSWORD= # password of the account (PAT for GitHub)
COMMITTER_GIT_NAME= # name of the committer used for index updates
COMMITTER_GIT_EMAIL= # email of the committer used for index updates
DATA_DIR= # directory where miscellaneous data is stored
# AUTHENTICATION CONFIGURATION
# Set the variables of the authentication you want to use in order to enable it
READ_NEEDS_AUTH= # set to any value to require authentication for read requests
# Single Token
ACCESS_TOKEN= # a single token that is used to authenticate all publish requests
# Read/Write Tokens
# READ_NEEDS_AUTH isn't required for this
READ_ACCESS_TOKEN= # a token that is used to authenticate read requests
WRITE_ACCESS_TOKEN= # a token that is used to authenticate write requests
# GitHub
GITHUB_CLIENT_SECRET= # client secret of the GitHub OAuth app configured in the index's `config.toml`
# If none of the above is set, no authentication is required, even for write requests
# STORAGE CONFIGURATION
# Set the variables of the storage you want to use in order to enable it
# S3
S3_ENDPOINT= # endpoint of the S3 bucket
S3_BUCKET_NAME= # name of the S3 bucket
S3_REGION= # region of the S3 bucket
S3_ACCESS_KEY= # access key of the S3 bucket
S3_SECRET_KEY= # secret key of the S3 bucket
# FS
FS_STORAGE_ROOT= # root directory of the filesystem storage
SENTRY_DSN= # optional DSN of Sentry error tracking

registry/CHANGELOG.md Normal file

@ -0,0 +1,22 @@
# Changelog
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [0.1.2]
### Changed
- Update to pesde lib API changes by @daimond113
## [0.1.1] - 2024-12-19
### Changed
- Switch to tracing for logging by @daimond113
## [0.1.0] - 2024-12-14
### Added
- Rewrite registry for pesde v0.5.0 by @daimond113
[0.1.2]: https://github.com/daimond113/pesde/compare/v0.5.2%2Bregistry.0.1.1..v0.5.3%2Bregistry.0.1.2
[0.1.1]: https://github.com/daimond113/pesde/compare/v0.5.1%2Bregistry.0.1.0..v0.5.2%2Bregistry.0.1.1
[0.1.0]: https://github.com/daimond113/pesde/compare/v0.4.7..v0.5.0%2Bregistry.0.1.0


@ -1,30 +1,50 @@
[package]
name = "pesde-registry"
version = "0.6.1"
version = "0.1.2"
edition = "2021"
repository = "https://github.com/pesde-pkg/index"
publish = false
[dependencies]
actix-web = "4.5.1"
actix-web = "4.9.0"
actix-cors = "0.7.0"
actix-web-httpauth = "0.8.1"
actix-multipart = "0.6.1"
actix-multipart-derive = "0.6.1"
actix-governor = "0.5.0"
actix-governor = "0.8.0"
dotenvy = "0.15.7"
reqwest = { version = "0.12.1", features = ["json", "blocking"] }
thiserror = "2.0.7"
tantivy = "0.22.0"
semver = "1.0.24"
chrono = { version = "0.4.39", features = ["serde"] }
futures = "0.3.31"
tokio = "1.42.0"
tempfile = "3.14.0"
fs-err = { version = "3.0.0", features = ["tokio"] }
async-stream = "0.3.6"
git2 = "0.19.0"
gix = { version = "0.68.0", default-features = false, features = [
"blocking-http-transport-reqwest-rust-tls",
"credentials",
] }
serde = "1.0.216"
serde_json = "1.0.133"
serde_yaml = "0.9.34"
toml = "0.8.19"
convert_case = "0.6.0"
sha2 = "0.10.8"
rusty-s3 = "0.5.0"
serde = { version = "1.0.197", features = ["derive"] }
serde_json = "1.0.114"
serde_yaml = "0.9.33"
flate2 = "1.0.28"
tar = "0.4.40"
pesde = { path = ".." }
semver = "1.0.22"
git2 = "0.18.3"
thiserror = "1.0.58"
tantivy = "0.21.1"
log = "0.4.21"
pretty_env_logger = "0.5.0"
sentry = "0.32.2"
sentry-log = "0.32.2"
sentry-actix = "0.32.2"
reqwest = { version = "0.12.9", features = ["json", "rustls-tls"] }
constant_time_eq = "0.3.1"
tokio-tar = "0.3.1"
async-compression = { version = "0.4.18", features = ["tokio", "gzip"] }
tracing = { version = "0.1.41", features = ["attributes"] }
tracing-subscriber = { version = "0.3.19", features = ["env-filter"] }
tracing-actix-web = "0.7.15"
sentry = { version = "0.35.0", default-features = false, features = ["backtrace", "contexts", "debug-images", "panic", "reqwest", "rustls", "tracing"] }
sentry-actix = "0.35.0"
pesde = { path = "..", features = ["wally-compat"] }


@ -0,0 +1,87 @@
use crate::{
auth::{get_token_from_req, AuthImpl, UserId},
error::ReqwestErrorExt,
};
use actix_web::{dev::ServiceRequest, Error as ActixError};
use reqwest::StatusCode;
use serde::{Deserialize, Serialize};
use std::fmt::Display;
#[derive(Debug)]
pub struct GitHubAuth {
pub reqwest_client: reqwest::Client,
pub client_id: String,
pub client_secret: String,
}
#[derive(Debug, Serialize)]
struct TokenRequestBody {
access_token: String,
}
impl AuthImpl for GitHubAuth {
async fn for_write_request(&self, req: &ServiceRequest) -> Result<Option<UserId>, ActixError> {
let token = match get_token_from_req(req) {
Some(token) => token,
None => return Ok(None),
};
let response = match self
.reqwest_client
.post(format!(
"https://api.github.com/applications/{}/token",
self.client_id
))
.basic_auth(&self.client_id, Some(&self.client_secret))
.json(&TokenRequestBody {
access_token: token,
})
.send()
.await
{
Ok(response) => match response.error_for_status_ref() {
Ok(_) => response,
Err(e) if e.status().is_some_and(|s| s == StatusCode::NOT_FOUND) => {
return Ok(None);
}
Err(_) => {
tracing::error!(
"failed to get user: {}",
response.into_error().await.unwrap_err()
);
return Ok(None);
}
},
Err(e) => {
tracing::error!("failed to get user: {e}");
return Ok(None);
}
};
let user_id = match response.json::<UserResponse>().await {
Ok(resp) => resp.user.id,
Err(e) => {
tracing::error!("failed to get user: {e}");
return Ok(None);
}
};
Ok(Some(UserId(user_id)))
}
}
impl Display for GitHubAuth {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "GitHub")
}
}
#[derive(Debug, Deserialize)]
struct User {
id: u64,
}
#[derive(Debug, Deserialize)]
struct UserResponse {
user: User,
}

registry/src/auth/mod.rs Normal file

@ -0,0 +1,199 @@
mod github;
mod none;
mod rw_token;
mod token;
use crate::{benv, make_reqwest, AppState};
use actix_governor::{KeyExtractor, SimpleKeyExtractionError};
use actix_web::{
body::MessageBody,
dev::{ServiceRequest, ServiceResponse},
error::Error as ActixError,
http::header::AUTHORIZATION,
middleware::Next,
web, HttpMessage, HttpResponse,
};
use pesde::source::pesde::IndexConfig;
use sentry::add_breadcrumb;
use sha2::{Digest, Sha256};
use std::fmt::Display;
#[derive(Debug, Copy, Clone, Hash, PartialOrd, PartialEq, Eq, Ord)]
pub struct UserId(pub u64);
impl UserId {
// there isn't any account on GitHub that has the ID 0, so it should be safe to use it
pub const DEFAULT: UserId = UserId(0);
}
#[derive(Debug, Clone)]
pub struct UserIdExtractor;
impl KeyExtractor for UserIdExtractor {
type Key = UserId;
type KeyExtractionError = SimpleKeyExtractionError<&'static str>;
fn extract(&self, req: &ServiceRequest) -> Result<Self::Key, Self::KeyExtractionError> {
match req.extensions().get::<UserId>() {
Some(user_id) => Ok(*user_id),
None => Err(SimpleKeyExtractionError::new("UserId not found")),
}
}
}
#[derive(Debug)]
pub enum Auth {
GitHub(github::GitHubAuth),
None(none::NoneAuth),
Token(token::TokenAuth),
RwToken(rw_token::RwTokenAuth),
}
pub trait AuthImpl: Display {
async fn for_write_request(&self, req: &ServiceRequest) -> Result<Option<UserId>, ActixError>;
async fn for_read_request(&self, req: &ServiceRequest) -> Result<Option<UserId>, ActixError> {
self.for_write_request(req).await
}
fn read_needs_auth(&self) -> bool {
benv!("READ_NEEDS_AUTH").is_ok()
}
}
impl AuthImpl for Auth {
async fn for_write_request(&self, req: &ServiceRequest) -> Result<Option<UserId>, ActixError> {
match self {
Auth::GitHub(github) => github.for_write_request(req).await,
Auth::None(none) => none.for_write_request(req).await,
Auth::Token(token) => token.for_write_request(req).await,
Auth::RwToken(rw_token) => rw_token.for_write_request(req).await,
}
}
async fn for_read_request(&self, req: &ServiceRequest) -> Result<Option<UserId>, ActixError> {
match self {
Auth::GitHub(github) => github.for_read_request(req).await,
Auth::None(none) => none.for_write_request(req).await,
Auth::Token(token) => token.for_write_request(req).await,
Auth::RwToken(rw_token) => rw_token.for_read_request(req).await,
}
}
fn read_needs_auth(&self) -> bool {
match self {
Auth::GitHub(github) => github.read_needs_auth(),
Auth::None(none) => none.read_needs_auth(),
Auth::Token(token) => token.read_needs_auth(),
Auth::RwToken(rw_token) => rw_token.read_needs_auth(),
}
}
}
impl Display for Auth {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
Auth::GitHub(github) => write!(f, "{}", github),
Auth::None(none) => write!(f, "{}", none),
Auth::Token(token) => write!(f, "{}", token),
Auth::RwToken(rw_token) => write!(f, "{}", rw_token),
}
}
}
pub async fn write_mw(
app_state: web::Data<AppState>,
req: ServiceRequest,
next: Next<impl MessageBody + 'static>,
) -> Result<ServiceResponse<impl MessageBody>, ActixError> {
let user_id = match app_state.auth.for_write_request(&req).await? {
Some(user_id) => user_id,
None => {
return Ok(req
.into_response(HttpResponse::Unauthorized().finish())
.map_into_right_body())
}
};
add_breadcrumb(sentry::Breadcrumb {
category: Some("auth".into()),
message: Some(format!("write request authorized as {}", user_id.0)),
level: sentry::Level::Info,
..Default::default()
});
req.extensions_mut().insert(user_id);
next.call(req).await.map(|res| res.map_into_left_body())
}
pub async fn read_mw(
app_state: web::Data<AppState>,
req: ServiceRequest,
next: Next<impl MessageBody + 'static>,
) -> Result<ServiceResponse<impl MessageBody>, ActixError> {
if app_state.auth.read_needs_auth() {
let user_id = match app_state.auth.for_read_request(&req).await? {
Some(user_id) => user_id,
None => {
return Ok(req
.into_response(HttpResponse::Unauthorized().finish())
.map_into_right_body())
}
};
add_breadcrumb(sentry::Breadcrumb {
category: Some("auth".into()),
message: Some(format!("read request authorized as {}", user_id.0)),
level: sentry::Level::Info,
..Default::default()
});
req.extensions_mut().insert(Some(user_id));
} else {
req.extensions_mut().insert(None::<UserId>);
}
next.call(req).await.map(|res| res.map_into_left_body())
}
pub fn get_auth_from_env(config: &IndexConfig) -> Auth {
if let Ok(token) = benv!("ACCESS_TOKEN") {
Auth::Token(token::TokenAuth {
token: *Sha256::digest(token.as_bytes()).as_ref(),
})
} else if let Ok(client_secret) = benv!("GITHUB_CLIENT_SECRET") {
Auth::GitHub(github::GitHubAuth {
reqwest_client: make_reqwest(),
client_id: config
.github_oauth_client_id
.clone()
.expect("index isn't configured for GitHub"),
client_secret,
})
} else if let Ok((r, w)) =
benv!("READ_ACCESS_TOKEN").and_then(|r| benv!("WRITE_ACCESS_TOKEN").map(|w| (r, w)))
{
Auth::RwToken(rw_token::RwTokenAuth {
read_token: *Sha256::digest(r.as_bytes()).as_ref(),
write_token: *Sha256::digest(w.as_bytes()).as_ref(),
})
} else {
Auth::None(none::NoneAuth)
}
}
pub fn get_token_from_req(req: &ServiceRequest) -> Option<String> {
let token = req
.headers()
.get(AUTHORIZATION)
.and_then(|token| token.to_str().ok())?;
let token = if token.to_lowercase().starts_with("bearer ") {
token[7..].to_string()
} else {
token.to_string()
};
Some(token)
}

registry/src/auth/none.rs Normal file

@ -0,0 +1,18 @@
use crate::auth::{AuthImpl, UserId};
use actix_web::{dev::ServiceRequest, Error as ActixError};
use std::fmt::Display;
#[derive(Debug)]
pub struct NoneAuth;
impl AuthImpl for NoneAuth {
async fn for_write_request(&self, _req: &ServiceRequest) -> Result<Option<UserId>, ActixError> {
Ok(Some(UserId::DEFAULT))
}
}
impl Display for NoneAuth {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "None")
}
}


@ -0,0 +1,53 @@
use crate::auth::{get_token_from_req, AuthImpl, UserId};
use actix_web::{dev::ServiceRequest, Error as ActixError};
use constant_time_eq::constant_time_eq_32;
use sha2::{Digest, Sha256};
use std::fmt::Display;
#[derive(Debug)]
pub struct RwTokenAuth {
pub read_token: [u8; 32],
pub write_token: [u8; 32],
}
impl AuthImpl for RwTokenAuth {
async fn for_write_request(&self, req: &ServiceRequest) -> Result<Option<UserId>, ActixError> {
let token = match get_token_from_req(req) {
Some(token) => token,
None => return Ok(None),
};
let token: [u8; 32] = Sha256::digest(token.as_bytes()).into();
Ok(if constant_time_eq_32(&self.write_token, &token) {
Some(UserId::DEFAULT)
} else {
None
})
}
async fn for_read_request(&self, req: &ServiceRequest) -> Result<Option<UserId>, ActixError> {
let token = match get_token_from_req(req) {
Some(token) => token,
None => return Ok(None),
};
let token: [u8; 32] = Sha256::digest(token.as_bytes()).into();
Ok(if constant_time_eq_32(&self.read_token, &token) {
Some(UserId::DEFAULT)
} else {
None
})
}
fn read_needs_auth(&self) -> bool {
true
}
}
impl Display for RwTokenAuth {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "RwToken")
}
}
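Both token backends compare SHA-256 digests with `constant_time_eq_32` rather than `==`. A std-only sketch of the property that crate provides (this naive version is illustrative only — a real implementation must also defeat compiler short-circuiting, which is why a vetted crate is used):

```rust
// XOR-accumulate every byte so the loop always runs all 32 iterations;
// the runtime does not depend on where the first differing byte is,
// so a token check cannot leak its position via timing.
fn ct_eq_32(a: &[u8; 32], b: &[u8; 32]) -> bool {
    let mut diff = 0u8;
    for i in 0..32 {
        diff |= a[i] ^ b[i];
    }
    diff == 0
}

fn main() {
    let a = [7u8; 32];
    let mut b = [7u8; 32];
    assert!(ct_eq_32(&a, &b));
    b[31] ^= 1; // flip one bit in the last byte
    assert!(!ct_eq_32(&a, &b));
}
```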

registry/src/auth/token.rs

@ -0,0 +1,34 @@
use crate::auth::{get_token_from_req, AuthImpl, UserId};
use actix_web::{dev::ServiceRequest, Error as ActixError};
use constant_time_eq::constant_time_eq_32;
use sha2::{Digest, Sha256};
use std::fmt::Display;
#[derive(Debug)]
pub struct TokenAuth {
// needs to be an SHA-256 hash
pub token: [u8; 32],
}
impl AuthImpl for TokenAuth {
async fn for_write_request(&self, req: &ServiceRequest) -> Result<Option<UserId>, ActixError> {
let token = match get_token_from_req(req) {
Some(token) => token,
None => return Ok(None),
};
let token: [u8; 32] = Sha256::digest(token.as_bytes()).into();
Ok(if constant_time_eq_32(&self.token, &token) {
Some(UserId::DEFAULT)
} else {
None
})
}
}
impl Display for TokenAuth {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "Token")
}
}

registry/src/endpoints/mod.rs

@ -1,2 +1,4 @@
pub mod packages;
pub mod package_version;
pub mod package_versions;
pub mod publish_version;
pub mod search;

registry/src/endpoints/package_version.rs

@ -0,0 +1,171 @@
use actix_web::{http::header::ACCEPT, web, HttpRequest, HttpResponse, Responder};
use semver::Version;
use serde::{Deserialize, Deserializer};
use crate::{error::Error, package::PackageResponse, storage::StorageImpl, AppState};
use pesde::{
manifest::target::TargetKind,
names::PackageName,
source::{
git_index::{read_file, root_tree, GitBasedSource},
pesde::{DocEntryKind, IndexFile},
},
};
#[derive(Debug)]
pub enum VersionRequest {
Latest,
Specific(Version),
}
impl<'de> Deserialize<'de> for VersionRequest {
fn deserialize<D>(deserializer: D) -> Result<VersionRequest, D::Error>
where
D: Deserializer<'de>,
{
let s = String::deserialize(deserializer)?;
if s.eq_ignore_ascii_case("latest") {
return Ok(VersionRequest::Latest);
}
s.parse()
.map(VersionRequest::Specific)
.map_err(serde::de::Error::custom)
}
}
#[derive(Debug)]
pub enum TargetRequest {
Any,
Specific(TargetKind),
}
impl<'de> Deserialize<'de> for TargetRequest {
fn deserialize<D>(deserializer: D) -> Result<TargetRequest, D::Error>
where
D: Deserializer<'de>,
{
let s = String::deserialize(deserializer)?;
if s.eq_ignore_ascii_case("any") {
return Ok(TargetRequest::Any);
}
s.parse()
.map(TargetRequest::Specific)
.map_err(serde::de::Error::custom)
}
}
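Both deserializers follow the same "magic keyword, else parse" shape. A std-only sketch of that pattern (hypothetical function, with a plain `u32` standing in for `semver::Version` and no serde involved):

```rust
#[derive(Debug, PartialEq)]
enum VersionRequest {
    Latest,
    Specific(u32),
}

// Accept the case-insensitive keyword "latest", otherwise fall back to
// the type's FromStr parse — exactly the shape of the Deserialize impls.
fn parse_version_request(s: &str) -> Result<VersionRequest, String> {
    if s.eq_ignore_ascii_case("latest") {
        return Ok(VersionRequest::Latest);
    }
    s.parse()
        .map(VersionRequest::Specific)
        .map_err(|e| format!("{e}"))
}

fn main() {
    assert_eq!(parse_version_request("LATEST"), Ok(VersionRequest::Latest));
    assert_eq!(parse_version_request("3"), Ok(VersionRequest::Specific(3)));
    assert!(parse_version_request("nope").is_err());
}
```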
#[derive(Debug, Deserialize)]
pub struct Query {
doc: Option<String>,
}
pub async fn get_package_version(
request: HttpRequest,
app_state: web::Data<AppState>,
path: web::Path<(PackageName, VersionRequest, TargetRequest)>,
query: web::Query<Query>,
) -> Result<impl Responder, Error> {
let (name, version, target) = path.into_inner();
let (scope, name_part) = name.as_str();
let file: IndexFile = {
let source = app_state.source.lock().await;
let repo = gix::open(source.path(&app_state.project))?;
let tree = root_tree(&repo)?;
match read_file(&tree, [scope, name_part])? {
Some(versions) => toml::de::from_str(&versions)?,
None => return Ok(HttpResponse::NotFound().finish()),
}
};
let Some((v_id, entry, targets)) = ({
let version = match version {
VersionRequest::Latest => match file.entries.keys().map(|k| k.version()).max() {
Some(latest) => latest.clone(),
None => return Ok(HttpResponse::NotFound().finish()),
},
VersionRequest::Specific(version) => version,
};
let versions = file
.entries
.iter()
.filter(|(v_id, _)| *v_id.version() == version);
match target {
TargetRequest::Any => versions.clone().min_by_key(|(v_id, _)| *v_id.target()),
TargetRequest::Specific(kind) => versions
.clone()
.find(|(_, entry)| entry.target.kind() == kind),
}
.map(|(v_id, entry)| {
(
v_id,
entry,
versions.map(|(_, entry)| (&entry.target).into()).collect(),
)
})
}) else {
return Ok(HttpResponse::NotFound().finish());
};
if let Some(doc_name) = query.doc.as_deref() {
let hash = 'finder: {
let mut hash = entry.docs.iter().map(|doc| &doc.kind).collect::<Vec<_>>();
while let Some(doc) = hash.pop() {
match doc {
DocEntryKind::Page { name, hash } if name == doc_name => {
break 'finder hash.clone()
}
DocEntryKind::Category { items, .. } => {
hash.extend(items.iter().map(|item| &item.kind))
}
_ => continue,
};
}
return Ok(HttpResponse::NotFound().finish());
};
return app_state.storage.get_doc(&hash).await;
}
let accept = request
.headers()
.get(ACCEPT)
.and_then(|accept| accept.to_str().ok())
.and_then(|accept| match accept.to_lowercase().as_str() {
"text/plain" => Some(true),
"application/octet-stream" => Some(false),
_ => None,
});
if let Some(readme) = accept {
return if readme {
app_state.storage.get_readme(&name, v_id).await
} else {
app_state.storage.get_package(&name, v_id).await
};
}
let response = PackageResponse {
name: name.to_string(),
version: v_id.version().to_string(),
targets,
description: entry.description.clone().unwrap_or_default(),
published_at: entry.published_at,
license: entry.license.clone().unwrap_or_default(),
authors: entry.authors.clone(),
repository: entry.repository.clone().map(|url| url.to_string()),
};
let mut value = serde_json::to_value(response)?;
value["docs"] = serde_json::to_value(entry.docs.clone())?;
value["dependencies"] = serde_json::to_value(entry.dependencies.clone())?;
Ok(HttpResponse::Ok().json(value))
}
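The `Accept`-header branch above is a small content-negotiation table. A std-only sketch of that dispatch (hypothetical enum name): `text/plain` selects the readme, `application/octet-stream` the package tarball, and anything else falls through to the JSON metadata response.

```rust
#[derive(Debug, PartialEq)]
enum Artifact {
    Readme,
    Package,
    Json,
}

// Lowercase the header value before matching, as the handler does,
// so negotiation is case-insensitive.
fn dispatch(accept: Option<&str>) -> Artifact {
    match accept.map(|a| a.to_lowercase()).as_deref() {
        Some("text/plain") => Artifact::Readme,
        Some("application/octet-stream") => Artifact::Package,
        _ => Artifact::Json,
    }
}

fn main() {
    assert_eq!(dispatch(Some("text/plain")), Artifact::Readme);
    assert_eq!(dispatch(Some("Application/Octet-Stream")), Artifact::Package);
    assert_eq!(dispatch(None), Artifact::Json);
}
```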

registry/src/endpoints/package_versions.rs

@ -0,0 +1,54 @@
use std::collections::{BTreeMap, BTreeSet};
use actix_web::{web, HttpResponse, Responder};
use crate::{error::Error, package::PackageResponse, AppState};
use pesde::{
names::PackageName,
source::{
git_index::{read_file, root_tree, GitBasedSource},
pesde::IndexFile,
},
};
pub async fn get_package_versions(
app_state: web::Data<AppState>,
path: web::Path<PackageName>,
) -> Result<impl Responder, Error> {
let name = path.into_inner();
let (scope, name_part) = name.as_str();
let file: IndexFile = {
let source = app_state.source.lock().await;
let repo = gix::open(source.path(&app_state.project))?;
let tree = root_tree(&repo)?;
match read_file(&tree, [scope, name_part])? {
Some(versions) => toml::de::from_str(&versions)?,
None => return Ok(HttpResponse::NotFound().finish()),
}
};
let mut responses = BTreeMap::new();
for (v_id, entry) in file.entries {
let info = responses
.entry(v_id.version().clone())
.or_insert_with(|| PackageResponse {
name: name.to_string(),
version: v_id.version().to_string(),
targets: BTreeSet::new(),
description: entry.description.unwrap_or_default(),
published_at: entry.published_at,
license: entry.license.unwrap_or_default(),
authors: entry.authors.clone(),
repository: entry.repository.clone().map(|url| url.to_string()),
});
info.targets.insert(entry.target.into());
info.published_at = info.published_at.max(entry.published_at);
}
Ok(HttpResponse::Ok().json(responses.into_values().collect::<Vec<_>>()))
}
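`get_package_versions` collapses per-(version, target) index entries into one record per version via `BTreeMap::entry`. A std-only sketch of that aggregation (hypothetical types — `u32` versions, `&str` targets, `u64` timestamps), unioning targets and keeping the newest publish time:

```rust
use std::collections::{BTreeMap, BTreeSet};

// Fold (version, target, published_at) rows into one slot per version:
// the target set grows, and published_at keeps the maximum seen.
fn aggregate(
    entries: &[(u32, &'static str, u64)],
) -> BTreeMap<u32, (BTreeSet<&'static str>, u64)> {
    let mut out = BTreeMap::new();
    for &(version, target, published_at) in entries {
        let slot = out
            .entry(version)
            .or_insert_with(|| (BTreeSet::new(), published_at));
        slot.0.insert(target);
        slot.1 = slot.1.max(published_at);
    }
    out
}

fn main() {
    let agg = aggregate(&[(1, "luau", 10), (1, "lune", 12), (2, "luau", 20)]);
    assert_eq!(agg[&1].0.len(), 2); // two targets for version 1
    assert_eq!(agg[&1].1, 12); // newest publish time wins
    assert_eq!(agg.len(), 2);
}
```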

registry/src/endpoints/packages.rs (deleted)

@ -1,256 +0,0 @@
use actix_multipart::form::{bytes::Bytes, MultipartForm};
use actix_web::{web, HttpResponse, Responder};
use flate2::read::GzDecoder;
use log::error;
use reqwest::StatusCode;
use rusty_s3::S3Action;
use tantivy::{doc, DateTime, Term};
use tar::Archive;
use pesde::{
dependencies::DependencySpecifier, index::Index, manifest::Manifest,
package_name::StandardPackageName, project::DEFAULT_INDEX_NAME, IGNORED_FOLDERS,
MANIFEST_FILE_NAME,
};
use crate::{commit_signature, errors, AppState, UserId, S3_EXPIRY};
#[derive(MultipartForm)]
pub struct CreateForm {
#[multipart(limit = "4 MiB")]
tarball: Bytes,
}
pub async fn create_package(
form: MultipartForm<CreateForm>,
app_state: web::Data<AppState>,
user_id: web::ReqData<UserId>,
) -> Result<impl Responder, errors::Errors> {
let bytes = form.tarball.data.as_ref().to_vec();
let mut decoder = GzDecoder::new(bytes.as_slice());
let mut archive = Archive::new(&mut decoder);
let archive_entries = archive.entries()?.filter_map(|e| e.ok());
let mut manifest = None;
for mut e in archive_entries {
let Ok(path) = e.path() else {
return Ok(HttpResponse::BadRequest().json(errors::ErrorResponse {
error: "Attached file contains non-UTF-8 path".to_string(),
}));
};
let Some(path) = path.as_os_str().to_str() else {
return Ok(HttpResponse::BadRequest().json(errors::ErrorResponse {
error: "Attached file contains non-UTF-8 path".to_string(),
}));
};
match path {
MANIFEST_FILE_NAME => {
if !e.header().entry_type().is_file() {
continue;
}
let received_manifest: Manifest =
serde_yaml::from_reader(&mut e).map_err(errors::Errors::UserYaml)?;
manifest = Some(received_manifest);
}
path => {
if e.header().entry_type().is_file() {
continue;
}
if IGNORED_FOLDERS.contains(&path) {
return Ok(HttpResponse::BadRequest().json(errors::ErrorResponse {
error: format!("Attached file contains forbidden directory {}", path),
}));
}
}
}
}
let Some(manifest) = manifest else {
return Ok(HttpResponse::BadRequest().json(errors::ErrorResponse {
error: format!("Attached file doesn't contain {MANIFEST_FILE_NAME}"),
}));
};
let (scope, name) = manifest.name.parts();
let entry = {
let mut index = app_state.index.lock().unwrap();
let config = index.config()?;
for (dependency, _) in manifest.dependencies().into_values() {
match dependency {
DependencySpecifier::Git(_) => {
if !config.git_allowed {
return Ok(HttpResponse::BadRequest().json(errors::ErrorResponse {
error: "Git dependencies are not allowed on this registry".to_string(),
}));
}
}
DependencySpecifier::Registry(registry) => {
if index
.package(&registry.name.clone().into())
.unwrap()
.is_none()
{
return Ok(HttpResponse::BadRequest().json(errors::ErrorResponse {
error: format!("Dependency {} not found", registry.name),
}));
}
if registry.index != DEFAULT_INDEX_NAME && !config.custom_registry_allowed {
return Ok(HttpResponse::BadRequest().json(errors::ErrorResponse {
error: "Custom registries are not allowed on this registry".to_string(),
}));
}
}
#[allow(unreachable_patterns)]
_ => {}
};
}
match index.create_package_version(&manifest, &user_id.0)? {
Some(entry) => {
index.commit_and_push(
&format!("Add version {}@{}", manifest.name, manifest.version),
&commit_signature(),
)?;
entry
}
None => {
return Ok(HttpResponse::BadRequest().json(errors::ErrorResponse {
error: format!(
"Version {} of {} already exists",
manifest.version, manifest.name
),
}));
}
}
};
{
let mut search_writer = app_state.search_writer.lock().unwrap();
let schema = search_writer.index().schema();
let name_field = schema.get_field("name").unwrap();
search_writer.delete_term(Term::from_field_text(
name_field,
&manifest.name.to_string(),
));
search_writer.add_document(
doc!(
name_field => manifest.name.to_string(),
schema.get_field("version").unwrap() => manifest.version.to_string(),
schema.get_field("description").unwrap() => manifest.description.unwrap_or_default(),
schema.get_field("published_at").unwrap() => DateTime::from_timestamp_secs(entry.published_at.timestamp())
)
).unwrap();
search_writer.commit().unwrap();
}
let url = app_state
.s3_bucket
.put_object(
Some(&app_state.s3_credentials),
&format!("{scope}-{name}-{}.tar.gz", manifest.version),
)
.sign(S3_EXPIRY);
app_state.reqwest_client.put(url).body(bytes).send().await?;
Ok(HttpResponse::Ok().body(format!(
"Successfully published {}@{}",
manifest.name, manifest.version
)))
}
pub async fn get_package_version(
app_state: web::Data<AppState>,
path: web::Path<(String, String, String)>,
) -> Result<impl Responder, errors::Errors> {
let (scope, name, mut version) = path.into_inner();
let package_name = StandardPackageName::new(&scope, &name)?;
{
let index = app_state.index.lock().unwrap();
match index.package(&package_name.clone().into())? {
Some(package) => {
if version == "latest" {
version = package.last().map(|v| v.version.to_string()).unwrap();
} else if !package.iter().any(|v| v.version.to_string() == version) {
return Ok(HttpResponse::NotFound().finish());
}
}
None => return Ok(HttpResponse::NotFound().finish()),
}
}
let url = app_state
.s3_bucket
.get_object(
Some(&app_state.s3_credentials),
&format!("{scope}-{name}-{version}.tar.gz"),
)
.sign(S3_EXPIRY);
let response = match app_state
.reqwest_client
.get(url)
.send()
.await?
.error_for_status()
{
Ok(response) => response,
Err(e) => {
if let Some(status) = e.status() {
if status == StatusCode::NOT_FOUND {
error!(
"package {}@{} not found in S3, but found in index",
package_name, version
);
return Ok(HttpResponse::InternalServerError().finish());
}
}
return Err(e.into());
}
};
Ok(HttpResponse::Ok().body(response.bytes().await?))
}
pub async fn get_package_versions(
app_state: web::Data<AppState>,
path: web::Path<(String, String)>,
) -> Result<impl Responder, errors::Errors> {
let (scope, name) = path.into_inner();
let package_name = StandardPackageName::new(&scope, &name)?;
{
let index = app_state.index.lock().unwrap();
match index.package(&package_name.into())? {
Some(package) => {
let versions = package
.iter()
.map(|v| (v.version.to_string(), v.published_at.timestamp()))
.collect::<Vec<_>>();
Ok(HttpResponse::Ok().json(versions))
}
None => Ok(HttpResponse::NotFound().finish()),
}
}
}

registry/src/endpoints/publish_version.rs

@ -0,0 +1,507 @@
use crate::{
auth::UserId,
benv,
error::{Error, ErrorResponse},
search::update_version,
storage::StorageImpl,
AppState,
};
use actix_web::{web, web::Bytes, HttpResponse, Responder};
use async_compression::Level;
use convert_case::{Case, Casing};
use fs_err::tokio as fs;
use futures::{future::join_all, join};
use git2::{Remote, Repository, Signature};
use pesde::{
manifest::Manifest,
source::{
git_index::{read_file, root_tree, GitBasedSource},
pesde::{DocEntry, DocEntryKind, IndexFile, IndexFileEntry, ScopeInfo, SCOPE_INFO_FILE},
specifiers::DependencySpecifiers,
version_id::VersionId,
IGNORED_DIRS, IGNORED_FILES,
},
MANIFEST_FILE_NAME,
};
use sentry::add_breadcrumb;
use serde::Deserialize;
use sha2::{Digest, Sha256};
use std::{
collections::{BTreeSet, HashMap},
io::{Cursor, Write},
};
use tokio::io::{AsyncReadExt, AsyncWriteExt};
fn signature<'a>() -> Signature<'a> {
Signature::now(
&benv!(required "COMMITTER_GIT_NAME"),
&benv!(required "COMMITTER_GIT_EMAIL"),
)
.unwrap()
}
fn get_refspec(repo: &Repository, remote: &mut Remote) -> Result<String, git2::Error> {
let upstream_branch_buf = repo.branch_upstream_name(repo.head()?.name().unwrap())?;
let upstream_branch = upstream_branch_buf.as_str().unwrap();
let refspec_buf = remote
.refspecs()
.find(|r| r.direction() == git2::Direction::Fetch && r.dst_matches(upstream_branch))
.unwrap()
.rtransform(upstream_branch)?;
let refspec = refspec_buf.as_str().unwrap();
Ok(refspec.to_string())
}
const ADDITIONAL_FORBIDDEN_FILES: &[&str] = &["default.project.json"];
#[derive(Debug, Deserialize, Default)]
struct DocEntryInfo {
#[serde(default)]
label: Option<String>,
#[serde(default, alias = "position")]
sidebar_position: Option<usize>,
#[serde(default)]
collapsed: bool,
}
pub async fn publish_package(
app_state: web::Data<AppState>,
bytes: Bytes,
user_id: web::ReqData<UserId>,
) -> Result<impl Responder, Error> {
let source = app_state.source.lock().await;
source.refresh(&app_state.project).await.map_err(Box::new)?;
let config = source.config(&app_state.project).await?;
let package_dir = tempfile::tempdir()?;
{
let mut decoder = async_compression::tokio::bufread::GzipDecoder::new(Cursor::new(&bytes));
let mut archive = tokio_tar::Archive::new(&mut decoder);
archive.unpack(package_dir.path()).await?;
}
let mut manifest = None::<Manifest>;
let mut readme = None::<Vec<u8>>;
let mut docs = BTreeSet::new();
let mut docs_pages = HashMap::new();
let mut read_dir = fs::read_dir(package_dir.path()).await?;
while let Some(entry) = read_dir.next_entry().await? {
let file_name = entry
.file_name()
.to_str()
.ok_or_else(|| Error::InvalidArchive("file name contains non UTF-8 characters".into()))?
.to_string();
if entry.file_type().await?.is_dir() {
if IGNORED_DIRS.contains(&file_name.as_str()) {
return Err(Error::InvalidArchive(format!(
"archive contains forbidden directory: {file_name}"
)));
}
if file_name == "docs" {
let mut stack = vec![(
BTreeSet::new(),
fs::read_dir(entry.path()).await?,
None::<DocEntryInfo>,
)];
'outer: while let Some((set, iter, category_info)) = stack.last_mut() {
while let Some(entry) = iter.next_entry().await? {
let file_name = entry
.file_name()
.to_str()
.ok_or_else(|| {
Error::InvalidArchive(
"file name contains non UTF-8 characters".into(),
)
})?
.to_string();
if entry.file_type().await?.is_dir() {
stack.push((
BTreeSet::new(),
fs::read_dir(entry.path()).await?,
Some(DocEntryInfo {
label: Some(file_name.to_case(Case::Title)),
..Default::default()
}),
));
continue 'outer;
}
if file_name == "_category_.json" {
let info = fs::read_to_string(entry.path()).await?;
let mut info: DocEntryInfo = serde_json::from_str(&info)?;
let old_info = category_info.take();
info.label = info.label.or(old_info.and_then(|i| i.label));
*category_info = Some(info);
continue;
}
let Some(file_name) = file_name.strip_suffix(".md") else {
continue;
};
let content = fs::read_to_string(entry.path()).await?;
let content = content.trim();
let hash = format!("{:x}", Sha256::digest(content.as_bytes()));
let mut gz = async_compression::tokio::bufread::GzipEncoder::with_quality(
Cursor::new(content.as_bytes().to_vec()),
Level::Best,
);
let mut bytes = vec![];
gz.read_to_end(&mut bytes).await?;
docs_pages.insert(hash.to_string(), bytes);
let mut lines = content.lines().peekable();
let front_matter = if lines.peek().filter(|l| **l == "---").is_some() {
lines.next(); // skip the first `---`
let front_matter = lines
.by_ref()
.take_while(|l| *l != "---")
.collect::<Vec<_>>()
.join("\n");
lines.next(); // skip the last `---`
front_matter
} else {
"".to_string()
};
let h1 = lines
.find(|l| !l.trim().is_empty())
.and_then(|l| l.strip_prefix("# "))
.map(|s| s.to_string());
let info: DocEntryInfo =
serde_yaml::from_str(&front_matter).map_err(|_| {
Error::InvalidArchive(format!(
"doc {file_name}'s frontmatter isn't valid YAML"
))
})?;
set.insert(DocEntry {
label: info.label.or(h1).unwrap_or(file_name.to_case(Case::Title)),
position: info.sidebar_position,
kind: DocEntryKind::Page {
name: entry
.path()
.strip_prefix(package_dir.path().join("docs"))
.unwrap()
.with_extension("")
.to_str()
.ok_or_else(|| {
Error::InvalidArchive(
"file name contains non UTF-8 characters".into(),
)
})?
// ensure that the path is always using forward slashes
.replace("\\", "/"),
hash,
},
});
}
// should never be None
let (popped, _, category_info) = stack.pop().unwrap();
docs = popped;
if let Some((set, _, _)) = stack.last_mut() {
let category_info = category_info.unwrap_or_default();
set.insert(DocEntry {
label: category_info.label.unwrap(),
position: category_info.sidebar_position,
kind: DocEntryKind::Category {
items: {
let curr_docs = docs;
docs = BTreeSet::new();
curr_docs
},
collapsed: category_info.collapsed,
},
});
}
}
}
continue;
}
if IGNORED_FILES.contains(&file_name.as_str())
|| ADDITIONAL_FORBIDDEN_FILES.contains(&file_name.as_str())
{
return Err(Error::InvalidArchive(format!(
"archive contains forbidden file: {file_name}"
)));
}
if file_name == MANIFEST_FILE_NAME {
let content = fs::read_to_string(entry.path()).await?;
manifest = Some(toml::de::from_str(&content)?);
} else if file_name
.to_lowercase()
.split_once('.')
.filter(|(file, ext)| *file == "readme" && (*ext == "md" || *ext == "txt"))
.is_some()
{
if readme.is_some() {
return Err(Error::InvalidArchive(
"archive contains multiple readme files".into(),
));
}
let mut file = fs::File::open(entry.path()).await?;
let mut gz = async_compression::tokio::write::GzipEncoder::new(vec![]);
tokio::io::copy(&mut file, &mut gz).await?;
gz.shutdown().await?;
readme = Some(gz.into_inner());
}
}
let Some(manifest) = manifest else {
return Err(Error::InvalidArchive(
"archive doesn't contain a manifest".into(),
));
};
add_breadcrumb(sentry::Breadcrumb {
category: Some("publish".into()),
message: Some(format!(
"publish request for {}@{} {}. has readme: {}. docs: {}",
manifest.name,
manifest.version,
manifest.target,
readme.is_some(),
docs_pages.len()
)),
level: sentry::Level::Info,
..Default::default()
});
{
let dependencies = manifest.all_dependencies().map_err(|e| {
Error::InvalidArchive(format!("manifest has invalid dependencies: {e}"))
})?;
for (specifier, _) in dependencies.values() {
match specifier {
DependencySpecifiers::Pesde(specifier) => {
if specifier
.index
.as_deref()
.filter(|index| match gix::Url::try_from(*index) {
Ok(url) => config
.other_registries_allowed
.is_allowed_or_same(source.repo_url().clone(), url),
Err(_) => false,
})
.is_none()
{
return Err(Error::InvalidArchive(format!(
"invalid index in pesde dependency {specifier}"
)));
}
}
DependencySpecifiers::Wally(specifier) => {
if specifier
.index
.as_deref()
.filter(|index| match gix::Url::try_from(*index) {
Ok(url) => config.wally_allowed.is_allowed(url),
Err(_) => false,
})
.is_none()
{
return Err(Error::InvalidArchive(format!(
"invalid index in wally dependency {specifier}"
)));
}
}
DependencySpecifiers::Git(specifier) => {
if !config.git_allowed.is_allowed(specifier.repo.clone()) {
return Err(Error::InvalidArchive(
"git dependencies are not allowed".into(),
));
}
}
DependencySpecifiers::Workspace(_) => {
// workspace specifiers are to be transformed into pesde specifiers by the sender
return Err(Error::InvalidArchive(
"non-transformed workspace dependency".into(),
));
}
}
}
let repo = Repository::open_bare(source.path(&app_state.project))?;
let gix_repo = gix::open(repo.path())?;
let gix_tree = root_tree(&gix_repo)?;
let (scope, name) = manifest.name.as_str();
let mut oids = vec![];
match read_file(&gix_tree, [scope, SCOPE_INFO_FILE])? {
Some(info) => {
let info: ScopeInfo = toml::de::from_str(&info)?;
if !info.owners.contains(&user_id.0) {
return Ok(HttpResponse::Forbidden().finish());
}
}
None => {
let scope_info = toml::to_string(&ScopeInfo {
owners: BTreeSet::from([user_id.0]),
})?;
let mut blob_writer = repo.blob_writer(None)?;
blob_writer.write_all(scope_info.as_bytes())?;
oids.push((SCOPE_INFO_FILE, blob_writer.commit()?));
}
};
let mut file: IndexFile =
toml::de::from_str(&read_file(&gix_tree, [scope, name])?.unwrap_or_default())?;
let new_entry = IndexFileEntry {
target: manifest.target.clone(),
published_at: chrono::Utc::now(),
description: manifest.description.clone(),
license: manifest.license.clone(),
authors: manifest.authors.clone(),
repository: manifest.repository.clone(),
docs,
dependencies,
};
let this_version = file
.entries
.keys()
.find(|v_id| *v_id.version() == manifest.version);
if let Some(this_version) = this_version {
let other_entry = file.entries.get(this_version).unwrap();
// description cannot be different - which one to render in the "Recently published" list?
// the others cannot be different because what to return from the versions endpoint?
if other_entry.description != new_entry.description
|| other_entry.license != new_entry.license
|| other_entry.authors != new_entry.authors
|| other_entry.repository != new_entry.repository
{
return Ok(HttpResponse::BadRequest().json(ErrorResponse {
error: "same version with different description or license already exists"
.to_string(),
}));
}
}
if file
.entries
.insert(
VersionId::new(manifest.version.clone(), manifest.target.kind()),
new_entry.clone(),
)
.is_some()
{
return Ok(HttpResponse::Conflict().finish());
}
let mut remote = repo.find_remote("origin")?;
let refspec = get_refspec(&repo, &mut remote)?;
let reference = repo.find_reference(&refspec)?;
{
let index_content = toml::to_string(&file)?;
let mut blob_writer = repo.blob_writer(None)?;
blob_writer.write_all(index_content.as_bytes())?;
oids.push((name, blob_writer.commit()?));
}
let old_root_tree = reference.peel_to_tree()?;
let old_scope_tree = match old_root_tree.get_name(scope) {
Some(entry) => Some(repo.find_tree(entry.id())?),
None => None,
};
let mut scope_tree = repo.treebuilder(old_scope_tree.as_ref())?;
for (file, oid) in oids {
scope_tree.insert(file, oid, 0o100644)?;
}
let scope_tree_id = scope_tree.write()?;
let mut root_tree = repo.treebuilder(Some(&repo.find_tree(old_root_tree.id())?))?;
root_tree.insert(scope, scope_tree_id, 0o040000)?;
let tree_oid = root_tree.write()?;
repo.commit(
Some("HEAD"),
&signature(),
&signature(),
&format!(
"add {}@{} {}",
manifest.name, manifest.version, manifest.target
),
&repo.find_tree(tree_oid)?,
&[&reference.peel_to_commit()?],
)?;
let mut push_options = git2::PushOptions::new();
let mut remote_callbacks = git2::RemoteCallbacks::new();
let git_creds = app_state.project.auth_config().git_credentials().unwrap();
remote_callbacks.credentials(|_, _, _| {
git2::Cred::userpass_plaintext(&git_creds.username, &git_creds.password)
});
push_options.remote_callbacks(remote_callbacks);
remote.push(&[refspec], Some(&mut push_options))?;
update_version(&app_state, &manifest.name, new_entry);
}
let version_id = VersionId::new(manifest.version.clone(), manifest.target.kind());
let (a, b, c) = join!(
app_state
.storage
.store_package(&manifest.name, &version_id, bytes.to_vec()),
join_all(
docs_pages
.into_iter()
.map(|(hash, content)| app_state.storage.store_doc(hash, content)),
),
async {
if let Some(readme) = readme {
app_state
.storage
.store_readme(&manifest.name, &version_id, readme)
.await
} else {
Ok(())
}
}
);
a?;
b.into_iter().collect::<Result<(), _>>()?;
c?;
Ok(HttpResponse::Ok().body(format!(
"published {}@{} {}",
manifest.name, manifest.version, manifest.target
)))
}
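The doc-page scan in `publish_package` splits each markdown file into YAML front matter and body before hashing. A std-only sketch of that split (hypothetical helper, no serde_yaml): if the first line is `---`, everything up to the next `---` is front matter and the rest is the body.

```rust
// Split markdown into (front_matter, remaining_lines). `take_while`
// consumes the closing "---" itself, so nothing extra is skipped here.
fn split_front_matter(content: &str) -> (String, Vec<&str>) {
    let mut lines = content.lines().peekable();
    let front_matter = if lines.peek() == Some(&"---") {
        lines.next(); // skip the opening `---`
        lines
            .by_ref()
            .take_while(|l| *l != "---")
            .collect::<Vec<_>>()
            .join("\n")
    } else {
        String::new()
    };
    (front_matter, lines.collect())
}

fn main() {
    let (fm, body) = split_front_matter("---\nlabel: Intro\n---\n# Hello");
    assert_eq!(fm, "label: Intro");
    assert_eq!(body, vec!["# Hello"]);
    let (fm, body) = split_front_matter("# Plain");
    assert_eq!(fm, "");
    assert_eq!(body, vec!["# Plain"]);
}
```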

registry/src/endpoints/search.rs

@ -1,81 +1,107 @@
use actix_web::{web, Responder};
use semver::Version;
use std::collections::HashMap;
use actix_web::{web, HttpResponse, Responder};
use serde::Deserialize;
use serde_json::{json, Value};
use tantivy::{query::AllQuery, DateTime, DocAddress, Order};
use tantivy::{collector::Count, query::AllQuery, schema::Value, DateTime, Order};
use pesde::{index::Index, package_name::StandardPackageName};
use crate::{errors, AppState};
use crate::{error::Error, package::PackageResponse, AppState};
use pesde::{
names::PackageName,
source::{
git_index::{read_file, root_tree, GitBasedSource},
pesde::IndexFile,
},
};
#[derive(Deserialize)]
pub struct Query {
pub struct Request {
#[serde(default)]
query: Option<String>,
#[serde(default)]
offset: Option<usize>,
}
pub async fn search_packages(
app_state: web::Data<AppState>,
query: web::Query<Query>,
) -> Result<impl Responder, errors::Errors> {
request: web::Query<Request>,
) -> Result<impl Responder, Error> {
let searcher = app_state.search_reader.searcher();
let schema = searcher.schema();
let name = schema.get_field("name").unwrap();
let version = schema.get_field("version").unwrap();
let description = schema.get_field("description").unwrap();
let id = schema.get_field("id").unwrap();
let query = query.query.as_deref().unwrap_or_default().trim();
let query = request.query.as_deref().unwrap_or_default().trim();
let query_parser =
tantivy::query::QueryParser::for_index(searcher.index(), vec![name, description]);
let query = if query.is_empty() {
Box::new(AllQuery)
} else {
query_parser.parse_query(query)?
app_state.query_parser.parse_query(query)?
};
let top_docs: Vec<(DateTime, DocAddress)> = searcher
let (count, top_docs) = searcher
.search(
&query,
&tantivy::collector::TopDocs::with_limit(10)
.order_by_fast_field("published_at", Order::Desc),
&(
Count,
tantivy::collector::TopDocs::with_limit(50)
.and_offset(request.offset.unwrap_or_default())
.order_by_fast_field::<DateTime>("published_at", Order::Desc),
),
)
.unwrap();
{
let index = app_state.index.lock().unwrap();
let source = app_state.source.lock().await;
let repo = gix::open(source.path(&app_state.project))?;
let tree = root_tree(&repo)?;
Ok(web::Json(
top_docs
.into_iter()
.map(|(published_at, doc_address)| {
let retrieved_doc = searcher.doc(doc_address).unwrap();
let name: StandardPackageName = retrieved_doc
.get_first(name)
.and_then(|v| v.as_text())
.and_then(|v| v.parse().ok())
.unwrap();
let top_docs = top_docs
.into_iter()
.map(|(_, doc_address)| {
let doc = searcher.doc::<HashMap<_, _>>(doc_address).unwrap();
let version: Version = retrieved_doc
.get_first(version)
.and_then(|v| v.as_text())
.and_then(|v| v.parse().ok())
.unwrap();
let id = doc
.get(&id)
.unwrap()
.as_str()
.unwrap()
.parse::<PackageName>()
.unwrap();
let (scope, name) = id.as_str();
let entry = index
.package(&name.clone().into())
.unwrap()
.and_then(|v| v.into_iter().find(|v| v.version == version))
.unwrap();
let file: IndexFile =
toml::de::from_str(&read_file(&tree, [scope, name]).unwrap().unwrap()).unwrap();
json!({
"name": name,
"version": version,
"description": entry.description,
"published_at": published_at.into_timestamp_secs(),
})
})
.collect::<Vec<Value>>(),
))
}
let (latest_version, entry) = file
.entries
.iter()
.max_by_key(|(v_id, _)| v_id.version())
.unwrap();
PackageResponse {
name: id.to_string(),
version: latest_version.version().to_string(),
targets: file
.entries
.iter()
.filter(|(v_id, _)| v_id.version() == latest_version.version())
.map(|(_, entry)| (&entry.target).into())
.collect(),
description: entry.description.clone().unwrap_or_default(),
published_at: file
.entries
.values()
.map(|entry| entry.published_at)
.max()
.unwrap(),
license: entry.license.clone().unwrap_or_default(),
authors: entry.authors.clone(),
repository: entry.repository.clone().map(|url| url.to_string()),
}
})
.collect::<Vec<_>>();
Ok(HttpResponse::Ok().json(serde_json::json!({
"data": top_docs,
"count": count,
})))
}
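The new search endpoint pairs a `Count` collector with an offset `TopDocs` collector, so clients get the total match count alongside one page of results. A std-only sketch of that pagination contract (hypothetical helper over an in-memory slice; the real code pages tantivy hits 50 at a time):

```rust
// Return (total_count, one page) — total is computed over ALL hits,
// while the page applies the caller's offset and a fixed limit.
fn page<T: Clone>(hits: &[T], offset: usize, limit: usize) -> (usize, Vec<T>) {
    let count = hits.len();
    let data = hits.iter().skip(offset).take(limit).cloned().collect();
    (count, data)
}

fn main() {
    let hits: Vec<u32> = (0..120).collect();
    let (count, data) = page(&hits, 100, 50);
    assert_eq!(count, 120); // count ignores paging
    assert_eq!(data.len(), 20); // only 20 hits remain past offset 100
    assert_eq!(data[0], 100);
}
```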

registry/src/error.rs

@ -0,0 +1,89 @@
use actix_web::{body::BoxBody, HttpResponse, ResponseError};
use pesde::source::git_index::errors::{ReadFile, RefreshError, TreeError};
use serde::Serialize;
use thiserror::Error;
#[derive(Debug, Error)]
pub enum Error {
#[error("failed to parse query")]
Query(#[from] tantivy::query::QueryParserError),
#[error("error reading repo file")]
ReadFile(#[from] ReadFile),
#[error("error deserializing file")]
Deserialize(#[from] toml::de::Error),
#[error("failed to send request: {1}\nserver response: {0}")]
ReqwestResponse(String, #[source] reqwest::Error),
#[error("error sending request")]
Reqwest(#[from] reqwest::Error),
#[error("failed to parse archive entries")]
Tar(#[from] std::io::Error),
#[error("invalid archive")]
InvalidArchive(String),
#[error("failed to read index config")]
Config(#[from] pesde::source::pesde::errors::ConfigError),
#[error("git error")]
Git(#[from] git2::Error),
#[error("failed to refresh source")]
Refresh(#[from] Box<RefreshError>),
#[error("failed to serialize struct")]
Serialize(#[from] toml::ser::Error),
#[error("failed to serialize struct")]
SerializeJson(#[from] serde_json::Error),
#[error("failed to open git repo")]
OpenRepo(#[from] gix::open::Error),
#[error("failed to get root tree")]
RootTree(#[from] TreeError),
}
#[derive(Debug, Serialize)]
pub struct ErrorResponse {
pub error: String,
}
impl ResponseError for Error {
fn error_response(&self) -> HttpResponse<BoxBody> {
match self {
Error::Query(e) => HttpResponse::BadRequest().json(ErrorResponse {
error: format!("failed to parse query: {e}"),
}),
Error::Tar(_) => HttpResponse::BadRequest().json(ErrorResponse {
error: "corrupt archive".to_string(),
}),
Error::InvalidArchive(e) => HttpResponse::BadRequest().json(ErrorResponse {
error: format!("archive is invalid: {e}"),
}),
e => {
tracing::error!("unhandled error: {e:?}");
HttpResponse::InternalServerError().finish()
}
}
}
}
pub trait ReqwestErrorExt {
async fn into_error(self) -> Result<Self, Error>
where
Self: Sized;
}
impl ReqwestErrorExt for reqwest::Response {
async fn into_error(self) -> Result<Self, Error> {
match self.error_for_status_ref() {
Ok(_) => Ok(self),
Err(e) => Err(Error::ReqwestResponse(self.text().await?, e)),
}
}
}
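The `error_response` impl above sorts variants into "caller's fault" (explicit 4xx with a JSON body) and "everything else" (logged, then an opaque 500). A minimal stdlib-only sketch of that mapping, without the actix-web types — `ApiError` and its variants are hypothetical stand-ins for the real `Error` enum:

```rust
#[derive(Debug)]
enum ApiError {
    Query(String),          // caller's fault: malformed search query
    InvalidArchive(String), // caller's fault: bad upload
    Internal(String),       // everything else: log it, hide the details
}

impl ApiError {
    /// Maps a variant to (HTTP status, body shown to the client).
    fn to_response(&self) -> (u16, String) {
        match self {
            ApiError::Query(e) => (400, format!("failed to parse query: {e}")),
            ApiError::InvalidArchive(e) => (400, format!("archive is invalid: {e}")),
            // internal errors return a bare 500 so nothing sensitive leaks
            ApiError::Internal(_) => (500, String::new()),
        }
    }
}
```

The design choice mirrored here: only errors the client can act on get a descriptive body; server-side failures are logged via `tracing::error!` and answered with an empty `InternalServerError`.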


@@ -1,77 +0,0 @@
use actix_web::{HttpResponse, ResponseError};
use log::error;
use pesde::index::CreatePackageVersionError;
use serde::Serialize;
use thiserror::Error;
#[derive(Serialize)]
pub struct ErrorResponse {
pub error: String,
}
#[derive(Debug, Error)]
pub enum Errors {
#[error("io error")]
Io(#[from] std::io::Error),
#[error("user yaml error")]
UserYaml(serde_yaml::Error),
#[error("reqwest error")]
Reqwest(#[from] reqwest::Error),
#[error("package name invalid")]
PackageName(#[from] pesde::package_name::StandardPackageNameValidationError),
#[error("config error")]
Config(#[from] pesde::index::ConfigError),
#[error("create package version error")]
CreatePackageVersion(#[from] CreatePackageVersionError),
#[error("commit and push error")]
CommitAndPush(#[from] pesde::index::CommitAndPushError),
#[error("index package error")]
IndexPackage(#[from] pesde::index::IndexPackageError),
#[error("error parsing query")]
QueryParser(#[from] tantivy::query::QueryParserError),
}
impl ResponseError for Errors {
fn error_response(&self) -> HttpResponse {
match self {
Errors::UserYaml(_) | Errors::PackageName(_) | Errors::QueryParser(_) => {}
Errors::CreatePackageVersion(err) => match err {
CreatePackageVersionError::MissingScopeOwnership => {
return HttpResponse::Unauthorized().json(ErrorResponse {
error: "You do not have permission to publish this scope".to_string(),
});
}
CreatePackageVersionError::FromManifestIndexFileEntry(err) => {
return HttpResponse::BadRequest().json(ErrorResponse {
error: format!("Error in manifest: {err:?}"),
});
}
_ => error!("{err:?}"),
},
err => {
error!("{err:?}");
}
}
match self {
Errors::UserYaml(err) => HttpResponse::BadRequest().json(ErrorResponse {
error: format!("Error parsing YAML file: {err}"),
}),
Errors::PackageName(err) => HttpResponse::BadRequest().json(ErrorResponse {
error: format!("Invalid package name: {err}"),
}),
Errors::QueryParser(err) => HttpResponse::BadRequest().json(ErrorResponse {
error: format!("Error parsing query: {err}"),
}),
_ => HttpResponse::InternalServerError().finish(),
}
}
}


@@ -1,59 +1,81 @@
use std::{fs::read_dir, sync::Mutex, time::Duration};
use crate::{
auth::{get_auth_from_env, Auth, UserIdExtractor},
search::make_search,
storage::{get_storage_from_env, Storage},
};
use actix_cors::Cors;
use actix_governor::{Governor, GovernorConfigBuilder, KeyExtractor, SimpleKeyExtractionError};
use actix_governor::{Governor, GovernorConfigBuilder};
use actix_web::{
dev::ServiceRequest,
error::ErrorUnauthorized,
middleware::{Compress, Condition, Logger},
middleware::{from_fn, Compress, NormalizePath, TrailingSlash},
rt::System,
web, App, Error, HttpMessage, HttpServer,
web,
web::PayloadConfig,
App, HttpServer,
};
use actix_web_httpauth::{extractors::bearer::BearerAuth, middleware::HttpAuthentication};
use dotenvy::dotenv;
use git2::{Cred, Signature};
use log::info;
use reqwest::{header::AUTHORIZATION, Client};
use rusty_s3::{Bucket, Credentials, UrlStyle};
use tantivy::{doc, DateTime, IndexReader, IndexWriter};
use fs_err::tokio as fs;
use pesde::{
index::{GitIndex, Index, IndexFile},
package_name::StandardPackageName,
source::{pesde::PesdePackageSource, traits::PackageSource},
AuthConfig, Project,
};
use std::{env::current_dir, path::PathBuf};
use tracing::level_filters::LevelFilter;
use tracing_subscriber::{
fmt::format::FmtSpan, layer::SubscriberExt, util::SubscriberInitExt, EnvFilter,
};
mod auth;
mod endpoints;
mod errors;
mod error;
mod package;
mod search;
mod storage;
const S3_EXPIRY: Duration = Duration::from_secs(60 * 60);
struct AppState {
s3_bucket: Bucket,
s3_credentials: Credentials,
reqwest_client: Client,
index: Mutex<GitIndex>,
search_reader: IndexReader,
search_writer: Mutex<IndexWriter>,
pub fn make_reqwest() -> reqwest::Client {
reqwest::ClientBuilder::new()
.user_agent(concat!(
env!("CARGO_PKG_NAME"),
"/",
env!("CARGO_PKG_VERSION")
))
.build()
.unwrap()
}
macro_rules! get_env {
($name:expr, "p") => {
pub struct AppState {
pub source: tokio::sync::Mutex<PesdePackageSource>,
pub project: Project,
pub storage: Storage,
pub auth: Auth,
pub search_reader: tantivy::IndexReader,
pub search_writer: std::sync::Mutex<tantivy::IndexWriter>,
pub query_parser: tantivy::query::QueryParser,
}
#[macro_export]
macro_rules! benv {
($name:expr) => {
std::env::var($name)
.expect(concat!("Environment variable `", $name, "` must be set"))
.parse()
.expect(concat!(
};
($name:expr => $default:expr) => {
benv!($name).unwrap_or($default.to_string())
};
(required $name:expr) => {
benv!($name).expect(concat!("Environment variable `", $name, "` must be set"))
};
(parse $name:expr) => {
benv!($name)
.map(|v| v.parse().expect(concat!(
"Environment variable `",
$name,
"` must be a valid value"
))
)))
};
($name:expr) => {
std::env::var($name).expect(concat!("Environment variable `", $name, "` must be set"))
(parse required $name:expr) => {
benv!(parse $name).expect(concat!("Environment variable `", $name, "` must be set"))
};
($name:expr, $default:expr, "p") => {
std::env::var($name)
.unwrap_or($default.to_string())
(parse $name:expr => $default:expr) => {
benv!($name => $default)
.parse()
.expect(concat!(
"Environment variable `",
@ -61,256 +83,160 @@ macro_rules! get_env {
"` must a valid value"
))
};
($name:expr, $default:expr) => {
std::env::var($name).unwrap_or($default.to_string())
};
}
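The `benv!` macro above layers `required`, `parse`, and `=> default` variants on top of plain `std::env::var`. A self-contained sketch of the same pattern (a simplified `env_or!`, not the exact macro) under the assumption that defaults are string literals:

```rust
// Read an env var as a Result, optionally fall back to a default,
// optionally parse the value into another type.
macro_rules! env_or {
    ($name:expr) => {
        std::env::var($name)
    };
    ($name:expr => $default:expr) => {
        std::env::var($name).unwrap_or_else(|_| $default.to_string())
    };
    (parse $name:expr => $default:expr) => {
        env_or!($name => $default)
            .parse()
            .expect(concat!("Environment variable `", $name, "` must be a valid value"))
    };
}
```

As in the original, the `parse` arm panics at startup on an unparseable value rather than limping along with a bad configuration.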
pub fn commit_signature<'a>() -> Signature<'a> {
Signature::now(
&get_env!("COMMITTER_GIT_NAME"),
&get_env!("COMMITTER_GIT_EMAIL"),
)
.unwrap()
}
async fn run() -> std::io::Result<()> {
let address = benv!("ADDRESS" => "127.0.0.1");
let port: u16 = benv!(parse "PORT" => "8080");
#[derive(Debug, Clone, Copy, Hash, Eq, PartialEq)]
pub struct UserId(pub u64);
let cwd = current_dir().unwrap();
let data_dir =
PathBuf::from(benv!("DATA_DIR" => "{CWD}/data").replace("{CWD}", cwd.to_str().unwrap()));
fs::create_dir_all(&data_dir).await.unwrap();
async fn validator(
req: ServiceRequest,
credentials: BearerAuth,
) -> Result<ServiceRequest, (Error, ServiceRequest)> {
let token = credentials.token();
let app_state = req.app_data::<web::Data<AppState>>().unwrap();
let Ok(user_info) = app_state
.reqwest_client
.get("https://api.github.com/user")
.header(AUTHORIZATION, format!("Bearer {}", token))
.send()
.await
.map(|r| r.json::<serde_json::Value>())
else {
return Err((ErrorUnauthorized("Failed to fetch user info"), req));
};
let Ok(user_info) = user_info.await else {
return Err((ErrorUnauthorized("Failed to parse user info"), req));
};
let Some(id) = user_info["id"].as_u64() else {
return Err((ErrorUnauthorized("Failed to fetch user info"), req));
};
req.extensions_mut().insert(UserId(id));
Ok(req)
}
#[derive(Debug, Clone)]
struct UserIdKey;
impl KeyExtractor for UserIdKey {
type Key = UserId;
type KeyExtractionError = SimpleKeyExtractionError<&'static str>;
fn extract(&self, req: &ServiceRequest) -> Result<Self::Key, Self::KeyExtractionError> {
Ok(*req.extensions().get::<UserId>().unwrap())
}
}
fn search_index(index: &GitIndex) -> (IndexReader, IndexWriter) {
let mut schema_builder = tantivy::schema::SchemaBuilder::new();
let name =
schema_builder.add_text_field("name", tantivy::schema::TEXT | tantivy::schema::STORED);
let version =
schema_builder.add_text_field("version", tantivy::schema::TEXT | tantivy::schema::STORED);
let description = schema_builder.add_text_field("description", tantivy::schema::TEXT);
let published_at = schema_builder.add_date_field("published_at", tantivy::schema::FAST);
let search_index = tantivy::Index::create_in_ram(schema_builder.build());
let search_reader = search_index
.reader_builder()
.reload_policy(tantivy::ReloadPolicy::OnCommit)
.try_into()
.unwrap();
let mut search_writer = search_index.writer(50_000_000).unwrap();
for entry in read_dir(index.path()).unwrap() {
let entry = entry.unwrap();
let path = entry.path();
if !path.is_dir() || path.file_name().is_some_and(|v| v == ".git") {
continue;
}
let scope = path.file_name().and_then(|v| v.to_str()).unwrap();
for entry in read_dir(&path).unwrap() {
let entry = entry.unwrap();
let path = entry.path();
if !path.is_file() || path.extension().is_some() {
continue;
}
let package = path.file_name().and_then(|v| v.to_str()).unwrap();
let package_name = StandardPackageName::new(scope, package).unwrap();
let entries: IndexFile =
serde_yaml::from_slice(&std::fs::read(&path).unwrap()).unwrap();
let entry = entries.last().unwrap().clone();
search_writer
.add_document(doc!(
name => package_name.to_string(),
version => entry.version.to_string(),
description => entry.description.unwrap_or_default(),
published_at => DateTime::from_timestamp_secs(entry.published_at.timestamp()),
))
.unwrap();
}
}
search_writer.commit().unwrap();
(search_reader, search_writer)
}
fn main() -> std::io::Result<()> {
dotenv().ok();
let sentry_url = std::env::var("SENTRY_URL").ok();
let with_sentry = sentry_url.is_some();
let mut log_builder = pretty_env_logger::formatted_builder();
log_builder.parse_env(pretty_env_logger::env_logger::Env::default().default_filter_or("info"));
if with_sentry {
let logger = sentry_log::SentryLogger::with_dest(log_builder.build());
log::set_boxed_logger(Box::new(logger)).unwrap();
log::set_max_level(log::LevelFilter::Info);
} else {
log_builder.try_init().unwrap();
}
let _guard = if let Some(sentry_url) = sentry_url {
std::env::set_var("RUST_BACKTRACE", "1");
Some(sentry::init((
sentry_url,
sentry::ClientOptions {
release: sentry::release_name!(),
..Default::default()
},
)))
} else {
None
};
let address = get_env!("ADDRESS", "127.0.0.1");
let port: u16 = get_env!("PORT", "8080", "p");
let current_dir = std::env::current_dir().unwrap();
let index = GitIndex::new(
current_dir.join("cache"),
&get_env!("INDEX_REPO_URL", "p"),
Some(Box::new(|| {
Box::new(|_, _, _| {
let username = get_env!("GITHUB_USERNAME");
let pat = get_env!("GITHUB_PAT");
Cred::userpass_plaintext(&username, &pat)
})
let project = Project::new(
&cwd,
None::<PathBuf>,
data_dir.join("project"),
&cwd,
AuthConfig::new().with_git_credentials(Some(gix::sec::identity::Account {
username: benv!(required "GIT_USERNAME"),
password: benv!(required "GIT_PASSWORD"),
})),
None,
);
index.refresh().expect("failed to refresh index");
let source = PesdePackageSource::new(benv!(required "INDEX_REPO_URL").try_into().unwrap());
source
.refresh(&project)
.await
.expect("failed to refresh source");
let config = source
.config(&project)
.await
.expect("failed to get index config");
let (search_reader, search_writer) = search_index(&index);
let (search_reader, search_writer, query_parser) = make_search(&project, &source).await;
let app_data = web::Data::new(AppState {
s3_bucket: Bucket::new(
get_env!("S3_ENDPOINT", "p"),
UrlStyle::Path,
get_env!("S3_BUCKET_NAME"),
get_env!("S3_REGION"),
)
.unwrap(),
s3_credentials: Credentials::new(get_env!("S3_ACCESS_KEY"), get_env!("S3_SECRET_KEY")),
reqwest_client: Client::builder()
.user_agent(concat!(
env!("CARGO_PKG_NAME"),
"/",
env!("CARGO_PKG_VERSION")
))
.build()
.unwrap(),
index: Mutex::new(index),
storage: {
let storage = get_storage_from_env();
tracing::info!("storage: {storage}");
storage
},
auth: {
let auth = get_auth_from_env(&config);
tracing::info!("auth: {auth}");
auth
},
source: tokio::sync::Mutex::new(source),
project,
search_reader,
search_writer: Mutex::new(search_writer),
search_writer: std::sync::Mutex::new(search_writer),
query_parser,
});
let upload_governor_config = GovernorConfigBuilder::default()
.burst_size(10)
.per_second(600)
.key_extractor(UserIdKey)
let publish_governor_config = GovernorConfigBuilder::default()
.key_extractor(UserIdExtractor)
.burst_size(12)
.seconds_per_request(60)
.use_headers()
.finish()
.unwrap();
let generic_governor_config = GovernorConfigBuilder::default()
.burst_size(50)
.per_second(10)
.use_headers()
.finish()
.unwrap();
info!("listening on {address}:{port}");
System::new().block_on(async move {
HttpServer::new(move || {
App::new()
.wrap(Condition::new(with_sentry, sentry_actix::Sentry::new()))
.wrap(Logger::default())
.wrap(Cors::permissive())
.wrap(Compress::default())
.app_data(app_data.clone())
.route("/", web::get().to(|| async { env!("CARGO_PKG_VERSION") }))
.service(
web::scope("/v0")
.route(
"/search",
web::get()
.to(endpoints::search::search_packages)
.wrap(Governor::new(&generic_governor_config)),
)
.route(
"/packages/{scope}/{name}/versions",
web::get()
.to(endpoints::packages::get_package_versions)
.wrap(Governor::new(&generic_governor_config)),
)
.route(
"/packages/{scope}/{name}/{version}",
web::get()
.to(endpoints::packages::get_package_version)
.wrap(Governor::new(&generic_governor_config)),
)
.route(
"/packages",
web::post()
.to(endpoints::packages::create_package)
.wrap(Governor::new(&upload_governor_config))
.wrap(HttpAuthentication::bearer(validator)),
),
)
})
.bind((address, port))?
.run()
.await
HttpServer::new(move || {
App::new()
.wrap(sentry_actix::Sentry::with_transaction())
.wrap(NormalizePath::new(TrailingSlash::Trim))
.wrap(Cors::permissive())
.wrap(tracing_actix_web::TracingLogger::default())
.wrap(Compress::default())
.app_data(app_data.clone())
.route(
"/",
web::get().to(|| async {
concat!(env!("CARGO_PKG_NAME"), "/", env!("CARGO_PKG_VERSION"))
}),
)
.service(
web::scope("/v0")
.route(
"/search",
web::get()
.to(endpoints::search::search_packages)
.wrap(from_fn(auth::read_mw)),
)
.route(
"/packages/{name}",
web::get()
.to(endpoints::package_versions::get_package_versions)
.wrap(from_fn(auth::read_mw)),
)
.route(
"/packages/{name}/{version}/{target}",
web::get()
.to(endpoints::package_version::get_package_version)
.wrap(from_fn(auth::read_mw)),
)
.service(
web::scope("/packages")
.app_data(PayloadConfig::new(config.max_archive_size))
.route(
"",
web::post()
.to(endpoints::publish_version::publish_package)
.wrap(Governor::new(&publish_governor_config))
.wrap(from_fn(auth::write_mw)),
),
),
)
})
.bind((address, port))?
.run()
.await
}
// can't use #[actix_web::main] because of Sentry:
// "Note: Macros like #[tokio::main] and #[actix_web::main] are not supported. The Sentry client must be initialized before the async runtime is started so that all threads are correctly connected to the Hub."
// https://docs.sentry.io/platforms/rust/guides/actix-web/
fn main() -> std::io::Result<()> {
let _ = dotenvy::dotenv();
let tracing_env_filter = EnvFilter::builder()
.with_default_directive(LevelFilter::INFO.into())
.from_env_lossy()
.add_directive("reqwest=info".parse().unwrap())
.add_directive("rustls=info".parse().unwrap())
.add_directive("tokio_util=info".parse().unwrap())
.add_directive("goblin=info".parse().unwrap())
.add_directive("tower=info".parse().unwrap())
.add_directive("hyper=info".parse().unwrap())
.add_directive("h2=info".parse().unwrap());
tracing_subscriber::registry()
.with(tracing_env_filter)
.with(
tracing_subscriber::fmt::layer()
.compact()
.with_span_events(FmtSpan::NEW | FmtSpan::CLOSE),
)
.with(sentry::integrations::tracing::layer())
.init();
let guard = sentry::init(sentry::ClientOptions {
release: sentry::release_name!(),
dsn: benv!(parse "SENTRY_DSN").ok(),
session_mode: sentry::SessionMode::Request,
traces_sample_rate: 1.0,
debug: true,
..Default::default()
});
if guard.is_enabled() {
std::env::set_var("RUST_BACKTRACE", "full");
tracing::info!("sentry initialized");
} else {
tracing::info!("sentry **NOT** initialized");
}
System::new().block_on(run())
}

registry/src/package.rs Normal file

@@ -0,0 +1,61 @@
use chrono::{DateTime, Utc};
use pesde::manifest::target::{Target, TargetKind};
use serde::Serialize;
use std::collections::BTreeSet;
#[derive(Debug, Serialize, Eq, PartialEq)]
pub struct TargetInfo {
kind: TargetKind,
lib: bool,
bin: bool,
#[serde(skip_serializing_if = "BTreeSet::is_empty")]
scripts: BTreeSet<String>,
}
impl From<Target> for TargetInfo {
fn from(target: Target) -> Self {
(&target).into()
}
}
impl From<&Target> for TargetInfo {
fn from(target: &Target) -> Self {
TargetInfo {
kind: target.kind(),
lib: target.lib_path().is_some(),
bin: target.bin_path().is_some(),
scripts: target
.scripts()
.map(|scripts| scripts.keys().cloned().collect())
.unwrap_or_default(),
}
}
}
impl Ord for TargetInfo {
fn cmp(&self, other: &Self) -> std::cmp::Ordering {
self.kind.cmp(&other.kind)
}
}
impl PartialOrd for TargetInfo {
fn partial_cmp(&self, other: &Self) -> Option<std::cmp::Ordering> {
Some(self.cmp(other))
}
}
#[derive(Debug, Serialize)]
pub struct PackageResponse {
pub name: String,
pub version: String,
pub targets: BTreeSet<TargetInfo>,
#[serde(skip_serializing_if = "String::is_empty")]
pub description: String,
pub published_at: DateTime<Utc>,
#[serde(skip_serializing_if = "String::is_empty")]
pub license: String,
#[serde(skip_serializing_if = "Vec::is_empty")]
pub authors: Vec<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub repository: Option<String>,
}

registry/src/search.rs Normal file

@@ -0,0 +1,148 @@
use crate::AppState;
use async_stream::stream;
use futures::{Stream, StreamExt};
use pesde::{
names::PackageName,
source::{
git_index::{root_tree, GitBasedSource},
pesde::{IndexFile, IndexFileEntry, PesdePackageSource, SCOPE_INFO_FILE},
},
Project,
};
use tantivy::{
doc,
query::QueryParser,
schema::{IndexRecordOption, TextFieldIndexing, TextOptions, FAST, STORED, STRING},
tokenizer::TextAnalyzer,
DateTime, IndexReader, IndexWriter, Term,
};
use tokio::pin;
pub async fn all_packages(
source: &PesdePackageSource,
project: &Project,
) -> impl Stream<Item = (PackageName, IndexFile)> {
let path = source.path(project);
stream! {
let repo = gix::open(&path).expect("failed to open index");
let tree = root_tree(&repo).expect("failed to get root tree");
for entry in tree.iter() {
let entry = entry.expect("failed to read entry");
let object = entry.object().expect("failed to get object");
// directories will be trees, and files will be blobs
if !matches!(object.kind, gix::object::Kind::Tree) {
continue;
}
let package_scope = entry.filename().to_string();
for inner_entry in object.into_tree().iter() {
let inner_entry = inner_entry.expect("failed to read inner entry");
let object = inner_entry.object().expect("failed to get object");
if !matches!(object.kind, gix::object::Kind::Blob) {
continue;
}
let package_name = inner_entry.filename().to_string();
if package_name == SCOPE_INFO_FILE {
continue;
}
let blob = object.into_blob();
let string = String::from_utf8(blob.data.clone()).expect("failed to parse utf8");
let file: IndexFile = toml::from_str(&string).expect("failed to parse index file");
// if this panics, it's an issue with the index.
let name = format!("{package_scope}/{package_name}").parse().unwrap();
yield (name, file);
}
}
}
}
pub async fn make_search(
project: &Project,
source: &PesdePackageSource,
) -> (IndexReader, IndexWriter, QueryParser) {
let mut schema_builder = tantivy::schema::SchemaBuilder::new();
let field_options = TextOptions::default().set_indexing_options(
TextFieldIndexing::default()
.set_tokenizer("ngram")
.set_index_option(IndexRecordOption::WithFreqsAndPositions),
);
let id_field = schema_builder.add_text_field("id", STRING | STORED);
let scope = schema_builder.add_text_field("scope", field_options.clone());
let name = schema_builder.add_text_field("name", field_options.clone());
let description = schema_builder.add_text_field("description", field_options);
let published_at = schema_builder.add_date_field("published_at", FAST);
let search_index = tantivy::Index::create_in_ram(schema_builder.build());
search_index.tokenizers().register(
"ngram",
TextAnalyzer::builder(tantivy::tokenizer::NgramTokenizer::all_ngrams(1, 12).unwrap())
.filter(tantivy::tokenizer::LowerCaser)
.build(),
);
let search_reader = search_index
.reader_builder()
.reload_policy(tantivy::ReloadPolicy::Manual)
.try_into()
.unwrap();
let mut search_writer = search_index.writer(50_000_000).unwrap();
let stream = all_packages(source, project).await;
pin!(stream);
while let Some((pkg_name, mut file)) = stream.next().await {
let Some((_, latest_entry)) = file.entries.pop_last() else {
tracing::error!("no versions found for {pkg_name}");
continue;
};
search_writer.add_document(doc!(
id_field => pkg_name.to_string(),
scope => pkg_name.as_str().0,
name => pkg_name.as_str().1,
description => latest_entry.description.unwrap_or_default(),
published_at => DateTime::from_timestamp_secs(latest_entry.published_at.timestamp()),
)).unwrap();
}
search_writer.commit().unwrap();
search_reader.reload().unwrap();
let mut query_parser = QueryParser::for_index(&search_index, vec![scope, name, description]);
query_parser.set_field_boost(scope, 2.0);
query_parser.set_field_boost(name, 3.5);
(search_reader, search_writer, query_parser)
}
pub fn update_version(app_state: &AppState, name: &PackageName, entry: IndexFileEntry) {
let mut search_writer = app_state.search_writer.lock().unwrap();
let schema = search_writer.index().schema();
let id_field = schema.get_field("id").unwrap();
search_writer.delete_term(Term::from_field_text(id_field, &name.to_string()));
search_writer.add_document(doc!(
id_field => name.to_string(),
schema.get_field("scope").unwrap() => name.as_str().0,
schema.get_field("name").unwrap() => name.as_str().1,
schema.get_field("description").unwrap() => entry.description.unwrap_or_default(),
schema.get_field("published_at").unwrap() => DateTime::from_timestamp_secs(entry.published_at.timestamp())
)).unwrap();
search_writer.commit().unwrap();
app_state.search_reader.reload().unwrap();
}
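`update_version` above is a delete-then-re-add upsert: tantivy's `add_document` only appends, so without the preceding `delete_term` a package would accumulate one search document per publish. A stdlib-only sketch of that invariant, modeling the append-only index as a `Vec`:

```rust
// `Doc` is a hypothetical stand-in for a search document keyed by
// package id.
#[derive(Debug, PartialEq, Clone)]
struct Doc { id: String, version: String }

fn upsert(index: &mut Vec<Doc>, doc: Doc) {
    index.retain(|d| d.id != doc.id); // delete_term: drop old docs for this id
    index.push(doc);                  // add_document: append the fresh one
}
```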

registry/src/storage/fs.rs Normal file

@@ -0,0 +1,126 @@
use crate::{error::Error, storage::StorageImpl};
use actix_web::{
http::header::{CONTENT_ENCODING, CONTENT_TYPE},
HttpResponse,
};
use fs_err::tokio as fs;
use pesde::{names::PackageName, source::version_id::VersionId};
use std::{
fmt::Display,
path::{Path, PathBuf},
};
#[derive(Debug)]
pub struct FSStorage {
pub root: PathBuf,
}
async fn read_file_to_response(path: &Path, content_type: &str) -> Result<HttpResponse, Error> {
Ok(match fs::read(path).await {
Ok(contents) => HttpResponse::Ok()
.append_header((CONTENT_TYPE, content_type))
.append_header((CONTENT_ENCODING, "gzip"))
.body(contents),
Err(e) if e.kind() == std::io::ErrorKind::NotFound => HttpResponse::NotFound().finish(),
Err(e) => return Err(e.into()),
})
}
impl StorageImpl for FSStorage {
async fn store_package(
&self,
package_name: &PackageName,
version: &VersionId,
contents: Vec<u8>,
) -> Result<(), Error> {
let (scope, name) = package_name.as_str();
let path = self
.root
.join(scope)
.join(name)
.join(version.version().to_string())
.join(version.target().to_string());
fs::create_dir_all(&path).await?;
fs::write(path.join("pkg.tar.gz"), &contents).await?;
Ok(())
}
async fn get_package(
&self,
package_name: &PackageName,
version: &VersionId,
) -> Result<HttpResponse, Error> {
let (scope, name) = package_name.as_str();
let path = self
.root
.join(scope)
.join(name)
.join(version.version().to_string())
.join(version.target().to_string());
read_file_to_response(&path.join("pkg.tar.gz"), "application/gzip").await
}
async fn store_readme(
&self,
package_name: &PackageName,
version: &VersionId,
contents: Vec<u8>,
) -> Result<(), Error> {
let (scope, name) = package_name.as_str();
let path = self
.root
.join(scope)
.join(name)
.join(version.version().to_string())
.join(version.target().to_string());
fs::create_dir_all(&path).await?;
fs::write(path.join("readme.gz"), &contents).await?;
Ok(())
}
async fn get_readme(
&self,
package_name: &PackageName,
version: &VersionId,
) -> Result<HttpResponse, Error> {
let (scope, name) = package_name.as_str();
let path = self
.root
.join(scope)
.join(name)
.join(version.version().to_string())
.join(version.target().to_string());
read_file_to_response(&path.join("readme.gz"), "text/plain").await
}
async fn store_doc(&self, doc_hash: String, contents: Vec<u8>) -> Result<(), Error> {
let path = self.root.join("Doc");
fs::create_dir_all(&path).await?;
fs::write(path.join(format!("{doc_hash}.gz")), &contents).await?;
Ok(())
}
async fn get_doc(&self, doc_hash: &str) -> Result<HttpResponse, Error> {
let path = self.root.join("Doc");
read_file_to_response(&path.join(format!("{doc_hash}.gz")), "text/plain").await
}
}
impl Display for FSStorage {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "FS")
}
}
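Every `FSStorage` method above rebuilds the same nested directory layout: `<root>/<scope>/<name>/<version>/<target>/`. A sketch of that path construction with plain strings (the real code takes `PackageName` and `VersionId`; the argument names here are illustrative):

```rust
use std::path::{Path, PathBuf};

/// Builds the on-disk location of a package archive under the FS backend.
fn package_path(root: &Path, scope: &str, name: &str, version: &str, target: &str) -> PathBuf {
    root.join(scope)
        .join(name)
        .join(version)
        .join(target)
        .join("pkg.tar.gz")
}
```

Splitting version and target into separate directories is what lets the same version of a package be stored once per build target.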

registry/src/storage/mod.rs Normal file

@@ -0,0 +1,141 @@
use crate::{benv, error::Error, make_reqwest};
use actix_web::HttpResponse;
use pesde::{names::PackageName, source::version_id::VersionId};
use rusty_s3::{Bucket, Credentials, UrlStyle};
use std::fmt::Display;
mod fs;
mod s3;
#[derive(Debug)]
pub enum Storage {
S3(s3::S3Storage),
FS(fs::FSStorage),
}
pub trait StorageImpl: Display {
async fn store_package(
&self,
package_name: &PackageName,
version: &VersionId,
contents: Vec<u8>,
) -> Result<(), crate::error::Error>;
async fn get_package(
&self,
package_name: &PackageName,
version: &VersionId,
) -> Result<HttpResponse, crate::error::Error>;
async fn store_readme(
&self,
package_name: &PackageName,
version: &VersionId,
contents: Vec<u8>,
) -> Result<(), crate::error::Error>;
async fn get_readme(
&self,
package_name: &PackageName,
version: &VersionId,
) -> Result<HttpResponse, crate::error::Error>;
async fn store_doc(
&self,
doc_hash: String,
contents: Vec<u8>,
) -> Result<(), crate::error::Error>;
async fn get_doc(&self, doc_hash: &str) -> Result<HttpResponse, crate::error::Error>;
}
impl StorageImpl for Storage {
async fn store_package(
&self,
package_name: &PackageName,
version: &VersionId,
contents: Vec<u8>,
) -> Result<(), Error> {
match self {
Storage::S3(s3) => s3.store_package(package_name, version, contents).await,
Storage::FS(fs) => fs.store_package(package_name, version, contents).await,
}
}
async fn get_package(
&self,
package_name: &PackageName,
version: &VersionId,
) -> Result<HttpResponse, Error> {
match self {
Storage::S3(s3) => s3.get_package(package_name, version).await,
Storage::FS(fs) => fs.get_package(package_name, version).await,
}
}
async fn store_readme(
&self,
package_name: &PackageName,
version: &VersionId,
contents: Vec<u8>,
) -> Result<(), Error> {
match self {
Storage::S3(s3) => s3.store_readme(package_name, version, contents).await,
Storage::FS(fs) => fs.store_readme(package_name, version, contents).await,
}
}
async fn get_readme(
&self,
package_name: &PackageName,
version: &VersionId,
) -> Result<HttpResponse, Error> {
match self {
Storage::S3(s3) => s3.get_readme(package_name, version).await,
Storage::FS(fs) => fs.get_readme(package_name, version).await,
}
}
async fn store_doc(&self, doc_hash: String, contents: Vec<u8>) -> Result<(), Error> {
match self {
Storage::S3(s3) => s3.store_doc(doc_hash, contents).await,
Storage::FS(fs) => fs.store_doc(doc_hash, contents).await,
}
}
async fn get_doc(&self, doc_hash: &str) -> Result<HttpResponse, Error> {
match self {
Storage::S3(s3) => s3.get_doc(doc_hash).await,
Storage::FS(fs) => fs.get_doc(doc_hash).await,
}
}
}
impl Display for Storage {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
Storage::S3(s3) => write!(f, "{}", s3),
Storage::FS(fs) => write!(f, "{}", fs),
}
}
}
pub fn get_storage_from_env() -> Storage {
if let Ok(endpoint) = benv!(parse "S3_ENDPOINT") {
Storage::S3(s3::S3Storage {
s3_bucket: Bucket::new(
endpoint,
UrlStyle::Path,
benv!(required "S3_BUCKET_NAME"),
benv!(required "S3_REGION"),
)
.unwrap(),
s3_credentials: Credentials::new(
benv!(required "S3_ACCESS_KEY"),
benv!(required "S3_SECRET_KEY"),
),
reqwest_client: make_reqwest(),
})
} else if let Ok(root) = benv!(parse "FS_STORAGE_ROOT") {
Storage::FS(fs::FSStorage { root })
} else {
panic!("no storage backend configured")
}
}
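`get_storage_from_env` above picks the backend by fallthrough: S3 when `S3_ENDPOINT` is set, the filesystem when `FS_STORAGE_ROOT` is, otherwise a startup panic. A sketch of that selection logic with the environment lookup injected as a closure (`Backend` and `pick_backend` are illustrative names, not the real API):

```rust
#[derive(Debug, PartialEq)]
enum Backend { S3(String), Fs(String) }

/// First matching env var wins; no backend configured is a hard error.
fn pick_backend(get: impl Fn(&str) -> Option<String>) -> Backend {
    if let Some(endpoint) = get("S3_ENDPOINT") {
        Backend::S3(endpoint)
    } else if let Some(root) = get("FS_STORAGE_ROOT") {
        Backend::Fs(root)
    } else {
        panic!("no storage backend configured")
    }
}
```

Panicking here rather than defaulting silently matches the original's stance: a registry with no storage should refuse to boot.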

registry/src/storage/s3.rs Normal file

@@ -0,0 +1,166 @@
use crate::{
error::{Error, ReqwestErrorExt},
storage::StorageImpl,
};
use actix_web::{http::header::LOCATION, HttpResponse};
use pesde::{names::PackageName, source::version_id::VersionId};
use reqwest::header::{CONTENT_ENCODING, CONTENT_TYPE};
use rusty_s3::{
actions::{GetObject, PutObject},
Bucket, Credentials, S3Action,
};
use std::{fmt::Display, time::Duration};
#[derive(Debug)]
pub struct S3Storage {
pub s3_bucket: Bucket,
pub s3_credentials: Credentials,
pub reqwest_client: reqwest::Client,
}
pub const S3_SIGN_DURATION: Duration = Duration::from_secs(60 * 15);
impl StorageImpl for S3Storage {
async fn store_package(
&self,
package_name: &PackageName,
version: &VersionId,
contents: Vec<u8>,
) -> Result<(), Error> {
let object_url = PutObject::new(
&self.s3_bucket,
Some(&self.s3_credentials),
&format!(
"{package_name}/{}/{}/pkg.tar.gz",
version.version(),
version.target()
),
)
.sign(S3_SIGN_DURATION);
self.reqwest_client
.put(object_url)
.header(CONTENT_TYPE, "application/gzip")
.header(CONTENT_ENCODING, "gzip")
.body(contents)
.send()
.await?
.into_error()
.await?;
Ok(())
}
async fn get_package(
&self,
package_name: &PackageName,
version: &VersionId,
) -> Result<HttpResponse, Error> {
let object_url = GetObject::new(
&self.s3_bucket,
Some(&self.s3_credentials),
&format!(
"{package_name}/{}/{}/pkg.tar.gz",
version.version(),
version.target()
),
)
.sign(S3_SIGN_DURATION);
Ok(HttpResponse::TemporaryRedirect()
.append_header((LOCATION, object_url.as_str()))
.finish())
}
async fn store_readme(
&self,
package_name: &PackageName,
version: &VersionId,
contents: Vec<u8>,
) -> Result<(), Error> {
let object_url = PutObject::new(
&self.s3_bucket,
Some(&self.s3_credentials),
&format!(
"{package_name}/{}/{}/readme.gz",
version.version(),
version.target()
),
)
.sign(S3_SIGN_DURATION);
self.reqwest_client
.put(object_url)
.header(CONTENT_TYPE, "text/plain")
.header(CONTENT_ENCODING, "gzip")
.body(contents)
.send()
.await?
.into_error()
.await?;
Ok(())
}
async fn get_readme(
&self,
package_name: &PackageName,
version: &VersionId,
) -> Result<HttpResponse, Error> {
let object_url = GetObject::new(
&self.s3_bucket,
Some(&self.s3_credentials),
&format!(
"{package_name}/{}/{}/readme.gz",
version.version(),
version.target()
),
)
.sign(S3_SIGN_DURATION);
Ok(HttpResponse::TemporaryRedirect()
.append_header((LOCATION, object_url.as_str()))
.finish())
}
async fn store_doc(&self, doc_hash: String, contents: Vec<u8>) -> Result<(), Error> {
let object_url = PutObject::new(
&self.s3_bucket,
Some(&self.s3_credentials),
// capitalize Doc to prevent conflicts with scope names
&format!("Doc/{}.gz", doc_hash),
)
.sign(S3_SIGN_DURATION);
self.reqwest_client
.put(object_url)
.header(CONTENT_TYPE, "text/plain")
.header(CONTENT_ENCODING, "gzip")
.body(contents)
.send()
.await?
.into_error()
.await?;
Ok(())
}
async fn get_doc(&self, doc_hash: &str) -> Result<HttpResponse, Error> {
let object_url = GetObject::new(
&self.s3_bucket,
Some(&self.s3_credentials),
&format!("Doc/{}.gz", doc_hash),
)
.sign(S3_SIGN_DURATION);
Ok(HttpResponse::TemporaryRedirect()
.append_header((LOCATION, object_url.as_str()))
.finish())
}
}
impl Display for S3Storage {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "S3")
}
}
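The object keys signed above follow a fixed scheme: package artifacts under `<name>/<version>/<target>/pkg.tar.gz`, doc blobs under `Doc/<hash>.gz` — capitalized, as the source comment notes, so the prefix cannot collide with a scope directory. A sketch of just the key construction (the helper names are illustrative; the real code formats these strings inline):

```rust
/// Key for a package archive in the bucket.
fn package_key(package_name: &str, version: &str, target: &str) -> String {
    format!("{package_name}/{version}/{target}/pkg.tar.gz")
}

/// Key for a gzipped doc blob; "Doc" is capitalized to avoid scope clashes.
fn doc_key(doc_hash: &str) -> String {
    format!("Doc/{doc_hash}.gz")
}
```

GETs never proxy the bytes: the handler signs a `GetObject` URL for `S3_SIGN_DURATION` (15 minutes) and answers with a `307` redirect to it, so downloads go straight to the bucket.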

rustfmt.toml Normal file

@@ -0,0 +1 @@
imports_granularity = "Crate"


@@ -1,162 +0,0 @@
use std::path::PathBuf;
use crate::cli::DEFAULT_INDEX_DATA;
use keyring::Entry;
use once_cell::sync::Lazy;
use serde::{Deserialize, Serialize};
struct EnvVarApiTokenSource;
const API_TOKEN_ENV_VAR: &str = "PESDE_API_TOKEN";
impl EnvVarApiTokenSource {
fn get_api_token(&self) -> anyhow::Result<Option<String>> {
match std::env::var(API_TOKEN_ENV_VAR) {
Ok(token) => Ok(Some(token)),
Err(std::env::VarError::NotPresent) => Ok(None),
Err(e) => Err(e.into()),
}
}
}
static AUTH_FILE_PATH: Lazy<PathBuf> =
Lazy::new(|| DEFAULT_INDEX_DATA.0.parent().unwrap().join("auth.yaml"));
static AUTH_FILE: Lazy<AuthFile> =
Lazy::new(
|| match std::fs::read_to_string(AUTH_FILE_PATH.to_path_buf()) {
Ok(config) => serde_yaml::from_str(&config).unwrap(),
Err(e) if e.kind() == std::io::ErrorKind::NotFound => AuthFile::default(),
Err(e) => panic!("{:?}", e),
},
);
#[derive(Serialize, Deserialize, Default, Clone)]
struct AuthFile {
#[serde(default)]
api_token: Option<String>,
}
struct ConfigFileApiTokenSource;
impl ConfigFileApiTokenSource {
fn get_api_token(&self) -> anyhow::Result<Option<String>> {
Ok(AUTH_FILE.api_token.clone())
}
fn set_api_token(&self, api_token: &str) -> anyhow::Result<()> {
let mut config = AUTH_FILE.clone();
config.api_token = Some(api_token.to_string());
serde_yaml::to_writer(
&mut std::fs::File::create(AUTH_FILE_PATH.to_path_buf())?,
&config,
)?;
Ok(())
}
fn delete_api_token(&self) -> anyhow::Result<()> {
let mut config = AUTH_FILE.clone();
config.api_token = None;
serde_yaml::to_writer(
&mut std::fs::File::create(AUTH_FILE_PATH.to_path_buf())?,
&config,
)?;
Ok(())
}
}
static KEYRING_ENTRY: Lazy<Entry> =
Lazy::new(|| Entry::new(env!("CARGO_PKG_NAME"), "api_token").unwrap());
struct KeyringApiTokenSource;
impl KeyringApiTokenSource {
fn get_api_token(&self) -> anyhow::Result<Option<String>> {
match KEYRING_ENTRY.get_password() {
Ok(api_token) => Ok(Some(api_token)),
Err(err) => match err {
keyring::Error::NoEntry | keyring::Error::PlatformFailure(_) => Ok(None),
_ => Err(err.into()),
},
}
}
fn set_api_token(&self, api_token: &str) -> anyhow::Result<()> {
KEYRING_ENTRY.set_password(api_token)?;
Ok(())
}
fn delete_api_token(&self) -> anyhow::Result<()> {
KEYRING_ENTRY.delete_password()?;
Ok(())
}
}
#[derive(Debug)]
pub enum ApiTokenSource {
EnvVar,
ConfigFile,
Keyring,
}
impl ApiTokenSource {
pub fn get_api_token(&self) -> anyhow::Result<Option<String>> {
match self {
ApiTokenSource::EnvVar => EnvVarApiTokenSource.get_api_token(),
ApiTokenSource::ConfigFile => ConfigFileApiTokenSource.get_api_token(),
ApiTokenSource::Keyring => KeyringApiTokenSource.get_api_token(),
}
}
pub fn set_api_token(&self, api_token: &str) -> anyhow::Result<()> {
match self {
ApiTokenSource::EnvVar => Ok(()),
ApiTokenSource::ConfigFile => ConfigFileApiTokenSource.set_api_token(api_token),
ApiTokenSource::Keyring => KeyringApiTokenSource.set_api_token(api_token),
}
}
pub fn delete_api_token(&self) -> anyhow::Result<()> {
match self {
ApiTokenSource::EnvVar => Ok(()),
ApiTokenSource::ConfigFile => ConfigFileApiTokenSource.delete_api_token(),
ApiTokenSource::Keyring => KeyringApiTokenSource.delete_api_token(),
}
}
fn persists(&self) -> bool {
!matches!(self, ApiTokenSource::EnvVar)
}
}
pub static API_TOKEN_SOURCE: Lazy<ApiTokenSource> = Lazy::new(|| {
let sources: [ApiTokenSource; 3] = [
ApiTokenSource::EnvVar,
ApiTokenSource::ConfigFile,
ApiTokenSource::Keyring,
];
let mut valid_sources = vec![];
for source in sources {
match source.get_api_token() {
Ok(Some(_)) => return source,
Ok(None) => {
if source.persists() {
valid_sources.push(source);
}
}
Err(e) => {
log::error!("error getting api token: {e}");
}
}
}
valid_sources.pop().unwrap()
});
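The deleted module above resolved the API token by probing sources in priority order (environment variable, config file, keyring), returning the first source that already holds a token and otherwise falling back to the last source that can persist one. A std-only miniature of that selection logic, with a hypothetical `Source` enum standing in for `ApiTokenSource`:

```rust
// Miniature of the source-priority resolution from the removed
// api_token.rs: short-circuit on the first source with a token,
// otherwise fall back to the last persistent source.
#[derive(Debug, PartialEq, Clone, Copy)]
enum Source {
    EnvVar,
    ConfigFile,
    Keyring,
}

impl Source {
    // Env vars can't be written back, so they never persist a token.
    fn persists(self) -> bool {
        !matches!(self, Source::EnvVar)
    }
}

fn pick_source(probe: impl Fn(Source) -> Option<String>) -> Source {
    let mut fallback = None;
    for source in [Source::EnvVar, Source::ConfigFile, Source::Keyring] {
        if probe(source).is_some() {
            return source; // first source holding a token wins
        }
        if source.persists() {
            fallback = Some(source);
        }
    }
    fallback.expect("at least one persistent source")
}

fn main() {
    // the env var wins when it is set...
    assert_eq!(
        pick_source(|s| (s == Source::EnvVar).then(|| String::from("t"))),
        Source::EnvVar
    );
    // ...otherwise the last persistent source (the keyring) is chosen
    assert_eq!(pick_source(|_| None), Source::Keyring);
}
```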


@@ -1,111 +1,119 @@
use clap::Subcommand;
use pesde::index::Index;
use reqwest::{header::AUTHORIZATION, Url};
use crate::cli::config::{read_config, write_config};
use anyhow::Context;
use gix::bstr::BStr;
use keyring::Entry;
use reqwest::header::AUTHORIZATION;
use serde::{ser::SerializeMap, Deserialize, Serialize};
use std::collections::BTreeMap;
use tracing::instrument;
use crate::cli::{api_token::API_TOKEN_SOURCE, send_request, DEFAULT_INDEX, REQWEST_CLIENT};
#[derive(Debug, Clone)]
pub struct Tokens(pub BTreeMap<gix::Url, String>);
#[derive(Subcommand, Clone)]
pub enum AuthCommand {
/// Logs in to the registry
Login,
/// Logs out from the registry
Logout,
impl Serialize for Tokens {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: serde::ser::Serializer,
{
let mut map = serializer.serialize_map(Some(self.0.len()))?;
for (k, v) in &self.0 {
map.serialize_entry(&k.to_bstring().to_string(), v)?;
}
map.end()
}
}
pub fn auth_command(cmd: AuthCommand) -> anyhow::Result<()> {
match cmd {
AuthCommand::Login => {
let github_oauth_client_id = DEFAULT_INDEX.config()?.github_oauth_client_id;
impl<'de> Deserialize<'de> for Tokens {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: serde::de::Deserializer<'de>,
{
Ok(Tokens(
BTreeMap::<String, String>::deserialize(deserializer)?
.into_iter()
.map(|(k, v)| gix::Url::from_bytes(BStr::new(&k)).map(|k| (k, v)))
.collect::<Result<_, _>>()
.map_err(serde::de::Error::custom)?,
))
}
}
let response = send_request(REQWEST_CLIENT.post(Url::parse_with_params(
"https://github.com/login/device/code",
&[("client_id", &github_oauth_client_id)],
)?))?
.json::<serde_json::Value>()?;
println!(
"go to {} and enter the code `{}`",
response["verification_uri"], response["user_code"]
);
let mut time_left = response["expires_in"]
.as_i64()
.ok_or(anyhow::anyhow!("couldn't get expires_in"))?;
let interval = std::time::Duration::from_secs(
response["interval"]
.as_u64()
.ok_or(anyhow::anyhow!("couldn't get interval"))?,
);
let device_code = response["device_code"]
.as_str()
.ok_or(anyhow::anyhow!("couldn't get device_code"))?;
while time_left > 0 {
std::thread::sleep(interval);
time_left -= interval.as_secs() as i64;
let response = send_request(REQWEST_CLIENT.post(Url::parse_with_params(
"https://github.com/login/oauth/access_token",
&[
("client_id", github_oauth_client_id.as_str()),
("device_code", device_code),
("grant_type", "urn:ietf:params:oauth:grant-type:device_code"),
],
)?))?
.json::<serde_json::Value>()?;
match response
.get("error")
.map(|s| {
s.as_str()
.ok_or(anyhow::anyhow!("couldn't get error as string"))
})
.unwrap_or(Ok(""))?
{
"authorization_pending" => continue,
"slow_down" => {
std::thread::sleep(std::time::Duration::from_secs(5));
continue;
}
"expired_token" => {
break;
}
"access_denied" => {
anyhow::bail!("access denied, re-run the login command");
}
_ => (),
}
if response.get("access_token").is_some() {
let access_token = response["access_token"]
.as_str()
.ok_or(anyhow::anyhow!("couldn't get access_token"))?;
API_TOKEN_SOURCE.set_api_token(access_token)?;
let response = send_request(
REQWEST_CLIENT
.get("https://api.github.com/user")
.header(AUTHORIZATION, format!("Bearer {access_token}")),
)?
.json::<serde_json::Value>()?;
let login = response["login"]
.as_str()
.ok_or(anyhow::anyhow!("couldn't get login"))?;
println!("you're now logged in as {login}");
return Ok(());
}
}
anyhow::bail!("code expired, please re-run the login command");
}
AuthCommand::Logout => {
API_TOKEN_SOURCE.delete_api_token()?;
println!("you're now logged out");
}
#[instrument(level = "trace")]
pub async fn get_tokens() -> anyhow::Result<Tokens> {
let config = read_config().await?;
if !config.tokens.0.is_empty() {
tracing::debug!("using tokens from config");
return Ok(config.tokens);
}
Ok(())
match Entry::new("tokens", env!("CARGO_PKG_NAME")) {
Ok(entry) => match entry.get_password() {
Ok(token) => {
tracing::debug!("using tokens from keyring");
return serde_json::from_str(&token).context("failed to parse tokens");
}
Err(keyring::Error::PlatformFailure(_) | keyring::Error::NoEntry) => {}
Err(e) => return Err(e.into()),
},
Err(keyring::Error::PlatformFailure(_)) => {}
Err(e) => return Err(e.into()),
}
Ok(Tokens(BTreeMap::new()))
}
#[instrument(level = "trace")]
pub async fn set_tokens(tokens: Tokens) -> anyhow::Result<()> {
let entry = Entry::new("tokens", env!("CARGO_PKG_NAME"))?;
let json = serde_json::to_string(&tokens).context("failed to serialize tokens")?;
match entry.set_password(&json) {
Ok(()) => {
tracing::debug!("tokens saved to keyring");
return Ok(());
}
Err(keyring::Error::PlatformFailure(_) | keyring::Error::NoEntry) => {}
Err(e) => return Err(e.into()),
}
tracing::debug!("tokens saved to config");
let mut config = read_config().await?;
config.tokens = tokens;
write_config(&config).await.map_err(Into::into)
}
pub async fn set_token(repo: &gix::Url, token: Option<&str>) -> anyhow::Result<()> {
let mut tokens = get_tokens().await?;
if let Some(token) = token {
tokens.0.insert(repo.clone(), token.to_string());
} else {
tokens.0.remove(repo);
}
set_tokens(tokens).await
}
#[derive(Debug, Deserialize)]
struct UserResponse {
login: String,
}
#[instrument(level = "trace")]
pub async fn get_token_login(
reqwest: &reqwest::Client,
access_token: &str,
) -> anyhow::Result<String> {
let response = reqwest
.get("https://api.github.com/user")
.header(AUTHORIZATION, access_token)
.send()
.await
.context("failed to send user request")?
.error_for_status()
.context("failed to get user")?
.json::<UserResponse>()
.await
.context("failed to parse user response")?;
Ok(response.login)
}
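`get_tokens` above prefers tokens already present in the config and only then consults the OS keyring, treating `PlatformFailure`/`NoEntry` as "no tokens" rather than as errors. A std-only sketch of that lookup order, with a hypothetical `KeyringResult` standing in for the `keyring` crate's result type:

```rust
// Sketch of the lookup order in get_tokens: config first, then the
// keyring, with "no entry" / "platform unavailable" treated as empty
// rather than as a hard error. KeyringResult is a stand-in type.
enum KeyringResult {
    Token(String),
    NoEntry,
    PlatformFailure,
    Other(String),
}

fn resolve(config_token: Option<String>, keyring: KeyringResult) -> Result<String, String> {
    if let Some(t) = config_token {
        return Ok(t); // tokens in the config win outright
    }
    match keyring {
        KeyringResult::Token(t) => Ok(t),
        // both conditions fall through to "no tokens"
        KeyringResult::NoEntry | KeyringResult::PlatformFailure => Ok(String::new()),
        KeyringResult::Other(e) => Err(e),
    }
}

fn main() {
    assert_eq!(
        resolve(Some("cfg".to_string()), KeyringResult::Token("kr".to_string())),
        Ok("cfg".to_string())
    );
    assert_eq!(resolve(None, KeyringResult::NoEntry), Ok(String::new()));
    assert!(resolve(None, KeyringResult::Other("boom".to_string())).is_err());
}
```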

250
src/cli/commands/add.rs Normal file

@@ -0,0 +1,250 @@
use std::{collections::HashSet, str::FromStr};
use anyhow::Context;
use clap::Args;
use colored::Colorize;
use semver::VersionReq;
use crate::cli::{config::read_config, AnyPackageIdentifier, VersionedPackageName};
use pesde::{
manifest::target::TargetKind,
names::PackageNames,
source::{
git::{specifier::GitDependencySpecifier, GitPackageSource},
pesde::{specifier::PesdeDependencySpecifier, PesdePackageSource},
specifiers::DependencySpecifiers,
traits::PackageSource,
workspace::WorkspacePackageSource,
PackageSources,
},
Project, DEFAULT_INDEX_NAME,
};
#[derive(Debug, Args)]
pub struct AddCommand {
/// The package name to add
#[arg(index = 1)]
name: AnyPackageIdentifier<VersionReq>,
/// The index in which to search for the package
#[arg(short, long)]
index: Option<String>,
/// The target environment of the package
#[arg(short, long)]
target: Option<TargetKind>,
/// The alias to use for the package
#[arg(short, long)]
alias: Option<String>,
/// Whether to add the package as a peer dependency
#[arg(short, long)]
peer: bool,
/// Whether to add the package as a dev dependency
#[arg(short, long, conflicts_with = "peer")]
dev: bool,
}
impl AddCommand {
pub async fn run(self, project: Project) -> anyhow::Result<()> {
let manifest = project
.deser_manifest()
.await
.context("failed to read manifest")?;
let (source, specifier) = match &self.name {
AnyPackageIdentifier::PackageName(versioned) => match &versioned {
VersionedPackageName(PackageNames::Pesde(name), version) => {
let index = manifest
.indices
.get(self.index.as_deref().unwrap_or(DEFAULT_INDEX_NAME))
.cloned();
if let Some(index) = self.index.as_ref().filter(|_| index.is_none()) {
println!("{}: index {index} not found", "error".red().bold());
return Ok(());
}
let index = match index {
Some(index) => index,
None => read_config().await?.default_index,
};
let source = PackageSources::Pesde(PesdePackageSource::new(index));
let specifier = DependencySpecifiers::Pesde(PesdeDependencySpecifier {
name: name.clone(),
version: version.clone().unwrap_or(VersionReq::STAR),
index: self.index,
target: self.target,
});
(source, specifier)
}
#[cfg(feature = "wally-compat")]
VersionedPackageName(PackageNames::Wally(name), version) => {
let index = manifest
.wally_indices
.get(self.index.as_deref().unwrap_or(DEFAULT_INDEX_NAME))
.cloned();
if let Some(index) = self.index.as_ref().filter(|_| index.is_none()) {
println!("{}: wally index {index} not found", "error".red().bold());
return Ok(());
}
let index = index.context("no wally index found")?;
let source =
PackageSources::Wally(pesde::source::wally::WallyPackageSource::new(index));
let specifier = DependencySpecifiers::Wally(
pesde::source::wally::specifier::WallyDependencySpecifier {
name: name.clone(),
version: version.clone().unwrap_or(VersionReq::STAR),
index: self.index,
},
);
(source, specifier)
}
},
AnyPackageIdentifier::Url((url, rev)) => (
PackageSources::Git(GitPackageSource::new(url.clone())),
DependencySpecifiers::Git(GitDependencySpecifier {
repo: url.clone(),
rev: rev.to_string(),
path: None,
}),
),
AnyPackageIdentifier::Workspace(VersionedPackageName(name, version)) => (
PackageSources::Workspace(WorkspacePackageSource),
DependencySpecifiers::Workspace(
pesde::source::workspace::specifier::WorkspaceDependencySpecifier {
name: name.clone(),
version: version.clone().unwrap_or_default(),
target: self.target,
},
),
),
};
source
.refresh(&project)
.await
.context("failed to refresh package source")?;
let Some(version_id) = source
.resolve(
&specifier,
&project,
manifest.target.kind(),
&mut HashSet::new(),
)
.await
.context("failed to resolve package")?
.1
.pop_last()
.map(|(v_id, _)| v_id)
else {
println!("{}: no versions found for package", "error".red().bold());
return Ok(());
};
let project_target = manifest.target.kind();
let mut manifest = toml_edit::DocumentMut::from_str(
&project
.read_manifest()
.await
.context("failed to read manifest")?,
)
.context("failed to parse manifest")?;
let dependency_key = if self.peer {
"peer_dependencies"
} else if self.dev {
"dev_dependencies"
} else {
"dependencies"
};
let alias = self.alias.unwrap_or_else(|| match self.name.clone() {
AnyPackageIdentifier::PackageName(versioned) => versioned.0.as_str().1.to_string(),
AnyPackageIdentifier::Url((url, _)) => url
.path
.to_string()
.split('/')
.last()
.map(|s| s.to_string())
.unwrap_or(url.path.to_string()),
AnyPackageIdentifier::Workspace(versioned) => versioned.0.as_str().1.to_string(),
});
let field = &mut manifest[dependency_key]
.or_insert(toml_edit::Item::Table(toml_edit::Table::new()))[&alias];
match specifier {
DependencySpecifiers::Pesde(spec) => {
field["name"] = toml_edit::value(spec.name.clone().to_string());
field["version"] = toml_edit::value(format!("^{}", version_id.version()));
if *version_id.target() != project_target {
field["target"] = toml_edit::value(version_id.target().to_string());
}
if let Some(index) = spec.index.filter(|i| i != DEFAULT_INDEX_NAME) {
field["index"] = toml_edit::value(index);
}
println!(
"added {}@{} {} to {}",
spec.name,
version_id.version(),
version_id.target(),
dependency_key
);
}
#[cfg(feature = "wally-compat")]
DependencySpecifiers::Wally(spec) => {
field["wally"] = toml_edit::value(spec.name.clone().to_string());
field["version"] = toml_edit::value(format!("^{}", version_id.version()));
if let Some(index) = spec.index.filter(|i| i != DEFAULT_INDEX_NAME) {
field["index"] = toml_edit::value(index);
}
println!(
"added wally {}@{} to {}",
spec.name,
version_id.version(),
dependency_key
);
}
DependencySpecifiers::Git(spec) => {
field["repo"] = toml_edit::value(spec.repo.to_bstring().to_string());
field["rev"] = toml_edit::value(spec.rev.clone());
println!("added git {}#{} to {}", spec.repo, spec.rev, dependency_key);
}
DependencySpecifiers::Workspace(spec) => {
field["workspace"] = toml_edit::value(spec.name.clone().to_string());
if let AnyPackageIdentifier::Workspace(versioned) = self.name {
if let Some(version) = versioned.1 {
field["version"] = toml_edit::value(version.to_string());
}
}
println!(
"added workspace {}@{} to {}",
spec.name, spec.version, dependency_key
);
}
}
project
.write_manifest(manifest.to_string())
.await
.context("failed to write manifest")?;
Ok(())
}
}
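Two of the small decisions `AddCommand::run` makes above are easy to isolate: which manifest table the dependency lands in (driven by the `--peer`/`--dev` flags), and the default alias for a git dependency (the last path segment of the URL). A sketch of both, with hypothetical helper names:

```rust
// Sketch of two decisions from AddCommand::run above.
// Helper names are illustrative only.
fn dependency_key(peer: bool, dev: bool) -> &'static str {
    if peer {
        "peer_dependencies"
    } else if dev {
        "dev_dependencies"
    } else {
        "dependencies"
    }
}

// Default alias for a git dependency: last path segment, or the
// whole path if there is no '/' to split on.
fn default_git_alias(url_path: &str) -> String {
    url_path
        .split('/')
        .last()
        .map(str::to_string)
        .unwrap_or_else(|| url_path.to_string())
}

fn main() {
    assert_eq!(dependency_key(true, false), "peer_dependencies");
    assert_eq!(dependency_key(false, true), "dev_dependencies");
    assert_eq!(dependency_key(false, false), "dependencies");
    assert_eq!(default_git_alias("owner/repo"), "repo");
}
```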


@@ -0,0 +1,193 @@
use anyhow::Context;
use clap::Args;
use colored::Colorize;
use serde::Deserialize;
use std::thread::spawn;
use tokio::time::sleep;
use url::Url;
use pesde::{
source::{pesde::PesdePackageSource, traits::PackageSource},
Project,
};
use crate::cli::auth::{get_token_login, set_token};
#[derive(Debug, Args)]
pub struct LoginCommand {
/// The token to use for authentication, skipping login
#[arg(short, long)]
token: Option<String>,
}
#[derive(Debug, Deserialize)]
struct DeviceCodeResponse {
device_code: String,
user_code: String,
verification_uri: Url,
expires_in: u64,
interval: u64,
}
#[derive(Debug, Deserialize)]
#[serde(rename_all = "snake_case", tag = "error")]
enum AccessTokenError {
AuthorizationPending,
SlowDown { interval: u64 },
ExpiredToken,
AccessDenied,
}
#[derive(Debug, Deserialize)]
#[serde(untagged)]
enum AccessTokenResponse {
Success { access_token: String },
Error(AccessTokenError),
}
impl LoginCommand {
pub async fn authenticate_device_flow(
&self,
index_url: &gix::Url,
project: &Project,
reqwest: &reqwest::Client,
) -> anyhow::Result<String> {
println!("logging in into {index_url}");
let source = PesdePackageSource::new(index_url.clone());
source
.refresh(project)
.await
.context("failed to refresh index")?;
let config = source
.config(project)
.await
.context("failed to read index config")?;
let Some(client_id) = config.github_oauth_client_id else {
anyhow::bail!("index not configured for Github oauth.");
};
let response = reqwest
.post(Url::parse_with_params(
"https://github.com/login/device/code",
&[("client_id", &client_id)],
)?)
.send()
.await
.context("failed to send device code request")?
.error_for_status()
.context("failed to get device code response")?
.json::<DeviceCodeResponse>()
.await
.context("failed to parse device code response")?;
println!(
"copy your one-time code: {}\npress enter to open {} in your browser...",
response.user_code.bold(),
response.verification_uri.as_str().blue()
);
spawn(move || {
{
let mut input = String::new();
std::io::stdin()
.read_line(&mut input)
.expect("failed to read input");
}
match open::that(response.verification_uri.as_str()) {
Ok(_) => (),
Err(e) => {
eprintln!("failed to open browser: {e}");
}
}
});
let mut time_left = response.expires_in;
let mut interval = std::time::Duration::from_secs(response.interval);
while time_left > 0 {
sleep(interval).await;
time_left = time_left.saturating_sub(interval.as_secs());
let response = reqwest
.post(Url::parse_with_params(
"https://github.com/login/oauth/access_token",
&[
("client_id", &client_id),
("device_code", &response.device_code),
(
"grant_type",
&"urn:ietf:params:oauth:grant-type:device_code".to_string(),
),
],
)?)
.send()
.await
.context("failed to send access token request")?
.error_for_status()
.context("failed to get access token response")?
.json::<AccessTokenResponse>()
.await
.context("failed to parse access token response")?;
match response {
AccessTokenResponse::Success { access_token } => {
return Ok(access_token);
}
AccessTokenResponse::Error(e) => match e {
AccessTokenError::AuthorizationPending => continue,
AccessTokenError::SlowDown {
interval: new_interval,
} => {
interval = std::time::Duration::from_secs(new_interval);
continue;
}
AccessTokenError::ExpiredToken => {
break;
}
AccessTokenError::AccessDenied => {
anyhow::bail!("access denied, re-run the login command");
}
},
}
}
anyhow::bail!("code expired, please re-run the login command");
}
pub async fn run(
self,
index_url: gix::Url,
project: Project,
reqwest: reqwest::Client,
) -> anyhow::Result<()> {
let token_given = self.token.is_some();
let token = match self.token {
Some(token) => token,
None => {
self.authenticate_device_flow(&index_url, &project, &reqwest)
.await?
}
};
let token = if token_given {
println!("set token for {index_url}");
token
} else {
let token = format!("Bearer {token}");
println!(
"logged in as {} for {index_url}",
get_token_login(&reqwest, &token).await?.bold()
);
token
};
set_token(&index_url, Some(&token)).await?;
Ok(())
}
}
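The polling loop in `authenticate_device_flow` above follows the standard OAuth device-flow shape: poll until success, honor `slow_down` by growing the interval, and give up on expiry. A synchronous miniature of that loop, where the `poll` closure stands in for the GitHub access-token endpoint:

```rust
// Miniature of the device-flow polling loop above. The Poll enum and
// poll closure are stand-ins for the real GitHub endpoint responses.
enum Poll {
    Pending,
    SlowDown(u64),
    Expired,
    Success(String),
}

fn poll_for_token(
    mut time_left: u64,
    mut interval: u64,
    mut poll: impl FnMut() -> Poll,
) -> Option<String> {
    while time_left > 0 {
        // a real implementation sleeps for `interval` seconds here
        time_left = time_left.saturating_sub(interval);
        match poll() {
            Poll::Success(token) => return Some(token),
            Poll::SlowDown(new_interval) => interval = new_interval,
            Poll::Pending => continue,
            Poll::Expired => break,
        }
    }
    None
}

fn main() {
    assert_eq!(
        poll_for_token(60, 5, || Poll::Success("tok".to_string())),
        Some("tok".to_string())
    );
    // the budget runs out while authorization stays pending
    assert_eq!(poll_for_token(10, 5, || Poll::Pending), None);
    assert_eq!(poll_for_token(60, 5, || Poll::Expired), None);
}
```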


@@ -0,0 +1,15 @@
use crate::cli::auth::set_token;
use clap::Args;
#[derive(Debug, Args)]
pub struct LogoutCommand {}
impl LogoutCommand {
pub async fn run(self, index_url: gix::Url) -> anyhow::Result<()> {
set_token(&index_url, None).await?;
println!("logged out of {index_url}");
Ok(())
}
}


@@ -0,0 +1,73 @@
use crate::cli::config::read_config;
use clap::{Args, Subcommand};
use pesde::{errors::ManifestReadError, Project, DEFAULT_INDEX_NAME};
mod login;
mod logout;
mod token;
mod whoami;
#[derive(Debug, Args)]
pub struct AuthSubcommand {
/// The index to use. Defaults to `default`, or the configured default index if current directory doesn't have a manifest
#[arg(short, long)]
pub index: Option<String>,
#[clap(subcommand)]
pub command: AuthCommands,
}
#[derive(Debug, Subcommand)]
pub enum AuthCommands {
/// Sets a token for an index. Optionally gets it from GitHub
Login(login::LoginCommand),
/// Removes the stored token
Logout(logout::LogoutCommand),
/// Prints the username of the currently logged-in user
#[clap(name = "whoami")]
WhoAmI(whoami::WhoAmICommand),
/// Prints the token for an index
Token(token::TokenCommand),
}
impl AuthSubcommand {
pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
let manifest = match project.deser_manifest().await {
Ok(manifest) => Some(manifest),
Err(e) => match e {
ManifestReadError::Io(e) if e.kind() == std::io::ErrorKind::NotFound => None,
e => return Err(e.into()),
},
};
let index_url = match self.index.as_deref() {
Some(index) => match index.try_into() {
Ok(url) => Some(url),
Err(_) => None,
},
None => match manifest {
Some(_) => None,
None => Some(read_config().await?.default_index),
},
};
let index_url = match index_url {
Some(url) => url,
None => {
let index_name = self.index.as_deref().unwrap_or(DEFAULT_INDEX_NAME);
match manifest.unwrap().indices.get(index_name) {
Some(index) => index.clone(),
None => anyhow::bail!("index {index_name} not found in manifest"),
}
}
};
match self.command {
AuthCommands::Login(login) => login.run(index_url, project, reqwest).await,
AuthCommands::Logout(logout) => logout.run(index_url).await,
AuthCommands::WhoAmI(whoami) => whoami.run(index_url, reqwest).await,
AuthCommands::Token(token) => token.run(index_url).await,
}
}
}
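The index resolution in `AuthSubcommand::run` above layers three fallbacks: an `--index` argument that parses as a URL is used directly; otherwise the argument (or `default`) is treated as a named index in the manifest; and with no manifest at all, the configured default index is used. A std-only approximation, where "parses as a URL" is crudely modeled by checking for `://`:

```rust
// Approximation of the index resolution order in AuthSubcommand::run.
// The "parses as a URL" check is simplified to contains("://");
// the real code uses gix::Url parsing.
fn resolve_index(
    arg: Option<&str>,
    manifest_indices: Option<&[(&str, &str)]>,
    default_index: &str,
) -> Result<String, String> {
    if let Some(arg) = arg {
        if arg.contains("://") {
            return Ok(arg.to_string()); // direct URL wins
        }
    }
    match manifest_indices {
        Some(indices) => {
            let name = arg.unwrap_or("default");
            indices
                .iter()
                .find(|(n, _)| *n == name)
                .map(|(_, url)| url.to_string())
                .ok_or_else(|| format!("index {name} not found in manifest"))
        }
        // no manifest in the current directory: fall back to config
        None => Ok(default_index.to_string()),
    }
}

fn main() {
    let indices = [("default", "https://a"), ("extra", "https://b")];
    assert_eq!(
        resolve_index(None, Some(&indices), "https://d"),
        Ok("https://a".to_string())
    );
    assert_eq!(
        resolve_index(Some("extra"), Some(&indices), "https://d"),
        Ok("https://b".to_string())
    );
    assert_eq!(resolve_index(None, None, "https://d"), Ok("https://d".to_string()));
}
```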


@@ -0,0 +1,22 @@
use crate::cli::auth::get_tokens;
use clap::Args;
#[derive(Debug, Args)]
pub struct TokenCommand {}
impl TokenCommand {
pub async fn run(self, index_url: gix::Url) -> anyhow::Result<()> {
let tokens = get_tokens().await?;
let token = match tokens.0.get(&index_url) {
Some(token) => token,
None => {
println!("not logged in into {index_url}");
return Ok(());
}
};
println!("token for {index_url}: \"{token}\"");
Ok(())
}
}


@@ -0,0 +1,26 @@
use crate::cli::auth::{get_token_login, get_tokens};
use clap::Args;
use colored::Colorize;
#[derive(Debug, Args)]
pub struct WhoAmICommand {}
impl WhoAmICommand {
pub async fn run(self, index_url: gix::Url, reqwest: reqwest::Client) -> anyhow::Result<()> {
let tokens = get_tokens().await?;
let token = match tokens.0.get(&index_url) {
Some(token) => token,
None => {
println!("not logged in into {index_url}");
return Ok(());
}
};
println!(
"logged in as {} into {index_url}",
get_token_login(&reqwest, token).await?.bold()
);
Ok(())
}
}


@@ -0,0 +1,38 @@
use crate::cli::config::{read_config, write_config, CliConfig};
use clap::Args;
#[derive(Debug, Args)]
pub struct DefaultIndexCommand {
/// The new index URL to set as default, don't pass any value to check the current default index
#[arg(index = 1, value_parser = crate::cli::parse_gix_url)]
index: Option<gix::Url>,
/// Resets the default index to the default value
#[arg(short, long, conflicts_with = "index")]
reset: bool,
}
impl DefaultIndexCommand {
pub async fn run(self) -> anyhow::Result<()> {
let mut config = read_config().await?;
let index = if self.reset {
Some(CliConfig::default().default_index)
} else {
self.index
};
match index {
Some(index) => {
config.default_index = index.clone();
write_config(&config).await?;
println!("default index set to: {index}");
}
None => {
println!("current default index: {}", config.default_index);
}
}
Ok(())
}
}


@@ -0,0 +1,17 @@
use clap::Subcommand;
mod default_index;
#[derive(Debug, Subcommand)]
pub enum ConfigCommands {
/// Configuration for the default index
DefaultIndex(default_index::DefaultIndexCommand),
}
impl ConfigCommands {
pub async fn run(self) -> anyhow::Result<()> {
match self {
ConfigCommands::DefaultIndex(default_index) => default_index.run().await,
}
}
}

167
src/cli/commands/execute.rs Normal file

@@ -0,0 +1,167 @@
use crate::cli::{config::read_config, progress_bar, VersionedPackageName};
use anyhow::Context;
use clap::Args;
use fs_err::tokio as fs;
use pesde::{
linking::generator::generate_bin_linking_module,
manifest::target::TargetKind,
names::PackageName,
source::{
pesde::{specifier::PesdeDependencySpecifier, PesdePackageSource},
traits::PackageSource,
},
Project,
};
use semver::VersionReq;
use std::{
collections::HashSet, env::current_dir, ffi::OsString, io::Write, process::Command, sync::Arc,
};
use tokio::sync::Mutex;
#[derive(Debug, Args)]
pub struct ExecuteCommand {
/// The package name, script name, or path to a script to run
#[arg(index = 1)]
package: VersionedPackageName<VersionReq, PackageName>,
/// The index URL to use for the package
#[arg(short, long, value_parser = crate::cli::parse_gix_url)]
index: Option<gix::Url>,
/// Arguments to pass to the script
#[arg(index = 2, last = true)]
args: Vec<OsString>,
}
impl ExecuteCommand {
pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
let index = match self.index {
Some(index) => Some(index),
None => read_config().await.ok().map(|c| c.default_index),
}
.context("no index specified")?;
let source = PesdePackageSource::new(index);
source
.refresh(&project)
.await
.context("failed to refresh source")?;
let version_req = self.package.1.unwrap_or(VersionReq::STAR);
let Some((version, pkg_ref)) = ('finder: {
let specifier = PesdeDependencySpecifier {
name: self.package.0.clone(),
version: version_req.clone(),
index: None,
target: None,
};
if let Some(res) = source
.resolve(&specifier, &project, TargetKind::Lune, &mut HashSet::new())
.await
.context("failed to resolve package")?
.1
.pop_last()
{
break 'finder Some(res);
}
source
.resolve(&specifier, &project, TargetKind::Luau, &mut HashSet::new())
.await
.context("failed to resolve package")?
.1
.pop_last()
}) else {
anyhow::bail!(
"no Lune or Luau package could be found for {}@{version_req}",
self.package.0,
);
};
println!("using {}@{version}", pkg_ref.name);
let tmp_dir = project.cas_dir().join(".tmp");
fs::create_dir_all(&tmp_dir)
.await
.context("failed to create temporary directory")?;
let tempdir =
tempfile::tempdir_in(tmp_dir).context("failed to create temporary directory")?;
let project = Project::new(
tempdir.path(),
None::<std::path::PathBuf>,
project.data_dir(),
project.cas_dir(),
project.auth_config().clone(),
);
let (fs, target) = source
.download(&pkg_ref, &project, &reqwest)
.await
.context("failed to download package")?;
let bin_path = target.bin_path().context("package has no binary export")?;
fs.write_to(tempdir.path(), project.cas_dir(), true)
.await
.context("failed to write package contents")?;
let mut refreshed_sources = HashSet::new();
let graph = project
.dependency_graph(None, &mut refreshed_sources, true)
.await
.context("failed to build dependency graph")?;
let graph = Arc::new(graph);
let (rx, downloaded_graph) = project
.download_and_link(
&graph,
&Arc::new(Mutex::new(refreshed_sources)),
&reqwest,
true,
true,
|_| async { Ok::<_, std::io::Error>(()) },
)
.await
.context("failed to download dependencies")?;
progress_bar(
graph.values().map(|versions| versions.len() as u64).sum(),
rx,
"📥 ".to_string(),
"downloading dependencies".to_string(),
"downloaded dependencies".to_string(),
)
.await?;
downloaded_graph
.await
.context("failed to download & link dependencies")?;
let mut caller =
tempfile::NamedTempFile::new_in(tempdir.path()).context("failed to create tempfile")?;
caller
.write_all(
generate_bin_linking_module(
tempdir.path(),
&format!("{:?}", bin_path.to_path(tempdir.path())),
)
.as_bytes(),
)
.context("failed to write to tempfile")?;
let status = Command::new("lune")
.arg("run")
.arg(caller.path())
.arg("--")
.args(&self.args)
.current_dir(current_dir().context("failed to get current directory")?)
.status()
.context("failed to run script")?;
drop(caller);
drop(tempdir);
std::process::exit(status.code().unwrap_or(1))
}
}
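The `'finder` block in `ExecuteCommand::run` above resolves the package twice: once against the Lune target, and only if that finds nothing, against Luau. The fallback shape reduces to an `Option::or_else` chain; a sketch with a hypothetical resolver closure:

```rust
// Sketch of the Lune-then-Luau fallback in ExecuteCommand::run above.
// The resolve closure stands in for PesdePackageSource::resolve.
fn resolve_with_fallback(resolve: impl Fn(&str) -> Option<String>) -> Option<String> {
    resolve("lune").or_else(|| resolve("luau"))
}

fn main() {
    // only the Luau target resolves, so the fallback is taken
    assert_eq!(
        resolve_with_fallback(|t| if t == "luau" { Some("pkg".to_string()) } else { None }),
        Some("pkg".to_string())
    );
    assert_eq!(resolve_with_fallback(|_| None), None);
}
```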

262
src/cli/commands/init.rs Normal file

@@ -0,0 +1,262 @@
use crate::cli::config::read_config;
use anyhow::Context;
use clap::Args;
use colored::Colorize;
use inquire::validator::Validation;
use pesde::{
errors::ManifestReadError,
manifest::{target::TargetKind, DependencyType},
names::PackageName,
source::{
git_index::GitBasedSource,
pesde::{specifier::PesdeDependencySpecifier, PesdePackageSource},
specifiers::DependencySpecifiers,
traits::PackageSource,
},
Project, DEFAULT_INDEX_NAME, SCRIPTS_LINK_FOLDER,
};
use semver::VersionReq;
use std::{collections::HashSet, fmt::Display, str::FromStr};
#[derive(Debug, Args)]
pub struct InitCommand {}
#[derive(Debug)]
enum PackageNameOrCustom {
PackageName(PackageName),
Custom,
}
impl Display for PackageNameOrCustom {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
PackageNameOrCustom::PackageName(n) => write!(f, "{n}"),
PackageNameOrCustom::Custom => write!(f, "custom"),
}
}
}
impl InitCommand {
pub async fn run(self, project: Project) -> anyhow::Result<()> {
match project.read_manifest().await {
Ok(_) => {
println!("{}", "project already initialized".red());
return Ok(());
}
Err(ManifestReadError::Io(e)) if e.kind() == std::io::ErrorKind::NotFound => {}
Err(e) => return Err(e.into()),
};
let mut manifest = toml_edit::DocumentMut::new();
manifest["name"] = toml_edit::value(
inquire::Text::new("what is the name of the project?")
.with_validator(|name: &str| {
Ok(match PackageName::from_str(name) {
Ok(_) => Validation::Valid,
Err(e) => Validation::Invalid(e.to_string().into()),
})
})
.prompt()
.unwrap(),
);
manifest["version"] = toml_edit::value("0.1.0");
let description = inquire::Text::new("what is the description of the project?")
.with_help_message("a short description of the project. leave empty for none")
.prompt()
.unwrap();
if !description.is_empty() {
manifest["description"] = toml_edit::value(description);
}
let authors = inquire::Text::new("who are the authors of this project?")
.with_help_message("comma separated list. leave empty for none")
.prompt()
.unwrap();
let authors = authors
.split(',')
.map(str::trim)
.filter(|s| !s.is_empty())
.collect::<toml_edit::Array>();
if !authors.is_empty() {
manifest["authors"] = toml_edit::value(authors);
}
let repo = inquire::Text::new("what is the repository URL of this project?")
.with_validator(|repo: &str| {
if repo.is_empty() {
return Ok(Validation::Valid);
}
Ok(match url::Url::parse(repo) {
Ok(_) => Validation::Valid,
Err(e) => Validation::Invalid(e.to_string().into()),
})
})
.with_help_message("leave empty for none")
.prompt()
.unwrap();
if !repo.is_empty() {
manifest["repository"] = toml_edit::value(repo);
}
let license = inquire::Text::new("what is the license of this project?")
.with_initial_value("MIT")
.with_help_message("an SPDX license identifier. leave empty for none")
.prompt()
.unwrap();
if !license.is_empty() {
manifest["license"] = toml_edit::value(license);
}
let target_env = inquire::Select::new(
"what environment are you targeting for your package?",
TargetKind::VARIANTS.to_vec(),
)
.prompt()
.unwrap();
manifest["target"].or_insert(toml_edit::Item::Table(toml_edit::Table::new()))
["environment"] = toml_edit::value(target_env.to_string());
let source = PesdePackageSource::new(read_config().await?.default_index);
manifest["indices"].or_insert(toml_edit::Item::Table(toml_edit::Table::new()))
[DEFAULT_INDEX_NAME] = toml_edit::value(source.repo_url().to_bstring().to_string());
if target_env.is_roblox()
|| inquire::prompt_confirmation(
"would you like to setup default Roblox compatibility scripts?",
)
.unwrap()
{
PackageSource::refresh(&source, &project)
.await
.context("failed to refresh package source")?;
let config = source
.config(&project)
.await
.context("failed to get source config")?;
let scripts_package = if config.scripts_packages.is_empty() {
PackageNameOrCustom::Custom
} else {
inquire::Select::new(
"which scripts package do you want to use?",
config
.scripts_packages
.into_iter()
.map(PackageNameOrCustom::PackageName)
.chain(std::iter::once(PackageNameOrCustom::Custom))
.collect(),
)
.prompt()
.unwrap()
};
let scripts_package = match scripts_package {
PackageNameOrCustom::PackageName(p) => Some(p),
PackageNameOrCustom::Custom => {
let name = inquire::Text::new("which scripts package to use?")
.with_validator(|name: &str| {
if name.is_empty() {
return Ok(Validation::Valid);
}
Ok(match PackageName::from_str(name) {
Ok(_) => Validation::Valid,
Err(e) => Validation::Invalid(e.to_string().into()),
})
})
.with_help_message("leave empty for none")
.prompt()
.unwrap();
if name.is_empty() {
None
} else {
Some(PackageName::from_str(&name).unwrap())
}
}
};
if let Some(scripts_pkg_name) = scripts_package {
let (v_id, pkg_ref) = source
.resolve(
&PesdeDependencySpecifier {
name: scripts_pkg_name,
version: VersionReq::STAR,
index: None,
target: None,
},
&project,
TargetKind::Lune,
&mut HashSet::new(),
)
.await
.context("failed to resolve scripts package")?
.1
.pop_last()
.context("scripts package not found")?;
let Some(scripts) = pkg_ref.target.scripts().filter(|s| !s.is_empty()) else {
anyhow::bail!("scripts package has no scripts. this is an issue with the index")
};
let scripts_field = &mut manifest["scripts"]
.or_insert(toml_edit::Item::Table(toml_edit::Table::new()));
for script_name in scripts.keys() {
scripts_field[script_name] = toml_edit::value(format!(
"{SCRIPTS_LINK_FOLDER}/scripts/{script_name}.luau"
));
}
let dev_deps = &mut manifest["dev_dependencies"]
.or_insert(toml_edit::Item::Table(toml_edit::Table::new()));
let field = &mut dev_deps["scripts"];
field["name"] = toml_edit::value(pkg_ref.name.to_string());
field["version"] = toml_edit::value(format!("^{}", v_id.version()));
field["target"] = toml_edit::value(v_id.target().to_string());
for (alias, (spec, ty)) in pkg_ref.dependencies {
if ty != DependencyType::Peer {
continue;
}
let DependencySpecifiers::Pesde(spec) = spec else {
continue;
};
let field = &mut dev_deps[alias];
field["name"] = toml_edit::value(spec.name.to_string());
field["version"] = toml_edit::value(spec.version.to_string());
field["target"] =
toml_edit::value(spec.target.unwrap_or_else(|| *v_id.target()).to_string());
}
} else {
println!(
"{}",
"no scripts package configured, this can cause issues with Roblox compatibility".red()
);
if !inquire::prompt_confirmation("initialize regardless?").unwrap() {
return Ok(());
}
}
}
project.write_manifest(manifest.to_string()).await?;
println!(
"{}\n{}: run `install` to fully finish setup",
"initialized project".green(),
"tip".cyan().bold()
);
Ok(())
}
}

src/cli/commands/install.rs Normal file

@@ -0,0 +1,328 @@
use crate::cli::{
bin_dir, files::make_executable, progress_bar, run_on_workspace_members, up_to_date_lockfile,
};
use anyhow::Context;
use clap::Args;
use colored::{ColoredString, Colorize};
use fs_err::tokio as fs;
use futures::future::try_join_all;
use pesde::{
download_and_link::filter_graph, lockfile::Lockfile, manifest::target::TargetKind, Project,
MANIFEST_FILE_NAME,
};
use std::{
collections::{BTreeSet, HashMap, HashSet},
sync::Arc,
};
use tokio::sync::Mutex;
#[derive(Debug, Args, Copy, Clone)]
pub struct InstallCommand {
/// Whether to error on changes in the lockfile
#[arg(long)]
locked: bool,
/// Whether to skip installing dev dependencies
#[arg(long)]
prod: bool,
}
fn bin_link_file(alias: &str) -> String {
let mut all_combinations = BTreeSet::new();
for a in TargetKind::VARIANTS {
for b in TargetKind::VARIANTS {
all_combinations.insert((a, b));
}
}
let all_folders = all_combinations
.into_iter()
.map(|(a, b)| format!("{:?}", a.packages_folder(b)))
.collect::<BTreeSet<_>>()
.into_iter()
.collect::<Vec<_>>()
.join(", ");
format!(
r#"local process = require("@lune/process")
local fs = require("@lune/fs")
local stdio = require("@lune/stdio")
local project_root = process.cwd
local path_components = string.split(string.gsub(project_root, "\\", "/"), "/")
for i = #path_components, 1, -1 do
local path = table.concat(path_components, "/", 1, i)
if fs.isFile(path .. "/{MANIFEST_FILE_NAME}") then
project_root = path
break
end
end
for _, packages_folder in {{ {all_folders} }} do
local path = `{{project_root}}/{{packages_folder}}/{alias}.bin.luau`
if fs.isFile(path) then
require(path)
return
end
end
stdio.ewrite(stdio.color("red") .. "binary `{alias}` not found. are you in the right directory?" .. stdio.color("reset") .. "\n")
"#,
)
}
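For reference, the template above expands per alias into a small Luau shim. A hand-expanded sketch for an alias `example` follows; the manifest file name (`pesde.toml`) and the concrete folder list are illustrative assumptions, since the real values come from `MANIFEST_FILE_NAME` and `packages_folder`:

```luau
-- Hypothetical expansion of bin_link_file("example"); folder names illustrative.
local process = require("@lune/process")
local fs = require("@lune/fs")
local stdio = require("@lune/stdio")

-- walk up from the cwd to find the project root (the dir containing the manifest)
local project_root = process.cwd
local path_components = string.split(string.gsub(project_root, "\\", "/"), "/")
for i = #path_components, 1, -1 do
	local path = table.concat(path_components, "/", 1, i)
	if fs.isFile(path .. "/pesde.toml") then
		project_root = path
		break
	end
end

-- try each packages folder until the aliased bin entrypoint is found
for _, packages_folder in { "luau_packages", "lune_packages", "roblox_packages" } do
	local path = `{project_root}/{packages_folder}/example.bin.luau`
	if fs.isFile(path) then
		require(path)
		return
	end
end

stdio.ewrite(stdio.color("red") .. "binary `example` not found. are you in the right directory?" .. stdio.color("reset") .. "\n")
```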
#[cfg(feature = "patches")]
const JOBS: u8 = 5;
#[cfg(not(feature = "patches"))]
const JOBS: u8 = 4;
fn job(n: u8) -> ColoredString {
format!("[{n}/{JOBS}]").dimmed().bold()
}
#[derive(Debug, thiserror::Error)]
#[error(transparent)]
struct CallbackError(#[from] anyhow::Error);
impl InstallCommand {
pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
let mut refreshed_sources = HashSet::new();
let manifest = project
.deser_manifest()
.await
.context("failed to read manifest")?;
let lockfile = if self.locked {
match up_to_date_lockfile(&project).await? {
None => {
anyhow::bail!(
"lockfile is out of sync, run `{} install` to update it",
env!("CARGO_BIN_NAME")
);
}
file => file,
}
} else {
match project.deser_lockfile().await {
Ok(lockfile) => {
if lockfile.overrides != manifest.overrides {
tracing::debug!("overrides are different");
None
} else if lockfile.target != manifest.target.kind() {
tracing::debug!("target kind is different");
None
} else {
Some(lockfile)
}
}
Err(pesde::errors::LockfileReadError::Io(e))
if e.kind() == std::io::ErrorKind::NotFound =>
{
None
}
Err(e) => return Err(e.into()),
}
};
println!(
"\n{}\n",
format!("[now installing {} {}]", manifest.name, manifest.target)
.bold()
.on_bright_black()
);
println!("{} ❌ removing current package folders", job(1));
{
let mut deleted_folders = HashMap::new();
for target_kind in TargetKind::VARIANTS {
let folder = manifest.target.kind().packages_folder(target_kind);
let package_dir = project.package_dir();
deleted_folders
.entry(folder.to_string())
.or_insert_with(|| async move {
tracing::debug!("deleting the {folder} folder");
if let Some(e) = fs::remove_dir_all(package_dir.join(&folder))
.await
.err()
.filter(|e| e.kind() != std::io::ErrorKind::NotFound)
{
return Err(e).context(format!("failed to remove the {folder} folder"));
};
Ok(())
});
}
try_join_all(deleted_folders.into_values())
.await
.context("failed to remove package folders")?;
}
let old_graph = lockfile.map(|lockfile| {
lockfile
.graph
.into_iter()
.map(|(name, versions)| {
(
name,
versions
.into_iter()
.map(|(version, node)| (version, node.node))
.collect(),
)
})
.collect()
});
println!("{} 📦 building dependency graph", job(2));
let graph = project
.dependency_graph(old_graph.as_ref(), &mut refreshed_sources, false)
.await
.context("failed to build dependency graph")?;
let graph = Arc::new(graph);
let bin_folder = bin_dir().await?;
let downloaded_graph = {
let (rx, downloaded_graph) = project
.download_and_link(
&graph,
&Arc::new(Mutex::new(refreshed_sources)),
&reqwest,
self.prod,
true,
|graph| {
let graph = graph.clone();
async move {
try_join_all(
graph
.values()
.flat_map(|versions| versions.values())
.filter(|node| node.target.bin_path().is_some())
.filter_map(|node| node.node.direct.as_ref())
.map(|(alias, _, _)| alias)
.filter(|alias| {
if *alias == env!("CARGO_BIN_NAME") {
tracing::warn!(
"package {alias} has the same name as the CLI, skipping bin link"
);
return false;
}
true
})
.map(|alias| {
let bin_folder = bin_folder.clone();
async move {
let bin_exec_file = bin_folder.join(alias).with_extension(std::env::consts::EXE_EXTENSION);
let impl_folder = bin_folder.join(".impl");
fs::create_dir_all(&impl_folder).await.context("failed to create bin link folder")?;
let bin_file = impl_folder.join(alias).with_extension("luau");
fs::write(&bin_file, bin_link_file(alias))
.await
.context("failed to write bin link file")?;
#[cfg(windows)]
{
fs::copy(
std::env::current_exe()
.context("failed to get current executable path")?,
&bin_exec_file,
)
.await
.context("failed to copy bin link file")?;
}
#[cfg(not(windows))]
{
fs::write(
&bin_exec_file,
format!(r#"#!/bin/sh
exec lune run "$(dirname "$0")/.impl/{alias}.luau" -- "$@""#
),
)
.await
.context("failed to link bin link file")?;
}
make_executable(&bin_exec_file).await.context("failed to make bin link file executable")?;
Ok::<_, CallbackError>(())
}
}),
)
.await
.map(|_| ())
}
}
)
.await
.context("failed to download dependencies")?;
progress_bar(
graph.values().map(|versions| versions.len() as u64).sum(),
rx,
format!("{} 📥 ", job(3)),
"downloading dependencies".to_string(),
"downloaded dependencies".to_string(),
)
.await?;
downloaded_graph
.await
.context("failed to download & link dependencies")?
};
#[cfg(feature = "patches")]
{
let rx = project
.apply_patches(&filter_graph(&downloaded_graph, self.prod))
.await
.context("failed to apply patches")?;
progress_bar(
manifest.patches.values().map(|v| v.len() as u64).sum(),
rx,
format!("{} 🩹 ", job(JOBS - 1)),
"applying patches".to_string(),
"applied patches".to_string(),
)
.await?;
}
println!("{} 🧹 finishing up", job(JOBS));
project
.write_lockfile(Lockfile {
name: manifest.name,
version: manifest.version,
target: manifest.target.kind(),
overrides: manifest.overrides,
graph: downloaded_graph,
workspace: run_on_workspace_members(&project, |project| {
let reqwest = reqwest.clone();
async move { Box::pin(self.run(project, reqwest)).await }
})
.await?,
})
.await
.context("failed to write lockfile")?;
Ok(())
}
}

src/cli/commands/mod.rs Normal file

@@ -0,0 +1,96 @@
use pesde::Project;
mod add;
mod auth;
mod config;
mod execute;
mod init;
mod install;
mod outdated;
#[cfg(feature = "patches")]
mod patch;
#[cfg(feature = "patches")]
mod patch_commit;
mod publish;
mod run;
#[cfg(feature = "version-management")]
mod self_install;
#[cfg(feature = "version-management")]
mod self_upgrade;
mod update;
#[derive(Debug, clap::Subcommand)]
pub enum Subcommand {
/// Authentication-related commands
Auth(auth::AuthSubcommand),
/// Configuration-related commands
#[command(subcommand)]
Config(config::ConfigCommands),
/// Initializes a manifest file in the current directory
Init(init::InitCommand),
/// Runs a script, an executable package, or a file with Lune
Run(run::RunCommand),
/// Installs all dependencies for the project
Install(install::InstallCommand),
/// Publishes the project to the registry
Publish(publish::PublishCommand),
/// Installs the pesde binary and scripts
#[cfg(feature = "version-management")]
SelfInstall(self_install::SelfInstallCommand),
/// Sets up a patching environment for a package
#[cfg(feature = "patches")]
Patch(patch::PatchCommand),
/// Finalizes a patching environment for a package
#[cfg(feature = "patches")]
PatchCommit(patch_commit::PatchCommitCommand),
/// Installs the latest version of pesde
#[cfg(feature = "version-management")]
SelfUpgrade(self_upgrade::SelfUpgradeCommand),
/// Adds a dependency to the project
Add(add::AddCommand),
/// Updates the project's lockfile. Run install to apply changes
Update(update::UpdateCommand),
/// Checks for outdated dependencies
Outdated(outdated::OutdatedCommand),
/// Executes a binary package without needing to be run in a project directory
#[clap(name = "x", visible_alias = "execute", visible_alias = "exec")]
Execute(execute::ExecuteCommand),
}
impl Subcommand {
pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
match self {
Subcommand::Auth(auth) => auth.run(project, reqwest).await,
Subcommand::Config(config) => config.run().await,
Subcommand::Init(init) => init.run(project).await,
Subcommand::Run(run) => run.run(project).await,
Subcommand::Install(install) => install.run(project, reqwest).await,
Subcommand::Publish(publish) => publish.run(project, reqwest).await,
#[cfg(feature = "version-management")]
Subcommand::SelfInstall(self_install) => self_install.run().await,
#[cfg(feature = "patches")]
Subcommand::Patch(patch) => patch.run(project, reqwest).await,
#[cfg(feature = "patches")]
Subcommand::PatchCommit(patch_commit) => patch_commit.run(project).await,
#[cfg(feature = "version-management")]
Subcommand::SelfUpgrade(self_upgrade) => self_upgrade.run(reqwest).await,
Subcommand::Add(add) => add.run(project).await,
Subcommand::Update(update) => update.run(project, reqwest).await,
Subcommand::Outdated(outdated) => outdated.run(project).await,
Subcommand::Execute(execute) => execute.run(project, reqwest).await,
}
}
}

src/cli/commands/outdated.rs Normal file

@@ -0,0 +1,136 @@
use crate::cli::up_to_date_lockfile;
use anyhow::Context;
use clap::Args;
use futures::future::try_join_all;
use pesde::{
refresh_sources,
source::{
refs::PackageRefs,
specifiers::DependencySpecifiers,
traits::{PackageRef, PackageSource},
},
Project,
};
use semver::VersionReq;
use std::{collections::HashSet, sync::Arc};
use tokio::sync::Mutex;
#[derive(Debug, Args)]
pub struct OutdatedCommand {
/// Whether to check within version requirements
#[arg(short, long)]
strict: bool,
}
impl OutdatedCommand {
pub async fn run(self, project: Project) -> anyhow::Result<()> {
let graph = match up_to_date_lockfile(&project).await? {
Some(file) => file.graph,
None => {
anyhow::bail!(
"lockfile is out of sync, run `{} install` to update it",
env!("CARGO_BIN_NAME")
);
}
};
let manifest = project
.deser_manifest()
.await
.context("failed to read manifest")?;
let manifest_target_kind = manifest.target.kind();
let mut refreshed_sources = HashSet::new();
refresh_sources(
&project,
graph
.iter()
.flat_map(|(_, versions)| versions.iter())
.map(|(_, node)| node.node.pkg_ref.source()),
&mut refreshed_sources,
)
.await?;
let refreshed_sources = Arc::new(Mutex::new(refreshed_sources));
if try_join_all(
graph
.into_iter()
.flat_map(|(_, versions)| versions.into_iter())
.map(|(current_version_id, node)| {
let project = project.clone();
let refreshed_sources = refreshed_sources.clone();
async move {
let Some((alias, mut specifier, _)) = node.node.direct else {
return Ok::<bool, anyhow::Error>(true);
};
if matches!(
specifier,
DependencySpecifiers::Git(_) | DependencySpecifiers::Workspace(_)
) {
return Ok(true);
}
let source = node.node.pkg_ref.source();
if !self.strict {
match specifier {
DependencySpecifiers::Pesde(ref mut spec) => {
spec.version = VersionReq::STAR;
}
#[cfg(feature = "wally-compat")]
DependencySpecifiers::Wally(ref mut spec) => {
spec.version = VersionReq::STAR;
}
DependencySpecifiers::Git(_) => {}
DependencySpecifiers::Workspace(_) => {}
};
}
let version_id = source
.resolve(
&specifier,
&project,
manifest_target_kind,
&mut *refreshed_sources.lock().await,
)
.await
.context("failed to resolve package versions")?
.1
.pop_last()
.map(|(v_id, _)| v_id)
.context(format!("no versions of {specifier} found"))?;
if version_id != current_version_id {
println!(
"{} {} ({alias}) {} -> {}",
match node.node.pkg_ref {
PackageRefs::Pesde(pkg_ref) => pkg_ref.name.to_string(),
#[cfg(feature = "wally-compat")]
PackageRefs::Wally(pkg_ref) => pkg_ref.name.to_string(),
_ => unreachable!(),
},
current_version_id.target(),
current_version_id.version(),
version_id.version()
);
return Ok(false);
}
Ok(true)
}
}),
)
.await?
.into_iter()
.all(|b| b)
{
println!("all packages are up to date");
}
Ok(())
}
}

src/cli/commands/patch.rs Normal file

@@ -0,0 +1,79 @@
use crate::cli::{up_to_date_lockfile, VersionedPackageName};
use anyhow::Context;
use clap::Args;
use colored::Colorize;
use fs_err::tokio as fs;
use pesde::{
patches::setup_patches_repo,
source::{
refs::PackageRefs,
traits::{PackageRef, PackageSource},
},
Project, MANIFEST_FILE_NAME,
};
#[derive(Debug, Args)]
pub struct PatchCommand {
/// The package name to patch
#[arg(index = 1)]
package: VersionedPackageName,
}
impl PatchCommand {
pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
let graph = if let Some(lockfile) = up_to_date_lockfile(&project).await? {
lockfile.graph
} else {
anyhow::bail!("outdated lockfile, please run the install command first")
};
let (name, version_id) = self.package.get(&graph)?;
let node = graph
.get(&name)
.and_then(|versions| versions.get(&version_id))
.context("package not found in graph")?;
if matches!(node.node.pkg_ref, PackageRefs::Workspace(_)) {
anyhow::bail!("cannot patch a workspace package")
}
let source = node.node.pkg_ref.source();
let directory = project
.data_dir()
.join("patches")
.join(name.escaped())
.join(version_id.escaped())
.join(chrono::Utc::now().timestamp().to_string());
fs::create_dir_all(&directory).await?;
source
.download(&node.node.pkg_ref, &project, &reqwest)
.await?
.0
.write_to(&directory, project.cas_dir(), false)
.await
.context("failed to write package contents")?;
setup_patches_repo(&directory)?;
println!(
concat!(
"done! modify the files in the directory, then run `",
env!("CARGO_BIN_NAME"),
r#" patch-commit {}` to apply.
{}: do not commit these changes
{}: the {} file will be ignored when patching"#
),
directory.display().to_string().bold().cyan(),
"warning".yellow(),
"note".blue(),
MANIFEST_FILE_NAME
);
open::that(directory)?;
Ok(())
}
}

src/cli/commands/patch_commit.rs Normal file

@@ -0,0 +1,97 @@
use crate::cli::up_to_date_lockfile;
use anyhow::Context;
use clap::Args;
use fs_err::tokio as fs;
use pesde::{names::PackageNames, patches::create_patch, source::version_id::VersionId, Project};
use std::{path::PathBuf, str::FromStr};
#[derive(Debug, Args)]
pub struct PatchCommitCommand {
/// The directory containing the patch to commit
#[arg(index = 1)]
directory: PathBuf,
}
impl PatchCommitCommand {
pub async fn run(self, project: Project) -> anyhow::Result<()> {
let graph = if let Some(lockfile) = up_to_date_lockfile(&project).await? {
lockfile.graph
} else {
anyhow::bail!("outdated lockfile, please run the install command first")
};
let (name, version_id) = (
PackageNames::from_escaped(
self.directory
.parent()
.context("directory has no parent")?
.parent()
.context("directory has no grandparent")?
.file_name()
.context("directory grandparent has no name")?
.to_str()
.context("directory grandparent name is not valid")?,
)?,
VersionId::from_escaped(
self.directory
.parent()
.context("directory has no parent")?
.file_name()
.context("directory parent has no name")?
.to_str()
.context("directory parent name is not valid")?,
)?,
);
graph
.get(&name)
.and_then(|versions| versions.get(&version_id))
.context("package not found in graph")?;
let mut manifest = toml_edit::DocumentMut::from_str(
&project
.read_manifest()
.await
.context("failed to read manifest")?,
)
.context("failed to parse manifest")?;
let patch = create_patch(&self.directory).context("failed to create patch")?;
fs::remove_dir_all(self.directory)
.await
.context("failed to remove patch directory")?;
let patches_dir = project.package_dir().join("patches");
fs::create_dir_all(&patches_dir)
.await
.context("failed to create patches directory")?;
let patch_file_name = format!("{}-{}.patch", name.escaped(), version_id.escaped());
let patch_file = patches_dir.join(&patch_file_name);
if patch_file.exists() {
anyhow::bail!("patch file already exists: {}", patch_file.display());
}
fs::write(&patch_file, patch)
.await
.context("failed to write patch file")?;
manifest["patches"].or_insert(toml_edit::Item::Table(toml_edit::Table::new()))
[&name.to_string()][&version_id.to_string()] =
toml_edit::value(format!("patches/{patch_file_name}"));
project
.write_manifest(manifest.to_string())
.await
.context("failed to write manifest")?;
println!(concat!(
"done! run `",
env!("CARGO_BIN_NAME"),
" install` to apply the patch"
));
Ok(())
}
}
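The `patches` entry written to the manifest above ends up with roughly this shape. The package name, version, and target here are illustrative; the exact key and file-name escaping come from `PackageNames::escaped` and `VersionId::escaped`:

```toml
[patches]
"acme/foo" = { "1.2.3 luau" = "patches/acme+foo-1.2.3+luau.patch" }
```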

src/cli/commands/publish.rs Normal file

@@ -0,0 +1,680 @@
use crate::cli::{display_err, run_on_workspace_members, up_to_date_lockfile};
use anyhow::Context;
use async_compression::Level;
use clap::Args;
use colored::Colorize;
use fs_err::tokio as fs;
#[allow(deprecated)]
use pesde::{
manifest::{target::Target, DependencyType},
matching_globs_old_behaviour,
scripts::ScriptName,
source::{
git_index::GitBasedSource,
pesde::{specifier::PesdeDependencySpecifier, PesdePackageSource},
specifiers::DependencySpecifiers,
traits::PackageSource,
workspace::{
specifier::{VersionType, VersionTypeOrReq},
WorkspacePackageSource,
},
IGNORED_DIRS, IGNORED_FILES,
},
Project, DEFAULT_INDEX_NAME, MANIFEST_FILE_NAME,
};
use reqwest::{header::AUTHORIZATION, StatusCode};
use semver::VersionReq;
use std::{collections::HashSet, path::PathBuf};
use tempfile::Builder;
use tokio::io::{AsyncSeekExt, AsyncWriteExt};
#[derive(Debug, Args, Clone)]
pub struct PublishCommand {
/// Whether to output a tarball instead of publishing
#[arg(short, long)]
dry_run: bool,
/// Agree to all prompts
#[arg(short, long)]
yes: bool,
/// The index to publish to
#[arg(short, long, default_value_t = DEFAULT_INDEX_NAME.to_string())]
index: String,
}
impl PublishCommand {
async fn run_impl(
self,
project: &Project,
reqwest: reqwest::Client,
is_root: bool,
) -> anyhow::Result<()> {
let mut manifest = project
.deser_manifest()
.await
.context("failed to read manifest")?;
println!(
"\n{}\n",
format!("[now publishing {} {}]", manifest.name, manifest.target)
.bold()
.on_bright_black()
);
if manifest.private {
if !is_root {
println!("{}", "package is private, cannot publish".red().bold());
}
return Ok(());
}
if manifest.target.lib_path().is_none()
&& manifest.target.bin_path().is_none()
&& manifest.target.scripts().is_none_or(|s| s.is_empty())
{
anyhow::bail!("no exports found in target");
}
if matches!(
manifest.target,
Target::Roblox { .. } | Target::RobloxServer { .. }
) {
if manifest.target.build_files().is_none_or(|f| f.is_empty()) {
anyhow::bail!("no build files found in target");
}
match up_to_date_lockfile(project).await? {
Some(lockfile) => {
if lockfile
.graph
.values()
.flatten()
.filter_map(|(_, node)| node.node.direct.as_ref().map(|_| node))
.any(|node| {
node.target.build_files().is_none()
&& !matches!(node.node.resolved_ty, DependencyType::Dev)
})
{
anyhow::bail!("roblox packages may not depend on non-roblox packages");
}
}
None => {
anyhow::bail!("outdated lockfile, please run the install command first")
}
}
}
let canonical_package_dir = project
.package_dir()
.canonicalize()
.context("failed to canonicalize package directory")?;
let mut archive = tokio_tar::Builder::new(
async_compression::tokio::write::GzipEncoder::with_quality(vec![], Level::Best),
);
let mut display_build_files: Vec<String> = vec![];
let (lib_path, bin_path, scripts, target_kind) = (
manifest.target.lib_path().cloned(),
manifest.target.bin_path().cloned(),
manifest.target.scripts().cloned(),
manifest.target.kind(),
);
let mut roblox_target = match &mut manifest.target {
Target::Roblox { build_files, .. } => Some(build_files),
Target::RobloxServer { build_files, .. } => Some(build_files),
_ => None,
};
#[allow(deprecated)]
let mut paths = matching_globs_old_behaviour(
project.package_dir(),
manifest.includes.iter().map(|s| s.as_str()),
true,
)
.await
.context("failed to get included files")?;
if paths.insert(PathBuf::from(MANIFEST_FILE_NAME)) {
println!(
"{}: {MANIFEST_FILE_NAME} was not included, adding it",
"warn".yellow().bold()
);
}
if paths.iter().any(|p| p.starts_with(".git")) {
anyhow::bail!("git directory was included, please remove it");
}
if !paths.iter().any(|f| {
matches!(
f.to_str().unwrap().to_lowercase().as_str(),
"readme" | "readme.md" | "readme.txt"
)
}) {
println!(
"{}: no README file included, consider adding one",
"warn".yellow().bold()
);
}
if !paths.iter().any(|p| p.starts_with("docs")) {
println!(
"{}: docs directory not included, consider adding one",
"warn".yellow().bold()
);
}
for path in &paths {
if path
.file_name()
.is_some_and(|n| n == "default.project.json")
{
anyhow::bail!(
"default.project.json was included at `{}`, this should be generated by the {} script upon dependants installation",
path.display(),
ScriptName::RobloxSyncConfigGenerator
);
}
}
for ignored_path in IGNORED_FILES.iter().chain(IGNORED_DIRS.iter()) {
if paths.iter().any(|p| {
p.components()
.any(|ct| ct == std::path::Component::Normal(ignored_path.as_ref()))
}) {
anyhow::bail!(
r#"forbidden file {ignored_path} was included.
info: if this was a toolchain manager's manifest file, do not include it due to it possibly messing with user scripts
info: otherwise, the file was deemed unnecessary, if you don't understand why, please contact the maintainers"#,
);
}
}
for (name, path) in [("lib path", lib_path), ("bin path", bin_path)] {
let Some(relative_export_path) = path else {
continue;
};
let export_path = relative_export_path.to_path(&canonical_package_dir);
let contents = match fs::read_to_string(&export_path).await {
Ok(contents) => contents,
Err(e) if e.kind() == std::io::ErrorKind::NotFound => {
anyhow::bail!("{name} does not exist");
}
Err(e) if e.kind() == std::io::ErrorKind::IsADirectory => {
anyhow::bail!("{name} must point to a file");
}
Err(e) => {
return Err(e).context(format!("failed to read {name}"));
}
};
let export_path = export_path
.canonicalize()
.context(format!("failed to canonicalize {name}"))?;
if let Err(err) = full_moon::parse(&contents).map_err(|errs| {
errs.into_iter()
.map(|err| err.to_string())
.collect::<Vec<_>>()
.join(", ")
}) {
anyhow::bail!("{name} is not a valid Luau file: {err}");
}
let first_part = relative_export_path
.components()
.next()
.context(format!("{name} must contain at least one part"))?;
let first_part = match first_part {
relative_path::Component::Normal(part) => part,
_ => anyhow::bail!("{name} must be within project directory"),
};
if paths.insert(
export_path
.strip_prefix(&canonical_package_dir)
.unwrap()
.to_path_buf(),
) {
println!(
"{}: {name} was not included, adding {relative_export_path}",
"warn".yellow().bold()
);
}
if roblox_target
.as_mut()
.is_some_and(|build_files| build_files.insert(first_part.to_string()))
{
println!(
"{}: {name} was not in build files, adding {first_part}",
"warn".yellow().bold()
);
}
}
if let Some(build_files) = &roblox_target {
for build_file in build_files.iter() {
if build_file.eq_ignore_ascii_case(MANIFEST_FILE_NAME) {
println!(
"{}: {MANIFEST_FILE_NAME} is in build files, please remove it",
"warn".yellow().bold()
);
continue;
}
let build_file_path = project.package_dir().join(build_file);
if !build_file_path.exists() {
anyhow::bail!("build file {build_file} does not exist");
}
if !paths.iter().any(|p| p.starts_with(build_file)) {
anyhow::bail!("build file {build_file} is not included, please add it");
}
if build_file_path.is_file() {
display_build_files.push(build_file.clone());
} else {
display_build_files.push(format!("{build_file}/*"));
}
}
}
if let Some(scripts) = scripts {
for (name, path) in scripts {
let script_path = path.to_path(&canonical_package_dir);
let contents = match fs::read_to_string(&script_path).await {
Ok(contents) => contents,
Err(e) if e.kind() == std::io::ErrorKind::NotFound => {
anyhow::bail!("script {name} does not exist");
}
Err(e) if e.kind() == std::io::ErrorKind::IsADirectory => {
anyhow::bail!("script {name} must point to a file");
}
Err(e) => {
return Err(e).context(format!("failed to read script {name}"));
}
};
let script_path = script_path
.canonicalize()
.context(format!("failed to canonicalize script {name}"))?;
if let Err(err) = full_moon::parse(&contents).map_err(|errs| {
errs.into_iter()
.map(|err| err.to_string())
.collect::<Vec<_>>()
.join(", ")
}) {
anyhow::bail!("script {name} is not a valid Luau file: {err}");
}
if paths.insert(
script_path
.strip_prefix(&canonical_package_dir)
.unwrap()
.to_path_buf(),
) {
println!(
"{}: script {name} was not included, adding {path}",
"warn".yellow().bold()
);
}
}
}
for relative_path in &paths {
let path = project.package_dir().join(relative_path);
if !path.exists() {
anyhow::bail!("included file `{}` does not exist", path.display());
}
let file_name = relative_path
.file_name()
.context("failed to get file name")?
.to_string_lossy()
.to_string();
// it'll be included later after transformations, and is guaranteed to be a file
if file_name.eq_ignore_ascii_case(MANIFEST_FILE_NAME) {
continue;
}
if path.is_file() {
archive
.append_file(
&relative_path,
fs::File::open(&path)
.await
.context(format!("failed to read `{}`", relative_path.display()))?
.file_mut(),
)
.await?;
}
}
for specifier in manifest
.dependencies
.values_mut()
.chain(manifest.dev_dependencies.values_mut())
.chain(manifest.peer_dependencies.values_mut())
{
match specifier {
DependencySpecifiers::Pesde(specifier) => {
let index_name = specifier
.index
.as_deref()
.unwrap_or(DEFAULT_INDEX_NAME)
.to_string();
specifier.index = Some(
manifest
.indices
.get(&index_name)
.context(format!("index {index_name} not found in indices field"))?
.to_string(),
);
}
#[cfg(feature = "wally-compat")]
DependencySpecifiers::Wally(specifier) => {
let index_name = specifier
.index
.as_deref()
.unwrap_or(DEFAULT_INDEX_NAME)
.to_string();
specifier.index = Some(
manifest
.wally_indices
.get(&index_name)
.context(format!(
"index {index_name} not found in wally_indices field"
))?
.to_string(),
);
}
DependencySpecifiers::Git(_) => {}
DependencySpecifiers::Workspace(spec) => {
let pkg_ref = WorkspacePackageSource
.resolve(spec, project, target_kind, &mut HashSet::new())
.await
.context("failed to resolve workspace package")?
.1
.pop_last()
.context("no versions found for workspace package")?
.1;
let manifest = pkg_ref
.path
.to_path(
project
.workspace_dir()
.context("failed to get workspace directory")?,
)
.join(MANIFEST_FILE_NAME);
let manifest = fs::read_to_string(&manifest)
.await
.context("failed to read workspace package manifest")?;
let manifest = toml::from_str::<pesde::manifest::Manifest>(&manifest)
.context("failed to parse workspace package manifest")?;
*specifier = DependencySpecifiers::Pesde(PesdeDependencySpecifier {
name: spec.name.clone(),
version: match spec.version.clone() {
VersionTypeOrReq::VersionType(VersionType::Wildcard) => {
VersionReq::STAR
}
VersionTypeOrReq::Req(r) => r,
v => VersionReq::parse(&format!("{v}{}", manifest.version))
.context(format!("failed to parse version for {v}"))?,
},
index: Some(
manifest
.indices
.get(DEFAULT_INDEX_NAME)
.context("missing default index in workspace package manifest")?
.to_string(),
),
target: Some(spec.target.unwrap_or(manifest.target.kind())),
});
}
}
}
{
println!("\n{}", "please confirm the following information:".bold());
println!("name: {}", manifest.name);
println!("version: {}", manifest.version);
println!(
"description: {}",
manifest.description.as_deref().unwrap_or("(none)")
);
println!(
"license: {}",
manifest.license.as_deref().unwrap_or("(none)")
);
println!(
"authors: {}",
if manifest.authors.is_empty() {
"(none)".to_string()
} else {
manifest.authors.join(", ")
}
);
println!(
"repository: {}",
manifest
.repository
.as_ref()
.map(|r| r.as_str())
.unwrap_or("(none)")
);
let roblox_target = roblox_target.is_some();
println!("target: {}", manifest.target);
println!(
"\tlib path: {}",
manifest
.target
.lib_path()
.map_or("(none)".to_string(), |p| p.to_string())
);
if roblox_target {
println!("\tbuild files: {}", display_build_files.join(", "));
} else {
println!(
"\tbin path: {}",
manifest
.target
.bin_path()
.map_or("(none)".to_string(), |p| p.to_string())
);
println!(
"\tscripts: {}",
manifest
.target
.scripts()
.filter(|s| !s.is_empty())
.map_or("(none)".to_string(), |s| {
s.keys().cloned().collect::<Vec<_>>().join(", ")
})
);
}
println!(
"includes: {}",
paths
.into_iter()
.map(|p| p.to_string_lossy().to_string())
.collect::<Vec<_>>()
.join(", ")
);
if !self.dry_run
&& !self.yes
&& !inquire::Confirm::new("is this information correct?").prompt()?
{
println!("\n{}", "publish aborted".red().bold());
return Ok(());
}
println!();
}
let temp_path = Builder::new().make(|_| Ok(()))?.into_temp_path();
let mut temp_manifest = fs::OpenOptions::new()
.create(true)
.write(true)
.truncate(true)
.read(true)
.open(temp_path.to_path_buf())
.await?;
temp_manifest
.write_all(
toml::to_string(&manifest)
.context("failed to serialize manifest")?
.as_bytes(),
)
.await
.context("failed to write temp manifest file")?;
temp_manifest
.rewind()
.await
.context("failed to rewind temp manifest file")?;
archive
.append_file(MANIFEST_FILE_NAME, temp_manifest.file_mut())
.await?;
let mut encoder = archive
.into_inner()
.await
.context("failed to finish archive")?;
encoder
.shutdown()
.await
.context("failed to finish archive")?;
let archive = encoder.into_inner();
let index_url = manifest
.indices
.get(&self.index)
.context(format!("missing index {}", self.index))?;
let source = PesdePackageSource::new(index_url.clone());
PackageSource::refresh(&source, project)
.await
.context("failed to refresh source")?;
let config = source
.config(project)
.await
.context("failed to get source config")?;
if archive.len() > config.max_archive_size {
anyhow::bail!(
"archive size exceeds maximum size of {} bytes by {} bytes",
config.max_archive_size,
archive.len() - config.max_archive_size
);
}
let deps = manifest.all_dependencies().context("dependency conflict")?;
if let Some((disallowed, _)) = deps.iter().find(|(_, (spec, _))| match spec {
DependencySpecifiers::Pesde(spec) => {
!config.other_registries_allowed.is_allowed_or_same(
source.repo_url().clone(),
gix::Url::try_from(spec.index.as_deref().unwrap()).unwrap(),
)
}
DependencySpecifiers::Git(spec) => !config.git_allowed.is_allowed(spec.repo.clone()),
#[cfg(feature = "wally-compat")]
DependencySpecifiers::Wally(spec) => !config
.wally_allowed
.is_allowed(gix::Url::try_from(spec.index.as_deref().unwrap()).unwrap()),
_ => false,
}) {
anyhow::bail!("dependency `{disallowed}` is not allowed on this index");
}
if self.dry_run {
fs::write("package.tar.gz", archive).await?;
println!(
"{}",
"(dry run) package written to package.tar.gz".green().bold()
);
return Ok(());
}
let mut request = reqwest
.post(format!("{}/v0/packages", config.api()))
.body(archive);
if let Some(token) = project.auth_config().tokens().get(index_url) {
tracing::debug!("using token for {index_url}");
request = request.header(AUTHORIZATION, token);
}
let response = request.send().await.context("failed to send request")?;
let status = response.status();
let text = response
.text()
.await
.context("failed to get response text")?;
match status {
StatusCode::CONFLICT => {
println!("{}", "package version already exists".red().bold());
}
StatusCode::FORBIDDEN => {
println!(
"{}",
"unauthorized to publish under this scope".red().bold()
);
}
StatusCode::BAD_REQUEST => {
println!("{}: {text}", "invalid package".red().bold());
}
code if !code.is_success() => {
anyhow::bail!("failed to publish package: {code} ({text})");
}
_ => {
println!("{text}");
}
}
Ok(())
}
pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
let result = self.clone().run_impl(&project, reqwest.clone(), true).await;
if project.workspace_dir().is_some() {
return result;
} else {
display_err(result, " occurred publishing workspace root");
}
run_on_workspace_members(&project, |project| {
let reqwest = reqwest.clone();
let this = self.clone();
async move { this.run_impl(&project, reqwest, false).await }
})
.await
.map(|_| ())
}
}

src/cli/commands/run.rs (new file, +177)

@@ -0,0 +1,177 @@
use crate::cli::up_to_date_lockfile;
use anyhow::Context;
use clap::Args;
use futures::{StreamExt, TryStreamExt};
use pesde::{
linking::generator::generate_bin_linking_module,
names::{PackageName, PackageNames},
Project, MANIFEST_FILE_NAME, PACKAGES_CONTAINER_NAME,
};
use relative_path::RelativePathBuf;
use std::{
collections::HashSet, env::current_dir, ffi::OsString, io::Write, path::PathBuf,
process::Command,
};
#[derive(Debug, Args)]
pub struct RunCommand {
/// The package name, script name, or path to a script to run
#[arg(index = 1)]
package_or_script: Option<String>,
/// Arguments to pass to the script
#[arg(index = 2, last = true)]
args: Vec<OsString>,
}
impl RunCommand {
pub async fn run(self, project: Project) -> anyhow::Result<()> {
let run = |root: PathBuf, file_path: PathBuf| {
let mut caller = tempfile::NamedTempFile::new().expect("failed to create tempfile");
caller
.write_all(
generate_bin_linking_module(
root,
&format!("{:?}", file_path.to_string_lossy()),
)
.as_bytes(),
)
.expect("failed to write to tempfile");
let status = Command::new("lune")
.arg("run")
.arg(caller.path())
.arg("--")
.args(&self.args)
.current_dir(current_dir().expect("failed to get current directory"))
.status()
.expect("failed to run script");
drop(caller);
std::process::exit(status.code().unwrap_or(1))
};
let Some(package_or_script) = self.package_or_script else {
if let Some(script_path) = project.deser_manifest().await?.target.bin_path() {
run(
project.package_dir().to_owned(),
script_path.to_path(project.package_dir()),
);
return Ok(());
}
anyhow::bail!("no package or script specified, and no bin path found in manifest")
};
if let Ok(pkg_name) = package_or_script.parse::<PackageName>() {
let graph = if let Some(lockfile) = up_to_date_lockfile(&project).await? {
lockfile.graph
} else {
anyhow::bail!("outdated lockfile, please run the install command first")
};
let pkg_name = PackageNames::Pesde(pkg_name);
for (version_id, node) in graph.get(&pkg_name).context("package not found in graph")? {
if node.node.direct.is_none() {
continue;
}
let Some(bin_path) = node.target.bin_path() else {
anyhow::bail!("package has no bin path");
};
let base_folder = project
.deser_manifest()
.await?
.target
.kind()
.packages_folder(version_id.target());
let container_folder = node.node.container_folder(
&project
.package_dir()
.join(base_folder)
.join(PACKAGES_CONTAINER_NAME),
&pkg_name,
version_id.version(),
);
let path = bin_path.to_path(&container_folder);
run(path.clone(), path);
return Ok(());
}
}
if let Ok(manifest) = project.deser_manifest().await {
if let Some(script_path) = manifest.scripts.get(&package_or_script) {
run(
project.package_dir().to_path_buf(),
script_path.to_path(project.package_dir()),
);
return Ok(());
}
};
let relative_path = RelativePathBuf::from(package_or_script);
let path = relative_path.to_path(project.package_dir());
if !path.exists() {
anyhow::bail!("path `{}` does not exist", path.display());
}
let workspace_dir = project
.workspace_dir()
.unwrap_or_else(|| project.package_dir());
let members = match project.workspace_members(workspace_dir, false).await {
Ok(members) => members.boxed(),
Err(pesde::errors::WorkspaceMembersError::ManifestMissing(e))
if e.kind() == std::io::ErrorKind::NotFound =>
{
futures::stream::empty().boxed()
}
Err(e) => Err(e).context("failed to get workspace members")?,
};
let members = members
.map(|res| {
res.map_err(anyhow::Error::from)
.and_then(|(path, _)| path.canonicalize().map_err(Into::into))
})
.chain(futures::stream::once(async {
workspace_dir.canonicalize().map_err(Into::into)
}))
.try_collect::<HashSet<_>>()
.await
.context("failed to collect workspace members")?;
let root = 'finder: {
let mut current_path = path.to_path_buf();
loop {
let canonical_path = current_path
.canonicalize()
.context("failed to canonicalize parent")?;
if members.contains(&canonical_path)
&& canonical_path.join(MANIFEST_FILE_NAME).exists()
{
break 'finder canonical_path;
}
if let Some(parent) = current_path.parent() {
current_path = parent.to_path_buf();
} else {
break;
}
}
project.package_dir().to_path_buf()
};
run(root, path);
Ok(())
}
}


@@ -0,0 +1,77 @@
use crate::cli::{version::update_bin_exe, HOME_DIR};
use anyhow::Context;
use clap::Args;
use colored::Colorize;
use std::env::current_exe;
#[derive(Debug, Args)]
pub struct SelfInstallCommand {
/// Skip adding the bin directory to the PATH
#[cfg(windows)]
#[arg(short, long)]
skip_add_to_path: bool,
}
impl SelfInstallCommand {
pub async fn run(self) -> anyhow::Result<()> {
#[cfg(windows)]
{
if !self.skip_add_to_path {
use anyhow::Context;
use winreg::{enums::HKEY_CURRENT_USER, RegKey};
let current_user = RegKey::predef(HKEY_CURRENT_USER);
let env = current_user
.create_subkey("Environment")
.context("failed to open Environment key")?
.0;
let path: String = env.get_value("Path").context("failed to get Path value")?;
let bin_dir = crate::cli::bin_dir().await?;
let bin_dir = bin_dir.to_string_lossy();
let exists = path.split(';').any(|part| *part == bin_dir);
if !exists {
let new_path = format!("{path};{bin_dir}");
env.set_value("Path", &new_path)
.context("failed to set Path value")?;
println!(
"\nin order to allow binary exports as executables {}.\n\n{}",
format!("`~/{HOME_DIR}/bin` was added to PATH").green(),
"please restart your shell for this to take effect"
.yellow()
.bold()
);
}
}
println!(
"installed {} {}!",
env!("CARGO_BIN_NAME").cyan(),
env!("CARGO_PKG_VERSION").yellow(),
);
}
#[cfg(unix)]
{
println!(
r#"installed {} {}! add the following line to your shell profile in order to get the binary and binary exports as executables usable from anywhere:
{}
and then restart your shell.
"#,
env!("CARGO_BIN_NAME").cyan(),
env!("CARGO_PKG_VERSION").yellow(),
format!(r#"export PATH="$PATH:~/{HOME_DIR}/bin""#)
.bold()
.green()
);
}
update_bin_exe(&current_exe().context("failed to get current exe path")?).await?;
Ok(())
}
}


@@ -0,0 +1,58 @@
use crate::cli::{
config::read_config,
version::{
current_version, get_or_download_version, get_remote_version, no_build_metadata,
update_bin_exe, TagInfo, VersionType,
},
};
use anyhow::Context;
use clap::Args;
use colored::Colorize;
#[derive(Debug, Args)]
pub struct SelfUpgradeCommand {
/// Whether to use the version from the "upgrades available" message
#[clap(long, default_value_t = false)]
use_cached: bool,
}
impl SelfUpgradeCommand {
pub async fn run(self, reqwest: reqwest::Client) -> anyhow::Result<()> {
let latest_version = if self.use_cached {
read_config()
.await?
.last_checked_updates
.context("no cached version found")?
.1
} else {
get_remote_version(&reqwest, VersionType::Latest).await?
};
let latest_version_no_metadata = no_build_metadata(&latest_version);
if latest_version_no_metadata <= current_version() {
println!("already up to date");
return Ok(());
}
let display_latest_version = latest_version_no_metadata.to_string().yellow().bold();
if !inquire::prompt_confirmation(format!(
"are you sure you want to upgrade {} from {} to {display_latest_version}?",
env!("CARGO_BIN_NAME").cyan(),
env!("CARGO_PKG_VERSION").yellow().bold()
))? {
println!("cancelled upgrade");
return Ok(());
}
let path = get_or_download_version(&reqwest, &TagInfo::Complete(latest_version), true)
.await?
.unwrap();
update_bin_exe(&path).await?;
println!("upgraded to version {display_latest_version}!");
Ok(())
}
}


@@ -0,0 +1,85 @@
use crate::cli::{progress_bar, run_on_workspace_members};
use anyhow::Context;
use clap::Args;
use colored::Colorize;
use pesde::{lockfile::Lockfile, Project};
use std::{collections::HashSet, sync::Arc};
use tokio::sync::Mutex;
#[derive(Debug, Args, Copy, Clone)]
pub struct UpdateCommand {}
impl UpdateCommand {
pub async fn run(self, project: Project, reqwest: reqwest::Client) -> anyhow::Result<()> {
let mut refreshed_sources = HashSet::new();
let manifest = project
.deser_manifest()
.await
.context("failed to read manifest")?;
println!(
"\n{}\n",
format!("[now updating {} {}]", manifest.name, manifest.target)
.bold()
.on_bright_black()
);
let graph = project
.dependency_graph(None, &mut refreshed_sources, false)
.await
.context("failed to build dependency graph")?;
let graph = Arc::new(graph);
project
.write_lockfile(Lockfile {
name: manifest.name,
version: manifest.version,
target: manifest.target.kind(),
overrides: manifest.overrides,
graph: {
let (rx, downloaded_graph) = project
.download_and_link(
&graph,
&Arc::new(Mutex::new(refreshed_sources)),
&reqwest,
false,
false,
|_| async { Ok::<_, std::io::Error>(()) },
)
.await
.context("failed to download dependencies")?;
progress_bar(
graph.values().map(|versions| versions.len() as u64).sum(),
rx,
"📥 ".to_string(),
"downloading dependencies".to_string(),
"downloaded dependencies".to_string(),
)
.await?;
downloaded_graph
.await
.context("failed to download dependencies")?
},
workspace: run_on_workspace_members(&project, |project| {
let reqwest = reqwest.clone();
async move { Box::pin(self.run(project, reqwest)).await }
})
.await?,
})
.await
.context("failed to write lockfile")?;
println!(
"\n\n{}. run `{} install` in order to install the new dependencies",
"✅ done".green(),
env!("CARGO_BIN_NAME")
);
Ok(())
}
}


@@ -1,42 +1,57 @@
-use std::path::PathBuf;
-
-use clap::Subcommand;
-
-use crate::{cli::CLI_CONFIG, CliConfig};
-
-#[derive(Subcommand, Clone)]
-pub enum ConfigCommand {
-    /// Sets the cache directory
-    SetCacheDir {
-        /// The directory to use as the cache directory
-        #[clap(value_name = "DIRECTORY")]
-        directory: Option<PathBuf>,
-    },
-    /// Gets the cache directory
-    GetCacheDir,
-}
-
-pub fn config_command(cmd: ConfigCommand) -> anyhow::Result<()> {
-    match cmd {
-        ConfigCommand::SetCacheDir { directory } => {
-            let cli_config = CliConfig {
-                cache_dir: directory,
-            };
-
-            cli_config.write()?;
-
-            println!(
-                "cache directory set to: `{}`",
-                cli_config.cache_dir().display()
-            );
-        }
-        ConfigCommand::GetCacheDir => {
-            println!(
-                "current cache directory: `{}`",
-                CLI_CONFIG.cache_dir().display()
-            );
-        }
-    }
-
-    Ok(())
-}
+use crate::cli::{auth::Tokens, home_dir};
+use anyhow::Context;
+use fs_err::tokio as fs;
+use serde::{Deserialize, Serialize};
+use tracing::instrument;
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(default)]
+pub struct CliConfig {
+    #[serde(
+        serialize_with = "crate::util::serialize_gix_url",
+        deserialize_with = "crate::util::deserialize_gix_url"
+    )]
+    pub default_index: gix::Url,
+
+    pub tokens: Tokens,
+
+    #[serde(default, skip_serializing_if = "Option::is_none")]
+    pub last_checked_updates: Option<(chrono::DateTime<chrono::Utc>, semver::Version)>,
+}
+
+impl Default for CliConfig {
+    fn default() -> Self {
+        Self {
+            default_index: "https://github.com/pesde-pkg/index".try_into().unwrap(),
+            tokens: Tokens(Default::default()),
+            last_checked_updates: None,
+        }
+    }
+}
+
+#[instrument(level = "trace")]
+pub async fn read_config() -> anyhow::Result<CliConfig> {
+    let config_string = match fs::read_to_string(home_dir()?.join("config.toml")).await {
+        Ok(config_string) => config_string,
+        Err(e) if e.kind() == std::io::ErrorKind::NotFound => {
+            return Ok(CliConfig::default());
+        }
+        Err(e) => return Err(e).context("failed to read config file"),
+    };
+
+    let config = toml::from_str(&config_string).context("failed to parse config file")?;
+
+    Ok(config)
+}
+
+#[instrument(level = "trace")]
+pub async fn write_config(config: &CliConfig) -> anyhow::Result<()> {
+    let config_string = toml::to_string(config).context("failed to serialize config")?;
+
+    fs::write(home_dir()?.join("config.toml"), config_string)
+        .await
+        .context("failed to write config file")?;
+
+    Ok(())
+}

src/cli/files.rs (new file, +21)

@@ -0,0 +1,21 @@
use std::path::Path;
pub async fn make_executable<P: AsRef<Path>>(_path: P) -> anyhow::Result<()> {
#[cfg(unix)]
{
use anyhow::Context;
use fs_err::tokio as fs;
use std::os::unix::fs::PermissionsExt;
let mut perms = fs::metadata(&_path)
.await
.context("failed to get bin link file metadata")?
.permissions();
perms.set_mode(perms.mode() | 0o111);
fs::set_permissions(&_path, perms)
.await
.context("failed to set bin link file permissions")?;
}
Ok(())
}


@@ -1,293 +1,303 @@
use crate::cli::{api_token::API_TOKEN_SOURCE, auth::AuthCommand, config::ConfigCommand};
use auth_git2::GitAuthenticator;
use clap::{Parser, Subcommand};
use directories::ProjectDirs;
use indicatif::MultiProgress;
use indicatif_log_bridge::LogWrapper;
use log::error;
use once_cell::sync::Lazy;
use anyhow::Context;
use colored::Colorize;
use fs_err::tokio as fs;
use futures::StreamExt;
use pesde::{
index::{GitIndex, Index},
manifest::{Manifest, Realm},
package_name::{PackageName, StandardPackageName},
project::DEFAULT_INDEX_NAME,
lockfile::Lockfile,
manifest::target::TargetKind,
names::{PackageName, PackageNames},
source::{version_id::VersionId, workspace::specifier::VersionTypeOrReq},
Project,
};
use pretty_env_logger::env_logger::Env;
use reqwest::{
blocking::{RequestBuilder, Response},
header::ACCEPT,
};
use semver::{Version, VersionReq};
use serde::{Deserialize, Serialize};
use relative_path::RelativePathBuf;
use std::{
hash::{DefaultHasher, Hash, Hasher},
collections::{BTreeMap, HashSet},
future::Future,
path::PathBuf,
str::FromStr,
time::Duration,
};
use tokio::pin;
use tracing::instrument;
pub mod api_token;
pub mod auth;
pub mod commands;
pub mod config;
pub mod root;
pub mod files;
#[cfg(feature = "version-management")]
pub mod version;
pub const HOME_DIR: &str = concat!(".", env!("CARGO_PKG_NAME"));
pub fn home_dir() -> anyhow::Result<PathBuf> {
Ok(dirs::home_dir()
.context("failed to get home directory")?
.join(HOME_DIR))
}
pub async fn bin_dir() -> anyhow::Result<PathBuf> {
let bin_dir = home_dir()?.join("bin");
fs::create_dir_all(&bin_dir)
.await
.context("failed to create bin folder")?;
Ok(bin_dir)
}
#[instrument(skip(project), ret(level = "trace"), level = "debug")]
pub async fn up_to_date_lockfile(project: &Project) -> anyhow::Result<Option<Lockfile>> {
let manifest = project.deser_manifest().await?;
let lockfile = match project.deser_lockfile().await {
Ok(lockfile) => lockfile,
Err(pesde::errors::LockfileReadError::Io(e))
if e.kind() == std::io::ErrorKind::NotFound =>
{
return Ok(None);
}
Err(e) => return Err(e.into()),
};
if manifest.overrides != lockfile.overrides {
tracing::debug!("overrides are different");
return Ok(None);
}
if manifest.target.kind() != lockfile.target {
tracing::debug!("target kind is different");
return Ok(None);
}
if manifest.name != lockfile.name || manifest.version != lockfile.version {
tracing::debug!("name or version is different");
return Ok(None);
}
let specs = lockfile
.graph
.iter()
.flat_map(|(_, versions)| versions)
.filter_map(|(_, node)| {
node.node
.direct
.as_ref()
.map(|(_, spec, source_ty)| (spec, source_ty))
})
.collect::<HashSet<_>>();
let same_dependencies = manifest
.all_dependencies()
.context("failed to get all dependencies")?
.iter()
.all(|(_, (spec, ty))| specs.contains(&(spec, ty)));
tracing::debug!("dependencies are the same: {same_dependencies}");
Ok(if same_dependencies {
Some(lockfile)
} else {
None
})
}
 #[derive(Debug, Clone)]
-pub struct VersionedPackageName<V: FromStr<Err = semver::Error>>(PackageName, V);
+struct VersionedPackageName<V: FromStr = VersionId, N: FromStr = PackageNames>(N, Option<V>);

-impl<V: FromStr<Err = semver::Error>> FromStr for VersionedPackageName<V> {
+impl<V: FromStr<Err = E>, E: Into<anyhow::Error>, N: FromStr<Err = F>, F: Into<anyhow::Error>>
+    FromStr for VersionedPackageName<V, N>
+{
     type Err = anyhow::Error;

     fn from_str(s: &str) -> Result<Self, Self::Err> {
-        let (name, version) = s.split_once('@').ok_or_else(|| {
-            anyhow::anyhow!("invalid package name: {s}; expected format: name@version")
-        })?;
+        let mut parts = s.splitn(2, '@');
+        let name = parts.next().unwrap();
+        let version = parts
+            .next()
+            .map(FromStr::from_str)
+            .transpose()
+            .map_err(Into::into)?;

         Ok(VersionedPackageName(
-            name.to_string().parse()?,
-            version.parse()?,
+            name.parse().map_err(Into::into)?,
+            version,
         ))
     }
 }
#[derive(Subcommand, Clone)]
pub enum Command {
/// Initializes a manifest file
Init,
impl VersionedPackageName {
#[cfg(feature = "patches")]
fn get(
self,
graph: &pesde::lockfile::DownloadedGraph,
) -> anyhow::Result<(PackageNames, VersionId)> {
let version_id = match self.1 {
Some(version) => version,
None => {
let versions = graph.get(&self.0).context("package not found in graph")?;
if versions.len() == 1 {
let version = versions.keys().next().unwrap().clone();
tracing::debug!("only one version found, using {version}");
version
} else {
anyhow::bail!(
"multiple versions found, please specify one of: {}",
versions
.keys()
.map(|v| v.to_string())
.collect::<Vec<_>>()
.join(", ")
);
}
}
};
/// Adds a package to the manifest
Add {
/// The package to add
#[clap(value_name = "PACKAGE")]
package: VersionedPackageName<VersionReq>,
/// Whether the package is a peer dependency
#[clap(long, short)]
peer: bool,
/// The realm of the package
#[clap(long, short)]
realm: Option<Realm>,
},
/// Removes a package from the manifest
Remove {
/// The package to remove
#[clap(value_name = "PACKAGE")]
package: PackageName,
},
/// Lists outdated packages
Outdated,
/// Installs the dependencies of the project
Install {
/// Whether to use the lockfile for resolving dependencies
#[clap(long, short)]
locked: bool,
},
/// Runs the `bin` export of the specified package
Run {
/// The package to run
#[clap(value_name = "PACKAGE")]
package: StandardPackageName,
/// The arguments to pass to the package
#[clap(last = true)]
args: Vec<String>,
},
/// Searches for a package on the registry
Search {
/// The query to search for
#[clap(value_name = "QUERY")]
query: Option<String>,
},
/// Publishes the project to the registry
Publish,
/// Converts a `wally.toml` file to a `pesde.yaml` file
#[cfg(feature = "wally")]
Convert,
/// Begins a new patch
Patch {
/// The package to patch
#[clap(value_name = "PACKAGE")]
package: VersionedPackageName<Version>,
},
/// Commits (finishes) the patch
PatchCommit {
/// The package's changed directory
#[clap(value_name = "DIRECTORY")]
dir: PathBuf,
},
/// Auth-related commands
Auth {
#[clap(subcommand)]
command: AuthCommand,
},
/// Config-related commands
Config {
#[clap(subcommand)]
command: ConfigCommand,
},
}
#[derive(Parser, Clone)]
#[clap(version = env!("CARGO_PKG_VERSION"))]
pub struct Cli {
#[clap(subcommand)]
pub command: Command,
/// The directory to run the command in
#[arg(short, long, value_name = "DIRECTORY")]
pub directory: Option<PathBuf>,
}
#[derive(Serialize, Deserialize, Clone, Default)]
pub struct CliConfig {
pub cache_dir: Option<PathBuf>,
}
impl CliConfig {
pub fn cache_dir(&self) -> PathBuf {
self.cache_dir
.clone()
.unwrap_or_else(|| DIRS.cache_dir().to_path_buf())
Ok((self.0, version_id))
}
}
pub fn open() -> anyhow::Result<Self> {
let cli_config_path = DIRS.config_dir().join("config.yaml");
#[derive(Debug, Clone)]
enum AnyPackageIdentifier<V: FromStr = VersionId, N: FromStr = PackageNames> {
PackageName(VersionedPackageName<V, N>),
Url((gix::Url, String)),
Workspace(VersionedPackageName<VersionTypeOrReq, PackageName>),
}
if cli_config_path.exists() {
Ok(serde_yaml::from_slice(&std::fs::read(cli_config_path)?)?)
impl<V: FromStr<Err = E>, E: Into<anyhow::Error>, N: FromStr<Err = F>, F: Into<anyhow::Error>>
FromStr for AnyPackageIdentifier<V, N>
{
type Err = anyhow::Error;
fn from_str(s: &str) -> Result<Self, Self::Err> {
if let Some(s) = s.strip_prefix("gh#") {
let s = format!("https://github.com/{s}");
let (repo, rev) = s.split_once('#').context("missing revision")?;
Ok(AnyPackageIdentifier::Url((
repo.try_into()?,
rev.to_string(),
)))
} else if let Some(rest) = s.strip_prefix("workspace:") {
Ok(AnyPackageIdentifier::Workspace(rest.parse()?))
} else if s.contains(':') {
let (url, rev) = s.split_once('#').context("missing revision")?;
Ok(AnyPackageIdentifier::Url((
url.try_into()?,
rev.to_string(),
)))
} else {
let config = CliConfig::default();
config.write()?;
Ok(config)
}
}
pub fn write(&self) -> anyhow::Result<()> {
let cli_config_path = DIRS.config_dir().join("config.yaml");
serde_yaml::to_writer(
&mut std::fs::File::create(cli_config_path.as_path())?,
&self,
)?;
Ok(())
}
}
pub fn send_request(request_builder: RequestBuilder) -> anyhow::Result<Response> {
let res = request_builder.send()?;
match res.error_for_status_ref() {
Ok(_) => Ok(res),
Err(e) => {
error!("request failed: {e}\nbody: {}", res.text()?);
Err(e.into())
Ok(AnyPackageIdentifier::PackageName(s.parse()?))
}
}
}
pub static CLI: Lazy<Cli> = Lazy::new(Cli::parse);
pub fn parse_gix_url(s: &str) -> Result<gix::Url, gix::url::parse::Error> {
s.try_into()
}
pub static DIRS: Lazy<ProjectDirs> = Lazy::new(|| {
ProjectDirs::from("com", env!("CARGO_PKG_NAME"), env!("CARGO_BIN_NAME"))
.expect("couldn't get home directory")
});
pub async fn progress_bar<E: std::error::Error + Into<anyhow::Error>>(
len: u64,
mut rx: tokio::sync::mpsc::Receiver<Result<String, E>>,
prefix: String,
progress_msg: String,
finish_msg: String,
) -> anyhow::Result<()> {
let bar = indicatif::ProgressBar::new(len)
.with_style(
indicatif::ProgressStyle::default_bar()
.template("{prefix}[{elapsed_precise}] {bar:40.208/166} {pos}/{len} {msg}")?
.progress_chars("█▓▒░ "),
)
.with_prefix(prefix)
.with_message(progress_msg);
bar.enable_steady_tick(Duration::from_millis(100));
pub static CLI_CONFIG: Lazy<CliConfig> = Lazy::new(|| CliConfig::open().unwrap());
while let Some(result) = rx.recv().await {
bar.inc(1);
pub static CWD: Lazy<PathBuf> = Lazy::new(|| {
CLI.directory
.clone()
.or(std::env::current_dir().ok())
.expect("couldn't get current directory")
});
pub static REQWEST_CLIENT: Lazy<reqwest::blocking::Client> = Lazy::new(|| {
let mut header_map = reqwest::header::HeaderMap::new();
header_map.insert(ACCEPT, "application/json".parse().unwrap());
header_map.insert("X-GitHub-Api-Version", "2022-11-28".parse().unwrap());
if let Ok(Some(token)) = API_TOKEN_SOURCE.get_api_token() {
header_map.insert(
reqwest::header::AUTHORIZATION,
format!("Bearer {token}").parse().unwrap(),
);
match result {
Ok(text) => {
bar.set_message(text);
}
Err(e) => return Err(e.into()),
}
}
reqwest::blocking::Client::builder()
.user_agent(concat!(
env!("CARGO_PKG_NAME"),
"/",
env!("CARGO_PKG_VERSION")
))
.default_headers(header_map)
.build()
.unwrap()
});
bar.finish_with_message(finish_msg);
pub static MULTI: Lazy<MultiProgress> = Lazy::new(|| {
let logger = pretty_env_logger::formatted_builder()
.parse_env(Env::default().default_filter_or("info"))
.build();
let multi = MultiProgress::new();
LogWrapper::new(multi.clone(), logger).try_init().unwrap();
multi
});
pub const DEFAULT_INDEX_URL: &str = "https://github.com/daimond113/pesde-index";
#[cfg(feature = "wally")]
pub const DEFAULT_WALLY_INDEX_URL: &str = "https://github.com/UpliftGames/wally-index";
pub fn index_dir(url: &str) -> PathBuf {
let mut hasher = DefaultHasher::new();
url.hash(&mut hasher);
let hash = hasher.finish().to_string();
CLI_CONFIG
.cache_dir()
.join("indices")
.join(hash)
.join("index")
Ok(())
}
pub fn clone_index(url: &str) -> GitIndex {
let index = GitIndex::new(
index_dir(url),
&url.parse().unwrap(),
Some(Box::new(|| {
Box::new(|a, b, c| {
let git_authenticator = GitAuthenticator::new();
let config = git2::Config::open_default().unwrap();
let mut cred = git_authenticator.credentials(&config);
cred(a, b, c)
})
})),
API_TOKEN_SOURCE.get_api_token().unwrap(),
);
index.refresh().unwrap();
index
pub fn shift_project_dir(project: &Project, pkg_dir: PathBuf) -> Project {
Project::new(
pkg_dir,
Some(project.package_dir()),
project.data_dir(),
project.cas_dir(),
project.auth_config().clone(),
)
}
pub static DEFAULT_INDEX_DATA: Lazy<(PathBuf, String)> = Lazy::new(|| {
let manifest = Manifest::from_path(CWD.to_path_buf())
.map(|m| m.indices.get(DEFAULT_INDEX_NAME).unwrap().clone());
let url = &manifest.unwrap_or(DEFAULT_INDEX_URL.to_string());
pub async fn run_on_workspace_members<F: Future<Output = anyhow::Result<()>>>(
project: &Project,
f: impl Fn(Project) -> F,
) -> anyhow::Result<BTreeMap<PackageName, BTreeMap<TargetKind, RelativePathBuf>>> {
// this might seem counterintuitive, but remember that
// the presence of a workspace dir means that this project is a member of one
if project.workspace_dir().is_some() {
return Ok(Default::default());
}
(index_dir(url), url.clone())
});
let members_future = project
.workspace_members(project.package_dir(), true)
.await?;
pin!(members_future);
pub static DEFAULT_INDEX: Lazy<GitIndex> = Lazy::new(|| clone_index(&DEFAULT_INDEX_DATA.1));
let mut results = BTreeMap::<PackageName, BTreeMap<TargetKind, RelativePathBuf>>::new();
while let Some((path, manifest)) = members_future.next().await.transpose()? {
let relative_path =
RelativePathBuf::from_path(path.strip_prefix(project.package_dir()).unwrap()).unwrap();
// don't run on the current workspace root
if relative_path != "" {
f(shift_project_dir(project, path)).await?;
}
results
.entry(manifest.name)
.or_default()
.insert(manifest.target.kind(), relative_path);
}
Ok(results)
}
pub fn display_err(result: anyhow::Result<()>, prefix: &str) {
if let Err(err) = result {
eprintln!("{}: {err}\n", format!("error{prefix}").red().bold());
let cause = err.chain().skip(1).collect::<Vec<_>>();
if !cause.is_empty() {
eprintln!("{}:", "caused by".red().bold());
for err in cause {
eprintln!(" - {err}");
}
}
let backtrace = err.backtrace();
match backtrace.status() {
std::backtrace::BacktraceStatus::Disabled => {
eprintln!(
"\n{}: set RUST_BACKTRACE=1 for a backtrace",
"help".yellow().bold()
);
}
std::backtrace::BacktraceStatus::Captured => {
eprintln!("\n{}:\n{backtrace}", "backtrace".yellow().bold());
}
_ => {
eprintln!("\n{}: not captured", "backtrace".yellow().bold());
}
}
}
}


@@ -1,573 +0,0 @@
use cfg_if::cfg_if;
use chrono::Utc;
use std::{
collections::{BTreeMap, HashMap},
fs::{create_dir_all, read, remove_dir_all, write, File},
str::FromStr,
time::Duration,
};
use flate2::{write::GzEncoder, Compression};
use futures_executor::block_on;
use ignore::{overrides::OverrideBuilder, WalkBuilder};
use inquire::{validator::Validation, Select, Text};
use log::debug;
use lune::Runtime;
use once_cell::sync::Lazy;
use reqwest::{header::AUTHORIZATION, Url};
use semver::Version;
use serde_json::Value;
use tar::Builder as TarBuilder;
use pesde::{
dependencies::{registry::RegistryDependencySpecifier, DependencySpecifier, PackageRef},
index::Index,
manifest::{Manifest, PathStyle, Realm},
multithread::MultithreadedJob,
package_name::{PackageName, StandardPackageName},
patches::{create_patch, setup_patches_repo},
project::{InstallOptions, Project, DEFAULT_INDEX_NAME},
DEV_PACKAGES_FOLDER, IGNORED_FOLDERS, MANIFEST_FILE_NAME, PACKAGES_FOLDER, PATCHES_FOLDER,
SERVER_PACKAGES_FOLDER,
};
use crate::cli::{
clone_index, send_request, Command, CLI_CONFIG, CWD, DEFAULT_INDEX, DEFAULT_INDEX_URL, DIRS,
MULTI, REQWEST_CLIENT,
};
pub const MAX_ARCHIVE_SIZE: usize = 4 * 1024 * 1024;
fn multithreaded_bar<E: Send + Sync + Into<anyhow::Error> + 'static>(
job: MultithreadedJob<E>,
len: u64,
message: String,
) -> Result<(), anyhow::Error> {
let bar = MULTI.add(
indicatif::ProgressBar::new(len)
.with_style(
indicatif::ProgressStyle::default_bar()
.template("{msg} {bar:40.208/166} {pos}/{len} {percent}% {elapsed_precise}")?,
)
.with_message(message),
);
bar.enable_steady_tick(Duration::from_millis(100));
while let Ok(result) = job.progress().recv() {
result.map_err(Into::into)?;
bar.inc(1);
}
bar.finish_with_message("done");
Ok(())
}
macro_rules! none_if_empty {
($s:expr) => {
if $s.is_empty() {
None
} else {
Some($s)
}
};
}
pub fn root_command(cmd: Command) -> anyhow::Result<()> {
let mut project: Lazy<Project> = Lazy::new(|| {
let manifest = Manifest::from_path(CWD.to_path_buf()).unwrap();
let indices = manifest
.indices
.clone()
.into_iter()
.map(|(k, v)| (k, Box::new(clone_index(&v)) as Box<dyn Index>))
.collect::<HashMap<_, _>>();
Project::new(CWD.to_path_buf(), CLI_CONFIG.cache_dir(), indices, manifest).unwrap()
});
match cmd {
Command::Install { locked } => {
for packages_folder in &[PACKAGES_FOLDER, DEV_PACKAGES_FOLDER, SERVER_PACKAGES_FOLDER] {
if let Err(e) = remove_dir_all(CWD.join(packages_folder)) {
if e.kind() != std::io::ErrorKind::NotFound {
return Err(e.into());
} else {
debug!("no {packages_folder} folder found, skipping removal");
}
};
}
let manifest = project.manifest().clone();
let lockfile = manifest.dependency_graph(&mut project, locked)?;
let download_job = project.download(&lockfile)?;
multithreaded_bar(
download_job,
lockfile.children.len() as u64,
"Downloading packages".to_string(),
)?;
#[allow(unused_variables)]
project.convert_manifests(&lockfile, |path| {
cfg_if! {
if #[cfg(feature = "wally")] {
if let Some(sourcemap_generator) = &manifest.sourcemap_generator {
cfg_if! {
if #[cfg(target_os = "windows")] {
std::process::Command::new("pwsh")
.args(["-C", &sourcemap_generator])
.current_dir(path)
.output()
.expect("failed to execute process");
} else {
std::process::Command::new("sh")
.args(["-c", &sourcemap_generator])
.current_dir(path)
.output()
.expect("failed to execute process");
}
}
}
}
}
})?;
let project = Lazy::force_mut(&mut project);
project.install(
InstallOptions::new()
.locked(locked)
.auto_download(false)
.lockfile(lockfile),
)?;
}
Command::Run { package, args } => {
let lockfile = project
.lockfile()?
.ok_or(anyhow::anyhow!("lockfile not found"))?;
let resolved_pkg = lockfile
.children
.get(&package.into())
.and_then(|versions| {
versions
.values()
.find(|pkg_ref| lockfile.root_specifier(pkg_ref).is_some())
})
.ok_or(anyhow::anyhow!(
"package not found in lockfile (or isn't root)"
))?;
let pkg_path = resolved_pkg.directory(project.path()).1;
let manifest = Manifest::from_path(&pkg_path)?;
let Some(bin_path) = manifest.exports.bin else {
anyhow::bail!("no bin found in package");
};
let absolute_bin_path = bin_path.to_path(pkg_path);
let mut runtime = Runtime::new().with_args(args);
block_on(runtime.run(
resolved_pkg.pkg_ref.name().to_string(),
&read(absolute_bin_path)?,
))?;
}
Command::Search { query } => {
let config = DEFAULT_INDEX.config()?;
let api_url = config.api();
let response = send_request(REQWEST_CLIENT.get(Url::parse_with_params(
&format!("{}/v0/search", api_url),
&query.map(|q| vec![("query", q)]).unwrap_or_default(),
)?))?
.json::<Value>()?;
for package in response.as_array().unwrap() {
println!(
"{}@{}{}",
package["name"].as_str().unwrap(),
package["version"].as_str().unwrap(),
package["description"]
.as_str()
.map(|d| if d.is_empty() {
d.to_string()
} else {
format!("\n{}\n", d)
})
.unwrap_or_default()
);
}
}
Command::Publish => {
if project.manifest().private {
anyhow::bail!("package is private, cannot publish");
}
let encoder = GzEncoder::new(vec![], Compression::default());
let mut archive = TarBuilder::new(encoder);
let cwd = &CWD.to_path_buf();
let mut walk_builder = WalkBuilder::new(cwd);
walk_builder.add_custom_ignore_filename(".pesdeignore");
let mut overrides = OverrideBuilder::new(cwd);
for packages_folder in IGNORED_FOLDERS {
overrides.add(&format!("!{}", packages_folder))?;
}
walk_builder.overrides(overrides.build()?);
for entry in walk_builder.build() {
let entry = entry?;
let path = entry.path();
let relative_path = path.strip_prefix(cwd)?;
let entry_type = entry
.file_type()
.ok_or(anyhow::anyhow!("failed to get file type"))?;
if relative_path.as_os_str().is_empty() {
continue;
}
if entry_type.is_file() {
archive.append_path_with_name(path, relative_path)?;
} else if entry_type.is_dir() {
archive.append_dir(relative_path, path)?;
}
}
let archive = archive.into_inner()?.finish()?;
if archive.len() > MAX_ARCHIVE_SIZE {
anyhow::bail!(
"archive is too big ({} bytes), max {MAX_ARCHIVE_SIZE}. aborting...",
archive.len()
);
}
let part = reqwest::blocking::multipart::Part::bytes(archive)
.file_name("tarball.tar.gz")
.mime_str("application/gzip")?;
let index = project.indices().get(DEFAULT_INDEX_NAME).unwrap();
let mut request = REQWEST_CLIENT
.post(format!("{}/v0/packages", index.config()?.api()))
.multipart(reqwest::blocking::multipart::Form::new().part("tarball", part));
if let Some(token) = index.registry_auth_token() {
request = request.header(AUTHORIZATION, format!("Bearer {token}"));
} else {
request = request.header(AUTHORIZATION, "");
}
println!("{}", send_request(request)?.text()?);
}
Command::Patch { package } => {
let lockfile = project
.lockfile()?
.ok_or(anyhow::anyhow!("lockfile not found"))?;
let resolved_pkg = lockfile
.children
.get(&package.0)
.and_then(|versions| versions.get(&package.1))
.ok_or(anyhow::anyhow!("package not found in lockfile"))?;
let dir = DIRS
.data_dir()
.join("patches")
.join(package.0.escaped())
.join(Utc::now().timestamp().to_string());
if dir.exists() {
anyhow::bail!(
"patch already exists. remove the directory {} to create a new patch",
dir.display()
);
}
create_dir_all(&dir)?;
let project = Lazy::force_mut(&mut project);
let url = resolved_pkg.pkg_ref.resolve_url(project)?;
let index = project.indices().get(DEFAULT_INDEX_NAME).unwrap();
resolved_pkg.pkg_ref.download(
&REQWEST_CLIENT,
index.registry_auth_token().map(|t| t.to_string()),
url.as_ref(),
index.credentials_fn().cloned(),
&dir,
)?;
match &resolved_pkg.pkg_ref {
PackageRef::Git(_) => {}
_ => {
setup_patches_repo(&dir)?;
}
}
println!("done! modify the files in {} and run `{} patch-commit <DIRECTORY>` to commit the changes", dir.display(), env!("CARGO_BIN_NAME"));
}
Command::PatchCommit { dir } => {
let name = dir
.parent()
.and_then(|p| p.file_name())
.and_then(|f| f.to_str())
.unwrap();
let manifest = Manifest::from_path(&dir)?;
let patch_path = project.path().join(PATCHES_FOLDER);
create_dir_all(&patch_path)?;
let patch_path = patch_path.join(format!("{name}@{}.patch", manifest.version));
if patch_path.exists() {
anyhow::bail!(
"patch already exists. remove the file {} to create a new patch",
patch_path.display()
);
}
let patches = create_patch(&dir)?;
write(&patch_path, patches)?;
remove_dir_all(&dir)?;
println!(
"done! to apply the patch, run `{} install`",
env!("CARGO_BIN_NAME")
);
}
Command::Init => {
let manifest_path = CWD.join(MANIFEST_FILE_NAME);
if manifest_path.exists() {
anyhow::bail!("manifest already exists");
}
let default_name = CWD.file_name().and_then(|s| s.to_str());
let mut name =
Text::new("What is the name of the package?").with_validator(|name: &str| {
Ok(match StandardPackageName::from_str(name) {
Ok(_) => Validation::Valid,
Err(e) => Validation::Invalid(e.into()),
})
});
if let Some(name_str) = default_name {
name = name.with_initial_value(name_str);
}
let name = name.prompt()?;
let path_style =
Select::new("What style of paths do you want to use?", vec!["roblox"]).prompt()?;
let path_style = match path_style {
"roblox" => PathStyle::Roblox {
place: Default::default(),
},
_ => unreachable!(),
};
let description = Text::new("What is the description of the package?").prompt()?;
let license = Text::new("What is the license of the package?").prompt()?;
let authors = Text::new("Who are the authors of the package? (split using ;)")
.prompt()?
.split(';')
.map(|s| s.trim().to_string())
.filter(|s| !s.is_empty())
.collect::<Vec<String>>();
let repository = Text::new("What is the repository of the package?").prompt()?;
let private = Select::new("Is this package private?", vec!["yes", "no"]).prompt()?;
let private = private == "yes";
let realm = Select::new(
"What is the realm of the package?",
vec!["shared", "server", "dev"],
)
.prompt()?;
let realm = match realm {
"shared" => Realm::Shared,
"server" => Realm::Server,
"dev" => Realm::Development,
_ => unreachable!(),
};
let manifest = Manifest {
name: name.parse()?,
version: Version::parse("0.1.0")?,
exports: Default::default(),
path_style,
private,
realm: Some(realm),
indices: BTreeMap::from([(
DEFAULT_INDEX_NAME.to_string(),
DEFAULT_INDEX_URL.to_string(),
)]),
#[cfg(feature = "wally")]
sourcemap_generator: None,
overrides: Default::default(),
dependencies: Default::default(),
peer_dependencies: Default::default(),
description: none_if_empty!(description),
license: none_if_empty!(license),
authors: none_if_empty!(authors),
repository: none_if_empty!(repository),
};
serde_yaml::to_writer(File::create(manifest_path)?, &manifest)?;
}
Command::Add {
package,
realm,
peer,
} => {
let mut manifest = project.manifest().clone();
let specifier = match package.0.clone() {
PackageName::Standard(name) => {
DependencySpecifier::Registry(RegistryDependencySpecifier {
name,
version: package.1,
realm,
index: DEFAULT_INDEX_NAME.to_string(),
})
}
#[cfg(feature = "wally")]
PackageName::Wally(name) => DependencySpecifier::Wally(
pesde::dependencies::wally::WallyDependencySpecifier {
name,
version: package.1,
realm,
index_url: crate::cli::DEFAULT_WALLY_INDEX_URL.parse().unwrap(),
},
),
};
fn insert_into(
deps: &mut BTreeMap<String, DependencySpecifier>,
specifier: DependencySpecifier,
name: PackageName,
) {
macro_rules! not_taken {
($key:expr) => {
(!deps.contains_key(&$key)).then_some($key)
};
}
let key = not_taken!(name.name().to_string())
.or_else(|| not_taken!(format!("{}/{}", name.scope(), name.name())))
.or_else(|| not_taken!(name.to_string()))
.unwrap();
deps.insert(key, specifier);
}
if peer {
insert_into(
&mut manifest.peer_dependencies,
specifier,
package.0.clone(),
);
} else {
insert_into(&mut manifest.dependencies, specifier, package.0.clone());
}
serde_yaml::to_writer(
File::create(project.path().join(MANIFEST_FILE_NAME))?,
&manifest,
)?;
}
Command::Remove { package } => {
let mut manifest = project.manifest().clone();
for dependencies in [&mut manifest.dependencies, &mut manifest.peer_dependencies] {
dependencies.retain(|_, d| {
if let DependencySpecifier::Registry(registry) = d {
match &package {
PackageName::Standard(name) => &registry.name != name,
#[cfg(feature = "wally")]
PackageName::Wally(_) => true,
}
} else {
cfg_if! {
if #[cfg(feature = "wally")] {
#[allow(clippy::collapsible_else_if)]
if let DependencySpecifier::Wally(wally) = d {
match &package {
PackageName::Standard(_) => true,
PackageName::Wally(name) => &wally.name != name,
}
} else {
true
}
} else {
true
}
}
}
});
}
serde_yaml::to_writer(
File::create(project.path().join(MANIFEST_FILE_NAME))?,
&manifest,
)?;
}
Command::Outdated => {
let project = Lazy::force_mut(&mut project);
let manifest = project.manifest().clone();
let lockfile = manifest.dependency_graph(project, false)?;
for (name, versions) in &lockfile.children {
for (version, resolved_pkg) in versions {
if lockfile.root_specifier(resolved_pkg).is_none() {
continue;
}
if let PackageRef::Registry(registry) = &resolved_pkg.pkg_ref {
let latest_version = send_request(REQWEST_CLIENT.get(format!(
"{}/v0/packages/{}/{}/versions",
resolved_pkg.pkg_ref.get_index(project).config()?.api(),
registry.name.scope(),
registry.name.name()
)))?
.json::<Value>()?
.as_array()
.and_then(|a| a.last())
.and_then(|v| v.as_str())
.and_then(|s| s.parse::<Version>().ok())
.ok_or(anyhow::anyhow!(
"failed to get latest version of {name}@{version}"
))?;
if &latest_version > version {
println!(
"{name}@{version} is outdated. latest version: {latest_version}"
);
}
}
}
}
}
#[cfg(feature = "wally")]
Command::Convert => {
Manifest::from_path_or_convert(CWD.to_path_buf())?;
}
_ => unreachable!(),
}
Ok(())
}
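The `insert_into` helper in `Command::Add` picks the shortest alias that does not collide with an existing dependency key: the bare name first, then `scope/name`, then the full identifier. The fallback chain can be sketched standalone (a hypothetical plain-string version, not the actual `PackageName` API):

```rust
use std::collections::BTreeMap;

// Hypothetical standalone version of the key-selection fallback in
// `insert_into`: try the bare name, then "scope/name", then the full
// identifier, and take the first candidate not already used as a key.
// Panics (like the original `unwrap`) if every candidate is taken.
fn pick_key(deps: &BTreeMap<String, String>, scope: &str, name: &str, full: &str) -> String {
    [name.to_string(), format!("{scope}/{name}"), full.to_string()]
        .into_iter()
        .find(|k| !deps.contains_key(k))
        .expect("all candidate keys taken")
}

fn main() {
    let mut deps = BTreeMap::new();
    deps.insert("foo".to_string(), "1.0.0".to_string());
    // "foo" is taken, so the scoped form is chosen instead.
    println!("{}", pick_key(&deps, "acme", "foo", "acme/foo@1.0.0"));
}
```

Because the keys double as the names dependents use to require the package, preferring the shortest free alias keeps manifests terse while still avoiding collisions between same-named packages from different scopes.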

361
src/cli/version.rs Normal file

@@ -0,0 +1,361 @@
use crate::cli::{
bin_dir,
config::{read_config, write_config, CliConfig},
files::make_executable,
home_dir,
};
use anyhow::Context;
use colored::Colorize;
use fs_err::tokio as fs;
use futures::StreamExt;
use reqwest::header::ACCEPT;
use semver::Version;
use serde::Deserialize;
use std::{
env::current_exe,
path::{Path, PathBuf},
};
use tokio::io::AsyncWrite;
use tracing::instrument;
/// The version of the currently running binary, parsed from Cargo metadata.
pub fn current_version() -> Version {
Version::parse(env!("CARGO_PKG_VERSION")).unwrap()
}
#[derive(Debug, Deserialize)]
struct Release {
tag_name: String,
assets: Vec<Asset>,
}
#[derive(Debug, Deserialize)]
struct Asset {
name: String,
url: url::Url,
}
#[instrument(level = "trace")]
fn get_repo() -> (String, String) {
let mut parts = env!("CARGO_PKG_REPOSITORY").split('/').skip(3);
let (owner, repo) = (
parts.next().unwrap().to_string(),
parts.next().unwrap().to_string(),
);
tracing::trace!("repository for updates: {owner}/{repo}");
(owner, repo)
}
#[derive(Debug)]
pub enum VersionType {
Latest,
Specific(Version),
}
#[instrument(skip(reqwest), level = "trace")]
pub async fn get_remote_version(
reqwest: &reqwest::Client,
ty: VersionType,
) -> anyhow::Result<Version> {
let (owner, repo) = get_repo();
let mut releases = reqwest
.get(format!(
"https://api.github.com/repos/{owner}/{repo}/releases",
))
.send()
.await
.context("failed to send request to GitHub API")?
.error_for_status()
.context("failed to get GitHub API response")?
.json::<Vec<Release>>()
.await
.context("failed to parse GitHub API response")?
.into_iter()
.filter_map(|release| Version::parse(release.tag_name.trim_start_matches('v')).ok());
match ty {
VersionType::Latest => releases.max(),
VersionType::Specific(version) => {
releases.find(|v| no_build_metadata(v) == no_build_metadata(&version))
}
}
.context("failed to find latest version")
}
/// Returns a copy of `version` with any `+build` metadata stripped,
/// so versions can be compared while ignoring build identifiers.
pub fn no_build_metadata(version: &Version) -> Version {
let mut version = version.clone();
version.build = semver::BuildMetadata::EMPTY;
version
}
/// How long a cached update-check result stays valid before the GitHub API is queried again.
const CHECK_INTERVAL: chrono::Duration = chrono::Duration::hours(6);
#[instrument(skip(reqwest), level = "trace")]
pub async fn check_for_updates(reqwest: &reqwest::Client) -> anyhow::Result<()> {
let config = read_config().await?;
let version = if let Some((_, version)) = config
.last_checked_updates
.filter(|(time, _)| chrono::Utc::now() - *time < CHECK_INTERVAL)
{
tracing::debug!("using cached version");
version
} else {
tracing::debug!("checking for updates");
let version = get_remote_version(reqwest, VersionType::Latest).await?;
write_config(&CliConfig {
last_checked_updates: Some((chrono::Utc::now(), version.clone())),
..config
})
.await?;
version
};
let current_version = current_version();
let version_no_metadata = no_build_metadata(&version);
if version_no_metadata <= current_version {
return Ok(());
}
let name = env!("CARGO_BIN_NAME");
let changelog = format!("{}/releases/tag/v{version}", env!("CARGO_PKG_REPOSITORY"));
let unformatted_messages = [
"".to_string(),
format!("update available! {current_version} → {version_no_metadata}"),
format!("changelog: {changelog}"),
format!("run `{name} self-upgrade` to upgrade"),
"".to_string(),
];
let width = unformatted_messages
.iter()
.map(|s| s.chars().count())
.max()
.unwrap()
+ 4;
let column = "│".bright_magenta();
let message = [
"".to_string(),
format!(
"update available! {} → {}",
current_version.to_string().red(),
version_no_metadata.to_string().green()
),
format!("changelog: {}", changelog.blue()),
format!(
"run `{} {}` to upgrade",
name.blue(),
"self-upgrade".yellow()
),
"".to_string(),
]
.into_iter()
.enumerate()
.map(|(i, s)| {
let text_length = unformatted_messages[i].chars().count();
let padding = (width as f32 - text_length as f32) / 2f32;
let padding_l = " ".repeat(padding.floor() as usize);
let padding_r = " ".repeat(padding.ceil() as usize);
format!("{column}{padding_l}{s}{padding_r}{column}")
})
.collect::<Vec<_>>()
.join("\n");
let lines = "─".repeat(width).bright_magenta();
let tl = "╭".bright_magenta();
let tr = "╮".bright_magenta();
let bl = "╰".bright_magenta();
let br = "╯".bright_magenta();
println!("\n{tl}{lines}{tr}\n{message}\n{bl}{lines}{br}\n");
Ok(())
}
#[instrument(skip(reqwest, writer), level = "trace")]
pub async fn download_github_release<W: AsyncWrite + Unpin>(
reqwest: &reqwest::Client,
version: &Version,
mut writer: W,
) -> anyhow::Result<()> {
let (owner, repo) = get_repo();
let release = reqwest
.get(format!(
"https://api.github.com/repos/{owner}/{repo}/releases/tags/v{version}",
))
.send()
.await
.context("failed to send request to GitHub API")?
.error_for_status()
.context("failed to get GitHub API response")?
.json::<Release>()
.await
.context("failed to parse GitHub API response")?;
let asset = release
.assets
.into_iter()
.find(|asset| {
asset.name.ends_with(&format!(
"-{}-{}.tar.gz",
std::env::consts::OS,
std::env::consts::ARCH
))
})
.context("failed to find asset for current platform")?;
let bytes = reqwest
.get(asset.url)
.header(ACCEPT, "application/octet-stream")
.send()
.await
.context("failed to send request to download asset")?
.error_for_status()
.context("failed to download asset")?
.bytes()
.await
.context("failed to download asset")?;
let mut decoder = async_compression::tokio::bufread::GzipDecoder::new(bytes.as_ref());
let mut archive = tokio_tar::Archive::new(&mut decoder);
let mut entry = archive
.entries()
.context("failed to read archive entries")?
.next()
.await
.context("archive has no entry")?
.context("failed to get first archive entry")?;
tokio::io::copy(&mut entry, &mut writer)
.await
.context("failed to write archive entry to file")
.map(|_| ())
}
#[derive(Debug)]
pub enum TagInfo {
/// the version is fully known, including any build metadata
Complete(Version),
/// the version may be missing build metadata and is resolved
/// against the remote release list before downloading
Incomplete(Version),
}
/// Ensures the executable for `tag` exists in the versions directory,
/// downloading it from GitHub if necessary. Returns `None` when the
/// requested version is the one currently running, unless `always_give_path` is set.
#[instrument(skip(reqwest), level = "trace")]
pub async fn get_or_download_version(
reqwest: &reqwest::Client,
tag: &TagInfo,
always_give_path: bool,
) -> anyhow::Result<Option<PathBuf>> {
let path = home_dir()?.join("versions");
fs::create_dir_all(&path)
.await
.context("failed to create versions directory")?;
let version = match tag {
TagInfo::Complete(version) => version,
// don't fetch the version since it could be cached
TagInfo::Incomplete(version) => version,
};
let path = path.join(format!(
"{}{}",
no_build_metadata(version),
std::env::consts::EXE_SUFFIX
));
let is_requested_version = !always_give_path && *version == current_version();
if path.exists() {
tracing::debug!("version already exists");
return Ok(if is_requested_version {
None
} else {
Some(path)
});
}
if is_requested_version {
tracing::debug!("copying current executable to version directory");
fs::copy(current_exe()?, &path)
.await
.context("failed to copy current executable to version directory")?;
} else {
let version = match tag {
TagInfo::Complete(version) => version.clone(),
TagInfo::Incomplete(version) => {
get_remote_version(reqwest, VersionType::Specific(version.clone()))
.await
.context("failed to get remote version")?
}
};
tracing::debug!("downloading version");
download_github_release(
reqwest,
&version,
fs::File::create(&path)
.await
.context("failed to create version file")?,
)
.await?;
}
make_executable(&path)
.await
.context("failed to make downloaded version executable")?;
Ok(if is_requested_version {
None
} else {
Some(path)
})
}
/// Replaces the executable in the bin directory with `downloaded_file`,
/// first renaming the running executable out of the way on platforms
/// that cannot overwrite a file while it is in use.
#[instrument(level = "trace")]
pub async fn update_bin_exe(downloaded_file: &Path) -> anyhow::Result<()> {
let bin_exe_path = bin_dir().await?.join(format!(
"{}{}",
env!("CARGO_BIN_NAME"),
std::env::consts::EXE_SUFFIX
));
let mut downloaded_file = downloaded_file.to_path_buf();
let exists = bin_exe_path.exists();
if cfg!(target_os = "linux") && exists {
fs::remove_file(&bin_exe_path)
.await
.context("failed to remove existing executable")?;
} else if exists {
let tempfile = tempfile::Builder::new()
.make(|_| Ok(()))
.context("failed to create temporary file")?;
let path = tempfile.into_temp_path().to_path_buf();
#[cfg(windows)]
let path = path.with_extension("exe");
let current_exe = current_exe().context("failed to get current exe path")?;
if current_exe == downloaded_file {
downloaded_file = path.to_path_buf();
}
fs::rename(&bin_exe_path, &path)
.await
.context("failed to rename current executable")?;
}
fs::copy(downloaded_file, &bin_exe_path)
.await
.context("failed to copy executable to bin folder")?;
make_executable(&bin_exe_path).await
}
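The update banner in `check_for_updates` centers each message inside a fixed-width box, splitting the slack as floor/ceil between the left and right padding. The padding arithmetic can be sketched standalone (a minimal sketch assuming plain strings, without the `colored` styling):

```rust
// Minimal sketch of the centering used by the update banner: pad `s`
// to `width` columns, putting floor(slack/2) spaces on the left and
// the remainder on the right, framed by the box-drawing column glyph.
fn center(s: &str, width: usize) -> String {
    let len = s.chars().count(); // count chars, not bytes, as the original does
    let slack = width.saturating_sub(len);
    let left = slack / 2;
    let right = slack - left;
    format!("│{}{}{}│", " ".repeat(left), s, " ".repeat(right))
}

fn main() {
    println!("{}", center("update available!", 24));
}
```

Counting `chars()` rather than bytes matters here because the banner contains the non-ASCII `→` arrow; byte-length padding would misalign the box border.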

Some files were not shown because too many files have changed in this diff.