289 Commits
0.5 ... 1.3.3

Author SHA1 Message Date
Oscar Krause
e3745d7fa8 Merge branch 'dev' into 'main'
1.3.3

See merge request oscar.krause/fastapi-dls!21
2023-01-18 08:13:42 +01:00
Oscar Krause
5bb8f17679 improvements 2023-01-18 08:07:55 +01:00
Oscar Krause
de17b0f1b5 fixes 2023-01-18 08:03:02 +01:00
Oscar Krause
0ab5969d3a fixes 2023-01-18 06:56:16 +01:00
Oscar Krause
059a51fe74 refactored commands 2023-01-17 17:25:48 +01:00
Oscar Krause
bf858b38f4 fixes 2023-01-17 17:09:13 +01:00
Oscar Krause
f60f08d543 run powershell as administrator 2023-01-17 16:57:15 +01:00
Oscar Krause
b2e6fab294 fixes 2023-01-17 16:49:15 +01:00
Oscar Krause
b09bb091a5 bump version to 1.3.3 2023-01-17 16:29:32 +01:00
Oscar Krause
651af4cc82 fixed client-token url and added wget as an alternative to curl 2023-01-17 16:29:21 +01:00
Oscar Krause
70f7d3f483 mark Let's Encrypt section as optional 2023-01-17 15:36:38 +01:00
Oscar Krause
1e4070a1ba added remove "/usr/share/fastapi-dls" to "postrm" 2023-01-17 14:57:54 +01:00
Oscar Krause
d69d833923 migrated "[[ ]]" if statements to "[ ]" 2023-01-17 14:57:39 +01:00
Oscar Krause
7ef071f92b removed fastapi-dls.service from conffiles 2023-01-17 14:57:09 +01:00
Oscar Krause
3c19fc9d5b implemented "lease_renewal" attribute as a calculated value for the period of time within which the license must be renewed 2023-01-17 11:49:56 +01:00
Oscar Krause
164b5ebc44 Merge branch 'dev' into 'main'
1.3.2

See merge request oscar.krause/fastapi-dls!20
2023-01-17 11:36:23 +01:00
Oscar Krause
742fa07ed4 bump version to 1.3.2 2023-01-17 11:18:25 +01:00
Oscar Krause
a758d93970 main.py - fixed empty lease origin response 2023-01-17 11:18:07 +01:00
Oscar Krause
70250f1fca Merge branch 'dev' into 'main'
1.3.1

See merge request oscar.krause/fastapi-dls!19
2023-01-16 13:08:57 +01:00
Oscar Krause
a65687a082 bump version to 1.3.1 2023-01-16 10:34:20 +01:00
Oscar Krause
3e445c80aa fixes 2023-01-16 10:33:52 +01:00
Oscar Krause
20cc984799 FAQ.md 2023-01-16 10:30:55 +01:00
Oscar Krause
3495cc3af5 typos 2023-01-16 10:30:40 +01:00
Oscar Krause
ed13577e82 Dockerfile - updated to python 3.11 2023-01-16 10:30:21 +01:00
Oscar Krause
ca8a9df54c requirements.txt updated 2023-01-16 10:24:08 +01:00
Oscar Krause
5425eec545 .gitlab-ci.yml simplified 2023-01-16 10:23:58 +01:00
Oscar Krause
2f3c7d5433 Merge branch 'main' into dev 2023-01-16 07:00:20 +01:00
Oscar Krause
b551b0e7f9 README.md - added unsupported ubuntu version 2023-01-15 19:47:50 +01:00
Oscar Krause
50dea9ac4e fixes 2023-01-05 14:08:08 +01:00
Oscar Krause
549a48a10b Merge branch 'dev' into 'main'
1.3

See merge request oscar.krause/fastapi-dls!18
2023-01-05 07:27:10 +01:00
Oscar Krause
1f3bc8b4af .gitlab-ci.yml 2023-01-05 07:22:25 +01:00
Oscar Krause
5fc8d4091b Merge branch 'dev' into 'main'
1.3

See merge request oscar.krause/fastapi-dls!17
2023-01-05 07:21:28 +01:00
Oscar Krause
851ec1a5c6 requirements.txt updated 2023-01-05 06:56:56 +01:00
Oscar Krause
9180222169 README.md 2023-01-04 21:46:02 +01:00
Oscar Krause
e71d4c4f4e fixed missing service file for DEBIAN 2023-01-04 18:27:57 +01:00
Oscar Krause
aecad82914 main.py - added confirmation to deleteOrigins() 2023-01-04 18:12:59 +01:00
Oscar Krause
02fccb3605 README.md 2023-01-04 18:05:07 +01:00
Oscar Krause
24dba89dbe removed todos, currently all done or there is a branch for it 2023-01-04 17:58:23 +01:00
Oscar Krause
f5557a5ccd README.md 2023-01-04 17:46:19 +01:00
Oscar Krause
e8736c94ec docker-compose.yml - disabled internal ssl support 2023-01-04 17:46:02 +01:00
Oscar Krause
4325560ec4 README.md - added some collapses for logs 2023-01-04 17:18:13 +01:00
Oscar Krause
05979490ce README.md - moved "Endpoints" below "Setup" 2023-01-04 17:17:58 +01:00
Oscar Krause
52ffedffc7 code styling 2023-01-04 11:14:26 +01:00
Oscar Krause
5f5569a0c7 improved debian installation 2023-01-04 11:02:54 +01:00
Oscar Krause
32b05808c4 fixed "return" instead of "raise" 2023-01-04 10:14:00 +01:00
Oscar Krause
6c9ea63dc1 added variable for TOKEN_EXPIRE_DELTA 2023-01-04 10:08:17 +01:00
Oscar Krause
b839e6c2b3 code styling
- replaced 'json.loads' with 'json_loads'
- shortened 'JSONResponse' to 'JSONr'
- shortened 'HTMLResponse' to 'HTMLr'
- replaced HTTPException with JsonResponses
- added some error handling for invalid tokens
2023-01-04 10:04:52 +01:00
Oscar Krause
8bd37c0ead added some notes to required variables to change 2023-01-04 07:40:37 +01:00
Oscar Krause
27f47b93b8 docker-compose.yml - added experimental health endpoint 2023-01-03 20:45:16 +01:00
Oscar Krause
5bb8437b1d README.md - added timestamp to linux token filename 2023-01-03 18:59:34 +01:00
Oscar Krause
7e3f2d0345 docker-compose.yml - fixes 2023-01-03 18:44:30 +01:00
Oscar Krause
4198021212 README.md - fixed windows issue with /leasing/v1/lessor/shutdown 2023-01-03 18:10:02 +01:00
Oscar Krause
7e6e523799 improved test (checking uuid are 36 chars long) 2023-01-03 18:05:46 +01:00
Oscar Krause
7b2428ea38 removed some debugging 2023-01-03 18:05:46 +01:00
Oscar Krause
ac811d5df7 added 'LEASE_EXPIRE_HOURS' variable for better debugging 2023-01-03 18:05:46 +01:00
Oscar Krause
5575fee382 fixed config test 2023-01-03 18:05:46 +01:00
Oscar Krause
f1369d5e25 added some docs 2023-01-03 17:38:45 +01:00
Oscar Krause
d6cc6dcbee fixes 2023-01-03 17:38:32 +01:00
Oscar Krause
01fe142850 .gitlab-ci.yml - fixed release job 2023-01-03 15:22:49 +01:00
Oscar Krause
18e9ab2ebf fixes 2023-01-03 14:52:31 +01:00
Oscar Krause
b64c531898 bump version to 1.3 2023-01-03 14:50:52 +01:00
Oscar Krause
ef1730f4fe orm.py - added some docs 2023-01-03 14:20:13 +01:00
Oscar Krause
146ae8b824 updated docs 2023-01-03 14:09:35 +01:00
Oscar Krause
5a5ad0e654 removed 'scope_ref' from code checks because we only support one 'ALLOTMENT_REF', so we need no checks 2023-01-03 14:09:19 +01:00
Oscar Krause
0e3e7cbd3a main.py - corrected leasing behaviour (migrated from 'LEASE_REF' to 'ALLOTMENT_REF') 2023-01-03 13:05:05 +01:00
Oscar Krause
bd5625af42 main.py - removed example responses 2023-01-03 13:02:37 +01:00
Oscar Krause
8f9d95056f code styling - migrated direct dict access to '.get()' 2023-01-03 09:20:18 +01:00
Oscar Krause
2b8c468270 main.py - fixed missing 'LEASE_RENEWAL_PERIOD' on '/auth/v1/origin' 2023-01-03 07:25:09 +01:00
Oscar Krause
50e0dc8d1f implemented '/leasing/v1/lessor/shutdown' for windows guests 2023-01-02 19:42:23 +01:00
Oscar Krause
8b934dfeef fixed '/-/config' endpoint serialisation 2023-01-02 19:23:23 +01:00
Oscar Krause
4fb6243330 removed deprecated endpoints
- '/client-token' moved to '/-/client-token'
- '/status' moved to '/-/health' and '/-/config'

see README.md for more information
2023-01-02 19:18:32 +01:00
Oscar Krause
2e950ca6f4 implemented '/-/config' endpoint to list runtime environment variables 2023-01-02 19:14:25 +01:00
Oscar Krause
34662e6612 implemented 'LEASE_RENEWAL_PERIOD' variable 2023-01-02 18:57:41 +01:00
Oscar Krause
a3e089a3d5 added some references 2023-01-02 18:10:11 +01:00
Oscar Krause
ab996bb030 code styling 2023-01-02 18:04:14 +01:00
Oscar Krause
0853dd64cb README.md - added known issue for error on releasing leases on windows shutdown 2023-01-02 14:12:15 +01:00
Oscar Krause
838956bdb7 README.md - added '-L' parameter to curl commands to follow redirects (from deprecated endpoints) 2023-01-02 11:40:19 +01:00
Oscar Krause
8c515b7f2e README.md - removed links from endpoints 2023-01-02 11:39:37 +01:00
Oscar Krause
de5f07273b README.md - added compatibility to official dls 2023-01-02 11:38:48 +01:00
Oscar Krause
c894537ff9 Merge branch 'dev' into 'main'
1.2

See merge request oscar.krause/fastapi-dls!16
2022-12-30 07:51:26 +01:00
Oscar Krause
98d7492534 main.py - fixed cors parsing 2022-12-30 07:42:57 +01:00
Oscar Krause
2368cc2578 bump version to 1.2 2022-12-30 07:37:36 +01:00
Oscar Krause
5e40d7944a PKGBUILD - updated service to run uvicorn natively instead of calling main.py
- fixed issue with env variables not being loaded into fastapi
- fixed to not use "python main.py", which is meant for development
2022-12-30 07:36:44 +01:00
Oscar Krause
5fc9fc8e0a added documentation to debian service 2022-12-30 07:14:25 +01:00
Oscar Krause
b0e10004f1 README.md - added windows license key installation from powershell 2022-12-30 07:11:02 +01:00
Oscar Krause
478ca0ab63 added some comments 2022-12-30 07:02:50 +01:00
Oscar Krause
3d83e533da fixed client-token filename (missing .tok extension) 2022-12-30 03:50:48 +01:00
Oscar Krause
1f56d31351 code styling 2022-12-29 20:42:40 +01:00
Oscar Krause
400c983025 added redirect for "/-/" route 2022-12-29 20:41:55 +01:00
Oscar Krause
fa3a06a360 code styling 2022-12-29 20:40:42 +01:00
Oscar Krause
c0ab3a589f migrated '/client-token' to '/-/client-token' 2022-12-29 20:33:50 +01:00
Oscar Krause
a8504f3017 hardcoded default CORS to https, since drivers only support secure connections 2022-12-29 19:14:49 +01:00
Oscar Krause
9a5cf9ff81 code styling 2022-12-29 19:07:30 +01:00
Oscar Krause
17978c2e2e main.py - added endpoint to release single lease 2022-12-29 19:03:09 +01:00
Oscar Krause
569ca8b3ea orm.py - fixed renewing timestamps from params 2022-12-29 19:00:14 +01:00
Oscar Krause
e0843ca1d4 code styling 2022-12-29 18:59:26 +01:00
Oscar Krause
3fad49b18a main.py - added api descriptions 2022-12-29 18:48:30 +01:00
Oscar Krause
82876bf6b1 .gitlab-ci.yml - fixed release 2022-12-29 13:14:16 +01:00
Oscar Krause
dc6b6bff69 Merge branch 'dev' into 'main'
fixes

See merge request oscar.krause/fastapi-dls!15
2022-12-29 13:03:22 +01:00
Oscar Krause
e91436b236 README.md - fixed redoc links 2022-12-29 12:58:19 +01:00
Oscar Krause
6a0c35a7a8 .gitlab-ci.yml - fixed deploy:pacman pipeline 2022-12-29 12:58:01 +01:00
Oscar Krause
0b7bedde66 Merge branch 'dev' into 'main'
1.1

See merge request oscar.krause/fastapi-dls!14
2022-12-29 12:54:37 +01:00
Oscar Krause
00f7c50e4e .gitlab-ci.yml - added release job 2022-12-29 12:47:51 +01:00
Oscar Krause
13ec45e762 orm.py - added init call after dropping table by migration 2022-12-29 12:34:25 +01:00
Oscar Krause
0983426f30 .gitlab-ci.yml improvements 2022-12-29 12:30:23 +01:00
Oscar Krause
0c3a38b84e .gitlab-ci.yml - fixed MR pipeline 2022-12-29 12:26:05 +01:00
Oscar Krause
51183f6845 updated hashes 2022-12-29 12:21:41 +01:00
Oscar Krause
5f87e65034 bump version to 1.1 2022-12-29 12:19:56 +01:00
Oscar Krause
26d6d1feeb updated variables descriptions 2022-12-29 12:19:49 +01:00
Oscar Krause
ca6942becc added some comments 2022-12-29 12:15:05 +01:00
Oscar Krause
ff02c77afe use version variable in PKGBUILD 2022-12-29 12:14:53 +01:00
Oscar Krause
85e2ef6930 use version variable in DEBIAN/control 2022-12-29 12:12:03 +01:00
Oscar Krause
47312f65d9 .gitlab-ci.yml improved 2022-12-29 10:44:39 +01:00
Oscar Krause
a59b720f3f fixes 2022-12-29 10:40:34 +01:00
Oscar Krause
1b2da802cb added tests for new endpoints 2022-12-29 10:37:47 +01:00
Oscar Krause
8b9c7d688b added some docs to custom endpoints 2022-12-29 10:35:15 +01:00
Oscar Krause
a09fc5f2ad added some new endpoints and links in readme 2022-12-29 10:31:25 +01:00
Oscar Krause
ed1b55f5f1 created a simple management ui 2022-12-29 10:12:31 +01:00
Oscar Krause
2b7fed3381 created endpoints to delete origins and to delete a lease 2022-12-29 09:57:37 +01:00
Oscar Krause
922dc9f5a7 refactored database structure and created migration script 2022-12-29 09:40:50 +01:00
Oscar Krause
1a50e28202 main.py - removed unused import 2022-12-29 09:15:51 +01:00
Oscar Krause
a7cb6a7756 PKGBUILD - include version file 2022-12-29 09:15:33 +01:00
Oscar Krause
001b70b89c README.md - added credits 2022-12-29 09:15:12 +01:00
Oscar Krause
e6790588ef Revert "CODEOWNERS"
This reverts commit d57b494779.
2022-12-29 09:12:56 +01:00
Oscar Krause
d57b494779 CODEOWNERS 2022-12-29 09:12:30 +01:00
Oscar Krause
07de2401d7 README.md - added shout out to @samicrusader 2022-12-29 09:07:20 +01:00
Oscar Krause
d86948aee2 added some comments 2022-12-29 09:01:36 +01:00
Oscar Krause
6b2e6bf392 added optional query parameter to '/-/origins' and '/-/leases' for linked leases/origin 2022-12-29 09:00:52 +01:00
Oscar Krause
913da290f1 PKGBUILD - fixed missing util.py 2022-12-29 08:48:34 +01:00
Oscar Krause
5c1d291fac .gitlab-ci.yml improvements 2022-12-29 08:00:34 +01:00
Oscar Krause
76f732adb6 .gitlab-ci.yml - fixed test:debian 2022-12-29 07:54:10 +01:00
Oscar Krause
d73221afb7 bump version to 1.0 2022-12-29 07:41:25 +01:00
Oscar Krause
a6ac58d12c fixes 2022-12-29 07:41:25 +01:00
Oscar Krause
aa76ba5650 .gitlab-ci.yml improvements 2022-12-29 07:32:12 +01:00
Oscar Krause
7abfb96841 README.md - added archlinux section 2022-12-29 07:17:51 +01:00
Oscar Krause
6978ba4873 orm.py - timestamps are not updated in database 2022-12-29 07:09:39 +01:00
Oscar Krause
21e61796ff fixes 2022-12-28 22:02:12 +01:00
Oscar Krause
3c4fb35498 pacman - test version x.y.z instead of x.y 2022-12-28 22:01:20 +01:00
Oscar Krause
b5ed098093 fixed debian install scripts permissions 2022-12-28 22:00:37 +01:00
Oscar Krause
478dc04787 testing "deploy:pacman" job 2022-12-28 21:55:42 +01:00
Oscar Krause
eddf9217e5 refactorings 2022-12-28 21:52:19 +01:00
Oscar Krause
903ef73280 Merge branch 'archlinux-makepkg' into 'dev'
Archlinux makepkg

See merge request oscar.krause/fastapi-dls!13
2022-12-28 21:42:01 +01:00
Oscar Krause
a02d1ab9df .gitlab-ci.yml - handle artifact 2022-12-28 21:40:52 +01:00
Oscar Krause
34283555a1 refactorings 2022-12-28 21:40:26 +01:00
Oscar Krause
abb56be3bb added git 2022-12-28 21:15:32 +01:00
Oscar Krause
571e654af1 fixes 2022-12-28 21:14:14 +01:00
Oscar Krause
f04d4905df applied changes from samicrusader <hi@samicrusader.me> 2022-12-28 21:13:20 +01:00
Oscar Krause
7f99c260ce added PKGBUILD 2022-12-28 17:05:59 +01:00
Oscar Krause
15d52f7586 added PKGBUILD 2022-12-28 17:01:57 +01:00
Oscar Krause
62af76b95a added PKGBUILD 2022-12-28 17:00:35 +01:00
Oscar Krause
32a512b89b fixes 2022-12-28 15:59:38 +01:00
Oscar Krause
321cd17b02 updated PKGBUILD 2022-12-28 15:57:55 +01:00
Oscar Krause
bb43fc3f49 .gitlab-ci.yml 2022-12-28 15:27:16 +01:00
Oscar Krause
12f661707f added PKGBUILD 2022-12-28 15:24:04 +01:00
Oscar Krause
837721fd7b finished all remaining tests 2022-12-28 14:54:30 +01:00
Oscar Krause
d4ca6ba1aa fixed imports 2022-12-28 14:39:04 +01:00
Oscar Krause
9ab0eb4796 .gitlab-ci.yml - added ubuntu to test:debian stage 2022-12-28 14:36:22 +01:00
Oscar Krause
d91b81e50f improved tests 2022-12-28 14:30:54 +01:00
Oscar Krause
2663901988 util.py - implemented generate key method 2022-12-28 14:30:42 +01:00
Oscar Krause
8633190e97 removed todo for migrating to flask 2022-12-28 13:45:42 +01:00
Oscar Krause
a7fb43e1dc .gitlab-ci.yml improvements 2022-12-28 12:08:13 +01:00
Oscar Krause
5af1ba106d .gitlab-ci.yml improvements 2022-12-28 12:05:56 +01:00
Oscar Krause
fb858adc0c README.md 2022-12-28 12:01:57 +01:00
Oscar Krause
dacfd2084f code styling 2022-12-28 11:54:01 +01:00
Oscar Krause
92fe6154e6 code styling 2022-12-28 11:53:56 +01:00
Oscar Krause
3d5203dae0 Merge branch 'dev' into 'main'
1.0.0

See merge request oscar.krause/fastapi-dls!12
2022-12-28 11:44:52 +01:00
Oscar Krause
c83130f138 README.md - added known issue 2022-12-28 11:33:26 +01:00
Oscar Krause
a951433ca0 fixes 2022-12-28 11:33:06 +01:00
Oscar Krause
dada9cc4cd fixes 2022-12-28 11:05:41 +01:00
Oscar Krause
670e05f693 .gitlab-ci.yml 2022-12-28 10:00:34 +01:00
Oscar Krause
e88b1afcf7 fixes 2022-12-28 09:57:55 +01:00
Oscar Krause
0e24d26089 README.md 2022-12-28 09:47:31 +01:00
Oscar Krause
3d073dbd7d bump version to 1.0.0 2022-12-28 09:24:41 +01:00
Oscar Krause
89bf744054 removed some todos 2022-12-28 09:24:02 +01:00
Oscar Krause
e1f2e942a6 code styling 2022-12-28 09:23:17 +01:00
Oscar Krause
2afa01273a Merge branch 'debian' into 'dev'
Debian

See merge request oscar.krause/fastapi-dls!11
2022-12-28 09:16:32 +01:00
Oscar Krause
943786099b Merge branch 'sqlalchemy' into 'dev'
Sqlalchemy

See merge request oscar.krause/fastapi-dls!10
2022-12-28 09:15:03 +01:00
Oscar Krause
5db66c893d Merge branch 'dev' into sqlalchemy
# Conflicts:
#	README.md
2022-12-28 09:14:41 +01:00
Oscar Krause
3dc9c8bcb1 README.md 2022-12-28 09:10:57 +01:00
Oscar Krause
b22613c337 postinst improvements 2022-12-28 09:04:35 +01:00
Oscar Krause
2340931a60 fixes 2022-12-28 08:57:35 +01:00
Oscar Krause
437b62376f fixed missing debian dependency 2022-12-28 08:56:11 +01:00
Oscar Krause
e9dc5a765a fixed service
Standard output type syslog is obsolete, automatically updating to journal. Please update your unit file, and consider removing the setting altogether.
2022-12-28 08:52:13 +01:00
Oscar Krause
4e5559bb85 fixed service
Standard output type syslog is obsolete, automatically updating to journal. Please update your unit file, and consider removing the setting altogether.
2022-12-28 08:51:55 +01:00
Oscar Krause
b745367baa postrm fixed 2022-12-28 08:48:36 +01:00
Oscar Krause
914fc17795 Merge branch 'dev' into debian
# Conflicts:
#	README.md
2022-12-28 08:39:10 +01:00
Oscar Krause
050d105659 README.md - added Let's Encrypt section 2022-12-28 08:37:47 +01:00
Oscar Krause
da21ef3cdc fixed some permissions 2022-12-28 08:35:59 +01:00
Oscar Krause
6844604a0b fixed deb package paths 2022-12-28 08:35:42 +01:00
Oscar Krause
45af6c11c0 fixed missing systemctl daemon-reload 2022-12-28 08:21:04 +01:00
Oscar Krause
cf21bec3b0 postrm fixed removing app dir 2022-12-28 08:05:35 +01:00
Oscar Krause
6b3f536681 fixes
- fixed app dir
- fixed missing readme and version file
- keep config on update/remove
2022-12-28 07:40:44 +01:00
Oscar Krause
cca24f0ad5 fixed instance keypair path 2022-12-28 07:31:23 +01:00
Oscar Krause
ddb1299f5c Merge branch 'dev' into debian 2022-12-28 07:29:54 +01:00
Oscar Krause
a95126f51d typos 2022-12-28 07:29:42 +01:00
Oscar Krause
180cdcb43d added some variables 2022-12-28 07:29:38 +01:00
Oscar Krause
db412c6a43 postrm - remove service 2022-12-28 07:16:34 +01:00
Oscar Krause
a08261f7cd postinst - fixed paths and permissions 2022-12-28 07:14:24 +01:00
Oscar Krause
9744a8f0e8 code styling 2022-12-28 07:04:10 +01:00
Oscar Krause
63670f52e8 postinst fixes 2022-12-28 07:03:41 +01:00
Oscar Krause
65937b153e typos 2022-12-28 06:58:50 +01:00
Oscar Krause
84f7e99c78 README.md - added toc 2022-12-28 06:58:26 +01:00
Oscar Krause
2af4b456b6 fixes 2022-12-28 06:56:31 +01:00
Oscar Krause
0b46212f28 Merge branch 'dev' into debian 2022-12-28 06:54:50 +01:00
Oscar Krause
3b75e8dbeb fixes 2022-12-28 06:54:25 +01:00
Oscar Krause
8c1c51897f README.md - added install instructions 2022-12-28 06:53:31 +01:00
Oscar Krause
52faba5a1d Merge branch 'dev' into debian 2022-12-28 06:50:19 +01:00
Oscar Krause
46620c5e2a typos 2022-12-28 06:50:04 +01:00
Oscar Krause
c820dac4ec README.md - improvements & fixed manual install steps 2022-12-28 06:49:18 +01:00
Oscar Krause
548e1c9492 postinst - fixed service file 2022-12-28 06:47:06 +01:00
Oscar Krause
0f345f52ab postinst - fixed "cat" instead of "echo" 2022-12-28 06:46:42 +01:00
Oscar Krause
18d6da8ebf fixes 2022-12-27 22:18:02 +01:00
Oscar Krause
9a0db3c18f .gitlab-ci.yml - using generic package registry temporary 2022-12-27 21:59:52 +01:00
Oscar Krause
15c49d396f README.md - added required cipher suite for windows guests 2022-12-27 20:35:04 +01:00
Oscar Krause
c38ed25a2f fixes 2022-12-27 20:28:09 +01:00
Oscar Krause
1b34edfda6 fixes 2022-12-27 20:22:00 +01:00
Oscar Krause
12bfd4c82a removed toc 2022-12-27 20:19:50 +01:00
Oscar Krause
2a3e740964 added toc 2022-12-27 20:19:23 +01:00
Oscar Krause
85736c5ce4 typos 2022-12-27 20:10:18 +01:00
Oscar Krause
07f1e64553 fixes 2022-12-27 20:08:37 +01:00
Oscar Krause
560b18b5c4 orm.py - fixed not null column 2022-12-27 19:57:58 +01:00
Oscar Krause
b5c64038cb main.py - migrated merged changes from dataset to sqlalchemy 2022-12-27 19:05:41 +01:00
Oscar Krause
c7aa28382a Merge branch 'dev' into sqlalchemy
# Conflicts:
#	app/main.py
2022-12-27 19:04:41 +01:00
Oscar Krause
6d5ed1a142 main.py - added origin update endpoint 2022-12-27 19:03:03 +01:00
Oscar Krause
11a2c1d129 added "CAP_NET_BIND_SERVICE" to debian service to allow low range ports for non root user "www-data" 2022-12-27 18:51:20 +01:00
Oscar Krause
cefee22202 README.md - fixed service type 2022-12-27 18:38:26 +01:00
Oscar Krause
e5f557eb96 README.md - added todos 2022-12-27 17:49:52 +01:00
Oscar Krause
f9e3740150 main.py - added env variable for "INSTANCE_REF" 2022-12-27 17:42:58 +01:00
Oscar Krause
7898052207 fixed service 2022-12-27 17:00:33 +01:00
Oscar Krause
3d6da6fab9 README - fixed debian installation via git 2022-12-27 16:59:35 +01:00
Oscar Krause
6ddba90cd8 README fixed 2022-12-27 15:28:52 +01:00
Oscar Krause
6f143f2199 .gitlab-ci.yml - fixed filename 2022-12-27 14:52:17 +01:00
Oscar Krause
c2e04552f7 debian - bump version to 0.6.0 2022-12-27 14:45:03 +01:00
Oscar Krause
6947d928ec .gitlab-ci.yml - fixed artifact upload with access token 2022-12-27 14:04:33 +01:00
Oscar Krause
8f5ff50aaf .gitlab-ci.yml - dynamically create repo for codename if not exist 2022-12-27 13:34:21 +01:00
Oscar Krause
9d900c4f5c .gitlab-ci.yml - create initial debian repo 2022-12-27 13:27:27 +01:00
Oscar Krause
751546995d .gitlab-ci.yml - fixed artifact upload 2022-12-27 12:56:46 +01:00
Oscar Krause
4c643b18dd .gitlab-ci.yml - implemented deploy stage for debian package 2022-12-27 12:49:12 +01:00
Oscar Krause
b89381fdfc Merge branch 'dev' into debian 2022-12-27 12:44:33 +01:00
Oscar Krause
4df5f18b67 .gitlab-ci.yml - improved testing 2022-12-27 12:40:33 +01:00
Oscar Krause
701453b18a .gitlab-ci.yml - fixes 2022-12-27 12:35:07 +01:00
Oscar Krause
507ce93718 .gitlab-ci.yml - test starting service 2022-12-27 12:32:40 +01:00
Oscar Krause
52fb18dea0 main.py - fixed imports for "Crypto" and "Cryptodome" (on debian) 2022-12-27 12:21:52 +01:00
Oscar Krause
7c8a113fbd .gitlab-ci.yml - added "DEBIAN_FRONTEND=noninteractive" for debian test 2022-12-27 11:05:11 +01:00
Oscar Krause
a91e1f7018 README.md - added supported package version 14.4 2022-12-27 11:03:53 +01:00
Oscar Krause
646cca42f4 .gitlab-ci.yml - removed some debugging 2022-12-27 10:38:49 +01:00
Oscar Krause
60ec2821e2 postinst - add default value 2022-12-27 10:38:26 +01:00
Oscar Krause
ab30ad2117 .gitlab-ci.yml - debugging 2022-12-27 10:23:51 +01:00
Oscar Krause
e2cea71365 .gitlab-ci.yml - added some debugging 2022-12-27 10:22:03 +01:00
Oscar Krause
5d48f6b7d5 .gitlab-ci.yml - fixed artifact path 2022-12-27 10:19:35 +01:00
Oscar Krause
1e84e141df fixes 2022-12-27 10:16:04 +01:00
Oscar Krause
98e98ccd84 chroot into "build" dir 2022-12-27 10:10:00 +01:00
Oscar Krause
f1eddaa99a fixed missing directory 2022-12-27 10:05:52 +01:00
Oscar Krause
df0816832e fixed conffiles 2022-12-27 10:04:26 +01:00
Oscar Krause
599eaba14a README.md - added supported and tested driver versions 2022-12-27 09:19:05 +01:00
Oscar Krause
4e17e6da82 main.py fixed pycryptodome import 2022-12-23 14:09:13 +01:00
Oscar Krause
843d918e59 added dependencies 2022-12-23 14:08:56 +01:00
Oscar Krause
952a74cabe Merge branch 'sqlalchemy' into debian
# Conflicts:
#	app/main.py
2022-12-23 13:50:50 +01:00
Oscar Krause
81608fe497 merged dev into debian 2022-12-23 13:48:48 +01:00
Oscar Krause
b00a2a032a Merge branch 'dev' into debian
# Conflicts:
#	.gitlab-ci.yml
2022-12-23 13:48:24 +01:00
Oscar Krause
6b7c70e59a tests improved 2022-12-23 13:42:02 +01:00
Oscar Krause
332b9b23cd code styling 2022-12-23 13:31:43 +01:00
Oscar Krause
3d5d728d67 code styling 2022-12-23 13:22:06 +01:00
Oscar Krause
838e30458d code styling 2022-12-23 13:21:52 +01:00
Oscar Krause
f539db5933 implemented db_init 2022-12-23 13:17:19 +01:00
Oscar Krause
6049048bbf fixed test 2022-12-23 11:24:40 +01:00
Oscar Krause
43d5736f37 code styling & removed comments 2022-12-23 08:22:21 +01:00
Oscar Krause
e7102c4de6 fixed updates 2022-12-23 08:16:58 +01:00
Oscar Krause
d1db441df4 removed Auth 2022-12-23 08:16:34 +01:00
Oscar Krause
d5b51bd83c Merge branch 'dev' into sqlalchemy
# Conflicts:
#	app/main.py
2022-12-23 08:08:35 +01:00
Oscar Krause
3f71c88d48 added some test 2022-12-23 07:48:47 +01:00
Oscar Krause
a58549a162 .gitlab-ci.yml - fixed test cert path 2022-12-23 07:43:02 +01:00
Oscar Krause
2c1c9b63b4 .gitignore 2022-12-23 07:41:23 +01:00
Oscar Krause
3367977652 .gitlab-ci.yml - fixed cd into test 2022-12-23 07:41:18 +01:00
Oscar Krause
67ed6108a2 .gitlab-ci.yml - changed test image to bullseye 2022-12-23 07:40:27 +01:00
Oscar Krause
d5d156e70e .gitlab-ci.yml - create test certificates 2022-12-23 07:38:53 +01:00
Oscar Krause
906af9430a .gitlab-ci.yml - fixed installing dependencies 2022-12-23 07:36:33 +01:00
Oscar Krause
3f5e3b16c5 added api tests 2022-12-23 07:35:37 +01:00
Oscar Krause
d187167129 Merge branch 'dev' into 'main'
v0.6

See merge request oscar.krause/fastapi-dls!9
2022-12-23 07:17:17 +01:00
Oscar Krause
9809bbdbd1 bump version to 0.6 2022-12-23 07:16:41 +01:00
Oscar Krause
a0b9eae15b main.py - fixed wrong "origin_ref" in CodeResponse
- fixed issue
- removed the now unnecessary table "auth"
2022-12-23 06:56:29 +01:00
Oscar Krause
394180652e migrated from dataset to sqlalchemy 2022-12-22 12:57:06 +01:00
Oscar Krause
f0fdfafaed added basic debian package setup and pipeline 2022-12-22 10:41:07 +01:00
Oscar Krause
d6d4cbc74a README.md - added docker image sources 2022-12-22 10:31:52 +01:00
Oscar Krause
184e858fea .gitlab-ci.yml - added docker image versions for gitlab registry 2022-12-22 10:31:40 +01:00
Oscar Krause
a6f7b9b595 code styling 2022-12-22 10:14:32 +01:00
Oscar Krause
7946c63f8c README.md - added notice that no internet connection is required 2022-12-22 10:14:15 +01:00
Oscar Krause
1131f31c27 Merge remote-tracking branch 'origin/dev' into dev 2022-12-22 07:58:04 +01:00
Oscar Krause
e84a7b3e8d README.md - added installation method with debian and git on bare metal 2022-12-22 07:57:56 +01:00
27 changed files with 1867 additions and 203 deletions

.DEBIAN/conffiles Normal file

@@ -0,0 +1 @@
/etc/fastapi-dls/env

.DEBIAN/control Normal file

@@ -0,0 +1,9 @@
Package: fastapi-dls
Version: 0.0
Architecture: all
Maintainer: Oscar Krause <oscar.krause@collinwebdesigns.de>
Depends: python3, python3-fastapi, python3-uvicorn, python3-dotenv, python3-dateutil, python3-jose, python3-sqlalchemy, python3-pycryptodome, python3-markdown, uvicorn, openssl
Recommends: curl
Installed-Size: 10240
Homepage: https://git.collinwebdesigns.de/oscar.krause/fastapi-dls
Description: Minimal Delegated License Service (DLS).

.DEBIAN/env.default Normal file

@@ -0,0 +1,27 @@
# Toggle debug mode
#DEBUG=false
# Where the client can find the DLS server
DLS_URL=127.0.0.1
DLS_PORT=443
# CORS configuration
## comma separated list without spaces
#CORS_ORIGINS="https://$DLS_URL:$DLS_PORT"
# Lease expiration in days
LEASE_EXPIRE_DAYS=90
LEASE_RENEWAL_PERIOD=0.2
# Database location
## https://docs.sqlalchemy.org/en/14/core/engines.html
DATABASE=sqlite:////etc/fastapi-dls/db.sqlite
# UUIDs for identifying the instance
#SITE_KEY_XID="00000000-0000-0000-0000-000000000000"
#INSTANCE_REF="10000000-0000-0000-0000-000000000001"
#ALLOTMENT_REF="20000000-0000-0000-0000-000000000001"
# Site-wide signing keys
INSTANCE_KEY_RSA=/etc/fastapi-dls/instance.private.pem
INSTANCE_KEY_PUB=/etc/fastapi-dls/instance.public.pem
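The two instance-key paths above must exist before the service starts; the packaged `postinst` script generates them in `/etc/fastapi-dls`. A minimal standalone sketch of the same keypair generation (a temporary directory is used here instead of the real config dir, so it runs anywhere):

```shell
# Sketch: generate the RSA keypair referenced by INSTANCE_KEY_RSA / INSTANCE_KEY_PUB.
# KEYDIR is a stand-in for /etc/fastapi-dls.
KEYDIR=$(mktemp -d)
openssl genrsa -out "$KEYDIR/instance.private.pem" 2048
openssl rsa -in "$KEYDIR/instance.private.pem" -outform PEM -pubout -out "$KEYDIR/instance.public.pem"
openssl rsa -in "$KEYDIR/instance.private.pem" -check -noout   # prints "RSA key ok"
```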


@@ -0,0 +1,25 @@
[Unit]
Description=Service for fastapi-dls
Documentation=https://git.collinwebdesigns.de/oscar.krause/fastapi-dls
After=network.target

[Service]
User=www-data
Group=www-data
AmbientCapabilities=CAP_NET_BIND_SERVICE
WorkingDirectory=/usr/share/fastapi-dls/app
EnvironmentFile=/etc/fastapi-dls/env
ExecStart=uvicorn main:app \
  --env-file /etc/fastapi-dls/env \
  --host $DLS_URL --port $DLS_PORT \
  --app-dir /usr/share/fastapi-dls/app \
  --ssl-keyfile /etc/fastapi-dls/webserver.key \
  --ssl-certfile /etc/fastapi-dls/webserver.crt \
  --proxy-headers
Restart=always
KillSignal=SIGQUIT
Type=simple
NotifyAccess=all

[Install]
WantedBy=multi-user.target
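In the unit above, it is systemd (not a shell) that substitutes `$DLS_URL` and `$DLS_PORT` from the `EnvironmentFile` into `ExecStart`. A hedged sketch of the resulting command line, assuming the defaults from `.DEBIAN/env.default`:

```shell
# Approximate the command systemd runs after expanding the EnvironmentFile
# variables (127.0.0.1/443 are the env.default defaults, assumed here).
DLS_URL=127.0.0.1
DLS_PORT=443
echo "uvicorn main:app --host $DLS_URL --port $DLS_PORT --app-dir /usr/share/fastapi-dls/app --proxy-headers"
# → uvicorn main:app --host 127.0.0.1 --port 443 --app-dir /usr/share/fastapi-dls/app --proxy-headers
```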

.DEBIAN/postinst Normal file

@@ -0,0 +1,60 @@
#!/bin/bash

WORKING_DIR=/usr/share/fastapi-dls
CONFIG_DIR=/etc/fastapi-dls

if [ ! -f $CONFIG_DIR/instance.private.pem ]; then
  echo "> Create dls-instance keypair ..."
  openssl genrsa -out $CONFIG_DIR/instance.private.pem 2048
  openssl rsa -in $CONFIG_DIR/instance.private.pem -outform PEM -pubout -out $CONFIG_DIR/instance.public.pem
else
  echo "> Create dls-instance keypair skipped! (exists)"
fi

while true; do
  [ -f $CONFIG_DIR/webserver.key ] && default_answer="N" || default_answer="Y"
  [ $default_answer == "Y" ] && V="Y/n" || V="y/N"
  read -p "> Do you wish to create a self-signed webserver certificate? [${V}] " yn
  yn=${yn:-$default_answer}  # ${parameter:-word}: if parameter is unset or null, the expansion of word is substituted; otherwise, the value of parameter is substituted.
  case $yn in
    [Yy]*)
      echo "> Generating keypair ..."
      openssl req -x509 -nodes -days 3650 -newkey rsa:2048 -keyout $CONFIG_DIR/webserver.key -out $CONFIG_DIR/webserver.crt
      break
      ;;
    [Nn]*) echo "> Generating keypair skipped!"; break ;;
    *) echo "Please answer [y] or [n]." ;;
  esac
done

if [ -f $CONFIG_DIR/webserver.key ]; then
  echo "> Starting service ..."
  systemctl start fastapi-dls.service
  if [ -x "$(command -v curl)" ]; then
    echo "> Testing API ..."
    source $CONFIG_DIR/env
    curl --insecure -X GET https://$DLS_URL:$DLS_PORT/-/health
  else
    echo "> Testing API failed, curl not available. Please test manually!"
  fi
fi

chown -R www-data:www-data $CONFIG_DIR
chown -R www-data:www-data $WORKING_DIR

cat <<EOF
# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
#                                                                              #
# fastapi-dls is now installed.                                                #
#                                                                              #
# The service should be up and running.                                        #
# The webservice is listening on https://localhost                             #
#                                                                              #
# Configuration is stored in /etc/fastapi-dls/env.                             #
#                                                                              #
# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
EOF
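The certificate prompt in the script above relies on bash's `${parameter:-word}` default expansion (commented inline there). A standalone sketch of that fallback behaviour:

```shell
# ${yn:-$default_answer}: when the user just presses Enter, $yn is empty,
# so the default answer is substituted; an explicit reply wins.
default_answer="Y"
yn=""                      # simulate an empty reply from `read`
yn=${yn:-$default_answer}
echo "$yn"                 # → Y

yn="n"                     # an explicit reply overrides the default
yn=${yn:-$default_answer}
echo "$yn"                 # → n
```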

.DEBIAN/postrm Executable file

@@ -0,0 +1,9 @@
#!/bin/bash
# is removed automatically
#if [ "$1" = purge ] && [ -d /usr/share/fastapi-dls ]; then
# echo "> Removing app."
# rm -r /usr/share/fastapi-dls
#fi
echo -e "> Done."

.DEBIAN/prerm Executable file

@@ -0,0 +1,3 @@
#!/bin/bash
echo -e "> Starting uninstallation of 'fastapi-dls'!"

.PKGBUILD/PKGBUILD Normal file

@@ -0,0 +1,52 @@
# Maintainer: Oscar Krause <oscar.krause@collinwebdesigns.de>
# Contributor: samicrusader <hi@samicrusader.me>

pkgname=fastapi-dls
pkgver=1.1
pkgrel=1
pkgdesc='NVIDIA DLS server implementation with FastAPI'
arch=('any')
url='https://git.collinwebdesigns.de/oscar.krause/fastapi-dls'
license=('MIT')
depends=('python' 'python-jose' 'python-starlette' 'python-httpx' 'python-fastapi' 'python-dotenv' 'python-dateutil' 'python-sqlalchemy' 'python-pycryptodome' 'uvicorn' 'python-markdown' 'openssl')
provides=("$pkgname")
install="$pkgname.install"
source=('git+file:///builds/oscar.krause/fastapi-dls' # https://gitea.publichub.eu/oscar.krause/fastapi-dls.git
        "$pkgname.default"
        "$pkgname.service"
        "$pkgname.tmpfiles")
sha256sums=('SKIP'
            'fbd015449a30c0ae82733289a56eb98151dcfab66c91b37fe8e202e39f7a5edb'
            '2719338541104c537453a65261c012dda58e1dbee99154cf4f33b526ee6ca22e'
            '3dc60140c08122a8ec0e7fa7f0937eb8c1288058890ba09478420fc30ce9e30c')

pkgver() {
  source $srcdir/$pkgname/version.env
  echo ${VERSION}
}

check() {
  cd "$srcdir/$pkgname/test"
  mkdir "$srcdir/$pkgname/app/cert"
  openssl genrsa -out "$srcdir/$pkgname/app/cert/instance.private.pem" 2048
  openssl rsa -in "$srcdir/$pkgname/app/cert/instance.private.pem" -outform PEM -pubout -out "$srcdir/$pkgname/app/cert/instance.public.pem"
  python "$srcdir/$pkgname/test/main.py"
  rm -rf "$srcdir/$pkgname/app/cert"
}

package() {
  install -d "$pkgdir/usr/share/doc/$pkgname"
  install -d "$pkgdir/var/lib/$pkgname/cert"
  cp -r "$srcdir/$pkgname/doc"/* "$pkgdir/usr/share/doc/$pkgname/"
  install -Dm644 "$srcdir/$pkgname/README.md" "$pkgdir/usr/share/doc/$pkgname/README.md"
  install -Dm644 "$srcdir/$pkgname/version.env" "$pkgdir/usr/share/doc/$pkgname/version.env"
  sed -i "s/README.md/\/usr\/share\/doc\/$pkgname\/README.md/g" "$srcdir/$pkgname/app/main.py"
  sed -i "s/join(dirname(__file__), 'cert\//join('\/var\/lib\/$pkgname', 'cert\//g" "$srcdir/$pkgname/app/main.py"
  install -Dm755 "$srcdir/$pkgname/app/main.py" "$pkgdir/opt/$pkgname/main.py"
  install -Dm755 "$srcdir/$pkgname/app/orm.py" "$pkgdir/opt/$pkgname/orm.py"
  install -Dm755 "$srcdir/$pkgname/app/util.py" "$pkgdir/opt/$pkgname/util.py"
  install -Dm644 "$srcdir/$pkgname.default" "$pkgdir/etc/default/$pkgname"
  install -Dm644 "$srcdir/$pkgname.service" "$pkgdir/usr/lib/systemd/system/$pkgname.service"
  install -Dm644 "$srcdir/$pkgname.tmpfiles" "$pkgdir/usr/lib/tmpfiles.d/$pkgname.conf"
}


@@ -0,0 +1,28 @@
# Toggle FastAPI debug mode
DEBUG=false
# Where the client can find the DLS server
## DLS_URL should be a hostname
LISTEN_IP="0.0.0.0"
DLS_URL="localhost.localdomain"
DLS_PORT=8443
CORS_ORIGINS="https://$DLS_URL:$DLS_PORT"
# Lease expiration in days
LEASE_EXPIRE_DAYS=90
# Database location
## https://docs.sqlalchemy.org/en/14/core/engines.html
DATABASE="sqlite:////var/lib/fastapi-dls/db.sqlite"
# UUIDs for identifying the instance
SITE_KEY_XID="<<sitekey>>"
INSTANCE_REF="<<instanceref>>"
# Site-wide signing keys
INSTANCE_KEY_RSA="/var/lib/fastapi-dls/instance.private.pem"
INSTANCE_KEY_PUB="/var/lib/fastapi-dls/instance.public.pem"
# TLS certificate
INSTANCE_SSL_CERT="/var/lib/fastapi-dls/cert/webserver.crt"
INSTANCE_SSL_KEY="/var/lib/fastapi-dls/cert/webserver.key"


@@ -0,0 +1,14 @@
post_install() {
sed -i "s/<<sitekey>>/$(uuidgen)/" /etc/default/fastapi-dls
sed -i "s/<<instanceref>>/$(uuidgen)/" /etc/default/fastapi-dls
echo 'The environment variables for this server can be edited at: /etc/default/fastapi-dls'
echo 'The server can be started with: systemctl start fastapi-dls.service'
echo
echo 'A valid HTTPS certificate needs to be installed to /var/lib/fastapi-dls/cert/webserver.{crt,key}'
echo 'A self-signed certificate can be generated with: openssl req -x509 -nodes -days 3650 -newkey rsa:2048 -keyout /var/lib/fastapi-dls/cert/webserver.key -out /var/lib/fastapi-dls/cert/webserver.crt'
echo
echo 'The signing keys for your instance need to be generated as well. Generate them with these commands:'
echo 'openssl genrsa -out /var/lib/fastapi-dls/instance.private.pem 2048'
echo 'openssl rsa -in /var/lib/fastapi-dls/instance.private.pem -outform PEM -pubout -out /var/lib/fastapi-dls/instance.public.pem'
}


@@ -0,0 +1,16 @@
[Unit]
Description=FastAPI-DLS
Documentation=https://git.collinwebdesigns.de/oscar.krause/fastapi-dls
After=network.target
[Service]
Type=simple
AmbientCapabilities=CAP_NET_BIND_SERVICE
EnvironmentFile=/etc/default/fastapi-dls
ExecStart=/usr/bin/uvicorn main:app --proxy-headers --env-file=/etc/default/fastapi-dls --host=${LISTEN_IP} --port=${DLS_PORT} --app-dir=/opt/fastapi-dls --ssl-keyfile=${INSTANCE_SSL_KEY} --ssl-certfile=${INSTANCE_SSL_CERT}
Restart=on-abort
User=http
Group=http
[Install]
WantedBy=multi-user.target


@@ -0,0 +1,2 @@
d /var/lib/fastapi-dls 0755 http http
d /var/lib/fastapi-dls/cert 0755 http http

.gitignore

@@ -3,3 +3,4 @@ venv/
.idea/
app/*.sqlite*
app/cert/*.*
.pytest_cache


@@ -1,12 +1,16 @@
cache:
  key: one-key-to-rule-them-all

build:docker:
  image: docker:dind
  interruptible: true
  stage: build
  rules:
    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH
      changes:
        - app/**/*
        - Dockerfile
    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
  tags: [ docker ]
  before_script:
    - echo "COMMIT=${CI_COMMIT_SHA}" >> version.env # COMMIT=`git rev-parse HEAD`
@@ -15,12 +19,174 @@ build:
    - docker build . --tag ${CI_REGISTRY}/${CI_PROJECT_PATH}/${CI_BUILD_REF_NAME}:${CI_BUILD_REF}
    - docker push ${CI_REGISTRY}/${CI_PROJECT_PATH}/${CI_BUILD_REF_NAME}:${CI_BUILD_REF}

build:apt:
  image: debian:bookworm-slim
  interruptible: true
  stage: build
  rules:
    - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH
      changes:
        - app/**/*
        - .DEBIAN/**/*
    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
  before_script:
    - echo "COMMIT=${CI_COMMIT_SHA}" >> version.env
    - source version.env
    # install build dependencies
    - apt-get update -qq && apt-get install -qq -y build-essential
    # create build directory for .deb sources
    - mkdir build
    # copy install instructions
    - cp -r .DEBIAN build/DEBIAN
    - chmod -R 0775 build/DEBIAN
    # copy app into "/usr/share/fastapi-dls" as "/usr/share/fastapi-dls/app" & copy README.md and version.env
    - mkdir -p build/usr/share/fastapi-dls
    - cp -r app build/usr/share/fastapi-dls
    - cp README.md version.env build/usr/share/fastapi-dls
    # create conf file
    - mkdir -p build/etc/fastapi-dls
    - cp .DEBIAN/env.default build/etc/fastapi-dls/env
    # create service file
    - mkdir -p build/etc/systemd/system
    - cp .DEBIAN/fastapi-dls.service build/etc/systemd/system/fastapi-dls.service
    # cd into "build/"
    - cd build/
  script:
    # set version based on value in "$VERSION" (which is set above from version.env)
    - sed -i -E 's/(Version\:\s)0.0/\1'"$VERSION"'/g' DEBIAN/control
    # build
    - dpkg -b . build.deb
    - dpkg -I build.deb
  artifacts:
    expire_in: 1 week
    paths:
      - build/build.deb

build:pacman:
  image: archlinux:base-devel
  interruptible: true
  stage: build
  rules:
    - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH
      changes:
        - app/**/*
        - .PKGBUILD/**/*
    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
  before_script:
    - echo "COMMIT=${CI_COMMIT_SHA}" >> version.env
    # install build dependencies
    - pacman -Syu --noconfirm git
    # create a build user because "makepkg" refuses to run as root
    - useradd --no-create-home --shell=/bin/false build && usermod -L build
    - 'echo "build ALL=(ALL) NOPASSWD: ALL" >> /etc/sudoers'
    - 'echo "root ALL=(ALL) NOPASSWD: ALL" >> /etc/sudoers'
    - chown -R build:build .
    # move .PKGBUILD contents to root directory
    - mv .PKGBUILD/* .
  script:
    - pwd
    # download dependencies
    - source PKGBUILD && pacman -Syu --noconfirm --needed --asdeps "${makedepends[@]}" "${depends[@]}"
    # build
    - sudo -u build makepkg -s
  artifacts:
    expire_in: 1 week
    paths:
      - "*.pkg.tar.zst"
test:
  image: python:3.11-slim-bullseye
  stage: test
  rules:
    - if: $CI_COMMIT_BRANCH
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
  variables:
    DATABASE: sqlite:///../app/db.sqlite
  before_script:
    - pip install -r requirements.txt
    - pip install pytest httpx
    - mkdir -p app/cert
    - openssl genrsa -out app/cert/instance.private.pem 2048
    - openssl rsa -in app/cert/instance.private.pem -outform PEM -pubout -out app/cert/instance.public.pem
    - cd test
  script:
    - pytest main.py
  artifacts:
    reports:
      dotenv: version.env
.test:linux:
  stage: test
  rules:
    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH
      changes:
        - app/**/*
        - .DEBIAN/**/*
    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
  needs:
    - job: build:apt
      artifacts: true
  variables:
    DEBIAN_FRONTEND: noninteractive
  before_script:
    - apt-get update -qq && apt-get install -qq -y jq curl
  script:
    # test installation
    - apt-get install -q -y ./build/build.deb --fix-missing
    - openssl req -x509 -newkey rsa:2048 -nodes -out /etc/fastapi-dls/webserver.crt -keyout /etc/fastapi-dls/webserver.key -days 7 -subj "/C=DE/O=GitLab-CI/OU=Test/CN=localhost"
    # copy example config from GitLab-CI-Variables
    #- cat ${EXAMPLE_CONFIG} > /etc/fastapi-dls/env
    # start service in background
    - cd /usr/share/fastapi-dls/app
    - uvicorn main:app
      --host 127.0.0.1 --port 443
      --app-dir /usr/share/fastapi-dls/app
      --ssl-keyfile /etc/fastapi-dls/webserver.key
      --ssl-certfile /etc/fastapi-dls/webserver.crt
      --proxy-headers &
    - FASTAPI_DLS_PID=$!
    - echo "Started service with pid $FASTAPI_DLS_PID"
    - cat /etc/fastapi-dls/env
    # test the service health endpoint
    - if [ "`curl --insecure -s https://127.0.0.1/-/health | jq -r .status`" = "up" ]; then echo "Success"; else echo "Error"; exit 1; fi
    # cleanup
    - kill $FASTAPI_DLS_PID
    - apt-get purge -qq -y fastapi-dls
    - apt-get autoremove -qq -y && apt-get clean -qq
test:debian:
  extends: .test:linux
  image: debian:bookworm-slim

test:ubuntu:
  extends: .test:linux
  image: ubuntu:22.10

test:archlinux:
  image: archlinux:base
  rules:
    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH
      changes:
        - app/**/*
        - .PKGBUILD/**/*
    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
  needs:
    - job: build:pacman
      artifacts: true
  script:
    - pacman -Sy
    - pacman -U --noconfirm *.pkg.tar.zst
.deploy:
  rules:
    - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
    - if: $CI_COMMIT_TAG
      when: never

deploy:docker:
  extends: .deploy
  stage: deploy
  rules:
    - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
@@ -29,8 +195,105 @@ deploy:
    - source version.env
    - echo "Building docker image for commit ${COMMIT} with version ${VERSION}"
  script:
    - echo "GitLab-Registry"
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
    - docker build . --tag ${CI_REGISTRY}/${CI_PROJECT_PATH}/${CI_BUILD_REF_NAME}:${VERSION}
    - docker build . --tag ${CI_REGISTRY}/${CI_PROJECT_PATH}/${CI_BUILD_REF_NAME}:latest
    - docker push ${CI_REGISTRY}/${CI_PROJECT_PATH}/${CI_BUILD_REF_NAME}:${VERSION}
    - docker push ${CI_REGISTRY}/${CI_PROJECT_PATH}/${CI_BUILD_REF_NAME}:latest
    - echo "Docker-Hub"
    - docker login -u $PUBLIC_REGISTRY_USER -p $PUBLIC_REGISTRY_TOKEN
    - docker build . --tag $PUBLIC_REGISTRY_USER/${CI_PROJECT_NAME}:${VERSION}
    - docker build . --tag $PUBLIC_REGISTRY_USER/${CI_PROJECT_NAME}:latest
    - docker push $PUBLIC_REGISTRY_USER/${CI_PROJECT_NAME}:${VERSION}
    - docker push $PUBLIC_REGISTRY_USER/${CI_PROJECT_NAME}:latest
deploy:apt:
  # doc: https://git.collinwebdesigns.de/help/user/packages/debian_repository/index.md#install-a-package
  extends: .deploy
  image: debian:bookworm-slim
  stage: deploy
  rules:
    - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
  needs:
    - job: build:apt
      artifacts: true
  before_script:
    - apt-get update -qq && apt-get install -qq -y curl lsb-release
    # determine the distribution codename
    - CODENAME=`lsb_release -cs`
    # create the repo if it does not exist yet
    - 'if [ "`curl -s -o /dev/null -w "%{http_code}" --header "JOB-TOKEN: $CI_JOB_TOKEN" -s ${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/debian_distributions/${CODENAME}/key.asc`" != "200" ]; then curl --request POST --header "JOB-TOKEN: $CI_JOB_TOKEN" "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/debian_distributions?codename=${CODENAME}"; fi'
  script:
    # Naming format: <name>_<version>-<release>_<arch>.deb
    # Version is the version number of the app being packaged.
    # Release is the version number of the *packaging* itself; it increments
    # when the package maintainer updates the packaging while the version
    # number of the application being packaged did not change.
    - BUILD_NAME=build/build.deb # inherited from the build stage
    - PACKAGE_NAME=`dpkg -I ${BUILD_NAME} | grep "Package:" | awk '{ print $2 }'`
    - PACKAGE_VERSION=`dpkg -I ${BUILD_NAME} | grep "Version:" | awk '{ print $2 }'`
    - PACKAGE_ARCH=amd64
    #- EXPORT_NAME="${PACKAGE_NAME}_${PACKAGE_VERSION}-0_${PACKAGE_ARCH}.deb"
    - EXPORT_NAME="${PACKAGE_NAME}_${PACKAGE_VERSION}_${PACKAGE_ARCH}.deb"
    - mv ${BUILD_NAME} ${EXPORT_NAME}
    - 'echo "PACKAGE_NAME: ${PACKAGE_NAME}"'
    - 'echo "PACKAGE_VERSION: ${PACKAGE_VERSION}"'
    - 'echo "PACKAGE_ARCH: ${PACKAGE_ARCH}"'
    - 'echo "EXPORT_NAME: ${EXPORT_NAME}"'
    # https://docs.gitlab.com/14.3/ee/user/packages/debian_repository/index.html
    - URL="${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/debian/${EXPORT_NAME}"
    - 'echo "URL: ${URL}"'
    #- 'curl --request PUT --header "JOB-TOKEN: $CI_JOB_TOKEN" --upload-file ${EXPORT_NAME} ${URL}'
    # use the generic package registry until the debian registry is GA
    # https://docs.gitlab.com/ee/user/packages/generic_packages/index.html#publish-a-generic-package-by-using-cicd
    - 'curl --header "JOB-TOKEN: $CI_JOB_TOKEN" --upload-file ${EXPORT_NAME} "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/generic/${PACKAGE_NAME}/${PACKAGE_VERSION}/${EXPORT_NAME}"'

deploy:pacman:
  extends: .deploy
  image: archlinux:base-devel
  stage: deploy
  rules:
    - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
  needs:
    - job: build:pacman
      artifacts: true
  script:
    - source .PKGBUILD/PKGBUILD
    - source version.env
    # e.g. fastapi-dls-1.0-1-any.pkg.tar.zst
    - BUILD_NAME=${pkgname}-${VERSION}-${pkgrel}-any.pkg.tar.zst
    - PACKAGE_NAME=${pkgname}
    - PACKAGE_VERSION=${VERSION}
    - PACKAGE_ARCH=any
    - EXPORT_NAME=${BUILD_NAME}
    - 'echo "PACKAGE_NAME: ${PACKAGE_NAME}"'
    - 'echo "PACKAGE_VERSION: ${PACKAGE_VERSION}"'
    - 'echo "PACKAGE_ARCH: ${PACKAGE_ARCH}"'
    - 'echo "EXPORT_NAME: ${EXPORT_NAME}"'
    - 'curl --header "JOB-TOKEN: $CI_JOB_TOKEN" --upload-file ${EXPORT_NAME} "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/generic/${PACKAGE_NAME}/${PACKAGE_VERSION}/${EXPORT_NAME}"'

release:
  image: registry.gitlab.com/gitlab-org/release-cli:latest
  stage: .post
  needs:
    - job: test
      artifacts: true
  rules:
    - if: $CI_COMMIT_TAG
      when: never
    - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
  script:
    - echo "Running release-job for $VERSION"
  release:
    name: $CI_PROJECT_TITLE $VERSION
    description: Release of $CI_PROJECT_TITLE version $VERSION
    tag_name: $VERSION
    ref: $CI_COMMIT_SHA
    assets:
      links:
        - name: 'Package Registry'
          url: 'https://git.collinwebdesigns.de/oscar.krause/fastapi-dls/-/packages'
        - name: 'Container Registry'
          url: 'https://git.collinwebdesigns.de/oscar.krause/fastapi-dls/container_registry/40'

CODEOWNERS

@@ -0,0 +1,2 @@
* @oscar.krause
.PKGBUILD/ @samicrusader


@@ -1,4 +1,4 @@
FROM python:3.11-alpine
COPY requirements.txt /tmp/requirements.txt
@@ -14,5 +14,5 @@ COPY app /app
COPY version.env /version.env
COPY README.md /README.md
HEALTHCHECK --start-period=30s --interval=10s --timeout=5s --retries=3 CMD curl --insecure --fail https://localhost/-/health || exit 1
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "443", "--app-dir", "/app", "--proxy-headers", "--ssl-keyfile", "/app/cert/webserver.key", "--ssl-certfile", "/app/cert/webserver.crt"]

FAQ.md

@@ -0,0 +1,17 @@
# FAQ
## `Failed to acquire license from <ip> (Info: <license> - Error: The allowed time to process response has expired)`
- Are the timezone settings correct on fastapi-dls **and your guest**?
- Did you download the client-token more than an hour ago?

Please download a new client-token. The guest has to register within an hour after the client-token was created.
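A quick way to compare clocks is to print the current UTC time on the fastapi-dls host and on the guest; the two outputs should agree to within a few seconds:

```shell
# Run this on both machines and compare the output.
NOW_UTC=$(date -u +"%Y-%m-%d %H:%M:%S")
echo "UTC now: $NOW_UTC"
```

If the outputs differ by minutes, fix the timezone/clock (e.g. via `timedatectl` on systemd-based distros) before retrying.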
## `jose.exceptions.JWTError: Signature verification failed.`
- Did you recreate `instance.public.pem` / `instance.private.pem`?

Then you have to download a **new** client-token on each of your guests.

README.md

@@ -2,37 +2,22 @@
Minimal Delegated License Service (DLS).

Compatibility tested with official DLS 2.0.1.

This service can be used without an internet connection.
Only the clients need a connection to this service on the configured port.

[[_TOC_]]

# Setup (Service)

## Docker

Docker images are available here:

- [Docker-Hub](https://hub.docker.com/repository/docker/collinwebdesigns/fastapi-dls): `collinwebdesigns/fastapi-dls:latest`
- [GitLab-Registry](https://git.collinwebdesigns.de/oscar.krause/fastapi-dls/container_registry): `registry.git.collinwebdesigns.de/oscar.krause/fastapi-dls/main:latest`

**Run this on the Docker-Host**
@@ -49,6 +34,8 @@ openssl req -x509 -nodes -days 3650 -newkey rsa:2048 -keyout $WORKING_DIR/webse
**Start container**

To test if everything is set up properly you can start the container as follows:

```shell
docker volume create dls-db
docker run -e DLS_URL=`hostname -i` -e DLS_PORT=443 -p 443:443 -v $WORKING_DIR:/app/cert -v dls-db:/app/database collinwebdesigns/fastapi-dls:latest
@@ -56,11 +43,13 @@ docker run -e DLS_URL=`hostname -i` -e DLS_PORT=443 -p 443:443 -v $WORKING_DIR:/
**Docker-Compose / Deploy stack**

See [`docker-compose.yml`](docker-compose.yml) for a more advanced example (with reverse proxy usage).

```yaml
version: '3.9'

x-dls-variables: &dls-variables
  DLS_URL: localhost # REQUIRED, change to your ip or hostname
  DLS_PORT: 443
  LEASE_EXPIRE_DAYS: 90
  DATABASE: sqlite:////app/database/db.sqlite
@@ -81,36 +70,328 @@ volumes:
  dls-db:
```
## Debian/Ubuntu (manual method using `git clone` and python virtual environment)
Tested on `Debian 11 (bullseye)`, Ubuntu may also work.
**Install requirements**
```shell
apt-get update && apt-get install git python3-venv python3-pip
```
**Install FastAPI-DLS**
```shell
WORKING_DIR=/opt/fastapi-dls
mkdir -p $WORKING_DIR
cd $WORKING_DIR
git clone https://git.collinwebdesigns.de/oscar.krause/fastapi-dls .
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
deactivate
chown -R www-data:www-data $WORKING_DIR
```
**Create keypair and webserver certificate**
```shell
WORKING_DIR=/opt/fastapi-dls/app/cert
mkdir $WORKING_DIR
cd $WORKING_DIR
# create instance private and public key for signing JWTs
openssl genrsa -out $WORKING_DIR/instance.private.pem 2048
openssl rsa -in $WORKING_DIR/instance.private.pem -outform PEM -pubout -out $WORKING_DIR/instance.public.pem
# create ssl certificate for integrated webserver (uvicorn) - because clients rely on ssl
openssl req -x509 -nodes -days 3650 -newkey rsa:2048 -keyout $WORKING_DIR/webserver.key -out $WORKING_DIR/webserver.crt
chown -R www-data:www-data $WORKING_DIR
```
**Test Service**
This is only to test whether the service starts successfully.
```shell
cd /opt/fastapi-dls/app
su - www-data -c "/opt/fastapi-dls/venv/bin/uvicorn main:app --app-dir=/opt/fastapi-dls/app"
```
**Create config file**
```shell
cat <<EOF >/etc/fastapi-dls/env
DLS_URL=127.0.0.1
DLS_PORT=443
LEASE_EXPIRE_DAYS=90
DATABASE=sqlite:////opt/fastapi-dls/app/db.sqlite
EOF
```
**Create service**
```shell
cat <<EOF >/etc/systemd/system/fastapi-dls.service
[Unit]
Description=Service for fastapi-dls
After=network.target
[Service]
User=www-data
Group=www-data
AmbientCapabilities=CAP_NET_BIND_SERVICE
WorkingDirectory=/opt/fastapi-dls/app
EnvironmentFile=/etc/fastapi-dls/env
ExecStart=/opt/fastapi-dls/venv/bin/uvicorn main:app \\
--env-file /etc/fastapi-dls/env \\
--host \$DLS_URL --port \$DLS_PORT \\
--app-dir /opt/fastapi-dls/app \\
--ssl-keyfile /opt/fastapi-dls/app/cert/webserver.key \\
--ssl-certfile /opt/fastapi-dls/app/cert/webserver.crt \\
--proxy-headers
Restart=always
KillSignal=SIGQUIT
Type=simple
NotifyAccess=all
[Install]
WantedBy=multi-user.target
EOF
```
Now you have to run `systemctl daemon-reload`. After that you can start the service
with `systemctl start fastapi-dls.service` and enable autostart with `systemctl enable fastapi-dls.service`.
## Debian/Ubuntu (using `dpkg`)
Packages are available here:
- [GitLab-Registry](https://git.collinwebdesigns.de/oscar.krause/fastapi-dls/-/packages)
Successfully tested with:
- Debian 12 (Bookworm) (works but not recommended because it is currently in *testing* state)
- Ubuntu 22.10 (Kinetic Kudu)
Not working with:
- Debian 11 (Bullseye) and lower (missing `python-jose` dependency)
- Ubuntu 22.04 (Jammy Jellyfish) (not supported as of 15.01.2023 due to [fastapi - uvicorn version mismatch](https://bugs.launchpad.net/ubuntu/+source/fastapi/+bug/1970557))
**Run this on your server instance**
First go to [GitLab-Registry](https://git.collinwebdesigns.de/oscar.krause/fastapi-dls/-/packages) and select your
version. Then you have to copy the download link of the `fastapi-dls_X.Y.Z_amd64.deb` asset.
```shell
apt-get update
FILENAME=/opt/fastapi-dls.deb
wget -O $FILENAME <download-url>
dpkg -i $FILENAME
apt-get install -f --fix-missing
```
Start with `systemctl start fastapi-dls.service` and enable autostart with `systemctl enable fastapi-dls.service`.
## ArchLinux (using `pacman`)
**Shout out to `samicrusader` who created the build file for ArchLinux!**
Packages are available here:
- [GitLab-Registry](https://git.collinwebdesigns.de/oscar.krause/fastapi-dls/-/packages)
```shell
pacman -Sy
FILENAME=/opt/fastapi-dls.pkg.tar.zst
curl -o $FILENAME <download-url>
# or
wget -O $FILENAME <download-url>
pacman -U --noconfirm $FILENAME
```
Start with `systemctl start fastapi-dls.service` and enable autostart with `systemctl enable fastapi-dls.service`.
## Let's Encrypt Certificate (optional)
If you are using the docker installation, you can use `traefik`. Please refer to their documentation.
Note that port 80 must be accessible, and you have to install `socat` if you're using `standalone` mode.
```shell
acme.sh --issue -d example.com \
--cert-file /etc/fastapi-dls/webserver.donotuse.crt \
--key-file /etc/fastapi-dls/webserver.key \
--fullchain-file /etc/fastapi-dls/webserver.crt \
--reloadcmd "systemctl restart fastapi-dls.service"
```
After the first successful issuance, replace `--issue` with `--renew`.
# Configuration

| Variable               | Default                                | Usage                                                                                                  |
|------------------------|----------------------------------------|--------------------------------------------------------------------------------------------------------|
| `DEBUG`                | `false`                                | Toggles `fastapi` debug mode                                                                           |
| `DLS_URL`              | `localhost`                            | Used in client-token to tell guest driver where dls instance is reachable                              |
| `DLS_PORT`             | `443`                                  | Used in client-token to tell guest driver where dls instance is reachable                              |
| `TOKEN_EXPIRE_DAYS`    | `1`                                    | Client auth-token validity (used to authenticate the client against the api, **not the `.tok` file!**) |
| `LEASE_EXPIRE_DAYS`    | `90`                                   | Lease time in days                                                                                     |
| `LEASE_RENEWAL_PERIOD` | `0.15`                                 | The percentage of the lease period that must elapse before a licensed client can renew a license \*1   |
| `DATABASE`             | `sqlite:///db.sqlite`                  | See [official SQLAlchemy docs](https://docs.sqlalchemy.org/en/14/core/engines.html)                    |
| `CORS_ORIGINS`         | `https://{DLS_URL}`                    | Sets `Access-Control-Allow-Origin` header (comma separated string) \*2                                 |
| `SITE_KEY_XID`         | `00000000-0000-0000-0000-000000000000` | Site identification uuid                                                                               |
| `INSTANCE_REF`         | `10000000-0000-0000-0000-000000000001` | Instance identification uuid                                                                           |
| `ALLOTMENT_REF`        | `20000000-0000-0000-0000-000000000001` | Allotment identification uuid                                                                          |
| `INSTANCE_KEY_RSA`     | `<app-dir>/cert/instance.private.pem`  | Site-wide private RSA key for signing JWTs \*3                                                         |
| `INSTANCE_KEY_PUB`     | `<app-dir>/cert/instance.public.pem`   | Site-wide public key \*3                                                                               |
\*1 For example, if the lease period is one day and the renewal period is 20%, the client attempts to renew its license
every 4.8 hours. If network connectivity is lost, the loss of connectivity is detected during license renewal and the
client has 19.2 hours in which to re-establish connectivity before its license expires.

\*2 Always use `https`, since guest-drivers only support secure connections!

\*3 If you recreate the instance keys you need to **recreate the client-token for each guest**!
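The renewal arithmetic from note \*1 can be reproduced with a short shell calculation (the values below are the example's one-day / 20% figures, not the defaults):

```shell
# Example values from note *1: one-day lease, 20% renewal period.
LEASE_EXPIRE_DAYS=1
LEASE_RENEWAL_PERIOD=0.20

# Renewal interval: the client renews after this fraction of the lease elapses.
RENEW_HOURS=$(awk -v d="$LEASE_EXPIRE_DAYS" -v p="$LEASE_RENEWAL_PERIOD" 'BEGIN { printf "%.1f", d * 24 * p }')
# Grace window: time left to restore connectivity before the lease expires.
GRACE_HOURS=$(awk -v d="$LEASE_EXPIRE_DAYS" -v p="$LEASE_RENEWAL_PERIOD" 'BEGIN { printf "%.1f", d * 24 * (1 - p) }')

echo "renew every ${RENEW_HOURS}h, grace window ${GRACE_HOURS}h"
# → renew every 4.8h, grace window 19.2h
```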
# Setup (Client)
**The token file has to be copied! It's not enough to C&P file contents, because there can be special characters.**
Successfully tested with these package versions:
- `14.3` (Linux-Host: `510.108.03`, Linux-Guest: `510.108.03`, Windows-Guest: `513.91`)
- `14.4` (Linux-Host: `510.108.03`, Linux-Guest: `510.108.03`, Windows-Guest: `514.08`)
- `15.0` (Linux-Host: `525.60.12`, Linux-Guest: `525.60.13`, Windows-Guest: `527.41`)
## Linux
Download *client-token* and place it into `/etc/nvidia/ClientConfigToken`:
```shell
curl --insecure -L -X GET https://<dls-hostname-or-ip>/-/client-token -o /etc/nvidia/ClientConfigToken/client_configuration_token_$(date '+%d-%m-%Y-%H-%M-%S').tok
# or
wget --no-check-certificate -O /etc/nvidia/ClientConfigToken/client_configuration_token_$(date '+%d-%m-%Y-%H-%M-%S').tok https://<dls-hostname-or-ip>/-/client-token
```
Restart `nvidia-gridd` service:
```shell
service nvidia-gridd restart
```

Check licensing status:

```shell
nvidia-smi -q | grep "License"
```
Output should be something like:
```text
vGPU Software Licensed Product
License Status : Licensed (Expiry: YYYY-M-DD hh:mm:ss GMT)
```
Done. For more information check the [troubleshoot section](#troubleshoot).
## Windows

**Power-Shell** (run as administrator!)

Download *client-token* and place it into `C:\Program Files\NVIDIA Corporation\vGPU Licensing\ClientConfigToken`:
```shell
curl.exe --insecure -L -X GET https://<dls-hostname-or-ip>/-/client-token -o "C:\Program Files\NVIDIA Corporation\vGPU Licensing\ClientConfigToken\client_configuration_token_$($(Get-Date).tostring('dd-MM-yy-hh-mm-ss')).tok"
```
Restart the `NVDisplay.ContainerLocalSystem` service:

```shell
Restart-Service NVDisplay.ContainerLocalSystem
```
Check licensing status:
```shell
& 'C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe' -q | Select-String "License"
```
Output should be something like:
```text
vGPU Software Licensed Product
License Status : Licensed (Expiry: YYYY-M-DD hh:mm:ss GMT)
```
Done. For more information check the [troubleshoot section](#troubleshoot).
# Endpoints
### `GET /`
Redirect to `/-/readme`.
### `GET /-/health`
Status endpoint, used for *healthcheck*.
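The Dockerfile healthcheck queries this endpoint; a manual check could look like the following. The exact response shape is an assumption based on the healthcheck usage, which only relies on the `status` field:

```shell
# Query the health endpoint on a live instance (self-signed certs need --insecure):
#   curl --insecure -s https://localhost/-/health
# A response like the following is expected (shape assumed):
RESPONSE='{"status":"up"}'

# Extract the status field without jq, using sed:
STATUS=$(printf '%s' "$RESPONSE" | sed -n 's/.*"status"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/\1/p')
echo "service status: $STATUS"
```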
### `GET /-/config`
Shows current runtime environment variables and their values.
### `GET /-/readme`
HTML rendered README.md.
### `GET /-/docs`, `GET /-/redoc`
OpenAPI specifications rendered from `GET /-/openapi.json`.
### `GET /-/manage`
Shows a very basic UI to delete origins or leases.
### `GET /-/origins?leases=false`
List registered origins.
| Query Parameter | Default | Usage |
|-----------------|---------|--------------------------------------|
| `leases` | `false` | Include referenced leases per origin |
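For example, listing origins together with their referenced leases (the host below is a placeholder; replace it with your instance):

```shell
DLS_BASE="https://localhost"   # placeholder: replace with your dls host
URL="${DLS_BASE}/-/origins?leases=true"
echo "$URL"
# Then query it, e.g.:
#   curl --insecure -s "$URL"
```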
### `DELETE /-/origins`
Deletes all origins and their leases.
### `GET /-/leases?origin=false`
List current leases.
| Query Parameter | Default | Usage |
|-----------------|---------|-------------------------------------|
| `origin` | `false` | Include referenced origin per lease |
### `DELETE /-/lease/{lease_ref}`
Deletes a lease.
### `GET /-/client-token`
Generates a client token (see [installation](#installation)).
### Others
There are many other internal api endpoints for handling authentication and the lease process.
# Troubleshoot

**Please make sure that fastapi-dls and your guests are in the same timezone!**
## Linux

Logs are available with `journalctl -u nvidia-gridd -f`.
@@ -123,10 +404,44 @@ Logs are available in `C:\Users\Public\Documents\Nvidia\LoggingLog.NVDisplay.Con
## Linux

### `uvicorn.error:Invalid HTTP request received.`

This message can be ignored.
- Ref. https://github.com/encode/uvicorn/issues/441
<details>
<summary>Log example</summary>
```
WARNING:uvicorn.error:Invalid HTTP request received.
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/uvicorn/protocols/http/h11_impl.py", line 129, in handle_events
event = self.conn.next_event()
File "/usr/lib/python3/dist-packages/h11/_connection.py", line 485, in next_event
exc._reraise_as_remote_protocol_error()
File "/usr/lib/python3/dist-packages/h11/_util.py", line 77, in _reraise_as_remote_protocol_error
raise self
File "/usr/lib/python3/dist-packages/h11/_connection.py", line 467, in next_event
event = self._extract_next_receive_event()
File "/usr/lib/python3/dist-packages/h11/_connection.py", line 409, in _extract_next_receive_event
event = self._reader(self._receive_buffer)
File "/usr/lib/python3/dist-packages/h11/_readers.py", line 84, in maybe_read_from_IDLE_client
raise LocalProtocolError("no request line received")
h11._util.RemoteProtocolError: no request line received
```
</details>
## Windows
### Required cipher on Windows Guests (e.g. managed by domain controller with GPO)
It is required to enable `SHA1` (`TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA_P521`)
in the [Windows cipher suites](https://learn.microsoft.com/en-us/windows-server/security/tls/manage-tls).
### Multiple Display Container LS Instances
On Windows, some machines run two or more instances of `NVIDIA Display Container LS`. This causes a
problem in the licensing flow. As you can see in the logs below, there are two lines with `NLS initialized`, each prefixed
with `<1>` and `<2>`. So it is possible that *daemon 1* fetches a valid license through the dls-service, and *daemon 2*
@@ -184,3 +499,42 @@ Dec 20 17:53:34 ubuntu-grid-server nvidia-gridd[10354]: License acquired success
```

</details>
### Error on releasing leases on shutdown (can be ignored and/or fixed with reverse proxy)
The driver wants to release its current leases when Windows shuts down. This endpoint needs to be a plain http endpoint.
The error message can safely be ignored (since we have no license limitation :P) and looks like this:
<details>
<summary>Log example</summary>
```
<1>:NLS initialized
<1>:License acquired successfully. (Info: 192.168.178.110, NVIDIA RTX Virtual Workstation; Expiry: 2023-3-30 23:0:22 GMT)
<0>:Failed to return license to 192.168.178.110 (Error: Generic network communication failure)
<0>:End Logging
```
#### log with nginx as reverse proxy (see [docker-compose.yml](docker-compose.yml))
```
<1>:NLS initialized
<2>:NLS initialized
<1>:Valid GRID license not found. GPU features and performance will be fully degraded. To enable full functionality please configure licensing details.
<1>:License acquired successfully. (Info: 192.168.178.33, NVIDIA RTX Virtual Workstation; Expiry: 2023-1-4 16:48:20 GMT)
<2>:Valid GRID license not found. GPU features and performance will be fully degraded. To enable full functionality please configure licensing details.
<2>:License acquired successfully from local trusted store. (Info: 192.168.178.33, NVIDIA RTX Virtual Workstation; Expiry: 2023-1-4 16:48:20 GMT)
<2>:End Logging
<1>:End Logging
<0>:License returned successfully. (Info: 192.168.178.33)
<0>:End Logging
```
</details>
# Credits
Thanks to the vGPU community and to everyone who uses this project and reports bugs.

Special thanks to @samicrusader, who created the build file for Arch Linux.


@@ -3,121 +3,197 @@ from base64 import b64encode as b64enc
from hashlib import sha256
from uuid import uuid4
from os.path import join, dirname
-from os import getenv
+from os import getenv as env
from dotenv import load_dotenv
-from fastapi import FastAPI, HTTPException
+from fastapi import FastAPI
from fastapi.requests import Request
-from fastapi.encoders import jsonable_encoder
-import json
-from datetime import datetime
+from json import loads as json_loads
+from datetime import datetime, timedelta
from dateutil.relativedelta import relativedelta
from calendar import timegm
-from jose import jws, jwk, jwt
+from jose import jws, jwk, jwt, JWTError
from jose.constants import ALGORITHMS
from starlette.middleware.cors import CORSMiddleware
-from starlette.responses import StreamingResponse, JSONResponse, HTMLResponse
+from starlette.responses import StreamingResponse, JSONResponse as JSONr, HTMLResponse as HTMLr, Response, RedirectResponse
-import dataset
-from Crypto.PublicKey import RSA
-from Crypto.PublicKey.RSA import RsaKey
+from sqlalchemy import create_engine
+from sqlalchemy.orm import sessionmaker
+from util import load_key, load_file
+from orm import Origin, Lease, init as db_init, migrate

logger = logging.getLogger()
load_dotenv('../version.env')

-VERSION, COMMIT, DEBUG = getenv('VERSION', 'unknown'), getenv('COMMIT', 'unknown'), bool(getenv('DEBUG', False))
+VERSION, COMMIT, DEBUG = env('VERSION', 'unknown'), env('COMMIT', 'unknown'), bool(env('DEBUG', False))
config = dict(openapi_url='/-/openapi.json', docs_url='/-/docs', redoc_url='/-/redoc')
app = FastAPI(title='FastAPI-DLS', description='Minimal Delegated License Service (DLS).', version=VERSION, **config)
db = create_engine(str(env('DATABASE', 'sqlite:///db.sqlite')))
db_init(db), migrate(db)
def load_file(filename) -> bytes: # everything prefixed with "INSTANCE_*" is used as "SERVICE_INSTANCE_*" or "SI_*" in official dls service
with open(filename, 'rb') as file: DLS_URL = str(env('DLS_URL', 'localhost'))
content = file.read() DLS_PORT = int(env('DLS_PORT', '443'))
return content SITE_KEY_XID = str(env('SITE_KEY_XID', '00000000-0000-0000-0000-000000000000'))
INSTANCE_REF = str(env('INSTANCE_REF', '10000000-0000-0000-0000-000000000001'))
ALLOTMENT_REF = str(env('ALLOTMENT_REF', '20000000-0000-0000-0000-000000000001'))
def load_key(filename) -> RsaKey: INSTANCE_KEY_RSA = load_key(str(env('INSTANCE_KEY_RSA', join(dirname(__file__), 'cert/instance.private.pem'))))
return RSA.import_key(extern_key=load_file(filename), passphrase=None) INSTANCE_KEY_PUB = load_key(str(env('INSTANCE_KEY_PUB', join(dirname(__file__), 'cert/instance.public.pem'))))
TOKEN_EXPIRE_DELTA = relativedelta(days=int(env('TOKEN_EXPIRE_DAYS', 1)), hours=int(env('TOKEN_EXPIRE_HOURS', 0)))
LEASE_EXPIRE_DELTA = relativedelta(days=int(env('LEASE_EXPIRE_DAYS', 90)), hours=int(env('LEASE_EXPIRE_HOURS', 0)))
# todo: initialize certificate (or should be done by user, and passed through "volumes"?) LEASE_RENEWAL_PERIOD = float(env('LEASE_RENEWAL_PERIOD', 0.15))
LEASE_RENEWAL_DELTA = timedelta(days=int(env('LEASE_EXPIRE_DAYS', 90)), hours=int(env('LEASE_EXPIRE_HOURS', 0)))
__details = dict( CORS_ORIGINS = str(env('CORS_ORIGINS', '')).split(',') if (env('CORS_ORIGINS')) else [f'https://{DLS_URL}']
title='FastAPI-DLS',
description='Minimal Delegated License Service (DLS).',
version=VERSION,
)
app, db = FastAPI(**__details), dataset.connect(str(getenv('DATABASE', 'sqlite:///db.sqlite')))
TOKEN_EXPIRE_DELTA = relativedelta(hours=1) # days=1
LEASE_EXPIRE_DELTA = relativedelta(days=int(getenv('LEASE_EXPIRE_DAYS', 90)))
DLS_URL = str(getenv('DLS_URL', 'localhost'))
DLS_PORT = int(getenv('DLS_PORT', '443'))
SITE_KEY_XID = getenv('SITE_KEY_XID', '00000000-0000-0000-0000-000000000000')
INSTANCE_KEY_RSA = load_key(join(dirname(__file__), 'cert/instance.private.pem'))
INSTANCE_KEY_PUB = load_key(join(dirname(__file__), 'cert/instance.public.pem'))
CORS_ORIGINS = getenv('CORS_ORIGINS').split(',') if (getenv('CORS_ORIGINS')) else f'https://{DLS_URL}' # todo: prevent static https
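The `CORS_ORIGINS` value falls back to the service's own HTTPS origin when the variable is unset. A minimal sketch of the same parsing logic (the helper name `parse_origins` is ours, not part of the project):

```python
def parse_origins(raw, dls_url):
    # mirrors the CORS_ORIGINS expression: comma-separated list from the
    # environment, falling back to the service's own https origin
    return raw.split(',') if raw else [f'https://{dls_url}']

print(parse_origins(None, 'localhost'))  # ['https://localhost']
print(parse_origins('https://a.example,https://b.example', 'localhost'))
```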
jwt_encode_key = jwk.construct(INSTANCE_KEY_RSA.export_key().decode('utf-8'), algorithm=ALGORITHMS.RS256)
-jwt_decode_key = jwk.construct(INSTANCE_KEY_PUB.export_key().decode('utf-8'), algorithm=ALGORITHMS.RS512)
+jwt_decode_key = jwk.construct(INSTANCE_KEY_PUB.export_key().decode('utf-8'), algorithm=ALGORITHMS.RS256)

app.debug = DEBUG
app.add_middleware(
    CORSMiddleware,
    allow_origins=CORS_ORIGINS,
    allow_credentials=True,
-    allow_methods=["*"],
-    allow_headers=["*"],
+    allow_methods=['*'],
+    allow_headers=['*'],
)

logger.setLevel(logging.DEBUG if DEBUG else logging.INFO)

-def get_token(request: Request) -> dict:
-    authorization_header = request.headers['authorization']
+def __get_token(request: Request) -> dict:
+    authorization_header = request.headers.get('authorization')
    token = authorization_header.split(' ')[1]
    return jwt.decode(token=token, key=jwt_decode_key, algorithms=ALGORITHMS.RS256, options={'verify_aud': False})
-@app.get('/')
+@app.get('/', summary='Index')
async def index():
+    return RedirectResponse('/-/readme')
@app.get('/-/', summary='* Index')
async def _index():
return RedirectResponse('/-/readme')
@app.get('/-/health', summary='* Health')
async def _health(request: Request):
return JSONr({'status': 'up'})
@app.get('/-/config', summary='* Config', description='returns environment variables.')
async def _config():
return JSONr({
'VERSION': str(VERSION),
'COMMIT': str(COMMIT),
'DEBUG': str(DEBUG),
'DLS_URL': str(DLS_URL),
'DLS_PORT': str(DLS_PORT),
'SITE_KEY_XID': str(SITE_KEY_XID),
'INSTANCE_REF': str(INSTANCE_REF),
'ALLOTMENT_REF': [str(ALLOTMENT_REF)],
'TOKEN_EXPIRE_DELTA': str(TOKEN_EXPIRE_DELTA),
'LEASE_EXPIRE_DELTA': str(LEASE_EXPIRE_DELTA),
'LEASE_RENEWAL_PERIOD': str(LEASE_RENEWAL_PERIOD),
'CORS_ORIGINS': str(CORS_ORIGINS),
})
@app.get('/-/readme', summary='* Readme')
async def _readme():
    from markdown import markdown
    content = load_file('../README.md').decode('utf-8')
-    return HTMLResponse(markdown(text=content, extensions=['tables', 'fenced_code', 'md_in_html', 'nl2br', 'toc']))
+    return HTMLr(markdown(text=content, extensions=['tables', 'fenced_code', 'md_in_html', 'nl2br', 'toc']))

-@app.get('/status')
-async def status(request: Request):
-    return JSONResponse({'status': 'up', 'version': VERSION, 'commit': COMMIT, 'debug': DEBUG})
+@app.get('/-/manage', summary='* Management UI')
+async def _manage(request: Request):
+    response = '''
<!DOCTYPE html>
<html>
<head>
<title>FastAPI-DLS Management</title>
</head>
<body>
<button onclick="deleteOrigins()">delete ALL origins and their leases</button>
<button onclick="deleteLease()">delete specific lease</button>
<script>
function deleteOrigins() {
const response = confirm('Are you sure you want to delete all origins and their leases?');
if (response) {
var xhr = new XMLHttpRequest();
xhr.open("DELETE", '/-/origins', true);
xhr.send();
}
}
function deleteLease(lease_ref) {
if(lease_ref === undefined)
lease_ref = window.prompt("Please enter 'lease_ref' which should be deleted");
if(lease_ref === null || lease_ref === "")
return
var xhr = new XMLHttpRequest();
xhr.open("DELETE", `/-/lease/${lease_ref}`, true);
xhr.send();
}
</script>
</body>
</html>
'''
return HTMLr(response)
-@app.get('/-/origins')
-async def _origins(request: Request):
-    response = list(map(lambda x: jsonable_encoder(x), db['origin'].all()))
-    return JSONResponse(response)
+@app.get('/-/origins', summary='* Origins')
+async def _origins(request: Request, leases: bool = False):
+    session = sessionmaker(bind=db)()
+    response = []
for origin in session.query(Origin).all():
x = origin.serialize()
if leases:
serialize = dict(renewal_period=LEASE_RENEWAL_PERIOD, renewal_delta=LEASE_RENEWAL_DELTA)
x['leases'] = list(map(lambda _: _.serialize(**serialize), Lease.find_by_origin_ref(db, origin.origin_ref)))
response.append(x)
session.close()
return JSONr(response)
-@app.get('/-/leases')
-async def _leases(request: Request):
-    response = list(map(lambda x: jsonable_encoder(x), db['lease'].all()))
-    return JSONResponse(response)
+@app.delete('/-/origins', summary='* Origins')
+async def _origins_delete(request: Request):
+    Origin.delete(db)
+    return Response(status_code=201)
@app.get('/-/leases', summary='* Leases')
async def _leases(request: Request, origin: bool = False):
session = sessionmaker(bind=db)()
response = []
for lease in session.query(Lease).all():
serialize = dict(renewal_period=LEASE_RENEWAL_PERIOD, renewal_delta=LEASE_RENEWAL_DELTA)
x = lease.serialize(**serialize)
if origin:
lease_origin = session.query(Origin).filter(Origin.origin_ref == lease.origin_ref).first()
if lease_origin is not None:
x['origin'] = lease_origin.serialize()
response.append(x)
session.close()
return JSONr(response)
@app.delete('/-/lease/{lease_ref}', summary='* Lease')
async def _lease_delete(request: Request, lease_ref: str):
if Lease.delete(db, lease_ref) == 1:
return Response(status_code=201)
return JSONr(status_code=404, content={'status': 404, 'detail': 'lease not found'})
# venv/lib/python3.9/site-packages/nls_core_service_instance/service_instance_token_manager.py
-@app.get('/client-token')
-async def client_token():
+@app.get('/-/client-token', summary='* Client-Token', description='creates a new messenger token for this service instance')
+async def _client_token():
    cur_time = datetime.utcnow()
    exp_time = cur_time + relativedelta(years=12)
-    service_instance_public_key_configuration = {
-        "service_instance_public_key_me": {
-            "mod": hex(INSTANCE_KEY_PUB.public_key().n)[2:],
-            "exp": INSTANCE_KEY_PUB.public_key().e,
-        },
-        "service_instance_public_key_pem": INSTANCE_KEY_PUB.export_key().decode('utf-8'),
-        "key_retention_mode": "LATEST_ONLY"
-    }
    payload = {
        "jti": str(uuid4()),
        "iss": "NLS Service Instance",
@@ -126,10 +202,10 @@ async def client_token():
        "nbf": timegm(cur_time.timetuple()),
        "exp": timegm(exp_time.timetuple()),
        "update_mode": "ABSOLUTE",
-        "scope_ref_list": [str(uuid4())],
+        "scope_ref_list": [ALLOTMENT_REF],
        "fulfillment_class_ref_list": [],
        "service_instance_configuration": {
-            "nls_service_instance_ref": "00000000-0000-0000-0000-000000000000",
+            "nls_service_instance_ref": INSTANCE_REF,
            "svc_port_set_list": [
                {
                    "idx": 0,
@@ -139,40 +215,45 @@ async def client_token():
            ],
            "node_url_list": [{"idx": 0, "url": DLS_URL, "url_qr": DLS_URL, "svc_port_set_idx": 0}]
        },
-        "service_instance_public_key_configuration": service_instance_public_key_configuration,
+        "service_instance_public_key_configuration": {
+            "service_instance_public_key_me": {
+                "mod": hex(INSTANCE_KEY_PUB.public_key().n)[2:],
+                "exp": int(INSTANCE_KEY_PUB.public_key().e),
+            },
+            "service_instance_public_key_pem": INSTANCE_KEY_PUB.export_key().decode('utf-8'),
+            "key_retention_mode": "LATEST_ONLY"
+        },
    }
    content = jws.sign(payload, key=jwt_encode_key, headers=None, algorithm=ALGORITHMS.RS256)

    response = StreamingResponse(iter([content]), media_type="text/plain")
-    filename = f'client_configuration_token_{datetime.now().strftime("%d-%m-%y-%H-%M-%S")}'
+    filename = f'client_configuration_token_{datetime.now().strftime("%d-%m-%y-%H-%M-%S")}.tok'
    response.headers["Content-Disposition"] = f'attachment; filename={filename}'
    return response
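The token's `nbf`/`exp` claims are epoch seconds produced via `calendar.timegm`. A small sketch of that arithmetic (using `timedelta` instead of `dateutil.relativedelta` to stay dependency-free, so "12 years" is approximated as 4380 days):

```python
from calendar import timegm
from datetime import datetime, timedelta

cur_time = datetime.utcnow()
exp_time = cur_time + timedelta(days=365 * 12)  # roughly the 12-year client-token lifetime

# timegm interprets the struct_time as UTC, unlike time.mktime (which assumes local time)
nbf, exp = timegm(cur_time.timetuple()), timegm(exp_time.timetuple())
```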
# venv/lib/python3.9/site-packages/nls_services_auth/test/test_origins_controller.py
-# {"candidate_origin_ref":"00112233-4455-6677-8899-aabbccddeeff","environment":{"fingerprint":{"mac_address_list":["ff:ff:ff:ff:ff:ff"]},"hostname":"my-hostname","ip_address_list":["192.168.178.123","fe80::","fe80::1%enp6s18"],"guest_driver_version":"510.85.02","os_platform":"Debian GNU/Linux 11 (bullseye) 11","os_version":"11 (bullseye)"},"registration_pending":false,"update_pending":false}
-@app.post('/auth/v1/origin')
+@app.post('/auth/v1/origin', description='find or create an origin')
async def auth_v1_origin(request: Request):
-    j = json.loads((await request.body()).decode('utf-8'))
-    origin_ref = j['candidate_origin_ref']
+    j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.utcnow()
+    origin_ref = j.get('candidate_origin_ref')
    logging.info(f'> [ origin ]: {origin_ref}: {j}')

-    data = dict(
+    data = Origin(
        origin_ref=origin_ref,
-        hostname=j['environment']['hostname'],
-        guest_driver_version=j['environment']['guest_driver_version'],
-        os_platform=j['environment']['os_platform'], os_version=j['environment']['os_version'],
+        hostname=j.get('environment').get('hostname'),
+        guest_driver_version=j.get('environment').get('guest_driver_version'),
+        os_platform=j.get('environment').get('os_platform'), os_version=j.get('environment').get('os_version'),
    )

-    db['origin'].upsert(data, ['origin_ref'])
-    cur_time = datetime.utcnow()
+    Origin.create_or_update(db, data)

    response = {
        "origin_ref": origin_ref,
-        "environment": j['environment'],
+        "environment": j.get('environment'),
        "svc_port_set_list": None,
        "node_url_list": None,
        "node_query_order": None,
@@ -180,64 +261,86 @@ async def auth_v1_origin(request: Request):
        "sync_timestamp": cur_time.isoformat()
    }

-    return JSONResponse(response)
+    return JSONr(response)
# venv/lib/python3.9/site-packages/nls_services_auth/test/test_origins_controller.py
@app.post('/auth/v1/origin/update', description='update an origin evidence')
async def auth_v1_origin_update(request: Request):
j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.utcnow()
origin_ref = j.get('origin_ref')
logging.info(f'> [ update ]: {origin_ref}: {j}')
data = Origin(
origin_ref=origin_ref,
hostname=j.get('environment').get('hostname'),
guest_driver_version=j.get('environment').get('guest_driver_version'),
os_platform=j.get('environment').get('os_platform'), os_version=j.get('environment').get('os_version'),
)
Origin.create_or_update(db, data)
response = {
"environment": j.get('environment'),
"prompts": None,
"sync_timestamp": cur_time.isoformat()
}
return JSONr(response)
# venv/lib/python3.9/site-packages/nls_services_auth/test/test_auth_controller.py
# venv/lib/python3.9/site-packages/nls_core_auth/auth.py - CodeResponse
-# {"code_challenge":"...","origin_ref":"00112233-4455-6677-8899-aabbccddeeff"}
-@app.post('/auth/v1/code')
+@app.post('/auth/v1/code', description='get an authorization code')
async def auth_v1_code(request: Request):
-    j = json.loads((await request.body()).decode('utf-8'))
-    origin_ref = j['origin_ref']
+    j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.utcnow()
+    origin_ref = j.get('origin_ref')
    logging.info(f'> [ code ]: {origin_ref}: {j}')

-    cur_time = datetime.utcnow()
    delta = relativedelta(minutes=15)
    expires = cur_time + delta

    payload = {
        'iat': timegm(cur_time.timetuple()),
        'exp': timegm(expires.timetuple()),
-        'challenge': j['code_challenge'],
-        'origin_ref': j['code_challenge'],
+        'challenge': j.get('code_challenge'),
+        'origin_ref': j.get('origin_ref'),
        'key_ref': SITE_KEY_XID,
        'kid': SITE_KEY_XID
    }

    auth_code = jws.sign(payload, key=jwt_encode_key, headers={'kid': payload.get('kid')}, algorithm=ALGORITHMS.RS256)

-    db['auth'].delete(origin_ref=origin_ref, expires={'<=': cur_time - delta})
-    db['auth'].insert(dict(origin_ref=origin_ref, code_challenge=j['code_challenge'], expires=expires))

    response = {
        "auth_code": auth_code,
        "sync_timestamp": cur_time.isoformat(),
        "prompts": None
    }

-    return JSONResponse(response)
+    return JSONr(response)
# venv/lib/python3.9/site-packages/nls_services_auth/test/test_auth_controller.py
# venv/lib/python3.9/site-packages/nls_core_auth/auth.py - TokenResponse
-# {"auth_code":"...","code_verifier":"..."}
-@app.post('/auth/v1/token')
+@app.post('/auth/v1/token', description='exchange auth code and verifier for token')
async def auth_v1_token(request: Request):
-    j = json.loads((await request.body()).decode('utf-8'))
-    payload = jwt.decode(token=j['auth_code'], key=jwt_decode_key)
-    code_challenge = payload['origin_ref']
+    j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.utcnow()
+    try:
+        payload = jwt.decode(token=j.get('auth_code'), key=jwt_decode_key)
+    except JWTError as e:
+        return JSONr(status_code=400, content={'status': 400, 'title': 'invalid token', 'detail': str(e)})

-    origin_ref = db['auth'].find_one(code_challenge=code_challenge)['origin_ref']
-    logging.info(f'> [ auth ]: {origin_ref} ({code_challenge}): {j}')
+    origin_ref = payload.get('origin_ref')
+    logging.info(f'> [ auth ]: {origin_ref}: {j}')

    # validate the code challenge
-    if payload['challenge'] != b64enc(sha256(j['code_verifier'].encode('utf-8')).digest()).rstrip(b'=').decode('utf-8'):
-        raise HTTPException(status_code=401, detail='expected challenge did not match verifier')
+    challenge = b64enc(sha256(j.get('code_verifier').encode('utf-8')).digest()).rstrip(b'=').decode('utf-8')
+    if payload.get('challenge') != challenge:
+        return JSONr(status_code=401, content={'status': 401, 'detail': 'expected challenge did not match verifier'})

-    cur_time = datetime.utcnow()
    access_expires_on = cur_time + TOKEN_EXPIRE_DELTA

    new_payload = {
@@ -246,7 +349,7 @@ async def auth_v1_token(request: Request):
        'iss': 'https://cls.nvidia.org',
        'aud': 'https://cls.nvidia.org',
        'exp': timegm(access_expires_on.timetuple()),
-        'origin_ref': payload['origin_ref'],
+        'origin_ref': origin_ref,
        'key_ref': SITE_KEY_XID,
        'kid': SITE_KEY_XID,
    }
@@ -259,41 +362,45 @@ async def auth_v1_token(request: Request):
        "sync_timestamp": cur_time.isoformat(),
    }

-    return JSONResponse(response)
+    return JSONr(response)
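The `/auth/v1/code` and `/auth/v1/token` pair implements a PKCE-style check: the client first sends a `code_challenge`, then proves possession of the matching `code_verifier`, and the server recomputes the challenge with the same expression used in `auth_v1_token`. A self-contained sketch (the helper name `make_challenge` is ours):

```python
from base64 import b64encode
from hashlib import sha256

def make_challenge(verifier: str) -> str:
    # standard base64 of the SHA-256 digest with the '=' padding stripped,
    # matching the comparison against payload['challenge'] above
    return b64encode(sha256(verifier.encode('utf-8')).digest()).rstrip(b'=').decode('utf-8')

challenge = make_challenge('example-code-verifier')  # 43 characters, no padding
```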
-# {'fulfillment_context': {'fulfillment_class_ref_list': []}, 'lease_proposal_list': [{'license_type_qualifiers': {'count': 1}, 'product': {'name': 'NVIDIA RTX Virtual Workstation'}}], 'proposal_evaluation_mode': 'ALL_OF', 'scope_ref_list': ['00112233-4455-6677-8899-aabbccddeeff']}
-@app.post('/leasing/v1/lessor')
+# venv/lib/python3.9/site-packages/nls_services_lease/test/test_lease_multi_controller.py
+@app.post('/leasing/v1/lessor', description='request multiple leases (borrow) for current origin')
async def leasing_v1_lessor(request: Request):
-    j, token = json.loads((await request.body()).decode('utf-8')), get_token(request)
-    code_challenge = token['origin_ref']
-    scope_ref_list = j['scope_ref_list']
+    j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.utcnow()
+    try:
+        token = __get_token(request)
+    except JWTError:
+        return JSONr(status_code=401, content={'status': 401, 'detail': 'token is not valid'})

-    origin_ref = db['auth'].find_one(code_challenge=code_challenge)['origin_ref']
-    logging.info(f'> [ create ]: {origin_ref} ({code_challenge}): create leases for scope_ref_list {scope_ref_list}')
-    cur_time = datetime.utcnow()
+    origin_ref = token.get('origin_ref')
+    scope_ref_list = j.get('scope_ref_list')
+    logging.info(f'> [ create ]: {origin_ref}: create leases for scope_ref_list {scope_ref_list}')

    lease_result_list = []
    for scope_ref in scope_ref_list:
+        # if scope_ref not in [ALLOTMENT_REF]:
+        #     return JSONr(status_code=500, detail=f'no service instances found for scopes: ["{scope_ref}"]')
+        lease_ref = str(uuid4())
        expires = cur_time + LEASE_EXPIRE_DELTA
        lease_result_list.append({
            "ordinal": 0,
            # https://docs.nvidia.com/license-system/latest/nvidia-license-system-user-guide/index.html
            "lease": {
-                "ref": scope_ref,
+                "ref": lease_ref,
                "created": cur_time.isoformat(),
                "expires": expires.isoformat(),
-                # The percentage of the lease period that must elapse before a licensed client can renew a license
-                "recommended_lease_renewal": 0.15,
+                "recommended_lease_renewal": LEASE_RENEWAL_PERIOD,
                "offline_lease": "true",
                "license_type": "CONCURRENT_COUNTED_SINGLE"
            }
        })

-        data = dict(origin_ref=origin_ref, lease_ref=scope_ref, lease_created=cur_time, lease_expires=expires)
-        db['lease'].insert_ignore(data, ['origin_ref', 'lease_ref'])  # todo: handle update
+        data = Lease(origin_ref=origin_ref, lease_ref=lease_ref, lease_created=cur_time, lease_expires=expires)
+        Lease.create_or_update(db, data)

    response = {
        "lease_result_list": lease_result_list,
@@ -302,80 +409,124 @@ async def leasing_v1_lessor(request: Request):
        "prompts": None
    }

-    return JSONResponse(response)
+    return JSONr(response)
# venv/lib/python3.9/site-packages/nls_services_lease/test/test_lease_multi_controller.py
# venv/lib/python3.9/site-packages/nls_dal_service_instance_dls/schema/service_instance/V1_0_21__product_mapping.sql
-@app.get('/leasing/v1/lessor/leases')
+@app.get('/leasing/v1/lessor/leases', description='get active leases for current origin')
async def leasing_v1_lessor_lease(request: Request):
-    token = get_token(request)
-    code_challenge = token['origin_ref']
-    origin_ref = db['auth'].find_one(code_challenge=code_challenge)['origin_ref']
-    active_lease_list = list(map(lambda x: x['lease_ref'], db['lease'].find(origin_ref=origin_ref)))
-    logging.info(f'> [ leases ]: {origin_ref} ({code_challenge}): found {len(active_lease_list)} active leases')
-    cur_time = datetime.utcnow()
+    token, cur_time = __get_token(request), datetime.utcnow()
+    origin_ref = token.get('origin_ref')
+    active_lease_list = list(map(lambda x: x.lease_ref, Lease.find_by_origin_ref(db, origin_ref)))
+    logging.info(f'> [ leases ]: {origin_ref}: found {len(active_lease_list)} active leases')

    response = {
        "active_lease_list": active_lease_list,
        "sync_timestamp": cur_time.isoformat(),
        "prompts": None
    }

-    return JSONResponse(response)
+    return JSONr(response)
# venv/lib/python3.9/site-packages/nls_services_lease/test/test_lease_single_controller.py
# venv/lib/python3.9/site-packages/nls_core_lease/lease_single.py
-@app.put('/leasing/v1/lease/{lease_ref}')
+@app.put('/leasing/v1/lease/{lease_ref}', description='renew a lease')
async def leasing_v1_lease_renew(request: Request, lease_ref: str):
-    token = get_token(request)
-    code_challenge = token['origin_ref']
-    origin_ref = db['auth'].find_one(code_challenge=code_challenge)['origin_ref']
-    logging.info(f'> [ renew ]: {origin_ref} ({code_challenge}): renew {lease_ref}')
+    token, cur_time = __get_token(request), datetime.utcnow()
+    origin_ref = token.get('origin_ref')
+    logging.info(f'> [ renew ]: {origin_ref}: renew {lease_ref}')

-    if db['lease'].count(origin_ref=origin_ref, lease_ref=lease_ref) == 0:
-        raise HTTPException(status_code=404, detail='requested lease not available')
+    entity = Lease.find_by_origin_ref_and_lease_ref(db, origin_ref, lease_ref)
+    if entity is None:
+        return JSONr(status_code=404, content={'status': 404, 'detail': 'requested lease not available'})

-    cur_time = datetime.utcnow()
    expires = cur_time + LEASE_EXPIRE_DELTA
    response = {
        "lease_ref": lease_ref,
        "expires": expires.isoformat(),
-        "recommended_lease_renewal": 0.16,
+        "recommended_lease_renewal": LEASE_RENEWAL_PERIOD,
        "offline_lease": True,
        "prompts": None,
        "sync_timestamp": cur_time.isoformat(),
    }

-    data = dict(origin_ref=origin_ref, lease_ref=lease_ref, lease_expires=expires, lease_last_update=cur_time)
-    db['lease'].update(data, ['origin_ref', 'lease_ref'])
+    Lease.renew(db, entity, expires, cur_time)

-    return JSONResponse(response)
+    return JSONr(response)
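`recommended_lease_renewal` is a fraction of the lease period (default `0.15`) telling the client how soon it should renew. A sketch of the resulting deadline computation (the function name is ours, not part of the project):

```python
from datetime import datetime, timedelta

def renewal_deadline(created: datetime, expires: datetime, period: float = 0.15) -> datetime:
    # the client should renew once `period` of the lease lifetime has elapsed:
    # created + period * (expires - created)
    return created + (expires - created) * period

created = datetime(2023, 1, 1)
deadline = renewal_deadline(created, created + timedelta(days=90))  # 13.5 days after creation
```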
-@app.delete('/leasing/v1/lessor/leases')
+# venv/lib/python3.9/site-packages/nls_services_lease/test/test_lease_single_controller.py
@app.delete('/leasing/v1/lease/{lease_ref}', description='release (return) a lease')
async def leasing_v1_lease_delete(request: Request, lease_ref: str):
token, cur_time = __get_token(request), datetime.utcnow()
origin_ref = token.get('origin_ref')
logging.info(f'> [ return ]: {origin_ref}: return {lease_ref}')
    entity = Lease.find_by_lease_ref(db, lease_ref)
    if entity is None:
        return JSONr(status_code=404, content={'status': 404, 'detail': 'requested lease not available'})
    if entity.origin_ref != origin_ref:
        return JSONr(status_code=403, content={'status': 403, 'detail': 'access or operation forbidden'})
if Lease.delete(db, lease_ref) == 0:
return JSONr(status_code=404, content={'status': 404, 'detail': 'lease not found'})
response = {
"lease_ref": lease_ref,
"prompts": None,
"sync_timestamp": cur_time.isoformat(),
}
return JSONr(response)
# venv/lib/python3.9/site-packages/nls_services_lease/test/test_lease_multi_controller.py
+@app.delete('/leasing/v1/lessor/leases', description='release all leases')
async def leasing_v1_lessor_lease_remove(request: Request):
-    token = get_token(request)
-    code_challenge = token['origin_ref']
-    origin_ref = db['auth'].find_one(code_challenge=code_challenge)['origin_ref']
-    released_lease_list = list(map(lambda x: x['lease_ref'], db['lease'].find(origin_ref=origin_ref)))
-    deletions = db['lease'].delete(origin_ref=origin_ref)
-    logging.info(f'> [ remove ]: {origin_ref} ({code_challenge}): removed {deletions} leases')
-    cur_time = datetime.utcnow()
+    token, cur_time = __get_token(request), datetime.utcnow()
+    origin_ref = token.get('origin_ref')
+    released_lease_list = list(map(lambda x: x.lease_ref, Lease.find_by_origin_ref(db, origin_ref)))
+    deletions = Lease.cleanup(db, origin_ref)
+    logging.info(f'> [ remove ]: {origin_ref}: removed {deletions} leases')

    response = {
        "released_lease_list": released_lease_list,
        "release_failure_list": None,
        "sync_timestamp": cur_time.isoformat(),
        "prompts": None
    }

-    return JSONResponse(response)
+    return JSONr(response)
@app.post('/leasing/v1/lessor/shutdown', description='shutdown all leases')
async def leasing_v1_lessor_shutdown(request: Request):
j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.utcnow()
token = j.get('token')
token = jwt.decode(token=token, key=jwt_decode_key, algorithms=ALGORITHMS.RS256, options={'verify_aud': False})
origin_ref = token.get('origin_ref')
released_lease_list = list(map(lambda x: x.lease_ref, Lease.find_by_origin_ref(db, origin_ref)))
deletions = Lease.cleanup(db, origin_ref)
logging.info(f'> [ shutdown ]: {origin_ref}: removed {deletions} leases')
response = {
"released_lease_list": released_lease_list,
"release_failure_list": None,
"sync_timestamp": cur_time.isoformat(),
"prompts": None
}
return JSONr(response)
if __name__ == '__main__':

app/orm.py (new file, 213 lines)

@@ -0,0 +1,213 @@
from datetime import datetime, timedelta
from dateutil.relativedelta import relativedelta
from sqlalchemy import Column, VARCHAR, CHAR, ForeignKey, DATETIME, update, and_, inspect
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.engine import Engine
from sqlalchemy.orm import sessionmaker
Base = declarative_base()
class Origin(Base):
__tablename__ = "origin"
origin_ref = Column(CHAR(length=36), primary_key=True, unique=True, index=True) # uuid4
# service_instance_xid = Column(CHAR(length=36), nullable=False, index=True) # uuid4 # not necessary, we only support one service_instance_xid ('INSTANCE_REF')
hostname = Column(VARCHAR(length=256), nullable=True)
guest_driver_version = Column(VARCHAR(length=10), nullable=True)
os_platform = Column(VARCHAR(length=256), nullable=True)
os_version = Column(VARCHAR(length=256), nullable=True)
def __repr__(self):
return f'Origin(origin_ref={self.origin_ref}, hostname={self.hostname})'
def serialize(self) -> dict:
return {
'origin_ref': self.origin_ref,
# 'service_instance_xid': self.service_instance_xid,
'hostname': self.hostname,
'guest_driver_version': self.guest_driver_version,
'os_platform': self.os_platform,
'os_version': self.os_version,
}
@staticmethod
def create_statement(engine: Engine):
from sqlalchemy.schema import CreateTable
return CreateTable(Origin.__table__).compile(engine)
@staticmethod
def create_or_update(engine: Engine, origin: "Origin"):
session = sessionmaker(bind=engine)()
entity = session.query(Origin).filter(Origin.origin_ref == origin.origin_ref).first()
if entity is None:
session.add(origin)
else:
x = dict(
hostname=origin.hostname,
guest_driver_version=origin.guest_driver_version,
os_platform=origin.os_platform,
os_version=origin.os_version
)
session.execute(update(Origin).where(Origin.origin_ref == origin.origin_ref).values(**x))
session.commit()
session.flush()
session.close()
@staticmethod
def delete(engine: Engine, origins: ["Origin"] = None) -> int:
session = sessionmaker(bind=engine)()
if origins is None:
deletions = session.query(Origin).delete()
else:
deletions = session.query(Origin).filter(Origin.origin_ref.in_([origin.origin_ref for origin in origins])).delete()
session.commit()
session.close()
return deletions
class Lease(Base):
__tablename__ = "lease"
lease_ref = Column(CHAR(length=36), primary_key=True, nullable=False, index=True) # uuid4
origin_ref = Column(CHAR(length=36), ForeignKey(Origin.origin_ref, ondelete='CASCADE'), nullable=False, index=True) # uuid4
# scope_ref = Column(CHAR(length=36), nullable=False, index=True) # uuid4 # not necessary, we only support one scope_ref ('ALLOTMENT_REF')
lease_created = Column(DATETIME(), nullable=False)
lease_expires = Column(DATETIME(), nullable=False)
lease_updated = Column(DATETIME(), nullable=False)
def __repr__(self):
return f'Lease(origin_ref={self.origin_ref}, lease_ref={self.lease_ref}, expires={self.lease_expires})'
def serialize(self, renewal_period: float, renewal_delta: timedelta) -> dict:
lease_renewal = int(Lease.calculate_renewal(renewal_period, renewal_delta).total_seconds())
lease_renewal = self.lease_updated + relativedelta(seconds=lease_renewal)
return {
'lease_ref': self.lease_ref,
'origin_ref': self.origin_ref,
# 'scope_ref': self.scope_ref,
'lease_created': self.lease_created.isoformat(),
'lease_expires': self.lease_expires.isoformat(),
'lease_updated': self.lease_updated.isoformat(),
'lease_renewal': lease_renewal.isoformat(),
}
@staticmethod
def create_statement(engine: Engine):
from sqlalchemy.schema import CreateTable
return CreateTable(Lease.__table__).compile(engine)
@staticmethod
def create_or_update(engine: Engine, lease: "Lease"):
session = sessionmaker(bind=engine)()
entity = session.query(Lease).filter(Lease.lease_ref == lease.lease_ref).first()
if entity is None:
if lease.lease_updated is None:
lease.lease_updated = lease.lease_created
session.add(lease)
else:
x = dict(origin_ref=lease.origin_ref, lease_expires=lease.lease_expires, lease_updated=lease.lease_updated)
session.execute(update(Lease).where(Lease.lease_ref == lease.lease_ref).values(**x))
session.commit()
session.flush()
session.close()
@staticmethod
def find_by_origin_ref(engine: Engine, origin_ref: str) -> ["Lease"]:
session = sessionmaker(bind=engine)()
entities = session.query(Lease).filter(Lease.origin_ref == origin_ref).all()
session.close()
return entities
@staticmethod
def find_by_lease_ref(engine: Engine, lease_ref: str) -> "Lease":
session = sessionmaker(bind=engine)()
entity = session.query(Lease).filter(Lease.lease_ref == lease_ref).first()
session.close()
return entity
@staticmethod
def find_by_origin_ref_and_lease_ref(engine: Engine, origin_ref: str, lease_ref: str) -> "Lease":
session = sessionmaker(bind=engine)()
entity = session.query(Lease).filter(and_(Lease.origin_ref == origin_ref, Lease.lease_ref == lease_ref)).first()
session.close()
return entity
@staticmethod
def renew(engine: Engine, lease: "Lease", lease_expires: datetime, lease_updated: datetime):
session = sessionmaker(bind=engine)()
x = dict(lease_expires=lease_expires, lease_updated=lease_updated)
session.execute(update(Lease).where(and_(Lease.origin_ref == lease.origin_ref, Lease.lease_ref == lease.lease_ref)).values(**x))
session.commit()
session.close()
@staticmethod
def cleanup(engine: Engine, origin_ref: str) -> int:
session = sessionmaker(bind=engine)()
deletions = session.query(Lease).filter(Lease.origin_ref == origin_ref).delete()
session.commit()
session.close()
return deletions
@staticmethod
def delete(engine: Engine, lease_ref: str) -> int:
session = sessionmaker(bind=engine)()
deletions = session.query(Lease).filter(Lease.lease_ref == lease_ref).delete()
session.commit()
session.close()
return deletions
@staticmethod
def calculate_renewal(renewal_period: float, delta: timedelta) -> timedelta:
"""
import datetime
LEASE_RENEWAL_PERIOD = 0.2  # 20%
delta = datetime.timedelta(days=1)
renew = delta.total_seconds() * LEASE_RENEWAL_PERIOD
renew = datetime.timedelta(seconds=renew)
expires = delta - renew  # 19.2 hours
"""
renew = delta.total_seconds() * renewal_period
renew = timedelta(seconds=renew)
return renew
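The renewal window scales linearly with the lease duration. A standalone stdlib re-computation of the arithmetic above (a sketch, not importing the app):

```python
from datetime import timedelta

# Standalone re-computation of Lease.calculate_renewal's arithmetic:
# the client must renew within renewal_period * lease duration.
def calculate_renewal(renewal_period: float, delta: timedelta) -> timedelta:
    return timedelta(seconds=delta.total_seconds() * renewal_period)

renew = calculate_renewal(0.2, timedelta(days=1))
print(renew)  # 4:48:00 -> renew within 4.8 hours; 19.2 hours of the lease remain
```

`serialize()` then adds this delta to `lease_updated` to produce the `lease_renewal` timestamp returned to the client.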
def init(engine: Engine):
tables = [Origin, Lease]
db = inspect(engine)
session = sessionmaker(bind=engine)()
for table in tables:
if not db.dialect.has_table(engine.connect(), table.__tablename__):
session.execute(str(table.create_statement(engine)))
session.commit()
session.close()
def migrate(engine: Engine):
db = inspect(engine)
def upgrade_1_0_to_1_1():
x = db.dialect.get_columns(engine.connect(), Lease.__tablename__)
x = next(_ for _ in x if _['name'] == 'origin_ref')
if x['primary_key'] > 0:
print('Found old database schema with "origin_ref" as primary-key in "lease" table. Dropping table!')
print(' Your leases are recreated on next renewal!')
print(' If an error message appears on the client, you can ignore it.')
Lease.__table__.drop(bind=engine)
init(engine)
# def upgrade_1_2_to_1_3():
# x = db.dialect.get_columns(engine.connect(), Lease.__tablename__)
# x = next((_ for _ in x if _['name'] == 'scope_ref'), None)
# if x is None:
# Lease.scope_ref.compile()
# column_name = Lease.scope_ref.name
# column_type = Lease.scope_ref.type.compile(engine.dialect)
# engine.execute(f'ALTER TABLE "{Lease.__tablename__}" ADD COLUMN "{column_name}" {column_type}')
upgrade_1_0_to_1_1()
# upgrade_1_2_to_1_3()

app/util.py Normal file

@@ -0,0 +1,28 @@
def load_file(filename) -> bytes:
with open(filename, 'rb') as file:
content = file.read()
return content
def load_key(filename) -> "RsaKey":
try:
# Crypto | Cryptodome on Debian
from Crypto.PublicKey import RSA
from Crypto.PublicKey.RSA import RsaKey
except ModuleNotFoundError:
from Cryptodome.PublicKey import RSA
from Cryptodome.PublicKey.RSA import RsaKey
return RSA.import_key(extern_key=load_file(filename), passphrase=None)
def generate_key() -> "RsaKey":
try:
# Crypto | Cryptodome on Debian
from Crypto.PublicKey import RSA
from Crypto.PublicKey.RSA import RsaKey
except ModuleNotFoundError:
from Cryptodome.PublicKey import RSA
from Cryptodome.PublicKey.RSA import RsaKey
return RSA.generate(bits=2048)

doc/Database.md Normal file

@@ -0,0 +1,26 @@
# Database structure
## `request_routing.service_instance`
| xid | org_name |
|----------------------------------------|--------------------------|
| `10000000-0000-0000-0000-000000000000` | `lic-000000000000000000` |
- `xid` is used as `SERVICE_INSTANCE_XID`
## `request_routing.license_allotment_service_instance`
| xid | service_instance_xid | license_allotment_xid |
|----------------------------------------|----------------------------------------|----------------------------------------|
| `90000000-0000-0000-0000-000000000001` | `10000000-0000-0000-0000-000000000000` | `80000000-0000-0000-0000-000000000001` |
- `xid` is only a primary-key and never used as foreign-key or reference
- `license_allotment_xid` must be used to fetch `xid`'s from `request_routing.license_allotment_reference`
## `request_routing.license_allotment_reference`
| xid | license_allotment_xid |
|----------------------------------------|----------------------------------------|
| `20000000-0000-0000-0000-000000000001` | `80000000-0000-0000-0000-000000000001` |
- `xid` is used as `scope_ref_list` on token request
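The three tables chain together: `service_instance.xid` → `license_allotment_service_instance.service_instance_xid`, then `license_allotment_xid` → `license_allotment_reference`, whose `xid` becomes the `scope_ref_list` entry. A hypothetical SQLite sketch of that lookup, using the example rows above (table names shortened to the unqualified forms):

```python
import sqlite3

# Hypothetical sketch: model the three request_routing tables in SQLite and
# resolve the scope_ref for a given service instance. Data is the example rows.
con = sqlite3.connect(':memory:')
con.executescript('''
CREATE TABLE service_instance (xid TEXT PRIMARY KEY, org_name TEXT);
CREATE TABLE license_allotment_service_instance (
    xid TEXT PRIMARY KEY, service_instance_xid TEXT, license_allotment_xid TEXT);
CREATE TABLE license_allotment_reference (xid TEXT PRIMARY KEY, license_allotment_xid TEXT);
''')
con.execute("INSERT INTO service_instance VALUES ('10000000-0000-0000-0000-000000000000', 'lic-000000000000000000')")
con.execute("INSERT INTO license_allotment_service_instance VALUES ('90000000-0000-0000-0000-000000000001', '10000000-0000-0000-0000-000000000000', '80000000-0000-0000-0000-000000000001')")
con.execute("INSERT INTO license_allotment_reference VALUES ('20000000-0000-0000-0000-000000000001', '80000000-0000-0000-0000-000000000001')")

# scope_ref_list entries are the reference xids reachable from the service instance
scope_refs = [row[0] for row in con.execute('''
SELECT r.xid
FROM license_allotment_service_instance lasi
JOIN license_allotment_reference r USING (license_allotment_xid)
WHERE lasi.service_instance_xid = ?
''', ('10000000-0000-0000-0000-000000000000',))]
print(scope_refs)  # ['20000000-0000-0000-0000-000000000001']
```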


@@ -33,6 +33,9 @@ nvidia-gridd[2986]: License acquired successfully. (Info: license.nvidia.space,
Most variables and configs are stored in `/var/lib/docker/volumes/configurations/_data`.
Files can be modified with `docker cp <container-id>:/venv/... /opt/localfile/...` and back.
(You may need to fix permissions with `docker exec -u 0 <container-id> chown nonroot:nonroot /venv/...`.)
## Dive / Docker image inspector
- `dive dls:appliance`

docker-compose.yml Normal file

@@ -0,0 +1,118 @@
version: '3.9'
x-dls-variables: &dls-variables
DLS_URL: localhost # REQUIRED, change to your ip or hostname
DLS_PORT: 443 # must match nginx listen & exposed port
LEASE_EXPIRE_DAYS: 90
DATABASE: sqlite:////app/database/db.sqlite
DEBUG: false
services:
dls:
image: collinwebdesigns/fastapi-dls:latest
restart: always
environment:
<<: *dls-variables
volumes:
- /opt/docker/fastapi-dls/cert:/app/cert # instance.private.pem, instance.public.pem
- db:/app/database
entrypoint: ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000", "--app-dir", "/app", "--proxy-headers"]
healthcheck:
test: ["CMD", "curl", "--fail", "http://localhost:8000/-/health"]
interval: 10s
timeout: 5s
retries: 3
start_period: 30s
proxy:
image: nginx
ports:
# these are the ports nginx (!) listens on
- "80:80" # for "/leasing/v1/lessor/shutdown" used by windows guests, can't be changed!
- "443:443" # first part must match "DLS_PORT"
volumes:
- /opt/docker/fastapi-dls/cert:/opt/cert
healthcheck:
test: ["CMD", "curl", "--insecure", "--fail", "https://localhost/-/health"]
interval: 10s
timeout: 5s
retries: 3
start_period: 30s
command: |
bash -c "bash -s <<\"EOF\"
cat > /etc/nginx/nginx.conf <<\"EON\"
daemon off;
user root;
worker_processes auto;
events {
worker_connections 1024;
}
http {
gzip on;
gzip_disable "msie6";
include /etc/nginx/mime.types;
upstream dls-backend {
server dls:8000; # must match dls listen port
}
server {
listen 443 ssl http2 default_server;
listen [::]:443 ssl http2 default_server;
root /var/www/html;
index index.html;
server_name _;
ssl_certificate "/opt/cert/webserver.crt";
ssl_certificate_key "/opt/cert/webserver.key";
ssl_session_cache shared:SSL:1m;
ssl_session_timeout 10m;
ssl_protocols TLSv1.3 TLSv1.2;
# ssl_ciphers "ECDHE-ECDSA-CHACHA20-POLY1305";
# ssl_ciphers PROFILE=SYSTEM;
ssl_prefer_server_ciphers on;
location / {
proxy_set_header Host $$http_host;
proxy_set_header X-Real-IP $$remote_addr;
proxy_set_header X-Forwarded-For $$proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $$scheme;
proxy_pass http://dls-backend$$request_uri;
}
location = /-/health {
access_log off;
add_header 'Content-Type' 'application/json';
return 200 '{\"status\":\"up\",\"service\":\"nginx\"}';
}
}
server {
listen 80;
listen [::]:80;
root /var/www/html;
index index.html;
server_name _;
location /leasing/v1/lessor/shutdown {
proxy_set_header Host $$http_host;
proxy_set_header X-Real-IP $$remote_addr;
proxy_set_header X-Forwarded-For $$proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $$scheme;
proxy_pass http://dls-backend/leasing/v1/lessor/shutdown;
}
location / {
return 301 https://$$host$$request_uri;
}
}
}
EON
nginx
EOF"
volumes:
db:
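The nginx service above expects `webserver.crt` and `webserver.key` under `/opt/docker/fastapi-dls/cert`. A self-signed pair can be generated with `openssl` (a sketch — the `CN` is a placeholder; set it to your `DLS_URL` and copy or mount the `cert/` directory to the path in the compose file):

```shell
# Generate a self-signed certificate pair for the nginx proxy.
# Adjust -subj "/CN=..." to match DLS_URL, then place the files at
# /opt/docker/fastapi-dls/cert as mounted above.
mkdir -p cert
openssl req -x509 -newkey rsa:2048 -nodes -days 3650 \
  -keyout cert/webserver.key -out cert/webserver.crt -subj "/CN=localhost"
```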


@@ -1,8 +1,8 @@
-fastapi==0.88.0
+fastapi==0.89.1
 uvicorn[standard]==0.20.0
 python-jose==3.3.0
 pycryptodome==3.16.0
 python-dateutil==2.8.2
-dataset==1.5.2
+sqlalchemy==1.4.46
 markdown==3.4.1
 python-dotenv==0.21.0

test/main.py Normal file

@@ -0,0 +1,242 @@
from base64 import b64encode as b64enc
from hashlib import sha256
from calendar import timegm
from datetime import datetime
from os.path import dirname, join
from uuid import uuid4, UUID
from dateutil.relativedelta import relativedelta
from jose import jwt, jwk
from jose.constants import ALGORITHMS
from starlette.testclient import TestClient
import sys
# add relative path to use packages as they were in the app/ dir
sys.path.append('../')
sys.path.append('../app')
from app import main
from app.util import load_key
client = TestClient(main.app)
ORIGIN_REF, ALLOTMENT_REF, SECRET = str(uuid4()), '20000000-0000-0000-0000-000000000001', 'HelloWorld'
# INSTANCE_KEY_RSA = generate_key()
# INSTANCE_KEY_PUB = INSTANCE_KEY_RSA.public_key()
INSTANCE_KEY_RSA = load_key(str(join(dirname(__file__), '../app/cert/instance.private.pem')))
INSTANCE_KEY_PUB = load_key(str(join(dirname(__file__), '../app/cert/instance.public.pem')))
jwt_encode_key = jwk.construct(INSTANCE_KEY_RSA.export_key().decode('utf-8'), algorithm=ALGORITHMS.RS256)
jwt_decode_key = jwk.construct(INSTANCE_KEY_PUB.export_key().decode('utf-8'), algorithm=ALGORITHMS.RS256)
def __bearer_token(origin_ref: str) -> str:
token = jwt.encode({"origin_ref": origin_ref}, key=jwt_encode_key, algorithm=ALGORITHMS.RS256)
token = f'Bearer {token}'
return token
def test_index():
response = client.get('/')
assert response.status_code == 200
def test_health():
response = client.get('/-/health')
assert response.status_code == 200
assert response.json().get('status') == 'up'
def test_config():
response = client.get('/-/config')
assert response.status_code == 200
def test_readme():
response = client.get('/-/readme')
assert response.status_code == 200
def test_manage():
response = client.get('/-/manage')
assert response.status_code == 200
def test_client_token():
response = client.get('/-/client-token')
assert response.status_code == 200
def test_origins():
pass
def test_origins_delete():
pass
def test_leases():
pass
def test_lease_delete():
pass
def test_auth_v1_origin():
payload = {
"registration_pending": False,
"environment": {
"guest_driver_version": "guest_driver_version",
"hostname": "myhost",
"ip_address_list": ["192.168.1.123"],
"os_version": "os_version",
"os_platform": "os_platform",
"fingerprint": {"mac_address_list": ["ff:ff:ff:ff:ff:ff"]},
"host_driver_version": "host_driver_version"
},
"update_pending": False,
"candidate_origin_ref": ORIGIN_REF,
}
response = client.post('/auth/v1/origin', json=payload)
assert response.status_code == 200
assert response.json().get('origin_ref') == ORIGIN_REF
def auth_v1_origin_update():
payload = {
"registration_pending": False,
"environment": {
"guest_driver_version": "guest_driver_version",
"hostname": "myhost",
"ip_address_list": ["192.168.1.123"],
"os_version": "os_version",
"os_platform": "os_platform",
"fingerprint": {"mac_address_list": ["ff:ff:ff:ff:ff:ff"]},
"host_driver_version": "host_driver_version"
},
"update_pending": False,
"candidate_origin_ref": ORIGIN_REF,
}
response = client.post('/auth/v1/origin/update', json=payload)
assert response.status_code == 200
assert response.json().get('origin_ref') == ORIGIN_REF
def test_auth_v1_code():
payload = {
"code_challenge": b64enc(sha256(SECRET.encode('utf-8')).digest()).rstrip(b'=').decode('utf-8'),
"origin_ref": ORIGIN_REF,
}
response = client.post('/auth/v1/code', json=payload)
assert response.status_code == 200
payload = jwt.get_unverified_claims(token=response.json().get('auth_code'))
assert payload.get('origin_ref') == ORIGIN_REF
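The `code_challenge` used here is the SHA-256 of the verifier, base64-encoded with the `=` padding stripped (similar in spirit to PKCE from RFC 7636, though with standard base64 rather than base64url). A stdlib-only sketch of the derivation:

```python
from base64 import b64encode
from hashlib import sha256

# How the tests derive code_challenge from SECRET: sha256 digest,
# base64-encoded, '=' padding stripped.
def code_challenge(verifier: str) -> str:
    return b64encode(sha256(verifier.encode('utf-8')).digest()).rstrip(b'=').decode('utf-8')

challenge = code_challenge('HelloWorld')
print(challenge)  # 43 characters: base64 of 32 digest bytes, minus one '=' pad
```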
def test_auth_v1_token():
cur_time = datetime.utcnow()
access_expires_on = cur_time + relativedelta(hours=1)
payload = {
"iat": timegm(cur_time.timetuple()),
"exp": timegm(access_expires_on.timetuple()),
"challenge": b64enc(sha256(SECRET.encode('utf-8')).digest()).rstrip(b'=').decode('utf-8'),
"origin_ref": ORIGIN_REF,
"key_ref": "00000000-0000-0000-0000-000000000000",
"kid": "00000000-0000-0000-0000-000000000000"
}
payload = {
"auth_code": jwt.encode(payload, key=jwt_encode_key, headers={'kid': payload.get('kid')},
algorithm=ALGORITHMS.RS256),
"code_verifier": SECRET,
}
response = client.post('/auth/v1/token', json=payload)
assert response.status_code == 200
token = response.json().get('auth_token')
payload = jwt.decode(token=token, key=jwt_decode_key, algorithms=ALGORITHMS.RS256, options={'verify_aud': False})
assert payload.get('origin_ref') == ORIGIN_REF
def test_leasing_v1_lessor():
payload = {
'fulfillment_context': {
'fulfillment_class_ref_list': []
},
'lease_proposal_list': [{
'license_type_qualifiers': {'count': 1},
'product': {'name': 'NVIDIA RTX Virtual Workstation'}
}],
'proposal_evaluation_mode': 'ALL_OF',
'scope_ref_list': [ALLOTMENT_REF]
}
response = client.post('/leasing/v1/lessor', json=payload, headers={'authorization': __bearer_token(ORIGIN_REF)})
assert response.status_code == 200
lease_result_list = response.json().get('lease_result_list')
assert len(lease_result_list) == 1
assert len(lease_result_list[0]['lease']['ref']) == 36
assert str(UUID(lease_result_list[0]['lease']['ref'])) == lease_result_list[0]['lease']['ref']
return lease_result_list[0]['lease']['ref']
def test_leasing_v1_lessor_lease():
response = client.get('/leasing/v1/lessor/leases', headers={'authorization': __bearer_token(ORIGIN_REF)})
assert response.status_code == 200
active_lease_list = response.json().get('active_lease_list')
assert len(active_lease_list) == 1
assert len(active_lease_list[0]) == 36
assert str(UUID(active_lease_list[0])) == active_lease_list[0]
def test_leasing_v1_lease_renew():
response = client.get('/leasing/v1/lessor/leases', headers={'authorization': __bearer_token(ORIGIN_REF)})
active_lease_list = response.json().get('active_lease_list')
active_lease_ref = active_lease_list[0]
###
response = client.put(f'/leasing/v1/lease/{active_lease_ref}', headers={'authorization': __bearer_token(ORIGIN_REF)})
assert response.status_code == 200
lease_ref = response.json().get('lease_ref')
assert len(lease_ref) == 36
assert lease_ref == active_lease_ref
def test_leasing_v1_lease_delete():
response = client.get('/leasing/v1/lessor/leases', headers={'authorization': __bearer_token(ORIGIN_REF)})
active_lease_list = response.json().get('active_lease_list')
active_lease_ref = active_lease_list[0]
###
response = client.delete(f'/leasing/v1/lease/{active_lease_ref}', headers={'authorization': __bearer_token(ORIGIN_REF)})
assert response.status_code == 200
lease_ref = response.json().get('lease_ref')
assert len(lease_ref) == 36
assert lease_ref == active_lease_ref
def test_leasing_v1_lessor_lease_remove():
lease_ref = test_leasing_v1_lessor()
response = client.delete('/leasing/v1/lessor/leases', headers={'authorization': __bearer_token(ORIGIN_REF)})
assert response.status_code == 200
released_lease_list = response.json().get('released_lease_list')
assert len(released_lease_list) == 1
assert len(released_lease_list[0]) == 36
assert released_lease_list[0] == lease_ref


@@ -1 +1 @@
-VERSION=0.5
+VERSION=1.3.3