Commit f5d1201b1a: Merge branch 'master' of github.com:benallfree/pockethost
@@ -1,3 +1,5 @@
 .svelte-kit
 dist
 mount
+.data
+attic
@@ -1,56 +1,60 @@
 # Table of contents
 
-* [👋 Welcome to PocketHost](README.md)
+- [👋 Welcome to PocketHost](README.md)
 
 ## Overview
 
-* [Getting Help](overview/help.md)
-* [FAQ](overview/faq.md)
-* [Roadmap](overview/roadmap.md)
+- [Getting Help](overview/help.md)
+- [FAQ](overview/faq.md)
+- [Roadmap](overview/roadmap.md)
 
 ## Daily Usage
 
-* [Creating an Instance](usage/create.md)
-* [Accessing an Instance](usage/accessing-instance.md)
-* [Instance Details](usage/instances/index.md)
-* [Renaming an Instance](usage/rename-instance.md)
-* [Maintenance Mode](usage/maintenance.md)
-* [FTP Access](usage/ftp.md)
-* [Backup & Restore](usage/backup-and-restore.md)
-* [Worker](daily-usage/worker.md)
-* [PocketBase Hooks](usage/hooks.md)
-* [Upgrading](usage/upgrading.md)
+- [Creating an Instance](usage/create.md)
+- [Accessing an Instance](usage/accessing-instance.md)
+- [Instance Details](usage/instances/index.md)
+- [Renaming an Instance](usage/rename-instance.md)
+- [Maintenance Mode](usage/maintenance.md)
+- [FTP Access](usage/ftp.md)
+- [Backup & Restore](usage/backup-and-restore.md)
+- [Worker](daily-usage/worker.md)
+- [PocketBase Hooks](usage/hooks.md)
+- [Upgrading](usage/upgrading.md)
+
+## Hosting
+
+- [Overview](hosting/overview.md)
 
 ## Contributing
 
-* [Overview](development/overview.md)
-* [Running Just the Frontend](development/frontend.md)
-* [Running Everything](development/full-stack/index.md)
-* [Creating RPC Calls](development/rpc.md)
-* [Production Deployment](development/production.md)
+- [Overview](development/overview.md)
+- [Running Just the Frontend](development/frontend.md)
+- [Running Everything](development/full-stack/index.md)
+- [Creating RPC Calls](development/rpc.md)
+- [Production Deployment](development/production.md)
 
 ## Release History
 
-* [next](releases/next.md)
-* [0.8.0](releases/0.8.0.md)
-* [0.7.2](releases/0.7.2.md)
-* [0.7.1](releases/0.7.1.md)
-* [0.7.0](releases/0.7.0.md)
-* [0.6.1](releases/0.6.1.md)
-* [0.6.0](releases/0.6.0.md)
-* [0.5.7](releases/0.5.7.md)
-* [0.5.6](releases/0.5.6.md)
-* [0.5.5](releases/0.5.5.md)
-* [0.5.4](releases/0.5.4.md)
-* [0.5.3](releases/0.5.3.md)
-* [0.5.2](releases/0.5.2.md)
-* [0.5.1](releases/0.5.1.md)
-* [0.5.0](releases/0.5.0.md)
-* [0.4.2](releases/0.4.2.md)
-* [0.4.1](releases/0.4.1.md)
-* [0.4.0](releases/0.4.0.md)
-* [0.3.2](releases/0.3.2.md)
-* [0.3.1](releases/0.3.1.md)
-* [0.3.0](releases/0.3.0.md)
-* [0.2.0](releases/0.2.0.md)
-* [0.0.1](release-history/0.0.1.md)
+- [next](releases/next.md)
+- [0.8.0](releases/0.8.0.md)
+- [0.7.2](releases/0.7.2.md)
+- [0.7.1](releases/0.7.1.md)
+- [0.7.0](releases/0.7.0.md)
+- [0.6.1](releases/0.6.1.md)
+- [0.6.0](releases/0.6.0.md)
+- [0.5.7](releases/0.5.7.md)
+- [0.5.6](releases/0.5.6.md)
+- [0.5.5](releases/0.5.5.md)
+- [0.5.4](releases/0.5.4.md)
+- [0.5.3](releases/0.5.3.md)
+- [0.5.2](releases/0.5.2.md)
+- [0.5.1](releases/0.5.1.md)
+- [0.5.0](releases/0.5.0.md)
+- [0.4.2](releases/0.4.2.md)
+- [0.4.1](releases/0.4.1.md)
+- [0.4.0](releases/0.4.0.md)
+- [0.3.2](releases/0.3.2.md)
+- [0.3.1](releases/0.3.1.md)
+- [0.3.0](releases/0.3.0.md)
+- [0.2.0](releases/0.2.0.md)
+- [0.0.1](release-history/0.0.1.md)
gitbook/hosting/overview.md (new file, 41 lines)
# Overview

[UNDER CONSTRUCTION]

This guide covers how to set up a production hosting environment for PocketHost. Hosting PocketHost might be desirable if:

- You want to create a hosting service business powered by PocketHost
- You want a private copy of PocketHost where you control all the underlying infrastructure
- You want to run PocketHost from a region not yet offered by pockethost.io

Running a hosting service is not easy. To provide a great hosting experience for users, you need to know about:

- Docker
- Email, DKIM+SPF, and more
- DNS jargon: MX, TXT, CNAME
- SSL cert provisioning and management
- Storage
- Volume mounts
- Cloud computing or VPS deployment
- CDN and static asset hosting
- Amazon AWS
- Lots more: scaling, firewalls, DDoS defense, user security, log rotation, patches, updates, build tools, CPU architectures, multitenancy, on and on

If you're still interested in creating a PocketHost hosting environment for yourself, read on...

```
apt-get update
apt-get install -y nginx nodejs npm
npm i -g n yarn
n lts
hash -r
git clone git@github.com:benallfree/pockethost.git pockethost-latest
cd pockethost-latest
yarn
cd ..
git clone git@github.com:benallfree/pockethost.git pockethost-lts
cd pockethost-lts
yarn
cd ..
```
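Worth verifying after `n lts` and `hash -r`: later in this same commit the daemon gains a startup guard that throws `Node 18 or higher required.`, so the Node that ends up on the PATH must be v18 or newer. A quick sanity check, not part of the original script:

```
node --version   # expect v18.x or newer; older versions make the daemon throw at startup
```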
package.json (20 lines changed)
@@ -32,18 +32,22 @@
     "semi": false,
     "useTabs": false,
     "singleQuote": true,
+    "trailingComma": "all",
     "plugins": [
-      "./node_modules/prettier-plugin-organize-imports",
-      "./node_modules/prettier-plugin-svelte"
+      "./node_modules/prettier-plugin-organize-imports/index.js",
+      "./node_modules/prettier-plugin-svelte/plugin.js"
     ]
   },
   "devDependencies": {
-    "concurrently": "^7.4.0",
-    "patch-package": "^6.5.0",
-    "prettier": "^2.7.1",
-    "prettier-plugin-organize-imports": "^3.1.1",
-    "prettier-plugin-svelte": "^2.7.0",
-    "typescript": "^4.8.3"
+    "chokidar-cli": "^3.0.0",
+    "concurrently": "^8.2.1",
+    "patch-package": "^8.0.0",
+    "prettier": "^3.0.3",
+    "prettier-plugin-organize-imports": "^3.2.3",
+    "prettier-plugin-svelte": "^3.0.3",
+    "tslib": "^2.6.2",
+    "tsx": "^3.12.8",
+    "typescript": "^5.0"
   },
   "dependencies": {
     "postinstall-postinstall": "^2.1.0",
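Most of the source changes that follow are mechanical fallout from this upgrade: Prettier 3 changed its default `trailingComma` setting from `es5` to `all` (also pinned explicitly in the config above), so every multi-line parameter and argument list gains a trailing comma, and the bumped prettier-plugin-organize-imports likely explains the re-sorted import and export lists. A minimal sketch of the reformatting, using a hypothetical helper rather than code from this repository:

```
// Hypothetical helper, shown only to illustrate the Prettier 2 -> 3 reformat.
declare function createSubscription(
  topic: string,
  handler: (line: string) => void,
): () => void

// Prettier 2 output (default trailingComma: "es5"): no comma after the last argument.
const unsubscribeBefore = createSubscription(
  'instance_logs:abc123',
  (line) => console.log(`instance says: ${line}`)
)

// Prettier 3 output (default trailingComma: "all"): the final argument gains a trailing
// comma, which is the one-character change repeated throughout the hunks below.
const unsubscribeAfter = createSubscription(
  'instance_logs:abc123',
  (line) => console.log(`instance says: ${line}`),
)
```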
@@ -18,7 +18,7 @@ export const addDevCommand = (program: Command) => {
-      `Path to source (default: <project>/src/index.{ts|js})`
+      `Path to source (default: <project>/src/index.{ts|js})`,

@@ -22,7 +22,7 @@ export const addDevCommand = (program: Command) => {
-      `Path to source (default: <project>/src/index.{ts|js})`
+      `Path to source (default: <project>/src/index.{ts|js})`,

@@ -70,7 +70,7 @@ export const addPublishCommand = (program: Command) => {
-      `Path to dist bundle (default: <project>/dist/index.js)`
+      `Path to dist bundle (default: <project>/dist/index.js)`,

@@ -6,7 +6,7 @@ export type FieldStruct<TRec extends Pb_Any_Record_Db> = Partial<{
-  fields: FieldStruct<TRec>
+  fields: FieldStruct<TRec>,

@@ -5,14 +5,14 @@ import {
-import { buildQueryFilter, FieldStruct } from './buildQueryFilter'
+import { FieldStruct, buildQueryFilter } from './buildQueryFilter'
-  TFields extends FieldStruct<TRec> = FieldStruct<TRec>
+  TFields extends FieldStruct<TRec> = FieldStruct<TRec>,
-  fields: TFields
+  fields: TFields,

@@ -6,14 +6,14 @@ export const mergeDeep = <TObject>(dst: any, src: TObject) => {
-        `${k.toString()} is an object in default, but not in target`
+        `${k.toString()} is an object in default, but not in target`,
-        `${k.toString()} is an object in target, but not in default`
+        `${k.toString()} is an object in target, but not in default`,

@@ -2,7 +2,7 @@ import { UnsubFunc } from 'store/backend/types'
-  cb: (user: typeof client.authStore.model) => void
+  cb: (user: typeof client.authStore.model) => void,

@@ -8,14 +8,14 @@ import {
-import { buildQueryFilter, FieldStruct } from './buildQueryFilter'
+import { FieldStruct, buildQueryFilter } from './buildQueryFilter'
-  defaultRec: Pb_UserFields<TRow>
+  defaultRec: Pb_UserFields<TRow>,

@@ -42,7 +42,7 @@ export const upsert = async <TRow extends Pb_Any_Record_Db>(
-    {} as Partial<Pb_UserFields<TRow>>
+    {} as Partial<Pb_UserFields<TRow>>,

@@ -50,7 +50,7 @@ export class CustomAuthStore extends BaseAuthStore {
-    key?: string | undefined
+    key?: string | undefined,

@@ -1,6 +1,6 @@
-  message = `Value does not exist`
+  message = `Value does not exist`,
@@ -20,11 +20,11 @@ export type ConnectionConfig = {
-  config: ConnectionConfig
+  config: ConnectionConfig,
-    saver((config) => ({ ...config, session }))
+    saver((config) => ({ ...config, session })),

@@ -34,7 +34,7 @@ export const ensureAdminClient = async (
-    `You must be logged in to ${host}/_ as a PocketBase admin to continue.`
+    `You must be logged in to ${host}/_ as a PocketBase admin to continue.`,

@@ -55,7 +55,7 @@ export const ensureAdminClient = async (
-      { onCancel: () => die(`Exited.`) }
+      { onCancel: () => die(`Exited.`) },

@@ -9,13 +9,13 @@ import { ConnectionConfig } from './ensureAdminClient'
-  saver: SessionStateSaver
+  saver: SessionStateSaver,
-    new CustomAuthStore(session, saver)
+    new CustomAuthStore(session, saver),

@@ -31,7 +31,7 @@ export const isAdmin = async (client: pocketbaseEs) => {
-  saver: SessionStateSaver
+  saver: SessionStateSaver,

@@ -38,10 +38,10 @@ export const createCleanupManager = (slug?: string) => {
-    Promise.resolve()
+    Promise.resolve(),
-      `Cleanup functions are failing. This should never happen, check all cleanup functions to make sure they are trapping their exceptions.`
+      `Cleanup functions are failing. This should never happen, check all cleanup functions to make sure they are trapping their exceptions.`,

@@ -58,7 +58,7 @@ export const createLogger = (config: Partial<Config>) => {
-        })
+        }),

@@ -89,7 +89,7 @@ export const createLogger = (config: Partial<Config>) => {
-        }
+        },

@@ -1,6 +1,6 @@
-  message = `Value does not exist`
+  message = `Value does not exist`,

@@ -9,7 +9,7 @@ export function assertExists<TType>(
-  message = `Value should be truthy`
+  message = `Value should be truthy`,

@@ -5,10 +5,10 @@ import { SetReturnType } from 'type-fest'
-  T extends (...args: any[]) => Promise<any>
+  T extends (...args: any[]) => Promise<any>,
-  lane?: SetReturnType<T, string>
+  lane?: SetReturnType<T, string>,

@@ -27,10 +27,10 @@ export const serialAsyncExecutionGuard = <
-  T extends (...args: any[]) => Promise<any>
+  T extends (...args: any[]) => Promise<any>,
-  key: SetReturnType<T, string>
+  key: SetReturnType<T, string>,
@@ -3,15 +3,15 @@ import { values } from '@s-libs/micro-dash'
-  cb: EventHandler<TPayload>
+  cb: EventHandler<TPayload>,
-  stopOnHandled?: boolean
+  stopOnHandled?: boolean,
-  isHandled: boolean
+  isHandled: boolean,

@@ -20,7 +20,7 @@ export type EventHandler<TPayload> = (
-  defaultHandler?: EventHandler<TPayload>
+  defaultHandler?: EventHandler<TPayload>,

@@ -1,11 +1,11 @@
+export * from './CleanupManager'
+export * from './Logger'
+export * from './TimerManager'
 export * from './assert'
 export * from './asyncExecutionGuard'
-export * from './CleanupManager'
 export * from './events'
-export * from './Logger'
 export * from './mkSingleton'
 export * from './newId'
 export * from './pocketbase-client-helpers'
 export * from './safeCatch'
 export * from './schema'
-export * from './TimerManager'

@@ -10,9 +10,9 @@ export type SingletonBaseConfig = {
-  TApi extends SingletonApi | Promise<SingletonApi>
+  TApi extends SingletonApi | Promise<SingletonApi>,
-  factory: (config: TConfig) => TApi
+  factory: (config: TConfig) => TApi,

@@ -6,11 +6,11 @@ import { logger } from '../Logger'
 import {
+  RPC_COLLECTION,
   RpcCommands,
   RpcFields,
   RpcRecord_Create,
   RpcStatus,
-  RPC_COLLECTION,
 } from '../schema'

@@ -30,7 +30,7 @@ export const createRpcHelper = (config: RpcHelperConfig) => {
-    schema: JSONSchemaType<TPayload>
+    schema: JSONSchemaType<TPayload>,

@@ -39,7 +39,7 @@ export const createRpcHelper = (config: RpcHelperConfig) => {
-        cb?: (data: RecordSubscription<ConcreteRpcRecord>) => void
+        cb?: (data: RecordSubscription<ConcreteRpcRecord>) => void,

@@ -82,7 +82,7 @@ export const createRpcHelper = (config: RpcHelperConfig) => {
-            { initialFetch: false, pollIntervalMs: 100 }
+            { initialFetch: false, pollIntervalMs: 100 },

@@ -92,7 +92,7 @@ export const createRpcHelper = (config: RpcHelperConfig) => {
-      }
+      },
@@ -1,9 +1,9 @@
 import { logger } from '../Logger'
+import { UnixTimestampMs, createTimerManager } from '../TimerManager'
 import { safeCatch } from '../safeCatch'
 import { BaseFields, RecordId } from '../schema'
-import { createTimerManager, UnixTimestampMs } from '../TimerManager'

@@ -23,7 +23,7 @@ export const createWatchHelper = (config: WatchHelperConfig) => {
-    options?: Partial<WatchConfig>
+    options?: Partial<WatchConfig>,

@@ -89,7 +89,7 @@ export const createWatchHelper = (config: WatchHelperConfig) => {
-    initialFetch = true
+    initialFetch = true,

@@ -112,7 +112,7 @@ export const createWatchHelper = (config: WatchHelperConfig) => {
-    }
+    },

@@ -9,7 +9,7 @@ export const safeCatch = <TIn extends any[], TOut>(
-  timeoutMs = SAFECATCH_TTL_MS
+  timeoutMs = SAFECATCH_TTL_MS,

@@ -29,17 +29,17 @@ export const safeCatch = <TIn extends any[], TOut>(
-          `PocketBase API error: It looks like you don't have permission to make this request. Raw error: ${e}. Payload: ${payload}`
+          `PocketBase API error: It looks like you don't have permission to make this request. Raw error: ${e}. Payload: ${payload}`,
-          `Client request aborted (possible duplicate request or real error). Raw error: ${e}. Payload: ${payload}`
+          `Client request aborted (possible duplicate request or real error). Raw error: ${e}. Payload: ${payload}`,
-          `Unknown PocketBase API error. Raw error: ${e}. Payload: ${payload}`
+          `Unknown PocketBase API error. Raw error: ${e}. Payload: ${payload}`,

@@ -34,7 +34,7 @@ export type RpcPayloadBase = JsonObject
-  TRes extends JsonObject
+  TRes extends JsonObject,

@@ -3,7 +3,7 @@ export * from './Instance'
 export * from './Rpc'
-export * from './types'
 export * from './User'
+export * from './types'
 export * from './util'
 // gen:export
@@ -36,10 +36,9 @@
     "pocketbase": "^0.8.0",
     "semver": "^7.3.8",
     "sqlite": "^4.1.2",
-    "sqlite3": "^5.1.2",
+    "sqlite3": "^5.1.6",
     "tmp": "^0.2.1",
     "type-fest": "^3.3.0",
-    "url-pattern": "^1.0.3",
-    "tsx": "^3.11.0"
+    "url-pattern": "^1.0.3"
   }
 }
@@ -25,7 +25,7 @@ export const DAEMON_PB_MIGRATIONS_DIR = (() => {
-      `DAEMON_PB_MIGRATIONS_DIR (${v}) environment variable must be specified`
+      `DAEMON_PB_MIGRATIONS_DIR (${v}) environment variable must be specified`,

@@ -38,7 +38,7 @@ export const DAEMON_PB_DATA_DIR = (() => {
-      `DAEMON_PB_DATA_DIR (${v}) environment variable must be specified`
+      `DAEMON_PB_DATA_DIR (${v}) environment variable must be specified`,

@@ -56,12 +56,12 @@ export const DAEMON_MAX_PORTS = envi(`DAEMON_MAX_PORTS`, 500)
 export const DAEMON_PB_BACKUP_PAGE_COUNT = envi(
   `DAEMON_PB_BACKUP_PAGE_COUNT`,
-  5
+  5,
 )
 export const PH_BIN_CACHE = env(
   `PH_BIN_CACHE`,
-  join(__dirname, `../../../.pbincache`)
+  join(__dirname, `../../../.pbincache`),
 )
 export const PH_FTP_PORT = envi('PH_FTP_PORT', 21)
@@ -25,6 +25,12 @@ import { portManager } from './services/PortManager'
 import { updaterService } from './services/UpdaterService/UpdaterService'
 // gen:import
 
+const [major, minor, patch] = process.versions.node.split('.').map(Number)
+
+if ((major || 0) < 18) {
+  throw new Error(`Node 18 or higher required.`)
+}
+
 loggerService({ debug: DEBUG, trace: TRACE, errorTrace: !DEBUG })
 
 // npm install eventsource --save

@@ -74,7 +80,7 @@ global.EventSource = require('eventsource')
-      { logger }
+      { logger },

@@ -91,7 +97,7 @@ global.EventSource = require('eventsource')
-      { logger }
+      { logger },
@@ -20,15 +20,15 @@ export const centralDbService = mkSingleton(
-        `Forwarding proxy request for ${req.url} to central instance ${target}`
+        `Forwarding proxy request for ${req.url} to central instance ${target}`,
-      `CentralDbService`
+      `CentralDbService`,
-  }
+  },

@@ -7,7 +7,7 @@ import {
-import { mkSingleton, SingletonBaseConfig } from '@pockethost/common'
+import { SingletonBaseConfig, mkSingleton } from '@pockethost/common'

@@ -77,7 +77,7 @@ export const ftpService = mkSingleton((config: FtpConfig) => {
-    }
+    },
@@ -3,24 +3,24 @@ import { assert } from '$util'
 import {
+  Mode,
   constants,
   createReadStream,
   createWriteStream,
   existsSync,
   mkdirSync,
-  Mode,
 } from 'fs'
 import { FileStat, FileSystem, FtpConnection } from 'ftp-srv'
 import { customAlphabet } from 'nanoid'
 import { isAbsolute, join, normalize, resolve, sep } from 'path'
 import { PocketbaseClientApi } from '../clientService/PbClient'
-import * as fsAsync from './fs-async'
 import {
   FolderNames,
   INSTANCE_ROOT_FOLDER_NAMES,
-  isInstanceRootFolder,
   MAINTENANCE_ONLY_FOLDER_NAMES,
+  isInstanceRootFolder,
 } from './FtpService'
+import * as fsAsync from './fs-async'

@@ -51,7 +51,7 @@ export class PhFs implements FileSystem {
-    logger: Logger
+    logger: Logger,

@@ -109,12 +109,12 @@ export class PhFs implements FileSystem {
-            rootFolderName as FolderNames
+            rootFolderName as FolderNames,
-            `Instance must be in maintenance mode to access ${rootFolderName}`
+            `Instance must be in maintenance mode to access ${rootFolderName}`,

@@ -122,7 +122,7 @@ export class PhFs implements FileSystem {
-          .replace(WIN_SEP_REGEX, sep)
+          .replace(WIN_SEP_REGEX, sep),

@@ -137,7 +137,7 @@ export class PhFs implements FileSystem {
-        .replace(WIN_SEP_REGEX, sep)
+        .replace(WIN_SEP_REGEX, sep),

@@ -210,7 +210,7 @@ export class PhFs implements FileSystem {
-        `Something as gone wrong. An instance without a subdomain is not possible.`
+        `Something as gone wrong. An instance without a subdomain is not possible.`,

@@ -247,7 +247,7 @@ export class PhFs implements FileSystem {
-          })
+          }),

@@ -322,7 +322,7 @@ export class PhFs implements FileSystem {
-    options?: { append?: boolean | undefined; start?: any } | undefined
+    options?: { append?: boolean | undefined; start?: any } | undefined,

@@ -366,7 +366,7 @@ export class PhFs implements FileSystem {
-    options: { start?: any } | undefined
+    options: { start?: any } | undefined,

@@ -374,9 +374,8 @@ export class PhFs implements FileSystem {
-    const { fsPath, clientPath, pathFromRootFolder } = await this._resolvePath(
-      fileName
-    )
+    const { fsPath, clientPath, pathFromRootFolder } =
+      await this._resolvePath(fileName)

@@ -433,9 +432,8 @@ export class PhFs implements FileSystem {
-    const { fsPath, clientPath, pathFromRootFolder } = await this._resolvePath(
-      path
-    )
+    const { fsPath, clientPath, pathFromRootFolder } =
+      await this._resolvePath(path)

@@ -485,7 +483,7 @@ export class PhFs implements FileSystem {
-        ])
+        ]),

@@ -497,9 +495,8 @@ export class PhFs implements FileSystem {
-    const { fsPath, clientPath, pathFromRootFolder } = await this._resolvePath(
-      path
-    )
+    const { fsPath, clientPath, pathFromRootFolder } =
+      await this._resolvePath(path)

@@ -517,7 +514,7 @@ export class PhFs implements FileSystem {
-    instance: InstanceFields
+    instance: InstanceFields,

@@ -10,4 +10,4 @@ const mkdir = promisify(fs.mkdir)
 const rename = promisify(fs.rename)
 const chmod = promisify(fs.chmod)
 
-export { stat, readdir, access, unlink, rmdir, mkdir, rename, chmod }
+export { access, chmod, mkdir, readdir, rename, rmdir, stat, unlink }
@@ -2,11 +2,11 @@ import { SqliteChangeEvent, sqliteService } from '$services'
 import {
   InstanceLogFields,
   InstanceLogFields_Create,
+  RecordId,
+  StreamNames,
   newId,
   pocketNow,
-  RecordId,
   safeCatch,
-  StreamNames,
 } from '@pockethost/common'

@@ -15,7 +15,7 @@ import { DaemonContext } from './DaemonContext'
-  context: DaemonContext
+  context: DaemonContext,

@@ -46,7 +46,7 @@ export const createSqliteLogger = async (
-    }
+    },

@@ -66,7 +66,7 @@ export const createSqliteLogger = async (
-      `select * from logs order by created desc limit ${limit}`
+      `select * from logs order by created desc limit ${limit}`,

@@ -17,7 +17,7 @@ const instances: {
-  context: DaemonContext
+  context: DaemonContext,

@@ -31,7 +31,7 @@ export const createInstanceLogger = async (
-    'instance_logs.db'
+    'instance_logs.db',

@@ -69,5 +69,5 @@ export const instanceLoggerService = mkSingleton(
-  }
+  },

@@ -43,7 +43,7 @@ export const createDenoProcess = async (config: DenoProcessConfig) => {
-    stream: StreamNames = StreamNames.Info
+    stream: StreamNames = StreamNames.Info,

@@ -76,12 +76,12 @@ export const createDenoProcess = async (config: DenoProcessConfig) => {
-          StreamNames.Error
+          StreamNames.Error,
-          StreamNames.System
+          StreamNames.System,

@@ -111,13 +111,13 @@ export const instanceService = mkSingleton(
-        `${subdomain}:${id}:${version}`
+        `${subdomain}:${id}:${version}`,
-          `Attempted to create an instance API when one is already available for ${id}`
+          `Attempted to create an instance API when one is already available for ${id}`,

@@ -154,7 +154,7 @@ export const instanceService = mkSingleton(
-            `Attempt to access instance URL when instance is not in a healthy state.`
+            `Attempt to access instance URL when instance is not in a healthy state.`,

@@ -162,7 +162,7 @@ export const instanceService = mkSingleton(
-            `Attempt to start an instance request when instance is not in a healthy state.`
+            `Attempt to start an instance request when instance is not in a healthy state.`,

@@ -193,7 +193,7 @@ export const instanceService = mkSingleton(
-          `HealthyGuard detected instance is shutting down. Aborting further initialization.`
+          `HealthyGuard detected instance is shutting down. Aborting further initialization.`,

@@ -202,7 +202,7 @@ export const instanceService = mkSingleton(
-        client.updateInstanceStatus
+        client.updateInstanceStatus,

@@ -235,15 +235,15 @@ export const instanceService = mkSingleton(
-        }
+        },
-        () => `${instance.id}:userLog`
+        () => `${instance.id}:userLog`,
-        writeUserLog(`Shutting down instance`).catch(error)
+        writeUserLog(`Shutting down instance`).catch(error),

@@ -272,7 +272,7 @@ export const instanceService = mkSingleton(
-              `PocketBase processes exited unexpectedly with ${code}. Putting in maintenance mode.`
+              `PocketBase processes exited unexpectedly with ${code}. Putting in maintenance mode.`,

@@ -282,24 +282,24 @@ export const instanceService = mkSingleton(
-              StreamNames.Error
+              StreamNames.Error,
-                writeUserLog(data, StreamNames.Error).catch(error)
-              )
+                writeUserLog(data, StreamNames.Error).catch(error),
+              ),
-                writeUserLog(data, StreamNames.Error).catch(error)
-              )
+                writeUserLog(data, StreamNames.Error).catch(error),
+              ),
-                  `PocketBase processes exited unexpectedly with ${code}. Putting in maintenance mode.`
-                )
+                  `PocketBase processes exited unexpectedly with ${code}. Putting in maintenance mode.`,
+                ),

@@ -308,7 +308,7 @@ export const instanceService = mkSingleton(
-            `Could not launch PocketBase ${instance.version}. It may be time to upgrade.`
+            `Could not launch PocketBase ${instance.version}. It may be time to upgrade.`,

@@ -340,7 +340,7 @@ export const instanceService = mkSingleton(
-          `index.ts`
+          `index.ts`,

@@ -400,7 +400,7 @@ export const instanceService = mkSingleton(
-          RECHECK_TTL
+          RECHECK_TTL,

@@ -413,7 +413,7 @@ export const instanceService = mkSingleton(
-          1000
+          1000,

@@ -437,7 +437,7 @@ export const instanceService = mkSingleton(
-                `Unexpected response ${JSON.stringify(e)} from mothership`
+                `Unexpected response ${JSON.stringify(e)} from mothership`,

@@ -450,7 +450,7 @@ export const instanceService = mkSingleton(
-            idOrSubdomain
+            idOrSubdomain,

@@ -472,14 +472,14 @@ export const instanceService = mkSingleton(
-          instanceIdOrSubdomain
+          instanceIdOrSubdomain,
-            `Subdomain ${instanceIdOrSubdomain} does not resolve to an instance`
+            `Subdomain ${instanceIdOrSubdomain} does not resolve to an instance`,

@@ -489,7 +489,7 @@ export const instanceService = mkSingleton(
-            `This instance is in Maintenance Mode. See https://pockethost.gitbook.io/manual/daily-usage/maintenance for more information.`
+            `This instance is in Maintenance Mode. See https://pockethost.gitbook.io/manual/daily-usage/maintenance for more information.`,

@@ -499,7 +499,7 @@ export const instanceService = mkSingleton(
-            `Log in at ${PUBLIC_APP_PROTOCOL}://${PUBLIC_APP_DOMAIN} to verify your account.`
+            `Log in at ${PUBLIC_APP_PROTOCOL}://${PUBLIC_APP_DOMAIN} to verify your account.`,

@@ -513,12 +13,12 @@ export const instanceService = mkSingleton(
-            } to instance ${api.internalUrl()}`
+            } to instance ${api.internalUrl()}`,
-        `InstanceService`
+        `InstanceService`,

@@ -532,5 +532,5 @@
 const getInstanceApiIfExistsById = (id: InstanceId) => instanceApis[id]
|
const getInstanceApiIfExistsById = (id: InstanceId) => instanceApis[id]
|
||||||
|
|
||||||
return { shutdown, getInstanceApiIfExistsById }
|
return { shutdown, getInstanceApiIfExistsById }
|
||||||
}
|
},
|
||||||
)
|
)
|
||||||
|
@@ -2,8 +2,8 @@ import { DAEMON_PB_DATA_DIR, DAEMON_PB_MIGRATIONS_DIR } from '$constants'
 import { mkInternalAddress, mkInternalUrl, tryFetch } from '$util'
 import { createCleanupManager, createTimerManager } from '@pockethost/common'
 import {
-mkSingleton,
 SingletonBaseConfig,
+mkSingleton,
 } from '@pockethost/common/src/mkSingleton'
 import { spawn } from 'child_process'
 import { existsSync } from 'fs'
@@ -23,7 +23,7 @@ export type SpawnConfig = {
-stderr: string[]
+stderr: string[],
@@ -49,7 +49,7 @@ function pidIsRunning(pid: number) {
-config: PocketbaseServiceConfig
+config: PocketbaseServiceConfig,
@@ -77,7 +77,7 @@ export const createPocketbaseService = async (
-`PocketBase binary (${bin}) not found. Contact pockethost.io.`
+`PocketBase binary (${bin}) not found. Contact pockethost.io.`,
@@ -93,7 +93,7 @@ export const createPocketbaseService = async (
-: `${DAEMON_PB_DATA_DIR}/${slug}/pb_migrations`
+: `${DAEMON_PB_DATA_DIR}/${slug}/pb_migrations`,
@@ -157,7 +157,7 @@ export const createPocketbaseService = async (
-`Attempt to kill a PocketBase process that was never running.`
+`Attempt to kill a PocketBase process that was never running.`,
@@ -38,8 +38,8 @@ export const portManager = mkSingleton(async (cfg: PortManagerConfig) => {
-','
+',',
-)}`
+)}`,
@@ -22,7 +22,7 @@ export type ProxyMiddleware = (
-logger: Logger
+logger: Logger,
@@ -44,7 +44,7 @@ export const proxyService = mkSingleton(async (config: ProxyServiceConfig) => {
-`Request for ${req.headers.host} rejected because host does not end in ${PUBLIC_APP_DOMAIN}`
+`Request for ${req.headers.host} rejected because host does not end in ${PUBLIC_APP_DOMAIN}`,
@@ -54,7 +54,7 @@ export const proxyService = mkSingleton(async (config: ProxyServiceConfig) => {
-`${req.method} ${req.headers.host}/${req.url}`
+`${req.method} ${req.headers.host}/${req.url}`,
@@ -94,7 +94,7 @@ export const proxyService = mkSingleton(async (config: ProxyServiceConfig) => {
-handlerName: string
+handlerName: string,
@@ -149,7 +149,7 @@ export const proxyService = mkSingleton(async (config: ProxyServiceConfig) => {
-_requestLogger
+_requestLogger,
@@ -59,7 +59,7 @@ export const realtimeLog = mkSingleton(async (config: RealtimeLogConfig) => {
-'authorization,content-type,cache-control'
+'authorization,content-type,cache-control',
@@ -106,7 +106,7 @@ export const realtimeLog = mkSingleton(async (config: RealtimeLogConfig) => {
-`instanceId ${instanceId} not found for user ${user.id}`
+`instanceId ${instanceId} not found for user ${user.id}`,
@@ -142,14 +142,14 @@ export const realtimeLog = mkSingleton(async (config: RealtimeLogConfig) => {
-evt
+evt,
-`SSE request for ${instance.subdomain} (${instance.id}) closed. Unsubscribing.`
+`SSE request for ${instance.subdomain} (${instance.id}) closed. Unsubscribing.`,
@@ -172,7 +172,7 @@ export const realtimeLog = mkSingleton(async (config: RealtimeLogConfig) => {
-evt
+evt,
@@ -186,7 +186,7 @@ export const realtimeLog = mkSingleton(async (config: RealtimeLogConfig) => {
-`RealtimeLogService`
+`RealtimeLogService`,
@@ -1,12 +1,12 @@
 import { clientService } from '$services'
 import {
-assertTruthy,
-mkSingleton,
+RPC_COMMANDS,
 RpcCommands,
 RpcFields,
 RpcStatus,
-RPC_COMMANDS,
 SingletonBaseConfig,
+assertTruthy,
+mkSingleton,
 } from '@pockethost/common'
 import { isObject } from '@s-libs/micro-dash'
 import Ajv, { JSONSchemaType, ValidateFunction } from 'ajv'
@@ -22,12 +22,12 @@ export type KnexApi = ReturnType<typeof knexFactory>
-knex: KnexApi
+knex: KnexApi,
-TResult extends JsonObject
+TResult extends JsonObject,
@@ -58,8 +58,8 @@ export const rpcService = mkSingleton(async (config: RpcServiceConfig) => {
-'|'
+'|',
-)}.`
+)}.`,
@@ -76,7 +76,7 @@ export const rpcService = mkSingleton(async (config: RpcServiceConfig) => {
-`Payload for ${cmd} fails validation: ${JSON.stringify(payload)}`
+`Payload for ${cmd} fails validation: ${JSON.stringify(payload)}`,
@@ -115,11 +115,11 @@ export const rpcService = mkSingleton(async (config: RpcServiceConfig) => {
-TResult extends JsonObject
+TResult extends JsonObject,
-runner: RpcRunner<TPayload, TResult>
+runner: RpcRunner<TPayload, TResult>,
@@ -20,9 +20,9 @@ import {
 type SetInstanceMaintenanceResult,
 } from '@pockethost/common'
 import { valid, validRange } from 'semver'
-import { clientService } from '../clientService/clientService'
 import { instanceService } from '../InstanceService/InstanceService'
 import { updaterService } from '../UpdaterService/UpdaterService'
+import { clientService } from '../clientService/clientService'
 import { rpcService } from './RpcService'

 export const registerRpcCommands = async (logger: Logger) => {
@@ -48,7 +48,7 @@ export const registerRpcCommands = async (logger: Logger) => {
-}
+},
@@ -65,7 +65,7 @@ export const registerRpcCommands = async (logger: Logger) => {
-}
+},
@@ -76,7 +76,7 @@ export const registerRpcCommands = async (logger: Logger) => {
-}
+},
@@ -90,7 +90,7 @@ export const registerRpcCommands = async (logger: Logger) => {
-}
+},
@@ -112,7 +112,7 @@ export const registerRpcCommands = async (logger: Logger) => {
-}
+},
@@ -5,13 +5,13 @@ import {
-import { Database as SqliteDatabase, open } from 'sqlite'
+import { open, Database as SqliteDatabase } from 'sqlite'
-e: SqliteChangeEvent<TRecord>
+e: SqliteChangeEvent<TRecord>,
@@ -25,7 +25,7 @@ export type SqliteServiceApi = {
-cb: SqliteChangeHandler<TRecord>
+cb: SqliteChangeHandler<TRecord>,
@@ -43,7 +43,7 @@ export const sqliteService = mkSingleton((config: SqliteServiceConfig) => {
-filename: string
+filename: string,
@@ -62,7 +62,7 @@ export const sqliteService = mkSingleton((config: SqliteServiceConfig) => {
-rowId: number
+rowId: number,
@@ -73,7 +73,7 @@ export const sqliteService = mkSingleton((config: SqliteServiceConfig) => {
-`select * from ${table} where rowid = '${rowId}'`
+`select * from ${table} where rowid = '${rowId}'`,
@@ -81,7 +81,7 @@ export const sqliteService = mkSingleton((config: SqliteServiceConfig) => {
-}
+},
@@ -110,7 +110,7 @@ export const sqliteService = mkSingleton((config: SqliteServiceConfig) => {
-(fileName) => fileName
+(fileName) => fileName,
@@ -1,9 +1,9 @@
 import { downloadAndExtract, smartFetch } from '$util'
 import {
+SingletonBaseConfig,
 createCleanupManager,
 createTimerManager,
 mkSingleton,
-SingletonBaseConfig,
 } from '@pockethost/common'
 import { keys } from '@s-libs/micro-dash'
 import { chmodSync, existsSync } from 'fs'
@@ -49,7 +49,7 @@ export const updaterService = mkSingleton(
-join(cachePath, `releases.json`)
+join(cachePath, `releases.json`),
@@ -77,7 +77,7 @@ export const updaterService = mkSingleton(
-`No version found, probably mismatched architecture and OS (${osName}/${cpuArchitecture})`
+`No version found, probably mismatched architecture and OS (${osName}/${cpuArchitecture})`,
@@ -94,7 +94,7 @@ export const updaterService = mkSingleton(
-`No version satisfies ${semVer} (${keys(binPaths).join(', ')})`
+`No version satisfies ${semVer} (${keys(binPaths).join(', ')})`,
@@ -109,5 +109,5 @@ export const updaterService = mkSingleton(
-}
+},
@@ -1,12 +1,12 @@
 import {
-assertExists,
+INSTANCE_COLLECTION,
 InstanceFields,
 InstanceFields_Create,
 InstanceId,
 InstanceStatus,
-INSTANCE_COLLECTION,
-safeCatch,
 UserFields,
+assertExists,
+safeCatch,
 } from '@pockethost/common'
 import { reduce } from '@s-libs/micro-dash'
 import Bottleneck from 'bottleneck'
@@ -25,7 +25,7 @@ export const createInstanceMixin = (context: MixinContext) => {
-})
+}),
@@ -35,7 +35,7 @@ export const createInstanceMixin = (context: MixinContext) => {
-}
+},
@@ -57,12 +57,12 @@ export const createInstanceMixin = (context: MixinContext) => {
-})
+}),
-context?: AsyncContext
+context?: AsyncContext,
@@ -86,7 +86,7 @@ export const createInstanceMixin = (context: MixinContext) => {
-}
+},
@@ -94,7 +94,7 @@ export const createInstanceMixin = (context: MixinContext) => {
-}
+},
@@ -104,7 +104,7 @@ export const createInstanceMixin = (context: MixinContext) => {
-}
+},
@@ -129,14 +129,14 @@ export const createInstanceMixin = (context: MixinContext) => {
-})
+}),
-[] as Promise<void>[]
+[] as Promise<void>[],
-}
+},
@@ -156,7 +156,7 @@ export const createInstanceMixin = (context: MixinContext) => {
-}
+},
@@ -9,7 +9,7 @@ import { MixinContext } from './PbClient'
-instanceApi: InstanceApi
+instanceApi: InstanceApi,
@@ -32,7 +32,7 @@ export const createInvocationMixin = (
-}
+},
@@ -49,7 +49,7 @@ export const createInvocationMixin = (
-}
+},
@@ -69,7 +69,7 @@ export const createInvocationMixin = (
-}
+},
@@ -18,7 +18,7 @@ export const createPbClient = (url: string, logger: Logger) => {
-_clientLogger
+_clientLogger,
@@ -27,7 +27,7 @@ export const createPbClient = (url: string, logger: Logger) => {
-client.admins.authWithPassword(email, password)
+client.admins.authWithPassword(email, password),
@@ -40,7 +40,7 @@ export const createPbClient = (url: string, logger: Logger) => {
-})
+}),
|
|||||||
import {
|
import {
|
||||||
|
RPC_COLLECTION,
|
||||||
RpcFields,
|
RpcFields,
|
||||||
RpcStatus,
|
RpcStatus,
|
||||||
RPC_COLLECTION,
|
|
||||||
safeCatch,
|
safeCatch,
|
||||||
} from '@pockethost/common'
|
} from '@pockethost/common'
|
||||||
import { ClientResponseError } from 'pocketbase'
|
import { ClientResponseError } from 'pocketbase'
|
||||||
@ -31,7 +31,7 @@ export const createRpcHelper = (config: RpcHelperConfig) => {
|
|||||||
cb(e.record)
|
cb(e.record)
|
||||||
})
|
})
|
||||||
return unsub
|
return unsub
|
||||||
}
|
},
|
||||||
)
|
)
|
||||||
|
|
||||||
const resetRpcs = safeCatch(`resetRpcs`, logger, async () =>
|
const resetRpcs = safeCatch(`resetRpcs`, logger, async () =>
|
||||||
@ -43,7 +43,7 @@ export const createRpcHelper = (config: RpcHelperConfig) => {
|
|||||||
.update<RpcFields<any, any>>({
|
.update<RpcFields<any, any>>({
|
||||||
status: RpcStatus.FinishedError,
|
status: RpcStatus.FinishedError,
|
||||||
result: `Canceled by reset`,
|
result: `Canceled by reset`,
|
||||||
})
|
}),
|
||||||
)
|
)
|
||||||
|
|
||||||
const incompleteRpcs = safeCatch(`incompleteRpcs`, logger, async () => {
|
const incompleteRpcs = safeCatch(`incompleteRpcs`, logger, async () => {
|
||||||
@ -65,7 +65,7 @@ export const createRpcHelper = (config: RpcHelperConfig) => {
|
|||||||
return client
|
return client
|
||||||
.collection(RPC_COLLECTION)
|
.collection(RPC_COLLECTION)
|
||||||
.update<RpcFields<any, any>>(rpc.id, fields)
|
.update<RpcFields<any, any>>(rpc.id, fields)
|
||||||
}
|
},
|
||||||
)
|
)
|
||||||
|
|
||||||
const setRpcStatus = safeCatch(
|
const setRpcStatus = safeCatch(
|
||||||
@ -74,12 +74,12 @@ export const createRpcHelper = (config: RpcHelperConfig) => {
|
|||||||
async (
|
async (
|
||||||
rpc: RpcFields<any, any>,
|
rpc: RpcFields<any, any>,
|
||||||
status: RpcStatus,
|
status: RpcStatus,
|
||||||
result: JsonObject = {}
|
result: JsonObject = {},
|
||||||
) => {
|
) => {
|
||||||
return client
|
return client
|
||||||
.collection(RPC_COLLECTION)
|
.collection(RPC_COLLECTION)
|
||||||
.update(rpc.id, { status, result })
|
.update(rpc.id, { status, result })
|
||||||
}
|
},
|
||||||
)
|
)
|
||||||
|
|
||||||
return {
|
return {
|
||||||
|
@@ -31,7 +31,7 @@ export const clientService = mkSingleton(async (cfg: ClientServiceConfig) => {
-`CANNOT AUTHENTICATE TO ${PUBLIC_APP_PROTOCOL}://${PUBLIC_APP_DB}.${PUBLIC_APP_DOMAIN}/_/`
+`CANNOT AUTHENTICATE TO ${PUBLIC_APP_PROTOCOL}://${PUBLIC_APP_DB}.${PUBLIC_APP_DOMAIN}/_/`,
@@ -1,5 +1,3 @@
-export * from './clientService/clientService'
-export * from './clientService/PbClient'
 export * from './FtpService/FtpService'
 export * from './InstanceService/InstanceService'
 export * from './PocketBaseService'
@@ -7,3 +5,5 @@ export * from './ProxyService'
 export * from './RealtimeLog'
 export * from './RpcService/RpcService'
 export * from './SqliteService/SqliteService'
+export * from './clientService/PbClient'
+export * from './clientService/clientService'
@@ -1,12 +1,12 @@
 import { clientService } from '$services'
 import {
-InstanceFields,
 INSTANCE_COLLECTION,
-InvocationFields,
 INVOCATION_COLLECTION,
-logger,
-RpcFields,
+InstanceFields,
+InvocationFields,
 RPC_COLLECTION,
+RpcFields,
+logger,
 singletonAsyncExecutionGuard,
 } from '@pockethost/common'
 import Bottleneck from 'bottleneck'
@@ -17,7 +17,7 @@ export const deleteInvocation = singletonAsyncExecutionGuard(
-(invocation) => `deleteInvocation:${invocation.id}`
+(invocation) => `deleteInvocation:${invocation.id}`,
@@ -50,7 +50,7 @@ export const deleteInvocationsForInstance = singletonAsyncExecutionGuard(
-(instance) => `deleteInvocationsForInstance:${instance.id}`
+(instance) => `deleteInvocationsForInstance:${instance.id}`,
@@ -58,7 +58,7 @@ export const deleteRpc = singletonAsyncExecutionGuard(
-(rpc) => `deleteRpc:${rpc.id}`
+(rpc) => `deleteRpc:${rpc.id}`,
@@ -70,7 +70,7 @@ export const getAllRpcs = singletonAsyncExecutionGuard(
-() => `getAllRpcs`
+() => `getAllRpcs`,
@@ -80,7 +80,7 @@ export const deleteRpcsForInstance = singletonAsyncExecutionGuard(
-(instance) => `deleteRpcsForInstance:${instance.id}`
+(instance) => `deleteRpcsForInstance:${instance.id}`,
@@ -95,7 +95,7 @@ export const deleteInstance = singletonAsyncExecutionGuard(
-JSON.stringify(e, null, 2)
+JSON.stringify(e, null, 2),
@@ -110,7 +110,7 @@ export const deleteInstance = singletonAsyncExecutionGuard(
-(instance) => `deleteInstance:${instance.id}`
+(instance) => `deleteInstance:${instance.id}`,
@@ -122,9 +122,9 @@ export const deleteInstancesByFilter = singletonAsyncExecutionGuard(
-limiter.schedule(() => deleteInstance(instance))
+limiter.schedule(() => deleteInstance(instance)),
-)
+),
-(filter) => `deleteInstancesByFilter:${filter}`
+(filter) => `deleteInstancesByFilter:${filter}`,
@@ -16,7 +16,7 @@ export const createCleanup = (context: { program: Command } & ContextBase) => {
-`stress-%`
+`stress-%`,
@@ -38,7 +38,7 @@ export const createSeed = (context: { program: Command } & ContextBase) => {
-10
+10,
@@ -24,25 +24,25 @@ export const createStress = (context: { program: Command } & ContextBase) => {
-100
+100,
-50
+50,
-50
+50,
-500
+500,
@@ -63,7 +63,7 @@ export const createStress = (context: { program: Command } & ContextBase) => {
-(id) => `reset:${id}`
+(id) => `reset:${id}`,
@@ -80,7 +80,7 @@ export const createStress = (context: { program: Command } & ContextBase) => {
-} excluded`
+} excluded`,
@@ -108,7 +108,7 @@ export const createStress = (context: { program: Command } & ContextBase) => {
-})
+}),
@@ -22,7 +22,7 @@ program
-'http://127.0.0.1:8090'
+'http://127.0.0.1:8090',
@@ -7,7 +7,7 @@ import { dirname } from 'path'
-msg?: string
+msg?: string,
@@ -28,7 +28,7 @@ const downloadFile = async (url: string, path: string) => {
-logger: Logger
+logger: Logger,
@@ -48,5 +48,5 @@ const _unsafe_downloadAndExtract = async (
-(url) => url
+(url) => url,
@@ -5,7 +5,7 @@ import { dirname } from 'path'
-path: string
+path: string,
@@ -40,7 +40,7 @@ export const tryFetch = async (url: string, config?: Partial<Config>) => {
-`Could not fetch ${url}, trying again in ${retryMs}ms. Raw error ${e}`
+`Could not fetch ${url}, trying again in ${retryMs}ms. Raw error ${e}`,
@@ -250,7 +250,7 @@ export class EventSource extends EventTarget {
-this.#settings.reconnectionTime
+this.#settings.reconnectionTime,
packages/deno-worker/index.d.ts (vendored):
@@ -19,7 +19,7 @@ declare class EventSource {
-eventSourceInitDict?: EventSource.EventSourceInitDict
+eventSourceInitDict?: EventSource.EventSourceInitDict,
@@ -23,7 +23,7 @@ export const init = (klass: typeof PocketBase) => {
-password = ADMIN_PASSWORD
+password = ADMIN_PASSWORD,
@@ -1,16 +0,0 @@
-{
-"useTabs": false,
-"singleQuote": true,
-"semi": false,
-"trailingComma": "none",
-"printWidth": 100,
-"pluginSearchDirs": [".", "../.."],
-"overrides": [
-{
-"files": "*.svelte",
-"options": {
-"parser": "svelte"
-}
-}
-]
-}
@@ -10,28 +10,25 @@
 "check:watch": "svelte-kit sync && svelte-check --tsconfig ./tsconfig.json --watch",
 "lint": "prettier --check .",
 "format": "prettier --write .",
-"start": "HOST=localhost PORT=5173 node dist-server/index.js",
+"start": "HOST=localhost PORT=5173 dotenv -e ../../.env node -- dist-server/index.js",
 "pm2": "pm2 del www ; pm2 start \"yarn start\" --name=www -o ~/logs/www.log -e ~/logs/www.log",
 "watch": "chokidar 'src/**' -c 'yarn build' --initial"
 },
 "devDependencies": {
-"@sveltejs/adapter-node": "^1.0.0-next.99",
-"@sveltejs/kit": "next",
-"chokidar-cli": "^3.0.0",
-"node-html-parser": "^6.1.4",
-"svelte": "^3.44.0",
-"svelte-check": "^2.7.1",
-"svelte-preprocess": "^4.10.6",
-"tslib": "^2.3.1",
-"typescript": "^4.8.0",
-"vite": "^3.1.0"
+"@sveltejs/adapter-node": "^1.3.1",
+"@sveltejs/kit": "^1.24.1",
+"node-html-parser": "^6.1.8",
+"svelte": "^4.2.0",
+"svelte-check": "^3.5.1",
+"svelte-preprocess": "^5.0.4",
+"vite": "^4.4.9"
 },
 "type": "module",
 "dependencies": {
 "@dansvel/vite-plugin-markdown": "^2.0.5",
 "@microsoft/fetch-event-source": "https://github.com/Almar/fetch-event-source.git#pr/make_web_worker_friendly",
 "@pockethost/common": "0.0.1",
-"@s-libs/micro-dash": "^15.1.0",
+"@s-libs/micro-dash": "^16.1.0",
 "@types/bootstrap": "^5.2.6",
 "@types/d3-scale": "^4.0.3",
 "@types/d3-scale-chromatic": "^3.0.0",
@@ -6,5 +6,5 @@ export enum AlertTypes {
-Dark = 'dark'
+Dark = 'dark',
@@ -5,13 +5,13 @@ import Cookies from 'js-cookie'
-Dark = 'dark'
+Dark = 'dark',
-'https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.7.0/styles/a11y-dark.min.css'
+'https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.7.0/styles/a11y-dark.min.css',
@@ -19,7 +19,7 @@ export const STORAGE_NAME: string = 'theme'
-[ThemeNames.Dark]: 'bi bi-brightness-high'
+[ThemeNames.Dark]: 'bi bi-brightness-high',
@@ -30,7 +30,8 @@ export const html = () => {
-const currentTheme = find(ALLOWED_THEMES, (v) => savedTheme === v) || DEFAULT_THEME
+const currentTheme =
+find(ALLOWED_THEMES, (v) => savedTheme === v) || DEFAULT_THEME
@@ -6,15 +6,19 @@ import { boolean } from 'boolean'
 import UrlPattern from 'url-pattern'
 import base from '../../../package.json'

-export const env = (name: string, _default: string = '') => {
+export type PublicEnvName = `PUBLIC_${string}`
+
+export const env = (name: PublicEnvName, _default: string = '') => {
   const v = _env[name]
   if (!v) return _default
   return v
 }

-export const envi = (name: string, _default: number) => parseInt(env(name, _default.toString()))
+export const envi = (name: PublicEnvName, _default: number) =>
+  parseInt(env(name, _default.toString()))

-export const envb = (name: string, _default: boolean) => boolean(env(name, _default.toString()))
+export const envb = (name: PublicEnvName, _default: boolean) =>
+  boolean(env(name, _default.toString()))

 export const PUBLIC_APP_DB = env('PUBLIC_APP_DB', 'pockethost-central')
 export const PUBLIC_APP_DOMAIN = env('PUBLIC_APP_DOMAIN', 'pockethost.io')
@@ -24,7 +28,9 @@ export const PUBLIC_DEBUG = envb('PUBLIC_DEBUG', dev)

 export const PUBLIC_POCKETHOST_VERSION = base.version

-export const PUBLIC_ROUTES = publicRoutes.map((pattern) => new UrlPattern(pattern))
+export const PUBLIC_ROUTES = publicRoutes.map(
+  (pattern) => new UrlPattern(pattern),
+)

 try {
   logger()
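The new PublicEnvName template-literal type means env(), envi(), and envb() now only accept names beginning with PUBLIC_, enforced at compile time. A rough illustration (the commented-out line exists only to show the type error; it is not part of the commit):

// Compiles: both names match `PUBLIC_${string}`
const domain = env('PUBLIC_APP_DOMAIN', 'pockethost.io')
const debug = envb('PUBLIC_DEBUG', false)

// Type error under the new signature (name lacks the PUBLIC_ prefix):
// const nope = env('APP_DOMAIN', 'pockethost.io')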
@@ -1,17 +1,17 @@
 import { createGenericSyncEvent } from '$util/events'
 import { fetchEventSource } from '@microsoft/fetch-event-source'
 import {
-  assertExists,
   CreateInstancePayloadSchema,
-  createRpcHelper,
-  createWatchHelper,
-  logger,
   RenameInstancePayloadSchema,
   RpcCommands,
-  safeCatch,
   SaveSecretsPayloadSchema,
   SaveVersionPayloadSchema,
   SetInstanceMaintenancePayloadSchema,
+  assertExists,
+  createRpcHelper,
+  createWatchHelper,
+  logger,
+  safeCatch,
   type CreateInstancePayload,
   type CreateInstanceResult,
   type InstanceFields,
@@ -26,7 +26,7 @@ import {
   type SetInstanceMaintenancePayload,
   type SetInstanceMaintenanceResult,
   // gen:import
-  type UserFields
+  type UserFields,
 } from '@pockethost/common'
 import { keys, map } from '@s-libs/micro-dash'
 import PocketBase, {
@@ -34,7 +34,7 @@ import PocketBase, {
   BaseAuthStore,
   ClientResponseError,
   type RecordSubscription,
-  type UnsubscribeFunc
+  type UnsubscribeFunc,
 } from 'pocketbase'

 export type AuthChangeHandler = (user: BaseAuthStore) => void
@@ -66,36 +66,45 @@ export const createPocketbaseClient = (config: PocketbaseClientConfig) => {

   const logOut = () => authStore.clear()

-  const createUser = safeCatch(`createUser`, _logger, (email: string, password: string) =>
+  const createUser = safeCatch(
+    `createUser`,
+    _logger,
+    (email: string, password: string) =>
       client
         .collection('users')
        .create({
          email,
          password,
-          passwordConfirm: password
+          passwordConfirm: password,
        })
        .then(() => {
          // dbg(`Sending verification email to ${email}`)
          return client.collection('users').requestVerification(email)
-        })
+        }),
   )

-  const confirmVerification = safeCatch(`confirmVerification`, _logger, (token: string) =>
+  const confirmVerification = safeCatch(
+    `confirmVerification`,
+    _logger,
+    (token: string) =>
       client
        .collection('users')
        .confirmVerification(token)
        .then((response) => {
          return response
-        })
+        }),
   )

-  const requestPasswordReset = safeCatch(`requestPasswordReset`, _logger, (email: string) =>
+  const requestPasswordReset = safeCatch(
+    `requestPasswordReset`,
+    _logger,
+    (email: string) =>
       client
        .collection('users')
        .requestPasswordReset(email)
        .then(() => {
          return true
-        })
+        }),
   )

   const requestPasswordResetConfirm = safeCatch(
@@ -107,15 +116,18 @@ export const createPocketbaseClient = (config: PocketbaseClientConfig) => {
       .confirmPasswordReset(token, password, password)
       .then((response) => {
         return response
-      })
+      }),
   )

-  const authViaEmail = safeCatch(`authViaEmail`, _logger, (email: string, password: string) =>
-    client.collection('users').authWithPassword(email, password)
+  const authViaEmail = safeCatch(
+    `authViaEmail`,
+    _logger,
+    (email: string, password: string) =>
+      client.collection('users').authWithPassword(email, password),
   )

   const refreshAuthToken = safeCatch(`refreshAuthToken`, _logger, () =>
-    client.collection('users').authRefresh()
+    client.collection('users').authRefresh(),
   )

   const watchHelper = createWatchHelper({ client })
@@ -125,27 +137,27 @@ export const createPocketbaseClient = (config: PocketbaseClientConfig) => {

   const createInstance = mkRpc<CreateInstancePayload, CreateInstanceResult>(
     RpcCommands.CreateInstance,
-    CreateInstancePayloadSchema
+    CreateInstancePayloadSchema,
   )
   const saveSecrets = mkRpc<SaveSecretsPayload, SaveSecretsResult>(
     RpcCommands.SaveSecrets,
-    SaveSecretsPayloadSchema
+    SaveSecretsPayloadSchema,
   )

   const saveVersion = mkRpc<SaveVersionPayload, SaveVersionResult>(
     RpcCommands.SaveVersion,
-    SaveVersionPayloadSchema
+    SaveVersionPayloadSchema,
   )

   const renameInstance = mkRpc<RenameInstancePayload, RenameInstanceResult>(
     RpcCommands.RenameInstance,
-    RenameInstancePayloadSchema
+    RenameInstancePayloadSchema,
   )

-  const setInstanceMaintenance = mkRpc<SetInstanceMaintenancePayload, SetInstanceMaintenanceResult>(
-    RpcCommands.SetInstanceMaintenance,
-    SetInstanceMaintenancePayloadSchema
-  )
+  const setInstanceMaintenance = mkRpc<
+    SetInstanceMaintenancePayload,
+    SetInstanceMaintenanceResult
+  >(RpcCommands.SetInstanceMaintenance, SetInstanceMaintenancePayloadSchema)

   // gen:mkRpc

@@ -153,42 +165,56 @@ export const createPocketbaseClient = (config: PocketbaseClientConfig) => {
     `getInstanceById`,
     _logger,
     (id: InstanceId): Promise<InstanceFields | undefined> =>
-      client.collection('instances').getOne<InstanceFields>(id)
+      client.collection('instances').getOne<InstanceFields>(id),
   )

   const watchInstanceById = async (
     id: InstanceId,
-    cb: (data: RecordSubscription<InstanceFields>) => void
+    cb: (data: RecordSubscription<InstanceFields>) => void,
   ): Promise<UnsubscribeFunc> => watchById('instances', id, cb)

-  const getAllInstancesById = safeCatch(`getAllInstancesById`, _logger, async () =>
-    (await client.collection('instances').getFullList()).reduce((c, v) => {
-      c[v.id] = v as unknown as InstanceFields
-      return c
-    }, {} as { [_: InstanceId]: InstanceFields })
+  const getAllInstancesById = safeCatch(
+    `getAllInstancesById`,
+    _logger,
+    async () =>
+      (await client.collection('instances').getFullList()).reduce(
+        (c, v) => {
+          c[v.id] = v as unknown as InstanceFields
+          return c
+        },
+        {} as { [_: InstanceId]: InstanceFields },
+      ),
   )

   const parseError = (e: Error): string[] => {
     if (!(e instanceof ClientResponseError)) return [e.message]
-    if (e.data.message && keys(e.data.data).length === 0) return [e.data.message]
-    return map(e.data.data, (v, k) => (v ? v.message : undefined)).filter((v) => !!v)
+    if (e.data.message && keys(e.data.data).length === 0)
+      return [e.data.message]
+    return map(e.data.data, (v, k) => (v ? v.message : undefined)).filter(
+      (v) => !!v,
+    )
   }

-  const resendVerificationEmail = safeCatch(`resendVerificationEmail`, _logger, async () => {
-    const user = client.authStore.model
-    assertExists(user, `Login required`)
-    await client.collection('users').requestVerification(user.email)
-  })
+  const resendVerificationEmail = safeCatch(
+    `resendVerificationEmail`,
+    _logger,
+    async () => {
+      const user = client.authStore.model
+      assertExists(user, `Login required`)
+      await client.collection('users').requestVerification(user.email)
+    },
+  )

   const getAuthStoreProps = (): AuthStoreProps => {
     const { token, model, isValid } = client.authStore as AuthStoreProps
     // dbg(`current authStore`, { token, model, isValid })
     if (model instanceof Admin) throw new Error(`Admin models not supported`)
-    if (model && !model.email) throw new Error(`Expected model to be a user here`)
+    if (model && !model.email)
+      throw new Error(`Expected model to be a user here`)
     return {
       token,
       model,
-      isValid
+      isValid,
     }
   }

@@ -196,7 +222,8 @@ export const createPocketbaseClient = (config: PocketbaseClientConfig) => {
    * Use synthetic event for authStore changers so we can broadcast just
    * the props we want and not the actual authStore object.
    */
-  const [onAuthChange, fireAuthChange] = createGenericSyncEvent<AuthStoreProps>()
+  const [onAuthChange, fireAuthChange] =
+    createGenericSyncEvent<AuthStoreProps>()

   /**
    * This section is for initialization
@@ -254,7 +281,7 @@ export const createPocketbaseClient = (config: PocketbaseClientConfig) => {
   const watchInstanceLog = (
     instanceId: InstanceId,
     update: (log: InstanceLogFields) => void,
-    nInitial = 100
+    nInitial = 100,
   ): (() => void) => {
     const { dbg, trace } = _logger.create('watchInstanceLog')
     const auth = client.authStore.exportToCookie()
@@ -266,12 +293,12 @@ export const createPocketbaseClient = (config: PocketbaseClientConfig) => {
       fetchEventSource(`${url}/logs`, {
         method: 'POST',
         headers: {
-          'Content-Type': 'application/json'
+          'Content-Type': 'application/json',
         },
         body: JSON.stringify({
           instanceId,
           n: nInitial,
-          auth
+          auth,
         }),
         onmessage: (event) => {
           trace(`Got stream event`, event)
@@ -290,7 +317,7 @@ export const createPocketbaseClient = (config: PocketbaseClientConfig) => {
           setTimeout(continuallyFetchFromEventSource, 100)
           dbg(`Stream closed`)
         },
-        signal
+        signal,
       })
     }
     continuallyFetchFromEventSource()
@@ -323,6 +350,6 @@ export const createPocketbaseClient = (config: PocketbaseClientConfig) => {
     renameInstance,
     setInstanceMaintenance,
     // gen:export
     saveVersion
-    saveVersion
+    saveVersion,
   }
 }
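Most of these helpers follow the same safeCatch(name, logger, fn) shape and are returned from the factory, so UI code calls them through the client() singleton shown in the next hunk. A rough consumer sketch; the '$src/pocketbase' import path is an assumption, and getAllInstancesById being among the returned members is inferred rather than shown in this excerpt (authViaEmail is used this way elsewhere in the commit):

// Sketch only; import path and membership assumptions noted above.
import { client } from '$src/pocketbase'

export const demoLoginAndList = async () => {
  const { authViaEmail, getAllInstancesById } = client()
  // authViaEmail wraps client.collection('users').authWithPassword(...)
  await authViaEmail('user@example.com', 'correct horse battery staple')
  // getAllInstancesById resolves to a map keyed by instance id
  const instances = await getAllInstancesById()
  console.log(Object.keys(instances))
}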
@@ -1,7 +1,10 @@
 import { browser } from '$app/environment'
 import { PUBLIC_APP_DB, PUBLIC_APP_DOMAIN } from '$src/env'
 import { logger } from '@pockethost/common'
-import { createPocketbaseClient, type PocketbaseClient } from './PocketbaseClient'
+import {
+  createPocketbaseClient,
+  type PocketbaseClient,
+} from './PocketbaseClient'

 export const client = (() => {
   let clientInstance: PocketbaseClient | undefined
@@ -23,14 +23,14 @@ function formatInput(input: SecretsArray): SecretsArray {
     .map(({ name, value }, index) => ({
       name,
       value,
-      color: colorScale(index.toString())
+      color: colorScale(index.toString()),
     }))
 }

 const sanitize = (item: SecretItem) => {
   return {
     name: item.name.toUpperCase().trim(),
-    value: item.value.trim()
+    value: item.value.trim(),
   }
 }

@@ -54,8 +54,8 @@ function createItems(initialItems: SecretsArray) {
         ...n,
         {
           name,
-          value
-        }
+          value,
+        },
       ]
       return formatInput(n)
     })
@@ -69,7 +69,7 @@ function createItems(initialItems: SecretsArray) {
       n = [...n.slice(0, index), ...n.slice(index + 1)]
       return formatInput(n)
     })
-    }
+    },
   }
 }
@@ -28,7 +28,7 @@ export const handleLogin = async (
   email: string,
   password: string,
   setError?: FormErrorHandler,
-  shouldRedirect: boolean = true
+  shouldRedirect: boolean = true,
 ) => {
   const { authViaEmail } = client()
   // Reset the form error if the form is submitted
@@ -42,7 +42,9 @@ export const handleLogin = async (
     }
   } catch (error) {
     if (!(error instanceof Error)) {
-      throw new Error(`Expected Error type here, but got ${typeof error}:${error}`)
+      throw new Error(
+        `Expected Error type here, but got ${typeof error}:${error}`,
+      )
     }
     handleFormError(error, setError)
   }
@@ -57,7 +59,7 @@ export const handleLogin = async (
 export const handleRegistration = async (
   email: string,
   password: string,
-  setError?: FormErrorHandler
+  setError?: FormErrorHandler,
 ) => {
   const { createUser } = client()
   // Reset the form error if the form is submitted
@@ -75,7 +77,10 @@ export const handleRegistration = async (
  * @param token {string} The token from the verification email
  * @param setError {function} This can be used to show an alert bar if an error occurs during the login process
  */
-export const handleAccountConfirmation = async (token: string, setError?: FormErrorHandler) => {
+export const handleAccountConfirmation = async (
+  token: string,
+  setError?: FormErrorHandler,
+) => {
   const { confirmVerification } = client()
   // Reset the form error if the form is submitted
   setError?.('')
@@ -98,7 +103,7 @@ export const handleAccountConfirmation = async (token: string, setError?: FormEr
  */
 export const handleUnauthenticatedPasswordReset = async (
   email: string,
-  setError?: FormErrorHandler
+  setError?: FormErrorHandler,
 ) => {
   const { requestPasswordReset } = client()
   // Reset the form error if the form is submitted
@@ -122,7 +127,7 @@ export const handleUnauthenticatedPasswordReset = async (
 export const handleUnauthenticatedPasswordResetConfirm = async (
   token: string,
   password: string,
-  setError?: FormErrorHandler
+  setError?: FormErrorHandler,
 ) => {
   const { requestPasswordResetConfirm } = client()
   // Reset the form error if the form is submitted
@@ -141,7 +146,7 @@ export const handleUnauthenticatedPasswordResetConfirm = async (

 export const handleCreateNewInstance = async (
   instanceName: string,
-  setError?: FormErrorHandler
+  setError?: FormErrorHandler,
 ) => {
   const { user, createInstance } = client()
   // Get the newly created user id
@@ -154,7 +159,7 @@ export const handleCreateNewInstance = async (

   // Create a new instance using the generated name
   const record = await createInstance({
-    subdomain: instanceName
+    subdomain: instanceName,
   })

   await goto(`/app/instances/${record.instance.id}`)
@@ -167,7 +172,7 @@ export const handleInstanceGeneratorWidget = async (
   email: string,
   password: string,
   instanceName: string,
-  setError = (value: string) => {}
+  setError = (value: string) => {},
 ) => {
   const { dbg, error, warn } = logger()

@@ -203,7 +208,7 @@ export const handleInstanceGeneratorWidget = async (
       // If registration succeeds, login should always succeed.
       // If a login fails at this point, the system is broken.
       throw new Error(
-        `Login system is currently down. Please contact us so we can fix this.`
+        `Login system is currently down. Please contact us so we can fix this.`,
       )
     })
   })
@@ -247,7 +252,9 @@ export const handleInstanceGeneratorWidget = async (
   }
 }

-export const handleResendVerificationEmail = async (setError = (value: string) => {}) => {
+export const handleResendVerificationEmail = async (
+  setError = (value: string) => {},
+) => {
   const { resendVerificationEmail } = client()
   try {
     await resendVerificationEmail()
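These handlers all take an optional setError callback so a form can surface failures without throwing. A minimal sketch of wiring one up; the onSubmit name and the local formError variable are illustrative only, and handleLogin is assumed to be imported from this util module:

// Sketch only; names other than handleLogin are made up for illustration.
let formError = ''

export const onSubmit = async (email: string, password: string) => {
  // shouldRedirect is omitted here; it defaults to true per the signature above.
  await handleLogin(email, password, (message) => {
    formError = message // shown in an alert bar by the calling component
  })
}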
@@ -4,7 +4,7 @@ export type Unsubscribe = () => void

 export const createGenericAsyncEvent = <TPayload>(): [
   (cb: (payload: TPayload) => Promise<void>) => Unsubscribe,
-  (payload: TPayload) => Promise<void>
+  (payload: TPayload) => Promise<void>,
 ] => {
   let i = 0
   const callbacks: any = {}
@@ -22,7 +22,7 @@ export const createGenericAsyncEvent = <TPayload>(): [
       (c, cb) => {
         return c.then(cb(payload))
       },
-      Promise.resolve()
+      Promise.resolve(),
     )

   return [onEvent, fireEvent]
@@ -30,7 +30,7 @@ export const createGenericAsyncEvent = <TPayload>(): [

 export const createGenericSyncEvent = <TPayload>(): [
   (cb: (payload: TPayload) => void) => Unsubscribe,
-  (payload: TPayload) => void
+  (payload: TPayload) => void,
 ] => {
   let i = 0
   const callbacks: any = {}
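Both factories return an [onEvent, fireEvent] tuple, which is exactly how PocketbaseClient uses the sync variant for auth changes above. A small sketch; the Ping payload type is made up purely for illustration:

// Sketch only; the payload type is hypothetical.
import { createGenericSyncEvent } from '$util/events'

type Ping = { at: number } // illustrative payload type

const [onPing, firePing] = createGenericSyncEvent<Ping>()

const unsubscribe = onPing((p) => console.log(`ping at ${p.at}`))
firePing({ at: Date.now() }) // every subscriber runs synchronously
unsubscribe() // stop receiving further events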
@@ -4,7 +4,11 @@ import type { AuthStoreProps } from '$src/pocketbase/PocketbaseClient'
 import { logger } from '@pockethost/common'
 import { writable } from 'svelte/store'

-export const authStoreState = writable<AuthStoreProps>({ isValid: false, model: null, token: '' })
+export const authStoreState = writable<AuthStoreProps>({
+  isValid: false,
+  model: null,
+  token: '',
+})
 export const isUserLoggedIn = writable(false)
 export const isUserVerified = writable(false)
 export const isAuthStateInitialized = writable(false)
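authStoreState is an ordinary Svelte writable, so any component or module can subscribe to it. A tiny sketch, assuming the stores are imported from this module (the '$src/stores' path and the mirroring logic are illustrative, not from the commit):

// Sketch only; import path and mirroring behavior are assumptions.
import { authStoreState, isUserLoggedIn } from '$src/stores'

const unsubscribe = authStoreState.subscribe(({ isValid, model }) => {
  // Mirror the auth snapshot into the simpler boolean store.
  isUserLoggedIn.set(isValid && !!model)
})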
@@ -11,11 +11,11 @@
     "sourceMap": true,
     "strict": true,
     "paths": {
-      "$util/*": ["src/util/*"],
-      "$components/*": ["src/components/*"],
-      "$src/*": ["src/*"]
+      "$util/*": ["./src/util/*"],
+      "$components/*": ["./src/components/*"],
+      "$src/*": ["./src/*"]
     },
-    "types": ["src/global.d.ts"]
+    "types": ["./src/global.d.ts"]
   }
 // Path aliases are handled by https://kit.svelte.dev/docs/configuration#alias
 //
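The ./ prefix simply makes each alias an explicit relative path; imports that use the aliases, like the ones earlier in this commit, are unaffected:

// These aliases now resolve via ./src/util/* and ./src/* respectively.
import { createGenericSyncEvent } from '$util/events'
import type { AuthStoreProps } from '$src/pocketbase/PocketbaseClient'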
@@ -6,8 +6,8 @@ import markedOptions from './marked.config.js'
 const config: UserConfig = {
   plugins: [markdown({ markedOptions }), sveltekit()],
   optimizeDeps: {
-    include: ['highlight.js', 'highlight.js/lib/core']
-  }
+    include: ['highlight.js', 'highlight.js/lib/core'],
+  },
 }

 export default config
@@ -10,7 +10,6 @@
   },
   "dependencies": {
     "@types/node": "^18.11.17",
-    "http-proxy": "^1.18.1",
-    "tsx": "^3.12.1"
+    "http-proxy": "^1.18.1"
   }
 }
@@ -1,158 +0,0 @@
-diff --git a/node_modules/sqlite3/lib/sqlite3.d.ts b/node_modules/sqlite3/lib/sqlite3.d.ts
-index b27b0cf..783a961 100644
---- a/node_modules/sqlite3/lib/sqlite3.d.ts
-+++ b/node_modules/sqlite3/lib/sqlite3.d.ts
-@@ -139,6 +139,153 @@ export class Database extends events.EventEmitter {
-     wait(callback?: (param: null) => void): this;
-
-     interrupt(): void;
-+
-+    backup(path:string, callback?: ()=>void): Backup
-+    backup(filename:string, destDbName:string, sourceDbName:string, filenameIsDest:boolean, callback?: ()=>void): Backup
-+}
-+
-+/**
-+ *
-+ * A class for managing an sqlite3_backup object. For consistency
-+ * with other node-sqlite3 classes, it maintains an internal queue
-+ * of calls.
-+ *
-+ * Intended usage from node:
-+ *
-+ *   var db = new sqlite3.Database('live.db');
-+ *   var backup = db.backup('backup.db');
-+ *   ...
-+ *   // in event loop, move backup forward when we have time.
-+ *   if (backup.idle) { backup.step(NPAGES); }
-+ *   if (backup.completed) { ... success ... }
-+ *   if (backup.failed) { ... sadness ... }
-+ *   // do other work in event loop - fine to modify live.db
-+ *   ...
-+ *
-+ * Here is how sqlite's backup api is exposed:
-+ *
-+ *   - `sqlite3_backup_init`: This is implemented as
-+ *     `db.backup(filename, [callback])` or
-+ *     `db.backup(filename, destDbName, sourceDbName, filenameIsDest, [callback])`.
-+ *   - `sqlite3_backup_step`: `backup.step(pages, [callback])`.
-+ *   - `sqlite3_backup_finish`: `backup.finish([callback])`.
-+ *   - `sqlite3_backup_remaining`: `backup.remaining`.
-+ *   - `sqlite3_backup_pagecount`: `backup.pageCount`.
-+ *
-+ * There are the following read-only properties:
-+ *
-+ *   - `backup.completed` is set to `true` when the backup
-+ *     succeeeds.
-+ *   - `backup.failed` is set to `true` when the backup
-+ *     has a fatal error.
-+ *   - `backup.message` is set to the error string
-+ *     the backup has a fatal error.
-+ *   - `backup.idle` is set to `true` when no operation
-+ *     is currently in progress or queued for the backup.
-+ *   - `backup.remaining` is an integer with the remaining
-+ *     number of pages after the last call to `backup.step`
-+ *     (-1 if `step` not yet called).
-+ *   - `backup.pageCount` is an integer with the total number
-+ *     of pages measured during the last call to `backup.step`
-+ *     (-1 if `step` not yet called).
-+ *
-+ * There is the following writable property:
-+ *
-+ *   - `backup.retryErrors`: an array of sqlite3 error codes
-+ *     that are treated as non-fatal - meaning, if they occur,
-+ *     backup.failed is not set, and the backup may continue.
-+ *     By default, this is `[sqlite3.BUSY, sqlite3.LOCKED]`.
-+ *
-+ * The `db.backup(filename, [callback])` shorthand is sufficient
-+ * for making a backup of a database opened by node-sqlite3. If
-+ * using attached or temporary databases, or moving data in the
-+ * opposite direction, the more complete (but daunting)
-+ * `db.backup(filename, destDbName, sourceDbName, filenameIsDest, [callback])`
-+ * signature is provided.
-+ *
-+ * A backup will finish automatically when it succeeds or a fatal
-+ * error occurs, meaning it is not necessary to call `db.finish()`.
-+ * By default, SQLITE_LOCKED and SQLITE_BUSY errors are not
-+ * treated as failures, and the backup will continue if they
-+ * occur. The set of errors that are tolerated can be controlled
-+ * by setting `backup.retryErrors`. To disable automatic
-+ * finishing and stick strictly to sqlite's raw api, set
-+ * `backup.retryErrors` to `[]`. In that case, it is necessary
-+ * to call `backup.finish()`.
-+ *
-+ * In the same way as node-sqlite3 databases and statements,
-+ * backup methods can be called safely without callbacks, due
-+ * to an internal call queue. So for example this naive code
-+ * will correctly back up a db, if there are no errors:
-+ *
-+ *   var backup = db.backup('backup.db');
-+ *   backup.step(-1);
-+ *   backup.finish();
-+ *
-+ */
-+export class Backup extends events.EventEmitter {
-+    /**
-+     * `true` when the backup is idle and ready for `step()` to
-+     * be called, `false` when busy.
-+     */
-+    readonly idle: boolean
-+
-+    /**
-+     * `true` when the backup has completed, `false` otherwise.
-+     */
-+    readonly completed: boolean
-+
-+    /**
-+     * `true` when the backup has failed, `false` otherwise. `Backup.message`
-+     * contains the error message.
-+     */
-+    readonly failed: boolean
-+
-+    /**
-+     * Message failure string from sqlite3_errstr() if `Backup.failed` is `true`
-+     */
-+    readonly message: boolean
-+
-+    /**
-+     * The number of remaining pages after the last call to `step()`,
-+     * or `-1` if `step()` has never been called.
-+     */
-+    readonly remaining: number
-+
-+    /**
-+     * The total number of pages measured during the last call to `step()`,
-+     * or `-1` if `step()` has never been called.
-+     */
-+    readonly pageCount: number
-+
-+
-+    /**
-+     * An array of sqlite3 error codes that are treated as non-fatal -
-+     * meaning, if they occur, `Backup.failed` is not set, and the backup
-+     * may continue. By default, this is `[sqlite3.BUSY, sqlite3.LOCKED]`.
-+     */
-+    retryErrors: number[]
-+
-+    /**
-+     * Asynchronously finalize the backup (required).
-+     *
-+     * @param callback Called when the backup is finalized.
-+     */
-+    finish(callback?: ()=>void): void
-+
-+    /**
-+     * Asynchronously perform an incremental segment of the backup.
-+     *
-+     * Example:
-+     *
-+     * ```
-+     * backup.step(5)
-+     * ```
-+     *
-+     * @param nPages Number of pages to process (5 recommended).
-+     * @param callback Called when the step is completed.
-+     */
-+    step(nPages: number,callback?: ()=>void): void
- }
-
- export function verbose(): sqlite3;
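The removed patch had added typings for node-sqlite3's backup API on top of sqlite3.d.ts. For reference, the usage those deleted comments describe looks roughly like the sketch below; it assumes a sqlite3 build that actually exposes Database#backup and is based on the removed typings, not on anything this commit adds:

// Sketch only; assumes Database#backup is available at runtime and typed.
import sqlite3 from 'sqlite3'

const db = new sqlite3.Database('live.db')
const backup = db.backup('backup.db') // wraps sqlite3_backup_init

backup.step(-1) // copy all remaining pages in one pass
backup.finish(() => {
  // Per the removed docs, a backup finishes automatically on success;
  // finish() here just guarantees finalization before closing the db.
  db.close()
})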