Mirror of https://github.com/pockethost/pockethost.git, synced 2025-03-30 15:08:30 +00:00

Merge branch 'master' of github.com:benallfree/pockethost

Commit f5d1201b1a
@@ -1,3 +1,5 @@
 .svelte-kit
 dist
 mount
+.data
+attic
@@ -1,56 +1,60 @@
 # Table of contents

-* [👋 Welcome to PocketHost](README.md)
+- [👋 Welcome to PocketHost](README.md)

 ## Overview

-* [Getting Help](overview/help.md)
-* [FAQ](overview/faq.md)
-* [Roadmap](overview/roadmap.md)
+- [Getting Help](overview/help.md)
+- [FAQ](overview/faq.md)
+- [Roadmap](overview/roadmap.md)

 ## Daily Usage

-* [Creating an Instance](usage/create.md)
-* [Accessing an Instance](usage/accessing-instance.md)
-* [Instance Details](usage/instances/index.md)
-* [Renaming an Instance](usage/rename-instance.md)
-* [Maintenance Mode](usage/maintenance.md)
-* [FTP Access](usage/ftp.md)
-* [Backup & Restore](usage/backup-and-restore.md)
-* [Worker](daily-usage/worker.md)
-* [PocketBase Hooks](usage/hooks.md)
-* [Upgrading](usage/upgrading.md)
+- [Creating an Instance](usage/create.md)
+- [Accessing an Instance](usage/accessing-instance.md)
+- [Instance Details](usage/instances/index.md)
+- [Renaming an Instance](usage/rename-instance.md)
+- [Maintenance Mode](usage/maintenance.md)
+- [FTP Access](usage/ftp.md)
+- [Backup & Restore](usage/backup-and-restore.md)
+- [Worker](daily-usage/worker.md)
+- [PocketBase Hooks](usage/hooks.md)
+- [Upgrading](usage/upgrading.md)

+## Hosting
+
+- [Overview](hosting/overview.md)
+
 ## Contributing

-* [Overview](development/overview.md)
-* [Running Just the Frontend](development/frontend.md)
-* [Running Everything](development/full-stack/index.md)
-* [Creating RPC Calls](development/rpc.md)
-* [Production Deployment](development/production.md)
+- [Overview](development/overview.md)
+- [Running Just the Frontend](development/frontend.md)
+- [Running Everything](development/full-stack/index.md)
+- [Creating RPC Calls](development/rpc.md)
+- [Production Deployment](development/production.md)

 ## Release History

-* [next](releases/next.md)
-* [0.8.0](releases/0.8.0.md)
-* [0.7.2](releases/0.7.2.md)
-* [0.7.1](releases/0.7.1.md)
-* [0.7.0](releases/0.7.0.md)
-* [0.6.1](releases/0.6.1.md)
-* [0.6.0](releases/0.6.0.md)
-* [0.5.7](releases/0.5.7.md)
-* [0.5.6](releases/0.5.6.md)
-* [0.5.5](releases/0.5.5.md)
-* [0.5.4](releases/0.5.4.md)
-* [0.5.3](releases/0.5.3.md)
-* [0.5.2](releases/0.5.2.md)
-* [0.5.1](releases/0.5.1.md)
-* [0.5.0](releases/0.5.0.md)
-* [0.4.2](releases/0.4.2.md)
-* [0.4.1](releases/0.4.1.md)
-* [0.4.0](releases/0.4.0.md)
-* [0.3.2](releases/0.3.2.md)
-* [0.3.1](releases/0.3.1.md)
-* [0.3.0](releases/0.3.0.md)
-* [0.2.0](releases/0.2.0.md)
-* [0.0.1](release-history/0.0.1.md)
+- [next](releases/next.md)
+- [0.8.0](releases/0.8.0.md)
+- [0.7.2](releases/0.7.2.md)
+- [0.7.1](releases/0.7.1.md)
+- [0.7.0](releases/0.7.0.md)
+- [0.6.1](releases/0.6.1.md)
+- [0.6.0](releases/0.6.0.md)
+- [0.5.7](releases/0.5.7.md)
+- [0.5.6](releases/0.5.6.md)
+- [0.5.5](releases/0.5.5.md)
+- [0.5.4](releases/0.5.4.md)
+- [0.5.3](releases/0.5.3.md)
+- [0.5.2](releases/0.5.2.md)
+- [0.5.1](releases/0.5.1.md)
+- [0.5.0](releases/0.5.0.md)
+- [0.4.2](releases/0.4.2.md)
+- [0.4.1](releases/0.4.1.md)
+- [0.4.0](releases/0.4.0.md)
+- [0.3.2](releases/0.3.2.md)
+- [0.3.1](releases/0.3.1.md)
+- [0.3.0](releases/0.3.0.md)
+- [0.2.0](releases/0.2.0.md)
+- [0.0.1](release-history/0.0.1.md)
41 gitbook/hosting/overview.md Normal file
@@ -0,0 +1,41 @@
+# Overview
+
+[UNDER CONSTRUCTION]
+
+This guide covers how to set up a production hosting environment for PocketHost. Hosting PocketHost might be desirable if:
+
+- You want to create a hosting service business powered by PocketHost
+- You want a private copy of PocketHost where you control all the underlying infrastructure
+- You want to run PocketHost from a region not yet offered by pockethost.io
+
+Running a hosting service is not easy. To provide a great hosting experience for users, you need to know about:
+
+- Docker
+- Email and DKIM+SPF and more
+- DNS jargon: MX, TXT, CNAME
+- SSL cert provisioning and management
+- Storage
+- Volume mounts
+- Cloud computing or VPS deployment
+- CDN and static asset hosting
+- Amazon AWS
+- Lots more: scaling, firewalls, DDoS defense, user security, log rotation, patches, updates, build tools, CPU architectures, multitenancy, on and on
+
+If you're still interested in creating a PocketHost hosting environment for yourself, read on...
+
+```
+apt-get update
+apt-get install -y nginx nodejs npm
+npm i -g n yarn
+n lts
+hash -r
+git clone git@github.com:benallfree/pockethost.git pockethost-latest
+cd pockethost-latest
+yarn
+cd ..
+git clone git@github.com:benallfree/pockethost.git pockethost-lts
+cd pockethost-lts
+yarn
+cd ..
+
+```
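The snippet above installs nginx but stops before configuring it. A minimal wildcard reverse-proxy sketch of the kind this setup implies (the `example.com` domain, certificate paths, and the `8090` upstream port are illustrative assumptions, not values from this commit):

```nginx
# Hypothetical vhost: route *.example.com subdomains to a local
# PocketBase/PocketHost daemon. Domain, cert paths, and port are placeholders.
server {
  listen 443 ssl;
  server_name ~^(?<subdomain>.+)\.example\.com$;

  ssl_certificate     /etc/ssl/example.com/fullchain.pem;
  ssl_certificate_key /etc/ssl/example.com/privkey.pem;

  location / {
    proxy_pass http://127.0.0.1:8090;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    # Keep SSE/realtime connections alive
    proxy_http_version 1.1;
    proxy_set_header Connection '';
    proxy_read_timeout 1h;
  }
}
```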
20 package.json
@@ -32,18 +32,22 @@
     "semi": false,
     "useTabs": false,
     "singleQuote": true,
     "trailingComma": "all",
     "plugins": [
-      "./node_modules/prettier-plugin-organize-imports",
-      "./node_modules/prettier-plugin-svelte"
+      "./node_modules/prettier-plugin-organize-imports/index.js",
+      "./node_modules/prettier-plugin-svelte/plugin.js"
     ]
   },
   "devDependencies": {
-    "concurrently": "^7.4.0",
-    "patch-package": "^6.5.0",
-    "prettier": "^2.7.1",
-    "prettier-plugin-organize-imports": "^3.1.1",
-    "prettier-plugin-svelte": "^2.7.0",
-    "typescript": "^4.8.3"
+    "chokidar-cli": "^3.0.0",
+    "concurrently": "^8.2.1",
+    "patch-package": "^8.0.0",
+    "prettier": "^3.0.3",
+    "prettier-plugin-organize-imports": "^3.2.3",
+    "prettier-plugin-svelte": "^3.0.3",
+    "tslib": "^2.6.2",
+    "tsx": "^3.12.8",
+    "typescript": "^5.0"
   },
   "dependencies": {
     "postinstall-postinstall": "^2.1.0",
@@ -18,7 +18,7 @@ export const addDevCommand = (program: Command) => {
     .description('Build the JS bundle')
     .option(
       '--src <path>',
-      `Path to source (default: <project>/src/index.{ts|js})`
+      `Path to source (default: <project>/src/index.{ts|js})`,
     )
     .option('--dist <path>', `Path to dist (default: <project>/dist/index.js)`)
     .action(async (options) => {
@@ -22,7 +22,7 @@ export const addDevCommand = (program: Command) => {
     .description('Watch for source code changes in development mode')
     .option(
       '--src <path>',
-      `Path to source (default: <project>/src/index.{ts|js})`
+      `Path to source (default: <project>/src/index.{ts|js})`,
     )
     .option('--dist <path>', `Path to dist (default: <project>/dist/index.js)`)
     .option('--host', 'PocketBase host', DEFAULT_PB_DEV_URL)
@@ -70,7 +70,7 @@ export const addPublishCommand = (program: Command) => {
     .description('Publish JS bundle to PBScript-enabled PocketBase instance')
     .option(
       '--dist <src>',
-      `Path to dist bundle (default: <project>/dist/index.js)`
+      `Path to dist bundle (default: <project>/dist/index.js)`,
     )
     .option('--host <host>', `PocketBase host (default: ${DEFAULT_PB_DEV_URL})`)
     .action(async (options) => {
@@ -6,7 +6,7 @@ export type FieldStruct<TRec extends Pb_Any_Record_Db> = Partial<{
 }>

 export const buildQueryFilter = <TRec extends Pb_Any_Record_Db>(
-  fields: FieldStruct<TRec>
+  fields: FieldStruct<TRec>,
 ): Pb_QueryParams => {
   const filter = map(fields, (v, k) => `${k.toString()} = "${v}"`).join(' and ')
   return { filter }
@@ -5,14 +5,14 @@ import {
   Pb_Any_Record_Db,
   Pb_Untrusted_Db,
 } from '../schema/base'
-import { buildQueryFilter, FieldStruct } from './buildQueryFilter'
+import { FieldStruct, buildQueryFilter } from './buildQueryFilter'

 export const getOne = async <
   TRec extends Pb_Any_Record_Db,
-  TFields extends FieldStruct<TRec> = FieldStruct<TRec>
+  TFields extends FieldStruct<TRec> = FieldStruct<TRec>,
 >(
   collectionName: Pb_Any_Collection_Name,
-  fields: TFields
+  fields: TFields,
 ) => {
   const queryParams = buildQueryFilter(fields)
   const recs = await client.records.getList(collectionName, 1, 2, queryParams)
@@ -6,14 +6,14 @@ export const mergeDeep = <TObject>(dst: any, src: TObject) => {
       if (dst[k] === undefined) dst[k] = {}
       if (!isObject(dst[k])) {
         throw new Error(
-          `${k.toString()} is an object in default, but not in target`
+          `${k.toString()} is an object in default, but not in target`,
         )
       }
       dst[k] = mergeDeep(dst[k], v)
     } else {
       if (isObject(dst[k])) {
         throw new Error(
-          `${k.toString()} is an object in target, but not in default`
+          `${k.toString()} is an object in target, but not in default`,
         )
       }
       // The magic: if the target has no value for this field, use the
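The `mergeDeep` hunk above only adds trailing commas, but its comment hints at the helper's semantics: fill missing fields in a target from a defaults object, and throw when the two shapes disagree. A minimal self-contained sketch of that behavior (an illustration, not the library's exact implementation):

```typescript
// Sketch of deep-defaulting: copy values from `src` (defaults) into `dst`
// wherever `dst` has no value; recurse into nested objects; throw when one
// side has an object where the other has a scalar.
const isObject = (v: unknown): v is Record<string, unknown> =>
  typeof v === 'object' && v !== null && !Array.isArray(v)

const mergeDeepSketch = <T>(dst: any, src: T): T => {
  for (const [k, v] of Object.entries(src as Record<string, unknown>)) {
    if (isObject(v)) {
      if (dst[k] === undefined) dst[k] = {}
      if (!isObject(dst[k])) {
        throw new Error(`${k} is an object in default, but not in target`)
      }
      dst[k] = mergeDeepSketch(dst[k], v)
    } else {
      if (isObject(dst[k])) {
        throw new Error(`${k} is an object in target, but not in default`)
      }
      // If the target has no value for this field, use the default
      if (dst[k] === undefined) dst[k] = v
    }
  }
  return dst
}

// Existing target values win; missing ones are filled from the defaults.
const merged = mergeDeepSketch(
  { a: 1, nested: {} },
  { a: 99, b: 2, nested: { c: 3 } },
)
// merged.a === 1, merged.b === 2, merged.nested.c === 3
```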
@@ -2,7 +2,7 @@ import { UnsubFunc } from 'store/backend/types'
 import { client } from '../client'

 export const onAuthStateChanged = (
-  cb: (user: typeof client.authStore.model) => void
+  cb: (user: typeof client.authStore.model) => void,
 ): UnsubFunc => {
   setTimeout(() => cb(client.authStore.model), 0)
   return client.authStore.onChange(() => {
@@ -8,14 +8,14 @@ import {
   Pb_Untrusted_Db,
   Pb_UserFields,
 } from '../schema/base'
-import { buildQueryFilter, FieldStruct } from './buildQueryFilter'
+import { FieldStruct, buildQueryFilter } from './buildQueryFilter'
 import { mergeDeep } from './mergeDeep'

 export const upsert = async <TRow extends Pb_Any_Record_Db>(
   collectionName: Pb_Any_Collection_Name,
   filterFields: FieldStruct<TRow>,
   mutate: (draft: Draft<Pb_UserFields<TRow>>) => void,
-  defaultRec: Pb_UserFields<TRow>
+  defaultRec: Pb_UserFields<TRow>,
 ) => {
   const queryParams = buildQueryFilter(filterFields)
   const recs = await client.records.getList(collectionName, 1, 2, queryParams)
@@ -42,7 +42,7 @@ export const upsert = async <TRow extends Pb_Any_Record_Db>(
       }
       return carry
     },
-    {} as Partial<Pb_UserFields<TRow>>
+    {} as Partial<Pb_UserFields<TRow>>,
   )
   client.records.update(collectionName, id, final)
 }
@@ -50,7 +50,7 @@ export class CustomAuthStore extends BaseAuthStore {
   }
   exportToCookie(
     options?: SerializeOptions | undefined,
-    key?: string | undefined
+    key?: string | undefined,
   ): string {
     throw new Error(`Unsupported exportToCookie()`)
   }
@@ -1,6 +1,6 @@
 export function assertExists<TType>(
   v: TType,
-  message = `Value does not exist`
+  message = `Value does not exist`,
 ): asserts v is NonNullable<TType> {
   if (typeof v === 'undefined') {
     throw new Error(message)
@@ -20,11 +20,11 @@ export type ConnectionConfig = {

 export const ensureAdminClient = async (
   slug: string,
-  config: ConnectionConfig
+  config: ConnectionConfig,
 ) => {
   const saver = mkProjectSaver<ConnectionConfig>(slug)
   const client = pbClient(config, (session) =>
-    saver((config) => ({ ...config, session }))
+    saver((config) => ({ ...config, session })),
   )
   const _isAdmin = await isAdmin(client)
   if (_isAdmin) {
@@ -34,7 +34,7 @@ export const ensureAdminClient = async (
   const { host } = config

   console.log(
-    `You must be logged in to ${host}/_ as a PocketBase admin to continue.`
+    `You must be logged in to ${host}/_ as a PocketBase admin to continue.`,
   )

   while (true) {
@@ -55,7 +55,7 @@ export const ensureAdminClient = async (
           value.length > 0 ? true : `Enter a password`,
       },
     ],
-    { onCancel: () => die(`Exited.`) }
+    { onCancel: () => die(`Exited.`) },
   )
   const { username, password } = response
   try {
@@ -9,13 +9,13 @@ import { ConnectionConfig } from './ensureAdminClient'

 export const pbClient = (
   config: ConnectionConfig,
-  saver: SessionStateSaver
+  saver: SessionStateSaver,
 ) => {
   const { host, session } = config
   const client = new PocketBase(
     host,
     'en-US',
-    new CustomAuthStore(session, saver)
+    new CustomAuthStore(session, saver),
   )
   return client
 }
@@ -31,7 +31,7 @@ export const isAdmin = async (client: pocketbaseEs) => {

 export const adminPbClient = async (
   config: ConnectionConfig,
-  saver: SessionStateSaver
+  saver: SessionStateSaver,
 ) => {
   const client = pbClient(config, saver)
   if (!client.authStore.isValid) {
@@ -38,10 +38,10 @@ export const createCleanupManager = (slug?: string) => {
     (c, v) => {
       return c.then(() => v())
     },
-    Promise.resolve()
+    Promise.resolve(),
   ).catch((e) => {
     error(
-      `Cleanup functions are failing. This should never happen, check all cleanup functions to make sure they are trapping their exceptions.`
+      `Cleanup functions are failing. This should never happen, check all cleanup functions to make sure they are trapping their exceptions.`,
     )
     throw e
   })
@@ -58,7 +58,7 @@ export const createLogger = (config: Partial<Config>) => {
           return JSON.stringify(arg)
         }
         return arg
       })
-    })
+    }),
   )
 }
@@ -89,7 +89,7 @@ export const createLogger = (config: Partial<Config>) => {
     ;[..._buf.slice(_curIdx, MAX_BUF), ..._buf.slice(0, _curIdx)].forEach(
       (args) => {
         console.error(...args)
-      }
+      },
     )
     console.error(`========== ERROR TRACEBACK END ==============`)
   }
@@ -1,6 +1,6 @@
 export function assertExists<TType>(
   v: TType,
-  message = `Value does not exist`
+  message = `Value does not exist`,
 ): asserts v is NonNullable<TType> {
   if (typeof v === 'undefined') {
     throw new Error(message)
@@ -9,7 +9,7 @@ export function assertExists<TType>(

 export function assertTruthy<TType>(
   v: unknown,
-  message = `Value should be truthy`
+  message = `Value should be truthy`,
 ): asserts v is NonNullable<TType> {
   if (!v) {
     throw new Error(message)
@@ -5,10 +5,10 @@ import { SetReturnType } from 'type-fest'

 const limiters: { [lane: string]: Bottleneck } = {}
 export const serialAsyncExecutionGuard = <
-  T extends (...args: any[]) => Promise<any>
+  T extends (...args: any[]) => Promise<any>,
 >(
   cb: T,
-  lane?: SetReturnType<T, string>
+  lane?: SetReturnType<T, string>,
 ): T => {
   const uuid = uniqueId()
   const _lane = lane || (() => uuid)
@@ -27,10 +27,10 @@ export const serialAsyncExecutionGuard = <

 const singletons: { [_: string]: Promise<any> } = {}
 export const singletonAsyncExecutionGuard = <
-  T extends (...args: any[]) => Promise<any>
+  T extends (...args: any[]) => Promise<any>,
 >(
   cb: T,
-  key: SetReturnType<T, string>
+  key: SetReturnType<T, string>,
 ): T => {
   const uuid = uniqueId()
   const keyFactory = key || (() => uuid)
@@ -3,15 +3,15 @@ import { values } from '@s-libs/micro-dash'
 export type Unsubscribe = () => void

 export type EventSubscriber<TPayload> = (
-  cb: EventHandler<TPayload>
+  cb: EventHandler<TPayload>,
 ) => Unsubscribe
 export type EventEmitter<TPayload> = (
   payload: TPayload,
-  stopOnHandled?: boolean
+  stopOnHandled?: boolean,
 ) => Promise<boolean>
 export type EventHandler<TPayload> = (
   payload: TPayload,
-  isHandled: boolean
+  isHandled: boolean,
 ) => boolean | void | Promise<boolean | void>

 /**
@@ -20,7 +20,7 @@ export type EventHandler<TPayload> = (
  * @returns void
  */
 export const createEvent = <TPayload>(
-  defaultHandler?: EventHandler<TPayload>
+  defaultHandler?: EventHandler<TPayload>,
 ): [EventSubscriber<TPayload>, EventEmitter<TPayload>] => {
   let i = 0
   const callbacks: any = {}
@@ -1,11 +1,11 @@
+export * from './CleanupManager'
+export * from './Logger'
+export * from './TimerManager'
 export * from './assert'
 export * from './asyncExecutionGuard'
-export * from './CleanupManager'
 export * from './events'
-export * from './Logger'
 export * from './mkSingleton'
 export * from './newId'
 export * from './pocketbase-client-helpers'
 export * from './safeCatch'
 export * from './schema'
-export * from './TimerManager'
@@ -10,9 +10,9 @@ export type SingletonBaseConfig = {

 export const mkSingleton = <
   TConfig,
-  TApi extends SingletonApi | Promise<SingletonApi>
+  TApi extends SingletonApi | Promise<SingletonApi>,
 >(
-  factory: (config: TConfig) => TApi
+  factory: (config: TConfig) => TApi,
 ) => {
   let _service: TApi | undefined = undefined
   return (config?: TConfig) => {
@@ -6,11 +6,11 @@ import { logger } from '../Logger'
 import { newId } from '../newId'
 import { safeCatch } from '../safeCatch'
 import {
+  RPC_COLLECTION,
   RpcCommands,
   RpcFields,
   RpcRecord_Create,
   RpcStatus,
-  RPC_COLLECTION,
 } from '../schema'
 import type { WatchHelper } from './WatchHelper'
@@ -30,7 +30,7 @@ export const createRpcHelper = (config: RpcHelperConfig) => {

   const mkRpc = <TPayload extends JsonObject, TResult extends JsonObject>(
     cmd: RpcCommands,
-    schema: JSONSchemaType<TPayload>
+    schema: JSONSchemaType<TPayload>,
   ) => {
     type ConcreteRpcRecord = RpcFields<TPayload, TResult>
     const validator = new Ajv().compile(schema)
@@ -39,7 +39,7 @@ export const createRpcHelper = (config: RpcHelperConfig) => {
       logger(),
       async (
         payload: TPayload,
-        cb?: (data: RecordSubscription<ConcreteRpcRecord>) => void
+        cb?: (data: RecordSubscription<ConcreteRpcRecord>) => void,
       ) => {
         const _rpcLogger = _logger.create(cmd)
         const { dbg, error } = _rpcLogger
@@ -82,7 +82,7 @@ export const createRpcHelper = (config: RpcHelperConfig) => {
             reject(new ClientResponseError(data.record.result))
           }
         },
-        { initialFetch: false, pollIntervalMs: 100 }
+        { initialFetch: false, pollIntervalMs: 100 },
       )
       dbg(`Creating ${rpcIn.id}`)
       const newRpc = await client.collection(RPC_COLLECTION).create(rpcIn)
@@ -92,7 +92,7 @@ export const createRpcHelper = (config: RpcHelperConfig) => {
         reject(e)
       })
     })
-    }
+    },
   )
 }

@@ -1,9 +1,9 @@
 import type pocketbaseEs from 'pocketbase'
 import type { RecordSubscription, UnsubscribeFunc } from 'pocketbase'
 import { logger } from '../Logger'
+import { UnixTimestampMs, createTimerManager } from '../TimerManager'
 import { safeCatch } from '../safeCatch'
 import { BaseFields, RecordId } from '../schema'
-import { createTimerManager, UnixTimestampMs } from '../TimerManager'

 export type WatchHelperConfig = {
   client: pocketbaseEs
@@ -23,7 +23,7 @@ export const createWatchHelper = (config: WatchHelperConfig) => {
     collectionName: string,
     id: RecordId,
     cb: (data: RecordSubscription<TRec>, unsub: UnsubscribeFunc) => void,
-    options?: Partial<WatchConfig>
+    options?: Partial<WatchConfig>,
   ): Promise<UnsubscribeFunc> => {
     const { dbg } = logger().create(`watchById:${collectionName}:${id}`)
     const config: WatchConfig = {
@@ -89,7 +89,7 @@ export const createWatchHelper = (config: WatchHelperConfig) => {
     idName: keyof TRec,
     idValue: RecordId,
     cb: (data: RecordSubscription<TRec>) => void,
-    initialFetch = true
+    initialFetch = true,
   ): Promise<UnsubscribeFunc> => {
     let hasUpdate: { [_: RecordId]: boolean } = {}
     const unsub = client
@@ -112,7 +112,7 @@ export const createWatchHelper = (config: WatchHelperConfig) => {
         })
       }
       return unsub
-    }
+    },
   )

   return { watchById, watchAllById }
@@ -9,7 +9,7 @@ export const safeCatch = <TIn extends any[], TOut>(
   name: string,
   logger: Logger,
   cb: (...args: TIn) => Promise<TOut>,
-  timeoutMs = SAFECATCH_TTL_MS
+  timeoutMs = SAFECATCH_TTL_MS,
 ): ((...args: TIn) => Promise<TOut>) => {
   return async (...args: TIn) => {
     const uuid = `${name}:${nanoid()}`
@@ -29,17 +29,17 @@ export const safeCatch = <TIn extends any[], TOut>(
       if (e instanceof ClientResponseError) {
         if (e.status === 400) {
           dbg(
-            `PocketBase API error: It looks like you don't have permission to make this request. Raw error: ${e}. Payload: ${payload}`
+            `PocketBase API error: It looks like you don't have permission to make this request. Raw error: ${e}. Payload: ${payload}`,
           )
         } else if (e.status === 0) {
           dbg(
-            `Client request aborted (possible duplicate request or real error). Raw error: ${e}. Payload: ${payload}`
+            `Client request aborted (possible duplicate request or real error). Raw error: ${e}. Payload: ${payload}`,
           )
         } else if (e.status === 404) {
           dbg(`Record not found. Raw error: ${e}. Payload: ${payload}`)
         } else {
           dbg(
-            `Unknown PocketBase API error. Raw error: ${e}. Payload: ${payload}`
+            `Unknown PocketBase API error. Raw error: ${e}. Payload: ${payload}`,
           )
         }
       } else {
@@ -34,7 +34,7 @@ export type RpcPayloadBase = JsonObject

 export type RpcFields<
   TPayload extends RpcPayloadBase,
-  TRes extends JsonObject
+  TRes extends JsonObject,
 > = BaseFields & {
   userId: UserId
   cmd: string
@@ -3,7 +3,7 @@ export * from './Instance'
 export * from './InstanceLog'
 export * from './Invocation'
 export * from './Rpc'
-export * from './types'
 export * from './User'
+export * from './types'
 export * from './util'
 // gen:export
@@ -36,10 +36,9 @@
     "pocketbase": "^0.8.0",
     "semver": "^7.3.8",
     "sqlite": "^4.1.2",
-    "sqlite3": "^5.1.2",
+    "sqlite3": "^5.1.6",
     "tmp": "^0.2.1",
     "type-fest": "^3.3.0",
-    "url-pattern": "^1.0.3",
-    "tsx": "^3.11.0"
+    "url-pattern": "^1.0.3"
   }
 }
@@ -25,7 +25,7 @@ export const DAEMON_PB_MIGRATIONS_DIR = (() => {
   const v = env('DAEMON_PB_MIGRATIONS_DIR')
   if (!v) {
     throw new Error(
-      `DAEMON_PB_MIGRATIONS_DIR (${v}) environment variable must be specified`
+      `DAEMON_PB_MIGRATIONS_DIR (${v}) environment variable must be specified`,
    )
   }
   if (!existsSync(v)) {
@@ -38,7 +38,7 @@ export const DAEMON_PB_DATA_DIR = (() => {
   const v = env('DAEMON_PB_DATA_DIR')
   if (!v) {
     throw new Error(
-      `DAEMON_PB_DATA_DIR (${v}) environment variable must be specified`
+      `DAEMON_PB_DATA_DIR (${v}) environment variable must be specified`,
     )
   }
   if (!existsSync(v)) {
@@ -56,12 +56,12 @@ export const DAEMON_MAX_PORTS = envi(`DAEMON_MAX_PORTS`, 500)
 export const DAEMON_PB_BACKUP_SLEEP = envi(`DAEMON_PB_BACKUP_SLEEP`, 100)
 export const DAEMON_PB_BACKUP_PAGE_COUNT = envi(
   `DAEMON_PB_BACKUP_PAGE_COUNT`,
-  5
+  5,
 )

 export const PH_BIN_CACHE = env(
   `PH_BIN_CACHE`,
-  join(__dirname, `../../../.pbincache`)
+  join(__dirname, `../../../.pbincache`),
 )

 export const PH_FTP_PORT = envi('PH_FTP_PORT', 21)
@@ -25,6 +25,12 @@ import { portManager } from './services/PortManager'
 import { updaterService } from './services/UpdaterService/UpdaterService'
 // gen:import

+const [major, minor, patch] = process.versions.node.split('.').map(Number)
+
+if ((major || 0) < 18) {
+  throw new Error(`Node 18 or higher required.`)
+}
+
 loggerService({ debug: DEBUG, trace: TRACE, errorTrace: !DEBUG })

 // npm install eventsource --save
@@ -74,7 +80,7 @@ global.EventSource = require('eventsource')
         error(`migrate had an unexpected stop. Check it out`)
       },
     },
-    { logger }
+    { logger },
   )
 ).exited
 info(`Migrating done`)
@@ -91,7 +97,7 @@ global.EventSource = require('eventsource')
       error(`migrate had an unexpected stop. Check it out`)
     },
   },
-  { logger }
+  { logger },
 )

 /**
@@ -20,15 +20,15 @@ export const centralDbService = mkSingleton(

       const target = coreInternalUrl
       dbg(
-        `Forwarding proxy request for ${req.url} to central instance ${target}`
+        `Forwarding proxy request for ${req.url} to central instance ${target}`,
       )
       proxy.web(req, res, { target })
     },
-    `CentralDbService`
+    `CentralDbService`,
   )

     return {
       shutdown() {},
     }
   }
 },
 )
@@ -7,7 +7,7 @@ import {
   SSL_KEY,
 } from '$constants'
 import { clientService, createPbClient } from '$services'
-import { mkSingleton, SingletonBaseConfig } from '@pockethost/common'
+import { SingletonBaseConfig, mkSingleton } from '@pockethost/common'
 import { readFileSync } from 'fs'
 import { FtpSrv } from 'ftp-srv'
 import { PhFs } from './PhFs'
@@ -77,7 +77,7 @@ export const ftpService = mkSingleton((config: FtpConfig) => {
         reject(new Error(`Invalid username or password`))
         return
       }
-    }
+    },
   )

   ftpServer.listen().then(() => {
@@ -3,24 +3,24 @@ import { assert } from '$util'
 import { InstanceFields, Logger } from '@pockethost/common'
 import { compact, map } from '@s-libs/micro-dash'
 import {
+  Mode,
   constants,
   createReadStream,
   createWriteStream,
   existsSync,
   mkdirSync,
-  Mode,
 } from 'fs'
 import { FileStat, FileSystem, FtpConnection } from 'ftp-srv'
 import { customAlphabet } from 'nanoid'
 import { isAbsolute, join, normalize, resolve, sep } from 'path'
 import { PocketbaseClientApi } from '../clientService/PbClient'
-import * as fsAsync from './fs-async'
 import {
   FolderNames,
   INSTANCE_ROOT_FOLDER_NAMES,
-  isInstanceRootFolder,
   MAINTENANCE_ONLY_FOLDER_NAMES,
+  isInstanceRootFolder,
 } from './FtpService'
+import * as fsAsync from './fs-async'

 const nanoid = customAlphabet(`abcdefghijklmnop`)

@@ -51,7 +51,7 @@ export class PhFs implements FileSystem {
   constructor(
     connection: FtpConnection,
     client: PocketbaseClientApi,
-    logger: Logger
+    logger: Logger,
   ) {
     const cwd = `/`
     const root = DAEMON_PB_DATA_DIR
|
||||
dbg({ rootFolderName, instance })
|
||||
if (
|
||||
MAINTENANCE_ONLY_FOLDER_NAMES.includes(
|
||||
rootFolderName as FolderNames
|
||||
rootFolderName as FolderNames,
|
||||
) &&
|
||||
!instance.maintenance
|
||||
) {
|
||||
throw new Error(
|
||||
`Instance must be in maintenance mode to access ${rootFolderName}`
|
||||
`Instance must be in maintenance mode to access ${rootFolderName}`,
|
||||
)
|
||||
}
|
||||
fsPathParts.push(rootFolderName)
|
||||
@ -122,7 +122,7 @@ export class PhFs implements FileSystem {
|
||||
const rootFolderFsPath = resolve(
|
||||
join(...fsPathParts)
|
||||
.replace(UNIX_SEP_REGEX, sep)
|
||||
.replace(WIN_SEP_REGEX, sep)
|
||||
.replace(WIN_SEP_REGEX, sep),
|
||||
)
|
||||
if (!existsSync(rootFolderFsPath)) {
|
||||
mkdirSync(rootFolderFsPath)
|
||||
@ -137,7 +137,7 @@ export class PhFs implements FileSystem {
|
||||
const fsPath = resolve(
|
||||
join(...fsPathParts)
|
||||
.replace(UNIX_SEP_REGEX, sep)
|
||||
.replace(WIN_SEP_REGEX, sep)
|
||||
.replace(WIN_SEP_REGEX, sep),
|
||||
)
|
||||
|
||||
// Create FTP client path using unix separator
|
||||
@ -210,7 +210,7 @@ export class PhFs implements FileSystem {
|
||||
*/
|
||||
if (!instance) {
|
||||
throw new Error(
|
||||
`Something as gone wrong. An instance without a subdomain is not possible.`
|
||||
`Something as gone wrong. An instance without a subdomain is not possible.`,
|
||||
)
|
||||
}
|
||||
|
||||
@@ -247,7 +247,7 @@ export class PhFs implements FileSystem {
             })
           })
           .catch(() => null)
-        })
+        }),
       )
     })
     .then(compact)
@@ -322,7 +322,7 @@ export class PhFs implements FileSystem {

   async write(
     fileName: string,
-    options?: { append?: boolean | undefined; start?: any } | undefined
+    options?: { append?: boolean | undefined; start?: any } | undefined,
   ) {
     const { dbg, error } = this.log
       .create(`write`)
@@ -366,7 +366,7 @@ export class PhFs implements FileSystem {

   async read(
     fileName: string,
-    options: { start?: any } | undefined
+    options: { start?: any } | undefined,
   ): Promise<any> {
     const { dbg, error } = this.log
       .create(`read`)
@@ -374,9 +374,8 @@ export class PhFs implements FileSystem {
       .breadcrumb(fileName)
     dbg(`read`)

-    const { fsPath, clientPath, pathFromRootFolder } = await this._resolvePath(
-      fileName
-    )
+    const { fsPath, clientPath, pathFromRootFolder } =
+      await this._resolvePath(fileName)

     const { start } = options || {}

@@ -433,9 +432,8 @@ export class PhFs implements FileSystem {
      .breadcrumb(path)
     dbg(`mkdir`)

-    const { fsPath, clientPath, pathFromRootFolder } = await this._resolvePath(
-      path
-    )
+    const { fsPath, clientPath, pathFromRootFolder } =
+      await this._resolvePath(path)

     /*
     Disallow making directories if not inside root folder
@@ -485,7 +483,7 @@ export class PhFs implements FileSystem {
       Promise.all([
         this.restartInstanceGuard(fromRootFolderName, instance),
         this.restartInstanceGuard(toRootFolderName, instance),
-      ])
+      ]),
     )
   }

@@ -497,9 +495,8 @@ export class PhFs implements FileSystem {
      .breadcrumb(mode.toString())
     dbg(`chmod`)

-    const { fsPath, clientPath, pathFromRootFolder } = await this._resolvePath(
-      path
-    )
+    const { fsPath, clientPath, pathFromRootFolder } =
+      await this._resolvePath(path)

     /*
     Disallow making directories if not inside root folder
@@ -517,7 +514,7 @@ export class PhFs implements FileSystem {

   async restartInstanceGuard(
     rootFolderName: FolderNames | undefined,
-    instance: InstanceFields
+    instance: InstanceFields,
   ) {
     // Not needed?
     // const { dbg, error } = this.log
@ -10,4 +10,4 @@ const mkdir = promisify(fs.mkdir)
|
||||
const rename = promisify(fs.rename)
|
||||
const chmod = promisify(fs.chmod)
|
||||
|
||||
export { stat, readdir, access, unlink, rmdir, mkdir, rename, chmod }
|
||||
export { access, chmod, mkdir, readdir, rename, rmdir, stat, unlink }
|
||||
|
@@ -2,11 +2,11 @@ import { SqliteChangeEvent, sqliteService } from '$services'
import {
InstanceLogFields,
InstanceLogFields_Create,
+RecordId,
+StreamNames,
newId,
pocketNow,
-RecordId,
safeCatch,
-StreamNames,
} from '@pockethost/common'
import knex from 'knex'
import { AsyncReturnType } from 'type-fest'
@@ -15,7 +15,7 @@ import { DaemonContext } from './DaemonContext'
export type SqliteLogger = AsyncReturnType<typeof createSqliteLogger>
export const createSqliteLogger = async (
logDbPath: string,
-context: DaemonContext
+context: DaemonContext,
) => {
const { parentLogger } = context
const _dbLogger = parentLogger.create(`${logDbPath}`)
@@ -46,7 +46,7 @@ export const createSqliteLogger = async (
const sql = conn('logs').insert(_in).toString()
trace(`Writing log ${JSON.stringify(_in)} ${sql}`)
await db.exec(sql)
}
-}
+},
)

const subscribe = (cb: (e: SqliteChangeEvent<InstanceLogFields>) => void) => {
@@ -66,7 +66,7 @@ export const createSqliteLogger = async (

const fetch = async (limit: number = 100) => {
return db.all<InstanceLogFields[]>(
-`select * from logs order by created desc limit ${limit}`
+`select * from logs order by created desc limit ${limit}`,
)
}
@@ -17,7 +17,7 @@ const instances: {

export const createInstanceLogger = async (
instanceId: InstanceId,
-context: DaemonContext
+context: DaemonContext,
) => {
const { parentLogger } = context
const _instanceLogger = parentLogger.create(`InstanceLogger`)
@@ -31,7 +31,7 @@ export const createInstanceLogger = async (
DAEMON_PB_DATA_DIR,
instanceId,
'pb_data',
-'instance_logs.db'
+'instance_logs.db',
)

dbg(`logs path`, logDbPath)
@@ -69,5 +69,5 @@ export const instanceLoggerService = mkSingleton(
dbg(`Shutting down`)
},
}
-}
+},
)

@@ -43,7 +43,7 @@ export const createDenoProcess = async (config: DenoProcessConfig) => {

const denoWrite = (
message: string,
-stream: StreamNames = StreamNames.Info
+stream: StreamNames = StreamNames.Info,
) => {
dbg(`[${instance.id}:${path}:${stream}] ${message}`)
return denoLogger.write(message, stream)
@@ -76,12 +76,12 @@ export const createDenoProcess = async (config: DenoProcessConfig) => {
if (code) {
await denoWrite(
`Unexpected 'deno' exit code: ${code}.`,
-StreamNames.Error
+StreamNames.Error,
)
} else {
await denoWrite(
`Worker has exited with code ${code}`,
-StreamNames.System
+StreamNames.System,
)
}
resolve()
@@ -111,13 +111,13 @@ export const instanceService = mkSingleton(
const { id, subdomain, version } = instance

const systemInstanceLogger = instanceServiceLogger.create(
-`${subdomain}:${id}:${version}`
+`${subdomain}:${id}:${version}`,
)
const { dbg, warn, error, info } = systemInstanceLogger

if (instanceApis[id]) {
throw new Error(
-`Attempted to create an instance API when one is already available for ${id}`
+`Attempted to create an instance API when one is already available for ${id}`,
)
}

@@ -154,7 +154,7 @@ export const instanceService = mkSingleton(
internalUrl: () => {
if (status !== InstanceApiStatus.Healthy) {
throw new Error(
-`Attempt to access instance URL when instance is not in a healthy state.`
+`Attempt to access instance URL when instance is not in a healthy state.`,
)
}
return internalUrl
@@ -162,7 +162,7 @@ export const instanceService = mkSingleton(
startRequest: () => {
if (status !== InstanceApiStatus.Healthy) {
throw new Error(
-`Attempt to start an instance request when instance is not in a healthy state.`
+`Attempt to start an instance request when instance is not in a healthy state.`,
)
}
return startRequest()
@@ -193,7 +193,7 @@ export const instanceService = mkSingleton(
const healthyGuard = () => {
if (status !== InstanceApiStatus.ShuttingDown) return
throw new Error(
-`HealthyGuard detected instance is shutting down. Aborting further initialization.`
+`HealthyGuard detected instance is shutting down. Aborting further initialization.`,
)
}

@@ -202,7 +202,7 @@ export const instanceService = mkSingleton(
*/
const clientLimiter = new Bottleneck({ maxConcurrent: 1 })
const updateInstanceStatus = clientLimiter.wrap(
-client.updateInstanceStatus
+client.updateInstanceStatus,
)
const updateInstance = clientLimiter.wrap(client.updateInstance)
const createInvocation = clientLimiter.wrap(client.createInvocation)
@@ -235,15 +235,15 @@ export const instanceService = mkSingleton(
instance.id,
{
parentLogger: systemInstanceLogger,
-}
+},
)

const writeUserLog = serialAsyncExecutionGuard(
userInstanceLogger.write,
-() => `${instance.id}:userLog`
+() => `${instance.id}:userLog`,
)
shutdownManager.add(() =>
-writeUserLog(`Shutting down instance`).catch(error)
+writeUserLog(`Shutting down instance`).catch(error),
)

/*
@@ -272,7 +272,7 @@ export const instanceService = mkSingleton(
version,
onUnexpectedStop: (code, stdout, stderr) => {
warn(
-`PocketBase processes exited unexpectedly with ${code}. Putting in maintenance mode.`
+`PocketBase processes exited unexpectedly with ${code}. Putting in maintenance mode.`,
)
warn(stdout)
warn(stderr)
@@ -282,24 +282,24 @@ export const instanceService = mkSingleton(
})
await writeUserLog(
`Putting instance in maintenance mode because it shut down with return code ${code}. `,
-StreamNames.Error
+StreamNames.Error,
)
await Promise.all(
stdout.map((data) =>
-writeUserLog(data, StreamNames.Error).catch(error)
-)
+writeUserLog(data, StreamNames.Error).catch(error),
+),
)
await Promise.all(
stderr.map((data) =>
-writeUserLog(data, StreamNames.Error).catch(error)
-)
+writeUserLog(data, StreamNames.Error).catch(error),
+),
)
})
setImmediate(() => {
_safeShutdown(
new Error(
-`PocketBase processes exited unexpectedly with ${code}. Putting in maintenance mode.`
-)
+`PocketBase processes exited unexpectedly with ${code}. Putting in maintenance mode.`,
+),
).catch(error)
})
},
@@ -308,7 +308,7 @@ export const instanceService = mkSingleton(
} catch (e) {
warn(`Error spawning: ${e}`)
throw new Error(
-`Could not launch PocketBase ${instance.version}. It may be time to upgrade.`
+`Could not launch PocketBase ${instance.version}. It may be time to upgrade.`,
)
}
})()
@@ -340,7 +340,7 @@ export const instanceService = mkSingleton(
DAEMON_PB_DATA_DIR,
instance.id,
`worker`,
-`index.ts`
+`index.ts`,
)
dbg(`Checking ${workerPath} for a worker entry point`)
if (existsSync(workerPath)) {
@@ -400,7 +400,7 @@ export const instanceService = mkSingleton(
}
return true
}),
-RECHECK_TTL
+RECHECK_TTL,
)
}

@@ -413,7 +413,7 @@ export const instanceService = mkSingleton(
warn(`_pingInvocation failed with ${e}`)
return true
}),
-1000
+1000,
)
}

@@ -437,7 +437,7 @@ export const instanceService = mkSingleton(
.catch((e: ClientResponseError) => {
if (e.status !== 404) {
throw new Error(
-`Unexpected response ${JSON.stringify(e)} from mothership`
+`Unexpected response ${JSON.stringify(e)} from mothership`,
)
}
return []
@@ -450,7 +450,7 @@ export const instanceService = mkSingleton(
{
dbg(`Trying to get instance by subdomain: ${idOrSubdomain}`)
const [instance, owner] = await client.getInstanceBySubdomain(
-idOrSubdomain
+idOrSubdomain,
)
if (instance && owner) {
dbg(`${idOrSubdomain} is a subdomain`)
@@ -472,14 +472,14 @@ export const instanceService = mkSingleton(
if (instanceIdOrSubdomain === PUBLIC_APP_DB) return

const { instance, owner } = await getInstanceByIdOrSubdomain(
-instanceIdOrSubdomain
+instanceIdOrSubdomain,
)
if (!owner) {
throw new Error(`Instance owner is invalid`)
}
if (!instance) {
throw new Error(
-`Subdomain ${instanceIdOrSubdomain} does not resolve to an instance`
+`Subdomain ${instanceIdOrSubdomain} does not resolve to an instance`,
)
}

@@ -489,7 +489,7 @@ export const instanceService = mkSingleton(
dbg(`Checking for maintenance mode`)
if (instance.maintenance) {
throw new Error(
-`This instance is in Maintenance Mode. See https://pockethost.gitbook.io/manual/daily-usage/maintenance for more information.`
+`This instance is in Maintenance Mode. See https://pockethost.gitbook.io/manual/daily-usage/maintenance for more information.`,
)
}

@@ -499,7 +499,7 @@ export const instanceService = mkSingleton(
dbg(`Checking for verified account`)
if (!owner?.verified) {
throw new Error(
-`Log in at ${PUBLIC_APP_PROTOCOL}://${PUBLIC_APP_DOMAIN} to verify your account.`
+`Log in at ${PUBLIC_APP_PROTOCOL}://${PUBLIC_APP_DOMAIN} to verify your account.`,
)
}

@@ -513,12 +513,12 @@ export const instanceService = mkSingleton(
dbg(
`Forwarding proxy request for ${
req.url
-} to instance ${api.internalUrl()}`
+} to instance ${api.internalUrl()}`,
)

proxy.web(req, res, { target: api.internalUrl() })
},
-`InstanceService`
+`InstanceService`,
)

const { getNextPort } = await portManager()
@@ -532,5 +532,5 @@ export const instanceService = mkSingleton(
const getInstanceApiIfExistsById = (id: InstanceId) => instanceApis[id]

return { shutdown, getInstanceApiIfExistsById }
-}
+},
)
@@ -2,8 +2,8 @@ import { DAEMON_PB_DATA_DIR, DAEMON_PB_MIGRATIONS_DIR } from '$constants'
import { mkInternalAddress, mkInternalUrl, tryFetch } from '$util'
import { createCleanupManager, createTimerManager } from '@pockethost/common'
import {
-mkSingleton,
SingletonBaseConfig,
+mkSingleton,
} from '@pockethost/common/src/mkSingleton'
import { spawn } from 'child_process'
import { existsSync } from 'fs'
@@ -23,7 +23,7 @@ export type SpawnConfig = {
onUnexpectedStop: (
code: number | null,
stdout: string[],
-stderr: string[]
+stderr: string[],
) => void
}
export type PocketbaseServiceApi = AsyncReturnType<
@@ -49,7 +49,7 @@ function pidIsRunning(pid: number) {
}

export const createPocketbaseService = async (
-config: PocketbaseServiceConfig
+config: PocketbaseServiceConfig,
) => {
const { logger } = config
const _serviceLogger = logger.create('PocketbaseService')
@@ -77,7 +77,7 @@ export const createPocketbaseService = async (
const bin = realVersion.binPath
if (!existsSync(bin)) {
throw new Error(
-`PocketBase binary (${bin}) not found. Contact pockethost.io.`
+`PocketBase binary (${bin}) not found. Contact pockethost.io.`,
)
}

@@ -93,7 +93,7 @@ export const createPocketbaseService = async (
args.push(
isMothership
? DAEMON_PB_MIGRATIONS_DIR
-: `${DAEMON_PB_DATA_DIR}/${slug}/pb_migrations`
+: `${DAEMON_PB_DATA_DIR}/${slug}/pb_migrations`,
)
}
if (command === 'serve') {
@@ -157,7 +157,7 @@ export const createPocketbaseService = async (
const { pid } = ls
if (!pid) {
throw new Error(
-`Attempt to kill a PocketBase process that was never running.`
+`Attempt to kill a PocketBase process that was never running.`,
)
}
const p = new Promise<boolean>((resolve, reject) => {
@@ -38,8 +38,8 @@ export const portManager = mkSingleton(async (cfg: PortManagerConfig) => {
const removed = remove(exclude, (v) => v === newPort)
dbg(
`Removed ${removed.join(',')} from excluded ports: ${exclude.join(
-','
-)}`
+',',
+)}`,
)
},
]

@@ -22,7 +22,7 @@ export type ProxyMiddleware = (
proxy: Server
host: string
},
-logger: Logger
+logger: Logger,
) => void | Promise<void>

export type ProxyServiceConfig = SingletonBaseConfig & {
@@ -44,7 +44,7 @@ export const proxyService = mkSingleton(async (config: ProxyServiceConfig) => {
dbg(`Incoming request ${req.method} ${req.headers.host}/${req.url}`)
if (!req.headers.host?.endsWith(PUBLIC_APP_DOMAIN)) {
warn(
-`Request for ${req.headers.host} rejected because host does not end in ${PUBLIC_APP_DOMAIN}`
+`Request for ${req.headers.host} rejected because host does not end in ${PUBLIC_APP_DOMAIN}`,
)
res.writeHead(502, {
'Content-Type': `text/plain`,
@@ -54,7 +54,7 @@ export const proxyService = mkSingleton(async (config: ProxyServiceConfig) => {
}
{
const { warn } = _proxyLogger.create(
-`${req.method} ${req.headers.host}/${req.url}`
+`${req.method} ${req.headers.host}/${req.url}`,
)
try {
for (let i = 0; i < middleware.length; i++) {
@@ -94,7 +94,7 @@ export const proxyService = mkSingleton(async (config: ProxyServiceConfig) => {
subdomainFilter: string | ((subdomain: string) => boolean),
urlFilters: string | string[],
handler: ProxyMiddleware,
-handlerName: string
+handlerName: string,
) => {
const _handlerLogger = _proxyLogger.create(`${handlerName}`)
const { dbg, trace } = _handlerLogger
@@ -149,7 +149,7 @@ export const proxyService = mkSingleton(async (config: ProxyServiceConfig) => {
req,
res,
{ host, subdomain, coreInternalUrl, proxy },
-_requestLogger
+_requestLogger,
)
})
}
@@ -59,7 +59,7 @@ export const realtimeLog = mkSingleton(async (config: RealtimeLogConfig) => {
res.setHeader('Access-Control-Allow-Methods', 'POST, OPTIONS')
res.setHeader(
'Access-Control-Allow-Headers',
-'authorization,content-type,cache-control'
+'authorization,content-type,cache-control',
)
res.setHeader('Access-Control-Max-Age', 86400)
res.statusCode = 204
@@ -106,7 +106,7 @@ export const realtimeLog = mkSingleton(async (config: RealtimeLogConfig) => {
.getOne<InstanceFields>(instanceId)
if (!instance) {
throw new Error(
-`instanceId ${instanceId} not found for user ${user.id}`
+`instanceId ${instanceId} not found for user ${user.id}`,
)
}
dbg(`Instance is `, instance)
@@ -142,14 +142,14 @@ export const realtimeLog = mkSingleton(async (config: RealtimeLogConfig) => {
const evt = mkEvent(`log`, record)
trace(
`Dispatching SSE log event from ${instance.subdomain} (${instance.id})`,
-evt
+evt,
)
limiter.schedule(() => write(evt)).catch(error)
})
req.on('close', () => {
limiter.stop()
dbg(
-`SSE request for ${instance.subdomain} (${instance.id}) closed. Unsubscribing.`
+`SSE request for ${instance.subdomain} (${instance.id}) closed. Unsubscribing.`,
)
unsub()
})
@@ -172,7 +172,7 @@ export const realtimeLog = mkSingleton(async (config: RealtimeLogConfig) => {
const evt = mkEvent(`log`, rec)
trace(
`Dispatching SSE initial log event from ${instance.subdomain} (${instance.id})`,
-evt
+evt,
)
return write(evt)
})
@@ -186,7 +186,7 @@ export const realtimeLog = mkSingleton(async (config: RealtimeLogConfig) => {
.catch(error)
}
},
-`RealtimeLogService`
+`RealtimeLogService`,
)

return {
@@ -1,12 +1,12 @@
import { clientService } from '$services'
import {
-assertTruthy,
-mkSingleton,
+RPC_COMMANDS,
RpcCommands,
RpcFields,
RpcStatus,
-RPC_COMMANDS,
SingletonBaseConfig,
+assertTruthy,
+mkSingleton,
} from '@pockethost/common'
import { isObject } from '@s-libs/micro-dash'
import Ajv, { JSONSchemaType, ValidateFunction } from 'ajv'
@@ -22,12 +22,12 @@ export type KnexApi = ReturnType<typeof knexFactory>
export type CommandModuleInitializer = (
register: RpcServiceApi['registerCommand'],
client: pocketbaseEs,
-knex: KnexApi
+knex: KnexApi,
) => void

export type RpcRunner<
TPayload extends JsonObject,
-TResult extends JsonObject
+TResult extends JsonObject,
> = (job: RpcFields<TPayload, TResult>) => Promise<TResult>

export type RpcServiceConfig = SingletonBaseConfig & {}
@@ -58,8 +58,8 @@ export const rpcService = mkSingleton(async (config: RpcServiceConfig) => {
if (!RPC_COMMANDS.find((c) => c === cmd)) {
throw new Error(
`RPC command '${cmd}' is invalid. It must be one of: ${RPC_COMMANDS.join(
-'|'
-)}.`
+'|',
+)}.`,
)
}
return cmd as RpcCommands
@@ -76,7 +76,7 @@ export const rpcService = mkSingleton(async (config: RpcServiceConfig) => {
const { validate, run } = handler
if (!validate(payload)) {
throw new Error(
-`Payload for ${cmd} fails validation: ${JSON.stringify(payload)}`
+`Payload for ${cmd} fails validation: ${JSON.stringify(payload)}`,
)
}
dbg(`Running RPC ${rpc.id}`, rpc)
@@ -115,11 +115,11 @@ export const rpcService = mkSingleton(async (config: RpcServiceConfig) => {

const registerCommand = <
TPayload extends JsonObject,
-TResult extends JsonObject
+TResult extends JsonObject,
>(
commandName: RpcCommands,
schema: JSONSchemaType<TPayload>,
-runner: RpcRunner<TPayload, TResult>
+runner: RpcRunner<TPayload, TResult>,
) => {
if (jobHandlers[commandName]) {
throw new Error(`${commandName} job handler already registered.`)
@@ -20,9 +20,9 @@ import {
type SetInstanceMaintenanceResult,
} from '@pockethost/common'
import { valid, validRange } from 'semver'
-import { clientService } from '../clientService/clientService'
import { instanceService } from '../InstanceService/InstanceService'
import { updaterService } from '../UpdaterService/UpdaterService'
+import { clientService } from '../clientService/clientService'
import { rpcService } from './RpcService'

export const registerRpcCommands = async (logger: Logger) => {
@@ -48,7 +48,7 @@ export const registerRpcCommands = async (logger: Logger) => {
maintenance: false,
})
return { instance }
-}
+},
)

registerCommand<SaveVersionPayload, SaveVersionResult>(
@@ -65,7 +65,7 @@ export const registerRpcCommands = async (logger: Logger) => {
}
await client.updateInstance(instanceId, { version })
return { status: 'ok' }
-}
+},
)

registerCommand<SaveSecretsPayload, SaveSecretsResult>(
@@ -76,7 +76,7 @@ export const registerRpcCommands = async (logger: Logger) => {
const { instanceId, secrets } = payload
await client.updateInstance(instanceId, { secrets })
return { status: 'ok' }
-}
+},
)

registerCommand<RenameInstancePayload, RenameInstanceResult>(
@@ -90,7 +90,7 @@ export const registerRpcCommands = async (logger: Logger) => {
await client.updateInstance(instanceId, { subdomain })
dbg(`Instance updated successfully `)
return {}
-}
+},
)

registerCommand<SetInstanceMaintenancePayload, SetInstanceMaintenanceResult>(
@@ -112,7 +112,7 @@ export const registerRpcCommands = async (logger: Logger) => {
}
}
return {}
-}
+},
)

// gen:command
@@ -5,13 +5,13 @@ import {
serialAsyncExecutionGuard,
SingletonBaseConfig,
} from '@pockethost/common'
-import { Database as SqliteDatabase, open } from 'sqlite'
+import { open, Database as SqliteDatabase } from 'sqlite'
import { Database } from 'sqlite3'
import { JsonObject } from 'type-fest'

export type SqliteUnsubscribe = () => void
export type SqliteChangeHandler<TRecord extends JsonObject> = (
-e: SqliteChangeEvent<TRecord>
+e: SqliteChangeEvent<TRecord>,
) => void
export type SqliteEventType = 'update' | 'insert' | 'delete'
export type SqliteChangeEvent<TRecord extends JsonObject> = {
@@ -25,7 +25,7 @@ export type SqliteServiceApi = {
migrate: SqliteDatabase['migrate']
exec: SqliteDatabase['exec']
subscribe: <TRecord extends JsonObject>(
-cb: SqliteChangeHandler<TRecord>
+cb: SqliteChangeHandler<TRecord>,
) => SqliteUnsubscribe
}
export type SqliteServiceConfig = SingletonBaseConfig & {}
@@ -43,7 +43,7 @@ export const sqliteService = mkSingleton((config: SqliteServiceConfig) => {
This function
*/
const _unsafe_getDatabase = async (
-filename: string
+filename: string,
): Promise<SqliteServiceApi> => {
const _dbLogger = logger.create(`SqliteService`)
_dbLogger.breadcrumb(filename)
@@ -62,7 +62,7 @@ export const sqliteService = mkSingleton((config: SqliteServiceConfig) => {
eventType: SqliteEventType,
database: string,
table: string,
-rowId: number
+rowId: number,
) => {
trace(`Got a raw change event`, {
eventType,
@@ -73,7 +73,7 @@ export const sqliteService = mkSingleton((config: SqliteServiceConfig) => {
if (eventType === 'delete') return // Not supported

const record = await db.get(
-`select * from ${table} where rowid = '${rowId}'`
+`select * from ${table} where rowid = '${rowId}'`,
)
const e: SqliteChangeEvent<any> = {
table,
@@ -81,7 +81,7 @@ export const sqliteService = mkSingleton((config: SqliteServiceConfig) => {
record,
}
fireChange(e)
-}
+},
)

cm.add(() => {
@@ -110,7 +110,7 @@ export const sqliteService = mkSingleton((config: SqliteServiceConfig) => {
}
const getDatabase = serialAsyncExecutionGuard(
_unsafe_getDatabase,
-(fileName) => fileName
+(fileName) => fileName,
)

const shutdown = async () => {
@@ -1,9 +1,9 @@
import { downloadAndExtract, smartFetch } from '$util'
import {
+SingletonBaseConfig,
createCleanupManager,
createTimerManager,
mkSingleton,
-SingletonBaseConfig,
} from '@pockethost/common'
import { keys } from '@s-libs/micro-dash'
import { chmodSync, existsSync } from 'fs'
@@ -49,7 +49,7 @@ export const updaterService = mkSingleton(
const check = async () => {
const releases = await smartFetch<Releases>(
`https://api.github.com/repos/pocketbase/pocketbase/releases?per_page=100`,
-join(cachePath, `releases.json`)
+join(cachePath, `releases.json`),
)
// dbg({ releases })

@@ -77,7 +77,7 @@ export const updaterService = mkSingleton(
await Promise.all(promises)
if (keys(binPaths).length === 0) {
throw new Error(
-`No version found, probably mismatched architecture and OS (${osName}/${cpuArchitecture})`
+`No version found, probably mismatched architecture and OS (${osName}/${cpuArchitecture})`,
)
}
maxVersion = `~${rsort(keys(binPaths))[0]}`
@@ -94,7 +94,7 @@ export const updaterService = mkSingleton(
const version = maxSatisfying(keys(binPaths), semVer)
if (!version)
throw new Error(
-`No version satisfies ${semVer} (${keys(binPaths).join(', ')})`
+`No version satisfies ${semVer} (${keys(binPaths).join(', ')})`,
)
const binPath = binPaths[version]
if (!binPath) throw new Error(`binPath for ${version} not found`)
@@ -109,5 +109,5 @@ export const updaterService = mkSingleton(
getVersion,
shutdown: async () => {},
}
-}
+},
)
@@ -1,12 +1,12 @@
import {
-assertExists,
+INSTANCE_COLLECTION,
InstanceFields,
InstanceFields_Create,
InstanceId,
InstanceStatus,
-INSTANCE_COLLECTION,
-safeCatch,
UserFields,
+assertExists,
+safeCatch,
} from '@pockethost/common'
import { reduce } from '@s-libs/micro-dash'
import Bottleneck from 'bottleneck'
@@ -25,7 +25,7 @@ export const createInstanceMixin = (context: MixinContext) => {
const resetInstances = safeCatch(`resetRpcs`, logger, async () =>
rawDb(INSTANCE_COLLECTION).update<InstanceFields>({
status: InstanceStatus.Idle,
-})
+}),
)

const createInstance = safeCatch(
@@ -35,7 +35,7 @@ export const createInstanceMixin = (context: MixinContext) => {
return client
.collection(INSTANCE_COLLECTION)
.create<InstanceFields>(payload)
-}
+},
)

const getInstanceBySubdomain = safeCatch(
@@ -57,12 +57,12 @@ export const createInstanceMixin = (context: MixinContext) => {
.then((user) => {
return [instance, user]
})
-})
+}),
)

const getInstanceById = async (
instanceId: InstanceId,
-context?: AsyncContext
+context?: AsyncContext,
): Promise<[InstanceFields, UserFields] | []> =>
client
.collection(INSTANCE_COLLECTION)
@@ -86,7 +86,7 @@ export const createInstanceMixin = (context: MixinContext) => {
logger,
async (instanceId: InstanceId, fields: Partial<InstanceFields>) => {
await client.collection(INSTANCE_COLLECTION).update(instanceId, fields)
-}
+},
)

const updateInstanceStatus = safeCatch(
@@ -94,7 +94,7 @@ export const createInstanceMixin = (context: MixinContext) => {
logger,
async (instanceId: InstanceId, status: InstanceStatus) => {
await updateInstance(instanceId, { status })
-}
+},
)

const getInstance = safeCatch(
@@ -104,7 +104,7 @@ export const createInstanceMixin = (context: MixinContext) => {
return client
.collection(INSTANCE_COLLECTION)
.getOne<InstanceFields>(instanceId)
-}
+},
)

const getInstances = safeCatch(`getInstances`, logger, async () => {
@@ -129,14 +129,14 @@ export const createInstanceMixin = (context: MixinContext) => {
return client
.collection(INSTANCE_COLLECTION)
.update(r.id, toUpdate)
-})
+}),
)
return c
},
-[] as Promise<void>[]
+[] as Promise<void>[],
)
await Promise.all(promises)
-}
+},
)

const updateInstanceSeconds = safeCatch(
@@ -156,7 +156,7 @@ export const createInstanceMixin = (context: MixinContext) => {
assertExists(row, `Expected row here`)
const secondsThisMonth = row.t
await updateInstance(instanceId, { secondsThisMonth })
-}
+},
)

return {
@@ -9,7 +9,7 @@ import { MixinContext } from './PbClient'

export const createInvocationMixin = (
context: MixinContext,
-instanceApi: InstanceApi
+instanceApi: InstanceApi,
) => {
const { logger } = context
const { dbg } = logger.create('InvocationMixin')
@@ -32,7 +32,7 @@ export const createInvocationMixin = (
$cancelKey: `createInvocation:${instance.id}:${pid}`,
})
return _inv
-}
+},
)

const pingInvocation = safeCatch(
@@ -49,7 +49,7 @@ export const createInvocationMixin = (
.update<InvocationFields>(invocation.id, toUpdate)
await instanceApi.updateInstanceSeconds(invocation.instanceId)
return _inv
-}
+},
)

const finalizeInvocation = safeCatch(
@@ -69,7 +69,7 @@ export const createInvocationMixin = (
.update<InvocationFields>(invocation.id, toUpdate)
await instanceApi.updateInstanceSeconds(invocation.instanceId)
return _inv
-}
+},
)

return { finalizeInvocation, pingInvocation, createInvocation }
@@ -18,7 +18,7 @@ export const createPbClient = (url: string, logger: Logger) => {
info(`Initializing client: ${url}`)
const rawDb = createRawPbClient(
`${DAEMON_PB_DATA_DIR}/${PUBLIC_APP_DB}/pb_data/data.db`,
-_clientLogger
+_clientLogger,
)

const client = new PocketBase(url)
@@ -27,7 +27,7 @@ export const createPbClient = (url: string, logger: Logger) => {
`adminAuthViaEmail`,
_clientLogger,
(email: string, password: string) =>
-client.admins.authWithPassword(email, password)
+client.admins.authWithPassword(email, password),
)

const createFirstAdmin = safeCatch(
@@ -40,7 +40,7 @@ export const createPbClient = (url: string, logger: Logger) => {
console.log({ email, password })
console.log(JSON.stringify(res, null, 2))
return res
-})
+}),
)

const context: MixinContext = { client, rawDb, logger: _clientLogger }
@@ -1,7 +1,7 @@
import {
+RPC_COLLECTION,
RpcFields,
RpcStatus,
-RPC_COLLECTION,
safeCatch,
} from '@pockethost/common'
import { ClientResponseError } from 'pocketbase'
@@ -31,7 +31,7 @@ export const createRpcHelper = (config: RpcHelperConfig) => {
cb(e.record)
})
return unsub
-}
+},
)

const resetRpcs = safeCatch(`resetRpcs`, logger, async () =>
@@ -43,7 +43,7 @@ export const createRpcHelper = (config: RpcHelperConfig) => {
.update<RpcFields<any, any>>({
status: RpcStatus.FinishedError,
result: `Canceled by reset`,
-})
+}),
)

const incompleteRpcs = safeCatch(`incompleteRpcs`, logger, async () => {
@@ -65,7 +65,7 @@ export const createRpcHelper = (config: RpcHelperConfig) => {
return client
.collection(RPC_COLLECTION)
.update<RpcFields<any, any>>(rpc.id, fields)
-}
+},
)

const setRpcStatus = safeCatch(
@@ -74,12 +74,12 @@ export const createRpcHelper = (config: RpcHelperConfig) => {
async (
rpc: RpcFields<any, any>,
status: RpcStatus,
-result: JsonObject = {}
+result: JsonObject = {},
) => {
return client
.collection(RPC_COLLECTION)
.update(rpc.id, { status, result })
-}
+},
)

return {
@@ -31,7 +31,7 @@ export const clientService = mkSingleton(async (cfg: ClientServiceConfig) => {
dbg(`Logged in`)
} catch (e) {
error(
`CANNOT AUTHENTICATE TO ${PUBLIC_APP_PROTOCOL}://${PUBLIC_APP_DB}.${PUBLIC_APP_DOMAIN}/_/`
`CANNOT AUTHENTICATE TO ${PUBLIC_APP_PROTOCOL}://${PUBLIC_APP_DB}.${PUBLIC_APP_DOMAIN}/_/`,
)
process.exit(-1)
}

@@ -1,5 +1,3 @@
export * from './clientService/clientService'
export * from './clientService/PbClient'
export * from './FtpService/FtpService'
export * from './InstanceService/InstanceService'
export * from './PocketBaseService'
@@ -7,3 +5,5 @@ export * from './ProxyService'
export * from './RealtimeLog'
export * from './RpcService/RpcService'
export * from './SqliteService/SqliteService'
export * from './clientService/PbClient'
export * from './clientService/clientService'
@@ -1,12 +1,12 @@
import { clientService } from '$services'
import {
InstanceFields,
INSTANCE_COLLECTION,
InvocationFields,
INVOCATION_COLLECTION,
logger,
RpcFields,
InstanceFields,
InvocationFields,
RPC_COLLECTION,
RpcFields,
logger,
singletonAsyncExecutionGuard,
} from '@pockethost/common'
import Bottleneck from 'bottleneck'
@@ -17,7 +17,7 @@ export const deleteInvocation = singletonAsyncExecutionGuard(
const { client } = await clientService()
await client.client.collection(INVOCATION_COLLECTION).delete(invocation.id)
},
(invocation) => `deleteInvocation:${invocation.id}`
(invocation) => `deleteInvocation:${invocation.id}`,
)

export const deleteInvocationsForInstance = singletonAsyncExecutionGuard(
@@ -50,7 +50,7 @@ export const deleteInvocationsForInstance = singletonAsyncExecutionGuard(
}
}
},
(instance) => `deleteInvocationsForInstance:${instance.id}`
(instance) => `deleteInvocationsForInstance:${instance.id}`,
)

export const deleteRpc = singletonAsyncExecutionGuard(
@@ -58,7 +58,7 @@ export const deleteRpc = singletonAsyncExecutionGuard(
const { client } = await clientService()
await client.client.collection(RPC_COLLECTION).delete(rpc.id)
},
(rpc) => `deleteRpc:${rpc.id}`
(rpc) => `deleteRpc:${rpc.id}`,
)

export const getAllRpcs = singletonAsyncExecutionGuard(
@@ -70,7 +70,7 @@ export const getAllRpcs = singletonAsyncExecutionGuard(
console.log(`Loaded rpcs`)
return rpcs
},
() => `getAllRpcs`
() => `getAllRpcs`,
)

export const deleteRpcsForInstance = singletonAsyncExecutionGuard(
@@ -80,7 +80,7 @@ export const deleteRpcsForInstance = singletonAsyncExecutionGuard(
const instanceRpcs = allRpcs.filter((rpc) => rpc.payload?.instanceId === id)
await Promise.all(instanceRpcs.map(deleteRpc))
},
(instance) => `deleteRpcsForInstance:${instance.id}`
(instance) => `deleteRpcsForInstance:${instance.id}`,
)

export const deleteInstance = singletonAsyncExecutionGuard(
@@ -95,7 +95,7 @@ export const deleteInstance = singletonAsyncExecutionGuard(
await deleteInvocationsForInstance(instance).catch((e) => {
console.error(
`deleteInvocationsForInstance error`,
JSON.stringify(e, null, 2)
JSON.stringify(e, null, 2),
)
throw e
})
@@ -110,7 +110,7 @@ export const deleteInstance = singletonAsyncExecutionGuard(
})
console.log(`Instance deleted ${id}`)
},
(instance) => `deleteInstance:${instance.id}`
(instance) => `deleteInstance:${instance.id}`,
)

export const deleteInstancesByFilter = singletonAsyncExecutionGuard(
@@ -122,9 +122,9 @@ export const deleteInstancesByFilter = singletonAsyncExecutionGuard(
const limiter = new Bottleneck({ maxConcurrent: 50 })
await Promise.all(
instances.map((instance) =>
limiter.schedule(() => deleteInstance(instance))
)
limiter.schedule(() => deleteInstance(instance)),
),
)
},
(filter) => `deleteInstancesByFilter:${filter}`
(filter) => `deleteInstancesByFilter:${filter}`,
)
@@ -16,7 +16,7 @@ export const createCleanup = (context: { program: Command } & ContextBase) => {
.option(
`-f, --filter <filter>`,
`Filter to use when deleting instances`,
`stress-%`
`stress-%`,
)
.action(async () => {
const options = cleanupCmd.optsWithGlobals<CleanupOptions>()

@@ -38,7 +38,7 @@ export const createSeed = (context: { program: Command } & ContextBase) => {
`-c, --count`,
`Number of new seed instances to create`,
parseInt,
10
10,
)
.action(async () => {
const options = seedCmd.optsWithGlobals<SeedOptions>()

@@ -24,25 +24,25 @@ export const createStress = (context: { program: Command } & ContextBase) => {
'-ic, --instance-count <number>',
`Number of simultaneous instances to hit`,
parseInt,
100
100,
)
.option(
'-rc, --requests-per-instance <number>',
`Number of simultaneous requests per instance`,
parseInt,
50
50,
)
.option(
'-mind, --min-delay <number>',
`Minimum number of milliseconds to delay before sending another request`,
parseInt,
50
50,
)
.option(
'-maxd, --max-delay <number>',
`Maximum number of milliseconds to delay before sending another request`,
parseInt,
500
500,
)
.action(async () => {
const options = seedCmd.optsWithGlobals<StressOptions>()
@@ -63,7 +63,7 @@ export const createStress = (context: { program: Command } & ContextBase) => {
if (excluded[instanceId]) return
await client.updateInstance(instanceId, { maintenance: false })
},
(id) => `reset:${id}`
(id) => `reset:${id}`,
)

const instances = await client.getInstances()
@@ -80,7 +80,7 @@ export const createStress = (context: { program: Command } & ContextBase) => {
dbg(
`There are ${instances.length} instances and ${
values(excluded).length
} excluded`
} excluded`,
)
if (!instance) throw new Error(`No instance to grab`)

@@ -108,7 +108,7 @@ export const createStress = (context: { program: Command } & ContextBase) => {
return // Timeout
}
}
})
}),
)
}
} catch (e) {
@@ -22,7 +22,7 @@ program
.option(
'-u, --mothership-url',
'URL to central database',
'http://127.0.0.1:8090'
'http://127.0.0.1:8090',
)

createCleanup({ program, logger })

@@ -7,7 +7,7 @@ import { dirname } from 'path'

export function assert<T>(
v: T | undefined | void | null,
msg?: string
msg?: string,
): asserts v is T {
if (!v) {
throw new Error(msg || `Assertion failure`)
@@ -28,7 +28,7 @@ const downloadFile = async (url: string, path: string) => {
const _unsafe_downloadAndExtract = async (
url: string,
binPath: string,
logger: Logger
logger: Logger,
) => {
const { dbg, error } = logger.create('downloadAndExtract')

@@ -48,5 +48,5 @@ const _unsafe_downloadAndExtract = async (

export const downloadAndExtract = singletonAsyncExecutionGuard(
_unsafe_downloadAndExtract,
(url) => url
(url) => url,
)

@@ -5,7 +5,7 @@ import { dirname } from 'path'

export const smartFetch = async <TRet>(
url: string,
path: string
path: string,
): Promise<TRet> => {
const { dbg } = logger().create(`smartFetch`)

@@ -40,7 +40,7 @@ export const tryFetch = async (url: string, config?: Partial<Config>) => {
resolve()
} catch (e) {
dbg(
`Could not fetch ${url}, trying again in ${retryMs}ms. Raw error ${e}`
`Could not fetch ${url}, trying again in ${retryMs}ms. Raw error ${e}`,
)
setTimeout(tryFetch, retryMs)
}

@@ -250,7 +250,7 @@ export class EventSource extends EventTarget {
await new Promise<void>((res) => {
const id = setTimeout(
() => res(clearTimeout(id)),
this.#settings.reconnectionTime
this.#settings.reconnectionTime,
)
})

packages/deno-worker/index.d.ts (vendored)
@@ -19,7 +19,7 @@ declare class EventSource {

constructor(
url: string,
eventSourceInitDict?: EventSource.EventSourceInitDict
eventSourceInitDict?: EventSource.EventSourceInitDict,
)

readonly CLOSED: number

@@ -23,7 +23,7 @@ export const init = (klass: typeof PocketBase) => {

const adminAuthWithPassword = async (
login = ADMIN_LOGIN,
password = ADMIN_PASSWORD
password = ADMIN_PASSWORD,
) => {
console.log(`Connecting to ${POCKETBASE_URL} with ${ADMIN_LOGIN}`)
await client.admins.authWithPassword(login, password)
@@ -1,16 +0,0 @@
{
"useTabs": false,
"singleQuote": true,
"semi": false,
"trailingComma": "none",
"printWidth": 100,
"pluginSearchDirs": [".", "../.."],
"overrides": [
{
"files": "*.svelte",
"options": {
"parser": "svelte"
}
}
]
}
@@ -10,28 +10,25 @@
"check:watch": "svelte-kit sync && svelte-check --tsconfig ./tsconfig.json --watch",
"lint": "prettier --check .",
"format": "prettier --write .",
"start": "HOST=localhost PORT=5173 node dist-server/index.js",
"start": "HOST=localhost PORT=5173 dotenv -e ../../.env node -- dist-server/index.js",
"pm2": "pm2 del www ; pm2 start \"yarn start\" --name=www -o ~/logs/www.log -e ~/logs/www.log",
"watch": "chokidar 'src/**' -c 'yarn build' --initial"
},
"devDependencies": {
"@sveltejs/adapter-node": "^1.0.0-next.99",
"@sveltejs/kit": "next",
"chokidar-cli": "^3.0.0",
"node-html-parser": "^6.1.4",
"svelte": "^3.44.0",
"svelte-check": "^2.7.1",
"svelte-preprocess": "^4.10.6",
"tslib": "^2.3.1",
"typescript": "^4.8.0",
"vite": "^3.1.0"
"@sveltejs/adapter-node": "^1.3.1",
"@sveltejs/kit": "^1.24.1",
"node-html-parser": "^6.1.8",
"svelte": "^4.2.0",
"svelte-check": "^3.5.1",
"svelte-preprocess": "^5.0.4",
"vite": "^4.4.9"
},
"type": "module",
"dependencies": {
"@dansvel/vite-plugin-markdown": "^2.0.5",
"@microsoft/fetch-event-source": "https://github.com/Almar/fetch-event-source.git#pr/make_web_worker_friendly",
"@pockethost/common": "0.0.1",
"@s-libs/micro-dash": "^15.1.0",
"@s-libs/micro-dash": "^16.1.0",
"@types/bootstrap": "^5.2.6",
"@types/d3-scale": "^4.0.3",
"@types/d3-scale-chromatic": "^3.0.0",
@@ -10,7 +10,7 @@
</script>

<div class="accordion-item">
<h2 class="accordion-header " id={headerId}>
<h2 class="accordion-header" id={headerId}>
<button
class="accordion-button {show ? '' : 'collapsed'} text-bg-{header} "
type="button"

@@ -6,5 +6,5 @@ export enum AlertTypes {
Warning = 'warning',
Info = 'info',
Light = 'light',
Dark = 'dark'
Dark = 'dark',
}

@@ -5,13 +5,13 @@ import Cookies from 'js-cookie'
// Set some default values to be referenced later
export enum ThemeNames {
Light = 'light',
Dark = 'dark'
Dark = 'dark',
}
export const HLJS_THEMES = {
[ThemeNames.Light]:
'https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.7.0/styles/default.min.css',
[ThemeNames.Dark]:
'https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.7.0/styles/a11y-dark.min.css'
'https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.7.0/styles/a11y-dark.min.css',
}
export const ALLOWED_THEMES: ThemeNames[] = [ThemeNames.Light, ThemeNames.Dark]
export const DEFAULT_THEME: ThemeNames = ThemeNames.Light
@@ -19,7 +19,7 @@ export const STORAGE_NAME: string = 'theme'
export const THEME_ATTRIBUTE: string = 'data-bs-theme'
export const THEME_ICONS: { [_ in ThemeNames]: string } = {
[ThemeNames.Light]: 'bi bi-moon-stars',
[ThemeNames.Dark]: 'bi bi-brightness-high'
[ThemeNames.Dark]: 'bi bi-brightness-high',
}

export const html = () => {
@@ -30,7 +30,8 @@ export const html = () => {

export const getCurrentTheme = () => {
const savedTheme = Cookies.get(STORAGE_NAME)
const currentTheme = find(ALLOWED_THEMES, (v) => savedTheme === v) || DEFAULT_THEME
const currentTheme =
find(ALLOWED_THEMES, (v) => savedTheme === v) || DEFAULT_THEME
return currentTheme
}
@@ -6,15 +6,19 @@ import { boolean } from 'boolean'
import UrlPattern from 'url-pattern'
import base from '../../../package.json'

export const env = (name: string, _default: string = '') => {
export type PublicEnvName = `PUBLIC_${string}`

export const env = (name: PublicEnvName, _default: string = '') => {
const v = _env[name]
if (!v) return _default
return v
}

export const envi = (name: string, _default: number) => parseInt(env(name, _default.toString()))
export const envi = (name: PublicEnvName, _default: number) =>
parseInt(env(name, _default.toString()))

export const envb = (name: string, _default: boolean) => boolean(env(name, _default.toString()))
export const envb = (name: PublicEnvName, _default: boolean) =>
boolean(env(name, _default.toString()))

export const PUBLIC_APP_DB = env('PUBLIC_APP_DB', 'pockethost-central')
export const PUBLIC_APP_DOMAIN = env('PUBLIC_APP_DOMAIN', 'pockethost.io')
@@ -24,7 +28,9 @@ export const PUBLIC_DEBUG = envb('PUBLIC_DEBUG', dev)

export const PUBLIC_POCKETHOST_VERSION = base.version

export const PUBLIC_ROUTES = publicRoutes.map((pattern) => new UrlPattern(pattern))
export const PUBLIC_ROUTES = publicRoutes.map(
(pattern) => new UrlPattern(pattern),
)

try {
logger()
@@ -1,17 +1,17 @@
import { createGenericSyncEvent } from '$util/events'
import { fetchEventSource } from '@microsoft/fetch-event-source'
import {
assertExists,
CreateInstancePayloadSchema,
createRpcHelper,
createWatchHelper,
logger,
RenameInstancePayloadSchema,
RpcCommands,
safeCatch,
SaveSecretsPayloadSchema,
SaveVersionPayloadSchema,
SetInstanceMaintenancePayloadSchema,
assertExists,
createRpcHelper,
createWatchHelper,
logger,
safeCatch,
type CreateInstancePayload,
type CreateInstanceResult,
type InstanceFields,
@@ -26,7 +26,7 @@ import {
type SetInstanceMaintenancePayload,
type SetInstanceMaintenanceResult,
// gen:import
type UserFields
type UserFields,
} from '@pockethost/common'
import { keys, map } from '@s-libs/micro-dash'
import PocketBase, {
@@ -34,7 +34,7 @@ import PocketBase, {
BaseAuthStore,
ClientResponseError,
type RecordSubscription,
type UnsubscribeFunc
type UnsubscribeFunc,
} from 'pocketbase'

export type AuthChangeHandler = (user: BaseAuthStore) => void
@@ -66,36 +66,45 @@ export const createPocketbaseClient = (config: PocketbaseClientConfig) => {

const logOut = () => authStore.clear()

const createUser = safeCatch(`createUser`, _logger, (email: string, password: string) =>
client
.collection('users')
.create({
email,
password,
passwordConfirm: password
})
.then(() => {
// dbg(`Sending verification email to ${email}`)
return client.collection('users').requestVerification(email)
})
const createUser = safeCatch(
`createUser`,
_logger,
(email: string, password: string) =>
client
.collection('users')
.create({
email,
password,
passwordConfirm: password,
})
.then(() => {
// dbg(`Sending verification email to ${email}`)
return client.collection('users').requestVerification(email)
}),
)

const confirmVerification = safeCatch(`confirmVerification`, _logger, (token: string) =>
client
.collection('users')
.confirmVerification(token)
.then((response) => {
return response
})
const confirmVerification = safeCatch(
`confirmVerification`,
_logger,
(token: string) =>
client
.collection('users')
.confirmVerification(token)
.then((response) => {
return response
}),
)

const requestPasswordReset = safeCatch(`requestPasswordReset`, _logger, (email: string) =>
client
.collection('users')
.requestPasswordReset(email)
.then(() => {
return true
})
const requestPasswordReset = safeCatch(
`requestPasswordReset`,
_logger,
(email: string) =>
client
.collection('users')
.requestPasswordReset(email)
.then(() => {
return true
}),
)
const requestPasswordResetConfirm = safeCatch(
@@ -107,15 +116,18 @@
.confirmPasswordReset(token, password, password)
.then((response) => {
return response
})
}),
)

const authViaEmail = safeCatch(`authViaEmail`, _logger, (email: string, password: string) =>
client.collection('users').authWithPassword(email, password)
const authViaEmail = safeCatch(
`authViaEmail`,
_logger,
(email: string, password: string) =>
client.collection('users').authWithPassword(email, password),
)

const refreshAuthToken = safeCatch(`refreshAuthToken`, _logger, () =>
client.collection('users').authRefresh()
client.collection('users').authRefresh(),
)

const watchHelper = createWatchHelper({ client })
@@ -125,27 +137,27 @@

const createInstance = mkRpc<CreateInstancePayload, CreateInstanceResult>(
RpcCommands.CreateInstance,
CreateInstancePayloadSchema
CreateInstancePayloadSchema,
)
const saveSecrets = mkRpc<SaveSecretsPayload, SaveSecretsResult>(
RpcCommands.SaveSecrets,
SaveSecretsPayloadSchema
SaveSecretsPayloadSchema,
)

const saveVersion = mkRpc<SaveVersionPayload, SaveVersionResult>(
RpcCommands.SaveVersion,
SaveVersionPayloadSchema
SaveVersionPayloadSchema,
)

const renameInstance = mkRpc<RenameInstancePayload, RenameInstanceResult>(
RpcCommands.RenameInstance,
RenameInstancePayloadSchema
RenameInstancePayloadSchema,
)

const setInstanceMaintenance = mkRpc<SetInstanceMaintenancePayload, SetInstanceMaintenanceResult>(
RpcCommands.SetInstanceMaintenance,
SetInstanceMaintenancePayloadSchema
)
const setInstanceMaintenance = mkRpc<
SetInstanceMaintenancePayload,
SetInstanceMaintenanceResult
>(RpcCommands.SetInstanceMaintenance, SetInstanceMaintenancePayloadSchema)

// gen:mkRpc

@@ -153,42 +165,56 @@
`getInstanceById`,
_logger,
(id: InstanceId): Promise<InstanceFields | undefined> =>
client.collection('instances').getOne<InstanceFields>(id)
client.collection('instances').getOne<InstanceFields>(id),
)

const watchInstanceById = async (
id: InstanceId,
cb: (data: RecordSubscription<InstanceFields>) => void
cb: (data: RecordSubscription<InstanceFields>) => void,
): Promise<UnsubscribeFunc> => watchById('instances', id, cb)

const getAllInstancesById = safeCatch(`getAllInstancesById`, _logger, async () =>
(await client.collection('instances').getFullList()).reduce((c, v) => {
c[v.id] = v as unknown as InstanceFields
return c
}, {} as { [_: InstanceId]: InstanceFields })
const getAllInstancesById = safeCatch(
`getAllInstancesById`,
_logger,
async () =>
(await client.collection('instances').getFullList()).reduce(
(c, v) => {
c[v.id] = v as unknown as InstanceFields
return c
},
{} as { [_: InstanceId]: InstanceFields },
),
)

const parseError = (e: Error): string[] => {
if (!(e instanceof ClientResponseError)) return [e.message]
if (e.data.message && keys(e.data.data).length === 0) return [e.data.message]
return map(e.data.data, (v, k) => (v ? v.message : undefined)).filter((v) => !!v)
if (e.data.message && keys(e.data.data).length === 0)
return [e.data.message]
return map(e.data.data, (v, k) => (v ? v.message : undefined)).filter(
(v) => !!v,
)
}

const resendVerificationEmail = safeCatch(`resendVerificationEmail`, _logger, async () => {
const user = client.authStore.model
assertExists(user, `Login required`)
await client.collection('users').requestVerification(user.email)
})
const resendVerificationEmail = safeCatch(
`resendVerificationEmail`,
_logger,
async () => {
const user = client.authStore.model
assertExists(user, `Login required`)
await client.collection('users').requestVerification(user.email)
},
)
const getAuthStoreProps = (): AuthStoreProps => {
const { token, model, isValid } = client.authStore as AuthStoreProps
// dbg(`current authStore`, { token, model, isValid })
if (model instanceof Admin) throw new Error(`Admin models not supported`)
if (model && !model.email) throw new Error(`Expected model to be a user here`)
if (model && !model.email)
throw new Error(`Expected model to be a user here`)
return {
token,
model,
isValid
isValid,
}
}

@@ -196,7 +222,8 @@
* Use synthetic event for authStore changers so we can broadcast just
* the props we want and not the actual authStore object.
*/
const [onAuthChange, fireAuthChange] = createGenericSyncEvent<AuthStoreProps>()
const [onAuthChange, fireAuthChange] =
createGenericSyncEvent<AuthStoreProps>()

/**
* This section is for initialization
@@ -254,7 +281,7 @@
const watchInstanceLog = (
instanceId: InstanceId,
update: (log: InstanceLogFields) => void,
nInitial = 100
nInitial = 100,
): (() => void) => {
const { dbg, trace } = _logger.create('watchInstanceLog')
const auth = client.authStore.exportToCookie()
@@ -266,12 +293,12 @@
fetchEventSource(`${url}/logs`, {
method: 'POST',
headers: {
'Content-Type': 'application/json'
'Content-Type': 'application/json',
},
body: JSON.stringify({
instanceId,
n: nInitial,
auth
auth,
}),
onmessage: (event) => {
trace(`Got stream event`, event)
@@ -290,7 +317,7 @@
setTimeout(continuallyFetchFromEventSource, 100)
dbg(`Stream closed`)
},
signal
signal,
})
}
continuallyFetchFromEventSource()
@@ -323,6 +350,6 @@
renameInstance,
setInstanceMaintenance,
// gen:export
saveVersion
saveVersion,
}
}
@@ -1,7 +1,10 @@
import { browser } from '$app/environment'
import { PUBLIC_APP_DB, PUBLIC_APP_DOMAIN } from '$src/env'
import { logger } from '@pockethost/common'
import { createPocketbaseClient, type PocketbaseClient } from './PocketbaseClient'
import {
createPocketbaseClient,
type PocketbaseClient,
} from './PocketbaseClient'

export const client = (() => {
let clientInstance: PocketbaseClient | undefined
@@ -23,14 +23,14 @@ function formatInput(input: SecretsArray): SecretsArray {
.map(({ name, value }, index) => ({
name,
value,
color: colorScale(index.toString())
color: colorScale(index.toString()),
}))
}

const sanitize = (item: SecretItem) => {
return {
name: item.name.toUpperCase().trim(),
value: item.value.trim()
value: item.value.trim(),
}
}

@@ -54,8 +54,8 @@ function createItems(initialItems: SecretsArray) {
...n,
{
name,
value
}
value,
},
]
return formatInput(n)
})
@@ -69,7 +69,7 @@ function createItems(initialItems: SecretsArray) {
n = [...n.slice(0, index), ...n.slice(index + 1)]
return formatInput(n)
})
}
},
}
}
@ -28,7 +28,7 @@ export const handleLogin = async (
|
||||
email: string,
|
||||
password: string,
|
||||
setError?: FormErrorHandler,
|
||||
shouldRedirect: boolean = true
|
||||
shouldRedirect: boolean = true,
|
||||
) => {
|
||||
const { authViaEmail } = client()
|
||||
// Reset the form error if the form is submitted
|
||||
@ -42,7 +42,9 @@ export const handleLogin = async (
|
||||
}
|
||||
} catch (error) {
|
||||
if (!(error instanceof Error)) {
|
||||
throw new Error(`Expected Error type here, but got ${typeof error}:${error}`)
|
||||
throw new Error(
|
||||
`Expected Error type here, but got ${typeof error}:${error}`,
|
||||
)
|
||||
}
|
||||
handleFormError(error, setError)
|
||||
}
|
||||
@ -57,7 +59,7 @@ export const handleLogin = async (
|
||||
export const handleRegistration = async (
|
||||
email: string,
|
||||
password: string,
|
||||
setError?: FormErrorHandler
|
||||
setError?: FormErrorHandler,
|
||||
) => {
|
||||
const { createUser } = client()
|
||||
// Reset the form error if the form is submitted
|
||||
@ -75,7 +77,10 @@ export const handleRegistration = async (
|
||||
* @param token {string} The token from the verification email
|
||||
* @param setError {function} This can be used to show an alert bar if an error occurs during the login process
|
||||
*/
|
||||
export const handleAccountConfirmation = async (token: string, setError?: FormErrorHandler) => {
|
||||
export const handleAccountConfirmation = async (
|
||||
token: string,
|
||||
setError?: FormErrorHandler,
|
||||
) => {
|
||||
const { confirmVerification } = client()
|
||||
// Reset the form error if the form is submitted
|
||||
setError?.('')
|
||||
@ -98,7 +103,7 @@ export const handleAccountConfirmation = async (token: string, setError?: FormEr
|
||||
*/
|
||||
export const handleUnauthenticatedPasswordReset = async (
|
||||
email: string,
|
||||
setError?: FormErrorHandler
|
||||
setError?: FormErrorHandler,
|
||||
) => {
|
||||
const { requestPasswordReset } = client()
|
||||
// Reset the form error if the form is submitted
|
||||
@@ -122,7 +127,7 @@ export const handleUnauthenticatedPasswordReset = async (
 export const handleUnauthenticatedPasswordResetConfirm = async (
   token: string,
   password: string,
-  setError?: FormErrorHandler
+  setError?: FormErrorHandler,
 ) => {
   const { requestPasswordResetConfirm } = client()
   // Reset the form error if the form is submitted
@@ -141,7 +146,7 @@ export const handleUnauthenticatedPasswordResetConfirm = async (
 
 export const handleCreateNewInstance = async (
   instanceName: string,
-  setError?: FormErrorHandler
+  setError?: FormErrorHandler,
 ) => {
   const { user, createInstance } = client()
   // Get the newly created user id
@@ -154,7 +159,7 @@ export const handleCreateNewInstance = async (
 
   // Create a new instance using the generated name
   const record = await createInstance({
-    subdomain: instanceName
+    subdomain: instanceName,
   })
 
   await goto(`/app/instances/${record.instance.id}`)
@@ -167,7 +172,7 @@ export const handleInstanceGeneratorWidget = async (
   email: string,
   password: string,
   instanceName: string,
-  setError = (value: string) => {}
+  setError = (value: string) => {},
 ) => {
   const { dbg, error, warn } = logger()
 
@@ -203,7 +208,7 @@ export const handleInstanceGeneratorWidget = async (
         // If registration succeeds, login should always succeed.
         // If a login fails at this point, the system is broken.
         throw new Error(
-          `Login system is currently down. Please contact us so we can fix this.`
+          `Login system is currently down. Please contact us so we can fix this.`,
         )
       })
     })
@@ -247,7 +252,9 @@ export const handleInstanceGeneratorWidget = async (
   }
 }
 
-export const handleResendVerificationEmail = async (setError = (value: string) => {}) => {
+export const handleResendVerificationEmail = async (
+  setError = (value: string) => {},
+) => {
   const { resendVerificationEmail } = client()
   try {
     await resendVerificationEmail()
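The handlers in the hunks above share the `setError?: FormErrorHandler` convention: failures are routed to an optional form-level callback instead of being thrown at the caller. A minimal sketch of that shape (the `handleRename` name and its validation rule are hypothetical, not part of this diff):

```typescript
type FormErrorHandler = (message: string) => void

// Hypothetical handler following the setError convention above: errors are
// reported through the optional callback rather than propagated.
const handleRename = (newName: string, setError?: FormErrorHandler) => {
  try {
    if (newName.length === 0) throw new Error('Name must not be empty')
    return newName.toLowerCase()
  } catch (e) {
    setError?.(e instanceof Error ? e.message : String(e))
  }
}

// Usage: the failing call reports into the form, the valid call returns.
const errors: string[] = []
handleRename('', (msg) => errors.push(msg))
console.log(handleRename('MyApp')) // myapp
console.log(errors) // [ 'Name must not be empty' ]
```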
@@ -4,7 +4,7 @@ export type Unsubscribe = () => void
 
 export const createGenericAsyncEvent = <TPayload>(): [
   (cb: (payload: TPayload) => Promise<void>) => Unsubscribe,
-  (payload: TPayload) => Promise<void>
+  (payload: TPayload) => Promise<void>,
 ] => {
   let i = 0
   const callbacks: any = {}
@@ -22,7 +22,7 @@ export const createGenericAsyncEvent = <TPayload>(): [
       (c, cb) => {
         return c.then(cb(payload))
       },
-      Promise.resolve()
+      Promise.resolve(),
     )
 
   return [onEvent, fireEvent]
@@ -30,7 +30,7 @@ export const createGenericAsyncEvent = <TPayload>(): [
 
 export const createGenericSyncEvent = <TPayload>(): [
   (cb: (payload: TPayload) => void) => Unsubscribe,
-  (payload: TPayload) => void
+  (payload: TPayload) => void,
 ] => {
   let i = 0
   const callbacks: any = {}
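`createGenericSyncEvent` in the hunk above returns a `[subscribe, fire]` tuple, where subscribing yields an `Unsubscribe` function. A self-contained sketch of the same pattern (reconstructed from the visible diff context, so details may differ from the full file):

```typescript
type Unsubscribe = () => void

const createGenericSyncEvent = <TPayload>(): [
  (cb: (payload: TPayload) => void) => Unsubscribe,
  (payload: TPayload) => void,
] => {
  let i = 0
  const callbacks: Record<number, (payload: TPayload) => void> = {}

  // Register a callback; returns a function that removes it again.
  const onEvent = (cb: (payload: TPayload) => void): Unsubscribe => {
    const id = i++
    callbacks[id] = cb
    return () => {
      delete callbacks[id]
    }
  }

  // Deliver the payload to every currently registered callback.
  const fireEvent = (payload: TPayload) =>
    Object.values(callbacks).forEach((cb) => cb(payload))

  return [onEvent, fireEvent]
}

// Usage: subscribers receive events only until they unsubscribe.
const [onMessage, fireMessage] = createGenericSyncEvent<string>()
const seen: string[] = []
const unsub = onMessage((msg) => seen.push(msg))
fireMessage('hello')
unsub()
fireMessage('ignored')
console.log(seen) // [ 'hello' ]
```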
@@ -4,7 +4,11 @@ import type { AuthStoreProps } from '$src/pocketbase/PocketbaseClient'
 import { logger } from '@pockethost/common'
 import { writable } from 'svelte/store'
 
-export const authStoreState = writable<AuthStoreProps>({ isValid: false, model: null, token: '' })
+export const authStoreState = writable<AuthStoreProps>({
+  isValid: false,
+  model: null,
+  token: '',
+})
 export const isUserLoggedIn = writable(false)
 export const isUserVerified = writable(false)
 export const isAuthStateInitialized = writable(false)
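The store hunk above relies on `writable` from `svelte/store`. A rough sketch of the contract these stores assume (subscribers are called immediately on subscribe and again on every `set`), written without the svelte dependency:

```typescript
type Subscriber<T> = (value: T) => void

// Minimal writable-store sketch; real svelte/store has more (update, etc.).
const writable = <T>(initial: T) => {
  let value = initial
  const subs = new Set<Subscriber<T>>()
  return {
    subscribe(fn: Subscriber<T>) {
      fn(value) // svelte stores invoke the subscriber immediately
      subs.add(fn)
      return () => subs.delete(fn)
    },
    set(next: T) {
      value = next
      subs.forEach((fn) => fn(next))
    },
  }
}

// Usage mirroring the boolean auth stores above.
const isUserLoggedIn = writable(false)
const states: boolean[] = []
isUserLoggedIn.subscribe((v) => states.push(v))
isUserLoggedIn.set(true)
console.log(states) // [ false, true ]
```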
@@ -11,11 +11,11 @@
     "sourceMap": true,
     "strict": true,
     "paths": {
-      "$util/*": ["src/util/*"],
-      "$components/*": ["src/components/*"],
-      "$src/*": ["src/*"]
+      "$util/*": ["./src/util/*"],
+      "$components/*": ["./src/components/*"],
+      "$src/*": ["./src/*"]
     },
-    "types": ["src/global.d.ts"]
+    "types": ["./src/global.d.ts"]
   }
   // Path aliases are handled by https://kit.svelte.dev/docs/configuration#alias
   //
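The trailing comment in this tsconfig hunk notes that the aliases are actually resolved by SvelteKit's `kit.alias` option; a hedged sketch of what matching `svelte.config.js` entries might look like, inferred from the `paths` keys above and not shown in this diff:

```javascript
// svelte.config.js (sketch) — alias keys mirror the tsconfig "paths" above.
const config = {
  kit: {
    alias: {
      $util: 'src/util',
      $components: 'src/components',
      $src: 'src',
    },
  },
}

export default config
```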
@@ -6,8 +6,8 @@ import markedOptions from './marked.config.js'
 const config: UserConfig = {
   plugins: [markdown({ markedOptions }), sveltekit()],
   optimizeDeps: {
-    include: ['highlight.js', 'highlight.js/lib/core']
-  }
+    include: ['highlight.js', 'highlight.js/lib/core'],
+  },
 }
 
 export default config
@@ -10,7 +10,6 @@
   },
   "dependencies": {
     "@types/node": "^18.11.17",
-    "http-proxy": "^1.18.1",
-    "tsx": "^3.12.1"
+    "http-proxy": "^1.18.1"
   }
 }
@@ -1,158 +0,0 @@
-diff --git a/node_modules/sqlite3/lib/sqlite3.d.ts b/node_modules/sqlite3/lib/sqlite3.d.ts
-index b27b0cf..783a961 100644
---- a/node_modules/sqlite3/lib/sqlite3.d.ts
-+++ b/node_modules/sqlite3/lib/sqlite3.d.ts
-@@ -139,6 +139,153 @@ export class Database extends events.EventEmitter {
-     wait(callback?: (param: null) => void): this;
-
-     interrupt(): void;
-+
-+    backup(path: string, callback?: () => void): Backup
-+    backup(filename: string, destDbName: string, sourceDbName: string, filenameIsDest: boolean, callback?: () => void): Backup
-+}
-+
-+/**
-+ *
-+ * A class for managing an sqlite3_backup object. For consistency
-+ * with other node-sqlite3 classes, it maintains an internal queue
-+ * of calls.
-+ *
-+ * Intended usage from node:
-+ *
-+ *   var db = new sqlite3.Database('live.db');
-+ *   var backup = db.backup('backup.db');
-+ *   ...
-+ *   // in event loop, move backup forward when we have time.
-+ *   if (backup.idle) { backup.step(NPAGES); }
-+ *   if (backup.completed) { ... success ... }
-+ *   if (backup.failed) { ... sadness ... }
-+ *   // do other work in event loop - fine to modify live.db
-+ *   ...
-+ *
-+ * Here is how sqlite's backup api is exposed:
-+ *
-+ *   - `sqlite3_backup_init`: This is implemented as
-+ *     `db.backup(filename, [callback])` or
-+ *     `db.backup(filename, destDbName, sourceDbName, filenameIsDest, [callback])`.
-+ *   - `sqlite3_backup_step`: `backup.step(pages, [callback])`.
-+ *   - `sqlite3_backup_finish`: `backup.finish([callback])`.
-+ *   - `sqlite3_backup_remaining`: `backup.remaining`.
-+ *   - `sqlite3_backup_pagecount`: `backup.pageCount`.
-+ *
-+ * There are the following read-only properties:
-+ *
-+ *   - `backup.completed` is set to `true` when the backup
-+ *     succeeds.
-+ *   - `backup.failed` is set to `true` when the backup
-+ *     has a fatal error.
-+ *   - `backup.message` is set to the error string when
-+ *     the backup has a fatal error.
-+ *   - `backup.idle` is set to `true` when no operation
-+ *     is currently in progress or queued for the backup.
-+ *   - `backup.remaining` is an integer with the remaining
-+ *     number of pages after the last call to `backup.step`
-+ *     (-1 if `step` not yet called).
-+ *   - `backup.pageCount` is an integer with the total number
-+ *     of pages measured during the last call to `backup.step`
-+ *     (-1 if `step` not yet called).
-+ *
-+ * There is the following writable property:
-+ *
-+ *   - `backup.retryErrors`: an array of sqlite3 error codes
-+ *     that are treated as non-fatal - meaning, if they occur,
-+ *     backup.failed is not set, and the backup may continue.
-+ *     By default, this is `[sqlite3.BUSY, sqlite3.LOCKED]`.
-+ *
-+ * The `db.backup(filename, [callback])` shorthand is sufficient
-+ * for making a backup of a database opened by node-sqlite3. If
-+ * using attached or temporary databases, or moving data in the
-+ * opposite direction, the more complete (but daunting)
-+ * `db.backup(filename, destDbName, sourceDbName, filenameIsDest, [callback])`
-+ * signature is provided.
-+ *
-+ * A backup will finish automatically when it succeeds or a fatal
-+ * error occurs, meaning it is not necessary to call `db.finish()`.
-+ * By default, SQLITE_LOCKED and SQLITE_BUSY errors are not
-+ * treated as failures, and the backup will continue if they
-+ * occur. The set of errors that are tolerated can be controlled
-+ * by setting `backup.retryErrors`. To disable automatic
-+ * finishing and stick strictly to sqlite's raw api, set
-+ * `backup.retryErrors` to `[]`. In that case, it is necessary
-+ * to call `backup.finish()`.
-+ *
-+ * In the same way as node-sqlite3 databases and statements,
-+ * backup methods can be called safely without callbacks, due
-+ * to an internal call queue. So for example this naive code
-+ * will correctly back up a db, if there are no errors:
-+ *
-+ *   var backup = db.backup('backup.db');
-+ *   backup.step(-1);
-+ *   backup.finish();
-+ *
-+ */
-+export class Backup extends events.EventEmitter {
-+  /**
-+   * `true` when the backup is idle and ready for `step()` to
-+   * be called, `false` when busy.
-+   */
-+  readonly idle: boolean
-+
-+  /**
-+   * `true` when the backup has completed, `false` otherwise.
-+   */
-+  readonly completed: boolean
-+
-+  /**
-+   * `true` when the backup has failed, `false` otherwise. `Backup.message`
-+   * contains the error message.
-+   */
-+  readonly failed: boolean
-+
-+  /**
-+   * Message failure string from sqlite3_errstr() if `Backup.failed` is `true`
-+   */
-+  readonly message: boolean
-+
-+  /**
-+   * The number of remaining pages after the last call to `step()`,
-+   * or `-1` if `step()` has never been called.
-+   */
-+  readonly remaining: number
-+
-+  /**
-+   * The total number of pages measured during the last call to `step()`,
-+   * or `-1` if `step()` has never been called.
-+   */
-+  readonly pageCount: number
-+
-+  /**
-+   * An array of sqlite3 error codes that are treated as non-fatal -
-+   * meaning, if they occur, `Backup.failed` is not set, and the backup
-+   * may continue. By default, this is `[sqlite3.BUSY, sqlite3.LOCKED]`.
-+   */
-+  retryErrors: number[]
-+
-+  /**
-+   * Asynchronously finalize the backup (required).
-+   *
-+   * @param callback Called when the backup is finalized.
-+   */
-+  finish(callback?: () => void): void
-+
-+  /**
-+   * Asynchronously perform an incremental segment of the backup.
-+   *
-+   * Example:
-+   *
-+   * ```
-+   * backup.step(5)
-+   * ```
-+   *
-+   * @param nPages Number of pages to process (5 recommended).
-+   * @param callback Called when the step is completed.
-+   */
-+  step(nPages: number, callback?: () => void): void
- }
-
- export function verbose(): sqlite3;