docs: small updates to help with ENV variables and docker compose

chore: made sure the push-caches CLI creates the indexes before pushing docs (so that the Model's config is used)
Ken Snyder
2022-02-03 12:38:36 -08:00
parent 6d67bac1b1
commit 19fde622c3
6 changed files with 51 additions and 1 deletions


@@ -5,7 +5,7 @@ services:
  scraper:
    image: getmeili/docs-scraper:latest
    container_name: scraper
-    command: pipenv run ./docs_scraper config.json -d
+    command: pipenv run ./docs_scraper/config.json -v -d
    depends_on:
      - search
    environment:


@@ -26,6 +26,8 @@
    "push-prose": "pnpm -C ./packages/tauri-search run push-prose",
    "up": "docker compose up -d",
    "down": "docker compose down",
    "ps": "docker compose ps",
    "logs:scraper": "docker logs scraper",
    "into:scraper": "docker exec -it scraper bash",
    "into:search": "docker exec -it search bash",
    "lint": "run-p lint:*",


@@ -12,6 +12,8 @@ pnpm run start
You are now up and running with the documentation site and -- assuming you have Docker installed -- a local [search server](./meilisearch) you can interact with.
> Note: the `docker compose` command is new and replaces the standalone `docker-compose` binary. The API surface is the same, but because this project was set up with the new `docker compose`, some docker-related CLI commands may break if you're on the old version; if so, just retype them manually with the dash included.
### Already installed Deps?
If you've already installed all the deps and want more granular control, you can choose from the various script targets, or just choose _watch_ to view the docs with live-editing capability:
@@ -35,6 +37,24 @@ pnpm run watch
```
>>>
## Secrets and ENV variables
>>> DotENV
- We use the popular DotEnv **npm** package to let users set ENV variables **without** having them checked into the repository.
- Simply add a `.env` file with the variables you want to use locally; these can be both secret and non-secret variables.
- With the local _dockerized_ MeiliSearch you won't really need any secrets, but if you're rebuilding the document caches you'll be using the GitHub API heavily enough (and in parallel) that providing a GitHub "personal access token" is a good idea.
  - Use the `GH_TOKEN` and `GH_USER` env variables to have the GitHub APIs use your personal access token (versus acting as an anonymous user).
- There are also some non-secret ENV variables you may want to adjust:
  - the `REPO` variable determines which GitHub repository hosts the Markdown/Prose documents
    - This defaults to `tauri` for now if no ENV is detected; it will likely change to `tauri-docs` in the future.
  - the `BRANCH` variable specifies which branch to use; it falls back to `dev` if not found
>>>
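As a sketch of how these variables come together, the defaults documented above could be resolved as follows. This helper is illustrative only, not the repo's actual `getEnv` implementation, and it assumes the `.env` file has already been loaded into `process.env` (e.g. by the DotEnv package):

```typescript
// Illustrative only: a minimal env resolver using the defaults documented above.
// Assumes DotEnv has already loaded any `.env` file into process.env.
function getEnvSketch() {
  return {
    ghToken: process.env.GH_TOKEN, // optional GitHub personal access token
    ghUser: process.env.GH_USER, // GitHub user the token belongs to
    repo: process.env.REPO ?? "tauri", // defaults to "tauri" for now
    branch: process.env.BRANCH ?? "dev", // falls back to "dev"
  };
}
```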
>>> The Meilisearch Master Key
- the dockerized container has no master key set (though you can set one), which allows all operations to be done via the API
- a _production_ container should always be set up with a master key immediately
- having the master key gives you access to all API endpoints, but it must be included in the request header as a bearer token
>>>
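To illustrate the bearer-token requirement, here is a small sketch of building the auth header for a Meilisearch request. This assumes a Meilisearch version that accepts an `Authorization: Bearer` header (older versions used an `X-Meili-API-Key` header instead), and the host and key in the usage comment are placeholders:

```typescript
// Build auth headers for a Meilisearch request; with no master key set
// (as in the local dockerized container) no Authorization header is needed.
function meiliAuthHeaders(masterKey?: string): Record<string, string> {
  return masterKey ? { Authorization: `Bearer ${masterKey}` } : {};
}

// Usage sketch (placeholder host and key):
// fetch("https://search.example.com/indexes", { headers: meiliAuthHeaders("MASTER_KEY") })
```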
## Models
Central to using this library to build and refresh your search indexes is understanding the concept of `Model`.


@@ -17,6 +17,25 @@ import { getEnv } from "~/utils/getEnv";
(async () => {
  console.log(`- Pushing ALL document caches into local MeiliSearch server`);
  const idx = (await ApiModel.query.currentIndexes()).map((i) => i.name);
  if (idx.length > 0) {
    console.log(
      `- found the following indexes already set up: ${idx.join(
        ", "
      )}. They will not be recreated, but missing indexes will be added as per the Model configuration.`
    );
  }
  // use for...of rather than forEach(async ...) so each createIndex() is awaited
  for (const model of [ApiModel, RepoModel, ProseModel, ConsolidatedModel]) {
    if (!idx.includes(model.name)) {
      console.log(
        `- creating the "${model.name}" index with the following config: ${JSON.stringify(
          { ...model.index, stopWords: `${model.index?.stopWords?.length || 0} words` }
        )}`
      );
      await model.query.createIndex();
    }
  }
  const { repo, branch } = getEnv();
  if (!existsSync(proseDocsCacheFile(repo, branch))) {


@@ -9,6 +9,15 @@ export const TS_DOCS_CACHE = `src/generated/ast/api/ts-documents.json`;
export const TS_AST_CACHE = `src/generated/ast/api/ts-ast.json`;
export const RS_DOCS_CACHE = `src/generated/ast/api/rs-documents.json`;
const SERVERS = {
  local: {
    url: "http://localhost:7700",
  },
  production: {
    url: "https://search2.tauri.com",
  },
};
export const REPOS: `${string}/${string}`[] = [
  "tauri-apps/tauri",
  "tauri-apps/wry",