Commit new Chart releases for TrueCharts

Signed-off-by: TrueCharts-Bot <bot@truecharts.org>
TrueCharts-Bot committed 2022-10-08 23:48:46 +00:00
parent c1b0510186
commit 5056fbe457
27 changed files with 7450 additions and 0 deletions


@@ -0,0 +1,11 @@
# Changelog
## [docspell-0.0.1] (2022-10-08)
### Feat
- add docspell ([#3999](https://github.com/truecharts/charts/issues/3999))


@@ -0,0 +1,12 @@
dependencies:
- name: common
repository: https://library-charts.truecharts.org
version: 10.6.11
- name: postgresql
repository: https://charts.truecharts.org/
version: 8.0.88
- name: solr
repository: https://charts.truecharts.org/
version: 0.0.59
digest: sha256:9b1c35e1dc3de9b02f7e706340458fa08c5a8b6846b2ff07f68ac91382febeed
generated: "2022-10-08T23:47:00.352656045Z"


@@ -0,0 +1,36 @@
apiVersion: v2
appVersion: "0.38.0"
dependencies:
- name: common
repository: https://library-charts.truecharts.org
version: 10.6.11
- condition: postgresql.enabled
name: postgresql
repository: https://charts.truecharts.org/
version: 8.0.88
- condition: solr.enabled
name: solr
repository: https://charts.truecharts.org/
version: 0.0.59
description: Docspell is a personal document organizer.
home: https://truecharts.org/docs/charts/incubator/docspell
icon: https://truecharts.org/img/hotlink-ok/chart-icons/docspell.png
keywords:
- docs
kubeVersion: ">=1.16.0-0"
maintainers:
- email: info@truecharts.org
name: TrueCharts
url: https://truecharts.org
name: docspell
sources:
- https://github.com/truecharts/charts/tree/master/charts/incubator/docspell
- https://github.com/eikek/docspell
- https://docspell.org/docs/install/docker/
- https://docspell.org/docs/configure/defaults/
version: 0.0.1
annotations:
truecharts.org/catagories: |
- productivity
truecharts.org/SCALE-support: "true"
truecharts.org/grade: U


@@ -0,0 +1,110 @@
# docspell
Docspell is a personal document organizer.
TrueCharts charts can be installed either as *normal* Helm Charts or as Apps on TrueNAS SCALE.
This README is an automatically generated general guide to installing our Helm Charts and Apps.
For more information, please click here: [docspell](https://truecharts.org/docs/charts/incubator/docspell)
**This chart is not maintained by the upstream project and any issues with the chart should be raised [here](https://github.com/truecharts/charts/issues/new/choose)**
## Source Code
* <https://github.com/truecharts/charts/tree/master/charts/incubator/docspell>
* <https://github.com/eikek/docspell>
* <https://docspell.org/docs/install/docker/>
* <https://docspell.org/docs/configure/defaults/>
## Requirements
Kubernetes: `>=1.16.0-0`
## Dependencies
| Repository | Name | Version |
|------------|------|---------|
| https://charts.truecharts.org/ | postgresql | 8.0.88 |
| https://charts.truecharts.org/ | solr | 0.0.59 |
| https://library-charts.truecharts.org | common | 10.6.11 |
## Installing the Chart
### TrueNAS SCALE
To install this Chart on TrueNAS SCALE check our [Quick-Start Guide](https://truecharts.org/docs/manual/SCALE%20Apps/Installing-an-App).
### Helm
To install the chart with the release name `docspell`:
```console
helm repo add TrueCharts https://charts.truecharts.org
helm repo update
helm install docspell TrueCharts/docspell
```
## Uninstall
### TrueNAS SCALE
**Upgrading, Rolling Back and Uninstalling the Chart**
To upgrade, rollback or delete this Chart from TrueNAS SCALE check our [Quick-Start Guide](https://truecharts.org/docs/manual/SCALE%20Apps/Upgrade-rollback-delete-an-App).
### Helm
To uninstall the `docspell` deployment:
```console
helm uninstall docspell
```
## Configuration
### Helm
#### Available Settings
Read through the chart's values.yaml file; it contains several commented-out suggested values.
Additional values can be taken from the [values.yaml](https://github.com/truecharts/library-charts/tree/main/charts/stable/common/values.yaml) of the [common library](https://github.com/k8s-at-home/library-charts/tree/main/charts/stable/common).
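As a hedged illustration, a minimal override file could look like the following. `env.TZ` is used elsewhere in this guide and `postgresql.enabled` is a condition declared in Chart.yaml; treat any other key as an assumption to verify against the chart's values.yaml before use:

```yaml
# my-values.yaml -- hypothetical override file; check key names
# against the chart's values.yaml before applying.
env:
  TZ: "Europe/Amsterdam"   # container timezone
postgresql:
  enabled: true            # toggles the bundled PostgreSQL dependency
```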
#### Configure using the command line
Specify each parameter using the `--set key=value[,key=value]` argument to `helm install`.
```console
helm install docspell \
--set env.TZ="America/New_York" \
TrueCharts/docspell
```
#### Configure using a yaml file
Alternatively, a YAML file that specifies the values for the above parameters can be provided while installing the chart.
```console
helm install docspell TrueCharts/docspell -f values.yaml
```
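If you want to inspect what the chart will generate before installing, `helm template` renders the manifests locally without touching the cluster (a standard Helm workflow; this assumes the `TrueCharts` repository was added as shown above):

```console
helm template docspell TrueCharts/docspell -f values.yaml
```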
#### Connecting to other charts
If you need to connect this Chart to other Charts on TrueNAS SCALE, please refer to our [Linking Charts Internally](https://truecharts.org/docs/manual/SCALE%20Apps/linking-apps) quick-start guide.
## Support
- Please check our [quick-start guides for TrueNAS SCALE](https://truecharts.org/docs/manual/SCALE%20Apps/Important-MUST-READ).
- See the [Website](https://truecharts.org)
- Check our [Discord](https://discord.gg/tVsPTHWTtr)
- Open an [issue](https://github.com/truecharts/apps/issues/new/choose)
---
## Sponsor TrueCharts
TrueCharts can only exist due to the incredible effort of our staff.
Please consider making a [donation](https://truecharts.org/sponsor) or contributing back to the project any way you can!
---
All Rights Reserved - The TrueCharts Project


@@ -0,0 +1,8 @@
Docspell is a personal document organizer.
This App is supplied by TrueCharts, for more information visit the manual: [https://truecharts.org/docs/charts/incubator/docspell](https://truecharts.org/docs/charts/incubator/docspell)
---
TrueCharts can only exist due to the incredible effort of our staff.
Please consider making a [donation](https://truecharts.org/docs/about/sponsor) or contributing back to the project any way you can!

Binary file not shown.

Binary file not shown.

Binary file not shown.


@@ -0,0 +1,539 @@
image:
repository: tccr.io/truecharts/docspell-server
tag: v0.38.0@sha256:bcd3f651f83fc482a0bd8bb2606870146376656a438e7b8fe051f3496d405f3d
pullPolicy: IfNotPresent
joexImage:
repository: tccr.io/truecharts/docspell-joex
tag: v0.38.0@sha256:16f2c2e32d8b20f8cdd1b62ae458ded2bbb16ed7bdc2727dc4a7624298ef7021
pullPolicy: IfNotPresent
dscImage:
repository: tccr.io/truecharts/docspell-dsc
tag: v0.38.0@sha256:86c8100df78f47f00291de0c2395917427f6caacbf896691559296c0a805a7a9
pullPolicy: IfNotPresent
args:
- /opt/server.conf
podSecurityContext:
runAsUser: 0
runAsGroup: 0
securityContext:
runAsNonRoot: false
readOnlyRootFilesystem: false
dsc:
# -- Requires the integration endpoint with an HTTP header to be enabled in rest_server
# -- If allowed IPs are enabled, 127.0.0.1 must also be listed there
enabled: false
# move | delete
imported_action: move
not_match_glob: "**/.*"
match_glob: ""
language: ""
tag: ""
rest_server:
# -- App name, shows on the top right corner
app_name: Docspell
base_url: ""
logging:
# -- The format for the log messages. Can be one of:
# -- Json | Logfmt | Fancy | Plain
format: Fancy
# -- The minimum level to log. From lowest to highest:
# -- Trace | Debug | Info | Warn | Error
minimum_level: Warn
levels:
# -- Override the log level of specific loggers
docspell: Info
flywaydb: Info
binny: Info
http4s: Info
server_opts:
# -- Enable HTTP2
enable_http2: false
# -- Maximum allowed connections
max_connections: 1024
# -- Timeout for waiting for the first output of the response
response_timeout: 45s
# -- This is a hard limit to restrict the size of a batch that is returned when searching for items.
max_item_page_size: 200
# -- The number of characters to return for each item notes when searching.
max_note_length: 180
# -- This defines whether the classification form in the collective settings is displayed or not.
show_classification_settings: true
auth:
# -- How long an authentication token is valid
session_valid: 5 minutes
remember_me:
enabled: true
# -- How long the remember me cookie/token is valid.
valid: 30 days
download_all:
# -- How many files to allow in a zip.
max_files: 500
# -- The maximum (uncompressed) size of the zip file contents.
max_size: 1400M
openid:
- enabled: false
display: Keycloak
provider:
provider_id: keycloak
client_id: docspell
client_secret: example-secret-439e-bf06-911e4cdd56a6
scope: profile
authorize_url: http://localhost:8080/auth/realms/home/protocol/openid-connect/auth
token_url: http://localhost:8080/auth/realms/home/protocol/openid-connect/token
user_url: ""
logout_url: http://localhost:8080/auth/realms/home/protocol/openid-connect/logout
sign_key: b64:anVzdC1hLXRlc3Q=
sig_algo: RS512
collective_key: lookup:docspell_collective
user_key: preferred_username
- enabled: false
display: Github
provider:
provider_id: github
client_id: <your github client id>
client_secret: <your github client secret>
scope: ""
authorize_url: https://github.com/login/oauth/authorize
token_url: https://github.com/login/oauth/access_token
user_url: https://api.github.com/user
logout_url: ""
sign_key: ""
sig_algo: RS256
collective_key: fixed:demo
user_key: login
oidc_auto_redirect: true
integration_endpoint:
enabled: false
# -- The priority to use when submitting files through this endpoint.
# low | high
priority: low
# -- The name used for the item "source" property when uploaded through this endpoint.
source_name: integration
allowed_ips:
enabled: false
ips:
- "127.0.0.1"
http_basic_auth:
enabled: false
realm: Docspell Integration
user: docspell-int
password: docspell-int
http_header:
enabled: false
header_name: Docspell-Integration
header_value: some-secret
admin_endpoint:
# -- Disables endpoint if empty
secret: ""
full_text_search:
solr:
# -- Used to tell solr when to commit the data
commit_within: 1000
# -- If true, logs request and response bodies
log_verbose: false
def_type: lucene
# -- The default combiner for tokens. One of AND | OR
q_op: OR
backend:
# -- Enable or disable debugging for e-mail related functionality
mail_debug: false
database_schema:
# -- Whether to run main database migrations.
run_main_migrations: true
# -- Whether to run the fixup migrations.
run_fixup_migrations: true
# -- Use with care. This repairs all migrations in the database by updating their checksums and removing failed migrations.
repair_schema: false
signup:
# -- The mode defines if new users can signup or not.
# open | invite | closed
mode: open
# -- This is the period an invitation token is considered valid.
invite_time: 3 days
files:
# -- Defines the chunk size (in bytes) used to store the files.
chunk_size: 524288
# -- The file content types that are considered valid.
valid_mime_types: []
# database | minio | filesystem
default_store: database
stores:
database:
enabled: true
minio:
enabled: false
endpoint: http://localhost:9000
access_key: access_key
secret_key: secret_key
bucket: docspell
# -- Highly NOT recommended
filesystem:
enabled: false
directory: /documents
addons:
# TODO: Check how exactly addons work. There are mentions of docker daemon
enabled: false
# -- Whether installing addons requiring network should be allowed or not.
allow_impure: true
# -- Define patterns of urls that are allowed to install addons from.
allowed_urls:
- "*"
# -- Define patterns of urls that are denied to install addons from.
denied_urls: []
joex:
logging:
# -- The format for the log messages. Can be one of:
# -- Json | Logfmt | Fancy | Plain
format: Fancy
# -- The minimum level to log. From lowest to highest:
# -- Trace | Debug | Info | Warn | Error
minimum_level: Warn
levels:
# -- Override the log level of specific loggers
docspell: Info
flywaydb: Info
binny: Info
http4s: Info
database_schema:
# -- Whether to run main database migrations.
run_main_migrations: false
# -- Whether to run the fixup migrations.
run_fixup_migrations: true
# -- Use with care. This repairs all migrations in the database by updating their checksums and removing failed migrations.
repair_schema: false
# -- Enable or disable debugging for e-mail related functionality. This applies to both sending and receiving mails.
mail_debug: false
send_mail:
# -- This is used as the List-Id e-mail header when mails are sent from docspell to its users
list_id: ""
scheduler:
# -- Number of jobs that may be processed in parallel.
pool_size: 1
# -- A counting scheme determines the ratio of how high and low priority jobs are run.
counting_scheme: 4,1
# -- How many times a failed job should be retried until it enters failed state.
retries: 2
# -- The delay until the next try is performed for a failed job.
retry_delay: 1 minute
# -- The queue size of log statements from a job.
log_buffer_size: 500
# -- If no job is left in the queue, the scheduler will wait until a notify is requested.
wakeup_period: 30 minutes
periodic_scheduler:
# -- A fallback to start looking for due periodic tasks regularly.
wakeup_period: 10 minutes
user_tasks:
scan_mailbox:
# -- A limit of how many folders to scan through.
max_folders: 50
# -- How many mails (headers only) to retrieve in one chunk.
mail_chunk_size: 50
# -- A limit on how many mails to process in one job run.
max_mails: 500
house_keeping:
# -- When the house keeping tasks execute.
schedule: Sun *-*-* 00:00:00 UTC
# -- This task removes invitation keys that have been created but not used.
cleanup_invites:
# -- Whether this task is enabled.
enabled: true
# -- The minimum age of invites to be deleted.
older_than: 30 days
# -- This task removes expired remember-me tokens.
cleanup_remember_me:
# -- Whether this task is enabled.
enabled: true
# -- The minimum age of tokens to be deleted.
older_than: 30 days
# -- Jobs store their log output in the database.
cleanup_jobs:
# -- Whether this task is enabled.
enabled: true
# -- The minimum age of jobs to delete.
older_than: 30 days
# -- This defines how many jobs are deleted in one transaction.
delete_batch: 100
# -- Zip files created for downloading multiple files are cached and can be cleared periodically.
cleanup_downloads:
# -- Whether this task is enabled.
enabled: true
# -- The minimum age of a download file to be deleted.
older_than: 14 days
# -- Removes node entries that are not reachable anymore.
check_nodes:
# -- Whether this task is enabled.
enabled: true
# -- How many times the node must be unreachable before it is removed.
min_not_found: 2
# -- Checks all files against their checksum
integrity_check:
enabled: true
# -- A periodic task to check for new releases of docspell.
update_check:
# -- Whether to enable this task
enabled: false
# -- Sends the mail without checking the latest release.
test_run: false
# -- When the update check should execute.
schedule: "Sun *-*-* 00:00:00 UTC"
# -- An account id in form of `collective/user`
sender_account: ""
# -- The SMTP connection id that should be used for sending the mail.
smtp_id: ""
# -- A list of recipient e-mail addresses.
recipients: []
# -- The subject of the mail.
subject: Docspell {{ latestVersion }} is available
# -- The body of the mail.
body: |
Hello,
You are currently running Docspell {{ currentVersion }}. Version *{{ latestVersion }}*
is now available, which was released on {{ releasedAt }}. Check the release page at
<https://github.com/eikek/docspell/releases/latest>
Have a nice day!
Docspell Update Check
# -- Configuration of text extraction
extraction:
pdf:
min_text_length: 500
preview:
# -- When rendering a pdf page, use this dpi.
dpi: 32
ocr:
# -- Images greater than this size are skipped.
max_image_size: 14000000
page_range:
# -- Defines what pages to process.
begin: 10
# -- The ghostscript command.
ghostscript:
command:
program: gs
args:
- -dNOPAUSE
- -dBATCH
- -dSAFER
- -sDEVICE=tiffscaled8
- "-sOutputFile={{outfile}}"
- "{{infile}}"
timeout: 5 minutes
working_dir: /tmp/docspell-extraction
# -- The unpaper command.
unpaper:
command:
program: unpaper
args:
- "{{infile}}"
- "{{outfile}}"
timeout: 5 minutes
# -- The tesseract command.
tesseract:
command:
program: tesseract
args:
- "{{file}}"
- stdout
- -l
- "{{lang}}"
timeout: 5 minutes
text_analysis:
# -- Maximum length of text to be analyzed.
max_length: 0
working_dir: /tmp/docspell-analysis
nlp:
# -- The mode for configuring NLP models
# -- full | basic | regexonly | disabled
mode: full
clear_interval: 15 minutes
# -- Restricts proposals for due dates.
max_due_date_years: 10
regex_ner:
# -- Maximum number of entries to use for custom NER annotation (0 disables it).
max_entries: 1000
file_cache_time: 1 minute
classification:
# -- Whether to enable classification globally.
enabled: true
# -- This limit and `text-analysis.max-length` define how much memory is required.
item_count: 600
# -- Enclose regexps in triple quotes.
classifiers:
useSplitWords: true
splitWordsTokenizerRegexp: '"""[\p{L}][\p{L}0-9]*|(?:\$ ?)?[0-9]+(?:\.[0-9]{2})?%?|\s+|."""'
splitWordsIgnoreRegexp: '"""\s+"""'
useSplitPrefixSuffixNGrams: true
maxNGramLeng: 4
minNGramLeng: 1
splitWordShape: chris4
intern: true
# -- Configuration for converting files into PDFs.
convert:
# -- The chunk size used when storing files.
chunk_size: 524288
# -- A string used to change the filename of the converted pdf file.
converted_filename_part: converted
# -- When reading images, this is the maximum size.
max_image_size: 14000000
markdown:
# -- The CSS that is used to style the resulting HTML.
internal_css: "body { padding: 2em 5em; }"
wkhtmlpdf:
command:
program: wkhtmltopdf
args:
- -s
- A4
- --encoding
- "{{encoding}}"
- --load-error-handling
- ignore
- --load-media-error-handling
- ignore
- "-"
- "{{outfile}}"
timeout: 2 minutes
working_dir: /tmp/docspell-convert
tesseract:
command:
program: tesseract
args:
- "{{infile}}"
- out
- -l
- "{{lang}}"
- pdf
- txt
timeout: 5 minutes
working_dir: /tmp/docspell-convert
unoconv:
command:
program: unoconv
args:
- -f
- pdf
- -o
- "{{outfile}}"
- "{{infile}}"
timeout: 5 minutes
working_dir: /tmp/docspell-convert
ocrmypdf:
enabled: true
command:
program: ocrmypdf
args:
- -l
- "{{lang}}"
- --skip-text
- --deskew
- -j
- "1"
- "{{infile}}"
- "{{outfile}}"
timeout: 5 minutes
working_dir: /tmp/docspell-convert
decrypt_pdf:
enabled: true
passwords: []
files:
# -- Defines the chunk size (in bytes) used to store the files.
chunk_size: 524288
# -- The file content types that are considered valid.
valid_mime_types: []
# database | minio | filesystem
default_store: database
stores:
database:
enabled: true
minio:
enabled: false
endpoint: http://localhost:9000
access_key: access_key
secret_key: secret_key
bucket: docspell
# -- Highly NOT recommended
filesystem:
enabled: false
directory: /documents
full_text_search:
solr:
# -- Used to tell solr when to commit the data
commit_within: 1000
# -- If true, logs request and response bodies
log_verbose: false
def_type: lucene
# -- The default combiner for tokens. One of AND | OR
q_op: OR
migration:
index_all_chunk: 10
addons:
# TODO: Check TODO above for addons
working_dir: /tmp/docspell-addons
cache_dir: /tmp/docspell-addons-cache
executor_config:
runner: trivial
fail_fast: true
run_timeout: 15 minutes
service:
main:
ports:
main:
port: 10320
protocol: HTTP
joex:
enabled: true
type: ClusterIP
ports:
joex:
enabled: true
port: 10321
protocol: HTTP
persistence:
server:
enabled: true
type: secret
readOnly: true
defaultMode: "0600"
objectName: '{{ include "tc.common.names.fullname" . }}-server-secret'
mountPath: /opt/server.conf
subPath: server.conf
joex:
enabled: true
type: secret
noMount: true
readOnly: true
defaultMode: "0600"
objectName: '{{ include "tc.common.names.fullname" . }}-joex-secret'
mountPath: /opt/joex.conf
subPath: joex.conf
import:
enabled: true
noMount: true
mountPath: /import
postgresql:
enabled: true
existingSecret: dbcreds
postgresqlUsername: docspell
postgresqlDatabase: docspell
solr:
enabled: true
existingSecret: solrcreds
solrCores: docspell
solrUsername: docspell
portal:
enabled: true

File diff suppressed because it is too large


@@ -0,0 +1,41 @@
{{/* Define the dsc container */}}
{{- define "docspell.dsc" -}}
image: {{ .Values.dscImage.repository }}:{{ .Values.dscImage.tag }}
imagePullPolicy: {{ .Values.dscImage.pullPolicy }}
securityContext:
runAsUser: {{ .Values.podSecurityContext.runAsUser }}
runAsGroup: {{ .Values.podSecurityContext.runAsGroup }}
readOnlyRootFilesystem: {{ .Values.securityContext.readOnlyRootFilesystem }}
runAsNonRoot: {{ .Values.securityContext.runAsNonRoot }}
volumeMounts:
- name: import
mountPath: /import
command:
- dsc
- {{ printf "%v:%v" "http://localhost" .Values.service.main.ports.main.port | quote }}
- watch
- --recursive
- --integration
- --header
- {{ printf "%v:%v" .Values.rest_server.integration_endpoint.http_header.header_name .Values.rest_server.integration_endpoint.http_header.header_value | quote }}
{{- if .Values.dsc.language }}
- --language {{ .Values.dsc.language }}
{{- end }}
{{- if .Values.dsc.tag }}
- --tag {{ .Values.dsc.tag }}
{{- end }}
{{- if .Values.dsc.not_match_glob }}
- --not-matches
- {{ .Values.dsc.not_match_glob | quote }}
{{- end }}
{{- if .Values.dsc.match_glob }}
- --matches
- {{ .Values.dsc.match_glob | quote }}
{{- end }}
{{- if eq .Values.dsc.imported_action "delete" }}
- --delete
{{- else if eq .Values.dsc.imported_action "move" }}
- --move {{ .Values.persistence.import.mountPath }}/imported
{{- end }}
- {{ .Values.persistence.import.mountPath }}/docs
{{- end -}}


@@ -0,0 +1,46 @@
{{/* Define the joex container */}}
{{- define "docspell.joex" -}}
image: {{ .Values.joexImage.repository }}:{{ .Values.joexImage.tag }}
imagePullPolicy: {{ .Values.joexImage.pullPolicy }}
securityContext:
runAsUser: {{ .Values.podSecurityContext.runAsUser }}
runAsGroup: {{ .Values.podSecurityContext.runAsGroup }}
readOnlyRootFilesystem: {{ .Values.securityContext.readOnlyRootFilesystem }}
runAsNonRoot: {{ .Values.securityContext.runAsNonRoot }}
args:
- /opt/joex.conf
volumeMounts:
- name: joex
mountPath: /opt/joex.conf
subPath: joex.conf
readOnly: true
ports:
- containerPort: {{ .Values.service.joex.ports.joex.port }}
name: joex
{{/* TODO: Find out a path for healthchecks and come back to enable probes
readinessProbe:
httpGet:
path: /
port: {{ .Values.service.joex.ports.joex.port }}
initialDelaySeconds: {{ .Values.probes.readiness.spec.initialDelaySeconds }}
timeoutSeconds: {{ .Values.probes.readiness.spec.timeoutSeconds }}
periodSeconds: {{ .Values.probes.readiness.spec.periodSeconds }}
failureThreshold: {{ .Values.probes.readiness.spec.failureThreshold }}
livenessProbe:
httpGet:
path: /
port: {{ .Values.service.joex.ports.joex.port }}
initialDelaySeconds: {{ .Values.probes.liveness.spec.initialDelaySeconds }}
timeoutSeconds: {{ .Values.probes.liveness.spec.timeoutSeconds }}
periodSeconds: {{ .Values.probes.liveness.spec.periodSeconds }}
failureThreshold: {{ .Values.probes.liveness.spec.failureThreshold }}
startupProbe:
httpGet:
path: /
port: {{ .Values.service.joex.ports.joex.port }}
initialDelaySeconds: {{ .Values.probes.startup.spec.initialDelaySeconds }}
timeoutSeconds: {{ .Values.probes.startup.spec.timeoutSeconds }}
periodSeconds: {{ .Values.probes.startup.spec.periodSeconds }}
failureThreshold: {{ .Values.probes.startup.spec.failureThreshold }}
*/}}
{{- end -}}


@@ -0,0 +1,616 @@
{{/* Define the secret */}}
{{- define "docspell.secret" -}}
{{- $serverSecretName := printf "%s-server-secret" (include "tc.common.names.fullname" .) }}
{{- $joexSecretName := printf "%s-joex-secret" (include "tc.common.names.fullname" .) }}
{{- $storeSecretName := printf "%s-store-secret" (include "tc.common.names.fullname" .) }}
{{- $server := .Values.rest_server -}}
{{- $serverID := printf "server-%v" (randAlphaNum 10) -}}
{{- $joex := .Values.joex -}}
{{- $joexID := printf "joex-%v" (randAlphaNum 10) -}}
{{- $server_secret := "" }}
{{- with (lookup "v1" "Secret" .Release.Namespace $storeSecretName) }}
{{- $server_secret = (index .data "server_secret") }}
{{- else }}
{{- $server_secret = printf "b64:%v" (randAlphaNum 32 | b64enc) }}
{{- end }}
{{- $new_invite_password := "" }}
{{- with (lookup "v1" "Secret" .Release.Namespace $storeSecretName) }}
{{- $new_invite_password = (index .data "new_invite_password") }}
{{- else }}
{{- $new_invite_password = randAlphaNum 32 | b64enc }}
{{- end }}
---
apiVersion: v1
kind: Secret
type: Opaque
metadata:
name: {{ $storeSecretName }}
labels:
{{- include "tc.common.labels" . | nindent 4 }}
stringData:
server_secret: {{ $server_secret }}
new_invite_password: {{ $new_invite_password }}
---
apiVersion: v1
kind: Secret
type: Opaque
metadata:
name: {{ $serverSecretName }}
labels:
{{- include "tc.common.labels" . | nindent 4 }}
stringData:
server.conf: |
docspell.server {
app-name = {{ $server.app_name | quote }}
app-id = {{ $serverID | quote }}
base-url = {{ $server.base_url | default (printf "%v:%v" "http://localhost" .Values.service.main.ports.main.port) | quote }}
internal-url = {{ printf "%v:%v" "http://localhost" .Values.service.main.ports.main.port | quote }}
{{- $logging := $server.logging }}
logging {
format = {{ $logging.format | quote }}
minimum-level = {{ $logging.minimum_level | quote }}
levels = {
"docspell" = {{ $logging.levels.docspell | quote }}
"org.flywaydb" = {{ $logging.levels.flywaydb | quote }}
"binny" = {{ $logging.levels.binny | quote }}
"org.http4s" = {{ $logging.levels.http4s | quote }}
}
}
bind {
address = "0.0.0.0"
port = {{ .Values.service.main.ports.main.port }}
}
{{- $server_opts := $server.server_opts }}
server-options {
enable-http-2 = {{ $server_opts.enable_http2 }}
max-connections = {{ $server_opts.max_connections }}
response-timeout = {{ $server_opts.response_timeout }}
}
max-item-page-size = {{ $server.max_item_page_size }}
max-note-length = {{ $server.max_note_length }}
show-classification-settings = {{ $server.show_classification_settings }}
{{- $auth := $server.auth }}
auth {
server-secret = {{ $server_secret | quote }}
session-valid = {{ $auth.session_valid | quote }}
remember-me {
enabled = {{ $auth.remember_me.enabled }}
valid = {{ $auth.remember_me.valid | quote }}
}
}
{{- $download_all := $server.download_all }}
download-all {
max-files = {{ $download_all.max_files }}
max-size = {{ $download_all.max_size }}
}
{{- $openid := $server.openid }}
openid =
[
{{- range initial $openid }}
{
enabled = {{ .enabled }},
display = {{ .display | quote }}
provider = {
provider-id = {{ .provider.provider_id | quote }},
client-id = {{ .provider.client_id | quote }},
client-secret = {{ .provider.client_secret | quote }},
scope = {{ .provider.scope | quote }},
authorize-url = {{ .provider.authorize_url | quote }},
token-url = {{ .provider.token_url | quote }},
{{- with .provider.user_url }}
user-url = {{ . | quote }},
{{- end }}
{{- with .provider.logout_url }}
logout-url = {{ . | quote }},
{{- end }}
sign-key = {{ .provider.sign_key | quote }},
sig-algo = {{ .provider.sig_algo | quote }}
},
collective-key = {{ .collective_key | quote }},
user-key = {{ .user_key | quote }}
},
{{- end }}
{{- with last $openid }}
{
enabled = {{ .enabled }},
display = {{ .display | quote }}
provider = {
provider-id = {{ .provider.provider_id | quote }},
client-id = {{ .provider.client_id | quote }},
client-secret = {{ .provider.client_secret | quote }},
scope = {{ .provider.scope | quote }},
authorize-url = {{ .provider.authorize_url | quote }},
token-url = {{ .provider.token_url | quote }},
{{- with .provider.user_url }}
user-url = {{ . | quote }},
{{- end }}
{{- with .provider.logout_url }}
logout-url = {{ . | quote }},
{{- end }}
sign-key = {{ .provider.sign_key | quote }},
sig-algo = {{ .provider.sig_algo | quote }}
},
collective-key = {{ .collective_key | quote }},
user-key = {{ .user_key | quote }}
}
{{- end }}
]
oidc-auto-redirect = {{ $server.oidc_auto_redirect }}
{{- $integration_endpoint := $server.integration_endpoint }}
integration-endpoint {
enabled = {{ $integration_endpoint.enabled }}
priority = {{ $integration_endpoint.priority | quote }}
source-name = {{ $integration_endpoint.source_name | quote }}
allowed-ips {
enabled = {{ $integration_endpoint.allowed_ips.enabled }}
ips = [
{{- range initial $integration_endpoint.allowed_ips.ips }}
{{ . | quote }},
{{- end }}
{{ last $integration_endpoint.allowed_ips.ips | quote }}
]
}
http-basic {
enabled = {{ $integration_endpoint.http_basic_auth.enabled }}
realm = {{ $integration_endpoint.http_basic_auth.realm | quote }}
user = {{ $integration_endpoint.http_basic_auth.user | quote }}
password = {{ $integration_endpoint.http_basic_auth.password | quote }}
}
http-header {
enabled = {{ $integration_endpoint.http_header.enabled }}
header-name = {{ $integration_endpoint.http_header.header_name | quote }}
header-value = {{ $integration_endpoint.http_header.header_value | quote }}
}
}
admin-endpoint {
secret = {{ $server.admin_endpoint.secret | quote }}
}
{{- $full_text_search := $server.full_text_search }}
full-text-search {
enabled = true
backend = "solr"
solr = {
url = {{ printf "http://%v:%v@%v-solr:8983/%v" .Values.solr.solrUsername (.Values.solr.solrPassword | trimAll "\"") .Release.Name .Values.solr.solrCores | quote }}
commit-within = {{ $full_text_search.solr.commit_within }}
log-verbose = {{ $full_text_search.solr.log_verbose }}
def-type = {{ $full_text_search.solr.def_type | quote }}
q-op = {{ $full_text_search.solr.q_op | quote }}
}
postgresql = {
use-default-connection = false
jdbc {
url = {{ printf "jdbc:postgresql://%v-%v:5432/%v" .Release.Name "postgresql" .Values.postgresql.postgresqlDatabase | quote }}
user = {{ .Values.postgresql.postgresqlUsername | quote }}
password = {{ .Values.postgresql.postgresqlPassword | trimAll "\"" | quote }}
}
pg-config = {
}
pg-query-parser = "websearch_to_tsquery"
pg-rank-normalization = [ 4 ]
}
}
{{- $backend := $server.backend }}
backend {
mail-debug = {{ $backend.mail_debug }}
jdbc {
url = {{ printf "jdbc:postgresql://%v-%v:5432/%v" .Release.Name "postgresql" .Values.postgresql.postgresqlDatabase | quote }}
user = {{ .Values.postgresql.postgresqlUsername | quote }}
password = {{ .Values.postgresql.postgresqlPassword | trimAll "\"" | quote }}
}
{{- $database_schema := $server.backend.database_schema }}
database-schema = {
run-main-migrations = {{ $database_schema.run_main_migrations }}
run-fixup-migrations = {{ $database_schema.run_fixup_migrations }}
repair-schema = {{ $database_schema.repair_schema }}
}
{{- $signup := $server.backend.signup }}
signup {
mode = {{ $signup.mode | quote }}
new-invite-password = {{ $new_invite_password | quote }}
invite-time = {{ $signup.invite_time | quote }}
}
{{- $files := $server.backend.files }}
files {
chunk-size = {{ $files.chunk_size }}
valid-mime-types = [
{{- range initial $files.valid_mime_types }}
{{ . | quote }},
{{- end }}
{{ last $files.valid_mime_types | quote }}
]
default-store = {{ $files.default_store | quote }}
stores = {
database = {
enabled = {{ $files.stores.database.enabled }}
type = "default-database"
}
filesystem = {
enabled = {{ $files.stores.filesystem.enabled }}
type = "file-system"
directory = {{ $files.stores.filesystem.directory | quote }}
}
minio = {
enabled = {{ $files.stores.minio.enabled }}
type = "s3"
endpoint = {{ $files.stores.minio.endpoint | quote }}
access-key = {{ $files.stores.minio.access_key | quote }}
secret-key = {{ $files.stores.minio.secret_key | quote }}
bucket = {{ $files.stores.minio.bucket | quote }}
}
}
}
{{- $addons := $server.addons }}
addons = {
enabled = {{ $addons.enabled }}
allow-impure = {{ $addons.allow_impure }}
allowed-urls = [
{{- range initial $addons.allowed_urls }}
{{ . | quote }},
{{- end }}
{{ last $addons.allowed_urls | quote }}
]
denied-urls = [
{{- range initial $addons.denied_urls }}
{{ . | quote }},
{{- end }}
{{ last $addons.denied_urls | quote }}
]
}
}
}
---
apiVersion: v1
kind: Secret
type: Opaque
metadata:
name: {{ $joexSecretName }}
labels:
{{- include "tc.common.labels" . | nindent 4 }}
stringData:
joex.conf: |
docspell.joex {
app-id = {{ $joexID | quote }}
base-url = {{ printf "%v:%v" "http://localhost" .Values.service.joex.ports.joex.port | quote }}
bind {
address = "0.0.0.0"
port = {{ .Values.service.joex.ports.joex.port }}
}
{{- $logging := $joex.logging }}
logging {
format = {{ $logging.format | quote }}
minimum-level = {{ $logging.minimum_level | quote }}
levels = {
"docspell" = {{ $logging.levels.docspell | quote }}
"org.flywaydb" = {{ $logging.levels.flywaydb | quote }}
"binny" = {{ $logging.levels.binny | quote }}
"org.http4s" = {{ $logging.levels.http4s | quote }}
}
}
jdbc {
url = {{ printf "jdbc:postgresql://%v-%v:5432/%v" .Release.Name "postgresql" .Values.postgresql.postgresqlDatabase | quote }}
user = {{ .Values.postgresql.postgresqlUsername | quote }}
password = {{ .Values.postgresql.postgresqlPassword | trimAll "\"" | quote }}
}
{{- $database_schema := $joex.database_schema }}
database-schema = {
run-main-migrations = {{ $database_schema.run_main_migrations }}
run-fixup-migrations = {{ $database_schema.run_fixup_migrations }}
repair-schema = {{ $database_schema.repair_schema }}
}
mail-debug = {{ $joex.mail_debug }}
send-mail {
list-id = {{ $joex.send_mail.list_id | quote }}
}
{{- $scheduler := $joex.scheduler }}
scheduler {
name = {{ $joexID | quote }}
pool-size = {{ $scheduler.pool_size }}
counting-scheme = {{ $scheduler.counting_scheme | quote }}
retries = {{ $scheduler.retries }}
retry-delay = {{ $scheduler.retry_delay | quote }}
log-buffer-size = {{ $scheduler.log_buffer_size }}
wakeup-period = {{ $scheduler.wakeup_period | quote }}
}
{{- $periodic_scheduler := $joex.periodic_scheduler }}
periodic-scheduler {
name = {{ $joexID | quote }}
wakeup-period = {{ $periodic_scheduler.wakeup_period | quote }}
}
{{- $user_tasks := $joex.user_tasks }}
user-tasks {
scan-mailbox {
max-folders = {{ $user_tasks.scan_mailbox.max_folders }}
mail-chunk-size = {{ $user_tasks.scan_mailbox.mail_chunk_size }}
max-mails = {{ $user_tasks.scan_mailbox.max_mails }}
}
}
{{- $house_keeping := $joex.house_keeping }}
house-keeping {
schedule = {{ $house_keeping.schedule | quote }}
cleanup-invites = {
enabled = {{ $house_keeping.cleanup_invites.enabled }}
older-than = {{ $house_keeping.cleanup_invites.older_than | quote }}
}
cleanup-remember-me = {
enabled = {{ $house_keeping.cleanup_remember_me.enabled }}
older-than = {{ $house_keeping.cleanup_remember_me.older_than | quote }}
}
cleanup-jobs = {
enabled = {{ $house_keeping.cleanup_jobs.enabled }}
older-than = {{ $house_keeping.cleanup_jobs.older_than | quote }}
delete-batch = {{ $house_keeping.cleanup_jobs.delete_batch | quote }}
}
cleanup-downloads = {
enabled = {{ $house_keeping.cleanup_downloads.enabled }}
older-than = {{ $house_keeping.cleanup_downloads.older_than | quote }}
}
check-nodes {
enabled = {{ $house_keeping.check_nodes.enabled }}
min-not-found = {{ $house_keeping.check_nodes.min_not_found }}
}
integrity-check {
enabled = {{ $house_keeping.integrity_check.enabled }}
}
}
update-check {
enabled = {{ $house_keeping.update_check.enabled }}
test-run = {{ $house_keeping.update_check.test_run }}
schedule = {{ $house_keeping.update_check.schedule | quote }}
sender-account = {{ $house_keeping.update_check.sender_account | quote }}
smtp-id = {{ $house_keeping.update_check.smtp_id | quote }}
recipients = [
{{- range initial $house_keeping.update_check.recipients }}
{{ . | quote }},
{{- end }}
{{ last $house_keeping.update_check.recipients | quote }}
]
subject = {{ $house_keeping.update_check.subject | quote }}
body = {{ $house_keeping.update_check.body | quote }}
}
{{- $extraction := $joex.extraction }}
extraction {
pdf {
min-text-len = {{ $extraction.pdf.min_text_length }}
}
preview {
dpi = {{ $extraction.preview.dpi }}
}
ocr {
max-image-size = {{ $extraction.ocr.max_image_size }}
page-range {
begin = {{ $extraction.ocr.page_range.begin }}
}
ghostscript {
command {
program = {{ $extraction.ghostscript.command.program | quote }}
args = [
{{- range initial $extraction.ghostscript.command.args }}
{{ . | quote }},
{{- end }}
{{ last $extraction.ghostscript.command.args | quote }}
]
timeout = {{ $extraction.ghostscript.command.timeout | quote }}
}
working-dir = {{ $extraction.ghostscript.working_dir | quote }}
}
unpaper {
command {
program = {{ $extraction.unpaper.command.program | quote }}
args = [
{{- range initial $extraction.unpaper.command.args }}
{{ . | quote }},
{{- end }}
{{ last $extraction.unpaper.command.args | quote }}
]
timeout = {{ $extraction.unpaper.command.timeout | quote }}
}
}
tesseract {
command {
program = {{ $extraction.tesseract.command.program | quote }}
args = [
{{- range initial $extraction.tesseract.command.args }}
{{ . | quote }},
{{- end }}
{{ last $extraction.tesseract.command.args | quote }}
]
timeout = {{ $extraction.tesseract.command.timeout | quote }}
}
}
}
}
{{- $text_analysis := $joex.text_analysis }}
text-analysis {
max-length = {{ $text_analysis.max_length }}
working-dir = {{ $text_analysis.working_dir | quote }}
nlp {
mode = {{ $text_analysis.nlp.mode }}
clear-interval = {{ $text_analysis.nlp.clear_interval | quote }}
max-due-date-years = {{ $text_analysis.nlp.max_due_date_years }}
regex-ner {
max-entries = {{ $text_analysis.nlp.regex_ner.max_entries }}
file-cache-time = {{ $text_analysis.nlp.regex_ner.file_cache_time }}
}
}
{{- $classification := $joex.classification }}
classification {
enabled = {{ $classification.enabled }}
item-count = {{ $classification.item_count }}
classifiers = [
{
"useSplitWords" = "{{ $classification.classifiers.useSplitWords }}"
"splitWordsTokenizerRegexp" = {{ $classification.classifiers.splitWordsTokenizerRegexp }}
"splitWordsIgnoreRegexp" = {{ $classification.classifiers.splitWordsIgnoreRegexp }}
"useSplitPrefixSuffixNGrams" = "{{ $classification.classifiers.useSplitPrefixSuffixNGrams }}"
"maxNGramLeng" = "{{ $classification.classifiers.maxNGramLeng }}"
"minNGramLeng" = "{{ $classification.classifiers.minNGramLeng }}"
"splitWordShape" = "{{ $classification.classifiers.intern }}"
"intern" = "{{ $classification.classifiers.intern }}"
}
]
}
}
{{- $convert := $joex.convert }}
convert {
chunk-size = {{ $convert.chunk_size }}
converted-filename-part = {{ $convert.converted_filename_part }}
max-image-size = {{ $convert.max_image_size }}
markdown {
internal-css = """
{{ $convert.markdown.internal_css }}
"""
}
wkhtmlpdf {
command = {
program = {{ $convert.wkhtmlpdf.command.program | quote }}
args = [
{{- range initial $convert.wkhtmlpdf.command.args }}
{{ . | quote }},
{{- end }}
{{ last $convert.wkhtmlpdf.command.args | quote }}
]
timeout = {{ $convert.wkhtmlpdf.command.timeout | quote }}
}
working-dir = {{ $convert.wkhtmlpdf.working_dir | quote }}
}
tesseract = {
command = {
program = {{ $convert.tesseract.command.program | quote }}
args = [
{{- range initial $convert.tesseract.command.args }}
{{ . | quote }},
{{- end }}
{{ last $convert.tesseract.command.args | quote }}
]
timeout = {{ $convert.tesseract.command.timeout | quote }}
}
working-dir = {{ $convert.tesseract.working_dir | quote }}
}
unoconv = {
command = {
program = {{ $convert.unoconv.command.program | quote }}
args = [
{{- range initial $convert.unoconv.command.args }}
{{ . | quote }},
{{- end }}
{{ last $convert.unoconv.command.args | quote }}
]
timeout = {{ $convert.unoconv.command.timeout | quote }}
}
working-dir = {{ $convert.unoconv.working_dir | quote }}
}
ocrmypdf = {
enabled = {{ $convert.ocrmypdf.enabled }}
command = {
program = {{ $convert.ocrmypdf.command.program | quote }}
args = [
{{- range initial $convert.ocrmypdf.command.args }}
{{ . | quote }},
{{- end }}
{{ last $convert.ocrmypdf.command.args | quote }}
]
timeout = {{ $convert.ocrmypdf.command.timeout | quote }}
}
working-dir = {{ $convert.ocrmypdf.working_dir | quote }}
}
decrypt-pdf = {
enabled = {{ $convert.decrypt_pdf.enabled }}
passwords = [
{{- range initial $convert.decrypt_pdf.passwords }}
{{ . | quote }},
{{- end }}
{{ last $convert.decrypt_pdf.passwords | quote }}
]
}
}
{{- $files := $joex.files }}
files {
chunk-size = {{ $files.chunk_size }}
valid-mime-types = [
{{- range initial $files.valid_mime_types }}
{{ . | quote }},
{{- end }}
{{ last $files.valid_mime_types | quote }}
]
default-store = {{ $files.default_store | quote }}
stores = {
database = {
enabled = {{ $files.stores.database.enabled }}
type = "default-database"
}
filesystem = {
enabled = {{ $files.stores.filesystem.enabled }}
type = "file-system"
directory = {{ $files.stores.filesystem.directory | quote }}
}
minio = {
enabled = {{ $files.stores.minio.enabled }}
type = "s3"
endpoint = {{ $files.stores.minio.endpoint | quote }}
access-key = {{ $files.stores.minio.access_key | quote }}
secret-key = {{ $files.stores.minio.secret_key | quote }}
bucket = {{ $files.stores.minio.bucket | quote }}
}
}
}
{{- $full_text_search := $joex.full_text_search }}
full-text-search {
enabled = true
backend = "solr"
solr = {
url = {{ printf "http://%v:%v@%v-solr:8983/%v" .Values.solr.solrUsername (.Values.solr.solrPassword | trimAll "\"") .Release.Name .Values.solr.solrCores | quote }}
commit-within = {{ $full_text_search.solr.commit_within }}
log-verbose = {{ $full_text_search.solr.log_verbose }}
def-type = {{ $full_text_search.solr.def_type | quote }}
q-op = {{ $full_text_search.solr.q_op | quote }}
}
postgresql = {
use-default-connection = false
jdbc {
url = {{ printf "jdbc:postgresql://%v-%v:5432/%v" .Release.Name "postgresql" .Values.postgresql.postgresqlDatabase | quote }}
user = {{ .Values.postgresql.postgresqlUsername | quote }}
password = {{ .Values.postgresql.postgresqlPassword | trimAll "\"" | quote }}
}
pg-config = {
}
pg-query-parser = "websearch_to_tsquery"
pg-rank-normalization = [ 4 ]
}
migration = {
index-all-chunk = {{ $full_text_search.migration.index_all_chunk }}
}
}
{{- $addons := $joex.addons }}
addons {
working-dir = {{ $addons.working_dir | quote }}
cache-dir = {{ $addons.cache_dir | quote }}
executor-config {
runner = {{ $addons.executor_config.runner | quote }}
nspawn = {
enabled = false
sudo-binary = "sudo"
nspawn-binary = "systemd-nspawn"
container-wait = "100 millis"
}
fail-fast = {{ $addons.executor_config.fail_fast }}
run-timeout = {{ $addons.executor_config.run_timeout | quote }}
nix-runner {
nix-binary = "nix"
build-timeout = "15 minutes"
}
docker-runner {
docker-binary = "docker"
build-timeout = "15 minutes"
}
}
}
}
{{- end -}}


@@ -0,0 +1,13 @@
{{/* Make sure all variables are set properly */}}
{{- include "tc.common.loader.init" . }}
{{/* Render secret */}}
{{- include "docspell.secret" . }}
{{- $_ := set .Values.additionalContainers "joex" (include "docspell.joex" . | fromYaml) -}}
{{- if and .Values.dsc.enabled .Values.rest_server.integration_endpoint.enabled -}}
{{- $_ := set .Values.additionalContainers "dsc" (include "docspell.dsc" . | fromYaml) -}}
{{- end -}}
{{/* Render the templates */}}
{{ include "tc.common.loader.apply" . }}



@@ -0,0 +1,4 @@
icon_url: https://truecharts.org/img/hotlink-ok/chart-icons/docspell.png
categories:
- productivity


@@ -0,0 +1,6 @@
# Changelog
## fireflyiii-data-importer-0.0.1 (2022-10-08)


@@ -0,0 +1,6 @@
dependencies:
- name: common
repository: https://library-charts.truecharts.org
version: 10.6.11
digest: sha256:b4a28b7604b153caed40b6ad96692b5ebcaac53d09ce9e190691162b59ce25c3
generated: "2022-10-08T23:46:49.437496812Z"


@@ -0,0 +1,34 @@
apiVersion: v2
kubeVersion: ">=1.16.0-0"
name: fireflyiii-data-importer
version: 0.0.1
appVersion: "0.9.16"
description: Firefly III Data Importer.
type: application
deprecated: false
home: https://truecharts.org/docs/charts/incubator/fireflyiii-data-importer
icon: https://truecharts.org/img/hotlink-ok/chart-icons/fireflyiii-data-importer.png
keywords:
- fireflyiii-data-importer
- data
- tool
sources:
- https://github.com/truecharts/charts/tree/master/charts/incubator/fireflyiii-data-importer
- https://docs.firefly-iii.org/data-importer/install/configure/
- https://hub.docker.com/r/fireflyiii/data-importer
- https://github.com/firefly-iii/data-importer
dependencies:
- name: common
repository: https://library-charts.truecharts.org
version: 10.6.11
maintainers:
- email: info@truecharts.org
name: TrueCharts
url: https://truecharts.org
annotations:
truecharts.org/catagories: |
- financial
- tool
- data
truecharts.org/SCALE-support: "true"
truecharts.org/grade: U


@@ -0,0 +1,108 @@
# fireflyiii-data-importer
Firefly III Data Importer.
TrueCharts can be installed both as *normal* Helm Charts and as Apps on TrueNAS SCALE.
This readme is just an automatically generated general guide on installing our Helm Charts and Apps.
For more information, please click here: [fireflyiii-data-importer](https://truecharts.org/docs/charts/incubator/fireflyiii-data-importer)
**This chart is not maintained by the upstream project and any issues with the chart should be raised [here](https://github.com/truecharts/charts/issues/new/choose)**
## Source Code
* <https://github.com/truecharts/charts/tree/master/charts/incubator/fireflyiii-data-importer>
* <https://docs.firefly-iii.org/data-importer/install/configure/>
* <https://hub.docker.com/r/fireflyiii/data-importer>
* <https://github.com/firefly-iii/data-importer>
## Requirements
Kubernetes: `>=1.16.0-0`
## Dependencies
| Repository | Name | Version |
|------------|------|---------|
| https://library-charts.truecharts.org | common | 10.6.11 |
## Installing the Chart
### TrueNAS SCALE
To install this Chart on TrueNAS SCALE check our [Quick-Start Guide](https://truecharts.org/docs/manual/SCALE%20Apps/Installing-an-App).
### Helm
To install the chart with the release name `fireflyiii-data-importer`:
```console
helm repo add TrueCharts https://charts.truecharts.org
helm repo update
helm install fireflyiii-data-importer TrueCharts/fireflyiii-data-importer
```
## Uninstall
### TrueNAS SCALE
**Upgrading, Rolling Back and Uninstalling the Chart**
To upgrade, rollback or delete this Chart from TrueNAS SCALE check our [Quick-Start Guide](https://truecharts.org/docs/manual/SCALE%20Apps/Upgrade-rollback-delete-an-App).
### Helm
To uninstall the `fireflyiii-data-importer` deployment:
```console
helm uninstall fireflyiii-data-importer
```
## Configuration
### Helm
#### Available Settings
Read through the values.yaml file. It has several commented-out suggested values.
Additional settings are inherited from the [values.yaml](https://github.com/truecharts/library-charts/tree/main/charts/stable/common/values.yaml) of the [common library](https://github.com/k8s-at-home/library-charts/tree/main/charts/stable/common).
#### Configure using the command line
Specify each parameter using the `--set key=value[,key=value]` argument to `helm install`.
```console
helm install fireflyiii-data-importer \
--set env.TZ="America/New_York" \
TrueCharts/fireflyiii-data-importer
```
#### Configure using a yaml file
Alternatively, a YAML file that specifies the values for the above parameters can be provided while installing the chart.
```console
helm install fireflyiii-data-importer TrueCharts/fireflyiii-data-importer -f values.yaml
```
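For reference, a minimal `values.yaml` override might look like the sketch below. The keys come from this chart's user-defined defaults (`env` and `secretEnv`); the URL and token values are placeholders you must replace with your own:

```yaml
# Example override for TrueCharts/fireflyiii-data-importer.
# All values below are placeholders -- substitute your own instance details.
env:
  FIREFLY_III_URL: "https://firefly.example.com"
  TZ: "America/New_York"
secretEnv:
  FIREFLY_III_ACCESS_TOKEN: "your-personal-access-token"
```

Secrets placed under `secretEnv` are rendered into a Kubernetes Secret rather than plain environment variables in the pod spec.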
#### Connecting to other charts
If you need to connect this Chart to other Charts on TrueNAS SCALE, please refer to our [Linking Charts Internally](https://truecharts.org/docs/manual/SCALE%20Apps/linking-apps) quick-start guide.
## Support
- Please check our [quick-start guides for TrueNAS SCALE](https://truecharts.org/docs/manual/SCALE%20Apps/Important-MUST-READ).
- See the [Website](https://truecharts.org)
- Check our [Discord](https://discord.gg/tVsPTHWTtr)
- Open an [issue](https://github.com/truecharts/apps/issues/new/choose)
---
## Sponsor TrueCharts
TrueCharts can only exist due to the incredible effort of our staff.
Please consider making a [donation](https://truecharts.org/sponsor) or contributing back to the project any way you can!
---
All Rights Reserved - The TrueCharts Project


@@ -0,0 +1,8 @@
Firefly III Data Importer.
This App is supplied by TrueCharts, for more information visit the manual: [https://truecharts.org/docs/charts/incubator/fireflyiii-data-importer](https://truecharts.org/docs/charts/incubator/fireflyiii-data-importer)
---
TrueCharts can only exist due to the incredible effort of our staff.
Please consider making a [donation](https://truecharts.org/docs/about/sponsor) or contributing back to the project any way you can!


@@ -0,0 +1,36 @@
image:
repository: tccr.io/truecharts/fireflyiii-fidi
pullPolicy: IfNotPresent
tag: 0.9.16@sha256:53462b22258af5dabe98919f6c0c4ebaf7bc22c597ba840320fc208f13c3a745
securityContext:
readOnlyRootFilesystem: false
runAsNonRoot: false
podSecurityContext:
runAsUser: 0
runAsGroup: 0
secretEnv:
# User Defined
FIREFLY_III_ACCESS_TOKEN: ""
NORDIGEN_ID: ""
NORDIGEN_KEY: ""
SPECTRE_APP_ID: ""
SPECTRE_SECRET: ""
env:
# User Defined
FIREFLY_III_URL: ""
VANITY_URL: ""
service:
main:
ports:
main:
port: 10580
protocol: HTTP
targetPort: 8080
portal:
enabled: true

File diff suppressed because it is too large


@@ -0,0 +1 @@
{{ include "tc.common.loader.all" . }}


@@ -0,0 +1,6 @@
icon_url: https://truecharts.org/img/hotlink-ok/chart-icons/fireflyiii-data-importer.png
categories:
- financial
- tool
- data