Initial set of backend APIs for event webhook notifications for the following events:

* Crawl started (including a boolean indicating whether the crawl was scheduled)
* Crawl finished
* Upload finished
* Archived item added to collection
* Archived item removed from collection

Webhook URLs are configured via `/api/orgs/<oid>/event-webhook-urls`. If a URL is configured for a given event, a webhook notification is added to the database and then sent (up to a total of 5 tries per overall attempt, with increasing backoff between tries, implemented with the `backoff` library, which supports async; see the sketch below).

Webhook status is available via `/api/orgs/<oid>/webhooks`.

(Additional testing + potential FastAPI integration left to separate follow-ups.)

Fixes #1041
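As a rough illustration of the retry behavior described above, here is a minimal sketch using the `backoff` library with `aiohttp`; the function name and payload shape are assumptions for illustration, not the actual implementation:

```python
# Minimal sketch of a retried webhook POST (assumed helper, not the real code).
import aiohttp
import backoff


# Retry up to 5 times total with exponential backoff on connection/HTTP errors.
# backoff.on_exception also works when decorating async functions.
@backoff.on_exception(backoff.expo, aiohttp.ClientError, max_tries=5)
async def send_webhook_notification(url: str, body: dict) -> None:
    async with aiohttp.ClientSession() as session:
        async with session.post(url, json=body) as resp:
            # Raise on 4xx/5xx responses so the decorator retries the request
            resp.raise_for_status()
```

Each overall attempt would thus make at most 5 HTTP requests before giving up, matching the retry behavior described above.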
CrawlJob template (YAML):
```yaml
apiVersion: btrix.cloud/v1
kind: CrawlJob
metadata:
  name: crawljob-{{ id }}
  labels:
    crawl: "{{ id }}"
    role: "job"
    oid: "{{ oid }}"
    userid: "{{ userid }}"

spec:
  selector:
    matchLabels:
      crawl: "{{ id }}"

  id: "{{ id }}"
  userid: "{{ userid }}"
  cid: "{{ cid }}"
  oid: "{{ oid }}"
  scale: {{ scale }}
  maxCrawlSize: {{ max_crawl_size }}
  manual: {{ manual }}
  ttlSecondsAfterFinished: 30

  {% if expire_time %}
  expireTime: "{{ expire_time }}"
  {% endif %}
```
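For context, this is a Jinja2 template; a minimal sketch of rendering it into a Kubernetes object is shown below. The template path and the variable values are illustrative assumptions, not the actual deployment code:

```python
# Sketch: render the CrawlJob template and parse the resulting YAML.
import yaml
from jinja2 import Environment, FileSystemLoader

env = Environment(loader=FileSystemLoader("app-templates"))  # assumed directory
template = env.get_template("crawl_job.yaml")  # assumed filename

rendered = template.render(
    id="abc123",
    cid="def456",
    oid="org-1",
    userid="user-1",
    scale=1,
    max_crawl_size=0,
    manual=True,
    expire_time=None,  # optional block is omitted when falsy
)

crawl_job = yaml.safe_load(rendered)
print(crawl_job["spec"]["scale"])  # -> 1
```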