browsertrix/backend/main.py
Ilya Kreymer adb5c835f2
Presign and replay (#127)
* support for replay via replayweb.page embed, fixes #124

backend:
- pre-sign all file URLs
- cache pre-signed URLs in redis, presign again when expired (default duration 3600 seconds, settable via PRESIGN_DURATION_SECONDS env var); see the caching sketch after this list
- change files output -> resources to conform to the Data Package spec supported by replayweb.page
- add CrawlFileOut which contains 'name' (file id), 'path' (presigned url), 'hash', and 'size'
- add /replay/sw.js endpoint to import sw.js from latest replay-web-page release
- update to fastapi-users 9.2.2
- customize backend auth to also check the 'auth_bearer' query arg when the 'Authorization' header is not set (see the token fallback sketch after this list)
- remove sw.js endpoint, handling in frontend
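
A minimal sketch of the presign-and-cache flow described above. This is not the code from this PR: `storage.presign()` is a hypothetical stand-in for the real S3 presign call, and `redis` is assumed to be an async Redis client created with `decode_responses=True`:

```python
import os

# same default and env var as this PR
PRESIGN_DURATION = int(os.environ.get("PRESIGN_DURATION_SECONDS", "3600"))


async def get_presigned_url(redis, storage, filename: str) -> str:
    """Return a presigned URL for filename, cached in redis."""
    key = f"presigned:{filename}"
    url = await redis.get(key)
    if not url:
        # hypothetical helper standing in for the actual S3 presign call
        url = await storage.presign(filename, duration=PRESIGN_DURATION)
        # expire the cache entry together with the signed URL itself,
        # so a stale URL is never served
        await redis.set(key, url, ex=PRESIGN_DURATION)
    return url
```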
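
The 'auth_bearer' fallback can be expressed as a plain FastAPI dependency. This is a sketch of the idea only, not the actual fastapi-users customization in this PR:

```python
from fastapi import HTTPException, Request


async def get_token(request: Request) -> str:
    """Return the bearer token from the Authorization header,
    falling back to the 'auth_bearer' query arg."""
    auth = request.headers.get("Authorization", "")
    if auth.startswith("Bearer "):
        return auth[len("Bearer "):]
    token = request.query_params.get("auth_bearer")
    if token:
        return token
    raise HTTPException(status_code=401, detail="Not authenticated")
```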

frontend:
- add <replay-web-page> to frontend, include rwp ui.js from latest release in index.html for now
- update crawl api endpoint to end in json
- replay-web-page loads the api endpoint directly!
- update Crawl type to use the new format: 'files' -> 'resources', where each file has 'name' and 'path'

- nginx: add endpoint to serve the replay sw.js
- add defer attr to ui.js
- move 'Download' to 'Download Files'

* frontend: support customizing replayweb.page loading url via RWP_BASE_URL env var in Dockerfile
- default prod value set in frontend Dockerfile (set to upcoming 1.5.8 release, needed for multi-wacz-file support); can be overridden during image build via --build-arg
- rename index.html -> index.ejs to allow interpolation
- RWP_BASE_URL defaults to latest https://replayweb.page/ for testing
- for local testing, add sw.js loading via devServer, also using RWP_BASE_URL (#131)

Co-authored-by: sua yoo <sua@suayoo.com>
2022-01-31 17:02:15 -08:00

103 lines
2.4 KiB
Python

"""
main file for browsertrix-api system
supports docker and kubernetes based deployments of multiple browsertrix-crawlers
"""
import os

from fastapi import FastAPI

from db import init_db
from emailsender import EmailSender
from invites import init_invites
from users import init_users_api, init_user_manager, JWT_TOKEN_LIFETIME
from archives import init_archives_api
from storages import init_storages_api
from crawlconfigs import init_crawl_config_api
from colls import init_collections_api
from crawls import init_crawls_api


app = FastAPI()


# ============================================================================
# pylint: disable=too-many-locals
def main():
    """init browsertrix cloud api"""
    email = EmailSender()
    crawl_manager = None

    mdb = init_db()

    settings = {
        "registrationEnabled": os.environ.get("REGISTRATION_ENABLED") == "1",
        "jwtTokenLifetime": JWT_TOKEN_LIFETIME,
    }

    invites = init_invites(mdb, email)

    user_manager = init_user_manager(mdb, email, invites)

    fastapi_users = init_users_api(app, user_manager)

    current_active_user = fastapi_users.current_user(active=True)

    archive_ops = init_archives_api(
        app, mdb, user_manager, invites, current_active_user
    )

    user_manager.set_archive_ops(archive_ops)

    # choose the crawl manager based on the deployment environment
    # pylint: disable=import-outside-toplevel
    if os.environ.get("KUBERNETES_SERVICE_HOST"):
        from k8sman import K8SManager

        crawl_manager = K8SManager()
    else:
        from dockerman import DockerManager

        crawl_manager = DockerManager(archive_ops)

    init_storages_api(archive_ops, crawl_manager, current_active_user)

    crawl_config_ops = init_crawl_config_api(
        mdb,
        current_active_user,
        archive_ops,
        crawl_manager,
    )

    crawls = init_crawls_api(
        app,
        mdb,
        os.environ.get("REDIS_URL"),
        user_manager,
        crawl_manager,
        crawl_config_ops,
        archive_ops,
    )

    coll_ops = init_collections_api(mdb, crawls, archive_ops, crawl_manager)

    crawl_config_ops.set_coll_ops(coll_ops)

    app.include_router(archive_ops.router)

    @app.get("/settings")
    async def get_settings():
        return settings

    @app.get("/healthz")
    async def healthz():
        return {}


# ============================================================================
@app.on_event("startup")
async def startup():
    """init on startup"""
    main()
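

# ----------------------------------------------------------------------------
# Not part of the original file: a hypothetical local entrypoint, assuming
# uvicorn is installed. Production deployments run under Docker or Kubernetes
# as the module docstring notes; the startup hook above calls main() to
# register all routes before the first request is served.
if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)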