- Updates browser profile list styles to match other data table styles
- Makes entire collection item clickable
- Refactors row click area to fix text overflow
New features & enhancements:
- New UI for collection item selection dialog
- Consistent data table styles for collection list and collection item
list
Refactors:
- Adds `btrix-table` as low-level table component
- Adds `btrix-archived-item-list`, removes `checkbox-list` and
deprecates `crawl-list`
- Upgrades Shoelace for `sl-tree` fixes
- Fixes `ArchivedItem` typing
Fixes #1255
### Changes
- Fixes incorrect time zone conversion when generating UTC schedule in
workflow.
- Fixes minute input display not prefixing single digits with `0`
Co-authored-by: emma <hi@emma.cafe>
Fixes #1341
Adds "User Agent" field to workflow editor under the Browser Settings
tab. If not set, the crawler will use the browser's default user agent.
Also added to docs and to the workflow details page (if set).
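A minimal sketch, assuming the flag and field names, of how the optional
setting might flow from the workflow config into the crawler invocation:

```python
from typing import Optional

from pydantic import BaseModel


class BrowserSettings(BaseModel):
    """Hypothetical slice of the workflow config model."""

    userAgent: Optional[str] = None


def user_agent_args(settings: BrowserSettings) -> list:
    """Pass --userAgent to the crawler only when set, so the browser's
    default user agent is used otherwise (flag name is an assumption)."""
    if settings.userAgent:
        return ["--userAgent", settings.userAgent]
    return []
```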
---------
Co-authored-by: Henry Wilkinson <henry@wilkinson.graphics>
Co-authored-by: Ilya Kreymer <ikreymer@users.noreply.github.com>
Fixes #1385
## Changes
Supports multiple crawler 'channels', which can be configured to point at
different browsertrix-crawler versions (see the sketch after this list)
- Replaces `crawler_image` in the helm chart with a `crawler_channels`
array, similar to how storages are handled
- The `default` crawler channel must always be provided and specifies
the default crawler image
- Adds backend `/orgs/{oid}/crawlconfigs/crawler-channels` API endpoint to
fetch information about available crawler versions (name, image, and
label), with an accompanying test
- Adds crawler channel select to workflow creation/edit screens and
profile creation dialog, and updates related API endpoints and
configmaps accordingly. The select dropdown is shown only if more than
one channel is configured.
- Adds `crawlerChannel` to workflow and crawl details.
- Adds `image` to the crawl data, used to display the actual crawler image
used as part of the crawl
- Modifies the `crawler_crawl_id` backend test fixture to use the `test`
crawler version to ensure that crawler versions other than the latest work
- Adds migration to add `crawlerChannel` set to `default` to existing
workflow and profile objects and workflow configmaps
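A minimal sketch, assuming model and field names (following the
name/image/label description above), of how a channel is resolved to a
crawler image on the backend:

```python
from typing import List, Optional

from pydantic import BaseModel


class CrawlerChannel(BaseModel):
    """One entry from the helm chart's crawler_channels array."""

    name: str   # e.g. "default" or "test"
    image: str  # browsertrix-crawler image for this channel
    label: Optional[str] = None


def get_channel_image(channels: List[CrawlerChannel], name: str) -> Optional[str]:
    """Resolve a channel name to its image, falling back to the
    required 'default' channel when the name is unknown."""
    by_name = {c.name: c.image for c in channels}
    return by_name.get(name, by_name.get("default"))
```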
---------
Co-authored-by: Ilya Kreymer <ikreymer@gmail.com>
Co-authored-by: Henry Wilkinson <henry@wilkinson.graphics>
Resolves https://github.com/webrecorder/browsertrix-cloud/issues/1333
- Moves "Select Crawls" / "Select Uploads" steps into a single "Select
Archived Items" dialog
- Refactors the new collection metadata dialog to support editing an
existing collection
- Prevents RWP component from rendering if there are no archived items
(@Shrinks99 commented on this in Figma, but this also prevents unnecessary
requests when there isn't an archive to replay)
- Shows collection description at bottom of detail page at all times
(@Shrinks99: seems useful to see even on the archived items view?)
- Switches collection detail primary action to "Add Archived Items" if
none are included (cc @Shrinks99)
- Displays friendlier "name taken" error
- Removes unused Collection edit route
- Upgrades markdown dependencies for fixes/improvements to description
editing
---------
Co-authored-by: Henry Wilkinson <henry@wilkinson.graphics>
Fixes #1358
- Adds `extraExecMinutes` and `giftedExecMinutes` org quotas, which are
not reset monthly but are updateable amounts that carry across months
- Adds `quotaUpdate` field to `Organization` to track when quotas were
updated, with a timestamp
- Adds `extraExecMinutesAvailable` and `giftedExecMinutesAvailable`
fields to `Organization` to help with tracking available time left
(includes tested migration to initialize these to 0)
- Modifies org backend to track time across multiple categories, drawing
down `monthlyExecSeconds`, then `giftedExecSeconds`, then
`extraExecSeconds` (see the sketch after this list). All time is also
written into `crawlExecSeconds`, which is now the monthly total and also
contains any overage time above the quotas
- Updates Dashboard crawling meter to include all types of execution
time if `extraExecMinutes` and/or `giftedExecMinutes` are set above 0
- Updates Dashboard Usage History table to include all types of
execution time (only displaying columns that have data)
- Adds backend nightly test to check handling of quotas and execution
time
- Includes migration to add new fields and copy crawlExecSeconds to
monthlyExecSeconds for previous months
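A minimal sketch of the attribution order described above; the bucket
capacities here are simplified assumptions (the real schema tracks
remaining time via the `*ExecMinutesAvailable` fields):

```python
def track_exec_seconds(org: dict, seconds: int, monthly_quota_secs: int) -> None:
    """Draw down quota buckets in order: monthly, then gifted, then
    extra. All time is also written to crawlExecSeconds, the monthly
    total, which may exceed the quotas (overage is not clamped)."""
    org["crawlExecSeconds"] = org.get("crawlExecSeconds", 0) + seconds
    remaining = seconds

    buckets = (
        ("monthlyExecSeconds", monthly_quota_secs),
        ("giftedExecSeconds", org.get("giftedExecMinutesAvailable", 0) * 60),
        ("extraExecSeconds", org.get("extraExecMinutesAvailable", 0) * 60),
    )
    for field, capacity in buckets:
        if remaining <= 0:
            break
        used = org.get(field, 0)
        spend = min(remaining, max(capacity - used, 0))
        org[field] = used + spend
        remaining -= spend
```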
Co-authored-by: emma <hi@emma.cafe>
Closes #1405
- Properly uses `typescript-eslint`: we were missing its preset, so some
default `eslint` rules (that don't work properly with TypeScript) were
being applied, causing false positives
- I also moved the `eslint` config into its own file, and enabled
`typescript-eslint`'s type-awareness, so that we can enable more
type-aware rules in the future if we like
- Adds `ts-lit-plugin` to the TypeScript config, which _hopefully_ will
allow us to catch issues during build (in CI)
- It looks like `ts-lit-plugin` is sort of abandonware at the moment,
and unfortunately _doesn't_ actually work for this purpose right now,
but the lit team is working on a replacement here:
https://www.npmjs.com/package/@lit-labs/analyzer
- Adds `fork-ts-checker-webpack-plugin`, which allows the typescript
checking process to be run on a separate forked thread in Webpack, which
can help speed up builds & checking
- Enables incremental type checking for better speed
- Fixes a whole bunch of `eslint`-auto-fixable issues (unused imports
and variables, some type issues, etc)
- Fixes a bunch of `lit-analyzer` issues (mostly attribute naming, some
type issues as well)
- Fixes various other type issues:
- Improves type safety in a bunch of places, notably anywhere `apiFetch`
and `APIPaginatedList` are used
- Removes some `any`s
- Adds confirmation dialog for workflow crawls
- Changes archived item confirmation from the default browser dialog to a
Shoelace dialog
- Increases dialog title size
- Out of scope: Localizes other workflow detail confirmation buttons
- Out of scope: Rewords a missed "Archive" reference in the file uploader
Closes #1294
### Changes
- `crawl-list` component
- Adds a check for whether there are any items in the actions menu; if
not, skips rendering the actions menu
- This allows us to give the component no actions, which is currently
required to remove them for viewers!
- Collection Details
- Hides "Remove from Collection" option for viewers
- Crawls List
- Removes the single "View Crawl Details" option from archived items for
viewers
- All the other actions were already set up correctly to be used by all
roles!
- Dashboard
- Hides org settings gear icon button unless the user is an admin
- Hides "Create New" dropdown for viewers
- Workflow Details
- Hides workflow edit icon button for viewers
- Hides the "Delete Crawl" option in archived items for viewers
- Hides the "Run Crawl" option for viewers
- Workflow List
- Hides all edit-related options for viewers; the only remaining option is
copying tags
- Removes the deactivate / delete options (which were only visible when a
crawl was running) from the workflow list actions
---------
Co-authored-by: Ilya Kreymer <ikreymer@gmail.com>
Co-authored-by: sua yoo <sua@suayoo.com>
- Adds two new crawl finished states, `stopped_by_user` and
`stopped_quota_reached`
- Tracks other possible 'stop reasons' in the operator, though without
making them distinct states for now
- Updates the frontend with 'Stopped by User' and 'Stopped: Time Quota
Reached', shown with the same icon as the current `partial_complete`
- Adds a migration of `partial_complete` to either `stopped_by_user` or
`complete` (no historical quota data is available)
- Addresses an edge case in scaling: if a crawl never scaled (no redis
entry, no pod), automatically scale down
- Addresses an edge case in status: if a crawl is somehow 'canceled' but
not deleted, immediately delete the crawl object and begin finalizing.
---------
Co-authored-by: Tessa Walsh <tessa@bitarchivist.net>
- instead of restarting the crawler when an exclusion is added or removed,
adds a message to a Redis list (one per crawler instance); see the sketch
after this list
- no longer filters the existing queue on the backend; this is now handled
by the crawler (implemented in 0.12.0 via webrecorder/browsertrix-crawler#408)
- match response optimization: instead of returning the first 1000 matches,
limits the response to 500K and returns however many matches fit in that
response size (for optional pagination on the frontend)
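A minimal sketch, with assumed key and message naming, of queueing an
exclusion change instead of restarting the crawler:

```python
import json

import redis.asyncio as redis


async def queue_exclusion_change(
    r: redis.Redis, crawl_id: str, instance: int, regex: str, add: bool
) -> None:
    """Push an add/remove-exclusion message onto the Redis list for one
    crawler instance; the crawler drains the list and applies the change
    itself (browsertrix-crawler 0.12.0+), so the backend no longer
    filters the existing queue. Key and message shapes are assumptions."""
    msg = {"type": "addExclusion" if add else "removeExclusion", "regex": regex}
    await r.rpush(f"{crawl_id}:msg:{instance}", json.dumps(msg))
```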
Fixes #1261, closes #1092
The quota for monthly execution minutes is treated as a hard cap. Once
it is exceeded, an alert indicating that an org has exceeded its monthly
execution minutes will display and the user will be unable to start new
crawls. Any running crawls will be stopped once the quota is exceeded.
An execution minutes meter bar is also added in the Org Dashboard and
displayed if a quota is set. More detail in #1305 which was
merged into this branch.
## Changes
- Enable superadmins to set `maxExecMinutesPerMonth` in the orgs list quotas
- Enforce the quota by stopping crawls in the operator once it is reached
- Show an alert banner once the execution time quota is hit:
- Once the quota is hit, disable Run Crawl buttons in the frontend, return
a 403 with an `exec_minutes_quota_reached` detail from the crawl config
`/run` endpoint (see the sketch after this list), and don't run new
workflows on creation (similar to the storage quota)
- Display execution time for crawls in the crawl details overview,
immediately below the crawl duration
- Show execution minutes meter on dashboard (from #1305)
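A minimal sketch of the backend guard on the `/run` endpoint; the flag
name on the org object is an assumption:

```python
from fastapi import HTTPException


def assert_exec_minutes_available(org) -> None:
    """Refuse to start new crawls once the monthly execution-minutes
    quota is reached, mirroring the storage-quota behavior; the
    frontend also disables its Run Crawl buttons."""
    if org.execMinutesQuotaReached:  # hypothetical flag on the org
        raise HTTPException(status_code=403, detail="exec_minutes_quota_reached")
```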
---------
Co-authored-by: Henry Wilkinson <henry@wilkinson.graphics>
Co-authored-by: Ilya Kreymer <ikreymer@gmail.com>
Co-authored-by: sua yoo <sua@webrecorder.org>
### Context
- Adds custom padding to each side based on whether the tag is removable
- Improves hover state for the remove button when the tag is focused
- Adds padding to the remove button
### Changes
- URLs on the config review pages are now links that open in a new tab
- Does not do anything with the `Extra URLs in Scope` field (which we currently render as a regex, so it was left alone)
- Hides / removes the previously deep-linked but now broken config section rendering.
Fixes #1050
Major refactor of the user/auth system to remove the fastapi_users
dependency. Refactors users.py to be standalone and adds a new auth.py
module for handling auth. UserManager now works similarly to other ops
classes.
The auth should be fully backwards compatible with fastapi_users auth,
including accepting previous JWT tokens w/o having to re-login. The User
data model in mongodb is also unchanged.
Additional fixes:
- allows updating FastAPI to the latest version
- adds webhook docs to the OpenAPI spec (follow-up to #1041)
API changes:
- Removes the `GET, PATCH, DELETE /users/<id>` endpoints, which were not
in use, as users are scoped to orgs. For deletion, users will probably be
auto-deleted when removed from their last org (to be implemented).
- Renames `/users/me-with-orgs` to just `/users/me/`
- New `PUT /users/me/change-password` endpoint, with the current password
required to update the password; fixes #1269, supersedes #1272
Frontend changes:
- Fixes from #1272 to support new change password endpoint.
---------
Co-authored-by: Tessa Walsh <tessa@bitarchivist.net>
Co-authored-by: sua yoo <sua@suayoo.com>
- Replaces org UUID in URL/browser location bar with org slug.
- Refactor: Adds shared app state utility using https://sijakret.github.io/lit-shared-state/ to
access org data from deep descendants.
- Backwards compatible: org UUID URLs should auto-redirect to org slug URLs.
- Shows the org UUID in the org settings general tab for use with APIs
(Resolves #1258, follows #1279)
- Adds `position: sticky` to the workflow creator / editor controls to
affix them to the bottom of the screen, so they are now always visible!
- Renames "Extra URLs in Scope" to "Extra URL Prefixes in Scope"
- Updates documentation accordingly
- Adjusts casing for checkboxes
- Adds the multiplication sign to the crawler instances settings to
better communicate that they are increases in scale and not arbitrary
numbers.
- Limits URL list entry to 1,000 URLs
- Limits additional URL list entry to 100 URLs
- Shows the first invalid URL in the list in the error message (see the
sketch after this list)
- Quick-and-dirty fix for long URLs wrapping: shows URLs in the list on
one line, with the entire container scrolling
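A minimal sketch of the list validation (in Python for illustration; the
actual checks live in the frontend workflow editor):

```python
from typing import Optional
from urllib.parse import urlparse

MAX_URLS = 1_000  # primary URL list; the additional URL list is capped at 100


def validate_url_list(text: str, max_urls: int = MAX_URLS) -> Optional[str]:
    """Return an error message for the first problem found, or None."""
    urls = [line.strip() for line in text.splitlines() if line.strip()]
    if len(urls) > max_urls:
        return f"Please shorten the list to {max_urls:,} or fewer URLs"
    for url in urls:
        parsed = urlparse(url)
        if parsed.scheme not in ("http", "https") or not parsed.netloc:
            return f"Invalid URL: {url}"  # surface the first invalid URL
    return None
```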
---------
Co-authored-by: Henry Wilkinson <henry@wilkinson.graphics>
- Require that all passwords are between 8 and 64 characters
- Fix account settings password reset form to only trigger the logged-in
event after a successful password change
- Password validation can be extended within the UserManager's
`validate_password` method to add or modify requirements (see the sketch
after this list)
- Add tests for password validation
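A minimal sketch of the length rule as a `validate_password` hook; the
exception type is illustrative:

```python
PASSWORD_MIN_LENGTH = 8
PASSWORD_MAX_LENGTH = 64


class InvalidPasswordException(Exception):
    """Hypothetical error type; raising it rejects the password."""


class UserManager:
    async def validate_password(self, password: str) -> None:
        """Enforce the 8-64 character rule; add or modify
        requirements here to extend validation."""
        if not PASSWORD_MIN_LENGTH <= len(password) <= PASSWORD_MAX_LENGTH:
            raise InvalidPasswordException(
                "Invalid password length, must be between "
                f"{PASSWORD_MIN_LENGTH} and {PASSWORD_MAX_LENGTH} characters"
            )
```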
- Add checkbox that adds the `--failOnFailedSeed` option to URL list
workflows
- If set, and any of the seeds fails, the entire crawl is marked as a
failure
- Add 'Fail Crawl On Failed URL' to crawl workflow setup docs
- Adds "Logs" tab to workflow detail
- Shows error logs in expandable section in "Watch" tab
- Show corresponding message (no logs yet or logs temporarily unavailable) when `/errors` returns 503 based on crawl state
- Text tweaks: uses 'error logs' instead of 'logs' and changes 'crawl start' -> 'crawl continue' in the log message
---------
Co-authored-by: Ilya Kreymer <ikreymer@gmail.com>
- Remove config.seeds from workflow and crawl detail endpoints
- Add new paginated `GET /crawls/{crawl_id}/seeds` and
`GET /crawlconfigs/{cid}/seeds` endpoints to retrieve seeds for a crawl or
workflow (see the sketch after this list)
- Include `firstSeed` in the `GET /crawlconfigs/{cid}` endpoint (it was
missing before)
- Modify frontend to fetch seeds from the new `/seeds` endpoints, with a
loading indicator
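A minimal sketch of one of the new endpoints; the seed-loading helper,
defaults, and response envelope are assumptions:

```python
from fastapi import APIRouter

router = APIRouter()


async def get_seeds_for_config(cid: str) -> list:
    """Hypothetical helper: load the workflow's full seed list."""
    return []


def paginated_response(items, total, page, page_size):
    # minimal pagination envelope (response shape assumed)
    return {"items": items, "total": total, "page": page, "pageSize": page_size}


@router.get("/crawlconfigs/{cid}/seeds")
async def get_crawl_config_seeds(cid: str, pageSize: int = 1000, page: int = 1):
    """One page of a workflow's seeds, now that config.seeds is no
    longer inlined in the detail endpoints."""
    seeds = await get_seeds_for_config(cid)
    skip = (page - 1) * pageSize
    return paginated_response(seeds[skip : skip + pageSize], len(seeds), page, pageSize)
```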
---------
Co-authored-by: Tessa Walsh <tessa@bitarchivist.net>
* Reorders actions, adds tooltip
- All copy buttons on the collection share dialog are now on the right side
- Adds a tooltip to tell the user the button opens the link in a new tab
* Make vertical `desc-list` items fill 100% width of their parent container
- Allows for better placement of items within the container
- Adds horizontal padding to info bars
* Right-align copy button in item details page
* Remove config from list endpoints
- Remove config field from workflow and crawl list endpoints
- Add `seedCount` to `CrawlConfigOut` on the backend and `Workflow` on the
frontend
- Refactor `CrawlConfig` and `CrawlConfigOut` to extend `CrawlConfigCore` +
`CrawlConfigAdditional`
- Refactor workflow list in frontend to use `firstSeed` and `seedCount`
- Frontend uses a `ListWorkflow` type, which is `Omit<Workflow, "config">`
* Use Metacontroller's DecoratorController to create a CrawlJob from a Job
(see the sketch after this list)
* Scheduled job work:
- use the existing job name for the scheduled crawljob
- use a suspended job; set startTime, completionTime, and succeeded status
on the job when the crawljob is done
- simplify the cronjob template: remove job_image and cron_namespace, use
the same namespace as crawls and a placeholder job image for cronjobs
* Move storage quota check to crawljob handler:
- add 'skipped_quota_reached' as a new failed status type
- check for storage quota before checking if the crawljob can be started,
and fail if exceeded (checked before any pods/pvcs are created)
* Frontend:
- show all crawls in the crawl workflow; no need to filter by status
- add 'skipped_quota_reached' status, shown as 'Skipped (Quota Reached)'
and rendered the same as failed
* Migration: make the release namespace available as DEFAULT_NAMESPACE;
delete old cronjobs in DEFAULT_NAMESPACE and recreate them in the crawlers
namespace with the new template
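A minimal sketch of a DecoratorController sync hook that decorates each
scheduled Job with a CrawlJob attachment; the apiVersion, label, and spec
fields are illustrative:

```python
from fastapi import FastAPI, Request

app = FastAPI()


@app.post("/sync/cronjob")
async def sync_cronjob(request: Request):
    """Metacontroller POSTs the decorated Job here; returning an
    attachments list tells it to create/update the CrawlJob."""
    data = await request.json()
    job = data["object"]
    crawljob = {
        "apiVersion": "btrix.cloud/v1",  # assumed group/version
        "kind": "CrawlJob",
        "metadata": {"name": job["metadata"]["name"]},  # reuse the job name
        "spec": {"cid": job["metadata"]["labels"].get("crawlconfig", "")},
    }
    return {"attachments": [crawljob]}
```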
- Toast notification error messages are now more specific to the action
being attempted
- Single dismissable global banner shown when the org storage quota is
reached
- Removed check for storage quota reached in `runNow`, since buttons are
disabled in the UI and errors are handled if the request fails
- Allow creating a new workflow when the storage quota is reached
- More responsive storage quota updates: adds storageQuotaReached to the
archived item replay.json, updating without a reload when a crawl pushes
the quota over the limit
- Modify LiteElement to check for storageQuotaReached on GET requests
---------
Co-authored-by: sua yoo <sua@suayoo.com>
* Implement in backend
- Track bytesStored in org
- Add migration to pre-calculate it based on the size of crawl files and
profile files
- Add methods to increase or decrease org storage when crawl or profile
files are added or deleted (see the sketch at the end of this entry)
- Include storageQuotaReached boolean in API responses that alter storage
- Don't start new crawls, and fail uploads, if the storage quota is reached
* Implement in frontend
- Add to orgs-list quotas
- Update org's storageQuotaReached based on backend endpoint responses
- Disable buttons when storage quota is met
- Show toast notification when attempting to run a crawl while the org
storage quota is met
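A minimal sketch, assuming a Motor collection and field names, of keeping
`bytesStored` in sync and reporting the quota flag that the frontend
consumes:

```python
from pymongo import ReturnDocument


async def inc_org_bytes_stored(orgs, oid, size: int) -> bool:
    """Add (or, with a negative size, subtract) file bytes from the
    org's total when crawl or profile files are added or deleted, then
    report whether the storage quota is reached; responses that alter
    storage include this boolean so the UI can react without a reload."""
    org = await orgs.find_one_and_update(
        {"_id": oid},
        {"$inc": {"bytesStored": size}},
        return_document=ReturnDocument.AFTER,
    )
    quota = org.get("quotas", {}).get("storageQuota", 0)
    return bool(quota) and org["bytesStored"] >= quota
```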
- Lists collections that an archived item belongs to in item detail view
- Improves performance of collection add component
---------
Co-authored-by: Tessa Walsh <tessa@bitarchivist.net>