Catch non-critical exceptions at crawler top level

Joscha 2021-07-13 15:42:11 +02:00
parent 86f79ff1f1
commit 544d45cbc5
2 changed files with 2 additions and 0 deletions


@@ -34,6 +34,7 @@ ambiguous situations.
 ### Fixed
 - Nondeterministic name deduplication due to ILIAS reordering elements
+- More exceptions are handled properly
 
 ## 3.1.0 - 2021-06-13


@@ -320,6 +320,7 @@ class Crawler(ABC):
             log.explain("Warnings or errors occurred during this run")
             log.explain("Answer: No")
 
+    @anoncritical
     async def run(self) -> None:
         """
         Start the crawling process. Call this function if you want to use a
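The only code change is the `@anoncritical` decorator on `Crawler.run`, which is what makes non-critical exceptions stop at the crawler's top level. As a rough illustration of the idea behind the commit title, here is a minimal, hypothetical sketch of such a decorator; the exception names (`CrawlWarning`, `CrawlError`) and the logger are assumptions for illustration, and PFERD's actual implementation may differ:

```python
# Hedged sketch of an "anoncritical"-style decorator, not PFERD's real code.
import functools
import logging
from typing import Any, Callable, Coroutine

log = logging.getLogger("crawler")


class CrawlWarning(Exception):
    """A minor problem worth reporting; crawling continues."""


class CrawlError(Exception):
    """A problem that aborts the current task but should not crash the program."""


AsyncNoneFunc = Callable[..., Coroutine[Any, Any, None]]


def anoncritical(f: AsyncNoneFunc) -> AsyncNoneFunc:
    """Catch non-critical exceptions so they never escape the decorated coroutine."""

    @functools.wraps(f)
    async def wrapper(*args: Any, **kwargs: Any) -> None:
        try:
            await f(*args, **kwargs)
        except (CrawlWarning, CrawlError) as e:
            # Report and swallow; anything else (e.g. genuine bugs) still propagates.
            log.error("Non-critical error during crawl: %s", e)

    return wrapper
```

Applied the way the diff above applies it, a decorator like this means an error raised deep inside a crawl is reported and the top-level `run()` call still returns normally instead of crashing the caller.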