v0.3.3
What's Changed
- docs: sync platform openapi.json by @github-actions[bot] in #1237
- Bump authlib from 1.6.7 to 1.6.9 in the uv group across 1 directory by @dependabot[bot] in #1234
- [GH-1147] Fix null byte crash in semantic memory ingestion by @o-love in #1165
- [GH-1146] Cap features sent to LLM during semantic ingestion update by @o-love in #1163
- Handle context window size when creating summary in short-term memory by @malatewang in #1254
- fix: replace O(n²) ingestion polling query with GROUP BY/HAVING by @xiongzubiao in #1253
- docs: Update Nebula Integration Guide and Navigation by @SarahScargall in #1260
- Make benchmark agent LLM configurable through config file by @Tianyang-Zhang in #1217
- Retry OpenAI API on internal server error by @edwinyyyu in #1226
- docs: sync platform openapi.json by @github-actions[bot] in #1250
- Bump ruff from 0.15.5 to 0.15.7 by @dependabot[bot] in #1256
- Bump ty from 0.0.23 to 0.0.24 by @dependabot[bot] in #1257
- Bump pytest-cov from 7.0.0 to 7.1.0 by @dependabot[bot] in #1255
- fix: stop semantic ingestion retry loop on context_length_exceeded by @o-love in #1259
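The polling fix in #1253 replaces per-document status checks with a single aggregate query. A minimal sketch of the idea, using an illustrative schema (the table and column names here are assumptions, not the project's actual ones):

```python
import sqlite3

# Hypothetical schema: each document's ingestion is split into chunks,
# and a document is "done" once every one of its chunk rows is completed.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE chunks (doc_id TEXT, completed INTEGER);
INSERT INTO chunks VALUES
  ('a', 1), ('a', 1),   -- document 'a': all chunks completed
  ('b', 1), ('b', 0);   -- document 'b': one chunk still pending
""")

# Instead of issuing one status query per document (O(n^2) work overall
# when polling repeatedly), a single GROUP BY/HAVING pass finds every
# fully ingested document at once:
done = [row[0] for row in conn.execute("""
    SELECT doc_id
    FROM chunks
    GROUP BY doc_id
    HAVING COUNT(*) = SUM(completed)
""")]
print(done)  # ['a']
```

The `HAVING` clause keeps only groups where the completed count equals the total chunk count, so one round trip answers "which documents are finished" for all documents.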
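The retry change in #1226 follows the standard retry-with-backoff pattern for transient server errors. A generic sketch, independent of the project's actual client code (the function and exception names are illustrative):

```python
import time

class ServerError(Exception):
    """Stands in for a transient 5xx-style API error."""

def call_with_retry(fn, retries=3, base_delay=0.0):
    """Call fn, retrying on ServerError with exponential backoff.

    Permanent failures (any other exception) are not retried; the
    related fix in #1259 stops retrying when the error cannot be
    resolved by trying again, e.g. context_length_exceeded.
    """
    for attempt in range(retries):
        try:
            return fn()
        except ServerError:
            if attempt == retries - 1:
                raise  # exhausted retries; surface the error
            time.sleep(base_delay * (2 ** attempt))

# Usage: a flaky call that fails twice, then succeeds.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ServerError("internal server error")
    return "ok"

print(call_with_retry(flaky))  # ok
```

Retrying only the retryable class of errors is the key point: a 500 may succeed on the next attempt, while a request that exceeds the model's context window never will.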
Full Changelog: v0.3.2...v0.3.3