Root cause: Consumer(huey, workers=1, worker_type='thread', loglevel=20)
raised TypeError on every app start because Huey 2.6.0 does not accept
a `loglevel` keyword argument. The exception was silently caught and only
printed to stdout, so the consumer never ran and all tasks stayed 'queued'
forever — causing the 'Preparing environment / Waiting for logs' hang.
Fixes:
- web/app.py: Remove invalid `loglevel=20` from Consumer(); configure
Huey logging via logging.basicConfig(WARNING) instead. Add persistent
error logging to data/consumer_error.log for future diagnosis.
- core/config.py: Replace emoji print() calls with ASCII-safe equivalents
to prevent UnicodeEncodeError on Windows cp1252 terminals at import time.
- core/config.py: Update VERSION to 2.9 (was stale at 1.5.0).
- ai_blueprint.md: Bump to v2.10, document root cause and fixes.
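A minimal sketch of the corrected consumer setup (helper and variable names are illustrative, not the repo's; assumes Huey 2.x, whose Consumer takes no loglevel kwarg):

```python
import logging

# Configure Huey's log output up front instead of passing loglevel=20
# to Consumer(), which Huey 2.6.0 rejects with a TypeError.
logging.basicConfig(level=logging.WARNING)
logging.getLogger("huey").setLevel(logging.WARNING)

def start_consumer(huey_instance):
    """Build the consumer without the invalid loglevel kwarg (sketch)."""
    from huey.consumer import Consumer  # deferred import; assumes huey 2.x
    return Consumer(huey_instance, workers=1, worker_type="thread")
```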
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

- ai/setup.py: Added threading import; OAuth block now detects background/headless
threads and skips run_local_server to prevent indefinite blocking. Logs a clear
warning and falls back to ADC for Vertex AI. The token file is now written
only when the credentials object is not None.
- web/tasks.py: All sqlite3.connect() calls now use timeout=30, check_same_thread=False.
OperationalError on the initial status update is caught and logged via utils.log.
generate_book_task now touches initial_log immediately so the UI polling endpoint
always finds an existing file even if the worker crashes on the next line.
- ai_blueprint.md: Bumped to v2.9; Section 12.D sub-items 1-3 marked ✅; item 13
added to summary.
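The background-thread check behind the ai/setup.py fix can be sketched with nothing but the stdlib (the helper name is hypothetical):

```python
import threading

def in_background_thread():
    """True when running off the main thread (e.g. inside a Huey worker),
    where an interactive OAuth run_local_server() call would block forever
    waiting for a browser that will never open."""
    return threading.current_thread() is not threading.main_thread()
```

When this returns True, the setup code skips the interactive flow and falls back to ADC as described above.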
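The web/tasks.py connection settings amount to the following (path and helper name are illustrative):

```python
import sqlite3

def open_db(db_path="data/app.db"):
    """timeout=30 gives writers up to 30 seconds to acquire the lock
    instead of failing immediately with 'database is locked' when the
    Flask process and the Huey worker write concurrently;
    check_same_thread=False lets the worker thread reuse the connection."""
    return sqlite3.connect(db_path, timeout=30, check_same_thread=False)
```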
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
1. templates/project_setup.html: s.tropes|join and s.formatting_rules|join
raised a Jinja2 UndefinedError when the AI call failed and the fallback dict
lacked those keys → 500 blank page. Fixed with the (s.tropes or [])|join(', ')
pattern for both filters.
2. web/routes/project.py (project_setup_wizard): Removed the silent redirect
to the dashboard when model_logic is None. The route now renders the setup
form with a complete default suggestions dict (every field present, lists as
[]) plus a clear warning flash so the user can fill it in manually.
3. web/routes/project.py (create_project_final): planner.enrich() was called
with the full bible dict — enrich() reads manual_instruction from the top
level (got 'A generic story' fallback) and wrote results into book_metadata
instead of the bible's books[0]. Fixed to build a proper per-book blueprint,
call enrich, and merge characters/plot_beats back into the correct locations.
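The template guard in fix 1 can be demonstrated in isolation (a sketch; the real filter chain lives in templates/project_setup.html):

```python
from jinja2 import Environment

env = Environment()

# Unguarded, {{ s.tropes|join(', ') }} fails when the fallback dict has no
# 'tropes' key, because joining an Undefined raises UndefinedError.
# With `or []`, an empty list is substituted and the page still renders.
guarded = env.from_string("{{ (s.tropes or [])|join(', ') }}")

print(guarded.render(s={}))                                 # empty string, not a 500
print(guarded.render(s={"tropes": ["found family", "heist"]}))
```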
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Root causes of indefinite spinning during book create/generate:
1. ai/models.py — ResilientModel.generate_content() had no timeout: a
stalled Gemini API call would block the thread forever. Now injects
request_options={"timeout": 180} into every call. Also removed the
dangerous init_models(force=True) call inside the retry handler, which
was making a second network call during an existing API failure.
2. ai/setup.py — genai.list_models() calls in get_optimal_model(),
select_best_models(), and init_models() had no timeout. Added
request_options={"timeout": 30} to all three calls so model init
fails fast rather than hanging indefinitely.
3. web/app.py — The Huey task consumer was started only inside
`if __name__ == "__main__":`, meaning tasks queued via flask run,
gunicorn, or other WSGI runners were never executed (status stuck at
"queued" forever). Moved consumer start to module level with a
WERKZEUG_RUN_MAIN guard to prevent double-start under the reloader.
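The timeout injection in fix 1 reduces to a small kwargs merge (the helper is hypothetical; request_options is the google-generativeai parameter the commit refers to):

```python
def with_default_timeout(kwargs, seconds=180):
    """Merge a default request timeout into the SDK call kwargs without
    clobbering a timeout the caller set explicitly (sketch)."""
    opts = dict(kwargs.get("request_options") or {})
    opts.setdefault("timeout", seconds)
    merged = dict(kwargs)
    merged["request_options"] = opts
    return merged
```

ResilientModel.generate_content() can pass the merged kwargs through to the SDK, so a stalled Gemini call errors out after 180 seconds instead of blocking the worker thread forever.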
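One common shape of the double-start guard in fix 3 (function name illustrative; the debug flag stands in for app.debug):

```python
import os

def should_start_consumer(debug=False):
    """Decide whether this import of web/app.py should launch the Huey
    consumer. Under the Werkzeug reloader the module is imported twice;
    WERKZEUG_RUN_MAIN == 'true' only in the child process that actually
    serves requests, so in debug mode we start only there. Under gunicorn
    or another WSGI runner the variable is absent and we always start."""
    return (not debug) or os.environ.get("WERKZEUG_RUN_MAIN") == "true"
```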
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Removed `from huey.contrib.mini import MiniHuey` which caused
`ModuleNotFoundError: No module named 'gevent'` on startup. MiniHuey
was never used; the app correctly uses SqliteHuey via `web.tasks`.
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
web/app.py was hardcoded to port 7070, causing Docker port forwarding
(5000:5000) and the Dockerfile HEALTHCHECK to fail. Changed to port 5000
to match docker-compose.yml and Dockerfile configuration.
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>