# Changelog — 2026-03-24 #10

## Mobile "Load failed" Fix (`src/ollama-client.ts`)

- Imported `Platform` from Obsidian for runtime mobile detection.
- Split `sendChatMessageStreaming` into a dispatcher + mobile/desktop implementations:
  - Mobile: uses `requestUrl()` (non-streaming) to bypass the WebView sandbox.
  - Desktop: keeps native `fetch()` for real token-by-token streaming.
- Enhanced error messages with mobile-specific hints.
- Added `"load failed"` to caught network error patterns in `testConnection`.
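
The dispatcher split above can be sketched as below. This is an illustrative reconstruction, not the plugin's actual API: the names (`ChatTransport`, `sendChatMessage`) are hypothetical, and the mobile/desktop implementations are injected so the sketch runs without Obsidian's `Platform` and `requestUrl`.

```typescript
// Hypothetical sketch of the mobile/desktop dispatcher pattern.
type ChatChunk = { content: string; done: boolean };
type ChunkHandler = (chunk: ChatChunk) => void;

interface ChatTransport {
  // Desktop path: real token-by-token streaming (native fetch() in the plugin).
  streamDesktop(prompt: string, onChunk: ChunkHandler): Promise<void>;
  // Mobile path: one non-streaming round trip (requestUrl() in the plugin).
  requestMobile(prompt: string): Promise<string>;
}

async function sendChatMessage(
  transport: ChatTransport,
  isMobile: boolean, // in the plugin, derived from Obsidian's Platform
  prompt: string,
  onChunk: ChunkHandler,
): Promise<void> {
  if (isMobile) {
    // Non-streaming: await the full response, then emit it as a single chunk.
    const text = await transport.requestMobile(prompt);
    onChunk({ content: text, done: true });
  } else {
    await transport.streamDesktop(prompt, onChunk);
  }
}
```

Keeping both paths behind one entry point means callers never branch on platform themselves.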

## UI: DaisyUI-inspired Collapse (`src/chat-view.ts`, `styles.css`)

- Replaced native `<details>/<summary>` with a checkbox-driven CSS grid collapse.
- Uses `grid-template-rows: 0fr → 1fr` transition with a rotating arrow indicator.
- Tightened padding and margins around the collapse for compact layout.
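
The collapse technique above relies on animating a grid track between `0fr` and `1fr`, which CSS can transition (unlike `height: auto`). A minimal sketch, with illustrative class names rather than the plugin's actual selectors:

```css
/* Checkbox-driven grid collapse sketch; selectors are illustrative. */
.collapse > input[type="checkbox"] {
  display: none; /* hidden checkbox holds the open/closed state */
}
.collapse-content {
  display: grid;
  grid-template-rows: 0fr; /* collapsed: zero-height track */
  transition: grid-template-rows 0.2s ease;
}
.collapse-content > div {
  overflow: hidden; /* clip content while the track is 0fr */
  min-height: 0;    /* let the grid item shrink below its content size */
}
.collapse > input:checked ~ .collapse-content {
  grid-template-rows: 1fr; /* expanded: track grows to fit content */
}
.collapse > input:checked ~ .collapse-title .arrow {
  transform: rotate(90deg); /* rotating arrow indicator */
}
```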

## UI: FAB / Speed Dial (`src/chat-view.ts`, `styles.css`)

- Replaced inline settings/tools buttons with a FAB in the top-right of the messages area.
- Main trigger: a gear icon that rotates 90° on open.
- Three actions fan downward with staggered animations:
  - **AI Settings** (sliders icon) — opens the settings modal.
  - **Tools** (wrench icon) — opens the tools modal.
  - **Clear Chat** (trash icon) — clears message history and UI.
- Removed old inline button styles and tools-active coloring.
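
The staggered fan-out can be done purely in CSS by delaying each action's transition. A sketch under assumed class names (`.fab`, `.fab-action` are illustrative, not the plugin's actual selectors):

```css
/* Speed-dial stagger sketch; selectors are illustrative. */
.fab-trigger {
  transition: transform 0.2s ease;
}
.fab.open .fab-trigger {
  transform: rotate(90deg); /* gear rotates on open */
}
.fab-action {
  opacity: 0;
  transform: translateY(-8px);
  pointer-events: none; /* not clickable while hidden */
  transition: opacity 0.15s ease, transform 0.15s ease;
}
.fab.open .fab-action {
  opacity: 1;
  transform: translateY(0);
  pointer-events: auto;
}
/* Stagger: each action starts slightly after the previous one. */
.fab.open .fab-action:nth-child(2) { transition-delay: 0.05s; }
.fab.open .fab-action:nth-child(3) { transition-delay: 0.10s; }
.fab.open .fab-action:nth-child(4) { transition-delay: 0.15s; }
```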

## UI: Text Selection (`styles.css`)

- Enabled `user-select: text` on messages and tool call bubbles.

## UI: Settings Modal Rename (`src/settings-modal.ts`)

- Changed modal title to "AI Settings".

## Generation Parameters (`src/settings.ts`, `src/settings-modal.ts`, `src/ollama-client.ts`, `src/chat-view.ts`)

- Added `temperature` (default 0.7), `numCtx` (default 4096), `numPredict` (default -1) to settings.
- Added `ModelOptions` interface and `options` field to `StreamingChatOptions`.
- Options are now passed to Ollama in the request body for all chat paths.
- Settings modal shows:
  - **Temperature** — slider 0–2 with live value display.
  - **Context Window** — number input with model max shown below.
  - **Max Output Tokens** — number input (-1 = unlimited).
- Added `showModel()` function querying `/api/show` to extract the model's context length.
- The model-max label turns red when the context window exceeds the model's limit.
- Clicking the model-max label sets the context window to the model's max.
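
The plumbing above can be sketched as follows. The `ModelOptions` field names match Ollama's documented `options` keys (`temperature`, `num_ctx`, `num_predict`), and `/api/show` does report `model_info` keyed by architecture (e.g. `llama.context_length`), but the helper names here are hypothetical, not the plugin's actual code:

```typescript
// Sketch of the generation-options plumbing; helper names are illustrative.
interface ModelOptions {
  temperature?: number; // 0–2, default 0.7
  num_ctx?: number;     // context window, default 4096
  num_predict?: number; // max output tokens, -1 = unlimited
}

// Attach generation options to an Ollama /api/chat request body.
function buildChatBody(model: string, messages: unknown[], options: ModelOptions) {
  return { model, messages, stream: true, options };
}

// /api/show returns model_info keyed by architecture, e.g.
// "llama.context_length", so scan for the generic suffix.
function extractContextLength(modelInfo: Record<string, unknown>): number | null {
  for (const [key, value] of Object.entries(modelInfo)) {
    if (key.endsWith(".context_length") && typeof value === "number") {
      return value;
    }
  }
  return null;
}
```

Scanning by suffix keeps the lookup architecture-agnostic, so the same code works for `llama.*`, `gemma.*`, and other model families.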

## Disabled GitHub Action (`.github/workflows/lint.yml`)

- Commented out push/PR triggers; added `workflow_dispatch` for manual runs.
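
The resulting trigger block looks roughly like this (a sketch; branch names in the commented-out triggers are assumptions):

```yaml
on:
  # push:
  #   branches: [main]
  # pull_request:
  workflow_dispatch:  # manual runs only
```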