diff --git a/.gitignore b/.gitignore
index a547bf3..0c93259 100644
--- a/.gitignore
+++ b/.gitignore
@@ -12,6 +12,13 @@ dist
 dist-ssr
 *.local
 
+# Tauri build output
+src-tauri/target/
+src-tauri/target_tmp/
+src-tauri/gen/
+**/*.dmg
+**/*.app
+
 # Editor directories and files
 .vscode/*
 !.vscode/extensions.json
diff --git a/README.md b/README.md
index 5ad51ab..1a1e365 100644
--- a/README.md
+++ b/README.md
@@ -1,74 +1,74 @@
 # Hearbit AI 🦉🎙️
 
-Hearbit AI is a powerful macOS desktop application designed to record system and microphone audio, transcribe it using Infomaniak's Whisper API, and generate intelligent AI summaries.
+**Hearbit AI** is your professional meeting assistant for macOS. It records both your microphone and system audio (e.g., Teams, Zoom), transcribes it with high precision using Infomaniak's Whisper API, and generates intelligent, structured summaries.
 
-## Features
+![App Icon](src-tauri/icons/128x128@2x.png)
 
-* **Dual-Channel Recording**: Capture both your microphone (e.g., for voice notes) and system audio (e.g., for meetings) simultaneously.
-* **Powered by Infomaniak AI**:
-    * **Transcription**: High-accuracy speech-to-text using Infomaniak's Whisper integration.
-    * **Summarization**: Generate concise summaries, action items, or meeting notes using models like Mixtral or Llama 3 via Infomaniak's LLM API.
-* **Auto-Save**: Recordings and summaries are automatically saved to a persistent history.
-* **Customizable Prompts**: Define your own AI templates (e.g., "Summarize", "Extract Action Items", "Translate").
-* **Privacy-Focused**: Processed securily via your own Infomaniak API keys.
+## ✨ Features
 
-## prerequisites
+* **🎙️ Dual-Channel Recording**: seamlessly capture your voice and meeting audio from apps like Microsoft Teams, Zoom, or Google Meet.
+* **🧠 Powered by Infomaniak AI**:
+    * **Precision Transcription**: Standard-compliant formatting with **second-by-second timestamps** (e.g., `[00:12]`).
+    * **Smart Summaries**: Uses advanced LLMs (Mixtral, Llama 3) to create actionable meeting notes.
+* **📝 Professional Templates**: Comes with 3 built-in expert prompts:
+    * **Meeting Protocol**: For general business meetings.
+    * **1:1 / Jour Fixe**: For confidential personnel discussions.
+    * **Client Meeting**: For official, client-ready documentation.
+* **💾 Auto-Save**: All recordings and summaries are saved locally to your history.
+* **🔒 Privacy-First**: Data is processed securely via your own Infomaniak API keys.
 
-* **macOS** (Apple Silicon or Intel)
-* **Infomaniak AI Account**: You need an API Key and Product ID from [Infomaniak's Developer Portal](https://manager.infomaniak.com/).
+---
 
-## Installation
+## 🚀 Getting Started
 
-1. Download the latest release (`.dmg` or `.app`).
-## Recording System Audio (Teams, Zoom, etc.)
+### 1. Prerequisites
+* **macOS** (Apple Silicon or Intel).
+* **Infomaniak AI Account**: You need an API Key and Product ID from the [Infomaniak Developer Portal](https://manager.infomaniak.com/).
 
-To record meetings from apps like Microsoft Teams, Zoom, or Google Meet, you need to route the computer's audio into Hearbit AI. This requires **BlackHole 2ch**.
+### 2. Installation
+1. Download the latest `.dmg` file from the [Releases page](#).
+2. Open the `.dmg` and drag **Hearbit AI** to your Applications folder.
+3. Launch the app.
 
-### 1. Install BlackHole
-1. Download and install [BlackHole 2ch](https://existential.audio/blackhole/).
-2. Restart your computer if required.
+---
 
-### 2. Create a Multi-Output Device (To hear audio while recording)
-1. Open **Audio MIDI Setup** (search in Spotlight).
-2. Click the `+` icon and select **Create Multi-Output Device**.
-3. Check **BlackHole 2ch** AND **MacBook Pro Speakers** (or your headphones).
-4. Set "Master Device" to **BlackHole 2ch**.
-
-   ![Multi-Output Setup](docs/screenshots/multi_output_setup.png)
+## 🎧 Recording System Audio (Teams, Zoom, etc.)
 
-### 3. Create an Aggregate Device (For Hearbit AI Input)
-1. In Audio MIDI Setup, click `+` and select **Create Aggregate Device**.
-2. Name it "Hearbit-AI" (or similar).
-3. Check **BlackHole 2ch** AND **MacBook Pro Microphone**.
-4. Ensure "Drift Correction" is enabled for the Microphone.
+To record clear meeting audio from other applications, you need a "virtual cable". We recommend **BlackHole 2ch**.
 
-   ![Aggregate Device Setup](docs/screenshots/aggregate_device_setup.png)
+1. **Install BlackHole**: Download and install [BlackHole 2ch](https://existential.audio/blackhole/).
+2. **Create a Multi-Output Device** (So you can hear the audio too!):
+    * Open **Audio MIDI Setup** on your Mac.
+    * Create a "Multi-Output Device".
+    * Select both **BlackHole 2ch** AND your **Headphones/Speakers**.
+    * *Tip: Use this Multi-Output Device as your SPEAKER in Teams/Zoom.*
+3. **Select Input in Hearbit AI**:
+    * In Hearbit AI, select **BlackHole 2ch** (or an Aggregate Device combining your Mic + BlackHole) as the **Input Device**.
 
-### 4. Setup in Hearbit AI
-1. Open **Hearbit AI**.
-2. In the **Input Device** dropdown, select your Aggregate Device (e.g., "Hearbit-AI").
-3. Start Recording.
+---
 
-### 5. Setup in Teams/Zoom
-* Set your **Speaker** output in Teams/Zoom to the **Multi-Output Device** you created in Step 2.
+## 🛠️ Usage Guide
 
-1. **Configure**:
-    * Click the **Settings** (gear icon) in the top right.
+1. **Configuration**:
+    * Click the **Settings** (gear icon).
     * Enter your **Infomaniak API Key** and **Product ID**.
-    * (Optional) Customize your AI prompts.
-2. **Record**:
-    * Select your **Input Device** (e.g., "MacBook Pro Microphone" or an aggregate device for system audio).
-    * Select an **LLM Model** (e.g., Mixtral).
+    * (Optional) Customize where recordings are saved.
+
+2. **Recording**:
+    * Choose your **Template** (e.g., "Meeting Protocol").
+    * Select your **Input Device**.
     * Click **Start Recording**.
-3. **Process**:
-    * Click **Stop Recording**.
-    * The app will automatically upload, transcribe, and summarize your audio.
-4. **History**:
-    * Click **Records** to view past recordings.
-## Development
+3. **Processing**:
+    * Click **Stop** when finished.
+    * The app will transcribe the audio (with timestamps!) and generate a summary based on your selected template.
+    * You will be automatically taken to the **Transcription** tab to review the results.
 
-This app is built with **Tauri**, **React**, and **TypeScript**.
+---
+
+## 👨‍💻 Development
+
+Built with **Tauri**, **React**, and **TypeScript**.
 
 ### Setup
 ```bash
@@ -77,8 +77,6 @@ npm install
 ```
 
 ### Run Locally
 ```bash
-npm run dev
-# OR
 npm run tauri dev
 ```
 
@@ -86,9 +84,9 @@
 ```bash
 npm run tauri build
 ```
+*The build artifact will be located in `src-tauri/target/release/bundle/dmg/*`*
 
-## Icons
-The app uses a custom generated icon set located in `src-tauri/icons`.
+---
 
-## License
-[License Name]
+## 📄 License
+Property of Livtec. All rights reserved.
diff --git a/package-lock.json b/package-lock.json
index 297954c..6c1c069 100644
--- a/package-lock.json
+++ b/package-lock.json
@@ -1,15 +1,16 @@
 {
-  "name": "infomaniak-recorder",
+  "name": "hearbit-ai",
   "version": "0.1.0",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
-      "name": "infomaniak-recorder",
+      "name": "hearbit-ai",
       "version": "0.1.0",
       "dependencies": {
        "@tailwindcss/postcss": "^4.1.18",
        "@tauri-apps/api": "^2",
+        "@tauri-apps/plugin-dialog": "^2.6.0",
        "@tauri-apps/plugin-opener": "^2",
        "jimp": "^1.6.0",
        "lucide-react": "^0.562.0",
@@ -2076,6 +2077,15 @@
         "node": ">= 10"
       }
     },
+    "node_modules/@tauri-apps/plugin-dialog": {
+      "version": "2.6.0",
+      "resolved": "https://registry.npmjs.org/@tauri-apps/plugin-dialog/-/plugin-dialog-2.6.0.tgz",
+      "integrity": "sha512-q4Uq3eY87TdcYzXACiYSPhmpBA76shgmQswGkSVio4C82Sz2W4iehe9TnKYwbq7weHiL88Yw19XZm7v28+Micg==",
+      "license": "MIT OR Apache-2.0",
+      "dependencies": {
+        "@tauri-apps/api": "^2.8.0"
+      }
+    },
    "node_modules/@tauri-apps/plugin-opener": {
      "version": "2.5.3",
      "resolved": "https://registry.npmjs.org/@tauri-apps/plugin-opener/-/plugin-opener-2.5.3.tgz",
diff --git a/package.json b/package.json
index a5c07bc..caecb41 100644
--- a/package.json
+++ b/package.json
@@ -12,6 +12,7 @@
   "dependencies": {
     "@tailwindcss/postcss": "^4.1.18",
     "@tauri-apps/api": "^2",
+    "@tauri-apps/plugin-dialog": "^2.6.0",
     "@tauri-apps/plugin-opener": "^2",
     "jimp": "^1.6.0",
     "lucide-react": "^0.562.0",
diff --git a/src-tauri/Cargo.lock b/src-tauri/Cargo.lock
index 08942f1..86f2775 100644
--- a/src-tauri/Cargo.lock
+++ b/src-tauri/Cargo.lock
@@ -488,8 +488,10 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "fac4744fb15ae8337dc853fee7fb3f4e48c0fbaa23d0afe49c447b4fab126118"
 dependencies = [
  "iana-time-zone",
+ "js-sys",
  "num-traits",
  "serde",
+ "wasm-bindgen",
  "windows-link 0.2.1",
 ]
 
@@ -819,6 +821,8 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "89a09f22a6c6069a18470eb92d2298acf25463f14256d24778e1230d789a2aec"
 dependencies = [
  "bitflags 2.10.0",
+ "block2",
+ "libc",
  "objc2",
 ]
 
@@ -1515,6 +1519,23 @@ version = "0.16.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "841d1cc9bed7f9236f321df977030373f4a4163ae1a7dbfe1a51a2c1a51d9100"
 
+[[package]]
+name = "hearbit-ai"
+version = "0.1.0"
+dependencies = [
+ "chrono",
+ "cpal",
+ "hound",
+ "reqwest 0.13.1",
+ "serde",
+ "serde_json",
+ "tauri",
+ "tauri-build",
+ "tauri-plugin-dialog",
+ "tauri-plugin-opener",
+ "tokio",
+]
+
 [[package]]
 name = "heck"
 version = "0.4.1"
@@ -1834,21 +1855,6 @@ dependencies = [
  "cfb",
 ]
 
-[[package]]
-name = "infomaniak-recorder"
-version = "0.1.0"
-dependencies = [
- "cpal",
- "hound",
- "reqwest 0.13.1",
- "serde",
- "serde_json",
- "tauri",
- "tauri-build",
- "tauri-plugin-opener",
- "tokio",
-]
-
 [[package]]
 name = "ipnet"
 version = "2.11.0"
@@ -3309,6 +3315,30 @@ dependencies = [
  "web-sys",
 ]
 
+[[package]]
+name = "rfd"
+version = "0.16.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "a15ad77d9e70a92437d8f74c35d99b4e4691128df018833e99f90bcd36152672"
+dependencies = [
+ "block2",
+ "dispatch2",
+ "glib-sys",
+ "gobject-sys",
+ "gtk-sys",
+ "js-sys",
+ "log",
+ "objc2",
+ "objc2-app-kit",
+ "objc2-core-foundation",
+ "objc2-foundation",
+ "raw-window-handle",
+ "wasm-bindgen",
+ "wasm-bindgen-futures",
+ "web-sys",
+ "windows-sys 0.60.2",
+]
+
 [[package]]
 name = "ring"
 version = "0.17.14"
@@ -4167,6 +4197,46 @@ dependencies = [
  "walkdir",
 ]
 
+[[package]]
+name = "tauri-plugin-dialog"
+version = "2.6.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "9204b425d9be8d12aa60c2a83a289cf7d1caae40f57f336ed1155b3a5c0e359b"
+dependencies = [
+ "log",
+ "raw-window-handle",
+ "rfd",
+ "serde",
+ "serde_json",
+ "tauri",
+ "tauri-plugin",
+ "tauri-plugin-fs",
+ "thiserror 2.0.18",
+ "url",
+]
+
+[[package]]
+name = "tauri-plugin-fs"
+version = "2.4.5"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "ed390cc669f937afeb8b28032ce837bac8ea023d975a2e207375ec05afaf1804"
+dependencies = [
+ "anyhow",
+ "dunce",
+ "glob",
+ "percent-encoding",
+ "schemars 0.8.22",
+ "serde",
+ "serde_json",
+ "serde_repr",
+ "tauri",
+ "tauri-plugin",
+ "tauri-utils",
+ "thiserror 2.0.18",
+ "toml 0.9.11+spec-1.1.0",
+ "url",
+]
+
 [[package]]
 name = "tauri-plugin-opener"
 version = "2.5.3"
diff --git a/src-tauri/Cargo.toml b/src-tauri/Cargo.toml
index 8655228..0d3279b 100644
--- a/src-tauri/Cargo.toml
+++ b/src-tauri/Cargo.toml
@@ -1,5 +1,5 @@
 [package]
-name = "infomaniak-recorder"
+name = "hearbit-ai"
 version = "0.1.0"
 description = "A Tauri App"
 authors = ["you"]
@@ -20,10 +20,11 @@ tauri-build = { version = "2", features = [] }
 [dependencies]
 tauri = { version = "2", features = [] }
 tauri-plugin-opener = "2"
+tauri-plugin-dialog = "2"
 serde = { version = "1", features = ["derive"] }
-serde_json = "1"
+serde_json = "1.0"
+chrono = "0.4"
 cpal = "0.17.1"
 hound = "3.5.1"
 reqwest = { version = "0.13.1", features = ["json", "multipart"] }
 tokio = { version = "1.40.0", features = ["full"] }
-
diff --git a/src-tauri/capabilities/default.json b/src-tauri/capabilities/default.json
index 4cdbf49..3c1ad59 100644
--- a/src-tauri/capabilities/default.json
+++ b/src-tauri/capabilities/default.json
@@ -2,9 +2,12 @@
   "$schema": "../gen/schemas/desktop-schema.json",
   "identifier": "default",
   "description": "Capability for the main window",
-  "windows": ["main"],
+  "windows": [
+    "main"
+  ],
   "permissions": [
     "core:default",
-    "opener:default"
+    "opener:default",
+    "dialog:default"
   ]
-}
+}
\ No newline at end of file
diff --git a/src-tauri/icons/128x128.png b/src-tauri/icons/128x128.png
index 6bc3e7e..f53f65e 100644
Binary files a/src-tauri/icons/128x128.png and b/src-tauri/icons/128x128.png differ
diff --git a/src-tauri/icons/128x128@2x.png b/src-tauri/icons/128x128@2x.png
index 29a5db7..eddb7ec 100644
Binary files a/src-tauri/icons/128x128@2x.png and b/src-tauri/icons/128x128@2x.png differ
diff --git a/src-tauri/icons/32x32.png b/src-tauri/icons/32x32.png
index 5886222..11e0b31 100644
Binary files a/src-tauri/icons/32x32.png and b/src-tauri/icons/32x32.png differ
diff --git a/src-tauri/icons/64x64.png b/src-tauri/icons/64x64.png
new file mode 100644
index 0000000..e5626ae
Binary files /dev/null and b/src-tauri/icons/64x64.png differ
diff --git a/src-tauri/icons/Square107x107Logo.png b/src-tauri/icons/Square107x107Logo.png
index 0ca4f27..1ef5797 100644
Binary files a/src-tauri/icons/Square107x107Logo.png and b/src-tauri/icons/Square107x107Logo.png differ
diff --git a/src-tauri/icons/Square142x142Logo.png b/src-tauri/icons/Square142x142Logo.png
index b81f820..3a42859 100644
Binary files a/src-tauri/icons/Square142x142Logo.png and b/src-tauri/icons/Square142x142Logo.png differ
diff --git a/src-tauri/icons/Square150x150Logo.png b/src-tauri/icons/Square150x150Logo.png
index 624c7bf..dbec4f3 100644
Binary files a/src-tauri/icons/Square150x150Logo.png and b/src-tauri/icons/Square150x150Logo.png differ
diff --git a/src-tauri/icons/Square284x284Logo.png b/src-tauri/icons/Square284x284Logo.png
index c021d2b..3c58520 100644
Binary files a/src-tauri/icons/Square284x284Logo.png and b/src-tauri/icons/Square284x284Logo.png differ
diff --git a/src-tauri/icons/Square30x30Logo.png b/src-tauri/icons/Square30x30Logo.png
index 6219700..6b165f8 100644
Binary files a/src-tauri/icons/Square30x30Logo.png and b/src-tauri/icons/Square30x30Logo.png differ
diff --git a/src-tauri/icons/Square310x310Logo.png b/src-tauri/icons/Square310x310Logo.png
index f9bc048..704195f 100644
Binary files a/src-tauri/icons/Square310x310Logo.png and b/src-tauri/icons/Square310x310Logo.png differ
diff --git a/src-tauri/icons/Square44x44Logo.png b/src-tauri/icons/Square44x44Logo.png
index d5fbfb2..b183fdf 100644
Binary files a/src-tauri/icons/Square44x44Logo.png and b/src-tauri/icons/Square44x44Logo.png differ
diff --git a/src-tauri/icons/Square71x71Logo.png b/src-tauri/icons/Square71x71Logo.png
index 63440d7..11676e3 100644
Binary files a/src-tauri/icons/Square71x71Logo.png and b/src-tauri/icons/Square71x71Logo.png differ
diff --git a/src-tauri/icons/Square89x89Logo.png b/src-tauri/icons/Square89x89Logo.png
index f3f705a..bdaea54 100644
Binary files a/src-tauri/icons/Square89x89Logo.png and b/src-tauri/icons/Square89x89Logo.png differ
diff --git a/src-tauri/icons/StoreLogo.png b/src-tauri/icons/StoreLogo.png
index 4556388..8e3ffbb 100644
Binary files a/src-tauri/icons/StoreLogo.png and b/src-tauri/icons/StoreLogo.png differ
diff --git a/src-tauri/icons/android/mipmap-anydpi-v26/ic_launcher.xml b/src-tauri/icons/android/mipmap-anydpi-v26/ic_launcher.xml
new file mode 100644
index 0000000..2ffbf24
--- /dev/null
+++ b/src-tauri/icons/android/mipmap-anydpi-v26/ic_launcher.xml
@@ -0,0 +1,5 @@
+<?xml version="1.0" encoding="utf-8"?>
+<adaptive-icon xmlns:android="http://schemas.android.com/apk/res/android">
+    <background android:drawable="@color/ic_launcher_background"/>
+    <foreground android:drawable="@mipmap/ic_launcher_foreground"/>
+</adaptive-icon>
\ No newline at end of file
diff --git a/src-tauri/icons/android/mipmap-hdpi/ic_launcher.png b/src-tauri/icons/android/mipmap-hdpi/ic_launcher.png
new file mode 100644
index 0000000..2767327
Binary files /dev/null and b/src-tauri/icons/android/mipmap-hdpi/ic_launcher.png differ
diff --git a/src-tauri/icons/android/mipmap-hdpi/ic_launcher_foreground.png b/src-tauri/icons/android/mipmap-hdpi/ic_launcher_foreground.png
new file mode 100644
index 0000000..4bf2416
Binary files /dev/null and b/src-tauri/icons/android/mipmap-hdpi/ic_launcher_foreground.png differ
diff --git a/src-tauri/icons/android/mipmap-hdpi/ic_launcher_round.png b/src-tauri/icons/android/mipmap-hdpi/ic_launcher_round.png
new file mode 100644
index 0000000..4ab84a5
Binary files /dev/null and b/src-tauri/icons/android/mipmap-hdpi/ic_launcher_round.png differ
diff --git a/src-tauri/icons/android/mipmap-mdpi/ic_launcher.png b/src-tauri/icons/android/mipmap-mdpi/ic_launcher.png
new file mode 100644
index 0000000..ad13166
Binary files /dev/null and b/src-tauri/icons/android/mipmap-mdpi/ic_launcher.png differ
diff --git a/src-tauri/icons/android/mipmap-mdpi/ic_launcher_foreground.png b/src-tauri/icons/android/mipmap-mdpi/ic_launcher_foreground.png
new file mode 100644
index 0000000..450b40a
Binary files /dev/null and b/src-tauri/icons/android/mipmap-mdpi/ic_launcher_foreground.png differ
diff --git a/src-tauri/icons/android/mipmap-mdpi/ic_launcher_round.png b/src-tauri/icons/android/mipmap-mdpi/ic_launcher_round.png
new file mode 100644
index 0000000..c46b686
Binary files /dev/null and b/src-tauri/icons/android/mipmap-mdpi/ic_launcher_round.png differ
diff --git a/src-tauri/icons/android/mipmap-xhdpi/ic_launcher.png b/src-tauri/icons/android/mipmap-xhdpi/ic_launcher.png
new file mode 100644
index 0000000..2ac7173
Binary files /dev/null and b/src-tauri/icons/android/mipmap-xhdpi/ic_launcher.png differ
diff --git a/src-tauri/icons/android/mipmap-xhdpi/ic_launcher_foreground.png b/src-tauri/icons/android/mipmap-xhdpi/ic_launcher_foreground.png
new file mode 100644
index 0000000..457062c
Binary files /dev/null and b/src-tauri/icons/android/mipmap-xhdpi/ic_launcher_foreground.png differ
diff --git a/src-tauri/icons/android/mipmap-xhdpi/ic_launcher_round.png b/src-tauri/icons/android/mipmap-xhdpi/ic_launcher_round.png
new file mode 100644
index 0000000..eae5bc1
Binary files /dev/null and b/src-tauri/icons/android/mipmap-xhdpi/ic_launcher_round.png differ
diff --git a/src-tauri/icons/android/mipmap-xxhdpi/ic_launcher.png b/src-tauri/icons/android/mipmap-xxhdpi/ic_launcher.png
new file mode 100644
index 0000000..267718c
Binary files /dev/null and b/src-tauri/icons/android/mipmap-xxhdpi/ic_launcher.png differ
diff --git a/src-tauri/icons/android/mipmap-xxhdpi/ic_launcher_foreground.png b/src-tauri/icons/android/mipmap-xxhdpi/ic_launcher_foreground.png
new file mode 100644
index 0000000..f195f61
Binary files /dev/null and b/src-tauri/icons/android/mipmap-xxhdpi/ic_launcher_foreground.png differ
diff --git a/src-tauri/icons/android/mipmap-xxhdpi/ic_launcher_round.png b/src-tauri/icons/android/mipmap-xxhdpi/ic_launcher_round.png
new file mode 100644
index 0000000..f82be34
Binary files /dev/null and b/src-tauri/icons/android/mipmap-xxhdpi/ic_launcher_round.png differ
diff --git a/src-tauri/icons/android/mipmap-xxxhdpi/ic_launcher.png b/src-tauri/icons/android/mipmap-xxxhdpi/ic_launcher.png
new file mode 100644
index 0000000..146bac5
Binary files /dev/null and b/src-tauri/icons/android/mipmap-xxxhdpi/ic_launcher.png differ
diff --git a/src-tauri/icons/android/mipmap-xxxhdpi/ic_launcher_foreground.png b/src-tauri/icons/android/mipmap-xxxhdpi/ic_launcher_foreground.png
new file mode 100644
index 0000000..a0adbf8
Binary files /dev/null and b/src-tauri/icons/android/mipmap-xxxhdpi/ic_launcher_foreground.png differ
diff --git a/src-tauri/icons/android/mipmap-xxxhdpi/ic_launcher_round.png b/src-tauri/icons/android/mipmap-xxxhdpi/ic_launcher_round.png
new file mode 100644
index 0000000..d927c43
Binary files /dev/null and b/src-tauri/icons/android/mipmap-xxxhdpi/ic_launcher_round.png differ
diff --git a/src-tauri/icons/android/values/ic_launcher_background.xml b/src-tauri/icons/android/values/ic_launcher_background.xml
new file mode 100644
index 0000000..ea9c223
--- /dev/null
+++ b/src-tauri/icons/android/values/ic_launcher_background.xml
@@ -0,0 +1,4 @@
+<?xml version="1.0" encoding="utf-8"?>
+<resources>
+    <color name="ic_launcher_background">#fff</color>
+</resources>
\ No newline at end of file
diff --git a/src-tauri/icons/icon.icns b/src-tauri/icons/icon.icns
index b03c81f..54b4a5c 100644
Binary files a/src-tauri/icons/icon.icns and b/src-tauri/icons/icon.icns differ
diff --git a/src-tauri/icons/icon.ico b/src-tauri/icons/icon.ico
index b3636e4..6461e8b 100644
Binary files a/src-tauri/icons/icon.ico and b/src-tauri/icons/icon.ico differ
diff --git a/src-tauri/icons/icon.png b/src-tauri/icons/icon.png
index fecd2b2..afc1318 100644
Binary files a/src-tauri/icons/icon.png and b/src-tauri/icons/icon.png differ
diff --git a/src-tauri/icons/ios/AppIcon-20x20@1x.png b/src-tauri/icons/ios/AppIcon-20x20@1x.png
new file mode 100644
index 0000000..f95ecc5
Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-20x20@1x.png differ
diff --git a/src-tauri/icons/ios/AppIcon-20x20@2x-1.png b/src-tauri/icons/ios/AppIcon-20x20@2x-1.png
new file mode 100644
index 0000000..a30387d
Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-20x20@2x-1.png differ
diff --git a/src-tauri/icons/ios/AppIcon-20x20@2x.png b/src-tauri/icons/ios/AppIcon-20x20@2x.png
new file mode 100644
index 0000000..a30387d
Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-20x20@2x.png differ
diff --git a/src-tauri/icons/ios/AppIcon-20x20@3x.png b/src-tauri/icons/ios/AppIcon-20x20@3x.png
new file mode 100644
index 0000000..5ebf48f
Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-20x20@3x.png differ
diff --git a/src-tauri/icons/ios/AppIcon-29x29@1x.png b/src-tauri/icons/ios/AppIcon-29x29@1x.png
new file mode 100644
index 0000000..077b842
Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-29x29@1x.png differ
diff --git a/src-tauri/icons/ios/AppIcon-29x29@2x-1.png b/src-tauri/icons/ios/AppIcon-29x29@2x-1.png new file mode 100644 index 0000000..ec3c9e1 Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-29x29@2x-1.png differ diff --git a/src-tauri/icons/ios/AppIcon-29x29@2x.png b/src-tauri/icons/ios/AppIcon-29x29@2x.png new file mode 100644 index 0000000..ec3c9e1 Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-29x29@2x.png differ diff --git a/src-tauri/icons/ios/AppIcon-29x29@3x.png b/src-tauri/icons/ios/AppIcon-29x29@3x.png new file mode 100644 index 0000000..09b584b Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-29x29@3x.png differ diff --git a/src-tauri/icons/ios/AppIcon-40x40@1x.png b/src-tauri/icons/ios/AppIcon-40x40@1x.png new file mode 100644 index 0000000..a30387d Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-40x40@1x.png differ diff --git a/src-tauri/icons/ios/AppIcon-40x40@2x-1.png b/src-tauri/icons/ios/AppIcon-40x40@2x-1.png new file mode 100644 index 0000000..cf17dda Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-40x40@2x-1.png differ diff --git a/src-tauri/icons/ios/AppIcon-40x40@2x.png b/src-tauri/icons/ios/AppIcon-40x40@2x.png new file mode 100644 index 0000000..cf17dda Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-40x40@2x.png differ diff --git a/src-tauri/icons/ios/AppIcon-40x40@3x.png b/src-tauri/icons/ios/AppIcon-40x40@3x.png new file mode 100644 index 0000000..493c194 Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-40x40@3x.png differ diff --git a/src-tauri/icons/ios/AppIcon-512@2x.png b/src-tauri/icons/ios/AppIcon-512@2x.png new file mode 100644 index 0000000..5dc2a68 Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-512@2x.png differ diff --git a/src-tauri/icons/ios/AppIcon-60x60@2x.png b/src-tauri/icons/ios/AppIcon-60x60@2x.png new file mode 100644 index 0000000..493c194 Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-60x60@2x.png differ diff --git 
a/src-tauri/icons/ios/AppIcon-60x60@3x.png b/src-tauri/icons/ios/AppIcon-60x60@3x.png new file mode 100644 index 0000000..a3cde38 Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-60x60@3x.png differ diff --git a/src-tauri/icons/ios/AppIcon-76x76@1x.png b/src-tauri/icons/ios/AppIcon-76x76@1x.png new file mode 100644 index 0000000..f88292f Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-76x76@1x.png differ diff --git a/src-tauri/icons/ios/AppIcon-76x76@2x.png b/src-tauri/icons/ios/AppIcon-76x76@2x.png new file mode 100644 index 0000000..94d6fe7 Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-76x76@2x.png differ diff --git a/src-tauri/icons/ios/AppIcon-83.5x83.5@2x.png b/src-tauri/icons/ios/AppIcon-83.5x83.5@2x.png new file mode 100644 index 0000000..882c7e7 Binary files /dev/null and b/src-tauri/icons/ios/AppIcon-83.5x83.5@2x.png differ diff --git a/src-tauri/resources/BlackHole2ch.v0.6.1.pkg b/src-tauri/resources/BlackHole2ch.v0.6.1.pkg new file mode 100644 index 0000000..c15d554 Binary files /dev/null and b/src-tauri/resources/BlackHole2ch.v0.6.1.pkg differ diff --git a/src-tauri/src/lib.rs b/src-tauri/src/lib.rs index 0a0b5ea..fce3a6c 100644 --- a/src-tauri/src/lib.rs +++ b/src-tauri/src/lib.rs @@ -1,4 +1,4 @@ -use tauri::State; +use tauri::{AppHandle, Manager, State, Emitter}; use std::sync::{Arc, Mutex}; use std::process::Command; use cpal::traits::{DeviceTrait, HostTrait, StreamTrait}; @@ -17,6 +17,22 @@ struct AudioDevice { name: String, } +#[derive(serde::Serialize, Clone)] +struct LogEvent { + level: String, + message: String, + timestamp: String, +} + +fn emit_log(app: &AppHandle, level: &str, message: &str) { + let log = LogEvent { + level: level.to_string(), + message: message.to_string(), + timestamp: chrono::Local::now().format("%H:%M:%S").to_string(), + }; + let _ = app.emit("log-event", log); +} + #[tauri::command] fn greet(name: &str) -> String { format!("Hello, {}! 
You've been greeted from Rust!", name) @@ -41,22 +57,11 @@ fn get_input_devices() -> Result, String> { Ok(result) } -#[tauri::command] -fn install_driver() -> Result { - let output = Command::new("brew") - .args(["install", "blackhole-2ch"]) - .output() - .map_err(|e| format!("Failed to execute command: {}", e))?; - if output.status.success() { - Ok(String::from_utf8_lossy(&output.stdout).to_string()) - } else { - Err(String::from_utf8_lossy(&output.stderr).to_string()) - } -} #[tauri::command] -fn start_recording(state: State<'_, AppState>, device_id: String) -> Result<(), String> { +fn start_recording(app: AppHandle, state: State<'_, AppState>, device_id: String, save_path: Option) -> Result<(), String> { + emit_log(&app, "INFO", &format!("Starting recording on device: {}", device_id)); let host = cpal::default_host(); // Find device by name (using name as ID) @@ -75,16 +80,31 @@ fn start_recording(state: State<'_, AppState>, device_id: String) -> Result<(), sample_format: hound::SampleFormat::Int, }; - // Create a temporary file - let temp_dir = std::env::temp_dir(); - let file_path = temp_dir.join(format!("recording_{}.wav", std::time::SystemTime::now().duration_since(std::time::UNIX_EPOCH).unwrap().as_secs())); + // Determine file path: User provided or Temp + let file_path = if let Some(path) = save_path { + if path.trim().is_empty() { + std::env::temp_dir().join(format!("recording_{}.wav", std::time::SystemTime::now().duration_since(std::time::UNIX_EPOCH).unwrap().as_secs())) + } else { + // Check if directory exists, if not try to create it or error out? + // For now, assume user gives a valid directory. We'll append filename. 
+ std::path::PathBuf::from(path).join(format!("recording_{}.wav", std::time::SystemTime::now().duration_since(std::time::UNIX_EPOCH).unwrap().as_secs())) + } + } else { + std::env::temp_dir().join(format!("recording_{}.wav", std::time::SystemTime::now().duration_since(std::time::UNIX_EPOCH).unwrap().as_secs())) + }; + let file_path_str = file_path.to_string_lossy().to_string(); + emit_log(&app, "INFO", &format!("Saving recording to: {}", file_path_str)); let writer = hound::WavWriter::create(&file_path, spec).map_err(|e| e.to_string())?; let writer = Arc::new(Mutex::new(writer)); let writer_clone = writer.clone(); - let err_fn = |err| eprintln!("an error occurred on stream: {}", err); + let app_handle = app.clone(); + let err_fn = move |err| { + eprintln!("an error occurred on stream: {}", err); + emit_log(&app_handle, "ERROR", &format!("Stream error: {}", err)); + }; let stream = match config.sample_format() { cpal::SampleFormat::F32 => device.build_input_stream( @@ -128,13 +148,15 @@ fn start_recording(state: State<'_, AppState>, device_id: String) -> Result<(), // Store state *state.recording_stream.lock().unwrap() = Some(stream); - *state.recording_file_path.lock().unwrap() = Some(file_path_str); + *state.recording_file_path.lock().unwrap() = Some(file_path_str.clone()); + emit_log(&app, "SUCCESS", &format!("Recording started. 
File: {}", file_path_str)); Ok(()) } #[tauri::command] -fn stop_recording(state: State<'_, AppState>) -> Result { +fn stop_recording(app: AppHandle, state: State<'_, AppState>) -> Result { + emit_log(&app, "INFO", "Stopping recording..."); // Drop stream to stop recording { let mut stream_guard = state.recording_stream.lock().unwrap(); @@ -146,7 +168,35 @@ fn stop_recording(state: State<'_, AppState>) -> Result { // Return file path let mut path_guard = state.recording_file_path.lock().unwrap(); - path_guard.take().ok_or("No recording path found".to_string()) + let path = path_guard.take().ok_or("No recording path found".to_string())?; + emit_log(&app, "SUCCESS", &format!("Recording stopped. Saved to: {}", path)); + Ok(path) +} + +#[tauri::command] +fn pause_recording(app: AppHandle, state: State<'_, AppState>) -> Result<(), String> { + emit_log(&app, "INFO", "Pausing recording..."); + let stream_guard = state.recording_stream.lock().unwrap(); + if let Some(stream) = stream_guard.as_ref() { + stream.pause().map_err(|e| e.to_string())?; + emit_log(&app, "SUCCESS", "Recording paused."); + Ok(()) + } else { + Err("Not recording".to_string()) + } +} + +#[tauri::command] +fn resume_recording(app: AppHandle, state: State<'_, AppState>) -> Result<(), String> { + emit_log(&app, "INFO", "Resuming recording..."); + let stream_guard = state.recording_stream.lock().unwrap(); + if let Some(stream) = stream_guard.as_ref() { + stream.play().map_err(|e| e.to_string())?; + emit_log(&app, "SUCCESS", "Recording resumed."); + Ok(()) + } else { + Err("Not recording".to_string()) + } } #[derive(serde::Deserialize)] @@ -157,6 +207,7 @@ struct ModelListResponse { #[derive(serde::Deserialize)] struct ModelData { id: String, + #[allow(dead_code)] owned_by: Option, } @@ -177,6 +228,7 @@ struct Choice { } #[derive(serde::Deserialize)] struct Message { + #[allow(dead_code)] content: String, } @@ -187,20 +239,27 @@ struct ModelInfo { } #[tauri::command] -async fn get_available_models(api_key: 
String, product_id: String) -> Result, String> { +async fn get_available_models(app: AppHandle, api_key: String, product_id: String) -> Result, String> { + emit_log(&app, "INFO", "Fetching available models from Infomaniak..."); let client = reqwest::Client::new(); // Use the v2/openai compliant endpoint as per docs let url = format!("https://api.infomaniak.com/2/ai/{}/openai/v1/models", product_id); + emit_log(&app, "DEBUG", &format!("GET {}", url)); + let res = client.get(&url) .header("Authorization", format!("Bearer {}", api_key)) .send() .await - .map_err(|e| e.to_string())?; + .map_err(|e| { + let msg = format!("Network error fetching models: {}", e); + emit_log(&app, "ERROR", &msg); + msg + })?; if res.status().is_success() { let raw_body = res.text().await.map_err(|e| e.to_string())?; - println!("Models Raw Response: {}", raw_body); + // println!("Models Raw Response: {}", raw_body); let list: ModelListResponse = serde_json::from_str(&raw_body) .map_err(|e| format!("Failed to parse models: {}. Body: {}", e, raw_body))?; @@ -209,20 +268,34 @@ async fn get_available_models(api_key: String, product_id: String) -> Result>(); + emit_log(&app, "SUCCESS", &format!("Loaded {} models.", models.len())); Ok(models) } else { - // Fallback to v1 if v2 fails or try another common path? 
-        // For now just error out
         let err = res.text().await.unwrap_or_default();
+        emit_log(&app, "ERROR", &format!("Failed to fetch models: {}", err));
         Err(format!("Failed to fetch models: {}", err))
     }
 }

+#[derive(serde::Deserialize)]
+struct WhisperVerboseResponse {
+    text: Option<String>,
+    segments: Option<Vec<Segment>>,
+}
+
+#[derive(serde::Deserialize)]
+struct Segment {
+    start: f64,
+    end: f64,
+    text: String,
+}
+
 #[tauri::command]
-async fn transcribe_audio(file_path: String, api_key: String, product_id: String) -> Result<String, String> {
+async fn transcribe_audio(app: AppHandle, file_path: String, api_key: String, product_id: String) -> Result<String, String> {
+    emit_log(&app, "INFO", "Starting transcription with timestamps...");
     let client = reqwest::Client::new();

     // Prepare file part
@@ -235,44 +308,88 @@ async fn transcribe_audio(file_path: String, api_key: String, product_id: String
     let form = reqwest::multipart::Form::new()
         .part("file", file_part)
-        .text("model", "whisper");
+        .text("model", "whisper")
+        .text("response_format", "verbose_json")
+        .text("timestamp_granularities[]", "segment"); // Crucial for accurate segments

     let url = format!("https://api.infomaniak.com/1/ai/{}/openai/audio/transcriptions", product_id);
+    emit_log(&app, "DEBUG", &format!("POST {}", url));
+
     let res = client.post(&url)
         .header("Authorization", format!("Bearer {}", api_key))
         .multipart(form)
         .send()
         .await
-        .map_err(|e| e.to_string())?;
+        .map_err(|e| {
+            let msg = format!("Network error during transcription: {}", e);
+            emit_log(&app, "ERROR", &msg);
+            msg
+        })?;

     if res.status().is_success() {
         let raw_body = res.text().await.map_err(|e| e.to_string())?;
-        println!("Transcription Raw Response: {}", raw_body);
-        // Attempt to parse text or batch_id
-        // Attempt to parse text or batch_id
-        let response: WhisperResponse = serde_json::from_str(&raw_body)
+        // Check if we got a batch ID
+        #[derive(serde::Deserialize)]
+        struct BatchResponse {
+            batch_id: Option<String>,
+        }
+
+        // Try parsing as batch response first (Infomaniak
specific behavior)
+        if let Ok(batch_res) = serde_json::from_str::<BatchResponse>(&raw_body) {
+            if let Some(batch_id) = batch_res.batch_id {
+                emit_log(&app, "INFO", &format!("Transcription queued. Batch ID: {}", batch_id));
+                return poll_transcription(&app, &client, &api_key, &product_id, &batch_id).await;
+            }
+        }
+
+        // If not batch, try parsing verbose response directly
+        // Log the raw body so we can see why it fails
+        emit_log(&app, "DEBUG", &format!("Direct Response (first 500 chars): {:.500}", raw_body));
+
+        let response: WhisperVerboseResponse = serde_json::from_str(&raw_body)
             .map_err(|e| format!("Failed to decode JSON: {}. Body: {}", e, raw_body))?;

-        match (response.text, response.batch_id) {
-            (Some(text), _) => Ok(text),
-            (_, Some(batch_id)) => {
-                // Need to poll
-                poll_transcription(&client, &api_key, &product_id, &batch_id).await
-            },
-            _ => Err(format!("Response contained neither text nor batch_id. Body: {}", raw_body))
+        if let Some(segments) = response.segments {
+            emit_log(&app, "INFO", &format!("Found {} segments (Direct).", segments.len()));
+            for (i, seg) in segments.iter().take(3).enumerate() {
+                emit_log(&app, "DEBUG", &format!("Seg {}: start={}", i, seg.start));
+            }
+
+            // Format timestamps: [MM:SS] Text
+            let mut formatted_transcript = String::new();
+            for segment in segments {
+                let start_mins = (segment.start / 60.0).floor() as u64;
+                let start_secs = (segment.start % 60.0).floor() as u64;
+                formatted_transcript.push_str(&format!("[{:02}:{:02}] {}\n", start_mins, start_secs, segment.text.trim()));
+            }
+
+            // Fallback to raw text if segments empty
+            if formatted_transcript.trim().is_empty() {
+                if let Some(text) = response.text {
+                    emit_log(&app, "SUCCESS", "Segments missing, using raw text.");
+                    return Ok(text);
+                }
+            } else {
+                emit_log(&app, "SUCCESS", "Transcription received with timestamps.");
+                return Ok(formatted_transcript);
+            }
+        } else if let Some(text) = response.text {
+            emit_log(&app, "SUCCESS", "Segments missing, using raw text.");
+
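The triage in `transcribe_audio` (a queued `batch_id` versus a direct response carrying `text`) can be isolated as a pure function. A hedged sketch, using only the field names that appear in this diff:

```typescript
// Sketch of the response triage done in transcribe_audio: Infomaniak may
// answer with a queued batch id (to be polled via /results/{batch_id}),
// or directly with transcription content. Field names follow this diff.
type TranscriptionOutcome =
  | { kind: "batch"; batchId: string }
  | { kind: "direct"; text: string }
  | { kind: "unknown" };

function classifyTranscriptionResponse(raw: string): TranscriptionOutcome {
  const json = JSON.parse(raw);
  if (typeof json.batch_id === "string") {
    return { kind: "batch", batchId: json.batch_id };
  }
  if (typeof json.text === "string") {
    return { kind: "direct", text: json.text };
  }
  return { kind: "unknown" };
}
```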
return Ok(text);
         }
+
+        emit_log(&app, "ERROR", "Response contained no recognized content.");
+        Err(format!("Response contained no recognized content. Body: {}", raw_body))
     } else {
         let error_text = res.text().await.unwrap_or_default();
+        emit_log(&app, "ERROR", &format!("Transcription failed: {}", error_text));
         Err(format!("Transcription failed: {}", error_text))
     }
 }

-async fn poll_transcription(client: &reqwest::Client, api_key: &str, product_id: &str, batch_id: &str) -> Result<String, String> {
-    // Polling URL: /1/ai/{product_id}/results/{batch_id} (or similar, verifying via trial)
-    // If that fails, we can try /openai/audio/transcriptions/{batch_id} but documentation suggests results endpoint.
-    // Let's assume the standard Infomaniak pattern for batches.
+async fn poll_transcription(app: &AppHandle, client: &reqwest::Client, api_key: &str, product_id: &str, batch_id: &str) -> Result<String, String> {
     let status_url = format!("https://api.infomaniak.com/1/ai/{}/results/{}", product_id, batch_id);

     let mut attempts = 0;
@@ -280,6 +397,7 @@ async fn poll_transcription(client: &reqwest::Client, api_key: &str, product_id:
         attempts += 1;
         sleep(Duration::from_secs(2)).await;
+        emit_log(app, "DEBUG", &format!("Polling status... Attempt {}", attempts));
         let res = client.get(&status_url)
             .header("Authorization", format!("Bearer {}", api_key))
             .send()
             .await
@@ -301,31 +419,63 @@
                 if dl_res.status().is_success() {
                     let content = dl_res.text().await.map_err(|e| e.to_string())?;
-
-                    // Try to parse the content as JSON to see if it's { "text": "..."
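Both the direct path and the async polling path render segment start times as `[MM:SS]`. The same arithmetic, sketched in TypeScript for reference (note that `formatTranscript` joins lines with `\n` rather than appending a trailing newline per line as the Rust loop does):

```typescript
// Mirrors the Rust formatting: floor minutes and seconds, zero-pad to 2 digits.
interface Segment {
  start: number; // start offset in seconds, possibly fractional
  end: number;
  text: string;
}

function formatTimestamp(startSeconds: number): string {
  const mins = Math.floor(startSeconds / 60);
  const secs = Math.floor(startSeconds % 60);
  return `[${String(mins).padStart(2, "0")}:${String(secs).padStart(2, "0")}]`;
}

// Build a "[MM:SS] text" line per segment, trimming segment text.
function formatTranscript(segments: Segment[]): string {
  return segments
    .map((s) => `${formatTimestamp(s.start)} ${s.text.trim()}`)
    .join("\n");
}
```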
}
-                    if let Ok(json_val) = serde_json::from_str::<serde_json::Value>(&content) {
-                        if let Some(text_content) = json_val.get("text").and_then(|t| t.as_str()) {
-                            return Ok(text_content.to_string());
-                        }
+                    emit_log(app, "DEBUG", &format!("Poll Raw Content (first 500 chars): {:.500}", content));
+
+                    // Try to parse as Verbose JSON to get timestamps
+                    if let Ok(response) = serde_json::from_str::<WhisperVerboseResponse>(&content) {
+                        if let Some(segments) = response.segments {
+                            emit_log(app, "INFO", &format!("Found {} segments.", segments.len()));
+                            // Log first 3 segments start times
+                            for (i, seg) in segments.iter().take(3).enumerate() {
+                                emit_log(app, "DEBUG", &format!("Seg {}: start={}", i, seg.start));
+                            }
+
+                            let mut formatted_transcript = String::new();
+                            for segment in segments {
+                                let start_mins = (segment.start / 60.0).floor() as u64;
+                                let start_secs = (segment.start % 60.0).floor() as u64;
+                                formatted_transcript.push_str(&format!("[{:02}:{:02}] {}\n", start_mins, start_secs, segment.text.trim()));
+                            }
+                            if !formatted_transcript.trim().is_empty() {
+                                emit_log(app, "SUCCESS", "Transcription completed (async) with timestamps.");
+                                return Ok(formatted_transcript);
+                            } else {
+                                emit_log(app, "WARN", "Segments found but empty content.");
+                            }
+                        } else {
+                            emit_log(app, "WARN", "Verbose parsed but no segments found.");
+                        }
+
+                        if let Some(text) = response.text {
+                            emit_log(app, "SUCCESS", "Transcription completed (async) - raw text (segments missing).");
+                            return Ok(text);
+                        }
+                    } else {
+                        emit_log(app, "WARN", "Failed to parse poll content as WhisperVerboseResponse");
                     }
-
+
+                    emit_log(app, "SUCCESS", "Transcription completed - returning raw content.");
                     // If not JSON or no text field, return raw content
                     return Ok(content);
                 } else {
+                    emit_log(app, "ERROR", "Failed to download transcription results.");
                     return Err(format!("Download failed: {}", dl_res.status()));
                 }
             } else if status == "failed" || status == "error" {
+                emit_log(app, "ERROR", &format!("Batch processing failed: {:?}", json));
                 return
Err(format!("Batch processing failed: {:?}", json));
             }
             // If 'processing' or 'pending', continue loop
             }
         }
     }
+    emit_log(app, "ERROR", "Transcription timed out after 80s.");
     Err("Transcription timed out".to_string())
 }

 #[tauri::command]
-async fn summarize_text(text: String, api_key: String, product_id: String, prompt: String, model: String) -> Result<String, String> {
+async fn summarize_text(app: AppHandle, text: String, api_key: String, product_id: String, prompt: String, model: String) -> Result<String, String> {
+    emit_log(&app, "INFO", "Starting summarization...");
     let client = reqwest::Client::new();
     let url = format!("https://api.infomaniak.com/2/ai/{}/openai/v1/chat/completions", product_id);
@@ -341,36 +491,58 @@ async fn summarize_text(text: String, api_key: String, product_id: String, promp
         "messages": messages
     });

+    emit_log(&app, "DEBUG", &format!("POST {}", url));
+
     let res = client.post(&url)
         .header("Authorization", format!("Bearer {}", api_key))
         .header("Content-Type", "application/json")
         .json(&body)
         .send()
         .await
-        .map_err(|e| e.to_string())?;
+        .map_err(|e| {
+            let msg = format!("Network error during summarization: {}", e);
+            emit_log(&app, "ERROR", &msg);
+            msg
+        })?;

     if res.status().is_success() {
         let raw_body = res.text().await.map_err(|e| e.to_string())?;
-        println!("Summarization Raw Response: {}", raw_body);
+        // println!("Summarization Raw Response: {}", raw_body);
         let response_body: ChatCompletionResponse = serde_json::from_str(&raw_body)
             .map_err(|e| format!("Failed to decode JSON: {}.
Body: {}", e, raw_body))?; if let Some(choice) = response_body.choices.first() { + emit_log(&app, "SUCCESS", "Summarization received."); Ok(choice.message.content.clone()) } else { + emit_log(&app, "WARN", "No summary generated in response."); Err("No summary generated".to_string()) } } else { let error_text = res.text().await.unwrap_or_default(); + emit_log(&app, "ERROR", &format!("Summarization failed: {}", error_text)); Err(format!("Summarization failed: {}", error_text)) } } +#[tauri::command] +fn open_audio_midi_setup() -> Result<(), String> { + Command::new("open") + .arg("-a") + .arg("Audio MIDI Setup") + .spawn() + .map_err(|e| e.to_string())?; + Ok(()) +} + + + #[cfg_attr(mobile, tauri::mobile_entry_point)] pub fn run() { tauri::Builder::default() .plugin(tauri_plugin_opener::init()) + .plugin(tauri_plugin_dialog::init()) .manage(AppState { recording_stream: Mutex::new(None), recording_file_path: Mutex::new(None), @@ -378,12 +550,14 @@ pub fn run() { .invoke_handler(tauri::generate_handler![ greet, get_input_devices, - install_driver, start_recording, stop_recording, + pause_recording, + resume_recording, transcribe_audio, summarize_text, - get_available_models + get_available_models, + open_audio_midi_setup ]) .run(tauri::generate_context!()) .expect("error while running tauri application"); diff --git a/src-tauri/tauri.conf.json b/src-tauri/tauri.conf.json index c2ebc5a..50d7380 100644 --- a/src-tauri/tauri.conf.json +++ b/src-tauri/tauri.conf.json @@ -2,7 +2,7 @@ "$schema": "https://schema.tauri.app/config/2", "productName": "Hearbit AI", "version": "0.1.0", - "identifier": "com.hearbit-ai.app", + "identifier": "com.hearbit-ai.desktop", "build": { "beforeDevCommand": "npm run dev", "devUrl": "http://localhost:1420", @@ -13,8 +13,8 @@ "windows": [ { "title": "Hearbit AI", - "width": 800, - "height": 600 + "width": 1000, + "height": 800 } ], "security": { @@ -30,6 +30,9 @@ "icons/128x128@2x.png", "icons/icon.icns", "icons/icon.ico" + ], + "resources": [ + 
"resources/BlackHole2ch.v0.6.1.pkg" ] } } \ No newline at end of file diff --git a/src/App.tsx b/src/App.tsx index e6a2ddf..32f5aad 100644 --- a/src/App.tsx +++ b/src/App.tsx @@ -1,23 +1,145 @@ -import { useState } from "react"; +import { useState, useEffect } from 'react'; +import { listen } from "@tauri-apps/api/event"; +import { Settings as SettingsIcon } from "lucide-react"; import Settings from "./components/Settings"; import Recorder from "./components/Recorder"; +import LogViewer, { LogEntry } from "./components/LogViewer"; +import TranscriptionView from "./components/TranscriptionView"; +import Tabs from "./components/Tabs"; -interface PromptTemplate { +export interface PromptTemplate { id: string; name: string; content: string; } function App() { - const [view, setView] = useState<'recorder' | 'settings'>('recorder'); + const [view, setView] = useState<'recorder' | 'logs' | 'settings' | 'transcription'>('recorder'); + // Keep track of the *previous* tab to return to from settings + const [lastTab, setLastTab] = useState<'recorder' | 'logs' | 'transcription'>('recorder'); + const [apiKey, setApiKey] = useState(localStorage.getItem('infomaniak_api_key') || ''); const [productId, setProductId] = useState(localStorage.getItem('infomaniak_product_id') || ''); + const [savePath, setSavePath] = useState(localStorage.getItem('infomaniak_save_path') || ''); // Default prompts if none exist + /* eslint-disable no-useless-escape */ // Escape quotes in prompts const defaultPrompts: PromptTemplate[] = [ - { id: '1', name: 'General Summary', content: 'Summarize the following text into clear bullet points.' }, - { id: '2', name: 'Action Items', content: 'Extract all action items and tasks from this text.' }, - { id: '3', name: 'Email Draft', content: 'Draft a follow-up email based on this conversation.' } + { + id: '1', + name: 'Meeting Protokoll (General)', + content: `Rolle: Du bist ein hochprofessioneller, effizienter Protokollführer und persönlicher Assistent. 
Deine Aufgabe ist es, aus dem untenstehenden Roh-Transkript (oder den Notizen) ein strukturiertes, leicht lesbares und handlungsorientiertes Ergebnisprotokoll zu erstellen. + +Anweisungen: +- Filterung: Ignoriere Smalltalk, Füllwörter, Begrüßungen und irrelevante Abschweifungen. Konzentriere dich auf Fakten, Entscheidungen und Aufgaben. +- Tonalität: Schreibe sachlich, objektiv und präzise (Business German). +- Klarheit: Formuliere vage Aussagen in klare Sätze um, ohne den Sinn zu verändern. +- Zuordnung: Ordne Aufgaben immer einer Person zu (wenn im Text genannt). + +Gewünschte Struktur des Outputs: + +# Meeting Protokoll: [Thema des Meetings] + +Datum: [Datum einfügen, falls bekannt, sonst "N/A"] +Anwesende: [Liste der Namen] + +## 1. Management Summary +Eine kurze Zusammenfassung der wichtigsten Punkte in 3-5 Sätzen. Worum ging es im Kern? + +## 2. Wichtige Entscheidungen +[Entscheidung 1] +[Entscheidung 2] +(Liste hier nur Dinge auf, die explizit beschlossen wurden) + +## 3. Offene Fragen / Diskussionspunkte +Kurze Stichpunkte zu Themen, die besprochen, aber noch nicht final geklärt wurden. + +## 4. Action Items (To-Dos) +| Was ist zu tun? | Wer ist verantwortlich? | Bis wann? (falls genannt) | +| :--- | :--- | :--- | +| [Aufgabe 1] | [Name] | [Datum] | +| [Aufgabe 2] | [Name] | [Datum] | + +## 5. Nächste Schritte / Nächstes Meeting +Kurze Info zum weiteren Vorgehen.` + }, + { + id: '2', + name: '1:1 Gespräch / Jour Fixe', + content: `Rolle: Du bist ein diskreter und empathischer Executive Assistant. Deine Aufgabe ist es, ein 1:1 Gespräch zwischen einem Mitarbeiter und seinem Vorgesetzten zusammenzufassen. Das Gespräch beinhaltet sowohl geschäftliche als auch private/persönliche Themen. + +Wichtige Anweisungen: +- Kategorisierung: Trenne strikt zwischen "Persönlichem/Atmosphäre" und "Operativem Business". +- Tonalität bei Privatem: Fasse private Themen (Wochenende, Hobbys, Familie, Wohlbefinden) als fließenden Text zusammen. Sei hier warmherzig, aber diskret. 
Vermeide Stichpunkte, das wirkt bei Privatem zu mechanisch. +- Tonalität bei Business: Sei hier gewohnt präzise, faktenbasiert und nutze Bulletpoints. +- Sensibilität: Wenn über Feedback, Karriereentwicklung oder Kritik gesprochen wurde, fasse dies in einem separaten Abschnitt neutral und konstruktiv zusammen. + +Gewünschte Struktur des Outputs: + +# 1:1 Gesprächszusammenfassung + +Datum: [Datum] +Teilnehmer: [Namen] + +## 1. Persönlicher Check-in & Atmosphäre +[Hier bitte einen kurzen Fließtext schreiben: Wie geht es den Teilnehmern? Was wurde an privaten Updates geteilt (z.B. Urlaub, Familie, Hobbys)? Wie war die Grundstimmung des Gesprächs?] + +## 2. Operative Themen (Business Updates) +Thema A: [Kurze Zusammenfassung] +Thema B: [Kurze Zusammenfassung] +(Führe hier die konkreten Arbeitsthemen auf) + +## 3. Feedback & Entwicklung +[Hier notieren, falls über Karriereziele, Gehalt, Weiterbildung oder gegenseitiges Feedback gesprochen wurde. Falls nicht besprochen: "Keine Themen in diesem Meeting".] + +## 4. Vereinbarungen & Action Items +| Wer? | Was ist zu tun / zu beachten? | Bis wann? | +| :--- | :--- | :--- | +| [Name] | [Aufgabe] | [Datum] | +| [Name] | [Aufgabe] | [Datum] |` + }, + { + id: '3', + name: 'Kundenmeeting (Official)', + content: `Rolle: Du bist ein professioneller Account Manager und Business Analyst. Erstelle ein Protokoll für ein Kundenmeeting. Deine Tonalität ist höflich, verbindlich und extrem präzise. Das Ergebnis soll so formuliert sein, dass es theoretisch direkt an den Kunden gesendet werden kann. + +Wichtige Anweisungen: +- Fokus auf Vereinbarungen: Was wurde genau beschlossen? Wenn der Kunde eine Anforderung geändert hat, notiere das explizit. +- Verantwortlichkeiten trennen: Trenne bei den To-Dos strikt zwischen "Aufgaben für uns (Auftragnehmer)" und "Aufgaben für den Kunden" (z.B. Material liefern, Freigaben). +- Klarheit vor Länge: Vermeide interne Fachsprache, wenn möglich. Schreibe so, dass der Kunde es versteht. 
+- Diplomatie: Falls es Konflikte gab, formuliere diese lösungsorientiert und neutral (z.B. statt "Kunde hat sich beschwert" schreibe "Es wurde Feedback zu X besprochen, Lösungsweg ist Y"). + +Gewünschte Struktur des Outputs: + +# Ergebnisprotokoll: [Projektname / Thema] + +Datum: [Datum] +Teilnehmer: [Namen Kunden] & [Namen Intern] + +## 1. Zusammenfassung (Executive Summary) +2-3 Sätze zum Ziel des Termins und dem aktuellen Status (z.B. "Wir haben den aktuellen Design-Sprint besprochen und die Anforderungen für Phase 2 finalisiert.") + +## 2. Besprochene Punkte & Projektstatus +[Thema 1]: Kurze Zusammenfassung der Diskussion. +[Thema 2]: Kurze Zusammenfassung der Diskussion. + +## 3. Wichtige Entscheidungen & Genehmigungen +(Dies ist der wichtigste Teil für die Absicherung!) +✅ Beschluss: [Was wurde final entschieden/freigegeben?] +🔄 Änderung: [Wurde der Projektumfang geändert? Neue Anforderungen?] + +## 4. Action Items (Wer macht was?) + +👉 Aufgaben für uns (Intern): +[ ] [Aufgabe] (bis [Datum]) +[ ] [Aufgabe] (bis [Datum]) + +👉 Aufgaben für den Kunden (To-Dos / Lieferungen): +[ ] [Aufgabe, z.B. Zugangdaten senden, Design freigeben] (bis [Datum]) + +## 5. Nächster Termin / Timeline +Wann findet das nächste Meeting statt oder was ist der nächste Meilenstein?` + } ]; const [prompts, setPrompts] = useState(() => { @@ -25,14 +147,16 @@ function App() { return saved ? 
JSON.parse(saved) : defaultPrompts;
   });

-  const handleSaveSettings = (newApiKey: string, newProductId: string, newPrompts: PromptTemplate[]) => {
+  const handleSaveSettings = (newApiKey: string, newProductId: string, newPrompts: PromptTemplate[], newSavePath: string) => {
     setApiKey(newApiKey);
     setProductId(newProductId);
     setPrompts(newPrompts);
+    setSavePath(newSavePath);
     localStorage.setItem('infomaniak_api_key', newApiKey);
     localStorage.setItem('infomaniak_product_id', newProductId);
     localStorage.setItem('infomaniak_prompts', JSON.stringify(newPrompts));
-    setView('recorder');
+    localStorage.setItem('infomaniak_save_path', newSavePath);
+    setView(lastTab);
   };

   // State for Recorder (lifted to persist across view changes)
@@ -76,19 +200,56 @@ function App() {
   const handleLoadHistory = (item: HistoryItem) => {
     setTranscription(item.transcription);
     setSummary(item.summary);
+    setView('recorder'); // Ensure we go back to recorder to see it
   };

+  // Logs State
+  const [logs, setLogs] = useState<LogEntry[]>([]);
+  useEffect(() => {
+    const unlisten = listen<LogEntry>('log-event', (event) => {
+      setLogs((prevLogs) => [...prevLogs, event.payload]);
+    });
+
+    return () => {
+      unlisten.then(f => f());
+    };
+  }, []);

   return (
-
-
- {view === 'recorder' ? ( +
+ {/* Top Navigation Bar */} + {view !== 'settings' && ( +
+
+ +
+ + setView(t)} + /> +
+ )} + +
+ {view === 'recorder' && ( setView('settings')} + onOpenSettings={() => { + setLastTab('recorder'); + setView('settings'); + }} transcription={transcription} setTranscription={setTranscription} summary={summary} @@ -97,23 +258,30 @@ function App() { onSaveToHistory={handleSaveToHistory} onDeleteHistory={handleDeleteHistory} onLoadHistory={handleLoadHistory} + savePath={savePath} + onRecordingComplete={() => setView('transcription')} /> - ) : ( + )} + + {view === 'transcription' && ( + + )} + + {view === 'logs' && ( + + )} + + {view === 'settings' && ( setView('recorder')} - initialApiKey={apiKey} - initialProductId={productId} - initialPrompts={prompts} + onClose={() => setView(lastTab)} + apiKey={apiKey} + productId={productId} + prompts={prompts} + savePath={savePath} /> )}
- - {view === 'settings' && ( -
- {/* Close button handled inside Settings typically */} -
- )}
);
 }

diff --git a/src/components/LogViewer.tsx b/src/components/LogViewer.tsx
new file mode 100644
index 0000000..577ca5f
--- /dev/null
+++ b/src/components/LogViewer.tsx
@@ -0,0 +1,91 @@
+import React, { useEffect, useRef } from 'react';
+import { Terminal, Clock, AlertCircle, Info, CheckCircle, Activity } from 'lucide-react';
+
+export interface LogEntry {
+  level: string;
+  message: string;
+  timestamp: string;
+}
+
+interface LogViewerProps {
+  logs: LogEntry[];
+}
+
+const LogViewer: React.FC<LogViewerProps> = ({ logs }) => {
+  const endRef = useRef<HTMLDivElement | null>(null);
+
+  useEffect(() => {
+    endRef.current?.scrollIntoView({ behavior: 'smooth' });
+  }, [logs]);
+
+  const getIcon = (level: string) => {
+    switch (level) {
+      case 'ERROR': return <AlertCircle />;
+      case 'WARN': return <AlertCircle />;
+      case 'SUCCESS': return <CheckCircle />;
+      case 'DEBUG': return <Activity />;
+      default: return <Info />;
+    }
+  };
+
+  const getColor = (level: string) => {
+    switch (level) {
+      case 'ERROR': return 'text-destructive';
+      case 'WARN': return 'text-yellow-500';
+      case 'SUCCESS': return 'text-green-500';
+      case 'DEBUG': return 'text-muted-foreground';
+      default: return 'text-foreground';
+    }
+  };
+
+  const handleExportLogs = () => {
+    const text = logs.map(l => `[${l.timestamp}] [${l.level}] ${l.message}`).join('\n');
+    const blob = new Blob([text], { type: 'text/plain' });
+    const url = URL.createObjectURL(blob);
+    const a = document.createElement('a');
+    a.href = url;
+    a.download = `hearbit_logs_${new Date().toISOString().slice(0, 10)}.txt`;
+    document.body.appendChild(a);
+    a.click();
+    document.body.removeChild(a);
+    URL.revokeObjectURL(url);
+  };
+
+  return (
+
+
+ System Logs + +
+
+ {logs.length === 0 && ( +
+ No logs yet... +
+ )} + {logs.map((log, i) => ( +
+ + + {log.timestamp} + +
+ {getIcon(log.level)} +
+ + {log.message} + +
+ ))} +
+
+
+ ); +}; + +export default LogViewer; diff --git a/src/components/Recorder.tsx b/src/components/Recorder.tsx index fb3473a..659f7c3 100644 --- a/src/components/Recorder.tsx +++ b/src/components/Recorder.tsx @@ -1,8 +1,7 @@ import React, { useState, useEffect } from 'react'; -import { Mic, Square, Settings as SettingsIcon, FileText, Sparkles, Copy, Check } from 'lucide-react'; +import { Mic, Square } from 'lucide-react'; import { invoke } from "@tauri-apps/api/core"; import logo from '../assets/logo.png'; // Import logo -import ReactMarkdown from 'react-markdown'; // Import Markdown renderer interface PromptTemplate { id: string; @@ -22,7 +21,7 @@ interface RecorderProps { productId: string; prompts: PromptTemplate[]; onOpenSettings: () => void; - // Lifted State Props + // Lifted State Props (still passed for state management, though unused in view) transcription: string; setTranscription: (val: string) => void; summary: string; @@ -32,6 +31,8 @@ interface RecorderProps { onSaveToHistory: (t?: string, s?: string) => void; onDeleteHistory: (id: string) => void; onLoadHistory: (item: HistoryItem) => void; + savePath: string | null; + onRecordingComplete: () => void; } interface AudioDevice { @@ -39,30 +40,19 @@ interface AudioDevice { name: string; } -const LLM_MODELS = [ - { id: 'mixtral', name: 'Mixtral 8x7B (Best for Logic)' }, - { id: 'llama3', name: 'Llama 3 (Balanced)' }, - { id: 'llama3-70b', name: 'Llama 3 70B (High Quality)' }, - { id: 'granite-code', name: 'Granite Code (Coding)' } -]; - const Recorder: React.FC = ({ - apiKey, productId, prompts, onOpenSettings, - transcription, setTranscription, summary, setSummary, - history, onSaveToHistory, onDeleteHistory, onLoadHistory + apiKey, productId, prompts, + setTranscription, setSummary, + onSaveToHistory, savePath, onRecordingComplete }) => { const [isRecording, setIsRecording] = useState(false); + const [isPaused, setIsPaused] = useState(false); const [status, setStatus] = useState('Ready to record'); const 
[selectedDevice, setSelectedDevice] = useState(''); const [selectedPromptId, setSelectedPromptId] = useState(''); const [selectedModel, setSelectedModel] = useState('mixtral'); const [devices, setDevices] = useState([]); - const [availableModels, setAvailableModels] = useState>(LLM_MODELS); - const [showHistory, setShowHistory] = useState(false); // Toggle history view - - // Local state for copy feedback only - const [copiedTrans, setCopiedTrans] = useState(false); - const [copiedSum, setCopiedSum] = useState(false); + const [availableModels, setAvailableModels] = useState>([]); useEffect(() => { loadDevices(); @@ -75,7 +65,6 @@ const Recorder: React.FC = ({ try { const models = await invoke>('get_available_models', { apiKey, productId }); if (models && models.length > 0) { - // Sort models alphabetically models.sort((a, b) => a.name.localeCompare(b.name)); setAvailableModels(models); } @@ -84,13 +73,11 @@ const Recorder: React.FC = ({ } }; - // Set default prompt selection useEffect(() => { if (prompts.length > 0 && !selectedPromptId) { setSelectedPromptId(prompts[0].id); } else if (prompts.length > 0 && selectedPromptId) { - // Check if selected still exists if (!prompts.find(p => p.id === selectedPromptId)) { setSelectedPromptId(prompts[0].id); } @@ -100,20 +87,42 @@ const Recorder: React.FC = ({ const loadDevices = async () => { try { const devList = await invoke('get_input_devices'); - setDevices(devList); - if (devList.length > 0 && !selectedDevice) { - setSelectedDevice(devList[0].id); + // Alias BlackHole + const aliasedDevs = devList.map(d => ({ + ...d, + name: d.name.includes('BlackHole') ? 
'Hearbit Virtual Mic (BlackHole)' : d.name + })); + setDevices(aliasedDevs); + + // Select Hearbit mic by default if available and no selection made + if (!selectedDevice) { + const vb = aliasedDevs.find(d => d.name.includes('Hearbit Virtual Mic')); + if (vb) { + setSelectedDevice(vb.id); + } else if (aliasedDevs.length > 0) { + setSelectedDevice(aliasedDevs[0].id); + } } } catch (e) { console.error('Failed to load devices', e); } }; + const openAudioSetup = async () => { + try { + await invoke('open_audio_midi_setup'); + } catch (e) { + console.error(e); + setStatus('Failed to open Audio Setup'); + } + }; + const startRecording = async () => { try { setStatus('Starting...'); - await invoke('start_recording', { deviceId: selectedDevice }); + await invoke('start_recording', { deviceId: selectedDevice, savePath: savePath || null }); setIsRecording(true); + setIsPaused(false); setTranscription(''); setSummary(''); setStatus('Recording...'); @@ -124,9 +133,26 @@ const Recorder: React.FC = ({ } }; + const togglePause = async () => { + try { + if (isPaused) { + await invoke('resume_recording'); + setIsPaused(false); + setStatus('Recording...'); + } else { + await invoke('pause_recording'); + setIsPaused(true); + setStatus('Paused'); + } + } catch (e) { + console.error("Pause/Resume error:", e); + } + }; + const stopRecording = async () => { try { setIsRecording(false); + setIsPaused(false); setStatus('Processing...'); const filePath = await invoke('stop_recording'); @@ -138,12 +164,9 @@ const Recorder: React.FC = ({ }); setTranscription(transText); - setTranscription(transText); - // Check if transcription is empty or just whitespace if (!transText || transText.trim().length === 0) { setStatus('Done (No speech detected)'); - // If empty, set a placeholder so the UI shows something, but DON'T summarize setTranscription('(No speech detected. 
Check your microphone settings.)'); setTimeout(() => setStatus('Ready to record'), 3000); return; @@ -167,6 +190,7 @@ const Recorder: React.FC = ({ onSaveToHistory(transText, sumText); setStatus('Done!'); + onRecordingComplete(); // Auto-switch tab setTimeout(() => setStatus('Ready to record'), 3000); } catch (e) { console.error(e); @@ -174,251 +198,127 @@ const Recorder: React.FC = ({ } }; - const installDriver = async () => { - try { - setStatus('Installing driver...'); - const res = await invoke('install_driver'); - console.log(res); - setStatus('Driver installed. Please restart app.'); - loadDevices(); - } catch (e) { - setStatus(`Install failed: ${e}`); - } - }; - - const copyToClipboard = async (text: string, isSummary: boolean) => { - try { - await navigator.clipboard.writeText(text); - if (isSummary) { - setCopiedSum(true); - setTimeout(() => setCopiedSum(false), 2000); - } else { - setCopiedTrans(true); - setTimeout(() => setCopiedTrans(false), 2000); - } - } catch (err) { - console.error('Failed to copy!', err); - } - }; - return (
{/* Fixed Header */} -
+
Logo - -
- - -
{/* Scrollable Content */}
- - {showHistory ? ( -
-

Saved Recordings

- {history.length === 0 &&

No saved history.

} - {history.map(item => ( -
-
- {item.date} -
- - -
-
-

{item.summary ? item.summary.slice(0, 50) + "..." : "No Summary"}

-

{item.transcription.slice(0, 50)}...

+
+
+ {isRecording ? ( +
+
- ))} -
- ) : ( - <> -
-
- {isRecording ? ( -
- -
- ) : ( -
- -
- )} -
-
- -

- {isRecording ? 'Listening...' : 'Ready to Record'} -

- -

- {status} -

- -
- {!isRecording ? ( - - ) : ( - - )} - -
-
- - -
-
- - -
-
- -
- - -
- - -
- - {(transcription || summary) && ( -
- {/* Transcription Block (Source) */} - {transcription && ( -
-
-
- -

Original Transcription

-
- -
-

- {transcription} -

-
- )} - - {/* Summary Block (Result) */} - {summary && ( -
-
-
- -

AI Summary

-
-
- - -
-
-
- - {summary} - -
-
- )} + ) : ( +
+
)} - - )} +
+
+ +

+ {isRecording ? (isPaused ? 'Paused' : 'Listening...') : 'Ready to Record'} +

+ +

+ {status} +

+ +
+ {!isRecording ? ( + + ) : ( +
+ + +
+ )} + +
+
+ + +
+
+ + +
+
+ +
+ + +
+ +
+ +
+
);

diff --git a/src/components/Settings.tsx b/src/components/Settings.tsx
index 5f8dc66..3b91d26 100644
--- a/src/components/Settings.tsx
+++ b/src/components/Settings.tsx
@@ -1,157 +1,319 @@
-import React, { useState, useEffect } from 'react';
-import { Plus, Trash2 } from 'lucide-react';
-
-interface PromptTemplate {
-  id: string;
-  name: string;
-  content: string;
-}
+import React, { useState } from 'react';
+import { Save, FolderOpen, Lock, Upload, Download, Eye, EyeOff } from 'lucide-react';
+import { open } from '@tauri-apps/plugin-dialog';
+import { encryptData, decryptData } from '../utils/backup';
+import { PromptTemplate } from '../App';
 
 interface SettingsProps {
-  onSave: (apiKey: string, productId: string, prompts: PromptTemplate[]) => void;
-  onBack: () => void; // New onBack prop
-  initialApiKey: string;
-  initialProductId: string;
-  initialPrompts: PromptTemplate[];
+  apiKey: string;
+  productId: string;
+  savePath: string;
+  prompts: PromptTemplate[];
+  onSave: (apiKey: string, productId: string, prompts: PromptTemplate[], savePath: string) => void;
+  onClose: () => void;
 }
 
-const Settings: React.FC<SettingsProps> = ({ onSave, onBack, initialApiKey, initialProductId, initialPrompts }) => {
-  const [apiKey, setApiKey] = useState(initialApiKey);
-  const [productId, setProductId] = useState(initialProductId);
-  const [prompts, setPrompts] = useState(initialPrompts);
+const Settings: React.FC<SettingsProps> = ({ apiKey, productId, prompts, savePath, onSave, onClose }) => {
+  const [localApiKey, setLocalApiKey] = useState(apiKey);
+  const [localProductId, setLocalProductId] = useState(productId);
+  const [localSavePath, setLocalSavePath] = useState(savePath);
+  const [localPrompts, setLocalPrompts] = useState(prompts);
+  const [statusIdx, setStatusIdx] = useState<string | null>(null);
 
-  useEffect(() => {
-    setApiKey(initialApiKey);
-    setProductId(initialProductId);
-    // Only reset prompts if passed different ones (mounting), usually state is preserved in App
-  }, [initialApiKey, initialProductId]);
+  // Backup & Restore State
+  const [backupPassword, setBackupPassword] = useState('');
+  const [showPassword, setShowPassword] = useState(false);
+  const [isImportModalOpen, setIsImportModalOpen] = useState(false);
+  const [importFileContent, setImportFileContent] = useState<string | null>(null);
 
-  const handleSave = (e: React.FormEvent) => {
-    e.preventDefault();
-    onSave(apiKey, productId, prompts);
+  const handlePromptChange = (id: string, field: 'name' | 'content', value: string) => {
+    setLocalPrompts(localPrompts.map(p => p.id === id ? { ...p, [field]: value } : p));
   };
 
   const addPrompt = () => {
-    setPrompts([...prompts, { id: Date.now().toString(), name: 'New Prompt', content: '' }]);
-  };
-
-  const updatePrompt = (id: string, field: 'name' | 'content', value: string) => {
-    setPrompts(prompts.map(p => p.id === id ? { ...p, [field]: value } : p));
+    setLocalPrompts([...localPrompts, { id: Date.now().toString(), name: 'New Prompt', content: '' }]);
   };
 
   const removePrompt = (id: string) => {
-    setPrompts(prompts.filter(p => p.id !== id));
+    setLocalPrompts(localPrompts.filter(p => p.id !== id));
+  };
+
+  const handleSave = () => {
+    onSave(localApiKey, localProductId, localPrompts, localSavePath);
+    onClose();
+  };
+
+  const handleSelectFolder = async () => {
+    try {
+      const selected = await open({
+        directory: true,
+        multiple: false,
+        defaultPath: localSavePath || undefined,
+      });
+      if (selected && typeof selected === 'string') {
+        setLocalSavePath(selected);
+      }
+    } catch (e) {
+      console.error("Failed to open directory picker", e);
+      setStatusIdx('Error: Failed to open directory picker.');
+    }
+  };
+
+  const handleExport = async () => {
+    if (!backupPassword) {
+      setStatusIdx('Error: Password required to encrypt backup.');
+      return;
+    }
+    try {
+      const data = {
+        apiKey: localApiKey,
+        productId: localProductId,
+        prompts: localPrompts,
+        savePath: localSavePath
+      };
+      const encrypted = await encryptData(data, backupPassword);
+      const blob = new Blob([encrypted], { type: 'text/plain' });
+      const url = URL.createObjectURL(blob);
+      const a = document.createElement('a');
+      a.href = url;
+      a.download = `hearbit_backup_${new Date().toISOString().slice(0, 10)}.conf`;
+      document.body.appendChild(a);
+      a.click();
+      document.body.removeChild(a);
+      URL.revokeObjectURL(url);
+      setStatusIdx('Configuration exported successfully!');
+    } catch (e) {
+      console.error(e);
+      setStatusIdx('Export failed.');
+    }
+  };
+
+  const triggerImport = () => {
+    document.getElementById('import-file-input')?.click();
+  };
+
+  const handleFileSelect = (e: React.ChangeEvent<HTMLInputElement>) => {
+    const file = e.target.files?.[0];
+    if (!file) return;
+
+    const reader = new FileReader();
+    reader.onload = (event) => {
+      if (event.target?.result) {
+        setImportFileContent(event.target.result as string);
+        setIsImportModalOpen(true);
+        setBackupPassword('');
+      }
+    };
+    reader.readAsText(file);
+    e.target.value = '';
+  };
+
+  const confirmImport = async () => {
+    if (!backupPassword) {
+      setStatusIdx('Error: Password required to decrypt.');
+      return;
+    }
+    if (!importFileContent) return;
+
+    try {
+      const data = await decryptData(importFileContent, backupPassword);
+      if (data.apiKey) setLocalApiKey(data.apiKey);
+      if (data.productId) setLocalProductId(data.productId);
+      if (data.prompts) setLocalPrompts(data.prompts);
+      if (data.savePath) setLocalSavePath(data.savePath);
+
+      setStatusIdx('Configuration imported! Click Save to apply.');
+      setIsImportModalOpen(false);
+      setImportFileContent(null);
+    } catch (e) {
+      console.error(e);
+      setStatusIdx('Import failed: Wrong password or corrupted file.');
+    }
   };
 
   return (
-    <div>
-      <h2>Infomaniak Settings</h2>
+    <div>
+      {/* Import password modal */}
+      {isImportModalOpen && (
+        <div>
+          <input
+            type="password"
+            value={backupPassword}
+            onChange={(e) => setBackupPassword(e.target.value)}
+            placeholder="Decryption Password"
+          />
+          <button onClick={confirmImport}>Import</button>
+          <button onClick={() => setIsImportModalOpen(false)}>Cancel</button>
+        </div>
+      )}
+
+      {/* Hidden file input used by triggerImport() */}
+      <input
+        id="import-file-input"
+        type="file"
+        className="hidden"
+        onChange={handleFileSelect}
+      />
+
+      <h2>Settings</h2>
+      <button onClick={onClose}>Close</button>
+
+      <h3>General</h3>
+      <div>
+        <label>API Key</label>
         <input
           type="password"
-          value={apiKey}
-          onChange={(e) => setApiKey(e.target.value)}
-          placeholder="ik_..."
-          className="w-full p-2 rounded border border-border bg-secondary text-foreground focus:ring-2 focus:ring-primary outline-none"
+          value={localApiKey}
+          onChange={(e) => setLocalApiKey(e.target.value)}
+          className="w-full p-2 rounded border border-border bg-secondary text-foreground focus:ring-2 focus:ring-primary outline-none font-mono text-sm"
         />
+      </div>
+      <div>
+        <label>Product ID</label>
         <input
           type="text"
-          value={productId}
-          onChange={(e) => setProductId(e.target.value)}
-          placeholder="Number (e.g., 12345)"
-          className="w-full p-2 rounded border border-border bg-secondary text-foreground focus:ring-2 focus:ring-primary outline-none"
+          value={localProductId}
+          onChange={(e) => setLocalProductId(e.target.value)}
+          className="w-full p-2 rounded border border-border bg-secondary text-foreground focus:ring-2 focus:ring-primary outline-none font-mono text-sm"
+        />
+      </div>
+      <div>
+        <label>Save Location</label>
+        <input
+          type="text"
+          value={localSavePath}
+          onChange={(e) => setLocalSavePath(e.target.value)}
+          placeholder="/Users/username/Desktop/Recordings"
+          className="flex-1 p-2 rounded border border-border bg-secondary text-foreground focus:ring-2 focus:ring-primary outline-none font-mono text-sm"
+        />
+        <button onClick={handleSelectFolder}>
+          <FolderOpen size={16} />
+        </button>
+      </div>
+
+      <h3>
+        <Lock size={14} /> Backup & Restore
+      </h3>
+      <p>
+        Export your settings (keys, prompts, path) to an encrypted file.
+      </p>
+      <div>
+        <input
+          type={showPassword ? 'text' : 'password'}
+          value={backupPassword}
+          onChange={(e) => setBackupPassword(e.target.value)}
+          placeholder="Encryption Password"
+          className="w-full p-2 pr-8 rounded border border-border bg-secondary text-foreground focus:ring-2 focus:ring-primary outline-none text-sm"
+        />
+        <button onClick={() => setShowPassword(!showPassword)}>
+          {showPassword ? <EyeOff size={14} /> : <Eye size={14} />}
+        </button>
+      </div>
+      <div>
+        <button onClick={handleExport}><Download size={14} /> Export</button>
+        <button onClick={triggerImport}><Upload size={14} /> Import</button>
+      </div>
 
-      {prompts.map((prompt) => (
-        <input
-          value={prompt.name}
-          onChange={(e) => updatePrompt(prompt.id, 'name', e.target.value)}
-        />
+      {localPrompts.map((prompt) => (
+        <div key={prompt.id}>
+          <input
+            type="text"
+            value={prompt.name}
+            onChange={(e) => handlePromptChange(prompt.id, 'name', e.target.value)}
+            className="w-full p-1 bg-transparent border-b border-border/50 focus:border-primary outline-none font-semibold text-sm"
+            placeholder="Prompt Name"
+          />