
# IYmtg Platinum Prime (Version 1.1.0)
**SYSTEM CONTEXT FOR AI (STRICT PRESERVATION)**
CRITICAL INSTRUCTION: This document is the single, authoritative Source of Truth for "IYmtg," an iOS application designed to identify, grade, and insure Magic: The Gathering cards.
* **Version Authority:** This Version 1.1.0 supersedes all previous iterations.
* **Architecture Mandate:** Any future updates must strictly adhere to the defined pipeline: Vector Fingerprinting (Identity) -> OCR (Validation) -> ML Analysis (Condition/Foil).
* **Preservation Protocol:** Do not summarize, truncate, or remove sections of this manual during review.
---
## Development Roadmap
This is the complete sequence of steps to go from source code to a working app. Complete them in order.
| Step | Task | Platform | Status |
| :--- | :--- | :--- | :--- |
| 1 | Workspace setup & visual assets | Any | — |
| 2 | **Build `IYmtg_Builder_Mac`** | Mac only | ⚠️ Not written yet |
| 3 | **Generate `cards.json`** database | Mac only | ⚠️ Depends on Step 2 |
| 4 | **Create Xcode project** from source | Mac only | — |
| 5 | Collect ML training images | Any (physical cards) | — |
| 6 | Train ML models in Create ML | Mac only | — |
| 7 | Configure Firebase (optional) | Any | — |
| 8 | Final configuration & testing | Mac only | — |
| 9 | App Store submission | Mac only | — |
### What You Can Do Without a Mac
* Edit source code
* Run Python automation scripts (`fetch_set_symbols.py`, `generate_images.py`)
* Collect and sort ML training images into `IYmtg_Training/`
* Acquire physical cards from the shopping lists
### What Requires a Mac
Everything else. Apple's `Vision` framework (used to generate card fingerprints) and `Create ML` (used to train models) are macOS-only. The Xcode project also lives on your Mac.
---
## Part 1: App Store Listing
### 1. Metadata
* **App Name:** IYmtg: Card Scanner & Insurance
* **Subtitle:** Identify, Grade & Insure Magic
* **Category:** Reference / Utilities
* **Keywords:** magic,gathering,scanner,tcg,card,price,insurance,manager,grade,foil,mtg,free,offline
* **Device Orientation:** Strictly lock to Portrait in Xcode.
### 2. Description
**Headline:** The Easiest Way to Insure Your Magic Collection.
**Body:**
Your Magic: The Gathering collection represents years of history and passion. Losing it to theft, fire, or disaster is a nightmare scenario. IYmtg is the first app built specifically to make **insuring your collection** simple, fast, and accurate.
Forget complex spreadsheets and manual entry. Just point your camera, and IYmtg handles the rest. It identifies the card, grades the condition, detects foiling, and fetches the market price instantly. When you're done, one tap generates a professional **Insurance Schedule PDF** ready for your agent.
**Why IYmtg?**
* 📄 **Insurance Ready:** Generate a timestamped, itemized PDF Schedule in seconds.
* **Effortless Scanning:** Auto-detects Set, Condition, and Foil type (including Etched, Galaxy, and more).
* 🔒 **Private & Secure:** Your data is backed up, but your images stay private in iCloud.
* **Simple & Clean:** No ads, no subscriptions, just a powerful tool for collectors.
**Development Transparency:**
This application's code and visual assets were developed with the assistance of Artificial Intelligence. This modern approach allows us to deliver a sophisticated, high-performance tool dedicated to a single goal: helping collectors manage, grade, and insure their valuable history with precision and ease.
**Community Data Initiative:**
Help us make IYmtg smarter! If you find a card that scans incorrectly, you can correct it in the app. When you do, you'll have the option to securely send that image to our training database. Your contributions directly improve the AI models for the entire community.
**Features:**
* **Insurance Reports:** Export your entire collection to a PDF ready for your insurance agent.
* **Collection Valuation:** Monitor the total value of your collection with real-time market data.
* **Smart Scanning:** Identify cards, foils, and condition automatically.
* **Cloud Sync:** Keep your collection safe and accessible across your devices.
* **Offline Access:** Scan and manage your cards even without an internet connection.
* **Market Data:** Switch between major pricing sources (TCGPlayer & Cardmarket).
* **Export Options:** Also supports CSV and digital deck formats for other uses.
---
## Part 2: Workspace & Assets
### Step 1: Workspace Setup
1. Create the master folder in your preferred location (Desktop, OneDrive, or any synced drive).
2. Ensure the folder is synced with a cloud backup service (OneDrive, Google Drive, iCloud Drive, etc.).
3. Organize your sub-folders exactly as shown below:
```text
IYmtg_Master/
├── IYmtg_App_iOS/ (The iOS App Source Code)
├── IYmtg_Builder_Mac/ (The Card Database Builder — Mac app)
├── IYmtg_Training/ (ML Image Data)
└── IYmtg_Automation/ (Python/Shell Scripts)
```
### Step 2: Visual Assets
Place the following assets in `Assets.xcassets` in the Xcode project.
**Important:** AI tools often generate large files (e.g., 2048x2048). You **must resize and crop** the results to the dimensions listed below. For the AppIcon, exact 1024x1024 dimensions are mandatory.
| Asset Name | Dimensions | Description | Gemini Generation Prompt |
| :--- | :--- | :--- | :--- |
| **AppIcon** | 1024x1024 | App Icon. | "A high-quality iOS app icon. A stylized neon green cybernetic eye scanning a dark, mystical trading card silhouette. Dark purple and black background. Minimalist, sleek, modern technology meets fantasy magic. No text. Square aspect ratio." |
| **logo_header** | 300x80 | Header Logo. | "A typographic logo for an app named 'IYmtg'. Horizontal layout. Neon green text, futuristic sans-serif font. Dark background. The text should be glowing. High contrast. Aspect ratio 4:1." |
| **scanner_frame** | 600x800 | Viewfinder. | "A HUD viewfinder overlay for a camera app. Glowing white bracket corners. Thin, high-tech lines connecting corners. Center is empty. Sci-fi interface style. Pure white lines on a solid black background. Aspect ratio 3:4." |
| **empty_library** | 800x800 | Empty State. | "Isometric 3D render of a clean, empty wooden desk. A single Magic: The Gathering style card sits in the center. Soft warm lighting. Minimalist design. High resolution. No text. Square aspect ratio." |
| **share_watermark** | 400x100 | Watermark. | "A watermark logo text 'Verified by IYmtg'. White text with a checkmark icon. Clean, bold font. Solid black background. Professional verification seal style. Aspect ratio 4:1." |
| **card_placeholder**| 600x840 | Loading State. | "A generic trading card back design. Grey and silver swirl pattern. Mystical and abstract. No text. Aspect ratio 2.5:3.5." |
#### Automated Generation (Recommended)
**Setup:**
1. **Get a Gemini API Key:** You will need an API key from Google AI Studio.
2. **Set the API Key:** Open `IYmtg_Automation/generate_images.py` and set your API key in the configuration section.
3. **Install dependencies:**
```bash
pip install requests pillow
```
**Usage:**
```bash
python3 IYmtg_Automation/generate_images.py
```
The generated images will be saved in `Raw_Assets` and resized images in `Ready_Assets`.
#### Manual Resizing (If You Already Have Images)
1. **Setup:** Ensure Python is installed and run `pip install Pillow`.
2. **Generate Placeholders (Optional):**
```bash
python3 IYmtg_Automation/generate_placeholders.py
```
3. **Place Images:** Save your real AI results into `Raw_Assets`, named exactly as listed above (e.g., `AppIcon.png`).
4. **Run:**
```bash
python3 IYmtg_Automation/resize_assets.py
```
5. **Result:** Xcode-ready images will be in `Ready_Assets`. Drag them into `Assets.xcassets`.
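The required output dimensions from the asset table above can be encoded as a small manifest that a resize script could validate against. This is a sketch only (the `check_asset` helper is illustrative, not part of `resize_assets.py`); the dimensions are transcribed from the table:

```python
# Target dimensions transcribed from the Visual Assets table.
# A resize script can use this to decide which raw files need work.
ASSET_SIZES = {
    "AppIcon": (1024, 1024),
    "logo_header": (300, 80),
    "scanner_frame": (600, 800),
    "empty_library": (800, 800),
    "share_watermark": (400, 100),
    "card_placeholder": (600, 840),
}

def check_asset(name: str, size: tuple) -> str:
    """Return 'ok', 'resize', or 'unknown' for a raw asset."""
    expected = ASSET_SIZES.get(name)
    if expected is None:
        return "unknown"
    return "ok" if size == expected else "resize"
```

Remember that the AppIcon must end up at exactly 1024x1024; anything else is rejected at submission.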
---
## Part 3: Card Database (`cards.json`) — Mac Required
**This is the most critical file in the project.** The app cannot identify any cards without it. It is a JSON file bundled inside the app containing a fingerprint (a mathematical representation) of every Magic card, generated from card images using Apple's Vision framework.
### What `cards.json` Contains
Each entry in the file represents one unique card printing and contains:
- Card name, set code, and collector number
- Whether the card has a foil or serialized printing
- Pricing data
- A `VNFeaturePrintObservation` (binary blob) — the visual fingerprint used for identification
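Once the builder produces the file, you can sanity-check it on any platform. A minimal sketch, assuming the field names from the `CardFingerprint` model (Swift's default `Codable` JSON encoding serializes `Data` blobs such as `featureData` as base64 strings, so the file is plain JSON throughout):

```python
import json

def summarize(path):
    """Count entries and foil printings in a generated cards.json."""
    with open(path) as f:
        cards = json.load(f)
    foils = sum(1 for c in cards if c.get("hasFoilPrinting"))
    return {"total": len(cards), "foil_printings": foils}
```

If `total` is far below the ~100,000 printings Scryfall lists, the builder run was incomplete.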
### Step 1: Write `IYmtg_Builder_Mac`
`IYmtg_Builder_Mac/` is currently **empty**. This Mac command-line tool needs to be built before `cards.json` can be generated. It must:
1. Fetch the complete card list from the Scryfall API (`https://api.scryfall.com/bulk-data` → `default_cards` dataset)
2. Download a card image for each unique printing
3. Run `VNGenerateImageFeaturePrintRequest` (Apple Vision) on each image to produce a fingerprint
4. Archive the fingerprint using `NSKeyedArchiver` into a `Data` blob
5. Write all entries to `cards.json` using the `CardFingerprint` model defined in `IYmtg_App_iOS/Data/Models/Card.swift`
6. Place the output at `IYmtg_App_iOS/cards.json`
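Step 1 can be prototyped on any platform; only the Vision fingerprinting steps require a Mac. A sketch of locating the `default_cards` download URI in the bulk-data index (the response shape follows Scryfall's documented API; `fetch_bulk_index` is an illustrative helper, not project code):

```python
import json
import urllib.request

BULK_INDEX = "https://api.scryfall.com/bulk-data"

def default_cards_uri(listing: dict) -> str:
    """Pick the download URI for the default_cards dataset out of the
    bulk-data index returned by the Scryfall API."""
    for entry in listing["data"]:
        if entry["type"] == "default_cards":
            return entry["download_uri"]
    raise KeyError("default_cards dataset not listed")

def fetch_bulk_index() -> dict:
    # Network call; rate-limit politely per Scryfall's API guidelines.
    with urllib.request.urlopen(BULK_INDEX) as resp:
        return json.load(resp)
```

The Swift builder would perform the same lookup before downloading images for each printing.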
**Data model reference** (`CardFingerprint` in `IYmtg_App_iOS/Data/Models/Card.swift`):
```swift
struct CardFingerprint: Codable {
let id: UUID
let name: String
let setCode: String
let collectorNumber: String
let hasFoilPrinting: Bool
let hasSerializedPrinting: Bool?
let priceScanned: Double?
let featureData: Data // NSKeyedArchiver-encoded VNFeaturePrintObservation
}
```
**To build `IYmtg_Builder_Mac`:** Give this README section to an AI (Claude or Gemini) along with the `CardFingerprint` struct and ask it to write a Swift command-line tool for macOS. The tool is straightforward — it is a single-purpose script that runs once and can take several hours to complete due to the volume of Scryfall image downloads.
### Step 2: Run the Builder
On your Mac, build and run `IYmtg_Builder_Mac`. The `weekly_update.sh` script automates this:
```bash
chmod +x IYmtg_Automation/weekly_update.sh
./IYmtg_Automation/weekly_update.sh
```
This script:
1. Builds the builder app using `xcodebuild`
2. Runs it (this takes time — Scryfall lists over 100,000 card printings)
3. Moves the output `cards.json` into `IYmtg_App_iOS/` ready to be bundled
**Run this script periodically** (e.g., weekly) to pick up newly released sets.
### Step 3: Add `cards.json` to Xcode
After the builder runs, `cards.json` will be at `IYmtg_App_iOS/cards.json`.
In Xcode:
1. Drag `cards.json` into the project navigator under `IYmtg_App_iOS/`
2. Ensure **"Add to target: IYmtg"** is checked so it is bundled inside the app
---
## Part 4: Xcode Project Setup — Mac Required
The source code in `IYmtg_App_iOS/` is complete. Follow these steps to create the Xcode project on your Mac.
### Step 1: Create the Project
1. Open Xcode → **File → New → Project**
2. Choose **iOS → App**
3. Set:
- **Product Name:** IYmtg
- **Bundle Identifier:** `com.<yourname>.iymtg` (you choose this — note it down, you'll need it for iCloud)
- **Interface:** SwiftUI
- **Language:** Swift
- **Storage:** None (we use SwiftData manually)
4. Save the project **inside** `IYmtg_App_iOS/` — this places the `.xcodeproj` alongside the source files.
### Step 2: Add Source Files
1. In the Xcode project navigator, right-click the `IYmtg` group → **Add Files to "IYmtg"**
2. Select all folders inside `IYmtg_App_iOS/`:
- `Application/`
- `Data/`
- `Features/`
- `Services/`
- `Firebase/`
- `AppConfig.swift`
- `ContentView.swift`
- `IYmtgApp.swift`
- `IYmtgTests.swift`
- `cards.json` (once generated)
3. Ensure **"Copy items if needed"** is **unchecked** (files are already in the right place) and **"Create groups"** is selected.
### Step 3: Add Dependencies (Swift Package Manager)
1. **File → Add Package Dependencies**
2. Add the Firebase iOS SDK:
- URL: `https://github.com/firebase/firebase-ios-sdk`
- Add these libraries to your target: `FirebaseCore`, `FirebaseFirestore`, `FirebaseAuth`, `FirebaseStorage`
### Step 4: Configure Signing & Capabilities
1. Select the project in the navigator → **Signing & Capabilities** tab
2. Set your **Team** and ensure **Automatically manage signing** is on
3. Set **Minimum Deployments** to **iOS 17.0**
4. Click **+ Capability** and add:
- **iCloud** → enable **CloudKit** → add container `iCloud.<your-bundle-id>`
- **Background Modes** → enable **Remote notifications**
5. Lock orientation: **General** tab → **Deployment Info** → uncheck Landscape Left and Landscape Right
### Step 5: Add Privacy Descriptions
In `Info.plist`, add these keys (Xcode will prompt on first run, but adding them manually avoids rejection):
| Key | Value |
| :--- | :--- |
| `NSCameraUsageDescription` | `IYmtg uses the camera to scan and identify Magic: The Gathering cards.` |
| `NSPhotoLibraryUsageDescription` | `IYmtg can save card images to your photo library.` |
Also add `PrivacyInfo.xcprivacy` to the app target to satisfy Apple's privacy manifest requirements (required for App Store submission as of 2024).
### Step 6: Build and Run
1. Select a connected physical iPhone as the build target (camera features do not work in the simulator)
2. Press **Cmd+R** to build and run
3. On first launch the app will show **"Database Missing"** until `cards.json` is bundled (see Part 3)
---
## Part 5: Machine Learning Training — Mac Required for Final Step
> **In-App Training Guide (v1.1.0+):** The Library tab now includes a "?" button that opens a color-coded Training Guide. It shows every ML category with recommended image counts for three accuracy levels (Functional / Solid / High-Accuracy), so you can track collection progress directly from the app.
**You do not need the app, Xcode, or a Mac to collect training images.** All you need is physical cards and a phone camera. The only Mac-required step is the final model training in Create ML (Step 6).
> **The app ships and works without any trained models.** Foil detection defaults to "None" and condition defaults to "NM". You can release a working app first and add models later via OTA update. Do not let missing training data block your first build.
---
### Step 0: How to Create Training Images (No App Required)
This is the complete workflow for preparing training data on any computer.
#### What You Need
- Physical Magic cards (see shopping lists in Steps 1–3)
- A phone with a decent camera (iPhone, Android — anything works)
- A plain background: white card stock or black felt works best
- A free photo cropping tool:
- **Windows:** Photos app (built-in crop) or Paint
- **Any platform:** [Squoosh](https://squoosh.app) (browser-based, free, no install)
- **Bulk cropping:** [IrfanView](https://www.irfanview.com) (Windows, free) or [XnConvert](https://www.xnview.com/en/xnconvert/) (cross-platform, free)
#### Photography Setup
**For Foil Cards:**
1. Place the card on a **black background** (black felt or black paper).
2. Use a single directional light source — a desk lamp or window at 45°.
3. Take **5 photos of the same card** rotating it slightly between each shot so the light catches the foil at different angles. The AI must learn how the foil *moves*, not just how it looks flat.
4. Example angles: flat-on, 15° left tilt, 15° right tilt, 15° top tilt, 15° bottom tilt.
**For Non-Foil Cards:**
1. Place on **white or grey background**.
2. Even, diffused lighting (avoid strong reflections on the surface).
3. 1–3 photos per card is sufficient.
**For Condition/Damage:**
1. Use **raking light** (light source almost parallel to the card surface) — this casts shadows that highlight scratches, dents, and bends far more clearly than direct light.
2. For edge whitening: photograph against a **black background**.
3. For chipping: photograph against a **white background**.
4. Take a close-up — fill the frame with the card.
#### Cropping Rule — Critical
The app scans **cropped card images only** (no table, no background, no hand visible). Your training images must match this exactly or the model will learn the wrong thing.
After photographing:
1. Open the photo in your cropping tool.
2. Crop tightly to the card border — include the full card frame but nothing outside it.
3. It does not need to be pixel-perfect. Within 5–10px of the edge is fine.
4. Save as `.jpg` at any reasonable resolution (at least 400×560px).
#### Naming and Sorting
File names do not matter — only the **folder** they are in matters. Save cropped images directly into the appropriate `IYmtg_Training/` subfolder:
```
IYmtg_Training/Foil_Data/Etched/ ← drop etched foil photos here
IYmtg_Training/Foil_Data/Traditional/ ← drop traditional foil photos here
IYmtg_Training/Condition_Data/Edges/Whitening/ ← drop edge whitening photos here
```
#### How Many Images Do You Need?
| Goal | Minimum | Recommended |
| :--- | :--- | :--- |
| Test that training works | 10 per class | — |
| Functional model, limited accuracy | 20 per class | — |
| Solid production model | 30–50 per class | 50+ per class |
| High-accuracy model | — | 100+ per class |
More is always better. Variety matters more than quantity — different cards, different lighting, different tilt angles.
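To track progress against these targets, a short script can count images per class folder. A sketch, assuming the `IYmtg_Training` layout from Step 5 (leaf folder names are used as class names):

```python
from pathlib import Path

IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".heic"}

def class_counts(root: str) -> dict:
    """Map each non-empty folder name under root to its image count."""
    counts = {}
    for folder in Path(root).rglob("*"):
        if folder.is_dir():
            n = sum(1 for f in folder.iterdir()
                    if f.is_file() and f.suffix.lower() in IMAGE_EXTS)
            if n:
                counts[folder.name] = n
    return counts
```

Run it against `IYmtg_Training/` and compare each class's count with the table above.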
#### Using Scryfall Images as a Supplement
For **NonFoil** training data you can download card images directly from Scryfall instead of photographing them. This is automated — run:
```bash
pip install requests pillow
python3 IYmtg_Automation/fetch_set_symbols.py
```
This downloads and crops set symbol images automatically. For general NonFoil card images, you can query the Scryfall API directly (`https://api.scryfall.com/cards/random`) and download the `normal` image URI. Downloaded Scryfall images are already cropped to the card frame and work well as NonFoil training data. Do not use Scryfall images for foil or damage training — they are flat renders with no foil or physical damage.
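A sketch of extracting the `normal` image URI from a Scryfall card object before downloading it (the field names follow Scryfall's documented card schema; note that double-faced cards carry `image_uris` on each entry of `card_faces` instead of at the top level):

```python
def normal_image_uri(card: dict):
    """Return the `normal` image URI for a Scryfall card object,
    or None when the card has no usable image."""
    if "image_uris" in card:
        return card["image_uris"].get("normal")
    faces = card.get("card_faces", [])
    if faces and "image_uris" in faces[0]:
        # Double-faced card: use the front face for NonFoil training.
        return faces[0]["image_uris"].get("normal")
    return None
```

Loop this over responses from `https://api.scryfall.com/cards/random`, saving each image into `Foil_Data/NonFoil/`.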
---
### General Data Collection Protocol (Critical)
The app sends **cropped** images (just the card, no background) to the AI. Your training data must match this.
1. **Capture:** Take photos of the card on a contrasting background.
* **For Foils:** Take 3-5 photos of the *same card* at different tilt angles. The AI needs to see how the light moves across the surface (e.g., flat, tilted left, tilted back).
* **For Damage:** Ensure the lighting specifically highlights the defect (e.g., raking light for dents).
2. **Crop:** Crop the photo so **only the card** is visible (remove the table/background).
3. **Sort:** Place the cropped image into the corresponding folder in `IYmtg_Training`.
4. **Quantity:** Aim for 30-50 images per category for robust results.
### Step 1: The Master Foil Shopping List (Required for FoilEngine)
Acquire one of each (~$50 total) to train the Foil Classifier. This ensures the app can distinguish complex modern foil types.
| Foil Type | Recommended Card | Visual Key (For Substitutes) |
| :--- | :--- | :--- |
| **Traditional** | Any Common Foil | Standard rainbow reflection, smooth surface. |
| **Etched** | Harmonize (Strixhaven Archive) | Metallic, grainy texture, matte finish, no rainbow. |
| **Pre-Modern** | Opt (Dominaria Remastered - Retro) | Shooting star in text box, specific retro frame shine. |
| **Textured** | Rivaz of the Claw (Dominaria United) | Raised 3D pattern on surface, fingerprint-like feel. |
| **Galaxy** | Command Performance (Unfinity) | Embedded "stars" or sparkles in the foil pattern. |
| **Surge** | Explore (Warhammer 40k) | Rippling "wave" pattern across the entire card. |
| **Oil Slick** | Basic Land (Phyrexia: ONE - Compleat) | Raised, slick black-on-black texture, high contrast. |
| **Step and Compleat** | Elesh Norn (Phyrexia: ONE Showcase) | Phyrexian oil-slick effect on the card frame; black-silver high contrast. |
| **Confetti** | Negate (Wilds of Eldraine - Confetti) | Glittering "confetti" sparkles scattered on art. |
| **Halo** | Uncommon Legend (MOM: Multiverse) | Swirling circular pattern around the frame. |
| **Neon Ink** | Hidetsugu (Neon Yellow) | Bright, fluorescent ink layer on top of foil. |
| **Fracture** | Enduring Vitality (Duskmourn Japan) | Shattered glass pattern, highly reflective. |
| **Gilded** *(low priority)* | Riveteers Charm (New Capenna) | Embossed gold frame elements, glossy raised texture. Training folder not yet created. |
| **Silver Screen** *(low priority)* | Otherworldly Gaze (Double Feature) | Grayscale art with silver metallic highlights. Single-set type — deprioritized. |
### Step 2: The Stamp Classifier Shopping List
Acquire pairs of cards to train the `StampDetector` (Promo/Date Stamped vs. Regular). This is a **Binary Classifier**, meaning the AI learns by comparing "Yes" vs "No".
* **Prerelease Promos:** Any card with a Gold Date Stamp (e.g., "29 September 2018").
* **Promo Pack Cards:** Cards with the Planeswalker Symbol stamp in the bottom right of the art.
* **Purchase List:** Buy 50-100 cheap bulk promos (often <$0.25 each) and their non-promo counterparts.
* **Action:** Place cropped images of promos in `Stamp_Data/Stamped` and regular versions in `Stamp_Data/Clean`.
### Step 3: The "Damage Simulation Lab"
#### Important: This Model Uses Object Detection, Not Image Classification
The `Condition_Data` model is trained as an **Object Detection** model, not an Image Classification model. This is a critical difference:
- **Image Classification** (used for Foil and Stamp): drop images in a folder, Create ML labels them by folder name. Simple.
- **Object Detection** (used for Condition): you must **draw bounding boxes** around each defect in Create ML. The model learns *where* damage is on the card, not just that damage exists.
When training in Create ML, you will annotate each training image by drawing a rectangle around the damaged area and labeling it (e.g., "LightScratches", "Whitening"). Create ML has a built-in annotation tool — click an image, draw a box, type the label.
**Folder naming maps directly to label names.** The labels must match the `Condition_Data` subfolder names exactly:
`LightScratches`, `Clouding`, `Dirt`, `Dents`, `Whitening`, `Chipping`, `CornerWear`, `Creases`, `ShuffleBend`, `WaterDamage`, `Inking`, `Rips`, `BindersDents`
#### How the Grading Formula Works
Understanding this helps you know what training data matters most. The app grades cards as follows (from `ConditionEngine.swift`):
| Detected Damage | Grade Assigned |
| :--- | :--- |
| Any `Inking`, `Rips`, or `WaterDamage` detected | **Damaged** — immediately, regardless of anything else |
| 0 damage detections | **Near Mint (NM)** |
| 1–2 minor damage detections | **Excellent (EX)** |
| 3 or more minor damage detections | **Played (PL)** |
**Critical damage types** (`Inking`, `Rips`, `WaterDamage`) are the highest training priority — a single false positive will incorrectly grade a NM card as Damaged.
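The grading table can be restated as a few lines of logic. This Python sketch mirrors the table above for reference only; the app's actual implementation lives in `ConditionEngine.swift`, and the short grade codes here are illustrative:

```python
# Damage labels whose presence immediately grades a card as Damaged.
CRITICAL = {"Inking", "Rips", "WaterDamage"}

def grade(detections: list) -> str:
    """Apply the grading table to a list of detected damage labels."""
    if any(d in CRITICAL for d in detections):
        return "Damaged"
    if len(detections) == 0:
        return "NM"
    if len(detections) <= 2:
        return "EX"
    return "PL"
```

Note how a single critical detection dominates everything else, which is why false positives on `Inking`, `Rips`, and `WaterDamage` are so costly.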
#### Materials List
| Item | Used For |
| :--- | :--- |
| 0000 (ultra-fine) steel wool | Surface scratches on foil cards |
| White vinyl eraser | Clouding/surface haze |
| Potting soil or cocoa powder | Dirt simulation |
| Ballpoint pen cap (rounded end) | Dents |
| Black Sharpie marker | Inking simulation |
| Spray bottle with water | Water damage |
| 3-ring binder | Binder dents |
| Rough mousepad or sandpaper | Corner wear |
| 50–100 bulk "draft chaff" cards | Cards to damage |
| Black felt or black paper | Background for edge photos |
| White card stock | Background for chipping photos |
| Desk lamp | Raking light source |
#### Sourcing Cards to Damage
Buy bulk worthless cards — do not damage your own collection.
- **eBay:** Search "MTG bulk commons lot" — 1000 cards for ~$10
- **TCGPlayer:** "Bulk commons" listings, often $0.01/card
- **Local game store:** Ask for "draft chaff" — often given away free
Also buy **pre-damaged cards** — natural damage looks more authentic to the model than simulated:
- **eBay:** Search "MTG damaged cards lot" or "heavily played bulk"
Aim for **50 cards per damage type** minimum. One card can be used for multiple damage types since each photo annotates only one damage area.
#### Raking Light Setup (Required for Surface and Structure Damage)
Most damage is invisible under flat overhead light. Raking light reveals it.
1. Place the card flat on a dark surface.
2. Position your desk lamp so light hits the card at a near-horizontal angle (5–15° above the surface) from one side.
3. The damage will cast visible shadows or catch the light clearly.
4. For scratches: slowly rotate the card until the scratches "light up" — photograph at that angle.
#### Damage Simulation Techniques
| Category | Damage Type | Folder Name | Simulation Technique | Photography Tip |
| :--- | :--- | :--- | :--- | :--- |
| **Surface** | Light Scratches | `LightScratches` | Rub foil surface gently with 0000 Steel Wool in one direction. | Raking light from the scratched direction. Rotate until scratches catch light. |
| **Surface** | Clouding | `Clouding` | Rub white vinyl eraser vigorously over foil surface in circles. | Diffused light. Compare side-by-side with a clean card for reference. |
| **Surface** | Dirt | `Dirt` | Press a damp fingertip into potting soil, then onto card surface. | Even lighting. Ensure dirt contrasts against the card art. |
| **Surface** | Dents | `Dents` | Press rounded end of a ballpoint pen cap firmly straight down. | Raking light at 10° to cast shadow inside the dent. |
| **Edges** | Whitening | `Whitening` | Rub card edges rapidly back and forth against denim jeans. | Black background. Macro close-up of the edge. |
| **Edges** | Chipping | `Chipping` | Use fingernail to carefully flake small pieces off the black border. | White background. Macro close-up. |
| **Edges** | Corner Wear | `CornerWear` | Rub corners against a rough mousepad with a circular motion. | Macro focus on the corner. Black background. |
| **Structure** | Creases | `Creases` | Fold corner sharply until a hard crease forms, then unfold. | Raking light to catch reflection off the crease ridge. |
| **Structure** | Shuffle Bend | `ShuffleBend` | Riffle shuffle the card aggressively 10+ times to create an arch. | Profile/side view to show curvature clearly. |
| **Structure** | Water Damage | `WaterDamage` | Mist card lightly with spray bottle, wait 60 seconds, air dry flat. | Raking light to show rippled surface texture. |
| **Critical** | Inking | `Inking` | Draw along whitened edges with black Sharpie to simulate edge touch-up. | UV/blacklight if available; otherwise strong white light at angle. |
| **Critical** | Rips | `Rips` | Tear edge slightly (~5mm). | High contrast background opposite to card border color. |
| **Critical** | Binder Dents | `BindersDents` | Press a 3-ring binder ring firmly into the card surface. | Raking light to show the circular crimp. |
#### What to Photograph per Damage Type
For each damage type, capture:
1. **30–50 cards showing that damage clearly** — positive training examples
2. **10–20 completely clean (undamaged) cards** — include these in every subfolder so the model learns the baseline
When annotating in Create ML, draw the bounding box **tightly around the damaged area only**. For shuffle bends, annotate the center of the arch. For edge damage, annotate the specific section of edge that is damaged, not the entire edge.
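If you annotate with an external tool instead of Create ML's built-in one, Create ML's object-detection importer also accepts an `annotations.json` file alongside the images. In that format, `x`/`y` are the box *center* in pixels, not the top-left corner. A sketch of emitting one entry (the `annotation` helper is illustrative):

```python
def annotation(image: str, label: str, box: tuple) -> dict:
    """Build one Create ML object-detection entry.
    box is (left, top, width, height) in pixels."""
    left, top, w, h = box
    return {
        "image": image,
        "annotations": [{
            "label": label,
            "coordinates": {"x": left + w / 2, "y": top + h / 2,
                            "width": w, "height": h},
        }],
    }
```

Collect one such dict per image into a JSON array and save it as `annotations.json` in the same folder as the images.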
### Step 4: The "Edge Case" Validation List
Acquire these specific cheap cards to verify the logic-based detectors. **Note:** These are for **Manual Verification** (testing the app), not for Create ML training folders.
| Detector | Target Card Type | Recommended Purchase |
| :--- | :--- | :--- |
| **ListSymbol** | "The List" Reprint | Any common from "The List" (look for planeswalker symbol). |
| **Border** | World Champ Deck | Any 1996-2004 World Champ card (Gold Border). |
| **Border** | Chronicles Reprint | *City of Brass* (Chronicles) vs *City of Brass* (Modern Reprint). |
| **Corner** | Alpha/Beta Sim | *4th Edition* (Standard) vs *Alpha* (Proxy/Counterfeit for testing). |
| **Saturation** | Unl/Revised Sim | *Revised* Basic Land (Washed out) vs *4th Edition* (Saturated). |
### Step 5: Training Folder Structure
The following directory tree is already created in `IYmtg_Training`. Place your cropped images into the appropriate folders.
```text
IYmtg_Training/
├── Foil_Data/ (Image Classification)
│ ├── NonFoil/
│ ├── Traditional/
│ ├── Etched/
│ ├── PreModern/
│ ├── Textured/
│ ├── Galaxy/
│ ├── Surge/
│ ├── OilSlick/
│ ├── StepAndCompleat/
│ ├── Halo/
│ ├── Confetti/
│ ├── NeonInk/
│ └── Fracture/
├── Stamp_Data/ (Image Classification)
│ ├── Stamped/
│ └── Clean/
└── Condition_Data/ (Object Detection)
├── Surface/
│ ├── LightScratches/
│ ├── Clouding/
│ ├── Dirt/
│ └── Dents/
├── Edges/
│ ├── Whitening/
│ ├── Chipping/
│ └── CornerWear/
├── Structure/
│ ├── Creases/
│ ├── ShuffleBend/
│ └── WaterDamage/
└── Critical/
├── Inking/
├── Rips/
└── BindersDents/
```
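If your copy of `IYmtg_Training` is missing folders, the tree above can be recreated with a short script. The paths are transcribed directly from the Step 5 listing:

```python
from pathlib import Path

FOLDERS = [
    *(f"Foil_Data/{c}" for c in [
        "NonFoil", "Traditional", "Etched", "PreModern", "Textured",
        "Galaxy", "Surge", "OilSlick", "StepAndCompleat", "Halo",
        "Confetti", "NeonInk", "Fracture"]),
    "Stamp_Data/Stamped", "Stamp_Data/Clean",
    *(f"Condition_Data/Surface/{c}" for c in
      ["LightScratches", "Clouding", "Dirt", "Dents"]),
    *(f"Condition_Data/Edges/{c}" for c in
      ["Whitening", "Chipping", "CornerWear"]),
    *(f"Condition_Data/Structure/{c}" for c in
      ["Creases", "ShuffleBend", "WaterDamage"]),
    *(f"Condition_Data/Critical/{c}" for c in
      ["Inking", "Rips", "BindersDents"]),
]

def ensure_tree(root: str) -> int:
    """Create any missing training folders; return how many were made."""
    made = 0
    for rel in FOLDERS:
        p = Path(root) / rel
        if not p.exists():
            p.mkdir(parents=True)
            made += 1
    return made
```

Point it at `IYmtg_Training/`; existing folders and their contents are left untouched.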
### Step 6: Train Models in Create ML (Mac)
1. Open **Create ML** (found in Xcode → Open Developer Tool → Create ML)
2. **Foil Classifier:** New Project → Image Classification → drag in `Foil_Data/` → Train → Export as `IYmtgFoilClassifier.mlmodel`
3. **Stamp Classifier:** New Project → Image Classification → drag in `Stamp_Data/` → Train → Export as `IYmtgStampClassifier.mlmodel`
4. **Condition Classifier:** New Project → Object Detection → drag in `Condition_Data/` → Train → Export as `IYmtgConditionClassifier.mlmodel`
5. Drag all three `.mlmodel` files into the Xcode Project Navigator (ensure they are added to the app target)
### Set Symbol Harvester (Automation)
Run this script to automatically collect set symbol training data from Scryfall. Works on any platform.
```bash
pip install requests pillow
python3 IYmtg_Automation/fetch_set_symbols.py
```
Output goes to `Set_Symbol_Training/`. Drag this folder into Create ML → Image Classification to train `IYmtgSetSymbolClassifier.mlmodel`.
---
## Part 6: Community Feedback & Model Retraining
The app has a built-in pipeline that collects user corrections and uses them to improve the ML models over time. This section explains how it works end-to-end.
### How the Feedback System Works
There are two data collection paths:
**Path 1 — User Corrections (Community Data)**
When a user corrects a mis-scan (wrong card identity, wrong foil type, or wrong condition), the app automatically uploads the cropped card image to Firebase Storage — but only if the user has opted in.
The upload destination is determined by what was corrected:
| What Changed | Firebase Storage Path | Used to Retrain |
| :--- | :--- | :--- |
| Card name or set | `training/Identity_<SETCODE>_<CollectorNum>/` | `cards.json` (re-fingerprint) |
| Foil type | `training/Foil_<FoilType>/` | `IYmtgFoilClassifier.mlmodel` |
| Condition grade | `training/Condition_<Grade>/` | `IYmtgConditionClassifier.mlmodel` |
Example: A user corrects a card that was identified as "Traditional" foil to "Etched". The image is uploaded to `training/Foil_Etched/<UUID>.jpg`.
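The routing table above reduces to a small path-building function. This is a sketch for reference only (the app's real logic lives in `TrainingUploader`, and the `"SET/number"` identity encoding here is illustrative):

```python
def upload_path(change: str, value: str) -> str:
    """Map a correction type to its Firebase Storage destination folder."""
    if change == "identity":
        # value is assumed to be "SETCODE/collectorNumber", e.g. "STX/123".
        set_code, collector = value.split("/")
        return f"training/Identity_{set_code}_{collector}/"
    if change == "foil":
        return f"training/Foil_{value}/"
    if change == "condition":
        return f"training/Condition_{value}/"
    raise ValueError(f"unknown correction type: {change}")
```

So a foil correction to "Etched" lands in `training/Foil_Etched/`, matching the example below.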
**Path 2 — Dev Mode (Your Own Device)**
When the `ENABLE_DEV_MODE` build flag is active and you tap the logo header 5 times, every raw scan frame is saved locally to `Documents/RawTrainingData/` on the device. Sync this folder to your Mac via Xcode's Devices window or Files app to retrieve images.
### User Opt-In
Users must explicitly opt in before any images are uploaded. The opt-in state is stored in `AppConfig.isTrainingOptIn` (backed by `UserDefaults`).
You must expose a toggle in your app's Settings/Library UI that sets `AppConfig.isTrainingOptIn`. The app's store description refers to this as the "Community Data Initiative": users are told their corrections improve the AI for everyone.
**Firebase Authentication note:** `TrainingUploader` only uploads when `FirebaseApp.app() != nil` — meaning Firebase must be configured (`GoogleService-Info.plist` present) for community uploads to work. The app functions without Firebase, but no feedback is collected in that mode.
### Firebase Storage Rules
The rules in `IYmtg_App_iOS/Firebase/storage.rules` enforce:
- `training/` — authenticated users can **write only** (upload corrections). No user can read others' images.
- `models/` — anyone can **read** (required for OTA model downloads). Write access is developer-only via the Firebase Console.
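For orientation, rules enforcing that policy look roughly like this (an illustrative sketch, not a copy of the shipped `storage.rules` — always treat the file in the repo as authoritative):

```
rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    match /training/{label}/{image} {
      allow write: if request.auth != null;  // authenticated uploads only
      allow read: if false;                  // no one reads others' images
    }
    match /models/{model} {
      allow read;                            // public read for OTA downloads
      allow write: if false;                 // developer-only via the Console
    }
  }
}
```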
### Downloading Collected Training Data
1. Go to the **Firebase Console → Storage → training/**
2. You will see folders named by label (e.g., `Foil_Etched/`, `Condition_Near Mint (NM)/`)
3. Download all images in each folder. For bulk downloads, use `gsutil` from the Google Cloud SDK (Firebase Storage buckets are standard Google Cloud Storage buckets; the Firebase CLI itself has no bulk-download command):
```bash
# Install the Google Cloud SDK if needed, then authenticate
gcloud auth login
# Download all training data (-m runs the copy in parallel)
gsutil -m cp -r gs://<your-bucket>/training ./downloaded_training
```
4. You now have a folder of user-contributed cropped card images, organized by label.
### Reviewing and Sorting Downloaded Images
**Do not skip this step.** User uploads can include blurry photos, wrong cards, or bad crops. Review each image before adding it to your training set.
1. Open each label folder from the download.
2. Delete any images that are: blurry, poorly cropped, show background, or are clearly wrong.
3. Move the accepted images into the corresponding `IYmtg_Training/` subfolder:
| Downloaded Folder | Move to Training Folder |
| :--- | :--- |
| `Foil_Traditional/` | `IYmtg_Training/Foil_Data/Traditional/` |
| `Foil_Etched/` | `IYmtg_Training/Foil_Data/Etched/` |
| `Foil_<Type>/` | `IYmtg_Training/Foil_Data/<Type>/` |
| `Condition_<Grade>/` | Inspect condition grade — map to `Condition_Data/` subfolder by damage type visible |
> **Identity corrections** (`training/Identity_*/`) are not used to retrain ML models. They indicate that the visual fingerprint for that card may be wrong or ambiguous. Review these separately and consider re-running the Builder for those specific cards.
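The foil rows of the mapping above are mechanical enough to script. Here is a hypothetical helper for that sorting step (the function name and folder layout are assumptions based on this README, not part of the shipped tooling); condition images need per-image inspection, so they are deliberately skipped:

```python
import shutil
from pathlib import Path

def sort_foil_images(downloaded: Path, training: Path) -> int:
    """Move reviewed foil images from downloaded_training/Foil_<Type>/
    into IYmtg_Training/Foil_Data/<Type>/. Returns the number moved."""
    moved = 0
    for folder in sorted(downloaded.glob("Foil_*")):
        foil_type = folder.name.removeprefix("Foil_")
        dest = training / "Foil_Data" / foil_type
        dest.mkdir(parents=True, exist_ok=True)
        for image in folder.glob("*.jpg"):
            shutil.move(str(image), str(dest / image.name))
            moved += 1
    return moved
```

Run it only after deleting rejected images, since it moves everything that remains in the `Foil_*` folders.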
### Retraining the Models
Once you have added new images to `IYmtg_Training/`:
1. Open **Create ML** on your Mac.
2. Open your existing project for the model you want to update (e.g., `IYmtgFoilClassifier`).
3. The new images in the training folders will be picked up automatically.
4. Click **Train**. Create ML retrains on the full expanded dataset (it does not fine-tune the previously exported model).
5. Evaluate the results — check accuracy on the **Validation** tab. Aim for >90% accuracy before shipping.
6. Export the updated `.mlmodel` file.
### Pushing the Updated Model via OTA
You do not need an App Store update to ship a new model version. Use Firebase Storage:
1. In the **Firebase Console → Storage**, navigate to the `models/` folder.
2. Upload your new `.mlmodel` file with the **exact same filename** (e.g., `IYmtgFoilClassifier.mlmodel`).
3. On the next app launch, `ModelManager` detects the newer version, downloads and compiles it, and swaps it in automatically.
> **Important:** The new model takes effect on the **next app launch after download**, not immediately. Users may need to relaunch once.
### Recommended Retraining Schedule
| Trigger | Action |
| :--- | :--- |
| 50+ new correction images accumulated | Review, sort, retrain affected model, push OTA |
| New MTG set released with new foil type | Add training folder, acquire cards, retrain FoilClassifier |
| New MTG set released | Rebuild `cards.json` via `weekly_update.sh` |
| Significant accuracy complaints from users | Download corrections, review, retrain |
---
## Part 7: Backend & Security
### Cloud Storage Architecture
The app uses a two-tier cloud strategy:
| Tier | Technology | What it stores | Cost |
| :--- | :--- | :--- | :--- |
| **Primary** | iCloud + CloudKit (SwiftData) | All card metadata, synced automatically across devices | Free (user's iCloud) |
| **Secondary** | Firebase Firestore | Metadata only — no images — optional manual backup | Free (Firestore free tier) |
Card images are stored in the user's iCloud Drive under `Documents/UserContent/` and are **never** uploaded to Firebase.
### iCloud / CloudKit Setup (Required for Primary Sync)
1. In Xcode, open **Signing & Capabilities**.
2. Add the **iCloud** capability. Enable **CloudKit**.
3. Add a CloudKit container named `iCloud.<your-bundle-id>`.
4. Add the **Background Modes** capability. Enable **Remote notifications**.
5. Set the minimum deployment target to **iOS 17** (required by SwiftData).
Without this setup the app falls back to local-only storage automatically.
### Firebase Configuration (Optional Secondary Backup)
Firebase is no longer the primary sync mechanism. It serves as a user-triggered metadata backup.
1. **Create Project:** Go to the Firebase Console and create a new project.
2. **Authentication:** Enable "Anonymous" sign-in in the Authentication tab.
3. **Firestore Database:** Create a database and apply the rules from `IYmtg_App_iOS/Firebase/firestore.rules`.
4. **Setup:** Download `GoogleService-Info.plist` from Project Settings and drag it into the `IYmtg_App_iOS` folder in Xcode (ensure "Copy items if needed" is checked).
5. Users trigger backup manually via **Library → Cloud Backup → Backup Metadata to Firebase Now**.
The app runs fully without `GoogleService-Info.plist` (Local Mode — iCloud sync still works).
### Over-the-Air (OTA) Model Updates
To update ML models without an App Store release:
1. Train your new model (e.g., `IYmtgFoilClassifier.mlmodel`).
2. Upload the `.mlmodel` file to Firebase Storage in the `models/` folder.
3. The app will automatically detect the newer file, download, compile, and hot-swap it on the next launch.
> **Note:** OTA model updates take effect on the next app launch — not immediately. An app restart is required after a new model is downloaded.
### Privacy Manifest
Ensure `PrivacyInfo.xcprivacy` is included in the app target to satisfy Apple's privacy requirements regarding file timestamps and user defaults.
---
## Part 8: App Configuration
**CRITICAL:** Edit `IYmtg_App_iOS/AppConfig.swift` before building to ensure payments and support work correctly:
1. Set `contactEmail` to your real email address (required by Scryfall API policy).
2. Set `tipJarProductIDs` to your actual In-App Purchase IDs from App Store Connect.
3. `isFirebaseBackupEnabled` defaults to `false`. Users opt-in from Library settings.
---
## Part 9: Development Mode
To enable saving raw training images during scanning:
1. Add the compilation flag `ENABLE_DEV_MODE` in Xcode Build Settings → Swift Compiler → Active Compilation Conditions.
2. Tap the "IYmtg" logo header 5 times in the app to activate.
Saved images appear in `Documents/DevImages/` and can be used to supplement your ML training data.
---
## Part 10: Testing
The project includes a unit test suite in `IYmtgTests.swift`.
**How to Run:**
* Press `Cmd+U` in Xcode to execute the test suite.
**Scope:**
* **Models:** Verifies `SavedCard` initialization and data mapping.
* **Engines:** Tests logic for `ConditionEngine` (grading rules) and `ExportEngine` (CSV/Arena/MTGO formatting).
* **ViewModel:** Validates `ScannerViewModel` state management, including search filtering and portfolio value calculations.
**Note:** CoreML models are not loaded during unit tests to ensure speed and stability. The tests verify the *logic* surrounding the models (e.g., "If 3 scratches are detected, grade is Played") rather than the ML inference itself.
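As an illustration of that testing style, here is a sketch of the kind of rule the tests exercise. The thresholds and function name are invented for this example and are not taken from `ConditionEngine`:

```python
def grade_for_defects(scratches: int, edge_whitening: bool) -> str:
    # Invented thresholds, for illustration only.
    if scratches == 0 and not edge_whitening:
        return "Near Mint (NM)"
    if scratches <= 2:
        return "Lightly Played (LP)"
    return "Played"

# A logic-level test like this needs no CoreML model at all:
assert grade_for_defects(3, False) == "Played"
```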
---
## Part 11: Release Checklist
Perform these steps before submitting to the App Store.
1. **Database:**
* [ ] `cards.json` is present in `IYmtg_App_iOS/` and added to the Xcode target.
* [ ] Builder was run recently enough to include current sets.
2. **Configuration Check:**
* [ ] Open `AppConfig.swift`.
* [ ] Verify `contactEmail` is your real email (not a placeholder).
* [ ] Verify `tipJarProductIDs` match App Store Connect.
* [ ] Ensure `enableFoilDetection` and other feature flags are `true`.
* [ ] Update `appVersion` (Semantic Versioning: Major.Minor.Patch) and `buildNumber` for this release.
3. **ML Models:**
* [ ] `IYmtgFoilClassifier.mlmodel` added to Xcode target (or acceptable to ship without).
* [ ] `IYmtgStampClassifier.mlmodel` added to Xcode target (or acceptable to ship without).
* [ ] `IYmtgConditionClassifier.mlmodel` added to Xcode target (or acceptable to ship without).
4. **iCloud / CloudKit:**
* [ ] Signing & Capabilities → iCloud → CloudKit enabled.
* [ ] CloudKit container added: `iCloud.<bundle-id>`.
* [ ] Background Modes → Remote notifications enabled.
* [ ] Minimum deployment target set to **iOS 17**.
5. **Assets:**
* [ ] `Assets.xcassets` has the AppIcon filled for all sizes.
* [ ] `PrivacyInfo.xcprivacy` is in the app target.
6. **Testing:**
* [ ] Run Unit Tests (`Cmd+U`) — all must pass.
* [ ] Run on Physical Device — verify Camera permissions prompt appears.
* [ ] Run on Physical Device — verify a card scans and saves successfully.
7. **Build:**
* [ ] Select "Any iOS Device (arm64)".
* [ ] Product → Archive.
* [ ] Validate App in Organizer.
* [ ] Distribute App → App Store Connect.
---
**Version Authority:** 1.0.0
---
## Project Audit
**Audit Date:** 2026-03-05 | **Auditor:** Claude (Sonnet 4.6)
A full compilation-readiness audit was performed against all 33 Swift source files in `IYmtg_App_iOS/`. See [`claude_review_summary.md`](claude_review_summary.md) for the complete report.
**Key findings:**
| Severity | Count | Description |
|---|---|---|
| Blocker | 2 | `IYmtgTests.swift` — test target will not compile (`ScannerViewModel()` no-arg init removed; test accesses non-existent VM properties) |
| Critical | 1 | `IYmtg_Builder_Mac/` is empty — `cards.json` cannot be generated; scanner is non-functional at runtime |
| Major | 4 | Deprecated `.onChange(of:)` API (iOS 17); missing `import FirebaseCore` in `ModelManager.swift`; Firebase delete data leak; dead `batchUpdatePrices()` function |
| Minor | 4 | Empty `Features/CardDetail/` directory; `PersistenceActor.swift` placeholder; production `AppConfig` values not set; OTA model restart not documented |
**Overall:** App source code is architecturally complete. Fix the 2 Blocker issues in `IYmtgTests.swift` and implement `IYmtg_Builder_Mac` before developer handoff.