Snap a photo. Get USDA-backed macros. Instantly.

FoodSnap Macros is a web application that estimates calories and macronutrients from a single photo of your meal. It combines a vision model (Gemini) with authoritative nutrition databases to reduce manual entry and improve the reliability of macro tracking.

The product is designed for individuals who want streamlined nutrition logging, including people following macro-based diets, athletes monitoring energy balance, and users who frequently eat Spanish cuisine. Beyond food logging, it supports exercise imports, hydration and weight tracking, and an AI assistant that can update diary entries via natural language.

Users photograph their meals or describe them in text. A Gemini-based vision model identifies the dish and its components. When needed, users can adjust recognized items or portion sizes to reflect their actual intake.
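
To make the recognition step concrete, here is a minimal sketch of sending a meal photo to Gemini and asking for a structured list of dishes. It assumes the `@google/generative-ai` Node SDK; the model name, prompt wording, and the `identifyMeal` helper are illustrative rather than the app's actual implementation.

```ts
import { GoogleGenerativeAI } from "@google/generative-ai";

// Illustrative helper: send a meal photo to a Gemini vision model and ask for
// a structured list of dishes with rough portion estimates.
async function identifyMeal(imageBase64: string): Promise<string> {
  const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY ?? "");
  const model = genAI.getGenerativeModel({ model: "gemini-1.5-flash" }); // model choice is an assumption

  const result = await model.generateContent([
    { inlineData: { data: imageBase64, mimeType: "image/jpeg" } },
    "Identify each dish and its components in this photo. " +
      'Return JSON: [{ "name": string, "estimatedGrams": number }].',
  ]);

  // The raw response is parsed downstream and shown to the user for corrections.
  return result.response.text();
}
```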
The system cross-references identified foods with the USDA and OpenFoodFacts databases to retrieve calorie and macronutrient values. This two-step approach is designed to reduce model hallucinations by anchoring results to verified nutrition data. The app then records the entry with per-serving estimates and allows edits.
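
As an illustration of the cross-referencing step, the sketch below queries the public USDA FoodData Central search API for a recognized food name and extracts the macronutrients of the top match. The `lookupUsdaMacros` helper and the "take the first result" strategy are simplifying assumptions; the real pipeline would also consult OpenFoodFacts and scale values to the logged portion.

```ts
// Illustrative lookup against USDA FoodData Central (requires a free API key).
// Error handling and the OpenFoodFacts fallback are omitted for brevity.
interface MacroEstimate {
  kcal: number;
  proteinG: number;
  fatG: number;
  carbsG: number;
}

async function lookupUsdaMacros(foodName: string, apiKey: string): Promise<MacroEstimate | null> {
  const url =
    "https://api.nal.usda.gov/fdc/v1/foods/search" +
    `?query=${encodeURIComponent(foodName)}&pageSize=1&api_key=${apiKey}`;
  const res = await fetch(url);
  if (!res.ok) return null;

  const data = await res.json();
  const food = data.foods?.[0];
  if (!food) return null;

  // Search results list foodNutrients entries with nutrientName, value, and
  // unitName (values are typically per 100 g of the food).
  const nutrient = (name: string): number =>
    food.foodNutrients?.find((n: { nutrientName: string; value: number }) => n.nutrientName === name)?.value ?? 0;

  return {
    kcal: nutrient("Energy"),
    proteinG: nutrient("Protein"),
    fatG: nutrient("Total lipid (fat)"),
    carbsG: nutrient("Carbohydrate, by difference"),
  };
}
```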
Logged meals feed into a daily dashboard with calorie and macro targets. Exercise energy expenditure can be imported from Strava to compute net calories. Users can also track hydration and weight. The conversational assistant can add or modify entries through natural language, and the system adapts based on user corrections over time.
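
The arithmetic behind the dashboard is straightforward; the sketch below shows the net-calorie calculation, assuming an illustrative `DailySummary` shape with Strava-derived exercise energy.

```ts
// Illustrative daily summary: intake from logged meals, expenditure imported
// from Strava, and the remaining budget against the user's calorie target.
interface DailySummary {
  targetKcal: number;   // user's daily calorie goal
  intakeKcal: number;   // sum of logged meal estimates
  exerciseKcal: number; // energy expenditure imported from Strava
}

// Net intake = food eaten minus exercise burned.
function netCalories(day: DailySummary): number {
  return day.intakeKcal - day.exerciseKcal;
}

// Budget left today after accounting for exercise.
function remainingKcal(day: DailySummary): number {
  return day.targetKcal - netCalories(day);
}

// Example: 2200 kcal target, 1800 kcal eaten, 400 kcal ride imported from Strava.
const today: DailySummary = { targetKcal: 2200, intakeKcal: 1800, exerciseKcal: 400 };
console.log(netCalories(today), remainingKcal(today)); // 1400 800
```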
| Aspect | Generic AI Calorie Apps | FoodSnap Macros (Hybrid Engine) |
|---|---|---|
| Core method | LLM-only estimation | Vision model plus database cross-reference |
| Data sources | No link to verified nutrition databases | USDA and OpenFoodFacts |
| Error risk | Higher risk of hallucinated values | Reduced by cross-referencing databases |
| Manual work | Frequent manual corrections | Minimal manual input; supports corrections |
| Adaptation | Limited learning from edits | Learns from user corrections |