
AI Book Recommendations from Live Inventory


The Problem

Bookstore customers want personalized recommendations, but staff can't possibly know every title in a store carrying thousands of books. Traditional search finds only exact keyword matches, missing relevant suggestions when customers describe what they're looking for in natural language, and online recommendation engines don't know what's actually on the shelves. The client needed a solution that could understand vague requests ("a cozy mystery set in a small town"), search intelligently across their specific inventory, and recommend books that matched both the intent of the query and what was physically available to purchase.

The Solution

We designed a multi-stage AI pipeline. First, we ingest the store's inventory by reading every ISBN from their point-of-sale system. For each ISBN, we pull detailed metadata from the Google Books API: title, author, description, genre, reviews, and more. This enriched data is then converted into vector embeddings using OpenAI's text-embedding-3-large model and stored in a vector database. When a customer asks a question, we embed their query the same way and run a semantic search to find conceptually similar books. The top results are then passed through GPT-4o or Gemini, which validates each match against real-world knowledge about those books and crafts a natural-language response explaining why each recommendation fits the request.
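The ingestion and enrichment step can be sketched in a few lines. This is a minimal illustration, not the production code: it assumes the ISBNs have already been exported from the point-of-sale system into a plain Python list, and the `enrich_isbn` helper and sample ISBN values are hypothetical. The endpoint and field names are those of the public Google Books volumes API.

```python
import requests

GOOGLE_BOOKS_URL = "https://www.googleapis.com/books/v1/volumes"

def enrich_isbn(isbn: str) -> dict | None:
    """Look up one ISBN in Google Books and keep the fields we embed later."""
    resp = requests.get(GOOGLE_BOOKS_URL, params={"q": f"isbn:{isbn}"}, timeout=10)
    resp.raise_for_status()
    items = resp.json().get("items", [])
    if not items:
        return None  # ISBN not indexed by Google Books
    info = items[0]["volumeInfo"]
    return {
        "isbn": isbn,
        "title": info.get("title", ""),
        "authors": info.get("authors", []),
        "categories": info.get("categories", []),
        "description": info.get("description", ""),
    }

# ISBNs exported from the point-of-sale system (sample values for illustration)
inventory_isbns = ["9780316769488", "9780062315007"]

enriched = []
for isbn in inventory_isbns:
    record = enrich_isbn(isbn)
    if record:
        enriched.append(record)
```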
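Embedding and retrieval follow. The case study doesn't name the vector database, so this sketch keeps the vectors in a NumPy array and scores them with cosine similarity in memory; in production a dedicated vector store would take the place of the `semantic_search` helper. It continues from the `enriched` records above and uses the OpenAI Python SDK, which reads the API key from the environment.

```python
import numpy as np
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def embed(texts: list[str]) -> np.ndarray:
    """Embed a batch of texts with text-embedding-3-large."""
    resp = client.embeddings.create(model="text-embedding-3-large", input=texts)
    return np.array([item.embedding for item in resp.data])

# One document per book: the enriched metadata flattened into a single string
documents = [
    f"{b['title']} by {', '.join(b['authors'])}. "
    f"{', '.join(b['categories'])}. {b['description']}"
    for b in enriched
]
doc_vectors = embed(documents)

def semantic_search(query: str, k: int = 5) -> list[dict]:
    """Return the k inventory books whose embeddings sit closest to the query."""
    q = embed([query])[0]
    scores = doc_vectors @ q / (np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q))
    top = np.argsort(scores)[::-1][:k]
    return [enriched[i] for i in top]

candidates = semantic_search("a cozy mystery set in a small town")
```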
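The final stage hands the retrieved candidates to a chat model, which drops weak matches and writes the customer-facing explanation. Below is a minimal sketch using GPT-4o; the `recommend` helper and the prompt wording are illustrative rather than the production prompt, and the Gemini path would mirror it with Google's SDK.

```python
def recommend(query: str, candidates: list[dict]) -> str:
    """Ask GPT-4o to validate the retrieved books and explain the best fits."""
    catalog = "\n".join(
        f"- {b['title']} by {', '.join(b['authors'])}: {b['description'][:300]}"
        for b in candidates
    )
    messages = [
        {
            "role": "system",
            "content": (
                "You are a bookstore assistant. Recommend only books from the "
                "provided in-stock list, drop any that do not genuinely match "
                "the request, and explain why each kept title fits."
            ),
        },
        {
            "role": "user",
            "content": f"Customer request: {query}\n\nIn-stock candidates:\n{catalog}",
        },
    ]
    resp = client.chat.completions.create(model="gpt-4o", messages=messages)
    return resp.choices[0].message.content

print(recommend("a cozy mystery set in a small town", candidates))
```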


The Result

The chatbot successfully handles complex, conversational queries that would stump traditional search. It understands context, genre preferences, reading level, and thematic elements. Because it's connected to live inventory, every recommendation is a book the customer can actually buy, and the multi-stage validation filters out the irrelevant results that semantic search alone might surface. The bookstore now offers 24/7 intelligent assistance without requiring staff to memorize the entire catalog. The project validated that combining semantic search with LLM reasoning produces recommendation quality that neither technology achieves alone.

Models & APIs Used: Google Books API (metadata enrichment), text-embedding-3-large (vectorization), GPT-4o / Gemini 2.0 Flash (validation & response)