MUSE
Reimagining the Museum Experience: Smart Navigation & AR Exploration
Confidential Client 8XX579
Role: Sr. Product Designer
Team: Frontend Developer, Product Manager, Software Engineer (FS), iOS Mobile Engineer, ML Engineer (AR/VR)
Reimagining the Museum Experience
Most museum apps are functional but flat. They provide basic maps and lists, but don’t account for the way people actually move through and experience space. This project reimagined the museum guide — not as a static app, but as a context-aware spatial experience layered with exploration, orientation, and storytelling.
Analyzing visitor behavior uncovered two core insights:
Over 40% of visit time was spent trying to find locations
Most users abandoned the app after the first map interaction
These findings guided the structural redesign — the experience needed to adapt to physical movement and reduce friction in discovery.
Mapping the Existing User Journey
The existing flow revealed long stretches without context, requiring users to exit the app or retrace steps. It became clear that content needed to be tightly integrated with spatial navigation — not siloed in menus.
Designing the Flow
The redesigned system flows naturally from 2D map → 3D environment → object-level stories. This progression lets users zoom in and out as they explore, surfacing relevant content without overwhelming the interface.
Wireframes were built to test structure, hierarchy, and movement. The goal was to make exploration feel intuitive — like you’re walking through the space, not clicking through an app.
A Layered Experience
To solve this, the app was built around three core components, layered seamlessly into the navigation:
A live 2D wayfinding map that centers the visitor in real time and helps them navigate
A 3D spatial experience that previews exhibit zones, rooms, and transitions between spaces
A Featured Works section, embedded within the map and galleries, where users can explore individual objects, stories, and artist details
This structure lets visitors zoom in and out naturally — from building → exhibit → object — without losing their place or context.
2D Map Navigation
A clean, zoomable map helps users orient themselves within the museum. Visitors can tap to preview galleries, view current location, and follow visual wayfinding cues designed to mirror real-world signage.
3D Spatial Experience
The 3D mode offers a layered, immersive view of the museum layout. Users can explore floors and rooms in spatial context, making the app feel like an extension of the physical space.
Featured Exhibits & Object Detail
A curated section surfaces key works and exhibitions. Each object opens into an editorial-style layout, offering rich descriptions, artist context, and optional AR previews for selected pieces.
System Architecture Overview
Data Collection
User interactions (clicks, dwell time, exhibit views), indoor location (BLE beacons or Wi-Fi triangulation), time of visit
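Beacon-based indoor positioning typically starts by converting signal strength into an approximate distance. A minimal sketch, assuming a standard log-distance path-loss model — the calibrated 1 m RSSI and path-loss exponent below are illustrative defaults, not values from this project:

```python
def rssi_to_distance(rssi_dbm: float, tx_power_dbm: float = -59.0,
                     path_loss_exponent: float = 2.0) -> float:
    """Estimate beacon distance in metres from a log-distance path-loss model.

    tx_power_dbm is the calibrated RSSI measured at 1 m from the beacon;
    the exponent depends on the environment (roughly 2 in open galleries,
    higher through walls). Both defaults here are assumptions.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))
```

Distances from three or more beacons can then be combined (e.g. via trilateration) to place the visitor on the map.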
Processing Pipeline
Event data is streamed and cleaned using Python + BigQuery, then passed to a lightweight content recommendation engine (collaborative filtering + content-based hybrid model)
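The hybrid step above can be sketched as a weighted blend of the two models' per-exhibit scores. The `hybrid_scores` helper and its `alpha` weight are illustrative assumptions, not the production pipeline:

```python
import numpy as np

def hybrid_scores(cf_scores: np.ndarray, content_scores: np.ndarray,
                  alpha: float = 0.6) -> np.ndarray:
    """Blend collaborative-filtering and content-based scores per exhibit.

    Both inputs are scores for one visitor across the same exhibits;
    alpha weights the collaborative signal. Each vector is min-max
    normalised first so neither model dominates on raw scale alone.
    """
    def norm(x: np.ndarray) -> np.ndarray:
        rng = x.max() - x.min()
        return (x - x.min()) / rng if rng > 0 else np.zeros_like(x)

    return alpha * norm(cf_scores) + (1 - alpha) * norm(content_scores)
```

The blend weight would in practice be tuned offline against held-out interaction data.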
Model Outputs
Personalized exhibit recommendations shown in the "Featured" tab
Dynamic reorder of UI cards based on predicted interest score
Traffic heatmaps sent to a curator-facing dashboard (Metabase prototype)
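The dynamic card reorder in the outputs above reduces to a stable sort on predicted interest score. A minimal sketch, where `reorder_cards` is a hypothetical helper rather than the app's actual API:

```python
def reorder_cards(cards: list[str], scores: dict[str, float]) -> list[str]:
    """Return UI card ids sorted by predicted interest score, highest first.

    `scores` maps card id -> model score. Unscored cards fall back to 0.0
    and keep their original relative order, since Python's sort is stable.
    """
    return sorted(cards, key=lambda c: scores.get(c, 0.0), reverse=True)
```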
Feedback loop: User behavior is re-ingested to fine-tune recommendations over time
Privacy: All data collection is anonymized and opt-in, with local storage fallback for one-time guest users
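One common way to anonymize identifiers before events are ingested is a salted one-way hash. This sketch assumes SHA-256 with a per-deployment salt — both illustrative choices, not details confirmed by the project:

```python
import hashlib

def anonymize_visitor_id(raw_id: str, salt: str) -> str:
    """One-way hash a visitor identifier before it leaves the device.

    A per-deployment salt makes precomputed-table reversal impractical;
    the raw id is never stored, so events from one visit can be
    correlated without being traceable back to a person or device.
    """
    return hashlib.sha256((salt + raw_id).encode("utf-8")).hexdigest()
```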
Tooling
Python (data pipeline), BigQuery (storage & queries), Scikit-learn (prototype ML models), Metabase (dashboard), Figma (UX/UI)
Revolutionizing Museum Navigation with Machine Learning and Immersive Design
This ML-based museum navigator addresses critical challenges uncovered through visitor behavior analysis: over 40% of visit time was spent trying to find locations, and most users abandoned the app after the first map interaction. Those insights drove a structural redesign that adapts the experience to physical movement and reduces friction in discovery.

The app guides visitors seamlessly through layered 2D and 3D experiences and offers personalized recommendations based on their preferences. The Featured Exhibits section highlights curated collections and boosted engagement by 40%. Early user feedback shows a 30% improvement in satisfaction, with visitors spending 20% more time exploring exhibits.

By combining machine learning with immersive design, the project delivers a more intuitive, engaging, and educational museum experience while paving the way for the future of interactive cultural exploration.