Installation Guide

OpenClaw on Meta Glasses

Run OpenClaw on Meta smart glasses to create a hands-free AI assistant that can see, hear, and perform tasks in real time.

8 min read
Mar 24, 2026
Ampere Team

OpenClaw on Meta Glasses helps you control your AI agent using voice commands directly from your smart glasses.

This guide walks you through the complete setup, from configuration to live usage, in a simple step-by-step process.

What is OpenClaw on Meta Glasses?

OpenClaw on Meta Glasses is a setup in which:

  • Glasses capture what you see and hear
  • AI understands your request
  • OpenClaw performs the action

You speak → AI understands → Task gets done

System Architecture Overview

| Component | Role |
| --- | --- |
| Meta Glasses | Capture voice and video |
| VisionClaw App | Connects everything |
| AI (Gemini) | Understands commands |
| OpenClaw | Performs actions |
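The flow between these components can be sketched as a tiny pipeline. The function names below are illustrative stand-ins, not the real VisionClaw or OpenClaw APIs:

```python
# Minimal sketch of the data flow: glasses capture -> AI understands ->
# OpenClaw performs. Function names are illustrative, not real APIs.

def understand(transcript: str) -> str:
    """Stand-in for the Gemini call: map a spoken request to an intent."""
    return "describe_scene" if "looking at" in transcript.lower() else "unknown"

def perform(intent: str) -> str:
    """Stand-in for OpenClaw's action dispatch."""
    actions = {"describe_scene": "frame captured and described"}
    return actions.get(intent, "no matching action")

def pipeline(transcript: str) -> str:
    return perform(understand(transcript))

print(pipeline("What am I looking at?"))
```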

System Requirements

| Requirement | Details |
| --- | --- |
| Phone | Android or iPhone |
| OS | Android 14+ / iOS 17+ |
| AI | Gemini API key |
| Backend | OpenClaw (hosted or local) |

Step-by-Step Guide

Step 1: Get Gemini API Key

Required for AI processing.

  • Go to Google AI Studio
  • Sign in
  • Create API key
  • Copy and save it
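The app uses this key to call Gemini's REST API. As a rough sketch (the model name below is an assumption; use whichever model your key has access to), this is the shape of the URL a client builds:

```python
import os

# Hedged sketch of the generateContent REST endpoint for Gemini.
# The model name is an assumption -- substitute the model you use.
GEMINI_MODEL = "gemini-2.0-flash"

def gemini_endpoint(api_key: str, model: str = GEMINI_MODEL) -> str:
    """Build the URL a client like VisionClaw would POST requests to."""
    return (
        "https://generativelanguage.googleapis.com/v1beta/"
        f"models/{model}:generateContent?key={api_key}"
    )

# Read the saved key from the environment rather than hard-coding it.
key = os.environ.get("GEMINI_API_KEY", "<paste-your-key>")
print(gemini_endpoint(key))
```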

Step 2: Set Up OpenClaw

You have two options:

Option 1: Easy Method (Recommended)

Use a hosted platform like Ampere.sh:

  • Go to Ampere.sh and create an account
  • Deploy OpenClaw from the dashboard
  • Copy your API endpoint and gateway token

Option 2: Manual Local Setup

If you want full control:

npm install -g openclaw
openclaw setup
openclaw gateway start

For gateway connection help, see the OpenClaw Gateway pairing guide.

  • Port: 18789
  • Enable gateway
  • Use same Wi-Fi as phone
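Before moving on, it helps to confirm the gateway port is actually reachable from another device on the same Wi-Fi. A minimal sketch (any TCP check works; 18789 is the default port above):

```python
import socket

def gateway_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds in time."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check the default OpenClaw gateway port on this machine.
print(gateway_reachable("127.0.0.1", 18789))
```

If this returns False from your phone's network, check firewalls and that the gateway is bound to your LAN address, not just localhost.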

Step 3: Install VisionClaw App

You can install VisionClaw on both Android and iOS.

iOS Setup

git clone https://github.com/sseanliu/VisionClaw.git
cd VisionClaw/samples/CameraAccess
open CameraAccess.xcodeproj

  • Open in Xcode
  • Connect device
  • Click Run

Android Setup

git clone https://github.com/sseanliu/VisionClaw.git

  • Open CameraAccessAndroid in Android Studio
  • Add GitHub token (for SDK access)
  • Build and run app

Step 4: Add API Keys

Open config file:

  • iOS → Secrets.swift
  • Android → Secrets.kt

Add:

  • Gemini API Key
  • OpenClaw Host URL
  • OpenClaw Port (18789)
  • Gateway Token
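As a rough sketch, the entries in Secrets.swift might look like this. The field names are assumptions based on the list above, not the repo's actual identifiers; match whatever the sample project declares:

```swift
// Hypothetical field names -- adjust to the sample project's declarations.
enum Secrets {
    static let geminiAPIKey = "YOUR_GEMINI_API_KEY"
    static let openClawHost = "192.168.1.50"   // your OpenClaw machine's LAN IP
    static let openClawPort = 18789            // default gateway port
    static let gatewayToken = "YOUR_GATEWAY_TOKEN"
}
```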

Save the file and rebuild the app.

Step 5: Enable Developer Mode

  • Open Meta View app
  • Go to Settings
  • Tap the app version number multiple times
  • Enable Developer Mode

Step 6: Connect Meta Glasses

  • Pair glasses with Meta app
  • Open VisionClaw
  • Tap Start Streaming
  • Tap AI button

Step 7: Start Using

Speak commands naturally. Examples:

  • "What am I looking at?"
  • "Send a message"
  • "Add reminder"

Once your agent is live, you can explore more automations and tasks. See how to automate with AI agents for ideas on what OpenClaw can do next.

Common Setup Issues

| Issue | Fix |
| --- | --- |
| AI not responding | Check API key |
| OpenClaw not connecting | Check host + port |
| Build error | Check SDK / dependencies |
| Mic/camera not working | Enable permissions |

Frequently Asked Questions

Can I install OpenClaw directly on Meta glasses?
No, OpenClaw does not run directly on the glasses. It works through a phone app (like VisionClaw) which connects the glasses, AI, and backend together.
Can I use OpenClaw on both Android and iPhone?
Yes, VisionClaw supports both Android and iPhone. However, setup may be slightly easier and more stable on some devices depending on SDK support.
Is coding required to set this up?
Basic setup may require installing the app using Xcode or Android Studio. However, using a hosted OpenClaw (like Ampere) removes most technical complexity.
Does this work in real-time?
Yes, but with some limits. Video is processed at a low frame rate (~1 FPS), which is good for static scenes but not fast movement.
Is this setup safe to use?
It depends on your configuration. Since OpenClaw can access apps and data, always use secure API keys, tokens, and trusted environments.

Skip the Complex Setup

Setting up VisionClaw, Xcode, Android Studio, and API keys can be time-consuming. Use Ampere.sh to deploy OpenClaw instantly — no coding, no local server, works with your glasses right away.

Deploy on Ampere.sh →