MCP Hub

Validated

3D & AR SDK for Android, iOS, Web — API docs, samples, validation, and code generation.

Registry
Stars: 1,137
Forks: 206
Tools: 19
Updated: Mar 26, 2026
Validated: Mar 27, 2026
Validation Details

Duration: 9.3s

Server: @sceneview/mcp v3.4.14

Quick Install

npx -y sceneview-mcp

SceneView

3D & AR for every platform.

Build 3D and AR experiences with the UI frameworks you already know. Same concepts, same simplicity — Android, iOS, Web, Desktop, TV, Flutter, React Native.



Quick look

// Android — Jetpack Compose
Scene(modifier = Modifier.fillMaxSize()) {
    rememberModelInstance(modelLoader, "models/helmet.glb")?.let {
        ModelNode(modelInstance = it, scaleToUnits = 1.0f, autoAnimate = true)
    }
}

// iOS — SwiftUI
SceneView(environment: .studio) {
    ModelNode(named: "helmet.usdz")
        .scaleToUnits(1.0)
}

No engine boilerplate. No lifecycle callbacks. The runtime handles everything.
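scaleToUnits = 1.0f normalizes a model so its largest dimension spans one scene unit, which is why the same value works for any asset. A minimal sketch of the underlying math (the Box type and function here are illustrative, not the library's actual API, which derives the bounds from the loaded glTF):

```kotlin
// Illustrative only: how a scaleToUnits-style factor can be computed.
// Box is a hypothetical stand-in for a model's bounding-box extents.
data class Box(val sizeX: Float, val sizeY: Float, val sizeZ: Float)

fun scaleToUnits(box: Box, units: Float): Float {
    // Uniform scale making the largest dimension equal `units`.
    val largest = maxOf(box.sizeX, box.sizeY, box.sizeZ)
    return if (largest > 0f) units / largest else 1f
}

fun main() {
    // A 2 m tall helmet fit into 1 unit → uniform scale 0.5.
    println(scaleToUnits(Box(0.4f, 2.0f, 0.6f), units = 1.0f))
}
```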


Platforms

| Platform | Renderer | Framework | Status |
|---|---|---|---|
| Android | Filament | Jetpack Compose | Stable |
| Android TV | Filament | Compose TV | Alpha |
| iOS / macOS / visionOS | RealityKit | SwiftUI | Alpha |
| Web | Filament.js (WASM) | Kotlin/JS + WebXR | Alpha |
| Desktop | Software renderer | Compose Desktop | Alpha |
| Flutter | Native per platform | PlatformView | Alpha |
| React Native | Native per platform | Fabric | Alpha |

Install

Android (3D + AR):

dependencies {
    implementation("io.github.sceneview:sceneview:3.3.0")     // 3D
    implementation("io.github.sceneview:arsceneview:3.3.0")   // AR (includes 3D)
}

iOS / macOS / visionOS (Swift Package Manager):

https://github.com/sceneview/sceneview-swift.git  (from: 3.3.0)

Web (Kotlin/JS + Filament.js):

dependencies {
    implementation("io.github.sceneview:sceneview-web:3.3.0")
}

Desktop / Flutter / React Native: see samples/


3D scene

Scene is a Composable that renders a Filament 3D viewport. Nodes are composables inside it.

Scene(
    modifier = Modifier.fillMaxSize(),
    engine = rememberEngine(),
    modelLoader = rememberModelLoader(engine),
    environment = rememberEnvironment(engine, "envs/studio.hdr"),
    cameraManipulator = rememberCameraManipulator()
) {
    // Model — async loaded, appears when ready
    rememberModelInstance(modelLoader, "models/helmet.glb")?.let {
        ModelNode(modelInstance = it, scaleToUnits = 1.0f, autoAnimate = true)
    }

    // Geometry — procedural shapes
    CubeNode(size = Size(0.2f))
    SphereNode(radius = 0.1f, position = Position(x = 0.5f))

    // Nesting — same as Column { Row { } }
    Node(position = Position(y = 1.0f)) {
        LightNode(apply = { type(LightManager.Type.POINT); intensity(50_000f) })
        CubeNode(size = Size(0.05f))
    }
}

Node types

| Node | What it does |
|---|---|
| ModelNode | glTF/GLB model with animations. isEditable = true for gestures. |
| LightNode | Sun, directional, point, or spot light. apply is a named parameter. |
| CubeNode / SphereNode / CylinderNode / PlaneNode | Procedural geometry |
| ImageNode | Image on a plane |
| ViewNode | Compose UI rendered as a 3D surface |
| MeshNode | Custom GPU mesh |
| Node | Group / pivot |
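Nesting composes transforms the way Column { Row { } } composes layout: a child's world position is its parent's world position combined with its local offset. A minimal sketch of that idea, translation only (SimpleNode and these types are illustrative, not the library's node graph, which also composes rotation and scale):

```kotlin
// Minimal sketch of hierarchical positioning (translation only).
data class Position(val x: Float = 0f, val y: Float = 0f, val z: Float = 0f) {
    operator fun plus(o: Position) = Position(x + o.x, y + o.y, z + o.z)
}

class SimpleNode(val position: Position = Position(), val parent: SimpleNode? = null) {
    // Walk up the parent chain, summing local offsets.
    fun worldPosition(): Position = (parent?.worldPosition() ?: Position()) + position
}

fun main() {
    val group = SimpleNode(Position(y = 1.0f))                // Node(position = Position(y = 1.0f))
    val cube = SimpleNode(Position(x = 0.5f), parent = group) // CubeNode nested inside it
    println(cube.worldPosition())  // Position(x=0.5, y=1.0, z=0.0)
}
```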

AR scene

ARScene is Scene with ARCore. The camera follows real-world tracking.

var anchor by remember { mutableStateOf<Anchor?>(null) }

ARScene(
    modifier = Modifier.fillMaxSize(),
    planeRenderer = true,
    onSessionUpdated = { _, frame ->
        if (anchor == null) {
            anchor = frame.getUpdatedPlanes()
                .firstOrNull { it.type == Plane.Type.HORIZONTAL_UPWARD_FACING }
                ?.let { frame.createAnchorOrNull(it.centerPose) }
        }
    }
) {
    anchor?.let {
        AnchorNode(anchor = it) {
            ModelNode(modelInstance = helmet, scaleToUnits = 0.5f)
        }
    }
}

Plane detected → anchor set → Compose recomposes → model appears. Clear anchor → node removed. AR state is just Kotlin state.

AR node types

| Node | What it does |
|---|---|
| AnchorNode | Follows a real-world anchor |
| AugmentedImageNode | Tracks a detected image |
| AugmentedFaceNode | Face mesh overlay |
| CloudAnchorNode | Persistent cross-device anchor |
| StreetscapeGeometryNode | Geospatial streetscape mesh |

Apple (iOS / macOS / visionOS)

Native Swift Package built on RealityKit. 17 node types.

SceneView(environment: .studio) {
    ModelNode(named: "helmet.usdz").scaleToUnits(1.0)
    GeometryNode.cube(size: 0.1, color: .blue).position(x: 0.5)
    LightNode.directional(intensity: 1000)
}
.cameraControls(.orbit)

AR on iOS:

ARSceneView(planeDetection: .horizontal) { position, arView in
    GeometryNode.cube(size: 0.1, color: .blue)
        .position(position)
}

Install: https://github.com/sceneview/sceneview-swift.git (SPM, from 3.3.0)


Architecture

Each platform uses its native renderer. Shared logic lives in KMP.

sceneview-core (Kotlin Multiplatform)
├── math, collision, geometry, physics, animation
│
├── sceneview (Android)      → Filament + Jetpack Compose
├── arsceneview (Android)    → ARCore
├── SceneViewSwift (Apple)   → RealityKit + SwiftUI
├── sceneview-web (Web)      → Filament.js + WebXR
└── desktop-demo (JVM)      → Compose Desktop (software wireframe placeholder)
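Because the math, collision, geometry, physics, and animation layers are plain Kotlin in a KMP module, the same logic compiles for every target. As an illustration of the kind of code such a core can share (names here are hypothetical, not the actual sceneview-core API), an axis-aligned bounding-box overlap test:

```kotlin
// Hypothetical example of platform-independent collision math, the kind
// of logic a KMP core module shares; not the actual sceneview-core API.
data class Aabb(
    val minX: Float, val minY: Float, val minZ: Float,
    val maxX: Float, val maxY: Float, val maxZ: Float
)

// Two AABBs overlap iff their intervals overlap on every axis.
fun intersects(a: Aabb, b: Aabb): Boolean =
    a.minX <= b.maxX && a.maxX >= b.minX &&
    a.minY <= b.maxY && a.maxY >= b.minY &&
    a.minZ <= b.maxZ && a.maxZ >= b.minZ

fun main() {
    val helmet = Aabb(-0.5f, 0f, -0.5f, 0.5f, 1f, 0.5f)
    val cube = Aabb(0.4f, 0.4f, 0.4f, 0.6f, 0.6f, 0.6f)
    println(intersects(helmet, cube))  // true
}
```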

Samples

| Sample | Platform | Run |
|---|---|---|
| samples/android-demo | Android | ./gradlew :samples:android-demo:assembleDebug |
| samples/android-tv-demo | Android TV | ./gradlew :samples:android-tv-demo:assembleDebug |
| samples/ios-demo | iOS | Open in Xcode |
| samples/web-demo | Web | ./gradlew :samples:web-demo:jsBrowserRun |
| samples/desktop-demo | Desktop | ./gradlew :samples:desktop-demo:run (wireframe placeholder, not SceneView) |
| samples/flutter-demo | Flutter | cd samples/flutter-demo && flutter run |
| samples/react-native-demo | React Native | See README |

AI integration

SceneView is on the MCP Registry — any AI assistant can use it to generate correct 3D/AR code.

npx sceneview-mcp

The MCP server provides API reference, code samples, setup guides, validation, and migration tools for all platforms.
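To use it from an MCP-capable assistant, register the command in the client's server configuration. A sketch of a typical entry, assuming the common mcpServers convention (the exact file and schema depend on the client):

```json
{
  "mcpServers": {
    "sceneview": {
      "command": "npx",
      "args": ["-y", "sceneview-mcp"]
    }
  }
}
```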


Links

Support

Reviews

No reviews yet
