Platform Support

Understanding platform requirements and compatibility for expo-ai-kit.


Overview

expo-ai-kit provides on-device AI capabilities by leveraging native platform frameworks. Currently, iOS is fully supported through Apple's Foundation Models framework, while Android support is in development.

Platform | Status      | Native Framework
iOS      | Supported   | Foundation Models (Apple Intelligence)
Android  | Beta        | ML Kit / Gemini Nano (planned)
Web      | Not Planned | —

iOS Support

Requirements

To use expo-ai-kit on iOS, you need:

  • iOS 26.0 or later — The Foundation Models API was introduced in iOS 26.0
  • Apple Intelligence enabled — Must be turned on in Settings > Apple Intelligence & Siri
  • Compatible hardware — See supported devices below
  • Xcode 16+ — Required for building

Supported Devices

Apple Intelligence requires specific hardware with Neural Engine capabilities:

Device            | Chip           | Status
iPhone 15 Pro     | A17 Pro        | Supported
iPhone 15 Pro Max | A17 Pro        | Supported
iPhone 16         | A18            | Supported
iPhone 16 Plus    | A18            | Supported
iPhone 16 Pro     | A18 Pro        | Supported
iPhone 16 Pro Max | A18 Pro        | Supported
iPad Pro (M1+)    | M1, M2, M4     | Supported
iPad Air (M1+)    | M1, M2         | Supported
Mac (M1+)         | M1, M2, M3, M4 | Supported

Older iPhones

iPhone 15 (non-Pro models), iPhone 14, and earlier devices do not support Apple Intelligence and cannot use expo-ai-kit's on-device features.

Checking Availability

Always check availability at runtime, as users may be on unsupported devices or have Apple Intelligence disabled:

import { isAvailable } from 'expo-ai-kit';
import { Platform } from 'react-native';

async function checkSupport() {
  // Quick platform check
  if (Platform.OS !== 'ios') {
    return { supported: false, reason: 'iOS only' };
  }

  // Check Apple Intelligence availability
  const available = await isAvailable();

  if (!available) {
    return {
      supported: false,
      reason: 'Apple Intelligence not available. Check device compatibility and settings.',
    };
  }

  return { supported: true };
}
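If you want to unit-test this decision logic without a device, you can factor the platform and availability checks into a pure helper. This is a minimal sketch; `describeSupport` is an illustrative name, not part of expo-ai-kit:

```typescript
// Hypothetical pure helper: the same decision logic as checkSupport(),
// but with the platform and availability results passed in as arguments
// so it can be exercised without react-native or a physical device.
type SupportResult = { supported: boolean; reason?: string };

function describeSupport(os: string, aiAvailable: boolean): SupportResult {
  if (os !== 'ios') {
    return { supported: false, reason: 'iOS only' };
  }
  if (!aiAvailable) {
    return {
      supported: false,
      reason: 'Apple Intelligence not available. Check device compatibility and settings.',
    };
  }
  return { supported: true };
}
```

In the app, this would be called as `describeSupport(Platform.OS, await isAvailable())`.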

Android Support

Android support is currently in development.

We're exploring integration with:

  • Google ML Kit — For on-device ML features
  • Gemini Nano — Google's on-device LLM for supported devices
  • MediaPipe LLM Inference — Cross-device LLM support

Android Timeline

Android support is planned for a future release. Join our GitHub discussions to stay updated on progress and provide feedback.

In the meantime, you can prepare your codebase by properly checking availability:

import { isAvailable } from 'expo-ai-kit';

// This will return false on Android until support is added
const available = await isAvailable();

if (!available) {
  // Use cloud AI fallback or show appropriate message
}

For Android-specific setup instructions (when available), see the Android Setup Guide.


Web Support

Web support is not planned for expo-ai-kit.

The library's core value is on-device AI processing, which requires native platform APIs. Web browsers don't provide equivalent APIs for running LLMs locally.

For web applications, consider:

  • Cloud-based AI services (OpenAI, Anthropic, Google AI)
  • WebGPU-based solutions for experimental local inference
  • Progressive enhancement: use cloud AI on web, on-device on mobile
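The progressive-enhancement option can be sketched as a pure provider-selection function. The names below (`pickProvider`, `cloudConfigured`) are hypothetical; in a real app, `onDeviceAvailable` would come from expo-ai-kit's `isAvailable()`:

```typescript
// Sketch of provider selection for progressive enhancement:
// cloud AI on web, on-device AI on mobile when available.
type Provider = 'on-device' | 'cloud' | 'none';

function pickProvider(
  os: 'ios' | 'android' | 'web',
  onDeviceAvailable: boolean,
  cloudConfigured: boolean,
): Provider {
  // Web never gets on-device AI, so it falls through to cloud (if configured).
  if (os !== 'web' && onDeviceAvailable) return 'on-device';
  if (cloudConfigured) return 'cloud';
  return 'none';
}
```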

Graceful Degradation

Design your app to handle cases where on-device AI isn't available. Here's a recommended pattern:

hooks/useAI.ts
import { useEffect, useState } from 'react';
import { isAvailable, createSession, sendMessage } from 'expo-ai-kit';

type AIProvider = 'on-device' | 'cloud' | 'none';

export function useAI() {
  const [provider, setProvider] = useState<AIProvider>('none');

  useEffect(() => {
    async function detectProvider() {
      const onDeviceAvailable = await isAvailable();

      if (onDeviceAvailable) {
        setProvider('on-device');
      } else if (CLOUD_AI_ENABLED) {
        // CLOUD_AI_ENABLED and cloudAI are app-specific placeholders,
        // not part of expo-ai-kit
        setProvider('cloud');
      } else {
        setProvider('none');
      }
    }
    detectProvider();
  }, []);

  const sendAIMessage = async (message: string) => {
    switch (provider) {
      case 'on-device': {
        const session = await createSession();
        const response = await sendMessage(session, { message });
        await session.close();
        return response.text;
      }

      case 'cloud':
        return await cloudAI.complete(message);

      default:
        throw new Error('No AI provider available');
    }
  };

  return { provider, sendMessage: sendAIMessage };
}

Best Practice

Always provide a fallback experience. Not all users will have compatible devices, and some may have AI features disabled. Consider cloud AI as a fallback, or design features that work without AI.