Stream AI responses using iOS 26 FoundationModels with NativeScript

Learn how to stream AI responses with on-device LLMs, powered by Apple Intelligence and FoundationModels using NativeScript.

Nathan Walker

With iOS 26, Apple introduced the FoundationModels framework.

The Foundation Models framework provides access to Apple’s on-device large language model that powers Apple Intelligence to help you perform intelligent tasks specific to your use case. The text-based on-device model identifies patterns that allow for generating new text that’s appropriate for the request you make, and it can make decisions to call code you write to perform specialized tasks.

Let's learn how to stream AI responses with on-device LLMs, powered by Apple Intelligence and FoundationModels using NativeScript.

Prepare our user interface

We can set up a simple user interface allowing the user to input a prompt, which we'll process via FoundationModels.

This will render a text input box for the prompt, a button to process it and a TextView to show our streamed AI responses.

html
<GridLayout rows="auto,auto,*" class="p-4">
  <TextView
    [(ngModel)]="userInput"
    hint="Ask AI something..."
    [isUserInteractionEnabled]="!processing()"
  ></TextView>
  <GridLayout row="1" class="my-4">
    <button
      [text]="processing() ? 'Processing...' : 'Ask AI'"
      (tap)="fetchAIResponse()"
      class="rounded-full text-white text-lg p-4"
      [ngClass]="{
        'bg-blue-500': !processing(),
        'bg-gray-400': processing()
      }"
    ></button>
    @if (processing()) {
    <ActivityIndicator busy="true" class="align-middle h-left ml-6 text-white" />
    }
  </GridLayout>
  <TextView row="2" class="leading-[4]" (loaded)="loadedResponseTextView($event)"></TextView>
</GridLayout>
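For context, the template binds to a few component members not shown yet: the userInput and processing signals, plus a loaded handler that captures the response TextView. A minimal sketch of that backing component (the selector, file names, and module imports here are illustrative assumptions, not taken from the sample) could look like this:

ts
import { Component, NO_ERRORS_SCHEMA, signal } from '@angular/core';
import { NativeScriptCommonModule, NativeScriptFormsModule } from '@nativescript/angular';
import { EventData, TextView } from '@nativescript/core';

@Component({
  selector: 'ns-ask-ai',
  templateUrl: './ask-ai.component.html',
  imports: [NativeScriptCommonModule, NativeScriptFormsModule],
  schemas: [NO_ERRORS_SCHEMA],
})
export class AskAiComponent {
  // bound via [(ngModel)] and used to drive the button state
  userInput = signal('');
  processing = signal(false);

  // captured once the response TextView loads so we can assign attributedText later
  responseTextView: TextView;

  loadedResponseTextView(args: EventData) {
    this.responseTextView = args.object as TextView;
  }

  // fetchAIResponse() and setResponseData() from the sections below live here too
}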

Set up our FoundationModels utility

We can add a small AI.swift utility right next to our TypeScript source, providing a method that sends our user prompt along with a callback to receive a stream of AI-generated responses.

Since this uses iOS 26-specific APIs, we make sure to guard it with the canImport and @available specifiers.

  • AI.swift
swift
import Foundation
#if canImport(FoundationModels)
import FoundationModels
#endif

@objcMembers
public class AI: NSObject {
    public static let shared = AI()

    public func streamResponseFor(_ prompt: String, _ completion: @escaping (String?) -> Void) {
        #if canImport(FoundationModels)
        if #available(iOS 26.0, *) {
            let session = LanguageModelSession()
            Task {
                do {
                    let stream = session.streamResponse(to: prompt)
                    for try await chunk in stream {
                        // Notify immediately on the MainActor (e.g. to update UI)
                        await MainActor.run { completion(chunk) }
                    }
                } catch {
                    print("Error generating text: \(error)")
                    completion(nil)
                }
            }
        }
        #endif
    }
}

Allow NativeScript to include co-located platform source

Introduced in NativeScript 8.9, the NativeSource option in nativescript.config lets us group together any co-located native platform source files sitting right next to our TypeScript source.

  • nativescript.config.ts
ts
import { NativeScriptConfig } from '@nativescript/core';

export default {
  // ...
  ios: {
    NativeSource: [
      {
        name: 'PlatformNativeSrc',
        path: '**/*.swift'
      }
    ]
  }
} as NativeScriptConfig;

Wire up our user interface

Just like above, we take special care to ensure these features are only enabled where available: on iOS 26 and up.

This takes our user input and processes it against Apple's FoundationModels, which streams responses back through the callback.

We also add a neat ability to render the response as Markdown right from TypeScript. Thanks, NativeScript!

Tip: Retain strong types for AI.shared by invoking ns typings ios 💯
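If you skip generating typings, a hand-written declaration keeps TypeScript happy in the meantime; the shape below is an approximation of what ns typings ios would produce, so treat the generated typings as the source of truth:

ts
// Approximate declaration for the AI.swift class shown above;
// the typings generated by `ns typings ios` are authoritative.
declare class AI extends NSObject {
  static readonly shared: AI;
  streamResponseFor(prompt: string, completion: (response: string | null) => void): void;
}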

ts
fetchAIResponse() {
  // Only available on Apple devices running iOS 26+
  if (__APPLE__ && Utils.SDK_VERSION >= 26) {
    this.processing.set(true);
    AI.shared.streamResponseFor(this.userInput(), (response) => {
      // each streamed chunk arrives here on the main thread (see MainActor.run above)
      this.processing.set(false);
      this.setResponseData(response || "");
    });
  }
}

setResponseData(response: string) {
  if (this.responseTextView) {
    // parsing responses into rendered markdown
    const options = NSAttributedStringMarkdownParsingOptions.alloc().init();
    options.interpretedSyntax =
      NSAttributedStringMarkdownInterpretedSyntax.InlineOnlyPreservingWhitespace;
    const attr =
      NSMutableAttributedString.alloc().initWithMarkdownStringOptionsBaseURLError(
        response,
        options,
        null
      );
    // apply a consistent base font across the full attributed response
    const fullRange = NSRangeFromString(`{0,${attr.length}}`);
    const font = UIFont.systemFontOfSize(16);
    attr.addAttributeValueRange(NSFontAttributeName, font, fullRange);
    this.responseTextView.ios.attributedText = attr;
  }
}
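As a small aside on the range construction above, NativeScript can marshal a plain JavaScript object literal into an NSRange struct parameter, so the NSRangeFromString round-trip is optional; a sketch of the equivalent lines inside setResponseData:

ts
// equivalent to NSRangeFromString(`{0,${attr.length}}`), using attr and font from above
const fullRange = { location: 0, length: attr.length };
attr.addAttributeValueRange(NSFontAttributeName, font, fullRange);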

Sample Repo

You can clone this repo to try it for yourself!

