In this blog post, I’ll explain how I created a tech demo in which a running Flutter application shows up live inside the macOS dock icon.

I have to confess, though, that it’s actually not rendered in the app icon. Instead, the view of the Flutter application is captured and streamed to the app icon.

The basics

This section covers the basics. If you’re already familiar with MethodChannels, RepaintBoundaries, and macOS’s NSDockTile API, you can skip ahead.

MethodChannels are the mechanism for communicating between Dart code and the underlying native code, in this case Swift. I won’t cover how exactly they work, because that’s already explained in depth in the Flutter docs. We need a MethodChannel to send the screen of the Flutter application to the native code.
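As a minimal sketch of the Dart side, sending some bytes over a MethodChannel looks roughly like this. The channel and method name 'sync' are the ones used later in this post; sendToNative is just an illustrative helper.

import 'dart:typed_data';

import 'package:flutter/services.dart';

// The name must match the channel name the native side registers a handler for.
const channel = MethodChannel('sync');

// Sends raw bytes to the native side; what happens with them is up to the native handler.
Future<void> sendToNative(Uint8List bytes) async {
  await channel.invokeMethod('sync', bytes);
}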

Capturing the screen of the Flutter application is done via a RepaintBoundary. This nifty widget allows you to capture its child widgets as an image. If you capture them every frame, it looks like a video stream of the application. The Flutter YouTube channel made a good video about it.
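Here’s a minimal sketch of such a capture, assuming the RepaintBoundary in the widget tree was created with a GlobalKey; the full wiring follows further down, and repaintKey and captureFrame are just illustrative names.

import 'dart:ui' as ui;

import 'package:flutter/rendering.dart';
import 'package:flutter/widgets.dart';

// The key the RepaintBoundary in the widget tree is created with.
final repaintKey = GlobalKey();

// Grabs whatever the RepaintBoundary currently shows as an image.
Future<ui.Image> captureFrame() async {
  final boundary =
      repaintKey.currentContext!.findRenderObject() as RenderRepaintBoundary;
  return boundary.toImage();
}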

The last important part is the NSDockTile API from macOS. (It’s not from Flutter.) This API gives you access to various things around the dock tile for the app. We’re just interested in setting the content of the dock tile.

Basically, it boils down to the following Swift code:

// Create an image view
let imageView = NSImageView()
// Set an image ("byte" is a [UInt8] holding the PNG-encoded image data, more on that later)
imageView.image = NSImage(data: Data(byte))

// Set the image view as the content of the dock tile
NSApp.dockTile.contentView = imageView

// Force the dock tile to update
NSApp.dockTile.display()

People do all kinds of crazy stuff with it; I’m not the only one who’s “misusing” it.

Now that we’ve covered all the APIs and how they work, we can put them together.

Putting it all together

First, we’ll take a look at the native macOS API, since that’s where we can render something into the app icon in the dock. To render an image to the app icon, you can do something like the following code snippet.

let imageView = NSImageView() 
imageView.image = NSImage(data: Data(byte)) 
NSApp.dockTile.contentView = imageView 
NSApp.dockTile.display() 

Okay, but the Twitter post showed a moving application. That’s because I continuously captured the content of the Flutter application. To do that, we need to record the screen of the Flutter application.

First, we’ll take the good old Flutter counter example and adapt it to our needs. We’re going to wrap the main content in a RepaintBoundary, because it can screenshot its child widgets, and give it a global key. Later, we can access the RepaintBoundary through that global key.

final _screenshareKey = GlobalKey();

class MyApp extends StatelessWidget {
  const MyApp({super.key});

  @override
  Widget build(BuildContext context) {
    return RepaintBoundary(
      key: _screenshareKey,
      child: MaterialApp(
        title: 'Flutter Demo',
        theme: ThemeData(
          primarySwatch: Colors.blue,
        ),
        home: const MyHomePage(title: 'Flutter Demo Home Page'),
      ),
    );
  }
}

Next, we need to continuously take screenshots in order to send them to the app icon. The continuous capturing is done with a persistent frame callback, which is called once for each frame. In that callback, we take a screenshot of the application.

// Register a callback that is called once per frame for the lifetime of the app.
WidgetsBinding.instance.addPersistentFrameCallback((timeStamp) {
  _syncToDockIcon();
});

Future<void> _syncToDockIcon() async {
  // Look up the RepaintBoundary via its global key...
  final renderObject = _screenshareKey.currentContext!.findRenderObject()
      as RenderRepaintBoundary;

  // ...and capture its current contents as an image.
  final image = await renderObject.toImage();
  // ...
}

Next, we need to get the images to the native side of the Flutter application somehow. For that, we’re going to use a MethodChannel. We also need to convert the screenshot into an image format that the native side understands. Putting all of that together looks like the code below.

const _channel = MethodChannel('sync');

Future<void> _syncToDockIcon() async {
  final renderObject = _screenshareKey.currentContext!.findRenderObject()
      as RenderRepaintBoundary;

  final image = await renderObject.toImage();
  // Encode the frame as PNG so the native side can decode it into an NSImage.
  final byteData = await image.toByteData(format: ImageByteFormat.png);
  final imageBytes = byteData!.buffer.asUint8List();
  // Send the PNG bytes over the MethodChannel to the native side.
  await _channel.invokeMethod('sync', imageBytes);
}

That’s it for the Flutter side. Next stop, the native side.

We’ll need to adapt the MainFlutterWindow class to receive the MethodChannel messages from the Flutter side and to present them in the app icon. We register a handler for the MethodChannel and receive the image bytes whenever it is called. The image bytes are used to create an NSImageView, which is then attached as the content of the app icon in the dock.

import Cocoa
import FlutterMacOS

class MainFlutterWindow: NSWindow {
  override func awakeFromNib() {
    // The usual init code is left out, and only the parts important for this blog post are shown.
    // `controller` is the FlutterViewController created in that omitted init code.

    // Register at the MethodChannel
    let channel = FlutterMethodChannel(name: "sync", binaryMessenger: controller.engine.binaryMessenger)
    channel.setMethodCallHandler(handleMessage)

    super.awakeFromNib()
  }

  // This method is called when a method is called on the Flutter side
  private func handleMessage(call: FlutterMethodCall, result: FlutterResult) {
    if call.method != "sync" {
      fatalError("Unexpected method call: \(call.method)")
    }
    // The argument contains the PNG-encoded screenshot sent from the Dart side
    let imageBytes = call.arguments as! FlutterStandardTypedData
    let byte = [UInt8](imageBytes.data)
    // Show the screenshot in the dock tile
    let imageView = NSImageView()
    imageView.image = NSImage(data: Data(byte))
    NSApp.dockTile.contentView = imageView
    NSApp.dockTile.display()
    // Let the Dart side know the call completed
    result(nil)
  }
}

That’s it. As you can see, there’s no magic involved. Just some clever combination of various APIs.

You can find the whole source code at https://github.com/ueman/flutter_in_dock.

Originally published on Medium.