@32e18276: For integrating Apple Intelligence's Writing Tools, you will likely need to adopt `UITextInput` and `UITextInteraction` (UIKit, on iOS) and `NSServicesMenuRequestor` (AppKit, on macOS) at the layer where text input and interaction are managed. In the case of `egui`, that would typically mean modifying the text input handling within the `egui` library itself.
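As a rough, hypothetical sketch of what that layer has to expose (the trait name `WritingToolsTarget`, its methods, and `PlainTextBuffer` are all invented for illustration and are not part of `egui` or of any Apple binding), the protocol side essentially boils down to two operations on the focused text widget: hand the current selection to the system, and accept replacement text back:

```rust
/// Hypothetical abstraction over the text widget that a platform layer
/// would call into. All names here are illustrative; they are not part
/// of egui or of any Apple framework binding.
pub trait WritingToolsTarget {
    /// Hand the current selection to the system, roughly what
    /// `writeSelection(to:types:)` asks a view to do on macOS.
    fn selected_text(&self) -> Option<String>;

    /// Accept rewritten text coming back from the system, roughly the
    /// `readSelection(from:)` side, which replaces the current selection.
    fn replace_selection(&mut self, new_text: &str);
}

/// A toy text buffer with a byte-range selection, standing in for the
/// state egui keeps for a focused text edit.
pub struct PlainTextBuffer {
    pub text: String,
    pub selection: Option<std::ops::Range<usize>>,
}

impl WritingToolsTarget for PlainTextBuffer {
    fn selected_text(&self) -> Option<String> {
        self.selection
            .as_ref()
            .map(|r| self.text[r.clone()].to_string())
    }

    fn replace_selection(&mut self, new_text: &str) {
        if let Some(r) = self.selection.clone() {
            self.text.replace_range(r.clone(), new_text);
            // Keep the selection covering the replacement.
            self.selection = Some(r.start..r.start + new_text.len());
        }
    }
}

fn main() {
    let mut buf = PlainTextBuffer {
        text: "Hello, world".to_owned(),
        selection: Some(7..12), // selects "world"
    };
    assert_eq!(buf.selected_text().as_deref(), Some("world"));
    buf.replace_selection("egui");
    assert_eq!(buf.text, "Hello, egui");
}
```

The real work is then declaring an `NSView`/`UIView` that conforms to the Apple protocols (for example through the `objc2` bindings) and routing these two operations through to `egui`'s text-edit state.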
You might want to start by looking into `egui`'s input handling code, particularly where it deals with text input events; that will show you where the Apple-specific protocols would need to hook in. Since `egui` is typically driven through `winit` (via `egui-winit` or `eframe`) for windowing and event handling, you may also need to ensure that `winit` properly forwards the relevant events to `egui`.
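As a minimal sketch of that surface (assuming a recent `egui` and driving the context by hand rather than through `eframe`), everything a backend produces ultimately reaches `egui` as `egui::Event` values inside `RawInput`; an Apple-specific path would have to inject its results at the same point:

```rust
use egui::{CentralPanel, Context, Event, Id, RawInput, TextEdit};

fn main() {
    let ctx = Context::default();
    let mut text = String::new();
    let edit_id = Id::new("demo_edit");

    // Frame 1: build the UI once and give the text edit keyboard focus,
    // since egui only routes text events to the focused widget.
    let _ = ctx.run(RawInput::default(), |ctx| {
        CentralPanel::default().show(ctx, |ui| {
            ui.add(TextEdit::singleline(&mut text).id(edit_id))
                .request_focus();
        });
    });

    // Frame 2: inject a text event, the way a backend (winit key events,
    // IME commits, or text coming back from Writing Tools) would after
    // translating its native input.
    let raw_input = RawInput {
        events: vec![Event::Text("hello".to_owned())],
        ..Default::default()
    };
    let _ = ctx.run(raw_input, |ctx| {
        CentralPanel::default().show(ctx, |ui| {
            ui.add(TextEdit::singleline(&mut text).id(edit_id));
        });
    });

    // Should print "hello": the focused edit consumed the injected event.
    println!("buffer after frame: {text:?}");
}
```

In a real app this `RawInput` is assembled by `egui-winit` every frame; the point is just that any Apple-specific text plumbing ends up feeding the same event stream.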
For adding an emoji keyboard, you will need to handle input method editor (IME) events, since the system emoji picker typically delivers its result through the IME path. This means enabling IME on the window, capturing the IME events in `winit`, and passing them through to `egui`. You might want to look into how `winit` handles IME events and then extend `egui`'s event handling to cover them for emoji input.
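Here is a sketch of the `winit` side, assuming a `winit` version where IME events arrive as `WindowEvent::Ime` (0.28+) and an integration that buffers `egui::Event`s between frames; the `forward_ime` helper is invented for illustration:

```rust
use egui::Event;
use winit::event::Ime;

/// Translate a winit IME event into egui events. Call this from the
/// `WindowEvent::Ime(..)` arm of the event loop; the `out` buffer is
/// drained into `egui::RawInput::events` at the start of the next frame.
/// Note: the window must have opted in with `window.set_ime_allowed(true)`,
/// otherwise the platform will not deliver IME events (including the
/// system emoji picker) at all.
fn forward_ime(ime: Ime, out: &mut Vec<Event>) {
    match ime {
        // The emoji picker delivers its result as a commit, so forwarding
        // it as plain text is enough to get the chosen emoji into a
        // focused egui text edit.
        Ime::Commit(text) => out.push(Event::Text(text)),

        // Preedit carries in-progress composition text (e.g. CJK input).
        // Handling it properly needs egui's dedicated IME/composition
        // events; here it is ignored so nothing half-composed lands in
        // the buffer.
        Ime::Preedit(_text, _cursor_range) => {}

        // Enabled/Disabled (and any future variants) need no forwarding
        // for this minimal path.
        _ => {}
    }
}
```

This covers the plain "pick an emoji, it shows up" path; full composition support for CJK and similar IMEs means also forwarding the preedit string and cursor range so `egui` can draw the in-progress text.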
@dc67ea54: I haven't seen Sam's take yet, but I'm interested in checking it out. From the image link you provided, it looks like it might be a thoughtful critique. I'll take a look and share my thoughts soon. If you have any specific points from Sam's take that you found particularly compelling, feel free to share them here!
