Frame Processors
What are frame processors?
Frame processors are functions that are written in JavaScript (or TypeScript) which can be used to process frames the camera "sees". Inside those functions you can call Frame Processor Plugins, which are high performance native functions specifically designed for certain use-cases.
For example, you might want to create a Hotdog/Not Hotdog detector app without writing any native code, while still achieving native performance:
function App() {
  const frameProcessor = useFrameProcessor((frame) => {
    'worklet'
    const isHotdog = detectIsHotdog(frame)
    console.log(isHotdog ? "Hotdog!" : "Not Hotdog.")
  }, [])

  return (
    <Camera
      {...cameraProps}
      frameProcessor={frameProcessor}
    />
  )
}
Frame Processors are by no means limited to Hotdog detection; other examples include:
- AI for facial recognition
- AI for object detection
- Using Tensorflow, MLKit Vision, Apple Vision or other libraries
- Creating realtime video-chats using WebRTC to directly send the camera frames over the network
- Creating scanners for QR codes, Barcodes or even custom codes such as Snapchat's SnapCodes or Apple's AppClips
- Creating Snapchat-like filters, e.g. drawing a dog-mask filter over the user's face
- Creating color filters with depth-detection
- Drawing boxes, text, overlays, or colors on the screen in realtime
- Rendering filters and shaders such as Blur, inverted colors, beauty filter, or more on the screen
Because they are written in JS, Frame Processors are simple, powerful, extensible and easy to create while still running at native performance. (Frame Processors can run up to 1000 times a second!) Also, you can use fast-refresh to quickly see changes while developing or publish over-the-air updates to tweak the Hotdog detector's sensitivity in live apps without pushing a native update.
Frame Processors require react-native-worklets-core 1.0.0 or higher.
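If react-native-worklets-core is not installed yet, a minimal setup sketch looks like this; the Babel plugin entry is taken from the react-native-worklets-core README, so check that README for the exact steps for your version:
npm i react-native-worklets-core
// babel.config.js
module.exports = {
  // ...keep your existing presets and plugins
  plugins: [
    ['react-native-worklets-core/plugin'],
  ],
}
Then rebuild the app (and reset Metro's cache if the 'worklet' directive isn't picked up).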
Interacting with Frame Processors
Since Frame Processors run in Worklets, you can directly use JS values such as React state:
// can be controlled with a Slider
const [sensitivity, setSensitivity] = useState(0.4)

const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  const isHotdog = detectIsHotdog(frame, sensitivity)
  console.log(isHotdog ? "Hotdog!" : "Not Hotdog.")
}, [sensitivity])
You can also easily read from, and assign to, Shared Values to create smooth, 60 FPS animations. In this example, we detect a cat in the frame - if a cat was found, we assign the catBounds Shared Value to the coordinates of the cat (relative to the screen), which we can then use in a useAnimatedStyle hook to position the red rectangle surrounding the cat. This updates in realtime on the UI Thread, and can also be smoothly animated with withSpring or withTiming.
// represents position of the cat on the screen 🐈
const catBounds = useSharedValue({ top: 0, left: 0, right: 0, bottom: 0 })

// continuously sets 'catBounds' to the cat's coordinates on screen
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  catBounds.value = scanFrameForCat(frame)
}, [catBounds])

// uses 'catBounds' to position the red rectangle on screen.
// smoothly updates on UI thread whenever 'catBounds' is changed
const boxOverlayStyle = useAnimatedStyle(() => ({
  position: 'absolute',
  borderWidth: 1,
  borderColor: 'red',
  ...catBounds.value
}), [catBounds])

return (
  <View>
    <Camera {...cameraProps} frameProcessor={frameProcessor} />
    {/* draws a red rectangle on the screen which surrounds the cat */}
    <Reanimated.View style={boxOverlayStyle} />
  </View>
)
And you can also call back to the React-JS thread by using createRunInJsFn(...):
const onQRCodeDetected = Worklets.createRunInJsFn((qrCode: string) => {
  navigation.push("ProductPage", { productId: qrCode })
})

const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  const qrCodes = scanQRCodes(frame)
  if (qrCodes.length > 0) {
    onQRCodeDetected(qrCodes[0])
  }
}, [onQRCodeDetected])
Running asynchronously
Since Frame Processors run synchronously with the Camera Pipeline, anything taking longer than one Frame interval might block the Camera from streaming new Frames. To avoid this, you can use runAsync to run code asynchronously on a different Thread:
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  console.log("I'm running synchronously at 60 FPS!")
  runAsync(() => {
    'worklet'
    console.log("I'm running asynchronously, possibly at a lower FPS rate!")
  })
}, [])
Running at a throttled FPS rate
Some Frame Processor Plugins don't need to run on every Frame, for example a Frame Processor that detects the brightness in a Frame only needs to run twice per second:
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  console.log("I'm running synchronously at 60 FPS!")
  runAtTargetFps(2, () => {
    'worklet'
    console.log("I'm running synchronously at 2 FPS!")
  })
}, [])
Using Frame Processor Plugins
Frame Processor Plugins are distributed through npm. To install the vision-camera-image-labeler plugin, run:
npm i vision-camera-image-labeler
cd ios && pod install
That's it! 🎉 Now you can use it:
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  const labels = labelImage(frame)
  // ...
}, [])
Check out Frame Processor community plugins to discover plugins, or start creating a plugin yourself!
Selecting a Format for a Frame Processor
When running frame processors, it is often important to choose an appropriate format. Here are some general tips to consider:
- If you are running heavy AI/ML calculations in your frame processor, make sure to select a format with a lower resolution to improve performance.
- Sometimes a frame processor plugin only works with specific pixel formats. Some plugins (like MLKit) don't work with x420.
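For example, putting both tips together, a rough sketch might look like the following. Note that useCameraDevice, useCameraFormat and the pixelFormat prop are assumptions based on newer VisionCamera versions, so adapt this to the API of the version you are using:
const device = useCameraDevice('back')
// request a modest video resolution so heavy plugins have less data to process
const format = useCameraFormat(device, [
  { videoResolution: { width: 1280, height: 720 } },
])

return (
  <Camera
    {...cameraProps}
    device={device}
    format={format}
    pixelFormat="yuv" // some plugins (like MLKit) don't work with x420
    frameProcessor={frameProcessor}
  />
)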
Benchmarks
Frame Processors are really fast. I have used MLKit Vision Image Labeling to label 4k Camera frames in realtime, and measured the following results:
- Fully natively (written in pure Objective-C, no React interaction at all), I have measured an average of 68ms per call.
- As a Frame Processor Plugin (written in Objective-C, called through a JS Frame Processor function), I have measured an average of 69ms per call.
This means that the Frame Processor API only takes ~1ms longer than a fully native implementation, making it the fastest and easiest way to run any sort of Frame Processing in React Native.
All measurements were recorded on an iPhone 11 Pro by benchmarking the total execution time of the captureOutput function using CFAbsoluteTimeGetCurrent. Running smaller images (lower than 4k resolution) is much quicker and many algorithms can even run at 60 FPS.
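If you want a rough number from the JS side instead, one option is to time the plugin call inside the worklet. This is only a sketch - it assumes Date.now() is available in the Worklet runtime and has millisecond precision, so average the result over many frames:
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  const start = Date.now()
  const labels = labelImage(frame)
  // log the per-call duration; average over many frames for a usable number
  console.log(`labelImage took ${Date.now() - start}ms`)
}, [])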
Avoiding Frame-drops
Frame Processors will be synchronously called for each frame the Camera sees and have to finish executing before the next frame arrives, otherwise the next frame(s) will be dropped. At a frame rate of 30 FPS, you have about 33ms to finish processing each frame. For a QR Code Scanner, 5 FPS (200ms) might suffice, while an object-tracking AI might run at the same frame rate as the Camera itself (e.g. 60 FPS (16ms)).
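For example, a QR Code Scanner that only needs 5 FPS can combine this with runAtTargetFps (scanQRCodes stands in for whatever scanner plugin you use):
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  // the Frame Processor itself still runs for every frame,
  // but the expensive scan only runs roughly every 200ms
  runAtTargetFps(5, () => {
    'worklet'
    const qrCodes = scanQRCodes(frame)
    if (qrCodes.length > 0) {
      console.log(`Detected QR code: ${qrCodes[0]}`)
    }
  })
}, [])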
ESLint react-hooks plugin
If you are using the react-hooks ESLint plugin, make sure to add useFrameProcessor to additionalHooks inside your ESLint config. (See "advanced configuration")
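For example, in a .eslintrc.js this could look like the following (additionalHooks is a regex string passed to the exhaustive-deps rule):
// .eslintrc.js
module.exports = {
  plugins: ['react-hooks'],
  rules: {
    'react-hooks/exhaustive-deps': [
      'error',
      { additionalHooks: '(useFrameProcessor)' },
    ],
  },
}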
Technical
Frame Processors
Frame Processors are JS functions that will be workletized using react-native-worklets-core. They are created on a parallel camera thread using a separate JavaScript Runtime ("VisionCamera JS-Runtime") and are invoked synchronously (using JSI) without ever going over the bridge. In a Frame Processor you can write normal JS code, call back to the React-JS Thread (e.g. setState), use Shared Values and call Frame Processor Plugins.
Frame Processor Plugins
Frame Processor Plugins are native functions (written in Objective-C, Swift, C++, Java or Kotlin) that are injected into the VisionCamera JS-Runtime. They can be synchronously called from your JS Frame Processors (using JSI) without ever going over the bridge. Because VisionCamera provides an easy-to-use plugin API, you can easily create a Frame Processor Plugin yourself. Some examples include Barcode Scanning, Face Detection, Image Labeling, Text Recognition and more.
Learn how to create Frame Processor Plugins, or check out the example Frame Processor Plugin for iOS or Android.
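On the JS side, such a plugin is typically exposed through a small wrapper worklet. As a rough sketch - assuming the VisionCameraProxy.initFrameProcessorPlugin API of newer VisionCamera versions and a hypothetical plugin registered natively under the name 'detectFaces' - this could look like:
import { VisionCameraProxy, Frame } from 'react-native-vision-camera'

// look up the native plugin that was registered under the name 'detectFaces'
const plugin = VisionCameraProxy.initFrameProcessorPlugin('detectFaces', {})

export function detectFaces(frame: Frame): object[] {
  'worklet'
  if (plugin == null) throw new Error('Failed to load the "detectFaces" plugin!')
  // synchronously calls into the native (Swift/Kotlin/C++) implementation via JSI
  return plugin.call(frame) as object[]
}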
The Frame object
The Frame Processor gets called with a Frame object, which is a JSI HostObject. It holds a reference to the native (C++) Frame Image Buffer (~10 MB in size) and exposes properties such as width, height, bytesPerRow and more to JavaScript so you can synchronously access them. The Frame object can be passed around in JS, as well as returned from and passed to a native Frame Processor Plugin.
See this tweet for more information.
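For example, you can read these properties synchronously right inside a Frame Processor:
const frameProcessor = useFrameProcessor((frame) => {
  'worklet'
  // width, height and bytesPerRow are exposed by the Frame HostObject
  console.log(`Frame: ${frame.width}x${frame.height} (${frame.bytesPerRow} bytes per row)`)
}, [])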
Disabling Frame Processors
The Frame Processor API spawns a secondary JavaScript Runtime which consumes a small amount of extra CPU and RAM. Additionally, compile time increases since the Frame Processor runtime is implemented in native C++. If you're not using Frame Processors at all, you can disable them:
React Native
Android
Inside your gradle.properties file, add the disableFrameProcessors flag:
VisionCamera_disableFrameProcessors=true
Then, clean and rebuild your project.
iOS
Inside your Podfile, add the VCDisableFrameProcessors flag:
$VCDisableFrameProcessors = true
Expo
Inside your Expo config (app.json, app.config.json or app.config.js), add the disableFrameProcessors flag to the react-native-vision-camera plugin:
{
  "name": "my app",
  "plugins": [
    [
      "react-native-vision-camera",
      {
        // ...
        "disableFrameProcessors": true
      }
    ]
  ]
}