CHANGELOG.md

1.0.2 (November 5, 2021)

Changes

  • Moved @types/node to devDependencies.
  • Fixed an issue where twilio-video-processors was throwing an exception in server-side rendering applications.

1.0.1 (July 12, 2021)

Bug Fixes

  • Fixed an issue where the following internal classes and interfaces were being exported.
    • BackgroundProcessor
    • BackgroundProcessorOptions
    • GrayscaleProcessor
    • Processor
    • Dimensions

1.0.0 (June 24, 2021)

1.0.0-beta.3 has been promoted to 1.0.0 GA. Twilio Video Processors will use Semantic Versioning 2.0.0 for all future changes. This release also includes the following new features and improvements.

  • Added the isSupported API, which can be used to check whether the browser is supported. This API returns true for Chromium-based desktop browsers.

    import { isSupported } from '@twilio/video-processors';
    
    if (isSupported) {
      // Initialize the background processors
    }
  • The processFrame method signature of GaussianBlurBackgroundProcessor and VirtualBackgroundProcessor has been updated to improve performance. With this update, the output frame buffer must now be provided to processFrame, which draws the processed frame onto it (see the sketch after this list).

    Old signature:

    processFrame(inputFrame: OffscreenCanvas)
      : Promise<OffscreenCanvas | null>
      | OffscreenCanvas | null;

    New signature:

    processFrame(inputFrameBuffer: OffscreenCanvas, outputFrameBuffer: HTMLCanvasElement)
      : Promise<void> | void;
  • The segmentation model has been changed from MLKit Selfie Segmentation to MediaPipe Selfie Segmentation Landscape to improve performance.

  • Added debounce logic on the image resizing step to improve performance.
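
As an illustration of the new processFrame signature, here is a minimal sketch of invoking it directly; in normal use twilio-video calls processFrame for you once the processor is attached to a track, and the canvas dimensions below are placeholders.

    import { GaussianBlurBackgroundProcessor } from '@twilio/video-processors';

    // With the new signature the caller supplies both buffers; the processor
    // draws the processed frame onto outputFrameBuffer instead of returning a
    // new OffscreenCanvas.
    async function processOneFrame(processor: GaussianBlurBackgroundProcessor): Promise<void> {
      const inputFrameBuffer = new OffscreenCanvas(640, 480);
      const outputFrameBuffer = document.createElement('canvas');
      outputFrameBuffer.width = 640;
      outputFrameBuffer.height = 480;
      await processor.processFrame(inputFrameBuffer, outputFrameBuffer);
    }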

1.0.0-beta.3 (May 25, 2021)

Improvements

  • The VideoProcessors now use WebAssembly to run TensorFlow Lite for faster and more accurate person segmentation. You need to deploy the tflite model and binaries so the library can load them properly (see the sketch after this list). Additionally, this improvement requires Chrome's WebAssembly SIMD support in order to achieve the best performance. WebAssembly SIMD can be turned on by visiting chrome://flags on versions 84 through 90, and it will be enabled by default on Chrome 91+. You can also enable it for your users on versions 84 through 90, without asking them to turn on the flag, by registering your website for a Chrome Origin Trial.

  • The segmentation model has been changed from BodyPix to MLKit Selfie Segmentation to improve segmentation accuracy.
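
For example, a minimal sketch of pointing the library at the deployed assets, assuming the processor constructor accepts an assetsPath option and exposes a loadModel() method as in the 1.x API; the path below is a placeholder for wherever you host the files.

    import { GaussianBlurBackgroundProcessor } from '@twilio/video-processors';

    async function createBlurProcessor(): Promise<GaussianBlurBackgroundProcessor> {
      const processor = new GaussianBlurBackgroundProcessor({
        // Placeholder path: copy the tflite model and wasm binaries shipped
        // with the package to a publicly served folder and reference it here.
        assetsPath: '/twilio-video-processors/assets',
      });

      // Fetches and initializes the segmentation model and wasm binaries
      // from assetsPath before the processor can be attached to a track.
      await processor.loadModel();
      return processor;
    }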

1.0.0-beta.2 (April 16, 2021)

Improvements

  • The background processors now stabilize the boundary of the foreground (person), thereby reducing the 'shakiness' effect.

1.0.0-beta.1 (March 31, 2021)

Background Processors (Desktop Chrome only)

You can now use GaussianBlurBackgroundProcessor to apply a Gaussian blur filter on the background of a video frame, or use VirtualBackgroundProcessor to replace the background with a given image.

import { createLocalVideoTrack } from 'twilio-video';
import {
  GaussianBlurBackgroundProcessor,
  VirtualBackgroundProcessor,
} from '@twilio/video-processors';

const blurBackground = new GaussianBlurBackgroundProcessor();

// The virtual background processor needs a fully loaded image element,
// so construct it once the background image has finished loading.
const img = new Image();
let virtualBackground;
img.onload = () => {
  virtualBackground = new VirtualBackgroundProcessor({ backgroundImage: img });
};
img.src = '/background.jpg';

// A video track supports one processor at a time, so remove the current
// processor (if any) before adding the new one.
const setProcessor = (track, processor) => {
  if (track.processor) {
    track.removeProcessor(track.processor);
  }
  track.addProcessor(processor);
};

createLocalVideoTrack({
  width: 640,
  height: 480
}).then(track => {
  document.getElementById('preview').appendChild(track.attach());
  document.getElementById('blur-bg').onclick = () => setProcessor(track, blurBackground);
  document.getElementById('virtual-bg').onclick = () => setProcessor(track, virtualBackground);
});