
Playing fMP4 Video Streams Transported over WebSocket

To install all solutions at once, see the full solution installation guide.

How It Works

  1. Receive the data coming in over the WebSocket.
  2. Create a MediaSource instance.
  3. Create a SourceBuffer instance.
  4. Append the data to the SourceBuffer instance.
  5. Attach the MediaSource instance to a video element.
  6. Draw the video onto a canvas (a minimal sketch of this pipeline is shown below).
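
The sketch below illustrates this pipeline with plain browser APIs (Media Source Extensions). It is illustrative only: the WebSocket URL and the codec string are assumptions, and the cache trimming, latency control, and reconnection handled by the library are omitted.

ts
// Receive fMP4 segments over WebSocket and feed them to MediaSource/SourceBuffer.
const ws = new WebSocket('ws://example.com/stream') // hypothetical URL
ws.binaryType = 'arraybuffer'

const video = document.createElement('video')
video.muted = true

const mediaSource = new MediaSource()
video.src = URL.createObjectURL(mediaSource)

const pending: ArrayBuffer[] = []
let sourceBuffer: SourceBuffer | undefined

mediaSource.addEventListener('sourceopen', () => {
  // The codec string must match the actual stream; this one is only an example.
  sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"')
  sourceBuffer.addEventListener('updateend', appendNext)
  appendNext()
})

ws.onmessage = (e: MessageEvent<ArrayBuffer>) => {
  pending.push(e.data)
  appendNext()
}

function appendNext() {
  // Only one append may be in flight on a SourceBuffer at a time.
  if (!sourceBuffer || sourceBuffer.updating || pending.length === 0) return
  sourceBuffer.appendBuffer(pending.shift()!)
}

// Draw the decoded video onto a canvas, frame by frame.
const canvas = document.querySelector('canvas')!
const ctx = canvas.getContext('2d')!
function draw() {
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height)
  requestAnimationFrame(draw)
}
video.play().then(draw)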

Installing only this hook

If you use it through the hook, run one of the following:

shell
npm install @havue/use-ws-video --save
shell
yarn add @havue/use-ws-video
shell
pnpm install @havue/use-ws-video

Installing only the JavaScript class

If you do not use it through the hook:

shell
npm install @havue/ws-video-manager --save
shell
yarn add @havue/ws-video-manager
shell
pnpm install @havue/ws-video-manager

Usage

Importing the useWsVideo hook

ts
import { useWsVideo } from 'havue'
// or
import { useWsVideo } from '@havue/hooks'
// or
import { useWsVideo } from '@havue/use-ws-video'

Importing the JavaScript manager class

ts
import { WsVideoManager } from 'havue'
// or
import { WsVideoManager } from '@havue/solutions'
// or
import { WsVideoManager } from '@havue/ws-video-manager'

Function declaration

ts
import type { Ref, MaybeRef } from 'vue'
import type { RenderConstructorOptionType, VideoInfo, WsVideoManager } from '@havue/ws-video-manager'
ts
export type UseWsVideoCanvasResizeOption = {
  /** Whether to automatically update the canvas width and height attributes. Default: true */
  enable?: boolean
  /** Scale factor applied when setting the canvas width and height,
   * i.e. the element's actual size multiplied by scale.
   * Scaling up makes the picture sharper.
   * Default: 1
   */
  scale?: number
  /** Maximum allowed canvas width. Default: 1920 */
  maxWidth?: number
  /** Maximum allowed canvas height. Default: 1080 */
  maxHeight?: number
}

export type UseWsVideoParamsOptions = {
  /** WebSocket URL */
  wsUrl: MaybeRef<string | undefined>
  /** Whether to play the video */
  isReady: MaybeRef<boolean>
  /** WsVideoManager instance to use. Defaults to wsVideoPlayer */
  wsVideoPlayerIns?: WsVideoManager
  /** Canvas element the video is rendered to. If omitted, an element ref (canvasRef) is returned instead */
  target?: MaybeRef<HTMLCanvasElement | undefined>
  /** Options for automatically updating the canvas width and height attributes. Default: USE_WS_VIDEO_DEFAULT_RESIZE_OPTIONS */
  canvasResize?: MaybeRef<UseWsVideoCanvasResizeOption | undefined>
  /** Disconnect when the element is not visible in the viewport. Default: true */
  closeOnHidden?: MaybeRef<boolean>
  /** Custom Render options */
  renderOptions?: MaybeRef<Partial<RenderConstructorOptionType>>
}

// Default value for canvasResize
export const USE_WS_VIDEO_DEFAULT_RESIZE_OPTIONS = Object.freeze({
  enable: true,
  scale: 1,
  maxWidth: 1920,
  maxHeight: 1080
})

export type UseWsVideoReturnType = {
  /** Canvas ref */
  canvasRef: Ref<HTMLCanvasElement | undefined>
  /** Whether muted */
  isMuted: Ref<boolean>
  /** Whether paused */
  isPaused: Ref<boolean>
  /** Video info */
  videoInfo: Ref<VideoInfo>
  /** List of WebSocket URLs that are already connected */
  linkedWsUrlList: Ref<string[]>
  /** Whether this video stream URL has been added */
  isLinked: Ref<boolean>
  /** Whether the maximum number of WebSocket pull streams has been reached */
  isReachConnectLimit: Ref<boolean>
  /** Pause audio playback of the other WebSocket video streams */
  pauseOtherAudio: () => void
  /** Set whether the audio of the current WebSocket video stream is muted */
  setAudioMutedState: (muted: boolean) => void
  /** Pause video playback of the other WebSocket video streams */
  pauseOtherVideo: () => void
  /** Set whether the video of the current WebSocket video stream is paused */
  setOneVideoPausedState: (paused: boolean) => void
  /** Set whether the video of all WebSocket video streams is paused */
  setAllVideoPausedState: (paused: boolean) => void
  /** Refresh the playback time of the current WebSocket video stream; reconnects if the connection has dropped */
  refresh: () => void
}
ts
/**
 * Play a WebSocket video stream
 * @param {UseWsVideoParamsOptions} options Configuration options
 * @returns
 */
export function useWsVideo(options: UseWsVideoParamsOptions): UseWsVideoReturnType

TIP

If the canvas looks blurry, set options.canvasResize to increase the canvas width and height values, for example by window.devicePixelRatio:

ts
useWsVideo({
  canvasResize: {
    enable: true,
    scale: window.devicePixelRatio || 1,
  }
})

Example

The docs render a live demo here, with inputs for the WebSocket URL, canvas width, and height; its source code is shown below.
vue
<template>
  <div>
    <div class="form-box">
      <div class="form-item"><span class="label">websocket url:</span><input v-model="url" /></div>
      <div class="form-item"><span class="label">width:</span><input v-model="width" /></div>
      <div class="form-item"><span class="label">height:</span><input v-model="height" /></div>
    </div>
    <div class="video-player" :style="{ width: `${width}px`, height: `${height}px` }">
      <canvas ref="canvasRef" :width="width" :height="height"></canvas>
    </div>
  </div>
</template>
ts
<script setup lang="ts">
import { ref } from 'vue'
// import { useWsVideo } from '@havue/use-ws-video'
import { useWsVideo } from '@havue/hooks'

const url = ref('')
const width = ref(640)
const height = ref(320)

const canvasRef = ref()

useWsVideo({
  wsUrl: url,
  isReady: true,
  target: canvasRef
})
</script>
scss
<style lang="scss" scoped>
.form-box {
  padding: 15px;
  background: #25465845;

  .form-item {
    margin-bottom: 5px;

    .label {
      display: inline-block;
      width: 120px;
      margin-right: 20px;
      text-align: right;
    }

    input {
      padding: 1px 8px;
      border: 1px solid gray;
    }
  }
}

.video-player {
  background: #578895;

  canvas {
    width: 100%;
    height: 100%;
  }
}
</style>

useWsVideo options object

| Name | Description | Type | Default |
| --- | --- | --- | --- |
| wsUrl | WebSocket URL | string | |
| isReady | Whether to play | boolean \| Ref<boolean> | |
| wsVideoPlayerIns | WsVideoManager instance to use | WsVideoManager | WsVideoManager() |
| target | Canvas element; if omitted, a ref is generated automatically for external use | HTMLCanvasElement \| Ref<HTMLCanvasElement> | |
| canvasResize | Options for automatically tracking canvas size changes and updating the canvas width and height attributes | UseWsVideoCanvasResizeOption \| Ref<UseWsVideoCanvasResizeOption> | USE_WS_VIDEO_DEFAULT_RESIZE_OPTIONS |
| closeOnHidden | Disconnect when the element is not visible in the viewport | boolean | true |
| renderOptions | Render instance options | Partial<RenderConstructorOptionType> \| undefined | undefined |
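
For reference, here is a sketch passing the full options object; all values, including the WebSocket address, are illustrative:

ts
import { ref } from 'vue'
import { useWsVideo } from '@havue/hooks'

const canvasRef = ref<HTMLCanvasElement>()
const wsUrl = ref('ws://192.168.1.10:8080/stream') // hypothetical address
const isReady = ref(true)

const { isMuted, isPaused, videoInfo, setAudioMutedState, refresh } = useWsVideo({
  wsUrl,
  isReady,
  target: canvasRef,
  // Scale the canvas backing store up for a sharper picture.
  canvasResize: { enable: true, scale: window.devicePixelRatio || 1 },
  closeOnHidden: true,
  // Allow up to 0.5s of live latency before catching up.
  renderOptions: { liveMaxLatency: 0.5 }
})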

WsVideoManager

WsVideoManager constructor

ts
/** Heartbeat configuration */
type HeartbeatConfigType = {
  /** Send only once */
  once: boolean
  /** Heartbeat message */
  message: string
  /** Interval between heartbeats */
  interval?: number
}

/** Reconnection configuration */
type InterruptConfigType = {
  /** Whether to reconnect */
  reconnect: boolean
  /** Maximum number of reconnection attempts */
  maxReconnectTimes: number
  /** Delay before each reconnection attempt */
  delay: number
}

export type WebSocketOptionsType = {
  /** WebSocket sub-protocols, as in WebSocket(url: string, protocols: string | string[]) */
  protocols?: string | string[]
  /** Type of binary data transmitted over the WebSocket connection */
  binaryType?: WebSocket['binaryType']
  heartbeat?: HeartbeatConfigType
  interrupt?: InterruptConfigType
}
ts
export type RenderConstructorOptionType = {
  /** Maximum allowed gap, in seconds, between the current playback time (currentTime) and the latest video duration. Default: 0.3s */
  liveMaxLatency: number
  /** Maximum amount of unprocessed buffer data received over the WebSocket that may be cached. Default: 200 KB */
  maxCacheBufByte: number
  /** Maximum retained duration, used to remove buffered data older than this many seconds before currentTime. Default: 10s */
  maxCache: number
}

export const WS_VIDEO_RENDER_DEFAULT_OPTIONS = Object.freeze({
  liveMaxLatency: 0.3,
  maxCacheBufByte: 200 * 1024,
  maxCache: 10
})

export enum AudioState {
  NOTMUTED = 'notmuted',
  MUTED = 'muted'
}

export enum VideoState {
  PLAY = 'play',
  PAUSE = 'pause'
}

export type VideoInfo = {
  width: number
  height: number
}

export enum RenderEventsEnum {
  AUDIO_STATE_CHANGE = 'audioStateChange',
  VIDEO_STATE_CHANGE = 'videoStateChange',
  VIDEO_INFO_UPDATE = 'videoInfoUpdate'
}

export type RenderEvents = {
  [RenderEventsEnum.AUDIO_STATE_CHANGE]: (s: AudioState) => void
  [RenderEventsEnum.VIDEO_STATE_CHANGE]: (s: VideoState) => void
  [RenderEventsEnum.VIDEO_INFO_UPDATE]: (info: VideoInfo) => void
}
ts
import type { WebSocketOptionsType } from '../loader/websocket-loader'
import type { RenderConstructorOptionType, VideoInfo } from '../render'
ts
export type WsVideoManaCstorOptionType = {
  /** Limit on the number of preview stream connections. Default: 10 on mobile, 32 on PC */
  connectLimit?: number
  /** WebSocketLoader instance options */
  wsOptions?: WebSocketOptionsType
  /**
   * Re-parse the video codec (MIME) when the WebSocket reconnects.
   * Default: true
   */
  reparseMimeOnReconnect?: boolean
  /** Render instance options */
  renderOptions?: Partial<RenderConstructorOptionType>
  /**
   * Whether to use WebGL.
   * Default: false.
   * Depending on the browser and available GPU memory, only a limited number of WebGL contexts (typically 8-16) can exist at the same time */
  useWebgl?: boolean
}

const DEFAULT_OPTIONS: Required<WsVideoManaCstorOptionType> = Object.freeze({
  connectLimit: isMobile ? 10 : 32,
  wsOptions: {
    binaryType: 'arraybuffer' as WebSocket['binaryType']
  },
  reparseMimeOnReconnect: true,
  renderOptions: RENDER_DEFAULT_OPTIONS,
  useWebgl: false
})

type WsInfoType = {
  /** Canvases to draw to */
  canvasMap: Map<HTMLCanvasElement, CanvasDrawer>
  /** WebSocketLoader instance */
  socket: WebSocketLoader
  /** Render instance for this socket connection */
  render: Render
}

export enum EventEnums {
  WS_URL_CHANGE = 'wsUrlChange',
  SOCKET_CLOSE = 'socketClose',
  CONNECT_LIMIT = 'connectLimit'
}

type Events = {
  [EventEnums.WS_URL_CHANGE]: (urls: string[]) => void
  [RenderEventsEnum.AUDIO_STATE_CHANGE]: (url: string, state: AudioState) => void
  [RenderEventsEnum.VIDEO_INFO_UPDATE]: (url: string, info: VideoInfo) => void
  [RenderEventsEnum.VIDEO_STATE_CHANGE]: (url: string, state: VideoState) => void
  [EventEnums.SOCKET_CLOSE]: (url: string) => void
  [EventEnums.CONNECT_LIMIT]: () => void
}

export const WsVideoManagerEventEnums = Object.assign({}, EventEnums, RenderEventsEnum)
ts
export class WsVideoManager extends EventBus<Events> {
  constructor(options?: WsVideoManaCstorOptionType)
}
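
A minimal sketch of constructing a standalone manager with the option types shown above; every value is illustrative rather than a recommended default:

ts
import { WsVideoManager } from '@havue/ws-video-manager'

const manager = new WsVideoManager({
  // Allow at most 16 concurrent stream connections.
  connectLimit: 16,
  wsOptions: {
    binaryType: 'arraybuffer',
    heartbeat: { once: false, message: 'ping', interval: 30_000 },
    interrupt: { reconnect: true, maxReconnectTimes: 5, delay: 2_000 }
  },
  reparseMimeOnReconnect: true,
  renderOptions: { liveMaxLatency: 0.5, maxCache: 10 },
  useWebgl: false
})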

WsVideoManager instance properties

| Name | Description | Type |
| --- | --- | --- |
| linkedUrlList | List of connected WebSocket URLs | string[] |
| connectLimit | WebSocket connection limit of the current instance | number |
| addCanvas | Add a WebSocket URL together with the canvas element to draw to | (canvas: HTMLCanvasElement, url: string, renderOptions?: Partial<RenderConstructorOptionType>) => void |
| removeCanvas | Remove a canvas element from drawing | (canvas: HTMLCanvasElement) => void |
| isCanvasExist | Check whether a canvas has already been added | (canvas: HTMLCanvasElement) => boolean |
| updateRenderOptions | Update the render instance options | (url: string, options?: Partial<RenderConstructorOptionType>) => void |
| setAllVideoMutedState | Set the muted state of all videos | (muted: boolean) => void |
| setOneMutedState | Set the muted state of a single video | (url: string, muted: boolean) => void |
| getOneMutedState | Get the muted state of a single video | (url: string) => void |
| playOneAudio | Play only the audio of a single video and mute the others | (url: string) => void |
| setAllVideoPausedState | Set whether all videos are paused | (paused: boolean) => void |
| setOneVideoPausedState | Set whether a single video is paused | (url: string, paused: boolean) => void |
| getOneVideoPausedState | Get whether a single video is paused | (url: string) => void |
| playOneVideo | Play a single video and pause the others | (url: string) => void |
| refresh | Refresh the target video's playback time; reconnects if the connection has dropped (url is optional; omit it to refresh all streams) | (url?: string) => void |
| on | Subscribe to an event | (event: string, cb: (...args) => void) => void |
| off | Unsubscribe from an event | (event: string, cb?: (...args) => void) => void |
| emit | Emit an event | (event: string, ...args) => void |
| destroy | Destroy the instance | () => void |
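
A sketch of driving the manager directly with these methods; the canvas lookup and the stream URL are assumptions:

ts
import { WsVideoManager } from '@havue/ws-video-manager'

const manager = new WsVideoManager()
const canvas = document.querySelector('canvas')! // the target canvas is an assumption
const streamUrl = 'ws://192.168.1.10:8080/stream' // hypothetical address

// Connect the stream and start drawing it onto this canvas.
manager.addCanvas(canvas, streamUrl)

// Per-stream audio and playback control.
manager.setOneMutedState(streamUrl, false)
manager.setOneVideoPausedState(streamUrl, false)

// Refresh the stream (reconnecting if the socket has dropped), then tear down.
manager.refresh(streamUrl)
manager.removeCanvas(canvas)
manager.destroy()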

WsVideoManager instance events

| Event | Description | Type |
| --- | --- | --- |
| wsUrlChange | The list of added WebSocket URLs was updated | (urls: string[]) => void |
| audioStateChange | A video's muted state changed | (url: string, state: AudioState) => void |
| videoStateChange | A video's playback state changed | (url: string, state: VideoState) => void |
| videoInfoUpdate | A video's size information was updated | (url: string, info: VideoInfo) => void |
| socketClose | A WebSocket connection was closed | (url: string) => void |
| connectLimit | The WebSocket connection limit was reached | () => void |
ts
export enum AudioState {
  NOTMUTED = 'notmuted',
  MUTED = 'muted'
}

export enum VideoState {
  PLAY = 'play',
  PAUSE = 'pause'
}

export type VideoInfo = {
  width: number
  height: number
}
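
A sketch of subscribing to the events above through on(), assuming WsVideoManagerEventEnums is exported from the package as shown earlier; the handler bodies are illustrative:

ts
import { WsVideoManager, WsVideoManagerEventEnums } from '@havue/ws-video-manager'

const manager = new WsVideoManager()

// React to changes in the list of connected stream URLs.
manager.on(WsVideoManagerEventEnums.WS_URL_CHANGE, (urls: string[]) => {
  console.log('connected urls:', urls)
})

// A stream's socket was closed.
manager.on(WsVideoManagerEventEnums.SOCKET_CLOSE, (url: string) => {
  console.warn('socket closed:', url)
})

// The connection limit was reached; no more streams can be added.
manager.on(WsVideoManagerEventEnums.CONNECT_LIMIT, () => {
  console.warn('connection limit reached')
})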