Low quality video streaming #1570

Open
tunahsu opened this issue Mar 26, 2024 · 9 comments

tunahsu commented Mar 26, 2024

Why is the video streaming quality poor even though I've set media_stream_constraints to the webcam's highest resolution? The quality is still worse than opening the camera directly in Windows.

Here is my code:

ctx = webrtc_streamer(
    key='stream-detection',
    video_processor_factory=VideoProcessor,
    media_stream_constraints={
        'video': {
            'width': 1920
        },
        'audio': False
    },
    rtc_configuration={
        'iceServers': [{'urls': ['stun:stun.l.google.com:19302']}]
    },
)
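
For reference, media_stream_constraints appears to be passed through to the browser's getUserMedia, so width/height/frameRate can also be given in the MediaTrackConstraints min/ideal form. A minimal sketch of that variant (assuming the camera actually supports 1080p at 30 fps):

ctx = webrtc_streamer(
    key='stream-detection',
    video_processor_factory=VideoProcessor,
    media_stream_constraints={
        'video': {
            # Ask the browser for full HD up front instead of a bare width
            'width': {'min': 1280, 'ideal': 1920},
            'height': {'min': 720, 'ideal': 1080},
            'frameRate': {'ideal': 30},
        },
        'audio': False,
    },
    rtc_configuration={
        'iceServers': [{'urls': ['stun:stun.l.google.com:19302']}]
    },
)
# Note: these constraints only affect what the camera captures; the WebRTC
# sender can still downscale or lower the bitrate while it estimates the
# available bandwidth, which is why the stream ramps up over time.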

tunahsu commented Mar 27, 2024

I noticed that it takes some time for the frame size to gradually increase, from 640x360 to 960x540 to 1280x720 to 1920x1080. I know that WebRTC adjusts the frame size adaptively based on network speed. Is there any way to fix it at 1920x1080 from the beginning?

@ccl-private

I also ran into this problem: 1920x1080 video streaming quality is poor. Did you manage to solve it?


tunahsu commented Aug 12, 2024

@ccl-private It seems that manual adjustments in the package are not possible... The author mentioned it here.

@ccl-private

I understand, but the difference in image clarity between local playback and WebRTC playback is too big.

@ccl-private

OK, I tried https://github.com/aiortc/aiortc/tree/main/examples/webcam. The quality is also low at 1080p.

@achaosss

I have the same question


ccl-private commented Dec 31, 2024

@tunahsu @achaosss @whitphx Hey guys, I found an article that may help: https://www.freelancer.com/contest/aiortc-example-with-hardware-accelerated-codec-2035986, but it's a bit beyond my ability to solve on my own.

@ccl-private

import subprocess

import cv2
import numpy as np

# RTSP URL of the camera
rtsp_url = "rtsp://192.168.2.13:554/cam/realmonitor?channel=1&subtype=0"

# Use ffprobe to get the video resolution
def get_video_resolution(rtsp_url):
    command = [
        'ffprobe',
        '-v', 'error',
        '-select_streams', 'v:0',
        '-show_entries', 'stream=width,height',
        '-of', 'csv=p=0',
        rtsp_url
    ]

    result = subprocess.run(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    if result.returncode == 0:
        resolution = result.stdout.decode('utf-8').strip().split(',')
        width, height = int(resolution[0]), int(resolution[1])
        return width, height
    else:
        print("Error:", result.stderr.decode('utf-8'))
        return None

# Get the video resolution
resolution = get_video_resolution(rtsp_url)
if resolution:
    width, height = resolution
    print("Video resolution:", resolution)

    # # Launch the FFmpeg subprocess (software decoding)
    # ffmpeg_command = [
    #     'ffmpeg',
    #     # '-loglevel', 'debug',  # enable debug logging
    #     '-i', rtsp_url,
    #     '-f', 'rawvideo',
    #     '-pix_fmt', 'bgr24',
    #     '-'
    # ]
    # Launch the FFmpeg subprocess with VAAPI hardware decoding
    ffmpeg_command = [
        'ffmpeg',
        '-hwaccel', 'vaapi',
        '-vaapi_device', '/dev/dri/renderD128',
        '-i', rtsp_url,
        '-f', 'rawvideo',
        '-pix_fmt', 'bgr24',
        '-'
    ]

    # Note: with stderr=subprocess.PIPE, ffmpeg can stall if its log output
    # is never drained; redirecting to subprocess.DEVNULL is an alternative.
    process = subprocess.Popen(ffmpeg_command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)

    cv2.namedWindow("Video Stream", cv2.WINDOW_NORMAL)

    while True:
        # Read one frame's worth of raw data (BGR: 3 bytes per pixel)
        raw_frame = process.stdout.read(width * height * 3)
        if not raw_frame or len(raw_frame) < width * height * 3:
            # EOF or a truncated final frame
            break

        # stderr_line = process.stderr.readline()
        # if not stderr_line:
        #     break
        # print(stderr_line.strip())

        # Convert the raw bytes into a NumPy array shaped as a BGR image
        frame = np.frombuffer(raw_frame, np.uint8)
        frame = frame.reshape((height, width, 3))

        # Display the frame
        cv2.imshow("Video Stream", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    # Clean up
    process.stdout.close()
    process.stderr.close()
    process.wait()
    cv2.destroyAllWindows()
else:
    print("Could not determine the video resolution.")


I can call hardware decoding directly with ffmpeg to extract the RTSP stream into the frame variable in real time, but I don't know how to pass it to Streamlit to display it in the interface.
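
One possible way to wire that into Streamlit is to redraw a single st.empty() placeholder with each decoded frame. A minimal sketch, assuming the FFmpeg reader above is wrapped in a hypothetical read_frames() generator (not a real module) that yields BGR NumPy arrays:

import streamlit as st

# Hypothetical helper: wraps the FFmpeg subprocess shown above and
# yields decoded BGR NumPy frames one at a time.
from rtsp_reader import read_frames

rtsp_url = "rtsp://192.168.2.13:554/cam/realmonitor?channel=1&subtype=0"

st.title("RTSP preview")
placeholder = st.empty()  # a single slot in the page, redrawn for every frame

for frame in read_frames(rtsp_url):
    # st.image accepts BGR arrays directly when channels="BGR"
    placeholder.image(frame, channels="BGR")

This bypasses WebRTC entirely, so there is no bandwidth adaptation and each frame is re-sent to the browser as an image, but it keeps the full decoded resolution.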


ccl-private commented Jan 7, 2025

Hello everyone, I wrote a simple project that pulls 2560x1440 streams in real time and displays them on a Streamlit web page: https://github.com/ccl-private/streamlit_rtsp/tree/main
The project is still very basic. It uses FFmpeg hardware decoding (Intel integrated graphics) to pull the stream. The frame callback has not been written yet, but you can also modify the source code directly.

@tunahsu @achaosss
