
Blending of two videos using Python

Prerequisite: Addition and Blending of images using OpenCV in Python

In this article, we will show you how to blend two videos together. But before that, let us first understand what alpha blending is.

Alpha Blending

Alpha blending is a convex combination of two colors that allows for transparency effects in computer graphics. The value of alpha in the color code ranges from 0.0 to 1.0, where 0.0 represents a fully transparent color and 1.0 represents a fully opaque color. The resulting color, when a color Value1 with an alpha value of Alpha is drawn over a background of color Value0, is given by:

Value = Value0 * (1.0 - Alpha) + Value1 * Alpha

The alpha component may be used to blend the red, green, and blue channels equally, as in 32-bit RGBA, or, alternatively, three separate alpha values may be specified, one for each primary color, for spectral color filtering.
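
As a quick illustration, here is a minimal sketch of the formula applied with NumPy to two solid-color frames (the colors and the alpha value of 0.3 are arbitrary examples, not taken from the videos used later):

Python3

# a minimal sketch of the alpha blending formula, assuming
# two 2x2 solid-colour frames and an arbitrary alpha of 0.3
import numpy as np

value0 = np.full((2, 2, 3), (255, 0, 0), dtype=float)  # background colour (blue in BGR)
value1 = np.full((2, 2, 3), (0, 0, 255), dtype=float)  # foreground colour (red in BGR)
alpha = 0.3  # 0.0 = fully transparent, 1.0 = fully opaque

value = value0 * (1.0 - alpha) + value1 * alpha
print(value[0, 0])  # [178.5   0.   76.5]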

Now, the second thing we will be using is OpenCV, a library of programming functions aimed mainly at real-time computer vision. Before moving forward, let us note a few things.

Important Points

For the program to work correctly, you have to make sure that:

  1. The resolution and frame rate of the two input videos must be exactly the same. The code below assumes a 1920×1080 resolution, so either use that or set the 'h' and 'w' values in the code to match the resolution of your own input videos (a small sketch for checking these properties follows this list).
  2. The code assumes that the background video's duration is greater than or equal to the foreground video's duration, so it is preferable to pick your input videos the same way; otherwise, check the "ret" value of the background video instead of the foreground one, as explained later in the code.
  3. Give your video files unique names, because two video files with the same name can sometimes cause an error, and make sure the paths of the two input video files are provided accurately.
  4. The foreground video must be a combination of a solid color background (preferably black) and a moving subject on it.
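
A minimal sketch for checking point 1 (the file names here are placeholders for your own videos) could look like this:

Python3

# a small sketch to verify that both input videos share the
# same resolution and frame rate before blending them
import cv2

# placeholder file names; replace them with your own video paths
for name, path in (("foreground", "fg.mp4"), ("background", "bg.mp4")):
    cap = cv2.VideoCapture(path)
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    fps = cap.get(cv2.CAP_PROP_FPS)
    print(f"{name}: {w}x{h} at {fps:.2f} fps")
    cap.release()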

Methodology

We will take two videos as input: one will be our background video and the other our foreground video. The main work is to remove the solid color background of the foreground video so that we are left with only its subject on a transparent background. The foreground video is then placed on top of the background video, which gives the illusion of a single video with an effect applied to it, even though two videos are actually blended together. This is done with the help of OpenCV and Python.
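
To see this idea on a single pair of frames before looping over whole videos, here is a minimal sketch (the image file names are placeholders) in which the brightness of the foreground frame itself acts as the alpha matte, so the black background disappears and only the subject is composited onto the background:

Python3

# a single-frame sketch of the masking idea; the file names are placeholders
import cv2
import numpy as np

foreground = cv2.imread("particles_frame.png")  # subject on a black background
background = cv2.imread("desert_frame.png")     # same resolution as the foreground

# grayscale brightness of the foreground becomes the alpha matte:
# black pixels -> alpha 0, bright subject pixels -> alpha close to 1
gray = cv2.cvtColor(foreground, cv2.COLOR_BGR2GRAY)
alpha = cv2.merge([gray, gray, gray]).astype(float) / 255

blended = alpha * foreground.astype(float) + (1.0 - alpha) * background.astype(float)
cv2.imwrite("blended_frame.png", blended.astype(np.uint8))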

Step-by-step Approach

  • Import required modules
  • Add paths and capture both the input videos.
  • Iterate through each frame of both videos and blend them together using alpha blending.
  • Play the output video generated.

Implementation

Input: 

Foreground Video

Background Video

Notice that the foreground video contains only the particles on a solid black background; for better results, it is preferable to use foreground and background videos like these.

Python3




# importing necessary packages
import numpy as np
import cv2
  
# assigning path of foreground video
path_1 = r"C://Users//Lenovo//Desktop//Python Workshop//z.mp4"
fg = cv2.VideoCapture(path_1)
  
# assigning path of background video
path_2 = r"C://Users//Lenovo//Desktop//Python Workshop//v.mp4"
bg = cv2.VideoCapture(path_2)
h, w = 1080, 1920
  
while True:
    
    # Reading the two input videos.
    # We check "ret" for the foreground because the
    # bg video's duration is greater than the fg video's,
    ret, foreground = fg.read()
      
    # if in your case the situation is opposite 
    # then take the "ret" for bg video
    _, background = bg.read()
      
    # if ret is True, a foreground frame was read,
    # which means the foreground video is still going on
    if ret:
        
        # creating the alpha mask
        alpha = np.zeros_like(foreground)
        gray = cv2.cvtColor(foreground, cv2.COLOR_BGR2GRAY)
        alpha[:, :, 0] = gray
        alpha[:, :, 1] = gray
        alpha[:, :, 2] = gray
  
        # converting uint8 to float type
        foreground = foreground.astype(float)
        background = background.astype(float)
  
        # normalizing the alpha mask in order
        # to keep its intensity between 0 and 1
        alpha = alpha.astype(float)/255
  
        # multiplying the foreground 
        # with alpha matte
        foreground = cv2.multiply(alpha, 
                                  foreground)
  
        # multiplying the background 
        # with (1 - alpha)
        background = cv2.multiply(1.0 - alpha, 
                                  background)
  
        # adding the masked foreground 
        # and background together
        outImage = cv2.add(foreground, 
                           background)
  
        # resizing the masked output
        ims = cv2.resize(outImage, (980, 540))
  
        # showing the masked output video
        cv2.imshow('Blended', ims/255)
  
        # if the user presses 'q' then the 
        # program breaks from while loop
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
    # if the foreground video is over, fg.read()
    # returns ret = False and nothing is left in the
    # foreground array, so we break from the while loop
    else:
        break

# releasing the video captures and closing the display window
fg.release()
bg.release()
cv2.destroyAllWindows()

print('Video blending is done successfully')


Output: 

Notice that in the output video the sand appears to be blowing through the air in a hot desert, which in reality is the blended output of the particle video (used as foreground) and the desert video (used as background). Also, you can press "q" on your keyboard at any time to break the loop and quit the program.

In this way, you can blend two videos, and if done correctly, this method can also give your footage a professional-looking edit.
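
If you also want to save the blended result to a file rather than only previewing it, a sketch along these lines could be combined with the loop above (the output file name, the 'mp4v' codec, and the 30 fps frame rate are assumptions; match them to your own videos):

Python3

# sketch of writing the blended frames to disk; 'output.mp4', the
# 'mp4v' codec and 30.0 fps are placeholder values, not taken from the inputs
import cv2
import numpy as np

fourcc = cv2.VideoWriter_fourcc(*'mp4v')
writer = cv2.VideoWriter('output.mp4', fourcc, 30.0, (1920, 1080))

# inside the blending loop, after computing outImage
# (the writer expects uint8 BGR frames of exactly 1920x1080):
#     writer.write(outImage.astype(np.uint8))

# after the loop finishes:
writer.release()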
