ngp-encode-server receives a VR user's head position and field of view, and renders the corresponding view using instant-ngp. The goal of this program is to study whether Neural Radiance Field (NeRF) technology is suitable for generating backgrounds in immersive VR environments.
To elaborate, ngp-encode-server receives the user's field of view and head position from ngp-client. The server requests a frame containing the scene and depth data generated by NeRF for the received user position from multiple instances of instant-ngp-renderer, a customized version of instant-ngp. The views are encoded into a video stream using the H.264 codec and sent to ngp-client over WebSocket. ngp-client decodes the packets using WebCodecs and displays them on the user's VR device. The client uses only web-standards-compliant technologies and does not rely on external programs or plug-ins.
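The actual wire format is defined by the project's protobuf schema, but the per-frame data the client sends can be pictured as a small pose message. The struct and byte-level serialization below are a hypothetical sketch for illustration only; the field names, layout, and units are assumptions, not the real protocol:

```cpp
#include <array>
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical sketch of the per-frame request ngp-client might send:
// the user's head pose and field of view. The real project serializes
// this with protobuf; the names and layout here are illustrative only.
struct PoseRequest {
  std::array<float, 3> position;     // head position in world space
  std::array<float, 4> orientation;  // rotation as a quaternion (x, y, z, w)
  float fov_deg;                     // field of view in degrees
};

// Pack the struct into a raw byte buffer, as one might before handing
// it to a WebSocket send call. PoseRequest is trivially copyable, so a
// memcpy round trip is well-defined.
std::vector<std::uint8_t> serialize(const PoseRequest& req) {
  std::vector<std::uint8_t> buf(sizeof(PoseRequest));
  std::memcpy(buf.data(), &req, sizeof(PoseRequest));
  return buf;
}

PoseRequest deserialize(const std::vector<std::uint8_t>& buf) {
  PoseRequest req{};
  std::memcpy(&req, buf.data(), sizeof(PoseRequest));
  return req;
}
```

Using protobuf instead of a raw struct copy (as the project does, given the libprotobuf dependency) buys a versioned, endianness-independent encoding that both the C++ server and the JavaScript client can parse.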
- A C++20-compatible compiler (GCC 8 or later).
- A Linux distribution of your choice. Only Ubuntu 20.04 has been tested.
- libavcodec
- libavformat
- libavutil
- libswscale
- freetype2
- libwebsocketpp
- OpenSSL
- libprotobuf
If you are using Ubuntu 20.04, install the following packages:
sudo apt install build-essential \
cmake \
git \
gcc-10 \
libavcodec-dev \
libavformat-dev \
libavutil-dev \
libswscale-dev \
libfreetype-dev \
libwebsocketpp-dev \
libssl-dev \
libprotobuf-dev
Begin by cloning this repository and all its submodules using the following command:
$ git clone --recursive https://github.com/moonsikpark/ngp-encode-server
$ cd ngp-encode-server
Then, use CMake to build the project:
ngp-encode-server$ cmake . -B build
ngp-encode-server$ cmake --build build --config RelWithDebInfo -j $(nproc)
If the build succeeds, you can now run the program via the build/neserver executable.
Moonsik Park, Korea Institute of Science and Technology - [email protected]
Copyright (c) 2022 Moonsik Park. All rights reserved.