Following on from our first public trial of our 5G Smart Tourism application in December, we have now completed our second public trial in which we tested an alternative method of presenting low-latency, high-bandwidth video to a smartphone.
The smartphone application in the second trial used the same 'window in time' concept as the first, in which a historical reconstruction of a real location was aligned with, and superimposed upon, the same location in the real world, but instead of streaming pre-rendered 360 videos to a user's phone we tested a "remote rendering" VR version. In this new version, the physical position of the smartphone was sent to a remote computer, which used that information to draw a live 'VR' view of the historical scene before streaming it back to the smartphone for display.
Creating the historical view in real-time has several advantages over using pre-rendered 360 videos. Firstly, while the 360 video is drawn from a single physical position, forcing the user to stand in a single location to accurately align the historical view on the phone with the real world, the VR version gives the user complete freedom to move around the physical environment for as long as the position of the phone can be determined. Secondly, because the VR version is being drawn live, it is possible to add interactive or personalised elements directly to it, as one would with a computer game. Thirdly, in the VR version, it is possible for the user to 'zoom' their viewpoint (to examine details in the scene at a distance from them) without loss of image quality.
Of course, the VR scene could be drawn directly on the smartphone itself, but this has several drawbacks, including inferior rendering quality compared to that available on a decent PC, a much larger app that needs to be installed, and extremely high battery usage (performing the high-quality graphics drawing on our Android S8 smartphones caused the battery to drain in about 15 minutes).
The key technological requirement for this application was to keep the delay between a user moving their smartphone and the view on their screen updating to an absolute minimum. To this end, our project partners provided a 60 GHz, 5G mesh network between the Roman Baths and the physical server cluster at the University of Bristol which performed the VR graphics drawing, although the final 'hop' between the 5G network and the smartphones was made over an LTE connection.
Bristol provisioned 10 virtual graphics-drawing PCs using OpenStack and optimised them to allow fast access to physical memory and graphics hardware. Onto each virtual machine, they installed our graphics application and streaming software. During the trial, a user's smartphone would request graphics-drawing capability from Bristol's servers and be assigned IP endpoints for sending the control data and receiving the drawn video. A description of the technical aspects of the phone app is included below.
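The allocation step can be pictured as a simple handshake: the phone asks for a renderer and receives the endpoints it should talk to. The trial's actual API was not published, so the following Kotlin sketch assumes a hypothetical HTTP allocator returning JSON, with `control_host`, `control_port` and `video_port` as illustrative field names:

```kotlin
import java.net.HttpURLConnection
import java.net.URL
import org.json.JSONObject

// Endpoints assigned to one phone for the duration of a session.
data class RenderSession(
    val controlHost: String,  // where pose/control UDP packets are sent
    val controlPort: Int,
    val videoPort: Int        // local port on which the drawn video arrives
)

// Ask the (hypothetical) allocator for graphics-drawing capability.
// The URL, method and JSON fields are assumptions for illustration.
fun requestRenderSession(allocatorUrl: String): RenderSession {
    val conn = URL(allocatorUrl).openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.connectTimeout = 3000
    return conn.inputStream.bufferedReader().use { reader ->
        val json = JSONObject(reader.readText())
        RenderSession(
            controlHost = json.getString("control_host"),
            controlPort = json.getInt("control_port"),
            videoPort = json.getInt("video_port")
        )
    }
}
```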
Using on-device positional tracking, each smartphone could locate itself in the Roman Baths and then send its position, orientation and other control data as raw UDP packets to Bristol's graphics-drawing PCs. The graphics application on these PCs used historical 3D models of the Roman Baths and a server application to draw the historical scene from the viewpoint of the smartphone, encode the resulting graphics as H.264 video and then stream this live view back to the smartphone. This video was then decoded by the phone using an Android MediaCodec instance and displayed on an OpenGL surface in the app. The result of this process can be seen in the video above, in which a user is moving their phone around the upper balcony above the Great Bath, and the resulting historical view is being created about 15 miles away and sent back in real-time.
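To make the phone-side pipeline concrete, here is a minimal Kotlin sketch of its two halves: sending the pose upstream as raw UDP packets, and decoding the returned H.264 stream with a MediaCodec that renders straight onto a Surface. The packet layout and function names are assumptions for illustration; the wire format actually used in the trial was not published.

```kotlin
import android.media.MediaCodec
import android.media.MediaFormat
import android.view.Surface
import java.net.DatagramPacket
import java.net.DatagramSocket
import java.net.InetAddress
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Illustrative layout: position (x, y, z) plus an orientation quaternion,
// packed as little-endian floats.
fun posePacket(pos: FloatArray, quat: FloatArray): ByteArray {
    val buf = ByteBuffer.allocate(7 * 4).order(ByteOrder.LITTLE_ENDIAN)
    pos.forEach { buf.putFloat(it) }
    quat.forEach { buf.putFloat(it) }
    return buf.array()
}

// Send the current camera pose to the remote renderer as one UDP datagram.
fun sendPose(socket: DatagramSocket, host: InetAddress, port: Int,
             pos: FloatArray, quat: FloatArray) {
    val bytes = posePacket(pos, quat)
    socket.send(DatagramPacket(bytes, bytes.size, host, port))
}

// Configure a hardware H.264 decoder that outputs directly to an
// OpenGL-backed Surface, as described above.
fun createDecoder(surface: Surface, width: Int, height: Int): MediaCodec {
    val format = MediaFormat.createVideoFormat(
        MediaFormat.MIMETYPE_VIDEO_AVC, width, height)
    return MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC).apply {
        configure(format, surface, null, 0)  // decode straight to the surface
        start()
    }
}

// Feed one received H.264 access unit to the decoder and release any
// decoded frame to the Surface for display.
fun decodeUnit(codec: MediaCodec, accessUnit: ByteArray, ptsUs: Long) {
    val inIndex = codec.dequeueInputBuffer(10_000)
    if (inIndex >= 0) {
        codec.getInputBuffer(inIndex)?.apply { clear(); put(accessUnit) }
        codec.queueInputBuffer(inIndex, 0, accessUnit.size, ptsUs, 0)
    }
    val info = MediaCodec.BufferInfo()
    val outIndex = codec.dequeueOutputBuffer(info, 0)
    if (outIndex >= 0) codec.releaseOutputBuffer(outIndex, true)  // render
}
```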
For this trial, the round-trip latency from a handset being moved to a video frame corresponding to that movement being displayed was ~200 milliseconds. Positive user feedback from the large number of participants who attended the trial suggests that this is satisfactory for a handheld VR application. However, it would not be acceptable for a head-mounted VR display, where a latency of less than ~50 ms between user motion and display update is required. Further methods for improving perceived latency would be needed if this sort of technology were to be used for a headset-based VR experience. Possibilities include:
- Drawing a wider field-of-view than the display requires and moving the user's viewpoint around that wider view using orientation data from the fast smartphone sensors. Position latency would remain, but orientation latency would be greatly reduced (see the sketch after this list).
- Using a sub-frame video encoder, so that the streaming server doesn't have to wait for a whole frame to be drawn before it can start sending data back to the phone.
- Increasing the frame rate of the graphics application (and, accordingly, the transmission bandwidth).
- Trading network reliability for speed by sending the video back to the phone using UDP only.
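The first of these ideas amounts to an orientation-only late correction: if the server over-renders the field of view and tags each frame with the orientation it was drawn from, the phone can rotate its view of that frame by however far the device has turned since. A minimal Kotlin sketch, assuming 4×4 rotation matrices and Android's Matrix helpers (the matrix conventions are illustrative, not those of the trial software):

```kotlin
import android.opengl.Matrix

// Compute the rotation the device has made since a frame was rendered.
// Applying this correction when sampling the over-rendered frame pans the
// visible crop, hiding most orientation latency; position latency remains.
fun lateOrientationCorrection(
    renderedRotation: FloatArray,  // 4x4 rotation the server drew the frame from
    currentRotation: FloatArray    // 4x4 rotation from the fast IMU sensors now
): FloatArray {
    val inverseRendered = FloatArray(16)
    val correction = FloatArray(16)
    // correction = current * inverse(rendered)
    Matrix.invertM(inverseRendered, 0, renderedRotation, 0)
    Matrix.multiplyMM(correction, 0, currentRotation, 0, inverseRendered, 0)
    return correction
}
```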
A large amount of user feedback was gathered through participant interviews. One feature that was particularly requested was the ability to 'personalise' the explanatory historical notes in the scenes depending on the level of knowledge and interest of the user. Another popular request was for the 5G content to be used alongside the existing explanations from the tour guide. This and other feedback will determine how we design future deployments of this system.
Further details of the trials, including an analysis of the user feedback, can be found in 91热爆 R&D White Paper 348.
This work was supported by our funders and project partners.
- 91热爆 R&D - Research White Paper: Mobile Augmented Reality using 5G
- 91热爆 R&D - All of our articles on 4G and 5G including:
- 91热爆 R&D - Broadcasting Over 5G - Delivering Live Radio to Orkney
- 91热爆 R&D - 4G & 5G Broadcast
- 91热爆 R&D - TV Over Mobile Networks - 91热爆 R&D at the 3GPP
- 91热爆 R&D - 5G-Xcast
- 91热爆 R&D - 4G Broadcast for the Commonwealth Games 2014
- 91热爆 R&D - 4G Broadcast technology trial at Wembley 2015 FA Cup Final
- 91热爆 R&D - 4G Broadcast: Can LTE eMBMS Help Address the Demand for Mobile Video?
- 91热爆 R&D - 5G Smart Tourism Trial at the Roman Baths
- 91热爆 R&D - New Audience Experiences for Mobile Devices
- 91热爆 R&D - Unlocking the Potential of 5G for Content Production
- 91热爆 R&D - Building our own 5G Broadcast Modem
- 91热爆 R&D - Broadcast Wi-Fi
Immersive and Interactive Content section
The IIC section is a group of around 25 researchers investigating ways of capturing and creating new kinds of audio-visual content, with a particular focus on immersion and interactivity.