I have set up a SoftwareDevice that receives raw depth and color frames over the network, using a modified version of the event system from the PointCloudDepthAndColor sample provided with the RealSense Unity package. I can receive the frames and render the point cloud, but the color texture is not applied to it: the UV map field of the PointCloudGeom shader stays black and never updates. Can someone please point me in the right direction?
Image attached for reference:
Hi @abhijaysingh
In the link below, a RealSense Unity user also had a problem getting the point cloud to display correctly in an adaptation of the PointCloudDepthAndColor sample scene. They found that changing the selected shader on PointCloudMat from Custom/PointCloudGeom to Custom/Pointcloud resolved the problem.
#4155 (comment)
Hi @MartyG-RealSense
Thanks for the swift response. I had already tried changing the shader, but the issue persists: the UV Map field is black even with the shader set to Custom/PointCloud. Could it be related to how the raw texture data is loaded from Points (https://github.com/IntelRealSense/librealsense/blob/master/wrappers/csharp/Intel.RealSense/Frames/Points.cs)?
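In librealsense, a points frame carries one texture coordinate (a pair of floats) per vertex, and an all-zero coordinate buffer renders as a black UV map. One quick sanity check is to scan the raw buffer for any non-zero values before blaming the shader. A minimal sketch in Python (the buffer layout, little-endian float32 UV pairs, is the standard librealsense layout, but treat the helper name as illustrative):

```python
import struct

def uv_buffer_is_black(raw_bytes):
    # Texture coordinates are packed as little-endian float32 (u, v) pairs,
    # one pair per vertex. If every float is zero, the UV map will show as
    # solid black and the color texture cannot be applied.
    count = len(raw_bytes) // 4
    floats = struct.unpack("<%df" % count, raw_bytes[:count * 4])
    return all(f == 0.0 for f in floats)

# Example: three vertices' worth of zeroed UVs reads as "black".
print(uv_buffer_is_black(bytes(3 * 2 * 4)))  # True

# A buffer containing a real coordinate does not.
print(uv_buffer_is_black(struct.pack("<2f", 0.5, 0.25)))  # False
```

If the buffer really is all zeros, the problem is upstream of the shader: the SoftwareDevice frames are likely missing the texture-coordinate mapping step rather than the shader failing to sample it.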
In the RealSense Unity wrapper, camera data is converted into a mat so that it can be rendered. I can see in the PointCloudMat panel of the Inspector that a color image is being received.
I find that in Unity, if a texture will not display correctly, simply selecting Unity's default Standard shader from the shader list will often render it. Have you tried Standard, please?
Thank you, @MartyG-RealSense! That worked, but the texture has a blue tint and the texture coordinates are not right.
My application requires exact depth-to-texture mapping, and the Standard shader does not map the UVs to the texture coordinates.
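The exact depth-to-texture mapping mentioned here follows the standard pinhole model: deproject each depth pixel into a 3D point, transform it into the color camera's frame via the depth-to-color extrinsics, then project it into the color image and normalize to a 0..1 UV. A self-contained sketch follows; all intrinsic and extrinsic values are illustrative placeholders, not calibration values from any real camera:

```python
# Hedged sketch: map one depth pixel to a normalized UV coordinate in the
# color image using the pinhole camera model. Every number below is an
# illustrative placeholder, not real calibration data.

def deproject(u, v, z, fx, fy, cx, cy):
    # Depth pixel (u, v) at depth z metres -> 3D point in depth-camera space.
    return ((u - cx) * z / fx, (v - cy) * z / fy, z)

def transform(p, rotation, translation):
    # Apply a row-major 3x3 rotation and a translation to point p
    # (the depth-to-color extrinsics).
    return tuple(
        sum(rotation[r][i] * p[i] for i in range(3)) + translation[r]
        for r in range(3)
    )

def project_to_uv(p, fx, fy, cx, cy, width, height):
    # 3D point in color-camera space -> normalized (0..1) texture coords.
    u = (p[0] * fx / p[2] + cx) / width
    v = (p[1] * fy / p[2] + cy) / height
    return (u, v)

# Identity extrinsics (depth and color co-located) just for demonstration.
R = ((1, 0, 0), (0, 1, 0), (0, 0, 1))
t = (0.0, 0.0, 0.0)

point = deproject(320, 240, 1.0, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
uv = project_to_uv(transform(point, R, t),
                   fx=615.0, fy=615.0, cx=320.0, cy=240.0,
                   width=640, height=480)
print(uv)  # the centre depth pixel maps to (0.5, 0.5) in the color image
```

This is what the wrapper's pointcloud/UV machinery computes per vertex; the Standard shader skips it entirely, which is why it cannot give pixel-accurate depth-to-color correspondence.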
Are you able to achieve depth-color alignment by adding an RsAlign component to your point cloud project, if you are not using one already, like in the recent RealSense Unity wrapper pointcloud align project in the link below?
#9263 (comment)
How do I stream a point cloud to Unity via a Python server that connects RealSense cameras? #11631