Hello. I have a few of your Indiemark lens encoders and I'm running Vive trackers for camera tracking in Unreal, so I'd really love to use Reality Field to take advantage of its jitter feature along with its "zero in relation to object" feature. However, I can't make sense of how it tracks objects in relation to each other. The base stations appear under the default 2D grid floor in Reality Field, and multiple Vive trackers don't sit in the same virtual positions relative to each other as they do in the real world. I also can't make sense of the manual XYZ offset system in Reality Field: when I change just one axis, the object's location changes on multiple axes.
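(One guess on my part, in case it helps narrow things down: if the manual offset is applied in the tracker's local, rotated frame rather than in world space, then a single-axis offset would move the object on multiple world axes whenever the tracker is rotated. A minimal sketch of that effect, with made-up numbers:)

```python
import math

def rotate_z(v, degrees):
    """Rotate a 3D vector about the Z (yaw) axis."""
    t = math.radians(degrees)
    x, y, z = v
    return (x * math.cos(t) - y * math.sin(t),
            x * math.sin(t) + y * math.cos(t),
            z)

# A +10 cm offset on the local X axis only...
local_offset = (10.0, 0.0, 0.0)

# ...applied to a tracker yawed 45 degrees ends up moving the
# object on BOTH world X and world Y, not just one axis.
world_offset = rotate_z(local_offset, 45.0)
print(world_offset)  # roughly (7.07, 7.07, 0.0)
```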
All of this leads me to believe there is a single underlying cause; I just haven't been able to find it. Please let me know what comes to mind.