Here's an observation that might shed some light on the issue... The statement that speed and gimbal angle don't get adjusted if the signal is lost suggests (to me) that those parameters are handled in "real time" by the Litchi software running on the mobile device or smart controller, not by the aircraft's processor executing the set of instructions uploaded to the aircraft before the mission starts. In other words, the adjustments needed to maintain gimbal angle (and at least some changes to over-the-ground speed) are being sent from Litchi on the phone/iPad/Smart Controller to the aircraft constantly during the mission. So the controller-side CPU is doing the gimbal-angle calculations and sending any needed speed adjustments, while the aircraft's CPU does the rest as it processes the commands Litchi uploaded at the start of the mission.
What I've noticed doesn't have so much to do with speed or gimbal angle. It has to do with what appears to be the rate of change in aircraft yaw needed to keep the current POI in the center of the frame.
I never noticed this as a problem when running Litchi on my P4 Pro and iPad combination. I've made some pretty tight moves in waypoint missions on that setup and they've all been rock solid. But I just tried Litchi for the first time on my Mavic 2 Pro with the Android-based Smart Controller. On that setup, it seems to me that if the "slew" rate is too high - meaning the aircraft needs to yaw at a high rate to keep the POI framed - the yaw gets hopelessly "behind", letting the POI drift out of frame.
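For a sense of scale, the yaw rate needed to keep a POI centered is pure geometry: flying a straight line past a POI at ground speed v, with perpendicular distance d at closest approach, the required yaw rate peaks at v/d. This is just a back-of-envelope sketch of mine (nothing Litchi-specific; the function and numbers are my own illustration):

```python
import math

def yaw_rate_deg_s(v, d, t):
    """Yaw rate (deg/s) needed to keep a POI centered while flying a
    straight line past it at speed v (m/s), with perpendicular distance
    d (m) at closest approach (t = 0).  Bearing to the POI is
    theta(t) = atan(v*t / d), so d(theta)/dt = v*d / (d^2 + (v*t)^2),
    which peaks at v/d rad/s at closest approach."""
    return math.degrees(v * d / (d**2 + (v * t)**2))

# Peak yaw rate (t = 0) for a few speed/distance combinations:
for v, d in [(10, 100), (10, 25), (15, 25)]:
    print(f"v={v} m/s, d={d} m -> {yaw_rate_deg_s(v, d, 0):.1f} deg/s peak")
```

Halving the distance to the POI doubles the peak yaw rate, so a "close and fast" mission demands far more slew than a "far and slow" one.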
I just uploaded a short video HERE of two nearly identical waypoint missions: one "close and fast", where the POIs drift from center-frame, and a second one a bit further away and slower, where the POIs stay centered. The P4/iPad would have done either of these missions flawlessly. My working theory is that either the Mavic 2's CPU or the Smart Controller's CPU is too under-powered to keep up when values are changing quickly, whereas on the Phantom/iPad setup, whichever CPU is responsible for maintaining yaw is fast enough to do the job.
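To illustrate the "falls behind" theory, here's a toy model I put together: an aircraft flies a straight line past a POI, yawing toward the POI each tick but capped at a maximum slew rate. The 20 deg/s cap and every other number are made up for illustration; I have no idea what the real controller's update rate or limits are:

```python
import math

def worst_tracking_error(speed, dist, max_rate_dps, dt=0.05, half_span=200.0):
    """Toy model of a rate-limited yaw controller.  The aircraft flies a
    straight line past a POI at perpendicular distance `dist` (m), and
    each tick it yaws toward the POI's bearing, but can slew no faster
    than `max_rate_dps`.  Returns the worst pointing error (deg) seen
    over the pass.  Purely illustrative -- not Litchi's actual logic."""
    x = -half_span                                # position along the track
    heading = math.degrees(math.atan2(dist, -x))  # start pointed at the POI
    worst = 0.0
    while x < half_span:
        target = math.degrees(math.atan2(dist, -x))  # bearing to the POI
        err = target - heading
        # Slew toward the target, capped at max_rate_dps:
        heading += max(-max_rate_dps * dt, min(max_rate_dps * dt, err))
        worst = max(worst, abs(target - heading))
        x += speed * dt
    return worst

# Same slew-rate cap; only the pass geometry changes:
print(f"far/slow pass  : {worst_tracking_error(5, 100, 20):.1f} deg worst error")
print(f"close/fast pass: {worst_tracking_error(15, 25, 20):.1f} deg worst error")
```

When the required yaw rate stays under the cap, the error is negligible; once it exceeds the cap near closest approach, the error balloons, which is exactly the "POI drifts out of frame" behavior I saw on the close/fast mission.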
One other bit of information that might be relevant: I was filming in 30 FPS, H.265, 10-bit DLOG-M on the Mavic. Perhaps if I had chosen a less demanding codec, the first mission in the video would have worked better? However, I'd assumed (hoped) that the new 10-bit video capability relies on a dedicated GPU for image processing and not the same CPU the aircraft uses to control flight. If that's not true, then that really sucks!
I'd welcome comments from anyone who has knowledge of Litchi's "workload assignment" (between controller/mobile device and aircraft) during a waypoint mission with POIs... or from anyone else who might have additional insight.
Thanks!
Tim