Feature Request - Plate solve feature and auto slew/re-plate solve till target center #45
Hi, this feature is already on my roadmap. However, I think that your Raspberry Pi Zero 2 might not be enough. The problem is that I can't run Astrometry/ASTAP on Android (it's unsupported, and hopefully you'll agree that nobody wants 4 GB of index files on their smartphone). Thus, plate solving must happen on the Raspberry Pi using the Astrometry INDI driver, which makes the process a bit complicated.
I honestly need some help with developing this feature. All this waiting and sending the picture back and forth is not easy to code, and it will also be pretty slow (there's nothing I can do about that, though). If you or someone you know wants to contribute, I'll be more than happy! Clear skies,
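For what it's worth, one way the capture/solve/slew loop could fit together is sketched below. This is only a sketch: `capture_image`, `solve`, `sync_mount` and `slew_to` are hypothetical placeholders for the INDI round-trips described above, and the 30-arcsecond tolerance and 5-iteration cap are arbitrary choices.

```python
import math

ARCSEC = 1.0 / 3600.0  # degrees per arcsecond

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation between two (RA, Dec) points, all in degrees."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    # Spherical law of cosines; accurate enough at the scales involved here.
    cos_sep = (math.sin(dec1) * math.sin(dec2)
               + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

def center_target(target_ra, target_dec, tolerance=30 * ARCSEC, max_iters=5):
    """Capture/solve/slew until the solved field center is within tolerance."""
    for _ in range(max_iters):
        image = capture_image()            # hypothetical: INDI CCD exposure
        ra, dec = solve(image)             # hypothetical: plate-solver round-trip
        if angular_sep_deg(ra, dec, target_ra, target_dec) <= tolerance:
            return True                    # target centered
        sync_mount(ra, dec)                # hypothetical: tell the mount where it is
        slew_to(target_ra, target_dec)     # hypothetical: re-slew to the target
    return False
```

The separation check is the only part that is pure math; everything else is the slow image shuttling discussed above.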
Thanks Marco for your detailed explanation. I've just tried plate solving with the ASTAP and Astrometry.net Linux apps on an Android TV box.
ASTAP's data occupies 2 GB of space, whilst Astrometry.net's full index files take 9 GB. On the Android TV box, plate solving with the ASTAP command line used about 25% of the CPU. I didn't measure on my Poco phone, which is more powerful. Plate solving took from 1.x seconds to within 2 minutes. Usually the FITS file contains key information, i.e., the estimated RA/Dec where the scope is pointing plus the image field of view (or the user can specify them), which helps speed up plate solving to within a few seconds.

I'm not qualified to assess the technical obstacles/difficulty of Android app development, but I can say that current mobile phones are powerful enough to do the plate solve. So, apart from converting the image to a JPEG file, our phone should also do the plate-solving part instead of sending the image back to the Pi. Telescope.Touch could then repeat the plate solving, delta computation and slew commands to the mount until the target is centered, all within today's powerful phone.

Actually, the Pi's job would just be to capture the image and, ideally, also convert it to a smaller compressed JPG/PNG file of a few hundred KB before sending it wirelessly to our phone. This would speed up the data transfer a lot.
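Seeding the solver with the header's RA/Dec estimate and field of view, as described above, could be wired up along these lines. Note the flag names are quoted from memory of the ASTAP CLI documentation (`-ra` in hours, `-spd` as declination + 90, `-fov` as field height in degrees) and should be double-checked against `astap -h` before relying on them:

```python
def astap_args(fits_path, ra_hours, dec_deg, fov_deg, radius_deg=10):
    """Build an ASTAP command line seeded with a pointing hint.

    Flag names are assumptions taken from memory of the ASTAP docs;
    verify them against `astap -h` before use.
    """
    return [
        "astap",
        "-f", fits_path,                 # image to solve
        "-ra", f"{ra_hours:.4f}",        # estimated RA (hours)
        "-spd", f"{dec_deg + 90:.4f}",   # south pole distance (degrees)
        "-fov", f"{fov_deg:.3f}",        # field-of-view height (degrees)
        "-r", str(radius_deg),           # search radius around the hint
    ]

# e.g. a scope pointed near M31 with a 1.5-degree field:
cmd = astap_args("capture.fits", ra_hours=0.712, dec_deg=41.27, fov_deg=1.5)
```

Giving the solver a hint and a small search radius is exactly what turns a blind multi-minute solve into a few-second one.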
Hi,
Sorry, it has already been mentioned... 😄
That's exactly what I don't want. 9 GB on a smartphone is a lot, unlike on PCs. I'm not just talking about available storage space, but also about user experience. Moreover, the Google Play Store doesn't allow me to package all that stuff.
This means that users would need to download the index files manually or copy them from a PC. I could surely do it, but I want this app to be idiot-proof. Mobile games like Asphalt 9 use something called "APK expansion files" to go beyond that limit, but those are limited to 2 GB. That's not enough to package ASTAP, a Linux virtual machine and the index files. Moreover, it's pretty hard to implement these expansion files.
I don't want to transform people's phones into Armbian bricks, nor can I directly (somehow) embed Armbian into my app. In fact, a lot of clever engineering has gone into Android Userland to make it work without root. And even if you already had Android Userland installed with ASTAP configured inside, Telescope.Touch wouldn't be able to access it. Your Linux VM in Userland is isolated, and only the user can type commands and read the output. My app simply can't open an Ubuntu command line in Userland and type stuff, unfortunately. There are a lot of security "walls" in place on Android, and one of them is the fact that apps are isolated. To give you an idea, interactions between apps (and even between different "activities" of the same app) are done using something called Intents.
Speed is, indeed, not a concern: these days, most smartphones are much faster than a Raspberry Pi. I meant that it would be slow because the image has to be sent back and forth (INDI CCD → app → INDI Astrometry). At the moment, I see no other option.
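To put rough numbers on that round-trip cost, here is a back-of-the-envelope estimate. The camera resolution and Wi-Fi throughput below are illustrative assumptions, not measurements:

```python
def transfer_seconds(width, height, bytes_per_px, mbit_per_s):
    """Time to move one raw frame over a link of the given throughput."""
    bits = width * height * bytes_per_px * 8
    return bits / (mbit_per_s * 1_000_000)

# Assumed 1280x960 16-bit frame over ~20 Mbit/s of real-world Wi-Fi throughput:
one_way = transfer_seconds(1280, 960, 2, 20)   # roughly one second per hop
# CCD -> app -> Astrometry driver means at least two image hops per iteration,
# so each solve attempt pays a couple of seconds in transfer alone.
round_trip = 2 * one_way
```

That overhead repeats on every centering iteration, which is why the back-and-forth adds up.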
The INDI CCD driver sends only…

I'm really sorry for the long reply, but I felt you should know all the software challenges that prevent me from implementing these features, and why I'm asking for help on this one. I'm just a 20-year-old electrical engineering student, not a senior Android engineer 😅😅
Yes, I'm talking about that exact INDI driver. It would solve the "running ASTAP inside Android" problem, but adds quite some complexity and a lot of work is needed in that direction.
Thank you! Appreciated!
I appreciate that you took your precious time to give me such a lengthy and informative explanation, Marco.
To use Telescope.Touch with my electronic finder (finder + guider camera), my finder must be pre-aligned (manually) with the main scope it is riding on.
Workflow:
0. Put my main scope at the home position.

I may attach my Raspberry Pi Zero 2 to the main scope. It will run only the INDI server with the guider-camera and mount drivers. Telescope.Touch will do the rest, including plate solving, on the Android device.
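The Pi-side step suggested earlier, shrinking each capture to a small JPEG before it crosses the Wi-Fi link, could be sketched as below. This assumes numpy and Pillow are available on the Pi; reading the actual FITS file (e.g. with astropy.io.fits) is left out, and the synthetic gradient frame is just a stand-in for real pixel data:

```python
import os
import numpy as np
from PIL import Image

def frame_to_jpeg(data, out_path, quality=85):
    """Stretch a raw 16-bit frame to 8 bits and save it as a compressed JPEG."""
    lo, hi = np.percentile(data, (0.5, 99.5))     # clip outliers before stretching
    stretched = np.clip((data - lo) / max(hi - lo, 1.0), 0.0, 1.0)
    img = Image.fromarray((stretched * 255).astype(np.uint8), mode="L")
    img.save(out_path, format="JPEG", quality=quality)

# Stand-in for FITS pixel data: a 1280x960 16-bit frame (~2.4 MB raw) made of
# a smooth gradient plus mild sensor-like noise.
rng = np.random.default_rng(0)
frame = (np.linspace(1000.0, 3000.0, 1280)[None, :]
         + rng.normal(0.0, 20.0, (960, 1280))).clip(0, 65535).astype(np.uint16)
frame_to_jpeg(frame, "preview.jpg")
jpeg_bytes = os.path.getsize("preview.jpg")       # far smaller than frame.nbytes
```

The exact savings depend on the image content and JPEG quality setting, but for typical sky frames the transfer drops from megabytes to a few hundred KB or less.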