I made SmoothTrack, a no-equipment head tracking app for iOS and Android that lets you control the game camera in sim games (like MSFS 2024, for example) with your head - basically like TrackIR, just without any equipment and for $15 instead of $150. I originally made the app just to save myself the cost of buying a TrackIR system, but then /r/flightsim begged me to release it as a full app.
Last month, I released SmoothTrack 2.0 which includes basic eye tracking and camera control gestures.
I remember building my own TrackIR with IR LEDs and a piece of floppy disk in front of an old webcam. That was more than a decade ago, but I would have assumed there's no more demand for this since VR headsets are a thing (I completely left gaming and everything about it since then).
Anyway, great work!
Do you know about vtubers? In case you don't, they are people that record or livestream themselves playing games and whatnot. Instead of using a regular face camera to put themselves onto the video feed, they use a 2D or 3D animated model.
Most people use a desktop webcam, which can do decent tracking, or an iPhone, which does really good tracking through ARKit, but there isn't really a decent solution on Android.
It could be a good new market opportunity for you on desktop, iPhone, or Android - but especially for Android users, since there aren't really any alternatives. There is a steady stream of new people getting into being a vtuber, and I think a $15 app might be an easy sell considering people can end up spending up to a five-digit amount getting custom character models commissioned. If you are able to improve the eye/face tracking past the basic level you mentioned the 2.0 version having, it would be even more appealing.
Thanks! Yeah, I've been asked about this a few times - however, it looks like basically exactly that (using ARKit and ARCore) already exists: https://denchisoft.com Have you heard of this tool?
VtubeStudio plus an iPhone for mocap is the standard for the big-time vtubers that use 2D avatars; however, 2D avatars actually have quite a high barrier to entry, because pretty much your only option is to have something commissioned. 3D avatars tend to be what nearly everyone starts out with, because there are several free programs out there that can help you make a decent starter avatar.
VtubeStudio only supports the nicer 2D avatars and, as far as I know, has no intention of getting into the 3D side of things. There are a few decent programs people use with 3D avatars (links below), but they aren't really as high quality as VtubeStudio, so none of them has the 3D market cornered the way VtubeStudio does for 2D.
As far as tracking goes, on the camera side of things there isn't really any difference between 2D and 3D that might limit you to one or the other.
There is a pretty large demand for tracking apps on Android because there are no widely used apps currently available. Big-time vtubers usually get iPhones so they can use the ARKit tracking that is Apple-only, but a lot of people just starting out have Android phones and are currently forced to use a regular PC webcam, which tends to be less accurate and also doesn't let them offload the computing resources needed for face tracking to their phones.
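Offloading the tracking to the phone usually just means the phone does the face-tracking math and streams the resulting pose to the PC over the local network. Here's a minimal sketch of the PC-side receiver, assuming a made-up "yaw,pitch,roll" text format over UDP - this is purely illustrative, not SmoothTrack's or VtubeStudio's actual wire protocol:

```python
import socket

def parse_pose(packet: bytes):
    """Parse a "yaw,pitch,roll" text packet into three floats.
    (Hypothetical example format, not any real app's protocol.)"""
    yaw, pitch, roll = (float(v) for v in packet.decode().split(","))
    return yaw, pitch, roll

def receive_pose(port=4242):
    """Wait for one UDP packet from a phone-side tracker on the
    local network and parse it into head angles."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("0.0.0.0", port))
        data, _addr = sock.recvfrom(1024)
    return parse_pose(data)
```

The point is that the PC only has to parse a tiny packet per frame, while the heavy computer-vision work happens on the phone's dedicated hardware.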
As a long time simmer, I'm buying this tonight after work, especially now that my Pixel 4a 5G is sitting on my desk, propping up the 9 Pro XL that replaced it last week.
Awesome! Hope you enjoy it -
I recommend turning the sensitivity down to start with; also, bind "toggle" in OpenTrack so you can turn it off when you don't need it.
Thanks! I was just playing with it, and for the FIRST TIME EVER I was able to fly a proper pattern without using an external view or the mouse to see where the airport/runway was. Wow. A whole new level of immersion, for $12. Money well spent.
How does this work without a VR headset (don't you just end up looking off-screen)? Are you moving your head far less than the camera moves on the screen?
> moving your head far less than the camera moves on the screen
Precisely this. You keep your eyes on the screen and just nudge your head in the direction you want. Your brain “gets” it real quickly and it feels very intuitive.
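The trick is just an amplified mapping from head angle to camera angle. A minimal sketch, assuming a simple linear gain (real trackers typically use adjustable curves, deadzones, and smoothing):

```python
def head_to_camera(head_yaw_deg, gain=6.0, max_camera_yaw=160.0):
    """Map a small physical head rotation to a much larger in-game
    camera rotation: with a gain of 6, a ~10 degree nudge of the
    head pans the camera ~60 degrees, so your eyes never have to
    leave the screen."""
    camera_yaw = head_yaw_deg * gain
    # Clamp so the camera can't rotate past its limit.
    return max(-max_camera_yaw, min(max_camera_yaw, camera_yaw))
```

The `gain` and `max_camera_yaw` values here are arbitrary examples; this is essentially what the sensitivity setting controls.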
https://smoothtrack.app