Help add iPad support to Tappity.

DoG
Moderator
Posts: 869
Joined: 2003.01
Post: #1
Tappity is a framework for remote controlling your app in the iPhone Simulator from the actual device, with multi-touch and everything.

Now, I've confirmed that it also works with the tablet simulator, in principle.

However, there's a catch: the touch events need to be transformed onto the bigger screen of the tablet, and I could use some help in figuring out how to do this properly.
Sage
Posts: 1,482
Joined: 2002.09
Post: #2
Any reason why it is more complicated than scaling up the coords? Sure it won't exactly be a good indication of usability or reachability, but what else can you do?

Scott Lembcke - Howling Moon Software
Author of Chipmunk Physics - A fast and simple rigid body physics library in C.
Moderator
Posts: 3,579
Joined: 2003.06
Post: #3
What I did with my iPhone engine is to have a game coordinate space, and whenever input is received, I scale the iPhone touch coords into that space. I did that originally for two reasons: 1) I needed to map my 2.5D scenes to something 2D, because the 3D part was using normalized coordinates, and 2) I needed to rearrange input coordinates between portrait and landscape.

It turned out to work really nicely. The bonus was that porting to iPad only required modifying two lines to scale my inputs for it.

But it is starting to look like there may be yet another bonus, and this is why I am posting this reply: I'm definitely not liking the performance of the iPad sim on my MacBook. What I've noticed is that if I only do the scene half sized (like about the size of the iPhone) then the performance is back to about how it is on the iPhone sim. SO... What I'm thinking I might do is try to set up a half-sized rendering environment in my engine for the iPad sim to get by with for now. Hopefully Apple will allow us to simulate at smaller resolutions, but I don't see anything but 50% or 100% right now, and in landscape it appears to force 100% on my display.

So what I'm thinking you might consider doing with Tappity is something similar by creating a Tappity coordinate space so it can be scaled as well. Obviously the client could scale the input themselves, but maybe it'd make sense to go ahead and optionally scale in Tappity too.
DoG
Moderator
Posts: 869
Joined: 2003.01
Post: #4
The way Tappity works is that it uses an undocumented API to record events, and then it sends these events to the app running in the simulator, where they are played back through another undocumented API.

The problem wasn't figuring out how to scale the positions, as that is fairly trivial, but that the touch events contain blobs of binary data that encode the touch positions.

In the meantime, I believe I've figured it out, and I've updated the source repo accordingly, if anyone wants to play with it. It's extra fun because you also get accelerometer and location data, which you normally can't play with in the simulator :)

Now I only gotta figure out how to fix the delay caused by streaming lots of touch-moved events :)