There is an option to show/hide the inputAccessoryView containing
special buttons that are present neither on the iOS system keyboard
nor on the Apple iPad Magic Keyboard.
Make sure the parent frame is resized properly depending on the
option setting and on whether a HW keyboard is connected.
The soft keyboard is enabled by calling showKeyboard, which makes
the inputView, a UITextField, become the first responder. This
triggers the software keyboard to pop up. The showKeyboard function
is called by the GUI when clicking into an editable box, by the
engine when keyboard input is required, or when the user forces the
keyboard to be shown by a gesture or by pressing the on-screen
control button.
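A minimal Swift sketch of that flow (the backend itself is written in
Objective-C++; the KeyboardHandler name is purely illustrative):

    import UIKit

    // A hidden UITextField acts as the inputView; making it the first
    // responder brings up the software keyboard.
    final class KeyboardHandler {
        private let inputView: UITextField

        init(parent: UIView) {
            inputView = UITextField(frame: .zero)   // zero-sized, never drawn
            inputView.autocorrectionType = .no
            parent.addSubview(inputView)
        }

        func showKeyboard() {
            inputView.becomeFirstResponder()        // triggers the soft keyboard
        }

        func hideKeyboard() {
            inputView.resignFirstResponder()
        }
    }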
The keyboard handling above suits the software keyboard but not HW
keyboards, since no key presses are registered unless the inputView
is the first responder. One would expect key presses on a HW
keyboard to be registered without enabling the soft keyboard.
Implement handling to enable the inputView when HW keyboards are
connected.
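One way to detect a connected hardware keyboard is the GameController
framework (iOS 14+); this Swift sketch is an assumption about the
approach, not the actual backend code:

    import UIKit
    import GameController

    // Make the inputView first responder as soon as a hardware keyboard is
    // present, so its key presses are delivered without the soft keyboard.
    func observeHardwareKeyboard(inputView: UITextField) -> NSObjectProtocol {
        if GCKeyboard.coalesced != nil {
            inputView.becomeFirstResponder()        // keyboard already connected
        }
        // Keep the returned token alive for as long as the observation is needed.
        return NotificationCenter.default.addObserver(forName: .GCKeyboardDidConnect,
                                                      object: nil, queue: .main) { _ in
            inputView.becomeFirstResponder()
        }
    }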
It was reported that in some games, e.g. COMI, if the iOS
application is suspended by putting it in the background and then
resumed some time later, the music doesn't start playing again. This
was reproduced in The Dig, and Full Throttle was most probably
affected as well.
The time it took for the music to play again after resuming turned
out to be related to the time the app spent in the background: the
longer the app stayed in the background, the longer it took for the
music to start playing again.
The legacy implementation of getMillis in the iOS7 backend accounted
for the time the application was suspended. That time was accumulated
every time the app moved to the background and was never reset. The
getMillis function subtracted the accumulated value from the
difference between the current time and the start time.
This seems to have caused problems in the timer handler, with timer
slots not being executed at the correct time. The Android backend
does not account for the time spent in the background.
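A minimal Swift sketch of the corrected behaviour (illustrative names;
the real getMillis lives in the Objective-C++ backend):

    import QuartzCore

    // Derive getMillis() from a monotonic start timestamp and do not subtract
    // the time spent in the background, matching the Android backend.
    final class TimeSource {
        private let startTime = CACurrentMediaTime()

        func getMillis() -> UInt32 {
            let elapsedMs = (CACurrentMediaTime() - startTime) * 1000.0
            return UInt32(truncatingIfNeeded: Int64(elapsedMs))
        }
    }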
The system gestures on screen edges took precedence over the tap
gestures again after screen rotations.
Call setNeedsUpdateOfScreenEdgesDeferringSystemGestures on
orientation changes to notify the system that it needs to read the
edges from the view controller's
preferredScreenEdgesDeferringSystemGestures property again.
Defer system gestures on all edges since the top edge changes when
rotating the device.
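In Swift this corresponds roughly to (hypothetical view controller
name; the backend code is Objective-C++):

    import UIKit

    class GameViewController: UIViewController {
        // preferredScreenEdgesDeferringSystemGestures is overridden separately
        // to return .all (see the next sketch).
        override func viewWillTransition(to size: CGSize,
                                         with coordinator: UIViewControllerTransitionCoordinator) {
            super.viewWillTransition(to: size, with: coordinator)
            // Ask the system to re-read the deferred edges after the rotation.
            setNeedsUpdateOfScreenEdgesDeferringSystemGestures()
        }
    }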
The tap gestures were cancelled because system gestures took over
when tapping on the screen edges. This made it hard to click on
verbs in games or on buttons in the launcher when in "Direct mouse
mode".
There is an option to defer the system gestures on selected edges.
This allows tap gestures to be recognised also on screen edges, and
it does not interfere with the gesture to close the application on
devices lacking a home button.
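Deferring the system gestures boils down to overriding one property on
the view controller; a Swift sketch (hypothetical class name):

    import UIKit

    class GameViewController: UIViewController {
        // Defer system gestures on all screen edges so a first tap near an edge
        // reaches the app's gesture recognizers instead of the system gesture.
        override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
            return .all
        }
    }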
The accessory view can be configured to be shown or not. When it is
configured to be shown, the screen size has to be adjusted according
to the height of the accessory view, also in the case where a HW
keyboard is connected.
It was noted that the screen size was not properly restored when
disabling the keyboard input; it was only adjusted when the keyboard
input became enabled. The reason was that when disabling the keyboard
input the accessory view also became hidden, so the screen size was
never updated.
Make sure to always reset the screen size when the accessory view
is configured to be visible.
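A Swift sketch of the intended sizing logic (function and parameter
names are illustrative, not the backend's API):

    import UIKit

    // Recompute the visible game area from the keyboard and accessory-bar
    // state, so the frame is also restored when keyboard input is disabled
    // while the accessory bar stays visible.
    func gameFrame(in view: UIView, keyboardHeight: CGFloat,
                   showAccessoryBar: Bool, accessoryBarHeight: CGFloat) -> CGRect {
        var frame = view.bounds
        if keyboardHeight > 0 {
            frame.size.height -= keyboardHeight          // soft keyboard visible
        } else if showAccessoryBar {
            frame.size.height -= accessoryBarHeight      // only the accessory bar
        }
        return frame
    }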
Connected hardware devices such as mice and touchpads also generate
gesture events. Holding down a mouse button is considered a long
press gesture, but it is also a mouse button event. These events
interfere with each other and cancel e.g. long press gestures.
Make sure the tap and long press gestures can only be triggered by
direct touches on the screen.
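In UIKit terms that means restricting the recognizers'
allowedTouchTypes; a Swift sketch:

    import UIKit

    // Restrict tap and long-press recognizers to direct screen touches so
    // pointer events from mice/touchpads no longer trigger or cancel them.
    func restrictToDirectTouches(_ recognizers: [UIGestureRecognizer]) {
        let direct = NSNumber(value: UITouch.TouchType.direct.rawValue)
        for recognizer in recognizers {
            recognizer.allowedTouchTypes = [direct]
        }
    }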
Add help sections to describe the different touch modes, touch
actions, how to add a game, etc.
A temporary version for tvOS with no images is provided as well.
The tap gestures for right/left click in the iOS port were not
protected by the TARGET_OS_IOS macro definition in the touchesMoved
method, which is called when a touch is moved.
Touches are used on tvOS when controlling the mouse pointer using
the touch area on the Apple TV remote.
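The guard itself is the TARGET_OS_IOS preprocessor macro in the
Objective-C++ sources; the Swift equivalent, shown only to illustrate
the structure, is os(iOS) conditional compilation:

    import UIKit

    class GameView: UIView {
        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesMoved(touches, with: event)
            #if os(iOS)
            // iOS only: bookkeeping related to the left/right click tap gestures.
            #endif
            // Common path: move the mouse pointer (also used on tvOS with the
            // touch area on the remote).
        }
    }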
Fix some compiler warnings about unused variables and functions in
the tvOS port.
After adding the other long-press gestures it was hard to trigger
the gesture that shows the keyboard by pressing the touch mode button
on the on-screen controls.
Changing the minimum trigger time from 1 second to the default 0.5
seconds fixed that. Add the one-touch-only requirement as well.
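A Swift sketch of the recognizer configuration (button class and
selector wiring are illustrative):

    import UIKit

    final class TouchModeButton: UIButton {
        func installKeyboardLongPress(target: Any, action: Selector) {
            let longPress = UILongPressGestureRecognizer(target: target, action: action)
            longPress.minimumPressDuration = 0.5   // the default; 1 s was too hard to trigger
            longPress.numberOfTouchesRequired = 1  // one-touch-only requirement
            addGestureRecognizer(longPress)
        }
    }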
The ability to change the touch mode between touchpad emulation and
"direct" mouse mode is kept. Inform the user about the mode change,
which is useful when the on-screen controls are disabled, and update
the on-screen control button to reflect the new mode.
The callback functions touchesBegan, touchesMoved and touchesEnded
are from now on only used for mouse pointer movements, so only one
touch needs to be tracked.
Disable multi-touch in the view to accomplish this. The setting does
not affect the gesture recognizers attached to the view; only the
touchesBegan, touchesMoved and touchesEnded callbacks are affected,
in that they only deliver the actions for the first touch in the
view.
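A Swift sketch of the view setup (hypothetical view and helper names):

    import UIKit

    final class SingleTouchView: UIView {
        override init(frame: CGRect) {
            super.init(frame: frame)
            // Only the first touch is reported to touchesBegan/Moved/Ended;
            // gesture recognizers attached to the view are unaffected.
            isMultipleTouchEnabled = false
        }

        required init?(coder: NSCoder) {
            super.init(coder: coder)
            isMultipleTouchEnabled = false
        }

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            guard let touch = touches.first else { return }
            moveMousePointer(to: touch.location(in: self))
        }

        private func moveMousePointer(to point: CGPoint) {
            // Forward the position as a mouse move event (omitted here).
        }
    }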
On a touch based device the touches must be translated to different
mouse events. For example, when a touch starts and is moved, that
should translate to a mouse pointer movement. A quick tap should be
translated to a mouse button click, while holding a touch for a
longer time without movement should be translated to keeping the
mouse button pressed.
Add UIGestureRecognizers to replace the current mouse button
handling, which is done entirely in the callback functions
touchesBegan, touchesMoved and touchesEnded.
- A tap gesture with one finger is a left mouse click.
- A tap gesture with two fingers is a right mouse click.
- A long press gesture with one finger holds the left mouse button
  down until the press is released. This is accomplished by sending
  a button down event when the gesture is recognized and changes
  state to UIGestureRecognizerStateBegan, and a button up event when
  the same gesture changes state to UIGestureRecognizerStateEnded.
- A long press with two fingers works as above but for the right
  mouse button.
This commit adds the gestures and their actions; the current mouse
button handling is removed in an upcoming commit.
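A Swift sketch of the four recognizers and their state handling (class
and handler names are illustrative):

    import UIKit

    final class MouseGestureHandler: NSObject {
        func install(on view: UIView) {
            let leftClick = UITapGestureRecognizer(target: self, action: #selector(tapped(_:)))
            leftClick.numberOfTouchesRequired = 1

            let rightClick = UITapGestureRecognizer(target: self, action: #selector(tapped(_:)))
            rightClick.numberOfTouchesRequired = 2

            let leftHold = UILongPressGestureRecognizer(target: self, action: #selector(held(_:)))
            leftHold.numberOfTouchesRequired = 1

            let rightHold = UILongPressGestureRecognizer(target: self, action: #selector(held(_:)))
            rightHold.numberOfTouchesRequired = 2

            for recognizer in [leftClick, rightClick, leftHold, rightHold] as [UIGestureRecognizer] {
                view.addGestureRecognizer(recognizer)
            }
        }

        @objc private func tapped(_ gesture: UITapGestureRecognizer) {
            // One touch -> left click, two touches -> right click (event sending omitted).
        }

        @objc private func held(_ gesture: UILongPressGestureRecognizer) {
            switch gesture.state {
            case .began:
                break   // send button-down (left for one touch, right for two)
            case .ended, .cancelled:
                break   // send the matching button-up
            default:
                break
            }
        }
    }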
The Apple guidelines for Apple TV state that when pressing the menu
button from the root view, the user shall be brought to the "Home"
screen, suspending the app. Prevent this from happening in iOS.
When a hardware keyboard is connected to the iOS device the software
keyboard is not shown. However, when using Stage Manager the
notification that the keyboard is shown is still triggered. As a
result the game screen is resized even though no keyboard is visible
in the area below.
This is a workaround: a check for a connected hardware keyboard is
made. If a keyboard is connected the screen is not resized, except
when the accessory bar should be shown; in that case the screen is
resized only by the height of the accessory bar.
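A Swift sketch of the check (hypothetical helper; GCKeyboard is
assumed here as the way to detect the hardware keyboard):

    import UIKit
    import GameController

    // Decide how much of the screen height to give up when the
    // keyboard-will-show notification fires.
    func effectiveKeyboardInset(notificationHeight: CGFloat,
                                showAccessoryBar: Bool,
                                accessoryBarHeight: CGFloat) -> CGFloat {
        if GCKeyboard.coalesced != nil {
            // Hardware keyboard connected: no software keyboard is actually shown.
            return showAccessoryBar ? accessoryBarHeight : 0
        }
        return notificationHeight
    }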
Some games have actions mapped to the numeric keys, e.g. the fight
scenes in Indiana Jones and the Last Crusade.
The alphabetical keyboard doesn't have the numeric button row, so
it's tricky to play these scenes while holding the "number button
switch" at the same time as pressing the correct number button being
shown.
Make it possible to switch between an alphabetical keyboard layout
and a numpad layout by adding a button to the toolbar.
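A Swift sketch of the toggle wired to the toolbar button (the helper
name is illustrative):

    import UIKit

    // Switch the hidden input text field between the alphabetical layout and
    // a numeric pad and apply the change while the keyboard is up.
    func toggleNumPad(for inputView: UITextField) {
        inputView.keyboardType = (inputView.keyboardType == .numberPad) ? .default : .numberPad
        inputView.reloadInputViews()
    }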
Add a long press gesture to the touch mode control button which,
when triggered, shows the keyboard. The image of the UI button
changes to the keyboard image asset as long as the keyboard is
visible. Pressing the touch mode control button while the keyboard
is visible dismisses the keyboard.
The update of the on-screen button image is done in the general
"showKeyboard" and "hideKeyboard" functions, which makes sure that
the button image is also updated when showing/hiding of the keyboard
is triggered by the OSystem callback.
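A Swift sketch of keeping the button image in sync inside the central
show/hide functions (names and assets are illustrative):

    import UIKit

    func showKeyboard(_ inputView: UITextField, button: UIButton) {
        inputView.becomeFirstResponder()
        button.setImage(UIImage(named: "keyboard"), for: .normal)   // hypothetical asset
    }

    func hideKeyboard(_ inputView: UITextField, button: UIButton, touchModeImage: UIImage?) {
        inputView.resignFirstResponder()
        button.setImage(touchModeImage, for: .normal)   // restore the touch mode image
    }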
Add two UI buttons placed in the top right corner over the main
view. The left button controls the current touch mode; when pressed,
the button changes image to represent the new touch mode using the
mouse and touchpad assets added in the previous commit. The right
button triggers the call to the main menu.
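A Swift sketch of pinning the two buttons to the top right corner
(layout only; button actions and assets omitted):

    import UIKit

    func addOnScreenControls(to view: UIView, touchModeButton: UIButton, menuButton: UIButton) {
        [touchModeButton, menuButton].forEach {
            $0.translatesAutoresizingMaskIntoConstraints = false
            view.addSubview($0)
        }
        NSLayoutConstraint.activate([
            menuButton.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor, constant: 8),
            menuButton.trailingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.trailingAnchor, constant: -8),
            touchModeButton.topAnchor.constraint(equalTo: menuButton.topAnchor),
            touchModeButton.trailingAnchor.constraint(equalTo: menuButton.leadingAnchor, constant: -8),
        ])
    }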
In the Android port it's possible to configure different touch modes
in ScummVM menus, 2D games and 3D games. Add the same possibility in
the iOS port. In Android it's also possible to configure a
touch-based game controller. That's not in scope for iOS in this
commit but can be added in the future.
The virtual controller can be configured with different directional
elements: a thumbstick or D-pad buttons (left, right, down, up).
A user might want to choose which directional element they get on
the virtual controller. Create two different virtual controllers
which can be switched between depending on the setting in the
backend-specific options dialog.
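With Apple's GCVirtualController (iOS 15+) the two variants can be
built from two configurations; a Swift sketch:

    import GameController

    @available(iOS 15.0, *)
    func makeVirtualController(useThumbstick: Bool) -> GCVirtualController {
        let config = GCVirtualController.Configuration()
        // Either a left thumbstick or a d-pad as the directional element.
        config.elements = useThumbstick ? [GCInputLeftThumbstick] : [GCInputDirectionPad]
        return GCVirtualController(configuration: config)
    }

Switching between the variants then amounts to disconnecting the
current controller and connecting the one built from the other
configuration when the option changes.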
The onscreen_controls option should refer to the touch mode and
main menu buttons, as is done in the Android backend. The virtual
gamepad controller used the onscreen_controls string to identify its
option. Change that identifier string to gamepad_controller so
onscreen_controls can be used for the buttons to be added to the iOS
backend.
With the release of iOS 17 the fix for adjusting the position of
the virtual controller stopped working. On some devices with
so-called safe areas, the action buttons could be placed outside the
screen.
Rework the fix to position the GCControllerView layer instead.
Calling setMouseCursor with a non-palette cursor no longer
clears the cursor palette, allowing a subsequent call with a
paletted cursor to re-use it. This fixes Trac defect #13895.
The approach taken was to just copy what the OpenGL
backend was doing in the same situation.
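The idea, sketched in Swift with hypothetical types (the real code
follows the OpenGL backend's C++ implementation):

    // Keep the stored cursor palette when the new cursor has no palette of
    // its own, so a later paletted cursor can re-use it.
    struct Cursor {
        var pixels: [UInt8]
        var palette: [UInt8]?    // nil for non-paletted cursors
    }

    final class CursorState {
        private(set) var cursorPalette: [UInt8] = []

        func setMouseCursor(_ cursor: Cursor) {
            if let palette = cursor.palette {
                cursorPalette = palette   // only overwrite when a palette is supplied
            }
            // Non-paletted cursor: leave cursorPalette untouched.
        }
    }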