When the screen dimensions change, e.g. on rotation of the device,
the graphics manager has to be informed of the new dimensions so it
can resize the surfaces.
To quickly redraw the entire screen, a Common::EVENT_SCREEN_CHANGED
event is passed to the event handler.
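As a rough sketch of what that amounts to (not the actual ios7 code,
the helper name is assumed), the backend only needs to queue the event
once the surfaces have been resized:

```cpp
#include "common/events.h"
#include "common/system.h"

// Hypothetical helper: queue a screen-changed event so the engine and
// GUI redraw everything after the surfaces have been resized.
static void notifyScreenChanged() {
	Common::Event event;
	event.type = Common::EVENT_SCREEN_CHANGED;
	g_system->getEventManager()->pushEvent(event);
}
```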
Previously the mouse position in the view was tracked using the
pointerPosition property. Scaling and relative mouse movements
were calculated in the view using screen properties stored in the
videoContext structure. When moving to the iOSGraphicsManager, all of
that handling is taken care of by the WindowedGraphicsManager, which
the iOSGraphicsManager inherits from.
Rework the input code to send down plain x and y position values,
scaled according to the view's content scale factor.
Remove code related to mouse movement that is no longer needed.
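For illustration, the scaling boils down to something like the
following (assumed helper, not the actual view code):

```cpp
// Convert a touch location from view points to pixel coordinates
// using the view's content scale factor before sending it down.
static void pointToPixels(float viewX, float viewY, float contentScaleFactor,
                          int &pixelX, int &pixelY) {
	pixelX = (int)(viewX * contentScaleFactor);
	pixelY = (int)(viewY * contentScaleFactor);
}
```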
Implement callbacks to set up the OpenGL context, destroy the context,
and get the scale factor and screen sizes. Implement rendering of the
graphics drawn by the iOSGraphicsManager.
This commit will enable graphics to be shown again. Screen rotation
and mouse movements are still to be adapted.
Remove all pure virtual functions in OSystem_iOS7 since they are
implemented by ModularGraphicsBackend.
This commit will break the graphics implementation in the ios7
backend and crash because no OpenGL context has been created for the
graphics manager to use.
The ios7 backend implements the graphics handling in the backend code.
iOS has supported OpenGL ES through the OpenGL ES framework since
iOS 2.0. It's marked as deprecated but is still shipped with the SDKs
for iPhoneOS and tvOS and will hopefully remain so for some time.
The ios7 backend can therefore utilize the OpenGLGraphicsManager to
handle all graphics.
Implement an iOSGraphicsManager class that can be used in the ios7
backend. The iOSGraphicsManager will require some callback functions
in the ios7 backend. createOpenGLContext() will be called to ask the
backend to create an OpenGL context in which the graphics manager can
draw. The function returns the ID of the renderbuffer which shall be
used when creating the framebuffer object (this differs on iOS
compared to other platforms). A custom RenderBufferTarget class is
added to address this.
destroyOpenGLContext() will be called to make sure that the old GLES
context is not reused. notifyContextDestroy() does call the function
OpenGLContext.reset() but that will not destroy the context.
refreshScreen() will be called to ask the backend to present the
drawn graphics on the screen. getSystemHiDPIScreenFactor() is called
to get the screen scaling factor. getScreenWidth() and
getScreenHeight() are called to get the width and height of the
surface to draw on.
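As a rough sketch, the callback surface described above could look
like this from the graphics manager's point of view (signatures are
assumptions, not the actual ios7 interface):

```cpp
// Hypothetical callback interface the ios7 backend might expose to
// the iOSGraphicsManager.
class IOSGraphicsCallbacks {
public:
	virtual ~IOSGraphicsCallbacks() {}

	// Create the GLES context; returns the ID of the renderbuffer the
	// framebuffer object must attach to, since iOS renders into a
	// layer-backed renderbuffer rather than the default framebuffer.
	virtual unsigned int createOpenGLContext() = 0;

	// Tear the context down so a stale context is never reused.
	virtual void destroyOpenGLContext() = 0;

	// Present the rendered frame on the screen.
	virtual void refreshScreen() = 0;

	// HiDPI scale factor and the size of the surface to draw on.
	virtual float getSystemHiDPIScreenFactor() const = 0;
	virtual int getScreenWidth() const = 0;
	virtual int getScreenHeight() const = 0;
};
```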
This commit adds the class, but the ios7 backend doesn't make use of
it quite yet. Using it requires the ios7 backend to become a child
class of ModularGraphicsBackend, which in turn requires a lot of
changes that will be addressed in separate commits.
Update the docportal and GitHub CI worker to only disable the
opengl_classic_game feature, since opengl and opengl_shaders are
required to compile the OpenGLGraphicsManager.
- Atari TT support
- all video and audio are now handled via XBIOS
- reworked IKBD handling using Kbdvbase vectors, esp. Kbdvec()
- video uses proper triple buffering
- arbitrary game screen size support
- many fixes and optimizations
Implement support for setting the mouse pointer speed in the settings.
The mouse pointer speed is applied to both mouse input and touch
input when in touchpad mode.
The delta values are in pixels at the native screen resolution and
need to be scaled down based on the game resolution. Store the
remainders and add them to the next deltas to mitigate "dead zones"
when making small movements.
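A minimal sketch of that delta handling, with assumed names rather
than the actual backend code:

```cpp
// Scale raw deltas (native screen pixels) down to the game resolution,
// apply the configured pointer speed, and carry the truncated fraction
// over to the next delta so small movements are not swallowed.
struct PointerDeltaScaler {
	float _remX = 0.0f, _remY = 0.0f;

	void scale(float rawDX, float rawDY, float resolutionScale,
	           float pointerSpeed, int &outDX, int &outDY) {
		float dx = rawDX * resolutionScale * pointerSpeed + _remX;
		float dy = rawDY * resolutionScale * pointerSpeed + _remY;
		outDX = (int)dx;
		outDY = (int)dy;
		_remX = dx - outDX;  // remainder added to the next delta
		_remY = dy - outDY;
	}
};
```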
Add input events that can be used by mouse devices, e.g. mice and
touchpads. These events send the raw input actions and don't care
about different controller modes such as click-and-drag.
Make the mouse controller utilize the new mouse input events.
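For illustration, such a raw event could carry no more than the
position and button state (hypothetical names, the real backend types
may differ):

```cpp
// A raw mouse event reports exactly what the device did and leaves
// mode handling (touchpad mode, click-and-drag, ...) to the consumer.
enum RawMouseEventType {
	kRawMouseMove,
	kRawMouseButtonDown,
	kRawMouseButtonUp
};

struct RawMouseEvent {
	RawMouseEventType type;
	int x, y;    // pointer position in pixels
	int button;  // which button, for the button events
};
```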
The current mouse events handle events created from both touch and
mouse input. The events have lots of logic to deal with gestures
and different modes (touchpad mode, click-and-drag, etc.) which are
not applicable to hardware inputs.
Rename the current "mouse events" to "touch events" to clarify which
input triggered an event. As this is the first commit in a multi-
commit change, the mouse input needs to use the "touch events" until
a new "mouse event" is implemented.
The Gecko USB serial adapter is no longer easy to obtain, and using
the Ethernet adapter instead is supported by devkitPro/libogc.
However, when this is enabled, linking fails when dynamic plugins are
enabled due to missing symbols for gdbstub_getoffsets() found in
devkitPro/libogc/libdb/debug_supp.c. To be exact, these are the
u8 __text_fstart[], __data_fstart[], __bss_fstart[] arrays which
should be implicitly provided by the linker(?)
A monolithic static build with this enabled does succeed, but has not
yet been tested.
- "fat" version uses repacked (zip -0) archives; also separate "data"
and "themes" folders
- "slim" version doesn't use any external themes (for speed reasons)
- consolidate public #define's (just ATARI)
- CPU compiler flags are specified in the script
- allow the move16, SV and SV Blitter features to be explicitly
  enabled/disabled
Provide two build scripts:
1. "Fat" one targeted at 040/060 machines (possibly with SuperVidel)
This one is optimized for 68020-60 (so it's still possible to try
highres engines on 68030 machines).
2. "Slim" one targeted at 030 machines (Falcon030+DFB/CT2 or TT030)
This one is optimized for 68030 and stripped of even more features:
"fancy" (highres) themes, move16 & SuperVidel routines and, most
importantly, the highres engines.
There is a race condition between iOS updating the safe area insets
and the view being updated on orientation changes. When rotating the
device, the callback function interfaceOrientationChanged is called.
This triggers rebuildSurface, which calls updateOutputSurface, which
in turn triggers initSurface. That function finally adjusts the main
frame to the safe areas by calling the function
adjustViewFrameForSafeArea.
But it seems that when adjustViewFrameForSafeArea is called, the safe
inset values have not been updated yet, which leads to wrong offset
values in the frame for touches and makes the mouse pointer
inaccurate.
The iOS system calls safeAreaInsetsDidChange whenever the safe inset
values are updated. Make sure to update the frame on these calls to
get correct touch offsets.