1. The active window is the one that the user touches or the one
that has input focus. We recognize that the user is touching a
window by the received accessibility hover events, and that the
user has stopped touching the screen by a call from the touch
explorer. It is possible that the user touches a window that
does not have input focus; as soon as he lifts his finger the
active window becomes the one with input focus, but we then get
the hover accessibility events from the touched window, which
incorrectly changes the active window back to the touched one.
Note that at this point the user is not touching the screen.
bug:7298484
Change-Id: Ife035a798a6e68133f9220eeeabdfcd35a431b56
1. We are showing a warning dialog if the user enables an accessibility
service that requests explore by touch. This dialog was shown only
for the owner but should be shown for the current user.
bug:7304437
Change-Id: I692b5112df16405e6d2e4890aafbfde79981f973
1. The active window is the one the user is touching or the one
that has input focus. It has to be made current immediately
after the user has stopped touching the screen because if the
user types with the IME he should get feedback for the letter
typed in the text view, which is in the input focused window.
Note that we always deliver hover accessibility events (they
are a result of the user touching the screen), so changing the
active window before all hover accessibility events from the
touched window have been delivered is fine.
bug:7296890
Change-Id: I1ae87c8419e2f19bd8eb68de084c7117c66894bc
1. Accessibility events for changes in the content of a given window, such as
click, focus, etc. are dispatched to clients only if they come from the
active window.
Events for changes in the state of a window, such as a window getting
input focus or a notification appearing, are always dispatched. The
notification events do not contain a source, so a client cannot introspect
the notification area (unless the user explicitly touches it, which
generates hover events). The events for a window getting input focus
change the active window, so they have to be dispatched.
Events that are a result of the user touching the screen, such as hover
enter, first touch, etc. are always dispatched.
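A minimal sketch of this dispatch policy, assuming a hypothetical helper in the
manager service (the method, its parameters, and the exact set of checked event
types are illustrative, not the actual implementation):

    private boolean shouldDispatchEvent(AccessibilityEvent event, int activeWindowId) {
        final int type = event.getEventType();
        // Events caused by the user touching the screen are always dispatched.
        if (type == AccessibilityEvent.TYPE_TOUCH_INTERACTION_START
                || type == AccessibilityEvent.TYPE_TOUCH_INTERACTION_END
                || type == AccessibilityEvent.TYPE_VIEW_HOVER_ENTER
                || type == AccessibilityEvent.TYPE_VIEW_HOVER_EXIT) {
            return true;
        }
        // Window state changes and notifications are always dispatched.
        if (type == AccessibilityEvent.TYPE_WINDOW_STATE_CHANGED
                || type == AccessibilityEvent.TYPE_NOTIFICATION_STATE_CHANGED) {
            return true;
        }
        // Content changes (click, focus, etc.) only come from the active window.
        return event.getWindowId() == activeWindowId;
    }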
bug:7282006
Change-Id: I96b79189f8571285175d9660a22394cc84f39559
1. The accessibility layer announces user switches. Even though
the initial switch to the owner on a single user device is a
valid user switch, we should not announce it for accessibility.
bug:7264693
Change-Id: Idf022fab6b74c84b7a96bc4ed7c7fee2b83029a6
1. A recently added check was preventing touch exploration from being
disabled when the last touch exploring service was turned off.
As a consequence, enabling explore by touch was initializing the
input filter with both the explore by touch feature and the not
yet disabled screen magnification feature.
bug:7256223
Change-Id: I9ed5457705d625805462e4d316b2c8a5af9aabca
1. If an accessibility service does not specify that it handles any
event types it is never added to the list of services even though
the system is bound to it. Since the service is not in the list
of enabled services we never unbind it, hence it consumes
resources without doing anything. This is also semantically
incorrect because a service may not want to receive events while
handling only gestures.
bug:5648345
Change-Id: Id478a4704cdeeb1729330f6ae4b8ff9e06320952
1. This change adds a global gesture for enabling accessibility.
To enable this gesture the user has to allow it from the
accessibility settings or use the setup wizard to enable
accessibility. When the global gesture is enabled the user
can long press on power to bring up the global actions dialog
and then hold with two fingers for a few seconds to enable
accessibility. The appropriate feedback is also provided.
2. The global gesture is writing directly into the settings for
the current user if performed when the keyguard is not on. If
the keyguard is on and the current user has no accessibility
enabled, the gesture will temporarily enable accessibility
for the current user, i.e. no settings are changed, to allow
the blind user to log into his account. As soon as a user
switch happens the new user's settings are inherited. If no
user switch happens after temporarily enabling accessibility,
the temporary changes will be undone when the keyguard goes
away and the device will work as expected by the current user.
bug:6171929
3. The initialization code for the owner was not executed due
to a redundant check, thus putting the accessibility layer in
an inconsistent state which breaks pretty much everything.
bug:7240414
Change-Id: Ie7d7aba80f5867b7f88d5893b848b53fb02a7537
IStatusBarService.collapseQuickSettings is gone;
collapseNotifications is now collapsePanels, which does what
collapse() used to do. Similarly,
IStatusBar.animateCollapseQuickSettings is now simply
IStatusBar.animateCollapse().
Bug: 7245229
Change-Id: Id157d2fdf34926d3c85ffa8b81c741a5359aede4
1. Added APIs for opening the quick settings to the StatusBarManagerService
and the local StatusBarManager. The new APIs are protected by the old
EXPAND_STATUS_BAR permission.
Renamed the expand* and collapse* non-public APIs that are expanding
the notifications to expandNotifications* and collapseNotifications* to
better convey what they do given that this change adds
expandQuickSettings* and collapseQuickSettings*.
Added a global action to the accessibility layer to expand the quick
settings which is calling into the new status bar manager APIs.
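Assuming the new global action is exposed to accessibility services through the
existing AccessibilityService.performGlobalAction() mechanism, a service could
open the quick settings roughly like this (sketch only):

    import android.accessibilityservice.AccessibilityService;
    import android.view.accessibility.AccessibilityEvent;

    public class QuickSettingsOpeningService extends AccessibilityService {
        void openQuickSettings() {
            // GLOBAL_ACTION_QUICK_SETTINGS is the service-side entry point; the
            // hidden StatusBarManager APIs are invoked by the system on its behalf.
            performGlobalAction(GLOBAL_ACTION_QUICK_SETTINGS);
        }

        @Override
        public void onAccessibilityEvent(AccessibilityEvent event) { }

        @Override
        public void onInterrupt() { }
    }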
bug:7030487
Change-Id: Ic7b46e1a132f1c0d71355f18e7c5a9a2424171c3
1. The initial user was set to USER_NULL but some clients were registering
before the user change callback happens. Since the initial user is
the owner, the current user id now defaults to USER_OWNER.
2. The check for global clients and window connections was using the
calling UID but there are processes that run on a per user basis
as the system UID (Settings, for example). Now the check is stronger
and compares the caller PID with that of the system process (see the
sketch after this list).
3. The code for finding the focused window id was not checking the
global window token list in addition to that of the current user.
4. The code updating the active window id was calling out into the
window manager with a lock held.
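The stronger check from item 2 amounts to comparing the calling PID with the
PID of the system process the service runs in, roughly (illustrative only):

    import android.os.Binder;
    import android.os.Process;

    // True only if the caller lives in the same (system) process as the service.
    static boolean isCallerSystemProcess() {
        return Binder.getCallingPid() == Process.myPid();
    }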
bug:7224670
Change-Id: I9f4b7ea67eb5598b30ee7d1b68a1d3ce0cf8cfb4
1. The active window for accessibility purposes is either the
window the user is touching or the window that has input focus. We
were using the touch exploration gesture end event to figure out
when the user stops touching the screen so we can set the active
window to the input focused one. However, we do not send such a
gesture end event if the user does not touch explore. If the user only
taps we do not consider this touch exploring. We now have dedicated
accessibility events for first and last touch and this change uses
them as a guide when to update the active window.
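Illustrative sketch of that policy (field and method names are made up, not the
real AccessibilityManagerService members):

    void onAccessibilityEventReceived(AccessibilityEvent event) {
        switch (event.getEventType()) {
            case AccessibilityEvent.TYPE_TOUCH_INTERACTION_START:
                // The user put a finger down; the touched window is active.
                mTouchInteractionInProgress = true;
                break;
            case AccessibilityEvent.TYPE_TOUCH_INTERACTION_END:
                // The user lifted the last finger; the active window becomes
                // the input focused one again.
                mTouchInteractionInProgress = false;
                mActiveWindowId = mInputFocusedWindowId;
                break;
            case AccessibilityEvent.TYPE_VIEW_HOVER_ENTER:
                if (mTouchInteractionInProgress) {
                    mActiveWindowId = event.getWindowId();
                }
                break;
        }
    }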
bug:6523219
Change-Id: I6262c0c5f408b02dbaa127664e4b426935d7f81f
1. Since adb is restarted on user switch it makes no sense to
try to reconnect the ui automation service since it will
be killed on a user switch.
We also disable touch exploration when the UI automation
service connects, since it can explicitly put the device in this
state if needed.
bug:6967373
Change-Id: I8cfde74f28f3f03d4ccf24746d43b8178ae2b5ef
1. This change converts the accessibility manager service to
maintain a state per user. When the user changes, the services
for the user that is going away are disconnected, the local
accessibility managers in the processes of this user are
disabled, the state is swapped with the new user's one, and
the new user state is refreshed (a rough sketch of this
per-user bookkeeping is given below).
This change updates all calls into the system to use their
user specific versions when applicable. For example, registering
content observers, package monitors, calls into other system
services, etc.
There are some components that are shared across users such
as UI created by the system process and the SystemUI package.
Such components are managed as a global state shared across
all users and are updated accordingly on a user switch. Since
the SystemUI is running in a normal app process this change
adds hidden APIs on the local window manager to allow the
SystemUI to notify the accessibility layer that it will run
across users.
Calls to AccessibilityManager's isEnabled(), isTouchExplorationEnabled()
and sendAccessibilityEvent() return false or are a no-op for a
background user since such a user should not send accessibility events
and should not perform touch exploration.
Update the internal accessibility tests due to changes in the
AccessibilityManager.
This change also fixes several issues that were encountered
such as calling out of the accessibility manager service with a
lock held.
Removed some incorrect debugging code from the TouchExplorer
that was leading to a system crash.
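A rough sketch of the per-user bookkeeping described above, with hypothetical
names (the real service keeps considerably more state per user):

    import android.util.SparseArray;

    class UserState {
        final int mUserId;
        // Bound services, enabled components, feature flags, etc. live here.
        UserState(int userId) {
            mUserId = userId;
        }
    }

    class UserStateHolder {
        private final SparseArray<UserState> mUserStates = new SparseArray<UserState>();
        private int mCurrentUserId;

        UserState getCurrentUserState() {
            UserState state = mUserStates.get(mCurrentUserId);
            if (state == null) {
                state = new UserState(mCurrentUserId);
                mUserStates.put(mCurrentUserId, state);
            }
            return state;
        }

        void switchUser(int newUserId) {
            // Disconnect the services of the user that is going away, swap in
            // the new user's state, and refresh it from that user's settings.
            mCurrentUserId = newUserId;
            getCurrentUserState();
        }
    }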
bug:6967373
Change-Id: I2cf32ffdee1d827a8197ae4ce717dc0ff798b259
1. Currently the system fires accessibility events to announce the
start and end of a touch exploration gesture. However, such a
gesture starts after we have decided that the user is not
performing a gesture, which is determined by measuring the speed of
movement over a threshold distance. This allows an accessibility
service to provide some feedback to the user so he knows that
he is touch exploring.
This change adds event types for the first and last touches
of the user. Note that the first touch does not coincide with
the start of a touch exploration gesture since we need a time
or distance to pass before we know whether the user explores
or gestures. However, it is very useful for an accessibility
service to know when the user starts to interact with the
touch screen so it can turn the speech off, to name one
compelling use case.
This change also provides event types for the start and end
of gesture detection. If the user has moved over the threshold
with a speed greater than X, then the system detects gestures.
It is useful for an accessibility service to know the beginning
and end of gesture detection so it can provide a given feedback
type for such a gesture, say haptic feedback or a sound that
differs from the one used for touch exploration.
The main benefit of announcing these new events is that an
accessibility service can provide feedback for each touch
state allowing the user to always know what he is doing.
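How a service might consume the new event types (sketch; the log calls stand in
for real feedback such as stopping speech or playing an earcon):

    import android.accessibilityservice.AccessibilityService;
    import android.util.Log;
    import android.view.accessibility.AccessibilityEvent;

    public class TouchStateFeedbackService extends AccessibilityService {
        private static final String TAG = "TouchStateFeedback";

        @Override
        public void onAccessibilityEvent(AccessibilityEvent event) {
            switch (event.getEventType()) {
                case AccessibilityEvent.TYPE_TOUCH_INTERACTION_START:
                    Log.i(TAG, "User started touching the screen - stop speech here.");
                    break;
                case AccessibilityEvent.TYPE_TOUCH_INTERACTION_END:
                    Log.i(TAG, "User stopped touching the screen.");
                    break;
                case AccessibilityEvent.TYPE_GESTURE_DETECTION_START:
                    Log.i(TAG, "Movement is now being interpreted as a gesture.");
                    break;
                case AccessibilityEvent.TYPE_GESTURE_DETECTION_END:
                    Log.i(TAG, "Gesture detection finished.");
                    break;
            }
        }

        @Override
        public void onInterrupt() { }
    }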
bug:7166935
Change-Id: I26270d774cc059cb921d6a4254bc0aab0530c1dd
1. This change enforces an accessibility service to require the system
defined BIND_ACCESSIBILITY_SERVICE permission.
bug:6507771
Change-Id: If5e16bb4fa97891be0ccbb35e343773712e33b98
This change is the initial check in of the screen magnification
feature. This feature enables magnification of the screen via
global gestures (assuming it has been enabled from settings)
to allow a low vision user to efficiently use an Android device.
Interaction model:
1. Triple tap toggles permanent screen magnification which magnifies
the area around the location of the triple tap. One can think of the
location of the triple tap as the center of the magnified viewport.
For example, a triple tap when not magnified would magnify the screen
and leave it in a magnified state. Triple tapping when magnified would
clear magnification and leave the screen in a not magnified state.
2. Triple tap and hold would magnify the screen if not magnified and enable
viewport dragging mode until the finger goes up. One can think of this
mode as a way to move the magnified viewport since the area around the
moving finger will be magnified to fit the screen. For example, if the
screen was not magnified and the user triple taps and holds the screen
would magnify and the viewport will follow the user's finger. When the
finger goes up the screen will zoom out. If the same user interaction
is performed when the screen is magnified, the viewport movement will
be the same but when the finger goes up the screen will stay magnified.
In other words, the initial magnified state is sticky.
3. Pinching with any number of additional fingers when viewport dragging
is enabled, i.e. the user triple tapped and holds, would adjust the
magnification scale which will become the current default magnification
scale. The next time the user magnifies the same magnification scale
would be used.
4. When in a permanent magnified state the user can use two or more fingers
to pan the viewport. Note that in this mode the content is panned as
opposed to the viewport dragging mode in which the viewport is moved.
5. When in a permanent magnified state the user can use three or more
fingers to change the magnification scale which will become the current
default magnification scale. The next time the user magnifies the same
magnification scale would be used.
6. The magnification scale will be persisted in settings and in the cloud.
Note: Since two fingers are used to pan the content in a permanently magnified
state no other two finger gestures in touch exploration or applications
will work unless the user zooms out to the normal state where all gestures
work as expected. This is an intentional tradeoff to allow efficient
panning since in a permanently magnified state this would be the dominant
action to be performed.
Design:
1. The window manager exposes APIs for setting accessibility transformation
which is a scale and offsets for the X and Y axes. The window manager queries
the window policy for which windows will not be magnified. For example,
the IME windows and the navigation bar are not magnified including windows
that are attached to them.
2. The accessibility features such as screen magnification and touch
exploration are now implemented as a sequence of transformations on the
event stream. The accessibility manager service may request each
of these features or both. The behavior of the features is not changed
based on the fact that another one is enabled.
3. The screen magnifier keeps a viewport of the content that is magnified
which is surrounded by a glow in a magnified state. Interactions outside
of the viewport are delegated directly to the application without
interpretation. For example, a triple tap on the letter 'a' of the IME
would type three letters instead of toggling magnified state. The viewport
is updated on screen rotation and on window transitions. For example,
when the IME pops up the viewport shrinks.
4. The glow around the viewport is implemented as a special type of window
that does not take input focus, cannot be touched, is laid out in the
screen coordinates with width and height matching those of the screen.
When the magnified region changes the root view of the window draws the
highlight but the size of the window does not change - unless a rotation
happens. All changes in the viewport size or showing or hiding it are
animated.
5. The viewport is encapsulated in a class that knows how to show,
hide, and resize the viewport - potentially animating that.
This class uses the new animation framework for animations.
6. The magnification is handled by a magnification controller that
keeps track of the current transformation applied to the screen
content and the desired one. If these two are not the same it is
the responsibility of the magnification controller to reconcile them,
potentially by animating the transition from one to the other (a rough
sketch follows after this list).
7. A display content observer watches for window transitions, screen
rotations, and when a rectangle on the screen has been requested. This
class is responsible for handling interesting state changes such
as changing the viewport bounds on IME pop up or screen rotation,
panning the content to make a requested rectangle visible on the
screen, etc.
8. To implement viewport updates the window manager was updated with APIs
to watch for window transitions and when a rectangle has been requested
on the screen. These APIs are protected by a signature level permission.
Also a parcelable and poolable window info class has been added with
APIs for getting the window info given the window token. This enables
getting some useful information about a window. These APIs are also
signature protected.
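A rough sketch of the reconciliation idea from design item 6, using the new
animation framework; class and method names are hypothetical and only the scale
(not the X/Y offsets) is shown:

    import android.animation.ValueAnimator;

    class MagnificationControllerSketch {
        private float mCurrentScale = 1.0f;

        void setScale(float desiredScale, boolean animate) {
            if (!animate) {
                mCurrentScale = desiredScale;
                applyToWindowManager(mCurrentScale);
                return;
            }
            // Reconcile the current and the desired transformation by
            // animating between them.
            ValueAnimator animator = ValueAnimator.ofFloat(mCurrentScale, desiredScale);
            animator.addUpdateListener(new ValueAnimator.AnimatorUpdateListener() {
                @Override
                public void onAnimationUpdate(ValueAnimator animation) {
                    mCurrentScale = (Float) animation.getAnimatedValue();
                    applyToWindowManager(mCurrentScale);
                }
            });
            animator.start();
        }

        private void applyToWindowManager(float scale) {
            // The real controller pushes the scale and the X/Y offsets to the
            // window manager's accessibility transformation APIs.
        }
    }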
bug:6795382
Change-Id: Iec93da8bf6376beebbd4f5167ab7723dc7d9bd00
1. The window manager was not notifying a window when the latter
has been moved. This was causing incorrect coordinates of the
nodes reported to accessibility services. To work around that
we have carried the correct window location when making a
call from the accessibility layer into a window. Now the
window manager notifies the window when it is moved and the
workaround is no longer needed. This change takes it out.
2. The left and right in the attach info were not updated properly
after a report that the window has moved.
3. The accessibility manager service was directly calling methods
on the window manager service without going through the interface
of the latter. This leads to unnecessary coupling and in the
long run increases system complexity and reduces maintainability.
bug:6623031
Change-Id: Iacb734b1bf337a47fad02c827ece45bb2f53a79d
1. There was a misspelled duplicate member in the accessibility service
class which was causing inconsistent behavior because one field was
updated and another checked.
2. When the set of services that can put the device in explore by touch
mode changes we were disconnecting and reconnecting all services
and this is not correct. Now only the state of explore by touch is
updated appropriately.
bug:6798860
Change-Id: Ib3c119cef8e71c3458d56e4ce6fbde2c2f750dcd
1. The touch explorer was notified about accessibility events from
a binder thread which was poking the internal state of the
latter, which by design is not thread safe. Since the touch
explorer is expected to be running only on the main thread
the accessibility manager service delivers the accessibility
events to the explorer on that thread.
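The gist of the fix, with illustrative names: events arriving on a binder thread
are re-posted to the main thread before the touch explorer state is touched.

    import android.os.Handler;
    import android.os.Looper;
    import android.view.accessibility.AccessibilityEvent;

    class MainThreadEventDelivery {
        private final Handler mMainHandler = new Handler(Looper.getMainLooper());

        void onAccessibilityEventFromBinderThread(final AccessibilityEvent event) {
            mMainHandler.post(new Runnable() {
                @Override
                public void run() {
                    handleEventOnMainThread(event);
                }
            });
        }

        private void handleEventOnMainThread(AccessibilityEvent event) {
            // Only here, on the main thread, is the TouchExplorer state mutated.
        }
    }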
bug:6635496
Change-Id: Ifdc5329e4be8e485d7f77f0fb472184494fa0d15
1. When typing into an auto completion edit field a list of completions pops up and if
the user touch explores the list and tries to double tap to select the touched
completion the latter is not selected.
The auto completion is a popup that does not take input focus and is overlaid on
top of the window that has input focus. The touch explorer was clicking on the
location of the accessibility focus if the last touch explored location is within
the bounds of the active window. In this case this was the window with the edit
text into which the user is typing. The check performed by the touch explorer
was missing the case when the last touch explored location was within the bounds
of the active window but was actually delivered to another overlaid window.
Now we click on the accessibility focus location only if the last explored
location is within the active window and was delivered to it.
bug:6629535
Change-Id: Ie66d5bb81ab021f2bb0414339b7de26d96826191
1. If the last touch explored location is within the active window we
used to click on the exact location if it is within the accessibility
focus, otherwise on the accessibility focus center. If the last touch
explored location is not within the active window we used to just
click there. This breaks in the case where one has touch explored
at a given place in the current window and now a dialog opens *not*
covering the touch explored location. If one uses swipes to move
accessibility focus i.e. to traverse the dialog without touching
it one cannot activate anything because the touch explorer is using
the last touch explored location that is outside of the active
window, e.g. the dialog.
The solution is to clear the last touch explored location when a
window opens or accessibility focus moves. If the last touch
explored location is null we click on the accessibility
focus location.
bug:6620911
2. There is a bug in the window manager that does not notify a
window that its location has changed (bug:6623031). This breaks
accessibility interaction with dialogs that take input because
when the IME is up the dialog is moved but not notified. As a
result the accessibility layer gets an incorrect location for the
accessibility focus and incorrect window bounds.
The solution is that when the accessibility manager service calls
into the remote process to obtain some accessibility node infos
it passes the window left and top, which it gets from the
window manager. These values are used to update the attach info
window left and top so all accessibility node infos emitted
from that window have correct bounds in screen coordinates.
bug:6620796
Change-Id: I18914f2095c55cfc826acf5277bd94b776bda0c8
1. The global action to open recent apps shows the old dialog style recent
apps panel. Apparently the key code to open recent apps is not opening the new
UI so the AccessibilityManagerService is directly calling the method on
the IStatusBarService to do so.
bug:6607664
Change-Id: I94c1963b07947776bf1c2448903b26f3603f9a59
1. Touch exploration gestures are demarcated by start and end
events. Due to a bug in the AccessibilityManagerService
the gesture end event was not dispatched. This caused the
AccessibilityNodeInfoCache to be out of sync since it relies
on getting such events not to mention that the clients were
not getting the end but only the start event. The issue
was that the notified service types variable was not reset
after every event so when the manager sends the last hover
exit it flags that the service type is already notified,
resulting in the following gesture end event being dropped
on the floor.
bug:6539306
Change-Id: I2b96bcecea3b2240199d67f01afa6a033afce1de
1. Now we are asking the user to grant permission to the service to enable
touch exploration only the first time this service is enabled. If the
service was uninstalled and then later installed we ask the user again.
This avoids the scenario in which rebooting the device or upgrading an
accessibility service leaves the device in a state with which the user
cannot interact.
bug:6582088
Change-Id: I51d24e4892b3b48c9fb11dfb09ec1118502ba526
1. If a UI test automation accessibility service is connected to the
system we postpone state updates in the AccessibilityManagerService
until the moment the UI automation service dies or is disconnected.
bug:6540522
Change-Id: I48ddf603b53d2158a00edcf8ad05cfe2575d4d75
1. We are passing the interrogating process id in the remote
accessibility requests to catch the query from the same
thread. While all other methods were doing this correctly
somehow the perform action call was using the incorrect process id.
bug:6534935
Change-Id: Icef50833903c562758d51ef316b60c53c7a336c0
1. The internal service instance created by AccessibilityManagerService
was getting the looper of the current thread when created. This works
for real accessibility services but since the UI automation service is
registered via an IPC call the binder thread has no looper. Now we explicitly
get the correct looper.
bug:6535435
Change-Id: I63a2ada1b65c4b3c71c3d1e6deb3dfdeb7a3d6d6
1. Now the user has to double tap to activate the last
item. If the last touched window is not active because
it does not take input focus the click is on the last
touch explored location. Otherwise the click is on the
accessibility focus location.
bug:5932640
Change-Id: Ibb7b97262a7c5f2f94abef429e02790fdc91a8dd
1. Every accessibility service targeting JellyBean or higher has
to request a special permission for the system to bind to it.
Change-Id: I6e579326bdf3597f148d6c67317455701ec8af68
1. Since the API version has been finalized this change
updates the SDK version checks to use the JellyBean
version number.
bug:5947249
Change-Id: Ie22fa7e18a7ea7b0c7077d80246a26c17f327ceb
1. The initial design was to have some accessibility gestures
being handled by the system if the gesture handling accessibility
service does not consume the gesture. However, we are not
sure what a good default is and once we add a default handler
we cannot remove it since people may rely on it. Thus, we
take the simplest approach and let the accessibility service
handle the gestures. If no gestures are handled the system
will work in explore by touch as before.
bug:5932640
Change-Id: I865a83549fa03b0141d27ce9713e9b7bb45a57b4
1. Scrolling actions are crucial for enabling a gesture based
traversal of the UI and specifically scrollable containers
especially lists and anything backed by an adapter. Since
accessibility focus can land only on attached views, it cannot
visit views for adapter items not shown on the screen.
Auto scrolling the list as a result of putting accessibility focus
on a list item does not work well since the user may get
trapped in a long list. Adding an accessibility node provider
to emit virtual views for one view before the first and one
after the last is complex and suffers the limitation of trapping
the user. Accessibility services need explicit scroll actions
which may be performed upon an explicit user action. Hence,
the user is informed about the start/end of the visible part of
the list and makes a deliberate choice to scroll. This will
also benefit people developing Braille devices since they can
scroll the content without telling the user to stop using the
Braille controller and take the device out of his pocket to scroll
and go back to the Braille controller.
NOTE: Without these actions large portions of the screen will be
hard to access since users will have to touch and explore to
find and scroll the list.
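A sketch of how a service could perform such a scroll upon an explicit user
command (the node would come from the service's own event or focus queries):

    import android.view.accessibility.AccessibilityNodeInfo;

    final class ScrollHelper {
        // Scrolls the given node forward; returns false if it cannot scroll.
        static boolean scrollForward(AccessibilityNodeInfo node) {
            return node != null
                    && node.isScrollable()
                    && node.performAction(AccessibilityNodeInfo.ACTION_SCROLL_FORWARD);
        }

        private ScrollHelper() { }
    }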
Change-Id: Iafcf54d4967893205872b3649025a4e347a299ed
1. Delegating activation gestures has several issues that we should
decide how to handle if possible before allowing an accessibility
service to take over them:
A) It requires that every view that can be clicked or long pressed
reacts to calls to performClick and performLongClick, which is not
necessarily true since the view may watch the touch
events and do its own click/long click detection. As a result it may
be possible that there are views a user cannot interact with in
touch exploration mode but can if not in that mode.
B) Clicking or long pressing on a different location in a view may yield
different results, for example NumberPicker. Ideally such views have
to implement AccessibilityNodeProvider, which handles correctly
the requests for click/long press on virtual nodes. Some apps however
just fire different hover accessibility events when the user is over
a specific semantic portion of the view but do not provide virtual
nodes. Hence, a user will not be able to interact with such semantic
regions but the system can achieve that by sending the click/long click
at the precise location in the view that was last touch explored.
2. Adding a flag on accessibility service info to request explore by touch
mode. There is no need to put the device in this mode if none of the currently
enabled accessibility services supports it. Now the problem is inverted and
the service has to explicitly state its capability (see the sketch below).
3. Fixing a bug where includeImportantViews was ignored for automation
services.
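A sketch of how a service could opt into explore by touch with the new flag,
assuming it matches the public FLAG_REQUEST_TOUCH_EXPLORATION_MODE constant (the
flag can also be declared in the service's meta-data XML):

    import android.accessibilityservice.AccessibilityService;
    import android.accessibilityservice.AccessibilityServiceInfo;
    import android.view.accessibility.AccessibilityEvent;

    public class TouchExploringService extends AccessibilityService {
        @Override
        protected void onServiceConnected() {
            AccessibilityServiceInfo info = new AccessibilityServiceInfo();
            info.eventTypes = AccessibilityEvent.TYPES_ALL_MASK;
            info.feedbackType = AccessibilityServiceInfo.FEEDBACK_SPOKEN;
            // Request that the device be put in explore by touch mode.
            info.flags |= AccessibilityServiceInfo.FLAG_REQUEST_TOUCH_EXPLORATION_MODE;
            setServiceInfo(info);
        }

        @Override
        public void onAccessibilityEvent(AccessibilityEvent event) { }

        @Override
        public void onInterrupt() { }
    }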
Change-Id: I3b29a19f24ab5e26ee29f974bbac2197614c9e2a
This problem was introduced in I74df9c24. The intention of the
change was to still let UiTestAutomationBridge see the
non-important views, but there were bugs in the implementation:
1. AccessibilityManagerService was not really updating
mIncludeNotImportantViews when mIsAutomation is true
2. The wrong constant was used to set the flag
Change-Id: Ia0a2e9ed9720bd0ea3a563e0b492e870a6ec1586
1. Since we are using a stateless proxy accessibility service to
perform default accessibility gesture handling it should not
operate against non-important views.
bug:6422069
Change-Id: I74df9c2415ab3b164d9ac5873f7004c0459e2bfa
1. Changed all references to granularity to movement
granularity. BTW, to be more precise it should be
text movement granularity.
bug:6435232
Change-Id: If6366b002ca3390f74918995b342baff2cbcfd01
1. The event of setting an accessibility focus on a view should not
make the host window the currently active one.
bug:6400648
Change-Id: Ib45c255f441c38489ee9d4ab5f284550ac5f6b01
1. The checks for action arguments are not needed since they
may cause trouble for developers if we add more args to
an action.
bug:6414006
Change-Id: Ia4212b52be183b1ef1cfd2561ce618cef2b015e4
1. The granularities for traversing the text content of an accessibility
node info are now predefined constants and custom ones will not be
supported. This is the simplest solution - we can always add namespaced
user defined ones (unlikely).
2. Added actions for traversing web content. These actions can be used by
an accessibility service to transparently drive the JavaScript based
screen reader that is used for handling web content (see the sketch
after this list).
3. Added a new accessibility event type for traversing the content of a
view. This event is needed to announce to the user what is the next
element, i.e. the one next to the cursor, after the view's text was
traversed.
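A sketch of how a service could use the new granularity constants and web
traversal actions (the names below are the public AccessibilityNodeInfo
constants; error handling is omitted):

    import android.os.Bundle;
    import android.view.accessibility.AccessibilityNodeInfo;

    final class TraversalHelper {
        // Move the text cursor one word forward within the node.
        static boolean nextWord(AccessibilityNodeInfo node) {
            Bundle args = new Bundle();
            args.putInt(AccessibilityNodeInfo.ACTION_ARGUMENT_MOVEMENT_GRANULARITY_INT,
                    AccessibilityNodeInfo.MOVEMENT_GRANULARITY_WORD);
            return node.performAction(
                    AccessibilityNodeInfo.ACTION_NEXT_AT_MOVEMENT_GRANULARITY, args);
        }

        // Step to the next HTML element of the given type in web content.
        static boolean nextHtmlElement(AccessibilityNodeInfo node, String element) {
            Bundle args = new Bundle();
            args.putString(AccessibilityNodeInfo.ACTION_ARGUMENT_HTML_ELEMENT_STRING, element);
            return node.performAction(AccessibilityNodeInfo.ACTION_NEXT_HTML_ELEMENT, args);
        }

        private TraversalHelper() { }
    }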
bug:5932640
bug:6389591
Change-Id: I144647da55bc4005c64f89865ef333af8359e145
1. The accessibility focus directions are not needed since an
accessibility service can just get the root, first child, next
sibling, or previous sibling and execute the action to
give a node accessibility focus. Now the accessibility node
info tree is properly ordered taking into account layout
manager directions for both layout managers that we report
and ones that we have determined as not important for
accessibility. Also the positions of node infos are ordered
properly based on their coordinates after all transformations
as opposed to child index.
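Sketch of that traversal from a service's point of view (null checks reduced to
a minimum):

    import android.view.accessibility.AccessibilityNodeInfo;

    final class FocusWalker {
        // Give accessibility focus to the first child of the given node, if any.
        static boolean focusFirstChild(AccessibilityNodeInfo node) {
            if (node == null || node.getChildCount() == 0) {
                return false;
            }
            AccessibilityNodeInfo child = node.getChild(0);
            return child != null
                    && child.performAction(AccessibilityNodeInfo.ACTION_ACCESSIBILITY_FOCUS);
        }

        private FocusWalker() { }
    }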
bug:5932640
Change-Id: I994a8297cb1e57c829ecbac73a937c2bcbe0bac7