Automatic Voice Commands
Enabling automatic voice commands
To enable automatic voice commands for your app, activate the Automatic voice commands flag under App Settings > Feature Flags (this requires an app deploy to take effect).
Overview of automatically registered voice commands
JourneyApps automatically generates voice commands for UI components to save developers time in building voice-controlled apps with a good end user experience. These commands follow the "say what you see" pattern.
It is useful to distinguish between globally available voice commands (e.g. scrolling and basic navigation) and dynamic commands that are only available within a specific context (e.g. commands available on a specific view, or within a specific dialog).
Globally available voice commands
These are general application commands that allow the user to interact with the app regardless of which UI components are present or activated.
Go Back; Navigate Back
Returns to the previous view or cancels the current action.
Scroll down, Page down
Scrolls down when the view extends beyond the RealWear device's display.
Scroll up, Page up
Scrolls up when the view extends beyond the RealWear device's display.
Show diagnostics
Takes the user to the built-in Diagnostics view.
Show messages
Takes the user to the built-in Messages view.
Open menu
Opens the app's top-right context menu (also referred to as kebab menu).
App help, Show App help
Displays an in-app help panel showing a summary of all current voice commands, based on the selected component and app view.
Contextually unique dynamic voice commands
When an app user is on a view, there are many dynamic voice commands that are available automatically. The specific voice commands will depend on the specific UI components on the view, their identifying labels, and the actions users can take with the UI components.
The available commands will change as the view changes, or as components (e.g. dropdowns) are selected.
For some components there are commands for selecting the component (e.g. selecting a text-input component to enter text) as well as clearing it (e.g. clearing its text). For other components, commands are generated to fire the action associated with "clicking" on individual items (e.g. for a list-item).
Developers are encouraged to follow the "say what you see" best practice with the default automatic voice commands. Input components and buttons alike are selected using their label attribute. That said, in some cases it might make sense to override the default automatic voice commands, for example if app users should be able to fire the action on a list-item using what is shown in its content.
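As a hedged illustration of such an override, the sketch below shows a list-item whose spoken command matches its content rather than a separate label. The voice-command attribute name here is hypothetical and used only to convey the idea; the content and action concepts come from the text above, and the exact override syntax should be confirmed against the component reference.

```xml
<!-- Hypothetical sketch: the "voice-command" attribute name and the
     action expression are illustrative, not confirmed JourneyApps syntax. -->
<list-item
    content="{job.title}"
    action="$:openJob(job)"
    voice-command="{job.title}"
/>
```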
Here are some more specific cases where contextually unique voice commands become available:
Visual in-app guidance when interacting with the app
When a voice command is recognized by RealWear, it is shown to the user. Additionally, JourneyApps highlights the UI component with which the user is interacting using a red indicator. In some cases, when a component is highlighted, other voice commands may become available to the user.
The code example below illustrates how to change the indicator color to green at the app level for the default light theme, in your app's config.json:
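A minimal sketch of the override, assuming the theme is customized via a nested themes section in config.json. The key names below (themes, light, voiceCommandIndicatorColor) are assumptions for illustration; consult the theming reference for the exact supported keys.

```json
{
  "themes": {
    "light": {
      "voiceCommandIndicatorColor": "#00C853"
    }
  }
}
```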
"App help" - In-app guidance for users on which voice commands are available
In general, the "say what you see" pattern should apply when it comes to identifying which voice commands are available in any context. However, if an app user is stuck, they can simply say "App help". This will open the help panel - a list of all available voice commands. The list of commands will refresh as soon as the user performs an action.