Automatic Voice Commands
Automatically registered voice commands are currently only available on RealWear® compatible devices running version 21.6.1 or later of the JourneyApps Container and version 4.84.0 or later of the JourneyApps Runtime.
To enable automatic voice commands for your app, activate the Automatic voice commands flag under App Settings > Feature Flags (this requires an app deploy to take effect).
JourneyApps automatically generates voice commands for UI components, saving developers time when building voice-controlled apps with a good end-user experience. These commands follow the "say what you see" pattern.
It is useful to distinguish between globally available voice commands (e.g. scrolling and basic navigation) and dynamic commands that are only available within a specific context (e.g. commands available on a specific view, or within a specific dialog).
Global voice commands are general application commands that allow the user to interact with the app regardless of which UI components are present or activated.
When an app user is on a view, many dynamic voice commands are available automatically. The specific commands depend on the UI components on the view, their identifying labels, and the actions users can take with those components.
The available commands will change as the view changes, or as components (e.g. dropdowns) are selected.
For some components there are commands for selecting the component (e.g. selecting a `text-input` component to enter text) as well as clearing it (e.g. clearing entered text). For other components, commands are generated to fire the action associated with "clicking" on individual items (e.g. for `list-item` components).
Developers are encouraged to follow the "say what you see" best practice with the default automatic voice commands. Input components and buttons alike are selected using their `label` attribute. That said, in some cases it might make sense to override the default automatic voice commands, for example if app users should be able to fire a `list-item` using the text shown in the item.
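As an illustrative sketch of the "say what you see" default, components like the following get their voice commands from their `label` attribute (the handler name and bound field below are hypothetical):

```xml
<!-- Saying "Start Inspection" fires this button's on-press action -->
<button label="Start Inspection" on-press="$:startInspection()" />

<!-- Saying "Notes" selects this input; a clearing command is also generated -->
<text-input label="Notes" bind="notes" />
```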
Here are some more specific cases where contextually unique voice commands become available:
- When capturing a photo with the `journey.photos` component, the commands are: "Take Picture", "Flash on", "Flash off", "Flash auto", "Flash torch", and "Done". To zoom, the commands "Zoom level 0" through "Zoom level 5" are available.
- When there is an app update available, the user will get a notification asking them to update now or later. The buttons have the corresponding labels registered as commands, i.e. "Update" and "Later".
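The photo-capture case above might look as follows in view XML; the `capture-photo` component name and the `photos` field are assumptions for illustration:

```xml
<!-- Assumes a data model field "photos" of type journey.photos -->
<!-- While the camera is open, "Take Picture", the "Flash ..." commands
     and "Zoom level 0" to "Zoom level 5" become available -->
<capture-photo label="Photos" bind="photos" />
```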
When a voice command is recognized by RealWear, it is shown to the user. Additionally, JourneyApps highlights the UI component with which the user is interacting using a red indicator. In some cases, when a component is highlighted, other voice commands may become available to the user.
Note that the color of the indicator can be configured for the entire app, a specific view, or a specific component using the `active-indicator-color` attribute (see the Component Styling and Configuration section for further information about the distinction between these levels).
For example, the indicator color can be changed to green for the entire app by setting `active-indicator-color` at the app level.
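The app-level configuration itself is not reproduced here; as a hedged sketch, the same attribute applied at view and component level in view XML could look like this (the view structure, color values, and handler name are assumptions):

```xml
<!-- View-level: every component on this view highlights green when active -->
<view title="Work Orders" active-indicator-color="green">
    <!-- Component-level override for a single component -->
    <button label="Done" on-press="$:done()" active-indicator-color="red" />
</view>
```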
In general, the "say what you see" pattern should apply when it comes to identifying which voice commands are available in any context. However, if an app user is stuck, they can simply say "App help". This will open the help panel - a list of all available voice commands. The list of commands will refresh as soon as the user performs an action.
- The term element is synonymous with component. To illustrate, the "First element" command as seen here in the help panel refers to the first UI component in the view.