Bixby Developer Studio includes a built-in device simulator that lets you test both natural language (NL) and Aligned NL queries on a variety of simulated Bixby-compatible devices. It lets you set the user's location and timezone, manage OAuth sessions, and apply learning models, as well as letting you launch the Debug Console to analyze the execution graph of the last-run query.
You can test your current in-progress version of your capsule or you can load a revision of your capsule for testing in the Simulator.
Launch the Simulator with the View > Open Simulator menu command, or by clicking the Simulator button at the top of the sidebar in the Files view.
The Simulator has its own set of application tabs running down its left side, similar to the main editor window. Like the main window, these affect the view in the Simulator's sidebar. From the top, going down:
The Input view lets you set the capsule and target device being tested and enter queries. Depending on the device, you can also enable or disable Bixby's hands-free mode.
The User view lets you configure a variety of data passed to the capsule, including profile information, overriding GPS and clock information, learning and execution behavior, capsule permissions, and open OAuth sessions.
The Settings view lets you choose which Bixby device to test against. This list changes depending on what targets are available for your capsule.
Note that the sidebar in the Simulator can be collapsed, either by clicking the "X" icon in the sidebar's top right corner or by clicking the application tab icon for the currently-open view. When the sidebar is collapsed, click any of the application tab icons to open it again.
To test a query, use the Input view.
Some `$vivContext` variables meant to pass user context information, including but not limited to `accessToken`, `deviceModel`, and `bixbyUserId`, are not sent when using the Simulator.
You can load a previous query and result screen by clicking the back arrow button. Click Reset to reset the Simulator and erase the query history stack.
If Bixby's NL model is out of date because you have edited models or code in your capsule that might affect the execution plans for your training, you can still enter an Aligned NL query or an intent. If you enter an NL query and the model needs to be compiled, the Run button will be disabled and a Compile button will appear in the Simulator window next to the capsule name:
Once a query is run, the Debug button will be enabled:
Click the Debug button to open the Debug Console with the current query preloaded.
The Device Simulator supports keyboard shortcuts for many of its commands and views. For a full list, refer to the Simulator Keyboard Shortcuts cheatsheet.
The Simulator supports both text-to-speech (TTS) and automatic speech recognition (ASR) for testing purposes. You can access these with the speech buttons in the lower right-hand corner of the input window:
To turn on text-to-speech, click the speaker icon, or type Ctrl+T (Windows/Linux) or Cmd+T (Mac). The icon is dimmed when TTS is disabled and bright when TTS is enabled. It is off by default. When TTS is on, Bixby's dialog will be spoken aloud.
To use automatic speech recognition, click and hold on the microphone icon or press and hold the spacebar while you are speaking into your computer's microphone. Release the button when you finish speaking. Your speech is transcribed in real-time into the input window, and the query is run when you release the mouse button or spacebar.
ASR requires access to your computer's microphone; your operating system might prompt you for this access. Also, you must have compiled an interpreter for the ASR feature to be available in the Simulator.
If any input control that accepts the spacebar has focus, such as the utterance input box or any of the buttons, you cannot trigger ASR with the spacebar. Either click in an area in the Simulator window with no controls, or use the microphone button.
Note that there are limitations to ASR and TTS.
Use the drop-down in the Settings view to select a device to simulate. The devices available are dependent on the targets your capsule declares in its `capsule.bxb` file.
In addition to selecting screen resolution, other elements of the Simulator can change based on the selected device. For instance, not all devices support keyboard input, although the simulator's input window will always let you both type and speak utterances. Some devices might have more limited input controls, such as TV remotes.
Here are some differences to be aware of between the Simulator and real devices:
The Simulator can be put in hands-free mode by checking the box labeled Hands-free in the input window. This works for all devices that support hands-free mode.
When you are simulating a TV target, you can use the keyboard shortcuts listed in the following table to simulate a TV remote:
Remote Button | Windows | Mac |
---|---|---|
Up | Ctrl + Alt + Up | Cmd + Opt + Up |
Down | Ctrl + Alt + Down | Cmd + Opt + Down |
Left | Ctrl + Alt + Left | Cmd + Opt + Left |
Right | Ctrl + Alt + Right | Cmd + Opt + Right |
Select | Ctrl + Alt + Return | Cmd + Opt + Return |
Back | Ctrl + Alt + Backspace | Cmd + Opt + Delete |
You can also use the mouse and click on buttons.
This section discusses how to test your capsule if you use the `bixby.audioPlayer` library or the `audio-control` component.
When testing a capsule that includes audio playback via the AudioPlayer library, the Simulator will show audio controls along with informational fields from the relevant `AudioInfo` and `AudioItem` concepts.
The buttons in the player are fully functional. From left to right, they control playback, including moving between `AudioItem` entries and repeating the whole playlist.

If you are testing a capsule that uses the `audio-control` component in a View, the Simulator will automatically include the controls for playing the audio and display any information specified in that component. These controls are fully usable in the Simulator, just as if a user were controlling the audio on a mobile device.
Additionally, below the simulated screen is a bar that shows the same media controls a user would see on their lock screen, or in the notifications view while swiping down from the top of their screen, while playing audio from a capsule with the `audio-control` component. This matches the media control keys of the client.
The `app-launch` key lets result views open external applications if they are supported on the client device. Because these applications are not available in the Simulator, when `app-launch` is executed in your capsule, the Simulator instead displays information about the attempted launch. If the capsule specifies an application name (with the `app-name` key), that name is also displayed.

As you enter NL and Aligned NL queries and respond to input views, the steps you take while interacting with the capsule are recorded in the Step History along the right-hand side of the Simulator window.
The User view lets you change data returned to your capsule about the user by simulating different locations and time zones. This view also lets you control learning and execution settings.
When you create a dataset, it will be placed in the `user-mocks/` folder.
Add a new mock dataset file, or edit an existing one, with the New File command, in either the File menu or on Bixby Studio's toolbar. You can also right-click in the file tree and select New > Other File…. Choose the Mock file type and the appropriate file subtype.
If you open the Profile Data section of the Simulator after the `user-mocks/` directory has been created, the options will be replaced with a button that opens the directory in Bixby Studio.
Mock data files are in JSON format, either a single object or an array of objects. Unlike standard JSON, mock data files can include JavaScript-style comments. This can be useful to comment out mock data you wish to temporarily deactivate.

- Single-line comments begin with `//`.
- Block comments begin with `/*` and end with `*/`.
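For instance, a mock data file with comments might look like the following sketch (the field names are placeholders, not a required schema):

```
[
  // active entry
  { "name": "Example A" },
  /* temporarily deactivated entry:
  { "name": "Example B" },
  */
  { "name": "Example C" }
]
```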
If your capsule defines a `client-endpoint`, the Bixby server can use it to request client-side data that isn't present on a simulated device. You can handle these cases in the Simulator by preparing mock datasets. These sets are stored as JSON files and are named after a corresponding action model in your capsule.
To add a dataset, use the New File command in Bixby Studio, selecting the Mock file type and the Client Function Call subtype. In the Action Name menu that appears, select the action the dataset should be sent in response to; the file will be given the appropriate name.
The editor will open the dataset's JSON file and allow you to enter your mocked data. The data should be a representation of the concept the action returns. For example, the `FindContact` action in your capsule might output a `Contact` concept.
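In that case, the mocked dataset could be a JSON representation of that concept. The fields below are purely illustrative; use the structure your capsule's `Contact` concept actually defines:

```json
[
  {
    "name": "Jane Doe",
    "phoneNumber": "+1-555-0100"
  }
]
```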
You can open, disable/enable, or delete a dataset by right-clicking on its name.
For client endpoints that specify companion applications with `client-app-support`, you can create the special Installed Apps dataset to specify a list of application IDs and versions to mock this behavior in the Simulator.
The user interface for mocking installed apps is identical to the one for mocking datasets. Create it with the New File command, the Mock file type, and the Installed Apps subtype. The editor will open with a file named `apps.json`, allowing you to specify a list of simulated apps. This file is a list of applications, specified by their Android `appId` and `appVersion` values.
```
[
  // music app
  {
    "appId": "com.player.music",
    "appVersion": "49482743"
  },
  // another app
  {
    "appId": "com.another.app",
    "appVersion": "16737289"
  }
]
```
The list of mock installed apps is simulator-wide, not specific to individual capsules or simulated devices.
If you need to mock device context information available in `$deviceContext` in the Simulator, create it with the New File command, the Mock file type, and the Device Context subtype. This will create and open a `device-context.json` file. Add the device information your capsule needs to the JSON object in the file.
For example, if you need to mock the `$deviceContext.$user.is24HourFormat` property, you can add this information in the `device-context.json` file:
```json
{
  "$user": {
    "is24HourFormat": true
  }
}
```
The information available through the `$deviceContext` variable on a real device depends on the client. For specific client information, see the Device Context Mobile Wiki page.
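As an illustration of a capsule consuming this flag, a JavaScript function might branch on `is24HourFormat`. The following is a hypothetical sketch, not a Bixby API:

```javascript
// Hypothetical sketch: format an hour of the day according to the
// user's clock preference, defaulting to 12-hour format when the
// $deviceContext flag is absent.
function formatHour(deviceContext, hour24) {
  const is24 = deviceContext?.$user?.is24HourFormat ?? false;
  if (is24) return `${hour24}:00`;
  const h = hour24 % 12 === 0 ? 12 : hour24 % 12;
  return `${h}:00 ${hour24 < 12 ? "AM" : "PM"}`;
}

console.log(formatHour({ $user: { is24HourFormat: true } }, 14)); // "14:00"
console.log(formatHour({}, 14)); // "2:00 PM"
```

With the mock `device-context.json` shown above, the Simulator would pass `is24HourFormat: true` to the capsule, exercising the 24-hour branch.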
Some user context information available in `$vivContext` can be mocked in the Simulator using the `device-metadata.json` file. Create it with the New File command, the Mock file type, and the Device Metadata subtype. This will open a template with the available settings:
```json
{
  "storeCountry": "US",
  "deviceModel": "SM-G965N",
  "is24HourFormat": false
}
```
If you need to mock application-specific context information in the Simulator, you can add an `app-context.json` file by creating it with the New File command, the Mock file type, and the App Context subtype. This will create and open an `app-context.json` file with a simple skeleton template:
```json
{
  "appId": "example.app.id",
  "appVersion": "1",
  "context": {}
}
```
Change the `appId` and `appVersion` keys to match those of an app in the installed apps file, and add the information to be returned in the `context` key.
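For example, a filled-in `app-context.json` might look like this (the contents of `context` are hypothetical and depend entirely on what your capsule expects to read):

```
{
  "appId": "com.player.music",
  "appVersion": "49482743",
  // hypothetical app-specific state returned to the capsule
  "context": {
    "nowPlaying": "Example Track"
  }
}
```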
If you need to mock PDSS data for testing your capsule's training when combined with dynamic vocabulary supplied by PDSS, you can add a `pdss-data-mocks.json` file by creating it with the New File command, the Mock file type, and the PDSS Data Mocks subtype. This will create and open a `pdss-data-mocks.json` file with a skeleton template that has fields for common PDSS data types.
To use this file, edit it to include the data that your capsule needs for testing. For instance, if you add "gas valve" to the `iot.device` list, then when Bixby processes the user utterance "turn on gas valve", "gas valve" will be identified as a `DeviceName`.
Use this section to change the location and date/time reported to the capsule in the Simulator.
Under Location, enter a specific latitude and longitude, or enter a city in the Location field. Recently-used cities are remembered in the list. You can also select "My Current Location" to have the Simulator geolocate you, or "No GPS Available" to test your capsule's behavior on a device where GPS does not exist or is turned off.
Under Time, specify the time zone the Simulator reports to the capsule, as well as the time and date. If you have specified a custom time and date, click Clear to return to reporting the current time.
If you are using JavaScript Runtime Version 2, your capsule will need to check the `$vivContext.testToday` variable to receive the date/time set in the Simulator, given in milliseconds since January 1, 1970. This variable is `undefined` when the capsule is not running in the Simulator or the clock is not being overridden.
```javascript
// Set myDate to the real current time, or to the override set in the Simulator
const myDate = new Date($vivContext.testToday ?? Date.now())
```
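In capsule JavaScript, this pattern can be wrapped in a small helper so that all date logic honors the Simulator's clock override. The following is a sketch; `resolveNow` is a hypothetical helper name, not part of the Bixby API:

```javascript
// Hypothetical helper: resolve "now", honoring the Simulator's
// clock override when $vivContext.testToday is set.
function resolveNow(vivContext) {
  // testToday is milliseconds since the Unix epoch, or undefined
  // outside the Simulator (or when the clock is not overridden).
  const millis = vivContext.testToday ?? Date.now();
  return new Date(millis);
}

// With an override set in the Simulator:
const overridden = resolveNow({ testToday: 0 });
console.log(overridden.toISOString()); // 1970-01-01T00:00:00.000Z

// Without an override, this falls back to the real current time:
const real = resolveNow({});
```

Note that `??` (nullish coalescing) is used rather than `||` so that an override of `0` milliseconds is still respected.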
This section has the Deterministic Mode checkbox, which affects execution, and allows you to see bootstrapped preferences.
The Deterministic Mode checkbox controls several aspects of how the capsule behaves at execution. When it is enabled:

- Dialog templates behave as if `choose (First)` is specified and ignore `choose (Random)`. If Deterministic Mode is disabled, Dialog templates that use `choose (Random)` behave normally. For more information, see the `choose` reference.
- When prompting for a `DateTime`, the `Date` and `Time` are automatically initialized to the conversation start time.

You can also view bootstrapped preferences here. These are a subset of learned preferences whose values are specified in the capsule to accelerate or bypass the learning process for those values.
Current OAuth sessions are displayed here, if any exist. You can clear existing sessions from this tab.
If the simulated capsule has requested permissions from the user, such as access to location information or library-specific permissions, those permissions will appear here, and will be checked if access has been granted by the user during simulation. You can clear (and set) requested permissions from this tab. Just like on real devices, permission settings are remembered across invocations in the simulator.
Like an actual device, the Simulator will remember the user's response to the initial allow or deny prompt for a given permission, and will not request access again if it was initially denied. To reset the prompting behavior so the capsule again prompts you to allow or deny access, toggle the checkbox for that permission on this screen: check the box and uncheck it again.
If you update your capsule's code to add or remove a requested permission, clicking the refresh icon in the Simulator will update the panel to reflect the code's current state.
Settings lets you specify the device being simulated and switch between UI themes.
When you change devices, the primary effect in the Simulator is changing the display resolution. If you choose a device that does not have a display, hands-free mode will automatically be enabled.
On mobile device targets, three UI themes are available:
When the Dark theme is selected, images specified with `dark-theme` will be used; when the Light theme is selected, images specified with `light-theme` will be used. When the Default theme is selected, or when an image for the currently selected theme is not specified, the image specified in the parent `url` key is used.
If you're seeing issues in the Simulator, such as compiling issues or getting unexpected results, check that your capsule is properly synced in the workspace. Then try switching or re-selecting the target and recompiling the interpreter.
If you are still experiencing issues, you can file a report with the Support Team.