Advanced Features
The following describes advanced features related to AI speech.
You can set or change gestures, language, custom voice, speed, and so on while the AI is in the IDLE state.
Gestures
Use an AIClipSet to send utterance commands to the AI. An AIClipSet represents a series of AI action units. There are three types of ClipSet: general speech, speech with gesture, and gesture only. Gestures are available only if the AI model supports them; the list of available gestures can be retrieved with the GetGestures
function of AIPlayer. A model that does not support gestures can still be operated using a ClipSet.
AIClipSet types are as follows.
- CLIP_SPEECH: Clip only for speech without gestures
- CLIP_GESTURE: Gesture only Clip
- CLIP_SPEECH_GESTURE: Clip for speech with gestures
In the sample screenshot below, an AI model speaks while waving its hand with a "hi" gesture.
using System.Collections.Generic;
using AIHuman.Model;
using AIHuman.Core;
using AIHuman.View;
...
private List<AIGesture> _gestures;
...
// Retrieve the gestures supported by the current AI model.
_gestures = _aiPlayer.GetGestures();
...
// Build a speech-with-gesture clip and send it to the AI.
AIGesture gesture = _gestures[index];
AIClipSet clip = AIAPI.CreateClipSet("nice to meet you.", gesture.Name);
_aiPlayer.Send(new[] {clip});
Monitoring callbacks of gesture actions
AIPlayerCallback.OnAIPlayerEvent(AIEvent) is called in the same way as for speech actions. The AIEvent's type value indicates the state, as listed below. The AIEvent.ClipSet object exposes Type, GestureName, and SpeechText, so you can tell whether the event relates to a gesture or a plain utterance.
- AICLIPSET_PLAY_PREPARE_STARTED
- AICLIPSET_PLAY_PREPARE_COMPLETED
- AICLIPSET_PLAY_STARTED
- AICLIPSET_PLAY_COMPLETED
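As a minimal sketch, a handler for these events might look like the following. Note this is illustrative only: the exact placement of the event constants (here assumed to be `AIEvent.Type` members) and the `AIClipSet.ClipType` enum name are assumptions, not confirmed SDK signatures.

```csharp
// Sketch of a playback-event handler; property and enum names follow the
// event names and ClipSet fields described above, but are assumptions.
public void OnAIPlayerEvent(AIEvent aiEvent)
{
    switch (aiEvent.Type)
    {
        case AIEvent.Type.AICLIPSET_PLAY_STARTED:
            // ClipSet tells us whether this clip is a gesture or an utterance.
            if (aiEvent.ClipSet.Type == AIClipSet.ClipType.CLIP_GESTURE)
                Console.WriteLine($"Gesture started: {aiEvent.ClipSet.GestureName}");
            else
                Console.WriteLine($"Speech started: {aiEvent.ClipSet.SpeechText}");
            break;
        case AIEvent.Type.AICLIPSET_PLAY_COMPLETED:
            Console.WriteLine("Clip playback completed.");
            break;
    }
}
```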
Change the Voice or Language
Some AIs can speak with voices other than their default voice. This is possible even if the language of the voice differs from the AI's default language. You can check which voices the AI can currently use. The candidate voice list is populated during the authentication process, so call AIAPI.LoadCustomVoices only after AIHumanSDKManager.Authenticate or AIAPI.Authenticate has completed.
Set the custom voice using AIPlayer's method
First, the list of languages that AI can currently speak can be checked through the following method.
List<string> languages = AIHumanSDKManager.Instance.GetSpeakableLanguages(_aiPlayer.AIGender);
Next, the voice list for the corresponding language and gender can be checked with the following method. A CustomVoice has ID, Name, LanguageCode, and Gender properties.
List<CustomVoice> customVoices = AIHumanSDKManager.Instance.GetCustomVoices();
If you know the ID of the desired voice, you can find it with the following method, which returns null if no match is found. Here, voiceId is the ID of a CustomVoice object (available via the customVoice.ID property).
CustomVoice myVoice = AIHumanSDKManager.Instance.FindCustomVoice(voiceId);
To set the desired voice directly on the AIPlayer, use the following; passing null resets the AI to its default voice. It returns true on success.
List<CustomVoice> customVoices = AIHumanSDKManager.Instance.GetCustomVoices();
CustomVoice myVoice = customVoices[0];
bool succeeded = _aiPlayer.SetCustomVoice(myVoice);
Instead of using a CustomVoice object directly, you can set the custom voice by language and gender. In this case, the first CustomVoice in the filtered list is used; if none is available, the voice is set to the default voice.
bool succeeded = _aiPlayer.SetCustomVoiceForLanguage("en-US", "MALE");
You can check the current CustomVoice with the following method. It returns null if no custom voice is set or the default voice is in use.
CustomVoice customVoice = _aiPlayer.GetCustomVoice();
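The steps above can be combined into a single flow. This is a sketch only, using `_aiPlayer` and the SDK calls shown in the snippets above; the `"en-US"`/`"MALE"` values are illustrative.

```csharp
// Prefer an English male voice if the AI can speak it; otherwise keep the default.
List<string> languages =
    AIHumanSDKManager.Instance.GetSpeakableLanguages(_aiPlayer.AIGender);
if (languages.Contains("en-US"))
{
    if (!_aiPlayer.SetCustomVoiceForLanguage("en-US", "MALE"))
    {
        // Setting failed; fall back to the default voice explicitly.
        _aiPlayer.SetCustomVoice(null);
    }
}
// Returns null when the default voice is in use.
CustomVoice current = _aiPlayer.GetCustomVoice();
```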