V-Blaze and V-Cloud Online Help

Emotional Intelligence Data in V‑Spark

V‑Spark performs additional processing of the JSON data produced by the transcription servers. When JSON data is downloaded from V‑Spark, selected values are collected into a single section titled "app_data", as shown below:

"app_data": {
    "agent_clarity": "0.708",
    "agent_emotion": "Positive",
    "client_emotion": "Improving",
    "overall_emotion": "Positive",
    "client_gender": "male",
    "client_clarity": "0.689",
    "duration": "0:29:49",
    "diarization": 2,
    "agent_channel": 0,
    "url":"http://server:3000/fileDetails/org/folder/date/123456.wav",
    "overtalk": "0.359",
    "agent_gender": "female", 
    "silence": "0.831"
},
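A minimal sketch of reading the app_data section after download. Note that the numeric metrics (overtalk, silence, and the clarity scores) are encoded as JSON strings in the example above, so they must be converted before any arithmetic; diarization and agent_channel, by contrast, are JSON numbers. The payload below is abridged from the example and is used purely for illustration.

```python
import json

# app_data section as downloaded from V-Spark (abridged from the
# example above; the "url" field is omitted for brevity).
raw = """
{
    "app_data": {
        "agent_clarity": "0.708",
        "agent_emotion": "Positive",
        "client_emotion": "Improving",
        "overall_emotion": "Positive",
        "client_gender": "male",
        "client_clarity": "0.689",
        "duration": "0:29:49",
        "diarization": 2,
        "agent_channel": 0,
        "overtalk": "0.359",
        "agent_gender": "female",
        "silence": "0.831"
    }
}
"""

app_data = json.loads(raw)["app_data"]

# String-typed metrics need explicit conversion before use as numbers.
overtalk = float(app_data["overtalk"])
silence = float(app_data["silence"])

# Number-typed fields can be used directly.
agent_channel = app_data["agent_channel"]

print(app_data["overall_emotion"])  # Positive
print(overtalk, silence, agent_channel)
```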

One of the significant computations that V‑Spark performs is determining which channel of the audio contains the agent and which contains the client. Once this information is available, emotion can be reported for each speaker individually.

In addition to the utterance-level Emotional Intelligence values provided by V‑Blaze, V‑Spark provides overall Emotional Intelligence values for the entire call. These values appear in the agent_emotion, client_emotion, and overall_emotion fields of the previous JSON example, and they indicate how the speakers' emotional states change as the conversation progresses. Possible values are:

  • Positive: The conversation began with positive or neutral emotions and remained positive or neutral.

  • Improving: The conversation began with negative emotions and changed to positive or neutral emotions.

  • Negative: The conversation began with negative emotions and remained negative.

  • Worsening: The conversation began with positive or neutral emotions and changed to negative emotions.
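The four values amount to a mapping from the emotional state at the start and end of a call to one of four labels. The sketch below illustrates that mapping; the function name, its inputs, and the three-way "positive"/"neutral"/"negative" state encoding are assumptions for illustration, not V‑Spark's actual implementation.

```python
def classify_call_emotion(start: str, end: str) -> str:
    """Map starting and ending emotional states to an overall label.

    `start` and `end` are each "positive", "neutral", or "negative".
    Illustrative sketch of the rules described above, not V-Spark's
    actual implementation.
    """
    start_negative = (start == "negative")
    end_negative = (end == "negative")
    if not start_negative and not end_negative:
        return "Positive"    # began and remained positive/neutral
    if start_negative and not end_negative:
        return "Improving"   # began negative, became positive/neutral
    if start_negative and end_negative:
        return "Negative"    # began and remained negative
    return "Worsening"       # began positive/neutral, became negative

print(classify_call_emotion("neutral", "positive"))   # Positive
print(classify_call_emotion("negative", "neutral"))   # Improving
print(classify_call_emotion("positive", "negative"))  # Worsening
```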

Refer to the V‑Spark Review and Analysis Guide for more information on agent emotion and client emotion.