Make machines feel - just like the human brain.
Emotions are a big part of the human experience. From an evolutionary point of view, they contribute greatly to our chances of survival.
How emotions are represented in the brain is a foundational question of modern neuroscience. As neuroimaging research has advanced in recent years, we have come to better understand the brain's neurobiology and how emotions develop and are triggered.
Graphen built its visual sentiment ontology on the foundation of big data, computer vision and psychology.
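To make the idea of a visual sentiment ontology concrete, the sketch below shows one plausible shape for such a structure: adjective-noun concept pairs, each carrying a sentiment polarity score. The concepts, scores, and the lookup helper are illustrative assumptions, not Graphen's actual ontology.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class VisualConcept:
    adjective: str    # affective modifier, e.g. "happy"
    noun: str         # detectable object or scene, e.g. "dog"
    sentiment: float  # polarity score in [-1.0, 1.0]

# Illustrative entries only; a real ontology would hold thousands of pairs.
ontology = [
    VisualConcept("happy", "dog", 0.9),
    VisualConcept("crowded", "street", -0.2),
    VisualConcept("broken", "car", -0.7),
]

def lookup(adjective: str, noun: str) -> Optional[float]:
    """Return the sentiment polarity for a concept pair, if it is in the ontology."""
    for concept in ontology:
        if concept.adjective == adjective and concept.noun == noun:
            return concept.sentiment
    return None

print(lookup("happy", "dog"))  # 0.9
```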
It uses its Multimedia Semantic Concept Detection & Mining technology to detect events, scenes, objects, and their relationships, and then teaches machines to identify these semantics from training video. The training phase transforms pixels into sentiments through a streamlined workflow: feature extraction on labeled training data, feature space models (including semantic concept models), and concept space models.
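As a rough illustration of that workflow, the Python sketch below walks through the same stages with synthetic data: placeholder frame features stand in for extracted visual features, one logistic-regression model per semantic concept maps features into a concept space, and a final classifier maps concept scores to a sentiment label. All data, dimensions, and model choices here are assumptions for illustration, not Graphen's implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# 1) Feature extraction: in practice these would come from a visual encoder
#    applied to video frames; random vectors stand in for them here.
n_frames, feat_dim, n_concepts = 200, 128, 10
frame_features = rng.normal(size=(n_frames, feat_dim))

# Labeled training data: which semantic concepts appear in each frame,
# plus an overall sentiment label per frame (1 = positive, 0 = negative).
concept_labels = rng.integers(0, 2, size=(n_frames, n_concepts))
sentiment_labels = rng.integers(0, 2, size=n_frames)

# 2) Feature space -> semantic concept models: one detector per concept.
concept_models = []
for k in range(n_concepts):
    clf = LogisticRegression(max_iter=1000)
    clf.fit(frame_features, concept_labels[:, k])
    concept_models.append(clf)

# 3) Concept space: each frame becomes a vector of concept probabilities.
concept_space = np.column_stack(
    [m.predict_proba(frame_features)[:, 1] for m in concept_models]
)

# 4) Concept space -> sentiment: a final classifier maps concept scores
#    to a positive/negative prediction.
sentiment_model = LogisticRegression(max_iter=1000)
sentiment_model.fit(concept_space, sentiment_labels)
print(sentiment_model.predict(concept_space[:5]))
```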
It then builds a framework to train 1,000 concept detectors through supervised and quasi-supervised learning.
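"Quasi-supervised" learning can be read, for example, as self-training with pseudo-labels: a detector trained on a small labeled set confidently labels unlabeled frames and is retrained on them. The sketch below shows that recipe for a single concept detector using scikit-learn's SelfTrainingClassifier on synthetic data; scaling it to 1,000 concepts would repeat the same loop per concept. The data, threshold, and base model are illustrative assumptions, not Graphen's framework.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.semi_supervised import SelfTrainingClassifier

rng = np.random.default_rng(1)

# Synthetic, loosely separable frame features: a small labeled set
# plus a large unlabeled pool (scikit-learn marks unlabeled samples with -1).
true_labels = rng.integers(0, 2, size=500)
features = rng.normal(size=(500, 64)) + true_labels[:, None] * 1.5
labels = true_labels.copy()
labels[50:] = -1  # only the first 50 frames keep their labels

# One self-trained detector per concept; looping this over a concept list
# would scale the same recipe to hundreds or thousands of detectors.
detector = SelfTrainingClassifier(LogisticRegression(max_iter=1000), threshold=0.8)
detector.fit(features, labels)
print("pseudo-labeled samples:", int((detector.labeled_iter_ > 0).sum()))
```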
Graphen Ardi Emotion teaches machines to recognize both text and visual sentiment through its classifier training.
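One common way to combine the two modalities is late fusion: train a text sentiment classifier and a visual sentiment classifier separately, then average their scores. The sketch below shows that pattern on toy data; the captions, concept scores, and fusion rule are assumptions for illustration, not Graphen's classifier training.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy paired data: a caption and a visual concept-score vector per post,
# with a shared sentiment label (1 = positive, 0 = negative).
captions = ["what a beautiful sunny day", "terrible traffic ruined my morning",
            "loving this new cafe", "worst service I have ever had"]
visual_scores = np.array([[0.9, 0.1], [0.2, 0.8], [0.8, 0.2], [0.1, 0.9]])
labels = np.array([1, 0, 1, 0])

# Text branch: bag-of-words sentiment classifier.
vec = TfidfVectorizer()
text_clf = LogisticRegression().fit(vec.fit_transform(captions), labels)

# Visual branch: classifier over concept-space scores.
vis_clf = LogisticRegression().fit(visual_scores, labels)

def predict_sentiment(caption, concept_scores):
    """Late fusion: average the positive-class probabilities of both branches."""
    p_text = text_clf.predict_proba(vec.transform([caption]))[0, 1]
    p_vis = vis_clf.predict_proba(concept_scores.reshape(1, -1))[0, 1]
    return (p_text + p_vis) / 2

print(predict_sentiment("beautiful day at the cafe", np.array([0.85, 0.15])))
```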
Jade is a social media analyst at a large B2C company. A big part of her job is to monitor and analyze user feedback and sentiment on social media.
With the help of Ardi, she can easily conduct visual and word mining to discover and understand the positive and negative emotions within the online community. She is then able to analyze the effectiveness of her team's social media strategy and make informed adjustments, creating and curating content that engages consumers and boosts brand image.
Ardi makes her work more effective and saves her hours of manually scrolling through social media to identify the emotions shared among existing and potential consumers.