Systems, methods, and non-transitory computer readable media are configured to determine a likelihood of a rejection of a notification proposed for delivery to a recipient. A delivery determination for the notification can be performed. Subsequently, the notification can be delivered to the recipient based on the delivery determination.
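The mechanism above can be sketched as a minimal gate: score the likelihood that the recipient rejects the notification, then deliver only when the score clears a threshold. The scoring function, feature names, and threshold below are illustrative assumptions, not details from the source.

```python
# Hypothetical sketch: gate notification delivery on a predicted
# rejection likelihood. The dismiss-rate feature and the 0.7
# threshold are assumed for illustration.

def rejection_likelihood(notification, recipient):
    # Placeholder model: use the recipient's historical dismiss rate
    # for this notification type as the rejection likelihood.
    return recipient.get("dismiss_rate", {}).get(notification["type"], 0.0)

def should_deliver(notification, recipient, threshold=0.7):
    """Delivery determination: deliver only if the predicted
    rejection likelihood stays below the threshold."""
    return rejection_likelihood(notification, recipient) < threshold

recipient = {"dismiss_rate": {"promo": 0.9, "message": 0.1}}
print(should_deliver({"type": "promo"}, recipient))    # False
print(should_deliver({"type": "message"}, recipient))  # True
```

In a real system the likelihood would come from a trained model rather than a lookup, but the delivery determination itself reduces to this comparison.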
Example systems and methods are described for implementing a swipe-to-like feature. In an example implementation, a list of content items is displayed on a touchscreen display; based on detecting input of a first gesture, such as a swipe gesture, on a first one of the content items in the list, a predetermined first sentiment is associated with the first content item.
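The gesture-to-sentiment association can be sketched as an event handler on a list: a swipe on an item records the predetermined sentiment for that item. The gesture name, the "like" sentiment value, and the in-memory sentiment store are illustrative assumptions.

```python
# Hypothetical sketch of a swipe-to-like handler: a swipe gesture on a
# list item associates a predetermined sentiment ("like") with it.
# Gesture constants and the sentiment store are assumed for illustration.

SWIPE = "swipe"

class ContentList:
    def __init__(self, items):
        self.items = items
        self.sentiments = {}  # item index -> associated sentiment

    def on_gesture(self, index, gesture):
        # The first gesture (a swipe) associates the predetermined
        # first sentiment with the targeted content item.
        if gesture == SWIPE:
            self.sentiments[index] = "like"

feed = ContentList(["post-a", "post-b"])
feed.on_gesture(0, SWIPE)
print(feed.sentiments)  # {0: 'like'}
```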
Techniques for emotion detection and content delivery are described. In one embodiment, for example, an emotion detection component may identify at least one type of emotion associated with at least one detected emotion characteristic. A storage component may store the identified emotion type. An application programming interface (API) component may receive a request from one or more applications for the emotion type and, in response to the request, return the identified emotion type. The one or more applications may identify content for display based upon the identified emotion type. Identifying content for display based upon the identified emotion type may include searching among a plurality of content items, each content item being associated with one or more emotion types. Other embodiments are described and claimed.
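The component pipeline described above can be sketched end to end: detect an emotion type from a characteristic, store it, expose it through an API call, and filter a tagged content catalog against it. All class, function, and tag names here are illustrative assumptions.

```python
# Hypothetical sketch of the described pipeline: detection component,
# storage component, API component, and content selection by emotion
# type. The characteristic-to-emotion mapping is a stand-in for a
# real detector.

class EmotionStore:
    """Storage component holding the most recently identified emotion type."""
    def __init__(self):
        self._current = None
    def store(self, emotion_type):
        self._current = emotion_type
    def get(self):
        return self._current

def detect_emotion(characteristic):
    # Placeholder detection: map a detected characteristic to an emotion type.
    return {"smile": "happy", "frown": "sad"}.get(characteristic, "neutral")

def api_get_emotion(store):
    """API component: return the stored emotion type on request."""
    return store.get()

def select_content(catalog, emotion_type):
    # Search among content items, each tagged with one or more emotion types.
    return [item for item, tags in catalog if emotion_type in tags]

store = EmotionStore()
store.store(detect_emotion("smile"))
catalog = [("sunny-clip", {"happy"}), ("rainy-clip", {"sad", "neutral"})]
print(select_content(catalog, api_get_emotion(store)))  # ['sunny-clip']
```

Keeping detection, storage, and the API as separate pieces mirrors the component split in the abstract: applications only ever see the API call, not the detector.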
An embodiment of the invention discloses a method and device for recognizing the emotions of persons. A specific embodiment of the method comprises: extracting a face image set from a to-be-processed person video; dividing the face image set into at least one face image group based on matching relationships between the face images, with different face image groups corresponding to different persons appearing in the video; for each face image group, performing expression recognition on each face image in the group to obtain a corresponding expression recognition result; and determining the emotion information of the person corresponding to the face image group based on the expression recognition results for the face images in that group. The embodiment thereby realizes person emotion recognition based on facial expressions.
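The per-group aggregation step can be sketched as follows: given face images already grouped by person, run expression recognition on each image and combine the per-image results into one emotion per person. The stub recognizer and the majority-vote aggregation rule are assumptions for illustration; the source does not specify how results are combined.

```python
# Hypothetical sketch of the described method: group face images by
# person, recognize the expression in each image, and aggregate the
# per-image results into one emotion per person (majority vote here,
# an assumed rule). The recognizer is a stub for a real classifier.

from collections import Counter

def recognize_expression(face_image):
    # Stub: a trained expression classifier would go here.
    return face_image["expression"]

def person_emotions(face_groups):
    """face_groups: {person_id: [face_image, ...]}, where each group
    holds the face images matched to one person in the video."""
    emotions = {}
    for person, images in face_groups.items():
        results = [recognize_expression(img) for img in images]
        # Aggregate per-image results into the person's emotion.
        emotions[person] = Counter(results).most_common(1)[0][0]
    return emotions

groups = {
    "person-1": [{"expression": "happy"}, {"expression": "happy"},
                 {"expression": "neutral"}],
    "person-2": [{"expression": "sad"}],
}
print(person_emotions(groups))  # {'person-1': 'happy', 'person-2': 'sad'}
```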