Displaying believable emotional reactions in virtual characters is essential in applications ranging from virtual-reality trainers to video games. Manual scripting is the most common approach and allows arbitrarily high fidelity in the emotions displayed, but it is labour-intensive and therefore severely limits both the range of emotions shown and the extent of emotionally affected behavior in virtual characters. As a result, few virtual characters display believable emotions, and only in pre-scripted encounters. In this paper we implement and evaluate a lightweight algorithm for procedurally controlling both the emotionally affected behavior and the emotional appearance of a virtual character. The algorithm is based on two psychological models of emotion: conservation of resources and appraisal. The former component controls the character's emotionally affected behavior, whereas the latter generates explicit numeric descriptors of the character's emotions, which can be used to drive the character's appearance. We implement the algorithm in a simple testbed and compare it against two baseline approaches in a user study. Participants judged the emotions displayed by our algorithm to be more believable than those of the baselines.
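The two-component split described above can be illustrated with a minimal sketch. Everything here is an illustrative assumption, not the paper's actual implementation: the resource names, the linear expected-gain heuristic, and the joy/fear descriptors are all hypothetical placeholders for the conservation-of-resources component (which selects behavior) and the appraisal component (which emits numeric emotion descriptors that could drive appearance).

```python
# Hypothetical sketch of a two-component emotional character:
# a conservation-of-resources component that chooses behavior, and
# an appraisal component that outputs numeric emotion descriptors.
# All names and formulas are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class Character:
    # Resources the character tries to conserve (assumed names).
    resources: dict = field(
        default_factory=lambda: {"health": 1.0, "allies": 1.0})
    # Explicit numeric emotion descriptors in [0, 1] (assumed names).
    emotions: dict = field(
        default_factory=lambda: {"joy": 0.0, "fear": 0.0})

    def choose_action(self, options):
        # Conservation-of-resources component: prefer the action whose
        # resource deltas best protect or grow current resources.
        def expected_gain(action):
            return sum(action["deltas"].get(r, 0.0)
                       for r in self.resources)
        return max(options, key=expected_gain)

    def appraise(self, event):
        # Appraisal component: map an event's net resource impact to
        # numeric emotion values that a renderer could use to drive
        # the character's facial expression or posture.
        impact = sum(event.get("deltas", {}).values())
        if impact >= 0:
            self.emotions["joy"] = min(1.0, self.emotions["joy"] + impact)
        else:
            self.emotions["fear"] = min(1.0, self.emotions["fear"] - impact)
        return dict(self.emotions)
```

For example, appraising a threatening event such as `{"deltas": {"health": -0.4}}` raises the character's `fear` descriptor, while `choose_action` steers the character toward resource-preserving options such as fleeing.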