In this paper we define a general framework for activity recognition by building upon and extending the Lempel-Ziv multiway tree structure known as a trie to model human activities of daily living (ADLs). Our activity recognition system operates online. We use a wearable, wrist-worn Radio Frequency Identification (RFID) reader to detect everyday objects, together with a WiFi positioning system installed in our experimental environment to capture the person's current location. Our activity models are formulated by translating labeled activities (such as grooming) into probabilistic collections of sequences of action steps (such as brushing teeth → combing hair); each action step (such as making tea) is in turn composed of a sequence of human-object interactions (such as cup → teabag → electric air pot → teaspoon) and human movements (bedroom → kitchen). RFID tags, WiFi signals, and the passage of time directly yield the state of the physical world. We experimentally validate our approach using data gathered from real human activity.
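The trie-based modeling sketched above can be illustrated with a minimal LZ78-style incremental parse of an object-use stream: each new phrase of observed events extends the trie by one node, and node counts accumulate how often a prefix recurs. The class names and the example event stream below are hypothetical, chosen only to mirror the "making tea" example:

```python
class TrieNode:
    """A node in the multiway trie; children are keyed by observed events."""
    def __init__(self):
        self.children = {}
        self.count = 0

class LZTrie:
    """Incrementally parse an event stream into a trie of phrases (LZ78-style)."""
    def __init__(self):
        self.root = TrieNode()
        self.current = self.root  # position within the phrase being parsed

    def observe(self, event):
        # Follow the current phrase while it is already known; when the
        # extended phrase is new, add a node and restart from the root.
        if event in self.current.children:
            self.current = self.current.children[event]
            self.current.count += 1
        else:
            node = TrieNode()
            node.count = 1
            self.current.children[event] = node
            self.current = self.root

# Hypothetical object-use stream for the "making tea" action step
stream = ["cup", "teabag", "cup", "teabag", "electric air pot",
          "cup", "teabag", "electric air pot", "teaspoon"]
trie = LZTrie()
for e in stream:
    trie.observe(e)
```

After parsing, recurring prefixes such as cup → teabag → electric air pot appear as a path in the trie, and the per-node counts can serve as the basis for the probabilistic activity models described above.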