A machine lifelong learning system is described that uses Context-sensitive Multiple Task Learning, or csMTL. csMTL is a method of inductive transfer that uses a single-output neural network with additional contextual inputs for learning multiple tasks. The approach satisfies a number of important requirements for knowledge retention and inductive transfer, including the elimination of redundant outputs, representational transfer for rapid but effective short-term learning, and functional transfer via task rehearsal for long-term consolidation. An implementation of the csMTL system is tested on a synthetic domain of six non-linearly separable classification tasks. The results indicate that representational transfer using a long-term consolidated csMTL network efficiently produces more effective hypotheses than previous MTL methods.
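The core architectural idea, a single shared output unit fed by both the primary inputs and task-identifying contextual inputs, can be sketched as follows. This is an illustrative toy forward pass, not the authors' implementation; the layer sizes, activation functions, and one-hot context encoding are assumptions made for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def csmtl_forward(x, context, W1, b1, w2, b2):
    """Forward pass of a minimal csMTL-style network: the primary
    inputs x are concatenated with a one-hot task context, and a
    single output unit is shared across all tasks."""
    z = np.concatenate([x, context])   # primary + contextual inputs
    h = np.tanh(W1 @ z + b1)           # shared hidden representation
    return sigmoid(w2 @ h + b2)        # single task-shared output

# Hypothetical sizes: 2 primary inputs, 6 tasks, 5 hidden units.
rng = np.random.default_rng(0)
n_in, n_tasks, n_hid = 2, 6, 5
W1 = rng.standard_normal((n_hid, n_in + n_tasks))
b1 = rng.standard_normal(n_hid)
w2 = rng.standard_normal(n_hid)
b2 = 0.0

x = np.array([0.5, -1.0])
context = np.eye(n_tasks)[2]  # one-hot context selecting the third task
y = csmtl_forward(x, context, W1, b1, w2, b2)
```

Changing only the context vector switches the hypothesis the network computes for the same primary inputs, which is what removes the need for one output per task.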