This study presents the first report of Artificial Improvisation, or improvisational theatre performed live, on-stage, alongside an artificial intelligence-based improvisational performer. The Artificial Improvisor is a form of artificial conversational agent, or chatbot, focused on open-domain dialogue and collaborative narrative generation. Using state-of-the-art machine learning techniques spanning natural language processing, speech recognition, reinforcement learning, and deep learning, these chatbots have become more lifelike and harder to discern from humans. Recent work on conversational agents has concentrated on goal-directed dialogue in closed domains such as appointment setting, bank information requests, question answering, and movie discussion. Natural human conversations are seldom limited in scope and jump from topic to topic; they are laced with metaphor and subtext, and face-to-face communication is supplemented with non-verbal cues. Live improvised performance takes natural conversation one step further, with multiple actors performing in front of an audience. In improvisation, the topic of the conversation is often suggested by the audience several times during the performance. These suggestions inspire actors to perform novel, unique, and engaging scenes. During each scene, actors must make rapid-fire decisions to collaboratively generate coherent narratives. We have embarked on a journey to perform live improvised comedy alongside artificial intelligence systems. We introduce Pyggy and A.L.Ex. (Artificial Language Experiment), the first two Artificial Improvisors, each with a unique composition and embodiment. This work highlights research and development, successes and failures along the way, celebrates the collaborations enabling progress, and presents discussions for future work in the space of artificial improvisation.