There are many computational models of episodic and spatial memory. Models stemming from neuroscience and psychology are designed to fit particular experimental data. Models coming from the domain of virtual characters tend to lack this plausibility, but they are embedded in "living" beings, contributing to their ability to appear human-like. This poster presents a work-in-progress computational model of episodic and spatial memory that aims to bridge these two classes of models: it is partly grounded in behavioral data and embodied in an acting virtual character, which increases its believability. It integrates five parts: a memory for events, a component reconstructing the time when an event happened, a topographical memory, and allocentric and egocentric representations of object locations. Its main functional features include: detailed representation of complex events (e.g. cooking a dinner) over long intervals (days) in large environments (a house), forgetting, and development of search strategies for objects in the environment.
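The five parts listed above could be sketched as a simple data structure; this is a minimal illustrative outline, not the model's actual implementation, and all names here are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    description: str   # e.g. "cooking a dinner"
    timestamp: float   # seconds since simulation start (assumed time format)

@dataclass
class EpisodicSpatialMemory:
    # (1) memory for events
    events: list = field(default_factory=list)
    # (2) component for reconstructing when an event happened
    time_index: dict = field(default_factory=dict)
    # (3) topographical memory: place -> events experienced there
    topographical_map: dict = field(default_factory=dict)
    # (4) allocentric representation: object -> world-frame location
    allocentric: dict = field(default_factory=dict)
    # (5) egocentric representation: object -> location relative to the agent
    egocentric: dict = field(default_factory=dict)

    def store(self, event: Event, place: str):
        """Record an event and index it by time and place."""
        self.events.append(event)
        self.time_index[event.description] = event.timestamp
        self.topographical_map.setdefault(place, []).append(event.description)
```

A forgetting mechanism or object-search strategy would then operate over these stores, e.g. by decaying entries in `time_index` or querying `allocentric` and `egocentric` maps when looking for an object.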