This paper presents a memory-based algorithm for the global optimization of multivariate functions of continuous variables. The proposed algorithm, M-RASH, extends the RASH (Reactive Affine Shaker) heuristic, an adaptive search algorithm that relies only on point-wise function evaluations. In M-RASH, promising starting points for local search trials are suggested online by Bayesian Locally Weighted Regression. Both techniques maintain memory of the previous search history to guide future exploration, but in very different ways: RASH compiles the previous experience into the shape of a local search area from which sample points are drawn, whereas locally weighted regression stores the entire history and mines it extensively whenever an additional sample point is generated. Because of the high computational cost of the regression model, it is applied only to assess the potential of an initial point for a local search run. The experimental results show that M-RASH indeed reaches good results with a smaller number of function evaluations.
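To make the role of locally weighted regression concrete, the following is a minimal sketch (not the authors' implementation) of how past evaluations could be mined to rank candidate starting points: a Gaussian-kernel weighted linear fit predicts the objective at each candidate, and the candidate with the lowest prediction is chosen. The bandwidth `tau` and the helper names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def lwr_predict(X, y, q, tau=0.5):
    """Locally weighted linear regression: predict f(q) from past samples.

    X : (n, d) previously evaluated points (the search history)
    y : (n,)  their observed function values
    q : (d,)  query point, i.e. a candidate start for a local search run
    tau : kernel bandwidth (assumed parameter, not from the paper)
    """
    # Gaussian kernel weights: past samples near q count more.
    w = np.exp(-np.sum((X - q) ** 2, axis=1) / (2.0 * tau ** 2))
    # Augment with a bias column and solve the weighted least-squares fit.
    A = np.hstack([X, np.ones((X.shape[0], 1))])
    W = np.diag(w)
    beta, *_ = np.linalg.lstsq(W @ A, W @ y, rcond=None)
    return np.append(q, 1.0) @ beta

def pick_start(history_X, history_y, candidates):
    """Rank candidate starting points by predicted value (lower is better)."""
    preds = [lwr_predict(history_X, history_y, c) for c in candidates]
    return candidates[int(np.argmin(preds))]

# Toy usage: history of evaluations of f(x) = x1^2 + x2^2
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(40, 2))
y = np.sum(X ** 2, axis=1)
cands = rng.uniform(-2, 2, size=(10, 2))
best = pick_start(X, y, cands)
```

Note that only the candidate ranking pays the regression cost; the subsequent local search itself would proceed by direct function evaluations, consistent with the paper's design of reserving the expensive model for start-point selection.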