Kernel-based data mining algorithms, such as Support Vector Machines, project data into high-dimensional feature spaces, wherein linear decision surfaces correspond to non-linear decision surfaces in the original input space. Choosing a kernel amounts to choosing a high-dimensional feature space and is thus a crucial step in the data mining process. Despite this, and because of the difficulty of establishing that a function is a positive definite kernel, only a few standard kernels (e.g. polynomial and Gaussian) are typically used. We propose a method for searching over a space of kernels for composite kernels that are guaranteed to be positive definite and that are tuned to produce a feature space appropriate for a given dataset. In contrast to the output of other work on kernel tuning, composite kernel functions are easily interpreted by humans. Empirical results demonstrate that our method often finds composite kernels that yield higher classification accuracy than the standard kernels.
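
The guarantee of positive definiteness rests on standard closure properties: sums and products of positive definite kernels are themselves positive definite. The sketch below illustrates this with a hypothetical composite built from polynomial, Gaussian, and linear base kernels (the abstract does not specify the actual grammar of composites, so the particular combination here is an assumption for illustration); the minimum eigenvalue of the resulting Gram matrix confirms positive semidefiniteness numerically.

```python
import numpy as np

# Base kernels (standard choices; the paper's exact composite grammar is not
# reproduced here -- this only illustrates the closure properties that
# guarantee positive definiteness of composites).
def linear(x, y):
    return float(np.dot(x, y))

def poly(x, y, degree=2, c=1.0):
    return float((np.dot(x, y) + c) ** degree)

def gaussian(x, y, sigma=1.0):
    return float(np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2)))

# Sums and products of positive definite kernels are positive definite
# (the product case follows from the Schur product theorem), so any
# composite built with + and * from valid base kernels is a valid kernel.
def composite(x, y):
    return poly(x, y) + gaussian(x, y) * linear(x, y)

def gram(kernel, X):
    # Gram matrix K[i, j] = kernel(X[i], X[j])
    n = len(X)
    return np.array([[kernel(X[i], X[j]) for j in range(n)] for i in range(n)])

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))
K = gram(composite, X)

# All eigenvalues are non-negative (up to floating-point error),
# so K is positive semidefinite and composite is a valid kernel.
print(np.linalg.eigvalsh(K).min() >= -1e-9)
```

Because validity is preserved by construction, a search over such composites never needs to test candidate kernels for positive definiteness, which is what makes the search space tractable.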