In this paper, we extend the diffusion maps algorithm to a family of heat kernels that are either local (having exponential decay) or nonlocal (having polynomial decay), arising in various applications. For example, these kernels have been used as regularizers in various supervised learning tasks, such as denoising images. Importantly, these heat kernels give rise to operators that include (but are not restricted to) the classical Laplacian, the generator of Brownian motion, as well as the fractional Laplacian associated with β-stable Lévy processes. For local kernels, the method is a version of the diffusion maps algorithm, and we show that applying it with non-Gaussian local heat kernels yields approximations of temporally rescaled Laplace-Beltrami operators. For the nonlocal heat kernels, we modify the diffusion maps algorithm to estimate fractional Laplacian operators. Here, the graph distance is used to approximate the geodesic distance with appropriate error bounds. While this approximation becomes numerically expensive as the number of data points increases, it produces an accurate operator estimate that is robust to the choice of the kernel bandwidth parameter. In contrast, the local kernels are numerically more efficient but more sensitive to the choice of kernel bandwidth parameter. In an application to estimating non-smooth regression functions, we find that using the nonlocal kernel as a regularizer produces a more robust and accurate estimate than using local kernels. For manifolds with boundary, we find that the proposed fractional diffusion maps framework implemented with nonlocal kernels approximates the regional fractional Laplacian.
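To illustrate the nonlocal construction described above, the following is a minimal sketch of a fractional-diffusion-maps-style kernel: graph shortest-path distances on a k-nearest-neighbor graph stand in for geodesic distances, and a polynomial-decay kernel is row-normalized into a Markov matrix. The specific kernel form, the exponent `d + 2*beta`, and all parameter values here are illustrative assumptions for exposition, not the paper's exact construction.

```python
import numpy as np
from scipy.spatial import distance_matrix
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import shortest_path

def fractional_diffusion_kernel(X, beta=0.5, eps=0.3, k=4):
    """Sketch of a nonlocal (polynomial-decay) diffusion kernel.

    Graph shortest paths on a kNN graph approximate geodesic distances;
    the kernel form below is an illustrative choice, not the paper's.
    """
    n, d = X.shape                                # d: ambient dimension (proxy)
    D = distance_matrix(X, X)                     # pairwise Euclidean distances
    # Build a k-nearest-neighbor graph (skip column 0, the point itself).
    idx = np.argsort(D, axis=1)[:, 1:k + 1]
    rows = np.repeat(np.arange(n), k)
    G = csr_matrix((D[rows, idx.ravel()], (rows, idx.ravel())), shape=(n, n))
    # Shortest-path (graph) distances approximate geodesic distances.
    geo = shortest_path(G, directed=False)
    # Polynomial-decay kernel with singular exponent d + 2*beta,
    # mimicking the decay of the fractional (Levy-stable) heat kernel.
    K = 1.0 / (1.0 + (geo / eps) ** (d + 2 * beta))
    # Diffusion-maps-style normalization: rows sum to one (Markov matrix).
    return K / K.sum(axis=1, keepdims=True)
```

For example, on points sampled from the unit circle, the resulting Markov matrix can be eigendecomposed exactly as in standard diffusion maps, with the polynomial tails encoding the nonlocal jumps of a β-stable process.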