The Artificial Intelligence Internet of Things (AIoT) is an emerging paradigm that aims to perceive, understand, and connect 'intelligent things' so that various networks and systems can intercommunicate more efficiently. A key step toward this goal is high-precision data analysis at the edge and in the cloud. Clustering and dimensionality reduction in AIoT can facilitate efficient data management, storage, computation, and transmission for a variety of data-driven AIoT applications. For high-efficiency data clustering and dimensionality reduction, this paper develops a prior-dependent graph (PDG) construction method to model and discover the complex relations among data. By properly exploiting and incorporating data priors, i.e., (a) element-wise local sparsity, (b) pair-wise symmetry, (c) multi-instance manifold smoothness, and (d) matrix low-rankness, the obtained graph is locally sparse, symmetric, and low-rank, and can well reveal the complex multi-instance proximity among data points. The developed PDG model is then applied to two typical data analysis tasks, i.e., unsupervised data clustering and dimensionality reduction. Experimental results on multiple benchmark databases verify that, compared with existing graph learning models, the PDG model achieves substantially better performance and can be deployed in edge computing modules to provide efficient solutions for massive data management and applications in AIoT.
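To make the role of the four priors concrete, the following is a minimal, illustrative sketch of how three of them (local sparsity, pair-wise symmetry, and low-rankness) constrain an affinity graph. This is not the paper's PDG optimization; the function name, parameters, and construction steps here are assumptions chosen only to illustrate the kind of structure the priors impose.

```python
import numpy as np

def sketch_prior_graph(X, k=5, rank=3):
    """Illustrative sketch (NOT the paper's PDG method): build a
    Gaussian affinity graph, then impose (a) element local sparsity
    via k-nearest-neighbor truncation, (b) pair-wise symmetry via
    averaging with the transpose, and (d) matrix low-rankness via
    truncated SVD. All names and parameters are assumptions."""
    n = X.shape[0]
    # Pairwise squared Euclidean distances between data points.
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    # Gaussian affinities with a median-heuristic bandwidth.
    W = np.exp(-d2 / (2.0 * np.median(d2[d2 > 0])))
    np.fill_diagonal(W, 0.0)
    # (a) Element local sparsity: keep only the k largest affinities per row.
    for i in range(n):
        W[i, np.argsort(W[i])[:-k]] = 0.0
    # (b) Pair-wise symmetry.
    W = 0.5 * (W + W.T)
    # (d) Matrix low-rankness: project onto the top-`rank` singular subspace.
    U, s, Vt = np.linalg.svd(W)
    W_lr = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    return W, W_lr
```

The resulting symmetric, sparse graph (or its low-rank projection) could then feed standard spectral clustering or graph-based embedding, which is the general pipeline the abstract's clustering and dimensionality-reduction tasks follow.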