Tensors, which extend vectors and matrices to the multi-dimensional case, are a natural way to describe N-ary relational data. Recently, tensor decomposition methods have been applied to N-ary relational data and have become the state of the art in embedding learning. However, the performance of existing tensor decomposition methods is not as good as desired. First, they suffer from a data-sparsity issue, since each can only learn from N-ary relational data of a single, specific arity, i.e., only part of the available N-ary relational data. Besides, they are neither effective nor efficient to train due to an over-parameterization problem. In this paper, we propose a novel method, S2S, for effectively and efficiently learning from N-ary relational data. Specifically, we propose a new tensor decomposition framework in which embeddings are shared across arities, allowing the model to learn from facts of mixed arity. Since the core tensors may still suffer from over-parameterization, and inspired by the success of neural architecture search (NAS) in designing data-dependent architectures, we propose to reduce the number of parameters by sparsifying the core tensors with NAS techniques while retaining their expressive power. As a result, the proposed S2S is not only guaranteed to be expressive but also learns efficiently from mixed-arity data. Finally, empirical results demonstrate that S2S is efficient to train and achieves state-of-the-art performance.