Neural Architecture Search (NAS) has demonstrated remarkable success in automatically designing high-performance deep neural networks. However, most NAS algorithms are computationally expensive and are typically optimized for a single domain, limiting their applicability in real-world scenarios where domain shifts frequently occur. In this paper, we propose MetaDANAS, a novel framework that integrates meta-learning and domain adaptation into neural architecture search to enable efficient architecture discovery across heterogeneous domains. The proposed method learns transferable architectural priors via episodic meta-training over multiple source domains, while a domain adaptation module aligns feature distributions between source and target datasets. A differentiable supernet search space with weight sharing enables efficient architecture exploration. Experimental results on multiple cross-domain image classification tasks demonstrate that MetaDANAS significantly reduces search time while achieving superior cross-domain accuracy compared to conventional NAS methods. The proposed approach thus offers a scalable solution for neural architecture design in multi-domain environments.
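To make the differentiable-supernet idea concrete, the following is a minimal pure-Python sketch (not the authors' implementation): each supernet edge computes a softmax-weighted mixture of candidate operations, so the architecture parameters can be optimized by gradient descent alongside the shared weights. The candidate operations here are hypothetical stand-ins for the usual conv/pool/identity choices.

```python
import math

def softmax(alphas):
    # Numerically stable softmax over architecture parameters.
    m = max(alphas)
    exps = [math.exp(a - m) for a in alphas]
    s = sum(exps)
    return [e / s for e in exps]

def mixed_op(x, alphas, ops):
    """Output of one supernet edge: softmax(alpha)-weighted sum of candidate ops."""
    weights = softmax(alphas)
    return sum(w * op(x) for w, op in zip(weights, ops))

# Toy candidate operations (hypothetical stand-ins for real layers).
ops = [
    lambda x: x,        # identity
    lambda x: 2.0 * x,  # stand-in for a conv-like transform
    lambda x: 0.0,      # "zero" op, i.e. the edge is pruned
]

# With equal architecture weights, the edge averages all candidates:
y = mixed_op(1.0, [0.0, 0.0, 0.0], ops)  # (1.0 + 2.0 + 0.0) / 3 = 1.0
```

After search, the discrete architecture is typically read off by keeping, on each edge, the operation with the largest architecture weight.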