Transfer learning for metamaterial design and simulation
Nanophotonics (IF 7.5), Pub Date: 2024-03-22, DOI: 10.1515/nanoph-2023-0691
Rixi Peng, Simiao Ren, Jordan Malof, Willie J. Padilla

We demonstrate transfer learning as a tool to improve the efficacy of training deep learning models based on residual neural networks (ResNets). Specifically, we examine its use in the study of multi-scale, electrically large metasurface arrays under open boundary conditions in electromagnetic metamaterials. Our aim is to assess the efficiency of transfer learning across a range of problem domains that vary in their resemblance to the original base problem for which the ResNet model was initially trained. We use a quasi-analytical discrete dipole approximation (DDA) method to simulate electrically large metasurface arrays and obtain ground-truth data for training and testing our deep neural network. By harnessing the power of transfer learning, our approach can save significant time when examining novel metasurface designs, as it effectively mitigates the data bottleneck commonly encountered in deep learning. We demonstrate that in the best case, when the transfer task is sufficiently similar to the target task, a new task can be trained effectively on only a few data points with a pre-trained neural network and still achieve a test mean absolute relative error of 3%, realizing a data reduction by a factor of 1000.
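The core idea described above — reusing a network trained on a data-rich base task and fine-tuning only a small part of it on a few target-task samples — can be illustrated with a minimal sketch. This is not the authors' implementation (their model is a ResNet trained on DDA simulation data); here a frozen random feature map plays the role of the transferred layers, and a least-squares fit on a handful of points stands in for fine-tuning the new output head. All names and data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pre-trained feature extractor: in the paper this role is
# played by a ResNet trained on a base metasurface simulation task.
W_base = rng.normal(size=(8, 3))  # "transferred" weights, kept frozen


def features(x):
    """Frozen feature extractor (transferred from the base task)."""
    return np.tanh(x @ W_base.T)


# Target task: only a handful of labelled samples are available,
# mimicking the few-data-point regime described in the abstract.
X_small = rng.normal(size=(10, 3))
y_small = np.sin(X_small.sum(axis=1))  # toy ground-truth response

# "Fine-tune" only a new linear head on the few target-task samples
# (a closed-form least-squares fit stands in for gradient-based training).
Phi = features(X_small)
head, *_ = np.linalg.lstsq(Phi, y_small, rcond=None)


def predict(x):
    return features(x) @ head


# Mean absolute error on the (tiny) training set: because only the small
# head is trained, a few samples suffice to fit it.
mae = np.abs(predict(X_small) - y_small).mean()
```

The design point this sketch mirrors is that freezing the transferred layers shrinks the number of trainable parameters, which is what lets a similar target task be learned from orders of magnitude less data.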
