Level Set Learning with PRNN for Nonlinear Dimension Reduction in Function Approximation


Speaker: Lili Ju (鞠立力), Professor, University of South Carolina


Time: June 20, 2023, 10:00 AM


Location: Room 301, Building 3


Organizer: College of Mathematics and Science


About the speaker: Professor Lili Ju received his B.S. in Mathematics from Wuhan University in 1995, his M.S. in Computational Mathematics from the Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, in 1998, and his Ph.D. in Applied Mathematics from Iowa State University in 2002. From 2002 to 2004 he was a postdoctoral researcher at the Institute for Mathematics and its Applications at the University of Minnesota. He then joined the University of South Carolina, where he has served in the Department of Mathematics as Assistant Professor (2004-2008), Associate Professor (2008-2012), and Professor (2013-present). His research interests include numerical methods and analysis for partial differential equations, nonlocal models and algorithms, computer vision, deep learning algorithms, high-performance scientific computing, and their applications in materials science and geosciences. He has published more than 140 research papers, with over 5,000 Google Scholar citations. Since 2006 he has been the principal investigator of more than ten research projects funded by the U.S. National Science Foundation and the Department of Energy. He served as an associate editor of the SIAM Journal on Numerical Analysis from 2012 to 2017, and is currently an associate editor of JSC, NMPDE, NMTMA, and AAMM. His work with collaborators on phase-field simulation of alloy microstructure evolution on the Sunway TaihuLight supercomputer was a finalist for the 2016 Gordon Bell Prize in high-performance computing applications.


Abstract: Inspired by the Nonlinear Level Set Learning (NLL) method, which uses the reversible residual network (RevNet), we propose a new method of Dimension Reduction via Learning Level Sets (DRiLLS) for function approximation. Our method consists of two major components: a pseudo-reversible neural network (PRNN) module that effectively transforms high-dimensional input variables into low-dimensional active variables, and a synthesized regression module that approximates function values from the transformed data in the low-dimensional space. The PRNN not only relaxes the invertibility constraint that the use of RevNet imposes on the nonlinear transformation in the NLL method, but also adaptively weights the influence of each sample and controls the sensitivity of the function to the learned active variables. The synthesized regression uses Euclidean distance in the input space to select neighboring samples, whose projections onto the space of active variables are then used for local least-squares polynomial fitting. This helps to resolve the numerical oscillation issues present in traditional local and global regressions. Extensive experimental results demonstrate that DRiLLS outperforms both the NLL and Active Subspace methods, especially when the target function possesses critical points in the interior of its input domain.
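To make the two-module pipeline concrete, below is a minimal, self-contained sketch in Python (PyTorch/NumPy). It is not the authors' implementation: the network sizes, the penalty weight `lam`, and the auxiliary regression head (used here as a crude stand-in for the paper's adaptive sample weighting and sensitivity-control terms) are all illustrative assumptions, as is the one-dimensional active space assumed in the local polynomial fit.

```python
import numpy as np
import torch
import torch.nn as nn

class PRNN(nn.Module):
    """Pseudo-reversible pair of MLPs: an encoder mapping inputs to a few
    active variables and a decoder approximating its inverse. Reversibility
    is encouraged only via a penalty, unlike RevNet's exact invertibility."""
    def __init__(self, in_dim, active_dim, width=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, active_dim))
        self.decoder = nn.Sequential(
            nn.Linear(active_dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, in_dim))

    def forward(self, x):
        z = self.encoder(x)
        return z, self.decoder(z)

def train_prnn(x, y, active_dim=1, epochs=3000, lr=1e-3, lam=1.0):
    """Trains the PRNN with a pseudo-reversibility penalty plus an auxiliary
    regression head on the active variables -- an assumed stand-in for the
    paper's adaptive weighting and sensitivity-control loss terms."""
    model = PRNN(x.shape[1], active_dim)
    head = nn.Sequential(nn.Linear(active_dim, 64), nn.Tanh(), nn.Linear(64, 1))
    opt = torch.optim.Adam(list(model.parameters()) + list(head.parameters()), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        z, x_rec = model(x)
        rec = ((x - x_rec) ** 2).mean()                # pseudo-reversibility
        fit = ((head(z).squeeze(-1) - y) ** 2).mean()  # keep z informative about f
        (lam * rec + fit).backward()
        opt.step()
    return model

def synthesized_regression(model, x_train, y_train, x_query, k=20, deg=2):
    """Local least-squares polynomial fit in the active variables: neighbors
    are selected by Euclidean distance in the ORIGINAL input space, then
    projected to the low-dimensional space for the fit (a one-dimensional
    active space is assumed for the simple Vandermonde fit below)."""
    with torch.no_grad():
        z_train = model.encoder(x_train).numpy()
        z_query = model.encoder(x_query).numpy()
    x_np, y_np = x_train.numpy(), y_train.numpy()
    preds = []
    for xq, zq in zip(x_query.numpy(), z_query):
        idx = np.argsort(np.linalg.norm(x_np - xq, axis=1))[:k]
        V = np.vander(z_train[idx, 0], deg + 1)        # design matrix in z
        coef, *_ = np.linalg.lstsq(V, y_np[idx], rcond=None)
        preds.append(np.polyval(coef, zq[0]))
    return np.array(preds)

if __name__ == "__main__":
    torch.manual_seed(0)
    d, n = 5, 500
    x = 2 * torch.rand(n, d) - 1
    w = torch.ones(d) / d ** 0.5
    y = torch.sin(3.0 * (x @ w))     # ridge function with a 1D active variable
    model = train_prnn(x, y)
    x_test = 2 * torch.rand(100, d) - 1
    y_hat = synthesized_regression(model, x, y, x_test)
    print("mean abs error:", np.abs(y_hat - torch.sin(3.0 * (x_test @ w)).numpy()).mean())
```

In the actual DRiLLS method the sensitivity control enters through the PRNN loss itself rather than an auxiliary head, and the local fit is not limited to one active variable; this sketch only mirrors the overall two-stage structure of learning active variables and then regressing locally on them.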
