Level Set Learning with PRNN for Nonlinear Dimension Reduction in Function Approximation


Speaker: Prof. Lili Ju, University of South Carolina


Time: June 20, 2023, 10:00 AM


Venue: Room 301, Building 3


Host: School of Mathematics and Physics


About the speaker: Prof. Lili Ju received his B.S. in Mathematics from Wuhan University in 1995, his M.S. in Computational Mathematics from the Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, in 1998, and his Ph.D. in Applied Mathematics from Iowa State University in 2002. From 2002 to 2004 he was a postdoctoral researcher at the Institute for Mathematics and its Applications, University of Minnesota. He then joined the University of South Carolina, where he has served in the Department of Mathematics as Assistant Professor (2004-2008), Associate Professor (2008-2012), and Professor (2013-present). His research focuses on numerical methods and analysis for partial differential equations, nonlocal models and algorithms, computer vision, deep learning algorithms, high-performance scientific computing, and their applications in materials science and geosciences. He has published more than 140 research papers, with over 5,000 Google Scholar citations. Since 2006 he has been the principal investigator of more than ten research projects funded by the U.S. National Science Foundation and the Department of Energy. He served as an associate editor of the SIAM Journal on Numerical Analysis from 2012 to 2017 and is currently an associate editor of JSC, NMPDE, NMTMA, and AAMM. His joint work on phase-field simulation of microstructure evolution in alloys on the Sunway TaihuLight supercomputer was a finalist for the 2016 Gordon Bell Prize in high-performance computing applications.


Abstract: Inspired by the Nonlinear Level Set Learning (NLL) method, which uses the reversible residual network (RevNet), we propose a new method of Dimension Reduction via Learning Level Sets (DRiLLS) for function approximation. Our method consists of two major components: a pseudo-reversible neural network (PRNN) module that effectively transforms high-dimensional input variables into low-dimensional active variables, and a synthesized regression module that approximates function values from the transformed data in the low-dimensional space. The PRNN not only relaxes the invertibility constraint that the NLL method inherits from its use of RevNet, but also adaptively weights the influence of each sample and controls the sensitivity of the function to the learned active variables. The synthesized regression uses Euclidean distance in the input space to select neighboring samples, whose projections onto the space of active variables are used to perform local least-squares polynomial fitting. This helps to resolve the numerical oscillation issues present in traditional local and global regressions. Extensive experimental results demonstrate that our DRiLLS method outperforms both the NLL and Active Subspace methods, especially when the target function possesses critical points in the interior of its input domain.
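
The abstract does not spell out how pseudo-reversibility is realized. Below is a minimal PyTorch sketch of one plausible reading: an encoder g maps inputs to a few active variables, and a separate decoder h is trained to approximately invert it, so invertibility is encouraged by a reconstruction penalty rather than built into the architecture as in RevNet. The names (PRNN, active_dim, recon_weight) and layer sizes are illustrative assumptions, not the authors' implementation; the adaptive sample weighting and sensitivity-control terms mentioned in the abstract are omitted.

```python
import torch
import torch.nn as nn

class PRNN(nn.Module):
    """Sketch of a pseudo-reversible pair: encoder g maps inputs to a few
    active variables, decoder h is trained to approximately invert g.
    Unlike RevNet, invertibility is only encouraged by a penalty,
    not enforced by the architecture."""

    def __init__(self, in_dim: int, active_dim: int, width: int = 64):
        super().__init__()
        self.g = nn.Sequential(            # x -> z (active variables)
            nn.Linear(in_dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, active_dim),
        )
        self.h = nn.Sequential(            # z -> x (approximate inverse)
            nn.Linear(active_dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, in_dim),
        )

    def forward(self, x):
        z = self.g(x)
        x_rec = self.h(z)
        return z, x_rec

def pseudo_reversibility_loss(model, x, recon_weight=1.0):
    """Reconstruction penalty enforcing h(g(x)) ~ x. The full DRiLLS
    objective also weights samples and controls the sensitivity of the
    target function to the active variables; both are omitted here."""
    z, x_rec = model(x)
    return recon_weight * torch.mean((x_rec - x) ** 2)
```

Likewise, the synthesized regression step can be sketched as follows, assuming for simplicity a single active variable: neighbors of a query point are selected by Euclidean distance in the original input space, and a low-degree polynomial is fitted by least squares to their projections in the active-variable space. The neighborhood size k, polynomial degree, and the encode helper are illustrative assumptions.

```python
import numpy as np

def synthesized_regression(x_query, X, Z, f_vals, encode, k=20, degree=2):
    """Approximate f at x_query: select k neighbors by Euclidean distance
    in the INPUT space, then fit a local least-squares polynomial to their
    projections Z (here one active variable) and evaluate it at the query."""
    dists = np.linalg.norm(X - x_query, axis=1)
    idx = np.argsort(dists)[:k]                 # k nearest samples in input space
    z_nbrs = Z[idx].ravel()                     # their active-variable coordinates
    coeffs = np.polyfit(z_nbrs, f_vals[idx], degree)  # local polynomial fit
    z_query = np.asarray(encode(x_query)).ravel()     # project the query point
    return np.polyval(coeffs, z_query)
```

Selecting neighbors in the input space, rather than in the active-variable space, is what the abstract credits with suppressing the numerical oscillations of purely local or global regressions.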