
Presentation Transcript


  1. Accelerating Forward Electromagnetic Scattering Prediction using Neural Networks and Generalized Mie Theory. Speaker: Tiem Leong Yoon, School of Physics, Universiti Sains Malaysia, 11800 USM

  2. Presented at the 3RD INTERNATIONAL CONFERENCE ON SEMICONDUCTOR MATERIALS AND TECHNOLOGY (ICOSEMT 2023), 18 Sept 2023. Venue: Shangri-La Rasa Sayang, Batu Ferringhi, Penang

  3. Abstract
     Electromagnetic scattering applications often require a large number of computationally expensive simulations, demanding valuable resources and time. In this research, we explore the potential of machine learning to enhance forward electromagnetic scattering prediction. Specifically, we investigate the utilization of neural networks to establish the relationship between a generic scatterer and its resulting scattering characteristics, enabling rapid predictions of the optical output, in the form of a scattering extinction cross section, by the Generalized Multiparticle Mie-solutions (GMM) code, a computational package implementing the Generalized Lorenz-Mie Theory. We propose NNGMM, a forward-modeling neural network designed to predict the electromagnetic scattering of an aggregate of spheres. The NNGMM model is trained on a synthetic dataset generated by the GMM code. We extensively validate and stress-test NNGMM with a diverse set of synthetic data. Our results demonstrate that NNGMM accurately predicts the extinction cross section for arbitrary aggregates with outstanding precision, achieving an R-squared value exceeding 99%. Consequently, NNGMM proves to be a reliable alternative to the GMM code for calculating extinction cross sections, offering a substantial gain in efficiency and a significantly reduced computational cost. The integration of neural networks and the GMM physical simulator presents a powerful approach to accelerating the computation of forward electromagnetic problems. As a result, this combined approach holds the potential to serve as a key component in constructing an efficient inverse electromagnetic problem solver in the future. The successful application of NNGMM in predicting scattering characteristics opens up promising avenues for optimizing electromagnetic simulations in various practical applications.

  4. FORWARD VS INVERSE EM SCATTERING PROBLEM

  5. FORWARD VS INVERSE EM SCATTERING PROBLEM
     Forward EM scattering problem: given a known configuration, the optical response is calculated deterministically using GMT.
     Inverse EM scattering problem: given the optical response, one seeks to figure out the configuration of the multi-particle aggregate.

  6. GMM
     A calculator for solving the forward EM scattering problem: Generalized Multiparticle Mie-solution (GMM) by Y.-L. Xu and Bo Å. S. Gustafson (late 1990s).
     https://scattport.org/index.php/light-scattering-software/multiple-particle-scattering/135-gmm-generalized-multiparticle-mie-solution

  7. GMM as a CALCULATOR
     GMM acts like a function: it maps a multi-dimensional input X (the configuration) to a multi-dimensional output Y (the optical response).
     Input (configuration): X = {x_i, y_i, z_i; a_i; n_i, k_i}, i ∈ [1, N]
     Output (optical response): Y = σ_ext(λ; θ = 0°), plotted against λ (nm)
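To make the calculator picture concrete, here is a minimal Python sketch of the interface such a black box exposes. The `Sphere` fields mirror the configuration variables on the slide, but the class and the `gmm_extinction` name are illustrative stand-ins, not part of the actual GMM code.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Sphere:
    # One sphere in the aggregate: centre, radius, complex refractive index.
    x: float; y: float; z: float   # position x_i, y_i, z_i (nm)
    a: float                       # radius a_i (nm)
    n: float; k: float             # refractive index n_i + i*k_i

def gmm_extinction(spheres: List[Sphere], wavelength_nm: float) -> float:
    """Stand-in for the GMM code: configuration in, sigma_ext out.

    The real calculation solves the Generalized Lorenz-Mie equations;
    here we only fix the interface that a surrogate must reproduce."""
    raise NotImplementedError("delegate to the external GMM code here")
```

The surrogate model discussed later only needs this input/output contract, not the internals of the solver.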

  8. In machine learning jargon
     GMM is a target model. X is known as the features; Y is known as the targets. X and Y are multi-dimensional variables. Mapping X to Y in the context of a forward EM scattering problem amounts to a regression problem: Y = f(X).

  9. A regression problem (in high-dimensional space)
     Given X, how do we approximate Y based on existing knowledge of the known (training) data points?
     Y = f(X): the GMM calculator / target model.
     Ŷ = f̂(X): the approximation function / surrogate model.

  10. Surrogate
     A lightweight approximation function to the target model: Y = f(X) computed by GMM; Ŷ = f̂(X) computed by a surrogate.

  11. NNGMM
     Goal: establish a proof of concept by using a deep neural network as a surrogate model for the GMM calculator. The surrogate is a neural-network representation of the original calculator: NNGMM [Y.L. Thong and T.L. Yoon, 2022].

  12. Step 1: Data generation
     Generate random configurations X_i = {x_i, y_i, z_i; a_i; n_i, k_i}, i ∈ [1, N].
     Feed each X_i into GMM to produce the corresponding Y_i = σ_ext at θ = 0°.
     Features data set: X. Targets data set: Y.
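The generation step can be sketched as follows. The sampling ranges and the fixed-length zero-padded encoding are illustrative assumptions; the talk does not specify how configurations with different numbers of spheres are encoded for the network.

```python
import numpy as np

def sample_configuration(rng, n_max=10):
    """Draw one random aggregate. The ranges below are illustrative
    placeholders, not necessarily those used in the talk."""
    n = int(rng.integers(1, n_max + 1))        # number of spheres N
    xyz = rng.uniform(-50.0, 50.0, (n, 3))     # centres x_i, y_i, z_i (nm)
    a = rng.uniform(1.0, 4.0, n)               # radii a_i (nm)
    nk = rng.uniform(0.1, 1.0, (n, 2))         # optical constants n_i, k_i
    return xyz, a, nk

def to_feature_vector(xyz, a, nk, n_max=10):
    """Flatten one configuration and zero-pad to a fixed length so every
    X_i has the same shape regardless of how many spheres it contains."""
    feat = np.concatenate([xyz.ravel(), a, nk.ravel()])
    out = np.zeros(n_max * 6)                  # 6 numbers per sphere
    out[:feat.size] = feat
    return out
```

Each feature vector would then be paired with the GMM-computed σ_ext to form one (X_i, Y_i) training example.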

  13. Training and testing data sets
     (X_train, Y_train): 7,505,169 configurations
     (X_test, Y_test): 479,600 configurations

  14. Constraints of the free parameters used to control the ranges of the input variables
     Variable            Values
     [λ_init, λ_last]    [50 nm, 100 nm]
     N_max               10
     a_i                 [1 nm, 4 nm]
     [n_min, n_max]      [0.1, 1.0]
     [k_min, k_max]      [0.1, 1.0]
     X_i = {x_i, y_i, z_i; a_i; n_i, k_i}, i ∈ [1, N]; Y_i = GMM(X_i)

  15. Training
     (X_train, Y_train), 7,505,169 configurations → Deep Neural Network → the trained surrogate f̂_DNN
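The supervised training step can be illustrated with a deliberately simple surrogate. A linear model stands in for the deep network here, but the loop is the same recipe: predict, measure the MSE against the GMM targets, update the weights.

```python
import numpy as np

def train_surrogate(X, Y, lr=0.1, epochs=500):
    """Fit a linear surrogate (w, b) to the pairs (X, Y) by minimising
    the mean squared error with plain gradient descent. NNGMM itself is
    a deep network, but the supervised recipe is identical."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        err = X @ w + b - Y                 # prediction minus target
        w -= lr * (2.0 / n) * (X.T @ err)   # gradient of MSE w.r.t. w
        b -= lr * (2.0 / n) * err.sum()     # gradient of MSE w.r.t. b
    return w, b
```

Swapping the linear predictor for a deep network changes only the model and the gradient computation, not the structure of the loop.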

  16. Testing
     Feed X_test (479,600 configurations) into f̂_DNN to obtain Y_predict (479,600 values), then compare against Y_test (479,600 values).

  17. Results
     MSE: 4.0; R-square: 1.0. Scatter plot of σ_ext (prediction) vs σ_ext (ground truth) for f̂_DNN, the surrogate model. It works!
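The two reported metrics can be computed as follows; this is a generic sketch, not code from the talk.

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error between GMM ground truth and surrogate output."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean((y_true - y_pred) ** 2))

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot, where SS_tot
    is the variance of the ground truth about its mean."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)
```

An R-square near 1.0 means the surrogate explains almost all of the variance of the GMM-computed extinction cross sections.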

  18. Acceleration in computational time
     Time speed-up by the surrogate model: more than 100 times compared to the target model (GMM).
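A sketch of how such a speed-up ratio can be measured; `measured_speedup` is an illustrative helper, not part of the talk's tooling.

```python
import time

def measured_speedup(slow_fn, fast_fn, inputs, repeats=3):
    """Wall-clock ratio slow/fast over identical inputs (best of `repeats`).
    The talk reports >100x for the surrogate vs the GMM code; this helper
    only shows how such a ratio can be measured fairly."""
    def best(fn):
        times = []
        for _ in range(repeats):
            t0 = time.perf_counter()
            for x in inputs:
                fn(x)
            times.append(time.perf_counter() - t0)
        return min(times)   # best-of-repeats reduces timing noise
    return best(slow_fn) / best(fast_fn)
```

Timing both models on the same batch of configurations keeps the comparison apples-to-apples.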

  19. Pros and Cons
     Pros: faster computation speeds; reduced costs.
     Cons: high initial computational cost to prepare the training dataset; limited generality beyond the training set; requires a new surrogate model if the target model is altered.

  20. Conclusion
     Proof of concept: a surrogate model in the form of a DNN representation of the target model (the GMM) is possible.
     Expected to be generalized to a larger variable span (e.g., wider ranges of N, a_i, n_i, k_i, λ).

  21. Future work on the potential application

  22. Forward vs. inverse problem in terms of f and f⁻¹
     Forward scattering problem: Y = f(X), where X is the features (configuration of the aggregate target), Y is the targets (σ_ext), and f is the surrogate model for GMM. X and Y are generated by the GMM target model.
     Inverse scattering problem: X = f⁻¹(Y), where X is the configuration of the aggregate target, Y is σ_ext, and f⁻¹ is the surrogate model for solving the inverse scattering problem. X and Y are again generated by the GMM target model.

  23. Forward vs. inverse problem in terms of f and f⁻¹
     The surrogate model f⁻¹ could be obtained by reversing the roles of X and Y when training the DNN. This would allow the EM inverse scattering problem to be solved. A work in progress.
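Reversing the roles of X and Y amounts to nothing more than exchanging features and targets in the training set, as this illustrative helper shows.

```python
import numpy as np

def make_inverse_dataset(X_config, Y_response):
    """Build training pairs for the inverse surrogate f^-1 by exchanging
    the roles of features and targets: the optical response becomes the
    network input and the aggregate configuration becomes the regression
    target. The same GMM-generated data serves both directions."""
    X_inv = np.asarray(Y_response)   # sigma_ext values as inputs
    Y_inv = np.asarray(X_config)     # configurations as targets
    return X_inv, Y_inv
```

Whether a plain regression network can resolve the ambiguity of the inverse map (different configurations can yield similar spectra) is exactly what makes this a work in progress.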

  24. THANK YOU
