Browsing by Author "Du, Xiaoping"
Now showing 1 - 10 of 15
Item: Adaptive Kriging Method for Uncertainty Quantification of the Photoelectron Sheath and Dust Levitation on the Lunar Surface (ASME, 2021)
Wei, Xinpeng; Zhao, Jianxun; He, Xiaoming; Hu, Zhen; Du, Xiaoping; Han, Daoru. Mechanical and Energy Engineering, School of Engineering and Technology.
This paper presents an adaptive Kriging based method to perform uncertainty quantification (UQ) of the photoelectron sheath and dust levitation on the lunar surface. The objective of this study is to identify the upper and lower bounds of the electric potential and of the dust levitation height, given the intervals of model parameters in the one-dimensional (1D) photoelectron sheath model. To improve the calculation efficiency, we employ the widely used adaptive Kriging method (AKM). A task-oriented learning function and a stopping criterion are developed to train the Kriging model and customize the AKM. Experimental analysis shows that the proposed AKM is both accurate and efficient.

Item: Applying Machine Learning to Optimize Sintered Powder Microstructures from Phase Field Modeling (2020-12)
Batabyal, Arunabha; Zhang, Jing; Yang, Shengfeng; Du, Xiaoping.
Sintering is a primary particulate manufacturing technology that provides densification and strength for ceramics and many metals. A persistent problem in this technology has been maintaining the quality of the manufactured parts, which can be attributed to the various sources of uncertainty present during the manufacturing process. In this work, a two-particle phase-field model that simulates microstructure evolution during the solid-state sintering process has been analyzed. The two input parameters, surface diffusivity and inter-particle distance, have been treated as the sources of uncertainty, and the response quantity of interest (QOI) is the size of the neck region that develops between the two particles. Two cases, with equal-sized and unequal-sized particles, were studied. It was observed that the neck size increased with increasing surface diffusivity and decreased with increasing inter-particle distance, irrespective of particle size. Sensitivity analysis found that inter-particle distance has more influence on the variation in neck size than surface diffusivity. The machine-learning algorithm Gaussian process regression was used to create a surrogate model of the QOI, and Bayesian optimization was used to find optimal values of the input parameters. For equal-sized particles, optimization using the probability of improvement acquisition function gave optimal values of surface diffusivity and inter-particle distance of 23.8268 and 40.0001, respectively; the expected improvement acquisition function gave 23.9874 and 40.7428. For unequal-sized particles, the optimal values from probability of improvement were 23.9700 and 33.3005, while those from expected improvement were 23.9893 and 33.9627. The optimization results from the two acquisition functions were in good agreement with each other.
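The following is a minimal, hypothetical sketch of the Gaussian process regression surrogate plus expected-improvement Bayesian optimization workflow described in this entry. The toy objective, parameter ranges, kernel, and sample sizes are assumptions for illustration only; they are not the study's phase-field model or settings.

```python
# Hypothetical illustration: GP surrogate + Expected Improvement maximization.
# The objective below stands in for the phase-field neck-size simulation;
# the actual study trained the surrogate on simulation outputs, not this toy.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def neck_size_toy(x):
    # x[:, 0] ~ surface diffusivity, x[:, 1] ~ inter-particle distance (assumed ranges)
    return x[:, 0] / 25.0 - x[:, 1] / 50.0  # larger diffusivity, smaller distance -> larger neck

rng = np.random.default_rng(0)
bounds = np.array([[15.0, 25.0], [30.0, 50.0]])            # assumed design ranges
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(10, 2))  # initial design of experiments
y = neck_size_toy(X)

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

def expected_improvement(Xcand, gp, y_best):
    # EI acquisition for maximizing the neck size.
    mu, sigma = gp.predict(Xcand, return_std=True)
    sigma = np.maximum(sigma, 1e-12)
    z = (mu - y_best) / sigma
    return (mu - y_best) * norm.cdf(z) + sigma * norm.pdf(z)

for _ in range(20):                                        # sequential optimization loop
    gp.fit(X, y)
    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(2000, 2))
    ei = expected_improvement(cand, gp, y.max())
    x_next = cand[np.argmax(ei)]                           # candidate with highest EI
    X = np.vstack([X, x_next])
    y = np.append(y, neck_size_toy(x_next[None, :]))

print("Best design found (diffusivity, distance):", X[np.argmax(y)])
```

In the actual study the surrogate would be trained on neck-size values computed by the phase-field simulations rather than on an analytic stand-in.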
The results also confirmed that surface diffusivity should be higher and inter-particle distance lower to achieve a larger neck size and better mechanical properties of the material.

Item: Approximation to Multivariate Normal Integral and Its Application in Time-Dependent Reliability Analysis (Elsevier, 2021-01)
Wei, Xinpeng; Han, Daoru; Du, Xiaoping. Mechanical and Energy Engineering, School of Engineering and Technology.
It is common to evaluate high-dimensional normal probabilities in many uncertainty-related applications, such as system and time-dependent reliability analysis. An accurate method is proposed to evaluate high-dimensional normal probabilities, especially when they reside in tail areas. The normal probability is first converted into the cumulative distribution function of the extreme value of the involved normal variables. The series expansion method is then employed to approximate the extreme value with respect to a smaller number of mutually independent standard normal variables. The moment generating function of the extreme value is obtained using the Gauss-Hermite quadrature method. The saddlepoint approximation method is finally used to estimate the cumulative distribution function of the extreme value, and thereby the desired normal probability. The proposed method is then applied to time-dependent reliability analysis, in which a large number of dependent normal variables arise from the use of the First Order Reliability Method. Examples show that the proposed method is generally more accurate and robust than the widely used randomized quasi-Monte Carlo method and the equivalent component method.

Item: Development of ABAQUS-MATLAB Interface for Design Optimization using Hybrid Cellular Automata and Comparison with Bidirectional Evolutionary Structural Optimization (2021-12)
Antony, Alen; Tovar, Andres; Nematollahi, Khosrow; Du, Xiaoping.
Topology optimization is a technique used to synthesize structures without any preconceived shape, typically for minimum-compliance problems. With the rapid improvement of advanced manufacturing technology and the increased need for lightweight, high-strength designs, topology optimization is being used more than ever. A number of commercially available software packages can be used to optimize a product; they have robust finite element solvers and can produce good results. However, these packages offer the user little to no choice in the optimization method. It is possible to use a programming language such as MATLAB to implement a specific optimization method, but the user must then also write the finite element analysis (FEA) routines, gaining flexibility in the optimization method while losing the robust FEA of the commercial tool. Past work has linked ABAQUS with MATLAB, but primarily as a tool for finite element post-processing. This thesis aims to develop an interface that can be used to solve optimization problems with different methods, such as hard-kill methods as well as material penalization (SIMP). Doing so harnesses the potential of commercial FEA software while giving the user the flexibility to write or modify code to use an optimization method of his or her choice.
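To make the saddlepoint step in the multivariate-normal entry above concrete, a standard Lugannani-Rice approximation to the CDF of the extreme value Y (generic textbook notation; the paper's exact formulation may differ) reads:

```latex
% K(t) = log M_Y(t) is the cumulant generating function of the extreme value Y,
% with M_Y(t) estimated by Gauss-Hermite quadrature; t_s is the saddlepoint.
\[
  K'(t_s) = y, \qquad
  w = \operatorname{sgn}(t_s)\sqrt{2\bigl(t_s y - K(t_s)\bigr)}, \qquad
  v = t_s\sqrt{K''(t_s)},
\]
\[
  P(Y \le y) \;\approx\; \Phi(w) + \phi(w)\left(\frac{1}{w} - \frac{1}{v}\right),
\]
```

where Phi and phi are the standard normal CDF and PDF; for a common threshold y, the desired normal probability then follows from P(X_1 <= y, ..., X_n <= y) = P(Y <= y) with Y the maximum of the involved normal variables.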
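Below is a bare-bones sketch of the kind of external driver loop the ABAQUS interface entry describes, coupling an optimizer to the solver. The thesis implements this in MATLAB; Python is used here only to keep this document's examples in one language, and the file names, output parsing, and density-update rule are hypothetical placeholders rather than the thesis's actual code.

```python
# Hypothetical skeleton: external optimization loop driving ABAQUS runs.
import subprocess

def write_input_file(densities, path="model.inp"):
    # Placeholder: write an ABAQUS input deck whose element material properties
    # reflect the current design densities (SIMP-style penalization).
    with open(path, "w") as f:
        f.write("** generated input deck (placeholder)\n")

def read_compliance(dat_path="Job-1.dat"):
    # Placeholder: extract the objective (e.g., strain energy) from the job output.
    return 1.0

def update_densities(densities, compliance):
    # Placeholder for an optimality-criteria, BESO, or HCA update of the design field.
    return densities

densities = [1.0] * 100                       # initial design variables (placeholder)
for it in range(10):                          # outer optimization loop
    write_input_file(densities)
    # 'abaqus job=... input=... interactive' runs the solver and waits for completion.
    subprocess.run(["abaqus", "job=Job-1", "input=model.inp", "interactive"], check=True)
    compliance = read_compliance()
    densities = update_densities(densities, compliance)
```

The only solver-specific piece is the command-line invocation; the rest is generic glue that a hard-kill or SIMP update rule could plug into.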
Also, by implementing this interface, it can potentially be used to unlock the capabilities of other Dassault Systèmes software, as the firm is implementing tighter integration among all its products through the 3DExperience platform. This thesis uses the interface to implement BESO- and HCA-based topology optimization. Since hybrid cellular automata is the only method other than the equivalent static load method that can be used for crashworthiness optimization, this work is well suited to that role when extended into the nonlinear regime.

Item: Distributed Nonlinear Model Predictive Control for Heterogeneous Vehicle Platoons Under Uncertainty (IEEE Xplore, 2021-09)
Shen, Dan; Yin, Jianhua; Du, Xiaoping; Li, Lingxi. Electrical and Computer Engineering, School of Engineering and Technology.
This paper presents a novel distributed nonlinear model predictive control (DNMPC) approach for minimizing velocity tracking and spacing errors in a heterogeneous vehicle platoon under uncertainty. The vehicle longitudinal dynamics and the information flow in the platoon are established and analyzed. The DNMPC algorithm, with robustness and reliability considerations at each vehicle (or node), is developed based on the leading vehicle and on reference information from nodes in its neighboring set. Together with the physical constraints on the control input, the nonlinear constraints on vehicle longitudinal dynamics, the terminal constraints on states, and the reliability constraints on both input and output, the objective function is defined to optimize control accuracy and efficiency by penalizing the tracking errors between the predicted outputs and the desired outputs of the same node and of neighboring nodes, respectively. The robust design optimization model also minimizes the expected quality loss, which consists of the mean and standard deviation of node inputs and outputs. The simulation results demonstrate the accuracy and effectiveness of the proposed approach under two different traffic scenarios.

Item: Evaluating ARCADIA/Capella vs. OOSEM/SysML for System Architecture Development (2019-08)
Alai, Shashank P.; El-Mounayri, Hazim; Ben Miled, Zina; Schreiber, Joerg; Du, Xiaoping.
Systems engineering is gaining momentum in many segments of the product manufacturing industry. Model-Based Systems Engineering (MBSE) is the formalized application of modeling to perform systems engineering activities. To utilize the full potential of MBSE, a methodology consisting of appropriate processes, methods, and tools is a key necessity. In the last decade, several MBSE projects have been implemented in industries ranging from aerospace and defense to automotive, healthcare, and transportation. The Systems Modeling Language (SysML) standard has been a key enabler of these projects at many companies. Although SysML can provide a rich representation of any system through various viewpoints, the journey toward adopting SysML to realize the true potential of MBSE has been a challenge. One common roadblock faced by systems engineers across industries has been the software-engineering-based nature of SysML, which makes the modeling concepts difficult to grasp for people without a software engineering background.
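For readers unfamiliar with the structure of such controllers, the distributed MPC entry above penalizes tracking and neighbor-coupling errors in an objective of roughly the following generic form (notation assumed here for illustration; it is not taken from the paper):

```latex
% Generic finite-horizon cost for vehicle i over a prediction horizon of N_p steps:
% deviation of the predicted output y_i from its own desired trajectory, deviation
% from outputs shared by neighbors j in the set N_i, and a control-effort penalty.
\[
  J_i = \sum_{k=0}^{N_p-1}\Bigl(
        \lVert y_i(k) - y_i^{\mathrm{des}}(k)\rVert_{Q_i}^{2}
        + \sum_{j\in\mathcal{N}_i}\lVert y_i(k) - y_j(k)\rVert_{G_{ij}}^{2}
        + \lVert u_i(k)\rVert_{R_i}^{2}\Bigr),
\]
```

subject to the longitudinal-dynamics, input, terminal, and reliability constraints listed in that entry.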
As a consequence, developing a system (or system-of-systems) architecture model using SysML has remained a challenging task for many engineers, even a decade after the language's inception and multiple successive iterations of its specification. Being a modeling language, SysML is method-agnostic, but its associated limitations outweigh the advantages. ARCADIA (Architecture Analysis and Design Integrated Approach) is a systems and software architecture engineering method based on architecture-centric and model-based engineering activities. Applied properly, ARCADIA offers a very effective way to model the architecture of multi-domain systems and overcomes many of the limitations of traditional SysML implementations. This thesis evaluates the architecture development capabilities of ARCADIA/Capella versus SysML following the Object-Oriented Systems Engineering Method (OOSEM). The study focuses on the key equivalences and differences between the two MBSE solutions from a model development perspective and provides several criteria to evaluate their effectiveness for architecture development, using a conceptual case of Adaptive Cruise Control (ACC). The evaluation is based on three perspectives: architecture quality, the ability to support key process deliverables, and the overall methodology. Toward this end, an industry-wide survey of MBSE practitioners and thought leaders was conducted both to identify concerns in using models and to validate the results of the study. The case study demonstrates how the ARCADIA/Capella approach addresses several challenges currently faced in SysML implementation. From a process point of view, ARCADIA/Capella and SysML equally support the provision of the key deliverable artifacts required in the systems engineering process. However, the candidate architectures developed using the two approaches show considerable differences in aspects such as the mapping of form to function and the creation of functional architectures. The ARCADIA/Capella approach allows a ‘good’ system architecture representation to be developed efficiently and intuitively. The study also answers several useful questions about the candidate methodologies and serves as a practitioner's reference for selecting the most suitable approach.

Item: Image Segmentation, Parametric Study, and Supervised Surrogate Modeling of Image-based Computational Fluid Dynamics (2022-05)
Islam, Md Mahfuzul; Yu, Huidan (Whitney); Du, Xiaoping; Wagner, Diane.
With the recent advancement of computation and imaging technology, image-based computational fluid dynamics (ICFD) has emerged as a powerful non-invasive capability for studying biomedical flows. These modern technologies increase the potential of computation-aided diagnostics and therapeutics in a patient-specific environment. In this work, I studied three components of the ICFD process. To ensure accurate medical assessment, realistic computational analysis is needed, for which patient-specific image segmentation of the diseased vessel is of paramount importance. Image segmentation of several human arteries, veins, capillaries, and organs was therefore conducted for use in subsequent hemodynamic simulations, using several open-source and commercial software packages.
This study incorporates a new computational platform, called InVascular, to quantify the 4D velocity field in image-based pulsatile flows using the volumetric lattice Boltzmann method (VLBM). We also conducted several parametric studies on an idealized case of a 3-D pipe with the dimensions of a human renal artery, investigating the relationship between stenosis severity and the resistive index (RI) and exploring how pulsatile parameters such as heart rate and pulsatile pressure gradient affect the RI. Because ICFD analysis is based on imaging and other hemodynamic data, it is often time-consuming due to extensive data processing. For clinicians to make fast medical decisions regarding their patients, rapid and accurate ICFD results are needed. To that end, we also developed surrogate models that show the potential of supervised machine learning methods for constructing efficient and precise surrogates of Hagen-Poiseuille and Womersley flows.

Item: A new noninvasive and patient-specific hemodynamic index for the severity of renal stenosis and outcome of interventional treatment (Wiley, 2022)
Yu, Huidan; Khan, Monsurul; Wu, Hao; Du, Xiaoping; Chen, Rou; Rollins, Dave M.; Fang, Xin; Long, Jianyun; Xu, Chenke; Sawchuk, Alan P. Mechanical and Energy Engineering, School of Engineering and Technology.
Renal arterial stenosis (RAS) often causes renovascular hypertension, which may result in kidney failure and life-threatening consequences. Direct assessment of the hemodynamic severity of RAS has yet to be addressed. In this work, we present a computational concept to derive a new, noninvasive, and patient-specific index that assesses the hemodynamic severity of RAS and predicts the potential benefit to the patient from stenting therapy. The hemodynamic index is derived from a functional relation between the translesional pressure indicator (TPI) and the lumen volume reduction (S) through a parametric deterioration of the RAS. Our in-house computational platform for image-based computational hemodynamics, InVascular, is used to compute the TPI at a given S. InVascular integrates unified computational modeling for both image processing and computational hemodynamics with GPU parallel computing technology. The TPI-S curve reveals a pair of thresholds of S indicating mild or severe RAS, and the TPI at S = 0 represents the pressure improvement following a successful stenting therapy. Six patient cases, with a total of 6 aortic and 12 renal arteries, are studied. The computed blood pressure waveforms agree well with the in-vivo measurements, and the systolic pressure is statistically equivalent to the in-vivo measurements with p < 0.001. Uncertainty quantification provides the reliability of the computed pressure through the corresponding 95% confidence interval. The severity assessments of RAS in four cases are consistent with medical practice. The preliminary results motivate a more sophisticated investigation of the new index for real medical insight. This computational concept can also be applied to other arterial stenoses, such as iliac stenosis.
Such a noninvasive and patient-specific hemodynamic index has the potential to aid clinical decision-making for interventional treatment with reduced medical cost and patient risk.

Item: A Partial Safety Factor Method for System Reliability Prediction With Outsourced Components (ASME, 2019-06)
Hu, Zhengwei; Du, Xiaoping. Mechanical and Energy Engineering, School of Engineering and Technology.
System reliability is usually predicted with the assumption that all component states are independent. This assumption may not be accurate for systems with outsourced components, since their states are strongly dependent and component details may be unknown. The purpose of this study is to develop an accurate system reliability method that can produce the complete joint probability density function (PDF) of all the component states, thereby leading to accurate system reliability predictions. The proposed method works for systems whose failures are caused by excessive loading. In addition to the component reliability, system designers also ask component suppliers for partial safety factors for shared loadings; this information is then sufficient for building a system-level joint PDF. Algorithms are designed for a component supplier to generate the partial safety factors. The method enables accurate system reliability predictions without requiring proprietary information from component suppliers.

Item: Photoelectron Sheath near the Lunar Surface: Fully Kinetic Modeling and Uncertainty Quantification Analysis (American Institute of Aeronautics and Astronautics, 2020-01-05)
Zhao, Jianxun; Wei, Xinpeng; Hu, Zhangli; He, Xiaoming; Han, Daoru; Hu, Zhen; Du, Xiaoping. Mechanical and Energy Engineering, School of Engineering and Technology.
This paper considers plasma charging on the lunar surface with a focus on the photoelectron sheath. The plasma species include the ambient solar wind (protons and electrons) and photoelectrons emitted from the illuminated lunar surface. This work is motivated by the high computational cost associated with uncertainty quantification (UQ) analysis of plasma simulations using high-fidelity, fully kinetic models. We study the photoelectron sheath near the lunar surface with a focus on the effects of uncertain variables (such as the ambient electron density or the photoelectron temperature) on the plasma environment. A fully kinetic 3-D finite-difference (FD) particle-in-cell (PIC) code is utilized to simulate the plasma interaction near the lunar surface and the resulting photoelectron sheath. For the UQ analysis, this PIC code is treated as a black box providing high-fidelity quantities of interest, which are also used to construct efficient reduced-order models. A 1-D configuration is first studied to demonstrate the procedure and capability of the UQ analysis. The rest of the paper is organized as follows: Section III presents the analytic and numerical solutions of the 1-D photoelectron sheath, along with verification and validation of the FD-PIC code for the photoelectron sheath solution; Section IV describes the Kriging model and the uncertainty quantification approach; Section V discusses the UQ analysis of the 1-D photoelectron sheath; and the conclusion is given in Section VI.
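A minimal, hypothetical sketch of the Kriging-surrogate uncertainty quantification workflow described in the entry above (and, in adaptive form, in the first entry of this listing): a Gaussian process is trained on a small number of expensive model runs and then sampled cheaply to propagate input uncertainty. The stand-in function, input ranges, and sample sizes below are assumptions for illustration; the adaptive learning function and stopping criterion of the actual studies are omitted.

```python
# Hypothetical illustration: Kriging (GP) surrogate for UQ of a black-box model.
# `sheath_potential_toy` stands in for an expensive PIC simulation; the real
# studies trained the surrogate on simulator outputs, not this analytic toy.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def sheath_potential_toy(x):
    # x[:, 0] ~ ambient electron density, x[:, 1] ~ photoelectron temperature (normalized)
    return np.sin(3.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(1)
X_train = rng.uniform(0.0, 1.0, size=(30, 2))   # small design of experiments
y_train = sheath_potential_toy(X_train)          # "expensive" model evaluations

kernel = ConstantKernel(1.0) * RBF(length_scale=[0.2, 0.2])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

# Propagate input uncertainty through the cheap surrogate instead of the simulator.
X_mc = rng.uniform(0.0, 1.0, size=(100_000, 2))
y_mc = gp.predict(X_mc)
print("mean:", y_mc.mean(), "std:", y_mc.std(), "bounds:", y_mc.min(), y_mc.max())
```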
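For reference, the two canonical flows named as surrogate-modeling targets in the image-based CFD thesis entry above are characterized by the following standard relations (generic notation, not taken from that work):

```latex
% Hagen-Poiseuille law for steady laminar flow in a rigid circular tube of radius R
% and length L, and the Womersley number governing pulsatile flow.
\[
  Q = \frac{\pi R^{4}\,\Delta P}{8 \mu L},
  \qquad
  \alpha = R\sqrt{\frac{\omega \rho}{\mu}},
\]
```

where Q is the volumetric flow rate, Delta P the pressure drop over length L, mu the dynamic viscosity, rho the fluid density, and omega the angular frequency of the pulsatile driving pressure.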