963 results for Non-linear functions
Abstract:
Imaging technologies are widely used in application fields such as the natural sciences, engineering, medicine, and the life sciences. A broad class of imaging problems reduces to solving ill-posed inverse problems (IPs). Traditional strategies for solving these ill-posed IPs rely on variational regularization methods, which are based on the minimization of suitable energies and make use of knowledge about the image formation model (forward operator) and prior knowledge on the solution, but fail to incorporate knowledge directly from data. On the other hand, more recent learned approaches can easily learn the intricate statistics of images from a large set of data, but lack a systematic method for incorporating prior knowledge about the image formation model. The main purpose of this thesis is to discuss data-driven image reconstruction methods which combine the benefits of these two different reconstruction strategies for the solution of highly nonlinear ill-posed inverse problems. Mathematical formulations and numerical approaches for imaging IPs, including linear as well as strongly nonlinear problems, are described. More specifically, we address the Electrical Impedance Tomography (EIT) reconstruction problem by unrolling the regularized Gauss-Newton method and integrating the regularization learned by a data-adaptive neural network. Furthermore, we investigate the solution of non-linear ill-posed IPs by introducing a deep-PnP framework that integrates a graph convolutional denoiser into the proximal Gauss-Newton method, with a practical application to EIT, a recently introduced and promising imaging technique. Efficient algorithms are then applied to the solution of the limited-electrode problem in EIT, combining compressive sensing techniques and deep learning strategies. Finally, a transformer-based neural network architecture is adapted to restore the noisy solution of the Computed Tomography problem recovered using the filtered back-projection method.
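The regularized Gauss-Newton scheme that the thesis unrolls can be sketched classically. This is a minimal sketch under stated assumptions: the nonlinear forward operator below is a two-variable toy stand-in (not an EIT model), and a plain Tikhonov penalty stands in for the learned, data-adaptive regularizer.

```python
import numpy as np

def gauss_newton_step(x, y, F, J, lam):
    """One regularized Gauss-Newton update for min ||F(x) - y||^2 + lam*||x||^2."""
    Jx = J(x)                                # Jacobian of the forward operator at x
    r = F(x) - y                             # data residual
    H = Jx.T @ Jx + lam * np.eye(x.size)     # Gauss-Newton Hessian + Tikhonov term
    g = Jx.T @ r + lam * x                   # gradient of the regularized objective
    return x - np.linalg.solve(H, g)

# Toy nonlinear forward operator (an illustrative stand-in, not an EIT model)
F = lambda x: np.array([np.exp(x[0]) + x[1], x[0] + x[1] ** 2])
J = lambda x: np.array([[np.exp(x[0]), 1.0], [1.0, 2.0 * x[1]]])

y = F(np.array([1.0, 2.0]))   # data generated from the "true" solution (1, 2)
x = np.array([0.5, 1.5])      # initial guess
for _ in range(20):
    x = gauss_newton_step(x, y, F, J, lam=1e-6)
```

Unrolling fixes the number of such steps in advance and replaces the hand-chosen regularization term with one learned from data by a neural network.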
Abstract:
In this thesis work, a nonlinear model for Interdigitated Capacitors (IDCs) based on ferroelectric materials is proposed. Through the properties of materials such as Hafnium-Zirconium Oxide (HfZrO2), it is possible to realize tunable radiofrequency (RF) circuits. In particular, the model proposed in this thesis describes the use of an IDC, realized on a high-resistivity silicon substrate, as a phase shifter for beam-steering applications. The model is obtained starting from existing experimental measurements, from which a circuit model is identified. The model is tested at both low and high input power values using Harmonic Balance simulations, which show excellent convergence of the model up to 40 dBm of input power. Furthermore, an array composed of two patches, both operating at 2.55 GHz, which exploits the tunable properties of the HfZrO-based IDC, is proposed. At 0 dBm the model shows a 47° phase shift between bias voltages of -1 V and 1 V, which leads to an 11° steering of the main lobe of the array.
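The relation between inter-element phase shift and main-lobe direction of a uniform linear array can be sketched as follows. Note the element spacing used here (0.68 λ, chosen so that a 47° shift yields roughly 11°) is an illustrative assumption, not a value taken from the thesis.

```python
import math

def steering_angle_deg(delta_phi_deg, spacing_over_lambda):
    """Main-lobe steering angle of a uniform linear array for a given
    inter-element phase shift: theta = asin(delta_phi / (2*pi*d/lambda))."""
    arg = math.radians(delta_phi_deg) / (2 * math.pi * spacing_over_lambda)
    return math.degrees(math.asin(arg))

# 47 deg phase shift; 0.68-lambda spacing is an assumed value for illustration
print(round(steering_angle_deg(47, 0.68), 1))
```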
Abstract:
Activation functions within neural networks play a crucial role in Deep Learning since they allow the network to learn complex and non-trivial patterns in the data. However, the ability to approximate non-linear functions is a significant limitation when implementing neural networks on a quantum computer to solve typical machine learning tasks. The main burden lies in the unitarity constraint of quantum operators, which forbids non-linearity and poses a considerable obstacle to developing such non-linear functions in a quantum setting. Nevertheless, several attempts to realize quantum activation functions have been made in the literature. Recently, the idea of QSplines has been proposed to approximate a non-linear activation function by implementing the quantum version of spline functions. Yet, QSplines suffer from various drawbacks. Firstly, the final function estimation requires a post-processing step; thus, the value of the activation function is not available directly as a quantum state. Secondly, QSplines need many error-corrected qubits and very long quantum circuits to be executed. These constraints prevent the adoption of QSplines on near-term quantum devices and limit their generalization capabilities. This thesis aims to overcome these limitations by leveraging hybrid quantum-classical computation. In particular, several methods for Variational Quantum Splines are proposed and implemented, to pave the way for the development of complete quantum activation functions and unlock the full potential of quantum neural networks in the field of quantum machine learning.
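The spline idea behind QSplines can be illustrated classically: approximate a non-linear activation over a knot grid, here with a piecewise-linear spline of the sigmoid. This is a classical sketch of the approximation being targeted, not the quantum circuit itself; the knot count and interval are arbitrary choices.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Knots over the interval of interest; QSplines encode per-interval spline
# coefficients so the function value can be produced by a quantum routine.
knots = np.linspace(-6.0, 6.0, 25)
values = sigmoid(knots)

def spline_activation(x):
    """Piecewise-linear spline approximation of the sigmoid (classical sketch)."""
    return np.interp(x, knots, values)

x = np.linspace(-5.0, 5.0, 101)
max_err = float(np.max(np.abs(spline_activation(x) - sigmoid(x))))
```

Even this crude linear spline stays within about 1e-2 of the true sigmoid; higher-order splines tighten the bound further.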
Abstract:
Privacy issues and data scarcity in the PET field call for efficient methods to expand datasets via the synthetic generation of new data that cannot be traced back to real patients and that are also realistic. In this thesis, machine learning techniques were applied to 1001 amyloid-beta PET images that had undergone a diagnosis of Alzheimer's disease: the evaluations were 540 positive, 457 negative and 4 unknown. The Isomap algorithm was used as a manifold learning method to reduce the dimensions of the PET dataset; a numerical scale-free interpolation method was applied to invert the dimensionality reduction map. The interpolant was tested on the PET images via leave-one-out cross-validation (LOOCV), where the removed images were compared with the reconstructed ones using the mean SSIM index (MSSIM = 0.76 ± 0.06). The effectiveness of this measure is questioned, since it indicated slightly higher performance for a method of comparison using PCA (MSSIM = 0.79 ± 0.06), which gave clearly poorer quality reconstructed images than those recovered by the numerical inverse mapping. Ten synthetic PET images were generated and, after having been mixed with ten originals, were sent to a team of clinicians for a visual assessment of their realism; no significant agreement was found either between the clinicians and the true image labels or among the clinicians, meaning that original and synthetic images were indistinguishable. The future perspective of this thesis points to the improvement of the amyloid-beta PET research field by increasing the available data, overcoming the constraints of data acquisition and privacy issues. Potential improvements can be achieved via refinements of the manifold learning and inverse mapping stages of the PET image analysis, by exploring different combinations of algorithm parameters and by applying other non-linear dimensionality reduction algorithms. A final prospect of this work is the search for new methods to assess image reconstruction quality.
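The inverse-mapping stage can be sketched with a radial-basis-function interpolant from the low-dimensional embedding back to data space. The Gaussian kernel, its width, and the toy one-dimensional embedding below are illustrative assumptions, not the scale-free interpolation method actually used in the thesis.

```python
import numpy as np

def rbf_inverse_map(Z, X, z_new, gamma=50.0):
    """Map an embedding point z_new back to data space with Gaussian-RBF
    interpolation fitted on (embedding Z -> data X) pairs."""
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-gamma * d2)                      # n x n kernel matrix
    W = np.linalg.solve(Phi, X)                    # interpolation weights
    phi_new = np.exp(-gamma * ((Z - z_new) ** 2).sum(-1))
    return phi_new @ W

# Toy check: a 1-D "embedding", with data lying on a smooth curve over it
Z = np.linspace(0.0, 1.0, 20)[:, None]
X = np.column_stack([np.sin(2 * np.pi * Z[:, 0]), np.cos(2 * np.pi * Z[:, 0])])
x_rec = rbf_inverse_map(Z, X, np.array([0.5]))
```

On real PET data, X holds flattened images and Z their Isomap coordinates; generating a synthetic image then amounts to evaluating the interpolant at a new embedding point.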
Abstract:
The aim of this contribution is to extend the techniques of composite materials design to non-linear material behaviour and apply them to the design of new materials for passive vibration control. As a first step, a computational tool was developed for determining optimized macroscopic one-dimensional isolator behaviour. Voigt, Maxwell, standard and more complex material models can be implemented. The objective function considers minimization of the initial reaction and/or displacement peak as well as minimization of the steady-state amplitude of the reaction and/or displacement. The complex stiffness approach is used to formulate the governing equations in an efficient way. Material stiffness parameters are assumed to be non-linear functions of the displacement. The numerical solution is performed in the complex space. The steady-state solution in the complex space is obtained by an iterative process based on the shooting method, which imposes conditions of periodicity with respect to the known value of the period. The extension of the shooting method to the complex space is presented and verified. The non-linear behaviour of the material parameters is then optimized by a generic probabilistic meta-algorithm, simulated annealing. The dependence of the global optimum on several combinations of the leading parameters of the simulated annealing procedure, such as the neighbourhood definition and the annealing schedule, is also studied and analyzed. The procedure is programmed in the MATLAB environment.
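The simulated-annealing meta-algorithm used for the material-parameter optimization can be sketched as follows (in Python rather than the MATLAB of the contribution). The one-dimensional toy objective is illustrative only; the actual objective is an energy computed from the isolator's steady-state response. Note how the neighbourhood definition and the geometric annealing schedule, the leading parameters studied in the contribution, appear explicitly.

```python
import math
import random

def simulated_annealing(f, x0, neighbour, t0=1.0, cooling=0.95, steps=2000):
    """Minimal simulated-annealing loop: accept worse moves with
    probability exp(-delta / T), cool T geometrically."""
    random.seed(0)                    # fixed seed for reproducibility
    x, fx, t = x0, f(x0), t0
    best, fbest = x, fx
    for _ in range(steps):
        y = neighbour(x)
        fy = f(y)
        if fy < fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy             # accept the move
            if fx < fbest:
                best, fbest = x, fx   # track the best point seen
        t *= cooling                  # annealing schedule
    return best, fbest

# Toy multimodal objective (illustrative, not the isolator energy)
f = lambda x: x * x + 10 * (1 - math.cos(x))
x, fx = simulated_annealing(f, x0=6.0,
                            neighbour=lambda x: x + random.uniform(-1, 1))
```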
Abstract:
In this article we present a project [1] developed to demonstrate the capability of Multi-Layer Perceptrons (MLPs) to approximate non-linear functions [2]. The simulation has been implemented in Java so that it can be used on any computer over the Internet [3], with simple operation and a pleasant interface. The power of the simulation lies in the user's ability to observe the evolution of the approximations, the contribution of each neuron, the effect of the different parameters, etc. In addition, an online help system has been implemented to guide the user during the simulation.
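The approximation capability the applet demonstrates can be reproduced in a few lines. A minimal sketch (in Python rather than the project's Java) trains a one-hidden-layer MLP by plain gradient descent on a sine target; the layer width, learning rate, and target function are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(-np.pi, np.pi, 64)[:, None]   # inputs
Y = np.sin(X)                                 # non-linear target

# 1-16-1 MLP with tanh hidden units
W1 = rng.normal(0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(5000):
    H = np.tanh(X @ W1 + b1)          # hidden activations
    P = H @ W2 + b2                   # network output
    E = P - Y                         # error
    gW2 = H.T @ E / len(X); gb2 = E.mean(0)
    dH = (E @ W2.T) * (1 - H ** 2)    # backprop through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
mse = float((E ** 2).mean())
```

The applet visualizes exactly this process: each tanh unit contributes one "bump" to the overall fit, and the user watches the bumps combine as training proceeds.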
Abstract:
Methods of improving the coverage of Box–Jenkins prediction intervals for linear autoregressive models are explored. These methods use bootstrap techniques to allow for parameter estimation uncertainty and to reduce the small-sample bias in the estimator of the models’ parameters. In addition, we also consider a method of bias-correcting the non-linear functions of the parameter estimates that are used to generate conditional multi-step predictions.
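A minimal sketch of the bootstrap idea for a zero-mean AR(1) model: each replicate rebuilds a series from resampled residuals and re-estimates the coefficient, so the one-step-ahead interval reflects parameter-estimation uncertainty as well as innovation noise. The simulated series and the simple percentile interval are illustrative assumptions, not the bias-correction scheme of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_ar1(y):
    """Least-squares estimate of phi in y_t = phi * y_{t-1} + e_t."""
    return float(y[:-1] @ y[1:] / (y[:-1] @ y[:-1]))

# Simulated zero-mean AR(1) series standing in for the observed data
phi_true, n = 0.6, 200
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + rng.normal()

phi_hat = fit_ar1(y)
resid = y[1:] - phi_hat * y[:-1]

# Bootstrap the one-step-ahead forecast distribution
forecasts = []
for _ in range(500):
    e = rng.choice(resid, size=n)         # resampled innovations
    yb = np.zeros(n)
    for t in range(1, n):
        yb[t] = phi_hat * yb[t - 1] + e[t]
    phi_b = fit_ar1(yb)                   # parameter-uncertainty draw
    forecasts.append(phi_b * y[-1] + rng.choice(resid))
lo, hi = np.percentile(forecasts, [2.5, 97.5])
```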
Abstract:
Non-linear mathematical functions proposed by Brody, Gompertz, Richards, Bertalanffy and Verhulst were compared in several buffalo production systems in Colombia. Herds were located in three provinces: Antioquia, Caldas, and Cordoba. Growth was better described by the curves proposed by Brody and Gompertz. Using the datasets from herds from Caldas, heritabilities for traits such as weaning weight (WW), weight and maturity at one year of age (WY and MY, respectively), age at 50% and 75% of maturity (A50% and A75%, respectively), adult weight (β0), and other characteristics, were also estimated. Direct and maternal heritabilities for WW were 0.19 and 0.12, respectively. Direct heritabilities for WY, MY, A50%, A75% and β0 were 0.39, 0.15, 0.09, 0.20 and 0.09, respectively. The genetic correlation for β0 and WY was -0.47, indicating that selection for heavy weight at one year of age will lead to lower weight at adult age. These data suggest that selection based on maturity traits can generate changes in characteristics of economic importance in beef-type buffalo farms.
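The Brody and Gompertz forms can be written down and fitted directly. The weights below are generated from a Gompertz curve with illustrative parameter values, not estimates from the buffalo data; with the adult weight b0 treated as known, the Gompertz model log-linearises, so the remaining parameters follow from a straight-line fit.

```python
import numpy as np

def brody(t, b0, b1, b2):
    """Brody growth curve: W(t) = b0 * (1 - b1 * exp(-b2 * t))."""
    return b0 * (1 - b1 * np.exp(-b2 * t))

def gompertz(t, b0, b1, b2):
    """Gompertz growth curve: W(t) = b0 * exp(-b1 * exp(-b2 * t))."""
    return b0 * np.exp(-b1 * np.exp(-b2 * t))

# Hypothetical weights (kg) at monthly ages from a Gompertz curve
t = np.arange(1.0, 37.0)
w = gompertz(t, b0=600.0, b1=2.5, b2=0.12)

# Known b0 gives the linear form: ln(ln(b0 / W)) = ln(b1) - b2 * t
slope, intercept = np.polyfit(t, np.log(np.log(600.0 / w)), 1)
b2_hat, b1_hat = -slope, np.exp(intercept)
```

In practice all three parameters are estimated jointly by nonlinear least squares; the log-linearisation is shown only to make the curve's structure explicit.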
Abstract:
Temperature-dependent population growth of the diamondback moth (DBM) Plutella xylostella (L.), a prolific insect pest of crucifer vegetables, was studied under six constant temperatures in the laboratory. The objective of the study was to predict the impacts of temperature changes on the population of DBM at high-resolution scales along altitudinal gradients and under climate change scenarios. Non-linear functions were fitted to the data to model the development, mortality, longevity and oviposition of the pest. The best-fitted functions for each life stage were compiled to estimate the life table parameters of the species by stochastic simulations. To quantify the impacts on the pest, three indices (establishment, generation and activity) were computed using the estimates of the life table parameters together with temperature data obtained at local scale (current scenario, 2013) and downscaled climate change data (future scenario, 2055) from the AFRICLIM database. To measure and represent the impacts of temperature change along the altitude on the pest, the indices were mapped along the altitudinal gradients of Kilimanjaro and the Taita Hills, in Tanzania and Kenya, respectively. The potential impact of the changes between the 2013 and 2055 climate scenarios was assessed. The data files included in this database were utilized for the above analysis to develop the temperature-dependent phenology of Plutella xylostella and to assess its current and future distribution along the eastern African Afromontane regions.
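Fitting a non-linear development-rate function to constant-temperature data can be sketched with the Brière-1 form, a common choice for insect development. The temperatures, rates and thermal limits below are illustrative numbers, not the DBM estimates from the study; with the thermal limits held fixed, the model is linear in its scale parameter, so the least-squares estimate reduces to a ratio.

```python
import numpy as np

def briere(T, a, T0, Tm):
    """Briere-1 development-rate function: r(T) = a*T*(T - T0)*sqrt(Tm - T),
    set to 0 outside the thermal limits (T0, Tm)."""
    T = np.asarray(T, dtype=float)
    r = a * T * (T - T0) * np.sqrt(np.clip(Tm - T, 0, None))
    return np.where((T > T0) & (T < Tm), r, 0.0)

# Hypothetical development rates (1/day) at six constant temperatures (deg C)
temps = np.array([10.0, 15.0, 20.0, 25.0, 28.0, 32.0])
rates = briere(temps, a=1.2e-4, T0=7.0, Tm=35.0)

# With T0 and Tm fixed, the model is linear in `a`: least squares is a ratio
basis = briere(temps, a=1.0, T0=7.0, Tm=35.0)
a_hat = float(basis @ rates / (basis @ basis))
```

The study repeats this kind of fit for each life stage and process (development, mortality, longevity, oviposition) and selects the best-fitting function per stage.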
Abstract:
In Information Filtering (IF), a user may be interested in several topics in parallel. However, IF systems have been built on representational models derived from Information Retrieval and Text Categorization, which assume independence between terms. The linearity of these models results in user profiles that can only represent one topic of interest. We present a methodology that takes term dependencies into account to construct a single profile representation for multiple topics, in the form of a hierarchical term network. We also introduce a series of non-linear functions for evaluating documents against the profile. Initial experiments produced positive results.
Abstract:
Numerical optimization is a technique where a computer is used to explore design parameter combinations to find extremes in performance factors. In multi-objective optimization several performance factors can be optimized simultaneously. The solution to multi-objective optimization problems is not a single design, but a family of optimized designs referred to as the Pareto frontier. The Pareto frontier is a trade-off curve in the objective function space composed of solutions where performance in one objective function is traded for performance in others. A Multi-Objective Hybridized Optimizer (MOHO) was created for the purpose of solving multi-objective optimization problems by utilizing a set of constituent optimization algorithms. MOHO tracks the progress of the Pareto frontier approximation development and automatically switches amongst those constituent evolutionary optimization algorithms to speed the formation of an accurate Pareto frontier approximation. Aerodynamic shape optimization is one of the oldest applications of numerical optimization. MOHO was used to perform shape optimization on a 0.5-inch ballistic penetrator traveling at Mach number 2.5. Two objectives were simultaneously optimized: minimize aerodynamic drag and maximize penetrator volume. This problem was solved twice. The first time the problem was solved by using Modified Newton Impact Theory (MNIT) to determine the pressure drag on the penetrator. In the second solution, a Parabolized Navier-Stokes (PNS) solver that includes viscosity was used to evaluate the drag on the penetrator. The studies show the difference in the optimized penetrator shapes when viscosity is absent and present in the optimization. In modern optimization problems, objective function evaluations may require many hours on a computer cluster to perform these types of analysis. One solution is to create a response surface that models the behavior of the objective function. 
Once enough data about the behavior of the objective function has been collected, a response surface can be used to represent the actual objective function in the optimization process. The Hybrid Self-Organizing Response Surface Method (HYBSORSM) algorithm was developed and used to make response surfaces of objective functions. HYBSORSM was evaluated using a suite of 295 non-linear functions. These functions involve from 2 to 100 variables, demonstrating the robustness and accuracy of HYBSORSM.
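For a finite set of evaluated designs, the Pareto-frontier approximation that MOHO builds reduces to a non-dominance filter. A minimal sketch for minimization of all objectives follows; the sample drag/volume numbers are illustrative (volume is negated so that both objectives are minimized).

```python
def pareto_front(points):
    """Return the non-dominated subset, minimizing every objective."""
    def dominates(q, p):
        # q dominates p if q is no worse in every objective and differs in one
        return all(qi <= pi for qi, pi in zip(q, p)) and q != p
    return [p for p in points if not any(dominates(q, p) for q in points)]

# (drag, -volume) pairs: trade-off points survive, dominated designs drop out
designs = [(1.0, -3.0), (2.0, -5.0), (1.5, -4.0), (2.5, -4.5), (1.2, -2.0)]
front = pareto_front(designs)
```

Here (2.5, -4.5) is dominated by (2.0, -5.0) and (1.2, -2.0) by (1.0, -3.0), so the surviving three points trace the trade-off curve between drag and volume.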
Abstract:
The late Neogene was a time of cryosphere development in the northern hemisphere. The present study was carried out to estimate the sea surface temperature (SST) change during this period based on the quantitative planktonic foraminiferal data of 8 DSDP sites in the western Pacific. Target factor analysis has been applied to the conventional transfer function approach to overcome the no-analog conditions caused by evolutionary faunal changes. By applying this technique through a combination of time-slice and time-series studies, the SST history of the last 5.3 Ma has been reconstructed for the low latitude western Pacific. Although the present data set is close to the statistical limits of factor analysis, the clear presence of sensible variations in individual SST time-series suggests the feasibility and reliability of this method in paleoceanographic studies. The estimated SST curves display the general trend of the temperature fluctuations and reveal three major cool periods in the late Neogene, i.e. the early Pliocene (4.7-3.5 Ma), the late Pliocene (3.1-2.7 Ma), and the latest Pliocene to early Pleistocene (2.2-1.0 Ma). Cool events are reflected in the increase of seasonality and meridional SST gradient in the subtropical area. The latest Pliocene to early Pleistocene cooling is most important in the late Neogene climatic evolution. It differs from the previous cool events in its irreversible, steplike change in SST, which established the glacial climate characteristic of the late Pleistocene. The winter and summer SST decreased by 3.3-5.4°C and 1.0-2.1°C in the subtropics, by 0.9°C and 0.6°C in the equatorial region, and showed little or no cooling in the tropics. Moreover, this cooling event occurred as a gradual SST decrease during 2.2-1.0 Ma at the warmer subtropical sites, while at the cooler subtropical site it was an abrupt SST drop at 2.2 Ma. In contrast, the equatorial and tropical western Pacific experienced only minor SST change during the entire late Neogene.
In general, the subtropics were much more sensitive to climatic forcing than the tropics, and the cooling events were most extensive in the cooler subtropics. The early Pliocene cool periods can be correlated with Antarctic ice volume fluctuations, and the latest Pliocene-early Pleistocene cooling reflects the climatic evolution during the cryosphere development of the northern hemisphere.