994 results for Modeling Methodology


Relevance: 30.00%

Abstract:

This paper presents an individually designed prosthesis for surgical use and proposes a methodology for such design through mathematical extrapolation of data from digital images obtained via tomography of an individual patient's bones. An individually tailored prosthesis, designed to fit a particular patient's requirements as accurately as possible, should result in more successful reconstruction, enable better planning before surgery, and consequently lead to fewer complications during surgery. Fast and accurate design and manufacture of personalized prostheses for surgical use in bone replacement or reconstruction is potentially feasible through the application and integration of several existing technologies, each at a different stage of maturity. Initial case-study experiments have been undertaken to validate the research concepts by making dimensional comparisons between a bone and a virtual model produced using the proposed methodology, and future research directions are discussed.
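
A minimal sketch of one step such a pipeline might involve: segmenting a bone outline from a single tomographic slice by thresholding and marching squares, then taking a simple dimensional measurement from the extracted contour. The synthetic slice and the threshold value are illustrative assumptions, not data or parameters from the paper.

```python
import numpy as np
from skimage import measure

# Synthetic stand-in for one CT slice (values loosely mimicking Hounsfield units).
slice_img = np.zeros((256, 256))
yy, xx = np.mgrid[:256, :256]
slice_img[(yy - 128) ** 2 + (xx - 128) ** 2 < 60 ** 2] = 1200.0  # a "bone" disc

BONE_THRESHOLD = 700.0  # assumed segmentation threshold, for illustration only

# Extract closed contours at the threshold level (marching squares).
contours = measure.find_contours(slice_img, BONE_THRESHOLD)
outer = max(contours, key=len)  # keep the longest contour as the bone outline

# Simple dimensional check: maximum diameter of the extracted outline, in pixels.
diffs = outer[:, None, :] - outer[None, :, :]
max_diameter_px = np.sqrt((diffs ** 2).sum(-1)).max()
print(f"contours found: {len(contours)}, max diameter: {max_diameter_px:.1f} px")
```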

Relevance: 30.00%

Abstract:

This work presents a strategy to control nonlinear responses of aeroelastic systems with control-surface freeplay. The proposed methodology is developed for the three-degree-of-freedom typical section airfoil, considering aerodynamic forces from Theodorsen's theory. The mathematical model is written in state-space form, using rational function approximation to represent the aerodynamic forces in the time domain. The control system is designed using fuzzy Takagi-Sugeno modeling to compute a feedback control gain. It uses a Lyapunov stability function and linear matrix inequalities (LMIs) to solve a convex optimization problem. Time simulations with different initial conditions are performed using a modified Runge-Kutta algorithm to compare the system with and without control forces. It is shown that this approach can compute a linear control gain able to stabilize aeroelastic systems with discontinuous nonlinearities.
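
A minimal sketch of the LMI-based gain synthesis that this kind of Takagi-Sugeno design typically reduces to, written here for a single linear vertex model with state feedback u = Kx and solved with cvxpy. The matrices A and B are illustrative placeholders, not the paper's aeroelastic model, and a full fuzzy design would impose one such inequality per rule.

```python
import numpy as np
import cvxpy as cp

# Illustrative open-loop model x' = A x + B u (placeholder values, not the airfoil model).
A = np.array([[0.0, 1.0],
              [2.0, -0.5]])   # unstable open loop
B = np.array([[0.0],
              [1.0]])
n, m = B.shape

# Standard change of variables: X = P^{-1} > 0 and Y = K X, so that the closed loop
# A + B K is stable iff  A X + X A' + B Y + Y' B'  is negative definite.
X = cp.Variable((n, n), symmetric=True)
Y = cp.Variable((m, n))
lyap = A @ X + X @ A.T + B @ Y + Y.T @ B.T
sym = 0.5 * (lyap + lyap.T)          # keep the LMI expression structurally symmetric
eps = 1e-6
constraints = [X >> eps * np.eye(n), sym << -eps * np.eye(n)]
cp.Problem(cp.Minimize(0), constraints).solve(solver=cp.SCS)

K = Y.value @ np.linalg.inv(X.value)
print("feedback gain K =", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A + B @ K))
```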

Relevance: 30.00%

Abstract:

The aim of this research is to verify the relationship between the maturity levels of environmental management and the adoption of green supply chain management (GSCM) practices by electro-electronic companies in Brazil. A two-phase study was conducted, one phase quantitative and the other qualitative. The quantitative phase aimed to test whether a relationship between the maturity levels of environmental management and GSCM exists, while the qualitative phase sought to detail the characteristics of this relationship. The quantitative phase was conducted through a survey of 100 Brazilian electro-electronic companies, and the collected data were processed using Structural Equation Modeling. For the qualitative phase, a multiple case study was conducted with three companies located in Brazil. The results indicate that: (1) the main hypothesis was confirmed and considered statistically valid, indicating that the maturity level of environmental management indeed influences the adoption of GSCM practices; (2) a coevolution tends to occur between environmental maturity and GSCM practices, that is, the more developed the company's environmental management, the more complex the GSCM practices adopted; and (3) internal GSCM practices tend to show greater relative adoption than external practices; external GSCM practices tend to be adopted when the company has reached a higher environmental stage and/or operates under stronger normative environmental pressure. To the best of our knowledge, this is the first study to combine a survey and case studies on GSCM in Brazil. (C) 2014 Elsevier B.V. All rights reserved.
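
A minimal sketch of the structural test at the core of the quantitative phase: a latent "maturity" factor predicting a latent "GSCM adoption" factor, written for the semopy package. The item names (mat1, gscm1, ...) and the simulated responses are hypothetical stand-ins, not the authors' survey instrument or data.

```python
import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(5)
n = 100

# Simulated indicator data standing in for survey responses.
maturity = rng.normal(size=n)
gscm = 0.6 * maturity + rng.normal(scale=0.8, size=n)
data = pd.DataFrame({
    "mat1": maturity + rng.normal(scale=0.5, size=n),
    "mat2": maturity + rng.normal(scale=0.5, size=n),
    "mat3": maturity + rng.normal(scale=0.5, size=n),
    "gscm1": gscm + rng.normal(scale=0.5, size=n),
    "gscm2": gscm + rng.normal(scale=0.5, size=n),
    "gscm3": gscm + rng.normal(scale=0.5, size=n),
})

# Measurement model plus the structural path corresponding to the main hypothesis.
desc = """
Maturity =~ mat1 + mat2 + mat3
GSCM =~ gscm1 + gscm2 + gscm3
GSCM ~ Maturity
"""
model = semopy.Model(desc)
model.fit(data)
print(model.inspect())   # loadings, the structural coefficient, and p-values
```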

Relevance: 30.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 30.00%

Abstract:

The multi-scale synoptic circulation system in the southeastern Brazil (SEBRA) region is presented using a feature-oriented approach. Prevalent synoptic circulation structures, or "features," are identified from previous observational studies. These features include the southward-flowing Brazil Current (BC), the eddies off Cabo Sao Tome (CST - 22 degrees S) and off Cabo Frio (CF - 23 degrees S), and the upwelling region off CF and CST. Their synoptic water-mass structures are characterized and parameterized to develop temperature-salinity (T-S) feature models. Following the methodology of [Gangopadhyay, A., Robinson, A.R., Haley, P.J., Leslie, W.J., Lozano, C.J., Bisagni, J., Yu, Z., 2003. Feature-oriented regional modeling and simulation (FORMS) in the Gulf of Maine and Georges Bank. Cont. Shelf Res. 23 (3-4), 317-353], a synoptic initialization scheme for feature-oriented regional modeling and simulation (FORMS) of the circulation in this region is then developed. First, the temperature and salinity feature-model profiles are placed on a regional circulation template and objectively analyzed with the available background climatology in the deep region. These initialization fields are then used for dynamical simulations via the Princeton Ocean Model (POM). The first applications of this methodology are presented in this paper, including the BC meandering, the BC-eddy interaction, and the meander-eddy-upwelling system (MEUS) simulations. Preliminary validation results include realistic wave growth, eddy formation and sustained upwelling. Our future plans include applying these feature models together with satellite and in-situ data and advanced data-assimilation schemes for nowcasting and forecasting in the SEBRA region. (c) 2008 Elsevier Ltd. All rights reserved.
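
A minimal sketch of the kind of initialization step described here: a parametric temperature feature-model profile merged with a background climatological profile using simple distance-based weights. The profile shapes, depths and weighting length scale are illustrative assumptions, not the FORMS parameterization.

```python
import numpy as np

z = np.linspace(0.0, 1000.0, 101)  # depth (m)

# Background climatological temperature profile (illustrative exponential decay).
t_clim = 4.0 + 18.0 * np.exp(-z / 300.0)

# Feature-model profile for the current core (illustrative: warmer, sharper thermocline).
t_feature = 4.0 + 23.0 * np.exp(-z / 180.0)

def blended_profile(dist_km, length_scale_km=80.0):
    """Blend the feature model into climatology as a function of distance from the feature axis."""
    w = np.exp(-(dist_km / length_scale_km) ** 2)
    return w * t_feature + (1.0 - w) * t_clim

# Profiles at the feature axis, 100 km away, and far offshore relax toward climatology.
for d in (0.0, 100.0, 400.0):
    print(f"{d:5.0f} km  surface T = {blended_profile(d)[0]:.2f} degC")
```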

Relevance: 30.00%

Abstract:

Background: Several models have been designed to predict survival of patients with heart failure. These, while available and widely used both for stratifying patients and for deciding among treatment options at the individual level, have several limitations. Specifically, some clinical variables that influence prognosis may have effects that change over time. Statistical models that include such characteristics may help in evaluating prognosis. The aim of the present study was to analyze and quantify the impact of modeling heart failure survival allowing for covariates with time-varying effects known to be independent predictors of overall mortality in this clinical setting. Methodology: Survival data from an inception cohort of five hundred patients diagnosed with heart failure in functional class III or IV between 2002 and 2004 and followed up to 2006 were analyzed using the Cox proportional hazards model and variations of the Cox and Aalen additive models. Principal Findings: One hundred and eighty-eight (188) patients died during follow-up. For the patients under study, age, serum sodium, hemoglobin, serum creatinine, and left ventricular ejection fraction were significantly associated with mortality. Evidence of a time-varying effect was suggested for the last three. Both high hemoglobin and high LV ejection fraction were associated with a reduced risk of dying, with a stronger initial effect. High creatinine, associated with an increased risk of dying, also presented a stronger initial effect. The impact of age and sodium was constant over time. Conclusions: The current study points to the importance of evaluating covariates with time-varying effects in heart failure models. The analysis performed suggests that variations of the Cox and Aalen models constitute a valuable tool for identifying these variables. The implementation of covariates with time-varying effects into heart failure prognostication models may reduce bias and increase the specificity of such models.
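
A minimal sketch of how a time-varying covariate effect can be examined with an additive hazards model in Python's lifelines package: in Aalen's model the regression coefficients are functions of time, so a flattening cumulative coefficient indicates a vanishing late effect. The data frame and column names (time, death, age, creatinine, ...) are simulated and hypothetical, not the study cohort.

```python
import numpy as np
import pandas as pd
from lifelines import AalenAdditiveFitter

rng = np.random.default_rng(0)
n = 500

# Hypothetical cohort with covariates loosely mimicking the predictors discussed above.
df = pd.DataFrame({
    "age": rng.normal(65, 10, n),
    "sodium": rng.normal(138, 4, n),
    "creatinine": rng.lognormal(0.2, 0.3, n),
    "lvef": rng.normal(35, 10, n),
})
# Synthetic survival times and event indicator (for illustration only).
risk = 0.02 * df["age"] + 0.8 * df["creatinine"] - 0.03 * df["lvef"]
df["time"] = rng.exponential(1.0 / np.exp(risk - risk.mean()))
df["death"] = rng.integers(0, 2, n)

# Aalen's additive model: time-varying cumulative regression coefficients.
aaf = AalenAdditiveFitter(coef_penalizer=0.1)
aaf.fit(df, duration_col="time", event_col="death")
print(aaf.cumulative_hazards_.head())
```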

Relevance: 30.00%

Abstract:

Primary voice production occurs in the larynx through vibrational movements carried out by the vocal folds. However, many problems can affect this complex system, resulting in voice disorders. In this context, time-frequency-shape analysis based on phase-space embedding plots and nonlinear dynamics methods has been used to evaluate vocal fold dynamics during phonation. For this purpose, the present work used high-speed video to record the vocal fold movements of three subjects and to extract the glottal area time series using an image segmentation algorithm. This signal is used by an optimization method that combines genetic algorithms and a quasi-Newton method to optimize the parameters of a biomechanical model of the vocal folds based on lumped elements (masses, springs and dampers). After optimization, this model is capable of simulating the dynamics of the recorded vocal folds and their glottal pulse. Bifurcation diagrams and phase-space analysis were used to evaluate the behavior of this deterministic system under different circumstances. The results showed that this methodology can be used to extract some physiological parameters of the vocal folds and to reproduce some complex behaviors of these structures, contributing to the scientific and clinical evaluation of voice production. (C) 2010 Elsevier Inc. All rights reserved.
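
A minimal sketch of the global-then-local parameter fitting this kind of methodology relies on, here fitting a toy damped-oscillator surrogate to a synthetic glottal-area signal with scipy's differential evolution (a population-based stand-in for the genetic algorithm) followed by quasi-Newton (BFGS) refinement. The signal and the surrogate model are illustrative, not the lumped-element vocal-fold model.

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

t = np.linspace(0.0, 0.05, 500)  # 50 ms of "glottal area" signal

def model(params, t):
    amp, freq, decay = params
    return amp * np.exp(-decay * t) * (1.0 + np.sin(2.0 * np.pi * freq * t))

# Synthetic target signal standing in for the segmented glottal-area series.
true_params = (0.4, 120.0, 8.0)
target = model(true_params, t) + 0.01 * np.random.default_rng(1).normal(size=t.size)

def cost(params):
    return np.mean((model(params, t) - target) ** 2)

# Stage 1: population-based global search (stand-in for the genetic algorithm).
bounds = [(0.1, 1.0), (50.0, 300.0), (0.0, 50.0)]
global_fit = differential_evolution(cost, bounds, seed=0)

# Stage 2: quasi-Newton refinement starting from the global optimum.
local_fit = minimize(cost, global_fit.x, method="BFGS")
print("estimated parameters:", local_fit.x)
```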

Relevance: 30.00%

Abstract:

The formation of a market price for an asset can be understood as a superposition of the individual actions of the market participants, which cumulatively generate supply and demand. This is comparable, in statistical physics, to the emergence of macroscopic properties produced by microscopic interactions between the system components involved. The distribution of price changes in financial markets differs markedly from a Gaussian distribution. This leads to empirical peculiarities of the price process, which include, besides the scaling behaviour, non-trivial correlation functions and temporally clustered volatility. The present work focuses on the analysis of financial market time series and the correlations they contain. A new method is developed for quantifying pattern-based complex correlations of a time series. With this methodology, significant evidence is found that typical behavioural patterns of financial market participants manifest themselves on short time scales, i.e. that the reaction to a given price history is not purely random, but rather that similar price histories provoke similar reactions. Building on the investigation of complex correlations in financial market time series, the question is addressed of which properties change at the transition from a positive trend to a negative trend. An empirical quantification by means of rescaling yields the result that, independently of the time scale considered, new price extrema are accompanied by an increase in transaction volume and a reduction of the time intervals between transactions. These dependences exhibit characteristics that are also found in other complex systems in nature, and in physical systems in particular. Over nine orders of magnitude in time, these properties are also independent of the analysed market: trends that persist for only seconds show the same characteristics as trends on time scales of months. This opens the possibility of learning more about financial market bubbles and their collapse, since trends on short time scales occur much more frequently. In addition, a Monte Carlo based simulation of the financial market is analysed and extended in order to reproduce the empirical properties and to gain insight into their origins, which lie partly in the market microstructure and partly in the risk aversion of the trading participants. For the computationally intensive procedures, a substantial reduction of the computing time can be achieved through parallelization on a graphics-card architecture. To demonstrate the wide range of applications of graphics cards, a standard model of statistical physics, the Ising model, is also ported to the graphics card, with significant run-time advantages. Partial results of this work are published in [PGPS07, PPS08, Pre11, PVPS09b, PVPS09a, PS09, PS10a, SBF+10, BVP10, Pre10, PS10b, PSS10, SBF+11, PB10].
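
A minimal sketch of the idea behind pattern-based (pattern-conditioned) correlations: group short sequences of return signs and check whether the next return depends on the preceding pattern. The random-walk series used here is an illustrative placeholder, so the conditional means should be statistically indistinguishable from zero; in real data, systematic deviations would indicate that similar price histories provoke similar reactions.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(42)
returns = rng.normal(0.0, 1.0, 100_000)   # placeholder for real price returns

PATTERN_LEN = 3
buckets = defaultdict(list)

# Condition the next return on the sign pattern of the preceding PATTERN_LEN returns.
signs = (returns > 0).astype(int)
for i in range(PATTERN_LEN, len(returns)):
    pattern = tuple(signs[i - PATTERN_LEN:i])
    buckets[pattern].append(returns[i])

for pattern in sorted(buckets):
    r = np.asarray(buckets[pattern])
    print(pattern, f"mean next return = {r.mean():+.4f}  (n = {r.size})")
```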

Relevance: 30.00%

Abstract:

Traditionally, the study of internal combustion engine operation has focused on steady-state performance. However, the daily driving schedule of automotive engines is inherently related to unsteady conditions, and there are various operating conditions experienced by (diesel) engines that can be classified as transient. Besides variations of the engine operating point, in terms of engine speed and torque, the warm-up phase can also be considered a transient condition. Chapter 2 deals with this thermal transient condition; more precisely, the main issue is the performance of a Selective Catalytic Reduction (SCR) system during the cold-start and warm-up phases of the engine. The aim of this part of the work is to investigate and identify optimal exhaust-line heating strategies that provide fast activation of the catalytic reactions in the SCR system. Chapters 3 and 4 focus on the dynamic behavior of the engine under typical driving conditions. The common approach to dynamic optimization involves the solution of a single optimal-control problem. However, this approach requires models that are valid throughout the whole engine operating range and actuator ranges; in addition, the result of the optimization is meaningful only if the model is very accurate. Chapter 3 proposes a methodology to circumvent those demanding requirements: an iteration between transient measurements, used to refine a purpose-built model, and a dynamic optimization constrained to the model's validity region. All numerical methods required to implement this procedure are also presented. Chapter 4 proposes an approach to derive a transient feedforward control system in an automated way. It relies on optimal control theory to solve a dynamic optimization problem for fast transients. From the optimal solutions, the relevant information is extracted and stored in maps spanned by the engine speed and the torque gradient.
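
A minimal sketch of the final step described for Chapter 4: storing precomputed actuator commands in a map spanned by engine speed and torque gradient and interpolating in it at run time. The grid and the stored values are illustrative placeholders, not results of the dynamic optimization.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Map axes: engine speed (rpm) and requested torque gradient (Nm/s).
speed_axis = np.linspace(1000.0, 4000.0, 7)
torque_grad_axis = np.linspace(0.0, 500.0, 6)

# Placeholder feedforward command (e.g., an injection or boost set-point) that would
# normally be filled cell by cell from the off-line optimal-control solutions.
S, G = np.meshgrid(speed_axis, torque_grad_axis, indexing="ij")
command_map = 0.3 + 1e-4 * S + 5e-4 * G

feedforward = RegularGridInterpolator((speed_axis, torque_grad_axis), command_map)

# Run-time lookup during a transient: current speed 2350 rpm, torque gradient 180 Nm/s.
print("feedforward command:", feedforward([[2350.0, 180.0]])[0])
```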

Relevance: 30.00%

Abstract:

This thesis is divided into three chapters. In the first chapter we analyse the results of the world forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure to evaluate earthquake forecasting models. We first present the models and the target earthquakes to be forecast, and then explain the consistency and comparison tests that are used in CSEP experiments to evaluate the performance of the models. Introducing a methodology to create ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of probabilistic seismic hazard analysis (PSHA): the declustering of the seismicity rates. We first introduce the Cornell-McGuire method for PSHA and present the different motivations that stand behind the need to decluster seismic catalogs. Using a theorem of modern probability (Le Cam's theorem), we show that declustering is not necessary to obtain the Poissonian behaviour of exceedances that is usually considered fundamental for transforming exceedance rates into exceedance probabilities in the PSHA framework. We present a method to correct PSHA for declustering, building a more realistic PSHA. In the last chapter we explore the methods that are commonly used to take into account the epistemic uncertainty in PSHA. The most widely used method is the logic tree, which stands at the basis of the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree, and then show that this structure is not adequate to describe the epistemic uncertainty. We then propose a new probabilistic framework based on ensemble modelling that properly accounts for epistemic uncertainties in PSHA.
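
A minimal sketch of the ensemble idea: combine two gridded rate forecasts with weights and score each against an observed catalogue using a Poisson joint log-likelihood, the quantity that CSEP-style consistency and comparison tests are built on. The rates, counts and weights below are invented for illustration, not CSEP data.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(7)

# Two hypothetical gridded forecasts: expected earthquake counts per spatial cell.
forecast_a = rng.gamma(2.0, 0.05, size=100)
forecast_b = rng.gamma(2.0, 0.08, size=100)

# Observed counts per cell (synthetic catalogue for illustration).
observed = rng.poisson(forecast_a)

def joint_log_likelihood(rates, counts):
    """Poisson joint log-likelihood of the observed counts given forecast rates."""
    return poisson.logpmf(counts, rates).sum()

# Weighted-average ensemble of the two forecasts.
w = 0.6
ensemble = w * forecast_a + (1.0 - w) * forecast_b

for name, rates in [("A", forecast_a), ("B", forecast_b), ("ensemble", ensemble)]:
    print(f"{name:9s} log-likelihood = {joint_log_likelihood(rates, observed):.2f}")
```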

Relevance: 30.00%

Abstract:

In the present work, a detailed analysis of a Mediterranean tropical-like cyclone (TLC) that occurred in January 2014 has been conducted. The author is not aware of other studies regarding this particular event at the time of publication of this thesis. To outline the cyclone's evolution, observational data, including weather-station data, satellite data, radar data and photographic evidence, were collected first. After the cyclone path and its general features had been identified, the GLOBO, BOLAM and MOLOCH NWP models, developed at ISAC-CNR (Bologna), were used to simulate the phenomenon. Particular attention was paid to the Mediterranean phase as well as to the Atlantic phase, since the cyclone showed a well-defined precursor up to 3 days before the formation of the pressure minimum in the Alboran Sea. The Mediterranean phase was studied using different combinations of the GLOBO, BOLAM and MOLOCH models, so as to evaluate the best model chain for simulating this kind of phenomenon. The BOLAM and MOLOCH models showed the best performance, correcting the track that was erroneously deviated in the National Centers for Environmental Prediction (NCEP) and ECMWF operational models. The analysis of the cyclone's thermal phase showed the presence of a deep warm-core structure in many cases, thus confirming the tropical-like nature of the system. Furthermore, the results showed high sensitivity to initial conditions over the whole lifetime of the cyclone, while modification of the Sea Surface Temperature (SST) led only to small changes in the Adriatic phase. The Atlantic phase was studied using the GLOBO and BOLAM models and with the aid of the same methodology. After tracing the precursor, in the form of a low-pressure system, from the American East Coast to Spain, the thermal phase analysis was conducted. The parameters obtained showed evidence of a deep cold-core asymmetric structure during the whole Atlantic phase, while the first contact with the Mediterranean Sea caused a sudden transition to a shallow warm-core structure. The examination of the three-dimensional Potential Vorticity (PV) structure revealed the presence of a PV streamer that formed separately over Greenland and eventually interacted with the low-pressure system over the Spanish coast, favouring the first phase of the cyclone's baroclinic intensification. Finally, the development of an automated system that tracks and studies the thermal phase of Mediterranean cyclones is encouraged; this could allow potential tropical transitions to be forecast with a minimal computational investment.
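
A minimal sketch of the kind of automated tracking advocated at the end of the abstract: follow the sea-level-pressure minimum from one analysis time to the next within a limited search radius. The pressure fields below are synthetic placeholders standing in for NWP output.

```python
import numpy as np

rng = np.random.default_rng(3)
ny, nx, n_times = 80, 120, 6

# Synthetic MSLP fields (hPa) with a moving low, standing in for model analyses.
lats = np.linspace(30.0, 46.0, ny)[:, None]
lons = np.linspace(-10.0, 20.0, nx)[None, :]
fields = []
for t in range(n_times):
    c_lat, c_lon = 35.0 + 1.0 * t, -5.0 + 3.0 * t
    low = -25.0 * np.exp(-(((lats - c_lat) / 2.0) ** 2 + ((lons - c_lon) / 3.0) ** 2))
    fields.append(1015.0 + low + rng.normal(0.0, 0.3, (ny, nx)))

def track(fields, lats, lons, search_radius_deg=6.0):
    """Follow the MSLP minimum, restricting each step to a radius around the last fix."""
    positions = []
    for field in fields:
        if positions:
            lat0, lon0 = positions[-1]
            dist = np.hypot(lats - lat0, lons - lon0)
            field = np.where(dist <= search_radius_deg, field, np.inf)
        j, i = np.unravel_index(np.argmin(field), field.shape)
        positions.append((float(lats[j, 0]), float(lons[0, i])))
    return positions

for step, (lat, lon) in enumerate(track(fields, lats, lons)):
    print(f"t={step}: centre at lat={lat:.1f}, lon={lon:.1f}")
```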

Relevance: 30.00%

Abstract:

Cognitive impairments are currently regarded as important determinants of functional domains and are promising treatment goals in schizophrenia. Nevertheless, the exact nature of the interdependent relationship between neurocognition and social cognition as well as the relative contribution of each of these factors to adequate functioning remains unclear. The purpose of this article is to systematically review the findings and methodology of studies that have investigated social cognition as a mediator variable between neurocognitive performance and functional outcome in schizophrenia. Moreover, we carried out a study to evaluate this mediation hypothesis by means of structural equation modeling in a large sample of 148 schizophrenia patients. The review comprised 15 studies. All but one study provided evidence for the mediating role of social cognition both in cross-sectional and in longitudinal designs. Other variables like motivation and social competence additionally mediated the relationship between social cognition and functional outcome. The mean effect size of the indirect effect was 0.20. However, social cognitive domains were differentially effective mediators. On average, 25% of the variance in functional outcome could be explained in the mediation model. The results of our own statistical analysis are in line with these conclusions: social cognition mediated a significant indirect relationship between neurocognition and functional outcome. These results suggest that research should focus on differential mediation pathways. Future studies should also consider the interaction with other prognostic factors, additional mediators, and moderators in order to increase the predictive power and to target those factors relevant for optimizing therapy effects.
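
A minimal sketch of the product-of-coefficients logic behind such a mediation analysis (neurocognition to social cognition to functioning), with a percentile bootstrap of the indirect effect. The data are simulated and the variable names are hypothetical; the reviewed studies use full structural equation models rather than this two-regression shortcut.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 148

# Simulated standardized scores: neurocognition (X), social cognition (M), functioning (Y).
neurocog = rng.normal(size=n)
social_cog = 0.5 * neurocog + rng.normal(scale=0.8, size=n)
functioning = 0.4 * social_cog + 0.1 * neurocog + rng.normal(scale=0.8, size=n)

def indirect_effect(x, m, y):
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]                          # X -> M
    b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit().params[1]    # M -> Y given X
    return a * b

point = indirect_effect(neurocog, social_cog, functioning)

# Percentile bootstrap for the indirect effect a*b.
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(neurocog[idx], social_cog[idx], functioning[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect a*b = {point:.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```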

Relevance: 30.00%

Abstract:

Genomic alterations have been linked to the development and progression of cancer. The technique of Comparative Genomic Hybridization (CGH) yields data consisting of fluorescence intensity ratios of test and reference DNA samples. The intensity ratios provide information about the number of copies in DNA. Practical issues such as the contamination of tumor cells in tissue specimens and normalization errors necessitate the use of statistics for learning about the genomic alterations from array-CGH data. As increasing amounts of array CGH data become available, there is a growing need for automated algorithms for characterizing genomic profiles. Specifically, there is a need for algorithms that can identify gains and losses in the number of copies based on statistical considerations, rather than merely detect trends in the data. We adopt a Bayesian approach, relying on the hidden Markov model to account for the inherent dependence in the intensity ratios. Posterior inferences are made about gains and losses in copy number. Localized amplifications (associated with oncogene mutations) and deletions (associated with mutations of tumor suppressors) are identified using posterior probabilities. Global trends such as extended regions of altered copy number are detected. Since the posterior distribution is analytically intractable, we implement a Metropolis-within-Gibbs algorithm for efficient simulation-based inference. Publicly available data on pancreatic adenocarcinoma, glioblastoma multiforme and breast cancer are analyzed, and comparisons are made with some widely-used algorithms to illustrate the reliability and success of the technique.
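
A minimal sketch of the underlying idea of modelling copy-number states with a hidden Markov model on log intensity ratios, here using a maximum-likelihood Gaussian HMM from hmmlearn rather than the Bayesian Metropolis-within-Gibbs sampler of the paper. The simulated ratios and the three-state labelling are illustrative assumptions.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(1)

# Simulated log2 intensity ratios along one chromosome: neutral, a gained region, a loss.
ratios = np.concatenate([
    rng.normal(0.0, 0.15, 120),   # copy-neutral
    rng.normal(0.6, 0.15, 40),    # gain
    rng.normal(0.0, 0.15, 80),    # copy-neutral
    rng.normal(-0.7, 0.15, 30),   # loss
]).reshape(-1, 1)

# Three hidden states; the Markov chain captures the spatial dependence between
# neighbouring clones that simple thresholding of the ratios would ignore.
model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=200, random_state=0)
model.fit(ratios)
states = model.predict(ratios)

# Label states by their estimated means so the decoding is interpretable.
order = np.argsort(model.means_.ravel())
names = {order[0]: "loss", order[1]: "neutral", order[2]: "gain"}
changes = np.where(np.diff(states) != 0)[0] + 1
print("segment boundaries at probes:", changes)
print("segment labels:", [names[s] for s in states[np.r_[0, changes]]])
```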

Relevance: 30.00%

Abstract:

The last two decades have seen intense scientific and regulatory interest in the health effects of particulate matter (PM). Influential epidemiological studies that characterize chronic exposure of individuals rely on monitoring data that are sparse in space and time, so they often assign the same exposure to participants in large geographic areas and across time. We estimate monthly PM during 1988-2002 in a large spatial domain for use in studying health effects in the Nurses' Health Study. We develop a conceptually simple spatio-temporal model that uses a rich set of covariates. The model is used to estimate concentrations of PM10 for the full time period and PM2.5 for a subset of the period. For the earlier part of the period, 1988-1998, few PM2.5 monitors were operating, so we develop a simple extension to the model that represents PM2.5 conditionally on PM10 model predictions. In the epidemiological analysis, model predictions of PM10 are more strongly associated with health effects than when using simpler approaches to estimate exposure. Our modeling approach supports the application in estimating both fine-scale and large-scale spatial heterogeneity and capturing space-time interaction through the use of monthly-varying spatial surfaces. At the same time, the model is computationally feasible, implementable with standard software, and readily understandable to the scientific audience. Despite simplifying assumptions, the model has good predictive performance and uncertainty characterization.
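
A minimal sketch of the conditional step described for the earlier period: regress sparse PM2.5 observations on co-located PM10 model predictions plus a covariate, then predict PM2.5 wherever PM10 predictions are available. The data and the single covariate are simulated stand-ins, not the Nurses' Health Study exposure model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)

# PM10 model predictions at many locations, with sparse co-located PM2.5 monitors.
n_all, n_monitored = 1000, 60
pm10_pred = rng.gamma(4.0, 7.0, size=n_all)            # ug/m3
urban = rng.uniform(0.0, 1.0, size=n_all)               # illustrative covariate

idx = rng.choice(n_all, n_monitored, replace=False)
pm25_obs = 0.45 * pm10_pred[idx] + 3.0 * urban[idx] + rng.normal(0.0, 2.0, n_monitored)

# Fit PM2.5 conditionally on PM10 predictions and the covariate at monitor locations...
X_fit = sm.add_constant(np.column_stack([pm10_pred[idx], urban[idx]]))
fit = sm.OLS(pm25_obs, X_fit).fit()

# ...then predict PM2.5 everywhere PM10 predictions exist.
X_all = sm.add_constant(np.column_stack([pm10_pred, urban]))
pm25_pred = fit.predict(X_all)
print("coefficients:", fit.params, f"mean predicted PM2.5: {pm25_pred.mean():.2f}")
```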

Relevance: 30.00%

Abstract:

We propose a novel class of models for functional data exhibiting skewness or other shape characteristics that vary with spatial or temporal location. We use copulas so that the marginal distributions and the dependence structure can be modeled independently. Dependence is modeled with a Gaussian or t-copula, so that there is an underlying latent Gaussian process. We model the marginal distributions using the skew t family. The mean, variance, and shape parameters are modeled nonparametrically as functions of location. A computationally tractable inferential framework for estimating heterogeneous asymmetric or heavy-tailed marginal distributions is introduced. This framework provides a new set of tools for increasingly complex data collected in medical and public health studies. Our methods were motivated by and are illustrated with a state-of-the-art study of neuronal tracts in multiple sclerosis patients and healthy controls. Using the tools we have developed, we were able to find those locations along the tract most affected by the disease. However, our methods are general and highly relevant to many functional data sets. In addition to the application to one-dimensional tract profiles illustrated here, higher-dimensional extensions of the methodology could have direct applications to other biological data including functional and structural MRI.
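
A minimal sketch of the copula separation the abstract describes: transform each location's observations to normal scores, so the marginal distributions are handled separately, and then estimate the latent Gaussian correlation between two locations. The bivariate data are simulated with skewed marginals as a stand-in for tract profiles, and the rank-based transform replaces the parametric skew t marginals of the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 400

# Simulated measurements at two tract locations: skewed marginals, shared latent dependence.
latent = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.7], [0.7, 1.0]], size=n)
x = stats.skewnorm(4.0).ppf(stats.norm.cdf(latent[:, 0]))    # skewed marginal at location 1
y = stats.skewnorm(-3.0).ppf(stats.norm.cdf(latent[:, 1]))   # skewed marginal at location 2

def normal_scores(v):
    """Rank-based transform to standard normal scores (empirical marginal removed)."""
    ranks = stats.rankdata(v)
    return stats.norm.ppf(ranks / (len(v) + 1.0))

# Latent Gaussian-copula correlation, estimated after stripping the marginals.
z1, z2 = normal_scores(x), normal_scores(y)
print("raw Pearson correlation:   ", round(float(np.corrcoef(x, y)[0, 1]), 3))
print("latent copula correlation: ", round(float(np.corrcoef(z1, z2)[0, 1]), 3))
```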