936 results for Parametric sensitivity analysis
Abstract:
Large areas of Amazonian evergreen forest experience seasonal droughts extending for three or more months, yet show maximum rates of photosynthesis and evapotranspiration during dry intervals. This apparent resilience is belied by disproportionate mortality of the large trees in manipulations that reduce wet season rainfall, occurring after 2-3 years of treatment. The goal of this study is to characterize the mechanisms that produce these contrasting ecosystem responses. A mechanistic model is developed based on the ecohydrological framework of TIN (Triangulated Irregular Network)-based Real Time Integrated Basin Simulator + Vegetation Generator for Interactive Evolution (tRIBS+VEGGIE). The model is used to test the roles of deep roots and soil capillary flux to provide water to the forest during the dry season. Also examined is the importance of "root niche separation," in which roots of overstory trees extend to depth, where during the dry season they use water stored from wet season precipitation, while roots of understory trees are concentrated in shallow layers that access dry season precipitation directly. Observational data from the Tapajós National Forest, Brazil, were used as meteorological forcing and provided comprehensive observational constraints on the model. Results strongly suggest that deep roots with root niche separation adaptations explain both the observed resilience during seasonal drought and the vulnerability of canopy-dominant trees to extended deficits of wet season rainfall. These mechanisms appear to provide an adaptive strategy that enhances productivity of the largest trees in the face of their disproportionate heat loads and water demand in the dry season. A sensitivity analysis exploring how wet season rainfall affects the stability of the rainforest system is presented. Citation: Ivanov, V. Y., L. R. Hutyra, S. C. Wofsy, J. W. Munger, S. R. Saleska, R. C. de Oliveira Jr., and P. B. 
de Camargo (2012), Root niche separation can explain avoidance of seasonal drought stress and vulnerability of overstory trees to extended drought in a mature Amazonian forest, Water Resour. Res., 48, W12507, doi:10.1029/2012WR011972.
Abstract:
The objective of this work was to parameterize and evaluate the DSSAT/Canegro model for five Brazilian sugarcane varieties. Parameterization was performed using biometric and growth data for the varieties CTC 4, CTC 7, CTC 20, RB 86-7515 and RB 83-5486, obtained at five Brazilian sites. A local sensitivity analysis was carried out for the main parameters. Model parameterization was done by means of the generalized likelihood uncertainty estimation (GLUE) technique. Predictions were evaluated with the following statistical indicators: the coefficient of determination (R2), Willmott's D index and the root mean square error (RMSE). The CTC varieties showed D indices between 0.870 and 0.944 for leaf area index, stalk height, tillering and sucrose content. The variety RB 83-5486 showed similar results for sucrose content and stalk fresh mass, whereas the variety RB 86-7515 showed values between 0.665 and 0.873 for the evaluated variables.
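The statistical indicators used for the evaluation can be computed directly; below is a minimal Python sketch of Willmott's D index and the RMSE, applied to made-up observed/simulated series (not the study's data):

```python
import math

def rmse(obs, sim):
    # Root-mean-square error between observed and simulated values.
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def willmott_d(obs, sim):
    # Willmott's index of agreement D (1 = perfect agreement).
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((abs(s - mean_obs) + abs(o - mean_obs)) ** 2
              for o, s in zip(obs, sim))
    return 1.0 - num / den

obs = [2.1, 3.4, 5.0, 6.2]   # illustrative observed values
sim = [2.0, 3.6, 4.8, 6.5]   # illustrative simulated values
print(round(rmse(obs, sim), 3), round(willmott_d(obs, sim), 3))
```

A D index near 1 and an RMSE near 0 indicate close agreement, which is how the ranges reported above (e.g. 0.870-0.944) should be read.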
Abstract:
In this paper we extend semiparametric mixed linear models with normal errors to elliptical errors, in order to permit distributions with heavier and lighter tails than the normal one. Penalized likelihood equations are applied to derive the maximum penalized likelihood estimates (MPLEs), which appear to be robust against outlying observations in the sense of the Mahalanobis distance. A reweighted iterative process based on the back-fitting method is proposed for parameter estimation, and the local influence curvatures are derived under some usual perturbation schemes to study the sensitivity of the MPLEs. Two motivating examples, preliminarily analyzed under normal errors, are reanalyzed considering appropriate elliptical errors. The local influence approach is used to compare the sensitivity of the model estimates.
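The robustness criterion invoked here, outlyingness in the sense of the Mahalanobis distance, can be illustrated with a short sketch; the data are synthetic with one planted outlier, and this is only the distance computation, not the paper's penalized estimation:

```python
import numpy as np

def mahalanobis_sq(X):
    # Squared Mahalanobis distance of each row from the sample mean,
    # using the sample covariance of the data.
    mu = X.mean(axis=0)
    inv = np.linalg.inv(np.cov(X, rowvar=False))
    diff = X - mu
    return np.einsum('ij,jk,ik->i', diff, inv, diff)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
X[0] = [8.0, -8.0]          # planted outlier
d2 = mahalanobis_sq(X)
print(int(d2.argmax()))     # index of the most outlying observation
```

Observations with large Mahalanobis distance are exactly those that heavy-tailed elliptical error models downweight relative to the normal model.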
Abstract:
Background In areas with limited infrastructure for microscopy diagnosis, rapid diagnostic tests (RDT) have been demonstrated to be effective. Methods The cost-effectiveness of the OptiMAL® test and thick smear microscopy was estimated and compared. Data were collected in remote areas of 12 municipalities in the Brazilian Amazon. Data sources included the National Malaria Control Programme of the Ministry of Health, the National Healthcare System reimbursement table, hospitalization records, primary data collected from the municipalities, and the scientific literature. The perspective was that of the Brazilian public health system, the analytical horizon was from the start of fever until diagnostic results were provided to the patient, and the temporal reference was the year 2006. The results were expressed as costs per adequately diagnosed case in 2006 U.S. dollars. Sensitivity analysis was performed on key model parameters. Results In the base case scenario, considering 92% and 95% sensitivity of thick smear microscopy for Plasmodium falciparum and Plasmodium vivax, respectively, and 100% specificity for both species, thick smear microscopy is more costly and more effective, with an incremental cost estimated at US$549.9 per adequately diagnosed case. In the sensitivity analysis, when the sensitivity and specificity of microscopy for P. vivax were 0.90 and 0.98, respectively, and when its sensitivity for P. falciparum was 0.83, the RDT was more cost-effective than microscopy. Conclusion Microscopy is more cost-effective than OptiMAL® in these remote areas if high accuracy of microscopy is maintained in the field. Decisions regarding the use of rapid tests for malaria diagnosis in these areas depend on current microscopy accuracy in the field.
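The incremental cost reported above follows the standard incremental cost-effectiveness ratio (ICER): extra cost divided by extra effect. A minimal sketch with hypothetical numbers, not the study's figures:

```python
def icer(cost_a, effect_a, cost_b, effect_b):
    # Incremental cost-effectiveness ratio of strategy A versus B:
    # additional cost per additional adequately diagnosed case.
    return (cost_a - cost_b) / (effect_a - effect_b)

# Hypothetical per-1000-patient totals (illustrative only):
# microscopy costs $12,000 and correctly diagnoses 950 cases,
# the RDT costs $9,000 and correctly diagnoses 920 cases.
print(round(icer(12000.0, 950.0, 9000.0, 920.0), 2))  # 100.0 $/extra case
```

A one-way sensitivity analysis like the one described amounts to recomputing this ratio while sweeping one input (e.g. microscopy sensitivity) across its plausible range.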
Abstract:
The aim of this study was to present the contributions of the systematic review of economic evaluations to the development of a national study on childhood hepatitis A vaccination. A literature review was performed in EMBASE, MEDLINE, WOPEC, HealthSTAR, SciELO and LILACS from 1995 to 2010. Most of the studies (8 of 10) showed favorable cost-effectiveness results. Sensitivity analysis indicated that the most important parameters for the results were cost of the vaccine, hepatitis A incidence, and medical costs of the disease. Variability was observed in methodological characteristics and estimates of key variables among the 10 studies reviewed. It is not possible to generalize results or transfer epidemiological estimates of resource utilization and costs associated with hepatitis A to the local context. Systematic review of economic evaluation studies of hepatitis A vaccine demonstrated the need for a national analysis and provided input for the development of a new decision-making model for Brazil.
Abstract:
This paper describes a design methodology for piezoelectric energy harvesters that thinly encapsulate the mechanical devices and exploit resonances from higher-order vibrational modes. The direction of polarization determines the sign of the piezoelectric tensor to avoid cancellations of electric fields from opposite polarizations in the same circuit. The resultant modified equations of state are solved by the finite element method (FEM). Combining this method with the solid isotropic material with penalization (SIMP) method for piezoelectric material, we have developed an optimization methodology that optimizes the piezoelectric material layout and polarization direction. Updating of the density function of the SIMP method is performed based on sensitivity analysis, sequential linear programming in the early stage of the optimization, and the phase field method in the latter stage.
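The two design parameterizations named above, SIMP density penalization and a sign-switching polarization variable, can be sketched in a few lines. The penalty power p = 3 and the theta-to-sign mapping are common illustrative choices, not the paper's exact formulation:

```python
def simp_stiffness(rho, e0, p=3.0):
    # SIMP interpolation: intermediate densities are penalized so the
    # optimizer is driven toward 0/1 (void/solid) material layouts.
    return (rho ** p) * e0

def polarized_piezo_coeff(rho, theta, d0, p=3.0):
    # Hypothetical polarization parameterization: theta in [0, 1] maps to
    # a sign in [-1, +1], letting the optimizer flip the sign of the
    # piezoelectric tensor and avoid electric-field cancellations.
    sign = 2.0 * theta - 1.0
    return sign * (rho ** p) * d0

print(simp_stiffness(0.5, 1.0))              # 0.125: rho = 0.5 is heavily penalized
print(polarized_piezo_coeff(1.0, 0.0, 1.0))  # -1.0: solid material, reversed polarization
```

In a full SIMP loop, sensitivities of the objective with respect to rho and theta drive updates of both fields, as the abstract describes.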
Abstract:
We need a large amount of energy to make our homes pleasantly warm in winter and cool in summer. If we also consider the energy losses that occur through roofs, perimeter walls and windows, it would be more appropriate to speak of waste than of consumption. A solution is to build passive houses, i.e. buildings that are more efficient and environmentally friendly, able to ensure a drastic reduction of electricity and heating bills. Recently, growing public awareness of global warming and environmental pollution has finally opened wide possibilities in the field of sustainable construction, encouraging new renewable methods for heating and cooling spaces. Shallow geothermal energy makes it possible to exploit the renewable heat reservoir present in the soil at depths between 15 and 20 m for the air-conditioning of buildings, using a ground source heat pump. This thesis focuses on the design of an air-conditioning system with a geothermal heat pump coupled to energy piles, i.e. piles with internal heat exchangers, for a typical Italian family building, on the basis of a geological-technical report on a plot in Bologna's plain provided by Geo-Net s.r.l. The study involved a preliminary static sizing of the piles to calculate their length and number; the project was then completed with the energy sizing, verifying whether the building's energy needs were met by the static solution obtained. Finally, attention was focused on the technical and economic viability compared with a traditional system (cost-benefit analysis) and on the uncertainty of the design data and its effects on the operating and initial costs of the system (sensitivity analysis). The PILESIM2 software, designed by Dr. Pahud of the SUPSI school, was also used to evaluate the performance of the thermal system and the potential use of the piles.
Abstract:
This work is a detailed study of hydrodynamic processes in a defined area, the littoral in front of the Venice Lagoon and its inlets, which are complex morphological areas of interconnection. A finite element hydrodynamic model of the Venice Lagoon and the Adriatic Sea has been developed in order to study the coastal current patterns and the exchanges at the inlets of the Venice Lagoon. This is the first work in this area that attempts to model the interaction dynamics by running a model for the lagoon and the Adriatic Sea together. First, the barotropic processes near the inlets of the Venice Lagoon have been studied. Data from more than ten tide gauges deployed across the Adriatic Sea have been used in the calibration of the simulated water levels. To validate the model results, empirical flux data measured by ADCP probes installed inside the inlets of Lido and Malamocco have been used, and the exchanges through the three inlets of the Venice Lagoon have been analyzed. The comparison between modelled and measured fluxes at the inlets demonstrated the ability of the model to reproduce both tide- and wind-induced water exchanges between the sea and the lagoon. As a second step, small-scale processes around the inlets that connect the Venice Lagoon with the Northern Adriatic Sea have also been investigated by means of 3D simulations. Maps of vorticity have been produced, considering the influence of tidal flows and wind stress in the area. A sensitivity analysis has been carried out to define the importance of advection and of the baroclinic pressure gradients in the development of the vortical processes observed along the littoral close to the inlets. Finally, a comparison with real measurements, surface velocity data from HF Radar near the Venice inlets, has been performed, which allows for a better understanding of the processes and their seasonal dynamics. The results outline the predominance of wind and tidal forcing in the coastal area. 
Wind forcing acts mainly on the mean coastal current, inducing its detachment offshore during Sirocco events and an increase of littoral currents during Bora events. The Bora action is more homogeneous over the whole coastal area, whereas the Sirocco strengthens its impact in the south, near the Chioggia inlet. Tidal forcing at the inlets is mainly barotropic. The sensitivity analysis shows that advection is the main physical process responsible for the persistent vortical structures present along the littoral between the Venice Lagoon inlets. The comparison with measurements from HF Radar not only permitted a validation of the model results, but also a description of different patterns in specific periods of the year. The success of the 2D and 3D simulations in reproducing the SSE inside and outside the Venice Lagoon, the tidal flow through the lagoon inlets, and the small-scale phenomena occurring along the littoral indicates that the finite element approach is the most suitable tool for the investigation of these coastal processes. For the first time, as shown by the flux modelling, the physical processes that drive the interaction between the two basins were reproduced.
Abstract:
The field of research of this dissertation is the bioengineering of exercise, in particular the relationship between biomechanical and metabolic knowledge. This relationship makes it possible to evaluate exercise in many different circumstances: optimizing athlete performance, understanding and assisting compensation in prosthetic patients, and prescribing exercise with high caloric consumption and minimal joint loading to obese subjects. Furthermore, it has technical applications in fitness and rehabilitation machine design, predicting energy consumption and joint loads for the subjects who will use the machine. The aim of this dissertation was to further understand how mechanical work and metabolic energy cost are related during movement, using interpretative models. Musculoskeletal models, when they include a description of muscle energy expenditure, can be useful to address this issue, allowing human movement to be evaluated in terms of both mechanical and metabolic energy expenditure. A whole-body musculoskeletal model that could describe both the biomechanical and metabolic aspects of movement was identified in the literature and then applied and validated using an EMG-driven approach. The advantage of the EMG-driven approach is that it avoids arbitrarily defined optimization functions to solve the indeterminate problem of muscle activations. A sensitivity analysis was conducted to determine how much changes in model parameters could affect model outputs: the results showed that changing parameters within physiological ranges did not greatly influence model outputs. To evaluate its predictive capacity, the musculoskeletal model was applied to experimental data: first in a simple exercise (unilateral leg press) and then in a more complete exercise (elliptical exercise). 
In these studies, the energy consumption predicted by the model proved to be close to the energy consumption estimated by indirect calorimetry for different intensity levels at low movement frequencies. The use of musculoskeletal models for predicting energy consumption proved promising, and the EMG-driven approach made it possible to avoid introducing optimization functions. Even though many aspects of this approach still have to be investigated and these results are preliminary, the conclusions of this dissertation suggest that musculoskeletal modelling can be a useful tool for addressing questions about the efficiency of movement in healthy and pathologic subjects.
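The sensitivity check described, perturbing each parameter within its physiological range and observing the model output, can be sketched as a one-at-a-time (OAT) analysis. The toy model and ranges below are purely illustrative, not the dissertation's musculoskeletal model:

```python
def oat_sensitivity(model, nominal, spans):
    # One-at-a-time sensitivity: vary each parameter across its range
    # while holding the others at nominal, and record the largest
    # relative change in the model output.
    base = model(nominal)
    out = {}
    for name, (lo, hi) in spans.items():
        effects = [abs(model(dict(nominal, **{name: v})) - base) / abs(base)
                   for v in (lo, hi)]
        out[name] = max(effects)
    return out

# Toy model standing in for "energy cost as a function of parameters":
model = lambda p: p["a"] * 10.0 + p["b"] ** 2
nominal = {"a": 1.0, "b": 2.0}
spans = {"a": (0.9, 1.1), "b": (1.8, 2.2)}   # +/-10% "physiological" ranges
print(oat_sensitivity(model, nominal, spans))
```

Small values across all parameters correspond to the finding that physiological parameter variations did not greatly influence the outputs.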
Abstract:
The Cydia pomonella granulovirus (CpGV, Fam. Baculoviridae) is a highly virulent and highly specific pathogen of the codling moth (Cydia pomonella) that has been registered as an insecticide in Germany and other EU countries for several years. Like other baculoviruses, it infects the larvae of the insect and, owing to its selectivity, is harmless to non-target organisms. In the past, research on CpGV concentrated on aspects relevant to its application in plant protection, and almost 20 years after its first registration it is still unknown whether and how CpGV can establish itself in the environment. In the present work, various parameters that describe the population dynamics of CpGV were analysed and quantified. Besides biological properties such as virulence, DNA characterization and quantification of virus progeny, horizontal and vertical transmission, inactivation, and infection of late larval instars were investigated in particular. Finally, the parameters obtained were integrated, together with data from the literature, into a mathematical model. To quantify the probability of horizontal transmission, a model system with loose apples was established, in which different scenarios of possible horizontal transmission were tested under defined laboratory conditions. In test series in which a virus deposit corresponding to the amount of virus produced by a first-instar larva was applied to an apple, only a very low mortality of 3-6% was observed among the codling moth larvae placed on it. However, when a larval cadaver killed by virus infection was used as inoculum, the mortality rate of the larvae placed on the apple exceeded 40%. This observed high horizontal transmission rate could be explained by the behaviour of the larvae. 
The larvae showed a clear boring preference for the stalk attachment and the calyx, which greatly increased the probability of an encounter between a larva killed by the infection and a healthy larva. In a similarly designed field experiment, horizontal transmission could not be demonstrated: because of high natural mortality and the resulting low larval density, the difference from the control group was too small. In parallel, a half-life of 52 sunshine hours was determined for CpGV. Furthermore, the mortality of later larval instars, which had 14 days to bore into the apples before a CpGV application was carried out, was found to be as high as that of larvae infected on the apple surface in the L1 stage. Owing to the greater age of those larvae, however, the feeding damage to infested apples was considerably larger and comparable to that of an untreated control. The vertical transmission experiment showed that, although the codling moth colony used was not free of CpGV, the mortality rate of the offspring of sublethally infected females (44%) was clearly higher than that of the offspring of sublethally infected males (28%) and of the untreated control (27%). PCR analyses likewise detected a larger share of CpGV carriers among the offspring of sublethally infected females (67%) than among the offspring of sublethally infected males (49%) and the control (42%). The results indicate that an infection can be transmitted vertically to the next generation by sublethally infected females. This suggests that an additional effect of CpGV through vertical transmission can occur in the following codling moth generation, and points to a potential mechanism for permanent establishment of the virus. 
Finally, all parameters describing the CpGV-codling moth relationship were integrated into a mathematical model, GRANULO. After a sensitivity analysis, GRANULO was partially verified with data from the field experiments. By modifying the virus parameters in the model, the influence of altered biological properties of the viruses (UV stability and transmission rates) could then be explored theoretically in simulations. The model described, which still requires further verification and validation, is a first approximation to the quantitative description and modelling of the population dynamics of the CpGV-codling moth system. The data collected on the population dynamics of the codling moth can make a valuable contribution to optimizing codling moth control strategies based on CpGV, and they also shed light on the establishment potential of this bioinsecticide.
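The reported half-life of 52 sunshine hours implies a simple exponential inactivation law; a minimal sketch (the exponential form is the standard assumption behind a half-life, not something additional the study states):

```python
def virus_fraction_remaining(sun_hours, half_life=52.0):
    # Exponential UV inactivation: active virus halves every
    # `half_life` sunshine hours (52 h reported for CpGV here).
    return 0.5 ** (sun_hours / half_life)

print(round(virus_fraction_remaining(52.0), 3))   # 0.5 after one half-life
print(round(virus_fraction_remaining(104.0), 3))  # 0.25 after two half-lives
```

In a population model such as GRANULO, this decay term is what a UV-stability parameter would modify in the simulations described.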
Abstract:
This thesis is framed within the stochastic approach to flow and solute transport in natural porous media. The methodology used to characterize the uncertainty associated with model predictions is completely general and can be reproduced in various contexts. The research has the following main objectives: (a) the development of a Global Sensitivity Analysis of contaminant transport models in the subsoil, to investigate the effects of the uncertainty of the most important parameters; (b) the application of advanced techniques, such as Polynomial Chaos Expansion (PCE), to obtain surrogate models from those traditionally analysed within Monte Carlo simulations, which carry an often non-negligible computational burden; (c) the analysis and understanding of the key processes underlying solute transport in natural porous media using the aforementioned techniques. Within this picture, the thesis extends the PCE technique, already developed and applied by the thesis supervisors as numerical code to a Continuous Injection contaminant transport model, to a Slug Injection model. The methodology was applied to this model with original contributions deriving from surrogate models with various degrees of approximation, and a Global Sensitivity Analysis aimed at the determination of Sobol' indices was developed.
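When the PCE basis is orthonormal, Sobol' indices follow directly from the expansion coefficients, which is what makes PCE surrogates attractive for Global Sensitivity Analysis. A minimal sketch with a toy two-variable expansion (illustrative coefficients, not the thesis's transport model):

```python
def sobol_from_pce(coeffs):
    # First-order Sobol' indices from a PCE with an orthonormal basis.
    # Keys are multi-indices (alpha_1, ..., alpha_d); the total variance is
    # the sum of squared coefficients over non-constant terms, and S_i
    # collects the terms involving variable i alone.
    var = sum(c * c for a, c in coeffs.items() if any(a))
    d = len(next(iter(coeffs)))
    s = []
    for i in range(d):
        num = sum(c * c for a, c in coeffs.items()
                  if a[i] > 0 and all(a[j] == 0 for j in range(d) if j != i))
        s.append(num / var)
    return s

# Toy 2-variable expansion with one interaction term:
coeffs = {(0, 0): 1.0, (1, 0): 2.0, (0, 1): 1.0, (1, 1): 0.5}
print(sobol_from_pce(coeffs))  # S1 = 4/5.25, S2 = 1/5.25
```

The gap between the sum of first-order indices and 1 measures the interaction contribution (here the (1, 1) term).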
Abstract:
Synthetic Biology is a relatively new discipline, born at the beginning of the new millennium, that brings the typical engineering approach (abstraction, modularity and standardization) to biotechnology. These principles aim to tame the extreme complexity of the various components and to aid the construction of artificial biological systems with specific functions, usually by means of synthetic genetic circuits implemented in bacteria or in simple eukaryotes like yeast. The cell becomes a programmable machine whose low-level programming language is made of strings of DNA. This work was performed in collaboration with researchers of the Department of Electrical Engineering of the University of Washington in Seattle and with a student of the Corso di Laurea Magistrale in Ingegneria Biomedica at the University of Bologna, Marilisa Cortesi. During the collaboration I contributed to a Synthetic Biology project already started in the Klavins Laboratory. In particular, I modeled and subsequently simulated a synthetic genetic circuit designed to implement a multicelled behavior in a growing bacterial microcolony. The first chapter introduces the foundations of molecular biology: the structure of nucleic acids, transcription, translation and methods to regulate gene expression. An introduction to Synthetic Biology completes the section. The second chapter describes the synthetic genetic circuit conceived to make two different groups of cells, termed leaders and followers, emerge spontaneously from an isogenic microcolony of bacteria. The circuit exploits the intrinsic stochasticity of gene expression and intercellular communication via small molecules to break the symmetry in the phenotype of the microcolony. The four modules of the circuit (coin flipper, sender, receiver and follower) and their interactions are then illustrated. 
The third chapter derives the mathematical representation of the various components of the circuit and makes the simplifying assumptions explicit. Transcription and translation are modeled as a single step, and gene expression is a function of the intracellular concentration of the various transcription factors that act on the different promoters of the circuit. A list of the various parameters and a justification for their values closes the chapter. The fourth chapter describes the main characteristics of the gro simulation environment, developed by the Self Organizing Systems Laboratory of the University of Washington. Then, a sensitivity analysis performed to pinpoint the desirable characteristics of the various genetic components is detailed. The sensitivity analysis makes use of a cost function based on the wanted outcome and on the fraction of cells in each of the possible states at the end of the simulation. Thanks to a particular kind of scatter plot, the parameters are ranked. Starting from an initial condition in which all the parameters assume their nominal value, the ranking suggests which parameter to tune in order to reach the goal. Obtaining a microcolony in which almost all the cells are in the follower state and only a few in the leader state seems to be the most difficult task: a small number of leader cells struggle to produce enough signal to turn the rest of the microcolony into the follower state. It is possible to obtain a microcolony in which the majority of cells are followers by increasing the production of signal as much as possible. Reaching the goal of a microcolony split in half between leaders and followers is comparatively easy; the best strategy seems to be increasing the production of the enzyme slightly. To end up with a majority of leaders, instead, it is advisable to increase the basal expression of the coin flipper module. 
At the end of the chapter, a possible future application of the leader election circuit, the spontaneous formation of spatial patterns in a microcolony, is modeled with the finite state machine formalism. The gro simulations provide insights into the genetic components needed to implement the behavior. In particular, since both examples of pattern formation rely on a local version of leader election, a short-range communication system is essential. Moreover, new synthetic components that allow reliable downregulation of the growth rate in specific cells without side effects need to be developed. The appendix lists the gro code used to simulate the model of the circuit, a script in the Python programming language used to split the simulations across a Linux cluster, and the Matlab code developed to analyze the data.
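A cost function of the kind described, comparing end-of-simulation state fractions with the wanted outcome, might look like the following sketch; the squared-deviation form and the state names are assumptions for illustration, not the thesis's exact function:

```python
def state_cost(counts, target):
    # Hypothetical cost: sum of squared deviations between the observed
    # fraction of cells in each state and the desired fraction. Lower is
    # better; 0 means the colony matches the wanted outcome exactly.
    total = sum(counts.values())
    return sum((counts.get(s, 0) / total - frac) ** 2
               for s, frac in target.items())

# Target: a colony split in half between leaders and followers.
target = {"leader": 0.5, "follower": 0.5, "undecided": 0.0}
print(state_cost({"leader": 50, "follower": 50, "undecided": 0}, target))   # 0.0
print(state_cost({"leader": 10, "follower": 80, "undecided": 10}, target))  # > 0
```

Ranking parameters then amounts to asking which perturbation moves this cost the most, which is what the scatter-plot analysis in the chapter does.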
Abstract:
This work presents a comprehensive methodology for the reduction of analytical or numerical stochastic models characterized by uncertain input parameters or boundary conditions. The technique, based on Polynomial Chaos Expansion (PCE) theory, represents a versatile solution for direct or inverse problems related to the propagation of uncertainty. The potential of the methodology is assessed by investigating different application contexts related to groundwater flow and transport scenarios, such as global sensitivity analysis, risk analysis and model calibration. This is achieved by implementing a numerical code, developed in the MATLAB environment, presented here in its main features and tested with literature examples. The procedure has been conceived under flexibility and efficiency criteria in order to ensure its adaptability to different fields of engineering, and it has been applied to different case studies related to flow and transport in porous media. Each application is associated with innovative elements such as (i) new analytical formulations describing the motion and displacement of non-Newtonian fluids in porous media, (ii) the application of global sensitivity analysis to a high-complexity numerical model inspired by a real case of risk of radionuclide migration in the subsurface environment, and (iii) the development of a novel sensitivity-based strategy for parameter calibration and experiment design in laboratory-scale tracer transport.
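The core idea of PCE-based model reduction, representing a random response on an orthogonal polynomial basis so that statistical moments follow directly from the coefficients, can be shown on a one-variable toy case (not one of the thesis's applications):

```python
import math

def hermite(n, x):
    # Probabilists' Hermite polynomials He_n via the three-term recurrence
    # He_{k+1}(x) = x*He_k(x) - k*He_{k-1}(x).
    h0, h1 = 1.0, x
    if n == 0:
        return h0
    for k in range(1, n):
        h0, h1 = h1, x * h1 - k * h0
    return h1

# Exact PCE of y = x^2 with x ~ N(0,1):  y = 1*He_0 + 0*He_1 + 1*He_2,
# since He_2(x) = x^2 - 1.  By orthogonality, mean = c0 and
# variance = sum_{k>=1} c_k^2 * k!.
coeffs = [1.0, 0.0, 1.0]
mean = coeffs[0]
var = sum(c * c * math.factorial(k) for k, c in enumerate(coeffs) if k > 0)
print(mean, var)  # 1.0 2.0, matching E[x^2] = 1 and Var[x^2] = 2
```

The same coefficient-based shortcut is what makes PCE surrogates cheap substitutes for Monte Carlo in sensitivity and risk analyses.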
Abstract:
The aim of the present thesis was to investigate the influence of lower-limb joint models on musculoskeletal model predictions during gait. We started our analysis from a baseline model, i.e., the state-of-the-art lower-limb model (spherical joint at the hip and hinge joints at the knee and ankle) created from MRI of a healthy subject in the Medical Technology Laboratory of the Rizzoli Orthopaedic Institute. We varied the models of the knee and ankle joints, including: knee and ankle joints with mean instantaneous axes of rotation, universal joint at the ankle, scaled-generic-derived planar knee, subject-specific planar knee model, subject-specific planar ankle model, spherical knee, and spherical ankle. The joint model combinations, corresponding to 10 musculoskeletal models, were implemented in a typical inverse dynamics problem, including inverse kinematics, inverse dynamics, static optimization and joint reaction analysis algorithms, solved using the OpenSim software to calculate joint angles, joint moments, muscle forces and activations, and joint reaction forces during 5 walking trials. The predicted muscle activations were qualitatively compared to experimental EMG to evaluate the accuracy of model predictions. The planar joint at the knee, the universal joint at the ankle, and the spherical joints at the knee and at the ankle produced appreciable variations in model predictions during gait trials. The planar knee joint model reduced the discrepancy between the predicted activation of the Rectus Femoris and the EMG (with respect to the baseline model), and the reduced peak knee reaction force was considered more accurate. The use of the universal joint, with the introduction of the subtalar joint, worsened the muscle activation agreement with the EMG, and increased ankle and knee reaction forces were predicted. The spherical joints, in particular at the knee, worsened the muscle activation agreement with the EMG. 
A substantial increase of joint reaction forces at all joints was predicted, despite the good agreement of the joint kinematics with those of the baseline model. The introduction of the universal joint had a negative effect on the model predictions. The cause of this discrepancy is likely to be found in the definition of the subtalar joint and thus in the particular subject's anthropometry used to create the model and define the joint pose. We concluded that the implementation of complex joint models does not have marked effects on the joint reaction forces during gait. Computed results were similar in magnitude and in pattern to those reported in the literature. Nonetheless, the introduction of the planar joint model at the knee had a positive effect on the predictions, while the use of a spherical joint at the knee and/or at the ankle is inadvisable, because it predicted unrealistic joint reaction forces.
Abstract:
This thesis presents a new Artificial Neural Network (ANN) able to predict at once the main parameters representative of wave-structure interaction processes, i.e. the wave overtopping discharge, the wave transmission coefficient and the wave reflection coefficient. The new ANN has been specifically developed to provide managers and scientists with a tool that can be efficiently used for design purposes. The development of this ANN started with the preparation of a new, extended and homogeneous database collecting all the available tests that report at least one of the three parameters, for a total of 16,165 data points. The variety of structure types and wave attack conditions in the database includes smooth, rock and armour unit slopes, berm breakwaters, vertical walls, low-crested structures, and oblique wave attacks. Some of the existing ANNs were compared and improved, leading to the selection of a final ANN, whose architecture was optimized through an in-depth sensitivity analysis of its training parameters. Each of the 15 selected input parameters represents a physical aspect of the wave-structure interaction process, describing the wave attack (wave steepness and obliquity, breaking and shoaling factors), the structure geometry (submergence, straight or non-straight slope, with or without berm or toe, presence or absence of a crown wall), or the structure type (smooth or covered by an armour layer, with permeable or impermeable core). The advanced ANN proposed here provides accurate predictions for all three parameters and is shown to overcome the limits of the traditional formulae and of the approach adopted so far by some of the existing ANNs. The possibility of adopting just one model to obtain a handy and accurate evaluation of the overall performance of a coastal or harbor structure is the most important and exportable result of the work.
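The forward pass of such a network is straightforward; here is a minimal sketch with the 15-input, 3-output shape described in the abstract, a small tanh hidden layer, and illustrative (untrained) weights rather than the thesis's fitted ones:

```python
import math

def forward(x, weights, biases):
    # Forward pass of a small feedforward network: tanh hidden layers,
    # linear output layer. `weights` is a list of weight matrices (lists
    # of rows), `biases` the matching bias vectors.
    for i, (w, b) in enumerate(zip(weights, biases)):
        x = [sum(wi * xi for wi, xi in zip(row, x)) + bi
             for row, bi in zip(w, b)]
        if i < len(weights) - 1:
            x = [math.tanh(v) for v in x]
    return x

# 15 inputs -> 4 hidden units -> 3 outputs
# (overtopping discharge, transmission coefficient, reflection coefficient)
x = [0.1] * 15                      # illustrative normalized inputs
w1 = [[0.05] * 15 for _ in range(4)]
b1 = [0.0] * 4
w2 = [[0.25] * 4 for _ in range(3)]
b2 = [0.0] * 3
y = forward(x, [w1, w2], [b1, b2])
print(len(y))  # 3 predicted coefficients
```

Training then adjusts the weights against the 16,165-record database; the sensitivity analysis mentioned above concerns choices such as hidden-layer size and training settings.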