966 results for Benders decomposition


Relevance:

20.00%

Publisher:

Abstract:

Finite element techniques for solving the problem of fluid-structure interaction of an elastic solid material in a laminar incompressible viscous flow are described. The mathematical problem consists of the Navier-Stokes equations in the Arbitrary Lagrangian-Eulerian formulation coupled with a non-linear structure model, treating the problem as one continuum. The coupling between the structure and the fluid is enforced inside a monolithic framework which solves simultaneously for the fluid and structure unknowns within a single solver. We used the well-known Crouzeix-Raviart finite element pair for discretization in space and the method of lines for discretization in time. A stability result has been proved for the Backward-Euler time-stepping scheme applied to both the fluid and solid parts, combined with the finite element method for the space discretization. The resulting linear systems are solved by multilevel domain decomposition techniques. Our strategy is to solve several local subproblems over subdomain patches using the Schur-complement or GMRES smoother within a multigrid iterative solver. For validation and evaluation of the accuracy of the proposed methodology, we present results for a set of two FSI benchmark configurations which describe the self-induced elastic deformation of a beam attached to a cylinder in a laminar channel flow, allowing stationary as well as periodically oscillating deformations, and for a benchmark proposed by COMSOL Multiphysics in which a narrow vertical structure attached to the bottom wall of a channel bends under the force due to both viscous drag and pressure. Then, as an example of fluid-structure interaction in biomedical problems, we considered the academic numerical test of simulating the pressure wave propagation through a straight compliant vessel. All the tests show the applicability and the numerical efficiency of our approach for both two-dimensional and three-dimensional problems.
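The monolithic idea described above, one implicit solve per time step for all unknowns together, can be sketched on a toy two-unknown system. This is a hypothetical surrogate, not the paper's FEM discretization; the skew-symmetric matrix stands in for the coupled fluid-structure operator.

```python
# Toy illustration of a monolithic Backward-Euler step: the "fluid" and
# "structure" unknowns are advanced together by ONE linear solve per step.
# Hypothetical 2-unknown surrogate system, not the paper's FEM system.

def backward_euler(u, A, dt, steps):
    """Solve du/dt = A u with the implicit scheme (I - dt*A) u_new = u_old."""
    a, b = A[0]
    c, d = A[1]
    for _ in range(steps):
        # Assemble the monolithic matrix M = I - dt*A and solve M u_new = u
        m11, m12 = 1.0 - dt * a, -dt * b
        m21, m22 = -dt * c, 1.0 - dt * d
        det = m11 * m22 - m12 * m21
        u = ((m22 * u[0] - m12 * u[1]) / det,
             (-m21 * u[0] + m11 * u[1]) / det)
    return u

# Undamped oscillator (skew-symmetric A): Backward Euler is unconditionally
# stable and mildly dissipative, mirroring the stability result cited above.
A = ((0.0, 1.0), (-1.0, 0.0))
u_end = backward_euler((1.0, 0.0), A, dt=0.1, steps=100)
energy = u_end[0] ** 2 + u_end[1] ** 2   # monotonically non-increasing
```

The single coupled solve is the point: a partitioned scheme would alternate between the two unknowns instead, which can lose this unconditional stability.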


Decomposition-based approaches are recalled from both the primal and the dual point of view. The possibility of building partially disaggregated reduced master problems is investigated. This extends the aggregated-versus-disaggregated dichotomy to a gradual choice among alternative levels of aggregation. Partial aggregation is applied to the linear multicommodity minimum-cost flow problem. The possibility of having only partially aggregated bundles opens a wide range of alternatives with different trade-offs between the number of iterations and the computation required to solve the problem. This trade-off is explored on several sets of instances and the results are compared with those obtained by directly solving the natural node-arc formulation. An iterative solution process for the route assignment problem is then proposed, based on the well-known Frank-Wolfe algorithm. To provide a first feasible solution to the Frank-Wolfe algorithm, a linear multicommodity minimum-cost flow problem is solved to optimality using the decomposition techniques mentioned above. Solutions of this problem are useful for network orientation and design, especially in relation to public transportation systems such as Personal Rapid Transit. A single-commodity robust network design problem is also addressed: an undirected graph with edge costs is given together with a discrete set of balance matrices representing different supply/demand scenarios, and the goal is to determine the minimum-cost installation of capacities on the edges such that the flow exchange is feasible for every scenario. A set of new instances that are computationally hard for the natural flow formulation is solved by means of a new heuristic algorithm. Finally, an efficient decomposition-based heuristic approach for a large-scale stochastic unit commitment problem is presented.
The addressed real-world stochastic problem employs at its core a deterministic unit commitment planning model developed by the California Independent System Operator (ISO).
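The Frank-Wolfe loop mentioned above, repeated all-or-nothing assignment plus line search, can be sketched on a hypothetical two-link network. The link cost functions and the demand are illustrative, not taken from the thesis.

```python
# Minimal Frank-Wolfe sketch for equilibrium flow assignment on two parallel
# links with linear travel costs c1(x) = 1 + x and c2(x) = 2 + x (made up),
# serving a total demand of 3 units.

def frank_wolfe(demand=3.0, iters=100):
    x1, x2 = demand, 0.0                       # initial all-or-nothing flows
    for _ in range(iters):
        c1, c2 = 1.0 + x1, 2.0 + x2            # current link costs (= gradient)
        # all-or-nothing subproblem: send everything on the cheapest link
        y = (demand, 0.0) if c1 <= c2 else (0.0, demand)
        d1, d2 = y[0] - x1, y[1] - x2          # search direction
        denom = d1 * d1 + d2 * d2
        if denom == 0.0:
            break
        # exact line search (objective is quadratic with unit Hessian)
        gamma = max(0.0, min(1.0, -(c1 * d1 + c2 * d2) / denom))
        x1, x2 = x1 + gamma * d1, x2 + gamma * d2
    return x1, x2

x1, x2 = frank_wolfe()
# At equilibrium both used links have equal cost: 1 + x1 = 2 + x2
```

Here the first feasible point is trivial; in the thesis it is produced by solving a multicommodity minimum-cost flow problem with the decomposition machinery.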


Holding the major share of stellar mass in galaxies, and being old and passively evolving, early-type galaxies (ETGs) are the primary probes for investigating these various evolution scenarios, as well as useful means to provide insights on cosmological parameters. In this thesis work I focused specifically on ETGs and on their capability to constrain galaxy formation and evolution; in particular, the principal aims were to derive some of the ETG evolutionary parameters, such as age, metallicity and star formation history (SFH), and to study their age-redshift and mass-age relations. To infer the galaxy physical parameters, I used the public code STARLIGHT: this program provides a best fit to the observed spectrum from a combination of many theoretical models defined in user-made libraries. The comparison between the output and input light-weighted ages shows good agreement starting from SNRs of ∼10, with a bias of ∼2.2% and a dispersion of ∼3%. Metallicities and SFHs are also well reproduced. In the second part of the thesis I performed an analysis on real data, starting from Sloan Digital Sky Survey (SDSS) spectra. I found that galaxies get older with cosmic time and with increasing mass (at fixed redshift); absolute light-weighted ages, moreover, are independent of the fitting parameters and of the synthetic models used. Metallicities are very similar to each other and clearly consistent with those derived from the Lick indices. The predicted SFH indicates the presence of a double burst of star formation. Velocity dispersions and extinctions are also well constrained, following the expected behaviour. As a further step, I also fitted single SDSS spectra (with SNR ∼ 20) to verify that stacked spectra give the same results without introducing any bias: this is an important check if one wants to apply the method at higher z, where stacked spectra are necessary to increase the SNR.
Our upcoming aim is to apply this approach also to galaxy spectra obtained from higher-redshift surveys, such as BOSS (z ∼ 0.5), zCOSMOS (z ∼ 1), K20 (z ∼ 1), GMASS (z ∼ 1.5) and, eventually, Euclid (z ∼ 2). Indeed, I am currently carrying out a preliminary study to establish the applicability of the method to lower-resolution, as well as higher-redshift (z ∼ 2), spectra, such as the Euclid ones.
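The fitting idea above, an observed spectrum modeled as a weighted sum of template model spectra and a light-weighted age read off from the weights, can be sketched with two hypothetical templates. STARLIGHT itself does a far richer fit (hundreds of templates, extinction, kinematics); all fluxes and ages below are made up for illustration.

```python
# Sketch of spectral fitting as a weighted sum of model templates, in the
# spirit of (but far simpler than) STARLIGHT. Templates, ages and fluxes
# are hypothetical.

young = [3.0, 2.0, 1.0, 0.5]          # young-population template (blue)
old   = [0.5, 1.0, 2.0, 3.0]          # old-population template (red)
ages  = {"young": 0.5, "old": 10.0}   # Gyr, illustrative

# Synthetic "observed" spectrum: 30% young light, 70% old light.
observed = [0.3 * y + 0.7 * o for y, o in zip(young, old)]

# Solve the 2-parameter least-squares problem via the normal equations.
syy = sum(y * y for y in young)
soo = sum(o * o for o in old)
syo = sum(y * o for y, o in zip(young, old))
by  = sum(y * f for y, f in zip(young, observed))
bo  = sum(o * f for o, f in zip(old, observed))
det = syy * soo - syo * syo
w_young = (soo * by - syo * bo) / det
w_old   = (syy * bo - syo * by) / det

# Light-weighted age: each population's age weighted by its flux share.
age_lw = (w_young * ages["young"] + w_old * ages["old"]) / (w_young + w_old)
```

Comparing recovered against input light-weighted ages on mock spectra with added noise is exactly the kind of validation the thesis performs before moving to SDSS data.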


Statically balanced compliant mechanisms require no holding force throughout their range of motion while maintaining the advantages of compliant mechanisms. In this paper, a postbuckled fixed-guided beam is proposed to provide the negative stiffness to balance the positive stiffness of a compliant mechanism. To that end, a curve decomposition modeling method is presented to simplify the large deflection analysis. The modeling method facilitates parametric design insight and elucidates key points on the force-deflection curve. Experimental results validate the analysis. Furthermore, static balancing with fixed-guided beams is demonstrated for a rectilinear proof-of-concept prototype.
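The balancing principle above, a negative-stiffness element cancelling the positive stiffness of the compliant mechanism so the net holding force vanishes over the stroke, can be sketched numerically. The linearized stiffness values are hypothetical; the real postbuckled-beam force-deflection curve is nonlinear, which is what the curve decomposition model captures.

```python
# Sketch of static balancing: a compliant mechanism with positive stiffness
# k_pos paired with a postbuckled beam contributing negative stiffness k_neg.
# Values are illustrative; the actual beam behaviour is nonlinear.

k_pos = 4.0    # N/mm, stiffness of the compliant mechanism (hypothetical)
k_neg = -4.0   # N/mm, linearized negative stiffness of the buckled beam

def net_force(x):
    """Holding force required at deflection x (mm) for the balanced pair."""
    return k_pos * x + k_neg * x

stroke = [i * 0.1 for i in range(11)]            # 0 .. 1 mm range of motion
balanced   = [abs(net_force(x)) for x in stroke]
unbalanced = [abs(k_pos * x) for x in stroke]    # mechanism alone, for contrast
max_balanced = max(balanced)
```

When the two stiffnesses cancel exactly, the holding force is zero at every point of the stroke; any residual force in practice measures how well the beam's negative-stiffness region matches the mechanism.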


The cultivation of genetically modified (GM) plants has raised several environmental concerns. One of these concerns regards non-target soil fauna organisms, which play an important role in the decomposition of organic matter and hence are largely exposed to GM plant residues. Soil fauna may be affected directly by transgene products or indirectly by pleiotropic effects such as a modified plant metabolism; ecosystem services and functioning might thus be affected negatively. In a field litterbag experiment we analysed the decomposition process and the soil fauna community involved, using four experimental GM wheat varieties: two with a race-specific antifungal resistance against powdery mildew (Pm3b) and two with an unspecific antifungal resistance based on the expression of chitinase and glucanase. We compared them with two non-GM isolines and six conventional cereal varieties. To elucidate the mechanisms that cause differences in plant decomposition, structural plant components (i.e. C:N ratio, lignin, cellulose, hemicellulose) were examined, and soil properties, temperature and precipitation were monitored. The most frequent taxa extracted from decaying plant material were mites (Cryptostigmata, Gamasina and Uropodina), springtails (Isotomidae), annelids (Enchytraeidae) and Diptera (Cecidomyiidae larvae). Apart from a single significant transgene-by-month interaction for Cecidomyiidae larvae, which is probably spurious, we detected no impact of the GM wheat on the soil fauna community. Indeed, soil fauna differences among conventional cereal varieties were more pronounced than those between GM and non-GM wheat. Likewise, while leaf residue decomposition in GM and non-GM wheat was similar, differences among conventional cereals were evident. Furthermore, sampling date and location were found to greatly influence the soil fauna community and decomposition processes.
The results give no indication of ecologically relevant adverse effects of antifungal GM wheat on the composition and activity of the soil fauna community.
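Litterbag mass loss is commonly summarized with a single-exponential decay model, m(t) = m0 · exp(-k t), with the decomposition constant k estimated by log-linear regression; this standard model is an assumption here, since the abstract does not state the study's own statistics, and all data below are synthetic.

```python
import math

# Sketch: estimate the decomposition constant k of the single-exponential
# litterbag model m(t) = m0 * exp(-k t) by regressing log(mass remaining)
# on time. Sampling dates and mass fractions are synthetic, not the
# study's data (generated with k = 0.5 per month).

months = [1.0, 2.0, 3.0, 4.0]
mass_remaining = [math.exp(-0.5 * t) for t in months]

xs = months
ys = [math.log(m) for m in mass_remaining]
n = len(xs)
xbar = sum(xs) / n
ybar = sum(ys) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
k_est = -slope   # decomposition constant per month
```

Comparing fitted k values between GM lines, isolines and conventional varieties would be one simple way to quantify the decomposition differences the study reports.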


OBJECTIVES: This paper examines four different levels of possible variation in symptom reporting: occasion, day, person and family. DESIGN: In order to rule out effects of retrospection, concurrent symptom reporting was assessed prospectively using a computer-assisted self-report method. METHODS: A decomposition of variance in symptom reporting was conducted using diary data from families with adolescent children. We used palmtop computers to assess concurrent somatic complaints from parents and children six times a day for seven consecutive days. In two separate studies, 314 and 254 participants from 96 and 77 families, respectively, participated. A generalized multilevel linear models approach was used to analyze the data. Symptom reports were modelled using a logistic response function, and random effects were allowed at the family, person and day level, with extra-binomial variation allowed for on the occasion level. RESULTS: Substantial variability was observed at the person, day and occasion level but not at the family level. CONCLUSIONS: To explain symptom reporting in normally healthy individuals, situational as well as person characteristics should be taken into account. Family characteristics, however, would not help to clarify symptom reporting in all family members.
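For a multilevel logistic model, the variance decomposition in the paper can be expressed on the latent scale, where the occasion-level logistic residual has fixed variance π²/3. The sketch below partitions the total latent variance across levels; the family, person and day components are hypothetical values chosen only to mirror the qualitative finding (substantial person and day variance, none at the family level), not the papers' estimates.

```python
import math

# Sketch: latent-scale variance partition for a four-level logistic model
# (family / person / day / occasion). The occasion-level residual of a
# logistic model has variance pi^2 / 3; the other components are made up
# to mirror the reported pattern.

var_family   = 0.0                 # no detectable family-level variance
var_person   = 0.8                 # hypothetical person-level variance
var_day      = 0.5                 # hypothetical day-level variance
var_occasion = math.pi ** 2 / 3    # standard logistic residual variance

total = var_family + var_person + var_day + var_occasion
share = {level: v / total for level, v in
         [("family", var_family), ("person", var_person),
          ("day", var_day), ("occasion", var_occasion)]}
```

Shares of this kind (intraclass correlations) are what justify the conclusion: a zero family share means knowing one family member's symptom reporting tells you little about another's.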


In this work, we present a multichannel EEG decomposition model based on an adaptive topographic time-frequency approximation technique. It extends the Matching Pursuit algorithm and is called dependency multichannel matching pursuit (DMMP). It takes into account the physiologically explainable and statistically observable topographic dependencies between the channels, namely the spatial smoothness of neighboring electrodes implied by the electric leadfield. DMMP decomposes a multichannel signal as a weighted sum of atoms from a given dictionary, in which all channels are represented by exactly the same subset of a complete dictionary. The decomposition is illustrated on topographic EEG data during different physiological conditions using a complete Gabor dictionary. Furthermore, the extension of the single-channel time-frequency distribution to a multichannel time-frequency distribution is given; this can be used to visualize the decomposition structure of multichannel EEG. A clustering procedure applied to the topographies (the vectors, produced by DMMP, of each atom's contribution to the signal in every channel) leads to an extremely sparse topographic decomposition of the EEG.
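The core selection rule, one atom per iteration chosen jointly for all channels but with channel-specific coefficients, can be sketched with a toy orthonormal dictionary. This is a bare multichannel matching pursuit, a simplification of DMMP: no Gabor atoms and no leadfield-smoothness dependency structure are modelled.

```python
# Sketch of multichannel matching pursuit: each iteration picks the single
# dictionary atom with the largest energy summed over ALL channel residuals,
# then subtracts its per-channel projections, so every channel uses the same
# atom subset. Toy orthonormal dictionary (standard basis of R^4).

dictionary = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def multichannel_mp(channels, atoms, n_iter):
    residuals = [list(ch) for ch in channels]
    picks = []                      # (atom index, per-channel coefficients)
    for _ in range(n_iter):
        # joint selection: projection energy of each atom across channels
        energies = [sum(dot(r, a) ** 2 for r in residuals) for a in atoms]
        k = max(range(len(atoms)), key=lambda i: energies[i])
        coeffs = [dot(r, atoms[k]) for r in residuals]   # the "topography"
        for r, c in zip(residuals, coeffs):
            for j in range(len(r)):
                r[j] -= c * atoms[k][j]
        picks.append((k, coeffs))
    return picks, residuals

# Two channels built from atoms 0 and 2 with channel-specific weights.
ch1 = [2.0, 0.0, 1.0, 0.0]
ch2 = [1.5, 0.0, -0.5, 0.0]
picks, residuals = multichannel_mp([ch1, ch2], dictionary, n_iter=2)
chosen = sorted(k for k, _ in picks)
leftover = sum(abs(v) for r in residuals for v in r)
```

The coefficient vector stored with each pick is exactly the per-atom topography that the clustering step of the paper operates on.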


Spatial independent component analysis (sICA) of functional magnetic resonance imaging (fMRI) time series can generate meaningful activation maps and associated descriptive signals, which are useful to evaluate datasets of the entire brain or selected portions of it. Besides computational implications, variations in the input dataset combined with the multivariate nature of ICA may lead to different spatial or temporal readouts of brain activation phenomena. By reducing and increasing a volume of interest (VOI), we applied sICA to different datasets from real activation experiments with multislice acquisition and single or multiple sensory-motor task-induced blood oxygenation level-dependent (BOLD) signal sources with different spatial and temporal structure. Using receiver operating characteristic (ROC) methodology for accuracy evaluation and multiple regression analysis as benchmark, we compared sICA decompositions of reduced and increased VOI fMRI time series containing auditory, motor and hemifield visual activation occurring separately or simultaneously in time. Both approaches yielded valid results; however, the results of the increased VOI approach were spatially more accurate than those of the reduced VOI approach. This is consistent with the capability of sICA to take advantage of extended samples of statistical observations, and suggests that sICA is more powerful with extended rather than reduced VOI datasets in delineating brain activity.
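The ROC accuracy evaluation used above, thresholding a component map against a known activation pattern, can be sketched in a few lines. The component scores and ground-truth mask below are synthetic; the fMRI pipeline itself is not reproduced, and ties between scores are not handled.

```python
# Sketch of ROC evaluation for an activation map: sweep a threshold over the
# component scores, compare against a ground-truth activation mask, and
# accumulate true/false positive rates. Scores and mask are synthetic.

def roc_points(scores, truth):
    pairs = sorted(zip(scores, truth), reverse=True)   # threshold sweep
    pos = sum(truth)
    neg = len(truth) - pos
    tp = fp = 0
    points = [(0.0, 0.0)]
    for _, t in pairs:
        if t:
            tp += 1
        else:
            fp += 1
        points.append((fp / neg, tp / pos))            # (FPR, TPR)
    return points

def auc(points):
    # trapezoidal area under the ROC curve
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

scores = [0.9, 0.8, 0.7, 0.3, 0.2, 0.1]   # synthetic component-map values
truth  = [1,   1,   1,   0,   0,   0]     # synthetic activation mask
area = auc(roc_points(scores, truth))     # perfect separation here
```

A spatially more accurate decomposition, as reported for the increased VOI, shows up directly as a larger area under this curve.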


In environmental epidemiology, exposure X and health outcome Y vary in space and time. We present a method to diagnose the possible influence of unmeasured confounders U on the estimated effect of X on Y, and propose several approaches to robust estimation. The idea is to use space and time as proxy measures for the unmeasured factors U. We start with the time series case, where X and Y are continuous variables at equally spaced times, and assume a linear model. We define matching estimators b(u) that correspond to pairs of observations with a specific lag u. Controlling for a smooth function of time, St, using a kernel estimator is roughly equivalent to estimating the association with a linear combination of the b(u), with weights that involve two components: the assumptions about the smoothness of St and the normalized variogram of the X process. When an unmeasured confounder U exists, but the model otherwise correctly controls for measured confounders, the excess variation in the b(u) is evidence of confounding by U. We use the plot of b(u) versus lag u, the lagged-estimator plot (LEP), to diagnose the influence of U on the effect of X on Y. We use appropriate linear combinations of the b(u), or extrapolate to b(0), to obtain novel estimators that are more robust to the influence of a smooth U. The methods are extended to time series log-linear models and to spatial analyses. The LEP gives a direct view of the magnitude of the estimators at each lag u and provides evidence when a model does not adequately describe the data.
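One natural form for the matching estimators b(u), assumed here since the abstract does not spell out the formula and the paper's exact definition may differ, pairs observations u apart and regresses the outcome difference on the exposure difference. Absent confounding, b(u) is flat in u; excess variation across lags in the LEP is the diagnostic.

```python
# Sketch of a lagged matching estimator b(u): regress Y_{t+u} - Y_t on
# X_{t+u} - X_t over all pairs with lag u. This particular estimator form
# is an assumption; differencing removes any confounder that is constant
# over a span of u time units. Data below are synthetic and confounder-free
# (Y = beta * X exactly), so b(u) should be flat across lags.

def b(u, x, y):
    num = sum((x[t + u] - x[t]) * (y[t + u] - y[t]) for t in range(len(x) - u))
    den = sum((x[t + u] - x[t]) ** 2 for t in range(len(x) - u))
    return num / den

x = [0.0, 1.0, 0.5, 2.0, 1.5, 3.0, 2.0, 4.0]   # synthetic exposure series
beta = 2.0
y = [beta * v for v in x]                      # outcome with no confounding

lep = [b(u, x, y) for u in range(1, 5)]        # lagged-estimator-plot values
```

With a smooth unmeasured confounder added to y, short-lag estimates would be less contaminated than long-lag ones, which is why extrapolating to b(0) yields a more robust estimator.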