989 results for Progressive Asymptotic Approach


Relevance: 20.00%

Abstract:

This study evaluated the sensitivity of the parameters that govern the stability of population size in Chrysomya albiceps and describe its spatial dynamics. The dynamics was modeled using a density-dependent model of population growth. Our simulations show that variation in fecundity, and mainly in survival, has a marked effect on the dynamics and indicates the possibility of transitions from a one-point equilibrium to bounded oscillations. C. albiceps exhibits a two-point limit cycle, but the introduction of diffusive dispersal induces a clear qualitative shift from the two-point limit cycle to one-point, fixed-equilibrium dynamics. The population dynamics of C. albiceps is compared here with the dynamics of Cochliomyia macellaria, C. megacephala and C. putoria.
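
A minimal sketch of the kind of model the abstract describes, assuming an exponential (Ricker-type) density-dependent map for fecundity and survival, as commonly used for blowflies, coupled by nearest-neighbour diffusive dispersal on a ring of patches; parameter names and values are illustrative, not those estimated in the study:

```python
import numpy as np

def local_map(n, F, S, f, s):
    """Density-dependent growth: per-capita fecundity F*exp(-f*n) and survival
    S*exp(-s*n) both decline exponentially with density n (Ricker-type map)."""
    return 0.5 * F * np.exp(-f * n) * S * np.exp(-s * n) * n

def simulate(n0, F, S, f, s, D=0.0, steps=500):
    """Iterate the map on a ring of patches; each patch sends a fraction D of
    its post-growth population equally to its two neighbours."""
    n = np.asarray(n0, dtype=float)
    for _ in range(steps):
        grown = local_map(n, F, S, f, s)
        n = (1 - D) * grown + 0.5 * D * (np.roll(grown, 1) + np.roll(grown, -1))
    return n
```

Depending on the fecundity and survival parameters, the local map settles on a stable equilibrium or on bounded oscillations such as a two-point cycle; whether dispersal (D > 0) damps those oscillations towards a single fixed point, as reported in the abstract, depends on the coupling strength and on spatial heterogeneity in the initial conditions.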

Relevance: 20.00%

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches beyond the local scale still represents a major challenge, yet it is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field.

I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and the simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and of the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive, low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is capable of adequately capturing both the small-scale heterogeneity and the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and can hence be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges.

In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach proved highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
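
The gradual-deformation idea can be illustrated with a minimal sketch (not the thesis implementation; it assumes a zero-mean, unit-variance Gaussian prior on the model parameters, under which the cosine/sine combination preserves the prior and the Metropolis acceptance ratio reduces to a likelihood ratio):

```python
import numpy as np

def gradual_deformation_proposal(m_current, rng, theta):
    """Combine the current model with an independent prior realization; the
    cos/sin weights preserve the (zero-mean, unit-variance) Gaussian prior."""
    m_independent = rng.standard_normal(m_current.shape)
    return np.cos(theta) * m_current + np.sin(theta) * m_independent

def metropolis_step(m_current, log_likelihood, rng, theta=0.1):
    """One MCMC step: because the proposal is prior-preserving, the acceptance
    probability depends only on the likelihood ratio. theta sets the
    perturbation strength (theta -> 0: tiny steps, theta = pi/2: fresh prior draw)."""
    m_proposed = gradual_deformation_proposal(m_current, rng, theta)
    if np.log(rng.uniform()) < log_likelihood(m_proposed) - log_likelihood(m_current):
        return m_proposed, True
    return m_current, False
```

Tuning theta trades off acceptance rate against step size, which is how the perturbation strength flexibility mentioned in the abstract would be exercised in practice.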

Relevance: 20.00%

Abstract:

The solution for the ‘Contested Garment Problem’ proposed in the Babylonian Talmud suggests that each agent should receive at least some part of the resources whenever total demand exceeds the available amount. In this context, we propose a new method to define lower bounds on awards, an idea that has underlain the theoretical analysis of bankruptcy problems from its beginning (O’Neill, 1982) to the present day (Dominguez and Thomson, 2006). Specifically, starting from the fact that a society establishes its own set of ‘Commonly Accepted Equity Principles’, our proposal guarantees each agent the smallest amount she would receive under any of the admissible rules. Since in general this new bound will not exhaust the estate, we analyze its recursive application for different sets of equity principles. Keywords: Bankruptcy problems, Bankruptcy rules, Lower bounds, Recursive process
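
A hedged sketch of the idea behind the bound: for a claims vector and an estate, compute each agent's award under a set of admissible rules and keep the pointwise minimum. The rules shown here (proportional, constrained equal awards, constrained equal losses, Talmud) are standard examples only; the paper's actual set of ‘Commonly Accepted Equity Principles’ and its recursive scheme may differ.

```python
import numpy as np
from scipy.optimize import brentq

def proportional(c, E):
    """Each agent receives a share of the estate proportional to her claim."""
    return E * c / c.sum()

def cea(c, E):
    """Constrained equal awards: min(c_i, lam), with lam chosen to exhaust E."""
    lam = brentq(lambda l: np.minimum(c, l).sum() - E, 0.0, c.max())
    return np.minimum(c, lam)

def cel(c, E):
    """Constrained equal losses: max(c_i - lam, 0), with lam chosen to exhaust E."""
    lam = brentq(lambda l: np.maximum(c - l, 0.0).sum() - E, 0.0, c.max())
    return np.maximum(c - lam, 0.0)

def talmud(c, E):
    """Talmud rule: CEA on half-claims below the half-sum, CEL above it."""
    half = c / 2.0
    return cea(half, E) if E <= half.sum() else half + cel(half, E - half.sum())

def lower_bound(c, E, rules=(proportional, cea, cel, talmud)):
    """Smallest award each agent gets across the admissible rules
    (assumes 0 < E < c.sum())."""
    return np.min([rule(c, E) for rule in rules], axis=0)

# Contested garment example: claims (100, 200, 300), estate 200.
print(lower_bound(np.array([100.0, 200.0, 300.0]), 200.0))
```

As the abstract notes, such a bound generally does not exhaust the estate, so it can be reapplied recursively to the remaining estate and the correspondingly reduced claims.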

Relevance: 20.00%

Abstract:

OBJECTIVE: To determine whether, during hemorrhagic shock, the effect of epinephrine on energy metabolism could be deleterious by increasing the oxygen requirement at a given level of oxygen delivery (DO2). DESIGN: Prospective, randomized, controlled trial. SETTING: Experimental laboratory. SUBJECTS: Two groups of seven mongrel dogs were studied. The epinephrine group received a continuous infusion of epinephrine (1 microgram/min/kg) while the control group received saline. INTERVENTION: Dogs were anesthetized with pentobarbital, and shock was produced by stepwise hemorrhage. MEASUREMENTS AND MAIN RESULTS: Oxygen consumption (VO2) was continuously measured by the gas exchange technique, while DO2 was independently calculated from cardiac output (measured by thermodilution) and blood oxygen content. A dual-line regression fit was applied to the DO2 vs. VO2 plot. The intersection of the two regression lines defined the critical value of DO2. Values above critical DO2 belonged to phase 1, while phase 2 occurred below critical DO2. In the control group, VO2 was independent of DO2 during phase 1; VO2 was dependent on DO2 during phase 2. In the epinephrine group, the expected increase in VO2 (+19%) and DO2 (+50%) occurred under normovolemic conditions. During hemorrhage, VO2 immediately decreased, and the slope of phase 1 was significantly (p < .01) different from zero, and was significantly (p < .05) steeper than in the control group (0.025 +/- 0.005 vs. 0.005 +/- 0.010). However, the critical DO2 (8.7 +/- 1.7 vs. 9.7 +/- 2.4 mL/min/kg), the critical VO2 (5.6 +/- 0.5 vs. 5.5 +/- 0.9 mL/min/kg), and the slope of phase 2 (0.487 +/- 0.080 vs. 0.441 +/- 0.130) were not different from control values. CONCLUSIONS: The administration of pharmacologic doses of epinephrine significantly increased VO2 under normovolemic conditions due to the epinephrine-induced thermogenic effect. This effect progressively decreased during hemorrhage. The critical DO2 and the relationship between DO2 and VO2 in the supply-dependent phase of shock were unaffected by epinephrine infusion. These results suggest that during hemorrhagic shock, epinephrine administration did not exert a detrimental effect on the relationship between DO2 and VO2.
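
The abstract does not give the exact fitting algorithm, but a dual-line (two-segment) fit of the DO2-VO2 plot can be sketched as a breakpoint scan; the split that minimizes the total squared error estimates the critical DO2:

```python
import numpy as np

def _line_sse(x, y):
    """Least-squares line through (x, y); return the sum of squared errors and the slope."""
    slope, intercept = np.polyfit(x, y, 1)
    return np.sum((y - (slope * x + intercept)) ** 2), slope

def critical_do2(do2, vo2):
    """Fit two regression lines to the sorted DO2-VO2 data, trying every split
    with at least three points per segment; the DO2 at the best split estimates
    the critical DO2, below which VO2 becomes supply dependent (phase 2)."""
    order = np.argsort(do2)
    x, y = np.asarray(do2, float)[order], np.asarray(vo2, float)[order]
    best = (np.inf, None, None, None)
    for k in range(3, len(x) - 2):
        sse_low, slope_phase2 = _line_sse(x[:k], y[:k])    # below critical DO2
        sse_high, slope_phase1 = _line_sse(x[k:], y[k:])   # above critical DO2
        if sse_low + sse_high < best[0]:
            best = (sse_low + sse_high, x[k], slope_phase1, slope_phase2)
    _, crit, slope_phase1, slope_phase2 = best
    return crit, slope_phase1, slope_phase2
```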

Relevance: 20.00%

Abstract:

BACKGROUND AND PURPOSE: Recent evidence suggests that there may be more than one Gilles de la Tourette syndrome (GTS)/tic disorder phenotype. However, little is known about the common patterns of these GTS/tic disorder-related comorbidities. In addition, sex-specific phenomenological data on GTS/tic disorder-affected adults are rare. Therefore, this community-based study used latent class analyses (LCA) to investigate sex-related and non-sex-related subtypes of GTS/tic disorders and their most common comorbidities. METHODS: The data were drawn from the PsyCoLaus study (n = 3691), a population-based survey conducted in Lausanne, Switzerland. LCA were performed on the data of 80 subjects manifesting motor/vocal tics during their childhood/adolescence. Comorbid attention-deficit hyperactivity disorder (ADHD), obsessive-compulsive disorder, depressive, phobic and panic symptoms/syndromes comprised the selected indicators. The resultant classes were characterized by psychosocial correlates. RESULTS: In LCA, four latent classes provided the best fit to the data. We identified two male-related classes. The first class exhibited both ADHD and depression. The second class comprised males with only depression. Class three was a female-related class characterized by obsessive thoughts/compulsive acts, phobias and panic attacks. This class manifested high psychosocial impairment. Class four had a balanced sex proportion and comorbid symptoms/syndromes such as phobias and panic attacks. The complementary occurrence of comorbid obsessive thoughts/compulsive acts and ADHD impulsivity was notable. CONCLUSIONS: To the best of our knowledge, this is the first study applying LCA to community data of persons affected by GTS symptoms/tic disorders. Our findings support the utility of differentiating GTS/tic disorder subphenotypes on the basis of comorbid syndromes.
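
The abstract does not specify which software or estimation details were used; as a generic illustration, a latent class model on binary symptom indicators is a Bernoulli mixture that can be fitted by EM, with the number of classes chosen by an information criterion such as the BIC:

```python
import numpy as np

def fit_lca(X, n_classes, n_iter=200, seed=0):
    """EM for a latent class (Bernoulli mixture) model.
    X: binary (0/1) array of shape (subjects, indicators).
    Returns class weights, per-class item probabilities, and the log-likelihood."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    weights = np.full(n_classes, 1.0 / n_classes)
    probs = rng.uniform(0.25, 0.75, size=(n_classes, d))
    for _ in range(n_iter):
        # E-step: class responsibilities from Bernoulli log-likelihoods
        log_p = (X[:, None, :] * np.log(probs) +
                 (1 - X[:, None, :]) * np.log(1 - probs)).sum(axis=2)
        log_p += np.log(weights)
        log_norm = np.logaddexp.reduce(log_p, axis=1, keepdims=True)
        resp = np.exp(log_p - log_norm)
        # M-step: update mixing weights and item probabilities
        weights = resp.mean(axis=0)
        probs = np.clip(resp.T @ X / resp.sum(axis=0)[:, None], 1e-6, 1 - 1e-6)
    return weights, probs, float(log_norm.sum())

def bic(loglik, n_classes, n, d):
    """BIC for model selection: (k-1) mixing weights plus k*d item probabilities."""
    n_params = (n_classes - 1) + n_classes * d
    return -2 * loglik + n_params * np.log(n)
```

Fitting the model for, say, one to six classes and keeping the solution with the lowest BIC mirrors the kind of model selection that leads to a best-fitting four-class solution such as the one reported above.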

Relevance: 20.00%

Abstract:

PURPOSE: Retinal detachment (RD) is a major complication of cataract surgery, which can be treated by either primary vitrectomy without indentation or the scleral buckling procedure. The aim of this study was to compare the results of these two techniques for the treatment of pseudophakic RD. PATIENTS AND METHODS: The charts of 40 patients (40 eyes) treated with scleral buckling for a primary pseudophakic RD were retrospectively studied and compared to the charts of 32 patients (32 eyes) treated with primary vitrectomy without scleral buckle during the same period by the same surgeons. To obtain comparable samples, patients with giant retinal tears, vitreous hemorrhage, and severe preoperative proliferative vitreoretinopathy (PVR) were not included. Minimal follow-up was 6 months. RESULTS: The primary success rate was 84% in the vitrectomy group and 82.5% in the ab-externo group. Final anatomical success was observed in 100% of cases in the vitrectomy group and in 95% of cases in the ab-externo group. Final visual acuity was 0.5 or better in 44% of cases in the vitrectomy group and 37.5% in the ab-externo group. The duration of the surgery was significantly shorter in the ab-externo group, whereas the hospital stay tended to be shorter in the vitrectomy group. In the vitrectomy group, postoperative PVR developed in 3 eyes and new or undetected breaks were responsible for failure of the initial procedure in 2 eyes. CONCLUSION: Primary vitrectomy appears to be as effective as scleral buckling procedures for the treatment of pseudophakic RD.
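
The abstract does not report which statistical test, if any, was used to compare the primary success rates; as a purely illustrative check, the reported percentages can be converted back to approximate counts (27/32 ≈ 84% for vitrectomy, 33/40 = 82.5% for the ab-externo group) and compared with Fisher's exact test:

```python
from scipy.stats import fisher_exact

# Approximate counts reconstructed from the reported percentages and group sizes.
vitrectomy = [27, 32 - 27]   # primary success, failure
ab_externo = [33, 40 - 33]
odds_ratio, p_value = fisher_exact([vitrectomy, ab_externo])
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2f}")
```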

Relevance: 20.00%

Abstract:

The article comprises two sections. The first is a critical review of the three main alternative indices to GDP proposed in recent decades, namely the Human Development Index (HDI), the Genuine Progress Indicator (GPI) and the Happy Planet Index (HPI); the review is conducted on the basis of conceptual foundations, rather than on issues of statistical consistency or mathematical refinement as most of the literature does. The pars construens proposes an alternative measure, the composite wealth index, consistent with an approach to development based on the notion of composite wealth, which is in turn derived from an empirical common-sense criterion. Arguably, this approach lends itself to an easily understandable and coherent indicator, and is thus appropriate for tracking development in its various dimensions: simple in its formulation, the wealth approach can incorporate social and ecological goals without significant alterations to its conceptual foundations, while reducing arbitrary weighting to a minimum.
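
The abstract does not spell out the formula of the composite wealth index; as a purely hypothetical illustration of how such an indicator could aggregate normalized dimensions with minimal arbitrary weighting, an equal-weight geometric mean (the HDI-style aggregation) looks like this, with dimension values invented for the example:

```python
import numpy as np

def normalize(value, lo, hi):
    """Min-max normalization of a raw dimension onto [0, 1]."""
    return (value - lo) / (hi - lo)

def composite_index(dimensions, weights=None):
    """Weighted geometric mean of normalized dimensions (HDI-style aggregation).
    Equal weights by default, which keeps arbitrary weighting to a minimum."""
    x = np.asarray(dimensions, dtype=float)
    w = np.full(len(x), 1.0 / len(x)) if weights is None else np.asarray(weights)
    return float(np.prod(x ** w))

# Hypothetical economic, social and ecological components, already normalized to [0, 1].
print(composite_index([0.8, 0.6, 0.7]))
```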

Relevance: 20.00%

Abstract:

This paper characterizes and evaluates the potential of three commercial CT iterative reconstruction methods (ASIR, VEO and iDose4) for dose reduction and image quality improvement. We measured CT number accuracy, standard deviation (SD), noise power spectrum (NPS) and modulation transfer function (MTF) metrics on Catphan phantom images, while five human observers performed four-alternative forced-choice (4AFC) experiments to assess the detectability of low- and high-contrast objects embedded in two pediatric phantoms. Results show that 40% and 100% ASIR, as well as iDose4 levels 3 and 6, do not affect CT number and strongly decrease image noise, with relative SD constant over a large dose range. However, while ASIR produces a shift of the NPS curve apex, less change is observed with iDose4 with respect to FBP methods. With the second-generation iterative reconstruction VEO, physical metrics improve even further: SD decreased by 70.4% at 0.5 mGy and spatial resolution improved by 37% (MTF 50%). The 4AFC experiments show that few improvements in detection task performance are obtained with ASIR and iDose4, whereas VEO makes excellent detection possible even at an ultra-low dose (0.3 mGy), leading to a potential dose reduction by a factor of 3 to 7 (67%-86%). In spite of its longer reconstruction time and the fact that clinical studies are still required to complete these results, VEO clearly confirms the tremendous potential of iterative reconstruction for dose reduction in CT and appears to be an important tool for patient follow-up, especially for pediatric patients, where the cumulative lifetime dose remains high.
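
As an illustration of how two of the physical metrics mentioned above are typically computed (a generic sketch, not the authors' code), the MTF 50% point can be read off a measured MTF curve by interpolation, and the 2-D NPS can be estimated from an ensemble of uniform-phantom ROIs:

```python
import numpy as np

def mtf50(frequencies, mtf):
    """Spatial frequency at which the MTF falls to 50%.
    Assumes mtf decreases monotonically with frequency, so the reversed
    arrays are increasing as required by np.interp."""
    mtf = np.asarray(mtf, float)
    frequencies = np.asarray(frequencies, float)
    return float(np.interp(0.5, mtf[::-1], frequencies[::-1]))

def noise_power_spectrum(rois, pixel_size):
    """2-D NPS from an ensemble of uniform-phantom ROIs (shape: n_roi x ny x nx):
    ensemble average of |DFT|^2 of the mean-subtracted ROIs, scaled by the
    pixel area over the number of pixels."""
    rois = np.asarray(rois, float)
    n_y, n_x = rois.shape[1:]
    centered = rois - rois.mean(axis=(1, 2), keepdims=True)
    dft = np.fft.fftshift(np.fft.fft2(centered), axes=(1, 2))
    return (np.abs(dft) ** 2).mean(axis=0) * (pixel_size ** 2) / (n_x * n_y)
```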