987 results for homogeneous domination approach


Relevance:

30.00%

Publisher:

Abstract:

Estimation of population size with a missing zero class is an important problem encountered in epidemiological assessment studies. Fitting a Poisson model to the observed data by maximum likelihood and estimating the population size from this fit is an approach that has been widely used for this purpose. In practice, however, the Poisson assumption is seldom satisfied. Zelterman (1988) proposed a robust estimator for unclustered data that works well in a wide class of distributions applicable to count data. In the work presented here, we extend this estimator to clustered data. The estimator requires fitting a zero-truncated homogeneous Poisson model by maximum likelihood and then using a Horvitz-Thompson estimator of population size. This was found to work well when the data follow the hypothesized homogeneous Poisson model. However, when the true distribution deviates from the hypothesized model, the population size was found to be underestimated. In search of a more robust estimator, we focused on three models that use, respectively, all clusters with exactly one case, with exactly two cases, and with exactly three cases to estimate the probability of the zero class, and thereby use the data collected on all clusters in the Horvitz-Thompson estimator of population size. The loss in efficiency associated with the gain in robustness was examined in a simulation study. As a trade-off between gain in robustness and loss in efficiency, the model that uses data collected on clusters with at most three cases to estimate the probability of the zero class was found to be preferable in general. In applications, we recommend obtaining estimates from all three models and making a choice that weighs the three estimates, robustness, and the loss in efficiency. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
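The basic zero-truncated-Poisson/Horvitz-Thompson step described in the abstract can be sketched as follows (a minimal illustration for unclustered counts, not the authors' clustered extension; the function name and numerical tolerances are our own):

```python
import numpy as np
from scipy.optimize import brentq

def ht_population_size(counts):
    """Horvitz-Thompson population-size estimate based on a
    zero-truncated homogeneous Poisson fit to the observed
    (non-zero) counts. Requires mean(counts) > 1."""
    counts = np.asarray(counts, dtype=float)
    ybar = counts.mean()
    # The MLE of lambda solves lambda / (1 - exp(-lambda)) = ybar
    lam = brentq(lambda l: l / (1.0 - np.exp(-l)) - ybar, 1e-8, 100.0)
    p0 = np.exp(-lam)                  # estimated zero-class probability
    return len(counts) / (1.0 - p0)    # Horvitz-Thompson estimator
```

With counts such as [1, 2, 3, 1, 2], the estimate inflates the observed 5 units by the estimated zero-class mass.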

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we consider some non-homogeneous Poisson models to estimate the probability that an air quality standard is exceeded a given number of times in a time interval of interest. We assume that the exceedances occur according to a non-homogeneous Poisson process (NHPP). This Poisson process has rate function lambda(t), t >= 0, which depends on some parameters that must be estimated. We consider two forms of the rate function: the Weibull and the Goel-Okumoto. We consider models with and without change-points; when change-points are assumed, there may be one, two or three of them, depending on the data set. The parameters of the rate functions are estimated using a Gibbs sampling algorithm. The results are applied to ozone data provided by the Mexico City monitoring network. In the first instance, we assume that no change-points are present; depending on the fit of the model, we then assume the presence of one, two or three change-points. Copyright (C) 2009 John Wiley & Sons, Ltd.
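Once the rate-function parameters are estimated, the probability of k exceedances in an interval follows from the Poisson law with mean m(t2) - m(t1), where m is the integrated rate. A minimal sketch for a Weibull-type rate (the parameterization and names below are assumptions for illustration, not taken from the paper):

```python
import math

def weibull_mean(t, beta, sigma):
    """Mean function m(t) = (t/sigma)**beta of an NHPP whose rate is
    lambda(t) = (beta/sigma) * (t/sigma)**(beta - 1)."""
    return (t / sigma) ** beta

def prob_k_exceedances(t1, t2, k, beta, sigma):
    """P(N(t1, t2] = k): Poisson probability with mean m(t2) - m(t1)."""
    mu = weibull_mean(t2, beta, sigma) - weibull_mean(t1, beta, sigma)
    return math.exp(-mu) * mu ** k / math.factorial(k)
```

For beta = 1 the process reduces to a homogeneous Poisson process, a convenient sanity check.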

Relevance:

30.00%

Publisher:

Abstract:

The effects of alkali treatment on the structural characteristics of cotton linters and sisal cellulose samples have been studied. Mercerization results in a decrease in the indices of crystallinity and the degrees of polymerization, and an increase in the alpha-cellulose contents of the samples. The relevance of the structural properties of cellulose to its dissolution is probed by studying the kinetics of cellulose decrystallization, prior to its solubilization in LiCl/N,N-dimethylacetamide (DMAc). Our data show that the decrystallization rate constants and activation parameters are only slightly dependent on the physico-chemical properties of the starting celluloses. This multi-step reaction is accompanied by a small enthalpy and large, negative, entropy of activation. These results are analyzed in terms of the interactions within the biopolymer chains during decrystallization, as well as those between the two ions of the electrolyte and both DMAc and cellulose.

Relevance:

30.00%

Publisher:

Abstract:

The paper presents a new methodology to model material failure in two-dimensional reinforced concrete members using the Continuum Strong Discontinuity Approach (CSDA). The mixture theory is used as the methodological approach to model reinforced concrete as a composite material, constituted by a plain concrete matrix reinforced with two embedded orthogonal long-fiber bundles (rebars). Matrix failure is modeled on the basis of a continuum damage model equipped with strain softening, whereas the rebar effects are modeled by means of phenomenological constitutive models devised to reproduce the axial non-linear behavior, as well as the bond-slip and dowel effects. The proposed methodology extends the fundamental ingredients of the standard Strong Discontinuity Approach, and the embedded-discontinuity finite element formulations for homogeneous materials, to matrix/fiber composite materials such as reinforced concrete. The specific aspects of material failure modeling for those composites are also addressed. A number of available experimental tests are reproduced in order to illustrate the feasibility of the proposed methodology. (c) 2007 Elsevier B.V. All rights reserved.
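A one-dimensional caricature of the kind of continuum damage model with strain softening used for the concrete matrix can make the ingredient concrete (the modulus, threshold strain, and exponential softening parameter below are assumed values, not taken from the paper):

```python
import math

def damage_stress(strain_path):
    """1D isotropic damage law with exponential strain softening.
    The internal variable kappa tracks the largest strain seen so far;
    damage d grows once kappa exceeds the threshold eps0."""
    E = 30e9        # Young's modulus of concrete [Pa] (assumed)
    eps0 = 1e-4     # damage-threshold strain (assumed)
    Hs = 500.0      # softening parameter (assumed)
    kappa = eps0
    out = []
    for eps in strain_path:
        kappa = max(kappa, abs(eps))
        if kappa <= eps0:
            d = 0.0
        else:
            # exponential softening: d -> 1 as kappa grows
            d = 1.0 - (eps0 / kappa) * math.exp(-Hs * (kappa - eps0))
        out.append((1.0 - d) * E * eps)
    return out
```

Below the threshold the response is linear elastic; beyond it the secant stiffness (1 - d)E decays, so the stress eventually drops with increasing strain, which is the softening branch the embedded-discontinuity formulation must handle.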

Relevance:

30.00%

Publisher:

Abstract:

Studies have been carried out on heat transfer in a packed bed of glass beads percolated by air at moderate flow rates. A rigorous statistical analysis of the experimental data was carried out, and the traditional two-parameter model was used to represent them. The parameters estimated by the least-squares method were the effective radial thermal conductivity, k, and the wall coefficient, h. The results were evaluated with respect to the bed inlet temperature, T-o, the number of terms in the solution series, and the number of experimental points used in the estimation. The results indicated that a small difference in T-o was sufficient to promote great modifications in the estimated parameters and in the statistical properties of the model. The use of replicates at points of high parametric information improved the results, although analysis of the residuals led to the rejection of this alternative. In order to evaluate the non-linearity of the model, the Bates and Watts (1988) curvature measures and the Box (1971) biases of the coefficients were calculated. The intrinsic curvatures of the model (IN) tend to be concentrated at low bed heights, while those due to parameter effects (PE) are spread all over the bed. The Box biases indicated both parameters as responsible for the PE curvatures, h being somewhat more problematic. (C) 2000 Elsevier B.V. Ltd. All rights reserved.
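The sensitivity of the estimated (k, h) pair to the assumed inlet temperature can be illustrated with a toy two-parameter least-squares fit. The model below is a hypothetical exponential stand-in, not the series solution used in the paper; all numerical values are invented for the demonstration:

```python
import numpy as np
from scipy.optimize import curve_fit

def bed_model(z, k, h, T0=60.0):
    # Hypothetical stand-in: axial decay at rate k toward asymptote h.
    return (T0 - h) * np.exp(-k * z) + h

rng = np.random.default_rng(0)
z = np.linspace(0.0, 5.0, 30)
# Synthetic "measurements" with true k = 0.8, h = 25, T0 = 60:
data = bed_model(z, 0.8, 25.0) + rng.normal(0.0, 0.2, z.size)

# Fit with the correct inlet temperature, then with a slightly wrong one.
(k1, h1), _ = curve_fit(lambda z, k, h: bed_model(z, k, h, 60.0),
                        z, data, p0=[1.0, 20.0])
(k2, h2), _ = curve_fit(lambda z, k, h: bed_model(z, k, h, 62.0),
                        z, data, p0=[1.0, 20.0])
# A 2-degree error in T0 visibly shifts the estimated (k, h) pair.
```

Even in this toy setting, fixing T0 two degrees too high forces the optimizer to distort both parameters to absorb the mismatch, mirroring the sensitivity the study reports.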

Relevance:

30.00%

Publisher:

Abstract:

This chapter analyzes the current position of United States supremacy, in light of the debate on hegemony and domination that acquires greater relevance after the formulation of the 'Bush Doctrine', which is systematized in the document 'The National Security Strategy of the United States of America'. Our approach will emphasize the following aspects: establishment of a parallel between the transition from the 19th to the 20th centuries, from studies that point out the characteristics of imperialism at different times; an analysis of the current foreign policies of the United States, focusing on the debate between unilateralism and multilateralism, emphasizing the reactions caused by the intervention in Iraq; a critical argument about the approaches that highlight in the security agenda of the Bush administration an indicator of a loss of hegemony, which would impose open domination over the search for consensus. Copyright © 2005 SAGE Publications.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this paper is to introduce a new approach for edge detection in gray-scale images. The proposed approach is based on fuzzy number theory. The idea is to deal with the uncertainties in the gray shades making up the image, and thus to calculate the degree to which each pixel fits a homogeneous region around it. Pixels not belonging to the region are then classified as edge pixels. The results show that the technique is simple and computationally efficient, and that it compares well with both traditional edge detectors and existing fuzzy edge detectors. © 2007 IEEE.
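The core idea, measuring how well a pixel "fits" a homogeneous neighbourhood with a fuzzy membership function, can be sketched with a triangular membership centred on the local mean. This is an illustration of the concept only, not the authors' detector; `spread` and `thresh` are assumed tuning parameters:

```python
import numpy as np

def fuzzy_edge_map(img, win=3, spread=10.0, thresh=0.5):
    """Mark as edges the pixels whose gray level has low triangular
    membership in the fuzzy number centred on the local window mean."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    r = win // 2
    edges = np.zeros((h, w), dtype=bool)
    for i in range(r, h - r):
        for j in range(r, w - r):
            region = img[i - r:i + r + 1, j - r:j + r + 1]
            mu_center = region.mean()
            # triangular membership: 1 at the local mean, 0 beyond `spread`
            membership = max(0.0, 1.0 - abs(img[i, j] - mu_center) / spread)
            edges[i, j] = membership < thresh
    return edges
```

On a synthetic step image (one flat region meeting another), only pixels whose windows straddle the step get low membership and are flagged.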

Relevance:

30.00%

Publisher:

Abstract:

To simplify computer management, many system administrators are adopting advanced techniques to manage software configuration on grids, but the tight coupling between hardware and software makes every PC an individually managed entity, lowering scalability and increasing the cost of managing hundreds or thousands of PCs. This paper discusses the feasibility of a distributed virtual machine environment, named FlexLab: a new approach to computer management that combines virtualization and distributed system architectures as the basis of a management system. FlexLab is able to extend the coverage of a computer management solution beyond client operating system limitations and also offers a convenient hardware abstraction, decoupling software from hardware and simplifying computer management. The results obtained in this work indicate that FlexLab is able to overcome the limitations imposed by the coupling between software and hardware, simplifying the management of homogeneous and heterogeneous grids. © 2009 IEEE.

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

The effectiveness and accuracy of the superposition method in assessing the dynamic stiffness and damping functions of embedded footings supported by vertical piles in homogeneous viscoelastic soil is addressed. To this end, the impedances of piled embedded footings are compared with those obtained by superposing the impedance functions of the corresponding pile group and embedded footing, treated separately.
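In numbers, the superposition estimate amounts to adding the separately computed complex impedances; the real part gives the dynamic stiffness and the imaginary part, divided by the circular frequency, the damping coefficient. The values below are purely illustrative, not taken from the study:

```python
def stiffness_damping(K, omega):
    """Split a complex impedance K(omega) = k + 1j*omega*c into its
    dynamic stiffness k and damping coefficient c."""
    return K.real, K.imag / omega

omega = 10.0                       # circular frequency [rad/s] (assumed)
K_pile_group = 4.2e8 + 1.1e8j      # impedance of the pile group [N/m] (assumed)
K_footing = 2.9e8 + 0.6e8j         # impedance of the embedded footing [N/m] (assumed)

# Superposition estimate for the piled embedded footing:
K_total = K_pile_group + K_footing
k_dyn, c_damp = stiffness_damping(K_total, omega)
```

The study's question is precisely how close this simple sum comes to the impedance of the coupled system computed as a whole.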

Relevance:

30.00%

Publisher:

Abstract:

The topics I worked on during my Ph.D. are mainly two. The first concerns new organocatalytic protocols for Mannich-type reactions mediated by Cinchona alkaloid derivatives (Scheme I, left); the second concerns a new approach towards the enantioselective total synthesis of Aspirochlorine, a potent gliotoxin that recent studies indicate is a highly selective and active antifungal agent (Scheme I, right). At the beginning of 2005 I had the chance to join the group of Prof. Alfredo Ricci at the Department of Organic Chemistry of the University of Bologna, starting my Ph.D. studies. During the first period I studied a new homogeneous organocatalytic aza-Henry reaction using Cinchona alkaloid derivatives as chiral base catalysts, with good results. Soon after, we introduced a new protocol which allowed the in situ synthesis of N-carbamoyl imines, scarcely stable, moisture-sensitive compounds. For this purpose we used α-amido sulfones, bench-stable white crystalline solids, as imine precursors (Scheme II). In particular, using chiral phase-transfer catalysis, we were able to obtain the aza-Henry adducts with a broad range of substituents as the R-group and with excellent results, unprecedented for Mannich-type transformations (Scheme II). With the optimised protocol in hand, we extended the methodology to other Mannich-type reactions. We applied the new method to the Mannich, Strecker and Pudovik (hydrophosphonylation of imines) reactions with very good results in terms of enantioselectivity and yield, broadening the usefulness of this novel protocol. The Mannich reaction was certainly the most extensively studied transformation in this thesis (Scheme III).
Initially we developed the reaction with α-amido sulfones as imine precursors and non-commercially available malonates, with excellent results in terms of yields and enantioselectivities. In this case a catalyst loading of only 1 mol% was sufficient, which is very low for organocatalytic processes. We then developed a new Mannich reaction using simpler malonates, such as dimethyl malonate. Under the new optimised conditions the reaction provided slightly lower enantioselectivities than the previous protocol, but the Mannich adducts were very versatile intermediates for obtaining β3-amino acids. Furthermore, we performed the first addition of cyclic β-ketoesters to α-amido sulfones, obtaining the corresponding products in good yield with high levels of diastereomeric and enantiomeric excess (Scheme III). Further studies addressed the Strecker reaction mediated by Cinchona alkaloid phase-transfer quaternary ammonium salt derivatives, using acetone cyanohydrin, a relatively harmless cyanide source (Scheme IV). The reaction proceeded very well, providing the corresponding α-amino nitriles in good yields and enantiomeric excesses. Finally, we developed two new complementary methodologies for the hydrophosphonylation of imines (Scheme V). Because of the low stability of the products derived from aromatic imines, we performed these reactions under mild homogeneous basic conditions using quinine as a chiral base catalyst, giving α-aryl-α-amido phosphonic acid esters as products (Scheme V, top). On the other hand, we performed the addition of dialkyl phosphites to aliphatic imines using chiral Cinchona alkaloid phase-transfer quaternary ammonium salt derivatives and our methodology based on α-amido sulfones (Scheme V, bottom). The results were good for both procedures, covering a broad range of α-amino phosphonic acid esters. During the second year of my Ph.D. studies, I spent six months in the group of Prof. Steven V. Ley at the Department of Chemistry of the University of Cambridge, United Kingdom. During this fruitful period I was involved in a project concerning the enantioselective synthesis of Aspirochlorine. We devised a new route to a key intermediate, reducing the number of steps and increasing the overall yield. We then introduced a new enantioselective spirocyclisation for the synthesis of a chiral building block needed for the completion of the synthesis (Scheme VI).

Relevance:

30.00%

Publisher:

Abstract:

Tethered bilayer lipid membranes (tBLMs) are a promising model system for the natural cell membrane. They consist of a lipid bilayer that is covalently coupled to a solid support via a spacer group. In this study, we developed a suitable approach to increase the submembrane space in tBLMs. The challenge is to create a membrane with a lower lipid density in order to increase the membrane fluidity, while avoiding the defects that might appear as the lateral spacing within the tethered monolayers increases. Therefore, various synthetic strategies and different monolayer preparation techniques were examined. Synthetic attempts to achieve a large ion reservoir were made in two directions: increasing the spacer length of the tether lipids, and increasing the lateral distribution of the lipids in the monolayer. The first resulted in the synthesis of a small library of tether lipids (DPTT, DPHT and DPOT) characterized by 1H and 13C NMR, FD-MS, ATR, DSC and TGA. The synthetic strategy for their preparation includes the synthesis of a precursor with a double-bond anchor that can easily be modified for different substrates (e.g. metal and metal oxide). Here, the double bond was modified into a thiol group suitable for gold surfaces. Another approach towards the preparation of homogeneous monolayers with decreased two-dimensional packing density was the synthesis of two novel anchor lipids: DPHDL and DDPTT. DPHDL is a “self-diluted” tether lipid containing two lipoic anchor moieties. DDPTT has an extended lipophilic part that should lead to the preparation of diluted, leakage-free proximal layers that facilitate the completion of the bilayer. Our tool-box of tether lipids was completed with two fluorescently labeled lipid precursors with, respectively, one and two phytanyl chains in the hydrophobic region and a dansyl group as a fluorophore. The use of such fluorescently marked lipids is expected to give additional information about the lipid distribution at the air-water interface.
The Langmuir film balance was used to investigate the monolayer properties of four of the synthesized thiolated anchor lipids. The packing density and mixing behaviour were examined. The results showed that mixing anchor lipids with free lipids can homogeneously dilute the anchor lipid monolayers. Moreover, an increase in the hydrophilicity (PEG chain length) of the anchor lipids leads to a higher packing density; a decrease in temperature results in a similar trend. However, increasing the number of phytanyl chains per lipid molecule was shown to decrease the packing density. LB monolayers based on pure and mixed lipids at different ratios and transfer pressures were tested to form tBLMs with diluted inner layers. A combination of LB-monolayer transfer with the solvent-exchange method successfully accomplished the formation of tBLMs based on pure DPOT. Some preliminary investigations of the electrical sealing properties and protein incorporation of self-assembled DPOT- and DDPTT-based tBLMs were conducted. The bilayer formation performed by solvent exchange resulted in membranes with high resistances and low capacitances. The appearance of space beneath the membrane is clearly visible in the impedance spectra, expressed as a second RC element. This leads to the conclusion that the longer spacer in DPOT, and the larger lateral space between the DDPTT molecules, essentially influence the electrical parameters of the membrane in the investigated systems. Finally, we could show the functional incorporation of the small ion carrier valinomycin in both types of membranes.

Relevance:

30.00%

Publisher:

Abstract:

Regional flood frequency techniques are commonly used to estimate flood quantiles when flood data is unavailable or the record length at an individual gauging station is insufficient for reliable analyses. These methods compensate for limited or unavailable data by pooling data from nearby gauged sites. This requires the delineation of hydrologically homogeneous regions in which the flood regime is sufficiently similar to allow the spatial transfer of information. It is generally accepted that hydrologic similarity results from similar physiographic characteristics, and thus these characteristics can be used to delineate regions and classify ungauged sites. However, as currently practiced, the delineation is highly subjective and dependent on the similarity measures and classification techniques employed. A standardized procedure for delineation of hydrologically homogeneous regions is presented herein. Key aspects are a new statistical metric to identify physically discordant sites, and the identification of an appropriate set of physically based measures of extreme hydrological similarity. A combination of multivariate statistical techniques applied to multiple flood statistics and basin characteristics for gauging stations in the Southeastern U.S. revealed that basin slope, elevation, and soil drainage largely determine the extreme hydrological behavior of a watershed. Use of these characteristics as similarity measures in the standardized approach for region delineation yields regions which are more homogeneous and more efficient for quantile estimation at ungauged sites than those delineated using alternative physically-based procedures typically employed in practice. The proposed methods and key physical characteristics are also shown to be efficient for region delineation and quantile development in alternative areas composed of watersheds with statistically different physical composition. 
In addition, the use of aggregated values of key watershed characteristics was found to be sufficient for the regionalization of flood data; the added time and computational effort required to derive spatially distributed watershed variables does not increase the accuracy of quantile estimators for ungauged sites. This dissertation also presents a methodology by which flood quantile estimates in Haiti can be derived using relationships developed for data rich regions of the U.S. As currently practiced, regional flood frequency techniques can only be applied within the predefined area used for model development. However, results presented herein demonstrate that the regional flood distribution can successfully be extrapolated to areas of similar physical composition located beyond the extent of that used for model development provided differences in precipitation are accounted for and the site in question can be appropriately classified within a delineated region.
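The "new statistical metric to identify physically discordant sites" is in the spirit of the classical Hosking-Wallis discordancy measure; a sketch of that measure, applied here to a matrix of site characteristics rather than L-moment ratios (the data layout and function name are our own):

```python
import numpy as np

def discordancy(u):
    """Discordancy statistic D_i for each site: a Mahalanobis-type
    distance of the site's characteristics vector from the regional
    mean, D_i = (n/3) * (u_i - ubar)^T A^{-1} (u_i - ubar), where
    A is the sum of outer products of the deviations."""
    u = np.asarray(u, dtype=float)      # shape (n_sites, n_features)
    n = u.shape[0]
    ubar = u.mean(axis=0)
    A = sum(np.outer(ui - ubar, ui - ubar) for ui in u)
    Ainv = np.linalg.inv(A)
    return np.array([n / 3.0 * (ui - ubar) @ Ainv @ (ui - ubar) for ui in u])
```

Sites with large D_i stand apart from the rest of the candidate region and are flagged for reassignment before quantile estimation; the statistics sum to n*p/3 by construction, so a fixed critical value can be applied.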