912 results for Dwarf Galaxy Fornax Distribution Function Action Based
Abstract:
This population-based cross-sectional study of 403 rural settlers in Brazilian Amazonia revealed an overall rate of IgG seropositivity to Toxocara canis excretory-secretory larval antigen of 26.8% (95% confidence interval [CI], 22.5-31.4%). Multilevel logistic regression analysis identified current hookworm infection (odds ratio [OR], 2.32; 95% CI, 1.11-4.86) and residence in the most recently occupied sectors of the settlement (OR, 1.81; 95% CI, 1.30-2.52) as significant risk factors for Toxocara seropositivity; age > 14 years (OR, 0.46; 95% CI, 0.28-0.73) and the presence of cats in the household (OR, 0.57; 95% CI, 0.32-1.02) appeared to be protective. Two significant high-prevalence clusters were detected in the area, together comprising 38.9% of the seropositive subjects; households in the clusters had slightly lower socioeconomic status and were less likely to have cats as pets. The obstacles to controlling human toxocariasis in this and other tropical rural settings are discussed.
Abstract:
IgG antibodies to Toxoplasma gondii were detected, in March-April 2004, in 65.8% (95% confidence interval, 60.8-70.8%) of 342 systematically sampled subjects 5-90 years of age (87.5% of those eligible) living in a rural settlement in Amazonia, with a seroconversion rate of 9% over 1 year of follow-up of 99 seronegative subjects. Multiple logistic regression analysis identified age as the only significant independent predictor of seropositivity at baseline. Each additional year of age increases the odds of being seropositive by 6%, and 76.8% of subjects are expected to be seropositive at 30 years of age. A single high-prevalence spatial cluster, comprising 11.9% of the seropositive subjects, was detected in the area; households in the cluster were less likely to have dogs as pets, and their heads had a lower educational level, than households located outside the cluster. The challenges of preventing human toxoplasmosis in tropical rural settings are discussed.
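As a consistency check on the two figures quoted above, a simple logistic model in age reproduces both; the intercept below is back-solved from the reported numbers and is an illustrative reconstruction, not a parameter reported by the study:

```latex
% Hypothetical single-predictor logistic model (reconstruction):
\operatorname{logit} p(a) = \beta_0 + \beta_1 a, \qquad
e^{\beta_1} = 1.06 \ \Rightarrow\ \beta_1 = \ln 1.06 \approx 0.058 ;
% back-solving the intercept from p(30) = 0.768:
\beta_0 = \operatorname{logit}(0.768) - 30\,\beta_1
        \approx 1.197 - 1.748 = -0.551 .
```

Thus a 6% per-year increase in the odds and a 76.8% seroprevalence at age 30 are mutually consistent under a single logistic curve.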
Abstract:
The evolution of commodity computing led to the possibility of efficiently using interconnected machines to solve computationally intensive tasks that were previously solvable only on expensive supercomputers. This, however, required new methods for process scheduling and distribution that consider network latency, communication cost, heterogeneous environments, and distributed computing constraints. Efficient distribution of processes over such environments requires an adequate scheduling strategy, as the cost of inefficient process allocation is unacceptably high. Knowledge and prediction of application behavior are therefore essential for effective scheduling. In this paper, we review the evolution of scheduling approaches, focusing on distributed environments. We also evaluate current approaches to process behavior extraction and prediction, aiming to select an adequate technique for online prediction of application execution. Based on this evaluation, we propose a novel model for application behavior prediction that considers the chaotic properties of such behavior and automatically detects critical execution points. The proposed model is applied and evaluated for process scheduling in cluster and grid computing environments. The results demonstrate that predicting process behavior is essential for efficient scheduling in large-scale, heterogeneous distributed environments, outperforming conventional scheduling policies by a factor of 10, and even more in some cases. Furthermore, the proposed approach proves efficient for online prediction owing to its low computational cost and good precision. (C) 2009 Elsevier B.V. All rights reserved.
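The abstract does not give the prediction model itself; the sketch below only illustrates the general idea of online behavior prediction for scheduling, using a delay-embedding nearest-neighbour forecaster (a standard device for short-term prediction of chaotic series). All names and the scheduling rule are hypothetical, not taken from the paper.

```python
import numpy as np

def predict_next(series, m=3, k=2):
    """Forecast the next value of a (possibly chaotic) load series by
    nearest-neighbour lookup in a delay-embedded state space: find the
    k past windows most similar to the current one and average the
    values that followed them."""
    x = np.asarray(series, dtype=float)
    if len(x) <= m:
        return x[-1]                       # not enough history yet
    states = np.array([x[i:i + m] for i in range(len(x) - m)])
    successors = x[m:]                     # value observed after each state
    dists = np.linalg.norm(states - x[-m:], axis=1)
    nearest = np.argsort(dists)[:k]        # k most similar past states
    return successors[nearest].mean()

def pick_host(load_histories):
    """Place the next process on the host with the lowest predicted load."""
    forecasts = {h: predict_next(s) for h, s in load_histories.items()}
    return min(forecasts, key=forecasts.get)

# Toy usage with synthetic per-host load histories.
histories = {"node-a": [0.2, 0.8, 0.3, 0.9, 0.2, 0.8],
             "node-b": [0.5, 0.5, 0.6, 0.5, 0.6, 0.5],
             "node-c": [0.9, 0.9, 0.8, 0.9, 0.9, 0.9]}
print(pick_host(histories))
```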
Abstract:
In this paper, we propose a new two-parameter lifetime distribution with increasing failure rate, the complementary exponential geometric distribution, which is complementary to the exponential geometric model proposed by Adamidis and Loukas (1998). The new distribution arises in a latent complementary risks scenario, in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulas for its reliability and failure rate functions, moments (including the mean and variance), coefficient of variation, and modal value. Parameter estimation is based on the usual maximum likelihood approach. We report the results of a misspecification simulation study performed to assess the extent of misspecification errors when testing the exponential geometric distribution against our complementary one under different sample sizes and censoring percentages. The methodology is illustrated on four real datasets, and we compare the two modeling approaches. (C) 2011 Elsevier B.V. All rights reserved.
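A sketch of the latent complementary-risks construction described above, with θ denoting the geometric parameter and λ the exponential rate (symbol names assumed here): taking Z ~ Geometric(θ) latent risks with i.i.d. Exp(λ) lifetimes and observing the maximum gives

```latex
% Z ~ Geometric(theta), X_1, X_2, ... iid Exp(lambda),
% observed lifetime Y = max(X_1, ..., X_Z):
F(y) = \sum_{z=1}^{\infty} \theta (1-\theta)^{z-1}
       \left(1 - e^{-\lambda y}\right)^{z}
     = \frac{\theta \left(1 - e^{-\lambda y}\right)}
            {1 - (1-\theta)\left(1 - e^{-\lambda y}\right)},
\qquad
f(y) = \frac{\theta \lambda e^{-\lambda y}}
            {\left[\,1 - (1-\theta)\left(1 - e^{-\lambda y}\right)\right]^{2}} .
```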
Abstract:
The University of Notre Dame, USA (Becchetti et al., Nucl. Instrum. Methods Res. A505, 377 (2003)) and later the University of Sao Paulo, Brazil (Lichtenthaler et al., Eur. Phys. J. A25, S-01, 733 (2005)) adopted a system based on superconducting solenoids to produce low-energy radioactive nuclear beams. In these systems the solenoids act as thick lenses to collect, select, and focus the secondary beam into a scattering chamber. Many experiments with radioactive light-particle beams (RNBs) such as (6)He, (7)Be, (8)Li, and (8)B have been performed at these two facilities. These low-energy RNBs have been used to investigate low-energy reactions such as elastic scattering, transfer, and breakup, providing useful information on the structure of light nuclei near the drip line and on astrophysics. Total reaction cross sections, derived from elastic scattering analysis, have also been investigated for light systems as a function of energy, and the role of breakup of weakly bound or exotic nuclei is discussed.
Abstract:
Inhibition of microtubule function is an attractive rational approach to anticancer therapy. Although taxanes are the most prominent among the microtubule stabilizers, their clinical toxicity, poor pharmacokinetic properties, and resistance have stimulated the search for new antitumor agents with the same mechanism of action. Discodermolide is an example of a nontaxane natural product with the same mechanism of action that demonstrates superior antitumor efficacy and therapeutic index. These extraordinary chemical and biological properties have qualified discodermolide as a lead structure for the design of novel anticancer agents with optimized therapeutic properties. In the present work, we employed a specialized fragment-based method to develop robust quantitative structure-activity relationship models for a series of synthetic discodermolide analogs. The generated molecular recognition patterns were combined with three-dimensional molecular modeling studies as a fundamental step toward understanding the molecular basis of drug-receptor interactions within this important series of potent antitumor agents.
Abstract:
Architectures based on Coordinated Atomic action (CA action) concepts have been used to build concurrent fault-tolerant systems. This conceptual model combines concurrent exception handling with action nesting to provide a general mechanism both for enclosing interactions among system components and for coordinating forward error recovery measures. This article presents an architectural model to guide the formal specification of concurrent fault-tolerant systems. The architecture provides built-in Communicating Sequential Processes (CSP) components and predefined channels to coordinate exception handling of the user-defined components. Hence, safety properties concerning action scoping and concurrent exception handling can be proved using the FDR (Failures-Divergences Refinement) verification tool. As a result, a formal and general architecture supporting software fault tolerance is ready to be used, and to be proved, as users define components with normal and exceptional behaviors. (C) 2010 Elsevier B.V. All rights reserved.
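As an informal illustration of the CA-action idea (not the paper's CSP architecture), the sketch below coordinates exception handling among concurrent participants: all participants synchronise on exit, and if any of them raised, every participant runs its own handler. Class and function names are invented for the example.

```python
import threading

class AtomicAction:
    """Participants run concurrently, synchronise on exit, and if any of
    them raised an exception, every participant runs its own handler
    before the action completes (coordinated forward error recovery)."""

    def __init__(self, n_participants):
        self.exit_barrier = threading.Barrier(n_participants)
        self.failed = threading.Event()
        self.errors = []

    def _run(self, body, handler):
        try:
            body()
        except Exception as exc:
            self.errors.append(exc)
            self.failed.set()              # signal the whole action
        self.exit_barrier.wait()           # synchronised action exit
        if self.failed.is_set():
            handler()                      # coordinated recovery

    def execute(self, participants):
        threads = [threading.Thread(target=self._run, args=(body, handler))
                   for body, handler in participants]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return not self.failed.is_set()    # did the action succeed?

# Toy usage: the second participant fails, so both handlers run.
def ok_body(): pass
def faulty_body(): raise RuntimeError("fault")

action = AtomicAction(2)
ok = action.execute([(ok_body, lambda: print("p1 recovers")),
                     (faulty_body, lambda: print("p2 recovers"))])
print("action succeeded:", ok)             # False
```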
Abstract:
This paper considers an extension of the skew-normal model through the inclusion of an additional parameter that can lead to both uni- and bimodal distributions. The paper presents various basic properties of this family of distributions and provides a stochastic representation that is useful for obtaining theoretical properties and for simulating from the distribution. Moreover, the singularity of the Fisher information matrix is investigated, and maximum likelihood estimation for a random sample with no covariates is considered. The main motivation is thus to avoid using mixtures to fit bimodal data, as mixtures are well known to be complicated to deal with, particularly because of identifiability problems. Data-based illustrations show that such a model can be useful. Copyright (C) 2009 John Wiley & Sons, Ltd.
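The abstract does not specify the extra parameter's form; for reference, the stochastic representation of the base skew-normal law SN(α), on which such constructions and their simulation typically rest, is

```latex
% Stochastic representation of the base skew-normal SN(alpha):
Z = \delta\,|U_0| + \sqrt{1-\delta^{2}}\,U_1, \qquad
U_0, U_1 \ \text{i.i.d.}\ N(0,1), \qquad
\delta = \alpha/\sqrt{1+\alpha^{2}} ,
```

which makes simulation immediate: draw two independent standard normals and combine them.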
Abstract:
Scale mixtures of the skew-normal (SMSN) distribution form a class of asymmetric thick-tailed distributions that includes the skew-normal (SN) distribution as a special case. The main advantage of this class of distributions is that its members are easy to simulate from and have a nice hierarchical representation that facilitates implementation of the expectation-maximization algorithm for maximum likelihood estimation. In this paper, we assume an SMSN distribution for the unobserved value of the covariates and a symmetric scale mixture of the normal distribution for the error term of the model. This provides a robust alternative for parameter estimation in multivariate measurement error models. Specific distributions examined include univariate and multivariate versions of the SN, skew-t, skew-slash, and skew-contaminated normal distributions. The results and methods are applied to a real data set.
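A minimal sketch of the hierarchical representation the abstract refers to, for the skew-t member of the SMSN class; parameter names and the NumPy implementation are assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def rskewnormal(n, alpha, rng):
    """Draws from the standard skew-normal SN(alpha) via the usual
    stochastic representation Z = delta*|U0| + sqrt(1 - delta^2)*U1."""
    delta = alpha / np.sqrt(1.0 + alpha**2)
    u0, u1 = rng.standard_normal(n), rng.standard_normal(n)
    return delta * np.abs(u0) + np.sqrt(1.0 - delta**2) * u1

def rskewt(n, mu, sigma, alpha, nu, rng):
    """Skew-t draws through the SMSN hierarchy: Y = mu + sigma * Z / sqrt(U)
    with U ~ Gamma(nu/2, rate nu/2) and Z ~ SN(alpha)."""
    u = rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=n)
    z = rskewnormal(n, alpha, rng)
    return mu + sigma * z / np.sqrt(u)

y = rskewt(10_000, mu=0.0, sigma=1.0, alpha=3.0, nu=4.0, rng=rng)
print(y.mean(), y.std())
```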
Abstract:
The modeling and analysis of lifetime data is an important aspect of statistical work in a wide variety of scientific and technological fields. Good (1953) introduced a probability distribution which is commonly used in the analysis of lifetime data. For the first time, based on this distribution, we propose the so-called exponentiated generalized inverse Gaussian distribution, which extends the exponentiated standard gamma distribution (Nadarajah and Kotz, 2006). Various structural properties of the new distribution are derived, including expansions for its moments, moment generating function, moments of the order statistics, and so forth. We discuss maximum likelihood estimation of the model parameters. The usefulness of the new model is illustrated by means of a real data set. (C) 2010 Elsevier B.V. All rights reserved.
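A plausible reading of the construction, assuming the usual generalized inverse Gaussian parameterisation and the exponentiated-G device of Nadarajah and Kotz (2006); the paper's own notation may differ:

```latex
% GIG baseline (common parameterisation), with K_lambda the modified
% Bessel function of the third kind:
g(x) = \frac{(\psi/\chi)^{\lambda/2}}{2 K_{\lambda}(\sqrt{\psi\chi})}\,
       x^{\lambda-1}
       \exp\!\left\{-\tfrac{1}{2}\left(\chi x^{-1} + \psi x\right)\right\},
       \qquad x > 0,
% exponentiated-G construction:
F(x) = G(x)^{\alpha}, \qquad
f(x) = \alpha\, g(x)\, G(x)^{\alpha-1}, \qquad \alpha > 0 .
```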
Abstract:
Birnbaum and Saunders (1969a) introduced a probability distribution which is commonly used in reliability studies. For the first time, based on this distribution, the so-called beta-Birnbaum-Saunders distribution is proposed for fatigue life modeling. Various properties of the new model, including expansions for the moments, moment generating function, mean deviations, and the density function of the order statistics and their moments, are derived. We discuss maximum likelihood estimation of the model's parameters. The superiority of the new model is illustrated by means of three real failure data sets. (C) 2010 Elsevier B.V. All rights reserved.
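For reference, the standard beta-G construction applied to the Birnbaum-Saunders baseline takes the following form (notation assumed; a, b > 0 are the added beta shape parameters):

```latex
% Beta-G construction with Birnbaum-Saunders baseline G:
F(t) = I_{G(t)}(a,b)
     = \frac{1}{B(a,b)} \int_{0}^{G(t)} w^{a-1} (1-w)^{b-1}\, dw,
\qquad
G(t) = \Phi\!\left[\frac{1}{\alpha}
       \left(\sqrt{t/\beta} - \sqrt{\beta/t}\right)\right],
\quad t > 0 .
```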
Abstract:
The Laplace distribution is one of the earliest distributions in probability theory. For the first time, based on this distribution, we propose the so-called beta Laplace distribution, which extends the Laplace distribution. Various structural properties of the new distribution are derived, including expansions for its moments, moment generating function, moments of the order statistics, and so forth. We discuss maximum likelihood estimation of the model parameters and derive the observed information matrix. The usefulness of the new model is illustrated by means of a real data set. (C) 2011 Elsevier B.V. All rights reserved.
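Under the same beta-G reading (an assumption here, though it matches the construction in the related abstracts above), the beta Laplace cdf is the regularised incomplete beta function evaluated at the Laplace cdf, which is a one-liner with SciPy; parameter names are illustrative:

```python
import numpy as np
from scipy.special import betainc
from scipy.stats import laplace

def beta_laplace_cdf(x, a, b, loc=0.0, scale=1.0):
    """Beta-G cdf with a Laplace baseline: F(x) = I_{G(x)}(a, b),
    the regularised incomplete beta function at the Laplace cdf."""
    return betainc(a, b, laplace.cdf(x, loc=loc, scale=scale))

x = np.linspace(-5, 5, 5)
print(beta_laplace_cdf(x, a=2.0, b=0.5))
```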
Abstract:
The aim of this paper is to evaluate the performance of two divergent methods for delineating commuting regions, also called labour market areas, in a situation where the base spatial units differ greatly in size as a result of an irregular population distribution. Commuting patterns in Sweden have been analyzed with geographical information system technology by delineating commuting regions using two regionalization methods. One, a rule-based method, uses one-way commuting flows to delineate local labour market areas in a top-down procedure based on the selection of predefined employment centres. The other, the interaction-based Intramax analysis, uses two-way flows in a bottom-up procedure based on numerical taxonomy principles. A comparison of these methods exposes a number of strengths and weaknesses. The same data source has been used for both methods. The performance of both methods has been evaluated for the country as a whole using resident employed population, self-containment levels, and job ratios as criteria. A more detailed evaluation has been carried out for the Goteborg metropolitan area by comparing regional patterns with the commuting fields of a number of urban centres in this area. It is concluded that both methods could benefit from the inclusion of additional control measures to identify improper allocations of municipalities.
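The abstract does not reproduce the Intramax objective; the sketch below shows only the general bottom-up flavour, greedily merging the pair of units with the strongest size-normalised two-way flow. The normalisation used here is a simplification of the actual Intramax criterion (which normalises by expected interaction), and all names are invented for the example.

```python
import numpy as np

def greedy_flow_regions(T, n_regions):
    """Bottom-up aggregation in the spirit of Intramax: repeatedly merge
    the pair of regions with the strongest two-way flow relative to the
    product of the regions' total flows."""
    groups = [{i} for i in range(len(T))]
    T = np.array(T, dtype=float)
    while len(groups) > n_regions:
        totals = T.sum(axis=1) + T.sum(axis=0)
        best, pair = -1.0, None
        for i in range(len(T)):
            for j in range(i + 1, len(T)):
                score = (T[i, j] + T[j, i]) / (totals[i] * totals[j])
                if score > best:
                    best, pair = score, (i, j)
        i, j = pair
        # Merge unit j into unit i: add flows, drop row/column j.
        T[i, :] += T[j, :]
        T[:, i] += T[:, j]
        T = np.delete(np.delete(T, j, axis=0), j, axis=1)
        groups[i] |= groups[j]
        groups.pop(j)
    return groups

# Toy commuting matrix for 4 units (T[i][j] = flow from i to j).
T = [[0, 50, 5, 2], [40, 0, 3, 1], [4, 2, 0, 30], [1, 3, 25, 0]]
print(greedy_flow_regions(T, 2))   # expect {0, 1} and {2, 3}
```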
Abstract:
Hydrological loss is a vital component of many hydrological models, which are used in forecasting floods and evaluating water resources for both surface and subsurface flows. Owing to the complex and random nature of the rainfall-runoff process, hydrological losses are not yet fully understood. Consequently, practitioners often use representative loss values in design applications such as rainfall-runoff modelling, which has led to inaccurate quantification of water quantities in the resulting applications. Existing hydrological loss models must be revisited, and modellers should be encouraged to utilise other available data sets. This study is based on three unregulated catchments situated in the Mt. Lofty Ranges of South Australia (SA). The paper focuses on conceptual models relating initial loss (IL), continuing loss (CL), and proportional loss (PL) to rainfall characteristics (total rainfall (TR) and storm duration (D)) and antecedent wetness (AW) conditions. The paper introduces two methods that can be implemented to estimate IL as a function of TR, D, and AW. The IL distribution patterns and parameters for the study catchments are determined using multivariate analysis and descriptive statistics. The possibility of generalising the methods, and its limitations, are also discussed. This study will yield improvements to existing loss models and will encourage practitioners to utilise multiple data sets to estimate losses, instead of using hypothetical or representative values to generalise real situations.
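The two IL-estimation methods are not specified in the abstract; as a hypothetical illustration of expressing IL as a function of TR, D, and AW, a plain least-squares fit might look as follows (variable names follow the abstract's abbreviations; the linear form and the synthetic data are assumptions):

```python
import numpy as np

def fit_initial_loss(TR, D, AW, IL):
    """Fit a simple linear model IL = b0 + b1*TR + b2*D + b3*AW by
    ordinary least squares."""
    X = np.column_stack([np.ones_like(TR), TR, D, AW])
    coef, *_ = np.linalg.lstsq(X, IL, rcond=None)
    return coef

def predict_initial_loss(coef, TR, D, AW):
    return coef[0] + coef[1] * TR + coef[2] * D + coef[3] * AW

# Synthetic event data: total rainfall (mm), duration (h), wetness index.
rng = np.random.default_rng(1)
TR = rng.uniform(10, 80, 50)
D = rng.uniform(1, 24, 50)
AW = rng.uniform(0, 1, 50)
IL = 5 + 0.1 * TR + 0.2 * D - 4 * AW + rng.normal(0, 0.5, 50)

coef = fit_initial_loss(TR, D, AW, IL)
print(coef)                                   # roughly [5, 0.1, 0.2, -4]
print(predict_initial_loss(coef, 40.0, 6.0, 0.3))
```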
Abstract:
This paper analyzes the distribution of money holdings in a commodity-money search-based model with intermediation. Introducing heterogeneity of costs into the Kiyotaki and Wright (1989) model, Cavalcanti and Puzzello (2010) give rise to a non-degenerate distribution of money. We extend this model further by introducing intermediation in the trading process. We show that the distribution of money matters for savings decisions. This gives rise to a fixed-point problem for the saving function that makes finding the optimal solution difficult. Through some examples, we show that this friction shrinks the distribution of money. In contrast to the Cavalcanti and Puzzello (2010) model, the optimal solution may not award the entire surplus to the consumer. At the end of the paper, we present a strong result: for a sufficiently large number of intermediaries, the distribution of money is degenerate.
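The saving-function operator is not given in the abstract; the sketch below only illustrates the generic numerical attack on such a fixed-point problem, successive approximation on a grid of money holdings, with a toy contraction standing in for the model's operator:

```python
import numpy as np

def solve_fixed_point(T, s0, tol=1e-10, max_iter=10_000):
    """Solve s = T(s) by successive approximation, assuming T is a
    contraction mapping on functions defined on a grid."""
    s = np.asarray(s0, dtype=float)
    for _ in range(max_iter):
        s_new = T(s)
        if np.max(np.abs(s_new - s)) < tol:
            return s_new
        s = s_new
    raise RuntimeError("no convergence")

# Toy contraction on a grid of money holdings m in [0, 1]:
m = np.linspace(0.0, 1.0, 101)
T = lambda s: 0.5 * s + 0.25 * m           # fixed point: s* = 0.5 * m
s_star = solve_fixed_point(T, np.zeros_like(m))
print(np.allclose(s_star, 0.5 * m))        # True
```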