629 results for "binomial lattice"
Abstract:
One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which there are components which are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies, and ecological abundance studies. Devices such as nonzero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second how the unit available is distributed among the non-zero parts. In this paper we suggest two such models, an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the question of estimability of parameters, the nature of the computational process for the estimation of both the incidence and compositional parameters caused by the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential zero-compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
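As a rough illustration of the two-stage construction, the sketch below (all parameters hypothetical) draws the incidence pattern from independent Bernoulli trials in stage 1 and then distributes the unit among the non-zero parts in stage 2; the stage-2 draw is simplified to exponentiated independent normals, not the paper's full conditional logistic normal model.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_zero_composition(p_present, mu, sigma, n):
    """Stage 1: independent Bernoulli incidence for each of the D parts.
    Stage 2: distribute the unit among the present parts via exponentiated
    independent normals (a simplified logistic-normal-type draw)."""
    D = len(p_present)
    X = np.zeros((n, D))
    for i in range(n):
        present = rng.random(D) < p_present   # which parts are non-zero
        if not present.any():
            continue                          # all-zero row stays zero
        w = np.exp(rng.normal(mu[present], sigma[present]))
        X[i, present] = w / w.sum()           # unit sum over the non-zero parts
    return X

comps = sample_zero_composition(
    p_present=np.array([0.9, 0.6, 0.8, 0.3]),  # hypothetical incidence probabilities
    mu=np.zeros(4), sigma=np.ones(4), n=5)
print(comps.round(3))
```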
Abstract:
Career choice is an important decision, and it is generally made in an environment full of uncertainty by relatively young and inexperienced people. This study analyses the decision between a career in a private company and one in a public agency, considering that there is flexibility to migrate from the private to the public sector by passing a civil-service examination. To this end, the real options methodology was used to model this switching option, assuming that future earnings in the private sector are uncertain. The results suggest that the option to join the public-sector career can have significant value relative to the private one.
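The abstract does not spell out its valuation scheme, but a standard way to price a switching option of this kind is a Cox-Ross-Rubinstein binomial lattice (matching the "binomial lattice" search term). The sketch below, with entirely invented parameters, values a private-sector career of uncertain present value V0 carrying the option to switch irreversibly, at any step, to a public-sector career of certain value K.

```python
import numpy as np

def career_with_switch_option(V0, K, r, sigma, T, n):
    """Cox-Ross-Rubinstein lattice sketch: the private career value follows
    the lattice; at any node the holder may switch to the public career
    worth K. Returns the private career value including the switch option."""
    dt = T / n
    u = np.exp(sigma * np.sqrt(dt))        # up factor per step
    d = 1.0 / u                            # down factor
    q = (np.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
    disc = np.exp(-r * dt)
    V = V0 * u ** np.arange(n, -1, -1) * d ** np.arange(n + 1)
    W = np.maximum(V, K)                   # at the horizon, pick the better career
    for _ in range(n):                     # backward induction with early switch
        W = np.maximum(K, disc * (q * W[:-1] + (1 - q) * W[1:]))
    return W[0]

total = career_with_switch_option(V0=100.0, K=90.0, r=0.05, sigma=0.3, T=10.0, n=200)
print(f"career value {total:.2f}; switch option worth about {total - 100.0:.2f}")
```

Within this toy model the value without the option equals V0, so the excess of the lattice value over V0 is the value of the flexibility to move to the public sector.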
Abstract:
Optimal robust M-estimates of a multidimensional parameter are described using Hampel's infinitesimal approach. The optimal estimates are derived by minimizing a measure of efficiency under the model, subject to a bounded measure of infinitesimal robustness. For this purpose we define measures of efficiency and infinitesimal sensitivity based on the Hellinger distance. We show that these two measures coincide with similar ones defined by Yohai using the Kullback-Leibler divergence, and therefore the corresponding optimal estimates coincide too. We also give an example where we fit a negative binomial distribution to a real dataset of "days of stay in hospital" using the optimal robust estimates.
Abstract:
Background: Age is frequently discussed as a negative host factor for achieving a sustained virological response (SVR) to antiviral hepatitis C therapy. However, elderly patients often show relevant fibrosis or cirrhosis, which is a known negative predictive factor, making it difficult to interpret age as an independent predictive factor. Methods: Within the framework of the Swiss Hepatitis C Cohort Study (SCCS), we collected data from 545 antiviral hepatitis C therapies, including data from 67 hepatitis C patients ≥ 60 y who had been treated with PEG-interferon and ribavirin. We analyzed host factors (age, gender, fibrosis, haemoglobin, depression, earlier hepatitis C treatment), viral factors (genotype, viral load) and treatment course (early virological response, end-of-treatment response, SVR). Generalised estimating equations (GEE) regression modelling was used for the primary end point (SVR), with age ≥ 60 y vs. < 60 y as the independent variable and gender, presence of cirrhosis, genotype, earlier treatment and viral load as confounders. SVR was analysed in young and elderly patients after matching for these confounders. Additionally, classification tree analysis was done in elderly patients using these confounders. Results: SVR across all 545 patients was 55%. In genotype 1/4, SVR was 42.9% in 259 patients < 60 y and 26.1% in 46 patients ≥ 60 y. In genotype 2/3, SVR was 74.4% in 215 patients < 60 y and 84% in 25 patients ≥ 60 y. However, the GEE model showed that age had no influence on achieving SVR (odds ratio 0.91). Confounders influenced SVR as known from previous studies (cirrhosis, genotype 1/4, previous treatment and viral load > 600,000 IU/ml as negative predictive factors). When young and elderly patients were matched (analysis in 59 elderly patients), SVR did not differ between the two groups (54.2% and 55.9%, respectively; p = 0.795 in a binomial test). The best classification-tree criterion for SVR in elderly patients was genotype, with no further criteria relevant for predicting SVR in genotype 2/3. In patients with genotype 1/4, further criteria were the presence of cirrhosis and, in non-cirrhotic patients, low viral load < 600,000 IU/ml. Conclusions: Age is not a relevant predictive factor for achieving SVR once confounders are taken into account. In terms of effectiveness of antiviral therapy, age does not play a major role and should not be regarded as a relevant negative predictive factor. Since life expectancy in Switzerland at age 60 is more than 22 y, hepatitis C therapy is reasonable in elderly patients with relevant fibrosis or cirrhosis, because interferon-based hepatitis C therapy improves survival and reduces carcinogenesis.
Abstract:
Background: With increasing computer power, simulating the dynamics of complex systems in chemistry and biology is becoming increasingly routine. The modelling of individual reactions in (bio)chemical systems involves a large number of random events that can be simulated by the stochastic simulation algorithm (SSA). The key quantity is the step size, or waiting time, τ, whose value inversely depends on the size of the propensities of the different channel reactions and which needs to be re-evaluated after every firing event. Such a discrete event simulation may be extremely expensive, in particular for stiff systems where τ can be very short due to the fast kinetics of some of the channel reactions. Several alternative methods have been put forward to increase the integration step size. The so-called τ-leap approach takes a larger step size by allowing all the reactions to fire, from a Poisson or binomial distribution, within that step. Although the expected value for the different species in the reactive system is maintained with respect to more precise methods, the variance at steady state can suffer from large errors as τ grows. Results: In this paper we extend Poisson τ-leap methods to a general class of Runge-Kutta (RK) τ-leap methods. We show that with the proper selection of the coefficients, the variance of the extended τ-leap can be well behaved, leading to significantly larger step sizes. Conclusions: The benefit of adapting the extended method to the use of RK frameworks is clear in terms of speed of calculation, as the number of evaluations of the Poisson distribution is still one set per time step, as in the original τ-leap method. The approach paves the way to explore new multiscale methods to simulate (bio)chemical systems.
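A minimal Poisson τ-leap sketch for a hypothetical two-reaction birth-death network (production at rate k_prod, degradation at rate k_deg·A); each channel fires a Poisson(a_j·τ) number of times per leap, which is the baseline that the paper's Runge-Kutta extension builds on.

```python
import numpy as np

rng = np.random.default_rng(1)

def poisson_tau_leap(x0, k_prod, k_deg, tau, t_end):
    """Plain Poisson tau-leap for the toy network 0 -> A (rate k_prod)
    and A -> 0 (rate k_deg * x). Each reaction fires Poisson(a * tau)
    times within each fixed leap of length tau."""
    t, x, traj = 0.0, x0, [(0.0, x0)]
    while t < t_end:
        a = np.array([k_prod, k_deg * x])   # channel propensities
        fires = rng.poisson(a * tau)        # firings of each channel this leap
        x = max(x + fires[0] - fires[1], 0) # clamp to avoid negative counts
        t += tau
        traj.append((t, x))
    return traj

traj = poisson_tau_leap(x0=10, k_prod=5.0, k_deg=0.1, tau=0.1, t_end=50.0)
print(traj[-1])   # should fluctuate around the steady state k_prod / k_deg = 50
```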
Abstract:
General Introduction. This thesis can be divided into two main parts: the first, corresponding to the first three chapters, studies Rules of Origin (RoOs) in Preferential Trade Agreements (PTAs); the second part, the fourth chapter, is concerned with Anti-Dumping (AD) measures. Despite wide-ranging preferential access granted to developing countries by industrial ones under North-South Trade Agreements (whether reciprocal, like the Europe Agreements (EAs) or NAFTA, or not, such as the GSP, AGOA, or EBA), it has been claimed that the benefits from improved market access keep falling short of the full potential benefits. RoOs are largely regarded as a primary cause of the under-utilization of the improved market access offered by PTAs. RoOs are the rules that determine the eligibility of goods for preferential treatment. Their economic justification is to prevent trade deflection, i.e. to prevent non-preferred exporters from using the tariff preferences. However, they are complex, cost-raising and cumbersome, and can be manipulated by organised special interest groups. As a result, RoOs can restrain trade beyond what is needed to prevent trade deflection and hence restrict market access to a statistically significant and quantitatively large extent. Part I. In order to further our understanding of the effects of RoOs in PTAs, the first chapter, written with Prof. Olivier Cadot, Céline Carrère and Prof. Jaime de Melo, describes and evaluates the RoOs governing EU and US PTAs. It draws on utilization-rate data for Mexican exports to the US in 2001 and on similar data for ACP exports to the EU in 2002. The paper makes two contributions. First, we construct an R-index of restrictiveness of RoOs along the lines first proposed by Estevadeordal (2000) for NAFTA, modifying it and extending it for the EU's single list (SL). This synthetic R-index is then used to compare RoOs under NAFTA and PANEURO. The two main findings of the chapter are as follows. First, it shows, in the case of PANEURO, that the R-index is useful for summarizing how countries are differently affected by the same set of RoOs because of their different export baskets to the EU. Second, it is shown that the R-index is a relatively reliable statistic in the sense that, subject to caveats, after controlling for the extent of tariff preference at the tariff-line level, it accounts for differences in utilization rates at the tariff-line level. Finally, together with utilization rates, the index can be used to estimate the total compliance costs of RoOs. The second chapter proposes a reform of preferential RoOs with the aim of making them more transparent and less discriminatory. Such a reform would make preferential blocs more "cross-compatible" and would therefore facilitate cumulation. It would also help move regionalism toward more openness and hence make it more compatible with the multilateral trading system. It focuses on NAFTA, one of the most restrictive FTAs (see Estevadeordal and Suominen 2006), and proposes a way forward that is close in spirit to what the EU Commission is considering for the PANEURO system. In a nutshell, the idea is to replace the current array of RoOs by a single instrument: Maximum Foreign Content (MFC). An MFC is a conceptually clear and transparent instrument, like a tariff. Therefore changing all instruments into an MFC would bring improved transparency, much like the "tariffication" of NTBs.
The methodology for this exercise is as follows. In step 1, I estimate the relationship between utilization rates, tariff preferences and RoOs. In step 2, I retrieve the estimates and invert the relationship to get a simulated MFC that gives, line by line, the same utilization rate as the old array of RoOs. In step 3, I calculate the trade-weighted average of the simulated MFC across all lines to get an overall equivalent of the current system and explore the possibility of setting this unique instrument at a uniform rate across lines. This would have two advantages. First, like a uniform tariff, a uniform MFC would make it difficult for lobbies to manipulate the instrument at the margin. This argument is standard in the political-economy literature and has been used time and again in support of reductions in the variance of tariffs (together with standard welfare considerations). Second, uniformity across lines is the only way to eliminate the indirect source of discrimination alluded to earlier. Only if two countries face uniform RoOs and tariff preferences will they face uniform incentives irrespective of their initial export structure. The result of this exercise is striking: the average simulated MFC is 25% of good value, a very low (i.e. restrictive) level, confirming Estevadeordal and Suominen's critical assessment of NAFTA's RoOs. Adopting a uniform MFC would imply a relaxation from the benchmark level for sectors like chemicals or textiles and apparel, and a stiffening for wood products, paper and base metals. Overall, however, the changes are not drastic, suggesting perhaps only moderate resistance to change from special interests. The third chapter of the thesis considers whether the Europe Agreements of the EU, with the current sets of RoOs, could be the potential model for future EU-centered PTAs. First, I studied and coded, at the six-digit level of the Harmonised System (HS), both the old RoOs (used before 1997) and the "single list" RoOs (used since 1997). Second, using a constant elasticity of transformation function in which CEEC exporters smoothly allocate sales between the EU and the rest of the world by comparing producer prices on each market, I estimated the trade effects of the EU RoOs. The estimates suggest that much of the market access conferred by the EAs, outside sensitive sectors, was undone by the cost-raising effects of RoOs. The chapter also contains an analysis of the evolution of the CEECs' trade with the EU from post-communism to accession. Part II. The last chapter of the thesis is concerned with anti-dumping, another trade-policy instrument that has the effect of reducing market access. In 1995, the Uruguay Round introduced into the Anti-Dumping Agreement (ADA) a mandatory "sunset review" clause (Article 11.3 ADA) under which anti-dumping measures should be reviewed no later than five years from their imposition and terminated unless there is a serious risk of resumption of injurious dumping. The last chapter, written with Prof. Olivier Cadot and Prof. Jaime de Melo, uses a new database on Anti-Dumping (AD) measures worldwide to assess whether the sunset-review agreement had any effect. The question we address is whether the WTO Agreement succeeded in imposing the discipline of a five-year cycle on AD measures and, ultimately, in curbing their length. Two methods are used: count-data analysis and survival analysis.
First, using Poisson and negative binomial regressions, the count of AD measure revocations is regressed on (inter alia) the count of initiations lagged five years. The analysis yields a coefficient on initiations lagged five years that is larger and more precisely estimated after the agreement than before, suggesting some effect. However, the coefficient estimate is nowhere near the value that would give a one-for-one relationship between initiations and revocations after five years. We also find that (i) if the agreement affected EU AD practices, the effect went the wrong way, the five-year cycle being quantitatively weaker after the agreement than before; (ii) the agreement had no visible effect on the United States except for a one-time peak in 2000, suggesting a mopping-up of old cases. Second, survival analysis of AD measures around the world suggests a shortening of their expected lifetime after the agreement, and this shortening effect (a downward shift in the survival function post-agreement) was larger and more significant for measures targeted at WTO members than for those targeted at non-members (for which WTO disciplines do not bind), suggesting that compliance was de jure. A difference-in-differences Cox regression confirms this diagnosis: controlling for the countries imposing the measures, for the investigated countries and for the products' sector, we find a larger increase in the hazard rate of AD measures covered by the Agreement than for other measures.
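A sketch of the count-data step on simulated data: yearly revocation counts regressed on initiations lagged five years, with a post-agreement dummy and an interaction, using Poisson and negative binomial GLMs from statsmodels. The data and coefficient values are invented for illustration, not the thesis's estimates.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Hypothetical panel: revocation counts vs. initiations lagged five years.
init_lag5 = rng.poisson(20, size=200).astype(float)
post_wto = rng.integers(0, 2, size=200).astype(float)   # 1 after the agreement
mu = np.exp(0.5 + 0.03 * init_lag5 + 0.2 * post_wto)    # invented DGP
revocations = rng.poisson(mu)

# Interaction term asks whether the five-year cycle tightened post-agreement.
X = sm.add_constant(np.column_stack([init_lag5, post_wto, init_lag5 * post_wto]))
poisson_fit = sm.GLM(revocations, X, family=sm.families.Poisson()).fit()
negbin_fit = sm.GLM(revocations, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(poisson_fit.params)
print(negbin_fit.params)
```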
Abstract:
The objective of the study was to validate the content of the dimensions that constitute nonadherence to treatment of systemic arterial hypertension. It was a methodological study of content validation. Initially, an integrative review was conducted, which identified four dimensions of nonadherence: person, disease/treatment, health service, and environment. Definitions of these dimensions were evaluated by 17 professionals who were specialists in the area, including nurses, pharmacists and physicians. The Content Validity Index was calculated for each dimension (IVCi) and for the set of dimensions (IVCt), and a binomial test was conducted. The results permitted the validation of the dimensions with an IVCt of 0.88, demonstrating a reasonable systematic comprehension of the phenomenon of nonadherence.
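A small sketch of the validation arithmetic, with invented ratings: the item-level index (IVCi) is the share of the 17 experts endorsing a dimension, IVCt averages over the dimensions, and a one-sided binomial test compares each IVCi against a cutoff. The 0.78 cutoff is our assumption, not a value from the paper.

```python
from scipy.stats import binomtest

# Hypothetical ratings: number of experts (out of 17) endorsing each dimension.
ratings = {"person": 16, "disease/treatment": 15, "health service": 14, "environment": 15}
n_experts = 17
cutoff = 0.78   # assumed minimum acceptable CVI

for dim, agree in ratings.items():
    ivci = agree / n_experts                                  # item-level CVI
    p = binomtest(agree, n_experts, cutoff, alternative="greater").pvalue
    print(f"{dim}: IVCi={ivci:.2f}, p={p:.3f}")

ivct = sum(ratings.values()) / (n_experts * len(ratings))    # overall CVI
print(f"IVCt = {ivct:.2f}")
```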
Abstract:
Even today, paper as the medium of information remains the biggest operational problem in most companies, government agencies and institutions. The Cape Verdean public administration handles a large volume of documents every year, and the difficulty of dealing with the ever-growing output of documents keeps increasing. Hence the need to establish principles of administrative rationality by intervening in the stages of the records life cycle (production, use, conservation and final disposition of archival documents), that is, Records Management. As the evolution of Information and Communication Technologies (ICT) is continuous, information management likewise needs to adapt to each new wave of change. Content and process management systems are indispensable tools for the development of modern information systems, streamlining critical processes in institutions so as to optimise the productivity-quality pairing. This work describes the use of IBM FileNet to model the retirement request process in the Public Administration. It was shown that IBM's workflow management system, Business Process Management (BPM), can automate procedures, manage the sequence of work activities and invoke the appropriate human and electronic resources associated with the various activity steps in an institution. The evidence for the FileNet platform as a Content Management and Process Management solution in our Public Administration shows considerable advantages at all levels. Besides streamlining and automating critical processes, FileNet BPM can optimise operations and improve the capacity to make quick and sound decisions, enabling the institution to make timely decisions based on the most accurate information available.
Abstract:
Introduction: Excellent coordination between firefighters, police and medical rescue is the key to success in the management of major accidents. In order to improve and assist the medical teams engaged on site, the Swiss "medical command and control system" for rescue operations is based on a two-person (binomial) set-up involving one head emergency doctor and one head rescue paramedic, both trained in disaster medicine. We recently experimented with an innovative on-site "medical command and control system" based on this binomial team, supported by a dedicated 144 dispatcher. Methods: A major road traffic accident took place on the highway between Lausanne and Vevey on April 9th, 2008. We retrospectively collected all data concerning the victims as well as the logistics and dedicated structures, as reported by the 144 dispatch centre, the hospitals, the State authorities and the police and fire departments. Results: The 72-car pileup caused one death and 26 slightly injured patients. Management on the accident site was organized around a tripartite system, gathering together the medical command and control team with the police and fire departments. On the medical side, 16 ambulances, 2 medical response teams (SMUR), the Rega crew and the medical command and control team were dispatched by the 144. On that occasion an advanced medical command car equipped with communication devices and staffed with a 144 dispatcher was also engaged, allowing efficient medical regulation directly from the site. Discussion: The specific skills of one doctor and one paramedic, both trained in disaster management, proved to be perfectly complementary. The presence of a dispatcher on site with a medical command car also proved useful, improving the transmission of orders from the medical command team to all other on- and off-site partners. It removed the need for repeated back-and-forth communication with the 144, allowing both paramedic and doctor to focus on strategy and tactics rather than communication and logistics.
Abstract:
Spatial distribution of Aphis gossypii (Glover) (Hemiptera, Aphididae) and Bemisia tabaci (Gennadius) biotype B (Hemiptera, Aleyrodidae) in Bt and non-Bt cotton. Studying the spatial distribution of adult B. tabaci and of A. gossypii in Bt and non-Bt cotton crops is fundamental for optimising sampling techniques, besides revealing behavioural differences between the two cultivars in species that are not targets of the Bt technology. The experiment therefore investigated the spatial distribution pattern of these insect species in conventional non-Bt cotton and in the Bt cultivar. Evaluations took place in two fields of 5,000 m² each, in which 14 assessments were carried out, counting whitefly adults and aphid colonies. Aggregation indices were calculated (variance-to-mean ratio, Morisita index, and the exponent k of the negative binomial distribution), and the observed frequency classes of individuals were tested for fit against theoretical frequency distributions (Poisson, negative binomial and positive binomial). All analyses showed that, in both cultivars, the spatial distribution of B. tabaci fitted the negative binomial distribution throughout the period analysed, indicating that the transgenic cultivar did not influence the aggregated distribution pattern of this insect. For A. gossypii, the aggregation indices indicated an aggregated distribution in both cultivars, but the frequency distributions supported an aggregated distribution only in conventional cotton, as no fit was obtained for the data from the Bt cultivar. This indicates that Bt cotton altered the normal dispersion pattern of the aphids in the crop.
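The three aggregation indices named above are straightforward to compute. A sketch on invented per-plot counts follows: a variance-to-mean ratio above 1, a Morisita index above 1, and a finite moment estimate of the negative binomial exponent k all point to an aggregated distribution.

```python
import numpy as np

counts = np.array([0, 2, 5, 1, 0, 3, 8, 0, 1, 4, 2, 6])  # hypothetical counts per plot

mean, var = counts.mean(), counts.var(ddof=1)
vmr = var / mean                      # variance-to-mean ratio: > 1 suggests aggregation

n, total = len(counts), counts.sum()
morisita = n * np.sum(counts * (counts - 1)) / (total * (total - 1))  # Morisita index

# Moment estimate of the negative binomial exponent k (finite only if var > mean).
k_moment = mean**2 / (var - mean) if var > mean else np.inf
print(f"VMR={vmr:.2f}, Morisita={morisita:.2f}, k={k_moment:.2f}")
```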
Abstract:
This paper uses Social Security records to study internal migration in Spain. This is the first paper to use this data source, which has some advantages with respect to existing data sources: it includes only job-seeking migrants and it allows us to identify temporary migration. Within the framework of an extended gravity model, we estimate a generalized negative binomial regression on gross migration flows between provinces. We quantify the effect of local labor market imbalances on workers' mobility and discuss the equilibrating role of internal migration in Spain. Our main results show that the effects of employment opportunities have changed after 1984; migrants seem to be more responsive to economic conditions but, consistently with previous studies of the Spanish labor market, the migration response to wage differentials is wrongly signed. Our analysis also confirms the greater internal mobility of highly qualified workers.
Abstract:
Many dynamic revenue management models divide the sales period into a finite number of periods T and assume, invoking a fine-enough grid of time, that each period sees at most one booking request. These Poisson-type assumptions restrict the variability of the demand in the model, but researchers and practitioners have been willing to overlook this for the benefit of tractability of the models. In this paper, we criticize this model from another angle. Estimating the discrete finite-period model poses problems of indeterminacy and non-robustness: arbitrarily fixing T leads to arbitrary control values, while estimating T from data adds an additional layer of indeterminacy. To counter this, we first propose an alternate finite-population model that avoids the problem of fixing T and allows a wider range of demand distributions, while retaining the useful marginal-value properties of the finite-period model. The finite-population model still requires jointly estimating the market size and the parameters of the customer purchase model without observing no-purchases. Estimation of market size when no-purchases are unobservable has rarely been attempted in the marketing or revenue management literature. Indeed, we point out that it is akin to the classical statistical problem of estimating the parameters of a binomial distribution with unknown population size and success probability, and hence likely to be challenging. However, when the purchase probabilities are given by a functional form such as a multinomial logit model, we propose an estimation heuristic that exploits the specification of the functional form, the variety of the offer sets in a typical RM setting, and qualitative knowledge of arrival rates. Finally, we perform simulations to show that the estimator is very promising in obtaining unbiased estimates of the population size and the model parameters.
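The classical problem mentioned above, jointly estimating a binomial's N and p, can be sketched by profiling the likelihood over N with p̂ = mean/N at each candidate. The toy example below (invented data, arbitrary search cap) runs, but the estimate is notoriously unstable, which is exactly the difficulty the authors' heuristic works around.

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(3)
data = rng.binomial(n=50, p=0.3, size=20)   # true N=50, p=0.3 (hypothetical)

# Profile likelihood over N: for each candidate N, the MLE of p is mean/N.
best_N, best_p = max(
    ((N, data.mean() / N) for N in range(int(data.max()), 500)),  # 500 = arbitrary cap
    key=lambda Np: binom.logpmf(data, Np[0], Np[1]).sum(),
)
print(best_N, round(best_p, 3))   # often far from (50, 0.3) for small samples
```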
Abstract:
BACKGROUND: Strategies to dissect phenotypic and genetic heterogeneity of major depressive disorder (MDD) have mainly relied on subphenotypes, such as age at onset (AAO) and recurrence/episodicity. Yet, evidence on whether these subphenotypes are familial or heritable is scarce. The aims of this study are to investigate the familiality of AAO and episode frequency in MDD and to assess the proportion of their variance explained by common single nucleotide polymorphisms (SNP heritability). METHOD: For investigating familiality, we used 691 families with 2-5 full siblings with recurrent MDD from the DeNt study. We fitted (square root) AAO and episode count in a linear and a negative binomial mixed model, respectively, with family as random effect and adjusting for sex, age and center. The strength of familiality was assessed with intraclass correlation coefficients (ICC). For estimating SNP heritabilities, we used 3468 unrelated MDD cases from the RADIANT and GSK Munich studies. After similarly adjusting for covariates, derived residuals were used with the GREML method in GCTA (genome-wide complex trait analysis) software. RESULTS: Significant familial clustering was found for both AAO (ICC = 0.28) and episodicity (ICC = 0.07). We calculated from respective ICC estimates the maximal additive heritability of AAO (0.56) and episodicity (0.15). SNP heritability of AAO was 0.17 (p = 0.04); analysis was underpowered for calculating SNP heritability of episodicity. CONCLUSIONS: AAO and episodicity aggregate in families to a moderate and small degree, respectively. AAO is under stronger additive genetic control than episodicity. Larger samples are needed to calculate the SNP heritability of episodicity. The described statistical framework could be useful in future analyses.
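A sketch of the familiality computation on simulated sibling data: a linear mixed model with a family random effect via statsmodels MixedLM, with the ICC taken as the family variance share. The data, effect sizes and covariates below are invented, chosen so the true ICC is near the paper's 0.28.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)

# Hypothetical data: 200 families with 3 siblings each; a shared family
# effect on sqrt(age at onset) plus individual noise (true ICC ~ 0.28).
fam = np.repeat(np.arange(200), 3)
family_effect = rng.normal(0.0, 1.0, 200)[fam]
sqrt_aao = 5.0 + family_effect + rng.normal(0.0, 1.6, fam.size)

df = pd.DataFrame({"sqrt_aao": sqrt_aao, "family": fam,
                   "sex": rng.integers(0, 2, fam.size)})
res = sm.MixedLM.from_formula("sqrt_aao ~ sex", groups="family", data=df).fit()
var_fam = res.cov_re.iloc[0, 0]           # between-family variance component
icc = var_fam / (var_fam + res.scale)     # intraclass correlation coefficient
print(f"ICC = {icc:.2f}")
```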
Abstract:
Conversion electron Mössbauer spectra of composition-modulated FeSi thin films have been analysed within the framework of a quasi-shape-independent model in which the distribution function for the hyperfine fields is assumed to be given by a binomial distribution. Both the hyperfine field and the hyperfine field distribution depend on the modulation characteristic length.
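A sketch of a binomial field model of this kind: if a probe site with k impurity neighbours out of n sees a field H0 − k·ΔH, the weights of the field components are binomial in the concentration c. All numbers below are hypothetical, not the paper's fitted values.

```python
import numpy as np
from scipy.stats import binom

n, c = 8, 0.25          # hypothetical coordination number and Si fraction
H0, dH = 33.0, 2.5      # hypothetical pure-Fe field (T) and per-neighbour shift

k = np.arange(n + 1)
weights = binom.pmf(k, n, c)        # binomial weight of each neighbour count
fields = H0 - k * dH                # hyperfine field for each configuration
print(np.column_stack([k, fields, weights.round(4)]))
print("mean field:", np.sum(weights * fields))
```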
Abstract:
Using data reported in articles published in Brazilian journals and in papers presented at national conferences, the applications of Lotka's law to the Brazilian literature in 10 different fields were replicated. The inverse power model was used, fitted by the least squares and maximum likelihood methods. Of the 10 national literatures analysed, only the literatures of medicine, steelmaking, jackfruit (jaca) and library science fitted the generalised inverse power model by the least squares method. However, only two literatures (veterinary medicine and the letters of the Getúlio Vargas Private Archive) failed to fit the model when the maximum likelihood method was used. For both of these literatures, different models were tried. The veterinary literature fitted the negative binomial distribution, and the letters of the Getúlio Vargas Private Archive were better fitted by the generalised inverse Gaussian-Poisson distribution.
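A minimal sketch of the least squares fit of the inverse power model y_x = C / x^n on invented author-productivity counts (log-log regression; the maximum likelihood variant also used in the paper is not shown).

```python
import numpy as np

# Hypothetical data: y[x] = number of authors with exactly x publications.
x = np.arange(1, 11)
y = np.array([120, 32, 15, 8, 5, 4, 3, 2, 2, 1])

# Inverse power model y = C / x**n, fitted by least squares on the logs.
slope, intercept = np.polyfit(np.log(x), np.log(y), 1)
n_hat, C_hat = -slope, np.exp(intercept)
print(f"n = {n_hat:.2f}, C = {C_hat:.1f}")   # Lotka's original law has n near 2
```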