849 results for Theoretical probability
Abstract:
(1) A mathematical theory for computing the probabilities of various nucleotide configurations is developed, and the probability of obtaining the correct phylogenetic tree (model tree) from sequence data is evaluated for six tree-making methods (UPGMA, the distance Wagner method, the transformed distance method, Fitch and Margoliash's method, the maximum parsimony method, and the compatibility method). The number of nucleotides (m*) necessary to obtain the correct tree with a probability of 95% is estimated with special reference to the human, chimpanzee, and gorilla divergence. m* is at least 4,200, but the availability of outgroup species greatly reduces m* for all methods except UPGMA. m* increases if transitions occur more frequently than transversions, as in the case of mitochondrial DNA. (2) A new tree-making method, the neighbor-joining method, is proposed. This method is applicable to either distance data or character-state data. Computer simulation has shown that, when distance data are used, the neighbor-joining method is generally better than UPGMA, Farris' method, Li's method, and the modified Farris method at recovering the true topology. A related method, the simultaneous partitioning method, is also discussed. (3) The maximum likelihood (ML) method for phylogeny reconstruction under the assumption of both constant and varying evolutionary rates is studied, and a new algorithm for obtaining the ML tree is presented. This method gives a tree similar to that obtained by UPGMA when a constant evolutionary rate is assumed, whereas it gives a tree similar to those obtained by the maximum parsimony and neighbor-joining methods when varying evolutionary rates are assumed.
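To make the neighbor-joining idea above concrete, here is a minimal sketch of one agglomeration step of Saitou and Nei's method, assuming a symmetric distance matrix with zero diagonal; the function name and the toy matrix are illustrative, not taken from the abstract.

```python
import numpy as np

def neighbor_joining_step(d):
    """One agglomeration step of the neighbor-joining method.

    d: (n, n) symmetric distance matrix with zero diagonal.
    Returns the pair (i, j) minimizing the Q criterion, their branch
    lengths to the new internal node, and the distances from the
    remaining taxa to that node.
    """
    n = d.shape[0]
    r = d.sum(axis=1)                                # net divergences
    q = (n - 2) * d - r[:, None] - r[None, :]        # Q(i, j) criterion
    np.fill_diagonal(q, np.inf)
    i, j = np.unravel_index(np.argmin(q), q.shape)
    li = 0.5 * d[i, j] + (r[i] - r[j]) / (2 * (n - 2))   # branch to i
    lj = d[i, j] - li                                    # branch to j
    du = 0.5 * (d[i] + d[j] - d[i, j])               # distances to new node
    return (i, j), (li, lj), du

# Toy 4-taxon example (additive distances)
d = np.array([[0.,  5.,  9.,  9.],
              [5.,  0., 10., 10.],
              [9., 10.,  0.,  8.],
              [9., 10.,  8.,  0.]])
print(neighbor_joining_step(d))   # joins taxa 0 and 1 with branches 2 and 3
```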
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it is shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be a pure time cost from delaying agreement or a cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time costs and show that communication can play a similar role: the simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g., the format war between HD DVD and Blu-ray). It is shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is known only to the seller. Indeed, without the possibility of making repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When repeated offers are allowed, however, both types of goods trade with probability one in equilibrium. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, and this reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information. In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to depend crucially on (i) the degree to which players can renegotiate and gradually build up agreements and (ii) the absence of a certain type of externality that can loosely be described as an incentive to free-ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
Abstract:
To deliver sample estimates provided with the necessary probability foundation to permit generalization from the sample data subset to the whole target population being sampled, probability sampling strategies are required to satisfy three necessary but not sufficient conditions: (i) all inclusion probabilities must be greater than zero in the target population to be sampled; if some sampling units have an inclusion probability of zero, then a map accuracy assessment does not represent the entire target region depicted in the map to be assessed; (ii) the inclusion probabilities must be (a) knowable for nonsampled units and (b) known for those units selected in the sample: since the inclusion probability determines the weight attached to each sampling unit in the accuracy estimation formulas, if the inclusion probabilities are unknown, so are the estimation weights. This original work presents a novel (to the best of these authors' knowledge, the first) probability sampling protocol for quality assessment and comparison of thematic maps generated from spaceborne/airborne Very High Resolution (VHR) images, where: (I) an original Categorical Variable Pair Similarity Index (CVPSI, proposed in two different formulations) is estimated as a fuzzy degree of match between a reference and a test semantic vocabulary, which may not coincide, and (II) both symbolic pixel-based thematic quality indicators (TQIs) and sub-symbolic object-based spatial quality indicators (SQIs) are estimated with a degree of uncertainty in measurement, in compliance with the well-known Quality Assurance Framework for Earth Observation (QA4EO) guidelines. Like a decision tree, any protocol (guidelines for best practice) comprises a set of rules, equivalent to structural knowledge, and an order of presentation of the rule set, known as procedural knowledge. The combination of these two levels of knowledge makes an original protocol worth more than the sum of its parts. The several degrees of novelty of the proposed probability sampling protocol are highlighted in this paper, at the levels of understanding of both structural and procedural knowledge, in comparison with related multi-disciplinary works selected from the existing literature. In the experimental session, the proposed protocol is tested for accuracy validation of preliminary classification maps automatically generated by the Satellite Image Automatic Mapper™ (SIAM™) software product from two WorldView-2 images and one QuickBird-2 image provided by DigitalGlobe for testing purposes. In these experiments, the collected TQIs and SQIs are statistically valid, statistically significant, consistent across maps, and in agreement with theoretical expectations, visual (qualitative) evidence, and the quantitative quality indexes of operativeness (OQIs) claimed for SIAM™ by related papers. As a subsidiary conclusion, the statistically consistent and statistically significant accuracy validation of the SIAM™ pre-classification maps proposed in this contribution, together with the OQIs claimed for SIAM™ by related works, makes the operational (automatic, accurate, near real-time, robust, scalable) SIAM™ software product eligible for opening up new inter-disciplinary research and market opportunities, in accordance with the visionary goal of the Global Earth Observation System of Systems (GEOSS) initiative and the QA4EO international guidelines.
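The role of inclusion probabilities as estimation weights, noted in condition (ii) above, can be illustrated with the standard Horvitz-Thompson-type weighting for an accuracy estimate. The sketch below is a generic illustration with toy values; it is not the protocol proposed in the paper.

```python
import numpy as np

def overall_accuracy_ht(correct, pi):
    """Horvitz-Thompson-style estimate of overall map accuracy.

    correct: 0/1 array, 1 where the sampled unit's map label matches
             the reference label.
    pi:      inclusion probability of each sampled unit (all > 0, as
             condition (i) of the protocol requires).
    """
    w = 1.0 / pi                      # estimation weight per sampling unit
    return np.sum(w * correct) / np.sum(w)

# Toy example: two strata sampled at different rates
correct = np.array([1, 1, 0, 1, 0, 1])
pi = np.array([0.10, 0.10, 0.10, 0.02, 0.02, 0.02])
print(overall_accuracy_ht(correct, pi))   # weighted overall accuracy
```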
Abstract:
In previous papers, the type-I intermittent phenomenon with a continuous reinjection probability density (RPD) has been extensively studied. In this paper, type-I intermittency with a discontinuous RPD function in one-dimensional maps is analyzed. To carry out the present study, the analytic approximation presented by del Río and Elaskar (Int. J. Bifurc. Chaos 20:1185-1191, 2010) and Elaskar et al. (Physica A 390:2759-2768, 2011) is extended to discontinuous RPD functions. The results of this analysis show that the characteristic relation depends only on the position of the lower bound of reinjection (LBR); therefore, for an LBR below the tangent point, the relation ⟨l⟩ ∝ ε^(-1/2), where ε is the control parameter and ⟨l⟩ is the average length of the laminar phases, remains robust regardless of the form of the RPD, although the value of ⟨l⟩ itself can change. Finally, a study of discontinuous RPDs for the type-I intermittency that occurs in a three-wave truncation model for the derivative nonlinear Schrödinger equation is presented. In all tests the theoretical results agree well with the numerical data.
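The ε^(-1/2) characteristic relation can be checked numerically with the generic local map of type-I intermittency, x' = ε + x + x². The sketch below uses a uniform reinjection below the tangent point as a stand-in RPD; it illustrates the scaling only and is not the paper's three-wave model.

```python
import numpy as np

def mean_laminar_length(eps, c=0.5, n_phases=500, seed=0):
    """Average laminar length for the local type-I map x' = eps + x + x**2.

    The laminar channel is |x| < c; after escape, the trajectory is
    reinjected uniformly in [-0.45, -0.35], i.e. below the tangent point.
    """
    rng = np.random.default_rng(seed)
    lengths = []
    for _ in range(n_phases):
        x, l = rng.uniform(-0.45, -0.35), 0
        while abs(x) < c:
            x = eps + x + x * x
            l += 1
        lengths.append(l)
    return np.mean(lengths)

for eps in (1e-3, 1e-4, 1e-5):
    # <l> should grow roughly like eps**-0.5
    print(f"eps={eps:.0e}  <l>={mean_laminar_length(eps):.0f}")
```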
Abstract:
The effects of cell toxicity are known to be inherent in carcinogenesis induced by radiation or chemical carcinogens. Cell death precludes tumor induction from occurring. A long-standing problem is to estimate the proportion of initiated cells that die before tumor induction. No experimental techniques are currently available for directly gauging the rate of cell death over extended periods of time. This obstacle can be surmounted by newly developed theoretical methods of carcinogenesis modeling. In this paper, we apply such methods to published data on multiple lung tumors in mice receiving different schedules of urethane. Bioassays of this type play an important role in testing environmental chemicals for carcinogenic activity. Our estimates for urethane-induced carcinogenesis show that, unexpectedly, many initiated cells die early in the course of tumor promotion. We present numerical estimates of the probability of initiated-cell death for different schedules (and doses) of urethane administration.
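The quantity being estimated, the probability that an initiated cell dies before giving rise to a tumor, has a classical benchmark in the linear birth-death description of clonal expansion: a clone founded by one cell with division rate β greater than death rate δ goes extinct with probability δ/β. The sketch below checks this textbook fact by simulating the embedded jump chain; it is a generic illustration, not the authors' model or data.

```python
import numpy as np

def extinction_fraction(beta, delta, n_clones=2000, n_cap=500, seed=1):
    """Fraction of single-cell clones that die out before reaching n_cap cells.

    Embedded jump chain of a linear birth-death process: at each event a
    cell divides with probability beta/(beta+delta), otherwise one cell dies.
    For beta > delta the theoretical extinction probability is delta/beta.
    """
    rng = np.random.default_rng(seed)
    p_birth = beta / (beta + delta)
    extinct = 0
    for _ in range(n_clones):
        n = 1
        while 0 < n < n_cap:      # clones reaching n_cap almost surely survive
            n += 1 if rng.random() < p_birth else -1
        extinct += (n == 0)
    return extinct / n_clones

print(extinction_fraction(beta=1.0, delta=0.7))   # close to 0.7 = delta/beta
```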
Abstract:
A superadditive bisexual Galton-Watson branching process is considered, and the total numbers of mating units, females, and males up to the n-th generation are studied. In particular, some results about stochastic monotonicity, probability generating functions, and moments are obtained. Finally, the limit behaviour of those variables, suitably normalized, is investigated.
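As a concrete illustration of the process described above, here is a minimal simulation of a bisexual Galton-Watson branching process with the superadditive mating function L(f, m) = min(f, m), tracking per-generation and total numbers of mating units, females, and males. The offspring distribution and parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def simulate_bgwp(n_gen, z0=5, mean_offspring=2.5, p_female=0.5, seed=0):
    """Bisexual Galton-Watson process with mating function L(f, m) = min(f, m).

    Each mating unit produces a Poisson number of offspring; each offspring
    is female with probability p_female. Returns the per-generation counts
    (mating units, females, males) and their running totals.
    """
    rng = np.random.default_rng(seed)
    z = z0                                   # current mating units
    totals = {"units": z0, "females": 0, "males": 0}
    history = []
    for _ in range(n_gen):
        offspring = rng.poisson(mean_offspring, size=z).sum() if z else 0
        f = rng.binomial(offspring, p_female)
        m = offspring - f
        z = min(f, m)                        # superadditive mating function
        totals["units"] += z
        totals["females"] += f
        totals["males"] += m
        history.append((z, f, m))
    return history, totals

hist, tot = simulate_bgwp(10)
print(tot)   # totals of mating units, females and males up to generation 10
```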
Abstract:
The study of random probability measures is a lively research topic that has attracted interest from different fields in recent years. In this thesis, we consider random probability measures in the context of Bayesian nonparametrics, where the law of a random probability measure is used as a prior distribution, and in the context of distributional data analysis, where the goal is to perform inference given a sample from the law of a random probability measure. The contributions contained in this thesis can be subdivided according to three different topics: (i) the use of almost surely discrete repulsive random measures (i.e., whose support points are well separated) for Bayesian model-based clustering, (ii) the proposal of new laws for collections of random probability measures for Bayesian density estimation of partially exchangeable data subdivided into different groups, and (iii) the study of principal component analysis and regression models for probability distributions seen as elements of the 2-Wasserstein space. Specifically, for point (i) we propose an efficient Markov chain Monte Carlo algorithm for posterior inference, which sidesteps the need for the split-merge reversible-jump moves typically associated with poor performance; we propose a model for clustering high-dimensional data by introducing a novel class of anisotropic determinantal point processes; and we study the distributional properties of the repulsive measures, shedding light on important theoretical results that enable more principled prior elicitation and more efficient posterior simulation algorithms. For point (ii), we consider several models suitable for clustering homogeneous populations, inducing spatial dependence across groups of data, and extracting the characteristic traits common to all the data groups, and we propose a novel vector autoregressive model to study the growth curves of Singaporean children. Finally, for point (iii), we propose a novel class of projected statistical methods for distributional data analysis for measures on the real line and on the unit circle.
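A standard entry point to random probability measures as priors, far simpler than the repulsive and dependent constructions studied in the thesis, is the Dirichlet process in its stick-breaking representation. The sketch below draws a truncated realization; function names and parameters are illustrative.

```python
import numpy as np

def stick_breaking_dp(alpha, base_sampler, n_atoms=1000, rng=None):
    """Truncated stick-breaking draw from a Dirichlet process DP(alpha, G0).

    Returns atom locations (drawn from the base measure G0) and weights
    w_k = v_k * prod_{j<k} (1 - v_j), with v_k ~ Beta(1, alpha).
    """
    rng = rng or np.random.default_rng()
    v = rng.beta(1.0, alpha, size=n_atoms)
    w = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))
    atoms = base_sampler(n_atoms, rng)
    return atoms, w

atoms, w = stick_breaking_dp(
    alpha=2.0, base_sampler=lambda n, rng: rng.normal(size=n))
print(atoms[:5], w[:5], w.sum())   # weights sum to ~1 for a large truncation
```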
Abstract:
The scope of this paper is to reflect on the theoretical construction underlying the constitution of the sociology of health, still called medical sociology in some countries. Two main ideas constitute the basis for this: interdisciplinarity and the degree of articulation between the fields of medicine and sociology. We sought to establish a dialogue with some dimensions - macro/micro, structure/action - that constitute the basis for understanding medicine/health in relation to the social/sociological dimension. The main aspects of these dimensions are presented first. Straus' two medical sociologies and the theory/application impasses are then addressed, as well as the dilemmas of the sociology of medicine in the 1960s and 1970s. The theoretical production before 1970 is then placed as a counterpoint to these analyses. Lastly, the sociology of health is seen in the general context of sociology, which underwent a fragmentation process from 1970 onward, with effects in all subfields of the social sciences. This process involves a rethinking of theoretical issues within a broadened spectrum of possibilities. The 1980s, when theoretical issues in the sociology of health were reinvigorated and the issue of interdisciplinarity was once again addressed, are highlighted.
Abstract:
Amphibians have been declining worldwide, and our comprehension of the threats they face could be improved by using mark-recapture models to estimate vital rates of natural populations. Recently, the consequences of marking amphibians have been under discussion, and the effects of toe clipping on survival are debatable, although it is still the most common technique for individually identifying amphibians. The passive integrated transponder (PIT tag) is an alternative technique, but comparisons among marking techniques in free-ranging populations are still lacking. We compared these two marking techniques using mark-recapture models to estimate apparent survival and recapture probability in a neotropical population of the blacksmith tree frog, Hypsiboas faber. We tested the effects of marking technique and number of toe pads removed while controlling for sex. Survival was similar among groups, although it decreased slightly from individuals with one toe pad removed to individuals with two and three toe pads removed, and again to PIT-tagged individuals. No sex differences were detected. Recapture probability increased slightly with the number of toe pads removed and was lowest for PIT-tagged individuals. Sex was an important predictor of recapture probability, with males being nearly five times more likely to be recaptured. Potential negative effects of both techniques may include reduced locomotion and high stress levels. We recommend the use of covariates in models to better understand the effects of marking techniques on frogs. The effect of the technique should be accounted for in the results, because most techniques may reduce survival. Based on our results, but also on the logistical and cost issues associated with PIT tagging, we suggest the use of toe clipping with anurans like the blacksmith tree frog.
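Apparent survival (φ) and recapture probability (p), as used above, come from the Cormack-Jolly-Seber (CJS) model. Below is a minimal sketch of the probability of a single capture history under constant φ and p, conditioning on the first capture; the numbers are illustrative, not the study's estimates.

```python
def cjs_history_prob(history, phi, p):
    """Cormack-Jolly-Seber probability of one capture history.

    history: tuple of 0/1 over capture occasions, e.g. (1, 0, 1, 0, 0);
    phi:     apparent survival probability between occasions;
    p:       recapture probability (both assumed constant over time).
    The likelihood conditions on the first capture.
    """
    first = history.index(1)
    last = len(history) - 1 - history[::-1].index(1)
    prob = 1.0
    # Between first and last sighting the animal is known to be alive
    for occ in range(first + 1, last + 1):
        prob *= phi * (p if history[occ] else (1 - p))
    # chi: probability of never being seen again after the last sighting
    chi = 1.0
    for _ in range(len(history) - 1 - last):
        chi = (1 - phi) + phi * (1 - p) * chi
    return prob * chi

print(cjs_history_prob((1, 0, 1, 0, 0), phi=0.8, p=0.4))
```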
Abstract:
Hypobromous acid (HOBr) is an inorganic acid produced by the oxidation of the bromide anion (Br⁻). The blood plasma level of Br⁻ is more than 1,000-fold lower than that of the chloride anion (Cl⁻). Consequently, the endogenous production of HOBr is also lower than that of hypochlorous acid (HOCl). Nevertheless, there is much evidence of the deleterious effects of HOBr. From these data, we hypothesized that the reactivity of HOBr could be better associated with its electrophilic strength. Our hypothesis was confirmed, since HOBr was significantly more reactive than HOCl when the oxidizability of the studied compounds was not relevant. For instance: anisole (HOBr, k₂ = 2.3×10² M⁻¹s⁻¹; HOCl, non-reactive); dansylglycine (HOBr, k₂ = 7.3×10⁶ M⁻¹s⁻¹; HOCl, k₂ = 5.2×10² M⁻¹s⁻¹); salicylic acid (HOBr, k₂ = 4.0×10⁴ M⁻¹s⁻¹; HOCl, non-reactive); 3-hydroxybenzoic acid (HOBr, k₂ = 5.9×10⁴ M⁻¹s⁻¹; HOCl, k₂ = 1.1×10¹ M⁻¹s⁻¹); uridine (HOBr, k₂ = 1.3×10³ M⁻¹s⁻¹; HOCl, non-reactive). The compounds 4-bromoanisole and 5-bromouridine were identified as the products of the reactions between HOBr and anisole or uridine, respectively, i.e., typical products of electrophilic substitutions. Together, these results show that, rather than an oxidant, HOBr is a powerful electrophilic reactant. This chemical property was theoretically confirmed by computing the positive Mulliken and ChelpG charges on bromine and chlorine. In conclusion, the high electrophilicity of HOBr could be behind its well-established deleterious effects. We propose that HOBr is the most powerful endogenous electrophile.
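To put the second-order rate constants above into perspective, they can be converted to pseudo-first-order half-lives at an assumed oxidant concentration; the 50 µM used below is illustrative and not from the paper.

```python
import math

def half_life_s(k2, conc_oxidant):
    """Pseudo-first-order half-life t_1/2 = ln(2) / (k2 * [oxidant]),
    valid when the oxidant is in large excess over the substrate."""
    return math.log(2) / (k2 * conc_oxidant)

# Reported k2 values (M^-1 s^-1) at an assumed 50 uM oxidant concentration
for name, k2 in [("anisole + HOBr", 2.3e2),
                 ("dansylglycine + HOBr", 7.3e6),
                 ("dansylglycine + HOCl", 5.2e2)]:
    print(f"{name}: t1/2 = {half_life_s(k2, 50e-6):.3g} s")
```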
Abstract:
In this work we report a comparison of theoretical models commonly used to fit the temperature dependence of the fundamental energy gap of semiconductor materials. We used the models of Viña, Pässler-p, and Pässler-ρ to fit several sets of experimental data available in the literature for the energy gap of GaAs in the temperature range from 12 to 974 K. By performing several fits for different values of the upper limit of the analyzed temperature range (Tmax), we were able to follow systematically the evolution of the fitting parameters up to the high-temperature limit. We also compared the zero-point values obtained from the different models by extrapolating the linear dependence of the gap at high T to T = 0 K with the value determined from the dependence of the gap on isotope mass. Using experimental data measured by absorption spectroscopy, we observed the non-linear behavior of Eg(T) of GaAs for T > ΘD (the Debye temperature).
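The fitting procedure described above can be sketched with scipy. The Viña expression is Eg(T) = E_B - a_B[1 + 2/(e^(Θ/T) - 1)] and the Pässler-p expression is Eg(T) = Eg(0) - (αΘp/2)[(1 + (2T/Θp)^p)^(1/p) - 1]; the synthetic data and starting values below are illustrative GaAs-like numbers, not the paper's data sets.

```python
import numpy as np
from scipy.optimize import curve_fit

def vina(T, E_B, a_B, theta):
    """Vina model: Bose-Einstein-type gap shrinkage."""
    return E_B - a_B * (1.0 + 2.0 / (np.exp(theta / T) - 1.0))

def passler_p(T, Eg0, alpha, theta_p, p):
    """Passler-p model with power-law smoothing exponent p."""
    return Eg0 - 0.5 * alpha * theta_p * ((1 + (2 * T / theta_p) ** p) ** (1 / p) - 1)

# Synthetic GaAs-like data (illustrative parameter values, in eV and K)
T = np.linspace(12, 974, 60)
Eg = passler_p(T, 1.519, 4.7e-4, 230.0, 2.4) \
     + np.random.default_rng(0).normal(0, 5e-4, T.size)

popt_v, _ = curve_fit(vina, T, Eg, p0=[1.57, 0.05, 240.0])
popt_p, _ = curve_fit(passler_p, T, Eg, p0=[1.52, 5e-4, 250.0, 2.5])
print("Vina fit:", popt_v)
print("Passler-p fit:", popt_p)
```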
Abstract:
The 4,5-diamino-2,6-dimercaptopyrimidine (DADMcP) compound is an interesting multifunctional species exhibiting rather complex tautomerism, encompassing nine tautomeric forms. Tautomerism in this compound has been investigated by means of FTIR spectroscopy, in association with ab initio HF/SCF and DFT calculations. According to this study, three tautomers are energetically favored, the thione form being the most stable. The theoretical vibrational spectra of these tautomeric forms have been successfully simulated by means of DFT calculations, allowing the elucidation and assignment of the complex composition of the vibrational bands observed for the mixture of isomers.
Abstract:
The structure of probability currents is studied for the dynamical network obtained after consecutive contractions of two-state, nonequilibrium lattice systems. This procedure allows us to investigate the transition rates between configurations on small clusters and highlights some relevant effects of lattice symmetries on the elementary transitions that are responsible for entropy production. A method is suggested to estimate the entropy production at different levels of approximation (cluster sizes), as demonstrated for the two-dimensional contact process with mutation.
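Entropy production of the kind estimated above is commonly computed from probability currents via Schnakenberg's network formula, σ = ½ Σ_{i≠j} (p_i W_ij - p_j W_ji) ln[(p_i W_ij)/(p_j W_ji)]. The sketch below evaluates it for a generic Markov jump process at stationarity; it illustrates the formula only, not the paper's contraction scheme.

```python
import numpy as np

def entropy_production(W):
    """Schnakenberg steady-state entropy production of a Markov jump process.

    W[i, j] is the transition rate from state i to state j (diagonal ignored).
    """
    n = W.shape[0]
    # Master-equation generator and its stationary distribution p (L p = 0)
    L = W.T - np.diag(W.sum(axis=1))
    A = np.vstack([L, np.ones(n)])          # append normalization sum(p) = 1
    b = np.zeros(n + 1)
    b[-1] = 1.0
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    sigma = 0.0
    for i in range(n):
        for j in range(n):
            if i != j and W[i, j] > 0 and W[j, i] > 0:
                flux = p[i] * W[i, j] - p[j] * W[j, i]
                sigma += 0.5 * flux * np.log((p[i] * W[i, j]) / (p[j] * W[j, i]))
    return sigma

# Three-state cycle with biased rates: a nonequilibrium steady state
W = np.array([[0.0, 2.0, 1.0],
              [1.0, 0.0, 2.0],
              [2.0, 1.0, 0.0]])
print(entropy_production(W))   # ln(2) for this biased cycle
```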
Abstract:
The magnetoresistance of two-dimensional electron systems with several occupied subbands oscillates owing to periodic modulation of the probability of intersubband transitions by the quantizing magnetic field. Extending previous investigations of these magnetointersubband (MIS) oscillations in two-subband systems, we report both experimental and theoretical studies of this phenomenon in three-subband systems realized in triple quantum wells. We show that the presence of more than two subbands leads to a qualitatively different MIS oscillation picture, described as a superposition of several oscillating contributions. Under continuous microwave irradiation, the magnetoresistance of triple-well systems exhibits an interference of MIS oscillations and microwave-induced resistance oscillations. The theory explaining these phenomena is presented in a general form, valid for an arbitrary number of subbands. A comparison of theory and experiment allows us to extract the temperature dependence of the quantum lifetime of electrons and to confirm the applicability of the inelastic mechanism of microwave photoresistance for the description of magnetotransport in multilayer systems.
Abstract:
We report experimental and theoretical studies of the two-photon absorption spectra of two nitrofuran derivatives: nitrofurantoin, 1-[(5-nitro-2-furfurylidene)amino]hydantoin, and quinifuryl, 2-(5′-nitro-2′-furanyl)ethenyl-4-{N-[4′-(N,N-diethylamino)-1′-methylbutyl]carbamoyl}quinoline. Both molecules are representative of a family of 5-nitrofuran-ethenyl-quinoline drugs that display high toxicity to various species of transformed cells in the dark. We determined the two-photon absorption cross-sections of both compounds from 560 to 880 nm, with peak values of 64 GM for quinifuryl and 20 GM for nitrofurantoin (1 GM = 1×10⁻⁵⁰ cm⁴ s photon⁻¹). In addition, theoretical calculations employing the linear and quadratic response functions were carried out at the density functional theory level to aid the interpretation of the experimental results. The theoretical results yielded oscillator strengths, two-photon transition probabilities, and transition energies that are in good agreement with the experimental data. A higher number of allowed electronic transitions was identified for quinifuryl than for nitrofurantoin by the theoretical calculations. Given the planar structure of both compounds, the differences in the two-photon absorption cross-section values are a consequence of their distinct conjugation lengths. (c) 2011 American Institute of Physics. [doi:10.1063/1.3514911]