886 results for Sequential set
Abstract:
The NMR spin-spin coupling constants (1)J(N,H) and (2)J(H,H) and the chemical shielding sigma((15)N) of liquid ammonia are studied with a combined and sequential QM/MM methodology. Monte Carlo simulations are performed to generate statistically uncorrelated configurations that are submitted to density functional theory calculations. Two different Lennard-Jones potentials are used in the liquid simulations. Electronic polarization is included in these two potentials via an iterative procedure, with and without geometry relaxation, and its influence on the calculated properties is analyzed. B3LYP/aug-cc-pVTZ-J calculations give (1)J(N,H) constants in the interval of -67.8 to -63.9 Hz, depending on the theoretical model used; these can be compared with the experimental result of -61.6 Hz. For the (2)J(H,H) coupling the theoretical results vary between -10.6 and -13.01 Hz, while the indirect experimental result derived from partially deuterated liquid is -11.1 Hz. Inclusion of explicit hydrogen-bonded molecules gives a small but important contribution. The vapor-to-liquid shifts are also considered. This shift is calculated to be negligible for (1)J(N,H), in agreement with experiment, which is rationalized as a cancellation of the geometry-relaxation and pure solvent effects. For the chemical shielding sigma((15)N), calculations at the B3LYP/aug-pcS-3 level show that the vapor-to-liquid chemical shift requires the explicit use of solvent molecules: considering only one ammonia molecule in an electrostatic embedding gives the wrong sign for the chemical shift, which is corrected only with the use of explicit additional molecules. The best calculated vapor-to-liquid chemical shift Delta sigma((15)N) is -25.2 ppm, in good agreement with the experimental value of -22.6 ppm.
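As a toy illustration of the iterative electronic-polarization procedure described in this abstract, the Python sketch below replaces the Monte Carlo + DFT steps with a linear reaction-field response: the solute dipole polarizes the solvent, whose mean field repolarizes the solute until self-consistency. The gas-phase dipole is ammonia's (about 1.47 D); the response constants alpha and k are invented for illustration and are not taken from the paper.

```python
# Toy model of iterative electronic polarization: the MC + QM steps of the
# sequential QM/MM scheme are replaced by a linear reaction-field response.
# alpha (polarizability) and k (reaction-field constant) are illustrative.

def iterate_polarization(mu_gas, alpha, k, tol=1e-8, max_iter=100):
    mu = mu_gas
    for i in range(max_iter):
        field = k * mu                   # mean solvent reaction field
        mu_new = mu_gas + alpha * field  # repolarized solute dipole
        if abs(mu_new - mu) < tol:
            return mu_new, i + 1
        mu = mu_new
    return mu, max_iter

mu, steps = iterate_polarization(mu_gas=1.47, alpha=1.0, k=0.3)
print(f"converged liquid-phase dipole ~ {mu:.3f} D after {steps} iterations")
```

The fixed point mu = mu_gas / (1 - alpha * k) is reached in a handful of cycles, mirroring the few simulation/QM iterations such schemes typically need.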
Abstract:
A bipartite graph G = (V, W, E) is convex if there exists an ordering of the vertices of W such that, for each v ∈ V, the neighbors of v are consecutive in W. We describe both a sequential and a BSP/CGM algorithm to find a maximum independent set in a convex bipartite graph. The sequential algorithm improves on the running time of the previously known algorithm, and the BSP/CGM algorithm is a parallel version of the sequential one. The complexity of the algorithms does not depend on |W|.
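For context, here is a sketch of the classical route to this problem rather than the paper's improved algorithm: Glover's greedy rule yields a maximum matching in a convex bipartite graph, and by König's theorem a maximum independent set has size |V| + |W| minus the matching size. The interval encoding of the input is an assumption made for the sketch.

```python
import heapq

# Each v in V is encoded as an interval (lo, hi): its neighbors are
# w_lo..w_hi in the ordering of W that makes the graph convex.
def max_matching_size(intervals, w_count):
    by_start = sorted(range(len(intervals)), key=lambda v: intervals[v][0])
    heap, i, matched = [], 0, 0
    for w in range(w_count):
        # admit every v whose interval has started
        while i < len(by_start) and intervals[by_start[i]][0] <= w:
            v = by_start[i]
            heapq.heappush(heap, (intervals[v][1], v))  # keyed by interval end
            i += 1
        while heap and heap[0][0] < w:   # discard vs whose interval has ended
            heapq.heappop(heap)
        if heap:                         # Glover: match w to the v ending soonest
            heapq.heappop(heap)
            matched += 1
    return matched

intervals = [(0, 1), (0, 2), (2, 3)]     # |V| = 3, W = {w0, w1, w2, w3}
print(len(intervals) + 4 - max_matching_size(intervals, 4))  # MIS size: 4
```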
Abstract:
This paper studies cost-sharing rules under dynamic adverse selection. We present a typical principal-agent model with two periods, set in Laffont and Tirole's (1986) canonical regulation environment. In the first period, when the contract is signed, the firm faces prior uncertainty about its efficiency parameter. In the second period, the firm learns its efficiency and chooses its level of cost-reducing effort. The optimal mechanism sequentially screens the firm's types and achieves a higher level of welfare than its static counterpart. The contract is indirectly implemented by a sequence of transfers, consisting of a fixed advance payment based on the reported cost estimate and an ex-post compensation linear in cost performance.
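In stylized notation (assumed here, following the Laffont-Tirole environment rather than quoted from the paper), realized cost falls with effort and the transfer combines the advance payment with a term linear in cost performance:

```latex
% beta: efficiency parameter, e: effort, \hat\beta: reported cost estimate.
% a(.) is the fixed advance payment and b(.) the power of the cost-sharing rule.
C = \beta - e, \qquad
t(\hat{\beta}, C) = a(\hat{\beta}) + b(\hat{\beta})\,\bigl(C^{*}(\hat{\beta}) - C\bigr)
```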
Abstract:
A significant proportion (up to 62%) of oral squamous cell carcinomas (OSCCs) may arise from oral potentially malignant lesions (OPMLs), such as leukoplakia. Patient outcomes may thus be improved through detection of lesions at risk of malignant transformation, by identifying and categorizing genetic changes in sequential, progressive OPMLs. We conducted array comparative genomic hybridization analysis of 25 sequential, progressive OPMLs and same-site OSCCs from five patients. Recurrent DNA copy number gains were identified on 1p in 20/25 cases (80%), with minimal high-level amplification regions on 1p35 and 1p36. Gains were also frequently observed at 11q13.4 (68%), 9q34.13 (64%), 21q22.3 (60%), 6p21 and 6q25 (56%), and 10q24, 19q13.2, 22q12, 5q31.2, 7p13 and 14q22 (48%). DNA losses were observed in 20 of the samples and were mainly detected at 5q31.2 (35%), 16p13.2 (30%), 9q33.1 and 9q33.29 (25%), and 17q11.2, 3p26.2, 18q21.1, 4q34.1 and 8p23.2 (20%). Such copy number alterations (CNAs) were mapped in all grades of dysplasia that progressed, and in their corresponding OSCCs, in 70% of patients, indicating that these CNAs may be associated with disease progression. Amplified genes mapping within recurrent CNAs (KHDRBS1, PARP1, RAB1A, HBEGF, PAIP2, BTBD7) were selected for validation by quantitative real-time PCR in an independent set of 32 progressive leukoplakias, 32 OSCCs and 21 non-progressive leukoplakia samples. Amplification of BTBD7, KHDRBS1, PARP1 and RAB1A was detected exclusively in progressive leukoplakia and the corresponding OSCCs; these genes may therefore be associated with OSCC progression. Protein-protein interaction networks were created to identify possible pathways associated with OSCC progression.
Abstract:
There is a continuous search for theoretical methods that are able to describe the effects of the liquid environment on molecular systems. Different methods emphasize different aspects, and the treatment of both local and bulk properties is still a great challenge. In this work, the electronic properties of a water molecule in the liquid environment are studied by relaxing the geometry and electronic distribution using the free energy gradient method. This is done in a series of steps, in each of which we run a purely molecular mechanical (MM) Metropolis Monte Carlo simulation of liquid water and subsequently perform a quantum mechanical/molecular mechanical (QM/MM) calculation of the ensemble averages of the charge distribution, atomic forces, and second derivatives. The MP2/aug-cc-pV5Z level is used to describe the electronic properties of the QM water; B3LYP with specially designed basis functions is used for the magnetic properties. Very good agreement is found for the local properties of water, such as geometry, vibrational frequencies, dipole moment, dipole polarizability, chemical shift, and spin-spin coupling constants. The very good performance of the free energy method combined with a QM/MM approach, along with its possible limitations, is briefly discussed.
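A toy Python sketch of the free energy gradient loop described above: noisy force samples stand in for the QM/MM ensemble averages over Monte Carlo configurations, and a Newton-like step updates one internal coordinate. The harmonic surface parameters (k, q_eq) are invented for illustration.

```python
import random

# Free-energy-gradient loop in miniature: average forces over "uncorrelated
# configurations", then take a Newton step on the free energy surface.
def feg_relax(q0, k=0.5, q_eq=0.96, n_cfg=200, tol=1e-4, max_iter=50):
    q = q0
    rng = random.Random(1)
    for step in range(max_iter):
        forces = [-k * (q - q_eq) + rng.gauss(0, 0.01) for _ in range(n_cfg)]
        mean_f = sum(forces) / n_cfg     # ensemble-averaged force
        if abs(mean_f) < tol:
            break
        q += mean_f / k                  # Newton step (k plays the Hessian role)
    return q, step

q, steps = feg_relax(q0=0.90)
print(f"relaxed coordinate ~ {q:.4f} after {steps} steps")
```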
Abstract:
When estimating the effect of treatment on HIV using data from observational studies, standard methods may produce biased estimates due to the presence of time-dependent confounders. Such confounding can be present when a covariate, itself affected by past exposure, is a predictor of both the future exposure and the outcome. One example is the CD4 cell count, which is a marker of disease progression in HIV patients but also a marker for treatment initiation, and is influenced by treatment. Fitting a marginal structural model (MSM) using inverse probability weights is one way to adjust appropriately for this type of confounding. In this paper we study a simple and intuitive approach to estimating similar treatment effects, using observational data to mimic several randomized controlled trials. Each 'trial' is constructed from the individuals starting treatment in a certain time interval. An overall effect estimate for all such trials is found using composite likelihood inference. The method offers an alternative to the use of inverse probability of treatment weights, which is unstable in certain situations. The estimated parameter is not identical to that of an MSM; it is conditioned on covariate values at the start of each mimicked trial. This allows the study of questions that are not as easily addressed by fitting an MSM. The analysis can be performed as a stratified weighted Cox analysis on the joint data set of all the constructed trials, where each trial is one stratum. The model is applied to data from the Swiss HIV cohort study.
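A hedged sketch of the mimicked-trial construction (column names and the exact eligibility rule are assumptions, and the weighting step is omitted): one data set per interval, stacked and fit as a stratified Cox model with one stratum per trial.

```python
import pandas as pd
from lifelines import CoxPHFitter

def build_trials(df, intervals):
    """One mimicked trial per interval: subjects untreated and still at risk
    at the interval start; those initiating treatment in it form the treated arm."""
    trials = []
    for k, (t0, t1) in enumerate(intervals):
        at_risk = df[(df.tx_time.isna() | (df.tx_time >= t0)) & (df.event_time > t0)].copy()
        at_risk["treated"] = ((at_risk.tx_time >= t0) & (at_risk.tx_time < t1)).astype(int)
        at_risk["trial"] = k
        at_risk["duration"] = at_risk.event_time - t0   # time from trial start
        trials.append(at_risk)
    return pd.concat(trials, ignore_index=True)

# stacked = build_trials(cohort, [(0, 6), (6, 12), (12, 18)])
# CoxPHFitter().fit(stacked[["duration", "event", "treated", "trial"]],
#                   duration_col="duration", event_col="event", strata=["trial"])
```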
Abstract:
The present study demonstrates how consumers can suffer from sequential overchoice. Customizing a tailor-made suit from combined-attribute choices (e.g., deciding on color and fabric in combination) leads to less satisfaction and less additional consumption than customizing it from single-attribute choices (e.g., deciding on color, then on fabric). The effect is mediated by information overload and moderated by consideration set size.
Abstract:
Most statistical analysis, in theory and practice, is concerned with static models: models with a proposed set of parameters whose values are fixed across observational units. Static models implicitly assume that the quantified relationships remain the same across the design space of the data. While this is reasonable under many circumstances, it can be a dangerous assumption when dealing with sequentially ordered data. The mere passage of time always brings fresh considerations, and the interrelationships among parameters, or subsets of parameters, may need to be continually revised. When data are gathered sequentially, dynamic interim monitoring may be useful, as new subject-specific parameters are introduced with each new observational unit. Sequential imputation via dynamic hierarchical models is an efficient strategy for handling missing data and analyzing longitudinal studies. Dynamic conditional independence models offer a flexible framework that exploits the Bayesian updating scheme to capture the evolution of both population and individual effects over time. While static models often describe aggregate information well, they often do not reflect conflicts in the information at the individual level. Dynamic models prove advantageous over static models in capturing both individual and aggregate trends. Computations for such models can be carried out via the Gibbs sampler. An application using small-sample, repeated-measures, normally distributed growth curve data is presented.
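A minimal sketch of the dynamic updating idea (a one-dimensional dynamic linear model, with evolution variance W and observation variance V chosen for illustration): the mean follows a random walk and the posterior is revised as each observation arrives. The full hierarchical model of the abstract would instead be fit by Gibbs sampling.

```python
# 1-D dynamic linear model: theta_t = theta_{t-1} + w_t,  y_t = theta_t + v_t.
# Each new observation updates the posterior (a Kalman-style recursion).
def dlm_updates(ys, m0=0.0, c0=10.0, W=0.5, V=1.0):
    m, c = m0, c0
    history = []
    for y in ys:
        r = c + W                # prior variance after the random-walk step
        gain = r / (r + V)       # weight given to the new observation
        m = m + gain * (y - m)   # posterior mean
        c = (1 - gain) * r       # posterior variance
        history.append((m, c))
    return history

for m, c in dlm_updates([1.2, 0.8, 1.9, 2.4]):
    print(f"posterior mean {m:.3f}, variance {c:.3f}")
```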
Abstract:
Extant terrestrial biodiversity arguably is driven by the evolutionary success of angiosperm plants, but the evolutionary mechanisms and timescales of angiosperm-dependent radiations remain poorly understood. The Scarabaeoidea is a diverse lineage of predominantly plant- and dung-feeding beetles. Here, we present a phylogenetic analysis of Scarabaeoidea based on four DNA markers for a taxonomically comprehensive set of specimens and link it to recently described fossil evidence. The phylogeny strongly supports multiple origins of coprophagy, phytophagy and anthophagy. The ingroup-based fossil calibration of the tree broadly confirmed a Jurassic origin of the Scarabaeoidea crown group. The crown groups of phytophagous lineages began to radiate first (pleurostict scarabs: 108 Ma; Glaphyridae: 101 Ma), followed by the later diversification of coprophagous lineages (crown-group ages: Scarabaeinae, 76 Ma; Aphodiinae, 50 Ma). Pollen feeding arose even later, at most 62 Ma in the oldest anthophagous lineage. The clear time lag between the origins of the herbivores and the coprophages suggests an evolutionary path driven by the angiosperms, which first favoured the herbivore fauna (mammals and insects), followed by the secondary radiation of the dung feeders. This finding makes it less likely that extant dung beetle lineages initially fed on dinosaur excrement, as is often hypothesized.
Abstract:
Agents on the same side of a two-sided matching market (such as the marriage or labor market) compete with each other by making self-enhancing investments to improve their worth in the eyes of potential partners. Because these expenditures generally occur prior to matching, this activity has come to be known in recent literature (Peters, 2007) as pre-marital investment. This paper builds on that literature by considering the case of sequential pre-marital investment, analyzing a matching game in which one side of the market invests first, followed by the other. Interpreting the first group of agents as workers and the other group as firms, the paper provides a new perspective on the incentive structure that is inherent in labor markets. It also demonstrates that a positive rate of unemployment can exist even in the absence of matching frictions. Policy implications follow, as the prevailing set of equilibria can be altered by restricting entry into the workforce, providing unemployment insurance, or subsidizing pre-marital investment.
Abstract:
When conducting a randomized comparative clinical trial, ethical, scientific or economic considerations often motivate the use of interim decision rules after successive groups of patients have been treated. These decisions may pertain to the comparative efficacy or safety of the treatments under study, cost considerations, the desire to accelerate the drug evaluation process, or the likelihood of therapeutic benefit for future patients. At the time of each interim decision, an important question is whether patient enrollment should continue or be terminated, either because of a high probability that one treatment is superior to the other or a low probability that the experimental treatment will ultimately prove to be superior. The use of frequentist group sequential decision rules has become routine in the conduct of phase III clinical trials. In this dissertation, we present a new Bayesian decision-theoretic approach to the problem of designing a randomized group sequential clinical trial, focusing on two-arm trials with time-to-failure outcomes. Forward simulation is used to obtain optimal decision boundaries for each of a set of possible models. At each interim analysis, we use Bayesian model selection to adaptively choose the model having the largest posterior probability of being correct, and we then make the interim decision based on the boundaries that are optimal under the chosen model. We provide a simulation study comparing this method, which we call Bayesian Doubly Optimal Group Sequential (BDOGS), to corresponding frequentist designs using either O'Brien-Fleming (OF) or Pocock boundaries, as obtained from EaSt 2000. Our simulation results show that, over a wide variety of cases, BDOGS either performs at least as well as both OF and Pocock or on average provides a much smaller trial.
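To make the forward-simulation idea concrete, the sketch below evaluates candidate stopping boundaries for a drastically simplified binary-outcome two-arm trial; it is an analogy, not the BDOGS method, and all rates, group sizes and the boundary rule are invented.

```python
import random

def simulate(boundary, p_ctrl, p_exp, n_groups=5, n_per_arm=20, reps=2000, seed=7):
    """Forward-simulate trials under given response rates and report the
    expected enrolment and the fraction stopped in favour of the new arm."""
    rng = random.Random(seed)
    total_n, wins = 0, 0
    for _ in range(reps):
        s_c = s_e = n = 0
        for g in range(1, n_groups + 1):
            s_c += sum(rng.random() < p_ctrl for _ in range(n_per_arm))
            s_e += sum(rng.random() < p_exp for _ in range(n_per_arm))
            n = g * 2 * n_per_arm
            if abs(s_e - s_c) >= boundary * g:   # crude interim stopping rule
                wins += s_e > s_c
                break
        total_n += n
    return total_n / reps, wins / reps

for b in (3, 4, 5):
    en, win = simulate(b, p_ctrl=0.3, p_exp=0.5)
    print(f"boundary {b}: E[N] = {en:.0f}, stopped-for-superiority ~ {win:.2f}")
```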
Abstract:
Three sequential hurricanes, Dennis, Floyd, and Irene, affected coastal North Carolina in September and October 1999. These hurricanes inundated the region with up to 1 m of rainfall, causing 50- to 500-year flooding in the watershed of the Pamlico Sound, the largest lagoonal estuary in the United States and a key West Atlantic fisheries nursery. We investigated the ecosystem-level impacts on and responses of the Sound to the floodwater discharge. Floodwaters displaced three-fourths of the volume of the Sound, depressed salinity by a similar amount, and delivered at least half of the typical annual nitrogen load to this nitrogen-sensitive ecosystem. Organic carbon concentrations in floodwaters entering Pamlico Sound via a major tributary (the Neuse River Estuary) were at least 2-fold higher than concentrations under prefloodwater conditions. A cascading set of physical, chemical, and ecological impacts followed, including strong vertical stratification, bottom water hypoxia, a sustained increase in algal biomass, displacement of many marine organisms, and a rise in fish disease. Because of the Sound's long residence time (≈1 year), we hypothesize that the effects of the short-term nutrient enrichment could prove to be multiannual. A predicted increase in the frequency of hurricane activity over the next few decades may cause longer-term biogeochemical and trophic changes in this and other estuarine and coastal habitats.
Abstract:
A sequential design method is presented for the design of thermally coupled distillation sequences. The algorithm starts by selecting a set of sequences in the space of basic configurations, in which the internal structure of condensers and reboilers is explicitly taken into account, extended with the possibility of including divided wall columns (DWCs). This first stage is based on separation tasks (except for the DWCs) and therefore does not yet provide an actual sequence of columns. In the second stage the best arrangement in N-1 actual columns is determined, taking into account operability and mechanical constraints. Finally, for a set of candidate sequences, the algorithm tries to reduce the total number of columns by considering Kaibel columns, the elimination of transfer blocks, or columns with vertical partitions. An example illustrates the different steps of the sequential algorithm.
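As an illustration of the task-based first stage, the sketch below enumerates the sharp-split separation sequences for an ordered mixture (every split divides a block of adjacent components in two); for N components there are Catalan(N-1) such sequences, which is the space the screening stages must prune. This is a generic enumeration, not the paper's selection algorithm.

```python
# Enumerate sharp-split separation sequences as nested (left, right) tasks.
def sequences(components):
    if len(components) == 1:
        return [components[0]]
    out = []
    for cut in range(1, len(components)):
        for left in sequences(components[:cut]):
            for right in sequences(components[cut:]):
                out.append((left, right))   # one separation task per node
    return out

for seq in sequences(("A", "B", "C", "D")):
    print(seq)                 # 5 sequences for a 4-component mixture
```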
Abstract:
When designing a practical swarm robotics system, self-organized task allocation is key to making the best use of resources. Current research in this area focuses on task allocation that is either distributed (tasks must be performed at different locations) or sequential (tasks are complex and must be split into simpler sub-tasks processed in order). In practice, however, swarms will need to deal with tasks that are both distributed and sequential. In this paper, a classic foraging problem is extended to incorporate both distributed and sequential tasks. The problem is analysed theoretically, absolute limits on performance are derived, and a set of conditions for a successful algorithm is established. It is shown empirically that an algorithm which meets these conditions can, by causing emergent cooperation between robots, achieve consistently high performance under a wide range of settings without the need for communication.
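A toy simulation in the spirit of that result (rates, thresholds and the switching rule are invented, not taken from the paper): robots allocate themselves across two sequential sub-tasks using only a local cue, their own waiting time, and a workable split emerges without any communication.

```python
import random

def run(n_robots=20, steps=2000, rate_a=0.4, rate_b=0.7, patience=5, seed=3):
    rng = random.Random(seed)
    task = ["A"] * n_robots          # all robots start on sub-task A (harvest)
    waiting = [0] * n_robots
    buffer_items = 0                 # output of A awaiting sub-task B (deliver)
    delivered = 0
    for _ in range(steps):
        for i in range(n_robots):
            if task[i] == "A":
                if rng.random() < rate_a:
                    buffer_items += 1
                    waiting[i] = 0
                else:
                    waiting[i] += 1
            elif buffer_items > 0 and rng.random() < rate_b:
                buffer_items -= 1
                delivered += 1
                waiting[i] = 0
            else:
                waiting[i] += 1
            if waiting[i] > patience:          # local cue only: switch sub-task
                task[i] = "B" if task[i] == "A" else "A"
                waiting[i] = 0
    return delivered, task.count("A"), task.count("B")

print(run())   # (items delivered, robots on A, robots on B) after the run
```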