922 results for Data replication processes
Abstract:
The Land surface Processes and eXchanges (LPX) model is a fire-enabled dynamic global vegetation model that performs well globally but has problems representing fire regimes and vegetative mix in savannas. Here we focus on improving the fire module. To improve the representation of ignitions, we introduced a treatment of lightning that allows the fraction of ground strikes to vary spatially and seasonally, realistically partitions strike distribution between wet and dry days, and varies the number of dry days with strikes. Fuel availability and moisture content were improved by implementing decomposition rates specific to individual plant functional types and litter classes, and litter drying rates driven by atmospheric water content. To improve water extraction by grasses, we use realistic plant-specific treatments of deep roots. To improve fire responses, we introduced adaptive bark thickness and post-fire resprouting for tropical and temperate broadleaf trees. All improvements are based on extensive analyses of relevant observational data sets. We test model performance for Australia, first evaluating parameterisations separately and then measuring overall behaviour against standard benchmarks. Changes to the lightning parameterisation produce a more realistic simulation of fires in southeastern and central Australia. Implementation of PFT-specific decomposition rates enhances performance in central Australia. Changes in fuel drying improve the simulation of fire in northern Australia, while changes in rooting depth produce a more realistic simulation of fuel availability and structure in central and northern Australia. The introduction of adaptive bark thickness and resprouting produces more realistic fire regimes in Australian savannas. We also show that the model simulates biomass recovery rates consistent with observations from several different regions of the world characterised by resprouting vegetation.
The new model (LPX-Mv1) improves the simulation of observed vegetation composition and mean annual burnt area by 33% and 18%, respectively, compared to LPX.
Abstract:
Geotechnical systems, such as landfills, mine tailings storage facilities (TSFs), slopes, and levees, are required to perform safely throughout their service life, which can span from decades for levees to “in perpetuity” for TSFs. The conventional design practice by geotechnical engineers for these systems utilizes the as-built material properties to predict their performance throughout the required service life. The implicit assumption in this design methodology is that the soil properties are stable through time. This runs counter to long-term field observations of these systems, particularly where ecological processes such as plant, animal, biological, and geochemical activity are present. Plant roots can densify soil and/or increase hydraulic conductivity, burrowing animals can increase seepage, biological activity can strengthen soil, and geochemical processes can increase stiffness. The engineering soil properties naturally change as a stable ecological system is gradually established following initial construction, and these changes alter system performance. This paper presents an integrated perspective and new approach to this issue, considering ecological, geotechnical, and mining demands and constraints. A series of data sets and case histories are utilized to examine these issues and to propose a more integrated design approach, and consideration is given to future opportunities to manage engineered landscapes as ecological systems. We conclude that soil scientists and restoration ecologists must be engaged in initial project design and geotechnical engineers must be active in long-term management during the facility’s service life. For near-surface geotechnical structures in particular, this requires an interdisciplinary perspective and the embracing of soil as a living ecological system rather than an inert construction material.
Abstract:
Understanding what makes some species more vulnerable to extinction than others is an important challenge for conservation. Many comparative analyses have addressed this issue by exploring how intrinsic and extrinsic traits associate with general estimates of vulnerability. However, these general estimates do not consider the actual threats that drive species to extinction and hence are more difficult to translate into effective management. We provide an updated description of the types and spatial distribution of threats that affect mammals globally using data from the IUCN for 5941 species of mammals. Using these data we explore the links between intrinsic species traits and specific threats in order to identify key intrinsic features associated with particular drivers of extinction. We find that families formed by small-size habitat specialists are more likely to be threatened by habitat-modifying processes, whereas families formed by larger mammals with small litter sizes are more likely to be threatened by processes that directly affect survival. These results highlight the importance of considering the actual threatening process in comparative studies. We also discuss the need to standardize and rank threat importance in global assessments such as the IUCN Red List to improve our ability to understand what makes some species more vulnerable to extinction than others.
Abstract:
In numerical weather prediction, parameterisations are used to simulate missing physics in the model. These can be due to a lack of scientific understanding or a lack of computing power available to address all the known physical processes. Parameterisations are sources of large uncertainty in a model, as the parameter values they use cannot be measured directly and hence are often not well known, and the parameterisations themselves are also approximations of the processes present in the true atmosphere. Whilst there are many efficient and effective methods for combined state/parameter estimation in data assimilation (DA), such as state augmentation, these are not effective at estimating the structure of parameterisations. A new method of parameterisation estimation is proposed that uses sequential DA methods to estimate errors in the numerical models at each space-time point for each model equation. These errors are then fitted to pre-determined functional forms of missing physics or parameterisations that are based upon prior information. We apply the method to a one-dimensional advection model with additive model error and show that the method can accurately estimate parameterisations, with consistent error estimates. Furthermore, it is shown how the method depends on the quality of the DA results. The results indicate that this new method is a powerful tool in systematic model improvement.
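The error-fitting step described in this abstract can be sketched in a few lines. This is a minimal illustration with synthetic data: the grid, the assumed functional form (error = a·du/dx + b), and the parameter values are all invented for the example, not taken from the paper.

```python
import numpy as np

# Suppose sequential DA has produced model-error estimates eta[i] at each
# grid point of a 1-D advection model, and we hypothesise that the missing
# physics has the functional form  eta = a * du/dx + b  (a, b unknown).
rng = np.random.default_rng(0)
n = 100
x = np.linspace(0.0, 2 * np.pi, n)
u = np.sin(x)                       # model state on the grid
dudx = np.gradient(u, x)            # finite-difference gradient

a_true, b_true = 0.7, 0.1           # "true" missing-physics parameters
eta = a_true * dudx + b_true + 0.01 * rng.standard_normal(n)  # noisy DA error estimates

# Fit the assumed functional form to the estimated errors by least squares.
A = np.column_stack([dudx, np.ones(n)])
(a_hat, b_hat), *_ = np.linalg.lstsq(A, eta, rcond=None)
print(a_hat, b_hat)
```

The same pattern generalises to any pre-determined basis of candidate terms: each extra term becomes a column of `A`, and the least-squares fit selects the combination that best explains the DA-estimated errors.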
Abstract:
This paper discusses how global financial institutions are using big data analytics within their compliance operations. Much previous research has focused on the strategic implications of big data, but little has considered how such tools are entwined with regulatory breaches and investigations in financial services. Our work covers two in-depth qualitative case studies, each addressing a distinct type of analytics. The first case focuses on analytics which manage everyday compliance breaches and so are expected by managers. The second case focuses on analytics which facilitate investigation and litigation where serious unexpected breaches may have occurred. In doing so, the study focuses on micro-level data practices to understand how these tools are influencing operational risks and practices. The paper draws on two bodies of literature, the social studies of information systems and of finance, to guide our analysis and practitioner recommendations. The cases illustrate how technologies are implicated in multijurisdictional challenges and regulatory conflicts at each end of the operational risk spectrum. We find that compliance analytics are both shaping and reporting regulatory matters, yet firms often have difficulties recruiting individuals with relevant but diverse skill sets. The cases also underscore the increasing need for financial organizations to adopt robust information governance policies and processes to ease future remediation efforts.
Abstract:
This article contains raw and processed data related to research published by Bryant et al. [1]. Data were obtained by MS-based proteomics, analysing trichome-enriched, trichome-depleted, and whole-leaf samples taken from the medicinal plant Artemisia annua and searching the acquired MS/MS data against a recently published contig database [2] and other genomic and proteomic sequence databases for comparison. The processed data show that an order of magnitude more proteins have been identified from trichome-enriched Artemisia annua samples in comparison with previously published data. Proteins known to have a role in the biosynthesis of artemisinin, as well as other highly abundant proteins, were found, implying additional enzymatically driven processes within the trichomes that are significant for artemisinin biosynthesis.
Abstract:
Neotropical forests have brought forth a large proportion of the world's terrestrial biodiversity, but the underlying evolutionary mechanisms and their timing require further elucidation. Despite insights gained from phylogenetic studies, uncertainties about molecular clock rates have hindered efforts to determine the timing of diversification processes. Moreover, most molecular research has been detached from the extensive body of data on Neotropical geology and paleogeography. We here examine phylogenetic relationships and the timing of speciation events in a Neotropical flycatcher genus (Myiopagis) by using calibrations from modern geologic data in conjunction with a number of recently developed DNA sequence dating algorithms and by comparing these estimates with those based on a range of previously proposed molecular clock rates. We present a well-supported hypothesis of systematic relationships within the genus. Our age estimates of Myiopagis speciation events based on paleogeographic data are in close agreement with nodal ages derived from a "traditional" avian mitochondrial 2%/My clock, while contradicting other clock rates. Our comparative approach corroborates the consistency of the traditional avian mitochondrial clock rate of 2%/My for tyrant-flycatchers. Nevertheless, our results argue against the indiscriminate use of molecular clock rates in evolutionary research and advocate the verification of the appropriateness of the traditional clock rate by means of independent calibrations in individual studies. © 2009 Elsevier Inc. All rights reserved.
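The clock arithmetic underlying a "2%/My" rate is simple: a pairwise sequence divergence d (in percent) and a rate r (percent per million years) give an age t = d / r. The helper and the divergence value below are illustrative assumptions, not figures from the study.

```python
# Divergence-time arithmetic under a strict molecular clock.
def clock_age(divergence_pct: float, rate_pct_per_my: float = 2.0) -> float:
    """Return the divergence time in millions of years for a given
    pairwise sequence divergence (%) and clock rate (%/My)."""
    return divergence_pct / rate_pct_per_my

# e.g. a hypothetical 4.6% mtDNA divergence under the 2%/My clock:
age = clock_age(4.6)
print(age)  # 2.3 (million years)
```

Independent calibrations, as the abstract advocates, amount to solving the same equation the other way round: a node of known geological age and a measured divergence yield an empirical rate r = d / t that can be checked against the traditional 2%/My value.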
Abstract:
The order Scorpiones is one of the most cytogenetically interesting groups within Arachnida by virtue of the combination of chromosome singularities found in the 59 species analyzed so far. In this work, the mitotic and meiotic chromosomes of 2 species of the family Bothriuridae are described in detail. This family occupies a basal position within the superfamily Scorpionoidea. Furthermore, a review of the cytogenetic data of all previously studied scorpions is presented. Light microscopy chromosome analysis showed that Bothriurus araguayae and Bothriurus rochensis possess low diploid numbers compared with those of species belonging to closely related families. Gonadal cells examined under light and transmission electron microscopy revealed, for the first time, that the Bothriuridae species possess typical monocentric chromosomes, and that male meiosis presents chromosomes with synaptic and achiasmatic behavior. Moreover, in the sample of B. araguayae studied, heterozygous translocations were verified. The use of techniques to highlight specific chromosomal regions also revealed additional differences between the 2 Bothriurus species. The results recorded herein, together with the overview elaborated from the available cytogenetic information on Scorpiones, clarify current understanding of the processes of chromosome evolution that have occurred in Bothriuridae and in Scorpiones as a whole.
Abstract:
Recurrent submicroscopic genomic copy number changes are the result of nonallelic homologous recombination (NAHR). Nonrecurrent aberrations, however, can result from different nonexclusive recombination-repair mechanisms. We previously described small microduplications at Xq28 containing MECP2 in four male patients with a severe neurological phenotype. Here, we report on the fine-mapping and breakpoint analysis of 16 unique microduplications. The size of the overlapping copy number changes varies between 0.3 and 2.3 Mb, and FISH analysis on three patients demonstrated a tandem orientation. Although eight of the 32 breakpoint regions coincide with low-copy repeats, none of the duplications are the result of NAHR. Bioinformatics analysis of the breakpoint regions demonstrated a 2.5-fold higher frequency of Alu interspersed repeats as compared with control regions, as well as a very high GC content (53%). Unexpectedly, we obtained the junction in only one patient by long-range PCR, which revealed nonhomologous end joining as the mechanism. Breakpoint analysis in two other patients by inverse PCR and subsequent array comparative genomic hybridization analysis demonstrated the presence of a second duplicated region more telomeric at Xq28, of which one copy was inserted in between the duplicated MECP2 regions. These data suggest a two-step mechanism in which part of Xq28 is first inserted near the MECP2 locus, followed by break-induced replication with strand invasion of the normal sister chromatid. Our results indicate that the mechanism by which copy number changes occur in regions with a complex genomic architecture can yield complex rearrangements.
Abstract:
In this paper we introduce a parametric model for handling lifetime data in which an early lifetime can be related to infant-mortality failure or to wear processes, but we do not know which risk is responsible for the failure. The maximum likelihood approach and the sampling-based approach are used to obtain the inferences of interest. Some special cases of the proposed model are studied via Monte Carlo methods for the size and power of hypothesis tests. To illustrate the proposed methodology, we present an example based on a real data set.
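The maximum likelihood step for a parametric lifetime model can be sketched as follows. This is a deliberately simplified single-Weibull fit on simulated data, a special case standing in for the paper's more general competing-risk model; the sample size and parameter values are invented for the example.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Simulate lifetimes from a Weibull distribution (illustrative "true" values).
rng = np.random.default_rng(1)
shape_true, scale_true = 1.5, 10.0
t = weibull_min.rvs(shape_true, scale=scale_true, size=500, random_state=rng)

def neg_log_lik(params):
    """Negative log-likelihood of the Weibull model; penalise invalid params."""
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    return -np.sum(weibull_min.logpdf(t, shape, scale=scale))

# Maximise the likelihood (minimise its negative) from a rough starting point.
res = minimize(neg_log_lik, x0=[1.0, 5.0], method="Nelder-Mead")
shape_hat, scale_hat = res.x
print(shape_hat, scale_hat)
```

A mixture or competing-risks extension, as in the paper, would replace the single Weibull density with a weighted combination of an early-failure and a wear-out component and maximise the same kind of likelihood over all component parameters.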
Abstract:
In this article, we are interested in evaluating different parameter-estimation strategies for a multiple linear regression model. To estimate the model parameters, we used data from a clinical trial whose aim was to verify whether the mechanical test of the maximum force property (EM-FM) is associated with femoral mass, femoral diameter, and the experimental group of ovariectomized female rats of the species Rattus norvegicus albinus, Wistar variety. Three methodologies are compared for estimating the model parameters: the classical methodology, based on the least squares method; the Bayesian methodology, based on Bayes' theorem; and the Bootstrap method, based on resampling processes.
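Two of the three estimation strategies compared in the article, classical least squares and the bootstrap, can be sketched side by side. The data below are synthetic stand-ins for the femoral-strength measurements, with made-up coefficients; the Bayesian approach is omitted for brevity.

```python
import numpy as np

# Synthetic regression data: intercept plus two covariates.
rng = np.random.default_rng(2)
n = 60
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
beta_true = np.array([2.0, 1.5, -0.8])
y = X @ beta_true + 0.5 * rng.standard_normal(n)

# Classical estimate: ordinary least squares.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Bootstrap: resample (X, y) pairs with replacement and refit each time.
B = 1000
boot = np.empty((B, 3))
for b in range(B):
    idx = rng.integers(0, n, size=n)
    boot[b], *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)

beta_boot = boot.mean(axis=0)          # bootstrap point estimate
se_boot = boot.std(axis=0, ddof=1)     # bootstrap standard errors
print(beta_ols, beta_boot, se_boot)
```

The spread of the bootstrap replicates gives standard errors without distributional assumptions, which is exactly what makes the resampling approach a useful cross-check on the classical and Bayesian fits.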
An imaginary potential with universal normalization for dissipative processes in heavy-ion reactions
Abstract:
In this work we present new coupled channel calculations with the São Paulo potential (SPP) as the bare interaction, and an imaginary potential with system- and energy-independent normalization that has been developed to take into account dissipative processes in heavy-ion reactions. This imaginary potential is based on high-energy nucleon interactions in the nuclear medium. Our theoretical predictions for energies up to approximately 100 MeV/nucleon agree very well with the experimental data for the p, n + nucleus, ^16O + ^27Al, ^16O + ^60Ni, ^58Ni + ^124Sn, and weakly bound projectile ^7Li + ^120Sn systems. © 2008 Elsevier B.V. All rights reserved.
Abstract:
The adsorption kinetics curves of poly(xylylidene tetrahydrothiophenium chloride) (PTHT), a poly-p-phenylenevinylene (PPV) precursor, and the sodium salt of dodecylbenzene sulfonic acid (DBS), onto (PTHT/DBS)_n layer-by-layer (LBL) films were characterized by means of UV-vis spectroscopy. The amount of PTHT/DBS and PTHT adsorbed on each layer was shown to be practically independent of adsorption time. A Langmuir-type metastable equilibrium model was used to fit the adsorption isotherm data and to estimate adsorption/desorption coefficient ratios, k = k_ads/k_des, with values of 2 × 10^5 and 4 × 10^6 for PTHT and PTHT/DBS layers, respectively. The desorption coefficient was estimated using literature values for the poly(o-methoxyaniline) desorption coefficient and found to be in the range of 10^-9 to 10^-6 s^-1, indicating that quasi-equilibrium is rapidly attained.
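The Langmuir-type fit described in the abstract amounts to estimating K = k_ads/k_des from coverage-versus-concentration data via theta = K·C / (1 + K·C). The sketch below uses synthetic data points and an invented "true" K of the same order of magnitude as the paper's PTHT value, purely for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, K):
    """Langmuir isotherm: fractional coverage vs concentration."""
    return K * C / (1.0 + K * C)

# Synthetic isotherm data with a little multiplicative noise.
K_true = 2e5
C = np.array([1e-7, 5e-7, 1e-6, 5e-6, 1e-5, 5e-5])
rng = np.random.default_rng(3)
theta = langmuir(C, K_true) * (1 + 0.02 * rng.standard_normal(C.size))

# Least-squares fit of K to the (C, theta) data.
K_hat, _ = curve_fit(langmuir, C, theta, p0=[1e5])
print(K_hat[0])
```

The fitted K fixes only the ratio of the two rate coefficients; pinning down k_des itself requires outside information, which is why the paper turns to literature values for a chemically similar polymer.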
Abstract:
Felsic microgranular enclaves with structures indicating that they interacted in a plastic state with their chemically similar host granite are abundant in the Maua Pluton, SE Brazil. Larger plagioclase xenocrysts are in textural disequilibrium with the enclave groundmass and show complex zoning patterns with partially resorbed An-rich cores (locally with patchy textures) surrounded by more sodic rims. In situ laser ablation-(multi-collector) inductively coupled plasma mass spectrometry trace element and Sr isotopic analyses performed on the plagioclase xenocrysts indicate open-system crystallization; however, no evidence of derivation from more primitive basic melts is observed. The An-rich cores have more radiogenic initial Sr isotopic ratios that decrease towards the outermost part of the rims, which are in isotopic equilibrium with the matrix plagioclase. These profiles may have been produced by either (1) diffusional re-equilibration after rim crystallization from the enclave-forming magma, as indicated by relatively short calculated residence times, or (2) episodic contamination with a decrease of the contaminant ratio proportional to the extent to which the country rocks were isolated by the crystallization front. Profiles of trace elements with high diffusion coefficients would require unrealistically long residence times, and can be modeled in terms of fractional crystallization. A combination of trace element and Sr isotope data suggests that the felsic microgranular enclaves from the Maua Pluton are the products of interaction between end-member magmas that had similar compositions, thus recording 'self-mixing' events.
Abstract:
We discuss the estimation of the expected value of quality-adjusted survival based on multistate models. We generalize an earlier work by allowing the sojourn times in the health states to be non-identically distributed, for a given vector of covariates. Approaches based on semiparametric and parametric (exponential and Weibull distributions) methodologies are considered. A simulation study is conducted to evaluate the performance of the proposed estimator, and the jackknife resampling method is used to estimate its variance. An application to a real data set is also included.
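The jackknife variance step mentioned in the abstract can be sketched in isolation. For illustration, the statistic below is a simple mean of synthetic quality-adjusted survival times rather than the paper's multistate estimator; the mechanics of the leave-one-out recomputation are the same.

```python
import numpy as np

# Synthetic quality-adjusted survival times (illustrative only).
rng = np.random.default_rng(4)
qas = rng.exponential(scale=3.0, size=40)

n = qas.size
theta_hat = qas.mean()  # the statistic of interest

# Jackknife: recompute the statistic with each observation left out,
# then combine the leave-one-out replicates into a variance estimate.
loo = np.array([np.delete(qas, i).mean() for i in range(n)])
var_jack = (n - 1) / n * np.sum((loo - loo.mean()) ** 2)
print(theta_hat, var_jack)
```

For the sample mean this reduces exactly to the familiar s²/n, which makes a convenient sanity check; for a complex estimator such as expected quality-adjusted survival, the same leave-one-out recipe applies with the full estimator recomputed on each reduced sample.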