Resumo:
Background: In insects, like in most invertebrates, olfaction is the principal sensory modality, which provides animals with essential information for survival and reproduction. Odorant receptors are involved in this response, mediating interactions between an individual and its environment, as well as between individuals of the same or different species. The adaptive importance of odorant receptors renders them good candidates for having their variation shaped by natural selection. Methodology/Principal Findings: We analyzed nucleotide variation in a subset of eight Or genes located on the 3L chromosomal arm of Drosophila melanogaster in a derived population of this species and also in a population of Drosophila pseudoobscura. Some heterogeneity in the silent polymorphism to divergence ratio was detected in the D. melanogaster/D. simulans comparison, with a single gene (Or67b) contributing ~37% to the test statistic. However, no other signals of a very recent selective event were detected at this gene. In contrast, at the speciation timescale, the MK test uncovered the footprint of positive selection driving the evolution of two of the encoded proteins in both D. melanogaster (OR65c and OR67a) and D. pseudoobscura (OR65b1 and OR67c). Conclusions: The powerful polymorphism/divergence approach provided evidence for adaptive evolution at a rather high proportion of the Or genes studied after relatively recent speciation events. It did not provide, however, clear evidence for very recent selective events in either D. melanogaster or D. pseudoobscura.
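The McDonald-Kreitman (MK) test invoked above is, at its core, a 2x2 contingency test contrasting replacement and synonymous changes across polymorphism and divergence. A minimal sketch in Python, with a hand-rolled Fisher's exact test and purely illustrative counts (not taken from this study):

```python
from math import comb

def fisher_exact_p(table):
    """Two-sided Fisher's exact test on a 2x2 table [[a, b], [c, d]]."""
    (a, b), (c, d) = table
    row1, row2 = a + b, c + d
    col1, n = a + c, a + b + c + d

    def hyper(x):
        # Hypergeometric probability of x in the top-left cell, margins fixed.
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = hyper(a)
    support = range(max(0, col1 - row2), min(col1, row1) + 1)
    # Sum the probabilities of all tables at least as extreme as the observed.
    return sum(hyper(x) for x in support if hyper(x) <= p_obs * (1 + 1e-9))

# MK contingency table (hypothetical counts): rows are replacement and
# synonymous changes, columns are fixed differences and polymorphisms.
mk_table = [[7, 2], [17, 42]]
p = fisher_exact_p(mk_table)
# A significant excess of replacement fixations over replacement
# polymorphisms is the signature of positive selection.
```

Under neutrality, the ratio of replacement to synonymous changes should be the same within and between species; a small p-value driven by excess replacement divergence is read as positive selection.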
Resumo:
Single-valued solutions for two-sided market games without product differentiation, also known as Böhm-Bawerk horse market games, are analyzed. The nucleolus is proved to coincide with the tau-value and is thus the midpoint of the core. Moreover, a characterization of this set of games in terms of the assignment matrix is provided.
Resumo:
In this paper we characterize the single-valued solutions of cooperative games with transferable utility that satisfy core selection and aggregate monotonicity. We also show that these two properties are compatible with individual rationality, the dummy player property, and the symmetry property. Finally, we characterize the single-valued solutions that satisfy all five properties at once.
Resumo:
Terrestrial laser scanning (TLS) is one of the most promising surveying techniques for rock-slope characterization and monitoring. Landslide and rockfall movements can be detected by comparing sequential scans. One of the most pressing challenges in natural hazards is the combined temporal and spatial prediction of rockfall. An outdoor experiment was performed to ascertain whether the TLS instrumental error is small enough to enable the detection of precursory displacements of millimetric magnitude; it consisted of known displacements of three objects relative to a stable surface. Results show that millimetric changes cannot be detected by analysis of the unprocessed datasets. Displacement measurements are improved considerably by applying Nearest Neighbour (NN) averaging, which reduces the error (1σ) by up to a factor of 6. This technique was applied to displacements prior to the April 2007 rockfall event at Castellfollit de la Roca, Spain. The maximum precursory displacement measured was 45 mm, approximately 2.5 times the standard deviation of the model comparison, hampering the distinction between actual displacement and instrumental error using conventional methodologies. Encouragingly, the precursory displacement was clearly detected by applying the NN averaging method. These results show that millimetric displacements prior to failure can be detected using TLS.
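Nearest Neighbour averaging of this kind can be sketched as follows: each point's measured displacement is replaced by the mean over its k nearest neighbours, so uncorrelated instrumental noise shrinks by roughly the square root of k. The coordinates, noise level and displacement below are synthetic, for illustration only, and are not the study's actual processing chain:

```python
import numpy as np

def nn_average(points, values, k=25):
    """Replace each point's displacement value by the mean over its
    k nearest spatial neighbours (brute-force distance computation)."""
    points = np.asarray(points, dtype=float)
    values = np.asarray(values, dtype=float)
    # Pairwise squared distances between all scan points.
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(axis=-1)
    # Indices of the k nearest neighbours (each point includes itself).
    nn = np.argsort(d2, axis=1)[:, :k]
    return values[nn].mean(axis=1)

# Synthetic demo: a uniform 1 mm displacement buried in 3 mm (1-sigma) noise.
rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 10.0, size=(500, 2))      # scan positions (m)
noisy = 1.0 + rng.normal(0.0, 3.0, size=500)     # measured displacements (mm)
smoothed = nn_average(pts, noisy, k=25)
# The smoothed field scatters far less around the true 1 mm displacement.
```

The brute-force distance matrix is fine for a small demo; a real point cloud of millions of points would need a spatial index (e.g. a k-d tree) instead.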
Resumo:
After a rockfall event, a usual post-event survey includes qualitative volume estimation, trajectory mapping and determination of departure zones. However, quantitative measurements are not usually made. Additional quantitative information could be useful in determining the spatial occurrence of rockfall events and in quantifying their size. Seismic measurements could be suitable for detection purposes since they are non-invasive and relatively inexpensive. Moreover, seismic techniques could provide important information on rockfall size and on the location of impacts. On 14 February 2007 the Avalanche Group of the University of Barcelona obtained the seismic data generated by an artificially triggered rockfall event at the Montserrat massif (near Barcelona, Spain) carried out in order to purge a slope. Two three-component seismic stations were deployed about 200 m from the explosion point that triggered the rockfall. Seismic signals and video images were obtained simultaneously. The initial volume of the rockfall was estimated at 75 m3 by laser scanner data analysis. After the explosion, dozens of boulders ranging from 10^-4 to 5 m3 in volume impacted the ground at different locations. The blocks fell onto a terrace 120 m below the release zone. The impact generated a small continuous mass movement, a mixture of rocks, sand and dust, that ran down the slope and struck the road 60 m below. Time, time-frequency and particle motion analyses of the seismic records were performed, together with an estimation of the seismic energy. The results are as follows: (1) a rockfall event generates seismic signals with specific characteristics in the time domain; (2) the seismic signals generated by the mass movement show a time-frequency evolution different from that of other seismogenic sources (e.g. earthquakes, explosions or a single rock impact), a feature that could be used for detection purposes; (3) particle motion plot analysis shows that locating a rock impact using two stations is feasible; (4) the feasibility and validity of seismic methods for the detection, localization and size determination of rockfall events are confirmed.
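The time-frequency evolution used to separate rockfall signals from other seismogenic sources is typically inspected with a short-time Fourier transform. A minimal sketch (the window length, sampling rate and synthetic two-tone signal are illustrative assumptions, not the study's actual processing):

```python
import numpy as np

def stft_magnitude(signal, fs, win=256, hop=128):
    """Magnitude of a short-time Fourier transform (Hann window, 50%
    overlap): rows are frequencies, columns are time frames."""
    window = np.hanning(win)
    frames = np.array([signal[i:i + win] * window
                       for i in range(0, len(signal) - win + 1, hop)])
    spec = np.abs(np.fft.rfft(frames, axis=1)).T
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    times = np.arange(spec.shape[1]) * hop / fs
    return freqs, times, spec

# Synthetic record sampled at 100 Hz: a 5 Hz phase followed by a 40 Hz
# burst, mimicking a change of source character over time.
fs = 100.0
t = np.arange(0.0, 10.0, 1.0 / fs)
sig = np.where(t < 5.0, np.sin(2 * np.pi * 5.0 * t),
               np.sin(2 * np.pi * 40.0 * t))
freqs, times, spec = stft_magnitude(sig, fs)
# The dominant frequency shifts from ~5 Hz in early frames to ~40 Hz later.
```

Plotting `spec` against `times` and `freqs` gives the spectrogram in which the characteristic rockfall signature would be sought.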
Resumo:
The present study explores the statistical properties of a randomization test based on the random assignment of the intervention point in a two-phase (AB) single-case design. The focus is on randomization distributions constructed from the values of the test statistic for all possible random assignments and used to obtain p-values. The shape of these distributions is investigated for each specific data division, defined by the moment at which the intervention is introduced. A further aim was to test the detection of nonexistent effects (i.e., the production of false alarms) in autocorrelated data series, in which the assumption of exchangeability between observations may be untenable. In this way, nominal and empirical Type I error rates could be compared in order to obtain evidence on the statistical validity of the randomization test for each individual data division. The results suggest that when either of the two phases has considerably fewer measurement occasions, Type I errors may be too likely and, hence, the decision-making process carried out by applied researchers may be jeopardized.
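A minimal sketch of such a randomization test, assuming the absolute difference between phase means as test statistic and a minimum phase length of three (both illustrative choices, not necessarily those of the study):

```python
def randomization_distribution(data, min_len=3):
    """Test statistic (absolute difference between phase means) for every
    admissible intervention point, i.e. every data division leaving at
    least min_len observations in each phase."""
    n = len(data)

    def stat(k):
        a, b = data[:k], data[k:]
        return abs(sum(b) / len(b) - sum(a) / len(a))

    return {k: stat(k) for k in range(min_len, n - min_len + 1)}

def randomization_p_value(data, observed_k, min_len=3):
    """p-value: proportion of the randomization distribution at least as
    extreme as the statistic of the actual intervention point."""
    dist = randomization_distribution(data, min_len)
    obs = dist[observed_k]
    return sum(v >= obs for v in dist.values()) / len(dist)

# Hypothetical AB series with a clear level change after observation 6.
series = [2, 3, 2, 3, 2, 3, 8, 9, 8, 9, 8, 9]
p = randomization_p_value(series, observed_k=6)
# With 12 observations and min_len=3 there are only 7 admissible
# divisions, so the smallest attainable p-value is 1/7.
```

The demo also illustrates the paper's concern: with short series the randomization distribution has few points, so the granularity of attainable p-values varies with the data division.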
Resumo:
Effect size indices are indispensable for carrying out meta-analyses and can also serve as an alternative for deciding on the effectiveness of a treatment in an individual applied study. Desirable features of procedures for quantifying the magnitude of an intervention effect include educational/clinical meaningfulness, ease of calculation, insensitivity to autocorrelation, and low false alarm and miss rates. Three effect size indices related to visual analysis are compared according to these criteria. The comparison is made by means of data sets with known parameters: degree of serial dependence, presence or absence of general trend, and changes in level and/or in slope. The Percentage of Nonoverlapping Data (PND) showed the highest discrimination between data sets with and without an intervention effect. When autocorrelation or trend is present, the Percentage of data points Exceeding the Median (PEM) may be a better option for quantifying the effectiveness of a psychological treatment.
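The two nonoverlap indices discussed above can be sketched as follows (the AB data are hypothetical, and the functions assume an expected increase under treatment):

```python
import statistics

def pnd(baseline, treatment):
    """Percentage of Nonoverlapping Data: share of treatment points above
    the single most extreme (maximum) baseline point."""
    threshold = max(baseline)
    return 100.0 * sum(x > threshold for x in treatment) / len(treatment)

def pem(baseline, treatment):
    """Percentage of data points Exceeding the Median of the baseline;
    the median is robust to an isolated extreme baseline value."""
    med = statistics.median(baseline)
    return 100.0 * sum(x > med for x in treatment) / len(treatment)

# Hypothetical AB data: one baseline outlier (the 9) wipes out PND
# entirely but barely moves the baseline median.
A = [3, 4, 3, 9, 4]
B = [7, 8, 8, 7, 9, 8]
# pnd(A, B) == 0.0 while pem(A, B) == 100.0
```

The contrived outlier shows why an index tied to a single extreme baseline point can miss an otherwise obvious effect, which is the kind of trade-off the comparison above examines.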
Resumo:
Visual inspection remains the most frequently applied method for detecting treatment effects in single-case designs. The advantages and limitations of visual inference are discussed here in relation to other procedures for assessing intervention effectiveness. The first part of the paper reviews previous research on visual analysis, paying special attention to the validation of visual analysts' decisions, inter-judge agreement, and false alarm and omission rates. The most relevant factors affecting visual inspection (i.e., effect size, autocorrelation, data variability, and analysts' expertise) are highlighted and incorporated into an empirical simulation study aimed at providing further evidence about the reliability of visual analysis. Our results concur with previous studies that have reported a relationship between serial dependence and increased Type I error rates. Participants with greater experience appeared to be more conservative and used more consistent criteria when assessing graphed data. Nonetheless, the decisions made by both professionals and students did not match the simulated data features sufficiently, and intra-judge agreement was also low, suggesting that visual inspection should be complemented by other methods when assessing treatment effectiveness.
Resumo:
If single case experimental designs are to be used to establish guidelines for evidence-based interventions in clinical and educational settings, numerical values that reflect treatment effect sizes are required. The present study compares four recently developed procedures for quantifying the magnitude of intervention effect using data with known characteristics. Monte Carlo methods were used to generate AB designs data with potential confounding variables (serial dependence, linear and curvilinear trend, and heteroscedasticity between phases) and two types of treatment effect (level and slope change). The results suggest that data features are important for choosing the appropriate procedure and, thus, inspecting the graphed data visually is a necessary initial stage. In the presence of serial dependence or a change in data variability, the Nonoverlap of All Pairs (NAP) and the Slope and Level Change (SLC) were the only techniques of the four examined that performed adequately. Introducing a data correction step in NAP renders it unaffected by linear trend, as is also the case for the Percentage of Nonoverlapping Corrected Data and SLC. The performance of these techniques indicates that professionals' judgments concerning treatment effectiveness can be readily complemented by both visual and statistical analyses. A flowchart to guide selection of techniques according to the data characteristics identified by visual inspection is provided.
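The Nonoverlap of All Pairs index named above can be sketched as follows (hypothetical data, assuming an expected increase under treatment):

```python
def nap(baseline, treatment):
    """Nonoverlap of All Pairs: over every (baseline, treatment) pair,
    the share of pairs improving under treatment, ties counted as half."""
    better = ties = 0
    for a in baseline:
        for b in treatment:
            if b > a:
                better += 1
            elif b == a:
                ties += 1
    return (better + 0.5 * ties) / (len(baseline) * len(treatment))

# Hypothetical AB series with complete separation between phases:
# every treatment point exceeds every baseline point, so NAP is 1.0.
A = [2, 3, 2, 4]
B = [5, 6, 5, 7, 6]
score = nap(A, B)
```

Because NAP uses all pairs rather than a single extreme value, chance overlap of one outlier only lowers the score proportionally instead of collapsing it to zero.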
Resumo:
The present study focuses on single-case data analysis, and specifically on two procedures for quantifying differences between baseline and treatment measurements. The first technique tested is based on generalized least squares regression analysis and is compared to a proposed non-regression technique, which yields similar information. The comparison is carried out in the context of generated data representing a variety of patterns (i.e., independent measurements, different serial dependence underlying processes, constant or phase-specific autocorrelation and data variability, different types of trend, and slope and level change). The results suggest that the two techniques perform adequately for a wide range of conditions and researchers can use either with certain guarantees. The regression-based procedure offers more efficient estimates, whereas the proposed non-regression procedure is more sensitive to intervention effects. Considering current and previous findings, some tentative recommendations are offered to applied researchers to help them choose among the plurality of single-case data analysis techniques.
Resumo:
Lipoxygenases are non-heme iron enzymes essential in eukaryotes, where they catalyze the formation of the fatty acid hydroperoxides required by a large diversity of biological and pathological processes. In prokaryotes, most of which completely lack polyunsaturated fatty acids, the possible biological roles of lipoxygenases have remained obscure. In this study, we report the crystallization of a lipoxygenase from Pseudomonas aeruginosa (Pa_LOX), the first from a prokaryote. High-resolution data have been acquired, which are expected to yield structural clues to the questions addressed. In addition, a preliminary phylogenetic analysis using 14 sequences confirmed the existence of this subfamily of bacterial lipoxygenases and revealed a greater diversity than among their eukaryotic counterparts. Finally, an evolutionary study of bacterial lipoxygenases on the same set of sequences shows selection pressure of a basically purifying or neutral character, except for a single amino acid that appears to have been fixed after a positive selection event.
Resumo:
In this article the main possibilities of single-crystal and powder diffraction analysis using conventional laboratory X-ray sources are introduced. Several examples of applications with different solid samples and in different fields are shown, illustrating the multidisciplinary capabilities of both techniques.