980 results for RBCL SEQUENCE ANALYSES


Relevance: 20.00%

Publisher:

Abstract:

The Teacher Reporting Attitude Scale (TRAS) is a newly developed tool to assess teachers’ attitudes toward reporting child abuse and neglect. This article reports on an investigation of the factor structure and psychometric properties of the short-form Malay version of the TRAS. A self-report cross-sectional survey was conducted with 667 teachers in 14 randomly selected schools in Selangor state, Malaysia. Analyses were conducted in a three-stage process, using confirmatory factor analysis (stages 1 and 3) and exploratory factor analysis (stage 2) to test, modify, and confirm the underlying factor structure of the TRAS in a non-Western teacher sample. Confirmatory factor analysis did not support the three-factor model previously reported in the original TRAS study. Exploratory factor analysis revealed an 8-item, four-factor structure. Further confirmatory factor analysis demonstrated the appropriateness of the four-factor structure. Reliability estimates for the four factors (commitment, value, concern, and confidence) were moderate. The modified short-form TRAS (Malay version) has the potential to be used as a simple tool for relatively quick assessment of teachers’ attitudes toward reporting child abuse and neglect. Cross-cultural differences in attitudes toward reporting may exist, and the transferability of newly developed instruments to other populations should be evaluated.
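The staged factor-analytic workflow described above can be made concrete with a short script. The sketch below runs an exploratory factor analysis (roughly stage 2 of the process) on a hypothetical item-response table using the Python factor_analyzer package; the file name, the item columns and the four-factor solution are illustrative assumptions, not the authors' data or code.

```python
# Illustrative sketch only: exploratory factor analysis of Likert-type items,
# analogous to stage 2 of the TRAS validation described above.
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical wide-format file: one row per teacher, one column per TRAS item.
items = pd.read_csv("tras_items.csv")

# Extract four factors with an oblique rotation (factors may correlate).
efa = FactorAnalyzer(n_factors=4, rotation="oblimin")
efa.fit(items)

loadings = pd.DataFrame(efa.loadings_, index=items.columns)
print(loadings.round(2))          # pattern matrix: item loadings on each factor
print(efa.get_factor_variance())  # variance explained by each factor
```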

Relevance: 20.00%

Publisher:

Abstract:

The Analytical Electron Microscope (AEM), which measures secondary X-ray emission from thin (<150 nm), electron-transparent material, has rapidly become a versatile instrument for qualitative and quantitative elemental analyses of many materials, including minerals. With due regard for sources of error in experimental procedures, it is possible to obtain high spatial resolution (~20 nm diameter) and precise elemental analyses (~3% to 5% relative) from many silicate minerals. In addition, by utilizing the orientational dependence of X-ray emission for certain multi-substituted crystal structures, site occupancies for individual elements within a unit cell can be determined, though with lower spatial resolution. The relative ease with which many of these compositional data may be obtained depends in part on the nature of the sample but is, in general, comparable to other solid-state analytical techniques such as X-ray diffraction and electron microprobe analysis. However, the improvement in spatial resolution obtained with the AEM (up to two orders of magnitude in analysis diameter) significantly enhances interpretation of fine-grained assemblages in many terrestrial and extraterrestrial rocks.
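Quantitative thin-film analysis of this kind is commonly based on the Cliff-Lorimer ratio method, in which measured characteristic X-ray intensities are converted to concentration ratios using experimentally determined sensitivity (k-) factors. The relation below is the standard textbook form, shown only as background; the abstract does not state which quantification scheme was used.

```latex
\frac{C_A}{C_B} = k_{AB}\,\frac{I_A}{I_B},
\qquad \sum_i C_i = 1
```

Here C_A and C_B are the weight fractions of elements A and B, I_A and I_B are the measured characteristic X-ray intensities, and k_{AB} is the sensitivity (k-) factor determined from thin-film standards of known composition.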

Relevance: 20.00%

Publisher:

Abstract:

Collections of solid particles from the Earth's stratosphere by high-flying aircraft have been reported since 1965, with the initial primary objective of understanding the nature of the aerosol layer that occurs in the lower stratosphere. With the advent of efficient collection procedures and sophisticated electron- and ion-beam techniques, the primary aim of current stratospheric collections has been to study specific particle types that are extraterrestrial in origin and have survived atmospheric entry processes. The collection program provided by NASA at Johnson Space Center (JSC) has conducted many flights over the past 4 years and retrieved a total of 99 collection surfaces (flags) suitable for detailed study. Most of these collections are part of dedicated flights and have occurred during volcanically quiescent periods, although solid particles from the El Chichon eruptions have also been collected. Over 800 individual particles (or representative samples from larger aggregates) have been picked from these flags, examined in a preliminary fashion by SEM and EDS, and cataloged in a manner suitable for selection and study by the wider scientific community.

Relevance: 20.00%

Publisher:

Abstract:

Preliminary data are presented on a detailed statistical analysis of k-factor determination for a single class of minerals (amphiboles) which contains a wide range of element concentrations. These amphiboles are homogeneous, contain few (if any) subsolidus microstructures, and can be readily prepared for thin-film analysis. In previous studies, element loss during the period of irradiation has been assumed negligible for the determination of k-factors. Since this phenomenon may be significant for certain mineral systems, we also report on the effect of temperature on k-factor determination for various elements using small probe sizes (approximately 20 nm).
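As a concrete illustration of what a k-factor determination involves (not the authors' procedure), the sketch below derives k-factors relative to Si from a thin-film standard of known composition using the Cliff-Lorimer relation; the composition and intensity values are invented placeholders.

```python
# Illustrative sketch: Cliff-Lorimer k-factor determination from a thin-film
# standard of known composition. All numbers below are invented placeholders.

# Known weight fractions of the standard (hypothetical amphibole-like values).
composition = {"Si": 0.25, "Mg": 0.12, "Fe": 0.10, "Ca": 0.08}

# Background-subtracted characteristic X-ray intensities from the standard.
intensities = {"Si": 52000.0, "Mg": 21000.0, "Fe": 9500.0, "Ca": 8200.0}

def k_factor(element: str, reference: str = "Si") -> float:
    """k such that C_el / C_ref = k * (I_el / I_ref)."""
    conc_ratio = composition[element] / composition[reference]
    intensity_ratio = intensities[element] / intensities[reference]
    return conc_ratio / intensity_ratio

for el in ("Mg", "Fe", "Ca"):
    print(f"k_{el},Si = {k_factor(el):.3f}")
```

In practice, element loss under the beam (the effect studied here) would make the measured intensities, and hence the derived k-factors, depend on the irradiation conditions.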

Relevance: 20.00%

Publisher:

Abstract:

Accelerating a project can be rewarding. The consequences, however, can be troublesome if productivity and quality are sacrificed for the sake of remaining ahead of schedule, such that the actual schedule benefits are often barely worth the effort. The tradeoffs associated with schedule pressure, and its causes and effects, are often overlooked when scheduling decisions are made. This paper analyses the effects that schedule pressure has on construction performance and focuses on the tradeoffs involved in scheduling. A research framework was developed using a causal diagram to illustrate the cause-and-effect analysis of schedule pressure. An empirical investigation was performed using survey data collected from 102 construction practitioners working on 38 construction sites in Singapore. The results of the survey analysis indicate that the advantages of increasing the pace of work by working under schedule pressure can be offset by losses in productivity and quality. The negative effects of schedule pressure arise mainly from working out of sequence, generating work defects, cutting corners, and losing the motivation to work. These adverse effects can be minimized by scheduling construction activities realistically, planning them proactively, motivating workers, and establishing an effective project coordination and communication mechanism.

Relevance: 20.00%

Publisher:

Abstract:

Historical information can be used, in addition to pedigree, traits and genotypes, to map quantitative trait loci (QTL) in general populations via maximum likelihood estimation of variance components. This analysis is known as linkage disequilibrium (LD) and linkage mapping, because it exploits both linkage within families and LD at the population level. The search for QTL in the wild population of Soay sheep on St. Kilda serves as a proof of principle. We analysed the data from a previous study and confirmed some of the QTLs reported. The most striking result was the confirmation of a QTL affecting birth weight that had previously been detected with association tests but not with linkage-based analyses. Copyright © Cambridge University Press 2010.
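In outline, LD-plus-linkage (LDLA) mapping of this kind fits a mixed linear model with a variance component for the putative QTL; the form below is the standard textbook formulation, shown as background rather than the exact model fitted in the study.

```latex
\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \mathbf{Z}\mathbf{q} + \mathbf{Z}\mathbf{u} + \mathbf{e},
\quad
\mathbf{q} \sim N(\mathbf{0}, \mathbf{G}_{\mathrm{IBD}}\sigma^{2}_{q}),\;
\mathbf{u} \sim N(\mathbf{0}, \mathbf{A}\sigma^{2}_{a}),\;
\mathbf{e} \sim N(\mathbf{0}, \mathbf{I}\sigma^{2}_{e})
```

Here G_IBD holds identity-by-descent probabilities at the tested position, estimated from linkage within families and, through historical recombination, from population-level LD; A is the pedigree relationship matrix. The variance components are estimated by (restricted) maximum likelihood, and a QTL is supported where the likelihood favours sigma^2_q > 0.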

Relevance: 20.00%

Publisher:

Abstract:

Motivation: Unravelling the genetic architecture of complex traits requires large amounts of data, sophisticated models and large computational resources. The lack of user-friendly software incorporating all these requisites is delaying progress in the analysis of complex traits. Methods: Linkage disequilibrium and linkage analysis (LDLA) is a high-resolution gene-mapping approach based on sophisticated mixed linear models that is applicable to any population structure. LDLA can use population history information, in addition to pedigree and molecular markers, to decompose traits into genetic components. Analyses are distributed in parallel over a large public grid of computers in the UK. Results: We have demonstrated the performance of LDLA with analyses of simulated data. There are real gains in statistical power to detect quantitative trait loci when historical information is used, compared with traditional linkage analysis. Moreover, the use of a grid of computers significantly increases computational speed, allowing analyses that would have been prohibitive on a single computer. © The Author 2009. Published by Oxford University Press. All rights reserved.
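The speed-up comes from the fact that the likelihood evaluation at each tested map position is independent of the others, so positions can be farmed out to separate processors or grid nodes. The sketch below illustrates that idea on a single multi-core machine using Python's multiprocessing module; fit_position is a hypothetical stand-in for the per-position variance-component fit, which the real LDLA software performs with mixed-model machinery not shown here.

```python
# Illustrative sketch: distributing independent per-position QTL analyses
# across worker processes. fit_position is a hypothetical stand-in for the
# REML variance-component fit carried out at each tested map position.
from multiprocessing import Pool

def fit_position(position_cm):
    """Placeholder analysis: return (position, likelihood-ratio statistic)."""
    # A real LDLA run would build the IBD matrix at this position and
    # maximise the mixed-model likelihood; here we just return a dummy value.
    lrt = 0.0
    return (position_cm, lrt)

if __name__ == "__main__":
    positions = [p * 0.5 for p in range(200)]   # every 0.5 cM along one chromosome
    with Pool(processes=8) as pool:
        results = pool.map(fit_position, positions)
    best = max(results, key=lambda r: r[1])
    print(f"Best supported position: {best[0]} cM (LRT = {best[1]:.2f})")
```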

Relevance: 20.00%

Publisher:

Abstract:

There is an important tradition of content analyses of aggression in sexually explicit material. The majority of these analyses use a definition of aggression that excludes consent. This article identifies three problems with that approach. First, it does not distinguish between aggression and some positive acts. Second, it excludes a key element of healthy sexuality. Third, it can lead to heteronormative definitions of healthy sexuality. It would be better for content analyses to use a definition of aggression, such as Baron and Richardson's (1994), that takes consent into account. A number of difficulties with attending to consent have been identified, but this article offers solutions to each of them.

Relevance: 20.00%

Publisher:

Abstract:

X-ray microtomography (micro-CT) with micron resolution enables new ways of characterizing microstructures and opens pathways for forward calculations of multiscale rock properties. A quantitative characterization of the microstructure is the first step in this challenge. We developed a new approach to extract scale-dependent characteristics of porosity, percolation, and anisotropic permeability from 3-D microstructural models of rocks. The Hoshen-Kopelman algorithm of percolation theory is employed for a standard percolation analysis. The anisotropy of permeability is calculated by means of the star volume distribution approach. The local porosity distribution and local percolation probability are obtained using local porosity theory. Additionally, the local anisotropy distribution is defined and analyzed through two empirical probability density functions, the isotropy index and the elongation index. For such high-resolution data sets, the CT images are typically on the order of gigabytes to tens of gigabytes, so an extremely large number of calculations is required. To address this memory and compute demand, parallelization with OpenMP was used to harness the shared-memory infrastructure of cache-coherent Non-Uniform Memory Access (ccNUMA) machines such as the iVEC SGI Altix 3700Bx2 supercomputer. We regard adequate visualization of the results as an important element of this first, pioneering study.
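To make the porosity and percolation step concrete, the sketch below labels the connected pore clusters of a segmented 3-D volume and checks whether any cluster spans the sample in the z direction. It uses scipy.ndimage's connected-component labelling as a stand-in for a Hoshen-Kopelman implementation, and the random test volume is a placeholder; it is not the authors' parallel code.

```python
# Illustrative sketch: porosity and z-direction percolation of a segmented
# micro-CT volume (True = pore). scipy's labelling stands in for a
# Hoshen-Kopelman cluster-labelling implementation.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
pore = rng.random((100, 100, 100)) < 0.35      # placeholder binary volume

porosity = pore.mean()

# Label connected pore clusters (6-connectivity in 3-D by default).
labels, n_clusters = ndimage.label(pore)

# A cluster percolates in z if the same label appears on both the top and
# bottom faces of the volume.
top = set(np.unique(labels[0])) - {0}
bottom = set(np.unique(labels[-1])) - {0}
percolates = bool(top & bottom)

print(f"porosity = {porosity:.3f}, clusters = {n_clusters}, "
      f"percolating in z: {percolates}")
```

Applied within measurement cells of fixed side length placed across the volume, the same quantities give the local porosity distribution and local percolation probability mentioned above.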

Relevance: 20.00%

Publisher:

Abstract:

Intra-host sequence data from RNA viruses have revealed the ubiquity of defective viruses in natural viral populations, sometimes at surprisingly high frequency. Although defective viruses have long been known to laboratory virologists, their relevance in clinical and epidemiological settings has not been established. The discovery of long-term transmission of a defective lineage of dengue virus type 1 (DENV-1) in Myanmar, first seen in 2001, raised important questions about the emergence of transmissible defective viruses and their role in viral epidemiology. By combining phylogenetic analyses and dynamical modelling, we investigate how evolutionary and ecological processes at the intra-host and inter-host scales shaped the emergence and spread of the defective DENV-1 lineage. We show that this lineage of defective viruses emerged between June 1998 and February 2001, and that the defective virus was transmitted primarily through co-transmission with the functional virus to uninfected individuals. We provide evidence that, surprisingly, this co-transmission route has a higher transmission potential than transmission of functional dengue viruses alone. Consequently, we predict that the defective lineage should increase overall incidence of dengue infection, which could account for the historically high dengue incidence reported in Myanmar in 2001-2002. Our results show the unappreciated potential for defective viruses to impact the epidemiology of human pathogens, possibly by modifying the virulence-transmissibility trade-off, or to emerge as circulating infections in their own right. They also demonstrate that interactions between viral variants, such as complementation, can open new pathways to viral emergence.

Relevance: 20.00%

Publisher:

Abstract:

Authenticated Encryption (AE) is the cryptographic process of providing simultaneous confidentiality and integrity protection to messages. This approach is more efficient than a two-step process in which confidentiality is provided by encrypting the message and integrity protection is provided in a separate pass by generating a Message Authentication Code (MAC). AE using symmetric ciphers can be provided either by stream ciphers with built-in authentication mechanisms or by block ciphers using appropriate modes of operation. However, stream ciphers have the potential for higher performance and a smaller footprint in hardware and/or software than block ciphers. This property makes stream ciphers suitable for resource-constrained environments, where storage and computational power are limited. There have been several recent stream cipher proposals that claim to provide AE. These ciphers can be analysed using existing techniques that consider confidentiality or integrity separately; however, there is currently no framework for the analysis of AE stream ciphers that considers these two properties simultaneously. This thesis introduces a novel framework for the analysis of AE using stream cipher algorithms, and analyses the mechanisms for providing confidentiality and for providing integrity in such algorithms. There is a greater emphasis on the analysis of the integrity mechanisms, as there is little in the public literature on this topic in the context of authenticated encryption. The thesis has four main contributions, as follows. The first contribution is the design of a framework that can be used to classify AE stream ciphers based on three characteristics. The first classification applies Bellare and Namprempre's work on the order in which the encryption and authentication processes take place. The second classification is based on the method used to accumulate the input message (either directly or indirectly) into the internal state of the cipher to generate a MAC. The third classification is based on whether the sequence used to provide encryption and authentication is generated using a single key and initialisation vector, or two keys and two initialisation vectors. The second contribution is the application of an existing algebraic method to analyse the confidentiality algorithms of two AE stream ciphers, namely SSS and ZUC. The algebraic method is based on treating the nonlinear filter (NLF) of these ciphers as a combiner with memory. This method enables us to construct equations for the NLF that relate the inputs, outputs and memory of the combiner to the output keystream. We show that both of these ciphers are secure against this type of algebraic attack. We conclude that using a key-dependent S-box in the NLF twice, and using two different S-boxes in the NLF of ZUC, prevents this type of algebraic attack. The third contribution is a new general matrix-based model for MAC generation in which the input message is injected directly into the internal state. This model describes the accumulation process when the input message is injected directly into the internal state of a nonlinear filter generator. We show that three recently proposed AE stream ciphers, namely SSS, NLSv2 and SOBER-128, can be considered as instances of this model. Our model is more general than previous investigations into direct injection. Possible forgery attacks against this model are investigated. It is shown that using a nonlinear filter in the message accumulation process prevents collision-based forgery attacks when either the input message or the initial state of the register is unknown. The last contribution is a new general matrix-based model for MAC generation in which the input message is injected indirectly into the internal state. This model uses the input message as a controller to accumulate a keystream sequence into an accumulation register. We show that three current AE stream ciphers, namely ZUC, Grain-128a and Sfinks, can be considered as instances of this model. We establish the conditions under which the model is susceptible to forgery and side-channel attacks.
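A toy illustration of the direct-injection idea analysed in the third contribution: message words are XORed into the internal state of a register as it is clocked, and a tag is produced by filtering the final state. The construction below is deliberately simplified and insecure; it is not SSS, NLSv2, SOBER-128 or any cipher examined in the thesis, only a minimal sketch of the accumulation pattern that the model describes.

```python
# Toy sketch of direct message injection into a register-based MAC accumulator.
# Deliberately simplified and insecure; it only illustrates the pattern.

WORD = 0xFFFFFFFF  # work with 32-bit words

def clock(state):
    """Shift the register and feed back a simple linear combination."""
    feedback = (state[0] ^ ((state[3] << 1) & WORD) ^ (state[7] >> 2)) & WORD
    state.pop(0)
    state.append(feedback)

def nonlinear_filter(state):
    """Toy nonlinear output function over a few register stages."""
    return ((state[1] + state[5]) ^ (state[2] & state[6])) & WORD

def toy_mac(key_state, message_words):
    state = list(key_state)                  # state initialised from key/IV (not shown)
    for word in message_words:
        state[0] = (state[0] ^ word) & WORD  # direct injection into the internal state
        clock(state)
    for _ in range(8):                       # blank finalisation clocks
        clock(state)
    return nonlinear_filter(state)

if __name__ == "__main__":
    key_state = [(0x01234567 * (i + 1)) & WORD for i in range(8)]
    print(hex(toy_mac(key_state, [0xDEADBEEF, 0x0BADF00D])))
```

In the indirect-injection model, by contrast, the message does not enter the register itself; it acts as a controller that selects keystream material to be accumulated into a separate accumulation register.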

Relevance: 20.00%

Publisher:

Abstract:

Originally developed in bioinformatics, sequence analysis is increasingly used in the social sciences for the study of life-course processes. The methodology generally employed consists of computing dissimilarities between the trajectories and, if typologies are sought, clustering the trajectories according to their similarities or dissimilarities. The choice of an appropriate dissimilarity measure is a major issue in sequence analysis for life sequences. Several dissimilarities are available in the literature, but none of them has become indisputable. In this paper, instead of settling on a single dissimilarity measure, we propose to use an optimal convex combination of different dissimilarities. The optimal weights are determined automatically by the clustering procedure and are defined with respect to the within-class variance.
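The combination idea can be illustrated with a small grid search: given two precomputed dissimilarity matrices for the same set of sequences, form D(w) = w*D1 + (1 - w)*D2, cluster on D(w), and keep the weight that minimises the within-cluster dissimilarity. This is a simplified stand-in for the paper's optimisation, using random placeholder matrices, a fixed number of clusters and a naive grid search rather than the automatic procedure described above.

```python
# Illustrative sketch: choosing the weight of a convex combination of two
# dissimilarity matrices by minimising within-cluster dissimilarity.
# Random matrices and a naive grid search stand in for the paper's method.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)

def random_dissimilarity(n):
    m = rng.random((n, n))
    m = (m + m.T) / 2
    np.fill_diagonal(m, 0.0)
    return m

def within_cluster_cost(d, labels):
    cost = 0.0
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        cost += d[np.ix_(idx, idx)].sum() / (2 * max(len(idx), 1))
    return cost

n_seq, n_clusters = 60, 4
d1, d2 = random_dissimilarity(n_seq), random_dissimilarity(n_seq)

best = None
for w in np.linspace(0.0, 1.0, 21):
    d = w * d1 + (1.0 - w) * d2
    tree = linkage(squareform(d, checks=False), method="average")
    labels = fcluster(tree, t=n_clusters, criterion="maxclust")
    cost = within_cluster_cost(d, labels)
    if best is None or cost < best[1]:
        best = (w, cost)

print(f"best weight w = {best[0]:.2f}, within-cluster cost = {best[1]:.3f}")
```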

Relevance: 20.00%

Publisher:

Abstract:

Background/aims: Remote monitoring for heart failure has been evaluated not only in a large number of randomised controlled trials but also in many systematic reviews and meta-analyses. The aim of this meta-review was to identify, appraise and synthesise existing systematic reviews that have evaluated the effects of remote monitoring in heart failure. Methods: Using a Cochrane methodology, we electronically searched all relevant online databases and search engines, performed a forward citation search and hand-searched bibliographies. Only fully published systematic reviews of invasive and/or non-invasive remote monitoring interventions were included. Two reviewers independently extracted data. Results: Sixty-five publications were identified from 3333 citations; seventeen fulfilled the inclusion and exclusion criteria. Quality varied, with A Measurement Tool to Assess Systematic Reviews (AMSTAR) scores ranging from 2 to 11 (mean 5.88). Seven reviews (41%) pooled results from individual studies for meta-analysis. Eight (47%) considered all non-invasive remote monitoring strategies, four (24%) focused specifically on telemonitoring, and four (24%) included studies investigating both non-invasive and invasive technologies. Population characteristics of the included studies were not reported consistently. Mortality and hospitalisations were the most frequently reported outcomes (12 reviews; 70%). Only five reviews (29%) reported healthcare costs and compliance. A high degree of heterogeneity was reported in many of the meta-analyses. Conclusions: High-quality reviews demonstrated improvements in mortality, quality of life, hospitalisations and healthcare costs, although these results should be considered in the context of two negative RCTs of remote monitoring for heart failure (TIM-HF and Tele-HF) published since the meta-analyses.

Relevance: 20.00%

Publisher:

Abstract:

There is a song at the beginning of the musical West Side Story in which the character Tony sings that “something’s coming, something good.” The song is an anthem of optimism, brimming with promise. This paper is about the long-held promise of information and communication technology (ICT) to transform teaching and learning, to modernise the learning environment of the classroom, and to create a new digital pedagogy. Much of our experience to date in the schooling sector, however, tells more of resistance and reaction than revolution: of more of the same but with a computer in the corner, and of ICT activities as unwelcome time-fillers or time-wasters. Recently, a group of pre-service teachers in a postgraduate primary education degree at an Australian university was introduced to learning objects in an ICT immersion program. Their analyses and related responses, as recorded in online journals, are interpreted here in terms of TPACK (Technological Pedagogical and Content Knowledge). Contrary to much contemporary observation, these students generally displayed high levels of competence and highly positive dispositions toward the integration of ICT in their future classrooms. In short, they displayed the same optimism and confidence as the fictional “Tony” in believing that something good was coming.

Relevance: 20.00%

Publisher:

Abstract:

The high risk of metabolic disease traits in Polynesians may be partly explained by an elevated prevalence of genetic variants involved in energy metabolism. The genetics of Polynesian populations has been shaped by island-hopping migration events which may have favoured thrifty genes. The aim of this study was to sequence the mitochondrial genome in a group of Maori in an effort to characterise genome variation in this Polynesian population for use in future disease association studies. We sequenced the complete mitochondrial genomes of 20 non-admixed Maori subjects using Affymetrix technology. DNA diversity analyses showed that the Maori group exhibited reduced mitochondrial genome diversity compared to other worldwide populations, which is consistent with historical bottleneck and founder effects. Global phylogenetic analysis positioned these Maori subjects specifically within mitochondrial haplogroup B4a1a1. Interestingly, we identified several novel variants that collectively form new and unique Maori motifs: B4a1a1c, B4a1a1a3 and B4a1a1a5. Compared to ancestral populations, we observed an increased frequency of non-synonymous coding variants in several mitochondrial genes in the Maori group, which may be a result of positive selection and/or genetic drift. In conclusion, this study reports the first complete mitochondrial genome sequence data for a Maori population. Overall, these new data reveal novel mitochondrial genome signatures in this Polynesian population and enhance the phylogenetic picture of maternal ancestry in Oceania. The increased frequency of several mitochondrial coding variants makes them good candidates for future studies aimed at assessing metabolic disease risk in Polynesian populations.
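As a concrete example of the kind of summary statistic used in such diversity comparisons, the sketch below computes nucleotide diversity (pi), the mean pairwise proportion of differing sites between aligned sequences. It is a generic calculation on short, made-up sequences, not the authors' pipeline.

```python
# Illustrative sketch: nucleotide diversity (pi) of aligned mtDNA sequences,
# i.e. the mean pairwise proportion of sites that differ.
# The sequences below are short, made-up examples.
from itertools import combinations

def pairwise_diff(a, b):
    """Proportion of comparable sites that differ, ignoring gaps and Ns."""
    used, diffs = 0, 0
    for x, y in zip(a, b):
        if x in "ACGT" and y in "ACGT":
            used += 1
            diffs += int(x != y)
    return diffs / used if used else 0.0

def nucleotide_diversity(seqs):
    pairs = list(combinations(seqs, 2))
    return sum(pairwise_diff(a, b) for a, b in pairs) / len(pairs)

if __name__ == "__main__":
    aligned = [
        "ACGTACGTACGTAAGT",
        "ACGTACCTACGTAAGT",
        "ACGTACGTACGTAAAT",
        "ACGTACGTNCGTAAGT",
    ]
    print(f"pi = {nucleotide_diversity(aligned):.4f}")
```

Reduced diversity of the kind reported above would appear as a lower pi in the Maori sample than in comparison populations.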