56 results for data reduction by factor analysis


Relevance: 100.00%

Abstract:

Background and purpose: The paper reports a study of how teachers in secondary schools in the Gucha district of Kenya perceive their own effectiveness, the structure of these self-perceptions, variations in self-perceived effectiveness, and the relationship between self-perceived effectiveness and the examination performance of their students. Design and methods: Data were based on questionnaires completed by 109 English and mathematics teachers from a random sample of 30 schools in the Gucha district of Kenya. Pupil examination results were also collected from the schools. Results: Three dimensions of self-perceived effectiveness emerged from a factor analysis: pedagogic process, personal and affective aspects of teaching, and effectiveness with regard to pupil performance. Teachers tended to rate themselves relatively highly on the first two, process-oriented, dimensions but less highly on the third, outcome-oriented, dimension. Self-ratings for pupil outcomes correlated with pupil examination performance at school level. Conclusions: The results show that these teachers can have a sense of themselves as competent classroom performers and educational professionals without necessarily having a strong sense of efficacy with regard to pupil outcomes.
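The data-reduction step reported here is an exploratory factor analysis of questionnaire ratings. Below is a minimal, hypothetical sketch of that step in Python; the item count, the three-factor choice and the random placeholder ratings are assumptions for illustration, not the study's data or code.

```python
# Hypothetical sketch of an exploratory factor analysis on questionnaire data.
# The 109 x 12 rating matrix below is random placeholder data.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
ratings = rng.normal(size=(109, 12))      # stand-in for teacher self-rating items

fa = FactorAnalysis(n_components=3, random_state=0)
scores = fa.fit_transform(ratings)        # factor scores, one row per teacher
loadings = fa.components_.T               # item loadings on each of the 3 factors

# Items that load strongly on the same factor are read together as one
# dimension (e.g. "pedagogic process" versus "pupil performance").
print(loadings.round(2))
```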

Relevance: 100.00%

Abstract:

The role of ribosome modulation factor (RMF) in protecting heat-stressed Escherichia coli cells was identified by the observation that cultures of a mutant strain lacking functional RMF (HMY15) were highly heat sensitive in stationary phase compared to those of the parent strain (W3110). No difference in heat sensitivity was observed between these strains in exponential phase, during which RMF is not synthesised. Studies by differential scanning calorimetry demonstrated that the ribosomes of stationary-phase cultures of the mutant strain had lower thermal stability than those of the parent strain in stationary phase or than exponential-phase ribosomes. More rapid breakdown of ribosomes in the mutant strain during heating was confirmed by rRNA analysis and sucrose density gradient centrifugation. Analyses of ribosome composition showed that the 100S dimers dissociated more rapidly during heating than 70S particles. While ribosome dimerisation is a consequence of the conformational changes caused by RMF binding, it may therefore not be essential for RMF-mediated ribosome stabilisation.

Relevance: 100.00%

Abstract:

Introduction A high saturated fatty acid intake is a well recognized risk factor for coronary heart disease development. More recently a high intake of n-6 polyunsaturated fatty acids (PUFA) in combination with a low intake of the long-chain n-3 PUFA, eicosapentaenoic acid and docosahexaenoic acid, has also been implicated as an important risk factor. Aim To compare total dietary fat and fatty acid intake measured by chemical analysis of duplicate diets with nutritional database analysis of estimated dietary records, collected over the same 3-day study period. Methods Total fat was analysed using Soxhlet extraction and subsequently the individual fatty acid content of the diet was determined by gas chromatography. Estimated dietary records were analysed using a nutrient database which was supplemented with a selection of dishes commonly consumed by study participants. Results Bland & Altman statistical analysis demonstrated a lack of agreement between the two dietary assessment techniques for determining dietary fat and fatty acid intake. Conclusion The lack of agreement observed between dietary evaluation techniques may be attributed to inadequacies in either or both assessment techniques. This study highlights the difficulties that may be encountered when attempting to accurately evaluate dietary fat intake among the population.
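The Bland & Altman comparison mentioned in the results reduces to the mean difference between paired measurements (the bias) and the 95% limits of agreement. A small sketch with invented intake values, purely to show the arithmetic rather than the study's data:

```python
# Bland-Altman agreement statistics for two methods measuring the same quantity.
# The fat-intake values (g/day) below are invented placeholders.
import numpy as np

chemical = np.array([78.0, 65.2, 90.1, 55.4, 72.3, 81.9])   # duplicate-diet analysis
database = np.array([70.5, 65.0, 96.2, 60.1, 68.8, 75.0])   # dietary-record database

diff = chemical - database
bias = diff.mean()                        # mean difference between the methods
half_width = 1.96 * diff.std(ddof=1)      # half-width of the 95% limits of agreement

print(f"bias = {bias:.1f} g/day, "
      f"limits of agreement = {bias - half_width:.1f} to {bias + half_width:.1f} g/day")
```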

Relevance: 100.00%

Abstract:

We previously reported sequence determination of neutral oligosaccharides by negative ion electrospray tandem mass spectrometry on a quadrupole-orthogonal time-of-flight instrument with high sensitivity and without the need for derivatization. In the present report, we extend our strategies to sialylated oligosaccharides for analysis of chain and blood group types together with branching patterns. A main feature of the negative ion mass spectrometry approach is the unique double glycosidic cleavage induced by 3-glycosidic substitution, producing characteristic D-type fragments which can be used to distinguish the type 1 and type 2 chains, the blood group related Lewis determinants and 3,6-disubstituted core branching patterns, and to assign the structural details of each of the branches. Twenty mono- and disialylated linear and branched oligosaccharides were used for the investigation, and the sensitivity achieved is in the femtomole range. To demonstrate the efficacy of the strategy, we have determined a novel complex disialylated and monofucosylated tridecasaccharide that is based on the lacto-N-decaose core. The structure and sequence assignment was corroborated by methylation analysis and H-1 NMR spectroscopy.

Relevance: 100.00%

Abstract:

The objective of this study was to determine insight in patients with Huntington's disease (HD) by contrasting patients' ability to rate their own behavior with their ability to rate a person other than themselves. HD patients and carers completed the Dysexecutive Questionnaire (DEX), rating themselves and each other at two time points. The temporal stability of these ratings was initially examined using these two time points, since there is no published test-retest reliability of the DEX with this population to date. This was followed by a comparison of patients' self-ratings and carers' independent ratings of patients by performing correlations with patients' disease variables, and an exploratory factor analysis was conducted on both sets of ratings. The DEX showed good test-retest reliability, with patients consistently and persistently underestimating the degree of their dysexecutive behavior, but not that of their carers. Patients' self-ratings and carers' ratings of patients both showed that dysexecutive behavior in HD can be fractionated into three underlying components (Cognition, Self-regulation, Insight), and the relative ranking of these factors was similar for both data sets. HD patients consistently underestimated the extent of only their own dysexecutive behaviors relative to carers' ratings by 26%, but were similar in ascribing ranks to the components of dysexecutive behavior. (c) 2005 Movement Disorder Society.

Relevance: 100.00%

Abstract:

The factor structure of the Edinburgh Postnatal Depression Scale (EPDS) and similar instruments has received little attention in the literature. The researchers set out to investigate the construct validity and reliability of the EPDS amongst impoverished South African women. The EPDS was translated into isiXhosa (using Brislin's back translation method) and administered by trained interviewers to 147 women in Khayelitsha, South Africa. Responses were subjected to maximum likelihood confirmatory factor analysis. A single factor structure was found, consistent with the theory on which the EPDS was based. Internal consistency was satisfactory (α = 0.89).
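The internal-consistency figure quoted above is Cronbach's alpha. A minimal sketch of the calculation on a placeholder response matrix (random data, not the isiXhosa EPDS responses):

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

rng = np.random.default_rng(1)
responses = rng.integers(0, 4, size=(147, 10)).astype(float)   # EPDS items are scored 0-3
print(f"alpha = {cronbach_alpha(responses):.2f}")
```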

Relevance: 100.00%

Abstract:

The Earth-directed coronal mass ejection (CME) of 8 April 2010 provided an opportunity for space weather predictions from both established and developmental techniques to be made from near-real-time data received from the SOHO and STEREO spacecraft; the STEREO spacecraft provide a unique view of Earth-directed events from outside the Sun-Earth line. Although the near-real-time data transmitted by the STEREO Space Weather Beacon are significantly poorer in quality than the subsequently downlinked science data, the use of these data has the advantage that near-real-time analysis is possible, allowing actual forecasts to be made. The fact that such forecasts cannot be biased by any prior knowledge of the actual arrival time at Earth provides an opportunity for an unbiased comparison between several established and developmental forecasting techniques. We conclude that for forecasts based on the STEREO coronagraph data, it is important to take account of the subsequent acceleration/deceleration of each CME through interaction with the solar wind, while predictions based on measurements of CMEs made by the STEREO Heliospheric Imagers would benefit from higher temporal and spatial resolution. Space weather forecasting tools must work with near-real-time data; such data, when provided by science missions, are usually highly compressed and/or reduced in temporal/spatial resolution and may also have significant gaps in coverage, making such forecasts more challenging.
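As a rough illustration of why the acceleration/deceleration correction matters, the back-of-envelope sketch below contrasts a purely ballistic Sun-to-Earth transit time with one in which the CME speed is crudely relaxed toward a nominal solar wind speed. The speeds and blending weight are illustrative assumptions, not any of the forecasting techniques compared in the paper.

```python
# Illustrative transit-time estimates for a CME launched at a given speed.
AU_KM = 1.496e8                     # mean Sun-Earth distance in km

def ballistic_hours(v_cme_kms: float) -> float:
    """Constant-speed transit time."""
    return AU_KM / v_cme_kms / 3600.0

def relaxed_hours(v_cme_kms: float, v_sw_kms: float = 400.0, weight: float = 0.5) -> float:
    """Crude stand-in for drag: average the CME speed toward the solar wind speed."""
    v_eff = weight * v_cme_kms + (1.0 - weight) * v_sw_kms
    return AU_KM / v_eff / 3600.0

print(f"ballistic: {ballistic_hours(800.0):.0f} h, relaxed: {relaxed_hours(800.0):.0f} h")
```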

Relevance: 100.00%

Abstract:

This paper uses data provided by three major real estate advisory firms to investigate the level and pattern of variation in the measurement of historic real estate rental values for the main European office centres. The paper assesses the extent to which the data-providing organizations agree on historic market performance in terms of returns, risk and timing, and examines the relationship between market maturity and agreement. The analysis suggests that at the aggregate level and for many markets, there is substantial agreement on direction, quantity and timing of market change. However, there is substantial variability in the level of agreement among cities. The paper also assesses whether the different data sets produce different explanatory models and market forecasts. It is concluded that, although disagreement on the direction of market change is high for many markets, the different data sets often produce similar explanatory models and predict similar relative performance.

Relevance: 100.00%

Abstract:

The variety and quality of the tenant mix within a shopping centre is a key concern in shopping centre management. Tenant mix determines the extent of externalities between outlets in the centre, helps establish the image of the centre and, as a result, determines the attractiveness of the centre for consumers. This then translates into sales and rents. However, the management of tenant mix has largely been based on perceived “optimum” arrangements and industry rules of thumb. This paper attempts to model the impact of tenant mix on the rent paid by retailers in larger UK shopping centres and, hence, the returns made by shopping centre landlords. It extends work on shopping centre rent determination (see Working Paper 10/03) utilising a database of 148 regional shopping centres in the UK, with detailed data for over 1900 tenants. Econometric models test the relationship between rental levels and the levels of retail concentration and diversity, while controlling for a range of continuous and qualitative characteristics of each tenant, each retail product, and each shopping centre. Factor analysis is then used to extract the core retail and service categories from the tenant lists of the 148 shopping centres. The factor scores from these core retailer factors are then tested against rent payable. The results from the empirical analysis allow us to generate some clear analytical and empirical implications for optimal retail management.
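A compact sketch of the two-stage approach described above, with synthetic placeholder data rather than the 148-centre database: factor analysis first condenses the tenant lists into a few composition factors, and the factor scores are then regressed against rent alongside a control variable.

```python
# Stage 1: factor-analyse tenant-mix counts; Stage 2: regress rent on the factor scores.
# All numbers are synthetic placeholders for illustration.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
tenant_counts = rng.poisson(5, size=(148, 20)).astype(float)   # centres x retail categories

fa = FactorAnalysis(n_components=4, random_state=0)
factor_scores = fa.fit_transform(tenant_counts)                # composition factors per centre

floorspace = rng.normal(50.0, 10.0, size=148)                  # a simple control variable
rent = 100.0 + 2.0 * factor_scores[:, 0] + 0.5 * floorspace + rng.normal(size=148)

X = np.column_stack([factor_scores, floorspace])
model = LinearRegression().fit(X, rent)
print(model.coef_.round(2))          # coefficients on the four factors and the control
```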

Relevance: 100.00%

Abstract:

A statistical technique for fault analysis in industrial printing is reported. The method specifically deals with binary data, for which the results of the production process fall into two categories, rejected or accepted. The method is referred to as logistic regression, and is capable of predicting future fault occurrences by the analysis of current measurements from machine parts sensors. Individual analysis of each type of fault can determine which parts of the plant have a significant influence on the occurrence of such faults; it is also possible to infer which measurable process parameters have no significant influence on the generation of these faults. Information derived from the analysis can be helpful in the operator's interpretation of the current state of the plant. Appropriate actions may then be taken to prevent potential faults from occurring. The algorithm is being implemented as part of an applied self-learning expert system.
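A hedged sketch of the kind of model the abstract describes: a logistic regression relating sensor readings to an accept/reject outcome, with the coefficients indicating which measured parameters influence the fault odds. The features and data are invented for illustration, not the plant's actual sensors.

```python
# Logistic regression on binary accept/reject outcomes from sensor measurements.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 4))        # e.g. temperature, speed, pressure, web tension
# Synthetic ground truth: only the 1st and 3rd channels actually drive rejects.
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = LogisticRegression().fit(X, y)
print("coefficients:", model.coef_.round(2))          # near-zero => little influence
print("P(reject):", model.predict_proba([[0.2, -1.0, 0.5, 0.0]])[0, 1].round(3))
```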

Relevance: 100.00%

Abstract:

Practical applications of portfolio optimisation tend to proceed on a “top down” basis where funds are allocated first at asset class level (between, say, bonds, cash, equities and real estate) and then, progressively, at sub-class level (within property to sectors, office, retail, industrial for example). While there are organisational benefits from such an approach, it can potentially lead to sub-optimal allocations when compared to a “global” or “side-by-side” optimisation. This will occur where there are correlations between sub-classes across the asset divide that are masked in aggregation – between, for instance, City offices and the performance of financial services stocks. This paper explores such sub-class linkages using UK monthly stock and property data. Exploratory analysis using clustering procedures and factor analysis suggests that property performance and equity performance are distinctive: there is little persuasive evidence of contemporaneous or lagged sub-class linkages. Formal tests of the equivalence of optimised portfolios using top-down and global approaches failed to demonstrate significant differences, whether or not allocations were constrained. While the results may be a function of measurement of market returns, it is those returns that are used to assess fund performance. Accordingly, the treatment of real estate as a distinct asset class with diversification potential seems justified.
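To make the top-down versus global contrast concrete, here is a small sketch with an invented covariance matrix for two equity sub-classes and two property sub-classes: a global minimum-variance optimisation uses all cross-correlations at once, whereas a top-down allocation optimises within each asset class and then splits between classes, masking the cross-asset terms. The numbers are illustrative, not the UK monthly data used in the paper.

```python
# Global (side-by-side) versus top-down minimum-variance allocation.
import numpy as np

def min_var_weights(cov: np.ndarray) -> np.ndarray:
    """Unconstrained minimum-variance weights, normalised to sum to one."""
    w = np.linalg.solve(cov, np.ones(cov.shape[0]))
    return w / w.sum()

# Sub-classes: [financial equities, industrial equities, City offices, retail property]
cov = np.array([
    [0.040, 0.020, 0.012, 0.006],
    [0.020, 0.035, 0.006, 0.005],
    [0.012, 0.006, 0.025, 0.015],
    [0.006, 0.005, 0.015, 0.020],
])

print("global  :", min_var_weights(cov).round(2))

# Top-down: optimise within equities and within property, then split 50/50
# between the two asset classes, ignoring the cross-asset correlations.
top_down = np.concatenate([0.5 * min_var_weights(cov[:2, :2]),
                           0.5 * min_var_weights(cov[2:, 2:])])
print("top-down:", top_down.round(2))
```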

Relevance: 100.00%

Abstract:

Measuring the retention, or residence time, of dosage forms to biological tissue is commonly a qualitative measurement, where no real values to describe the retention can be recorded. The result of this is an assessment that is dependent upon a user's interpretation of visual observation. This research paper outlines the development of a methodology to quantitatively measure, both by image analysis and by spectrophotometric techniques, the retention of material to biological tissues, using the retention of polymer solutions to ocular tissue as an example. Both methods have been shown to be repeatable, with the spectrophotometric measurement generating data reliably and quickly for further analysis.
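For the spectrophotometric arm of such a method, quantitation typically comes down to a linear (Beer-Lambert style) calibration of absorbance against known concentrations, which is then inverted for the material washed off the tissue. The sketch below is a generic calibration-curve calculation with made-up values, not the authors' protocol.

```python
# Linear calibration curve: absorbance vs. known polymer concentration,
# then back-calculate the concentration recovered from the tissue sample.
import numpy as np

known_conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])       # mg/mL standards (invented)
absorbance = np.array([0.01, 0.11, 0.20, 0.41, 0.79])  # readings at a fixed wavelength

slope, intercept = np.polyfit(known_conc, absorbance, 1)

sample_abs = 0.33                                       # absorbance of the eluted sample
sample_conc = (sample_abs - intercept) / slope
print(f"retained polymer concentration ~= {sample_conc:.2f} mg/mL")
```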

Relevance: 100.00%

Abstract:

OBJECTIVES: The prediction of protein structure and the precise understanding of protein folding and unfolding processes remain one of the greatest challenges in structural biology and bioinformatics. Computer simulations based on molecular dynamics (MD) are at the forefront of the effort to gain a deeper understanding of these complex processes. Currently, these MD simulations are usually on the order of tens of nanoseconds, generate a large amount of conformational data and are computationally expensive. More and more groups run such simulations and generate a myriad of data, which raises new challenges in managing and analyzing these data. Because of the vast range of proteins researchers want to study and simulate, the computational effort needed to generate data, the large data volumes involved, and the different types of analyses scientists need to perform, it is desirable to provide a public repository allowing researchers to pool and share protein unfolding data. METHODS: To adequately organize, manage, and analyze the data generated by unfolding simulation studies, we designed a data warehouse system that is embedded in a grid environment to facilitate the seamless sharing of available computer resources and thus enable many groups to share complex molecular dynamics simulations on a more regular basis. RESULTS: To gain insight into the conformational fluctuations and stability of the monomeric forms of the amyloidogenic protein transthyretin (TTR), molecular dynamics unfolding simulations of the monomer of human TTR have been conducted. Trajectory data and meta-data of the wild-type (WT) protein and the highly amyloidogenic variant L55P-TTR represent the test case for the data warehouse. CONCLUSIONS: Web and grid services, especially pre-defined data mining services that can run on or 'near' the data repository of the data warehouse, are likely to play a pivotal role in the analysis of molecular dynamics unfolding data.

Relevance: 100.00%

Abstract:

Advances in hardware and software in the past decade make it possible to capture, record and process fast data streams at a large scale. The research area of data stream mining has emerged as a consequence of these advances in order to cope with the real-time analysis of potentially large and changing data streams. Examples of data streams include Google searches, credit card transactions, telemetric data and data of continuous chemical production processes. In some cases the data can be processed in batches by traditional data mining approaches. However, in some applications it is required to analyse the data in real time as soon as it is captured. Such cases arise, for example, if the data stream is infinite, fast changing, or simply too large in size to be stored. One of the most important data mining techniques on data streams is classification. This involves training the classifier on the data stream in real time and adapting it to concept drifts. Most data stream classifiers are based on decision trees. However, it is well known in the data mining community that there is no single optimal algorithm: an algorithm may work well on one or several datasets but badly on others. This paper introduces eRules, a new rule-based adaptive classifier for data streams, based on an evolving set of Rules. eRules induces a set of rules that is constantly evaluated and adapted to changes in the data stream by adding new and removing old rules. It differs from the more popular decision tree based classifiers in that it tends to leave data instances unclassified rather than forcing a classification that could be wrong. The ongoing development of eRules aims to improve its accuracy further through dynamic parameter setting, which will also address the problem of changing feature domain values.
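The abstract does not give the eRules algorithm itself, but the evolving-rule idea it describes can be illustrated with a toy classifier: rules cover simple numeric intervals, instances that no rule covers are left unclassified rather than forced into a class, rules whose error rate drifts too high are removed, and new rules are learned from uncovered or misclassified instances. Everything below is an illustrative assumption, not the published implementation.

```python
# Toy evolving-rule classifier for a numeric data stream (illustrative only).
from dataclasses import dataclass

@dataclass
class Rule:
    feature: int        # index of the attribute the rule tests
    low: float          # lower bound of the covered interval
    high: float         # upper bound of the covered interval
    label: int          # class predicted when the rule fires
    hits: int = 0
    errors: int = 0

    def covers(self, x) -> bool:
        return self.low <= x[self.feature] <= self.high

class EvolvingRuleSet:
    def __init__(self, max_error_rate: float = 0.3):
        self.rules: list[Rule] = []
        self.max_error_rate = max_error_rate

    def predict(self, x):
        for r in self.rules:
            if r.covers(x):
                return r.label
        return None                       # abstain instead of forcing a guess

    def update(self, x, y: int) -> None:
        pred = self.predict(x)
        for r in self.rules:
            if r.covers(x):
                r.hits += 1
                if r.label != y:
                    r.errors += 1
        # remove rules whose recent performance suggests concept drift
        self.rules = [r for r in self.rules
                      if r.hits < 10 or r.errors / r.hits <= self.max_error_rate]
        # add a crude new rule when the instance was uncovered or misclassified
        if pred is None or pred != y:
            self.rules.append(Rule(feature=0, low=x[0] - 0.5, high=x[0] + 0.5, label=y))
```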

Relevance: 100.00%

Abstract:

With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis in meteorology, means the process of interpolating meteorological observations from unevenly distributed locations to a network of regularly spaced grid points. Necessitated by the requirement of numerical weather prediction models to solve the governing finite difference equations on such a grid lattice, objective analysis is a three-dimensional (or mostly two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with separated data-sparse and data-dense areas, four-dimensional analysis has in fact been intensively used for many years. Weather services have thus based their analysis not only on synoptic data at the time of the analysis and climatology, but also on the fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation. From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified also for the conventional observations. We have fairly good coverage of surface observations 8 times a day, and several upper air stations are making radiosonde and radiowind observations 4 times a day. If we have a 3-hour step in the analysis-forecasting cycle instead of the 12 hours that is applied most often, we may without any difficulty treat all observations as synoptic. No observation would thus be more than 90 minutes off time, and the observations, even during strong transient motion, would fall within a horizontal mesh of 500 km × 500 km.
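A minimal sketch of the interpolation step that the abstract calls objective analysis: scattered observations are spread onto a regular grid with a simple Cressman-style distance weighting. Real assimilation schemes also blend in the background field predicted from the previous cycle; the observation values, locations and influence radius here are invented for illustration.

```python
# Cressman-style objective analysis: weight irregular observations onto a grid.
import numpy as np

def cressman(obs_xy, obs_val, grid_x, grid_y, radius=500.0):
    gx, gy = np.meshgrid(grid_x, grid_y)
    analysis = np.full(gx.shape, np.nan)
    for idx in np.ndindex(gx.shape):
        d2 = (obs_xy[:, 0] - gx[idx]) ** 2 + (obs_xy[:, 1] - gy[idx]) ** 2
        w = np.clip((radius**2 - d2) / (radius**2 + d2), 0.0, None)  # zero beyond the radius
        if w.sum() > 0:
            analysis[idx] = np.sum(w * obs_val) / w.sum()
    return analysis

# Three observations at irregular locations (km) analysed onto a 500 km mesh.
obs_xy = np.array([[120.0, 80.0], [700.0, 400.0], [300.0, 900.0]])
obs_val = np.array([1013.0, 998.0, 1005.0])                 # e.g. surface pressure (hPa)
grid = np.arange(0.0, 1501.0, 500.0)
print(cressman(obs_xy, obs_val, grid, grid).round(1))
```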