19 results for Subject analysis

in CentAUR: Central Archive, University of Reading - UK


Relevance: 30.00%

Abstract:

In this paper a robust method is developed for the analysis of data consisting of repeated binary observations taken at up to three fixed time points on each subject. The primary objective is to compare outcomes at the last time point, using earlier observations to predict this for subjects with incomplete records. A score test is derived. The method is developed for application to sequential clinical trials, as at interim analyses there will be many incomplete records occurring in non-informative patterns. Motivation for the methodology comes from experience with clinical trials in stroke and head injury, and data from one such trial is used to illustrate the approach. Extensions to more than three time points and to allow for stratification are discussed. Copyright © 2005 John Wiley & Sons, Ltd.

Relevance: 30.00%

Abstract:

The purpose of this study was to apply and compare two time-domain analysis procedures in the determination of oxygen uptake (VO2) kinetics in response to a pseudorandom binary sequence (PRBS) exercise test. PRBS exercise tests have typically been analysed in the frequency domain. However, the complex interpretation of frequency responses may have limited the application of this procedure in both sporting and clinical contexts, where a single time measurement would facilitate subject comparison. The relative potential of both a mean response time (MRT) and a peak cross-correlation time (PCCT) was investigated. This study was divided into two parts: a test-retest reliability study (part A), in which 10 healthy male subjects completed two identical PRBS exercise tests, and a comparison of the VO2 kinetics of 12 elite endurance runners (ER) and 12 elite sprinters (SR; part B). In part A, 95% limits of agreement were calculated for comparison between MRT and PCCT. The results of part A showed no significant difference between test and retest as assessed by MRT [mean (SD) 42.2 (4.2) s and 43.8 (6.9) s] or by PCCT [21.8 (3.7) s and 22.7 (4.5) s]. Measurement error (%) was lower for MRT in comparison with PCCT (16% and 25%, respectively). In part B of the study, the VO2 kinetics of ER were significantly faster than those of SR, as assessed by MRT [33.4 (3.4) s and 39.9 (7.1) s, respectively; P < 0.01] and PCCT [20.9 (3.8) s and 24.8 (4.5) s; P < 0.05]. It is possible that either analysis procedure could provide a single test measurement of VO2 kinetics; however, the greater reliability of the MRT data suggests that this method has more potential for development in the assessment of VO2 kinetics by PRBS exercise testing.
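The PCCT lends itself to a compact illustration. The sketch below is an assumption about the computation rather than code from the study, and all signals are synthetic: it estimates the peak cross-correlation time between a PRBS-like work-rate input and a simulated first-order VO2 response.

```python
import numpy as np

def peak_cross_correlation_time(u, y, dt=1.0):
    """Return the lag (in s) at which the cross-correlation between the
    PRBS input u and the response y is maximal (positive lags = y trails u)."""
    u = (u - u.mean()) / u.std()
    y = (y - y.mean()) / y.std()
    n = len(u)
    xc = np.correlate(y, u, mode="full")
    lags = np.arange(-n + 1, n)
    pos = lags >= 0  # a physiological response lags the input
    return lags[pos][np.argmax(xc[pos])] * dt

# toy check: a first-order response to a PRBS-like binary input
rng = np.random.default_rng(0)
u = np.repeat(rng.integers(0, 2, 64), 5).astype(float)  # pseudorandom binary sequence
tau = 6.0
y = np.zeros_like(u)
for t in range(1, len(u)):
    y[t] = y[t - 1] + (u[t] - y[t - 1]) / tau  # exponential (first-order) kinetics
print(peak_cross_correlation_time(u, y, dt=1.0))
```

The MRT would instead summarise the response by a single mean delay; both reduce the frequency-domain information of the PRBS test to one time measurement, which is the point the abstract makes.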

Relevance: 30.00%

Abstract:

The externally recorded electroencephalogram (EEG) is contaminated with signals that do not originate from the brain, collectively known as artefacts. Thus, EEG signals must be cleaned prior to any further analysis. In particular, if the EEG is to be used in online applications such as Brain-Computer Interfaces (BCIs) the removal of artefacts must be performed in an automatic manner. This paper investigates the robustness of Mutual Information based features to inter-subject variability for use in an automatic artefact removal system. The system is based on the separation of EEG recordings into independent components using a temporal ICA method, RADICAL, and the utilisation of a Support Vector Machine for classification of the components into EEG and artefact signals. High accuracy and robustness to inter-subject variability is achieved.
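The pipeline shape described here (ICA separation, then per-component classification) can be sketched. The code below is illustrative only: scikit-learn's FastICA stands in for RADICAL, which is not a scikit-learn method, the features are toy statistics rather than the Mutual Information features the paper uses, and the data and labels are synthetic.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.svm import SVC

rng = np.random.default_rng(1)
t = np.linspace(0, 8, 1000)
sources = np.c_[np.sin(2 * np.pi * 10 * t),            # "brain" rhythm
                np.sign(np.sin(2 * np.pi * 0.5 * t))]  # slow "blink"-like artefact
X = sources @ rng.normal(size=(2, 4))                  # 4 simulated electrodes

# 1) unmix the recordings into independent components
ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(X)                      # shape (n_samples, 2)

# 2) classify each component as EEG or artefact with an SVM
feats = np.c_[np.abs(components).mean(axis=0), components.std(axis=0)]
labels = np.array([0, 1])  # hypothetical training labels: 0 = EEG, 1 = artefact
clf = SVC(kernel="rbf").fit(feats, labels)

# 3) zero out components flagged as artefact and re-mix
keep = clf.predict(feats) == 0
cleaned = ica.inverse_transform(components * keep)
print(cleaned.shape)
```

In a real system the classifier would be trained once on labelled components from many subjects; the paper's point is that Mutual Information features make that classifier robust across subjects.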

Relevance: 30.00%

Abstract:

The purpose of this study is to analyse current data continuity mechanisms employed by the target group of businesses and to identify any inadequacies in the mechanisms as a whole. The questionnaire responses indicate that 47% of respondents do perceive backup methodologies as important, with a total of 70% of respondents having some backup methodology already in place. Businesses in Moulton Park perceive the loss of data to have a significant effect upon their business’ ability to function. Only 14% of respondents indicated that loss of data on computer systems would not affect their business at all, with 54% of respondents indicating that there would be either a “major effect” (or greater) on their ability to operate. Respondents that have experienced data loss were more likely to have backup methodologies in place (53%) than respondents that had not experienced data loss (18%). Although the number of respondents clearly affected the quality and conclusiveness of the results returned, the level of backup methodologies in place appears to be proportional to the company size. Further investigation is recommended into the subject in order to validate the information gleaned from the small number of respondents.

Relevance: 30.00%

Abstract:

Decision theory is the study of models of judgement involved in, and leading to, deliberate and (usually) rational choice. In real estate investment there are normative models for the allocation of assets. These asset allocation models suggest an optimum allocation between the respective asset classes based on the investors' judgements of performance and risk. Real estate is selected, as are other assets, on the basis of some criteria, e.g. commonly its marginal contribution to the production of a mean-variance efficient multi-asset portfolio, subject to the investor's objectives and capital rationing constraints. However, decisions are made relative to current expectations and current business constraints. Whilst a decision maker may believe in the required optimum exposure levels as dictated by an asset allocation model, the final decision may, and often will, be influenced by factors outside the parameters of the mathematical model. This paper discusses investors' perceptions and attitudes toward real estate and highlights the important difference between theoretical exposure levels and pragmatic business considerations. It develops a model to identify "soft" parameters in decision making which will influence the optimal allocation for that asset class. This "soft" information may relate to behavioural issues such as the tendency to mirror competitors; a desire to meet weight of money objectives; a desire to retain the status quo and many other non-financial considerations. The paper aims to establish the place of property in multi-asset portfolios in the UK and examine the asset allocation process in practice, with a view to understanding the decision making process and to look at investors' perceptions based on an historic analysis of market expectation; a comparison with historic data and an analysis of actual performance.

Relevance: 30.00%

Abstract:

Altruism and selfishness are 30–50% heritable in man in both Western and non-Western populations. This genetically based variation in altruism and selfishness requires explanation. In non-human animals, altruism is generally directed towards relatives, and satisfies the condition known as Hamilton's rule. This nepotistic altruism evolves under natural selection only if the ratio of the benefit of receiving help to the cost of giving it exceeds a value that depends on the relatedness of the individuals involved. Standard analyses assume that the benefit provided by each individual is the same but it is plausible in some cases that as more individuals contribute, help is subject to diminishing returns. We analyse this situation using a single-locus two-allele model of selection in a diploid population with the altruistic allele dominant to the selfish allele. The analysis requires calculation of the relationship between the fitnesses of the genotypes and the frequencies of the genes. The fitnesses vary not only with the genotype of the individual but also with the distribution of phenotypes amongst the sibs of the individual and this depends on the genotypes of his parents. These calculations are not possible by direct fitness or ESS methods but are possible using population genetics. Our analysis shows that diminishing returns change the operation of natural selection and the outcome can now be a stable equilibrium between altruistic and selfish alleles rather than the elimination of one allele or the other. We thus provide a plausible genetic model of kin selection that leads to the stable coexistence in the same population of both altruistic and selfish individuals. This may explain reported genetic variation in altruism in man.
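The threshold condition the abstract invokes is Hamilton's rule; in its standard form, with r the relatedness of donor and recipient, b the benefit to the recipient, and c the cost to the donor:

```latex
rb > c \quad\Longleftrightarrow\quad \frac{b}{c} > \frac{1}{r}
```

Under diminishing returns, b is effectively a decreasing function of the number of helpers, so the inequality can hold when altruists are rare and fail when they are common; this frequency dependence is the mechanism behind the stable interior equilibrium the abstract describes.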

Relevance: 30.00%

Abstract:

It has been frequently observed that office markets are subject to particularly high fluctuations in rents and vacancy levels, thus exposing real estate investors to considerable risk regarding expected future income streams. This paper analyzes the determinants of office rents and their variability over time and across submarkets to gain insight into rent price formation and its stability across space and over time. No support is found for the single-market hypothesis, which states that arbitrage opportunities effectively align real estate pricing schemes in various parts of a city. Instead, the results suggest that the importance of hedonic pricing factors varies both over time and across submarkets.

Relevance: 30.00%

Abstract:

The approach taken by English courts to the duty of care question in negligence has been subject to harsh criticism in recent years. This article examines this fundamental issue in tort law, drawing upon Canadian and Australian jurisprudence by way of comparison. From this analysis, the concept of vulnerability is developed as a productive means of understanding the duty of care. Vulnerability is of increasing interest in legal and political theory and it is of particular relevance to the law of negligence. In addition to aiding doctrinal coherence, vulnerability – with its focus on relationships and care – has the potential to broaden the way in which the subject of tort law is conceived because it challenges dominant assumptions about autonomy as being prior to the relationships on which it is dependent.

Relevance: 30.00%

Abstract:

Objectives: The aim of this study was to determine and compare the proteomes of three triclosan-resistant mutants of Salmonella enterica serovar Typhimurium in order to identify proteins involved in triclosan resistance. Methods: The proteomes of three distinct but isogenic triclosan-resistant mutants were determined using two-dimensional liquid chromatography mass separation. Bioinformatics was then used to identify and quantify tryptic peptides in order to determine protein expression. Results: Proteomic analysis of the triclosan-resistant mutants identified a common set of proteins involved in production of pyruvate or fatty acid with differential expression in all mutants, but also demonstrated specific patterns of expression associated with each phenotype. Conclusions: These data show that triclosan resistance can occur via distinct pathways in Salmonella, and demonstrate a novel triclosan resistance network that is likely to have relevance to other pathogenic bacteria subject to triclosan exposure and may provide new targets for development of antimicrobial agents.

Relevance: 30.00%

Abstract:

The glutamate decarboxylase (GAD) system is important for the acid resistance of Listeria monocytogenes. We previously showed that under acidic conditions, glutamate (Glt)/γ-aminobutyrate (GABA) antiport is impaired in minimal media but not in rich ones, like brain heart infusion. Here we demonstrate that this behavior is more complex and it is subject to strain and medium variation. Despite the impaired Glt/GABA antiport, cells accumulate intracellular GABA (GABA(i)) as a standard response against acid in any medium, and this occurs in all strains tested. Since these systems can occur independently of one another, we refer to them as the extracellular (GAD(e)) and intracellular (GAD(i)) systems. We show here that GAD(i) contributes to acid resistance since in a ΔgadD1D2 mutant, reduced GABA(i) accumulation coincided with a 3.2-log-unit reduction in survival at pH 3.0 compared to that of wild-type strain LO28. Among 20 different strains, the GAD(i) system was found to remove 23.11% ± 18.87% of the protons removed by the overall GAD system. Furthermore, the GAD(i) system is activated at milder pH values (4.5 to 5.0) than the GAD(e) system (pH 4.0 to 4.5), suggesting that GAD(i) is the more responsive of the two and the first line of defense against acid. Through functional genomics, we found a major role for GadD2 in the function of GAD(i), while that of GadD1 was minor. Furthermore, the transcription of the gad genes in three common reference strains (10403S, LO28, and EGD-e) during an acid challenge correlated well with their relative acid sensitivity. No transcriptional upregulation of the gadT2D2 operon, which is the most important component of the GAD system, was observed, while gadD3 transcription was the highest among all gad genes in all strains. In this study, we present a revised model for the function of the GAD system and highlight the important role of GAD(i) in the acid resistance of L. monocytogenes.

Relevance: 30.00%

Abstract:

In this paper we consider the structure of dynamically evolving networks modelling information and activity moving across a large set of vertices. We adopt the communicability concept that generalizes that of centrality, which is defined for static networks. We define the primary network structure within the whole as comprising the most influential vertices (both as senders and receivers of dynamically sequenced activity). We present a methodology based on successive vertex knockouts, up to a very small fraction of the whole primary network, that can characterize the nature of the primary network as being either relatively robust and lattice-like (with redundancies built in) or relatively fragile and tree-like (with sensitivities and few redundancies). We apply these ideas to the analysis of evolving networks derived from fMRI scans of resting human brains. We show that the estimation of performance parameters via the structure tests of the corresponding primary networks is subject to less variability than that observed across a very large population of such scans. Hence the differences within the population are significant.
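The knockout methodology can be sketched for the simpler static case. The code below is an illustration under assumptions, not the authors' procedure: it uses the static communicability matrix exp(A) rather than the dynamic communicability of an evolving network, ranks vertices by total communicability, and removes the top-ranked ones one at a time while tracking how the total degrades.

```python
import numpy as np
from scipy.linalg import expm

def total_communicability(A):
    # static communicability: sum of all entries of the matrix exponential of A
    return expm(A).sum()

def knockout_curve(A, k):
    """Knock out the k most communicable vertices one at a time,
    returning the total communicability after each removal."""
    A = A.copy()
    curve = [total_communicability(A)]
    for _ in range(k):
        scores = expm(A).sum(axis=1)       # per-vertex total communicability
        v = int(np.argmax(scores))         # most influential remaining vertex
        A[v, :] = 0.0                      # disconnect it entirely
        A[:, v] = 0.0
        curve.append(total_communicability(A))
    return curve

# toy example: a ring lattice on 8 vertices (redundant, lattice-like structure)
n = 8
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
print(knockout_curve(A, 3))
```

The shape of the resulting curve is the diagnostic: a gradual decline suggests a robust, lattice-like primary network, while an abrupt collapse suggests a fragile, tree-like one.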

Relevance: 30.00%

Abstract:

This paper addresses beginning teachers' thinking about the nature and purposes of their subject and the impact of this on their practice. Individual qualitative interviews were undertaken with 11 history teachers at the beginning of their teaching careers. Data was analysed using writing as the method of analysis and revealed that teachers whose thinking was at odds with dominant discourses, for example in the form of a national curriculum, encountered difficulties embracing pedagogies and aspects of the curriculum that do not accord with their own deep-seated beliefs, demonstrating a need for the initial training and professional development of teachers to forefront consideration of subject understandings.

Relevance: 30.00%

Abstract:

This paper introduces a new agent-based model, which incorporates the actions of individual homeowners in a long-term domestic stock model, and details how it was applied in energy policy analysis. The results indicate that current policies are likely to fall significantly short of the 80% target and suggest that current subsidy levels need re-examining. In the model, current subsidy levels appear to offer too much support to some technologies, which in turn leads to the suppression of other technologies that have a greater energy saving potential. The model can be used by policy makers to develop further scenarios to find alternative, more effective, sets of policy measures. The model is currently limited to the owner-occupied stock in England, although it can be expanded, subject to the availability of data.
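The mechanism by which a subsidy can shift adoption in such a model can be sketched very simply. The code below is a minimal illustration of the agent-based idea, not the paper's model; all costs, savings, and thresholds are invented for the example.

```python
import random

random.seed(42)

class Homeowner:
    """One agent with a hypothetical personal payback threshold."""
    def __init__(self):
        # heterogeneous tolerance for how long a payback agents will accept
        self.max_payback_years = random.uniform(3, 15)

def adoption_rate(agents, cost, saving, subsidy):
    # each agent adopts when the simple payback, net of subsidy,
    # falls under that agent's own threshold
    payback = (cost - subsidy) / saving  # years to recoup the net outlay
    return sum(payback <= a.max_payback_years for a in agents) / len(agents)

agents = [Homeowner() for _ in range(1000)]
no_subsidy = adoption_rate(agents, cost=6000, saving=400, subsidy=0)
subsidised = adoption_rate(agents, cost=6000, saving=400, subsidy=1500)
print(no_subsidy, subsidised)
```

Running policy scenarios then amounts to sweeping the subsidy level (and the set of supported technologies) and comparing the resulting adoption, which is how a model of this kind can reveal that support for one technology suppresses another with greater saving potential.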