932 results for Challenge posed by omics data to compositional analysis - paucity of independent samples (n)
Abstract:
It remains unclear whether genetic variants in SNCA (the alpha-synuclein gene) alter risk for sporadic Parkinson's disease (PD). The polymorphic mixed sequence repeat (NACP-Rep I) in the promoter region of SNCA has been previously examined as a potential susceptibility factor for PD, with conflicting results. We report genotype and allele distributions at this locus from 369 PD cases and 370 control subjects of European Australian ancestry, with alleles designated as -1, 0, +1, +2, and +3 as previously described. The 0 allele was less common in Australian cases than in controls (OR = 0.80, 95% CI 0.62-1.03). Combined analysis including all previously published ancestral European Rep1 data yielded a highly significant association between the 0 allele and a reduced risk for PD (OR = 0.79, 95% CI 0.70-0.89, p = 0.0001). Further study should now examine this interesting and biologically plausible genetic association in detail. (C) 2004 Elsevier Ireland Ltd. All rights reserved.
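As a worked illustration of the association statistics quoted above, the sketch below computes an odds ratio and a Woolf-method 95% confidence interval from a 2x2 allele-count table. The counts are hypothetical placeholders, not the study's data.

```python
import numpy as np

# Illustrative 2x2 allele-count table (hypothetical numbers, not the study's data):
# rows = cases/controls, columns = "0" allele / other alleles.
cases_0, cases_other = 140, 598      # hypothetical split of the case alleles
ctrl_0, ctrl_other = 170, 570        # hypothetical split of the control alleles

odds_ratio = (cases_0 / cases_other) / (ctrl_0 / ctrl_other)
# Woolf's method: standard error of the log odds ratio, then a 95% CI on the log scale.
se_log_or = np.sqrt(1 / cases_0 + 1 / cases_other + 1 / ctrl_0 + 1 / ctrl_other)
ci_low, ci_high = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```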
Abstract:
The paper investigates a Bayesian hierarchical model for the analysis of categorical longitudinal data from a large social survey of immigrants to Australia. Data for each subject are observed on three separate occasions, or waves, of the survey. One of the features of the data set is that observations for some variables are missing for at least one wave. A model for the employment status of immigrants is developed by introducing, at the first stage of a hierarchical model, a multinomial model for the response and then subsequent terms are introduced to explain wave and subject effects. To estimate the model, we use the Gibbs sampler, which allows missing data for both the response and the explanatory variables to be imputed at each iteration of the algorithm, given some appropriate prior distributions. After accounting for significant covariate effects in the model, results show that the relative probability of remaining unemployed diminished with time following arrival in Australia.
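A minimal sketch of the Gibbs-sampling pattern described above, under a deliberately simplified model: binary employment status with independent per-wave probabilities and Beta(1,1) priors, with missing responses imputed from their full conditional at each iteration. The paper's actual model is multinomial and hierarchical with wave and subject effects; this only illustrates the imputation-within-Gibbs loop.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: employment status (1 = employed) for 200 subjects over 3 waves,
# with some responses missing (encoded as np.nan).
n_subj, n_waves = 200, 3
true_p = np.array([0.45, 0.60, 0.70])        # hypothetical per-wave employment probabilities
y = rng.binomial(1, true_p, size=(n_subj, n_waves)).astype(float)
y[rng.random(y.shape) < 0.15] = np.nan       # ~15% of responses missing at random
missing = np.isnan(y)

n_iter, burn_in = 2000, 500
p = np.full(n_waves, 0.5)                    # initial per-wave probabilities
p_draws = np.zeros((n_iter, n_waves))
y_imp = np.where(missing, 0.0, y)            # working copy holding current imputations

for it in range(n_iter):
    # Step 1: impute missing responses from their conditional given the current p.
    for t in range(n_waves):
        m = missing[:, t]
        y_imp[m, t] = rng.binomial(1, p[t], size=m.sum())
    # Step 2: update each p_t from its Beta full conditional (Beta(1,1) prior).
    for t in range(n_waves):
        s = y_imp[:, t].sum()
        p[t] = rng.beta(1 + s, 1 + n_subj - s)
    p_draws[it] = p

post_mean = p_draws[burn_in:].mean(axis=0)
print("Posterior mean employment probability by wave:", post_mean.round(3))
```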
Abstract:
BACKGROUND: Intervention time series analysis (ITSA) is an important method for analysing the effect of sudden events on time series data. ITSA methods are quasi-experimental in nature and the validity of modelling with these methods depends upon assumptions about the timing of the intervention and the response of the process to it. METHOD: This paper describes how to apply ITSA to analyse the impact of unplanned events on time series when the timing of the event is not accurately known, and so the problems of ITSA methods are magnified by uncertainty in the point of onset of the unplanned intervention. RESULTS: The methods are illustrated using the example of the Australian Heroin Shortage of 2001, which provided an opportunity to study the health and social consequences of an abrupt change in heroin availability in an environment of widespread harm reduction measures. CONCLUSION: Application of these methods enables valuable insights about the consequences of unplanned and poorly identified interventions while minimising the risk of spurious results.
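One common way to handle an uncertain onset date, sketched below on simulated data, is to profile an interrupted time series regression over a window of candidate intervention points and keep the best-fitting one. This illustrates the general idea only; it is not the specific procedure used in the paper, and the series, window, and model form are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative monthly series with a level drop whose exact onset is unknown.
n = 120
t = np.arange(n)
true_onset = 70
y = 50 + 0.05 * t - 8.0 * (t >= true_onset) + rng.normal(0, 2, n)

def fit_step_model(y, t, onset):
    """OLS fit of intercept + trend + level shift at a candidate onset; returns SSE and coefficients."""
    X = np.column_stack([np.ones_like(t), t, (t >= onset).astype(float)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    sse = np.sum((y - X @ beta) ** 2)
    return sse, beta

# Profile the fit over a window of plausible onset dates and keep the best one.
candidates = range(60, 81)
fits = {k: fit_step_model(y, t, k) for k in candidates}
best_onset = min(fits, key=lambda k: fits[k][0])
sse, (intercept, trend, shift) = fits[best_onset]
print(f"best onset = {best_onset}, estimated level shift = {shift:.2f}")
```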
Abstract:
Analysis of intra- and inter-phase distribution of modifying elements in aluminium-silicon alloys is difficult due to the low concentrations used. This research utilises a mu-XRF (X-ray fluorescence) technique at the SPring-8 synchrotron radiation facility X-ray source and reveals that the modifying element strontium segregates exclusively to the eutectic silicon phase and the distribution of strontium within this phase is relatively homogeneous. This has important implications for the fundamental mechanisms of eutectic modification in hypoeutectic aluminium-silicon alloys. (c) 2006 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Abstract:
This paper considers a model-based approach to the clustering of tissue samples of a very large number of genes from microarray experiments. It is a nonstandard problem in parametric cluster analysis because the dimension of the feature space (the number of genes) is typically much greater than the number of tissues. Frequently in practice, clinical data are also available for the cases from which the tissue samples were obtained. Here we investigate how to use the clinical data in conjunction with the microarray gene expression data to cluster the tissue samples. We propose two mixture model-based approaches in which the number of components in the mixture model corresponds to the number of clusters to be imposed on the tissue samples. One approach specifies the components of the mixture model to be the conditional distributions of the microarray data given the clinical data, with the mixing proportions also conditioned on the latter data. The other takes the components of the mixture model to represent the joint distributions of the clinical and microarray data. The approaches are demonstrated on some breast cancer data, as studied recently in van't Veer et al. (2002).
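A simplified sketch of the "joint distribution" formulation described above: clinical variables and a reduced representation of the gene expressions are modelled together by a normal mixture. The simulated data, the PCA reduction step, and the two-component choice are illustrative stand-ins, not the factor-analytic machinery the paper actually uses.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)

# Illustrative data: 60 tissue samples, 500 genes, plus 2 clinical variables.
n, p_genes = 60, 500
genes = rng.normal(size=(n, p_genes))
genes[30:, :20] += 2.0                       # second group differs in 20 genes
clinical = np.column_stack([rng.normal(size=n) + (np.arange(n) >= 30),
                            rng.integers(0, 2, n)])

# p >> n, so reduce the gene-expression block before clustering (a simple stand-in
# for the dimension-reduction built into the paper's mixture models).
genes_std = StandardScaler().fit_transform(genes)
genes_reduced = PCA(n_components=5, random_state=0).fit_transform(genes_std)

# "Joint" formulation: mixture components model clinical and expression features together.
X = np.column_stack([genes_reduced, StandardScaler().fit_transform(clinical)])
gmm = GaussianMixture(n_components=2, covariance_type="diag", random_state=0).fit(X)
labels = gmm.predict(X)
print("cluster sizes:", np.bincount(labels))
```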
Abstract:
Bone marrow mesenchymal stem cells (MSCs) promote nerve growth and functional recovery in animal models of spinal cord injury (SCI) to varying levels. The authors have tested high-content screening to examine the effects of MSC-conditioned medium (MSC-CM) on neurite outgrowth from the human neuroblastoma cell line SH-SY5Y and from explants of chick dorsal root ganglia (DRG). These analyses were compared to previously published methods that involved hand-tracing individual neurites. Both methods demonstrated that MSC-CM promoted neurite outgrowth. Each showed that the proportion of SH-SY5Y cells with neurites increased by ~200% in MSC-CM within 48 h, and that the number of neurites per SH-SY5Y cell was significantly increased in MSC-CM compared with control medium. For high-content screening, the analysis was performed within minutes, testing multiple samples of MSC-CM and in each case measuring >15,000 SH-SY5Y cells. In contrast, the manual measurement of neurite outgrowth from >200 SH-SY5Y cells in a single sample of MSC-CM took at least 1 h. High-content analysis provided additional measures of increased neurite branching in MSC-CM compared with control medium. MSC-CM was also found to stimulate neurite outgrowth in DRG explants using either method. The application of the high-content analysis was less well optimized for measuring neurite outgrowth from DRG explants than from SH-SY5Y cells.
Abstract:
We have attempted to establish normative values for components of the magnetic evoked field to flash and pattern reversal stimuli prior to clinical use of the MEG. Full visual field, binocular evoked magnetic fields were recorded from 100 subjects aged 16 to 86 years with a single-channel DC SQUID (BTI) second-order gradiometer at a point 5-6 cm above the inion. The majority of subjects showed a large positive component (outgoing magnetic field) of mean latency 115 ms (SD range 2.5-11.8 across different decades of life) to the pattern reversal stimulus. In many subjects, this P100M was preceded and succeeded by negative deflections (ingoing field). About 6% of subjects showed an inverted response, i.e. a PNP wave. Waveforms to flash were more variable in shape, with several positive components, the most consistent having a mean latency of 110 ms (SD range 6.4-23.2). Responses to both stimuli were consistent when measured on the same subject on six different occasions (SD range 4.8 to 7.3). The data suggest that norms can be established for evoked magnetic field components, in particular for the pattern reversal P100M, which could be used in the diagnosis of neuro-ophthalmological disease.
Abstract:
A rapid method for the analysis of biomass feedstocks was established to identify the quality of the pyrolysis products likely to impact on bio-oil production. A total of 15 Lolium and Festuca grasses known to exhibit a range of Klason lignin contents were analysed by pyroprobe-GC/MS (Py-GC/MS) to determine the composition of the thermal degradation products of lignin. The identification of key marker compounds, which are the derivatives of the three major lignin subunits (G, H, and S), allowed pyroprobe-GC/MS to be statistically correlated to the Klason lignin content of the biomass using the partial least-squares method to produce a calibration model. Data from this multivariate modelling procedure were then applied to identify likely "key marker" ions representative of the lignin subunits from the mass spectral data. The combined total abundance of the identified key markers for the lignin subunits exhibited a linear relationship with the Klason lignin content. In addition, the effect of alkali metal concentration on optimum pyrolysis characteristics was also examined. Washing of the grass samples removed approximately 70% of the metals and changed the characteristics of the thermal degradation process and products. Overall, the data indicate that both the organic and inorganic specification of the biofuel impacts on the pyrolysis process and that pyroprobe-GC/MS is a suitable analytical technique to assess lignin composition. © 2007 Elsevier B.V. All rights reserved.
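The calibration step described above amounts to regressing Klason lignin content on pyrogram ion abundances by partial least squares and inspecting which ions carry the predictive weight. The sketch below does this on simulated data with hypothetical marker ions; it is not the paper's calibration model.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(3)

# Illustrative data: 15 grass samples x 200 pyrolysis-GC/MS ion abundances,
# with Klason lignin content driven by a handful of hypothetical "marker" ions plus noise.
n_samples, n_ions = 15, 200
X = rng.normal(size=(n_samples, n_ions))
marker_idx = [10, 25, 40, 55]                      # hypothetical lignin-marker ions
klason = 12 + X[:, marker_idx].sum(axis=1) + rng.normal(0, 0.3, n_samples)

# Cross-validated PLS calibration of lignin content against the full pyrogram.
pls = PLSRegression(n_components=3)
pred = cross_val_predict(pls, X, klason, cv=5)
r2 = 1 - np.sum((klason - pred.ravel()) ** 2) / np.sum((klason - klason.mean()) ** 2)
print("cross-validated R^2:", round(r2, 3))

# The regression coefficients point to the ions that carry the lignin signal.
pls.fit(X, klason)
top_ions = np.argsort(np.abs(pls.coef_.ravel()))[-5:]
print("ions with largest PLS coefficients:", sorted(top_ions))
```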
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey was carried out of large UK companies which confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of "classic" program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data flow fan-in/out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By re-defining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
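For one of the "classic" structure metrics mentioned above, the sketch below computes an approximate McCabe cyclomatic complexity by counting decision points. It works on Python source as a stand-in for the LPA Prolog programs measured in the thesis, which re-defined the counts for Prolog clauses.

```python
import ast

def cyclomatic_complexity(source: str) -> int:
    """Approximate McCabe's V(g) as 1 + number of decision points.

    Counts if/for/while/except branches, conditional expressions, and
    boolean operators in the parsed Python source.
    """
    tree = ast.parse(source)
    decisions = 0
    for node in ast.walk(tree):
        if isinstance(node, (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.IfExp)):
            decisions += 1
        elif isinstance(node, ast.BoolOp):
            decisions += len(node.values) - 1
    return 1 + decisions

example = """
def classify(x):
    if x < 0:
        return "neg"
    for i in range(x):
        if i % 2 == 0 and i > 2:
            return "even>2"
    return "other"
"""
print(cyclomatic_complexity(example))   # 1 + if + for + if + and = 5
```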
Abstract:
The trend in modal extraction algorithms is to use all the available frequency response function data to obtain a global estimate of the natural frequencies, damping ratios and mode shapes. Improvements in transducer and signal processing technology allow the simultaneous measurement of many hundreds of channels of response data. The quantity of data available and the complexity of the extraction algorithms make considerable demands on the available computer power and require a powerful computer or dedicated workstation to perform satisfactorily. An alternative to waiting for faster sequential processors is to implement the algorithm in parallel, for example on a network of Transputers. Parallel architectures are a cost-effective means of increasing computational power, and a larger number of response channels would simply require more processors. This thesis considers how two typical modal extraction algorithms, the Rational Fraction Polynomial method and the Ibrahim Time Domain method, may be implemented on a network of Transputers. The Rational Fraction Polynomial method is a well-known and robust frequency-domain 'curve fitting' algorithm. The Ibrahim Time Domain method is an efficient algorithm that 'curve fits' in the time domain. This thesis reviews the algorithms, considers the problems involved in a parallel implementation, and shows how they were implemented on a real Transputer network.
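The channel-per-processor idea carries over directly to modern worker pools: each response channel can be processed independently. The sketch below distributes a placeholder per-channel peak-picking step (standing in for a full Rational Fraction Polynomial fit, which it does not implement) across processes on simulated frequency response functions.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def fit_channel(frf):
    """Placeholder per-channel 'curve fit': simply locate the resonance peak.
    A real Rational Fraction Polynomial fit would estimate poles and residues instead."""
    freqs = np.linspace(0, 100, frf.size)
    return freqs[np.argmax(np.abs(frf))]

def main():
    rng = np.random.default_rng(4)
    # Illustrative FRFs for 64 response channels sharing a resonance near 40 Hz.
    freqs = np.linspace(0, 100, 512)
    frfs = [1.0 / ((40 + rng.normal(0, 0.5)) ** 2 - freqs ** 2 + 1j * 2.0 * freqs)
            + rng.normal(0, 1e-4, freqs.size) for _ in range(64)]
    # Channels are independent, so they map naturally onto a pool of workers,
    # mirroring the thesis's distribution of channels across Transputers.
    with ProcessPoolExecutor() as pool:
        peaks = list(pool.map(fit_channel, frfs))
    print("mean estimated resonance:", round(float(np.mean(peaks)), 2), "Hz")

if __name__ == "__main__":
    main()
```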
Abstract:
This paper considers the role of HR in ethics and social responsibility and questions why, despite an acceptance of a role in ethical stewardship, the HR profession appears to be reluctant to embrace its responsibilities in this area. The study explores how HR professionals see their role in relation to ethical stewardship of the organisation, and the factors that inhibit its execution. A survey of 113 UK-based HR professionals, working in both domestic and multinational corporations, was conducted to explore their perceptions of the role of HR in maintaining ethical and socially responsible action in their organisations, and to identify features of the organisational environment which might help or hinder this role being effectively carried out. The findings indicate that although there is a clear understanding of the expectations of ethical stewardship, HR professionals often face difficulties in fulfilling this role because of competing tensions and perceptions of their role within their organisations. A way forward is proposed, which draws on the positive individual factors highlighted in this research to explore how approaches to organisational development (through positive deviance) may reduce these tensions to enable the better fulfilment of ethical responsibilities within organisations. The involvement and active modelling of ethical behaviour by senior management, coupled with an open approach to surfacing organisational values and building HR procedures, which support socially responsible action, are crucial to achieving socially responsible organisations. Finally, this paper challenges the HR profession, through professional and academic institutions internationally, to embrace their role in achieving this. © 2013 Taylor & Francis.
Abstract:
The UK's business R&D (BERD) to GDP ratio is low compared to other leading economies, and the ratio declined over the 1990s. This paper uses data on 719 large UK firms to analyse the link between R&D and productivity during 1989-2000. The results indicate that UK returns to R&D are similar to returns in other leading economies and were relatively stable over the 1990s. The analysis suggests that the low BERD to GDP ratio in the UK is unlikely to be due to direct financial or human capital constraints, as such constraints would imply finding relatively high rates of return. © Springer Science+Business Media, LLC 2009.
Abstract:
The aim of this study is to evaluate the application of ensemble averaging to the analysis of electromyography recordings made under whole body vibratory stimulation. Recordings from the Rectus Femoris, collected during vibratory stimulation at different frequencies, are used. Each signal is subdivided into intervals whose duration is related to the vibration frequency, and the segmented intervals are then averaged. With this method, periodic components emerge in the majority of the recordings. The autocorrelation of a few seconds of signal confirms the presence of pseudo-sinusoidal components closely related to the soft-tissue oscillations caused by the mechanical waves. © 2014 IEEE.
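A minimal sketch of the ensemble-averaging step described above, on a simulated recording: the signal is cut into one-vibration-period segments and averaged, so that components phase-locked to the vibration survive while uncorrelated activity cancels. The sampling rate, vibration frequency, and amplitudes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Illustrative surface-EMG-like signal recorded during 25 Hz vibration:
# a weak 25 Hz periodic component buried in broadband activity.
fs, f_vib, duration = 1000, 25, 10.0            # Hz, Hz, seconds
t = np.arange(0, duration, 1 / fs)
signal = 0.2 * np.sin(2 * np.pi * f_vib * t) + rng.normal(0, 1.0, t.size)

# Segment the recording into intervals of one vibration period and average them.
period_samples = int(round(fs / f_vib))
n_segments = signal.size // period_samples
segments = signal[: n_segments * period_samples].reshape(n_segments, period_samples)
ensemble_average = segments.mean(axis=0)

print("segments averaged:", n_segments)
print("peak-to-peak of averaged cycle:", round(float(np.ptp(ensemble_average)), 3))
```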
Abstract:
Field material testing provides firsthand information on pavement conditions, which is most helpful in evaluating performance and identifying preventive maintenance or overlay strategies. The high variability of field asphalt concrete due to construction raises the demand for accuracy of the test. Accordingly, the objective of this study is to propose a reliable and repeatable methodology to evaluate the fracture properties of field-aged asphalt concrete using the overlay test (OT). The OT is selected because of its efficiency and feasibility for asphalt field cores with diverse dimensions. The fracture properties refer to the Paris' law parameters based on the pseudo J-integral (A and n), because of the sound physical significance of the pseudo J-integral with respect to characterizing the cracking process. In order to determine A and n, a two-step OT protocol is designed to characterize the undamaged and damaged behaviors of asphalt field cores. To ensure the accuracy of the determined undamaged and fracture properties, a new analysis method is then developed for data processing, which combines finite element simulations with mechanical analysis of viscoelastic force equilibrium and the evolution of pseudo displacement work in the OT specimen. Finally, theoretical equations are derived to calculate A and n directly from the OT test data, and the accuracy of the determined fracture properties is verified. The proposed methodology is applied to a total of 27 asphalt field cores obtained from a field project in Texas, including the control Hot Mix Asphalt (HMA) and two types of warm mix asphalt (WMA). The results demonstrate a high linear correlation between n and -log A for all the tested field cores. Investigations of the effect of field aging on the fracture properties confirm that n is a good indicator of the cracking resistance of asphalt concrete, and indicate that summer climatic conditions clearly accelerate the rate of aging. The impact of the WMA technologies on the fracture properties of asphalt concrete is visualized by comparing the n-values: the Evotherm WMA technology slightly improves the cracking resistance, while the foaming WMA technology provides fracture properties comparable to those of the HMA. After 15 months of aging in the field, the cracking resistance does not exhibit a significant difference between the HMA and WMAs, which is confirmed by observations of field distresses.
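Once crack-growth rates and pseudo J-integral values are available, A and n follow from a straight-line fit in log-log space, since Paris' law dc/dN = A * J^n becomes log(dc/dN) = log A + n log J after taking logarithms. The sketch below fits hypothetical data; it does not reproduce the paper's OT analysis.

```python
import numpy as np

rng = np.random.default_rng(6)

# Illustrative crack-growth data from an overlay-test-like experiment:
# crack growth rate per load cycle vs. pseudo J-integral (hypothetical values and units).
true_A, true_n = 1e-4, 2.2
J = np.linspace(5, 60, 25)                                           # pseudo J-integral
dc_dN = true_A * J ** true_n * np.exp(rng.normal(0, 0.05, J.size))   # Paris-law growth with scatter

# Paris' law is linear in log space, so n is the slope and log A the intercept.
n_hat, logA_hat = np.polyfit(np.log10(J), np.log10(dc_dN), 1)
print(f"estimated n = {n_hat:.2f}, -log A = {-logA_hat:.2f}")
```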
Abstract:
This study developed a reliable and repeatable methodology to evaluate the fracture properties of asphalt mixtures with an overlay test (OT). In the proposed methodology, first, a two-step OT protocol was used to characterize the undamaged and damaged behaviors of asphalt mixtures. Second, a new methodology combining the mechanical analysis of viscoelastic force equilibrium in the OT specimen and finite element simulations was used to determine the undamaged properties and crack growth function of asphalt mixtures. Third, a modified Paris's law, replacing the stress intensity factor with the pseudo J-integral, was employed to characterize the fracture behavior of asphalt mixtures. Theoretical equations were derived to calculate the parameters A and n (defined as the fracture properties) in the modified Paris's law, and the study used a detailed example to calculate A and n from the OT data. The proposed methodology was successfully applied to evaluate the impact of warm-mix asphalt (WMA) technologies on fracture properties. The results of the tested specimens showed that Evotherm WMA technology slightly improved the cracking resistance of asphalt mixtures, while foaming WMA technology provided comparable fracture properties. In addition, the study found that, in general, A decreased as n increased; a linear relationship between 2log(A) and n was established.