19 results for: Challenge posed by omics data to compositional analysis - paucity of independent samples (n)

in the Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

The number of remote sensing platforms and sensors rises almost every year, yet much work on the interpretation of land cover is still carried out using either single images or images from the same source taken on different dates. Two questions can be asked of this proliferation of images: can the information contained in different scenes be used to improve classification accuracy, and what is the best way to combine the different imagery? Two suitably complementary image sources are MODIS on the Terra platform and ETM+ on board Landsat 7. MODIS provides daily images in 36 spectral bands at 250-1000 m spatial resolution, while ETM+ provides seven spectral bands at 30 m spatial resolution with a 16-day revisit period. In the UK, cloud cover may mean that only a few ETM+ scenes are available for any particular year, and these may not be at the time of year of most interest. The MODIS data may provide information on land cover over the growing season, such as harvest dates, that is not present in the ETM+ data. The primary objective of this work is therefore to develop a methodology for integrating medium-spatial-resolution Landsat ETM+ images with multi-temporal, multi-spectral, low-resolution MODIS/Terra images, with the aim of improving the classification of agricultural land. Other data, such as field boundaries from existing maps, may also be incorporated. When classifying agricultural land cover of the type seen in the UK, where crops are largely sown in homogeneous fields with clear and often mapped boundaries, the classification is greatly improved by using the mapped polygons and taking the classification of the polygon as a whole as an a priori probability when classifying each individual pixel with a Bayesian approach.
When dealing with multiple images from different platforms and dates, it is highly unlikely that the pixels will be exactly co-registered, so each pixel will contain a mixture of different real-world land covers. Similarly, the different atmospheric conditions prevailing on different days mean that the same emission from the ground will give rise to different sensor readings. A method is therefore presented, incorporating a model of the instantaneous field of view and of atmospheric effects, to enable data from different remote sensing sources to be integrated.
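The Bayesian polygon-prior step described above can be sketched in a few lines. This is an illustrative toy, not the thesis implementation; the class set, likelihoods and prior probabilities are all invented:

```python
# Toy sketch of the Bayesian step: the class assigned to the whole field
# polygon supplies a prior P(class), which is combined with each pixel's
# spectral likelihood P(spectra | class) to give a posterior.

def posterior(pixel_likelihoods, polygon_prior):
    """Return normalised P(class | spectra, polygon) for each class."""
    unnormalised = [l * p for l, p in zip(pixel_likelihoods, polygon_prior)]
    total = sum(unnormalised)
    return [u / total for u in unnormalised]

# Example: three candidate crops for one pixel (all numbers invented).
likelihoods = [0.30, 0.50, 0.20]   # per-pixel spectral classifier output
prior = [0.70, 0.20, 0.10]         # from classifying the polygon as a whole
post = posterior(likelihoods, prior)
best = post.index(max(post))
print(best)  # 0: the polygon prior overturns the pixel-only choice (class 1)
```

The design point is that the polygon-level classification regularises noisy per-pixel decisions: a pixel whose spectra weakly favour another crop is pulled towards the crop assigned to its field.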

Relevance:

100.00%

Publisher:

Abstract:

Radio frequency identification (RFID) technology has the potential, when adopted in healthcare settings, to reduce errors, improve patient safety, streamline operational processes and enable the sharing of information throughout supply chains. RFID adoption in the English NHS is currently limited to isolated pilot studies. Firstly, this study investigates the drivers of, and inhibitors to, RFID adoption in the English NHS from the perspective of the GS1 Healthcare User Group (HUG), which is tasked with coordinating adoption across the private and public sectors. Secondly, a conceptual model has been developed and deployed that combines two of foresight's most popular methods: scenario planning and technology roadmapping. The model addresses the weaknesses of each foresight technique while capitalising on their individual, inherent strengths. Semi-structured interviews, scenario planning workshops and a technology roadmapping exercise were conducted with the members of the HUG over an 18-month period. An action research mode of enquiry was used, with a thematic analysis approach for the identification and discussion of the drivers and inhibitors of RFID adoption. The results of the conceptual model are analysed in comparison with other similar models. There are implications for managers responsible for RFID adoption in both the NHS and its commercial partners, and for foresight practitioners. Managers can leverage the insights gained from identifying the drivers and inhibitors of RFID adoption by working to remove the inhibitors and to sustain the drivers. The academic contribution of this aspect of the thesis lies in the field of RFID adoption in healthcare settings: drivers and inhibitors to RFID adoption in the English NHS are compared with those found in other settings. The implication for technology foresight practitioners is a proof of concept of a model combining scenario planning and technology roadmapping through a novel process.
The academic contribution to the field of technology foresight is the conceptual development of a foresight model that combines two popular techniques, followed by a deployment of that model in a healthcare setting to explore the future of RFID technology.

Relevance:

100.00%

Publisher:

Abstract:

Jaccard has been the similarity metric of choice in ecology and forensic psychology for comparing sites or offences by species or behaviour. This paper applies a more powerful hierarchical measure, taxonomic similarity (s), recently developed in marine ecology, to the task of behaviourally linking serial crime. Forensic case linkage attempts to identify behaviourally similar offences committed by the same unknown perpetrator (called linked offences). s considers progressively higher-level taxa, such that two sites show some similarity even without shared species. We apply this index by analysing 55 specific offence behaviours classified hierarchically. The behaviours are taken from 16 sexual offences by seven juveniles, where each offender committed two or more offences. We demonstrate that both Jaccard and s show linked offences to be significantly more similar than unlinked offences. With up to 20% of the specific behaviours removed in simulations, s is equally or more effective at distinguishing linked offences than Jaccard applied to the full data set. Moreover, s retains a significant difference between linked and unlinked pairs with up to 50% of the specific behaviours removed. As police decision-making often depends upon incomplete data, s has clear advantages, and its application may extend to other crime types. Copyright © 2007 John Wiley & Sons, Ltd.
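The contrast between the two kinds of index can be illustrated with a toy sketch. This is not the published taxonomic similarity formula from marine ecology, only a simplified hierarchical score under the assumption that behaviours sharing a parent category earn partial credit; the 0.5 weight, the behaviour names and the category tree are all invented:

```python
# Jaccard credits only exact shared behaviours; a hierarchical measure
# also credits behaviours that fall under the same higher-level category.

def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def hierarchical_similarity(a, b, parent):
    """Full credit (1.0) for shared behaviours, partial credit (0.5,
    an assumed weight) for unshared behaviours whose parent categories
    match (one-directional here, for brevity)."""
    a, b = set(a), set(b)
    score = len(a & b)
    for x in a - b:
        if any(parent[x] == parent[y] for y in b - a):
            score += 0.5
    return score / len(a | b)

parent = {"bind_victim": "control", "threaten": "control",
          "steal_phone": "theft"}
o1 = ["bind_victim", "steal_phone"]
o2 = ["threaten", "steal_phone"]
print(jaccard(o1, o2))                           # 1/3: one exact match
print(hierarchical_similarity(o1, o2, parent))   # 0.5: shared 'control' taxon
```

This mirrors the paper's point: even when specific behaviours are missing from a record, shared higher-level taxa keep linked offences measurably similar.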

Relevance:

100.00%

Publisher:

Abstract:

The purpose of the present study is to test the case linkage principles of behavioural consistency and behavioural distinctiveness using serial vehicle theft data. Data from 386 solved vehicle thefts committed by 193 offenders were analysed using Jaccard's coefficient, regression and receiver operating characteristic (ROC) analyses to determine whether objectively observable aspects of crime scene behaviour could be used to distinguish crimes committed by the same offender from those committed by different offenders. The findings indicate that spatial behaviour, specifically the distance between theft locations and between dump locations, is a highly consistent and distinctive aspect of vehicle theft behaviour; thus, intercrime and interdump distance represent the most useful aspects of vehicle theft for the purpose of case linkage analysis. The findings have theoretical and practical implications for the understanding of criminal behaviour and for the development of decision-support tools to assist police in the investigation and apprehension of serial vehicle theft offenders.
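The ROC logic described above can be sketched as follows, under the assumption (consistent with the findings) that linked pairs tend to have shorter inter-crime distances than unlinked pairs. The distances are invented, and the AUC is computed with the rank-comparison identity rather than by tracing an explicit ROC curve:

```python
# Shorter inter-crime distance suggests the same offender, so the
# negative distance acts as a linkage score for each crime pair.

def auc(linked_scores, unlinked_scores):
    """Probability that a random linked pair outscores a random unlinked
    pair (equivalent to the area under the ROC curve)."""
    wins = sum((l > u) + 0.5 * (l == u)
               for l in linked_scores for u in unlinked_scores)
    return wins / (len(linked_scores) * len(unlinked_scores))

linked_km = [0.8, 1.2, 2.0, 0.5]     # same-offender pairs (invented, km)
unlinked_km = [5.1, 7.9, 3.4, 9.6]   # different-offender pairs (invented)
score = auc([-d for d in linked_km], [-d for d in unlinked_km])
print(score)  # 1.0 here: distance separates the two groups perfectly
```

An AUC of 0.5 would mean the feature carries no linkage information; values near 1.0 are what make intercrime distance useful for decision-support tools.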

Relevance:

100.00%

Publisher:

Abstract:

This study makes two main contributions to knowledge. Firstly, it rigorously investigates the relationships between a number of factors believed to affect climate perception, classified into three types, in order to test the hypothesis that qualification and personal factors, in contrast with situational factors, play an important role in shaping climate perception. Secondly, the study re-clusters the items of a widely applied climate measurement scale, HAY, in order to overcome cross-cultural differences between Kuwaiti and American society and to arrive at a modified set of climate dimensions for a civil service organisation in Kuwait. Furthermore, the study carries out a diagnostic test of the climate of the Ministry of Public Health (MoPH) in Kuwait, aiming to diagnose the perceived characteristics of the organisation, and suggests a number of areas to be given attention if improvements are to be introduced. The study made extensive use of statistical and computing facilities to make the analysis more representative of the field data; it is also characterised by the very high response rate to the main survey, which strengthens the reliability of the findings. Three main field studies are included: the first conducted the main questionnaire; the second measured the 'should be' climate as judged by MoPH experts, using the Delphi technique; and the third consisted of extensive meetings with the most senior management team in MoPH. Results of the first stage were subjected to cluster analysis for the reconstruction of the HAY tool, while comparative analysis was carried out between the results of the second and third stages on one side and the first on the other.

Relevance:

100.00%

Publisher:

Abstract:

The thesis investigates the value of quantitative analyses for historical studies of science through an examination of research trends in insect pest control, or economic entomology. Reviews are made of quantitative studies of science and of historical studies of pest control. The methodological strengths and weaknesses of bibliometric techniques are examined in a dedicated chapter; the techniques examined include productivity studies, such as paper counts, and relational techniques, such as co-citation and co-word analysis. Insect pest control is then described. This includes a discussion of the socio-economic basis of the concept of 'pest'; a series of classifications of pest control techniques is provided and analysed with respect to their utility for scientometric studies. The chemical and biological approaches to control are discussed as scientific and technological paradigms. Three case studies of research trends in economic entomology are provided. The first is a scientometric analysis of samples of chemical control and biological control papers, providing quantitative data on the institutional, financial, national and journal structures associated with pest control research fields. The second is a content analysis of a core journal, the Journal of Economic Entomology, over the period 1910-1985; this identifies the main research innovations and trends, in particular the changing balance between chemical and biological control. The third is an analysis of historical research trends in insecticide research, showing the rise, maturity and decline of research on many groups of compounds. These are supplemented by a collection of seven papers on scientometric studies of pest control and quantitative techniques for analysing science.
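As a concrete illustration of the co-word technique mentioned above, the following toy sketch counts how often pairs of keywords appear together across papers. The keyword lists are invented; real co-word studies draw indexed terms from bibliographic records:

```python
from collections import Counter
from itertools import combinations

# Each paper contributes every pair of its keywords; high pair counts
# reveal linked research themes (here, DDT with chemical control).
papers = [
    ["DDT", "resistance", "chemical control"],
    ["parasitoid", "biological control", "release"],
    ["DDT", "chemical control", "residues"],
]

cooccurrence = Counter()
for keywords in papers:
    for pair in combinations(sorted(set(keywords)), 2):
        cooccurrence[pair] += 1

print(cooccurrence[("DDT", "chemical control")])  # 2: co-occur in two papers
```

Co-word maps built from such counts are one of the relational techniques the thesis contrasts with simple productivity counts.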

Relevance:

100.00%

Publisher:

Abstract:

The oxidation of lipids is important in many pathological conditions, and lipid peroxidation products such as 4-hydroxynonenal (HNE) and other aldehydes are commonly measured as biomarkers of oxidative stress. However, it is often useful to complement this with analysis of the original oxidized phospholipid. Electrospray mass spectrometry (ESMS) provides an informative method for detecting oxidative alterations to phospholipids, and has been used to investigate oxidative damage to cells and low-density lipoprotein, as well as for the analysis of oxidized phosphatidylcholines present in atherosclerotic plaque material. There is increasing evidence that intact oxidized phospholipids have biological effects; in particular, oxidation products of 1-palmitoyl-2-arachidonoyl-sn-glycero-3-phosphocholine (PAPC) have been found to cause inflammatory responses, which could be potentially important in the progression of atherosclerosis. The effects of chlorohydrin derivatives of lipids have been much less studied, but it is clear that free fatty acid chlorohydrins and phosphatidylcholine chlorohydrins are toxic to cells at concentrations above 10 micromolar, a range comparable to that of HNE and oxidized PAPC. There is some evidence that chlorohydrins have biological effects that may be relevant to atherosclerosis, but further work is needed to elucidate their pro-inflammatory properties and to understand the mechanisms and balance of biological effects that could result from oxidation of complex mixtures of lipids in a pathophysiological situation.

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVES: To examine the volume, relevance and quality of transnational tobacco corporations' (TTCs) evidence that standardised packaging of tobacco products 'won't work', following the UK government's decision to 'wait and see' until further evidence is available. DESIGN: Content analysis. SETTING: We analysed the evidence cited in submissions by the UK's four largest TTCs to the UK Department of Health consultation on standardised packaging in 2012. OUTCOME MEASURES: The volume, relevance (subject matter) and quality (as measured by independence from industry and peer review) of evidence cited by TTCs was compared with evidence from a systematic review of standardised packaging. Fisher's exact test was used to assess differences in the quality of TTC and systematic review evidence. 100% of the data were second-coded to validate the findings (94.7% intercoder reliability); all differences were resolved. RESULTS: 77/143 pieces of TTC-cited evidence were used to promote their claim that standardised packaging 'won't work'. Of these, just 17/77 addressed standardised packaging: 14 were industry connected and none were published in peer-reviewed journals. Comparison of TTC and systematic review evidence on standardised packaging showed that the industry evidence was of significantly lower quality in terms of tobacco industry connections and peer review (p<0.0001). The most relevant TTC evidence (on standardised packaging or packaging generally, n=26) was of significantly lower quality (p<0.0001) than the least relevant (on other topics, n=51). Across the dataset, TTC-connected evidence was significantly less likely to be published in a peer-reviewed journal (p=0.0045). CONCLUSIONS: With few exceptions, evidence cited by TTCs to promote their claim that standardised packaging 'won't work' lacks either policy relevance or key indicators of quality. Policymakers could use these three criteria (subject matter, independence and peer-review status) to critically assess evidence submitted to them by corporate interests via Better Regulation processes.
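The kind of group comparison described above can be illustrated with a one-sided Fisher's exact test built from the hypergeometric distribution. The 2x2 counts below are invented for illustration and are not the study's data; in practice a library routine such as scipy.stats.fisher_exact (which also offers a two-sided test) would normally be used:

```python
from math import comb

# 2x2 table: rows are evidence sources (industry-cited / review),
# columns are peer-reviewed / not peer-reviewed.

def p_at_most(a, b, c, d):
    """One-sided Fisher's exact p-value: the probability of observing
    a or fewer peer-reviewed items in the first row, given the fixed
    row and column totals, under the hypergeometric null."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    p = 0.0
    for k in range(max(0, row1 + col1 - n), a + 1):
        p += comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)
    return p

# Invented example: 0/17 industry items peer-reviewed vs 30/37 review items.
p = p_at_most(0, 17, 30, 7)
print(p < 0.01)  # True: such an imbalance is very unlikely by chance
```

Fisher's exact test suits small, sparse tables like these, where a chi-squared approximation would be unreliable.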

Relevance:

100.00%

Publisher:

Abstract:

Bone marrow mesenchymal stem cells (MSCs) promote nerve growth and functional recovery in animal models of spinal cord injury (SCI) to varying degrees. The authors have tested high-content screening to examine the effects of MSC-conditioned medium (MSC-CM) on neurite outgrowth from the human neuroblastoma cell line SH-SY5Y and from explants of chick dorsal root ganglia (DRG). These analyses were compared with previously published methods that involved hand-tracing individual neurites. Both methods demonstrated that MSC-CM promoted neurite outgrowth. Each showed that the proportion of SH-SY5Y cells with neurites increased by ~200% in MSC-CM within 48 h, and that the number of neurites per SH-SY5Y cell was significantly increased in MSC-CM compared with control medium. For high-content screening, the analysis was performed within minutes, testing multiple samples of MSC-CM and in each case measuring >15,000 SH-SY5Y cells. In contrast, the manual measurement of neurite outgrowth from >200 SH-SY5Y cells in a single sample of MSC-CM took at least 1 h. High-content analysis provided additional measures of increased neurite branching in MSC-CM compared with control medium. MSC-CM was also found to stimulate neurite outgrowth in DRG explants using either method. The high-content analysis was, however, less well optimized for measuring neurite outgrowth from DRG explants than from SH-SY5Y cells.

Relevance:

100.00%

Publisher:

Abstract:

We have attempted to establish normative values for components of the magnetic evoked field to flash and pattern-reversal stimuli prior to clinical use of the MEG. Full-visual-field, binocular evoked magnetic fields were recorded from 100 subjects aged 16 to 86 years with a single-channel DC SQUID (BTi) second-order gradiometer at a point 5-6 cm above the inion. The majority of subjects showed a large positive component (outgoing magnetic field) with a mean latency of 115 ms (SD range 2.5-11.8 across different decades of life) to the pattern-reversal stimulus. In many subjects, this P100M was preceded and succeeded by negative deflections (ingoing field). About 6% of subjects showed an inverted response, i.e. a PNP wave. Waveforms to flash were more variable in shape, with several positive components, the most consistent having a mean latency of 110 ms (SD range 6.4-23.2). Responses to both stimuli were consistent when measured in the same subject on six different occasions (SD range 4.8-7.3). The data suggest that norms can be established for evoked magnetic field components, in particular for the pattern-reversal P100M, which could be used in the diagnosis of neuro-ophthalmological disease.

Relevance:

100.00%

Publisher:

Abstract:

A rapid method for the analysis of biomass feedstocks was established to identify the quality of the pyrolysis products likely to impact on bio-oil production. A total of 15 Lolium and Festuca grasses, known to exhibit a range of Klason lignin contents, were analysed by pyroprobe-GC/MS (Py-GC/MS) to determine the composition of the thermal degradation products of lignin. The identification of key marker compounds, which are derivatives of the three major lignin subunits (G, H and S), allowed Py-GC/MS to be statistically correlated with the Klason lignin content of the biomass using the partial least-squares method to produce a calibration model. Data from this multivariate modelling procedure were then used to identify likely 'key marker' ions representative of the lignin subunits in the mass spectral data. The combined total abundance of the identified key markers for the lignin subunits exhibited a linear relationship with the Klason lignin content. In addition, the effect of alkali metal concentration on optimum pyrolysis characteristics was examined. Washing the grass samples removed approximately 70% of the metals and changed the characteristics of the thermal degradation process and products. Overall, the data indicate that both the organic and inorganic composition of the biofuel affect the pyrolysis process, and that Py-GC/MS is a suitable analytical technique for assessing lignin composition. © 2007 Elsevier B.V. All rights reserved.
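The final calibration observation above, a linear relationship between summed key-marker abundance and Klason lignin content, can be sketched with an ordinary least-squares line. This is a deliberate simplification of the study's partial least-squares model on full spectra, and every number below is invented:

```python
# Fit a least-squares line so summed marker-ion abundance can be
# converted into an estimated Klason lignin content.

def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

marker_abundance = [10.0, 14.0, 18.0, 22.0]  # summed key-marker ion counts
klason_lignin = [2.1, 3.0, 3.9, 4.8]         # % dry weight (invented)

slope, intercept = fit_line(marker_abundance, klason_lignin)
predicted = slope * 16.0 + intercept         # estimate for a new sample
print(round(predicted, 2))                   # 3.45
```

Partial least squares is preferred in the study because the predictors (many correlated ion intensities) far outnumber the samples; the univariate line above only illustrates the final marker-sum relationship.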

Relevance:

100.00%

Publisher:

Abstract:

The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis starts from the position that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in estimating these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using the existing ones. By characterising companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to establish exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since very little other data is available from which to produce an estimate.
A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not the method of developing an estimating model or tool, but the way in which 'an estimate' is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The question of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry and Kafura's data-flow fan-in/out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
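As an illustration of the kind of structure metric the thesis extends, the sketch below approximates McCabe's cyclomatic complexity for a single routine as one plus the number of decision points. The keyword list is a simplification for Python-like source; the thesis's point is precisely that such counts must be redefined for Prolog, where decisions arise from clauses and alternative matches:

```python
import re

# Decision-point keywords for a Python-like language (a simplification;
# a full implementation would parse the syntax tree, not match text).
DECISIONS = re.compile(r"\b(if|elif|while|for|and|or|case)\b")

def cyclomatic_complexity(source: str) -> int:
    """Approximate McCabe complexity: 1 + number of decision points."""
    return 1 + len(DECISIONS.findall(source))

code = """
if x > 0 and y > 0:
    for i in items:
        process(i)
elif x < 0:
    handle(x)
"""
print(cyclomatic_complexity(code))  # 5: four decision points + 1
```

Higher values indicate more independent paths through the routine, which is why the metric correlates with testing effort and error-proneness in the conventional studies the thesis cites.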

Relevance:

100.00%

Publisher:

Abstract:

The trend in modal extraction algorithms is to use all the available frequency response function data to obtain a global estimate of the natural frequencies, damping ratios and mode shapes. Improvements in transducer and signal processing technology allow the simultaneous measurement of many hundreds of channels of response data. The quantity of data available and the complexity of the extraction algorithms make considerable demands on the available computing power and require a powerful computer or dedicated workstation to perform satisfactorily. An alternative to waiting for faster sequential processors is to implement the algorithm in parallel, for example on a network of Transputers. Parallel architectures are a cost-effective means of increasing computational power, and a larger number of response channels would simply require more processors. This thesis considers how two typical modal extraction algorithms, the Rational Fraction Polynomial method and the Ibrahim Time Domain method, may be implemented on a network of Transputers. The Rational Fraction Polynomial method is a well-known and robust frequency-domain curve-fitting algorithm; the Ibrahim Time Domain method is an efficient algorithm that curve fits in the time domain. The thesis reviews the algorithms, considers the problems involved in a parallel implementation, and shows how they were implemented on a real Transputer network.
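One possible decomposition of the workload described above, sketched here with a thread pool rather than a Transputer network, farms each response channel out to a separate worker. The 'fit' is a placeholder, not the Rational Fraction Polynomial algorithm, and the channel data are invented; the point is only that adding channels maps naturally onto adding workers:

```python
from concurrent.futures import ThreadPoolExecutor

def fit_channel(channel_data):
    # Stand-in for curve fitting one frequency response function;
    # here we just report the peak magnitude as a fake result.
    return max(channel_data)

# Three invented response channels (magnitudes at a few frequencies).
channels = [[0.1, 0.9, 0.3], [0.2, 0.4, 1.5], [2.0, 0.5, 0.1]]

with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(fit_channel, channels))

print(results)  # per-channel estimates gathered from the workers
```

A global method such as the Rational Fraction Polynomial fit shares information across all channels, so the real parallelisation in the thesis is finer-grained than this channel-per-worker farm; the sketch shows only the farming pattern.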

Relevance:

100.00%

Publisher:

Abstract:

This paper considers the role of HR in ethics and social responsibility and questions why, despite an acceptance of a role in ethical stewardship, the HR profession appears reluctant to embrace its responsibilities in this area. The study explores how HR professionals see their role in relation to ethical stewardship of the organisation, and the factors that inhibit its execution. A survey of 113 UK-based HR professionals, working in both domestic and multinational corporations, was conducted to explore their perceptions of the role of HR in maintaining ethical and socially responsible action in their organisations, and to identify features of the organisational environment which might help or hinder this role being effectively carried out. The findings indicate that although there is a clear understanding of the expectations of ethical stewardship, HR professionals often face difficulties in fulfilling this role because of competing tensions and perceptions of their role within their organisations. A way forward is proposed, drawing on the positive individual factors highlighted in this research, to explore how approaches to organisational development (through positive deviance) may reduce these tensions and enable the better fulfilment of ethical responsibilities within organisations. The involvement and active modelling of ethical behaviour by senior management, coupled with an open approach to surfacing organisational values and building HR procedures that support socially responsible action, are crucial to achieving socially responsible organisations. Finally, this paper challenges the HR profession, through professional and academic institutions internationally, to embrace its role in achieving this. © 2013 Taylor & Francis.