33 results for validation tests of PTO
Abstract:
Discrete, microscopic lesions develop in the brain in a number of neurodegenerative diseases. These lesions may not be randomly distributed in the tissue but may exhibit a spatial pattern, i.e., a departure from randomness towards regularity or clustering. The spatial pattern of a lesion may reflect its development in relation to other brain lesions or to neuroanatomical structures. Hence, a study of spatial pattern may help to elucidate the pathogenesis of a lesion. A number of statistical methods can be used to study the spatial patterns of brain lesions. They range from simple tests of whether the distribution of a lesion departs from randomness to more complex methods which can detect clustering and the size, distribution and spacing of clusters. This paper reviews the uses and limitations of these methods as applied to neurodegenerative disorders, and in particular to senile plaque formation in Alzheimer's disease.
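As a concrete illustration of the simplest of these tests, the sketch below (Python, with randomly generated, purely hypothetical lesion coordinates) applies the classical quadrat-based variance/mean ratio test: under complete spatial randomness the counts per quadrat are Poisson, so the index of dispersion is close to 1, with larger values suggesting clustering and smaller values regularity.

```python
# Sketch: quadrat-based index-of-dispersion test for spatial randomness.
# Coordinates are hypothetical; under complete spatial randomness the
# counts per quadrat are Poisson, so variance/mean is approximately 1.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
xy = rng.uniform(0, 1000, size=(200, 2))    # hypothetical lesion positions (um)

def dispersion_index(points, field=1000.0, n_bins=10):
    counts, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                                  bins=n_bins, range=[[0, field], [0, field]])
    counts = counts.ravel()
    return counts.var(ddof=1) / counts.mean(), counts.size

idx, n_quadrats = dispersion_index(xy)
chi2 = (n_quadrats - 1) * idx               # ~ chi-squared(n-1) under randomness
p = stats.chi2.sf(chi2, df=n_quadrats - 1)  # small p suggests clustering
print(f"index of dispersion = {idx:.2f}, p = {p:.3f}")
```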
Abstract:
Some of the factors affecting colonisation of a colonisation sampler, the Standard Aufwuchs Unit (S. Auf. U.), were investigated, namely immersion period, whether anchored on the bottom or suspended, and the influence of riffles. It was concluded that a four-week immersion period was best. S. Auf. U. anchored on the bottom collected both more taxa and more individuals than suspended ones. Fewer taxa but more individuals colonised S. Auf. U. in the potamon zone compared to the rhithron zone, with a consequent reduction in the values of pollution indexes and diversity. It was concluded that a completely different scoring system was necessary for lowland rivers. Macroinvertebrates colonising S. Auf. U. in simulated streams, lowland rivers and the R. Churnet reflected water quality. A variety of pollution and diversity indexes were applied to results from lowland river sites. Instead of these, it was recommended that an abbreviated species/relative-abundance list be used to summarise biological data for use in lowland river surveillance. An intensive study of gastropod populations was made in simulated streams. Lymnaea peregra increased in abundance whereas Potamopyrgus jenkinsi decreased with increasing sewage effluent concentration. No clear-cut differences in reproduction were observed. The presence/absence of eight gastropod taxa was compared with concentrations of various pollutants in lowland rivers. On the basis of all field work it appeared that ammonia, nitrite, copper and zinc were the toxicants most likely to be detrimental to gastropods, and that P. jenkinsi and Theodoxus fluviatilis were the least tolerant taxa. 96-h acute toxicity tests of P. jenkinsi using ammonia and copper were carried out in a flow-through system after a variety of static range-finding tests. P. jenkinsi was intolerant of both toxicants compared to reports on other taxa, and the results suggested that these toxicants would affect the distribution of this species in the field.
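A minimal sketch of one of the standard diversity indices of the kind applied to such data (the abstract does not say which were used; the Shannon-Wiener index and the abundance figures below are illustrative assumptions):

```python
# Sketch: Shannon-Wiener diversity index H' = -sum(p_i * ln p_i),
# computed for a hypothetical macroinvertebrate abundance list.
import math

abundances = {"Lymnaea peregra": 120, "Potamopyrgus jenkinsi": 45,
              "Theodoxus fluviatilis": 8, "Asellus aquaticus": 230}

def shannon(counts):
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

print(f"H' = {shannon(abundances.values()):.2f}")  # lower values typify impacted sites
```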
Abstract:
This work concerns the development of a proton induced X-ray emission (PIXE) analysis system and a multi-sample scattering chamber facility. The characteristics of the beam pulsing system and its counting rate capabilities were evaluated by observing the ion-induced X-ray emission from pure thick copper targets, with and without beam pulsing operation. The characteristic X-rays were detected with a high resolution Si(Li) detector coupled to a multi-channel analyser. The removal of the pile-up continuum by the use of the on-demand beam pulsing is clearly demonstrated in this work. This new on-demand pulsing system, with its counting rate capability of 25, 18 and 10 kPPS corresponding to 2, 4 and 8 µs main amplifier time constants respectively, enables thick targets to be analysed more readily. The reproducibility of the on-demand beam pulsing system was checked by repeated measurements of the system throughput curves, with and without beam pulsing. The reproducibility of the analysis performed using this system was also checked by repeated measurements of the intensity ratios from a number of standard binary alloys during the experimental work. A computer programme has been developed to calculate the X-ray yields from thick targets bombarded by protons, taking into account the secondary X-ray yield produced by characteristic X-ray fluorescence when the characteristic X-ray energy of one element lies above the absorption edge energy of the other element present in the target. This effect was studied on metallic binary alloys such as Fe/Ni and Cr/Fe. The quantitative analysis of Fe/Ni and Cr/Fe alloy samples to determine their elemental composition, taking this enhancement into account, has been demonstrated in this work. Furthermore, the usefulness of the Rutherford backscattering (R.B.S.) technique to obtain the depth profiles of the elements in the upper micron of the sample is discussed.
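The primary (non-enhanced) part of such a thick-target yield calculation reduces to integrating the ionisation cross-section over the proton's slowing-down path, Y ∝ ∫σ(E)/S(E) dE. A minimal numerical sketch, with placeholder power-law forms for σ(E) and S(E) rather than real data, and ignoring X-ray attenuation and the secondary fluorescence enhancement treated in the thesis:

```python
# Sketch: relative thick-target X-ray yield, Y ~ integral of sigma(E)/S(E) dE
# from 0 up to the incident energy E0. The cross-section and stopping power
# below are placeholder shapes, not fitted physical data.
import numpy as np

def sigma(E):          # placeholder ionisation cross-section shape
    return 50.0 * E**2.5

def stopping(E):       # placeholder stopping-power shape
    return 100.0 / np.sqrt(E)

E0 = 2000.0            # hypothetical incident proton energy (keV)
E = np.linspace(1.0, E0, 5000)
rel_yield = np.sum(sigma(E) / stopping(E)) * (E[1] - E[0])
print(f"relative thick-target yield: {rel_yield:.3e}")
```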
Abstract:
Oxysterols (OS), the polyoxygenated sterols, represent a class of potent regulatory molecules with important biological actions. Cytotoxicity of OS is one of the most important aspects in studies of OS bioactivities. However, such studies, structure-activity relationship (SAR) studies in particular, have been hampered by the limited availability of structurally diverse OS in both number and quantity. The aim of this project was to develop robust synthetic methods for the preparation of polyhydroxy sterols, to evaluate their cytotoxicity, and to establish structure-activity relationships. First, after syntheses and tests of a number of 7-HC analogues against cancer cell lines, we found that hydrophobicity of the side chain is essential for 7-HC's cytotoxicity, and that a limited number of hydroxyl groups and a particular configuration on the A and B rings are required for potent cytotoxicity of an OS. Then polyoxygenation of the cholesterol A and B rings was explored. A preparative method for the synthesis of the four diastereomerically pure cholest-4-ene-3,6-diols was developed. Epoxidation of these cholest-4-ene-3,6-diols showed that an allylic group exerts an auxiliary role in producing products with the desired configuration in syntheses of the eight diastereomerically pure 4,5-epoxycholestane-3,6-diols. Reduction of the eight 4,5-epoxycholestane-3,6-diols produced all eight isomers of the cytotoxic 5α-cholestane-3β,5,6β-triol (CT) for the first time. Epoxide ring opening with protic or Lewis acids on the eight 4,5-epoxycholestane-3,6-diols was studied in detail. The results demonstrated that the combination of acid and solvent affected the outcome of a reaction dramatically. Acyl group participation and migration played an important role for a number of substrates under certain conditions. All eight 4,5-trans-cholestane-3,4,5,6-tetrols were synthesised through manipulation of acyl participation. Furthermore, these reaction conditions were tested when a number of cholestane-3,4,5,6,7-pentols and other C3-C7 oxygenated sterols were synthesised for the first time. Introduction of an oxygenated functional group through cholest-2-ene derivatives was also studied. Elimination of the 3-(4-toluenesulfonate) esters showed that interaction of the existing hydroxyl or acyl groups with the reaction centre often resulted in different products. Allylic oxidation, epoxidation and epoxide ring-opening reactions were investigated with these cholest-2-enes.
Abstract:
Surface deposition of dense aerosol particles is of major concern in the nuclear industry for safety assessment. This study presents theoretical investigations and computer simulations of single gas-borne U3O8 particles impacting with in-reactor surfaces, and of the fragmentation of small agglomerates. A theoretical model for elasto-plastic spheres has been developed and used to analyse the force-displacement and force-time relationships. The impulse equations, based on Newton's second law, are applied to govern the tangential bouncing behaviour. The theoretical model is then incorporated into the Distinct Element Method code TRUBAL in order to perform computer simulated tests of particle collisions. A comparison of simulated results with both theoretical predictions and experimental measurements is provided. For oblique impacts, the results in terms of the force-displacement relationship, coefficients of restitution, trajectory of the impacting particle, and distribution of kinetic energy and work done during the process of impact are presented. The effects of Poisson's ratio, friction, plastic deformation and initial particle rotation on the bouncing behaviour are also discussed. In the presence of adhesion, an elasto-plastic collision model, which is an extension of the JKR theory, is developed. Based on an energy balance equation, the critical sticking velocity is obtained. For oblique collisions, computer simulated results are used to establish a set of criteria determining whether or not the particle bounces off the target plate. For impact velocities above the critical sticking value, computer simulated results for the coefficients of restitution and rebound angles of the particle are presented. Computer simulations of fracture/fragmentation resulting from agglomerate-wall impact have also been performed, using two randomly generated agglomerates (one monodisperse, the other polydisperse), each consisting of 50 primary particles. The effects of impact angle, local structural arrangements close to the impact point, and plastic deformation at the contacts on agglomerate damage are examined. The simulated results show a significant difference in agglomerate strength between the two assemblies. The computer data also show that agglomerate damage resulting from an oblique impact is determined by the normal velocity component rather than the impact speed.
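A toy version of the kind of contact integration such a code performs for a single normal impact is sketched below; it uses a linear spring-dashpot law (far simpler than the elasto-plastic, adhesive model of the thesis) purely to show how a coefficient of restitution emerges from a simulated bounce. All parameter values are hypothetical.

```python
# Sketch: normal impact of a sphere on a wall with a linear spring-dashpot
# contact law, integrated explicitly; the coefficient of restitution is
# the ratio of rebound to approach speed. Parameters are hypothetical.
m, k, c = 1e-6, 1e3, 1e-3   # mass (kg), stiffness (N/m), damping (N s/m)
v_in = 1.0                  # approach speed (m/s)
dt = 1e-8                   # time step (s)
x, v = 0.0, -v_in           # x < 0 means overlap with the wall

while True:
    overlap = max(-x, 0.0)
    f = k * overlap - c * v if overlap > 0 else 0.0  # repulsion + damping
    v += (f / m) * dt
    x += v * dt
    if x >= 0 and v > 0:    # contact has ended, particle rebounds
        break

print(f"coefficient of restitution e = {v / v_in:.3f}")
```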
Abstract:
A study was made of the corrosion behaviour, in the ASTM standard nitric acid and oxalic acid tests, of two commercial AISI type 304L steels in the as-received condition and after various heat treatments. Optical microscopy, and SEM, TEM and STEM in conjunction with energy-dispersive X-ray analysis, were used to correlate the corrosion behaviour of these steels with their microstructure. Some evidence of phosphorus segregation at grain boundaries was found. The corrosion behaviour at the microstructural level was studied by examining in the TEM thin foils of steel that had been exposed to boiling nitric acid. Banding attack in the nitric acid and oxalic acid tests was studied using SEM and EPMA and found to be due to the micro-segregation of chromium and nickel. Two experimental series of 304L were then used: one a 17% Cr, 9% Ni steel with phosphorus additions from 0.006% to 0.028%, the other a 20% Cr, 12% Ni steel with boron additions from 0.001% to 0.0085%. The effect of these elements on corrosion in the nitric acid test was studied, as were the effects of different cooling rates and different solution treatment temperatures on the behaviour of these steels. TEM and STEM in conjunction with energy-dispersive X-ray analysis were again used to study the microstructure of the steels. Phosphorus was found to affect the corrosion behaviour, but no effect was found with boron.
Abstract:
Many tests of financial contagion require a definition of the dates separating calm from crisis periods. We propose to use a battery of break search procedures for individual time series to objectively identify potential break dates in relationships between countries. Applied to the largest European stock markets and combined with two well-established tests for financial contagion, this approach results in break dates which correctly identify the timing of changes in cross-country transmission mechanisms. Application of break search procedures breathes new life into the established contagion tests, allowing for an objective, data-driven timing of crisis periods.
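A minimal sketch of a single-break search of this general kind (not the authors' specific battery of procedures): for each candidate date, fit the cross-country relationship separately on the two sub-samples and keep the split that minimises the total residual sum of squares. The return series below are simulated.

```python
# Sketch: least-squares search for one break date in a bivariate
# relationship y = a + b*x, standing in for the paper's break search
# procedures. The return series are simulated, with a true break at t=150.
import numpy as np

def find_break(x, y, trim=20):
    best_sse, best_t = np.inf, None
    for t in range(trim, len(x) - trim):
        sse = 0.0
        for sl in (slice(0, t), slice(t, len(x))):
            b, a = np.polyfit(x[sl], y[sl], 1)      # slope, intercept
            resid = y[sl] - (a + b * x[sl])
            sse += resid @ resid
        if sse < best_sse:
            best_sse, best_t = sse, t
    return best_t

rng = np.random.default_rng(1)
x = rng.normal(size=300)                             # "source" market returns
beta = np.where(np.arange(300) < 150, 0.3, 1.2)      # crisis raises co-movement
y = beta * x + 0.5 * rng.normal(size=300)            # "recipient" market returns
print("estimated break date:", find_break(x, y))
```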
Abstract:
Purpose – This paper seeks answers to four questions. Two of these questions have been borrowed (but adapted) from the work of Defee et al.: RQ1. To what extent is theory used in purchasing and supply chain management (P&SCM) research? RQ2. What are the prevalent theories to be found in P&SCM research? Following on from these questions an additional question is posed: RQ3. Are theory-based papers more highly cited than papers with no theoretical foundation? Finally, drawing on the work of Harland et al., the authors have added a fourth question: RQ4. To what extent does P&SCM meet the tests of coherence, breadth and depth, and quality necessary to make it a scientific discipline? Design/methodology/approach – A systematic literature review was conducted in accordance with the model outlined by Tranfield et al. for three journals within the field of "purchasing and supply chain management". In total 1,113 articles were reviewed. In addition a citation analysis was completed covering 806 articles in total. Findings – The headline features from the results suggest that nearly a decade-and-a-half on from its development, the field still lacks coherence. Theory is absent from much of the work, and although theory-based articles achieved on average a higher number of citations than non-theoretical papers, there is no obvious contender as an emergent paradigm for the discipline. Furthermore, it is evident that P&SCM does not meet Fabian's test necessary to make it a scientific discipline and is still some way from being a normal science. Research limitations/implications – This study would have benefited from the analysis of further journals; however, the analysis of 1,113 articles from three leading journals in the field of P&SCM was deemed sufficient in scope. In addition, a further significant line of enquiry to follow is the rigour versus relevance debate. Practical implications – This article is of interest to both an academic and practitioner audience as it highlights the use of theories in P&SCM. Furthermore, this article raises a number of important questions: should research in this area draw more heavily on theory, and if so, which theories are appropriate? Social implications – The broader social implications relate to the discussion of how a scientific discipline develops, building on the work of Fabian and Amundson. Originality/value – The data set for this study is significant and builds on a number of previous literature reviews. This review is both greater in scope than previous reviews and broader in its subject focus. In addition, the citation analysis (not previously conducted in any of the reviews) and statistical test highlight that theory-based articles are more highly cited than non-theoretically based papers. This could indicate that researchers are attempting to build on one another's work.
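The statistical test behind the citation finding is not named in the abstract; a plausible sketch, given that citation counts are heavily skewed, is a rank-based comparison such as a one-sided Mann-Whitney U test. The citation counts below are invented for illustration.

```python
# Sketch: comparing citation counts of theory-based vs atheoretical papers
# with a one-sided Mann-Whitney U test (a reasonable choice for skewed
# count data; the paper does not specify its test). Counts are invented.
from scipy.stats import mannwhitneyu

theory_cites = [12, 45, 8, 33, 21, 60, 17, 9, 28, 41]
atheoretical_cites = [3, 15, 7, 2, 22, 5, 11, 0, 9, 13]

stat, p = mannwhitneyu(theory_cites, atheoretical_cites, alternative="greater")
print(f"U = {stat}, one-sided p = {p:.4f}")
```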
Abstract:
Purpose: Phonological accounts of reading implicate three aspects of phonological awareness tasks that underlie the relationship with reading: (a) the language-based nature of the stimuli (words or nonwords), (b) the verbal nature of the response, and (c) the complexity of the stimuli (words can be segmented into units of speech). Yet it is uncertain which task characteristics are most important, as they are typically confounded. By systematically varying response type and stimulus complexity across speech and non-speech stimuli, the current study seeks to isolate the characteristics of phonological awareness tasks that drive the prediction of early reading. Method: Four sets of tasks were created: tone stimuli (simple non-speech) requiring a non-verbal response, phonemes (simple speech) requiring a non-verbal response, phonemes requiring a verbal response, and nonwords (complex speech) requiring a verbal response. Tasks were administered to 570 second-grade children along with standardized tests of reading and non-verbal IQ. Results: Three structural equation models comparing matched sets of tasks were built. Each model consisted of two 'task' factors with a direct link to a reading factor. The following factors predicted unique variance in reading: (a) simple speech and non-speech stimuli, (b) simple speech requiring a verbal response but not simple speech requiring a non-verbal response, and (c) complex and simple speech stimuli. Conclusions: Results suggest that the prediction of reading by phonological tasks is driven by the verbal nature of the response and not the complexity or 'speechness' of the stimuli. Findings highlight the importance of phonological output processes to early reading.
Abstract:
Haloclean, a performance-enhanced low-temperature pyrolysis process for biomass developed by Forschungszentrum Karlsruhe and Sea Marconi, is closing the gap between classical and fast pyrolysis approaches. For pyrolysis of straw (chaffed, finely ground, or pelletised), temperatures between 320 and 420°C and residence times of only 1 to 5 minutes can be realised. Liquid yields of up to 45 wt-% and solid yields of 35 wt-% are possible. Solid yields can be increased up to 73 wt-% while losing only 4.5% of the feed energy to the pyrolysis gases. Toxicity tests of the fractions showed no significant toxicity.
Abstract:
The impact of nutritional variation, within populations not overtly malnourished, on cognitive function and arousal is considered. The emphasis is on susceptibility to acute effects of meals and glucose loads, and chronic effects of dieting, on mental performance, and effects of cholesterol and vitamin levels on cognitive impairment. New developments in understanding dietary influences on neurohormonal systems, and their implications for cognition and affect, allow reinterpretation of both earlier and recent findings. Evidence for a detrimental effect of omitting a meal on cognitive performance remains equivocal: from the outset, idiosyncrasy has prevailed. Yet, for young and nutritionally vulnerable children, breakfast is more likely to benefit than hinder performance. For nutrient composition, despite inconsistencies, some cautious predictions can be made. Acutely, carbohydrate-rich–protein-poor meals can be sedating and anxiolytic; by comparison, protein-rich meals may be arousing, improving reaction time but also increasing unfocused vigilance. Fat-rich meals can lead to a decline in alertness, especially where they differ from habitual fat intake. These acute effects may vary with time of day and nutritional status. Chronically, protein-rich diets have been associated with decreased positive and increased negative affect relative to carbohydrate-rich diets. Probable mechanisms include diet-induced changes in monoamine, especially serotoninergic neurotransmitter activity, and functioning of the hypothalamic pituitary adrenal axis. Effects are interpreted in the context of individual traits and susceptibility to challenging, even stressful, tests of performance. Preoccupation with dieting may impair cognition by interfering with working memory capacity, independently of nutritional status. The change in cognitive performance after administration of glucose, and other foods, may depend on the level of sympathetic activation, glucocorticoid secretion, and pancreatic β-cell function, rather than simple fuelling of neural activity. Thus, outcomes can be predicted by vulnerability in coping with stressful challenges, interacting with nutritional history and neuroendocrine status. Functioning of such systems may be susceptible to dietary influences on neural membrane fluidity, and vitamin-dependent cerebrovascular health, with cognitive vulnerability increasing with age.
Abstract:
The aim of this study is to accurately distinguish Parkinson's disease (PD) participants from healthy controls using self-administered tests of gait and postural sway. Using consumer-grade smartphones with in-built accelerometers, we objectively measure and quantify key movement severity symptoms of Parkinson's disease. Specifically, we record tri-axial accelerations, and extract a range of different features based on the time and frequency-domain properties of the acceleration time series. The features quantify key characteristics of the acceleration time series, and enhance the underlying differences in the gait and postural sway accelerations between PD participants and controls. Using a random forest classifier, we demonstrate an average sensitivity of 98.5% and average specificity of 97.5% in discriminating PD participants from controls. © 2014 IEEE.
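A sketch of the general shape of this pipeline (summary features over tri-axial accelerations feeding a random forest); the feature set, recordings and labels below are placeholders, not the paper's.

```python
# Sketch: time/frequency summary features from tri-axial accelerometry
# feeding a random forest, as in the pipeline described above. The
# features, recordings and labels here are random placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def features(acc):
    """acc: (n_samples, 3) tri-axial acceleration trace."""
    mag = np.linalg.norm(acc, axis=1)                 # acceleration magnitude
    spectrum = np.abs(np.fft.rfft(mag - mag.mean()))  # frequency content
    return np.array([mag.mean(), mag.std(), np.percentile(mag, 90),
                     float(spectrum.argmax()),        # dominant frequency bin
                     spectrum.max() / spectrum.sum()])

rng = np.random.default_rng(2)
X = np.array([features(rng.normal(size=(1000, 3))) for _ in range(60)])
y = rng.integers(0, 2, size=60)       # placeholder labels: 1 = PD, 0 = control

clf = RandomForestClassifier(n_estimators=300, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```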
Abstract:
This research is focused on the optimisation of resource utilisation in wireless mobile networks with consideration of the users' experienced quality of video streaming services. The study specifically considers the new generation of mobile communication networks, i.e. 4G-LTE, as the main research context. The background study provides an overview of the main properties of the relevant technologies investigated. These include video streaming protocols and networks, video service quality assessment methods, the infrastructure and related functionalities of LTE, and resource allocation algorithms in mobile communication systems. A mathematical model based on an objective and no-reference quality assessment metric for video streaming, namely Pause Intensity, is developed in this work for the evaluation of the continuity of streaming services. The analytical model is verified by extensive simulation and subjective testing on the joint impairment effects of pause duration and pause frequency. Various types of video content and different levels of impairment were used in the validation tests. It has been shown that Pause Intensity is closely correlated with the subjective quality measurement in terms of the Mean Opinion Score, and this correlation property is content independent. Based on the Pause Intensity metric, an optimised resource allocation approach is proposed for the given user requirements, communication system specifications and network performance. This approach concerns both system efficiency and fairness when establishing appropriate resource allocation algorithms, together with consideration of the correlation between the required and allocated data rates per user. Pause Intensity plays a key role here, representing the required level of Quality of Experience (QoE) to ensure the best balance between system efficiency and fairness. The 3GPP Long Term Evolution (LTE) system is used as the main application environment where the proposed research framework is examined and the results are compared with existing scheduling methods on the achievable fairness, efficiency and correlation. Adaptive video streaming technologies are also investigated and combined with our initiatives on determining the distribution of QoE performance across the network. The resulting scheduling process is controlled through the prioritisation of users by considering their perceived quality for the services received. Meanwhile, a trade-off between fairness and efficiency is maintained through an online adjustment of the scheduler's parameters. Furthermore, Pause Intensity is applied as a regulator to realise the rate adaptation function during the end user's playback of the adaptive streaming service. The adaptive rates under various channel conditions and the shape of the QoE distribution amongst the users for different scheduling policies have been demonstrated in the context of LTE. Finally, the work on interworking between the mobile communication system at the macro-cell level and different deployments of WiFi technologies throughout the macro-cell is presented. A QoE-driven approach is proposed to analyse the offloading mechanism of the user's data (e.g. video traffic), while the new rate distribution algorithm reshapes the network capacity across the macro-cell. The scheduling policy derived is used to regulate the performance of the resource allocation across the fair-efficient spectrum.
The associated offloading mechanism can properly control the number of users within the coverage of the macro-cell base station and of each WiFi access point involved. The performance of non-seamless and user-controlled mobile traffic offloading (through the mobile WiFi devices) has been evaluated and compared with that of the standard operator-controlled WiFi hotspots.
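The thesis gives the formal definition of Pause Intensity; the sketch below is only one plausible formalisation consistent with the description above, combining the two impairment factors (pause duration and pause frequency) over a playback session.

```python
# Sketch: a pause-based continuity metric combining stall fraction and
# pause rate, in the spirit of Pause Intensity as described above; the
# thesis's exact definition may differ from this illustrative form.
def pause_intensity(pauses, session_seconds):
    """pauses: list of stall durations (s) within one playback session."""
    if not pauses:
        return 0.0
    stall_fraction = sum(pauses) / session_seconds  # pause-duration effect
    pause_rate = len(pauses) / session_seconds      # pause-frequency effect
    return stall_fraction * pause_rate              # larger -> worse continuity

# hypothetical 120 s session with three stalls
print(pause_intensity([2.0, 1.5, 3.0], 120.0))
```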
Abstract:
Phonological tasks are highly predictive of reading development, but their complexity obscures the underlying mechanisms driving this association. There are three key components hypothesised to drive the relationship between phonological tasks and reading: (a) the linguistic nature of the stimuli, (b) the phonological complexity of the stimuli, and (c) the production of a verbal response. We isolated the contribution of the stimulus and response components separately through the creation of latent variables to represent specially designed tasks that were matched for procedure. These tasks were administered to 570 six- to seven-year-old children along with standardised tests of regular word and non-word reading. A structural equation model, where tasks were grouped according to stimulus, revealed that the linguistic nature and the phonological complexity of the stimulus predicted unique variance in decoding, over and above matched comparison tasks without these components. An alternative model, grouped according to response mode, showed that the production of a verbal response was a unique predictor of decoding beyond matched tasks without a verbal response. In summary, we found that multiple factors contributed to reading development, supporting multivariate models over those that prioritise single factors. More broadly, we demonstrate the value of combining matched task designs with latent variable modelling to deconstruct the components of complex tasks.
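A structural sketch of how such a latent-variable model can be specified; this uses the semopy library with lavaan-style syntax, and the variable names and data below are invented placeholders, not the study's measures.

```python
# Sketch: a two-factor structural equation model with a latent outcome,
# mirroring the modelling strategy described above. Built with semopy;
# variable names are invented and the data are random placeholders.
import numpy as np
import pandas as pd
from semopy import Model

cols = ["task_v1", "task_v2", "task_n1", "task_n2", "word_read", "nonword_read"]
rng = np.random.default_rng(3)
df = pd.DataFrame(rng.normal(size=(570, 6)), columns=cols)  # placeholder scores

desc = """
Verbal =~ task_v1 + task_v2
NonVerbal =~ task_n1 + task_n2
Decoding =~ word_read + nonword_read
Decoding ~ Verbal + NonVerbal
"""
model = Model(desc)
model.fit(df)
print(model.inspect())   # path estimates: unique prediction of Decoding
```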
Abstract:
Objective: To test the practicality and effectiveness of cheap, ubiquitous, consumer-grade smartphones to discriminate Parkinson's disease (PD) subjects from healthy controls, using self-administered tests of gait and postural sway. Background: Existing tests for the diagnosis of PD are based on subjective neurological examinations, performed in-clinic. Objective movement symptom severity data, collected using widely-accessible technologies such as smartphones, would enable the remote characterization of PD symptoms based on self-administered, behavioral tests. Smartphones, when backed up by interviews using web-based videoconferencing, could make it feasible for expert neurologists to perform diagnostic testing on large numbers of individuals at low cost. However, to date, the compliance rate of testing using smartphones has not been assessed. Methods: We conducted a one-month controlled study with twenty participants, comprising 10 PD subjects and 10 controls. All participants were provided identical LG Optimus S smartphones, capable of recording tri-axial acceleration. Using these smartphones, patients conducted self-administered, short (less than 5 minutes) controlled gait and postural sway tests. We analyzed a wide range of summary measures of gait and postural sway from the accelerometry data. Using statistical machine learning techniques, we identified discriminating patterns in the summary measures in order to distinguish PD subjects from controls. Results: Compliance was high: all 20 participants performed an average of 3.1 tests per day for the duration of the study. Using this test data, we demonstrated cross-validated sensitivity of 98% and specificity of 98% in discriminating PD subjects from healthy controls. Conclusions: Using consumer-grade smartphone accelerometers, it is possible to distinguish PD from healthy controls with high accuracy. Since these smartphones are inexpensive (around $30 each) and easily available, and the tests are highly non-invasive and objective, we envisage that this kind of smartphone-based testing could radically increase the reach and effectiveness of experts in diagnosing PD.
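The reported cross-validated sensitivity and specificity can be computed from out-of-fold predictions and a confusion matrix, as sketched below with placeholder features and labels (not the study's data).

```python
# Sketch: cross-validated sensitivity/specificity of a binary classifier,
# of the kind reported above. Features and labels are random placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 10))      # placeholder gait/sway summary measures
y = rng.integers(0, 2, size=200)    # placeholder labels: 1 = PD, 0 = control

pred = cross_val_predict(RandomForestClassifier(random_state=0), X, y, cv=10)
tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
print(f"sensitivity = {tp / (tp + fn):.3f}, specificity = {tn / (tn + fp):.3f}")
```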