28 results for Manual handling of loads

in CentAUR: Central Archive at the University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

Background: The analysis of the Auditory Brainstem Response (ABR) is of fundamental importance to the investigation of the auditory system behaviour, though its interpretation has a subjective nature because of the manual process employed in its study and the clinical experience required for its analysis. When analysing the ABR, clinicians are often interested in the identification of ABR signal components referred to as Jewett waves. In particular, the detection and study of the time when these waves occur (i.e., the wave latency) is a practical tool for the diagnosis of disorders affecting the auditory system. Significant differences in inter-examiner results may lead to completely distinct clinical interpretations of the state of the auditory system. In this context, the aim of this research was to evaluate the inter-examiner agreement and variability in the manual classification of ABR. Methods: A total of 160 ABR data samples were collected, at four different stimulus intensities (80 dBHL, 60 dBHL, 40 dBHL and 20 dBHL), from 10 normal-hearing subjects (5 men and 5 women, aged 20 to 52 years). Four examiners with expertise in the manual classification of ABR components participated in the study. The Bland-Altman statistical method was employed for the assessment of inter-examiner agreement and variability. The mean, standard deviation and error of the bias, which is the difference between examiners' annotations, were estimated for each pair of examiners. Scatter plots and histograms were employed for data visualization and analysis. Results: In most comparisons the differences between examiners' annotations were below 0.1 ms, which is clinically acceptable. In four cases, a large error and standard deviation (>0.1 ms) were found, indicating the presence of outliers and thus discrepancies between examiners. Conclusions: Our results quantify the inter-examiner agreement and variability of the manual analysis of ABR data, and they also allow for the determination of different patterns of manual ABR analysis.
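
As a concrete illustration of the statistics described above, the following minimal Python sketch computes the Bland-Altman bias, standard deviation of the differences, and 95% limits of agreement for two examiners' latency annotations. The latency values are invented for illustration and are not data from the study.

```python
import numpy as np

def bland_altman(examiner_a, examiner_b):
    """Bland-Altman agreement statistics for two sets of latency annotations (ms)."""
    a = np.asarray(examiner_a, dtype=float)
    b = np.asarray(examiner_b, dtype=float)
    diff = a - b                      # per-sample difference (bias) between examiners
    mean = (a + b) / 2.0              # per-sample mean annotation
    bias = diff.mean()                # mean difference (systematic bias)
    sd = diff.std(ddof=1)             # standard deviation of the differences
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement
    return mean, diff, bias, sd, loa

# Example: invented wave V latencies (ms) annotated by two examiners for the same ABRs
a = [5.62, 5.70, 5.58, 5.81, 5.66]
b = [5.60, 5.74, 5.55, 5.79, 5.71]
_, _, bias, sd, loa = bland_altman(a, b)
print(f"bias = {bias:.3f} ms, sd = {sd:.3f} ms, limits of agreement = {loa}")
```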

Relevance:

100.00%

Publisher:

Abstract:

The Integrated Catchment Model of Nitrogen (INCA-N) was applied to the River Lambourn, a Chalk river-system in southern England. The model's ability to simulate the long-term trend and seasonal patterns in observed stream water nitrate concentrations from 1920 to 2003 was tested. This is the first time a semi-distributed, daily time-step model has been applied to simulate such a long period and then used to calculate detailed catchment nutrient budgets which span the conversion of pasture to arable land during the late 1930s and 1940s. Thus, this work goes beyond source apportionment and seeks to demonstrate how such simulations can be used to assess the state of the catchment and develop an understanding of system behaviour. The mass-balance results from 1921, 1922, 1991, 2001 and 2002 are presented, and those for 1991 are compared to other modelled and literature values of loads associated with nitrogen soil processes and export. The variations highlighted the problem of comparing modelled fluxes with point measurements but proved useful for identifying the most poorly understood inputs and processes, thereby providing an assessment of input data and model structural uncertainty. The modelled terrestrial and instream mass-balances also highlight the importance of hydrological conditions in pollutant transport. Between 1922 and 2002, increased inputs of nitrogen from fertiliser, livestock and deposition altered the nitrogen balance, with a shift from a possible reduction in soil fertility but little environmental impact in 1922 to a situation of nitrogen accumulation in the soil, groundwater and instream biota in 2002. It was estimated that approximately 2 and 18 kg N ha⁻¹ yr⁻¹ were exported from the land to the stream in 1922 and 2002, respectively. The utility of the approach and further considerations for the best use of models are discussed. (C) 2008 Elsevier B.V. All rights reserved.
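
To make the budgeting idea concrete, here is a minimal sketch of an annual nitrogen mass balance of the kind such catchment budgets summarise. All of the terms and values below are placeholders for illustration, not figures from the INCA-N application.

```python
# Minimal annual nitrogen mass balance (kg N per ha per yr). Placeholder values only.
inputs = {"fertiliser": 100.0, "livestock": 25.0, "atmospheric_deposition": 15.0}
outputs = {"crop_offtake": 90.0, "denitrification": 10.0, "stream_export": 18.0}

total_in = sum(inputs.values())
total_out = sum(outputs.values())
# Positive storage change = accumulation in soil/groundwater; negative = depletion
storage_change = total_in - total_out

print(f"inputs = {total_in:.0f}, outputs = {total_out:.0f}, "
      f"net accumulation = {storage_change:.0f} kg N ha^-1 yr^-1")
```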

Relevance:

100.00%

Publisher:

Abstract:

The study of the morphodynamics of tidal channel networks is important because of their role in tidal propagation and the evolution of salt-marshes and tidal flats. Channel dimensions range from tens of metres wide and metres deep near the low water mark to only 20-30 cm wide and 20 cm deep for the smallest channels on the marshes. The conventional method of measuring the networks is cumbersome, involving manual digitising of aerial photographs. This paper describes a semi-automatic knowledge-based network extraction method that is being implemented to work using airborne scanning laser altimetry (and later aerial photography). The channels exhibit a width variation of several orders of magnitude, making an approach based on multi-scale line detection difficult. The processing therefore uses multi-scale edge detection to detect channel edges, then associates adjacent anti-parallel edges together to form channels using a distance-with-destination transform. Breaks in the networks are repaired by extending channel ends along their existing directions to join with nearby channels, using the domain knowledge that flow paths should proceed downhill and that any network fragment should be joined to a nearby fragment so as to connect eventually to the open sea.
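
The low-level detection step can be sketched as a union of edge maps over several smoothing scales, so that wide channels respond at coarse scales and the smallest creeks at fine scales. The fragment below uses scikit-image's Canny detector as a stand-in; the anti-parallel edge pairing and the distance-with-destination transform are not shown.

```python
import numpy as np
from skimage import feature

def multiscale_channel_edges(dem, sigmas=(1.0, 2.0, 4.0, 8.0)):
    """Union of Canny edge maps over several Gaussian smoothing scales.

    `dem` is a 2-D array of LiDAR elevations.
    """
    edges = np.zeros(dem.shape, dtype=bool)
    for sigma in sigmas:
        edges |= feature.canny(dem, sigma=sigma)
    return edges

# Toy example: a synthetic surface with a channel-like depression plus noise
y, x = np.mgrid[0:128, 0:128]
dem = np.where(np.abs(x - 64) < 5, -1.0, 0.0) + 0.01 * np.random.rand(128, 128)
print(multiscale_channel_edges(dem).sum(), "edge pixels detected")
```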

Relevance:

100.00%

Publisher:

Abstract:

Two ongoing projects at ESSC that involve the development of new techniques for extracting information from airborne LiDAR data and combining this information with environmental models will be discussed. The first project, in conjunction with Bristol University, aims to improve 2-D river flood flow models by using remote sensing to provide distributed data for model calibration and validation. Airborne LiDAR can provide such models with a dense and accurate floodplain topography together with vegetation heights for parameterisation of model friction. The vegetation height data can be used to specify a friction factor at each node of a model's finite element mesh. A LiDAR range image segmenter has been developed which converts a LiDAR image into separate raster maps of surface topography and vegetation height for use in the model. Satellite and airborne SAR data have been used to measure flood extent remotely in order to validate the modelled flood extent. Methods have also been developed for improving the models by decomposing the model's finite element mesh to reflect floodplain features such as hedges and trees having different frictional properties to their surroundings. Originally developed for rural floodplains, the segmenter is currently being extended to provide DEMs and friction parameter maps for urban floods by fusing the LiDAR data with digital map data.

The second project is concerned with the extraction of tidal channel networks from LiDAR. These networks are important features of the inter-tidal zone and play a key role in tidal propagation and in the evolution of salt-marshes and tidal flats. The study of their morphology is currently an active area of research, and a number of theories related to networks have been developed which require validation using dense and extensive observations of network forms and cross-sections. The conventional method of measuring networks is cumbersome and subjective, involving manual digitisation of aerial photographs in conjunction with field measurement of channel depths and widths for selected parts of the network. A semi-automatic technique has been developed to extract networks from LiDAR data of the inter-tidal zone. A multi-level knowledge-based approach has been implemented, whereby low-level algorithms first extract channel fragments based mainly on image properties, then a high-level processing stage improves the network using domain knowledge. The approach adopted at low level uses multi-scale edge detection to detect channel edges, then associates adjacent anti-parallel edges together to form channels. The higher-level processing includes a channel repair mechanism.
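
Relating to the first project above, a minimal sketch of how vegetation heights can parameterise friction is to derive a per-cell height from the difference between a first-return surface and a bare-earth terrain model, then map it to a friction factor. The linear height-to-friction rule below is a hypothetical stand-in for the project's calibrated relationship.

```python
import numpy as np

def friction_map(dsm, dtm, n_ground=0.035, a=0.02):
    """Per-cell friction factor from LiDAR-derived vegetation height.

    `dsm` is the first-return surface, `dtm` the bare-earth terrain (both metres).
    The linear rule n_ground + a * height is illustrative, not calibrated.
    """
    veg_height = np.clip(dsm - dtm, 0.0, None)   # vegetation height, floored at zero
    return n_ground + a * veg_height             # taller vegetation -> higher friction

dsm = np.array([[2.5, 0.1], [1.2, 0.0]])  # first-return surface (m)
dtm = np.array([[0.0, 0.0], [0.0, 0.0]])  # bare-earth terrain (m)
print(friction_map(dsm, dtm))
```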

Relevance:

100.00%

Publisher:

Abstract:

The development of effective methods for predicting the quality of three-dimensional (3D) models is fundamentally important for the success of tertiary structure (TS) prediction strategies. Since CASP7, the Quality Assessment (QA) category has existed to gauge the ability of various model quality assessment programs (MQAPs) at predicting the relative quality of individual 3D models. For the CASP8 experiment, automated predictions were submitted in the QA category using two methods from the ModFOLD server: ModFOLD version 1.1 and ModFOLDclust. ModFOLD version 1.1 is a single-model, machine-learning-based method, which was used for automated predictions of global model quality (QMODE1). ModFOLDclust is a simple clustering-based method, which was used for automated predictions of both global and local quality (QMODE2). In addition, manual predictions of model quality were made using ModFOLD version 2.0, an experimental method that combines the scores from ModFOLDclust and ModFOLD v1.1. Predictions from the ModFOLDclust method were the most successful of the three in terms of global model quality, whilst the ModFOLD v1.1 method was comparable in performance to other single-model based methods. In addition, the ModFOLDclust method performed well at predicting the per-residue, or local, model quality scores. Predictions of the per-residue errors in our own 3D models, selected using the ModFOLD v2.0 method, were also the most accurate compared with those from other methods. All of the MQAPs described are publicly accessible via the ModFOLD server at: http://www.reading.ac.uk/bioinf/ModFOLD/. The methods are also freely available to download from: http://www.reading.ac.uk/bioinf/downloads/.
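
The core idea of clustering-based quality assessment can be sketched as scoring each candidate model by its mean pairwise structural similarity to the rest of the ensemble. The fragment below assumes a precomputed similarity matrix (e.g. pairwise TM-scores) and does not reproduce ModFOLDclust's actual scoring details.

```python
import numpy as np

def consensus_quality(similarity):
    """Score each 3-D model by its mean structural similarity to all other models.

    `similarity` is an n x n symmetric matrix, e.g. pairwise TM-scores in [0, 1].
    """
    s = np.asarray(similarity, dtype=float)
    n = s.shape[0]
    off_diag = s.sum(axis=1) - np.diag(s)   # exclude self-similarity
    return off_diag / (n - 1)

# Toy pairwise similarity matrix for four candidate models
sim = np.array([[1.0, 0.8, 0.7, 0.3],
                [0.8, 1.0, 0.75, 0.35],
                [0.7, 0.75, 1.0, 0.3],
                [0.3, 0.35, 0.3, 1.0]])
print(consensus_quality(sim))   # the outlier (last model) scores lowest
```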

Relevance:

100.00%

Publisher:

Abstract:

Nested clade phylogeographic analysis (NCPA) is a popular method for reconstructing the demographic history of spatially distributed populations from genetic data. Although some parts of the analysis are automated, there is no unique and widely followed algorithm for doing this in its entirety, beginning with the data, and ending with the inferences drawn from the data. This article describes a method that automates NCPA, thereby providing a framework for replicating analyses in an objective way. To do so, a number of decisions need to be made so that the automated implementation is representative of previous analyses. We review how the NCPA procedure has evolved since its inception and conclude that there is scope for some variability in the manual application of NCPA. We apply the automated software to three published datasets previously analyzed manually and replicate many details of the manual analyses, suggesting that the current algorithm is representative of how a typical user will perform NCPA. We simulate a large number of replicate datasets for geographically distributed, but entirely random-mating, populations. These are then analyzed using the automated NCPA algorithm. Results indicate that NCPA tends to give a high frequency of false positives. In our simulations we observe that 14% of the clades give a conclusive inference that a demographic event has occurred, and that 75% of the datasets have at least one clade that gives such an inference. This is mainly due to the generation of multiple statistics per clade, of which only one is required to be significant to apply the inference key. We survey the inferences that have been made in recent publications and show that the most commonly inferred processes (restricted gene flow with isolation by distance and contiguous range expansion) are those that are commonly inferred in our simulations. However, published datasets typically yield a richer set of inferences with NCPA than obtained in our random-mating simulations, and further testing of NCPA with models of structured populations is necessary to examine its accuracy.
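
The multiple-statistics mechanism identified above is easy to demonstrate: if a clade is flagged when any one of several statistics is significant at alpha = 0.05, the per-clade and per-dataset false-positive rates inflate well beyond 5% even under the null. The simulation below uses illustrative numbers of clades and statistics per clade, not those of the paper, and assumes independent uniform p-values under the null.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, n_stats, n_clades, n_datasets = 0.05, 4, 20, 10_000

# Under the null, each statistic's p-value is Uniform(0, 1). A clade "fires"
# if ANY of its statistics is significant; a dataset fires if any clade does.
p = rng.uniform(size=(n_datasets, n_clades, n_stats))
clade_fires = (p < alpha).any(axis=2)

print("per-clade false-positive rate:", clade_fires.mean())        # ~ 1 - (1 - alpha)^n_stats
print("per-dataset false-positive rate:", clade_fires.any(axis=1).mean())
```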

Relevance:

100.00%

Publisher:

Abstract:

Background: The analysis of the Auditory Brainstem Response (ABR) is of fundamental importance to the investigation of the auditory system behavior, though its interpretation has a subjective nature because of the manual process employed in its study and the clinical experience required for its analysis. When analyzing the ABR, clinicians are often interested in the identification of ABR signal components referred to as Jewett waves. In particular, the detection and study of the time when these waves occur (i.e., the wave latency) is a practical tool for the diagnosis of disorders affecting the auditory system. In this context, the aim of this research is to compare ABR manual/visual analyses provided by different examiners. Methods: The ABR data were collected from 10 normal-hearing subjects (5 men and 5 women, from 20 to 52 years). A total of 160 data samples were analyzed and a pairwise comparison between four distinct examiners was executed. We carried out a statistical study aiming to identify significant differences between assessments provided by the examiners. For this, we used linear regression in conjunction with bootstrap as a method for evaluating the relation between the responses given by the examiners. Results: The analysis suggests agreement among examiners but reveals differences between assessments of the variability of the waves. We quantified the magnitude of the obtained wave latency differences: 18% of the investigated waves presented substantial (large or moderate) differences, and of these, 3.79% were considered not acceptable for clinical practice. Conclusions: Our results characterize the variability of the manual analysis of ABR data and the necessity of establishing unified standards and protocols for the analysis of these data. These results may also contribute to the validation and development of automatic systems that are employed in the early diagnosis of hearing loss.
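
A minimal sketch of the regression-with-bootstrap comparison might look as follows: resample annotation pairs with replacement and collect the fitted slope, whose distance from 1 measures systematic disagreement between examiners. The latency values are invented, and the procedure is a generic stand-in for the authors' exact protocol.

```python
import numpy as np

def bootstrap_slope(x, y, n_boot=2000, seed=0):
    """Bootstrap the slope of a least-squares fit between two examiners'
    latency annotations; perfect agreement corresponds to slope ~ 1."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    rng = np.random.default_rng(seed)
    n = len(x)
    slopes = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, size=n)           # resample pairs with replacement
        slopes[i] = np.polyfit(x[idx], y[idx], 1)[0]
    return slopes.mean(), np.percentile(slopes, [2.5, 97.5])

# Invented wave latencies (ms) from two examiners on the same recordings
x = [1.62, 3.75, 5.60, 1.58, 3.80, 5.72, 1.66, 3.70]
y = [1.60, 3.78, 5.63, 1.55, 3.77, 5.75, 1.70, 3.68]
mean_slope, ci = bootstrap_slope(x, y)
print(f"slope = {mean_slope:.3f}, 95% CI = {ci}")
```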

Relevance:

100.00%

Publisher:

Abstract:

Whole-genome sequencing (WGS) could potentially provide a single platform for extracting all the information required to predict an organism's phenotype. However, its ability to provide accurate predictions has not yet been demonstrated in large independent studies of specific organisms. In this study, we aimed to develop a genotypic prediction method for antimicrobial susceptibilities. The whole genomes of 501 unrelated Staphylococcus aureus isolates were sequenced, and the assembled genomes were interrogated using BLASTn for a panel of known resistance determinants (chromosomal mutations and genes carried on plasmids). Results were compared with phenotypic susceptibility testing for 12 commonly used antimicrobial agents (penicillin, methicillin, erythromycin, clindamycin, tetracycline, ciprofloxacin, vancomycin, trimethoprim, gentamicin, fusidic acid, rifampin, and mupirocin) performed by the routine clinical laboratory. We investigated discrepancies by repeat susceptibility testing and manual inspection of the sequences and used this information to optimize the resistance determinant panel and BLASTn algorithm. We then tested the performance of the optimized tool in an independent validation set of 491 unrelated isolates, with phenotypic results obtained in duplicate by automated broth dilution (BD Phoenix) and disc diffusion. In the validation set, the overall sensitivity and specificity of the genomic prediction method were 0.97 (95% confidence interval [95% CI], 0.95 to 0.98) and 0.99 (95% CI, 0.99 to 1), respectively, compared to standard susceptibility testing methods. The very major error rate was 0.5%, and the major error rate was 0.7%. WGS was as sensitive and specific as routine antimicrobial susceptibility testing methods. WGS is a promising alternative to culture methods for resistance prediction in S. aureus and ultimately other major bacterial pathogens.
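
The concordance measures reported above can be computed from paired genotypic and phenotypic calls as sketched below. The error-rate definitions follow one common convention (very major error: phenotypically resistant isolate predicted susceptible; major error: susceptible isolate predicted resistant), and the toy data are not results from the study.

```python
def susceptibility_concordance(genotype_resistant, phenotype_resistant):
    """Compare genotypic resistance predictions with phenotypic results.

    Each input is a sequence of booleans, True = resistant.
    """
    tp = fp = tn = fn = 0
    for g, p in zip(genotype_resistant, phenotype_resistant):
        if g and p:       tp += 1   # correctly predicted resistant
        elif g and not p: fp += 1   # predicted resistant, actually susceptible
        elif not g and p: fn += 1   # predicted susceptible, actually resistant
        else:             tn += 1   # correctly predicted susceptible
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "very_major_error_rate": fn / (tp + fn),  # resistant called susceptible
        "major_error_rate": fp / (tn + fp),       # susceptible called resistant
    }

# Toy data only, not results from the study
g = [True, True, False, False, True, False, False, False]
p = [True, True, True,  False, True, False, False, False]
print(susceptibility_concordance(g, p))
```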

Relevance:

100.00%

Publisher:

Abstract:

We present a method for the recognition of complex actions. Our method combines automatic learning of simple actions and manual definition of complex actions in a single grammar. Contrary to the general trend in complex action recognition, which consists of dividing recognition into two stages, our method performs recognition of simple and complex actions in a unified way. This is achieved by encoding simple action HMMs within the stochastic grammar that models complex actions. This unified approach enables a more effective influence of the higher activity layers on the recognition of simple actions, which leads to a substantial improvement in the classification of complex actions. We consider the recognition of complex actions based on person transits between areas in the scene. As input, our method receives crossings of tracks along a set of zones which are derived using unsupervised learning of the movement patterns of the objects in the scene. We evaluate our method on a large dataset showing normal, suspicious and threat behaviour on a parking lot. Experiments show an improvement of ~30% in the recognition of both high-level scenarios and their composing simple actions with respect to a two-stage approach. Experiments with synthetic noise simulating the most common tracking failures show that our method only experiences a limited decrease in performance when moderate amounts of noise are added.
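
At the heart of scoring a simple action is evaluating a zone-crossing sequence under an HMM, which the forward algorithm does in a few lines. The sketch below shows only this step, with invented parameters; embedding the HMM scores inside a stochastic grammar over complex actions is not shown.

```python
import numpy as np

def forward_loglik(obs, log_pi, log_A, log_B):
    """Log-likelihood of an observation sequence under a discrete HMM
    (forward algorithm). `obs` is a list of symbol indices, e.g. zone crossings."""
    alpha = log_pi + log_B[:, obs[0]]
    for o in obs[1:]:
        # log-sum-exp over previous states, then emit the current symbol
        alpha = np.logaddexp.reduce(alpha[:, None] + log_A, axis=0) + log_B[:, o]
    return np.logaddexp.reduce(alpha)

# Toy 2-state HMM over 3 zone-crossing symbols; all parameters are invented.
pi = np.log([0.6, 0.4])                                # initial state distribution
A = np.log([[0.7, 0.3], [0.2, 0.8]])                   # state transitions
B = np.log([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])         # emission probabilities
print(forward_loglik([0, 1, 2, 2], pi, A, B))
```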

Relevance:

100.00%

Publisher:

Abstract:

Human brain imaging techniques, such as Magnetic Resonance Imaging (MRI) or Diffusion Tensor Imaging (DTI), have been established as scientific and diagnostic tools and their adoption is growing in popularity. Statistical methods, machine learning and data mining algorithms have successfully been adopted to extract predictive and descriptive models from neuroimage data. However, the knowledge discovery process typically requires the adoption of pre-processing, post-processing and visualisation techniques in complex data workflows. Currently, a main problem for the integrated pre-processing and mining of MRI data is the lack of comprehensive platforms able to avoid the manual invocation of pre-processing and mining tools, which leads to an error-prone and inefficient process. In this work we present K-Surfer, a novel plug-in for the Konstanz Information Miner (KNIME) workbench that automates the pre-processing of brain images and leverages the mining capabilities of KNIME in an integrated way. K-Surfer supports the importing, filtering, merging and pre-processing of neuroimage data from FreeSurfer, a tool for human brain MRI feature extraction and interpretation. K-Surfer automates the steps for importing FreeSurfer data, reducing time costs, eliminating human errors and enabling the design of complex analytics workflows for neuroimage data by leveraging the rich functionalities available in the KNIME workbench.
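
The kind of import step K-Surfer automates can be sketched as parsing a FreeSurfer .stats table (e.g. aseg.stats), whose '#'-prefixed header includes a ColHeaders line naming the whitespace-delimited data columns. This is a generic sketch, not K-Surfer's implementation, and the file path in the usage comment is hypothetical.

```python
import pandas as pd

def read_freesurfer_stats(path):
    """Parse a FreeSurfer .stats table into a pandas DataFrame.

    Lines starting with '#' are headers; the '# ColHeaders ...' line names
    the whitespace-delimited data columns that follow.
    """
    columns, rows = None, []
    with open(path) as fh:
        for line in fh:
            if line.startswith("# ColHeaders"):
                columns = line.split()[2:]          # column names after the tag
            elif line.startswith("#") or not line.strip():
                continue                            # skip comments and blanks
            else:
                rows.append(line.split())
    return pd.DataFrame(rows, columns=columns)

# df = read_freesurfer_stats("subject01/stats/aseg.stats")  # hypothetical path
# print(df[["StructName", "Volume_mm3"]].head())
```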

Relevance:

100.00%

Publisher:

Abstract:

Resistive respiratory loading is an established stimulus for the induction of experimental dyspnoea. In comparison to unloaded breathing, resistive loaded breathing alters end-tidal CO2 (PETCO2), which has independent physiological effects (e.g. upon cerebral blood flow). We investigated the subjective effects of resistive loaded breathing with stabilised PETCO2 (isocapnia), achieved through manual control of inspired gases, at varying baseline levels of mild hypercapnia (increased PETCO2). Furthermore, to investigate whether perceptual habituation to dyspnoea stimuli occurs, the study was repeated over four experimental sessions. Isocapnic hypercapnia did not affect dyspnoea unpleasantness during resistive loading. A post hoc analysis revealed a small increase of respiratory unpleasantness during unloaded breathing at +0.6 kPa, the level that reliably induced isocapnia. We did not observe perceptual habituation over the four sessions. We conclude that isocapnic respiratory loading allows stable induction of respiratory unpleasantness, making it a good stimulus for multi-session studies of dyspnoea.

Relevance:

100.00%

Publisher:

Abstract:

An algorithm is presented for the generation of molecular models of defective graphene fragments, containing a majority of 6-membered rings with a small number of 5- and 7-membered rings as defects. The structures are generated from an initial random array of points in 2D space, which are then subject to Delaunay triangulation. The dual of the triangulation forms a Voronoi tessellation of polygons with a range of ring sizes. An iterative cycle of refinement, involving deletion and addition of points followed by further triangulation, is performed until the user-defined criteria for the number of defects are met. The array of points and connectivities are then converted to a molecular structure and subjected to geometry optimization using a standard molecular modeling package to generate final atomic coordinates. On the basis of molecular mechanics with minimization, this automated method can generate structures that conform to user-supplied criteria and avoid the potential bias associated with the manual building of structures. One application of the algorithm is the generation of structures for the evaluation of the reactivity of different defect sites. Ab initio electronic structure calculations on a representative structure indicate preferential fluorination close to 5-ring defects.
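
The first stage of the algorithm is straightforward to sketch with SciPy: random 2-D points are tessellated and the ring-size distribution of the bounded Voronoi cells is inspected. Purely random points give a broad ring-size distribution; it is the iterative refinement described above (omitted here), together with geometry optimisation, that narrows the distribution towards mostly 6-rings.

```python
from collections import Counter

import numpy as np
from scipy.spatial import Voronoi

# Random 2-D points; their Voronoi tessellation is the dual of the
# Delaunay triangulation and yields a polygonal ring network.
rng = np.random.default_rng(1)
points = rng.uniform(0.0, 10.0, size=(200, 2))
vor = Voronoi(points)

# Ring-size statistics over bounded Voronoi cells (those not touching infinity,
# i.e. regions that contain no -1 vertex index)
sizes = Counter(len(region) for region in vor.regions
                if region and -1 not in region)
print(sorted(sizes.items()))   # counts of 4-, 5-, 6-, 7-sided cells, etc.
```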

Relevance:

100.00%

Publisher:

Abstract:

Modelling of disorder in organic crystals is highly desirable since it would allow thermodynamic stabilities and other disorder-sensitive properties to be estimated for such systems. Two disordered organic molecular systems are modeled using a symmetry-adapted ensemble approach, in which the disordered system is treated as an ensemble of the configurations of a supercell with respect to substitution of one disorder component for another. Computation time is kept manageable by performing calculations only on the symmetrically inequivalent configurations. Calculations are presented on a substitutionally disordered system, the dichloro/dibromobenzene solid solution, and on an orientationally disordered system, eniluracil, and the resultant free energies, disorder patterns, and system properties are discussed. The results are found to be in agreement with experiment following manual removal of physically implausible configurations from ensemble averages, highlighting the dangers of a completely automated approach to organic crystal thermodynamics which ignores the barriers to equilibration once the crystal has been formed.
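
The ensemble-averaging step can be sketched as a Boltzmann-weighted sum over the symmetrically inequivalent supercell configurations, each weighted by its symmetry degeneracy (the number of equivalent configurations it represents). The energies and degeneracies below are invented for illustration.

```python
import numpy as np

def ensemble_average(energies, degeneracies, T=298.15):
    """Boltzmann-weighted average over symmetrically inequivalent configurations.

    `energies` are relative configuration energies in kJ/mol; `degeneracies`
    count the symmetry-equivalent configurations each one represents.
    """
    k_B = 0.0083145  # Boltzmann constant in kJ mol^-1 K^-1
    E = np.asarray(energies, dtype=float)
    g = np.asarray(degeneracies, dtype=float)
    w = g * np.exp(-(E - E.min()) / (k_B * T))   # shift by E.min() for stability
    w /= w.sum()                                  # normalised ensemble weights
    return (w * E).sum(), w

E_avg, weights = ensemble_average([0.0, 1.2, 3.5], [4, 8, 2])  # invented values
print(f"<E> = {E_avg:.2f} kJ/mol, weights = {np.round(weights, 3)}")
```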

Relevance:

100.00%

Publisher:

Abstract:

This letter presents an accurate delay analysis in prioritised wireless sensor networks (WSN). The analysis is an enhancement of the existing analysis proposed by Choobkar and Dilmaghani, which is only applicable to the case where the lower priority nodes always have packets to send in the empty slots of the higher priority node. The proposed analysis is applicable for any pattern of packet arrival, which includes the general case where the lower priority nodes may or may not have packets to send in the empty slots of the higher priority nodes. Evaluation of both analyses showed that the proposed delay analysis has better accuracy over the full range of loads and provides an excellent match to simulation results.
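
A slot-level simulation makes the setting concrete: a low-priority node may transmit only in slots the high-priority node leaves empty, and its mean delay depends on the pattern of packet arrivals. The sketch below uses Bernoulli arrivals as a simplification and is a toy model of the scenario, not the analysis of the letter.

```python
import random

def simulate(p_hi=0.3, p_lo=0.4, n_slots=200_000, seed=0):
    """Mean low-priority delay (in slots) in a two-priority slotted MAC.

    The high-priority node occupies each slot with probability p_hi; a
    low-priority packet arrives each slot with probability p_lo and can
    only be served in slots the high-priority node leaves empty.
    """
    rng = random.Random(seed)
    lo_queue, lo_delay, lo_sent = 0, 0, 0
    for _ in range(n_slots):
        lo_queue += rng.random() < p_lo      # low-priority packet arrival
        hi_busy = rng.random() < p_hi        # high-priority node uses the slot?
        lo_delay += lo_queue                 # every queued packet waits this slot
        if not hi_busy and lo_queue:         # empty slot: low priority transmits
            lo_queue -= 1
            lo_sent += 1
    return lo_delay / max(lo_sent, 1)

print(f"mean low-priority delay ~ {simulate():.2f} slots")
```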