796 results for User-based collaborative filtering
Abstract:
PRINCIPLES: Respiratory care is universally recognised as useful, but its indications and practice vary markedly. In order to improve the appropriateness of respiratory care in our hospital, we developed evidence-based local guidelines in a collaborative effort involving physiotherapists, physicians and health service researchers. METHODS: Recommendations were developed using the standardised RAND appropriateness method. A literature search was conducted based on terms associated with guidelines and with respiratory care. A working group prepared proposals for recommendations which were then independently rated by a multidisciplinary expert panel. All recommendations were then discussed in common and indications for procedures were rated confidentially a second time by the experts. The recommendations were then formulated on the basis of the level of evidence in the literature and on the consensus among these experts. RESULTS: Recommendations were formulated for the following procedures: non-invasive ventilation, continuous positive airway pressure, intermittent positive pressure breathing, intrapulmonary percussive ventilation, mechanical insufflation-exsufflation, incentive spirometry, positive expiratory pressure, nasotracheal suctioning and non-instrumental airway clearance techniques. Each recommendation referred to a particular medical condition and was assigned to a hierarchical category based on the quality of the evidence from the literature supporting the recommendation and on the consensus among the experts. CONCLUSION: Despite a marked heterogeneity of scientific evidence, the method used allowed us to develop commonly agreed local guidelines for respiratory care. In addition, this work fostered a closer relationship between physiotherapists and physicians in our institution.
Abstract:
This study represents the most extensive analysis of batch-to-batch variations in spray paint samples to date. The survey was performed as a collaborative project of the ENFSI (European Network of Forensic Science Institutes) Paint and Glass Working Group (EPG) and involved 11 laboratories. Several studies have already shown that paint samples of similar color but from different manufacturers can usually be differentiated using an appropriate analytical sequence. The discrimination of paints from the same manufacturer and color (batch-to-batch variations) is of great interest and these data are seldom found in the literature. This survey concerns the analysis of batches from different color groups (white, papaya (special shade of orange), red and black) with a wide range of analytical techniques and leads to the following conclusions. Colored batch samples are more likely to be differentiated since their pigment composition is more complex (pigment mixtures, added pigments) and therefore subject to variations. These variations may occur during the paint production but may also occur when checking the paint shade in quality control processes. For these samples, techniques aimed at color/pigment(s) characterization (optical microscopy, microspectrophotometry (MSP), Raman spectroscopy) provide better discrimination than techniques aimed at the organic (binder) or inorganic composition (Fourier transform infrared spectroscopy (FTIR) or elemental analysis (SEM - scanning electron microscopy and XRF - X-ray fluorescence)). White samples contain mainly titanium dioxide as a pigment and the main differentiation is based on the binder composition (C–H stretches) detected either by FTIR or Raman. Black samples contain mainly carbon black as a pigment and are problematic with most of the spectroscopic techniques.
In this case, pyrolysis-GC/MS represents the best technique to detect differences. Globally, Py-GC/MS may show a high discrimination potential for all samples, but the results are highly dependent on the specific instrumental conditions used. Finally, discrimination of the samples was very similar whether the data were interpreted visually or statistically using principal component analysis (PCA). PCA increases sensitivity and could perform better on specific samples, but one first has to ensure that all non-informative variation (baseline deviation) is eliminated by applying correct pre-treatments. Statistical treatments can be used on a large data set and, when combined with an expert's opinion, will provide more objective criteria for decision making.
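The pre-treatment point above can be made concrete. A minimal stdlib sketch, assuming a spectrum sampled on evenly spaced channels, of removing a straight-line baseline (one simple form of non-informative variation) before any PCA is applied:

```python
def detrend_baseline(spectrum):
    # remove a straight-line baseline anchored at the first and last
    # channels; a simple pre-treatment applied before PCA so that
    # baseline drift does not dominate the principal components
    n = len(spectrum)
    a, b = spectrum[0], spectrum[-1]
    return [y - (a + (b - a) * i / (n - 1)) for i, y in enumerate(spectrum)]
```

In practice more elaborate corrections (polynomial or asymmetric-least-squares baselines, normalisation) are used; this two-point version only illustrates the idea.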
Abstract:
We propose a method to estimate time invariant cyclical DSGE models using the information provided by a variety of filters. We treat data filtered with alternative procedures as contaminated proxies of the relevant model-based quantities and estimate structural and non-structural parameters jointly using a signal extraction approach. We employ simulated data to illustrate the properties of the procedure and compare our conclusions with those obtained when just one filter is used. We revisit the role of money in the transmission of monetary business cycles.
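The idea that different filters yield different "contaminated proxies" of the same cycle can be illustrated on a toy series. A minimal stdlib sketch (not the paper's estimator): a linear trend plus a sinusoidal cycle is filtered two ways, and the resulting proxies track the true cycle very differently:

```python
import math

def pearson(x, y):
    # sample Pearson correlation between two equal-length series
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# simulated data: linear trend plus a 40-period sinusoidal cycle
T = 200
trend = [0.05 * t for t in range(T)]
cycle = [math.sin(2 * math.pi * t / 40) for t in range(T)]
y = [a + b for a, b in zip(trend, cycle)]

# filter 1: deviation from a centred 41-point moving average
half = 20
proxy_ma = [y[c] - sum(y[c - half:c + half + 1]) / (2 * half + 1)
            for c in range(half, T - half)]

# filter 2: first difference
proxy_diff = [y[t] - y[t - 1] for t in range(1, T)]
```

On this series the moving-average proxy is almost perfectly correlated with the true cycle, while the first difference shifts the phase by roughly a quarter cycle; the signal-extraction approach described above combines such proxies instead of committing to a single filter.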
Abstract:
BACKGROUND: Knowledge of normal heart weight ranges is important information for pathologists. Comparing the measured heart weight to reference values is one of the key elements used to determine if the heart is pathological, as heart weight increases in many cardiac pathologies. The current reference tables are old and in need of an update. AIMS: The purposes of this study are to establish new reference tables for normal heart weights in the local population and to determine the best predictive factor for normal heart weight. We also aim to provide a technical tool to calculate the predicted normal heart weight. METHODS: The reference values are based on retrospective analysis of adult Caucasian autopsy cases without any obvious pathology that were collected at the University Centre of Legal Medicine in Lausanne from 2007 to 2011. We selected 288 cases. The mean age was 39.2 years. There were 118 men and 170 women. Regression analyses were performed to assess the relationship of heart weight to body weight, body height, body mass index (BMI) and body surface area (BSA). RESULTS: The heart weight increased along with an increase in all the parameters studied. The mean heart weight was greater in men than in women at a similar body weight. BSA was determined to be the best predictor for normal heart weight. New reference tables for predicted heart weights are presented as a web application that enables the comparison of heart weights observed at autopsy with the reference values. CONCLUSIONS: The reference tables for heart weight and other organs should be systematically updated and adapted for the local population. Web access and smartphone applications for the predicted heart weight represent important investigational tools.
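Since BSA came out as the best predictor, a calculator of the kind described essentially evaluates a BSA formula and a regression on top of it. A minimal sketch using the classical Du Bois formula for BSA; the regression intercept and slope below are illustrative placeholders, not the coefficients fitted in this study:

```python
def bsa_du_bois(weight_kg, height_cm):
    # Du Bois & Du Bois body-surface-area formula, result in m^2
    return 0.007184 * (weight_kg ** 0.425) * (height_cm ** 0.725)

def predicted_heart_weight(weight_kg, height_cm,
                           intercept=10.0, slope=180.0):
    # hypothetical linear model: heart weight (g) as a linear function
    # of BSA; intercept/slope are placeholders, NOT the study's estimates
    return intercept + slope * bsa_du_bois(weight_kg, height_cm)
```

A real tool would also branch on sex, since the study found heavier hearts in men at similar body weight.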
Abstract:
Critical pedagogy emphasises the school's lack of neutrality with respect to the power relations that exist in society. Its proposal is to modify power relations in the classroom by transforming coercive relations (those that reproduce existing relations) into collaborative relations, starting from the recognition and participation of pupils in school activities. From this perspective, one objective of the activities is for children to produce "identity texts", understood as artefacts that pupils can appropriate to promote their cognitive development. The article describes the integrated educational work, from pre-school through the sixth year of primary school, of a school in the province of Girona in which 97% of the pupils are of foreign origin, whose purpose is to increase oral and written language skills in the school language as well as the use of other multimedia languages. The teaching unit consists of the whole student body writing a story over the course of a school year with the help of the teachers, two authors and three illustrators. Each school cycle decides on the characters and the setting and puts the course of the action into words. The illustrators produce the images and the authors enable the transition from what one cycle has produced to the product of the next. The activity, based on participation and on democratic decision-making procedures, is embedded in the school's educational and language programme as well as in its curricular specifications. The results show that the texts constructed by the children draw on their social and family "funds of knowledge" and constitute a source of progress towards the basic competences and the construction of democratic values.
Abstract:
A common problem in video surveys in very shallow waters is the presence of strong light fluctuations, due to sunlight refraction. Refracted sunlight casts fast moving patterns, which can significantly degrade the quality of the acquired data. Motivated by the growing need to improve the quality of shallow water imagery, we propose a method to remove sunlight patterns in video sequences. The method exploits the fact that video sequences allow several observations of the same area of the sea floor, over time. It is based on computing the image difference between a given reference frame and the temporal median of a registered set of neighboring images. A key observation is that this difference will have two components with separable spectral content. One is related to the illumination field (lower spatial frequencies) and the other to the registration error (higher frequencies). The illumination field, recovered by lowpass filtering, is used to correct the reference image. In addition to removing the sun-flickering patterns, an important advantage of the approach is the ability to preserve the sharpness of the corrected image, even in the presence of registration inaccuracies. The effectiveness of the method is illustrated on image sets acquired under strong camera motion and containing non-rigid benthic structures. The results testify to the good performance and generality of the approach.
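The pipeline described above (temporal median of registered frames, difference against the reference, low-pass of the difference, subtraction) can be sketched in a few lines. A minimal 1-D stdlib illustration, assuming the frames are already registered; real implementations work on 2-D images with proper spatial low-pass filters:

```python
import statistics

def lowpass(sig, w):
    # simple moving-average low-pass filter; the window is clamped
    # at the borders of the signal
    n = len(sig)
    out = []
    for i in range(n):
        lo, hi = max(0, i - w), min(n, i + w + 1)
        out.append(sum(sig[lo:hi]) / (hi - lo))
    return out

def deflicker(frames, ref_idx, w=3):
    # temporal median across registered frames approximates the
    # flicker-free scene
    n = len(frames[0])
    med = [statistics.median(f[i] for f in frames) for i in range(n)]
    ref = frames[ref_idx]
    diff = [r - m for r, m in zip(ref, med)]
    # the low-frequency part of the difference ~ the illumination field;
    # the high-frequency remainder ~ registration error, which is kept
    illum = lowpass(diff, w)
    return [r - il for r, il in zip(ref, illum)]
```

The choice of the low-pass cut-off is what separates the illumination field from the registration error, which is why sharpness survives the correction.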
Abstract:
Segmenting ultrasound images is a challenging problem where standard unsupervised segmentation methods such as the well-known Chan-Vese method fail. We propose in this paper an efficient segmentation method for this class of images. Our proposed algorithm is based on a semi-supervised approach (user labels) and the use of image patches as data features. We also consider the Pearson distance between patches, which has been shown to be robust w.r.t. speckle noise present in ultrasound images. Our results on phantom and clinical data show a very high similarity agreement with the ground truth provided by a medical expert.
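A minimal sketch of the patch distance used here: one minus the Pearson correlation between two flattened patches, which is invariant to affine intensity changes and therefore comparatively robust to speckle-like fluctuations. The nearest-label rule below is a generic stand-in for the semi-supervised step, not the paper's exact algorithm:

```python
import math

def pearson_distance(p, q):
    # 1 - Pearson correlation between two flattened, equal-size patches
    n = len(p)
    mp, mq = sum(p) / n, sum(q) / n
    cov = sum((a - mp) * (b - mq) for a, b in zip(p, q))
    sp = math.sqrt(sum((a - mp) ** 2 for a in p))
    sq = math.sqrt(sum((b - mq) ** 2 for b in q))
    return 1.0 - cov / (sp * sq)

def classify_patch(patch, labelled):
    # assign the label of the closest user-labelled patch;
    # labelled: {label: reference patch}
    return min(labelled, key=lambda lab: pearson_distance(patch, labelled[lab]))
```

The distance is 0 for a patch and any positively scaled/shifted copy of it, and 2 for a contrast-inverted copy, which is exactly the behaviour that makes it forgiving of multiplicative noise.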
A filtering method to correct time-lapse 3D ERT data and improve imaging of natural aquifer dynamics
Abstract:
We have developed a processing methodology that allows crosshole ERT (electrical resistivity tomography) monitoring data to be used to derive temporal fluctuations of groundwater electrical resistivity and thereby characterize the dynamics of groundwater in a gravel aquifer as it is infiltrated by river water. Temporal variations of the raw ERT apparent-resistivity data were mainly sensitive to the resistivity (salinity), temperature and height of the groundwater, with the relative contributions of these effects depending on the time and the electrode configuration. To resolve the changes in groundwater resistivity, we first expressed fluctuations of temperature-detrended apparent-resistivity data as linear superpositions of (i) time series of river-water-resistivity variations convolved with suitable filter functions and (ii) linear and quadratic representations of river-water-height variations multiplied by appropriate sensitivity factors; river-water height was determined to be a reliable proxy for groundwater height. Individual filter functions and sensitivity factors were obtained for each electrode configuration via deconvolution using a one-month calibration period, and the predicted contributions related to changes in water height were then removed prior to inversion of the temperature-detrended apparent-resistivity data. Application of the filter functions and sensitivity factors accurately predicted the apparent-resistivity variations (the correlation coefficient was 0.98). Furthermore, the filtered ERT monitoring data and resultant time-lapse resistivity models correlated closely with independently measured groundwater electrical resistivity monitoring data and only weakly with the groundwater-height fluctuations. The inversion results based on the filtered ERT data also showed significantly fewer inversion artefacts than the raw data inversions.
We observed resistivity increases of up to 10% and the arrival time peaks in the time-lapse resistivity models matched those in the groundwater resistivity monitoring data.
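One building block of the correction, removing the water-height contribution from an apparent-resistivity series, can be sketched with a single least-squares sensitivity factor. A simplified stdlib illustration (linear height term only; the actual method also fits a quadratic height term and convolves river-water resistivity with per-configuration filter functions obtained by deconvolution):

```python
def height_sensitivity(app_res, height):
    # least-squares sensitivity factor of apparent resistivity to
    # water height, estimated on demeaned series (calibration step)
    n = len(app_res)
    r0 = sum(app_res) / n
    h0 = sum(height) / n
    num = sum((r - r0) * (h - h0) for r, h in zip(app_res, height))
    den = sum((h - h0) ** 2 for h in height)
    return num / den

def remove_height_effect(app_res, height):
    # subtract the predicted height contribution, leaving the part of
    # the signal attributable to groundwater-resistivity changes
    s = height_sensitivity(app_res, height)
    h0 = sum(height) / len(height)
    return [r - s * (h - h0) for r, h in zip(app_res, height)]
```

In the study this removal is done per electrode configuration before inversion, which is what suppresses the height-driven artefacts in the time-lapse models.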
Abstract:
A second collaborative exercise on RNA/DNA co-analysis for body fluid identification and STR profiling was organized by the European DNA Profiling Group (EDNAP). Six human blood stains, two blood dilution series (5-0.001 μl blood) and, optionally, bona fide or mock casework samples of human or non-human origin were analyzed by the participating laboratories using a RNA/DNA co-extraction or solely RNA extraction method. Two novel mRNA multiplexes were used for the identification of blood: a highly sensitive duplex (HBA, HBB) and a moderately sensitive pentaplex (ALAS2, CD3G, ANK1, SPTB and PBGD). The laboratories used different chemistries and instrumentation. All of the 18 participating laboratories were able to successfully isolate and detect mRNA in dried blood stains. Thirteen laboratories simultaneously extracted RNA and DNA from individual stains and were able to utilize mRNA profiling to confirm the presence of blood and to obtain autosomal STR profiles from the blood stain donors. The positive identification of blood and good quality DNA profiles were also obtained from old and compromised casework samples. The method proved to be reproducible and sensitive using different analysis strategies. The results of this collaborative exercise involving a RNA/DNA co-extraction strategy support the potential use of an mRNA based system for the identification of blood in forensic casework that is compatible with current DNA analysis methodology.
Abstract:
Aim Conservation strategies are in need of predictions that capture spatial community composition and structure. Currently, the methods used to generate these predictions generally focus on deterministic processes and omit important stochastic processes and other unexplained variation in model outputs. Here we test a novel approach of community models that accounts for this variation and determine how well it reproduces observed properties of alpine butterfly communities. Location The western Swiss Alps. Methods We propose a new approach to process probabilistic predictions derived from stacked species distribution models (S-SDMs) in order to predict and assess the uncertainty in the predictions of community properties. We test the utility of our novel approach against a traditional threshold-based approach. We used mountain butterfly communities spanning a large elevation gradient as a case study and evaluated the ability of our approach to model species richness and phylogenetic diversity of communities. Results S-SDMs reproduced the observed decrease in phylogenetic diversity and species richness with elevation, syndromes of environmental filtering. The prediction accuracy of community properties varies along the environmental gradient: variability in predictions of species richness was higher at low elevation, while it was lower for phylogenetic diversity. Our approach allowed mapping of the variability in species richness and phylogenetic diversity projections. Main conclusion Using our probabilistic approach to process species distribution model outputs to reconstruct communities furnishes an improved picture of the range of possible assemblage realisations under similar environmental conditions given stochastic processes, and helps inform managers about the uncertainty in the modelling results.
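The probabilistic treatment of S-SDM outputs can be sketched directly: instead of binarising each species' occurrence probability at a threshold, draw many binary community realisations and look at the resulting distribution of a community property. A minimal stdlib sketch for species richness only (phylogenetic diversity would additionally weight each realisation by a phylogeny):

```python
import random

def richness_draws(probs, n_draws=2000, seed=42):
    # probs: per-species occurrence probabilities at one site (SDM
    # outputs); each draw is one stochastic realisation of the local
    # assemblage, so the draws map the uncertainty in richness
    rng = random.Random(seed)
    return [sum(1 for p in probs if rng.random() < p) for _ in range(n_draws)]

def threshold_richness(probs, cut=0.5):
    # the traditional approach: binarise each species at a fixed cut-off
    return sum(1 for p in probs if p >= cut)
```

The spread of the draws at each site is what the abstract maps as variability in the richness and diversity projections.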
Abstract:
We performed an international proficiency study of Human Papillomavirus (HPV) type 16 serology. A common methodology for serology based on virus-like particle (VLP) ELISA was used by 10 laboratories in 6 continents. The laboratories used the same VLP reference reagent, which was selected as the most stable, sensitive and specific VLP preparation out of VLPs donated from 5 different sources. A blinded proficiency panel consisting of 52 serum samples from women with PCR-verified HPV 16-infection, 11 control serum samples from virginal women and the WHO HPV 16 International Standard (IS) serum were distributed. The mean plus 3 standard deviations of the negative control serum samples was the most generally useful "cut-off" criterion for distinguishing positive and negative samples. Using sensitivity of at least 50% and a specificity of 100% as proficiency criteria, 6/10 laboratories were proficient. In conclusion, an international Standard Operating Procedure for HPV serology, an international reporting system in International Units (IU) and a common "cut-off" criterion have been evaluated in an international HPV serology proficiency study.
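The "cut-off" criterion that proved most generally useful is easy to state in code. A minimal sketch, assuming ELISA absorbance readings for the negative-control sera:

```python
import statistics

def seropositivity_cutoff(negative_controls):
    # cut-off = mean + 3 * sample SD of the negative-control readings
    return (statistics.mean(negative_controls)
            + 3 * statistics.stdev(negative_controls))

def is_positive(reading, negative_controls):
    # a sample is called positive when it exceeds the cut-off
    return reading > seropositivity_cutoff(negative_controls)
```

Tying the decision rule to each laboratory's own negative controls is what makes the criterion portable across the different chemistries and instruments used by the participating laboratories.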
Abstract:
The Iowa Department of Transportation is committed to improved management systems, which in turn has led to increased automation to record and manage construction data. A possible improvement to the current data management system can be found with pen-based computers. Pen-based computers coupled with user-friendly software have now reached the point where an individual's handwriting can be captured and converted to typed text to be used for data collection. It would appear pen-based computers are sufficiently advanced to be used by construction inspectors to record daily project data. The objective of this research was to determine: (1) if pen-based computers are durable enough to allow maintenance-free operation for field work during Iowa's construction season; and (2) if pen-based computers can be used effectively by inspectors with little computer experience. The pen-based computer's handwriting recognition was not fast or accurate enough to be successfully utilized. The IBM Thinkpad with the pen pointing device did prove useful for working in Windows' graphical environment. The pen was used for pointing, selecting and scrolling in the Windows applications because of its intuitive nature.
Abstract:
OBJECTIVE: To explore the user-friendliness and ergonomics of seven new generation intensive care ventilators. DESIGN: Prospective task-performing study. SETTING: Intensive care research laboratory, university hospital. METHODS: Ten physicians experienced in mechanical ventilation, but without prior knowledge of the ventilators, were asked to perform eight specific tasks [turning the ventilator on; recognizing mode and parameters; recognizing and setting alarms; mode change; finding and activating the pre-oxygenation function; pressure support setting; stand-by; finding and activating non-invasive ventilation (NIV) mode]. The time needed for each task was compared to a reference time (set by a trained physiotherapist familiar with the devices). A time >180 s was considered a task failure. RESULTS: For each of the tests on the ventilators, all physicians' times were significantly higher than the reference time (P < 0.001). A mean of 13 +/- 8 task failures (16%) was observed per ventilator. The most frequently failed tasks were mode and parameter recognition, starting pressure support and finding the NIV mode. The least often failed tasks were turning on the pre-oxygenation function and alarm recognition and management. Overall, there was substantial heterogeneity between machines, some exhibiting better user-friendliness than others for certain tasks, but no ventilator was clearly better than the others on all points tested. CONCLUSIONS: The present study adds to the available literature outlining the ergonomic shortcomings of mechanical ventilators. These results suggest that closer ties between end-users and manufacturers should be promoted, at an early development phase of these machines, based on scientific evaluation of the cognitive processes engaged by users in the clinical setting.
Abstract:
Introduction: The field of Connectomic research is growing rapidly, resulting from methodological advances in structural neuroimaging on many spatial scales. In particular, progress in Diffusion MRI data acquisition and processing has made macroscopic structural connectivity maps available in vivo through Connectome Mapping Pipelines (Hagmann et al, 2008), yielding so-called Connectomes (Hagmann 2005, Sporns et al, 2005). They exhibit both spatial and topological information that constrain functional imaging studies and are relevant in their interpretation. The need for a special-purpose software tool for both clinical researchers and neuroscientists to support investigations of such connectome data has grown. Methods: We developed the ConnectomeViewer, a powerful, extensible software tool for visualization and analysis in connectomic research. It uses the newly defined, container-like Connectome File Format, specifying networks (GraphML), surfaces (Gifti), volumes (Nifti), track data (TrackVis) and metadata. Using Python as the programming language allows it to be cross-platform and to access a multitude of scientific libraries. Results: Using a flexible plugin architecture, it is possible to enhance functionality for specific purposes easily. The following features are already implemented: * Ready usage of libraries, e.g. for complex network analysis (NetworkX) and data plotting (Matplotlib). More brain connectivity measures will be implemented in a future release (Rubinov et al, 2009). * 3D View of networks with node positioning based on corresponding ROI surface patch. Other layouts possible. * Picking functionality to select nodes, select edges, get more node information (ConnectomeWiki), toggle surface representations * Interactive thresholding and modality selection of edge properties using filters * Arbitrary metadata can be stored for networks, thereby allowing e.g. group-based analysis or meta-analysis. * Python Shell for scripting.
Application data is exposed and can be modified or used for further post-processing. * Visualization pipelines using filters and modules can be composed with Mayavi (Ramachandran et al, 2008). * Interface to TrackVis to visualize track data. Selected nodes are converted to ROIs for fiber filtering. The Connectome Mapping Pipeline (Hagmann et al, 2008) processed 20 healthy subjects into an average Connectome dataset. The figures show the ConnectomeViewer user interface using this dataset. Connections are shown that occur in all 20 subjects. The dataset is freely available from the homepage (connectomeviewer.org). Conclusions: The ConnectomeViewer is a cross-platform, open-source software tool that provides extensive visualization and analysis capabilities for connectomic research. It has a modular architecture, integrates relevant datatypes and is completely scriptable. Visit www.connectomics.org to get involved as a user or developer.
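The interactive edge-property thresholding listed among the features can be pictured with a generic sketch (plain dictionaries here rather than ConnectomeViewer's actual data structures or the GraphML/NetworkX objects it loads; the region names in the test are illustrative):

```python
def threshold_edges(edges, prop, min_value):
    # keep only connections whose chosen property (e.g. a fibre count
    # or connection density) reaches the threshold
    return {uv: attrs for uv, attrs in edges.items()
            if attrs.get(prop, 0) >= min_value}
```

Applied to a whole-group network, raising the threshold to the number of subjects leaves only the connections present in every subject, which is how a display like "connections occurring in all 20 subjects" can be obtained.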