11 results for Detection process
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Robust and accurate identification of intervertebral discs from low-resolution, sparse MRI scans is essential for automated planning of spine MRI scans. This paper presents a graphical-model-based solution for detecting both the positions and orientations of intervertebral discs in low-resolution, sparse MRI scans. Compared with existing graphical-model-based methods, the proposed method needs no training process or training data, and it can automatically determine the number of vertebrae visible in the image. Experiments on 25 low-resolution, sparse spine MRI data sets verified its performance.
Abstract:
Focusing of four hemoglobins with concurrent electrophoretic mobilization was studied by computer simulation. A dynamic electrophoresis simulator was first used to provide a detailed description of focusing in a 100-carrier component, pH 6-8 gradient using phosphoric acid as anolyte and NaOH as catholyte. These results are compared to an identical simulation except that the catholyte contained both NaOH and NaCl. A stationary, steady-state distribution of carrier components and hemoglobins is produced in the first configuration. In the second, the chloride ion migrates into and through the separation space. It is shown that even under these conditions of chloride ion flux a pH gradient forms. All amphoteric species acquire a slight positive charge upon focusing and the whole pattern is mobilized towards the cathode. The cathodic gradient end is stable whereas the anodic end is gradually degrading due to the continuous accumulation of chloride. The data illustrate that the mobilization is a cationic isotachophoretic process with the sodium ion being the leading cation. The peak height of the hemoglobin zones decreases somewhat upon mobilization, but the zones retain a relatively sharp profile, thus facilitating detection. The electropherograms that would be produced by whole column imaging and by a single detector placed at different locations along the focusing column are presented and show that focusing can be commenced with NaCl present in the catholyte at the beginning of the experiment. However, this may require detector placement on the cathodic side of the catholyte/sample mixture interface.
Abstract:
Cell death induction by apoptosis is an important process in the maintenance of tissue homeostasis as well as in tissue destruction during various pathological processes. Consequently, detection of apoptotic cells in situ represents an important technique to assess the extent and impact of cell death in the respective tissue. While scoring of apoptosis by histological assessment of apoptotic cells is still a widely used method, it is likely biased by sensitivity problems and observer-based variations. The availability of caspase-mediated neo-epitope-specific antibodies offers new tools for the detection of apoptosis in situ. Here, we discuss the use of immunohistochemical detection of cleaved caspase 3 and lamin A for the assessment of apoptotic cells in paraffin-embedded liver tissue. Furthermore, we evaluate the effect of tissue pretreatment and antigen retrieval on the sensitivity of apoptosis detection, background staining and maintenance of tissue morphology.
Abstract:
Lesion detection aids ideally aim at increasing the sensitivity of visual caries detection without trading off too much in terms of specificity. The use of a dental probe (explorer), bitewing radiography and fibre-optic transillumination (FOTI) have long been recommended for this purpose. Today, probing of suspected lesions in the sense of checking the 'stickiness' is regarded as obsolete, since it achieves no gain of sensitivity and might cause irreversible tooth damage. Bitewing radiography helps to detect lesions that are otherwise hidden from visual examination, and it should therefore be applied to a new patient. The diagnostic performance of radiography at approximal and occlusal sites is different, as this relates to the 3-dimensional anatomy of the tooth at these sites. However, treatment decisions have to take more into account than just lesion extension. Bitewing radiography provides additional information for the decision-making process that mainly relies on the visual and clinical findings. FOTI is a quick and inexpensive method which can enhance visual examination of all tooth surfaces. Both radiography and FOTI can improve the sensitivity of caries detection, but require sufficient training and experience to interpret information correctly. Radiography also carries the burden of the risks and legislation associated with using ionizing radiation in a health setting and should be repeated at intervals guided by the individual patient's caries risk. Lesion detection aids can assist in the longitudinal monitoring of the behaviour of initial lesions.
Abstract:
BACKGROUND: Microarray genome analysis is realising its promise for improving detection of genetic abnormalities in individuals with mental retardation and congenital abnormality. Copy number variations (CNVs) are now readily detectable using a variety of platforms and a major challenge is the distinction of pathogenic from ubiquitous, benign polymorphic CNVs. The aim of this study was to investigate replacement of time consuming, locus specific testing for specific microdeletion and microduplication syndromes with microarray analysis, which theoretically should detect all known syndromes with CNV aetiologies as well as new ones. METHODS: Genome wide copy number analysis was performed on 117 patients using Affymetrix 250K microarrays. RESULTS: 434 CNVs (195 losses and 239 gains) were found, including 18 pathogenic CNVs and 9 identified as "potentially pathogenic". Almost all pathogenic CNVs were larger than 500 kb, significantly larger than the median size of all CNVs detected. Segmental regions of loss of heterozygosity larger than 5 Mb were found in 5 patients. CONCLUSIONS: Genome microarray analysis has improved diagnostic success in this group of patients. Several examples of recently discovered "new syndromes" were found suggesting they are more common than previously suspected and collectively are likely to be a major cause of mental retardation. The findings have several implications for clinical practice. The study revealed the potential to make genetic diagnoses that were not evident in the clinical presentation, with implications for pretest counselling and the consent process. The importance of contributing novel CNVs to high quality databases for genotype-phenotype analysis and review of guidelines for selection of individuals for microarray analysis is emphasised.
Abstract:
The assessment of ERα, PgR and HER2 status is routinely performed today to determine the endocrine responsiveness of breast cancer samples. Such determination is usually accomplished by means of immunohistochemistry and, in the case of HER2 amplification, by means of fluorescent in situ hybridization (FISH). The analysis of these markers can be improved by simultaneous measurements using quantitative real-time PCR (Qrt-PCR). In this study we compared Qrt-PCR results for the assessment of mRNA levels of ERα, PgR, and the members of the human epidermal growth factor receptor family, HER1, HER2, HER3 and HER4. The results were obtained in two independent laboratories using two different methods, SYBR Green I and TaqMan probes, and different primers. By linear regression we demonstrated a good concordance for all six markers. The quantitative mRNA expression levels of ERα, PgR and HER2 also strongly correlated with the respective quantitative protein expression levels prospectively detected by EIA in both laboratories. In addition, HER2 mRNA expression levels correlated well with gene amplification detected by FISH in the same biopsies. Our results indicate that both Qrt-PCR methods were robust and sensitive tools for routine diagnostics and consistent with standard methodologies. The developed simultaneous assessment of several biomarkers is fast and labor-effective and allows optimization of the clinical decision-making process in breast cancer tissue and/or core biopsies.
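The inter-method concordance check described above can be illustrated with a short sketch. The data below are invented toy values, not the study's measurements; the sketch only shows the kind of linear-regression comparison the abstract describes (SYBR Green I vs. TaqMan readings of the same samples).

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical mRNA expression levels for the same six samples, measured
# with two different Qrt-PCR methods (values are illustrative only).
sybr_green = np.array([1.2, 2.5, 3.1, 4.8, 6.0, 7.4])
taqman = np.array([1.0, 2.7, 3.0, 5.1, 5.8, 7.6])

fit = linregress(sybr_green, taqman)

# A slope near 1 and a high r value indicate good inter-method concordance.
print(f"slope={fit.slope:.2f}, r={fit.rvalue:.3f}")
```

With real data, the same regression would be run once per marker (ERα, PgR, HER1-4) to quantify agreement between the two laboratories.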
Abstract:
The Imager for Low Energetic Neutral Atoms test facility at the University of Bern was developed to investigate, characterize, and quantify physical processes on surfaces that are used to ionize neutral atoms before their analysis in neutral particle-sensing instruments designed for space research. The facility has contributed valuable knowledge of the interaction of ions with surfaces (e.g., the fraction of ions scattered from surfaces and the angular scattering distribution) and employs a novel measurement principle for the determination of secondary electron emission yields as a function of energy, angle of incidence, particle species, and sample surface for low particle energies. Only because of this test facility was it possible to successfully apply surface-science processes to the new detection technique for low-energetic neutral particles with energies below about 1 keV used in space applications. All successfully flown spectrometers for the detection of low-energetic neutrals based on the particle–surface interaction process use surfaces evaluated, tested, and calibrated in this facility. Many instruments placed on different spacecraft (e.g., Imager for Magnetopause-to-Aurora Global Exploration, Chandrayaan-1, Interstellar Boundary Explorer, etc.) have successfully used this technique.
Abstract:
Derivation of probability estimates complementary to geophysical data sets has gained special attention in recent years. Information about the confidence level of provided physical quantities is required to construct an error budget of higher-level products and to correctly interpret the final results of a particular analysis. Regarding the generation of products based on satellite data, a common input consists of a cloud mask which allows discrimination between surface and cloud signals. Further, the surface information is divided between snow and snow-free components. At any step of this discrimination process, a misclassification in a cloud/snow mask propagates to higher-level products and may alter their usability. Within this scope, a novel probabilistic cloud mask (PCM) algorithm suited for the 1 km × 1 km Advanced Very High Resolution Radiometer (AVHRR) data is proposed which provides three types of probability estimates: cloudy/clear-sky, cloudy/snow and clear-sky/snow conditions. As opposed to the majority of available techniques, which are usually based on the decision-tree approach, in the PCM algorithm all spectral, angular and ancillary information is used in a single step to retrieve probability estimates from precomputed look-up tables (LUTs). Moreover, the issue of deriving a single threshold value for a spectral test was overcome by the concept of a multidimensional information space which is divided into small bins by an extensive set of intervals. The discrimination between snow and ice clouds and the detection of broken, thin clouds were enhanced by means of the invariant coordinate system (ICS) transformation. The study area covers a wide range of environmental conditions spanning from Iceland through central Europe to northern parts of Africa, which exhibit diverse difficulties for cloud/snow masking algorithms.
The retrieved PCM cloud classification was compared to the Polar Platform System (PPS) version 2012 and Moderate Resolution Imaging Spectroradiometer (MODIS) collection 6 cloud masks, SYNOP (surface synoptic observations) weather reports, the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) vertical feature mask version 3, and the MODIS collection 5 snow mask. The outcomes of the conducted analyses demonstrated the good detection skill of the PCM method, with results comparable to or better than the reference PPS algorithm.
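The single-step look-up-table idea in the PCM abstract can be sketched compactly. This is a minimal illustration assuming invented feature names, bin edges, and LUT contents (none of these come from the paper): each input is binned by a fixed set of interval edges, and all bin indices together address one cell of a precomputed probability table, instead of running a sequence of decision-tree threshold tests.

```python
import numpy as np

# Hypothetical interval edges dividing the information space into bins
# (feature names and values are illustrative, not the PCM algorithm's).
edges = {
    "ch1_reflectance": np.array([0.2, 0.4, 0.6, 0.8]),    # 5 bins
    "ch4_brightness_t": np.array([250.0, 270.0, 285.0]),  # 4 bins
    "sun_zenith": np.array([40.0, 70.0]),                 # 3 bins
}

# Toy precomputed cloudy/clear-sky probability per bin combination.
rng = np.random.default_rng(42)
lut = rng.uniform(0.0, 1.0, size=(5, 4, 3))

def cloud_probability(ch1, ch4, szen):
    """Bin each input and read the probability from the LUT in one step."""
    i = np.digitize(ch1, edges["ch1_reflectance"])
    j = np.digitize(ch4, edges["ch4_brightness_t"])
    k = np.digitize(szen, edges["sun_zenith"])
    return lut[i, j, k]
```

Because the probability is a single indexed read, adding a new spectral or angular feature only extends the index tuple; there is no per-feature threshold to tune at classification time.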
Abstract:
The near-real-time retrieval of low stratiform cloud (LSC) coverage is of vital interest for disciplines such as meteorology, transport safety, economy and air quality. Within this scope, a novel methodology is proposed which provides LSC occurrence probability estimates for a satellite scene. The algorithm is suited for the 1 × 1 km Advanced Very High Resolution Radiometer (AVHRR) data and was trained and validated against collocated SYNOP observations. Utilisation of these two combined data sources requires the formulation of constraints in order to discriminate cases where the LSC is overlaid by higher clouds. The LSC classification process is based on six features which are first converted to integer form by step functions and then combined by means of bitwise operations. Consequently, a set of values reflecting a unique combination of those features is derived, which is further employed to extract the LSC occurrence probability estimates from precomputed look-up vectors (LUV). Although the validation analyses confirmed the good performance of the algorithm, some inevitable misclassifications with other optically thick clouds were reported. Moreover, the comparison against the Polar Platform System (PPS) cloud-type product revealed the superior classification accuracy of the proposed method. From the temporal perspective, the acquired results revealed the presence of diurnal and annual LSC probability cycles over Europe.
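The indexing scheme described above (step functions to integers, then bitwise combination into a look-up-vector address) can be sketched as follows. Bit widths, thresholds, and LUV contents are assumptions for illustration, not the paper's values.

```python
import numpy as np

# Assumed layout: each step function yields a small integer code 0..7,
# so 3 bits per feature suffice; six features pack into an 18-bit index.
BITS_PER_FEATURE = 3
N_FEATURES = 6

def discretise(feature_values, thresholds):
    """Step function: index of the threshold interval each value falls into."""
    return np.digitize(feature_values, thresholds)

def pack_index(codes):
    """Combine per-feature integer codes into one LUV index via bit shifts."""
    index = np.zeros_like(codes[0])
    for code in codes:
        index = (index << BITS_PER_FEATURE) | code
    return index

# Toy example: two pixels, six features each, three thresholds per feature.
thresholds = [np.array([0.2, 0.5, 0.8])] * N_FEATURES
features = [np.array([0.1, 0.9]) for _ in range(N_FEATURES)]

codes = [discretise(f, t) for f, t in zip(features, thresholds)]
index = pack_index(codes)

# A LUV holds one LSC occurrence probability per possible index value.
luv = np.random.default_rng(0).uniform(0.0, 1.0, size=2 ** (BITS_PER_FEATURE * N_FEATURES))
probabilities = luv[index]
```

The appeal of this encoding is that an entire scene can be classified with vectorised shifts and a single fancy-indexing read, which matches the near-real-time requirement stated in the abstract.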
Abstract:
Any image-processing object detection algorithm somehow tries to integrate the object light (Recognition Step) and applies statistical criteria to distinguish objects of interest from other objects or from pure background (Decision Step). There are various possibilities for how these two basic steps can be realized, as can be seen in the different detection methods proposed in the literature. An ideal detection algorithm should provide high recognition sensitivity with high decision accuracy and require a reasonable computation effort. In reality, a gain in sensitivity is usually only possible with a loss in decision accuracy and with a higher computational effort. So, automatic detection of faint streaks is still a challenge. This paper presents a detection algorithm using spatial filters simulating the geometrical form of possible streaks on a CCD image. This is realized by image convolution. The goal of this method is to generate a more or less perfect match between a streak and a filter by varying the length and orientation of the filters. The convolution answers are accepted or rejected according to an overall threshold given by the background statistics. This approach yields as a first result a huge number of accepted answers due to filters partially covering streaks or remaining stars. To avoid this, a set of additional acceptance criteria has been included in the detection method. All criteria parameters are justified by background and streak statistics, and they affect the detection sensitivity only marginally. Tests on images containing simulated streaks and on real images containing satellite streaks show a very promising sensitivity, reliability and running speed for this detection method. Since all method parameters are based on statistics, the true-alarm as well as the false-alarm probability are well controllable. Moreover, the proposed method does not pose any extraordinary demands on the computer hardware or on the image acquisition process.
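The core mechanism described in this abstract, convolving the image with line-shaped filters of varying length and orientation and thresholding the responses against background statistics, can be sketched as below. This is an illustrative reduction, not the authors' implementation; the kernel sizes, angle steps, and the k-sigma threshold rule are assumptions, and the paper's additional acceptance criteria are omitted.

```python
import numpy as np
from scipy.ndimage import convolve, rotate

def line_kernel(length, angle_deg):
    """A normalised straight-line filter of `length` pixels at angle_deg."""
    k = np.zeros((length, length))
    k[length // 2, :] = 1.0                      # horizontal line
    k = rotate(k, angle_deg, reshape=False, order=1)
    return k / k.sum()

def detect_streaks(image, lengths=(9, 15), angles=range(0, 180, 15), k_sigma=5.0):
    """Mark pixels whose best filter response exceeds the background
    mean by k_sigma standard deviations (illustrative threshold rule)."""
    best = np.full(image.shape, -np.inf)
    for length in lengths:
        for angle in angles:
            response = convolve(image, line_kernel(length, angle), mode="reflect")
            best = np.maximum(best, response)
    threshold = image.mean() + k_sigma * image.std()
    return best > threshold
```

Averaging along the filter suppresses pixel noise by roughly the square root of the filter length, which is why an elongated matched filter can pull faint streaks above a statistically justified threshold while isolated stars respond only weakly.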
Abstract:
Introduction: In team sports the ability to use peripheral vision is essential to track a number of players and the ball. By using eye-tracking devices it was found that players either use fixations and saccades to process information on the pitch or use smooth pursuit eye movements (SPEM) to keep track of single objects (Schütz, Braun, & Gegenfurtner, 2011). However, it is assumed that peripheral vision can be used best when the gaze is stable, while it is unknown whether motion changes can be equally well detected when SPEM are used, especially because contrast sensitivity is reduced during SPEM (Schütz, Delipetkose, Braun, Kerzel, & Gegenfurtner, 2007). Therefore, peripheral motion change detection was examined by contrasting a fixation condition with a SPEM condition. Methods: 13 participants (7 male, 6 female) were presented with a visual display consisting of 15 white squares and 1 red square. Participants were instructed to follow the red square with their eyes and press a button as soon as a white square began to move. White square movements occurred either when the red square was still (fixation condition) or moving in a circular manner at 6°/s (pursuit condition). The to-be-detected white square movements varied in eccentricity (4°, 8°, 16°) and speed (1°/s, 2°/s, 4°/s), while the movement time of white squares was constant at 500 ms. In total, 180 events were to be detected. A Vicon-integrated eye-tracking system and a button press (1000 Hz) were used to control for eye movements and to measure detection rates and response times. Response times (ms) and missed detections (%) were measured as dependent variables and analysed with a 2 (manipulation) x 3 (eccentricity) x 3 (speed) ANOVA with repeated measures on all factors. Results: Significant response time effects were found for manipulation, F(1,12) = 224.31, p < .01, ηp2 = .95, eccentricity, F(2,24) = 56.43, p < .01, ηp2 = .83, and the interaction between the two factors, F(2,24) = 64.43, p < .01, ηp2 = .84.
Response times increased as a function of eccentricity for SPEM only and were overall higher than in the fixation condition. Results further showed missed-events effects for manipulation, F(1,12) = 37.14, p < .01, ηp2 = .76, eccentricity, F(2,24) = 44.90, p < .01, ηp2 = .79, the interaction between the two factors, F(2,24) = 39.52, p < .01, ηp2 = .77, and the three-way interaction manipulation x eccentricity x speed, F(2,24) = 3.01, p = .03, ηp2 = .20. While less than 2% of events were missed on average in the fixation condition as well as at 4° and 8° eccentricity in the SPEM condition, missed events increased for SPEM at 16° eccentricity, with significantly more missed events in the 4°/s speed condition (1°/s: M = 34.69, SD = 20.52; 2°/s: M = 33.34, SD = 19.40; 4°/s: M = 39.67, SD = 19.40). Discussion: It could be shown that using SPEM impairs the ability to detect peripheral motion changes in the far periphery and that fixations not only help to detect these motion changes but also to respond faster. Due to the high temporal constraints especially in team sports like soccer or basketball, fast reactions are necessary for successful anticipation and decision making. Thus, it is advised to anchor gaze at a specific location if peripheral changes (e.g. movements of other players) that require a motor response have to be detected. In contrast, SPEM should only be used if tracking a single object, like the ball in cricket or baseball, is necessary for a successful motor response. References: Schütz, A. C., Braun, D. I., & Gegenfurtner, K. R. (2011). Eye movements and perception: A selective review. Journal of Vision, 11, 1-30. Schütz, A. C., Delipetkose, E., Braun, D. I., Kerzel, D., & Gegenfurtner, K. R. (2007). Temporal contrast sensitivity during smooth pursuit eye movements. Journal of Vision, 7, 1-15.