60 results for Data detection


Relevance: 30.00%

Abstract:

We have previously isolated anti-FcεRIα autoantibodies from phage libraries of healthy donors and urticaria patients. Strikingly, the same antibody, LTMα15, was isolated from both libraries. Sequence analysis revealed a germline configuration of the LTMα15 variable heavy (VH) chain with a slightly mutated variable light (VL) chain, supporting its classification as a natural autoantibody. Distribution analyses of anti-FcεRIα autoantibodies by functional or serological tests have delivered conflicting data. For this reason we have developed a new real-time PCR to analyse the distribution of the LTMα15 VH in healthy donors and urticaria patients. Our new bioinformatic program permitted the design of a minor groove binder (MGB) TaqMan probe that specifically detected the LTMα15 VH. We were able to demonstrate a broad range of rearranged VH gene copy numbers without any correlation to health status. Monitoring the LTMα15 VH gene copy number in a single donor over a period of 70 days revealed a time-related fluctuation of circulating B cells carrying the LTMα15 VH. We propose that our real-time PCR may serve as a model for the quantification of natural antibody sequences at a monoclonal level.

Relevance: 30.00%

Abstract:

AIM Despite the large scientific debate concerning potential stigmatizing effects of identifying an individual as being in an at-risk mental state (ARMS) for psychosis, studies investigating this topic from the subjective perspective of patients are rare. This study assesses whether ARMS individuals experience stigmatization and to what extent being informed about the ARMS is experienced as helpful or harmful. METHODS Eleven ARMS individuals, currently participating in the follow-up assessments of the prospective Basel Früherkennung von Psychosen (FePsy; English: Early Detection of Psychosis) study, were interviewed in detail using a semistructured qualitative interview developed for this purpose. Data were analysed using Interpretative Phenomenological Analysis. RESULTS Most individuals experiencing first symptoms reported sensing that there was 'something wrong with them' and felt in need of help. They were relieved that a specific term was assigned to their symptoms. The support received from the early detection centre was generally experienced as helpful. Many patients reported stigmatization and discrimination that appeared to be the result of altered behaviour and social withdrawal due to the prepsychotic symptoms they experienced prior to contact with the early detection clinic. CONCLUSIONS The results suggest that early detection services help individuals cope with symptoms and potential stigmatization rather than enhancing or causing the latter. More emphasis should be put on the subjective experiences of those concerned when debating the advantages and disadvantages of early detection with regard to stigma. There was no evidence for increased perceived stigma and discrimination as a result of receiving information about the ARMS.

Relevance: 30.00%

Abstract:

Near-infrared spectroscopy (NIRS) enables the non-invasive measurement of changes in hemodynamics and oxygenation in tissue. Changes in light coupling due to movement of the subject can cause movement artifacts (MAs) in the recorded signals. Several methods have been developed to facilitate the detection and reduction of MAs in the data. However, due to fixed parameter values (e.g., a global threshold), none of these methods is perfectly suitable for long-term (i.e., hours-long) recordings or time-effective when applied to large datasets. We aimed to overcome these limitations by automation, i.e., data-adaptive thresholding specifically designed for long-term measurements, and by introducing a stable long-term signal reconstruction. Our new technique (“acceleration-based movement artifact reduction algorithm”, AMARA) combines two methods: the “movement artifact reduction algorithm” (MARA; Scholkmann et al., Phys. Meas. 2010, 31, 649–662) and the “accelerometer-based motion artifact removal” (ABAMAR; Virtanen et al., J. Biomed. Opt. 2011, 16, 087005). We describe AMARA in detail and report on the successful validation of the algorithm using empirical NIRS data measured over the prefrontal cortex in adolescents during sleep. In addition, we compared the performance of AMARA to that of MARA and ABAMAR based on the validation data.
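
The core idea of replacing a fixed global threshold with thresholds derived from the recording itself, combined with an accelerometer criterion, can be illustrated with a minimal Python sketch. The moving-window statistic, window length, and quantile-based cut-offs below are illustrative assumptions, not the published AMARA parameters.

```python
import numpy as np

def detect_movement_artifacts(nirs, accel, fs, win_s=2.0, q=0.95):
    """Flag samples whose moving standard deviation (NIRS) or absolute
    acceleration exceeds a threshold derived from the recording itself,
    instead of a fixed global value."""
    win = max(1, int(win_s * fs))
    pad = np.pad(nirs, (win // 2, win - win // 2 - 1), mode="edge")
    mov_sd = np.array([pad[i:i + win].std() for i in range(len(nirs))])
    sd_thr = np.quantile(mov_sd, q)          # data-adaptive signal threshold
    acc_thr = np.quantile(np.abs(accel), q)  # data-adaptive accelerometer threshold
    return (mov_sd > sd_thr) | (np.abs(accel) > acc_thr)

# usage on synthetic data sampled at 10 Hz (not real NIRS recordings)
fs = 10.0
t = np.arange(0, 600, 1 / fs)
nirs = np.sin(0.1 * t) + 0.05 * np.random.randn(t.size)
accel = 0.02 * np.random.randn(t.size)
artifact_mask = detect_movement_artifacts(nirs, accel, fs)
```

A complete implementation would additionally reconstruct the flagged segments, as AMARA does, rather than only masking them.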

Relevance: 30.00%

Abstract:

Tropical forests are believed to be very harsh environments for human life, and it is unclear whether human beings could ever have subsisted in these environments without external resources. It is therefore possible that humans have developed recent biological adaptations in response to the specific selective pressures posed by this challenge. To investigate such adaptations we analyzed genome-wide SNP data under a Bayesian statistics framework, looking for outlier markers with an overly large extent of differentiation between populations living in a tropical forest and genetically related populations living outside the forest, in Africa and the Americas. The most significant positive selection signals were found in genes related to lipid metabolism, the immune system, body development, and RNA polymerase III transcription initiation. The results are discussed in the light of putative tropical forest selective pressures, namely food scarcity, high pathogen prevalence, difficulty of movement, and inefficient thermoregulation. Agreement between our results and previous studies on the pygmy phenotype, a putative prototype of forest adaptation, was found, suggesting that a few genetic regions previously described as associated with short stature may be evolving under similar positive selection in Africa and the Americas. In general, convergent evolution was less pervasive than local adaptation within a single continent, suggesting that Africans and Amerindians may have followed different routes to adapt to similar environmental selective pressures.
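
As a rough frequentist stand-in for the Bayesian outlier framework described above, per-locus differentiation can be summarised with Wright's FST and screened against an empirical quantile; the populations, allele frequencies, and the 1% cut-off below are all hypothetical.

```python
import numpy as np

def fst_per_snp(p_forest, p_outside):
    """Wright's FST per SNP from allele frequencies in two populations
    (a simplified stand-in for the Bayesian outlier approach in the study)."""
    p_bar = (p_forest + p_outside) / 2.0
    h_t = 2 * p_bar * (1 - p_bar)                      # expected total heterozygosity
    h_s = (2 * p_forest * (1 - p_forest)
           + 2 * p_outside * (1 - p_outside)) / 2.0    # mean within-population heterozygosity
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(h_t > 0, (h_t - h_s) / h_t, 0.0)

# synthetic allele frequencies for 10,000 SNPs (illustrative only)
rng = np.random.default_rng(0)
p1 = rng.uniform(0.05, 0.95, 10_000)
p2 = rng.uniform(0.05, 0.95, 10_000)
fst = fst_per_snp(p1, p2)
candidate_loci = np.where(fst > np.quantile(fst, 0.99))[0]  # top 1% as candidates
```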

Relevance: 30.00%

Abstract:

AIM To evaluate the diagnostic value (sensitivity, specificity) of positron emission mammography (PEM) in a single-site, non-interventional study using the maximum PEM uptake value (PUVmax). PATIENTS, METHODS In this single-site, non-interventional study, 108 patients (107 women, 1 man) with a total of 151 suspected lesions were scanned with a PEM Flex Solo II (Naviscan) 90 min p.i. with 3.5 MBq 18F-FDG per kg of body weight. In this ROI (region of interest)-based analysis, the maximum PEM uptake value (PUV) was determined in lesions, i.e. tumours (PUVmax(tumour)) and benign lesions (PUVmax(normal breast)), and also in healthy tissue on the contralateral side (PUVmax(contralateral breast)). These values were compared and contrasted. In addition, the ratios PUVmax(tumour) / PUVmax(contralateral breast) and PUVmax(normal breast) / PUVmax(contralateral breast) were compared. The image data were interpreted independently by two experienced nuclear medicine physicians and compared with histology in cases of suspected carcinoma. RESULTS Based on a criterion of PUVmax > 1.9, 31 of the 151 lesions in the patient cohort were found to be malignant (21%). A mean PUVmax(tumour) of 3.78 ± 2.47 was identified in malignant tumours, while a mean PUVmax(normal breast) of 1.17 ± 0.37 was reported in the glandular tissue of the healthy breast, the difference being statistically significant (p < 0.001). Similarly, the mean ratio between tumour and healthy glandular tissue in breast cancer patients (3.15 ± 1.58) was significantly higher than the corresponding ratio for benign lesions (1.17 ± 0.41, p < 0.001). CONCLUSION PEM is capable of differentiating breast tumours from benign lesions with 100% sensitivity and a high specificity of 96% when a threshold of PUVmax > 1.9 is applied.
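
The reported sensitivity and specificity follow directly from dichotomising PUVmax at the 1.9 cut-off and comparing the result with histology; a minimal sketch with made-up lesion values (not the study data):

```python
import numpy as np

def sens_spec(puv_max, malignant, threshold=1.9):
    """Sensitivity and specificity of a PUVmax cut-off against histology."""
    puv_max = np.asarray(puv_max, dtype=float)
    malignant = np.asarray(malignant, dtype=bool)
    positive = puv_max > threshold
    tp = np.sum(positive & malignant)
    fn = np.sum(~positive & malignant)
    tn = np.sum(~positive & ~malignant)
    fp = np.sum(positive & ~malignant)
    return tp / (tp + fn), tn / (tn + fp)

# illustrative values only
puv = [3.8, 2.5, 1.2, 1.0, 4.1, 1.5]
histology_malignant = [1, 1, 0, 0, 1, 0]
sensitivity, specificity = sens_spec(puv, histology_malignant, threshold=1.9)
```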

Relevance: 30.00%

Abstract:

The aim of this study was to test the influence of different degrees of additional illumination on visual caries detection using the International Caries Detection and Assessment System (ICDAS). Two calibrated examiners assessed 139 occlusal surfaces of extracted permanent molars using a standard operating lamp with or without an additional headlamp providing three preset brightness intensities. Histology served as the gold standard. Pooled data showed no differences in sensitivities. Specificities were not influenced by additional light. The area under the curve for the Marthaler classification D3 threshold was significantly lower when an additional strong headlamp was used (0.59, compared to 0.69-0.72 when reduced illumination intensities were used). One of the two examiners also had a significantly lower sensitivity for the D1 threshold when an additional headlamp was used. The use of additional white light led to a reduced detection of dentine lesions.

Relevance: 30.00%

Abstract:

The aim of this study was to test a newly developed LED-based fluorescence device for approximal caries detection in vitro. We assembled 120 extracted molars without frank cavitation or fillings in pairs in order to create contact areas. The teeth were independently assessed by two examiners using visual caries detection (International Caries Detection and Assessment System, ICDAS), bitewing radiography (BW), laser fluorescence (LFpen), and LED fluorescence (Midwest Caries I.D., MW). The measurements were repeated at least 1 week later. Diagnostic performance was calculated with Bayesian analyses, and post-test probabilities were calculated in order to judge the diagnostic performance of combined methods. Reliability analyses were performed using kappa statistics for nominal data and the intraclass correlation coefficient (ICC) for absolute data. Histology served as the gold standard. Sensitivities/specificities at the enamel threshold were 0.33/0.84 for ICDAS, 0.23/0.86 for BW, 0.47/0.78 for LFpen, and 0.32/0.87 for MW. Sensitivities/specificities at the dentine threshold were 0.04/0.89 for ICDAS, 0.27/0.94 for BW, 0.39/0.84 for LFpen, and 0.07/0.96 for MW. Reliability was fair to moderate for MW and good for BW and LFpen. The combination of ICDAS and radiography yielded the best diagnostic performance (post-test probability of 0.73 at the dentine threshold). The newly developed LED device cannot be recommended for approximal caries detection; there may be too much signal loss during transmission from the occlusal aspect to the proximal lesion site and back.
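
The post-test probabilities used above follow from Bayes' theorem in odds form via the positive likelihood ratio. The sketch below chains two positive findings at the enamel threshold, assuming a hypothetical pre-test probability of 0.30 and conditional independence of the tests; neither assumption comes from the study.

```python
def post_test_probability(pretest, sensitivity, specificity):
    """Post-test probability after a positive result, via the positive
    likelihood ratio LR+ = sens / (1 - spec) and odds-form Bayes' theorem."""
    lr_pos = sensitivity / (1.0 - specificity)
    pre_odds = pretest / (1.0 - pretest)
    post_odds = pre_odds * lr_pos
    return post_odds / (1.0 + post_odds)

# enamel-threshold values from the text; pre-test probability is hypothetical
p = post_test_probability(0.30, sensitivity=0.33, specificity=0.84)  # ICDAS positive
p = post_test_probability(p, sensitivity=0.23, specificity=0.86)     # BW also positive
```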

Relevance: 30.00%

Abstract:

Introduction: Although it seems plausible that sports performance relies on high-acuity foveal vision, it has been shown empirically that myopic blur (up to +2 diopters) does not harm performance in sport tasks that require foveal information pick-up, such as golf putting (Bulson, Ciuffreda, & Hung, 2008). How myopic blur affects peripheral performance is as yet unknown. With reduced foveal vision, less attention might be needed for processing visual cues foveally, which could lead to better performance because peripheral cues are processed more effectively; this hypothesis was tested in the current experiment. Methods: 18 sport science students with self-reported myopia, all regularly wearing contact lenses, volunteered as participants. Exclusion criteria comprised visual correction other than myopic, correction of astigmatism, and use of contact lenses outside the Swiss delivery area. For each participant, three pairs of additional contact lenses (besides their regular lenses, used in the "plano" condition) were manufactured with an individual overcorrection to a retinal defocus of +1 to +3 diopters (referred to as the "+1.00 D", "+2.00 D", and "+3.00 D" conditions, respectively). Gaze data were acquired while participants performed a multiple object tracking (MOT) task that required tracking 4 out of 10 moving stimuli. In addition, in 66.7% of all trials, one of the 4 targets suddenly stopped during the motion phase for a period of 0.5 s. Stimuli moved in front of a picture of a sports hall to allow for foveal processing. Due to the directional hypotheses, the level of significance for one-tailed tests on differences was set at α = .05, and a posteriori effect sizes were computed as partial eta squared (ηp²). Results: Due to problems with the gaze-data collection, 3 participants had to be excluded from further analyses. The expectation of a centroid strategy was confirmed, because gaze was closer to the centroid than to the targets (all p < .01). In comparison to the plano baseline, participants more often recalled all 4 targets under defocus conditions, F(1,14) = 26.13, p < .01, ηp² = .65. The three defocus conditions differed significantly, F(2,28) = 2.56, p = .05, ηp² = .16, with higher accuracy as a function of increasing defocus and significant contrasts between conditions +1.00 D and +2.00 D (p = .03) and between +1.00 D and +3.00 D (p = .03). For stop trials, significant differences were found neither between the plano baseline and the defocus conditions, F(1,14) = .19, p = .67, ηp² = .01, nor between the three defocus conditions, F(2,28) = 1.09, p = .18, ηp² = .07. Participants reacted faster in "4 correct+button" trials under defocus than under plano-baseline conditions, F(1,14) = 10.77, p < .01, ηp² = .44. The defocus conditions differed significantly, F(2,28) = 6.16, p < .01, ηp² = .31, with shorter response times as a function of increasing defocus and significant contrasts between +1.00 D and +2.00 D (p = .01) and between +1.00 D and +3.00 D (p < .01). Discussion: The results show that gaze behaviour in MOT is not affected to a relevant degree by a visual overcorrection of up to +3 diopters. Hence, it can be assumed that peripheral event detection was indeed what the present study probed. This overcorrection, however, does not harm the capability to track objects peripherally. Moreover, if an event has to be detected peripherally, neither response accuracy nor response time is negatively affected.
The findings may be of considerable relevance for all sport situations in which peripheral vision is required, which now calls for applied studies on this topic. References: Bulson, R. C., Ciuffreda, K. J., & Hung, G. K. (2008). The effect of retinal defocus on golf putting. Ophthalmic and Physiological Optics, 28, 334-344.
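
The reported partial eta squared values can be recovered from the F statistics and their degrees of freedom; a short sketch using two of the values given above:

```python
def partial_eta_squared(f_value, df_effect, df_error):
    """Partial eta squared from an F statistic and its degrees of freedom:
    eta_p^2 = F * df1 / (F * df1 + df2)."""
    return (f_value * df_effect) / (f_value * df_effect + df_error)

# F(1,14) = 26.13 from the recall analysis and F(2,28) = 6.16 from the
# response-time analysis reproduce the reported effect sizes
print(round(partial_eta_squared(26.13, 1, 14), 2))  # 0.65
print(round(partial_eta_squared(6.16, 2, 28), 2))   # 0.31
```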

Relevance: 30.00%

Abstract:

The attentional blink (AB) is a fundamental limitation of the ability to select relevant information from irrelevant information. It can be observed with the detection rate in an AB task as well as with the corresponding P300 amplitude of the event-related potential. In previous research, however, correlations between these two levels of observation were weak and rather inconsistent. A possible explanation of this finding might be that multiple processes underlie the AB and, thus, obscure a possible relationship between AB-related detection rate and the corresponding P300 amplitude. The present study investigated this assumption by applying a fixed-links modeling approach to represent behavioral individual differences in the AB as a latent variable. Concurrently, this approach enabled us to control for additional sources of variance in AB performance by deriving two additional latent variables. The correlation between the latent variable reflecting behavioral individual differences in AB magnitude and a corresponding latent variable derived from the P300 amplitude was high (r=.70). Furthermore, this correlation was considerably stronger than the correlations of other behavioral measures of the AB magnitude with their psychophysiological counterparts (all rs<.40). Our findings clearly indicate that the systematic disentangling of various sources of variance by utilizing the fixed-links modeling approach is a promising tool to investigate behavioral individual differences in the AB and possible psychophysiological correlates of these individual differences.

Relevance: 30.00%

Abstract:

The reporting of outputs from health surveillance systems should be done in a near real-time and interactive manner in order to provide decision makers with powerful means to identify, assess, and manage health hazards as early and efficiently as possible. While this is currently rarely the case in veterinary public health surveillance, reporting tools do exist for the visual exploration and interactive interrogation of health data. In this work, we used tools freely available from the Google Maps and Charts library to develop a web application reporting health-related data derived from slaughterhouse surveillance and from a newly established web-based equine surveillance system in Switzerland. Both sets of tools allowed entry-level usage with no or minimal programming skills while being flexible enough to cater for more complex scenarios for users with greater programming skills. In particular, interfaces linking statistical software packages and Google tools provide additional analytical functionality (such as algorithms for the detection of unusually high case occurrences) for inclusion in the reporting process. We show that such powerful approaches could improve the timely dissemination and communication of technical information to decision makers and other stakeholders and could foster the early-warning capacity of animal health surveillance systems.
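
The detection algorithms referred to above are not specified in the abstract; as one hypothetical example, an EARS-C1-style moving-baseline rule for flagging unusually high case counts could look like the sketch below (baseline length and z cut-off are arbitrary choices).

```python
import numpy as np

def flag_unusual_counts(counts, baseline=7, z=3.0):
    """Flag time points whose count exceeds the mean of the preceding
    baseline window by more than z standard deviations."""
    counts = np.asarray(counts, dtype=float)
    alarms = np.zeros(counts.size, dtype=bool)
    for t in range(baseline, counts.size):
        window = counts[t - baseline:t]
        mu, sd = window.mean(), window.std(ddof=1)
        alarms[t] = counts[t] > mu + z * max(sd, 1e-9)
    return alarms

daily_cases = [2, 3, 1, 2, 4, 2, 3, 2, 12, 3]   # illustrative daily case counts
alarms = flag_unusual_counts(daily_cases)
```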

Relevance: 30.00%

Abstract:

Clinical observations made by practitioners and reported using web- and mobile-based technologies may benefit disease surveillance by improving the timeliness of outbreak detection. Equinella is a voluntary electronic reporting and information system established for the early detection of infectious equine diseases in Switzerland. Sentinel veterinary practitioners have been able to report cases of non-notifiable diseases and clinical symptoms to an internet-based platform since November 2013. Telephone interviews were carried out during the first year to understand the motivating and constraining factors affecting voluntary reporting and the use of mobile devices in a sentinel network. We found that non-monetary incentives attract sentinel practitioners; however, insufficient understanding of the reporting system and of its relevance, as well as concerns over the electronic dissemination of health data were identified as potential challenges to sustainable reporting. Many practitioners are not yet aware of the advantages of mobile-based surveillance and may require some time to become accustomed to novel reporting methods. Finally, our study highlights the need for continued information feedback loops within voluntary sentinel networks.

Relevance: 30.00%

Abstract:

Systems for the identification and registration of cattle have gradually been receiving attention for use in syndromic surveillance, a relatively recent approach for the early detection of infectious disease outbreaks. Real-time or near real-time monitoring of deaths or stillbirths reported to these systems offers an opportunity to detect temporal or spatial clusters of increased mortality that could be caused by an infectious disease epidemic. In Switzerland, such data are recorded in the "Tierverkehrsdatenbank" (TVD). To investigate the potential of the Swiss TVD for syndromic surveillance, 3 years of data (2009-2011) were assessed in terms of data quality, including timeliness of reporting and completeness of geographic data. Two time series, consisting of reported on-farm deaths and stillbirths, were retrospectively analysed to define and quantify the temporal patterns that result from non-health-related factors. Geographic data were almost always present in the TVD, often at different spatial scales. On-farm deaths were reported to the database by farmers in a timely fashion; stillbirth reporting was less timely. Timeliness and geographic coverage are two important features of disease surveillance systems, highlighting the suitability of the TVD for use in a syndromic surveillance system. Both time series exhibited distinct temporal patterns associated with non-health-related factors. To avoid false-positive signals, these patterns need to be removed from the data, or otherwise accounted for, before aberration detection algorithms are applied in real time. Evaluating mortality data reported to systems for the identification and registration of cattle is valuable for comparing national data systems and as a first step towards a Europe-wide early detection system for emerging and re-emerging cattle diseases.
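
One simple way to account for recurring temporal patterns before running aberration detection is to subtract period-specific means; the monthly adjustment and the synthetic counts below are illustrative only, not the TVD data or the exact method used in the study.

```python
import numpy as np

def remove_monthly_pattern(counts, months):
    """Subtract month-specific means so that recurring seasonal structure
    does not trigger aberration-detection alarms."""
    counts = np.asarray(counts, dtype=float)
    months = np.asarray(months)
    adjusted = counts.copy()
    for m in np.unique(months):
        adjusted[months == m] -= counts[months == m].mean()
    return adjusted

# three years of synthetic monthly on-farm death counts with a seasonal cycle
rng = np.random.default_rng(1)
months = np.tile(np.arange(1, 13), 3)
season = 20 + 10 * np.sin(2 * np.pi * (months - 1) / 12)
counts = rng.poisson(season)
residuals = remove_monthly_pattern(counts, months)  # input for detection algorithms
```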

Relevance: 30.00%

Abstract:

Syndromic surveillance (SyS) systems currently exploit various sources of health-related data, most of which are collected for purposes other than surveillance (e.g. economic). Several European SyS systems use data collected during meat inspection for syndromic surveillance of animal health, as some diseases may be more easily detected post-mortem than at their point of origin or during the ante-mortem inspection upon arrival at the slaughterhouse. In this paper we use simulation to evaluate the performance of a quasi-Poisson regression (also known as improved Farrington) algorithm for the detection of disease outbreaks during post-mortem inspection of slaughtered animals. When the algorithm was parameterized based on retrospective analyses of 6 years of historical data, the probability of detection was satisfactory for large outbreaks (range 83-445 cases) but poor for small ones (range 20-177 cases). Varying the amount of historical data used to fit the algorithm can help increase the probability of detecting small outbreaks. However, while the use of a 0.975 quantile generated a low false-positive rate, in most cases more than 50% of outbreak cases had already occurred at the time of detection. The high variance observed in the whole-carcass condemnation time series and the lack of flexibility in the temporal distribution of simulated outbreaks, resulting from the low (monthly) reporting frequency, constitute major challenges for the early detection of outbreaks in the livestock population based on meat inspection data. Reporting frequency should be increased in the future to improve the timeliness of the SyS system, while increased sensitivity may be achieved by integrating meat inspection data into a multivariate system that simultaneously evaluates multiple sources of data on livestock health.
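
A minimal sketch of a quasi-Poisson detection threshold is given below: a Poisson GLM with Pearson-estimated dispersion is fitted to historical counts and an upper bound is derived at the chosen quantile. The normal-approximation bound, the simple linear trend, and the toy counts are simplifications; the published improved Farrington algorithm uses a more refined formula with seasonal terms and down-weighting of past outbreaks.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

def quasipoisson_threshold(history, trend, current_trend, q=0.975):
    """Fit a quasi-Poisson GLM (Poisson GLM with Pearson-estimated dispersion)
    to historical counts and return an upper detection bound for the current
    time point, using a normal approximation for brevity."""
    X = sm.add_constant(np.asarray(trend, dtype=float))
    fit = sm.GLM(np.asarray(history, dtype=float), X,
                 family=sm.families.Poisson()).fit(scale="X2")
    x_new = sm.add_constant(np.array([current_trend], dtype=float),
                            has_constant="add")
    mu = fit.predict(x_new)[0]
    phi = fit.scale  # overdispersion estimated from the Pearson chi-square
    return mu + norm.ppf(q) * np.sqrt(phi * mu)

history = [30, 28, 35, 31, 40, 33, 29, 37, 34, 32, 36, 38]  # toy monthly condemnations
bound = quasipoisson_threshold(history, trend=range(12), current_trend=12)
alarm = 55 > bound  # flag the current month if its observed count exceeds the bound
```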

Relevance: 30.00%

Abstract:

On the orbiter of the Rosetta spacecraft, the Cometary Secondary Ion Mass Analyser (COSIMA) will provide new in situ insights into the chemical composition of cometary grains throughout the journey of 67P/Churyumov–Gerasimenko (67P/CG), nominally until the end of December 2015. The aim of this paper is to present the pre-calibration that has already been performed, as well as the different methods developed to facilitate the interpretation of the COSIMA mass spectra, and especially of their organic content. The first step was to establish a mass spectral library of targeted molecules in positive and negative ion mode and to determine the specific features of each compound and chemical family analyzed. As the exact nature of the refractory cometary organic matter is currently unknown, this library is necessarily not exhaustive. It has therefore also been the starting point for the search for indicators that highlight the presence of compounds containing a specific atom or structure. These indicators correspond to intensity ratios of specific peaks in the mass spectrum. They have allowed us to identify samples containing nitrogen atoms, aliphatic chains, or polyaromatic hydrocarbons. From these indicators, a preliminary calibration line, from which the N/C ratio can be derived, has also been established. The search for specific mass differences can also help to identify peaks related to quasi-molecular ions in an unknown mass spectrum. The Bayesian Positive Source Separation (BPSS) technique will also be very helpful for data analysis. This work is the starting point for the analysis of the cometary refractory organic matter. Nevertheless, calibration work will continue in order to reach the best possible interpretation of the COSIMA observations.
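
The peak-intensity-ratio indicators described above can be sketched as follows; the m/z values, mass tolerance, and synthetic spectrum are placeholders, not the calibrated COSIMA indicators.

```python
import numpy as np

def peak_ratio(mz, intensity, peak_a, peak_b, tol=0.05):
    """Intensity ratio of two peaks in a mass spectrum, used as an indicator
    for a chemical family (peak positions here are placeholders)."""
    mz, intensity = np.asarray(mz), np.asarray(intensity)

    def peak_height(target):
        sel = np.abs(mz - target) <= tol
        return intensity[sel].max() if sel.any() else 0.0

    denom = peak_height(peak_b)
    return peak_height(peak_a) / denom if denom > 0 else np.nan

# e.g. a nitrogen-bearing fragment relative to a hydrocarbon fragment,
# computed on a random synthetic spectrum (illustrative only)
mz = np.linspace(10, 100, 9001)
spectrum = np.random.rand(mz.size)
indicator = peak_ratio(mz, spectrum, peak_a=18.034, peak_b=27.023)
```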

Relevance: 30.00%

Abstract:

Is Benford's law a good instrument for detecting fraud in reports of statistical and scientific data? For a valid test, the probabilities of "false positives" and "false negatives" must be low. However, it is very doubtful whether the Benford distribution is an appropriate tool to discriminate between manipulated and non-manipulated estimates. Further research should focus more on the validity of the test, and test results should be interpreted more carefully.
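
For reference, the Benford distribution assigns probability log10(1 + 1/d) to first digit d; a standard check is a chi-square goodness-of-fit test on the first significant digits, sketched below on synthetic data.

```python
import numpy as np
from scipy.stats import chisquare

def benford_test(values):
    """Compare the observed first-digit distribution with Benford's law
    using a chi-square goodness-of-fit test."""
    digits = np.array([int(str(abs(v)).lstrip("0.").replace(".", "")[0])
                       for v in values if v != 0])
    observed = np.array([(digits == d).sum() for d in range(1, 10)])
    expected = np.log10(1 + 1 / np.arange(1, 10)) * digits.size
    return chisquare(observed, expected)

# illustrative use on synthetic "reported estimates"
rng = np.random.default_rng(2)
data = rng.lognormal(mean=5, sigma=2, size=500)
statistic, p_value = benford_test(data)
```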