Abstract:
BACKGROUND There is confusion over the definition of the term "viability state(s)" of microorganisms. "Viability staining" or "vital staining" techniques are used to distinguish live from dead bacteria. These stainings, first established on planktonic bacteria, may have serious shortcomings when applied to multispecies biofilms. Results of staining techniques should be compared with appropriate microbiological data. DISCUSSION Many terms describe the "vitality states" of microorganisms; however, several of them are misleading. Authors define "viable" as "capable of growth". Accordingly, staining methods are only substitutes, since no staining can prove viability. The reliability of a commercial "viability" staining assay (Molecular Probes) is discussed based on the corresponding product information sheet: (I) staining principle; (II) concentrations of bacteria; (III) calculation of live/dead proportions in vitro. Results of the "viability" kit depend on the stains' concentration and on their relation to the number of bacteria in the test. In general, this staining system is not suitable for multispecies biofilms, and incorrect statements have consequently been published by users of this technique. To compare the results of the staining with bacterial parameters, appropriate techniques should be selected. The assessment of Colony Forming Units is insufficient; rather, the calculation of Plating Efficiency is necessary. Vital fluorescence staining with Fluorescein Diacetate and Ethidium Bromide seems to be the best-proven and most suitable method in biofilm research. Regarding the mutagenicity of staining components, users should be aware that not only Ethidium Bromide might be harmful, but also a variety of other substances whose toxicity and mutagenicity are not reported.
SUMMARY
- The nomenclature regarding "viability" and "vitality" should be used carefully.
- The manual of the commercial "viability" kit itself points out that the kit is not suitable for natural multispecies biofilm research, as supported by an array of literature.
- Results obtained with various stains are influenced by the relationship between bacterial counts and the amount of stain used in the test. Corresponding vitality data are prone to artificial shifting.
- As a microbiological parameter, Plating Efficiency should be used for comparison.
- Ethidium Bromide is mutagenic. Researchers should be aware that alternative staining compounds may also be mutagenic.
Abstract:
BACKGROUND Flavobacterium psychrophilum is the agent of Bacterial Cold Water Disease and Rainbow Trout Fry Syndrome, two diseases leading to high mortality. Pathogen detection is mainly carried out using cultures, and more rapid and sensitive methods are needed. RESULTS We describe a qPCR technique based on the single-copy gene rpoC, which encodes the β' subunit of the DNA-dependent RNA polymerase. Its detection limit was 20 gene copies and its quantification limit 10³ gene copies per reaction. Tests on spiked spleens with known concentrations of F. psychrophilum (10⁶ to 10¹ cells per reaction) showed no cross-reactions between the spleen tissue and the primers and probe. Screening of water samples and spleens from symptomless and infected fish indicated that the pathogen was already present before the outbreaks, but F. psychrophilum was only quantifiable in spleens from diseased fish. CONCLUSIONS This qPCR can be used as a highly sensitive and specific method to detect F. psychrophilum in different sample types without the need for culturing. It allows reliable detection and quantification of F. psychrophilum in samples with low pathogen densities. Quantitative data on F. psychrophilum abundance could be useful for investigating risk factors linked to infections and as an early warning system prior to potentially devastating outbreaks.
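The quantification step behind limits such as "10³ gene copies per reaction" rests on a log-linear qPCR standard curve relating the threshold cycle (Ct) to the input copy number. A minimal sketch of that conversion follows; the slope and intercept values here are generic placeholders, not the values fitted in the study:

```python
def copies_from_ct(ct, slope=-3.32, intercept=37.0):
    """Estimate gene copies per reaction from a Ct value using a
    log-linear standard curve: Ct = slope * log10(copies) + intercept.
    slope and intercept are illustrative defaults, not fitted values."""
    return 10 ** ((ct - intercept) / slope)

def efficiency(slope):
    """Amplification efficiency implied by a standard-curve slope;
    a slope of about -3.32 corresponds to ~100% efficiency."""
    return 10 ** (-1 / slope) - 1
```

With these placeholder parameters, each ~3.32-cycle decrease in Ct corresponds to a tenfold increase in template copies, which is how a dilution series of a known standard calibrates the assay.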
Abstract:
Human bone is the most direct source for reconstructing the health and living conditions of ancient populations. However, many diseases remain undetected in palaeopathology. Möller-Barlow disease (scurvy) is a historically well-documented metabolic disease and must have been common in clinical and sub-clinical severity. Due to long incubation periods and the subtle nature of the bone changes, osteological evidence is relatively rare (Brickley & Ives 2008). Möller-Barlow disease is caused by a deficiency of dietary vitamin C (ascorbic acid) and evokes symptoms such as fatigue, haemorrhage, inflammation, delayed wound healing and pain. Vitamin C is a cofactor for the hydroxylation of the amino acids proline and lysine, which are essential for the production of intact connective tissue through the cross-linking of the propeptides in collagen. In a preliminary study, we tested the detectability of Möller-Barlow disease by analysing the relative quantitative variability of hydroxylated amino acids in collagen (Pendery & Koon 2013). Samples (N = 9) were taken from children with (n = 3; cranium, femur, tibia) and without (n = 4; cranium, femur, tibia) apparent bone reactions indicative of Möller-Barlow disease, as well as from adults with lethal traumata (n = 2; negative controls). The skeletal remains originated from two early medieval cemeteries in Switzerland. Gas chromatographic (GC) analysis revealed only minor differences between the samples. So far, children with no pathologic alterations had much the same values as the negative controls, while children with bone reactions paradoxically exhibited even slightly higher values of hydroxyproline and hydroxylysine. Future research will require a larger sample size and must address sampling strategies. Besides the possible misdiagnosis of Möller-Barlow disease, it is debatable whether only the newly built bone should be analysed, even though this could lead to problems related to small sample quantity. It also remains to be seen to what extent varying turnover rates of different skeletal elements, especially in children, must be taken into account.
Abstract:
In this paper, we propose a new method for fully automatic landmark detection and shape segmentation in X-ray images. To detect landmarks, we estimate the displacements from a number of randomly sampled image patches to the (unknown) landmark positions, and then integrate these predictions via a voting scheme. Our key contribution is a new algorithm for estimating these displacements. Unlike other methods, where each image patch independently predicts its displacement, we jointly estimate the displacements from all patches together in a data-driven way, considering not only the training data but also geometric constraints on the test image. The displacement estimation is formulated as a convex optimization problem that can be solved efficiently. Finally, we use the sparse shape composition model as a priori information to regularize the landmark positions and thus generate the segmented shape contour. We validate our method on X-ray image datasets of three different anatomical structures: complete femur, proximal femur and pelvis. Experiments show that our method is accurate and robust in landmark detection and, combined with the shape model, gives better or comparable performance in shape segmentation compared to state-of-the-art methods. Finally, a preliminary study using CT data shows the extensibility of our method to 3D data.
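The voting step described above can be sketched as follows. This is the simplified baseline in which each patch casts an independent vote on a pixel grid; the paper's actual contribution is to estimate the displacements jointly by convex optimization, which this sketch deliberately omits:

```python
import numpy as np

def vote_landmark(patch_centers, displacements, image_shape):
    """Each patch centre plus its predicted displacement casts one vote
    in an accumulator grid; the landmark estimate is the cell with the
    most votes. A simplified independent-vote baseline, not the paper's
    joint estimation scheme."""
    votes = np.zeros(image_shape, dtype=int)
    for (y, x), (dy, dx) in zip(patch_centers, displacements):
        vy, vx = int(round(y + dy)), int(round(x + dx))
        if 0 <= vy < image_shape[0] and 0 <= vx < image_shape[1]:
            votes[vy, vx] += 1  # discard votes that fall outside the image
    return np.unravel_index(np.argmax(votes), votes.shape)
```

In practice the votes would be smoothed (e.g. with a Gaussian) before taking the maximum, so that near-misses from noisy displacement predictions still reinforce each other.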
Abstract:
Cephalometric analysis is an essential clinical and research tool in orthodontics for orthodontic analysis and treatment planning. This paper presents the evaluation of the methods submitted to the Automatic Cephalometric X-Ray Landmark Detection Challenge, held at the IEEE International Symposium on Biomedical Imaging 2014 with an on-site competition. The challenge was set up to explore and compare automatic landmark detection methods on cephalometric X-ray images. Methods were evaluated on a common database of cephalograms of 300 patients aged six to 60 years, collected from the Dental Department, Tri-Service General Hospital, Taiwan, with anatomical landmarks manually marked by two experienced medical doctors as ground truth data. A quantitative evaluation was performed to compare the results of a representative selection of the methods submitted to the challenge. Experimental results show that three methods achieve detection rates greater than 80% within the 4 mm precision range, but only one method achieves a detection rate greater than 70% within the 2 mm precision range, which is the acceptable precision range in clinical practice. The study provides insights into the performance of different landmark detection approaches under real-world conditions and highlights the achievements and limitations of current image analysis techniques.
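The detection rates quoted above follow directly from per-landmark localization errors and a precision threshold (2 mm or 4 mm). A minimal sketch of that metric, using an assumed pixel-to-millimetre calibration rather than the challenge's actual image resolution:

```python
import math

def detection_rate(pred, truth, precision_mm=2.0, mm_per_pixel=0.1):
    """Fraction of landmarks whose Euclidean error is within the given
    precision range. pred and truth are lists of (x, y) pixel coordinates;
    mm_per_pixel is an assumed calibration, not the challenge's value."""
    hits = 0
    for (px, py), (tx, ty) in zip(pred, truth):
        err_mm = math.hypot(px - tx, py - ty) * mm_per_pixel
        if err_mm <= precision_mm:
            hits += 1
    return hits / len(pred)
```

The gap the paper reports between the 4 mm and 2 mm rates is simply this function evaluated at two thresholds on the same errors: many predictions land within 4 mm of the truth but not within the clinically acceptable 2 mm.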
Abstract:
The aim of this guidance paper of the European Psychiatric Association is to provide evidence-based recommendations on the early detection of a clinical high risk (CHR) for psychosis in patients with mental problems. To this aim, we conducted a meta-analysis of studies reporting conversion rates to psychosis in non-overlapping samples meeting at least one of the main CHR criteria: ultra-high risk (UHR) and/or basic symptoms criteria. Further, the effects of potential moderators (different UHR criteria definitions, single UHR criteria and age) on conversion rates were examined. Conversion rates in the 42 identified samples, with altogether more than 4000 CHR patients who had mainly been identified by UHR criteria and/or the basic symptom criterion 'cognitive disturbances' (COGDIS), showed considerable heterogeneity. While UHR criteria and COGDIS were related to similar conversion rates up to the 2-year follow-up, conversion rates for COGDIS were significantly higher thereafter. Differences in the onset and frequency requirements of symptomatic UHR criteria, or in their different consideration of functional decline, substance use and co-morbidity, did not seem to impact conversion rates. The 'genetic risk and functional decline' UHR criterion was rarely met and showed only a non-significant pooled sample effect. However, age significantly affected UHR conversion rates, with lower rates in children and adolescents. Although more research into potential sources of heterogeneity in conversion rates is needed to facilitate the improvement of CHR criteria, six evidence-based recommendations for the early detection of psychosis were developed as a basis for the EPA guidance on early intervention in CHR states.
Abstract:
BACKGROUND Detection of HIV-1 p24 antigen permits early identification of primary HIV infection and timely intervention to limit further spread of the infection. In principle, HIV screening should detect all viral variants equally, but reagents for a standardised test evaluation are limited. We therefore aimed to create an inexhaustible panel of diverse HIV-1 p24 antigens. METHODS We generated a panel of 43 recombinantly expressed virus-like particles (VLPs) containing the structural Gag proteins of HIV-1 subtypes A-H, the circulating recombinant forms (CRF) CRF01_AE, CRF02_AG, CRF12_BF and CRF20_BG, and group O. Eleven 4th-generation antigen/antibody tests and five antigen-only tests were evaluated for their ability to detect VLPs diluted in human plasma to p24 concentrations equivalent to 50, 10 and 2 IU/ml of the WHO p24 standard. Three tests were also evaluated for their ability to detect p24 after heat denaturation for immune-complex disruption, a prerequisite for ultrasensitive p24 detection. RESULTS Our VLP panel exhibited an average intra-clade p24 diversity of 6.7%. Among the 4th-generation tests, the Abbott Architect and Siemens Enzygnost Integral 4 had the highest sensitivities, at 97.7% and 93%, respectively. Alere Determine Combo and BioRad Access were the least sensitive, at 10.1% and 40.3%, respectively. Antigen-only tests were slightly more sensitive than combination tests. Almost all tests detected the WHO HIV-1 p24 standard at a concentration of 2 IU/ml, but their ability to detect this input for different subtypes varied greatly. Heat treatment lowered the overall detectability of HIV-1 p24 in two of the three tests, but only a few VLPs had a more than 3-fold loss in p24 detection. CONCLUSIONS The HIV-1 Gag subtype panel has a broad diversity and proved useful for a standardised evaluation of the detection limit and breadth of subtype detection of p24 antigen-detecting tests. Several tests exhibited problems, particularly with non-B subtypes.
Abstract:
UNLABELLED A high proportion of gut and bronchial neuroendocrine tumors (NETs) overexpress somatostatin receptors, especially the sst2 subtype. It has also recently been observed that incretin receptors, namely glucagon-like peptide-1 (GLP-1) and glucose-dependent insulinotropic peptide (GIP) receptors, can be overexpressed in gut and bronchial NETs. However, because not all tumors express these receptors in sufficient amounts, in vivo imaging with a single radioligand may not always be successful. We therefore evaluated with in vitro methods whether a cocktail of radioligands targeting these 3 receptors would improve tumor labeling. METHODS In vitro receptor autoradiography was performed on 55 NETs, comparing in successive sections of each tumor the binding of a single radioligand, either ¹²⁵I-Tyr³-octreotide, ¹²⁵I-GLP-1(7-36)amide, or ¹²⁵I-GIP(1-30), with the binding of a cocktail of all 3 radioligands, given concomitantly under identical experimental conditions. RESULTS Using the cocktail of radioligands, all tumors without exception showed moderate to very high binding, with a receptor density corresponding to 1,000-10,000 dpm/mg of tissue; conversely, single-ligand binding, although identifying most tumors as receptor-positive, failed to detect receptors, or measured only a low density of receptors below 1,000 dpm/mg, in a significant number of tumors. In addition, the cocktail of radioligands always provided homogeneous labeling of the whole tumor, whereas single radioligands occasionally showed heterogeneous labeling. CONCLUSION The study suggests that the use of a cocktail of 3 radioligands binding to somatostatin receptors, GLP-1 receptors, and GIP receptors would allow virtually all NETs to be detected and labeled homogeneously in vivo, representing a significant improvement for imaging and therapy in NETs.
Abstract:
Echinococcus multilocularis is an important pathogenic zoonotic parasite of health concern, though absent from the United Kingdom. Eurasian beavers (Castor fiber) may act as a rare intermediate host, so unscreened wild-caught individuals may pose a risk of introducing this parasite to disease-free countries through translocation programs. There is currently no single definitive ante-mortem diagnostic test in intermediate hosts. An effective non-lethal diagnostic that is feasible under field conditions would help minimise the risk of parasite establishment where indiscriminate culling is to be avoided. This study screened live beavers (captive, n = 18, or wild-trapped in Scotland, n = 12) and beaver cadavers (wild, Scotland, n = 4, or Bavaria, n = 11) for the presence of E. multilocularis. Ultrasonography in combination with minimally invasive surgical examination of the abdomen by laparoscopy was viable under field conditions for real-time evaluation in beavers. Laparoscopy alone does not allow the operator to visualize the parenchyma of organs such as the liver, or the inside of the lumen of the gastrointestinal tract; hence the advantage of combining it with abdominal ultrasonography. All live beavers and Scottish cadavers were largely unremarkable in their haematology and serum biochemistry, with no values suspicious for liver pathology or potentially indicative of E. multilocularis infection. This correlated well with ultrasound, laparoscopy and immunoblotting, which were unremarkable in these individuals. Two wild Bavarian individuals were suspected E. multilocularis positive at post-mortem, through the presence of hepatic cysts. The sensitivity and specificity of the combination of laparoscopy and abdominal ultrasonography in the detection of parasitic liver cyst lesions were both 100% in the subset of cadavers (95% confidence intervals 34.24–100% and 86.7–100%, respectively). For abdominal ultrasonography alone, sensitivity was only 50% (95% CI 9.5–90.6%), with specificity being 100% (95% CI 79.2–100%). For laparoscopy alone, sensitivity was 100% (95% CI 34.2–100%), with specificity also 100% (95% CI 77.2–100%). Further immunoblotting, PCR and histopathological examination revealed one individual positive for E. multilocularis, whilst the other individual was positive for Taenia martis.
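With only a handful of infected cadavers, the confidence intervals carry most of the information in these sensitivity and specificity figures. The Wilson score interval, sketched below, yields a lower bound of 34.24% for a proportion of 2/2, consistent with the interval quoted above for two positive cadavers; whether the authors used exactly this method is an assumption:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion.
    Chosen for illustration; the paper may have used another method."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)
```

The wide interval (34%–100%) around a "100% sensitivity" estimated from two positives is the statistical reason the abstract hedges its diagnostic claims.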
Abstract:
BACKGROUND Hepatitis B viruses (HBV) harboring mutations in the a-determinant of the hepatitis B surface antigen (HBsAg) are associated with reduced reactivity of HBsAg assays. OBJECTIVES To evaluate the sensitivity and specificity of three HBsAg point-of-care tests for the detection of HBsAg from viruses harboring HBsAg mutations. STUDY DESIGN A selection of 50 clinical plasma samples containing HBV with HBsAg mutations was used to evaluate the performance of three HBsAg point-of-care tests (Vikia®, bioMérieux, Marcy-l'Étoile, France; Alere Determine HBsAg™, Inverness Biomedical Innovations, Köln, Germany; Quick Profile™, LumiQuick Diagnostics, California, USA), compared to the ARCHITECT HBsAg Qualitative® assay (Abbott Laboratories, Sligo, Ireland). RESULTS The sensitivity of the point-of-care tests ranged from 98% to 100%. The only false-negative result occurred with the Quick Profile™ assay on a virus harboring a D144A mutation. CONCLUSIONS The evaluated point-of-care tests revealed an excellent sensitivity in detecting HBV samples harboring HBsAg mutations.
Abstract:
The use of infrared thermography for the identification of lameness in cattle has increased in recent years, largely because of its non-invasive properties, ease of automation and continued cost reductions. Thermography can be used to identify and characterize thermal abnormalities in animals by detecting an increase or decrease in the surface temperature of their skin. The variation in superficial thermal patterns resulting from changes in blood flow can in particular be used to detect inflammation or injury associated with conditions such as foot lesions. Thermography has been used not only as a diagnostic tool but also to evaluate routine farm management. Since 2000, 14 peer-reviewed papers assessing thermography for identifying and managing lameness in cattle have been published. Thermography performance differed greatly across these studies. However, thermography was demonstrated to have utility for detecting contralateral temperature differences and the maximum foot temperature in areas of interest. These publications also make apparent that a controlled environment is an important issue to consider before image scanning.
Abstract:
Any image-processing object detection algorithm in some way integrates the object light (recognition step) and applies statistical criteria to distinguish objects of interest from other objects or from pure background (decision step). There are various ways in which these two basic steps can be realized, as the different detection methods proposed in the literature show. An ideal detection algorithm should provide high recognition sensitivity with high decision accuracy and require a reasonable computational effort. In reality, a gain in sensitivity is usually only possible at the cost of decision accuracy and higher computational effort. Automatic detection of faint streaks therefore remains a challenge. This paper presents a detection algorithm using spatial filters that simulate the geometrical form of possible streaks on a CCD image, realized by image convolution. The goal of this method is to generate a more or less perfect match between a streak and a filter by varying the length and orientation of the filters. The convolution answers are accepted or rejected according to an overall threshold given by the background statistics. At first, this approach yields a huge number of accepted answers due to filters partially covering streaks or remaining stars. To avoid this, a set of additional acceptance criteria has been included in the detection method. All criteria parameters are justified by background and streak statistics, and they affect the detection sensitivity only marginally. Tests on images containing simulated streaks and on real images containing satellite streaks show very promising sensitivity, reliability and running speed for this detection method. Since all method parameters are based on statistics, the true-alarm as well as the false-alarm probability are well controllable. Moreover, the proposed method does not pose any extraordinary demands on the computer hardware or on the image acquisition process.
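The pipeline described above, oriented line filters applied by convolution and thresholded against background statistics, can be sketched as follows. The filter length, the angle step and the 5σ threshold are illustrative choices, not the paper's parameters, and the additional acceptance criteria the paper introduces to suppress partial matches are omitted:

```python
import numpy as np

def line_kernel(length, angle_deg):
    """Rasterize a 1-pixel-wide line segment of the given length and
    orientation into a square kernel, normalized to unit sum."""
    size = length | 1  # force odd size so the kernel has a centre pixel
    k = np.zeros((size, size))
    c = size // 2
    a = np.deg2rad(angle_deg)
    for t in np.linspace(-(length - 1) / 2, (length - 1) / 2, 2 * length):
        k[c + int(round(t * np.sin(a))), c + int(round(t * np.cos(a)))] = 1.0
    return k / k.sum()

def filter_response(image, kernel):
    """Direct 'same'-size correlation with zero padding; a plain stand-in
    for a library convolution routine."""
    ph, pw = kernel.shape[0] // 2, kernel.shape[1] // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)))
    out = np.zeros_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = np.sum(padded[i:i + kernel.shape[0],
                                      j:j + kernel.shape[1]] * kernel)
    return out

def detect_streaks(image, lengths=(7,), angles=range(0, 180, 30), nsigma=5.0):
    """Take the maximum response over a bank of oriented line filters and
    threshold at background mean + nsigma * std, estimated from the image
    itself. All parameter values are illustrative."""
    best = np.full(image.shape, -np.inf)
    for length in lengths:
        for ang in angles:
            best = np.maximum(best, filter_response(image, line_kernel(length, ang)))
    return best > image.mean() + nsigma * image.std()
```

Because each kernel averages along its line, a filter aligned with a streak accumulates the full streak signal while background noise averages out, which is what makes faint streaks separable from single-pixel noise.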
Abstract:
Background Protein-energy malnutrition (PEM) is common in people with end-stage kidney disease (ESKD) undergoing maintenance haemodialysis (MHD) and correlates strongly with mortality. To this day, there is no gold standard for detecting PEM in patients on MHD. Aim of Study The aim of this study was to evaluate whether Nutritional Risk Screening 2002 (NRS-2002), handgrip strength measurement, mid-upper arm muscle area (MUAMA), triceps skin fold measurement (TSF), serum albumin, normalised protein catabolic rate (nPCR), Kt/V and eKt/V, dry body weight, body mass index (BMI), age and time since the start of MHD are relevant for assessing PEM in patients on MHD. Methods The predictive value of the selected parameters for mortality, and for mortality or weight loss of more than 5%, was assessed. Quantitative analysis of the 12 parameters in the same patients on MHD in autumn 2009 (n = 64) and spring 2011 (n = 40) was performed with paired statistical analysis and multivariate logistic regression. Results Paired data analysis showed a significant reduction of dry body weight, BMI and nPCR. Kt/Vtot did not change; eKt/V and handgrip strength measurements were significantly higher in spring 2011. No changes were detected in TSF, serum albumin, NRS-2002 and MUAMA. Serum albumin was shown to be the only predictor of death and of the combined endpoint "death or weight loss of more than 5%". Conclusion We now screen patients biannually for serum albumin, nPCR, Kt/V, handgrip strength of the shunt-free arm, dry body weight, age and time since initiation of MHD.
Abstract:
The ATLS program of the American College of Surgeons is probably the most important globally active training organization dedicated to improving trauma management. Detection of acute haemorrhagic shock is among the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss, and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).

                              Class I           Class II        Class III              Class IV
Blood loss (ml)               Up to 750         750–1500        1500–2000              >2000
Blood loss (% blood volume)   Up to 15%         15–30%          30–40%                 >40%
Pulse rate (BPM)              <100              100–120         120–140                >140
Systolic blood pressure       Normal            Normal          Decreased              Decreased
Pulse pressure                Normal or ↑       Decreased       Decreased              Decreased
Respiratory rate              14–20             20–30           30–40                  >35
Urine output (ml/h)           >30               20–30           5–15                   Negligible
CNS/mental status             Slightly anxious  Mildly anxious  Anxious, confused      Confused, lethargic
Initial fluid replacement     Crystalloid       Crystalloid     Crystalloid and blood  Crystalloid and blood

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated according to the injuries in nearly 165,000 adult trauma patients, and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 min⁻¹ postulated by ATLS.
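The blood-loss bins of the ATLS table can be written as a simple lookup. This is purely didactic; the editorial's point is precisely that real patients often do not fit these bins, so no such function should be used for actual triage:

```python
def atls_class(blood_loss_pct):
    """Map estimated blood loss (% of blood volume) to the ATLS shock
    class bins of Table 1. Didactic illustration only: the classification
    itself has repeatedly failed validation, as discussed in the text."""
    if blood_loss_pct <= 15:
        return "I"
    if blood_loss_pct <= 30:
        return "II"
    if blood_loss_pct <= 40:
        return "III"
    return "IV"
```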
Moreover, deterioration of the different parameters does not necessarily run in parallel, as suggested by the ATLS shock classification [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient. A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II, and only 1% as Class III and Class IV, respectively. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, or the requirement of any blood transfusion or of >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined. As in the retrospective studies, Lawton did not find a statistical difference in heart rate and blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement of fluid substitution.
This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation of the shock classification. Nevertheless, it was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].

Normal (no haemorrhage): vitals normal; fluid bolus not applicable; no estimated blood loss.
Class I (mild): vitals normal; responds to a 1000 ml fluid bolus, no further fluid required; estimated blood loss up to 750 ml.
Class II (moderate): HR >100 with SBP >90 mmHg; responds to the fluid bolus, no further fluid required; estimated blood loss 750–1500 ml.
Class III (severe): SBP <90 mmHg; requires repeated fluid boluses; estimated blood loss 1500–2000 ml.
Class IV (moribund): SBP <90 mmHg or imminent arrest; declining SBP despite fluid boluses; estimated blood loss >2000 ml.

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the organism's response to fluid loss: a decrease of cardiac output, an increase of heart rate and a decrease of pulse pressure occur first; hypotension and bradycardia occur only later. An increasing heart rate, an increasing diastolic blood pressure or a decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit and the anesthetized patient in the OR. We will therefore continue to teach this typical pattern, but will also continue to mention the exceptions and pitfalls at a second stage. The shock classification of ATLS is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension), as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension).
Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS also refers to the pitfalls in the signs of acute haemorrhage, namely advanced age, athletes, pregnancy, medications and pacemakers, and explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and which has been used for decades, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss together with the clinical exam and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification tied to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching. The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs; it also includes the pattern of injuries, the requirement of fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but must be interpreted in view of the clinical context. Conflict of interest: none declared. The author is a member of the Swiss national ATLS core faculty.
Abstract:
Introduction: In team sports, the ability to use peripheral vision is essential to track a number of players and the ball. Eye-tracking studies have found that players either use fixations and saccades to process information on the pitch or use smooth pursuit eye movements (SPEM) to keep track of single objects (Schütz, Braun, & Gegenfurtner, 2011). However, it is assumed that peripheral vision is best used when gaze is stable, and it is unknown whether motion changes can be detected equally well when SPEM are used, especially because contrast sensitivity is reduced during SPEM (Schütz, Delipetkose, Braun, Kerzel, & Gegenfurtner, 2007). Therefore, peripheral motion change detection was examined by contrasting a fixation condition with a SPEM condition. Methods: 13 participants (7 male, 6 female) were presented with a visual display consisting of 15 white squares and 1 red square. Participants were instructed to follow the red square with their eyes and press a button as soon as a white square began to move. White square movements occurred either while the red square was still (fixation condition) or while it moved in a circular manner at 6°/s (pursuit condition). The to-be-detected white square movements varied in eccentricity (4°, 8°, 16°) and speed (1°/s, 2°/s, 4°/s), while the movement time of the white squares was constant at 500 ms. In total, 180 events were to be detected. A Vicon-integrated eye-tracking system and a button press (1000 Hz) were used to control for eye movements and to measure detection rates and response times. Response times (ms) and missed detections (%) were measured as dependent variables and analysed with a 2 (manipulation) × 3 (eccentricity) × 3 (speed) ANOVA with repeated measures on all factors. Results: Significant response time effects were found for manipulation, F(1,12) = 224.31, p < .01, ηp² = .95, eccentricity, F(2,24) = 56.43, p < .01, ηp² = .83, and the interaction between the two factors, F(2,24) = 64.43, p < .01, ηp² = .84.
Response times increased as a function of eccentricity for SPEM only and were overall higher than in the fixation condition. Results further showed missed-event effects for manipulation, F(1,12) = 37.14, p < .01, ηp² = .76, eccentricity, F(2,24) = 44.90, p < .01, ηp² = .79, the interaction between the two factors, F(2,24) = 39.52, p < .01, ηp² = .77, and the three-way interaction manipulation × eccentricity × speed, F(2,24) = 3.01, p = .03, ηp² = .20. While less than 2% of events were missed on average in the fixation condition, as well as at 4° and 8° eccentricity in the SPEM condition, missed events increased for SPEM at 16° eccentricity, with significantly more missed events in the 4°/s speed condition (1°/s: M = 34.69, SD = 20.52; 2°/s: M = 33.34, SD = 19.40; 4°/s: M = 39.67, SD = 19.40). Discussion: It could be shown that using SPEM impairs the ability to detect peripheral motion changes in the far periphery, and that fixations help not only to detect these motion changes but also to respond faster. Due to the high temporal constraints in team sports like soccer or basketball, fast reactions are necessary for successful anticipation and decision making. Thus, it is advisable to anchor gaze at a specific location if peripheral changes (e.g. movements of other players) that require a motor response have to be detected. In contrast, SPEM should only be used if a single object, like the ball in cricket or baseball, is necessary for a successful motor response. References: Schütz, A. C., Braun, D. I., & Gegenfurtner, K. R. (2011). Eye movements and perception: A selective review. Journal of Vision, 11, 1-30. Schütz, A. C., Delipetkose, E., Braun, D. I., Kerzel, D., & Gegenfurtner, K. R. (2007). Temporal contrast sensitivity during smooth pursuit eye movements. Journal of Vision, 7, 1-15.