Abstract:
Biomarker research relies on tissue microarrays (TMAs). TMAs are produced by repeated transfer of small tissue cores from a 'donor' block into a 'recipient' block and are then used for a variety of biomarker applications. The construction of conventional TMAs is labor intensive, imprecise, and time-consuming. Here, a protocol using next-generation Tissue Microarrays (ngTMA) is outlined. ngTMA is based on TMA planning and design, digital pathology, and automated tissue microarraying. The protocol is illustrated using an example of 134 metastatic colorectal cancer patients. Histological, statistical, and logistical aspects are considered, such as the tissue type, specific histological regions, and cell types for inclusion in the TMA, the number of tissue spots, sample size, statistical analysis, and number of TMA copies. Histological slides for each patient are scanned and uploaded onto a web-based digital platform. There, they are viewed and annotated (marked) with a 0.6-2.0 mm diameter tool, using multiple colors to distinguish different tissue areas. Donor blocks and 12 'recipient' blocks are loaded into the instrument. Digital slides are retrieved and matched to donor block images. Repeated arraying of annotated regions is performed automatically, resulting in an ngTMA. In this example, six ngTMAs are planned, containing six different tissue types/histological zones, and two copies of the ngTMAs are desired. Three to four slides for each patient are scanned; three scan runs are necessary and are performed overnight. All slides are annotated; different colors are used to represent the different tissues/zones, namely tumor center, invasion front, tumor/stroma, lymph node metastases, liver metastases, and normal tissue. Seventeen annotations per case are made; annotation time is 2-3 min per case. Twelve ngTMAs are produced, containing 4,556 spots. Arraying time is 15-20 hr. Due to its precision, flexibility, and speed, ngTMA is a powerful tool to further improve the quality of TMAs used in clinical and translational research.
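To make the planning arithmetic explicit, the minimal sketch below shows how the reported spot count follows from the design parameters given in the example (134 cases, 17 annotations per case, two copies); the variable names are illustrative and not part of the protocol.

```python
# Illustrative ngTMA planning arithmetic; figures taken from the example above,
# variable names are hypothetical.
n_patients = 134            # metastatic colorectal cancer cases
annotations_per_case = 17   # annotated regions per case across all tissue types/zones
n_copies = 2                # desired copies of each ngTMA

total_spots = n_patients * annotations_per_case * n_copies
print(total_spots)          # 4556, matching the 4,556 spots reported for the 12 ngTMAs
```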
Abstract:
Background: There is limited evidence about the impact of treatment for subclinical hypothyroidism, especially among older people. Aim: To investigate the variation in GP treatment strategies for older patients with subclinical hypothyroidism depending on country and patient characteristics. Design and setting: Case-based survey of GPs in the Netherlands, Germany, England, Ireland, Switzerland, and New Zealand. Method: The treatment strategy of GPs (treatment yes/no, starting dose of thyroxine) was assessed for eight cases, each presenting a woman with subclinical hypothyroidism. The cases differed in the patient characteristics of age (70 versus 85 years), vitality status (vital versus vulnerable), and thyroid-stimulating hormone (TSH) concentration (6 versus 15 mU/L). Results: A total of 526 GPs participated (the Netherlands n = 129, Germany n = 61, England n = 22, Ireland n = 21, Switzerland n = 262, New Zealand n = 31; overall response 19%). Differences in treatment strategy were observed across countries. GPs from the Netherlands (mean treatment percentage 34%), England (40%), and New Zealand (39%) were less inclined to start treatment than GPs in Germany (73%), Ireland (62%), and Switzerland (52%) (P = 0.05). Overall, GPs were less inclined to start treatment in 85-year-old than in 70-year-old females (pooled odds ratio [OR] 0.74 [95% confidence interval [CI] = 0.63 to 0.87]). Females with a TSH of 15 mU/L were more likely to be treated than those with a TSH of 6 mU/L (pooled OR 9.49 [95% CI = 5.81 to 15.5]). Conclusion: GP treatment strategies for older people with subclinical hypothyroidism vary considerably by country and patient characteristics. This variation underlines the need for a new generation of international guidelines based on the outcomes of randomised clinical trials set within primary care.
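The eight vignettes correspond to a full factorial crossing of the three patient characteristics named above; a minimal sketch of that design, with purely illustrative labels, follows.

```python
# Minimal sketch of the 2 x 2 x 2 case-vignette design (labels are illustrative;
# the survey's exact wording is not reproduced here).
from itertools import product

ages = [70, 85]                     # years
vitality = ["vital", "vulnerable"]
tsh = [6, 15]                       # mU/L

cases = list(product(ages, vitality, tsh))
print(len(cases))                   # 8 vignettes
for age, vit, level in cases:
    print(f"Woman, {age} years, {vit}, TSH {level} mU/L")
```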
Abstract:
INTRODUCTION Myasthenia gravis is an autoimmune disease characterized by fluctuating muscle weakness. It is often associated with other autoimmune disorders, such as thyroid disease, rheumatoid arthritis, systemic lupus erythematosus, and antiphospholipid syndrome. Many aspects of autoimmune diseases are not completely understood, particularly when they occur in association, which suggests a common pathogenetic mechanism. CASE PRESENTATION We report the case of a 42-year-old Caucasian woman with antiphospholipid syndrome, in whom myasthenia gravis developed years later. She tested negative for both antibodies against the acetylcholine receptor and antibodies against muscle-specific receptor tyrosine kinase, but had typical decremental responses on repetitive nerve stimulation testing, so generalized myasthenia gravis was diagnosed. Her thromboplastin time and activated partial thromboplastin time were high, and her anticardiolipin and anti-β2 glycoprotein-I antibodies were slightly elevated, as a manifestation of the antiphospholipid syndrome. She had a good clinical response when treated with a combination of pyridostigmine, prednisone and azathioprine. CONCLUSIONS Many patients with myasthenia gravis test positive for a large variety of autoantibodies, testifying to an immune dysregulation, and some display mild T-cell lymphopenia associated with hypergammaglobulinemia and B-cell hyper-reactivity. Both of these mechanisms could explain the occurrence of another autoimmune condition, such as antiphospholipid syndrome, but further studies are necessary to shed light on this matter. Clinicians should be aware that patients with an autoimmune diagnosis such as antiphospholipid syndrome who develop signs and neurological symptoms suggestive of myasthenia gravis are at risk, and such findings should prompt emergent evaluation by a specialist.
Abstract:
This article examines the role of social salience, or the relative ability of a linguistic variable to evoke social meaning, in structuring listeners’ perceptions of quantitative sociolinguistic distributions. Building on the foundational work of Labov et al. (2006, 2011) on the “sociolinguistic monitor” (a proposed cognitive mechanism responsible for sociolinguistic perception), we examine whether listeners’ evaluative judgments of speech change as a function of the type of variable presented. We consider two variables in British English, ING and TH-fronting, which we argue differ in their relative social salience. Replicating the design of Labov et al.’s studies, we test 149 British listeners’ reactions to different quantitative distributions of these variables. Our experiments elicit a very different pattern of perceptual responses than those reported previously. In particular, our results suggest that a variable’s social salience determines both whether and how it is perceptually evaluated. We argue that this finding is crucial for understanding how sociolinguistic information is cognitively processed.
Abstract:
BACKGROUND The main goals of the standard treatment for advanced symptomatic knee osteoarthritis, total knee arthroplasty (TKA), are pain reduction and restoration of knee motion. The aim of this study was to analyse the outcome of the patient-based Knee Injury and Osteoarthritis Outcome Score (KOOS), and the surgeon-based Knee Society Score (KSS) and its Knee Score (KS) and Knee Functional Score (KFS) components, after TKA using the Journey knee prosthesis, and to assess the correlation of these scores with range of motion (ROM). METHODS In a prospective case series study between August 1st 2008 and May 31st 2011, 99 patients, all operated on by a single surgeon, received Journey bicruciate stabilized total knee prostheses. The female/male ratio was 53/34, the mean patient age at surgery was 68 years (range 41-83 years), and the left/right knee ratio was 55/44. The KOOS, range of motion, and KS and KFS were obtained preoperatively and at 1-year follow-up. The pre- and postoperative levels of the outcome measures were compared using the Wilcoxon signed-rank test. Correlation between ROM and patient outcomes was analysed with the Spearman coefficient. RESULTS All KOOS subscores improved significantly. Ninety percent of patients improved by at least the minimum clinically relevant difference of 10 points in stiffness and other symptoms, 94.5% in pain, 94.5% in activities of daily living, 84.9% in sports and recreation, and 90% in knee-related quality of life. The mean passive and active ROM improved from 122.4° (range 90-145°) and 120.4° (range 80-145°) preoperatively to 129.4° (range 90-145°) and 127.1° (range 100-145°) postoperatively. The highest correlation coefficients for ROM and KOOS were observed for the activity and pain subscores. Very low or no correlation was seen for the sport subscore. There was a significant and clinically relevant improvement of the KSS (preop/postop 112.2/174.5 points) and of its KS (preop/postop 45.6/86.8 points) and KFS (preop/postop 66.6/87.8 points) components. CONCLUSIONS The Journey bicruciate stabilized knee prosthesis showed good 1-year postoperative results in terms of both functional and patient-based outcome. However, higher knee ROM correlates only moderately with patient-based outcome, implying that the functionality afforded by the Journey bicruciate TKA is not equivalent to patient satisfaction.
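As a rough illustration of the two analyses named in the Methods, the sketch below applies the Wilcoxon signed-rank test to paired pre/post scores and the Spearman coefficient to ROM versus outcome; the numbers are invented for illustration and are not study data.

```python
# Sketch of the analyses named in the Methods (Wilcoxon signed-rank test for
# pre- vs. postoperative scores, Spearman correlation between ROM and outcome).
# The arrays below are made-up illustrative values, not study data.
import numpy as np
from scipy.stats import wilcoxon, spearmanr

koos_pre  = np.array([45, 52, 38, 60, 47])       # hypothetical preoperative KOOS subscores
koos_post = np.array([78, 85, 70, 88, 80])       # hypothetical 1-year postoperative subscores
rom_post  = np.array([120, 135, 110, 140, 125])  # hypothetical postoperative ROM in degrees

stat, p = wilcoxon(koos_pre, koos_post)          # paired, non-parametric pre/post comparison
rho, p_rho = spearmanr(rom_post, koos_post)      # rank correlation of ROM with outcome
print(p, rho)
```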
Abstract:
A 58-year-old male patient was admitted to our emergency department at a large university hospital due to acute onset of general weakness. It was reported that the patient was bradycardic at 30/min and felt an increasing weakness of the limbs. At admission to the emergency department, the patient was not feeling any discomfort and denied dyspnoea or pain. The primary examination of the nervous system showed cranial nerves II-XII intact, muscle strength of the lower extremities was 4/5, and a minimal left-sided sensory loss was found. In addition, the patient complained about lazy lips. During the ongoing examinations, the patient again developed symptomatic bradycardia, accompanied by complete tetraplegia. Subsequent blood testing showed severe hyperkalemia, probably induced by the use of aldosterone antagonists, as the cause of the patient's neurologic symptoms. Hyperkalemia is a rare but treatable cause of acute paralysis that requires immediate treatment. Late diagnosis can delay appropriate treatment, leading to cardiac arrhythmias and arrest.
Abstract:
A search is presented for the production of new heavy quarks that decay to a Z boson and a third-generation Standard Model quark. In the case of a new charge +2/3 quark (T), the decay targeted is T → Zt, while the decay targeted for a new charge −1/3 quark (B) is B → Zb. The search is performed with a dataset corresponding to 20.3 fb⁻¹ of pp collisions at √s = 8 TeV recorded in 2012 with the ATLAS detector at the CERN Large Hadron Collider. Selected events contain a high transverse momentum Z boson candidate reconstructed from a pair of oppositely charged same-flavor leptons (electrons or muons), and are analyzed in two channels defined by the absence or presence of a third lepton. Hadronic jets, in particular those with properties consistent with the decay of a b-hadron, are also required to be present in selected events. Different requirements are made on the jet activity in the event in order to enhance the sensitivity to either heavy quark pair production mediated by the strong interaction, or single production mediated by the electroweak interaction. No significant excess of events above the Standard Model expectation is observed, and lower limits are derived on the mass of vector-like T and B quarks under various branching ratio hypotheses, as well as upper limits on the magnitude of electroweak coupling parameters.
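The Z-candidate requirement described above, an oppositely charged, same-flavor lepton pair, can be illustrated with a toy selection; the four-vector format, the example kinematics, and the mass window used here are assumptions for illustration only, not the analysis cuts.

```python
# Toy Z-candidate selection: oppositely charged, same-flavor lepton pair with an
# invariant mass near m_Z. Four-vectors and the mass window are illustrative assumptions.
import math

def invariant_mass(p1, p2):
    """p = (E, px, py, pz) in GeV."""
    E, px, py, pz = (a + b for a, b in zip(p1, p2))
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

def is_z_candidate(l1, l2, window=(66.0, 116.0)):
    same_flavor = l1["flavor"] == l2["flavor"]            # e+e- or mu+mu-
    opposite_charge = l1["charge"] * l2["charge"] < 0
    m_ll = invariant_mass(l1["p4"], l2["p4"])
    return same_flavor and opposite_charge and window[0] < m_ll < window[1]

# Hypothetical (approximately massless) leptons, momenta in GeV:
mu_plus  = {"flavor": "mu", "charge": +1, "p4": (50.0, 30.0, 40.0, 0.0)}
mu_minus = {"flavor": "mu", "charge": -1, "p4": (45.0, -27.0, -36.0, 0.0)}
print(is_z_candidate(mu_plus, mu_minus))   # True (m_ll ≈ 95 GeV for this toy pair)
```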
Abstract:
BACKGROUND Detection of HIV-1 p24 antigen permits early identification of primary HIV infection and timely intervention to limit further spread of the infection. In principle, HIV screening should detect all viral variants equally, but reagents for a standardised test evaluation are limited. Therefore, we aimed to create an inexhaustible panel of diverse HIV-1 p24 antigens. METHODS We generated a panel of 43 recombinantly expressed virus-like particles (VLPs) containing the structural Gag proteins of HIV-1 subtypes A-H, circulating recombinant forms (CRFs) CRF01_AE, CRF02_AG, CRF12_BF and CRF20_BG, and group O. Eleven 4th-generation antigen/antibody tests and five antigen-only tests were evaluated for their ability to detect VLPs diluted in human plasma to p24 concentrations equivalent to 50, 10 and 2 IU/ml of the WHO p24 standard. Three tests were also evaluated for their ability to detect p24 after heat denaturation for immune-complex disruption, a prerequisite for ultrasensitive p24 detection. RESULTS Our VLP panel exhibited an average intra-clade p24 diversity of 6.7%. Among the 4th-generation tests, the Abbott Architect and Siemens Enzygnost Integral 4 had the highest sensitivities of 97.7% and 93%, respectively. Alere Determine Combo and BioRad Access were least sensitive, with 10.1% and 40.3%, respectively. Antigen-only tests were slightly more sensitive than combination tests. Almost all tests detected the WHO HIV-1 p24 standard at a concentration of 2 IU/ml, but their ability to detect this input for different subtypes varied greatly. Heat treatment lowered the overall detectability of HIV-1 p24 in two of the three tests, but only a few VLPs had a more than 3-fold loss in p24 detection. CONCLUSIONS The HIV-1 Gag subtype panel has a broad diversity and proved useful for a standardised evaluation of the detection limit and breadth of subtype detection of p24 antigen-detecting tests. Several tests exhibited problems, particularly with non-B subtypes.
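As a small illustration of how per-assay sensitivity over such a panel can be tabulated, the sketch below computes the fraction of panel members detected; the denominator (how VLP and concentration combinations are counted) is an assumption here, since the abstract does not spell it out.

```python
# Illustrative sensitivity tabulation over a VLP panel (the denominator is an assumption;
# the study's exact counting of VLP x concentration combinations is not reproduced).
def panel_sensitivity(detected_flags):
    """detected_flags: one boolean per VLP/dilution tested by a given assay."""
    return 100.0 * sum(detected_flags) / len(detected_flags)

# e.g. a hypothetical assay detecting 42 of 43 panel members at a given p24 input:
print(round(panel_sensitivity([True] * 42 + [False]), 1))  # 97.7
```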
Abstract:
In sports games, it is often necessary to perceive a large number of moving objects (e.g., the ball and players). In this context, the role of peripheral vision for processing motion information in the periphery is often discussed, especially when motor responses are required. In an attempt to test the basal functionality of peripheral vision in those sports-game situations, a Multiple Object Tracking (MOT) task, which requires tracking a certain number of targets amidst distractors, was chosen. Participants' primary task was to recall four targets (out of 10 rectangular stimuli) after six seconds of quasi-random motion. As a secondary task, a button had to be pressed if a target change occurred (Exp 1: stop vs. form change to a diamond for 0.5 s; Exp 2: stop vs. slowdown for 0.5 s). While the eccentricities of the changes (5-10° vs. 15-20°) were manipulated, decision accuracy (recall and button press correct), motor response time and saccadic reaction time were calculated as dependent variables. Results show that participants indeed used peripheral vision to detect changes, because either no or only very late saccades to the changed target were executed in correct trials. Moreover, a saccade was more often executed when eccentricities were small. Response accuracies were higher and response times were lower in the stop conditions of both experiments, while larger eccentricities led to higher response times in all conditions. In sum, it could be shown that monitoring targets and detecting changes can be accomplished by peripheral vision alone, and that a monitoring strategy based on peripheral vision may be the optimal one, as saccades may incur certain costs. Further research is planned to address the question of whether this functionality is also evident in sports tasks.
Abstract:
In sports games, it is often necessary to perceive a large number of moving objects (e.g., the ball and players). In this context, the role of peripheral vision for processing motion information in the periphery is often discussed, especially when motor responses are required. In an attempt to test the capability of using peripheral vision in those sports-game situations, a Multiple-Object-Tracking task, which requires tracking a certain number of targets amidst distractors, was chosen to determine the sensitivity of detecting target changes with peripheral vision only. Participants' primary task was to recall four targets (out of 10 rectangular stimuli) after six seconds of quasi-random motion. As a secondary task, a button had to be pressed if a target change occurred (Exp 1: stop vs. form change to a diamond for 0.5 s; Exp 2: stop vs. slowdown for 0.5 s). The eccentricities of the changes (5-10° vs. 15-20°) were manipulated; decision accuracy (recall and button press correct), motor response time and saccadic reaction time (change onset to saccade onset) were calculated, and eye movements were recorded. Results show that participants indeed used peripheral vision to detect changes, because either no or only very late saccades to the changed target were executed in correct trials. Moreover, a saccade was more often executed when eccentricities were small. Response accuracies were higher and response times were lower in the stop conditions of both experiments, while larger eccentricities led to higher response times in all conditions. In sum, it could be shown that monitoring targets and detecting changes can be accomplished by peripheral vision alone, and that a monitoring strategy based on peripheral vision may be the optimal one, as saccades may incur certain costs. Further research is planned to address the question of whether this functionality is also evident in sports tasks.
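A minimal sketch of the two latency measures named above, saccadic reaction time (change onset to saccade onset) and motor response time, is given below; the timestamps are invented for illustration.

```python
# Illustrative latency computation (timestamps in ms are invented).
change_onset_ms  = 3200   # onset of the target change (stop / slowdown / form change)
saccade_onset_ms = 3550   # onset of the first saccade toward the changed target, if any
button_press_ms  = 3780   # manual detection response

saccadic_reaction_time_ms = saccade_onset_ms - change_onset_ms   # change onset to saccade onset
motor_response_time_ms    = button_press_ms - change_onset_ms
print(saccadic_reaction_time_ms, motor_response_time_ms)         # 350 580
```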
Abstract:
Stereotypies are repetitive and relatively invariant patterns of behavior, which are observed in a wide range of species in captivity. Stereotypic behavior occurs when environmental demands produce a physiological response that, if sustained for an extended period, exceeds the natural physiological regulatory capacity of the organism, particularly in situations that include unpredictability and uncontrollability. One hypothesis is that stereotypic behavior functions to cope with stressful environments, but the existing evidence is contradictory. To address the coping hypothesis of stereotypies, we triggered physiological reactions in 22 horses affected by stereotypic behavior (crib-biters) and 21 non-crib-biters (controls) using an ACTH challenge test. Following administration of an ACTH injection, we measured salivary cortisol every 30 min and heart rate (HR) continuously for a period of 3 h. We did not find any differences in HR or HR variability between the two groups, but crib-biters (Group CB) had significantly higher cortisol responses than controls (Group C; mean ± SD: CB, 5.84 ± 2.62 ng/ml; C, 4.76 ± 3.04 ng/ml). Moreover, crib-biters that did not perform the stereotypic behavior during the 3-hour test period (Group B) had significantly higher cortisol levels than controls, which was not the case for crib-biters showing stereotypic behavior (Group A) (B, 6.44 ± 2.38 ng/ml; A, 5.58 ± 2.69 ng/ml). Our results suggest that crib-biting is a coping strategy that helps stereotypic individuals to reduce cortisol levels caused by stressful situations. We conclude that preventing stereotypic horses from crib-biting could be an inappropriate strategy to control this abnormal behavior, as it prevents individuals from coping with situations that they perceive as stressful.
Abstract:
In contact shots, the muzzle imprint is an informative finding associated with the entrance wound. It typically mirrors the constructional components that are in line with the muzzle or just behind it. Under special conditions, other patterned skin marks located near a gunshot entrance wound may give the impression of being part of the muzzle imprint. A potential mechanism causing a patterned pressure abrasion in close proximity to the bullet entrance site is demonstrated on the basis of a suicidal shot to the temple. The skin lesion in question appeared as a ring-shaped excoriation with a diameter corresponding to that of the cartridge case. Two hypotheses concerning the causative mechanism were investigated by test shots: (1) after being ejected, the cartridge case ricocheted inside a confined space (a car cabin in this particular case) and secondarily hit the skin near the gunshot entrance wound; (2) the ejection of the cartridge case failed, so that the case became stuck in the ejection port and its mouth contacted the skin when the body collapsed after being hit.
Abstract:
The articular cartilage layer of synovial joints is commonly lesioned by trauma or by a degenerative joint disease. Attempts to repair the damage frequently involve autologous chondrocyte implantation (ACI). Healthy cartilage must first be removed from the joint and then, on a separate occasion, following the isolation of the chondrocytes and their expansion in vitro, implanted within the lesion. The disadvantages of this therapeutic approach include the destruction of healthy cartilage, which may predispose the joint to osteoarthritic degeneration, the necessarily restricted availability of healthy tissue, the limited proliferative capacity of the donor cells, which declines with age, and the need for two surgical interventions. We postulated that it should be possible to induce synovial stem cells, which are characterized by high, age-independent proliferative and chondrogenic differentiation capacities, to lay down cartilage within the outer juxtasynovial space after the transcutaneous implantation of a carrier bearing BMP-2 in a slow-release system. The chondrocytes could then be isolated on-site and immediately used for ACI. To test this hypothesis, Chinchilla rabbits were used as an experimental model. A collagenous patch bearing BMP-2 in a slow-delivery vehicle was sutured to the inner face of the synovial membrane. The neoformed tissue was excised 5, 8, 11 and 14 days postimplantation for histological and histomorphometric analyses. Neoformed tissue was observed within the outer juxtasynovial space as early as the 5th postimplantation day. It contained connective and adipose tissues, and a central nugget of growing cartilage. Between days 5 and 14, the absolute volume of cartilage increased, attaining a value of 12 mm³ at the latter time point. Bone was deposited in measurable quantities from the 11th day onwards, but owing to resorption, the net volume did not exceed 1.5 mm³ (14th day). The findings confirm our hypothesis. The quantity of neoformed cartilage deposited after only 1 week within the outer juxtasynovial space would yield sufficient cells for ACI. Since the BMP-2-bearing patches would be implanted transcutaneously in humans, only one surgical or arthroscopic intervention would be called for. Moreover, and most importantly, sufficient numbers of cells could be generated in patients of all ages.
Abstract:
BACKGROUND A single non-invasive gene expression profiling (GEP) test (AlloMap®) is often used to discriminate whether a heart transplant recipient is at low risk of acute cellular rejection at the time of testing. In a randomized trial, use of the test (a GEP score from 0-40) was shown to be non-inferior to routine endomyocardial biopsy for surveillance after heart transplantation in selected low-risk patients with respect to clinical outcomes. Recently, it was suggested that the within-patient variability of consecutive GEP scores may be used to independently predict future clinical events; however, further studies were recommended. Here we performed an analysis of an independent patient population to determine the prognostic utility of within-patient variability of GEP scores in predicting future clinical events. METHODS We defined the GEP score variability as the standard deviation of four GEP scores collected ≥315 days post-transplantation. Of the 737 patients from the Cardiac Allograft Rejection Gene Expression Observational (CARGO) II trial, 36 were assigned to the composite event group (death, re-transplantation or graft failure ≥315 days post-transplantation and within 3 years of the final GEP test) and 55 were assigned to the control group (non-event patients). In this case-control study, the performance of GEP score variability in predicting future events was evaluated by the area under the receiver operating characteristic curve (AUC ROC). The negative predictive values (NPV) and positive predictive values (PPV), including 95% confidence intervals (CI), of GEP score variability were calculated. RESULTS The estimated prevalence of events was 17%. Events occurred at a median of 391 (inter-quartile range 376) days after the final GEP test. The GEP variability AUC ROC for the prediction of a composite event was 0.72 (95% CI 0.6-0.8). The NPV for a GEP score variability of 0.6 was 97% (95% CI 91.4-100.0); the PPV for a GEP score variability of 1.5 was 35.4% (95% CI 13.5-75.8). CONCLUSION In heart transplant recipients, GEP score variability may be used to predict the probability that a composite event will occur within 3 years after the last GEP score. TRIAL REGISTRATION Clinicaltrials.gov identifier NCT00761787.
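The variability measure defined in the Methods is simply the standard deviation of four consecutive GEP scores; a minimal sketch, with invented scores rather than CARGO II data, is shown below (the abstract does not state whether the sample or population standard deviation was used, so the sample form is assumed here).

```python
# Sketch of the GEP score variability measure: the standard deviation of four GEP
# scores collected >= 315 days post-transplantation. Scores are invented; using the
# sample standard deviation is an assumption.
import statistics

gep_scores = [30, 31, 33, 34]          # four hypothetical AlloMap scores (0-40 scale)
variability = statistics.stdev(gep_scores)
print(round(variability, 2))           # 1.83

# The abstract reports predictive values at two working points:
# NPV at a variability of 0.6 and PPV at a variability of 1.5.
```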
Abstract:
The current study investigated whether peripheral vision can be used to monitor multiple moving objects and to detect single-target changes. For this purpose, in Experiment 1, a modified MOT setup with a large projection and a constant-position centroid phase had to be checked first. Classical findings regarding the use of a virtual centroid to track multiple objects and the dependency of tracking accuracy on target speed were successfully replicated. Thereafter, the main experimental variations regarding the manipulation of to-be-detected target changes were introduced in Experiment 2. In addition to a button press used for the detection task, gaze behavior was assessed using an integrated eye-tracking system. The analysis of saccadic reaction times in relation to the motor response shows that peripheral vision is naturally used to detect motion and form changes in MOT, because the saccade to the target occurred after target-change offset. Furthermore, for changes of comparable task difficulty, motion changes are detected better by peripheral vision than form changes. The findings indicate that capabilities of the visual system (e.g., visual acuity) affect change detection rates and that covert-attention processes may be affected by vision-related aspects like spatial uncertainty. Moreover, it is argued that a centroid-MOT strategy might reduce the amount of saccade-related costs and that eye tracking seems to be generally valuable for testing predictions derived from theories of MOT. Finally, implications for testing covert attention in applied settings are proposed.