57 results for TEST CASE GENERATION
Abstract:
BACKGROUND Detection of HIV-1 p24 antigen permits early identification of primary HIV infection and timely intervention to limit further spread of the infection. Ideally, HIV screening tests should detect all viral variants equally well, but reagents for a standardised test evaluation are limited. We therefore aimed to create an inexhaustible panel of diverse HIV-1 p24 antigens. METHODS We generated a panel of 43 recombinantly expressed virus-like particles (VLPs) containing the structural Gag proteins of HIV-1 subtypes A-H, the circulating recombinant forms (CRFs) CRF01_AE, CRF02_AG, CRF12_BF and CRF20_BG, and group O. Eleven 4th-generation antigen/antibody tests and five antigen-only tests were evaluated for their ability to detect VLPs diluted in human plasma to p24 concentrations equivalent to 50, 10 and 2 IU/ml of the WHO p24 standard. Three tests were also evaluated for their ability to detect p24 after heat denaturation for immune-complex disruption, a prerequisite for ultrasensitive p24 detection. RESULTS Our VLP panel exhibited an average intra-clade p24 diversity of 6.7%. Among the 4th-generation tests, the Abbott Architect and Siemens Enzygnost Integral 4 had the highest sensitivities, at 97.7% and 93%, respectively. Alere Determine Combo and BioRad Access were the least sensitive, at 10.1% and 40.3%, respectively. Antigen-only tests were slightly more sensitive than combination tests. Almost all tests detected the WHO HIV-1 p24 standard at a concentration of 2 IU/ml, but their ability to detect this input varied greatly between subtypes. Heat treatment lowered the overall detectability of HIV-1 p24 in two of the three tests, but only a few VLPs showed a more than 3-fold loss in p24 detection. CONCLUSIONS The HIV-1 Gag subtype panel has broad diversity and proved useful for a standardised evaluation of the detection limit and breadth of subtype detection of p24 antigen-detecting tests. Several tests exhibited problems, particularly with non-B subtypes.
Abstract:
In sports games, it is often necessary to perceive a large number of moving objects (e.g., the ball and players). In this context, the role of peripheral vision for processing motion information in the periphery is often discussed, especially when motor responses are required. To test the basal functionality of peripheral vision in such sports-game situations, a Multiple Object Tracking (MOT) task, which requires tracking a certain number of targets amidst distractors, was chosen. Participants’ primary task was to recall four targets (out of 10 rectangular stimuli) after six seconds of quasi-random motion. As a secondary task, a button had to be pressed if a target change occurred (Exp 1: stop vs. form change to a diamond for 0.5 s; Exp 2: stop vs. slowdown for 0.5 s). Eccentricities of changes (5-10° vs. 15-20°) were manipulated, and decision accuracy (recall and button press correct), motor response time and saccadic reaction time were calculated as dependent variables. Results show that participants indeed used peripheral vision to detect changes, because either no saccades or only very late saccades to the changed target were executed in correct trials. Moreover, saccades were executed more often when eccentricities were small. Response accuracies were higher and response times lower in the stop conditions of both experiments, while larger eccentricities led to higher response times in all conditions. In sum, monitoring targets and detecting changes can be accomplished by peripheral vision alone, and a monitoring strategy based on peripheral vision may be the optimal one, as saccades may incur certain costs. Further research is planned to address whether this functionality is also evident in sports tasks.
Abstract:
In sports games, it is often necessary to perceive a large number of moving objects (e.g., the ball and players). In this context, the role of peripheral vision for processing motion information in the periphery is often discussed, especially when motor responses are required. To test the capability of using peripheral vision in such sports-game situations, a Multiple-Object-Tracking task, which requires tracking a certain number of targets amidst distractors, was chosen to determine the sensitivity of detecting target changes with peripheral vision only. Participants’ primary task was to recall four targets (out of 10 rectangular stimuli) after six seconds of quasi-random motion. As a secondary task, a button had to be pressed if a target change occurred (Exp 1: stop vs. form change to a diamond for 0.5 s; Exp 2: stop vs. slowdown for 0.5 s). Eccentricities of changes (5-10° vs. 15-20°) were manipulated; decision accuracy (recall and button press correct), motor response time and saccadic reaction time (change onset to saccade onset) were calculated; and eye movements were recorded. Results show that participants indeed used peripheral vision to detect changes, because either no saccades or only very late saccades to the changed target were executed in correct trials. Moreover, saccades were executed more often when eccentricities were small. Response accuracies were higher and response times lower in the stop conditions of both experiments, while larger eccentricities led to higher response times in all conditions. In sum, monitoring targets and detecting changes can be accomplished by peripheral vision alone, and a monitoring strategy based on peripheral vision may be the optimal one, as saccades may incur certain costs. Further research is planned to address whether this functionality is also evident in sports tasks.
Abstract:
Stereotypies are repetitive and relatively invariant patterns of behavior observed in a wide range of species in captivity. Stereotypic behavior occurs when environmental demands produce a physiological response that, if sustained for an extended period, exceeds the natural physiological regulatory capacity of the organism, particularly in situations involving unpredictability and uncontrollability. One hypothesis is that stereotypic behavior functions to cope with stressful environments, but the existing evidence is contradictory. To address this coping hypothesis of stereotypies, we triggered physiological reactions in 22 horses affected by stereotypic behavior (crib-biters) and 21 non-crib-biters (controls) using an ACTH challenge test. Following administration of an ACTH injection, we measured salivary cortisol every 30 min and heart rate (HR) continuously for a period of 3 h. We did not find any differences in HR or HR variability between the two groups, but crib-biters (Group CB) had significantly higher cortisol responses than controls (Group C; mean ± SD: CB, 5.84 ± 2.62 ng/ml; C, 4.76 ± 3.04 ng/ml). Moreover, crib-biters that did not perform the stereotypic behavior during the 3-hour test period (Group B) had significantly higher cortisol levels than controls, which was not the case for crib-biters showing stereotypic behavior (Group A) (B, 6.44 ± 2.38 ng/ml; A, 5.58 ± 2.69 ng/ml). Our results suggest that crib-biting is a coping strategy that helps stereotypic individuals reduce cortisol levels caused by stressful situations. We conclude that preventing stereotypic horses from crib-biting could be an inappropriate strategy to control this abnormal behavior, as it prevents individuals from coping with situations that they perceive as stressful.
Abstract:
In contact shots, the muzzle imprint is an informative finding associated with the entrance wound. It typically mirrors the constructional components in line with, or just behind, the muzzle. Under special conditions, other patterned skin marks located near a gunshot entrance wound may give the impression of being part of the muzzle imprint. A potential mechanism causing a patterned pressure abrasion in close proximity to the bullet entrance site is demonstrated on the basis of a suicidal shot to the temple. The skin lesion in question appeared as a ring-shaped excoriation with a diameter corresponding to that of the cartridge case. Two hypotheses concerning the causative mechanism were investigated by test shots:
- After being ejected, the cartridge case ricocheted inside a confined space (a car cabin in this particular case) and secondarily hit the skin near the gunshot entrance wound.
- The ejection of the cartridge case failed, so the case became stuck in the ejection port and its mouth contacted the skin when the body collapsed after being hit.
Abstract:
The articular cartilage layer of synovial joints is commonly lesioned by trauma or by a degenerative joint disease. Attempts to repair the damage frequently involve autologous chondrocyte implantation (ACI): healthy cartilage must first be removed from the joint and then, on a separate occasion, following the isolation of the chondrocytes and their expansion in vitro, implanted within the lesion. The disadvantages of this therapeutic approach include the destruction of healthy cartilage (which may predispose the joint to osteoarthritic degeneration), the necessarily restricted availability of healthy tissue, the limited proliferative capacity of the donor cells (which declines with age), and the need for two surgical interventions. We postulated that it should be possible to induce synovial stem cells, which are characterized by high, age-independent proliferative and chondrogenic differentiation capacities, to lay down cartilage within the outer juxtasynovial space after the transcutaneous implantation of a carrier bearing BMP-2 in a slow-release system. The chondrocytes could then be isolated on-site and immediately used for ACI. To test this hypothesis, Chinchilla rabbits were used as an experimental model. A collagenous patch bearing BMP-2 in a slow-delivery vehicle was sutured to the inner face of the synovial membrane. The neoformed tissue was excised 5, 8, 11 and 14 days postimplantation for histological and histomorphometric analyses. Neoformed tissue was observed within the outer juxtasynovial space as early as the 5th postimplantation day. It contained connective and adipose tissues, and a central nugget of growing cartilage. Between days 5 and 14, the absolute volume of cartilage increased, attaining a value of 12 mm³ at the latter time point. Bone was deposited in measurable quantities from the 11th day onwards, but owing to resorption, the net volume did not exceed 1.5 mm³ (14th day). The findings confirm our hypothesis.
The quantity of neoformed cartilage deposited after only 1 week within the outer juxtasynovial space would yield sufficient cells for ACI. Since the BMP-2-bearing patches would be implanted transcutaneously in humans, only one surgical or arthroscopic intervention would be called for. Moreover, and most importantly, sufficient numbers of cells could be generated in patients of all ages.
Abstract:
BACKGROUND A single non-invasive gene expression profiling (GEP) test (AlloMap®) is often used to determine whether a heart transplant recipient is at low risk of acute cellular rejection at the time of testing. In a randomized trial, use of the test (a GEP score from 0-40) was shown to be non-inferior to routine endomyocardial biopsy for surveillance after heart transplantation in selected low-risk patients with respect to clinical outcomes. Recently, it was suggested that the within-patient variability of consecutive GEP scores may be used to independently predict future clinical events; however, confirmation in further studies was recommended. Here we analysed an independent patient population to determine the prognostic utility of within-patient variability of GEP scores in predicting future clinical events. METHODS We defined GEP score variability as the standard deviation of four GEP scores collected ≥315 days post-transplantation. Of the 737 patients from the Cardiac Allograft Rejection Gene Expression Observational (CARGO) II trial, 36 were assigned to the composite event group (death, re-transplantation or graft failure ≥315 days post-transplantation and within 3 years of the final GEP test) and 55 were assigned to the control group (non-event patients). In this case-control study, the performance of GEP score variability in predicting future events was evaluated by the area under the receiver operating characteristic curve (AUC ROC). The negative predictive value (NPV) and positive predictive value (PPV), including 95 % confidence intervals (CI), of GEP score variability were calculated. RESULTS The estimated prevalence of events was 17 %. Events occurred at a median of 391 (inter-quartile range 376) days after the final GEP test. The GEP variability AUC ROC for the prediction of a composite event was 0.72 (95 % CI 0.6-0.8). The NPV for a GEP score variability of 0.6 was 97 % (95 % CI 91.4-100.0); the PPV for a GEP score variability of 1.5 was 35.4 % (95 % CI 13.5-75.8). CONCLUSION In heart transplant recipients, GEP score variability may be used to predict the probability that a composite event will occur within 3 years after the last GEP score. TRIAL REGISTRATION Clinicaltrials.gov identifier NCT00761787.
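The variability metric defined in the METHODS above is simply the standard deviation of a patient's four GEP scores. As a minimal sketch of that computation, assuming invented score values and a hypothetical function name (the trial does not specify whether the sample or population formula was used; the sample formula is assumed here):

```python
# Illustrative sketch (not from the trial): GEP score variability as the
# standard deviation of four GEP scores (0-40 scale) collected >=315 days
# post-transplantation. All score values below are invented.
from statistics import stdev

def gep_score_variability(scores):
    """Sample standard deviation (n - 1 denominator) of exactly four scores."""
    if len(scores) != 4:
        raise ValueError("variability is defined over exactly four scores")
    return stdev(scores)

# A stable patient (variability below the 0.6 level at which the NPV was
# reported) versus a more variable one (above the 1.5 level for the PPV).
print(gep_score_variability([30, 30, 31, 30]))  # low variability
print(gep_score_variability([28, 33, 30, 35]))  # high variability
```

The thresholds 0.6 and 1.5 are the operating points at which the abstract reports the NPV and PPV; how they would be applied clinically is not specified here.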
Abstract:
The current study investigates whether peripheral vision can be used to monitor multiple moving objects and to detect single-target changes. For this purpose, in Experiment 1, a modified MOT setup with a large projection and a constant-position centroid phase was first validated. Classical findings regarding the use of a virtual centroid to track multiple objects and the dependency of tracking accuracy on target speed were successfully replicated. The main experimental variations, concerning the manipulation of to-be-detected target changes, were then introduced in Experiment 2. In addition to a button press used for the detection task, gaze behavior was assessed using an integrated eye-tracking system. The analysis of saccadic reaction times in relation to the motor response shows that peripheral vision is naturally used to detect motion and form changes in MOT, because the saccade to the target occurred after target-change offset. Furthermore, for changes of comparable task difficulty, motion changes are detected better by peripheral vision than form changes. Findings indicate that capabilities of the visual system (e.g., visual acuity) affect change detection rates and that covert-attention processes may be affected by vision-related aspects like spatial uncertainty. Moreover, it is argued that a centroid-MOT strategy might reduce the amount of saccade-related costs and that eye-tracking seems generally valuable for testing predictions derived from theories on MOT. Finally, implications for testing covert attention in applied settings are proposed.
Abstract:
Structural characteristics of social networks have been recognized as important factors of effective natural resource governance. However, network analyses of natural resource governance most often remain static, even though governance is an inherently dynamic process. In this article, we investigate the evolution of a social network of organizational actors involved in the governance of natural resources in a regional nature park project in Switzerland. We ask how the maturation of a governance network affects bonding social capital and centralization in the network. Applying separable temporal exponential random graph modeling (STERGM), we test two hypotheses based on the risk hypothesis by Berardo and Scholz (2010) in a longitudinal setting. Results show that network dynamics clearly follow the expected trend toward generating bonding social capital but do not imply a shift toward less hierarchical and more decentralized structures over time. We investigate how these structural processes may contribute to network effectiveness over time.
Abstract:
Abstract. We resumed mowing in two plots of ca. 100 m² in an abandoned meadow dominated by Brachypodium pinnatum on the slope of Monte Generoso (Switzerland). We monitored species composition and hay yield using point quadrats and biomass samples. Species frequencies changed little during 10 yr (1988–1997), while hay yields showed large fluctuations according to mean relative humidity in April-June. We performed a seed-addition experiment to test whether the establishment of meadow species is limited by a lack of diaspores or of favourable microsites for germination and recruitment from the seed bank. We sowed ca. 12 000 seeds of 12 species originating from a nearby meadow individually in plots of a 4 × 6 unbalanced Latin square with four treatments: burning; mowing; mowing and removal of a layer of decayed organic matter; and a control. We monitored the fate of individual seedlings for 24 months. Seedlings of all species established and survived for 12 months, 10 species survived for at least 24 months, and some reached a reproductive stage. Species responded to the different qualities of microsites provided by the different treatments and thus required different regeneration niches. Spontaneous long-distance immigration was insignificant. We conclude that the former species composition of abandoned meadows cannot easily be restored by mowing alone, because many meadow plant species do not have persistent seed banks, and immigration over distances of more than 25 m followed by successful establishment is very unlikely.
Abstract:
We describe the case of a patient with a T-lymphoblastic lymphoma whose disseminated mucormycosis was diagnosed with delay; we address the diagnostic and therapeutic decision-making process and review the diagnostic workup of patients with potential invasive fungal disease (IFD). The diagnosis was delayed despite a suggestive radiological presentation of the patient's pulmonary lesion. The uncommon risk profile (T-lymphoblastic lymphoma, short neutropenic phases) wrongly led to a low level of suspicion. The diagnosis was also hampered by the lack of indirect markers for infections caused by Mucorales, the low sensitivity of both fungal culture and panfungal PCR, and the limited availability of species-specific PCR. A high level of suspicion of IFD is needed, and aggressive diagnostic procedures should be promptly initiated even in apparently low-risk patients with uncommon presentations. The extent of the analytical workup should be decided on a case-by-case basis. Diagnostic tests such as the galactomannan and β-D-glucan tests and/or PCR on biological material followed by sequencing should be chosen according to their availability and after evaluation of their specificity and sensitivity. In high-risk patients, preemptive therapy with a broad-spectrum mould-active antifungal agent should be started before definitive diagnostic findings become available.
Abstract:
We analyse access to different institutional pathways to higher education for second-generation students, focusing on youths who hold a higher-education entrance certificate. Compared with the traditional academic pathway, the alternative vocational pathway appears to compensate to some degree for North-African and Southern-European youths in France, for those from Turkey in Germany, and to a lesser degree for those from Portugal, Turkey, ex-Yugoslavia and Albania/Kosovo in Switzerland. This is not the case in Switzerland for Western-European, Italian and Spanish youths, who indeed access higher education via the academic pathway more often than Swiss youths. Using youth panel and survey data, multinomial models are applied to analyse these pathway choices.