90 results for OD Volume Variation, Short-Term OD Volume Prediction, ETC-OD Data, Bayesian Network
Abstract:
Assessing the number of rivals is crucial for optimally adjusting investment in a contest. Although laboratory animals have been shown to possess numerical abilities, little is known about the ecological implications, particularly in young animals. Barn owl (Tyto alba) siblings, in broods of two to nine, vocally compete for priority of access to food resources before the parents actually deliver them. In dyads, the individual that vocalizes at the highest rate in the absence of the parents deters its siblings from competing for the next delivered prey. We tested the novel hypothesis that, to optimally adjust vocal investment, barn owl nestlings assess how many of their siblings are currently competing. To singleton owlets, we broadcast a fixed global number of calls emitted by one, two or four pre-recorded unfamiliar nestlings. We could thus distinguish the independent effects on the singletons' vocal behavior of the global number of calls produced by a brood and of the number of competitors that produced these calls. Overall, nestlings retreated more from the vocal contest when facing more competitors. However, when facing a single highly motivated competitor, nestlings refrained from vocalizing to a larger extent than when competing against more, but less motivated, individuals. Therefore, young animals assess variation in the number of currently competing siblings based on individual-specific vocal cues.
Abstract:
Background: Excessive exposure to solar ultraviolet (UV) light is the main cause of most skin cancers in humans. Factors such as the increase in solar irradiation at ground level (due to anthropogenic pollution), the rise in standard of living (vacations in sunny areas) and, mostly, the development of outdoor activities have contributed to increased exposure. Thus, unsurprisingly, the incidence of skin cancers has increased over the last decades more than that of any other cancer. Melanoma is the most lethal cutaneous cancer, while cutaneous carcinomas are the most common cancer type worldwide. UV exposure depends on environmental as well as individual factors related to activity. The influence of individual factors on exposure among building workers was investigated in a previous study. Posture and orientation were found to account for at least 38% of the total variance of relative individual exposure. A high variance of short-term exposure was observed between different body locations, indicating the occurrence of intense, subacute exposures. It was also found that effective short-term exposure ranged between 0 and 200% of ambient irradiation, suggesting that ambient irradiation is a poor predictor of effective exposure. Various dosimetric techniques enable the assessment of individual effective exposure, but dosimetric measurements remain tedious and tend to be situation-specific. In fact, individual factors (exposure time, body posture and orientation in the sun) often limit the extrapolation of exposure results to similar activities conducted under other conditions. Objective: The research presented in this paper aims at developing and validating a predictive tool for effective individual exposure to solar UV. Methods: Existing computer graphics techniques (3D rendering) were adapted to reflect solar exposure conditions and calculate short-term anatomical doses. A numerical model, represented as a 3D triangular mesh, is used to represent the exposed body.
The amount of solar energy received by each triangle is calculated, taking into account irradiation intensity, incidence angle and possible shadowing from other body parts. The model takes into account the three components of solar irradiation (direct, diffuse and albedo) as well as the orientation and posture of the body. Field measurements were carried out using a forensic mannequin at the Payerne MeteoSwiss station. Short-term dosimetric measurements were performed at 7 anatomical locations for 5 body postures. Field results were compared to the predictions obtained from the numerical model. Results: The best match between prediction and measurements was obtained for upper body parts such as the shoulders (modelled/measured ratio: mean = 1.21, SD = 0.34) and neck (mean = 0.81, SD = 0.32). Small curved body parts such as the forehead (mean = 6.48, SD = 9.61) exhibited a poorer match. The prediction is less accurate for complex postures such as kneeling (mean = 4.13, SD = 8.38) than for standing (mean = 0.85, SD = 0.48). The values obtained from the dosimeters and those computed from the model are globally consistent. Conclusion: Although further development and validation are required, these results suggest that effective exposure could be predicted for a given activity (work or leisure) under various ambient irradiation conditions. Using a generic modelling approach is of high interest in terms of implementation costs as well as predictive and retrospective capabilities.
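The per-triangle dose computation described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes Lambert's cosine law for the direct component, an isotropic sky for the diffuse and albedo components, and a z-up coordinate convention; the function and parameter names are hypothetical.

```python
import numpy as np

def triangle_irradiance(normal, area, sun_dir, direct, diffuse, albedo_irr,
                        shadowed=False):
    """Solar power (W) intercepted by one mesh triangle.

    Direct component follows Lambert's cosine law and is zeroed when the
    triangle faces away from the sun or is shadowed by other body parts.
    Diffuse and albedo components are weighted by the sky- and ground-view
    fractions derived from the surface tilt (isotropic-sky assumption).
    Irradiance inputs are in W/m^2; `area` is in m^2.
    """
    n = normal / np.linalg.norm(normal)
    s = sun_dir / np.linalg.norm(sun_dir)
    cos_incidence = max(float(np.dot(n, s)), 0.0)
    direct_term = 0.0 if shadowed else direct * cos_incidence
    # Tilt of the surface relative to horizontal (z is "up").
    cos_tilt = float(np.clip(n[2], -1.0, 1.0))
    sky_view = (1.0 + cos_tilt) / 2.0     # fraction of sky hemisphere seen
    ground_view = (1.0 - cos_tilt) / 2.0  # fraction of ground seen
    irradiance = direct_term + diffuse * sky_view + albedo_irr * ground_view
    return irradiance * area
```

For a horizontal, upward-facing triangle with the sun at the zenith, the triangle receives the full direct and diffuse components and no albedo; summing this quantity over all triangles of the body mesh for each time step yields the anatomical dose.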
Abstract:
Tribulus terrestris is a nutritional supplement whose physiological and actual effects on the organism are highly debated. Its main claimed effect is an increase in the anabolic and androgenic action of testosterone through the activation of endogenous testosterone production. Even though this biological pathway has not been fully proven, T. terrestris is regularly used by athletes. Recently, the analysis of two female urine samples by GC/C/IRMS (gas chromatography/combustion/isotope-ratio mass spectrometry) conclusively revealed the administration of exogenous testosterone or its precursors, even though the testosterone glucuronide/epitestosterone glucuronide (T/E) ratio and steroid marker concentrations were below the cut-off values defined by the World Anti-Doping Agency (WADA). To contest this adverse analytical finding, the athletes declared having used T. terrestris in their diet. To test this hypothesis, two female volunteers ingested 500 mg of T. terrestris three times a day for two consecutive days. All spot urines were collected for 48 h after the first intake. The ¹³C/¹²C ratio of ketosteroids was determined by GC/C/IRMS, the T/E ratio and DHEA concentrations were measured by GC/MS, and LH concentrations were measured by radioimmunoassay. None of these parameters showed a significant variation or rose above the WADA cut-off limits. Hence, the short-term treatment with T. terrestris had no impact on the endogenous testosterone metabolism of the two subjects.
Abstract:
Ex vivo ELISPOT and multimer staining are well-established tests for the assessment of antigen-specific T cells. Many laboratories now use a period of in vitro stimulation (IVS) to enhance detection. Here, we report the findings of a multi-centre panel organised by the Association for Cancer Immunotherapy Immunoguiding Program to investigate the impact of IVS protocols on the detection of antigen-specific T cells of varying ex vivo frequency. Five centres performed ELISPOT and multimer staining on centrally prepared PBMCs from 3 donors, both ex vivo and following IVS. A harmonised IVS protocol was designed based on the best-performing protocol(s), which was then evaluated in a second phase on 2 donors by 6 centres. All centres were able to reliably detect antigen-specific T cells of high/intermediate frequency both ex vivo (Phase I) and post-IVS (Phase I and II). The highest frequencies of antigen-specific T cells ex vivo were mirrored in the frequencies following IVS and in the detection rates. However, antigen-specific T cells of a low/undetectable frequency ex vivo were not reproducibly detected post-IVS. Harmonisation of the IVS protocol reduced the inter-laboratory variation observed for ELISPOT and multimer analyses by approximately 20%. We further demonstrate that results from ELISPOT and multimer staining correlated after IVS (P < 0.0001, R² = 0.5113), but not before. In summary, IVS was shown to be a reproducible method that benefitted from method harmonisation.
Abstract:
Land plants have had the reputation of being problematic for DNA barcoding for two general reasons: (i) the standard DNA regions used in algae, animals and fungi show exceedingly low levels of variability in land plants, and (ii) the typically used land plant plastid phylogenetic markers (e.g. rbcL, trnL-F, etc.) appear to have too little variation. However, no one has assessed how well current phylogenetic resources might work in the context of identification (versus phylogeny reconstruction). In this paper, we make such an assessment, focusing on two of the markers commonly sequenced in land plant phylogenetic studies: plastid rbcL and the internal transcribed spacers of nuclear ribosomal DNA (ITS). We find that both of these DNA regions perform well, even though the data currently available in GenBank/EBI were not produced for use as barcodes and BLAST searches are not an ideal tool for this purpose. These results bode well for the use of even more variable regions of plastid DNA (such as, for example, psbA-trnH) as barcodes, once they have been widely sequenced. In the short term, efforts to bring land plant barcoding up to the standards now used in other organisms should make swift progress. There are two categories of DNA barcode users: scientists in fields other than taxonomy, and taxonomists. For the former, the use of mitochondrial and plastid DNA, the two most easily assessed genomes, is at least in the short term a useful tool that permits them to get on with their studies, which depend on knowing roughly which species or species groups they are dealing with. However, these same DNA regions have important drawbacks for use in taxonomic studies (i.e. studies designed to elucidate species limits); for these purposes, DNA markers from uniparentally (usually maternally) inherited genomes can only provide half of the story required to improve the taxonomic standards being used in DNA barcoding.
In the long term, we will need to develop more sophisticated barcoding tools: multiple, low-copy nuclear markers with sufficient genetic variability and PCR reliability. These would permit the detection of hybrids and allow researchers to identify the 'genetic gaps' that are useful in assessing species limits.
Abstract:
BACKGROUND: Highway maintenance workers are constantly and simultaneously exposed to traffic-related particle and noise emissions, both of which have been linked to increased cardiovascular morbidity and mortality in population-based epidemiology studies. OBJECTIVES: We aimed to investigate short-term health effects related to particle and noise exposure. METHODS: We monitored 18 maintenance workers during up to five 24-hour periods, for a total of 50 observation days. We measured their exposure to fine particulate matter (PM2.5), ultrafine particles and noise, as well as the following cardiopulmonary health endpoints: blood pressure, pro-inflammatory and pro-thrombotic markers in the blood, and lung function and fractional exhaled nitric oxide (FeNO), measured approximately 15 hours post-work. Heart rate variability was assessed during a sleep period approximately 10 hours post-work. RESULTS: PM2.5 exposure was significantly positively associated with C-reactive protein and serum amyloid A, and negatively associated with tumor necrosis factor α. None of the particle metrics were significantly associated with von Willebrand factor or tissue factor expression. PM2.5 and work noise were associated with markers of increased heart rate variability, and with increased high-frequency (HF) and low-frequency (LF) power. Systolic and diastolic blood pressure on the following morning were significantly associated with noise exposure after work, and non-significantly associated with PM2.5. We observed no significant associations between any of the exposures and lung function or FeNO. CONCLUSIONS: Our findings suggest that exposure to particles and noise during highway maintenance work might pose a cardiovascular health risk. Actions to reduce these exposures could lead to better health for this population of workers.
Abstract:
INTRODUCTION: Although long-term video-EEG monitoring (LVEM) is routinely used to investigate paroxysmal events, short-term video-EEG monitoring (SVEM) lasting <24 h is increasingly recognized as a cost-effective tool. Since relatively few studies have addressed the yield of SVEM among different diagnostic groups, we undertook the present study to investigate this aspect. METHODS: We retrospectively analyzed 226 consecutive SVEM recordings over 6 years. All patients were referred because routine EEGs were inconclusive. Patients were classified into 3 suspected diagnostic groups: (1) epileptic seizures, (2) psychogenic nonepileptic seizures (PNESs), and (3) other or undetermined diagnoses. We assessed recording lengths, interictal epileptiform discharges, epileptic seizures, PNESs, and the definitive diagnoses obtained after SVEM. RESULTS: The mean age was 34 (±18.7) years, and the median recording length was 18.6 h. Among the 226 patients, 127 were referred for suspected epilepsy: 73 had a diagnosis of epilepsy, none had a diagnosis of PNESs, and 54 had other or undetermined diagnoses post-SVEM. Of the 24 patients with pre-SVEM suspected PNESs, 1 had epilepsy, 12 had PNESs, and 11 had other or undetermined diagnoses. Of the 75 patients with other diagnoses pre-SVEM, 17 had epilepsy, 11 had PNESs, and 47 had other or undetermined diagnoses. After SVEM, 15 patients had definite diagnoses other than epilepsy or PNESs, while in 96 patients the diagnosis remained unclear. Overall, a definitive diagnosis could be reached in 129/226 (57%) patients. CONCLUSIONS: This study demonstrates that in nearly 3 out of 5 patients without a definitive diagnosis after routine EEG, SVEM allowed us to reach a diagnosis. This procedure should be encouraged in this setting, given its time-effectiveness compared with LVEM.
Abstract:
Background: Alliance evolutions, i.e. ruptures and resolutions over the course of psychotherapy, have been shown to be important descriptive features in different forms of psychotherapy, in particular psychodynamic psychotherapy. This case study of a client presenting elements of adjustment disorder undergoing short-term dynamic psychotherapy is drawn from a systematic naturalistic study and aims at illustrating, at a session-by-session level, the processes of alliance rupture and resolution, by comparing the client's and the therapist's perspectives. Method: Two episodes of alliance evolution were studied in depth, in relation to the evolution of transference as well as the client's defensive functioning and core conflictual theme. These concepts were measured by means of valid, reliable observer-rated methods based on session transcripts: the Defense Mechanisms Rating Scales (DMRS) for defensive functioning and the Core Conflictual Relationship Theme (CCRT) for the conflicts. Alliance was measured after each session using the Helping Alliance Questionnaire (HAq-II). Results: The results indicated that these episodes of alliance rupture and resolution may be understood as key moments of the whole therapeutic process, reflecting the client's main relational issues. Illustrations are provided based on the client's in-session processes and related to the development of the alliance over the course of the entire therapy.
Abstract:
The question of why some social systems have evolved close inbreeding is particularly intriguing given the expected short- and long-term negative effects of this breeding system. Using social spiders as a case study, we quantitatively show that the potential costs of avoiding inbreeding through dispersal and solitary living could have outweighed the costs of inbreeding depression at the origin of inbred spider sociality. We further review the evidence that, despite being favored in the short term, inbred spider sociality may constitute an evolutionary dead end in the long run. We also review other cases in which inbreeding and sociality are associated, such as naked mole-rats and some bark and ambrosia beetles, mites, psocids, thrips, parasitic ants and termites, and weigh the evidence for and against this breeding system being, in general, an evolutionary dead end.
Abstract:
We aimed to determine whether human subjects' reliance on different sources of spatial information encoded in different frames of reference (i.e., egocentric versus allocentric) affects their performance, decision time and memory capacity in a short-term spatial memory task performed in the real world. Subjects were asked to play the Memory game (a.k.a. the Concentration game) without an opponent, in four different conditions that controlled the subjects' reliance on egocentric and/or allocentric frames of reference for elaborating a spatial representation of the image locations enabling maximal efficiency. We report experimental data from young adult men and women, and describe a mathematical model to estimate human short-term spatial memory capacity. We found that short-term spatial memory capacity was greatest when an egocentric spatial frame of reference enabled subjects to encode and remember the image locations. However, when egocentric information was not reliable, short-term spatial memory capacity was greater and decision time shorter when an allocentric representation of the image locations with respect to distant objects in the surrounding environment was available, as compared to when only a spatial representation encoding the relationships between the individual images, independent of the surrounding environment, was available. Our findings thus further demonstrate that changes in viewpoint produced by the movement of images placed in front of a stationary subject are not equivalent to the movement of the subject around stationary images. We discuss possible limitations of classical neuropsychological and virtual reality experiments on spatial memory, which typically restrict the sensory information normally available to human subjects in the real world.
Abstract:
INTRODUCTION: The risk that hip-preserving surgery may negatively influence the performance and outcome of subsequent total hip replacement (THR) remains a concern. The aim of this study was to identify any negative impact of previous hip arthroscopy on THR. METHODS: Out of 1271 consecutive patients who underwent primary THR between 2005 and 2009, 18 had previously undergone ipsilateral hip arthroscopy. This study group (STG) was compared with two control groups (CG: same approach, identical implants; MCG: paired group matched for age, BMI and Charnley categories). Operative time, blood loss, and evidence of heterotopic bone and implant loosening at follow-up were compared between the STG and the MCG. Follow-up WOMAC scores were compared between the three groups. RESULTS: Blood loss did not differ significantly between the STG and MCG. The operative time was significantly shorter (p < 0.001) in the STG. There was no significant difference in follow-up WOMAC scores between the groups. No implant-related complications were noted on follow-up radiographs. Two minor complications were documented for the STG and three for the MCG. CONCLUSION: We found no evidence that previous hip arthroscopy negatively influences the performance or short-term clinical outcome of THR.
Abstract:
Staphylococcus aureus is recognized as one of the major human pathogens and is by far one of the most common nosocomial organisms. The genetic basis for the emergence of highly epidemic strains remains mysterious. Studying the microevolution of the different clones of S. aureus is essential for identifying the forces driving pathogen emergence and spread. The aim of the present study was to determine the genetic changes characterizing a lineage belonging to the South German clone (ST228) that spread over ten years in a tertiary care hospital in Switzerland. To this end, we compared the whole genomes of eight isolates recovered between 2001 and 2008 at the Lausanne hospital. The genetic comparison revealed that their genomes are extremely closely related. Nevertheless, a few larger genetic changes, such as the replacement of a plasmid, the loss of large fragments of DNA, or the insertion of transposases, were observed. These transfers of mobile genetic elements shaped the evolution of the ST228 lineage that spread within the Lausanne hospital. However, although the strains analyzed differed in their dynamics, we were not able to link a particular genetic element with spreading success. Finally, the present study showed that new sequencing technologies considerably improve the quality and quantity of information obtained for a single strain; this information remains difficult to interpret, however, and substantial investment is required before the technology becomes accessible for routine investigations.
Abstract:
Solid-phase microextraction (SPME) has been widely used for many years in various applications, such as environmental and water samples, food and fragrance analysis, and biological fluids. The aim of this study was to propose the SPME method as an alternative to the conventional techniques used in the evaluation of worker exposure to benzene, toluene, ethylbenzene, and xylene (BTEX). Polydimethylsiloxane/Carboxen (PDMS/CAR) proved to be the most effective stationary-phase material for sorbing BTEX among the materials tested (polyacrylate, PDMS, PDMS/divinylbenzene, Carbowax/divinylbenzene). Various experimental conditions were studied in order to apply SPME to BTEX quantitation in field situations. The uptake rate of the selected fiber (75 µm PDMS/CAR) was determined for each analyte at various concentrations, relative humidities, and airflow velocities, from static (calm air) to dynamic (> 200 cm/s) conditions. The SPME method was also compared with National Institute for Occupational Safety and Health method 1501. Unlike the latter, the SPME approach fulfills the new requirement for the threshold limit value-short-term exposure limit (TLV-STEL) of 2.5 ppm (8 mg/m³) for benzene.
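Passive sampling with a calibrated fiber, as described above, relies on the standard relation C = m / (U · t): the time-weighted-average air concentration equals the mass sorbed on the fiber divided by the product of the experimentally determined uptake rate and the sampling time. A minimal sketch of that calculation follows; the function name and the numbers in the example are illustrative, not values from the study.

```python
def twa_concentration(mass_ng, uptake_rate_ml_per_min, minutes):
    """Time-weighted-average air concentration from a passive SPME sample.

    mass_ng: analyte mass sorbed on the fiber (ng), e.g. from GC calibration.
    uptake_rate_ml_per_min: fiber uptake rate for this analyte (mL/min),
        determined experimentally for the sampling conditions.
    minutes: sampling duration (min).
    Returns ng/mL of air, which is numerically equal to mg/m^3.
    """
    return mass_ng / (uptake_rate_ml_per_min * minutes)
```

For instance, 120 ng of benzene collected over a 15-minute STEL sampling period with an assumed uptake rate of 1.0 mL/min corresponds to 8 mg/m³, i.e. exactly the 2.5 ppm TLV-STEL cited above.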
Abstract:
A large number of parameters have been identified as predictors of early outcome in patients with acute ischemic stroke. In the present work we analyzed a wide range of demographic, metabolic, physiological, clinical, laboratory and neuroimaging parameters in a large population of consecutive patients with acute ischemic stroke, with the aim of identifying independent predictors of the early clinical course. We used prospectively collected data from the Acute Stroke Registry and Analysis of Lausanne. All consecutive patients with ischemic stroke admitted to our stroke unit and/or intensive care unit between 1 January 2003 and 12 December 2008 within 24 h of last-well time were analyzed. Univariate and multivariate analyses were performed to identify significant associations with the National Institutes of Health Stroke Scale (NIHSS) score at admission and 24 h later. We also sought interactions between the identified predictors. Of the 1,730 consecutive patients with acute ischemic stroke included in the analysis, 260 (15.0%) were thrombolyzed (mostly intravenously) within the recommended time window. In multivariate analysis, the NIHSS score at 24 h after admission was associated with the NIHSS score at admission (β = 1, p < 0.001), initial glucose level (β = 0.05, p < 0.002) and thrombolytic intervention (β = -2.91, p < 0.001). There was a significant interaction between thrombolysis and the NIHSS score at admission (p < 0.001), indicating that the short-term effect of thrombolysis decreases with increasing initial stroke severity. Thrombolytic treatment, a lower initial glucose level and lower initial stroke severity predict a favorable early clinical course. The short-term effect of thrombolysis appears mainly in minor and moderate strokes, and decreases with increasing initial stroke severity.
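The interaction reported above (the benefit of thrombolysis shrinking as admission severity rises) corresponds to adding a product term to the linear model. The following is a small synthetic illustration, not the registry data or the authors' analysis: the thrombolysis coefficient of -2.91 is taken from the abstract, while the other coefficients, the interaction magnitude and the noise level are assumptions chosen only to mirror the reported directions of effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
nihss0 = rng.integers(1, 25, n).astype(float)   # NIHSS at admission
glucose = rng.normal(7.0, 2.0, n)               # initial glucose, mmol/L
lysis = rng.integers(0, 2, n).astype(float)     # thrombolysis yes/no

# Assumed coefficients: intercept, NIHSS at admission, glucose,
# thrombolysis, thrombolysis x NIHSS interaction. A positive interaction
# makes the (negative) thrombolysis effect fade with higher severity.
true_beta = np.array([0.5, 1.0, 0.05, -2.91, 0.08])

X = np.column_stack([np.ones(n), nihss0, glucose, lysis, lysis * nihss0])
y = X @ true_beta + rng.normal(0.0, 0.5, n)     # simulated 24-h NIHSS

# Ordinary least squares recovers the coefficients, including the
# interaction term that encodes the severity-dependent treatment effect.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With this parameterization, the modeled net effect of thrombolysis at a given admission score s is beta_lysis + beta_interaction * s, so the benefit is largest for minor strokes and approaches zero for severe ones, which is the pattern the registry analysis describes.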
Abstract:
The present study investigates the short- and long-term outcomes of a computer-assisted cognitive remediation (CACR) program in adolescents with psychosis or at high risk thereof. 32 adolescents participated in a blinded 8-week randomized controlled trial of CACR treatment compared to computer games (CG). Clinical and neuropsychological evaluations were undertaken at baseline, at the end of the program and at 6 months. At the end of the program (n = 28), results indicated that visuospatial abilities (Repeatable Battery for the Assessment of Neuropsychological Status, RBANS; P = .005) improved significantly more in the CACR group than in the CG group. Furthermore, other cognitive functions (RBANS), psychotic symptoms (Positive and Negative Syndrome Scale) and psychosocial functioning (Social and Occupational Functioning Assessment Scale) improved significantly, but at similar rates, in the two groups. In the long term (n = 22), cognitive abilities did not demonstrate any improvement in the control group, while in the CACR group significant long-term improvements in inhibition (Stroop; P = .040) and reasoning (Block Design Test; P = .005) were observed. In addition, symptom severity (Clinical Global Impression) decreased significantly in the control group (P = .046) and marginally in the CACR group (P = .088). To sum up, CACR can be successfully administered in this population. CACR proved to be effective over and above CG for the most intensively trained cognitive ability. Finally, in the long term, enhanced reasoning and inhibition abilities, which are necessary for executing higher-order goals and adapting behavior to an ever-changing environment, were observed in the adolescents who benefited from CACR.