137 results for Time Use
Abstract:
Three-dimensional free-breathing coronary magnetic resonance angiography was performed in eight healthy volunteers with use of real-time navigator technology. Images acquired with the navigator localized at the right hemidiaphragm and at the left ventricle were objectively compared. The diaphragmatic navigator was found to be superior for vessel delineation of middle to distal portions of the coronary arteries.
Abstract:
The number of fluoroscopy-guided procedures in cardiology is increasing over time, and it is worth asking whether technological progress or changes in technique are influencing patient exposure. The aim of this study was to examine whether patient dose has been decreasing over the years. Patient dose data from more than 7700 procedures were collected from two cardiology centres. A steady increase in patient dose over the years was observed in both centres for the two cardiological procedures included in this study. A significant increase in dose was also observed after the installation of a flat-panel detector. The increasing use of radial access may lead to an increase in patient exposure. The monitoring of dose data showed a considerable increase in patient exposure over time, and actions have to be taken towards dose reduction in both centres.
Abstract:
OBJECTIVE: The objective of this study was to analyse the use of lights and siren (L&S) during transport to the hospital according to the prehospital severity status of the patient, and the time saved according to the time of day of the mission. METHODS: We searched the Public Health Services data of a Swiss state from 1 January 2010 to 31 December 2010. All primary patient transports within the state were included (24 718). The data collected covered the use of L&S, patient demographics, the time and duration of transport, the type of mission (trauma vs. nontrauma) and the severity of the condition according to the National Advisory Committee for Aeronautics (NACA) score assigned by the paramedics and/or emergency physician. We excluded 212 transports because of missing data. RESULTS: A total of 24 506 ambulance transports met the inclusion criteria. L&S were used 4066 times, or in 16.6% of all missions. Of these, 40% were graded NACA less than 4. Overall, the mean total transport time to the hospital was 11.09 min (confidence interval 10.84-11.34) with L&S and 12.84 min (confidence interval 12.72-12.96) without. The difference was 1.75 min (105 s; P<0.001). For night-time runs alone, the mean time saved using L&S was 0.17 min (10.2 s; P=0.27). CONCLUSION: At present, the use of L&S seems questionable given the severity status or NACA score of transported patients. Our results should prompt the implementation of more specific regulations for L&S use during transport to the hospital, taking into consideration certain physiological criteria of the victim as well as the time of day of transport.
Abstract:
POCT (point-of-care tests) have great potential in ambulatory infectious disease medicine thanks to their speed of execution and their impact on antibiotic administration and on the diagnosis and prevention of certain communicable diseases. Some tests have been in use for several years (detection of Streptococcus pyogenes in pharyngitis, anti-HIV antibodies, urinary S. pneumoniae antigen, Plasmodium falciparum antigen). The additional major indications will be community-acquired lower respiratory tract infections, infectious diarrhoea in children (rotavirus, enterohaemorrhagic E. coli) and, hopefully, sexually transmitted infections. Easy to use, these tests based on antigen-antibody reactions allow a rapid diagnosis in less than one hour. A new generation of POCT relying on nucleic acid detection has just been introduced into practice (detection of group B streptococcus in pregnant women before delivery, and of carriage of meticillin-resistant Staphylococcus aureus) and will be extended to many pathogens.
Abstract:
Delta(9)-Tetrahydrocannabinol (THC) is frequently found in the blood of drivers suspected of driving under the influence of cannabis or involved in traffic crashes. The present study used a double-blind crossover design to compare the effects of medium (16.5 mg THC) and high doses (45.7 mg THC) of hemp milk decoctions, or of a medium dose of dronabinol (20 mg synthetic THC, Marinol®), on several skills required for safe driving. Forensic interpretation of cannabinoid blood concentrations was attempted using the models proposed by Daldrup (cannabis influencing factor, or CIF) and by Huestis and coworkers. First, the time-concentration profiles of THC, 11-hydroxy-Delta(9)-tetrahydrocannabinol (11-OH-THC, the active metabolite of THC) and 11-nor-9-carboxy-Delta(9)-tetrahydrocannabinol (THCCOOH) in whole blood were determined by gas chromatography-mass spectrometry with negative ion chemical ionization. Compared with smoking studies, relatively low concentrations were measured in blood. The highest mean THC concentration (8.4 ng/mL) was achieved 1 h after ingestion of the strongest decoction. The mean maximum 11-OH-THC level (12.3 ng/mL) slightly exceeded that of THC. THCCOOH reached its highest mean concentration (66.2 ng/mL) 2.5-5.5 h after intake. Individual blood levels showed considerable intersubject variability. The willingness to drive was influenced by the importance of the requested task. Under significant cannabinoid influence, the participants refused to drive when asked whether they would agree to accomplish several unimportant tasks (e.g., driving a friend to a party). Most of the participants reported a significant feeling of intoxication and did not appreciate the effects, notably those felt after drinking the strongest decoction. Road sign and tracking tests revealed obvious and statistically significant differences between placebo and treatments. A marked impairment was detected after ingestion of the strongest decoction.
A CIF value greater than 10, which relies on the molar ratio of the main active to inactive cannabinoids, was found to correlate with a strong feeling of intoxication; it also matched a significant decrease in the willingness to drive and a significant impairment in tracking performance. The mathematical model II proposed by Huestis et al. (1992) provided at best a rough estimate of the time of oral administration, with 27% of actual values falling outside the 95% confidence interval. The sum of THC and 11-OH-THC blood concentrations provided a better estimate of impairment than THC alone. This controlled clinical study points out the negative influence on fitness to drive of medium or high oral doses of THC or dronabinol.
Abstract:
We describe a novel dissimilarity framework to analyze spatial patterns of species diversity and illustrate it with alien plant invasions in Northern Portugal. We used this framework to test the hypothesis that patterns of alien invasive plant species richness and composition are differently affected by differences in climate, land use and landscape connectivity (i.e. geographic distance as a proxy, and vectorial objects that facilitate dispersal, such as roads and rivers) between pairs of localities at the regional scale. We further evaluated possible effects of plant life strategies (Grime's C-S-R) and residence time. Each locality consisted of a 1 km(2) landscape mosaic in which all alien invasive species were recorded by visiting all habitat types. Multi-model inference revealed that dissimilarity in species richness is more influenced by environmental distance (particularly climate), whereas geographic distance (a proxy for dispersal limitations) is more important in explaining dissimilarity in species composition, with a prevailing role for ecotones and roads. However, only minor differences were found in the responses of the three C-S-R strategies. Some effect of residence time was found, but only for dissimilarity in species richness. Our results also indicated that environmental conditions (e.g. climate) limit the number of alien species invading a given site, but that the presence of dispersal corridors determines the paths of invasion and therefore the pool of species reaching each site. That geographic distances along dispersal corridors (e.g. ecotones and roads) tend to explain invasion at our regional scale highlights the need to consider the management of alien invasions in the context of integrated landscape planning. Alien species management should include (but not be limited to) the mitigation of dispersal pathways along linear infrastructures.
Our results therefore highlight potentially useful applications of the novel multi-model framework to the anticipation and management of plant invasions.
Abstract:
BACKGROUND AND PURPOSE: Onset-to-reperfusion time (ORT) has recently emerged as an essential prognostic factor in acute ischemic stroke therapy. Although favorable outcome is associated with reduced ORT, it remains unclear whether intracranial bleeding depends on ORT. We therefore sought to determine whether ORT influenced the risk and volume of intracerebral hemorrhage (ICH) after combined intravenous and intra-arterial therapy. METHODS: Based on our prospective registry, we included 157 consecutive acute ischemic stroke patients successfully recanalized with combined intravenous and intra-arterial therapy between April 2007 and October 2011. Primary outcome was any ICH within 24 hours posttreatment. Secondary outcomes included occurrence of symptomatic ICH (sICH) and ICH volume measured with the ABC/2. RESULTS: Any ICH occurred in 26% of the study sample (n=33). sICH occurred in 5.5% (n=7). Median ICH volume was 0.8 mL. ORT was increased in patients with ICH (median=260 minutes; interquartile range=230-306) compared with patients without ICH (median=226 minutes; interquartile range=200-281; P=0.008). In the setting of sICH, ORT reached a median of 300 minutes (interquartile range=276-401; P=0.004). The difference remained significant after adjustment for potential confounding factors (adjusted P=0.045 for ICH; adjusted P=0.002 for sICH). There was no correlation between ICH volume and ORT (r=0.16; P=0.33). CONCLUSIONS: ORT influences the rate but not the volume of ICH and appears to be a critical predictor of symptomatic hemorrhage after successful combined intravenous and intra-arterial therapy. To minimize the risk of bleeding, revascularization should be achieved within 4.5 hours of stroke onset.
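The ABC/2 method cited above approximates the hemorrhage as an ellipsoid: volume ≈ A × B × C / 2, where A and B are the largest perpendicular diameters (in cm) on the axial slice showing the greatest bleed, and C is the craniocaudal extent (number of slices × slice thickness). A minimal sketch of this calculation (function and parameter names are illustrative, not from the study):

```python
def abc2_volume(a_cm, b_cm, n_slices, slice_thickness_cm):
    """Approximate intracerebral hemorrhage volume (mL) with the ABC/2 method.

    a_cm: largest hemorrhage diameter on the axial slice with the biggest bleed
    b_cm: largest diameter perpendicular to A on the same slice
    n_slices: number of axial slices on which the hemorrhage is visible
    slice_thickness_cm: slice thickness in cm
    """
    c_cm = n_slices * slice_thickness_cm  # craniocaudal extent C
    return (a_cm * b_cm * c_cm) / 2.0     # 1 cm^3 = 1 mL

# e.g. a 2 x 1 cm bleed spanning two 0.5 cm slices
print(abc2_volume(2.0, 1.0, 2, 0.5))  # -> 1.0 (mL)
```

Because 1 cm³ equals 1 mL, the result is reported directly in millilitres, comparable to the median 0.8 mL above.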
Abstract:
BACKGROUND: Social roles influence alcohol use. Nevertheless, little is known about how specific aspects of a given role, here parenthood, may influence alcohol use. The research questions for this study were the following: (i) are family-related indicators (FRI) linked to the alcohol use of mothers and fathers? and (ii) does the level of employment, i.e. full-time employment, part-time employment or unemployment, moderate the relationship between FRI and parental alcohol use? METHODS: Survey data of 3217 parents aged 25-50 living in Switzerland were analysed using mean comparisons and multiple regression models of annual frequency of drinking, risky single-occasion drinking, and quantity per day on FRI (age of the youngest child, number of children in the household, majority of child-care/household duties). RESULTS: Protective relationships between FRI and alcohol use were observed among mothers. In contrast, among fathers, detrimental associations between FRI and alcohol use were observed. Whereas maternal responsibilities in general had a protective effect on alcohol use, the number of children had a detrimental impact on the quantity of alcohol consumed per day when mothers were in paid employment. Among fathers, the correlations between the age of the youngest child, the number of children and the frequency of drinking were moderated by the level of paid employment. CONCLUSION: The study showed that in Switzerland, a systematic negative relationship was more often found between FRI and women's drinking than men's. Evidence was found that maternal responsibilities per se may protect from alcohol use but can turn detrimental if mothers are additionally in paid employment.
Abstract:
Captan and folpet are two fungicides largely used in agriculture, but biomonitoring data are mostly limited to measurements of captan metabolite concentrations in spot urine samples of workers, which complicate interpretation of results in terms of internal dose estimation, daily variations according to tasks performed, and most plausible routes of exposure. This study aimed at performing repeated biological measurements of exposure to captan and folpet in field workers (i) to better assess internal dose along with main routes-of-entry according to tasks and (ii) to establish most appropriate sampling and analysis strategies. The detailed urinary excretion time courses of specific and non-specific biomarkers of exposure to captan and folpet were established in tree farmers (n = 2) and grape growers (n = 3) over a typical workweek (seven consecutive days), including spraying and harvest activities. The impact of the expression of urinary measurements [excretion rate values adjusted or not for creatinine or cumulative amounts over given time periods (8, 12, and 24 h)] was evaluated. Absorbed doses and main routes-of-entry were then estimated from the 24-h cumulative urinary amounts through the use of a kinetic model. The time courses showed that exposure levels were higher during spraying than harvest activities. Model simulations also suggest a limited absorption in the studied workers and an exposure mostly through the dermal route. It further pointed out the advantage of expressing biomarker values in terms of body weight-adjusted amounts in repeated 24-h urine collections as compared to concentrations or excretion rates in spot samples, without the necessity for creatinine corrections.
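The body-weight-adjusted 24-h cumulative amounts recommended above are obtained by multiplying each void's biomarker concentration by its volume, summing over the collection period, and optionally dividing by body weight. A hedged sketch of that arithmetic (function name, units and numbers are illustrative, not from the study):

```python
def cumulative_24h_amount(voids, body_weight_kg=None):
    """Sum a urinary biomarker over a 24-h collection.

    voids: list of (concentration_ug_per_L, volume_L) tuples, one per void
    Returns the total amount in µg, or µg per kg body weight
    if body_weight_kg is given.
    """
    total_ug = sum(conc * vol for conc, vol in voids)  # amount = conc x volume
    if body_weight_kg is not None:
        return total_ug / body_weight_kg  # body-weight-adjusted amount
    return total_ug

# three hypothetical voids over 24 h for a 70 kg worker
voids = [(50.0, 0.4), (80.0, 0.3), (30.0, 0.6)]
print(cumulative_24h_amount(voids))        # -> 62.0 (µg)
print(cumulative_24h_amount(voids, 70.0))  # µg/kg, no creatinine correction needed
```

Unlike spot-sample concentrations, this cumulative expression does not depend on urine dilution, which is why the authors found creatinine correction unnecessary.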
Abstract:
OBJECT: To determine whether glycine can be measured at 7 T in human brain with (1)H magnetic resonance spectroscopy (MRS). MATERIALS AND METHODS: The glycine singlet is overlapped by the larger signal of myo-inositol. Density matrix simulations were performed to determine the TE at which the myo-inositol signal was reduced the most, following a single spin-echo excitation. (1)H MRS was performed on an actively shielded 7 T scanner, in five healthy volunteers. RESULTS: At the TE of 30 ms, the myo-inositol signal intensity was substantially reduced. Quantification using LCModel yielded a glycine-to-creatine ratio of 0.14 +/- 0.01, with a Cramer-Rao lower bound (CRLB) of 7 +/- 1%. Furthermore, quantification of metabolites other than glycine was possible as well, with a CRLB mostly below 10%. CONCLUSION: It is possible to detect glycine at 7 T in human brain, at the short TE of 30 ms with a single spin-echo excitation scheme.
Abstract:
A medical and scientific multidisciplinary consensus meeting on Anti-Doping in Sport was held from 29 to 30 November 2013 at the Home of FIFA in Zurich, Switzerland, to create a roadmap for the implementation of the 2015 World Anti-Doping Code. The consensus statement and accompanying papers set out the priorities for the anti-doping community in research, science and medicine. The participants achieved consensus on a strategy for the implementation of the 2015 World Anti-Doping Code. Key components of this strategy include: (1) sport-specific risk assessment, (2) prevalence measurement, (3) sport-specific test distribution plans, (4) storage and reanalysis, (5) analytical challenges, (6) forensic intelligence, (7) psychological approaches to optimise the deterrent effect, (8) the Athlete Biological Passport (ABP) and confounding factors, (9) data management systems (Anti-Doping Administration & Management System (ADAMS)), (10) education, (11) research needs and necessary advances, (12) inadvertent doping and (13) management and ethics of biological data. True implementation of the 2015 World Anti-Doping Code will depend largely on the ability to align thinking around these core concepts and strategies. FIFA, jointly with all other engaged International Federations of sport (IFs), the International Olympic Committee (IOC) and the World Anti-Doping Agency (WADA), is ideally placed to lead transformational change with the unwavering support of the wider anti-doping community. The outcome of the consensus meeting was the creation of an ad hoc Working Group charged with the responsibility of moving this agenda forward.
Abstract:
Fibreoptic intubation remains a key technique for the management of difficult intubation. We compared, in randomised order, the second-generation single-use Ambu® aScope™ 2 videoscope with a standard re-usable flexible intubating fibrescope in 50 tracheal intubations in patients with a difficult airway simulated by a semirigid collar. All patients' tracheas were intubated successfully with the aScope 2 or the re-usable fibrescope. The median (IQR [range]) time to intubate was significantly longer with the aScope 2 (70 (55-97 [41-226]) s vs 50 (40-59 [27-175]) s, p = 0.0003) due to an increased time to see the carina. Quality of vision was significantly lower with the aScope 2 (excellent 24 (48%) vs 49 (98%), p = 0.0001; good 22 (44%) vs 1 (2%), p = 0.0001; poor 4 (8%) vs 0, p = 0.12), but with no difference in the subjective ease of intubation (easy 31 (62%) vs 38 (76%), p = 0.19; intermediate 12 (24%) vs 7 (14%), p = 0.31; difficult 7 (14%) vs 5 (10%), p = 0.76). The longer times to intubate and the poorer scores for quality of vision do not support the use of the single-use aScope 2 videoscope as an alternative to the re-usable fibrescope.
Abstract:
Aims: Several studies have questioned the validity of separating the diagnosis of alcohol abuse from that of alcohol dependence, and the DSM-5 task force has proposed combining the criteria from these two diagnoses to assess a single category of alcohol use disorders (AUD). Furthermore, the DSM-5 task force has proposed a new 2-symptom threshold and a severity scale based on symptom counts for the AUD diagnosis. The current study aimed to examine these modifications in a large population-based sample.
Method: Data stemmed from an adult sample (N=2588; mean age 51.3 years (s.d. 0.2), 44.9% female) of current and lifetime drinkers from the PsyCoLaus study, conducted in the Lausanne area in Switzerland. AUDs and validating variables were assessed using a semi-structured diagnostic interview for the assessment of alcohol and other major psychiatric disorders. First, the adequacy of the proposed 2-symptom threshold was tested by comparing threshold models at each possible cutoff and a linear model, in relation to different validating variables. The model with the smallest Akaike Information Criterion (AIC) value was established as the best model for each validating variable. Second, models with varying subsets of individual AUD symptoms were created to assess the associations between each symptom and the validating variables. The subset of symptoms with the smallest AIC value was established as the best subset for each validator.
Results: 1) For the majority of validating variables, the linear model was found to be the best-fitting model. 2) Among the various subsets of symptoms, those most frequently associated with the validating variables were: a) drinking despite knowledge of a physical or psychological problem, b) a persistent desire or unsuccessful efforts to cut down or control drinking, and c) craving. The least frequent symptoms were: d) drinking in larger amounts or over a longer period than intended, e) spending a great deal of time obtaining, using or recovering from alcohol use, and f) failing to fulfil major role obligations.
Conclusions: The proposed DSM-5 2-symptom threshold was not supported by our data. Instead, a linear AUD diagnosis was supported, with severity increasing with the symptom count. Moreover, certain symptoms were more frequently associated with the validating variables, which suggests that these symptoms should be considered more severe.
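Model comparison by minimum AIC, as used above, penalizes each model's log-likelihood by its parameter count (AIC = 2k − 2 ln L) and keeps the model with the smallest value. A small illustration with invented log-likelihoods and model names (none of these numbers come from the study):

```python
def aic(log_likelihood, n_params):
    """Akaike Information Criterion: 2k - 2*ln(L)."""
    return 2 * n_params - 2 * log_likelihood

# hypothetical fits: threshold models at two cutoffs vs a linear model,
# each with the same number of free parameters for simplicity
candidates = {
    "threshold_2": aic(-1250.0, 2),  # 2504.0
    "threshold_3": aic(-1248.5, 2),  # 2501.0
    "linear":      aic(-1240.0, 2),  # 2484.0
}
best = min(candidates, key=candidates.get)
print(best)  # -> linear (smallest AIC wins)
```

With equal parameter counts the comparison reduces to the likelihoods, but the penalty term matters when, as in the study, threshold and linear models differ in complexity.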
Abstract:
Introduction: Use of paracetamol has been associated with an increased risk of asthma in several epidemiological studies. In contrast, it has been suggested that non-steroidal anti-inflammatory drugs (NSAIDs) might be protective (Kanabar, Clin Ther 2007), but data relating to these drugs are scarce. Methods: Prevalence of asthma and intake of analgesics in the past 2 years were assessed by questionnaire in 2008 in young adults (≥16 years) diagnosed with cancer between 1976 and 2003 (Swiss Childhood Cancer Survivor Study). In a multivariate logistic regression we analysed the association between asthma and intake of paracetamol only, NSAIDs only or their combination, adjusting for age, sex, cancer diagnosis, cancer therapy and time since diagnosis. Results: Of the 1293 participants (response rate 68%), 83 (6%) reported asthma and 845 (65%) intake of analgesics in the past 2 years. Of these, 257 (29%) took paracetamol only, 224 (25%) NSAIDs only, 312 (35%) a combination of both and 52 (6%) other analgesics. Adjusted odds ratios for asthma were 2.2 (95% CI 1.0-4.7; p = 0.04), 1.9 (0.9-4.3; p = 0.12) and 2.9 (1.4-6.1; p <0.01) in those using paracetamol only, NSAIDs only or their combination, respectively. Conclusion: These cross-sectional data in a selected population do not support a protective effect of NSAIDs against asthma, whether taken alone or in combination with paracetamol. All analgesics were positively associated with reported asthma episodes in the past 2 years. This can be explained by reverse causation, with intake of analgesics being a result rather than a cause of asthma events. Randomised controlled trials in unselected populations are needed to clarify the direction of causation.
Abstract:
OBJECTIVES: To determine characteristics associated with single and multiple fallers during postacute rehabilitation and to investigate the relationship among falls, rehabilitation outcomes, and health services use. DESIGN: Retrospective cohort study. SETTING: Geriatric postacute rehabilitation hospital. PARTICIPANTS: Patients (n = 4026) consecutively admitted over a 5-year period (2003-2007). MEASUREMENTS: All falls during hospitalization were prospectively recorded. Collected patients' characteristics included health, functional, cognitive, and affective status data. Length of stay and discharge destination were retrieved from the administrative database. RESULTS: During rehabilitation stay, 11.4% (458/4026) of patients fell once and an additional 6.3% (253/4026) fell several times. Compared with nonfallers, fallers were older and more frequently men. They were globally frailer, with lower Barthel score and more comorbidities, cognitive impairment, and depressive symptoms. In multivariate analyses, compared with 1-time fallers, multiple fallers were more likely to have lower Barthel score (adjOR: 2.45, 95% CI: 1.48-4.07; P = .001), cognitive impairment (adjOR: 1.43, 95% CI: 1.04-1.96; P = .026), and to have been admitted from a medicine ward (adjOR: 1.55, 95% CI: 1.03-2.32; P = .035). Odds of poor functional recovery and institutionalization at discharge, as well as length of stay, increased incrementally from nonfallers to 1-time and to multiple fallers. CONCLUSION: In these patients admitted to postacute rehabilitation, the proportion of fallers and multiple fallers was high. Multiple fallers were particularly at risk of poor functional recovery and increased health services use. Specific fall prevention programs targeting high-risk patients with cognitive impairment and low functional status should be developed in further studies.