927 results for Arrow Of Time


Relevance: 90.00%

Abstract:

We conducted a manikin study assessing the quality and speed of intubation with the Airtraq, used with its new iPhone AirView app, versus the King Vision. The primary endpoint was the reduction in the time needed for intubation; secondary endpoints included the times for the individual steps of intubation (glottis identification, tube insertion, lung ventilation). Thirty anaesthetists each performed 3 intubations with each device, in randomized order, on a difficult airway manikin. Participants had 12 years of professional experience; 60.0% had the Airtraq available in their hospital, 46.7% the King Vision, and 20.0% both. The median time difference [IQR] to identify the glottis (1.1 [-1.3; 3.9], P = 0.019), to insert the tube (2.1 [-2.6; 9.4], P = 0.002) and to ventilate the lungs (2.8 [-2.4; 11.5], P = 0.001) favoured the Airtraq-AirView. The median time for glottis visualization was significantly shorter with the Airtraq-AirView (5.3 [4.0; 8.4] versus 6.4 [4.6; 9.1]). The Cormack-Lehane grade before intubation was better with the King Vision (P = 0.03); no difference was noted during intubation, for subjective ease of device insertion, or for the quality of epiglottis visualisation. Assessment of tracheal tube insertion was better with the Airtraq-AirView. The Airtraq-AirView allows faster identification of the landmarks and faster intubation in a difficult airway manikin, although the clinical relevance of this finding remains to be studied. Anaesthetists rated intubation better with the Airtraq-AirView.
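
A minimal sketch of the kind of paired, non-parametric comparison reported above (a median difference with IQR and a p-value). The data values and the choice of the Wilcoxon signed-rank test are assumptions for illustration, not taken from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical paired times (seconds) to glottis visualization for the same
# 30 anaesthetists with each device -- illustrative values only.
rng = np.random.default_rng(0)
king_vision = rng.gamma(shape=6.0, scale=1.2, size=30)
airtraq_airview = king_vision - rng.normal(1.0, 1.5, size=30)

diff = king_vision - airtraq_airview
median_diff = np.median(diff)
iqr = np.percentile(diff, [25, 75])

# Paired non-parametric test (an assumption about the analysis actually used).
stat, p_value = stats.wilcoxon(king_vision, airtraq_airview)

print(f"median difference = {median_diff:.1f} s, "
      f"IQR = [{iqr[0]:.1f}; {iqr[1]:.1f}], P = {p_value:.3f}")
```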

Relevance: 90.00%

Abstract:

Background: In longitudinal studies where subjects experience recurrent incidents over a period of time, such as respiratory infections, fever or diarrhea, statistical methods are required to take the within-subject correlation into account. Methods: For repeated-events data with censored failure, the independent increment (AG, Andersen-Gill), marginal (WLW, Wei-Lin-Weissfeld) and conditional (PWP, Prentice-Williams-Peterson) models are three multiple-failure models that generalize Cox's proportional hazards model. In this paper, we review the efficiency, accuracy and robustness of all three models under simulated scenarios with varying degrees of within-subject correlation, censoring levels, maximum numbers of possible recurrences and sample sizes. We also study the methods' performance on a real dataset from a cohort study of bronchial obstruction. Results: We find substantial differences between the methods, and no single method is optimal. AG and PWP seem preferable to WLW for low correlation levels, but the situation is reversed for high correlations. Conclusions: All methods are robust to censoring, worsen with increasing numbers of recurrences, and share a bias problem which, among other consequences, makes asymptotic normal confidence intervals not fully reliable, even though they are well developed theoretically.
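
A minimal sketch of how within-subject correlation can be induced in simulated recurrent-event data via a shared gamma frailty, in the spirit of the simulation scenarios described above; the distributions, parameter values and censoring scheme are assumptions for illustration, not the paper's actual design.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_recurrent_events(n_subjects=200, max_recurrences=5,
                              frailty_var=0.5, censor_time=10.0):
    """Gap times are exponential with a subject-specific gamma frailty,
    which creates within-subject correlation; follow-up is censored."""
    records = []
    for i in range(n_subjects):
        # Shared frailty: mean 1, variance frailty_var (higher -> stronger correlation).
        z = rng.gamma(shape=1.0 / frailty_var, scale=frailty_var)
        t = 0.0
        for k in range(1, max_recurrences + 1):
            gap = rng.exponential(1.0 / z)  # hazard proportional to the frailty
            t += gap
            event = t <= censor_time
            records.append((i, k, min(t, censor_time), int(event)))
            if not event:
                break
    return records

data = simulate_recurrent_events()
n_events = sum(r[3] for r in data)
print(f"{len(data)} records, {n_events} observed events")
```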

Relevance: 90.00%

Abstract:

Nowadays, Species Distribution Models (SDMs) are a widely used tool. Using different statistical approaches, these models reconstruct the realized niche of a species from presence data and a set of predictor variables, often topoclimatic. Their range of application is broad, from understanding the requirements of a single species to designing nature reserves around species hotspots or modelling the impact of climate change. Most of the time these models use variables at a resolution of 50 km x 50 km or 1 km x 1 km. In some cases, however, they are used at resolutions below the kilometre scale (100 m x 100 m or 25 m x 25 m) and are then called high-resolution models. Quite recently, a new kind of data has emerged that enables precision up to 1 m x 1 m and thus allows very high resolution modelling. These new variables are, however, very costly and require considerable processing time, especially when they enter complex computations such as model projections over large areas. Moreover, the importance of very high resolution data in SDMs has not yet been assessed and is not well understood. Some basic knowledge of what drives species presences and absences is still missing. Indeed, it is not clear whether, in mountain areas like the Alps, coarse topoclimatic gradients drive species distributions, whether fine-scale temperature or topography is more important, or whether their importance can be neglected compared with competition or stochasticity. In this thesis I investigated the importance of very high resolution data (2-5 m) in species distribution models, using very high resolution topographic, climatic and edaphic variables over a 2000 m elevation gradient in the Western Swiss Alps. I also investigated the more local responses to these variables for a subset of species living in this area at two specific elevation belts. During this thesis I showed that high-resolution modelling requires very good datasets (both species data and predictor variables) to produce satisfactory results. Indeed, in mountain areas temperature is the most important factor driving species distributions, and it needs to be modelled at very fine resolution, rather than interpolated over large surfaces, to produce satisfactory results. Despite the intuitive idea that topography should be very important at high resolution, the results are mixed. Looking at the importance of variables over a large gradient, however, buffers their individual importance: topographic factors proved highly important at the subalpine level, but their importance decreases at lower elevations. Whereas at the montane level edaphic and land-use factors matter more, high-resolution topographic data are more important at the subalpine level. Finally, the biggest improvement in the models comes from adding edaphic variables: soil variables are of high importance, and predictors like pH surpass the usual topographic variables in terms of importance in the models. To conclude, high resolution is very important in modelling but requires very good datasets. Merely increasing the resolution of the usual topoclimatic predictors is not sufficient, and the use of edaphic predictors proved fundamental to produce significantly better models. This is of primary importance, especially if these models are used to reconstruct communities or as a basis for biodiversity assessments.
-- In recent years, the use of species distribution models (SDMs) has steadily increased. These models use various statistical tools to reconstruct the realized niche of a species from field presence data and a set of variables, notably climatic or topographic ones. Their applications range from studying the ecology of a single species to reconstructing communities or assessing the impact of climate warming. Most of the time these models rely on occurrences from global databases at a rather coarse resolution (1 km or even 50 km). Some databases nevertheless allow work at high resolution, below the kilometre scale, with resolutions of 100 m x 100 m or 25 m x 25 m. Recently, a new generation of very high resolution data has appeared, making it possible to work at the metre scale. The variables that can be generated from these new data are, however, very costly and require considerable processing time: any complex statistical computation, such as projecting species distributions over large areas, demands powerful computing resources and a lot of time. Moreover, the factors governing species distributions at fine scales are still poorly known, and the importance of high-resolution variables such as microtopography or temperature in the models is not certain; other factors such as competition or natural stochasticity could have an equally strong influence. My thesis work is set in this context. I sought to understand the importance of high resolution in species distribution models, whether for temperature, microtopography or edaphic variables, along a large elevation gradient in the Vaud Prealps. I also sought to understand the local impact of certain variables that may be overlooked because of confounding effects along the elevation gradient. During this thesis I was able to show that high-resolution variables, whether related to temperature or to microtopography, provide only a limited improvement of the models. To obtain a substantial improvement, it is necessary to work with larger datasets, both for the species and for the variables used; for example, the usual interpolated climatic layers must be replaced by temperature layers modelled at high resolution from field measurements. Working along a 2000 m temperature gradient naturally makes temperature very important in the models. The importance of microtopography is negligible compared with topography at a 25 m resolution. At a more local scale, however, high resolution becomes extremely important in the subalpine belt, whereas at the montane belt soil and land-use variables dominate. Finally, species distribution models were most improved by the addition of edaphic variables, chiefly pH, whose importance equals or surpasses that of the topographic variables when they are added to the usual species distribution models.
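
A minimal sketch of the core SDM idea described above: relating presence/absence to topoclimatic and edaphic predictors with a logistic model and asking how much an edaphic predictor adds. The predictor names, the synthetic data and the use of scikit-learn logistic regression are assumptions for illustration, not the thesis's actual modelling pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 500

# Hypothetical predictors at plot locations: temperature (degC), slope (deg), soil pH.
temperature = rng.normal(8.0, 4.0, n)
slope = rng.uniform(0.0, 45.0, n)
ph = rng.normal(6.0, 1.0, n)

# Synthetic presence/absence driven mainly by temperature and pH (for illustration).
logit = 0.6 * (temperature - 8.0) + 0.8 * (ph - 6.0) - 0.01 * slope
presence = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X_topo = np.column_stack([temperature, slope])
X_full = np.column_stack([temperature, slope, ph])

# Compare predictive skill with and without the edaphic predictor (pH).
for name, X in [("topoclimatic only", X_topo), ("plus edaphic (pH)", X_full)]:
    auc = cross_val_score(LogisticRegression(max_iter=1000), X, presence,
                          cv=5, scoring="roc_auc").mean()
    print(f"{name}: cross-validated AUC = {auc:.2f}")
```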

Relevance: 90.00%

Abstract:

BACKGROUND: Evidence regarding the different treatment options for status epilepticus (SE) in adults is scarce. Large randomized trials cover only the early stage of treatment and suggest the superiority of benzodiazepines over placebo, of intravenous lorazepam over intravenous diazepam or over intravenous phenytoin alone, and of intramuscular midazolam over intravenous lorazepam. However, many patients will not be treated successfully with the first treatment step. A large randomized trial covering the treatment of established status (ESETT) has only recently been funded by the NIH and will not start before 2015, with results expected in 2018; a trial on the treatment of refractory status with general anesthetics was terminated early due to insufficient recruitment. Therefore, a prospective multicenter observational registry was set up; this may help in clinical decision-making until results from randomized trials are available. METHODS/DESIGN: SENSE is a prospective, multicenter registry for patients treated for SE. The primary objective is to document the patient characteristics, treatment modalities and in-hospital outcome of consecutive adults admitted for SE treatment in each of the participating centres, and to identify predictors of outcome. Pre-treatment, treatment-related and outcome variables are documented systematically. To allow for meaningful multivariate analysis in the patient subgroups with refractory SE, a cohort size of 1000 patients is targeted. DISCUSSION: The results of the study will provide information about the risks and benefits of specific treatment steps in different patient groups with SE at different points in time. Thus, it will support clinical decision-making and, furthermore, be helpful in the planning of treatment trials. TRIAL REGISTRATION: DRKS00000725.

Relevance: 90.00%

Abstract:

The aim of this work is to study the flow properties at a pipe T-junction, the pressure loss suffered by the flow after passing through the junction, and the reliability of the classical engineering formulas used to estimate the head loss at a T-junction. We compared results from CFD software packages with the classical formulas and attempted to determine the accuracy of the latter. We studied the head loss at T-junctions for various inlet velocities, for junction angles slightly different from 90 degrees, and for different cross-sectional areas of the main and branch pipes. The flow at the T-junction was simulated with FLUENT and Comsol Multiphysics to observe the flow inside the junction and the head loss suffered by the fluid after passing through it. We also compared the pressure (head) losses obtained from the classical formulas of A. Vazsonyi and Andrew Gardel, from formulas obtained by treating the T-junction as a combination of other pipe components, and from the software experiments. A further purpose of this study is to examine how the pressure loss changes with the angle of the T-junction. Software gives a better view of the flow inside the junction and allows turbulence, kinetic energy, pressure loss, etc. to be studied; such simulations save a lot of time and can be performed without actually doing the experiment. No real-life experiments were made; the results obtained rely entirely on the accuracy of the software and the numerical methods used.
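
A minimal sketch of the textbook minor-loss estimate for a junction, h = K v^2 / (2g), which is the kind of classical formula being checked against CFD above; the loss coefficients used here are illustrative assumptions, not the Vazsonyi or Gardel values from the study.

```python
G = 9.81  # gravitational acceleration, m/s^2

def head_loss(velocity_m_s: float, k_coefficient: float) -> float:
    """Classical minor-loss estimate h = K * v^2 / (2g), in metres of fluid column."""
    return k_coefficient * velocity_m_s ** 2 / (2.0 * G)

# Illustrative loss coefficients for a sharp-edged 90-degree tee (assumed values):
K_BRANCH = 1.0   # flow turning into the branch
K_RUN = 0.2      # flow passing straight through

for v in (1.0, 2.0, 4.0):  # inlet velocities in m/s
    print(f"v = {v:.1f} m/s: branch loss = {head_loss(v, K_BRANCH):.3f} m, "
          f"run loss = {head_loss(v, K_RUN):.3f} m")
```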

Relevance: 90.00%

Abstract:

This paper presents a preliminary study on the degradation of spray paint samples, illustrated by optical, FTIR and Raman measurements. In contrast to automotive paints, which are specifically designed for outdoor exposure and protected with hindered amine light stabilizers (HALS) and ultraviolet absorbers (UVA), spray paints are much simpler in composition and very likely to suffer more from the joint effects of solar radiation, temperature and humidity. Six different spray paints were exposed to outdoor UV radiation for a total period of three months, and both FTIR and Raman measurements were taken systematically during this time. These results were later compared to artificial degradation in a climate chamber. For infrared spectroscopy, degradation curves were plotted using the photo-oxidation index (POI) and could be successfully approximated with a logarithmic fit (R2 > 0.8). Degradation can appear within the first few days of exposure and remains substantial up to about 2 months, after which it stabilizes and follows a more linear trend. One advantage is that the degradation products appeared almost exclusively at the far end (∼3000 cm−1) of the mid-infrared spectra, so the fingerprint region of the spectra remained stable over the studied period of time. The Raman results suggest that the pigments, on the other hand, are much more stable and showed no sign of degradation over the time of this study. Considering the forensic implications of this environmental degradation, care should be taken when comparing samples if weathering is a possibility (e.g. an exposed graffiti compared to the paint from a fresh spray paint can). Degradation issues should be kept in mind as they may induce significant differences between paint samples of common origin.
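
A minimal sketch of fitting a logarithmic curve to a photo-oxidation index (POI) time series, as described above; the POI values and the exact functional form are assumptions for illustration, not the paper's data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical POI measurements over ~90 days of outdoor exposure (illustrative values).
days = np.array([1, 3, 7, 14, 21, 30, 45, 60, 75, 90], dtype=float)
poi = np.array([0.05, 0.11, 0.18, 0.25, 0.29, 0.33, 0.36, 0.38, 0.39, 0.40])

def log_model(t, a, b):
    """POI(t) = a * ln(t) + b -- one plausible logarithmic fitting form."""
    return a * np.log(t) + b

params, _ = curve_fit(log_model, days, poi)
predicted = log_model(days, *params)

# Coefficient of determination, analogous to the R^2 > 0.8 reported in the text.
ss_res = np.sum((poi - predicted) ** 2)
ss_tot = np.sum((poi - poi.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
print(f"a = {params[0]:.3f}, b = {params[1]:.3f}, R^2 = {r_squared:.3f}")
```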

Relevance: 90.00%

Abstract:

Questions of scale have received ample attention in physical scale modeling and experimentation, but have not been discussed with regard to economic experimentation. In this article I distinguish between two kinds of experiments, "generic" and "specific" experiments. Using a comparison between two experimental laboratory studies on the "posted price effect", I then show that scale issues become important in specific laboratory experiments because of the scaling down of time in the target market to laboratory dimensions. This entails choices in the material configuration of the experiment as well as role changes of experimental subjects. My discussion thus adds to recent literature on external validity and on the materiality of experiments.

Relevance: 90.00%

Abstract:

Objective: The aim of the current study was to investigate the long-term cognitive effects of electroconvulsive therapy (ECT) in a sample of adolescent patients diagnosed with schizophrenia spectrum disorders. Methods: The sample comprised nine adolescent subjects diagnosed with schizophrenia or schizoaffective disorder according to DSM-IV-TR criteria who received ECT (ECT group) and nine adolescent subjects matched by age, socioeconomic status, diagnosis and Positive and Negative Syndrome Scale (PANSS) total score at baseline who did not receive ECT (NECT group). Clinical and neuropsychological assessments were carried out at baseline, before ECT treatment, and at 2-year follow-up. Results: Significant differences were found between groups in the number of unsuccessful medication trials. No statistically significant differences were found between the ECT group and the NECT group at baseline, either in severity as assessed by the PANSS or in any cognitive variable. At follow-up, both groups showed significant improvement in clinical variables (the positive and general subscales and the total score of the PANSS, and Clinical Global Impressions-Improvement). In the cognitive assessment at follow-up, significant improvement was found in both groups in the semantic category of the verbal fluency task and in digits forward. However, no significant differences were found between groups in any clinical or cognitive variable at follow-up. Repeated-measures analysis found no significant time × group interaction for any clinical or neuropsychological measure. Conclusions: The current study showed no significant differences in change over time in clinical or neuropsychological variables between the ECT group and the NECT group at 2-year follow-up. Thus, ECT did not show any negative influence on long-term neuropsychological variables in our sample.

Relevance: 90.00%

Abstract:

The τ-function and the η-function are phenomenological models that are widely used in the context of timing interceptive actions and collision avoidance, respectively. Both models were previously considered to be unrelated to each other: τ is a decreasing function that provides an estimation of time-to-contact (ttc) in the early phase of an object approach; in contrast, η has a maximum before ttc. Furthermore, it is not clear how both functions could be implemented at the neuronal level in a biophysically plausible fashion. Here we propose a new framework, the corrected modified Tau function, capable of predicting both τ-type and η-type responses. The outstanding property of our new framework is its resilience to noise. We show that it can be derived from a firing rate equation and, like η, serves to describe the response curves of collision-sensitive neurons. Furthermore, we show that it predicts the psychophysical performance of subjects determining ttc. Our new framework is thus validated successfully against published and novel experimental data. Within the framework, links between τ-type and η-type neurons are established. It could therefore serve as a model for explaining the co-occurrence of such neurons in the brain.
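
A minimal sketch contrasting the two response profiles for an object approaching at constant speed, using the commonly cited optical definitions τ ≈ θ/θ̇ (optical angle over its rate of expansion) and η = θ̇·exp(-αθ); the geometry, the value of α and the constant-velocity approach are assumptions for illustration and are not taken from the paper.

```python
import numpy as np

# Object of half-size l approaching an observer at constant speed v; contact at t_contact.
l = 0.5          # half-size of the object, m
v = 5.0          # approach speed, m/s
d0 = 50.0        # initial distance, m
t_contact = d0 / v
alpha = 5.0      # assumed shape parameter of the eta function

t = np.linspace(0.0, t_contact - 0.05, 2000)
d = d0 - v * t                            # distance to the object
theta = 2.0 * np.arctan(l / d)            # optical angle subtended by the object
theta_dot = np.gradient(theta, t)         # rate of optical expansion

tau = theta / theta_dot                   # decreasing estimate of time-to-contact
eta = theta_dot * np.exp(-alpha * theta)  # peaks before contact

print(f"tau at t=0: {tau[0]:.2f} s (true ttc = {t_contact:.2f} s)")
print(f"eta peaks {t_contact - t[np.argmax(eta)]:.2f} s before contact")
```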

Relevance: 90.00%

Abstract:

The introduction of time-series graphs into British economics in the 19th century depended on the "timing" of history. This involved reconceptualizing history into events that were comparable, measurable, and standardized by time unit. Yet classical economists in Britain in the early 19th century viewed history as a set of heterogeneous and complex events, and statistical tables as giving unrelated facts. Both of these attitudes had to be broken down before time-series graphs could be brought into use for revealing regularities in economic events by the century's end.

Relevance: 90.00%

Abstract:

Vaccination aims at generating memory immune responses able to protect individuals against pathogenic challenges over long periods of time. Subunit vaccine formulations based on safe, but poorly immunogenic, antigenic entities must be combined with adjuvant molecules to make them efficient against infections. We have previously shown that gas-filled microbubbles (MB) are potent antigen-delivery systems. This study compares the ability of various ovalbumin-associated MB (OVA-MB) formulations to induce antigen-specific memory immune responses and evaluates long-term protection against bacterial infection. When the reactivity of dendritic cells to MB constituents was initially tested, palmitic acid elicited the highest degree of activation. Subcutaneous immunization of naïve wild-type mice with the OVA-MB formulation comprising the highest palmitic acid content and devoid of PEG2000 was found to trigger the most pronounced Th1-type response, as reflected by robust IFN-γ and IL-2 production. Both T cell and antibody responses persisted for at least 6 months after immunization. At that time, systemic infection with OVA-expressing Listeria monocytogenes was performed. Partial protection of vaccinated mice was demonstrated by a reduction of the bacterial load in both the spleen and the liver. We conclude that antigen-bound MB exhibit promising properties as a vaccine candidate ensuring prolonged maintenance of protective immunity.

Relevance: 90.00%

Abstract:

OBJECTIVES: A blunted nocturnal dip of blood pressure (BP) and a reversed circadian rhythm have been described in preeclampsia (PE). Non-dipper status and preeclampsia are both associated with an increased risk of cardiovascular disease later in life. Complete recovery of BP after PE is reported to occur over a variable period of time. Twenty-four-hour ambulatory blood pressure measurement (ABPM) in the post-partum follow-up after PE has not been described. The aim of this study was to assess the 24-hour ambulatory blood pressure pattern after PE and to determine the prevalence of non-dipper status, nocturnal hypertension, white coat hypertension and masked hypertension. METHODS: This is an observational, prospective study of women who suffered from preeclampsia. A 24h-ABPM was done 6 weeks post-partum at the Hypertension Unit of the University Hospitals of Geneva, concomitantly with a clinical and biological evaluation. RESULTS: Forty-five women were included in a preliminary analysis. Mean age was 33 ± 6 years, 57.3% were Caucasian, and mean BMI before pregnancy was 24 ± 5 kg/m². Office and ambulatory BP are shown in Table 1. The prevalence of nocturnal hypertension was high, and half of the women had no nocturnal dipping. The diagnosis of hypertension based on office BP was discordant with the diagnosis based on ABPM in 25% of the women. CONCLUSIONS: The prevalence of increased nighttime BP and abnormal BP patterns is high at 6 weeks post-partum in preeclamptic women. Early assessment of BP with ABPM after preeclampsia allows early identification of women with persistent circadian abnormalities who might be at increased risk. It also provides a more accurate assessment than office BP. DISCLOSURES: A. Ditisheim: None. B. Ponte: None. G. Wuerzner: None. M. Burnier: None. M. Boulvain: None. A. Pechère-Bertschi: None.
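
A minimal sketch of how dipper status is commonly derived from 24-hour ABPM readings, i.e. nocturnal dip = (daytime mean - night-time mean) / daytime mean, with a dip below roughly 10% usually classed as non-dipping; the readings and the cut-off applied here are illustrative assumptions, not data or criteria from the study.

```python
def dipping_status(daytime_systolic, nighttime_systolic, threshold=0.10):
    """Classify nocturnal dipping from mean day and night systolic BP (mmHg).

    Dip fraction = (day mean - night mean) / day mean; values below the
    threshold (commonly 10%) are classed as non-dipping.
    """
    day_mean = sum(daytime_systolic) / len(daytime_systolic)
    night_mean = sum(nighttime_systolic) / len(nighttime_systolic)
    dip = (day_mean - night_mean) / day_mean
    return dip, ("dipper" if dip >= threshold else "non-dipper")

# Hypothetical systolic readings (mmHg) from one 24h-ABPM recording.
day_readings = [128, 132, 135, 130, 127, 133]
night_readings = [124, 126, 123, 125]

dip, status = dipping_status(day_readings, night_readings)
print(f"nocturnal dip = {dip:.1%} -> {status}")
```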

Relevance: 90.00%

Abstract:

INTRODUCTION: The main clinical manifestations of Whipple's disease are weight loss, arthropathy, diarrhea and abdominal pain. Cardiac involvement is frequently described; however, endocarditis is rare and is not usually the initial presentation of the disease. To the best of our knowledge, this is the first reported case of a patient with Tropheryma whipplei tricuspid endocarditis without any other valve involved and without signs of arthralgia or abdominal involvement. CASE PRESENTATION: We report the case of a 50-year-old Caucasian man with tricuspid endocarditis caused by Tropheryma whipplei, presenting with severe shock and an absence of the other more classic clinical signs of Whipple's disease, such as arthralgia, abdominal pain and diarrhea. Tropheryma whipplei was documented by polymerase chain reaction of the blood and pleural fluid. The infection was treated with a combination of doxycycline, hydroxychloroquine and sulfamethoxazole-trimethoprim for one year. CONCLUSION: Tropheryma whipplei infectious endocarditis should always be considered when facing blood culture-negative endocarditis, particularly of right-sided valves. Although not yet standardized, treatment of Tropheryma whipplei endocarditis should probably include a bactericidal antibiotic (such as doxycycline) and should be given over a prolonged period of time (a minimum of one year).

Relevance: 90.00%

Abstract:

The fission yeast Schizosaccharomyces pombe has been an invaluable model system for studying the regulation of mitotic cell cycle progression, the mechanics of cell division and cell polarity. Furthermore, classical experiments on its sexual reproduction have yielded results pivotal to the current understanding of DNA recombination and meiosis. More recent analysis of fission yeast mating has raised interesting questions about extrinsic stimuli response mechanisms, polarized cell growth and cell-cell fusion. To study these topics in detail we have developed a simple protocol for microscopy of the entire sexual lifecycle. The method described here is easily adjusted to study specific mating stages. Briefly, after being grown to exponential phase in a nitrogen-rich medium, cell cultures are shifted to a nitrogen-deprived medium for periods of time suited to the stage of the sexual lifecycle to be explored. Cells are then mounted on custom, easily built agarose pad chambers for imaging. This approach allows cells to be monitored from the onset of mating to the final formation of spores.

Relevance: 90.00%

Abstract:

Electricity spot prices have always been a demanding data set for time series analysis, mostly because of the non-storability of electricity. This feature, which sets electric power apart from other commodities, causes pronounced price spikes. Moreover, the last several years in the financial world suggest that 'spiky' behaviour of time series is no longer an exception but rather a regular phenomenon. The purpose of this paper is to seek patterns and relations within electricity price outliers and to verify how they affect the overall statistics of the data. The study uses techniques such as the classical Box-Jenkins approach, DFT-based smoothing of the series, and GARCH models. The results obtained for two geographically different price series show that the patterns in the occurrence of outliers are not straightforward. Additionally, there seems to be no rule that would predict the appearance of a spike from volatility, while the reverse effect is quite prominent. It is concluded that spikes cannot be predicted from the price series alone; some geographical and meteorological variables probably need to be included in the modelling.
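
A minimal sketch of one common way to flag price spikes as outliers relative to a rolling median, in the spirit of the outlier analysis described above; the threshold rule, window length and synthetic price series are assumptions for illustration, not the paper's method.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)

# Synthetic hourly spot prices: a daily cycle, noise, and a few injected spikes.
hours = pd.date_range("2024-01-01", periods=24 * 60, freq="h")
base = 50 + 10 * np.sin(2 * np.pi * np.arange(len(hours)) / 24)
prices = pd.Series(base + rng.normal(0, 3, len(hours)), index=hours)
prices.iloc[rng.choice(len(prices), 12, replace=False)] += rng.uniform(80, 200, 12)

# Flag observations far from a rolling median, scaled by a rolling MAD.
window = 24 * 7
median = prices.rolling(window, center=True, min_periods=24).median()
mad = (prices - median).abs().rolling(window, center=True, min_periods=24).median()
spikes = (prices - median).abs() > 6 * mad

print(f"flagged {int(spikes.sum())} spike hours out of {len(prices)}")
```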