956 results for Event data recorders.
Abstract:
MuSVts110 is a conditionally defective mutant of Moloney murine sarcoma virus which undergoes a novel temperature-dependent splice event at growth temperatures of 33°C or lower. Relative to wild-type MuSV-124, MuSVts110 contains a 1487-base deletion spanning from the 3′ end of the p30 gag coding region to just downstream of the first v-mos initiation codon. As a result, the gag and mos genes are fused out of frame and no v-mos protein is expressed. However, upon a shift to 33°C or lower, a splice event occurs which removes 431 bases, realigns the gag and mos genes, and allows read-through translation of a P85gag-mos transforming protein. Interestingly, while the cryptic splice sites utilized in MuSVts110 are present and unaltered in MuSV-124, they are never used. Due to the 1487-base deletion, the MuSV-124 intron was reduced from 1919 to 431 bases, suggesting that intron size might be involved in the activation of these cryptic splice sites in MuSVts110. Since the splicing phenotype of the MuSVts110 equivalent (TS32 DNA), which contains the identical 1487-base deletion introduced into otherwise wild-type MuSV-124 DNA, was indistinguishable from that of authentic MuSVts110, it was concluded that this deletion alone is responsible for activation of the cryptic splice sites used in MuSVts110. These results also confirmed that thermodependent splicing is an intrinsic property of the viral RNA and not due to some cellular defect. Furthermore, analysis of gag gene deletion and frameshift MuSVts110 mutants demonstrated that viral gag gene proteins do not play a role in the regulation of MuSVts110 splicing. Instead, cis-acting viral sequences appear to mediate regulation of the splice event. Our initial observation that truncation of the MuSVts110 transcript, leaving only residual amounts of the flanking exon sequences, completely abolished splicing activity argued that exon sequences might participate in the regulation of the splice event. Analysis of exon sequence involvement has also identified cis-acting sequences important in the thermodependence of the splice event. Data suggest that regulation of the MuSVts110 splice event involves multiple interactions between specific intron and exon sequences and spliceosome components which together limit splicing activity to temperatures of 33°C or lower while simultaneously restricting splicing to a maximum of 50% efficiency. (Abstract shortened with permission of author.)
Abstract:
BACKGROUND Long-term hormone therapy has been the standard of care for advanced prostate cancer since the 1940s. STAMPEDE is a randomised controlled trial using a multiarm, multistage platform design. It recruits men with high-risk, locally advanced, metastatic or recurrent prostate cancer who are starting first-line long-term hormone therapy. We report primary survival results for three research comparisons testing the addition of zoledronic acid, docetaxel, or their combination to standard of care versus standard of care alone. METHODS Standard of care was hormone therapy for at least 2 years; radiotherapy was encouraged for men with N0M0 disease to November, 2011, then mandated; radiotherapy was optional for men with node-positive non-metastatic (N+M0) disease. Stratified randomisation (via minimisation) allocated men 2:1:1:1 to standard of care only (SOC-only; control), standard of care plus zoledronic acid (SOC + ZA), standard of care plus docetaxel (SOC + Doc), or standard of care with both zoledronic acid and docetaxel (SOC + ZA + Doc). Zoledronic acid (4 mg) was given for six 3-weekly cycles, then 4-weekly until 2 years, and docetaxel (75 mg/m(2)) for six 3-weekly cycles with prednisolone 10 mg daily. There was no blinding to treatment allocation. The primary outcome measure was overall survival. Pairwise comparisons of research versus control had 90% power at 2·5% one-sided α for hazard ratio (HR) 0·75, requiring roughly 400 control arm deaths. Statistical analyses were undertaken with standard log-rank-type methods for time-to-event data, with hazard ratios (HRs) and 95% CIs derived from adjusted Cox models. This trial is registered at ClinicalTrials.gov (NCT00268476) and ControlledTrials.com (ISRCTN78818544). FINDINGS 2962 men were randomly assigned to four groups between Oct 5, 2005, and March 31, 2013. Median age was 65 years (IQR 60-71). 1817 (61%) men had M+ disease, 448 (15%) had N+/X M0, and 697 (24%) had N0M0. 165 (6%) men were previously treated with local therapy, and median prostate-specific antigen was 65 ng/mL (IQR 23-184). Median follow-up was 43 months (IQR 30-60). There were 415 deaths in the control group (347 [84%] prostate cancer). Median overall survival was 71 months (IQR 32 to not reached) for SOC-only, not reached (32 to not reached) for SOC + ZA (HR 0·94, 95% CI 0·79-1·11; p=0·450), 81 months (41 to not reached) for SOC + Doc (0·78, 0·66-0·93; p=0·006), and 76 months (39 to not reached) for SOC + ZA + Doc (0·82, 0·69-0·97; p=0·022). There was no evidence of heterogeneity in treatment effect (for any of the treatments) across prespecified subsets. Grade 3-5 adverse events were reported for 399 (32%) patients receiving SOC, 197 (32%) receiving SOC + ZA, 288 (52%) receiving SOC + Doc, and 269 (52%) receiving SOC + ZA + Doc. INTERPRETATION Zoledronic acid showed no evidence of survival improvement and should not be part of standard of care for this population. Docetaxel chemotherapy, given at the time of long-term hormone therapy initiation, showed evidence of improved survival accompanied by an increase in adverse events. Docetaxel treatment should become part of standard of care for adequately fit men commencing long-term hormone therapy. FUNDING Cancer Research UK, Medical Research Council, Novartis, Sanofi-Aventis, Pfizer, Janssen, Astellas, NIHR Clinical Research Network, Swiss Group for Clinical Cancer Research.
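For readers unfamiliar with the analysis methods named above, the following sketch illustrates, on simulated data rather than the STAMPEDE data, a log-rank comparison between two arms and a covariate-adjusted Cox model hazard ratio with 95% CI using the lifelines package; the arm coding, covariate, follow-up, and effect size are all hypothetical.

```python
# Illustrative sketch only (not the STAMPEDE analysis): log-rank test and
# adjusted Cox-model hazard ratio on simulated survival data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(4)
n = 300
arm = rng.integers(0, 2, n)                                  # 0 = control, 1 = research arm (hypothetical coding)
age = rng.normal(65, 6, n)
time = rng.exponential(60 * np.where(arm == 1, 1.3, 1.0))    # months; longer survival on the research arm
event = (time < 84).astype(int)                              # administrative censoring at 84 months
time = np.minimum(time, 84)

# Unadjusted log-rank test between the two arms.
lr = logrank_test(time[arm == 0], time[arm == 1], event[arm == 0], event[arm == 1])
print(f"log-rank p = {lr.p_value:.3f}")

# Hazard ratios and 95% CIs from an adjusted Cox model.
df = pd.DataFrame({"time": time, "event": event, "arm": arm, "age": age})
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```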
Abstract:
Background: Despite almost 40 years of research into the etiology of Kawasaki Syndrome (KS), little research has been published on spatial and temporal clustering of KS cases. Previous analyses have found significant spatial and temporal clustering of cases; therefore, cluster analyses were performed to substantiate these findings and provide insight into incident KS cases discharged from a pediatric tertiary care hospital. Identifying clusters from a single institution would allow prospective analysis of risk factors and potential exposures for further insight into KS etiology. Methods: A retrospective study was carried out to examine the epidemiology and distribution of patients presenting to Texas Children’s Hospital in Houston, Texas, with a diagnosis of Acute Febrile Mucocutaneous Lymph Node Syndrome (MCLS) upon discharge from January 1, 2005 to December 31, 2009. Spatial, temporal, and space-time cluster analyses were performed using the Bernoulli model with case and control event data. Results: 397 of 102,761 total patients admitted to Texas Children’s Hospital had a principal or secondary diagnosis of Acute Febrile MCLS upon discharge over the 5-year period. Demographic data for KS cases remained consistent with known disease epidemiology. Spatial, temporal, and space-time analyses of clustering using the Bernoulli model demonstrated no statistically significant clusters. Discussion: Despite previous findings of spatial-temporal clustering of KS cases, there were no significant clusters of KS cases discharged from a single institution. This suggests the need for an expanded approach to spatial-temporal cluster analysis and KS surveillance, given the limitations of evaluating data from a single institution.
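As a rough sketch of the scan-statistic approach mentioned in the Methods, the code below implements a simplified, purely temporal Bernoulli-model scan (in the spirit of SaTScan, not the authors' actual spatial and space-time analysis) on simulated weekly case/control counts, with a small permutation test for the most likely cluster window.

```python
# Simplified temporal Bernoulli scan statistic on simulated admissions data.
import numpy as np

rng = np.random.default_rng(5)
n_weeks, n_patients = 260, 2000                      # five years of hypothetical weekly admissions
week = rng.integers(0, n_weeks, n_patients)
case = (rng.random(n_patients) < 0.01).astype(int)   # roughly 1% of admissions are cases

def bernoulli_llr(c, n, C, N):
    """Kulldorff's Bernoulli log-likelihood ratio for a window holding c of n points,
    given C of N points overall; zero unless the inside rate exceeds the outside rate."""
    def xlogx(k, m):
        return k * np.log(k / m) if 0 < k < m else 0.0
    if n == 0 or n == N or c / n <= (C - c) / (N - n):
        return 0.0
    return (xlogx(c, n) + xlogx(n - c, n)
            + xlogx(C - c, N - n) + xlogx((N - n) - (C - c), N - n)
            - xlogx(C, N) - xlogx(N - C, N))

def max_llr(case_by_week, total_by_week, max_len=52):
    # Scan every contiguous window of up to max_len weeks and keep the largest LLR.
    C, N = case_by_week.sum(), total_by_week.sum()
    cc = np.concatenate(([0], np.cumsum(case_by_week)))
    cn = np.concatenate(([0], np.cumsum(total_by_week)))
    L = len(case_by_week)
    return max(bernoulli_llr(cc[e] - cc[s], cn[e] - cn[s], C, N)
               for s in range(L) for e in range(s + 1, min(s + max_len, L) + 1))

total_by_week = np.bincount(week, minlength=n_weeks)
case_by_week = np.bincount(week, weights=case, minlength=n_weeks)
observed = max_llr(case_by_week, total_by_week)

# Monte Carlo significance: permute case labels over admissions, re-bin, re-scan.
perm = [max_llr(np.bincount(week, weights=rng.permutation(case), minlength=n_weeks), total_by_week)
        for _ in range(99)]
p_value = (1 + sum(b >= observed for b in perm)) / 100
print(f"max log-likelihood ratio = {observed:.2f}, permutation p = {p_value:.2f}")
```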
Abstract:
The episodic occurrence of debris flow events in response to stochastic precipitation and wildfire events makes hazard prediction challenging. Previous work has shown that frequency-magnitude distributions of non-fire-related debris flows follow a power law, but less is known about the distribution of post-fire debris flows. As a first step in parameterizing hazard models, we use frequency-magnitude distributions and cumulative distribution functions to compare volumes of post-fire debris flows to non-fire-related debris flows. Due to the large number of events required to parameterize frequency-magnitude distributions, and the relatively small number of post-fire event magnitudes recorded in the literature, we collected data on 73 recent post-fire events in the field. The resulting catalog of 988 debris flow events is presented as an appendix to this article. We found that the empirical cumulative distribution function of post-fire debris flow volumes is composed of smaller events than that of non-fire-related debris flows. In addition, the slope of the frequency-magnitude distribution of post-fire debris flows is steeper than that of non-fire-related debris flows, evidence that differences in the post-fire environment tend to produce a higher proportion of small events. We propose two possible explanations: 1) post-fire events occur on shorter return intervals than debris flows in similar basins that do not experience fire, causing their distribution to shift toward smaller events due to limitations in sediment supply, or 2) fire causes changes in resisting and driving forces on a package of sediment, such that a smaller perturbation of the system is required in order for a debris flow to occur, resulting in smaller event volumes.
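As a rough illustration of the comparison described above, the sketch below contrasts the empirical CDFs of two simulated volume catalogs and estimates the frequency-magnitude slope of each by log-log regression; the catalogs, volume units, and distribution parameters are hypothetical stand-ins for the post-fire and non-fire data sets.

```python
# Illustrative comparison of empirical CDFs and frequency-magnitude slopes
# for two synthetic debris-flow volume catalogs (not the authors' data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic volumes (m^3) drawn from Pareto tails with different exponents.
post_fire = 100 * (1 + rng.pareto(1.5, 400))
non_fire = 100 * (1 + rng.pareto(1.0, 400))

# Two-sample Kolmogorov-Smirnov test on the empirical CDFs of event volume.
ks = stats.ks_2samp(post_fire, non_fire)
print(f"KS statistic = {ks.statistic:.2f}, p = {ks.pvalue:.3g}")

def fm_slope(volumes, n_bins=20):
    # Slope of the log-binned frequency-magnitude distribution.
    bins = np.logspace(np.log10(volumes.min()), np.log10(volumes.max()), n_bins)
    counts, edges = np.histogram(volumes, bins=bins)
    centers = np.sqrt(edges[:-1] * edges[1:])
    keep = counts > 0
    slope, *_ = stats.linregress(np.log10(centers[keep]), np.log10(counts[keep]))
    return slope

print("post-fire slope:", round(fm_slope(post_fire), 2))
print("non-fire slope: ", round(fm_slope(non_fire), 2))
```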
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
Neuraminidase inhibitors, oseltamivir and zanamivir, are used for the treatment of, and protection from, influenza. The safety of these compounds has been assessed in systematic reviews. However, the data presented are somewhat limited by the paucity of good quality adverse event data available. The majority of safety outcomes are based on evidence from just one or two randomised controlled trials. The results of the systematic reviews suggest that neuraminidase inhibitors have a reasonable side effect and adverse effect profile if they are to be used to treat or protect patients against a life-threatening disease. However, if these compounds are to be prescribed in situations in which avoidance of inconvenience or minor discomfort is hoped for, then the balance of harms to benefits will be more difficult to judge.
Abstract:
Survival models deal with the modelling of time-to-event data. In certain situations, a share of the population will never experience the event of interest; cure fraction models emerged to handle this context. Among the models that incorporate a cured fraction, one of the best known is the promotion time model. In the present study we discuss hypothesis testing in the promotion time model with a Weibull distribution for the failure times of susceptible individuals. Hypothesis testing in this model may be performed using likelihood ratio, gradient, score or Wald statistics. The critical values are obtained from asymptotic approximations, which may result in size distortions in finite samples. This study proposes bootstrap corrections to the aforementioned tests and a bootstrap Bartlett correction to the likelihood ratio statistic in the Weibull promotion time model. Using Monte Carlo simulations, we compared the finite-sample performances of the proposed corrections with those of the usual tests. The numerical evidence favors the proposed corrected tests. An empirical application is presented at the end of the work.
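As a hedged illustration of the bootstrap idea, the sketch below applies a parametric-bootstrap correction to a likelihood-ratio test in a plain Weibull model (testing shape = 1, i.e. exponentiality) rather than the full Weibull promotion time cure model of the study; all data and settings are simulated.

```python
# Minimal sketch: parametric-bootstrap null distribution for a likelihood-ratio
# statistic, contrasted with its asymptotic chi-square approximation.
import numpy as np
from scipy import stats
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

def weibull_loglik(shape, x):
    # Profile out the scale analytically for a given shape parameter.
    scale = (np.mean(x**shape))**(1.0 / shape)
    return np.sum(stats.weibull_min.logpdf(x, c=shape, scale=scale))

def lr_stat(x):
    # LR statistic for H0: shape = 1 (exponential) versus unrestricted Weibull.
    fit = minimize_scalar(lambda c: -weibull_loglik(c, x), bounds=(0.05, 20), method="bounded")
    return 2.0 * (-fit.fun - weibull_loglik(1.0, x))

x = rng.weibull(1.0, size=40) * 2.0          # small sample generated under H0
obs = lr_stat(x)

# Bootstrap the null distribution of the LR statistic instead of relying on the
# asymptotic chi-square(1) critical value, which can distort test size in finite samples.
boot = np.array([lr_stat(rng.exponential(x.mean(), size=x.size)) for _ in range(999)])
p_asymptotic = stats.chi2.sf(obs, df=1)
p_bootstrap = (1 + np.sum(boot >= obs)) / (1 + boot.size)
print(f"LR = {obs:.3f}  asymptotic p = {p_asymptotic:.3f}  bootstrap p = {p_bootstrap:.3f}")
```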
Abstract:
Crossing the Franco-Swiss border, the Large Hadron Collider (LHC), designed to collide 7 TeV proton beams, is the world's largest and most powerful particle accelerator, the operation of which was originally intended to commence in 2008. Unfortunately, due to an interconnect discontinuity in one of the main dipole circuit's 13 kA superconducting busbars, a catastrophic quench event occurred during initial magnet training, causing significant physical system damage. Furthermore, investigation into the cause found that such discontinuities were present not only in the circuit in question but throughout the entire LHC. This prevented further magnet training and ultimately resulted in the maximum sustainable beam energy being limited to approximately half the design nominal, 3.5-4 TeV, for the first three years of operation (Run 1, 2009-2012), and in a major consolidation campaign being scheduled for the first long shutdown (LS 1, 2012-2014). Throughout Run 1, a series of studies attempted to predict the number of post-installation training quenches still required to qualify each circuit to nominal-energy current levels. With predictions in excess of 80 quenches (each having a recovery time of 8-12+ hours) just to achieve 6.5 TeV and close to 1000 quenches for 7 TeV, it was decided that for Run 2 all systems would be qualified for at least 6.5 TeV operation. However, even with all interconnect discontinuities scheduled to be repaired during LS 1, numerous other concerns regarding circuit stability arose: in particular, observations of erratic behaviour of the magnet bypass diodes and of degradation in other potentially weak busbar sections, as well as of seemingly random millisecond spikes in beam losses, known as unidentified falling object (UFO) events, which, if they persist at 6.5 TeV, may eventually deposit sufficient energy to quench adjacent magnets. In light of the above, the thesis hypothesis states that, even with the observed issues, the LHC main dipole circuits can safely support and sustain near-nominal proton beam energies of at least 6.5 TeV. Research into minimising the risk of magnet training led to the development and implementation of a new qualification method, capable of providing conclusive evidence that all aspects of all circuits, other than the magnets and their internal joints, can safely withstand a quench event at near-nominal current levels, allowing magnet training to be carried out both systematically and without risk. This method has become known as the Copper Stabiliser Continuity Measurement (CSCM). Results were a success, with all circuits eventually being subjected to a full current decay from 6.5 TeV equivalent current levels with no measurable damage occurring. Research into UFO events led to the development of a numerical model capable of simulating typical UFO events, reproducing entire Run 1 measured event data sets, and extrapolating to 6.5 TeV to predict the likelihood of UFO-induced magnet quenches. Results provided interesting insights into the phenomena involved as well as confirming the possibility of UFO-induced magnet quenches. The model was also capable of predicting whether such events, if left unaccounted for, are likely to be commonplace, resulting in significant long-term issues for 6.5+ TeV operation.
Addressing the thesis hypothesis, the following written works detail the development and results of all CSCM qualification tests and subsequent magnet training as well as the development and simulation results of both 4 TeV and 6.5 TeV UFO event modelling. The thesis concludes, post-LS 1, with the LHC successfully sustaining 6.5 TeV proton beams, but with UFO events, as predicted, resulting in otherwise uninitiated magnet quenches and being at the forefront of system availability issues.
Abstract:
For climate risk management, cumulative distribution functions (CDFs) are an important source of information. They are ideally suited to compare probabilistic forecasts of primary (e.g. rainfall) or secondary data (e.g. crop yields). Summarised as CDFs, such forecasts allow an easy quantitative assessment of possible, alternative actions. Although the degree of uncertainty associated with CDF estimation could influence decisions, such information is rarely provided. Hence, we propose Cox-type regression models (CRMs) as a statistical framework for making inferences on CDFs in climate science. CRMs were designed for modelling probability distributions rather than just mean or median values. This makes the approach appealing for risk assessments where probabilities of extremes are often more informative than central tendency measures. CRMs are semi-parametric approaches originally designed for modelling risks arising from time-to-event data. Here we extend this original concept beyond time-dependent measures to other variables of interest. We also provide tools for estimating CDFs and surrounding uncertainty envelopes from empirical data. These statistical techniques intrinsically account for non-stationarities in time series that might be the result of climate change. This feature makes CRMs attractive candidates to investigate the feasibility of developing rigorous global circulation model (GCM)-CRM interfaces for provision of user-relevant forecasts. To demonstrate the applicability of CRMs, we present two examples for El Niño/Southern Oscillation (ENSO)-based forecasts: the onset date of the wet season (Cairns, Australia) and total wet season rainfall (Quixeramobim, Brazil). This study emphasises the methodological aspects of CRMs rather than discussing merits or limitations of the ENSO-based predictors.
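A minimal sketch of the CDF-from-regression idea, assuming simulated data and the lifelines implementation of the Cox proportional-hazards model rather than the authors' code: a hypothetical ENSO-phase covariate is used to produce a full forecast CDF, F(t) = 1 − S(t), for a time-to-event quantity such as wet-season onset date.

```python
# Sketch: turning a Cox proportional-hazards fit into conditional CDF forecasts.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 200
enso = rng.integers(0, 2, n)                        # 0 = neutral, 1 = El Nino (hypothetical coding)
onset_day = rng.gamma(shape=8, scale=5 + 3 * enso)  # synthetic "time to onset" in days

df = pd.DataFrame({"onset_day": onset_day, "enso": enso, "observed": 1})

cph = CoxPHFitter()
cph.fit(df, duration_col="onset_day", event_col="observed")

# Survival function S(t | ENSO phase); the forecast CDF is F(t) = 1 - S(t).
grid = pd.DataFrame({"enso": [0, 1]})
surv = cph.predict_survival_function(grid)
cdf = 1.0 - surv
print(cdf.iloc[::20])   # CDF of onset date for each ENSO phase on a coarse time grid
```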
Abstract:
Inhibitory control deficits are well documented in schizophrenia, supported by impairment in an established measure of response inhibition, the stop-signal reaction time (SSRT). We investigated the neural basis of this impairment by comparing schizophrenia patients and controls matched for age, sex and education on behavioural, functional magnetic resonance imaging (fMRI) and event-related potential (ERP) indices of stop-signal task performance. Compared to controls, patients exhibited slower SSRT and reduced right inferior frontal gyrus (rIFG) activation, but rIFG activation correlated with SSRT in both groups. Go stimulus and stop-signal ERP components (N1/P3) were smaller in patients, and the peak latencies of the stop-signal N1 and P3 were also delayed in patients, indicating impairment early in stop-signal processing. Additionally, response-locked lateralised readiness potentials indicated that response preparation was prolonged in patients. An inability to engage the rIFG may underlie slowed inhibition in patients; however, multiple spatiotemporal irregularities in the networks underpinning stop-signal task performance may contribute to this deficit.
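For context on the SSRT measure referred to above, the sketch below shows the standard integration-method estimate of stop-signal reaction time on simulated data; it is not the authors' pipeline, and the response rate and delays are hypothetical.

```python
# Integration-method SSRT estimate: the go-RT distribution quantile at the observed
# stop-failure rate, minus the mean stop-signal delay. All values are simulated.
import numpy as np

rng = np.random.default_rng(6)
go_rt = rng.normal(450, 80, 300)                 # go-trial reaction times in ms (hypothetical)
ssd = rng.choice([150, 200, 250, 300], 100)      # stop-signal delays in ms (hypothetical staircase)
p_respond = 0.45                                 # observed probability of responding on stop trials

nth_rt = np.quantile(go_rt, p_respond)           # go RT at the stop-failure quantile
ssrt = nth_rt - ssd.mean()
print(f"estimated SSRT: {ssrt:.0f} ms")
```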
Abstract:
The idea of extracting knowledge in process mining is a descendant of data mining. Both mining disciplines emphasise the data flow and the relations among elements in the data. Unfortunately, challenges have been encountered when working with the data flow and these relations. One of the challenges is that the representation of the data flow between a pair of elements or tasks is insufficiently simplified and formulated, as it considers only one-to-one data flow relations. In this paper, we discuss how the effectiveness of knowledge representation can be extended in both disciplines. To this end, we introduce a new representation of the data flow and dependency formulation using a flow graph. The flow graph resolves the inability to represent other relation types, such as many-to-one and one-to-many relations. As an experiment, a new evaluation framework is applied to the Teleclaim process to show how this method can provide more precise results when compared with other representations.
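One possible reading of such a flow graph, sketched below as our own illustration rather than the paper's implementation: edges connect sets of source tasks to sets of target tasks, so one-to-many and many-to-one relations are first-class. Task names are hypothetical.

```python
# Toy flow-graph structure whose edges relate sets of tasks, not only pairs.
from collections import defaultdict
from typing import Dict, FrozenSet, Set

class FlowGraph:
    """Edges connect a *set* of source tasks to a *set* of target tasks."""
    def __init__(self) -> None:
        self.edges: Dict[FrozenSet[str], Set[str]] = defaultdict(set)

    def add_flow(self, sources: Set[str], targets: Set[str]) -> None:
        self.edges[frozenset(sources)] |= set(targets)

    def successors(self, task: str) -> Set[str]:
        # All tasks reachable via any edge whose source set contains `task`.
        return {t for src, dst in self.edges.items() if task in src for t in dst}

# Toy claim-handling flow: "register" fans out to two parallel checks
# (one-to-many), and both checks must feed "decide" (many-to-one).
g = FlowGraph()
g.add_flow({"register"}, {"check_policy", "check_damage"})
g.add_flow({"check_policy", "check_damage"}, {"decide"})
print(g.successors("register"))       # {'check_policy', 'check_damage'}
print(g.successors("check_policy"))   # {'decide'}
```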
Abstract:
The central part of the Himalaya (Kumaun and Garhwal Provinces of India) is noted for its prolonged seismic quiescence, and therefore, developing a longer-term time series of past earthquakes to understand their recurrence pattern in this segment assumes importance. In addition to direct observations of offsets in stratigraphic exposures or other proxies like paleoliquefaction, deformation preserved within stalagmites (speleothems) in karst system can be analyzed to obtain continuous millennial scale time series of earthquakes. The Central Indian Himalaya hosts natural caves between major active thrusts forming potential storehouses for paleoseismological records. Here, we present results from the limestone caves in the Kumaun Himalaya and discuss the implications of growth perturbations identified in the stalagmites as possible earthquake recorders. This article focuses on three stalagmites from the Dharamjali Cave located in the eastern Kumaun Himalaya, although two other caves, one of them located in the foothills, were also examined for their suitability. The growth anomalies in stalagmites include abrupt tilting or rotation of growth axes, growth termination, and breakage followed by regrowth. The U-Th age data from three specimens allow us to constrain the intervals of growth anomalies, and these were dated at 4273 +/- 410 years BP (2673-1853 BC), 2782 +/- 79 years BP (851-693 BC), 2498 +/- 117 years BP (605-371 BC), 1503 +/- 245 years BP (262-752 AD), 1346 +/- 101 years BP (563-765 AD), and 687 +/- 147 years BP (1176-1470 AD). The dates may correspond to the timings of major/great earthquakes in the region and the youngest event (1176-1470 AD) shows chronological correspondence with either one of the great medieval earthquakes (1050-1250 and 1259-1433 AD) evident from trench excavations across the Himalayan Frontal Thrust.
Abstract:
Serial Analysis of Gene Expression (SAGE) is a relatively new method for monitoring gene expression levels and is expected to contribute significantly to the progress in cancer treatment by enabling a precise and early diagnosis. A promising application of SAGE gene expression data is classification of tumors. In this paper, we build three event models (the multivariate Bernoulli model, the multinomial model and the normalized multinomial model) for SAGE data classification. Both binary classification and multicategory classification are investigated. Experiments on two SAGE datasets show that the multivariate Bernoulli model performs well with small feature sizes, but the multinomial performs better at large feature sizes, while the normalized multinomial performs well with medium feature sizes. The multinomial achieves the highest overall accuracy.
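The multivariate Bernoulli and multinomial event models described above correspond closely to standard naive Bayes variants; the sketch below shows that mapping on synthetic count data using scikit-learn (not the paper's code or data), so the accuracies are near chance and purely illustrative.

```python
# Naive Bayes event models on a synthetic SAGE-like tag-count matrix.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import BernoulliNB, MultinomialNB

rng = np.random.default_rng(2)
n_samples, n_tags = 60, 500
# Hypothetical tag-count matrix and binary tumor/normal labels.
X = rng.poisson(lam=2.0, size=(n_samples, n_tags))
y = rng.integers(0, 2, n_samples)

# BernoulliNB binarizes counts to presence/absence (multivariate Bernoulli model);
# MultinomialNB uses the raw counts (multinomial model).
for name, clf in [("Bernoulli", BernoulliNB()), ("Multinomial", MultinomialNB())]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name:12s} 5-fold CV accuracy: {acc:.2f}")
```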