89 results for long non-coding RNA
Abstract:
In this paper we investigate the structure of non-representable preference relations. While there is a vast literature on different kinds of preference relations that can be represented by a real-valued utility function, very little is known or understood about preference relations that cannot be represented by a real-valued utility function. There has been no systematic analysis of the non-representation problem. In this paper we give a complete description of non-representable preference relations which are total preorders or chains. We introduce and study the properties of four classes of non-representable chains: long chains, planar chains, Aronszajn-like chains and Souslin chains. In the main theorem of the paper we prove that a chain is non-representable if and only if it is a long chain, a planar chain, an Aronszajn-like chain or a Souslin chain. (C) 2002 Published by Elsevier Science B.V.
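A standard concrete instance of a non-representable chain, going back to Debreu, is the lexicographic order on the unit square; whether it falls under the paper's "planar chain" class is an assumption of this note, but it illustrates why representability can fail. Define $(x_1,y_1)\succ(x_2,y_2)$ on $[0,1]^2$ iff $x_1>x_2$, or $x_1=x_2$ and $y_1>y_2$. If a utility $u$ represented $\succ$, then for every $x\in[0,1]$ the interval $\bigl(u(x,0),\,u(x,1)\bigr)$ would be non-empty, and distinct values of $x$ would give pairwise disjoint intervals; choosing a rational number in each would embed the uncountable set $[0,1]$ into $\mathbb{Q}$, a contradiction, so no real-valued utility representation exists.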
Abstract:
Objective: To determine the relative importance of recognised risk factors for non-haemorrhagic stroke, including serum cholesterol and the effect of cholesterol-lowering therapy, on the occurrence of non-haemorrhagic stroke in patients enrolled in the LIPID (Long-term Intervention with Pravastatin in Ischaemic Disease) study. Design: The LIPID study was a placebo-controlled, double-blind trial of the efficacy of pravastatin therapy on coronary heart disease mortality over 6 years in 9014 patients with previous acute coronary syndromes and baseline total cholesterol of 4–7 mmol/l. Following identification of patients who had suffered non-haemorrhagic stroke, a pre-specified secondary end point, multivariate Cox regression was used to determine risk in the total population. Time-to-event analysis was used to determine the effect of pravastatin therapy on the rate of non-haemorrhagic stroke. Results: There were 388 non-haemorrhagic strokes in 350 patients. Factors conferring risk of future non-haemorrhagic stroke were age, atrial fibrillation, prior stroke, diabetes, hypertension, systolic blood pressure, cigarette smoking, body mass index, male sex and creatinine clearance. Baseline lipids did not predict non-haemorrhagic stroke. Treatment with pravastatin reduced non-haemorrhagic stroke by 23% (P = 0.016) when considered alone, and by 21% (P = 0.024) after adjustment for other risk factors. Conclusions: The study confirmed the variety of risk factors for non-haemorrhagic stroke. From the risk predictors, a simple prognostic index was created for non-haemorrhagic stroke to identify a group of patients at high risk. Treatment with pravastatin resulted in significant additional benefit after allowance for risk factors. (C) 2002 Lippincott Williams & Wilkins.
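A minimal sketch of the two analysis steps described: a multivariate Cox regression and a simple additive prognostic index built from its coefficients. Everything below (the synthetic data, the covariate names and the use of the lifelines library) is an illustrative assumption, not the LIPID analysis itself.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter  # one convenient Cox implementation (assumed available)

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age":         rng.normal(62, 8, n),
    "afib":        rng.binomial(1, 0.08, n),
    "diabetes":    rng.binomial(1, 0.15, n),
    "sbp":         rng.normal(140, 18, n),
    "pravastatin": rng.binomial(1, 0.5, n),
})
# Synthetic survival data: a higher linear predictor gives a shorter time to stroke.
lin = (0.04 * (df["age"] - 62) + 0.8 * df["afib"] + 0.5 * df["diabetes"]
       + 0.015 * (df["sbp"] - 140) - 0.25 * df["pravastatin"])
t = rng.exponential(scale=np.exp(-lin) * 12.0)
df["time"] = np.minimum(t, 6.0)            # administrative censoring at 6 years
df["event"] = (t <= 6.0).astype(int)       # 1 = non-haemorrhagic stroke observed

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()                         # hazard ratios for each risk factor

# A crude prognostic index: the fitted linear predictor, used to rank patients so
# that its upper tail identifies a high-risk group.
df["risk_score"] = np.dot(df[cph.params_.index], cph.params_.values)
print(df.nlargest(5, "risk_score")[["risk_score", "time", "event"]])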
Abstract:
For the first time it was possible to observe regular quasiperiodic scintillations (QPS) in VHF radio-satellite transmissions from orbiting satellites simultaneously at short (2.1 km) and long (121 km) meridional baselines in the vicinity of a typical mid-latitude station (Brisbane; 27.5°S and 152.9°E geog. and 35.6° invar. lat.), using three sites (St. Lucia-S, Taringa-T in Brisbane and Boreen Pt.-B, north of Brisbane). A few pronounced quasiperiodic (QP) events were recorded showing unambiguous regular structures at the sites, which made it possible to deduce a time displacement of the regular fading minimum at S, T and B. The QP structure is highly dependent on the geometry of the ray-path from a satellite to the observer, which is manifested as a change of a QP event from symmetrical to non-symmetrical for stations separated by 2.1 km, and as a radical change in the structure of the event over a distance of 121 km. It is suggested that the short-duration intense QP events are due to Fresnel diffraction (or a reflection mechanism) of radio-satellite signals by a single ionospheric irregularity in the form of an ellipsoid with a large ionization gradient along the major axis. The structure of a QP event depends on the angle of viewing of the irregular blob from a radio-satellite. In view of this it is suggested that the reported variety of the ionization formations responsible for different types of QPS is only apparent, not real. (C) 2003 Elsevier Science Ltd. All rights reserved.
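For orientation only (the numbers below are illustrative, not values taken from the paper): the spatial scale of Fresnel-diffraction fading from a single irregularity is set by the Fresnel radius $r_F=\sqrt{\lambda\, d_1 d_2/(d_1+d_2)}$, where $d_1$ and $d_2$ are the satellite-to-irregularity and irregularity-to-receiver distances. Assuming, say, a 150 MHz transmission ($\lambda\approx 2$ m), an irregularity near 300 km altitude and a satellite near 1000 km, $r_F\approx\sqrt{2\times(7\times10^{5})(3\times10^{5})/10^{6}}\approx 650$ m; diffraction fringes of roughly this scale, swept across the ground by the satellite's motion, are what a single-irregularity Fresnel mechanism would produce.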
Abstract:
Passive electroreception is a complex and specialised sense found in a large range of aquatic vertebrates primarily designed for the detection of weak bioelectric fields. Particular attention has traditionally focused on cartilaginous fishes, but a range of teleost and non-teleost fishes from a diversity of habitats have also been examined. As more species are investigated, it has become apparent that the role of electroreception in fishes is not restricted to locating prey, but is utilised in other complex behaviours. This paper presents the various functional roles of passive electroreception in non-electric fishes, by reviewing much of the recent research on the detection of prey in the context of differences in species' habitat (shallow water, deep-sea, freshwater and saltwater). A special case study on the distribution and neural groupings of ampullary organs in the omnihaline bull shark, Carcharhinus leucas, is also presented and reveals that prey-capture, rather than navigation, may be an important determinant of pore distribution. The discrimination between potential predators and conspecifics and the role of bioelectric stimuli in social behaviour is discussed, as is the ability to migrate over short or long distances in order to locate environmentally favourable conditions. The various theories proposed regarding the importance and mediation of geomagnetic orientation by either an electroreceptive and/or a magnetite-based sensory system receive particular attention. The importance of electroreception to many species is emphasised by highlighting what still remains to be investigated, especially with respect to the physical, biochemical and neural properties of the ampullary organs and the signals that give rise to the large range of observed behaviours.
Abstract:
Non-protein-coding RNAs (ncRNAs) are increasingly being recognized as having important regulatory roles. Although much recent attention has focused on tiny 22- to 25-nucleotide microRNAs, several functional ncRNAs are orders of magnitude larger in size. Examples of such macro ncRNAs include Xist and Air, which in mouse are 18 and 108 kilobases (Kb), respectively. We surveyed the 102,801 FANTOM3 mouse cDNA clones and found that Air and Xist were present not as single, full-length transcripts but as a cluster of multiple, shorter cDNAs, which were unspliced, had little coding potential, and were most likely primed from internal adenine-rich regions within longer parental transcripts. We therefore conducted a genome-wide search for regional clusters of such cDNAs to find novel macro ncRNA candidates. Sixty-six regions were identified, each of which mapped outside known protein-coding loci and which had a mean length of 92 Kb. We detected several known long ncRNAs within these regions, supporting the basic rationale of our approach. In silico analysis showed that many regions had evidence of imprinting and/or antisense transcription. These regions were significantly associated with microRNAs and transcripts from the central nervous system. We selected eight novel regions for experimental validation by northern blot and RT-PCR and found that the majority represent previously unrecognized noncoding transcripts that are at least 10 Kb in size and predominantly localized in the nucleus. Taken together, the data not only identify multiple new ncRNAs but also suggest the existence of many more macro ncRNAs like Xist and Air.
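The core computational step described, grouping unspliced, low-coding-potential cDNAs that map close together into candidate macro ncRNA regions, is essentially an interval-clustering problem. The sketch below is a generic illustration with made-up coordinates and an assumed gap threshold, not the FANTOM3 pipeline itself.

from typing import List, Tuple

def cluster_cdnas(alignments: List[Tuple[str, int, int]],
                  max_gap: int = 5_000) -> List[Tuple[str, int, int, int]]:
    """alignments: (chromosome, start, end) of each candidate cDNA.
    Returns merged regions as (chromosome, start, end, n_cdnas)."""
    regions = []
    for chrom, start, end in sorted(alignments):
        if regions and regions[-1][0] == chrom and start - regions[-1][2] <= max_gap:
            prev = regions[-1]
            regions[-1] = (chrom, prev[1], max(prev[2], end), prev[3] + 1)
        else:
            regions.append((chrom, start, end, 1))
    return regions

# Hypothetical cDNA alignments; the first three merge into one candidate region.
cdnas = [("chr17", 12_000_000, 12_004_000),
         ("chr17", 12_003_500, 12_010_000),
         ("chr17", 12_012_000, 12_018_000),
         ("chr2",  90_500_000, 90_503_000)]
for region in cluster_cdnas(cdnas):
    print(region)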
Risk of serious NSAID-related gastrointestinal events during long-term exposure: a systematic review
Abstract:
Objective: Exposure to non-steroidal anti-inflammatory drugs (NSAIDs) is associated with an increased risk of serious gastrointestinal (GI) events compared with non-exposure. We investigated whether that risk is sustained over time. Data sources: Cochrane Controlled Trials Register (to 2002); MEDLINE, EMBASE, Derwent Drug File and Current Contents (1999-2002); manual searching of reviews (1999-2002). Study selection: From 479 search results reviewed and 221 articles retrieved, seven studies of patients exposed to prescription non-selective NSAIDs for more than 6 months and reporting time-dependent serious GI event rates were selected for quantitative data synthesis. These were stratified into two groups by study design. Data extraction: Incidence of GI events and number of patients at specific time points were extracted. Data synthesis: Meta-regression analyses were performed. Change in risk was evaluated by testing whether the slope of the regression line declined over time. Four randomised controlled trials (RCTs) provided evaluable data from five NSAID arms (aspirin, naproxen, two ibuprofen arms, and diclofenac). When the RCT data were combined, a small significant decline in annualised risk was seen: -0.005% (95% CI, -0.008% to -0.001%) per month. Sensitivity analyses were conducted because there was disparity within the RCT data. The pooled estimate from three cohort studies showed no significant decline in annualised risk over periods up to 2 years: -0.003% (95% CI, -0.008% to 0.003%) per month. Conclusions: Small decreases in risk over time were observed; these were of negligible clinical importance. For patients who need long-term (> 6 months) treatment, precautionary measures should be considered to reduce the net probability of serious GI events over the anticipated treatment duration. The effect of intermittent versus regular daily therapy on long-term risk needs further investigation.
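A minimal sketch (with hypothetical numbers, not the review's data) of the kind of meta-regression described: annualised serious-GI-event risk regressed on time on treatment, weighted by the number of patients contributing at each time point, to test whether the slope differs from zero.

import numpy as np
import statsmodels.api as sm

months     = np.array([6, 12, 18, 24, 30, 36])         # time point (months of exposure)
risk_pct   = np.array([1.6, 1.5, 1.5, 1.4, 1.4, 1.3])  # annualised risk (%) at each point
n_patients = np.array([4000, 3600, 3300, 3000, 2700, 2400])

X = sm.add_constant(months)
fit = sm.WLS(risk_pct, X, weights=n_patients).fit()
print(fit.params)        # intercept and slope (change in annualised risk, % per month)
print(fit.conf_int())    # 95% CIs
print(fit.pvalues[1])    # p-value for the hypothesis that the slope is zero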
Abstract:
The testing of concurrent software components can be difficult due to the inherent non-determinism present in these components. For example, if the same test case is run multiple times, it may produce different results. This non-determinism may lead to problems with determining expected outputs. In this paper, we present and discuss several possible solutions to this problem in the context of testing concurrent Java components using the ConAn testing tool. We then present a recent extension to the tool that provides a general solution to this problem that is sufficient to deal with the level of non-determinism that we have encountered in testing over 20 components with ConAn. © 2005 IEEE
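ConAn itself targets Java components; the snippet below is only a language-agnostic sketch of one common way to handle non-deterministic outputs: run the concurrent scenario repeatedly and accept any outcome from an explicitly enumerated set of valid interleavings. The names here are illustrative, not part of ConAn's API.

import queue
import threading

def concurrent_scenario() -> str:
    """Two producers race to push a token into a shared queue; the order varies."""
    q = queue.Queue()
    threads = [threading.Thread(target=q.put, args=(token,)) for token in ("A", "B")]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return q.get() + q.get()

ALLOWED_OUTCOMES = {"AB", "BA"}   # every valid interleaving is an expected output

def test_concurrent_scenario(runs: int = 100) -> None:
    observed = set()
    for _ in range(runs):
        result = concurrent_scenario()
        assert result in ALLOWED_OUTCOMES, f"unexpected interleaving: {result}"
        observed.add(result)
    print(f"observed outcomes over {runs} runs: {observed}")

test_concurrent_scenario()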
Abstract:
The Equilibrium Flux Method [1] is a kinetic theory based finite volume method for calculating the flow of a compressible ideal gas. It is shown here that, in effect, the method solves the Euler equations with added pseudo-dissipative terms and that it is a natural upwinding scheme. The method can be easily modified so that the flow of a chemically reacting gas mixture can be calculated. Results from the method for a one-dimensional non-equilibrium reacting flow are shown to agree well with a conventional continuum solution. Results are also presented for the calculation of a plane two-dimensional flow, at hypersonic speed, of a dissociating gas around a blunt-nosed body.
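The "natural upwinding" can be seen from the generic form of a kinetic flux splitting, of which the Equilibrium Flux Method is an instance (the schematic below is a general statement, not a reproduction of the formulas in [1]). The numerical flux at a cell interface is assembled from half-range moments of the local equilibrium (Maxwellian) distribution $f_0$:
$$F_{i+1/2} = F^{+}(U_i) + F^{-}(U_{i+1}), \qquad F^{\pm}(U) = \int_{v \gtrless 0} v\,\psi(v)\, f_0(v; U)\, dv,$$
where $U$ denotes the cell-averaged conserved variables and $\psi(v)$ the collisional invariants (mass, momentum, energy). Molecules with $v>0$ carry flux out of the left cell and molecules with $v<0$ out of the right cell, so each half-flux draws on upstream information only, and the difference between this split flux and the exact Euler flux appears as the pseudo-dissipative terms.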
Abstract:
Ten lectal variables were examined with respect to Norwegian speakers' acceptance of long-distance reflexives (LDR), using a questionnaire to elicit grammaticality judgements on 50 potential LDR sentences. A sample of 180 speakers completed the questionnaire. The data were analysed using a univariate general linear model and Spearman's correlation. In this sample the results showed that dialect and level of education had significant effects on speakers' acceptance of long-distance reflexives, while sex, age, being a native speaker, having both native-speaker parents, living in the city or the country, and the speaker's attitudes to the two Norwegian written languages had no influence on speakers' acceptance of long-distance reflexives. It is suggested that the influence of Danish on Norwegian writing and on the southern dialects may be the cause of the observed variation with respect to LDR in Norwegian.
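A minimal sketch (synthetic data, illustrative variable names) of the two analyses described: a univariate general linear model of LDR acceptance scores against lectal factors, and a Spearman rank correlation.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 180
df = pd.DataFrame({
    "ldr_score": rng.normal(25, 6, n),                        # acceptance out of 50
    "dialect":   rng.choice(["north", "west", "south", "east"], n),
    "education": rng.choice(["primary", "secondary", "tertiary"], n),
    "age":       rng.integers(18, 80, n),
})

model = smf.ols("ldr_score ~ C(dialect) + C(education)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))      # F-tests for each factor

rho, p = spearmanr(df["age"], df["ldr_score"])
print(f"Spearman rho = {rho:.3f}, p = {p:.3f}")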
Abstract:
The purpose of this study was to determine the attentional demands of natural and imposed gait, as well as the attentional costs of transitions between the walking and running co-ordination patterns. Seven healthy young men and four healthy young women undertook an auditory probe reaction time task concurrently with self-selected gait (Experiment 1) and imposed walking and running (Experiment 2) at different speeds on a motor-driven treadmill. In Experiment 1, where participants were free to choose their own movement pattern to match the speed of travel of the treadmill, normal gait control was shown to have a significant attentional cost, and hence not to be automatic in the classical sense. However, this attentional cost did not differ between the two gait modes or at the transition point. In Experiment 2, where participants were required to maintain specific gait modes regardless of the treadmill speed, the maintenance of walking at speeds normally associated with running was found to have an attentional cost, whereas this was not the case for running at normal walking speeds. Collectively the findings support a model of gait control in which the normal switching between gait modes is determined with minimal attention demand and in which it is possible to sustain non-preferred gait modes although, in the case of walking, only at a significant attentional/cognitive cost. © 2002 Elsevier Science B.V. All rights reserved.
Abstract:
We conducted a study to assess the association between the acute respiratory health of children and the levels of particulates in communities near and away from active opencast coal mines. The study enrolled children aged 1–11 years from the general population of five socioeconomically matched pairs of nonurban communities in northern England. Diaries of respiratory events were collected for 1405 children, and information was collected on the consultations of 2442 children with family/general practitioners over the 6-week study periods during 1996–1997, with concurrent monitoring of particulate levels. The associations found between daily PM10 levels and respiratory symptoms were frequently small and positive and sometimes varied between communities. The magnitudes of these associations were in line with those from previous studies, even though daily particulate levels were low and the children were drawn from the general population rather than from a population with respiratory problems. The associations among asthma reliever use, consultations with general practitioners, and daily particulate levels were of a similar strength but estimated less precisely. The strength of association between all respiratory health measures and particulate levels was similar in communities near and away from opencast coal mining sites.
Abstract:
We analyze the quantum dynamics of radiation propagating in a single-mode optical fiber with dispersion, nonlinearity, and Raman coupling to thermal phonons. We start from a fundamental Hamiltonian that includes the principal known nonlinear effects and quantum-noise sources, including linear gain and loss. Both Markovian and frequency-dependent, non-Markovian reservoirs are treated. This treatment allows quantum Langevin equations to be derived, which have a classical form except for additional quantum-noise terms. In practical calculations, it is more useful to transform to Wigner or +P quasi-probability operator representations. These transformations result in stochastic equations that can be analyzed by use of perturbation theory or exact numerical techniques. The results have applications to fiber-optics communications, networking, and sensor technology.
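As a rough illustration of how Wigner-representation stochastic equations are used in practice, the sketch below integrates a classical-looking nonlinear Schrödinger equation for a fibre by the split-step Fourier method, with quantum effects entering only through vacuum noise added to the initial field. All parameter values and the noise normalisation are assumptions of this sketch, and the Raman/thermal reservoirs, gain and loss discussed in the paper are omitted.

import numpy as np

# Grid: time window T (ps), N points, propagation distance L (m).
N, T, L, steps = 1024, 40.0, 50.0, 400
dt, dz = T / N, L / steps
t = (np.arange(N) - N // 2) * dt
omega = 2.0 * np.pi * np.fft.fftfreq(N, d=dt)     # rad/ps

beta2 = -20e-3   # ps^2/m, anomalous dispersion (illustrative value)
gamma = 2e-3     # 1/(W m), Kerr coefficient (illustrative value)

# Coherent input pulse plus vacuum noise: roughly half a symmetrically ordered photon
# per temporal mode; the scaling used here is an assumption of the sketch.
rng = np.random.default_rng(0)
A = 1.0 / np.cosh(t / 2.0)                        # sech pulse, field in sqrt(W)
hbar_omega0 = 1.3e-19                             # photon energy in J (illustrative)
vac_std = np.sqrt(hbar_omega0 / (4.0 * dt * 1e-12))
A = A + vac_std * (rng.standard_normal(N) + 1j * rng.standard_normal(N))

lin_half = np.exp(0.5j * beta2 * omega**2 * dz / 2.0)   # half dispersion step
for _ in range(steps):
    A = np.fft.ifft(lin_half * np.fft.fft(A))           # dispersion (half step)
    A = A * np.exp(1j * gamma * np.abs(A)**2 * dz)      # Kerr nonlinearity
    A = np.fft.ifft(lin_half * np.fft.fft(A))           # dispersion (half step)

print("output peak power (W):", np.max(np.abs(A))**2)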
Abstract:
The A^{(1)}_{n-1} trigonometric vertex model with generic non-diagonal boundaries is studied. The double-row transfer matrix of the model is diagonalized by the algebraic Bethe ansatz method in terms of the intertwiner and the corresponding face-vertex relation. The eigenvalues and the corresponding Bethe ansatz equations are obtained.
Abstract:
The Direct Simulation Monte Carlo (DSMC) method is used to simulate the flow of rarefied gases. In the Macroscopic Chemistry Method (MCM) for DSMC, chemical reaction rates calculated from local macroscopic flow properties are enforced in each cell. Unlike the standard total collision energy (TCE) chemistry model for DSMC, the new method is not restricted to an Arrhenius form of the reaction rate coefficient, nor is it restricted to a collision cross-section which yields a simple power-law viscosity. For reaction rates of interest in aerospace applications, chemically reacting collisions are generally infrequent events and, as such, local equilibrium conditions are established before a significant number of chemical reactions occur. Hence, the reaction rates which have been used in MCM have been calculated from the reaction rate data which are expected to be correct only for conditions of thermal equilibrium. Here we consider artificially high reaction rates so that the fraction of reacting collisions is not small and propose a simple method of estimating the rates of chemical reactions which can be used in the Macroscopic Chemistry Method in both equilibrium and non-equilibrium conditions. Two tests are presented: (1) The dissociation rates under conditions of thermal non-equilibrium are determined from a zero-dimensional Monte Carlo sampling procedure which simulates ‘intra-modal’ non-equilibrium; that is, equilibrium distributions in each of the translational, rotational and vibrational modes but with different temperatures for each mode; (2) The 2-D hypersonic flow of molecular oxygen over a vertical plate at Mach 30 is calculated. In both cases the new method produces results in close agreement with those given by the standard TCE model in the same highly non-equilibrium conditions. We conclude that the general method of estimating the non-equilibrium reaction rate is a simple means by which information contained within non-equilibrium distribution functions predicted by the DSMC method can be included in the Macroscopic Chemistry Method.
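A toy zero-dimensional sketch of the Macroscopic Chemistry Method idea (illustrative numbers and a generic rate coefficient, not the specific rates or code of the paper): the number of dissociation events enforced in a cell per time step is taken from a macroscopic rate evaluated at the cell's current macroscopic state, rather than from individual collision probabilities.

import numpy as np

def dissociation_events_per_cell(n_o2, n_total, cell_volume, dt, rate_coeff, f_num):
    """Expected number of *simulated* O2 dissociations in one cell over one step.

    n_o2, n_total : number of simulated O2 particles / all particles in the cell
    cell_volume   : m^3
    rate_coeff    : macroscopic rate coefficient k(T) in m^3/s for O2 + M -> 2O + M
    f_num         : number of real molecules represented by one simulated particle
    """
    # Real number densities represented by the simulated particles.
    num_dens_o2 = n_o2 * f_num / cell_volume
    num_dens_m  = n_total * f_num / cell_volume
    # Macroscopic reaction rate (events per unit volume per unit time), converted
    # back to a number of simulated reaction events for this cell and time step.
    real_events = rate_coeff * num_dens_o2 * num_dens_m * cell_volume * dt
    return real_events / f_num

rng = np.random.default_rng(0)
expected = dissociation_events_per_cell(n_o2=800, n_total=1000, cell_volume=1e-9,
                                        dt=1e-7, rate_coeff=1e-20, f_num=1e12)
# Enforce an integer number of events while preserving the expectation on average.
n_events = int(expected) + (rng.random() < expected % 1.0)
print(f"expected {expected:.3f} -> enforcing {n_events} dissociation events this step")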
Abstract:
Silicic volcanic eruptions are typically accompanied by repetitive Long-Period (LP) seismicity that originates from a small region of the upper conduit. These signals have the capability to advance eruption prediction, since they commonly precede a change in the eruption vigour. Shear bands forming along the conduit wall, where the shear stresses are highest, have been linked to providing the seismic trigger. However, existing computational models are unable to generate shear bands at the depths where the LP signals originate using simple magma strength models. Presented here is a model in which the magma strength is determined from a constitutive relationship dependent upon crystallinity and pressure. This results in a depth-dependent magma strength, analogous to planetary lithospheres. Hence, in shallow highly-crystalline regions a macroscopically discontinuous brittle type of deformation will prevail, whilst in deeper crystal-poor regions there will be a macroscopically continuous plastic deformation mechanism. This will result in a depth where the brittle-ductile transition occurs, and here shear bands disconnected from the free surface may develop. We utilize the Finite Element Method and use axi-symmetric coordinates to model magma flow as a viscoplastic material, simulating quasi-static shear bands along the walls of a volcanic conduit. Model results constrained to the Soufrière Hills Volcano, Montserrat, show the generation of two types of shear bands: upper-conduit shear bands that form between the free surface and a few hundred metres below it, and discrete shear bands that form at the depths where LP seismicity is measured to occur, corresponding to the brittle-ductile transition and the plastic shear region. It is beyond the scope of the model to simulate a seismic event, although the modelled viscosity within the discrete shear bands suggests a failure and healing cycle time that supports the observed LP seismicity repeat times. However, due to the paucity of data and the large parameter space available, these results can only be considered to be qualitative rather than quantitative at this stage.
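A generic, illustrative sketch (not the paper's constitutive relationship or parameter values) of how a brittle-ductile transition depth arises when a pressure-dependent brittle strength is compared with a crystallinity- and strain-rate-dependent ductile (viscous) strength: the mechanism requiring the lower stress operates at each depth, and the crossover marks the transition.

import numpy as np

depth = np.linspace(0.0, 2000.0, 401)          # m below the free surface
rho_g = 2300.0 * 9.81                          # magma-column weight, Pa per m (assumed)
pressure = rho_g * depth

cohesion = 1.0e6                               # Pa (assumed)
friction = 0.6                                 # friction coefficient (assumed)
brittle_strength = cohesion + friction * pressure

strain_rate = 1.0e-4                           # 1/s near the conduit wall (assumed)
crystallinity = np.clip(0.6 - 2.0e-4 * depth, 0.1, 0.6)   # crystal-rich near the surface
viscosity = 1.0e9 * np.exp(10.0 * crystallinity)          # Pa s, toy crystallinity law
ductile_strength = 2.0 * viscosity * strain_rate          # toy viscous flow stress

# The curves cross once in this range; the crossing depth is the transition.
transition_idx = np.argmin(np.abs(brittle_strength - ductile_strength))
print(f"brittle-ductile transition near {depth[transition_idx]:.0f} m depth")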