973 results for 21-point running mean


Relevance: 30.00%

Abstract:

The dispersion of a point-source release of a passive scalar in a regular array of cubical, urban-like obstacles is investigated by means of direct numerical simulations. The simulations are conducted under conditions of neutral stability and fully rough turbulent flow, at a roughness Reynolds number of Re_τ = 500. The Navier–Stokes and scalar equations are integrated assuming a constant-rate release from a point source close to the ground within the array. We focus on short-range dispersion, when most of the material is still within the building canopy. Mean and fluctuating concentrations are computed for three different pressure-gradient directions (0°, 30°, 45°). The results agree well with available experimental data measured in a water channel for a flow angle of 0°. Profiles of mean concentration and the three-dimensional structure of the dispersion pattern are compared for the different forcing angles. A number of processes affecting the plume structure are identified and discussed, including: (i) advection or channelling of scalar down ‘streets’; (ii) lateral dispersion by turbulent fluctuations and topological dispersion induced by dividing streamlines around buildings; (iii) skewing of the plume due to flow turning with height; (iv) detrainment by turbulent dispersion or mean recirculation; (v) entrainment and release of scalar in building wakes, giving rise to ‘secondary sources’; and (vi) plume meandering due to unsteady turbulent fluctuations. Finally, results on relative concentration fluctuations are presented and compared with the literature for point-source dispersion over flat terrain and urban arrays. Keywords: Direct numerical simulation · Dispersion modelling · Urban array
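
As a point of reference for the statistics quoted above, the sketch below (not the authors' DNS post-processing; the receptor signal is synthetic) computes the time-mean concentration, the RMS fluctuation, and the relative fluctuation intensity σ_c/C̄ from a single-point concentration time series.

```python
import numpy as np

def concentration_statistics(c):
    """Time statistics of an instantaneous concentration series at one receptor:
    mean, RMS fluctuation, and relative fluctuation intensity sigma_c / <c>."""
    c = np.asarray(c, dtype=float)
    c_mean = c.mean()
    c_rms = np.sqrt(np.mean((c - c_mean) ** 2))   # RMS of c' = c - <c>
    intensity = c_rms / c_mean if c_mean > 0 else np.nan
    return c_mean, c_rms, intensity

# Synthetic receptor signal standing in for DNS output at one in-canopy point
rng = np.random.default_rng(0)
c_series = np.abs(1.0 + 0.5 * rng.standard_normal(10_000))
print(concentration_statistics(c_series))
```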

Relevance: 30.00%

Abstract:

We bridge the properties of the regular triangular, square, and hexagonal honeycomb Voronoi tessellations of the plane to the Poisson–Voronoi case, thus analyzing in a common framework symmetry-breaking processes and the approach to uniformly random distributions of tessellation-generating points. We resort to ensemble simulations of tessellations generated by points whose regular positions are perturbed through a Gaussian noise whose variance is given by the parameter α² times the square of the inverse of the average density of points. We analyze the number of sides, the area, and the perimeter of the Voronoi cells. For all values α > 0, hexagons constitute the most common class of cells, and 2-parameter gamma distributions provide an efficient description of the statistical properties of the analyzed geometrical characteristics. The introduction of noise destroys the triangular and square tessellations, which are structurally unstable, as their topological properties are discontinuous at α = 0. On the contrary, the honeycomb hexagonal tessellation is topologically stable and, empirically, all Voronoi cells remain hexagonal for small but finite noise with α < 0.12. For all tessellations and for small values of α, we observe a linear dependence on α of the ensemble mean of the standard deviation of the area and perimeter of the cells. Already for a moderate amount of Gaussian noise (α > 0.5), memory of the specific initial unperturbed state is lost, because the statistical properties of the three perturbed regular tessellations become indistinguishable. When α > 2, results converge to those of Poisson–Voronoi tessellations. The geometrical properties of n-sided cells change with α until the Poisson–Voronoi limit is reached for α > 2; in this limit the Desch law for perimeters is shown not to hold, and a square-root dependence on n is established. This law allows for an easy link to the Lewis law for areas and agrees with exact asymptotic results. Finally, for α > 1, the ensemble means of the cell area and perimeter restricted to the hexagonal cells agree remarkably well with the full ensemble means; this reinforces the idea that hexagons, beyond their ubiquitous numerical prominence, can be interpreted as typical polygons in 2D Voronoi tessellations.
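
A minimal sketch of the kind of experiment described above, assuming the displacement standard deviation scales as α times the mean inter-point spacing (my reading of the nondimensionalization) and showing only the square lattice: perturb the generators, build the Voronoi diagram, count the sides of the interior cells, and fit a 2-parameter gamma distribution to their areas.

```python
import numpy as np
from scipy.spatial import Voronoi
from scipy.stats import gamma

def perturbed_square_lattice(n, alpha, rng):
    """n x n square lattice with unit spacing (mean density 1), each generator
    displaced by isotropic Gaussian noise of standard deviation alpha."""
    x, y = np.meshgrid(np.arange(n, dtype=float), np.arange(n, dtype=float))
    pts = np.column_stack([x.ravel(), y.ravel()])
    return pts + alpha * rng.standard_normal(pts.shape)

def interior_cell_stats(points, margin):
    """Number of sides and area (shoelace formula) of Voronoi cells whose
    generators lie away from the domain boundary, so the cells are bounded."""
    vor = Voronoi(points)
    lo, hi = points.min(0) + margin, points.max(0) - margin
    sides, areas = [], []
    for p, ridx in enumerate(vor.point_region):
        region = vor.regions[ridx]
        if not region or -1 in region:      # skip unbounded cells
            continue
        if np.any(points[p] < lo) or np.any(points[p] > hi):
            continue                         # skip cells distorted by the boundary
        poly = vor.vertices[region]
        x, y = poly[:, 0], poly[:, 1]
        area = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
        sides.append(len(region))
        areas.append(area)
    return np.array(sides), np.array(areas)

rng = np.random.default_rng(1)
pts = perturbed_square_lattice(60, alpha=0.5, rng=rng)
sides, areas = interior_cell_stats(pts, margin=3.0)
print("fraction of hexagonal cells:", np.mean(sides == 6))
print("2-parameter gamma fit (shape, loc=0, scale):", gamma.fit(areas, floc=0.0))
```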

Relevance: 30.00%

Abstract:

We perturb the SC, BCC, and FCC crystal structures with a spatial Gaussian noise whose adimensional strength is controlled by the parameter α, and analyze the topological and metrical properties of the resulting Voronoi tessellations (VT). The topological properties of the VT of the SC and FCC crystals are unstable with respect to the introduction of noise, because the corresponding polyhedra are geometrically degenerate, whereas the tessellation of the BCC crystal is topologically stable even against noise of small but finite intensity. For weak noise, the mean area of the cells of the perturbed BCC and FCC VT increases quadratically with α. In the case of the perturbed SC crystal, there is an optimal amount of noise that minimizes the mean area of the cells. Already for moderate noise (α > 0.5), the properties of the three perturbed VT are indistinguishable, and for intense noise (α > 2), results converge to the Poisson-VT limit. Notably, 2-parameter gamma distributions are an excellent model for the empirical distributions of all considered properties. The VT of the perturbed BCC and FCC structures are local maxima for the isoperimetric quotient, which measures the degree of sphericity of the cells, among space-filling VT. In the BCC case, this suggests a weaker form of the recently disproved Kelvin conjecture. Due to fluctuations in the shape of the cells, anomalous scaling with exponent > 3/2 is observed between the area and the volume of the cells and, except for the FCC case, persists also for α → 0. In the Poisson-VT limit, the exponent is about 1.67. As the number of faces is positively correlated with the sphericity of the cells, the anomalous scaling is heavily reduced when we perform power-law fits separately on cells with a specific number of faces.
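
The isoperimetric quotient mentioned above has a standard definition, Q = 36πV²/S³, which equals 1 for a sphere and decreases as a cell becomes less spherical. A tiny sketch (illustrative values, not the paper's data):

```python
import numpy as np

def isoperimetric_quotient(volume, surface_area):
    """Q = 36*pi*V**2 / S**3: equals 1 for a sphere; the quantity locally
    maximized by the perturbed BCC/FCC Voronoi cells among space-filling VT."""
    return 36.0 * np.pi * volume**2 / surface_area**3

# Reference values from exact geometry:
print("cube (SC Voronoi cell):", isoperimetric_quotient(1.0, 6.0))          # pi/6 ~ 0.524
print("sphere of radius 1:    ", isoperimetric_quotient(4*np.pi/3, 4*np.pi))  # 1.0
```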

Relevance: 30.00%

Abstract:

We investigate the super-Brownian motion with a single point source in dimensions 2 and 3, as constructed by Fleischmann and Mueller in 2004. Using analytic facts, we derive the long-time behavior of the mean in dimensions 2 and 3, thereby complementing previous work of Fleischmann, Mueller, and Vogt. Using spectral theory and martingale arguments, we prove a version of the strong law of large numbers for the two-dimensional superprocess with a single point source and finite variance.

Relevance: 30.00%

Abstract:

The search for ever deeper relationships among the world's languages is bedeviled by the fact that most words evolve too rapidly to preserve evidence of their ancestry beyond 5,000 to 9,000 y. On the other hand, quantitative modeling indicates that some “ultraconserved” words exist that might be used to find evidence for deep linguistic relationships beyond that time barrier. Here we use a statistical model, which takes into account the frequency with which words are used in common everyday speech, to predict the existence of a set of such highly conserved words among seven language families of Eurasia postulated to form a linguistic superfamily that evolved from a common ancestor around 15,000 y ago. We derive a dated phylogenetic tree of this proposed superfamily with a time depth of ∼14,450 y, implying that some frequently used words have been retained in related forms since the end of the last ice age. Words used more than once per 1,000 in everyday speech were 7 to 10 times more likely to show deep ancestry on this tree. Our results suggest a remarkable fidelity in the transmission of some words and give theoretical justification to the search for features of language that might be preserved across wide spans of time and geography.

Relevance: 30.00%

Abstract:

A recently proposed mean-field theory of mammalian cortex rhythmogenesis describes the salient features of electrical activity in the cerebral macrocolumn with the use of inhibitory and excitatory neuronal populations (Liley et al. 2002). This model is capable of producing a range of important human EEG (electroencephalogram) features, such as the alpha rhythm, the 40 Hz activity thought to be associated with conscious awareness (Bojak & Liley 2007), and the changes in EEG spectral power associated with general anesthetic effect (Bojak & Liley 2005). From the point of view of nonlinear dynamics, the model entails a vast parameter space within which multistability, pseudoperiodic regimes, various routes to chaos, fat fractals, and rich bifurcation scenarios occur for physiologically relevant parameter values (van Veen & Liley 2006). The origin and character of this complex behaviour, and its relevance for EEG activity, will be illustrated. The existence of short-lived unstable brain states will also be discussed in terms of the available theoretical and experimental results. A perspective on future analysis will conclude the presentation.

Relevance: 30.00%

Abstract:

We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from an emission-driven rather than concentration-driven perturbed-parameter ensemble of a global climate model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, the land carbon cycle, ocean physics, and aerosol sulphur-cycle processes. We find broader ranges of projected temperature responses when considering emission-driven rather than concentration-driven simulations (with 10–90th percentile ranges of 1.7 K for the aggressive mitigation scenario, up to 3.9 K for the high-end, business-as-usual scenario). A small minority of simulations, resulting from combinations of strong atmospheric feedbacks and carbon-cycle responses, show temperature increases in excess of 9 K under RCP8.5, and in excess of 4 K even under aggressive mitigation (RCP2.6). While the simulations point to much larger temperature ranges for emission-driven experiments, they do not change existing expectations (based on previous concentration-driven experiments) about the timescales over which different sources of uncertainty are important. The new simulations sample a range of future atmospheric concentrations for each emission scenario. For both SRES A1B and the Representative Concentration Pathways (RCPs), the concentration scenario used to drive GCM ensembles lies towards the lower end of our simulated distribution. This design decision (a legacy of previous assessments) is likely to lead concentration-driven experiments to under-sample strong feedback responses in future projections. Our ensemble of emission-driven simulations spans the global temperature response of the CMIP5 emission-driven simulations, except at the low end: combinations of low climate sensitivity and low carbon-cycle feedbacks lead a number of CMIP5 responses to lie below our ensemble range. The ensemble also simulates a number of high-end responses which lie above the CMIP5 carbon-cycle range. These high-end simulations can be linked to sampling a number of stronger carbon-cycle feedbacks and to sampling climate sensitivities above 4.5 K. This latter aspect highlights the priority of identifying real-world climate-sensitivity constraints which, if achieved, would reduce the upper bound of projected global mean temperature change. The ensemble of simulations presented here provides a framework to explore relationships between present-day observables and future changes, while the large spread of projected future changes highlights the ongoing need for such work.
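
For clarity on the statistic quoted in parentheses above, a minimal sketch of how a 10–90th percentile range is computed from an ensemble of projected warming values (the numbers below are synthetic placeholders, not ensemble output):

```python
import numpy as np

def percentile_range(ensemble_warming, lo=10, hi=90):
    """Width of the lo-hi percentile range (K) of an ensemble of projected
    global-mean temperature changes, the statistic quoted in the abstract."""
    p_lo, p_hi = np.percentile(ensemble_warming, [lo, hi])
    return p_hi - p_lo

# Hypothetical end-of-century warming values (K) for one scenario; arbitrary size
rng = np.random.default_rng(42)
warming = rng.normal(loc=4.5, scale=1.5, size=50)
print(f"10-90th percentile range: {percentile_range(warming):.2f} K")
```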

Relevance: 30.00%

Abstract:

In this article, we illustrate experimentally an important consequence of the stochastic component in choice behaviour that has not been acknowledged so far: its potential to produce ‘regression to the mean’ (RTM) effects. We employ a novel approach to individual choice under risk, based on repeated multiple-lottery choices (i.e. choices among many lotteries), to show how the high degree of stochastic variability present in individual decisions can crucially distort certain results through RTM effects. We demonstrate the point in the context of a social comparison experiment.
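
A compact illustration of the RTM mechanism at issue (a generic simulation, not the paper's experimental design): when a noisy first measurement is used to select "extreme" individuals, their second, equally noisy measurement is pulled back towards the grand mean.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000
true_value = rng.normal(0.0, 1.0, n)             # stable individual component
round_1 = true_value + rng.normal(0.0, 1.0, n)   # observed choice = signal + noise
round_2 = true_value + rng.normal(0.0, 1.0, n)   # independent noise, same person

# Select the apparently most "extreme" subjects using round 1 only
extreme = round_1 > np.quantile(round_1, 0.9)
print("round-1 mean of selected group:", round_1[extreme].mean())
print("round-2 mean of the same group:", round_2[extreme].mean())  # closer to 0
```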

Relevance: 30.00%

Abstract:

This study investigates the potential contribution of observed changes in lower stratospheric water vapour to stratospheric temperature variations over the past three decades using a comprehensive global climate model (GCM). Three case studies are considered. In the first, the net increase in stratospheric water vapour (SWV) from 1980 to 2010 (derived from the Boulder frost-point hygrometer record under the gross assumption that it is globally representative) is estimated to have cooled the lower stratosphere by up to ∼0.2 K decade⁻¹ in the global and annual mean; this is ∼40% of the observed cooling trend over this period. In the Arctic winter stratosphere there is a dynamical response to the increase in SWV, with enhanced polar cooling of 0.6 K decade⁻¹ at 50 hPa and warming of 0.5 K decade⁻¹ at 1 hPa. In the second case study, the observed decrease in tropical lower stratospheric water vapour after the year 2000 (imposed in the GCM as a simplified representation of the observed changes derived from satellite data) is estimated to have caused a relative increase in tropical lower stratospheric temperatures of ∼0.3 K at 50 hPa. In the third case study, the wintertime dehydration in the Antarctic stratospheric polar vortex (again using a simplified representation of the changes seen in a satellite dataset) is estimated to cause a relative warming of the Southern Hemisphere polar stratosphere of up to 1 K at 100 hPa from July to October. This is accompanied by a weakening of the westerly winds on the poleward flank of the stratospheric jet by up to 1.5 m s⁻¹ in the GCM. The results show that, if the measurements are representative of global variations, SWV should be considered as important a driver of transient and long-term variations in lower stratospheric temperature over the past 30 years as increases in long-lived greenhouse gases and stratospheric ozone depletion.

Relevance: 30.00%

Abstract:

This paper presents a comparison of various estimates of the open solar flux, deduced from measurements of the interplanetary magnetic field, from the aa geomagnetic index, and from photospheric magnetic field observations. The first two of these estimates are made using the Ulysses discovery that the radial heliospheric field is approximately independent of heliographic latitude; the third makes use of the potential-field source-surface method to map the total flux through the photosphere to the open flux at the top of the corona. The uncertainties associated with using the Ulysses result are 5%, but the effects of the assumptions of the potential-field source-surface method are harder to evaluate. Nevertheless, the three methods give similar results for the last three solar cycles, when the data sets overlap. In 11-year running means, all three methods reveal that 1987 marked a significant peak in the long-term variation of the open solar flux. This peak is close to the solar minimum between sunspot cycles 21 and 22, and consequently the mean open flux (averaged from minimum to minimum) is similar for these two cycles. However, this similarity between cycles 21 and 22 in no way implies that the open flux is constant. The long-term variation shows that these cycles are fundamentally different in that the average open flux was rising during cycle 21 (from consistently lower values in cycle 20 and toward the peak in 1987) but was falling during cycle 22 (toward consistently lower values in cycle 23). The estimates from the geomagnetic aa index are unique in that they extend from 1842 onwards (using the Helsinki extension). This variation gives strong anticorrelations, with very high statistical significance levels, with cosmic ray fluxes and with the abundances of the cosmogenic isotopes that they produce. Thus observations of photospheric magnetic fields, of cosmic ray fluxes, and of cosmogenic isotope abundances all support the long-term drifts in open solar flux reported by Lockwood et al. [1999a, 1999b].
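
The 11-year running means referred to above are ordinary centred moving averages; the same construction gives, for example, a 21-point running mean by changing the window length. A minimal sketch on a synthetic annual series (not the aa or open-flux data):

```python
import numpy as np

def centred_running_mean(x, window=11):
    """Centred running mean of an annual series (window must be odd); the ends
    where the full window does not fit are returned as NaN rather than padded."""
    x = np.asarray(x, dtype=float)
    half = window // 2
    out = np.full_like(x, np.nan)
    kernel = np.ones(window) / window
    out[half:len(x) - half] = np.convolve(x, kernel, mode="valid")
    return out

# Toy annual series from 1842 onwards: broad maximum near 1987 plus an 11-yr cycle
years = np.arange(1842, 2001)
flux = 4.0 + 1.5 * np.exp(-((years - 1987) / 30.0) ** 2) \
           + 0.5 * np.sin(2 * np.pi * (years - 1842) / 11.0)
smoothed = centred_running_mean(flux, window=11)
print("year of peak in the smoothed series:", years[np.nanargmax(smoothed)])
```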

Relevance: 30.00%

Abstract:

Observational analyses of running 5-year ocean heat content trends (Ht) and of the net downward top-of-atmosphere radiation (N) are significantly correlated (r ≈ 0.6) from 1960 to 1999, but a spike in Ht in the early 2000s is likely spurious, since it is inconsistent with estimates of N from both satellite observations and climate model simulations. Variations in N between 1960 and 2000 were dominated by volcanic eruptions and are well simulated by the ensemble mean of coupled models from phase 5 of the Coupled Model Intercomparison Project (CMIP5). We find an observation-based reduction in N of −0.31 ± 0.21 W m⁻² between 1999 and 2005 that potentially contributed to the recent warming slowdown, but the relative roles of external forcing and internal variability remain unclear. While present-day anomalies of N in the CMIP5 ensemble mean and in observations agree, this may be due to a cancellation of errors in outgoing longwave and absorbed solar radiation.
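
A hedged sketch of the basic operation behind "running 5-year trends": a least-squares slope in each 5-year window, then a correlation with the annually resolved N series. The data below are synthetic stand-ins; the study uses observational and CMIP5 products.

```python
import numpy as np

def running_trends(years, series, window=5):
    """Least-squares linear trend in each running window of `window` years,
    reported at the central year of the window."""
    years = np.asarray(years, dtype=float)
    series = np.asarray(series, dtype=float)
    centres, trends = [], []
    for i in range(len(series) - window + 1):
        t = years[i:i + window]
        y = series[i:i + window]
        centres.append(t.mean())
        trends.append(np.polyfit(t, y, 1)[0])   # slope: series units per year
    return np.array(centres), np.array(trends)

# Toy annual series standing in for ocean heat content H and net TOA flux N
rng = np.random.default_rng(3)
years = np.arange(1960, 2000)
N = 0.5 + 0.3 * rng.standard_normal(years.size)
H = np.cumsum(N + 0.2 * rng.standard_normal(years.size))
centres, Ht = running_trends(years, H, window=5)
print("corr(Ht, N):", np.corrcoef(Ht, N[2:-2])[0, 1])  # N trimmed to window centres
```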

Relevance: 30.00%

Abstract:

Introduction: Orthodontic tooth movement uses mechanical forces that result in inflammation during the first days. Myeloperoxidase (MPO) is an enzyme found in polymorphonuclear neutrophil (PMN) granules, and it is used to estimate the number of PMNs in tissues. So far, MPO has not been used to study the inflammatory alterations that follow the application of orthodontic tooth-movement forces. The aim of this study was to determine MPO activity in the gingival crevicular fluid (GCF) and saliva (whole stimulated saliva) of orthodontic patients at different time points after fixed-appliance activation. Methods: MPO was determined in the GCF, collected by means of Periopaper strips, and in the saliva of 14 patients with orthodontic fixed appliances. GCF and saliva samples were collected at baseline, at 2 hours, and at 7 and 14 days after application of the orthodontic force. Results: Mean MPO activity was increased in both the GCF and the saliva of orthodontic patients at 2 hours after appliance activation (P < 0.02 for all comparisons). PMN infiltration into the periodontal ligament caused by the orthodontic force probably accounts for the increased MPO level observed at this time point. Conclusions: MPO might be a good marker for assessing inflammation during orthodontic movement and deserves further study in orthodontic therapy. (Am J Orthod Dentofacial Orthop 2010;138:613-6)

Relevance: 30.00%

Abstract:

Based on previous observational studies of cold extreme events over southern South America, some recent studies suggest a possible relationship between remotely triggered Rossby wave propagation and the occurrence of frost. Using linear theory of Rossby wave propagation, this paper analyzes the propagation of such waves in two different basic states, corresponding to austral winters with maximum and minimum frequency of occurrence of generalized frosts in the Wet Pampa (central-northwest Argentina). To determine the wave trajectories, the ray-tracing technique is used in this study, and some theoretical discussion of this technique is also presented. The analysis of the basic state, from a theoretical point of view and based on the calculated ray tracings, corroborates that remotely excited Rossby waves are the mechanism favoring the maximum occurrence of generalized frosts. The basic state through which the waves propagate determines where they are excited. The Rossby waves are excited in particular regions of the atmosphere and propagate towards South America along the jet streams, which act as waveguides, favoring the generation of generalized frosts. In summary, this paper presents an overview of the ray-tracing technique and how it can be used to investigate an important synoptic event, such as frost in a specific region, and its relationship with the propagation of large-scale planetary waves.
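
For orientation, the standard barotropic machinery usually behind such ray-tracing calculations (a textbook summary in the spirit of Hoskins-and-Karoly-type analyses, not a transcription of the equations used in this paper):

```latex
% Barotropic Rossby waves on a zonal basic state U, with \beta_M the meridional
% gradient of the absolute vorticity of that state:
\begin{align*}
  \omega &= U k - \frac{\beta_M\, k}{k^2 + l^2},\\
  \frac{dx}{dt} &= \frac{\partial \omega}{\partial k}
      = U + \frac{\beta_M\,(k^2 - l^2)}{(k^2 + l^2)^2},\qquad
  \frac{dy}{dt} = \frac{\partial \omega}{\partial l}
      = \frac{2\,\beta_M\, k\, l}{(k^2 + l^2)^2},\\
  \frac{dk}{dt} &= -\frac{\partial \omega}{\partial x},\qquad
  \frac{dl}{dt} = -\frac{\partial \omega}{\partial y},\qquad
  K_s \equiv \sqrt{k^2 + l^2} = \sqrt{\beta_M / U}
      \ \text{ for stationary waves } (\omega = 0).
\end{align*}
% Rays follow the local group velocity; maxima of K_s along the jet streams are
% what allow them to act as waveguides for the remotely excited waves.
```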

Relevance: 30.00%

Abstract:

The aim of this work was to analyse C4 genotypes, C4 protein levels, phenotypes and genotypes in patients with the classical form of 21-hydroxylase deficiency. Fifty-four patients from 46 families (36 female, 18 male; mean age 10.8 years) with different clinical manifestations (31 salt-wasting; 23 simple-virilizing) were studied. TaqI Southern blotting was used for molecular analysis of the C4/CYP21 gene cluster, and genotypes were defined according to gene organization within the RCCX modules. Serum C4 isotypes were assayed by enzyme-linked immunosorbent assay. The results revealed 12 different haplotypes of the C4/CYP21 gene cluster. Total functional activity of the classical pathway (CH50) was reduced in individuals carrying different genotypes because of low C4 concentrations (43% of all patients) resulting from complete or partial C4 allotype deficiency. Thirteen of the 54 patients presented recurrent infections affecting the respiratory and/or urinary tracts, none of them severe. Low C4A or C4B correlated well with monomodular RCCX gene organization, but no association between C4 haplotypes and recurrent infections or autoimmunity was observed. Within this redundant gene cluster, C4 appears to be a gene segment that has been well protected throughout evolution.

Relevance: 30.00%

Abstract:

BACKGROUND: International organisations, e.g. the WHO, stress the importance of competent registered nurses (RNs) for the safety and quality of healthcare systems. Low competence among RNs has been shown to increase the morbidity and mortality of inpatients. OBJECTIVES: To investigate self-reported competence among nursing students at the point of graduation (NSPGs), using the Nurse Professional Competence (NPC) Scale, and to relate the findings to background factors. METHODS AND PARTICIPANTS: The NPC Scale consists of 88 items within eight competence areas (CAs) and two overarching themes. Questions about socio-economic background and the perceived overall quality of the degree programme were added. In total, 1086 NSPGs (mean age 28.1 [range 20-56] years; 87.3% women) from 11 universities/university colleges participated. RESULTS: NSPGs reported significantly higher scores for Theme I, "Patient-Related Nursing", than for Theme II, "Organisation and Development of Nursing Care". Younger NSPGs (20-27 years) reported significantly higher scores for the CAs "Medical and Technical Care" and "Documentation and Information Technology". Female NSPGs scored significantly higher for "Value-Based Nursing". Those who had taken the nursing care programme at upper secondary school before the Bachelor of Science in Nursing (BSN) programme scored significantly higher on "Nursing Care", "Medical and Technical Care", "Teaching/Learning and Support", "Legislation in Nursing and Safety Planning", and on Theme I. Working extra paid hours in healthcare alongside the BSN programme contributed to significantly higher self-reported scores for four CAs and both themes. Clinical courses within the BSN programme contributed to perceived competence to a significantly higher degree than theoretical courses (93.2% vs 87.5% of NSPGs). SUMMARY AND CONCLUSION: Mean scores reported by NSPGs were highest for the four CAs connected with patient-related nursing and lowest for the CAs relating to the organisation and development of nursing care. We conclude that the NPC Scale can be used to identify and measure aspects of self-reported competence among NSPGs.