85 results for Lid-Driven Trapezoidal Enclosure


Relevance: 20.00%

Abstract:

The descent and spreading of high-salinity water generated by salt rejection during sea ice formation in an Antarctic coastal polynya are studied using a hydrostatic, primitive-equation, three-dimensional ocean model, the Proudman Oceanographic Laboratory Coastal Ocean Modeling System (POLCOMS). The polynya is assumed to be rectangular, 100 km long and 30 km wide, and the salinity flux into the polynya at its surface is constant. The model has been run at high horizontal resolution (500 m), and the simulations reveal a buoyancy-driven coastal current. The coastal current is a robust feature and appears in a range of simulations designed to investigate the influence of a sloping bottom, variable bottom drag, variable vertical turbulent diffusivities, higher salinity flux, and an offshore position of the polynya. Bottom drag is shown to be the main factor determining the current width. This coastal current has not been produced by other numerical models of polynyas, which may be because those models were run at coarser resolutions. The coastal current becomes unstable upstream of its front when the polynya is adjacent to the coast. When the polynya is situated offshore, an unstable current is produced from its outset owing to the capture of cyclonic eddies. The effects of a coastal protrusion and a canyon on the current are also investigated; in particular, owing to the convex shape of the coastal protrusion, the current sheds a dipolar eddy.
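As a rough plausibility check on the scales involved, the width of such a buoyancy-driven current is tied to the internal Rossby radius of deformation. The sketch below computes it under assumed illustrative values; only the polynya geometry and the 500 m grid spacing come from the abstract, while the salinity anomaly, depth and latitude are placeholders:

```python
import numpy as np

# Illustrative scale estimate; every input below is an assumption except
# the 500 m grid spacing quoted in the abstract.
g = 9.81          # gravitational acceleration, m s^-2
beta_S = 7.8e-4   # haline contraction coefficient, psu^-1
dS = 0.5          # assumed salinity anomaly of the rejected brine, psu
H = 100.0         # assumed shelf depth, m
f = 1.37e-4       # |Coriolis parameter| at high southern latitude, s^-1

g_prime = g * beta_S * dS              # reduced gravity, m s^-2
L_rossby = np.sqrt(g_prime * H) / f    # internal deformation radius, m
print(f"baroclinic Rossby radius: {L_rossby / 1e3:.1f} km")
print(f"grid points per radius at 500 m spacing: {L_rossby / 500:.0f}")
# ~4-5 km here, i.e. ~9 grid cells: eddies of this scale are resolved,
# consistent with the suggestion that coarser models missed the current.
```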

Relevance: 20.00%

Abstract:

We present an efficient method of combining wide-angle neutron scattering data with detailed atomistic models, allowing us to perform a quantitative and qualitative mapping of the organisation of the chain conformation in both the glass and liquid phases. The structural refinement method presented in this work exploits the intrachain features of the diffraction pattern and their intimate link with atomistic models through the use of internal coordinates for bond lengths, valence angles and torsion rotations. Atomic connectivity is defined through these coordinates, which are in turn assigned from pre-defined probability distributions, allowing the models in question to be built stochastically. Incremental variation of these coordinates allows the construction of models that minimise the differences between the observed and calculated structure factors. We present a series of neutron scattering data for 1,2-polybutadiene in the region 120-400 K. Analysis of the experimental data yields bond lengths for C-C and C=C of 1.54 Å and 1.35 Å respectively. Valence angles of the backbone were found to be 112°, and the torsion distributions are characterised by five rotational states: a three-fold trans-skew± for the backbone and gauche± for the vinyl group. Rotational states of the vinyl group were found to be equally populated, indicating a largely atactic chain. The two backbone torsion angles exhibit different behaviour of their trans populations with respect to temperature, with one of them adopting an almost all-trans sequence. Consequently, the resulting configuration leads to a rather persistent chain, as indicated by the value of the characteristic ratio extrapolated from the model. We compare our results with theoretical predictions, computer simulations, RIS models and previously reported experimental results.
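The stochastic model-building step described here (internal coordinates drawn from pre-defined distributions) can be illustrated compactly. The sketch below grows a backbone using the quoted C-C bond length and 112° valence angle; the torsion-state angles and populations are placeholders, not the fitted distributions:

```python
import numpy as np

rng = np.random.default_rng(0)

def place_atom(a, b, c, r, theta, phi):
    """Place the next atom from bond length r, valence angle theta and
    torsion phi, given the previous three positions (NeRF-style frame)."""
    bc = (c - b) / np.linalg.norm(c - b)
    n = np.cross(b - a, bc)
    n /= np.linalg.norm(n)
    m = np.cross(n, bc)
    d = np.array([-r * np.cos(theta),
                  r * np.sin(theta) * np.cos(phi),
                  r * np.sin(theta) * np.sin(phi)])
    return c + d[0] * bc + d[1] * m + d[2] * n

R_CC, THETA = 1.54, np.radians(112.0)       # values from the abstract
STATES = np.radians([180.0, 65.0, -65.0])   # assumed rotational-state angles
PROBS = [0.6, 0.2, 0.2]                     # assumed trans/skew populations

# Seed the first three backbone carbons, then grow the chain stochastically.
chain = [np.array([0.0, 0.0, 0.0]),
         np.array([R_CC, 0.0, 0.0]),
         np.array([R_CC - R_CC * np.cos(THETA), R_CC * np.sin(THETA), 0.0])]
for _ in range(200):
    phi = rng.choice(STATES, p=PROBS)
    chain.append(place_atom(*chain[-3:], R_CC, THETA, phi))

coords = np.array(chain)
# Characteristic ratio C_n = <R^2> / (n l^2), here for a single realisation;
# a refinement loop would perturb the coordinates to fit the structure factor.
R2 = np.sum((coords[-1] - coords[0]) ** 2)
print("C_n estimate:", R2 / ((len(coords) - 1) * R_CC ** 2))
```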

Relevance: 20.00%

Abstract:

Understanding how climate change can affect crop-pollinator systems helps predict potential geographical mismatches between a crop and its pollinators, and therefore helps identify areas vulnerable to loss of pollination services. We examined the distribution of orchard species (apples, pears, plums and other top fruits) and their pollinators in Great Britain, for present climatic conditions and for conditions projected for 2050 under the SRES A1B emissions scenario. We used a relative index of pollinator availability as a proxy for pollination service. At present there is a large spatial overlap between orchards and their pollinators, but predictions for 2050 reveal that the most suitable areas for orchards correspond to low pollinator availability. However, we found that pollinator availability may persist in areas currently used for fruit production, even though these areas are predicted to provide sub-optimal environmental suitability for orchard species in the future. Our results may be used to identify mitigation options to safeguard orchard production against the risk of pollination failure in Great Britain over the next 50 years; for instance, choosing fruit tree varieties that are adapted to future climatic conditions, or boosting wild pollinators by improving landscape resources. Our approach can be readily applied to other regions and crop systems, and expanded to include different climatic scenarios.
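A minimal sketch of the overlap analysis, assuming hypothetical suitability rasters on a common grid; the paper's actual availability index and thresholds may well differ:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical suitability rasters on a common grid (values in [0, 1]):
# one layer for the orchard crop, one per pollinator species.
crop = rng.random((100, 100))
pollinators = rng.random((12, 100, 100))   # 12 hypothetical species

# Relative pollinator availability: mean suitability across species,
# rescaled to [0, 1] (one plausible proxy; the paper's index may differ).
avail = pollinators.mean(axis=0)
avail = (avail - avail.min()) / (avail.max() - avail.min())

def overlap(crop, avail, crop_thresh=0.5, avail_thresh=0.5):
    """Fraction of crop-suitable cells that also have adequate
    pollinator availability."""
    suitable = crop >= crop_thresh
    return (avail[suitable] >= avail_thresh).mean()

print("present-day overlap:", overlap(crop, avail))
# Repeating the call with 2050-projected rasters would expose any
# spatial mismatch between the crop and its pollinators.
```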

Relevance: 20.00%

Abstract:

Seamless phase II/III clinical trials are conducted in two stages, with treatment selection at the first stage. In the first stage, patients are randomized to a control or to one of k > 1 experimental treatments. At the end of this stage, interim data are analysed and a decision is made about which experimental treatment should continue to the second stage. If the primary endpoint is observable only after some period of follow-up, data may be available at the interim analysis on some early outcome for a larger number of patients than have primary-endpoint data. These early endpoint data can thus be used for treatment selection. For two previously proposed approaches, which method has the greater power has been shown to depend on the true treatment effects and correlations. We propose a new approach that builds on the previously proposed approaches and uses data available at the interim analysis to estimate these parameters, then, on the basis of these estimates, chooses the treatment selection method with the highest probability of correctly selecting the most effective treatment. This method is shown to perform well compared with the two previously described methods over a wide range of true parameter values: its performance is generally similar to, and in some cases better than, that of either of the two previously proposed methods.
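The core idea, choosing between selection rules by their estimated probability of correct selection, can be sketched by simulation. Everything below (normal outcomes, the simple early-endpoint model via rho, the two rules compared) is an illustrative simplification, not the methods compared in the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

def prob_correct_selection(delta, rho, n_early=120, n_final=40, reps=2000):
    """Monte Carlo estimate of the probability that each selection rule
    picks the truly best of k experimental arms at the interim analysis.
    delta: true final-endpoint effects per arm; the early-endpoint effect
    is taken as rho * delta (a simplifying assumption, not the papers')."""
    best = int(np.argmax(delta))
    hits = np.zeros(2)
    for _ in range(reps):
        # Early outcome observed on n_early patients per arm; the final
        # outcome only on the n_final with full follow-up.
        early = rng.normal(rho * np.asarray(delta), 1 / np.sqrt(n_early))
        final = rng.normal(delta, 1 / np.sqrt(n_final))
        hits[0] += np.argmax(early) == best   # rule 1: early endpoint only
        hits[1] += np.argmax(final) == best   # rule 2: final endpoint only
    return hits / reps

# At a real interim analysis, delta and rho would themselves be estimated
# from the accruing data, and the rule with the higher estimated
# probability of correct selection would then be applied.
print(prob_correct_selection(delta=[0.1, 0.3, 0.5], rho=0.4))
```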

Relevance: 20.00%

Abstract:

Urbanization is one of the major forms of habitat alteration occurring at the present time. Although it is typically deleterious to biodiversity, some species flourish within these human-modified landscapes, potentially leading to negative and/or positive interactions between people and wildlife. Hence, up-to-date assessment of urban wildlife populations is important for developing appropriate management strategies. Surveying urban wildlife is limited by land partition and private ownership, rendering many common survey techniques difficult. Garnering public involvement is one solution, but this method is constrained by the inherent biases of non-standardised survey effort associated with voluntary participation. We used a television-led media approach to solicit national participation in an online sightings survey, to investigate changes in the distribution of urban foxes in Great Britain and to explore relationships between urban features and fox occurrence and sightings density. Our results show that media-based approaches can generate a large national database on the current distribution of a recognisable species. Fox distribution in England and Wales has changed markedly within the last 25 years, with sightings submitted from 91% of urban areas previously predicted to support few or no foxes. Data were highly skewed, with 90% of urban areas having <30 fox sightings per 1000 people km-2. The extent of total urban area was the only variable with a significant impact on both fox occurrence and sightings density in urban areas; longitude and percentage of public green urban space were, respectively, significantly positively and negatively associated with sightings density only. Latitude and distance to the nearest neighbouring conurbation had no impact on either occurrence or sightings density. Given the limitations associated with this method, further investigation is needed to determine the association between sightings density and actual fox density, and the variability of fox density within and between urban areas in Britain.
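For the occurrence and density analyses, models of roughly this shape would be typical; the file name, column names and the negative-binomial choice for the skewed sightings counts are all assumptions for illustration, not the study's actual specification:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical per-conurbation table distilled from the sightings database;
# all column names are illustrative.
df = pd.read_csv("urban_fox_sightings.csv")   # hypothetical file

# Occurrence: logistic regression of fox presence on urban features.
occ = smf.glm("fox_present ~ urban_area + longitude + green_space_pct",
              data=df, family=sm.families.Binomial()).fit()

# Sightings density: skewed positive counts, so a negative-binomial model
# with a population offset is one reasonable choice.
dens = smf.glm("sightings ~ urban_area + longitude + green_space_pct",
               data=df, family=sm.families.NegativeBinomial(),
               offset=np.log(df["people_km2"])).fit()

print(occ.summary())
print(dens.summary())
```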

Relevance: 20.00%

Abstract:

We discuss a floating mechanism based on a quasi-magnetic levitation method that can be attached at the endpoint of a robot arm in order to construct a novel redundant robot arm for producing compliant motions. The floating mechanism is composed of magnets and a constraint mechanism, arranged so that the repelling force of the magnets keeps the endpoint part of the mechanism floating stably along the guided motions. Analytical and experimental results show that the proposed floating mechanism can produce stable floating motions with small inertia and viscosity. The results also show that the proposed mechanism can detect small forces applied to the endpoint part, because the friction force of the mechanism is very small.
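To see why a repelling-magnet arrangement can float a guided part, a far-field dipole approximation suffices for a back-of-envelope equilibrium; the magnetic moments and mass below are invented, and the paper's mechanism will differ in detail:

```python
import numpy as np
from scipy.optimize import brentq

MU0 = 4e-7 * np.pi

def dipole_repulsion(z, m1=5.0, m2=5.0):
    """On-axis repelling force between two coaxial magnetic dipoles
    (moments in A*m^2, separation z in m) -- a far-field approximation."""
    return 3 * MU0 * m1 * m2 / (2 * np.pi * z ** 4)

# Floating equilibrium: repulsion balances the weight of the floated part.
mass = 0.2                 # kg, assumed endpoint-part mass
weight = mass * 9.81
z_eq = brentq(lambda z: dipole_repulsion(z) - weight, 1e-3, 0.5)
print(f"equilibrium gap: {z_eq * 1e3:.1f} mm")

# Effective axial stiffness around equilibrium (restoring, N/m):
dz = 1e-6
k = -(dipole_repulsion(z_eq + dz) - dipole_repulsion(z_eq - dz)) / (2 * dz)
print(f"axial stiffness: {k:.1f} N/m")
```

A soft equilibrium of this kind, combined with a nearly frictionless guide, is what lets small external forces produce measurable endpoint displacements.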

Relevance: 20.00%

Abstract:

In this paper, we investigate the possibility of controlling a mobile robot via a sensory-motor coupling that utilizes a diffusion system. For this purpose, we implemented a simulation of the diffusion process of chemicals together with the kinematics of the mobile robot. In contrast to the original Braitenberg vehicle, in which the sensory-motor coupling is tightly realised by hardwiring, our system employs a soft coupling. The mobile robot has two independent sensory-motor units: two sensors are implemented at the front and two motors, one on each side of the robot. The framework used for the sensory-motor coupling is as follows: 1) place two electrodes in the medium; 2) drop amounts of chemicals U and V determined by the distance to the walls and the intensity of the light; 3) place two further electrodes in the medium; 4) measure the concentrations of chemicals U and V at these electrodes to actuate the motors on each side of the robot. The environment consisted of four surrounding walls with a light source located at the centre. Depending on the design parameters and initial conditions, the robot was able to successfully avoid the walls and the light. More interestingly, the diffusion process in the sensory-motor coupling provided the robot with a simple form of memory, which would not have been possible with a control framework based on a hard-wired electric circuit.
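A minimal version of this soft coupling can be reproduced in a few dozen lines: a diffusing medium sits between "drop" electrodes fed by the sensors and "measure" electrodes that drive the wheels. All parameters and electrode placements below are illustrative, and the light source is folded into the wall-proximity signal for brevity:

```python
import numpy as np

# Chemical medium: two fields diffusing and decaying on a 64x64 grid.
N, dt = 64, 1.0
U = np.zeros((N, N))   # chemical U couples the left sensor-motor pair
V = np.zeros((N, N))   # chemical V couples the right pair
D_U, D_V, DECAY = 0.20, 0.15, 0.02

def step_medium(C, D):
    """One explicit diffusion + decay step (5-point Laplacian, no-flux walls)."""
    Cp = np.pad(C, 1, mode="edge")
    lap = Cp[:-2, 1:-1] + Cp[2:, 1:-1] + Cp[1:-1, :-2] + Cp[1:-1, 2:] - 4 * C
    return C + dt * (D * lap - DECAY * C)

DROP_L, DROP_R = (20, 20), (20, 44)   # electrodes where sensor signals enter
MEAS_L, MEAS_R = (44, 20), (44, 44)   # electrodes read to drive the motors

x, y, heading = 2.0, 2.0, 0.3         # robot pose in a 10 m x 10 m walled arena
WHEELBASE, ROBOT_DT = 0.2, 0.01

for _ in range(500):
    # Sensors: crude proximity signals to the lower-left and upper-right walls.
    left_s = max(0.0, 1.0 - min(x, y) / 5.0)
    right_s = max(0.0, 1.0 - min(10.0 - x, 10.0 - y) / 5.0)
    # Drop chemicals in proportion to the sensor readings.
    U[DROP_L] += left_s
    V[DROP_R] += right_s
    # Let the medium evolve: past stimuli keep diffusing toward the
    # measuring electrodes, giving the controller its short-term memory.
    U, V = step_medium(U, D_U), step_medium(V, D_V)
    # Motor commands from the concentrations at the measuring electrodes.
    v_l = 0.05 + 0.5 * U[MEAS_L]
    v_r = 0.05 + 0.5 * V[MEAS_R]
    # Differential-drive kinematics.
    v, w = (v_l + v_r) / 2.0, (v_r - v_l) / WHEELBASE
    x += v * np.cos(heading) * ROBOT_DT
    y += v * np.sin(heading) * ROBOT_DT
    heading += w * ROBOT_DT

print(f"final pose: x={x:.2f} m, y={y:.2f} m, heading={heading:.2f} rad")
```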

Relevance: 20.00%

Abstract:

Background. Current models of concomitant intermittent strabismus, heterophoria, and convergence and accommodation anomalies are either theoretically complex or incomplete. We propose an alternative and more practical way to conceptualize clinical patterns. Methods. In each of three hypothetical scenarios (normal; high AC/A and low CA/C ratios; low AC/A and high CA/C ratios) there can be a disparity-biased or blur-biased “style”, despite identical ratios. We calculated a disparity bias index (DBI) to reflect these biases. We suggest how clinical patterns fit these scenarios and provide early objective data from small illustrative clinical groups. Results. Normal adults and children showed disparity bias (adult DBI 0.43 (95% CI 0.36-0.50), child DBI 0.20 (95% CI 0.07-0.31); p=0.001). Accommodative esotropes showed less disparity bias (DBI 0.03). In the high AC/A and low CA/C scenario, early presbyopes had a mean DBI of 0.17 (95% CI 0.06-0.28), compared with a DBI of -0.31 in convergence-excess esotropes. In the low AC/A and high CA/C scenario, near exotropes had a mean DBI of 0.27, and we predict that non-strabismic, non-amblyopic hyperopes with good vision without spectacles will show lower DBIs. Disparity bias ranged between -1.67 and 1.25. Conclusions. Establishing disparity or blur bias, together with knowing whether convergence to target demand exceeds accommodation or vice versa, explains clinical patterns more effectively than AC/A and CA/C ratios alone. Excessive bias, or inflexibility in near-cue use, increases the risk of clinical problems. We suggest that clinicians look carefully at the details of the accommodation and convergence changes induced by lenses, dissociation and prisms, and use these to plan treatment in relation to the model.

Relevance: 20.00%

Abstract:

Performance modelling is a useful tool in the lifecycle of high-performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model that is illustrative of the computation (and communication) involved in climate models. These predictions are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. This experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented, using the shallow water model as an example.
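In miniature, the benchmark-driven approach amounts to timing the kernel at a few problem sizes and fitting a cost model; the timings below are invented for illustration, and the paper's own finding is precisely that such simple fits can break down (as on Interlagos) once shared resources dominate:

```python
import numpy as np

# Made-up benchmark data: seconds per step at several grid sizes.
sizes = np.array([64, 128, 256, 512])           # grid points per dimension
times = np.array([0.011, 0.042, 0.170, 0.705])  # measured seconds per step

# For a 2D shallow-water kernel the work scales with the number of grid
# points, so model t(n) = a + b * n^2 and fit a, b by least squares.
A = np.vstack([np.ones_like(sizes), sizes.astype(float) ** 2]).T
(a, b), *_ = np.linalg.lstsq(A, times, rcond=None)

predict = lambda n: a + b * n ** 2
print(f"predicted time for n=1024: {predict(1024):.3f} s")
```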

Relevance: 20.00%

Abstract:

This paper considers the dynamics of deposition around and across the causewayed enclosure at Etton, Cambridgeshire. As a result of detailed re-analysis (particularly refitting) of the pottery and flint assemblages from the site, it proved possible to shed new light both on the temporality of occupation and the character of deposition there. Certain aspects of our work challenge previous interpretations of the site, and of causewayed enclosures in general; but, just as importantly, others confirm materially what has previously been suggested. The quantities of material deposited at Etton reveal that the enclosure was occupied only very intermittently and certainly less regularly than other contemporary sites in the region. The spatial distribution of material suggests that the enclosure ditch lay open for the entirety of the monument's life, but that acts of deposition generally focused on a specific part of the monument at any one time. As well as enhancing our knowledge of one particular causewayed enclosure, it is hoped that this paper – in combination with our earlier analysis of the pit site at Kilverstone – makes clear the potential that detailed material analysis has to offer in relation to our understanding of the temporality of occupation on prehistoric sites in general.

Relevance: 20.00%

Abstract:

Basic concepts of the form of high-latitude ionospheric flows and of their excitation and decay are discussed in the light of recent high time-resolution measurements made by ground-based radars. It is first pointed out that it is in principle impossible to adequately parameterize these flows by any single quantity derived from concurrent interplanetary conditions. Rather, even at its simplest, the flow must be considered to consist of two basic time-dependent components. The first is the flow driven by magnetopause coupling processes alone, principally by dayside reconnection. These flows may indeed be reasonably parameterized in terms of concurrent near-Earth interplanetary conditions, principally the interplanetary magnetic field (IMF) vector. The second is the flow driven by tail reconnection alone. As a first approximation these flows may also be parameterized in terms of interplanetary conditions, principally the north-south component of the IMF, but with a delay in the flow response of around 30-60 min relative to the IMF. A delay in the tail response of this order must be present owing to the finite speed of information propagation in the system, and we show how "growth" and "decay" of the field and flow configuration then follow as natural consequences. To discuss the excitation and decay of the two reconnection-driven components of the flow we introduce the concept of a flow-free equilibrium configuration for a magnetosphere containing a given (arbitrary) amount of open flux. Reconnection events act either to create or to destroy open flux, thus causing departures of the system from the equilibrium configuration. Flow is then excited which moves the system back towards equilibrium with the changed amount of open flux. We estimate that the overall time scale associated with the excitation and decay of the flow is about 15 min. The response of the system to both impulsive (flux transfer event) and continuous reconnection is discussed in these terms.
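Stated as equations, this two-component picture reduces to bookkeeping of open flux plus relaxation toward equilibrium. The following is a minimal formulation consistent with the abstract, in our notation rather than necessarily the authors':

\[
\frac{dF_{\mathrm{open}}}{dt} = \Phi_D(t) - \Phi_N(t - \tau_N), \qquad \tau_N \approx 30\text{--}60\ \mathrm{min},
\]

where \(\Phi_D\) and \(\Phi_N\) are the dayside and tail reconnection voltages, and the flow \(v\) is excited only while the configuration departs from the flow-free equilibrium for the current open flux:

\[
\frac{\partial v}{\partial t} \approx -\frac{v - v_{\mathrm{eq}}(F_{\mathrm{open}})}{\tau_f}, \qquad v_{\mathrm{eq}} = 0, \quad \tau_f \approx 15\ \mathrm{min}.
\]

On this reading, each creation or destruction of open flux launches a burst of flow that decays on the ~15 min timescale quoted in the abstract.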

Relevance: 20.00%

Abstract:

The terrestrial magnetopause underwent considerable sudden changes in its location on 9–10 September 1978. These magnetopause motions were accompanied by disturbances of the geomagnetic field on the ground. We present a study of the magnetopause motions and the ground magnetic signatures using, for the latter, 10 s averaged data from 14 high-latitude ground magnetometer stations. Observations in the solar wind (from IMP 8) are employed, and the motions of the magnetopause are monitored directly by the spacecraft ISEE 1 and 2. With these coordinated observations we are able to show that it is the sudden changes in the solar wind dynamic pressure that are responsible for the disturbances seen on the ground. At some ground stations we see evidence of a “ringing” of the magnetospheric cavity, while at others only the initial impulse is evident. We note that at some stations the field perturbations closely match the hypothesized ground signatures of flux transfer events. In accordance with more recent work in the area (e.g. Potemra et al., 1989, J. Geophys. Res., in press), we argue that causes other than impulsive reconnection may produce the twin ionospheric flow vortex originally proposed as a flux transfer event signature.
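The causal link invoked here follows from standard Chapman-Ferraro pressure balance (not derived in the abstract itself): the dipole field falls off as \(r^{-3}\), so balancing magnetic pressure against solar wind dynamic pressure gives

\[
\frac{B_0^2}{2\mu_0}\left(\frac{R_E}{r_{mp}}\right)^{6} \sim \rho_{sw} v_{sw}^2
\quad\Rightarrow\quad
r_{mp} \propto \left(\rho_{sw} v_{sw}^2\right)^{-1/6},
\]

so, for example, a fourfold jump in dynamic pressure pushes the magnetopause inward by a factor of about \(4^{1/6} \approx 1.26\), a sudden displacement of exactly the kind monitored here by ISEE 1 and 2.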

Relevance: 20.00%

Abstract:

Neural stem cells (NSCs) are early precursors of neuronal and glial cells. NSCs are capable of generating identical progeny through virtually unlimited numbers of cell divisions (cell proliferation), producing daughter cells committed to differentiation. Nuclear factor kappa B (NF-kappaB) is an inducible, ubiquitous transcription factor also expressed in neurones, glia and neural stem cells. Recently, several pieces of evidence have been provided for a central role of NF-kappaB in the control of NSC proliferation. Here, we propose a novel mathematical model for NF-kappaB-driven proliferation of NSCs. We reconstruct the molecular pathway of activation and inactivation of NF-kappaB and its influence on cell proliferation with a system of nonlinear ordinary differential equations, and then use a combination of analytical and numerical techniques to study the model dynamics. The results obtained are illustrated by computer simulations and are, in general, in accordance with biological findings reported by several independent laboratories. The model is able both to explain and to predict experimental data. Understanding the proliferation mechanisms of NSCs may provide a novel outlook for both potential therapeutic approaches and basic research.
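As a flavour of what such a system looks like, here is a toy activation-inactivation loop with NF-kappaB-gated logistic growth; the variables, rate constants and functional forms are invented for illustration and are not the paper's equations:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy caricature: Nc = cytoplasmic NF-kB:IkB complex, Nn = nuclear NF-kB,
# I = free IkB inhibitor, C = cell count. All rates are invented.
def rhs(t, y, stim=1.0, k1=0.8, k2=1.5, k3=0.6, k4=0.4,
        r=0.05, K=0.3, Cmax=1e4):
    Nc, Nn, I, C = y
    act = k1 * stim * Nc        # release from IkB and nuclear import
    inact = k2 * I * Nn         # re-sequestration by inhibitor
    dNc = inact - act
    dNn = act - inact
    dI = k3 * Nn - k4 * I       # NF-kB-induced IkB synthesis, turnover
    dC = r * C * Nn / (K + Nn) * (1 - C / Cmax)  # NF-kB-gated logistic growth
    return [dNc, dNn, dI, dC]

sol = solve_ivp(rhs, (0.0, 200.0), y0=[1.0, 0.0, 0.2, 100.0],
                dense_output=True, max_step=0.5)
print("final nuclear NF-kB:", sol.y[1, -1])
print("final cell count:", sol.y[3, -1])
```

The negative feedback (nuclear NF-kB induces its own inhibitor) is the qualitative ingredient that lets models of this family reproduce transient or oscillatory activation driving proliferation.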

Relevance: 20.00%

Abstract:

The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating-point units (as in the AMD Bulldozer), which means that access times depend on the mapping of application tasks and on a core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and using non-standard task-to-core mappings can dramatically alter performance. Discovering this, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular-grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between results as necessary.
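Skeleton of that composition step: benchmark the two work types separately, then interpolate each for the deployment of interest and sum. All numbers below are placeholders rather than XE6 measurements, and the halo-size formula assumes two 8-byte fields per exchange:

```python
import numpy as np

# Compute kernel: seconds per timestep vs. local subdomain points,
# benchmarked for one core-mapping scenario (placeholder data).
pts = np.array([1e4, 4e4, 1.6e5, 6.4e5])
t_compute = np.array([0.002, 0.009, 0.038, 0.155])

# Halo exchange: seconds per step vs. message size in bytes (placeholders).
msg = np.array([1e3, 4e3, 1.6e4, 6.4e4])
t_halo = np.array([1.0e-5, 1.8e-5, 5.0e-5, 1.7e-4])

def predict(nx, ny, px, py):
    """Predicted time per step for an nx*ny grid on a px*py decomposition,
    interpolating the two benchmark curves (log-log interpolation would be
    a natural refinement over wider ranges)."""
    local = (nx / px) * (ny / py)                  # points per task
    halo_bytes = 2 * 8 * ((nx / px) + (ny / py))   # 2 fields, 8-byte values
    return (np.interp(local, pts, t_compute)
            + np.interp(halo_bytes, msg, t_halo))

print(f"512x512 on 4x4 tasks: {predict(512, 512, 4, 4) * 1e3:.2f} ms/step")
```

Re-running the benchmarks per core-mapping scenario (packed vs. under-populated nodes, different affinities) gives one curve per scenario, which is how the model captures the resource-sharing effects described above.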