969 results for ZERO-RANGE PROCESS
Abstract:
The delineation of Geomorphic Process Units (GPUs) aims to quantify past, current and future geomorphological processes and the sediment flux associated with them. Five GPUs have been identified for the Okstindan area of northern Norway, derived from the combination of Landsat satellite imagery (TM and ETM+) with stereo aerial photographs (used to construct a Digital Elevation Model) and ground survey. The Okstindan study area is sub-arctic and mountainous and is dominated by glacial and periglacial processes. The GPUs exclude the glacial system (some 37% of the study area) and hence focus upon periglacial and colluvial processes. The identified GPUs are: 1. solifluction and rill erosion; 2. talus creep, slope wash and rill erosion; 3. accumulation of debris by rock and boulder fall; 4. rockwalls; and 5. stable ground with dissolved transport. The GPUs have been applied to a ‘test site’ within the study area in order to illustrate their potential for mapping the spatial distribution of geomorphological processes. The test site is a catchment representative of the range of geomorphological processes identified.
Abstract:
The triggering of convective orographic rainbands by small-scale topographic features is investigated through observations of a banded precipitation event over the Oregon Coastal Range and simulations using a cloud-resolving numerical model. A quasi-idealized simulation of the observed event reproduces the bands seen in the radar observations, indicating the model’s ability to capture the physics of the band-formation process. Additional idealized simulations reinforce that the bands are triggered by lee waves that form in the flow past small-scale topographic obstacles just upstream of the nominal leading edge of the orographic cloud. Whether a topographic obstacle in this region is able to trigger a strong rainband depends on the phase of its lee wave at cloud entry. Convective growth occurs only downstream of obstacles whose lee-wave-induced displacements create positive vertical velocity anomalies w_c and nearly zero buoyancy anomalies b_c as air parcels undergo saturation. This relationship is quantified through a simple analytic condition involving w_c, b_c, and the static stability N_m^2 of the cloud mass. Once convection is triggered, horizontal buoyancy gradients in the cross-flow direction generate circulations that align the bands parallel to the flow direction.
Abstract:
Major General Orde Wingate was a highly controversial figure in his time and remains so among historians. However, his eccentric and colourful personality has drawn attention away from the nature of his military ideas, the most important of which was his concept of long-range penetration. This concept originated in his observations during his operations in Italian-occupied Ethiopia in 1941 and evolved into the model he put into practice in the Chindit operations in Burma in 1943-44. A review of Wingate's own official writings on this subject reveals that long-range penetration combined local guerrilla irregulars, purpose-trained regular troops and airpower into large-scale offensive operations deep in the enemy rear, with the intention of disrupting the enemy's planning process and creating situations that regular forces could exploit. The concept evolved organically from Major General Colin Gubbins' doctrine for guerrilla resistance in enemy-occupied areas, and bears some resemblance to the operational model applied by US and Allied forces after September 2001.
Abstract:
This commentary seeks to complement the contribution of the Building Research & Information special issue on 'Developing Theories for the Built Environment' (2008) by highlighting the important role of middle-range theories within the context of professional practice. Middle-range theories provide a form of theorizing that lies between abstract grand theorizing and atheoretical local descriptions. They are also characterized by the way in which they directly engage with the concerns of practitioners. In the context of professional practice, any commitment to theorizing should habitually be combined with an equivalent commitment to empirical research; rarely is it appropriate to neglect one in favour of the other. Any understanding of the role that theory plays in professional practice must further be informed by Schön's seminal ideas on reflective practice. Practitioners are seen to utilize theories as inputs to a process of continuous reflection, thereby guarding against complacency and routinization. The authors would challenge any assumption that academics alone are responsible for generating theories, thereby limiting the role of practitioners to their application. Such a dichotomized view is contrary to established ideas on Mode 2 knowledge production and current trends towards co-production research in the context of the built environment.
Abstract:
The applications of rheology to the main processes encountered during breadmaking (mixing, sheeting, fermentation and baking) are reviewed, together with the most commonly used rheological test methods and their relationships to product functionality. It is shown that the most commonly used method for rheological testing of doughs, shear oscillation dynamic rheology, is generally used under deformation conditions inappropriate for breadmaking and shows little relationship with end-use performance. The frequency range used in conventional shear oscillation tests is limited to the plateau region, which is insensitive to changes in the HMW glutenin polymers thought to be responsible for variations in baking quality. The appropriate deformation conditions can be accessed either by long-time creep or relaxation measurements, or by large-deformation extensional measurements at low strain rates and elevated temperatures. The molecular size and structure of the gluten polymers that make up the major structural components of wheat are related to their rheological properties via modern polymer rheology concepts. Interactions between polymer chain entanglements and branching are seen to be the key mechanisms determining the rheology of HMW polymers. Recent work confirms that the dynamic shear plateau modulus is essentially independent of variations in the MW of glutens amongst wheat varieties of varying baking performance, and that it is not the size of the soluble glutenin polymers but the secondary structural and rheological properties of the insoluble polymer fraction that are mainly responsible for variations in baking performance. Extensional strain hardening has been shown to be a sensitive indicator of entanglements and long-chain branching in HMW polymers, and correlates well with the baking performance of bread doughs. The Considère failure criterion for instability in extension of polymers defines a region below which bubble walls become unstable, and predicts that when strain hardening falls below a value of around 1, bubble walls are no longer stable and coalesce rapidly, resulting in loss of gas retention, lower loaf volume and poorer texture. Strain hardening in doughs has been shown to reach this value at increasingly higher temperatures for better breadmaking varieties and is directly related to bubble stability and baking performance. (C) 2003 Elsevier Ltd. All rights reserved.
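For orientation, the Considère condition referred to above can be written compactly. The exponential stress-strain fit below is one common way of defining a strain hardening index for dough in extension; this is a sketch of the standard argument, not necessarily the exact formulation used in the work reviewed.

\[
\frac{d\sigma}{d\varepsilon} > \sigma \quad \text{(stable uniform extension)}; \qquad
\sigma = k\,e^{\,n\varepsilon} \;\Rightarrow\; \frac{d\sigma}{d\varepsilon} = n\,\sigma ,
\]

so homogeneous thinning of a bubble wall (true stress \(\sigma\), Hencky strain \(\varepsilon\)) remains stable only while the strain hardening index \(n\) exceeds about 1, consistent with the failure value quoted above.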
Abstract:
Dielectric properties of 16 process cheeses were determined over the frequency range 0.3-3 GHz. The effect of temperature on the dielectric properties of the process cheeses was investigated at intervals of 10 degrees C between 5 and 85 degrees C. Results showed that the dielectric constant (ε′) decreased gradually as frequency increased, for all cheeses. The dielectric loss factor (ε″) decreased from above 125 to below 12 as frequency increased. ε′ was highest at 5 degrees C and generally decreased up to a temperature between 55 and 75 degrees C. ε″ generally increased with increasing temperature for high and medium moisture/fat ratio cheeses. For the low moisture/fat ratio cheese, ε″ decreased with temperature between 5 and 55 degrees C and then increased. Partial least squares regression models indicated that ε′ and ε″ could be used in a quality control screening application to measure the moisture content and inorganic salt content of process cheese, respectively. (c) 2005 Elsevier Ltd. All rights reserved.
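As an illustration of the screening idea described above, the sketch below shows how a partial least squares model could relate measured dielectric spectra to moisture content. The data shapes, variable names and placeholder values are assumptions made for the example, not the authors' data or code.

```python
# Illustrative sketch only: relating dielectric spectra of process cheese to
# moisture content with partial least squares (PLS) regression.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Hypothetical data: one row per cheese sample, one column per frequency point
# (e.g. epsilon' or epsilon'' sampled between 0.3 and 3 GHz).
X = rng.random((16, 50))              # dielectric spectra (placeholder values)
y = rng.uniform(40.0, 60.0, size=16)  # moisture content in % (placeholder values)

pls = PLSRegression(n_components=3)                # number of latent variables to tune
y_cv = cross_val_predict(pls, X, y, cv=4).ravel()  # cross-validated predictions
rmse = float(np.sqrt(np.mean((y - y_cv) ** 2)))
print(f"Cross-validated RMSE: {rmse:.2f} % moisture")
```

With real spectra in X, the cross-validated error gives a quick check of whether ε′ (or ε″) carries enough information for screening-level prediction.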
Abstract:
Models play a vital role in supporting a range of activities in numerous domains. We rely on models to support the design, visualisation, analysis and representation of parts of the world around us, and as such significant research effort has been invested into numerous areas of modelling, including support for model semantics, dynamic states and behaviour, and temporal data storage and visualisation. Whilst these efforts have increased our capabilities and allowed us to create increasingly powerful software-based models, the process of developing models, supporting tools and/or data structures remains difficult, expensive and error-prone. In this paper we identify from the literature the key factors in assessing a model’s quality and usefulness: semantic richness, support for dynamic states and object behaviour, and temporal data storage and visualisation. We also identify a number of shortcomings in both existing modelling standards and model development processes, and propose a unified generic process to guide users through the development of semantically rich, dynamic and temporal models.
Abstract:
School effectiveness is a microtechnology of change. It is a relay device that transfers macro policy into everyday processes and priorities in schools. It is part of the growing apparatus of performance evaluation. Change is brought about by a focus on the school as a site-based system to be managed. There has been corporate restructuring in response to the changing political economy of education. There are now new work regimes and radical changes in organizational cultures. Education, like other public services, is now characterized by a range of structural realignments, new relationships between purchasers and providers and new coalitions between management and politics. In this article, we will argue that the school effectiveness movement is an example of new managerialism in education. It is part of an ideological and technological process to industrialize educational productivity. That is to say, the emphasis on standards and standardization is evocative of production regimes drawn from industry. There is a belief that education, like other public services, can be managed to ensure optimal outputs and zero defects in the educational product.
Abstract:
The development of architecture and the settlement is central to discussions concerning the Neolithic transformation, as the very visible evidence for the changes in society that run parallel to the domestication of plants and animals. Architecture has been used as an important aspect of models of how the transformation occurred, and as evidence for the sharp difference between hunter-gatherer and farming societies. We suggest that the emerging evidence for considerable architectural complexity from the early Neolithic indicates that some of our interpretations depend too much on a very basic understanding of structures. Such structures are normally seen as being primarily for residential purposes and as containing households, which become the organising principle for the new communities that are often seen as fully sedentary and described as villages. Recent work in southern Jordan suggests that in this region at least there is little evidence for a standard house, and that structures were constructed for a range of diverse primary purposes other than simple domestic shelters.
Abstract:
Current methods for estimating vegetation parameters are generally sub-optimal in the way they exploit information and do not generally consider uncertainties. We look forward to a future where operational data assimilation schemes improve estimates by tracking land surface processes and exploiting multiple types of observations. Data assimilation schemes seek to combine observations and models in a statistically optimal way, taking into account uncertainty in both, but have not yet been much exploited in this area. The EO-LDAS scheme and prototype, developed under ESA funding, is designed to exploit the anticipated wealth of data that will be available under GMES missions, such as the Sentinel family of satellites, to provide improved mapping of land surface biophysical parameters. This paper describes the EO-LDAS implementation and explores some of its core functionality. EO-LDAS is a weak constraint variational data assimilation system. The prototype provides a mechanism for constraint based on a prior estimate of the state vector, a linear dynamic model, and Earth Observation data (top-of-canopy reflectance here). The observation operator is a non-linear optical radiative transfer model for a vegetation canopy with a soil lower boundary, operating over the range 400 to 2500 nm. Adjoint codes for all model and operator components are provided in the prototype by automatic differentiation of the computer codes. In this paper, EO-LDAS is applied to the problem of daily estimation of six of the parameters controlling the radiative transfer operator over the course of a year (> 2000 state vector elements). Zero- and first-order process model constraints are implemented and explored as the dynamic model. The assimilation estimates all state vector elements simultaneously. This is performed in the context of a typical Sentinel-2 MSI operating scenario, using synthetic MSI observations simulated with the observation operator, with uncertainties typical of those expected for such optical sensors. The experiments consider a baseline state vector estimation case in which dynamic constraints are applied, and assess the impact of the dynamic constraints on the a posteriori uncertainties. The results demonstrate that reductions in uncertainty by a factor of up to two might be obtained by applying the sorts of dynamic constraints used here. The hyperparameters (dynamic model uncertainties) required to control the assimilation are estimated by a cross-validation exercise. The result of the assimilation is seen to be robust to missing observations, even with quite large data gaps.
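For context, a generic weak-constraint variational cost function of the kind a system like EO-LDAS minimises can be sketched as follows; the exact terms, weighting and notation in the prototype may differ, so this is an illustrative form rather than the paper's own equation.

\[
J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}_0-\mathbf{x}_b)^{\top}\mathbf{B}^{-1}(\mathbf{x}_0-\mathbf{x}_b)
+ \tfrac{1}{2}\sum_{t}\bigl(\mathbf{y}_t - H(\mathbf{x}_t)\bigr)^{\top}\mathbf{R}_t^{-1}\bigl(\mathbf{y}_t - H(\mathbf{x}_t)\bigr)
+ \tfrac{1}{2}\sum_{t}\bigl(\mathbf{x}_{t+1} - M(\mathbf{x}_t)\bigr)^{\top}\mathbf{Q}^{-1}\bigl(\mathbf{x}_{t+1} - M(\mathbf{x}_t)\bigr)
\]

Here \(\mathbf{x}_b\) is the prior state, \(H\) the radiative transfer observation operator, \(M\) the (zero- or first-order) dynamic model, and \(\mathbf{B}\), \(\mathbf{R}_t\), \(\mathbf{Q}\) the prior, observation and model-error covariances. The third term is what makes the constraint "weak", and the adjoint codes supply the gradient of \(J\) needed for the minimisation.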
Abstract:
Housing in the UK accounts for 30.5% of all energy consumed and is responsible for 25% of all carbon emissions. The UK Government’s Code for Sustainable Homes requires all new homes to be zero carbon by 2016. The development and widespread diffusion of low and zero carbon (LZC) technologies is recognised as a key solution for housing developers to deliver against this zero-carbon agenda. The innovation challenge of designing and incorporating these technologies into housing developers’ standard design and production templates will bring with it significant technical and commercial risks. In this paper we report early results from an ongoing Engineering and Physical Sciences Research Council project looking at the innovation logic and trajectory of LZC technologies in new housing. The principal theoretical lens for the research is the socio-technical network approach, which considers actors’ interests and interpretative flexibilities of technologies, and how they negotiate and reproduce ‘acting spaces’ to shape, in this case, the selection and adoption of LZC technologies. The initial findings reveal that the technology networks forming around new housing developments are very complex, involving a range of actors and viewpoints that vary for each housing development.
Abstract:
The UK Government is committed to all new homes being zero-carbon from 2016. The use of low and zero carbon (LZC) technologies is recognised by housing developers as being a key part of the solution to deliver against this zero-carbon target. The paper takes as its starting point that the selection of new technologies by firms is not a phenomenon which takes place within a rigid sphere of technical rationality (for example, Rip and Kemp, 1998). Rather, technology forms and diffusion trajectories are driven and shaped by myriad socio-technical structures, interests and logics. A literature review is offered to contribute to a more critical and systemic foundation for understanding the socio-technical features of the selection of LZC technologies in new housing. The problem is investigated through a multidisciplinary lens consisting of two perspectives: technological and institutional. The synthesis of the perspectives crystallises the need to understand that the selection of LZC technologies by housing developers is not solely dependent on technical or economic efficiency, but on the emergent ‘fit’ between the intrinsic properties of the technologies, institutional logics and the interests and beliefs of various actors in the housing development process.
Abstract:
We wish to characterize when a Lévy process $X_t$ crosses boundaries $b(t)$, in a two-sided sense, for small times $t$, where $b(t)$ satisfies very mild conditions. An integral test is furnished for computing the value of $\limsup_{t \to 0} |X_t|/b(t) = c$. In some cases, we also specify a function $b(t)$ in terms of the Lévy triplet, such that $\limsup_{t \to 0} |X_t|/b(t) = 1$.
Abstract:
In this paper we use molecular dynamics to answer a classical question: how does the surface tension on a liquid/gas interface appear? After defining surface tension from first principles and performing several consistency checks, we perform a dynamic experiment with a single simple liquid nanodroplet. At time zero, we remove all molecules of the interfacial layer, creating a fresh bare interface with the bulk arrangement of molecules. After that the system evolves towards equilibrium, and the expected surface tension is re-established. We found that the system relaxation consists of three distinct stages. First, the mechanical balance is quickly re-established. During this process the notion of surface tension is meaningless. In the second stage, the surface tension equilibrates, and the density profile broadens to a value which we call the “intrinsic” interfacial width. During the third stage, the density profile continues to broaden due to capillary wave excitations, which does not, however, affect the surface tension. We have observed this scenario for a monatomic Lennard-Jones (LJ) liquid as well as for binary LJ mixtures at different temperatures, monitoring a wide range of physical observables.
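As background to "defining surface tension from first principles", the standard mechanical (Kirkwood-Buff type) definition is sketched below. The abstract does not state which estimator the authors use, so this is the conventional route rather than their specific formulation.

\[
\gamma = \int \bigl[\,p_N(z) - p_T(z)\,\bigr]\,dz ,
\]

where \(p_N\) and \(p_T\) are the normal and tangential components of the pressure tensor across a planar interface. For a droplet, the same idea is often expressed through the Young-Laplace relation \(\Delta p = 2\gamma/R\) between the pressure jump across the curved interface and its radius \(R\).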
Abstract:
The aim of this work was to investigate the aggregation behavior of lipopeptides in single and mixed solutions over a wide range of concentrations, in order to optimize their separation and purification by a two-step ultrafiltration process using large pore size membranes (up to MWCO = 300 kDa). Micelle size was determined by dynamic light scattering. In single solutions, both surfactin and mycosubtilin formed micelles whose size depended on concentration, with average diameters of 5–105 nm for surfactin and 8–18 nm for mycosubtilin. However, when the two lipopeptides were present in the same solution they formed mixed micelles of a different size (d = 8 nm), and probably a different conformation, from those formed by the individual lipopeptides, which prevents their separation according to size. These lipopeptides were purified from fermentation culture by the two-step ultrafiltration process using membranes with MWCO ranging from 10 to 300 kDa. This led to their effective rejection in the first ultrafiltration step by membranes with MWCO = 10–100 kDa, but poor rejection by the 300 kDa membrane. The lipopeptides were recovered at 90% purity (in relation to protein) and with an enrichment factor of 2.34 in the permeate of the second ultrafiltration step with the 100 kDa membrane upon addition of 75% ethanol.
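For reference, "rejection" in the ultrafiltration steps is conventionally quantified as the observed rejection coefficient below; this is the standard definition rather than a formula quoted from the paper.

\[
R_{\mathrm{obs}} = 1 - \frac{C_p}{C_f} ,
\]

where \(C_p\) and \(C_f\) are the lipopeptide concentrations in the permeate and the feed, respectively. \(R_{\mathrm{obs}} \approx 1\) indicates that the micelles are retained by the membrane, while \(R_{\mathrm{obs}} \approx 0\) indicates that they pass through.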