86 results for Simulation and modelling


Relevance:

90.00%

Publisher:

Abstract:

In this work a hybrid technique combining probabilistic and optimization-based methods is presented. The method is applied, both in simulation and in real-time experiments, to the heating unit of a Heating, Ventilation and Air Conditioning (HVAC) system. It is shown that the addition of the probabilistic approach improves fault diagnosis accuracy.
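The abstract does not detail the underlying models, so the sketch below is only a generic illustration of pairing an optimization-based parameter fit with a probabilistic (Bayesian) update over fault hypotheses; the first-order heater model, the fault classes and their assumed gains are all hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated heating-coil data: control input u and measured supply temperature y.
u = rng.uniform(0.0, 1.0, 200)
y = np.empty(200)
y[0] = 20.0
for k in range(199):
    y[k + 1] = 0.9 * y[k] + 3.0 * u[k] + rng.normal(0.0, 0.1)

def residual_norm(theta):
    """Optimization-based step: fit a first-order heater model
    y[k+1] = a*y[k] + b*u[k] and return the prediction-error norm."""
    a, b = theta
    y_pred = np.empty_like(y)
    y_pred[0] = y[0]
    for k in range(len(y) - 1):
        y_pred[k + 1] = a * y_pred[k] + b * u[k]
    return np.linalg.norm(y - y_pred)

# Best-fit parameters for the current data window.
theta_hat = minimize(residual_norm, x0=[0.5, 1.0]).x

# Probabilistic step: Bayesian update over hypothetical fault classes,
# using Gaussian likelihoods centred on each class's assumed heater gain b.
faults = {"healthy": 3.0, "valve_stuck": 0.5, "fouled_coil": 1.5}   # assumed gains
prior = np.array([0.8, 0.1, 0.1])
lik = np.array([np.exp(-0.5 * ((theta_hat[1] - b) / 0.5) ** 2) for b in faults.values()])
posterior = prior * lik / np.sum(prior * lik)
print(dict(zip(faults, posterior.round(3))))
```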

Relevance:

90.00%

Publisher:

Abstract:

A large and complex IT project may involve multiple organizations and be constrained within a temporal period. An organization is a system comprising people, activities, processes, information, resources and goals. Understanding and modelling such a project and its interrelationships with the relevant organizations are essential for organizational project planning. This paper introduces the problem articulation method (PAM) as a semiotic method for organizational infrastructure modelling. PAM offers a suite of techniques that enables the articulation of business, technical and organizational requirements, delivering an infrastructural framework to support the organization. It works by eliciting and formalizing organizational elements (e.g. processes, activities, relationships, responsibilities, communications, resources, agents, dependencies and constraints) and mapping these abstractions to represent the manifestation of the "actual" organization. Many analysts forgo organizational modelling methods and rely on localized, ad hoc point solutions, but these are not amenable to organizational infrastructure modelling. A case study of the Infrared Atmospheric Sounding Interferometer (IASI) is used to demonstrate the applicability of PAM and to examine its relevance and significance in dealing with innovation and change in organizations.
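As a purely illustrative aside (not PAM's actual notation or the IASI case study data), the following Python sketch shows the kind of agent/activity/dependency abstractions such a method elicits, and a trivial check a planner might run over them; all names are invented.

```python
from dataclasses import dataclass, field

# Hypothetical, minimal representation of the abstractions an articulation method
# elicits (agents, activities, resources, dependencies).

@dataclass
class Agent:
    name: str
    responsibilities: list[str] = field(default_factory=list)

@dataclass
class Activity:
    name: str
    performed_by: Agent
    uses: list[str] = field(default_factory=list)               # resources
    depends_on: list["Activity"] = field(default_factory=list)  # constraints

calibration_team = Agent("Calibration team", ["instrument calibration"])
data_centre = Agent("Data processing centre", ["level-1 product generation"])

calibrate = Activity("Calibrate interferometer", calibration_team, uses=["reference blackbody"])
process = Activity("Generate level-1 products", data_centre, uses=["ground segment"],
                   depends_on=[calibrate])

# A dependency check a project planner might run over the elicited model.
for act in (calibrate, process):
    for dep in act.depends_on:
        print(f"'{act.name}' ({act.performed_by.name}) depends on "
              f"'{dep.name}' ({dep.performed_by.name})")
```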

Relevance:

90.00%

Publisher:

Abstract:

The reduction of portfolio risk is important to all investors, but is particularly important to real estate investors because most property portfolios are generally small. As a consequence, portfolios are vulnerable to a significant risk of under-performing the market or a target rate of return, and so investors may be exposing themselves to greater risk than necessary. Given the potentially higher risk of under-performance from owning only a few properties, we follow the approach of Vassal (2001) and examine the benefits of holding more properties in a real estate portfolio. Using Monte Carlo simulation and the returns from 1,728 properties in the IPD database, held over the 10-year period from 1995 to 2004, the results show that increases in portfolio size offer the possibility of a more stable and less volatile return pattern over time, i.e. down-side risk diminishes with increasing portfolio size. Nonetheless, increasing portfolio size has the disadvantage of reducing the probability of out-performing the benchmark index by a significant amount. In other words, although increasing portfolio size reduces the down-side risk in a portfolio, it also decreases its up-side potential. Be that as it may, the results provide further evidence that portfolios with large numbers of properties are always preferable to portfolios of a smaller size.
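A minimal Monte Carlo sketch of the size effect described above, using a synthetic normal distribution of property returns rather than the IPD data; the return moments, benchmark and out-performance threshold are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

n_props_universe = 1728          # size of the property universe in the study
mean_ret, sd_ret = 0.10, 0.12    # assumed annual return moments for a single property
benchmark = 0.10                 # assumed benchmark return

universe = rng.normal(mean_ret, sd_ret, n_props_universe)

def simulate(portfolio_size, n_draws=10_000):
    """Draw equally weighted portfolios of a given size and summarise the return spread."""
    idx = rng.integers(0, n_props_universe, size=(n_draws, portfolio_size))
    port_ret = universe[idx].mean(axis=1)
    return {
        "p5": np.percentile(port_ret, 5),              # down-side outcome
        "p95": np.percentile(port_ret, 95),            # up-side outcome
        "p_beat_by_2pc": np.mean(port_ret > benchmark + 0.02),
    }

# Down-side risk shrinks with size, but so does the chance of beating the
# benchmark by a significant margin.
for size in (1, 5, 20, 100):
    stats = simulate(size)
    print(size, {k: round(v, 3) for k, v in stats.items()})
```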

Relevance:

90.00%

Publisher:

Abstract:

Electricity consumption in Ghana is estimated to be increasing by 10% per annum owing to demand from the growing population. However, the current sources of production (hydro and thermal facilities) generate only 66% of current demand. Given current trends, even these basic facts are difficult to substantiate because of the lack of reliable information. As a result, research into the existing sources of electricity generation, electricity consumption and prospective projects has been performed. This was achieved using three key techniques: a review of the literature, empirical studies and modelling. The results suggest that the current installed generation capacity (i.e. 1,960 MW) must be increased to 9,405.59 MW, assuming 85% plant availability. This would be capable of coping with the growing demand, would give the entire population access to electricity, and would support the commercial and industrial activities needed for the growth of the economy. The research is intended to present an academic research agenda for further exploration of the subject area, without which the growth of the country would stagnate.
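A back-of-envelope check tying the quoted figures together: if 1,960 MW meets 66% of demand, the implied current demand is roughly 2,970 MW, and the proposed 9,405.59 MW at 85% availability would absorb roughly a decade of 10% annual growth. The sketch below reproduces that arithmetic; the implied-demand step is an inference, not a figure from the study.

```python
import math

installed_mw = 1960.0
share_of_demand_met = 0.66
growth_rate = 0.10
availability = 0.85
target_installed_mw = 9405.59

implied_demand_mw = installed_mw / share_of_demand_met          # ~2970 MW (inferred)
firm_target_mw = target_installed_mw * availability             # ~7995 MW usable

years = math.log(firm_target_mw / implied_demand_mw) / math.log(1 + growth_rate)
print(f"Implied current demand: {implied_demand_mw:.0f} MW")
print(f"Firm capacity at the proposed build-out: {firm_target_mw:.0f} MW")
print(f"Years of 10% demand growth absorbed by that firm capacity: {years:.1f}")
```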

Relevance:

90.00%

Publisher:

Abstract:

Global climate change results from a small yet persistent imbalance between the amount of sunlight absorbed by Earth and the thermal radiation emitted back to space. An apparent inconsistency has been diagnosed between interannual variations in the net radiation imbalance inferred from satellite measurements and upper-ocean heating rate from in situ measurements, and this inconsistency has been interpreted as ‘missing energy’ in the system. Here we present a revised analysis of net radiation at the top of the atmosphere from satellite data, and we estimate ocean heat content, based on three independent sources. We find that the difference between the heat balance at the top of the atmosphere and upper-ocean heat content change is not statistically significant when accounting for observational uncertainties in ocean measurements, given transitions in instrumentation and sampling. Furthermore, variability in Earth’s energy imbalance relating to El Niño–Southern Oscillation is found to be consistent within observational uncertainties among the satellite measurements, a reanalysis model simulation and one of the ocean heat content records. We combine satellite data with ocean measurements to depths of 1,800 m, and show that between January 2001 and December 2010, Earth has been steadily accumulating energy at a rate of 0.50 ± 0.43 W m−2 (uncertainties at the 90% confidence level). We conclude that energy storage is continuing to increase in the sub-surface ocean.
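For scale, the quoted imbalance can be converted into total energy by multiplying by Earth's surface area and the length of the decade; the sketch below does this conversion (the surface-area value is the only added assumption).

```python
# Back-of-envelope conversion of the quoted 0.50 W m-2 imbalance into total energy.
imbalance_w_m2 = 0.50
earth_area_m2 = 5.1e14                      # approximate surface area of Earth
seconds = 10 * 365.25 * 24 * 3600           # January 2001 to December 2010

total_joules = imbalance_w_m2 * earth_area_m2 * seconds
print(f"Global heating rate: {imbalance_w_m2 * earth_area_m2:.2e} W")
print(f"Energy accumulated over the decade: {total_joules:.2e} J")   # ~8e22 J
```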

Relevance:

90.00%

Publisher:

Abstract:

Practically all extant work on flows over obstacle arrays, whether laboratory experiments or numerical modelling, is for cases where the oncoming wind is normal to salient faces of the obstacles. In the field, however, this is rarely the case. Here, simulations of flows at various directions over arrays of cubes representing typical urban canopy regions are presented and discussed. The computations are of both direct numerical simulation and large-eddy simulation type. Attention is concentrated on the differences in the mean flow within the canopy region arising from the different wind directions and the consequent effects on global properties such as the total surface drag, which can change very significantly—by up to a factor of three in some circumstances. It is shown that for a given Reynolds number the typical viscous forces are generally a rather larger fraction of the pressure forces (principally the drag) for non-normal than for normal wind directions and that, dependent on the surface morphology, the average flow direction deep within the canopy can be largely independent of the oncoming wind direction. Even for regular arrays of regular obstacles, a wind direction not normal to the obstacle faces can in general generate a lateral lift force (in the direction normal to the oncoming flow). The results demonstrate this and it is shown how computations in a finite domain with the oncoming flow generated by an appropriate forcing term (e.g. a pressure gradient) then lead inevitably to an oncoming wind direction aloft that is not aligned with the forcing term vector.

Relevance:

90.00%

Publisher:

Abstract:

The dispersion of a point-source release of a passive scalar in a regular array of cubical, urban-like obstacles is investigated by means of direct numerical simulations. The simulations are conducted under conditions of neutral stability and fully rough turbulent flow, at a roughness Reynolds number of Reτ = 500. The Navier–Stokes and scalar equations are integrated assuming a constant-rate release from a point source close to the ground within the array. We focus on short-range dispersion, when most of the material is still within the building canopy. Mean and fluctuating concentrations are computed for three different pressure gradient directions (0°, 30°, 45°). The results agree well with available experimental data measured in a water channel for a flow angle of 0°. Profiles of mean concentration and the three-dimensional structure of the dispersion pattern are compared for the different forcing angles. A number of processes affecting the plume structure are identified and discussed, including: (i) advection or channelling of scalar down ‘streets’, (ii) lateral dispersion by turbulent fluctuations and topological dispersion induced by dividing streamlines around buildings, (iii) skewing of the plume due to flow turning with height, (iv) detrainment by turbulent dispersion or mean recirculation, (v) entrainment and release of scalar in building wakes, giving rise to ‘secondary sources’, (vi) plume meandering due to unsteady turbulent fluctuations. Finally, results on relative concentration fluctuations are presented and compared with the literature for point-source dispersion over flat terrain and urban arrays.

Keywords: direct numerical simulation, dispersion modelling, urban array

Relevance:

90.00%

Publisher:

Abstract:

In recent years, various efforts have been made in air traffic control (ATC) to maintain traffic safety and efficiency in the face of increasing air traffic demands. ATC is a complex process that depends to a large degree on human capabilities, and so understanding how controllers carry out their tasks is an important issue in the design and development of ATC systems. In particular, the human factor is considered to be a serious problem in ATC safety and has been identified as a causal factor in both major and minor incidents. There is, therefore, a need to analyse the mechanisms by which errors occur due to complex factors and to develop systems that can deal with these errors. From the cognitive process perspective, it is essential that system developers have an understanding of the more complex working processes that involve the cooperative work of multiple controllers. Distributed cognition is a methodological framework for analysing cognitive processes that span multiple actors mediated by technology. In this research, we attempt to analyse and model interactions that take place in en route ATC systems based on distributed cognition. We examine the functional problems in an ATC system from a human factors perspective, and conclude by identifying certain measures by which to address these problems. This research focuses on the analysis of air traffic controllers' tasks for en route ATC and modelling controllers' cognitive processes.

Relevance:

90.00%

Publisher:

Abstract:

The cold equatorial SST bias in the tropical Pacific that persists in many coupled AOGCMs severely impacts the fidelity of the simulated climate and variability in this key region, such as the ENSO phenomenon. Classical bias analysis in these models usually concentrates on multi-decadal to centennial time series, which are needed to obtain statistically robust features. Yet this strategy cannot fully explain how the models' errors were generated in the first place. Here, we use seasonal re-forecasts (hindcasts) to track the origin of this cold bias. As such hindcasts are initialized close to observations, the transient drift leading to the cold bias can be analyzed to distinguish pre-existing errors from errors responding to initial ones. A time sequence of the processes involved in the advent of the final mean-state errors can then be proposed. We apply this strategy to the ENSEMBLES-FP6 project multi-model hindcasts of the last decades. Four of the five AOGCMs develop a persistent equatorial cold-tongue bias within a few months. The associated systematic errors are first assessed separately for the warm and cold ENSO phases. We find that the models are able to reproduce either El Niño or La Niña close to observations, but not both. ENSO composites then show that the spurious equatorial cooling is largest in El Niño years for the February and August start dates. For these events, and at this time of the year, zonal wind errors in the equatorial Pacific are present from the beginning of the simulation and are hypothesized to be at the origin of the equatorial cold bias, generating excessively strong upwelling conditions. The systematic underestimation of the mixed-layer depth in several models can also amplify the growth of the SST bias. The seminal role of these zonal wind errors is further demonstrated by carrying out ocean-only experiments forced by the AOGCMs' daily 10-m winds. In a case study, we show that for several models this forcing is sufficient to reproduce the main SST error patterns seen after one month in the AOGCM hindcasts.
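The drift diagnostic described above amounts to averaging the hindcast-minus-observation error at each forecast lead time across start dates. The sketch below illustrates that computation on synthetic placeholder arrays; it is not the ENSEMBLES data or the models' actual drift.

```python
import numpy as np

rng = np.random.default_rng(1)

n_starts, n_leads = 40, 7                    # start dates x lead months
obs_sst = 27.0 + rng.normal(0, 0.5, (n_starts, n_leads))
# A model whose cold bias grows with lead time (the transient drift of interest).
hindcast_sst = obs_sst - 0.2 * np.arange(n_leads) + rng.normal(0, 0.3, (n_starts, n_leads))

drift = (hindcast_sst - obs_sst).mean(axis=0)    # mean error at each lead time
for lead, err in enumerate(drift, start=1):
    print(f"lead month {lead}: mean SST error {err:+.2f} K")
```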

Relevance:

90.00%

Publisher:

Abstract:

Climate modelling is a complex process, requiring accurate and complete metadata in order to identify, assess and use climate data stored in digital repositories. The preservation of such data is increasingly important given the development of ever more complex models to predict the effects of global climate change. The EU METAFOR project has developed a Common Information Model (CIM) to describe climate data and the models and modelling environments that produce these data. There is a wide degree of variability between different climate models and modelling groups. To accommodate this, the CIM has been designed to be highly generic and flexible, with extensibility built in. METAFOR describes the climate modelling process simply as "an activity undertaken using software on computers to produce data." This process has been described as separate UML packages (and, ultimately, XML schemas). This fairly generic structure can be paired with more specific "controlled vocabularies" in order to restrict the range of valid CIM instances. The CIM will aid the digital preservation of climate models as it will provide an accepted standard structure for the model metadata. Tools to write and manage CIM instances, and to allow convenient and powerful searches of CIM databases, are also under development. Community buy-in of the CIM has been achieved through a continual process of consultation with the climate modelling community, and through the METAFOR team's development of a questionnaire that will be used to collect the metadata for the Intergovernmental Panel on Climate Change's (IPCC) Coupled Model Intercomparison Project Phase 5 (CMIP5) model runs.
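As a loose illustration of the "activity undertaken using software on computers to produce data" framing, and of how a controlled vocabulary constrains a generic structure, the sketch below builds a toy metadata record; the field names and vocabulary are hypothetical and are not the actual CIM schema.

```python
# Hypothetical CIM-like record: activity, software, platform, data.
CONTROLLED_VOCAB = {"activity_type": {"simulation", "ensemble", "downscaling"}}

record = {
    "activity": {"type": "simulation", "experiment": "historical"},
    "software": {"model_name": "ExampleGCM", "version": "1.2"},
    "platform": {"machine": "example-hpc", "compiler": "gfortran"},
    "data": {"output_uri": "https://example.org/archive/run-001"},
}

def validate(rec):
    """Show the kind of constraint a controlled vocabulary adds to a generic schema."""
    return rec["activity"]["type"] in CONTROLLED_VOCAB["activity_type"]

print("valid CIM-like instance:", validate(record))
```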

Relevance:

90.00%

Publisher:

Abstract:

The contribution that non-point P sources make to the total P loading on water bodies in agricultural catchments has not been fully appreciated. Using data derived from plot-scale experimental studies, and modelling approaches developed to simulate system behaviour under differing management scenarios, a fuller understanding of the processes controlling P export and transformations along non-point transport pathways can be achieved. One modelling approach, which has been successfully applied to large UK catchments (50–350 km2 in area), is applied here to a small (1.5 km2) experimental catchment. The importance of scaling is discussed in the context of how such approaches can extrapolate the results of plot-scale experimental studies to the full catchment scale. However, the scope of such models is limited, since they do not at present directly simulate the processes controlling P transport and transformation dynamics. As such, they can only simulate total P export on an annual basis and are not capable of prediction over shorter time scales. The need for process-based models to help answer these questions, and for more comprehensive UK experimental studies, is highlighted as a prerequisite for the development of suitable and sustainable management strategies to reduce non-point P loading on water bodies in agricultural catchments.
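To make concrete what "total P export on an annual basis" means in practice, the sketch below applies a simple export-coefficient calculation of the kind such lumped models perform; the land uses, areas and coefficients are invented for illustration.

```python
# Minimal annual export-coefficient sketch; values are hypothetical.
export_coefficients_kg_per_ha = {   # annual P export per hectare by land use
    "arable": 0.8,
    "improved grassland": 0.5,
    "woodland": 0.05,
}
catchment_ha = {"arable": 60, "improved grassland": 70, "woodland": 20}  # ~1.5 km2

annual_p_export_kg = sum(
    export_coefficients_kg_per_ha[lu] * area for lu, area in catchment_ha.items()
)
print(f"Estimated annual total-P export: {annual_p_export_kg:.1f} kg")
# A lumped annual figure like this says nothing about storm-event dynamics,
# which is exactly the limitation the abstract highlights.
```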

Relevance:

90.00%

Publisher:

Abstract:

The UK Government's Department of Energy and Climate Change has been investigating the feasibility of developing a national energy efficiency data framework covering both domestic and non-domestic buildings. Working closely with the Energy Saving Trust and energy suppliers, the Department aims to develop a data framework to monitor changes in energy efficiency, develop and evaluate programmes, and improve the information available to consumers. Key applications of the framework are to understand trends in built-stock energy use, identify drivers and evaluate the success of different policies. For energy suppliers, it could identify which energy uses are growing, in which sectors and why. This would help with market segmentation and the design of products. For building professionals, it could supplement energy audits and modelling of end-use consumption with real data and support the generation of accurate and comprehensive benchmarks. This paper critically examines the results of the first phase of work to construct a national energy efficiency data framework for the domestic sector, focusing on two specific issues: (a) the drivers of domestic energy consumption in terms of the physical nature of the dwellings and the socio-economic characteristics of occupants, and (b) the impact of energy efficiency measures on energy consumption.

Relevance:

90.00%

Publisher:

Abstract:

Anaerobic digestion (AD) technologies convert organic wastes and crops into methane-rich biogas for heating, electricity generation and vehicle fuel. Farm-based AD has proliferated in some EU countries, driven by favourable policies promoting sustainable energy generation and GHG mitigation. Despite increased state support, there are still few AD plants on UK farms, leading to a lack of normative data on the viability of AD in the whole-farm context. Farmers and lenders are therefore reluctant to fund AD projects, and policy makers are hampered in their attempts to design policies that adequately support the industry. Existing AD studies and modelling tools do not adequately capture the farm context within which AD operates. This paper demonstrates a whole-farm optimisation modelling approach to assess the viability of AD in a more holistic way, accounting for issues such as AD scale, synergies and conflicts with other farm enterprises, the choice of feedstocks, digestate use and the impact on farm Net Margin. This modelling approach demonstrates, for example, that AD is complementary to dairy enterprises but competes with arable enterprises for farm resources, and that reduced nutrient purchases significantly improve Net Margin on arable farms, although AD scale is constrained by the capacity of farmland to absorb the nutrients in AD digestate.
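A toy version of such a whole-farm optimisation can be written as a small linear programme in which AD feedstock area competes with arable cropping for land while digestate nutrients must be absorbed on-farm; all margins, areas and nutrient rates below are invented, and the formulation is far simpler than the paper's model.

```python
from scipy.optimize import linprog

# Decision variables: x0 = arable crop area (ha), x1 = AD feedstock (maize) area (ha),
# where feedstock area stands in for AD scale. linprog minimises, so margins are negated.
margins = [-650.0, -900.0]        # negated Net Margin per ha (hypothetical)

A_ub = [
    [1.0, 1.0],                   # land: total cropped area <= 120 ha
    [-30.0, 180.0],               # digestate N produced per feedstock ha must not exceed
]                                 # the N that the arable area can absorb (30 kg/ha)
b_ub = [120.0, 0.0]

res = linprog(c=margins, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
arable_ha, feedstock_ha = res.x
print(f"Arable area: {arable_ha:.1f} ha, AD feedstock area: {feedstock_ha:.1f} ha")
print(f"Whole-farm Net Margin: £{-res.fun:,.0f}")
```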

Relevance:

90.00%

Publisher:

Abstract:

The clinical skills of medical professionals rely strongly on the sense of touch, combined with anatomical and diagnostic knowledge. Haptic exploratory procedures allow the expert to detect anomalies via gross and fine palpation, squeezing, and contour following. Haptic feedback is also key to medical interventions, for example when an anaesthetist inserts an epidural needle, a surgeon makes an incision, a dental surgeon drills into a carious lesion, or a veterinarian sutures a wound. Yet, current trends in medical technology and training methods involve less haptic feedback to clinicians and trainees. For example, minimally invasive surgery removes the direct contact between the patient and clinician that gives rise to natural haptic feedback, and furthermore introduces scaling and rotational transforms that confuse the relationship between movements of the hand and the surgical site. Similarly, it is thought that computer-based medical simulation and training systems require high-resolution and realistic haptic feedback to the trainee for significant training transfer to occur. The science and technology of haptics thus has great potential to affect the performance of medical procedures and learning of clinical skills. This special section is about understanding

Relevance:

90.00%

Publisher:

Abstract:

The difference between the rate of change of cerebral blood volume (CBV) and cerebral blood flow (CBF) following stimulation is thought to be due to circumferential stress relaxation in veins (Mandeville, J.B., Marota, J.J.A., Ayata, C., Zaharchuk, G., Moskowitz, M.A., Rosen, B.R., Weisskoff, R.M., 1999. Evidence of a cerebrovascular postarteriole windkessel with delayed compliance. J. Cereb. Blood Flow Metab. 19, 679–689). In this paper we explore the visco-elastic properties of blood vessels and present a dynamic model relating changes in CBF to changes in CBV. We refer to this model as the visco-elastic windkessel (VW) model. A novel feature of this model is that the parameter characterising the pressure–volume relationship of blood vessels is treated as a state variable dependent on the rate of change of CBV, producing hysteresis in the pressure–volume space during vessel dilation and contraction. The VW model is nonlinear and time-invariant, and is able to predict the observed differences between the time series of CBV and CBF measurements following changes in neural activity. Like the windkessel model derived by Mandeville et al. (1999), the VW model is primarily a model of haemodynamic changes in the venous compartment. The VW model is demonstrated to have the following characteristics typical of visco-elastic materials: (1) hysteresis, (2) creep, and (3) stress relaxation; hence it provides a unified model of the visco-elastic properties of the vasculature. The model will not only contribute to the interpretation of Blood Oxygen Level Dependent (BOLD) signals from functional Magnetic Resonance Imaging (fMRI) experiments, but will also find applications in the study and modelling of the brain vasculature and the haemodynamics of circulatory and cardiovascular systems.
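The following sketch integrates a windkessel-type volume equation in which the exponent shaping the pressure-volume law is itself a state variable that relaxes at different rates during dilation and contraction, which is enough to produce hysteresis; the equations and parameter values are illustrative and are not the paper's actual VW formulation.

```python
import numpy as np

dt, T = 0.01, 40.0
t = np.arange(0.0, T, dt)

f_in = 1.0 + 0.4 * ((t > 5) & (t < 15))      # step increase in inflow (stimulus)
V = np.empty_like(t); V[0] = 1.0             # normalised venous volume
alpha = np.empty_like(t); alpha[0] = 0.38    # state variable shaping the P-V relation

tau_dilate, tau_contract = 2.0, 8.0          # slower relaxation during deflation
for k in range(len(t) - 1):
    f_out = V[k] ** (1.0 / alpha[k])         # outflow set by the pressure-volume law
    dVdt = f_in[k] - f_out
    tau = tau_dilate if dVdt >= 0 else tau_contract
    alpha[k + 1] = alpha[k] + dt * (0.38 + 0.1 * np.tanh(dVdt) - alpha[k]) / tau
    V[k + 1] = V[k] + dt * dVdt

# Volume returns more slowly than flow after stimulus offset, mimicking the
# delayed-compliance behaviour the abstract describes.
print(f"Peak volume: {V.max():.3f}, volume 10 s after stimulus offset: {V[t.searchsorted(25.0)]:.3f}")
```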