170 results for Interdisciplinary approach to knowledge


Relevance:

100.00%

Publisher:

Abstract:

Straightforward mathematical techniques are used innovatively to form a coherent theoretical system for chemical equilibrium problems. A systematic theory requires a system that connects different concepts. This paper shows the usefulness and consistency of the system through applications of the theorems introduced previously. Some theorems are shown, somewhat unexpectedly, to be mathematically correlated, and relationships between them are obtained in a coherent manner. Theorem 1 plays an important part in interconnecting most of the theorems. The usefulness of theorem 2 is illustrated by proving it consistent with theorem 3. A set of uniform mathematical expressions is associated with theorem 3. A variety of mathematical techniques based on theorems 1–3 are shown to establish the direction of equilibrium shift. Equilibrium properties expressed in initial and equilibrium conditions are connected via theorem 5. Theorem 6 is connected with theorem 4 through the mathematical representation of theorem 1.
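None of the paper's theorems are reproduced in the abstract, so as a minimal, generic illustration of establishing the direction of equilibrium shift, the sketch below compares the reaction quotient Q with the equilibrium constant K (a standard textbook criterion, not one of the paper's theorems):

```python
def shift_direction(Q, K):
    """Direction of net reaction given reaction quotient Q and constant K."""
    if Q < K:
        return "forward"       # net reaction proceeds toward products
    if Q > K:
        return "reverse"       # net reaction proceeds toward reactants
    return "at equilibrium"

# Hypothetical reaction A <-> B with K = 4.0.
# Current concentrations [B] = 1.0, [A] = 0.5 give Q = [B]/[A] = 2.0.
K = 4.0
Q = 1.0 / 0.5
print(shift_direction(Q, K))  # -> forward
```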

Abstract:

Long-range global climate forecasts were made using a model that predicts tropical Pacific sea-surface temperature (SST) in tandem with an atmospheric general circulation model. The SST is predicted first, at long lead times into the future. These ocean forecasts are then used to force the atmospheric model and so produce climate forecasts at the lead times of the SST forecasts. Predictions of seven large climatic events of the 1970s to 1990s by this technique are in good agreement with observations over many regions of the globe.

Abstract:

Stakeholder analysis plays a critical role in business analysis. However, the majority of stakeholder identification and analysis methods focus on activities and processes and ignore the artefacts being processed by human beings. By focusing on the outputs of the organisation, an artefact-centric view helps create a network of artefacts and a component-based structure of the organisation and its supply chain participants. Because the relationships are based on these components, the interdependency between stakeholders and the focal organisation can be measured once the stakeholders are identified. Each stakeholder is associated with two types of dependency: the stakeholder's dependency on the focal organisation, and the focal organisation's dependency on the stakeholder. We identify three factors for each type of dependency and propose equations that calculate the dependency indexes. Once both dependency indexes are calculated, each stakeholder can be placed into one of four groups: critical stakeholder, mutual benefits stakeholder, replaceable stakeholder, and easy care stakeholder. The mutual dependency grid and the dependency gap analysis, which further prioritises each stakeholder by calculating the weighted dependency gap between the focal organisation and the stakeholder, subsequently help the focal organisation to better understand its stakeholders and manage its stakeholder relationships.
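The paper's equations for the dependency indexes are not given in the abstract; the sketch below assumes each index is a simple average of its three factors on a 0–1 scale, and the mapping of quadrants to the four group names is likewise an illustrative assumption, not the paper's grid:

```python
def dependency_index(f1, f2, f3):
    # Illustrative stand-in for the paper's equations: the mean of the
    # three factors, each assumed to be scaled to the range 0..1.
    return (f1 + f2 + f3) / 3.0

def categorise(org_on_stakeholder, stakeholder_on_org, threshold=0.5):
    # Place a stakeholder in one of the four mutual-dependency groups.
    # The quadrant-to-label mapping below is an assumption for illustration.
    hi_org = org_on_stakeholder >= threshold   # organisation depends on them
    hi_stk = stakeholder_on_org >= threshold   # they depend on the organisation
    if hi_org and hi_stk:
        return "mutual benefits stakeholder"
    if hi_org:
        return "critical stakeholder"          # organisation is exposed
    if hi_stk:
        return "easy care stakeholder"
    return "replaceable stakeholder"

# Hypothetical supplier: three factors per dependency direction.
org_dep = dependency_index(0.9, 0.7, 0.8)
stk_dep = dependency_index(0.2, 0.1, 0.3)
print(categorise(org_dep, stk_dep))  # -> critical stakeholder
```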

Abstract:

Relating the measurable, large-scale effects of anaesthetic agents to their molecular and cellular targets of action is necessary to better understand the principles by which they affect behavior, as well as to enable the design and evaluation of more effective agents and better clinical monitoring of existing and future drugs. Volatile and intravenous general anaesthetic agents (GAs) are now known to exert their effects on a variety of protein targets, the most important of which seem to be the neuronal ion channels. It is hence unlikely that anaesthetic effect is the result of a unitary mechanism at the single-cell level. However, by altering the behavior of ion channels, GAs are believed to change the overall dynamics of distributed networks of neurons. This disruption of regular network activity can be hypothesized to cause the hypnotic and analgesic effects of GAs, and may well present more stereotypical characteristics than its underlying microscopic causes. Nevertheless, surprisingly few theories have attempted to integrate, in a quantitative manner, the empirically well documented alterations in neuronal ion channel behavior with the corresponding macroscopic effects. Here we outline one such approach, and show that a range of well documented effects of anaesthetics on the electroencephalogram (EEG) can putatively be accounted for. In particular, we parameterize, on the basis of detailed empirical data, the effects of halogenated volatile ethers (a clinically widely used class of general anaesthetic agent). The resulting model is able to provisionally account for a range of anaesthetically induced EEG phenomena, including EEG slowing, biphasic changes in EEG power, and the dose-dependent appearance of anomalous ictal activity, as well as providing a basis for novel approaches to monitoring brain function in both health and disease.

Abstract:

In this paper we propose an alternative model of what is often called land value capture in the planning system. Based on development viability models, negotiations and policy formation regarding the level of planning obligations have taken place at the local level with little clear guidance on technique, approach and method. It is argued that current approaches are regressive and fail to reflect how the ability of sites to generate planning gain can vary over time and between sites. The alternative approach suggested here attempts to rationalise rather than replace the existing practice of development viability appraisal. It is based on the assumption that schemes with similar development values should produce similar levels of return to the landowner, developer and other stakeholders in the development, as well as similar levels of planning obligations, in all parts of the country. Given the high level of input uncertainty in viability modelling, a simple viability model is 'good enough' to quantify the maximum level of planning obligations for a given level of development value. We argue that such an approach can deliver a more durable, equitable, simple, consistent and cheap method for policy formation regarding planning obligations.
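A minimal sketch of such a 'good enough' viability model, using the standard residual approach from development viability appraisal (the figures and benchmark returns are hypothetical, not the paper's):

```python
def max_planning_obligations(development_value, build_cost,
                             landowner_return, developer_return):
    """Residual left for planning obligations once the landowner and
    developer receive their benchmark returns (a simple viability model)."""
    residual = (development_value - build_cost
                - landowner_return - developer_return)
    return max(residual, 0.0)  # obligations cannot be negative

# Hypothetical scheme, all figures in GBP millions:
# gross development value 10.0, build cost 6.0,
# benchmark landowner return 1.5, benchmark developer return 1.8.
print(max_planning_obligations(10.0, 6.0, 1.5, 1.8))  # -> ~0.7
```

The same calculation applied to schemes of similar development value yields similar maximum obligation levels, which is the equity property the paper argues for.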

Abstract:

In this paper the properties of a hydro-meteorological system for forecasting river flows are analysed using a probabilistic forecast convergence score (FCS). The focus on fixed-event forecasts provides a forecaster's view of system behaviour and adds an important perspective to the suite of forecast verification tools commonly used in this field. A low FCS indicates a more consistent forecast. The annual maximum FCS is shown to decrease over the last 10 years. With lead time, the FCS of the ensemble forecast decreases, whereas those of the control and high-resolution forecasts increase. The FCS is influenced by lead time, threshold, and catchment size and location, which indicates that seasonality-based decision rules should be used to issue flood warnings.
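The paper's probabilistic definition of the FCS is not reproduced in the abstract; one simplified, deterministic reading, sketched below, scores a fixed event by how much successively re-issued forecasts jump between issue times (lower means more consistent):

```python
def forecast_convergence_score(forecasts):
    """Mean absolute change between successive forecasts issued for the
    same fixed event; lower values indicate a more consistent forecast.
    Illustrative definition only -- the paper's probabilistic FCS differs."""
    jumps = [abs(b - a) for a, b in zip(forecasts, forecasts[1:])]
    return sum(jumps) / len(jumps)

# River-flow forecasts (m3/s) for one flood event, re-issued daily
# (hypothetical numbers).
steady = [100, 102, 101, 103, 102]
jumpy = [100, 140, 90, 150, 95]
print(forecast_convergence_score(steady))  # -> 1.5
print(forecast_convergence_score(jumpy))   # -> 51.25
```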

Abstract:

In its default configuration, the Hadley Centre climate model (GA2.0) simulates roughly one-half the observed level of Madden–Julian oscillation activity, with MJO events often lasting fewer than seven days. We use initialised, climate-resolution hindcasts to examine the sensitivity of the GA2.0 MJO to a range of changes in sub-grid parameterisations and model configurations. All 22 changes are tested for two cases during the Years of Tropical Convection. Improved skill comes only from (a) disabling vertical momentum transport by convection and (b) increasing mixing entrainment and detrainment for deep and mid-level convection. These changes are subsequently tested in a further 14 hindcast cases; only (b) consistently improves MJO skill, from 12 to 22 days. In a 20-year integration, (b) produces near-observed levels of MJO activity, but propagation through the Maritime Continent remains weak. With default settings, GA2.0 produces precipitation too readily, even in anomalously dry columns. Implementing (b) decreases the efficiency of convection, permitting instability to build during the suppressed MJO phase and producing a more favourable environment for the active phase. The distribution of daily rain rates is more consistent with satellite data; default entrainment produces 6–12 mm/day too frequently. These results are consistent with recent studies showing that greater sensitivity of convection to moisture improves the representation of the MJO.

Abstract:

In the absence of market frictions, the cost-of-carry model of stock index futures pricing predicts that returns on the underlying stock index and the associated stock index futures contract will be perfectly contemporaneously correlated. Evidence suggests, however, that this prediction is violated, with the stock index futures market clearly leading the stock market. It is argued that traditional tests, which assume that the underlying data-generating process is constant, may be prone to overstate the lead-lag relationship. Using a new test for lead-lag relationships based on cross correlations and cross bicorrelations, it is found that, contrary to results from the traditional methodology, periods where the futures market leads the cash market are few and far between, and any lead-lag relationship that is detected does not last long. Overall, the results are consistent with the prediction of the standard cost-of-carry model and with market efficiency.
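A minimal sketch of the two sample statistics behind such a test, computed on standardised return series; the synthetic data are constructed so that the 'futures' series leads the 'cash' series by one period, and the significance-testing machinery of the actual bicorrelation test is omitted:

```python
import numpy as np

def cross_correlation(x, y, lag):
    """Sample cross-correlation corr(x_t, y_{t+lag}) of standardised series."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    if lag < 0:
        return cross_correlation(y, x, -lag)
    return float(np.mean(x[:len(x) - lag] * y[lag:]))

def cross_bicorrelation(x, y, r, s):
    """Sample cross-bicorrelation mean(x_t * y_{t+r} * y_{t+s}), the
    third-order statistic underlying a bicorrelation-style test."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    m = len(x) - max(r, s)
    return float(np.mean(x[:m] * y[r:r + m] * y[s:s + m]))

# Synthetic returns: futures_t ~ cash_{t+1}, i.e. futures lead by one period.
rng = np.random.default_rng(0)
cash = rng.standard_normal(500)
futures = np.roll(cash, -1) + 0.1 * rng.standard_normal(500)

print(cross_correlation(futures, cash, 1))  # large: futures lead cash
print(cross_correlation(futures, cash, 0))  # near zero: no contemporaneous link
```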

Abstract:

Understanding the sources of systematic errors in climate models is challenging because of coupled feedbacks and error compensation. The developing seamless approach proposes that identifying and correcting short-term climate model errors has the potential to improve the modelled climate on longer time scales. In previous studies, initialised atmospheric simulations of a few days have been used to compare fast physics processes (convection, cloud processes) among models. The present study explores how initialised seasonal-to-decadal hindcasts (re-forecasts) relate transient week-to-month errors of the ocean and atmospheric components to the coupled model's long-term pervasive SST errors. A protocol is designed to attribute the SST biases to their source processes. It includes five steps: (1) identify and describe biases in a coupled stabilised simulation; (2) determine the time scale of the advent of each bias and its propagation; (3) find the geographical origin of the bias; (4) evaluate the degree of coupling in the development of the bias; (5) find the field responsible for the bias. This strategy has been implemented with a set of experiments based on the initial adjustment of initialised simulations, exploring various degrees of coupling. In particular, hindcasts give the time scale of bias advent, regionally restored experiments show the geographical origin, and ocean-only simulations isolate the field responsible for the bias and evaluate the degree of coupling in its development. The strategy is applied to four prominent SST biases of the IPSLCM5A-LR coupled model in the tropical Pacific, which are largely shared by other coupled models, including the Southeast Pacific warm bias and the equatorial cold tongue bias. Using the proposed protocol, we demonstrate that the East Pacific warm bias appears within a few months and is caused by a lack of upwelling due to meridional coastal winds off Peru that are too weak. The cold equatorial bias, which surprisingly takes 30 years to develop, is the result of equatorward advection of midlatitude cold SST errors. Despite large development efforts, the current generation of coupled models shows little improvement. The strategy proposed in this study is a further step away from the current ad hoc approach and towards a bias-targeted, priority-setting, systematic approach to model development.

Abstract:

Taxonomic free sorting (TFS) is a new, fast and reliable technique in sensory science. The method extends the typical free sorting task, in which stimuli are grouped according to similarities, by asking respondents to combine their groups two at a time to produce a hierarchy. Previously, TFS has been used for the visual assessment of packaging; this study extends the range of potential uses of the technique to full sensory analysis by the target consumer, which, when combined with hedonic liking scores, was used to generate a novel preference map. Furthermore, to fully evaluate the efficacy of the sorting method, the technique was evaluated with a healthy older-adult consumer group. Participants sorted eight products into groups and described their reasons at each stage as they combined those groups, producing a consumer-specific vocabulary. This vocabulary was combined with hedonic data from a separate group of older adults to give the external preference map. Taxonomic sorting is a simple, fast and effective method for use with older adults, and its combination with liking data can yield a preference map constructed entirely from target consumer data.
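A minimal sketch of how a TFS session can be recorded, assuming the hierarchy is built from the participant's successive pairwise merges and the stated reasons form the elicited vocabulary (the product names and reasons below are invented):

```python
def run_tfs(groups, merges):
    """groups: dict name -> set of products (the initial free sort).
    merges: list of (group_a, group_b, new_name, reason), the pairwise
    combinations made by the participant. Returns the final hierarchy as
    nested tuples plus the consumer-specific vocabulary."""
    tree = {name: sorted(items) for name, items in groups.items()}
    vocabulary = []
    for a, b, new, reason in merges:
        tree[new] = (tree.pop(a), tree.pop(b))  # merge two groups into one node
        vocabulary.append(reason)
    return tree, vocabulary

# Hypothetical session with four products sorted into three groups.
groups = {"sweet": {"A", "B"}, "bitter": {"C"}, "fruity": {"D"}}
merges = [("sweet", "fruity", "pleasant", "both easy to like"),
          ("pleasant", "bitter", "all", "all the same product type")]
tree, vocab = run_tfs(groups, merges)
print(vocab)  # -> ['both easy to like', 'all the same product type']
```

The resulting vocabulary is what would be combined with a separate group's hedonic ratings to build the external preference map.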

Abstract:

Facility management (FM), from a service-oriented approach, addresses the functions and requirements of different services such as energy management, space planning and security. Different services require different information to meet the needs arising from each service. Object-based Building Information Modelling (BIM) offers limited support for FM services, even though this technology can generate 3D models that semantically represent a facility's information dynamically over the lifecycle of a building. This paper presents a semiotics-inspired framework to extend BIM from a service-oriented perspective. The extended BIM, which specifies FM services and the information they require, will be able to express building service information in the right format for the right purposes. The service-oriented approach concerns the pragmatic aspect of building information, beyond the semantic level: pragmatics defines and provides the context for the utilisation of building information. Semiotics theory is adopted in this paper to address these pragmatic issues in the utilisation of BIM for FM services.

Abstract:

Performance modelling is a useful tool in the lifecycle of high performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model that is illustrative of the computation (and communication) involved in climate models. These models are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. This experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented using the shallow water model as an example.
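A minimal sketch of the data-driven idea: fit a simple model of execution time against problem size from benchmark runs, then extrapolate to an untried size (the sizes and timings below are made up, and deliberately perfectly linear, for illustration):

```python
def fit_linear(sizes, times):
    """Ordinary least-squares fit of time ~ a + b * size."""
    n = len(sizes)
    mx = sum(sizes) / n
    my = sum(times) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(sizes, times))
         / sum((x - mx) ** 2 for x in sizes))
    return my - b * mx, b  # intercept a, slope b

# Hypothetical benchmark runs of a shallow water model.
sizes = [1, 2, 4, 8]              # total grid points (x 10^4)
times = [0.1, 0.2, 0.4, 0.8]      # seconds per model step
a, b = fit_linear(sizes, times)

# Extrapolate to an untried problem size of 16 x 10^4 grid points.
print(round(a + b * 16, 2))       # -> 1.6
```

In practice the Interlagos result in the paper suggests a single linear term is not enough; the same fitting idea extends to richer benchmark-derived predictors.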