929 results for Based structure model
Abstract:
"Counterinsurgency (COIN) requires an integrated military, political, and economic program best developed by teams that field both civilians and soldiers. These units should operate with some independence but under a coherent command. In Vietnam, after several false starts, the United States developed an effective unified organization, Civil Operations and Revolutionary Development Support (CORDS), to guide the counterinsurgency. CORDS had three components absent from our efforts in Afghanistan today: sufficient personnel (particularly civilian), numerous teams, and a single chain of command that united the separate COIN programs of the disparate American departments at the district, provincial, regional, and national levels. This paper focuses on the third issue and describes the benefits that unity of command at every level would bring to the American war in Afghanistan. The work begins with a brief introduction to counterinsurgency theory, using a population-centric model, and examines how this warfare challenges the United States. It traces the evolution of the Provincial Reconstruction Teams (PRTs) and the country team, describing problems at both levels. Similar efforts in Vietnam are compared, where persistent executive attention finally integrated the government's counterinsurgency campaign under the unified command of the CORDS program. The next section attributes the American tendency towards a segregated response to cultural differences between the primary departments, executive neglect, and societal concepts of war. The paper argues that, in its approach to COIN, the United States has forsaken the military concept of unity of command in favor of 'unity of effort' expressed in multiagency literature. The final sections describe how unified authority would improve our efforts in Afghanistan and propose a model for the future."--P. iii.
Abstract:
This study investigated the relative contribution of ion-trapping, microsomal binding, and distribution of unbound drug as determinants in the hepatic retention of basic drugs in the isolated perfused rat liver. The ionophore monensin was used to abolish the vesicular proton gradient and thus allow an estimation of ion-trapping by acidic hepatic vesicles of cationic drugs. In vitro microsomal studies were used to independently estimate microsomal binding and metabolism. Hepatic vesicular ion-trapping, intrinsic elimination clearance, permeability-surface area product, and intracellular binding were derived using a physiologically based pharmacokinetic model. Modeling showed that the ion-trapping was significantly lower after monensin treatment for atenolol and propranolol, but not for antipyrine. However, no changes induced by monensin treatment were observed in intrinsic clearance, permeability, or binding for the three model drugs. Monensin did not affect binding or metabolic activity in vitro for the drugs. The observed ion-trapping was similar to theoretical values estimated using the pHs and fractional volumes of the acidic vesicles and the pK(a) values of drugs. Lipophilicity and pK(a) determined hepatic drug retention: a drug with low pK(a) and low lipophilicity (e.g., antipyrine) distributes as unbound drug, a drug with high pK(a) and low lipophilicity (e.g., atenolol) by ion-trapping, and a drug with a high pK(a) and high lipophilicity (e.g., propranolol) is retained by ion-trapping and intracellular binding. In conclusion, monensin inhibits the ion-trapping of high pK(a) basic drugs, leading to a reduction in hepatic retention but with no effect on hepatic drug extraction.
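The pH-partition reasoning behind such ion-trapping estimates can be illustrated with a short sketch (illustrative only; the function name and the default cytosolic and vesicular pH values are assumptions, not figures taken from the study):

```python
import math  # imported for completeness; the ratio needs only exponentiation

def ion_trapping_ratio(pka, ph_cytosol=7.2, ph_vesicle=5.0):
    """Equilibrium total-drug concentration ratio (vesicle/cytosol) for a
    weak base under the pH-partition hypothesis: only the neutral species
    permeates membranes, so total drug accumulates in the compartment
    where a larger fraction is ionised (the more acidic one)."""
    # Fraction-based form of the Henderson-Hasselbalch relationship.
    return (1 + 10 ** (pka - ph_vesicle)) / (1 + 10 ** (pka - ph_cytosol))
```

Consistent with the abstract's classification, a low-pKa drug such as antipyrine (pKa near 1.4) gives a ratio close to 1 (no trapping), whereas a high-pKa base such as propranolol (pKa near 9.4) concentrates in the acidic vesicles by two orders of magnitude.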
Abstract:
Interconnecting business processes across systems and organisations is considered to provide significant benefits, such as greater process transparency, higher degrees of integration, facilitation of communication, and consequently higher throughput in a given time interval. However, achieving these benefits requires tackling constraints; in the context of this paper these are the privacy requirements of the involved workflows and their mutual dependencies. Workflow views are a promising conceptual approach to address the issue of privacy; however, this approach requires addressing the interdependencies between a workflow view and the adjacent private workflow. In this paper we focus on three aspects concerning support for the execution of cross-organisational workflows that have been modelled with a workflow view approach: (i) communication between the entities of a view-based workflow model, (ii) their impact on an extended workflow engine, and (iii) the design of a cross-organisational workflow architecture (CWA). We consider communication aspects in terms of state dependencies and control flow dependencies. We propose to tightly couple private workflow and workflow view with state dependencies, and to loosely couple workflow views with control flow dependencies. We introduce a Petri-Net-based state transition approach that binds states of private workflow tasks to their adjacent workflow view task. On the basis of these communication aspects we develop a CWA for view-based cross-organisational workflow execution. Its concepts are valid for mediated and unmediated interactions and do not prescribe a particular technology. The concepts are demonstrated by a scenario run by two extended workflow management systems. (C) 2004 Elsevier B.V. All rights reserved.
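The tight state coupling between a private task and its adjacent view task can be pictured as a simple state-propagation hook (a minimal sketch; the state names, mapping, and class layout are invented for illustration and do not reproduce the paper's Petri-Net formalism):

```python
# Hypothetical mapping from private-task states to externally visible
# view-task states; internal states without an entry stay hidden.
STATE_MAP = {"running": "in_progress", "completed": "done", "failed": "aborted"}

class ViewTask:
    """Externally visible task; its state is driven by the private task."""
    def __init__(self):
        self.state = "idle"

class PrivateTask:
    """Internal task tightly coupled to its view task via state dependencies."""
    def __init__(self, view_task):
        self.view = view_task
        self.state = "idle"

    def set_state(self, new_state):
        self.state = new_state
        # State dependency: every private transition with an external
        # counterpart is immediately reflected on the view task.
        if new_state in STATE_MAP:
            self.view.state = STATE_MAP[new_state]
```

Loose coupling between workflow views would, by contrast, exchange only control-flow messages rather than mirroring state, which is why the two dependency kinds are treated differently.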
Abstract:
The aim of this study was to define the determinants of the linear hepatic disposition kinetics of propranolol optical isomers using a perfused rat liver. Monensin was used to abolish the lysosomal proton gradient to allow an estimation of propranolol ion trapping by hepatic acidic vesicles. In vitro studies were used for independent estimates of microsomal binding and intrinsic clearance. Hepatic extraction and mean transit time were determined from outflow-concentration profiles using a nonparametric method. Kinetic parameters were derived from a physiologically based pharmacokinetic model. Modeling showed an approximate 34-fold decrease in ion trapping following monensin treatment. The observed model-derived ion trapping was similar to estimated theoretical values. No differences in ion-trapping values were found between R(+)- and S(-)-propranolol. Hepatic propranolol extraction was sensitive to changes in liver perfusate flow, permeability-surface area product, and intrinsic clearance. Ion trapping, microsomal and nonspecific binding, and distribution of unbound propranolol accounted for 47.4, 47.1, and 5.5% of the sequestration of propranolol in the liver, respectively. It is concluded that the physiologically more active S(-)-propranolol differs from the R(+)-isomer in having higher permeability-surface area product, intrinsic clearance, and intracellular binding site values.
Abstract:
Electricity market price forecasting is a challenging yet very important task for electricity market managers and participants. Due to the complexity and uncertainties in the power grid, electricity prices are highly volatile and normally carry spikes, which may be tens or even hundreds of times higher than the normal price. Such electricity spikes are very difficult to predict. So far, most research on electricity price forecasting has been based on normal-range electricity prices. This paper proposes a data mining based electricity price forecast framework, which can predict the normal price as well as the price spikes. The normal price can be predicted by a previously proposed wavelet and neural network based forecast model, while the spikes are forecast with a data mining approach. This paper focuses on spike prediction and explores the reasons for price spikes based on the measurement of a proposed composite supply-demand balance index (SDI) and relative demand index (RDI). These indices are able to reflect the relationship among electricity demand, electricity supply and electricity reserve capacity. The proposed model is based on a mining database including market clearing price, trading hour, electricity demand, electricity supply and reserve. Bayesian classification and similarity searching techniques are used to mine the database to find the internal relationships between electricity price spikes and these proposed indices. The mining results are used to form the price spike forecast model. The proposed model is able to generate the forecasted price spike, the level of the spike, and an associated forecast confidence level. The model is tested with Queensland electricity market data with promising results. Crown Copyright (C) 2004 Published by Elsevier B.V. All rights reserved.
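A Bayesian spike classifier of the kind described can be sketched with a from-scratch Gaussian naive Bayes over the two proposed indices (the training values below are invented for illustration; the paper's actual mining database also includes price, trading hour, demand, supply, and reserve):

```python
import math

def fit(X, y):
    """Per-class priors, feature means and variances for Gaussian naive Bayes."""
    params = {}
    for c in set(y):
        rows = [x for x, label in zip(X, y) if label == c]
        n = len(rows)
        means = [sum(col) / n for col in zip(*rows)]
        variances = [max(sum((v - m) ** 2 for v in col) / n, 1e-9)
                     for col, m in zip(zip(*rows), means)]
        params[c] = (n / len(y), means, variances)
    return params

def predict(params, x):
    """Class with the highest posterior log-probability for feature vector x."""
    def log_post(prior, means, variances):
        lp = math.log(prior)
        for v, m, s2 in zip(x, means, variances):
            lp += -0.5 * math.log(2.0 * math.pi * s2) - (v - m) ** 2 / (2.0 * s2)
        return lp
    return max(params, key=lambda c: log_post(*params[c]))
```

With hypothetical (SDI, RDI) pairs where spike hours show a tight supply-demand margin (low SDI) and high relative demand (high RDI), the classifier separates spike from normal hours as the framework intends.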
Abstract:
Computer modelling promises to be an important tool for analysing and predicting interactions between trees within mixed species forest plantations. This study explored the use of an individual-based mechanistic model as a predictive tool for designing mixed species plantations of Australian tropical trees. The 'spatially explicit individual-based forest simulator' (SeXI-FS) modelling system was used to describe the spatial interaction of individual tree crowns within a binary mixed-species experiment. The three-dimensional model was developed and verified with field data from three forest tree species grown in tropical Australia. The model predicted the interactions within monocultures and binary mixtures of Flindersia brayleyana, Eucalyptus pellita and Elaeocarpus grandis, accounting for an average of 42% of the growth variation exhibited by species in different treatments. The model requires only structural dimensions and shade tolerance as species parameters. By modelling interactions in existing tree mixtures, the model predicted both increases and reductions in the growth of mixtures (up to +/- 50% of stem volume at 7 years) compared to monocultures. This modelling approach may be useful for designing mixed tree plantations. (c) 2006 Published by Elsevier B.V.
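A basic geometric building block of such spatially explicit crown competition is the overlap between neighbouring crowns, which has a closed form for circular crowns (an illustrative 2-D helper only; SeXI-FS itself models three-dimensional crowns and this sketch is not taken from it):

```python
import math

def crown_overlap_area(r1, r2, d):
    """Intersection area of two circular crowns of radii r1 and r2 whose
    centres are a distance d apart (the classical circle-lens formula).
    A mixed-species simulator could use this as a raw competition index."""
    if d >= r1 + r2:
        return 0.0                         # crowns do not touch
    if d <= abs(r1 - r2):
        return math.pi * min(r1, r2) ** 2  # smaller crown fully overlapped
    a1 = r1 * r1 * math.acos((d * d + r1 * r1 - r2 * r2) / (2 * d * r1))
    a2 = r2 * r2 * math.acos((d * d + r2 * r2 - r1 * r1) / (2 * d * r2))
    tri = 0.5 * math.sqrt((-d + r1 + r2) * (d + r1 - r2)
                          * (d - r1 + r2) * (d + r1 + r2))
    return a1 + a2 - tri
```

Weighting such overlaps by each species' shade tolerance is one plausible way a model needing only structural dimensions and shade tolerance could score neighbour interactions.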
Abstract:
Anaerobic digestion is a multistep process, mediated by a functionally and phylogenetically diverse microbial population. One of the crucial steps is oxidation of organic acids, with electron transfer via hydrogen or formate from acetogenic bacteria to methanogens. This syntrophic microbiological process is strongly restricted by a thermodynamic limitation on the allowable hydrogen or formate concentration. In order to study this process in more detail, we developed an individual-based biofilm model that enables us to describe the processes at microbial resolution. The biochemical model is the ADM1, implemented in a multidimensional domain. With this model, we evaluated three important issues for the syntrophic relationship: (i) is there a fundamental difference in using hydrogen or formate as the electron carrier? (ii) does a thermodynamic-based inhibition function produce substantially different results from an empirical function? and (iii) does the physical colocation of acetogens and methanogens follow directly from a general model? Hydrogen or formate as electron carrier had no substantial impact on model results. Standard inhibition functions or the thermodynamic inhibition function gave similar results at larger substrate field grid sizes (> 10 μm), but at smaller grid sizes, the thermodynamic-based function reduced the number of cells with long interspecies distances (> 2.5 μm). Therefore, a very fine grid resolution is needed to reflect differences between the thermodynamic function and a more generic inhibition form. The co-location of syntrophic bacteria was well predicted without a need to assume a microbiologically based mechanism (e.g., through chemotaxis) of biofilm formation.
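The contrast between an empirical and a thermodynamic inhibition term can be sketched as follows (the functional forms and constants are illustrative placeholders chosen for this sketch, not the ADM1 terms used in the study):

```python
def empirical_inhibition(h2, k_i=1e-6):
    """Empirical non-competitive inhibition by dissolved hydrogen (mol/L):
    the rate multiplier falls smoothly as H2 accumulates, regardless of
    whether the reaction is still thermodynamically feasible."""
    return 1.0 / (1.0 + h2 / k_i)

def thermodynamic_inhibition(dg_reaction, dg_critical=-15e3):
    """Thermodynamic limitation: the multiplier is 0 when the reaction
    Gibbs energy (J/mol) is non-negative (infeasible) and ramps linearly
    to 1 as the reaction becomes strongly exergonic."""
    if dg_reaction >= 0.0:
        return 0.0
    return min(1.0, dg_reaction / dg_critical)
```

The key qualitative difference, visible even in this sketch, is that the thermodynamic form imposes a hard feasibility cut-off while the empirical form only attenuates the rate, which is why the two diverge for cells far from their syntrophic partners.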
Abstract:
The retrieval of wind fields from scatterometer observations has traditionally been separated into two phases: local wind vector retrieval and ambiguity removal. Operationally, a forward model relating wind vector to backscatter is inverted, typically using look-up tables, to retrieve up to four local wind vector solutions. A heuristic procedure, using numerical weather prediction forecast wind vectors and, often, some neighbourhood comparison, is then used to select the correct solution. In this paper we develop a Bayesian method for wind field retrieval, and show how a direct local inverse model, relating backscatter to wind vector, improves the wind vector retrieval accuracy. We compare these results with the operational U.K. Meteorological Office retrievals, our own CMOD4 retrievals and a neural network based local forward model retrieval. We suggest that the neural network based inverse model, which is extremely fast to use, improves upon current forward models when used in a variational data assimilation scheme.
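The traditional forward-model inversion step can be sketched as a look-up-table search (the geophysical model function below is a toy stand-in, not CMOD4, and for brevity it returns only the single best solution rather than the up-to-four ambiguous ones):

```python
import math

def forward_model(speed, direction_deg):
    """Toy geophysical model function: backscatter grows with wind speed
    and varies harmonically with wind direction. NOT CMOD4; coefficients
    are invented for illustration."""
    chi = math.radians(direction_deg)
    return 0.01 * speed ** 1.6 * (1.0 + 0.4 * math.cos(chi)
                                  + 0.2 * math.cos(2.0 * chi))

def invert(sigma0):
    """Grid-search inversion of the forward model: return the (speed m/s,
    direction deg) pair whose predicted backscatter best matches sigma0."""
    best, best_err = None, float("inf")
    for s10 in range(10, 251):           # speeds 1.0 .. 25.0 m/s
        speed = s10 / 10.0
        for direction in range(0, 360, 5):
            err = (forward_model(speed, direction) - sigma0) ** 2
            if err < best_err:
                best, best_err = (speed, direction), err
    return best
```

A direct inverse model, as the paper proposes, replaces this per-observation search with a single fast function evaluation from backscatter to wind vector.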
Abstract:
Data Envelopment Analysis (DEA) is a nonparametric method for measuring the efficiency of a set of decision making units such as firms or public sector agencies, first introduced into the operational research and management science literature by Charnes, Cooper, and Rhodes (CCR) [Charnes, A., Cooper, W.W., Rhodes, E., 1978. Measuring the efficiency of decision making units. European Journal of Operational Research 2, 429–444]. The original DEA models were applicable only to technologies characterized by positive inputs/outputs. In subsequent literature there have been various approaches to enable DEA to deal with negative data. In this paper, we propose a semi-oriented radial measure, which permits the presence of variables which can take both negative and positive values. The model is applied to data on a notional effluent processing system to compare the results with those yielded by two alternative methods for dealing with negative data in DEA: The modified slacks-based model suggested by Sharp et al. [Sharp, J.A., Liu, W.B., Meng, W., 2006. A modified slacks-based measure model for data envelopment analysis with ‘natural’ negative outputs and inputs. Journal of Operational Research Society 57 (11) 1–6] and the range directional model developed by Portela et al. [Portela, M.C.A.S., Thanassoulis, E., Simpson, G., 2004. A directional distance approach to deal with negative data in DEA: An application to bank branches. Journal of Operational Research Society 55 (10) 1111–1121]. A further example explores the advantages of using the new model.
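For intuition about what DEA efficiency scores mean in the original positive-data setting, the CCR model collapses in the single-input, single-output case to productivity ratios normalised by the best unit (a degenerate special case for illustration only; it is not the semi-oriented radial measure proposed in the paper, which requires solving linear programmes):

```python
def ccr_efficiency_1in_1out(inputs, outputs):
    """CCR efficiency scores for the special case of one positive input
    and one positive output per decision making unit, where the LP
    reduces to each unit's output/input ratio divided by the best ratio."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]
```

For example, three units with inputs [2, 4, 4] and outputs [4, 4, 8] score [1.0, 0.5, 1.0]: the first and third define the efficient frontier and the second produces half as much output per unit input.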
A conceptual analysis of m-technology as a medium towards e-governance society in emerging countries
Abstract:
This paper shows how mobile phone technology can influence the development of e-governance in emerging countries. We evaluate the conditions under which consumers really engage with m-services. We argue that the lower cost of m-infrastructure makes it more appropriate than the Internet-based structure, which prevents a large part of the population from accessing ICT. However, m-technology should not be separated from the Internet but integrated with it, allowing emerging countries to leapfrog the technological cycle.
Abstract:
The research examines the deposition of airborne particles which contain heavy metals and investigates the methods that can be used to identify their sources. The research focuses on lead and cadmium because these two metals are of growing public and scientific concern on environmental health grounds. The research consists of three distinct parts. The first is the development and evaluation of a new deposition measurement instrument - the deposit cannister - designed specifically for large-scale surveys in urban areas. The deposit cannister is specifically designed to be cheap, robust, and versatile and therefore to permit comprehensive high-density urban surveys. The siting policy reduces contamination from locally resuspended surface-dust. The second part of the research has involved detailed surveys of heavy metal deposition in Walsall, West Midlands, using the new high-density measurement method. The main survey, conducted over a six-week period in November - December 1982, provided 30-day samples of deposition at 250 different sites. The results have been used to examine the magnitude and spatial variability of deposition rates in the case-study area, and to evaluate the performance of the measurement method. The third part of the research has been to conduct a 'source-identification' exercise. The methods used have been Receptor Models - Factor Analysis and Cluster Analysis - and a predictive source-based deposition model. The results indicate that there are six main source processes contributing to deposition of metals in the Walsall area: coal combustion, vehicle emissions, ironfounding, copper refining, and two general industrial/urban processes. A source-based deposition model has been calibrated using factor scores for one source factor as the dependent variable, rather than metal deposition rates, thus avoiding problems traditionally encountered in calibrating models in complex multi-source areas. Empirical evidence supports the hypothesised association of this factor with emissions of metals from the ironfoundry industry.
Abstract:
This thesis records the design and development of an electrically driven, air to water, vapour compression heat pump of nominally 6 kW heat output, for residential space heating. The study was carried out on behalf of GEC Research Ltd through the Interdisciplinary Higher Degrees Scheme at Aston University. A computer based mathematical model of the vapour compression cycle was produced as a design aid, to enable the effects of component design changes or variations in operating conditions to be predicted. This model is supported by performance testing of the major components, which revealed that improvements in the compressor isentropic efficiency offer the greatest potential for further increases in cycle COPh. The evaporator was designed from first principles, and is based on wire-wound heat transfer tubing. Two evaporators, of air-side areas 10.27 and 16.24 m², were tested in a temperature and humidity controlled environment, demonstrating that the benefits of the large coil are greater heat pump heat output and lower noise levels. A systematic study of frost growth rates suggested that this problem is most severe at the conditions of saturated air at 0 °C combined with low condenser water temperature. A dynamic simulation model was developed to predict the in-service performance of the heat pump. This study confirmed the importance of an adequate radiator area for heat pump installations. A prototype heat pump was designed and manufactured, consisting of a hermetic reciprocating compressor, a coaxial tube condenser and a helically coiled evaporator, using Refrigerant 22. The prototype was field tested in a domestic environment for one and a half years. The installation included a comprehensive monitoring system. Initial problems were encountered with defrosting and compressor noise, both of which were solved. The unit then operated throughout the 1985/86 heating season without further attention, producing a COPh of 2.34.
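The thermodynamic ceiling on a reported COPh can be checked against the Carnot heating COP (a textbook bound, not the thesis's cycle model; the temperatures in the example are illustrative):

```python
def carnot_cop_heating(t_cond_c, t_evap_c):
    """Ideal (Carnot) heating coefficient of performance between the
    condensing and evaporating temperatures, given in degrees Celsius."""
    t_cond = t_cond_c + 273.15  # convert to kelvin
    t_evap = t_evap_c + 273.15
    return t_cond / (t_cond - t_evap)
```

With, say, a 50 °C condenser and a 0 °C evaporator the bound is about 6.46, so a measured COPh of 2.34 would correspond to roughly a third of the ideal, a plausible figure once compressor isentropic losses and heat exchanger temperature differences are counted.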
Abstract:
WiMAX has been introduced as a competitive alternative for metropolitan broadband wireless access technologies. It is connection oriented and it can provide very high data rates, large service coverage, and flexible quality of services (QoS). Due to the large number of connections and flexible QoS supported by WiMAX, the uplink access in WiMAX networks is very challenging since the medium access control (MAC) protocol must efficiently manage the bandwidth and related channel allocations. In this paper, we propose and investigate a cost-effective WiMAX bandwidth management scheme, named the WiMAX partial sharing scheme (WPSS), in order to provide good QoS while achieving better bandwidth utilization and network throughput. The proposed bandwidth management scheme is compared with a simple but inefficient scheme, named the WiMAX complete sharing scheme (WCPS). A maximum entropy (ME) based analytical model (MEAM) is proposed for the performance evaluation of the two bandwidth management schemes. The reason for using MEAM for the performance evaluation is that MEAM can efficiently model a large-scale system in which the number of stations or connections is generally very high, while the traditional simulation and analytical (e.g., Markov models) approaches cannot perform well due to the high computation complexity. We model the bandwidth management scheme as a queuing network model (QNM) that consists of interacting multiclass queues for different service classes. Closed form expressions for the state and blocking probability distributions are derived for those schemes. Simulation results verify the MEAM numerical results and show that WPSS can significantly improve the network's performance compared to WCPS.
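As a smaller-scale reference point for the blocking probabilities that such queueing network models produce, the classical single-class M/M/1/K queue has a closed-form blocking probability (a textbook formula, not the maximum-entropy multiclass model of the paper):

```python
def mm1k_blocking(rho, k):
    """Steady-state probability that an arriving customer finds an
    M/M/1/K queue full (rho = arrival rate / service rate, k = capacity).
    By PASTA this equals the stationary probability of state k."""
    if abs(rho - 1.0) < 1e-12:
        return 1.0 / (k + 1)              # uniform stationary distribution
    return (1.0 - rho) * rho ** k / (1.0 - rho ** (k + 1))
```

A bandwidth management scheme such as WPSS effectively trades these per-class blocking probabilities against one another by partially sharing capacity between service classes.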
Abstract:
The concept of soft state (i.e., state that expires unless refreshed) has been widely used in the design of network signaling protocols. The approaches to refreshing state in multi-hop networks can be classified into end-to-end (E2E) and hop-by-hop (HbH) refreshes. In this article we propose an effective Markov chain based analytical model for both E2E and HbH refresh approaches. Simulations verify the analytical models, which can be used to study the impacts of link characteristics on performance (e.g., state synchronization and message overhead) and as a guide for configuring and optimizing soft state signaling protocols. © 2009 IEEE.
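The flavour of such a Markov-chain analysis can be sketched for a single soft-state entry whose timer tolerates up to k-1 consecutive lost refreshes before expiring (the chain structure and per-interval loss probability are illustrative assumptions, not the article's model):

```python
def soft_state_expiry_probability(p_loss, k, iters=10000):
    """Steady-state probability that a soft-state entry is expired.
    States 0..k-1 count consecutive missed refreshes; state k = expired.
    Each refresh interval the refresh message is lost with prob. p_loss.
    Computed by power iteration on the chain's transition structure."""
    pi = [1.0] + [0.0] * k
    for _ in range(iters):
        nxt = [0.0] * (k + 1)
        for i in range(k):
            nxt[i + 1] += pi[i] * p_loss        # refresh lost: one step closer
            nxt[0] += pi[i] * (1.0 - p_loss)    # refresh received: timer reset
        nxt[k] += pi[k] * p_loss                # still expired
        nxt[0] += pi[k] * (1.0 - p_loss)        # next refresh reinstalls state
        pi = nxt
    return pi[k]
```

For this chain the stationary expiry probability works out to p_loss**k, which makes the qualitative trade-off explicit: a longer timeout (larger k) improves state synchronization at the cost of slower removal of genuinely dead state.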
Abstract:
The IEEE 802.15.4 standard is a relatively new standard designed for low-power, low-data-rate wireless sensor networks (WSNs), which have a wide range of applications, e.g., environment monitoring, e-health, and home and industry automation. In this paper, we investigate the problem of hidden devices in coverage-overlapped IEEE 802.15.4 WSNs, which is likely to arise when multiple 802.15.4 WSNs are deployed closely and independently. We consider a typical scenario of two 802.15.4 WSNs with partial coverage overlap and propose a Markov-chain based analytical model to reveal the performance degradation due to hidden devices in the overlapping region. The impacts of hidden devices and network sleeping modes on saturated throughput and energy consumption are modeled. The analytical model is verified by simulations and can provide insights for network design and planning when multiple 802.15.4 WSNs are deployed closely. © 2013 IEEE.
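The core hidden-device effect can be approximated with the classical vulnerable-window argument (a Poisson-traffic approximation commonly used for hidden terminals, not the paper's full 802.15.4 Markov model; the rates in the example are illustrative):

```python
import math

def hidden_node_collision_prob(rate_hz, frame_time_s):
    """Probability that a frame collides with a transmission from a hidden
    device that cannot be sensed by carrier sensing. Assuming the hidden
    device starts frames as a Poisson process of the given rate, a frame
    of duration T is vulnerable over a 2T window (overlap from either
    side), so P(collision) = 1 - exp(-2 * rate * T)."""
    return 1.0 - math.exp(-2.0 * rate_hz * frame_time_s)
```

Even a modest hidden-device rate degrades throughput noticeably because carrier sensing offers no protection inside the overlap region, which is the degradation the paper's model quantifies for 802.15.4 specifically.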