929 results for Continuous Variable Systems


Relevance: 30.00%

Abstract:

Attention is drawn to a need for caution in the determination of binding data for protein-polyelectrolyte interactions by frontal analysis continuous capillary electrophoresis (FACCE). Because the method is valid only for systems involving comigration of complex(es) and slower-migrating reactant, establishing conformity with that condition is clearly a prerequisite for its application. However, that requirement has not been tested in any published studies thus far. On the basis of calculated FACCE patterns, presented to illustrate features by which such comigration of complex(es) and slower-migrating reactant can be identified, the form of the published pattern for a β-lactoglobulin-poly(styrenesulfonate) mixture does not seem to signify the migration behavior required to justify its consideration in such terms. Additional experimental studies are therefore needed to ascertain the validity of FACCE as a means of determining binding data for the characterization of protein-polyelectrolyte interactions. (c) 2005 Elsevier Inc. All rights reserved.

Relevance: 30.00%

Abstract:

A comparison of a constant (continuous delivery of 4% FiO2) and a variable (initial 5% FiO2 with adjustments to induce low amplitude EEG (LAEEG) and hypotension) hypoxic/ischemic insult was performed to determine which insult was more effective in producing a consistent degree of survivable neuropathological damage in a newborn piglet model of perinatal asphyxia. We also examined which physiological responses contributed to this outcome. Thirty-nine 1-day-old piglets were subjected to either a constant hypoxic/ischemic insult of 30- to 37-min duration or a variable hypoxic/ischemic insult of 30-min low peak amplitude EEG (LAEEG < 5 μV) including 10 min of low mean arterial blood pressure (MABP < 70% of baseline). Control animals (n = 6) received 21% FiO2 for the duration of the experiment. At 72 h, the piglets were euthanased, their brains removed and fixed in 4% paraformaldehyde and assessed for hypoxic/ischemic injury by histological analysis. Based on neuropathology scores, piglets were grouped as undamaged or damaged; piglets that did not survive to 72 h were grouped separately as dead. The variable insult resulted in a greater number of piglets with neuropathological damage (undamaged = 12.5%, damaged = 68.75%, dead = 18.75%) while the constant insult resulted in a large proportion of undamaged piglets (undamaged = 50%, damaged = 22.2%, dead = 27.8%). A hypoxic insult varied to maintain peak amplitude EEG < 5 μV results in a greater number of survivors with a consistent degree of neuropathological damage than a constant hypoxic insult. Physiological variables MABP, LAEEG, pH and arterial base excess were found to be significantly associated with neuropathological outcome. (c) 2006 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

Quantile computation has many applications including data mining and financial data analysis. It has been shown that an ε-approximate summary can be maintained so that, given a quantile query (φ, ε), the data item at rank ⌈φN⌉ may be approximately obtained within rank error εN over all N data items in a data stream or in a sliding window. However, scalable online processing of massive continuous quantile queries with different φ and ε poses a new challenge because the summary is continuously updated with new arrivals of data items. In this paper, first we aim to dramatically reduce the number of distinct query results by grouping a set of different queries into a cluster so that they can be processed virtually as a single query while the precision requirements from users can be retained. Second, we aim to minimize the total query processing costs. Efficient algorithms are developed to minimize the total number of times for reprocessing clusters and to produce the minimum number of clusters, respectively. The techniques are extended to maintain near-optimal clustering when queries are registered and removed in an arbitrary fashion against whole data streams or sliding windows. In addition to theoretical analysis, our performance study indicates that the proposed techniques are indeed scalable with respect to the number of input queries as well as the number of items and the item arrival rate in a data stream.
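As a concrete illustration of the ε-approximate guarantee described above, the sketch below answers a (φ, ε) query against a fully sorted list standing in for the summary; the function names and the exact-lookup shortcut are illustrative assumptions, not the paper's data structure.

```python
import math

def approx_quantile(sorted_items, phi, eps):
    """Return an item whose rank is within eps*N of the target rank ceil(phi*N).

    A real eps-approximate summary stores far fewer than N items; exact
    lookup on the full sorted list stands in for it here.
    """
    n = len(sorted_items)
    target = math.ceil(phi * n)                    # target rank, 1-based
    return sorted_items[max(1, min(n, target)) - 1]

def within_guarantee(sorted_items, answer, phi, eps):
    """Check the rank-error bound |rank(answer) - ceil(phi*N)| <= eps*N."""
    n = len(sorted_items)
    rank = sorted_items.index(answer) + 1
    return abs(rank - math.ceil(phi * n)) <= eps * n

data = list(range(1, 101))                          # N = 100 items, already sorted
ans = approx_quantile(data, 0.5, 0.01)
print(ans, within_guarantee(data, ans, 0.5, 0.01))  # 50 True
```

Clustering several queries, as the paper proposes, amounts to answering all queries whose (φ, ε) intervals overlap with a single representative lookup.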

Relevance: 30.00%

Abstract:

An innovative method for modelling biological processes under anaerobic conditions is presented and discussed. The method is based on titrimetric and off-gas measurements. Titrimetric data is recorded as the addition rate of hydroxyl ions or protons that is required to maintain pH in a bioreactor at a constant level. An off-gas analysis arrangement measures, among other things, the transfer rate of carbon dioxide. The integration of these signals results in a continuous signal which is solely related to the biological reactions. When coupled with a mathematical model of the biological reactions, the signal allows a detailed characterisation of these reactions, which would otherwise be difficult to achieve. Two applications of the method to the enhanced biological phosphorus removal processes are presented and discussed to demonstrate the principle and effectiveness of the method.
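A minimal sketch of the signal integration the method rests on: the titrant dosing rate and the measured CO2 transfer rate are combined into one proton-balance signal attributed to the biological reactions alone. The function name, units and the one-to-one proton/CO2 factor are assumptions for illustration, not the authors' calibrated model.

```python
def biological_proton_signal(titrant_rate, co2_transfer_rate, protons_per_co2=1.0):
    """Combine titrimetric and off-gas signals (both in meq/h equivalents).

    CO2 stripped from the liquid removes acidity that the pH controller
    then does not have to compensate, so its effect is added back to
    recover the purely biological proton production rate.
    """
    return [t + protons_per_co2 * c
            for t, c in zip(titrant_rate, co2_transfer_rate)]

# two sampling instants: dosing rates 1.0 and 2.0, CO2 transfer 0.5 at each
print(biological_proton_signal([1.0, 2.0], [0.5, 0.5]))  # [1.5, 2.5]
```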

Relevance: 30.00%

Abstract:

We demonstrate that the process of generating smooth transitions can be viewed as a natural result of the filtering operations implied in the generation of discrete-time series observations from the sampling of data from an underlying continuous time process that has undergone a process of structural change. In order to focus discussion, we utilize the problem of estimating the location of abrupt shifts in some simple time series models. This approach will permit us to address salient issues relating to distortions induced by the inherent aggregation associated with discrete-time sampling of continuous time processes experiencing structural change. We also address the issue of how time irreversible structures may be generated within the smooth transition processes. (c) 2005 Elsevier Inc. All rights reserved.
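The mechanism can be made concrete with a toy calculation: averaging a continuous-time path that jumps abruptly over discrete sampling intervals produces an observation series that moves smoothly through the break. All values below are invented for illustration.

```python
def sampled_series(break_time, level_before, level_after, sample_period, n_samples):
    """Average a step-shift continuous-time path over each sampling interval."""
    out = []
    for k in range(n_samples):
        start, end = k * sample_period, (k + 1) * sample_period
        if end <= break_time:
            out.append(level_before)
        elif start >= break_time:
            out.append(level_after)
        else:
            # the interval straddles the break: time-weighted average,
            # which is where the apparently smooth transition comes from
            w = (break_time - start) / sample_period
            out.append(w * level_before + (1 - w) * level_after)
    return out

# an abrupt 0 -> 1 shift at t = 2.5, sampled by averaging over unit intervals
print(sampled_series(2.5, 0.0, 1.0, 1.0, 5))  # [0.0, 0.0, 0.5, 1.0, 1.0]
```

The third observation sits halfway between the two regimes even though the underlying process never takes that value, which is exactly the aggregation distortion the paper discusses.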

Relevance: 30.00%

Abstract:

The relative merits of different systems of property rights to allocate water among different extractive uses are evaluated for the case where variability of supply is important. Three systems of property rights are considered. In the first, variable supply is dealt with through the use of water entitlements defined as shares of the total quantity available. In the second, there are two types of water entitlements, one for water with a high security of supply and the other a lower security right for the residual supply. The third is a system of entitlements specified as state-contingent claims. With zero transaction costs, all systems are efficient. In the realistic situation where transaction costs matter, the system based on state-contingent claims is globally optimal, and the system with high-security and lower security entitlements is preferable to the system with share entitlements.
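An illustrative comparison of two of the entitlement systems above: share entitlements versus a split into high-security and lower-security rights. All holders, quantities and supply figures are invented for the example.

```python
def share_allocation(shares, supply):
    """Each holder receives their fixed share of whatever supply eventuates."""
    return {h: s * supply for h, s in shares.items()}

def security_allocation(high, low, supply):
    """High-security entitlements are filled first (pro rata if supply falls
    short); lower-security holders share any residual pro rata."""
    total_high = sum(high.values())
    scale = min(1.0, supply / total_high) if total_high else 0.0
    alloc = {h: q * scale for h, q in high.items()}
    residual = max(0.0, supply - total_high)
    total_low = sum(low.values())
    for h, q in low.items():
        alloc[h] = alloc.get(h, 0.0) + (residual * q / total_low if total_low else 0.0)
    return alloc

# a dry year: 80 units available against 60 high-security and 40 lower-security units
print(share_allocation({"A": 0.6, "B": 0.4}, 80))
print(security_allocation({"A": 40, "B": 20}, {"C": 40}, 80))
```

Under shares, every holder bears the shortfall proportionally; under the two-tier system the shortfall falls entirely on the lower-security holder, which is the risk-allocation difference the paper evaluates.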

Relevance: 30.00%

Abstract:

Traditional real-time control systems are tightly integrated into the industrial processes they govern. Now, however, there is increasing interest in networked control systems. These provide greater flexibility and cost savings by allowing real-time controllers to interact with industrial processes over existing communications networks. New data packet queuing protocols are currently being developed to enable precise real-time control over a network with variable propagation delays. We show how one such protocol was formally modelled using timed automata, and how model checking was used to reveal subtle aspects of the control system's dynamic behaviour.

Relevance: 30.00%

Abstract:

Although managers consider accurate, timely, and relevant information as critical to the quality of their decisions, evidence of large variations in data quality abounds. Over a period of twelve months, the action research project reported herein attempted to investigate and track data quality initiatives undertaken by the participating organisation. The investigation focused on two types of errors: transaction input errors and processing errors. Whenever the action research initiative identified non-trivial errors, the participating organisation introduced actions to correct the errors and prevent similar errors in the future. Data quality metrics were taken quarterly to measure improvements resulting from the activities undertaken during the action research project. The action research project results indicated that for a mission-critical database to ensure and maintain data quality, commitment to continuous data quality improvement is necessary. Also, communication among all stakeholders is required to ensure common understanding of data quality improvement goals. The action research project found that to further substantially improve data quality, structural changes within the organisation and to the information systems are sometimes necessary. The major goal of the action research study is to increase the level of data quality awareness within all organisations and to motivate them to examine the importance of achieving and maintaining high-quality data.

Relevance: 30.00%

Abstract:

Despite extensive progress on the theoretical aspects of spectrally efficient communication systems, hardware impairments, such as phase noise, are the key bottlenecks in next generation wireless communication systems. The presence of non-ideal oscillators at the transceiver introduces time-varying phase noise and degrades the performance of the communication system. Significant research literature focuses on joint synchronization and decoding based on the joint posterior distribution, which incorporates both the channel and the code graph. These joint synchronization and decoding approaches operate on well-designed sum-product algorithms, which involve calculating probabilistic messages iteratively passed between the channel statistical information and decoding information. Channel statistical information generally entails a high computational complexity because its probabilistic model may involve continuous random variables. The detailed knowledge about the channel statistics required by these algorithms makes them an inadequate choice for real world applications due to power and computational limitations. In this thesis, novel phase estimation strategies are proposed: soft decision-directed iterative receivers for separate A Posteriori Probability (APP)-based synchronization and decoding. These algorithms do not require any a priori statistical characterization of the phase noise process. The proposed approach relies on a Maximum A Posteriori (MAP)-based algorithm to perform phase noise estimation and does not depend on the considered modulation/coding scheme, as it only exploits the APPs of the transmitted symbols. Different variants of APP-based phase estimation are considered. The proposed algorithm has significantly lower computational complexity with respect to joint synchronization/decoding approaches at the cost of slight performance degradation.
With the aim of improving the robustness of the iterative receiver, we derive a new system model for an oversampled (more than one sample per symbol interval) phase noise channel. We extend the separate APP-based synchronization and decoding algorithm to a multi-sample receiver, which exploits the received information from the channel by exchanging the information in an iterative fashion to achieve robust convergence. Two algorithms based on sliding block-wise processing with soft ISI cancellation and detection are proposed, based on the use of reliable information from the channel decoder. Dually polarized systems provide a cost- and space-effective solution to increase spectral efficiency and are competitive candidates for next generation wireless communication systems. A novel soft decision-directed iterative receiver, for separate APP-based synchronization and decoding, is proposed. This algorithm relies on a Minimum Mean Square Error (MMSE)-based cancellation of the cross-polarization interference (XPI) followed by phase estimation on the polarization of interest. This iterative receiver structure is motivated by Master/Slave Phase Estimation (M/S-PE), where M-PE corresponds to the polarization of interest. The operational principle of an M/S-PE block is to improve the phase tracking performance of both polarization branches: more precisely, the M-PE block tracks the co-polar phase and the S-PE block reduces the residual phase error on the cross-polar branch. Two variants of MMSE-based phase estimation are considered: BW and PLP.
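A much-simplified sketch of the decision-directed idea: a first-order loop driven by hard QPSK decisions, which needs no a priori phase noise statistics. This is not the thesis's APP/MAP or MMSE algorithm; the loop gain, constellation and constant phase offset are illustrative choices.

```python
import cmath, math

QPSK = [cmath.exp(1j * (math.pi / 4 + k * math.pi / 2)) for k in range(4)]

def nearest_symbol(z):
    """Hard decision: closest QPSK constellation point."""
    return min(QPSK, key=lambda s: abs(z - s))

def track_phase(received, loop_gain=0.1):
    """First-order decision-directed phase tracker.

    De-rotates each sample by the current phase estimate, takes a hard
    decision, and nudges the estimate by the residual phase error.
    """
    theta = 0.0
    estimates = []
    for r in received:
        z = r * cmath.exp(-1j * theta)        # de-rotate
        d = nearest_symbol(z)                 # decision
        err = cmath.phase(z * d.conjugate())  # residual phase error
        theta += loop_gain * err              # loop update
        estimates.append(theta)
    return estimates

# symbols distorted by a constant 0.3 rad phase offset
received = [QPSK[k % 4] * cmath.exp(1j * 0.3) for k in range(200)]
print(round(track_phase(received)[-1], 3))  # 0.3
```

Soft decision-directed receivers replace the hard decision with symbol APPs fed back from the decoder, which is the step the thesis develops.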

Relevance: 30.00%

Abstract:

The amplification of demand variation up a supply chain, widely termed ‘the Bullwhip Effect’, is disruptive, costly and something that supply chain management generally seeks to minimise. It was originally attributed to poor system design: deficiencies in policies, organisation structure and delays in material and information flow all lead to sub-optimal reorder point calculation. It has since been attributed to exogenous random factors such as uncertainties in demand, supply and distribution lead time, but these causes are not exclusive, as academic and operational studies have since shown that orders and/or inventories can exhibit significant variability even if customer demand and lead time are deterministic. This increase in the range of possible causes of dynamic behaviour indicates that our understanding of the phenomenon is far from complete. One possible, yet previously unexplored, factor that may influence dynamic behaviour in supply chains is the application and operation of supply chain performance measures. Organisations monitoring and responding to their adopted key performance metrics will make operational changes, and this action may influence the level of dynamics within the supply chain, possibly degrading the performance of the very system they were intended to measure. To explore this, a plausible abstraction of the operational responses to the Supply Chain Council’s SCOR® (Supply Chain Operations Reference) model was incorporated into a classic Beer Game distribution representation, using the dynamic discrete event simulation software Simul8. During the simulation the five SCOR Supply Chain Performance Attributes: Reliability, Responsiveness, Flexibility, Cost and Utilisation were continuously monitored and compared to established targets.
Operational adjustments to the reorder point, transportation modes and production capacity (where appropriate) were made for three independent supply chain roles, and the degree of dynamic behaviour in the supply chain was measured using the ratio of the standard deviation of upstream demand relative to the standard deviation of the downstream demand. Factors employed to build the detailed model include: variable retail demand, order transmission, transportation delays, production delays, capacity constraints, demand multipliers and demand averaging periods. Five dimensions of supply chain performance were monitored independently in three autonomous supply chain roles and operational settings adjusted accordingly. The uniqueness of this research stems from the application of the five SCOR performance attributes with modelled operational responses in a dynamic discrete event simulation model. This project makes its primary contribution to knowledge by measuring the impact, on supply chain dynamics, of applying a representative performance measurement system.
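The dynamics measure used above fits in a few lines: the ratio of the standard deviation of upstream orders to that of downstream demand, with values above 1 indicating amplification. The sample series below are invented for illustration.

```python
import statistics

def bullwhip_ratio(upstream_orders, downstream_demand):
    """Std dev of orders placed upstream over std dev of demand received."""
    return statistics.stdev(upstream_orders) / statistics.stdev(downstream_demand)

# a retailer seeing mild demand swings but placing wildly varying orders
print(bullwhip_ratio([80, 120, 60, 140], [95, 105, 98, 102]))  # > 1: amplification
```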

Relevance: 30.00%

Abstract:

The performance of seven minimization algorithms is compared on five neural network problems. These include a variable-step-size algorithm, conjugate gradient, and several methods with explicit analytic or numerical approximations to the Hessian.
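A hedged sketch of what a variable-step-size method looks like in the simplest case: gradient descent with a backtracking step size, applied to a toy quadratic. This illustrates the idea only; it is not one of the paper's seven algorithms, nor its neural network problems.

```python
def grad_descent_backtracking(f, grad, x0, step0=1.0, shrink=0.5, iters=50):
    """Gradient descent whose step size is halved until the move reduces f."""
    x = x0
    for _ in range(iters):
        g = grad(x)
        step = step0
        # shrink the trial step until it actually improves the objective
        while f(x - step * g) >= f(x) and step > 1e-12:
            step *= shrink
        x = x - step * g
    return x

# toy problem: minimise f(x) = (x - 3)^2, minimum at x = 3
xmin = grad_descent_backtracking(lambda x: (x - 3) ** 2, lambda x: 2 * (x - 3), 0.0)
print(xmin)  # 3.0
```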

Relevance: 30.00%

Abstract:

There is currently considerable interest in developing general non-linear density models based on latent, or hidden, variables. Such models have the ability to discover the presence of a relatively small number of underlying ‘causes’ which, acting in combination, give rise to the apparent complexity of the observed data set. Unfortunately, training such models generally requires large computational effort. In this paper we introduce a novel latent variable algorithm which retains the general non-linear capabilities of previous models but which uses a training procedure based on the EM algorithm. We demonstrate the performance of the model on a toy problem and on data from flow diagnostics for a multi-phase oil pipeline.
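To make the EM training idea concrete, here is a self-contained fit of a two-component one-dimensional Gaussian mixture. It stands in for the paper's more general non-linear latent variable model; the data and initial values are synthetic.

```python
import math, random

def em_gmm2(data, iters=50):
    """Two-component 1-D Gaussian mixture fitted by EM."""
    mu = [min(data), max(data)]   # crude but adequate initialisation
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate parameters from the responsibilities
        for k in range(2):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2
                         for r, x in zip(resp, data)) / nk + 1e-6
            pi[k] = nk / len(data)
    return mu, var, pi

random.seed(0)
data = ([random.gauss(0, 1) for _ in range(200)]
        + [random.gauss(5, 1) for _ in range(200)])
mu, var, pi = em_gmm2(data)
print(sorted(mu))  # component means near 0 and 5
```

The E-step computes the posterior over the latent component label; the M-step maximises the expected complete-data likelihood, which is the same two-phase structure the paper's algorithm uses for its non-linear model.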

Relevance: 30.00%

Abstract:

The Thouless-Anderson-Palmer (TAP) approach was originally developed for analysing the Sherrington-Kirkpatrick model in the study of spin glass models and has been employed since then mainly in the context of extensively connected systems, whereby each dynamical variable interacts weakly with the others. Recently, we extended this method for handling general intensively connected systems where each variable has only O(1) connections characterised by strong couplings. However, the new formulation looks quite different from existing analyses, and it is only natural to question whether it actually reproduces known results for systems of extensive connectivity. In this chapter, we apply our formulation of the TAP approach to an extensively connected system, the Hopfield associative memory model, showing that it produces identical results to those obtained by the conventional formulation.
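For readers unfamiliar with the model used as the test case, a minimal Hopfield associative memory (Hebbian storage, sign-threshold updates) is sketched below; the pattern and sizes are arbitrary examples, and the TAP analysis itself is not reproduced.

```python
def train(patterns):
    """Hebbian weight matrix for +/-1 patterns, zero diagonal."""
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j] / n
    return W

def recall(W, state, sweeps=5):
    """Repeated sign-threshold updates toward a stored fixed point."""
    s = list(state)
    for _ in range(sweeps):
        for i in range(len(s)):
            h = sum(W[i][j] * s[j] for j in range(len(s)))  # local field
            s[i] = 1 if h >= 0 else -1
    return s

pattern = [1, -1, 1, -1, 1, -1, 1, -1]
W = train([pattern])
noisy = list(pattern)
noisy[0] = -noisy[0]                # corrupt one bit
print(recall(W, noisy) == pattern)  # True
```

Every variable here couples weakly (strength 1/n) to all others, which is exactly the extensive-connectivity regime the chapter's TAP calculation addresses.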

Relevance: 30.00%

Abstract:

Purpose – To investigate the role of simulation in the introduction of technology in a continuous operations process.

Design/methodology/approach – A case-based research method was chosen with the aim of providing an exemplar of practice and testing the proposition that the use of simulation can improve the implementation and running of conveyor systems in continuous process facilities.

Findings – The research determines the optimum rate of re-introduction to a conveyor system of inventory generated during a breakdown event.

Research limitations/implications – More case studies are required demonstrating the operational and strategic benefits that can be gained by using simulation to assess technology in organisations.

Practical implications – A practical outcome of the study was the implementation of a policy for the manual re-introduction of inventory on a conveyor line after a breakdown event had occurred.

Originality/value – The paper presents a novel example of the use of simulation to estimate the re-introduction rate of inventory after a breakdown event on a conveyor line. The paper highlights how, by addressing this operational issue ahead of implementation, the likelihood of the success of the strategic decision to acquire the technology can be improved.
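The operational question the case addresses can be caricatured in a few lines: after a breakdown, accumulated inventory is re-introduced on top of normal arrivals, and the chosen rate must fit within the conveyor's spare capacity. All rates and capacities below are invented for illustration, not the case firm's figures.

```python
def time_to_clear_backlog(backlog, reintro_rate, arrival_rate, conveyor_capacity):
    """Periods needed to clear the breakdown backlog, or None if the chosen
    re-introduction rate does not fit within the conveyor's spare capacity."""
    spare = conveyor_capacity - arrival_rate
    if reintro_rate <= 0 or spare <= 0 or reintro_rate > spare:
        return None                 # line overloaded: backlog never clears
    periods = 0
    while backlog > 0:
        backlog -= reintro_rate     # re-introduce a fixed amount per period
        periods += 1
    return periods

print(time_to_clear_backlog(100, 10, 40, 60))  # 10 periods
print(time_to_clear_backlog(100, 30, 40, 60))  # None: rate exceeds spare capacity
```

A full simulation, as used in the study, would add the stochastic breakdown events and delays that make the optimum rate non-obvious.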