967 results for Analytical performance


Relevance: 30.00%

Abstract:

This letter presents an analytical model for evaluating the Bit Error Rate (BER) of a Direct Sequence Code Division Multiple Access (DS-CDMA) system with M-ary orthogonal modulation and noncoherent detection, employing an array antenna and operating in a Nakagami fading environment. An expression for the Signal to Interference plus Noise Ratio (SINR) at the output of the receiver is derived, which allows the BER to be evaluated with a closed-form expression. The analytical model is validated by comparing its results with simulation results.
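The letter's closed form covers Nakagami fading with an array antenna and is not reproduced here; as a simpler illustration of the same kind of closed-form BER evaluation, the textbook expression for noncoherent detection of M-ary orthogonal signalling in AWGN at a given symbol SNR can be sketched as:

```python
from math import comb, exp

def mary_noncoherent_ser(M: int, snr: float) -> float:
    # Symbol error rate for noncoherent detection of M-ary orthogonal
    # signalling in AWGN (textbook closed form; the letter's model extends
    # this setting to Nakagami fading with an array antenna).
    return sum((-1) ** (k + 1) * comb(M - 1, k) / (k + 1)
               * exp(-k * snr / (k + 1)) for k in range(1, M))

def mary_noncoherent_ber(M: int, snr: float) -> float:
    # Convert symbol errors to bit errors for orthogonal signalling:
    # BER = M / (2 * (M - 1)) * SER.
    return M / (2 * (M - 1)) * mary_noncoherent_ser(M, snr)
```

For M = 2 this reduces to the familiar binary noncoherent result, 0.5 * exp(-snr / 2).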

Relevance: 30.00%

Abstract:

During analytical method development for BAY 11-7082 ((E)-3-[4-methylphenylsulfonyl]-2-propenenitrile), using HPLC-MS-MS and HPLC-UV, we observed that the protein removal process (both ultrafiltration and precipitation with organic solvents) prior to HPLC brought about a significant reduction in the concentration of this compound. The use of a structurally similar internal standard, BAY 11-7085 ((E)-3-[4-t-butylphenylsulfonyl]-2-propenenitrile), was not effective in compensating for the loss, as its extent of reduction differed from that of the analyte. We present here a systematic investigation of this problem and a new validated method for the determination of BAY 11-7082. © 2006 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

High-performance liquid chromatography coupled via an electrospray ionization source to a tandem mass spectrometer (HPLC-ESI-MS/MS) is the current analytical method of choice for quantitation of analytes in biological matrices. Because of its high selectivity, sensitivity, and throughput, this technology is being used increasingly in the clinical laboratory. An important issue to be addressed in method development, validation, and routine use of HPLC-ESI-MS/MS is matrix effects: the alteration of ionization efficiency by the presence of coeluting substances. These effects are unseen in the chromatogram but have a deleterious impact on a method's accuracy and sensitivity. The two common ways to assess matrix effects are the postextraction addition method and the postcolumn infusion method. To remove or minimize matrix effects, the sample extraction methodology must be modified and the chromatographic separation improved. These two parameters are linked and together form the basis of developing a successful and robust quantitative HPLC-ESI-MS/MS method. Because of the heterogeneous nature of the population being studied, the variability of a method must be assessed in samples taken from a variety of subjects. In this paper, the major aspects of matrix effects are discussed and an approach to addressing matrix effects during method validation is proposed. © 2004 The Canadian Society of Clinical Chemists. All rights reserved.
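In the postextraction addition scheme mentioned above, the matrix effect is commonly quantified by comparing the analyte response spiked into blank matrix extract against the same amount in neat solution; a minimal sketch of that arithmetic (the peak-area values in the assertions are illustrative, not from the paper):

```python
def matrix_effect_pct(neat_area: float, post_spiked_area: float) -> float:
    # Postextraction addition: response of analyte spiked into blank
    # extract (B) relative to the same amount in neat solution (A).
    # 100% = no matrix effect; <100% = ion suppression; >100% = enhancement.
    return 100.0 * post_spiked_area / neat_area

def recovery_pct(pre_spiked_area: float, post_spiked_area: float) -> float:
    # Extraction recovery: analyte spiked before extraction (C) relative
    # to analyte spiked into the extract afterwards (B).
    return 100.0 * pre_spiked_area / post_spiked_area
```

A value of, say, 80% matrix effect would indicate 20% ion suppression by coeluting matrix components.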

Relevance: 30.00%

Abstract:

Microtome sections of proton exchange membrane cells yield information across a wide range of scales, from the macroscopic distribution of components down to specimens in which the detailed distribution of catalyst particles can be observed. Using modern data management practices, it is possible to combine information at different scales and correlate processing and performance data. Analytical electron microscopy reveals the compositional variations across used cells at the electrolyte/electrode interface. In particular, analytical techniques indicate that sulphur concentrations are likely to diminish at the Nafion/anode interface. © 2006 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

We investigate the feasibility of simultaneously suppressing amplification noise and nonlinearity, the two most fundamental limiting factors in modern optical communication. To accomplish this task we developed a general design optimisation technique based on the concepts of noise and nonlinearity management. We demonstrate the efficiency of this novel approach by applying it to the design optimisation of transmission lines with periodic dispersion compensation using Raman and hybrid Raman-EDFA amplification. Moreover, we show, using nonlinearity management considerations, that the optimal performance in high bit-rate dispersion-managed fibre systems with hybrid amplification is achieved at a certain amplifier spacing, which differs from the commonly known optimum noise performance corresponding to fully distributed amplification. Complete knowledge of the signal statistics, required for an accurate estimation of the bit error rate, is crucial for modern transmission links with strong inherent nonlinearity. We therefore implemented the advanced multicanonical Monte Carlo (MMC) method, acknowledged for its efficiency in estimating distribution tails, and accurately computed the marginal probability density functions of the soliton parameters by numerically modelling the Fokker-Planck equation with the MMC simulation technique. Applying the MMC method, we also studied the BER penalty caused by deviations from the optimal decision level in systems employing in-line 2R optical regeneration. We demonstrate that in such systems the analytical linear approximation that better fits the central part of the regenerator's nonlinear transfer function produces a more accurate approximation of the BER and BER penalty. Finally, we present a statistical analysis of the RZ-DPSK optical signal at a direct-detection receiver with Mach-Zehnder interferometer demodulation.
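The multicanonical Monte Carlo method builds its biasing distribution iteratively; the reweighting idea it relies on can be illustrated by a much simpler fixed-bias importance-sampling estimate of a Gaussian tail probability (a toy stand-in, not the thesis's soliton statistics):

```python
import math
import random

def gaussian_tail_is(threshold: float, n: int = 200_000, seed: int = 1) -> float:
    # Estimate P(X > threshold) for X ~ N(0, 1): draw from a proposal
    # N(threshold, 1) centred on the rare region, then reweight each hit
    # by the likelihood ratio phi(x) / phi(x - threshold)
    # = exp(threshold**2 / 2 - threshold * x).
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = threshold + rng.gauss(0.0, 1.0)
        if x > threshold:
            total += math.exp(threshold ** 2 / 2 - threshold * x)
    return total / n
```

For a threshold of 4 the true probability is about 3.2e-5, far too small to estimate reliably with the same number of unbiased samples; MMC automates the construction of such biases when no good proposal is known in advance.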

Relevance: 30.00%

Abstract:

Currently over 50 million people worldwide wear contact lenses, of whom over 75% wear hydrogel lenses. Significant deposition occurs in approximately 80% of hydrogel lenses, and many contact lens wearers cease wearing lenses because of problems associated with deposition. The contact lens field is not alone in encountering complications associated with interactions between the body and artificial devices. The widespread use of man-made materials to replace structures in the body has emphasised the importance of studies that examine the interactions between implantation materials and body tissues. This project used carefully controlled, randomized clinical studies to examine the interactive effects of contact lens materials, care systems, replacement periods and patient differences. Of principal interest was the influence of these factors on material deposition and their subsequent impact on subjective performance. A range of novel and established analytical techniques was used to examine hydrogel lenses following carefully controlled clinical studies in which clinical performance was meticulously monitored. These studies enabled the inter-relationship between clinical performance and deposition to be evaluated. This project showed that significant differences exist between individuals in their propensity to deposit on hydrogel lenses, with approximately 20% of subjects displaying significant deposition irrespective of the lens material. Additionally, materials traditionally categorised together show markedly different spoilation characteristics, which are wholly attributable to their detailed chemical structure. For the first time, the in vivo deposition kinetics of both protein and lipid in charged and uncharged polymers were demonstrated. In addition, the importance of care systems in the deposition process was shown, clearly demonstrating the significance of the quality rather than the quantity of deposition in influencing subjective performance.

Relevance: 30.00%

Abstract:

Orthodox contingency theory links effective organisational performance to compatible relationships between the environment and organisation strategy and structure, and assumes that organisations have the capacity to adapt as the environment changes. Recent contributions to the literature on organisation theory claim that the key to effective performance is effective adaptation, which in turn requires the simultaneous reconciliation of efficiency and innovation afforded by a unique environment-organisation configuration. The literature on organisation theory recognises the continuing confusion caused by the fragmented and often conflicting results of cross-sectional studies. Although the case is made for longitudinal studies that comprehensively describe the evolving relationship between the environment and the organisation, there is little to suggest how such studies should be executed in practice. Typically the choice is between the historicised case study and the statistical analysis of large populations; both examine the relationship between environment and organisation strategy and/or structure while ignoring the product-process relationship. This study combines the historicised case study with the multi-variable, ordinal-scale approach of statistical analysis to construct an analytical framework which tracks and exposes the environment-organisation-performance relationship over time. The framework examines changes in the environment, strategy and structure, and uniquely includes an assessment of the organisation's product-process relationship and its contribution to organisational efficiency and innovation. The analytical framework is applied to the evolving environment-organisation relationship of two organisations in the same industry over the same twenty-five-year period, providing a sector perspective on organisational adaptation.
The findings demonstrate the significance of the environment-organisation configuration to the scope and frequency of adaptation and suggest that the level of sector homogeneity may be linked to the level of product-process standardisation.

Relevance: 30.00%

Abstract:

Over the past decade, several experienced Operational Researchers have advanced the view that the theoretical aspects of model building have raced ahead of the ability of people to use them. Consequently, the impact of Operational Research on commercial organisations and the public sector is limited, and many systems fail to achieve their anticipated benefits in full. The primary objective of this study is to examine a complex interactive Stock Control system, and identify the reasons for the differences between the theoretical expectations and the operational performance. The methodology used is to hypothesise all the possible factors which could cause a divergence between theory and practice, and to evaluate numerically the effect each of these factors has on two main control indices - Service Level and Average Stock Value. Both analytical and empirical methods are used, and simulation is employed extensively. The factors are divided into two main categories for analysis - theoretical imperfections in the model, and the usage of the system by Buyers. No evidence could be found in the literature of any previous attempts to place the differences between theory and practice in a system in quantitative perspective nor, more specifically, to study the effects of Buyer/computer interaction in a Stock Control system. The study reveals that, in general, the human factors influencing performance are of a much higher order of magnitude than the theoretical factors, thus providing objective evidence to support the original premise. The most important finding is that, by judicious intervention into an automatic stock control algorithm, it is possible for Buyers to produce results which not only attain but surpass the algorithmic predictions. However, the complexity and behavioural recalcitrance of these systems are such that an innately numerate, enquiring type of Buyer needs to be inducted to realise the performance potential of the overall man/computer system.
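The study's two control indices are Service Level and Average Stock Value; the kind of stock control algorithm the Buyers intervened in can be illustrated by the standard reorder-point calculation under normally distributed daily demand (a generic textbook sketch under assumed parameter names, not the system examined in the thesis):

```python
from math import sqrt
from statistics import NormalDist

def reorder_point(mean_daily_demand: float, demand_sd: float,
                  lead_time_days: float, service_level: float) -> float:
    # Reorder point = expected demand over the lead time plus a safety
    # stock sized so the probability of no stockout during the lead time
    # equals the target service level (normal demand assumption).
    z = NormalDist().inv_cdf(service_level)
    safety_stock = z * demand_sd * sqrt(lead_time_days)
    return mean_daily_demand * lead_time_days + safety_stock
```

Raising the target service level raises the reorder point, and hence the average stock value, which is exactly the trade-off between the two indices the study measures.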

Relevance: 30.00%

Abstract:

The principles of High Performance Liquid Chromatography (HPLC) and pharmacokinetics were applied to the use of several clinically important drugs at the East Birmingham Hospital. Amongst these was gentamicin, which was investigated over a two-year period by a multi-disciplinary team. It was found that there was considerable intra- and inter-patient variation that had not previously been reported, and the causes and consequences of such variation were considered. A detailed evaluation of available pharmacokinetic techniques was undertaken, and 1- and 2-compartment models were optimised with regard to sampling procedures, analytical error and model error. The implications for control of therapy are discussed and an improved sampling regime is proposed for routine usage. Similar techniques were applied to trimethoprim, assayed by HPLC, in patients with normal renal function, and investigations were also commenced into the penetration of the drug into peritoneal dialysate. Novel assay techniques were also developed for a range of drugs including 4-aminopyridine, chloramphenicol, metronidazole and a series of penicillins and cephalosporins. Stability studies on cysteamine, reaction-rate studies on creatinine-picrate and structure-activity relationships in HPLC of aminopyridines are also reported.
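The simplest of the pharmacokinetic models mentioned, the 1-compartment IV bolus model, predicts concentration as C(t) = (D / V) * exp(-k * t); a minimal sketch with illustrative numbers (the dose, volume and rate constant below are placeholders, not values from the thesis):

```python
from math import exp, log

def one_compartment_conc(dose_mg: float, volume_l: float,
                         k_elim_per_h: float, t_h: float) -> float:
    # 1-compartment IV bolus model: C(t) = (D / V) * exp(-k * t),
    # i.e. instantaneous distribution followed by first-order elimination.
    return dose_mg / volume_l * exp(-k_elim_per_h * t_h)

def half_life_h(k_elim_per_h: float) -> float:
    # Elimination half-life: t1/2 = ln(2) / k.
    return log(2) / k_elim_per_h
```

Fitting k (and hence the half-life) to measured concentrations at well-chosen sampling times is what makes the choice of sampling regime so influential on the control of therapy.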

Relevance: 30.00%

Abstract:

In this paper a Markov chain based analytical model is proposed to evaluate the slotted CSMA/CA algorithm specified in the MAC layer of the IEEE 802.15.4 standard. The analytical model consists of two two-dimensional Markov chains, used to model the state transition of an 802.15.4 device during the period of a transmission and between two consecutive frame transmissions, respectively. By introducing the two Markov chains, only a small number of Markov states is required and the scalability of the analytical model is improved. The analytical model is used to investigate the impact of the CSMA/CA parameters, the number of contending devices, and the data frame size on the network performance in terms of throughput and energy efficiency. It is shown by simulations that the proposed analytical model can accurately predict the performance of the slotted CSMA/CA algorithm for uplink, downlink and bi-directional traffic, in both acknowledgement and non-acknowledgement modes.
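The slotted CSMA/CA procedure being modelled can be sketched as follows, with the channel sensing abstracted into a callback (a simplified sketch using the standard's default parameters; real implementations align backoff periods to the superframe slot boundaries, which is omitted here):

```python
import random

def slotted_csma_ca(channel_busy, rng=random,
                    mac_min_be=3, mac_max_be=5, max_backoffs=4):
    # Slotted CSMA/CA of IEEE 802.15.4 (simplified): wait a random
    # backoff, then require two consecutive clear channel assessments
    # (CCA, i.e. CW = 2) before transmitting. On a busy CCA, widen the
    # backoff exponent and retry, up to macMaxCSMABackoffs attempts.
    nb, be = 0, mac_min_be
    waited = 0                                 # total backoff periods spent
    while True:
        waited += rng.randrange(2 ** be)       # random delay in [0, 2**BE - 1]
        if not channel_busy() and not channel_busy():
            return True, waited                # two idle CCAs: transmit
        nb += 1
        be = min(be + 1, mac_max_be)
        if nb > max_backoffs:
            return False, waited               # channel access failure
```

The two Markov chains in the paper capture exactly this state evolution: one chain for the backoff/CCA stages around a transmission, one for the idle time between consecutive frames.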

Relevance: 30.00%

Abstract:

Dedicated short range communications (DSRC) has been regarded as one of the most promising technologies for providing robust communications in large-scale vehicle networks. It is designed to support both road safety and commercial applications. Road safety applications will require reliable and timely wireless communications. However, as the medium access control (MAC) layer of DSRC is based on the IEEE 802.11 distributed coordination function (DCF), it is well known that the random channel access based MAC cannot provide guaranteed quality of service (QoS). It is very important to understand the quantitative performance of DSRC in order to make better decisions on its adoption, control, adaptation, and improvement. In this paper, we propose an analytic model to evaluate DSRC-based inter-vehicle communication. We investigate the impacts of the channel access parameters associated with the different services, including the arbitration inter-frame space (AIFS) and contention window (CW). Based on the proposed model, we analyze the successful message delivery ratio and channel service delay for broadcast messages. The proposed analytical model can provide a convenient tool to evaluate inter-vehicle safety applications and analyze the suitability of DSRC for road safety applications.
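The AIFS differentiation analysed in the model follows the EDCA rule AIFS[AC] = SIFS + AIFSN[AC] × slotTime; a sketch of that arithmetic, assuming the 10 MHz 802.11p PHY timing (SIFS = 32 µs, slot = 13 µs) and commonly cited default AIFSN values per access category (assumed typical values, not parameters taken from the paper):

```python
# EDCA arbitration inter-frame space: AIFS[AC] = SIFS + AIFSN[AC] * slot.
# Timing assumes the 10 MHz 802.11p PHY (SIFS = 32 us, slot = 13 us);
# AIFSN values are commonly cited defaults for the DSRC control channel.
SIFS_US, SLOT_US = 32, 13
AIFSN = {"AC_VO": 2, "AC_VI": 3, "AC_BE": 6, "AC_BK": 9}

def aifs_us(ac: str) -> int:
    return SIFS_US + AIFSN[ac] * SLOT_US

for ac in AIFSN:
    print(ac, aifs_us(ac), "us")
```

A smaller AIFS (and CW) gives the higher-priority safety traffic earlier access to the channel, which is precisely what the delivery-ratio and delay analysis quantifies.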

Relevance: 30.00%

Abstract:

The IEEE 802.11 standard has achieved huge success in the past decade and is still under development to provide higher physical data rates and better quality of service (QoS). An important problem for the development and optimization of IEEE 802.11 networks is the modeling of the MAC layer channel access protocol. Although there are already many theoretical analyses of the 802.11 MAC protocol in the literature, most of the models focus on saturated traffic and assume an infinite buffer at the MAC layer. In this paper we develop a unified analytical model for the IEEE 802.11 MAC protocol in ad hoc networks. The impacts of channel access parameters, traffic rate and buffer size at the MAC layer are modeled with the assistance of a generalized Markov chain and an M/G/1/K queue model. The throughput, packet delivery delay and dropping probability can be obtained. Extensive simulations show the analytical model is highly accurate. The model shows that for practical buffer configurations (e.g. buffer size larger than one), we can maximize the total throughput and reduce the packet blocking probability (due to limited buffer size) and the average queuing delay to zero by effectively controlling the offered load. The average MAC layer service delay, as well as its standard deviation, is also much lower than that in saturated conditions and has an upper bound. It is also observed that the optimal load is very close to the maximum achievable throughput regardless of the number of stations or buffer size. Moreover, the model is scalable for performance analysis of 802.11e in unsaturated conditions and of 802.11 ad hoc networks with heterogeneous traffic flows. © 2012 KSI.
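The paper couples the MAC service process to an M/G/1/K queue; for the special case of exponential service times (M/M/1/K) the blocking probability that drives the packet dropping has a simple closed form, shown here as an illustration of how a finite buffer limits loss (the general M/G/1/K case used in the paper requires the full service-time distribution):

```python
def mm1k_blocking(rho: float, k: int) -> float:
    # Blocking probability of an M/M/1/K queue: probability an arriving
    # packet finds all K positions occupied and is dropped.
    # rho = offered load (arrival rate / service rate), k = capacity.
    if rho == 1.0:
        return 1.0 / (k + 1)            # limiting case: uniform occupancy
    return (1.0 - rho) * rho ** k / (1.0 - rho ** (k + 1))
```

Keeping the offered load below the saturation point drives this blocking probability toward zero, matching the paper's observation that controlling the load removes both dropping and queuing delay for buffers larger than one.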

Relevance: 30.00%

Abstract:

The IEEE 802.16 standard specifies two contention-based bandwidth request schemes working with the OFDM physical layer specification in the point-to-multipoint (PMP) architecture: the mandatory scheme used in region-full and the optional scheme used in region-focused. This letter presents a unified analytical model to study the bandwidth efficiency and channel access delay performance of the two schemes. The impacts of access parameters, available bandwidth and subchannelization are taken into account. The model is validated by simulations. The mandatory scheme is observed to perform closely to the optional one when subchannelization is active for both schemes.