42 results for value communication methods
Abstract:
The use of digital communication systems is increasing very rapidly. This is due to lower system implementation cost compared to analogue transmission and, at the same time, the ease with which several types of data source (data, digitised speech, video, etc.) can be mixed. The emergence of packet broadcast techniques as an efficient form of multiplexing, especially with the use of contention random multiple access protocols, has led to widespread application of these distributed access protocols in local area networks (LANs) and to their further extension to radio and mobile radio communication applications. In this research, a modified version of the distributed access contention protocol which uses the packet broadcast switching technique is proposed. Carrier sense multiple access with collision avoidance (CSMA/CA) is found to be the most appropriate protocol, with the ability to satisfy equally the operational requirements of local area networks and of radio and mobile radio applications. The proposed version of the protocol is designed so that all desirable features of its predecessors are maintained, while their shortcomings are eliminated and additional features are added to strengthen its ability to work with radio and mobile radio channels. Operational performance evaluation of the protocol has been carried out for the two variants, non-persistent and slotted non-persistent, through mathematical and simulation modelling of the protocol. The agreement between the results of the two modelling procedures validates the accuracy of both methods, and the protocol compares favourably with its predecessor, CSMA/CD (with collision detection). A further extension of the protocol operation has been suggested for multichannel systems. Two multichannel systems based on the CSMA/CA protocol for medium access are therefore proposed.
These are: the dynamic multichannel system, which is based on two types of channel selection, random choice (RC) and idle choice (IC); and the sequential multichannel system. The latter has been proposed in order to suppress the effect of the hidden terminal, which is always a major problem when contention random multiple access protocols are used over radio and mobile radio channels. Performance evaluation has been carried out using mathematical modelling for the dynamic system and simulation modelling for the sequential system. Both systems are found to improve system operation and fault tolerance when compared to single-channel operation.
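The abstract above evaluates non-persistent CSMA variants through mathematical modelling. As an illustration of the kind of model involved, the sketch below evaluates the classical Kleinrock–Tobagi throughput formula for single-channel non-persistent CSMA, S = G·e^(−aG) / (G(1+2a) + e^(−aG)); this is the textbook formula, not the modified CSMA/CA analysis developed in the thesis, and the function name is mine.

```python
import math

def np_csma_throughput(G: float, a: float) -> float:
    """Throughput of non-persistent CSMA (classical Kleinrock-Tobagi result).

    G: offered load in packets per packet transmission time.
    a: propagation delay normalised to the packet transmission time.
    """
    return (G * math.exp(-a * G)) / (G * (1.0 + 2.0 * a) + math.exp(-a * G))

if __name__ == "__main__":
    # Achievable throughput degrades as the normalised propagation delay grows,
    # which is one reason radio channels are harder than short LAN cables.
    for a in (0.01, 0.1):
        peak = max(np_csma_throughput(g / 10.0, a) for g in range(1, 1000))
        print(f"a={a}: peak throughput ~ {peak:.3f}")
```

Sweeping G at two values of a shows the capacity penalty of longer propagation delays, the effect that motivates tailoring contention protocols to radio channels.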
Abstract:
1. Pearson's correlation coefficient only tests whether the data fit a linear model. With large numbers of observations, quite small values of r become significant and the X variable may only account for a minute proportion of the variance in Y. Hence, the value of r squared should always be calculated and included in a discussion of the significance of r. 2. The use of r assumes that a bivariate normal distribution is present and this assumption should be examined prior to the study. If Pearson's r is not appropriate, then a non-parametric correlation coefficient such as Spearman's rs may be used. 3. A significant correlation should not be interpreted as indicating causation, especially in observational studies in which there is a high probability that the two variables are correlated because of their mutual correlations with other variables. 4. In studies of measurement error, there are problems in using r as a test of reliability and the ‘intra-class correlation coefficient’ should be used as an alternative. A correlation test provides only limited information about the relationship between two variables. Fitting a regression line to the data using the method known as ‘least squares’ provides much more information, and the methods of regression and their application in optometry will be discussed in the next article.
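As a quick illustration of the two coefficients discussed above, the sketch below computes Pearson's r (and hence r squared) and Spearman's rs in plain Python; the function names are mine and the data in the test are illustrative, not from the article.

```python
import math
from statistics import mean

def pearson_r(x, y):
    """Pearson's product-moment correlation coefficient."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def ranks(v):
    """Ranks of the values in v, with ties assigned their midrank."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    i = 0
    while i < len(v):
        j = i
        while j + 1 < len(v) and v[order[j + 1]] == v[order[i]]:
            j += 1
        midrank = (i + j) / 2 + 1  # 1-based average rank of the tied group
        for k in range(i, j + 1):
            r[order[k]] = midrank
        i = j + 1
    return r

def spearman_rs(x, y):
    """Spearman's rank correlation: Pearson's r applied to the ranks."""
    return pearson_r(ranks(x), ranks(y))
```

The coefficient of determination is simply `pearson_r(x, y) ** 2`: the proportion of the variance in Y accounted for by the linear model, which is the quantity the abstract recommends always reporting alongside r.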
Abstract:
1. Fitting a linear regression to data provides much more information about the relationship between two variables than a simple correlation test. A goodness-of-fit test of the line should always be carried out. Hence, r squared estimates the strength of the relationship between Y and X, ANOVA tests whether a statistically significant line is present, and the ‘t’ test indicates whether the slope of the line is significantly different from zero. 2. Always check whether the data collected fit the assumptions for regression analysis and, if not, whether a transformation of the Y and/or X variables is necessary. 3. If the regression line is to be used for prediction, it is important to determine whether the prediction involves an individual y value or a mean. Care should be taken if predictions are made close to the extremities of the data; they are subject to considerable error if x falls beyond the range of the data. Multiple predictions require correction of the P values. 4. If several individual regression lines have been calculated from a number of similar sets of data, consider whether they should be combined to form a single regression line. 5. If the data exhibit a degree of curvature, then fitting a higher-order polynomial curve may provide a better fit than a straight line. In this case, a test of whether the data depart significantly from a linear regression should be carried out.
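The quantities the summary above recommends reporting can be computed from first principles. The sketch below is a minimal ordinary least-squares fit returning the intercept, slope, r squared and the t statistic for the slope; the function name and return layout are my own, not the article's.

```python
import math
from statistics import mean

def linear_regression(x, y):
    """Ordinary least-squares fit y = b0 + b1*x, with goodness-of-fit summaries."""
    n = len(x)
    mx, my = mean(x), mean(y)
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    b1 = sxy / sxx                      # slope
    b0 = my - b1 * mx                   # intercept
    fitted = [b0 + b1 * a for a in x]
    ss_res = sum((b - f) ** 2 for b, f in zip(y, fitted))
    ss_tot = sum((b - my) ** 2 for b in y)
    r2 = 1.0 - ss_res / ss_tot          # proportion of variance explained
    # t statistic for H0: slope = 0, with n - 2 degrees of freedom
    se_b1 = math.sqrt(ss_res / (n - 2) / sxx)
    t = b1 / se_b1 if se_b1 > 0 else float("inf")
    return b0, b1, r2, t
```

Comparing the returned t against the t distribution with n − 2 degrees of freedom gives the significance test for the slope that the abstract describes.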
Abstract:
Several fermentation methods for the production of the enzyme dextransucrase have been employed. The theoretical aspects of these fermentation techniques are given in the early chapters of this thesis, together with a brief overview of enzyme biotechnology. A literature survey on cell recycle fermentation has been carried out, followed by a survey report on dextransucrase production, purification and the reaction mechanism of dextran biosynthesis. The various experimental apparatus employed in this research are described in detail. In particular, emphasis has been given to the development of continuous cell recycle fermenters. On the laboratory scale, fed-batch fermentations under anaerobic low-agitation conditions resulted in dextransucrase activities of about 450 DSU/cm3, much higher than the yields reported in the literature and obtained under aerobic conditions. In conventional continuous culture the dilution rate was varied between 0.375 h-1 and 0.55 h-1. The general pattern observed was that the enzyme activity decreased with increasing dilution rate. In these experiments the maximum value of enzyme activity was ∼74 DSU/cm3. Sparging the fermentation broth with CO2 in continuous culture appears to result in a decrease in enzyme activity. In continuous total cell recycle fermentations high steady-state biomass levels were achieved but the enzyme activity was low, in the range 4 - 27 DSU/cm3. This fermentation environment affected the physiology of the microorganism. The behaviour of the cell recycle system employed in this work, together with its performance and the factors that affected it, is discussed in the relevant chapters.
By retaining the whole broth leaving a continuous fermenter for between 1.5 and 4 h under controlled conditions, the enzyme activity was enhanced with a certain treatment from 86 DSU/cm3 to 180 DSU/cm3, which represents a 106% increase over the enzyme activity achieved by a steady-state conventional chemostat. A novel process for dextran production has been proposed based on the findings of this latter part of the experimental work.
Abstract:
Healthcare providers and policy makers are faced with an ever-increasing number of medical publications. Searching for relevant information and keeping up to date with new research findings remains a constant challenge. It has been widely acknowledged that narrative reviews of the literature are susceptible to several types of bias and a systematic approach may protect against these biases. The aim of this thesis was to apply quantitative methods in the assessment of outcomes of topical therapies for psoriasis; in particular, to systematically examine the comparative efficacy, tolerability and cost-effectiveness of topical calcipotriol in the treatment of mild-to-moderate psoriasis. Over the years, a wide range of techniques have been used to evaluate the severity of psoriasis and the outcomes from treatment. This lack of standardisation complicates the direct comparison of results and ultimately the pooling of outcomes from different clinical trials. There is a clear requirement for more comprehensive tools for measuring drug efficacy and disease severity in psoriasis. Ideally, the outcome measures need to be simple, relevant, practical, and widely applicable, and the instruments should be reliable, valid and responsive. The results of the meta-analysis reported herein show that calcipotriol is an effective antipsoriatic agent. In the short-term, the pooled data found calcipotriol to be more effective than calcitriol, tacalcitol, coal tar and short-contact dithranol. Only potent corticosteroids appeared to have comparable efficacy, with fewer short-term side-effects. Potent corticosteroids also added to the antipsoriatic effect of calcipotriol, and appeared to suppress the occurrence of calcipotriol-induced irritation. There was insufficient evidence to support any large effects in favour of improvements in efficacy when calcipotriol is used in combination with systemic therapies in patients with severe psoriasis.
However, there was a total absence of long-term morbidity data on the effectiveness of any of the interventions studied. Decision analysis showed that, from the perspective of the NHS as payer, the relatively small differences in efficacy between calcipotriol and short-contact dithranol lead to large differences in the direct cost of treating patients with mild-to-moderate plaque psoriasis. Further research is needed to examine the clinical and economic issues affecting patients under treatment for psoriasis in the UK. In particular, the maintenance value and cost/benefit ratio of the various treatment strategies, and the assessment of patients' preferences, have not yet been adequately addressed for this chronic recurring disease.
Abstract:
Firstly, we numerically model a practical 20 Gb/s undersea configuration employing the Return-to-Zero Differential Phase Shift Keying data format. The modelling is completed using the Split-Step Fourier Method to solve the Generalised Nonlinear Schrödinger Equation. We optimise the dispersion map and per-channel launch power of these channels and investigate how the choice of pre/post compensation can influence the performance. After obtaining these optimal configurations, we investigate the Bit Error Rate estimation of these systems and see that estimation based on Gaussian statistics of the electrical current is appropriate for systems of this type, indicating quasi-linear behaviour. The introduction of narrower pulses due to the deployment of quasi-linear transmission decreases the tolerance to chromatic dispersion and intra-channel nonlinearity. We used tools from mathematical statistics to study the behaviour of these channels in order to develop new methods to estimate the Bit Error Rate. In the final section, we consider the estimation of Eye Closure Penalty, a popular measure of signal distortion. Using a numerical example and assuming the symmetry of eye closure, we see that we can simply estimate Eye Closure Penalty using Gaussian statistics. We also see that the statistics of the logical ones dominate the statistics of signal distortion in the case of Return-to-Zero On-Off Keying configurations.
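As a hedged illustration of the numerical method named above, the sketch below performs one symmetric split step for a scalar nonlinear Schrödinger equation on a tiny grid, using a naive O(N²) DFT in place of the FFT a real solver would use. The sign convention, parameter values and function names are assumptions for illustration, not the thesis's actual model.

```python
import cmath
import math

def dft(a):
    """Naive discrete Fourier transform (stand-in for an FFT)."""
    n = len(a)
    return [sum(a[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(spec):
    """Inverse of dft()."""
    n = len(spec)
    return [sum(spec[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def ssfm_step(field, dt, h, beta2, gamma):
    """One symmetric split step of size h for A_z = -i(beta2/2) A_tt + i gamma |A|^2 A.

    Dispersion is applied in the frequency domain (half step, twice) and the
    Kerr nonlinearity as a pure phase rotation in the time domain (full step).
    """
    n = len(field)
    # Angular frequency grid; the upper half wraps to negative frequencies.
    omega = [2 * math.pi * (k if k < n // 2 else k - n) / (n * dt) for k in range(n)]
    half_lin = [cmath.exp(1j * 0.5 * beta2 * w * w * (h / 2)) for w in omega]

    def linear_half(a):
        spec = dft(a)
        return idft([s * p for s, p in zip(spec, half_lin)])

    a = linear_half(field)
    a = [v * cmath.exp(1j * gamma * abs(v) ** 2 * h) for v in a]  # nonlinear phase
    return linear_half(a)
```

Both sub-steps are pure phase multiplications, so each step conserves pulse energy exactly (up to rounding), a useful sanity check when validating a split-step implementation.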
Abstract:
Lipid peroxidation is recognized to be an important contributor to many chronic diseases, especially those of an inflammatory pathology. In addition to their value as markers of oxidative damage, lipid peroxidation products have also been shown to have a wide variety of biological and cell signalling effects. In view of this, accurate and sensitive methods for the measurement of lipid peroxidation products are essential. Although some assays have been described for many years, improvements in protocols are continually being reported and, with recent advances in instrumentation and technology, highly specialized and informative techniques are increasingly used. This article gives an overview of the most currently used methods and then addresses the recent advances in some specific approaches. The focus is on analysis of oxysterols, F(2)-isoprostanes and oxidized phospholipids by gas chromatography or liquid chromatography mass spectrometry techniques and immunoassays for the detection of 4-hydroxynonenal.
Abstract:
Purpose – This paper aims to focus on developing critical understanding in human resource management (HRM) students in Aston Business School, UK. The paper reveals that innovative teaching methods encourage deep approaches to study, an indicator of students reaching their own understanding of material and ideas. This improves student employability and satisfies employer needs. Design/methodology/approach – Student responses to two second-year business modules, matched for high student approval rating, were collected through focus group discussion. One module was taught using enquiry-based learning (EBL) and the story method, whilst the other used traditional teaching methods. Transcripts were analysed and compared using the structure of the ASSIST measure. Findings – Critical understanding and transformative learning can be developed through the innovative teaching methods of EBL and the story method. Research limitations/implications – The limitation is that this is a single case study comparing and contrasting two business modules. The implication is that the study should be replicated and developed in different learning settings, so that there are multiple data sets to confirm the research finding. Practical implications – Future curriculum development, especially in terms of HE, still needs to encourage students and lecturers to understand more about the nature of knowledge and how to learn. The application of EBL and the story method is described in a module case study – “Strategy for Future Leaders”. Originality/value – This is a systematic and comparative study to improve understanding of how students and lecturers learn and of the context in which the learning takes place.
Abstract:
To investigate whether magnetoencephalography (MEG) can add non-redundant information to guide implantation sites for intracranial recordings (IR). The contribution of MEG to intracranial recording planning was evaluated in 12 consecutive patients assessed pre-surgically with MEG followed by IR. Primary outcome measures were the identification of focal seizure onset in IR and favorable surgical outcome. Outcome measures were compared to those of 12 patients matched for implantation type in whom non-invasive pre-surgical assessment suggested clear hypotheses for implantation (non-MEG group). In the MEG group, non-invasive assessment without MEG was inconclusive, and MEG was then used to further help identify implantation sites. In all MEG patients, at least one virtual MEG electrode generated suitable hypotheses for the location of implantations. No differences in outcome measures were found between the non-MEG and MEG groups. Although the MEG group included more complex patients, it showed a similar percentage of successful implantations to the non-MEG group. This suggests that MEG can help identify implantation sites where standard methods have failed. © 2013 Springer Science+Business Media New York.
Abstract:
Purpose - The main aim of the research is to shed light on the role of information and communication technology (ICT) in the logistics innovation process of small and medium-sized third party logistics providers (3PLs). Design/methodology/approach - A triangulated research strategy was designed using a combination of quantitative and qualitative methods. The former involved the use of a questionnaire survey of small and medium-sized Italian 3PLs with 153 usable responses received. The latter comprised a series of focus groups and the use of seven case studies. Findings - There is a relatively low level of ICT expenditure with few companies adopting formal technology investment strategies. The findings highlight the strategic importance of supply chain integration for 3PLs with companies that have embarked on an expansion of their service portfolios showing a higher level of both ICT usage and information integration. Lack of technology skills in the workforce is a major constraint on ICT adoption. Given the proliferation of logistics-related ICT tools and applications in recent years it has been difficult for small and medium-sized 3PLs to select appropriate applications. Research limitations/implications - The paper provides practical guidelines to researchers in the effective use of mixed-methods research based on the concept of methodological triangulation. In particular, it shows how questionnaire surveys, focus groups and case study analysis can be used in combination to provide insights into multi-faceted supply chain phenomena. It also identifies several potentially fruitful avenues for future research in this specific field. Practical implications - The paper's findings provide useful guidance for practitioners on the effective adoption of ICT as part of the logistics innovation process. The findings also provide support for ICT vendors in the design of ICT solutions that are aligned to the needs of small 3PLs. 
Originality/value - There is currently a paucity of research into the drivers and inhibitors of ICT in the innovation processes of small and medium-sized 3PLs. This paper fills this gap by exploring the issue using a range of complementary research approaches. Copyright © 2013 Emerald Group Publishing Limited. All rights reserved.
Abstract:
Servitization is a growing area of interest amongst practitioners, policy makers and academics, and much is still to be learnt about its adoption in practice. This paper makes a contribution to this debate by identifying the key facilities practices that successfully servitizing manufacturers appear to be deploying and the underlying rationale behind their configuration. Although these are preliminary findings from a longer-term research programme, this short communication seeks to highlight implications for manufacturing professionals and organisations who are considering the servitization of their operations.
Abstract:
Purpose – The purpose of this paper is to identify some of the dilemmas involved in the debate on the how, when and why of mixed methods research. Design/methodology/approach – The authors' starting point is formed by developments in the philosophy of science literature, and recent publications on mixed methods research outside of the management accounting domain. Findings – Contrary to recent claims made in the management accounting literature, the authors assert that uncovering points of disagreement between methods may be as far as researchers can go by combining them. Being reflexive can help to provide a deeper understanding of the research process and the researcher's role in this process. Research limitations/implications – The paper should extend the debate among management accounting researchers about mixed methods research. One of the lessons drawn is that researchers are actively immersed in the research process and cannot purge their own interests and views. Accepting this lesson casts doubt on what the act of research may imply and achieve. Practical implications – The paper shows that combinations of research methods should not be made based on a "whatever works" attitude, since this approach ultimately is still infused with ontological and epistemological considerations that researchers have, and should try to explicate. Originality/value – The value of this paper lies in the provision of philosophical underpinnings that have not been widely considered in the management accounting literature on mixed methods to date. © 2011 Emerald Group Publishing Limited. All rights reserved.
Abstract:
Objectives: To disentangle the effects of physician gender and patient-centered communication style on patients' oral engagement in depression care. Methods: Physician gender, physician race and communication style (high patient-centered (HPC) and low patient-centered (LPC)) were manipulated and presented as videotaped actors within a computer-simulated medical visit to assess effects on analogue patient (AP) verbal responsiveness and care ratings. 307 APs (56% female; 70% African American) were randomly assigned to conditions and instructed to verbally respond to depression-related questions and indicate willingness to continue care. Disclosures were coded using the Roter Interaction Analysis System (RIAS). Results: Both male and female APs talked more overall and conveyed more psychosocial and emotional talk to HPC gender-discordant doctors (all p <.05). APs were more willing to continue treatment with gender-discordant HPC physicians (p <.05). No effects were evident in the LPC condition. Conclusions: Findings highlight a role for physician gender when considering active patient engagement in patient-centered depression care. This pattern suggests that there may be largely under-appreciated and consequential effects associated with patient expectations in regard to physician gender, and that these differ by patient gender. Practice implications: High patient-centeredness increases active patient engagement in depression care, especially in gender-discordant dyads. © 2014.
Abstract:
Advances in statistical physics relating to our understanding of large-scale complex systems have recently been successfully applied in the context of communication networks. Statistical mechanics methods can be used to decompose global system behavior into simple local interactions. Thus, large-scale problems can be solved or approximated in a distributed manner with iterative lightweight local messaging. This survey discusses how statistical physics methodology can provide efficient solutions to hard network problems that are intractable by classical methods. We highlight three typical examples in the realm of networking and communications. In each case we show how a fundamental idea of statistical physics helps solve the problem in an efficient manner. In particular, we discuss how to perform multicast scheduling with message passing methods, how to improve coding using the crystallization process, and how to compute optimal routing by representing routes as interacting polymers.
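The "iterative lightweight local messaging" described above can be made concrete with the sum-product algorithm on a small chain of binary variables, where forward/backward message passing recovers exact marginals without enumerating global configurations. The couplings and function names below are illustrative assumptions, not drawn from the survey.

```python
from itertools import product

# Pairwise couplings psi[i][(a, b)] on a 3-node chain of binary variables;
# the weights are arbitrary illustrative numbers.
PSI = [
    {(a, b): [[2.0, 1.0], [1.0, 3.0]][a][b] for a in (0, 1) for b in (0, 1)},
    {(a, b): [[1.0, 2.0], [2.0, 1.0]][a][b] for a in (0, 1) for b in (0, 1)},
]

def chain_marginal(node):
    """Marginal of one node via forward/backward message passing (sum-product)."""
    n = 3
    fwd = [{0: 1.0, 1: 1.0} for _ in range(n)]
    bwd = [{0: 1.0, 1: 1.0} for _ in range(n)]
    for i in range(1, n):  # forward messages, left to right
        fwd[i] = {b: sum(fwd[i - 1][a] * PSI[i - 1][(a, b)] for a in (0, 1))
                  for b in (0, 1)}
    for i in range(n - 2, -1, -1):  # backward messages, right to left
        bwd[i] = {a: sum(bwd[i + 1][b] * PSI[i][(a, b)] for b in (0, 1))
                  for a in (0, 1)}
    unnorm = {v: fwd[node][v] * bwd[node][v] for v in (0, 1)}
    z = sum(unnorm.values())
    return {v: p / z for v, p in unnorm.items()}

def brute_marginal(node):
    """Same marginal by exhaustive enumeration, for comparison."""
    tot = {0: 0.0, 1: 0.0}
    for cfg in product((0, 1), repeat=3):
        tot[cfg[node]] += PSI[0][(cfg[0], cfg[1])] * PSI[1][(cfg[1], cfg[2])]
    z = sum(tot.values())
    return {v: p / z for v, p in tot.items()}
```

On tree-structured graphs the message-passing answer is exact, as the brute-force comparison confirms; on loopy network graphs the same local updates run iteratively give the kind of approximate distributed solutions the survey discusses.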
Abstract:
Defining 'effectiveness' in the context of community mental health teams (CMHTs) has become increasingly difficult under the current pattern of provision required in National Health Service mental health services in England. The aim of this study was to establish the characteristics of multi-professional team working effectiveness in adult CMHTs to develop a new measure of CMHT effectiveness. The study was conducted between May and November 2010 and comprised two stages. Stage 1 used a formative evaluative approach based on the Productivity Measurement and Enhancement System to develop the scale with multiple stakeholder groups over a series of qualitative workshops held in various locations across England. Stage 2 analysed responses from a cross-sectional survey of 1500 members in 135 CMHTs from 11 Mental Health Trusts in England to determine the scale's psychometric properties. Based on an analysis of its structural validity and reliability, the resultant 20-item scale demonstrated good psychometric properties and captured one overall latent factor of CMHT effectiveness comprising seven dimensions: improved service user well-being, creative problem-solving, continuous care, inter-team working, respect between professionals, engagement with carers and therapeutic relationships with service users. The scale will be of significant value to CMHTs and healthcare commissioners both nationally and internationally for monitoring, evaluating and improving team functioning in practice.