924 results for Customer Relation Method (CRM)


Relevance: 30.00%

Publisher:

Abstract:

1.1 Background and Purpose: Ultrasound-guided sciatic nerve blockade has rapid onset, but at 24 hours pain is greater than with nerve-stimulator techniques. Injection of the nerve branches or trunk and sub-sheath blockade increase success and reduce onset times but risk injury. This study mapped needle coordinates for sciatic nerve blockade with nerve stimulation and their relation to postoperative pain scores. 1.2 Method: The angle and distance of the needle tip and infusion catheter from the popliteal sciatic nerve at which stimulated plantar flexion occurred were measured. Pain scores at postanesthesia unit discharge and at 24 hours were recorded. 1.3 Results: 81% of opioid-naïve patients reported immediate analgesia and 20.8% at 24 hours. In opioid-tolerant patients, 56.8% reported immediate analgesia and 9.1% at 24 hours. Plantar flexion was observed with the needle in the posterior medial quadrant near the sciatic nerve. Opioid-tolerant patients reported adequate analgesia when the needle was located more medially and proximally to the sciatic nerve. 1.4 Conclusion: Stimulated plantar flexion is isolated to a narrow angular range in the posterior medial quadrant adjacent to the sciatic nerve. Opioid-tolerant patients report adequate analgesia if the needle and catheter are more medial and proximal to the nerve surface.

Relevance: 30.00%

Publisher:

Abstract:

The efficiency of physical separation of inclusion bodies from cell debris is related to cell debris size and inclusion body release, and both factors should be taken into account when designing a process. In this work, cell disruption by enzymatic treatment with lysozyme and cellulase, by homogenization, and by homogenization with ammonia pretreatment is discussed. These disruption methods are compared on the basis of inclusion body release, operating costs, and cell debris particle size. The latter was measured with cumulative sedimentation analysis in combination with membrane-associated protein quantification by SDS-PAGE and a spectrophotometric peptidoglycan quantification method. Comparison of the results obtained with these two cell debris quantification methods shows that enzymatic treatment yields cell debris particles with varying chemical composition, while this is not the case with the other disruption methods that were investigated. Furthermore, the experiments show that ammonia pretreatment with homogenization increases inclusion body release compared to homogenization without pretreatment and that this pretreatment may be used to control the cell debris size to some extent. The enzymatic disruption process gives a higher product release than homogenization with or without ammonia pretreatment at lower operating costs, but it also yields a much smaller cell debris size than the other disruption processes. This is unfavorable for centrifugal inclusion body purification in this case, where cell debris is the component going to the sediment and the inclusion body is the floating component. Nevertheless, calculations show that centrifugal separation of inclusion bodies from the enzymatically treated cells gives a high inclusion body yield and purity. (C) 2004 Wiley Periodicals, Inc.
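
To make the centrifugation argument concrete, the sketch below applies Stokes' law to compare settling velocities for coarse and fine cell debris; all particle sizes, densities, and the centrifugal acceleration are assumptions for illustration, not values from the study.

```python
# A minimal sketch of why a smaller cell debris size is unfavourable here:
# under Stokes' law the settling velocity scales with the square of the
# particle diameter, so the finer debris produced by enzymatic lysis
# sediments far more slowly, making it harder to spin the debris down away
# from the floating inclusion bodies. Sizes, densities, and the acceleration
# below are illustrative assumptions.

def stokes_velocity(diameter_m: float, rho_particle: float,
                    rho_fluid: float = 1000.0, viscosity: float = 1e-3,
                    accel: float = 5000 * 9.81) -> float:
    """Settling velocity (m/s) of a small sphere in a centrifugal field."""
    return (rho_particle - rho_fluid) * diameter_m ** 2 * accel / (18 * viscosity)

debris_homogenised = stokes_velocity(0.5e-6, 1080.0)  # assumed 0.5 um debris
debris_enzymatic   = stokes_velocity(0.2e-6, 1080.0)  # assumed 0.2 um debris

print(f"homogenised debris: {debris_homogenised:.2e} m/s")
print(f"enzymatic debris:   {debris_enzymatic:.2e} m/s")
print(f"fine debris settles {debris_homogenised / debris_enzymatic:.1f}x more slowly")
```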

Relevance: 30.00%

Publisher:

Abstract:

Objectives: The aim of this study was two-fold: to assess climacteric symptoms and provide normative data for the Greene Climacteric Scale during the menopause transition, and to investigate the prevalence of climacteric symptoms in a representative sample of postmenopausal Australian women. Method: A cohort of 500 premenopausal, perimenopausal and postmenopausal women aged 40-80 years participated in the Longitudinal Study of Ageing in Women (LAW study) at the Royal Brisbane and Women's Hospital, Brisbane, Australia. In year 1 of the study (2001), all participants completed the Greene Climacteric Scale and information regarding their menopausal status and the use of hormone therapy (HT) was obtained through a clinical interview with a qualified medical practitioner. Results: The 50-59-year age group achieved the highest scores on the vasomotor and the depression scales in comparison to other age groups. Significant differences were also evident on the vasomotor and the depression scales on the basis of menopausal status, especially in perimenopausal women. Approximately 10% of women in the 60-79-year age group continued to experience vasomotor symptoms. Conclusion: Vasomotor symptoms, as assessed by the Greene Climacteric Scale, are common during the menopause transition and remain elevated for some years in a minority of older postmenopausal women. The norms presented in this study are appropriate for use in an Australian population.

Relevance: 30.00%

Publisher:

Abstract:

Detection of point mutations or single nucleotide polymorphisms (SNPs) is important in relation to disease susceptibility, or for detecting, in pathogens, mutations that determine drug resistance or host range. There is an emerging need for rapid detection methods amenable to point-of-care applications. The purpose of this study was to reduce to practice a novel method for SNP detection and to demonstrate that this technology can be used downstream of nucleic acid amplification. The authors used a model system to develop an oligonucleotide-based SNP detection system on nitrocellulose lateral flow strips. To optimize the assay, they used cloned sequences of the herpes simplex virus-1 (HSV-1) DNA polymerase gene into which they introduced a point mutation. The assay system uses chimeric polymerase chain reaction (PCR) primers that incorporate hexameric repeat tags ("hexapet tags"). The chimeric sequences allow capture of amplified products to predefined positions on a lateral flow strip. These "hexapet" sequences have minimal cross-reactivity and allow specific hybridization-based capture of the PCR products at room temperature onto lateral flow strips that have been striped with complementary hexapet tags. The allele-specific amplification was carried out with both mutant and wild-type primer sets present in the PCR mix ("competitive" format). The resulting PCR products carried a hexapet tag that corresponded with either a wild-type or mutant sequence. The lateral flow strips are dropped into the PCR reaction tube, and the mutant and wild-type sequences diffuse along the strip and are captured at the corresponding position on the strip. A red line indicative of a positive reaction is visible after 1 minute. Unlike other systems that require separate reactions and strips for each target sequence, this system allows multiplex PCR reactions and multiplex detection on a single strip or other suitable substrates. Unambiguous visual discrimination of a point mutation under room temperature hybridization conditions was achieved with this model system within 10 minutes after PCR. The authors have developed a capture-based hybridization method for the detection and discrimination of HSV-1 DNA polymerase genes that contain a single nucleotide change. It has been demonstrated that the hexapet oligonucleotides can be adapted for hybridization on the lateral flow strip platform for discrimination of SNPs. This is the first step in demonstrating SNP detection on lateral flow strips using the hexapet oligonucleotide capture system. It is anticipated that this novel system can be widely used in point-of-care settings.
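
The abstract does not give the hexapet sequences or the criteria used to minimise cross-reactivity, so the Python sketch below only illustrates one way a set of hexamer tags could be screened for mutual complementarity before being assigned to capture lines; the candidate sequences and the max_run cut-off are hypothetical.

```python
from itertools import combinations  # noqa: F401  (kept for extensions to pairwise reports)

COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq: str) -> str:
    return seq.translate(COMPLEMENT)[::-1]

def max_complementary_run(a: str, b: str) -> int:
    """Longest stretch of 'a' found in the reverse complement of 'b',
    i.e. a region where the two tags could hybridise to each other."""
    rc_b = reverse_complement(b)
    best = 0
    for i in range(len(a)):
        for j in range(i + 1, len(a) + 1):
            if a[i:j] in rc_b:
                best = max(best, j - i)
    return best

def screen_tags(candidates, max_run=3):
    """Greedily keep hexamer-repeat tags whose complementary overlap with
    every previously accepted tag stays below max_run bases."""
    accepted = []
    for tag in candidates:
        if all(max_complementary_run(tag, kept) < max_run for kept in accepted):
            accepted.append(tag)
    return accepted

# Hypothetical candidate hexamers (not the published hexapet sequences).
candidates = ["ACCTGA", "GTTACC", "CAGTAC", "TGGATC", "AGGTCA"]
print(screen_tags(candidates))
```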

Relevance: 30.00%

Publisher:

Abstract:

Non-technical losses (NTL) identification and prediction are important tasks for many utilities. Data from the customer information system (CIS) can be used for NTL analysis. However, in order to accurately and efficiently perform NTL analysis, the original data from the CIS need to be pre-processed before any detailed NTL analysis can be carried out. In this paper, we propose a feature selection based method for CIS data pre-processing in order to extract the most relevant information for further analysis such as clustering and classification. By removing irrelevant and redundant features, feature selection is an essential step in the data mining process: it finds an optimal subset of features that improves the quality of the results, giving faster processing, higher accuracy, and simpler models with fewer features. A detailed feature selection analysis is presented in the paper. Both time-domain and load-shape data are compared on the basis of accuracy, consistency, and statistical dependencies between features.
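
As a concrete illustration of filter-style feature selection on CIS-derived data, the sketch below scores features by mutual information with an NTL label and keeps the top-ranked subset; the feature names, the simulated data, and the choice of scoring function are assumptions, not the method used in the paper.

```python
# A minimal sketch of filter-style feature selection for NTL analysis,
# assuming a tabular dataset with per-customer features and an NTL label.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif

rng = np.random.default_rng(0)
n_customers = 500

# Hypothetical CIS-derived features: consumption, payment delays, load-shape stats.
X = np.column_stack([
    rng.normal(300, 80, n_customers),    # monthly consumption (kWh)
    rng.poisson(1.5, n_customers),       # number of late payments
    rng.uniform(0, 1, n_customers),      # load factor (load-shape feature)
    rng.normal(0, 1, n_customers),       # an irrelevant, noisy feature
])
# Synthetic NTL label loosely driven by late payments and low load factor.
y = (X[:, 1] + 2 * (1 - X[:, 2]) + rng.normal(0, 0.5, n_customers) > 3).astype(int)

selector = SelectKBest(mutual_info_classif, k=2).fit(X, y)
print("selected feature indices:", selector.get_support(indices=True))
print("feature scores:", np.round(selector.scores_, 3))
```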

Relevance: 30.00%

Publisher:

Abstract:

Wireless Mesh Networks (WMNs) have emerged as a key technology for the next generation of wireless networking. Rather than being just another type of ad-hoc networking, WMNs diversify the capabilities of ad-hoc networks. Many kinds of protocols work over WMNs, such as IEEE 802.11a/b/g, 802.15 and 802.16. To achieve high throughput under varying conditions, these protocols have to adapt their transmission rate. Although rate adaptation is a significant part of this, only a few algorithms, such as Auto Rate Fallback (ARF) and Receiver Based Auto Rate (RBAR), have been published. In this paper we show that MAC, packet-loss and physical-layer conditions play an important role in obtaining a good channel condition. We also perform rate adaptation along with multiple packet transmission for better throughput. By dynamically monitoring the channel, transmitting multiple packets, and adapting to changes in channel quality through adjustment of packet transmission rates according to certain optimization criteria, improvements in performance can be obtained. The proposed method detects channel congestion by measuring the fluctuation of the signal, quantified by its standard deviation, and detects packet loss before channel performance diminishes. We show that the use of such techniques in WMNs can significantly improve performance. The effectiveness of the proposed method is demonstrated in an experimental wireless network testbed via packet-level simulation. Our simulation results show that, regardless of the channel condition, we were able to improve throughput.
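
A minimal sketch of the kind of fluctuation-aware rate adaptation described above is given below: congestion is inferred from the standard deviation of recent signal readings, the rate is backed off ARF-style on loss or instability, and multiple packets are aggregated when the channel is stable. The rate table, window size, and thresholds are illustrative assumptions rather than the paper's parameters.

```python
from collections import deque
from statistics import pstdev

RATES_MBPS = [6, 12, 24, 36, 48, 54]          # assumed 802.11a/g rate table

class FluctuationRateController:
    def __init__(self, window=16, fluct_threshold=3.0):
        self.samples = deque(maxlen=window)    # recent signal readings (dB)
        self.fluct_threshold = fluct_threshold
        self.rate_idx = 2                      # start at a mid-range rate
        self.success_run = 0

    def on_feedback(self, signal_db: float, packet_lost: bool) -> tuple[int, int]:
        """Return (rate in Mbps, number of packets to send in the next burst)."""
        self.samples.append(signal_db)
        fluct = pstdev(self.samples) if len(self.samples) > 1 else 0.0

        if packet_lost or fluct > self.fluct_threshold:
            # Congested / unstable channel: back off and send single packets.
            self.rate_idx = max(0, self.rate_idx - 1)
            self.success_run = 0
            burst = 1
        else:
            self.success_run += 1
            if self.success_run >= 8:          # ARF-style upward probe
                self.rate_idx = min(len(RATES_MBPS) - 1, self.rate_idx + 1)
                self.success_run = 0
            burst = 4                          # stable channel: aggregate packets
        return RATES_MBPS[self.rate_idx], burst
```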

Relevance: 30.00%

Publisher:

Abstract:

This thesis examines the innovative performance of 206 U.S. business service firms. Undeniably, a need exists for better comprehension of the service sector of developed economies. This research takes a unique view by applying a synthesis approach to studying innovation and attempts to build on a proposed strategic innovation paradigm. A quantitative method is utilised via questionnaire, in which all major types of innovation are examined, including product and service, organisational, and technology-driven innovations. Essential ideas for this conceptual framework encapsulate a new mode of understanding service innovation. The structure of the analysis encompasses the likelihood of innovation and the extent of innovation, while also attempting to shed light on the factors which determine the impact of innovation on performance among service firms. What differentiates this research is its focus on customer-driven service firms in addition to other external linkages. A synopsis of the findings suggests that external linkages, particularly with customers, suppliers and strategic alliances or joint ventures, significantly affect innovation performance with regard to the introduction of new services. Service firms which incorporate formal and informal R&D experience significant increases in the extent of new-to-market and new-to-firm innovations. Additionally, the results show that customer-driven service firms experience greater productivity and growth. Furthermore, the findings suggest that external linkages assist service firm performance.
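
The two-part structure of the analysis (likelihood of innovating, then extent of innovation) can be sketched with standard econometric tools, as below; the variable names, simulated data, and model choices are illustrative assumptions, not the thesis's actual specification.

```python
# A minimal sketch of a two-part analysis: a logit model for the likelihood of
# innovating and a count model for the extent of innovation among innovators.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 206
df = pd.DataFrame({
    "customer_link": rng.integers(0, 2, n),     # external linkage with customers
    "supplier_link": rng.integers(0, 2, n),
    "formal_rnd": rng.integers(0, 2, n),
    "firm_size": rng.normal(50, 20, n),
})
latent = 0.8 * df.customer_link + 0.5 * df.formal_rnd + rng.normal(0, 1, n)
df["innovated"] = (latent > 0.6).astype(int)
df["new_services"] = np.where(df.innovated, rng.poisson(2, n), 0)

X = sm.add_constant(df[["customer_link", "supplier_link", "formal_rnd", "firm_size"]])

# Part 1: likelihood of introducing any new service.
logit_res = sm.Logit(df["innovated"], X).fit(disp=0)

# Part 2: extent of innovation among innovating firms only.
innovators = df.innovated == 1
count_res = sm.Poisson(df.loc[innovators, "new_services"], X[innovators]).fit(disp=0)

print(logit_res.params.round(2))
print(count_res.params.round(2))
```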

Relevance: 30.00%

Publisher:

Abstract:

This thesis focused on applying theoretical models of synchronization to cortical dynamics as measured by magnetoencephalography (MEG). Dynamical systems theory was used both in identifying relevant variables for brain coordination and in devising methods for their quantification. We presented a method for studying interactions of linear and chaotic neuronal sources using MEG beamforming techniques. We showed that such sources can be accurately reconstructed in terms of their location, temporal dynamics and possible interactions. Synchronization in low-dimensional nonlinear systems was studied to explore specific correlates of functional integration and segregation. In the case of interacting dissimilar systems, relevant coordination phenomena involved generalized and phase synchronization, which were often intermittent. Spatially extended systems were then studied. For locally coupled dissimilar systems, as in the case of cortical columns, clustering behaviour occurred. Synchronized clusters emerged at different frequencies and their boundaries were marked by oscillation death. The macroscopic mean field revealed sharp spectral peaks at the frequencies of the clusters and broader spectral drops at their boundaries. These results question existing models of Event Related Synchronization and Desynchronization. We re-examined the concept of the steady-state evoked response following an AM stimulus. We showed that very little of the variability in the AM following response could be accounted for by system noise. We presented a methodology for detecting local and global nonlinear interactions from MEG data in order to account for residual variability. We found cross-hemispheric nonlinear interactions of ongoing cortical rhythms concurrent with the stimulus, and interactions of these rhythms with the following AM responses. Finally, we hypothesized that holistic spatial stimuli would be accompanied by the emergence of clusters in primary visual cortex, resulting in frequency-specific MEG oscillations. Indeed, we found different frequency distributions in induced gamma oscillations for different spatial stimuli, which was suggestive of temporal coding of these spatial stimuli. Further, we addressed the bursting character of these oscillations, which was suggestive of intermittent nonlinear dynamics. However, we did not observe the characteristic -3/2 power-law scaling in the distribution of interburst intervals. Further, this distribution was only seldom significantly different from the one obtained in surrogate data, in which the nonlinear structure was destroyed. In conclusion, the work presented in this thesis suggests that advances in dynamical systems theory, in conjunction with developments in magnetoencephalography, may facilitate a mapping between levels of description in the brain. This may potentially represent a major advancement in neuroscience.
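
One of the synchronization measures central to this literature, the phase-locking value between two band-limited signals, can be estimated as in the sketch below; the synthetic signals and parameters are stand-ins, not MEG recordings from the thesis.

```python
# A minimal sketch of the phase-locking value (PLV) between two signals,
# estimated from instantaneous phases obtained with the Hilbert transform.
import numpy as np
from scipy.signal import hilbert

fs = 500.0                       # sampling rate (Hz), assumed
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(2)

# Two 10 Hz oscillations with a roughly constant phase lag plus noise.
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 10 * t + 0.7) + 0.5 * rng.standard_normal(t.size)

phase_x = np.angle(hilbert(x))
phase_y = np.angle(hilbert(y))

# PLV: magnitude of the mean unit phasor of the phase difference (1 = locked).
plv = np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))
print(f"phase-locking value: {plv:.2f}")
```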

Relevance: 30.00%

Publisher:

Abstract:

The Octopus Automated Perimeter was validated in a comparative study and found to offer many advantages in the assessment of the visual field. The visual evoked potential was investigated in an extensive study using a variety of stimulus parameters to simulate hemianopia and central visual field defects. The scalp topography was recorded and a technique to compute the source derivation of the scalp potential was developed. This enabled clarification of the expected scalp distribution to half-field stimulation using different electrode montages. The visual evoked potential following full-field stimulation was found to be asymmetrical around the midline, with a bias over the left occiput, particularly when the foveal polar projections of the occipital cortex were preferentially stimulated. The half-field response reflected the distribution asymmetry. Masking of the central 3° resulted in a response which was approximately symmetrical around the midline, but there was no evidence of the PNP-complex. A method for visual field quantification was developed based on the neural representation of visual space (Drasdo and Peaston 1982) in an attempt to relate visual field deprivation to the resultant visual evoked potentials. There was no form of simple, diffuse summation between the scalp potential and the cortical generators. It was, however, possible to quantify the degree of scalp potential attenuation for M-scaled full-field stimuli. The results obtained from patients exhibiting pre-chiasmal lesions suggested that the PNP-complex is not scotomatous in nature but confirmed that it is most likely to be related to specific diseases (Harding and Crews 1982). There was a strong correlation between the percentage information loss of the visual field and the diagnostic value of the visual evoked potential in patients exhibiting chiasmal lesions.
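
A Hjorth-style source derivation, one common way of computing a local (Laplacian-like) estimate of the scalp potential, can be sketched as below; the montage, neighbour map, and synthetic traces are assumptions for illustration, not the exact technique developed in this work.

```python
# A minimal sketch of a Hjorth-style source derivation: each electrode's
# potential is re-referenced to the average of its nearest neighbours,
# approximating the local Laplacian of the scalp potential.
import numpy as np

# Hypothetical occipital montage and neighbour relationships.
neighbours = {
    "O1": ["Oz", "PO3"],
    "Oz": ["O1", "O2"],
    "O2": ["Oz", "PO4"],
    "PO3": ["O1", "Oz"],
    "PO4": ["O2", "Oz"],
}

def source_derivation(potentials: dict[str, np.ndarray]) -> dict[str, np.ndarray]:
    """Subtract the mean of neighbouring electrodes from each channel."""
    return {
        ch: potentials[ch] - np.mean([potentials[n] for n in neighbours[ch]], axis=0)
        for ch in neighbours
    }

# Example with short synthetic traces (one value per time sample).
t = np.linspace(0, 0.3, 150)
data = {ch: np.sin(2 * np.pi * 8 * t + i) for i, ch in enumerate(neighbours)}
local = source_derivation(data)
print({ch: round(float(v.std()), 3) for ch, v in local.items()})
```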

Relevance: 30.00%

Publisher:

Abstract:

Background: Cardiovascular disease (CVD) is partially attributed to traditional cardiovascular risk factors, which can be identified and managed using risk stratification algorithms (Framingham Risk Score, National Cholesterol Education Program, Systematic Cardiovascular Risk Evaluation and Reynolds Risk Score). We aimed to (a) identify the proportion of at-risk patients with rheumatoid arthritis (RA) requiring statin therapy identified by conventional risk calculators, and (b) assess whether patients at risk were receiving statins. Methods: Patients at high CVD risk (excluding patients with established CVD or diabetes) were identified from a cohort of 400 well-characterised patients with RA, by applying risk calculators with or without a ×1.5 multiplier in specific patient subgroups. Actual statin use versus the number eligible for statins was also calculated. Results: The percentage of patients identified as being at risk ranged widely depending on the method, from 1.6% (for 20% threshold global CVD risk) to 15.5% (for CVD and cerebrovascular morbidity and mortality), 21.8% (for 10% global CVD risk) and 25.9% (for 5% CVD mortality), with the majority of them (58.1% to 94.8%) not receiving statins. The application of the ×1.5 multiplier identified 17% to 78% more at-risk patients. Conclusions: Depending on the risk stratification method, 2% to 26% of patients with RA without CVD have sufficiently high risk to require statin therapy, yet most of them remain untreated. To address this issue, we would recommend annual systematic screening using the nationally applicable risk calculator, combined with regular audit of whether treatment targets have been achieved.
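
The screening logic (apply a risk calculator, apply the ×1.5 multiplier in qualifying subgroups, and flag patients above a treatment threshold) can be sketched as follows; the risk values, the qualifying criteria, and the 20% threshold shown are illustrative assumptions rather than the study's exact rules.

```python
# A minimal sketch of multiplier-adjusted risk screening for RA patients.
from dataclasses import dataclass

@dataclass
class Patient:
    ten_year_risk: float        # % 10-year global CVD risk from a calculator
    long_disease_duration: bool
    seropositive: bool
    extra_articular: bool

def adjusted_risk(p: Patient) -> float:
    """Apply the x1.5 multiplier when the patient meets assumed qualifying criteria."""
    qualifies = sum([p.long_disease_duration, p.seropositive, p.extra_articular]) >= 2
    return p.ten_year_risk * (1.5 if qualifies else 1.0)

def needs_statin(p: Patient, threshold: float = 20.0) -> bool:
    return adjusted_risk(p) >= threshold

cohort = [
    Patient(14.0, True, True, False),   # 14% -> 21% after the multiplier
    Patient(18.0, False, False, False), # stays below the threshold
    Patient(22.0, False, True, False),  # already above the threshold
]
print(sum(needs_statin(p) for p in cohort), "of", len(cohort), "flagged for statins")
```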

Relevance: 30.00%

Publisher:

Abstract:

Purpose: Previous research has emphasized the pivotal role that salespeople play in customer satisfaction. In this regard, the relationship between salespeople's attitudes, skills, and characteristics, and customer satisfaction remains an area of interest. The paper aims to make three contributions: first, it examines the impact of salespeople's satisfaction, adaptive selling, and dominance on customer satisfaction. Second, it uses dyadic data, which provides a better test of the relationships between constructs since it avoids common method variance. Finally, in contrast to previous research, it tests all of the customers of the salespeople rather than customers selected by the salespeople. Design/methodology/approach: The study employs multilevel analysis to examine the effect of salespeople's satisfaction with the firm on customer satisfaction, using a dyadic, matched business-to-business sample from a large European financial service provider comprising 188 customers and 18 employees. Findings: The paper finds that customers' evaluations of service quality, product quality, and value influence customer satisfaction. The analysis at the selling firm's employee level shows that adaptive selling and employee satisfaction positively impact customer satisfaction, while dominance is negatively related to customer satisfaction. Practical implications: Research shows that customer focus is a key driver of success in service companies. Customer satisfaction is regarded as a prerequisite for establishing long-term, profitable relations between company and customer, and customer contact employees are key to nurturing this relationship. The role of salespeople's attitudes, skills, and characteristics in the customer satisfaction process is highlighted in this paper. Originality/value: The use of dyadic, multilevel studies to assess the nature of the relationship between employees and customers is, to date, surprisingly limited. The paper examines the link between employee attitudes, skills, and characteristics, and customer satisfaction in a business-to-business setting in the financial service sector, differentiating between customer-level and employee-level drivers of business customer satisfaction.
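
The multilevel setup (customers nested within salespeople) can be sketched with a random-intercept mixed model, as below; the variable names and simulated data are illustrative, not the study's dyadic dataset.

```python
# A minimal sketch of a two-level model: customer-level satisfaction regressed
# on customer- and salesperson-level predictors, with a random intercept per
# salesperson.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_sales, per_sales = 18, 10                     # 18 employees, ~10 customers each

rows = []
for s in range(n_sales):
    emp_sat = rng.normal(4, 0.5)                # salesperson's own satisfaction
    adaptive = rng.normal(3.5, 0.6)             # adaptive selling score
    intercept_s = rng.normal(0, 0.3)            # salesperson-level random effect
    for _ in range(per_sales):
        service_q = rng.normal(4, 0.7)          # customer's service-quality rating
        cust_sat = (1.0 + 0.4 * service_q + 0.3 * emp_sat + 0.2 * adaptive
                    + intercept_s + rng.normal(0, 0.5))
        rows.append({"salesperson": s, "emp_sat": emp_sat, "adaptive": adaptive,
                     "service_q": service_q, "cust_sat": cust_sat})
df = pd.DataFrame(rows)

model = smf.mixedlm("cust_sat ~ service_q + emp_sat + adaptive",
                    data=df, groups=df["salesperson"])
print(model.fit().summary())
```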

Relevance: 30.00%

Publisher:

Abstract:

In current organizations, valuable enterprise knowledge is often buried under a rapidly expanding amount of unstructured information in the form of web pages, blogs, and other forms of human text communication. We present a novel unsupervised machine learning method called CORDER (COmmunity Relation Discovery by named Entity Recognition) to turn these unstructured data into structured information for knowledge management in these organizations. CORDER exploits named entity recognition and co-occurrence data to associate individuals in an organization with their expertise and associates. We discuss the problems associated with evaluating unsupervised learners and report our initial evaluation experiments: an expert evaluation, a quantitative benchmarking exercise, and an application of CORDER in a social networking tool called BuddyFinder.
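
A minimal sketch of co-occurrence based relation discovery in this spirit is shown below: given named entities recognised per page, count pairwise co-occurrences and rank candidate relations for a target entity. The pages and entity names are hypothetical, and the simple count used here stands in for CORDER's actual relation-scoring function.

```python
from collections import Counter
from itertools import combinations

# Hypothetical output of a named-entity recogniser: entities found per web page.
pages = [
    {"A. Smith", "Project Alpha", "Knowledge Dept"},
    {"A. Smith", "B. Jones", "Project Alpha"},
    {"B. Jones", "Knowledge Dept", "Semantic Web"},
    {"A. Smith", "Knowledge Dept", "Semantic Web"},
]

# Count symmetric co-occurrences of entity pairs within each page.
cooccur = Counter()
for ents in pages:
    for a, b in combinations(sorted(ents), 2):
        cooccur[(a, b)] += 1
        cooccur[(b, a)] += 1

def related_to(target: str, top_n: int = 3):
    """Rank entities by how often they co-occur with the target entity."""
    scores = {b: c for (a, b), c in cooccur.items() if a == target}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

print(related_to("A. Smith"))
```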

Relevance: 30.00%

Publisher:

Abstract:

Discovering who works with whom, on which projects and with which customers is a key task in knowledge management. Although most organizations keep models of organizational structures, these models do not necessarily accurately reflect the reality on the ground. In this paper we present a text mining method called CORDER which first recognizes named entities (NEs) of various types from Web pages, and then discovers relations from a target NE to other NEs which co-occur with it. We evaluated the method on our departmental Website. We used the CORDER method to first find related NEs of four types (organizations, people, projects, and research areas) from Web pages on the Website and then rank them according to their co-occurrence with each of the people in our department. Twenty representative people were selected, and each of them was presented with ranked lists of each type of NE. Each person specified whether these NEs were related to him/her and changed or confirmed their rankings. Our results indicate that the method can find the NEs with which these people are closely related and provide accurate rankings.

Relevance: 30.00%

Publisher:

Abstract:

In this paper, we propose a text mining method called LRD (latent relation discovery), which extends the traditional vector space model of document representation in order to improve information retrieval (IR) on documents and document clustering. Our LRD method extracts terms and entities, such as person, organization, or project names, and discovers relationships between them by taking into account their co-occurrence in textual corpora. Given a target entity, LRD discovers other entities closely related to the target effectively and efficiently. With respect to such relatedness, a measure of relation strength between entities is defined. LRD uses relation strength to enhance the vector space model, and uses the enhanced vector space model for query-based IR on documents and for clustering documents in order to discover complex relationships among terms and entities. Our experiments on a standard dataset for query-based IR show that the LRD method performs significantly better than the traditional vector space model and five other standard statistical methods for vector expansion.
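
The idea of enhancing the vector space model with relation strength can be sketched as query-vector expansion through a term-term strength matrix, as below; the matrix values, term list, and documents are toy examples, and the expansion shown is only one plausible reading of the approach, not the paper's exact formulation.

```python
# A minimal sketch of relation-strength-based query expansion followed by
# cosine-similarity retrieval.
import numpy as np

terms = ["project", "alice", "ontology", "funding"]

# Toy relation-strength matrix S[i, j]: how strongly term i relates to term j.
S = np.array([
    [1.0, 0.6, 0.3, 0.2],
    [0.6, 1.0, 0.1, 0.0],
    [0.3, 0.1, 1.0, 0.1],
    [0.2, 0.0, 0.1, 1.0],
])

docs = np.array([           # toy term-frequency vectors, one row per document
    [2, 0, 1, 0],
    [0, 3, 0, 1],
    [1, 1, 2, 0],
], dtype=float)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

query = np.array([1, 1, 0, 0], dtype=float)    # query mentioning "project" and "alice"
expanded = query @ S                           # pull in related terms by strength

ranked = sorted(range(len(docs)),
                key=lambda i: cosine(expanded, docs[i]), reverse=True)
print("document ranking:", ranked)
```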

Relevance: 30.00%

Publisher:

Abstract:

Objectives: To conduct an independent evaluation of the first phase of the Health Foundation's Safer Patients Initiative (SPI), and to identify the net additional effect of SPI and any differences in changes in participating and non-participating NHS hospitals. Design: Mixed method evaluation involving five substudies, before and after design. Setting: NHS hospitals in United Kingdom. Participants: Four hospitals (one in each country in the UK) participating in the first phase of the SPI (SPI1); 18 control hospitals. Intervention: The SPI1 was a compound (multicomponent) organisational intervention delivered over 18 months that focused on improving the reliability of specific frontline care processes in designated clinical specialties and promoting organisational and cultural change. Results: Senior staff members were knowledgeable and enthusiastic about SPI1. There was a small (0.08 points on a 5 point scale) but significant (P<0.01) effect in favour of the SPI1 hospitals in one of 11 dimensions of the staff questionnaire (organisational climate). Qualitative evidence showed only modest penetration of SPI1 at medical ward level. Although SPI1 was designed to engage staff from the bottom up, it did not usually feel like this to those working on the wards, and questions about legitimacy of some aspects of SPI1 were raised. Of the five components to identify patients at risk of deterioration - monitoring of vital signs (14 items); routine tests (three items); evidence based standards specific to certain diseases (three items); prescribing errors (multiple items from the British National Formulary); and medical history taking (11 items) - there was little net difference between control and SPI1 hospitals, except in relation to quality of monitoring of acute medical patients, which improved on average over time across all hospitals. Recording of respiratory rate increased to a greater degree in SPI1 than in control hospitals; in the second six hours after admission recording increased from 40% (93) to 69% (165) in control hospitals and from 37% (141) to 78% (296) in SPI1 hospitals (odds ratio for "difference in difference" 2.1, 99% confidence interval 1.0 to 4.3; P=0.008). Use of a formal scoring system for patients with pneumonia also increased over time (from 2% (102) to 23% (111) in control hospitals and from 2% (170) to 9% (189) in SPI1 hospitals), which favoured controls and was not significant (0.3, 0.02 to 3.4; P=0.173). There were no improvements in the proportion of prescription errors and no effects that could be attributed to SPI1 in non-targeted generic areas (such as enhanced safety culture). On some measures, the lack of effect could be because compliance was already high at baseline (such as use of steroids in over 85% of cases where indicated), but even when there was more room for improvement (such as in quality of medical history taking), there was no significant additional net effect of SPI1. There were no changes over time or between control and SPI1 hospitals in errors or rates of adverse events in patients in medical wards. Mortality increased from 11% (27) to 16% (39) among controls and decreased from 17% (63) to 13% (49) among SPI1 hospitals, but the risk adjusted difference was not significant (0.5, 0.2 to 1.4; P=0.085). Poor care was a contributing factor in four of the 178 deaths identified by review of case notes. The survey of patients showed no significant differences apart from an increase in perception of cleanliness in favour of SPI1 hospitals.
Conclusions: The introduction of SPI1 was associated with improvements in one of the types of clinical process studied (monitoring of vital signs) and one measure of staff perceptions of organisational climate. There was no additional effect of SPI1 on other targeted issues nor on other measures of generic organisational strengthening.
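
As a worked check of the headline vital-signs result, the sketch below recomputes a crude difference-in-difference odds ratio from the reported respiratory-rate recording percentages (control 40% to 69%, SPI1 37% to 78%); it ignores the adjustment and clustering used in the paper, so it only approximates the published estimate of 2.1.

```python
# Crude "difference in difference" odds ratio from the reported proportions.

def odds(p: float) -> float:
    return p / (1.0 - p)

control_or = odds(0.69) / odds(0.40)   # change over time in control hospitals
spi_or = odds(0.78) / odds(0.37)       # change over time in SPI1 hospitals

did_or = spi_or / control_or           # difference-in-difference odds ratio
print(f"control OR {control_or:.2f}, SPI1 OR {spi_or:.2f}, DiD OR {did_or:.2f}")
```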