909 results for Mapping And Monitoring


Relevance:

90.00%

Publisher:

Abstract:

Two alternative work designs are identified for operators of stand-alone advanced manufacturing technology (AMT). In the case of specialist control, operators are limited to running and monitoring the technology, with operating problems handled by specialists, such as engineers. In the case of operator control, operators are given much broader responsibilities and deal directly with the majority of operating problems encountered. The hypothesis that operator control would promote better performance and psychological well-being than would specialist control (which is more prevalent) was tested in a longitudinal field study involving work redesign for operators of computer-controlled assembly machines. Change from specialist to operator control reduced downtime, especially for high-variance systems, and was associated with greater intrinsic job satisfaction and less perceived work pressure. The implications of these findings for both small and large-scale applications of AMT are discussed.

Relevance:

90.00%

Publisher:

Abstract:

Purpose: Current conceptualisations of strategic flexibility and its antecedents are theory-driven, which has resulted in a lack of consensus. To summarise this domain, the paper aims to develop and present an a priori conceptual model of the antecedents and outcomes of strategic flexibility. Discussion of, and insights into, the conceptual model and the relationships specified are offered through a novel qualitative empirical approach. The implications for further research and a framework for further theoretical development are presented. Design/methodology/approach: An exploratory qualitative research design is used, applying multiple data collection techniques in a branch network of a large regional retailer in the UK. The development of strategic options and their complex relationship to strategic flexibility is investigated. Findings: The number and type of strategic options developed by managers impact on the degree of strategic flexibility and also on the ability of the firm to achieve competitive differentiation. Additionally, the type of strategic option implemented by managers depends on the competitive situation faced at a local level. Evidence of managers' limited perception of competition was identified, based on their spatial embeddedness. Research limitations/implications: A single, in-depth case study was used. The data gathered are rich and appropriate for the exploratory approach adopted here; however, generalisability of the findings is limited. Practical implications: Strategic flexibility is rooted in the ability of front-line managers to develop and implement strategic options; this in turn facilitates competitive differentiation. Originality/value: The research presented is unique in this domain on two accounts. First, theory is developed by presenting an a priori conceptual model and testing it through in-depth qualitative data gathering. Second, insights into strategic flexibility are presented through an examination of managerial cognition, resources and strategic option generation using cognitive mapping and the laddering technique. © Emerald Group Publishing Limited.

Relevance:

90.00%

Publisher:

Abstract:

Visual field assessment is a core component of glaucoma diagnosis and monitoring, and Standard Automated Perimetry (SAP) is still considered the gold standard of visual field assessment. Although SAP is a subjective assessment with many pitfalls, it is used routinely in the diagnosis of visual field loss in glaucoma. The multifocal visual evoked potential (mfVEP) is a more recently introduced method for assessing the visual field objectively. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some were successful in detecting field defects comparable to standard SAP assessment, while others were less informative and required further adjustment and research. In this study, we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. OBJECTIVES: The purpose of this study is to examine the effectiveness of a new analysis method for the multifocal visual evoked potential (mfVEP) when used for objective assessment of the visual field in glaucoma patients, compared to the gold standard technique. METHODS: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects had two standard Humphrey Field Analyzer (HFA) 24-2 visual field tests and a single mfVEP test undertaken in one session. The mfVEP results were analysed using the new protocol, the Hemifield Sector Analysis (HSA) protocol; the HFA results were analysed using the standard grading system. RESULTS: Analysis of the mfVEP results showed a statistically significant difference between the three groups in mean signal-to-noise ratio (SNR) (ANOVA, p < 0.001, 95% CI).
Differences between superior and inferior hemifields were statistically significant in all 11/11 sectors in the glaucoma patient group (t-test, p < 0.001), partially significant (5/11 sectors) in the glaucoma suspect group (t-test, p < 0.01), and not significant for most sectors in the normal group (only 1/11 sectors significant; t-test, p < 0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86%, respectively, and 89% and 79% for glaucoma suspects. DISCUSSION: The results showed that the new analysis protocol confirmed existing field defects detected by standard HFA and differentiated between the three study groups, with a clear distinction between normal subjects and patients with suspected glaucoma; the distinction between normal subjects and glaucoma patients was especially clear and significant. CONCLUSION: The new HSA protocol used in mfVEP testing can detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. It provides information about focal visual field differences across the horizontal midline, which can be used to differentiate between glaucoma and normal subjects. Sensitivity and specificity of the mfVEP test were very promising and correlated with other anatomical changes of glaucomatous field loss.
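The reported sensitivity and specificity figures reduce to simple confusion-matrix ratios. A minimal sketch in Python, with purely illustrative counts (not the study's raw data):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative counts for a 36-eye patient group and a 38-eye control group.
sens, spec = sensitivity_specificity(tp=35, fn=1, tn=33, fp=5)
print(f"sensitivity={sens:.2f} specificity={spec:.2f}")
```

Reporting both figures matters here because the protocol is compared against SAP separately on the glaucoma and glaucoma suspect groups.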

Relevance:

90.00%

Publisher:

Abstract:

Previous research suggests that many eating behaviours are stable in children but that obesogenic eating behaviours tend to increase with age. This research explores the stability (consistency in individual levels over time) and continuity (consistency in group levels over time) of child eating behaviours and parental feeding practices in children between 2 and 5 years of age. Thirty-one participants completed measures of child eating behaviours, parental feeding practices and child weight at 2 and 5 years of age. Child eating behaviours and parental feeding practices remained stable between 2 and 5 years of age. There was also good continuity in measures of parental restriction and monitoring of food intake, as well as in mean levels of children's eating behaviours and BMI over time. Mean levels of maternal pressure to eat significantly increased, whilst mean levels of desire to drink significantly decreased, between 2 and 5 years of age. These findings suggest that children's eating behaviours are stable and continuous in the period prior to 5 years of age. Further research is necessary to replicate these findings and to explore why later developmental increases are seen in children's obesogenic eating behaviours. © 2011 Elsevier Ltd.

Relevance:

90.00%

Publisher:

Abstract:

Question/Issue: We combine agency and institutional theory to explain the division of equity shares between the foreign (majority) and local (minority) partners within foreign affiliates. We posit that once the decision to invest is made, the ownership structure is arranged so as to generate appropriate incentives to local partners, taking into account both the institutional environment and the firm-specific difficulty in monitoring. Research Findings/Insights: Using a large firm-level dataset for the period 2003-2011 from 16 Central and Eastern European countries and applying selectivity corrected estimates, we find that both weaker host country institutions and higher share of intangible assets in total assets in the firm imply higher minority equity share of local partners. The findings hold when controlling for host country effects and when the attributes of the institutional environment are instrumented. Theoretical/Academic Implications: The classic view is that weak institutions lead to concentrated ownership, yet it leaves the level of minority equity shares unexplained. Our contribution uses a firm-level perspective combined with national-level variation in the institutional environment, and applies agency theory to explain the minority local partner share in foreign affiliates. In particular, we posit that the information asymmetry and monitoring problem in firms are exacerbated by weak host country institutions, but also by the higher share of intangible assets in total assets. Practitioner/Policy Implications: Assessing investment opportunities abroad, foreign firms need to pay attention not only to features directly related to corporate governance (e.g., bankruptcy codes) but also to the broad institutional environment. In weak institutional environments, foreign parent firms need to create strong incentives for local partners by offering them significant minority shares in equity. 
The same recommendation applies to firms with higher shares of intangible assets in total assets. © 2014 The Authors.

Relevance:

90.00%

Publisher:

Abstract:

Energy dissipation and fatigue properties of nano-layered thin films are less well studied than bulk properties. Existing experimental methods for studying energy dissipation properties, typically using magnetic interaction as a driving force at different frequencies and a laser-based deformation measurement system, are difficult to apply to two-dimensional materials. We propose a novel experimental method to perform dynamic testing on thin-film materials by driving a cantilever specimen at its fixed end with a bimorph piezoelectric actuator and monitoring the displacements of the specimen and the actuator with a fibre-optic system. Upon vibration, the specimen is greatly affected by its inertia, and behaves as a cantilever beam under base excitation in translation. At resonance, this method resembles the vibrating reed method conventionally used in the viscoelasticity community. The loss tangent is obtained from both the width of a resonance peak and a free-decay process. As for fatigue measurement, we implement a control algorithm into LabView to maintain maximum displacement of the specimen during the course of the experiment. The fatigue S-N curves are obtained.
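The two loss-tangent estimates mentioned (resonance-peak width and free decay) can be sketched as follows; the function names and the lightly-damped single-mode assumptions are mine, not the authors':

```python
import math

def loss_tangent_bandwidth(f_res, f_lo, f_hi):
    """Half-power (-3 dB) bandwidth method: tan(delta) ~ (f_hi - f_lo) / f_res,
    valid for a lightly damped, well-separated resonance."""
    return (f_hi - f_lo) / f_res

def loss_tangent_free_decay(peak_amplitudes):
    """Free-decay (ring-down) method: the logarithmic decrement between
    successive peaks gives tan(delta) ~ ln(a_n / a_{n+1}) / pi."""
    decs = [math.log(a0 / a1)
            for a0, a1 in zip(peak_amplitudes, peak_amplitudes[1:])]
    return sum(decs) / len(decs) / math.pi

# An exponentially decaying ring-down with decrement 0.01 per cycle:
peaks = [math.exp(-0.01 * n) for n in range(10)]
print(loss_tangent_free_decay(peaks))  # ~ 0.01 / pi
```

At resonance the two estimates should agree for low damping, which is one way to cross-check the fibre-optic displacement data.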

Relevance:

90.00%

Publisher:

Abstract:

This thesis examines the ways Indonesian politicians exploit the rhetorical power of metaphors in Indonesian political discourse. The research applies Conceptual Metaphor Theory, Metaphorical Frame Analysis and Critical Discourse Analysis to textual and oral data. The corpus comprises: 150 political news articles from two newspapers (Harian Kompas and Harian Waspada, 2010-2011 edition), 30 recordings of two television news and talk-show programmes (TV-One and Metro-TV), and 20 interviews with four legislators, two educated persons and two laymen. For this study, a corpus of written bahasa Indonesia was also compiled, which comprises 150 texts of approximately 439,472 tokens. The data analysis shows the potential power of metaphors in relation to how politicians communicate the results of their thinking, reasoning and meaning-making through language and discourse, and its social consequences. The data analysis firstly revealed 1155 metaphors. These metaphors were then classified into the categories of conventional metaphor, cognitive function of metaphor, metaphorical mapping and metaphor variation. The degree of conventionality of metaphors is established based on the sum of expressions in each group of metaphors. Secondly, the analysis revealed that metaphor variation is influenced by the broader Indonesian cultural context and the natural and physical environment, such as the social dimension, region, style and the individual. The mapping system of metaphor is unidirectional. Thirdly, the data show that metaphoric thought pervades political discourse in relation to its uses as: (1) a felicitous tool for the rhetoric of political leaders, (2) part of meaning-making that keeps the discourse contexts alive and active, and (3) the degree to which metaphor and discourse shape the conceptual structures of politicians' rhetoric.
Fourthly, the analysis revealed that Indonesian political discourse attempts to create both distance and solidarity towards general and specific social categories via metaphorical and frame references to the conceptualisations of us/them. The result of the analysis shows that metaphor and frame are excellent indicators of the us/them categories, which work dialectically in the discourse. The acts of categorisation via metaphors and frames at both textual and conceptual levels activate asymmetrical concepts and contribute to social and political hierarchical constructs, i.e. WEAKNESS vs. POWER, STUDENT vs. TEACHER, GHOST vs. CHOSEN WARRIOR, and so on. This analysis underscores the dynamic nature of categories by documenting metaphorical transfers between categories, e.g. ENEMY, DISEASE, BUSINESS, MYSTERIOUS OBJECT and CORRUPTION, LAW, POLITICS and CASE. The metaphorical transfers showed that politicians try to dictate how they categorise each other in order to mobilise audiences to act on behalf of their ideologies and to create distance and solidarity.

Relevance:

90.00%

Publisher:

Abstract:

Aim: To examine the use of image analysis to quantify changes in ocular physiology. Method: A purpose designed computer program was written to objectively quantify bulbar hyperaemia, tarsal redness, corneal staining and tarsal staining. Thresholding, colour extraction and edge detection paradigms were investigated. The repeatability (stability) of each technique to changes in image luminance was assessed. A clinical pictorial grading scale was analysed to examine the repeatability and validity of the chosen image analysis technique. Results: Edge detection using a 3 × 3 kernel was found to be the most stable to changes in image luminance (2.6% over a +60 to -90% luminance range) and correlated well with the CCLRU scale images of bulbar hyperaemia (r = 0.96), corneal staining (r = 0.85) and the staining of palpebral roughness (r = 0.96). Extraction of the red colour plane demonstrated the best correlation-sensitivity combination for palpebral hyperaemia (r = 0.96). Repeatability variability was <0.5%. Conclusions: Digital imaging, in conjunction with computerised image analysis, allows objective, clinically valid and repeatable quantification of ocular features. It offers the possibility of improved diagnosis and monitoring of changes in ocular physiology in clinical practice. © 2003 British Contact Lens Association. Published by Elsevier Science Ltd. All rights reserved.
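A 3 × 3 edge-detection convolution of the kind described can be sketched in a few lines of Python; the kernel shown is a standard Sobel operator, chosen for illustration (the paper does not specify which 3 × 3 kernel was used):

```python
def convolve3x3(image, kernel):
    """Convolve a 2-D grayscale image (list of rows) with a 3x3 kernel,
    leaving the 1-pixel border at zero."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(
                image[y + dy][x + dx] * kernel[dy + 1][dx + 1]
                for dy in (-1, 0, 1) for dx in (-1, 0, 1))
    return out

# Horizontal-gradient Sobel kernel: responds strongly at vertical edges.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]

# A dark-to-bright vertical edge produces a strong response at the boundary.
img = [[0, 0, 1, 1]] * 4
edges = convolve3x3(img, SOBEL_X)
```

Because the kernel sums to zero, a uniform change in luminance cancels out, which is consistent with the stability to luminance changes reported for the edge-detection approach.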

Relevance:

90.00%

Publisher:

Abstract:

Bladder cancer is among the most common cancers worldwide (4th in men). It is responsible for high patient morbidity and displays rapid recurrence and progression. Lack of sensitivity of gold standard techniques (white light cystoscopy, voided urine cytology) means many early treatable cases are missed. The result is a large number of advanced cases of bladder cancer which require extensive treatment and monitoring. For this reason, bladder cancer is the single most expensive cancer to treat on a per patient basis. In recent years, autofluorescence spectroscopy has begun to shed light into disease research. Of particular interest in cancer research are the fluorescent metabolic cofactors NADH and FAD. Early in tumour development, cancer cells often undergo a metabolic shift (the Warburg effect) resulting in increased NADH. The ratio of NADH to FAD ("redox ratio") can therefore be used as an indicator of the metabolic status of cells. Redox ratio measurements have been used to differentiate between healthy and cancer breast cells and to monitor cellular responses to therapies. Here, we have demonstrated, using healthy and bladder cancer cell lines, a statistically significant difference in the redox ratio of bladder cancer cells, indicative of a metabolic shift. To do this we customised a standard flow cytometer to excite and record fluorescence specifically from NADH and FAD, along with a method for automatically calculating the redox ratio of individual cells within large populations. These results could inform the design of novel probes and screening systems for the early detection of bladder cancer.
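Computing per-cell redox ratios from paired NADH and FAD intensities is an element-wise division over flow-cytometry events. A minimal sketch (the NADH/FAD convention follows the text; the intensity values are hypothetical):

```python
def redox_ratio(nadh, fad):
    """Optical redox ratio of one cell. The text describes NADH-to-FAD;
    note that some papers instead use FAD / (NADH + FAD)."""
    return nadh / fad

def population_redox(events):
    """events: iterable of (nadh_intensity, fad_intensity) pairs, one per
    flow-cytometry event; returns one redox ratio per cell."""
    return [redox_ratio(n, f) for n, f in events]

# Hypothetical fluorescence intensities for three cells.
ratios = population_redox([(1200, 400), (900, 450), (1500, 300)])
print(ratios)  # [3.0, 2.0, 5.0]
```

Applied over the large populations a cytometer records, the per-cell ratios can then be compared between healthy and cancer lines, e.g. with a significance test on the two distributions.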

Relevance:

90.00%

Publisher:

Abstract:

This paper deals with communicational breakdowns and misunderstandings in computer mediated communication (CMC) and ways to recover from them or to prevent them. The paper describes a case study of CMC conducted in a company named Artigiani. We observed communication and conducted content analysis of e-mail messages, focusing on message exchanges between customer service representatives (CSRs) and their contacts. In addition to task management difficulties, we identified communication breakdowns that result from differences between perspectives, and from the lack of contextual information, mainly technical background and professional jargon at the customers’ side. We examined possible ways to enhance CMC and accordingly designed a prototype for an e-mail user interface that emphasizes a communicational strategy called contextualization as a central component for obtaining effective communication and for supporting effective management and control of organizational activities, especially handling orders, price quoting, and monitoring the supply and installation of products.

Relevance:

90.00%

Publisher:

Abstract:

ACM Computing Classification System (1998): J.3.

Relevance:

90.00%

Publisher:

Abstract:

Due to dynamic variability, identifying the specific conditions under which non-functional requirements (NFRs) are satisfied may only be possible at runtime. Therefore, it is necessary to consider the dynamic treatment of relevant information during requirements specification. The associated data can be gathered by monitoring the execution of the application and its underlying environment to support reasoning about how the current application configuration is fulfilling the established requirements. This paper presents a dynamic decision-making infrastructure to support both NFR representation and monitoring, and to reason about the degree of satisfaction of NFRs during runtime. The infrastructure is composed of: (i) an extended feature model aligned with a domain-specific language for representing NFRs to be monitored at runtime; (ii) a monitoring infrastructure to continuously assess NFRs at runtime; and (iii) a flexible decision-making process to select the best available configuration based on the satisfaction degree of the NFRs. The evaluation of the approach has shown that it is able to choose application configurations that fit user NFRs well based on runtime information. The evaluation also revealed that the proposed infrastructure provided consistent indicators regarding the best application configurations that fit user NFRs. Finally, a benefit of our approach is that it allows us to quantify the level of satisfaction with respect to the NFR specification.
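The selection step (choosing the configuration with the highest NFR satisfaction) can be sketched as a weighted score over satisfaction degrees. The weighting scheme and the names below are illustrative, not the paper's actual DSL or decision process:

```python
def best_configuration(configs, weights):
    """configs: {name: {nfr: satisfaction degree in [0, 1]}} as gathered by
    a runtime monitor; weights: {nfr: relative importance}.
    Returns the configuration with the highest weighted satisfaction."""
    def score(sat):
        return sum(w * sat.get(nfr, 0.0) for nfr, w in weights.items())
    return max(configs, key=lambda name: score(configs[name]))

# Hypothetical satisfaction degrees reported by the monitor.
configs = {
    "low_power": {"performance": 0.4, "energy": 0.9},
    "high_perf": {"performance": 0.9, "energy": 0.3},
}
print(best_configuration(configs, {"performance": 0.7, "energy": 0.3}))
```

Re-running the selection whenever the monitor updates the satisfaction degrees gives the runtime reconfiguration loop the paper describes.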

Relevance:

90.00%

Publisher:

Abstract:

For the treatment and monitoring of Parkinson's disease (PD) to be scientific, a key requirement is that measurement of disease stages and severity is quantitative, reliable, and repeatable. The last 50 years in PD research have been dominated by qualitative, subjective ratings obtained by human interpretation of the presentation of disease signs and symptoms at clinical visits. More recently, “wearable,” sensor-based, quantitative, objective, and easy-to-use systems for quantifying PD signs for large numbers of participants over extended durations have been developed. This technology has the potential to significantly improve both clinical diagnosis and management in PD and the conduct of clinical studies. However, the large-scale, high-dimensional character of the data captured by these wearable sensors requires sophisticated signal processing and machine-learning algorithms to transform it into scientifically and clinically meaningful information. Such algorithms that “learn” from data have shown remarkable success in making accurate predictions for complex problems in which human skill has been required to date, but they are challenging to evaluate and apply without a basic understanding of the underlying logic on which they are based. This article contains a nontechnical tutorial review of relevant machine-learning algorithms, also describing their limitations and how these can be overcome. It discusses implications of this technology and a practical road map for realizing the full potential of this technology in PD research and practice. © 2016 International Parkinson and Movement Disorder Society.
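As a toy illustration of the kind of supervised learning such a tutorial covers, here is a nearest-centroid classifier over per-subject sensor features; the feature values and class labels are entirely hypothetical, and real PD wearable pipelines involve far richer features and validation:

```python
import statistics

def fit_centroids(samples, labels):
    """Nearest-centroid classifier: average the feature vectors of each class."""
    by_class = {}
    for x, y in zip(samples, labels):
        by_class.setdefault(y, []).append(x)
    return {y: tuple(statistics.fmean(col) for col in zip(*xs))
            for y, xs in by_class.items()}

def predict(centroids, x):
    """Assign x to the class with the nearest centroid (squared Euclidean)."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda y: dist2(centroids[y], x))

# Hypothetical (tremor amplitude, gait variability) features per subject.
X = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (0.8, 0.9)]
y = ["control", "control", "pd", "pd"]
model = fit_centroids(X, y)
print(predict(model, (0.85, 0.85)))  # pd
```

Even for a model this simple, the evaluation caveat in the text applies: accuracy must be estimated on subjects held out from fitting, or the result overstates real-world performance.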

Relevance:

90.00%

Publisher:

Abstract:

To chronicle demographic movement across African-Asian corridors, a variety of molecular (sequence analysis, restriction mapping and denaturing high-performance liquid chromatography, etc.) and statistical (correspondence analysis, AMOVA, calculation of diversity indices and phylogenetic inference, etc.) techniques were employed to assess the phylogeographic patterns of mtDNA control region and Y chromosomal variation among 14 sub-Saharan, North African and Middle Eastern populations. The patterns of genetic diversity revealed evidence of multiple migrations across several African-Asian passageways as well as within the African continent itself. The two-part analysis uncovered several interesting results, which include the following: (1) a north (Egypt and Middle East Asia) to south (sub-Saharan Africa) partitioning of both mtDNA and Y chromosomal haplogroup diversity, (2) a genetic diversity gradient in sub-Saharan Africa from east to west, (3) evidence in favor of the Levantine Corridor over the Horn of Africa as the major genetic conduit since the Last Glacial Maximum, (4) a substantially higher mtDNA versus Y chromosomal sub-Saharan component in the Middle East collections, (5) a higher representation of East versus West African mtDNA haplotypes in the Arabian Peninsula populations versus no such bias in the Levant groups and lastly, (6) genetic remnants of the Bantu demographic expansion in sub-Saharan Africa.

Relevance:

90.00%

Publisher:

Abstract:

Deception research has traditionally focused on three methods of identifying liars and truth tellers: observing non-verbal or behavioral cues, analyzing verbal cues, and monitoring changes in physiological arousal during polygraph tests. Research shows that observers are often incapable of discriminating between liars and truth tellers with better than chance accuracy when they use these methods. One possible explanation for observers' poor performance is that they are not properly applying existing lie detection methods. An alternative explanation is that the cues on which these methods (and observers' judgments) are based do not reliably discriminate between liars and truth tellers. It may be possible to identify more reliable cues, and potentially improve observers' ability to discriminate, by developing a better understanding of how liars and truth tellers try to tell a convincing story. This research examined (a) the verbal strategies used by truthful and deceptive individuals during interviews concerning an assigned activity, and (b) observers' ability to discriminate between them based on their verbal strategies. In Experiment I, pre-interview instructions manipulated participants' expectations regarding verifiability; each participant was led to believe that the interviewer could check some types of details, but not others, before deciding whether the participant was being truthful or deceptive. Interviews were then transcribed and scored for quantity and type of information provided. In Experiment II, observers listened to a random sample of the Experiment I interviews and rendered veracity judgments; half of the observers were instructed to judge the interviews according to the verbal strategies used by liars and truth tellers, and the other half were uninstructed. Results of Experiment I indicate that liars and truth tellers use different verbal strategies, characterized by a differential amount of detail.
Overall, truthful participants provided more information than deceptive participants. This effect was moderated by participants' expectations regarding verifiability, such that truthful participants provided more information only with regard to verifiable details. Results of Experiment II indicate that observers instructed about liars' and truth tellers' verbal strategies identify them with greater accuracy than uninstructed observers.