913 results for 0801 Artificial Intelligence and Image Processing
Abstract:
A study of 155 professional translators was carried out to examine the relationship between trait emotional intelligence (trait EI) and literary translation, job satisfaction and career success. Participants were surveyed and their answers were correlated with scores from an emotional intelligence measure, the TEIQue. The analysis revealed that literary and non-literary translators have different trait EI profiles. Some significant correlations were found between trait EI and the variables of job satisfaction, career success, and literary translation experience. This is the first study to examine the effect of EI on translator working practices. Findings illustrate that trait EI may be predictive of some aspects of translator behaviour and highlight the relevance of exploring the emotional intelligence of professional translators.
Abstract:
It has been proposed that language impairments in children with Autism Spectrum Disorders (ASD) stem from atypical neural processing of speech and/or nonspeech sounds. However, the strength of this proposal is compromised by the unreliable outcomes of previous studies of speech and nonspeech processing in ASD. The aim of this study was to determine whether there was an association between poor spoken language and atypical event-related field (ERF) responses to speech and nonspeech sounds in children with ASD (n = 14) and controls (n = 18). Data from this developmental population (ages 6-14) were analysed using a novel combination of methods to maximize the reliability of our findings while taking into consideration the heterogeneity of the ASD population. The results showed that poor spoken language scores were associated with atypical left hemisphere brain responses (200 to 400 ms) to both speech and nonspeech in the ASD group. These data support the idea that some children with ASD may have an immature auditory cortex that affects their ability to process both speech and nonspeech sounds. Their poor speech processing may impair their ability to process the speech of other people, and hence reduce their ability to learn the phonology, syntax, and semantics of their native language.
Abstract:
The present study, employing psychometric meta-analysis of 92 independent studies with sample sizes ranging from 26 to 322 leaders, examined the relationship between EI and leadership effectiveness. Overall, the results supported a moderate linkage between leader EI and effectiveness (ρ = .25). In addition, the positive manifold of the effect sizes presented in this study, ranging from .10 to .44, indicates that emotional intelligence has meaningful relations with myriad leadership outcomes, including effectiveness, transformational leadership, LMX, follower job satisfaction, and others. Furthermore, this paper examined potential process mechanisms that may account for the EI-leadership effectiveness relationship and showed that both transformational leadership and LMX partially mediate this relationship. However, while the predictive validities of EI were moderate, path analysis and hierarchical regression suggest that EI contributes less than or equal to 1% of explained variance in leadership effectiveness once personality and intelligence are accounted for.
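The incremental-variance result above can be illustrated with a small hedged sketch: a hierarchical regression in which EI is entered after personality and intelligence, and its contribution is read off as the change in R². The data and effect sizes below are synthetic and purely illustrative, not the meta-analysis data.

```python
import numpy as np

def r_squared(X, y):
    """OLS R^2, with an intercept column added to the predictors."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

rng = np.random.default_rng(0)
n = 300
personality = rng.normal(size=n)
intelligence = rng.normal(size=n)
ei = rng.normal(size=n)
# Outcome driven mostly by personality/intelligence, weakly by EI
effectiveness = 0.5 * personality + 0.4 * intelligence + 0.1 * ei + rng.normal(size=n)

base = np.column_stack([personality, intelligence])          # step 1 predictors
full = np.column_stack([personality, intelligence, ei])      # step 2 adds EI
delta_r2 = r_squared(full, effectiveness) - r_squared(base, effectiveness)
print(f"Incremental variance explained by EI: {delta_r2:.3f}")
```

With the weak EI coefficient chosen here, the incremental R² comes out small, mirroring the paper's "less than or equal to 1%" finding in spirit.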
Abstract:
Parallel processing is prevalent in many manufacturing and service systems. Many manufactured products are built and assembled from several components fabricated on parallel lines. An example of this configuration is observed at a manufacturing facility equipped to assemble and test web servers. A typical web server assembly line is characterized by multiple products, job circulation, and parallel processing. The primary objective of this research was to develop analytical approximations to predict performance measures of manufacturing systems with job failures and parallel processing. The analytical formulations extend previous queueing models used in assembly manufacturing systems in that they can handle serial and various parallel-processing configurations with multiple product classes, and job circulation due to random part failures. In addition, appropriate correction terms, derived via regression analysis, were added to the approximations in order to minimize the error between the analytical approximations and the simulation models. Markovian and general manufacturing systems, with multiple product classes, job circulation due to failures, and fork-join systems to model parallel processing, were studied. In both the Markovian and general cases, the approximations without correction terms performed quite well for one- and two-product problem instances. However, the flow time error increased as the number of products and the net traffic intensity increased. Therefore, correction terms for single and fork-join stations were developed via regression analysis to handle more than two products. Numerical comparisons showed that the approximations perform remarkably well when the correction factors are used. On average, the flow time error was reduced from 38.19% to 5.59% in the Markovian case, and from 26.39% to 7.23% in the general case.
All the equations stated in the analytical formulations were implemented as a set of MATLAB scripts. Using these scripts, operations managers of web server assembly lines, or of manufacturing and service systems with similar characteristics, can estimate various system performance measures and make judicious decisions, especially for setting delivery due dates, capacity planning, and bottleneck mitigation.
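As a rough illustration of the kind of station-level approximation described above (a sketch only, not the authors' formulations, which were implemented in MATLAB), a single M/M/1 station with Bernoulli job recirculation can be approximated as follows; the rates and failure probability are invented:

```python
def station_flow_time(arrival_rate, service_rate, fail_prob):
    """Approximate flow time at a single M/M/1 station where a job
    fails with probability fail_prob and recirculates (Bernoulli
    feedback), inflating the effective arrival rate."""
    eff_arrival = arrival_rate / (1.0 - fail_prob)       # failed jobs re-enter
    rho = eff_arrival / service_rate                     # net traffic intensity
    if rho >= 1.0:
        raise ValueError("station is unstable (rho >= 1)")
    sojourn_per_visit = 1.0 / (service_rate - eff_arrival)  # M/M/1 sojourn time
    visits = 1.0 / (1.0 - fail_prob)                        # expected passes per job
    return visits * sojourn_per_visit

# Example: 4 jobs/hr arriving, 6 jobs/hr service, 10% failure rate
print(round(station_flow_time(4.0, 6.0, 0.10), 3))  # → 0.714
```

The model captures the qualitative behavior noted in the abstract: as the failure probability or traffic intensity grows, the effective load and hence the flow time grow sharply.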
Abstract:
This pilot study explored the relationship between emotional intelligence and organizational commitment among private club board and committee volunteer members. The top three items, ranked by mean scores, of each of three EI dimensions (IN, OUT, and RELATIONSHIPS) were discussed. A sample of 57 volunteer leaders was further split into high-EI and low-EI groups based on respondents' overall EI median score. Statistical differences between the high- and low-EI groups in three aspects of organizational commitment (affective, continuance, and normative commitment) were present. T-test results showed that the difference between the high- and low-EI groups in affective commitment among private club volunteer leaders was statistically significant at p < .05.
Abstract:
Prenyltransferase enzymes promote the membrane localization of their target proteins by directing the attachment of a hydrophobic lipid group at a conserved C-terminal CAAX motif. Subsequently, the prenylated protein is further modified by postprenylation processing enzymes that cleave the terminal 3 amino acids and carboxymethylate the prenylated cysteine residue. Many prenylated proteins, including Ras1 and Ras-like proteins, require this multistep membrane localization process in order to function properly. In the human fungal pathogen Cryptococcus neoformans, previous studies have demonstrated that two distinct forms of protein prenylation, farnesylation and geranylgeranylation, are both required for cellular adaptation to stress, as well as full virulence in animal infection models. Here, we establish that the C. neoformans RAM1 gene encoding the farnesyltransferase β-subunit, though not strictly essential for growth under permissive in vitro conditions, is absolutely required for cryptococcal pathogenesis. We also identify and characterize postprenylation protease and carboxyl methyltransferase enzymes in C. neoformans. In contrast to the prenyltransferases, deletion of the genes encoding the Rce1 protease and Ste14 carboxyl methyltransferase results in subtle defects in stress response and only partial reductions in virulence. These postprenylation modifications, as well as the prenylation events themselves, do play important roles in mating and hyphal transitions, likely due to their regulation of peptide pheromones and other proteins involved in development. IMPORTANCE Cryptococcus neoformans is an important human fungal pathogen that causes disease and death in immunocompromised individuals. The growth and morphogenesis of this fungus are controlled by conserved Ras-like GTPases, which are also important for its pathogenicity. 
Many of these proteins require proper subcellular localization for full function, and they are directed to cellular membranes through a posttranslational modification process known as prenylation. These studies investigate the roles of one of the prenylation enzymes, farnesyltransferase, as well as the postprenylation processing enzymes in C. neoformans. We demonstrate that the postprenylation processing steps are dispensable for the localization of certain substrate proteins. However, both protein farnesylation and the subsequent postprenylation processing steps are required for full pathogenesis of this fungus.
Abstract:
X-ray computed tomography (CT) is a non-invasive medical imaging technique that generates cross-sectional images by acquiring attenuation-based projection measurements at multiple angles. Since its first introduction in the 1970s, substantial technical improvements have led to the expanding use of CT in clinical examinations. CT has become an indispensable imaging modality for the diagnosis of a wide array of diseases in both pediatric and adult populations [1, 2]. Currently, approximately 272 million CT examinations are performed annually worldwide, with nearly 85 million of these in the United States alone [3]. Although this trend has decelerated in recent years, CT usage is still expected to increase mainly due to advanced technologies such as multi-energy [4], photon counting [5], and cone-beam CT [6].
Despite the significant clinical benefits, concerns have been raised regarding the population-based radiation dose associated with CT examinations [7]. From 1980 to 2006, the effective dose from medical diagnostic procedures rose six-fold, with CT contributing to almost half of the total dose from medical exposure [8]. For each patient, the risk associated with a single CT examination is likely to be minimal. However, the relatively large population-based radiation level has led to enormous efforts among the community to manage and optimize the CT dose.
As promoted by the international campaigns Image Gently and Image Wisely, exposure to CT radiation should be appropriate and safe [9, 10]. It is thus a responsibility to optimize the radiation dose used in CT examinations. The key to dose optimization is to determine the minimum radiation dose that achieves the targeted image quality [11]. Based on this principle, dose optimization would benefit significantly from effective metrics that characterize the radiation dose and image quality of a CT exam. Moreover, if accurate prediction of the radiation dose and image quality were possible before the initiation of the exam, the scan could be personalized by adjusting the scanning parameters to achieve a desired level of image quality. The purpose of this thesis is to design and validate models that prospectively quantify patient-specific radiation dose and task-based image quality. The dual aim of the study is to translate these theoretical models into clinical practice by developing an organ-based dose monitoring system and an image-based noise addition software tool for protocol optimization.
More specifically, Chapter 3 aims to develop an organ dose-prediction method for CT examinations of the body under constant tube current conditions. The study modeled anatomical diversity and complexity using a large number of patient models with representative age, size, and gender distributions. The dependence of organ dose coefficients on patient size and scanner model was further evaluated. Distinct from prior work, these studies used the largest number of patient models to date, with representative age, weight percentile, and body mass index (BMI) ranges.
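As an illustration of how size-dependent organ dose coefficients of this kind can be applied, a hedged sketch follows. It assumes the commonly used exponential decay of dose per unit CTDIvol with patient effective diameter; the fit constants are invented for the example, not the thesis's values:

```python
import numpy as np

def organ_dose_coefficient(diameter_cm, a=2.2, b=0.04):
    """Hypothetical fit: organ dose per unit CTDIvol falls off roughly
    exponentially with patient effective diameter (a, b are invented)."""
    return a * np.exp(-b * diameter_cm)

ctdi_vol = 10.0  # mGy, scanner-reported dose index (illustrative)
for d in (20, 30, 40):
    h = organ_dose_coefficient(d)
    print(f"diameter {d} cm: organ dose ~ {ctdi_vol * h:.1f} mGy")
```

Larger patients receive lower organ dose for the same scanner output, which is why size-specific coefficients matter for patient-specific prediction.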
With effective quantification of organ dose under constant tube current condition, Chapter 4 aims to extend the organ dose prediction system to tube current modulated (TCM) CT examinations. The prediction, applied to chest and abdominopelvic exams, was achieved by combining a convolution-based estimation technique that quantifies the radiation field, a TCM scheme that emulates modulation profiles from major CT vendors, and a library of computational phantoms with representative sizes, ages, and genders. The prospective quantification model is validated by comparing the predicted organ dose with the dose estimated based on Monte Carlo simulations with TCM function explicitly modeled.
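The convolution-based idea can be sketched in a few lines: the TCM tube-current profile is convolved with a dose-spread kernel to approximate the radiation field along the scan range, and an organ dose is read off over the organ's extent. The kernel shape, dose coefficient, and organ location below are all illustrative assumptions, not values from the thesis:

```python
import numpy as np

# Illustrative dose-spread kernel: scatter falls off away from the
# irradiated slice (shape and decay length invented for this sketch)
z = np.arange(-10, 11)                     # slice offsets, in cm
kernel = np.exp(-np.abs(z) / 3.0)
kernel /= kernel.sum()                     # normalize to unit weight

# Hypothetical TCM profile: tube current (mA) along the scan range
tcm_profile = np.concatenate([np.full(20, 150.0),   # chest region
                              np.full(20, 220.0)])  # abdomen/pelvis region

# Radiation field: TCM profile convolved with the dose-spread kernel
field = np.convolve(tcm_profile, kernel, mode="same")

# Organ dose ~ dose coefficient x mean field over the organ's extent
organ_slices = slice(25, 35)               # hypothetical organ location
dose_coeff = 0.08                          # mGy per mA, illustrative
organ_dose = dose_coeff * field[organ_slices].mean()
print(f"estimated organ dose: {organ_dose:.2f} mGy")
```

The same structure generalizes to vendor-specific modulation profiles: only the `tcm_profile` input changes, while the kernel and coefficients stay tied to the phantom and scanner model.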
Chapter 5 aims to implement the organ dose-estimation framework in clinical practice as an organ dose-monitoring program based on commercial software (Dose Watch, GE Healthcare, Waukesha, WI). The first phase of the study focused on body CT examinations, so the patient's major body landmark information was extracted from the patient scout image in order to match clinical patients against a computational phantom in the library. The organ dose coefficients were estimated based on the CT protocol and patient size, as reported in Chapter 3. The exam CTDIvol, DLP, and TCM profiles were extracted and used to quantify the radiation field using the convolution technique proposed in Chapter 4.
With effective methods to predict and monitor organ dose, Chapter 6 aims to develop and validate improved measurement techniques for image quality assessment. It outlines the method developed to assess and predict quantum noise in clinical body CT images. Compared with previous phantom-based studies, this study accurately assessed the quantum noise in clinical images and further validated the correspondence between phantom-based measurements and the expected clinical image quality as a function of patient size and scanner attributes.
Chapter 7 aims to develop a practical strategy to generate hybrid CT images and assess the impact of dose reduction on diagnostic confidence for the diagnosis of acute pancreatitis. The general strategy is (1) to simulate synthetic CT images at multiple reduced-dose levels from clinical datasets using an image-based noise addition technique; (2) to develop quantitative and observer-based methods to validate the realism of simulated low-dose images; (3) to perform multi-reader observer studies on the low-dose image series to assess the impact of dose reduction on the diagnostic confidence for multiple diagnostic tasks; and (4) to determine the dose operating point for clinical CT examinations based on the minimum diagnostic performance to achieve protocol optimization.
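Step (1) above, image-based noise addition, can be sketched under the standard assumption that quantum noise variance scales inversely with dose. The simplified version below ignores the noise power spectrum and object-dependent noise that a clinical tool would model; the image and noise levels are synthetic:

```python
import numpy as np

def simulate_low_dose(image, sigma_full, dose_fraction, rng=None):
    """Simulate a reduced-dose CT image by injecting zero-mean Gaussian
    noise. Since quantum noise variance scales as 1/dose, the added
    noise needs standard deviation sigma_full * sqrt(1/f - 1)."""
    if rng is None:
        rng = np.random.default_rng()
    sigma_add = sigma_full * np.sqrt(1.0 / dose_fraction - 1.0)
    return image + rng.normal(0.0, sigma_add, size=image.shape)

# Hypothetical full-dose image with 10 HU quantum noise, halved dose
rng = np.random.default_rng(1)
full = rng.normal(50.0, 10.0, size=(256, 256))
half = simulate_low_dose(full, sigma_full=10.0, dose_fraction=0.5, rng=rng)
print(f"noise at half dose: {half.std():.1f} HU")  # grows by ~sqrt(2)
```

Stacking several `dose_fraction` values over the same clinical dataset yields the multi-level low-dose image series used in the observer studies.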
Chapter 8 concludes the thesis with a summary of accomplished work and a discussion about future research.
Abstract:
The application of custom classification techniques and posterior probability modeling (PPM) using Worldview-2 multispectral imagery to archaeological field survey is presented in this paper. Research is focused on the identification of Neolithic felsite stone tool workshops in the North Mavine region of the Shetland Islands in Northern Scotland. Sample data from known workshops surveyed using differential GPS are used alongside known non-sites to train a linear discriminant analysis (LDA) classifier based on a combination of datasets, including Worldview-2 bands, band difference ratios (BDR), and topographical derivatives. Principal components analysis is further used to test for and reduce dimensionality caused by redundant datasets. Probability models were generated by LDA using principal components and tested with sites identified through geological field survey. Testing demonstrates the prospecting ability of this technique, with significance between 0.05 and 0.01 and gain statistics between 0.90 and 0.94, higher than those obtained using maximum likelihood and random forest classifiers. Results suggest that this approach is best suited to relatively homogeneous site types and performs better with correlated data sources. Finally, by combining posterior probability models and least-cost analysis, a survey least-cost efficacy model is generated, showing the utility of such approaches to archaeological field survey.
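A minimal sketch of the two-class LDA step (sites vs. non-sites) follows. The features and their distributions are synthetic stand-ins for the Worldview-2 bands, band ratios, and topographic derivatives, and the PCA step is omitted for brevity:

```python
import numpy as np

def fisher_lda(X0, X1):
    """Two-class Fisher LDA: direction maximizing between-class
    separation relative to within-class scatter."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    w = np.linalg.solve(Sw, mu1 - mu0)          # discriminant direction
    threshold = 0.5 * ((X0 @ w).mean() + (X1 @ w).mean())
    return w, threshold

rng = np.random.default_rng(0)
# Hypothetical per-pixel features: two band ratios and a slope derivative
non_sites = rng.normal([0.3, 0.5, 12.0], [0.1, 0.1, 3.0], size=(200, 3))
sites     = rng.normal([0.5, 0.4, 6.0],  [0.1, 0.1, 3.0], size=(40, 3))

w, t = fisher_lda(non_sites, sites)
scores = sites @ w
rate = (scores > t).mean()
print(f"site detection rate: {rate:.2f}")
```

Posterior probabilities (the PPM step) would follow by fitting class-conditional Gaussians to the projected scores and applying Bayes' rule, which is where the gain statistics in the abstract come from.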
Abstract:
The paper describes the design and implementation of a novel low-cost virtual rugby decision-making interactive for use in a visitor centre. Original laboratory-based experimental work on decision making in rugby, using a virtual reality headset [1], is adapted for use in a public visitor centre, with consideration given to usability, cost, practicality, and health and safety. Movement of professional rugby players was captured and animated within a virtually recreated stadium. Users then interact with these virtual representations via a low-cost sensor (Microsoft Kinect) to attempt to block them. Retaining the principles of perception and action, egocentric viewpoint, immersion, sense of presence, representative design, and game design, the system delivers an engaging and effective interactive that illustrates the underlying scientific principles of deceptive movement. User testing highlighted the need for usability, system robustness, fair and accurate scoring, an appropriate level of difficulty, and enjoyment.
Abstract:
AIRES, Kelson R. T. ; ARAÚJO, Hélder J. ; MEDEIROS, Adelardo A. D. . Plane Detection from Monocular Image Sequences. In: VISUALIZATION, IMAGING AND IMAGE PROCESSING, 2008, Palma de Mallorca, Spain. Proceedings..., Palma de Mallorca: VIIP, 2008
Abstract:
An active vision system that performs tracking of moving objects in real time is described. The main goal is to build a system integrating off-the-shelf components. These components include a stereoscopic robotic head as the active perception hardware; a DSP-based board (SDB C80) as the massive data processor and image acquisition board; and a Pentium PC running Windows NT that interconnects and manages the whole system. Real-time performance is achieved by taking advantage of the special architecture of the DSP. An evaluation of the system's performance is included.