929 results for Source analysis
Abstract:
It has been widely recognised that an in-depth textual analysis of a source text is relevant for translation. This book discusses the role of discourse analysis for translation and translator training. One particular model of discourse analysis is presented in detail, and its application in the context of translator training is critically examined.
Abstract:
Light occlusions are one of the most significant difficulties of photometric stereo methods. When three or more images are available without occlusion, the local surface orientation is overdetermined so that shape can be computed and the shadowed pixels can be discarded. In this paper, we look at the challenging case when only two images are available without occlusion, leading to a one degree of freedom ambiguity per pixel in the local orientation. We show that, in the presence of noise, integrability alone cannot resolve this ambiguity and reconstruct the geometry in the shadowed regions. As the problem is ill-posed in the presence of noise, we describe two regularization schemes that improve the numerical performance of the algorithm while preserving the data. Finally, the paper describes how this theory applies in the framework of color photometric stereo where one is restricted to only three images and light occlusions are common. Experiments on synthetic and real image sequences are presented.
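To make the two-image ambiguity concrete, here is a minimal numpy sketch (not the paper's algorithm; light directions and intensities are invented): with Lambertian shading I_k = ρ n·l_k, two measurements leave a one-parameter family of scaled normals b(t) = b_particular + t·d, where d spans the null space of the 2×3 light matrix.

```python
import numpy as np

# Lambertian photometric stereo: I_k = rho * dot(n, l_k).
# With three lights the 3x3 system determines b = rho*n uniquely.
# With two lights the 2x3 system is underdetermined: solutions form
# b(t) = b_particular + t * d, where d spans the null space of L2.

L2 = np.array([[0.0, 0.3, 0.95],
               [0.3, 0.0, 0.95]])      # two light directions (rows)
I2 = np.array([0.7, 0.6])              # observed intensities at one pixel

b_part = np.linalg.pinv(L2) @ I2       # minimum-norm particular solution
_, _, Vt = np.linalg.svd(L2)
d = Vt[-1]                             # null-space direction: L2 @ d ~ 0

# One degree of freedom per pixel: every b(t) reproduces both images.
for t in (-0.5, 0.0, 0.5):
    b = b_part + t * d
    print(t, L2 @ b - I2, b / np.linalg.norm(b))  # residual ~0, normal varies
```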
Abstract:
We propose a novel electroencephalographic application of a recently developed cerebral source extraction method (Functional Source Separation, FSS), which starts from extracranial signals and adds a functional constraint to the cost function of a basic independent component analysis model without requiring solutions to be independent. Five ad-hoc functional constraints were used to extract the activity reflecting the temporal sequence of sensory information processing along the somatosensory pathway in response to separate left and right median nerve galvanic stimulation. The constraints required only the maximization of responsiveness at specific latencies following sensory stimulation, without taking any frequency or spatial information into account. After source extraction, the reliability of each identified functional source (FS) was assessed based on the position of single dipoles fitted to its retroprojected signals and on a discrepancy measure. The FS positions were consistent with previously reported data (two early subcortical sources localized in the brain stem and thalamus, and three later sources localized in cortical areas), leaving negligible residual activity at the corresponding latencies. The high-frequency oscillatory (HFO) component of each extracted source was then analyzed, and the integrity of the low-amplitude HFOs was preserved for each FS. On the basis of our data, we suggest that FSS can be an effective tool to investigate the HFO behaviour of the different neuronal pools recruited at successive times after median nerve galvanic stimulation. As FSs are reconstructed along the entire experimental session, directional and dynamic HFO synchronization phenomena can also be studied.
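As a toy illustration of a functional constraint (a sketch only; real FSS adds the constraint term to an ICA contrast, which is omitted here, and all data below are synthetic): find a unit-norm spatial filter whose extracted source maximizes the evoked power in a chosen post-stimulus latency window.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic data: 50 trials, 16 channels, 200 time points, with a hidden
# response on channel 3 around samples 90-100 (the "target latency").
rng = np.random.default_rng(0)
n_ch, n_t, n_trials = 16, 200, 50
X = rng.standard_normal((n_trials, n_ch, n_t))
X[:, 3, 90:100] += 2.0

erp = X.mean(axis=0)                    # trial-averaged data, channels x time

def neg_responsiveness(w):
    w = w / np.linalg.norm(w)           # unit-norm filter (the constraint)
    source = w @ erp
    return -(source[90:100] ** 2).mean()  # evoked power in the window

res = minimize(neg_responsiveness, x0=rng.standard_normal(n_ch))
w = res.x / np.linalg.norm(res.x)
print("largest filter weight on channel:", np.argmax(np.abs(w)))  # ~3
```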
Abstract:
In Statnote 9, we described a one-way analysis of variance (ANOVA) 'random effects' model in which the objective was to estimate the degree of variation of a particular measurement and to compare different sources of variation in space and time. The illustrative scenario involved the role of computer keyboards in a university communal computer laboratory as a possible source of microbial contamination of the hands. The study estimated the aerobic colony count of ten selected keyboards, with samples taken from two keys per keyboard at 9am and 5pm. This type of design is often referred to as a 'nested' or 'hierarchical' design, and the ANOVA estimated the degree of variation: (1) between keyboards, (2) between keys within a keyboard, and (3) between sample times within a key. An alternative to this design is a 'fixed effects' model in which the objective is not to measure sources of variation per se but to estimate differences between specific groups or treatments, which are regarded as 'fixed' or discrete effects. This statnote describes two scenarios utilizing this type of analysis: (1) measuring the degree of bacterial contamination on 2p coins collected from three types of business property, viz., a butcher's shop, a sandwich shop, and a newsagent, and (2) the effectiveness of drugs in the treatment of a fungal eye infection.
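A minimal fixed-effects one-way ANOVA in the shape of scenario (1) might look like the sketch below; the colony counts are invented, and only the analysis structure follows the scenario.

```python
import numpy as np
from scipy.stats import f_oneway

# Fixed-effects one-way ANOVA sketch: aerobic colony counts from 2p coins
# collected at three types of premises. Counts are invented for illustration.
rng = np.random.default_rng(1)
butcher   = rng.poisson(60, size=8)
sandwich  = rng.poisson(45, size=8)
newsagent = rng.poisson(30, size=8)

F, p = f_oneway(butcher, sandwich, newsagent)
print(f"F = {F:.2f}, p = {p:.4f}")  # tests H0: equal mean contamination
```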
Abstract:
The multiterminal dc wind farm is a promising topology with a voltage-source inverter (VSI) connection at the onshore grid. Voltage-source converters (VSCs) are robust to ac-side fault conditions but vulnerable to faults on the dc side of the converter. This paper analyzes dc faults, their transients, and the resulting protection issues. Overcurrent faults are analyzed in detail, providing insight into protection system design. The radial wind farm topology with star or string connection is considered. The outcomes may be applicable to VSCs in the multi-VSC dc wind farm collection grid and to VSC-based high-voltage direct current (HVDC) offshore transmission systems.
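For intuition about the dc fault transient, a back-of-envelope sketch (illustrative parameter values, not the paper's model): after a pole-to-pole fault the dc-link capacitance discharges through the cable resistance and inductance, giving the familiar underdamped series-RLC current.

```python
import numpy as np

# Capacitor-discharge stage of a dc-side fault: for an underdamped series
# RLC loop with initial capacitor voltage V0,
#   i(t) = (V0 / (wd*L)) * exp(-a*t) * sin(wd*t),
# with a = R/(2L) and wd = sqrt(1/(LC) - a^2). Values below are illustrative.
V0, R, L, C = 5e3, 0.05, 0.5e-3, 5e-3
a = R / (2 * L)
wd = np.sqrt(1 / (L * C) - a**2)

t = np.linspace(0, 20e-3, 2000)
i = (V0 / (wd * L)) * np.exp(-a * t) * np.sin(wd * t)
print(f"peak fault current ~ {i.max()/1e3:.1f} kA at t = {t[np.argmax(i)]*1e3:.2f} ms")
```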
Abstract:
The diagnosis of ocular disease is increasingly important in optometric practice, and there is a need for cost-effective point-of-care assays to support it. Although tears are a potentially valuable source of diagnostic information, difficulties associated with sample collection and limited sample size, together with sample storage and transport, have proved major limitations. Progressive developments in electronics and fibre optics, together with innovation in sensing technology, mean that the construction of inexpensive point-of-care fibre-optic sensing devices is now possible. Tear electrolytes are an obvious family of target analytes, not least to complement the availability of devices that make the routine measurement of tear osmolarity possible in the clinic. In this paper we describe the design, fabrication and calibration of a fibre-optic-based electrolyte sensor for the quantification of potassium in tears, using the ex vivo contact lens as the sample source. The technology is generic, and the same principles can be used in the development of calcium and magnesium sensors. An important objective of this sensor technology development is to provide information at the point of routine optometric examination that would provide supportive evidence of tear abnormality.
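A calibration step of the kind described might be sketched as follows (numbers invented; the paper's actual calibration data and response model may differ): fit the sensor response against potassium standards, then invert the fit for an unknown sample.

```python
import numpy as np

# Calibration sketch: fit a straight line to fibre-optic sensor readings
# taken against potassium standards, then invert it to estimate [K+] in
# an ex vivo lens sample. All values here are illustrative.
k_std  = np.array([5, 10, 15, 20, 25])             # mmol/L standards
signal = np.array([0.11, 0.20, 0.31, 0.40, 0.52])  # sensor output (a.u.)

slope, intercept = np.polyfit(k_std, signal, 1)
sample_signal = 0.36
k_sample = (sample_signal - intercept) / slope
print(f"estimated [K+] ~ {k_sample:.1f} mmol/L")
```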
Abstract:
The inverse problem of determining a spacewise dependent heat source, together with the initial temperature, for the parabolic heat equation is studied using the usual conditions of the direct problem and information from two supplementary temperature measurements at different instants of time. These spacewise dependent temperature measurements ensure that this inverse problem has a unique solution; however, the solution is unstable and hence the problem is ill-posed. We propose an iterative algorithm for the stable reconstruction of both the initial data and the source based on a sequence of well-posed direct problems for the parabolic heat equation, which are solved at each iteration step using the boundary element method. The instability is overcome by stopping the iterations at the first iteration for which the discrepancy principle is satisfied. Numerical results are presented for a typical benchmark test example in which the input measured data are perturbed by increasing amounts of random noise. The numerical results show that the proposed procedure gives accurate numerical approximations in relatively few iterations.
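The stopping rule can be illustrated on a toy linear ill-posed problem (a sketch only; the paper iterates over BEM solutions of direct problems rather than a Landweber step, but the discrepancy-principle logic is the same).

```python
import numpy as np

# Discrepancy-principle stopping on a toy problem: iterate on A x = y with
# noisy data y and stop as soon as the residual drops to the noise level.
rng = np.random.default_rng(2)
n = 50
A = np.array([[np.exp(-0.3 * abs(i - j)) for j in range(n)] for i in range(n)])
x_true = np.sin(np.linspace(0, np.pi, n))
delta = 0.01                                    # per-component noise level
y = A @ x_true + delta * rng.standard_normal(n)

tau = 1.5
omega = 1.0 / np.linalg.norm(A, 2) ** 2         # Landweber step size
x = np.zeros(n)
for k in range(100_000):
    r = A @ x - y
    if np.linalg.norm(r) <= tau * delta * np.sqrt(n):   # discrepancy principle
        break
    x = x - omega * (A.T @ r)                   # Landweber iteration
print(f"stopped at iteration {k}, error = {np.linalg.norm(x - x_true):.3f}")
```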
Abstract:
A simple and cost-effective technique for generating a flat, square-shaped multi-wavelength optical comb with 42.6 GHz line spacing and over 0.5 THz of total bandwidth is presented. A detailed theoretical analysis shows that two concatenated modulators driven with voltages of 3.5 Vp are necessary to generate 11 comb lines with a flatness below 2 dB. This performance is experimentally demonstrated using two cascaded Versawave 40 Gbit/s low-drive-voltage electro-optic polarisation modulators, where an 11-channel optical comb with a flatness of 1.9 dB and a side-mode suppression ratio (SMSR) of 12.6 dB was obtained.
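A generic way to sanity-check comb line powers for a modulator-cascade model is to build the modulated field over one drive period and take an FFT. The sketch below uses an idealized pure-phase cascade with an assumed per-modulator index of 3.5π (reading "3.5 Vp" as 3.5 Vπ); the paper's polarisation-modulator scheme is more involved, so the flatness printed here will not match the reported 2 dB.

```python
import numpy as np

# Two cascaded sinusoidally driven phase modulators: the phase terms add,
# and the comb line powers follow from an FFT over one RF drive period.
m1 = m2 = np.pi * 3.5          # assumed per-modulator index (V = 3.5 Vpi)
phi = 0.0                      # relative RF phase between the two drives
N = 1024
t = np.arange(N) / N           # one 42.6 GHz drive period, normalized
field = np.exp(1j * (m1 * np.sin(2*np.pi*t) + m2 * np.sin(2*np.pi*t + phi)))
lines = np.abs(np.fft.fft(field) / N) ** 2     # power in harmonic k of the drive

p = np.array([lines[k % N] for k in range(-5, 6)])   # central 11 lines
p_db = 10 * np.log10(p / p.max())
print(f"central-11-line flatness (this idealization): {p_db.max() - p_db.min():.1f} dB")
```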
Abstract:
Data envelopment analysis (DEA) is a methodology for measuring the relative efficiencies of a set of decision-making units (DMUs) that use multiple inputs to produce multiple outputs. Crisp input and output data are fundamentally indispensable in conventional DEA. However, the observed values of the input and output data in real-world problems are sometimes imprecise or vague. Many researchers have proposed various fuzzy methods for dealing with the imprecise and ambiguous data in DEA. In this study, we provide a taxonomy and review of the fuzzy DEA methods. We present a classification scheme with four primary categories, namely, the tolerance approach, the α-level based approach, the fuzzy ranking approach and the possibility approach. We discuss each classification scheme and group the fuzzy DEA papers published in the literature over the past 20 years. To the best of our knowledge, this paper appears to be the only review and complete source of references on fuzzy DEA.
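For reference, the crisp model that the fuzzy variants generalize is the CCR linear program; a minimal multiplier-form sketch with invented data follows.

```python
import numpy as np
from scipy.optimize import linprog

# Crisp input-oriented CCR DEA (multiplier form): for each DMU o, maximize
# u.y_o subject to v.x_o = 1 and u.y_j - v.x_j <= 0 for all DMUs j.
# Data are invented: 4 DMUs, 2 inputs, 1 output.
X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0], [5.0, 2.0]])  # inputs
Y = np.array([[1.0], [1.0], [1.5], [2.0]])                      # outputs
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

for o in range(n):
    c = np.concatenate([-Y[o], np.zeros(m)])          # maximize u.y_o
    A_ub = np.hstack([Y, -X])                         # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]  # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    print(f"DMU {o}: efficiency = {-res.fun:.3f}")
```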
Abstract:
Background - This study investigates the coverage of adherence to medicine by the UK and US newsprint media. Adherence to medicine is recognised as an important issue facing healthcare professionals, and the newsprint media is a key source of health information; however, little is known about newspaper coverage of medication adherence. Methods - A search of the newspaper database Nexis®UK from 2004–2011 was performed. Content analysis of newspaper articles that referenced medication adherence from the twelve highest-circulating UK and US daily newspapers and their Sunday equivalents was carried out. A second researcher coded a 15% sample of the newspaper articles to establish the inter-rater reliability of coding. Results - Searches of newspaper coverage of medication adherence in the UK and US yielded 181 relevant articles for each country. There was a large increase in the number of scientific articles on medication adherence in PubMed® over the study period; however, this was not reflected in the frequency of newspaper articles published on medication adherence. UK newspaper articles were significantly more likely to report the benefits of adherence (p = 0.005), whereas US newspaper articles were significantly more likely to report adherence issues in the elderly population (p = 0.004) and adherence associated with diseases of the central nervous system (p = 0.046). The most commonly reported barriers to adherence were patient factors, e.g. poor memory, beliefs and age, whereas the most commonly reported facilitators were medication factors, including simplified regimens, shorter treatment duration and combination tablets. HIV/AIDS was the single most frequently cited disease (reported in 20% of newspaper articles). Poor-quality reporting of medication adherence was identified in 62% of newspaper articles. Conclusion - Adherence is not well covered in the newspaper media despite a significant presence in the medical literature. The mass media have the potential to help educate and shape the public's knowledge regarding the importance of medication adherence; this potential is not being realised at present.
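An inter-rater reliability check on the double-coded 15% sample could be computed as below; the abstract does not name the statistic, so Cohen's kappa is shown as a common choice for two coders, with invented codes.

```python
from sklearn.metrics import cohen_kappa_score

# Cohen's kappa: chance-corrected agreement between two coders assigning
# categorical codes to the same articles. Codes below are invented.
coder1 = ["benefit", "barrier", "barrier", "facilitator", "benefit", "barrier"]
coder2 = ["benefit", "barrier", "facilitator", "facilitator", "benefit", "barrier"]

kappa = cohen_kappa_score(coder1, coder2)
print(f"Cohen's kappa = {kappa:.2f}")   # >0.6 is often read as substantial
```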
Abstract:
Representational difference analysis (RDA) has great potential for preferential amplification of unique but uncharacterised DNA sequences present in one source, such as a whole genome, but absent from a related genome or other complex population of sequences. While a few examples of its successful exploitation have been published, the method has not been well dissected, and robust, detailed published protocols are lacking. Here we examine the method in detail, suggest improvements and provide a protocol that has yielded key unique sequences from a pathogenic bacterial genome.
Abstract:
Renewable energy forms have been widely used in the past decades, marking a "green" shift in energy production. A major driver of this shift is the EU directives that set the Union's targets for energy production from renewable sources, greenhouse gas emissions and increases in energy efficiency. All member countries are obligated to apply harmonized legislation and practices and to restructure their energy production networks in order to meet the EU targets. Towards the fulfillment of the 20-20-20 EU targets, Greece promotes a specific strategy based on the construction of large-scale renewable energy source plants. In this paper, we present an optimal design of the Greek renewable energy production network using a 0-1 weighted goal programming model that considers social, environmental and economic criteria. In the absence of a panel of experts, a data envelopment analysis (DEA) approach is used to filter the best of the possible network structures, seeking maximum technical efficiency. A super-efficiency DEA model is then used to narrow the solutions and identify the best one. The results showed that, in order to achieve maximum efficiency, the social and environmental criteria must be weighted more heavily than the economic ones.
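A minimal 0-1 weighted goal programming model of this flavour (toy data, not the paper's formulation) can be written with PuLP: binary plant-selection variables, one goal constraint per criterion with deviation variables, and a weighted-deviation objective.

```python
import pulp

# Toy 0-1 weighted goal programming: pick candidate RES plants under a
# budget so that capacity, jobs and CO2-avoided goals are approached;
# only shortfalls are penalized, with social/environmental goals
# weighted more than the economic one. All numbers are invented.
cap, jobs, co2 = [100, 80, 60, 120], [30, 50, 20, 40], [90, 70, 50, 110]
cost, budget = [50, 40, 30, 60], 100
targets = {"cap": 200, "jobs": 80, "co2": 160}
weights = {"cap": 1.0, "jobs": 2.0, "co2": 2.0}
data = {"cap": cap, "jobs": jobs, "co2": co2}

prob = pulp.LpProblem("wgp", pulp.LpMinimize)
x = [pulp.LpVariable(f"x{i}", cat="Binary") for i in range(4)]
under = {g: pulp.LpVariable(f"under_{g}", lowBound=0) for g in targets}
over = {g: pulp.LpVariable(f"over_{g}", lowBound=0) for g in targets}

for g in targets:   # goal constraints: achieved + under - over == target
    prob += pulp.lpSum(c * xi for c, xi in zip(data[g], x)) + under[g] - over[g] == targets[g]
prob += pulp.lpSum(c * xi for c, xi in zip(cost, x)) <= budget   # hard constraint
prob += pulp.lpSum(weights[g] * under[g] for g in targets)       # objective

prob.solve(pulp.PULP_CBC_CMD(msg=0))
print("selected plants:", [i for i in range(4) if x[i].value() == 1])
```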
Abstract:
The best results in applying computer systems to automatic translation are obtained in word processing when texts pertain to specific thematic areas, with well-defined structures and a concise, limited lexicon. In this article we present a systematic work plan for the analysis and generation of language applied to the field of the pharmaceutical leaflet, a type of document characterized by rigidity of format and precision in the use of lexicon. We propose a solution based on the use of an interlingua as a pivot language between the source and target languages; in this application we consider Spanish and Arabic.
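The interlingua idea can be caricatured in a few lines (a toy, not the authors' system; English stands in for Arabic here purely so the output is checkable): analysis maps a leaflet instruction to a language-neutral frame, and each target language needs only a generator from that frame.

```python
# Toy interlingua pivot for a rigid leaflet pattern. A real system would
# use full analysis and generation grammars for Spanish and Arabic.
def analyse_es(sentence: str) -> dict:
    # ultra-simplified pattern for "Tomar N comprimido(s) cada H horas"
    words = sentence.lower().rstrip(".").split()
    return {"act": "TAKE", "dose": int(words[1]), "unit": "TABLET",
            "interval_hours": int(words[4])}

def generate_en(frame: dict) -> str:
    return (f"Take {frame['dose']} tablet(s) "
            f"every {frame['interval_hours']} hours.")

frame = analyse_es("Tomar 1 comprimido cada 8 horas.")
print(frame)
print(generate_en(frame))
```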
Abstract:
This paper investigates the power management issues in a mobile solar energy storage system. A multi-converter based energy storage system is proposed, in which solar power is the primary source while the grid or a diesel generator is selected as the secondary source. The existence of the secondary source facilitates battery state-of-charge detection by providing a constant battery charging current. Converter modeling, multi-converter control system design, digital implementation and experimental verification are introduced and discussed in detail. The prototype experiment indicates that the converter system can provide a constant charging current during solar converter maximum power point tracking operation, especially during large variations in solar power output, which proves the feasibility of the proposed design.
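The power-balance idea might be sketched as a simple control loop (illustrative values, not the paper's converter design): a PI regulator commands the secondary source to top up whatever the MPPT-driven solar converter cannot supply, holding the battery charging current constant.

```python
import numpy as np

# PI loop on the secondary-source current: battery current is the sum of
# the (fluctuating) solar contribution and the secondary-source command.
i_ref, v_bat = 10.0, 48.0          # target charging current (A), battery volts
kp, ki, dt = 0.5, 20.0, 1e-3       # illustrative gains and time step
integ, i_sec = 0.0, 0.0

solar = 300 + 150 * np.sin(np.linspace(0, 6, 5000))  # fluctuating PV power (W)
for p_pv in solar:
    i_bat = (p_pv / v_bat) + i_sec                   # solar + secondary current
    err = i_ref - i_bat
    integ += err * dt
    i_sec = max(0.0, kp * err + ki * integ)          # secondary source command
print(f"final battery current = {i_bat:.2f} A (target {i_ref} A)")
```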