879 results for fidelity encouragement
Abstract:
In the paper, the authors present and analyse examples of film titles mistranslated into Polish, selected from a database of over 1,100 titles and presented using a sample drawn from the comedy genre and all its subgenres. The authors discuss various film title translation strategies and procedures with reference to the literature on the subject. In the conclusions, the authors attempt to explain the reasons for the selection of certain translation procedures, with special focus on the free formulation of titles as the least transparent.
Abstract:
Human immunodeficiency virus (HIV) rapidly evolves through the generation and selection of mutants that can escape drug therapy. This process is fueled, in part, by the presumably highly error-prone polymerase reverse transcriptase (RT). The fidelity of polymerases can be influenced by cation co-factors. Physiologically, magnesium (Mg2+) is used as a co-factor by RT to perform catalysis; however, alternative cations including manganese (Mn2+), cobalt (Co2+), and zinc (Zn2+) can also be used. I demonstrate here that the fidelity and inhibition of HIV RT can be influenced differently, in vitro, by divalent cations depending on their concentration. The reported mutation frequency for purified HIV RT in vitro is typically in the 10⁻⁴ range (per nucleotide addition), making the enzyme several-fold less accurate than most polymerases. Paradoxically, results examining HIV replication in cells indicate an error frequency that is ~10 times lower than the error rate obtained in the test tube. Here, I reconcile, at least in part, these discrepancies by showing that HIV RT fidelity in vitro is in the same range as cellular results at physiological concentrations of free Mg2+ (~0.25 mM). At low Mg2+, mutation rates were 5-10 times lower than under high Mg2+ conditions (5-10 mM). Alternative divalent cations also have a concentration-dependent effect on RT fidelity. The presumed promutagenic cations Mn2+ and Co2+ decrease the fidelity of RT only at elevated concentrations, and Zn2+, when present at low concentration, increases the fidelity of HIV-1 RT by ~2.5-fold compared to Mg2+. HIV-1 and HIV-2 RT inhibition by nucleoside (NRTIs) and non-nucleoside RT inhibitors (NNRTIs) in vitro is also affected by the Mg2+ concentration.
NRTIs lacking the 3′-OH group inhibited both enzymes less efficiently in low Mg2+ than in high Mg2+, whereas inhibition by the "translocation-defective RT inhibitor", which retains the 3′-OH, was unaffected by Mg2+ concentration, suggesting that NRTIs with a 3′-OH group may be more potent than other NRTIs. In contrast, NNRTIs were more effective in low than in high Mg2+ conditions. Overall, the studies presented reveal strategies for designing novel RT inhibitors and strongly emphasize the need to study HIV RT and RT inhibitors under physiologically relevant low Mg2+ conditions.
Abstract:
Despite a current emphasis in Romantic scholarship on intersubjectivity, this study suggests that we still have much to learn about how theories of intersubjectivity operate in Romantic-era writings that focus on the family—the most common vehicle for exploring relationships during the period. By investigating how sympathy, intimacy, and fidelity are treated in the works of Mary Hays, Felicia Hemans, and Mary Shelley, this dissertation discovers the presence of an “ethics of refusal” within women’s Romantic-era texts. Texts that promote an ethics of refusal, I argue, come close to advocating a particular mode of relating within a given model of the family as the key to more equitable social relations, but then ultimately refuse to support any particular model. Although drawn towards models of relating that, at first, seem to offer explicit pathways towards a more ethical society, texts that promote an ethics of refusal ultimately reject any program of reform. Such rejection is not unaccountable, but stems from anxieties about appearing to dictate what is best for others when others are, in reality, other than the self. In this dissertation, I draw from feminist literary critiques that focus on ethics; genre-focused literary critiques; and studies of sympathy, intimacy, and fidelity that investigate modes of relating within the context of literary works and reader-textual relations. Psychoanalytic theory also plays an important role within my third chapter, on Mary Shelley’s novel Falkner. Scholarship that investigates the dialectical nature of Romantic-era literature informs my entire project. Through theorizing and studying an ethics of refusal, we can more fully understand how intersubjective modes functioned in Romantic literature and discover a Romanticism uniquely committed to attempting to turn dialectical reasoning into a social practice.
Abstract:
The ability to remain faithful to a particular area or site was analysed in the shanny Lipophrys pholis. Using passive integrated transponders, adults from a population of L. pholis at Cabo Raso, Portugal, were followed over a period of 3 years. The findings showed that site fidelity is a consistent behaviour during the breeding season, with specific breeding males found only in particular sectors within the area, and in specific nests, throughout the years. The fact that, in general, L. pholis individuals were absent from the study area during the non-breeding season and breeding males were recorded returning to the same nests and sectors for consecutive breeding seasons suggests that they have developed excellent homing abilities. Translocation data corroborate this idea, showing that breeding males successfully returned to their nests after a displacement of >100 m. Altogether, these findings highlight the relevance of life-history traits (e.g. nesting) in conditioning site fidelity and homing in this species of rocky intertidal fish and, more importantly, provide evidence of a well-developed navigational system.
Abstract:
Myocardial fibrosis detected via delayed-enhanced magnetic resonance imaging (MRI) has been shown to be a strong indicator for ventricular tachycardia (VT) inducibility. However, little is known regarding how inducibility is affected by the details of the fibrosis extent, morphology, and border zone configuration. The objective of this article is to systematically study the arrhythmogenic effects of fibrosis geometry and extent, specifically on VT inducibility and maintenance. We present a set of methods for constructing patient-specific computational models of human ventricles using in vivo MRI data for patients suffering from hypertension, hypercholesterolemia, and chronic myocardial infarction. Additional synthesized models with morphologically varied extents of fibrosis and gray zone (GZ) distribution were derived to study the alterations in the arrhythmia induction and reentry patterns. Detailed electrophysiological simulations demonstrated that (1) VT morphology was highly dependent on the extent of fibrosis, which acts as a structural substrate, (2) reentry tended to be anchored to the fibrosis edges and showed transmural conduction of activations through narrow channels formed within fibrosis, and (3) increasing the extent of GZ within fibrosis tended to destabilize the structural reentry sites and aggravate the VT as compared to fibrotic regions of the same size and shape but with lower or no GZ. The approach and findings represent a significant step toward patient-specific cardiac modeling as a reliable tool for VT prediction and management of the patient. Sensitivities to approximation nuances in the modeling of structural pathology by image-based reconstruction techniques are also implicated.
Abstract:
Short-term site fidelity and movements of gilthead sea bream (Sparus aurata) in a coastal lagoon were determined using passive acoustic telemetry. Nine fish, ranging from 20.1 to 32.5 cm total length, were surgically implanted with acoustic transmitters and monitored for up to 179 days. Minimum convex polygon areas ranged from 18,698.6 m² to 352,711.9 m². Home range sizes were small, with individuals using core areas on a daily basis. However, these core areas shifted within the study site over time towards the opening to the sea. Two different diel behaviors were recorded, with some individuals more active at night and others during the daytime. Some individuals also demonstrated homing abilities, returning to the capture site after being released more than 4 km away.
Abstract:
The paper seeks to continue the debate about the need for professionals in the library and information services (LIS) sector to engage in career-long learning to sustain and develop their knowledge and skills in a dynamic industry. Aims: The neXus2 workforce study was funded by ALIA and the consortium of National and State Libraries Australasia (NSLA). It builds on earlier research (the neXus census), which examined the demographic, educational and career perspectives of individual library and information professionals, to critically examine institutional policies and practices associated with the LIS workforce. The research aims to develop a clearer understanding of the issues affecting workforce sustainability, workforce capability and workforce optimisation. Methods: The research methodology involved an extensive online survey conducted in March 2008, which collected data on organisational and general staffing; recruitment and retention; staff development and continuing professional education; and succession planning. Encouragement to participate was provided by key industry groups, including academic, public, health, law and government library and information agencies, with the result that around 150 institutions completed the questionnaire. Results: The paper specifically discusses the research findings relating to training and professional development, measuring the scope and distribution of training activities across the workforce, considering the interrelationship between the strategic and operational dimensions of staff development in individual institutions, and analysing the common and distinctive factors evident in the different sectors of the profession. Conclusion: The neXus2 project has successfully engaged LIS institutions in the collection of complex industry data relevant to the future education and workforce strategies for all areas of the profession.
Cross-sector forums such as Information Online 2009 offer the opportunity for stimulating professional dialogue on the key issues.
Abstract:
Since at least the 1960s, art has assumed a breadth of form and medium as diverse as social reality itself. Where once it was marginal and transgressive for artists to work across a spectrum of media, today it is common practice. In this ‘post-medium’ age, fidelity to a specific branch of media is a matter of preference, rather than a code of practice policed by gallerists, curators and critics. Despite the openness of contemporary art practice, the teaching of art at most universities remains steadfastly discipline-based. Discipline-based art teaching, while offering the promise of focussed ‘mastery’ of a particular set of technical skills and theoretical concerns, does so at the expense of a deeper and more complex understanding of the possibilities of creative experimentation in the artist’s studio. By maintaining a hermetic approach to medium, it does not prepare students sufficiently for the reality of art making in the twenty-first century. In fact, by pretending that there is a select range of techniques fundamental to the artist’s trade, discipline-based teaching can often appear to be more engaged with the notion of skills preservation than purposeful art training. If art schools are to survive and prosper in an increasingly vocationally-oriented university environment, they need to fully synthesise the professional reality of contemporary art practice into their approach to teaching and learning. This paper discusses the way in which the ‘open’ studio approach to visual art study at QUT endeavours to incorporate the diversity and complexity of contemporary art while preserving the sense of collective purpose that discipline-based teaching fosters.
By allowing students to independently develop their own art practices while also applying collaborative models of learning and assessment, the QUT studio program aims to equip students with a strong sense of self-reliance, a broad awareness and appreciation of contemporary art, and a deep understanding of studio-based experimentation unfettered by the boundaries of traditional media: all skills fundamental to the practice of contemporary art.
Abstract:
Artificial neural networks (ANNs) have demonstrated good predictive performance in a wide range of applications. They are, however, not considered sufficient for knowledge representation because of their inability to represent the reasoning process succinctly. This paper proposes a novel methodology, Gyan, which represents the knowledge of a trained network in the form of restricted first-order predicate rules. The empirical results demonstrate that an equivalent symbolic interpretation, in the form of rules with predicates, terms and variables, can be derived that describes the overall behaviour of the trained ANN with improved comprehensibility while maintaining the accuracy and fidelity of the propositional rules.
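The fidelity criterion mentioned in this abstract (how closely an extracted rule set reproduces the behaviour of the trained network) can be made concrete with a small sketch. This is a generic illustration, not the Gyan methodology itself; the toy "network" and the predicate-style rule below are invented for the example.

```python
# Illustrative only: generic fidelity/accuracy checks for rule extraction.
# The toy "network" and rule are invented placeholders.

def fidelity(rule_fn, net_fn, inputs):
    """Fraction of inputs on which the extracted rules match the network."""
    return sum(rule_fn(x) == net_fn(x) for x in inputs) / len(inputs)

def accuracy(fn, labeled):
    """Fraction of (input, label) pairs classified correctly."""
    return sum(fn(x) == y for x, y in labeled) / len(labeled)

# Toy stand-ins: a "trained network" and a rule that happens to mimic it.
net = lambda x: int(x[0] > 0.5 and x[1] > 0.5)
rule = lambda x: int(min(x) > 0.5)

data = [(0.2, 0.9), (0.7, 0.8), (0.6, 0.4), (0.9, 0.9)]
print(fidelity(rule, net, data))  # 1.0: the rule reproduces the network here
```

High fidelity with improved comprehensibility is precisely the trade-off the abstract describes: the symbolic rules should track the network's behaviour while remaining human-readable.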
Abstract:
Modern computer graphics systems are able to construct renderings of such high quality that viewers are deceived into regarding the images as coming from a photographic source. Large amounts of computing resources are expended in this rendering process, using complex mathematical models of lighting and shading. However, psychophysical experiments have revealed that viewers only regard certain informative regions within a presented image. Furthermore, it has been shown that these visually important regions contain low-level visual feature differences that attract the attention of the viewer. This thesis will present a new approach to image synthesis that exploits these experimental findings by modulating the spatial quality of image regions by their visual importance. Efficiency gains are therefore reaped, without sacrificing much of the perceived quality of the image. Two tasks must be undertaken to achieve this goal. Firstly, the design of an appropriate region-based model of visual importance, and secondly, the modification of progressive rendering techniques to effect an importance-based rendering approach. A rule-based fuzzy logic model is presented that computes, using spatial feature differences, the relative visual importance of regions in an image. This model improves upon previous work by incorporating threshold effects induced by global feature difference distributions and by using texture concentration measures. A modified approach to progressive ray-tracing is also presented. This new approach uses the visual importance model to guide the progressive refinement of an image. In addition, this concept of visual importance has been incorporated into supersampling, texture mapping and computer animation techniques. Experimental results are presented, illustrating the efficiency gains reaped from using this method of progressive rendering. 
This visual importance-based rendering approach is expected to have applications in the entertainment industry, where image fidelity may be sacrificed for efficiency purposes, as long as the overall visual impression of the scene is maintained. Different aspects of the approach should find many other applications in image compression, image retrieval, progressive data transmission and active robotic vision.
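The core resource-allocation idea behind this approach (spending more of a fixed sample budget on visually important regions) can be sketched in a few lines. The region names, importance weights, and ray budget below are hypothetical placeholders for illustration, not values from the thesis's fuzzy-logic model.

```python
# A minimal sketch of importance-based sample allocation. Region weights and
# the ray budget are invented placeholders, not the thesis's fuzzy model.

def allocate_samples(importance, total_budget):
    """Distribute a ray budget across image regions by relative importance."""
    total = sum(importance.values())
    return {region: round(total_budget * w / total)
            for region, w in importance.items()}

# Hypothetical importance scores, as a visual-importance model might produce.
regions = {"face": 0.6, "background": 0.1, "texture": 0.3}
print(allocate_samples(regions, 1000))
# {'face': 600, 'background': 100, 'texture': 300}
```

A progressive renderer would then refine high-budget regions first, so perceived quality converges quickly even when low-importance regions remain coarse.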
Abstract:
Objectives: To explore the influence of social support on parental physical activity (PA). Methods: Forty parents (21 mothers, 19 fathers) participated in semistructured individual or group interviews. Data were analyzed using thematic content analysis. Results: Instrumental (eg, providing child care, taking over chores), emotional (eg, encouragement, companionship), and informational support (eg, ideas and advice), as well as reciprocal support (eg, giving as well as receiving support) and autonomy support (eg, respecting one’s choices), are important for parents’ PA behavior. However, having support for being active is not straightforward, in that many parents discussed issues that inhibited the facilitative nature of social support for PA performance (eg, guilt in getting help). Conclusions: Results highlight the complex nature of social support in facilitating parental PA.
Abstract:
An adaptive agent improves its performance by learning from experience. This paper describes an approach to adaptation based on modelling dynamic elements of the environment in order to make predictions of the likely future state. This approach is akin to an elite sports player being able to “read the play”, allowing decisions to be made based on predictions of likely future outcomes. Modelling of the agent’s likely future state is performed using Markov chains and a technique called “Motion and Occupancy Grids”. The experiments in this paper compare the performance of the planning system with and without the use of this predictive model. The results of the study demonstrate a surprising decrease in performance when using the predictions of agent occupancy. The results are derived from statistical analysis of the agent’s performance in a high-fidelity simulation of a world-leading real robot soccer team.
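The prediction step described in this abstract (propagating an occupancy estimate one step forward through a Markov chain) reduces to a matrix-vector product. The three-cell grid and the transition probabilities below are invented for illustration and are not taken from the paper's model.

```python
# Toy Markov-chain occupancy prediction. The grid and probabilities are
# invented; a real motion-and-occupancy grid would be learned from observation.

def step(occupancy, transitions):
    """One Markov update: new[j] = sum_i occupancy[i] * transitions[i][j]."""
    n = len(occupancy)
    return [sum(occupancy[i] * transitions[i][j] for i in range(n))
            for j in range(n)]

P = [[0.7, 0.3, 0.0],   # from cell 0: mostly stays, sometimes moves right
     [0.1, 0.8, 0.1],   # from cell 1
     [0.0, 0.4, 0.6]]   # from cell 2

occ = [1.0, 0.0, 0.0]   # opponent currently known to be in cell 0
print(step(occ, P))     # [0.7, 0.3, 0.0]
```

Iterating `step` yields the predicted distribution further into the future, which a planner can use to bias decisions toward cells with low predicted occupancy.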
Abstract:
Environmental impacts caused during Australia's comparatively recent settlement by Europeans are evident. Governments (both Commonwealth and States) have been largely responsible for requiring landholders – through leasehold development conditions and taxation concessions – to conduct clearing that is now perceived as damage. Most governments are now demanding resource protection. There is a measure of bewilderment (if not resentment) among landholders because of this change. The more populous States, where most overall damage has been done (i.e. Victoria and New South Wales), provide most support for attempts to stop development in other regions where there has been less damage. Queensland, i.e. the north-eastern quarter of the continent, has been relatively slow to develop. It also holds the largest and most diverse natural environments. Tree clearing is an unavoidable element of land development, whether to access and enhance native grasses for livestock or to allow for urban developments (with exotic tree plantings). The consequences in terms of regulations are particularly complex because of the dynamic nature of vegetation. The regulatory terms used in current legislation – such as 'Endangered' and 'Of concern' – depend on legally-defined, static baselines. Regrowth and fire damage are two obvious causes of change. A less obvious aspect is succession, where ecosystems change naturally over long timeframes. In the recent past, the Queensland Government encouraged extensive tree-clearing, e.g. through the State Brigalow Development Scheme (mostly 1962 to 1975), which resulted in the removal of some 97% of the wide-ranging mature forests of Acacia harpophylla. At the same time, this government controls National Parks and other reservations (occupying some 4% of the State's 1.7 million km² area) and also holds major World Heritage Areas (such as the Great Barrier Reef and the Wet Tropics Rainforest) promulgated under Commonwealth legislation.
This is a highly prescriptive approach, where the community is directed on the one hand to develop (largely through lease conditions) and on the other to avoid development (largely through unusable reserves). Another approach to development and conservation is still possible in Queensland. For this to occur, however, a more workable and equitable solution than has been employed to date is needed, especially for the remote lands of this State. This must involve resident landholders, who have the capacity (through local knowledge, infrastructure and daily presence) to undertake sustainable land-use management most cost-effectively (with suitable attention to ecosystems requiring special conservation effort), provided they have the necessary direction, encouragement and incentive to do so.
Abstract:
The International Network of Indigenous Health Knowledge and Development (INIHKD) Conference was held from Monday 24 May to Friday 28 May 2010 at Kiana Lodge, Port Madison Indian Reservation, Suquamish Nation, Washington State, United States of America. The overall theme of the 4th Biennial Conference was ‘Knowing Our Roots: Indigenous Medicines, Health Knowledges and Best Practices’. This article details the experiences of participants at the INIHKD Conference. It concludes by encouraging people to attend the 5th INIHKD Conference in Australia in 2012.
Abstract:
The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation of these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on computation, transmission, and storage costs. This decomposition structure is based on an analysis of the information packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean square criterion as well as the sensitivities of human vision. To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model's shape parameter is formulated to estimate the model parameters. A noise shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach.
In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project them onto the lattice's outermost shell, while properly maintaining a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training or multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images. For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high-quality reconstructed images with better compression ratios than other available algorithms.
To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structural feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
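The generalized Gaussian modelling step described in this abstract can be illustrated with a small sketch. The thesis formulates a least-squares fit on a nonlinear function of the shape parameter; the simpler moment-ratio estimator below is a stand-in used purely for illustration, run on synthetic Gaussian data in place of real wavelet coefficients.

```python
# Hedged sketch: estimate the shape parameter of a generalized Gaussian from
# samples via the moment ratio E|x| / sqrt(E[x^2]). This is NOT the thesis's
# least-squares formulation, just a simple illustrative alternative.
import math
import random

def ggd_ratio(beta):
    """E|x| / sqrt(E[x^2]) for a zero-mean generalized Gaussian of shape beta."""
    return math.gamma(2 / beta) / math.sqrt(math.gamma(1 / beta) * math.gamma(3 / beta))

def estimate_shape(samples, lo=0.1, hi=5.0, iters=60):
    """Bisection solve for the beta whose theoretical ratio matches the data."""
    m1 = sum(abs(x) for x in samples) / len(samples)
    m2 = math.sqrt(sum(x * x for x in samples) / len(samples))
    target = m1 / m2
    for _ in range(iters):
        mid = (lo + hi) / 2
        if ggd_ratio(mid) < target:  # the ratio increases with beta
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

random.seed(0)
coeffs = [random.gauss(0, 1) for _ in range(20000)]
print(round(estimate_shape(coeffs), 2))  # close to 2.0 for Gaussian data
```

A shape parameter near 1 indicates Laplacian-like (sharply peaked) subband statistics and near 2 Gaussian-like; the fitted model then drives the bit-allocation and lattice-quantizer design described above.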