966 results for Critical Levels
Abstract:
While studies of the regulation of gene expression have generally concerned qualitative changes in the selection or the level of expression of a gene, much of the regulation that occurs within a cell involves the continuous subtle optimization of the levels of proteins used in macromolecular complexes. An example is the biosynthesis of the ribosome, in which equimolar amounts of nearly 80 ribosomal proteins must be supplied by the cytoplasm to the nucleolus. We have found that the transcript of one of the ribosomal protein genes of Saccharomyces cerevisiae, RPL32, participates in such fine tuning. Sequences from exon I of the RPL32 transcript interact with nucleotides from the intron to form a structure that binds L32 to regulate splicing. In the spliced transcript, the same sequences interact with nucleotides from exon II to form a structure that binds L32 to regulate translation, thus providing two levels of autoregulation. We now show, by using a sensitive cocultivation assay, that these RNA structures and their interaction with L32 play a role in the fitness of the cell. The change of a single nucleotide within the 5' leader of the RPL32 transcript, which abolishes the site for L32 binding, leads to detectably slower growth and to eventual loss of the mutant strain from the culture. Experiments designed to assess independently the regulation of splicing and the regulation of translation are presented. These observations demonstrate that, in evolutionary terms, subtle regulatory compensations can be critical. The change in structure of an RNA, due to alteration of just one noncoding nucleotide, can spell the difference between biological success and failure.
Abstract:
Overexpression of phytochrome B (phyB) in transgenic Arabidopsis results in enhanced deetiolation in red light. To define domains of phyB functionally important for its regulatory activity, we performed chemical mutagenesis of a phyB-overexpressing line and screened for phenotypic revertants in red light. Four phyB-transgene-linked revertants that retain parental levels of full-length, dimeric, and spectrally normal overexpressed phyB were identified among 101 red-light-specific revertants. All carry single amino acid substitutions in the transgene-encoded phyB that reduce activity by 40- to 1000-fold compared to the nonmutagenized parent. The data indicate that the mutant molecules are fully active in photosignal perception but defective in the regulatory activity responsible for signal transfer to downstream components. All four mutations fall within a 62-residue region in the COOH-terminal domain of phyB, with two independent mutations occurring in a single amino acid, Gly-767. Accumulating evidence indicates that the identified region is a critical determinant in the regulatory function of both phyB and phyA.
Abstract:
A critical gene involved in mammalian sex determination and differentiation is the Sry-related gene Sox9. In reptiles, Sox9 resembles that of mammals in both structure and expression pattern in the developing gonad, but a causal role in male sex determination has not been established. A closely related gene, Sox8, is conserved in human, mouse, and trout and is expressed in developing testes but not developing ovaries in the mouse. In this study, we tested the possibility of Sox8 being important for sex determination or sex differentiation in the red-eared slider turtle Trachemys scripta, in which sex is determined by egg incubation temperature between stages 15 and 20. We cloned partial turtle Sox8 and anti-Müllerian hormone (Amh) cDNAs, and analyzed the expression patterns of these genes in developing gonads by reverse transcriptase-polymerase chain reaction and whole-mount in situ hybridization. While Amh is expressed more strongly in males than in females at stage 17, Sox8 is expressed at similar levels in males and females throughout the sex-determining period. These observations suggest that differential transcription of Sox8 is not responsible for the regulation of Amh, nor for sex determination in the turtle. © 2004 Wiley-Liss, Inc.
Abstract:
Intensive Case Management (ICM) is widely claimed to be an evidence-based and cost-effective program for people with high levels of disability as a result of mental illness. However, the findings of recent randomized controlled trials comparing ICM with 'usual services' suggest that both the clinical and the cost effectiveness of ICM may be weakening. Possible reasons for this, including fidelity of implementation, researcher allegiance effects and changes in the wider service environment within which ICM is provided, are considered. The implications for service delivery and research are discussed.
Abstract:
Background & Aims: Treatment of chronic hepatitis B (CHB) involves a number of complex and controversial issues. Expert opinions may differ from those of practicing hepatologists and gastroenterologists. We aimed to explore this issue further after a critical review of the literature. Methods: A panel of 14 international experts graded the strength of evidence for 16 statements addressing 3 content areas: patient selection, therapeutic end points, and treatment options. Available data relating to the statements were reviewed critically in 3 small work groups. After discussion of each statement with the entire panel, the experts voted anonymously to accept or reject statements based on the strength of evidence and their experience. A total of 241 members of the American Association for the Study of Liver Diseases (AASLD) responded to the same statements, and their responses were compared with those of the experts. A discordant response was defined as a difference of more than 20% in any of the 5 graded levels of response (accept or reject) between the 2 groups. Results: With the exception of 2 statements, the experts' responses were relatively uniform. However, the responses of the AASLD members were discordant with those of the experts for 12 statements, spanning all 3 content areas. Conclusions: Several areas of disagreement on the management of CHB exist between experts and AASLD members. Our results indicate a potential knowledge gap among practicing hepatologists. Better educational efforts are needed to meet the challenge of managing this complex disorder, in which even expert opinion occasionally may disagree.
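The discordance rule above is simple arithmetic; a toy sketch (hypothetical numbers, assuming each group's answers are summarized as percentages over the 5 graded response levels) makes it concrete:

```python
# A toy sketch of the >20% discordance rule described above. It assumes each
# group's answers are summarized as percentages over the 5 graded response
# levels; the numbers below are hypothetical, not data from the study.
def discordant(expert_pct, member_pct, threshold=20.0):
    """True if any graded level differs between the groups by more than threshold points."""
    return any(abs(e - m) > threshold for e, m in zip(expert_pct, member_pct))

experts = [70, 20, 5, 3, 2]    # % of experts choosing each graded level
members = [40, 30, 15, 10, 5]  # % of AASLD members choosing each graded level
print(discordant(experts, members))  # True: the first level differs by 30 points
```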
Abstract:
The work described in this thesis is directed towards the reduction of noise levels in the Hoover Turbopower upright vacuum cleaner. The experimental work embodies a study of such factors as the application of noise source identification techniques, investigation of the noise-generating principles for each major source, and evaluation of noise-reducing treatments. It was found that the design of the vacuum cleaner had not been optimised from the standpoint of noise emission. Important factors such as noise 'windows', isolation of vibration at the source, panel rattle, resonances and critical speeds had not been considered. Therefore, a number of experimentally validated treatments are proposed, and their noise reduction benefit together with material and tooling costs are presented. The solutions to the noise problems were evaluated on a standard Turbopower, and the sound power level of the cleaner was reduced from 87.5 dB(A) to 80.4 dB(A) at a cost of 93.6 pence per cleaner.

The designers' lack of experience in noise reduction was identified as one of the factors behind the low priority given to noise during design of the cleaner. Consequently, the fundamentals of acoustics, the principles of noise prediction and absorption, and guidelines for good acoustical design were collated into a handbook and circulated at Hoover plc.

Mechanical variations during production of the motor and the cleaner were found to be important, causing a wide spread in the noise levels of the cleaners. The manufacturing processes were therefore briefly studied to identify the source of these variations, and recommendations for improvement are made.

The noise of a product is quality-related, and a high level of noise is considered a bad feature. This project suggested that the noise level be used constructively, both as a production-line test to identify cleaners above a certain noise level and to promote the product by 'designing' the characteristics of the sound so that the appliance is pleasant to the user. The project showed that good noise control principles should be implemented early in the design stage.

As yet there are no mandatory noise limits or noise-labelling requirements for household appliances. However, the literature suggests that noise-labelling is likely in the near future and that the requirement will be to display the A-weighted sound power level. The 'noys' scale of perceived noisiness was nevertheless found more appropriate for rating appliance noise: it is linear, so a sound that seems twice as loud has twice the value in noys, and it takes into account the presence of pure tones, which can cause annoyance even in the absence of a high overall noise level.
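For context on that final point, the linearity claim rests on a standard psychoacoustic relation (Kryter's perceived noisiness; general acoustics background, not a result specific to this thesis): noisiness in noys is a ratio scale that doubles for every 10 dB increase in perceived noise level L_PN (in PNdB), so

```latex
% Perceived noise level L_{PN} (PNdB) versus total noisiness N (noys):
% N doubles for every 10 PNdB, so "twice as loud" means twice the noys.
N = 2^{(L_{PN} - 40)/10}
\quad\Longleftrightarrow\quad
L_{PN} = 40 + \frac{10}{\log_{10} 2}\,\log_{10} N .
```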
Abstract:
Purpose – The purpose of this paper is to investigate the effectiveness of quality management training by reviewing commonly used critical success factors and tools rather than the overall methodological approach. Design/methodology/approach – The methodology used a web-based questionnaire. It consisted of 238 questions covering 77 tools and 30 critical success factors selected from leading academic and practitioner sources. The survey had 79 usable responses and the data were analysed using relevant statistical quality management tools. The results were validated in a series of structured workshops with quality management experts. Findings – Findings show that in general most of the critical success factor statements for quality management are agreed with, although not all are implemented well. The findings also show that many quality tools are not known or understood well; and that training has an important role in raising their awareness and making sure they are used correctly. Research limitations/implications – Generalisations are limited by the UK-centric nature of the sample. Practical implications – The practical implications are discussed for organisations implementing quality management initiatives, training organisations revising their quality management syllabi and academic institutions teaching quality management. Originality/value – Most recent surveys have been aimed at methodological levels (i.e. “lean”, “Six Sigma”, “total quality management” etc.); this research proposes that this has limited value as many of the tools and critical success factors are common to most of the methodologies. Therefore, quite uniquely, this research focuses on the tools and critical success factors. Additionally, other recent comparable surveys have been less comprehensive and not focused on training issues.
Abstract:
We present a data-based statistical study on the effects of seasonal variations in the growth rates of gastro-intestinal (GI) parasitic infection in livestock. The alluded growth rate is estimated through the variation in the number of eggs per gram (EPG) of faeces in animals. In accordance with earlier studies, our analysis too shows that rainfall is the dominant variable in determining EPG infection rates compared to other macro-parameters like temperature and humidity. Our statistical analysis clearly indicates an oscillatory dependence of EPG levels on rainfall fluctuations. Monsoon recorded the highest infection, with an increase of at least 2.5 times over the next most infected period (summer). A least-squares fit of the EPG versus rainfall data indicates an approach towards a super-diffusive infection growth pattern (i.e. root-mean-square displacement growing faster than the square root of the elapsed time, as obtained for simple diffusion) for low rainfall regimes (technically defined as zeroth-level dependence) that gets remarkably augmented for large rainfall zones. Our analysis further indicates that for low fluctuations in temperature (true of the bulk data), the EPG level saturates beyond a critical value of the rainfall, a threshold that is expected to indicate the onset of the nonlinear regime. The probability density functions (PDFs) of the EPG data show oscillatory behavior in the large rainfall regime (greater than 500 mm), the frequency of oscillation, once again, being determined by the ambient wetness (rainfall and humidity). Data recorded over three pilot projects spanning three measures of rainfall and humidity bear testimony to the universality of this statistical argument. © 2013 Chattopadhyay and Bandyopadhyay.
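The diffusion terminology used above has a compact standard definition (general usage, not a result specific to this paper): simple diffusion has mean-square displacement growing linearly in time, and super-diffusion grows faster than that baseline,

```latex
% Simple diffusion: RMS displacement grows as the square root of elapsed time.
\langle x^2(t) \rangle \propto t
\;\Rightarrow\;
x_{\mathrm{rms}}(t) \propto t^{1/2},
% Super-diffusion: faster-than-square-root growth of the RMS displacement.
\langle x^2(t) \rangle \propto t^{\gamma}, \qquad \gamma > 1 .
```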
Abstract:
In this study, we developed a DEA-based performance measurement methodology that is consistent with performance assessment frameworks such as the Balanced Scorecard. The methodology developed in this paper takes into account the direct or inverse relationships that may exist among the dimensions of performance to construct appropriate production frontiers. The production frontiers we obtained are deemed appropriate as they consist solely of firms with desirable levels for all dimensions of performance. These levels should be at least equal to the critical values set by decision makers. The properties and advantages of our methodology against competing methodologies are presented through an application to a real-world case study from retail firms operating in the US. A comparative analysis between the new methodology and existing methodologies explains the failure of the existing approaches to define appropriate production frontiers when directly or inversely related dimensions of performance are present and to express the interrelationships between the dimensions of performance.
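As a rough illustration of the core mechanism (a minimal sketch, not the authors' actual DEA model; the CCR formulation, data and names below are assumptions made for the example), the reference set spanning the frontier can be restricted to firms whose performance meets the decision makers' critical values on every dimension:

```python
# Minimal sketch: input-oriented CCR DEA scores via scipy, with the frontier
# restricted to firms meeting decision-maker critical values on all outputs.
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, k, admissible):
    """Efficiency of firm k (inputs X: n x m, outputs Y: n x s), benchmarked
    only against the admissible reference firms."""
    Xa, Ya = X[admissible], Y[admissible]
    n = Xa.shape[0]
    # Decision variables: theta, lambda_1..lambda_n; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # Inputs:  sum_j lambda_j * x_ij - theta * x_ik <= 0
    A_in = np.c_[-X[k][:, None], Xa.T]
    # Outputs: sum_j lambda_j * y_rj >= y_rk  ->  -sum_j lambda_j * y_rj <= -y_rk
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Ya.T]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[k]],
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]

# Hypothetical data: 5 firms, 2 inputs, 2 performance dimensions (outputs).
X = np.array([[2., 3.], [3., 2.], [4., 4.], [5., 3.], [3., 5.]])
Y = np.array([[4., 5.], [5., 4.], [4., 4.], [6., 5.], [3., 6.]])
critical = np.array([4.0, 4.0])             # minimum acceptable level per dimension
admissible = np.all(Y >= critical, axis=1)  # only these firms may span the frontier
print([round(dea_efficiency(X, Y, k, admissible), 3) for k in range(len(X))])
```

The filter on the reference set is the point of contact with the abstract: firms below the critical values can still be scored, but they cannot serve as benchmarks that define the frontier.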
Abstract:
In recent years, a surprising new phenomenon has emerged in which globally-distributed online communities collaborate to create useful and sophisticated computer software. These open source software groups are comprised of generally unaffiliated individuals and organizations who work in a seemingly chaotic fashion and who participate on a voluntary basis without direct financial incentive. The purpose of this research is to investigate the relationship between the social network structure of these intriguing groups and their level of output and activity, where social network structure is defined as 1) closure or connectedness within the group, 2) bridging ties which extend outside of the group, and 3) leader centrality within the group. Based on well-tested theories of social capital and centrality in teams, propositions were formulated which suggest that social network structures associated with successful open source software project communities will exhibit high levels of bridging and moderate levels of closure and leader centrality. The research setting was the SourceForge hosting organization and a study population of 143 project communities was identified. Independent variables included measures of closure and leader centrality defined over conversational ties, along with measures of bridging defined over membership ties. Dependent variables included source code commits and software releases for community output, and software downloads and project site page views for community activity. A cross-sectional study design was used and archival data were extracted and aggregated for the two-year period following the first release of project software. The resulting compiled variables were analyzed using multiple linear and quadratic regressions, controlling for group size and conversational volume. Contrary to theory-based expectations, the surprising results showed that successful project groups exhibited low levels of closure and that the levels of bridging and leader centrality were not important factors of success. These findings suggest that the creation and use of open source software may represent a fundamentally new socio-technical development process which disrupts the team paradigm and which triggers the need for building new theories of collaborative development. These new theories could point towards the broader application of open source methods for the creation of knowledge-based products other than software.
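Purely as an illustration of the three structural measures (one plausible operationalization with networkx on a hypothetical toy graph, not the study's actual variable definitions): closure as within-group transitivity, bridging as ties crossing the group boundary, and leader centrality as the highest degree centrality inside the group.

```python
# Illustrative only: one plausible way to compute closure, bridging and
# leader centrality for a project group embedded in a larger tie network.
# The graph and group membership below are hypothetical.
import networkx as nx

G = nx.Graph()  # conversational/membership ties across the whole host site
G.add_edges_from([("a", "b"), ("b", "c"), ("a", "c"),   # ties inside the group
                  ("c", "d"), ("a", "e")])              # ties reaching outside
group = {"a", "b", "c"}

sub = G.subgraph(group)
closure = nx.transitivity(sub)  # fraction of closed triads within the group
bridging = sum(1 for u, v in G.edges()
               if (u in group) != (v in group))  # ties crossing the boundary
leader_centrality = max(nx.degree_centrality(sub).values())

print(closure, bridging, leader_centrality)
```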
Abstract:
Introduction: Point-of-care ultrasound (POCUS) use in clinical care is growing rapidly, and advocates have recently proposed the integration of ultrasound into undergraduate medical education (UME). The evidentiary basis for this integration has not been evaluated critically or systematically. In this study, we conducted a critical and systematic review framed by the rationales enumerated by advocates of ultrasound in UME in academic publications.
Methods: This research was conducted in two phases. First, the dominant discursive rationales for the integration of ultrasound in UME were identified from an archive of 403 academic publications, using techniques from Foucauldian critical discourse analysis (CDA). We then sought empirical evidence in support of these rationales, using a critical synthesis methodology also adapted from CDA.
Results: We identified four dominant discursive rationales, each with a different level of evidentiary support. Ultrasound was not demonstrated to improve students' understanding of anatomy. The benefit of ultrasound in teaching physical examination was inconsistent and rested on minimal evidence. With POCUS, students' diagnostic accuracy improved for certain pathologies, but findings were inconsistent for others. Finally, the rationale that ultrasound training in UME will improve the quality of patient care was difficult to evaluate.
Discussion: Our analysis has shown that the frequently repeated rationales for the integration of ultrasound in UME are not supported by a sufficient base of empirical research. The repetition of these dominant discursive rationales in academic publications legitimizes them and may preclude further primary research. Since the value of clinical ultrasound use by medical students remains unproven, educators must consider whether the associated financial and temporal costs are justified or whether more research is required.
Abstract:
As a teacher educator I consider myself an advocate for research-informed education, and strongly believe that it starts with one's own critical self-reflection and analysis of one's own teaching practice. Critical incident analysis is a pedagogical theory developed by Tripp (1993), whose analytical approaches allow reflection on teaching situations – 'the critical incident' – so that teachers can develop their professional judgments and practices. This article examines the concept of critical incident analysis through a teaching situation, with the aim of improving the teaching practice of students on teacher education programmes. I conclude that although critical incident analysis is a useful tool for navigating teaching practices, challenges often need to be addressed at much broader levels than the teaching context itself.
Abstract:
In today's "knowledge society", scientific knowledge plays a central role in producing and reproducing social conditions. A critical engagement with (scientific) knowledge, a "critical science literacy", opens up possibilities of resistance within the knowledge society and can therefore be understood as a basic democratic skill. In this article, the authors explore the possibilities of a critical science literacy within the tension between conformity and resistance. They review the historical development of the debate on scientific literacy, originally conceived merely as a basic competence in the natural sciences, yet carrying a genuinely critical, reflexive and democratic moment, in the context of democratic-capitalist conditions. They understand critical science literacy as a responsible, collective and interventionist practice in societal struggles, operating at all levels of knowledge generation and education. (DIPF/Orig.)
Abstract:
This study examines the pluralistic hypothesis advanced by the late Professor John Hick, viz. that all religious faiths provide equally salvific pathways to God, irrespective of their theological and doctrinal differences. The central focus of the study is a critical examination of (a) the epistemology of religious experience as advanced by Professor Hick, (b) the ontological status of the being he understands to be God, and (c) the extent to which the pluralistic view of religious experience can be harmonised with the experience with which the Christian life is understood to begin, viz. regeneration. Tracing the theological journey of Professor Hick from fundamentalist Christian to religious pluralist, the study notes the reasons given for Hick's gradual disengagement from the Christian faith. In addition to his belief that the pre-scientific worldview of the Bible was obsolete and passé, Hick took the view that modern biblical scholarship could not accommodate traditionally held Christian beliefs. He conceded that the Incarnation, if true, would be decisive evidence for the uniqueness of Christianity, but rejected the doctrine on the grounds of logical incoherence. This study affirms the view that the doctrine of the Incarnation occupies a place of crucial importance within world religion, but rejects the claim of incoherence. Professor Hick believed that God's Spirit was at work in all religions, producing a common religious experience, or spiritual awakening to God. The soteriological dimension of this spiritual awakening, he suggests, finds expression as the worshipper turns away from self-centredness to the giving of themselves to God and others. At the level of epistemology he further argued that religious experience itself provides the rational basis for belief in God. The study supports Professor Hick's assertion that religious experience ought to be trusted as a source of knowledge, on the basis of the principle of credulity, which states that a person's claim to perceive or experience something is prima facie justified unless there are compelling reasons to the contrary. Hick's argument has been extensively developed and defended by philosophers such as Alvin Plantinga and William Alston. This confirms the importance of Hick's contribution to the philosophy of religion, and further establishes his reputation within the field as an original thinker. It is recognised in this thesis, however, that in affirming only the rationality of belief, but not the obligation to believe, Professor Hick's epistemology is not fully consistent with a Christian theology of revelation. Christian theology views the created order as pre-interpreted and unambiguous in its testimony to God's existence; to disbelieve in God's existence is to violate one's epistemic duty by suppressing the truth. Professor Hick's critical realist principle, which he regards as the key to understanding what is happening in the different forms of religious experience, is examined within this thesis. According to the critical realist principle, there are realities external to us, yet we are never aware of them as they are in themselves, but only as they appear to us through our particular cognitive machinery and conceptual resources. All awareness of God is interpreted through the lens of pre-existing, culturally relative religious forms, which in turn explains the differing theologies within the world of religion.
The critical realist principle views God as unknowable, in the sense that his inner nature is beyond the reach of human conceptual categories and linguistic systems. Professor Hick thus endorses and develops the view of God as ineffable, but employs the term transcategorial when speaking of God's ineffability. The study takes the view that the notion of transcategoriality as developed by Professor Hick appears to deny any ontological status to God, effectively arguing him out of existence. Furthermore, in attributing transcategoriality to God, Professor Hick would appear to render incoherent his own fundamental assertion that we can know nothing of God that is either true or false. The claim that the experience of regeneration with which the Christian life begins can be classed as a mere species of a genus common throughout all faiths is rejected within this thesis. Instead it is argued that Christian regeneration is a distinctive experience that cannot be reduced to a salvific experience defined merely as an awareness of, or awakening to, God, followed by a turning away from self to others. Professor Hick argued against any notion that the Christian community was the social grouping through which God's Spirit was working in an exclusively redemptive manner. He supported his view by drawing attention to (a) the presence, at times, of comparable or higher levels of morality in world religion, when contrasted with that evidenced by the followers of Christ, and (b) the presence, at times, of demonstrably lower levels of morality in the followers of Christ, when contrasted with the lives of other religious devotees. These observations are fully supported, but the conclusion reached is rejected, on the grounds that according to Christian theology the saving work of God's Spirit is evidenced in a life that is changing from what it was before. Christian theology does not suggest or demand that such lives at every stage be demonstrably superior when contrasted with other virtuous or morally upright members of society. The study concludes by paying tribute to the contribution Professor Hick has made to the field of the epistemology of religious experience.