993 results for PROACTIVE APPROACH
Abstract:
Enamel white spot lesions in anterior teeth that compromise esthetics are common. Microabrasion is indicated, since it affects enamel only superficially. An acid-abrasive slurry of 37% phosphoric acid and pumice was applied to the enamel for a controlled time period. Home bleaching with hydrogen peroxide was then used, further improving the final result. The method is safe, easy, and conservative and provides good esthetic results. (Quintessence Int 2011;42:423-426)
Abstract:
Purpose: To verify the influence of cavity access diameter on demineralized dentin removal in the ART approach. Methods: 40 non-carious human premolars were randomly divided into four groups. The occlusal surface was ground flat and the teeth were sectioned mesio-distally. The hemi-sections were reassembled and occlusal access preparations were carried out using ball-shaped diamonds. The resulting size of the occlusal opening was 1.0 mm, 1.4 mm, 1.6 mm and 1.8 mm for Groups A, B, C, and D, respectively. Standardized artificial carious lesions were created and demineralized dentin was excavated. After excavation, the cavities were analyzed using: (a) the tactile method, (b) caries-detection dye to stain demineralized dentin, as proposed by Smales & Fang, and (c) Demineralized Tissue Removal index, as proposed in this study. Statistical analysis was performed using Fisher, Spearman correlation coefficient, kappa, Kruskal-Wallis and Miller tests (P < 0.05). Results: The three methods of evaluation showed no significant difference between Groups A vs. B, and C vs. D, while statistically significant differences were observed between Groups A vs. C, A vs. D, B vs. C and B vs. D. Based on the results of this study, the size of occlusal access significantly affected the efficacy of demineralized tissue removal.
Abstract:
In the design of lattice domes, design engineers need expertise in areas such as configuration processing, nonlinear analysis, and optimization. These are extensive numerical, iterative, and time-consuming processes that are prone to error without an integrated design tool. This article presents the application of a knowledge-based system to solving lattice-dome design problems. An operational prototype knowledge-based system, LADOME, has been developed by employing a combined knowledge representation approach, which uses rules, procedural methods, and an object-oriented blackboard concept. The system's objective is to assist engineers in lattice-dome design by integrating all design tasks into a single computer-aided environment through implementation of the knowledge-based system approach. For system verification, results from design examples are presented.
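A minimal blackboard-style sketch of the combined representation idea described above, in which rules and procedural methods post results to a shared blackboard; the class, rule, and data names are illustrative assumptions, not LADOME's actual components:

```python
# Illustrative blackboard sketch (not LADOME's API): rules fire when their
# preconditions are present on the shared blackboard and post new results.

class Blackboard:
    def __init__(self):
        self.data = {}

    def post(self, key, value):
        self.data[key] = value


def rule_member_sizing(bb):
    # Hypothetical rule: size members once analysis forces are available.
    if "member_forces" in bb.data and "member_sizes" not in bb.data:
        forces = bb.data["member_forces"]
        bb.post("member_sizes", [max(1.0, abs(f) / 100.0) for f in forces])


def run(bb, rules):
    # Simple control loop: keep firing rules until the blackboard stops changing.
    changed = True
    while changed:
        before = dict(bb.data)
        for rule in rules:
            rule(bb)
        changed = bb.data != before


bb = Blackboard()
bb.post("member_forces", [250.0, -180.0, 90.0])   # placeholder analysis output
run(bb, [rule_member_sizing])
print(bb.data["member_sizes"])
```

The control loop keeps applying knowledge sources until no rule changes the blackboard, which is the essence of the blackboard concept: each rule or procedure contributes only when its preconditions are met.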
Abstract:
This study (a) examined the multidimensionality of both group cohesion and group performance, (b) investigated the relationship between group-level task and social cohesion and group effectiveness, and (c) examined the longitudinal changes in cohesion and performance and the direction of effect between cohesion and performance. First, the authors hypothesized that both task and social cohesion would positively predict all dimensions of group performance. Second, that a stronger relationship would be observed between task cohesion and task effectiveness and between social cohesion and system viability. Third, that all dimensions of cohesion and performance would increase over time. Finally, that cohesion would be both the antecedent and the consequence of performance, but that the performance-cohesion relationship would be stronger than the cohesion-performance relationship. Results supported the hypothesized one-to-one relationship between specific dimensions of group cohesion and group performance. Task cohesion was the sole predictor of self-rated performance at both Time 1 and Time 2, whereas social cohesion was the only predictor of system viability at Time 1 and the stronger predictor at Time 2. Social cohesion at Time 2 predicted performance on the group task. However, no longitudinal changes were found in cohesion or performance. Finally, group cohesion was found to be the antecedent, but not the consequence, of group performance.
Abstract:
Studies of alcoholism etiology often focus on genetic or psychosocial approaches, but not both. Greater understanding of the etiology of alcohol, tobacco, and other addictions will come from integration of these research traditions. A research approach is outlined to test three models for the etiology of addictions (behavioral undercontrol, pharmacologic vulnerability, negative affect regulation), addressing key questions including (i) mediators of genetic effects, (ii) genotype-environment correlation effects, (iii) genotype x environment interaction effects, (iv) the developmental unfolding of genetic and environmental effects, (v) subtyping, including identification of distinct trajectories of substance involvement, (vi) identification of individual genes that contribute to risk, and (vii) the consequences of excessive use. By using coordinated research designs, including prospective assessment of adolescent twins and their siblings and parents; of adult substance-dependent and control twins and their MZ and DZ cotwins, the spouses of these pairs, and their adolescent offspring; and of regular families; by selecting for gene-mapping approaches sibships screened for extreme concordance or discordance on quantitative indices of substance use; and by using experimental (drug challenge) as well as survey approaches, a number of key questions concerning addiction etiology can be addressed. We discuss complementary strengths and weaknesses of different sampling strategies, as well as methods to implement such an integrated approach, illustrated for the study of alcoholism etiology. A coordinated program of twin and family studies will allow a comprehensive dissection of the interplay of genetic and environmental risk factors in the etiology of alcoholism and other addictions.
Abstract:
This special issue presents an excellent opportunity to study applied epistemology in public policy. This is an important task because the arena of public policy is the social domain in which macro conditions for ‘knowledge work’ and ‘knowledge industries’ are defined and created. We argue that knowledge-related public policy has become overly concerned with creating the politico-economic parameters for the commodification of knowledge. Our policy scope is broader than that of Fuller (1988), who emphasizes the need for a social epistemology of science policy. We extend our focus to a range of policy documents that include communications, science, education and innovation policy (collectively called knowledge-related public policy in acknowledgement of the fact that there is no defined policy silo called ‘knowledge policy’), all of which are central to policy concerned with the ‘knowledge economy’ (Rooney and Mandeville, 1998). However, what we will show here is that, as Fuller (1995) argues, ‘knowledge societies’ are not industrial societies permeated by knowledge, but that knowledge societies are permeated by industrial values. Our analysis is informed by an autopoietic perspective. Methodologically, we approach it from a sociolinguistic position that acknowledges the centrality of language to human societies (Graham, 2000). Here, what we call ‘knowledge’ is posited as a social and cognitive relationship between persons operating on and within multiple social and non-social (or, crudely, ‘physical’) environments. Moreover, knowing, we argue, is a sociolinguistically constituted process. Further, we emphasize that the evaluative dimension of language is most salient for analysing contemporary policy discourses about the commercialization of epistemology (Graham, in press). Finally, we provide a discourse analysis of a sample of exemplary texts drawn from a 1.3 million-word corpus of knowledge-related public policy documents that we compiled from local, state, national and supranational legislatures throughout the industrialized world. Our analysis exemplifies a propensity in policy for resorting to technocratic, instrumentalist and anti-intellectual views of knowledge. We argue that what underpins these patterns is a commodity-based conceptualization of knowledge, which rests on an axiology of narrowly economic imperatives at odds with the very nature of knowledge. The commodity view of knowledge, therefore, is flawed in its ignorance of the social systemic properties of ‘knowing’.
Abstract:
In this study, we have compared the effector functions and fate of a number of human CTL clones in vitro or ex vivo following contact with variant peptides presented either on the cell surface or in a soluble multimeric format. In the presence of CD8 coreceptor binding, there is a good correlation between TCR signaling, killing of the targets, and FasL-mediated CTL apoptosis. Blocking CD8 binding using alpha3 domain mutants of MHC class I results in much reduced signaling and reduced killing of the targets. Surprisingly, however, FasL expression is induced to a similar degree on these CTLs, and apoptosis of the CTLs is unaffected. The ability to divorce these events may allow the deletion of antigen-specific and pathological CTL populations without the deleterious effects induced by full CTL activation.
Abstract:
The stereocontrolled synthesis of (2S,4R,6R,8S,10S,1'R,1"R)-2-(acetylhydroxymethyl)-4,10-dimethyl-8-(isopropenylhydroxymethyl)-1,7-dioxaspiro[5,5]undecane (4a) and its C1"-epimer (4b), the key parent spiroketals of the HIV-1 protease-inhibitory didemnaketals from the ascidian Didemnum sp., has been carried out in multiple steps from natural (R)-(+)-pulegone, involving the diastereoselective construction of four chiral carbon centers (C-2, C-6, C-8, and C-1') by intramolecular chiral induction.
Abstract:
Activated sludge models are used extensively in the study of wastewater treatment processes. While various commercial implementations of these models are available, many people need to code the models themselves using the simulation packages available to them. Quality assurance of such models is difficult. While benchmarking problems have been developed and are available, comparing simulation data with that of commercial models leads only to the detection, not the isolation, of errors. Identifying the errors in the code is time-consuming. In this paper, we address the problem by developing a systematic and largely automated approach to the isolation of coding errors. There are three steps: first, possible errors are classified according to their place in the model structure and a feature matrix is established for each class of errors. Second, an observer is designed to generate residuals such that each class of errors imposes a subspace, spanned by its feature matrix, on the residuals. Finally, localising the residuals in a subspace isolates the coding errors. The algorithm proved capable of rapidly and reliably isolating a variety of single and simultaneous errors in a case study using the ASM 1 activated sludge model. In this paper a newly coded model was verified against a known implementation. The method is also applicable to simultaneous verification of any two independent implementations, and hence is useful in commercial model development.
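As a rough illustration of the residual-subspace idea described above, the following sketch projects an observer-generated residual onto each error class's feature-matrix subspace and reports the class whose subspace explains it best. The feature matrices and residual here are placeholder assumptions, not the paper's actual observer design for ASM 1:

```python
import numpy as np

def isolate_error_class(residual, feature_matrices):
    """Return the error class whose feature-matrix subspace best explains the residual.

    residual: 1-D residual vector produced by the observer.
    feature_matrices: dict mapping class name -> matrix whose columns span
    that class's residual subspace (illustrative placeholders here).
    """
    best_name, best_misfit = None, np.inf
    for name, F in feature_matrices.items():
        # Least-squares projection of the residual onto span(F); the norm of the
        # remainder measures how far the residual lies outside that subspace.
        coeffs, *_ = np.linalg.lstsq(F, residual, rcond=None)
        misfit = np.linalg.norm(residual - F @ coeffs)
        if misfit < best_misfit:
            best_name, best_misfit = name, misfit
    return best_name, best_misfit

# Toy example with made-up feature matrices for two hypothetical error classes.
F_kinetic = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
F_stoichiometric = np.array([[0.0], [0.0], [1.0]])
residual = np.array([0.8, -0.3, 0.02])   # lies mostly in the "kinetic" subspace

print(isolate_error_class(residual, {"kinetic": F_kinetic,
                                     "stoichiometric": F_stoichiometric}))
```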
Abstract:
We obtain the finite-temperature unconditional master equation of the density matrix for two coupled quantum dots (CQDs) when one dot is subjected to a measurement of its electron occupation number using a point contact (PC). To determine how the CQD system state depends on the actual current through the PC device, we use the so-called quantum trajectory method to derive the zero-temperature conditional master equation. We first treat the electron tunneling through the PC barrier as a classical stochastic point process (a quantum-jump model). Then we show explicitly that our results can be extended to the quantum-diffusive limit when the average electron tunneling rate is very large compared to the extra change of the tunneling rate due to the presence of the electron in the dot closer to the PC. We find that in both the quantum-jump and quantum-diffusive cases, the conditional dynamics of the CQD system can be described by stochastic Schrödinger equations for its conditioned state vector if and only if the information carried away from the CQD system by the PC reservoirs can be recovered by perfect detection of the measurements.
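For orientation, a conditional master equation of the quantum-jump type referred to above can be written in the standard zero-temperature, unit-efficiency form; this is a generic sketch with a placeholder jump operator c, not the paper's specific CQD/PC operators:

\[
d\rho_c(t) = dN(t)\left[\frac{c\,\rho_c(t)\,c^{\dagger}}{\operatorname{Tr}\!\left[c\,\rho_c(t)\,c^{\dagger}\right]} - \rho_c(t)\right]
+ dt\left(-i\left[H,\rho_c(t)\right] - \tfrac{1}{2}\left\{c^{\dagger}c,\rho_c(t)\right\} + \operatorname{Tr}\!\left[c\,\rho_c(t)\,c^{\dagger}\right]\rho_c(t)\right),
\qquad \mathrm{E}\!\left[dN(t)\right] = \operatorname{Tr}\!\left[c\,\rho_c(t)\,c^{\dagger}\right]dt .
\]

Averaging over the point-process record dN(t) removes the conditioning and recovers the unconditional (Lindblad-form) master equation, which is the sense in which the conditional and unconditional descriptions in the abstract are related.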
Abstract:
We apply the quantum trajectory method to current noise in resonant-tunneling devices. The results from dynamical simulation are compared with those from the unconditional master equation approach. We show that the stochastic Schrödinger equation approach is useful in modeling dynamical processes in mesoscopic electronic systems.
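A minimal Monte Carlo wave-function (quantum-jump) sketch of the kind of trajectory simulation mentioned above, written for a generic driven two-level system with a single detection channel; the Hamiltonian, rates, and jump operator are placeholder assumptions, not the resonant-tunneling model of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder parameters for a generic driven two-level system (arbitrary units).
Omega, gamma = 1.0, 0.5          # coherent drive strength, detection rate
dt, n_steps = 1e-3, 50_000

sm = np.array([[0, 1], [0, 0]], dtype=complex)        # lowering operator |g><e|
H = 0.5 * Omega * np.array([[0, 1], [1, 0]], dtype=complex)
c = np.sqrt(gamma) * sm                                # jump (detection) operator
H_eff = H - 0.5j * (c.conj().T @ c)                    # non-Hermitian effective Hamiltonian

psi = np.array([1, 0], dtype=complex)                  # start in the ground state
jump_times = []

for step in range(n_steps):
    # Probability of a detection event in this time step, given the conditioned state.
    p_jump = dt * np.real(psi.conj() @ (c.conj().T @ c) @ psi)
    if rng.random() < p_jump:
        psi = c @ psi                                  # record a jump and collapse the state
        psi /= np.linalg.norm(psi)
        jump_times.append(step * dt)
    else:
        psi = psi - 1j * dt * (H_eff @ psi)            # first-order no-jump evolution
        psi /= np.linalg.norm(psi)                     # renormalise the conditioned state

# The list of jump times plays the role of a simulated detection (current) record;
# averaging over many trajectories reproduces the unconditional master equation.
print(f"{len(jump_times)} detection events in {n_steps * dt:.1f} time units")
```

Fluctuations of such detection records across many trajectories are what a current-noise analysis of this kind would examine.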