Abstract:
The current context in Brazil is unique with regard to the teaching of evolution and the population's perception of it. On the one hand, Darwinism is frequently discussed in various media, especially owing to the relatively recent commemoration of the two hundredth anniversary of the birth of Charles Darwin and the one hundred and fiftieth anniversary of the publication of The Origin of Species. On the other hand, in recent years a timid, and more worrying, movement in favor of giving creationist and evolutionist theories equitable treatment in the classroom has become apparent. This article is part of a research project whose goal is to survey the conception that Brazilian respondents hold of the Darwinian view (which disregards divine influence in the evolution of species). The data-collection instrument is a Likert-scale questionnaire consisting of a series of statements for which respondents must express their degree of agreement or disagreement. In this study, we present the results for the statement "The thought of Darwin, which does not consider God as a participant in the process of evolution, is...". Analyses correlating the responses with data on the respondents' religion and education were also conducted. The results point to a tendency among respondents not to accept the Darwinian view that disregards God's interference in the evolutionary process. The data also show that respondents' choices are influenced by religion and education. The frequency of responses that tend to accept the Darwinian view (which disregards divine participation in the evolution of species) is higher among respondents with higher levels of education. Adherents of "evangelical" religions tend to deny this view more often than followers of other religions. Given the potential risks of inserting creationist approaches into school education, a discussion is needed of the possible impacts, on the teaching of evolution, of the rejection of Darwin's thinking (which does not consider God as a participant in the evolutionary process) indicated here. This work was supported by FAPEMIG.
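As an illustration of the correlational analysis described above, the sketch below cross-tabulates Likert responses against education level and religion with pandas; the column names and category labels are hypothetical stand-ins, not the study's actual coding.

```python
# Minimal sketch of the correlational analysis described above: relative
# frequencies of Likert agreement by education level and by religion.
# Column names and category labels are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "response":  ["agree", "disagree", "strongly disagree", "agree", "disagree"],
    "education": ["higher", "secondary", "primary", "higher", "secondary"],
    "religion":  ["catholic", "evangelical", "evangelical", "none", "catholic"],
})

# Share of each agreement level within each education level
print(pd.crosstab(df["education"], df["response"], normalize="index"))

# The same cross-tabulation against religion
print(pd.crosstab(df["religion"], df["response"], normalize="index"))
```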
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
The paper presents a process of cellulose thermal degradation with bio-hydrogen generation and zinc nanostructure synthesis. Zinc nanowires and zinc nanoflowers were produced by a novel process based on cellulose pyrolysis, reforming of the volatiles and direct reduction of ZnO. The bio-hydrogen generated in situ promoted the ZnO reduction, with Zn nanostructures forming by a vapor–solid (VS) route. The cellulose and cellulose/ZnO samples were characterized by thermal analyses (TG/DTG/DTA), and the evolved gases were analyzed by FTIR spectroscopy (TG/FTIR). Hydrogen was detected by temperature-programmed reaction (TPR) tests. The results showed that, in the presence of ZnO, the thermal degradation of cellulose produced larger amounts of H2 than pure cellulose did. The process was also carried out in a tubular furnace under an N2 atmosphere, at temperatures up to 900 °C and at different heating rates. The nanostructure growth was catalyst-free, required no pressure reduction, and occurred at temperatures lower than those required for the carbothermal reduction of ZnO with fossil carbon. The nanostructures were investigated by X-ray diffraction (XRD), scanning electron microscopy (SEM), energy-dispersive X-ray spectroscopy (EDS) and transmission electron microscopy (TEM). The optical properties were investigated by photoluminescence (PL). A mechanism is proposed to explain the synthesis of zinc nanostructures that are crystalline, were obtained without significant re-oxidation, and whose morphologies depend on the heating rates of the process. This route has potential as an industrial process given its simple operating conditions, the low cost of cellulose, and the importance of bio-hydrogen and nanostructured zinc.
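As a side note, the DTG signal mentioned above is simply the first derivative of the thermogravimetric mass-loss curve; a minimal sketch follows, with a synthetic (invented, not measured) sigmoidal TG curve.

```python
# Minimal sketch: computing a DTG curve (dm/dT) from TG data, as used in the
# thermal analyses above. The TG curve here is synthetic, not measured data.
import numpy as np

temperature = np.linspace(25, 900, 500)               # °C
mass = 100 / (1 + np.exp((temperature - 350) / 30))   # synthetic TG curve (wt%)

dtg = np.gradient(mass, temperature)                  # dm/dT (the DTG signal)

# The main degradation step appears as the DTG minimum (fastest mass loss)
print(f"Peak degradation rate at ~{temperature[np.argmin(dtg)]:.0f} °C")
```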
Abstract:
Society's increasing aversion to technological risks requires the development of inherently safer and more environmentally friendly processes, while assuring the economic competitiveness of industrial activities. The different forms of impact (e.g. environmental, economic and societal) are frequently characterized by conflicting reduction strategies and must be taken into account holistically in order to identify the optimal solutions in process design. Though the literature reports an extensive discussion of strategies and specific principles, quantitative assessment tools are required to identify the marginal improvements of alternative design options, to allow trade-offs among contradictory aspects and to prevent "risk shift". In the present work, a set of integrated quantitative tools for design assessment (i.e. a design support system) was developed. The tools are specifically dedicated to implementing sustainability and inherent safety in process and plant design activities for chemical and industrial processes in which substances dangerous to humans and the environment are used or stored. The tools are mainly intended for application in the "conceptual" and "basic design" stages, when the project is still open to changes (owing to the large number of degrees of freedom), which may include strategies to improve sustainability and inherent safety. The set of developed tools covers different phases of the design activities throughout the lifecycle of a project (inventories, process flow diagrams, preliminary plant layout plans). The development of such tools makes a substantial contribution to filling the present gap in the availability of sound supports for implementing safety and sustainability in the early phases of process design. The proposed decision support system is based on a set of leading key performance indicators (KPIs), which ensure the assessment of the economic, societal and environmental impacts of a process (i.e. its sustainability profile). The KPIs are based on impact models (including complex ones) but are easy and swift to apply in practice. Their full evaluation is possible even from the limited data available during early process design. Innovative reference criteria were developed to compare and aggregate the KPIs on the basis of the actual site-specific impact burden and the sustainability policy. Particular attention was devoted to the development of reliable criteria and tools for the assessment of inherent safety at different stages of the project lifecycle. The assessment follows an innovative approach to the analysis of inherent safety, based both on the calculation of the expected consequences of potential accidents and on the evaluation of the hazards related to equipment. The methodology overcomes several problems present in previous methods proposed for quantitative inherent safety assessment (use of arbitrary indexes, subjective judgement, built-in assumptions, etc.). A specific procedure was defined for the assessment of the hazards related to the formation of undesired substances in chemical systems undergoing "out of control" conditions. In the assessment of layout plans, "ad hoc" tools were developed to account for the hazard of domino escalation and for safety economics.
The effectiveness and value of the tools were demonstrated by application to a large number of case studies concerning different kinds of design activities (choice of materials; design of the process, the plant and the layout) and different types of processes/plants (chemical industry, storage facilities, waste disposal). An experimental survey (analysis of the thermal stability of the isomers of nitrobenzaldehyde) provided the input data needed to demonstrate the method for the inherent safety assessment of materials.
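To make the KPI aggregation idea concrete, here is a minimal sketch of normalizing indicators against site-specific references and combining them with policy-driven weights; the indicator names, reference values and weights are invented for illustration and are not the actual KPI set developed in this work.

```python
# Illustrative sketch of KPI normalization and weighted aggregation into a
# sustainability profile score. All names, references and weights are invented.
raw_kpis  = {"economic": 0.72, "societal": 0.40, "environmental": 0.55}
reference = {"economic": 0.90, "societal": 0.50, "environmental": 0.60}  # site-specific burdens
weights   = {"economic": 0.4, "societal": 0.3, "environmental": 0.3}     # sustainability policy

# Normalize each KPI against its reference, then aggregate with the weights
normalized = {k: raw_kpis[k] / reference[k] for k in raw_kpis}
aggregate = sum(weights[k] * normalized[k] for k in normalized)
print(f"Aggregate sustainability score: {aggregate:.2f}")
```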
Abstract:
This study of the process of language shift and maintenance in the bilingual community of Romanians living in Hungary was based on 40 tape-recorded Romanian sociolinguistic interviews. These were transcribed into computerised form and provide an excellent source of sociolinguistic, contact-linguistic and discourse-analysis data, making it possible to show the effect of internal and external factors on the bilingual speech mode. The main topics considered were the choice of Romanian and Hungarian in community interactions, factors in language choice, intralanguage and interlanguage code-switching, reasons for code-switching, the relationship between age and the frequency of code-switching in the interview situation, and the unequal competition of minority and majority languages at school.
Explaining Emergence and Consequences of Specific Formal Controls in IS Outsourcing – A Process-View
Abstract:
IS outsourcing projects often fail to achieve project goals. To prevent such failures, managers need to design formal controls that are tailored to the specific contextual demands. However, the dynamic and uncertain nature of IS outsourcing projects makes the design of such specific formal controls at the outset of a project challenging. Hence, the process of translating high-level project goals into specific formal controls becomes crucial for the success or failure of IS outsourcing projects. Based on a comparative case study of four IS outsourcing projects, our study enhances the current understanding of such translation processes and their consequences by developing a process model that explains the success or failure to achieve high-level project goals as the outcome of two distinct translation patterns. This novel process-based explanation of how and why IS outsourcing projects succeed or fail has important implications for control theory and the IS project escalation literature.
Abstract:
Objective: Compensatory health beliefs (CHBs), defined as beliefs that healthy behaviours can compensate for unhealthy behaviours, may be one possible factor hindering people in adopting a healthier lifestyle. This study examined the contribution of CHBs to the prediction of adolescents’ physical activity within the theoretical framework of the Health Action Process Approach (HAPA). Design: The study followed a prospective survey design with assessments at baseline (T1) and two weeks later (T2). Method: Questionnaire data on physical activity, HAPA variables and CHBs were obtained twice from 430 adolescents of four different Swiss schools. Multilevel modelling was applied. Results: CHBs added significantly to the prediction of intentions and change in intentions, in that higher CHBs were associated with lower intentions to be physically active at T2 and a reduction in intentions from T1 to T2. No effect of CHBs emerged for the prediction of self-reported levels of physical activity at T2 and change in physical activity from T1 to T2. Conclusion: Findings emphasise the relevance of examining CHBs in the context of an established health behaviour change model and suggest that CHBs are of particular importance in the process of intention formation.
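The multilevel modelling referred to above (students nested within schools) can be sketched with statsmodels; the variable names and data below are synthetic stand-ins, and the effect sizes are chosen only to mimic the reported direction (higher CHBs, lower T2 intentions).

```python
# Sketch of a multilevel (mixed-effects) model in the spirit of the analysis
# above: CHBs predicting T2 intentions, with a random intercept per school.
# Variable names and data are synthetic stand-ins for the questionnaire data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 430
df = pd.DataFrame({
    "school":       rng.integers(0, 4, n),   # four schools (the clusters)
    "chb":          rng.normal(0, 1, n),     # compensatory health beliefs
    "intention_t1": rng.normal(0, 1, n),
})
# Outcome built so that higher CHBs go with lower T2 intentions
df["intention_t2"] = (0.5 * df["intention_t1"] - 0.3 * df["chb"]
                      + rng.normal(0, 1, n))

model = smf.mixedlm("intention_t2 ~ chb + intention_t1", df, groups=df["school"])
print(model.fit().summary())
```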
Abstract:
Over the last decade, adverse events and medical errors have become a main focus of interest for quality and safety standards in the U.S. healthcare system (Weinstein & Henderson, 2009). Particularly when a medical error occurs, the disclosure of medical errors and its practices become a focal point of the healthcare process. Patients and family members who have experienced a medical error might be able to provide knowledge and insight on how to improve the disclosure process. However, patients and family members are not typically involved in the disclosure process, and thus their experiences go unnoticed. The purpose of this research was to explore how best to include patients and family members in the disclosure process regarding a medical error. The research consisted of 28 qualitative interviews with three stakeholder groups: Hospital Administrators, Clinical Service Providers, and Patients and Family Members. They were asked for their ideas and suggestions on how best to include patients and family members in the disclosure process. Framework Analysis was used to analyze the data and find prevalent themes based on the primary research question. A secondary aim was to index categories created on the basis of the interviews collected. Data were used from the Texas Disclosure and Compensation Study, with Dr. Eric Thomas as Principal Investigator; full acknowledgement of access to these data is given to Dr. Thomas. The themes from the research revealed that each stakeholder group was interested in and open to including patients and family members in the disclosure process, and that the disclosure process should not be a "one-way" avenue. The themes yielded many suggestions regarding how best to include patients and family members in the disclosure process for a medical error. The secondary aims revealed several ways to assess the ideas and suggestions given by the stakeholders. Overall, acceptability of getting the perspective of patients and family members was the most common theme. Comparison of the stakeholder groups revealed that including patients and family members would be beneficial to improving hospital disclosure practices. The conclusions include a list of recommendations and measurable, appropriate strategies that could provide hospitals with key stakeholder insights on how to improve their disclosure process. Sharing patients' and family members' experiences with healthcare providers can encourage a shift in culture in which patients are valued as active participants in hospital practices. To my knowledge, this research is the first of its kind and moves the disclosure-process conversation forward in a direction of patient and family member inclusion that will assist in improving disclosure practices. Future research should implement and evaluate the success of the various inclusion strategies.
Abstract:
Diluted nitride self-assembled In(Ga)AsN quantum dots (QDs) grown on GaAs substrates are potential candidates to emit in the windows of maximum transmittance for optical fibres (1.3-1.55 μm). In this paper, we analyse the effect of nitrogen addition on the indium desorption occurring during the capping process of InxGa1−xAs QDs (x = 1 and 0.7). The samples were grown by molecular beam epitaxy and studied by transmission electron microscopy (TEM) and photoluminescence techniques. The composition distribution inside the dots was determined by statistical moiré analysis and measured by energy-dispersive X-ray spectroscopy. First, the addition of nitrogen to In(Ga)As QDs gave rise to a strong redshift in the emission peak, together with a large loss of intensity and monochromaticity. Moreover, these samples showed changes in QD morphology as well as an increase in the density of defects. The statistical compositional analysis displayed a normal distribution in InAs QDs, with an average In content of 0.7. Nevertheless, the addition of Ga and/or N leads to a bimodal distribution of the indium content, with two separate QD populations. We suggest that the nitrogen incorporation enhances the indium fixation inside the QDs, and that the indium/gallium ratio plays an important role in this process. The strong redshift observed in the PL should be explained not only by the N incorporation but also by the higher In content inside the QDs.
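One common way to formalize the reported bimodality is to fit a two-component Gaussian mixture to the per-dot indium fractions; the sketch below uses scikit-learn on synthetic compositions (the population means and spreads are invented, not the measured values).

```python
# Sketch: fitting a two-component Gaussian mixture to per-dot indium
# fractions to separate the two QD populations described above.
# The compositions are synthetic stand-ins for the measured data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
indium = np.concatenate([
    rng.normal(0.55, 0.05, 200),   # lower-In population
    rng.normal(0.80, 0.04, 150),   # higher-In population
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(indium)
print("Population means:  ", gmm.means_.ravel())
print("Population weights:", gmm.weights_)
```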
Abstract:
The coagulation of milk is the fundamental process in cheese-making; it is based on gel formation as a consequence of physicochemical changes taking place in the casein micelles, and monitoring the whole process of milk curd formation is a constant preoccupation for dairy researchers and cheese companies (Lagaude et al., 2004). In addition to advances in composition-based applications of near-infrared spectroscopy (NIRS), innovative uses of this technology are pursuing dynamic applications that show promise, especially with regard to tracking a sample in situ during food processing (Bock and Connelly, 2008). Along these lines, the literature describes cheese-making applications of NIRS for determining the curd cutting time, which conclude that NIRS would be a suitable method of monitoring milk coagulation, as shown, for example, in the works published by Fagan et al. (Fagan et al., 2008; Fagan et al., 2007), based on the use of the commercial CoAguLite probe (with an LED at 880 nm and a photodetector for detecting light reflectance).
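A typical way to derive a cutting-time indicator from such a reflectance profile is to locate the maximum of its first derivative (the inflection of the sigmoidal coagulation curve); the sketch below illustrates this general approach on synthetic data and is not the CoAguLite algorithm itself.

```python
# Sketch: locating the inflection of a light-backscatter profile as a
# curd cutting-time indicator. The profile is synthetic; this illustrates
# the general first-derivative approach, not the CoAguLite method.
import numpy as np
from scipy.ndimage import gaussian_filter1d

t = np.linspace(0, 40, 400)                    # minutes after rennet addition
backscatter = 1 / (1 + np.exp(-(t - 18) / 2))  # synthetic sigmoidal response
backscatter += np.random.default_rng(1).normal(0, 0.01, t.size)  # sensor noise

smoothed = gaussian_filter1d(backscatter, sigma=5)
derivative = np.gradient(smoothed, t)
t_inflection = t[np.argmax(derivative)]        # steepest point of gelation

print(f"First-derivative maximum (gelation inflection) at ~{t_inflection:.1f} min")
```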
Abstract:
The main objective of this article is to analyse teaching techniques, ranging from the use of blackboard and chalk in old traditional classes, through slides and overhead projectors in the eighties and presentation software in the nineties, to video, electronic boards and network resources nowadays. Furthermore, all of the aforementioned is viewed in light of the different mentalities through which the teacher conditions the student with each new teaching technique, improving soft skills but possibly leading either to encouragement or to disinterest, and including a lack of consolidation of educational knowledge at the scientific, technological and specific levels. Likewise, we study the process of adaptation required of teachers, the differences in the processes of information transfer and education towards the student, and even the existence of teachers who are no longer engaged by their work, which has become much simpler owing to new technologies and the greater ease in the preparation of classes under the criteria described in the new degree programs adopted by the European Higher Education Area. Moreover, we also seek to understand the evolution of students' profiles, from the eighties to the present, in order to understand certain attitudes, behaviours, accomplishments and acknowledgements acquired over the semesters within the degree programs. As an Educational Innovation Group, another key question also arises: what will the learning techniques of the future be? How will these evolving matters affect, both positively and negatively, the mentality, attitude, behaviour, learning, achievement of goals and satisfaction levels of all the elements involved in university education? Clearly, this evolution from chalk to the electronic board, with the three-dimensional view of our works and their sequencing, greatly facilitates understanding and later adaptation to the business world, but it does not answer the unknowns regarding knowledge and the full development of achievement indicators for the basic skills of a degree. This is the underlying question that steers the roots of the research presented here.
Abstract:
Automated and semi-automated accessibility evaluation tools are key to streamlining the process of accessibility assessment and, ultimately, to ensuring that software products, contents and services meet accessibility requirements. Different evaluation tools may better fit different needs and concerns, accounting for a variety of corporate and external policies, content types, invocation methods, deployment contexts, exploitation models, intended audiences and goals, and for the specific overall process into which they are introduced. This has led to the proliferation of many evaluation tools tailored to specific contexts. However, tool creators, who may not be familiar with the realm of accessibility and may be part of a larger project, lack any systematic guidance when facing the implementation of accessibility evaluation functionalities. Herein we present a systematic approach to the development of accessibility evaluation tools, leveraging the different artifacts and activities of a standardized development process model (the Unified Software Development Process) and providing templates of these artifacts tailored to accessibility evaluation tools. The work presented especially considers the work in progress in this area by the W3C/WAI Evaluation and Repair Tools Working Group (ERT WG).
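As a toy illustration of the kind of rule such an evaluation tool implements, the sketch below flags img elements without an alt attribute using only the Python standard library; it is a single invented example check, not the ERT WG methodology or any official WCAG test procedure.

```python
# Toy sketch of one automated accessibility check: report <img> elements
# that lack an alt attribute. A single illustrative rule, not a full tool.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the start tag
        if tag == "img" and "alt" not in dict(attrs):
            self.violations.append(self.getpos())  # (line, column) of the tag

checker = MissingAltChecker()
checker.feed('<img src="logo.png"><img src="x.png" alt="Logo of X">')
print("img elements missing alt:", checker.violations)
```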
Abstract:
Structuralism is a theory of U.S. constitutional adjudication according to which courts should seek to improve the decision-making process of the political branches of government so as to render it more democratic.1 In the words of John Hart Ely, courts should exercise their judicial-review powers as a 'representation-reinforcing' mechanism.2 Structuralism advocates that courts must eliminate the elements of the political decision-making process that are at odds with the structure set out by the authors of the U.S. Constitution. The advantage of this approach, U.S. scholars posit, lies in the fact that it does not require courts to second-guess the policy decisions adopted by the political branches of government. Instead, courts limit themselves to enforcing the constitutional structure within which those decisions must be adopted. Of course, this theory of constitutional adjudication, like all theories, has its shortcomings. For example, detractors of structuralism argue that it is difficult, if not impossible, to draw the dividing line between 'substantive' and 'structural' matters.3 In particular, they claim that, when identifying the 'structure' set out by the authors of the U.S. Constitution, courts necessarily base their determinations not on purely structural principles but on a set of substantive values, evaluating concepts such as democracy, liberty and equality.4 Without claiming that structuralism should be embraced by the ECJ as the leading theory of judicial review, the purpose of my contribution is to explore how recent case-law reveals that the ECJ has also striven to develop guiding principles which aim to improve the way in which the political institutions of the EU adopt their decisions. In those cases, the ECJ decided not to second-guess the appropriateness of the policy choices made by the EU legislator. Instead, it preferred to examine whether, in reaching an outcome, the EU political institutions had followed the procedural steps mandated by the authors of the Treaties. Stated simply, I argue that judicial deference in relation to 'substantive outcomes' has been counterbalanced by a strict 'process review'. To that effect, I discuss three recent rulings of the ECJ, delivered after the entry into force of the Treaty of Lisbon, in which an EU policy measure was challenged indirectly, i.e. via the preliminary reference procedure: Vodafone, Volker und Markus Schecke and Test-Achats.5 Whilst in the first case the ECJ ruled that the questions raised by the referring court disclosed no factor of such a kind as to affect the validity of the challenged act, in the latter two cases the challenged provisions of an EU act were declared invalid.
Abstract:
Europe needs a new leadership capacity able to recreate stronger European unity on both the external and the internal fronts. Otherwise, anti-European forces will increase their influence and presence in European governments and EU institutions, with large implications for the direction of European integration. This is the central concern of this first short assessment of the recent process of building European leadership capacity for the five years to come. The assessment focuses in particular on the choice of the President of the European Commission, of the President of the European Council and, finally, of the members of the European Commission.