986 results for "Intelligence process"
Abstract:
The first chapter of this book is introductory in character, discussing the basics of brewing. This includes not only the essential ingredients of beer, but also the steps of the process that transforms the raw materials (grains, hops) into fermented and matured beer. Special attention is given to the steps involving an organized action of enzymes, which convert the polymeric macromolecules present in malt (such as proteins and polysaccharides) into simple sugars and amino acids, making them available and assimilable by the yeast during fermentation.
Abstract:
This work focused on how different types of oil phase, MCT (medium-chain triglycerides) and LCT (long-chain triglycerides), influence the gelation of beeswax and thus the properties of the resulting organogels. Organogels were produced at different temperatures, and qualitative phase diagrams were constructed to identify and classify the type of structure formed at various compositions. The microstructure of the gelator crystals was studied by polarized light microscopy. Melting and crystallization were characterized by differential scanning calorimetry and rheology (flow and small-amplitude oscillatory measurements) to understand the organogels' behaviour under different mechanical and thermal conditions. FTIR analysis was employed for further insight into oil-gelator chemical interactions. Results showed that increasing the beeswax concentration led to higher values of the storage, loss and complex moduli (G′, G″, G*) of the organogels, which is associated with the strong network formed between the crystalline gelator structure and the oil phase. Crystallization occurred in two steps during cooling, well evidenced at higher gelator concentrations. Thermal analysis showed hysteresis between melting and crystallization. Small-angle X-ray scattering (SAXS) analysis gave a better understanding of how the crystal conformations were arranged in each type of organogel. Structuring with medium- or long-chain triglyceride oils proved a valuable way to assess the impact of carbon chain length on the gelation process and on the gels' properties.
Abstract:
Hospitals have multiple data sources, such as embedded systems, monitors and sensors. The amount of data available is increasing, and the information is used not only to care for the patient but also to support decision processes. Intelligent environments have been adopted in health care institutions because of their ability to provide useful information to health professionals, whether in helping to identify a prognosis or in understanding a patient's condition. From this concept arises the present intelligent system for tracking patient condition (e.g. critical events) in health care. The system has the great advantage of being adaptable to the environment and to user needs. It focuses on identifying critical events in streaming data (e.g. vital signs and ventilation), which is particularly valuable for understanding the patient's condition. This work aims to demonstrate the process of creating an intelligent system capable of operating in a real environment using streaming data provided by ventilators and vital-sign monitors. Its development is important for the physician because it becomes possible to cross-reference multiple variables in real time, analysing whether a value is critical and whether its variation has clinical importance.
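The core idea of flagging critical events from a stream of vital signs can be sketched as below. This is a minimal illustration, not the system described in the abstract: the variable names and the normal-range limits are assumptions chosen for the example, and a real implementation would use clinically validated thresholds and the ventilator variables the authors mention.

```python
# Hypothetical normal ranges per monitored variable (assumed, not clinical).
NORMAL_RANGES = {
    "heart_rate": (60, 100),   # beats/min
    "spo2": (94, 100),         # % oxygen saturation
    "resp_rate": (12, 20),     # breaths/min
}

def check_event(reading):
    """Return the list of variables in a reading that fall outside their range."""
    critical = []
    for name, value in reading.items():
        low, high = NORMAL_RANGES[name]
        if not (low <= value <= high):
            critical.append(name)
    return critical

def stream_monitor(readings):
    """Yield (reading, critical_vars) for each reading with at least one critical value."""
    for reading in readings:
        critical = check_event(reading)
        if critical:
            yield reading, critical

# Example stream: the second reading is critical on all three variables.
stream = [
    {"heart_rate": 72, "spo2": 97, "resp_rate": 16},
    {"heart_rate": 118, "spo2": 89, "resp_rate": 24},
]
alerts = list(stream_monitor(stream))
```

Because the monitor is a generator over the incoming readings, the same structure works whether the source is a list, a socket, or a message queue, which matches the streaming setting the abstract describes.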
Abstract:
For the great majority of teachers, becoming a better professional means their students achieving the greatest possible success (Guskey, 2002), and this is the main focus of this report. The interconnection between the four intervention areas of the teaching practicum allows each one to be strengthened. Reflection on personal and professional development as a trainee teacher, grounded in a positive sense of self-efficacy (Jardim & Onofre, 2009) and in emotional intelligence (Mouton, Hansenne, Delcour & Coles, 2013), was fundamental to sound classroom management through the ecological model (Hastie & Siedentop, 1999), together with the four dimensions of successful pedagogical intervention (Siedentop, 1983, cited in Onofre, 1995). It thus becomes essential to understand the evolution of, and differences between, the students, as they are the centre of the teaching-learning process. The teacher's work therefore begins long before the classroom (Fenstermacher & Soltis, 1986). The school community of Escola Secundária José Gomes Ferreira understands that the subject has educational benefits, but is not always able to support it. It is therefore necessary to promote positive experiences in the first cycle of schooling, in order to develop positive attitudes towards Physical Education.
Abstract:
Among the factors that help predict academic achievement are those reflecting cognitive abilities (e.g. intelligence) and those individual differences considered non-cognitive (e.g. personality traits). In recent years, General Knowledge (GK) has also been considered a criterion for academic success (see Ackerman, 1997), since prior knowledge has been shown to aid the acquisition of new knowledge (Hambrick & Engle, 2001). One of the goals of educational psychology is to identify the main variables that explain academic achievement and to propose theoretical models that explain the relations among these variables. The PPIK model (Intelligence-as-Process, Personality, Interests and Intelligence-as-Knowledge) proposed by Ackerman (1996) holds that the knowledge and skills acquired in a particular domain are the result of the cognitive resources a person devotes over a prolonged period of time. The model proposes that personality traits, individual/vocational interests and motivational aspects are integrated as trait complexes that determine the direction and intensity of the cognitive resources a person devotes to learning (Ackerman, 2003). In our context (Córdoba, Argentina), a group of researchers has developed a series of technical resources needed to assess some of the constructs proposed by this model. However, we do not yet have a measure of General Knowledge.
Therefore, this project proposes the construction of an instrument to measure General Knowledge (GK), essential for establishing parameters on the knowledge level of the university population and, in future work, for testing the postulates of the PPIK theory (Ackerman, 1996).
Abstract:
As digital image processing techniques become increasingly used in a broad range of consumer applications, the critical need to evaluate algorithm performance has become recognised by developers as an area of vital importance. With digital image processing algorithms now playing a greater role in security and protection applications, it is of crucial importance that we are able to study their performance empirically. Apart from the field of biometrics, little emphasis has been put on algorithm performance evaluation until now, and where evaluation has taken place, it has been carried out in a somewhat cumbersome and unsystematic fashion, without any standardised approach. This paper presents a comprehensive testing methodology and framework aimed at automating the evaluation of image processing algorithms. Ultimately, the test framework aims to shorten the algorithm development life cycle by helping to identify algorithm performance problems quickly and more efficiently.
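The evaluation loop such a framework automates can be sketched as follows. This is a toy illustration in the spirit of the abstract, not the paper's framework: the metric, the threshold, and the flattened 1-D "images" are assumptions made to keep the example self-contained.

```python
def mean_absolute_error(output, reference):
    """Pixel-wise mean absolute error between two equally sized images."""
    assert len(output) == len(reference)
    return sum(abs(o - r) for o, r in zip(output, reference)) / len(output)

def evaluate(algorithm, test_cases, metric, threshold):
    """Run each (input, reference) pair through the algorithm and score it.

    A case passes when its metric score is at or below the given threshold.
    """
    results = []
    for image, reference in test_cases:
        score = metric(algorithm(image), reference)
        results.append({"score": score, "passed": score <= threshold})
    return results

# Toy example: an identity "algorithm" evaluated on flattened 1-D images.
cases = [([10, 20, 30], [10, 20, 30]), ([0, 0, 0], [1, 1, 1])]
report = evaluate(lambda img: img, cases, mean_absolute_error, threshold=0.5)
```

Separating the algorithm, the test cases, and the metric into independent arguments is what lets the same harness re-run automatically as the algorithm evolves, which is the life-cycle benefit the abstract emphasises.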
Abstract:
Surgeons may use a number of cutting instruments, such as osteotomes and chisels, to cut bone during an operative procedure. The initial loading of cortical bone during the cutting process results in the formation of microcracks in the vicinity of the cutting zone, with main crack propagation to failure occurring under continued loading. When a material cracks, energy is emitted in the form of Acoustic Emission (AE) signals that spread in all directions; AE transducers can therefore be used to monitor the occurrence and development of microcracking and crack propagation in cortical bone. In this research, the number of AE signals (hits) and related parameters, including amplitude, duration and absolute energy (abs-energy), were recorded during the indentation cutting of cortical bone specimens by a wedge blade. The cutting force was also measured to correlate the load-displacement curves with the output from the AE sensor. The experimental results show that AE signals increase substantially during loading just prior to fracture, between 90% and 100% of the maximum fracture load. Furthermore, an amplitude threshold of 64 dB (with an approximate abs-energy of 1500 aJ) was established to separate AE signals associated with microcracking (41-64 dB) from fracture-related signals (65-98 dB). The results also demonstrated that the complete fracture event, which had the highest duration value, can be distinguished from other growing macrocracks that did not lead to catastrophic fracture. It was observed that main crack initiation may be detected by capturing a high-amplitude signal at a mean load of 87% of the maximum load, and that unsteady crack propagation may occur just prior to the final fracture event at a mean load of 96% of the maximum load. The author concludes that the AE method is useful for understanding crack initiation and fracture during the indentation cutting process.
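The amplitude-based separation the abstract reports is a simple thresholding rule, sketched below. The dB ranges come directly from the abstract (41-64 dB for microcracking, 65-98 dB for fracture-related hits); the "noise" label for amplitudes outside both ranges is an assumption added to make the classifier total.

```python
# Amplitude ranges in dB, taken from the abstract.
MICROCRACK_RANGE = (41, 64)
FRACTURE_RANGE = (65, 98)

def classify_hit(amplitude_db):
    """Label a single AE hit by its amplitude in dB."""
    if MICROCRACK_RANGE[0] <= amplitude_db <= MICROCRACK_RANGE[1]:
        return "microcrack"
    if FRACTURE_RANGE[0] <= amplitude_db <= FRACTURE_RANGE[1]:
        return "fracture"
    return "noise"  # outside both reported ranges (assumed label)

# Example hit amplitudes spanning both ranges plus one sub-threshold value.
hits = [45, 58, 64, 65, 72, 98, 30]
labels = [classify_hit(a) for a in hits]
```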
Abstract:
Univariate statistical control charts, such as the Shewhart chart, do not satisfy the requirements for process monitoring on a high-volume automated fuel cell manufacturing line because of the number of variables that require monitoring. The risk of elevated false alarms, due to the high-volume nature of the process, can present problems if univariate methods are used. Multivariate statistical methods are discussed as an alternative for process monitoring and control. The research presented is conducted on a manufacturing line which evaluates the performance of a fuel cell. The line has three stages of production assembly that contribute to the performance of the final end product. Product performance is assessed by power and energy measurements taken at various time points throughout the discharge testing of the fuel cell. The multivariate techniques identified in the literature review are evaluated using individual and batch observations. Modern techniques using multivariate control charts based on Hotelling's T² are compared with other multivariate methods, such as Principal Component Analysis (PCA). The latter, PCA, was identified as the most suitable method. Control charts, such as scores, T² and DModX charts, are constructed from the PCA model. Diagnostic procedures using contribution plots, for out-of-control points detected with these charts, are also discussed; these plots enable the investigator to perform root cause analysis. Multivariate batch techniques are compared to the individual observations typically seen in continuous processes. Recommendations for the introduction of multivariate techniques appropriate for most high-volume processes are also covered.
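The combination of PCA and Hotelling's T² described above can be sketched in a few lines: fit PCA on in-control data, then score a new observation by how far its projection lies in the retained principal-component subspace. This is a generic illustration on synthetic standardized data, not the fuel-cell model from the thesis; the number of retained components and the data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))             # in-control training data: 50 obs, 4 vars
X = (X - X.mean(axis=0)) / X.std(axis=0) # standardize each variable

# PCA via SVD of the standardized data; retain the first k components.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
var = s[:k] ** 2 / (len(X) - 1)          # variance along each retained PC

def t_squared(x):
    """Hotelling's T^2 of one observation in the k-dimensional PC subspace."""
    t = Vt[:k] @ x                       # scores of x on the retained PCs
    return float(np.sum(t ** 2 / var))   # scale each score by its PC variance

# An observation far along the first PC scores much higher than a training point,
# which is what the T^2 chart flags as out of control.
outlier_t2 = t_squared(10 * Vt[0])
normal_t2 = t_squared(X[0])
```

In practice the T² values are plotted against a control limit derived from an F-distribution, and an out-of-control point is then broken down with contribution plots, as the abstract describes; a DModX-style chart would additionally monitor the residual distance off the PC subspace, which T² alone does not see.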
Abstract:
The impending introduction of lead-free solder in the manufacture of electrical and electronic products has presented the electronics industry with many challenges. European manufacturers must transfer from a tin-lead process to a lead-free process by July 2006 as a result of the publication of two directives from the European Parliament. Tin-lead solders have been used for mechanical and electrical connections on printed circuit boards for over fifty years, and considerable process knowledge has been accumulated. Extensive literature reviews were conducted on the topic, and as a result it was found that there are many implications to consider in the introduction of lead-free solder. One particular question that requires answering is: can lead-free solder be used in existing manufacturing processes? The purpose of this research is to conduct a comparative study of a tin-lead solder and a lead-free solder in two key surface mount technology (SMT) processes: the stencil printing process and the reflow soldering process. Unreplicated fractional factorial experimental designs were used to carry out the studies. The quality of paste deposition, in terms of height and volume, was the characteristic of interest in the stencil printing process. The quality of solder joints produced in the reflow soldering experiment was assessed using X-ray and cross-sectional analysis. This provided qualitative data that was then uniquely scored and weighted using a method developed during the research. Nested experimental design techniques were then used to analyse the resulting quantitative data. Predictive models were developed that allowed the optimisation of both processes. Results from both experiments show that solder joints of comparable quality to those produced using tin-lead solder can be produced using lead-free solder in current SMT processes.
Abstract:
Dissertation, University of Magdeburg, Fakultät für Verfahrens- und Systemtechnik, 2012
Abstract:
Limestone, calcination, normal shaft kiln, process simulation, temperature profiles
Abstract:
Software engineering, software measurement, software process engineering, capability, maturity