1000 results for sawmill process
Abstract:
Business Intelligence (BI) can be seen as a method that gathers information and data from information systems in order to help companies be more accurate in their decision-making process. Traditionally, BI systems have been associated with the use of Data Warehouses (DW). The prime purpose of a DW is to serve as a repository that stores all the relevant information required for making the correct decision. The need to integrate streaming data became crucial as the efficiency and effectiveness of the decision process had to be improved. In primary and secondary education, there is a lack of BI solutions. Given the reality of schools, the main purpose of this study is to provide a Pervasive BI solution able to monitor school and student data anywhere and at any time, in real time, as well as to disseminate the information through ubiquitous devices. The first task consisted of gathering data regarding the different choices made by each student from enrolment in a given school year until its end. Thereafter, a dimensional model was developed so that a BI platform could be built. This paper presents the dimensional model, a set of pre-defined indicators, the Pervasive Business Intelligence characteristics and the prototype designed. The main contribution of this study is to offer schools a tool that can help them make accurate decisions in real time. Data dissemination was achieved through a localized application that can be accessed anywhere and at any time.
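As a rough illustration of how a star-schema dimensional model of this kind can feed pre-defined indicators, the sketch below joins a toy fact table with two dimension tables in pandas; the table and column names (dim_school, fact_enrolment, dropout_rate) are hypothetical and do not reflect the paper's actual model.

```python
# Minimal sketch of a star schema backing a pre-defined indicator.
# Table and column names are illustrative only.
import pandas as pd

# Dimension tables (one row per member).
dim_school = pd.DataFrame({"school_id": [1, 2], "school_name": ["School A", "School B"]})
dim_time = pd.DataFrame({"time_id": [1, 2], "school_year": ["2013/14", "2014/15"]})

# Fact table (one row per student enrolment), keyed by the dimensions.
fact_enrolment = pd.DataFrame({
    "school_id": [1, 1, 2, 2, 2],
    "time_id": [1, 2, 1, 2, 2],
    "student_id": [10, 11, 12, 13, 14],
    "dropped_out": [0, 1, 0, 0, 1],
})

# Example indicator: enrolments and drop-out rate per school and school year.
indicator = (
    fact_enrolment
    .merge(dim_school, on="school_id")
    .merge(dim_time, on="time_id")
    .groupby(["school_name", "school_year"])
    .agg(enrolments=("student_id", "count"), dropout_rate=("dropped_out", "mean"))
    .reset_index()
)
print(indicator)
```

In a real deployment the fact and dimension tables would live in the DW and the indicator would be a pre-defined query or dashboard widget; the pandas version is only meant to show the shape of the model.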
Abstract:
Children are an especially vulnerable population, particularly with respect to drug administration. It is estimated that neonatal and pediatric patients are at least three times more vulnerable than adults to harm from adverse events and medication errors. The development of this framework is intended to provide a Clinical Decision Support System based on a prototype already tested in a real environment. The framework will include features such as the preparation of Total Parenteral Nutrition prescriptions, tables of pediatric and neonatal emergency drugs, medical scales of morbidity and mortality, anthropometry percentiles (weight, length/height, head circumference and BMI), utilities to support medical decisions on the treatment of neonatal jaundice and anemia, support for technical procedures, and other calculators and widely used tools. The solution under development is an extension of the INTCare project. The main goal is to make this functionality available at all times during clinical practice, as well as outside the hospital environment for dissemination, education and simulation of hypothetical situations. A further aim is to develop an area for the study and analysis of information and the extraction of knowledge from the data collected through use of the system. This paper presents the architecture, its requirements and functionalities, and a SWOT analysis of the proposed solution.
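As a small illustration of the kind of anthropometry calculator such a framework might expose, the sketch below computes BMI and an LMS-based z-score; the function names and, in particular, the L/M/S reference values are placeholders for illustration only, not real WHO/CDC reference data or the framework's actual implementation.

```python
# Minimal sketch of an anthropometry utility: BMI plus an LMS-based z-score.
# The L, M, S values used below are placeholders, NOT real reference data.
import math


def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index = weight / height^2."""
    return weight_kg / (height_m ** 2)


def lms_zscore(x: float, L: float, M: float, S: float) -> float:
    """Standard LMS z-score: ((x/M)**L - 1) / (L*S), or ln(x/M)/S when L == 0."""
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)


if __name__ == "__main__":
    value = bmi(12.5, 0.86)  # a toddler's weight (kg) and length (m)
    # Placeholder L, M, S values for the child's age/sex group.
    print(round(value, 1), round(lms_zscore(value, L=-0.5, M=16.0, S=0.08), 2))
```

A production tool would load the official growth-reference tables and select L, M and S by age and sex before computing the z-score and the corresponding percentile.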
Abstract:
When representing the requirements for an intended software solution during the development process, a logical architecture is a model that provides an organized view of how functionalities behave, regardless of the technologies to be implemented. If the logical architecture represents an ambient assisted living (AAL) ecosystem, such a representation is a complex task due to the existence of interrelated multidomains, which, most of the time, results in incomplete and incoherent user requirements. In this chapter, we present the results obtained when applying process-level modeling techniques to the derivation of the logical architecture for a real industrial AAL project. We adopt a V-Model-based approach that expresses the AAL requirements from a process-level perspective, instead of the traditional product-level view. Additionally, we ensure compliance of the derived logical architecture with the National Institute of Standards and Technology (NIST) reference architecture as nonfunctional requirements to support the implementation of the AAL architecture in cloud contexts.
Abstract:
This study evaluated the effect of different cooking processes (roasting, cooking and frying) on the total mercury (Hg) content of the fish species most consumed by Manaus residents and surrounding communities in the Amazon region. The results obtained for total Hg in natura (raw) and after the three types of preparation for 12 fish species showed significant variation in Hg concentration. In the present study, the cooking and frying processes resulted in higher Hg losses for the Pacu, Pescada, Jaraqui, Curimatã, Surubin and Aruanã species, most of which have detritivorous or carnivorous feeding habits. In the roasting process, higher Hg losses occurred for the Sardinha, Aracu, Tucunaré, Pirapitinga, Branquinha and Tambaqui species, most of which are omnivorous or herbivorous. Some micronutrients (Ca, Fe, K, Na, Se and Zn) were also determined in the raw fish in order to perform a nutritional evaluation regarding these micronutrients.
Abstract:
Integrated master's dissertation in Mechanical Engineering
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management (Engenharia e Gestão de Sistemas de Informação)
Abstract:
Supercritical fluid technology has been the target of many pharmaceutical investigations into particle production for almost 35 years, owing to the great advantages it offers over other technologies currently used for the same purpose. A brief history is presented, as well as a classification of supercritical technologies based on the role that the supercritical fluid (carbon dioxide) plays in the process.
Abstract:
A highly robust hydrogel device made from a single biopolymer formulation is reported. Owing to the presence of covalent and non-covalent crosslinks, these engineered systems were able to (i) sustain a compressive strength of ca. 20 MPa, (ii) quickly recover upon unloading, and (iii) encapsulate cells with high viability rates.
Abstract:
Load-bearing soft tissues such as cartilage, blood vessels and muscles are able to withstand a remarkable compressive stress of several MPa without fracturing. Interestingly, most of these structural tissues are mainly composed of water; in this regard, hydrogels, as highly hydrated 3D-crosslinked polymeric networks, constitute a promising class of materials to repair lesions in these tissues. Although several approaches can be employed to tailor the mechanical properties of artificial hydrogels to mimic those found in biological tissues, critical issues regarding, for instance, their biocompatibility and recoverability after loading are often neglected. Therefore, an innovative hydrogel device made only of chitosan (CHI) was developed for the repair of robust biological tissues. These systems were fabricated through a dual-crosslinking process comprising a photo-crosslinking and an ionic-crosslinking step. The obtained CHI-based hydrogels exhibited an outstanding compressive strength of ca. 20 MPa at 95% strain, which is several orders of magnitude higher than that of the individual components and close to the values found in native soft tissues. Additionally, both crosslinking processes occur rapidly and under physiological conditions, enabling cell encapsulation, as confirmed by high cell survival rates (ca. 80%). Furthermore, in contrast with conventional hydrogels, these networks quickly recover upon unloading and are able to keep their mechanical properties under physiological conditions as a result of their non-swelling nature.
Abstract:
The first chapter of this book has an introductory character, discussing the basics of brewing. This includes not only the essential ingredients of beer, but also the steps in the process that transform the raw materials (grains, hops) into fermented and matured beer. Special attention is given to the processes involving an organized action of enzymes, which convert the polymeric macromolecules present in malt (such as proteins and polysaccharides) into simple sugars and amino acids, making them available/assimilable to the yeast during fermentation.
Abstract:
This work focused on how different types of oil phase, MCT (medium-chain triglycerides) and LCT (long-chain triglycerides), influence the gelation of beeswax and thus the properties of the resulting organogels. Organogels were produced at different temperatures, and qualitative phase diagrams were constructed to identify and classify the type of structure formed at various compositions. The microstructure of the gelator crystals was studied by polarized light microscopy. Melting and crystallization were characterized by differential scanning calorimetry and rheology (flow and small-amplitude oscillatory measurements) to understand the organogels' behaviour under different mechanical and thermal conditions. FTIR analysis was employed to further understand oil-gelator chemical interactions. Results showed that increasing the beeswax concentration led to higher values of the storage and loss moduli (G′, G″) and of the complex modulus (G*) of the organogels, which is associated with the strong network formed between the crystalline gelator structure and the oil phase. Crystallization occurred in two steps (well evidenced at higher gelator concentrations) as the temperature decreased. Thermal analysis showed hysteresis between melting and crystallization. Small-angle X-ray scattering (SAXS) analysis allowed a better understanding of how the crystal conformations were arranged in each type of organogel. Structuring with medium- or long-chain triglyceride oils proved an important means of assessing the impact of carbon chain length on the gelation process and on the gels' properties.
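For readers less familiar with the oscillatory measurements mentioned above, the complex modulus is related to the storage and loss moduli by the standard rheological identity (not stated in the abstract itself):

```latex
% Magnitude of the complex modulus and loss tangent from the storage (G') and loss (G'') moduli
|G^{*}| = \sqrt{(G')^{2} + (G'')^{2}}, \qquad \tan\delta = \frac{G''}{G'}
```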
Abstract:
As digital image processing techniques become increasingly used in a broad range of consumer applications, the need to evaluate algorithm performance has become recognised by developers as an area of vital importance. With digital image processing algorithms now playing a greater role in security and protection applications, it is crucial that we are able to study their performance empirically. Apart from the field of biometrics, little emphasis has been placed on algorithm performance evaluation until now, and where evaluation has taken place, it has been carried out in a somewhat cumbersome and unsystematic fashion, without any standardised approach. This paper presents a comprehensive testing methodology and framework aimed at automating the evaluation of image processing algorithms. Ultimately, the test framework aims to shorten the algorithm development life cycle by helping to identify algorithm performance problems quickly and more efficiently.
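As a rough sketch of what such automated evaluation can look like, the example below runs an algorithm over a set of test cases and flags failures against a metric threshold; the names (TestCase, mean_absolute_error, evaluate) and the toy data are illustrative and not part of the paper's framework.

```python
# Minimal sketch of an automated evaluation harness for image processing
# algorithms; the algorithm, metric and threshold below are illustrative.
from dataclasses import dataclass
from typing import Callable, Dict, List

import numpy as np


@dataclass
class TestCase:
    name: str
    image: np.ndarray         # input image
    ground_truth: np.ndarray  # expected output (e.g. a binary mask)


def mean_absolute_error(output: np.ndarray, truth: np.ndarray) -> float:
    """A simple pixel-wise metric; a real framework would plug in many metrics."""
    return float(np.mean(np.abs(output.astype(float) - truth.astype(float))))


def evaluate(algorithm: Callable[[np.ndarray], np.ndarray],
             cases: List[TestCase],
             metric: Callable[[np.ndarray, np.ndarray], float] = mean_absolute_error,
             threshold: float = 0.1) -> Dict[str, bool]:
    """Run the algorithm over every test case and flag failures automatically."""
    results = {}
    for case in cases:
        score = metric(algorithm(case.image), case.ground_truth)
        results[case.name] = score <= threshold  # True means the case passed
    return results


if __name__ == "__main__":
    # Toy example: an "identity" algorithm evaluated on random data.
    rng = np.random.default_rng(0)
    cases = [TestCase(f"case_{i}", rng.random((8, 8)), rng.random((8, 8)))
             for i in range(3)]
    print(evaluate(lambda img: img, cases))
```

The point of such a harness is that regressions surface as failed cases on every run, rather than being discovered ad hoc late in the development cycle.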
Abstract:
Surgeons may use a number of cutting instruments, such as osteotomes and chisels, to cut bone during an operative procedure. The initial loading of cortical bone during the cutting process results in the formation of microcracks in the vicinity of the cutting zone, with main crack propagation to failure occurring under continued loading. When a material cracks, energy is emitted in the form of Acoustic Emission (AE) signals that spread in all directions; AE transducers can therefore be used to monitor the occurrence and development of microcracking and crack propagation in cortical bone. In this research, the number of AE signals (hits) and related parameters, including amplitude, duration and absolute energy (abs-energy), were recorded during the indentation cutting of cortical bone specimens by a wedge blade. The cutting force was also measured in order to correlate the load-displacement curves with the output from the AE sensor. The experimental results show that AE signals increase substantially during loading just prior to fracture, between 90% and 100% of the maximum fracture load. Furthermore, an amplitude threshold of 64 dB (with an approximate abs-energy of 1500 aJ) was established to separate AE signals associated with microcracking (41-64 dB) from fracture-related signals (65-98 dB). The results also demonstrated that the complete fracture event, which had the highest duration value, can be distinguished from other growing macrocracks that did not lead to catastrophic fracture. It was observed that main crack initiation may be detected by capturing a high-amplitude signal at a mean load of 87% of the maximum load, and that unsteady crack propagation may occur just prior to the final fracture event at a mean load of 96% of the maximum load. The author concludes that the AE method is useful in understanding crack initiation and fracture during the indentation cutting process.
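As a minimal sketch of how recorded hits could be separated using the 64 dB amplitude boundary reported above, the snippet below labels hits as microcracking-related or fracture-related; the AEHit structure and the sample values are illustrative, not the study's recorded data.

```python
# Minimal sketch of amplitude-threshold classification of AE hits,
# using the 64 dB boundary reported in the abstract. Example data only.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class AEHit:
    amplitude_db: float   # peak amplitude of the hit (dB)
    duration_us: float    # hit duration (microseconds)
    abs_energy_aj: float  # absolute energy (attojoules)


def classify_hit(hit: AEHit, threshold_db: float = 64.0) -> str:
    """Label a hit as microcracking or fracture-related by its amplitude."""
    return "microcracking" if hit.amplitude_db <= threshold_db else "fracture"


def classify_hits(hits: List[AEHit]) -> Dict[str, int]:
    """Count hits in each class, e.g. to track the build-up before failure."""
    counts = {"microcracking": 0, "fracture": 0}
    for hit in hits:
        counts[classify_hit(hit)] += 1
    return counts


if __name__ == "__main__":
    sample = [AEHit(45, 120, 300), AEHit(62, 200, 1200), AEHit(78, 900, 5400)]
    print(classify_hits(sample))  # {'microcracking': 2, 'fracture': 1}
```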
Abstract:
Univariate statistical control charts, such as the Shewhart chart, do not satisfy the requirements for process monitoring on a high-volume automated fuel cell manufacturing line, because of the number of variables that require monitoring. On such a high-volume process, the elevated risk of false alarms can present problems if univariate methods are used. Multivariate statistical methods are discussed as an alternative for process monitoring and control. The research presented was conducted on a manufacturing line that evaluates the performance of a fuel cell. It has three stages of production assembly that contribute to the final end-product performance. Product performance is assessed by power and energy measurements taken at various time points throughout the discharge testing of the fuel cell. The multivariate techniques identified in the literature review are evaluated using individual and batch observations. Modern techniques using multivariate control charts based on Hotelling's T2 are compared with other multivariate methods, such as Principal Components Analysis (PCA). The latter, PCA, was identified as the most suitable method. Control charts such as scores, T2 and DModX charts are constructed from the PCA model. Diagnostic procedures using contribution plots for out-of-control points detected by these control charts are also discussed; these plots enable the investigator to perform root cause analysis. Multivariate batch techniques are compared with the individual observations typically seen on continuous processes. Recommendations for the introduction of multivariate techniques appropriate for most high-volume processes are also covered.
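As a rough illustration of PCA-based monitoring with a Hotelling's T2 statistic, in the spirit of the charts described above, the sketch below fits a PCA model on in-control reference data and flags new observations that exceed an F-distribution-based limit; the data, the number of retained components and the 99% limit are illustrative choices, not the thesis' actual model.

```python
# Minimal sketch of PCA-based multivariate monitoring with Hotelling's T^2.
# Data, number of components and the 99% control limit are illustrative.
import numpy as np
from scipy.stats import f
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 10))      # in-control reference data (rows x variables)
X_new = rng.normal(size=(20, 10)) + 0.5   # new observations to monitor

scaler = StandardScaler().fit(X_train)
n_components = 3                          # retained principal components (A)
pca = PCA(n_components=n_components).fit(scaler.transform(X_train))


def hotelling_t2(X: np.ndarray) -> np.ndarray:
    """T^2 of each observation from its PCA scores, scaled by the score variances."""
    scores = pca.transform(scaler.transform(X))
    return np.sum(scores**2 / pca.explained_variance_, axis=1)


# F-distribution-based upper control limit for T^2 with A components, n training rows.
n, A, alpha = X_train.shape[0], n_components, 0.01
ucl = A * (n - 1) * (n + 1) / (n * (n - A)) * f.ppf(1 - alpha, A, n - A)

t2_new = hotelling_t2(X_new)
print("Out-of-control points:", np.where(t2_new > ucl)[0])
```

A DModX-style chart would additionally monitor the residual distance of each observation from the PCA model plane, and contribution plots would break a flagged T2 value down by variable for root cause analysis.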
Abstract:
The impending introduction of lead-free solder in the manufacture of electrical and electronic products has presented the electronics industry with many challenges. European manufacturers must transfer from a tin-lead process to a lead-free process by July 2006 as a result of the publication of two directives by the European Parliament. Tin-lead solders have been used for mechanical and electrical connections on printed circuit boards for over fifty years, and considerable process knowledge has been accumulated. Extensive literature reviews conducted on the topic found that there are many implications to be considered with the introduction of lead-free solder. One particular question that requires answering is: can lead-free solder be used in existing manufacturing processes? The purpose of this research is to conduct a comparative study of a tin-lead solder and a lead-free solder in two key surface mount technology (SMT) processes: the stencil printing process and the reflow soldering process. Unreplicated fractional factorial experimental designs were used to carry out the studies. The quality of paste deposition, in terms of deposit height and volume, was the characteristic of interest in the stencil printing process. The quality of the solder joints produced in the reflow soldering experiment was assessed using X-ray and cross-sectional analysis. This provided qualitative data that was then uniquely scored and weighted using a method developed during the research. Nested experimental design techniques were then used to analyse the resulting quantitative data. Predictive models were developed that allowed for the optimisation of both processes. Results from both experiments show that solder joints of comparable quality to those produced using tin-lead solder can be produced using lead-free solder in current SMT processes.