Abstract:
An optimization tool has been developed to help companies optimize their production cycles and thereby improve their overall supply chain management processes. The application combines the functionality of traditional APS (Advanced Planning System) and ARP (Automatic Replenishment Program) systems into a single optimization run. A qualitative study was organized to investigate opportunities for expanding the product's market base. Twelve personal interviews were conducted, and the results were compiled into industry-specific production planning analyses. Five process industries were analyzed to identify the product's suitability for each industry sector and the most important product development areas. Based on the research, the paper and plastic film industries remain the most promising industry sectors at this point. To succeed in other industry sectors, some product enhancements would be required, including capabilities to optimize multiple sequential and parallel production cycles, to handle the sequencing of complex finishing operations, and to include master planning capabilities that support overall supply chain optimization. In product sales and marketing, the key to success is finding and reaching the people who deal directly with the problems the optimization tool can help solve.
Abstract:
Intermolecular forces are a useful concept that can explain the attraction between particulate matter as well as numerous phenomena in our lives, such as viscosity, solubility, drug interactions, and the dyeing of fibers. However, studies show that students have difficulty understanding this important concept, which led us to develop free educational software in English and Portuguese. The software can be used interactively by teachers and students, thus facilitating better understanding. Graduate and undergraduate professors and students were surveyed, using a Likert scale, about the software's quality, intuitiveness of use, ease of navigation, and pedagogical applicability. The results led to the conclusion that the developed application can serve as an auxiliary tool to assist teachers in their lectures and students in learning about intermolecular forces.
Abstract:
A spectrophotometric technique was developed to measure the concentration of Penicillium allii conidia in damp chamber experiments. A negative linear relationship (R² = 0.56) was observed between transmittance at 340 nm and the concentration of P. allii conidia in 0.05% water agar. The equation relating transmittance (T) to concentration in conidia mL⁻¹ (y) is: y = 9.3 × 10⁶ − 86,497T. The method was assayed by inoculating 43 P. allii strains onto two garlic cultivars. It proved more rapid than the traditional hemocytometer count, with improved precision: the CV of the number of conidia per hemocytometer reticule was 35.04%, while the transmittance CV was 2.73%. The working limits chosen for T were 40 and 80, because the sensitivity of the method decreased when conidia concentrations fell outside this range.
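For illustration, the reported calibration can be applied directly in code. The sketch below (class and method names are hypothetical) implements the published equation and rejects transmittance values outside the validated 40-80% range.

```java
// Minimal sketch of the reported calibration: conidia concentration (y, in
// conidia per mL) estimated from transmittance T (%) at 340 nm as
// y = 9.3e6 - 86497 * T, valid only for T between 40 and 80.
public class ConidiaCalibration {
    static double conidiaPerMl(double transmittancePercent) {
        if (transmittancePercent < 40 || transmittancePercent > 80) {
            throw new IllegalArgumentException(
                "Outside the validated range (40-80% T); sensitivity drops off.");
        }
        return 9.3e6 - 86_497 * transmittancePercent;
    }

    public static void main(String[] args) {
        System.out.printf("T = 60%% -> %.0f conidia/mL%n", conidiaPerMl(60));
    }
}
```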
Abstract:
The aim of this research was to investigate the current status of procurement at Tikkurila Oyj's Vantaa factories, to develop the reporting of purchases and purchase warehouses, and to measure these activities during the implementation of a new purchasing tool. The implemented purchasing tool was based on ABC analysis. Its reports were used to examine the importance of performance measurement for the company's operations and the goal of bringing transparency to the purchasing side of the company's supply chain. Successful purchasing and materials operations call for accurate knowledge and professional skills. The research showed that a separate purchasing tool that analyzes existing data from the company's production management system can add value across the whole supply chain. The tool's analyses and reports enable a more harmonized purchasing process at the operative level and create a basis for internal targets and their follow-up. At the same time, the analyses give management a fresh view of the current status and development trends of procurement. Better exploitation of information technology enables full transparency for the case company's purchasing department.
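As a rough illustration of the ABC analysis underlying such a tool, the sketch below ranks purchase items by annual consumption value and assigns classes by cumulative share. The 80/15/5 class boundaries are the conventional textbook defaults, not values from the thesis, and the item data are invented.

```java
// ABC analysis sketch: items sorted by descending annual consumption value,
// classified by cumulative share of total value (A <= 80%, B <= 95%, else C).
import java.util.*;

public class AbcAnalysis {
    record Item(String code, double annualValue) {}

    static Map<String, Character> classify(List<Item> items) {
        List<Item> sorted = new ArrayList<>(items);
        sorted.sort(Comparator.comparingDouble(Item::annualValue).reversed());
        double total = sorted.stream().mapToDouble(Item::annualValue).sum();
        Map<String, Character> classes = new LinkedHashMap<>();
        double cumulative = 0;
        for (Item item : sorted) {
            cumulative += item.annualValue();
            double share = cumulative / total;
            char cls = share <= 0.80 ? 'A' : share <= 0.95 ? 'B' : 'C';
            classes.put(item.code(), cls);
        }
        return classes;
    }

    public static void main(String[] args) {
        System.out.println(classify(List.of(
            new Item("RAW-1", 500_000), new Item("RAW-2", 120_000),
            new Item("PKG-1", 40_000), new Item("PKG-2", 8_000))));
        // {RAW-1=A, RAW-2=B, PKG-1=C, PKG-2=C}
    }
}
```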
Abstract:
The environmental challenges facing the plastic packaging industry have grown remarkably along with the climate change debate. Interest in studying the carbon footprints of packaging has increased within the industry as a way to determine the real climate change impacts of packaging. This thesis examines the greenhouse gas emissions of plastic packaging over its life cycle. The carbon footprint is calculated for food packaging manufactured from a plastic laminate whose structure is low-density polyethylene (PE-LD) and oriented polypropylene (OPP) joined together with a laminating adhesive. The purpose is to explore the possibility of creating a carbon footprint calculation tool for plastic packaging and its usability in a plastic packaging manufacturing company. The PAS 2050 standard was used as the carbon footprint calculation method. The calculations consider direct and indirect greenhouse gas emissions as well as avoided emissions, which arise, for example, when packaging waste is recovered as energy. The results of the calculations were used to create a simple calculation tool for similar laminate structures. However, use of the tool is limited to a single manufacturing plant, because the primary activity data depend on geographical location and, for example, on the emissions of the energy used at the plant. The results give an approximation of the climate change potential caused by the laminate. It should be noted, however, that the calculations do not cover all environmental impacts of the plastic packaging's life cycle.
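At its simplest, the PAS 2050-style bookkeeping described above reduces to summing direct and indirect emissions and subtracting avoided emissions per functional unit. The sketch below illustrates only this arithmetic; all masses and emission factors are illustrative placeholders, not data from the thesis.

```java
// Carbon footprint sketch for one laminate package:
// footprint = mass * (direct + indirect - avoided), in kg CO2e.
public class LaminateFootprint {
    public static void main(String[] args) {
        double massKg = 0.012;        // one PE-LD/OPP package (illustrative)
        double directPerKg   = 2.1;   // kg CO2e/kg: raw materials, lamination
        double indirectPerKg = 0.6;   // kg CO2e/kg: purchased energy, transport
        double avoidedPerKg  = 0.9;   // kg CO2e/kg: credit for energy recovery

        double footprint = massKg * (directPerKg + indirectPerKg - avoidedPerKg);
        System.out.printf("Footprint: %.4f kg CO2e per package%n", footprint);
    }
}
```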
Abstract:
Differential Scanning Calorimetry (DSC) was used to study the thermal behavior of hair samples and to verify whether an individual can be identified from DSC curves stored in a data bank. Hair samples from students and staff of the Instituto de Química de Araraquara, UNESP, were collected to build the data bank. An individual participating in the data bank incognito was then sought and identified using his or her DSC curves.
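The abstract does not state how the curves were compared, so the following is only a plausible sketch: each DSC curve is sampled at the same temperature points, and the data-bank entry with the smallest Euclidean distance to the unknown curve is reported as the match.

```java
// Assumed nearest-neighbour matching of DSC curves sampled on a shared
// temperature grid; not the thesis's actual comparison method.
import java.util.Map;

public class DscMatcher {
    static double distance(double[] a, double[] b) {
        double sum = 0;
        for (int i = 0; i < a.length; i++) {
            double d = a[i] - b[i];
            sum += d * d;
        }
        return Math.sqrt(sum);
    }

    /** Returns the data-bank entry closest to the unknown curve. */
    static String match(double[] unknown, Map<String, double[]> dataBank) {
        String best = null;
        double bestDist = Double.POSITIVE_INFINITY;
        for (Map.Entry<String, double[]> entry : dataBank.entrySet()) {
            double d = distance(unknown, entry.getValue());
            if (d < bestDist) {
                bestDist = d;
                best = entry.getKey();
            }
        }
        return best;
    }

    public static void main(String[] args) {
        Map<String, double[]> bank = Map.of(
            "subject-A", new double[] {1.0, 2.1, 3.0},
            "subject-B", new double[] {0.5, 1.0, 1.2});
        System.out.println(match(new double[] {0.9, 2.0, 3.1}, bank)); // subject-A
    }
}
```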
Abstract:
The development of software tools began as the first computers were built. The current generation of development environments offers a common interface for accessing multiple software tools and often also provides the possibility to build custom tools as extensions to the existing development environment. Eclipse is an open source development environment that offers a good starting point for developing custom extensions. This thesis presents a software tool to aid the development of context-aware applications on the Multi-User Publishing Environment (MUPE) platform. The tool is implemented as an Eclipse plug-in and allows developers to include external server-side contexts in their MUPE applications. Additional context sources can be added through Eclipse's extension point mechanism. The thesis describes how the tool was designed and implemented. The implementation consists of a core component and a context source extension component: the core component is responsible for the actual context addition and provides the needed user interface elements to the Eclipse workbench, while the context source component provides the needed context source information to the core component. As part of the work, an update site feature was also implemented for distributing the tool through the Eclipse update mechanism.
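For readers unfamiliar with Eclipse's extension point mechanism, the sketch below shows the standard registry pattern a core component like this one could use to discover contributed context sources. The extension point ID and the generic Object result type are hypothetical stand-ins; the actual identifiers and interfaces of the MUPE tool are not given in the abstract.

```java
// Minimal sketch of discovering contributions to an extension point via the
// Eclipse registry. "com.example.mupe.contextSources" is a hypothetical ID.
import org.eclipse.core.runtime.CoreException;
import org.eclipse.core.runtime.IConfigurationElement;
import org.eclipse.core.runtime.Platform;

public class ContextSourceRegistry {
    private static final String EXTENSION_POINT_ID =
            "com.example.mupe.contextSources"; // hypothetical ID

    /** Instantiates every context source contributed by installed plug-ins. */
    public static java.util.List<Object> loadContextSources() {
        java.util.List<Object> sources = new java.util.ArrayList<>();
        IConfigurationElement[] elements = Platform.getExtensionRegistry()
                .getConfigurationElementsFor(EXTENSION_POINT_ID);
        for (IConfigurationElement element : elements) {
            try {
                // The "class" attribute names the contributed implementation.
                sources.add(element.createExecutableExtension("class"));
            } catch (CoreException e) {
                // Skip a misbehaving contribution rather than failing outright.
                e.printStackTrace();
            }
        }
        return sources;
    }
}
```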
Abstract:
The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then incrementally extends the program in steps of adding code, proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs, and it is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be detected efficiently. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that are not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem, with the aid of the tool, into a set of smaller problems (lemmas) and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early and practically oriented programming courses. Our hypothesis is that verification could be introduced early in the CS education and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
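The following sketch is not Socos or PVS notation; it merely illustrates, in plain Java, the discipline the method imposes: the invariant is written down before the loop body, and each step must re-establish it. In Socos the corresponding conditions would be discharged statically by the theorem prover rather than checked at run time.

```java
// Invariant-first loop sketch: run with `java -ea` to enable assertions.
public class InvariantSketch {
    /** Returns the largest element of a non-empty array. */
    static int max(int[] a) {
        assert a.length > 0 : "precondition: non-empty array";
        int best = a[0];
        int i = 1;
        // Invariant: best == max(a[0..i-1]) and 1 <= i <= a.length
        while (i < a.length) {
            if (a[i] > best) {
                best = a[i];
            }
            i++;
            assert i <= a.length; // invariant bound re-established
        }
        // Postcondition follows from the invariant and loop exit (i == a.length).
        return best;
    }

    public static void main(String[] args) {
        System.out.println(max(new int[] {3, 7, 2})); // prints 7
    }
}
```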
Abstract:
The Finnish-language version of this research report is available at: http://urn.fi/URN:ISBN:978-951-29-4509-2
Abstract:
Early identification of beginning readers at risk of developing reading and writing difficulties plays an important role in prevention and in providing appropriate intervention. In Tanzania, as in other countries, there are children in schools who are at risk of developing reading and writing difficulties. Many of these children complete school without being identified and without proper and relevant support. The main language in Tanzania is Kiswahili, a transparent language. Contextually relevant, reliable and valid instruments of identification are needed in Tanzanian schools. This study aimed at the construction and validation of a group-based screening instrument in the Kiswahili language for identifying beginning readers at risk of reading and writing difficulties. In studying how the test functions, special interest was taken in analyzing the explanatory power of certain contextual factors related to the home and school. Halfway through grade one, 337 children from four purposively selected primary schools in Morogoro municipality were screened with a group test consisting of seven subscales measuring phonological awareness, word and letter knowledge, and spelling. A questionnaire about background factors and the home and school environments related to literacy was also used. The schools were chosen based on performance status (i.e. high, good, average and low performing schools) in order to include variation. For validation, 64 children were chosen from the original sample to take an individual test measuring nonsense word reading, word reading, actual text reading, one-minute reading and writing. School marks from grade one and a follow-up test halfway through grade two were also used for validation. The correlations between the results from the group test and the three measures used for validation were very high (.83-.95). Content validity of the group test was established by using items drawn from authorized text books for reading in grade one. Construct validity was analyzed through item analysis and principal component analysis. The difficulty level of most items in both the group test and the follow-up test was good, and the items also discriminated well. Principal component analysis revealed one powerful latent dimension (an initial literacy factor), accounting for 93% of the variance. This implies that it could be possible to use any set of the subtests of the group test for screening and prediction. A k-means cluster analysis revealed four clusters: at-risk children, strugglers, readers and good readers. The main concern in this study was with the groups of at-risk children (24%) and strugglers (22%), who need the most assistance. The predictive validity of the group test was analyzed by correlating the measures from the two school years and by cross-tabulating grade one and grade two clusters. All the correlations were positive and very high, and 94% of the at-risk children in grade two had already been identified by the group test in grade one. The explanatory power of some of the home and school factors was very strong. The number of books at home accounted for 38% of the variance in reading and writing ability measured by the group test. Parents' reading ability and the support children received at home for schoolwork were also influential factors. Among the school factors studied, school attendance had the strongest explanatory power, accounting for 21% of the variance in reading and writing ability. Having attended nursery school was also of importance.
Based on the findings of the study, a short version of the group test was created. It is suggested for use in grade one screening processes aimed at identifying children at risk of reading and writing difficulties in the Tanzanian context. Suggestions for further research as well as for actions to improve the literacy skills of Tanzanian children are presented.
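As a minimal illustration of the validity computation, the sketch below computes a Pearson correlation between paired test scores. The abstract reports correlations of .83-.95 but does not name the coefficient; Pearson's r is assumed here, and the score vectors are invented.

```java
// Pearson correlation sketch for paired score vectors (assumed coefficient).
public class Validity {
    static double pearson(double[] x, double[] y) {
        int n = x.length;
        double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += x[i]; sy += y[i];
            sxx += x[i] * x[i]; syy += y[i] * y[i]; sxy += x[i] * y[i];
        }
        double cov = n * sxy - sx * sy;
        double varX = n * sxx - sx * sx;
        double varY = n * syy - sy * sy;
        return cov / Math.sqrt(varX * varY);
    }

    public static void main(String[] args) {
        double[] groupTest  = {12, 18, 25, 31, 40}; // illustrative scores
        double[] individual = {10, 20, 24, 33, 41};
        System.out.printf("r = %.3f%n", pearson(groupTest, individual));
    }
}
```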
Abstract:
Understanding the hydrosedimentological behavior of a watershed is essential for properly managing and using its water resources. The objective of this study was to verify the feasibility of an alternative procedure for the indirect determination of the sediment rating curve using a turbidimeter. The research was carried out on the São Francisco Falso River, situated in the west of the state of Paraná on the left bank of the ITAIPU reservoir. The direct method was applied using a DH-48 suspended-sediment sampler; the indirect method used a limnigraph and a turbidimeter. Based on the results obtained, it was concluded that the indirect turbidimeter method is fully feasible, since it yielded a power-function mathematical model equivalent to that of the direct method. Furthermore, the average suspended sediment discharge of the São Francisco Falso River during the 2006/2007 harvest was calculated at 7.26 metric t day⁻¹.
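A power-function rating curve of the kind reported here, Qs = a·Q^b, is typically fitted by linear least squares in log-log space. The sketch below shows that fit with invented sample data; it is not the study's actual dataset or code.

```java
// Fitting a power-function rating curve Qs = a * Q^b by linear least squares
// on (log Q, log Qs). Sample data are illustrative, not measured values.
public class RatingCurveFit {
    public static void main(String[] args) {
        double[] q  = {1.2, 2.5, 4.0, 6.3, 9.1};   // discharge (illustrative)
        double[] qs = {0.8, 2.9, 6.5, 13.8, 26.0}; // suspended sediment discharge

        int n = q.length;
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            double x = Math.log(q[i]);
            double y = Math.log(qs[i]);
            sx += x; sy += y; sxx += x * x; sxy += x * y;
        }
        double b = (n * sxy - sx * sy) / (n * sxx - sx * sx); // exponent
        double a = Math.exp((sy - b * sx) / n);               // coefficient
        System.out.printf("Qs = %.3f * Q^%.3f%n", a, b);
    }
}
```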