823 results for Tool functionality
Abstract:
A technique to measure the concentration of Penicillium allii conidia in damp chamber experiments by spectrophotometry was developed. A negative linear correlation (R² = 0.56) was observed between transmittance at 340 nm and the concentration of P. allii conidia in 0.05% water agar. The equation relating transmittance (T) to concentration y (conidia mL⁻¹) is y = 9.3 × 10⁶ - 86497 T. The method was assayed by inoculating 43 P. allii strains onto two garlic cultivars. It proved faster than the traditional hemocytometer count and more accurate: the coefficient of variation (CV) of the number of conidia per hemocytometer reticule was 35.04%, while the transmittance CV was 2.73%. The working range for T was set at 40 to 80 because the sensitivity of the method decreased when conidia concentrations fell outside this range.
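As a quick illustration, the reported regression can be applied directly to a transmittance reading. The sketch below simply encodes the equation and the 40-80 working range given above; the function name and the example reading are ours.

```python
def conidia_concentration(transmittance):
    """Estimate P. allii conidia concentration (conidia/mL) from percent
    transmittance at 340 nm using the regression reported above:
    y = 9.3e6 - 86497 * T. Calibrated only for T between 40 and 80."""
    if not 40 <= transmittance <= 80:
        raise ValueError("transmittance outside the calibrated 40-80 range")
    return 9.3e6 - 86497 * transmittance

# A reading of T = 60 corresponds to roughly 4.1e6 conidia per mL.
print(f"{conidia_concentration(60):.3e}")
```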
Abstract:
The aim of this research was to investigate the current status of procurement at Tikkurila Oyj's Vantaa factories, to develop the reporting of purchases and purchase warehouses, and to measure the activities during the implementation of a new purchasing tool. The implemented purchasing tool was based on ABC analysis. Based on its reports, the importance of performance measurement for the company's operations and the goal of bringing transparency to the purchasing side of the company's supply chain were examined. Successful purchasing and materials operations call for accurate knowledge and professional skills. The research showed that a separate purchasing tool, which analyzes existing information in the company's production management system, can add value across the whole supply chain. The analyses and reports of the purchasing tool enable a more harmonized purchasing process at the operative level and create a basis for internal targets and their follow-up. At the same time, the analyses give management an up-to-date view of the current status and development trends of procurement. Better exploitation of information technology enables full transparency for the case company's purchasing department.
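ABC analysis itself is a standard way of classifying purchase items by their cumulative share of annual spend. The sketch below uses the conventional 80%/95% cut-offs as an assumption for illustration; it is not the thesis's tool, and the item names and spend figures are invented.

```python
def abc_classify(annual_spend, a_cut=0.80, b_cut=0.95):
    """Classify items (name -> annual spend) into A/B/C classes by cumulative
    share of total spend; the 80%/95% cut-offs are conventional, assumed here."""
    total = sum(annual_spend.values())
    classes, cumulative = {}, 0.0
    for name, spend in sorted(annual_spend.items(), key=lambda kv: kv[1], reverse=True):
        cumulative += spend / total
        classes[name] = "A" if cumulative <= a_cut else "B" if cumulative <= b_cut else "C"
    return classes

print(abc_classify({"resin": 500000, "pigment": 300000, "cans": 120000,
                    "labels": 50000, "gloves": 10000}))
```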
Abstract:
Software testing is one of the essential parts of the software engineering process. The objective of the study was to describe software testing tools and their use. The thesis contains examples of the usage of software testing tools. The study was conducted as a literature study, with a focus on current software testing practices and quality assurance standards. A tool classifier was employed, and the testing tools presented in the study were classified according to it. We found that it is difficult to distinguish currently available tools by individual testing activities, as many of them contain functionality that exceeds the scope of a single testing type.
Abstract:
The environmental challenges of the plastic packaging industry have increased remarkably along with the climate change debate. Interest in studying the carbon footprints of packaging has grown in the packaging industry, in order to find out the real climate change impacts of packaging. In this thesis the greenhouse gas emissions of plastic packaging over its life cycle are examined. The carbon footprint is calculated for food packaging manufactured from plastic laminate. The laminate consists of low-density polyethylene (PE-LD) and oriented polypropylene (OPP), which are joined together with a laminating adhesive. The purpose is to find out whether a carbon footprint calculation tool for plastic packaging can be created and how usable it would be in a plastic packaging manufacturing company. The PAS 2050 standard was used as the carbon footprint calculation method. The calculations consider direct and indirect greenhouse gas emissions as well as avoided emissions, which arise, for example, when packaging waste is utilized as energy. The results of the calculations have been used to create a simple calculation tool for similar laminate structures. The use of the tool is, however, limited to one manufacturing plant, because the primary activity data depend on the geographical location and, for example, on the emissions of the energy used at the plant. The results give an approximation of the climate change potential caused by the laminate. It should be noted, however, that the calculations do not include all environmental impacts of the plastic packaging's life cycle.
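At its core, such a calculation multiplies activity data by emission factors, sums over the life-cycle stages, and subtracts avoided emissions. The sketch below only illustrates that structure; the quantities and factors are placeholders, not PAS 2050 values or results from the thesis.

```python
# Minimal activity-data x emission-factor sketch (all figures are placeholders).
activity = {                    # per functional unit, e.g. 1000 laminate pouches
    "PE-LD film (kg)": 12.0,
    "OPP film (kg)": 8.0,
    "adhesive (kg)": 0.5,
    "electricity (kWh)": 30.0,
}
emission_factor = {             # kg CO2e per unit of activity
    "PE-LD film (kg)": 2.0,
    "OPP film (kg)": 1.9,
    "adhesive (kg)": 3.0,
    "electricity (kWh)": 0.2,
}
avoided = 5.0                   # kg CO2e credited for energy recovery of waste

gross = sum(activity[k] * emission_factor[k] for k in activity)
print(f"gross {gross:.1f} kg CO2e, net {gross - avoided:.1f} kg CO2e")
```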
Abstract:
Differential Scanning Calorimetry (DSC) was used to study the thermal behavior of hair samples and to verify the possibility of identifying an individual from a data bank of DSC curves. Hair samples from students and staff of the Instituto de Química de Araraquara, UNESP, were obtained to build up the data bank. An individual, participating in this data bank incognito, was then sought and identified using the DSC curves.
Abstract:
The development of software tools began when the first computers were built. The current generation of development environments offers a common interface for accessing multiple software tools and often also provides the possibility of building custom tools as extensions to the existing development environment. Eclipse is an open source development environment that offers a good starting point for developing custom extensions. This thesis presents a software tool to aid the development of context-aware applications on the Multi-User Publishing Environment (MUPE) platform. The tool is implemented as an Eclipse plug-in. It allows developers to include external server-side contexts in their MUPE applications, and additional context sources can be added through Eclipse's extension point mechanism. The thesis describes how the tool was designed and implemented. The implementation consists of a tool core component and an additional context source extension part. The tool core component is responsible for the actual context addition and also provides the needed user interface elements to the Eclipse workbench. The context source component provides the needed context-source-related information to the core component. As part of the work, an update site feature was also implemented for distributing the tool through the Eclipse update mechanism.
Abstract:
The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find them difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then incrementally extends the program in steps, adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be detected efficiently. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that were not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas) and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early and practically oriented programming courses. Our hypothesis is that verification could be introduced early in CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
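To make the idea of maintaining invariants concrete, here is a minimal sketch of a summation loop whose invariant is stated explicitly and checked at run time. It is not Socos, its diagram notation, or PVS input; in invariant-based programming the invariant would be written first and every code addition proved, rather than merely asserted, to preserve it.

```python
def sum_of(values):
    """Sum a list while maintaining the invariant total == sum(values[:i])."""
    total, i = 0, 0
    # Invariant established: total == sum(values[:0]) == 0.
    while i < len(values):
        assert total == sum(values[:i])    # invariant holds on loop entry
        total += values[i]
        i += 1
        assert total == sum(values[:i])    # each addition re-establishes it
    # The invariant plus the exit condition i == len(values) give the postcondition.
    assert total == sum(values)
    return total

print(sum_of([3, 1, 4, 1, 5]))  # 14
```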
Abstract:
Performance measurement is an effective tool for developing a company's operations. Metrics guide the company to focus on implementing its strategy and objectives. A company's performance is often examined at the company level in particular, but metrics can also be targeted at individual employees. The goal of this work was to identify success factors, to be taken into account already in the design phase, for an employee-level performance measurement system in a small expert company. To answer the research questions, a design process carried out for the target company was used in addition to the literature, and the views of the company's personnel were also drawn upon. The most significant special characteristics in designing the measurement system were related to measuring expert work, minimizing the system's resource needs, and rewarding based on performance. In addition, the work considered the possibilities of expanding the designed measurement system into a company-level system. Performance measurement makes a company's decision-making and communication more effective. A high-quality measurement system also requires good design, which can be based on models from the literature. The commitment of management and personnel to the design process is critical for a successful measurement system. In the design of a measurement system, the metrics are usually derived from the company's vision through its strategy, measurement areas and success factors. The impact of the system can be fine-tuned by adjusting the weights of the metrics. In a trial phase, the functionality of the system is verified before the metrics are linked to performance bonuses. In measuring expert work, attention must be paid to finding workable metrics, since knowledge-based expert work is challenging to measure. Furthermore, linking a reward system to the measurement system requires the metrics to be absolutely fair, and it must not be possible to manipulate the results. A small company's measurement system must not consume too many resources, so metrics that operate as independently as possible should be favored. Motivating personnel to enter data is essential, and using objective and relevant metrics reduces the working hours spent on maintaining the system. An employee-level measurement system can easily be expanded to the company level as well, provided that careful documentation has been attended to during the design.
Abstract:
The use of domain-specific languages (DSLs) has been proposed as an approach to cost-effectively develop families of software systems in a restricted application domain. Domain-specific languages, in combination with the accumulated knowledge and experience of previous implementations, can in turn be used to generate new applications with unique sets of requirements. For this reason, DSLs are considered an important approach to software reuse. However, the toolset supporting a particular domain-specific language is also domain-specific and is by definition not reusable. Therefore, creating and maintaining a DSL requires additional resources that could be even larger than the savings associated with using it. As a solution, different tool frameworks have been proposed to simplify and reduce the cost of developing DSLs. Developers of tool support for DSLs need to instantiate, customize or configure the framework for a particular DSL, and there are different approaches for this. One approach is to use an application programming interface (API) and to extend the basic framework using an imperative programming language; an example of a tool based on this approach is Eclipse GEF. Another approach is to configure the framework using declarative languages that are independent of the underlying framework implementation. We believe this second approach can bring important benefits, as it shifts the focus to specifying what the tool should be like instead of writing a program that specifies how the tool achieves this functionality. In this thesis we explore this second approach. We use graph transformation as the basic approach to customize a domain-specific modeling (DSM) tool framework. The contributions of this thesis include a comparison of different approaches for defining, representing and interchanging software modeling languages and models, and a tool architecture for an open domain-specific modeling framework that efficiently integrates several model transformation components and visual editors. We also present several specific algorithms and tool components for the DSM framework. These include an approach for graph query based on region operators and the star operator, and an approach for reconciling models and diagrams after executing model transformation programs. We exemplify our approach with two case studies, MICAS and EFCO, in which we show how our experimental modeling tool framework has been used to define tool environments for domain-specific languages.
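The star operator in graph queries corresponds to the transitive closure of an edge relation. The sketch below is a generic reachability illustration of that idea, not code from the thesis's DSM framework, and the edge list is invented.

```python
from collections import deque

def star(edges, start):
    """All nodes reachable from `start` by zero or more edges
    (the transitive-closure idea behind the graph-query star operator)."""
    adjacency = {}
    for a, b in edges:
        adjacency.setdefault(a, []).append(b)
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in adjacency.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Nodes reachable from 'A', including 'A' itself.
print(star([("A", "B"), ("B", "C"), ("C", "D"), ("X", "Y")], "A"))
```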
Abstract:
Early identification of beginning readers at risk of developing reading and writing difficulties plays an important role in prevention and in providing appropriate intervention. In Tanzania, as in other countries, there are children in schools who are at risk of developing reading and writing difficulties. Many of these children complete school without being identified and without proper and relevant support. The main language in Tanzania is Kiswahili, a transparent language. Contextually relevant, reliable and valid identification instruments are needed in Tanzanian schools. This study aimed at the construction and validation of a group-based screening instrument in the Kiswahili language for identifying beginning readers at risk of reading and writing difficulties. In studying the function of the test, there was special interest in analyzing the explanatory power of certain contextual factors related to the home and school. Halfway through grade one, 337 children from four purposively selected primary schools in Morogoro municipality were screened with a group test consisting of 7 subscales measuring phonological awareness, word and letter knowledge and spelling. A questionnaire about background factors and the home and school environments related to literacy was also used. The schools were chosen based on performance status (i.e. high, good, average and low performing schools) in order to include variation. For validation, 64 children were chosen from the original sample to take an individual test measuring nonsense word reading, word reading, actual text reading, one-minute reading and writing. School marks from grade one and a follow-up test halfway through grade two were also used for validation. The correlations between the results of the group test and the three measures used for validation were very high (.83-.95). Content validity of the group test was established by using items drawn from authorized textbooks for reading in grade one. Construct validity was analyzed through item analysis and principal component analysis. The difficulty level of most items in both the group test and the follow-up test was good, and the items also discriminated well. Principal component analysis revealed one powerful latent dimension (an initial literacy factor), accounting for 93% of the variance. This implies that it could be possible to use any set of the subtests of the group test for screening and prediction. K-Means cluster analysis revealed four clusters: at-risk children, strugglers, readers and good readers. The main concern in this study was with the groups of at-risk children (24%) and strugglers (22%), who need the most assistance. The predictive validity of the group test was analyzed by correlating the measures from the two school years and by cross-tabulating grade one and grade two clusters. All the correlations were positive and very high, and 94% of the at-risk children in grade two had already been identified by the group test in grade one. The explanatory power of some of the home and school factors was very strong. The number of books at home accounted for 38% of the variance in reading and writing ability measured by the group test. Parents' reading ability and the support children received at home for schoolwork were also influential factors. Among the studied school factors, school attendance had the strongest explanatory power, accounting for 21% of the variance in reading and writing ability. Having been in nursery school was also of importance.
Based on the findings of the study, a short version of the group test was created. It is suggested for use in grade-one screening aimed at identifying children at risk of reading and writing difficulties in the Tanzanian context. Suggestions for further research, as well as actions for improving the literacy skills of Tanzanian children, are presented.
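As an illustration of the kind of analyses described (not the study's data or scripts), a principal component analysis and a K-Means clustering of subtest scores can be run as follows; the score matrix is invented and far smaller than the real sample.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Rows: children, columns: scores on the seven subscales (made-up numbers).
scores = np.array([
    [12, 10, 11,  9, 13, 12, 10],
    [ 3,  2,  4,  3,  2,  3,  2],
    [ 8,  7,  9,  8,  7,  8,  7],
    [14, 13, 15, 14, 13, 14, 15],
    [ 5,  4,  6,  5,  4,  5,  4],
])

pca = PCA(n_components=1).fit(scores)
print("variance explained by the first component:",
      round(pca.explained_variance_ratio_[0], 2))

# Four clusters, analogous to at-risk / strugglers / readers / good readers.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
print("cluster labels:", labels)
```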
Abstract:
A software application was developed with the Delphi programming language to compute a reservoir's annual regulated active storage based on the sequent-peak algorithm. Mathematical models used for that purpose generally require long hydrological series, and the analysis of those series is usually performed with spreadsheets or graphical representations. On that basis, software for calculating reservoir active capacity was developed. An example calculation is shown using 30 years (1977 to 2009) of monthly mean flow data from the Corrente River, located in the São Francisco River Basin, Brazil. As an additional tool, an interface was developed to support water resources management, helping to manipulate data and highlight information of interest to the user. Moreover, with that interface, irrigation districts where water consumption is higher can be analyzed as a function of specific seasonal water demand situations. From a practical application it is possible to conclude that the program performs the calculation originally proposed. It was designed to keep information organized and retrievable at any time, and to simulate seasonal water demands throughout the year, contributing to studies concerning reservoir projects. With this functionality, the program is an important tool for decision making in water resources management.
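The sequent-peak algorithm itself is compact: accumulate the deficit between demand and inflow, never letting it go negative, and take the largest accumulated deficit as the required active storage. The sketch below (in Python rather than the Delphi of the original program, with invented monthly figures) illustrates the recursion.

```python
def sequent_peak(inflows, demands):
    """Required active storage by the sequent-peak algorithm:
    K[t] = max(0, K[t-1] + demand[t] - inflow[t]); the answer is max K[t].
    Inflows and demands are volumes per period (e.g. hm3 per month)."""
    deficit, required = 0.0, 0.0
    for inflow, demand in zip(inflows, demands):
        deficit = max(0.0, deficit + demand - inflow)
        required = max(required, deficit)
    return required

# Illustrative monthly inflows against a constant demand of 50 (same units).
inflows = [80, 70, 60, 40, 30, 20, 25, 35, 55, 75, 90, 100]
print(sequent_peak(inflows, [50] * 12))  # largest cumulative deficit: 100.0
```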
Abstract:
The purpose of this study is to examine how well risk parity works in terms of risk, return and diversification relative to the more traditional minimum variance, 1/N and 60/40 portfolios. The risk parity portfolios were constructed from five risk sources: three common asset classes and two alternative beta investment strategies. The asset classes were equities, bonds and commodities, and the alternative beta strategies were carry trade and trend following. The risk parity portfolios were constructed using five different risk measures, four of which were tail risk measures: standard deviation, Value-at-Risk, Expected Shortfall, modified Value-at-Risk and modified Expected Shortfall. We also studied how sensitive risk parity is to the choice of risk measure. The hypothesis is that risk parity portfolios provide better return for the same amount of risk and are better diversified than the benchmark portfolios. We used two data sets: monthly data from 1989-2011 and weekly data from 2000-2011. The empirical results showed that risk parity portfolios provide better diversification, since diversification is carried out at the risk level. Risk-based portfolios delivered superior returns compared to the asset-based portfolios. Using tail risk measures in risk parity portfolios does not necessarily provide a better hedge against tail events than standard deviation.
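A common simplification of risk parity weights each risk source inversely to its volatility, so that each contributes roughly equally to portfolio risk. The sketch below shows that naive version together with historical Value-at-Risk and Expected Shortfall; the returns are simulated, not the study's monthly or weekly data, and the study's modified (Cornish-Fisher) measures are not included.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated monthly returns for five risk sources (illustrative only).
returns = rng.normal(0.005, [0.04, 0.01, 0.05, 0.03, 0.035], size=(120, 5))

# Naive risk parity: weights inversely proportional to volatility.
vol = returns.std(axis=0, ddof=1)
weights = (1 / vol) / (1 / vol).sum()

portfolio = returns @ weights
var_95 = -np.percentile(portfolio, 5)                # historical 95% VaR
es_95 = -portfolio[portfolio <= -var_95].mean()      # historical 95% ES
print("weights:", np.round(weights, 3))
print(f"95% VaR {var_95:.4f}, 95% ES {es_95:.4f}")
```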