857 results for automatic assessment tool


Relevance:

30.00%

Publisher:

Abstract:

Controlling quality variables (such as basis weight, moisture, etc.) is a vital part of making top-quality paper or board. In this thesis, an advanced data assimilation tool is applied to the quality control system (QCS) of a paper or board machine. The functionality of the QCS is based on quality observations measured with a traversing scanner that follows a zigzag path. The basic idea is the following: the measured quality variable has to be separated into its machine direction (MD) and cross direction (CD) variations, because the QCS works separately in MD and CD. Traditionally this is done simply by taking one scan of the zigzag path as the CD profile and its mean value as one point of the MD trend. In this thesis, a more advanced method is introduced. The fundamental idea is to use the signal's frequency components to represent the variation in both CD and MD. The Fourier transform is used to move into the frequency domain, and the resulting Fourier components are then used as the state vector in a Kalman filter. The Kalman filter is a widely used data assimilation tool for combining noisy observations with a model; here, the observations are the quality measurements and the model is built on the Fourier frequency components. By incorporating the two-dimensional Fourier transform into the Kalman filter, we obtain an advanced tool for separating the CD and MD components of the total variation or, more generally, for data assimilation. A piece of a paper roll is analyzed and the tool is applied to model the dataset. The results show that the Kalman filter algorithm is able to reconstruct the main features of the dataset from a zigzag path. Although the results were obtained with a very short sample of paper roll, the method shows great potential for later use as part of a quality control system.
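The separation idea above can be sketched in a few lines: a minimal Kalman filter whose state vector holds the Fourier coefficients of a one-dimensional CD profile, updated from noisy scanner samples taken at random positions along the path. This is an illustrative simplification, not the thesis implementation; all names and parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_coef = 5                      # number of Fourier harmonics in the state
true_coef = np.array([1.0, 0.5, -0.3, 0.2, 0.1])

def basis(x):
    """Cosine Fourier basis evaluated at CD position x in [0, 1)."""
    return np.array([np.cos(2 * np.pi * k * x) for k in range(n_coef)])

# Kalman filter for a static state: x_k = x_{k-1}, z_k = H_k x_k + noise
est = np.zeros(n_coef)          # state estimate (Fourier coefficients)
P = np.eye(n_coef) * 10.0       # state covariance
R = 0.05                        # measurement noise variance

for _ in range(400):
    pos = rng.random()                          # scanner CD position on the zigzag path
    H = basis(pos)                              # observation row (1 x n_coef)
    z = H @ true_coef + rng.normal(0, R**0.5)   # noisy quality measurement
    S = H @ P @ H + R                           # innovation variance (scalar)
    K = (P @ H) / S                             # Kalman gain
    est = est + K * (z - H @ est)               # state update
    P = P - np.outer(K, H @ P)                  # covariance update

print(np.round(est, 2))  # estimate should approach true_coef
```

With enough scan points, the filtered coefficients converge toward the true profile despite the measurement noise, which is the essence of the CD/MD separation described above.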

Relevance:

30.00%

Publisher:

Abstract:

An optimization tool has been developed to help companies optimize their production cycles and thus improve their overall supply chain management processes. The application combines the functionality of traditional APS (Advanced Planning System) and ARP (Automatic Replenishment Program) systems into one optimization run. A qualitative study was organized to investigate opportunities to expand the product's market base. Twelve personal interviews were conducted and the results were collected into industry-specific production planning analyses. Five process industries were analyzed to identify the product's suitability for each industry sector and the most important product development areas. Based on the research, the paper and plastic film industries remain the most promising industry sectors at this point. To be successful in other industry sectors, some product enhancements would be required, including capabilities to optimize multiple sequential and parallel production cycles, to handle the sequencing of complex finishing operations, and to include master planning capabilities that support overall supply chain optimization. In product sales and marketing, the key to success is to find and reach the people who deal directly with the problems that the optimization tool can help to solve.

Relevance:

30.00%

Publisher:

Abstract:

The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find them difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then incrementally extends the program in steps, adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools that assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover.
Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for the total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be detected efficiently. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that are not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas) and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early, practically oriented programming courses.
Our hypothesis is that verification could be introduced early in CS education and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
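The invariant-first workflow described above can be mimicked in plain Python: state the invariant as a checkable predicate, then assert it initially, after each extension step, and at exit. This is a hand-written sketch in the spirit of the verified sorting example mentioned above, not output of the Socos tool.

```python
def sorted_prefix(a, k):
    """Invariant: a[:k] is sorted, and every element of a[:k] is <= every element of a[k:]."""
    return all(a[i] <= a[i + 1] for i in range(k - 1)) and \
           all(x <= y for x in a[:k] for y in a[k:])

def selection_sort(a):
    a = list(a)
    k = 0
    assert sorted_prefix(a, k)          # invariant holds initially (empty prefix)
    while k < len(a):
        m = min(range(k, len(a)), key=a.__getitem__)
        a[k], a[m] = a[m], a[k]         # extend the sorted prefix by one element
        k += 1
        assert sorted_prefix(a, k)      # invariant is maintained by each step
    return a                            # invariant + (k == len(a)) implies a is sorted

print(selection_sort([3, 1, 4, 1, 5]))  # → [1, 1, 3, 4, 5]
```

In the tool itself, these runtime assertions correspond to verification conditions that are proved once and for all rather than checked on each execution.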

Relevance:

30.00%

Publisher:

Abstract:

After decades of mergers and acquisitions and successive technology trends such as CRM, ERP and DW, the data in enterprise systems is scattered and inconsistent. Global organizations face the challenge of addressing local uses of shared business entities, such as customer and material, while at the same time maintaining a consistent, unique and consolidated view of financial indicators. In addition, current enterprise systems do not accommodate the pace of organizational change, and immense efforts are required to maintain data. When it comes to systems integration, ERPs are considered "closed" and expensive. Data structures are complex, and the "out-of-the-box" integration options offered are not based on industry standards. Therefore, expensive and time-consuming projects are undertaken in order to have the required data flowing according to the needs of business processes. Master Data Management (MDM) emerges as a discipline focused on ensuring long-term data consistency. Presented as a technology-enabled business discipline, it emphasizes business process and governance to model and maintain the data related to key business entities. There are immense technical and organizational challenges in accomplishing the "single version of the truth" MDM mantra. Adding one central repository of master data may prove unfeasible in some scenarios, so an incremental approach is recommended, starting from the areas most critically affected by data issues. This research aims at understanding the current literature on MDM and contrasting it with views from professionals. The data collected from interviews revealed details on the complexities of data structures and data management practices in global organizations, reinforcing the call for more in-depth research on the organizational aspects of MDM.
The most difficult piece of master data to manage is the "local" part: the attributes related to the sourcing and storing of materials in one particular warehouse in the Netherlands, or a complex set of pricing rules for a subsidiary of a customer in Brazil. From a practical perspective, this research evaluates one MDM solution under development at a Finnish IT solution provider. By applying an existing assessment method, the research attempts to provide the company with a possible tool to evaluate its product from a vendor-agnostic perspective.

Relevance:

30.00%

Publisher:

Abstract:

The topic of this Master's Thesis is risk assessment in the supply chain, and the work was done for a company operating in the pharmaceutical industry. The unique features of the industry bring additional challenges to risk management, due to high regulatory, documentation and traceability requirements. The objective of the thesis was to generate a template for assessing the risks in the supply chain of current and potential suppliers of the case company. Risks pertaining to the case setting were sought mainly from in-house expertise on this specific product and supply chain, as well as from academic research papers and risk management theory. A questionnaire was set up to assess the identified risks in terms of impact, occurrence and possibility of detection. Through this classification of the severity of the risks, the supplier assessment template was formed. A questionnaire template, comprising the top 10 risks affecting the flow of information and materials in this setting, was formulated to serve as a generic tool for assessing risks in the supply chain of a pharmaceutical company. The template was tested on another supplier for usability and accuracy of the identified risks, and it proved functional in a different supply chain and product setting.
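The classification step described above, rating each risk on impact, occurrence and possibility of detection and ranking by severity, can be sketched in the style of an FMEA risk priority number (the product of the three scores, with detection inverted so a hard-to-detect risk scores high). The risk names and scores below are invented for illustration, not taken from the thesis.

```python
# Each risk: (description, impact, occurrence, detection difficulty), all 1 (low) to 5 (high).
risks = [
    ("Shipment delayed at customs", 4, 3, 2),
    ("Batch documentation incomplete", 5, 2, 4),
    ("Temperature excursion in transit", 5, 2, 3),
    ("Supplier capacity shortage", 3, 3, 1),
]

def priority(risk):
    """FMEA-style risk priority number: impact x occurrence x detection difficulty."""
    _, impact, occurrence, detection = risk
    return impact * occurrence * detection

# Rank risks by severity; the top entries would populate the assessment template.
top = sorted(risks, key=priority, reverse=True)
for r in top:
    print(f"{priority(r):3d}  {r[0]}")
```

Taking the highest-priority entries from such a ranking mirrors how the top 10 risks could be selected for the questionnaire template.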

Relevance:

30.00%

Publisher:

Abstract:

This applied linguistic study in the field of second language acquisition investigated the assessment practices of class teachers as well as the challenges and visions of language assessment in bilingual content instruction (CLIL) at the primary level in Finnish basic education. Furthermore, pupils' and their parents' perceptions of language assessment and of LangPerform computer simulations as an alternative, modern assessment method in CLIL contexts were examined. The study was conducted for descriptive and developmental purposes in three phases: 1) a CLIL assessment survey; 2) simulation 1; and 3) simulation 2. Each phase had a varying number of participants. The population of this mixed-methods study comprised CLIL class teachers, their pupils and the pupils' parents. The sampling was multi-staged and based on probability and random sampling. The data were triangulated. Altogether 42 CLIL class teachers nationwide, 109 pupils from the 3rd, 4th and 5th grades, and 99 parents from two research schools in South-Western Finland participated in the CLIL assessment survey, which was followed by audio-recorded theme interviews with volunteers (10 teachers, 20 pupils and 7 parents). Simulation experiments 1 and 2 produced 146 pupil and 39 parental questionnaires as well as video interviews with volunteering pupils. The data were analysed both quantitatively, using percentages and numerical frequencies, and qualitatively, employing thematic content analysis. Based on the data, language assessment in primary CLIL is not an established practice. It largely appears to be infrequent, incidental, implicit and based on impressions rather than evidence or the curriculum. The most used assessment methods were teacher observation, bilingual tests and dialogic interaction, and the least used were portfolios, simulations and peer assessment.
Although language assessment was generally perceived as important by teachers, a fifth of them did not gather assessment information systematically, and 38% scarcely gave linguistic feedback to pupils. Both pupils and parents wished to receive more information on CLIL language issues: 91% of pupils claimed to receive feedback rarely or only occasionally, and 63% of them wished for more information on how they cope linguistically in CLIL subjects. Of the parents, 76% wished to receive more information on their children's English proficiency and linguistic development. This may be a response to the indirect feedback practices identified in this study. There are several challenges related to assessment; the most notable is the lack of a CLIL curriculum, language objectives and common ground principles of assessment. Three distinct approaches to language in CLIL that appear to affect teachers' views on language assessment were identified: instrumental (language as a tool), dual (language as both a tool and an object of learning) and eclectic (miscellaneous views, e.g. affective factors prioritised). LangPerform computer simulations seem to be perceived as an appropriate alternative assessment method in CLIL. It is strongly recommended that the fundamentals for assessment (curricula and language objectives) and a mutual assessment scheme be determined and that stakeholders' knowledge base of CLIL be strengthened. The principles of adequate assessment in primary CLIL are identified, and several appropriate assessment methods are suggested.

Relevance:

30.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

30.00%

Publisher:

Abstract:

Proso millet (Panicum miliaceum L.) is a serious weed in North America. A high number of wild proso millet biotypes are known, but the genetic basis of its phenotypic variation is poorly understood. In the present study, a non-radioactive silver staining method for PCR-Amplified Fragment Length Polymorphism (AFLP) was evaluated for studying genetic polymorphism in American proso millet biotypes. Twelve biotypes and eight primer combinations with two/three and three/three selective nucleotides were used. Primer pairs with two/three selective nucleotides produced the highest number of amplified DNA fragments, while primer pairs with three/three selective nucleotides were more effective at revealing polymorphic DNA fragments. The two best primer combinations were EcoR-AAC/Mse-CTT and EcoR-ACT/Mse-CAA, with seven and eleven polymorphic DNA fragments, respectively. Of a total of 450 amplified fragments, at least 339 appeared well separated in a silver-stained acrylamide gel, and 39 polymorphic DNA bands were scored. The level of polymorphic DNA (11.5%) obtained using only eight pairs of primers was effective for grouping the proso millet biotypes into two clusters, but insufficient for separating hybrid biotypes from wild and crop biotypes. Nevertheless, the present results indicate that silver-stained AFLP markers could be a cheap and important tool for studying genetic relationships in proso millet.
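The polymorphism level reported above can be checked directly from the figures in the abstract (39 scored polymorphic bands out of 339 well-separated fragments):

```python
# Polymorphism rate = scored polymorphic bands / well-separated fragments.
polymorphic, separated = 39, 339
rate = 100 * polymorphic / separated
print(f"{rate:.1f}%")  # → 11.5%
```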

Relevance:

30.00%

Publisher:

Abstract:

Tool center point calibration is a known problem in industrial robotics. The major focus of academic research is to enhance the accuracy and repeatability of next-generation robots. However, operators of currently available robots are working within the limits of the robot's repeatability and require calibration methods suitable for these basic applications. This study was conducted in association with Stresstech Oy, which provides solutions for manufacturing quality control. Their sensor, based on the Barkhausen noise effect, requires accurate positioning. The accuracy requirement poses a tool center point calibration problem if measurements are executed with an industrial robot. Multiple options for automatic tool center point calibration are available on the market. Manufacturers provide customized calibrators for most robot types and tools, but with the handmade sensors and multiple robot types that Stresstech uses, this would require a great deal of labor. This thesis introduces a calibration method that is suitable for any robot with two free digital input ports. It follows the traditional approach of using a light barrier to detect the tool in the robot coordinate system; however, it utilizes two parallel light barriers to simultaneously measure and detect the center axis of the tool. Rotations about two axes are defined by the center axis. The remaining rotation, about the Z-axis, is calculated for tools that have different widths along the X- and Y-axes. The results indicate that this method is suitable for calibrating the geometric tool center point of a Barkhausen noise sensor. In the repeatability tests, a standard deviation within the robot's repeatability was obtained. The Barkhausen noise signal was also evaluated after recalibration, and the results indicate correct calibration. However, future studies should be conducted using a more accurate manipulator, since the method employs the robot itself as the measuring device.
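A simplified reading of the two-barrier geometry can be sketched as follows: the midpoint of a beam-break interval gives one point on the tool's center axis, and two such points (one per barrier, or from passes at two heights) define the axis, from which the tilts about the X- and Y-axes follow. The coordinates below are invented example values in a robot base frame (mm), not Stresstech data.

```python
import numpy as np

# Two points on the tool center axis, recovered from the beam-break midpoints.
p_lower = np.array([500.0, 200.0, 100.0])   # axis point at the lower barrier
p_upper = np.array([502.0, 201.0, 180.0])   # axis point at the upper barrier

axis = p_upper - p_lower
axis /= np.linalg.norm(axis)                # unit vector along the tool center axis

# Tilt of the axis relative to the base Z-axis, expressed as two rotations.
rx = np.degrees(np.arctan2(axis[1], axis[2]))   # tilt component about X
ry = np.degrees(np.arctan2(axis[0], axis[2]))   # tilt component about Y

print(np.round(axis, 3), round(rx, 2), round(ry, 2))
```

The remaining rotation about the tool's own Z-axis is not recoverable from the center axis alone, which is why the thesis derives it separately for tools with different X and Y widths.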

Relevance:

30.00%

Publisher:

Abstract:

The genus Acanthamoeba comprises free-living amebae identified as opportunistic pathogens of humans and other animal species. Morphological, biochemical and molecular approaches have shown wide genetic diversity within the genus. In an attempt to determine the genetic relatedness among isolates of Acanthamoeba, we analyzed randomly amplified polymorphic DNA (RAPD) profiles of 11 Brazilian isolates from cases of human keratitis and 8 American Type Culture Collection (ATCC) reference strains. We found that ATCC strains belonging to the same species present polymorphic RAPD profiles, whereas strains of different species show very similar profiles. Although most Brazilian isolates could not be assigned with certainty to any of the reference species, they could be clustered according to pattern similarities. The results show that RAPD analysis is a useful tool for the rapid characterization of new isolates and for the assessment of genetic relatedness of Acanthamoeba spp. A comparison between RAPD analyses and the morphological characteristics of cyst stages is also discussed.

Relevance:

30.00%

Publisher:

Abstract:

The Christo Inventory for Substance-Misuse Services (CISS) is a single-page outcome evaluation tool completed by drug and alcohol service workers, either on the basis of direct client interviews or from personal experience of the client supplemented by existing assessment notes. It was developed to help substance misuse services empirically demonstrate the effectiveness of their treatments to their respective funding bodies. Its 0 to 20 unidimensional scale consists of 10 items reflecting clients' problems with social functioning, general health, sexual/injecting risk behavior, psychological functioning, occupation, criminal involvement, drug/alcohol use, ongoing support, compliance, and working relationships. Good reliability and validity have already been demonstrated for the CISS [Christo et al., Drug and Alcohol Dependence 2000; 59: 189-197], but the original was written in English, and a Portuguese version is presented here. The present review explores its applicability to a Brazilian setting, summarizes its characteristics and uses, and describes the process of translation into Portuguese. A pilot study conducted in a substance misuse service for adolescents indicated that it is likely to be suitable for use among a Brazilian population. The simplicity, flexibility and brevity of the CISS make it a useful tool for comparing clients within and between many different service settings.
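The scale described above can be sketched as a simple scoring function: ten problem areas, each rated 0 to 2 (implied by the 0-20 range over ten items), summed into a single severity score. The example ratings below are invented for illustration.

```python
# The ten CISS problem areas, as listed in the abstract.
ITEMS = [
    "social functioning", "general health", "sexual/injecting risk behavior",
    "psychological functioning", "occupation", "criminal involvement",
    "drug/alcohol use", "ongoing support", "compliance", "working relationships",
]

def ciss_score(ratings):
    """Sum item ratings into a 0-20 score. ratings maps each item to 0, 1 or 2."""
    assert set(ratings) == set(ITEMS), "all ten items must be rated"
    assert all(r in (0, 1, 2) for r in ratings.values())
    return sum(ratings.values())

# Invented example: moderate problems everywhere, severe drug/alcohol use.
example = {item: 1 for item in ITEMS}
example["drug/alcohol use"] = 2
print(ciss_score(example))  # → 11
```

A single summed score like this is what makes the instrument usable for comparing clients within and between service settings.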

Relevance:

30.00%

Publisher:

Abstract:

The objective of this research is to examine the state of customer value management at Outotec Oyj, determine the key development areas, and develop a phase model to guide the development of a customer value based sales tool. The study was conducted with a constructive research approach, focusing on identifying a problem and developing a solution for it. As a basis for the study, the current literature on customer value assessment and on solution and customer value selling was reviewed. The data was collected in 16 interviews conducted in two rounds within the company and analyzed by open coding. First, seven important development areas were identified, of which the most critical were "Customer value mindset inside the company" and "Coordination of customer value management activities". Building on these seven areas, three functionality requirements ("Preparation", "Outotec's value creation and communication" and "Documentation") and three development requirements for a customer value sales tool were identified. The study concluded with the formulation of a phase model for building a customer value based sales tool. The model comprises five steps: 1) Enable customer value utilization, 2) Connect with the customer, 3) Create customer value, 4) Define a tool to facilitate value selling, and 5) Develop the sales tool. Further practical activities were also recommended as a guide for executing the phase model.

Relevance:

30.00%

Publisher:

Abstract:

Arterial baroreflex sensitivity estimated by pharmacological impulse stimuli depends on intrinsic signal variability and usually on a subjective choice of blood pressure (BP) and heart rate (HR) values. We propose a semi-automatic method to estimate cardiovascular reflex sensitivity to bolus infusions of phenylephrine and nitroprusside. Beat-to-beat BP and HR time series for male Wistar rats (N = 13) were obtained from the digitized signal (sample frequency = 2 kHz) and analyzed by the proposed method (PRM), developed in Matlab. In the PRM, the time series were low-pass filtered with zero-phase distortion (3rd-order Butterworth applied in the forward and reverse directions) and presented graphically, and the parameters were selected interactively. Differences between basal mean values and peak BP (deltaBP) and HR (deltaHR) values after drug infusion were used to calculate baroreflex sensitivity indexes, defined as the deltaHR/deltaBP ratio. The PRM was compared to the traditional method (TDM) employed by seven independent observers using files for reflex bradycardia (N = 43) and tachycardia (N = 61). Agreement was assessed by Bland and Altman plots. Dispersion among users, measured as the standard deviation, was higher for the TDM both for reflex bradycardia (0.60 ± 0.46 vs 0.21 ± 0.26 bpm/mmHg for the PRM, P < 0.001) and for tachycardia (0.83 ± 0.62 vs 0.28 ± 0.28 bpm/mmHg for the PRM, P < 0.001). The advantage of the present method lies in its objectivity, since the routine automatically calculates the desired parameters according to previous software instructions. This is an objective, robust and easy-to-use tool for cardiovascular reflex studies.
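The filtering and index computation described above can be sketched with SciPy rather than Matlab: a 3rd-order Butterworth applied forward and reverse (zero-phase, as in the PRM) via `filtfilt`, then the sensitivity index as the ratio of peak HR change to peak BP change. The signals below are synthetic, and the sampling rate and cutoff frequency are assumed illustrative values, not the authors' settings.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 20.0                                   # assumed beat-to-beat series rate (Hz)
t = np.arange(0, 30, 1 / fs)
# Synthetic phenylephrine-like response: BP rises, HR falls reflexively.
bp = 100 + 30 * np.exp(-((t - 10) ** 2) / 4) + np.random.default_rng(1).normal(0, 2, t.size)
hr = 350 - 60 * np.exp(-((t - 11) ** 2) / 4) + np.random.default_rng(2).normal(0, 3, t.size)

b, a = butter(3, 0.5 / (fs / 2))            # 3rd-order low-pass, assumed 0.5 Hz cutoff
bp_f = filtfilt(b, a, bp)                   # zero-phase filtering (forward + reverse)
hr_f = filtfilt(b, a, hr)

base_bp = bp_f[: int(5 * fs)].mean()        # basal mean over the first 5 s
base_hr = hr_f[: int(5 * fs)].mean()
delta_bp = bp_f.max() - base_bp             # peak pressor response
delta_hr = hr_f.min() - base_hr             # peak reflex bradycardia

sensitivity = delta_hr / delta_bp           # bpm/mmHg, negative for bradycardia
print(round(sensitivity, 2))
```

Automating the baseline and peak selection in this way is exactly what removes the observer-dependent choices that inflate dispersion in the traditional method.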

Relevance:

30.00%

Publisher:

Abstract:

Large-scale genome projects have generated a rapidly increasing number of DNA sequences. Therefore, the development of computational methods to rapidly analyze these sequences is essential for progress in genomic research. Here we present an automatic annotation system for the preliminary analysis of DNA sequences. The gene annotation tool (GATO) is a bioinformatics pipeline designed to facilitate routine functional annotation and easy access to annotated genes. It was designed in view of the frequent need of genomic researchers to access data pertaining to a common set of genes. In the GATO system, annotation is generated by querying Web-accessible resources, and the information is stored in a local database, which keeps a record of all previous annotation results. GATO may be accessed from anywhere through the internet, or it may be run locally if a large number of sequences are to be annotated. It is implemented in PHP and Perl and may be run on any suitable Web server. Usually, the installation and application of annotation systems require experience and are time consuming, but GATO is simple and practical, allowing anyone with basic informatics skills to use it without special training. GATO can be downloaded at [http://mariwork.iq.usp.br/gato/]. The minimum free disk space required is 2 MB.

Relevance:

30.00%

Publisher:

Abstract:

Previous assessments of verticality by means of the rod and the rod-and-frame tests indicated that human subjects can be more (field dependent) or less (field independent) influenced by a frame placed around a tilted rod. In the present study we propose a new approach to these tests. The judgment of visual verticality (rod test) was evaluated in 50 young subjects (28 males; ages ranging from 20 to 27 years) by randomly projecting a luminous rod tilted between -18 and +18° (negative values indicating leftward tilts) onto a tangent screen. In the rod-and-frame test, the rod was displayed within a luminous fixed frame tilted at +18 or -18°. Subjects were instructed to indicate verbally the direction of the rod's inclination (forced choice). Visual dependency was estimated by means of a Visual Index calculated from the rod and rod-and-frame test values. Based on this index, volunteers were classified as field dependent, intermediate or field independent. A fourth category was created for those field-independent subjects whose number of correct guesses in the rod-and-frame test exceeded that in the rod test, indicating improved performance when a surrounding frame was present. In conclusion, the combined use of the subjective visual vertical and the rod-and-frame test provides a specific and reliable evaluation of verticality in healthy subjects and might be of use for probing changes in brain function after central or peripheral lesions.