954 results for Computer software - Quality control


Relevance:

100.00%

Publisher:

Abstract:

Soil as an impurity in sugarcane is a serious problem for the ethanol industry, increasing production and maintenance costs and reducing productivity. Fe, Hf, Sc, and Th, determined by INAA, were used as tracers to assess the amount of soil in sugarcane from truckloads as well as in the juice extraction process. Quality control tools were applied to the results, identifying the need for stratification according to soil type and moisture. Soil levels in truckloads showed high variability, indicating potential for improving cutting and loading operations. Samples from the juice extraction process allowed the soil to be tracked through the mill tandem.
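
The abstract does not give the calculation behind the soil estimates, but one plausible reading of the tracer approach is a simple mass balance: the concentration of each soil tracer measured in the cane load is divided by its concentration in the local soil, and the ratios are averaged. A minimal sketch in Python, with entirely hypothetical concentrations:

```python
# Minimal sketch (not from the paper): estimating the soil mass fraction of a
# sugarcane load from tracer concentrations, assuming the tracers (Fe, Hf, Sc, Th)
# come almost entirely from adhering soil. All numbers are hypothetical.

TRACERS = ("Fe", "Hf", "Sc", "Th")

def soil_fraction(load_conc, soil_conc):
    """Average, over the tracers, of C_load / C_soil (mass-balance assumption)."""
    ratios = [load_conc[t] / soil_conc[t] for t in TRACERS]
    return sum(ratios) / len(ratios)

# Hypothetical concentrations in mg/kg (dry basis).
load = {"Fe": 310.0, "Hf": 0.09, "Sc": 0.12, "Th": 0.10}
soil = {"Fe": 31000.0, "Hf": 9.0, "Sc": 12.0, "Th": 10.0}

print(f"Estimated soil in load: {100 * soil_fraction(load, soil):.2f}%")
```

Averaging over several tracers limits the influence of any single element that the plant itself may take up.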

Relevance:

100.00%

Publisher:

Abstract:

The activity of validating identified requirements for an information system helps to improve the quality of a requirements specification document and, consequently, the success of a project. Although various support tools for requirements engineering exist in the market, there is still a lack of automated support for the validation activity. In this context, the purpose of this paper is to make up for that deficiency by using an automated tool to provide the resources needed for an adequate validation activity. The contribution of this study is to enable an agile and effective follow-up of the scope established for the requirements, so as to lead the development to a solution that satisfies the real needs of the users, as well as to supply project managers with relevant information about the maturity of the analysts involved in requirements specification.

Relevance:

100.00%

Publisher:

Abstract:

A study on the use of artificial intelligence (AI) techniques for the modelling and subsequent control of an electric resistance spot welding (ERSW) process is presented. The ERSW process is characterized by the coupling of thermal, electrical, mechanical, and metallurgical phenomena. For this reason, early attempts to model it with established computational methods, such as finite differences, finite elements, and finite volumes, require simplifications that either take the resulting model far from reality or make it too computationally expensive for use in a real-time control system. In this context, the authors have developed an ERSW controller that uses fuzzy logic to adjust the energy transferred to the weld nugget. The proposed control strategies differ in the speed with which they reach convergence. Moreover, their application to the quality control of spot welds through artificial neural networks (ANN) is discussed.
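
The abstract does not detail the controller's rule base, so the following is only a minimal Sugeno-style sketch of a fuzzy adjustment of the delivered energy; the normalized error input, the triangular membership functions, and the output singletons are assumptions made for illustration, not the authors' design.

```python
# Minimal Sugeno-style fuzzy controller sketch (not the authors' implementation):
# adjusts the energy delivered to the weld as a function of the error between a
# measured process variable (e.g. an estimated nugget temperature) and its set point.

def tri(x, a, b, c):
    """Triangular membership function with vertices a <= b <= c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_energy_adjustment(error):
    """error = set_point - measurement, normalized to [-1, 1]."""
    # Degrees of membership of the error in three fuzzy sets.
    negative = tri(error, -1.5, -1.0, 0.0)   # too much energy delivered
    zero     = tri(error, -1.0,  0.0, 1.0)   # on target
    positive = tri(error,  0.0,  1.0, 1.5)   # too little energy delivered
    # Rule consequents as singletons (relative change in energy for the next cycle).
    weights = [negative, zero, positive]
    outputs = [-0.10, 0.0, +0.10]
    total = sum(weights)
    return sum(w * o for w, o in zip(weights, outputs)) / total if total else 0.0

print(fuzzy_energy_adjustment(0.4))   # small positive correction
```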

Relevance:

100.00%

Publisher:

Abstract:

The procedure of on-line process control by attributes, known as Taguchi's on-line process control, consists of inspecting a single item at every m produced items and deciding, at each inspection, whether the fraction of conforming items has been reduced or not. If the inspected item is nonconforming, production is stopped for adjustment. As the inspection system can be subject to diagnosis errors, a probabilistic model is developed in which the examined item is classified repeatedly until either a conforming or b nonconforming classifications are observed; whichever event occurs first determines the final classification of the item. Properties of an ergodic Markov chain were used to obtain the expression for the average cost of the control system, which can be optimized over three parameters: the sampling interval of the inspections (m), the number of repeated conforming classifications (a), and the number of repeated nonconforming classifications (b). The optimum design is compared with two alternative approaches. The first is a simple preventive policy: the production system is adjusted at every n produced items and no inspection is performed. The second classifies the examined item repeatedly a fixed number of times, r, and considers it conforming if most of the classification results are conforming. Results indicate that the current proposal performs better than the procedure that fixes the number of repeated classifications and classifies the item as conforming if most classifications are conforming. On the other hand, depending on the magnitude of the errors and the costs, the preventive policy can be, on average, the most economical alternative among those considered. A numerical example illustrates the proposed procedure. (C) 2009 Elsevier B.V. All rights reserved.
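
As a rough illustration of the repeated-classification rule described above (not of the paper's Markov-chain cost model), the sketch below simulates the stopping rule for assumed diagnosis-error probabilities and estimates the chance that a conforming item is finally declared conforming:

```python
# Simulation sketch of the repeated-classification rule; the error probabilities
# alpha and beta and the thresholds a and b are illustrative, not from the paper.
import random

def classify(is_conforming, alpha=0.05, beta=0.10):
    """One (possibly erroneous) classification of the inspected item.
    alpha: P(declared nonconforming | item conforming)
    beta:  P(declared conforming | item nonconforming)"""
    if is_conforming:
        return random.random() >= alpha    # True = classified as conforming
    return random.random() < beta

def final_decision(is_conforming, a=3, b=2):
    """Classify repeatedly until a conforming or b nonconforming results occur;
    the threshold reached first gives the final classification."""
    conf = nonconf = 0
    while conf < a and nonconf < b:
        if classify(is_conforming):
            conf += 1
        else:
            nonconf += 1
    return conf == a   # True = item finally declared conforming

trials = 100_000
accepted = sum(final_decision(True) for _ in range(trials))
print(f"P(declared conforming | item conforming) ≈ {accepted / trials:.4f}")
```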

Relevance:

100.00%

Publisher:

Abstract:

Background: Restriction fragment length polymorphism (RFLP) analysis is a common molecular assay used for genotyping, and it requires validated quality control procedures to prevent mistyping caused by impaired endonuclease activity. We have evaluated the usefulness of a plasmid-based internal control in RFLP assays. Results: Blood samples were collected from 102 individuals with acute myocardial infarction (AMI) and 108 non-AMI individuals (controls) for DNA extraction and laboratory analyses. The 1196C>T polymorphism in the toll-like receptor 4 (TLR4) gene was amplified by mismatch polymerase chain reaction (PCR). Amplicons and pBluescript II SK plasmid were simultaneously digested with the endonuclease HincII, and the fragments were separated on 2% agarose gels. The plasmid was completely digested using DNA solutions of up to 55.2 nmol/L together with 1 µL of PCR product. Nevertheless, plasmid DNA at concentrations of 41.4 nmol/L or higher was incompletely digested in the presence of 7 µL of PCR product. Under the standardized conditions, the TLR4 1196C>T variant was accurately genotyped. The TLR4 1196T allele frequency was similar between AMI patients (3.1%) and controls (2.0%, p = 0.948), and the TLR4 SNP was not associated with AMI in this sample population. In conclusion, the plasmid-based control is a useful approach to prevent mistyping in RFLP assays, and it is validated for genetic association studies such as TLR4 1196C>T.
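
The abstract does not state which statistical test produced the reported p-value, but a common way to compare allele frequencies between two groups is a 2x2 test on allele counts. A minimal sketch with hypothetical counts (chosen only to roughly match the reported frequencies, not taken from the study):

```python
# Allele-frequency association sketch; the counts below are hypothetical placeholders.
from scipy.stats import fisher_exact

# 2x2 table of allele counts: rows = AMI / controls, columns = T allele / C allele.
table = [[6, 198],   # hypothetical AMI group (2 x 102 chromosomes)
         [4, 212]]   # hypothetical control group (2 x 108 chromosomes)

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```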

Relevance:

100.00%

Publisher:

Abstract:

Much information on the flavonoid content of Brazilian foods has already been obtained; however, this information is spread across scientific publications and unpublished data. The objectives of this work were to compile and evaluate the quality of national flavonoid data according to the United States Department of Agriculture's Data Quality Evaluation System (USDA-DQES), with few modifications, for future dissemination in the TBCA-USP (Brazilian Food Composition Database). For the compilation, the most abundant compounds in the flavonoid subclasses were considered (flavonols, flavones, isoflavones, flavanones, flavan-3-ols, and anthocyanidins), and analysis of the compounds by HPLC was adopted as the criterion for data inclusion. The evaluation system considers five categories, and the maximum score assigned to each category is 20. Each data point was given a confidence code (CC) of A, B, C, or D, indicating the quality and reliability of the information. Flavonoid data (773 values) for 197 Brazilian foods were evaluated. The CC "C" (average) was attributed to 99% of the data and "B" (above average) to 1%. The main categories assigned low average scores were number of samples, sampling plan, and analytical quality control (average scores of 2, 5, and 4, respectively). The analytical method category received an average score of 9, and the category with the highest score was sample handling (average of 20). These results show that researchers need to be aware of the importance of the number and plan of evaluated samples and of the complete description and documentation of all steps of method execution and analytical quality control. (C) 2010 Elsevier Inc. All rights reserved.
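
As an illustration of how such a scoring scheme can be operationalized, the sketch below sums the five category scores and maps the total to a confidence code; the cut-off values are placeholders and not the actual USDA-DQES or TBCA-USP thresholds.

```python
# Scoring sketch: five categories, each scored 0-20, mapped to a confidence code.
# The cut-offs below are assumed for illustration only.

CATEGORIES = ("number of samples", "sampling plan", "sample handling",
              "analytical method", "analytical quality control")

def confidence_code(scores):
    """Sum the five category scores (each 0-20) and map the total to a code A-D."""
    quality_index = sum(scores[c] for c in CATEGORIES)          # 0-100
    for code, threshold in (("A", 80), ("B", 60), ("C", 40)):   # assumed cut-offs
        if quality_index >= threshold:
            return code
    return "D"

# Hypothetical data point mirroring the average category scores reported above.
example = {"number of samples": 2, "sampling plan": 5, "sample handling": 20,
           "analytical method": 9, "analytical quality control": 4}
print(confidence_code(example))   # -> "C" under the assumed cut-offs
```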

Relevance:

100.00%

Publisher:

Abstract:

Due to differences in the functional quality of natural extracts, differences in their effectiveness are also observed. The aim was therefore to assess the antioxidant activity of natural extracts as a measure of their functional quality. All the extracts (brown and green propolis, Ginkgo biloba and Isoflavin Beta®) and the standard used (quercetin) showed antioxidant activity in a dose-dependent manner, with IC50 values ranging from 0.21 to 155.28 µg mL⁻¹ (inhibition of lipid peroxidation and DPPH• scavenging assays). A high correlation (r² = 0.9913) was observed between the antioxidant methods; on the other hand, the antioxidant activity was not related to the polyphenol and flavonoid content. As the DPPH• assay is fast, low-cost, and highly correlated with the other antioxidant methods, it could be applied as an additional parameter in the quality control of natural extracts.
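
IC50 values such as those reported here are read off a dose-response curve. A minimal way to estimate one, assuming illustrative data and simple interpolation on a log-concentration scale rather than the authors' fitting procedure:

```python
# IC50 estimation sketch with hypothetical DPPH-assay data.
import numpy as np

# Hypothetical extract concentrations (µg/mL) and measured % inhibition.
conc = np.array([0.1, 1.0, 10.0, 100.0, 1000.0])
inhibition = np.array([5.0, 18.0, 42.0, 71.0, 90.0])

# Interpolate log10(concentration) at 50 % inhibition.
log_ic50 = np.interp(50.0, inhibition, np.log10(conc))
print(f"IC50 ≈ {10 ** log_ic50:.1f} µg/mL")
```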

Relevance:

100.00%

Publisher:

Abstract:

Pharmaceuticals can exist in many solid forms, which can have different physical and chemical properties. These solid forms include polymorphs, solvates, amorphous forms, and hydrates. In particular, hydration can be quite common, since pharmaceutical solids can be in contact with water during the manufacturing process and can also be exposed to water during storage. In the present work, it is shown that the NQR technique is capable of detecting different hydrated forms not only in the pure raw material but also in the final product (tablets), making it a useful technique for quality control. The technique was also used to study the dehydration process from the pentahydrate to the trihydrate.

Relevance:

100.00%

Publisher:

Abstract:

A tandem ionization chamber was developed for quality control programs of X-ray equipment used in conventional radiography and mammography. A methodology for the use of the tandem chamber in the constancy check of diagnostic X-ray beam qualities was established, and its application at a medical X-ray imaging facility is presented. The use of the tandem chamber in the constancy check of diagnostic X-ray beam qualities is a useful method for monitoring the performance of the X-ray equipment. (c) 2008 Elsevier Ltd. All rights reserved.
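
The abstract does not spell out the acceptance criterion, but a tandem-chamber constancy check is typically based on the ratio of the two chamber readings compared against a baseline value. A minimal sketch with illustrative readings, baseline, and tolerance (all assumed, not values from the paper):

```python
# Constancy-check sketch based on the tandem ratio; all numbers are illustrative.

def constancy_check(reading_a, reading_b, baseline_ratio, tolerance=0.05):
    """Return (ratio, within_tolerance) for one beam-quality constancy test."""
    ratio = reading_a / reading_b
    deviation = abs(ratio - baseline_ratio) / baseline_ratio
    return ratio, deviation <= tolerance

ratio, ok = constancy_check(reading_a=2.45, reading_b=1.98, baseline_ratio=1.25)
print(f"tandem ratio = {ratio:.3f}, within tolerance: {ok}")
```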

Relevance:

100.00%

Publisher:

Abstract:

Introduction: Current advances in frame modeling and computer software allow stereotactic procedures to be performed with great accuracy and minimal risk of neural tissue or vascular injury. Case Report: In this report we combine a previously described minimally invasive stereotactic technique with state-of-the-art 3D computer guidance technology to successfully treat a 55-year-old patient with an arachnoid cyst obstructing the aqueduct of Sylvius. We provide detailed technical information and discuss how this technique overcomes previous limitations on stereotactic manipulation of the aqueductal region. We further discuss current advances in neuroendoscopy for treating obstructive hydrocephalus and compare them with our proposed technique. Conclusion: We advocate that this technique is not only capable of treating this pathology but also has the advantage of enabling the reestablishment of physiological CSF flow, thus preventing future brainstem compression by cyst enlargement.

Relevance:

100.00%

Publisher:

Abstract:

At present, there is a variety of formalisms for modeling and analyzing the communication behavior of components. Due to the tremendous increase in the size and complexity of embedded systems, accompanied by shorter time-to-market cycles and cost reduction, so-called behavioral type systems are becoming more and more important. This chapter presents an overview and a taxonomy of behavioral types. The intention of this taxonomy is to provide guidance for software engineers and to form the basis for future research.

Relevance:

100.00%

Publisher:

Abstract:

Over the past years, component-based software engineering has become an established paradigm in the area of complex, software-intensive systems. However, many techniques for analyzing these systems for critical properties currently do not make use of the component orientation. In particular, safety analysis of component-based systems is an open field of research. In this chapter we investigate the problems that arise and define a set of requirements that apply when adapting the analysis of safety properties to a component-based software engineering process. Based on these requirements, some important component-oriented safety evaluation approaches are examined and compared.

Relevance:

100.00%

Publisher:

Abstract:

With the proliferation of relational database programs for PCs and other platforms, many business end-users are creating, maintaining, and querying their own databases. More importantly, business end-users use the output of these queries as the basis for operational, tactical, and strategic decisions. Inaccurate data reduce the expected quality of these decisions. Implementing various input validation controls, including higher levels of normalisation, can reduce the number of data anomalies entering the databases. Even in well-maintained databases, however, data anomalies will still accumulate. To improve the quality of data, databases can be queried periodically to locate and correct anomalies. This paper reports the results of two experiments that investigated the effects of different data structures on business end-users' ability to detect data anomalies in a relational database. The results demonstrate that both unnormalised structures and higher levels of normalisation lower the effectiveness and efficiency of queries relative to first normal form. First normal form databases appear to provide the most effective and efficient data structure for business end-users formulating queries to detect data anomalies.
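
As a concrete example of the kind of anomaly-detection query involved (the schema and data are hypothetical, not the experimental materials), a first-normal-form table can be scanned for conflicting attribute values with a single GROUP BY query:

```python
# Anomaly-detection sketch on a hypothetical first-normal-form table:
# find customer IDs stored with more than one distinct customer name.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER,
                         customer_name TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 100, 'Acme Ltd',  250.0),
        (2, 100, 'Acme Ltd.', 120.0),  -- inconsistent spelling: a data anomaly
        (3, 200, 'Beta Corp',  90.0);
""")

anomalies = conn.execute("""
    SELECT customer_id, COUNT(DISTINCT customer_name) AS name_variants
    FROM orders
    GROUP BY customer_id
    HAVING COUNT(DISTINCT customer_name) > 1
""").fetchall()

print(anomalies)   # -> [(100, 2)]
```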