954 results for Costing methodologies


Relevance:

10.00%

Publisher:

Abstract:

ω-Transaminases were evaluated as biocatalysts in the reductive amination of organoselenium acetophenones to the corresponding amines, and in the kinetic resolution of racemic organoselenium amines. Kinetic resolution proved more efficient than asymmetric reductive amination. Using these methodologies, both amine enantiomers were obtained in high enantiomeric excess (up to 99%). Derivatives of the optically pure o-selenium 1-phenylethylamine thus obtained were evaluated as ligands in palladium-catalyzed asymmetric alkylation, giving the alkylated product in up to 99% ee.

Background: Mutations in TP53 are common events during carcinogenesis. In addition to gene mutations, several reports have focused on TP53 polymorphisms as risk factors for malignant disease. Many studies have highlighted that the status of the TP53 codon 72 polymorphism could influence cancer susceptibility. However, the results have been inconsistent, and various methodological features can contribute to departures from Hardy-Weinberg equilibrium, a condition that may influence disease risk estimates. The most widely accepted way of detecting genotyping error is to confirm genotypes by sequencing and/or by a separate method. Results: We developed two new genotyping methods for TP53 codon 72 polymorphism detection: Denaturing High Performance Liquid Chromatography (DHPLC) and Dot Blot hybridization. These methods were compared with Restriction Fragment Length Polymorphism (RFLP) analysis using two different restriction enzymes. We observed high agreement among all methodologies assayed. Dot Blot hybridization and DHPLC results were more concordant with each other than either was with RFLP. Conclusions: Although variations may occur, our results indicate that DHPLC and Dot Blot hybridization can be used as reliable screening methods for TP53 codon 72 polymorphism detection, especially in molecular epidemiologic studies, where high-throughput methodologies are required.

The aim of this paper is to highlight some methods of representing imagetic (image-based) information, reviewing the literature of the area and proposing a methodological model adapted to Brazilian museums. A methodology for representing imagetic information is developed based on the Brazilian characteristics of information treatment, in order to adapt it to museums. Finally, spreadsheets illustrating this methodology are presented.

Information Architecture design is one of the initial stages of a website project; detecting and correcting errors at this stage is therefore easier and less time-consuming than in later stages. However, to minimize errors in information architecture projects, a methodology is necessary to organize the professional's work and guarantee the quality of the final product. The profile of professionals who work with Information Architecture in Brazil was analyzed (quantitative research by means of an online questionnaire), as well as the difficulties, techniques and methodologies found in their projects (qualitative research by means of in-depth interviews supported by the Sense-Making approach). It is concluded that information architecture project methodologies need to further adopt User-Centered Design approaches and ways to evaluate their results.

Hot tensile and creep tests were carried out on Kanthal A1 alloy in the temperature range from 600 to 800 °C. Each set of data was analyzed separately according to its own methodology, but an attempt was made to find a correlation between them. A new criterion proposed for converting hot tensile data to creep data makes it possible to analyze the two kinds of results according to the usual creep relations, such as those of Norton, Monkman-Grant and Larson-Miller. The remarkable compatibility verified between both sets of data by this procedure strongly suggests that hot tensile data can be converted to creep data, and vice versa, for Kanthal A1 alloy, as previously verified for other metallic materials.
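The Larson-Miller relation mentioned in this kind of creep analysis can be sketched numerically. The constant C (commonly taken near 20 for many alloys) and the temperature/time values below are hypothetical illustrations, not the fitted parameters from the study:

```python
# Illustrative sketch of the Larson-Miller parameter, which correlates
# creep rupture time with temperature: LMP = T * (C + log10(t_r)).
# C and the test conditions below are hypothetical, not from the study.
import math

def larson_miller(temp_k: float, rupture_hours: float, c: float = 20.0) -> float:
    """Larson-Miller parameter for a temperature (K) and rupture time (h)."""
    return temp_k * (c + math.log10(rupture_hours))

# Two hypothetical test conditions: if they yield the same LMP, they are
# expected to correspond to the same applied stress.
lmp_short = larson_miller(1073.0, 100.0)   # ~800 °C, 100 h
lmp_long = larson_miller(1023.0, 1000.0)   # ~750 °C, 1000 h
print(round(lmp_short), round(lmp_long))
```

A criterion for converting hot tensile data to creep data amounts to mapping each tensile test onto an equivalent (stress, LMP) point so both data sets can be plotted on a single master curve.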

This work presents a critical analysis of methodologies to evaluate the effective (or generalized) electromechanical coupling coefficient (EMCC) for structures with piezoelectric elements. First, a review of several existing methodologies to evaluate material and effective EMCC is presented. To illustrate the methodologies, a comparison is made between numerical, analytical and experimental results for two simple structures: a cantilever beam with bonded extension piezoelectric patches and a simply-supported sandwich beam with an embedded shear piezoceramic. An analysis of the electric charge cancelation effect on the effective EMCC, observed in long piezoelectric patches, is performed. It confirms the importance of enforcing the equipotentiality condition of the electrodes in the finite element model. Its results also indicate that smaller (segmented) and independent piezoelectric patches could be more interesting for energy conversion efficiency. Then, parametric analyses and optimization are performed for a cantilever sandwich beam with several embedded shear piezoceramic patches. Results indicate that to fully benefit from the higher material coupling of shear piezoceramics, attention must be paid to the configuration design so that the shear strains in the patches are maximized. In particular, effective squared EMCC values higher than 1% were obtained by embedding nine well-spaced short piezoceramic patches in an aluminum/foam/aluminum sandwich beam.
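As a rough illustration of the quantity being compared, the effective EMCC of a vibration mode is commonly estimated from its open- and short-circuit natural frequencies. The sketch below uses this standard frequency-based estimate with invented frequencies, not data from the paper:

```python
# Standard frequency-based estimate of the effective electromechanical
# coupling coefficient (EMCC) of a mode:
#   k_eff^2 = (f_oc^2 - f_sc^2) / f_oc^2
# where f_oc and f_sc are the modal frequencies with the piezoelectric
# electrodes open- and short-circuited. Frequencies below are invented.

def effective_emcc_squared(f_open: float, f_short: float) -> float:
    """Effective squared EMCC from open/short-circuit modal frequencies (Hz)."""
    return (f_open**2 - f_short**2) / f_open**2

# Hypothetical first bending mode of a sandwich beam:
k2 = effective_emcc_squared(f_open=101.0, f_short=100.0)
print(f"{100 * k2:.2f}%")  # squared EMCC expressed as a percentage
```

A 1 Hz open/short-circuit shift on a ~100 Hz mode already corresponds to a squared EMCC near the 1% level discussed in the abstract, which is why accurate modal frequencies (and correct electrode boundary conditions) matter in the finite element model.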

In this study, further improvements to the fault location problem for power distribution systems are presented. The proposed improvements concern the consideration of the capacitive effect in impedance-based fault location methods, by adopting an exact line segment model for the distribution line. The proposed developments consist of a new formulation for the fault location problem and a new algorithm that considers the line shunt admittance matrix. The proposed equations are developed for any fault type and result in a single equation for all ground fault types and another equation for line-to-line faults. Results obtained with the proposed improvements are presented. In addition, to compare performance and demonstrate how the line shunt admittance affects state-of-the-art impedance-based fault location methodologies for distribution systems, results obtained with two other existing methods are presented. Comparative results show that, in overhead distribution systems with laterals and intermediate loads, the line shunt admittance can significantly affect the response of state-of-the-art methodologies, and that the proposed developments yield substantial improvements in this case by considering this effect.

Carbonation is one of the main concerns for concrete service life in tropical countries. The mechanism and the materials that produce it have been widely studied, as have natural and accelerated methods to report and analyze it. In spite of the reported investigations, there is a need for information that would allow an adequate interpretation of results in the standardization process. This lack of information can produce variations not only in the interpretation but also in the predictions of service life. The purpose of this paper is to analyze and discuss variables that could be sources of error, especially when performing accelerated tests. As a result, methodologies to minimize variations when interpreting and comparing results are proposed, addressing variables such as specimen geometry and preconditioning, spacing, relative humidity, and CO₂ concentration.

Ecological niche modelling combines species occurrence points with environmental raster layers in order to obtain models for describing the probabilistic distribution of species. The process to generate an ecological niche model is complex. It requires dealing with a large amount of data, use of different software packages for data conversion, for model generation and for different types of processing and analyses, among other functionalities. A software platform that integrates all requirements under a single and seamless interface would be very helpful for users. Furthermore, since biodiversity modelling is constantly evolving, new requirements are constantly being added in terms of functions, algorithms and data formats. This evolution must be accompanied by any software intended to be used in this area. In this scenario, a Service-Oriented Architecture (SOA) is an appropriate choice for designing such systems. According to SOA best practices and methodologies, the design of a reference business process must be performed prior to the architecture definition. The purpose is to understand the complexities of the process (business process in this context refers to the ecological niche modelling problem) and to design an architecture able to offer a comprehensive solution, called a reference architecture, that can be further detailed when implementing specific systems. This paper presents a reference business process for ecological niche modelling, as part of a major work focused on the definition of a reference architecture based on SOA concepts that will be used to evolve the openModeller software package for species modelling. The basic steps that are performed while developing a model are described, highlighting important aspects, based on the knowledge of modelling experts. In order to illustrate the steps defined for the process, an experiment was developed, modelling the distribution of Ouratea spectabilis (Mart.) Engl. (Ochnaceae) using openModeller. 
As a consequence of the knowledge gained in this work, many desirable improvements to modelling software packages have been identified and are presented. A discussion of the potential for large-scale experimentation in ecological niche modelling is also provided, highlighting opportunities for research. The results are very important for those involved in the development of modelling tools and systems, for requirements analysis, and for providing insight into new features and trends for this category of systems. They can also be very helpful for beginners in modelling research, who can use the process and the example experiment as a guide to this complex activity.
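As a loose illustration of the core modelling step described above (occurrence points combined with environmental layers to yield a suitability prediction), the sketch below implements a toy BIOCLIM-style envelope model with invented data; it is not openModeller's algorithm or data:

```python
# Toy sketch of ecological niche modelling: environmental values
# (e.g. temperature in °C, precipitation in mm) sampled at known
# occurrence points define an envelope; a location is predicted
# suitable if all its variables fall inside that envelope.
# All numbers are invented for illustration.

occurrences = [(24.1, 1300.0), (22.8, 1450.0), (25.0, 1200.0)]

# The envelope is the per-variable (min, max) over the occurrences.
envelope = [(min(v), max(v)) for v in zip(*occurrences)]

def suitable(cell: tuple) -> bool:
    """A raster cell is 'suitable' if every variable is inside the envelope."""
    return all(lo <= x <= hi for x, (lo, hi) in zip(cell, envelope))

print(suitable((23.5, 1350.0)))  # inside the envelope
print(suitable((30.0, 1350.0)))  # temperature outside the envelope
```

Real modelling algorithms (GARP, Maxent, etc.) are far more sophisticated, but they share this basic shape: fit on occurrence/environment pairs, then project over every cell of the raster layers.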

This paper reports research that evaluated the product development methodologies used in Brazilian small and medium-sized metal-mechanic enterprises (SMEs) in a specific region of São Paulo. The data were collected with a questionnaire developed and applied through interviews conducted by the researchers in 32 companies. The main focus of the paper can be condensed into the synthesis question "Is the company alone responsible for the development?", which was analyzed thoroughly. The results of this analysis were evaluated directly (through the respective percentages of answers) and statistically (through an index that indicates whether two questions are related). The results point to a degree of maturity in the SMEs that allows product development to be conducted in cooperation networks.

The common practice of reconciliation is based on the definition of the mine call factor (MCF) and its application to resource or grade-control estimates. The MCF expresses the difference, as a ratio or percentage, between the predicted grade and the grade reported by the plant; its application therefore allows future estimates to be corrected. This practice is named reactive reconciliation. However, the use of generic factors applied across differing time scales and material types often disguises the causes of the error responsible for the discrepancy. The root causes of any given variance can only be identified by analyzing the information behind the variance and then making changes to methodologies and processes. This practice is named prognostication, or proactive reconciliation: an iterative process resulting in constant recalibration of the inputs and the calculations. Prognostication allows personnel to adjust processes so that results align within acceptable tolerance ranges, rather than merely correcting model estimates. This study analyses the reconciliation practices performed at a gold mine in Brazil and suggests a new sampling protocol based on prognostication concepts.
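The mine call factor described above is a simple ratio, which can be sketched as follows (the grade values are hypothetical, not data from the mine in the study):

```python
# Sketch of the mine call factor (MCF): the ratio between the grade
# reported by the plant and the grade predicted by the resource or
# grade-control model. The grades below are invented for illustration.

def mine_call_factor(reported: float, predicted: float) -> float:
    """MCF as a ratio; multiply by 100 for a percentage."""
    return reported / predicted

predicted_au = 2.50   # g/t Au predicted by the grade-control model
reported_au = 2.30    # g/t Au reported by the plant

mcf = mine_call_factor(reported_au, predicted_au)
print(f"MCF = {mcf:.2f} ({100 * mcf:.0f}%)")

# Reactive reconciliation would simply scale future estimates by the MCF;
# prognostication instead investigates what caused the 8% shortfall.
corrected_estimate = predicted_au * mcf
```

The abstract's criticism is visible even in this toy: the single scalar `mcf` says nothing about whether the shortfall came from sampling error, dilution, or plant losses, which is exactly what proactive reconciliation tries to diagnose.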

Modern Integrated Circuit (IC) design is characterized by a strong trend of Intellectual Property (IP) core integration into complex system-on-chip (SOC) architectures. These cores require thorough verification of their functionality to avoid erroneous behavior in the final device. Formal verification methods are capable of detecting any design bug but, due to state explosion, their use remains limited to small circuits. Alternatively, simulation-based verification can explore hardware descriptions of any size, although the corresponding stimulus generation, as well as the functional coverage definition, must be carefully planned to guarantee its efficacy. In general, static input space optimization methodologies have shown better efficiency and results than, for instance, Coverage Directed Verification (CDV) techniques, although the two act on different facets of the monitored system and are not exclusive. This work presents a constrained-random simulation-based functional verification methodology in which, on the basis of the Parameter Domains (PD) formalism, irrelevant and invalid test case scenarios are removed from the input space. To this purpose, a tool was developed to automatically generate PD-based stimuli sources. Additionally, we developed a second tool to generate functional coverage models that fit the PD-based input space exactly. Both the input stimuli and the coverage model enhancements resulted in a notable increase in testbench efficiency compared to testbenches with traditional stimulation and coverage scenarios: a 22% simulation time reduction when generating stimuli with our PD-based stimuli sources (still with a conventional coverage model), and a 56% simulation time reduction when combining our stimuli sources with their corresponding, automatically generated, coverage models.
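The general idea of removing irrelevant and invalid scenarios from a constrained-random input space can be sketched as follows; the parameter domains and the cross-parameter constraint are invented for illustration and do not reproduce the paper's PD formalism or tooling:

```python
# Minimal sketch of constrained-random stimulus generation over explicit
# parameter domains. The fields, values and constraint are hypothetical.
import random

# Parameter domains: only the listed values are valid for each field.
DOMAINS = {
    "opcode": ["ADD", "SUB", "LOAD", "STORE"],
    "width": [8, 16, 32],
}

def valid(stimulus: dict) -> bool:
    # Example cross-parameter constraint: memory operations are only
    # defined for 32-bit accesses in this hypothetical core.
    if stimulus["opcode"] in ("LOAD", "STORE"):
        return stimulus["width"] == 32
    return True

def random_stimulus(rng: random.Random) -> dict:
    # Rejection sampling: draw from the domains and discard invalid
    # combinations, so no simulation time is wasted on them.
    while True:
        s = {name: rng.choice(values) for name, values in DOMAINS.items()}
        if valid(s):
            return s

rng = random.Random(0)  # seeded for reproducible regression runs
batch = [random_stimulus(rng) for _ in range(5)]
assert all(valid(s) for s in batch)
```

A coverage model that "fits the input space exactly", in these terms, would enumerate only the valid (opcode, width) combinations as coverage bins, rather than the full cross product.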

This study aimed to verify the occurrence of Listeria monocytogenes and Salmonella spp. in raw milk produced in Brazil. On account of the poor microbiological quality of this product, possible interference from the indigenous microbiota with these pathogens was also evaluated. Two hundred and ten raw milk samples were collected in four important milk-producing areas of Brazil, tested for the presence of L. monocytogenes and Salmonella spp., and subjected to enumeration of indicator microorganisms: mesophilic aerobes, total coliforms and Escherichia coli. The interference of the indigenous microbiota in the isolation procedures was also tested, as well as the frequency of naturally occurring raw milk strains with antagonistic activity against both pathogens. The pathogens were not isolated from any raw milk sample, but the poor microbiological quality was confirmed by the high levels of indicator microorganisms. When present at high levels, the indigenous microbiota clearly interfered with the L. monocytogenes and Salmonella spp. isolation methodologies, mainly when the pathogens occurred at low levels. Three hundred and sixty raw milk strains were tested for antagonistic activity against both pathogens; 91 (25.3%) showed inhibitory activity against L. monocytogenes and 33 (9.2%) against Salmonella spp. The majority of the antagonistic strains were identified as lactic acid bacteria, mainly Lactococcus lactis subsp. lactis and Enterococcus faecium, species known for antimicrobial substance production.

The scaled-up preparation of 1H-pyrazole, 1-phenylpyrazole and isoxazole via sonocatalysis is reported. The products were isolated in good yields and short reaction times. These compounds were assayed for antioxidant activity by the ORAC and DPPH methodologies. The results showed that only 1-phenylpyrazole presented good antioxidant activity compared with Trolox®.

Molecular modeling methodologies were applied in preliminary studies of the release of active agents from potentially antichagasic and antileishmanial dendrimer prodrugs. The dendrimer was designed with myo-inositol as the core, L-malic acid as the spacer group, and hydroxymethylnitrofurazone (NFOH), 3-hydroxyflavone or quercetin as the active compounds. Each dendrimer presented a particular behavior with respect to the investigated properties: spatial hindrance, map of electrostatic potential (MEP), and lowest unoccupied molecular orbital energy (E(LUMO)). Additionally, the findings suggest that the carbonyl group next to the active agent is the most promising ester breaking point.