Abstract:
This paper details a strategy for modifying the source code of a complex model so that the model may be used in a data assimilation context, and gives the standards for implementing a data assimilation code to use such a model. The strategy relies on keeping the model separate from any data assimilation code, and coupling the two through Message Passing Interface (MPI) functionality. This strategy limits the changes necessary to the model and as such is rapid to program, at the expense of ultimate performance. The implementation technique is applied to different models with state dimension up to $2.7 \times 10^8$. The overheads added by this implementation strategy in a coupled ocean-atmosphere climate model are shown to be an order of magnitude smaller than the cost of adding the correlated stochastic random errors necessary for some nonlinear data assimilation techniques.
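The coupling pattern described can be sketched as follows, with Python's multiprocessing Pipe standing in for the MPI communicator; the function names (run_model, assimilate) and the trivial nudging step are illustrative assumptions, not taken from the paper:

```python
# Sketch of the coupling pattern: the model runs unchanged in its own
# process and exchanges its state vector with a separate data assimilation
# (DA) process through message passing. A multiprocessing Pipe stands in
# for MPI here; the model and the DA code never share internal state.
from multiprocessing import Process, Pipe

def run_model(conn, state, n_steps):
    """Model process: step the model, send the forecast, receive the analysis."""
    for _ in range(n_steps):
        state = [x + 1.0 for x in state]   # stand-in for one model time step
        conn.send(state)                   # hand the forecast to the DA code
        state = conn.recv()                # receive the assimilated state back
    conn.send(state)
    conn.close()

def assimilate(forecast, observation):
    """DA side: trivial nudging toward the observation (purely illustrative)."""
    return [0.5 * (f + observation) for f in forecast]

if __name__ == "__main__":
    parent, child = Pipe()
    p = Process(target=run_model, args=(child, [0.0, 0.0], 3))
    p.start()
    for _ in range(3):
        forecast = parent.recv()
        parent.send(assimilate(forecast, observation=10.0))
    final = parent.recv()
    p.join()
    print(final)
```

Because the model only gains a handful of send/receive calls, the changes to its source stay small, which is the trade-off the abstract describes: minimal intrusion at some cost in ultimate performance.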
Abstract:
Key Performance Indicators (KPIs) are the main instruments of Business Performance Management. KPIs are measures derived from both the strategy and the business processes. These measures are often designed for an industry sector under assumptions about the business processes in organizations. However, those assumptions can be too incomplete to guarantee the required properties of the KPIs. This raises the need to validate the properties of KPIs prior to their application in performance measurement. This paper applies the method called EXecutable Requirements Engineering Management and Evolution (EXTREME) to the validation of KPI definitions. EXTREME semantically relates the goal modeling, conceptual modeling and protocol modeling techniques into one methodology. The synchronous composition built into protocol modeling enables traceability of goals in protocol models and constructive definitions of a KPI. The application of the method clarifies the meaning of KPI properties and the procedures for their assessment and validation.
Abstract:
Searching for and mapping the physical extent of unmarked graves using geophysical techniques has proven difficult in many cases. The success of individual geophysical techniques for detecting graves varies on a site-by-site basis. Significantly, detection of graves often results from measured contrasts that are linked to the background soils rather than to the type of archaeological feature associated with the grave. It is evident that the investigation of buried remains should be considered within a 3D space, as the burial environment can vary considerably throughout the grave. Within this paper, we demonstrate the need for a multi-method survey strategy to investigate unmarked graves, as applied at a "planned" but unmarked paupers' cemetery. The outcome of this case study provides new insights into the strategy that is required at such sites. Perhaps the most significant conclusion is that unmarked graves are best understood in terms of characterization rather than identification. We argue for a methodological approach that, while following the current trend of using multiple techniques, is fundamentally dependent on a structured approach to the analysis of the data. The ramifications of this case study illustrate the necessity of an integrated strategy to provide a more holistic understanding of unmarked graves, which may aid in the management of these unseen but important aspects of our heritage. It is concluded that the search for graves is still a matter of current debate, and one that will be settled by methodological rather than technique-based arguments.
Abstract:
Modular product architectures have generated numerous benefits for companies in terms of cost, lead-time and quality. The defined interfaces and the modules' properties decrease the effort to develop new product variants, and provide an opportunity to perform parallel tasks in design, manufacturing and assembly. The background of this thesis is that companies perform verifications (tests, inspections and controls) of products late, when most of the parts have been assembled. This extends the lead-time to delivery and erodes the benefits of a modular product architecture, particularly when the verifications are extensive and the frequency of detected defects is high. Due to the number of product variants obtained from the modular product architecture, verifications must handle a wide range of equipment, instructions and goal values to ensure that high-quality products can be delivered. As a result, the total benefits of a modular product architecture are difficult to achieve. This thesis describes a method for planning and performing verifications within a modular product architecture. The method supports companies by utilizing the defined modules for verification already at module level, so-called Module Property Verification (MPV). With MPV, defects are detected earlier than with verification of a complete product, and the number of verifications is decreased. The MPV method is built up of three phases. In Phase A, candidate modules are evaluated on the basis of the costs and lead-time of the verifications and of the repair of defects. An MPV-index is obtained which quantifies the module and indicates whether the module should be verified at product level or by MPV. In Phase B, the interface interactions between the modules are evaluated, as well as the distribution of properties among the modules. The purpose is to evaluate the extent to which supplementary verifications at product level are needed. Phase C supports the selection of the final verification strategy. The cost and lead-time of the supplementary verifications are considered together with the results from Phases A and B. The MPV method is based on a set of qualitative and quantitative measures and tools which provide an overview and support the achievement of cost- and time-efficient, company-specific verifications. A practical application in industry shows how the MPV method can be used, and the benefits that follow.
Abstract:
Paper presented at the XXXV CNMAC, Natal-RN, Brazil, 2014.
Abstract:
Paper presented at the Congresso Nacional de Matemática Aplicada à Indústria, November 18-21, 2014, Caldas Novas, Goiás, Brazil.
Abstract:
Asthma is a significant health issue in the pediatric population, with noteworthy growth over the years. The challenge proposed for this PhD thesis was the development of advanced methodologies to establish metabolomic patterns in urine and exhaled breath associated with asthma, whose applicability was subsequently exploited to evaluate the disease state, therapy adherence and effect, and for diagnostic purposes. The volatile composition of exhaled breath was studied by combining headspace solid-phase microextraction (HS-SPME) with gas chromatography coupled to mass spectrometry, or with comprehensive two-dimensional gas chromatography coupled to mass spectrometry with a high-resolution time-of-flight analyzer (GC×GC-ToFMS). These methodologies allowed the identification of several hundred compounds from different chemical families. Multivariate analysis (MVA) led to the conclusion that the metabolomic profile of asthmatic individuals is characterized by higher levels of compounds associated with lipid peroxidation, possibly linked to oxidative stress and inflammation (alkanes and aldehydes), known to play an important role in asthma. For future applications in clinical settings, a set of nine compounds was defined, and its clinical applicability was proven in monitoring the disease status and in evaluating the effect of and/or adherence to therapy. The global volatile metabolome of urine was also explored using an HS-SPME/GC×GC-ToFMS method, and ca. 200 compounds were identified. A targeted analysis was performed on 78 compounds related to lipid peroxidation and consequently to oxidative stress levels and inflammation. The urinary non-volatile metabolomic pattern of asthma was established using proton nuclear magnetic resonance (1H NMR). This analysis allowed the identification of alterations in central metabolic pathways, including oxidative stress, amino acid and lipid metabolism, gut microflora alterations, alterations in the tricarboxylic acid (TCA) cycle, histidine metabolism, lactic acidosis, and modification of free tyrosine residues after eosinophil stimulation. The results obtained demonstrate the potential of analyzing the metabolomic profiles of exhaled breath and urine in asthma. Besides the successful development of analysis methodologies, it was possible to explore, through exhaled breath and urine, biochemical pathways affected by asthma, to observe complementarity between the matrices, and to verify the clinical applicability.
Abstract:
This manuscript describes the development and validation of an ultra-fast, efficient, high-throughput analytical method based on ultra-high performance liquid chromatography (UHPLC) equipped with a photodiode array (PDA) detection system, for the simultaneous analysis of fifteen bioactive metabolites in wines: gallic acid, protocatechuic acid, (−)-catechin, gentisic acid, (−)-epicatechin, syringic acid, p-coumaric acid, ferulic acid, m-coumaric acid, rutin, trans-resveratrol, myricetin, quercetin, cinnamic acid and kaempferol. A 50-mm column packed with 1.7-μm particles operating at elevated pressure (the UHPLC strategy) was selected to attain ultra-fast analysis and highly efficient separations. In order to reduce the complexity of the wine extract and improve the recovery efficiency, a reverse-phase solid-phase extraction (SPE) procedure was performed prior to UHPLC-PDA analysis, using as sorbent a new macroporous copolymer made from a balanced ratio of two monomers, the lipophilic divinylbenzene and the hydrophilic N-vinylpyrrolidone (Oasis™ HLB). The calibration curves of the bioactive metabolites showed good linearity within the established range. Limits of detection (LOD) and quantification (LOQ) ranged from 0.006 μg mL−1 to 0.58 μg mL−1 and from 0.019 μg mL−1 to 1.94 μg mL−1, for gallic and gentisic acids, respectively. The average recoveries ± SD for the three concentration levels tested (n = 9) in red and white wines were, respectively, 89 ± 3% and 90 ± 2%. The repeatability, expressed as relative standard deviation (RSD), was below 10% for all the metabolites assayed. The validated method was then applied to red and white wines from different geographical origins (the Azores, Canary and Madeira Islands). The most abundant component in the analysed red wines was (−)-epicatechin, followed by (−)-catechin and rutin, whereas in white wines syringic and p-coumaric acids were found to be the major phenolic metabolites. The method was fully validated, providing a sensitive analysis for the detection of bioactive phenolic metabolites and showing satisfactory data for all the parameters tested. Moreover, it proved to be an ultra-fast approach, allowing the separation of the fifteen bioactive metabolites investigated with high resolving power within 5 min.
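The LOD/LOQ figures quoted above are typically derived from the calibration curve. A common convention (an assumption here, not necessarily the authors' exact procedure) is LOD = 3.3 σ/S and LOQ = 10 σ/S, where σ is the residual standard deviation of the regression and S its slope; the calibration data below are invented for illustration:

```python
# Hedged sketch: deriving LOD and LOQ from an ordinary least-squares
# calibration curve using LOD = 3.3*sigma/S and LOQ = 10*sigma/S.

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def lod_loq(x, y):
    """LOD and LOQ from the residual standard deviation and the slope."""
    slope, intercept = linear_fit(x, y)
    resid = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
    # residual standard deviation with n - 2 degrees of freedom
    sigma = (sum(r * r for r in resid) / (len(x) - 2)) ** 0.5
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Invented calibration data: concentration (ug/mL) vs. peak area.
x = [0.1, 0.5, 1.0, 2.0, 4.0]
y = [10.2, 51.0, 99.5, 201.0, 399.0]
lod, loq = lod_loq(x, y)
print(lod, loq)
```

By construction the LOQ is always 10/3.3 times the LOD, which matches the pattern of paired LOD/LOQ values reported per analyte.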
Abstract:
DNA-based studies have been one of the major interests in the conservation biology of endangered species and in population genetics. As species and population genetic assessment requires a source of biological material, destructive sampling can be avoided by non-destructive procedures for DNA isolation. An improved method is described for obtaining DNA from fish fins and scales using an extraction buffer containing urea, followed by DNA purification with phenol-chloroform. The methodology combines the benefits of non-destructive DNA sampling with high efficiency. In addition, comparisons with other methodologies for isolating DNA from fish demonstrated that the present procedure is also a very attractive alternative for obtaining large amounts of high-quality DNA for use in different molecular analyses. The DNA samples, isolated from different fish species, have been successfully used in random amplified polymorphic DNA (RAPD) experiments, as well as in the amplification of specific ribosomal and mitochondrial DNA sequences. The present DNA extraction procedure represents an alternative for population-level approaches and genetic studies of rare or endangered taxa.
Abstract:
We point out that the determination of the MNS matrix element |U_e3| = s_13 in long-baseline ν_μ → ν_e neutrino oscillation experiments suffers from large intrinsic uncertainty due to the unknown CP-violating phase δ and the sign of Δm²_13. We propose a new strategy for the accurate determination of θ_13: tune the beam energy to the oscillation maximum and perform the measurement in both the neutrino and antineutrino channels. We show that this automatically resolves the parameter ambiguities involving δ, θ_13, and the sign of Δm²_13. (C) 2002 Elsevier B.V. All rights reserved.
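The logic can be sketched with the standard approximate appearance probability (a schematic form; conventions and the solar-sector prefactor, written J_r here, vary between papers, so this is illustrative rather than the paper's exact expression):

```latex
P(\nu_\mu \to \nu_e) \;\simeq\; \sin^2\theta_{23}\,\sin^2 2\theta_{13}\,\sin^2\Delta
\;+\; J_r \sin\Delta \left( \cos\delta \cos\Delta \mp \sin\delta \sin\Delta \right)
\;+\; \cdots, \qquad \Delta \equiv \frac{\Delta m^2_{13} L}{4E},
```

with the upper (lower) sign for neutrinos (antineutrinos). At the oscillation maximum Δ = π/2, so cos Δ = 0: the cos δ interference term vanishes and the δ-dependence reduces to ∓ J_r sin δ, which flips sign between the ν and ν̄ channels. Measuring both channels at this energy therefore separates sin² 2θ_13 from δ, which is the mechanism behind the ambiguity resolution claimed above.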
Abstract:
In this work, the occurrence of chaos (a homoclinic scenario) is verified in a robotic system with two degrees of freedom by using the Poincaré-Mel'nikov method. The problem studied was based on experimental results for a two-joint planar manipulator (first joint actuated, second joint free) that resides in a horizontal plane. This is the simplest model of nonholonomic free-joint manipulators. The purpose of the present study is to verify those results analytically and to suggest a control strategy.
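For reference, the Poincaré-Mel'nikov method applies to a periodically perturbed planar system q̇ = f(q) + ε g(q, t) whose unperturbed part has a homoclinic orbit q₀(t); this is the textbook form of the criterion, not the manipulator-specific computation of the paper:

```latex
M(t_0) \;=\; \int_{-\infty}^{\infty} f\bigl(q_0(t)\bigr) \wedge g\bigl(q_0(t),\, t + t_0\bigr)\, dt,
\qquad u \wedge v \equiv u_1 v_2 - u_2 v_1 .
```

If M(t₀) has simple zeros, the stable and unstable manifolds of the perturbed system intersect transversally, implying horseshoe dynamics and hence chaos for sufficiently small ε.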
Abstract:
The proliferation of Aedes aegypti is favored by the habit of allowing breeding sites to form in various types of containers. One way to control the vector is to disseminate knowledge about it, since this leads to awareness and to the adoption of measures against its proliferation. To evaluate a teaching method about the vector and dengue, 5th- and 6th-grade students were compared before and after a didactic intervention. The students who received the intervention were better able to recognize the phases of the life cycle and showed a greater understanding of the importance of mosquitoes for health. They were also better able to recognize which control measures are most efficient and feasible, with repercussions in their homes, which presented half as many breeding sites as those of students who did not receive the intervention.
Abstract:
Alternative sampling procedures are compared to the pure random search method. It is shown that the efficiency of the algorithm can be improved with respect to the expected number of steps to reach an epsilon-neighborhood of the optimal point.
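Pure random search, the baseline above, can be sketched as a hitting-time experiment (an illustrative sketch; the paper's alternative sampling procedures are not reproduced): draw uniform points until one lands in the ε-neighborhood of the optimum and count the steps.

```python
import random

def hitting_time(bounds, x_star, eps, rng, max_steps=100000):
    """Pure random search: draw uniform points over `bounds` and count the
    steps until one lands in the eps-neighborhood (sup norm) of x_star."""
    for step in range(1, max_steps + 1):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        if max(abs(xi - si) for xi, si in zip(x, x_star)) <= eps:
            return step
    return max_steps

# One-dimensional illustration: optimum at 0.5 on [0, 1] with eps = 0.05,
# so each draw hits the neighborhood with probability 0.1 and the expected
# number of steps is about 1/0.1 = 10.
rng = random.Random(0)
trials = [hitting_time([(0.0, 1.0)], [0.5], 0.05, rng) for _ in range(2000)]
mean_steps = sum(trials) / len(trials)
print(round(mean_steps, 2))
```

Any improved sampling procedure of the kind compared in the abstract must beat this geometric-distribution baseline in expected steps.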
Abstract:
The identification of ground control in photographs or images is usually carried out by a human operator, who relies on natural interpretation skills. In Digital Photogrammetry, which uses digital image processing techniques, the extraction of ground control can be automated by an approach based on relational matching and a heuristic that uses the analytical relation between straight-line features in object space and their homologues in image space. A built-in self-diagnosis is also used in this method. It is based on the application of the data snooping statistical test in the spatial resection process, using Iterated Extended Kalman Filtering (IEKF). The aim of this paper is to present the basic principles of the proposed approach, together with results based on real data.
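The data snooping element can be illustrated in a simplified, generic form: screen standardized residuals against a critical value of the standard normal distribution, as in Baarda's test. This sketch scales each residual by a common a priori standard deviation and omits the redundancy numbers and the IEKF machinery of the actual method:

```python
def data_snooping(residuals, sigma0, k=3.29):
    """Flag observations whose standardized residual |v_i| / sigma0 exceeds
    the critical value k (3.29 corresponds to a significance level of 0.001
    for a standard normal test statistic). Simplified illustration only."""
    return [i for i, v in enumerate(residuals)
            if abs(v / sigma0) > k]

# Illustrative use: image residuals in pixels from a spatial resection,
# with an a priori standard deviation of 0.5 pixel; index 2 is a gross error.
flagged = data_snooping([0.3, -0.4, 2.5, 0.1], sigma0=0.5)
print(flagged)
```

In the full procedure, flagged observations are removed (or down-weighted) and the resection is re-estimated, which is what makes the self-diagnosis loop possible.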
Abstract:
To enhance the global search ability of population-based incremental learning (PBIL) methods, it is proposed that multiple probability vectors be included in existing PBIL algorithms. The strategy for updating those probability vectors, as well as the negative learning and mutation operators, are re-defined correspondingly. Moreover, to strike the best tradeoff between exploration and exploitation, an adaptive updating strategy for the learning rate is designed. Numerical examples are reported to demonstrate the pros and cons of the newly implemented algorithm.
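For context, a minimal single-vector PBIL on a OneMax-style objective looks like the following (a baseline sketch only; the multiple-probability-vector scheme, negative learning, and adaptive learning rate proposed above are not reproduced, and all parameter values are illustrative):

```python
import random

def pbil_onemax(n_bits=20, pop_size=30, lr=0.1, mut_p=0.02, mut_shift=0.05,
                generations=200, seed=0):
    """Basic single-vector PBIL maximizing the number of 1-bits (OneMax)."""
    rng = random.Random(seed)
    p = [0.5] * n_bits                 # the probability vector
    best = None
    for _ in range(generations):
        # sample a population from the probability vector
        pop = [[1 if rng.random() < pi else 0 for pi in p]
               for _ in range(pop_size)]
        pop.sort(key=sum, reverse=True)
        elite = pop[0]
        if best is None or sum(elite) > sum(best):
            best = elite
        # move the probability vector toward the best sample
        p = [(1 - lr) * pi + lr * bi for pi, bi in zip(p, elite)]
        # occasional mutation keeps diversity in the probability vector
        for i in range(n_bits):
            if rng.random() < mut_p:
                direction = 1.0 if rng.random() < 0.5 else 0.0
                p[i] = (1 - mut_shift) * p[i] + mut_shift * direction
    return best

best = pbil_onemax()
print(sum(best))
```

The single vector's tendency to converge prematurely on one region of the search space is exactly the weakness that maintaining multiple probability vectors, as proposed above, is meant to address.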