974 results for Control-flow Analysis


Relevance:

90.00%

Publisher:

Abstract:

The uncertainty of any analytical determination depends on both the analysis and the sampling. Uncertainty arising from sampling is usually not controlled, and methods for its evaluation are still little known. Pierre Gy’s sampling theory is currently the most complete theory of sampling, and it also takes the design of the sampling equipment into account. Guides dealing with the practical issues of sampling also exist, published by international organizations such as EURACHEM, IUPAC (International Union of Pure and Applied Chemistry) and ISO (International Organization for Standardization). In this work, Gy’s sampling theory was applied to several cases, including the analysis of chromite concentration estimated from SEM (Scanning Electron Microscope) images and the estimation of the total uncertainty of a drug dissolution procedure. The results clearly show that Gy’s sampling theory can be utilized in both of the above-mentioned cases and that the uncertainties achieved are reliable. Variographic experiments, introduced in Gy’s sampling theory, are well suited to analysing the uncertainty of auto-correlated data sets such as industrial process data and environmental discharges. The periodic behaviour of such processes can be observed by variographic analysis as well as by fast Fourier transformation and auto-correlation functions. With variographic analysis, the uncertainties are estimated as a function of the sampling interval, which is advantageous when environmental or process data are analysed, because it shows directly how the sampling interval affects the overall uncertainty. If the sampling frequency is too high, unnecessary resources are used; if it is too low, the uncertainty of the determination may be unacceptably high. Variographic methods can also be utilized to estimate the uncertainty of spectral data produced by modern instruments. Since spectral data are multivariate, methods such as Principal Component Analysis (PCA) are needed when the data are analysed. Optimization of a sampling plan increases the reliability of the analytical process, which may ultimately benefit the economics of chemical analysis.
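
As a rough illustration of the variographic experiments mentioned above, the sketch below computes an experimental variogram for an equally spaced, auto-correlated process series and reads off how the uncertainty depends on the sampling interval. The series, its parameters and the chosen lags are invented for illustration only and are not taken from the work described.

```python
import numpy as np

def experimental_variogram(x, max_lag):
    """Experimental variogram V(j) = sum((x[i+j] - x[i])^2) / (2 * (N - j))
    for lags j = 1..max_lag, as used in variographic analysis."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    lags = np.arange(1, max_lag + 1)
    v = np.array([np.sum((x[j:] - x[:-j]) ** 2) / (2.0 * (n - j)) for j in lags])
    return lags, v

# Hypothetical auto-correlated process data sampled at a constant interval:
# a slow periodic drift plus random measurement noise.
rng = np.random.default_rng(1)
t = np.arange(500)
series = 10.0 + 0.5 * np.sin(2 * np.pi * t / 50) + rng.normal(0.0, 0.2, t.size)

lags, v = experimental_variogram(series, max_lag=60)
rel_u = np.sqrt(v) / series.mean()   # relative uncertainty of a single increment
for j in (1, 10, 25, 50):
    print(f"sampling interval {j:2d}: V(j) = {v[j - 1]:.4f}, relative uncertainty = {rel_u[j - 1]:.2%}")
```

For auto-correlated data V(j) typically grows with the lag, which is exactly the trade-off between sampling frequency and overall uncertainty discussed in the abstract.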

Relevance:

90.00%

Publisher:

Abstract:

A device comprising a lab-made chamber with mechanical stirring and computer-controlled solenoid valves is proposed for the mechanization of liquid-liquid extractions. The performance was demonstrated by the extraction of ethanol from biodiesel as a model for the extraction of analytes from immiscible organic samples into an aqueous medium. The volumes of the sample and extractant were precisely defined by the flow rates and switching times of the valves, while mechanical stirring increased the interaction between the phases. Stirring was stopped for phase separation, and precise time control also ensured successful phase separation (i.e., the absence of the organic phase in the aqueous extract). In the model system, a linear relationship between the analytical response and the number of extractions was observed, indicating the potential for analyte preconcentration in the extract. The efficiency and reproducibility of the extractions were demonstrated by recoveries of ethanol spiked into biodiesel samples between 96% and 100%, with coefficients of variation lower than 3.0%.
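
A minimal sketch of the volume-metering principle described above, in which dispensed volumes are fixed by the pump flow rate and the valve switching time; the flow rates and target volumes below are hypothetical and are not taken from the reported device.

```python
def switching_time_s(target_volume_ml: float, flow_rate_ml_min: float) -> float:
    """Time a solenoid valve must stay open to deliver the target volume
    at a constant pump flow rate (volume = flow rate x time)."""
    return 60.0 * target_volume_ml / flow_rate_ml_min

# Hypothetical settings for one extraction cycle
sample_t = switching_time_s(target_volume_ml=2.0, flow_rate_ml_min=4.0)      # organic sample
extractant_t = switching_time_s(target_volume_ml=1.0, flow_rate_ml_min=2.0)  # aqueous extractant
print(f"open sample valve for {sample_t:.0f} s, extractant valve for {extractant_t:.0f} s")
```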

Relevance:

90.00%

Publisher:

Abstract:

Evaluating chemical treatments in the field to control post-harvest fruit anthracnose (Colletotrichum gloeosporioides) requires a suitable assessment of disease incidence on harvested papaya (Carica papaya) fruits. The minimum number of papaya fruit harvests required for a valid treatment comparison in field trials of anthracnose chemical control was determined. Repeatability analysis was carried out using previously published data. The coefficient of determination (R²), estimated by four methods and based on the means of 12 assessment times, ranged from 92.58% to 94.45%. The number of assessment times required for R² = 90% varied from seven to nine. The R² values of 85.1% to 91.3% estimated by ANOVA suggested that any seven successive assessment times were sufficient for treatment comparison.
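
The sketch below illustrates the kind of arithmetic behind such a repeatability analysis: the reliability (R²) of the mean of n assessments and, conversely, the number of assessments needed to reach a target R², given a single-assessment repeatability coefficient r. The relations used are the usual Spearman-Brown-type formulas, not necessarily the four estimation methods of the study, and the value of r is invented for illustration.

```python
import math

def r2_of_mean(r: float, n: int) -> float:
    """Coefficient of determination of the mean of n repeated assessments."""
    return n * r / (1.0 + (n - 1) * r)

def n_for_target_r2(r: float, target_r2: float) -> int:
    """Minimum number of assessments for the mean to reach target_r2."""
    return math.ceil(target_r2 * (1.0 - r) / (r * (1.0 - target_r2)))

r = 0.55  # hypothetical single-assessment repeatability coefficient
print(n_for_target_r2(r, 0.90))        # assessments needed for R² = 90% (here 8)
print(round(r2_of_mean(r, 12), 4))     # R² of the mean of 12 assessments (here ~0.94)
```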

Relevance:

90.00%

Publisher:

Abstract:

Over the past decades, testing has matured from an ad-hoc activity into an integral part of the development process. The benefits of testing are obvious for modern communication systems, which operate in heterogeneous environments amongst devices from various manufacturers. The increased demand for testing also creates demand for tools and technologies that support and automate testing activities. This thesis discusses the applicability of visualization techniques in the result analysis part of the testing process. In particular, the primary focus of this work is the visualization of test execution logs produced by a TTCN-3 test system. TTCN-3 is an internationally standardized test specification and implementation language. The TTCN-3 standard suite includes specifications of a test logging interface and a graphical presentation format, but defines no direct mapping between them. This thesis presents a technique for mapping the log events to the graphical presentation format, along with a concrete implementation integrated with the Eclipse Platform and the OpenTTCN Tester toolchain. The results indicate that for the majority of the log events a visual representation can be derived from the TTCN-3 standard suite. The remaining events were analysed, and three categories relevant either to log analysis or to the implementation of the visualization tool were identified: events indicating the insertion of an item into the incoming queue of a port, events indicating a mismatch, and events describing the control flow during execution. The applicability of the results is limited to the domain of TTCN-3, but the developed mapping and the implementation may be used with any TTCN-3 tool that is able to produce the execution log in the standardized XML format.
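
As a rough sketch of the log-analysis side of this work, the snippet below groups events from an XML execution log into the three categories identified above. The XML element names are hypothetical placeholders; the standardized TTCN-3 log format defines its own element names, which a real implementation would use instead.

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

# Hypothetical event tag names, one set per category of interest.
QUEUE_EVENTS = {"EnqueueMessage", "EnqueueCall", "EnqueueReply"}
MISMATCH_EVENTS = {"ReceiveMismatch", "GetCallMismatch"}
CONTROL_FLOW_EVENTS = {"TestcaseStarted", "AltEntered", "FunctionCalled"}

def classify(log_file: str) -> dict:
    """Group log events by category; unrecognised tags are kept under 'other'."""
    buckets = defaultdict(list)
    for _, elem in ET.iterparse(log_file, events=("end",)):
        tag = elem.tag
        if tag in QUEUE_EVENTS:
            buckets["queue"].append(elem)
        elif tag in MISMATCH_EVENTS:
            buckets["mismatch"].append(elem)
        elif tag in CONTROL_FLOW_EVENTS:
            buckets["control_flow"].append(elem)
        else:
            buckets["other"].append(elem)
    return buckets
```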

Relevance:

90.00%

Publisher:

Abstract:

Citrus fruits are affected by the black spot disease caused by the fungus Guignardia citricarpa. Chitosan can be used as a coating for fruits and may delay the ripening process and inhibit the growth of some fungi. Thus, the control of citrus black spot using chitosan and the fungicides thiabendazole and imazalil was assessed, in addition to the physicochemical quality of 'Pêra Rio' oranges. The oranges were immersed in chitosan, thiabendazole or imazalil, or in chitosan mixed with both fungicides. The fruits were then stored at 25 °C and 80% RH for 7 days and, after this storage period, subjected to physicochemical analyses. Chitosan in association with the fungicides reduced black spot in 'Pêra Rio' oranges and delayed the change in skin colour from green to yellow during postharvest storage. Total soluble solids, titratable acidity, pH, ascorbic acid content and ratio were not influenced by the treatments. Thus, chitosan applied with the fungicides thiabendazole and imazalil showed potential to control the development of black spot lesions on 'Pêra Rio' oranges during the postharvest period.

Relevance:

90.00%

Publisher:

Abstract:

The shift towards a knowledge-based economy has inevitably prompted the evolution of patent exploitation. Nowadays, a patent is more than just a prevention tool for a company to block its competitors from developing rival technologies; it lies at the very heart of the company's strategy for value creation and is therefore strategically exploited for economic profit and competitive advantage. Along with the evolution of patent exploitation, the demand for reliable and systematic patent valuation has also reached an unprecedented level. However, most of the quantitative approaches in use for patent assessment arguably fall into four categories, all of which are based solely on conventional discounted cash flow analysis, whose usability and reliability in the context of patent valuation are greatly limited by five practical issues: market illiquidity, poor data availability, discriminatory cash-flow estimations, and its inability to account for changing risk and managerial flexibility. This dissertation attempts to overcome these barriers by rationalizing the use of two techniques, namely fuzzy set theory (aimed at the first three issues) and real option analysis (aimed at the last two). It commences with an investigation into the nature of the uncertainties inherent in patent cash flow estimation and claims that two levels of uncertainty must be properly accounted for. Further investigation reveals that both levels of uncertainty fall under the category of subjective uncertainty, which differs from objective uncertainty originating from inherent randomness in that subjective uncertainties are closely related to the behavioural aspects of decision making and usually arise whenever human judgement, evaluation or reasoning is crucial to the system under consideration and complete knowledge of its variables is lacking. Having clarified their nature, the application of fuzzy set theory to modelling patent-related uncertain quantities is readily justified. The application of real option analysis to patent valuation is prompted by the fact that both the patent application process and the subsequent patent exploitation (or commercialization) are subject to a wide range of decisions at multiple successive stages. In other words, both patent applicants and patentees are faced with a large variety of courses of action as to how their patent applications and granted patents can be managed. Since they have the right to manage their projects actively, this flexibility has value and must be properly accounted for. Accordingly, this dissertation explicitly identifies the types of managerial flexibility inherent in patent-related decision-making problems and in patent valuation, and discusses how they can be interpreted in terms of real options. Additionally, the use of the proposed techniques in practical applications is demonstrated by three models based on fuzzy real option analysis. In particular, the pay-off method and the extended fuzzy Black-Scholes model are employed, respectively, to investigate the profitability of a patent application project for a new process for the preparation of a gypsum-fibre composite and to justify the subsequent patent commercialization decision; a fuzzy binomial model is designed to reveal the economic potential of a patent licensing opportunity.
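
A minimal numerical sketch of one of the techniques named above, the pay-off method: the real option value is taken as the share of the fuzzy NPV lying above zero, multiplied by the mean of that positive part (here a simple membership-weighted mean stands in for the possibilistic mean used in the formal method). The triangular NPV scenarios are invented for illustration and do not come from the dissertation's case studies.

```python
import numpy as np

def payoff_method_rov(pessimistic, likely, optimistic, n=100_000):
    """Real option value of a triangular fuzzy NPV given by pessimistic,
    most-likely and optimistic scenarios, via a discretized pay-off distribution."""
    x = np.linspace(pessimistic, optimistic, n)
    dx = x[1] - x[0]
    # triangular membership function of the fuzzy NPV
    mu = np.where(x <= likely,
                  (x - pessimistic) / (likely - pessimistic),
                  (optimistic - x) / (optimistic - likely))
    total_area = mu.sum() * dx
    pos = x > 0.0
    positive_area = mu[pos].sum() * dx
    if positive_area == 0.0:
        return 0.0
    positive_mean = (mu[pos] * x[pos]).sum() * dx / positive_area
    return (positive_area / total_area) * positive_mean

# Hypothetical NPV scenarios (in millions): -2 pessimistic, 1.5 likely, 6 optimistic
print(payoff_method_rov(pessimistic=-2.0, likely=1.5, optimistic=6.0))
```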

Relevance:

90.00%

Publisher:

Abstract:

The World Organisation for Animal Health (OIE) is the international institution responsible for establishing the sanitary measures associated with trade in live animals. Zoning is a control method recommended by the OIE for certain infectious diseases, including avian influenza. Avian influenza outbreaks have been extremely costly for the poultry industry throughout the world. To evaluate the possibility of using this approach in Ontario, data on poultry production sites were provided by the poultry producer federations of that province. Information on the industries associated with poultry production, namely feed mills, slaughterhouses, hatcheries and egg-grading stations, was obtained from several sources, including poultry industry representatives. Flow diagrams were created to understand the interactions between the production sites and their associated industries. These industries constituted the basic elements required for zoning. This analysis made it possible to build a database of production inputs and outputs for each poultry production site, as well as for the production sites of the industries associated with poultry farming. Using the ArcGIS software, this information was merged with Statistics Canada geospatial data for Ontario and Quebec. The resulting database made it possible to carry out the zoning trials. Seventy-two trials were performed. Four were retained because they minimized the industry's production losses to a similar extent. These trials show that the method used in the zoning study can reveal the production deficits and surpluses of the commercial poultry industry in Ontario. They can serve as a starting point for discussions among poultry industry stakeholders, given that cooperation and communication are essential to the success of zoning.

Relevance:

90.00%

Publisher:

Abstract:

Embedded systems are usually designed for a single or a specified set of tasks. This specificity means the system design as well as its hardware/software development can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools, running on a host machine, for the validation and optimization of embedded system code, which can help meet all of these goals. Such analysis can significantly improve software quality and is still a challenging field. This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making it more effective at the early detection of software bugs that are otherwise hard to detect, using static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae and their compliance is tested individually in all possible execution paths of the application programs.
An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimum allocation of data to banked memory, resulting in a minimum number of bank-switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active memory bank state transitions corresponding to each bank-selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, and applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces state space creation and contributes to improved model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study was conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards the correct use of difficult microcontroller features when developing embedded systems.
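
The following sketch illustrates the redundant bank-switching idea on a single execution path: the active memory bank is tracked instruction by instruction, and any bank-select instruction that re-selects the bank already active is flagged. The instruction tuples are a simplified, hypothetical representation; the actual tool works on decoded PIC16F87X machine code and on all execution paths of the control flow graph.

```python
def find_redundant_bank_selects(instructions):
    """Return indices of bank-select instructions that do not change the
    active bank along this (single, straight-line) execution path."""
    active_bank = 0          # bank assumed selected on entry
    redundant = []
    for idx, (op, operand) in enumerate(instructions):
        if op == "BANKSEL":
            if operand == active_bank:
                redundant.append(idx)   # re-selects the bank already active
            active_bank = operand
    return redundant

# Hypothetical straight-line path with one redundant bank switch.
path = [
    ("BANKSEL", 1), ("MOVWF", "TRISB"),
    ("BANKSEL", 1),                     # redundant: bank 1 is already active
    ("MOVWF", "OPTION_REG"),
    ("BANKSEL", 0), ("MOVWF", "PORTB"),
]
print(find_redundant_bank_selects(path))   # -> [2]
```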

Relevance:

90.00%

Publisher:

Abstract:

It is widely known that a significant fraction of the bits in a program are useless or even unused during execution. Bit-width analysis aims at finding the minimum number of bits needed for each variable in the program, which ensures correct execution while saving resources. In this paper, we propose a static analysis method for bit-widths in general applications, which makes conservative approximations at compile time and is independent of runtime conditions. While most related work focuses on integer applications, our method is also tailored to and applicable to floating-point variables, and could be extended, together with precision analysis, to transform floating-point numbers into fixed-point numbers. We use more precise representations for the value ranges of both scalar and array variables, and element-level analysis is carried out for arrays. We also suggest an alternative to the standard fixed-point iteration in bi-directional range analysis. These techniques are implemented on the Trimaran compiler infrastructure and evaluated on a set of benchmarks.
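
A minimal sketch of the core calculation in bit-width analysis: given a conservatively estimated value range for an integer variable, compute the minimum number of bits needed to represent every value in that range. The ranges below are illustrative; the paper's analysis derives them statically and also handles floating-point variables.

```python
def min_bits(lo: int, hi: int) -> int:
    """Minimum number of bits to represent every integer in [lo, hi]
    (unsigned if lo >= 0, otherwise two's complement)."""
    if lo >= 0:
        return max(hi.bit_length(), 1)
    # signed: enough bits for both the most negative and most positive value
    return max((-lo - 1).bit_length(), hi.bit_length()) + 1

print(min_bits(0, 100))     # 7 bits, unsigned
print(min_bits(-128, 127))  # 8 bits, two's complement
print(min_bits(-1, 1))      # 2 bits, two's complement
```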

Relevance:

90.00%

Publisher:

Abstract:

Multinationals are known for their durability over time, and all of them have operating processes that make it possible to achieve their business objectives. Boehringer Ingelheim is a company with more than 128 years in the pharmaceutical business, characterized by its family heritage. It is one of the few companies without shareholders and is a success story in the industry. In Colombia, the company faces other large laboratories such as Pfizer, Novartis, Bayer, Roche and Tecnoquimicas, among others, and must therefore remain at the forefront in medicines, technology and processes. This work proposes the improvement of one of the marketing department's processes, which also involves the sales and accounting areas, seeking to reduce operating time so that it can be devoted to higher-value activities that benefit the company.

Relevance:

90.00%

Publisher:

Abstract:

Quantified real constraints (QRC) constitute a mathematical formalism used to model a large number of physical problems involving systems of non-linear equations over real variables, some of which may be quantified. QRCs appear in numerous contexts, such as control engineering and biology. Solving QRCs is a very active research field in which two different approaches have been proposed: symbolic quantifier elimination and approximate methods. Nevertheless, solving large-scale problems and the general case remain open problems. This thesis proposes a new approximate methodology based on Modal Interval Analysis, a mathematical theory that makes it possible to solve problems involving logical quantifiers over real variables. Finally, two applications to control engineering are presented: the first addresses the fault detection problem, and the second consists of a controller for a sailing boat.
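
As a rough illustration of how interval techniques address quantified constraints, the sketch below uses plain (classical) interval arithmetic, not Modal Interval Analysis itself, to give a conservative answer to a constraint of the form "for all x in X: f(x) ≤ 0". The constraint and the box are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        products = (self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi)
        return Interval(min(products), max(products))

def holds_for_all(f, box: Interval) -> bool:
    """True if the interval enclosure of f over `box` proves f(x) <= 0
    for every x in the box (a sufficient, not a necessary, condition)."""
    return f(box).hi <= 0.0

# Hypothetical constraint: f(x) = x*x + x - 3 <= 0 for all x in [-1, 1]
f = lambda x: x * x + x + Interval(-3.0, -3.0)
print(holds_for_all(f, Interval(-1.0, 1.0)))   # True: the enclosure's upper bound is <= 0
```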

Relevance:

90.00%

Publisher:

Abstract:

Successful coupling of electrochemical preconcentration (EPC) to capillary electrophoresis (CE) with contactless conductivity detection (C⁴D) is reported for the first time. The EPC-CE interface comprises a dual glassy carbon electrode (GCE) block, a spacer and an upper block with flow inlet and outlet, a pseudo-reference electrode and a fitting for the CE silica column, consisting of an orifice perpendicular to the surface of a glassy carbon electrode with a bushing inside to ensure a tight press fit. The end of the capillary in contact with the GCE is slant polished, thus defining a reproducible distance from the electrode surface to the column bore. First results with EPC-CE-C⁴D are very promising, as revealed by enrichment factors of two orders of magnitude for Tl, Cu, Pb and Cd ion peak area signals. Detection limits for a 10 min deposition time fall around 20 nmol L⁻¹, with linear calibration curves over a wide range. Besides preconcentration, easy matrix exchange between accumulation and stripping/injection favors procedures like sample cleanup and optimization of pH, ionic strength and complexing power. This was demonstrated for highly saline samples by using a low-conductivity buffer for stripping/injection to improve separation and promote field-enhanced sample stacking during electromigration along the capillary.

Relevance:

90.00%

Publisher:

Abstract:

The physical and chemical characteristics of peat were assessed through measurement of pH, percentage of organic matter, cation exchange capacity (CEC), elemental analysis, infrared spectroscopy and quantitative analysis of metals by ICP OES. Although the material proved to be very acidic in view of its percentage of organic matter, its CEC was significant, showing potential for the retention of metal ions. This characteristic was exploited by coupling a peat mini-column to a flow system based on the multicommutation approach for in-line copper preconcentration prior to flame atomic absorption spectrometric determination. Cu(II) ions were adsorbed at pH 4.5 and eluted with 0.50 mol L⁻¹ HNO₃. The influence of chemical and hydrodynamic parameters, such as sample pH, buffer concentration, eluent type and concentration, sample flow rate and preconcentration time, was investigated. Under the optimized conditions, a linear response was observed between 16 and 100 µg L⁻¹, with a detection limit estimated as 3 µg L⁻¹ at the 99.7% confidence level and an enrichment factor of 16. The relative standard deviation was estimated as 3.3% (n = 20). The mini-column was used for at least 100 sampling cycles without significant variation in the analytical response. Recoveries of copper spiked into lake water or groundwater, as well as concentrates used in hemodialysis, were in the 97.3-111% range. The results obtained for copper determination in these samples agreed with those achieved by graphite furnace atomic absorption spectrometry (GFAAS) at the 95% confidence level.
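
A minimal sketch of two figures of merit quoted above, assuming a linear calibration: the detection limit at the 99.7% confidence level (three times the blank standard deviation divided by the calibration slope) and the enrichment factor taken as the ratio of calibration sensitivities with and without preconcentration. All numbers below are invented for illustration.

```python
import numpy as np

def detection_limit(blank_signals, slope):
    """LOD = 3 * s_blank / slope (99.7% confidence level)."""
    return 3.0 * np.std(blank_signals, ddof=1) / slope

def enrichment_factor(slope_preconc, slope_direct):
    """Ratio of calibration sensitivities with and without preconcentration."""
    return slope_preconc / slope_direct

blanks = [0.0021, 0.0018, 0.0024, 0.0019, 0.0022, 0.0020]  # hypothetical blank absorbances
slope_preconc = 0.0023   # absorbance per (ug/L), after preconcentration
slope_direct = 0.00014   # absorbance per (ug/L), direct aspiration

print(f"LOD = {detection_limit(blanks, slope_preconc):.2g} ug/L")
print(f"Enrichment factor = {enrichment_factor(slope_preconc, slope_direct):.0f}")
```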

Relevance:

90.00%

Publisher:

Abstract:

A flow system designed with solenoid valves is proposed for the determination of weak acid dissociable cyanide, based on the reaction with o-phthalaldehyde (OPA) and glycine yielding a highly fluorescent isoindole derivative. The proposed procedure minimizes the main drawbacks of the reference batch procedure, which is based on the reaction with barbituric acid and pyridine followed by spectrophotometric detection, i.e., use of toxic reagents, high reagent consumption and waste generation, low sampling rate, and poor sensitivity. Retention of the sample zone was exploited to increase the conversion rate of the analyte with minimized sample dispersion. A linear response (r = 0.999) was observed for cyanide concentrations in the range 1-200 µg L⁻¹, with a detection limit (99.7% confidence level) of 0.5 µg L⁻¹ (19 nmol L⁻¹). The sampling rate and coefficient of variation (n = 10) were estimated as 22 measurements per hour and 1.4%, respectively. The results of the determination of weak acid dissociable cyanide in natural water samples agreed with those achieved by the batch reference procedure at the 95% confidence level. In addition to the improvement in analytical features compared with those of the flow system with continuous reagent addition (sensitivity and sampling rate 90% and 83% higher, respectively), the consumption of OPA was 230-fold lower.

Relevance:

90.00%

Publisher:

Abstract:

A flow system exploiting the multicommutation approach is proposed for the spectrophotometric determination of tannin in beverages. The procedure is based on the reduction of Cu(II) in the presence of 4,4′-dicarboxy-2,2′-biquinoline, yielding a complex with maximum absorption at 558 nm. The calibration graph was linear (r = 0.999) for tannic acid concentrations up to 5.00 µmol L⁻¹. The detection limit and coefficient of variation were estimated as 10 nmol L⁻¹ (99.7% confidence level) and 1% (1.78 µmol L⁻¹ tannic acid, n = 10), respectively. The sampling rate was 50 determinations per hour. The proposed procedure is more sensitive and selective than the official Folin-Denis method, while also drastically minimizing waste generation. Recoveries between 91.8% and 115% were obtained for total tannin determination in tea and wine samples.