970 results for MULTICOMMUTED FLOW ANALYSIS
Abstract:
The reduction of fluvastatin (FLV) at a hanging mercury-drop electrode (HMDE) was studied by square-wave adsorptive-stripping voltammetry (SWAdSV). FLV can be accumulated and reduced at the electrode, with a maximum peak current intensity at a potential of approximately -1.26 V vs. Ag/AgCl, in an aqueous electrolyte solution of pH 5.25. The method shows linearity between peak current intensity and FLV concentration between 1.0 × 10⁻⁸ and 2.7 × 10⁻⁶ mol L⁻¹. The limits of detection (LOD) and quantification (LOQ) were found to be 9.9 × 10⁻⁹ mol L⁻¹ and 3.3 × 10⁻⁸ mol L⁻¹, respectively. Furthermore, FLV oxidation at a glassy carbon electrode surface was used for its hydrodynamic monitoring by amperometric detection in a flow-injection system. The amperometric signal was linear with FLV concentration over the range 1.0 × 10⁻⁶ to 1.0 × 10⁻⁵ mol L⁻¹, with an LOD of 2.4 × 10⁻⁷ mol L⁻¹ and an LOQ of 8.0 × 10⁻⁷ mol L⁻¹. A sample rate of 50 injections per hour was achieved. Both methods were validated, shown to be precise and accurate, and satisfactorily applied to the determination of FLV in a commercial pharmaceutical product.
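Not part of the abstract, but as a brief illustration of how such figures of merit are commonly derived: LOD and LOQ are typically taken as 3 and 10 times the blank standard deviation divided by the calibration slope. The sketch below uses hypothetical calibration values, not the paper's data.

```python
# Minimal sketch (hypothetical values, not the paper's data): estimating LOD/LOQ
# from a linear calibration using the common 3*sigma/slope and 10*sigma/slope criteria.
import numpy as np

# Hypothetical calibration: FLV concentration (mol L-1) vs. peak current (A)
conc = np.array([1.0e-8, 5.0e-8, 1.0e-7, 5.0e-7, 1.0e-6, 2.7e-6])
current = np.array([0.8e-9, 4.1e-9, 8.3e-9, 41.0e-9, 82.0e-9, 221.0e-9])

slope, intercept = np.polyfit(conc, current, 1)   # least-squares calibration line
sigma_blank = 0.27e-9                             # hypothetical blank standard deviation (A)

lod = 3 * sigma_blank / slope     # limit of detection, mol L-1
loq = 10 * sigma_blank / slope    # limit of quantification, mol L-1
print(f"LOD = {lod:.2e} mol L-1, LOQ = {loq:.2e} mol L-1")
```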
Abstract:
Dissertation for obtaining the Master's degree in Informatics Engineering (Engenharia Informática)
Abstract:
Information systems are widespread and are used by individuals with computing devices as well as by corporations and governments. Security leaks are often introduced during the development of an application. The reasons for these security bugs are multiple, but among them is the fact that it is very hard to define and enforce relevant security policies in modern software. This is because modern applications often rely on container sharing and multi-tenancy where, for instance, data can be stored in the same physical space but is logically mapped into different security compartments or data structures. In turn, these security compartments, into which data is classified by security policies, can also be dynamic and depend on runtime data. In this thesis we introduce and develop the novel notion of dependent information flow types, and focus on the problem of ensuring data confidentiality in data-centric software. Dependent information flow types fit within the standard framework of dependent type theory but, unlike usual dependent types, crucially allow the security level of a type, rather than just the structural data type itself, to depend on runtime values. Our dependent function and dependent sum information flow types provide a direct, natural and elegant way to express and enforce fine-grained security policies on programs, namely programs that manipulate structured data types in which the security level of a structure field may depend on values dynamically stored in other fields. The main contribution of this work is an efficient analysis that allows programmers to verify, during the development phase, whether programs have information leaks, that is, whether they protect the confidentiality of the information they manipulate. As such, we also implemented a prototype typechecker, which can be found at http://ctp.di.fct.unl.pt/DIFTprototype/.
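The thesis's type system cannot be expressed directly in a mainstream language, so the following is only an illustrative sketch (the `Message` record and `read_content` check are hypothetical): it mimics at runtime the kind of field-level policy that dependent information flow types would verify statically, namely that who may read one field depends on the value stored in another field of the same record.

```python
# Illustrative sketch only (not the thesis's type system): a record whose
# confidentiality policy for one field depends on the value of another field.
from dataclasses import dataclass

@dataclass
class Message:
    recipient: str   # runtime value that determines the security level...
    content: str     # ...of this field: only `recipient` may read it

def read_content(msg: Message, reader: str) -> str:
    # Dynamic check standing in for what dependent information flow types
    # would establish statically, at typechecking time.
    if reader != msg.recipient:
        raise PermissionError(f"{reader} may not read a message addressed to {msg.recipient}")
    return msg.content

msg = Message(recipient="alice", content="meeting at noon")
print(read_content(msg, "alice"))      # allowed
# read_content(msg, "bob")             # would raise PermissionError
```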
Abstract:
Biochemical systems are commonly modelled by systems of ordinary differential equations (ODEs). A particular class of such models called S-systems have recently gained popularity in biochemical system modelling. The parameters of an S-system are usually estimated from time-course profiles. However, finding these estimates is a difficult computational problem. Moreover, although several methods have been recently proposed to solve this problem for ideal profiles, relatively little progress has been reported for noisy profiles. We describe a special feature of a Newton-flow optimisation problem associated with S-system parameter estimation. This enables us to significantly reduce the search space, and also lends itself to parameter estimation for noisy data. We illustrate the applicability of our method by applying it to noisy time-course data synthetically produced from previously published 4- and 30-dimensional S-systems. In addition, we propose an extension of our method that allows the detection of network topologies for small S-systems. We introduce a new method for estimating S-system parameters from time-course profiles. We show that the performance of this method compares favorably with competing methods for ideal profiles, and that it also allows the determination of parameters for noisy profiles.
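For readers unfamiliar with the model class: in the standard S-system form each species obeys dX_i/dt = alpha_i * prod_j X_j^g_ij - beta_i * prod_j X_j^h_ij. The sketch below simulates a small hypothetical instance of this generic form; the parameter values are illustrative and unrelated to the 4- and 30-dimensional systems used in the paper.

```python
# Minimal sketch of the generic S-system ODE form (illustrative parameters):
# dX_i/dt = alpha_i * prod_j X_j**g_ij - beta_i * prod_j X_j**h_ij
import numpy as np
from scipy.integrate import solve_ivp

alpha = np.array([2.0, 1.5])                 # production rate constants
beta  = np.array([1.0, 1.0])                 # degradation rate constants
G = np.array([[0.0, -0.8], [0.5, 0.0]])      # kinetic orders (production)
H = np.array([[0.5, 0.0], [0.0, 0.5]])       # kinetic orders (degradation)

def s_system(t, x):
    prod = alpha * np.prod(x ** G, axis=1)   # alpha_i * prod_j x_j**g_ij
    deg  = beta  * np.prod(x ** H, axis=1)   # beta_i  * prod_j x_j**h_ij
    return prod - deg

sol = solve_ivp(s_system, (0.0, 10.0), [1.0, 1.0], t_eval=np.linspace(0, 10, 50))
print(sol.y[:, -1])                          # state near steady state
```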
Substance flow analysis as a tool for mitigating the impact of pharmaceuticals on the aquatic system
Abstract:
Production flow analysis (PFA) is a well-established methodology for transforming a traditional functional layout into a product-oriented layout. The method uses part routings to find natural clusters of workstations, forming production cells able to complete parts and components swiftly with simplified material flow. Once implemented, the scheduling system is based on period batch control, aiming to establish fixed planning, production and delivery cycles for the whole production unit. PFA is traditionally applied to job shops with functional layouts, and after reorganization into groups, lead times shorten, quality improves and personnel motivation increases. Several papers have documented this, yet no research has studied its application to service operations management. This paper aims to show, through real cases, that PFA can be applied not only to job-shop and assembly operations but also to back-office and service processes. The cases clearly show that PFA reduces non-value-adding operations, introduces flow by evening out bottlenecks and diminishes process variability, all of which contribute to efficient operations management.
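As an illustrative aside (the algorithm choice is mine, not necessarily the paper's), the natural clusters of workstations that PFA looks for are often found by rank order clustering of a part-machine incidence matrix built from part routings; the matrix below is hypothetical.

```python
# Illustrative sketch: rank order clustering on a small, hypothetical
# part-machine incidence matrix, of the kind PFA uses to form production cells.
import numpy as np

def rank_order_clustering(A, max_iter=50):
    """Reorder rows (machines) and columns (parts) by the binary value of
    their incidence pattern until the ordering stabilises."""
    A = A.copy()
    for _ in range(max_iter):
        row_keys = A @ (2 ** np.arange(A.shape[1])[::-1])     # binary weight of each row
        A = A[np.argsort(-row_keys, kind="stable")]
        col_keys = (2 ** np.arange(A.shape[0])[::-1]) @ A     # binary weight of each column
        A_sorted = A[:, np.argsort(-col_keys, kind="stable")]
        if np.array_equal(A_sorted, A):
            break
        A = A_sorted
    return A

# Rows = machines, columns = parts; 1 means the part visits the machine.
incidence = np.array([
    [1, 0, 1, 0, 0],
    [0, 1, 0, 1, 1],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 1, 0],
])
print(rank_order_clustering(incidence))   # block-diagonal structure reveals the cells
```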
Abstract:
A flow system coupled to a tungsten coil atomizer in an atomic absorption spectrometer (TCA-AAS) was developed for As(III) determination in waters, by extraction with sodium diethyldithiocarbamate (NaDDTC) as complexing agent and sorption of the As(III)-DDTC complex in a micro-column filled with 5 mg of C18 reversed phase (10 µL dry sorbent), followed by elution with ethanol. A complete pre-concentration/elution cycle took 208 s, with a 30 s sample load time (1.7 mL) and a 4 s elution time (71 µL). The interface and software for the synchronous control of two peristaltic pumps (RUN/STOP), an autosampler arm, seven solenoid valves, one injection valve, the electrothermal atomizer and the spectrometer Read function were constructed. The system was characterized and validated by analytical recovery studies performed both in synthetic solutions and in natural waters. Using a 30 s pre-concentration period, the working curve was linear between 0.25 and 6.0 µg L⁻¹ (r = 0.9976), the retention efficiency was 94 ± 1% (6.0 µg L⁻¹), and the pre-concentration coefficient was 28.9. The characteristic mass was 58 pg, the mean repeatability (expressed as the variation coefficient) was 3.4% (n = 5), the detection limit was 0.058 µg L⁻¹ (4.1 pg in the 71 µL of eluate injected into the coil), and the mean analytical recovery in natural waters was 92.6 ± 9.5% (n = 15). The procedure is simple, economical and less prone to sample loss and contamination, and the useful lifetime of the micro-column was between 200 and 300 pre-concentration cycles.
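As a quick consistency check of the figures quoted above (not part of the original procedure), the absolute detection limit follows directly from the concentration detection limit and the eluate volume injected into the coil.

```python
# Quick arithmetic check of the figures quoted in the abstract:
# 0.058 ug/L in 71 uL of eluate corresponds to about 4.1 pg of As.
lod_conc_ug_per_L = 0.058                                  # detection limit, ug L-1
eluate_volume_L   = 71e-6                                  # 71 uL expressed in litres
lod_mass_pg = lod_conc_ug_per_L * eluate_volume_L * 1e6    # ug -> pg
print(f"absolute detection limit ≈ {lod_mass_pg:.1f} pg")  # ≈ 4.1 pg
```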
Abstract:
In the present work, the development of a method based on the coupling of flow analysis (FA), hydride generation (HG) and gas-phase (GP) derivative molecular absorption spectrophotometry (D-EAM) is described for the determination of total antimony in antileishmanial products. The second-order derivative of the absorption spectrum (190-300 nm), measured at 224 nm (D² 224 nm), is used as the measurement criterion. Each of the parameters involved in the development of the proposed method was examined and optimized. The use of molecular absorption spectrophotometry in the gas phase as a continuous-mode detection system, instead of atomic absorption spectrometry, is the main strength of the analytical proposal.
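For illustration only (synthetic spectrum, not the authors' data or implementation), a second-derivative measurement of the kind described can be computed from a digitised absorbance spectrum with a Savitzky-Golay filter.

```python
# Illustrative sketch (synthetic data): second-derivative spectrophotometry
# using a Savitzky-Golay filter over the 190-300 nm window.
import numpy as np
from scipy.signal import savgol_filter

wavelength = np.arange(190.0, 300.0, 0.5)                        # nm
absorbance = np.exp(-((wavelength - 224.0) / 12.0) ** 2)          # synthetic band near 224 nm
absorbance += 0.002 * np.random.default_rng(0).normal(size=wavelength.size)

# Second derivative (deriv=2) of the smoothed spectrum; delta is the step in nm.
d2 = savgol_filter(absorbance, window_length=15, polyorder=3, deriv=2, delta=0.5)

i = np.argmin(d2)    # the second-derivative minimum tracks the band maximum
print(f"D2 minimum at {wavelength[i]:.1f} nm")                    # ≈ 224 nm
```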
Abstract:
We apply to the Senegalese input-output matrix of 1990, disaggregated into formal and informal activities, a recently designed structural analytical method (Minimal Flow Analysis) which makes it possible to depict the direct and indirect production linkages existing between activities.
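As background for the linkage computation (a textbook input-output sketch with hypothetical numbers, not the Minimal Flow Analysis procedure itself), direct and indirect production requirements are conventionally read from the Leontief inverse L = (I - A)^-1 of the technical-coefficient matrix.

```python
# Standard input-output linkage sketch (hypothetical 3-sector data, not the
# Senegalese matrix): direct and indirect requirements via the Leontief inverse.
import numpy as np

Z = np.array([[20., 15.,  5.],      # inter-industry flows
              [10., 30., 20.],
              [ 5., 10., 15.]])
x = np.array([100., 150., 80.])     # total output by sector

A = Z / x                           # technical coefficients a_ij = z_ij / x_j
L = np.linalg.inv(np.eye(3) - A)    # Leontief inverse: direct + indirect requirements

print(L)                            # l_ij: output of i needed per unit of final demand for j
print(L.sum(axis=0))                # column sums: backward linkage of each sector
```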
Abstract:
The purpose of this paper is to present the application of a three-phase, time-domain harmonic propagation analysis tool that uses the Norton model to represent non-linear loads, making the harmonic current flow more realistic for operational analysis and for analysing the influence of mitigation elements. This software makes it possible to obtain results closer to those of the real distribution network, considering voltage and current unbalance and the application of mitigation elements for harmonic distortion. In this scenario, a real case study with network data and equipment connected to the network is presented, together with the modelling of non-linear loads based on real data obtained from PCCs (Points of Common Coupling) of interest to a distribution company.
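As a simplified sketch of the Norton approach mentioned above (a single hypothetical bus with illustrative values, not the paper's network model), each non-linear load at harmonic order h is represented by an injected current in parallel with an admittance, and the harmonic voltage follows from the nodal equation.

```python
# Simplified sketch of a Norton-type harmonic study at a single bus
# (hypothetical values, not the paper's distribution network).
import numpy as np

f1 = 60.0                      # fundamental frequency, Hz
R, L_th = 0.5, 1.0e-3          # Thevenin resistance (ohm) and inductance (H) of the supply

# Norton model of the non-linear load per harmonic h: injected current I_h (A)
# in parallel with a load admittance Y_load (S). Values are illustrative.
norton = {5: (12.0 * np.exp(1j * 0.3), 0.02 + 0.01j),
          7: ( 8.0 * np.exp(-1j * 0.8), 0.02 + 0.015j)}

for h, (I_h, Y_load) in norton.items():
    Z_sys = R + 1j * 2 * np.pi * f1 * h * L_th    # supply impedance at harmonic h
    Y_bus = 1.0 / Z_sys + Y_load                  # nodal admittance seen by the injection
    V_h = I_h / Y_bus                             # harmonic voltage at the bus
    print(f"h={h}: |V|={abs(V_h):.1f} V, angle={np.angle(V_h, deg=True):.1f} deg")
```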
Abstract:
An extensive sample (2%) of private vehicles in Italy is equipped with a GPS device that periodically measures their position and dynamical state for insurance purposes. Access to this type of data allows the development of theoretical and practical applications of great interest: the real-time reconstruction of the traffic state in a certain region, the development of accurate models of vehicle dynamics, and the study of the cognitive dynamics of drivers. For these applications to be possible, we first need the ability to reconstruct the paths taken by vehicles on the road network from the raw GPS data. These data are affected by positioning errors and the measurements are often far apart from each other (~2 km), so the task of path identification is not straightforward. This thesis describes the approach we followed to reliably identify vehicle paths from this kind of low-sampling data. The problem of matching data with roads is solved with a Bayesian maximum-likelihood approach, while the identification of the path taken between two consecutive GPS measurements is performed with a specifically developed optimal routing algorithm based on the A* algorithm. The procedure was applied to an off-line urban data sample and proved to be robust and accurate. Future developments will extend the procedure to real-time execution and nation-wide coverage.
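As a generic illustration of the routing step (a textbook A* search on a hypothetical graph, not the thesis's specific algorithm or cost function):

```python
# Generic A* sketch on a small hypothetical road graph, with straight-line
# distance as the admissible heuristic.
import heapq, math

nodes = {"A": (0, 0), "B": (1, 1), "C": (2, 0), "D": (3, 1)}     # node -> (x, y)
edges = {"A": [("B", 1.5), ("C", 2.2)],                          # node -> [(neighbour, length)]
         "B": [("D", 2.1)],
         "C": [("D", 1.6)],
         "D": []}

def heuristic(u, v):
    (x1, y1), (x2, y2) = nodes[u], nodes[v]
    return math.hypot(x2 - x1, y2 - y1)

def a_star(start, goal):
    frontier = [(heuristic(start, goal), 0.0, start, [start])]   # (f, g, node, path)
    best_g = {start: 0.0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        for nbr, w in edges[node]:
            g2 = g + w
            if g2 < best_g.get(nbr, math.inf):
                best_g[nbr] = g2
                heapq.heappush(frontier, (g2 + heuristic(nbr, goal), g2, nbr, path + [nbr]))
    return None, math.inf

print(a_star("A", "D"))     # -> (['A', 'B', 'D'], 3.6)
```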