877 results for "Acceleration data structure"
Abstract:
The validation of an analytical procedure must be certified through the determination of parameters known as figures of merit. For first-order data, accuracy, precision, robustness and bias are assessed much as in univariate calibration methods. Linearity, sensitivity, signal-to-noise ratio, adjustment, selectivity and confidence intervals require different approaches, specific to multivariate data. Selectivity and the signal-to-noise ratio are more critical and can only be estimated by calculating the net analyte signal. In second-order calibration, some different approaches are necessary because of the data structure.
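As a hedged illustration (the formulas below are the standard Lorber-style definitions, not reproduced from this abstract), the net analyte signal (NAS) is the part of the measured response orthogonal to the space spanned by all other contributions, and multivariate selectivity and signal-to-noise ratio follow from it:

\[
\mathbf{r}_k^{*} = \left(\mathbf{I} - \mathbf{R}_{-k}\mathbf{R}_{-k}^{+}\right)\mathbf{r}_k, \qquad
\mathrm{SEL}_k = \frac{\lVert \mathbf{r}_k^{*} \rVert}{\lVert \mathbf{r}_k \rVert}, \qquad
(\mathrm{S/N})_k = \frac{\lVert \mathbf{r}_k^{*} \rVert}{\lVert \boldsymbol{\varepsilon} \rVert},
\]

where \(\mathbf{R}_{-k}\) collects the responses of all species except analyte \(k\), \(\mathbf{R}_{-k}^{+}\) is its pseudo-inverse, and \(\boldsymbol{\varepsilon}\) is the instrumental noise vector.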
Abstract:
The power loss reduction in distribution systems (DSs) is a nonlinear and multiobjective problem. Service restoration in DSs is even harder computationally, since it additionally requires a solution in real time. Both DS problems are computationally complex. For large-scale networks, the usual problem formulation has thousands of constraint equations. The node-depth encoding (NDE) enables a modeling of DS problems that eliminates several constraint equations from the usual formulation, making the problem solution simpler. On the other hand, a multiobjective evolutionary algorithm (EA) based on subpopulation tables adequately models several objectives and constraints, enabling a better exploration of the search space. The combination of the multiobjective EA with the NDE (MEAN) results in the proposed approach for solving DS problems for large-scale networks. Simulation results have shown that the MEAN is able to find adequate restoration plans for a real DS with 3860 buses and 632 switches in a running time of 0.68 s. Moreover, the MEAN has shown a sublinear running time as a function of the system size. Tests with networks ranging from 632 to 5166 switches indicate that the MEAN can find network configurations corresponding to a power loss reduction of 27.64% for very large networks while requiring relatively low running time.
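As an illustrative sketch only (names and operations are assumptions, not the authors' implementation), a node-depth encoding can be stored as an ordered list of (node, depth) pairs per feeder, obtained from a depth-first traversal; pruning and grafting a subtree then become simple list operations that preserve radiality by construction:

# Minimal, illustrative node-depth encoding (NDE) sketch.
# Each tree (feeder) is an ordered list of (node, depth) pairs from a
# depth-first traversal; a subtree is a contiguous slice of that list.

def subtree_slice(tree, root_index):
    """Return the half-open slice [root_index, end) holding the subtree rooted there."""
    root_depth = tree[root_index][1]
    end = root_index + 1
    while end < len(tree) and tree[end][1] > root_depth:
        end += 1
    return root_index, end

def transfer_subtree(src, dst, root_index, graft_index):
    """Prune the subtree rooted at src[root_index] and graft it as a child of dst[graft_index]."""
    start, end = subtree_slice(src, root_index)
    subtree = src[start:end]
    del src[start:end]
    shift = dst[graft_index][1] + 1 - subtree[0][1]
    dst[graft_index + 1:graft_index + 1] = [(n, d + shift) for n, d in subtree]

# Example: feeder A loses the subtree rooted at node 3, feeder B receives it.
feeder_a = [(0, 0), (1, 1), (3, 2), (4, 3), (2, 1)]
feeder_b = [(5, 0), (6, 1)]
transfer_subtree(feeder_a, feeder_b, root_index=2, graft_index=1)
print(feeder_a)  # [(0, 0), (1, 1), (2, 1)]
print(feeder_b)  # [(5, 0), (6, 1), (3, 2), (4, 3)]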
Abstract:
This study analyzes productive efficiency and the effects of concentration on banking costs, focusing on the Portuguese banking industry. The multiproduct character of the banking firm suggests the use of multiproduct cost functional forms such as the Fourier form. The introduction of homogeneity and structure variables allows single-product functional forms (Cobb-Douglas type) to be applied to banking. The sample covers 22 banks that operated in Portugal between 1995 and 2001, on a non-consolidated basis with a panel data structure. Inefficiency is studied through the stochastic frontier approach (SFA) for the two specifications selected. To analyze concentration, binary variables were introduced that are intended to capture its effects over the four years following the concentration process. In both the SFA and the concentration analyses, the results obtained are sensitive to the functional specification adopted. In sum, the banking concentration process seems to be justified by the possibility of reducing X-inefficiency.
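As an illustrative sketch (assumed notation, not reproduced from the study), a Cobb-Douglas stochastic cost frontier of the kind estimated by SFA can be written as

\[
\ln C_{it} = \beta_0 + \sum_{j}\beta_j \ln y_{jit} + \sum_{k}\gamma_k \ln w_{kit} + v_{it} + u_{it},
\qquad v_{it}\sim N(0,\sigma_v^2),\quad u_{it}\ge 0,
\]

where \(C_{it}\) is the total cost of bank \(i\) in year \(t\), \(y_j\) are outputs, \(w_k\) are input prices, \(v_{it}\) is random noise and the one-sided term \(u_{it}\) captures X-inefficiency; a Fourier-flexible form adds trigonometric terms in the rescaled log outputs.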
Abstract:
Environmental management is a complex task. The amount and heterogeneity of the data needed for an environmental decision making tool is overwhelming without adequate database systems and innovative methodologies. As far as data management, data interaction and data processing is concerned we here propose the use of a Geographical Information System (GIS) whilst for the decision making we suggest a Multi-Agent System (MAS) architecture. With the adoption of a GIS we hope to provide a complementary coexistence between heterogeneous data sets, a correct data structure, a good storage capacity and a friendly user’s interface. By choosing a distributed architecture such as a Multi-Agent System, where each agent is a semi-autonomous Expert System with the necessary skills to cooperate with the others in order to solve a given task, we hope to ensure a dynamic problem decomposition and to achieve a better performance compared with standard monolithical architectures. Finally, and in view of the partial, imprecise, and ever changing character of information available for decision making, Belief Revision capabilities are added to the system. Our aim is to present and discuss an intelligent environmental management system capable of suggesting the more appropriate land-use actions based on the existing spatial and non-spatial constraints.
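As a minimal sketch only (all names and the trivial belief-revision rule are assumptions for illustration, not the system's actual design), the agent decomposition described above could be organized as follows:

# Illustrative sketch of cooperating expert agents with revisable beliefs.

class Agent:
    """A semi-autonomous expert agent holding a revisable set of beliefs."""
    def __init__(self, name, skills):
        self.name = name
        self.skills = set(skills)   # tasks this agent can handle
        self.beliefs = {}           # proposition -> current truth value

    def can_solve(self, task):
        return task in self.skills

    def revise(self, proposition, value):
        # Belief revision in its simplest form: newer evidence overrides older.
        self.beliefs[proposition] = value

    def solve(self, task, gis_layer):
        # Placeholder: a real agent would query the GIS layer and reason here.
        return f"{self.name} handled '{task}' using layer '{gis_layer}'"

def dispatch(agents, task, gis_layer):
    """Dynamic problem decomposition: route the task to a capable agent."""
    for agent in agents:
        if agent.can_solve(task):
            return agent.solve(task, gis_layer)
    raise ValueError(f"no agent can solve {task!r}")

agents = [Agent("hydrology", ["flood-risk"]), Agent("land-use", ["zoning"])]
print(dispatch(agents, "zoning", gis_layer="parcels"))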
Abstract:
Master’s Thesis in Computer Engineering
Abstract:
Digital oscilloscopes are used in many areas of knowledge and are indispensable instruments in electronic engineering. Thanks to the advent of Field Programmable Gate Arrays (FPGAs), reconfigurable measurement instruments, given their advantages, i.e., high performance, low cost and high flexibility, are increasingly an alternative to the instruments traditionally used in laboratories. Aiming at standardizing the access to and control of this type of instrument, this thesis describes the design and implementation of a reconfigurable digital oscilloscope based on the IEEE 1451.0 standard. Defined according to an architecture based on this standard, the oscilloscope's characteristics are described in a data structure called the Transducer Electronic Data Sheet (TEDS), and its control is performed using a set of standardized commands. The oscilloscope implements a set of basic features and functionalities, all verified experimentally. These include a bandwidth of 575 kHz, a measurement range of 0.4 V to 2.9 V, and the possibility of defining a set of horizontal scales, the trigger level and slope, and the coupling mode with the circuit under analysis. Architecturally, the oscilloscope consists of a module specified in the Verilog hardware description language (HDL) and an interface developed in the Java® programming language. The module is embedded in an FPGA and performs all of the oscilloscope's processing; the interface allows its control and the display of the measured signal. The project used an analog-to-digital (A/D) converter with a maximum sampling frequency of 1.5 MHz and 14-bit resolution which, due to its limitations, required the implementation of a multi-stage interpolation system with digital filters.
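As a hedged illustration only (the parameters, filter length and library calls are assumptions, not the thesis implementation), a multi-stage interpolator can be built by cascading zero-insertion and FIR low-pass stages, e.g. two 2x stages applied to a stream sampled at the 1.5 MHz rate mentioned above:

# Illustrative two-stage interpolation sketch (assumed parameters).
import numpy as np
from scipy.signal import firwin, upfirdn

fs = 1.5e6                            # assumed A/D sampling rate (Hz)
t = np.arange(2048) / fs
x = np.sin(2 * np.pi * 100e3 * t)     # 100 kHz test tone

def interpolate_2x(signal):
    """Upsample by 2: zero insertion followed by an FIR low-pass filter."""
    lowpass = firwin(numtaps=63, cutoff=0.5)   # cutoff at the old Nyquist frequency
    return upfirdn(2 * lowpass, signal, up=2)  # gain of 2 restores the amplitude

y = interpolate_2x(interpolate_2x(x))          # two cascaded 2x stages -> 4x rate
print(len(x), len(y))                          # roughly four times more samples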
Abstract:
Thesis submitted in fulfillment of the requirements for the Degree of Master in Biomedical Engineering
Abstract:
Dissertation submitted for the Degree of Master in Biomedical Engineering
Abstract:
Dissertation submitted for the Degree of Master in Informatics Engineering
Abstract:
This paper introduces the metaphorism pattern of relational specification and addresses how specifications following this pattern can be refined into recursive programs. Metaphorisms express input-output relationships that preserve relevant information while some intended optimization takes place. Text processing, sorting, representation changers, etc., are examples of metaphorisms. The kind of metaphorism refinement proposed in this paper is a strategy known as change of virtual data structure. The paper gives sufficient conditions for such implementations to be calculated using relation algebra and illustrates the strategy with the derivation of quicksort as an example.
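For illustration only (the paper's derivation is carried out in relation algebra, not in code), the quicksort that such a calculation arrives at has the familiar recursive shape, with an intermediate binary search tree acting as the virtual data structure that is never actually built:

# Illustrative quicksort: the kind of recursive program the derivation yields.
def quicksort(xs):
    """Sort a list; the 'virtual' binary search tree is never materialized."""
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    smaller = [x for x in rest if x <= pivot]
    larger = [x for x in rest if x > pivot]
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]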
Abstract:
This project developed a web interface for computing routes in the city of Barcelona. The routes are calculated on foot, from an arbitrary origin point to a tourist point of interest in the city as the destination. To this end, the street data of Barcelona were extracted from OpenStreetMap and inserted into a postgreSQL/postGIS database, together with a vector layer of tourist points of interest created with the desktop GIS qGIS. Route calculation over the data in the database was performed with the pgRouting extension, and the web interface for selecting the origin and destination points, displaying the maps and showing the resulting routes was developed using the OpenLayers library.
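As a hedged sketch only (the table and column names such as ways, source, target and cost, the credentials and the vertex ids are assumptions about a typical osm2pgrouting-style import, not this project's actual schema), a shortest-path query against such a database could be issued from Python with pgRouting's pgr_dijkstra:

# Illustrative pgRouting query from Python (assumed schema and credentials).
import psycopg2

conn = psycopg2.connect(dbname="barcelona", user="gis", password="secret")
cur = conn.cursor()

sql = """
    SELECT seq, node, edge, cost
    FROM pgr_dijkstra(
        'SELECT gid AS id, source, target, cost FROM ways',
        %s,   -- origin vertex id
        %s,   -- destination vertex id
        directed := false)
"""
cur.execute(sql, (1234, 5678))     # hypothetical vertex ids
for seq, node, edge, cost in cur.fetchall():
    print(seq, node, edge, cost)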
Abstract:
The multiscale finite volume (MsFV) method has been developed to efficiently solve large heterogeneous problems (elliptic or parabolic); it is usually employed for pressure equations and delivers conservative flux fields to be used in transport problems. The method essentially relies on the hypothesis that the (fine-scale) problem can be reasonably described by a set of local solutions coupled by a conservative global (coarse-scale) problem. In most cases, the boundary conditions assigned for the local problems are satisfactory and the approximate conservative fluxes provided by the method are accurate. In numerically challenging cases, however, a more accurate localization is required to obtain a good approximation of the fine-scale solution. In this paper we develop a procedure to iteratively improve the boundary conditions of the local problems. The algorithm relies on the data structure of the MsFV method and employs a Krylov-subspace projection method to obtain an unconditionally stable scheme and to accelerate convergence. Two variants are considered: in the first, only the MsFV operator is used; in the second, the MsFV operator is combined in a two-step method with an operator derived from the problem solved to construct the conservative flux field. The resulting iterative MsFV algorithms allow arbitrary reduction of the solution error without compromising the construction of a conservative flux field, which is guaranteed at any iteration. Since it converges to the exact solution, the method can be regarded as a linear solver. In this context, the schemes proposed here can be viewed as preconditioned versions of the Generalized Minimal Residual method (GMRES), with the peculiar characteristic that the residual on the coarse grid is zero at every iteration (thus conservative fluxes can be obtained).
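As an illustrative sketch only (the preconditioner below is a generic stand-in, not the MsFV operator), the general mechanism of running GMRES with an approximate solve as preconditioner looks like this in SciPy; in the scheme described above, that role would be played by the MsFV (or two-step MsFV) operator:

# Illustrative preconditioned GMRES sketch with a stand-in preconditioner.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import LinearOperator, gmres

n = 200
A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Stand-in "approximate inverse" preconditioner: the inverse diagonal of A.
# In the iterative MsFV scheme this role is played by the MsFV operator.
M = LinearOperator((n, n), matvec=lambda r: r / A.diagonal())

x, info = gmres(A, b, M=M)
print(info, np.linalg.norm(A @ x - b))   # info == 0 signals convergence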
Abstract:
We present building blocks for algorithms for the efficient reduction of square factors, i.e., direct repetitions in strings. The basic problem is this: given a string, compute all strings that can be obtained by reducing factors of the form zz to z. Two types of algorithms are treated: an offline algorithm is one that can compute a data structure on the given string in advance, before the actual search for the square begins; in contrast, online algorithms receive all input only at the time when a request is made. For offline algorithms we treat the following problem: let u and w be two strings such that w is obtained from u by reducing a square factor zz to z. If we are further given the suffix table of u, how can we derive the suffix table of w without computing it from scratch? As the suffix table plays a key role in online algorithms for the detection of squares in a string, this derivation can make the iterated reduction of squares more efficient. On the other hand, we also show how a suffix array, used for the offline detection of squares, can be adapted to the new string resulting from the deletion of a square. Because the deletion is a very local change, this adaptation is more efficient than computing the new suffix array from scratch.
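As a simple illustration (a naive quadratic scan, not the paper's data-structure-based algorithms), computing the strings reachable by reducing one square factor zz to z can be done as follows:

# Naive illustration of a single square-factor reduction step (zz -> z).
def reduce_one_square(s):
    """Return every string obtainable from s by reducing one square factor zz to z."""
    results = set()
    n = len(s)
    for start in range(n):
        for half in range(1, (n - start) // 2 + 1):
            z = s[start:start + half]
            if s[start + half:start + 2 * half] == z:
                results.add(s[:start + half] + s[start + 2 * half:])
    return results

print(sorted(reduce_one_square("aabab")))   # ['aab', 'abab']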
Abstract:
OBJECTIVE To analyze the usability of the Computerized Nursing Process (CNP) based on ICNP® 1.0 in Intensive Care Units, in accordance with the criteria established by the standards of the International Organization for Standardization and the Brazilian Association of Technical Standards. METHOD This is a before-and-after, semi-experimental, quantitative study with a sample of 34 participants (nurses, professors and systems programmers), carried out in three Intensive Care Units. RESULTS The evaluated criteria (use, content and interface) showed that the CNP meets usability criteria, as it integrates a logical data structure, clinical assessment, diagnostics and nursing interventions. CONCLUSION The CNP is a source of information and knowledge that provides nurses with new ways of learning in intensive care, as it offers complete, comprehensive and detailed content, supported by current, relevant data and scientific research information for nursing practice.