922 results for data storage concept


Relevance:

80.00%

Publisher:

Abstract:

As the legislation in force on renewable energy is amended to encourage its use, the need arises to synchronize an installation's consumption with its own production. Market-leading companies already offer solutions that collect data from photovoltaic installations for later monitoring and presentation to the customer. However, these solutions have drawbacks, such as their price and limits on the permitted installed power. In this context, this document describes a solution that serves as a far cheaper alternative to those offered by the leading global brands in this area, and that is also the only available solution developed nationally. As proof of the solution's functionality, different types of tests simulating a user's interaction with the developed solution, carried out on real solar photovoltaic installations, are described and presented; their results are analyzed and demonstrate the solution's ease of use.

Relevance:

80.00%

Publisher:

Abstract:

The development of organic materials displaying high two-photon absorption (TPA) has attracted much attention in recent years due to a variety of potential applications in photonics and optoelectronics, such as three-dimensional optical data storage, fluorescence imaging, two-photon microscopy, optical limiting, microfabrication, photodynamic therapy, upconverted lasing, etc. The most frequently employed structural motifs for TPA materials are donor–pi bridge–acceptor (D–pi–A) dipoles, donor–pi bridge–donor (D–pi–D) and acceptor–pi bridge–acceptor (A–pi–A) quadrupoles, octupoles, etc. In this work we present the synthesis and photophysical characterization of quadrupolar heterocyclic systems with potential applications in materials and biological sciences as TPA chromophores. Indole is a versatile building block for the synthesis of heterocyclic systems for several optoelectronic applications (chemosensors, nonlinear optics, OLEDs) due to its photophysical properties and electron-donating ability, and the 4H-pyran-4-ylidene fragment is frequently used for the synthesis of red light-emitting materials. On the other hand, 2-(2,6-dimethyl-4H-pyran-4-ylidene)malononitrile (1) and 1,3-diethyl-dihydro-5-(2,6-dimethyl-4H-pyran-4-ylidene)-2-thiobarbituric acid (2) units are usually used as strong acceptor moieties for the preparation of π-conjugated systems of the push-pull type. These building blocks were prepared by Knoevenagel condensation of the corresponding ketone precursor with malononitrile or 1,3-diethyl-dihydro-2-thiobarbituric acid. The new quadrupolar 4H-pyran-4-ylidene fluorophores (3) derived from indole were prepared through condensation of 5-methyl-1H-indole-3-carbaldehyde with the acceptor precursors 1 and 2 in the presence of a catalytic amount of piperidine. The new compounds were characterized by the usual spectroscopic techniques (UV-vis, FT-IR and multinuclear NMR: 1H, 13C).

Relevance:

80.00%

Publisher:

Abstract:

Special issue guest editorial, June 2015.

Relevance:

80.00%

Publisher:

Abstract:

Dissertation for the integrated master's degree in Industrial Electronics and Computer Engineering

Relevance:

80.00%

Publisher:

Abstract:

Therapeutic drug monitoring (TDM) aims to optimize treatment by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation currently represents the gold-standard TDM approach but requires computational assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from a blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user-friendly.
Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be assessed against the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including by non-experienced users. Computer-assisted TDM is attracting growing interest and should improve further, especially in terms of information-system interfacing, user-friendliness, data storage capability and report generation.
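The Bayesian a posteriori adjustment described above can be illustrated with a toy model. This is a hedged sketch with invented drug parameters, not the algorithm of MwPharm©, TCIWorks or any other benchmarked program: given a lognormal population prior on clearance and a single measured concentration, the maximum a posteriori (MAP) clearance balances the prior against the fit to the observation.

```python
import math

# Toy one-compartment IV-bolus model: C(t) = Dose/V * exp(-CL/V * t).
# All parameter values (cl_pop, omega, sigma, dose, V) are illustrative
# assumptions, not taken from any real drug model.

def predicted_conc(dose, V, CL, t):
    """Plasma concentration at time t for a one-compartment IV bolus."""
    return dose / V * math.exp(-CL / V * t)

def map_clearance(dose, V, t_obs, c_obs,
                  cl_pop=5.0, omega=0.3, sigma=0.2):
    """Grid-search MAP estimate of clearance CL.

    Prior:  log(CL)    ~ Normal(log(cl_pop), omega^2)
    Error:  log(c_obs) ~ Normal(log(C_pred), sigma^2)
    """
    best_cl, best_logpost = None, -math.inf
    for i in range(1, 2000):
        cl = i * 0.01                               # grid over 0.01 .. 19.99 L/h
        pred = predicted_conc(dose, V, cl, t_obs)
        log_prior = -((math.log(cl) - math.log(cl_pop)) ** 2) / (2 * omega ** 2)
        log_lik = -((math.log(c_obs) - math.log(pred)) ** 2) / (2 * sigma ** 2)
        if log_prior + log_lik > best_logpost:
            best_cl, best_logpost = cl, log_prior + log_lik
    return best_cl

# A measured level slightly below the population prediction pulls the
# individual clearance estimate slightly above the population value.
cl_map = map_clearance(dose=1000.0, V=30.0, t_obs=8.0, c_obs=8.0)
```

A real TDM tool estimates several parameters at once from multiple levels using proper numerical optimization or full posterior sampling; the one-parameter grid search here only conveys the shape of the calculation.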

Relevance:

80.00%

Publisher:

Abstract:

In the field of fingerprints, the rise of computer tools has made it possible to create powerful automated search algorithms. These algorithms allow, inter alia, a fingermark to be compared against a fingerprint database and therefore a link to be established between the mark and a known source. With the growth in the capacity of these systems and of data storage, as well as increasing collaboration between police services at the international level, the size of these databases is increasing. The current challenge for the field of fingerprint identification is the growth of these databases, which makes it possible to find impressions that are very similar but come from distinct fingers. At the same time, however, these data and systems allow a description of the variability between different impressions from the same finger and between impressions from different fingers. This statistical description of the within- and between-finger variabilities, computed on the basis of minutiae and their relative positions, can then be used in a statistical approach to interpretation. The computation of a likelihood ratio, which simultaneously involves the comparison between the mark and the print of the case, the within-variability of the suspect's finger and the between-variability of the mark with respect to a database, can then be based on representative data. These data thus allow an evaluation that may be more detailed than one obtained by applying rules established long before the advent of these large databases, or by the specialist's experience alone. The goal of the present thesis is to evaluate likelihood ratios computed from the scores of an automated fingerprint identification system (AFIS) when the source of the tested and compared marks is known. These ratios must support the hypothesis known to be true, and they should support it more and more strongly as information is added in the form of additional minutiae.
For the modeling of within- and between-variability, the necessary data were defined and acquired for one finger of a first donor and two fingers of a second donor. The database used for between-variability includes approximately 600,000 inked prints. The minimal number of observations necessary for a robust estimation was determined for the two distributions used. Factors that influence these distributions were also analyzed: the number of minutiae included in the configuration and the configuration as such for both distributions, as well as the finger number and the general pattern for between-variability, and the orientation of the minutiae for within-variability. In the present study, the only factor for which no influence was shown is the orientation of minutiae. The results show that likelihood ratios derived from the scores of an AFIS can be used for evaluation. Relatively low rates of likelihood ratios supporting the hypothesis known to be false were obtained. The maximum rate of likelihood ratios supporting the hypothesis that the two impressions were left by the same finger, when the impressions in fact came from different fingers, is 5.2%, for a configuration of 6 minutiae. When a 7th and then an 8th minutia are added, this rate drops to 3.2% and then to 0.8%. In parallel, for these same configurations, the likelihood ratios obtained when the two impressions come from the same finger are on average on the order of 100, 1000 and 10000 for 6, 7 and 8 minutiae. These likelihood ratios can therefore be an important aid for decision making. Both positive evolutions linked to the addition of minutiae (a drop in the rate of likelihood ratios that could lead to an erroneous decision, and an increase in the value of the likelihood ratio) were observed systematically within the framework of the study.
Approximations based on 3 scores for within-variability and on 10 scores for between-variability were found, and showed satisfactory results.
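The score-based likelihood ratio at the core of this work can be sketched in miniature. This is a hedged illustration: the score samples and the simple normal densities below are invented, whereas the thesis models within- and between-finger score variability far more carefully and with far larger data sets.

```python
import math
import statistics

def normal_pdf(x, mu, sd):
    """Density of a Normal(mu, sd^2) distribution at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sd ** 2)) / (sd * math.sqrt(2 * math.pi))

def score_lr(score, same_scores, diff_scores):
    """LR = f(score | same finger) / f(score | different fingers),
    with each density fitted as a simple normal to the sample scores."""
    mu_s, sd_s = statistics.mean(same_scores), statistics.stdev(same_scores)
    mu_d, sd_d = statistics.mean(diff_scores), statistics.stdev(diff_scores)
    return normal_pdf(score, mu_s, sd_s) / normal_pdf(score, mu_d, sd_d)

# Invented AFIS comparison scores:
same = [410, 455, 430, 470, 445, 438]   # within-finger comparisons
diff = [120, 95, 140, 110, 130, 105]    # between-finger comparisons

lr_high = score_lr(440, same, diff)     # score typical of a same-finger pair
lr_low = score_lr(115, same, diff)      # score typical of different fingers
```

A score in the same-finger range yields a ratio far above 1 (support for same source), while a score in the different-finger range yields a ratio far below 1, mirroring the behavior the thesis evaluates as minutiae are added.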

Relevance:

80.00%

Publisher:

Abstract:

Today, with the rapid expansion of virtual learning platforms in education, there is a demand for functionality that can be used from these platforms and that answers the educational challenges of this new learning paradigm. This work integrates tools for creating, editing, viewing and storing concept maps into a learning platform that is currently widely used in both secondary and university education: Moodle. The concept-mapping tool chosen is VUE, developed at Tufts University. Both Moodle and VUE are developed as open-source software under free-software licenses, with all the advantages those licenses provide. The result is an integration that makes it possible to use concept maps within the Moodle platform.

Relevance:

80.00%

Publisher:

Abstract:

This document is the report of a final degree project (TFC) in the data warehousing area, within the Technical Engineering in Management Informatics studies of the Universitat Oberta de Catalunya (UOC). The project concerns the design, implementation and exploitation of a data warehouse for the Confederación Hidrográfica del Noreste (CHNE). Starting from its data sources, we will build a data warehouse from which a set of predefined reports can be obtained, according to the CHNE's specifications.
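As a hedged illustration of the star-schema design that typically underlies such a data warehouse (the actual CHNE sources, tables and reports are not described in this abstract, so every table, column and figure below is invented):

```python
import sqlite3

# A minimal star schema: one fact table of measurements keyed to a
# dimension table, plus one "predefined report" aggregation query.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_station (station_id INTEGER PRIMARY KEY, name TEXT, basin TEXT);
    CREATE TABLE fact_flow (station_id INTEGER, year INTEGER, flow_hm3 REAL);
""")
conn.executemany("INSERT INTO dim_station VALUES (?, ?, ?)",
                 [(1, "A", "Ebro"), (2, "B", "Ebro"), (3, "C", "Ter")])
conn.executemany("INSERT INTO fact_flow VALUES (?, ?, ?)",
                 [(1, 2020, 10.0), (2, 2020, 5.0), (3, 2020, 7.5)])

# Predefined report: total yearly flow per basin.
report = conn.execute("""
    SELECT d.basin, f.year, SUM(f.flow_hm3)
    FROM fact_flow f JOIN dim_station d USING (station_id)
    GROUP BY d.basin, f.year ORDER BY d.basin
""").fetchall()
```

The point of the design is that facts are loaded once from the operational sources and every predefined report becomes a join-and-aggregate over the fact table.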

Relevance:

80.00%

Publisher:

Abstract:

This work provides a general description of the multi-sensor data fusion concept, along with a new classification of currently used sensor fusion techniques for unmanned underwater vehicles (UUVs). Unlike previous proposals that base the classification on the sensors involved in the fusion, we propose a synthetic approach focused on the techniques involved in the fusion and their applications in UUV navigation. We believe this approach is better oriented towards the development of sensor fusion systems, since a sensor fusion architecture should first of all be focused on its goals and only then on the fused sensors.
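As a minimal example of one technique such a classification covers (an assumption-laden sketch, not taken from the paper): two independent sensors measuring the same quantity can be fused by inverse-variance weighting, which always yields a lower variance than either sensor alone. The sensor variances below are invented.

```python
def fuse(z1, var1, z2, var2):
    """Optimal linear fusion of two independent noisy measurements
    of the same quantity (static inverse-variance weighting)."""
    w1 = var2 / (var1 + var2)                    # more weight to the less noisy sensor
    w2 = var1 / (var1 + var2)
    fused = w1 * z1 + w2 * z2
    fused_var = (var1 * var2) / (var1 + var2)    # always below min(var1, var2)
    return fused, fused_var

# Hypothetical depth estimates: a precise pressure sensor vs. a noisier one.
depth, depth_var = fuse(10.2, 0.04, 10.8, 0.36)
```

This static rule is the degenerate, single-step case of the recursive estimators (e.g. Kalman filtering) commonly used for UUV navigation.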

Relevance:

80.00%

Publisher:

Abstract:

Of the approximately 25,000 bridges in Iowa, 28% are classified as structurally deficient, functionally obsolete, or both. Because many Iowa bridges require repair or replacement with a relatively limited funding base, there is a need to develop new bridge materials that may lead to longer life spans and reduced life-cycle costs. In addition, new and effective methods for determining the condition of structures are needed to identify when the useful life has expired or other maintenance is needed. Due to its unique alloy blend, high-performance steel (HPS) has been shown to offer better weldability, weathering resistance, and fracture toughness than conventional structural steels. Since the development of HPS in the mid-1990s, numerous bridges using HPS girders have been constructed, and many have been built economically. The East 12th Street Bridge, which replaced a deteriorated box girder bridge, is Iowa's first bridge constructed using HPS girders. The new structure is a two-span bridge that crosses I-235 in Des Moines, Iowa, providing one lane of traffic in each direction. A remote, continuous, fiber-optic-based structural health monitoring (SHM) system for the bridge was developed using off-the-shelf technologies. In the system, sensors strategically located on the bridge collect raw strain data and then transfer the data via wireless communication to a gateway system at a nearby secure facility. The data are integrated and converted to text files before being uploaded automatically to a website that provides live strain data and a live video stream. A data storage/processing system at the Bridge Engineering Center in Ames, Iowa, permanently stores and processes the data files. Several processes are performed to check the overall system's operation, eliminate temperature effects from the complete strain record, compute the global behavior of the bridge, and count strain cycles at the various sensor locations.
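Two of the processing steps mentioned, eliminating temperature effects and counting strain cycles, can be sketched as follows. This is an illustrative simplification, not the Bridge Engineering Center's actual pipeline: the moving-average detrend and threshold-crossing count stand in for more rigorous methods such as rainflow cycle counting.

```python
def detrend(strains, window):
    """Subtract a centered moving average (half-width `window` samples)
    to remove slow thermal drift, leaving short truck-induced events."""
    out = []
    for i in range(len(strains)):
        lo, hi = max(0, i - window), min(len(strains), i + window + 1)
        baseline = sum(strains[lo:hi]) / (hi - lo)
        out.append(strains[i] - baseline)
    return out

def count_cycles(strains, threshold):
    """Count upward crossings of a strain threshold (one per load event)."""
    count = 0
    for prev, cur in zip(strains, strains[1:]):
        if prev < threshold <= cur:
            count += 1
    return count
```

On a synthetic record made of a slow drift plus two short strain spikes, detrending isolates the spikes so that the threshold count reports exactly two load events.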

Relevance:

80.00%

Publisher:

Abstract:

Today, electronic data-processing systems are increasingly significant in industry. Many needs arise in authentication systems, aeronautical electronics, data storage equipment, telecommunications, and so on. These technological needs demand control by a system that is reliable, robust, fully responsive to external events, and that correctly meets the imposed timing constraints so that it fulfils its purpose efficiently. This is where real-time embedded systems come into play: they offer high reliability and availability, fast response to external events, strong operating guarantees, and a wide range of applications. This project is intended as an introduction to the world of embedded systems, and also explains the operation of the FreeRTOS real-time operating system, whose programming model is based on mutually independent tasks. We give an overview of how it works, how it organizes tasks by means of a scheduler, and some examples for designing applications with it.
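FreeRTOS itself is written in C, but the core rule its scheduler applies, always run the highest-priority task that is ready, can be sketched in a few lines. This is a toy simulation with invented task names and priorities, not FreeRTOS code:

```python
def schedule(tasks, ticks):
    """Simulate a fixed-priority scheduler: at each tick, run the
    highest-priority task whose release time has passed."""
    trace = []
    for tick in range(ticks):
        ready = [t for t in tasks if tick >= t["release"]]
        running = max(ready, key=lambda t: t["priority"])
        trace.append(running["name"])
    return trace

tasks = [
    {"name": "logger", "priority": 1, "release": 0},   # low-priority background task
    {"name": "sensor", "priority": 3, "release": 2},   # preempts once it becomes ready
]
trace = schedule(tasks, 4)   # logger runs until sensor is released at tick 2
```

In FreeRTOS the same effect is achieved with `xTaskCreate` priorities and the tick-driven scheduler, with blocking calls (queues, delays) determining when each task is ready.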

Relevance:

80.00%

Publisher:

Abstract:

Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measured blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities.

Method: The literature and the Internet were searched to identify software. All programs were tested on an ordinary personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them.

Results: 12 software tools were identified, tested and ranked, providing a comprehensive review of the available software's characteristics. The number of drugs handled varies widely, and 8 programs offer users the ability to add their own drug models. 10 programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 are also able to suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender and weight. Among those applying Bayesian analysis, one uses the non-parametric approach. The top 2 programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g. in terms of storage or report generation) or less user-friendly.

Conclusion: Whereas 2 integrated programs are at the top of the ranked list, such complex tools may not fit all institutions, and each software tool must be assessed against the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information-system interfacing, user-friendliness, data storage capacity and report generation.

Relevance:

80.00%

Publisher:

Abstract:

Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities.

Methods: The literature and the Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them.

Results: 12 software tools were identified, tested and ranked, providing a comprehensive review of the available software characteristics. The number of drugs handled varies from 2 to more than 180, and integration of different population types is available in some programs. Nevertheless, 8 programs offer the ability to add new drug models based on population PK data. 10 computer tools incorporate Bayesian computation to predict the dosage regimen (individual parameters are calculated based on population PK models). All of them are able to compute a Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 are also able to suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top 2 programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly.

Conclusions: Whereas 2 software packages are ranked at the top of the list, such complex tools may not fit all institutions, and each program must be assessed against the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including by non-experienced users. Although interest in TDM tools is growing and effort has been put into them in recent years, there is still room for improvement, especially in terms of institutional information-system interfacing, user-friendliness, data storage capability and automated report generation.