996 results for libreria, Software, Database, ORM, transazionalità
Abstract:
ACM Computing Classification System (1998): D.2.5, D.2.9, D.2.11.
Abstract:
Component-based Software Engineering (CBSE) and Service-Oriented Architecture (SOA) have become popular approaches to software development in recent years. During the life cycle of a software system, several components and services may be developed, evolved and replaced. In production environments, the replacement of core components, such as databases, is often a risky and delicate operation in which several factors and stakeholders must be considered. A Service Level Agreement (SLA), according to ITILv3's official glossary, is "an agreement between an IT service provider and a customer. The agreement consists of a set of measurable constraints that a service provider must guarantee to its customers." In practical terms, an SLA is a document that a service provider delivers to its consumers with minimum quality-of-service (QoS) metrics. This work assesses and improves the use of SLAs to guide the transition of databases in production environments. In particular, we propose SLA-based guidelines and a process to support migrations from a relational database management system (RDBMS) to a NoSQL one. Our study is validated through case studies.
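The abstract above treats an SLA as a set of measurable QoS constraints. A minimal sketch of that idea follows, in Python; the metric names, thresholds and the check_sla helper are hypothetical illustrations, not the guidelines or tooling proposed in the work.

```python
# Minimal sketch: an SLA modelled as measurable QoS constraints that can be
# checked while a database is being migrated. All names/values are assumptions.
from dataclasses import dataclass

@dataclass
class SlaConstraint:
    metric: str              # e.g. "p99_read_latency_ms", "availability_pct"
    threshold: float
    higher_is_better: bool

    def satisfied_by(self, observed: float) -> bool:
        return observed >= self.threshold if self.higher_is_better else observed <= self.threshold

# Hypothetical SLA for a database being migrated from an RDBMS to a NoSQL store.
SLA = [
    SlaConstraint("p99_read_latency_ms", 50.0, higher_is_better=False),
    SlaConstraint("availability_pct", 99.9, higher_is_better=True),
]

def check_sla(observed_metrics: dict[str, float]) -> list[str]:
    """Return the SLA constraints violated by the observed QoS metrics."""
    return [
        c.metric
        for c in SLA
        if c.metric in observed_metrics and not c.satisfied_by(observed_metrics[c.metric])
    ]

if __name__ == "__main__":
    # A violated constraint would flag this migration step for review or rollback.
    print(check_sla({"p99_read_latency_ms": 72.0, "availability_pct": 99.95}))
```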
Abstract:
This paper discusses the advantages of database-backed websites and describes the model for a library website implemented at the University of Nottingham using open-source software, PHP and MySQL. As websites continue to grow in size and complexity, it becomes increasingly important to introduce automation to help manage them. It is suggested that a database-backed website offers many advantages over one built from static HTML pages, including consistency of style and content, the ability to present different views of the same data, devolved editing and enhanced security. The University of Nottingham Library Services website is described, and issues surrounding its design, technological implementation and management are explored.
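The "different views of the same data through one template" argument can be illustrated with a tiny sketch. The Nottingham site was built with PHP and MySQL; the snippet below uses Python and SQLite purely for illustration, and the resource table and render_view helper are invented for the example.

```python
# Illustration only: pages are generated from database rows through one shared
# template, which gives a database-backed site its consistent style and lets the
# same data be presented in different views.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE resource(title TEXT, url TEXT, subject TEXT);
    INSERT INTO resource VALUES
        ('MEDLINE', 'https://example.org/medline', 'Medicine'),
        ('Web of Science', 'https://example.org/wos', 'Multidisciplinary');
""")

TEMPLATE = "<html><body><h1>{heading}</h1><ul>{items}</ul></body></html>"

def render_view(heading: str, where: str = "1=1", params: tuple = ()) -> str:
    """Render any selection of the same resource data through one shared template."""
    rows = conn.execute(f"SELECT title, url FROM resource WHERE {where}", params)
    items = "".join(f'<li><a href="{u}">{t}</a></li>' for t, u in rows)
    return TEMPLATE.format(heading=heading, items=items)

print(render_view("All databases"))
print(render_view("Medicine databases", "subject = ?", ("Medicine",)))
```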
Abstract:
One way to carry out a bibliometric study is to examine each of the records that make up a database and extract key fields that may disclose relevant information about the use of the database and of the documents in the collection. This article shows how a reference database makes it possible to obtain important data and to reach conclusions that are in some cases surprising. For this study the following fields of the documentary control database of the Indigenous Nationalities of Costa Rica, 1979-2003, were used: author, place of publication, publisher, year, language and format. The database analyzed contains two thousand records and was developed in Winisis. The analysis of the documents was carried out after processing the data, which consisted of exporting the Winisis records to Excel. The information in the selected fields was then extracted and presented in separate tables or graphs. Furthermore, we show the application of different methods to learn more about scientometric aspects, such as the Price Index and the Collaboration Index. This contribution will, first, allow students of the Metric Studies course in the Library and Information Science programme at the Universidad Nacional to demonstrate and practice what they have learned in this area. It may also benefit professionals from different areas, such as anthropologists, sociologists, linguists and librarians, among others.
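The abstract mentions the Price Index and the Collaboration Index without giving formulas. A rough sketch follows, assuming the Price Index is the share of records no more than five years old at the time of analysis and the Collaboration Index is the mean number of authors per record; the CSV layout and helper names are likewise assumptions about the Winisis-to-Excel export.

```python
# Sketch of the two indicators over records exported from Winisis; definitions
# and export layout are assumptions, not the article's exact method.
import csv

ANALYSIS_YEAR = 2003  # end of the period covered by the database (1979-2003)

def price_index(years: list[int], analysis_year: int = ANALYSIS_YEAR) -> float:
    """Percentage of documents no more than five years old at analysis time."""
    recent = sum(1 for y in years if analysis_year - y <= 5)
    return 100.0 * recent / len(years)

def collaboration_index(author_counts: list[int]) -> float:
    """Mean number of authors per document."""
    return sum(author_counts) / len(author_counts)

def load_records(path: str):
    """Assumed export layout: 'year' and 'authors' columns, authors separated by ';'."""
    with open(path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            yield int(row["year"]), len(row["authors"].split(";"))

# Example with in-memory data instead of the real export:
years, author_counts = zip(*[(1999, 2), (2002, 1), (1985, 3), (2003, 2)])
print(f"Price Index: {price_index(list(years)):.1f}%")
print(f"Collaboration Index: {collaboration_index(list(author_counts)):.2f} authors/doc")
```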
Abstract:
In database applications, access control security layers are mostly developed with tools provided by vendors of database management systems and deployed on the same servers that contain the data to be protected. This solution has several drawbacks. Among them we emphasize: (1) if policies are complex, their enforcement can lead to performance decay of database servers; (2) when modifications to the established policies imply modifications to the business logic (usually deployed at the client side), there is no alternative but to modify the business logic accordingly; and (3) malicious users can issue CRUD expressions systematically against the DBMS, expecting to identify a security gap. To overcome these drawbacks, in this paper we propose an access control stack characterized as follows: most of the mechanisms are deployed at the client side; whenever security policies evolve, the security mechanisms are automatically updated at runtime; and client-side applications do not handle CRUD expressions directly. We also present an implementation of the proposed stack to prove its feasibility. This paper thus presents a new approach to enforcing access control in database applications, expecting to contribute positively to the state of the art in the field.
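A minimal sketch of the general idea, client applications invoking named, policy-checked operations instead of building CRUD expressions, is given below in Python. It is an illustration only, not the stack proposed in the paper; the OPERATIONS table, policy dictionary and execute/update_policy helpers are invented for the example.

```python
# Illustration: CRUD expressions are kept out of application code; clients call
# named operations, and the role-to-operation policy can be refreshed at runtime.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee(id INTEGER, name TEXT, salary REAL)")
conn.execute("INSERT INTO employee VALUES (1, 'Ada', 4200.0)")

# CRUD expressions live here, out of reach of client business logic.
OPERATIONS = {
    "list_employees": "SELECT id, name FROM employee",
    "read_salaries":  "SELECT id, name, salary FROM employee",
}

# Policy mapping roles to the operations they may invoke; in a real stack this
# would be pushed from a policy server and updated without redeploying clients.
policy = {"clerk": {"list_employees"}, "manager": {"list_employees", "read_salaries"}}

def execute(role: str, operation: str, params: tuple = ()):
    if operation not in policy.get(role, set()):
        raise PermissionError(f"role '{role}' may not invoke '{operation}'")
    return conn.execute(OPERATIONS[operation], params).fetchall()

def update_policy(new_policy: dict[str, set[str]]) -> None:
    """Runtime policy refresh: no change to client business logic is required."""
    policy.clear()
    policy.update(new_policy)

print(execute("manager", "read_salaries"))
# execute("clerk", "read_salaries")  # -> PermissionError
```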
Abstract:
Standardization facilitates communication and allows information to be exchanged with any national or international institution. This objective is achieved through communication formats for the exchange of automated information, such as CEPAL, MARC and FCC. The Escuela de Bibliotecología, Documentación e Información of the Universidad Nacional uses the MICROISIS software on a network for teaching. The databases designed there use the MARC format, and AACR2 for bibliographic description. The experience with the "I&D" database on rural development is presented, including the Field Definition Table, the worksheet, the display format and the Field Selection Table.
Abstract:
Master's dissertation, Informatics Engineering, Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2014
Abstract:
2016
Abstract:
The project described in this thesis was carried out at the Centro Protesi INAIL (Vigorso di Budrio, BO). The work supports a research project, funded by the US Department of Defense, in collaboration with Northwestern University in Chicago and the Minneapolis Veterans Affairs Health Care System. The research aims to determine the comparative effectiveness of alternative methods for casting the residual limb of lower-limb amputees and producing the subsequent custom socket. The thesis project arose from the absence of commercial software able to analyze how the shape of the residual limb evolves, from the cast to the finished socket, based on three-dimensional digitization of the surfaces. The library developed is implemented in Python and uses computational geometry algorithms and tools to support the data processing. The workflow comprises the following phases:
• data acquisition and pre-processing;
• digital identification of anatomical landmarks;
• alignment of the models to orient them in a global reference frame according to a common convention;
• registration of two models to align them to each other;
• generation of outcomes and dimensional parameters derived from distance maps, cross-sections, geodesic paths and regions of interest;
• extraction of summary statistical indicators of the differences, related to a set of scans through PCA.
The functionality was validated through dedicated tests on clinical data collected in the study or on synthetic data with known characteristics. The library provides a set of interfaces that makes it accessible to non-expert users as well, and it is characterized by modularity, ease of installation and extensibility of its functionality. Future developments include identifying possible optimizations through use of the tools across a wider set of use cases.
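Two of the listed steps, distance maps and PCA over a set of scans, can be sketched compactly. The snippet below is not the thesis library's API; it assumes scans are available as NumPy point arrays already registered in a common reference frame, and the function names are illustrative.

```python
# Sketch under stated assumptions: nearest-neighbour distance map between two
# registered scans, and SVD-based PCA over a stack of such maps.
import numpy as np
from scipy.spatial import cKDTree

def distance_map(source_points: np.ndarray, target_points: np.ndarray) -> np.ndarray:
    """Distance from each source point to the nearest point of the target scan.

    Both inputs are (N, 3) vertex arrays, assumed already registered.
    """
    tree = cKDTree(target_points)
    distances, _ = tree.query(source_points)
    return distances

def pca_of_maps(maps: np.ndarray, n_components: int = 2):
    """PCA over a stack of distance maps, one row per cast/socket pair.

    `maps` is (n_scans, n_points); all maps must be sampled on corresponding points.
    """
    centered = maps - maps.mean(axis=0)
    u, s, vt = np.linalg.svd(centered, full_matrices=False)  # rows of vt: modes of variation
    scores = u[:, :n_components] * s[:n_components]
    explained = (s ** 2) / np.sum(s ** 2)
    return scores, vt[:n_components], explained[:n_components]

# Usage with synthetic data in place of real scans:
rng = np.random.default_rng(0)
cast, socket = rng.random((500, 3)), rng.random((500, 3))
dmap = distance_map(cast, socket)
scores, modes, explained = pca_of_maps(rng.random((10, 500)))
```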
Abstract:
The focus of this thesis is the maintenance of road elements to improve road safety. The goal of the research is to prioritise maintenance of safety barriers based on factors such as the terrain of the site, deformations, degradation of the components, and adherence to the original installation. From these factors a coefficient is calculated to determine the maintenance priority of each barrier. To ease understanding and visualisation, the data were uploaded and processed in a GIS environment to generate analyses and maps, using free and open-source GIS software. Information about the features of the barriers was collected through both on-site and online examination. During on-site inspections, a database of geotagged photos was created to aid the survey. GIS capabilities were fully exploited through geoprocessing tools for more in-depth analysis.
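The abstract does not give the formula for the maintenance-priority coefficient, so the sketch below only illustrates the idea: each surveyed factor is scored on a common scale and combined as a weighted sum. Factor names, scales and weights are assumptions, not the author's method.

```python
# Illustrative sketch only: assumed factors, scales and weights.
from dataclasses import dataclass

@dataclass
class BarrierSurvey:
    terrain_severity: float        # 0 (flat, forgiving) .. 1 (steep slope / hazard behind)
    deformation: float             # 0 (none) .. 1 (severe)
    degradation: float             # 0 (components intact) .. 1 (corroded / missing parts)
    installation_deviation: float  # 0 (as designed) .. 1 (major deviation from original installation)

# Assumed relative weights; in practice these would be calibrated by the analyst.
WEIGHTS = {
    "terrain_severity": 0.30,
    "deformation": 0.30,
    "degradation": 0.25,
    "installation_deviation": 0.15,
}

def priority_coefficient(s: BarrierSurvey) -> float:
    """Weighted sum in [0, 1]; higher means the barrier should be maintained sooner."""
    return (
        WEIGHTS["terrain_severity"] * s.terrain_severity
        + WEIGHTS["deformation"] * s.deformation
        + WEIGHTS["degradation"] * s.degradation
        + WEIGHTS["installation_deviation"] * s.installation_deviation
    )

if __name__ == "__main__":
    sample = BarrierSurvey(0.8, 0.4, 0.6, 0.2)
    print(f"priority = {priority_coefficient(sample):.2f}")  # value could feed a GIS attribute table
```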
Abstract:
Different types of water bodies, including lakes, streams, and coastal marine waters, are often susceptible to fecal contamination from a range of point and nonpoint sources, and have been evaluated using fecal indicator microorganisms. The most commonly used fecal indicator is Escherichia coli, but traditional cultivation methods do not allow discrimination of the source of pollution. The use of triplex PCR offers an approach that is fast and inexpensive, and here enabled the identification of phylogroups. The phylogenetic distribution of E. coli subgroups isolated from water samples revealed higher frequencies of subgroups A1 and B23 in rivers impacted by human pollution sources, while subgroups D1 and D2 were associated with pristine sites, and subgroup B1 with domesticated animal sources, suggesting their use as a first screening for pollution source identification. A simple classification is also proposed based on phylogenetic subgroup distribution using the w-clique metric, enabling differentiation of polluted and unpolluted sites.
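For readers unfamiliar with the subgroup nomenclature used above, phylogenetic subgroups can be assigned from the three triplex-PCR markers (chuA, yjaA and the TspE4.C2 fragment). The sketch below encodes one widely used assignment table; the exact rules applied in the study are not stated in the abstract, so treat it as an assumption for illustration.

```python
# One commonly used subgroup assignment from triplex-PCR marker presence/absence
# (assumed here; verify against the study's cited classification scheme).
def phylo_subgroup(chuA: bool, yjaA: bool, tspE4: bool) -> str:
    if chuA:
        if yjaA:
            return "B23" if tspE4 else "B22"   # phylogroup B2
        return "D2" if tspE4 else "D1"         # phylogroup D
    if tspE4:
        return "B1"                            # phylogroup B1
    return "A1" if yjaA else "A0"              # phylogroup A

# A strain positive for chuA and yjaA and negative for TspE4.C2:
print(phylo_subgroup(chuA=True, yjaA=True, tspE4=False))  # -> B22
```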
Abstract:
This article aimed at comparing the accuracy of the linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with an i-CAT Next Generation unit. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages, XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass correlation coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; a one-way analysis of variance with the Dunnett post-hoc test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The smallest differences between the software-derived measurements and the gold standard were obtained with OnDemand3D and KDIS3D (-0.11 and -0.14 mm, respectively), and the largest with XoranCat (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data.
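The comparison against the gold standard can be reproduced in outline with SciPy (version 1.11 or later provides scipy.stats.dunnett). The data below are synthetic and the layout is assumed; the sketch only illustrates the one-way ANOVA plus Dunnett post-hoc design described in the abstract.

```python
# Sketch of the statistical design: ANOVA across groups, then Dunnett's test
# comparing each software's measurements against the physical gold standard.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
gold = rng.normal(12.0, 1.5, size=40)               # hypothetical caliper measurements (mm)
xorancat = gold + rng.normal(0.25, 0.3, size=40)    # hypothetical software-derived values
ondemand3d = gold + rng.normal(-0.11, 0.3, size=40)
kdis3d = gold + rng.normal(-0.14, 0.3, size=40)

# Omnibus one-way ANOVA across all four groups.
f_stat, p_anova = stats.f_oneway(gold, xorancat, ondemand3d, kdis3d)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.3f}")

# Dunnett's post-hoc test: each software vs the gold-standard control.
res = stats.dunnett(xorancat, ondemand3d, kdis3d, control=gold)
for name, p in zip(["XoranCat", "OnDemand3D", "KDIS3D"], res.pvalue):
    print(f"{name} vs gold standard: p = {p:.3f}")
```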
Abstract:
Despite a strong increase in research on the ecology and biogeography of seamounts and oceanic islands, many basic aspects of their biodiversity are still unknown. In the southwestern Atlantic, the Vitória-Trindade Seamount Chain (VTC) extends ca. 1,200 km offshore from the Brazilian continental shelf, from the Vitória seamount to the oceanic islands of Trindade and Martin Vaz. For a long time, most of the available biological information concerned its islands. Our study presents and analyzes an extensive database on VTC fish biodiversity, built on data compiled from the literature and from recent scientific expeditions that assessed both shallow and mesophotic environments. A total of 273 species were recorded, 211 of which occur on seamounts and 173 at the islands. New records for seamounts or islands include 191 reef fish species and 64 depth-range extensions. The structure of fish assemblages was similar between islands and seamounts, not differing in species geographic distribution, trophic composition, or spawning strategies. The main differences were related to endemism, higher at the islands, and to the number of endangered species, higher at the seamounts. Since unregulated fishing activities are common in the region, and mining activities are expected to increase drastically in the near future (carbonates on seamount summits and metals on slopes), this unique biodiversity needs urgent attention and management.