981 results for Meta-Data Management


Relevance:

100.00%

Publisher:

Abstract:

The research activities involved the application of Geomatic techniques in the Cultural Heritage field, following the development of two themes. The first was the application of high-precision surveying techniques for the restoration and interpretation of relevant monuments and archaeological finds. The main case regards the generation of a high-fidelity 3D model of the Fountain of Neptune in Bologna. In this work, aimed at the restoration of the artifact, both the geometric and radiometric aspects were crucial. The final product served as the basis of a 3D information system, a shared tool through which the different specialists involved in the restoration activities contributed in a multidisciplinary approach. The second was the arrangement of 3D databases for a Building Information Modeling (BIM) approach, in a process which involves the generation and management of digital representations of the physical and functional characteristics of historical buildings, towards a so-called Historical Building Information Model (HBIM). A first application was conducted for the church of San Michele in Acerboli in Santarcangelo di Romagna. The survey was performed by integrating classical and modern Geomatic techniques, and the point cloud representing the church was used for the development of an HBIM model in which the relevant information connected to the building could be stored and georeferenced. A second application regards the domus of Obellio Firmo in Pompeii, likewise surveyed by integrating classical and modern Geomatic techniques. A historical analysis permitted the definition of construction phases and the organization of a database of materials and constructive elements. The goal is to obtain a federated model able to manage the documentary, analytical and reconstructive aspects.
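The abstract does not detail how the database of materials and constructive elements is structured; purely as an illustrative sketch of how elements, materials and historical phases might be linked in such an HBIM-oriented database (every field name and value below is invented, not the thesis schema):

```python
from dataclasses import dataclass

@dataclass
class ConstructiveElement:
    """One building element as it might be stored in an HBIM database
    (field names are hypothetical, not the thesis schema)."""
    element_id: str
    kind: str          # e.g. "wall", "column"
    material: str      # e.g. "opus incertum"
    phase: str         # historical construction phase
    georef: tuple      # coordinates in the survey reference frame

elements = [
    ConstructiveElement("W-001", "wall", "opus incertum", "pre-79 AD", (12.1, 4.3, 0.0)),
    ConstructiveElement("C-014", "column", "brick", "1900s restoration", (15.7, 2.2, 0.0)),
]

# Query the records by phase, as a restorer or historian might:
print([e.element_id for e in elements if e.phase == "pre-79 AD"])
```

Keeping the phase as an explicit, georeferenced attribute is what would let a federated model serve the documentary, analytical and reconstructive views from the same records.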

Relevance:

100.00%

Publisher:

Abstract:

Funding Sources: The NNUH Stroke and TIA Register is maintained by the NNUH NHS Foundation Trust Stroke Services, and data management for this study is supported by the NNUH Research and Development Department through Research Capability Funds.

Relevance:

100.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance:

100.00%

Publisher:

Abstract:

In a networked business environment, visibility requirements towards supply operations and the customer interface have become tighter. The case company's master data is seen as an enabler for meeting those requirements, but its current state and quality are not considered good enough. The target of this thesis was to develop a process for managing master data quality as a continuous activity, and to find solutions for cleansing the current customer and supplier data to meet the quality requirements defined in that process. Based on the theory of Master Data Management and data cleansing, a small amount of master data was analyzed and cleansed using a commercial data cleansing solution available on the market. This was conducted in cooperation with the vendor as a proof of concept, which demonstrated the solution's applicability to improving the quality of the current master data. Based on those findings and the theory of data management, recommendations and proposals for improving the quality of the data were given. The results also showed that the biggest reasons for poor data quality are the lack of data governance in the company and the restrictions of the current master data solutions.
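The cleansing itself was done with a commercial vendor tool that the abstract does not describe in detail; purely as an illustration of the normalize-then-match logic such cleansing involves, here is a minimal Python sketch (the records and the similarity threshold are made up):

```python
from difflib import SequenceMatcher

def normalize(record):
    """Trim, lowercase and collapse whitespace in free-text fields."""
    return {k: " ".join(str(v).split()).lower() for k, v in record.items()}

def find_duplicates(records, threshold=0.9):
    """Flag pairs of customer records whose name+city strings are near-identical."""
    normalized = [normalize(r) for r in records]
    pairs = []
    for i in range(len(normalized)):
        for j in range(i + 1, len(normalized)):
            a = normalized[i]["name"] + " " + normalized[i]["city"]
            b = normalized[j]["name"] + " " + normalized[j]["city"]
            if SequenceMatcher(None, a, b).ratio() >= threshold:
                pairs.append((i, j))
    return pairs

customers = [
    {"name": "ACME Oy ", "city": "Helsinki"},
    {"name": "Acme OY", "city": "helsinki"},
    {"name": "Beta Ltd", "city": "Espoo"},
]
print(find_duplicates(customers))  # [(0, 1)]
```

Commercial tools apply the same normalize-and-match idea at much larger scale, with richer matching rules and survivorship logic.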

Relevance:

100.00%

Publisher:

Abstract:

This thesis consists of three main theoretical themes: quality of data, success of information systems, and metadata in data warehousing. Loosely defined, metadata is descriptive data about data; in this thesis, master data means reference data about customers, products, etc. The objective of the thesis is to contribute to the implementation of a metadata management solution for an industrial enterprise. The metadata system incorporates a repository, integration, delivery and access tools, as well as semantic rules and procedures for master data maintenance. It aims to improve the maintenance processes and the quality of hierarchical master data in the case company's information systems. That should benefit the whole organization through improved information quality, especially cross-system data consistency, and through more efficient and effective data management processes. As the result of this thesis, the requirements for the metadata management solution were compiled, and the success of the new information system and of the implementation project was evaluated.
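The abstract does not describe the repository's schema; as a minimal sketch of what one repository entry with a semantic maintenance rule could look like (all names and fields below are hypothetical, not the case solution):

```python
from dataclasses import dataclass, field

@dataclass
class MetadataEntry:
    """One repository entry describing a master data attribute (illustrative only)."""
    name: str                                      # e.g. "customer.country_code"
    source_system: str                             # system of record
    consumers: list = field(default_factory=list)  # systems that replicate the value
    rule: str = ""                                 # semantic maintenance rule, plain text here

repo = [
    MetadataEntry(
        name="customer.country_code",
        source_system="ERP",
        consumers=["CRM", "DW"],
        rule="must be an ISO 3166-1 alpha-2 code; changes propagate within 24 h",
    ),
]

# Cross-system consistency follows directly from the repository: every
# consumer should hold the same value as the source system.
for entry in repo:
    print(f"{entry.name}: {entry.source_system} -> {', '.join(entry.consumers)}")
```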

Relevance:

100.00%

Publisher:

Abstract:

Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

100.00%

Publisher:

Abstract:

Workshop at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

100.00%

Publisher:

Abstract:

This study was done for the ABB Ltd Motors and Generators business unit in Helsinki. In this study, global data movement in large businesses is examined from a product data management (PDM) and enterprise resource planning (ERP) point of view. The purpose was to understand and map out how a large global business handles its data in a multi-site structure, and how this can be applied in practice. This was done through an empirical interview study of five different global businesses with design locations in multiple countries. Their master data management (MDM) solutions were inspected and analyzed to understand which solution would best benefit a large global architecture with many design locations. One working solution is a transactional hub, which negates the effects of multi-site transfers and reduces lead times. Also, the requirements and limitations of the current MDM architecture were analyzed and possible reform ideas were given.
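The study names the transactional hub as one working solution; the Python sketch below illustrates the idea under stated assumptions (the class and its fields are invented for illustration): every site writes and reads master data through one hub, so there is no site-to-site replication and no multi-site lag.

```python
class MasterDataHub:
    """Single system of record: all sites read and write through the hub
    instead of replicating data site-to-site (illustrative sketch)."""
    def __init__(self):
        self._items = {}    # item_id -> attributes
        self._version = {}  # item_id -> revision counter

    def upsert(self, item_id, attrs):
        """Merge new attributes into the item and bump its revision."""
        self._items[item_id] = {**self._items.get(item_id, {}), **attrs}
        self._version[item_id] = self._version.get(item_id, 0) + 1
        return self._version[item_id]

    def read(self, item_id):
        """Every site sees the same, current revision."""
        return self._items.get(item_id), self._version.get(item_id, 0)

hub = MasterDataHub()
hub.upsert("MOTOR-100", {"design_site": "Helsinki", "status": "released"})
hub.upsert("MOTOR-100", {"status": "revised"})  # a second site edits the same item
print(hub.read("MOTOR-100"))                    # both sites read this single source
```

Because every edit lands in the hub first, consuming sites never see diverging local copies, which is where the reduction in multi-site transfer effects and lead times comes from.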

Relevance:

100.00%

Publisher:

Abstract:

After-sales business is an effective way to create profit and increase customer satisfaction in manufacturing companies. Despite this, some special business characteristics linked to these functions make it exceptionally challenging in its own way. This Master's Thesis examines the current state of data and inventory management in the case company, regarding the possibilities and challenges related to consolidating current business operations. The research examines the process steps, procedures, data requirements, data mining practices and data storage management of the spare part sales process, while the part focusing on inventory management reviews the current stock value and examines current practices and operational principles. There are two global after-sales units which supply spare parts, and the issues reviewed in this study are examined from both units' perspectives. The analysis focuses on the operations of the unit where functions would be centralized by default, should the change decisions be carried out. It was discovered that both data and inventory management have clear shortcomings, which result from a lack of internal instructions and established processes, as well as a lack of cooperation with other stakeholders involved in the product's lifecycle. The main deliverable on the data management side was a guideline for consolidating the functions, tailored to the company's needs. Additionally, potentially scrappable spare parts were listed and a proposal for inventory management instructions was drafted. If the suggested spare part materials are scrapped, the stock value will decrease by 46 percent. A guideline that was reviewed and commented on in this thesis was chosen as the basis of the inventory management instructions.

Relevance:

100.00%

Publisher:

Abstract:

Background: It has become an accepted procedure to transfer more than one embryo to the patient to achieve acceptable ongoing pregnancy rates. However, transfers of more than a single embryo increase the probability of establishing a multiple gestation. Single-embryo transfer can minimize twin pregnancies but may also lower live birth rates. This meta-analysis aimed to compare current data on single-embryo versus double-embryo transfer in fresh IVF/ICSI cycles with respect to implantation, ongoing pregnancy and live birth rates.

Methods: Search strategies included online surveys of databases from 1995 to 2008. Data management and analysis were conducted using the StatsDirect statistical software. The fixed-effect model was used for the odds ratio (OR). Fixed-effect effectiveness was evaluated by the Mantel-Haenszel method. Seven trials fulfilled the inclusion criteria.

Results: When pooling results under the fixed-effect model, the implantation rate was not significantly different between the double-embryo transfer (34.5%) and single-embryo transfer (34.7%) groups (P = 0.96; OR = 0.99, 95% CI 0.78-1.25). On the other hand, double-embryo transfer produced a statistically significantly higher ongoing clinical pregnancy rate (44.5%) than single-embryo transfer (28.3%) (P < 0.0001; OR 2.06, 95% CI 1.64-2.60). At the same time, the pooled results showed a significantly higher live birth rate for double-embryo transfer (42.5%) than for single-embryo transfer (28.4%) (P < 0.001; OR 1.87, 95% CI 1.44-2.42).

Conclusion: Meta-analysis with 95% confidence showed that, despite similar implantation rates, fresh double-embryo transfer had a 1.64 to 2.60 times greater ongoing pregnancy rate and a 1.44 to 2.42 times greater live birth rate than single-embryo transfer in a population suitable for ART treatment.
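The pooling above uses the Mantel-Haenszel fixed-effect odds ratio. A minimal Python sketch of that point estimator follows; the 2x2 counts are invented for illustration, not the trial data from this meta-analysis:

```python
def mantel_haenszel_or(tables):
    """Pooled fixed-effect odds ratio across 2x2 tables.
    Each table is (a, b, c, d): events/non-events in the two arms.
    OR_MH = sum(a*d/n) / sum(b*c/n), with n = a+b+c+d per table."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Hypothetical counts for three trials (NOT the study's data):
# (pregnancies_DET, non_pregnancies_DET, pregnancies_SET, non_pregnancies_SET)
trials = [(45, 55, 28, 72), (60, 70, 35, 95), (30, 40, 20, 60)]
print(round(mantel_haenszel_or(trials), 2))
```

StatsDirect performs this pooling internally, along with confidence intervals and heterogeneity checks; the sketch shows only the point estimate.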

Relevance:

100.00%

Publisher:

Abstract:

In this paper, the authors introduce a novel mechanism for data management in a middleware for smart home control, where a relational database and semantic ontology storage are used at the same time in a Data Warehouse. An annotation system has been designed for specifying the storage format and location, for registering new ontology concepts and, most importantly, for guaranteeing data consistency between the two storage methods. To ease the data persistence process, the Data Access Object (DAO) pattern is applied and optimized to enhance the data consistency assurance. This mechanism also provides an easy way to develop applications and integrate them with BATMP. Finally, an application named "Parameter Monitoring Service" is given as an example for assessing the feasibility of the system.
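The paper's DAO implementation is not reproduced in the abstract; as a minimal sketch of how a DAO can guard consistency across two stores (all class and method names below are invented stand-ins, not the paper's API):

```python
class DictStore:
    """In-memory stand-in for either backend (hypothetical)."""
    def __init__(self):
        self.rows = []
    def insert(self, table, row):
        self.rows.append((table, row))
    def delete(self, table, row):
        self.rows.remove((table, row))
    def assert_individual(self, cls, row):
        self.rows.append((cls, row))

class SensorReadingDAO:
    """One save() persists to both stores; a failed ontology write rolls back
    the relational write, keeping the two storage methods consistent."""
    def __init__(self, relational_db, ontology_store):
        self.db = relational_db      # e.g. wraps an SQL connection
        self.onto = ontology_store   # e.g. wraps a triple store
    def save(self, reading):
        self.db.insert("readings", reading)
        try:
            self.onto.assert_individual("Reading", reading)
        except Exception:
            self.db.delete("readings", reading)  # undo to stay consistent
            raise

dao = SensorReadingDAO(DictStore(), DictStore())
dao.save({"sensor": "thermostat-1", "value": 21.5})
```

The rollback branch is the consistency guarantee: a reading is either visible in both stores or in neither.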

Relevance:

100.00%

Publisher:

Abstract:

- We conduct a meta-analysis of 54 papers that study the relationship between multinationality and firm performance. The aim is to understand whether any systematic relationships exist between the characteristics of each study and the reported results of the linear and curvilinear regressions used to examine the multinationality-performance relationship.
- Our main finding, robust to different specifications and to different weights for each observation, is that when the analysis is based on non-US data, the reported return to multinationality is higher. However, this relationship for non-US firms is usually U-shaped rather than inverted U-shaped. This indicates that US firms face lower returns to internationalization than other firms but are less likely to incur losses in the early stages of internationalization.
- The findings also highlight the differences that are reported when comparing regression-based and non-regression-based techniques. Our results suggest that in this area regression-based analysis is more reliable than, say, ANOVA or other related approaches.
- Other characteristics that influence the estimated rate of return and its shape across studies are: the measure of multinationality used; the size distribution of the sample; and the use of market-based indicators to measure firm performance. Finally, we find no evidence of publication bias.
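The U-shaped versus inverted-U distinction comes down to the sign of the quadratic term in a curvilinear regression of performance on multinationality; a short sketch on simulated data (not the surveyed papers' data) makes this concrete:

```python
import numpy as np

# Hypothetical firm-level data: multinationality (e.g. a foreign sales ratio)
# and a performance measure, generated here to follow a U-shape.
rng = np.random.default_rng(0)
m = rng.uniform(0, 1, 200)
perf = 0.5 - 1.2 * m + 1.5 * m**2 + rng.normal(0, 0.1, 200)

# Curvilinear specification: perf = b0 + b1*m + b2*m^2
X = np.column_stack([np.ones_like(m), m, m**2])
b0, b1, b2 = np.linalg.lstsq(X, perf, rcond=None)[0]

# The sign of the quadratic term separates the two shapes:
shape = "U-shaped" if b2 > 0 else "inverted U-shaped"
print(f"b2 = {b2:.2f} -> {shape}, turning point at m = {-b1 / (2 * b2):.2f}")
```

With b2 > 0 the fitted curve is U-shaped, matching the pattern the meta-analysis reports for non-US firms.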

Relevance:

100.00%

Publisher:

Abstract:

The research presented in this dissertation comprises several parts which jointly attain the goal of Semantic Distributed Database Management with Applications to Internet Dissemination of Environmental Data. Part of the research into more effective and efficient data management has been pursued through enhancements to the Semantic Binary Object-Oriented database (Sem-ODB), such as more effective load balancing techniques for the database engine and the use of Sem-ODB as a tool for integrating structured and unstructured heterogeneous data sources. Another part of the research in data management has pursued methods for optimizing queries in distributed databases through the intelligent use of network bandwidth; this has applications in networks that provide varying levels of Quality of Service or throughput. The application of the Semantic Binary database model as a tool for relational database modeling has also been pursued. This has resulted in database applications that are used by researchers at the Everglades National Park to store environmental data and remotely sensed imagery. The areas of research described above have contributed to the creation of TerraFly, which provides for the dissemination of geospatial data via the Internet. The TerraFly research presented herein ranges from the development of TerraFly's back-end database and interfaces, through the features presented to the public (such as the ability to provide autopilot scripts and on-demand data about a point), to applications of TerraFly in the areas of hazard mitigation, recreation, and aviation.

Relevance:

100.00%

Publisher:

Abstract:

During the SINOPS project, an optimal state-of-the-art simulation of the marine silicon cycle is attempted, employing a biogeochemical ocean general circulation model (BOGCM) at three particular time steps relevant for global (paleo-)climate. In order to tune the model optimally, the results of the simulations are compared to a comprehensive data set of 'real' observations. SINOPS' scientific data management ensures that the data structure is homogeneous throughout the project. The practical work routine comprises systematic progress from data acquisition, through preparation, processing, quality checking and archiving, up to the presentation of the data to the scientific community. Meta-information and analytical data are mapped by an n-dimensional catalogue in order to itemize the analytical value and to serve as an unambiguous identifier. In practice, data management is carried out by means of the online-accessible information system PANGAEA, which offers a tool set comprising a data warehouse, Geographic Information System (GIS), 2-D plot, cross-section plot, etc., and whose multidimensional data model promotes scientific data mining. Besides the scientific and technical aspects, this alliance between the scientific project team and the data management crew serves to integrate the participants and allows them to gain mutual respect and appreciation.
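The n-dimensional catalogue idea can be pictured as addressing each analytical value by a full tuple of metadata coordinates, which makes every measurement unambiguous. A tiny illustrative sketch follows; the dimension names and values are invented, not PANGAEA's actual data model:

```python
# Hypothetical n-dimensional catalogue: each analytical value is keyed by a
# tuple of descriptive dimensions, so no two measurements can collide.
catalogue = {
    # (campaign, station, depth_m, parameter, method): value
    ("SINOPS-1", "PS-042", 100, "silicate_umol_l", "colorimetry"): 12.4,
    ("SINOPS-1", "PS-042", 500, "silicate_umol_l", "colorimetry"): 58.1,
}

def lookup(campaign, station, depth_m, parameter, method):
    """Resolve one analytical value through its full metadata coordinates."""
    return catalogue[(campaign, station, depth_m, parameter, method)]

print(lookup("SINOPS-1", "PS-042", 100, "silicate_umol_l", "colorimetry"))
```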

Relevance:

100.00%

Publisher:

Abstract:

A significant amount of Expendable Bathythermograph (XBT) data has been collected in the Mediterranean Sea since 1999 in the framework of operational oceanography activities. The management and storage of such a volume of data pose significant challenges and opportunities. The SeaDataNet project, a pan-European infrastructure for marine data diffusion, provides a convenient way to avoid dispersion of these temperature vertical profiles and to facilitate access for a wider public. The XBT data flow is described, along with the recent improvements in the quality check procedures and the consistency of the available historical data set. The main features of the SeaDataNet services and the advantages of using this system for long-term data archiving are presented. Finally, a focus on the Ligurian Sea is included in order to provide an example of the kind of information and final products, devoted to different users, that can easily be derived from the SeaDataNet web portal.