976 results for Information Requirements: Data Availability
Abstract:
Data is the most important asset of a company in the information age. Other assets, such as technology, facilities or products, can be copied or reverse-engineered, and employees can be hired away, but data remains unique to every company. As data management topics slowly move from unknown unknowns to known unknowns, tools to evaluate and manage data properly are being developed and refined. Many projects are in progress today to develop various maturity models for evaluating information and data management practices. These maturity models come in many shapes and sizes: from short and concise ones meant for a quick assessment, to complex ones that call for an expert assessment by experienced consultants. In this paper, several of them, created not only by external inter-organizational groups and authors but also developed internally at a Major Energy Provider Company (MEPC), are juxtaposed and thoroughly analyzed. Apart from analyzing the available maturity models related to Data Management, this paper also selects the one with the most merit and describes and analyzes its use in performing a maturity assessment at MEPC. The utility of maturity models is two-fold: descriptive and prescriptive. Besides recording the current state of Data Management practice maturity through the assessments, this maturity model is also used to chart the way forward. Thus, after the current situation is presented, analysis and recommendations on how to improve it, based on the definitions of higher levels of maturity, are given. Generally, the main trend observed was the widening of the Data Management field to include more business and “soft” areas (as opposed to technical ones) and a shift of focus towards the business value of data, while assuming that the underlying IT systems for managing data are “ideal”, that is, left to the purely technical disciplines to design and maintain. This trend is present not only in Data Management but in other technological areas as well, where more and more attention is given to innovative use of technology, while acknowledging that the strategic importance of IT as such is diminishing.
Abstract:
The current trend of companies changing and developing their businesses from a transactional approach to a relationship- and solution-oriented approach has set new requirements for companies' internal cooperation as well. The relationship between marketing and sales has been identified as critical to a company's success here, but surprisingly little is known about it. The purpose of this study was to deepen understanding of the relationship between sales and marketing in business-to-business sales from operative sales employees' perspectives in a solution selling context. The aim was to develop an explorative analytical construction and framework of the interface. The study was conducted as a literature review and an empirical qualitative explorative single case study. The data were collected through six thematic interviews with sales employees of the case company. Observation of sales and marketing, written documents and other materials used in sales served as secondary sources of information. The data were analyzed using qualitative case study analysis methods. The findings of the study support previous research findings on the interface between marketing and sales but also bring new propositions as an analytical framework for constructing the interface. As such, the interface was found to be a multi-dimensional and complex dynamic construction. As a result of this study, an exploratory framework was constructed. The construction consists of three explorative contexts of the interface: the internal context, the relationship-emphasizing context and the solution selling context. These contexts are further divided into lower levels as an outcome of the analysis. In addition to the identified contexts, conceptual domains common to all the contexts were also identified. The role of mutual, cross-functional knowledge creation was found to be central in the interface.
Abstract:
The control of nitrogen metabolism in pathogenic Gram-positive bacteria has been studied in a variety of species and is involved in the expression of virulence factors. To date, no data have been reported regarding nitrogen metabolism in the odontopathogenic species Streptococcus mutans. GlnR, which controls nitrogen assimilation in the related bacterial species Bacillus subtilis, was assessed in S. mutans for its DNA and protein binding activity. Electrophoretic mobility shift assays of the S. mutans GlnR protein indicated that GlnR binds to the promoter regions of the glnRA and amtB-glnK operons. Cross-linking and pull-down assays demonstrated that GlnR interacts with GlnK, a signal transduction protein that coordinates the regulation of nitrogen metabolism. Upon formation of this stable complex, GlnK enhances the affinity of GlnR for the glnRA operon promoter. These results support an involvement of GlnR in the transcriptional regulation of nitrogen metabolism-related genes and indicate that GlnK relays information regarding ammonium availability to GlnR.
Abstract:
After-sales business is an effective way to create profit and increase customer satisfaction in manufacturing companies. Despite this, certain business characteristics linked to these functions make it exceptionally challenging in its own way. This Master’s Thesis examines the current state of data and inventory management in the case company, with regard to the possibilities and challenges related to consolidating current business operations. The research examines the process steps, procedures, data requirements, data mining practices and data storage management of the spare part sales process, whereas the part focusing on inventory management reviews the current stock value and examines current practices and operational principles. There are two global after-sales units which supply spare parts, and the issues reviewed in this study are examined from both units’ perspectives. The analysis is focused on the operations of the unit where functions would be centralized by default if the change decisions are carried out. It was discovered that both data and inventory management include clear shortcomings, which result from a lack of internal instructions and established processes as well as a lack of cooperation with other stakeholders involved in the product’s lifecycle. The main deliverable on the data management side was a guideline for consolidating the functions, tailored to the company’s needs. Additionally, spare parts that could potentially be scrapped were listed and a proposal for inventory management instructions was drafted. If the suggested spare part materials are scrapped, the stock value will decrease by 46 percent. A guideline reviewed and commented on in this thesis was chosen as the basis for the inventory management instructions.
Abstract:
Nowadays, land-use/land-cover (LULC) maps at a regional scale are usually generated from satellite images of moderate resolution (between 10 m and 30 m). The National Land Cover Database in the United States and the CORINE (Coordination of Information on the Environment) Land Cover programme in Europe, both based on LANDSAT images, are representative examples. However, these maps quickly become obsolete, especially in dynamic environments such as megacities and metropolitan territories. For many applications, an annual update of these maps is required. Since 2007, the USGS has provided free access to ortho-rectified LANDSAT images, including both archived images (dating back to 1984) and recently acquired ones. Such image availability will undoubtedly stimulate research on fast and efficient methods and techniques for continuous monitoring of LULC changes from medium-resolution images. This research aimed to evaluate the potential of such medium-resolution satellite images for obtaining information on LULC changes at a regional scale in the case of the Communauté Métropolitaine de Montréal (CMM), a typical North American metropolis. Previous studies have shown that the results of automatic change detection depend on several factors, such as: 1) the characteristics of the images (spatial resolution, spectral bands, etc.); 2) the change detection method itself; and 3) the complexity of the studied environment. In the area studied, with the exception of the downtown core and commercial arteries, land uses (industrial, commercial, residential, etc.) are well delimited. This study therefore focused on the other factors that may affect change detection results, namely the image characteristics and the change detection methods. We used LANDSAT TM/ETM+ images at 30 m spatial resolution with six spectral bands, as well as ASTER-VNIR images at 15 m spatial resolution with three spectral bands, to evaluate the impact of image characteristics on the change detection results. Regarding the change detection method, we decided to compare two types of automatic techniques: (1) techniques providing information mainly on the location of changes, and (2) techniques providing information on both the location of changes and the types of change ("from-to" classes). The main conclusions of this research are as follows. Change detection techniques such as image differencing or change vector analysis applied to multi-temporal LANDSAT images provide an accurate picture of where change has occurred, in a fast and efficient way. They can therefore be integrated into a continuous monitoring system for rapid assessment of the volume of changes. The change maps can also serve as a guide for acquiring high spatial resolution images if detailed identification of the type of change is necessary.
Change detection techniques such as principal component analysis and post-classification comparison applied to multi-temporal LANDSAT images provide a relatively accurate picture of "from-to" classes, but only at a very general thematic level (for example, built-up to green space and vice versa, woodland to bare soil and vice versa, etc.). The ASTER-VNIR images, with a better spatial resolution but fewer spectral bands than LANDSAT, do not offer a more detailed thematic level (for example, woodland to commercial or industrial space). The results indicate that future research on change detection in urban environments should focus on vegetation cover changes, since medium-resolution images are very sensitive to changes in this type of cover. Maps showing the location and type of vegetation cover changes are in themselves very useful for applications such as environmental monitoring or urban hydrology. They can also serve as indicators of land-use changes. Techniques such as change vector analysis or vegetation indices are used for this purpose.
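The location-only techniques named above (image differencing and change vector analysis) reduce to per-pixel arithmetic on co-registered multi-band images. The following is a minimal sketch, assuming two co-registered, radiometrically comparable rasters already loaded as NumPy arrays; the array names, synthetic data and threshold rule are illustrative, not taken from the study.

```python
import numpy as np

def change_vector_magnitude(img_t1: np.ndarray, img_t2: np.ndarray) -> np.ndarray:
    """Per-pixel magnitude of the spectral change vector between two dates.

    Both inputs are (bands, rows, cols) arrays of the same shape, assumed
    co-registered and radiometrically comparable.
    """
    diff = img_t2.astype(np.float64) - img_t1.astype(np.float64)  # band-by-band image differencing
    return np.sqrt((diff ** 2).sum(axis=0))                        # Euclidean norm across bands

def change_mask(magnitude: np.ndarray, k: float = 2.0) -> np.ndarray:
    """Flag pixels whose change magnitude exceeds mean + k * std (illustrative threshold)."""
    threshold = magnitude.mean() + k * magnitude.std()
    return magnitude > threshold

# Example with synthetic 6-band images (e.g. the LANDSAT TM/ETM+ reflective bands)
t1 = np.random.rand(6, 100, 100)
t2 = t1.copy()
t2[:, 40:60, 40:60] += 0.3          # simulate a patch of land-cover change
mask = change_mask(change_vector_magnitude(t1, t2))
print("changed pixels:", int(mask.sum()))
```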
Abstract:
In recent years, the protection of information in digital form has become more important. Image and video encryption has applications in various fields including Internet communications, multimedia systems, medical imaging, tele-medicine and military communications. During storage as well as transmission, multimedia information is exposed to unauthorized entities unless adequate security measures are built around the information system. There are many kinds of security threats during the transmission of vital classified information through insecure communication channels. Various encryption schemes are available today to deal with information security issues. Data encryption is widely used to protect sensitive data against the security threat in the form of an “attack on confidentiality”. Secure transmission of information through insecure communication channels also requires encryption at the sending side and decryption at the receiving side. Encryption of large text messages and images takes time before they can be transmitted, causing considerable delay in the successive transmission of information in real time. In order to minimize this latency, efficient encryption algorithms are needed. An encryption procedure with adequate security and high throughput is sought in multimedia encryption applications. Traditional symmetric key block ciphers like the Data Encryption Standard (DES), the Advanced Encryption Standard (AES) and the Escrowed Encryption Standard (EES) are not efficient when the data size is large. With the availability of fast computing tools and communication networks at relatively lower costs today, these encryption standards appear to be not as fast as one would like. High-throughput encryption and decryption are becoming increasingly important in the area of high-speed networking, and fast encryption algorithms are needed for high-speed secure communication of multimedia data. It has been shown that public key algorithms are not a substitute for symmetric-key algorithms: public key algorithms are slow, whereas symmetric key algorithms generally run much faster, and public key systems are also vulnerable to chosen-plaintext attack. In this research work, a fast symmetric key encryption scheme entitled “Matrix Array Symmetric Key (MASK) encryption”, based on matrix and array manipulations, has been conceived and developed. Fast conversion has been achieved with the matrix table look-up substitution, array-based transposition and circular shift operations performed in the algorithm. MASK encryption is a new concept in symmetric key cryptography. It employs a matrix and array manipulation technique using secret information and data values. It is a block cipher operating on plaintext message (or image) blocks of 128 bits using a secret key of 128 bits and producing ciphertext message (or cipher image) blocks of the same size. This cipher has two advantages over traditional ciphers. First, the encryption and decryption procedures are much simpler and consequently much faster. Second, the key avalanche effect produced in the ciphertext output is better than that of AES.
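The abstract does not specify the MASK round structure, so the toy sketch below only illustrates the generic building blocks it names (table look-up substitution, array-based transposition and circular shifts) on a 128-bit block. It is not the published cipher and is not secure; all tables, the key and the shift amount are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
SBOX = rng.permutation(256).astype(np.uint8)   # toy substitution look-up table
PERM = rng.permutation(16)                     # toy transposition of the 16-byte state

def toy_round(block16: bytes, key16: bytes) -> bytes:
    """One illustrative round: key mixing, S-box substitution, transposition, circular shift."""
    state = np.frombuffer(block16, dtype=np.uint8) ^ np.frombuffer(key16, dtype=np.uint8)
    state = SBOX[state]          # table look-up substitution
    state = state[PERM]          # array-based transposition
    state = np.roll(state, 3)    # circular shift of the byte array
    return state.tobytes()

block = bytes(range(16))         # 128-bit plaintext block
key = bytes(16)                  # 128-bit all-zero toy key
print(toy_round(block, key).hex())
```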
Abstract:
Data caching is an important technique in mobile computing environments for improving data availability and access latencies, particularly because these computing environments are characterized by narrow-bandwidth wireless links and frequent disconnections. The cache replacement policy plays a vital role in improving performance in a cached mobile environment, since the amount of data stored in a client cache is small. In this paper we review some of the well-known cache replacement policies proposed for mobile data caches. We compare these policies after classifying them based on the criteria used for evicting documents. In addition, this paper suggests some alternative techniques for cache replacement.
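As a point of reference for the eviction criteria such policies build on, here is a minimal sketch of a least-recently-used (LRU) cache, the simplest recency-based policy. It is a generic example, not one of the policies surveyed in the paper.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal least-recently-used cache: evicts the document unused for the longest time."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._items: OrderedDict[str, bytes] = OrderedDict()

    def get(self, key: str) -> bytes | None:
        if key not in self._items:
            return None                       # cache miss: would trigger a fetch over the wireless link
        self._items.move_to_end(key)          # mark as most recently used
        return self._items[key]

    def put(self, key: str, value: bytes) -> None:
        if key in self._items:
            self._items.move_to_end(key)
        self._items[key] = value
        if len(self._items) > self.capacity:
            self._items.popitem(last=False)   # evict the least recently used document

cache = LRUCache(capacity=2)
cache.put("doc1", b"...")
cache.put("doc2", b"...")
cache.get("doc1")                             # doc1 becomes most recent
cache.put("doc3", b"...")                     # evicts doc2
print(cache.get("doc2"))                      # None
```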
Abstract:
Information architecture (IA) is defined as the high-level information requirements of an organisation. It is applied in areas such as information systems development, enterprise architecture, business process management and organisational change management. Still, the lack of methods and theories prevents information architecture from becoming a distinct discipline. A healthcare organisation is typically seen as an information-intensive organisation, all the more so in a pervasive healthcare environment. Pervasive healthcare aims to provide healthcare services to anyone, anywhere and at any time by incorporating mobile devices and wireless networks. Information architecture hence plays an important role in information provisioning within the context of pervasive healthcare, in order to support decision making and communication between clinicians and patients. Organisational semiotics is a socio-technical approach that examines information through the norms and activities performed within an organisation prior to pervasive healthcare implementation. This paper proposes a conceptual design of an information architecture for pervasive healthcare. It is illustrated with a scenario of mental health patient monitoring.
Abstract:
Pasture-based ruminant production systems are common in certain areas of the world, but energy evaluation in grazing cattle is performed with equations developed, in their majority, with sheep or cattle fed total mixed rations. The aim of the current study was to develop predictions of metabolisable energy (ME) concentrations in fresh-cut grass offered to non-pregnant non-lactating cows at maintenance energy level, which may be more suitable for grazing cattle. Data were collected from three digestibility trials performed over consecutive grazing seasons. In order to cover a range of commercial conditions and data availability in pasture-based systems, thirty-eight equations for the prediction of energy concentrations and ratios were developed. An internal validation was performed for all equations and also for existing predictions of grass ME. Prediction error for ME using nutrient digestibility was lowest when gross energy (GE) or organic matter digestibilities were used as sole predictors, while the addition of grass nutrient contents reduced the difference between predicted and actual values, and explained more variation. Addition of N, GE and diethyl ether extract (EE) contents improved accuracy when digestible organic matter in DM was the primary predictor. When digestible energy was the primary explanatory variable, prediction error was relatively low, but addition of water-soluble carbohydrates, EE and acid-detergent fibre contents of grass decreased prediction error. Equations developed in the current study showed lower prediction errors when compared with those of existing equations, and may thus allow for an improved prediction of ME in practice, which is critical for the sustainability of pasture-based systems.
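The abstract does not reproduce the thirty-eight equations; the sketch below only illustrates the generic workflow it describes: fitting a linear prediction of ME from a digestibility predictor and estimating the prediction error on held-out observations. All values are synthetic and the simple relationship used to generate them is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration only: gross energy digestibility (g/g) and ME (MJ/kg DM)
ge_digestibility = rng.uniform(0.60, 0.85, size=60)
me = 18.4 * 0.81 * ge_digestibility + rng.normal(0.0, 0.3, size=60)  # toy relationship plus noise

# Simple split for an internal validation
train, test = np.arange(0, 45), np.arange(45, 60)

# Fit ME = a * digestibility + b by least squares on the training subset
a, b = np.polyfit(ge_digestibility[train], me[train], deg=1)

# Root mean square prediction error on the held-out subset
pred = a * ge_digestibility[test] + b
rmspe = np.sqrt(np.mean((me[test] - pred) ** 2))
print(f"ME = {a:.2f} * GE digestibility + {b:.2f}, RMSPE = {rmspe:.2f} MJ/kg DM")
```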
Abstract:
Geospatial information of many kinds, from topographic maps to scientific data, is increasingly being made available through web mapping services. These allow georeferenced map images to be served from data stores and displayed in websites and geographic information systems, where they can be integrated with other geographic information. The Open Geospatial Consortium’s Web Map Service (WMS) standard has been widely adopted in diverse communities for sharing data in this way. However, current services typically provide little or no information about the quality or accuracy of the data they serve. In this paper we will describe the design and implementation of a new “quality-enabled” profile of WMS, which we call “WMS-Q”. This describes how information about data quality can be transmitted to the user through WMS. Such information can exist at many levels, from entire datasets to individual measurements, and includes the many different ways in which data uncertainty can be expressed. We also describe proposed extensions to the Symbology Encoding specification, which include provision for visualizing uncertainty in raster data in a number of different ways, including contours, shading and bivariate colour maps. We shall also describe new open-source implementations of the new specifications, which include both clients and servers.
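For orientation, the sketch below shows a standard WMS GetMap request issued with the open-source OWSLib client. The endpoint URL and layer name are hypothetical, and the sketch does not show the WMS-Q quality extensions described in the paper; a quality-enabled server would expose additional quality information through the same interface.

```python
from owslib.wms import WebMapService

# Hypothetical WMS endpoint and layer name, for illustration only
wms = WebMapService("https://example.org/wms", version="1.1.1")
print(list(wms.contents))                      # layers advertised by the service

img = wms.getmap(
    layers=["sea_surface_temperature"],        # hypothetical layer
    srs="EPSG:4326",
    bbox=(-10.0, 40.0, 5.0, 55.0),             # lon/lat bounding box
    size=(512, 512),
    format="image/png",
)
with open("map.png", "wb") as f:
    f.write(img.read())                        # georeferenced map image served by the WMS
```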
Abstract:
With the increase in e-commerce and the digitisation of design data and information, the construction sector has become reliant upon IT infrastructure and systems. The design and production process is more complex, more interconnected, and reliant upon greater information mobility, with seamless exchange of data and information in real time. Construction small and medium-sized enterprises (CSMEs), in particular the speciality contractors, can effectively utilise cost-effective collaboration-enabling technologies, such as cloud computing, to help in the effective transfer of information and data to improve productivity. The system dynamics (SD) approach offers a perspective and tools to enable a better understanding of the dynamics of complex systems. This research focuses upon system dynamics methodology as a modelling and analysis tool in order to understand and identify the key drivers in the absorption of cloud computing by CSMEs. The aim of this paper is to determine how the use of system dynamics (SD) can improve the management of information flow through collaborative technologies, leading to improved productivity. The data supporting the use of system dynamics were obtained through a pilot study consisting of questionnaires and interviews with five CSMEs in the UK house-building sector.
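System dynamics models are built from stocks, flows and feedback loops integrated over time. The minimal sketch below shows a single adoption stock whose inflow is reinforced by word-of-mouth feedback, in the style of a simple diffusion model; the variable names and parameter values are invented assumptions and are not drawn from the pilot-study data.

```python
# Minimal system dynamics sketch: cloud-computing adoption among CSMEs.
# All parameter values are illustrative assumptions, not study results.
TOTAL_FIRMS = 100.0          # potential adopter population
adopters = 1.0               # stock: firms that have adopted cloud collaboration tools
ADOPTION_FRACTION = 0.03     # externally driven adoption rate (per month)
CONTACT_EFFECT = 0.4         # word-of-mouth (reinforcing feedback) strength
DT = 1.0                     # time step in months

history = []
for month in range(36):
    non_adopters = TOTAL_FIRMS - adopters
    # Flow: adoption combines an external push with reinforcing word-of-mouth feedback
    adoption_rate = non_adopters * (ADOPTION_FRACTION + CONTACT_EFFECT * adopters / TOTAL_FIRMS)
    adopters += adoption_rate * DT          # Euler integration of the stock
    history.append(adopters)

print(f"adopters after 3 years: {history[-1]:.1f} of {TOTAL_FIRMS:.0f}")
```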
Abstract:
1. Comparative analyses are used to address the key question of what makes a species more prone to extinction by exploring the links between vulnerability and intrinsic species’ traits and/or extrinsic factors. This approach requires comprehensive species data, but information is rarely available for all species of interest. As a result, comparative analyses often rely on subsets of relatively few species that are assumed to be representative samples of the overall studied group. 2. Our study challenges this assumption and quantifies the taxonomic, spatial and data-type biases associated with the quantity of data available for 5415 mammalian species using the freely available life-history database PanTHERIA. 3. Moreover, we explore how existing biases influence the results of comparative analyses of extinction risk by using subsets of data that attempt to correct for the detected biases. In particular, we focus on links between four species’ traits commonly linked to vulnerability (distribution range area, adult body mass, population density and gestation length) and conduct univariate and multivariate analyses to understand how biases affect model predictions. 4. Our results show important biases in data availability, with c. 22% of mammals completely lacking data. Missing data, which appear not to be missing at random, occur frequently in all traits (14–99% of cases missing). Data availability is explained by intrinsic traits, with larger mammals occupying bigger range areas being the best studied. Importantly, we find that existing biases affect the results of comparative analyses by overestimating the risk of extinction and changing which traits are identified as important predictors. 5. Our results raise concerns over our ability to draw general conclusions regarding what makes a species more prone to extinction. Missing data represent a prevalent problem in comparative analyses and, unfortunately, because data are not missing at random, conventional approaches to filling data gaps are either not valid or present important challenges. These results show the importance of making appropriate inferences from comparative analyses by focusing on the subset of species for which data are available. Ultimately, addressing the data bias problem requires greater investment in data collection and dissemination, as well as the development of methodological approaches to effectively correct existing biases.
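The kind of missing-data audit described in points 2 and 4 can be expressed in a few lines of pandas. In the sketch below the file path and column names are illustrative placeholders and should be adapted to the actual PanTHERIA download; the printed quantities mirror the statistics the abstract reports but are not reproduced from it.

```python
import pandas as pd

# Illustrative trait columns; rename to match the actual PanTHERIA field names
traits = ["adult_body_mass_g", "range_area_km2", "population_density", "gestation_length_d"]

df = pd.read_csv("pantheria.csv")                      # hypothetical local copy of the database

# Proportion of species missing each trait (the paper reports 14-99% across traits)
missing_per_trait = df[traits].isna().mean().sort_values(ascending=False)
print(missing_per_trait)

# Share of species with no data for any of the four traits (the paper reports c. 22%)
no_data = df[traits].isna().all(axis=1).mean()
print(f"species lacking all four traits: {no_data:.1%}")

# Complete-case subset of the kind typically used in comparative analyses of extinction risk
complete_cases = df.dropna(subset=traits)
print(f"complete cases: {len(complete_cases)} of {len(df)} species")
```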
Abstract:
ISO19156 Observations and Measurements (O&M) provides a standardised framework for organising information about the collection of observations of the environment. Here we describe the implementation of a specialisation of O&M for environmental data, the Metadata Objects for Linking Environmental Sciences (MOLES3). MOLES3 provides support for organising information about data and for user navigation around data holdings. The implementation described here, “CEDA-MOLES”, also supports data management functions for the Centre for Environmental Data Archival (CEDA). The previous iteration of MOLES (MOLES2) saw active use over five years before being replaced by CEDA-MOLES in late 2014. During that period, important lessons were learnt both about the information needed and about how to design and maintain the necessary information systems. In this paper we review the problems encountered in MOLES2; how and why CEDA-MOLES was developed and engineered; the migration of information holdings from MOLES2 to CEDA-MOLES; and, finally, we provide an early assessment of MOLES3 (as implemented in CEDA-MOLES) and its limitations. Key drivers for the MOLES3 development included the necessity for improved data provenance, for further structured information to support ISO19115 discovery metadata export (for EU INSPIRE compliance), and for appropriate fixed landing pages for Digital Object Identifiers (DOIs) in the presence of evolving datasets. Key lessons learned included the importance of minimising information structure in free-text fields, and the necessity to support as much agility in the information infrastructure as possible without compromising maintainability, both for those using the systems internally and externally (e.g. citing into the information infrastructure) and for those responsible for the systems themselves. The migration itself needed to ensure continuity of service and traceability of archived assets.
Abstract:
Current commercially available Doppler lidars provide an economical and robust solution for measuring vertical and horizontal wind velocities, together with the ability to provide co- and cross-polarised backscatter profiles. The high temporal resolution of these instruments allows turbulent properties to be obtained from studying the variation in radial velocities. However, the instrument specifications mean that certain characteristics, especially the background noise behaviour, become a limiting factor for the instrument sensitivity in regions where the aerosol load is low. Turbulent calculations require an accurate estimate of the contribution from velocity uncertainty estimates, which are directly related to the signal-to-noise ratio. Any bias in the signal-to-noise ratio will propagate through as a bias in turbulent properties. In this paper we present a method to correct for artefacts in the background noise behaviour of commercially available Doppler lidars and reduce the signal-to-noise ratio threshold used to discriminate between noise, and cloud or aerosol signals. We show that, for Doppler lidars operating continuously at a number of locations in Finland, the data availability can be increased by as much as 50 % after performing this background correction and subsequent reduction in the threshold. The reduction in bias also greatly improves subsequent calculations of turbulent properties in weak signal regimes.
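The abstract does not detail the correction itself, so the sketch below only illustrates the general idea in simplified form: estimate a smooth background from range gates assumed to be signal-free, remove it from the profile, and then apply an SNR threshold. The synthetic profile, polynomial fit and threshold rule are assumptions for illustration and are not the procedure published in the paper.

```python
import numpy as np

def correct_background(snr_profile: np.ndarray, ranges_m: np.ndarray) -> np.ndarray:
    """Generic illustration: fit a low-order polynomial to the lowest-SNR gates
    (taken as a proxy for aerosol-free background) and subtract it."""
    bg_idx = np.argsort(snr_profile)[: snr_profile.size // 2]
    coeffs = np.polyfit(ranges_m[bg_idx], snr_profile[bg_idx], deg=2)
    return snr_profile - np.polyval(coeffs, ranges_m)

# Synthetic profile: range-dependent background artefact plus an aerosol layer at 1-1.5 km
ranges_m = np.arange(30, 3000, 30, dtype=float)
snr = 1e-3 * (ranges_m / 3000) ** 2 + np.random.normal(0, 2e-4, ranges_m.size)
snr[(ranges_m > 1000) & (ranges_m < 1500)] += 5e-3

corrected = correct_background(snr, ranges_m)
noise_std = np.std(np.sort(corrected)[: corrected.size // 2])   # noise estimate from the quietest gates
signal_mask = corrected > 3 * noise_std                          # illustrative SNR threshold
print("gates flagged as aerosol/cloud signal:", int(signal_mask.sum()))
```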
Abstract:
Most multidimensional projection techniques rely on distance (dissimilarity) information between data instances to embed high-dimensional data into a visual space. When data are endowed with Cartesian coordinates, an extra computational effort is necessary to compute the needed distances, making multidimensional projection prohibitive in applications dealing with interactivity and massive data. The novel multidimensional projection technique proposed in this work, called Part-Linear Multidimensional Projection (PLMP), has been tailored to handle multivariate data represented in Cartesian high-dimensional spaces, requiring only distance information between pairs of representative samples. This characteristic renders PLMP faster than previous methods when processing large data sets while still being competitive in terms of precision. Moreover, knowing the range of variation for data instances in the high-dimensional space, we can make PLMP a truly streaming data projection technique, a trait absent in previous methods.
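As a rough sketch of the part-linear idea described in the abstract, the code below projects only a small set of representative samples with a nonlinear method (classical MDS here) and then fits a linear map by least squares that is applied to every remaining instance. The sampling strategy and the choice of projection for the samples are simplifying assumptions and differ from the published technique.

```python
import numpy as np

def classical_mds(points: np.ndarray, dim: int = 2) -> np.ndarray:
    """Project a small sample set with classical MDS (eigendecomposition of the
    double-centred squared-distance matrix)."""
    d2 = np.square(np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1))
    n = d2.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n
    b = -0.5 * j @ d2 @ j
    vals, vecs = np.linalg.eigh(b)
    idx = np.argsort(vals)[::-1][:dim]
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

rng = np.random.default_rng(1)
data = rng.normal(size=(10_000, 50))                 # large high-dimensional data set

# 1) Project only a small set of representative samples (here: random, for brevity)
sample_idx = rng.choice(len(data), size=200, replace=False)
samples_hd = data[sample_idx]
samples_2d = classical_mds(samples_hd)

# 2) Fit a linear map P (50 -> 2) by least squares so that samples_hd @ P approximates samples_2d
p, *_ = np.linalg.lstsq(samples_hd, samples_2d, rcond=None)

# 3) Apply the same linear map to every instance: cheap enough for streaming data
projection = data @ p
print(projection.shape)                              # (10000, 2)
```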