960 results for Multiple Resource Integration
Abstract:
In this paper we describe Fénix, a data model for exchanging information between Natural Language Processing applications. The proposed format is intended to be flexible enough to cover both current and future data structures employed in the field of Computational Linguistics. The Fénix architecture is divided into four separate layers: conceptual, logical, persistence and physical. This division provides a simple interface that abstracts users away from low-level implementation details, such as the programming languages and data storage employed, allowing them to focus on the concepts and processes to be modelled. The Fénix architecture is accompanied by a set of programming libraries to facilitate the access and manipulation of the structures created in this framework. We also show how this architecture has already been successfully applied in different research projects.
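The abstract names the four layers but not their interfaces, so the following Python sketch is only a hypothetical illustration of such a layered separation; every class and method name in it (PersistenceBackend, Document, persist, ...) is invented for the example and is not the actual Fénix API.

```python
# Hypothetical sketch of a four-layer separation like the one described
# above; names are illustrative, not the actual Fénix library API.
from abc import ABC, abstractmethod


class PersistenceBackend(ABC):
    """Persistence/physical layers: hides the concrete storage engine."""

    @abstractmethod
    def save(self, key: str, record: dict) -> None: ...

    @abstractmethod
    def load(self, key: str) -> dict: ...


class InMemoryBackend(PersistenceBackend):
    """Trivial stand-in for a real database or file store."""

    def __init__(self) -> None:
        self._store: dict[str, dict] = {}

    def save(self, key: str, record: dict) -> None:
        self._store[key] = record

    def load(self, key: str) -> dict:
        return self._store[key]


class Annotation:
    """Conceptual layer: an NLP annotation, independent of storage."""

    def __init__(self, span: tuple[int, int], label: str) -> None:
        self.span, self.label = span, label


class Document:
    """Logical layer: maps concepts onto the persistence interface."""

    def __init__(self, doc_id: str, backend: PersistenceBackend) -> None:
        self.doc_id, self.backend = doc_id, backend
        self.annotations: list[Annotation] = []

    def annotate(self, start: int, end: int, label: str) -> None:
        self.annotations.append(Annotation((start, end), label))

    def persist(self) -> None:
        record = {"annotations": [(a.span, a.label) for a in self.annotations]}
        self.backend.save(self.doc_id, record)


doc = Document("d1", InMemoryBackend())
doc.annotate(0, 5, "NounPhrase")
doc.persist()
```

Swapping InMemoryBackend for a database-backed implementation would leave the conceptual and logical layers untouched, which is the kind of abstraction the division into layers is meant to provide.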
Abstract:
This paper presents an eight-firm study, conducted from the service-dominant logic perspective, which contributes to knowledge of the anatomy of value propositions and service innovation. The paper suggests that value propositions are configurations of several different practices and resources. It finds that ten common practices, organized in three main aggregates, constitute and fulfill value propositions: provision practices, representational practices, and management and organizational practices. Moreover, the paper suggests that service innovation can be equated with the creation of new value propositions by means of developing existing or creating new practices and/or resources, or by means of integrating practices and resources in new ways. It identifies four types of service innovation (adaptation, resource-based innovation, practice-based innovation, and combinative innovation) and three types of service innovation processes (practice-based, resource-based, and combinative). The key managerial insight is that service innovation must be conducted, and value propositions evaluated, from the perspective of the customers' value creation, i.e. the service that the customer experiences. Successful service innovation is not contingent on having the right resources alone; established methods and practices for integrating these resources into attractive value propositions are also needed.
Abstract:
Waste biomass is generated during the conservation management of semi-natural habitats, and represents an unused resource and potential bioenergy feedstock that does not compete with food production. Thermogravimetric analysis was used to characterise a representative range of biomass generated during conservation management in Wales. Of the biomass types assessed, those dominated by rush (Juncus effusus) and bracken (Pteridium aquilinum) exhibited the highest and lowest volatile compositions respectively, and were selected for bench-scale conversion via fast pyrolysis. Each biomass type was ensiled, and a sub-sample of silage was washed and pressed. Demineralization of conservation biomass through washing and pressing was associated with higher oil yields following fast pyrolysis. The oil yields were within the published range established for the dedicated energy crops miscanthus and willow. To examine this potential, a multiple-output energy system was developed, with gross power production estimates following valorisation of the press fluid, char and oil. If used in multi-fuel industrial burners, the char and oil alone would displace 3.9 × 10⁵ tonnes per year of No. 2 light oil using Welsh biomass from conservation management. Bioenergy and product development using these feedstocks could simultaneously support biodiversity management and displace fossil fuels, thereby reducing GHG emissions. Gross power generation predictions show good potential.
Abstract:
Weiss and Isen have provided many supportive comments about the multi-level perspective, but have also found limitations. Isen noted the importance of integrating affect, cognition, and motivation. Weiss commented similarly that the model lacked an integrating "thread." He suggested that, to be truly multilevel, each level should constrain processes at other levels, and also provide guidance for the development of new concepts. Weiss also noted that the focus on biological processes was a strength of the model. I respond by suggesting that these very biological processes may constitute the "missing" thread. To illustrate this, I discuss some of the recent research on emotions in organizational settings, and argue that biology both constrains and guides theory at each level of the model. Based on this proposition, I revisit each of the five levels in the model to demonstrate how this integration can be accomplished. Finally, I address two additional points: aggregation bias, and the possibility of extending the model to include the higher levels of industry and region.
Abstract:
Somatic copy number aberrations (CNA) represent a mutation type encountered in the majority of cancer genomes. Here, we present the 2014 edition of arrayMap (http://www.arraymap.org), a publicly accessible collection of pre-processed oncogenomic array data sets and CNA profiles, representing a vast range of human malignancies. Since the initial release, we have enhanced this resource both in content and especially with regard to data mining support. The 2014 release of arrayMap contains more than 64,000 genomic array data sets, representing about 250 tumor diagnoses. Data sets included in arrayMap have been assembled from public repositories as well as additional resources, and integrated by applying custom processing pipelines. Online tools have been upgraded for more flexible array data visualization, including options for processing user-provided, non-public data sets. Data integration has been improved by mapping to multiple editions of the human reference genome, with the majority of the data now available for both the UCSC hg18 and GRCh37 versions. The large amount of tumor CNA data in arrayMap can be freely downloaded by users to promote data mining projects, and to explore special events such as chromothripsis-like genome patterns.
Abstract:
The MyHits web server (http://myhits.isb-sib.ch) is a new integrated service dedicated to the annotation of protein sequences and to the analysis of their domains and signatures. Guest users can use the system anonymously, with full access to (i) standard bioinformatics programs (e.g. PSI-BLAST, ClustalW, T-Coffee, Jalview); (ii) a large number of protein sequence databases, including standard (Swiss-Prot, TrEMBL) and locally developed databases (splice variants); (iii) databases of protein motifs (Prosite, Interpro); (iv) a precomputed list of matches ('hits') between the sequence and motif databases. All databases are updated on a weekly basis and the hit list is kept up to date incrementally. The MyHits server also includes a new collection of tools to generate graphical representations of pairwise and multiple sequence alignments including their annotated features. Free registration enables users to upload their own sequences and motifs to private databases. These are then made available through the same web interface and the same set of analytical tools. Registered users can manage their own sequences and annotations using only web tools and freeze their data in their private database for publication purposes.
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the scale of a field site represents a major, and as-of-yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The main objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity logged at collocated wells and surface resistivity measurements, which are available throughout the studied site. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. A stochastic integration of low-resolution, large-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities is then applied. The overall viability of this downscaling approach is tested and validated by comparing flow and transport simulations through the original and the upscaled hydraulic conductivity fields. Our results indicate that the proposed procedure yields remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
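As a concrete illustration of the kernel-density step described above, the following Python sketch models the joint density of log hydraulic and electrical conductivities from collocated logs and evaluates the conditional density of the hydraulic conductivity given an observed electrical conductivity. It uses scipy's gaussian_kde on synthetic data; all variable names and numbers are invented for the example, not taken from the study.

```python
# Sketch of the non-parametric kernel-density step: model the joint
# density of log10 hydraulic conductivity (K) and log10 electrical
# conductivity (sigma) from collocated well logs, then evaluate the
# conditional density of K given an observed sigma. Data are synthetic.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Synthetic collocated logs: correlated log10(K) and log10(sigma).
log_k = rng.normal(-4.0, 0.5, size=500)
log_sigma = 0.8 * log_k + rng.normal(0.0, 0.2, size=500)

# Joint density p(log K, log sigma) via a multivariate Gaussian kernel,
# plus the marginal p(log sigma) for normalization.
joint_kde = gaussian_kde(np.vstack([log_k, log_sigma]))
sigma_kde = gaussian_kde(log_sigma)

def conditional_pdf(k_grid: np.ndarray, sigma_obs: float) -> np.ndarray:
    """p(log K | log sigma = sigma_obs) = p(log K, sigma_obs) / p(sigma_obs)."""
    pts = np.vstack([k_grid, np.full_like(k_grid, sigma_obs)])
    return joint_kde(pts) / sigma_kde(sigma_obs)

k_grid = np.linspace(-6, -2, 200)
pdf = conditional_pdf(k_grid, sigma_obs=-3.2)
print("most likely log10(K):", k_grid[np.argmax(pdf)])
```

In the actual procedure a conditional density of this kind would drive the Bayesian sequential simulation at non-sampled locations; gaussian_kde serves here simply as a stand-in for whichever kernel estimator the authors employ.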
Abstract:
In this paper, we consider ATM networks in which the virtual path (VP) concept is implemented. The question of how to multiplex two or more diverse traffic classes while providing different quality of service (QOS) requirements is a very complicated open problem. Two distinct options are available: integration and segregation. In an integration approach, all the traffic from different connections is multiplexed onto one VP. This implies that the most restrictive QOS requirements must be applied to all services; link utilization is therefore decreased, because unnecessarily stringent QOS is provided to all connections. With the segregation approach, the problem can be much simplified if different types of traffic are separated by assigning each a VP with dedicated resources (buffers and links). Resources may then not be efficiently utilized, because no sharing of bandwidth can take place across VPs. The probability that the bandwidth required by the accepted connections exceeds the capacity of the link is evaluated as the probability of congestion (PC). Since the PC can be expressed as the cell loss probability (CLP), we simply carry out bandwidth allocation using the PC. We first focus on the influence of some parameters (CLP, bit rate and burstiness) on the capacity required by a VP supporting a single traffic class, using the new convolution approach. Numerical results are presented both to compare the required capacity and to identify the conditions under which each approach is preferred.
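The paper's exact traffic model is not reproduced here, but the convolution idea can be illustrated with a minimal Python sketch under an assumed on-off source model: convolving the per-connection bandwidth distributions gives the aggregate-demand distribution, and the PC is the probability mass above the link capacity. All rates and probabilities below are made-up illustration values.

```python
# Illustrative convolution approach: each accepted on-off connection
# demands either 0 or its peak rate; the aggregate-demand PMF is the
# convolution of the per-connection PMFs, and the probability of
# congestion (PC) is the mass above the link capacity.
import numpy as np

def connection_pmf(peak_rate: int, activity: float, n_bins: int) -> np.ndarray:
    """PMF over bandwidth bins for one on-off source."""
    pmf = np.zeros(n_bins)
    pmf[0] = 1.0 - activity          # silent
    pmf[peak_rate] = activity        # transmitting at peak rate
    return pmf

def probability_of_congestion(connections, capacity: int) -> float:
    n_bins = sum(peak for peak, _ in connections) + 1
    agg = np.zeros(n_bins)
    agg[0] = 1.0                      # no connections yet: demand is 0
    for peak, activity in connections:
        agg = np.convolve(agg, connection_pmf(peak, activity, n_bins))[:n_bins]
    return float(agg[capacity + 1:].sum())

# Ten bursty connections (peak 5 bandwidth units, active 30% of the
# time) multiplexed onto a VP of capacity 20 units.
pc = probability_of_congestion([(5, 0.3)] * 10, capacity=20)
print(f"PC = {pc:.4f}")
```

Roughly speaking, under integration this PC would be evaluated once for the whole multiplexed aggregate against the strictest class requirement, while under segregation it would be evaluated per VP against that VP's own requirement.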
Abstract:
In R&D organizations, multiple projects are executed concurrently. Problems arise in managing shared resources, since they are needed by multiple projects simultaneously. The objective of this thesis was to study how project and resource management could be developed in a public-sector R&D organization. The qualitative research was carried out in the Magnetic Measurements section at CERN, where the section measures magnets for particle accelerators and builds state-of-the-art measurement devices for various needs. The R&D and measurement projects are hence very time-consuming and complex. Based on the previous research and the requirements of the organization, the best alternative for resource management was to build a project management information system. A centralized database was constructed, and on top of it an application was built for interacting with and visualizing the project data. The application allows handling project data, which serves as a basis for resource planning before and while projects are executed. It is one way to standardize the work-flow of projects, which strengthens the project process. Additionally, it was noted that the internal customer's database, the measurement system and the new application needed to be integrated. Further integration ensures that project data is received efficiently from customers and is available not only within the application but also during the concrete work. The research results introduced a new integrated application, which centralizes the project information flow with better visibility.
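The thesis does not publish its schema, so the following Python/SQLite sketch is a purely hypothetical illustration of the kind of centralized structure described: projects book shared resources, and a query flags overlapping bookings of the same resource.

```python
# Hypothetical minimal schema for shared-resource planning: projects
# book resources over day intervals; the query below flags clashes.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE project (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE booking (
    project_id INTEGER REFERENCES project(id),
    resource   TEXT,
    start_day  INTEGER,
    end_day    INTEGER
);
""")
conn.executemany("INSERT INTO project VALUES (?, ?)",
                 [(1, "Magnet measurement A"), (2, "Bench upgrade B")])
conn.executemany("INSERT INTO booking VALUES (?, ?, ?, ?)",
                 [(1, "rotating-coil bench", 10, 20),
                  (2, "rotating-coil bench", 15, 25)])

# Two bookings of the same resource clash when their intervals overlap.
clashes = conn.execute("""
SELECT a.project_id, b.project_id, a.resource
FROM booking a JOIN booking b
  ON a.resource = b.resource
 AND a.project_id < b.project_id
 AND a.start_day <= b.end_day
 AND b.start_day <= a.end_day
""").fetchall()
print(clashes)   # [(1, 2, 'rotating-coil bench')]
```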
Abstract:
The main objective of this study was to establish the basis for formulating an innovation model in an existing organization, based on cases. Innovation processes can be analyzed based on their needs and on their emphasis on business model development or R&D. The research was conducted in the energy sector within one company, utilizing its projects as cases for the study. Development in this field of business is typically slow, although the case company has put emphasis on its innovation efforts. The analysis was done by identifying the cases' needs and comparing them. The results showed that, because the needs of the cases vary, the applicability of innovation process models varies as well. It was discovered that by dividing the process into two phases, a uniform model could be composed. This model would fulfill the needs of the cases as well as of potential future projects.
Abstract:
Uncertainties as to the future supply costs of nonrenewable natural resources, such as oil and gas, raise the issue of the choice of supply sources. In a perfectly deterministic world, an efficient use of multiple sources of supply requires that any given market exhaust the supply it can draw from a low-cost source before moving on to a higher-cost one; supply sources should be exploited in strict sequence of increasing marginal cost, with a high-cost source being left untouched as long as a less costly source is available. We find that this may not be the efficient thing to do in a stochastic world. We show that there exist conditions under which it can be efficient to use a risky supply source in order to conserve a cheaper non-risky source. The benefit of doing so comes from the fact that it leaves open the possibility of using the conserved non-risky source instead of the risky one in the event the latter's future cost conditions suddenly deteriorate. There are also conditions under which it will be efficient to use a more costly non-risky source while a less costly risky source is still available. The reason is that this conserves the less costly risky source in order to use it in the event of a possible future drop in its cost.
Abstract:
This article examines the process of clarifying professional roles during the integration of a specialized nurse practitioner into primary care teams in Québec.
Abstract:
Exercises and solutions on double integration. Diagrams for the questions are collected in the support.zip file as .eps files.
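As a representative worked example of the kind of problem such an exercise set covers (this particular integral is our own illustration, not taken from the set):

```latex
\[
\int_{0}^{1}\!\int_{0}^{x} xy \,dy\,dx
  = \int_{0}^{1} x\,\frac{y^{2}}{2}\Big|_{y=0}^{y=x} \,dx
  = \int_{0}^{1} \frac{x^{3}}{2}\,dx
  = \frac{1}{8}.
\]
```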