Abstract:
Mechanisms that allow pathogens to colonize the host are not the product of isolated genes; instead, they emerge from the concerted operation of regulatory networks. Identifying the components and systemic behavior of these networks is therefore necessary for a better understanding of gene regulation and pathogenesis. To this end, I have developed systems biology approaches to study transcriptional and post-transcriptional gene regulation in bacteria, with an emphasis on the human pathogen Mycobacterium tuberculosis (Mtb). First, I developed a network response method to identify the parts of the Mtb global transcriptional regulatory network used by the pathogen to counteract phagosomal stresses and survive within resting macrophages. The method unveiled transcriptional regulators and associated regulons used by Mtb to establish a successful infection of macrophages throughout the first 14 days of infection. Additionally, this network-based analysis identified the production of Fe-S proteins coupled to lipid metabolism through the alkane hydroxylase complex as a possible strategy employed by Mtb to survive in the host. Second, I developed a network inference method to infer the small non-coding RNA (sRNA) regulatory network in Mtb. The method identifies sRNA-mRNA interactions by integrating a priori knowledge of possible binding sites with structure-driven identification of binding sites. The reconstructed network proved useful for predicting functional roles for the multitude of sRNAs recently discovered in the pathogen, as several sRNAs were postulated to be involved in virulence-related processes. Finally, I applied a combined experimental and computational approach to study post-transcriptional repression mediated by small non-coding RNAs in bacteria. Specifically, a probabilistic ranking methodology termed rank-conciliation was developed to infer sRNA-mRNA interactions based on multiple types of data.
The method was shown to improve target prediction in Escherichia coli, and is therefore useful for prioritizing candidate targets for experimental validation.
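The core idea of conciliating rankings from multiple data types can be illustrated with a simple rank-aggregation sketch. Note this is a minimal stand-in: the actual method is probabilistic, and the mean-rank combination, the function name `conciliate_ranks`, and the example target genes are all illustrative assumptions, not the thesis's implementation.

```python
def conciliate_ranks(rankings):
    """Combine per-evidence rankings of candidate sRNA targets.

    rankings: list of dicts {target: rank}, where rank 1 is the
    strongest evidence. Targets absent from a source are penalised
    with a rank one past that source's worst. Returns targets
    sorted best-first by mean rank.
    """
    targets = set().union(*(set(r) for r in rankings))

    def mean_rank(target):
        total = 0.0
        for ranking in rankings:
            total += ranking.get(target, max(ranking.values()) + 1)
        return total / len(rankings)

    return sorted(targets, key=mean_rank)

# Hypothetical evidence sources: a sequence-based binding-site score
# and an expression-correlation score, each ranking candidate mRNAs.
seq_rank = {"ompA": 1, "sodB": 2, "fur": 3}
expr_rank = {"sodB": 1, "fhlA": 2, "ompA": 3}
merged = conciliate_ranks([seq_rank, expr_rank])
```

A target supported by both sources (here `sodB`) rises to the top even though neither source ranks alone would have placed it first with certainty, which is the intuition behind combining heterogeneous evidence.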
Abstract:
During recent years, the basins of the Kara Sea (Kamennomysskaya, Obskaya, and Chugor'yakhinskaya structures) in the Russian Federation have been considered promising regions for oil and gas exploration and, simultaneously, possible routes for relatively cheap pipeline and tanker transportation of the hydrocarbons projected for recovery. On the other hand, exploration operations, recovery, and transportation of gas pose a considerable risk of accidents and environmental pollution, which causes justified concern about the future state of the ecological system of the Gulf of Ob and the adjoining parts of the Kara Sea. Therefore, regular combined environmental investigations (monitoring) are the most important tool for estimating the current state and forecasting the dynamics of the development of estuary systems. The program of investigations (schedule, station network, and measured parameters) is standardized in accordance with international practice for such work and accounts for the experience of monitoring studies by Russian and foreign researchers. Two measurement sessions were performed during the ecological investigations in the region of exploration drilling: at the beginning and at the final stage of drilling operations and borehole testing; in addition, natural parameters were determined in various parts of the Ob estuary before the beginning of the investigations. Hydrophysical and hydrochemical characteristics of the water medium were determined, and bottom sediments and water were analyzed for various pollutants (petroleum products, heavy metals, and radionuclides). The forms of heavy-metal occurrence in river and sea waters were determined by the method of continuous multistep filtration, which is based on fractionating water components on membrane filters of various pore sizes.
These investigations revealed environmental pollution by chemical substances during the initial stage of drilling operations, when remains of fuels, oils, and solutions could be spilled, and part of the chemical pollutants could enter the environment. Owing to horizontal and vertical turbulent diffusion, wave mixing, and the effect of the general direction of currents in the Ob estuary from south to north, areas are formed with elevated concentrations of the analyzed elements and compounds. However, the concentration levels of chemical pollutants are practically no higher than the maximum admissible concentrations, and their substantial dissipation to the average regional background contents can be expected in the near future. Our investigations allowed us to determine in detail the parameters of anthropogenic pollution in the regions affected by hydrocarbon exploration drilling in the Obskii and Kamennomysskii prospects in the Gulf of Ob and estimate their influence on the ecological state of the basin of the Ob River and the Kara Sea on the whole.
Abstract:
In 2005, the International Ocean Colour Coordinating Group (IOCCG) convened a working group to examine the state of the art in ocean colour data merging, which showed that the research techniques had matured sufficiently for creating long multi-sensor datasets (IOCCG, 2007). As a result, ESA initiated and funded the DUE GlobColour project (http://www.globcolour.info/) to develop a satellite-based ocean colour data set to support global carbon-cycle research. It aims to satisfy the scientific requirement for a long (10+ year) time series of consistently calibrated global ocean colour information with the best possible spatial coverage. This has been achieved by merging data from the three most capable sensors: SeaWiFS on GeoEye's OrbView-2 mission, MODIS on NASA's Aqua mission and MERIS on ESA's ENVISAT mission. In setting up the GlobColour project, three user organisations were invited to help. Their roles are to specify the detailed user requirements, act as a channel to the broader end-user community, and provide feedback and assessment of the results. The International Ocean Carbon Coordination Project (IOCCP), based at UNESCO in Paris, provides direct access to the carbon-cycle modelling community's requirements and to the modellers themselves who will use the final products. The UK Met Office's National Centre for Ocean Forecasting (NCOF) in Exeter, UK, provides an understanding of the requirements of oceanography users, and the IOCCG brings its understanding of global user needs and valuable advice on best practice within the ocean colour science community. The three-year project kicked off in November 2005 under the leadership of ACRI-ST (France). The first year was a feasibility demonstration phase that was successfully concluded at a user consultation workshop organised by the Laboratoire d'Océanographie de Villefranche, France, in December 2006.
Error statistics and inter-sensor biases were quantified by comparison with in situ measurements from moored optical buoys and ship-based campaigns, and used as an input to the merging. The second year was dedicated to the production of the time series. In total, more than 25 Tb of input (level 2) data have been ingested and 14 Tb of intermediate and output products created, with 4 Tb of data distributed to the user community. Quality control (QC) is provided through the Diagnostic Data Sets (DDS), which are extracted sub-areas covering locations of in situ data collection or interesting oceanographic phenomena. This Full Product Set (FPS) covers global daily merged ocean colour products for the period 1997-2006 and is freely available for use by the worldwide science community at http://www.globcolour.info/data_access_full_prod_set.html. The GlobColour service distributes global daily, 8-day and monthly data sets at 4.6 km resolution for chlorophyll-a concentration, normalised water-leaving radiances (412, 443, 490, 510, 531, 555, 620, 670, 681 and 709 nm), diffuse attenuation coefficient, coloured dissolved and detrital organic materials, total suspended matter or particulate backscattering coefficient, turbidity index, cloud fraction and quality indicators. Error statistics from the initial sensor characterisation are used as an input to the merging methods and propagate through the merging process to provide error estimates for the output merged products. These error estimates are a key component of GlobColour, as they are invaluable to the users, particularly the modellers who need them to assimilate the ocean colour data into ocean simulations. An intensive phase of validation has been undertaken to assess the quality of the data set. In addition, inter-comparisons between the different merged datasets will help in further refining the techniques used.
Both the final products and the quality assessment were presented at a second user consultation in Oslo on 20-22 November 2007, organised by the Norwegian Institute for Water Research (NIVA); the presentations are available on the GlobColour WWW site. At the request of the ESA Technical Officer for the GlobColour project, the FPS data set was mirrored in the PANGAEA data library.
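The principle of error-weighted merging with propagated error estimates can be sketched as a per-pixel inverse-variance average. This is a minimal illustration of the statistical idea only, assuming Gaussian, independent errors; it is not the GlobColour processing chain, whose merging methods are more elaborate.

```python
def merge_retrievals(values, variances):
    """Inverse-variance weighted merge of co-located retrievals of the
    same quantity (e.g. chlorophyll-a from SeaWiFS, MODIS and MERIS).
    Returns the merged value and its propagated error variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    merged = sum(w * x for w, x in zip(weights, values)) / total
    # The propagated variance of the weighted mean is 1 / sum(weights),
    # so the merged product carries its own error estimate.
    return merged, 1.0 / total

# Two sensors with equal error variances: the merged value is their
# mean and the merged variance is halved, reflecting the merging gain.
value, var = merge_retrievals([0.20, 0.30], [0.01, 0.01])
```

The key point mirrored from the text is that the per-sensor error statistics flow through the merge, so every output pixel can be delivered with an error estimate for assimilation into ocean simulations.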
Abstract:
This chapter attempts to identify whether product differentiation or geographical differentiation is the main source of profit for firms in developing economies, employing a simple idea from the recently developed methods of empirical industrial organization. Theoretically, location choice and product choice have been treated as analogous forms of differentiation, but in the real world, which of these strategies is chosen makes an immense difference to firm behavior and to the development process of the industry. Developments in the techniques of empirical industrial organization have enabled us to identify market outcomes with endogeneity. A typical case is the market outcome under differentiation, where price or product choice is endogenously determined. Our original survey contains data on market location, differences in product types, and price. The results show that product differentiation, rather than geographical differentiation, mitigates the pressure of price competition, while 70 per cent of the firms secure a geographical monopoly.
Abstract:
This paper summarizes the main results of a unique firm survey conducted in Penang, Malaysia in 2012 on product-related environmental regulations. The results show that firms receiving foreign direct investment have adapted well to the regulations but have faced more rejections. Several research questions are addressed and examined using the survey data. The major findings are as follows. First, adaptation involves changes in input procurement and market diversification, which potentially changes the structure of supply chains. Second, belonging to global supply chains is a key factor in compliance, but this requires firms to meet tougher customer requirements. Third, there is much room for government policy to play a role in assisting firms.
Abstract:
This paper summarizes the main results of a unique firm survey conducted in Vietnam in 2011 on product-related environmental regulations (PRERs). The results of this survey are compared with the results of a corresponding survey of firms in Penang, Malaysia (Michida et al., 2014b). The major findings are as follows. First, adaptation to PRERs involves changes in input procurement and results in market diversification, which potentially alters the structure of supply chains. This finding is consistent with the Malaysian survey result. Second, connections to global supply chains are key to compliance, but this requires firms to meet more stringent customer requirements. Third, government policy can play an important role in assisting firms to comply with PRERs.
Abstract:
This paper sheds light on the important role played by global supply chains in adaptation to the product-related environmental regulations imposed by importing countries, with a focus on chemicals management. Utilizing a unique dataset collected in Penang, Malaysia, we depict supply chain structures and how differences among firms in their participation in global supply chains are linked to differences in chemicals management. We found that firms belonging to a supply chain are in a better position to comply with these regulations because information and requirements are transmitted through global supply chains. In contrast, firms that are neither exporters nor part of a global supply chain lack the knowledge and information channels relevant to managing the chemicals in a product.
Abstract:
As a common reference for many in-development standards and execution frameworks, Service-Oriented Architectures (SOAs) are receiving special attention. SOA modeling, however, is an area in which consensus has not been achieved. Currently, standardization organizations are defining proposals to address this problem. Nevertheless, until very recently, non-functional aspects of services had not been considered in standardization processes. In particular, there is a lack of a design solution that permits independent development of the functional and non-functional concerns of SOAs, allowing each concern to be addressed appropriately in the early stages of development in a way that can guarantee the quality of this type of system. This paper, building on previous work, presents an approach to integrating security-related non-functional aspects (such as confidentiality, integrity, and access control) into the development of services.
Abstract:
Service-Oriented Architectures (SOA) and Web Services (WS), the technology generally used to implement them, achieve the integration of heterogeneous technologies, providing interoperability and enabling the reuse of pre-existing systems. Model-driven development methodologies provide inherent benefits such as increased productivity, greater reuse, and better maintainability, to name a few. Efforts toward model-driven development of SOAs already exist, but there is currently no standard solution that also addresses the non-functional aspects of these services. This paper presents an approach to integrating these non-functional aspects into the development of web services, with an emphasis on security.
Abstract:
Models are an effective tool for systems and software design, allowing software architects to abstract away irrelevant details. Those qualities are also useful for the technical management of networks, systems and software, such as those that compose service-oriented architectures. Models can provide a set of well-defined abstractions over the distributed, heterogeneous service infrastructure that enable its automated management. We propose to use the managed system as a source of dynamically generated runtime models, and to decompose management processes into compositions of model transformations. We have created an autonomic service deployment and configuration architecture that obtains, analyzes, and transforms system models to apply the required actions, while remaining oblivious to low-level details. An instrumentation layer automatically builds these models and translates the planned management actions back to the system. We illustrate these concepts with a distributed service update operation.
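One transformation in such a chain can be sketched as a comparison between a runtime model and a target model that yields abstract management actions. The dict-based models, version numbers and action names below are illustrative assumptions, not the paper's metamodel; the point is that planning happens on models, with the low-level execution left to an instrumentation layer.

```python
def plan_actions(runtime_model, target_model):
    """Compare the runtime model of deployed service versions against
    the target model and plan deploy/update/undeploy actions, without
    touching low-level infrastructure details."""
    actions = []
    for service, version in target_model.items():
        if service not in runtime_model:
            actions.append(("deploy", service, version))
        elif runtime_model[service] != version:
            actions.append(("update", service, version))
    for service, version in runtime_model.items():
        if service not in target_model:
            actions.append(("undeploy", service, version))
    return actions

# A distributed service update: "billing" moves to version 2,
# "audit" is newly deployed, and "legacy" is retired.
plan = plan_actions({"billing": 1, "legacy": 1},
                    {"billing": 2, "audit": 1})
```

An instrumentation layer would then interpret each planned tuple as concrete operations on the actual infrastructure, keeping the planner itself model-level.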
Abstract:
Managing large medical image collections is an increasingly demanding issue in many hospitals and other medical settings. A huge amount of this information is generated daily, which requires robust and agile systems. In this paper we present a distributed multi-agent system capable of managing very large medical image datasets. In this approach, agents extract low-level information from images and store it in a data structure implemented in a relational database. The data structure can also store semantic information related to images and particular regions. A distinctive aspect of our work is that a single image can be divided so that the resulting sub-images can be stored and managed separately by different agents, improving performance in data access and processing. The system also offers the possibility of applying region-based operations and filters to images, facilitating image classification. These operations can be performed directly on the data structures in the database.
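The idea of dividing one image into separately managed sub-images can be sketched as a grid split over the image bounds. The even grid and half-open region tuples are illustrative assumptions; the paper's agents and relational schema are not modelled here.

```python
def split_regions(width, height, tiles_x, tiles_y):
    """Divide an image's bounds into a grid of rectangular sub-image
    regions, each of which could be stored and processed by a
    different agent. Regions are (x0, y0, x1, y1), half-open, so
    together they cover the image exactly once with no overlap."""
    regions = []
    for j in range(tiles_y):
        for i in range(tiles_x):
            regions.append((
                i * width // tiles_x,
                j * height // tiles_y,
                (i + 1) * width // tiles_x,
                (j + 1) * height // tiles_y,
            ))
    return regions

# A 512x512 image split into four sub-images for four agents.
regions = split_regions(512, 512, 2, 2)
```

Because the regions partition the image exactly, a region-based query or filter can be routed to just the agents whose sub-images intersect the region of interest.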
Abstract:
Initially, the service sector was defined as complementary to the manufacturing sector. This situation has changed in recent times: services growth has come to dominate employment and economic activity in most developed nations and is becoming a key process for the competitiveness of their industrial sectors. New services related to commodities have become a strategy for differentiating their value proposition (Robinson et al., 2002). The service sector's importance is evident when evaluating its share of gross domestic product. According to the World Bank (2011), in 2009, 74.8% of GDP in the euro area and 77.5% in the United States were attributed to services. Globalization and the use of information and communication technology have accelerated the dissemination of knowledge and increased customer expectations about the services available worldwide. Innovation becomes essential to ensure that service organizations respond with appropriate products and services for each market segment. Customized new services delivered on time to market require a more developed innovation process. Service innovation and the new service development process are cited as priorities for academic research in the coming years (Karniouchina et al., 2005). This paper has the following objectives: to present a model for the analysis of the innovation process through the service value network; to verify its applicability through empirical research; and to identify the path and mode of innovation for a group of studied organizations and compare it with previous studies.
Abstract:
A method for fast colour and geometric correction of a tiled display system is presented in this paper. Such displays are a common choice for virtual reality applications and simulators, where a high-resolution image is required. They are the cheapest and most flexible alternative for large image generation, but they require precise geometric and colour correction. The purpose of the proposed method is to correct the projection system as fast as possible, so that if the system needs to be recalibrated, the recalibration does not interfere with the normal operation of the simulator or virtual reality application. The technique uses a single conventional webcam for both geometric and photometric correction. Some prior assumptions are made, such as a planar projection surface and negligible intra-projector colour variation and black-offset levels. If these assumptions hold true, geometric and photometric seamlessness can be achieved for this kind of display system. The method described in this paper scales to an arbitrary number of projectors and is completely automatic.
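For the geometric part, the planar-surface assumption means each projector's output can be corrected with a single homography fitted from webcam-observed point correspondences. A minimal direct-linear-transform sketch follows; the function names and the four-corner example are ours, and the paper's actual calibration pipeline is not reproduced.

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate the 3x3 homography H with dst ~ H @ src from four or
    more point correspondences via the direct linear transform (DLT).
    Valid under the planar projection surface assumption."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography (up to scale) is the right singular vector with
    # the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def warp(H, x, y):
    """Map a point through the homography (projective division)."""
    px, py, pw = H @ np.array([x, y, 1.0])
    return px / pw, py / pw

# Fit from four corners observed by the webcam (here a pure shift).
H = fit_homography([(0, 0), (1, 0), (0, 1), (1, 1)],
                   [(10, 5), (11, 5), (10, 6), (11, 6)])
```

In a tiled setup, one such homography per projector lets the frames be pre-warped so the observed tiles align geometrically on the shared surface.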
Abstract:
In October 2002, under the auspices of Spanish Cooperation, a pilot electrification project put into operation two centralised PV-diesel hybrid systems in two different Moroccan villages. These systems currently provide a full-time energy service and supply electricity to more than a hundred families, six community buildings, street lighting and one running-water system. The electricity service closely resembles an urban one: a single-phase AC supply (230 V/50 Hz) distributed to each dwelling over a low-voltage mini-grid, which has been designed to be fully compatible with the future arrival of the utility grid. The management of this electricity service is based on a “fee-for-service” scheme agreed between a local NGO, a partner of the project, and electricity associations created in each village, which are in charge of, among other tasks, recording the daily energy production of the systems and the monthly energy consumption of each house. This record of data allows a systematic evaluation of both system performance and user energy consumption. Now, after four years of operation, this paper presents the experience of this pilot electrification project and draws lessons that can be useful for designing, managing and sizing this type of small village PV-hybrid system.
Abstract:
Several activities in service-oriented computing, such as automatic composition, monitoring, and adaptation, can benefit from knowing properties of a given service composition before executing it. Among these properties we focus on those related to execution cost and resource usage, in a wide sense, as they can be linked to QoS characteristics. To attain more accuracy, we formulate execution cost and resource usage as functions of the input data (or appropriate abstractions thereof) and show how these functions can be used to make better, more informed decisions when performing composition, adaptation, and proactive monitoring. We present an approach to, on the one hand, synthesizing these functions automatically from the definitions of the different orchestrations taking part in a system and, on the other hand, effectively using them to reduce the overall costs of non-trivial service-based systems that are sensitive to input data and subject to failure. We validate our approach by means of simulations of scenarios requiring runtime selection of services and adaptation due to service failure. A number of rebinding strategies, including the use of cost functions, are compared.
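The use of data-dependent cost functions for runtime service selection can be illustrated with a toy selector. The cost models, failure probabilities and the retry-once expectation below are invented for illustration and are not the paper's synthesized functions or rebinding strategies.

```python
def choose_binding(candidates, input_size):
    """Select the candidate service whose predicted execution cost,
    evaluated on an abstraction of the input data (here just its
    size), is lowest after weighting by failure probability with a
    crude pay-and-retry-once model."""
    def expected_cost(svc):
        cost = svc["cost_fn"](input_size)
        return cost * (1.0 + svc["p_fail"])
    return min(candidates, key=expected_cost)

services = [
    # Low per-item cost but fixed overhead, and it fails often.
    {"name": "fastButFlaky",
     "cost_fn": lambda n: 2.0 + 0.5 * n, "p_fail": 0.30},
    # Steeper per-item cost, very reliable, no fixed overhead.
    {"name": "steady",
     "cost_fn": lambda n: 0.7 * n, "p_fail": 0.01},
]

small = choose_binding(services, 4)["name"]
large = choose_binding(services, 100)["name"]
```

Because the cost functions depend on the input, the best binding flips as the input grows, which is exactly why evaluating cost as a function of data, rather than as a constant per service, yields better composition and rebinding decisions.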