84 results for Integration and data management


Relevance:

100.00%

Abstract:

Climate-G is a large-scale distributed testbed devoted to climate change research. It is an unfunded effort, started in 2008, involving a wide community in both Europe and the US. The testbed is an interdisciplinary effort involving partners from several institutions and joining expertise in the fields of climate change and computational science. Its main goal is to allow scientists to carry out geographical and cross-institutional data discovery, access, analysis, visualization and sharing of climate data. It represents an attempt to address challenging data and metadata management issues in a real environment. This paper presents a complete overview of the Climate-G testbed, highlighting the most important results achieved since the beginning of the project.
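
A federated testbed of this kind lets a scientist issue a single query that fans out across institutional catalogues and merges the hits. As a minimal sketch (the site names, catalogue structure and field names below are invented for illustration, not the Climate-G schema):

```python
# Illustrative sketch of cross-institutional metadata discovery, the kind
# of operation a federated testbed like Climate-G supports. All names and
# records here are hypothetical.

CATALOGUES = {
    "institute-eu": [
        {"dataset": "tas_monthly_eu", "variable": "temperature", "region": "Europe"},
        {"dataset": "pr_daily_eu", "variable": "precipitation", "region": "Europe"},
    ],
    "institute-us": [
        {"dataset": "tas_monthly_us", "variable": "temperature", "region": "North America"},
    ],
}

def federated_search(variable):
    """Query every institution's catalogue and merge the matching records."""
    hits = []
    for site, records in CATALOGUES.items():
        for rec in records:
            if rec["variable"] == variable:
                hits.append({"site": site, **rec})
    return hits

results = federated_search("temperature")
```

In the real testbed, each catalogue entry would point at remotely accessible data services rather than an in-memory list, but the discovery pattern is the same.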

Relevance:

100.00%

Abstract:

Purpose – Multinationals have always needed an operating model that works: an effective plan for executing their most important activities at the right levels of their organization, whether globally, regionally or locally. The choices involved in these decisions have never been obvious, since international firms have consistently faced trade-offs between tailoring approaches for diverse local markets and leveraging their global scale. This paper seeks a more in-depth understanding of how successful firms manage the global-local trade-off in a multipolar world.
Design/methodology/approach – This paper uses a case study approach based on in-depth senior executive interviews at several telecommunications companies, including Tata Communications. The interviews probed the operating models of the companies studied, focusing on their approaches to organization structure, management processes, management technologies (including information technology (IT)) and people/talent.
Findings – Successful companies balance global-local trade-offs by taking a flexible and tailored approach to their operating-model decisions. The paper finds that successful companies, including Tata Communications, which is profiled in depth, are breaking the global-local conundrum into a set of more manageable strategic problems (what the authors call 'pressure points'), which they identify by assessing their most important activities and capabilities and determining the global and local challenges associated with them. They then design a different operating-model solution for each pressure point, and repeat this process as new strategic developments emerge. By doing so they not only enhance their agility, but also continually calibrate the crucial balance between global efficiency and local responsiveness.
Originality/value – This paper takes a unique approach to operating-model design, finding that an operating model is better viewed as several distinct solutions to specific pressure points rather than as a single, inflexible model that addresses all challenges equally. Now more than ever, developing the right operating model is at the top of multinational executives' priorities and an area of increasing concern; the international business arena has changed drastically, requiring thoughtfulness and flexibility instead of standard formulas for operating internationally. Old adages like 'think global and act local' no longer provide the universal guidance they once seemed to.

Relevance:

100.00%

Abstract:

The use of Wireless Sensor Networks (WSNs) in healthcare systems has attracted considerable attention in recent years. In much of this research, tasks such as sensor data processing, health-state decision making and emergency message sending are performed by a remote server. Many patients, each generating large volumes of sensor data, consume a great deal of communication resources, place a burden on the remote server and delay both decision and notification times. This paper simulates a healthcare application for elderly people using a WSN. A WSN designed for the proposed healthcare application needs efficient Medium Access Control (MAC) and routing protocols to guarantee the reliability of the data delivered from the patients to the medical centre. Based on these requirements, the GinMAC protocol, extended with a mobility module, was chosen to provide the required performance in terms of reliable data delivery and energy saving. Simulation results show that this modification to GinMAC can offer the required performance for the proposed healthcare application.
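
GinMAC belongs to the family of TDMA-based MAC protocols in which the frame is dimensioned offline and each node owns dedicated transmit slots, including spares for retransmission, which is what makes delivery latency and reliability predictable. The toy scheduler below illustrates that general idea only; the real GinMAC frame layout, slot counts and retransmission scheme differ, and the topology and parameters are invented:

```python
# Toy TDMA slot assignment for a small sensor tree, loosely inspired by
# TDMA-based WSN MAC protocols such as GinMAC (illustrative only: the real
# GinMAC frame layout, slot counts and retransmission scheme differ).

def assign_slots(tree, retx_per_node=1):
    """Give every node dedicated transmit slots, plus retransmission spares.

    tree maps node -> parent; a parent of None means the node forwards
    directly to the sink. Returns (frame_length, schedule).
    """
    def depth(node):
        d = 0
        while tree.get(node) is not None:
            node = tree[node]
            d += 1
        return d

    schedule, slot = {}, 0
    # Deeper nodes transmit first, so a reading can reach the sink
    # within a single TDMA frame.
    for node in sorted(tree, key=depth, reverse=True):
        schedule[node] = list(range(slot, slot + 1 + retx_per_node))
        slot += 1 + retx_per_node
    return slot, schedule

# Hypothetical topology: two leaves under one relay, relay under the sink.
topology = {"leaf1": "relay", "leaf2": "relay", "relay": None}
frame_len, sched = assign_slots(topology)
```

Because the slots are dedicated rather than contended, collisions are avoided by construction, at the cost of a fixed frame length that must be re-dimensioned when the topology changes.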

Relevance:

100.00%

Abstract:

We describe a new methodology for comparing satellite radiation budget data with a numerical weather prediction (NWP) model. This is applied to data from the Geostationary Earth Radiation Budget (GERB) instrument on Meteosat-8. The methodology brings together, in near-real time, GERB broadband shortwave and longwave fluxes with simulations based on analyses produced by the Met Office global NWP model. Results for the period May 2003 to February 2005 illustrate the progressive improvements in the data products as various initial problems were resolved. In most areas the comparisons reveal systematic errors in the model's representation of surface properties and clouds, which are discussed elsewhere. However, for clear-sky regions over the oceans the model simulations are believed to be sufficiently accurate to allow the quality of the GERB fluxes themselves to be assessed and any changes over time in the performance of the instrument to be identified. Using model and radiosonde profiles of temperature and humidity as input to a single-column version of the model's radiation code, we conduct sensitivity experiments which provide estimates of the expected model errors over the ocean of about 5–10 W m⁻² in clear-sky outgoing longwave radiation (OLR) and 0.01 in clear-sky albedo. For the more recent data the differences between the observed and modeled OLR and albedo are well within these error estimates. The close agreement between the observed and modeled values, particularly for the most recent period, illustrates the value of the methodology. It also contributes to the validation of the GERB products and increases confidence in the quality of the data prior to their release.
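
At the heart of such a comparison is a systematic-difference statistic between observed and modelled fluxes, checked against the estimated model error. A minimal sketch with invented flux values (real GERB/NWP comparisons are gridded, cloud-screened and time-matched before any statistics are computed):

```python
# Illustrative bias check of observed vs modelled clear-sky OLR, in the
# spirit of the GERB/NWP comparison described above. The flux values are
# hypothetical.

observed_olr = [288.1, 291.4, 285.7, 290.2]   # W m^-2, invented
modelled_olr = [286.0, 289.9, 284.1, 288.8]   # W m^-2, invented

def mean_bias(obs, mod):
    """Mean of (observation - model): a basic measure of systematic error."""
    return sum(o - m for o, m in zip(obs, mod)) / len(obs)

bias = mean_bias(observed_olr, modelled_olr)
# Is the systematic difference within the ~5-10 W m^-2 model-error estimate?
within_estimate = abs(bias) <= 10.0
```

Only when the difference falls inside the expected model error, as here, can residual discrepancies be attributed to the instrument rather than the model.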

Relevance:

100.00%

Abstract:

A review of the implications of climate change for freshwater resources, based on Chapter 4 of the IPCC Working Group 2 report.

Relevance:

100.00%

Abstract:

Mainframes and corporate and central servers are becoming information servers. The requirement for more powerful information servers is the best opportunity to exploit the potential of parallelism. ICL recognized the opportunity of the 'knowledge spectrum': converting raw data into information and then into high-grade knowledge. Its response to this, and to the underlying search problems, was to introduce the CAFS retrieval engine. The CAFS product demonstrates that it is possible to move functionality within an established architecture, introduce a different technology mix and exploit parallelism to achieve radically new levels of performance. CAFS also demonstrates the benefit of achieving this transparently, behind existing interfaces. ICL is now working with Bull and Siemens to develop the information servers of the future by exploiting new technologies as they become available. The objective of the joint Esprit II European Declarative System project is to develop a smoothly scalable, highly parallel computer system, EDS. EDS will in the main be an SQL server and an information server. It will support the many data-intensive applications which the companies foresee; it will also support application-intensive and logic-intensive systems.
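
The architectural idea behind CAFS (evaluating the selection predicate in parallel, close to the data, instead of shipping every record back to the host) can be sketched in a few lines. The records and predicate below are invented for illustration; the real CAFS performed this filtering in hardware on the data path from disk:

```python
# Illustrative sketch of predicate push-down with parallel scanners, the
# idea behind content-addressable search engines such as CAFS. Data and
# predicate are hypothetical.

from concurrent.futures import ThreadPoolExecutor

RECORDS = [
    {"id": 1, "dept": "sales", "value": 120},
    {"id": 2, "dept": "research", "value": 300},
    {"id": 3, "dept": "sales", "value": 80},
    {"id": 4, "dept": "research", "value": 150},
]

def scan(partition, predicate):
    """One 'scanner' filters its own partition close to the data."""
    return [r for r in partition if predicate(r)]

def parallel_select(records, predicate, workers=2):
    """Split the records into partitions and filter them concurrently."""
    size = (len(records) + workers - 1) // workers
    parts = [records[i:i + size] for i in range(0, len(records), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(scan, parts, [predicate] * len(parts))
    return [r for part in results for r in part]

hits = parallel_select(RECORDS, lambda r: r["dept"] == "research")
```

Only the matching records cross the boundary back to the caller, which is exactly the saving CAFS delivered transparently behind existing query interfaces.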

Relevance:

100.00%

Abstract:

The shamba system involves farmers tending tree saplings on state-owned forest land in return for being permitted to intercrop perennial food crops until canopy closure. At one time the system was used throughout all state-owned forest lands in Kenya, accounting for a large proportion of some 160,000 ha. The system should theoretically be mutually beneficial to both local people and the government. However, the system has had a chequered past in Kenya due to widespread malpractice and associated environmental degradation. It was last banned in 2003, but in early 2008 field trials were initiated for its reintroduction. This study aimed to: assess the benefits and limitations of the shamba system in Kenya; assess the main influences on the extent to which the limitations and benefits are realised; and consider the management and policy requirements for the system's successful and sustainable operation. Information was obtained from 133 questionnaires, using mainly open-ended questions, and six participatory workshops carried out in forest-adjacent communities on the western slopes of Mount Kenya in Nyeri district. In addition, interviews were conducted with key informants from communities and organisations. There was a strong desire amongst local people for the system's reintroduction, given that it had provided significant food, income and employment. Local perceptions of the failings of the system included, firstly, mismanagement by government or forest authorities and, secondly, abuse of the system by shamba farmers and outsiders. Improvements local people considered necessary for the shamba system to work included more accountability and transparency in administration and better rules with respect to plot allocation and stewardship. Ninety-seven percent of respondents said they would like to be more involved in management of the forest, and 80% that they were willing to pay for the use of a plot.
The study concludes that the structural framework laid down by the 2005 Forests Act, which includes provision for the reimplementation of the shamba system under the new plantation establishment and livelihood improvement scheme (PELIS) [It should be noted that whilst the shamba system was re-branded in 2008 under the acronym PELIS, for the sake of simplicity the authors continue to refer to the 'shamba system' and 'shamba farmers' throughout this paper.], is weakened because insufficient power is likely to be devolved to local people, casting them merely as 'forest users' and the shamba system as a 'forest user right'. In so doing, the system's potential both to facilitate and to embody the participation of local people in forest management is limited, and the long-term sustainability of the new system is questionable. Suggested instruments to address this include some degree of sharing of profits from forest timber, performance-related guarantees for farmers to gain a new plot, and the use of joint committees consisting of local people and the forest authorities for the long-term management of forests.

Relevance:

100.00%

Abstract:

Details of the parameters of kinetic systems are crucial for progress in both medical and industrial research, including drug development, clinical diagnosis and biotechnology applications. Such details must be collected through a series of kinetic experiments and investigations. The correct design of the experiment is essential to collecting data suitable for analysis, modelling and deriving the correct information. We have developed a systematic and iterative Bayesian method, and sets of rules, for the design of enzyme kinetic experiments. Our method selects the optimum design to collect data suitable for accurate modelling and analysis, and minimises the error in the estimated parameters. The rules select features of the design such as the substrate range and the number of measurements. We show here that this method can be directly applied to the study of other important kinetic systems, including drug transport, receptor binding, microbial culture and cell transport kinetics. It is possible to reduce the errors in the estimated parameters and, most importantly, to increase efficiency and cost-effectiveness by reducing the number of experiments and data points required. (C) 2003 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
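
One classical way to formalise 'selecting the optimum design' is to score candidate measurement points by how much information they carry about the parameters. The sketch below applies a D-optimality criterion to the Michaelis–Menten rate law as a toy stand-in; the paper's actual method is Bayesian and iterative, and the parameter values and candidate designs here are invented:

```python
# Illustrative design-of-experiments scoring for a kinetic model: rank
# candidate substrate-concentration designs by the determinant of the
# Fisher information matrix for the Michaelis-Menten parameters (Vmax, Km).
# A toy D-optimality sketch, not the paper's Bayesian procedure.

def sensitivities(S, Vmax, Km):
    """Partial derivatives of v = Vmax*S/(Km+S) w.r.t. Vmax and Km."""
    dv_dVmax = S / (Km + S)
    dv_dKm = -Vmax * S / (Km + S) ** 2
    return dv_dVmax, dv_dKm

def d_criterion(design, Vmax=1.0, Km=2.0):
    """det(J^T J) for the 2x2 information matrix of a set of S values."""
    a = b = c = 0.0   # entries of the symmetric matrix [[a, b], [b, c]]
    for S in design:
        j1, j2 = sensitivities(S, Vmax, Km)
        a += j1 * j1
        b += j1 * j2
        c += j2 * j2
    return a * c - b * b

# A design spread over the substrate range should carry more information
# than one that clusters every measurement at similar low concentrations.
spread = d_criterion([0.5, 2.0, 10.0])
clustered = d_criterion([0.4, 0.5, 0.6])
```

Maximising this determinant over candidate designs is what lets fewer, better-placed measurements yield smaller parameter errors, which is the cost saving the abstract describes.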