851 results for large-scale systems


Relevance: 100.00%

Abstract:

Increasing commercial pressures on land are provoking fundamental and far-reaching changes in the relationships between people and land. Much knowledge on land-oriented investment projects currently comes from the media. Although this provides a good starting point, lack of transparency and rapidly changing contexts mean that it is often unreliable. The International Land Coalition, in partnership with Oxfam Novib, the Centre de coopération internationale en recherche agronomique pour le développement (CIRAD), the University of Pretoria, the Centre for Development and Environment of the University of Bern (CDE), and GIZ, started to compile an inventory of land-related investments. The project aims to better understand the extent, trends and impacts of land-related investments by supporting an ongoing and systematic stocktaking of the various investment projects currently taking place worldwide. It involves a large number of organizations and individuals who work in areas where land transactions are being made and are able to provide details of such investments. The project monitors land transactions in rural areas that imply a transformation of land use rights from communities and smallholders to commercial use, made by both domestic and foreign investors (private actors, governments, government-backed private investors). The focus is on investments for food or agrofuel production, timber extraction, carbon trading, mineral extraction, conservation and tourism. A novel way of using ICT to document land acquisitions in a spatially explicit way, using an approach called "crowdsourcing", is being developed. This approach allows actors to share information and knowledge directly and at any time on a public platform, where it is scrutinized for reliability and cross-checked against other sources. Up to now, over 1200 deals have been recorded across 96 countries.
Details of such transactions have been classified in a matrix and distributed to over 350 contacts worldwide for verification. The verified information has been geo-referenced and represented in two global maps. This is an open database enabling a continued monitoring exercise and the improvement of data accuracy. More information will be released over time. The opportunities arise from overcoming the constraints of incomplete information by proposing a new way of collecting, enhancing and sharing information and knowledge in a more democratic and transparent manner. The intention is to develop an interactive knowledge platform where any interested person can share and access information on land deals, their links to the stakeholders involved, and their embedding in a geographical context. By making use of new ICT that is increasingly within the reach of local stakeholders, as well as open-access and web-based spatial information systems, it will become possible to create a dynamic database containing spatially explicit data. Data fed in by a large number of stakeholders, increasingly also by means of new mobile ICT, will open up new opportunities to analyse, monitor and assess highly dynamic trends of land acquisition and rural transformation.
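The verification workflow described above can be sketched as a minimal data record with a cross-checking step. All field names and the two-source threshold below are hypothetical illustrations, not the project's actual matrix:

```python
from dataclasses import dataclass, field

@dataclass
class LandDeal:
    country: str
    intended_use: str          # e.g. "food", "agrofuel", "timber", "mining"
    area_ha: float
    lat: float                 # geo-reference for the spatially explicit maps
    lon: float
    sources: list = field(default_factory=list)   # independent reports
    verified: bool = False

def cross_check(deal, min_sources=2):
    """Mark a deal verified once enough independent sources corroborate it."""
    deal.verified = len(deal.sources) >= min_sources
    return deal

deal = cross_check(LandDeal("Exampleland", "agrofuel", 12000.0, -1.29, 36.82,
                            sources=["media report", "field contact"]))
print(deal.verified)  # → True
```

Records that fail the cross-check would stay in the database as unverified, awaiting further reports from the crowd.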

Relevance: 100.00%

Abstract:

Alpine heavy precipitation events often affect small catchments, although the circulation pattern leading to the event extends over the entire North Atlantic. The various scale interactions involved are particularly challenging for the numerical weather prediction of such events. Unlike previous studies focusing on the southern Alps, here a comprehensive study of a heavy precipitation event in the northern Alps in October 2011 is presented with particular focus on the role of the large-scale circulation in the North Atlantic/European region. During the event exceptionally high amounts of total precipitable water occurred in and north of the Alps. This moisture was initially transported along the flanks of a blocking ridge over the North Atlantic. Subsequently, strong and persistent northerly flow established at the upstream flank of a trough over Europe and steered the moisture towards the northern Alps. Lagrangian diagnostics reveal that a large fraction of the moisture emerged from the West African coast where a subtropical upper-level cut-off low served as an important moisture collector. Wave activity flux diagnostics show that the ridge was initiated as part of a low-frequency, large-scale Rossby wave train while convergence of fast transients helped to amplify it locally in the North Atlantic. A novel diagnostic for advective potential vorticity tendencies sheds more light on this amplification and further emphasizes the role of the ridge in amplifying the trough over Europe. Operational forecasts misrepresented the amplitude and orientation of this trough. For the first time, this study documents an important pathway for northern Alpine flooding, in which the interaction of synoptic-scale to large-scale weather systems and of long-range moisture transport from the Tropics are dominant. Moreover, the trapping of moisture in a subtropical cut-off near the West African coast is found to be a crucial precursor to the observed European high-impact weather.

Relevance: 100.00%

Abstract:

This chapter aims to overcome the gap existing between case study research, which typically provides qualitative and process-based insights, and national or global inventories that typically offer spatially explicit and quantitative analysis of broader patterns, and thus to present adequate evidence for policymaking regarding large-scale land acquisitions. Therefore, the chapter links spatial patterns of land acquisitions to underlying implementation processes of land allocation. Methodologically linking the described patterns and processes proved difficult, but we have identified indicators that could be added to inventories and monitoring systems to make linkage possible. Combining complementary approaches in this way may help to determine where policy space exists for more sustainable governance of land acquisitions, both geographically and with regard to processes of agrarian transitions. Our spatial analysis revealed two general patterns: (i) relatively large forestry-related acquisitions that target forested landscapes and often interfere with semi-subsistence farming systems; and (ii) smaller agriculture-related acquisitions that often target existing cropland and also interfere with semi-subsistence systems. Furthermore, our meta-analysis of land acquisition implementation processes shows that authoritarian, top-down processes dominate. Initially, the demands of powerful regional and domestic investors tend to override socio-ecological variables, local actors’ interests, and land governance mechanisms. As available land grows scarce, however, and local actors gain experience dealing with land acquisitions, it appears that land investments begin to fail or give way to more inclusive, bottom-up investment models.

Relevance: 100.00%

Abstract:

We propose WEAVE, a geographical 2D/3D routing protocol that maintains information on a small number of waypoints and checkpoints for forwarding packets to any destination. Nodes obtain the routing information from partial traces gathered in incoming packets and use a system of checkpoints, along with segments of routes, to weave end-to-end paths close to the shortest ones. WEAVE generates no control traffic, is suitable for routing in both 2D and 3D networks, and does not require any strong assumptions about the underlying network graph, such as the unit-disk or planar-graph properties. WEAVE compares favorably with existing protocols in both testbed experiments and simulations.
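A minimal sketch of the forwarding idea, under our own simplifying assumption that each segment greedily moves toward the next recorded checkpoint; the protocol's trace gathering and recovery logic is omitted:

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def greedy_next_hop(current, neighbors, target):
    """Pick the neighbour geographically closest to the target, if it improves."""
    best = min(neighbors, key=lambda n: dist(n, target))
    return best if dist(best, target) < dist(current, target) else None

def route(src, dst, neighbors_of, waypoints):
    """Weave a path src -> checkpoint(s) -> dst out of greedy segments."""
    path, current = [src], src
    for target in list(waypoints) + [dst]:
        while current != target:
            nxt = greedy_next_hop(current, neighbors_of[current], target)
            if nxt is None:       # greedy dead end; a real protocol recovers here
                return path
            path.append(nxt)
            current = nxt
    return path

# four nodes on a line; (2, 0) is a recorded checkpoint
neighbors_of = {(0, 0): [(1, 0)], (1, 0): [(0, 0), (2, 0)],
                (2, 0): [(1, 0), (3, 0)], (3, 0): [(2, 0)]}
path = route((0, 0), (3, 0), neighbors_of, waypoints=[(2, 0)])
print(path)  # → [(0, 0), (1, 0), (2, 0), (3, 0)]
```

The appeal of checkpoint-based forwarding is visible even here: a node only needs local neighbour positions plus a few intermediate targets, not a global route.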

Relevance: 100.00%

Abstract:

The design, construction and operation of the tunnels of the M-30, the major ring road in the city of Madrid (Spain), represent a very interesting project in which a wide variety of situations (geometrical, topographical, etc.) had to be covered, under variable traffic conditions. For these reasons, the M-30 project is a remarkable technical challenge which, after its completion, became an international reference. From the "design for safety" perspective, a holistic approach was used to deal with new technologies, integration of systems and development of procedures to reach the maximum safety level. However, one of the primary goals has been to achieve reasonably homogeneous characteristics that permit operating a network of tunnels as a single infrastructure. In the case of the ventilation system, these goals have implied innovative solutions and coordination efforts of great interest. Consequently, this paper describes the principal ideas underlying the conceptual solution developed, focusing on the main peculiarities of the project.

Relevance: 100.00%

Abstract:

A new method to study large-scale neural networks is presented in this paper. The basis is the use of Feynman-like diagrams. These diagrams allow the analysis of collective and cooperative phenomena with a methodology similar to that employed in the Many Body Problem. The proposed method is applied to a very simple structure composed of a string of neurons with interaction among them. It is shown that a new behavior appears at the end of the row, different from the initial dynamics of a single cell. When feedback is present, as in the case of the hippocampus, the situation becomes more complex, with a whole set of new frequencies different from the proper frequencies of the individual neurons. An application to an optical neural network is reported.
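To illustrate the qualitative claim, here is our own toy analogue (not the paper's diagrammatic method): a chain of leaky units, each driven by its upstream neighbour, already shows that the cell at the end of the row responds differently from an isolated cell under the same drive:

```python
import math

def simulate_chain(n_cells, steps=2000, dt=0.01, coupling=0.9, freq=2.0):
    """Each leaky unit integrates its upstream neighbour; unit 0 gets the drive.
    Returns the steady-state response amplitude at the end of the row."""
    x = [0.0] * n_cells
    trace = []
    for k in range(steps):
        drive = math.sin(freq * k * dt)
        # Euler step: old state x is read on the right, new state assigned at once
        x = [x[i] + dt * ((drive if i == 0 else coupling * x[i - 1]) - x[i])
             for i in range(n_cells)]
        trace.append(x[-1])
    return max(abs(v) for v in trace[steps // 2:])

# the end of an 8-cell row is strongly attenuated relative to a single cell:
# each stage acts as a low-pass filter, so the collective response is new
print(simulate_chain(1) > simulate_chain(8))  # → True
```

Each stage here attenuates and phase-shifts the signal, so the row as a whole behaves as a filter cascade rather than as a single cell, which is the flavour of the collective effect the paper analyses diagrammatically.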

Relevance: 100.00%

Abstract:

Between 2003 and 2007 an urban network of road tunnels with a total constructed tube length of 45 km was built in the city of Madrid. This remarkable engineering work, known as the "Calle 30 Project", comprised different tunnel typologies and ventilation systems. Due to the length of the tunnels and the impact of the work itself, the tunnels were endowed with a great variety of installations to provide the maximum levels of safety for both users and the infrastructure, including, in some parts of the tunnel, a fixed fire-fighting system based on water mist. Within this framework, a large-scale campaign of fire tests was planned to study different aspects related to fire safety in the tunnels, including the interaction between the ventilation and extinction systems. In addition, these large-scale fire tests gave the fire brigades of the city of Madrid an opportunity to define operational procedures for fire fighting in tunnels and to evaluate the possibilities of fixed fire-fighting systems. The tests were carried out at the "San Pedro de Anes" experimentation centre, which has a 600 m tunnel with a removable false ceiling for reproducing different ceiling heights and ventilation conditions (transverse and longitudinal). Interesting conclusions were obtained on the interaction of ventilation and water mist systems, but also on other aspects, including the performance of the water mist system in terms of reduction of gas temperatures and visibility conditions. This paper presents a description of the test campaign carried out and some preliminary results.

Relevance: 100.00%

Abstract:

We combine multi-wavelength data in the AEGIS-XD and C-COSMOS surveys to measure the typical dark matter halo mass of X-ray selected active galactic nuclei (AGN) [L_X(2–10 keV) > 10^42 erg s^− 1] in comparison with far-infrared selected star-forming galaxies detected in the Herschel/PEP survey (PACS Evolutionary Probe; L_IR > 10^11 L_⊙) and quiescent systems at z ≈ 1. We develop a novel method to measure the clustering of extragalactic populations that uses photometric redshift probability distribution functions in addition to any spectroscopy. This is advantageous in that all sources in the sample are used in the clustering analysis, not just the subset with secure spectroscopy. The method works best for large samples. The loss of accuracy because of the lack of spectroscopy is balanced by increasing the number of sources used to measure the clustering. We find that X-ray AGN, far-infrared selected star-forming galaxies and passive systems in the redshift interval 0.6 < z < 1.4 are found in haloes of similar mass, log M_DMH/(M_⊙ h^−1) ≈ 13.0. We argue that this is because the galaxies in all three samples (AGN, star-forming, passive) have similar stellar mass distributions, approximated by the J-band luminosity. Therefore, all galaxies that can potentially host X-ray AGN, because they have stellar masses in the appropriate range, live in dark matter haloes of log M_DMH/(M_⊙ h^−1) ≈ 13.0 independent of their star formation rates. This suggests that the stellar mass of X-ray AGN hosts is driving the observed clustering properties of this population. We also speculate that trends between AGN properties (e.g. luminosity, level of obscuration) and large-scale environment may be related to differences in the stellar mass of the host galaxies.
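The core of the clustering method, weighting each source by its photometric-redshift PDF instead of discarding sources without spectroscopy, can be sketched as follows. The grid, the Gaussian PDF and the slice boundaries are illustrative choices of ours, not the paper's actual data:

```python
import math

def slice_weight(pdf, z_grid, z_lo, z_hi):
    """P(z_lo < z < z_hi) from a normalized photo-z PDF sampled on z_grid."""
    dz = z_grid[1] - z_grid[0]
    return sum(p for z, p in zip(z_grid, pdf) if z_lo < z < z_hi) * dz

# an illustrative Gaussian photo-z PDF centred on z = 1 with sigma = 0.2
z_grid = [i * 0.01 for i in range(301)]
pdf = [math.exp(-0.5 * ((z - 1.0) / 0.2) ** 2) for z in z_grid]
norm = sum(pdf) * 0.01
pdf = [p / norm for p in pdf]

# weight of this source for the 0.6 < z < 1.4 interval; in a clustering
# analysis, pair counts would then be weighted by w_i * w_j rather than
# restricting the sample to sources with secure spectroscopy
w = slice_weight(pdf, z_grid, 0.6, 1.4)
print(0.9 < w < 0.99)  # → True: about 95% of the PDF mass lies in the slice
```

This is why the method trades per-source accuracy for sample size: an uncertain source contributes fractionally instead of being dropped.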

Relevance: 100.00%

Abstract:

A parallel computing environment to support optimization of large-scale engineering systems is designed and implemented on Windows-based personal computer networks, using the master-worker model and the Parallel Virtual Machine (PVM). It decomposes a large engineering system into a number of smaller subsystems that are optimized in parallel on worker nodes, and coordinates the subsystem optimization results on the master node. The environment consists of six functional modules: the master control, the optimization model generator, the optimizer, the data manager, the monitor, and the post-processor. An object-oriented design of these modules is presented. The environment supports all steps from the generation of optimization models to their solution and visualization on networks of computers. User-friendly graphical interfaces make it easy to define the problem and to monitor and steer the optimization process. The environment has been verified with an example of large space truss optimization. (C) 2004 Elsevier Ltd. All rights reserved.
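The master-worker decomposition-and-coordination step can be sketched as follows, with Python threads standing in for PVM worker nodes and a toy one-dimensional objective standing in for subsystem optimization (all names and values here are illustrative):

```python
from concurrent.futures import ThreadPoolExecutor

def optimize_subsystem(subsystem):
    """Toy subsystem 'optimization': pick the candidate nearest the target 3.0."""
    name, candidates = subsystem
    return name, min(candidates, key=lambda x: (x - 3.0) ** 2)

# the master decomposes the system, workers optimize subsystems in parallel,
# and the master coordinates (collects) the partial results
subsystems = [("truss-A", [1.0, 2.5, 4.0]), ("truss-B", [0.5, 3.2, 6.0])]
with ThreadPoolExecutor(max_workers=2) as pool:
    results = dict(pool.map(optimize_subsystem, subsystems))
print(results["truss-A"], results["truss-B"])  # → 2.5 3.2
```

In the environment described above the same roles are distributed across machines via PVM message passing, with the coordination loop repeated until the subsystem solutions are consistent.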

Relevance: 100.00%

Abstract:

1. We analysed time-series data from populations of red kangaroos (Macropus rufus, Desmarest) inhabiting four areas in the pastoral zone of South Australia. We formulated a set of a priori models to disentangle the relative effects of the covariates rainfall, harvesting, intraspecific competition, and domestic herbivores on kangaroo population-growth rate.
2. The statistical framework allowed for spatial variation in the growth-rate parameters, response to covariates, and environmental variability, as well as spatially correlated error terms due to a shared environment.
3. The most parsimonious model included all covariates but no area-specific parameter values, suggesting that kangaroo densities respond in the same way to the covariates across the areas.
4. The temporal dynamics were spatially correlated, even after taking into account the potentially synchronizing effect of rainfall, harvesting and domestic herbivores.
5. Counter-intuitively, we found a positive rather than negative effect of domestic herbivore density on the population-growth rate of kangaroos. We hypothesize that this effect is caused by sheep and cattle acting as a surrogate for resource availability beyond rainfall.
6. Even though our system is well studied, we must conclude that approximating resources by surrogates such as rainfall is more difficult than previously thought. This is an important message for studies of consumer-resource systems and highlights the need to be explicit about population processes when analysing population patterns.
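The kind of model compared in such an analysis can be sketched as a log-linear growth-rate regression. The covariates, coefficients and data below are simulated illustrations (harvesting is omitted for brevity), not the authors' fitted model:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200
log_density = rng.normal(3.0, 0.3, T)      # log kangaroo density (competition)
rain = rng.normal(0.0, 1.0, T)             # standardized rainfall
stock = rng.normal(0.0, 1.0, T)            # standardized domestic herbivores
true = np.array([0.5, -0.15, 0.2, 0.1])    # intercept, density dep., rain, stock

X = np.column_stack([np.ones(T), log_density, rain, stock])
r = X @ true + rng.normal(0.0, 0.05, T)    # observed log growth rates

# ordinary least squares recovers the signs of the covariate effects
coef, *_ = np.linalg.lstsq(X, r, rcond=None)
print(coef[1] < 0 and coef[2] > 0)  # → True: density dependence and rain effect
```

The paper's framework additionally allows area-specific parameters and spatially correlated errors; the point of the sketch is only the structure of growth rate as a linear function of density and environmental covariates.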

Relevance: 100.00%

Abstract:

Experimental and theoretical studies have shown the importance of stochastic processes in genetic regulatory networks and cellular processes. Cellular networks and genetic circuits often involve small numbers of key proteins such as transcription factors and signaling proteins. In recent years stochastic models have been used successfully for studying noise in biological pathways, and stochastic modelling of biological systems has become a very important research field in computational biology. One of the challenging problems in this field is reducing the huge computing time of stochastic simulations. Based on the system of the mitogen-activated protein kinase cascade that is activated by epidermal growth factor, this work gives a parallel implementation using OpenMP and parallelism across the simulation. Special attention is paid to the independence of the random numbers generated in parallel, which is a key criterion for the success of stochastic simulations. Numerical results indicate that parallel computers can be used as an efficient tool for simulating the dynamics of large-scale genetic regulatory networks and cellular processes.
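The key point about independent random streams can be sketched with NumPy's SeedSequence spawning, our stand-in for the paper's OpenMP setup; the decaying-species model is a toy, not the MAPK cascade:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def decay_realization(rng, n0=100, k=0.1, t_end=10.0):
    """Gillespie-style stochastic simulation of a decaying species A -> 0."""
    n, t = n0, 0.0
    while n > 0 and t < t_end:
        t += rng.exponential(1.0 / (k * n))   # waiting time to the next event
        n -= 1
    return n

# one statistically independent stream per worker: the key correctness
# criterion when running realizations in parallel
streams = [np.random.default_rng(s) for s in np.random.SeedSequence(42).spawn(4)]
with ThreadPoolExecutor(max_workers=4) as pool:
    finals = list(pool.map(decay_realization, streams))
print(len(finals))  # → 4
```

Spawned streams are guaranteed non-overlapping, so realizations can be averaged as if they had been run sequentially; naively reusing one seed across workers would correlate the trajectories and bias the statistics.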

Relevance: 100.00%

Abstract:

For the treatment and monitoring of Parkinson's disease (PD) to be scientific, a key requirement is that measurement of disease stages and severity is quantitative, reliable, and repeatable. The last 50 years in PD research have been dominated by qualitative, subjective ratings obtained by human interpretation of the presentation of disease signs and symptoms at clinical visits. More recently, “wearable,” sensor-based, quantitative, objective, and easy-to-use systems for quantifying PD signs for large numbers of participants over extended durations have been developed. This technology has the potential to significantly improve both clinical diagnosis and management in PD and the conduct of clinical studies. However, the large-scale, high-dimensional character of the data captured by these wearable sensors requires sophisticated signal processing and machine-learning algorithms to transform it into scientifically and clinically meaningful information. Such algorithms that “learn” from data have shown remarkable success in making accurate predictions for complex problems in which human skill has been required to date, but they are challenging to evaluate and apply without a basic understanding of the underlying logic on which they are based. This article contains a nontechnical tutorial review of relevant machine-learning algorithms, also describing their limitations and how these can be overcome. It discusses implications of this technology and a practical road map for realizing the full potential of this technology in PD research and practice. © 2016 International Parkinson and Movement Disorder Society.
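One step that such wearable-sensor pipelines typically require, reducing a raw high-dimensional trace to a clinically interpretable feature, can be sketched as follows. The synthetic 5 Hz signal is our illustration (PD rest tremor typically lies in the 4–6 Hz band), not data from the review:

```python
import numpy as np

fs = 100.0                                     # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
noise = 0.3 * np.random.default_rng(1).normal(size=t.size)
signal = np.sin(2 * np.pi * 5.0 * t) + noise   # synthetic tremor-like trace

# dominant frequency via the FFT magnitude spectrum, skipping the DC bin
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
dominant = freqs[np.argmax(spectrum[1:]) + 1]
print(round(dominant, 1))  # → 5.0
```

Features like this, computed over many participants and long durations, are what the machine-learning algorithms discussed in the review take as input.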

Relevance: 100.00%

Abstract:

Developing analytical models that can accurately describe behaviors of Internet-scale networks is difficult. This is due, in part, to the heterogeneous structure, immense size and rapidly changing properties of today's networks. The lack of analytical models makes large-scale network simulation an indispensable tool for studying immense networks. However, large-scale network simulation has not been commonly used to study networks of Internet-scale. This can be attributed to three factors: 1) current large-scale network simulators are geared towards simulation research and not network research, 2) the memory required to execute an Internet-scale model is exorbitant, and 3) large-scale network models are difficult to validate. This dissertation tackles each of these problems.

First, this work presents a method for automatically enabling real-time interaction, monitoring, and control of large-scale network models. Network researchers need tools that allow them to focus on creating realistic models and conducting experiments. However, this should not increase the complexity of developing a large-scale network simulator. This work presents a systematic approach to separating the concerns of running large-scale network models on parallel computers and the user-facing concerns of configuring and interacting with large-scale network models.

Second, this work deals with reducing memory consumption of network models. As network models become larger, so does the amount of memory needed to simulate them. This work presents a comprehensive approach to exploiting structural duplications in network models to dramatically reduce the memory required to execute large-scale network experiments.

Lastly, this work addresses the issue of validating large-scale simulations by integrating real protocols and applications into the simulation. With an emulation extension, a network simulator operating in real-time can run together with real-world distributed applications and services. As such, real-time network simulation not only alleviates the burden of developing separate models for applications in simulation but, as real systems are included in the network model, also increases the confidence level of network simulation. This work presents a scalable and flexible framework to integrate real-world applications with real-time simulation.
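The structural-duplication idea can be sketched as flyweight-style interning, our simplification of the dissertation's more elaborate mechanism: structurally identical configurations are stored once and shared by reference, so a million duplicate routers cost one configuration object:

```python
class ConfigPool:
    """Store one shared copy of each structurally identical configuration."""

    def __init__(self):
        self._pool = {}

    def intern(self, config: tuple):
        # return the already-stored instance if an equal one exists,
        # otherwise store this one and return it
        return self._pool.setdefault(config, config)

pool = ConfigPool()
a = pool.intern(("router", 4, "drop-tail"))
b = pool.intern(("router", 4, "drop-tail"))  # structurally identical duplicate
print(a is b)  # → True: one copy serves every duplicate in the model
```

Memory savings then scale with the amount of duplication in the topology, which is large in Internet-scale models built from repeated subnetwork templates.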

Relevance: 100.00%

Abstract:

ACKNOWLEDGEMENTS This research is based upon work supported in part by the U.S. ARL and U.K. Ministry of Defense under Agreement Number W911NF-06-3-0001, and by the NSF under award CNS-1213140. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views or represent the official policies of the NSF, the U.S. ARL, the U.S. Government, the U.K. Ministry of Defense or the U.K. Government. The U.S. and U.K. Governments are authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation hereon.