96 results for building simulation
Abstract:
The Voxel Imaging PET (VIP) Pathfinder project was awarded a four-year European Research Council FP7 grant in 2010 to prove the feasibility of using CdTe detectors in a novel conceptual design of PET scanner. The work presented in this thesis is part of the VIP project and consists of, on the one hand, the characterization of a CdTe detector in terms of energy resolution and coincidence time resolution and, on the other hand, the simulation of the single-detector setup in order to extrapolate the results to the full PET scanner. An energy resolution of 0.98% at 511 keV with a bias voltage of 1000 V/mm has been measured at a low temperature of T = -8 °C. The coincidence time distribution of two twin detectors has been found to be as low as 6 ns FWHM for events with energies above 500 keV under the same temperature and bias conditions. The measured energy and time resolution values are compatible with similar findings available in the literature and prove the excellent potential of CdTe for PET applications. These results have been presented as poster contributions at the IEEE NSS/MIC & RTSD 2011 conference in October 2011 in Valencia and at the iWoRID 2012 conference in July 2012 in Coimbra, Portugal. They have also been submitted for publication to the Journal of Instrumentation (JINST) in September 2012.
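As an illustrative aside not drawn from the thesis itself, the quoted resolution can be converted to absolute units, assuming the conventional definition R = FWHM/E and a Gaussian photopeak:

```python
# Illustrative conversion of the quoted 0.98% FWHM energy resolution at 511 keV.
# Assumes R = FWHM / E and a Gaussian photopeak; these conventions are not
# stated in the abstract itself.
import math

E = 511.0   # photopeak energy in keV
R = 0.0098  # relative energy resolution (FWHM / E)

fwhm_keV = R * E                                         # absolute FWHM
sigma_keV = fwhm_keV / (2 * math.sqrt(2 * math.log(2)))  # FWHM = 2.355 sigma

print(f"FWHM  ~ {fwhm_keV:.2f} keV")   # ~5.01 keV
print(f"sigma ~ {sigma_keV:.2f} keV")  # ~2.13 keV
```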
Abstract:
Our new simple method for calculating accurate Franck-Condon factors including nondiagonal (i.e., mode-mode) anharmonic coupling is used to simulate the C2H4+ X̃²B3u ← C2H4 X̃¹Ag band in the photoelectron spectrum. An improved vibrational basis set truncation algorithm, which permits very efficient computations, is employed. Because the torsional mode is highly anharmonic, it is separated from the other modes and treated exactly. All other modes are treated through second-order perturbation theory. The perturbation-theory corrections are significant and lead to good agreement with experiment, although the separability assumption for torsion causes the C2D4 results to be not as good as those for C2H4. A variational formulation to overcome this limitation, and to deal with large anharmonicities in general, is suggested.
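For orientation only: in the diagonal harmonic baseline that such methods improve upon, the Franck-Condon factors of a single displaced mode reduce to a Poisson progression in the Huang-Rhys factor S. A minimal sketch, with a hypothetical value of S (this is not the paper's anharmonic method):

```python
# Hedged illustration, NOT the paper's method: Franck-Condon factors for a
# single displaced harmonic mode (equal frequencies) follow a Poisson
# distribution in the Huang-Rhys factor S.  The paper goes beyond this
# baseline by including mode-mode anharmonic coupling.
import math

def fc_factor(n: int, S: float) -> float:
    """|<0|n>|^2 for a displaced harmonic oscillator with Huang-Rhys factor S."""
    return math.exp(-S) * S**n / math.factorial(n)

S = 1.5  # hypothetical Huang-Rhys factor, chosen for illustration only
print([round(fc_factor(n, S), 4) for n in range(6)])  # 0-0, 0-1, ... line intensities
print(sum(fc_factor(n, S) for n in range(50)))        # sums to ~1 (completeness)
```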
Abstract:
Earthquakes occurring around the world each year cause thousands of deaths, millions of dollars in damage to infrastructure, and incalculable human suffering. In recent years, satellite technology has been a significant boon to response efforts following an earthquake and its after-effects by providing mobile communications between response teams and remote sensing of damaged areas to disaster management organizations. In 2007, an international team of students and professionals assembled during the International Space University's Summer Session Program in Beijing, China to examine how satellite and ground-based technology could be better integrated to provide an optimised response in the event of an earthquake. The resulting Technology Resources for Earthquake MOnitoring and Response (TREMOR) proposal describes an integrative prototype response system that will implement mobile satellite communication hubs providing telephone and data links between response teams, onsite telemedicine consultation for emergency first-responders, and satellite navigation systems that will locate and track emergency vehicles and guide search-and-rescue crews. A prototype earthquake simulation system is also proposed, integrating historical data, earthquake precursor data, and local geomatics and infrastructure information to predict the damage that could occur in the event of an earthquake. The backbone of these proposals is a comprehensive education and training program to help individuals, communities and governments prepare in advance. The TREMOR team recommends the coordination of these efforts through a centralised, non-governmental organization.
Abstract:
This paper performs an empirical decomposition of international inequality in Ecological Footprint in order to quantify to what extent explanatory variables such as a country's affluence, economic structure, demographic characteristics, climate and technology contributed to international differences in natural resource consumption during the period 1993-2007. We use a Regression-Based Inequality Decomposition approach. As a result, the methodology qualitatively extends the results obtained in standard environmental impact regressions, as it encompasses further social dimensions of the Sustainable Development concept, i.e. equity within generations. The results obtained point to prioritizing policies that take into account both future and present generations. Keywords: Ecological Footprint Inequality, Regression-Based Inequality Decomposition, Intragenerational equity, Sustainable development.
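A minimal sketch of a regression-based inequality decomposition in the spirit of Fields-type approaches, on simulated data; the variable names, coefficients and data are hypothetical, and the paper's exact specification may differ:

```python
# Sketch: each regressor's contribution to inequality in y is taken as
# cov(beta_k * x_k, y) / var(y); shares (including the residual) sum to 1.
import numpy as np

rng = np.random.default_rng(0)
n = 500
affluence = rng.normal(size=n)  # e.g. log GDP per capita (hypothetical)
climate   = rng.normal(size=n)  # e.g. degree days (hypothetical)
X = np.column_stack([np.ones(n), affluence, climate])
y = 0.8 * affluence + 0.3 * climate + rng.normal(scale=0.5, size=n)  # log footprint

beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS fit

shares = {}
for name, k in [("affluence", 1), ("climate", 2)]:
    shares[name] = np.cov(beta[k] * X[:, k], y)[0, 1] / np.var(y, ddof=1)
shares["residual"] = 1 - sum(shares.values())
print(shares)  # factor shares of inequality in y
```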
Abstract:
Background: With increasing computer power, simulating the dynamics of complex systems in chemistry and biology is becoming increasingly routine. The modelling of individual reactions in (bio)chemical systems involves a large number of random events that can be simulated by the stochastic simulation algorithm (SSA). The key quantity is the step size, or waiting time, τ, whose value inversely depends on the size of the propensities of the different reaction channels and which needs to be re-evaluated after every firing event. Such a discrete event simulation may be extremely expensive, in particular for stiff systems where τ can be very short due to the fast kinetics of some of the reaction channels. Several alternative methods have been put forward to increase the integration step size. The so-called τ-leap approach takes a larger step size by allowing each reaction channel to fire a number of times, drawn from a Poisson or binomial distribution, within that step. Although the expected value for the different species in the reactive system is maintained with respect to more precise methods, the variance at steady state can suffer from large errors as τ grows. Results: In this paper we extend Poisson τ-leap methods to a general class of Runge-Kutta (RK) τ-leap methods. We show that with the proper selection of the coefficients, the variance of the extended τ-leap can be well-behaved, leading to significantly larger step sizes. Conclusions: The benefit of adapting the extended method to the use of RK frameworks is clear in terms of speed of calculation, as the number of evaluations of the Poisson distribution is still one set per time step, as in the original τ-leap method. The approach paves the way to explore new multiscale methods to simulate (bio)chemical systems.
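A minimal sketch of the basic Poisson τ-leap step described above, for a toy reversible isomerization A ⇌ B; the rate constants, step size and initial state are invented for illustration, and the paper's Runge-Kutta extension is not reproduced here:

```python
# Basic Poisson tau-leap for A <-> B: at each step, draw the number of
# firings of each channel from a Poisson distribution with mean a_j * tau.
import numpy as np

rng = np.random.default_rng(1)

def tau_leap(x, c1, c2, tau, n_steps):
    """x = [A, B]; channels A -> B (propensity c1*A) and B -> A (propensity c2*B)."""
    x = np.array(x, dtype=np.int64)
    stoich = np.array([[-1, 1],   # A -> B
                       [1, -1]])  # B -> A
    for _ in range(n_steps):
        a = np.array([c1 * x[0], c2 * x[1]])  # propensities of the two channels
        k = rng.poisson(a * tau)              # firings of each channel in [t, t + tau)
        x = np.maximum(x + k @ stoich, 0)     # update state; crude guard vs. negatives
    return x

print(tau_leap([1000, 0], c1=1.0, c2=0.5, tau=0.01, n_steps=500))  # ~[333, 667]
```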
Abstract:
This paper presents the platform developed in the PANACEA project, a distributed factory that automates the stages involved in the acquisition, production, updating and maintenance of Language Resources required by Machine Translation and other Language Technologies. We adopt a set of tools that have been used successfully in the Bioinformatics field, adapt them to the needs of our field, and use them to deploy web services, which can be combined to build more complex processing chains (workflows). This paper describes the platform and its different components (web services, registry, workflows, social network and interoperability). We demonstrate the scalability of the platform by carrying out a set of massive data experiments. Finally, a validation of the platform against a set of required criteria proves its usability for different types of users (non-technical users and providers).
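A hypothetical sketch of chaining two web services into a workflow in the spirit described above; the endpoint URLs and payload fields are invented for illustration, and the PANACEA services' real interfaces are documented by the project itself:

```python
# Two-step workflow: raw text -> tokenizer service -> POS-tagger service.
import requests

TOKENIZER_URL = "https://example.org/panacea/tokenizer"   # hypothetical endpoint
TAGGER_URL    = "https://example.org/panacea/pos-tagger"  # hypothetical endpoint

def run_workflow(text: str) -> dict:
    """Send text through the two services as a simple processing chain."""
    tokens = requests.post(TOKENIZER_URL, json={"text": text}, timeout=30).json()
    tagged = requests.post(TAGGER_URL, json={"tokens": tokens}, timeout=30).json()
    return tagged

# run_workflow("Machine translation needs language resources.")
```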
Abstract:
Multi-national societies present a complex setting for the politics of immigration, as migration’s linguistic, economic and cultural effects may coincide with existing contestation over nationhood between sub-units and the central state. Empirically, though, political actors only sometimes, and in some places, explicitly connect the politics of immigration to the stakes of multi-level politics. With reference to Canada, Belgium and the United Kingdom, this paper examines the conditions under which political leaders link immigration to ongoing debate about governance in multi-national societies. The paper argues that the distribution of policy competencies in the multi-level system is less important for shaping immigration and integration politics than is the perceived impact (positive or negative) on the sub-unit’s societal culture or its power relationship with the center. Immigration and integration are more often politicized where center and sub-unit hold divergent views on migration and its place in national identity.
Abstract:
In this paper the core functions of an artificial intelligence (AI) for controlling a debris collector robot are designed and implemented. Using the Robot Operating System (ROS) as the base of this work, a multi-agent system with task-planning abilities is built.
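A minimal rospy sketch of what one agent node in such a system might look like; the topic names and message types are assumptions for illustration, not taken from the paper:

```python
# One agent node: subscribes to tasks from a planner, publishes a heartbeat.
import rospy
from std_msgs.msg import String

def on_task(msg: String) -> None:
    rospy.loginfo("agent received task: %s", msg.data)

def main() -> None:
    rospy.init_node("debris_collector_agent")
    rospy.Subscriber("/tasks", String, on_task)        # planner -> agent (assumed topic)
    status = rospy.Publisher("/agent_status", String, queue_size=10)
    rate = rospy.Rate(1)                               # 1 Hz heartbeat
    while not rospy.is_shutdown():
        status.publish(String(data="idle"))
        rate.sleep()

if __name__ == "__main__":
    main()
```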
Abstract:
Political party formation and coalition building in the European Parliament has been a driving force in making governance of the highly pluralistic European Union relatively effective and consensual. In spite of successive enlargements and the very high number of electoral parties obtaining representation in the European Union institutions, the number of effective European Political Groups in the European Parliament has decreased from the first direct election in 1979 to the fifth in 1999. A formal analysis of national parties' voting power in different European party configurations can explain the incentives for national parties to join large European Political Groups instead of forming smaller nationalistic groupings. Empirical evidence shows increasing cohesion of European Political Groups and an increasing role of the European Parliament in EU inter-institutional decision making. As a consequence of this evolution, intergovernmentalism is being replaced with federalizing relations. The analysis can support positive expectations regarding the governability of the European Union after further enlargements, provided that new member states have party systems fitting the European Political Groups.
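As a hedged illustration of the kind of voting-power calculation referred to above, the normalized Banzhaf index of a small weighted-majority game can be computed by enumerating swing voters; the weights and quota below are invented, not actual seat shares of any European Political Group configuration:

```python
# Normalized Banzhaf index: each player's share of all swings, i.e. winning
# coalitions that would lose without that player.
from itertools import combinations

def banzhaf(weights: list[int], quota: int) -> list[float]:
    n = len(weights)
    swings = [0] * n
    for r in range(n + 1):
        for coalition in combinations(range(n), r):
            total = sum(weights[i] for i in coalition)
            for i in coalition:
                if total >= quota and total - weights[i] < quota:
                    swings[i] += 1  # coalition wins with i but loses without i
    s = sum(swings)
    return [x / s for x in swings]

# Four hypothetical groups under simple majority: note that the 30- and
# 20-weight players end up with equal power despite unequal weights.
print(banzhaf([40, 30, 20, 10], quota=51))  # [5/12, 3/12, 3/12, 1/12]
```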
Abstract:
We consider the agency problem of a staff member managing microfinancing programs, who can abuse his discretion to embezzle borrowers' repayments. The fact that most borrowers of microfinancing programs are illiterate and live in rural areas where transportation costs are very high makes staff embezzlement particularly relevant, as documented by MkNelly and Kevane (2002). We study the trade-off between the optimal rigid lending contract and the optimal discretionary one and find that a rigid contract is optimal when the audit cost is larger than the gains from insurance. Our analysis explains the rigid repayment schedules used by the Grameen Bank as an optimal response to the bank staff's agency problem. Joint liability reduces borrowers' burden of respecting the rigid repayment schedules by providing them with partial insurance. However, the same insurance can be provided by borrowers themselves under individual liability through a side-contract.
Abstract:
We present a leverage theory of reputation building with co-branding. We show that under certain conditions, co-branding that links unknown firms in a new sector with established firms in a mature sector allows the unknown firms to signal a high product quality and establish their own reputation. We compare this situation with a benchmark in which both sectors are new and firms signal their quality only with prices. We investigate how this comparison is affected by the nature of the technology linking the two sectors and a cross-sector inference problem that consumers might face in identifying the true cause of product failure. We find that co-branding facilitates the process by which a firm in the new sector signals its product quality only if the co-branding sectors produce complementary inputs and consumers face a cross-sector inference problem. We apply our insight to the economics of superstars, multinational firms and co-authorship.
Abstract:
We develop a general error analysis framework for the Monte Carlo simulation of densities for functionals in Wiener space. We also study variance reduction methods with the help of Malliavin derivatives. For this, we give some general heuristic principles which are applied to diffusion processes. A comparison with kernel density estimates is made.
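A minimal sketch of the kernel-density baseline that such comparisons refer to: simulate a diffusion by Euler-Maruyama and estimate the density of X_T with a Gaussian kernel. The SDE, its coefficients and the bandwidth are illustrative; the paper's Malliavin-based estimators are not reproduced here:

```python
# Euler-Maruyama paths of dX = mu(X) dt + sigma(X) dW, then a Gaussian KDE
# of the terminal density.
import numpy as np

rng = np.random.default_rng(2)

def euler_maruyama(x0, mu, sigma, T, n_steps, n_paths):
    """Return X_T for n_paths independent Euler-Maruyama paths."""
    dt = T / n_steps
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(n_steps):
        dw = rng.normal(scale=np.sqrt(dt), size=n_paths)
        x += mu(x) * dt + sigma(x) * dw
    return x

samples = euler_maruyama(x0=1.0, mu=lambda x: -x, sigma=lambda x: 0.5 + 0 * x,
                         T=1.0, n_steps=200, n_paths=20_000)

def kde(points, samples, h):
    """Gaussian kernel density estimate at the given evaluation points."""
    z = (points[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * z**2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

grid = np.linspace(-1, 2, 7)
print(dict(zip(grid.round(2), kde(grid, samples, h=0.1).round(3))))
```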
Abstract:
The computer code system PENELOPE (version 2008) performs Monte Carlo simulation of coupled electron-photon transport in arbitrary materials for a wide energy range, from a few hundred eV to about 1 GeV. Photon transport is simulated by means of the standard, detailed simulation scheme. Electron and positron histories are generated on the basis of a mixed procedure, which combines detailed simulation of hard events with condensed simulation of soft interactions. A geometry package called PENGEOM permits the generation of random electron-photon showers in material systems consisting of homogeneous bodies limited by quadric surfaces, i.e., planes, spheres, cylinders, etc. This report is intended not only to serve as a manual of the PENELOPE code system, but also to provide the user with the necessary information to understand the details of the Monte Carlo algorithm.
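A toy sketch of the detailed (event-by-event) photon transport idea in a homogeneous slab, for orientation only; this is not PENELOPE code, and the attenuation coefficient, absorption probability and geometry are invented:

```python
# Photons cross a slab: free paths are sampled from an exponential law with
# mean 1/mu_total; at each interaction the photon is either absorbed or
# (crudely) forward-scattered and continues.
import numpy as np

rng = np.random.default_rng(3)

def slab_transmission(mu_total, p_absorb, thickness, n_photons):
    """Fraction of photons crossing the slab; mu_total in 1/cm, thickness in cm."""
    transmitted = 0
    for _ in range(n_photons):
        depth = 0.0
        while True:
            depth += rng.exponential(1.0 / mu_total)  # sampled free path
            if depth >= thickness:
                transmitted += 1                      # photon escapes the slab
                break
            if rng.random() < p_absorb:               # interaction: absorption
                break                                 # history ends here
            # else: forward scatter; history continues from current depth
    return transmitted / n_photons

print(slab_transmission(mu_total=0.2, p_absorb=0.5, thickness=5.0, n_photons=50_000))
```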