Abstract:
This article reviews the literature on the problem of legitimizing regulatory governance, with special attention to the question of the accountability of independent regulatory agencies. The discussion begins with a presentation of the traditional arguments concerning the democratic deficit of the regulatory state. The positive evaluation of regulatory performance by citizens is presented as an alternative source of legitimacy. This is followed by a discussion of the existing approaches to making agencies accountable, so as to ensure the procedural legitimacy of regulatory governance. The last section offers some insights concerning new forms of accountability, namely with reference to the establishment and ongoing consolidation of formal and informal networks of regulators.
Abstract:
The research considers the problem of spatial data classification using machine learning algorithms: probabilistic neural networks (PNN) and support vector machines (SVM). The simple k-nearest-neighbour algorithm is considered as a benchmark model. PNN is a neural-network reformulation of well-known nonparametric principles of probability density modelling, using a kernel density estimator together with Bayesian optimal or maximum a posteriori decision rules. PNN is well suited to problems where not only predictions but also quantification of accuracy and integration of prior information are necessary. An important property of PNNs is that they can easily be used in decision support systems dealing with problems of automatic classification. The support vector machine is an implementation of the principles of statistical learning theory for classification tasks. Recently, SVMs have been successfully applied to a variety of environmental topics: classification of soil types and hydro-geological units, optimization of monitoring networks, and susceptibility mapping of natural hazards. In the present paper both simulated and real data case studies (low- and high-dimensional) are considered. Particular attention is paid to the detection and learning of spatial patterns by the applied algorithms.
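As a rough illustration of the PNN idea described above (a Parzen kernel density per class combined with a Bayes decision rule), here is a minimal numpy sketch. It is not the authors' implementation; the kernel width, priors, and the toy two-blob data set are hypothetical choices.

```python
import numpy as np

def pnn_classify(X_train, y_train, X_test, sigma=0.5, priors=None):
    """Probabilistic neural network sketch: Gaussian kernel density
    estimate per class, combined with a Bayes decision rule."""
    classes = np.unique(y_train)
    if priors is None:  # default to empirical class frequencies
        priors = {c: np.mean(y_train == c) for c in classes}
    scores = np.empty((len(X_test), len(classes)))
    for j, c in enumerate(classes):
        Xc = X_train[y_train == c]
        # squared Euclidean distances, shape (n_test, n_c)
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
        # class-conditional Parzen density at each test point
        dens = np.exp(-d2 / (2 * sigma**2)).mean(axis=1)
        scores[:, j] = priors[c] * dens
    return classes[np.argmax(scores, axis=1)]

# Two well-separated Gaussian blobs as a toy "spatial" data set
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(2, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
pred = pnn_classify(X, y, np.array([[0.0, 0.0], [2.0, 2.0]]))
```

Because the class scores are proper (unnormalized) posterior estimates, the same quantities also give the accuracy quantification mentioned in the abstract: normalizing `scores` row-wise yields class membership probabilities.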
Abstract:
Fluvial deposits are a challenge for modelling flow in sub-surface reservoirs. The connectivity and continuity of permeable bodies have a major impact on fluid flow in porous media. Contemporary object-based and multipoint statistics methods face the problem of robustly representing connected structures. An alternative approach to modelling petrophysical properties is based on a machine learning algorithm: Support Vector Regression (SVR). Semi-supervised SVR is able to establish spatial connectivity, taking into account prior knowledge of natural similarities. As a learning algorithm, SVR is robust to noise and captures dependencies from all available data. Semi-supervised SVR applied to a synthetic fluvial reservoir demonstrated robust results that match the flow performance well.
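The core of SVR is the epsilon-insensitive loss, which ignores residuals smaller than a tolerance band. The following numpy sketch trains a plain linear SVR by stochastic subgradient descent; it illustrates only that loss, not the kernelized, semi-supervised variant used in the study, and all parameter values (C, eps, learning rate, the y = 2x toy data) are illustrative assumptions.

```python
import numpy as np

def linear_svr(X, y, C=1.0, eps=0.1, lr=0.01, n_iter=2000):
    """Linear epsilon-insensitive SVR by stochastic subgradient descent
    (a minimal sketch; practical work would use a kernelized solver)."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    rng = np.random.default_rng(0)
    for _ in range(n_iter):
        i = rng.integers(n)
        r = X[i] @ w + b - y[i]                    # signed residual
        g = 0.0 if abs(r) <= eps else np.sign(r)   # loss subgradient
        w -= lr * (w / (C * n) + g * X[i])         # regularizer + loss
        b -= lr * g
    return w, b

# Fit y = 2x with a little noise
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (200, 1))
y = 2 * X[:, 0] + rng.normal(0, 0.05, 200)
w, b = linear_svr(X, y)
```

Points whose residuals stay inside the eps-tube contribute no gradient, which is what makes SVR robust to noise, as the abstract notes.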
Abstract:
Introduction. The management of patients with large burns has improved significantly in recent decades. In particular, autologous cultured keratinocytes (CEAs) overcame the problem of limited donor sites in severely burned patients. Several studies testing CEAs in burn centers report mixed results on the general outcomes of burn patients. Methods. Publications from 1989 until 2011 with a minimum of 15 patients per study using CEAs for the management of severe burn injury were identified using online databases, including Medline and PubMed, and the archives of the medical library of the CHUV in Lausanne. Results. 18 studies with a total of 977 patients were included in this review. Most of the studies did not specify whether CEAs were grafted alone or in combination with split-thickness skin grafts (STSG), although most of the patients seemed to have received both methods in the reviewed studies. The mean TBSA per study ranged from 33% to 78% in patients grafted with CEAs. No common minimum TBSA making a patient eligible for CEA grafting could be found. The definition of the "take rate" is not standardized, and it varied widely, from 26% to 73%. Mortality and hospitalization time could not be shown to correlate with CEA use in all of the studies. As a late complication, some authors described the fragility of CEA-regenerated skin. Conclusion. Since the healing of patients with large burns demands a variety of surgical and non-surgical treatment strategies, and the final outcome depends mainly on the burned surface as well as the general health condition of the patient, no definitive conclusion could be drawn from the use of CEAs in the reviewed studies. From our own experience, we know that selected patients benefit significantly from CEA grafts, although cost efficiency or a reduction of mortality could not be demonstrated in these particular cases.
Abstract:
The problem of freeze-out (FO) in relativistic heavy-ion reactions is addressed. We develop and analyze an idealized one-dimensional model of FO in a finite layer, based on the covariant FO probability. The resulting post FO phase-space distributions are discussed for different FO probabilities and layer thicknesses.
Abstract:
We discuss reality conditions and the relation between spacetime diffeomorphisms and gauge transformations in Ashtekar's complex formulation of general relativity. We produce a general theoretical framework for the stabilization algorithm for the reality conditions, which is different from Dirac's method of stabilization of constraints. We solve the problem of the projectability of the diffeomorphism transformations from configuration-velocity space to phase space, linking them to the reality conditions. We construct the complete set of canonical generators of the gauge group in the phase space, which includes all the gauge variables. This result proves that the canonical formalism has all the gauge structure of the Lagrangian theory, including the time diffeomorphisms.
Abstract:
The proportion of the population living in or around cities is higher than ever. Urban sprawl and car dependence have taken over from the pedestrian-friendly compact city. Environmental problems like air pollution, land waste or noise, as well as health problems, are the result of this still-continuing process. Urban planners have to find solutions to these complex problems and at the same time ensure the economic performance of the city and its surroundings. Meanwhile, an increasing quantity of socio-economic and environmental data is being acquired. In order to get a better understanding of the processes and phenomena taking place in the complex urban environment, these data should be analysed. Numerous methods for modelling and simulating such a system exist, are still under development, and can be exploited by urban geographers to improve our understanding of the urban metabolism. Modern and innovative visualisation techniques help in communicating the results of such models and simulations. This thesis covers several methods for the analysis, modelling, simulation and visualisation of problems related to urban geography. The analysis of high-dimensional socio-economic data using artificial neural network techniques, especially self-organising maps, is shown using two examples at different scales. The problem of spatio-temporal modelling and data representation is treated and some possible solutions are shown. The simulation of urban dynamics, and more specifically of the traffic due to commuting to work, is illustrated using multi-agent micro-simulation techniques. A section on visualisation methods presents cartograms for transforming the geographic space into a feature space, and the distance circle map, a centre-based map representation particularly useful for urban agglomerations. Some issues concerning the importance of scale in urban analysis and the clustering of urban phenomena are also discussed.
A new approach to defining urban areas at different scales is developed, and the link with percolation theory is established. Fractal statistics, especially the lacunarity measure, and scale laws are used to characterise urban clusters. In a last section, the population evolution is modelled using a model close to the well-established gravity model. The work covers a wide range of methods useful in urban geography. These methods should be developed further and, at the same time, find their way into the daily work and decision processes of urban planners.
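For the self-organising maps mentioned above as a tool for analysing high-dimensional socio-economic data, a minimal Kohonen-style training loop can be sketched in numpy. The grid size, decay schedules, and toy two-cluster data are illustrative assumptions, not the thesis configuration.

```python
import numpy as np

def train_som(X, grid=(5, 5), n_iter=500, lr0=0.5, sigma0=2.0, rng=None):
    """Minimal self-organising map: each step finds the best-matching
    unit for a random sample and pulls neighbouring units toward it."""
    rng = rng or np.random.default_rng(0)
    gx, gy = np.meshgrid(np.arange(grid[0]), np.arange(grid[1]), indexing="ij")
    coords = np.stack([gx.ravel(), gy.ravel()], axis=1)      # unit grid positions
    W = rng.normal(0, 0.1, (grid[0] * grid[1], X.shape[1]))  # codebook vectors
    for t in range(n_iter):
        x = X[rng.integers(len(X))]
        bmu = np.argmin(((W - x) ** 2).sum(1))               # best-matching unit
        frac = t / n_iter
        lr = lr0 * (1 - frac)                                # decaying learning rate
        sigma = sigma0 * (1 - frac) + 0.5                    # shrinking neighbourhood
        d2 = ((coords - coords[bmu]) ** 2).sum(1)
        h = np.exp(-d2 / (2 * sigma**2))                     # neighbourhood weights
        W += lr * h[:, None] * (x - W)
    return W

# Toy socio-economic-style data: two clusters in 3 dimensions
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.2, (100, 3)), rng.normal(3, 0.2, (100, 3))])
W = train_som(X)
```

After training, units on the 2-D grid cover both clusters, which is how a SOM projects high-dimensional data onto a low-dimensional map for visual analysis.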
Abstract:
We study the problem of front propagation in the presence of inertia. We extend the analytical approach for the overdamped problem to this case, and present numerical results to support our theoretical predictions. Specifically, we conclude that the velocity and shape selection problem can still be described in terms of the metastable, nonlinear, and linear overdamped regimes. We study the characteristic relaxation dynamics of these three regimes, and the existence of degenerate ("quenched") solutions.
Abstract:
We study the problem of the partition of a system of initial size V into a sequence of fragments s1, s2, s3, … By assuming a scaling hypothesis for the probability p(s; V) of obtaining a fragment of a given size, we deduce that the final distribution of fragment sizes exhibits power-law behavior. This minimal model is useful for understanding the distribution of avalanche sizes in first-order phase transitions at low temperatures.
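A minimal fragmentation model of this kind can be simulated directly. The sketch below recursively splits a piece of size V; the break probability, the size cutoff `s_min`, and the uniform splitting rule are hypothetical illustrative choices, not the paper's exact model.

```python
import numpy as np

def fragment(V, p_break=0.7, s_min=1e-3, rng=None):
    """Toy recursive fragmentation: a piece breaks at a uniformly
    random point with probability p_break, otherwise it freezes.
    Pieces below s_min always freeze, which bounds the recursion."""
    rng = rng or np.random.default_rng(0)
    frags, stack = [], [V]
    while stack:
        s = stack.pop()
        if s > s_min and rng.random() < p_break:
            x = rng.random() * s
            stack += [x, s - x]       # the two sub-fragments
        else:
            frags.append(s)
    return frags

frags = fragment(1.0)
```

A histogram of `frags` on logarithmic bins shows the heavy-tailed size distribution; the simulation also conserves total mass, since every split replaces s by x and s - x.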
Abstract:
The paper deals with the development and application of a generic methodology for the automatic processing (mapping and classification) of environmental data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool to solve the problem of spatial data mapping (regression). The Probabilistic Neural Network (PNN) is considered as an automatic tool for spatial classification. The automatic tuning of isotropic and anisotropic GRNN/PNN models using a cross-validation procedure is presented. Results are compared with the k-Nearest-Neighbours (k-NN) interpolation algorithm using an independent validation data set. Real case studies are based on the decision-oriented mapping and classification of radioactively contaminated territories.
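A GRNN is, in essence, Nadaraya-Watson kernel regression, and the automatic tuning mentioned above can be sketched with a leave-one-out error over a grid of kernel widths. This is a minimal numpy sketch under that interpretation; the toy 1-D "spatial" data and the candidate sigma grid are hypothetical.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_test, sigma=0.1):
    """GRNN prediction: Gaussian-kernel-weighted average of training
    targets (Nadaraya-Watson kernel regression)."""
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma**2))
    return (w @ y_train) / w.sum(axis=1)

def loo_mse(X, y, sigma):
    """Leave-one-out error used to tune the kernel width sigma."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma**2))
    np.fill_diagonal(w, 0.0)          # exclude each point from its own fit
    pred = (w @ y) / w.sum(axis=1)
    return np.mean((pred - y) ** 2)

# 1-D toy field: pick the sigma with the lowest leave-one-out error
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (100, 1))
y = np.sin(2 * np.pi * X[:, 0]) + rng.normal(0, 0.1, 100)
best = min([0.01, 0.05, 0.1, 0.3], key=lambda s: loo_mse(X, y, s))
pred = grnn_predict(X, y, np.array([[0.25]]), sigma=0.05)
```

Anisotropic tuning would replace the single sigma by one width per coordinate; the leave-one-out machinery stays the same.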
Abstract:
The integral representation of the electromagnetic two-form, defined on Minkowski space-time, is studied from a new point of view. The aim of the paper is to obtain an invariant criterion for defining the radiative field. This criterion generalizes the well-known structureless charge case. We begin with the curvature two-form, because its field equations incorporate the motion of the sources. The gauge theory methods (connection one-forms) are not suited to this purpose because their field equations do not incorporate the motion of the sources. We obtain an integral solution of the Maxwell equations in the case of a flow of charges in irrotational motion. This solution leads us to propose a new method for solving the problem of the nature of the retarded radiative field. This method is based on a projection tensor operator which, being local, is suited to implementation in general relativity. We propose field equations for the pair {electromagnetic field, projection tensor}. These field equations form an algebraic differential first-order system of one-forms which automatically satisfies the integrability conditions.
Abstract:
The most important features of the proposed spherical gravitational wave detectors are closely linked to their symmetry. Hollow spheres share this property with the solid ones considered in the literature so far, and constitute an interesting alternative for the realization of an omnidirectional gravitational wave detector. In this paper we address the problem of how a hollow elastic sphere interacts with an incoming gravitational wave, and we find an analytical solution for its normal mode spectrum and response, as well as for its energy absorption cross sections. It appears that this shape can be designed to have relatively low resonance frequencies (~200 Hz) while keeping a large cross section, so its frequency range overlaps with that of the projected large interferometers. We also apply the obtained results to discuss the performance of a hollow sphere as a detector for a variety of gravitational wave signals.
Abstract:
The tunneling approach to the wave function of the Universe has been recently criticized by Bousso and Hawking who claim that it predicts a catastrophic instability of de Sitter space with respect to pair production of black holes. We show that this claim is unfounded. First, we argue that different horizon size regions in de Sitter space cannot be treated as independently created, as they contend. And second, the WKB tunneling wave function is not simply the inverse of the Hartle-Hawking one, except in very special cases. Applied to the related problem of pair production of massive particles, we argue that the tunneling wave function leads to a small constant production rate, and not to a catastrophe as the argument of Bousso and Hawking would suggest.
Abstract:
We propose a generalization of the persistent random walk for dimensions greater than 1. Based on a cubic lattice, the model is suitable for an arbitrary dimension d. We study the continuum limit and obtain the equation satisfied by the probability density function of the position of the random walker. An exact solution is obtained for the projected motion along an axis. This solution, which is written in terms of the free-space solution of the one-dimensional telegrapher's equation, may open a new way to address the problem of light propagation through thin slabs.
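A persistent random walk of this kind can be simulated directly on the cubic lattice: at each step the walker keeps its current direction with some probability and otherwise redraws it among the 2d lattice directions. This is a generic sketch of that class of walks, not necessarily the authors' exact update rule; the persistence probability and step count are arbitrary illustrative values.

```python
import numpy as np

def persistent_walk(d=2, steps=10000, p_keep=0.8, rng=None):
    """Persistent random walk on a d-dimensional cubic lattice:
    keep the current direction with probability p_keep, otherwise
    pick one of the 2d unit directions uniformly at random."""
    rng = rng or np.random.default_rng(0)
    dirs = np.vstack([np.eye(d, dtype=int), -np.eye(d, dtype=int)])
    pos = np.zeros(d, dtype=int)
    k = rng.integers(2 * d)            # initial direction index
    for _ in range(steps):
        if rng.random() >= p_keep:     # lose persistence: redraw direction
            k = rng.integers(2 * d)
        pos += dirs[k]
    return pos

pos = persistent_walk()
```

With p_keep = 1/(2d) every step is independent and the ordinary lattice walk is recovered; increasing p_keep introduces the inertia (ballistic short-time behaviour) that leads to telegrapher-type equations in the continuum limit.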
Abstract:
We present an imaginary-time path-integral study of the problem of the quantum decay of a metastable state of a uniaxial magnetic particle placed in a magnetic field at an arbitrary angle. Our findings agree with earlier results of Zaslavskii obtained by mapping the spin Hamiltonian onto a particle Hamiltonian. In the limit of a low barrier, a weak dependence of the decay rate on the angle is found, except when the field is almost normal to the anisotropy axis, where the rate is sharply peaked, and when the field approaches the parallel orientation, where the rate rapidly goes to zero. This distinct angular dependence, together with the dependence of the rate on the field strength, provides an independent test for macroscopic spin tunneling.