466 results for Parameterized polygons
Abstract:
The service quality of any sector has two major aspects, namely technical and functional. Technical quality is attained by maintaining the technical specifications set by the organization, while functional quality refers to the manner in which the service is delivered to the customer, which can be assessed through customer feedback. A field survey was conducted based on the SERVQUAL management tool, with 28 constructs designed under 7 dimensions of service quality. Stratified sampling yielded 336 valid responses, and the gap scores between expectations and perceptions were analyzed using statistical techniques to identify the weakest dimension. To assess the technical aspect of availability, six months of live outage data from base transceiver stations were collected. Statistical and exploratory techniques were used to model network performance. Failure patterns were modeled as competing-risk models, and the probability distributions of service outage and restoration were parameterized. Since network availability is a function of the reliability and maintainability of the network elements, any service provider who wishes to keep up service-level agreements on availability should be aware of the variability of these elements and of their interaction effects. Availability variations were studied by designing a discrete-event simulation model with probabilistic input parameters. The distribution parameters derived from the live-data analysis were used to design experiments that define the availability domain of the network under consideration. This availability domain can serve as a reference for planning and implementing maintenance activities. A new metric is proposed that incorporates a consistency index along with key service parameters and can be used to compare the performance of different service providers. The developed tool can be used for reliability analysis of mobile communication systems and assumes greater significance in the wake of mobile number portability. It also provides a relative measure of the effectiveness of different service providers.
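As a rough illustration of the kind of discrete-event availability simulation described above (a minimal sketch, not the thesis's model): alternate sampled time-to-failure and time-to-restore periods and estimate availability as uptime over total time. The Weibull and lognormal choices and all parameter values below are placeholders of my own.

```python
import random

def simulate_availability(horizon_h=24 * 180, runs=1000):
    """Monte Carlo availability: alternate up (time-to-failure) and
    down (time-to-restore) periods until the horizon is reached."""
    estimates = []
    for _ in range(runs):
        t, up = 0.0, 0.0
        while t < horizon_h:
            ttf = random.weibullvariate(500.0, 1.2)  # placeholder failure model
            ttr = random.lognormvariate(1.0, 0.8)    # placeholder restore model
            up += min(ttf, horizon_h - t)            # count only in-horizon uptime
            t += ttf + ttr
        estimates.append(up / horizon_h)
    return sum(estimates) / len(estimates)

print(f"estimated availability: {simulate_availability():.4f}")
```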
Abstract:
Land use has become a force of global importance, considering that 34% of the Earth's ice-free surface was covered by croplands or pastures in 2000. The expected increase in global human population, together with imminent climate change and the associated search for energy sources other than fossil fuels, can, through land-use and land-cover changes (LUCC), increase the pressure on nature's resources, further degrade ecosystem services, and disrupt other planetary systems of key importance to humanity. This thesis presents four modeling studies on the interplay between LUCC, increased production of biofuels, and climate change in four selected world regions. In the first case study, two new crop types (sugarcane and jatropha) are parameterized in the LPJ for managed Lands (LPJmL) dynamic global vegetation model to calculate their potential productivity. Country-wide spatial variation in the yields of sugarcane and jatropha translates into substantially different land requirements to meet the biofuel production targets for 2015 in Brazil and India, depending on the location of plantations. In particular, the average land requirement for jatropha in India is considerably higher than previously estimated. These findings indicate that crop zoning is important to avoid excessive LUCC. In the second case study, the LandSHIFT model of land-use and land-cover changes is combined with life cycle assessments to investigate the occurrence and extent of biofuel-driven indirect land-use changes (ILUC) in Brazil by 2020. The results show that Brazilian biofuels can indeed cause considerable ILUC, especially by pushing the rangeland frontier into the Amazonian forests. The carbon debt caused by such ILUC would result in no carbon savings (from using plant-based ethanol and biodiesel instead of fossil fuels) for 44 years for sugarcane ethanol and 246 years for soybean biodiesel. The intensification of livestock grazing could avoid such ILUC. We argue that such livestock intensification should be supported by the Brazilian biofuel sector, given the sector's own interest in minimizing carbon emissions. The third study develops a new method for crop allocation in LandSHIFT that accounts for the occurrence and capacity of specific infrastructure units. The method is applied in a first assessment of the potential availability of land for biogas production in Germany. The results indicate that Germany has enough land to fulfill virtually all (90 to 98%) of its current biogas plant capacity with cultivated feedstocks alone. Biogas plants located in southern and southwestern (northern and northeastern) Germany might face more (fewer) difficulties in fulfilling their capacities with cultivated feedstocks, considering that feedstock transport distance to plants is a crucial issue for biogas production. In the fourth study, an adapted version of LandSHIFT is used to assess the impacts of contrasting scenarios of climate change and conservation targets on land use in the Brazilian Amazon. Model results show that severe climate change can, in some regions, shift the deforestation frontier by 2050 to areas that would experience low levels of human intervention under mild climate change (such as the western Amazon forests or parts of the Cerrado savannas). Halting deforestation of the Amazon and of the Brazilian Cerrado would require either a reduction in meat production or an intensification of livestock grazing in the region. Such findings point out the need for an integrated, multidisciplinary plan for adaptation to climate change in the Amazon. The overall conclusions of this thesis are that (i) biofuels must be analyzed and planned carefully in order to effectively reduce carbon emissions; (ii) climate change can have considerable impacts on the location and extent of LUCC; and (iii) intensification of livestock grazing represents a promising avenue for minimizing the impacts of future land-use and land-cover changes in Brazil.
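The payback times quoted above follow the standard carbon-debt accounting used in the ILUC literature: the one-off emissions from land conversion are divided by the annual emission savings from substituting the biofuel for its fossil counterpart (generic notation, not necessarily the thesis's own):

```latex
t_{\text{payback}} \;=\; \frac{C_{\text{debt}}}{S_{\text{annual}}}
```

A 44-year payback for sugarcane ethanol thus means the ILUC-induced carbon debt is roughly 44 times the annual savings from displacing fossil fuel.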
Abstract:
This thesis deals with the question of how, within a family of abelian t-modules, the subfamily of uniformizable t-modules can be described. Abelian t-modules are higher-dimensional generalizations of Drinfeld modules over algebraic function fields. It is well known that Drinfeld modules in generic characteristic can be parameterized by analytic tori. This fact, however, carries over only to some t-modules, which are called uniformizable. The situation is somewhat analogous to the theory of elliptic curves, tori, and abelian varieties over the complex numbers. To decide whether a t-module is uniformizable in this sense, one applies a criterion of Anderson concerning the rigid-analytic triviality of the associated t-motives. We apply this criterion to a family of two-dimensional t-modules of rank four that depend on coefficients a, b, c, d, and arrive at the equivalent question of whether certain recursively defined sequences converge. The convergence behavior of these sequences can be studied conveniently by means of Newton polygons. This approach finally yields simply stated conditions on the coefficients a, b, c, d that either guarantee uniformizability or rule it out.
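For orientation, the convergence questions here are non-archimedean, where Newton polygons are the standard tool; the following reminder states only the general facts, not the thesis's specific recursions. In a complete non-archimedean valued field with valuation $v$, convergence is controlled by valuations, and the Newton polygon of a power series records exactly how the coefficient valuations grow, through its vertices and slopes:

```latex
x_n \to 0 \;\Longleftrightarrow\; v(x_n) \to \infty,
\qquad
\operatorname{NP}(f) = \text{lower convex hull of } \{(n, v(a_n)) : a_n \neq 0\}
\quad\text{for } f(z) = \sum_{n \ge 0} a_n z^n .
```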
Abstract:
We present a set of techniques that can be used to represent and detect shapes in images. Our methods revolve around a particular shape representation based on the description of objects using triangulated polygons. This representation is similar to the medial axis transform and has important properties from a computational perspective. The first problem we consider is the detection of non-rigid objects in images using deformable models. We present an efficient algorithm to solve this problem in a wide range of situations, and show examples in both natural and medical images. We also consider the problem of learning an accurate non-rigid shape model for a class of objects from examples. We show how to learn good models while constraining them to the form required by the detection algorithm. Finally, we consider the problem of low-level image segmentation and grouping. We describe a stochastic grammar that generates arbitrary triangulated polygons while capturing Gestalt principles of shape regularity. This grammar is used as a prior model over random shapes in a low level algorithm that detects objects in images.
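To make the central representation concrete: a triangulated polygon is a simple polygon together with a triangulation of its interior, whose dual graph is a tree, which is the property relating it to the medial axis transform. The following ear-clipping sketch (illustrative only; the thesis does not prescribe this particular construction) produces such a triangulation for a simple counterclockwise polygon:

```python
def cross(o, a, b):
    """Twice the signed area of triangle (o, a, b)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def point_in_triangle(p, a, b, c):
    d1, d2, d3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
    neg = d1 < 0 or d2 < 0 or d3 < 0
    pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (neg and pos)

def triangulate(poly):
    """Ear clipping for a simple CCW polygon; returns index triples
    whose dual graph is a tree, i.e. a triangulated polygon."""
    idx = list(range(len(poly)))
    tris = []
    while len(idx) > 3:
        for k in range(len(idx)):
            i, j, l = idx[k - 1], idx[k], idx[(k + 1) % len(idx)]
            a, b, c = poly[i], poly[j], poly[l]
            if cross(a, b, c) <= 0:          # reflex corner: not an ear
                continue
            if any(point_in_triangle(poly[m], a, b, c)
                   for m in idx if m not in (i, j, l)):
                continue                     # another vertex inside: not an ear
            tris.append((i, j, l))
            del idx[k]                       # clip the ear
            break
        else:
            raise ValueError("polygon must be simple and CCW")
    tris.append(tuple(idx))
    return tris

print(triangulate([(0, 0), (4, 0), (4, 3), (2, 1), (0, 3)]))
```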
Abstract:
We present an overview of current research on artificial neural networks, emphasizing a statistical perspective. We view neural networks as parameterized graphs that make probabilistic assumptions about data, and view learning algorithms as methods for finding parameter values that look probable in the light of the data. We discuss basic issues in representation and learning, and treat some of the practical issues that arise in fitting networks to data. We also discuss links between neural networks and the general formalism of graphical models.
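The statistical reading is easy to make concrete: the simplest such "parameterized graph" for binary outputs is a one-layer network, i.e. a Bernoulli model $P(y=1 \mid x, w) = \sigma(w^\top x)$, and finding parameter values that look probable in the light of the data reduces to maximizing the log-likelihood. A minimal sketch on synthetic data (not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                     # inputs
w_true = np.array([1.5, -2.0, 0.5])
y = (rng.random(200) < 1 / (1 + np.exp(-X @ w_true))).astype(float)

w = np.zeros(3)
for _ in range(500):                              # gradient ascent on the
    p = 1 / (1 + np.exp(-X @ w))                  # Bernoulli log-likelihood
    w += 0.1 * X.T @ (y - p) / len(y)
print(w)                                          # close to w_true
```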
Abstract:
This paper considers the problem of language change. Linguists must explain not only how languages are learned but also how and why they have evolved along certain trajectories and not others. While the language learning problem has focused on the behavior of individuals and how they acquire a particular grammar from a class of grammars $\mathcal{G}$, here we consider a population of such learners and investigate the emergent, global population characteristics of linguistic communities over several generations. We argue that language change follows logically from specific assumptions about grammatical theories and learning paradigms. In particular, we are able to transform parameterized theories and memoryless acquisition algorithms into grammatical dynamical systems, whose evolution depicts a population's evolving linguistic composition. We investigate the linguistic and computational consequences of this model, showing that the formalization allows one to ask questions about diachronic change that one otherwise could not, such as the effect of varying initial conditions on the resulting diachronic trajectories. From a more programmatic perspective, we give an example of how the dynamical system model for language change can serve as a way to distinguish among alternative grammatical theories, introducing a formal diachronic adequacy criterion for linguistic theories.
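A toy version of such a grammatical dynamical system, under assumptions of my own (two grammars, binomial sampling, majority-based memoryless learning with noise; not the paper's acquisition algorithms): the population state is the fraction p of speakers of grammar G1, and one generation of learning induces a map p → f(p) whose iterates trace the diachronic trajectory.

```python
import math

def f(p, n=5, eps=0.05):
    """Fraction of next-generation learners acquiring G1 when each child
    hears n sentences and adopts the majority grammar, with eps noise."""
    maj = sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
              for k in range(n // 2 + 1, n + 1))
    return (1 - eps) * maj + eps * (1 - maj)

p = 0.4
for generation in range(30):          # iterate the dynamical system
    p = f(p)
print(f"long-run share of G1: {p:.3f}")
```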
Abstract:
This thesis arises as an opportunity to create an improvement tool for companies to control their inventories in the most suitable way. Because of price disorder in the market, unplanned promotions, and the confrontation of optimistic versus conservative forecasts, a large volume of returns occurs, deteriorating the receivables portfolio and directly affecting companies' strategic goals. Given this clear improvement opportunity, the decision was made to evaluate which forecasting model yields the most accurate values for demand planning. In addition, the best inventory model, with its corresponding control indicators, was analyzed. The result is a parameterized Excel tool that produces more accurate sales forecasts and optimizes inventory management. The tool implements a continuous-review inventory management model, which provides more accurate information about the demand the company faces, the sales it can generate, and the processes it needs to plan to support its activity.
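In a continuous-review (s, Q) policy of the kind the abstract mentions, the reorder point is typically set from expected lead-time demand plus safety stock; a minimal sketch with made-up numbers (not the thesis's actual spreadsheet logic):

```python
import math

def reorder_point(daily_demand, sigma_daily, lead_time_days, z=1.65):
    """Continuous review: reorder when stock hits expected lead-time
    demand plus z-sigma safety stock (z=1.65 ~ 95% service level)."""
    lead_demand = daily_demand * lead_time_days
    safety = z * sigma_daily * math.sqrt(lead_time_days)
    return lead_demand + safety

print(reorder_point(daily_demand=120, sigma_daily=30, lead_time_days=7))
```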
Abstract:
This dissertation studies the transmission mechanisms linking the behavior of agents and firms to the asymmetries present in business cycles. To do so, three DSGE models were built. In the first chapter, the assumption of a symmetric quadratic investment-adjustment function is removed, and the canonical RBC model is reformulated under the assumption that disinvesting one unit of physical capital is more costly than investing it. The second chapter presents the dissertation's most important contribution: the construction of a general utility function that nests loss aversion, risk aversion, and habit formation by means of a smooth transition function. The rationale is that individuals are loss-averse in recessions and risk-averse in booms. In the third chapter, business-cycle asymmetries are analyzed together with asymmetric price and wage adjustment in a New Keynesian setting, in order to find a theoretical explanation for the well-documented asymmetry of the Phillips curve.
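A generic way to nest two preference regimes with a smooth transition function, in the spirit of the second chapter (the functional forms below are illustrative placeholders of my own, not the dissertation's specification): a logistic weight $G$ moves between a loss-averse and a risk-averse evaluation of consumption relative to habit.

```latex
U_t = \bigl(1 - G(x_t)\bigr)\, u_{\mathrm{loss}}(c_t - h_t)
    + G(x_t)\, u_{\mathrm{risk}}(c_t - h_t),
\qquad
G(x_t) = \frac{1}{1 + e^{-\gamma x_t}},
```

where $h_t$ is the habit stock, $x_t$ is a cyclical indicator such as consumption growth, and $\gamma > 0$ controls how smoothly preferences move between the loss-averse and risk-averse regimes.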
Abstract:
In this paper, investment-cost asymmetry is introduced in order to test whether this kind of asymmetry can account for asymmetries in business cycles. Using a smooth transition function, asymmetric investment cost is modeled and introduced in a canonical RBC model. Simulations of the model with the Perturbation Method (PM) are very close to simulations through the Parameterized Expectations Algorithm (PEA), which supports using the former to reduce computation time and cost. Both the symmetric and asymmetric models were simulated and compared. Deterministic and stochastic impulse-response exercises revealed that asymmetric business cycles can be adequately reproduced by modeling asymmetric investment costs. Simulations also showed that higher-order moments are insufficient to detect asymmetries. Instead, methods such as Generalized Impulse Response Analysis (GIRA) and nonlinear econometrics prove to be more efficient diagnostic tools.
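One common way to write such a smooth-transition asymmetric cost (my own illustrative parameterization, not necessarily the paper's): a logistic weight shifts the adjustment-cost coefficient between an "investing" regime and a more expensive "disinvesting" regime.

```python
import numpy as np

def adjustment_cost(i, phi_up=1.0, phi_down=4.0, gamma=50.0):
    """Smooth-transition quadratic cost: the coefficient moves from phi_up
    (i > 0) to phi_down (i < 0) via a logistic weight in net investment i."""
    g = 1.0 / (1.0 + np.exp(-gamma * i))   # ~1 when investing, ~0 when disinvesting
    phi = g * phi_up + (1.0 - g) * phi_down
    return 0.5 * phi * i**2

for i in (-0.1, -0.01, 0.01, 0.1):
    print(f"i={i:+.2f}  cost={adjustment_cost(i):.5f}")
```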
Abstract:
For managers, decision-making is one of the greatest challenges and responsibilities, since it requires choosing the most suitable path among countless alternatives while taking into account the social, political, and economic obstacles of the business environment. Reaching the right decision requires keeping sight of the proposed objectives and goals, as well as following a logical process: detecting, analyzing, and demonstrating the reasons behind the choice. Accordingly, the analysis proposed in this research contributes knowledge about the types of logic used in strategic decision-making, helping managers satisfy the demands associated with marketing and thereby efficiently develop and broaden the competencies they need for international insertion into an ever-growing labor market (Valero, 2011). The research develops a theoretical study to explain the relationship between logic and strategic marketing decisions and how these concepts combine to reach a final result. This is carried out through an analysis of marketing plans, starting from basic concepts such as marketing, logic, strategic decisions, and marketing management, followed by the logical principles and contradictions that may arise within the theoretical foundation.
Abstract:
Each award-winning work is described in detail in the following records: 'Emigración-Inmigración. La búsqueda de un futuro más allá de las fronteras' 00920123018018, 'Fabulario íntimo de Mérida. Itinerario de poetas a pie de calle' 00920123018022, 'Aprender a convivir' 00920123018031, 'Sé capaz, emprende. Trabaja tus capacidades y prepárate para el futuro' 00920123018041, 'El juego de la cigüeña' 00920123018058, 'Our Grease' 00920123018060, 'Unidad didáctica interactiva: los mecanismos' 00920123018063, 'Polygons. Materiales didácticos digitales para secciones bilingües' 00920123018064
Abstract:
Describes a method to code a decimated model of an isosurface in an octree representation while maintaining volume data when needed. The proposed technique is based on grouping the marching cubes (MC) patterns into five configurations according to the topology and the number of surface planes contained in a cell. Moreover, the discrete number of planes on which the surface lies is fixed. Starting from a complete volume octree, with the isosurface codified at terminal nodes according to the new configurations, a bottom-up strategy is taken for merging cells. Such a strategy allows one to implicitly represent coplanar faces in the upper octree levels without introducing any error. At the end of this merging process, when required, a reconstruction strategy is applied to generate the surface contained in the octree's intersected leaves. Examples with medical data demonstrate that a reduction of up to 50% in the number of polygons can be achieved.
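The bottom-up merging step can be pictured with a toy octree: if all eight children of a node are leaves carrying the same coplanar configuration, they collapse into the parent without error. A schematic sketch under simplified assumptions of my own, with configurations reduced to plain plane labels:

```python
class Node:
    def __init__(self, config=None, children=None):
        self.config = config      # plane label at leaves, None for internal nodes
        self.children = children  # list of 8 children, or None for leaves

def merge_up(node):
    """Bottom-up pass: collapse a node whose 8 children are leaves
    carrying the same coplanar configuration (introduces no error)."""
    if node.children is None:
        return node
    node.children = [merge_up(c) for c in node.children]
    configs = {c.config for c in node.children}
    if all(c.children is None for c in node.children) and len(configs) == 1:
        return Node(config=configs.pop())   # merged, coarser cell
    return node
```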
Abstract:
We present algorithms for computing approximate distance functions and shortest paths from a generalized source (point, segment, polygonal chain, or polygonal region) on a weighted non-convex polyhedral surface in which obstacles (represented by polygonal chains or polygons) are allowed. We also describe an algorithm that discretizes distance functions by using graphics hardware capabilities. Finally, we present algorithms for computing discrete k-order Voronoi diagrams.
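As a crude CPU analogue of such approximate distance functions (not the paper's algorithm, which also handles weighted faces and generalized sources): run Dijkstra over the surface's edge graph, with obstacle edges simply removed, so each vertex receives an approximate geodesic distance to the source.

```python
import heapq

def surface_distances(adj, source):
    """Dijkstra over a surface's edge graph. adj: vertex -> [(neighbor,
    weighted_edge_length)]; obstacle edges are simply absent from adj."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
```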
Abstract:
We present an algorithm for computing exact shortest paths, and consequently distances, from a generalized source (point, segment, polygonal chain, or polygonal region) on a possibly non-convex polyhedral surface in which polygonal chain or polygon obstacles are allowed. We also present algorithms for computing discrete Voronoi diagrams of a set of generalized sites (points, segments, polygonal chains, or polygons) on a polyhedral surface with obstacles. To obtain the discrete Voronoi diagrams, our algorithms compute the shortest-path distances defined by the sites, exploiting graphics hardware capabilities.
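Given per-site distance fields like the one sketched above, a discrete Voronoi diagram reduces to a per-vertex argmin over sites; for k-order diagrams one keeps the k nearest sites instead. A minimal sketch (my own CPU simplification of the hardware-based approach):

```python
def discrete_voronoi(dist_fields, k=1):
    """dist_fields: {site: {vertex: distance}}. Returns, per vertex, the
    k nearest sites, i.e. a discrete (k-order) Voronoi labeling."""
    vertices = next(iter(dist_fields.values())).keys()
    label = {}
    for v in vertices:
        ranked = sorted(dist_fields,
                        key=lambda s: dist_fields[s].get(v, float("inf")))
        label[v] = tuple(ranked[:k])
    return label
```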
Abstract:
We've developed a new ambient occlusion technique based on an information-theoretic framework. Essentially, our method computes a weighted visibility from each object polygon to all viewpoints; we then use these visibility values to obtain the information associated with each polygon. So, just as a viewpoint carries information about the model's polygons, the polygons gather information about the viewpoints. We therefore have two measures associated with an information channel defined by the set of viewpoints as input and the object's polygons as output, or vice versa. From this polygonal information we obtain an occlusion map that can be used like a classic ambient occlusion map. Our approach also offers additional applications, including an importance-based viewpoint-selection guide and a means of enhancing object features and producing nonphotorealistic object visualizations.
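The viewpoint-polygon channel can be made concrete with visibility counts: normalizing each viewpoint's projected-area histogram gives p(polygon | viewpoint), and each polygon can then be scored by its contribution to the mutual information of the channel. A small numpy sketch under assumptions of my own (uniform viewpoint prior, made-up visibility matrix; not the paper's exact measure):

```python
import numpy as np

# vis[v, o]: projected area of polygon o seen from viewpoint v (made-up data)
vis = np.array([[4.0, 1.0, 0.0],
                [2.0, 2.0, 2.0],
                [0.0, 1.0, 4.0]])

p_v = np.full(vis.shape[0], 1.0 / vis.shape[0])      # uniform viewpoint prior
p_o_given_v = vis / vis.sum(axis=1, keepdims=True)   # channel p(o | v)
p_o = p_v @ p_o_given_v                              # marginal over polygons

# per-polygon contribution to the mutual information I(V; O)
ratio = np.where(p_o_given_v > 0, p_o_given_v / p_o, 1.0)
poly_info = (p_v[:, None] * p_o_given_v * np.log2(ratio)).sum(axis=0)
print(poly_info)
```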