739 results for Hidden homelessness
Abstract:
In Sri Lanka, policy responses have direct impacts on rural dwellers. Over 80% of Sri Lanka's population lives in rural areas, and 90% of them are low-income dwellers. Their production systems may be hampered by fragmented landholding, poor economies of scale, low investment levels resulting from poor financial services, and inappropriate or limited technology. Fragmented landholding and poor financial services also leave them vulnerable to price hikes of basic foods and to food insecurity. Policy measures to reduce the transmission of higher international prices into domestic markets exist to protect the food security of this vulnerable population. This paper discusses the food policies and strategies implemented by the government and also assesses their effectiveness. The objective of this study is to analyse the impact of policy responses to the food price crisis on rural food security in Sri Lanka. The study additionally considers the impact of these policies and decisions on the nutritional condition of rural dwellers, the fluctuation of purchasing power under price hikes, and the relation of these factors to issues such as malnutrition. The paper discusses why policy makers should pay greater attention to rural dwellers and describes the multiple pathways through which food price increases affect rural people. It also provides evidence of the impact of this crisis, in particular through hidden hunger, and discusses how current policy responses should be adjusted and improved to protect rural dwellers in the short and long term.
Abstract:
This thesis investigates a method for human-robot interaction (HRI) that upholds the productivity of industrial robots, e.g. by minimizing operation time, while ensuring human safety, e.g. through collision avoidance. To solve such problems, an online motion planning approach for robotic manipulators with HRI is proposed. The approach is based on model predictive control (MPC) with embedded mixed-integer programming. The planning strategies for the robotic manipulators considered in the thesis are performed directly in the workspace, which allows for easy obstacle representation. The non-convex optimization problem is approximated by a mixed-integer program (MIP), which is further reformulated so that the number of binary variables and the number of feasible integer solutions are drastically decreased. Safety-relevant regions, which are potentially occupied by the human operators, can be generated online by a proposed method based on hidden Markov models. In contrast to previous approaches, which derive predictions from probability density functions in the form of single points, such as the most likely or expected human positions, the proposed method computes safety-relevant subsets of the workspace as a region possibly occupied by the human at future instants of time. The method is further enhanced by combining it with reachability analysis to increase the prediction accuracy. These safety-relevant regions can subsequently serve as safety constraints when the motion is planned by optimization. This way one arrives at motion plans that are safe, i.e. plans that avoid collision with a probability not less than a predefined threshold. The developed methods have been successfully applied to a demonstrator in which an industrial robot works in the same space as a human operator. The task of the industrial robot is to drive its end-effector through a nominal sequence of gripping, motion, and releasing operations while avoiding collision with a human arm.
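The core trick, encoding the non-convex "stay out of the human's region" requirement as a mixed-integer program, can be illustrated with a minimal sketch. This is not the thesis's formulation: it plans 1-D waypoints around a single forbidden interval using a big-M disjunction, and assumes the PuLP library with its bundled CBC solver.

```python
# Minimal big-M MIP sketch of obstacle-avoiding waypoint planning
# (illustrative only; the thesis formulates a richer workspace MPC).
import pulp

T = 6                      # planning horizon (number of waypoints)
start, goal = 0.0, 10.0    # 1-D start and goal positions
lo, hi = 4.0, 6.0          # interval occupied by the human (safety region)
M = 100.0                  # big-M constant (larger than the position bounds)
vmax = 3.0                 # max step size between waypoints

prob = pulp.LpProblem("avoid", pulp.LpMinimize)
x = [pulp.LpVariable(f"x{t}", -20, 20) for t in range(T)]
# b[t] = 1 selects the "pass on the right" branch of the disjunction.
b = [pulp.LpVariable(f"b{t}", cat="Binary") for t in range(T)]
dev = pulp.LpVariable("dev", 0)       # |x[T-1] - goal|

prob += dev                           # objective: minimize terminal error
prob += x[0] == start
prob += x[-1] - goal <= dev
prob += goal - x[-1] <= dev
for t in range(T):
    # Disjunction: x[t] <= lo  OR  x[t] >= hi, encoded with big-M.
    prob += x[t] <= lo + M * b[t]
    prob += x[t] >= hi - M * (1 - b[t])
for t in range(T - 1):                # bounded step size (kinematic limit)
    prob += x[t + 1] - x[t] <= vmax
    prob += x[t] - x[t + 1] <= vmax

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([round(v.value(), 2) for v in x])   # e.g. 0 -> 3 -> 6 -> 9 -> 10
```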
Abstract:
Enhanced reality visualization is the process of enhancing an image by adding information that is not present in the original image. A wide variety of information can be added, ranging from hidden lines or surfaces to textual or iconic data about a particular part of the image. Enhanced reality visualization is particularly well suited to neurosurgery. By rendering brain structures that are not visible, at the correct location in an image of a patient's head, the surgeon is essentially provided with X-ray vision: he can visualize the spatial relationship between brain structures before performing a craniotomy, and during surgery he can see what is under the next layer before cutting through. Given a video image of the patient and a three-dimensional model of the patient's brain, the problem enhanced reality visualization faces is to render the model from the correct viewpoint and overlay it on the original image. The relationship between the coordinate frames of the patient, the patient's internal anatomy scans, and the image plane of the camera observing the patient must be established; this problem is closely related to the camera calibration problem. This report presents a new approach to finding this relationship and develops a system for performing enhanced reality visualization in a surgical environment. Immediately prior to surgery, a few circular fiducials are placed near the surgical site, and an initial registration of video and internal data is performed using a laser scanner. Following this, our method is fully automatic, runs in nearly real time, is accurate to within a pixel, allows both patient and camera motion, automatically corrects for changes to the internal camera parameters (focal length, focus, aperture, etc.), and requires only a single image.
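The geometric step at the heart of the overlay, rendering the model from the camera's viewpoint once the coordinate frames are related, amounts to projecting 3-D model points through a pinhole camera. A minimal NumPy sketch follows; the intrinsics K and pose [R|t] are made-up placeholders, not the calibration recovered by the report's method.

```python
# Pinhole projection of 3-D model points into the image plane (illustrative;
# K, R, t are assumed values, not the registration computed in the report).
import numpy as np

K = np.array([[800.0, 0.0, 320.0],     # focal lengths and principal point
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
R = np.eye(3)                          # rotation: world frame -> camera frame
t = np.array([0.0, 0.0, 500.0])        # translation (e.g. millimetres)

def project(points_world):
    """Map Nx3 world points to Nx2 pixel coordinates."""
    cam = points_world @ R.T + t       # rigid transform into the camera frame
    uvw = cam @ K.T                    # apply camera intrinsics
    return uvw[:, :2] / uvw[:, 2:3]    # perspective divide

model = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]])
print(project(model))                  # pixel locations to overlay on the video
```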
Abstract:
The furious pace of Moore's Law is driving computer architecture into a realm where the speed of light is the dominant factor in system latencies. The number of clock cycles needed to span a chip is increasing, while the number of bits that can be accessed within a clock cycle is decreasing. Hence, it is becoming more difficult to hide latency. One alternative is to reduce latency by migrating threads and data, but the overhead of existing implementations has so far made migration impractical. I present an architecture, implementation, and mechanisms that reduce the overhead of migration to the point where migration is a viable supplement to other latency-hiding mechanisms, such as multithreading. The architecture is abstract and presents programmers with a simple, uniform, fine-grained multithreaded parallel programming model with implicit memory management. In other words, the spatial nature and implementation details (such as the number of processors) of a parallel machine are entirely hidden from the programmer. Compiler writers are encouraged to devise programming languages for the machine that guide a programmer to express their ideas in terms of objects, since objects exhibit an inherent physical locality of data and code. The machine implementation can then leverage this locality to automatically distribute data and threads across the physical machine using a set of high-performance migration mechanisms. An implementation of this architecture can migrate a null thread in 66 cycles -- over a factor of 1000 improvement over previous work. Performance also scales well: the time required to move a typical thread is only 4 to 5 times that of a null thread. Data migration performance is similar and scales linearly with data block size. Since the performance of the migration mechanism is on par with that of an L2 cache, the implementation simulated in my work has no data caches and relies instead on multithreading and the migration mechanism to hide and reduce access latencies.
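The trade-off these numbers quantify can be stated as a back-of-the-envelope amortization model: migrating pays off once the remote accesses it avoids cost more than the move itself. The sketch below uses the abstract's figures for the null and typical thread; the per-access cycle counts are assumptions for illustration only.

```python
# Toy amortization model for thread migration. The 66-cycle null thread and
# the 4-5x typical-thread factor come from the abstract; the remote/local
# access costs below are assumed values, not measurements from the thesis.
NULL_THREAD_MIGRATION = 66          # cycles (from the abstract)
TYPICAL_FACTOR = 4.5                # typical thread ~4-5x a null thread
REMOTE_ACCESS = 40                  # assumed cycles per remote memory access
LOCAL_ACCESS = 4                    # assumed cycles per local access

def should_migrate(expected_accesses: int) -> bool:
    """Migrate when saved remote-access latency amortizes the move."""
    migration_cost = NULL_THREAD_MIGRATION * TYPICAL_FACTOR
    saved = expected_accesses * (REMOTE_ACCESS - LOCAL_ACCESS)
    return saved > migration_cost

for n in (2, 10, 50):
    print(n, should_migrate(n))     # 2: stay remote; 10 and 50: migrate
```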
Abstract:
We consider an online learning scenario in which the learner can make predictions on the basis of a fixed set of experts. The performance of each expert may change over time in a manner unknown to the learner. We formulate a class of universal learning algorithms for this problem by expressing them as simple Bayesian algorithms operating on models analogous to Hidden Markov Models (HMMs). We derive a new performance bound for such algorithms which is considerably simpler than existing bounds. The bound provides the basis for learning the rate at which the identity of the optimal expert switches over time. We find an analytic expression for the a priori resolution at which we need to learn the rate parameter. We extend our scalar switching-rate result to models of the switching-rate that are governed by a matrix of parameters, i.e. arbitrary homogeneous HMMs. We apply and examine our algorithm in the context of the problem of energy management in wireless networks. We analyze the new results in the framework of Information Theory.
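The Bayesian view of expert switching reduces to an HMM-style forward recursion over which expert is currently best; with a scalar switching rate it is the classic fixed-share update. A minimal sketch under placeholder values (the switching rate, learning rate, and loss sequence below are illustrative, not the paper's):

```python
# Fixed-share expert weighting as an HMM forward pass (illustrative sketch).
import numpy as np

def fixed_share(losses, alpha, eta=1.0):
    """losses: T x N per-expert losses; alpha: per-step switching rate."""
    T, N = losses.shape
    w = np.full(N, 1.0 / N)                # prior over identity of best expert
    for t in range(T):
        w = w * np.exp(-eta * losses[t])   # Bayesian update (loss as neg. log-lik.)
        w /= w.sum()
        # HMM transition: stay with prob 1 - alpha, else switch uniformly.
        w = (1 - alpha) * w + alpha * (1 - w) / (N - 1)
    return w

rng = np.random.default_rng(0)
losses = rng.random((100, 3))
losses[50:, 1] *= 0.1                      # expert 1 becomes best halfway through
print(fixed_share(losses, alpha=0.05))     # posterior tracks the switch
```

Learning the rate itself, as the paper does, would replace the fixed alpha with a prior over a discretized grid of rates updated by the same forward recursion.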
Abstract:
Sigmoid-type belief networks, a class of probabilistic neural networks, provide a natural framework for compactly representing probabilistic information in a variety of unsupervised and supervised learning problems. Often the parameters used in these networks need to be learned from examples. Unfortunately, estimating the parameters via exact probabilistic calculations (i.e., the EM algorithm) is intractable even for networks with fairly small numbers of hidden units. We propose to avoid the infeasibility of the E-step by bounding likelihoods instead of computing them exactly. We introduce extended and complementary representations for these networks and show that the estimation of the network parameters can be made fast (reduced to quadratic optimization) by performing the estimation in either of the alternative domains. The complementary networks can be used for continuous density estimation as well.
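The idea of bounding the likelihood rather than computing it exactly can be seen in miniature with a generic mean-field (Jensen) lower bound for a tiny sigmoid belief network. This is a standard variational bound for illustration, not the paper's extended/complementary construction; the network here is small enough that both the exact likelihood and the bound can be enumerated to verify the inequality.

```python
# Mean-field lower bound on log p(v) for a tiny sigmoid belief network
# (generic Jensen-style bound shown for illustration only).
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
H, V = 3, 4                              # hidden and visible units
W = rng.normal(size=(V, H))
b = rng.normal(size=H)                   # hidden biases
c = rng.normal(size=V)                   # visible biases
v = np.array([1.0, 0.0, 1.0, 1.0])      # an observed visible vector

sig = lambda x: 1.0 / (1.0 + np.exp(-x))

def log_joint(h):
    """log p(v, h): Bernoulli visibles given h, Bernoulli prior on h."""
    pv, ph = sig(W @ h + c), sig(b)
    return (np.sum(v * np.log(pv) + (1 - v) * np.log(1 - pv))
            + np.sum(h * np.log(ph) + (1 - h) * np.log(1 - ph)))

states = [np.array(s, float) for s in product([0, 1], repeat=H)]
# Exact log p(v): the 2^H sum that becomes intractable for many hidden units.
exact = np.logaddexp.reduce([log_joint(h) for h in states])

# ELBO under a factorized q(h) = prod Bernoulli(mu); enumerated here only
# to check the bound -- a real method evaluates it without enumeration.
mu = np.full(H, 0.5)
elbo = sum(np.prod(mu**h * (1 - mu)**(1 - h))
           * (log_joint(h) - np.log(np.prod(mu**h * (1 - mu)**(1 - h))))
           for h in states)
print(f"exact log p(v) = {exact:.3f}, ELBO = {elbo:.3f}")  # ELBO <= exact
```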
Abstract:
We had previously shown that regularization principles lead to approximation schemes, such as Radial Basis Functions, which are equivalent to networks with one layer of hidden units, called Regularization Networks. In this paper we show that regularization networks encompass a much broader range of approximation schemes, including many of the popular general additive models, Breiman's hinge functions, and some forms of Projection Pursuit Regression. In the probabilistic interpretation of regularization, the different classes of basis functions correspond to different classes of prior probabilities on the approximating function spaces, and therefore to different types of smoothness assumptions. In the final part of the paper, we also show a relation between activation functions of the Gaussian and sigmoidal type.
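Fitting a regularization network with Gaussian radial basis functions reduces to a single regularized linear solve for the hidden-unit coefficients. A minimal sketch (the kernel width and regularization strength are arbitrary illustrative choices):

```python
# One-hidden-layer regularization network (Gaussian RBF) fit by ridge solve.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=x.size)   # noisy targets

def gram(a, b, width=0.1):
    """Gaussian kernel matrix between 1-D point sets a and b."""
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * width ** 2))

lam = 1e-3                             # smoothness (regularization) strength
K = gram(x, x)
coef = np.linalg.solve(K + lam * np.eye(x.size), y)   # (K + lam*I) c = y

x_test = np.linspace(0, 1, 5)
print(gram(x_test, x) @ coef)          # network output at test points
```

The choice of kernel is exactly where the paper's prior enters: a different basis function corresponds to a different smoothness assumption on the approximating function space.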
Abstract:
Stock markets employ specialized traders, market-makers, who provide liquidity and volume to the market by standing ready both to buy and to sell. In this paper, we demonstrate a novel method for modeling the market as a dynamic system, together with a reinforcement learning algorithm that learns profitable market-making strategies when run on this model. We model the order flow, the sequence of buys and sells for a particular stock, as an Input-Output Hidden Markov Model fit to historical data. Combined with the dynamics of the order book, this creates a highly non-linear and difficult dynamic system. Our reinforcement learning algorithm, based on likelihood ratios, is run on this partially observable environment. We demonstrate learning results for two separate real stocks.
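The likelihood-ratio gradient that drives such an algorithm fits in a few lines: score-function (REINFORCE-style) updates of a softmax policy. The toy reward below stands in for the paper's IOHMM-plus-order-book environment and is purely an assumption for illustration.

```python
# Likelihood-ratio (score-function) policy gradient on a toy 2-action task,
# standing in for the paper's partially observable market model.
import numpy as np

rng = np.random.default_rng(0)
theta = np.zeros(2)                       # logits of a softmax policy

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

for step in range(2000):
    p = softmax(theta)
    a = rng.choice(2, p=p)                # sample an action (e.g. quote choice)
    # Noisy reward; action 1 is better in expectation (stand-in for P&L).
    r = (1.0 if a == 1 else 0.2) + rng.normal(scale=0.5)
    grad_logp = -p                        # d log pi(a)/d theta for a softmax
    grad_logp[a] += 1.0
    theta += 0.05 * r * grad_logp         # likelihood-ratio gradient ascent
print(softmax(theta))                     # mass shifts toward action 1
```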
Abstract:
Hydrogeological research usually includes statistical studies devised to elucidate the mean background state, characterise relationships among different hydrochemical parameters, and show the influence of human activities. These goals are achieved either by means of a statistical approach or by mixing models between end-members. Compositional data analysis has proved effective for the first approach, but there is no commonly accepted solution to the end-member problem in a compositional framework. We present here a possible solution based on factor analysis of compositions, illustrated with a case study. We find two factors on the compositional bi-plot by fitting two non-centered orthogonal axes to the most representative variables. Each of these axes defines a subcomposition, grouping those variables that lie nearest to it. With each subcomposition a log-contrast is computed and rewritten as an equilibrium equation. These two factors can be interpreted as the isometric log-ratio (ilr) coordinates of three hidden components, which can be plotted in a ternary diagram. These hidden components might be interpreted as end-members. We analysed 14 molarities at 31 sampling stations along the Llobregat River and its tributaries, measured monthly over two years. We obtained a bi-plot explaining 57% of the total variance, from which we extracted two factors: factor G, reflecting geological background enhanced by potash mining, and factor A, essentially controlled by urban and/or farming wastewater. Graphical representation of these two factors allows us to identify three extreme samples, corresponding to pristine waters, potash mining influence, and urban sewage influence. To confirm this, we have available analyses of the diffuse and point sources identified in the area: springs, potash mining lixiviates, sewage, and fertilisers. Each of these sources shows a clear link with one of the extreme samples, except fertilisers, owing to the heterogeneity of their composition. This approach is a useful tool to distinguish end-members and characterise them, an issue generally difficult to solve. It is worth noting that the end-member composition cannot be fully estimated, only characterised through log-ratio relationships among components. Moreover, the influence of each end-member in a given sample must be evaluated relative to the other samples. These limitations are intrinsic to the relative nature of compositional data.
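For readers unfamiliar with ilr coordinates, they can be computed directly from the log-ratios of the parts. A minimal sketch for a 3-part composition using one standard balance basis (illustrative; the paper's factors are fitted axes, not this default basis):

```python
# Isometric log-ratio (ilr) coordinates of a 3-part composition,
# using a standard balance basis for illustration.
import numpy as np

def ilr(x):
    """Map a 3-part composition (parts summing to 1) to 2 ilr coordinates."""
    x = np.asarray(x, float)
    z1 = np.sqrt(1 / 2) * np.log(x[0] / x[1])                  # part 1 vs part 2
    z2 = np.sqrt(2 / 3) * np.log(np.sqrt(x[0] * x[1]) / x[2])  # parts 1,2 vs 3
    return np.array([z1, z2])

sample = np.array([0.6, 0.3, 0.1])    # e.g. relative shares of three end-members
print(ilr(sample))
```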
Abstract:
Over the last decade, adolescence has been a topic of political discussion at the highest level in various European forums. In a rapidly changing society, the adequate socialization of the youngest generations is perceived as a socio-historical challenge that affects us all. The changes in which we are immersed are so plural (demographic, social, technological, economic, political, etc.) that they generate a very broad front of new ethical dilemmas. Citizens of the European Union express concern about new values and a marked preference for responsibility, in keeping with this changing situation. This whole psychosocial macro-context poses new theoretical and research challenges to the scientific community. Indeed, the human and social sciences have begun to develop new lines of research to better understand the new relationships between adults and adolescents and the new cultures emerging among the latter, driven by new social aspirations shared by more or less broad groups of the young population. The need for techniques and instruments that allow us to better understand the adolescent's perspective becomes even more evident when we analyse their relationship with the new information and communication technologies. These technologies entail new risks, but also new opportunities, notably the possibility of establishing new forms of relationship. The motivation that young people show for new technologies is a great challenge to applied researchers to suggest ways of maximizing their latent potential.
Abstract:
Family secrets are hidden pieces of information. They may arise within a single generation, with the information concealed among family members, or they may be concealed from people outside the family. Secrets can also be inherited from ancestors. They are further characterized by their source, their type, who shares them, and their duration. This hidden information generates a specific dynamic in families, establishing forms of communication that favour or harm the relationships of individuals. All of this takes place within the intimacy of the family and of each of its members. The dynamics and functioning of family secrets are examined through two literary works and two films related to the topic.
Abstract:
Labor informality has for decades been the common denominator among Latin American economies. In Colombia, despite an interest in tracking this segment of the labor market since 1986, there appears to have been no regulation or policy aimed at reducing the share of informal workers in the total employed population. With labor informality being a countercyclical factor with a rather low correlation coefficient, and given the modest growth rates of the Colombian economy, as of February 2010 some 57.8% of the country's employed urban population still belonged to this undesirable segment of the labor market. This paper aims to show why labor informality is a relevant indicator of the state of the Colombian economy, and why it should be taken into account when making various kinds of economic decisions and policies, rather than remaining in the shadow of the unemployment rate. To this end, the fiscal costs arising from the current formality-informality scenario are classified and then estimated and, finally, an index is computed that seeks to show the effects of informality on the fiscal sustainability of the country's health sector.
Abstract:
A state-of-the-art review compiling the positions of various authors on the role of the United Nations, specifically the MINUGUA mission, in Guatemala's post-conflict reconstruction process between 1994 and 2004. It draws on several dimensions of democratization, namely the rule of law, representative democracy, the pre-eminence of civilian power, and the strengthening of democratic culture. It also takes into account the elements of transitional justice: truth, justice, and reparation.
Abstract:
This article presents the results of research developed as a graduate project for the degree of Master in Management. It identifies the environmental impacts generated throughout the processes carried out in meat-processing plants in Colombia and evaluates how the most representative meat-processing plants in the country manage these environmental issues; as part of this assessment, state policy is also reviewed. Finally, some management actions are proposed in which all actors in the meat chain can participate, with the aim of improving the environmental performance of these organizations in Colombia.
Abstract:
There are currently multiple models for knowledge management and human capital measurement applied in organizations, but none of them has been designed for higher education institutions. This work reviews some of the most prominent knowledge management and intellectual capital models, such as Nonaka and Takeuchi's knowledge conversion model, Arthur Andersen's KM model, and Kaplan and Norton's Balanced Scorecard, among others. Building on Galbraith's Star organizational model, it presents a theoretical proposal to characterize a knowledge management model applicable to the university functions of research and extension at Universidad CES in Medellín, Colombia. Through qualitative research correlating general KM theory, particularly the models, with an analysis of the characteristics of Universidad CES, together with a systematic review, a focus group, and documentary analysis, the Hexagonal KM Model is proposed.