863 results for Information needs – representation
Abstract:
This case study presents information on the company ASTRO MAQUINARIA Ltda., which has operated in the market for more than 20 years providing solutions to machinery and equipment needs, especially in the pumping sector. From the beginning the company has imported the product portfolio it offers from the United States, and it currently holds the commercial representation of Pentair Pump Group. The case study aims to identify the internationalisation strategy employed by the company, its decision-making, supply chain and commercial relationships, among other factors that have underpinned the company's durability and growth since its foundation. The methodology used to develop this case study is based on qualitative variables that allow in-depth analysis and inquiry into the phenomenon; some quantitative variables are also used to assess the company's current situation and its performance in the heavy-machinery trading sector.
Abstract:
In image processing, segmentation algorithms constitute one of the main focuses of research. In this paper, new image segmentation algorithms based on a hard version of the information bottleneck method are presented. The objective of this method is to extract a compact representation of a variable, considered the input, with minimal loss of mutual information with respect to another variable, considered the output. First, we introduce a split-and-merge algorithm based on the definition of an information channel between a set of regions (input) of the image and the intensity histogram bins (output). From this channel, the maximization of the mutual information gain is used to optimize the image partitioning. Then, the merging process of the regions obtained in the previous phase is carried out by minimizing the loss of mutual information. From the inversion of the above channel, we also present a new histogram clustering algorithm based on the minimization of the mutual information loss, where now the input variable represents the histogram bins and the output is given by the set of regions obtained from the above split-and-merge algorithm. Finally, we introduce two new clustering algorithms which show how the information bottleneck method can be applied to the registration channel obtained when two multimodal images are correctly aligned. Different experiments on 2-D and 3-D images show the behavior of the proposed algorithms.
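As a rough sketch of the channel the abstract describes (our own illustration, not the authors' code), mutual information between image regions and intensity-histogram bins can be computed from a joint count table, and a merge step can pick the pair of regions whose fusion loses the least mutual information. The region count, bin count and pixel counts below are invented for the example.

```python
# Minimal sketch: mutual information of a region/intensity-bin channel,
# plus a greedy merge step that minimizes the loss of mutual information.
import numpy as np

def mutual_information(joint):
    """MI of a joint count table p(region, bin)."""
    p = joint / joint.sum()
    pr = p.sum(axis=1, keepdims=True)   # marginal over regions
    pb = p.sum(axis=0, keepdims=True)   # marginal over bins
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (pr @ pb)[nz])).sum())

def merge_loss(joint, i, j):
    """MI lost by merging regions i and j into one row of the channel."""
    merged = np.delete(joint, j, axis=0)
    merged[i if i < j else i - 1] = joint[i] + joint[j]
    return mutual_information(joint) - mutual_information(merged)

# Toy example: 3 regions x 4 intensity bins (pixel counts).
counts = np.array([[40, 5, 0, 0],
                   [35, 10, 0, 0],
                   [0, 0, 20, 30]], dtype=float)
pairs = [(i, j) for i in range(3) for j in range(i + 1, 3)]
best = min(pairs, key=lambda ij: merge_loss(counts, *ij))
print("merge regions", best)  # regions 0 and 1 have similar histograms
```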
Abstract:
The thesis falls within Computer Vision and, more specifically, image segmentation, one of the basic stages of image analysis: the division of the image into a set of visually distinct and uniform regions with respect to intensity, colour or texture. A strategy is proposed based on the complementary use of region and boundary information during the segmentation process, an integration that alleviates some of the basic problems of traditional segmentation. Boundary information is first used to identify the number of regions present in the image and to place a seed inside each of them, in order to model the regions' characteristics statistically and thereby define the region information. This information, together with the boundary information, is used to define an energy function that expresses the properties required of the desired segmentation: uniformity inside the regions and contrast with neighbouring regions at their boundaries. A set of active regions then begins to grow, competing for the pixels of the image in order to optimise the energy function or, in other words, to find the segmentation that best fits the requirements expressed by that function. This whole process has been embedded in a pyramidal structure, which allows the segmentation result to be refined progressively and improves its computational cost. The strategy has been extended to the texture segmentation problem, which entails some basic considerations such as modelling the regions from a set of texture features and extracting the boundary information when texture is present in the image. Finally, the approach has been extended to image segmentation that accounts for both colour and texture properties. Here, the joint use of non-parametric density-estimation techniques to describe colour, and of texture features based on the co-occurrence matrix, is proposed to model the image regions adequately and completely. The proposal has been evaluated objectively and compared with different integration techniques on synthetic images, and experiments on real images have yielded very positive results.
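A minimal sketch of the kind of energy function described, combining a region term (Gaussian models seeded inside each region) with a boundary term (gradient contrast where labels change). The Gaussian models, the weight lam and the exact form of the terms are simplifying assumptions of ours, not the thesis's formulation.

```python
# Sketch: energy = region fit (negative Gaussian log-likelihood) +
# boundary contrast (reward strong gradients along region borders).
import numpy as np

def region_energy(image, labels, models):
    """Sum of negative Gaussian log-likelihoods of pixels under their region model."""
    e = 0.0
    for r, (mu, sigma) in models.items():
        vals = image[labels == r]
        e += np.sum(0.5 * ((vals - mu) / sigma) ** 2 + np.log(sigma))
    return e

def boundary_energy(image, labels):
    """Negative energy (reward) for strong gradients where labels change."""
    gy, gx = np.gradient(image.astype(float))
    grad = np.hypot(gx, gy)
    e = 0.0
    e -= grad[:-1, :][np.diff(labels, axis=0) != 0].sum()  # row-wise borders
    e -= grad[:, :-1][np.diff(labels, axis=1) != 0].sum()  # column-wise borders
    return e

def total_energy(image, labels, models, lam=0.5):
    return region_energy(image, labels, models) + lam * boundary_energy(image, labels)

# Toy usage: two constant regions split by a vertical edge.
img = np.concatenate([np.zeros((8, 4)), np.ones((8, 4))], axis=1)
lab = (img > 0.5).astype(int)
models = {0: (0.0, 0.1), 1: (1.0, 0.1)}
print(total_energy(img, lab, models))
```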
Abstract:
The aim of this thesis is to narrow the gap between two different control techniques: continuous control and discrete event control (DES). This gap can be reduced by the study of hybrid systems, and by interpreting the majority of large-scale systems as hybrid systems. In particular, when looking deeply into a process, it is often possible to identify interactions between discrete and continuous signals. Hybrid systems are systems that have both continuous and discrete signals. Continuous signals are generally assumed to be continuous and differentiable in time, whereas discrete signals are neither continuous nor differentiable in time, owing to their abrupt changes. Continuous signals often represent measurements of natural physical magnitudes such as temperature or pressure; discrete signals are normally artificial signals produced by human artefacts, such as current, voltage or light. Typical processes modelled as hybrid systems are production systems, chemical processes, or continuous production in which time and continuous measurements interact with the transport and stock-inventory systems. Complex systems such as manufacturing lines are hybrid in a global sense: they can be decomposed into several subsystems and the links between them. Another motivation for the study of hybrid systems is the set of tools developed in other research domains. These tools benefit from the use of temporal logic for the analysis of several properties of hybrid system models, and use it to design systems and controllers that satisfy physical or imposed restrictions. This thesis focuses on particular types of systems with discrete and continuous signals in interaction, which can model hard non-linearities such as hysteresis, jumps in the state and limit cycles, and whose possible non-deterministic future behaviour is expressed by an interpretable model description. The hybrid systems treated in this work are systems with several discrete states, always fewer than thirty (beyond which the problem can become NP-hard), and continuous dynamics evolving according to an expression with constant vectors or matrices K_i ∈ R^n for the components of the state vector x; in several states the continuous evolution can have K_i = 0. In this formulation the mathematics can express a time-invariant linear system; by using this expression for a local part, a combination of several local linear models can represent non-linear systems, and through interaction with the discrete events of the system the model can compose non-linear hybrid systems. Multistage processes with fast continuous dynamics in particular are well represented by the proposed methodology, and state vectors with more than two components, i.e. models of third order or higher, are well approximated by the proposed approximation. Flexible belt transmissions, chemical reactions with an initial start-up phase, and mobile robots subject to significant friction are physical systems that profit from the accuracy of the proposed methodology. The motivation of this thesis is to obtain a solution that can control and drive a hybrid system from the origin or starting point to the goal. How to obtain this solution, and which solution is best in terms of a cost function subject to the physical restrictions and control actions, is analysed. Hybrid systems that have several possible states, different ways to drive the system to the goal, and different continuous control signals are the problems that motivate this research.
The requirements on the system we work with are: a model that can represent the behaviour of non-linear systems and enables prediction of the model's possible future behaviour, in order to apply a supervisor that decides the optimal and safe action to drive the system toward the goal. Specific problems that can be addressed with this kind of hybrid model are: the unity of model order; controlling the system along a reachable path; controlling the system along a safe path; optimising the cost function; and modularity of control. The proposed model solves the specified problems in the switching-model problem, the initial-condition calculus and the unity of model order. Continuous and discrete phenomena are represented in linear hybrid models, defined by an eight-tuple of parameters that models different types of hybrid phenomena. Applying a transformation over the state vector of an LTI system, we obtain from a two-dimensional state space a single parameter, alpha, which still retains the dynamical information. Combining this parameter with the system output, a complete description of the system is obtained in the form of a graph in polar representation. We use a Takagi-Sugeno type III fuzzy model which includes a linear time-invariant (LTI) model for each local model; the fuzzification of the different LTI local models yields a non-linear time-invariant model. In our case the output and the alpha measure govern the membership functions. Hybrid system control is a huge task: the process must be guided from the starting point to the desired end point, passing through different specific states and points along the trajectory. The system can be structured at different levels of abstraction, and the control of hybrid systems into three layers running from planning the process to producing the actions: the planning, process and control layers. In this case the algorithms will be applied to robotics, a domain where improvements are well accepted and where we expect to find simple repetitive processes for which the extra effort in complexity can be compensated by cost reductions. It may also be interesting to apply control optimisation to processes such as fuel injection, DC-DC converters, etc. In order to apply the Ramadge-Wonham (RW) theory of discrete event systems to a hybrid system, we must abstract the continuous signals and project the events generated by these signals, to obtain new sets of observable and controllable events. Ramadge and Wonham's theory, together with the TCT software, gives a controllable sublanguage of the legal language generated by a discrete event system (DES). Continuous abstraction transforms predicates over continuous variables into controllable or uncontrollable events, and modifies the sets of controllable, uncontrollable, observable and unobservable events. Continuous signals produce virtual events in the system when they cross the bound limits. If such an event is deterministic, it can be projected; its controllability must be determined in order to assign it to the corresponding set of controllable, uncontrollable, observable or unobservable events. Finding optimal trajectories that minimise some cost function is the goal of the modelling procedure. A mathematical model of the system allows the user to apply mathematical techniques to its expression: to minimise a specific cost function, to obtain optimal controllers, and to approximate a specific trajectory.
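A minimal sketch (assumptions ours, not the thesis's code) of how Takagi-Sugeno blending of LTI local models yields a non-linear model: each local model contributes according to a membership weight, here a toy Gaussian function of one state component rather than the output/alpha pair the thesis uses, and a simple Euler step simulates the blended dynamics.

```python
# Sketch: blend local LTI models x' = A_i x + B_i u with normalized
# membership weights (Takagi-Sugeno style), then integrate by Euler steps.
import numpy as np

A = [np.array([[-1.0, 0.0], [0.0, -2.0]]),
     np.array([[-0.2, 1.0], [-1.0, -0.2]])]
B = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

def memberships(x):
    """Toy Gaussian membership functions over the first state component."""
    w = np.array([np.exp(-x[0] ** 2), np.exp(-(x[0] - 2.0) ** 2)])
    return w / w.sum()

def step(x, u, dt=0.01):
    w = memberships(x)
    dx = sum(wi * (Ai @ x + Bi * u) for wi, Ai, Bi in zip(w, A, B))
    return x + dt * dx

x = np.array([2.0, 0.0])
for _ in range(500):
    x = step(x, u=0.1)
print(x)  # state after blending the two local linear models
```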
The combination of dynamic programming with Bellman's principle of optimality gives us the procedure for solving the minimum-time trajectory for hybrid systems. The problem is harder when there is interaction between adjacent states. In hybrid systems the problem is to determine the partial set points to be applied to the local models. An optimal controller can be implemented in each local model in order to ensure minimisation of the local costs. The solution of this problem must give us the trajectory the system is to follow, a trajectory marked by a set of set points the system is forced to pass through. Several ways are possible to drive the system from the starting point Xi to the end point Xf; different ways are interesting in a dynamic sense, in the number of states, in the approximation to set points, etc. These ways need to be safe, viable and reachable, and only one of them is to be applied, normally the best, the one that minimises the proposed cost function. A reachable way, meaning one that is controllable and safe, will be evaluated to determine which one minimises the cost function. The contribution of this work is a complete framework for working with the majority of hybrid systems: the procedures to model, control and supervise are defined and explained, and their use is demonstrated. The procedure for modelling the systems to be analysed for automatic verification is also explained. Great improvements were obtained using this methodology in comparison with other piecewise-linear approximations, and it is demonstrated that in particular cases this methodology provides the best approximation. The most important contribution of this work is the alpha approximation for non-linear systems with fast dynamics: while this kind of process is not typical, in such cases the alpha approximation is the best linear approximation to use, and it gives a compact representation.
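The selection among reachable ways can be illustrated with Bellman-style dynamic programming on a small graph of discrete states; the graph, state names and edge costs below are invented for illustration, with Dijkstra's algorithm standing in for the general DP procedure on non-negative costs.

```python
# Sketch: pick the reachable way from Xi to Xf that minimises total cost.
import heapq

def min_cost_path(graph, start, goal):
    """Dijkstra, i.e. DP with Bellman's principle on non-negative edge costs."""
    best = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        c, s = heapq.heappop(pq)
        if s == goal:
            break
        if c > best.get(s, float("inf")):
            continue  # stale queue entry
        for t, w in graph.get(s, []):
            nc = c + w
            if nc < best.get(t, float("inf")):
                best[t] = nc
                prev[t] = s
                heapq.heappush(pq, (nc, t))
    path, s = [goal], goal
    while s != start:
        s = prev[s]
        path.append(s)
    return best[goal], path[::-1]

graph = {"Xi": [("q1", 2.0), ("q2", 5.0)],
         "q1": [("Xf", 4.0)],
         "q2": [("Xf", 0.5)]}
print(min_cost_path(graph, "Xi", "Xf"))  # -> (5.5, ['Xi', 'q2', 'Xf'])
```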
Big Decisions and Sparse Data: Adapting Scientific Publishing to the Needs of Practical Conservation
Abstract:
The biggest challenge in conservation biology is breaking down the gap between research and practical management. A major obstacle is the fact that many researchers are unwilling to tackle projects likely to produce sparse or messy data because the results would be difficult to publish in refereed journals. The obvious solution to sparse data is to build up results from multiple studies. Consequently, we suggest that there needs to be greater emphasis in conservation biology on publishing papers that can be built on by subsequent research rather than on papers that produce clear results individually. This building approach requires: (1) a stronger theoretical framework, in which researchers attempt to anticipate models that will be relevant in future studies and incorporate expected differences among studies into those models; (2) use of modern methods for model selection and multi-model inference, and publication of parameter estimates under a range of plausible models; (3) explicit incorporation of prior information into each case study; and (4) planning management treatments in an adaptive framework that considers treatments applied in other studies. We encourage journals to publish papers that promote this building approach rather than expecting papers to conform to traditional standards of rigor as stand-alone papers, and believe that this shift in publishing philosophy would better encourage researchers to tackle the most urgent conservation problems.
Abstract:
One of the main tasks of the mathematical knowledge management community must surely be to enhance access to mathematics on digital systems. In this paper we present a spectrum of approaches to solving the various problems inherent in this task, arguing that a variety of approaches is both necessary and useful. The main ideas presented are about the differences between digitised mathematics, digitally represented mathematics and formalised mathematics. Each has its part to play in managing mathematical information in a connected world. Digitised material is that which is embodied in a computer file, accessible and displayable locally or globally. Represented material is digital material in which there is some structure (usually syntactic in nature) which maps to the mathematics contained in the digitised information. Formalised material is that in which both the syntax and semantics of the represented material are automatically accessible. Given the range of mathematical information to which access is desired, and the limited resources available for managing that information, we must ensure that these resources are applied to digitise, form representations of, or formalise existing and new mathematical information in such a way as to extract the most benefit from the least expenditure of resources. We also analyse some of the various social and legal issues which surround the practical tasks.
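The three levels can be made concrete with a small example of our own choosing (the commutativity of addition), shown digitised, represented and formalised; the Lean form is one possible formalisation, not one the paper prescribes.

```lean
-- digitised:   "a + b = b + a" as characters in a file (no structure).
-- represented: LaTeX source $a + b = b + a$ (syntax, but no semantics).
-- formalised:  the same statement with machine-checkable semantics:
theorem add_comm' (a b : Nat) : a + b = b + a := Nat.add_comm a b
```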
Abstract:
In an immersive virtual environment, observers fail to notice the expansion of a room around them and consequently make gross errors when comparing the size of objects. This result is difficult to explain if the visual system continuously generates a 3-D model of the scene based on known baseline information from interocular separation or proprioception as the observer walks. An alternative is that observers use view-based methods to guide their actions and to represent the spatial layout of the scene. In this case, they may have an expectation of the images they will receive but be insensitive to the rate at which images arrive as they walk. We describe the way in which the eye movement strategy of animals simplifies motion processing if their goal is to move towards a desired image and discuss dorsal and ventral stream processing of moving images in that context. Although many questions about view-based approaches to scene representation remain unanswered, the solutions are likely to be highly relevant to understanding biological 3-D vision.
Abstract:
The reform of regional governance in the United Kingdom has been, in part, premised on the notion that regions provide new territories of action in which cooperative networks between business communities and state agencies can be established. Promoting business interests is seen as one mechanism for enhancing the economic competitiveness and performance of 'laggard' regions. Yet, within this context of change, business agendas and capacities are often assumed to exist 'out there', as a resource waiting to be tapped by state institutions. There is little recognition that business organisations' involvement in networks of governance owes much to historical patterns and practices of business representation, to the types of activities that exist within the business sector, and to interpretations of their own role and position within wider policymaking and implementation networks. This paper, drawing on a study of business agendas in post-devolution Scotland, demonstrates that in practice business agendas are highly complex. Their formation in any particular place depends on the actions of reflexive agents, whose perspectives and capacities are shaped by the social, economic, and political contexts within which they are operating. As such, any understanding of business agendas needs to identify the social relations of business as a whole, rather than assuming away such complexities.
Abstract:
This study evaluates computer-generated written explanations about drug prescriptions that are based on an analysis of both patient and doctor informational needs. Three experiments examine the effects of varying the type of information given about the possible side effects of the medication, and the order of information within the explanation. Experiment 1 investigated the effects of these two factors on people's ratings of how good they consider the explanations to be and of their perceived likelihood of taking the medication, as well as on their memory for the information in the explanation. Experiment 2 further examined the effects of varying information about side effects by separating out the contribution of number and severity of side effects. It was found that participants in this study did not “like” explanations that described severe side effects, and also judged that they would be less likely to take the medication if given such explanations. Experiment 3 therefore investigated whether information about severe side effects could be presented in such a way as to increase judgements of how good explanations are thought to be, as well as the perceived likelihood of adherence. The results showed some benefits of providing additional explanatory information.
Abstract:
This paper describes the user modeling component of EPIAIM, a consultation system for data analysis in epidemiology. The component is aimed at representing knowledge of concepts in the domain, so that their explanations can be adapted to user needs. The first part of the paper describes two studies aimed at analysing user requirements. The first one is a questionnaire study which examines the respondents' familiarity with concepts. The second one is an analysis of concept descriptions in textbooks and from expert epidemiologists, which examines how discourse strategies are tailored to the level of experience of the expected audience. The second part of the paper describes how the results of these studies have been used to design the user modeling component of EPIAIM. This module works in a two-step approach. In the first step, a few trigger questions allow the activation of a stereotype that includes a "body" and an "inference component". The body is the representation of the body of knowledge that a class of users is expected to know, along with the probability that the knowledge is known. In the inference component, the learning process of concepts is represented as a belief network. Hence, in the second step the belief network is used to refine the initial default information in the stereotype's body. This is done by asking a few questions on those concepts where it is uncertain whether or not they are known to the user, and propagating this new evidence to revise the whole situation. The system has been implemented on a workstation under UNIX. An example of functioning is presented, and advantages and limitations of the approach are discussed.
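A toy sketch of the two-step idea (a deliberately simplified stand-in for EPIAIM's actual belief network and evidence propagation): stereotype priors give the probability each concept is known, the most uncertain concept is queried, and the answer is propagated to linked concepts. The concepts, links and update rule are illustrative assumptions of ours.

```python
# Sketch: stereotype body (priors that concepts are known) refined by
# querying the most uncertain concept and propagating the answer.
priors = {"mean": 0.9, "variance": 0.7, "odds_ratio": 0.5,
          "logistic_regression": 0.3}
# Learning links: knowing a concept raises belief in its prerequisites.
links = {"logistic_regression": ["odds_ratio"],
         "odds_ratio": ["variance"],
         "variance": ["mean"]}

def most_uncertain(beliefs):
    """Pick the concept whose knownness is least certain (p near 0.5)."""
    return max(beliefs, key=lambda c: beliefs[c] * (1 - beliefs[c]))

def observe(beliefs, concept, known, boost=0.3):
    """Record the answer and crudely propagate it to prerequisites."""
    beliefs[concept] = 1.0 if known else 0.0
    if known:
        for pre in links.get(concept, []):
            beliefs[pre] = min(1.0, beliefs[pre] + boost)

beliefs = dict(priors)
c = most_uncertain(beliefs)       # -> "odds_ratio" (p = 0.5)
observe(beliefs, c, known=True)   # also raises belief in "variance"
print(c, beliefs)
```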
Abstract:
Objective The Medicines Use Review (MUR) community pharmacy service was introduced in 2005 to enhance patient empowerment but the service has not been taken up as widely as expected. We investigated the depiction of the patient–pharmacist power relationship within MUR patient information leaflets. Methods We identified 11 MUR leaflets including the official Department of Health MUR booklet and through discourse analysis examined the way language and imagery had been used to symbolise and give meaning to the MUR service, especially the portrayal of the patient–pharmacist interactions and the implied power relations. Results A variety of terminology was used to describe the MUR, a service that aimed ultimately to produce more informed patients through the information imparted by knowledgeable, skilled pharmacists. Conclusion The educational role of the MUR overshadowed the intended patient empowerment that would take place with a true concordance-centred approach. Although patient empowerment was implied, this was within the boundaries of the biomedical model with the pharmacist as the expert provider of medicines information. Practice implications If patient empowerment is to be conveyed this needs to be communicated to patients through consistent use of language and imagery that portrays the inclusivity intended.
Abstract:
The 'self' is a complex multidimensional construct deeply embedded and in many ways defined by our relations with the social world. Individuals with autism are impaired in both self-referential and other-referential social cognitive processing. Atypical neural representation of the self may be a key to understanding the nature of such impairments. Using functional magnetic resonance imaging we scanned adult males with an autism spectrum condition and age and IQ-matched neurotypical males while they made reflective mentalizing or physical judgements about themselves or the British Queen. Neurotypical individuals preferentially recruit the middle cingulate cortex and ventromedial prefrontal cortex in response to self compared with other-referential processing. In autism, ventromedial prefrontal cortex responded equally to self and other, while middle cingulate cortex responded more to other-mentalizing than self-mentalizing. These atypical responses occur only in areas where self-information is preferentially processed and do not affect areas that preferentially respond to other-referential information. In autism, atypical neural self-representation was also apparent via reduced functional connectivity between ventromedial prefrontal cortex and areas associated with lower level embodied representations, such as ventral premotor and somatosensory cortex. Furthermore, the magnitude of neural self-other distinction in ventromedial prefrontal cortex was strongly related to the magnitude of early childhood social impairments in autism. Individuals whose ventromedial prefrontal cortex made the largest distinction between mentalizing about self and other were least socially impaired in early childhood, while those whose ventromedial prefrontal cortex made little to no distinction between mentalizing about self and other were the most socially impaired in early childhood. These observations reveal that the atypical organization of neural circuitry preferentially coding for self-information is a key mechanism at the heart of both self-referential and social impairments in autism.
Abstract:
More data will be produced in the next five years than in the entire history of humankind, a digital deluge that marks the beginning of the Century of Information. Through a year-long consultation with UK researchers, a coherent strategy has been developed, which will nurture Century-of-Information Research (CIR); it crystallises the ideas developed by the e-Science Directors' Forum Strategy Working Group. This paper is an abridged version of their latest report which can be found at: http://wikis.nesc.ac.uk/escienvoy/Century_of_Information_Research_Strategy which also records the consultation process and the affiliations of the authors. This document is derived from a paper presented at the Oxford e-Research Conference 2008 and takes into account suggestions made in the ensuing panel discussion. The goals of the CIR Strategy are to facilitate the growth of UK research and innovation that is data and computationally intensive and to develop a new culture of 'digital-systems judgement' that will equip research communities, businesses, government and society as a whole, with the skills essential to compete and prosper in the Century of Information. The CIR Strategy identifies a national requirement for a balanced programme of coordination, research, infrastructure, translational investment and education to empower UK researchers, industry, government and society. The Strategy is designed to deliver an environment which meets the needs of UK researchers so that they can respond agilely to challenges, can create knowledge and skills, and can lead new kinds of research. It is a call to action for those engaged in research, those providing data and computational facilities, those governing research and those shaping education policies. The ultimate aim is to help researchers strengthen the international competitiveness of the UK research base and increase its contribution to the economy. The objectives of the Strategy are to better enable UK researchers across all disciplines to contribute world-leading fundamental research; to accelerate the translation of research into practice; and to develop improved capabilities, facilities and context for research and innovation. It envisages a culture that is better able to grasp the opportunities provided by the growing wealth of digital information. Computing has, of course, already become a fundamental tool in all research disciplines. The UK e-Science programme (2001-06), since emulated internationally, pioneered the invention and use of new research methods, and a new wave of innovations in digital-information technologies which have enabled them. The Strategy argues that the UK must now harness and leverage its own, plus the now global, investment in digital-information technology in order to spread the benefits as widely as possible in research, education, industry and government. Implementing the Strategy would deliver the computational infrastructure and its benefits as envisaged in the Science & Innovation Investment Framework 2004-2014 (July 2004), and in the reports developing those proposals.
To achieve this, the Strategy proposes the following actions: support the continuous innovation of digital-information research methods; provide easily used, pervasive and sustained e-Infrastructure for all research; enlarge the productive research community which exploits the new methods efficiently; generate capacity, propagate knowledge and develop skills via new curricula; and develop coordination mechanisms to improve the opportunities for interdisciplinary research and to make digital-infrastructure provision more cost effective. To gain the best value for money, strategic coordination is required across a broad spectrum of stakeholders. A coherent strategy is essential in order to establish and sustain the UK as an international leader of well-curated national data assets and computational infrastructure, which is expertly used to shape policy, support decisions, empower researchers and to roll out the results to the wider benefit of society. The value of data as a foundation for wellbeing and a sustainable society must be appreciated; national resources must be more wisely directed to the collection, curation, discovery, widening access, analysis and exploitation of these data. Every researcher must be able to draw on skills, tools and computational resources to develop insights, test hypotheses and translate inventions into productive use, or to extract knowledge in support of governmental decision making. This foundation plus the skills developed will launch significant advances in research, in business, in professional practice and in government with many consequent benefits for UK citizens. The Strategy presented here addresses these complex and interlocking requirements.
Abstract:
Current feed evaluation systems for dairy cattle aim to match nutrient requirements with nutrient intake at pre-defined production levels. These systems were not developed to address, and are not suitable to predict, the responses to dietary changes in terms of production level and product composition, excretion of nutrients to the environment, and nutrition related disorders. The change from a requirement to a response system to meet the needs of various stakeholders requires prediction of the profile of absorbed nutrients and its subsequent utilisation for various purposes. This contribution examines the challenges to predicting the profile of nutrients available for absorption in dairy cattle and provides guidelines for further improved prediction with regard to animal production responses and environmental pollution. The profile of nutrients available for absorption comprises volatile fatty acids, long-chain fatty acids, amino acids and glucose. Thus the importance of processes in the reticulo-rumen is obvious. Much research into rumen fermentation is aimed at determination of substrate degradation rates. Quantitative knowledge on rates of passage of nutrients out of the rumen is rather limited compared with that on degradation rates, and thus should be an important theme in future research. Current systems largely ignore microbial metabolic variation, and extant mechanistic models of rumen fermentation give only limited attention to explicit representation of microbial metabolic activity. Recent molecular techniques indicate that knowledge on the presence and activity of various microbial species is far from complete. Such techniques may give a wealth of information, but to include such findings in systems predicting the nutrient profile requires close collaboration between molecular scientists and mathematical modellers on interpreting and evaluating quantitative data. Protozoal metabolism is of particular interest here given the paucity of quantitative data. Empirical models lack the biological basis necessary to evaluate mitigation strategies to reduce excretion of waste, including nitrogen, phosphorus and methane. Such models may have little predictive value when comparing various feeding strategies. Examples include the Intergovernmental Panel on Climate Change (IPCC) Tier II models to quantify methane emissions and current protein evaluation systems to evaluate low protein diets to reduce nitrogen losses to the environment. Nutrient based mechanistic models can address such issues. Since environmental issues generally attract more funding from governmental offices, further development of nutrient based models may well take place within an environmental framework.
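As one concrete instance of how degradation and passage rates combine (the standard Ørskov-McDonald form, given here as an illustration of our own rather than as the models the paper advocates):

```latex
% Degradation of a feed fraction over time t in the rumen:
%   P(t) = a + b\,\bigl(1 - e^{-k_d t}\bigr)
% Effective degradability, combining degradation rate k_d with passage rate k_p:
%   \mathrm{ED} = a + \frac{b\,k_d}{k_d + k_p}
% e.g. a = 0.25, b = 0.60, k_d = 0.06\,\mathrm{h}^{-1}, k_p = 0.04\,\mathrm{h}^{-1}
% gives \mathrm{ED} = 0.25 + 0.60 \times 0.06 / 0.10 = 0.61.
```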
Abstract:
This paper describes and analyses the experience of designing, installing and evaluating a farmer-usable touch screen information kiosk on cattle health in a veterinary institution in Pondicherry. The contents of the kiosk were prepared based on identified demands for information on cattle health, arrived at through various stakeholders meetings. Information on these cattle diseases and conditions affecting the livelihoods of the poor was provided through graphics, text and audio back-up, keeping in mind the needs of landless and illiterate poor cattle owners. A methodology for kiosk evaluation based on the feedback obtained from kiosk facilitator, critical group reflection and individual users was formulated. The formative evaluation reveals the potential strength this ICT has in transferring information to the cattle owners in a service delivery centre. Such information is vital in preventing diseases and helps cattle owners to present and treat their animals at an early stage of disease condition. This in turn helps prevent direct and indirect losses to the cattle owners. The study reveals how an information kiosk installed at a government institution as a freely accessible source of information to all farmers irrespective of their class and caste can help in transfer of information among poor cattle owners, provided periodic updating, interactivity and communication variability are taken care of. Being in the veterinary centre, the kiosk helps stimulate dialogue, and facilitates demand of services based on the information provided by the kiosk screens.