887 results for Computer System Management
Abstract:
The Body Mass Index (BMI) can be used by farmers to help determine when to evaluate an animal's body mass gain. However, calculating this index does not immediately reveal whether the animal is ready for slaughter or whether it needs special fattening care. The aim of this study was to develop software using fuzzy logic to compare bovine body masses among themselves and to identify the groups ready for slaughter and those that require more intensive feeding, using the "mass" and "height" variables and a fuzzy BMI as output. The software was developed as a fuzzy system applied to a herd of 147 Nellore cows located in the city of Santa Rita do Pardo, Mato Grosso do Sul (MS) state, Brazil, with a database generated in the Matlab software.
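The abstract does not give the system's membership functions or the exact bovine BMI formula, so the following is only a minimal Python sketch of the idea: triangular fuzzy memberships over a BMI computed as mass/height², with all ranges and labels hypothetical.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_bmi(mass_kg, height_m):
    """Classify an animal as ready for slaughter or needing fattening.
    The BMI formula and the membership ranges below are illustrative
    assumptions, not the paper's actual parameters."""
    bmi = mass_kg / height_m ** 2
    low = tri(bmi, 150, 200, 260)    # hypothetical "needs fattening" range
    high = tri(bmi, 240, 300, 360)   # hypothetical "ready for slaughter" range
    return ("slaughter" if high > low else "fattening"), bmi
```

A real implementation (the paper used Matlab's fuzzy toolbox) would defuzzify a full rule base rather than compare two memberships, but the input/output shape is the same.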
Abstract:
This Master's thesis was carried out for a business line of a Finnish construction company offering property and facility management services. The starting point of the work was to develop a management system to guide and unify the operations of this business line. The thesis consists of an introduction, a theoretical part, an empirical part, and conclusions. The introduction presents the methods used in the work and the target company. The theoretical part presents the theoretical frameworks later applied in the empirical part; it covers an overview of the real estate industry, quality thinking, and a review of process management. The empirical part contains an account of the target company's initial state and of the development work carried out. The development work includes a verbal and visual description of the different phases of the management system that was built. The result of the work is a practical tool for property and facility management services that supports operations and facilitates their management. The management system was developed together with the unit's key personnel, so the wishes of its future users were taken into account in building it. Among other things, the management system centralises the information and documents that operations require.
Abstract:
New information and communication technologies now occupy an important place in companies, whatever their size or field(s) of activity. They contribute positively to the development of economic life. They are, however, also at the origin of a new form of criminality that threatens the security and integrity of corporate computer systems. The scale of this criminality is difficult to assess and, above all, difficult to control with the legislative provisions already in place, which makes clear that an adaptation of the law is inevitable. Some industrialised countries have therefore decided to put in place an adequate legal framework to guarantee companies the security of their computer systems. Our study focuses precisely on the measures put in place by two different legal systems. Forced to take into account a new reality, one that did not necessarily exist some years ago, France and Canada decided to amend their penal and criminal codes respectively, adding provisions that punish new offences. In this work we analyse the offences that compromise the security of corporate computer systems in the light of the legal tools put in place, and we measure their degree of effectiveness against the realities of computing. In other words, our task is to determine whether or not the law will meet the needs of computing.
Abstract:
Traditionally, we've focussed on the question of how to make a system easy to code the first time, or perhaps on how to ease the system's continued evolution. But if we look at life cycle costs, then we must conclude that the important question is how to make a system easy to operate. To do this we need to make it easy for the operators to see what's going on and to then manipulate the system so that it does what it is supposed to. This is a radically different criterion for success. What makes a computer system visible and controllable? This is a difficult question, but it's clear that today's modern operating systems with nearly 50 million source lines of code are neither. Strikingly, the MIT Lisp Machine and its commercial successors provided almost the same functionality as today's mainstream systems, but with only 1 million lines of code. This paper is a retrospective examination of the features of the Lisp Machine hardware and software system. Our key claim is that by building the object abstraction into the lowest tiers of the system, great synergy and clarity were obtained. It is our hope that this is a lesson that can impact tomorrow's designs. We also speculate on how the spirit of the Lisp Machine could be extended to include a comprehensive access control model and how new layers of abstraction could further enrich this model.
Abstract:
This final degree project aims to investigate and experiment with a new line of development of dynamic routing algorithms. Starting from the AntNet-QoS algorithm [2], new measurements are incorporated into the algorithm (available bandwidth and jitter) which, combined with the delay measurement already in use, allow it to adapt better to the current traffic conditions in the network and to the traffic's specific quality-of-service (QoS) requirements.
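The abstract names the three path measurements (delay, available bandwidth, jitter) but not how AntNet-QoS combines them. As a hedged illustration only, a routing agent might fold the three into a single path-desirability score; the weights and normalisation constants below are invented for the sketch and are not the project's actual parameters.

```python
def path_score(delay_ms, avail_bw_mbps, jitter_ms,
               w_delay=0.5, w_bw=0.3, w_jitter=0.2):
    """Combine three path measurements into one desirability score in [0, 1];
    higher is better. All constants are illustrative assumptions."""
    d = 1.0 / (1.0 + delay_ms / 100.0)    # lower delay  -> higher score
    b = min(avail_bw_mbps / 100.0, 1.0)   # saturate bandwidth at 100 Mbps
    j = 1.0 / (1.0 + jitter_ms / 10.0)    # lower jitter -> higher score
    return w_delay * d + w_bw * b + w_jitter * j
```

In an AntNet-style scheme, scores like this would update per-neighbour routing probabilities as forward/backward ants traverse the network, so paths that score well attract more traffic.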
Abstract:
The Administrative Development System and its scope under the current Public Administration Act. Before Law 489 of 1998, the entities of the Public Administration did not work in a coordinated manner and inter-institutional collaboration was deficient, which has been corrected since the cited statute came into force. Under the Administrative Development System created by Law 489 of 1998, the entities of the Public Administration at the national and territorial levels must apply a set of policies, strategies, methodologies, techniques, and mechanisms of an administrative and organisational nature for the management of their human, technical, material, physical, and financial resources, aimed at strengthening their administrative capacity and institutional performance. The Administrative Development Policies were set out in Article 17 of Law 489 of 1998 and regrouped by Decree 3622 of 2005 into five items for better comprehension and application: 1) State Human Talent Development Policy; 2) Quality Management Policy; 3) Policy for the Democratisation of Public Administration; 4) Policy for the Moralisation and Transparency of Public Administration; and 5) Organisational Redesign Policy. These policies, together with the National Training and Capacity-Building Plan, constitute the foundations of the Administrative Development System referred to in Article 16 of Law 489 of 1998, and they are currently applied in the 19 Administrative Sectors of the Executive Branch at the national level. At the territorial level, the application of the Administrative Development System has been partial and its results are not yet consolidated.
Our research seeks, through a clear understanding of the subject and adequate planning of the resources of public entities, oriented toward strengthening their administrative capacity and institutional performance, to achieve better administration of the State.
Abstract:
This case study analyses the transformations in work organisation and labour relations at a Colombian steelworks following the new management model introduced by the multinational that acquired the company in 2007.
Abstract:
Following a theoretical framework drawing on several decades of work by various authors on management control systems, this study examines and tests the relationship between the development of such systems and organisational resources and capabilities. To this end, a case study was carried out at Teleperformance Colombia (TC), a company providing business process outsourcing services. The study established two variables for evaluating the development of the management control system: design and use. For each, indicators and questions were defined to enable observation and subsequent analysis. Likewise, the resources and capabilities most important to the business were selected: innovation, organisational learning, and human capital. The existence of a relationship between these and the management control system (MCS) implemented at TC was then tested. The information obtained was analysed and contrasted using statistical tests widely employed in this kind of social science study. Finally, six possible relationships were analysed, of which only the positive relationship between the use of the management control system and the human capital resource and capability was confirmed. The remaining relationships refuted the theoretical propositions that posited some influence of management control systems on the innovation and organisational learning resources and capabilities.
Abstract:
Traditionally, the real world has been reproduced for us through flat images. These images used to be materialised as paintings on canvas or as drawings. Today, fortunately, we can still see hand-made paintings, although most images are acquired with cameras and either shown directly to an audience, as in cinema, television, or photographic exhibitions, or processed by a computer system to obtain a particular result. Such processing is applied in fields ranging from industrial quality control to cutting-edge research in artificial intelligence. By applying mid-level processing algorithms, 3D images can be obtained from 2D images using the well-known family of techniques called Shape From X, where X denotes the method used to obtain the third dimension and varies with the technique employed. Although the evolution toward the 3D camera began in the 1990s, the techniques for obtaining three-dimensional shapes need to become ever more accurate. The applications of 3D scanners have grown considerably in recent years, especially in fields such as leisure, computer-assisted diagnosis/surgery, robotics, etc. One of the most widely used techniques for obtaining 3D information from a scene is triangulation and, more specifically, the use of three-dimensional laser scanners. Since their formal appearance in scientific publications in 1971 [SS71], there have been contributions addressing their inherent problems, such as reducing occlusions and improving accuracy, acquisition speed, shape description, etc. Each and every method for obtaining 3D points from a scene has an associated calibration process, and this process plays a decisive role in the performance of a three-dimensional acquisition device.
The aim of this thesis is to address the problem of 3D shape acquisition from a comprehensive point of view: reporting a state of the art on triangulation-based laser scanners, testing the operation and performance of different systems, contributing improvements to the accuracy of laser-stripe detection, especially under adverse conditions, and solving the calibration problem by means of projective geometric methods.
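For the triangulation principle the thesis builds on, depth recovery with a calibrated camera-laser pair reduces to intersecting a camera ray with the laser plane. A minimal sketch, assuming a pinhole camera (u = f·X/Z, principal point at u = 0) and a laser plane X = b − Z·tan θ offset by baseline b; all parameter values below are illustrative, not from the thesis:

```python
import math

def triangulate_depth(u_px, f_px, baseline_m, theta_rad):
    """Depth Z of a laser-lit point seen at image coordinate u_px.
    Derivation: u = f*X/Z and X = b - Z*tan(theta) give
    Z = f*b / (u + f*tan(theta)). Units: pixels for u and f,
    metres for the baseline, radians for the laser-plane tilt."""
    return f_px * baseline_m / (u_px + f_px * math.tan(theta_rad))
```

The calibration problem the thesis addresses is precisely estimating f, b, and θ (plus lens distortion and the stripe's sub-pixel position u) so that this intersection is accurate.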
Abstract:
Would a research assistant - who can search for ideas related to those you are working on, network with others (but only share the things you have chosen to share), doesn’t need coffee and who might even, one day, appear to be conscious - help you get your work done? Would it help your students learn? There is a body of work showing that digital learning assistants can be a benefit to learners. It has been suggested that adaptive, caring, agents are more beneficial. Would a conscious agent be more caring, more adaptive, and better able to deal with changes in its learning partner’s life? Allow the system to try to dynamically model the user, so that it can make predictions about what is needed next, and how effective a particular intervention will be. Now, given that the system is essentially doing the same things as the user, why don’t we design the system so that it can try to model itself in the same way? This should mimic a primitive self-awareness. People develop their personalities, their identities, through interacting with others. It takes years for a human to develop a full sense of self. Nobody should expect a prototypical conscious computer system to be able to develop any faster than that. How can we provide a computer system with enough social contact to enable it to learn about itself and others? We can make it part of a network. Not just chatting with other computers about computer ‘stuff’, but involved in real human activity. Exposed to ‘raw meaning’ – the developing folksonomies coming out of the learning activities of humans, whether they are traditional students or lifelong learners (a term which should encompass everyone). Humans have complex psyches, comprised of multiple strands of identity which reflect as different roles in the communities of which they are part – so why not design our system the same way? 
With multiple internal modes of operation, each capable of being reflected onto the outside world in the form of roles – as a mentor, a research assistant, maybe even as a friend. But in order to be able to work with a human for long enough to be able to have a chance of developing the sort of rich behaviours we associate with people, the system needs to be able to function in a practical and helpful role. Unfortunately, it is unlikely to get a free ride from many people (other than its developer!) – so it needs to be able to perform a useful role, and do so securely, respecting the privacy of its partner. Can we create a system which learns to be more human whilst helping people learn?
Abstract:
The intelligent controlling mechanism of a typical mobile robot is usually a computer system. Research is however now ongoing in which biological neural networks are being cultured and trained to act as the brain of an interactive real world robot – thereby either completely replacing or operating in a cooperative fashion with a computer system. Studying such neural systems can give a distinct insight into biological neural structures and therefore such research has immediate medical implications. The principal aims of the present research are to assess the computational and learning capacity of dissociated cultured neuronal networks with a view to advancing network level processing of artificial neural networks. This will be approached by the creation of an artificial hybrid system (animat) involving closed loop control of a mobile robot by a dissociated culture of rat neurons. This paper details the components of the overall animat closed loop system architecture and reports on the evaluation of the results from preliminary real-life and simulated robot experiments.
Abstract:
The intelligent controlling mechanism of a typical mobile robot is usually a computer system. Some recent research is ongoing in which biological neurons are being cultured and trained to act as the brain of an interactive real-world robot, thereby either completely replacing, or operating in a cooperative fashion with, a computer system. Studying such hybrid systems can provide distinct insights into the operation of biological neural structures, and therefore such research has immediate medical implications as well as enormous potential in robotics. The main aim of the research is to assess the computational and learning capacity of dissociated cultured neuronal networks. A hybrid system incorporating closed-loop control of a mobile robot by a dissociated culture of neurons has been created. The system is flexible and allows for closed-loop operation with either a hardware robot or its software simulation. The paper provides an overview of the problem area, gives an idea of the breadth of present ongoing research, establishes a new system architecture and, as an example, reports the results of experiments conducted with real-life robots.
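The abstracts describe the closed loop (robot sensors, electrical stimulation of the culture, recorded firing activity, motor commands) without giving code. The sketch below shows only the shape of one loop iteration; the deterministic stub stands in for the real multi-electrode-array interface, and the sensory-coding and motor-decoding constants are invented for illustration.

```python
class CultureStub:
    """Deterministic stand-in for the MEA-coupled neuronal culture.
    The real system stimulates electrodes and records spike trains;
    here the 'firing rate' simply tracks the stimulation frequency
    so the control loop can be exercised in software."""
    def stimulate_and_read(self, freq_hz):
        return freq_hz * 0.8  # pretend response scales with stimulation

def control_step(sonar_dist_m, culture):
    """One iteration of the animat closed loop.
    Sensory coding: a closer obstacle yields a higher stimulation
    frequency; motor decoding: a strong culture response triggers a
    turn, a weak one lets the robot drive on. Thresholds are assumptions."""
    freq = max(0.0, (1.0 - min(sonar_dist_m, 1.0)) * 50.0)
    rate = culture.stimulate_and_read(freq)
    if rate > 20.0:
        return ("turn", rate)
    return ("forward", rate)
```

In the reported experiments the culture's response is plastic, so repeated loop iterations are what allow learning to be measured; the stub above obviously has no such plasticity.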
Abstract:
It is usually expected that the intelligent controlling mechanism of a robot is a computer system. Research is however now ongoing in which biological neural networks are being cultured and trained to act as the brain of an interactive real-world robot - thereby either completely replacing or operating in a cooperative fashion with a computer system. Studying such neural systems can give a distinct insight into biological neural structures and therefore such research has immediate medical implications. In particular, the use of rodent primary dissociated cultured neuronal networks for the control of mobile 'animats' (artificial animals; a contraction of animal and materials) is a novel approach to discovering the computational capabilities of networks of biological neurones. A dissociated culture of this nature requires appropriate embodiment in some form, to enable appropriate development in a controlled environment within which appropriate stimuli may be received via sensory data but ultimate influence over motor actions is retained. The principal aims of the present research are to assess the computational and learning capacity of dissociated cultured neuronal networks with a view to advancing network-level processing of artificial neural networks. This will be approached by the creation of an artificial hybrid system (animat) involving closed-loop control of a mobile robot by a dissociated culture of rat neurons. This 'closed loop' interaction with the environment, through both sensing and effecting, will enable investigation of its learning capacity. This paper details the components of the overall animat closed-loop system and reports on the evaluation of the results from the experiments being carried out with regard to robot behaviour.
Abstract:
Aim: To determine the prevalence and nature of prescribing errors in general practice; to explore the causes, and to identify defences against error. Methods: 1) Systematic reviews; 2) Retrospective review of unique medication items prescribed over a 12-month period to a 2% sample of patients from 15 general practices in England; 3) Interviews with 34 prescribers regarding 70 potential errors; 15 root cause analyses; and six focus groups involving 46 primary health care team members. Results: The study involved examination of 6,048 unique prescription items for 1,777 patients. Prescribing or monitoring errors were detected for one in eight patients, involving around one in 20 of all prescription items. The vast majority of the errors were of mild to moderate severity, with one in 550 items being associated with a severe error. The following factors were associated with increased risk of prescribing or monitoring errors: male gender, age less than 15 years or greater than 64 years, number of unique medication items prescribed, and being prescribed preparations in the following therapeutic areas: cardiovascular, infections, malignant disease and immunosuppression, musculoskeletal, eye, ENT and skin. Prescribing or monitoring errors were not associated with the grade of GP or whether prescriptions were issued as acute or repeat items. A wide range of underlying causes of error were identified relating to the prescriber, the patient, the team, the working environment, the task, the computer system and the primary/secondary care interface. Many defences against error were also identified, including strategies employed by individual prescribers and primary care teams, and making best use of health information technology. Conclusion: Prescribing errors in general practices are common, although severe errors are unusual. Many factors increase the risk of error.
Strategies for reducing the prevalence of error should focus on GP training, continuing professional development for GPs, clinical governance, effective use of clinical computer systems, and improving safety systems within general practices and at the interface with secondary care.
Abstract:
Aim: To examine the causes of prescribing and monitoring errors in English general practices and provide recommendations for how they may be overcome. Design: Qualitative interview and focus group study with purposive sampling and thematic analysis informed by Reason’s accident causation model. Participants: General practice staff participated in a combination of semi-structured interviews (n=34) and six focus groups (n=46). Setting: Fifteen general practices across three primary care trusts in England. Results: We identified seven categories of high-level error-producing conditions: the prescriber, the patient, the team, the task, the working environment, the computer system, and the primary-secondary care interface. Each of these was further broken down to reveal various error-producing conditions. The prescriber’s therapeutic training, drug knowledge and experience, knowledge of the patient, perception of risk, and their physical and emotional health were all identified as possible causes. The patient’s characteristics and the complexity of the individual clinical case were also found to have contributed to prescribing errors. The importance of feeling comfortable within the practice team was highlighted, as were concerns about the safety of general practitioners (GPs) signing prescriptions generated by nurses when they had not seen the patient themselves. The working environment, with its high workload, time pressures, and interruptions, and computer-related issues associated with mis-selecting drugs from electronic pick-lists and overriding alerts, were all highlighted as possible, and often interconnected, causes of prescribing errors. Conclusion: This study has highlighted the complex underlying causes of prescribing and monitoring errors in general practices, several of which are amenable to intervention.