824 results for Probabilistic decision process model
Abstract:
There is growing interest in providing women with internatal care, a package of healthcare and ancillary services that can improve their health during the period after the termination of one pregnancy and before the conception of the next. Women who have had a pregnancy affected by a neural tube defect can especially benefit from internatal care because they are at increased risk for recurrence, and improvements to their health during the inter-pregnancy period can prevent future negative birth outcomes. This dissertation provides three papers that inform the content of internatal care for women at risk for recurrence by examining descriptive epidemiology to develop an accurate risk profile of the population, assessing whether women at risk for recurrence would benefit from a psychosocial intervention, and determining how to improve health promotion efforts targeting folic acid use.

Paper one identifies information relevant for developing risk profiles and conducting risk assessments. A number of investigations have found that the risk for neural tube defects differs between non-Hispanic Whites and Hispanics. To understand the risk difference, the descriptive epidemiology of spina bifida and anencephaly was examined for Hispanics and non-Hispanic Whites based on data from the Texas Birth Defects Registry for the years 1999 through 2004. Crude and adjusted birth prevalence ratios and corresponding 95% confidence intervals were calculated between descriptive epidemiologic characteristics and anencephaly and spina bifida for non-Hispanic Whites and for Hispanics. In both race/ethnic groups, anencephaly showed an inverse relationship with maternal age and a positive linear relationship with parity; both relationships were stronger in non-Hispanic Whites. Female infants had a higher risk for anencephaly among non-Hispanic Whites, and lower maternal education was associated with increased risk for spina bifida among Hispanics.

Paper two assesses the need for a psychosocial intervention. For mothers of children with spina bifida, the transition to motherhood can be stressful. This qualitative study explored the process of becoming a mother to a child with spina bifida, focusing particularly on stress and coping in the immediate postnatal environment. Semi-structured interviews were conducted with six mothers of children with spina bifida. Mothers were asked about their initial emotional and problem-based coping efforts, the quality and kind of support provided by health providers, and the characteristics of their meaning-based coping efforts; the questions matched constructs of the Transactional Model of Stress and Coping (TMSC). Analysis of the responses revealed a number of modifiable stress and coping transactions, the most salient being that health providers are in a position to address beliefs about self-causality and to prevent mothers from experiencing the repercussions of maintaining these beliefs.

Paper three identifies considerations for creating health promotion materials targeting folic acid use. A brochure was designed using concepts from the Precaution Adoption Process Model (PAPM). Three focus groups comprising 26 mothers of children with spina bifida evaluated the brochure. One focus group was conducted in Spanish only; the other two were conducted in a mix of English and Spanish. Qualitative analysis of coded transcripts revealed that a brochure is a helpful adjunct, and questions about folic acid support the inclusion of an insert with basic information. There may be a need to develop different educational material for Hispanics so that the importance of folic acid is presented in a situational context. Some participants blamed themselves for their pregnancy outcome, which may affect their receptivity to the brochure's messages. The women's desire for photographs that affect their perception of threat, and their identification with the second role model, indicate that they belong to PAPM Stages 2 and 3. Participants preferred colorful envelopes, high-quality paper, intimidating photographs, simple words, conversational sentences, and positive messages.

Together, these papers develop the content of the risk assessment, psychosocial intervention, and health promotion components of internatal care as they apply to women at risk for recurrence. The findings provide evidence for considering parity and maternal age when assessing nutritional risk, and the two dissimilarities between the race/ethnic groups, infant sex and maternal education, lend support to creating separate risk profiles. Interviews with mothers of children with spina bifida revealed unmet needs, suggesting that a psychosocial intervention provided as part of internatal care can strengthen and support women's well-being. Segmenting the audience according to race/ethnicity and PAPM stage can improve the relevance of print materials promoting folic acid use.
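The crude birth prevalence ratios and 95% confidence intervals reported in paper one can be illustrated with a small worked example. The sketch below uses the standard log-scale (Katz) interval for a ratio of two prevalences; the counts are invented for illustration and are not Texas Birth Defects Registry data.

```python
import numpy as np

# Hypothetical counts: spina bifida cases and total live births in two
# maternal-education strata (illustrative numbers, not registry data).
cases_low, births_low = 60, 100_000     # lower maternal education
cases_high, births_high = 35, 100_000   # higher maternal education

p_low = cases_low / births_low
p_high = cases_high / births_high
pr = p_low / p_high                     # crude birth prevalence ratio

# 95% CI via the normal approximation on the log scale (Katz method).
se = np.sqrt(1/cases_low - 1/births_low + 1/cases_high - 1/births_high)
lo, hi = np.exp(np.log(pr) - 1.96 * se), np.exp(np.log(pr) + 1.96 * se)
print(f"PR = {pr:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")   # PR = 1.71 here
```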
Abstract:
This is an implementation analysis of three consecutive state health policies whose goal was to improve access to maternal and child health services in Texas from 1983 to 1986. Of particular interest are the choice of the policy subsystem as the unit of analysis and the network approach to analysis. The network approach analyzes and compares the structure and decision process of six policy subsystems in order to explain program performance. Both changes in state health policy and differences in implementation contexts explain the evolution of the program's administrative and service unit, the policy subsystem; in turn, the evolution of the policy subsystem explains changes in program performance.
Abstract:
This dissertation focuses on Project HOPE, an American medical aid agency, and its work in Tunisia. More specifically, it is a study of the implementation strategies of those HOPE-sponsored projects and programs designed to address the high morbidity and infant mortality rates caused by environmentally related diarrheal and enteric diseases. Several environmental health programs and projects developed in cooperation with Tunisian counterparts are described and analyzed, including (1) a paramedical manpower training program; (2) a national hospital sanitation and infection control program; (3) a community sewage disposal project; (4) a well reconstruction project; and (5) a solid-waste disposal project for a hospital.

After independence, Tunisia, like many developing countries, encountered several difficulties that hindered progress toward solving basic environmental health problems and prompted a request for aid. This study discusses the need for all who work in development programs to recognize and assess the difficulties or constraints that affect the program planning process, including the latent cultural and political constraints that exist not only within the host country but within the aid agency as well. For example, failure to recognize cultural differences may adversely affect the attitudes of the host staff toward their work and toward the aid agency and its task. These factors play a significant role in influencing program development decisions and must be taken into account to maximize the probability of successful outcomes.

In 1969 Project HOPE was asked by the Tunisian government to assist the Ministry of Health in solving its health manpower problems. HOPE responded with several programs, one of which concerned the training of public health nurses, sanitary technicians, and aides at Tunisia's school of public health in Nabeul. The outcome of that program, as well as the strategies used in its development, is analyzed, and certain questions are addressed, such as what the indicators of success should be and when the time is right to phase out.

Another HOPE program analyzed involved hospital sanitation and infection control. Certain generic aspects of basic hospital sanitation procedures were documented and presented in the form of a process model that was later used as a "microplan" in setting up similar programs in other Tunisian hospitals. The details of the "microplan" are discussed in this study. The development of a nationwide program without any further need for external assistance illustrated the success of HOPE's implementation strategies.

Finally, although it is known that the high incidence of enteric disease in developing countries is due to poor environmental sanitation and poor hygiene practices, efforts by aid agencies to correct these conditions have often resulted in failure. Project HOPE's strategy was to maximize limited resources by using a systems approach to program development and by becoming actively involved in the design and implementation of environmental health projects utilizing "appropriate" technology. Three innovative projects and their implementation strategies (including technical specifications) are described. It is argued that if aid agencies are to make progress in helping developing countries solve basic sanitation problems, they must take an interdisciplinary approach to program development and play an active role in helping counterparts seek and identify appropriate technologies that are socially and economically acceptable.
Abstract:
Pneumonia is a well-documented and common respiratory infection in patients with acute traumatic spinal cord injuries, and it may recur during the course of acute care. Using data from the North American Clinical Trials Network (NACTN) for Spinal Cord Injury, the incidence, timing, and recurrence of pneumonia were analyzed. The two main objectives were (1) to investigate the time to, and potential risk factors for, the first occurrence of pneumonia using the Cox Proportional Hazards model, and (2) to investigate pneumonia recurrence and its risk factors using a Counting Process model that generalizes the Cox Proportional Hazards model. The survival analysis suggested that surgery, intubation, American Spinal Injury Association (ASIA) grade, direct admission to a NACTN site, and age (older than 65 or not) were significant risk factors for both the first occurrence and multiple occurrences of pneumonia. The significance of this research is its potential to identify, at the time of admission, patients who are at high risk for the incidence and recurrence of pneumonia. Knowledge of the occurrence and timing of pneumonia is important for developing prevention strategies and may also provide insight into the selection of emerging therapies that compromise the immune system.
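The two modelling steps described above can be sketched with the lifelines library: a standard Cox model for time to first pneumonia, and the counting-process (start, stop] data layout, as used by Andersen-Gill-style recurrent-event models, for repeated episodes. All data and column names below are synthetic placeholders, not NACTN data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, CoxTimeVaryingFitter

rng = np.random.default_rng(0)
n = 200
intubated = rng.integers(0, 2, n)
age_over_65 = rng.integers(0, 2, n)

# --- Time to first pneumonia (Cox Proportional Hazards) ----------------
t = rng.exponential(scale=30 / (1 + intubated))   # higher hazard if intubated
first = pd.DataFrame({
    "time": np.minimum(t, 60.0),                  # administrative censoring at day 60
    "pneumonia": (t < 60.0).astype(int),
    "intubated": intubated,
    "age_over_65": age_over_65,
})
CoxPHFitter().fit(first, duration_col="time",
                  event_col="pneumonia").print_summary()

# --- Recurrent pneumonia in the counting-process layout ----------------
# One row per at-risk interval (start, stop]; a patient contributes a new
# interval after each episode, which generalizes the Cox model to recurrence.
rows = []
for pid in range(n):
    start = 0.0
    for _ in range(3):                            # at most 3 intervals each
        stop = min(start + rng.exponential(25 / (1 + intubated[pid])), 60.0)
        rows.append((pid, start, stop, int(stop < 60.0),
                     intubated[pid], age_over_65[pid]))
        if stop >= 60.0:
            break
        start = stop
recurrent = pd.DataFrame(rows, columns=[
    "id", "start", "stop", "pneumonia", "intubated", "age_over_65"])
CoxTimeVaryingFitter().fit(recurrent, id_col="id", event_col="pneumonia",
                           start_col="start", stop_col="stop").print_summary()
```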
Abstract:
INTRODUCTION: Objective assessment of motor skills has become an important challenge in minimally invasive surgery (MIS) training. Currently, there is no gold standard defining and determining a resident's surgical competence. To aid in the decision process, we analyze the validity of a supervised classifier to determine the degree of MIS competence based on assessment of psychomotor skills. METHODOLOGY: An ANFIS is trained to classify performance in a box trainer peg transfer task performed by two groups (expert/non-expert). There were 42 participants included in the study: the non-expert group consisted of 16 medical students and 8 residents (< 10 MIS procedures performed), whereas the expert group consisted of 14 residents (> 10 MIS procedures performed) and 4 experienced surgeons. Instrument movements were captured by means of the Endoscopic Video Analysis (EVA) tracking system. Nine motion analysis parameters (MAPs) were analyzed: time, path length, depth, average speed, average acceleration, economy of area, economy of volume, idle time, and motion smoothness. Data reduction was performed by means of principal component analysis, and the result was used to train the ANFIS net. Performance was measured by leave-one-out cross-validation. RESULTS: The ANFIS presented an accuracy of 80.95%, with 13 experts and 21 non-experts correctly classified. Total root mean square error was 0.88, while the area under the classifier's ROC curve (AUC) was 0.81. DISCUSSION: We have shown the usefulness of ANFIS for classification of MIS competence in a simple box trainer exercise. The main advantage of using ANFIS resides in its continuous output, which allows fine discrimination of surgical competence. There are, however, challenges that must be taken into account when considering the use of ANFIS (e.g., training time, architecture modeling). Despite this, we have shown the discriminative power of ANFIS for a low-difficulty box trainer task, regardless of the individual significances of the MAPs. Future studies are required to confirm these findings and to include new tasks, conditions, and sample populations.
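A rough outline of the evaluation pipeline (PCA for data reduction, then leave-one-out cross-validation with accuracy and AUC) can be reproduced with scikit-learn. ANFIS itself is not available in scikit-learn, so the sketch below substitutes an MLP classifier as a stand-in and uses synthetic data; only the pipeline structure mirrors the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(42, 9))      # 42 participants x 9 MAPs (synthetic)
y = (X[:, 0] - X[:, 1] + rng.normal(0, 0.5, 42) > 0).astype(int)  # 1 = expert

# ANFIS is not in scikit-learn; an MLP stands in to show the pipeline only.
clf = make_pipeline(StandardScaler(), PCA(n_components=3),
                    MLPClassifier(max_iter=2000, random_state=0))

# Leave-one-out CV: each participant is scored by a model trained on the rest.
proba = cross_val_predict(clf, X, y, cv=LeaveOneOut(),
                          method="predict_proba")[:, 1]
print("accuracy:", accuracy_score(y, proba > 0.5))
print("AUC:", roc_auc_score(y, proba))
```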
Abstract:
Advances in hardware make huge volumes of data available, and applications are emerging that must deliver information in near-real time, e.g., patient monitoring or health monitoring of water pipes. The needs of these applications have given rise to the data streaming model, as opposed to the traditional store-then-process model. In the store-then-process model, data is stored and later queried; in streaming systems, data is processed on arrival, producing continuous responses without ever being stored in full. This view imposes challenges for processing data on the fly: 1) responses must be produced continuously whenever new data arrives in the system; 2) data is accessed only once and, in general, is not stored in its entirety; and 3) the per-item processing time needed to produce a response must be low. Two models exist for computing continuous responses, the evolving model and the sliding window model; the latter fits certain applications better because it considers only the most recently received data rather than the entire history. In recent years, data stream mining has focused mainly on the evolving model. In the sliding window model, the body of work is smaller, since these algorithms must not only be incremental but must also delete the information that expires as the window slides, while still meeting the three challenges above.

One of the fundamental tasks in data mining is clustering: given a data set, the goal is to find representative groups that provide a concise description of the set. Clustering is critical in applications such as network intrusion detection or customer segmentation in marketing and advertising. Because of the massive amounts of data that must be processed in such applications (up to millions of events per second), centralized solutions may be unable to cope with processing-time constraints and must resort to discarding data during load peaks. To avoid this loss of data, stream processing must be distributed; in particular, clustering algorithms must be adapted to environments in which the data itself is distributed. In streaming, research focuses not only on designs for general tasks, such as clustering, but also on finding new approaches that fit particular scenarios better. As an example, an ad-hoc grouping mechanism turns out to be more adequate for defense against Distributed Denial of Service (DDoS) attacks than the traditional k-means problem.

This thesis contributes to the streaming clustering problem in both centralized and distributed environments. We have designed a centralized clustering algorithm and shown, in an extensive evaluation against other state-of-the-art solutions, its ability to discover high-quality clusters in low time. We have also worked on a data structure that significantly reduces the memory space required while keeping the error of the computations under control at all times. Our work additionally provides two protocols for distributing the computation of clusters. We analyze two key features: the impact on clustering quality when the computation is distributed, and the conditions required to reduce processing time relative to the centralized solution. Finally, we have developed a clustering-based framework for detecting DDoS attacks. In this last case, we characterize the types of attacks detected and evaluate the efficiency and effectiveness of mitigating the attack's impact.
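As a toy illustration of the sliding-window constraints discussed above (single pass over arriving data, bounded per-item work, and deletion of expired data), the sketch below maintains k centroids from additive per-cluster statistics so that a point's contribution can be subtracted when it leaves the window. This is a didactic simplification, not the algorithm proposed in the thesis.

```python
from collections import deque
import numpy as np

class SlidingWindowKMeans:
    """Toy k-centroid summary over the W most recent points of a stream."""

    def __init__(self, k, window, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.centers = rng.normal(size=(k, dim))  # initial random centroids
        self.sums = np.zeros((k, dim))            # additive linear sums
        self.counts = np.zeros(k)                 # points currently assigned
        self.window = deque()                     # (point, cluster) pairs
        self.W = window

    def insert(self, x):
        if len(self.window) == self.W:            # expire the oldest point
            old_x, old_c = self.window.popleft()
            self.sums[old_c] -= old_x
            self.counts[old_c] -= 1
        c = int(np.argmin(((self.centers - x) ** 2).sum(axis=1)))
        self.window.append((x, c))
        self.sums[c] += x
        self.counts[c] += 1
        live = self.counts > 0                    # recompute centroids in O(k)
        self.centers[live] = self.sums[live] / self.counts[live][:, None]

# Usage: process each arriving point once, keeping only the last 1000.
model = SlidingWindowKMeans(k=3, window=1000, dim=2)
for point in np.random.default_rng(1).normal(size=(5000, 2)):
    model.insert(point)
print(model.centers)
```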
Abstract:
Autonomous systems require, in most cases, reasoning and decision-making capabilities, and the decision process has to occur in real time. Real-time computing means that every situation or event must be answered before a temporal deadline. In complex applications, these deadlines are usually on the order of milliseconds, or even microseconds if the application is very demanding. To comply with these timing requirements, computing tasks have to be performed as fast as possible. The problem arises when the computations are no longer simple but very time-consuming operations. A good example can be found in autonomous navigation systems with visual-tracking submodules, where Kalman filtering is the most widespread solution. In recent years, however, some interesting new approaches have been developed; particle filtering, given its more general problem-solving features, has reached an important position in the field. The aim of this thesis is to design, implement, and validate a hardware platform that constitutes, in itself, an embedded intelligent system. The proposed system combines particle filtering and evolutionary computation algorithms to generate intelligent behavior. Traditional approaches to particle filtering and evolutionary computation have been developed on software platforms, with parallel capabilities to some extent. In this work, an additional goal is to fully exploit the advantages of hardware implementation. By using the computational resources available in an FPGA device, better performance in terms of computation time is expected; these hardware resources are placed in charge of the extensive repetitive computations. With this hardware-based implementation, real-time behavior is also expected.
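For reference, the core of a bootstrap particle filter, the algorithm the thesis proposes to accelerate in FPGA hardware, fits in a few lines of software. The sketch below tracks a 1-D random-walk state observed in Gaussian noise; the model and noise levels are illustrative assumptions, not the thesis design.

```python
import numpy as np

def particle_filter(observations, n_particles=500, seed=0):
    """Bootstrap particle filter for a 1-D random walk in Gaussian noise."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)   # samples from the prior
    estimates = []
    for z in observations:
        particles += rng.normal(0.0, 0.1, n_particles)    # predict step
        w = np.exp(-0.5 * ((z - particles) / 0.5) ** 2)   # weight by likelihood
        w /= w.sum()
        estimates.append(w @ particles)                   # posterior mean
        particles = particles[rng.choice(n_particles, n_particles, p=w)]  # resample
    return np.array(estimates)

# Usage: recover a slowly drifting state from noisy measurements.
true_state = np.cumsum(np.full(100, 0.05))
z = true_state + np.random.default_rng(1).normal(0, 0.5, 100)
print(particle_filter(z)[-5:])
```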
Abstract:
Because of the activities they must undertake, every year people raise their standards of travel (distance and time), and urban sprawl plays an important role in this growth. Governments therefore invest money in an exhaustive search for solutions to high levels of congestion and car trips. The complex relationship between the urban environment and travel behaviour has been studied in a number of cases. The objective of this paper is to answer the important question of which land-use attributes influence which dimensions of travel behaviour, and to verify to what extent specific urban planning measures affect the individual decision process, through exhaustive statistical and systematic tests. This paper finds that a crucial issue in the analysis of the relationship between the built environment and travel behaviour is the definition of indicators; accordingly, we recommend a feasible list of indicators for analyzing this relationship.
Abstract:
This paper contributes a unified formulation that merges previous analyses on the prediction of the performance (value function) of a given sequence of actions (policy) when an agent operates a Markov decision process with a large state space. When the states are represented by features and the value function is linearly approximated, our analysis reveals a new relationship between two common cost functions used to obtain the optimal approximation. In addition, this analysis allows us to propose an efficient adaptive algorithm that provides an unbiased linear estimate. The performance of the proposed algorithm is illustrated by simulation, showing competitive results when compared with state-of-the-art solutions.
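A minimal sketch of the setting the paper studies, linear approximation of a policy's value function from sampled transitions, is shown below using one-step temporal-difference learning. This illustrates the general problem only; it is not the unbiased adaptive estimator proposed in the paper.

```python
import numpy as np

def linear_td0(transitions, phi, n_features, alpha=0.05, gamma=0.95):
    """Estimate V(s) ~= theta . phi(s) for a fixed policy by TD(0).

    `transitions` yields (s, r, s_next, done) tuples sampled while the
    policy under evaluation controls the Markov decision process, and
    `phi` maps a state to its feature vector.
    """
    theta = np.zeros(n_features)
    for s, r, s_next, done in transitions:
        v = phi(s) @ theta
        v_next = 0.0 if done else phi(s_next) @ theta
        td_error = r + gamma * v_next - v     # one-step Bellman residual
        theta += alpha * td_error * phi(s)    # semi-gradient update
    return theta

# Usage on a toy 2-state chain with one-hot features.
phi = lambda s: np.eye(2)[s]
chain = [(0, 1.0, 1, False), (1, 0.0, 0, True)] * 500
print(linear_td0(chain, phi, n_features=2))
```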
Abstract:
There is currently great expectation surrounding the introduction of new tools and methods for software product development, which in the near future will enable an engineering approach to the software production process. The new methodologies now taking shape imply an integral approach to the problem, covering all stages of the production scheme. However, the degree of automation achieved in the system construction process is very low and is concentrated in the last phases of the software life cycle, yielding an insignificant reduction in costs and, more importantly, failing to guarantee the quality of the resulting software products. This thesis defines a structured software development methodology that can be automated, that is, a CASE methodology. The methodology presented follows the CASE development cycle model, which consists of the analysis, design, and testing phases, and its field of application is information systems. First, the basic principles on which the CASE methodology rests are established. Second, since the methodology begins by fixing the objectives of the company requesting an information system, techniques are employed for gathering and validating information that also provide an easy communication language between end users and developers. These same techniques specify all the system requirements completely, consistently, and unambiguously. Likewise, a set of techniques and algorithms is presented to automate, from the system requirements specification, the logical design of both the Process Model and the Data Model, each validated against the prior requirements specification. Finally, formal procedures are defined that indicate the set of activities to be carried out in the construction process and how to perform them, thereby achieving integrity across the different stages of the development process.
Abstract:
New technologies, such as the new Information and Communication Technologies (ICT), break new paths and redefine the way we understand business; Cloud Computing is one of them. On-demand resource gathering and per-usage payment schemes are now commonplace and allow companies to save on their ICT investments. Despite the importance of this issue, we still lack methodologies that help companies develop applications oriented toward exploitation in the Cloud. In this study we aim to fill this gap and propose a methodology for the development of ICT applications that are directed toward a business model and subsequently outsourced to the Cloud. For the former, the development of SOA applications, we take as a baseline scenario a business model from which to obtain a business process model, using software engineering tools to this end. For the latter, the outsourcing, we propose a guide to facilitate uploading business models into the Cloud; to this end we describe a SOA governance model, which controls the SOA. Additionally, we propose a Cloud governance model that integrates Service Level Agreements (SLAs), SOA governance, and the Cloud architecture. Finally, we apply our methodology to an example illustrating our proposal. We believe that our proposal can be used as a guide or pattern for the development of business applications.
Abstract:
Today it is impossible to conceive of a company, however small, without some kind of IT service. Every company faces the challenge of undertaking projects to develop or contract IT services that support its various business processes. On the other hand, unless its IT services are isolated from any network, which is practically impossible today, no service, and no project that develops one, can guarantee 100% security. The company therefore manages a duality: developing secure IT products and services, and constantly maintaining its IT services in a secure state. In most companies, the management of projects for developing IT services is addressed by applying practices used in other projects and recommended for this purpose by the most widely recognized frameworks and standards. In general, these frameworks include, among their processes, risk management oriented toward meeting deadlines and costs and, sometimes, toward the functionality of the product or service. However, these practices overlook the security aspects of the product or service (confidentiality, integrity, and availability) that must be addressed during the project. Moreover, once the service has been delivered, when a failure related to these security aspects arises at the operational level, ad-hoc solutions are applied. This causes great losses and sometimes endangers the continuity of the company itself. The problem grows every day, in every type of company, and SMEs, given their lack of awareness of the problem itself and their scarcity of methodological and technical resources, are the most vulnerable.

For all these reasons, this doctoral thesis has a double objective. First, to demonstrate the need for a framework that, integrated with other possible frameworks and standards, is simple to apply to projects of different types and sizes, and that guides SMEs in managing projects for the secure development and subsequent security maintenance of their IT services. Second, to meet this need by developing a framework that offers a generic process model applicable to different project patterns, together with a library of security assets that guides SMEs through the project management process for secure development. The process model of the proposed framework describes activities at the three organizational levels of the company (strategic, tactical, and operational). It is based on the continuous improvement cycle (PDCA) and on the Security by Design philosophy proposed by Siemens. For each activity, the specific practices, inputs, outputs, actions, roles, KPIs, and applicable techniques are detailed. These specific practices may or may not be applied, at the discretion of the project manager and according to the state of the company and of the project to be developed, thereby establishing different process patterns. Two SMEs were chosen to validate the framework, the first from the services sector and the second from the ICT sector. The process model was applied to a single project pattern that responds to needs common to both companies, and the pattern was assessed in the selected projects at both companies before and after its application. The results of the study, after application in both companies, validated the process pattern as an improvement in project management for the secure development of IT in SMEs.
Abstract:
The primate visual system offers unprecedented opportunities for investigating the neural basis of cognition. Even the simplest visual discrimination task requires processing of sensory signals, formation of a decision, and orchestration of a motor response. With our extensive knowledge of the primate visual and oculomotor systems as a base, it is now possible to investigate the neural basis of simple visual decisions that link sensation to action. Here we describe an initial study of neural responses in the lateral intraparietal area (LIP) of the cerebral cortex while alert monkeys discriminated the direction of motion in a visual display. A subset of LIP neurons carried high-level signals that may comprise a neural correlate of the decision process in our task. These signals are neither sensory nor motor in the strictest sense; rather they appear to reflect integration of sensory signals toward a decision appropriate for guiding movement. If this ultimately proves to be the case, several fascinating issues in cognitive neuroscience will be brought under rigorous physiological scrutiny.
Abstract:
The C4 repressor of the temperate bacteriophages P1 and P7 inhibits antirepressor (Ant) synthesis and is essential for establishment and maintenance of lysogeny. C4 is an antisense RNA acting on a target, Ant mRNA, which is transcribed from the same promoter. The antisense-target RNA interaction requires processing of C4 RNA from a precursor RNA. Here we show that 5' maturation of C4 RNA in vivo depends on RNase P. In vitro, Escherichia coli RNase P and its catalytic RNA subunit (M1 RNA) can generate the mature 5' end of C4 RNA from P1 by a single endonucleolytic cut, whereas RNase P from the E. coli rnpA49 mutant, carrying a missense mutation in the RNase P protein subunit, is defective in the 5' maturation of C4 RNA. Primer extension analysis of RNA transcribed in vivo from a plasmid carrying the P1 c4 gene revealed that 5'-mature C4 RNA was the predominant species in rnpA+ bacteria, whereas virtually no mature C4 RNA was found in the temperature-sensitive rnpA49 strain at the restrictive temperature. Instead, C4 RNA molecules carrying up to five extra nucleotides beyond the 5' end accumulated. The same phenotype was observed in rnpA+ bacteria which harbored a plasmid carrying a P7 c4 mutant gene with a single C-->G base substitution in the structural homologue to the CCA 3' end of tRNAs. Implications of C4 RNA processing for the lysis/lysogeny decision process of bacteriophages P1 and P7 are discussed.
Abstract:
The academic literature on the behavior of the individual financial investor is quite scarce. Research on the decision process generally addresses tradeoffs in the purchase of products, and the investment decision process is little discussed. This thesis aims to help close this gap by discussing the determining factors in individual investors' decisions about financial products. The investment decision is complex; it involves, among other things, the tradeoff between forgoing present consumption for the possibility of greater well-being in the future, and in many situations there is a real possibility of losing the invested funds. To investigate the paths of this decision, in-depth interviews were conducted with executives from the investment fund and investment product distribution areas of the largest Brazilian banks operating in the retail segment. The knowledge gathered, together with the literature review, informed the design of a web-based survey questionnaire administered to potential investors. The attributes of return, possibility of loss (a proxy for risk), liquidity, management fee, and the account manager's recommendation were identified as the most relevant to the investor's decision. To construct the stimuli and decompose the utility of the decision, the choice-based conjoint (CBC) technique, which simulates a real decision, was used. The results showed the manager's recommendation to be the most important attribute in forming a preference for an investment alternative, a result which by itself indicates that non-rational factors influence the decision. The impact of the investor's risk aversion and cognitive style was then studied. The results show that the more risk-averse and the more intuitive investors are more susceptible to the manager's recommendation, but that these effects are independent of each other. The evidence suggests that the more intuitive investors use the manager to achieve cognitive comfort in the decision, and the more risk-averse use the manager to mitigate the sense of risk associated with the product. A cluster analysis indicated that the sample can be segmented into two groups, one more responsive to the manager's recommendation and the other to product attributes, with the manager's recommendation being the strongest attribute distinguishing the groups. The results indicate that market segmentation based on propensity to follow the manager's recommendation can be effective in guiding a relationship strategy aimed at increasing long-term results.