864 results for Process model alignment
Abstract:
The development of northern high-latitude peatlands has played an important role in the carbon (C) balance of the land biosphere since the Last Glacial Maximum (LGM). At present, carbon storage in northern peatlands is substantial, estimated at 500 ± 100 Pg C (1 Pg C = 10^15 g C). Here, we develop and apply a peatland module embedded in a dynamic global vegetation and land surface process model (LPX-Bern 1.0). The peatland module features a dynamic nitrogen cycle, a dynamic C transfer between the peatland acrotelm (upper oxic layer) and catotelm (deep anoxic layer), hydrology- and temperature-dependent respiration rates, and peatland-specific plant functional types. Nitrogen limitation down-regulates average modern net primary productivity over peatlands by about half. Decadal acrotelm-to-catotelm C fluxes vary between −20 and +50 g C m^-2 yr^-1 over the Holocene. Key model parameters are calibrated with reconstructed peat accumulation rates from peat-core data. The model reproduces the major features of the peat-core data and of the observation-based modern circumpolar soil carbon distribution. Results from a set of simulations for possible evolutions of northern peat development and areal extent show that soil C stocks in modern peatlands increased by 365–550 Pg C since the LGM, of which 175–272 Pg C accumulated between 11 and 5 kyr BP. Furthermore, our simulations suggest a persistent C sequestration rate of 35–50 Pg C per 1000 yr in present-day peatlands under current climate conditions, and that this C sink could either persist or turn into a source by 2100 AD, depending on the climate trajectories projected for the different representative greenhouse gas concentration pathways.
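To make the pool structure concrete, here is a minimal two-pool sketch in the spirit of the acrotelm/catotelm scheme described in the abstract. All rate constants, the Q10 value, and the forcing are illustrative placeholders, not the calibrated LPX-Bern 1.0 parameters.

```python
# Minimal two-pool peat carbon model: an oxic acrotelm feeding an anoxic
# catotelm. Parameters are assumed for illustration, NOT LPX-Bern 1.0 values.

DT = 1.0          # time step [yr]
K_ACRO = 0.05     # acrotelm (oxic) decomposition rate [1/yr] (assumed)
K_CATO = 1e-4     # catotelm (anoxic) decomposition rate [1/yr] (assumed)
TRANSFER = 0.01   # acrotelm-to-catotelm transfer rate [1/yr] (assumed)
Q10 = 2.0         # temperature sensitivity of respiration (assumed)

def step(acro, cato, npp, temp_anom, wet_frac):
    """Advance both pools one time step.

    npp: net primary productivity [g C m-2 yr-1]
    temp_anom: temperature anomaly [K]; wet_frac in [0, 1] scales anoxia.
    """
    resp_mod = Q10 ** (temp_anom / 10.0)                         # temperature dependence
    r_acro = K_ACRO * resp_mod * (1.0 - 0.5 * wet_frac) * acro   # hydrology-dependent
    r_cato = K_CATO * resp_mod * cato
    flux_ac = TRANSFER * acro                                    # acrotelm-to-catotelm C flux
    acro += DT * (npp - r_acro - flux_ac)
    cato += DT * (flux_ac - r_cato)
    return acro, cato, flux_ac

acro, cato = 2000.0, 30000.0   # initial pools [g C m-2] (illustrative)
for _ in range(11000):         # a Holocene-length integration
    acro, cato, flux = step(acro, cato, npp=150.0, temp_anom=0.0, wet_frac=0.8)
```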
Abstract:
Nitrous oxide (N2O) is an important greenhouse gas and ozone-depleting substance that has anthropogenic as well as natural marine and terrestrial sources. The tropospheric N2O concentrations have varied substantially in the past in concert with changing climate on glacial–interglacial and millennial timescales. It is not well understood, however, how N2O emissions from marine and terrestrial sources change in response to varying environmental conditions. The distinct isotopic compositions of marine and terrestrial N2O sources can help disentangle the relative changes in marine and terrestrial N2O emissions during past climate variations. Here we present N2O concentration and isotopic data for the last deglaciation, from 16,000 to 10,000 years before present, retrieved from air bubbles trapped in polar ice at Taylor Glacier, Antarctica. With the help of our data and a box model of the N2O cycle, we find a 30 per cent increase in total N2O emissions from the late glacial to the interglacial, with terrestrial and marine emissions contributing equally to the overall increase and generally evolving in parallel over the last deglaciation, even though there is no a priori connection between the drivers of the two sources. However, we find that terrestrial emissions dominated on centennial timescales, consistent with a state-of-the-art dynamic global vegetation and land surface process model that suggests that during the last deglaciation emission changes were strongly influenced by temperature and precipitation patterns over land surfaces. The results improve our understanding of the drivers of natural N2O emissions and are consistent with the idea that natural N2O emissions will probably increase in response to anthropogenic warming.
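The box-model logic behind the source partitioning can be illustrated with a toy one-box budget. The lifetime, end-member isotope signatures, and emission numbers below are round illustrative values, and sink fractionation is ignored, so this is far simpler than the box model used in the paper.

```python
# Toy one-box atmospheric N2O budget with two isotopically distinct sources,
# illustrating how isotope data can split marine vs. terrestrial emissions.
TAU = 120.0        # atmospheric N2O lifetime [yr] (approximate literature value)
D15_TERR = -10.0   # d15N of the terrestrial source [permil] (illustrative)
D15_MAR = 5.0      # d15N of the marine source [permil] (illustrative)

def steady_state(e_terr, e_mar):
    """Steady-state burden [Tg N] and source-weighted d15N for constant emissions.
    Sink fractionation is ignored for simplicity."""
    burden = (e_terr + e_mar) * TAU
    d15 = (e_terr * D15_TERR + e_mar * D15_MAR) / (e_terr + e_mar)
    return burden, d15

def split(total, d15_obs):
    """Invert the two-member mixing: split total emissions by observed d15N."""
    f_terr = (D15_MAR - d15_obs) / (D15_MAR - D15_TERR)
    return total * f_terr, total * (1.0 - f_terr)

burden, d15 = steady_state(10.0, 6.0)
print(split(16.0, d15))   # recovers (10.0, 6.0)
```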
Abstract:
Several componential emotion theories suggest that appraisal outcomes trigger characteristic somatovisceral changes that facilitate information processing and prepare the organism for adaptive behavior. The current study tested predictions derived from Scherer's Component Process Model. Participants viewed unpleasant and pleasant pictures (intrinsic pleasantness appraisal) and were asked to concurrently perform either an arm extension or an arm flexion, leading to an increase or a decrease in picture size. Increasing pleasant stimuli and decreasing unpleasant stimuli were considered goal conducive; decreasing pleasant stimuli and increasing unpleasant stimuli were considered goal obstructive (goal conduciveness appraisal). Both appraisals were marked by several somatovisceral changes (facial electromyogram, heart rate (HR)). As predicted, the changes induced by the two appraisals showed similar patterns. Furthermore, HR results, compared with data of earlier studies, suggest that the adaptive consequences of both appraisals may be mediated by stimulus proximity.
Abstract:
In the context of a memory task, participants were presented with pictures displaying biological and cultural threat stimuli or neutral stimuli (stimulus relevance manipulation) with superimposed symbols signaling monetary gains or losses (goal conduciveness manipulation). Results for heart rate and facial electromyogram show differential efferent effects of the respective appraisal outcomes and provide the first evidence for sequential processing, as postulated by Scherer's component process model of emotion. Specifically, as predicted, muscle activity over the brow and cheek regions marking the process of relevance appraisal occurred significantly earlier than facial muscle activity markers of goal conduciveness appraisal. Heart rate, in contrast, was influenced by the stimulus relevance manipulation only.
Abstract:
This study was designed to investigate and describe the relationship among resilience, forgiveness, and anger expression in adolescents. The purpose of the study was to explore whether certain adolescent resiliencies significantly related to positive or negative affective, behavioral, or cognitive levels of forgiveness and certain types of anger expression in adolescents. This study also investigated whether there were certain adolescent resiliencies and types of forgiveness that could predict lower levels of negative anger expression in adolescents. This research was built on two conceptual models: Wolin and Wolin's (1993) Challenge Model and the Forgiveness Process Model (Enright & Human Development Study Group, 1991). It was based on a quantitative, single-subject correlational research design. A multiple regression analysis was also used to explore possible effects of resilience and forgiveness on anger expression in adolescents. In addition, two demographic variables, Age and Gender, were examined for possible effects on anger expression. Data were gathered from a convenience sample of 70 students in three Maine public high schools using three separate assessment instruments: the Adolescent Resiliency Attitudes Scale (ARAS), the Adolescent Version of the Enright Forgiveness Inventory (EFI), and the Adolescent Anger Rating Scale (AARS). Correlational analyses were done on the scales and subscales of these surveys. Significant relationships were found between several adolescent resiliencies and forms of forgiveness, as well as between some adolescent resiliencies and types of anger expression. The data indicated that Total Resiliency significantly correlated with Total Forgiveness as well as Total Anger. The findings also identified particular adolescent resiliencies that significantly predicted types of anger expression, while forgiveness did not predict types of anger expression. The data revealed that Age and Gender had no significant effect on anger expression. These findings suggest that the constructs of adolescent resilience and forgiveness have commonalities that can influence how adolescents express anger, and further suggest that intervention and prevention programs expand their focus to incorporate forgiveness skills. The findings from this study can provide critical information to counselors, therapists, and other helping professionals working with adolescents on approaches to designing and implementing therapy modalities or developmental school guidance programs for adolescents.
Abstract:
There is growing interest in providing women with internatal care, a package of healthcare and ancillary services that can improve their health during the period after the termination of one pregnancy but before the conception of the next. Women who have had a pregnancy affected by a neural tube defect can especially benefit from internatal care because they are at increased risk for recurrence, and improvements to their health during the inter-pregnancy period can prevent future negative birth outcomes. The dissertation provides three papers that inform the content of internatal care for women at risk for recurrence by examining descriptive epidemiology to develop an accurate risk profile of the population, assessing whether women at risk for recurrence would benefit from a psychosocial intervention, and determining how to improve health promotion efforts targeting folic acid use.

Paper one identifies information relevant for developing risk profiles and conducting risk assessments. A number of investigations have found that the risk for neural tube defects differs between non-Hispanic Whites and Hispanics. To understand the risk difference, the descriptive epidemiology of spina bifida and anencephaly was examined for Hispanics and non-Hispanic Whites based on data from the Texas Birth Defects Registry for the years 1999 through 2004. Crude and adjusted birth prevalence ratios and corresponding 95% confidence intervals were calculated between descriptive epidemiologic characteristics and anencephaly and spina bifida for non-Hispanic Whites and for Hispanics. In both race/ethnic groups, anencephaly expressed an inverse relationship with maternal age and a positive linear relationship with parity. Both relationships were stronger in non-Hispanic Whites. Female infants had a higher risk for anencephaly in non-Hispanic Whites. Lower maternal education was associated with increased risk for spina bifida in Hispanics.

Paper two assesses the need for a psychosocial intervention. For mothers who have children with spina bifida, the transition to motherhood can be stressful. This qualitative study explored the process of becoming a mother to a child with spina bifida, focusing particularly on stress and coping in the immediate postnatal environment. Semi-structured interviews were conducted with six mothers who have children with spina bifida. Mothers were asked about their initial emotional and problem-based coping efforts, the quality and kind of support provided by health providers, and the characteristics of their meaning-based coping efforts; questions matched Transactional Model of Stress and Coping (TMSC) constructs. Analysis of the responses revealed a number of modifiable stress and coping transactions, the most salient being that health providers are in a position to address beliefs about self-causality and prevent mothers from experiencing the repercussions that stem from maintaining these beliefs.

Paper three identifies considerations when creating health promotion materials targeting folic acid use. A brochure was designed using concepts from the Precaution Adoption Process Model (PAPM). Three focus groups comprising 26 mothers of children with spina bifida evaluated the brochure. One focus group was conducted in Spanish only; the other two were conducted in a mix of English and Spanish. Qualitative analysis of coded transcripts revealed that a brochure is a helpful adjunct. Questions about folic acid support the inclusion of an insert with basic information. There may be a need to develop different educational material for Hispanics so that the importance of folic acid is presented in a situational context. Some participants blamed themselves for their pregnancy outcome, which may affect their receptivity to messages in the brochure. The women's desire for photographs that affect their perception of threat, and their identification with the second role model, indicate that they belong to PAPM Stages 2 and 3. Participants preferred colorful envelopes, high-quality paper, intimidating photographs, simple words, conversational-style sentences, and positive messages.

These papers develop the content of the risk assessment, psychosocial intervention, and health promotion components of internatal care as they apply to women at risk for recurrence. The findings provided evidence for considering parity and maternal age when assessing nutritional risk. The two dissimilarities between the race/ethnic groups, infant sex and maternal education, lent support to creating separate risk profiles. Interviews with mothers of children with spina bifida revealed the existence of unmet needs, suggesting that a psychosocial intervention provided as part of internatal care can strengthen and support women's well-being. Segmenting the audience according to race/ethnicity and PAPM stage can improve the relevance of print materials promoting folic acid use.
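Paper one's core comparison can be made concrete with a small worked example. The sketch below, assuming a simple log-normal (Wald) approximation and wholly invented counts (not Texas Birth Defects Registry figures), computes a crude birth prevalence ratio between two groups with its 95% confidence interval:

```python
# Crude birth prevalence ratio with an approximate 95% CI (log-normal/Wald),
# the kind of group comparison paper one reports. Counts are invented.
import math

def prevalence_ratio(cases_a, births_a, cases_b, births_b, z=1.96):
    """PR of group A vs. group B with a Wald-type confidence interval."""
    pr = (cases_a / births_a) / (cases_b / births_b)
    se_log = math.sqrt(1.0 / cases_a - 1.0 / births_a
                       + 1.0 / cases_b - 1.0 / births_b)
    return pr, (pr * math.exp(-z * se_log), pr * math.exp(z * se_log))

pr, ci = prevalence_ratio(cases_a=60, births_a=400_000,
                          cases_b=45, births_b=500_000)
print(f"PR = {pr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```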
Abstract:
This dissertation focuses on Project HOPE, an American medical aid agency, and its work in Tunisia. More specifically, this is a study of the implementation strategies of those HOPE-sponsored projects and programs designed to solve the problems of high morbidity and infant mortality rates due to environmentally related diarrheal and enteric diseases. Several environmental health programs and projects developed in cooperation with Tunisian counterparts are described and analyzed. These include (1) a paramedical manpower training program; (2) a national hospital sanitation and infection control program; (3) a community sewage disposal project; (4) a well reconstruction project; and (5) a solid-waste disposal project for a hospital.

After independence, Tunisia, like many developing countries, encountered several difficulties which hindered progress toward solving basic environmental health problems and prompted a request for aid. This study discusses the need for all who work in development programs to recognize and assess those difficulties or constraints which affect the program planning process, including the latent cultural and political constraints that exist not only within the host country but within the aid agency as well. For example, failure to recognize cultural differences may adversely affect the attitudes of the host staff towards their work and towards the aid agency and its task. These factors therefore play a significant role in influencing program development decisions and must be taken into account in order to maximize the probability of successful outcomes.

In 1969 Project HOPE was asked by the Tunisian government to assist the Ministry of Health in solving its health manpower problems. HOPE responded with several programs, one of which concerned the training of public health nurses, sanitary technicians, and aides at Tunisia's school of public health in Nabeul. The outcome of that program, as well as the strategies used in its development, are analyzed. Certain questions are also addressed, such as: what should the indicators of success be, and when is the right time to phase out?

Another HOPE program analyzed involved hospital sanitation and infection control. Certain generic aspects of basic hospital sanitation procedures were documented and presented in the form of a process model which was later used as a "microplan" in setting up similar programs in other Tunisian hospitals. In this study the details of the "microplan" are discussed. The development of a nationwide program without any further need of external assistance illustrated the success of HOPE's implementation strategies.

Finally, although it is known that the high incidence of enteric disease in developing countries is due to poor environmental sanitation and poor hygiene practices, efforts by aid agencies to correct these conditions have often resulted in failure. Project HOPE's strategy was to maximize limited resources by using a systems approach to program development and by becoming actively involved in the design and implementation of environmental health projects utilizing "appropriate" technology. Three innovative projects and their implementation strategies (including technical specifications) are described. It is argued that if aid agencies are to make any progress in helping developing countries solve basic sanitation problems, they must take an interdisciplinary approach to program development and play an active role in helping counterparts seek and identify appropriate technologies which are socially and economically acceptable.
Abstract:
Pneumonia is a well-documented and common respiratory infection in patients with acute traumatic spinal cord injuries, and it may recur during the course of acute care. Using data from the North American Clinical Trials Network (NACTN) for Spinal Cord Injury, the incidence, timing, and recurrence of pneumonia were analyzed. The two main objectives were (1) to investigate the time to and potential risk factors for the first occurrence of pneumonia using the Cox Proportional Hazards model, and (2) to investigate pneumonia recurrence and its risk factors using a Counting Process model that generalizes the Cox Proportional Hazards model. The results from the survival analysis suggested that surgery, intubation, American Spinal Injury Association (ASIA) grade, direct admission to a NACTN site, and age (older than 65 or not) were significant risk factors for both the first event of pneumonia and multiple events of pneumonia. The significance of this research is that it has the potential to identify, at the time of admission, patients who are at high risk for the incidence and recurrence of pneumonia. Knowledge of the occurrence and timing of pneumonia is an important factor for the development of prevention strategies and may also provide some insights into the selection of emerging therapies that compromise the immune system.
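For readers unfamiliar with the two model setups, here is a minimal sketch using the lifelines Python package. The column names and the tiny toy data are hypothetical, not the NACTN schema: a standard Cox fit for time to first pneumonia, and an Andersen-Gill-style counting-process layout with start/stop intervals for recurrent events.

```python
# Two survival-analysis setups: Cox PH for the first event, and a
# counting-process (start/stop interval) layout for recurrent events.
# Toy data and column names are invented for illustration only.
import pandas as pd
from lifelines import CoxPHFitter, CoxTimeVaryingFitter

# Time to first pneumonia: one row per patient.
first = pd.DataFrame({
    "days_to_event": [12, 30, 7, 45, 20, 60, 15, 33],
    "pneumonia":     [1, 0, 1, 0, 1, 0, 1, 1],
    "intubated":     [1, 0, 1, 1, 0, 0, 1, 0],
    "age_over_65":   [0, 1, 0, 0, 1, 0, 1, 1],
})
CoxPHFitter().fit(first, duration_col="days_to_event", event_col="pneumonia")

# Recurrent pneumonia: one row per at-risk interval per patient; a new
# interval starts after each event (counting-process formulation).
recurrent = pd.DataFrame({
    "id":        [1, 1, 2, 3, 3],
    "start":     [0, 12, 0, 0, 9],
    "stop":      [12, 40, 30, 9, 25],
    "event":     [1, 0, 0, 1, 1],
    "intubated": [1, 1, 0, 0, 1],
})
CoxTimeVaryingFitter().fit(recurrent, id_col="id", start_col="start",
                           stop_col="stop", event_col="event")
```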
Abstract:
Knowledge about the processes of perception and understanding is of paramount importance for designing means of communication like maps and charts. This is especially the case if one does not want to lose sight of the map-user, and if map design is to be oriented along the map-user's needs and preferences in order to improve the cartographic product's usability. A scientific approach to visualization can help to achieve usable results. The insights achieved by such an approach can lead to modes of visualization that are superior, in utility and efficiency, to those which have seemingly proved their value in practice, the so-called "best practices". This thesis shows this by using the example of visualizing the limits of bodies of water in the Southern Ocean. After some introductory remarks in chapter one on the chosen mode of problem-solving, which simultaneously illustrate the flow of work on the problem, chapter two outlines the relevant information concerning the drawing of limits in the Southern Ocean. Chapter three builds the theoretical framework, a multidisciplinary approach to representation. This framework is based on "How Maps Work" by the American cartographer MacEachren (1995/2004). His "scientific approach to visualization" is amended and adjusted, where necessary, with knowledge gained from recent findings in the social sciences. The approach suggested in this thesis thus represents a synergy of psychology, sociology, semiotics, linguistics, communication theory, and cartography, in the tradition of interdisciplinary research that crosses the boundaries of a single scientific subject. The resulting holistic approach can help to improve the usability of cartographic products. It illustrates, on the one hand, the processes taking place while perceiving and recognizing cartographic information (bottom-up processes); on the other hand, it illuminates the processes that occur while understanding this information (top-down processes). Bottom-up and top-down processes are interdependent and inseparably interrelated, and therefore cannot be understood without each other. Regarding usability, the approach suggested in this thesis focuses strongly on the map-user. For this reason the phenomenon of communication gains more weight than in MacEachren's map-centered approach, and chapter four develops a holistic approach to communication. This approach makes clear that only the map-user can evaluate the usability of a cartographic product: only if he can extract the information relevant to him from the cartographic product is it really usable, and the concept of communication is well suited to capture this. For the visualization of the limits of bodies of water in the Southern Ocean, a case not complex enough to illustrate all results of the theoretical considerations, it is suggested that the limits be visualized with red lines. This suggestion deviates from the commonly used mode of visualization; the thesis thus shows how theory is able to improve practice. Chapter five leads back to the task of fixing the limits of the bodies of water in the area of concern. A convention of the International Hydrographic Organization (IHO) states that those limits should be drawn using meridians, parallels, rhumb lines, and bathymetric data. Based on the available bathymetric data, both a representation model and a process model are calculated to support the drawing of the limits. The quality of both models, which depends on the quality of the bathymetric data at hand, leads to the decision that the representation model is better suited to support the drawing of the limits.
Abstract:
Advances in hardware make it possible to collect huge volumes of data, and applications are emerging that must provide information in near-real time, e.g., patient monitoring or the health monitoring of water pipes. The needs of these applications have given rise to the data streaming model, as opposed to the traditional store-then-process model. In the store-then-process model, data are stored and later queried; in streaming systems, data are processed on arrival, producing continuous responses without ever being stored in full. This view imposes several challenges for processing data on the fly: (1) responses must be produced continuously as new data arrive in the system; (2) data are accessed only once and, in general, are not stored in their entirety; and (3) the per-item processing time needed to produce a response must be low. Two models exist for computing continuous responses, the evolving model and the sliding-window model; the latter fits certain applications better because it considers only the most recently received data rather than the entire data history. In recent years, research on data stream mining has focused mainly on the evolving model. In the sliding-window model the body of work is smaller, since these algorithms must not only be incremental but must also delete the information that expires as the window slides, while still meeting the three challenges above. One of the fundamental tasks in data mining is clustering: given a data set, the goal is to find representative groups that provide a concise description of the data. Clustering is critical in applications such as network intrusion detection or customer segmentation in marketing and advertising. Because of the massive amounts of data that must be processed in such applications (up to millions of events per second), centralized solutions may be unable to meet the processing-time constraints and must resort to discarding data during load peaks. To avoid this loss of data, stream processing must be distributed; in particular, clustering algorithms must be adapted to environments in which the data are distributed. In streaming, research focuses not only on designs for general tasks, such as clustering, but also on new approaches that fit particular scenarios better. As an example, an ad-hoc grouping mechanism turns out to be more adequate for defense against Distributed Denial of Service (DDoS) attacks than the traditional k-means problem. This thesis contributes to the streaming clustering problem in both centralized and distributed environments. We have designed a centralized clustering algorithm and shown, in an extensive evaluation against other state-of-the-art solutions, its ability to discover high-quality clusters in low time. In addition, we have developed a data structure that significantly reduces the required memory while keeping the error of the computations under control at all times. Our work also provides two protocols for distributing the computation of clusters. We analyze two key features: the impact of distributed computation on clustering quality, and the conditions necessary for reducing processing time with respect to the centralized solution. Finally, we have developed a clustering-based framework for the detection of DDoS attacks. For this last case, we characterize the type of attacks detected and evaluate the efficiency and effectiveness of mitigating the attack's impact.
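As a concrete illustration of the incremental-plus-expiry requirement described above, here is a minimal sliding-window micro-clustering sketch. It is not the thesis's algorithm: the window length, the assignment radius, and the choice to expire a cluster by its last update (an approximation of true per-point expiry) are all assumptions made for brevity.

```python
# Sliding-window micro-clustering: keep (count, linear-sum) summaries and
# expire clusters whose last update has slid out of the window.
import math

WINDOW = 1000   # window length in ticks (assumed)
RADIUS = 2.0    # assignment radius (assumed)

class MicroCluster:
    def __init__(self, point, t):
        self.n, self.ls, self.last = 1, list(point), t
    def center(self):
        return [s / self.n for s in self.ls]
    def absorb(self, point, t):
        self.n, self.last = self.n + 1, t
        for i, x in enumerate(point):
            self.ls[i] += x

clusters = []

def insert(point, t):
    """Process one stream item: expire stale clusters, then absorb or create."""
    global clusters
    clusters = [c for c in clusters if t - c.last <= WINDOW]   # window slide
    best = min(clusters, key=lambda c: math.dist(point, c.center()), default=None)
    if best is not None and math.dist(point, best.center()) <= RADIUS:
        best.absorb(point, t)
    else:
        clusters.append(MicroCluster(point, t))

for t, p in enumerate([(0.1, 0.2), (0.3, 0.1), (5.0, 5.0), (5.2, 4.9)]):
    insert(p, t)
print(len(clusters))   # -> 2
```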
Abstract:
There is currently great expectation regarding the introduction of new tools and methods for software product development, which in the near future will allow an engineering approach to the software production process. The new methodologies now beginning to take shape imply an integral approach to the problem, covering all phases of the production scheme. However, the degree of automation achieved in the system construction process is very low and is centered on the last phases of the software life cycle, achieving only an insignificant reduction in costs and, more importantly, without guaranteeing the quality of the resulting software products. This thesis defines a structured software development methodology that can be automated, that is, a CASE methodology. The methodology presented conforms to the CASE development cycle model, which consists of the analysis, design, and testing phases, and its field of application is information systems. First, the basic principles on which the CASE methodology rests are established. Then, since the methodology starts from fixing the objectives of the company demanding an information system, techniques are employed for gathering and validating information, which at the same time provide an easy communication language between end users and developers. These same techniques also specify all the system requirements completely, consistently, and unambiguously. Furthermore, a set of techniques and algorithms is presented to automate, from the system requirements specification, both the logical design of the Process Model and that of the Data Model, each validated against the prior requirements specification. Finally, formal procedures are defined that indicate the set of activities to be carried out in the construction process and how to perform them, thereby achieving integrity across the different stages of the development process.
Abstract:
New technologies, such as the new Information and Communication Technologies (ICT), break new paths and redefine the way we understand business; Cloud Computing is one of them. On-demand resource acquisition and per-usage payment schemes are now commonplace and allow companies to save on their ICT investments. Despite the importance of this issue, we still lack methodologies that help companies develop applications oriented toward exploitation in the Cloud. In this study we aim to fill this gap and propose a methodology for the development of ICT applications that are directed toward a business model and subsequently outsourced to the Cloud. For the first part, the development of SOA applications, we take as a baseline scenario a business model from which to obtain a business process model, using software engineering tools to this end. For the second part, the outsourcing, we propose a guide to facilitate uploading business models into the Cloud; to this end we describe a SOA governance model, which controls the SOA. Additionally, we propose a Cloud governance model that integrates Service Level Agreements (SLAs), SOA governance, and Cloud architecture. Finally, we apply our methodology to an example illustrating our proposal. We believe that our proposal can be used as a guide/pattern for the development of business applications.
Abstract:
Today no company, however small, is conceivable without some kind of IT service. Each company faces the challenge of undertaking projects to develop or contract IT services that support its different business processes. On the other hand, unless IT services are isolated from any network, which is virtually impossible today, no service, and no project that develops one, can guarantee 100% security. The company thus handles a duality between developing secure IT products/services and constantly maintaining its IT services in a secure state. In most companies, the management of projects for the development of IT services is addressed by applying various practices used in other projects and recommended, to that effect, by the most widely recognized frameworks and standards. In general, these frameworks include among their processes a risk management oriented toward meeting deadlines, costs, and, sometimes, the functionality of the product or service. However, these practices overlook the security aspects (confidentiality, integrity, and availability) of the product/service that are necessary during the development of the project. Moreover, once the service is delivered, when a fault related to these security aspects arises at the operational level, ad-hoc solutions are applied. This causes great losses and, at times, endangers the continuity of the company itself. This problem grows larger every day, in every kind of company, and SMEs, because of their lack of knowledge of the problem itself and their scarcity of methodological and technical resources, are the most vulnerable. For all these reasons, this doctoral thesis has a double objective. First, to demonstrate the need for a framework that, integrated with other possible frameworks and standards, is simple to apply to projects of different types and sizes, and that guides SMEs in managing projects for the secure development and subsequent security maintenance of their IT services. Second, to meet this need by developing a framework that offers a generic process model applicable to different project patterns, together with a library of security assets that guides SMEs through the project management process for secure development. The process model of the proposed framework describes activities at the three organizational levels of the company (strategic, tactical, and operational). It is based on the continuous improvement cycle (PDCA) and on the Security by Design philosophy proposed by Siemens. The specific practices of each activity are detailed, along with the inputs, outputs, actions, roles, KPIs, and techniques applicable to each activity. These specific practices may or may not be applied, at the discretion of the project manager and according to the state of the company and of the project to be developed, thereby establishing different process patterns. Two SMEs were chosen to validate the framework, the first from the services sector and the second from the ICT sector. The process model was applied to the same project pattern, one that responds to needs common to both companies. The process pattern was assessed in the selected projects at both companies, before and after its application. The results of the study, after its application in both companies, validated the process pattern for improving the management of projects for the secure development of IT in SMEs.
Abstract:
In 1991, Bryant and Eckard estimated the annual probability that a cartel would be detected by the US Federal authorities, conditional on eventually being detected, to be at most between 13% and 17%. Fifteen years later, we estimated the same probability over a European sample and found an annual probability that falls between 12.9% and 13.3%. We also develop a detection model to clarify this probability. Our estimate is based on detection durations, calculated from data reported for all the cartels convicted by the European Commission from 1969 to the present date, and on a statistical birth-and-death process model describing the onset and detection of cartels.
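To make the estimate concrete: under the simplifying assumption of a constant detection hazard (an exponential duration model, one plausible reading of the birth-and-death setup, not necessarily the authors' specification), the annual detection probability can be backed out of the mean detection duration. The durations below are invented for illustration, not the Commission's case data.

```python
# Back-of-the-envelope annual detection probability from detection durations,
# assuming exponentially distributed times to detection. Data are invented.
import math

durations_yr = [3.5, 9.0, 6.2, 12.0, 4.8, 7.7]   # hypothetical times to detection

hazard = len(durations_yr) / sum(durations_yr)   # MLE of the exponential rate
annual_p = 1.0 - math.exp(-hazard)               # P(detected within one year)
print(f"hazard = {hazard:.3f}/yr -> annual detection probability = {annual_p:.1%}")
```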
Abstract:
Today, higher education institutions, including the Escola Superior de Desporto de Rio Maior of the Instituto Politécnico de Santarém, face a range of questions and challenges related to their own accreditation and that of their study programmes, and consequently to improving the quality of their performance and their access to funding. This reality demands new approaches and a higher level of rigor from all those who contribute to the quality of the service provided. To respond to these challenges, the Gabinete de Avaliação e Qualidade (Evaluation and Quality Office) has developed initiatives and approaches of which the present work is an example. The aim of this work was to demonstrate, from a Business Process Management perspective, the viability and operability of using a Business Process Management System tool in this context. To this end, the evaluation and accreditation process developed by the Agência de Avaliação e Acreditação do Ensino Superior was modeled using Business Process Model and Notation. This proposal made it possible to model the institution's processes, demonstrating the use of a Business Process Management approach in an organization of this nature, with the objective of promoting its improvement.