938 results for Software Process Improvement
Abstract:
A universal systems design process is specified, tested in a case study and evaluated. It links English narratives to numbers using a categorical language framework with mathematical mappings taking the place of conjunctions and numbers. The framework is a ring of English narrative words between 1 (option) and 360 (capital); beyond 360 the ring cycles again to 1. English narratives are shown to correspond to the field of fractional numbers. The process can enable the development, presentation and communication of complex narrative policy information among communities of any scale, on a software implementation known as the "ecoputer". The information is more accessible and comprehensive than that in conventional decision support, because: (1) it is expressed in narrative language; and (2) the narratives are expressed as compounds of words within the framework. Hence option generation is made more effective than in conventional decision support processes, including Multiple Criteria Decision Analysis, Life Cycle Assessment and Cost-Benefit Analysis. The case study is of a participatory workshop on UK bioenergy project objectives and criteria, at which attributes were elicited in environmental, economic and social systems. From the attributes, the framework was used to derive consequences at a range of levels of precision; these are compared with the project objectives and criteria as set out in the Case for Support. The design process is to be supported by a social information manipulation, storage and retrieval system for numeric and verbal narratives attached to the "ecoputer". The "ecoputer" will have an integrated verbal and numeric operating system. A novel design source code language will assist the development of narrative policy.
The utility of the program, including in the transition to sustainable development and in applications at both community micro-scale and policy macro-scale, is discussed from public, stakeholder, corporate, governmental and regulatory perspectives.
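As a rough illustration, the ring arithmetic described above (words at positions 1 to 360, wrapping past 360 back to 1) can be sketched in Python; the word entries beyond "option" and "capital" and the additive compounding rule are hypothetical stand-ins, not the paper's actual mappings:

```python
RING_SIZE = 360

# Illustrative entries only: "option" -> 1 and "capital" -> 360 come from
# the abstract; any further words would be placeholders.
WORD_RING = {"option": 1, "capital": 360}

def ring_position(n: int) -> int:
    """Wrap any positive integer onto the 1..360 ring."""
    return ((n - 1) % RING_SIZE) + 1

def combine(word_a: str, word_b: str) -> int:
    """Compound two narrative words by adding their ring positions
    (a hypothetical stand-in for the paper's conjunction mappings)."""
    return ring_position(WORD_RING[word_a] + WORD_RING[word_b])

print(ring_position(361))            # wraps back to 1
print(combine("option", "capital"))  # 1 + 360 = 361 -> 1
```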
Abstract:
Briefing phase interactions between clients and designers are recognized as social engagements, characterized by communicative sign use, where conceptual ideas are gradually transformed into potential design solutions. A semiotic analysis of briefing communications between client stakeholders and designers provides evidence of the significance and importance of stakeholder interpretation and understanding of design, empirical data being drawn from a qualitative study of NHS hospital construction projects in the UK. It is contended that stakeholders engage with a project through communicative signs and artefacts of design, referencing personal cognitive knowledge in acts of interpretation that may be different from those of designers and externally appointed client advisers. Such interpretations occur in addition to NHS client and design team efforts to ‘engage’ with and ‘understand’ stakeholders using a variety of methods. Social semiotic theorizations indicate how narrative strategies motivate the formulation of signs and artefacts in briefing work, the role of sign authors and sign readers being elucidated as a result. Findings are contextualized against current understandings of briefing communications and stakeholder management practices, a more socially attuned understanding of briefing countering some of the process-led improvement models that have characterized much of the post-Egan report literature. A stakeholder interpretation model is presented as one potential method to safeguard against unforeseen interpretations occurring, the model aligning with the proposal for a more measured recognition of how designs can trigger interpretations among client stakeholders.
Abstract:
Citizens across the world are increasingly called upon to participate in healthcare improvement. It is often unclear how this can be made to work in practice. This 4-year ethnography of a UK healthcare improvement initiative showed that patients used elements of organizational culture as resources to help them collaborate with healthcare professionals. The four elements were: (1) organizational emphasis on nonhierarchical, multidisciplinary collaboration; (2) organizational staff ability to model desired behaviours of recognition and respect; (3) commitment to rapid action, including quick translation of research into practice; and (4) the constant data collection and reflection process facilitated by improvement methods.
Abstract:
The chapter describes the development of care bundle documentation, through an iterative, user-centred design process, to support the recognition and treatment of acute kidney injury (AKI). The chapter details stages of user and stakeholder consultation, employed to develop a design response that was sensitive to user experience and need, culminating in simulation testing of a near-final prototype. The development of supplementary awareness-raising materials, relating to the main care bundle tool, is also discussed. This information design response to a complex clinical decision-making process is contrasted with other approaches to promoting AKI care. The need for different but related approaches to the working tool itself and to the tool's communication is discussed. More general recommendations are made for the development of communication tools to support complex clinical processes.
Abstract:
The aim of this study was to assess and improve the accuracy of biotransfer models for organic pollutants (PCBs, PCDD/Fs, PBDEs, PFCAs, and pesticides) into cow's milk and beef used in human exposure assessment. The metabolic rate in cattle is known to be a key parameter for this biotransfer; however, few experimental data and no simulation methods are currently available. In this research, the metabolic rate was estimated using existing QSAR biodegradation models for microorganisms (BioWIN) and fish (EPI-HL and IFS-HL). This simulated metabolic rate was then incorporated into the mechanistic cattle biotransfer models (RAIDAR, ACC-HUMAN, OMEGA, and CKow). Goodness-of-fit tests showed that the RAIDAR, ACC-HUMAN, and OMEGA model performances were significantly improved using any of the QSARs when comparing the new model outputs to observed data. The CKow model is the only one that separates the processes in the gut and liver. This model showed the lowest residual error of all the models tested when the BioWIN model was used to represent the ruminant metabolic process in the gut and the two fish QSARs were used to represent the metabolic process in the liver. Our testing included EUSES and CalTOX, which are KOW-regression models widely used in regulatory assessment. New regressions based on the simulated rates of the two metabolic processes are also proposed as an alternative to KOW-regression models for screening risk assessment. The modified CKow model is more physiologically realistic, but has equivalent usability to existing KOW-regression models for estimating cattle biotransfer of organic pollutants.
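For context, a screening-level KOW-regression of the general kind that EUSES and CalTOX embody can be sketched as follows; the coefficients here are illustrative placeholders, not those of any named model:

```python
def biotransfer_factor(log_kow: float, a: float = 1.0, b: float = -8.0) -> float:
    """Screening-level estimate: log10(BTF) = a * log10(Kow) + b.
    The coefficients a and b are placeholders; real KOW-regression models
    fit them to feeding-study data."""
    return 10 ** (a * log_kow + b)

# Illustrative: a pollutant with log Kow = 6
print(biotransfer_factor(6.0))  # 0.01 under these placeholder coefficients
```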
Abstract:
Dental composite resins possess good esthetic properties and are currently among the most popular dental restorative materials. Although both the organic and inorganic phases influence the material behavior, the filler particle features and content are the most important factors related to improvement of the mechanical properties of resin composites. Thus, the objective of this study was to evaluate the effect of three different composite resins on the polymerization process by means of the Vickers hardness test. The samples were prepared using three different composite resins, as follows: group I, P-60 (3M/ESPE); group II, Herculite XRV (Kerr); and group III, Durafill (Heraeus-Kulzer). The samples were made in a polytetrafluoroethylene mould with a rectangular cavity measuring 7 mm in length, 4 mm in width, and 3 mm in thickness. The samples were photo-activated by a light-curing unit based on blue LEDs (Ultrablue III-DMC/Brazil) for irradiation times of 20 and 40 s. The Vickers hardness test was performed 24 h after photo-activation, down to the standardized depth of 3 mm. The Vickers hardness mean values varied from 158.9 (+/- 0.81) to 81.4 (+/- 1.94) for P-60, from 138.7 (+/- 0.37) to 61.7 (+/- 0.24) for Herculite XRV, and from 107.5 (+/- 0.81) to 44.5 (+/- 1.36) for Durafill composite resins photo-activated for 20 s, for the 1st and 2nd mm, respectively. For 40 s of photo-activation, the Vickers hardness mean values varied from 181.0 (+/- 0.70) to 15.6 (+/- 0.29) for P-60, and from 161.8 (+/- 0.41) to 11.2 (+/- 0.17) for Herculite XRV composite resins, for the 1st and 3rd mm, respectively. For Durafill composite resin, the mean values varied from 120.1 (+/- 0.66) to 61.7 (+/- 0.20), for the 1st and 2nd mm, respectively. The coefficient of variation (CV) was lower than 1% in most groups, so descriptive statistical analysis was used.
The Vickers hardness mean values for Durafill were lower than those for the P-60 and Herculite XRV composite resins at both 20 and 40 s of irradiation time. The polymerization process was greatly affected by the composition of the composite resins.
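The hardness values reported above are standard Vickers numbers; the defining formula can be sketched as follows (the load and diagonal values are illustrative, not taken from the study):

```python
def vickers_hardness(load_kgf: float, diagonal_mm: float) -> float:
    """Standard Vickers definition: HV = 1.8544 * F / d**2,
    with the load F in kgf and the mean indentation diagonal d in mm."""
    return 1.8544 * load_kgf / diagonal_mm ** 2

# Illustrative: 0.3 kgf load, 0.05 mm mean diagonal
print(round(vickers_hardness(0.3, 0.05), 1))  # 222.5
```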
Abstract:
Managing software maintenance is rarely a precise task due to uncertainties concerned with resources and service descriptions. Even when a well-established maintenance process is followed, the risk of delaying tasks remains if the new services are not precisely described or when resources change during process execution. Also, the delay of a task at an early process stage may translate into a different delay at the end of the process, depending on complexity or service reliability requirements. This paper presents a knowledge-based representation (Bayesian networks) of maintenance project delays based on specialists' experience, and a corresponding tool to help in managing software maintenance projects.
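A toy fragment of the kind of Bayesian-network reasoning the paper applies to maintenance delays might look as follows; the network structure and all probabilities are invented for illustration, not elicited from specialists:

```python
# Toy two-node network: does an imprecise service description raise delay risk?
# All probabilities are invented placeholders.
p_precise = 0.7                          # P(service description is precise)
p_delay_given = {True: 0.1, False: 0.6}  # P(task delayed | precise?)

# Marginal probability of a delayed task (sum over both description states).
p_delay = (p_delay_given[True] * p_precise
           + p_delay_given[False] * (1 - p_precise))

# Diagnostic reasoning via Bayes' rule: P(description was precise | delay).
p_precise_given_delay = p_delay_given[True] * p_precise / p_delay

print(round(p_delay, 2))                # 0.25
print(round(p_precise_given_delay, 2))  # 0.28
```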
Abstract:
Herein, we report a new approach to the FePt nanoparticle formation mechanism, studying the evolution of particle size and composition during synthesis using the modified polyol process. One of the factors limiting their application in ultra-high-density magnetic storage media is the particle-to-particle composition variation, which affects the A1-to-L1(0) transformation as well as their magnetic properties. There are many controversies in the literature concerning the mechanism of FePt formation, which seems to be the key to understanding the compositional chemical distribution. Our results convincingly show that, initially, Pt nuclei are formed due to reduction of Pt(acac)(2) by the diol, followed by heterocoagulation, onto the Pt nuclei, of Fe cluster species formed from Fe(acac)(3) thermal decomposition. Complete reduction of the heterocoagulated iron species seems to involve a CO-spillover process, in which the Pt nuclei surface acts as a heterogeneous catalyst, leading to improved single-particle composition control and allowing a much narrower compositional distribution. Our results show significant decreases in the particle-to-particle composition range, improving the A1-to-L1(0) phase transformation and, consequently, the magnetic properties when compared with other reported methods.
Abstract:
BACKGROUND: National quality registries (NQRs) purportedly facilitate quality improvement, while neither the extent nor the mechanisms of such a relationship are fully known. The aim of this case study is to describe the experiences of local stakeholders to determine those elements that facilitate and hinder clinical quality improvement in relation to participation in a well-known and established NQR on stroke in Sweden. METHODS: A strategic sample was drawn of 8 hospitals in 4 county councils, representing a variety of settings and outcomes according to the NQR's criteria. Semi-structured telephone interviews were conducted with 25 managers, physicians in charge of Riks-Stroke, and registered nurses registering local data at the hospitals. The interviews, which included aspects of barriers and facilitators within the NQR and the local context, were analysed with content analysis. RESULTS: An NQR can provide vital aspects for facilitating evidence-based practice, for example, local data drawn from national guidelines, which can be used for comparisons over time within the organisation or with other hospitals. Major effort is required to ensure that data entries are accurate and valid, and thus the trustworthiness of local data output competes with resources needed for everyday clinical stroke care and quality improvement initiatives. Local stakeholders with knowledge of and interest in both the medical area (in this case stroke) and quality improvement can apply the NQR data to effectively initiate, carry out, and evaluate quality improvement, if supported by managers and co-workers, a common stroke care process and an operational management system that embraces and engages with the NQR data. CONCLUSION: While quality registries are assumed to support adherence to evidence-based guidelines around the world, this study proposes that an NQR can facilitate improvement of care, but neither the registry itself nor the reporting of data initiates quality improvement.
Rather, the local and general evidence provided by the NQR must be considered relevant and must be applied in the local context. Further, the quality improvement process needs to be facilitated by stakeholders collaborating within and outside the context, who know how to initiate, perform, and evaluate quality improvement, and who have the resources to do so.
Abstract:
Architecture description languages (ADLs) are used to specify high-level, compositional views of a software application. ADL research focuses on software composed of prefabricated parts, so-called software components. ADLs usually come equipped with rigorous state-transition style semantics, facilitating verification and analysis of specifications. Consequently, ADLs are well suited to configuring distributed and event-based systems. However, additional expressive power is required for the description of enterprise software architectures – in particular, those built upon newer middleware, such as implementations of Java’s EJB specification, or Microsoft’s COM+/.NET. The enterprise requires distributed software solutions that are scalable, business-oriented and mission-critical. We can make progress toward attaining these qualities at various stages of the software development process. In particular, progress at the architectural level can be leveraged through use of an ADL that incorporates trust and dependability analysis. Also, current industry approaches to enterprise development do not address several important architectural design issues. The TrustME ADL is designed to meet these requirements, through combining approaches to software architecture specification with rigorous design-by-contract ideas. In this paper, we focus on several aspects of TrustME that facilitate specification and analysis of middleware-based architectures for trusted enterprise computing systems.
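The design-by-contract ideas that TrustME combines with architectural specification can be illustrated with a minimal runtime-checking sketch; the decorator and the example "component service" below are hypothetical, not part of TrustME:

```python
def with_contract(pre, post):
    """Minimal design-by-contract wrapper: check a precondition on the
    argument and a postcondition relating argument and result."""
    def wrap(f):
        def inner(x):
            assert pre(x), "precondition violated"
            result = f(x)
            assert post(x, result), "postcondition violated"
            return result
        return inner
    return wrap

@with_contract(pre=lambda x: x >= 0,
               post=lambda x, r: abs(r * r - x) < 1e-9)
def sqrt_component(x: float) -> float:
    """A stand-in component service whose contract is checked at runtime."""
    return x ** 0.5

print(sqrt_component(9.0))  # 3.0
```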
Abstract:
Agent-oriented software engineering and software product lines are two promising software engineering techniques. Recent research work has been exploring their integration, namely multi-agent systems product lines (MAS-PLs), to promote reuse and variability management in the context of complex software systems. However, current product derivation approaches do not provide specific mechanisms to deal with MAS-PLs. This is essential because they typically encompass several concerns (e.g., trust, coordination, transaction, state persistence) that are constructed on the basis of heterogeneous technologies (e.g., object-oriented frameworks and platforms). In this paper, we propose the use of multi-level models to support the configuration knowledge specification and automatic product derivation of MAS-PLs. Our approach provides an agent-specific architecture model that uses abstractions and instantiation rules that are relevant to this application domain. In order to evaluate the feasibility and effectiveness of the proposed approach, we have implemented it as an extension of an existing product derivation tool, called GenArch. The approach has also been evaluated through the automatic instantiation of two MAS-PLs, demonstrating its potential and benefits to product derivation and configuration knowledge specification.
Abstract:
This work analysed the improvement in the characteristics of a soft soil treated with lime, as well as the technical feasibility of using this new material as a bearing layer for shallow foundations. The soil studied is pedologically classified as Humic Gley, and the deposit is located in the municipality of Canoas/RS, alongside the BR 386 highway. The work had the following aims: to study the influence of different lime contents on the stress-strain characteristics of the treated soil; to verify the gain in strength with curing time; to model the stress-strain behaviour of the treated material; and to carry out numerical simulations, using the Finite Element Method, of the load-settlement behaviour of flexible continuous foundations resting on the new material. The optimum lime content of 9% (obtained by the method of Eades & Grim, 1966) was adopted, along with two lower values of 7% and 5%. The following tests were performed on the natural soil and the soil-lime mixtures: Atterberg limits, compaction, grain-size distribution, X-ray diffraction, permeability (triaxial), and isotropically consolidated undrained (CIU) triaxial tests. All tests were carried out for three curing times (7, 28 and 90 days), and the specimens were cured in a humid chamber. To model the stress-strain behaviour of the improved soil, the hyperbolic model was adopted, and for the natural soil, the Modified Cam-Clay model. The hyperbolic model was implemented in the CRISPSO software, developed at the University of Cambridge, England. The software was used in a parametric study to determine the influence of the stabilization process on the load-settlement behaviour of shallow foundations.
From the results obtained, it was concluded that: the method of Eades & Grim (1966) proved inadequate for determining the optimum lime content; the physical characteristics generally improved with the lime treatment; there was no strength gain with curing time; the hyperbolic model represented the behaviour of the soil-lime mixtures well; and placing a layer of treated soil improves the load-settlement behaviour of flexible continuous shallow foundations.
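The hyperbolic (Duncan-Chang-type) stress-strain model adopted for the treated soil can be sketched as follows; the parameter values are illustrative, not fitted to the study's data:

```python
def hyperbolic_stress(strain: float, e_i: float, q_ult: float) -> float:
    """Duncan-Chang-type hyperbolic model: deviatoric stress
    q = strain / (a + b * strain), with a = 1 / e_i (initial tangent
    modulus) and b = 1 / q_ult (asymptotic deviatoric strength)."""
    a = 1.0 / e_i
    b = 1.0 / q_ult
    return strain / (a + b * strain)

# Illustrative parameters: e_i = 50 000 kPa, q_ult = 300 kPa
for eps in (0.001, 0.01, 0.05):
    print(round(hyperbolic_stress(eps, 50_000.0, 300.0), 1))
```

The curve rises steeply at small strains (slope e_i) and flattens toward the asymptote q_ult, which is the shape typically fitted to CIU triaxial results.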
Abstract:
This final-year study investigates the effect of generating intermediate stocks on the main indicators employed in the Theory of Constraints (Throughput, Operating Expense, and Inventory) in an industrial unit with a continuous production process, which uses packaging, raw materials obtained at large scale, and long-haul logistics chains. This type of industry produces immediate-consumption goods, with little variability, in a "push" mode. The main consequence is the loss of synchronism in the logistics chain, resulting in a large quantity of intermediate stocks and growing costs, related mainly to the cost of holding these stocks. Using the five focusing steps and the logical tools of the Theory of Constraints, a management alternative is proposed that includes the Drum-Buffer-Rope algorithm and places the organization in a continuous improvement process, whose impacts are evaluated by computer simulation. Using appropriate statistical techniques and software, a computer simulation model is built from real data of a cement plant. From this model, different scenarios are tested and the optimum condition is found. A conclusion is reached concerning the change in the policy for generating intermediate stocks and its impacts on reducing costs and risks.
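The three Theory of Constraints indicators tracked in the study combine into the usual derived measures; a minimal sketch with illustrative figures (not the plant's actual data):

```python
def toc_measures(throughput: float, operating_expense: float, inventory: float):
    """Theory of Constraints global measures: net profit NP = T - OE,
    return on investment ROI = NP / I."""
    net_profit = throughput - operating_expense
    return net_profit, net_profit / inventory

# Illustrative monthly figures (currency units) for a continuous-process plant
net_profit, roi = toc_measures(900_000, 600_000, 1_500_000)
print(net_profit, roi)  # 300000 0.2
```

Holding larger intermediate stocks raises Inventory, which lowers ROI even when Throughput is unchanged, which is the mechanism the simulation scenarios probe.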
Abstract:
The aim of this research is, first, to devise a protocol that makes it possible to analyse, by means of a set of indicators, the software reuse process in the development of information systems that model business objects. The protocol devised comprises an analytical model and analysis grids to be used in classifying and tabulating the data obtained empirically. For initial validation of the analysis protocol, a case study is carried out. The investigation covers one of the first and, at present, largest projects supplying reusable business-oriented software elements, IBM SANFRANCISCO, as well as the first project developed in Brazil on the basis of what it provides, the Universal Time Sheet system (TIME SHEET System). As for the applicability of the protocol in practice, it proves comprehensive and well suited to understanding the process. As for the results of the case study, the data analysis reveals a situation in which the (researchers') expectations of reuse of business-oriented software elements were higher than what was observed. There was, however, reuse of low-level elements, which provided the infrastructure needed for the project's development. The results, set against the (developers') reuse expectations, are positive, insofar as there were methodological and technological benefits arising from the partnership. On the other hand, some restrictive aspects for the application developer are noted, owing to arbitrary choices made by the provider of reusable elements.
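One simple indicator of the kind such a protocol might include is a reuse rate; a minimal sketch (the counts are illustrative, not the study's data):

```python
def reuse_rate(reused_elements: int, total_elements: int) -> float:
    """Fraction of delivered elements drawn from the reusable framework --
    one common reuse indicator; the study's protocol defines a fuller grid."""
    return reused_elements / total_elements

# Illustrative: 12 of 80 classes in an application came from the framework
print(reuse_rate(12, 80))  # 0.15
```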
Abstract:
This work presents an architecture for Software Development Environments (SDEs). The architecture is based on commercial off-the-shelf (COTS) products, chiefly a Workflow Management System – WMS (Microsoft Exchange 2000 Server – E2K) – and runs on the Internet as its platform, also integrating some of the tools from the large set of applications used in the software development process. The development of a prototype (WOSDIE – WOrkflow-based Software Development Integrated Environment) based on the presented architecture is described in detail, showing the construction stages, the functions implemented, and the mechanisms needed to integrate a WMS, development tools, a database (WSS – Web Storage System), and other components into an SDE. The software process applied in WOSDIE was extracted from the RUP (Rational Unified Process). This process was modelled in the Workflow Designer tool, which allows workflow processes to be modelled within E2K. Launching tools from a Web browser and storing the artefacts produced in a software project are also covered. E2K monitors the events that occur within the WOSDIE environment, determining, from the conditions modelled in Workflow Designer, which activities should start after a given activity ends and who is responsible for executing these new activities (activity assignment). The proposed architecture and the WOSDIE prototype are evaluated against criteria drawn from several works. These evaluations show the characteristics of the proposed architecture in more detail and provide a description of the advantages and problems associated with WOSDIE.
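The event-driven activity assignment described above (next activities and responsible parties determined when an activity completes) can be sketched as follows; the process fragment and roles are invented examples, loosely following RUP-style activity names, not WOSDIE's actual model:

```python
# Toy activity-assignment table: which activities start when one completes,
# and which role is responsible for each. All entries are invented examples.
NEXT_ACTIVITIES = {"requirements": ["analysis"],
                   "analysis": ["design", "test-plan"]}
RESPONSIBLE_ROLE = {"analysis": "analyst",
                    "design": "architect",
                    "test-plan": "tester"}

def on_complete(activity: str):
    """Return (next_activity, responsible_role) pairs to activate."""
    return [(nxt, RESPONSIBLE_ROLE[nxt])
            for nxt in NEXT_ACTIVITIES.get(activity, [])]

print(on_complete("analysis"))  # [('design', 'architect'), ('test-plan', 'tester')]
```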