911 results for goal based
Abstract:
Flow in the world's oceans occurs at a wide range of spatial scales, from a fraction of a metre up to many thousands of kilometres. In particular, regions of intense flow are often highly localised, for example, western boundary currents, equatorial jets, overflows and convective plumes. Conventional numerical ocean models generally use static meshes. The use of dynamically-adaptive meshes has many potential advantages but needs to be guided by an error measure reflecting the underlying physics. A method of defining an error measure to guide an adaptive meshing algorithm for unstructured tetrahedral finite elements, utilising an adjoint or goal-based method, is described here. This method is based upon a functional, encompassing important features of the flow structure. The sensitivity of this functional, with respect to the solution variables, is used as the basis from which an error measure is derived. This error measure acts to predict those areas of the domain where resolution should be changed. A barotropic wind-driven gyre problem is used to demonstrate the capabilities of the method. The overall objective of this work is to develop robust error measures for use in an oceanographic context which will ensure areas of fine mesh resolution are used only where and when they are required. (c) 2006 Elsevier Ltd. All rights reserved.
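The adjoint (goal-based) idea summarised above corresponds to the standard dual-weighted-residual estimate; the notation below is generic rather than taken from the paper, so read it as a sketch of the technique, not the authors' exact formulation:

```latex
% u: exact solution, u_h: discrete solution, J: goal functional,
% R(u_h): residual of the discrete equations, A^*: linearised adjoint operator.
\[
A^{*}(u_h)\,\psi = J'(u_h), \qquad
J(u) - J(u_h) \;\approx\; \bigl(R(u_h),\,\psi\bigr), \qquad
\eta_K = \bigl|\bigl(R(u_h),\,\psi\bigr)_K\bigr|,
\]
% The element indicator \eta_K is large where local residuals most affect the
% goal J, so the mesh is refined there and coarsened elsewhere.
```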
Abstract:
Self-adaptation is emerging as an increasingly important capability for many applications, particularly those deployed in dynamically changing environments, such as ecosystem monitoring and disaster management. One key challenge posed by Dynamically Adaptive Systems (DASs) is the need to handle changes to the requirements and corresponding behavior of a DAS in response to varying environmental conditions. Berry et al. previously identified four levels of requirements engineering (RE) that should be performed for a DAS. In this paper, we propose the Levels of RE for Modeling, which reify the original levels to describe the RE modeling work done by DAS developers. Specifically, we identify four types of developers: the system developer, the adaptation scenario developer, the adaptation infrastructure developer, and the DAS research community. Each level corresponds to the work of a different type of developer to construct goal model(s) specifying their requirements. We then leverage the Levels of RE for Modeling to propose two complementary processes for performing RE for a DAS. We describe our experiences with applying this approach to GridStix, an adaptive flood warning system, deployed to monitor the River Ribble in Yorkshire, England.
Abstract:
Dynamically adaptive systems (DASs) are intended to monitor the execution environment and then dynamically adapt their behavior in response to changing environmental conditions. The uncertainty of the execution environment is a major motivation for dynamic adaptation; it is impossible to know at development time all of the possible combinations of environmental conditions that will be encountered. To date, the work performed in requirements engineering for a DAS includes requirements monitoring and reasoning about the correctness of adaptations, where the DAS requirements are assumed to exist. This paper introduces a goal-based modeling approach to develop the requirements for a DAS, while explicitly factoring uncertainty into the process and resulting requirements. We introduce a variation of threat modeling to identify sources of uncertainty and demonstrate how the RELAX specification language can be used to specify more flexible requirements within a goal model to handle the uncertainty. © 2009 Springer Berlin Heidelberg.
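For readers unfamiliar with RELAX, its flavour can be conveyed with a small invented example; the requirement, the uncertainty factors and their wording are illustrative only, not drawn from this paper (RELAX operators appear in capitals, and the ENV/MON/REL annotations record sources of environmental uncertainty, how they are monitored, and how the two relate):

```
Original requirement: The system SHALL take a reading from every sensor
                      every 10 seconds.
RELAX-ed requirement: The system SHALL take readings from AS MANY sensors
                      AS POSSIBLE, AS CLOSE AS POSSIBLE TO every 10 seconds.
ENV: sensor battery levels; network connectivity
MON: battery monitor; link-quality monitor
REL: battery level and link quality together indicate which sensors are reachable
```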
Abstract:
The behaviour of self-adaptive systems can be emergent, which means that the system's behaviour may be seen as unexpected by its customers and its developers. A self-adaptive system therefore needs to earn the confidence of its customers, and it also needs to resolve any surprise on the part of the developer during testing and maintenance. We believe that these two functions can only be achieved if a self-adaptive system is also capable of self-explanation. We argue that a self-adaptive system's behaviour needs to be explained in terms of the satisfaction of its requirements. Since self-adaptive system requirements may themselves be emergent, we propose the use of goal-based requirements models at runtime to offer self-explanation of how a system is meeting its requirements. We demonstrate the analysis of runtime requirements models to yield a self-explanation codified in a domain-specific language, and discuss possible future work.
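As a concrete (though invented) illustration of the proposal, a runtime goal model can be traversed to produce explanation text; the sketch below assumes a simple goal tree with satisfaction flags and adaptation tactics, and is not the authors' domain-specific language:

```python
# Sketch only: deriving a self-explanation from a runtime goal model by
# reporting each goal's satisfaction status and the adaptation that achieved it.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Goal:
    name: str
    satisfied: bool
    children: List["Goal"] = field(default_factory=list)
    tactic: Optional[str] = None  # adaptation used to (re)satisfy this goal

def explain(goal: Goal, depth: int = 0) -> List[str]:
    """Walk the goal tree, emitting one explanation line per goal."""
    status = "satisfied" if goal.satisfied else "NOT satisfied"
    line = f"{'  ' * depth}{goal.name} is {status}"
    if goal.tactic:
        line += f" (via adaptation: {goal.tactic})"
    lines = [line]
    for child in goal.children:
        lines.extend(explain(child, depth + 1))
    return lines

# Hypothetical flood-warning example in the spirit of GridStix:
root = Goal("Predict flooding", True, children=[
    Goal("Collect depth readings", True, tactic="switch to mesh networking"),
    Goal("Deliver data to gateway", True),
])
print("\n".join(explain(root)))
```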
Abstract:
Purpose: The purpose of this paper is to formulate a conceptual framework for urban sustainability indicator selection. This framework will be used to develop an indicator-based evaluation method for assessing the sustainability of residential neighbourhood developments in Malaysia. Design/methodology/approach: We provide a brief overview of existing evaluation frameworks for sustainable development assessment. We then develop a conceptual Sustainable Residential Neighbourhood Assessment (SNA) framework that combines a four-pillar sustainability framework (environmental, social, economic and institutional) with both domain-based and goal-based general frameworks. This merger offers the advantages of each individual framework while overcoming some of their weaknesses when used to develop an urban sustainability evaluation method for assessing residential neighbourhoods. Originality/value: This approach highlights that many existing frameworks for evaluating urban sustainability do not extend to assessing housing sustainability at the local level. Practical implications: The indicator-based Sustainable Neighbourhood Assessment framework is expected to offer planners and developers a practical mechanism for evaluating and monitoring the sustainability performance of residential neighbourhood developments.
Abstract:
Effective enterprise information security policy management requires review and assessment activities to ensure that information security policies are aligned with business goals and objectives. Because security policy management involves both the policy development process and the security policy it produces, assessing security policy calls for goal-based metrics covering these two elements. However, current security management assessment methods only provide checklist-style assessments predefined by industry best practice and do not support the development of specific goal-based metrics. Drawing on theories from the literature, this paper proposes an Enterprise Information Security Policy Assessment approach that expands on the Goal-Question-Metric (GQM) approach. The proposed approach is then applied in a case scenario to illustrate its practical application. It is shown that the proposed framework addresses the requirement for developing assessment metrics and allows process-based and product-based assessment to be undertaken concurrently. Recommendations for further research include empirical work to validate the propositions and case-study applications of the proposed assessment approach to identify further enhancements.
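The GQM structure the paper builds on can be shown in miniature; the goal, questions and metrics below are invented for illustration, chosen to mirror the paper's pairing of process-based and product-based assessment:

```python
# Minimal GQM sketch: a measurement goal is refined into questions, and each
# question is answered by one or more quantifiable metrics.
gqm = {
    "goal": "Assess alignment of the access-control policy with business objectives",
    "questions": [
        {   # process-based assessment: the policy development process
            "question": "Is the policy development process being followed?",
            "metrics": ["% of policy revisions with a documented review",
                        "days since the last management sign-off"],
        },
        {   # product-based assessment: the security policy as output
            "question": "Does the policy, as written, meet its objectives?",
            "metrics": ["% of systems compliant with the policy's rules",
                        "number of policy exceptions granted per quarter"],
        },
    ],
}

print(gqm["goal"])
for q in gqm["questions"]:
    print(" ", q["question"])
    for metric in q["metrics"]:
        print("    metric:", metric)
```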
Abstract:
The current trend among many universities is to increase the number of courses available online. However, there are fundamental problems in transferring traditional education courses to virtual formats. Delivering current curricula in an online format does not assist in overcoming the negative effects on student motivation which are inherent in providing information passively. Using problem-based learning (PBL) online is a method by which computers can become a tool to encourage active learning among students. The delivery of curricula via goal-based scenarios allows students to learn at different rates and can successfully shift online learning from memorization to discovery. This paper reports on a Web-based e-health course that has been delivered via PBL for the past 12 months. Thirty distance-learning students undertook postgraduate courses in e-health delivered via the Internet (asynchronous communication). Data collected via online student surveys indicated that the PBL format was both flexible and interesting. PBL has the potential to increase the quality of the educational experience of students in online environments.
Abstract:
Despite advances in the field of workflow flexibility, there is still insufficient support for dealing with unforeseen exceptions. In particular, it is challenging to find a solution which preserves the intent of the process as much as possible when such exceptions are encountered. This challenge can be alleviated by making the connection between a process and its objectives more explicit. This paper presents a demo illustrating the blended workflow approach where two specifications are fused together, a "classic" process model and a goal model. End users are guided by the process model but may deviate from this model whenever unexpected situations are encountered. The two models involved provide views on the process and the demo shows how one can switch between these views and how they are kept consistent by the blended workflow engine. A simple example involving the making of a doctor's appointment illustrates the potential advantages of the proposed approach to both researchers and developers.
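How the two views can be kept consistent is easiest to see in a toy version of the paper's appointment example; the step names and data conditions below are invented, and the real blended workflow engine is far richer than this sketch:

```python
# Sketch: a "classic" activity view beside a goal view. Each goal is defined by
# the facts it requires, so a deviation from the activity sequence can still be
# checked against the goals it must ultimately satisfy.
process_steps = ["register patient", "choose doctor", "book slot"]
produced_facts = {
    "register patient": {"patient"},
    "choose doctor": {"doctor"},
    "book slot": {"slot"},
}
goal_conditions = {"AppointmentMade": {"patient", "doctor", "slot"}}

def goals_satisfied(executed):
    facts = set()
    for step in executed:
        facts |= produced_facts[step]
    return {goal: needed <= facts for goal, needed in goal_conditions.items()}

print(goals_satisfied(process_steps))                      # normal path: goal met
print(goals_satisfied(["register patient", "book slot"]))  # deviation: goal unmet
```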
Abstract:
Evaluation practices have pervaded Finnish society and the welfare state. At the same time the term 'effectiveness' has become a powerful organising concept in welfare state activities. The aim of the study is to analyse how the outcome-oriented society came into being through historical processes, answering the question of how social policy and welfare state practices were brought under the governance of the concept of 'effectiveness'. Discussions about social imagination, Michel Foucault's conceptions of the history of the present and of governmentality, genealogy and archaeology, along with Ian Hacking's notions of dynamic nominalism and styles of reasoning, are used as the conceptual and methodological starting points for the study. In addition, Luc Boltanski's and Laurent Thévenot's ideas of 'orders of worth', regimes of evaluation in everyday life, are employed. Usually, evaluation is conceptualised as an autonomous epistemic culture and practice (evaluation as epistemic practice), but evaluation is here understood as knowledge-creation processes elementary to different epistemic practices (evaluation in epistemic practices). The emergence of epistemic cultures and styles of reasoning about the effectiveness or impacts of welfare state activities is analysed through Finnish social policy and social work research. The study uses case studies which represent debates and empirical research dealing with the effectiveness and quality of social services and social work. While uncertainty and doubts over the effects and consequences of welfare policies have always been present in discourses about social policy, the theme has not been acknowledged much in social policy research. To resolve these uncertainties, eight styles of reasoning about such effects have emerged over time. These are the statistical, goal-based, needs-based, experimental, interaction-based, performance measurement, auditing and evidence-based styles of reasoning. Social policy research has contributed in various ways to the creation of these epistemic practices. The transformation of the welfare state, starting at the end of the 1980s, increased market-orientation and trimmed public welfare responsibilities, and led to the adoption of the New Public Management (NPM) style of leadership. Due to these developments the concept of effectiveness made a breakthrough, and new accountabilities, with their knowledge tools for performance measurement, auditing and evidence-based styles of reasoning, became more dominant in the ruling of the welfare state. Social sciences and evaluation have developed a heteronomous relation with each other, although divergent tendencies still remain between them. Key words: evaluation, effectiveness, social policy, welfare state, public services, sociology of knowledge
Abstract:
To carry out high-precision three-dimensional "integrated" secondary seismic exploration suited to the characteristics of the Biyang Depression, and by combining scientific research with production work, a suite of high-precision seismic acquisition, processing and interpretation technologies appropriate to this mature exploration area in the east was developed during the implementation process, with the following results: 1. A high-precision 3-D seismic exploration technology series for the shallow fault-block groups of the Biyang Depression. To highlight the shallow seismic signal, goal-based observation-system design, small-bin receiving and a range of processing techniques protecting the shallow section were applied; in interpretation, full-volume 3-D visualization combined with coherence analysis enabled fine identification of the unconformity surface 50-100 m down and of the small faults, with throws of about 10 m, distributed beneath it, improving the recognition of small fault blocks and stratigraphic-unconformity traps. 2. A high-precision 3-D seismic exploration technology series for the low signal-to-noise deep section of the Biyang Depression. Using model-based forward modelling and illumination analysis, wide-angle full-coverage observation-system design, multiple suppression, processing that raises the energy of deep seismic reflections, detailed interpretation and comprehensive reservoir description, a number of traps of different types were identified. 3. A high-precision 3-D seismic exploration technology series for the steep structures of the southern Biyang Depression. New imaging techniques based on seismic-wave scattering theory, together with pre-stack time migration and depth migration using a high-precision velocity model and other high-precision processing technologies, provided a wealth of information for identifying local structures on the southern steep slope and for predicting sandstone distribution and analysing bedrock-surface patterns.
Abstract:
The decline in oil reserves and the environmental consequences of relying on fossil fuels in diesel engines have led to a search for alternative fuels. This research, grounded in renewable energy sources, has become essential in the face of growing energy demand and the limited supply of fossil fuels. Waste cooking oil, animal fat and other residues of biological origin, such as spent coffee grounds, are examples of raw materials for biodiesel production. Valorising them is of interest from both an environmental and an economic perspective, since it not only increases the flexibility and diversification of feedstocks but also contributes to cost stability and to changes in agricultural and land-use policies. It is in this context that biodiesel and spent coffee grounds come together: the aim here is to study, at laboratory scale, the production of biodiesel from spent coffee grounds by enzymatic transesterification, seeking the best reaction conditions. Starting with the characterisation of the spent coffee grounds, several parameters were evaluated before and after extraction of the oil, notably: moisture content (16.97% and 6.79%), ash content (1.91% and 1.57%), nitrogen content (1.71% and 2.30%), protein content (10.7% and 14.4%), carbon content (70.2% and 71.7%), crude cellulose content (14.77% and 18.48%), lignin content (31.03% and 30.97%) and higher heating value (19.5 MJ/kg and 19.9 MJ/kg). In summary, most of these values do not differ substantially from those reported in the literature, highlighting the potential of this biomass as a heat source for combustion and energy generation. Since characterising the oil extracted from the coffee grounds was an objective preceding biodiesel production, its most significant parameters were evaluated: kinematic viscosity of 38.04 mm2/s, density of 0.9032 g/cm3, heating value of 37.9 kcal/kg, iodine value of 63.0 g I2/100 g oil, water content of 0.15%, acid value of 44.8 mg KOH/g oil, flash point above 120 °C and fatty acid content of 82.8%. Preliminary tests were first carried out to select the lipase (Lipase RMIM, TL 100L or CALB L) and the alcohol (pure methanol or ethanol) best suited to biodiesel production; a yield of 83.5% was obtained by transesterification mediated by lipase RMIM using ethanol as the alcohol. A further objective being the optimisation of the enzymatic transesterification process, a three-variable central composite design (ethanol:oil molar ratio, enzyme concentration and temperature) was run with the JMP 8.0 software; the best conditions found were an ethanol:oil molar ratio of 5:1, addition of 4.5% (w/w) of enzyme and a temperature of 45 °C, which led to an experimental yield of 96.7% and an ester content of 87.6%. Under these conditions the theoretical yield was 99.98%. The effect of adding water to the ethanol, that is, of varying the ethanol concentration by adding water to obtain ethanol contents of 92%, 85% and 75%, was also studied. It was found that at 92% ethanol the transesterification yield increased (97.2%) with an ester content of 92.2%, whereas with more added water (ethanol contents of 75% and 85%) both the final ester content (77.2% and 89.9%) and the reaction yield (84.3% and 91.9%) decreased. This indicates that the hydrolysis reaction occurs to a greater extent, shifting the equilibrium away from the formation of the products, that is, the esters. Finally, the costs associated with the biodiesel production process were estimated for the 27 runs carried out in this work, corresponding to 767.4 g of biodiesel produced; the cost of the reagents exceeded the energy cost, at €156.16 and €126.02 respectively. Naturally, costs at industrial scale would not be expected to be of this order of magnitude, not least because of economies of scale and because the enzymes used in the process would be reused many times.
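As a quick back-of-envelope reading of those cost figures (the totals are taken from the abstract; the per-kilogram cost is a derived illustration, not a number reported in the work):

```python
# Laboratory-scale cost check for the 27 runs reported above.
biodiesel_g = 767.4
reagent_cost_eur = 156.16
energy_cost_eur = 126.02

total_eur = reagent_cost_eur + energy_cost_eur        # 282.18 EUR
cost_per_kg = total_eur / (biodiesel_g / 1000.0)      # roughly 368 EUR/kg
print(f"total: {total_eur:.2f} EUR, ~{cost_per_kg:.0f} EUR/kg at lab scale")
```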
Abstract:
In psycholinguistic research it is widely assumed that the evaluation of information with respect to its truth or plausibility (epistemic validation; Richter, Schroeder & Wöhrmann, 2009) is a strategic, optional process that follows comprehension (e.g., Gilbert, 1991; Gilbert, Krull & Malone, 1990; Gilbert, Tafarodi & Malone, 1993; Herbert & Kübler, 2011). A growing number of studies, however, challenges this two-step model of comprehension and validation, directly or indirectly. In particular, findings on Stroop-like stimulus-response compatibility effects, which arise when positive and negative responses must be given orthogonally to the task-irrelevant truth value of sentences (e.g., a positive response after reading a false sentence, or a negative response after reading a true sentence; the epistemic Stroop effect, Richter et al., 2009), suggest that readers perform a non-strategic check of the validity of information already during comprehension. Building on these findings, the aim of this dissertation was a further examination of the assumption that comprehension involves a non-strategic, routine, knowledge-based validation process (epistemic monitoring; Richter et al., 2009). To this end, three empirical studies with different emphases were conducted. Study 1 investigated whether evidence of epistemic monitoring can also be found for information that is not clearly true or false but merely more or less plausible. Using the epistemic Stroop paradigm of Richter et al. (2009), a compatibility effect of task-irrelevant plausibility on the latencies of positive and negative responses was demonstrated in two different experimental tasks, indicating that epistemic monitoring is also sensitive to gradual differences in how well information fits world knowledge. Moreover, the results show that the epistemic Stroop effect is indeed driven by plausibility and not by differences in the predictability of plausible and implausible information. Study 2 tested the hypothesis that epistemic monitoring does not require an evaluative mindset. In contrast to findings by other authors (Wiswede, Koranyi, Müller, Langner, & Rothermund, 2013), this study showed a compatibility effect of task-irrelevant truth value on response latencies in a completely non-evaluative task. The results suggest that epistemic monitoring does not depend on an evaluative mindset, but possibly on the depth of processing. Study 3 examined the relationship between comprehension and validation by investigating the online effects of plausibility and predictability on eye movements during the reading of short texts. In addition, the potential modulation of these effects by epistemic markers indicating the certainty of information (e.g., certainly or perhaps) was examined. Consistent with the assumption of a fast, non-strategic epistemic monitoring process, interactive effects of plausibility and the presence of epistemic markers emerged on indicators of early comprehension processes. This suggests that the communicated certainty of information is taken into account by the monitoring process. Overall, the findings argue against a conceptualisation of comprehension and validation as non-overlapping stages of information processing. Rather, an evaluation of truth or plausibility based on world knowledge appears, at least to some extent, to be an obligatory, non-strategic component of language comprehension. The implications of these findings for current models of language comprehension and recommendations for further research on the relationship between comprehension and validation are discussed.
Abstract:
This research allows the reader to understand the influence that the implementation of the Integrated Mass Transit System (Sistema Integrado de Transporte Masivo, SITM) has had on the management of land-use planning in the Bucaramanga Metropolitan Area. The document explains and analyses the transformations that could be generated in the areas surrounding the system, starting from the recognition that there are areas known as centralities, as well as other strategic operations, which stand out not only for the role they play in the municipal and metropolitan context but also for the opportunities they offer for the development of services, facilities and employment, all of which will ultimately contribute to achieving the desired polycentric development model for the metropolitan area.
Abstract:
A simple and practical technique is described for assessing the risks, that is, the potential for error and consequent loss, in software system development, using information acquired during a requirements engineering phase. The technique uses a goal-based requirements analysis as a framework to identify and rate a set of key issues in order to arrive at estimates of the feasibility and adequacy of the requirements. The technique is illustrated by showing how it was applied to a real systems development project, and how problems in that project could have been identified earlier, thereby avoiding costly additional work and unhappy users.
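The rating idea can be made concrete with a small sketch; the issue names, the 0-3 scale and the aggregation rule are invented for illustration and are not the specific scheme described in the paper:

```python
# Sketch: rate key issues found during goal-based requirements analysis,
# aggregate them into a rough risk score, and flag the worst issues early.
ratings = {  # 0 = no concern ... 3 = severe concern
    "goal ambiguity": 2,
    "stakeholder disagreement": 3,
    "unvalidated assumption about the environment": 1,
}

max_score = 3 * len(ratings)
risk = sum(ratings.values()) / max_score   # 0.0 (low) .. 1.0 (high)
feasibility = 1.0 - risk
print(f"risk ~ {risk:.2f}, feasibility ~ {feasibility:.2f}")
for issue, rating in sorted(ratings.items(), key=lambda kv: -kv[1]):
    if rating >= 2:
        print("flag early:", issue)
```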