781 results for Process-dissociation Framework


Relevance:

80.00%

Publisher:

Abstract:

As software permeates every sector and level of society, it is increasingly delivered as a service to organizations and even to the general public: software grows in scale, user requirements multiply, and functional and performance demands become more complex. Consequently, quality requirements such as usability, reliability, and trustworthiness keep rising. As the software industry has developed, software process technology has gradually been applied to product development, and the idea that "quality is built in during the production process" has been accepted by software organizations. Its core is to assure product quality by planning, controlling, and improving the software process, thereby improving the organization's business performance. Software process measurement, a key activity of process management and improvement, is therefore receiving growing attention from software organizations.

Through process management, an organization can characterize how well project or process goals are met, find the root causes of major process or product deviations, and then carry out process improvement. During implementation, however, organizations face diverse development processes, numerous process performance indicators, and complex statistical analysis methods; they must weigh the soundness and complexity of quantitative management methods against their implementation cost, which makes effective process measurement challenging. Based on empirical software engineering methods, this thesis proposes a multi-granularity, multi-dimensional software process measurement framework together with the key technique for realizing it, a method for establishing and maintaining software process performance baselines, and introduces a quantitative schedule-control model for software projects under this framework, supporting effective process management and improvement.

The main contributions of this thesis include:

A multi-granularity, multi-dimensional software process measurement framework (Multi-granularity Multi-dimensional software Process Measurement Framework, M2-PMF). By jointly considering the feature dimensions of the information needed for process management and improvement and the process-management granularity of the organization, the framework guides organizations bottom-up, through an entity layer, a measurement-and-analysis layer, and a goal layer, to build an open, improvement-oriented system of indicators and models covering the full software life cycle. It supports tailoring and customizing the measurement system for a given environment, gives organizations a clear view of their process capability and performance, strengthens their control over the software process, and safeguards the quality of both the development process and the product.

A statistics-based method for establishing and refining process performance baselines (Baseline-Statistic-Refinement, BSR). BSR effectively establishes and maintains process performance baselines, helping organizations move from qualitative to quantitative management. Using run charts, it extracts as much improvement information as possible while the process is still unstable and data samples are scarce, identifies improvement opportunities, determines improvement paths, and helps organizations efficiently remedy the obvious weaknesses in their processes. Once the process stabilizes, statistical tools such as control charts, Pareto charts, cause-and-effect diagrams, and scatter plots are used to analyze process performance and to establish and continually refine process performance baselines.

Under the M2-PMF framework, a quantitative project schedule-control model, SEVM, based on Statistical Process Control (SPC) and Earned Value Management (EVM). By statistically controlling the project schedule index, the model analyzes its stability and, through an estimation model, extrapolates the project's total schedule deviation from the current earned-value data and brings it under control. It supports quantitative schedule control and raises the likelihood of on-time delivery.

Finally, the thesis reports the practical application of the proposed measurement framework and quantitative management methods in several software organizations in China. The case studies show that the methods and models are broadly adaptable and highly operable; applying them enables effective estimation, measurement, and control of projects, thereby improving product quality and customer satisfaction.
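A minimal sketch of the SEVM idea described in this abstract: monitor the schedule performance index (SPI = EV/PV) on an SPC individuals chart and extrapolate total duration from earned-value data. The function names, the SPI series, and the 40-week plan are invented for illustration, not taken from the thesis.

```python
# Hypothetical SPI monitoring sketch: SPC individuals-chart limits plus an
# EVM-style duration forecast. All numbers below are illustrative.

def spi_control_limits(spi_history):
    """Individuals-chart limits from the average moving range."""
    n = len(spi_history)
    center = sum(spi_history) / n
    moving_ranges = [abs(spi_history[i] - spi_history[i - 1]) for i in range(1, n)]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    sigma = mr_bar / 1.128                 # d2 constant for a moving range of 2
    return center - 3 * sigma, center, center + 3 * sigma

def forecast_duration(planned_duration, spi):
    """If the current SPI persists, actual duration is roughly planned / SPI."""
    return planned_duration / spi if spi > 0 else float("inf")

spi = [0.95, 1.02, 0.98, 0.97, 1.01, 0.88]     # hypothetical weekly SPI values
lcl, center, ucl = spi_control_limits(spi)
unstable = [x for x in spi if x < lcl or x > ucl]
print(f"LCL={lcl:.3f} CL={center:.3f} UCL={ucl:.3f}, out-of-control: {unstable}")
print(f"forecast duration: {forecast_duration(40, spi[-1]):.1f} weeks")
```

The statistical-control step answers whether the schedule index is stable; only then is the extrapolation of total schedule deviation meaningful.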

Relevance:

80.00%

Publisher:

Abstract:

As software applications keep expanding in scope and complexity, the development process becomes harder to control and software quality harder to assure. Quality management has evolved from quality control centered on inspecting the software product to a more mature, proactive quality assurance that manages the production process itself.

As a hallmark of high-maturity software processes, quantitative process management has gradually been accepted and practiced by software organizations. It characterizes how well project or process goals are met and finds the root causes of major process or product deviations. During implementation, however, organizations face diverse development processes, numerous process performance indicators, and complex statistical analysis methods; they must weigh the soundness and complexity of quantitative methods against implementation cost, which makes effective quantitative process management challenging. Centered on defect data, this thesis proposes a defect-driven quantitative process management framework and two quantitative management methods based on it, supporting organizations in collecting the data quantitative management needs, building process performance baselines and models, and quantitatively managing software projects. The framework fits iterative, waterfall, and other development methods and supports quantitative management across the full project life cycle.

The main contributions of this thesis include:

A defect-driven quantitative process management framework (Defect-driven Quantitative Process Management framework, DefQPM). In quantitative management, assuring software quality is central, and quality is closely tied to defects: every engineering activity in development (requirements, design, coding, testing, and so on) is accompanied by defect injection, removal, and leakage. Taking defect data as the starting point, DefQPM guides organizations bottom-up through a data layer, a model layer, and a usage layer to analyze process performance, identify correlations among metrics, build process performance baselines and models suited to their own situation, and implement quantitative process management effectively. DefQPM specifies the process and mechanisms for quantitative management; on top of it, quantitative methods for specific scenarios and solutions for specific organizations can be built.

A DefQPM-based quantitative management method for iterative projects (process performance Baseline based Defect-driven iteration management, BiDefect). Iterative development is widely used for its flexibility and its ability to accommodate requirement changes, yet quantitatively managing iterative projects remains challenging: activities run in parallel many times, suitable control points are hard to find, and metrics and analysis methods tailored to iterations are lacking. Based on DefQPM, this thesis studies the typical quantitative-management needs of iterative projects (for example, assuring the quality of the final delivered product by controlling the quality of each iteration's work products) and proposes a method that addresses the main challenges. Focusing on defect injection, removal, and leakage, it guides overall estimation and measurement during project planning; during execution it evaluates process execution and product quality, identifies anomalies promptly, and takes corrective action, thereby guiding estimation and control of cost, schedule, and quality in subsequent work.

A DefQPM-based quantitative management method for the testing process (Quantitatively Managing Testing process, TestQM). Testing is an important quality-control activity and, for high-maturity organizations, one that must itself be quantitatively managed. Defect detection and defect fixing are the two main testing activities and require people with different skills, yet popular estimation methods lump their effort and schedule together within testing, which is inaccurate. Based on DefQPM, this thesis proposes a quantitative management method specifically for testing. It focuses on the distribution of defects by injection phase and on the correlations between defects and fixing effort and between defects and fixing schedule; it guides early estimation of the testing process and, during testing, adjusts fixing effort and schedule according to the defect distribution by injection phase, keeping the process under control. An empirical TestQM model for Web application development projects is also presented.

Finally, the thesis reports applications of these methods in Chinese software organizations, including BiDefect in iterative development projects and TestQM in Web application development projects. The change in process performance before and after adopting quantitative process management shows that the methods enable effective estimation, measurement, re-estimation, and control, thereby improving product quality and customer satisfaction.
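The defect injection/removal/leakage accounting that defect-driven methods like DefQPM build on can be sketched in a few lines. The phase names, counts, and the phase-containment metric below are illustrative inventions, not data from the thesis.

```python
# Toy defect-flow ledger: defects injected and removed per phase, with a
# simple phase containment effectiveness (PCE) metric. Numbers are invented.

phases = ["requirements", "design", "coding", "testing"]
injected = {"requirements": 12, "design": 25, "coding": 60, "testing": 3}
removed = {"requirements": 8, "design": 15, "coding": 40, "testing": 30}

escaped = 0                                   # defects leaked from earlier phases
for p in phases:
    at_risk = escaped + injected[p]           # defects present during this phase
    pce = removed[p] / at_risk                # fraction caught before leaking on
    escaped = at_risk - removed[p]
    print(f"{p:12s} PCE={pce:.2f} escaped={escaped}")
print(f"defects leaked to the field: {escaped}")
```

A baseline of such per-phase figures across past projects is what lets a new project's injection and removal counts be judged as normal or anomalous.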

Relevance:

80.00%

Publisher:

Abstract:

Many questions about associative priming remain unresolved. This study used the process-dissociation paradigm, a perceptual identification task, and a speeded naming task, together with near-infrared spectroscopy, to systematically investigate priming for new associations and its brain mechanisms. The results showed an interaction between level of processing and unitization in their effect on associative priming. Compared with shallowly encoded unrelated word pairs, activation in both sides of the prefrontal lobe was stronger, suggesting that the prefrontal lobe is involved in memory for new associations. Patients with medial temporal lobe lesions and patients with frontal lobe lesions were tested using the perceptual identification task and the speeded naming task, respectively. Both brain regions participated in associative priming: the medial temporal lobe mediated unitization between unrelated items, while the frontal lobe contributed to priming for new associations through elaborative processing, inhibition of irrelevant information, selective attention to tasks, and the establishment of effective strategies. In addition, normal subjects needed to be aware of the relationship between study and test in order to form associative priming, and patients with dense memory deficits could not form memory for new associations. In conclusion, the results further demonstrated that the perceptual representation system alone cannot support priming for new associations; the medial temporal and frontal lobes both play roles in priming for new associations, and associative priming is related to conscious retrieval processing.
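The process-dissociation paradigm mentioned here separates controlled and automatic memory influences using the standard inclusion/exclusion equations (Jacoby, 1991): R = P(inclusion) - P(exclusion) and A = P(exclusion) / (1 - R). The probabilities below are hypothetical, and whether this particular study used exactly these estimators is an assumption.

```python
# Standard process-dissociation estimates from inclusion/exclusion performance.
# Input probabilities are hypothetical.

def process_dissociation(p_inclusion, p_exclusion):
    """Return (R, A): recollection (controlled) and automatic components."""
    r = p_inclusion - p_exclusion
    a = p_exclusion / (1 - r) if r < 1 else float("nan")
    return r, a

r, a = process_dissociation(p_inclusion=0.70, p_exclusion=0.30)
print(f"recollection R = {r:.2f}, automatic influence A = {a:.2f}")
```

The logic: on inclusion tests either process yields the studied response, P(inclusion) = R + (1 - R)A, while on exclusion tests only the automatic process does, P(exclusion) = (1 - R)A; subtracting isolates R.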

Relevance:

80.00%

Publisher:

Abstract:

Treasure et al. (2004) recently proposed a new subspace-monitoring technique, based on the N4SID algorithm, within the multivariate statistical process control framework. This dynamic-monitoring method requires considerably fewer variables to be analysed than dynamic principal component analysis (PCA). The contribution charts and variable reconstruction traditionally employed for static PCA are analysed here in a dynamic context. Both may be affected by the ratio of the number of retained components to the total number of analysed variables; particular problems arise when this ratio is large, and a new reconstruction chart is introduced to overcome them. The utility of the dynamic contribution chart and variable reconstruction is shown in a simulation and by application to industrial data from a distillation unit.
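The static-PCA contribution chart that this work extends to the dynamic case can be sketched briefly: project a new sample onto the retained components, and attribute the squared prediction error (SPE, or Q statistic) to individual variables. The data, the number of retained components, and the injected fault below are all invented; this is the static building block, not the authors' subspace method.

```python
# Static-PCA SPE contribution sketch on synthetic data with two correlated
# variables; a fault on variable 1 breaks the correlation and shows up in
# the residual contributions of the correlated pair.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:, 1] = 0.9 * X[:, 0] + 0.1 * X[:, 1]        # variables 0 and 1 correlated

mu, sd = X.mean(axis=0), X.std(axis=0)
Xs = (X - mu) / sd
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
P = Vt[:2].T                                    # retain 2 principal components

x_new = (np.array([0.0, 3.0, 0.0, 0.0, 0.0]) - mu) / sd   # fault on variable 1
residual = x_new - P @ (P.T @ x_new)
spe = float(residual @ residual)                # squared prediction error (Q)
contributions = residual ** 2                   # per-variable SPE contributions
print("SPE:", round(spe, 2), "largest contributor:", int(np.argmax(contributions)))
```

Because the fault violates the learned correlation, both members of the correlated pair carry large contributions, which is exactly the smearing behaviour that makes contribution analysis sensitive to the retained-to-total component ratio discussed above.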

Relevance:

80.00%

Publisher:

Abstract:

Master's dissertation in Cognitive Science, Universidade de Lisboa, Faculdade de Psicologia, Faculdade de Medicina, Faculdade de Ciências, Faculdade de Letras, 2014

Relevance:

80.00%

Publisher:

Abstract:

Dissertation presented to the Escola Superior de Educação de Lisboa for the degree of Master in Educational Sciences, specialization in Supervision in Education

Relevance:

80.00%

Publisher:

Abstract:

The historically reactive approach to identifying and mitigating safety problems involves selecting black spots or hot spots by ranking locations based on crash frequency and severity. The approach focuses mainly on the corridor level and does not take into consideration the exposure rate (vehicle miles traveled) or the socio-demographic information of the study area, both of which are very important in the transportation planning process. A larger analysis unit at the Transportation Analysis Zone (TAZ) or network planning level should be used to address the community's future development needs and to incorporate safety into the long-range transportation planning process. In this study, existing planning tools (such as the PLANSAFE models presented in NCHRP Report 546) were evaluated for forecasting safety in small and medium-sized communities, particularly as related to changes in socio-demographic characteristics, traffic demand, road network, and countermeasures. The research also evaluated the applicability of the Empirical Bayes (EB) method to network-level analysis, and investigated the application of the United States Road Assessment Program (usRAP) protocols at the local urban road network level. The applicability of these three methods was evaluated for the City of Ames, Iowa. The outcome of this research is a systematic process and framework for considering road safety issues explicitly in the small and medium-sized community transportation planning process and for quantifying the safety impacts of new developments and policy programs.
More specifically, quantitative safety may be incorporated into the planning process through effective visualization and increased awareness of safety issues (usRAP), identification of high-risk locations with potential for improvement (usRAP maps and EB), countermeasures for high-risk locations (EB before-and-after study and PLANSAFE), and socio-economic and demographically induced changes at the planning level (PLANSAFE).
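The Empirical Bayes method evaluated above blends a site's observed crash count with the prediction of a safety performance function (SPF), weighted by the SPF's overdispersion parameter, to temper regression-to-the-mean. The SPF value, overdispersion parameter, and counts below are invented, and the weight formula follows the common convention w = 1/(1 + k·µ); conventions for k vary between references.

```python
# Empirical Bayes expected-crash estimate: a convex blend of the SPF
# prediction and the site's observed count. All inputs are illustrative.

def eb_expected_crashes(observed, spf_per_year, k, years=1.0):
    """w * SPF_prediction + (1 - w) * observed, with w from overdispersion k."""
    mu = spf_per_year * years                 # SPF-predicted crashes over the period
    w = 1.0 / (1.0 + k * mu)                  # more weight on the SPF when k*mu is small
    return w * mu + (1.0 - w) * observed

# Hypothetical site: 9 crashes observed in 3 years, SPF predicts 2.0/year.
estimate = eb_expected_crashes(observed=9, spf_per_year=2.0, k=0.3, years=3.0)
print(f"EB expected crashes over 3 years: {estimate:.2f}")
```

The estimate (about 7.9) sits between the raw count (9) and the SPF prediction (6), shrinking the noisy observation toward the prediction for similar sites.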

Relevance:

80.00%

Publisher:

Abstract:

Web services from different partners can be combined into applications that realize a more complex business goal. Such applications, built as Web service compositions, define how interactions between Web services take place in order to implement the business logic. Web service compositions not only have to provide the desired functionality but also have to comply with certain Quality of Service (QoS) levels. Maximizing the users' satisfaction, also reflected as Quality of Experience (QoE), is a primary goal in a Service-Oriented Architecture (SOA). Unfortunately, in a dynamic environment like SOA, unforeseen situations may arise, such as services becoming unavailable or not responding in the desired time frame. In such situations, appropriate actions need to be triggered to avoid violating QoS and QoE constraints. In this thesis, solutions are developed to manage Web services and Web service compositions with regard to QoS and QoE requirements. The Business Process Rules Language (BPRules) was developed to manage Web service compositions when undesired QoS or QoE values are detected. BPRules provides a rich set of management actions that may be triggered to control the service composition and improve its quality behavior. Regarding the quality properties, BPRules distinguishes between the QoS values promised by the service providers, the QoE values assigned by end-users, the monitored QoS as measured by our BPR framework, and the predicted QoS and QoE values. BPRules facilitates the specification of user groups characterized by different context properties and allows triggering a personalized, context-aware service selection tailored to the specified user groups. In a service market where a multitude of services with the same functionality but different quality values are available, the right services need to be selected to realize the service composition.
We developed new and efficient heuristic algorithms for choosing high-quality services for the composition. BPRules offers the possibility to integrate multiple service selection algorithms, and the selection algorithms are also applicable to non-linear objective functions and constraints. The BPR framework includes new approaches for context-aware service selection and quality-property prediction; we consider the location of users and services as a context dimension for predicting response time and throughput. The BPR framework combines all new features and contributions into a comprehensive management solution and facilitates flexible monitoring of QoS properties without modifying the description of the service composition. We show how the different modules of the BPR framework work together to execute the management rules, evaluate how our selection algorithms outperform a genetic algorithm from related research, and show how context data can be used for personalized prediction of response time and throughput.
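The service-selection problem described here (one candidate per abstract task, maximizing aggregate quality under constraints) can be illustrated with a toy greedy heuristic. This is not the authors' algorithm; the tasks, candidate services, weights, and response-time budget are all invented.

```python
# Toy greedy QoS-aware selection: per task, pick the candidate with the best
# weighted utility that keeps total response time within a global budget.

candidates = {
    "payment":  [{"name": "payA", "time": 120, "cost": 5, "rating": 4.5},
                 {"name": "payB", "time": 60,  "cost": 9, "rating": 4.0}],
    "shipping": [{"name": "shipA", "time": 200, "cost": 3, "rating": 3.8},
                 {"name": "shipB", "time": 90,  "cost": 6, "rating": 4.6}],
}

def utility(s, w_time=0.5, w_cost=0.3, w_rating=0.2):
    # Lower time/cost and higher rating are better; weights are arbitrary.
    return -w_time * s["time"] / 100 - w_cost * s["cost"] + w_rating * s["rating"]

budget_ms = 250
chosen, total_time = {}, 0
for task, options in candidates.items():
    feasible = [s for s in options if total_time + s["time"] <= budget_ms] or options
    best = max(feasible, key=utility)
    chosen[task] = best["name"]
    total_time += best["time"]
print(chosen, "total response time:", total_time)
```

A greedy pass like this is fast but myopic; the heuristics developed in the thesis target exactly the harder case of global (and non-linear) constraints where per-task greediness fails.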

Relevance:

80.00%

Publisher:

Abstract:

Business analysis has developed since the early 1990s as an IS discipline that is concerned with understanding business problems, defining requirements and evaluating relevant solutions. However, this discipline has had limited recognition within the academic community and little research has been conducted into the practices and standards employed by business analysts. This paper reports on a study into business analysis that considered the activities conducted and the outcomes experienced on IS projects. Senior business analysts were interviewed in order to gain insights into the business analyst role and the techniques and approaches applied when conducting this work. The Context, Content, Process, Outcomes framework was adopted as a basis for developing the interview questions. The data collected was analysed using the template analysis technique and the template was based upon this framework. Additional themes concerning aspects of business analysis that may contribute to IS success emerged during data analysis. These included the key business analysis activities and the skills business analysts require to perform these activities. The organisational attitude was also identified as a key factor in enabling the use and contribution of business analysis.

Relevance:

80.00%

Publisher:

Abstract:

When modeling real-world decision-theoretic planning problems in the Markov Decision Process (MDP) framework, it is often impossible to obtain a completely accurate estimate of transition probabilities. For example, natural uncertainty arises in the transition specification due to elicitation of MDP transition models from an expert or estimation from data, or from non-stationary transition distributions arising from insufficient state knowledge. In the interest of obtaining the most robust policy under transition uncertainty, the Markov Decision Process with Imprecise Transition Probabilities (MDP-IP) has been introduced to model such scenarios. Unfortunately, while various solution algorithms exist for MDP-IPs, they often require external calls to optimization routines and can thus be extremely time-consuming in practice. To address this deficiency, we introduce the factored MDP-IP and propose efficient dynamic programming methods to exploit its structure. Noting that the key computational bottleneck in the solution of factored MDP-IPs is the need to repeatedly solve nonlinear constrained optimization problems, we show how to target approximation techniques to drastically reduce the computational overhead of the nonlinear solver while producing bounded, approximately optimal solutions. Our results show up to two orders of magnitude speedup in comparison to traditional "flat" dynamic programming approaches and up to an order of magnitude speedup over the extension of factored MDP approximate value iteration techniques to MDP-IPs, while producing the lowest error of any approximation algorithm evaluated. (C) 2011 Elsevier B.V. All rights reserved.

Relevance:

80.00%

Publisher:

Abstract:

Background. - Persistent impairment in cognitive function has been described in euthymic individuals with bipolar disorder. Collective work indicates that obesity is associated with reduced cognitive function in otherwise healthy individuals. This sub-group post-hoc analysis preliminarily explores and examines the association between overweight/obesity and cognitive function in euthymic individuals with bipolar disorder. Methods. - Euthymic adults with DSM-IV-TR-defined bipolar I or II disorder were enrolled. Subjects included in this post-hoc analysis (n = 67) were divided into two groups (normal weight, body mass index [BMI] of 18.5-24.9 kg/m(2); overweight/obese, BMI >= 25.0 kg/m(2)). Demographic and clinical information were obtained at screening. At baseline, study participants completed a comprehensive cognitive battery to assess premorbid IQ, verbal learning and memory, attention and psychomotor processing speed, executive function, general intellectual abilities, recollection and habit memory, as well as self-perceptions of cognitive failures. Results. - BMI was negatively correlated with attention and psychomotor processing speed as measured by the Digit Symbol Substitution Test (P < 0.01). Overweight and obese bipolar individuals had a significantly lower score on the Verbal Fluency Test when compared to normal weight subjects (P < 0.05). For all other measures of cognitive function, non-significant trends suggesting a negative association with BMI were observed, with the exception of measures of executive function (i.e. Trail Making Test B) and recollection memory (i.e. process-dissociation task). Conclusion. - Notwithstanding the post-hoc methodology and relatively small sample size, the results of this study suggest a possible negative effect of overweight/obesity on cognitive function in euthymic individuals with bipolar disorder. 
Taken together, these data provide the impetus for more rigorous evaluation of the mediational role of overweight/obesity (and other medical co-morbidity) on cognitive function in psychiatric populations. (C) 2011 Elsevier Masson SAS. All rights reserved.

Relevance:

80.00%

Publisher:

Abstract:

Conscious events interact with memory systems in learning, rehearsal and retrieval (Ebbinghaus 1885/1964; Tulving 1985). Here we present hypotheses that arise from the IDA computational model (Franklin, Kelemen and McCauley 1998; Franklin 2001b) of global workspace theory (Baars 1988, 2002). Our primary tool for this exploration is a flexible cognitive cycle employed by the IDA computational model and hypothesized to be a basic element of human cognitive processing. Since cognitive cycles are hypothesized to occur five to ten times a second and include interaction between conscious contents and several of the memory systems, they provide the means for an exceptionally fine-grained analysis of various cognitive tasks. We apply this tool to the small effect size of subliminal learning compared to supraliminal learning, to process dissociation, to implicit learning, to recognition vs. recall, and to the availability heuristic in recall. The IDA model elucidates the role of consciousness in the updating of perceptual memory, transient episodic memory, and procedural memory. In most cases, memory is hypothesized to interact with conscious events for its normal functioning. The methodology of the paper is unusual in that the hypotheses and explanations presented are derived from an empirically based, but broad and qualitative computational model of human cognition.

Relevance:

80.00%

Publisher:

Abstract:

Three experiments are reported that examined the process by which trainees learn decision-making skills during a critical incident training program. Formal theories of category learning were used to identify two processes that may be responsible for the acquisition of decision-making skills: rule learning and exemplar learning. Experiments 1 and 2 used the process dissociation procedure (L. L. Jacoby, 1998) to evaluate the contribution of these processes to performance. The results suggest that trainees used a mixture of rule and exemplar learning, and that these learning processes were influenced by different aspects of training structure and design. The goal of Experiment 3 was to develop training techniques that enable trainees to use a rule adaptively. Trainees were tested on cases that represented exceptions to the rule. Unexpectedly, the results suggest that providing general instruction about the kinds of conditions in which a decision rule does not apply caused trainees to fixate on the specific conditions mentioned and impaired their ability to identify other conditions in which the rule might not apply. The theoretical, methodological, and practical implications of the results are discussed.
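The rule-versus-exemplar contrast from the category-learning theories cited above can be made concrete with a minimal sketch: a single-dimension decision rule against a nearest-stored-case (exemplar) classifier. The stimuli, the rule boundary, and the stored exception are invented illustrations, not materials from the experiments.

```python
# Minimal rule vs. exemplar classifiers on two-feature stimuli. The exemplar
# classifier is a 1-nearest-neighbour over stored training cases; real
# exemplar models (e.g. the GCM) use graded similarity instead.
import math

training = [((0.2, 0.8), "A"), ((0.3, 0.9), "A"),
            ((0.8, 0.2), "B"), ((0.9, 0.1), "B"),
            ((0.25, 0.1), "B")]            # a stored exception the rule misses

def rule_classify(x):
    return "A" if x[0] < 0.5 else "B"      # simple single-dimension rule

def exemplar_classify(x):
    nearest = min(training, key=lambda ex: math.dist(ex[0], x))
    return nearest[1]

probe = (0.22, 0.15)                       # resembles the stored exception
print("rule:", rule_classify(probe), "exemplar:", exemplar_classify(probe))
```

On the probe the two processes disagree (the rule says "A", the exemplar memory retrieves the exception and says "B"), which is the kind of dissociation the training experiments exploit.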

Relevance:

80.00%

Publisher:

Abstract:

The development of an information system in Caribbean public sector organisations is usually seen as a matter of installing hardware and software according to a directive from senior management, without much planning. This causes huge investment in procuring hardware and software without improving overall system performance. Increasingly, Caribbean organisations are looking for assurances on information system performance before making investment decisions not only to satisfy the funding agencies, but also to be competitive in this dynamic and global business world. This study demonstrates an information system planning approach using a process-reengineering framework. Firstly, the stakeholders for the business functions are identified along with their relationships and requirements. Secondly, process reengineering is carried out to develop the system requirements. Accordingly, information technology is selected through detailed system requirement analysis. Thirdly, cost-benefit analysis, identification of critical success factors and risk analysis are carried out to strengthen the selection. The entire methodology has been demonstrated through an information system project in the Barbados drug service, a public sector organisation in the Caribbean.

Relevance:

80.00%

Publisher:

Abstract:

Automatically generating maps of a measured variable of interest can be problematic. In this work we focus on the monitoring network context where observations are collected and reported by a network of sensors, and are then transformed into interpolated maps for use in decision making. Using traditional geostatistical methods, estimating the covariance structure of data collected in an emergency situation can be difficult. Variogram determination, whether by method-of-moment estimators or by maximum likelihood, is very sensitive to extreme values. Even when a monitoring network is in a routine mode of operation, sensors can sporadically malfunction and report extreme values. If this extreme data destabilises the model, causing the covariance structure of the observed data to be incorrectly estimated, the generated maps will be of little value, and the uncertainty estimates in particular will be misleading. Marchant and Lark [2007] propose a REML estimator for the covariance, which is shown to work on small data sets with a manual selection of the damping parameter in the robust likelihood. We show how this can be extended to allow treatment of large data sets together with an automated approach to all parameter estimation. The projected process kriging framework of Ingram et al. [2007] is extended to allow the use of robust likelihood functions, including the two component Gaussian and the Huber function. We show how our algorithm is further refined to reduce the computational complexity while at the same time minimising any loss of information. To show the benefits of this method, we use data collected from radiation monitoring networks across Europe. We compare our results to those obtained from traditional kriging methodologies and include comparisons with Box-Cox transformations of the data. 
We discuss the issue of whether to treat or ignore extreme values, making the distinction between the robust methods, which ignore outliers, and transformation methods, which treat them as part of the (transformed) process. Using a case study based on an extreme radiological event over a large area, we show how radiation data collected from monitoring networks can be analysed automatically and then used to generate reliable maps to inform decision making. We show the limitations of the methods and discuss potential extensions to remedy these.
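The Huber function mentioned above is the standard robust loss: quadratic for small residuals, linear beyond a threshold, so an extreme sensor reading gets bounded rather than quadratic influence. The threshold value and the residuals below are illustrative only; the REML machinery of the paper is not reproduced.

```python
# Huber loss: quadratic inside |r| <= c, linear outside, bounding the
# influence of extreme residuals on the fit. c = 1.345 is a common default.

def huber(r, c=1.345):
    return 0.5 * r * r if abs(r) <= c else c * (abs(r) - 0.5 * c)

residuals = [0.2, -0.7, 1.1, 25.0]          # last one: a malfunctioning sensor
losses = [huber(r) for r in residuals]
print([round(v, 3) for v in losses])
```

Under a squared-error criterion the outlier would contribute 312.5 and dominate the covariance estimate; under the Huber function it contributes about 32.7, which is why such likelihoods keep variogram estimation stable when sensors misreport.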