904 results for Process-dissociation Framework


Relevance:

100.00%

Abstract:

Product lifecycle management (PLM) is innovative in that it defines the product as the central element for aggregating enterprise information and the lifecycle as a new time dimension for information integration and analysis. Because of its potential to shorten innovation lead times and to reduce costs, PLM has attracted considerable attention in both industry and research. However, current PLM implementations at most organisations still do not apply lifecycle management concepts thoroughly. In order to close this realisation gap, this article presents a process-oriented framework to support effective PLM implementation. The framework's central element is a set of lifecycle-oriented business process reference models that link the fundamental concepts, enterprise knowledge and software solutions needed to deploy PLM effectively. (c) 2007 Elsevier B.V. All rights reserved.

Relevance:

100.00%

Abstract:

Recently, within the VISDEM project (funded by EPSRC grant EP/C005848/1), a novel variational approximation framework has been developed for inference in partially observed, continuous space-time diffusion processes. This technical report provides in detail all the derivations of the variational framework from that initial work, to help the reader better understand the framework and its assumptions.
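For orientation, the general shape of such a variational bound can be sketched as follows (a hedged reconstruction of the standard setup, not a substitute for the report's derivations): the latent diffusion dX_t = f(X_t)dt + Σ^{1/2}dW_t is approximated by a diffusion with time-varying linear drift g(x,t) = -A(t)x + b(t), and the free energy combines the expected observation likelihood with the Kullback-Leibler divergence between the two processes.

```latex
% Sketch of the variational free energy for a partially observed diffusion
% (assumed general form; the report gives the exact derivations and notation).
\mathcal{F}(q) = -\sum_{i} \mathbb{E}_{q}\!\left[\ln p(y_i \mid X_{t_i})\right]
  + \tfrac{1}{2}\int_{0}^{T} \mathbb{E}_{q}\!\left[
      \bigl(f(X_t)-g(X_t,t)\bigr)^{\!\top}\Sigma^{-1}\bigl(f(X_t)-g(X_t,t)\bigr)
    \right]\mathrm{d}t
  + \mathrm{KL}\!\left[q(X_0)\,\|\,p(X_0)\right],
\qquad \ln p(Y) \ge -\mathcal{F}(q).
```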

Relevance:

100.00%

Abstract:

Current theoretical thinking about dual processes in recognition relies heavily on the measurement operations embodied within the process dissociation procedure. We critically evaluate the ability of this procedure to support this theoretical enterprise. We show that there are alternative processes that would produce a rough invariance in familiarity (a key prediction of the dual-processing approach) and that the process dissociation procedure does not have the power to differentiate between these alternative possibilities. We also show that attempts to relate parameters estimated by the process dissociation procedure to subjective reports (remember-know judgments) cannot differentiate between alternative dual-processing models and that there are problems with some of the historical evidence and with obtaining converging evidence. Our conclusion is that more specific theories incorporating ideas about representation and process are required.
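For reference, the measurement operations at issue are usually summarized by the standard process-dissociation equations under Jacoby's independence assumption (reproduced here as a reminder of what the estimated parameters are, not as an endorsement of the assumptions the article questions):

```latex
% Inclusion and exclusion performance under the independence assumption,
% with R = recollection and F = familiarity:
I = R + (1 - R)F, \qquad E = (1 - R)F,
\qquad\Rightarrow\qquad
\hat{R} = I - E, \qquad \hat{F} = \frac{E}{\,1 - \hat{R}\,}.
```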

Relevance:

100.00%

Abstract:

Item noise models of recognition assert that interference at retrieval is generated by the words from the study list. Context noise models of recognition assert that interference at retrieval is generated by the contexts in which the test word has appeared. The authors introduce the bind cue decide model of episodic memory, a Bayesian context noise model, and demonstrate how it can account for data from the item noise and dual-processing approaches to recognition memory. From the item noise perspective, list strength and list length effects, the mirror effect for word frequency and concreteness, and the effects of the similarity of other words in a list are considered. From the dual-processing perspective, process dissociation data on the effects of length, temporal separation of lists, strength, and diagnosticity of context are examined. The authors conclude that the context noise approach to recognition is a viable alternative to existing approaches.
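As a rough illustration of what a Bayesian context noise account involves (a simplified, generic sketch, not the model's actual likelihood calculations), the recognition decision can be framed as comparing the odds that the test word occurred in the study context against the odds that it occurred only in prior, extra-experimental contexts:

```latex
% Generic Bayesian recognition rule (illustrative only): respond "old" when
\frac{P(\text{old} \mid \mathbf{d})}{P(\text{new} \mid \mathbf{d})}
  = \frac{P(\mathbf{d} \mid \text{old})}{P(\mathbf{d} \mid \text{new})}
    \cdot \frac{P(\text{old})}{P(\text{new})} > 1,
```

where d stands for the context information retrieved for the test word; in a context noise model, the "new" likelihood is governed by the contexts in which that word has previously appeared rather than by the other words on the study list.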

Relevance:

100.00%

Abstract:

Following study, participants received 2 tests. The 1st was a recognition test; the 2nd was designed to tap recollection. The objective was to examine performance on Test 1 conditional on Test 2 performance. In Experiment 1, contrary to process dissociation assumptions, exclusion errors better predicted subsequent recollection than did inclusion errors. In Experiments 2 and 3, with alternate questions posed on Test 2, words having high estimates of recollection with one question had high estimates of familiarity with the other question. Results supported the following: (a) the 2-test procedure has considerable potential for elucidating the relationship between recollection and familiarity; (b) there is substantial evidence for dependency between such processes when estimates are obtained using the process dissociation and remember-know procedures; and (c) order of information access appears to depend on the question posed to the memory system.
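To make the estimates concrete, here is a hypothetical worked example using the standard process-dissociation equations under the independence assumption (illustrative numbers only, not data from these experiments):

```latex
% Hypothetical rates: inclusion I = .70, exclusion E = .25.
\hat{R} = I - E = .70 - .25 = .45, \qquad
\hat{F} = \frac{E}{1 - \hat{R}} = \frac{.25}{.55} \approx .45.
```

Such estimates presuppose that recollection and familiarity are independent, which is precisely what the conditional two-test analyses reported above call into question.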

Relevance:

100.00%

Abstract:

High school dropout is commonly seen as the result of a long-term process of failure and disengagement. As useful as it is, this view has obscured the heterogeneity of pathways leading to dropout. Research suggests, for instance, that some students leave school not as a result of protracted difficulties but in response to situations that emerge late in their schooling careers, such as health problems or severe peer victimization. Conversely, others with a history of early difficulties persevere when their circumstances improve during high school. Thus, an adequate understanding of why and when students drop out requires a consideration of both long-term vulnerabilities and proximal disruptive events and contingencies. The goal of this review is to integrate long-term and immediate determinants of dropout by proposing a stress process, life course model of dropout. This model is also helpful for understanding how the determinants of dropout vary across socioeconomic conditions and geographical and historical contexts.

Relevance:

90.00%

Abstract:

Innovation is the word of this decade. According to common definitions of innovation, a product or service that has no positive sales impact and no meaningful market share has not been an innovation. The research problem of this master's thesis is to determine what the innovation process for complex new consumer products and services looks like in the new innovation paradigm. The objective is to answer two research questions: 1) What are the critical success factors a company should address when implementing the paradigm change in mass-market consumer business with complex products and services? 2) What process or framework could a firm follow? The research problem is examined from the points of view of one company's innovation creation process, networking, and organisational change management challenges, with a special focus on the perspective of an existing company entering a new business area. An innovation process management framework for complex new consumer products and services in the new innovation paradigm has been created with the support of several existing innovation theories. The new framework includes the critical innovation process elements that companies should take into consideration in their daily activities when implementing innovations in a new business. The case company's location-based business implementation activities are studied through the new innovation process framework. The case study showed how important it is to manage the process, to monitor how the target market and its competition develop during the company's own innovation process, to make decisions at the right time, and to plan and implement organisational change management from the beginning as one activity within the innovation process. In the end, this thesis showed that every company needs to create its own innovation process master plan with milestones and activities. One plan does not fit all, but companies can start their planning from the new innovation process introduced in this thesis.

Relevance:

90.00%

Abstract:

According to much academic research, the development of marketing capabilities can enhance organizational performance. Downstream marketing capabilities, in particular, play an important role in accomplishing organizational goals. The downstream marketing capabilities identified in this research are marketing communication, selling, marketing implementation, and market information management. These four capabilities correspond to the following abilities: first, the ability to manage customers' opinions of the value offered by the organization; second, the ability to obtain orders from new and established customers; third, the ability to align and translate the marketing strategy into an operating action plan along with the deployment of organizational resources; and fourth, the continuous process of gathering and managing information about the markets. The literature review of this research also sheds light on the elements that compose the downstream marketing capabilities. Specifically, this research examined the downstream processes and the information required to control these processes, based on the American Productivity and Quality Center's Process Classification Framework. Furthermore, the literature review examined some of the technological tools used in marketing processes, as well as some managerial implications regarding the management of downstream marketing employees. Along with the investigation of downstream marketing capabilities, the literature review investigated the utilization and benefits of the Component Business Model and the Process Classification Framework, as defined by the organizations that developed them. Beyond this initial study, the research presents how the examined organization uses the two frameworks together by cross-referencing them. Finally, the research presents the optimal deployment of the collected downstream capability elements in the current organizational structure. The optimal deployment is grounded in information collected from the literature review as well as internal documentation provided by the examined organization. By comparing the optimal deployment with the organization's current condition, the research identifies points for improvement, as well as projects currently in progress within the organization that will eventually address these shortcomings.

Relevance:

90.00%

Abstract:

The construction sector is under growing pressure to increase productivity and improve quality, most notably in reports by Latham (1994, Constructing the Team, HMSO, London) and Egan (1998, Rethinking Construction, HMSO, London). A major problem for construction companies is the lack of project predictability. One method of increasing predictability and delivering increased customer value is through the systematic management of construction processes. However, the industry has no methodological mechanism to assess process capability and prioritise process improvements. Standardized Process Improvement for Construction Enterprises (SPICE) is a research project that is attempting to develop a stepwise process improvement framework for the construction industry, utilizing experience from the software industry, and in particular the Capability Maturity Model (CMM), which has resulted in significant productivity improvements in the software industry. This paper introduces SPICE concepts and presents the results from two case studies conducted on design and build projects. These studies have provided further insight into the relevance and accuracy of the framework, as well as its value for the construction sector.

Relevance:

90.00%

Abstract:

Graduate Program in Computer Science - IBILCE

Relevance:

90.00%

Abstract:

Embedded systems are becoming ever more common and complex, so finding safe, effective and inexpensive software development processes aimed specifically at this class of systems is more necessary than ever. Unlike the situation until recently, technological advances in microprocessors now make it possible to build hardware with more than enough capacity to run several software systems on a single machine. Moreover, there are embedded systems with safety requirements on whose correct operation many lives and/or large economic investments depend. These software systems are designed and implemented according to very strict and demanding software development standards, and in some cases software certification may also be necessary. For such cases, mixed-criticality systems can be a very valuable alternative. In this class of systems, applications with different criticality levels run on the same computer. However, it is often necessary to certify the entire system at the criticality level of the most critical application, which makes costs soar. Virtualization has been put forward as a very promising technology for containing these costs. It allows a set of virtual machines, or partitions, to run the applications with very high levels of both temporal and spatial isolation, which in turn allows each partition to be certified independently. Developing mixed-criticality partitioned systems requires updating traditional software development models, since these cover neither the new activities nor the new roles required in the development of such systems. For example, the system integrator must define the partitions, and the application developer must take into account the characteristics of the partition in which the application will run. Traditionally, the V-model has been particularly relevant in embedded systems development, and it has therefore been adapted to cover scenarios such as the parallel development of applications or the incorporation of a new partition into an existing system. The objective of this doctoral thesis is to improve the current technology for developing mixed-criticality partitioned systems. To this end, an environment aimed specifically at facilitating and improving the development processes for this class of systems has been designed and implemented. In particular, an algorithm has been created that generates the system partitioning automatically. The proposed development environment integrates all the activities needed to develop a partitioned system, including the new roles and activities mentioned above. In addition, the design of the development environment is based on Model-Driven Engineering, which promotes the use of models as fundamental elements of the development process. The environment thus provides the tools needed to model and partition the system, as well as to validate the results and generate the artifacts required to compile, build and deploy it. Furthermore, extensibility and integration with validation tools have been key factors in the design of the development environment. In particular, new non-functional requirements and the generation of new artifacts, such as documentation or different programming languages, can be incorporated into the development environment. A key part of the development environment is the partitioning algorithm. This algorithm has been designed to be independent of the applications' requirements and to allow the system integrator to implement new system requirements. To achieve this independence, partitioning constraints have been defined, and the algorithm guarantees that these constraints will be satisfied in the partitioned system resulting from its execution. The partitioning constraints have been designed with enough expressive power that a small set of them can express most of the common non-functional requirements. Constraints can be defined manually by the system integrator or generated automatically by a tool from an application's functional and non-functional requirements. The partitioning algorithm takes the system models and the partitioning constraints as inputs. As a result of its execution, it generates a deployment model defining the partitions needed to partition the system; each partition in turn defines which applications must run in it and the resources it needs to run correctly. The partitioning problem and the partitioning constraints are modelled mathematically as coloured graphs, in which a proper vertex colouring represents a correct system partitioning. The algorithm has also been designed so that, if necessary, alternative partitionings to the one initially proposed can be obtained. The development environment, including the partitioning algorithm, has been successfully tested in two industrial use cases: the UPMSat-2 satellite and a demonstrator of the control system of a wind turbine. In addition, the algorithm has been validated by running numerous synthetic scenarios, including some very complex ones with more than 500 applications.

ABSTRACT The importance of embedded software is growing as it is required for a large number of systems. Devising cheap, efficient and reliable development processes for embedded systems is thus a notable challenge nowadays. Computer processing power is continuously increasing, and as a result it is currently possible to integrate complex systems in a single processor, which was not feasible a few years ago. Embedded systems may have safety-critical requirements, and their failure may result in personal injury or substantial economic loss. The development of these systems requires stringent development processes that are usually defined by suitable standards. In some cases their certification is also necessary. This scenario fosters the use of mixed-criticality systems, in which applications of different criticality levels must coexist in a single system. In these cases, it is usually necessary to certify the whole system, including non-critical applications, which is costly. Virtualization emerges as an enabling technology for dealing with this problem. The system is structured as a set of partitions, or virtual machines, that can be executed with temporal and spatial isolation. In this way, applications can be developed and certified independently.
The development of MCPS (Mixed-Criticality Partitioned Systems) requires additional roles and activities that traditional systems do not require. The system integrator has to define system partitions, and application development has to consider the characteristics of the partition to which the application is allocated. In addition, traditional software process models have to be adapted to this scenario. The V-model is commonly used in embedded systems development; it can be adapted to the development of MCPS by enabling the parallel development of applications or adding an additional partition to an existing system. The objective of this PhD is to improve the available technology for MCPS development by providing a framework tailored to the development of this type of system and by defining a flexible and efficient algorithm for automatically generating system partitionings. The goal of the framework is to integrate all the activities required for developing MCPS and to support the different roles involved in this process. The framework is based on MDE (Model-Driven Engineering), which emphasizes the use of models in the development process. The framework provides basic means for modeling the system, generating system partitions, validating the system and generating final artifacts. The framework has been designed to facilitate its extension and the integration of external validation tools. In particular, it can be extended by adding support for additional non-functional requirements and for final artifacts, such as new programming languages or additional documentation. The framework includes a novel partitioning algorithm. It has been designed to be independent of the types of application requirements and to enable the system integrator to tailor the partitioning to the specific requirements of a system. This independence is achieved by defining partitioning constraints that must be met by the resulting partitioning. They have sufficient expressive capacity to state the most common constraints and can be defined manually by the system integrator or generated automatically based on the functional and non-functional requirements of the applications. The partitioning algorithm uses system models and partitioning constraints as its inputs. It generates a deployment model that is composed of a set of partitions. Each partition is in turn composed of a set of allocated applications and assigned resources. The partitioning problem, including applications and constraints, is modeled as a colored graph. A valid partitioning is a proper vertex coloring. A specially designed algorithm generates this coloring and is able to provide alternative partitionings if required. The framework, including the partitioning algorithm, has been successfully used in the development of two industrial use cases: the UPMSat-2 satellite and the control system of a wind-power turbine. The partitioning algorithm has been successfully validated by using a large number of synthetic loads, including complex scenarios with more than 500 applications.
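Since the thesis maps partitioning to proper vertex colouring, a minimal illustrative sketch of that idea is given below. This is a hypothetical greedy colouring over a "separation constraint" graph, not the thesis's actual algorithm; the application names and constraints are invented for illustration.

```python
# Illustrative sketch: applications are vertices; an edge between two
# applications means a constraint forbids them from sharing a partition
# (e.g. different criticality levels). A proper vertex colouring then
# corresponds to a valid partitioning: one partition per colour.
# This greedy colouring is NOT the thesis's algorithm, just the idea.

from typing import Dict, List, Set


def partition(apps: List[str], separated: Set[frozenset]) -> Dict[str, int]:
    """Assign each application a partition id so that no 'separated' pair shares one."""
    adjacency = {a: set() for a in apps}
    for pair in separated:
        a, b = tuple(pair)
        adjacency[a].add(b)
        adjacency[b].add(a)

    colouring: Dict[str, int] = {}
    # Colour the more constrained (higher-degree) applications first.
    for app in sorted(apps, key=lambda a: -len(adjacency[a])):
        used = {colouring[n] for n in adjacency[app] if n in colouring}
        colour = 0
        while colour in used:
            colour += 1
        colouring[app] = colour
    return colouring


if __name__ == "__main__":
    # Hypothetical applications and separation constraints.
    apps = ["attitude_control", "telemetry", "payload", "logging"]
    separated = {frozenset({"attitude_control", "payload"}),
                 frozenset({"attitude_control", "logging"}),
                 frozenset({"payload", "telemetry"})}
    print(partition(apps, separated))
    # {'attitude_control': 0, 'payload': 1, 'telemetry': 0, 'logging': 1}
```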

Relevance:

90.00%

Abstract:

Two studies investigated the context deletion effect, the attenuation of priming in implicit memory tests of words when words have been studied in text rather than in isolation. In Experiment 1, stem completion for single words was primed to a greater extent by words studied alone than in sentence contexts, and a higher proportion of completions from studied words was produced under direct instructions (cued recall) than under indirect instructions (produce the first completion that comes to mind). The effect of a sentence context was eliminated when participants were instructed to attend to the target word during the imagery generation task used in the study phase. In Experiment 2, the effect of a sentence context at study was reduced when the target word was presented in distinctive format within the sentence, and the study task (grammatical judgment) was directed at a word other than the target. The results implicate conceptual and perceptual processes that distinguish a word from its context in priming in word stem completion.

Relevance:

90.00%

Abstract:

Site selection is a key activity for quarry expansion to support cement production, and is governed by factors such as resource availability, logistics, costs, and socio-economic-environmental factors. Adequate consideration of all the factors facilitates both industrial productivity and sustainable economic growth. This study illustrates the site selection process that was undertaken for the expansion of limestone quarry operations to support cement production in Barbados. First, alternate sites with adequate resources to support a 25-year development horizon were identified. Second, technical and socio-economic-environmental factors were identified. Third, a database was developed for each site with respect to each factor. Fourth, a hierarchical model in the analytic hierarchy process (AHP) framework was developed. Fifth, the relative ranking of the alternate sites was derived through pairwise comparison at all levels and subsequent synthesis of the results across the hierarchy using computer software (Expert Choice). The study reveals that an integrated framework using the AHP can help select a site for the quarry expansion project in Barbados.
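For readers unfamiliar with AHP, the sketch below shows the standard priority and consistency calculations for a single pairwise comparison matrix. It is a generic illustration with made-up judgements, not the study's actual comparisons or the Expert Choice implementation.

```python
# Generic AHP step: derive priority weights from a pairwise comparison
# matrix via its principal eigenvector and check judgement consistency.
# The matrix values below are hypothetical, not the Barbados study's data.

import numpy as np

RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}  # Saaty's RI


def ahp_weights(matrix: np.ndarray) -> tuple[np.ndarray, float]:
    """Return (priority weights, consistency ratio) for a reciprocal comparison matrix."""
    eigvals, eigvecs = np.linalg.eig(matrix)
    k = np.argmax(eigvals.real)                      # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                                  # normalised priorities
    n = matrix.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)             # consistency index
    cr = ci / RANDOM_INDEX[n] if RANDOM_INDEX[n] else 0.0
    return w, cr


if __name__ == "__main__":
    # Hypothetical comparison of three candidate sites on one criterion.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3., 1.0, 2.0],
                  [1/5., 1/2., 1.0]])
    weights, cr = ahp_weights(A)
    print(weights.round(3), round(cr, 3))   # CR < 0.10 indicates acceptable consistency
```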

Relevance:

90.00%

Abstract:

Large monitoring networks are becoming increasingly common and can generate large datasets, from thousands to millions of observations in size, often with high temporal resolution. Processing large datasets using traditional geostatistical methods is prohibitively slow, and in real-world applications different types of sensor can be found across a monitoring network. Heterogeneities in the error characteristics of different sensors, both in terms of distribution and magnitude, present problems for generating coherent maps. An assumption in traditional geostatistics is that observations are made directly of the underlying process being studied and that the observations are contaminated with Gaussian errors. Under this assumption, sub-optimal predictions will be obtained if the error characteristics of the sensor are effectively non-Gaussian. One method, model-based geostatistics, assumes that a Gaussian process prior is imposed over the (latent) process being studied and that the sensor model forms part of the likelihood term. One problem with this type of approach is that the corresponding posterior distribution will be non-Gaussian and computationally demanding, as Monte Carlo methods have to be used. An extension of a sequential, approximate Bayesian inference method enables observations with arbitrary likelihoods to be treated within a projected process kriging framework, which is less computationally intensive. The approach is illustrated using a simulated dataset with a range of sensor models and error characteristics.
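To give a feel for the projected process (sparse) kriging baseline that such an approach builds on, here is a minimal numpy sketch with a Gaussian noise model. The sensor models, the sequential approximate-Bayesian update for non-Gaussian likelihoods, and all names below are not from the paper; they are assumptions made purely for illustration.

```python
# Minimal projected-process (sparse) GP/kriging predictive mean with Gaussian noise.
# The paper's contribution (handling arbitrary sensor likelihoods via a
# sequential approximate Bayesian update) is NOT implemented here; this is
# only the standard sparse baseline that such a framework extends.

import numpy as np


def rbf(a: np.ndarray, b: np.ndarray, ell: float = 1.0, var: float = 1.0) -> np.ndarray:
    """Squared-exponential covariance between 1-D input vectors a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / ell**2)


def projected_process_mean(x, y, z, x_star, noise_var=0.1):
    """Predictive mean at x_star using inducing (active) points z."""
    K_mm = rbf(z, z)
    K_mn = rbf(z, x)
    k_sm = rbf(x_star, z)
    # mean = k_*m (sigma^2 K_mm + K_mn K_nm)^{-1} K_mn y
    A = noise_var * K_mm + K_mn @ K_mn.T
    return k_sm @ np.linalg.solve(A, K_mn @ y)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 200)                        # simulated sensor locations
    y = np.sin(x) + 0.3 * rng.standard_normal(x.size)  # noisy observations
    z = np.linspace(0, 10, 15)                         # inducing points
    x_star = np.array([2.5, 5.0, 7.5])
    print(projected_process_mean(x, y, z, x_star).round(2))
```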

Relevance:

90.00%

Abstract:

Direct quantile regression involves estimating a given quantile of a response variable as a function of input variables. We present a new framework for direct quantile regression in which a Gaussian process model is learned by minimising the expected tilted loss function. The integration required in learning is not analytically tractable, so to speed up learning we employ the Expectation Propagation algorithm. We describe how this work relates to other quantile regression methods and apply the method to both synthetic and real data sets. The method is shown to be competitive with state-of-the-art methods whilst allowing the full Gaussian process probabilistic framework to be leveraged.
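For reference, the tilted (pinball) loss for the τ-quantile has the standard form below; the expectation referred to in the abstract is taken over the Gaussian process predictive distribution (shown here as a generic reminder, not in the paper's exact notation).

```latex
% Tilted (pinball) loss for the tau-quantile, with residual r = y - q(x):
\rho_\tau(r) =
\begin{cases}
\tau\, r, & r \ge 0,\\
(\tau - 1)\, r, & r < 0,
\end{cases}
\qquad
\text{so learning minimises } \; \mathbb{E}\!\left[\rho_\tau\bigl(y - q(x)\bigr)\right].
```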