941 results for cost-informed process execution


Relevance:

30.00%

Publisher:

Abstract:

A recently developed biomass fuel pellet, the Q' Pellet, offers significant improvements over conventional white pellets, with characteristics comparable to those of coal. The Q' Pellet was initially created at bench scale using a proprietary die and punch design, in which the biomass was torrefied in-situ and then compressed. To bring the benefits of the Q' Pellet to a commercial level, it must be capable of being produced in a continuous process at a competitive cost. A prototype machine was previously constructed in a first effort to assess continuous processing of the Q' Pellet. The prototype torrefied biomass in a separate, ex-situ reactor and transported it into a rotary compression stage. Upon evaluation, parts of the prototype were found to be unsuccessful and required a redesign of the material transport method as well as the compression mechanism. A process was developed in which material was torrefied ex-situ and extruded in a pre-compression stage. The extruded biomass overcame multiple handling issues that had been experienced with un-densified biomass, facilitating efficient material transport. Biomass was extruded directly into a novel redesigned pelletizing die, which incorporated a removable cap, an ejection pin and a die spring to accommodate a repeatable continuous process. Although after several uses the die required manual intervention due to minor design and manufacturing quality limitations, the system clearly demonstrated the capability of producing the Q' Pellet in a continuous process. Q' Pellets produced by the pre-compression method and pelletized in the redesigned die had an average dry-basis gross calorific value of 22.04 MJ/kg, a pellet durability index of 99.86%, and dried to 6.2% of their initial mass following 24 hours submerged in water. This compares well with literature results of 21.29 MJ/kg, 100% pellet durability index and <5% mass increase in a water submersion test. These results indicate that the methods developed herein are capable of producing Q' Pellets in a continuous process with fuel properties competitive with coal.

Relevance:

30.00%

Publisher:

Abstract:

Background
Increasing physical activity in the workplace can provide employees with physical and mental health benefits, and employers with economic benefits through reduced absenteeism and increased productivity. The workplace is an opportune setting to encourage habitual activity. However, there is limited evidence on effective behaviour change interventions that lead to maintained physical activity. This study aims to address this gap and help build the necessary evidence base for effective, and cost-effective, workplace interventions.

Methods/design
This cluster randomised control trial will recruit 776 office-based employees from public sector organisations in Belfast and Lisburn city centres, Northern Ireland. Participants will be randomly allocated by cluster to either the Intervention Group or Control Group (waiting list control). The 6-month intervention consists of rewards (retail vouchers, based on similar principles to high street loyalty cards), feedback and other evidence-based behaviour change techniques. Sensors situated in the vicinity of participating workplaces will promote and monitor minutes of physical activity undertaken by participants. Both groups will complete all outcome measures. The primary outcome is steps per day recorded using a pedometer (Yamax Digiwalker CW-701) for 7 consecutive days at baseline, 6, 12 and 18 months. Secondary outcomes include health, mental wellbeing, quality of life, work absenteeism and presenteeism, and use of healthcare resources. Process measures will assess intervention “dose”, website usage, and intervention fidelity. An economic evaluation will be conducted from the National Health Service, employer and retailer perspective using both a cost-utility and cost-effectiveness framework. The inclusion of a discrete choice experiment will further generate values for a cost-benefit analysis. Participant focus groups will explore who the intervention worked for and why, and interviews with retailers will elucidate their views on the sustainability of a public health focused loyalty card scheme.

Discussion
The study is designed to maximise the potential for roll-out in similar settings, by engaging the public sector and business community in designing and delivering the intervention. We have developed a sustainable business model using a ‘points’ based loyalty platform, whereby local businesses ‘sponsor’ the incentive (retail vouchers) in return for increased footfall to their business.

Relevance:

30.00%

Publisher:

Abstract:

Supply Chain Simulation (SCS) is applied to acquire information to support outsourcing decisions, but obtaining enough detail in key parameters can often be a barrier to making well-informed decisions.
One aspect of SCS that has remained relatively unexplored is the impact of inaccurate data on delays within the SC. This study examines the impact of the magnitude and variability of process cycle time on typical performance indicators in an SC context.
System cycle time, WIP levels and throughput are more sensitive to the magnitude of deterministic deviations in process cycle time than to variable deviations. Manufacturing costs are not very sensitive to these deviations.
Future opportunities include investigating the impact of process failures or product defects, including logistics and transportation between SC members, and using alternative costing methodologies.
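To make the kind of sensitivity experiment described above concrete, the sketch below (in Python) simulates a single-stage supply chain in which orders arrive at a fixed interval and the process cycle time carries either a deterministic bias or random variability, then compares throughput, flow time and WIP (via Little's law). This is purely an illustrative toy model, not the simulation used in the study; all parameter names and values are hypothetical.

```python
import random
import statistics

def simulate(nominal_ct=10.0, bias=0.0, sigma=0.0,
             arrival_interval=12.0, n_orders=20000, seed=1):
    """Single-stage line: one order every `arrival_interval` time units,
    processed FIFO by one resource. The realised cycle time is
    nominal_ct + bias (deterministic deviation) + N(0, sigma) (variability)."""
    rng = random.Random(seed)
    prev_done = 0.0
    flow_times = []
    for i in range(n_orders):
        arrival = i * arrival_interval
        cycle = max(0.0, nominal_ct + bias + rng.gauss(0.0, sigma))
        start = max(arrival, prev_done)          # wait if the resource is busy
        prev_done = start + cycle
        flow_times.append(prev_done - arrival)   # time an order spends in the system
    throughput = n_orders / prev_done            # orders completed per time unit
    avg_flow = statistics.mean(flow_times)
    avg_wip = throughput * avg_flow              # Little's law: L = lambda * W
    return throughput, avg_flow, avg_wip

scenarios = {
    "baseline":                simulate(),
    "deterministic deviation": simulate(bias=1.5),   # +15% on every order
    "variable deviation":      simulate(sigma=1.5),  # same magnitude, but random
}
for name, (tp, flow, wip) in scenarios.items():
    print(f"{name:24s} throughput={tp:.4f}  flow time={flow:.2f}  WIP={wip:.2f}")
```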

Relevance:

30.00%

Publisher:

Abstract:

Temporal estimation in the range of seconds to a few minutes requires attentional resources for the accumulation of temporal information during the interval to be estimated (Brown, 2006; Buhusi & Meck, 2009; Zakay & Block, 2004). This is demonstrated in the dual-task paradigm, in which performing a concurrent task while estimating an interval leads to an interference effect, that is, a distortion of perceived duration resulting in temporal productions that are longer and more variable than when the interval is estimated alone (see Brown, 1997; 2010). An interference effect is also observed when an interruption is expected during the interval to be estimated, the lengthening being proportional to the duration of the wait for the interruption (Fortin & Massé, 2000). This effect led to the hypothesis that production with an interruption is underpinned by an attention-sharing mechanism similar to that of the dual task (Fortin, 2003). To investigate this hypothesis, two empirical studies were carried out in experimental contexts associated, respectively, with an increase and a decrease in the interference effect: ageing (Chapter II) and cognitive training (Chapter III). In Chapter II, the production-with-interruption task is studied in young and older participants using functional near-infrared spectroscopy (fNIRS). The results show that expecting the interruption is associated with behavioural and functional costs similar to those of the dual task. At the behavioural level, a lengthening of productions proportional to the duration of the wait for the interruption is observed in all participants, but this effect is more pronounced in older than in young participants. This result is consistent with observations made in the dual-task paradigm (see Verhaegen, 2011 for a review). At the functional level, production with and without interruption is associated with activation of the right prefrontal cortex and of dorsolateral prefrontal regions known for their role in explicit (interval production) and implicit (preparatory processes) temporal estimation. In addition, expecting the interruption is associated with increased prefrontal cortical activation in both hemispheres in all participants, including the ventrolateral prefrontal cortex associated with attentional control in the dual task. Finally, the results show that older participants are characterised by bilateral cortical activation during production both without and with interruption. Within the framework of theories of cognitive ageing (Park & Reuter-Lorenz, 2009), this suggests that age is associated with inefficient recruitment of attentional resources for interval production, which hinders the recruitment of additional resources to cope with the demands related to expecting the interruption. In Chapter III, the production-with-interruption task is studied by comparing the performance of participants assigned to one of two conditions of extensive practice (five successive sessions), either the dual task or production with interruption. Pre- and post-test sessions are also carried out to test transfer between conditions. The results show an interference effect and an effect of interference duration both in production with a dual task and in production with interruption. These effects are, however, more pronounced in production with interruption and tend to increase over sessions, which is not observed in the dual task. This can be explained by the influence of preparatory processes during the pre-interruption period and during the interruption. Finally, the results do not reveal substantial transfer effects between conditions, since the effects of practice mainly concern temporal preparation, a process specific to production with interruption. Through the convergence afforded by using a single paradigm with distinct methodologies, this work deepens our knowledge of the attentional mechanisms associated with temporal estimation and, more specifically, with production with interruption. The results support the hypothesis of attention sharing induced by expecting the interruption. Resources would be shared between explicit and implicit temporal estimation processes, an important distinction recently put forward in time estimation research (Coull, Davranche, Nazarian & Vidal, 2013). The involvement of processes that depend on shared attentional resources for processing temporal information can account for the robust and systematic interference effect observed in the production-with-interruption task.

Relevance:

30.00%

Publisher:

Abstract:

This study used a phenomenological research design to determine the difficulties faced by pre-service science teachers in the science-based entrepreneur project development process. Qualitative data were obtained through interviews conducted with ten pre-service science teachers. The data were analysed using an inductive thematic analysis. The results indicated that pre-service science teachers have most difficulty ‘making decisions on one of the innovative ideas’ and ‘making predictions about unexpected situations’. They also have difficulties ‘calculating the cost as a result of design or work analysis’, ‘identifying if the idea already existed (similarity analysis)’ and ‘making decisions on the required materials, tools, services’. These results show the need for pre-service science teachers to communicate with other institutions and organisations.

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-08

Relevance:

30.00%

Publisher:

Abstract:

Therapists' process notes - written descriptions of a session produced shortly afterwards from memory - hold a significant role in child and adolescent psychoanalytic psychotherapy. They are central in training, in supervision, and in developing one's understanding through self-supervision and forms of psychotherapy research. This thesis examines such process notes through a comparison with audio recordings of the same sessions. In so doing, it aims to generate theory that might illuminate the causes of significantly patterned discrepancies between the notes and recordings, in order to understand more about the processes at work in psychoanalytic psychotherapy and to explore the nature of process notes, their values and limitations. The literature searches conducted revealed limited relevant studies. All identified studies that compare process notes with recordings of sessions seek to quantify the differences between the two forms of recording. Unlike these, this thesis explores the meaning of the differences between process notes and recordings through qualitative data analysis. Using psychoanalytically informed grounded theory, nine sets of process notes and recordings from three different psychoanalytic psychotherapists are analysed in total. The analysis identifies eight core categories of findings. Initial theories are developed from these categories, most significantly concerning the role and influence of a 'core transference dynamic' between therapist and patient. Further theory is developed on the nature and function of process notes as a means for the therapist's conscious and unconscious processing of the session, as well as on the nature of the influence of the relationships – both internal and external – within which they are written. In the light of the findings, a proposal is made for a new approach to learning about the patient and clinical work, 'the comparison method' (supervision involving a comparison of process notes and recordings), and, in particular, for its inclusion within the training of psychoanalytic psychotherapists. Further recommendations for research are also made.

Relevance:

30.00%

Publisher:

Abstract:

This chapter examines community media projects in Scotland as social processes that nurture knowledge through participation in production. A visual and media anthropology framework (Ginsburg, 2005) with an emphasis on the social context of media production informs the analysis of community media. Drawing on community media projects in the Govan area of Glasgow and the Isle of Bute, the techniques of production foreground “the relational aspects of filmmaking” (Grimshaw and Ravetz, 2005: 7) and act as a catalyst for knowledge and networks of relations embedded in time and place. Community media is defined here as a creative social process, characterised by an approach to production that is multi-authored, collaborative and informed by the lives of participants, and which recognises the relevance of networks of relations to that practice (Caines, 2007: 2). As a networked process, community media production is recognised as existing in collaboration between a director or producer, such as myself, and organisations, institutions and participants, who are connected through a range of identities, practices and place. These relations born of the production process reflect a complex area of practice and participation that brings together “parallel and overlapping public spheres” (Meadows et al., 2002: 3). This relates to broader concerns with networks (Carpentier, Servaes and Lie, 2003; Rodríguez, 2001), both revealed during the process of production and enhanced by it, and how they can be described with reference to the knowledge practice of community media.

Relevance:

30.00%

Publisher:

Abstract:

In today’s big data world, data is being produced in massive volumes, at great velocity and from a variety of different sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (Internet of Things), social networks, communication networks and many others. Interactive querying and large-scale analytics are increasingly being used to derive value out of this big data. A large portion of this data is being stored and processed in the Cloud due to the several advantages provided by the Cloud, such as scalability, elasticity, availability, low cost of ownership and the overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully. I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has traditionally been used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing data size (progressive samples) for exploratory querying. This provides the data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! enables the provision of early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud. The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, etc. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly accessing the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them into distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
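As a hedged illustration of the sampling-based progressive analytics idea described above (not the actual NOW! implementation or its progress semantics), the Python sketch below computes progressively refined estimates of an aggregate over growing random samples, reporting an approximate confidence interval at each step so that meaningful early results are available long before the full data set is scanned. All names and parameters are hypothetical.

```python
import math
import random

def progressive_mean(values, fractions=(0.01, 0.05, 0.25, 1.0), seed=7):
    """Yield progressively refined estimates of mean(values): each step uses a
    larger random sample and reports an approximate 95% confidence interval."""
    rng = random.Random(seed)
    shuffled = list(values)
    rng.shuffle(shuffled)            # a prefix of the shuffle is a random sample
    for frac in fractions:
        n = max(2, int(len(shuffled) * frac))
        sample = shuffled[:n]
        mean = sum(sample) / n
        var = sum((x - mean) ** 2 for x in sample) / (n - 1)
        half_width = 1.96 * math.sqrt(var / n)   # normal approximation
        yield frac, mean, half_width

# Example: estimate the average order value of one million synthetic records.
gen = random.Random(1)
data = [gen.gammavariate(2.0, 50.0) for _ in range(1_000_000)]
for frac, mean, hw in progressive_mean(data):
    print(f"sample fraction {frac:>5.0%}: mean ~ {mean:.2f} +/- {hw:.2f}")
```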

Relevance:

30.00%

Publisher:

Abstract:

Geosynthetic-reinforced soil structures are normally built with granular soils that have good physical and mechanical properties. Using only this type of soil can lead to a sometimes unsustainable increase in the cost of building such structures and to an increase in their environmental impact. As a result, reinforced soil structures lose their competitive advantage over other types of structures (concrete walls, gravity walls, gabion walls, etc.). To solve this problem, other types of soils (local, fine-grained soils, with poorer physical and mechanical properties but nevertheless cheaper) can be used to build this type of structure. In general terms, this study aimed to contribute to increasing knowledge about the use of fine-grained soils for the construction of reinforced soil structures (walls and slopes). To this end, the differences in the mechanical behaviour of the composite materials (reinforced granular soil versus reinforced fine-grained soil) and of the reinforced soil structures built with the two types of soil were evaluated. The objectives of this study were therefore to assess: the influence of several parameters on the mechanical properties and load-bearing capacity of geosynthetic-reinforced soils; the influence of several parameters on the design of reinforced soil structures; and the behaviour of the designed structures (including overall stability and the influence of the construction process) using a numerical tool (PLAXIS). To meet these objectives, experimental laboratory analyses (analysis of the behaviour of the reinforced soil through triaxial and California Bearing Ratio tests) and numerical analyses (design of reinforced soil structures; numerical modelling of their behaviour using a commercial finite element tool) were carried out. The experimental results showed that the mechanical behaviour and load-bearing capacity of the soil were improved by the inclusion of the geosynthetic layers. This effect varied with the different parameters analysed but was, in general, more important in the fine-grained soil (the soil with poorer mechanical properties). The numerical analyses showed that the fine-grained soil structures required a higher reinforcement density to be stable. In addition, the fine-grained soil structures were more deformable and the effect of their construction process was more important (especially for saturated fine-grained soil structures).

Relevance:

30.00%

Publisher:

Abstract:

Symbolic execution is a powerful program analysis technique, but it is very challenging to apply to programs built using event-driven frameworks, such as Android. The main reason is that the framework code itself is too complex to symbolically execute. The standard solution is to manually create a framework model that is simpler and more amenable to symbolic execution. However, developing and maintaining such a model by hand is difficult and error-prone. We claim that we can leverage program synthesis to introduce a high degree of automation to the process of framework modeling. To support this thesis, we present three pieces of work. First, we introduced SymDroid, a symbolic executor for Android. While Android apps are written in Java, they are compiled to the Dalvik bytecode format. Instead of analyzing an app’s Java source, which may not be available, or decompiling from Dalvik back to Java, which requires significant engineering effort and introduces yet another source of potential bugs in an analysis, SymDroid works directly on Dalvik bytecode. Second, we introduced Pasket, a new system that takes a first step toward automatically generating Java framework models to support symbolic execution. Pasket takes as input the framework API and tutorial programs that exercise the framework. From these artifacts and Pasket's internal knowledge of design patterns, Pasket synthesizes an executable framework model by instantiating design patterns, such that the behavior of a synthesized model on the tutorial programs matches that of the original framework. Lastly, in order to scale program synthesis to framework models, we devised adaptive concretization, a novel program synthesis algorithm that combines the best of the two major synthesis strategies: symbolic search, i.e., using SAT or SMT solvers, and explicit search, e.g., stochastic enumeration of possible solutions. Adaptive concretization parallelizes multiple sub-synthesis problems by partially concretizing highly influential unknowns in the original synthesis problem. Thanks to adaptive concretization, Pasket can generate a large-scale model, e.g., thousands of lines of code. In addition, we have used an Android model synthesized by Pasket and found that the model is sufficient to allow SymDroid to execute a range of apps.
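The following toy Python sketch conveys the flavour of adaptive concretization as summarised above; it is not the Sketch/Pasket implementation. One highly influential unknown is concretized to each of its candidate values, and the resulting smaller sub-synthesis problems are solved in parallel by explicit enumeration over the remaining unknown. The synthesis problem itself (fitting a tiny linear expression to input/output examples) and all names are invented for illustration.

```python
from concurrent.futures import ProcessPoolExecutor

EXAMPLES = [(0, 3), (1, 5), (4, 11)]      # (input, expected output) pairs
A_DOMAIN = range(-10, 11)                 # highly influential unknown: slope a
B_DOMAIN = range(-100, 101)               # remaining unknown: intercept b

def solve_subproblem(a):
    """Sub-synthesis problem with `a` concretized: explicit search over b."""
    for b in B_DOMAIN:
        if all(a * x + b == y for x, y in EXAMPLES):
            return (a, b)
    return None

def adaptive_concretization_toy():
    """Concretize the influential unknown and solve the sub-problems in
    parallel; the first sub-problem that succeeds yields a complete candidate."""
    with ProcessPoolExecutor() as pool:
        for result in pool.map(solve_subproblem, A_DOMAIN):
            if result is not None:
                return result
    return None

if __name__ == "__main__":
    print(adaptive_concretization_toy())  # expected: (2, 3), i.e. f(x) = 2*x + 3
```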

Relevance:

30.00%

Publisher:

Abstract:

The aim of this thesis was to identify opportunities to improve manufacturing cost savings and competitiveness for the compact KONE Renova Slim elevator door. Compact slim doors are designed especially for EMEA markets. The EMEA market area is characterized by highly competitive pricing and lead times, which manifest as pressure to decrease the manufacturing costs and lead times of the compact elevator door. The new elevator safety code EN 81-20, coming into force during spring 2016, will also have a negative impact on cost and competitiveness development, making the situation more acute. As a sheet metal product, the KONE Renova Slim is highly variable. The manufacturing methods utilized in production are common and robust. Due to the low volumes, high variability and tight lead times, the manufacturing of the doors faces difficulties. Manufacturing of the doors is outsourced to two individual suppliers, Stera and Wittur. This thesis was carried out in collaboration with Stera. KONE and Stera pursue a long-term, close partnership in which the benefits achieved through the collaboration are shared equally. Despite these aims, the collaboration between the companies is not fully transparent, and various barriers hamper the development of more efficient ways of working. Based on the empirical studies related to this thesis, an efficient standardized (A+) process was developed for the main variations of the compact elevator door. Using the standardized process, KONE is able to order the most important AMDS door variations from Stera with increased quality, lower manufacturing costs and shorter manufacturing lead times compared to the current situation. In addition to these benefits, the standardized (A+) process also carries practical risks. KONE and the door supplier need to consider these risks together before decisions are made.

Relevance:

30.00%

Publisher:

Abstract:

Single-cell oils (SCO) have been considered a promising source of third-generation biofuels, mainly in the final form of biodiesel. However, their high production costs have been a barrier towards the commercialization of this commodity. The fast-growing yeast Rhodosporidium toruloides NCYC 921 has been widely reported as a potential SCO-producing yeast. In addition to its well-known high lipid content (which can be converted into biodiesel), it is rich in high-value-added products such as carotenoids of commercial interest. Process design and integration may contribute to reducing the overall cost of biofuels and carotenoid production and are a mandatory step towards their commercialization. The present work addresses biomass disruption, extraction, fractionation and recovery of products, with special emphasis on high-value-added carotenoids (beta-carotene, torulene, torularhodin) and fatty acids directed to biodiesel. The chemical structure of torularhodin, with a terminal carboxylic group, imposes an extra challenge with regard to its separation from the fatty acids. The proposed feedstock is a fresh biomass pellet obtained directly by centrifugation from a 5 L fed-batch fermentation culture broth. The use of a wet instead of a lyophilised biomass feedstock is a way to decrease processing energy costs and reduce downstream processing time. These results will contribute to a detailed process design. The data gathered will be of crucial importance for a further study on Life-Cycle Assessment (LCA).

Relevance:

30.00%

Publisher:

Abstract:

The analysis of system calls is one method employed by anomaly detection systems to recognise malicious code execution. Similarities can be drawn between this process and the behaviour of certain cells belonging to the human immune system, and these similarities can be applied to construct an artificial immune system. A recently developed hypothesis in immunology, the Danger Theory, states that our immune system responds to the presence of intruders through sensing molecules belonging to those invaders, plus signals generated by the host indicating danger and damage. We propose the incorporation of this concept into a responsive intrusion detection system, where behavioural information about the system and its running processes is combined with information regarding individual system calls.
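A minimal sketch of this idea, under assumptions of our own rather than the system actually proposed, is shown below in Python: a process's system-call trace is scored by the fraction of call n-grams never seen during normal operation ('non-self'), and an alert is raised only when that score coincides with a host-generated danger signal. All call names, signals and thresholds are hypothetical.

```python
def ngrams(trace, n=3):
    return [tuple(trace[i:i + n]) for i in range(len(trace) - n + 1)]

def train_self_model(normal_traces, n=3):
    """'Self' is simply the set of call n-grams observed during normal runs."""
    model = set()
    for trace in normal_traces:
        model.update(ngrams(trace, n))
    return model

def anomaly_score(trace, self_model, n=3):
    grams = ngrams(trace, n)
    if not grams:
        return 0.0
    return sum(1 for g in grams if g not in self_model) / len(grams)

def should_respond(trace, self_model, danger_signal, threshold=0.5):
    """Danger Theory flavour: non-self activity alone does not trigger a
    response; it must coincide with a host danger signal (0 = healthy, 1 = damage)."""
    return anomaly_score(trace, self_model) * danger_signal > threshold

# Hypothetical traces of system-call names.
normal = [["open", "read", "read", "close"], ["open", "read", "write", "close"]]
suspect = ["open", "mprotect", "execve", "socket", "write"]
model = train_self_model(normal)
print(anomaly_score(suspect, model))                      # high: mostly unseen n-grams
print(should_respond(suspect, model, danger_signal=0.9))  # True: anomaly plus danger
print(should_respond(suspect, model, danger_signal=0.1))  # False: same calls, low danger
```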

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, tool support is addressed for the combined disciplines of model-based testing and performance testing. Model-based testing (MBT) utilizes abstract behavioral models to automate test generation, thus decreasing the time and cost of test creation. MBT is a functional testing technique, focusing on output, behavior, and functionality. Performance testing, however, is non-functional and is concerned with responsiveness and stability under various load conditions. MBPeT (Model-Based Performance evaluation Tool) is one such tool, which utilizes probabilistic models, representing dynamic real-world user behavior patterns, to generate synthetic workload against a System Under Test (SUT) and, in turn, to carry out performance analysis based on key performance indicators (KPIs). Developed at Åbo Akademi University, the MBPeT tool currently comprises a downloadable command-line tool as well as a graphical user interface. The goal of this thesis project is two-fold: 1) to extend the existing MBPeT tool by deploying it as a web-based application, thereby removing the requirement of local installation, and 2) to design a user interface for this web application which will add new user interaction paradigms to the existing feature set of the tool. All phases of the MBPeT process will be realized via this single web deployment location, including probabilistic model creation, test configuration, test session execution against a SUT with real-time monitoring of user-configurable metrics, and final test report generation and display. This web application (MBPeT Dashboard) is implemented in the Java programming language on top of the Vaadin framework for rich internet application development. The Vaadin framework handles the complicated web communication processes and front-end technologies, freeing developers to implement the business logic as well as the user interface in pure Java. A number of experiments are run in a case study environment to validate the functionality of the newly developed Dashboard application as well as the scalability of the implemented solution in handling multiple concurrent users. The results support a successful solution with regard to the functional and performance criteria defined, while improvements and optimizations are suggested to further increase both of these factors.
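As a hedged illustration of the probabilistic user models mentioned above (not MBPeT's actual model format, API or workload engine), the Python sketch below walks a small Markov chain of user actions with think times to emit a synthetic workload trace that a load generator could replay against a system under test. The states, transition probabilities and request stub are hypothetical.

```python
import random
import time

# Hypothetical user model: state -> list of (next_state, probability).
MODEL = {
    "browse":   [("browse", 0.5), ("search", 0.3), ("checkout", 0.1), ("exit", 0.1)],
    "search":   [("browse", 0.6), ("checkout", 0.3), ("exit", 0.1)],
    "checkout": [("browse", 0.2), ("exit", 0.8)],
}
THINK_TIME = (0.5, 2.0)   # seconds a virtual user "thinks" between actions

def choose_next(state, rng):
    targets, weights = zip(*MODEL[state])
    return rng.choices(targets, weights=weights, k=1)[0]

def run_virtual_user(send_request, max_actions=50, seed=None):
    """Generate one virtual user's session, firing each action at the SUT via
    the caller-supplied `send_request(action)` callable (a stub here)."""
    rng = random.Random(seed)
    state, trace = "browse", []
    for _ in range(max_actions):
        send_request(state)
        trace.append(state)
        time.sleep(rng.uniform(*THINK_TIME) * 0.01)   # scaled down for the demo
        state = choose_next(state, rng)
        if state == "exit":
            break
    return trace

if __name__ == "__main__":
    # Stub SUT call; a real load generator would issue an HTTP request here and
    # record the response time as a KPI.
    requests_sent = []
    print(run_virtual_user(send_request=requests_sent.append, seed=42))
```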