900 results for IT Process Value


Relevance: 50.00%

Abstract:

Creativity is often defined as developing something novel that fits its context and has value. To achieve this, the creative process itself has gained increasing attention as organizational leaders seek competitive advantage through developing new products, services, processes, or business models. In this paper, we explore the notion that the creative process includes a series of “filters”, or ways of processing information, as a critical component. We use the metaphor of coffee making and filters because many of our examples come from Vietnam, which is one of the world’s top coffee exporters and has created a coffee culture rivaling that of many other countries. We begin with a brief review of the creative process and its connection to information processing, propose a tentative framework for integrating the two ideas, and provide examples of how it might work. We close with implications for further practical and theoretical directions for this idea.

Relevance: 50.00%

Abstract:

Within the ever-changing arenas of architectural design and education, the core element of architectural education remains the same: the design process. The consideration of how to design, in addition to what to design, presents architectural educators with that most constant and demanding challenge: how do we best teach the design process?

This challenge is arguably most acute in the early stages of a student's architectural education. In their first years in architecture, students commonly concentrate on the end product rather than the process. This is, in many ways, understandable. A great deal of time, money and effort goes into their final presentations. They believe that it is what is on the wall that is going to be assessed. Armed with new computer skills, they want to produce eye-catching graphics that are often no more than a celebration of a CAD package. In an era of increasing speed, immediacy of information and powerful advertising, it is unsurprising that students want to race quickly to presenting an end product.

Recognising that trend, new teaching methods and models were introduced into the second-year undergraduate studio over the past two years at Queen's University Belfast, aimed at promoting student self-reflection and making the design process more relevant to the students. This paper will first generate a critical discussion on the difficulties associated with the design process before outlining some of the methods employed to help promote the following: an understanding of concept; personalisation of the design process for the individual student; adding realism and value to the design process; and, finally, getting the students to play to their strengths in illustrating their design process as an element of the product. Frameworks, examples, outcomes and student feedback will all be presented to help illustrate the effectiveness of the new strategies employed in making the design process firstly more relevant and, therefore, secondly of greater value to the architecture student.

Relevance: 50.00%

Abstract:

This paper examines the applicability of a digital manufacturing framework to the implementation of a Value Driven Design (VDD) approach for the development of a stiffened composite panel. It presents a means by which environmental considerations can be integrated with conventional product and process design drivers within a customized digital environment. A composite forming process is used as an exemplar for the work, which creates a collaborative environment for the integration of more traditional design drivers with parameters related to manufacturability as well as more sustainable processes and products. The environmental stakeholder is introduced to the VDD process through a customized product/process/resource (PPR) environment in which application-specific power consumption and material waste data have been measured and characterised in the process design interface. This allows the manufacturing planner to consider power consumption as a concurrent design driver and enables the inclusion of energy as a parameter in a VDD approach to the development of efficiently manufactured, sustainable transport systems.
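
To make the idea of energy as a concurrent design driver concrete, the following sketch shows one simple way a value model could trade off conventional cost drivers against measured power consumption and material waste when comparing forming options. It is a minimal illustration in Python; the option names, prices and weights are assumptions for the example, not data from the paper's PPR environment.

    from dataclasses import dataclass

    @dataclass
    class ProcessOption:
        name: str
        material_cost: float   # currency units per panel
        labour_cost: float     # currency units per panel
        energy_kwh: float      # measured power consumption per panel
        scrap_kg: float        # measured material waste per panel

    def surplus_value(opt: ProcessOption,
                      price: float = 1200.0,       # assumed selling price per panel
                      energy_price: float = 0.25,  # assumed cost per kWh
                      scrap_price: float = 40.0) -> float:
        # Value = revenue minus conventional and environmental cost drivers.
        cost = (opt.material_cost + opt.labour_cost
                + opt.energy_kwh * energy_price
                + opt.scrap_kg * scrap_price)
        return price - cost

    options = [
        ProcessOption("hand lay-up", 420.0, 380.0, 35.0, 2.1),
        ProcessOption("automated forming", 410.0, 240.0, 60.0, 1.2),
    ]
    best = max(options, key=surplus_value)
    print(best.name, round(surplus_value(best), 2))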

Relevance: 50.00%

Abstract:

At present, 0.1–0.2% of patients undergoing surgery become aware during the procedure. This situation is referred to as anesthesia awareness and is obviously very traumatic for the person experiencing it. The reason for its occurrence is mostly an insufficient dosage of the narcotic Propofol, combined with the inability of the technology monitoring the depth of the patient's anesthetic state to notice the patient becoming aware. A solution can be a highly sensitive and selective real-time monitoring device for Propofol based on optical absorption spectroscopy. Its working principle was postulated by Prof. Dr. habil. H. Hillmer and formulated in DE10 2004 037 519 B4, filed on Aug 30th, 2004. It consists of the exploitation of intra-cavity absorption effects in a two-mode laser system. In this dissertation, a two-mode external cavity semiconductor laser, which had been developed prior to this work, is enhanced and optimized into a functional sensor. Enhancements include the implementation of variable couplers into the system and of a collimator arrangement into which samples can be introduced. A sample holder and cells are developed and characterized, with a focus on compatibility with the measurement approach. Further optimization concerns the overall performance of the system: scattering sources are reduced by re-splicing all fiber-to-fiber connections, parasitic cavities are eliminated by suppressing the Fresnel reflections at open fiber ends by means of optical isolators, and the wavelength stability of the system is improved by adding thermal insulation to the Fiber Bragg Gratings (FBG). The final laser sensor is characterized in detail thermally and optically. Two separate modes are obtained at 1542.0 and 1542.5 nm, each tunable over a range of 1 nm. The mode Full Width at Half Maximum (FWHM) is 0.06 nm and the Signal to Noise Ratio (SNR) is as high as 55 dB. Independent of tuning, the two modes of the system can always be equalized in intensity, which is important because the delicacy of the intensity equilibrium is one of the main sensitivity-enhancing effects formulated in DE10 2004 037 519 B4. For the proof-of-concept (POC) measurements, the target substance Propofol is diluted in the solvents acetone and dichloromethane (DCM), which were investigated for compatibility with Propofol beforehand. Eight measurement series (two solvents, two cell lengths and two different mode spacings) are taken, which draw a uniform picture: the mode intensity ratio responds linearly to an increase of Propofol in all cases. The slope of the linear response indicates the sensitivity of the system. The eight series are split into two groups: measurements taken in long cells and measurements taken in short cells.
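
The reported linear response of the mode intensity ratio suggests a straightforward calibration step, sketched below in Python. The concentration and ratio values are invented placeholders, not the dissertation's measured series; the point is only how a fitted slope (the sensitivity) can be inverted to estimate an unknown sample.

    import numpy as np

    # Known calibration samples: Propofol concentration in the solvent (assumed units)
    # versus the measured two-mode intensity ratio (illustrative values only).
    concentration = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
    mode_ratio = np.array([1.00, 1.08, 1.17, 1.24, 1.33])

    # Linear fit: the slope is the sensitivity of the sensor in this configuration.
    slope, intercept = np.polyfit(concentration, mode_ratio, 1)
    print(f"sensitivity (slope): {slope:.3f} ratio units per concentration unit")

    # Invert the calibration to estimate an unknown sample from its measured ratio.
    measured_ratio = 1.20
    estimated_concentration = (measured_ratio - intercept) / slope
    print(f"estimated concentration: {estimated_concentration:.2f}")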

Relevance: 50.00%

Abstract:

Information technology has become heavily embedded in business operations. As business needs change over time, IT applications are expected to continue providing the required support. Whether the existing IT applications are still fit for the business purpose for which they were intended, or whether new IT applications should be introduced, is a strategic decision for business, IT and business-aligned IT. In this paper, we present a method which aims to analyse business functions and IT roles, and to evaluate business-aligned IT from both social and technical perspectives. The method introduces a set of techniques that systematically supports the evaluation of existing IT applications in relation to their technical capabilities for maximising business value. Furthermore, we discuss the evaluation process and results, which are illustrated and validated through a real-life case study of a UK borough council, followed by a discussion of implications for researchers and practitioners.

Relevance: 50.00%

Abstract:

Theory: Interpersonal factors play a major role in causing and maintaining depression. It is unclear, however, to what degree significant others of the patient need to be involved in characterizing the patient's interpersonal style. Therefore, our study sought to investigate how impact messages as perceived by the patients' significant others add to the prediction of psychotherapy process and outcome above and beyond routine assessments and therapist factors. Method: 143 outpatients with major depressive disorder were treated by 24 therapists with CBT or Exposure-Based Cognitive Therapy. Interpersonal style was measured pre- and post-therapy with the informant-based Impact Message Inventory (IMI), in addition to the self-report Inventory of Interpersonal Problems (IIP-32). Indicators of the patients' dominance and affiliation as well as interpersonal distress were calculated from these measures. Depressive and general symptomatology was assessed pre-treatment, post-treatment and at three-month follow-up, and process measures were administered after every session. Results: Whereas significant others' reports did not add significantly to the prediction of the early therapeutic alliance, central mechanisms of change, or post-therapy outcome over and above therapist factors, the best predictor of outcome three months post-therapy was an increase in dominance as perceived by significant others. Conclusions: The patients' significant others seem to provide important additional information about the patients' interpersonal style and should therefore be included in the diagnostic process. Moreover, practitioners should specifically target interpersonal change as a potential mechanism of change in psychotherapy for depression.
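
The incremental-prediction question posed above ("above and beyond routine assessments and therapist factors") is typically examined by comparing nested regression models. The sketch below illustrates that generic procedure in Python with simulated data and hypothetical variable names; it is not the study's actual multilevel analysis.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 143
    df = pd.DataFrame({
        "baseline_bdi": rng.normal(25, 6, n),             # routine symptom assessment (assumed)
        "iip_distress": rng.normal(1.5, 0.4, n),          # self-report interpersonal distress (assumed)
        "imi_dominance_change": rng.normal(0.3, 0.5, n),  # informant-rated change (assumed)
    })
    df["outcome_bdi"] = (10 + 0.4 * df["baseline_bdi"]
                         - 4.0 * df["imi_dominance_change"]
                         + rng.normal(0, 5, n))

    # Compare nested models: routine predictors only versus routine plus informant rating.
    base = smf.ols("outcome_bdi ~ baseline_bdi + iip_distress", data=df).fit()
    full = smf.ols("outcome_bdi ~ baseline_bdi + iip_distress + imi_dominance_change", data=df).fit()
    print(f"R2 routine only: {base.rsquared:.3f}  R2 with informant rating: {full.rsquared:.3f}")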

Relevance: 50.00%

Abstract:

The purpose of this article is to highlight the value of ‘strategic positioning’ as a means of providing a competitive edge, and to introduce and describe a novel method of managing this. Strategic positioning is concerned with the choice of business activities a company carries out itself, compared with those provided by suppliers, partners, distributors and even customers. It is therefore directly impacted by, and has a direct impact upon, such decisions as outsourcing, off-shoring, partnering, innovation, technology acquisition and customer servicing.

Relevance: 50.00%

Abstract:

In line with recent findings from organisational justice theory, we hypothesised that employee proactive behaviour and careerist orientation are predicted by the interplay of the perceived favourability of career development opportunities, the perceived fairness of the procedures used to decide them, and employee organisational commitment. Employees (N = 325) of a large financial services organisation responded to a self-completion questionnaire. As predicted, when career development opportunities were viewed unfavourably, perceived procedural justice was significantly and positively related to individual proactive behaviour and significantly and negatively related to careerist orientation, but only when organisational commitment was high. It appears that high procedural justice may only 'offset' the negative effects of unfavourable career development opportunities when employees identify with, and are committed to, their organisation. Further support is presented for a relational, rather than instrumental, model of procedural justice when reflecting on employee reactions to their employers' policies and decision-making. Implications for theory and practice are discussed.
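
The hypothesised interplay of procedural justice, favourability of career development opportunities and organisational commitment corresponds to a moderated regression with interaction terms. The following Python sketch shows that general analysis pattern on simulated data; the variable names and coefficients are illustrative assumptions, not the study's results.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 325
    df = pd.DataFrame({
        "procedural_justice": rng.normal(0, 1, n),
        "favourability": rng.normal(0, 1, n),  # favourability of career development opportunities
        "commitment": rng.normal(0, 1, n),     # organisational commitment
    })
    df["proactive_behaviour"] = (0.2 * df["procedural_justice"]
                                 + 0.3 * df["procedural_justice"] * df["commitment"]
                                 + rng.normal(0, 1, n))

    # Standard moderation analysis: main effects plus all two- and three-way interactions.
    model = smf.ols("proactive_behaviour ~ procedural_justice * favourability * commitment",
                    data=df).fit()
    print(model.params)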

Relevance: 50.00%

Abstract:

Student engagement in learning and teaching is receiving a growing level of interest from policy makers, researchers and practitioners. This includes opportunities for staff and students to co-create curricula, yet there are few examples within the current literature which describe and critique this form of staff-student collaboration (Bovill, 2013a; Healey et al., 2014; Cook-Sather et al., 2014). The competing agendas of neoliberalism and critical, radical pedagogies influence the policy and practice of staff and students co-creating curricula and, consequently, attempt to appropriate its purpose in different ways. Using a case-based research methodology, my study presents an analysis of staff and students co-creating curricula within seven universities. This includes 17 examples of practice across 14 disciplines. Using an inductive approach, I have examined issues relating to definitions of practice, conceptualisations of curricula, perceptions of value, and the relationship between practice and institutional strategy. I draw upon an interdisciplinary body of literature to provide the conceptual foundations for my research. This has been necessary to address the complexity of practice and includes literature relating to student engagement in learning and teaching, conceptual models of curriculum in higher education, approaches to evidencing value and impact, and critical theory and radical pedagogies. The study makes specific contributions to the wider scholarly debate by highlighting the importance of dialogue and conversational scholarship, as well as by identifying with participants what matters as well as what works as a means to evidence the value of collaborations. It also presents evidence of a new model of co-creating curricula and additional approaches to conceptualising curricula to facilitate collaboration. Analysis of macro- and micro-level data shows enactment of dialogic pedagogies within contexts of technical-rational strategy formation and implementation.

Relevance: 40.00%

Abstract:

This paper is a detailed case narrative on how a Faculty of a leading Australian University conducted a rigorous process improvement project, applying fundamental Business Process Management (BPM) concepts. The key goal was to increase the efficiency of the faculty's service desk. The decrease in available funds due to falling student numbers and the ever-increasing costs associated with the service desk prompted this project. The outcome of the project was a set of recommendations that lead to organizational innovation, with information technology as an enabler for change. The target audience includes general BPM practitioners or academics who are interested in BPM-related case studies, and specific organisations that might be interested in conducting BPM within their service desk processes.

Relevance: 40.00%

Abstract:

Purpose – Financial information about costs and returns on investment is of key importance to strategic decision-making, but also in the context of process improvement or business engineering. In this paper we propose a value-oriented approach to business process modeling, based on key concepts and metrics from operations and financial management, to aid decision-making in process re-design projects on the basis of process models. Design/methodology/approach – We suggest a theoretically founded extension to current process modeling approaches, and delineate a framework as well as methodical support to incorporate financial information into process re-design. We use two case studies to evaluate the suggested approach. Findings – Based on the two case studies, we show that the value-oriented process modeling approach facilitates and improves managerial decision-making in the context of process re-design. Research limitations/implications – We present design work and two case studies. More research is needed to evaluate the presented approach more thoroughly in a variety of real-life process modeling settings. Practical implications – We show how our approach enables decision makers to make investment decisions in process re-design projects, and also how other decisions, for instance in the context of enterprise architecture design, can be facilitated. Originality/value – This study reports on an attempt to integrate financial considerations into the act of process modeling, in order to provide more comprehensive decision-making support in process re-design projects.
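
As a rough illustration of the kind of financial information such a value-oriented process model can carry, the sketch below attaches costs and branch probabilities to activities and compares an as-is and a to-be process via expected cost and a simple NPV. All figures are assumed for the example and do not come from the paper's case studies.

    def expected_process_cost(activities):
        # activities: list of (cost_per_execution, probability_of_execution)
        return sum(cost * prob for cost, prob in activities)

    as_is = [(12.0, 1.0), (30.0, 1.0), (45.0, 0.25)]  # third activity is a rework branch
    to_be = [(12.0, 1.0), (22.0, 1.0), (45.0, 0.10)]  # re-design cuts cost and rework rate

    cases_per_year = 8000
    annual_saving = (expected_process_cost(as_is) - expected_process_cost(to_be)) * cases_per_year

    # Discount the savings over the planning horizon against the re-design investment.
    investment, discount_rate, years = 150_000.0, 0.08, 5
    npv = -investment + sum(annual_saving / (1 + discount_rate) ** t for t in range(1, years + 1))
    print(f"annual saving: {annual_saving:,.0f}  NPV of re-design: {npv:,.0f}")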

Relevance: 40.00%

Abstract:

Key topics: Since the birth of the Open Source movement in the mid-1980s, open source software has become more and more widespread. Amongst others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow the software's use in exchange for a fee, open source licenses grant users more rights, such as free use, copying, modification and distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance that underlies such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007), collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008), issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006), public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006), technological competition issues related to standard battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007), and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users. On this topic, articles show that a commercial activity based on open source software is possible, as they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activities: providers of packaged open source solutions, IT Services & Software Engineering firms, and open source software publishers. However, the business model implications are different for each of these categories: the activities of providers of packaged solutions and IT Services & Software Engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as this issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. Finally, the literature identifies and depicts only two generic types of business models for open source software publishers: the "bundling" business model (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are: (1) to explore in which contexts the two generic business models described in the literature can be implemented successfully, and (2) to depict an additional business model for open source software publishers which can be used in a different context.
To do so, this paper draws upon an explorative case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager. It aims at depicting the process of IdealX's search for an appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of "mutualisation", which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the bundling business model, the dual licensing business model and the mutualisation business model) as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes further than the traditional concept of business model used by scholars in the open source literature. In this article, a business model is not only considered as a way of generating income (a "revenue model" (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenbloom, 2002; Teece, 2007). Consequently, this paper analyses the business models from the point of view of these two components.

Relevance: 40.00%

Abstract:

In this thesis we are interested in financial risk, and the instrument we want to use is Value-at-Risk (VaR). VaR is the maximum loss over a given period of time at a given confidence level. Many definitions of VaR exist and some will be introduced throughout this thesis. There are two main ways to measure risk and VaR: through volatility and through percentiles. Large volatility in financial returns implies a greater probability of large losses, but also a larger probability of large profits. Percentiles describe tail behaviour. The estimation of VaR is a complex task. It is important to know the main characteristics of financial data in order to choose the best model. The existing literature is very wide, sometimes controversial, but helpful in drawing a picture of the problem. It is commonly recognised that financial data are characterised by heavy tails, time-varying volatility, asymmetric response to bad and good news, and skewness. Ignoring any of these features can lead to underestimating VaR, with a possible ultimate consequence being the default of the protagonist (firm, bank or investor). In recent years, skewness has attracted special attention. An open problem is the detection and modelling of time-varying skewness: is skewness constant, or is there some significant variability which in turn can affect the estimation of VaR? This thesis aims to answer this question and to open the way to a new approach to modelling time-varying volatility (conditional variance) and skewness simultaneously. The new tools are modifications of the Generalised Lambda Distributions (GLDs). These are four-parameter distributions which allow the first four moments to be modelled nearly independently: in particular we are interested in what we will call para-moments, i.e., mean, variance, skewness and kurtosis. The GLDs will be used in two different ways. Firstly, semi-parametrically, we consider a moving window to estimate the parameters and calculate the percentiles of the GLDs. Secondly, parametrically, we attempt to extend the GLDs to include time-varying dependence in the parameters. We used local linear regression to estimate the conditional mean and conditional variance semi-parametrically. The method is not efficient enough to capture all the dependence structure in the three indices (ASX 200, S&P 500 and FT 30); however, it provides an idea of the data generating process underlying the series and helps in choosing a good technique to model the data. We find that the GLDs suggest that moments up to the fourth order do not always exist; their existence appears to vary over time. This is a very important finding, considering that past papers (see, for example, Bali et al., 2008; Hashmi and Tay, 2007; Lanne and Pentti, 2007) modelled time-varying skewness while implicitly assuming the existence of the third moment. However, the GLDs suggest that the mean, variance, skewness and, in general, the conditional distribution vary over time, as already suggested by the existing literature. The GLDs give good results in estimating VaR on three real indices, ASX 200, S&P 500 and FT 30, with results very similar to those provided by historical simulation.
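
As a concrete reference point for the percentile route to VaR mentioned above, the sketch below computes a moving-window, historical-simulation VaR on simulated heavy-tailed returns. It is the benchmark the thesis compares against rather than the GLD-based estimator itself, and the return series is a placeholder, not the ASX 200, S&P 500 or FT 30 data.

    import numpy as np

    rng = np.random.default_rng(42)
    returns = rng.standard_t(df=4, size=2000) * 0.01   # heavy-tailed daily returns (assumed)

    window, alpha = 250, 0.01                          # one trading year, 99% confidence level
    var_estimates = []
    for t in range(window, len(returns)):
        sample = returns[t - window:t]
        var_estimates.append(-np.percentile(sample, 100 * alpha))  # VaR reported as a positive loss

    var_estimates = np.array(var_estimates)
    breaches = (returns[window:] < -var_estimates).mean()
    print(f"mean 99% VaR: {var_estimates.mean():.4f}  observed breach rate: {breaches:.3%}")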