936 results for Process analysis
Abstract:
Queuing is an important criterion for assessing the performance and efficiency of any service industry, including healthcare. Data Envelopment Analysis (DEA) is one of the most widely used techniques for performance measurement in healthcare. However, no queue management application has been reported in the health-related DEA literature. Most studies of patient flow systems have had the objective of improving an already existing appointment system. The current study presents a novel application of DEA for assessing the queuing process at an outpatients’ department of a large public hospital in a developing country where appointment systems do not exist. The main aim of the current study is to demonstrate the usefulness of DEA modelling in the evaluation of a queue system. The patient flow pathway considered for this study consists of two stages: consultation with a doctor, and pharmacy. The DEA results indicated that waiting times and the other queuing variables included in the model need considerable minimisation at both stages.
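To make the DEA approach concrete, the sketch below computes an input-oriented CCR efficiency score as a linear program with SciPy. The clinic data (waiting time and staffing as inputs, patients served as output) are hypothetical illustrations in the spirit of the abstract, not the study's figures.

```python
# A minimal sketch of an input-oriented CCR DEA model solved as a linear
# program. Data are hypothetical: rows of X are inputs, rows of Y outputs,
# columns are decision-making units (e.g., clinic sessions).
import numpy as np
from scipy.optimize import linprog

X = np.array([[45.0, 30.0, 60.0, 25.0],    # mean waiting time (minutes)
              [4.0,  3.0,  5.0,  3.0]])    # staff on duty
Y = np.array([[120.0, 90.0, 150.0, 80.0]])  # patients served

n_inputs, n_units = X.shape
n_outputs = Y.shape[0]

def ccr_efficiency(o: int) -> float:
    """Efficiency of unit o: minimise theta subject to the envelopment
    constraints; decision variables are [theta, lambda_1..lambda_n]."""
    c = np.zeros(1 + n_units)
    c[0] = 1.0  # objective: minimise theta
    # Inputs: sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    b_in = np.zeros(n_inputs)
    # Outputs: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((n_outputs, 1)), -Y])
    b_out = -Y[:, o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (1 + n_units))
    return res.fun  # theta = 1.0 means the unit lies on the frontier

for o in range(n_units):
    print(f"unit {o}: efficiency = {ccr_efficiency(o):.3f}")
```

A score below 1.0 flags a stage or session whose waiting-time inputs could be proportionally reduced without lowering output, which is the sense in which the abstract speaks of "minimisation".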
Abstract:
The aim of this paper is to propose a conceptual framework for studying the knowledge transfer problem within the supply chain. Social network analysis (SNA) is presented as a useful tool for studying knowledge networks within the supply chain, visualizing knowledge flows, and identifying the knowledge-accumulating nodes of the networks. © 2011 IEEE.
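As an illustration of the SNA toolkit the paper draws on, the sketch below models knowledge flows between supply-chain actors as a directed graph with networkx and uses centrality measures to flag accumulating nodes; the actors and edges are invented for illustration.

```python
# A minimal sketch: knowledge flows as a directed graph, with centrality
# used to identify knowledge-accumulating and brokering nodes.
import networkx as nx

flows = [("supplier_A", "manufacturer"), ("supplier_B", "manufacturer"),
         ("manufacturer", "distributor"), ("distributor", "retailer_1"),
         ("distributor", "retailer_2"), ("retailer_1", "manufacturer")]

G = nx.DiGraph(flows)  # edge u -> v: knowledge flows from u to v

# In-degree centrality: nodes where knowledge accumulates.
accumulation = nx.in_degree_centrality(G)
# Betweenness centrality: brokers that knowledge must pass through.
brokerage = nx.betweenness_centrality(G)

for node in G:
    print(f"{node:13s} accumulation={accumulation[node]:.2f} "
          f"brokerage={brokerage[node]:.2f}")
```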
Abstract:
The purpose of this study was to document and critically analyze the lived experience of selected nursing staff developers in the process of moving toward a new model for hospital nursing education. Eleven respondents were drawn from a nationwide population of about two hundred individuals involved in nursing staff development. These subjects were responsible for the implementation of the Performance Based Development System (PBDS) in their institutions. A purposive, criterion-based sampling technique was used, with respondents selected according to size of hospital, primary responsibility for orchestration of the change, influence over budgetary factors, and managerial responsibility for PBDS. Data were gathered by the researcher through both in-person and telephone interviews. A semi-structured interview guide designed by the researcher was used, and respondents were encouraged to amplify their recollections as desired. Audiotapes were transcribed, and the resulting computer files were analyzed using the program "Martin". Answers to interview questions were compiled and reported across cases. The data were then reviewed a second time and interpreted for emerging themes and patterns. Two types of verification were used in the study. Internal verification was done through interview transcript review and feedback by respondents. External verification was done through review of, and feedback on, the data analysis by readers experienced in the management of staff development departments. All respondents were female, so Gilligan's concept of the "ethic of care" was examined as a decision-making strategy. Three levels of caring that influenced decision making were found: caring (a) for the organization, (b) for the employee, and (c) for the patient. The four existentials of the lived experience (relationality, corporeality, temporality and spatiality) were also examined to reveal the everydayness of making change.
Abstract:
Given the growing number of wrongful convictions involving faulty eyewitness evidence and the strong reliance by jurors on eyewitness testimony, researchers have sought to develop safeguards to decrease erroneous identifications. While decades of eyewitness research have led to numerous recommendations for the collection of eyewitness evidence, less is known regarding the psychological processes that govern identification responses. The purpose of the current research was to expand the theoretical knowledge of eyewitness identification decisions by exploring two separate memory theories: signal detection theory and dual-process theory. This was accomplished by examining both system and estimator variables in the context of a novel lineup recognition paradigm. Both theories were also examined in conjunction with confidence to determine whether it might add significantly to the understanding of eyewitness memory. In two separate experiments, both an encoding and a retrieval-based manipulation were chosen to examine the application of theory to eyewitness identification decisions. Dual-process estimates were measured through the use of remember-know judgments (Gardiner & Richardson-Klavehn, 2000). In Experiment 1, the effects of divided attention and lineup presentation format (simultaneous vs. sequential) were examined. In Experiment 2, perceptual distance and lineup response deadline were examined. Overall, the results indicated that discrimination and remember judgments (recollection) were generally affected by variations in encoding quality, and response criterion and know judgments (familiarity) were generally affected by variations in retrieval options. Specifically, as encoding quality improved, discrimination ability and judgments of recollection increased; and as the retrieval task became more difficult, there was a shift toward lenient choosing and more reliance on familiarity. The application of signal detection theory and dual-process theory in the current experiments produced predictable results for both system and estimator variables. These theories were also compared to measures of general confidence, calibration, and diagnosticity. Applying the additional confidence measures in conjunction with signal detection theory and dual-process theory gave a more in-depth explanation than either theory alone. Therefore, the general conclusion is that eyewitness identifications can be understood in a more complete manner by applying theory and examining confidence. Future directions and policy implications are discussed.
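For readers unfamiliar with the signal detection quantities the abstract relies on, the sketch below computes discrimination (d') and response criterion (c) from hit and false-alarm rates; the rates used are hypothetical, not the experiments' data.

```python
# A minimal sketch of standard signal detection measures. Hypothetical
# hit/false-alarm rates illustrate the two effects the abstract reports:
# better encoding raising discrimination, and harder retrieval shifting
# the criterion toward lenient choosing.
from scipy.stats import norm

def sdt_measures(hit_rate: float, fa_rate: float):
    """d' = z(H) - z(F) indexes discrimination ability;
    c = -(z(H) + z(F)) / 2 indexes response criterion (bias)."""
    z_h, z_f = norm.ppf(hit_rate), norm.ppf(fa_rate)
    return z_h - z_f, -(z_h + z_f) / 2

print(sdt_measures(hit_rate=0.80, fa_rate=0.20))  # ~ (1.68, 0.00): neutral
print(sdt_measures(hit_rate=0.90, fa_rate=0.40))  # negative c: lenient shift
```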
Abstract:
Aircraft manufacturing industries are looking for solutions to increase their productivity. One solution is to apply metrology systems during the production and assembly processes. The Metrology Process Model (MPM) (Maropoulos et al., 2007) has been introduced, which links metrology applications with assembly planning, manufacturing processes and product design. Measurability analysis is part of the MPM, and its aim is to check the feasibility of measuring the designed large-scale components. Measurability analysis has been integrated in order to provide an efficient matching system. The metrology database is structured by developing the Metrology Classification Model; the feature-based selection model is also explained. By combining the two classification models, a novel approach and selection process for an integrated measurability analysis system (MAS) are introduced; such an integrated MAS can provide far more meaningful matching results for the operators. © Springer-Verlag Berlin Heidelberg 2010.
Abstract:
Multi-output Gaussian processes provide a convenient framework for multi-task problems. An illustrative and motivating example of a multi-task problem is multi-region electrophysiological time-series data, where experimentalists are interested in both power and phase coherence between channels. Recently, the spectral mixture (SM) kernel was proposed to model the spectral density of a single task in a Gaussian process framework. This work develops a novel covariance kernel for multiple outputs, called the cross-spectral mixture (CSM) kernel. This new, flexible kernel represents both the power and phase relationship between multiple observation channels. The expressive capabilities of the CSM kernel are demonstrated through implementation of 1) a Bayesian hidden Markov model, where the emission distribution is a multi-output Gaussian process with a CSM covariance kernel, and 2) a Gaussian process factor analysis model, where factor scores represent the utilization of cross-spectral neural circuits. Results are presented for measured multi-region electrophysiological data.
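For context, the spectral mixture kernel that the CSM kernel generalises has a standard closed form (Wilson & Adams, 2013), sketched below in NumPy together with a phase-shifted cross-channel variant; the phase parameterisation here is an illustrative assumption, not the paper's exact CSM formulation.

```python
# A minimal sketch of the single-output SM kernel,
#   k(tau) = sum_q w_q exp(-2 pi^2 tau^2 v_q) cos(2 pi tau mu_q),
# i.e. a mixture of Q Gaussians in the spectral domain, plus an assumed
# cross-channel variant where each channel carries a per-component phase.
import numpy as np

def sm_kernel(tau, weights, means, variances):
    """Single-output spectral mixture kernel (power spectrum only)."""
    tau = np.asarray(tau)[..., None]  # broadcast over the Q components
    return np.sum(weights * np.exp(-2 * np.pi**2 * tau**2 * variances)
                  * np.cos(2 * np.pi * tau * means), axis=-1)

def cross_sm_kernel(tau, weights, means, variances, phase_c, phase_d):
    """Cross-covariance between channels c and d: the cosine picks up the
    phase difference, encoding phase coherence as well as power (an
    illustrative parameterisation in the spirit of the CSM kernel)."""
    tau = np.asarray(tau)[..., None]
    return np.sum(weights * np.exp(-2 * np.pi**2 * tau**2 * variances)
                  * np.cos(2 * np.pi * tau * means + (phase_c - phase_d)),
                  axis=-1)

taus = np.linspace(0.0, 1.0, 5)
w, mu, v = np.array([1.0, 0.5]), np.array([5.0, 12.0]), np.array([0.5, 1.0])
print(sm_kernel(taus, w, mu, v))
print(cross_sm_kernel(taus, w, mu, v,
                      np.array([0.0, 0.3]), np.array([0.2, 0.0])))
```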
Abstract:
This thesis explores organisations' attitudes towards the business processes that sustain them: from the near-absence of structure, to functional organisation, up to the advent of Business Process Reengineering and of Business Process Management, which emerged to overcome the limits and problems of the previous model. Within the BPM life cycle sits the process mining methodology, which enables process analysis starting from event data logs, that is, the records of events referring to all activities supported by an enterprise information system. Process mining can be seen as a natural bridge connecting process-based (but not data-driven) management disciplines with the new developments in business intelligence, which can handle and manipulate the enormous amount of data available to companies (but are not process-driven). The thesis describes the requirements and technologies that enable the use of the discipline, as well as the three techniques it enables: process discovery, conformance checking and process enhancement. Process mining was used as the main tool in a consulting project carried out by HSPI S.p.A. on behalf of a major Italian client, a provider of IT platforms and solutions. The project I took part in, described in this thesis, aims to support the organisation in its plan to improve internal performance, and it made it possible to verify the applicability and limits of process mining techniques. Finally, the appendix contains a paper I wrote that collects the applications of the discipline in real business contexts, drawing data and information from working papers, business cases and direct channels. For its validity and completeness, this document has been published on the website of the IEEE Task Force on Process Mining.
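The first two techniques named above can be tried on any event log with the open-source pm4py library; the sketch below shows discovery and conformance checking, assuming pm4py's simplified API and a placeholder log file rather than the project's (non-public) data.

```python
# A minimal sketch of process discovery and conformance checking with
# pm4py. The event log path is a placeholder.
import pm4py

# Event data log exported from an information system (XES format).
log = pm4py.read_xes("event_log.xes")

# Process discovery: learn a Petri net from the observed event data.
net, initial_marking, final_marking = pm4py.discover_petri_net_inductive(log)

# Conformance checking: replay the log on the discovered model and measure
# fitness, i.e. how well observed behaviour matches the process model.
diagnostics = pm4py.fitness_token_based_replay(log, net, initial_marking,
                                               final_marking)
print(diagnostics)  # dict of fitness measures, e.g. average trace fitness
```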
Abstract:
Ever since the birth of the Smart City paradigm, a wide variety of initiatives have sprung up around this phenomenon: best practices, projects, pilot projects, transformation plans, models, standards, indicators, measuring systems, etc. The question to ask, relevant to any government official, city planner or researcher, is whether this effect is being felt in how cities are transforming, or whether, in contrast, it is not very realistic to speak of cities imbued with this level of intelligence. Many cities are eager to define themselves as smart, but the variety, complexity and scope of the projects needed for this transformation indicate that the change process is longer than it seems. If our goal is to carry out a comparative analysis of this progress among cities, using the number of projects executed and their scope as a reference for the transformation, we could find such a task futile because of the huge differences in the characteristics that define each city. We believe the subject needs simplification (simpler, more practical models) and a new approach. This paper presents a detailed analysis of the smart city transformation process in Spain and provides a support model that helps us understand the changes and the speed at which they are being implemented. To this end we define a set of elements of change called "transformation factors" that classify a city's smartness into one of three levels (Low/Medium/Fully) and identify the level of advancement of this process more homogeneously. © 2016 IEEE.
Abstract:
This text discusses the textual (re)writing process in an activity fostered by a research project carried out in a public school in Paraná state, whose goal was to develop argumentative writing among 9th-year students of basic education. The research project focused on teaching the scientific paper genre and the use of conjunctions as constituent elements of argumentation. Strategies grounded in Argumentative Semantics were applied, concerned with the use of conjunctions as argumentative elements and with the expansion of ideas. After analysing the material produced, the corpus of this work was selected: the final version of a text written by one student whose production was rated highly. It was verified that, in the writing activities, the student took into account the suggested textual genre and the use of conjunctions as organising elements of argumentation.
Abstract:
The soda process was the first chemical pulping method, patented in 1845. Soda pulping led to kraft pulping, which involves the combined use of sodium hydroxide and sodium sulfide; today, kraft pulping dominates the chemical pulping industry. However, about 10% of the total chemical pulp produced in the world is made from non-wood material, such as bagasse and wheat straw. The soda process is the preferred method for chemical pulping of non-wood materials because it is considered economically viable on a small scale and, for bagasse, is compatible with sugarcane processing. With recent developments, the soda process can be designed to minimise both effluent discharge and the fouling of evaporators by silica precipitation. The aim of this work is to produce bagasse fibres suitable for papermaking and allied applications, and to produce sulfur-free lignin for use in specialty applications. A preliminary economic analysis of the soda process for producing commodity silica, lignin and pulp for papermaking is presented.
Abstract:
Over the past several years, there has been resurgent interest in regional planning in North America, Europe and Australasia. Spurred by issues such as metropolitan growth, transportation infrastructure, environmental management and economic development, many states and metropolitan regions are undertaking new planning initiatives. These regional efforts have also raised significant questions about governance structures, accountability and measures of effectiveness. In this paper, the authors conducted an international review of ten case studies from the United States, Canada, England, Belgium, New Zealand and Australia to explore several critical questions. Using a qualitative data template, the research team reviewed plans, documents, web sites and published literature to address three questions. First, what are the governance arrangements for delivering regional planning? Second, what are the mechanisms linking regional plans with state plans (when relevant) and local plans? Third, what means and mechanisms do these regional plans use to evaluate and measure effectiveness? The case study analysis revealed several common themes. First, there is an increasing focus on governance at the regional level, driven by a range of trends, including regional spatial development initiatives in Europe, regional transportation issues in the US, and the growth of metropolitan regions generally. However, there is considerable variation in how regional governance arrangements are playing out. Similarly, the processes being used at the regional level to guide planning range from broad (thick) processes to narrow and limited (thin) approaches. Finally, evaluation and monitoring of regional planning efforts involve compiling data on inputs, processes, outputs and outcomes. Although increased attention is being paid to indicators and monitoring, most of it falls into outcome evaluations such as Agenda 21 or sustainability reporting. Based on our review, we suggest there is a need for increased attention to input, process and output indicators, and for clearer linkages of these indicators in monitoring and evaluation frameworks. The focus on outcome indicators, such as sustainability indicators, creates feedback systems that are too long-term and remote for effective monitoring and feedback. Although we found some examples where these kinds of monitoring frameworks are linked into a system of governance, there is a need for clearer conceptual development in both theory and practice.
Abstract:
As an understanding of users' tacit knowledge and latent needs embedded in user experience has come to play a critical role in product development, users' direct involvement in design has become a necessary part of the design process. Various ways of accessing users' tacit knowledge and latent needs have been explored in the fields of user-centred design, participatory design, and design for experiencing. User-designer collaboration has been used unconsciously by traditional designers to facilitate the transfer of users' tacit knowledge and to elicit new knowledge. However, what makes user-designer collaboration an effective strategy has rarely been reported on or explored. Therefore, interaction patterns between users and designers in three industry-supported user involvement cases were studied. In order to develop a coding system, collaboration was defined as a set of coordinated, joint problem-solving activities, measured by the elicitation of new knowledge from collaboration. The analysis of interaction patterns in the user involvement cases revealed that allowing users to challenge or modify their contextual experiences facilitates the transfer of knowledge and the generation of new knowledge. It was concluded that users can be integrated more effectively into the product development process by employing collaboration strategies that intensify the depth of user involvement.
Abstract:
More than a century ago, in their definitive work “The Right to Privacy”, Samuel D. Warren and Louis D. Brandeis highlighted the challenges posed to individual privacy by advancing technology. Today’s workplace is characterised by its reliance on computer technology, particularly the use of email and the Internet to perform critical business functions. Increasingly, these and other workplace activities are the focus of monitoring by employers. There is little formal regulation of electronic monitoring in Australian or United States workplaces. Without reasonable limits or controls, this has the potential to adversely affect employees’ privacy rights. Australia has a history of legislating to protect privacy rights, whereas the United States has relied on a combination of constitutional guarantees, federal and state statutes, and the common law. This thesis examines a number of existing and proposed statutory and other workplace privacy laws in Australia and the United States. The analysis demonstrates that existing measures fail to adequately regulate monitoring or to provide employees with suitable remedies where unjustifiable intrusions occur. The thesis ultimately supports the view that enacting uniform legislation at the national level provides a more effective and comprehensive solution for both employers and employees. Chapter One provides a general introduction and briefly discusses issues relevant to electronic monitoring in the workplace. Chapter Two contains an overview of privacy law as it relates to electronic monitoring in Australian and United States workplaces. Chapter Three examines the complaint process and remedies available to a hypothetical employee (Mary) who is concerned about protecting her privacy rights at work. Chapter Four provides an analysis of the major themes emerging from the research and discusses the draft national uniform legislation. Chapter Five details the proposed legislation in the form of the Workplace Surveillance and Monitoring Act, and Chapter Six contains the conclusion.
Abstract:
The central thesis of the article is that the venture creation process is different for innovative versus imitative ventures. This holds up: the pace of the process differs by type of venture, as do, in line with theory-based hypotheses, the effects of certain human capital (HC) and social capital (SC) predictors. Importantly, and somewhat unexpectedly, the theoretically derived models using HC, SC, and certain controls are relatively successful in explaining progress in the creation process for the minority of innovative ventures, but achieve very limited success for the imitative majority. This may be due to a rationalistic bias in conventional theorizing and suggests that considerable theoretical development is needed regarding the important phenomenon of new venture creation processes. Another important result is that the building up of instrumental social capital, which we assess comprehensively and as a time-variant construct, is important for making progress with both types of ventures, and increasingly so as the process progresses. With stronger operationalization and a more appropriate analysis method, this result corroborates what previously published research has only been able to hint at.