965 results for Process analysis
Abstract:
This study analyses the current role of police-suspect interview discourse in the England & Wales criminal justice system, with a focus on its use as evidence. A central premise is that the interview should be viewed not as an isolated and self-contained discursive event, but as one link in a chain of events which together constitute the criminal justice process. It examines: (1) the format changes undergone by interview data after the interview has taken place, and (2) how the other links in the chain – both before and after the interview – affect the interview-room interaction itself. It thus examines the police interview as a multi-format, multi-purpose and multi-audience mode of discourse. An interdisciplinary and multi-method discourse-analytic approach is taken, combining elements of conversation analysis, pragmatics, sociolinguistics and critical discourse analysis. Data from a new corpus of recent police-suspect interviews, collected for this study, are used to illustrate previously unaddressed problems with the current process, mainly in the form of two detailed case studies. Additional data are taken from the case of Dr. Harold Shipman. The analysis reveals several causes for concern, both in aspects of the interaction in the interview room, and in the subsequent treatment of interview material as evidence, especially in the light of s.34 of the Criminal Justice and Public Order Act 1994. The implications of the findings for criminal justice are considered, along with some practical recommendations for improvements. Overall, this study demonstrates the need for increased awareness within the criminal justice system of the many linguistic factors affecting interview evidence.
Abstract:
Queuing is one of the most important criteria for assessing the performance and efficiency of any service industry, including healthcare. Data Envelopment Analysis (DEA) is one of the most widely used techniques for performance measurement in healthcare. However, no queue management application has been reported in the health-related DEA literature. Most studies of patient flow systems have had the objective of improving an already existing appointment system. The current study presents a novel application of DEA for assessing the queuing process at an Outpatients’ department of a large public hospital in a developing country where appointment systems do not exist. The main aim of the current study is to demonstrate the usefulness of DEA modelling in the evaluation of a queue system. The patient flow pathway considered for this study consists of two stages: consultation with a doctor and the pharmacy. The DEA results indicated that waiting times and the other related queuing variables included require considerable minimisation at both stages.
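The core DEA idea described above (scoring each decision-making unit against the best observed performance) reduces, in the single-input/single-output case, to comparing output/input ratios; the general multi-input case requires linear programming. A minimal sketch of the single-ratio case, using invented clinic figures rather than data from the study:

```python
def dea_efficiency(inputs, outputs):
    """CCR efficiency for one input and one output: each unit's
    output/input ratio relative to the best observed ratio."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical clinic sessions: input = staff-hours, output = patients served.
staff_hours = [4.0, 4.0, 6.0]
patients = [40, 20, 45]
print(dea_efficiency(staff_hours, patients))  # [1.0, 0.5, 0.75]
```

Session 1 defines the efficient frontier; the others are scored against it. In a queuing application like the one above, mean waiting time would enter the model as an input to be minimised.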
Abstract:
The aim of this paper is to propose a conceptual framework for studying the knowledge transfer problem within the supply chain. Social network analysis (SNA) is presented as a useful tool to study knowledge networks within the supply chain, to visualize knowledge flows and to identify the knowledge-accumulating nodes of the networks. © 2011 IEEE.
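As a sketch of the kind of analysis SNA enables here, the snippet below builds a directed knowledge-flow network from transfer records and identifies the accumulating nodes by in-degree; all actor names and flows are invented for illustration:

```python
from collections import Counter

# Hypothetical knowledge-transfer records: (source, recipient) pairs.
flows = [
    ("supplier_a", "manufacturer"),
    ("supplier_b", "manufacturer"),
    ("manufacturer", "distributor"),
    ("supplier_a", "distributor"),
    ("distributor", "retailer"),
]

# In-degree approximates where knowledge accumulates in the network.
in_degree = Counter(recipient for _, recipient in flows)
print(in_degree.most_common())  # manufacturer and distributor each receive two flows
```

A fuller analysis would weight edges by transfer frequency and use centrality measures (betweenness, eigenvector) rather than raw degree, but the principle is the same.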
Abstract:
The purpose of this study was to document and critically analyze the lived experience of selected nursing staff developers in the process of moving toward a new model for hospital nursing education. Eleven respondents were drawn from a nation-wide population of about two hundred individuals involved in nursing staff development. These subjects were responsible for the implementation of the Performance Based Development System (PBDS) in their institutions. A purposive, criterion-based sampling technique was used, with respondents being selected according to size of hospital, primary responsibility for orchestration of the change, influence over budgetary factors and managerial responsibility for PBDS. Data were gathered by the researcher through both in-person and telephone interviews. A semi-structured interview guide, designed by the researcher, was used, and respondents were encouraged to amplify on their recollections as desired. Audiotapes were transcribed and the resulting computer files were analyzed using the program "Martin". Answers to interview questions were compiled and reported across cases. The data were then reviewed a second time and interpreted for emerging themes and patterns. Two types of verification were used in the study. Internal verification was done through interview transcript review and feedback by respondents. External verification was done through review of, and feedback on, the data analysis by readers experienced in the management of staff development departments. All respondents were female, so Gilligan's concept of the "ethic of care" was examined as a decision-making strategy. Three levels of caring which influenced decision making were found: caring (a) for the organization, (b) for the employee, and (c) for the patient. The four existentials of the lived experience (relationality, corporeality, temporality and spatiality) were also examined to reveal the everydayness of making change.
Abstract:
Given the growing number of wrongful convictions involving faulty eyewitness evidence and the strong reliance by jurors on eyewitness testimony, researchers have sought to develop safeguards to decrease erroneous identifications. While decades of eyewitness research have led to numerous recommendations for the collection of eyewitness evidence, less is known regarding the psychological processes that govern identification responses. The purpose of the current research was to expand the theoretical knowledge of eyewitness identification decisions by exploring two separate memory theories: signal detection theory and dual-process theory. This was accomplished by examining both system and estimator variables in the context of a novel lineup recognition paradigm. Both theories were also examined in conjunction with confidence to determine whether it might add significantly to the understanding of eyewitness memory. In two separate experiments, both an encoding and a retrieval-based manipulation were chosen to examine the application of theory to eyewitness identification decisions. Dual-process estimates were measured through the use of remember-know judgments (Gardiner & Richardson-Klavehn, 2000). In Experiment 1, the effects of divided attention and lineup presentation format (simultaneous vs. sequential) were examined. In Experiment 2, perceptual distance and lineup response deadline were examined. Overall, the results indicated that discrimination and remember judgments (recollection) were generally affected by variations in encoding quality, while response criterion and know judgments (familiarity) were generally affected by variations in retrieval options. Specifically, as encoding quality improved, discrimination ability and judgments of recollection increased; and as the retrieval task became more difficult there was a shift toward lenient choosing and more reliance on familiarity.
The application of signal detection theory and dual-process theory in the current experiments produced predictable results on both system and estimator variables. These theories were also compared to measures of general confidence, calibration, and diagnosticity. The application of the additional confidence measures in conjunction with signal detection theory and dual-process theory gave a more in-depth explanation than either theory alone. Therefore, the general conclusion is that eyewitness identifications can be understood in a more complete manner by applying theory and examining confidence. Future directions and policy implications are discussed.
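The two signal detection quantities at issue above, discrimination and response criterion, are standardly computed from hit and false-alarm rates. A minimal sketch (the rates are illustrative, not data from these experiments):

```python
from statistics import NormalDist

def sdt_measures(hit_rate, fa_rate):
    """d' (discrimination) and c (criterion) from hit and false-alarm rates."""
    z = NormalDist().inv_cdf  # standard normal quantile function
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Illustrative witness: 90% hits on target-present lineups, 20% false alarms.
d, c = sdt_measures(0.90, 0.20)
print(f"d' = {d:.2f}, c = {c:.2f}")  # prints: d' = 2.12, c = -0.22
```

The negative criterion corresponds to the lenient choosing described in the abstract: the witness requires relatively little memory strength before making an identification.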
Abstract:
Aircraft manufacturing industries are looking for solutions to increase their productivity. One such solution is to apply metrology systems during the production and assembly processes. The Metrology Process Model (MPM; Maropoulos et al., 2007) has been introduced, which emphasises the application of metrology in assembly planning, manufacturing processes and product design. Measurability analysis is part of the MPM, and the aim of this analysis is to check the feasibility of measuring the designed large-scale components. Measurability analysis has been integrated in order to provide an efficient matching system. A metrology database is structured by developing the Metrology Classification Model. Furthermore, the feature-based selection model is also explained. By combining the two classification models, a novel approach and selection process for an integrated measurability analysis system (MAS) are introduced; such an integrated MAS can provide much more meaningful matching results for the operators. © Springer-Verlag Berlin Heidelberg 2010.
Abstract:
Multi-output Gaussian processes provide a convenient framework for multi-task problems. An illustrative and motivating example of a multi-task problem is multi-region electrophysiological time-series data, where experimentalists are interested in both power and phase coherence between channels. Recently, the spectral mixture (SM) kernel was proposed to model the spectral density of a single task in a Gaussian process framework. This work develops a novel covariance kernel for multiple outputs, called the cross-spectral mixture (CSM) kernel. This new, flexible kernel represents both the power and phase relationship between multiple observation channels. The expressive capabilities of the CSM kernel are demonstrated through implementation of 1) a Bayesian hidden Markov model, where the emission distribution is a multi-output Gaussian process with a CSM covariance kernel, and 2) a Gaussian process factor analysis model, where factor scores represent the utilization of cross-spectral neural circuits. Results are presented for measured multi-region electrophysiological data.
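As background for the CSM construction, the single-output spectral mixture kernel it extends models a stationary covariance as a weighted mixture of Gaussians in the spectral domain. A minimal numpy sketch of that base kernel (parameter values are arbitrary; the cross-channel amplitude and phase parameters of the CSM kernel itself are not shown):

```python
import numpy as np

def sm_kernel(tau, weights, means, variances):
    """Spectral mixture kernel:
    k(tau) = sum_q w_q * exp(-2 pi^2 tau^2 v_q) * cos(2 pi mu_q tau).

    Each component q places spectral mass w_q at frequency mu_q with
    spectral variance v_q in the power spectral density."""
    tau = np.asarray(tau, dtype=float)
    k = np.zeros_like(tau)
    for w, mu, v in zip(weights, means, variances):
        k += w * np.exp(-2.0 * np.pi**2 * tau**2 * v) * np.cos(2.0 * np.pi * mu * tau)
    return k

# Two components: a slow oscillation near 1 Hz and a faster one near 6 Hz.
tau = np.linspace(-1.0, 1.0, 201)
k = sm_kernel(tau, weights=[1.0, 0.5], means=[1.0, 6.0], variances=[0.05, 0.2])
```

At tau = 0 the kernel equals the total spectral mass (here 1.5), and the kernel is even in tau, as any stationary real-valued covariance must be.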
Abstract:
This thesis explores organisations' attitudes towards the business processes that sustain them: from the near-absence of structure, through the functional organisation, to the advent of Business Process Reengineering and of Business Process Management, the latter born to overcome the limits and problems of the preceding model. Within the BPM life cycle sits the methodology of process mining, which enables a level of process analysis based on event data logs, i.e. the recorded data of events relating to all activities supported by a company information system. Process mining can be seen as a natural bridge between process-based (but not data-driven) management disciplines and the new developments in business intelligence, which can handle and manipulate the enormous volume of data available to companies (but which are not process-driven). The thesis describes the requirements and technologies that enable the discipline, as well as the three techniques it supports: process discovery, conformance checking and process enhancement. Process mining was used as the main tool in a consulting project carried out by HSPI S.p.A. on behalf of a major Italian client, a provider of IT platforms and solutions. The project I took part in, described in this thesis, aims to support the organisation in its plan to improve internal performance, and it made it possible to verify the applicability and the limits of process mining techniques. Finally, the appendix contains a paper I wrote that collects the applications of the discipline in real business contexts, drawing data and information from working papers, business cases and direct channels. For its validity and completeness, this document has been published on the site of the IEEE Task Force on Process Mining.
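Process discovery, the first of the three techniques mentioned above, typically starts from the directly-follows relation mined out of the event log. A minimal sketch over a hypothetical log (one list of activities per case, in timestamp order; the activity names are invented):

```python
from collections import Counter

def directly_follows(log):
    """Count how often activity a is immediately followed by b in any trace."""
    dfg = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

# Hypothetical event log extracted from an information system.
log = [
    ["register", "review", "approve"],
    ["register", "review", "reject"],
    ["register", "approve"],
]
dfg = directly_follows(log)
print(dfg[("register", "review")])  # 2
```

Real discovery algorithms (alpha miner, heuristics miner, inductive miner) build process models on top of exactly this kind of directly-follows statistic.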
Abstract:
Ever since the birth of the Smart City paradigm, a wide variety of initiatives have sprung up involving this phenomenon: best practices, projects, pilot projects, transformation plans, models, standards, indicators, measuring systems, etc. The question to ask, applicable to any government official, city planner or researcher, is whether this effect is being felt in how cities are transforming, or whether, in contrast, it is not very realistic to speak of cities imbued with this level of intelligence. Many cities are eager to define themselves as smart, but the variety, complexity and scope of the projects needed for this transformation indicate that the change process is longer than it seems. If our goal is to carry out a comparative analysis of this progress among cities by using the number of projects executed and their scope as a reference for the transformation, we could find such a task inconclusive, given the huge differences among cities and the characteristics that define each one. We believe that the subject needs simplification (simpler, more practical models) and a new approach. This paper presents a detailed analysis of the smart city transformation process in Spain and provides a support model that helps us understand the changes and the speed at which they are being implemented. To this end we define a set of elements of change called "transformation factors" that group a city's smartness into one of three levels (Low/Medium/Fully) and more homogeneously identify the level of advancement of this process. © 2016 IEEE.
Abstract:
This research aims to contribute to a better understanding of changes in local governments' accounting and reporting practices, particularly why, what and how environmental aspects are included, and the significance of changes across time. It adopts an interpretative approach to conduct a longitudinal analysis of case studies. Pettigrew and Whipp's framework on context, content and process is used as a lens to distinguish changes under each dimension and analyse their interconnections. Data are collected from official documents and triangulated with semi-structured interviews. The legal framework defines as the boundaries of the accounting information the territory under local governments' jurisdiction and its immediate surrounding area. Organisational environmental performance and externalities are excluded from the requirements. An interplay between the local outer context, political commitment and organisational culture justifies the implementation of changes beyond what is regulated and the implementation of transformational changes. Local governments engage in international networks to gain access to funding and implement changes, leading them to adopt the dominant environmental agenda. Key stakeholders, like citizens, are not engaged in the accounting and reporting process. Thus, there is no evidence that the environmental aspects addressed and the related changes align with stakeholders' needs and expectations, which jeopardises their significance. Findings from the current research have implications for other EU member states due to the harmonisation of accounting and reporting practices and the common practice across the EU of using external funding to conceptualise and implement changes. This implies that other local governments could also be presenting a limited account of environmental aspects.
Abstract:
Ecosystem engineering is increasingly recognized as a relevant ecological driver of diversity and community composition. Although engineering impacts on the biota can vary from negative to positive, and from trivial to enormous, patterns and causes of variation in the magnitude of engineering effects across ecosystems and engineer types remain largely unknown. To elucidate the above patterns, we conducted a meta-analysis of 122 studies that explored effects of animal ecosystem engineers on species richness of other organisms in the community. The analysis revealed that the overall effect of ecosystem engineers on diversity is positive and corresponds to a 25% increase in species richness, indicating that ecosystem engineering is a facilitative process globally. Engineering effects were stronger in the tropics than at higher latitudes, likely because new or modified habitats provided by engineers in the tropics may help minimize competition and predation pressures on resident species. Within aquatic environments, engineering impacts were stronger in marine ecosystems (rocky shores) than in streams. In terrestrial ecosystems, engineers displayed stronger positive effects in arid environments (e.g. deserts). Ecosystem engineers that create new habitats or microhabitats had stronger effects than those that modify habitats or cause bioturbation. Invertebrate engineers and those with lower engineering persistence (<1 year) affected species richness more than vertebrate engineers and those whose effects persisted for >1 year. Invertebrate species richness was particularly responsive to engineering impacts. This study is the first attempt to build an integrative framework of engineering effects on species diversity; it highlights the importance of considering latitude, habitat, engineering functional group, taxon and persistence of their effects in future theoretical and empirical studies.
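The "25% increase in species richness" reported above is the kind of figure obtained by back-transforming a pooled log response ratio (a 25% increase corresponds to lnRR ≈ 0.22). A sketch of the back-transformation and a simple inverse-variance pooled mean; the per-study numbers are invented, not taken from the meta-analysis:

```python
import math

def percent_change(lnrr):
    """Back-transform a log response ratio into a percentage change in richness."""
    return (math.exp(lnrr) - 1.0) * 100.0

def pooled_lnrr(effects, variances):
    """Inverse-variance weighted mean effect size (fixed-effect pooling)."""
    weights = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# Invented per-study effects: ln(richness with engineer / richness in control).
effects = [0.10, 0.30, 0.25]
variances = [0.02, 0.04, 0.01]
print(round(percent_change(pooled_lnrr(effects, variances)), 1))
```

Random-effects pooling, as commonly used in ecological meta-analysis, adds a between-study variance component to each weight but follows the same weighted-mean logic.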
Abstract:
The Subaxial Injury Classification (SLIC) system and severity score has been developed to help surgeons in the decision-making process for the treatment of subaxial cervical spine injuries. A detailed description of all the potential scored injuries of the SLIC is lacking. We performed a systematic review in the PubMed database from 2007 to 2014 to describe the relationship between the scored injuries in the SLIC and their eventual treatment according to the system score. Patients with an SLIC of 1-3 points (conservative treatment) are neurologically intact with spinous process, laminar or small facet fractures. Patients with compression and burst fractures who are neurologically intact are also treated nonsurgically. Patients with an SLIC of 4 points may have an incomplete spinal cord injury such as a central cord syndrome, compression injuries with incomplete neurologic deficits or burst fractures with complete neurologic deficits. An SLIC of 5-10 points includes distraction and rotational injuries, traumatic disc herniation in the setting of a neurological deficit and burst fractures with an incomplete neurologic deficit. The SLIC injury severity score can help surgeons guide fracture treatment. Knowledge of the potential scored injuries and their relationship with the SLIC score is of paramount importance for spine surgeons who treat subaxial cervical spine injuries.
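The scoring logic summarised above can be sketched as follows. The component points (morphology 0-4, discoligamentous complex 0-2, neurological status 0-3, with +1 for continuous cord compression in the setting of a deficit) follow the commonly published SLIC tables; treat this as a simplified illustration of the scoring arithmetic, not clinical guidance:

```python
MORPHOLOGY = {"no abnormality": 0, "compression": 1, "burst": 2,
              "distraction": 3, "rotation/translation": 4}
DLC = {"intact": 0, "indeterminate": 1, "disrupted": 2}  # discoligamentous complex
NEURO = {"intact": 0, "root injury": 1, "complete cord": 2, "incomplete cord": 3}

def slic_score(morphology, dlc, neuro, ongoing_cord_compression=False):
    """Sum the three SLIC components, with the cord-compression modifier."""
    score = MORPHOLOGY[morphology] + DLC[dlc] + NEURO[neuro]
    if ongoing_cord_compression and NEURO[neuro] > 0:
        score += 1  # modifier applies only in the setting of a neurologic deficit
    return score

def recommendation(score):
    """Treatment tier implied by the total score."""
    if score <= 3:
        return "conservative"
    if score == 4:
        return "surgeon's discretion"
    return "surgical"

# A neurologically intact burst fracture scores 2, hence nonsurgical management.
print(recommendation(slic_score("burst", "intact", "intact")))  # conservative
```

By contrast, a distraction injury with a disrupted discoligamentous complex and an incomplete cord injury scores well above the surgical threshold, matching the 5-10 point group described above.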