783 results for Fundamentals of computing theory
Abstract:
This dissertation develops a process improvement method for service operations based on the Theory of Constraints (TOC), a management philosophy that has been shown to be effective in manufacturing for decreasing WIP and improving throughput. While TOC has enjoyed much attention and success in the manufacturing arena, its application to services in general has been limited. The contribution to industry and knowledge is a method for improving global performance measures based on TOC principles. The method proposed in this dissertation is tested using discrete event simulation based on the scenario of the service factory of airline turnaround operations. To evaluate the method, a simulation model of aircraft turn operations of a U.S.-based carrier was built and validated using actual data from airline operations. The model was then adjusted to reflect an application of the Theory of Constraints for determining how to deploy the scarce resource of ramp workers. The results indicate that, given slight modifications to TOC terminology and the development of a method for constraint identification, the Theory of Constraints can be applied successfully to services. Bottlenecks in services must be defined as those processes whose process rates and remaining work are such that the process cannot be completed without an increase in the process rate. The bottleneck ratio is used to determine to what degree a process is a constraint. Simulation results also suggest that redefining performance measures to reflect a global business perspective of reducing costs related to specific flights, rather than the locally optimal operational approach of turning all aircraft quickly, results in significant savings to the company. Simulated savings to the airline's annual operating costs equaled 30% of current potential expenses for misconnecting passengers, with only a modest increase in worker utilization achieved through a more efficient heuristic that deploys workers to the highest-priority tasks. This dissertation contributes to the literature on service operations by describing a dynamic, adaptive dispatch approach, based on the management philosophy of the Theory of Constraints, for managing service factory operations similar to airline turnaround operations.
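The abstract does not spell out how the bottleneck ratio or the dispatch heuristic is computed, so the following is a minimal, hypothetical Python sketch of the idea described above: a turn process is a constraint when its remaining work cannot be completed at the current process rate in the time remaining, and scarce ramp workers are deployed to the most constrained processes first. All names and the exact ratio definition are illustrative assumptions, not the dissertation's formulation.

```python
from dataclasses import dataclass

@dataclass
class TurnProcess:
    name: str
    work_remaining: float   # units of work left (e.g., bags to load)
    process_rate: float     # units of work completed per minute
    time_remaining: float   # minutes until scheduled departure

def bottleneck_ratio(p: TurnProcess) -> float:
    """Ratio > 1 means the process cannot finish at the current rate before
    time runs out, i.e., it is a constraint; higher values mean a more severe
    constraint. (Hypothetical definition for illustration only.)"""
    required_rate = p.work_remaining / max(p.time_remaining, 1e-9)
    return required_rate / p.process_rate

def dispatch_priorities(processes):
    """Deploy scarce ramp workers to the most constrained processes first."""
    return sorted(processes, key=bottleneck_ratio, reverse=True)

if __name__ == "__main__":
    turn = [
        TurnProcess("baggage loading", work_remaining=120, process_rate=4.0, time_remaining=25),
        TurnProcess("cabin cleaning", work_remaining=30, process_rate=2.0, time_remaining=25),
    ]
    for p in dispatch_priorities(turn):
        print(f"{p.name}: bottleneck ratio = {bottleneck_ratio(p):.2f}")
```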
Abstract:
Maternity nursing practice is changing across Canada with the movement toward becoming “baby friendly.” The World Health Organization (WHO) recommends the Baby-Friendly Hospital Initiative (BFHI) as a standard of care in hospitals worldwide. Very little research has been conducted with nurses to explore the impact of the initiative on nursing practice. The purpose of this study, therefore, was to examine the process of implementing the BFHI for nurses. The study was carried out using Corbin and Strauss’s method of grounded theory. Theoretical sampling was employed, which resulted in recruiting and interviewing 13 registered nurses whose area of employment included neonatal intensive care, postpartum, and labour and delivery. The data analysis revealed a central category of resisting the BFHI. All of the nurses disagreed with some of the 10 steps to becoming a baby-friendly hospital as outlined by the WHO. Participants questioned the science and safety of aspects of the BFHI. Also, participants indicated that the implementation of this program did not substantially change their nursing practice. They empathized with new mothers and anticipated being collectively reprimanded by management should they not follow the initiative. Five conditions influenced their responses to the initiative, which were (a) an awareness of a pro-breastfeeding culture, (b) imposition of the BFHI, (c) knowledge of the health benefits of breastfeeding, (d) experiential knowledge of infant feeding, and (e) the belief in the autonomy of mothers to decide about infant feeding. The identified outcomes were moral distress and division between nurses. The study findings could guide decision making concerning the implementation of the BFHI.
Abstract:
This research aims to explore the challenges nurses face when caring for stroke patients on a general medical/surgical ward in the acute care setting, and to identify how nurses resolve or process this challenge. Healthcare environments continue to face the pressures of constraints such as reduced staffing levels, budgets, resources and time, which influence care provision. Patient safety is central to care provision, where nurses face the challenge of delivering the best quality care while working within constraints. The incidence of stroke is increasing worldwide, and internationally stroke units are the recognised minimum standard of care. In Ireland, with few designated stroke units in operation, many stroke patients are cared for in the acute general care setting. A classic grounded theory methodology was utilised for this study. Data were collected and analysed simultaneously through coding, constant comparison, theoretical sampling and memoing. Individual unstructured interviews with thirty-two nurses were carried out, and twenty hours of non-participant observation in the acute general care setting were undertaken. The main concern that emerged was working within constraints. This concern is processed by nurses through resigning, which consists of three phases: idealistic striving, resourcing and care accommodation. Through the process of resigning, nurses engage in an energy maintenance process enabling them to continue working within constraints. The generated theory of resigning explains how nurses resolve or process working within constraints. This theory adds to the body of knowledge on stroke care provision. It has the potential to enhance nursing care, minimise burnout and make better use of resources while advocating for the best care of stroke patients.
Abstract:
The purpose of this study was to assess the intention to exercise among ethnically and racially diverse community college students using the Theory of Planned Behavior (TPB). In addition to identifying the variables associated with the motivation or intention of college students to engage in physical activity, this study tested the model of the Theory of Planned Behavior, asking: does the TPB model explain intention to exercise among a racially/ethnically diverse group of college students? The relevant variables were the TPB constructs (behavioral beliefs, normative beliefs, and control beliefs), which combined to form a measure of intention to exercise. Structural Equation Modeling (SEM) was used to test the predictive power of the TPB constructs for predicting intention to exercise. Following procedures described by Ajzen (2002), the researcher developed a questionnaire encompassing the external variables of student demographics (age, gender, work status, student status, socio-economic status, access to exercise facilities, and past behavior), the major constructs of the TPB, and two questions from the Godin Leisure Time Questionnaire (GLTQ; Godin & Shephard, 1985). Participants were students (N = 255) enrolled in an on-campus wellness course at an urban community college. The demographic profile of the sample revealed a racially/ethnically diverse study population. The original model reflecting the TPB as developed by Ajzen was not supported by the data analyzed using SEM; however, a revised model, which the researcher considered a theoretically more accurate reflection of the causal relations among the TPB constructs, was supported. The GLTQ questions were problematic for some students, so those data could not be used in the modeling efforts. The GLTQ measure, however, revealed a significant correlation with intention to exercise (r = .27, p = .001). Post-hoc comparisons revealed significant differences in normative beliefs and attitude toward exercising behavior between Black students and Hispanic students. Compared to Black students, Hispanic students were more likely to (a) perceive “friends” as approving of them being physically active and (b) rate being physically active for 30 minutes per day as “beneficial”. No statistically significant difference was found among groups on overall intention to exercise.
Abstract:
Computer game technology is poised to make a significant impact on the way our youngsters will learn. Our youngsters are ‘Digital Natives’, immersed in digital technologies, especially computer games, and they expect to utilize these technologies in learning contexts. This expectation, and our response as educators, may change classroom practice and inform curriculum developments. This chapter approaches these issues ‘head on’. Starting from a review of current educational issues and an evaluation of educational theory and instructional design principles, a new theoretical approach to the construction of “Educational Immersive Environments” (EIEs) is proposed. Elements of this approach are applied to the development of an EIE to support Literacy Education in UK Primary Schools, and an evaluation of a trial within a UK Primary School is discussed. Conclusions from both the theoretical development and the evaluation suggest how future teacher-practitioners may embrace both the technology and our approach to develop their own learning resources.
Abstract:
This paper aims to crystallize recent research performed at the University of Worcester into the feasibility of using the commercial game engine ‘Unreal Tournament 2004’ (UT2004) to produce ‘Educational Immersive Environments’ (EIEs) suitable for education and training. Our research has been supported by the UK Higher Education Academy. We discuss both practical and theoretical aspects of EIEs. The practical aspects include the production of EIEs to support high school physics education, the education of architects, and the learning of literacy by primary school children. This research is based on the development of our novel instructional medium, ‘UnrealPowerPoint’. Our fundamental guiding principles are, first, that pedagogy must inform technology and, second, that both teachers and pupils should be empowered to produce educational materials. Our work is informed by current educational theories such as constructivism, experiential learning and socio-cultural approaches, as well as by elements of instructional design and game principles.
Abstract:
Resource allocation decisions are made to serve the current emergency without knowing which future emergencies will occur. Different ordered combinations of emergencies result in different performance outcomes. Even though future decisions can be anticipated with scenarios, previous models follow the assumption that events over a time interval are independent. This dissertation follows the assumption that events are interdependent, because speed reduction and rubbernecking due to an initial incident provoke secondary incidents. The misconception that secondary incidents are not common has led to the look-ahead concept being overlooked. This dissertation is a pioneer in relaxing the structural assumption of independence in the assignment of emergency vehicles. When an emergency is detected and a request arrives, an appropriate emergency vehicle is immediately dispatched. We provide tools for quantifying impacts based on the fundamentals of incident occurrence through identification, prediction, and interpretation of secondary incidents. A proposed online dispatching model minimizes the cost of moving the next emergency unit while keeping the response as close to optimal as possible. Using the look-ahead concept, the online model flexibly re-computes the solution, basing future decisions on present requests. We introduce various online dispatching strategies with visualization of the algorithms, and provide insights on their differences in behavior and solution quality. The experimental evidence indicates that the algorithm works well in practice. After having served a designated request, the available and/or remaining vehicles are relocated to a new base for the next emergency. System costs become excessive if the delay associated with dispatching decisions is ignored when relocating response units. This dissertation presents an integrated method that begins with a location phase to manage initial incidents and progresses through a dispatching phase to manage the stochastic occurrence of subsequent incidents. Previous studies used the frequency of independent incidents and ignored scenarios in which two incidents occurred within proximal regions and intervals. The proposed analytical model relaxes the structural assumptions of the Poisson process (independent increments) and incorporates the evolution of primary and secondary incident probabilities over time. The mathematical model overcomes several limiting assumptions of previous models, such as the no-waiting-time assumption, the rule of returning to the original depot, and fixed depot locations. The temporal locations, made flexible with look-ahead, are compared with the current practice of locating units in depots based on Poisson theory. A linearization of the formulation is presented, and an efficient heuristic algorithm is implemented to deal with a large-scale problem in real time.
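The online dispatching model is described only at a high level, so the sketch below is a hypothetical Python illustration of a greedy online rule with a simple look-ahead term: each arriving request is assigned to the available unit that minimizes immediate response time plus an estimated penalty for leaving zones with high secondary-incident risk uncovered. The cost structure, weights, and names are assumptions for illustration, not the author's model.

```python
import math

def travel_time(unit_pos, incident_pos, speed=1.0):
    """Euclidean travel-time estimate between grid coordinates."""
    return math.dist(unit_pos, incident_pos) / speed

def dispatch(units, incident, secondary_risk, lookahead_weight=0.5):
    """Greedy online assignment: pick the available unit that minimizes
    immediate response time plus a look-ahead penalty for uncovering zones
    with high estimated secondary-incident risk."""
    best_unit, best_cost = None, float("inf")
    for unit in units:
        if not unit["available"]:
            continue
        immediate = travel_time(unit["pos"], incident["pos"])
        # Penalty: secondary-incident risk near the unit's current zone,
        # which would be left uncovered while this unit is busy.
        uncovered = secondary_risk.get(unit["zone"], 0.0)
        cost = immediate + lookahead_weight * uncovered
        if cost < best_cost:
            best_unit, best_cost = unit, cost
    if best_unit is not None:
        best_unit["available"] = False
    return best_unit

# Example usage with made-up data.
units = [
    {"id": "EMS-1", "pos": (0.0, 0.0), "zone": "A", "available": True},
    {"id": "EMS-2", "pos": (5.0, 2.0), "zone": "B", "available": True},
]
secondary_risk = {"A": 4.0, "B": 0.5}   # higher = more likely follow-on incidents
chosen = dispatch(units, {"pos": (1.0, 1.0)}, secondary_risk)
print("dispatched:", chosen["id"])
```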
Abstract:
This study is concerned with the significance of Jungian and post-Jungian theory to the development of the contemporary Western Goddess Movement, which includes the various self-identified nature-based, Pagan, Goddess Feminism, Goddess Consciousness, Goddess Spirituality, Wicca, and Goddess-centred faith traditions that have seen a combined increase in Western adherents over the past five decades and share a common goal of claiming Goddess as an active part of Western consciousness and faith traditions. The Western Goddess Movement has been strongly influenced by Jung’s thought and by feminist revisions of Jungian theory, sometimes interpreted idiosyncratically but presented as a route to personal and spiritual transformation. The analysis examines ways in which women encounter Goddess through a process of Jungian Individuation, and traces the development of Jungian and post-Jungian theories by identifying the key thinkers and central ideas that helped to shape the development of the Western Goddess Movement. It does so through a close reading and analysis of five biographical ‘rebirth’ memoirs published between 1981 and 1998: Christine Downing’s (1981) The Goddess: Mythological Images of the Feminine; Jean Shinoda Bolen’s (1994) Crossing to Avalon: A Woman’s Midlife Pilgrimage; Sue Monk Kidd’s (1996) The Dance of the Dissident Daughter: A Woman’s Journey from Christian Tradition to the Sacred Feminine; Margaret Starbird’s (1998) The Goddess in the Gospels: Reclaiming the Sacred Feminine; and Phyllis Curott’s (1998) Book of Shadows: A Modern Woman’s Journey into the Wisdom of Witchcraft and the Magic of the Goddess. These five memoirs reflect the diversity of faith traditions in the Western Goddess Movement. The enquiry centres upon two parallel and complementary research threads: 1) critically examining the content of the memoirs in order to determine their contribution to the development of the Goddess Movement, and 2) charting and sourcing the development of the major Jungian and post-Jungian theories championed in the memoirs in order to evaluate the significance of Jungian and post-Jungian thought in the Movement. The aim of this study was to gain a better understanding of the original research question: what is the significance of Jungian and post-Jungian theory for the development of the Western Goddess Movement? Each memoir is subjected to a critical review of its intended audiences, its achievements, its functions and strengths, and its theoretical frameworks. The research results offered more than the experiences of five Western women; they also provided evidence with which to analyse the significance of Jungian and post-Jungian theory to the development of the Western Goddess Movement. The findings demonstrate the vital contributions of the analytical psychology of Carl Jung and of the post-Jungians M. Esther Harding, Erich Neumann, Christine Downing, E.C. Whitmont, and Jean Shinoda Bolen, along with the additional contributions of Sue Monk Kidd, Margaret Starbird, and Phyllis Curott, and exhibit Jungian and post-Jungian pathways to Goddess. Through a variety of approaches to Jungian categories, these memoirs constitute a literature of Individuation for the Western Goddess Movement.
Abstract:
Over the last decade, a new idea challenging the classical self-non-self viewpoint has become popular amongst immunologists. It is called the Danger Theory. In this conceptual paper, we look at this theory from the perspective of Artificial Immune System practitioners. An overview of the Danger Theory is presented with particular emphasis on analogies in the Artificial Immune Systems world. A number of potential application areas are then used to provide a framing for a critical assessment of the concept, and its relevance for Artificial Immune Systems. Notes: Uwe Aickelin, Department of Computing, University of Bradford, Bradford, BD7 1DP
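The paper is conceptual, so the following is only a hypothetical Python sketch of the analogy it discusses: in a Danger Theory-inspired Artificial Immune System, a response is triggered not merely because an event fails to match ‘self’, but because a non-self match coincides with a danger signal such as signs of damage or stress in the host. The signals and thresholds below are illustrative assumptions.

```python
def is_nonself(event, self_patterns):
    """Classical view: anything not matching known 'self' patterns is suspect."""
    return event["pattern"] not in self_patterns

def danger_signal(context, cpu_threshold=0.95, error_threshold=5):
    """Danger Theory analogy: react to signs of damage or stress in the host,
    e.g., resource exhaustion or a burst of errors (illustrative signals)."""
    return context["cpu_load"] > cpu_threshold or context["recent_errors"] > error_threshold

def respond(event, context, self_patterns):
    """Trigger a response only when a non-self match coincides with danger,
    rather than on non-self detection alone."""
    return is_nonself(event, self_patterns) and danger_signal(context)

self_patterns = {"login", "backup", "heartbeat"}
event = {"pattern": "unknown_process"}
calm = {"cpu_load": 0.30, "recent_errors": 0}
stressed = {"cpu_load": 0.99, "recent_errors": 12}
print(respond(event, calm, self_patterns))      # False: non-self but no danger
print(respond(event, stressed, self_patterns))  # True: non-self in a danger context
```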
Abstract:
The increasing dependency of everyday life on mobile devices also increases the number and complexity of computing tasks to be supported by these devices. However, the inherent requirement of mobility restricts them from being resource-rich, both in terms of energy (battery capacity) and of other computing resources such as processing capacity and memory. This thesis looks into the cyber foraging technique of offloading computing tasks. Various experiments on Android mobile devices are carried out to evaluate offloading benefits in terms of sustainability, prolonged battery life, and augmented performance of mobile devices. The thesis considers two scenarios of cyber foraging, namely opportunistic offloading and competitive offloading. The results show that both offloading scenarios are important for green computing and for the resource augmentation of mobile devices: significant gains in battery life and performance are obtained. Moreover, cyber foraging proves efficient in minimizing energy consumption per computing task. The work is based on the Scavenger cyber foraging system. In addition, the work can be used as a basis for studying cyber foraging and similar approaches, such as mobile cloud/edge computing for Internet of Things devices, and for improving the user experience of applications by minimizing latencies through the use of potential nearby surrogates.
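The thesis does not give its decision rule here, so the following is a minimal Python sketch of the standard energy trade-off behind offloading decisions of this kind: offload a task when the estimated energy for transmission plus idle waiting is lower than the energy of local execution. All device and network parameters are made-up illustrative values, not measurements from the thesis.

```python
def local_energy(cycles, power_active_w, cpu_speed_hz):
    """Energy (J) to run the task on the device CPU."""
    return power_active_w * (cycles / cpu_speed_hz)

def offload_energy(data_bytes, bandwidth_bps, power_tx_w, power_idle_w,
                   cycles, surrogate_speed_hz):
    """Energy (J) to ship the task to a surrogate and wait for the result:
    transmission cost plus idle cost while the surrogate computes."""
    tx_time = (data_bytes * 8) / bandwidth_bps
    wait_time = cycles / surrogate_speed_hz
    return power_tx_w * tx_time + power_idle_w * wait_time

def should_offload(cycles, data_bytes, *, cpu_speed_hz=1.0e9, surrogate_speed_hz=8.0e9,
                   bandwidth_bps=20e6, power_active_w=0.9, power_tx_w=1.3, power_idle_w=0.3):
    """Offload when the estimated remote energy is lower than local execution."""
    e_local = local_energy(cycles, power_active_w, cpu_speed_hz)
    e_remote = offload_energy(data_bytes, bandwidth_bps, power_tx_w, power_idle_w,
                              cycles, surrogate_speed_hz)
    return e_remote < e_local

# Example: a compute-heavy task with a small payload usually favours offloading,
# while a light task with a large payload usually does not.
print(should_offload(cycles=5e9, data_bytes=200_000))   # True with these values
print(should_offload(cycles=5e7, data_bytes=5_000_000)) # False with these values
```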
Abstract:
This paper outlines a formal and systematic approach to explication of the role of structure in information organization. It presents a preliminary set of constructs that are useful for understanding the similarities and differences that obtain across information organization systems. This work seeks to provide necessary groundwork for development of a theory of structure that can serve as a lens through which to observe patterns across systems of information organization.
Abstract:
Complexity theory, which originated in the study of phenomena in the natural sciences, offers an alternative framework for understanding the emergent events that arise in the international system. This monograph correlates the language of complexity with international relations, focusing on the Visegrad–Ukraine relationship, which has been the scene of a series of emergent and unexpected events since the civil protests of November 2013 in Kiev. Since then, the complex system that exists between the Visegrad Group and Ukraine has had to adapt to recurrent emergent events and to self-organize. In this way, it can behave in accordance with unpredictable scenarios, particularly with regard to its energy interactions and its political interconnections.
Abstract:
Subtle structural differences can be observed in the islets of Langerhans region of microscopic images of pancreas cells of rats having normal glucose tolerance and rats having pre-diabetic (glucose intolerant) conditions. This paper proposes a way to automatically segment the islets of Langerhans region from the histological image of a rat's pancreas cell and, on the basis of morphological features extracted from the segmented region, classify the images as normal or pre-diabetic. The experiment is done on a set of 134 images, of which 56 are of the normal type and the remaining 78 are of the pre-diabetic type. The work has two stages: first, segmentation of the region of interest (ROI), i.e. the islets of Langerhans, from the pancreatic cell image; and second, extraction of morphological features from the region of interest for classification. Wavelet analysis and connected component analysis have been used for automatic segmentation of the images. A few classifiers, such as OneRule, Naïve Bayes, MLP, J48 Tree, and SVM, are used for evaluation, among which MLP performed the best.
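As a rough illustration of the two-stage pipeline described (wavelet plus connected-component segmentation, then morphological features fed to an MLP), the following Python sketch uses PyWavelets, scikit-image, and scikit-learn. The thresholds, feature choices, and network size are assumptions for illustration, not the paper's actual parameters.

```python
import numpy as np
import pywt
from skimage import measure, morphology
from sklearn.neural_network import MLPClassifier

def segment_islets(gray_image, min_size=200):
    """Rough segmentation: take the 2-D wavelet approximation, apply a global
    threshold, and keep only sizeable connected components (illustrative)."""
    cA, _ = pywt.dwt2(gray_image, "haar")          # low-frequency approximation
    binary = cA > cA.mean()                        # simple global threshold
    binary = morphology.remove_small_objects(binary, min_size=min_size)
    return measure.label(binary)                   # connected component labels

def morphological_features(label_image):
    """Per-image feature vector from the largest region: area, eccentricity, solidity."""
    regions = measure.regionprops(label_image)
    if not regions:
        return [0.0, 0.0, 0.0]
    r = max(regions, key=lambda reg: reg.area)
    return [float(r.area), float(r.eccentricity), float(r.solidity)]

def train_classifier(images, labels):
    """Fit an MLP (the best-performing classifier reported) on the extracted features."""
    X = [morphological_features(segment_islets(img)) for img in images]
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(X, labels)
    return clf

# Usage with synthetic stand-in images (real histological images would be used in practice).
rng = np.random.default_rng(0)
images = [rng.random((128, 128)) for _ in range(6)]
labels = ["normal", "pre-diabetic"] * 3
model = train_classifier(images, labels)
print(model.predict([morphological_features(segment_islets(images[0]))]))
```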
Abstract:
Modern scientific discoveries are driven by an insatiable demand for computational resources. High-Performance Computing (HPC) systems are an aggregation of computing power that delivers considerably higher performance than a typical desktop computer can provide, in order to solve large problems in science, engineering, or business. An HPC room in a datacenter is a complex, controlled environment that hosts thousands of computing nodes consuming electrical power in the range of megawatts, all of which is ultimately transformed into heat. Although a datacenter contains sophisticated cooling systems, our studies provide quantitative evidence of thermal bottlenecks in real-life production workloads, showing significant spatial and temporal heterogeneity in temperature and power. Therefore, minor thermal anomalies can start a chain of events leading to an imbalance between the heat generated by the computing nodes and the heat removed by the cooling system, giving rise to thermal hazards. Although thermal anomalies are rare events, detecting or predicting them in time is vital to avoid damage to IT and facility equipment and outages of the datacenter, with severe societal and business losses. For this reason, automated approaches to detecting thermal anomalies in datacenters have considerable potential. This thesis analyzed and characterized the power and thermal behavior of a Tier-0 datacenter (CINECA) during production and under abnormal thermal conditions. A Deep Learning (DL)-powered thermal hazard prediction framework is then proposed. The proposed models are validated against real thermal hazard events reported for the studied HPC cluster while in production. To the best of my knowledge, this thesis is the first empirical study of thermal anomaly detection and prediction techniques on a real large-scale HPC system. For this thesis, I used a large-scale dataset of monitoring data from tens of thousands of sensors, covering around 24 months at a collection rate of around 20 seconds.
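The abstract does not specify the DL architecture, so the sketch below is a hypothetical PyTorch illustration of one plausible approach: a small LSTM forecasts the next temperature reading per sensor, and a hazard is flagged when the observed reading deviates from the forecast by more than a threshold. Training is omitted, and all dimensions and thresholds are illustrative assumptions, not the thesis's framework.

```python
import torch
import torch.nn as nn

class TempForecaster(nn.Module):
    """Small LSTM that predicts the next temperature reading from a window
    of past sensor readings (architecture is an illustrative assumption)."""
    def __init__(self, n_sensors: int, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(n_sensors, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_sensors)

    def forward(self, x):                 # x: (batch, window, n_sensors)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])   # next-step prediction per sensor

def hazard_flags(model, window, observed_next, threshold_c=2.0):
    """Flag sensors whose observed reading deviates from the forecast by more
    than `threshold_c` degrees, a simple proxy for a developing thermal hazard."""
    with torch.no_grad():
        predicted = model(window)
    return (observed_next - predicted).abs() > threshold_c

# Usage with synthetic data: 8 sensors, window of 60 samples (~20 s apart).
torch.manual_seed(0)
model = TempForecaster(n_sensors=8)
window = torch.randn(1, 60, 8)
observed = torch.randn(1, 8) + 5.0       # pretend a sudden temperature jump
print(hazard_flags(model, window, observed))
```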
Abstract:
In this thesis we explore the combinatorial properties of several polynomials arising in matroid theory. Our main motivation comes from the problem of computing them efficiently and from a collection of conjectures, mainly the real-rootedness and the monotonicity of their coefficients with respect to weak maps. Most of these polynomials can be interpreted as Hilbert–Poincaré series of graded vector spaces associated to a matroid, and thus some combinatorial properties (non-negativity, palindromicity, unimodality) can be inferred via combinatorial algebraic geometry. One of our goals is also to provide purely combinatorial interpretations of these properties, for example by redefining these polynomials as poset invariants (via the incidence algebra of the lattice of flats). Moreover, by exploiting the base polytopes and the valuativity of these invariants with respect to matroid decompositions, we are able to produce efficient closed formulas for every paving matroid, a class that is conjectured to be predominant among all matroids. One last goal is to extend part of our results to a higher categorical level, by proving analogous results on the original graded vector spaces via abelian categorification, or on equivariant versions of these polynomials.
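As a concrete example of the kind of poset-invariant reformulation mentioned above (a standard formula, not necessarily the thesis's own definition), the characteristic polynomial of a matroid M can be written as a sum over its lattice of flats L(M) using the Möbius function of the incidence algebra:

```latex
% Characteristic polynomial of a matroid M as an invariant of its lattice
% of flats L(M), via the Moebius function of the incidence algebra.
\[
  \chi_M(t) \;=\; \sum_{F \in L(M)} \mu(\hat{0}, F)\, t^{\,\mathrm{rk}(M) - \mathrm{rk}(F)}
\]
% Here \mu is the Moebius function of L(M), \hat{0} is the bottom flat
% (the closure of the empty set), and \mathrm{rk} denotes matroid rank.
```

Formulas of this shape are what allow such polynomials to be read off from the lattice of flats alone, which is the sense in which they become poset invariants.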